Eclipse Newsletter - Looking Towards Mars

November 26, 2014 05:09 PM

Read the latest Eclipse Newsletter and check out the new theme!


SWT Mouse Click Implementation

by Frank Appel at November 26, 2014 12:14 PM


Doing a bit of SWT custom widget development lately, I stumbled across the question of why there is no such thing as a default SWT mouse click listener. As this subject comes up once in a while, I thought writing a word or two about the rationale behind that – and how to implement mouse clicks in general – would not hurt.

SWT Mouse Click

Event driven widget toolkits usually distinguish between low-level and semantic events. A low-level event represents window system occurrences or low-level input. Mouse and keyboard inputs basically belong to this group.

Semantic events in turn are the result of control specific user interaction and might be composed of one or more low-level events. A button-click for example could be specified as a mouse-down followed by a mouse-up without the mouse leaving the bounds of the control.

The crux of the matter is the control-specific user interaction. An image-click might be specified as a mouse-down followed by a mouse-up without leaving the bounds of particular regions of the image – a small but mighty difference.

The semantic event type SWT.Selection e.g. corresponds to the button-click specification given above for the org.eclipse.swt.widgets.Button control. But its composition is quite different on org.eclipse.swt.widgets.Slider. The latter behaves rather like the image-click definition:

SWT Mouse Click: Mouse Click on Slider

It is obvious that nobody would want to add particular listeners for mouse-click events on each of the control’s regions. It is much more comfortable to have a semantic abstraction based on those low-level events that notifies observers about the crucial point of interest¹.

Button Click for Custom Widgets

So what can a button-click event implementation look like on a (sub-)component of a custom SWT widget? Consider for example a Composite comprised of a few labels, one of which – for whatever reason – should serve as an action trigger.

The click behavior can be accomplished with a little action wrapper working on top of the typed event abstraction of SWT. It extends MouseAdapter and can be registered at controls to serve as a button-click listener:

static class ClickBehavior extends MouseAdapter {

  private final Runnable action;
  private boolean armed;

  ClickBehavior( Runnable action ) {
    this.action = action;
  }

  public void mouseDown( MouseEvent event ) {
    // TODO: decent implementation
  }

  public void mouseUp( MouseEvent event ) {
    // TODO: decent implementation
  }
}
As you can see, the class ClickBehavior wraps a Runnable that should be triggered by a click on the observed control. The first step is to verify that a left-mouse-button-down has occurred and flag the observer as trigger-ready. A simple mouseDown implementation might look like this:

public static final int LEFT_BUTTON = 1;

public void mouseDown( MouseEvent event ) {
  if( event.button == LEFT_BUTTON ) {
    armed = true;
  }
}

The second step is to check whether a subsequent mouseUp event has occurred within the bounds of the monitored control. If so (and armed), the semantic condition has been fulfilled and the action can be triggered²:

public void mouseUp( MouseEvent event ) {
  if( armed && inRange( event ) ) {;
  }
  armed = false;
}

static boolean inRange( MouseEvent event ) {
  Point size = ( ( Control )event.widget ).getSize();
  return    event.x >= 0
         && event.x <= size.x
         && event.y >= 0
         && event.y <= size.y;
}

This implementation is sufficient to handle a 'button-click' event on e.g. an org.eclipse.swt.widgets.Label, as shown by the following snippet:

final Shell shell = [...];
Label label = new Label( shell, SWT.NONE );
label.setText( "Click me!" );
label.addMouseListener( new ClickBehavior( new Runnable() {

  public void run() {
    MessageBox box = new MessageBox( shell );
    box.setMessage( "Label was clicked." );
    box.setText( "Message" );;
  }
} ) );

And voilà, this is how it looks at runtime:

SWT Mouse Click: Label Click

Wrap Up

As explained above, there are good reasons for SWT to omit a general purpose mouse click implementation. And the given example showed how to implement simple button-click semantics for custom widgets. However, there is still more to consider: widgets often react visually on mouse down, for example, to indicate that they are trigger-ready.

Because of this, custom widget code quickly gets bloated and tends to blur the various event-related responsibilities. For a clean separation of the event semantics from the visual-effect code, I usually extract the former into a little helper class. I even have a general purpose version for button-click events called ButtonClick, which is part of the SWT utility feature of the Xiliary P2 repository.
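To show how small such an extracted helper can be, here is a minimal, SWT-free sketch of the armed/trigger state machine described above. ClickGesture and its method signatures are hypothetical names of mine for illustration, not the ButtonClick API from Xiliary:

```java
// Minimal sketch of the button-click state machine, free of SWT types.
// ClickGesture and its signatures are hypothetical, not the Xiliary API.
final class ClickGesture {

  static final int LEFT_BUTTON = 1;

  private final Runnable action;
  private boolean armed;

  ClickGesture( Runnable action ) {
    this.action = action;
  }

  // arm only on a left-button press, mirroring the mouseDown shown above
  void mouseDown( int button ) {
    armed = button == LEFT_BUTTON;
  }

  // trigger if still armed and the pointer is within the control's bounds
  void mouseUp( int x, int y, int width, int height ) {
    try {
      if( armed && x >= 0 && x <= width && y >= 0 && y <= height ) {;
      }
    } finally {
      armed = false; // reset even if the action throws
    }
  }
}
```

The try/finally guarantees that the armed flag is reset even when the action throws a runtime exception, which addresses the concern mentioned in footnote 2.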

In case all this mouse click related content raised a desire for some practical application of mouse clicking: how about using the social buttons below to share the knowledge? ;-)


  1. The attentive reader may have recognized that I omit the slider's drag region which also adds to the selection semantic. This is because it does not match the click behaviour and would go beyond the scope of this discussion.
  2. It might be noteworthy that a real-world implementation should ensure that the armed flag also gets reset in case a runtime exception is thrown during the execution of the action.

The post SWT Mouse Click Implementation appeared first on Code Affine.


Jigsaw, modules and OSGI - next round

by Andrey Loskutov at November 26, 2014 08:23 AM

The Jigsaw project goes into the next round [1] with the new "Java Platform Module System" JSR 376...

At least OSGI is mentioned this time, quote:

"Some members of the Java community have already invested significantly in applications and frameworks built on top of the OSGi Service Platform."

Buhaha. "Some members"! The whole world except Oracle is using OSGI today... I'm pretty sure Oracle does this too...

" The module system will provide a means for an OSGi kernel to locate Java modules and resolve them using its own resolver, except possibly for core system modules. This will enable OSGi bundles running in such a kernel to depend upon Java modules."

I have no idea how OSGI bundles are supposed to depend upon the new modules if "module" is not defined in OSGI at all; OSGI defines "bundles". So will the new modules be offered as "faked" bundles to OSGI? The NIH principle at a glance.

Just wondering – there is not a single word about the version schema and version constraints... If this turns out to be different from OSGI, we will have even more fun later...



TextEditor Framework with e4 and JavaFX (including jdt and nashorn)

by Tom Schindl at November 26, 2014 12:25 AM

Tonight I had some time to hack on a research project which uses the StyledTextControl provided by e(fx)clipse. Syntax highlighting for basic languages like Java, JavaScript, XML and Groovy has been implemented for a long time already.

So next on my list was support for outlines which turned out to be fairly easy. Watch the video to see it in action.

If you watched the video, you noticed that while a generic outline always makes sense for Java, it does not necessarily for JavaScript, because frameworks have invented their own ways to define a class. In my example I used qooxdoo – because I knew it from earlier work – but the same applies to others as well.


For Java I simply added org.eclipse.jdt.core and the main logic then looks like this:

public Outline createOutline(Input<?> input) {
	ASTParser parser = ASTParser.newParser(AST.JLS8);
	parser.setSource(input.getData().toString().toCharArray());
	ASTNode cu = parser.createAST(null);

	Stack<OutlineItem> i = new Stack<>();
	i.push(new JavaOutlineItem("<root>", null));

	cu.accept(new ASTVisitor() {

		public boolean visit(TypeDeclaration node) {
			OutlineItem o = new JavaOutlineItem(node.getName().getFullyQualifiedName(), "java-class");
			// attach the type to the current parent and nest its members below it
			i.peek().getChildren().add(o);
			i.push(o);
			return super.visit(node);
		}

		public void endVisit(TypeDeclaration node) {
			i.pop();
		}

		public boolean visit(FieldDeclaration node) {
			for (Object v : node.fragments()) {
				if (v instanceof VariableDeclarationFragment) {
					VariableDeclarationFragment vdf = (VariableDeclarationFragment) v;
					i.peek().getChildren().add(new JavaOutlineItem(vdf.getName().getFullyQualifiedName(), "java-field"));
				}
			}
			return super.visit(node);
		}

		public boolean visit(MethodDeclaration node) {
			i.peek().getChildren().add(new JavaOutlineItem(node.getName().getFullyQualifiedName(), "java-method"));
			return super.visit(node);
		}
	});

	return new JavaOutline(i.peek());
}


We only look at the enhanced qooxdoo outline contributed to the framework. To understand it better, here is some simple background:

  1. A class definition is done with qx.Class.define
  2. Properties are defined in the properties-Attribute
  3. Members are defined in the members-Attribute
  4. Names starting with _ are protected, names starting with __ are private

So I used Nashorn to parse the JavaScript files like this:

public Outline createOutline(Input<?> input) {
  final Options options = new Options("nashorn");
  options.set("anon.functions", true);
  options.set("parse.only", true);
  options.set("scripting", true);

  ErrorManager errors = new ErrorManager();
  Context context = new Context(options, errors, Thread.currentThread().getContextClassLoader());
  Context.setGlobal(new Global(context));

  final Source source = Source.sourceFor("dummy.js", input.getData().toString().toCharArray());
  FunctionNode node = new Parser(context.getEnv(), source, errors).parse();

  JSOutlineItem root = new JSOutlineItem("<root>", null);

  node.accept(new NodeVisitor<LexicalContext>(new LexicalContext()) {
    private JSOutlineItem classDef;

    public boolean enterCallNode(CallNode callNode) {
      if (callNode.getFunction().toString().endsWith("qx.Class.define")) {
        classDef = new JSOutlineItem(((LiteralNode<?>) callNode.getArgs().get(0)).getString(), "qx-class-def");
        root.getChildren().add(classDef);
      }
      return super.enterCallNode(callNode);
    }

    public boolean enterPropertyNode(PropertyNode propertyNode) {
      if (classDef != null) {
        switch (propertyNode.getKeyName()) {
        case "include":
        case "extend":
        case "construct":
        case "statics":
        case "events":
          // handling of these attributes is omitted here
          break;
        case "properties":
          classDef.getChildren().add(handleProperties(propertyNode));
          break;
        case "members":
          classDef.getChildren().add(handleMembers(propertyNode));
          break;
        }
      }
      return super.enterPropertyNode(propertyNode);
    }
  });

  return new JSOutline(root);
}

private JSOutlineItem handleProperties(PropertyNode p) {
	JSOutlineItem outline = new JSOutlineItem("Properties", "qx-properties");
	p.accept(new NodeVisitor<LexicalContext>(new LexicalContext()) {
		public boolean enterPropertyNode(PropertyNode propertyNode) {
			if( p != propertyNode ) {
				outline.getChildren().add(new JSOutlineItem(propertyNode.getKeyName(), "qx-property-" + visibility(propertyNode.getKeyName())));
				return false;
			}
			return true;
		}
	});
	return outline;
}

private JSOutlineItem handleMembers(PropertyNode p) {
	JSOutlineItem outline = new JSOutlineItem("Members", "qx-members");
	p.accept(new NodeVisitor<LexicalContext>(new LexicalContext()) {
		public boolean enterPropertyNode(PropertyNode propertyNode) {
			if( p != propertyNode ) {
				if( propertyNode.getValue() instanceof FunctionNode ) {
					outline.getChildren().add(new JSOutlineItem(propertyNode.getKeyName() + "()", "qx-method-" + visibility(propertyNode.getKeyName())));
				} else if( propertyNode.getValue() instanceof ObjectNode ) {
					outline.getChildren().add(new JSOutlineItem(propertyNode.getKeyName(), "qx-field-" + visibility(propertyNode.getKeyName())));
				} else if( propertyNode.getValue() instanceof LiteralNode<?> ) {
					outline.getChildren().add(new JSOutlineItem(propertyNode.getKeyName(), "qx-field-" + visibility(propertyNode.getKeyName())));
				} else {
					System.err.println("Unknown value type: " + propertyNode.getValue().getClass());
				}
				return false;
			}
			return true;
		}
	});
	return outline;
}

private static String visibility(String name) {
	if( name.startsWith("__") ) {
		return "private";
	} else if( name.startsWith("_") ) {
		return "protected";
	}
	return "public";
}

Next on my list are improvements to the editor control, which still lacks some features, and afterwards I want:

  1. Syntax highlighting for C/C++
  2. Syntax highlighting for TypeScript
  3. Syntax highlighting for Swift
  4. Errors and warnings for Java, JavaScript and TypeScript
  5. Auto-Complete for Java


Building Plugin through Jenkins- CI

by Its_Me_Malai at November 25, 2014 01:41 PM


Having learnt how to build a HelloWorld plugin using Maven Tycho from Eclipse, the next step is to learn to automate the build through the continuous integration system Jenkins.

Step 1. Install Jenkins. It is a war file. Download it from

Step 2. Deploy it in Tomcat or any Java container/app server you have installed by copying it into the relevant folder. For Tomcat, copy it into the webapps folder.

Step 3. Start your Server and on your browser connect to http://localhost:8080/jenkins

It should open a browser content as shown above.

Step 4. Click on Manage Jenkins in the left hand side menu.

Step 5. Click on Configure System > Maven > Add Maven Installer and Select Required Version.

Step 6. Click on Jenkins on top left corner > Create a New Job

Step 7. Provide Job Name

Step 8. Select Build a Maven2/3 Project and Finish

Step 9. Project Configuration Page would open. Scroll down to find Pre Step section

Step 10. Configure the parent pom.xml location and the commands that you want to run through mvn.

Step 11. Save and Apply the Changes. Click on Build Now.

Step 12. Click on the Build Link > Console Output to visualise the Build in Progress.

Step 13. Build should be Successful > Target folder of the Project would contain your plugin.jar.


What’s New in JavaScript Tools

by dgolovin at November 25, 2014 12:47 PM

JavaScript is extremely popular nowadays, and the fact that Eclipse JavaScript Development Tools supports only the JavaScript ECMA3 standard didn’t let us sleep well, so we decided to do something about it.

First we got to work and fixed tons of issues in Eclipse JavaScript Development Tools to make it usable; then, when it was good enough, we extended it with some good features to make it even better. The Adapter for JavaScript Facet now works behind the scenes to provide JavaScript code analysis and content assist. Good news: there is no need to configure it manually. Just install the Adapter from the JBoss Tools update site and it configures projects with the JavaScript facet automatically to unleash all Tern IDE features for you.

ECMA5 Support

HTML and JavaScript source editors now show JavaScript ECMA5 proposals in content assist for projects with JavaScript facet.


You can also get content assist for many popular JavaScript Libraries through Tern Modules. Manual configuration is required in Project Preferences.


CordovaJS Module for Tern

We provided a CordovaJS Module for Tern to improve Cordova related content assist in JavaScript source. The Tern integration configures it for Eclipse Thym projects automatically, so you don’t need any manual configuration to see content assist for CordovaJS. Just create a Thym project, open index.html and start using it.


AngularJS Tools Early Access

JBoss Tools provides support for AngularJS through JBoss Tools Central Early Access. It installs a full distribution of the AngularJS IDE with many cool features like:

  • source code highlighting and navigation

  • source code content assist

  • AngularJS model view

  • and more.

Read about all AngularJS IDE features here.

/Denis Golovin


Project Flux: Connecting developer tools across desktop and web

by Eclipse Foundation at November 25, 2014 12:31 PM

In recent years there has been a steady migration of tools to the web, starting with bug trackers and other collaboration tools. More recently core coding tools have started moving to the web as well, with the rise of web code editors such as Eclipse Orion. However, when developers make this leap to web-based tools, they must leave behind their existing tools, because there is no way for tools on the web to connect back to tools on the desktop. This talk introduces Flux, a new project that aims to bridge this gap between existing desktop-class IDEs and future cloud-based developer tooling. It is built around a new cloud-based architecture for developer tooling that integrates with today's desktop IDEs (like Eclipse, IntelliJ, or even Sublime Text). This new architecture takes existing IDEs explicitly into account, but allows developers to move towards cloud- and browser-based tooling at the same time. It provides, for example, real Java reconciling and compilation information, content-assist and navigation support in the browser, in real-time, without forcing developers to discard their desktop IDEs. The talk explains the architecture behind this new project and shows several live demos using the latest prototypes, the integration with Eclipse Orion, Java tooling in the cloud, and more. Here is the online version of the slides:


Building a Feature using Maven Tycho

by Its_Me_Malai at November 25, 2014 12:10 PM


Step 1. Create a Feature Project with required functionality.

Step 2. Convert the Feature Project to Maven Project

a. Right Click on the Feature Project > Select Configure > Convert to Maven Project

Step 3. Wizard to create pom.xml would Open. Configure the same as mentioned below

a. Change the version from 0.0.1-SNAPSHOT to 1.0.0-SNAPSHOT

b. Change Packaging from jar to eclipse-feature. This option is not available in the dropdown, so type it in manually.

Step 4. Pom.xml on creation would look as shown below

a. I am sure you know why we are seeing an error: tycho-maven-plugin is missing in the feature’s pom.xml. You can remove the error by adding the required plugin to this pom, but since tycho-maven-plugin is shared by the Feature and Plugin Project pom.xml files, the recommended solution is to add it to the parent pom.xml.

Step 5. To fix these errors follow the steps below

a. We need to add a Maven Plugin called tycho-maven-plugin to parent pom.xml

b. Right Click on the parent pom.xml > Select Maven > Add Plugin

Step 6. In the Add Plugin Dialog, type tycho in the textBox as shown below

a. Select org.eclipse.tycho tycho-maven-plugin

b. Open the pom.xml; we need to manually alter it to add a new tag <extensions>true</extensions> after the version in the plugin tag. The errors in the pom.xml file should all be resolved now.

Step 7. Adding a repository for dependency resolution

a. Repositories are also shared between the Plugin Project and Feature Project. Therefore the same is also moved to the parent POM.

Step 8. Add the Feature Project as a module to the parent pom.xml.

Step 9. Right Click on the Feature Project and Select Maven > Update Project to fix the errors on the feature project

Step 10. Now select the pom.xml and right click on it to Run > Maven install

Step 11. On Successful build > the Target folder would be updated as shown below


Many Bundles of Things

by Eclipse Foundation at November 25, 2014 11:53 AM

This presentation tells how OSGi can help in developing a distributed and cloud-ready Internet of Things platform. IoT brings unprecedented complexity both in terms of technological variety and new development paradigms. The modularity offered by OSGi is the key concept to build maintainable and robust IoT platforms. OSGi declarative services and the dependency injection mechanism allow service producers and service consumers to interact with full respect of mutual component boundaries: this is the fundamental requirement to enable important aspects of an IoT platform like multi-tenancy, separation of concerns between M2M protocol management and application development, and dynamic services management. The Plat.One IoT platform revolves around the OSGi technology: this presentation describes the lessons we learnt during several years of “hands-on OSGi activities” and development.


Scaling and Orchestrating Microservices with OSGi

by Eclipse Foundation at November 25, 2014 11:52 AM

The OSGi community has been designing and developing microservices since 1998, and we have gained a great deal of experience with them. The isolation provided by OSGi's Module Layer provides a high degree of safety, along with a powerful model for versioning API contracts and filtering out incompatible service providers. However, OSGi bundles exist within the memory space of a process, and so make a trade-off in favour of simplicity and speed. We are not protected from crashes originating in other modules -- when such protection is required, we can use separate OS processes. This mirrors the general industry trend of developing "microservices". But in return for additional safety we create complexity; not least in the number of artifacts and processes that must be managed. Still, even processes do not provide perfect isolation, and increasingly container models such as Docker are being used to control resource utilisation. Above this, full VMs are used to provide different OS environments to different services. So we see that isolation and modularity are a continuum of trade-offs, and OSGi bundles exist towards the lower end of it. But getting application modularity right is extremely important to avoid much greater complexity at the higher ends. In this talk I will describe how OSGi works as a key part of the modern application architecture and DevOps stack, highlighting the OSGi technologies that can be used to scale upwards and outwards. The talk will include a significant demo section.


openHAB 1.6 and 2.0 alpha Release

by Kai Kreuzer at November 24, 2014 07:34 PM

We have reached another milestone – openHAB 1.6 has been released! As usual, it brings a couple of new bindings and other add-ons. Just to name a few: there is now support for Belkin WeMo switches, LG TVs, BenQ beamers and Samsung air conditioners. A very interesting new system-to-system integration is now possible with MiOS-based systems, such as the MiCasaVerde (now Vera Control). But with this release it seems that the focus of development has shifted slightly: while we had huge numbers of new bindings with every release in the past, we “only” have 17 new add-ons this time. But this does not at all mean that there is any slowdown in project activity; the opposite is the case: the community is growing massively and as a result we see more and more people improving existing add-ons and introducing new features to them – the very lively Z-Wave binding led by Chris Jackson is the best example of this.

The community growth is nicely visualized in the OpenHUB (formerly Ohloh) statistics:

The factoids state: “Over the past twelve months, 147 developers contributed new code to openHAB. This is one of the largest open-source teams in the world, and is in the top 2% of all project teams on Open Hub.” - we can be really proud of this community!

openHAB 2.0 alpha

Despite the large number of code commits in 1.6, there has not been much evolution on the core runtime itself. There is a very simple reason for this: The work on openHAB 2!

As a recap: one year ago we contributed the core parts of openHAB to the Eclipse Foundation to start off the Eclipse SmartHome project. This project provides a framework for building smart home solutions – and openHAB 2 will be one such solution. Others, e.g. Deutsche Telekom with its QIVICON platform, and Yetu, also chose Eclipse SmartHome as part of the software stack for their home gateways.

Being a general purpose framework for smart home gateways implies that Eclipse SmartHome has its emphasis on providing means to build user-friendly configuration interfaces as well as optimizing it for embedded systems – both topics were not in focus for openHAB 1.

To show where these developments are leading, we have released an alpha version of openHAB 2.0 today. Let me share some first results with you:

Configuration UI

A prerequisite for providing user interfaces for configuration is that there is a formal way of describing functionality. For this, the new concept of a “Thing” has been introduced in Eclipse SmartHome. Things build the foundation for the new configuration UIs. They are formally described and provide a lot of meta-data about their functionalities. Things represent the physical devices that can be added to the system and also (web) services that offer certain functionality.

Paper UI showing auto-discovered things
As such, one major feature is the auto-discovery of things: The user does not need to know IP addresses or other ids, but devices can be automatically added. Together with their formally described functionality, this can indeed mean zero effort for first steps.

As a new user interface that also supports dealing with things and their configuration, there is a prototype currently being developed by Dennis Nobel, which is HTML5-based and follows Google's material design. Just like the latest Android 5.0 Lollipop, it uses paper elements and is thus called the “Paper UI”. Although it is still at a very early prototype stage, it is a nice means to use the new configuration features. You can see it in action in this screencast.

Embedded Platforms

Although many users run openHAB 1.x on a Raspberry Pi, it feels a bit sluggish there, especially when many different bindings are used.
Used heap of standard openHAB 2 runtime
A significant part of the runtime deals with processing the textual configuration files (incl. rules), which are realized through domain specific languages (DSLs) – requiring them to be parsed by the runtime is not ideal. Once the new configuration UIs are in place, the textual configuration files could be made optional – while they can be very convenient and powerful for some users, this would also allow minimal setups that do without them. I am currently experimenting with such a minimal runtime, which packages the Paper UI and all new bindings, but comes without DSL support (and thus without sitemaps, so the other UIs cannot be used for the moment).

Used heap of minimal openHAB 2 runtime
A first analysis shows that it greatly reduces the required Java heap size and also has a much better startup time than the “full” runtime – the goal is that the minimal runtime is nicely usable on an embedded system comparable to the Raspberry Pi Model A(+) with only 256MB RAM.

Note that removing DSL support currently also means that there is no way of defining automation rules. A new rule engine is therefore another topic that is currently being worked on. Its vision is to make rules (or parts of them) easily sharable and reusable by others. Again, this should empower average users to set up automation logic through simple UIs – think of something like IFTTT.

There are also other efforts to reduce the footprint of the runtime: the Eclipse Concierge project might be a lightweight alternative to Eclipse Equinox as an OSGi container. Jochen Hiller is doing extensive testing and bug fixing to move this option forward. Another possibility is to go for Java 8 compact profiles, which would allow reducing the size of the JVM itself significantly.

New Bindings

New LIFX binding for openHAB 2.0
The new Thing concept of Eclipse SmartHome means that the API for implementing bindings has changed heavily in comparison to openHAB 1.x. There are hence only a few bindings that are already based on the new APIs. The alpha release packages bindings for Philips Hue, Yahoo Weather, Belkin WeMo and LIFX – all of them support discovery and thus make the initial setup very easy. While Philips Hue, Yahoo Weather and Belkin WeMo are also available on openHAB 1.x, the LIFX binding is the first one that exists exclusively for the new version.

Future Plans for openHAB 1.x vs. 2.x

With so much effort going into openHAB 2, you might wonder how the future looks for openHAB 1. Well, the openHAB 1.x core runtime has been very stable and thus did not see much evolution during the last two releases. We will keep it this way and will only concentrate on security fixes and critical patches from now on. Note that this is only the case for the core runtime – all 1.x add-ons, i.e. bindings, actions, persistence services, etc., are very actively maintained and we ask our community members not to slow down here.
For the latest cutting-edge features, though, we want to encourage users to switch to the openHAB 2 runtime once a first production release is out. One important feature is the 1.x compatibility layer, which should help make openHAB 1 add-ons work on the openHAB 2 runtime. With the current alpha release, we want to engage our community in testing this layer – the goal is to quickly reach good compatibility with all existing add-ons, so that there is no reason not to move to the new runtime and benefit from its new features.

That much about the users, but what about add-on developers? Should they start migrating their 1.x add-ons to 2.x? Well, there is absolutely no rush to port them to the new 2.x APIs – especially as there are only new APIs for bindings so far, but nothing yet for actions, persistence services or other types of add-ons. Even for bindings you have to be aware that the new APIs are not yet stable and are likely to change over time. Nonetheless, if you start a completely new binding for openHAB, you are encouraged to go for openHAB 2 directly – especially if your devices can be discovered and formally described. A positive side effect of implementing a binding against the new APIs is that your code is potentially automatically compatible with other Eclipse-SmartHome-based systems. Besides developing and contributing it to openHAB 2, you could also consider contributing it directly to the Eclipse SmartHome project – but please note that there are some stricter requirements.
To sum things up: the journey of openHAB continues to be very exciting – and as always, there are so many great ideas waiting to be implemented for the next release!


Eclipse Dev: Using GtkInspector to analyse Eclipse’s widgets.

by Leo Ufimtsev at November 24, 2014 03:44 PM

As Eclipse developers, when troubleshooting Eclipse, it is sometimes important to figure out how parts of a UI are constructed.


Usually, you would use the plugin spy (Alt+Shift+F1 / Alt+Shift+F2) to see a menu with info about the widget under the mouse.

As of Gtk3, you can additionally fire up the GtkInspector to see how a widget is constructed on the GTK side. The shortcut for this is Ctrl+Shift+D (or Ctrl+Shift+I) in any Gtk3 application.
You might have to launch Eclipse with Gtk3 for the inspector to function. To do this you can set the SWT_GTK3=1 environment variable globally and then launch Eclipse.


More info can be found in this post:


Say hello to JEP 220

by Andrey Loskutov at November 23, 2014 11:08 PM

Java tool makers, unite – and be aware of the upcoming changes in Java 9!

This Java version will introduce a new file format for class file containers, change the entire JDK installation directory layout and – much more importantly – rt.jar / tools.jar will be gone. Instead, we will have a new "jrt:/" Java runtime file system implementation, modules, and 3 (or more) .jimage files. The changes are described in detail in JEP 220 [1]; you can already download Java 9 early access builds containing the Jigsaw changes [2] and you can participate in the discussion of various aspects via the jigsaw-dev mailing list [3].

Why is this important and why should you care?

JDK 9 will be the first JDK whose classes cannot be inspected with "standard" tooling (emacs, zip, etc.), because it introduces a new binary container format (jimage) which replaces the old "jar" (zip) container format.

JDK 9 will be the first JDK whose classes cannot be found without the help of the JDK itself (or new tooling using the new "jrt:/" java runtime file system, see [4]). JDK 9 changes the old "jars/packages" hierarchy by adding a new layer to it - modules. The JDK classes will not only be stored in the new jimage files under JAVA_HOME/lib/modules/, they will also belong to different modules, whose names have to be known before accessing class files via "jrt:/".

For example, the java/lang/Class.class "file", previously located inside JAVA_HOME/jre/lib/rt.jar, will be physically stored inside JAVA_HOME/lib/modules/bootmodules.jimage, will "logically" live in the "java.base" module, and so be accessible at runtime via the "jrt:/java.base/java/lang/Class.class" URL:

$ java9/bin/jimage list ../lib/modules/bootmodules.jimage | grep java/lang/Class.class

$ java9/bin/jdeps -module java.lang.Class
java.base -> java.base
   java.lang (java.base)

Right now there is no way for 3rd party tooling to look up a single .class file by its name in the new jimage files without first traversing and enumerating the entire "jrt:/" file system via the jigsaw-jrtfs tool [4], because the "package to module" association is not yet available to 3rd party tools in JDK 9. The discussion on how to look up a .class file from JDK 9 with tools running on older JDKs is ongoing [5].
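For reference, in the layout that final JEP 220 builds eventually settled on, the runtime image can be browsed from Java itself with the standard NIO file-system API (a sketch; the /modules prefix reflects the final layout and may not match the early access builds discussed above):

```java
import java.net.URI;
import java.nio.file.FileSystem;
import java.nio.file.FileSystems;
import java.nio.file.Files;
import java.nio.file.Path;

public class JrtLookup {
    public static void main(String[] args) throws Exception {
        // The running JDK exposes its own classes through the jrt filesystem
        FileSystem jrt = FileSystems.getFileSystem(URI.create("jrt:/"));
        // Classes live under /modules/<module-name>/<package path>
        Path clazz = jrt.getPath("/modules/java.base/java/lang/Class.class");
        System.out.println(Files.exists(clazz));
    }
}
```

On a Java 9+ runtime this prints whether java/lang/Class.class is visible in the java.base module of the image.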


The changes proposed in JEP 220 will most likely break all low-level, .class file / bytecode oriented tools like FindBugs. If you are the maker of such a tool, plan for a change now, read the specs and discuss open issues on jigsaw-dev before it is too late.






by Andrey Loskutov at November 23, 2014 11:08 PM

Building P2 Repository using Maven Tycho

by Its_Me_Malai ( at November 22, 2014 09:56 AM


Step 1. Create a General Project using File > New > Other > General > Project.

Step 2. Convert the General Project to a Maven Project
a. Right-click on the project > Select Configure > Convert to Maven Project

Step 3. The wizard to create the pom.xml will open. Configure it as mentioned below:

a. Change the version from 0.0.1-SNAPSHOT to 1.0.0-SNAPSHOT
b. Change Packaging from jar to eclipse-repository.

Step 4. Adding Repository project to parent pom.xml as Module

a. Click on Add.. button in the Modules section

b. Select the modules to include and don't forget to check the option “Update POM parent section in selected projects.”
c. Open the module pom.xml to see parent information added in the pom file as shown below
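A hedged sketch of what the repository module's pom.xml could end up looking like after steps 3 and 4 (the group/artifact ids and parent coordinates are placeholders for your own project):

```xml
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>

  <!-- Parent coordinates added by the "Update POM parent section" option -->
  <parent>
    <groupId>com.example</groupId>
    <artifactId>com.example.parent</artifactId>
    <version>1.0.0-SNAPSHOT</version>
  </parent>

  <artifactId>com.example.repository</artifactId>
  <!-- eclipse-repository packaging tells Tycho to build a P2 repository -->
  <packaging>eclipse-repository</packaging>
</project>
```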

Step 5. Right Click on Repository project and Select Maven > Update Project to fix the errors on the Project.

Step 6. Create a Category Definition file from File > New > Other > Category Definition. In this Category Definition, create the Category and add your Feature Project. Having the Category Definition file at the root of the Repository Project is mandatory for generating the P2 repository.

Step 7. Now run the parent pom.xml by right-clicking it and selecting Run As > Maven Install. Your final output would look like this:

by Its_Me_Malai ( at November 22, 2014 09:56 AM

Fabulous Forge Fun

by kaers at November 21, 2014 01:10 PM

While JBoss Forge has been included in JBoss Tools and Developer Studio for a while now, the recent JBoss Tools 4.2 and Developer Studio 8.0 releases contain support for Forge 2.12.1. Forge has now become a mature component inside JBoss Tools and Developer Studio with a plethora of available commands that can be executed either using the convenient wizard style or using the integrated Forge console.

Wise Wizards

To use the wizards, you need to issue the Ctrl+4 key combination (or Cmd+4 on a Mac). It will bring up a popup containing all the commands that are available in the currently selected context.

forge command popup

If this key combination does not work, chances are high that you are using a keyboard layout such as AZERTY. Don’t panic, in this case it is just a matter of redefining the key combination to something that works on your hardware. Use the Eclipse preferences dialog (Window→Preferences or something similar) to do this.

forge keys prefs

Selecting the 'Project: new' entry of the popup will open the Forge wizard that allows you to create a new project.

forge project new wizard

Core Console

If you are a console aficionado and typing and code completion are more your thing, you can use the integrated Forge console. One way to bring up the console is to type 'Forge Console' in the 'Quick Access' text field in the toolbar of your Eclipse workbench and then select the 'Forge Console' entry that is visible in the popup.

forge quick access

Another way would be to select 'Window→Show View→Forge Console' from the main menu bar. This will only be possible when you are using the 'JBoss' perspective. In another perspective, you can use 'Window→Show View→Other…​' and then navigate to the 'Forge→Forge Console' entry.

forge show view

In the console you can type Forge commands to perform the same tasks as if you were using the wizard-style approach.

forge project new console

Chatty Cheatsheet

If you are new to Forge, this new feature will certainly help you on the way. And as a bonus it will also help you with learning to develop an HTML5 application with REST, CDI and AngularJS. You can open this brand new cheatsheet by importing the 'AngularJS Forge' example from the 'JBoss Central' page.

forge angularjs central

In case the 'JBoss Central' page is not open or you inadvertently closed it, no worries. You can always bring it back by clicking the 'JBoss Central' icon in your Eclipse toolbar.

forge jboss central

After clicking the 'AngularJS Forge' example hyperlink, a wizard will guide you through the import process. Just accept all the defaults. When the wizard is done, a popup will ask you if you want to open the cheatsheet.

forge angularjs import

Make sure you keep the open option checked when you click the 'Finish' button. Now the cheatsheet will reveal itself in all its glory.

forge angularjs cheatsheet

There is no point in repeating here all the information that is available in the cheatsheet but in short, it will provide you with some general information and then guide you through the development, deployment and execution of an HTML5 application. You will see different examples of using Forge commands such as REST endpoint generation and scaffolding an AngularJS user interface. Check it out!

Happy Forging,
Koen Aers

by kaers at November 21, 2014 01:10 PM

CDT in the New World

by Doug Schaefer at November 19, 2014 05:12 PM

An interesting thing happened the other day which I think shakes some of the foundation I’ve had in my mind about the CDT. My vision was always for CDT to support C, C++, and related languages on every platform. And that has always included Windows. Mind you, we’ve never really done a great job there but we do have lots of users using it with Cygwin or MinGW and you can argue how well they support Windows application development.

The big news was Microsoft’s Visual Studio Community Edition which is a free version of their complete IDE for individuals and small teams. This is different from their Express editions which were severely limited in the end, including lack of 64-bit support, and the inability to install plug-ins into the IDE. No, the community edition is the full thing and is a pretty important step for Microsoft as they try to keep developer interest in their platform.

As long as Microsoft charged money for Visual Studio, I always thought CDT had a play. I’ve done a little work on supporting integration with their Windows SDK (which includes a 64-bit compiler), at least for build. Debug was always a big hurdle since you’d need to write a complete debugger integration that is very different from gdb. So, really, this integration barely worked. And now that VS is essentially free to the market we serve, there’s little point in continuing that effort. The same is essentially true of Objective-C support with Xcode on Mac, which even Apple is drifting away from to focus on their new language Swift.

But you know, despite giving up on a dream, I think CDT has an important role to play in its field of strength, i.e. support for the GNU toolchain, especially as used for embedded development with its newfound focus as a pillar of the Internet of Things. As I’ve talked about on Twitter and on the CDT forums, I am working on a CDT integration to support building C++ apps for Arduino. Once that’s done, I’ll shift focus to a more complicated environment around the Raspberry Pi, through which I hope to add better remote debug support to CDT. Yes, I know everyone loves to talk about using scripting languages on the Pi, but I want to expose these hobbyists to the real world of embedded, which is almost exclusively C and C++, and get them started using CDT.

I still haven’t given up on the desktop, at least desktop support with Qt. I really want to make CDT a first class alternative for Qt development. We have a good start on it and will continue with things like a project file editor, easier setup of builds, and import of projects from Qt Creator. The great thing about Qt is that it also is truly cross platform, allowing you to write your app to run on the desktop and then migrate it to your embedded device, such as the Raspberry Pi. And supporting both of these with the same project is one thing CDT is pretty good at, or at least will be once we fix up the UX a bit.

In the past, much like the rest of the Eclipse IDE, I think we on the CDT have focused a lot on being a good platform, almost too much. While vendors have done a great job at bringing the CDT to their customers writing for their platform, including my employer :), aside from the Linux community, few of us work on supporting the direct user of the Eclipse C/C++ IDE. I think focusing on CDT’s areas of strength and targeting specific open environments and not chasing dreams will help me contribute to the CDT in ways that will actually help the most people. And that’s most satisfying anyway.

by Doug Schaefer at November 19, 2014 05:12 PM

Maven improvements in JBoss Tools 4.2 and Developer Studio 8.0

by fbricon at November 19, 2014 12:51 PM

In JBoss Tools and JBoss Developer Studio, we’re continuously working to augment the Maven integration experience in Eclipse. Some of the features we’ve been playing with, if deemed successful, will eventually be contributed back to m2e, like the Maven Profile Management UI. Others, more centered around JBoss technologies, will stay under the JBoss Tools umbrella.

JBoss Tools 4.2 and Developer Studio 8, based on Eclipse Luna, take advantage of all the nice improvements made to m2e 1.5.0 and then add some more:

m2eclipse-egit integration

The Import > Checkout Maven Projects from SCM wizard doesn’t have any SCM provider by default, which can be pretty frustrating at times. With Git becoming the new de facto source control system, it only made sense to make m2eclipse-egit the sensible default SCM provider for m2e.

m2eclipse-egit will now be automatically installed when a JBoss Maven integration feature is installed from the JBoss Tools update site.

It is installed by default with JBoss Developer Studio 8.0.0 as well.

Maven Central Archetype catalog

Since m2e 1.5.0 no longer downloads Nexus Indexes by default, a very small, outdated subset of Maven Archetypes is available out of the box.

To mitigate that, the JBoss Tools Maven integration feature now registers the Maven Central Archetype catalog by default, providing more than 9600 archetypes to choose from when creating a new Maven project. Accessing the complete list of archetypes is even way, waayyyy faster (a couple of seconds) than relying on the old Nexus index download.

maven central catalog

Pom properties-controlled project configurators

JBoss project configurators for m2e now support an activation property in the <properties> section of pom.xml. Expected values are true/false and override the workspace-wide preferences found under Preferences > JBoss Tools > JBoss Maven Integration.

Available properties are:

  • <m2e.cdi.activation>true</m2e.cdi.activation> for the CDI Project configurator,

  • <m2e.seam.activation>true</m2e.seam.activation> for the Seam Project configurator,

  • <m2e.hibernate.activation>true</m2e.hibernate.activation> for the Hibernate Project configurator,

  • <m2e.portlet.activation>true</m2e.portlet.activation> for the Portlet Project configurator.
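For instance, to enable the CDI configurator for a single project regardless of the workspace-wide preference, one would add (a sketch based on the property list above):

```xml
<properties>
  <!-- Enables the CDI project configurator for this project only,
       overriding the workspace-wide preference -->
  <m2e.cdi.activation>true</m2e.cdi.activation>
</properties>
```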

The pom.xml editor also provides matching XML templates for these properties, when doing ctrl+space in the <properties> section.

Maven Repository wizard improvements

The Configure Maven Repositories wizard, available under Preferences > JBoss Tools > JBoss Maven Integration, saw a couple of improvements as well:

Advanced options for maven repositories

You can now choose the repository layout, enable/disable snapshots or releases, and change the update policy in the advanced section:

maven repository advanced

Automatically identify local Maven repositories

When adding a new Maven repository, you can scan for JBoss Maven repositories unzipped locally with the Recognize JBoss Maven Enterprise Repositories…​ button, in order to automatically add them to your .m2/settings.xml.

recognize maven repo

The identification process now looks for a .maven-repository file at the root of the folder you selected. This file follows the .properties file format and is expected to contain up to 3 attributes:

  • repository-id: the repository id

  • name: a (descriptive) repository name. Optional, defaults to repository-id

  • profile-id: the profile id the repository will be activated from. Optional, defaults to repository-id

As a concrete example, the JBoss Mobile repository .maven-repository file would contain:

      name=JBoss Mobile Maven Repository

What’s next?

Tired of seeing these "Project configuration is out-of-date" errors whenever you tweak your pom.xml? We’re currently playing with a plugin that will automatically do the Maven > Update project configuration for you. You can try a very alpha version from the following p2 repository : Let us know if/how it works for you, so we can decide what to do next with it.

Enjoy and see you soon!

Fred Bricon

by fbricon at November 19, 2014 12:51 PM

Flexible calendar control in SWT

by Tom Schindl at November 19, 2014 10:22 AM

I’ve been in need of a week-calendar view to display appointments but could not find one that completely fit my needs, so I implemented a custom one.

This short screencast shows what I have today.

The control is backed by a replaceable service interface to retrieve and store appointments, so it can be connected to fairly any kind of backend store. For my needs a plain XMI store was enough, but anyone who wants something more fancy can most likely connect it to any calendar service, e.g. by using REST.
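To illustrate the idea of such a replaceable backend (a sketch with made-up names, not the control's actual API), a service interface plus a trivial in-memory implementation might look like:

```java
import java.time.LocalDateTime;
import java.util.ArrayList;
import java.util.List;

public class CalendarBackendDemo {

    // Hypothetical backend abstraction the calendar control would depend on
    interface AppointmentService {
        void store(Appointment a);
        List<Appointment> findInRange(LocalDateTime from, LocalDateTime to);
    }

    static class Appointment {
        final String title;
        final LocalDateTime start, end;
        Appointment(String title, LocalDateTime start, LocalDateTime end) {
            this.title = title; this.start = start; this.end = end;
        }
    }

    // In-memory store; an XMI- or REST-backed one implements the same interface
    static class InMemoryAppointmentService implements AppointmentService {
        private final List<Appointment> all = new ArrayList<>();
        public void store(Appointment a) { all.add(a); }
        public List<Appointment> findInRange(LocalDateTime from, LocalDateTime to) {
            List<Appointment> hits = new ArrayList<>();
            for (Appointment a : all) {
                // keep any appointment overlapping the requested range
                if (!a.end.isBefore(from) && !a.start.isAfter(to)) {
                    hits.add(a);
                }
            }
            return hits;
        }
    }

    public static void main(String[] args) {
        AppointmentService service = new InMemoryAppointmentService();
        service.store(new Appointment("Standup",
                LocalDateTime.of(2014, 11, 19, 9, 0),
                LocalDateTime.of(2014, 11, 19, 9, 15)));
        List<Appointment> week = service.findInRange(
                LocalDateTime.of(2014, 11, 17, 0, 0),
                LocalDateTime.of(2014, 11, 23, 23, 59));
        System.out.println(week.size()); // the one stored appointment is in range
    }
}
```

Swapping the backend then only means handing the control a different AppointmentService implementation.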

by Tom Schindl at November 19, 2014 10:22 AM

Why should I contribute?

by Jérémie Bresson at November 19, 2014 09:29 AM

One of the opening slides of our “Eclipse Scout Contributor” training looks like this:


There is nothing new in this slide. It just re-states an aspect of the open-source mechanic.

In our opinion the benefits of contributing to open source are often not fully understood. That is why we would like to share some of our thoughts.

Quite often, people associate open source with the notion of "free of charge" and "free riding". But this is not necessarily negative. Bruce Perens writes about this topic in his text “What Effect Does The Free-Rider Problem Have Upon Open Source?”.

Essentially, every open-source user starts out as a free-rider, and this is by design. The interesting question is what happens after the initial adoption. Once a user (or an organisation) adopts a software project, their interest in the success of the project grows too. This results in an interest to protect the initial investments of the user/organisation. One of the obvious approaches is to help increase the success of the software project. And this is the point where the user/organisation can take advantage of the fact that they decided for open-source software.

A nice aspect of the open-source model is that there are many opportunities to become a part of the success of your favorite project. Of course, some of the options involve spending money, as in the case of closed source software. But even more opportunities exist that are based on the open-source nature of the software project and only require some of your experience, know-how and time.

Why should you care? Contributing to an open-source project can be rewarding in many ways and most often is in your best interest. Here are some reasons why you might want to contribute to the Eclipse Scout framework:

  • Prevent Work-arounds in Projects: Fix a bug, report it and provide a patch. Over and over it can be shown that fixing bugs at the source (in the open source code) is most efficient. Alternatively, you may privately implement a work-around in your project. This is often the more expensive approach. First, you cover the costs to analyze, reproduce and work around the bug. Then, you need to apply the work-around to all of your projects. And over time, you need to maintain your patch/work-around to work with the evolving framework. And when somebody else fixes “your” bug, you should remove all your workarounds in all projects. So: reporting a bug increases the chance that it will be solved, and providing a good quality patch (including a test) actually reduces your work load over time.
  • Help the Scout Team to be more effective: The Scout code base is 550k LOC. This is pretty large. And, as in every large project, there are bugs (and fixing bugs does usually not reduce the size of the project). So, if you report bugs in Bugzilla you are already helping Scout. And pushing a patch to our Gerrit instance can help the Scout developers to be more productive and have more bugs fixed earlier. Which in turn improves your productivity as an application developer.
  • Help the Scout Community. The Community plays a central role. It helps you getting up to speed with Scout and lets you profit from past investments of others. Once you adopt a framework such as Scout you can protect your investment by making Scout more popular and more successful. This is the point when you become an integral part of the Community and your interests start to get aligned nicely with the community. Remember: An open source project is all about sharing. Helping others in the Scout forum with challenges you have mastered often does not take a lot of time. But it can save the other end hours and days.
  • Get a better understanding of Scout: Contributing to Scout significantly increases your understanding of the framework. And you might find additional benefits: Increase your know-how with current tooling such as git/gerrit, improve your technical skill-set and grow your professional network by interacting with other community members. These impressions are also backed by a recent survey of the Linux foundation. And for all this you can accumulate a public track record that demonstrates your work and your skills.

If you are looking for a starting point, we have a contribution page. By the way, helping the community by answering a question in the forum is also a valuable contribution to the project (even if it isn’t mentioned in our Contribution page yet).

If you are interested, do not hesitate to contact us. You can start a discussion in the forum or ask for help if you are blocked with something. Depending on the subject you choose, the barrier to entry can be high. We will dedicate time to help you until you manage to realize the contribution you want.

And to come back to the initial Scout contributor training, we found visual confirmation on Open HUB that contributing is possible and not all too hard.


Scout Links

Project Home, Forum, Wiki, Twitter

by Jérémie Bresson at November 19, 2014 09:29 AM

Strategy of wrapping C++ libraries in Java

by Tom Schindl at November 18, 2014 10:12 PM

This blog post is more of a question to others (probably much more knowledgeable people than me) who have already wrapped C++ libraries in Java. So if you have, and the strategy I document in this blog post is completely wrong, please comment and point me towards a better one!

Anyways let’s get started!

We take a very simple case where we have a C++ class like this:


// ctestsimple.h
class CTestSimple {
public:
    virtual int test();
    int test_delegate();
};

// ctestsimple.cpp
#include "ctestsimple.h"

int CTestSimple::test() {
    return 100;
}

int CTestSimple::test_delegate() {
    return test();
}

and in Java we want to do the following things:

import testjni.TestSimple;

public class TestApp {
	public static void main(String[] args) {

		System.out.println("===== Simple");
		TestSimple t = new TestSimple();
		System.out.println("Direct: " + t.test());
		System.out.println("Delegate: " + t.test_delegate());

		System.out.println("===== No Override");
		NoOverride tn = new NoOverride();
		System.out.println("Direct: " + tn.test());
		System.out.println("Delegate: " + tn.test_delegate());

		System.out.println("===== With Override");
		SubTestSimple ts = new SubTestSimple();
		System.out.println("Direct: " + ts.test());
		System.out.println("Delegate: " + ts.test_delegate());

		System.out.println("===== With Override & super");
		SubSuperTestSimple tss = new SubSuperTestSimple();
		System.out.println("Direct: " + tss.test());
		System.out.println("Delegate: " + tss.test_delegate());
	}

	static class SubTestSimple extends TestSimple {
		public int test() {
			return 0;
		}
	}

	static class SubSuperTestSimple extends TestSimple {
		public int test() {
			return super.test() + 10;
		}
	}

	static class NoOverride extends TestSimple {
	}
}

We expect the application to print:

===== Simple
Direct: 100
Delegate: 100
===== No Override
Direct: 100
Delegate: 100
===== With Override
Direct: 0
Delegate: 0
===== With Override & super
Direct: 110
Delegate: 110

The strategy I found to make this work looks like this:

  1. On the C++ side one needs to define a subclass of CTestSimple, which I named JTestSimple:
    // jtestsimple.h
    #ifndef JTESTSIMPLE_H
    #define JTESTSIMPLE_H

    #include "ctestsimple.h"
    #include <jni.h>

    class JTestSimple : public CTestSimple {
        jobject jObject;
        JNIEnv* env;
        jmethodID jTest;
    public:
        JTestSimple(JNIEnv *, jobject, jboolean derived);
        virtual int test();
    };

    #endif // JTESTSIMPLE_H

    // jtestsimple.cpp
    #include "jtestsimple.h"
    #include <iostream>
    #include <jni.h>

    JTestSimple::JTestSimple(JNIEnv * env, jobject o, jboolean derived) {
        this->jObject = env->NewGlobalRef(o);
        this->env = env;
        this->jTest = 0;
        if( derived ) {
            jclass cls = env->GetObjectClass(this->jObject);
            jclass superCls = env->GetSuperclass(cls);
            jmethodID baseMethod = env->GetMethodID(superCls, "test", "()I");
            jmethodID custMethod = env->GetMethodID(cls, "test", "()I");
            if( baseMethod != custMethod ) {
                this->jTest = custMethod;
            }
        }
    }

    int JTestSimple::test() {
        if( this->jTest != 0 ) {
            return env->CallIntMethod(this->jObject, this->jTest);
        }
        return CTestSimple::test();
    }

    The important step is to check whether the Java object we got passed has overridden the test() method, because in that case we need to call from C++ into Java to get the correct result.

  2. On the Java side we define our object like this
    package testjni;

    public class TestSimple {
    	private long NativeObject;

    	public TestSimple() {
    		NativeObject = createNativeInstance();
    	}

    	protected long createNativeInstance() {
    		return Native_new(getClass() != TestSimple.class);
    	}

    	public int test() {
    		if( getClass() == TestSimple.class ) {
    			return Native_test();
    		} else {
    			return Native_test_explicit();
    		}
    	}

    	public final int test_delegate() {
    		return Native_test_delegate();
    	}

    	private native long Native_new(boolean subclassed);
    	private native int Native_test();
    	private native int Native_test_explicit();
    	private native int Native_test_delegate();
    }

    Some remarks are probably needed:

    • Notice that there are two native delegates for test(), depending on whether the method is invoked on a subclass or not
    • test_delegate() is final because it is non-virtual on the C++ side and hence not subject to overriding in subclasses
  3. On the JNI-Side
    #include <jni.h>
    #include <stdio.h>
    #include <iostream>

    #include "testjni_TestSimple.h"
    #include "jtestsimple.h"

    JNIEXPORT jint JNICALL Java_testjni_TestSimple_Native_1test
      (JNIEnv * env, jobject thiz) {
        jclass cls = env->GetObjectClass(thiz);
        jfieldID id = env->GetFieldID(cls, "NativeObject", "J");
        jlong pointer = env->GetLongField(thiz, id);
        JTestSimple* obj = (JTestSimple*)pointer;
        return obj->test();
    }

    JNIEXPORT jint JNICALL Java_testjni_TestSimple_Native_1test_1explicit
      (JNIEnv * env, jobject thiz) {
        jclass cls = env->GetObjectClass(thiz);
        jfieldID id = env->GetFieldID(cls, "NativeObject", "J");
        jlong pointer = env->GetLongField(thiz, id);
        JTestSimple* obj = (JTestSimple*)pointer;
        return obj->CTestSimple::test();
    }

    JNIEXPORT jint JNICALL Java_testjni_TestSimple_Native_1test_1delegate
      (JNIEnv * env, jobject thiz) {
        jclass cls = env->GetObjectClass(thiz);
        jfieldID id = env->GetFieldID(cls, "NativeObject", "J");
        jlong pointer = env->GetLongField(thiz, id);
        JTestSimple* obj = (JTestSimple*)pointer;
        return obj->test_delegate();
    }

    JNIEXPORT jlong JNICALL Java_testjni_TestSimple_Native_1new
      (JNIEnv * env, jobject thiz, jboolean derived) {
        jlong rv = (jlong)new JTestSimple(env, thiz, derived);
        return rv;
    }

    The only interesting thing is Java_testjni_TestSimple_Native_1test_1explicit, which does not invoke the test method on JTestSimple but on its superclass CTestSimple.
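As a side note, the "has the subclass overridden test()?" check that JTestSimple performs by comparing jmethodIDs can be illustrated with plain Java reflection (an analogy only, not part of the JNI code above):

```java
import java.lang.reflect.Method;

public class OverrideCheck {

	public static class Base { public int test() { return 100; } }
	public static class Sub extends Base { public int test() { return 0; } }
	public static class NoOverride extends Base { }

	// Analogous to comparing GetMethodID(cls, ...) with GetMethodID(superCls, ...):
	// if the declaring class of test() is not Base, a subclass has overridden it.
	static boolean overridesTest(Class<? extends Base> cls) throws NoSuchMethodException {
		Method m = cls.getMethod("test");
		return m.getDeclaringClass() != Base.class;
	}

	public static void main(String[] args) throws Exception {
		System.out.println(overridesTest(Base.class));       // false
		System.out.println(overridesTest(Sub.class));        // true
		System.out.println(overridesTest(NoOverride.class)); // false
	}
}
```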

So, my dear C++/Java gurus, how bad / dumb / … is this strategy?

by Tom Schindl at November 18, 2014 10:12 PM