Say hello to JEP 220

by Andrey Loskutov at November 23, 2014 11:08 PM

Java tool makers, unite: be aware of the upcoming changes in Java 9!

This Java version will introduce a new file format for class file containers and change the entire JDK installation directory layout; much more importantly, rt.jar / tools.jar will be gone. Instead, we will have a new "jrt:/" Java runtime file system implementation, modules and 3 (or more) .jimage files. The changes are described in detail in JEP 220 [1], you can already download Java 9 early access builds containing the Jigsaw changes [2], and you can participate in the discussion of various aspects via the jigsaw-dev mailing list [3].

Why is this important, and why should you care?

JDK 9 will be the first JDK whose classes cannot be inspected with "standard" tooling (emacs, zip, etc.), because it introduces a new binary container format (jimage) which replaces the old "jar" (zip) container format.

JDK 9 will be the first JDK whose classes cannot be found without the help of the JDK itself (or new tooling using the new "jrt:/" Java runtime file system, see [4]). JDK 9 changes the old "jars/packages" hierarchy by adding a new layer to it: modules. The JDK classes will not only be stored in the new jimage files under JAVA_HOME/lib/modules/, they will also belong to different modules, whose names have to be known before accessing class files via "jrt:/".

For example, the java/lang/Class.class "file", previously located inside JAVA_HOME/jre/lib/rt.jar, will be physically stored inside JAVA_HOME/lib/modules/bootmodules.jimage, will be "logically" located in the "java.base" module, and so will be accessible at runtime via the "jrt:/java.base/java/lang/Class.class" URL:

$ java9/bin/jimage list ../lib/modules/bootmodules.jimage | grep java/lang/Class.class

$ java9/bin/jdeps -module java.lang.Class
java.base -> java.base
   java.lang (java.base)

Right now there is no way for 3rd-party tooling to look up a single .class file by its name in the new jimage files without first traversing and enumerating the entire "jrt:/" file system via the jigsaw-jrtfs tool [4], because the "package to module" association is not yet available to 3rd-party tools in JDK 9. The discussion of how to look up a .class file from JDK 9 from tools running on older JDKs is ongoing [5].
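As an aside for readers on a final Java 9 (or newer) build: the "jrt:/" file system can be queried through plain NIO. This is only a sketch against the API as it eventually shipped (where FileSystem paths carry a /modules prefix), not against the early-access builds discussed here:

```java
import java.net.URI;
import java.nio.file.FileSystem;
import java.nio.file.FileSystems;
import java.nio.file.Files;
import java.nio.file.Path;

public class JrtLookup {
    public static void main(String[] args) {
        // The running JDK exposes its own classes through the built-in "jrt:/" file system.
        FileSystem jrt = FileSystems.getFileSystem(URI.create("jrt:/"));
        // In the final layout, FileSystem paths are prefixed with /modules/<module-name>.
        Path clazz = jrt.getPath("/modules/java.base/java/lang/Class.class");
        System.out.println(Files.exists(clazz));
    }
}
```

Note that this only works from within a Java 9+ runtime; the problem described above, looking up classes from tools running on older JDKs, remains.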


The changes proposed in JEP 220 will most likely break all low-level, .class file / bytecode oriented tools like FindBugs. If you are the maker of such a tool, plan for a change now, read the specs and discuss open issues on jigsaw-dev before it is too late.



Building P2 Repository using Maven Tycho

by Its_Me_Malai at November 22, 2014 09:56 AM


Step 1. Create a General Project using File > New > Other... > General > Project.

Step 2. Convert the General Project to a Maven Project
a. Right-click on the project > Select Configure > Convert to Maven Project

Step 3. The wizard to create the pom.xml will open. Configure it as mentioned below:

a. Change the version from 0.0.1-SNAPSHOT to 1.0.0-SNAPSHOT
b. Change Packaging from jar to eclipse-repository.

Step 4. Add the repository project to the parent pom.xml as a module

a. Click on the Add... button in the Modules section

b. Select the modules to include and don't forget to check the option “Update POM parent section in selected projects.”
c. Open the module pom.xml to see the parent information added in the pom file
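Putting steps 3 and 4 together, the repository project's pom.xml might look roughly like this (the group/artifact ids are illustrative; the Tycho plugins themselves are configured in the parent pom):

```xml
<project xmlns="">
  <modelVersion>4.0.0</modelVersion>

  <!-- parent section added by the "Update POM parent section" option -->
  <parent>
    <groupId>com.example.tycho</groupId>
    <artifactId>com.example.parent</artifactId>
    <version>1.0.0-SNAPSHOT</version>
  </parent>

  <artifactId>com.example.repository</artifactId>
  <!-- eclipse-repository packaging turns the category.xml into a P2 repository -->
  <packaging>eclipse-repository</packaging>
</project>
```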

Step 5. Right-click on the repository project and select Maven > Update Project to fix the errors on the project.

Step 6. Create a Category Definition file from File > New > Other... > Category Definition. In this Category Definition, create the category and add your feature project. Having the Category Definition file at the root of the repository project is mandatory for generating the P2 repository.

Step 7. Now run the parent pom.xml via Right-click > Run As > Maven Install. The generated P2 repository ends up in the repository project's target/repository folder.


Bundle JRE along with your Product using Maven Tycho

by Its_Me_Malai at November 22, 2014 09:27 AM


While building a product, we often want to bundle the JRE along with it. The normal Eclipse PDE build takes care of this if we enable “Bundle JRE for this environment with the product” in the Launching tab of your product configuration file.

But when building the product using Tycho, this configuration is not considered. Therefore, enabling or disabling this checkbox in your product configuration doesn’t really help in bundling the JRE along with your product.

Tycho uses the root files concept of PDE Build. To read more on root files, you can search for the PDE Build documentation within the Eclipse help.

Steps to be followed to include JRE during Maven Tycho Build

1. Copy the required JRE into a sub-folder inside your Feature Project.

For example, you could add 3 folders to the feature project, for linux-64bit, windows-64bit and windows-32bit respectively, and copy the respective JRE into each of these folders.

2. In the of your Feature Project you need to add lines like the following:

root.win32.win32.x86_64 = win32-64

Therefore the of your Feature Project contains one such root.<os>.<ws>.<arch> entry per platform.
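Assuming the three folders mentioned above, the complete set of entries might look like this (the folder names are illustrative and must match the folders in your feature project):

```properties
root.linux.gtk.x86_64 = linux-64
root.win32.win32.x86_64 = win32-64
root.win32.win32.x86 = win32-32
```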

3. Now run your parent pom to build your plugins, feature and product. During the product build, the JRE folder matching the platform OS/arch will be copied from the feature project to the desired location alongside your product executable.

For example, if you are building for linux.gtk.x86_64, the jre folder placed inside linux-64 of your Feature Project will be copied to the corresponding product directory.

References :

1. PDE Build Advanced Topics > Adding files to the root of the Build
2. Including a JRE in a tycho build by Simon
3. Eclipse Forum :


Building Multiple Projects using Maven Tycho

by Its_Me_Malai at November 22, 2014 04:30 AM


Step 1. Create a General Project using File > New > Other... > General > Project.
Step 2. Convert the General Project to a Maven Project
a. Right-click on the project > Select Configure > Convert to Maven Project

Step 3. The wizard to create the pom.xml will open. Configure it as mentioned below:
a. Change the version from 0.0.1-SNAPSHOT to 1.0.0-SNAPSHOT
b. Change Packaging from jar to pom.

c. This project with packaging set to pom is called the parent project. Its pom.xml lists the modules to build; each project is treated as a module in Maven. We need to add modules to this pom to build multiple projects through the Maven Tycho build. The parent pom.xml can also hold all the general configuration, like repository URLs, required Maven plugins, etc.
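As a sketch, such a parent pom.xml might look like this (the ids, version and module names are illustrative):

```xml
<project xmlns="">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example.tycho</groupId>
  <artifactId>com.example.parent</artifactId>
  <version>1.0.0-SNAPSHOT</version>
  <packaging>pom</packaging>

  <!-- each Eclipse project to build is listed as a module -->
  <modules>
    <module>com.example.plugin</module>
    <module>com.example.feature</module>
  </modules>

  <!-- enable the Tycho build extension for all modules -->
  <build>
    <plugins>
      <plugin>
        <groupId>org.eclipse.tycho</groupId>
        <artifactId>tycho-maven-plugin</artifactId>
        <version>0.21.0</version>
        <extensions>true</extensions>
      </plugin>
    </plugins>
  </build>
</project>
```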

Step 4. Adding Modules to pom.xml

a. Click on the Add... button in the Modules section

b. Select the modules to include and don't forget to check the option “Update POM parent section in selected projects.”
c. Open the module pom.xml to see the parent information added in the pom file

Step 5. Run the parent pom.xml and you will see that the modules added to the main pom.xml are built as well.


Fabulous Forge Fun

by kaers at November 21, 2014 01:10 PM

While JBoss Forge has been included in JBoss Tools and Developer Studio for a while now, the recent JBoss Tools 4.2 and Developer Studio 8.0 releases contain support for Forge 2.12.1. Forge has now become a mature component inside JBoss Tools and Developer Studio, with a plethora of available commands that can be executed either using the convenient wizard style or using the integrated Forge console.

Wise Wizards

To use the wizards, you need to issue the Ctrl+4 key combination (or Cmd+4 on a Mac). It will bring up a popup containing all the commands that are available in the currently selected context.

forge command popup

If this key combination does not work, chances are high that you are using a keyboard layout such as AZERTY. Don’t panic, in this case it is just a matter of redefining the key combination to something that works on your hardware. Use the Eclipse preferences dialog (Window→Preferences or something similar) to do this.

forge keys prefs

Selecting the 'Project: new' entry of the popup will open the Forge wizard that allows you to create a new project.

forge project new wizard

Core Console

If you are a console aficionado and typing and code completion are more your thing, you can use the integrated Forge console. One way to bring up the console is to type 'Forge Console' in the 'Quick Access' text field in the toolbar of your Eclipse workbench and then select the 'Forge Console' entry that is visible in the popup.

forge quick access

Another way would be to select 'Window→Show View→Forge Console' from the main menu bar. This will only be possible when you are using the 'JBoss' perspective. In another perspective, you can use 'Window→Show View→Other…​' and then navigate to the 'Forge→Forge Console' entry.

forge show view

In the console you can type Forge commands to perform the same tasks as if you were using the wizard-style approach.

forge project new console

Chatty Cheatsheet

If you are new to Forge, this new feature will certainly help you on the way. And as a bonus it will also help you with learning to develop an HTML5 application with REST, CDI and AngularJS. You can open this brand new cheatsheet by importing the 'AngularJS Forge' example from the 'JBoss Central' page.

forge angularjs central

In case the 'JBoss Central' page is not open or you inadvertently closed it, no worries. You can always bring it back by clicking the 'JBoss Central' icon in your Eclipse toolbar.

forge jboss central

After clicking the 'AngularJS Forge' example hyperlink, a wizard will guide you through the import process. Just accept all the defaults. When the wizard is done, a popup will ask you if you want to open the cheatsheet.

forge angularjs import

Make sure you keep the open option checked when you click the 'Finish' button. Now the cheatsheet will reveal itself in all its glory.

forge angularjs cheatsheet

There is no point in repeating here all the information that is available in the cheatsheet, but in short, it will provide you with some general information and then guide you through the development, deployment and execution of an HTML5 application. You will see different examples of using Forge commands, such as REST endpoint generation and scaffolding an AngularJS user interface. Check it out!

Happy Forging,
Koen Aers


CDT in the New World

by Doug Schaefer at November 19, 2014 05:12 PM

An interesting thing happened the other day which I think shakes some of the foundation I’ve had in my mind about the CDT. My vision was always for CDT to support C, C++, and related languages on every platform. And that has always included Windows. Mind you, we’ve never really done a great job there but we do have lots of users using it with Cygwin or MinGW and you can argue how well they support Windows application development.

The big news was Microsoft’s Visual Studio Community Edition, which is a free version of their complete IDE for individuals and small teams. This is different from their Express editions, which were severely limited, including lack of 64-bit support and the inability to install plug-ins into the IDE. No, the Community edition is the full thing and is a pretty important step for Microsoft as they try to keep developer interest in their platform.

As long as Microsoft charged money for Visual Studio, I always thought CDT had a play. I’ve done a little work on supporting integration with their Windows SDK (which includes a 64-bit compiler), at least for build. Debug was always a big hurdle, since you’d need to write a complete debugger integration that is very different from gdb. So, really, this integration barely worked. And now that VS is essentially free to the market we serve, there’s little point in continuing that effort. The same is essentially true of Objective-C support with Xcode on the Mac, which even Apple is drifting away from to focus on their new language, Swift.

But you know, despite giving up on a dream, I think CDT has an important role to play in its field of strength, i.e. support for the GNU toolchain, especially as used for embedded development, with its new-found focus as a pillar of the Internet of Things. As I’ve talked about on Twitter and in the CDT forums, I am working on a CDT integration to support building C++ apps for Arduino. Once that’s done, I’ll shift focus to a more complicated environment around the Raspberry Pi, for which I hope to add better remote debug support to CDT. Yes, I know everyone loves to talk about using scripting languages on the Pi, but I want to expose these hobbyists to the real world of embedded, which is almost exclusively C and C++, and get them started using CDT.

I still haven’t given up on the desktop, at least desktop support with Qt. I really want to make CDT a first class alternative for Qt development. We have a good start on it and will continue with things like a project file editor, easier setup of builds, and import of projects from Qt Creator. The great thing about Qt is that it also is truly cross platform, allowing you to write your app to run on the desktop and then migrate it to your embedded device, such as the Raspberry Pi. And supporting both of these with the same project is one thing CDT is pretty good at, or at least will be once we fix up the UX a bit.

In the past, much like the rest of the Eclipse IDE, I think we on the CDT have focused a lot, almost too much, on being a good platform. While vendors have done a great job at bringing the CDT to their customers writing for their platforms, including my employer :), aside from the Linux community, few of us work on supporting the direct user of the Eclipse C/C++ IDE. I think focusing on CDT’s areas of strength, targeting specific open environments, and not chasing dreams will help me contribute to the CDT in ways that will actually help the most people. And that’s most satisfying anyway.


Maven improvements in JBoss Tools 4.2 and Developer Studio 8.0

by fbricon at November 19, 2014 12:51 PM

In JBoss Tools and JBoss Developer Studio, we’re continuously working to augment the Maven integration experience in Eclipse. Some of the features we’ve been playing with, if deemed successful, will eventually be contributed back to m2e, like the Maven Profile Management UI. Others, more centered around JBoss technologies, will stay under the JBoss Tools umbrella.

JBoss Tools 4.2 and Developer Studio 8, based on Eclipse Luna, take advantage of all the nice improvements made to m2e 1.5.0 and then add some more:

m2eclipse-egit integration

The Import > Checkout Maven Projects from SCM wizard doesn’t have any SCM provider by default, which can be pretty frustrating at times. With Git becoming the new de facto source control system, it only made sense to make m2eclipse-egit the sensible default SCM provider for m2e.

m2eclipse-egit will now be automatically installed when a JBoss Maven integration feature is installed from the JBoss Tools update site.

It is installed by default with JBoss Developer Studio 8.0.0 as well.

Maven Central Archetype catalog

Since m2e 1.5.0 no longer downloads Nexus Indexes by default, a very small, outdated subset of Maven Archetypes is available out of the box.

To mitigate that, the JBoss Tools Maven integration feature now registers the Maven Central Archetype catalog by default, providing more than 9600 archetypes to choose from when creating a new Maven project. Accessing the complete list of archetypes is even way, waayyyy faster (a couple of seconds) than relying on the old Nexus index download.

maven central catalog

Pom properties-controlled project configurators

JBoss project configurators for m2e now support an activation property in the <properties> section of pom.xml. Expected values are true/false and override the workspace-wide preferences found under Preferences > JBoss Tools > JBoss Maven Integration.

Available properties are:

  • <m2e.cdi.activation>true</m2e.cdi.activation> for the CDI Project configurator,

  • <m2e.seam.activation>true</m2e.seam.activation> for the Seam Project configurator,

  • <m2e.hibernate.activation>true</m2e.hibernate.activation> for the Hibernate Project configurator,

  • <m2e.portlet.activation>true</m2e.portlet.activation> for the Portlet Project configurator.

The pom.xml editor also provides matching XML templates for these properties when pressing Ctrl+Space in the <properties> section.
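For example, a project could force-enable the CDI configurator and force-disable the Hibernate one, overriding the workspace preferences (the combination shown is just an illustration):

```xml
<properties>
  <m2e.cdi.activation>true</m2e.cdi.activation>
  <m2e.hibernate.activation>false</m2e.hibernate.activation>
</properties>
```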

Maven Repository wizard improvements

The Configure Maven Repositories wizard, available under Preferences > JBoss Tools > JBoss Maven Integration, saw a couple of improvements as well:

Advanced options for maven repositories

You can now choose the repository layout, enable/disable snapshots or releases, and change the update policy in the advanced section:

maven repository advanced

Automatically identify local Maven repositories

When adding a new Maven repository, you can scan for JBoss Maven repositories unzipped locally with the Recognize JBoss Maven Enterprise Repositories…​ button, in order to automatically add them to your .m2/settings.xml.

recognize maven repo

The identification process now looks for a .maven-repository file at the root of the folder you selected. This file follows the .properties file format and is expected to contain up to 3 attributes:

  • repository-id: the repository id

  • name: a (descriptive) repository name. Optional, defaults to repository-id

  • profile-id: the profile id the repository will be activated from. Optional, defaults to repository-id

As a concrete example, the JBoss Mobile repository .maven-repository file would contain:

      name=JBoss Mobile Maven Repository
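For comparison, a .maven-repository file using all three attributes might look like this (the id values here are purely illustrative):

```properties
repository-id=jboss-mobile-maven-repository
name=JBoss Mobile Maven Repository
profile-id=jboss-mobile-maven-repository
```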

What’s next?

Tired of seeing these "Project configuration is out-of-date" errors whenever you tweak your pom.xml? We’re currently playing with a plugin that will automatically run Maven > Update project configuration for you. You can try a very alpha version from the following p2 repository: Let us know if/how it works for you, so we can decide what to do next with it.

Enjoy and see you soon!

Fred Bricon


Flexible calendar control in SWT

by Tom Schindl at November 19, 2014 10:22 AM

I’ve been in need of a week-calendar view to display appointments but could not find one that completely fit my needs, so I implemented a custom one.

This short screencast shows what I’ve got today.

The control is backed by a replaceable service interface to retrieve and store appointments, so it can be connected to almost any kind of backend store. For my needs a plain XMI store was enough, but anyone who wants something more fancy can most likely connect it to any calendar service, e.g. by using REST.
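To illustrate the idea, here is a minimal, self-contained sketch of what such a replaceable service interface could look like; all names are hypothetical and not the actual API of the control:

```java
import java.time.LocalDateTime;
import java.util.ArrayList;
import java.util.List;
import java.util.SortedMap;
import java.util.TreeMap;

// Hypothetical service interface the calendar control would talk to.
interface AppointmentService {
    List<String> appointmentsBetween(LocalDateTime from, LocalDateTime to);
    void store(String title, LocalDateTime start);
}

// A trivial in-memory backend, standing in for e.g. an XMI store or a REST client.
class InMemoryAppointmentService implements AppointmentService {
    private final SortedMap<LocalDateTime, String> data = new TreeMap<>();

    public void store(String title, LocalDateTime start) {
        data.put(start, title);
    }

    public List<String> appointmentsBetween(LocalDateTime from, LocalDateTime to) {
        // subMap gives us the appointments in [from, to), already sorted by start time
        return new ArrayList<>(data.subMap(from, to).values());
    }
}

public class CalendarDemo {
    public static void main(String[] args) {
        AppointmentService service = new InMemoryAppointmentService();
        service.store("Standup", LocalDateTime.of(2014, 11, 19, 9, 0));
        service.store("Review", LocalDateTime.of(2014, 11, 21, 14, 0));
        // A week-calendar control would query one week at a time:
        System.out.println(service.appointmentsBetween(
            LocalDateTime.of(2014, 11, 17, 0, 0),
            LocalDateTime.of(2014, 11, 24, 0, 0)));
    }
}
```

Swapping the backend then only means providing another implementation of the interface.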


Why should I contribute?

by Jérémie Bresson at November 19, 2014 09:29 AM

One of the opening slides of our “Eclipse Scout Contributor” training looks like this:


There is nothing new in this slide. It just re-states an aspect of the open-source mechanic.

In our opinion the benefits of contributing to open source are often not fully understood. That is why we would like to share some of our thoughts.

Quite often people associate open source with the notion of free of charge and “free riding”. But this is not necessarily a negative thing. Bruce Perens writes about this topic in his text “What Effect Does The Free-Rider Problem Have Upon Open Source?”.

Essentially, every open-source user starts out as a free-rider, and this is by design. The interesting question is what happens after the initial adoption. Once a user (or an organisation) adopts a software project, their interest in the success of the project grows too. This results in an interest to protect the initial investment of the user/organisation. One of the obvious approaches is to help increase the success of the software project. And this is the point where the user/organisation can take advantage of the fact that they decided for open-source software.

A nice aspect of the open-source model is that there are many opportunities to become a part of the success of your favorite project. Of course, some of the options involve spending money, as in the case of closed source software. But even more opportunities exist that are based on the open-source nature of the software project and only require some of your experience, know-how and time.

Why should you care? Contributing to an open-source project can be rewarding in many ways and is most often in your best interest. Here are some reasons why you might want to contribute to the Eclipse Scout framework:

  • Prevent Work-arounds in Projects: Fix a bug, report it and provide a patch. Over and over it can be shown that fixing bugs at the source (in the open-source code) is most efficient. Alternatively, you may privately implement a work-around in your project. This is often the more expensive approach. First, you cover the costs to analyze, reproduce and work around the bug. Then, you need to apply the work-around to all of your projects. And over time, you need to maintain your patch/work-around to work with the evolving framework. And when somebody else fixes “your” bug, you should remove all your work-arounds in all projects. So: reporting a bug increases the chance that it will be solved, and providing a good-quality patch (including a test) actually reduces your workload over time.
  • Help the Scout Team to be more effective: The Scout code base is 550k LOC. This is pretty large. And, as in every large project, there are bugs (and fixing bugs does not usually reduce the size of the project). So, if you report bugs in Bugzilla you are already helping Scout. And pushing a patch to our Gerrit instance can help the Scout developers to be more productive and have more bugs fixed earlier. Which in turn improves your productivity as an application developer.
  • Help the Scout Community: The Community plays a central role. It helps you get up to speed with Scout and lets you profit from the past investments of others. Once you adopt a framework such as Scout, you can protect your investment by making Scout more popular and more successful. This is the point when you become an integral part of the Community and your interests start to align nicely with those of the community. Remember: an open-source project is all about sharing. Helping others in the Scout forum with challenges you have mastered often does not take a lot of time. But it can save the other end hours or days.
  • Get a better understanding of Scout: Contributing to Scout significantly increases your understanding of the framework. And you might find additional benefits: Increase your know-how with current tooling such as git/gerrit, improve your technical skill-set and grow your professional network by interacting with other community members. These impressions are also backed by a recent survey of the Linux foundation. And for all this you can accumulate a public track record that demonstrates your work and your skills.

If you are looking for a starting point, we have a contribution page. By the way, helping the community by answering a question in the forum is also a valuable contribution to the project (even if it isn’t mentioned in our Contribution page yet).

If you are interested, do not hesitate to contact us. You can start a discussion in the forum or ask for help if you are blocked with something. Depending on the subject you choose, the barrier to entry can be high. We will dedicate time to help you until you manage to realize the contribution you want.

And to come back to the initial Scout contributor training: we found visual confirmation on Open HUB that contributing is possible and not all too hard.


Scout Links

Project Home, Forum, Wiki, Twitter


Strategy of wrapping C++ libraries in Java

by Tom Schindl at November 18, 2014 10:12 PM

This blog post is more of a question to others (probably much more knowledgeable people than me) who have already wrapped C++ libraries in Java. So if you have, and the strategy I document in this blog post is completely wrong, please comment and point me towards a better one!

Anyways let’s get started!

We take a very simple case where we have a C++ class like this:


class CTestSimple {
public:
    virtual int test();
    int test_delegate();
};

#include "ctestsimple.h"

int CTestSimple::test() {
    return 100;
}

int CTestSimple::test_delegate() {
    return test();
}

and in Java we want to do the following things:

import testjni.TestSimple;

public class TestApp {
	public static void main(String[] args) {

		System.out.println("===== Simple");
		TestSimple t = new TestSimple();
		System.out.println("Direct: " + t.test());
		System.out.println("Delegate: " + t.test_delegate());

		System.out.println("===== No Override");
		NoOverride tn = new NoOverride();
		System.out.println("Direct: " + tn.test());
		System.out.println("Delegate: " + tn.test_delegate());

		System.out.println("===== With Override");
		SubTestSimple ts = new SubTestSimple();
		System.out.println("Direct: " + ts.test());
		System.out.println("Delegate: " + ts.test_delegate());

		System.out.println("===== With Override & super");
		SubSuperTestSimple tss = new SubSuperTestSimple();
		System.out.println("Direct: " + tss.test());
		System.out.println("Delegate: " + tss.test_delegate());
	}

	static class SubTestSimple extends TestSimple {
		public int test() {
			return 0;
		}
	}

	static class SubSuperTestSimple extends TestSimple {
		public int test() {
			return super.test() + 10;
		}
	}

	static class NoOverride extends TestSimple {
	}
}


We expect the application to print:

===== Simple
Direct: 100
Delegate: 100
===== No Override
Direct: 100
Delegate: 100
===== With Override
Direct: 0
Delegate: 0
===== With Override & super
Direct: 110
Delegate: 110

The strategy I found to make this work looks like this:

  1. On the C++ side one needs to define a subclass of CTestSimple, which I named JTestSimple:

    #ifndef JTESTSIMPLE_H
    #define JTESTSIMPLE_H

    #include "ctestsimple.h"
    #include <jni.h>

    class JTestSimple : public CTestSimple {
        jobject jObject;
        JNIEnv* env;
        jmethodID jTest;
    public:
        JTestSimple(JNIEnv *, jobject, jboolean derived);
        virtual int test();
    };

    #endif // JTESTSIMPLE_H

    #include "jtestsimple.h"
    #include <iostream>
    #include <jni.h>

    JTestSimple::JTestSimple(JNIEnv * env, jobject o, jboolean derived) {
        this->jObject = env->NewGlobalRef(o);
        this->env = env;
        this->jTest = 0;
        if( derived ) {
            jclass cls = env->GetObjectClass(this->jObject);
            jclass superCls = env->GetSuperclass(cls);
            jmethodID baseMethod = env->GetMethodID(superCls,"test","()I");
            jmethodID custMethod = env->GetMethodID(cls,"test","()I");
            if( baseMethod != custMethod ) {
                this->jTest = custMethod;
            }
        }
    }

    int JTestSimple::test() {
        if( this->jTest != 0 ) {
            return env->CallIntMethod(this->jObject,this->jTest);
        }
        return CTestSimple::test();
    }

    The important step is to check whether the Java object we got passed has overridden the test() method, because in that case we need to call from C++ into Java to get the correct result.

  2. On the Java side we define our object like this:

    package testjni;

    public class TestSimple {
    	private long NativeObject;

    	public TestSimple() {
    		NativeObject = createNativeInstance();
    	}

    	protected long createNativeInstance() {
    		return Native_new(getClass() != TestSimple.class);
    	}

    	public int test() {
    		if( getClass() == TestSimple.class ) {
    			return Native_test();
    		} else {
    			return Native_test_explicit();
    		}
    	}

    	public final int test_delegate() {
    		return Native_test_delegate();
    	}

    	private native long Native_new(boolean subclassed);
    	private native int Native_test();
    	private native int Native_test_explicit();
    	private native int Native_test_delegate();
    }

    Some remarks are probably needed:

    • Notice that there are 2 native delegates for test(), depending on whether the method is invoked on a subclass or not
    • test_delegate() is final because it is non-virtual on the C++ side and hence not subject to overriding in subclasses
  3. On the JNI side:

    #include <jni.h>
    #include <stdio.h>
    #include <iostream>
    #include "testjni_TestSimple.h"
    #include "jtestsimple.h"

    JNIEXPORT jint JNICALL Java_testjni_TestSimple_Native_1test
      (JNIEnv * env, jobject thiz) {
        jclass cls = env->GetObjectClass(thiz);
        jfieldID id = env->GetFieldID(cls,"NativeObject","J");
        jlong pointer = env->GetLongField(thiz,id);
        JTestSimple* obj = (JTestSimple*)pointer;
        return obj->test();
    }

    JNIEXPORT jint JNICALL Java_testjni_TestSimple_Native_1test_1explicit
      (JNIEnv * env, jobject thiz) {
        jclass cls = env->GetObjectClass(thiz);
        jfieldID id = env->GetFieldID(cls,"NativeObject","J");
        jlong pointer = env->GetLongField(thiz,id);
        JTestSimple* obj = (JTestSimple*)pointer;
        return obj->CTestSimple::test();
    }

    JNIEXPORT jint JNICALL Java_testjni_TestSimple_Native_1test_1delegate
      (JNIEnv * env, jobject thiz) {
        jclass cls = env->GetObjectClass(thiz);
        jfieldID id = env->GetFieldID(cls,"NativeObject","J");
        jlong pointer = env->GetLongField(thiz,id);
        JTestSimple* obj = (JTestSimple*)pointer;
        return obj->test_delegate();
    }

    JNIEXPORT jlong JNICALL Java_testjni_TestSimple_Native_1new
      (JNIEnv * env, jobject thiz, jboolean derived) {
        jlong rv = (jlong)new JTestSimple(env, thiz, derived);
        return rv;
    }

    The only interesting thing is Java_testjni_TestSimple_Native_1test_1explicit, which does not invoke the test method on JTestSimple but on its super class CTestSimple.

So my dear C++/Java gurus, how bad / dumb / … is this strategy?


New Eclipse JUnit Feature: Run Subtrees of Tests Individually, e.g. from Parameterized Tests

by Moritz Eysholdt at November 18, 2014 01:58 PM

In Eclipse, the JUnit view nicely visualizes execution status and results of JUnit tests. JUnit tests usually are Java classes with test methods or Java classes annotated with @SuiteClasses, which compose multiple other test classes or suites.

But JUnit is more powerful than that. It allows you to implement custom test runners. Those test runners are responsible for creating a tree structure of test descriptions. Later, the runner needs to execute all leaves of that tree as test cases.

A node in this tree of tests is not necessarily backed by a Java class, as it is for test suites. It can be something purely virtual. JUnit itself ships with a test runner for which this is true: org.junit.runners.Parameterized.

The screenshot above shows a Parameterized test on the left-hand side and the Eclipse JUnit view (after the test has been executed) on the right-hand side. The method parameters() returns a list of test data sets, and the Parameterized runner will call the constructor of this test class for each data set. Consequently, the number of constructor parameters must match the number of items in each data set. For each data set, all test methods are executed, and what we get, effectively, is a matrix test.

In the screenshot of the JUnit view we can see how the Parameterized runner presents the test matrix as a tree: The test class itself is the root node, every data set is a subtree and every entry of the test matrix is a leaf.
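The shape of that tree can be sketched without JUnit at all; the following self-contained snippet (plain Java, all names made up) enumerates the same class -> data set -> method structure that the Parameterized runner builds:

```java
import java.util.Arrays;
import java.util.List;

public class MatrixSketch {
    public static void main(String[] args) {
        // Each int[] stands for one data set returned by parameters().
        List<int[]> dataSets = Arrays.asList(
            new int[]{0, 0}, new int[]{1, 1}, new int[]{2, 4});
        // Each name stands for one test method of the class.
        List<String> testMethods = Arrays.asList("testSquare", "testNonNegative");

        for (int[] data : dataSets) {              // one subtree per data set (a matrix row)
            for (String method : testMethods) {    // one leaf per test method (a matrix column)
                System.out.println(method + "[" + data[0] + "," + data[1] + "]");
            }
        }
    }
}
```

Running a subtree corresponds to fixing the outer loop to one data set; running a single method corresponds to fixing the inner loop to one name.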

As of now (Eclipse Mars M4), it is possible to run individual subtrees by choosing "run" from the context menu. This effectively means to execute a row from the test matrix.

Additionally, it is now possible to run a single column from the test matrix by running a single method from a Parameterized test.

Note that in the picture on the left-hand-side I'm explicitly clicking on the method's name. Clicking elsewhere would execute all test methods from the class.

The new mechanism to filter down test description trees can achieve this without being specific to JUnit's Parameterized runner: the filter first extracts the leading part of the test description's name that is a valid Java identifier. The test is executed if this leading part exists and equals the to-be-executed method's name.
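The filtering rule is easy to illustrate with a few lines of plain Java (a sketch of the idea, not JDT's actual implementation):

```java
public class DescriptionFilterSketch {

    // Extract the leading part of a test description name that forms a
    // valid Java identifier; return null if the name does not start with one.
    static String leadingJavaIdentifier(String name) {
        if (name.isEmpty() || !Character.isJavaIdentifierStart(name.charAt(0))) {
            return null;
        }
        int end = 1;
        while (end < name.length() && Character.isJavaIdentifierPart(name.charAt(end))) {
            end++;
        }
        return name.substring(0, end);
    }

    // A description matches if its leading identifier equals the method name.
    static boolean matches(String descriptionName, String methodName) {
        String leading = leadingJavaIdentifier(descriptionName);
        return leading != null && leading.equals(methodName);
    }
}
```

So a Parameterized leaf named "testAdd[2: 1+1=2]" matches a request to run the method testAdd, because "[" terminates the identifier.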

I would like to thank NumberFour for having me implement this fix specifically for Xpect. Gratitude also goes to itemis, my employer, who gave me the time to generalize and contribute the fix to Eclipse JDT. Also I would like to thank the JDT team for accepting the contribution.

by Moritz Eysholdt at November 18, 2014 01:58 PM

Espruino Pico on Kickstarter: only 4 days to go!

by Benjamin Cabé at November 18, 2014 01:38 PM

If you follow our regular IoT hangouts, you probably have seen this presentation of the Espruino Pico already:

If you don’t, you really should check it out!

Espruino is an Open Source and Open Hardware project that provides a super-tiny implementation of JavaScript that runs on micro-controllers. The Espruino board is a ready-to-use board that you can use to run JavaScript IoT applications, but the Espruino interpreter can also run on lots of other targets.

Gordon Williams, the lead of Espruino, is working on a new version of the Espruino board with a tiny form factor, and his Kickstarter ends in only 4 days.

While the initial goal has already been reached, I would really like to see him reach the £50,000 stretch goal, since it means he will implement socket support in the Espruino interpreter, allowing IoT developers to use MQTT, CoAP, and the like right from their JavaScript code!

Please consider supporting this very cool open source project and don’t wait any longer to go visit their Kickstarter page!

by Benjamin Cabé at November 18, 2014 01:38 PM

EclipseCon 2015 - Register Now

November 18, 2014 12:36 PM

It's now time to register for EclipseCon 2015! Join us in San Francisco, March 9-12.

November 18, 2014 12:36 PM

A JRebel Ticket Monster on OpenShift

by adietish at November 18, 2014 11:07 AM

JBoss Developer Studio allows you to publish and run your Eclipse projects on OpenShift. This post shows you how to get started using the JBoss Developer Ticket Monster demo application.
In a second part I will show you how to change your code in Eclipse and have those changes instantly available on OpenShift using the JRebel cartridge.

Deploy and Run Ticket Monster on OpenShift

Get the code

The JBoss Developer project created the Ticket Monster demo to showcase a modern AngularJS and Java EE application. The code for this blog is at:
The original code for the Ticket Monster is available at Github.
The webapp with its Maven pom is located in the demo folder. OpenShift, on the other hand, expects a pom in the root of the git repository; without it, OpenShift will not know how to build the code when you push it. To fix this I created a fork of the webapp where I moved the content of the demo folder to the root of the git repository.
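For illustration, a minimal root-level pom.xml that would let OpenShift build a webapp on push could look roughly like this (the coordinates are placeholders, not the actual Ticket Monster ones):

```xml
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <!-- hypothetical coordinates for a war that OpenShift can build -->
  <groupId>org.example</groupId>
  <artifactId>ticket-monster</artifactId>
  <version>1.0-SNAPSHOT</version>
  <packaging>war</packaging>
</project>
```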
  1. Copy this url to your clipboard and in your Eclipse, switch to the Git perspective.

  2. Launch the Clone Git Repository wizard. It will pop up with all the values already in place. Hit Next.

  3. Pick the master branch and hit Finish to get the repo cloned to your local machine.

  4. Once the cloning is done the new repo will appear in the Git Repositories view.

  5. Import the project within it by using Import Projects… in the context menu.

  6. Back in the JBoss or Java EE perspective you will spot the new project ticket-monster in your workspace.

OpenShift, create an Application for my project, please!

You are now ready to deploy it to OpenShift. Pick Configure ▸ New/Import OpenShift Application from the context menu of the ticket-monster project.

configure openshift application

The OpenShift application wizard that you now get prompts you to provide your OpenShift credentials. Either use an existing account from the combo or create a new one, providing server, username and password.
In the next page we choose to create a new OpenShift application and pick the application type. We will use WildFly 8 which you’ll find by filtering the available choices with "wildfly". The latest version of OpenShift has it in the Basic Cartridges (previous versions - like the one used in the screencast - had it in the quickstarts).

wildfly8 application

Once you picked it you can choose the application name in the next wizard page.
The wizard suggests a name that is based on the project name. OpenShift doesn’t allow non-alphanumeric characters in the name, so you have to correct it: remove the hyphen to get "ticketmonster".
You may also select the domain (if you have several) and possibly adjust the gear size or make your application scalable.
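The name restriction is simple to express in code; a hypothetical helper (mine, not part of the OpenShift tooling) that derives a valid application name from a project name could be:

```java
import java.util.Locale;

public class AppNameSketch {
    // OpenShift application names must be alphanumeric:
    // strip every other character and lower-case the result.
    static String toOpenShiftName(String projectName) {
        return projectName.replaceAll("[^A-Za-z0-9]", "").toLowerCase(Locale.ROOT);
    }
}
```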

application settings

If you now hit Finish the wizard will create your OpenShift application, prepare your local project to run on OpenShift and create a server adapter for easy publishing.
Once the WildFly cartridge is created on OpenShift, the tooling presents you with the credentials to access the WildFly administrative console in a dialog. You should copy those for later usage:

wildfly admin console

In a last step the wizard informs you that it’ll prepare your local ticket-monster project to run on OpenShift. Confirm these changes by hitting OK.

prepare project for openshift

You are now ready to publish your ticket-monster project to OpenShift.

Deploy your project!

Get to the Servers view, select the freshly created server adapter and choose Publish ticket-monster to OpenShift from its context menu.

publish to openshift
publish to openshift 2

The dialog prompts you to provide a commit message and select the files that you want to commit and publish.
Among the listed files is the .gitignore. Double clicking it shows you that it now includes Eclipse specific project settings.
The wizard also added OpenShift configurations in the .openshift folder.
Further interesting bits include a marker to have WildFly running on Java 8 (.openshift/markers/java8). OpenShift markers allow you to configure specific bits like enabling debugging (Enable JPDA marker), hot deployment (Hot Deploy marker) etc. You may choose between Java 7 and Java 8 via marker files in .openshift/markers. You can also spot a WildFly configuration file in .openshift/config/standalone.xml that you can modify to your needs. Once you have checked all files by clicking the Select All button and provided a commit message, you’re good to go. You can now publish your project to OpenShift by hitting Commit and Publish.

The tooling then informs you that it will overwrite the current code on OpenShift by doing a force push, and asks you to confirm this.

push force

This is required since the wizard does not properly include the remote history in our local project/git repo. It clones the remote repo and then copies some of its content into the local project; it does not do a recursive merge, since this is/was not fully reliable yet (see JBIDE-14890 for further details). It is therefore required to overwrite the remote content on the initial deployment. Once you hit OK, the server adapter pushes the local code to OpenShift.

Eclipse then shows you the Console with the output of the publishing operation. You can see how the maven build is triggered, WildFly is restarted and your project is then deployed.

build and deploy

In order to verify that your server is fully up, you can have a look at the logs. Pick OpenShift ▸ Tail files… from the OpenShift submenu in the context menu of your ticketmonster server adapter. The upcoming wizard allows you to fine-tune the tail options and include the gears that your application is running on. You can stick to the defaults, which are usually just fine.

tail files

A fully started WildFly will output 'WildFly 8.1.0.Final "Kenny" started' in the logs.

wildfly started

You’re now ready to have a look at your running project. Pick Show In ▸ Web Browser from the context menu of your ticketmonster server adapter and see how the browser is pointed at your deployed webapp.

open browser
ticket monster webapp

Change locally, see OpenShift change instantly!

Install JRebel

We now go a step further and show how you can change the application code locally and have those changes instantly available on OpenShift.
To achieve this you need to install the JRebel plugin into your JBoss Developer Studio. The Eclipse plugin is available from JBoss Central: switch to the Software/Updates tab, search for JRebel, check it once it is listed and hit "Install/Update". Once you have restarted Eclipse, JRebel is enabled in your IDE.

install jrebel

Enable JRebel for your Project

Open up the context menu of your project and enable the JRebel nature (JRebel ▸ Add JRebel Nature). In a second step, enable JRebel Remoting.

enable jrebel remoting

You have to tell the local JRebel where to publish to. You therefore need the public URL of your ticket monster application as it runs on OpenShift. You get this from the application details: pick OpenShift ▸ Details and copy the Public URL.

application public url

Paste it into the JRebel Deployment URL(s) by picking Advanced Properties from the JRebel context menu of your ticket-monster project.

jrebel deployment url

Downgrade to Java 7

WildFly is configured to run with Java 8 by default. With JRebel enabled, the small OpenShift gear that you get for free tends to run out of memory. It is therefore suggested that you downgrade to Java 7: go to the context menu of your project, pick OpenShift ▸ Configure Markers…, uncheck java8 and check java7.

java7 marker

Add the JRebel cartridge in OpenShift

The JRebel cartridge for OpenShift, available from Github, makes it very easy to enable JRebel for any Java app on OpenShift. To add this cartridge to your application, go to the Servers view and choose OpenShift ▸ Edit Embedded Cartridges….
In the upcoming wizard, check the Code Anything cartridge and paste the following url:
code anything cartridge

Once you hit Finish the wizard will add the cartridge to your OpenShift application and enable JRebel for it.

Publish your project to OpenShift

You now have to push all your local changes to OpenShift (you added the JRebel nature and downgraded to java7). Tell the server adapter to publish: choose Publish in the context menu of your OpenShift server adapter.
The upcoming commit-and-publish dialog shows your local changes:

jrebel changes

You replaced the java8 marker with a java7 marker and added two XML files that configure JRebel. Once you have added a commit message, you’re ready to hit Commit and Publish.
If you now go to the Console view and pick the ticketmonster, you will see how OpenShift picks those changes and rebuilds your code.

ticketmonster console

You can inspect the server logs to make sure the procedure has finished and WildFly has fully restarted. In the Servers view, pick OpenShift ▸ Tail Files…, stick to the default options and hit Finish.

wildfly started

'WildFly 8.1.0.Final "Kenny" started' in the logs tells you that WildFly was successfully restarted. You are now ready to change code locally and have the changes picked up on OpenShift instantly.

Pick my local changes instantly, OpenShift!

We will change the ticket price, so let us first check the current price. Use Show In ▸ Browser in the context menu of your server adapter to open the application in your browser. In your browser, hit Buy tickets now, then Book Ticket, and choose some venue, date, time and section. You will then see the current price:

ticket price1

Back in your JBoss Developer Studio let us now change the ticket price:
Open up the TicketPrice class and go to the getPrice() method. Change it to the following:

    public float getPrice() {
        // return price;
        return createFakePrice();
    }

    private float createFakePrice() {
        return 42f;
    }

When you save in the Java editor, you will see the JRebel console pop up and show how it updates the Java classes on OpenShift.

rebel updating openshift

Now get back to your browser and refresh the page. You will have to select the venue again in order to see the new ticket price: It is now at $42!

ticket price2

We did not have to publish our code to OpenShift via the server adapter. JRebel published our local changes on the fly!

by adietish at November 18, 2014 11:07 AM

Statechart Tools: What's in the pipeline?

by Andreas Mülder at November 17, 2014 10:36 AM

In the upcoming Statechart Tools release we made a huge refactoring of the simulation infrastructure. This long overdue refactoring was necessary to define some concise APIs for plugging in custom simulation engines for different execution semantics. It took some time, but in the end our Jenkins build is back to stable again ;-)

Debugging Features

On top of the new APIs we built some great new debugging features for the graphical statechart editor. It is now possible to set breakpoints on transitions and states. If a breakpoint is hit, the simulation engine suspends before the transition or state is executed. The semantic element the breakpoint is attached to is highlighted in a different color, as you can see in the screenshot below.

The debugging features are tightly integrated into the Eclipse debugging infrastructure. The Eclipse Breakpoints view shows all statechart breakpoints and allows you to enable or disable them. But that's not all - it is even possible to specify conditional breakpoints with our expression language!

Simply check the 'Conditional' checkbox in the breakpoint's detail pane and specify any expression that evaluates to a boolean type. For the example model above, it would be possible to specify an expression like 'ABC.interfaceVar == 10'. If the expression evaluates to 'true', the breakpoint hits.

Simulation Snapshots

Another great new feature we developed are so-called simulation snapshots. (Thanks to Holger, who developed a large part of the snapshot functionality as his 4 + 1 project.) During simulation, you can take a snapshot of the current execution state via the Snapshot view. These snapshots can be restored at any time, which simplifies the process of statechart debugging significantly.

The snapshot details section shows the execution state at the time the snapshot was taken and previews an image with all active states. Statechart Debugging has never been so easy! ;-)

This is just a short preview; detailed documentation on how to use the new features will follow. What do you think? Do you like the new features? Do you have another proposal for how to simplify statechart debugging? Just leave a comment or send us some feedback via our user group!

by Andreas Mülder at November 17, 2014 10:36 AM

5 Usability Flaws in GMF (and how to fix them)

by Andreas Mülder at November 17, 2014 10:35 AM

Although GMF (Graphical Modeling Framework) editors are an endangered species and future editors will be developed with JavaFX, there is still a bunch of GMF editors out there. And most of them have something in common: a lack of usability. The good news is that some usability flaws are easy to fix. In Yakindu Statechart Tools we spent a lot of time improving the usability of our GMF Runtime based editor. Here is a random selection of 5 usability flaws in GMF and how we fixed them.

The following code snippets are just examples of what a solution could look like. Most of the snippets are still experimental, developed for specific use cases, and should not be used as is!

1. Remove dithering connection anchors

Everyone who has ever worked with a GMF based editor knows this annoying default behavior. You spend a significant amount of time arranging the diagram's connections to produce a nice-looking diagram. When everything looks pretty nice, you enlarge a node - and all the connection routing work was only a waste of time...

To fix this, you can use an AdjustIdentityAnchorCommand that can be chained to a SetBoundsCommand and recalculates the IdentityAnchor position based on the resize delta. Create a customized LayoutEditPolicy and install it on the container EditPart as shown below:

installEditPolicy(EditPolicy.LAYOUT_ROLE, new XYLayoutEditPolicy() {
  protected Command getResizeChildrenCommand(ChangeBoundsRequest request) {
    CompoundCommand result = new CompoundCommand();
    AdjustIdentityAnchorCommand command = new AdjustIdentityAnchorCommand(
        TransactionUtil.getEditingDomain(resolveSemanticElement()), request);
    result.add(new ICommandProxy(command));
    return result;
  }
});
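The core of such a command is simple arithmetic: a GMF IdentityAnchor stores its position as a fraction of the node's size, so after a resize the fraction must be rescaled for the absolute point to stay put. A plain-Java sketch of that calculation (the helper name is mine, not GMF API):

```java
public class AnchorMathSketch {
    // An IdentityAnchor position is a fraction in [0,1] of the node's extent.
    // After resizing, rescale the fraction so that the absolute coordinate
    // (fraction * extent) stays where the user put the connection end.
    static double rescaleAnchor(double fraction, double oldExtent, double newExtent) {
        return (fraction * oldExtent) / newExtent;
    }
}
```

For example, an anchor at 50% of a 100px-wide node sits at pixel 50; after enlarging the node to 200px, the rescaled fraction 0.25 keeps it at pixel 50.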

2. Remove scrollable compartments and auto-resize the container hierarchy

Working with GMF compartments is no fun. The compartment scrollbars look pretty ugly and diagram elements are hidden in non-visible areas. Sometimes edges are clipped because a node is not fully visible and you do not even know that the edge exists. A better approach is to always show all elements in the compartment. If your diagram gets too large, you should consider using sub diagrams instead of scrollable compartments. When adding new nodes to a compartment, the user has to enlarge the whole container hierarchy by hand.

The EnlargeContainerEditPolicy auto-resizes the container hierarchy if more space is required. Every node that is in the way is moved to prevent overlapping nodes. It has to be installed for every EditPart in the compartment:

installEditPolicy(EnlargeContainerEditPolicy.ROLE, new EnlargeContainerEditPolicy());
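The underlying computation is straightforward: the container must span at least the union of its children's bounds plus some padding. A plain-Java sketch using java.awt.Rectangle (the real policy works on draw2d figures; the helper name is mine):

```java
import java.awt.Rectangle;
import java.util.List;

public class EnlargeSketch {
    // Minimum container bounds: union of all child bounds, grown by a padding
    // on every side. A real edit policy would then issue a SetBoundsCommand
    // for the container (and move nodes that are in the way).
    static Rectangle requiredBounds(List<Rectangle> children, int padding) {
        Rectangle union = new Rectangle(children.get(0));
        for (Rectangle child : children) {
            union = union.union(child);
        }
        union.grow(padding, padding);
        return union;
    }
}
```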

3. Add a preferred size selection handle

This EditPolicy adds a new handle to the selection handles. If this handle is pressed, the selected node updates its size to the preferred size. The preferred size is the minimum size of the node at which all node children are visible. When modeling with the UML tool MagicDraw I found this preferred-size handle to be very useful, so I added it to my GMF based editors as well.


To add the preferred size selection handle to a node, install the corresponding EditPolicy on the node's EditPart for the PRIMARY_DRAG_ROLE. Check the type hierarchy carefully!

installEditPolicy(EditPolicy.PRIMARY_DRAG_ROLE, new

4. Highlight clickable areas and open direct editing on double click

It is a good idea to indicate clickable areas with a highlighted border, so the user immediately gets feedback about where an in-diagram editor can be opened. These editors should open on a simple double-click, no matter which node is currently selected in the diagram. With GMF defaults it is only possible to open the direct editor when a node is selected.

To highlight clickable areas within the diagram there is a (really simple) EditPolicy that shows a border on mouse hover to indicate a clickable area. To open the direct editor on double click, use the DoubleClickDirectEditDragTracker. This DragTracker generates DirectEditRequests on double clicks and replaces the existing DragTracker of the EditPart like this:

public DragTracker getDragTracker(final Request request) {
  if (request instanceof SelectionRequest
      && ((SelectionRequest) request).getLastButtonPressed() == 3)
    return null;
  IDoubleClickCallback callback = new IDoubleClickCallback() {
    public void handleDoubleClick(int btn) {
      // open the direct editor here
    }
  };
  return new DoubleClickDirectEditDragTracker(this, getTopGraphicEditPart(), callback);
}


5. Add rich text editing support to direct editing

If you use direct editing only to set names for nodes, the default GMF direct editing behavior is sufficient. If you have a more formal language, for example an expression language, you should consider using Xtext to generate an editor for your grammar and integrating it into your GMF editor to support syntax coloring, validation and cross-referencing.

For the technical details, have a look at Xtext integration into GMF!

by Andreas Mülder at November 17, 2014 10:35 AM

Yakindu Statechart Tools 2.4.0 released!

by Andreas Mülder at November 17, 2014 10:34 AM

We released Yakindu Statechart Tools 2.4.0 today! It contains a lot of bugfixes and improvements as well as new features.

This release requires Eclipse Luna. You can install SCT 2.4.0 from our update site, or you can download a full Eclipse zip package from our download page.

New and Noteworthy

Here is a summary of the improvements worth mentioning:

1. Toggle documentation toolbar entry added
We added a menu item to the editor's toolbar to toggle between documentation and formal expression language. This allows toggling modes for the whole diagram.

2. C/C++ generator feature for inner class function visibility added
innerFunctionVisibility (String, optional): This parameter changes the visibility of inner functions and variables. By default, "private" visibility is used. It can be changed to "protected" to allow function overriding in a class which inherits from the generated statemachine base class.
Example configuration:

feature GeneratorOptions {
    innerFunctionVisibility = "protected"
}

3. C/C++ generator feature to change operation callback generation behavior added
staticOperationCallback (Boolean, optional): If set to 'true', the callback function declaration for statechart operations is static and the functions are called statically by the statemachine code.
Example configuration:
feature GeneratorOptions {
    staticOperationCallback = true
}

4. Model and Diagram compare feature
This release contains the beta version of a diff/merge viewer for statechart diagrams. It integrates seamlessly with the Eclipse Team API to allow diffing and merging of different revisions as well as comparing models with the local history.


Note that this feature is still beta. There are currently two known problems (this and this) with the editor that will be fixed for the next release.


We also fixed a bunch of bugs reported via our User Group:

Toggle subregion alignment does not work
Transition into substate does not recognize parents history context
Model can not be simulated if operations in named interfaces are used
C++ generator can generates allways actions with a lower-case letter in source file
CoreFunction methods for long types incomplete
Operations of other Statecharts (in the same folder) are in scope
Class cast exception if java provided custom operation is not executable
Simulating Operations With Custom Java Code is not working as intended
Diagram corrurpted when moving transition label

Thanks to all bug reporters!

by Andreas Mülder at November 17, 2014 10:34 AM

"Catch me if you can" - Java on wearables (60-minute extended talk)

by Eclipse Foundation at November 17, 2014 09:47 AM

Wearable computers are one of the next big things. But at the moment, one can buy only specialized systems such as motion trackers, GPS watches, and the like. So why not use existing cheap technology to build your own wearable Java-powered device? This session shows what you can do with affordable technology and Java today. It uses a Raspberry Pi in combination with a heart rate sensor and a GPS sensor to track the heart rate and the location of a runner. The battery-powered Pi measures the data and publishes it via MQTT to different clients such as Java(FX)-based desktop clients, an iPhone client and a smartwatch the runner can wear. Come and get some ideas of what you can do with Java and wearable devices.

by Eclipse Foundation at November 17, 2014 09:47 AM

Eclipse and Java™ 8

by Eclipse Foundation at November 17, 2014 09:31 AM

This session will present the most important new stuff in Java™ 8. It will show how to get started developing Java 8 code with Eclipse and then demo the new features that are available in Eclipse for this new Java™ release. We will also look behind the curtain and see how the JDT team accomplished that great piece of work.

by Eclipse Foundation at November 17, 2014 09:31 AM

The JVM Universe - Java and the IoT Big Bang

by Eclipse Foundation at November 17, 2014 09:18 AM

This session provides an overview of Java's role in the exploding IoT universe. Starting with the JVM, we'll look up through JRE and JDK implementations, and then out through Java Card, Java ME and Java SE platforms. We'll see where Java fits from the tiniest of devices to the very largest of cloud deployments. We will examine the role of OpenJDK and the various Java vendor implementations available. At the end of this talk attendees should have a strong sense of the role the various Java platforms play, what tools are recommended depending on your needed scale and where to find all the right pieces. Attendees should also have a strong sense of the terminology surrounding the Java platform at various stages of development and scale. Don't get trapped in the black hole - use Java to reach escape velocity.

by Eclipse Foundation at November 17, 2014 09:18 AM