Displaying your EMF Model on a TableViewer

by Its_Me_Malai (noreply@blogger.com) at November 20, 2014 11:39 AM

From Part 1 to Part 5 we discussed how the Edit plugins support displaying and editing EMF models
in a TreeViewer. In this tutorial we discuss using the Edit plugin to display an EMF model in a
TableViewer.



Typically, any EMF-generated item provider looks like the screenshot shown above. It extends
ItemProviderAdapter and implements IEditingDomainItemProvider,
IStructuredItemContentProvider, ITreeItemContentProvider, IItemLabelProvider and IItemPropertySource.

Looking at the interface IStructuredItemContentProvider, it is clear that this item provider is ready to
serve a TableViewer as a ContentProvider. But as a LabelProvider? The answer is, it is not.
IItemLabelProvider supports only getText(), which is fine for a single-column table, a list or a
tree viewer, but not for a multi-column TableViewer.

Supporting a TableViewer with custom columns therefore requires a bit of customization of the edit plugin. In this tutorial we learn what is needed to make the edit plugin work with a
TableViewer and display the relevant column information.


The example I am developing is this: whenever a Group is selected in the EMF-generated editor, its Contacts should be displayed in tabular format in my custom view.

1. Create a new plugin [org.ancit.examples.emf.edit.extension].
2. Add a dependency on [org.ancit.examples.emf.edit] to the newly created plugin.
3. We need to extend two classes: the ItemProviderAdapterFactory and the ItemProvider itself.
4. Create a new class called ExtendedAddressBookItemProviderAdapterFactory which extends
AddressBookItemProviderAdapterFactory.
5. Create a default constructor for the new class.
6. To the list of supportedTypes, add ITableItemLabelProvider as shown below.


7. Now override the createContactAdapter() method and add the following code snippet.


In the snippet above, instead of returning the generated ContactItemProvider, which does not support ITableItemLabelProvider, we return our extended version of ContactItemProvider, which implements ITableItemLabelProvider.
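The overriding pattern described above can be sketched as follows. Note that the EMF.Edit types (Adapter, the generated factory and providers) are replaced here with minimal hypothetical stand-ins so the sketch is self-contained; the real classes come from org.eclipse.emf.edit and the generated .edit plugin.

```java
// Minimal stand-ins for the EMF.Edit types (hypothetical, for illustration only).
interface Adapter {}

class ContactItemProvider implements Adapter {}

// Extended provider; the real one would also implement ITableItemLabelProvider.
class ExtendedContactItemProvider extends ContactItemProvider {}

class AddressBookItemProviderAdapterFactory {
    protected Adapter contactItemProvider;

    // The generated version caches and returns the plain ContactItemProvider.
    public Adapter createContactAdapter() {
        if (contactItemProvider == null) {
            contactItemProvider = new ContactItemProvider();
        }
        return contactItemProvider;
    }
}

class ExtendedAddressBookItemProviderAdapterFactory
        extends AddressBookItemProviderAdapterFactory {

    // Override to hand out the table-aware provider instead.
    @Override
    public Adapter createContactAdapter() {
        if (contactItemProvider == null) {
            contactItemProvider = new ExtendedContactItemProvider();
        }
        return contactItemProvider;
    }
}
```

The caching in the field mirrors what the generated factory does: one shared, stateless item provider instance serves all Contact objects.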

8. Create a new class called ExtendedContactItemProvider which extends ContactItemProvider
and implements ITableItemLabelProvider.


9. Override getColumnText() and provide your implementation to handle the various columns. The
method would look as shown below.



10. Now that the AdapterFactory and the ItemProvider have been extended, we are ready to use them in our
TableViewer. Therefore create a new plugin [org.ancit.examples.emf.ui].

11. In the ComposedAdapterFactory, instead of adding AddressBookItemProviderAdapterFactory
you need to add ExtendedAddressBookItemProviderAdapterFactory; the view class would
look as shown below.



12. We are ready to run and test. Note that you need to add columns in the desired order to the TableViewer,
which I have not done in the code shown above.


Contributing to Context Menu using NewChildDescriptors

by Its_Me_Malai (noreply@blogger.com) at November 20, 2014 10:54 AM

In Part 2, we learnt about using the basic command stack and editing domain to enable editing of your EMF model.

The item providers are also responsible for providing newChildDescriptors. A newChildDescriptor
describes a new child that can be created under the selected node. This is another feature that is
left to be self-understood and is not discussed often in EMF articles.


In this article, I document the simple steps to create a context menu and display New Child and
New Sibling actions for the object selected in the tree view of your EMF model.
Enabling the context menu is similar to how it is generally done in PDE.


1. You need to create a Context Menu for your TreeViewer

                              Fig 1. Snippet of Contributing a ContextMenu for your Viewer


2. Having contributed a context menu to your viewer, we now need to fill it with
the relevant actions to create siblings and children for the selected model object.
  
                  Fig 2. Snippet for Filling the ContextMenu with Sibling and ChildActions


On running your code, a right-click on your view will show two new submenus, New Child and New Sibling, with the relevant actions available under each of them.

                          Fig 3. Output showing EMF Consumer View with Property View


Please download the example code snippet from our website www.ancitconsulting.com


ABOUT ANCIT:
ANCIT Consulting is an Eclipse consulting firm located in Bangalore, the "Silicon Valley of Outsourcing". It offers professional Eclipse support and training for various Eclipse-based frameworks, including RCP, EMF, GEF and GMF. Contact us at annamalai@ancitconsulting.com to learn more about our services.


Using CNF to display EMF Model

by Its_Me_Malai (noreply@blogger.com) at November 20, 2014 10:29 AM

In Part 3, we learnt how to integrate and display the attributes of an EObject in the Properties view. This article attempts to showcase how to display an EMF model in a TreeViewer using the Common Navigator Framework.

To use the Common Navigator Framework you need to make use of the following three extension points:

1. org.eclipse.ui.views
2. org.eclipse.ui.navigator.navigatorContent
3. org.eclipse.ui.navigator.viewer

The steps for displaying an EMF model in the Common Navigator Framework are as follows.

1. Create a View using org.eclipse.ui.views extension point.

a. The base class of the view should be CommonNavigator instead of ViewPart.
b. Override getInitialInput() and return the instance of the model to display.
                                                 Fig 1. Snippet of CommonNavigator


2. Create an Extension for org.eclipse.ui.navigator.navigatorContent

a. We need to provide a content provider, a label provider, trigger points and possible children.

                           Fig 2. Screenshot of extension : org.eclipse.ui.navigator.navigatorContent


3. Create a ContentProvider class that extends AdapterFactoryContentProvider

a. In the constructor, pass the ItemProviderAdapterFactory of your .edit plugin.

                                              Fig 3. Snippet showing the ContentProvider


4. Create a LabelProvider that extends AdapterFactoryLabelProvider

a. In the constructor, pass the ItemProviderAdapterFactory of your .edit plugin.

                                                Fig 4. Snippet showing the Label Provider
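Steps 3 and 4 boil down to one-line constructors that hand the generated adapter factory to the superclass. Sketched here with minimal hypothetical stand-ins for the EMF.Edit base classes so the wiring pattern runs stand-alone; in the real code the base classes come from org.eclipse.emf.edit.ui.

```java
// Hypothetical stand-ins for the EMF.Edit types (illustration only).
class AdapterFactory {}
class AddressBookItemProviderAdapterFactory extends AdapterFactory {}

class AdapterFactoryContentProvider {
    final AdapterFactory adapterFactory;
    AdapterFactoryContentProvider(AdapterFactory f) { this.adapterFactory = f; }
}
class AdapterFactoryLabelProvider {
    final AdapterFactory adapterFactory;
    AdapterFactoryLabelProvider(AdapterFactory f) { this.adapterFactory = f; }
}

// Step 3: the content provider delegates content computation to the item providers.
class AddressBookContentProvider extends AdapterFactoryContentProvider {
    AddressBookContentProvider() { super(new AddressBookItemProviderAdapterFactory()); }
}

// Step 4: the label provider does the same for text and images.
class AddressBookLabelProvider extends AdapterFactoryLabelProvider {
    AddressBookLabelProvider() { super(new AddressBookItemProviderAdapterFactory()); }
}
```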


5. Having created the content and label providers, attach them to the navigatorContent extension created in Step 2.
                    Fig 5. Screenshot showing where to provide the Label and Content Provider


6. Having created the view as well as the content and label providers through two extension points, we now need to associate them using the org.eclipse.ui.navigator.viewer extension point.

                            Fig 6. Screenshot of Extn : org.eclipse.ui.navigator.viewer


                            Fig 7. Screenshot of contentExtension in org.eclipse.ui.navigator.viewer

7. Run your code and you should see your new CNF-based view displaying your EMF model.


CDT in the New World

by Doug Schaefer at November 19, 2014 05:12 PM

An interesting thing happened the other day which I think shakes some of the foundation I’ve had in my mind about the CDT. My vision was always for CDT to support C, C++, and related languages on every platform. And that has always included Windows. Mind you, we’ve never really done a great job there but we do have lots of users using it with Cygwin or MinGW and you can argue how well they support Windows application development.

The big news was Microsoft’s Visual Studio Community Edition, a free version of their complete IDE for individuals and small teams. This is different from their Express editions, which were severely limited, including the lack of 64-bit support and the inability to install plug-ins into the IDE. No, the Community edition is the full thing, and it is a pretty important step for Microsoft as they try to keep developer interest in their platform.

As long as Microsoft charged money for Visual Studio, I always thought CDT had a play. I’ve done a little work on supporting integration with their Windows SDK (which includes a 64-bit compiler), at least for build. Debug was always a big hurdle, since you’d need to write a complete debugger integration that is very different from gdb. So, really, this integration barely worked. And now that VS is essentially free to the market we serve, there’s little point in continuing that effort. The same is essentially true of Objective-C support with Xcode on the Mac, which even Apple is drifting away from to focus on their new language, Swift.

But you know, despite giving up on a dream, I think CDT has an important role to play in its field of strength, i.e. support for the GNU toolchain, especially as used for embedded development with its new-found focus as a pillar of the Internet of Things. As I’ve talked about on Twitter and in the CDT forums, I am working on a CDT integration to support building C++ apps for Arduino. Once that’s done, I’ll shift focus to a more complicated environment around the Raspberry Pi, through which I hope to add better remote debug support to CDT. Yes, I know everyone loves to talk about using scripting languages on the Pi, but I want to expose these hobbyists to the real world of embedded, which is almost exclusively C and C++, and get them started using CDT.

I still haven’t given up on the desktop, at least desktop support with Qt. I really want to make CDT a first class alternative for Qt development. We have a good start on it and will continue with things like a project file editor, easier setup of builds, and import of projects from Qt Creator. The great thing about Qt is that it also is truly cross platform, allowing you to write your app to run on the desktop and then migrate it to your embedded device, such as the Raspberry Pi. And supporting both of these with the same project is one thing CDT is pretty good at, or at least will be once we fix up the UX a bit.

In the past, much like the rest of the Eclipse IDE, I think we on the CDT team have focused a lot on being a good platform, almost too much. While vendors have done a great job of bringing the CDT to customers writing for their platforms, including my employer :), aside from the Linux community few of us work on supporting the direct user of the Eclipse C/C++ IDE. I think focusing on CDT’s areas of strength and targeting specific open environments, rather than chasing dreams, will help me contribute to the CDT in ways that will actually help the most people. And that’s most satisfying anyway.



Maven improvements in JBoss Tools 4.2 and Developer Studio 8.0

by fbricon at November 19, 2014 12:51 PM

In JBoss Tools and JBoss Developer Studio, we’re continuously working to augment the Maven integration experience in Eclipse. Some of the features we’ve been playing with, if deemed successful, will eventually be contributed back to m2e, like the Maven Profile Management UI. Others, more centered around JBoss technologies, will stay under the JBoss Tools umbrella.

JBoss Tools 4.2 and Developer Studio 8, based on Eclipse Luna, take advantage of all the nice improvements made to m2e 1.5.0 and then add some more:

m2eclipse-egit integration

The Import > Checkout Maven Projects from SCM wizard doesn’t have any SCM provider by default, which can be pretty frustrating at times. With Git becoming the new de facto source control system, it only made sense to make m2eclipse-egit the sensible default SCM provider for m2e.

m2eclipse-egit will now be automatically installed when a JBoss Maven integration feature is installed from the JBoss Tools update site.

It is installed by default with JBoss Developer Studio 8.0.0 as well.

Maven Central Archetype catalog

Since m2e 1.5.0 no longer downloads Nexus Indexes by default, a very small, outdated subset of Maven Archetypes is available out of the box.

To mitigate that, the JBoss Tools Maven integration feature now registers the Maven Central Archetype catalog by default, providing more than 9600 archetypes to choose from when creating a new Maven project. Accessing the complete list of archetypes is even way, waayyyy faster (a couple of seconds) than relying on the old Nexus index download.

maven central catalog

Pom properties-controlled project configurators

JBoss project configurators for m2e now support an activation property in the <properties> section of pom.xml. Expected values are true/false and override the workspace-wide preferences found under Preferences > JBoss Tools > JBoss Maven Integration.

Available properties are:

  • <m2e.cdi.activation>true</m2e.cdi.activation> for the CDI Project configurator,

  • <m2e.seam.activation>true</m2e.seam.activation> for the Seam Project configurator,

  • <m2e.hibernate.activation>true</m2e.hibernate.activation> for the Hibernate Project configurator,

  • <m2e.portlet.activation>true</m2e.portlet.activation> for the Portlet Project configurator.
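For example, to disable the CDI configurator for a single project regardless of the workspace preference, the pom.xml would contain something like this (illustrative value; any of the four properties above works the same way):

```xml
<properties>
  <!-- Overrides Preferences > JBoss Tools > JBoss Maven Integration
       for this project only -->
  <m2e.cdi.activation>false</m2e.cdi.activation>
</properties>
```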

The pom.xml editor also provides matching XML templates for these properties, when doing ctrl+space in the <properties> section.

Maven Repository wizard improvements

The Configure Maven Repositories wizard, available under Preferences > JBoss Tools > JBoss Maven Integration, saw a couple of improvements as well:

Advanced options for maven repositories

You can now choose the repository layout, enable/disable snapshots or releases, and change the update policy in the advanced section:

maven repository advanced

Automatically identify local Maven repositories

When adding a new Maven repository, you can scan for JBoss Maven repositories unzipped locally with the Recognize JBoss Maven Enterprise Repositories…​ button, in order to automatically add them to your .m2/settings.xml.

recognize maven repo

The identification process now looks for a .maven-repository file at the root of the folder you selected. This file follows the .properties file format and is expected to contain up to 3 attributes:

  • repository-id: the repository id

  • name: a (descriptive) repository name. Optional, defaults to repository-id

  • profile-id: the profile id the repository will be activated from. Optional, defaults to repository-id

As a concrete example, the JBoss Mobile repository .maven-repository file would contain:

repository-id=local-jboss-mobile
name=JBoss Mobile Maven Repository
profile-id=local-jboss-mobile

What’s next?

Tired of seeing these "Project configuration is out-of-date" errors whenever you tweak your pom.xml? We’re currently playing with a plugin that will automatically do the Maven > Update project configuration for you. You can try a very alpha version from the following p2 repository : http://download.jboss.org/jbosstools/builds/staging/jbosstools-playground_master/all/repo/. Let us know if/how it works for you, so we can decide what to do next with it.

Enjoy and see you soon!

Fred Bricon
@fbricon



Flexible calendar control in SWT

by Tom Schindl at November 19, 2014 10:22 AM

I’ve been in need of a week-calendar view to display appointments but could not find one that completely fit my needs, so I implemented a custom one.

This short screencast shows what I have today.

The control is backed by a replaceable service interface to retrieve and store appointments, so it can be connected to almost any kind of backend store. For my needs a plain XMI store was enough, but anyone who wants something more fancy can most likely connect it to any calendar service, e.g. by using REST, … .




Why should I contribute?

by Jérémie Bresson at November 19, 2014 09:29 AM

One of the opening slides of our “Eclipse Scout Contributor” training looks like this:

Why_should_I_contribute

There is nothing new in this slide. It just re-states an aspect of the open-source mechanics.

In our opinion, the benefits of contributing to open source are often not fully understood. That is why we would like to share some of our thoughts.

People really often associate open source with the notion of free of charge and “free riding”. But this is not necessarily a negative thing. Bruce Perens writes about this topic in his text “What Effect Does The Free-Rider Problem Have Upon Open Source?”.

Essentially, every open source user starts out as a free-rider, and this is by design. The interesting question is what happens after the initial adoption. Once a user (or an organisation) adopts a software project, their interest in the success of that project grows too. This results in an interest in protecting the initial investment of the user or organisation. One of the obvious approaches is to help increase the success of the software project. And this is the point where the user or organisation can take advantage of the fact that they decided on open-source software.

A nice aspect of the open-source model is that there are many opportunities to become a part of the success of your favorite project. Of course, some of the options involve spending money, as in the case of closed source software. But even more opportunities exist that are based on the open-source nature of the software project and only require some of your experience, know-how and time.

Why should you care? Contributing to an open-source project can be rewarding in many ways and is most often in your best interest. Here are some reasons why you might want to contribute to the Eclipse Scout framework:

  • Prevent work-arounds in projects: Fix a bug, report it and provide a patch. Over and over it can be shown that fixing bugs at the source (in the open source code) is most efficient. Alternatively, you may privately implement a work-around in your project. This is often the more expensive approach. First, you cover the costs to analyze, reproduce and work around the bug. Then, you need to apply the work-around to all of your projects. And over time, you need to maintain your patch/work-around to work with the evolving framework. And when somebody else fixes “your” bug, you should remove all your work-arounds in all projects. So: reporting a bug increases the chance that it will be solved, and providing a good-quality patch (including a test) actually reduces your workload over time.
  • Help the Scout team to be more effective: The Scout code base is 550k LOC. This is pretty large. And, as in every large project, there are bugs (and fixing bugs does not usually reduce the size of the project). So, if you report bugs in Bugzilla you are already helping Scout. And pushing a patch to our Gerrit instance can help the Scout developers be more productive and have more bugs fixed earlier. Which in turn improves your productivity as an application developer.
  • Help the Scout community: The community plays a central role. It helps you get up to speed with Scout and lets you profit from the past investments of others. Once you adopt a framework such as Scout, you can protect your investment by making Scout more popular and more successful. This is the point when you become an integral part of the community and your interests start to align nicely with it. Remember: an open source project is all about sharing. Helping others in the Scout forum with challenges you have mastered often does not take a lot of time. But it can save the other end hours and days.
  • Get a better understanding of Scout: Contributing to Scout significantly increases your understanding of the framework. And you might find additional benefits: increase your know-how of current tooling such as git/Gerrit, improve your technical skill-set and grow your professional network by interacting with other community members. These impressions are also backed by a recent survey of the Linux Foundation. And for all this you can accumulate a public track record that demonstrates your work and your skills.

If you are looking for a starting point, we have a contribution page. By the way, helping the community by answering a question in the forum is also a valuable contribution to the project (even if it isn’t mentioned on our contribution page yet).

If you are interested, do not hesitate to contact us (scout@bsiag.com). You can start a discussion in the forum or ask for help if you are blocked by something. Depending on the subject you choose, the barrier to entry can be high. We will dedicate time to help you until you manage to realize the contribution you want.

And to come back to the initial Scout contributor training: on Open HUB we found the visual confirmation that contributing is possible and not all too hard.

OpenHubScoutContributions

Scout Links

Project Home, Forum, Wiki, Twitter



Strategy of wrapping C++ libraries in Java

by Tom Schindl at November 18, 2014 10:12 PM

This blog post is more of a question to others (probably much more knowledgeable than me) who have already wrapped C++ libraries in Java. So if you have, and the strategy I document in this blog post is completely wrong, please comment and point me towards a better one!

Anyways let’s get started!

We take a very simple case where we have a C++ class like this:

// ctestsimple.h
#ifndef CTESTSIMPLE_H
#define CTESTSIMPLE_H

class CTestSimple
{
public:
    CTestSimple();
    virtual int test();
    int test_delegate();
};

#endif // CTESTSIMPLE_H

// ctestsimple.cpp
#include "ctestsimple.h"

CTestSimple::CTestSimple()
{
}

int CTestSimple::test() {
    return 100;
}

int CTestSimple::test_delegate() {
    return test();
}

and in Java we want to do the following things:

import testjni.TestSimple;



public class TestApp {
	public static void main(String[] args) {
		System.load("/Users/tomschindl/build-JTestJNI/libJTestJNI.dylib");

		System.out.println("===== Simple");
		TestSimple t = new TestSimple();
		System.out.println("Direct: " + t.test());
		System.out.println("Delegate:" + t.test_delegate());

		System.out.println("===== No Override");
		NoOverride tn = new NoOverride();
		System.out.println("Direct: " + tn.test());
		System.out.println("Delegate: " + tn.test_delegate());

		System.out.println("===== With Override");
		SubTestSimple ts = new SubTestSimple();
		System.out.println("Direct: " + ts.test());
		System.out.println("Delegate: " + ts.test_delegate());

		System.out.println("===== With Override & super");
		SubSuperTestSimple tss = new SubSuperTestSimple();
		System.out.println("Direct: " + tss.test());
		System.out.println("Delegate: " + tss.test_delegate());
	}

	static class SubTestSimple extends TestSimple {
		@Override
		public int test() {
			return 0;
		}
	}

	static class SubSuperTestSimple extends TestSimple {
		@Override
		public int test() {
			return super.test() + 10;
		}
	}

	static class NoOverride extends TestSimple {

	}
}

We expect the application to print:

===== Simple
Direct: 100
Delegate:100
===== No Override
Direct: 100
Delegate: 100
===== With Override
Direct: 0
Delegate: 0
===== With Override & super
Direct: 110
Delegate: 110

The strategy I found to make this work looks like this:

  1. On the C++ side one needs to define a subclass of CTestSimple I named JTestSimple
    #ifndef JTESTSIMPLE_H
    #define JTESTSIMPLE_H
    
    #include "ctestsimple.h"
    #include <jni.h>
    
    class JTestSimple : public CTestSimple
    {
    private:
        jobject jObject;
        JNIEnv* env;
        jmethodID jTest;
    public:
        JTestSimple(JNIEnv *,jobject,jboolean derived);
        virtual int test();
    };
    
    #endif // JTESTSIMPLE_H
    
    #include "jtestsimple.h"
    #include <iostream>
    #include <jni.h>
    
    JTestSimple::JTestSimple(JNIEnv * env, jobject o, jboolean derived)
    {
        this->jObject = env->NewGlobalRef(o);
        this->env = env;
        this->jTest = 0;
        if( derived ) {
            jclass cls = env->GetObjectClass(this->jObject);
            jclass superCls = env->GetSuperclass(cls);
    
            jmethodID baseMethod = env->GetMethodID(superCls,"test","()I");
            jmethodID custMethod = env->GetMethodID(cls,"test","()I");
    
            if( baseMethod != custMethod ) {
                this->jTest = custMethod;
            }
        }
    }
    
    int JTestSimple::test() {
        if( this->jTest != 0 ) {
            return env->CallIntMethod(this->jObject,this->jTest);
        }
        return CTestSimple::test();
    }
    

    The important step is to check whether the Java object we were passed has overridden the test() method, because in
    that case we need to call from C++ into Java to get the correct result.

  2. On the Java side we define our object like this
    package testjni;
    
    public class TestSimple {
    	private long NativeObject;
    
    	public TestSimple() {
    		NativeObject = createNativeInstance();
    	}
    
    	protected long createNativeInstance() {
    		return Native_new(getClass() != TestSimple.class);
    	}
    
    	public int test() {
    		if( getClass() == TestSimple.class ) {
    			return Native_test();
    		} else {
    			return Native_test_explicit();
    		}
    	}
    
    	public final int test_delegate() {
    		return Native_test_delegate();
    	}
    
    	private native long Native_new(boolean subclassed);
    
    	private native int Native_test();
    	private native int Native_test_explicit();
    
    	private native int Native_test_delegate();
    }
    

    Some remarks are probably needed:

    • Notice that there are 2 native delegates for test(), depending on whether the method is invoked in a subclass or not
    • test_delegate() is final because it is non-virtual on the C++ side and hence is not subject to overriding in subclasses
  3. On the JNI-Side
    #include <jni.h>
    #include <stdio.h>
    #include <iostream>
    #include "testjni_TestSimple.h"
    #include "jtestsimple.h"
    
    JNIEXPORT jint JNICALL Java_testjni_TestSimple_Native_1test
      (JNIEnv * env, jobject thiz) {
        jclass cls = env->GetObjectClass(thiz);
        jfieldID id = env->GetFieldID(cls,"NativeObject","J");
        jlong pointer = env->GetLongField(thiz,id);
        JTestSimple* obj = (JTestSimple*)pointer;
        return obj->test();
    }
    
    JNIEXPORT jint JNICALL Java_testjni_TestSimple_Native_1test_1explicit
      (JNIEnv * env, jobject thiz) {
        jclass cls = env->GetObjectClass(thiz);
        jfieldID id = env->GetFieldID(cls,"NativeObject","J");
        jlong pointer = env->GetLongField(thiz,id);
        JTestSimple* obj = (JTestSimple*)pointer;
        return obj->CTestSimple::test();
    }
    
    JNIEXPORT jint JNICALL Java_testjni_TestSimple_Native_1test_1delegate
      (JNIEnv * env, jobject thiz) {
        jclass cls = env->GetObjectClass(thiz);
        jfieldID id = env->GetFieldID(cls,"NativeObject","J");
    
        jlong pointer = env->GetLongField(thiz,id);
        JTestSimple* obj = (JTestSimple*)pointer;
        return obj->test_delegate();
    }
    
    JNIEXPORT jlong JNICALL Java_testjni_TestSimple_Native_1new
      (JNIEnv * env, jobject thiz, jboolean derived) {
    
        jlong rv = (jlong)new JTestSimple(env, thiz, derived);
        return rv;
    }
    

    The only interesting thing is Java_testjni_TestSimple_Native_1test_1explicit, which does not invoke the test method on JTestSimple but on its superclass CTestSimple.

So my dear C++/Java gurus, how bad / dumb / … is this strategy?




New Eclipse JUnit Feature: Run Subtrees of Tests Individually, e.g. from Parameterized Tests

by Moritz Eysholdt (noreply@blogger.com) at November 18, 2014 01:58 PM

In Eclipse, the JUnit view nicely visualizes execution status and results of JUnit tests. JUnit tests usually are Java classes with test methods or Java classes annotated with @SuiteClasses, which compose multiple other test classes or suites.

But JUnit is more powerful than that. It allows you to implement custom test runners. Those test runners are responsible for creating a tree structure of test descriptions. Later, the runner needs to execute all leaves of that tree as test cases.

A node in this tree of tests is not necessarily backed by a Java class, as it is for test suites. It can be something purely virtual. JUnit itself ships with a test runner for which this is true: org.junit.runners.Parameterized.

The screenshot above shows a Parameterized test on the left-hand side and the Eclipse JUnit view (after the test has been executed) on the right-hand side. The method parameters() returns a list of test data sets, and the Parameterized runner will call the constructor of this test class for each data set. Consequently, the number of constructor parameters must match the number of items in each data set. For each data set, all test methods are executed and what we get, effectively, is a matrix test.

In the screenshot of the JUnit view we can see how the Parameterized runner presents the test matrix as a tree: The test class itself is the root node, every data set is a subtree and every entry of the test matrix is a leaf.
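The tree the runner builds can be sketched as a small stand-alone model. This is not JUnit's code, just a hypothetical illustration of the data-set × method matrix described above:

```java
import java.util.ArrayList;
import java.util.List;

// Mini-model of the tree built by the Parameterized runner:
// one subtree per data set, one leaf per (data set, test method) pair.
class ParameterizedTreeSketch {
    static List<String> leaves(List<Object[]> dataSets, List<String> testMethods) {
        List<String> result = new ArrayList<>();
        for (int i = 0; i < dataSets.size(); i++) {    // subtree for data set i
            for (String method : testMethods) {        // leaf "method[i]"
                result.add(method + "[" + i + "]");
            }
        }
        return result;
    }
}
```

Running a subtree corresponds to fixing `i` (a row of the matrix); running a single method corresponds to fixing `method` (a column).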

As of now (Eclipse Mars M4), it is possible to run individual subtrees by choosing "Run" from the context menu. This effectively means executing a row of the test matrix.

Additionally, it is now possible to run a single column from the test matrix by running a single method from a Parameterized test.

Note that in the picture on the left-hand side I'm explicitly clicking on the method's name. Clicking elsewhere would execute all test methods of the class.

The new mechanism to filter test description trees achieves this without being specific to JUnit's Parameterized runner: the filter first extracts the leading part of the test description's name that forms a valid Java identifier. The test is executed if this leading part exists and equals the name of the method to be executed.
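The idea of that filter can be sketched in a few lines. This is a hypothetical helper, not JDT's actual implementation:

```java
class TestNameFilterSketch {
    // Extract the leading run of characters that forms a valid Java identifier,
    // e.g. "testAdd[0: 1+1=2]" yields "testAdd"; returns "" if there is none.
    static String leadingJavaIdentifier(String description) {
        if (description.isEmpty() || !Character.isJavaIdentifierStart(description.charAt(0))) {
            return "";
        }
        int i = 1;
        while (i < description.length() && Character.isJavaIdentifierPart(description.charAt(i))) {
            i++;
        }
        return description.substring(0, i);
    }

    // A description matches when its leading identifier equals the method name.
    static boolean matches(String description, String methodName) {
        String lead = leadingJavaIdentifier(description);
        return !lead.isEmpty() && lead.equals(methodName);
    }
}
```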

I would like to thank NumberFour for having me implement this fix specifically for Xpect. Gratitude also goes to itemis, my employer, who gave me the time to generalize and contribute the fix to Eclipse JDT. Also I would like to thank the JDT team for accepting the contribution.



Espruino Pico on Kickstarter: only 4 days to go!

by Benjamin Cabé at November 18, 2014 01:38 PM

If you follow our regular IoT hangouts, you have probably already seen this presentation of the Espruino Pico:

If you haven’t, you really want to check out espruino.com!

Espruino is an Open Source and Open Hardware project that provides a super-tiny implementation of JavaScript that runs on micro-controllers. The Espruino board is a ready-to-use board on which you can run JavaScript IoT applications, but the Espruino interpreter can also run on lots of other targets.

Gordon Williams, the lead of Espruino, is working on a new version of the Espruino board with a tiny form factor, and his Kickstarter ends in only 4 days.

While the initial goal has already been reached, I would really like to see him reach the £50,000 stretch goal, since it means he will implement socket support in the Espruino interpreter, allowing IoT developers to use MQTT, CoAP and the like right from their JavaScript code!

Please consider supporting this very cool open source project and don’t wait any longer to go visit their Kickstarter page!



EclipseCon 2015 - Register Now

November 18, 2014 12:36 PM

It's now time to register for EclipseCon 2015! Join us in San Francisco, March 9-12.

November 18, 2014 12:36 PM

A JRebel Ticket Monster on OpenShift

by adietish at November 18, 2014 11:07 AM

JBoss Developer Studio allows you to publish and run your Eclipse projects on OpenShift. This post shows you how to get started, using the JBoss Developer Ticket Monster demo application.
In a second part I will show you how to change your code in Eclipse and have those changes instantly available on OpenShift using the JRebel cartridge.

Deploy and Run Ticket Monster on OpenShift

Get the code

The JBoss Developer project created the Ticket Monster demo to showcase a modern AngularJS and Java EE application. The code for this blog is at:

https://github.com/adietish/ticket-monster.git
The original code for the Ticket Monster is available at Github.
The webapp with its Maven pom is located in the demo folder. OpenShift, on the other hand, expects a pom in the root of the git repository; without it, OpenShift will not know how to build the code when you push it. To fix this I created a fork of the webapp where I moved the content of the demo folder to the root of the git repository.
  1. Copy this url to your clipboard and in your Eclipse, switch to the Git perspective.

  2. Launch the Clone Git Repository wizard. It will pop up with all the values already in place. Hit Next.

  3. Pick the master branch and hit Finish to get the repo cloned to your local machine.

  4. Once the cloning is done the new repo will appear in the Git Repositories view.

  5. Import the project within it by using Import Projects…​ from the context menu.

  6. Back in the JBoss or Java EE perspective you will spot the new project ticket-monster in your workspace.

OpenShift, create an Application for my project, please!

You are now ready to deploy it to OpenShift. Pick Configure ▸ New/Import OpenShift Application from the context menu of the ticket-monster project.

configure openshift application

The OpenShift application wizard that you now get prompts you to provide your OpenShift credentials. Either use an existing account from the combo or create a new one, providing server, username and password.
In the next page we choose to create a new OpenShift application and pick the application type. We will use WildFly 8 which you’ll find by filtering the available choices with "wildfly". The latest version of OpenShift has it in the Basic Cartridges (previous versions - like the one used in the screencast - had it in the quickstarts).

wildfly8 application

Once you picked it you can choose the application name in the next wizard page.
The wizard suggests a name that is based on the project name. Since OpenShift doesn't allow non-alphanumeric characters in the name, you have to correct it: remove the hyphen to get "ticketmonster".
You may also select the domain (if you have several ones) and possibly adjust the gear size and have your application scaling.

application settings

If you now hit Finish the wizard will create your OpenShift application, prepare your local project to run on OpenShift and create a server adapter for easy publishing.
Once the WildFly cartridge is created on OpenShift, the tooling presents you the credentials to access the WildFly administrative console in a dialog. You should copy those for later use:

wildfly admin console

In a last step the wizard informs you that it’ll prepare your local ticket-monster project to run on OpenShift. Confirm these changes by hitting OK.

prepare project for openshift

You are now ready to publish your ticket-monster project to OpenShift.

Deploy your project!

Go to the Servers view, select the freshly created server adapter and choose Publish ticket-monster to OpenShift from its context menu.

publish to openshift
publish to openshift 2

The dialog prompts you to provide a commit message and select the files that you want to commit and publish.
Among the listed files is the .gitignore. Double clicking it shows you that it now includes Eclipse specific project settings.
The wizard also added OpenShift configurations in the .openshift folder.
A further interesting bit is a marker that has WildFly running on Java 8 (.openshift/markers/java8). OpenShift markers let you configure specific behavior such as enabling debugging (Enable JPDA marker) or hot deployment (Hot Deploy marker); you can choose between Java 7 and Java 8 via the marker files in .openshift/markers. You can also spot a WildFly configuration file in .openshift/config/standalone.xml that you can modify to your needs. Once you have checked all files by clicking the Select All button and provided a commit message, you're good to go: publish your project to OpenShift by hitting Commit and Publish.

The tooling then informs that it will overwrite the current code in OpenShift by doing a push force. It asks you to confirm this.

push force

This is required since the wizard does not properly include the remote history in your local project/git repo. It clones the remote repo and then copies some of its content into the local project; it does not do a recursive merge, since this is/was not fully reliable yet (see JBIDE-14890 for further details). It is therefore required to overwrite the remote content when you do the initial deployment. Once you hit OK, the server adapter pushes the local code to OpenShift.

Eclipse then shows you the Console with the output of the publishing operation. You can see how the maven build is triggered, WildFly is restarted and your project is then deployed.

build and deploy

In order to verify that your server is fully started you can have a look at the logs. Pick OpenShift ▸ Tail files…​ from the OpenShift submenu in the context menu of your ticketmonster server adapter. The upcoming wizard allows you to fine-tune the tail options and include the gears that your application is running on. You can stick to the defaults, which are usually just fine.

tail files

A fully started WildFly will output 'WildFly 8.1.0.Final "Kenny" started' in the logs.

wildfly started

You’re now ready to have a look at your running project. Pick Show In ▸ Web Browser from the context menu of your ticketmonster server adapter and see how the browser is pointed at your deployed webapp.

open browser
ticket monster webapp

Change locally, see OpenShift change instantly!

Install JRebel

We are now going a step further and will show you how to change the application code locally and have those changes instantly available on OpenShift.
To achieve this you need to install the JRebel plugin into your JBoss Developer Studio. The Eclipse plugin is available from JBoss Central: switch to the Software/Updates tab, search for JRebel, check it once it is listed and hit "Install/Update". Once you have restarted Eclipse, you have JRebel enabled in your IDE.

install jrebel

Enable JRebel for your Project

Open up the context menu of your project and enable the JRebel Nature for your project (JRebel ▸ Add JRebel Nature). In a second step, enable JRebel Remoting.

enable jrebel remoting

You have to tell the local JRebel where to publish to. You therefore need the public URL of your Ticket Monster application as it runs on OpenShift. You get it from the application details: pick OpenShift ▸ Details and copy the Public URL.

application public url

Paste it to the JRebel Deployment URL(s) by picking Advanced Properties from the JRebel context menu of your ticket-monster project.

jrebel deployment url

Downgrade to Java 7

WildFly is configured to run with Java 8 by default. With JRebel enabled, the small OpenShift gear that you get for free tends to run out of memory; it is therefore suggested that you downgrade to Java 7. Go to the context menu of your project, pick OpenShift ▸ Configure Markers…​, uncheck java8 and check java7.

java7 marker

Add the JRebel cartridge in OpenShift

The JRebel cartridge for OpenShift, available from GitHub, makes it very easy to enable JRebel for any Java app on OpenShift. To add this cartridge to your application, go to the Servers view and choose OpenShift ▸ Edit Embedded Cartridges…​.
In the upcoming wizard, check the Code Anything cartridge and paste the following url:

https://cartreflect-claytondev.rhcloud.com/reflect?github=openshift-cartridges/openshift-jrebel-cartridge
code anything cartridge

Once you hit Finish the wizard will add the cartridge to your OpenShift application and enable JRebel for it.

Publish your project to OpenShift

You now have to push all your local changes to OpenShift (you added the JRebel nature and downgraded to Java 7). Tell the server adapter to publish: choose Publish in the context menu of your OpenShift server adapter.
The upcoming commit- and publish-dialog shows your local changes:

jrebel changes

You replaced the java8 marker with a java7 marker and added two XML files that configure JRebel. Once you add a commit message you're ready to hit Commit and Publish.
If you now go to the Console view and pick the ticketmonster, you will see how OpenShift picks those changes and rebuilds your code.

ticketmonster console

You can inspect the server logs to make sure the procedure has finished and WildFly has fully restarted. In the Servers view, pick OpenShift ▸ Tail Files…​, stick to the default options and hit Finish.

wildfly started

'WildFly 8.1.0.Final "Kenny" started' in the logs tells you that WildFly was successfully restarted. You are now ready to change code locally and have the changes picked up in OpenShift instantly.

Pick my local changes instantly, OpenShift!

We will change the ticket price, so let us first check the current price. Use Show In ▸ Browser in the context menu of your server adapter to open the application in your browser. In the browser, hit Buy tickets now, then Book Ticket, and choose some venue, date, time and section. You will then see the current price:

ticket price1

Back in your JBoss Developer Studio, let us now change the ticket price:
Open up the TicketPrice class and navigate to the getPrice() method. Change it to the following:

    public float getPrice() {
        // return price;
        return createFakePrice();
    }

    private float createFakePrice() {
        return 42f;
    }

When you save your Java editor, you will see the JRebel console popping up and show you how it is updating the java classes in OpenShift.

rebel updating openshift

Now get back to your browser and refresh the page. You will have to select the venue again in order to see the new ticket price: it is now $42!

ticket price2

We did not have to publish our code to OpenShift via the server adapter. JRebel published our local changes on the fly!


by adietish at November 18, 2014 11:07 AM

Statechart Tools: What's in the pipeline?

by Andreas Mülder (noreply@blogger.com) at November 17, 2014 10:36 AM

In the upcoming Statechart Tools release we did a huge refactoring of the simulation infrastructure. This long-overdue refactoring was necessary to define concise APIs for plugging in custom simulation engines with different execution semantics. It took some time, but in the end our Jenkins build is back to stable again ;-)
 

Debugging Features

On top of the new APIs we built some great new debugging features for the graphical statechart editor. It is now possible to set breakpoints on transitions and states. If a breakpoint is hit, the simulation engine suspends before the transition or state is executed. The semantic element the breakpoint is attached to is highlighted in a different color, as you can see in the screenshot below.

The debugging features are tightly integrated into the Eclipse debugging infrastructure. The Eclipse Breakpoints view shows all statechart breakpoints and allows you to enable or disable them. But that's not all: it is even possible to specify conditional breakpoints with our expression language!

Simply check the 'Conditional' checkbox in the breakpoint's detail pane and specify any expression that evaluates to a boolean type. For the example model above, it would be possible to specify an expression like 'ABC.interfaceVar == 10'. If the expression evaluates to 'true', the breakpoint hits.
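
Conceptually, the engine's decision to suspend boils down to something like the following sketch. All names here (`ConditionalBreakpoint`, `shouldSuspend`, the stub evaluator) are invented for illustration; this is not the actual Statechart Tools API:

```java
import java.util.function.Function;

public class ConditionalBreakpoint {

    private final boolean enabled;
    private final String condition; // e.g. "ABC.interfaceVar == 10"; null = unconditional

    ConditionalBreakpoint(boolean enabled, String condition) {
        this.enabled = enabled;
        this.condition = condition;
    }

    // Suspend only if the breakpoint is enabled and its condition
    // (if any) evaluates to true. The evaluator stands in for the
    // statechart expression-language interpreter.
    boolean shouldSuspend(Function<String, Object> evaluator) {
        if (!enabled) {
            return false;
        }
        if (condition == null) {
            return true; // unconditional breakpoint always hits
        }
        Object result = evaluator.apply(condition);
        return Boolean.TRUE.equals(result);
    }

    public static void main(String[] args) {
        // Stub evaluator: pretend ABC.interfaceVar is currently 10.
        Function<String, Object> evaluator =
                expr -> expr.equals("ABC.interfaceVar == 10");
        System.out.println(new ConditionalBreakpoint(true, "ABC.interfaceVar == 10")
                .shouldSuspend(evaluator)); // true
        System.out.println(new ConditionalBreakpoint(false, null)
                .shouldSuspend(evaluator)); // false
    }
}
```

The interesting detail is that a non-boolean evaluation result simply never triggers the breakpoint, which is why the expression must evaluate to a boolean type.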

Simulation Snapshots

Another great new feature we developed is so-called simulation snapshots. (Thanks to Holger, who developed a large part of the snapshot functionality as his 4 + 1 project.) During simulation, you can take a snapshot of the current execution state via the "Snapshot view". These snapshots can be restored at any time, which simplifies the process of statechart debugging significantly.

The snapshot details section shows the execution state at the time the snapshot was taken and previews an image with all active states. Statechart debugging has never been so easy! ;-)

This is just a short preview; detailed documentation on how to use the new features will follow. What do you think? Do you like the new features? Do you have another proposal for how to simplify statechart debugging? Just leave a comment or send us some feedback via our user group!
 


by Andreas Mülder (noreply@blogger.com) at November 17, 2014 10:36 AM

5 Usability Flaws in GMF (and how to fix them)

by Andreas Mülder (noreply@blogger.com) at November 17, 2014 10:35 AM



Although GMF (Graphical Modeling Framework) editors are an endangered species and future editors will be developed with JavaFX, there is still a bunch of GMF editors out there. And most of them have something in common: a lack of usability. The good news is that some usability flaws are easy to fix. In Yakindu Statechart Tools we spent a lot of time improving the usability of our GMF Runtime based editor. Here is a random selection of 5 usability flaws in GMF and how we fixed them.



The following code snippets are just examples of how a solution could look. Most of the snippets are still experimental, were developed for specific use cases, and should not be used as-is!



1. Remove dithering connection anchors

Everyone who has ever worked with a GMF based editor knows this annoying default behavior. You spend a significant amount of time arranging the diagram's connections to produce a nice-looking diagram. Then, when everything finally looks pretty, you enlarge a node, and all the connection routing work turns out to have been a waste of time...


 To fix this, you can use an AdjustIdentityAnchorCommand.java that can be chained to a SetBoundsCommand and recalculates the IdentityAnchor position based on the resize delta. Create a customized LayoutEditPolicy and install it on the container EditPart as shown below:

installEditPolicy(EditPolicy.LAYOUT_ROLE, new XYLayoutEditPolicy() {
  @Override
  protected Command getResizeChildrenCommand(ChangeBoundsRequest request) {
    CompoundCommand result = new CompoundCommand();
    result.add(super.getResizeChildrenCommand(request));
    AdjustIdentityAnchorCommand command = new AdjustIdentityAnchorCommand(
        TransactionUtil.getEditingDomain(resolveSemanticElement()), request);
    result.add(new ICommandProxy(command));
    return result;
  }
});

2. Remove scrollable compartments and auto-resize the container hierarchy

Working with GMF compartments is no fun. The compartment scrollbars look pretty ugly, and diagram elements are hidden in non-visible areas. Sometimes edges are clipped because a node is not fully visible, and you do not even know that the edge exists. A better approach is to always show all elements in the compartment; if your diagram gets too large, consider using sub-diagrams instead of scrollable compartments. Without auto-resizing, the user has to enlarge the whole container hierarchy by hand whenever new nodes are added to a compartment.


The EnlargeContainerEditPolicy.java auto-resizes the container hierarchy if more space is required. Every node that is in the way is moved to prevent overlapping nodes. It has to be installed for every EditPart in the compartment:

installEditPolicy(EnlargeContainerEditPolicy.ROLE, new
  EnlargeContainerEditPolicy());
 

3. Add a preferred size selection handle

This EditPolicy adds a new handle to the selection handles. If this handle is pressed, the selected node updates its size to the preferred size. The preferred size is the minimum size of the node at which all node children are visible. When modeling with the UML tool MagicDraw I found this preferred-size handle to be very useful, so I added it to my GMF based editors as well.

 

To add the preferred size selection handle to a node, install the PreferredSizeHandlerEditPolicy.java on the node's EditPart for the PRIMARY_DRAG_ROLE. Check the type hierarchy carefully!


installEditPolicy(EditPolicy.PRIMARY_DRAG_ROLE, new
  PreferredSizeHandlerEditPolicy());


4. Highlight clickable areas and open direct editing on double click

It is a good idea to indicate clickable areas with a highlighted border, so the user immediately gets feedback on where an in-diagram editor can be opened. These editors should open on a simple double-click, no matter which node is currently selected in the diagram. With the GMF defaults it is only possible to open the direct editor when a node is selected.


To highlight clickable areas within the diagram there is a (really simple) HighlightingWrappingLabel.java that shows a border on mouse hover to indicate a clickable area. To open the direct editor on double click, use the DoubleClickDirectEditDragTracker.java. This DragTracker generates DirectEditRequests on double clicks and replaces the existing DragTracker of the EditPart like this:

@Override
public DragTracker getDragTracker(final Request request) {
  if (request instanceof SelectionRequest
      && ((SelectionRequest) request).getLastButtonPressed() == 3)
    return null;
  IDoubleClickCallback callback = new IDoubleClickCallback() {
    public void handleDoubleClick(int btn) {
      performDirectEditRequest(request);
    }
  };
  return new DoubleClickDirectEditDragTracker(this, getTopGraphicEditPart(), callback);
}

5. Add rich text editing support to direct editing

If you use direct editing only to set names for nodes, the default GMF direct editing behavior is sufficient. If you have a more formal language, for example an expression language, you should consider using Xtext to generate an editor for your grammar and integrating it into your GMF editor to support syntax coloring, validation and cross-referencing.



For the technical details have a look at Xtext integration into GMF !




by Andreas Mülder (noreply@blogger.com) at November 17, 2014 10:35 AM

Yakindu Statechart Tools 2.4.0 released!

by Andreas Mülder (noreply@blogger.com) at November 17, 2014 10:34 AM

We released Yakindu Statechart Tools 2.4.0 today! It contains a lot of bugfixes and improvements as well as new features.

Installation
This release requires Eclipse Luna. You can install SCT 2.4.0 from our update site:
http://updates.yakindu.org/sct/luna/releases/ or you can download a full Eclipse zip package from our download page.

New and Noteworthy

Here is a summary of the improvements worth mentioning:

1. Toggle documentation toolbar entry added
We added a menu item to the editor's toolbar to toggle between documentation and the formal expression language. This allows you to toggle modes for the whole diagram.

2. C/C++ generator feature for inner class function visibility added
innerFunctionVisibility (String, optional): This parameter changes the visibility of inner functions and variables. By default, "private" visibility is used. It can be changed to "protected" to allow function overriding in a class which inherits from the generated statemachine base class.
Example configuration:

feature GeneratorOptions {
    innerFunctionVisibility = "protected"
}

3. C/C++ generator feature to change operation callback generation behavior added
staticOperationCallback (Boolean, optional): If set to 'true', the callback function declarations for statechart operations are static and the functions are called statically by the statemachine code.
Example configuration:
 
feature GeneratorOptions {
    staticOperationCallback = true
}

4. Model and Diagram compare feature
This release contains the beta version of a diff/merge viewer for statechart diagrams. It integrates seamlessly with the Eclipse Team API to allow diffing and merging of different revisions as well as comparing models with the local history.

compare_example

Note that this feature is still beta. There are currently two known problems (this and this) with the editor that will be fixed for the next release.

Bugfixes

We also fixed a bunch of bugs reported via our User Group:

Toggle subregion alignment does not work
Transition into substate does not recognize parent's history context
Model can not be simulated if operations in named interfaces are used
C++ generator always generates actions with a lower-case letter in source file
CoreFunction methods for long types incomplete
Operations of other statecharts (in the same folder) are in scope
Class cast exception if Java-provided custom operation is not executable
Simulating operations with custom Java code is not working as intended
Diagram corrupted when moving transition label

Thanks to all bug reporters!

by Andreas Mülder (noreply@blogger.com) at November 17, 2014 10:34 AM

"Catch me if you can" - Java on wearables (60-minute extended talk)

by Eclipse Foundation at November 17, 2014 09:47 AM

Wearable computers are one of the next big things. But at the moment, one can buy only specialized systems such as motion trackers, GPS watches, and the like. So why not use existing cheap technology to build your own wearable Java-powered device? This session shows what you can do with affordable technology and Java today. It uses a Raspberry Pi in combination with a heart rate sensor and a GPS sensor to track the heart rate and the location of a runner. The battery-powered Pi measures the data and publishes it via MQTT to different clients such as Java(FX)-based desktop clients, an iPhone client and a smartwatch the runner can wear. Come and get some ideas of what you can do with Java and wearable devices.

by Eclipse Foundation at November 17, 2014 09:47 AM

Eclipse and Java™ 8

by Eclipse Foundation at November 17, 2014 09:31 AM

This session will present the most important new stuff in Java™ 8. It will show how to get started developing Java 8 code with Eclipse and then demo the new features that are available in Eclipse for this new Java™ release. We will also look behind the curtain and see how the JDT team accomplished that great piece of work.

by Eclipse Foundation at November 17, 2014 09:31 AM

The JVM Universe - Java and the IoT Big Bang

by Eclipse Foundation at November 17, 2014 09:18 AM

This session provides an overview of Java's role in the exploding IoT universe. Starting with the JVM, we'll look up through JRE and JDK implementations, and then out through Java Card, Java ME and Java SE platforms. We'll see where Java fits from the tiniest of devices to the very largest of cloud deployments. We will examine the role of OpenJDK and the various Java vendor implementations available. At the end of this talk attendees should have a strong sense of the role the various Java platforms play, what tools are recommended depending on your needed scale and where to find all the right pieces. Attendees should also have a strong sense of the terminology surrounding the Java platform at various stages of development and scale. Don't get trapped in the black hole - use Java to reach escape velocity.

by Eclipse Foundation at November 17, 2014 09:18 AM

Highly Efficient Java & JavaScript Integration

by Ian Bull at November 17, 2014 05:24 AM

Over the past 4 months I’ve been working on integrating Java and JavaScript in a highly efficient manner. Rhino and Nashorn are two common JavaScript runtimes, but these did not meet my requirements in a number of areas:

  • Neither supports primitives. All interactions with these platforms require wrapper classes such as Integer, Double or Boolean.
  • Nashorn is not supported on Android.
  • Rhino compiler optimizations are not supported on Android.
  • Neither engine supports remote debugging on Android.

To help address these issues, I’ve built a new JavaScript runtime for Java based on Google’s JavaScript Engine, V8. The runtime, called J2V8, is open sourced under the EPL and available on GitHub. I’ve been running it on MacOS, Linux and Android.

Unlike other JS runtimes (including JV8 and Jav8), J2V8 takes a primitive-based approach, resulting in much less garbage. The following script produces an array containing the first 100 Fibonacci numbers. Each of these numbers can be accessed directly, without the need for wrapper objects. Executing this script 10,000 times on Rhino takes 7.5 seconds; the same script runs 10,000 times on J2V8 in under 1 second.

var i;
var fib = []; // initialize array

fib[0] = 1;
fib[1] = 1;
for (i = 2; i <= 100; i++) {
    fib[i] = fib[i - 2] + fib[i - 1];
}
fib;

Fibonacci in JavaScript

V8Array array = v8.executeArrayScript(script);
double total = 0;
for (int i = 0; i < 100; i++) {
  total += array.getDouble(i);
}
System.out.println(total);
array.release();

Accessing JS Arrays in Java
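
For comparison, the same series and total can be computed in plain Java, independently of J2V8. This is a purely illustrative sanity check of what the script hands back, using an invented helper name (`fibonacciTotal`):

```java
public class FibonacciCheck {

    // Reproduce the script's series (fib[0] = fib[1] = 1) and sum the
    // first `count` entries, just like the Java snippet above does
    // with array.getDouble(i) for i in [0, count).
    static double fibonacciTotal(int count) {
        double[] fib = new double[count + 1];
        fib[0] = 1;
        fib[1] = 1;
        for (int i = 2; i <= count; i++) {
            fib[i] = fib[i - 2] + fib[i - 1];
        }
        double total = 0;
        for (int i = 0; i < count; i++) {
            total += fib[i];
        }
        return total;
    }

    public static void main(String[] args) {
        // Sum of the first 5 entries: 1 + 1 + 2 + 3 + 5
        System.out.println(fibonacciTotal(5)); // 12.0
        System.out.println(fibonacciTotal(100));
    }
}
```

Doubles are sufficient here because V8 numbers are doubles anyway, which is also why the J2V8 snippet reads the array with getDouble.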

The runtime was built by exposing parts of the V8 API through a thin JNI bridge. This approach allows you to embed V8 directly into Java applications without writing any C/C++. J2V8 currently targets V8 version 3.26, but I'll be updating that to a more recent build soon.

The runtime currently supports V8Objects, V8Arrays, invoking scripts, calling JS functions from Java, and registering Java functions as callbacks from JS. There is also a small library for converting V8Objects and V8Arrays to Java Maps and Lists. Finally, the runtime supports remote debugging and can be used with tools such as Chrome Developer Tools for Eclipse.

Over the next few weeks I’ll be publishing builds and putting together a getting started guide. If you are interested in this project, please let me know or ping me on twitter.




by Ian Bull at November 17, 2014 05:24 AM

ECF 3.9.1 Released

by Scott Lewis (noreply@blogger.com) at November 16, 2014 11:31 PM

ECF 3.9.1 is now available. See the New and Noteworthy.

by Scott Lewis (noreply@blogger.com) at November 16, 2014 11:31 PM