Creating a new stack in Eclipse Che from an existing Docker image

by Tracy M at February 24, 2017 02:07 PM

In Eclipse Che, stacks are runtime configurations for workspaces. Eclipse Che 5.0 provides the ability to quickly create a new stack based on an existing Docker image. There are a few requirements, the main one being that the image must include a bash shell (fortunately, most already do). Additionally, Docker images used with Che must have a non-terminating CMD so that the container doesn’t shut down immediately after starting. However, as we will see, this can be handled in Che without modifying the Docker image.

This article outlines how you can create a new stack based on the docker/whalesay image, the Docker image many folks would have come across when going through the Docker tutorial.

Step 1: Create a new Runtime Stack

  1. Install and launch Eclipse Che. (This article is based on 5.3.1)
  2. In the left-hand menu, click on ‘Stacks’, then ‘Add Stack’.
  3. In the new stack page, fill in the ‘Name’ field, e.g. ‘Eclipse Che Whalesay’.
  4. For good measure, in the Tags section, delete the ‘Java 1.8’ tag and add any preferred tags (press Enter after each tag to turn it blue).

From now on, your stack is available on the dashboard’s stack list, but it doesn’t yet do anything impressive.

Step 2: Reference the Docker Image

  1. To reference the Docker image, in the stack editor, first scroll down to the ‘Raw Configuration’ section and click ‘Show’. Find where it says ‘image’ and replace the existing entry with docker/whalesay. Click ‘Save’.
  2. As the whalesay image normally exits after running, we need to keep it running. We can do this using the Docker Compose syntax that is now supported in the stack editor, by adding the following command to the content:
    command: [tail, -f, /dev/null]

    Whitespace matters, so check the recipe preview in the ‘Runtimes’ section to ensure it looks like correctly formatted Docker Compose syntax. Although this is a bit hacky, it means we can reuse the Docker image completely unchanged.
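Putting it together, the relevant part of the raw stack configuration ends up looking roughly like the sketch below. This is only an illustration: the surrounding service name and other fields are generated by Che and may differ in your version; the two lines we actually edited are `image` and `command`.

```yaml
services:
  dev-machine:
    image: docker/whalesay
    # tail -f /dev/null never exits, so the container stays alive
    command: [tail, -f, /dev/null]
```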

Step 3: Add a Custom Command

Next we will create a command that lets us easily run the whalesay image within the browser IDE.

  1. In the Commands section, click Add. Fill in the name and command fields. In this case we’ll simply make the whale say ‘boo’.
    • Name: cowsay
    • Command: cowsay boo

Step 4: Test the Stack

  1. That’s it; we can now test it straight away by clicking the ‘Test’ button. A dialog will pop up; click ‘Test Workspace’ (no need to import any projects).
  2. This should pull the whalesay image and launch a workspace with it configured.
  3. Click the play icon next to the cowsay command and you should see whalesay in the terminal saying ‘boo’!

Bonus Step: Use Macros

  1. We can also use macros in commands. Try editing the command to have the whale say the name of the selected file, like this:
    • Name: cowsay
    • Command: cowsay ${explorer.current.file.name}

    A full list of macros can be found here: https://www.eclipse.org/che/docs/ide/commands/#macros

  2. Then test the workspace again, this time including one or two example projects. Click on a file, then run the command. This time whalesay should say the name of the selected file.

And that’s it, an unchanged Docker image easily integrated into a new runtime stack!





itemis + TypeFox = Xtext!

by Jens Wagener (wagener@itemis.de) at February 24, 2017 01:35 PM


As you might have noticed there was a very crucial change in the Eclipse Xtext project at the beginning of last year. Our colleagues in Kiel have decided to leave itemis and have founded their own company, TypeFox.

The Xtext project is now based on two major pillars and not just on one.

To be honest, we at itemis were not very happy about this development. And as always, a separation does not happen without personal friction.

Nonetheless, we will continue our open source commitment in general, and our commitment to Eclipse Xtext in particular.

We decided to increasingly focus on the broader topic of Language Engineering, in addition to the specific technology of Xtext. We have made a lot of plans and will soon be able to announce great news. Let us surprise you.

We are looking forward to working with our ex-colleagues of TypeFox, whose expertise we still value very much. We are sure that the Xtext project is more than ever on a solid foundation and will be in the future what it always was: A damn good framework.

Thanks to all who have contributed to this in the past and will do so in the future.



JBoss Tools and Red Hat Developer Studio Maintenance Release for Eclipse Neon.2

by jeffmaury at February 24, 2017 11:42 AM

JBoss Tools 4.4.3 and Red Hat JBoss Developer Studio 10.3 for Eclipse Neon.2 are here waiting for you. Check it out!


Installation

JBoss Developer Studio comes with everything pre-bundled in its installer. Simply download it from our JBoss Products page and run it like this:

java -jar jboss-devstudio-<installername>.jar

JBoss Tools or Bring-Your-Own-Eclipse (BYOE) JBoss Developer Studio require a bit more:

This release requires at least Eclipse 4.6.2 (Neon.2), but we recommend using the latest Eclipse Neon JEE bundle, since it comes with most of the dependencies preinstalled.

Once you have installed Eclipse, you can either find us on the Eclipse Marketplace under "JBoss Tools" or "Red Hat JBoss Developer Studio".

For JBoss Tools, you can also use our update site directly.

http://download.jboss.org/jbosstools/neon/stable/updates/

What is new?

Our main focus for this release was improvements for container based development and bug fixing.

Improved OpenShift 3 and Docker Tools

We continue to work on providing better experience for container based development in JBoss Tools and Developer Studio. Let’s go through a few interesting updates here.

Scaling from pod resources

When an application is deployed to OpenShift, it was previously only possible to scale the pod resources from the service resource.

scale command from service

However, that was not the most intuitive place for it. The command is now also available at the pod level, leading to better usability.

scale command from pod

OpenShift Container Platform 3.4 support

OpenShift Container Platform (OCP) 3.4 has been announced by Red Hat. JBoss Tools 4.4.3 has been validated against OCP 3.4.

CDK 3 Beta Server Adapter

A new server adapter has been added to support the next generation of CDK 3. This is a Tech Preview in this release, as CDK 3 is still in Beta. While the server adapter itself has limited functionality, it is able to start and stop the CDK virtual machine via its minishift binary. Simply hit Ctrl+3 (Cmd+3 on OS X) and type ‘CDK’: that will bring up a command to set up and/or launch the CDK server adapter. You should see the old CDK 2 server adapter along with the new CDK 3 one (labeled Red Hat Container Development Kit 3 (Tech Preview)).

cdk3 server adapter

All you have to do is set the credentials for your Red Hat account, the location of the CDK’s minishift binary, and the type of virtualization hypervisor.

cdk3 server adapter1

Once you’re finished, a new CDK server adapter will be created and become visible in the Servers view.

cdk3 server adapter2

Once the server is started, Docker and OpenShift connections should appear in their respective views, allowing you to quickly create a new OpenShift application and begin developing your AwesomeApp in a highly reproducible environment.

cdk3 server adapter3
cdk3 server adapter4
This is a Tech Preview: the implementation is subject to change, it may not work with future releases of CDK 3, and testing has been limited.

Hibernate Tools

Hibernate Runtime Provider Updates

A number of additions and updates have been performed on the available Hibernate runtime providers.


The Hibernate 5.0 runtime provider now incorporates Hibernate Core version 5.0.12.Final and Hibernate Tools version 5.0.4.Final.

The Hibernate 5.1 runtime provider now incorporates Hibernate Core version 5.1.4.Final and Hibernate Tools version 5.1.2.Final.

The Hibernate 5.2 runtime provider now incorporates Hibernate Core version 5.2.7.Final and Hibernate Tools version 5.2.1.Final.

Forge Tools

Forge Runtime updated to 3.5.1.Final

The included Forge runtime is now 3.5.1.Final. Read the official announcement here.


What is next?

With JBoss Tools 4.4.3 and Developer Studio 10.3 out, we are already working on the next maintenance release, for Eclipse Neon.3.

Enjoy!

Jeff Maury



Control OSGi DS Component Instances via Configuration Admin

by Dirk Fauth at February 24, 2017 07:43 AM

While trying to clean up the OSGi services in the Eclipse Platform Runtime, I came across the fact that singleton service instances are not always feasible. For example, handling localization at the application level does not work in the context of RAP, where every user can have a different localization.

In my last blog post I showed how to manage service instances with Declarative Services. In that scope I mainly showed the following scenarios:

  • one service instance per runtime
  • one service instance per bundle
  • one service instance per component/requestor
  • one service instance per request

For cases like the RAP scenario, these four categories don’t match very well. We actually need something additional, like one service instance per session. But a session is not a natural concept in the OSGi world. At least not as natural as it is in the context of a web application.

First I tried to find a solution using PROTOTYPE-scoped services, introduced with DS 1.3. But IMHO that approach doesn’t fit very well, as by default the services have bundle scope unless the consumer specifies that a new instance is needed. The approach of creating service instances on demand by using a Factory Component or the DS 1.3 ComponentServiceObjects interface does not seem to be a good option in this case either: the consumer is in charge of creating and destroying the instances, and needs to be aware of that fact.

A session is mainly used to associate a set of states with someone (e.g. a user) over time. The localization setting of a user is a configuration value, and configurations for OSGi services are managed by the Configuration Admin. With this in mind, searching the web and digging through the OSGi Compendium Specification, I came across the Managed Service Factory and this blog post by Neil Bartlett (already quite a few years old).

To summarize briefly: the idea is to create a new service instance per Component Configuration. So for every session a new Component Configuration needs to be created, which leads to the creation of a new Component Instance. Typically some unique identifier, like the session ID, is added to the component properties so that filters can be based on it.

The Managed Service Factory description in the specification is hard to understand (at least for me), the existing tutorials mainly focus on usage without Declarative Services by implementing the corresponding interfaces, and Neil’s blog post unfortunately only covers half of the topic. Therefore I will try to explain how to create service instances for different configurations with a small example based on the previous tutorial.

The sources for this blog post can be found in my DS projects on GitHub:

Note:
I will try to bring in some Configuration Admin details at the corresponding places, but for more information in advance please have a look at my Configuring OSGi Declarative Services blog post.

Service Implementation

Let’s start by creating the service implementation. Implement the OneShot service interface and put it in the org.fipro.oneshot.provider bundle from the previous blog post.

@Component(
    configurationPid="org.fipro.oneshot.Borg",
    configurationPolicy=ConfigurationPolicy.REQUIRE)
public class Borg implements OneShot {

    @interface BorgConfig {
        String name() default "";
    }

    private static AtomicInteger instanceCounter =
            new AtomicInteger();

    private final int instanceNo;
    private String name;

    public Borg() {
        instanceNo = instanceCounter.incrementAndGet();
    }

    @Activate
    void activate(BorgConfig config) {
        this.name = config.name();
    }

    @Modified
    void modified(BorgConfig config) {
        this.name = config.name();
    }

    @Override
    public void shoot(String target) {
        System.out.println("Borg " + name
            + " #" + instanceNo + " of "+ instanceCounter.get()
            + " took orders and executed " + target);
    }

}

You should notice the following with that implementation:

  • We specify a configuration PID so it is not necessary to use the fully qualified class name later.
    Remember: the configuration PID defaults to the configured name, which defaults to the fully qualified class name of the component class.
  • We set the configuration policy REQUIRE, so the component will only be satisfied and therefore activated once a matching configuration object is set by the Configuration Admin.
  • We create the Component Property Type BorgConfig for type safe access to the Configuration Properties (DS 1.3).
  • We add life cycle methods for activate to initially consume and modified to be able to change the configuration at runtime.

Configuration Creation

The next thing is to create a configuration. For this we need to have a look at the ConfigurationAdmin API. In my Configuring OSGi Declarative Services blog post I only talked about ConfigurationAdmin#getConfiguration(String, String). This is used to get or create the configuration of a singleton service. With the configuration policy REQUIRE, this means that a single Managed Service is created once the Configuration object is used by a requesting bundle. In such a case the Configuration Properties will contain the property service.pid with the value of the configuration PID.

To create and handle multiple service instances via Component Configurations, a different API needs to be used. For creating new Configuration objects there is ConfigurationAdmin#createFactoryConfiguration(String, String). This way a Managed Service Factory will be registered by the requesting bundle, which makes it possible to create multiple Component Instances with different configurations. In this case the Configuration Properties will contain the property service.factoryPid with the value of the configuration PID and service.pid with a unique value.

As it is not possible to mix Managed Services and Managed Service Factories with the same PID, another method needs to be used to access existing configurations. For this ConfigurationAdmin#listConfigurations(String) can be used. The parameter can be a filter and the result will be an array of Configuration objects that match the filter. The filter needs to be an LDAP filter that can test any Configuration Properties, including service.pid and service.factoryPid. The following snippet for example will only return existing Configuration objects for the Borg service when it was created via Managed Service Factory.

this.configAdmin.listConfigurations(
    "(service.factoryPid=org.fipro.oneshot.Borg)")

The parameters of ConfigurationAdmin#getConfiguration(String, String) and ConfigurationAdmin#createFactoryConfiguration(String, String) are actually the same: the first is the PID, which needs to match the configuration PID of the component; the second is the location binding. It is best practice to use “?” as the value for the location parameter.

Create the following console command in the org.fipro.oneshot.command bundle:

@Component(
    property= {
        "osgi.command.scope=fipro",
        "osgi.command.function=assimilate"},
    service=AssimilateCommand.class
)
public class AssimilateCommand {

    @Reference
    ConfigurationAdmin configAdmin;

    public void assimilate(String soldier) {
        assimilate(soldier, null);
    }

    public void assimilate(String soldier, String newName) {
        try {
            // filter to find the Borg created by the
            // Managed Service Factory with the given name
            String filter = "(&(name=" + soldier + ")"
                + "(service.factoryPid=org.fipro.oneshot.Borg))";
            Configuration[] configurations =
                this.configAdmin.listConfigurations(filter);

            if (configurations == null
                    || configurations.length == 0) {
                //create a new configuration
                Configuration config =
                    this.configAdmin.createFactoryConfiguration(
                        "org.fipro.oneshot.Borg", "?");
                Hashtable<String, Object> map = new Hashtable<>();
                if (newName == null) {
                    map.put("name", soldier);
                    System.out.println("Assimilated " + soldier);
                } else {
                    map.put("name", newName);
                    System.out.println("Assimilated " + soldier
                        + " and named it " + newName);
                }
                config.update(map);
            } else if (newName != null) {
                // update the existing configuration
                Configuration config = configurations[0];
                // it is guaranteed by listConfigurations() that
                // only Configuration objects are returned with
                // non-null properties
                Dictionary<String, Object> map =
                    config.getProperties();
                map.put("name", newName);
                config.update(map);
                System.out.println(soldier
                    + " already assimilated and renamed to "
                    + newName);
            }
        } catch (IOException | InvalidSyntaxException e1) {
            e1.printStackTrace();
        }
    }
}

In the above snippet, name is used as the unique identifier for a created Component Instance. So the first thing is to check whether there is already a Configuration object stored for that name. This is done by using ConfigurationAdmin#listConfigurations(String) with an LDAP filter on the name and on service.factoryPid=org.fipro.oneshot.Borg, which is the value of the configuration PID we used for the Borg service component. If no configuration exists for a Borg with the given name, a new Configuration object is created; otherwise the existing one is updated.

Note:
To verify the Configuration Properties you could extend the activate method of the Borg implementation to show them on the console like in the following snippet:

@Activate
void activate(BorgConfig config, Map<String, Object> properties) {
    this.name = config.name();
    properties.forEach((k, v) -> {
        System.out.println(k+"="+v);
    });
}

Once a service instance is activated it should output all Configuration Properties, including the service.pid and service.factoryPid for the instance.

Note:
Some more information on that can be found in the enRoute documentation and of course in the specification.

Service Consumer

Finally we add the following execute command in the org.fipro.oneshot.command bundle to verify the instance creation:

@Component(
    property= {
        "osgi.command.scope=fipro",
        "osgi.command.function=execute"},
    service=ExecuteCommand.class
)
public class ExecuteCommand {

    @Reference(target="(service.factoryPid=org.fipro.oneshot.Borg)")
    private volatile List<OneShot> borgs;

    public void execute(String target) {
        for (ListIterator<OneShot> it =
            borgs.listIterator(borgs.size());
                it.hasPrevious(); ) {
                it.previous().shoot(target);
        }
    }
}

For simplicity we use a dynamic reference to all available OneShot service instances that have service.factoryPid=org.fipro.oneshot.Borg. As a short reminder on the DS 1.3 field strategy: if the type is a Collection, the cardinality is 0..n, and marking it volatile makes it a dynamic reluctant reference.

Starting the application and executing some assimilate and execute commands will show something similar to the following on the console:

g! assimilate Lars
Assimilated Lars
g! assimilate Simon
Assimilated Simon
g! execute Dirk
Borg Lars #1 of 2 took orders and executed Dirk
Borg Simon #2 of 2 took orders and executed Dirk
g! assimilate Lars Locutus
Lars already assimilated and renamed to Locutus
g! execute Dirk
Borg Locutus #1 of 2 took orders and executed Dirk
Borg Simon #2 of 2 took orders and executed Dirk

The first two assimilate calls create new Borg service instances. This is verified by the execute command. The following assimilate call renames an existing Borg, so no new service instance is created.

Now that I have learned about Managed Service Factories and how to use them with DS, I hope I am able to adapt that in the Eclipse Platform. So stay tuned for further DS news!



EMF Support for Che – Day 2: Generating code

by Maximilian Koegel and Jonas Helming at February 23, 2017 12:20 PM

In this blog series, we share our experience with extending Eclipse Che and describe how we have built initial EMF support for the Eclipse Che IDE. Please see the first post in this series for an overview of our goals. In the last blog post, we described how to create a custom workspace containing a template modeling project. This template provides a fully configured model. Before looking at any editing support or the creation of custom models, we first have a look at generating code from that template project. This is one of the most crucial requirements of our project, as we want to reuse the existing EMF code generator. So is it possible to reuse this existing Eclipse framework feature in Che?

Let’s have a quick look at how the EMF code generator can be triggered. The default way would be to use the UI in the Eclipse IDE. Furthermore, EMF provides a Java API to run the code generation. This sounds appealing, as the Che server component is also written in Java. So we could implement a wrapper service for the Che server, which is triggered by the browser IDE and calls the EMF API to generate the code. But wait: EMF is designed to run in an OSGi runtime environment. Additionally, it uses extension points (at least for the package registry). It is actually possible to call the EMF code generator with plain Java, but we would need to wire things manually, and setting up the classpath without OSGi looks like a nightmare in this scenario. Another disadvantage of that approach is that we would have to deploy the EMF libraries along with our server, which makes updates cumbersome.

Luckily, there is a much simpler way of integrating the existing code generator. The Eclipse desktop IDE provides a headless application that can be executed on the command line. With the following call you can generate the code for the “make it happen” example.

$ /eclipse/eclipse \
-noSplash \ # do not show the eclipse splash screen
-data /path/to/data/dir \ # the path to be our current project
-application org.eclipse.emf.codegen.ecore.Generator \ # the application id to execute
-model \ # generate EMF model classes
-edit \ # generate EMF edit bundle
/path/to/modelname.genmodel # the path to the genmodel file

So how do we integrate this with Che? The good news is that we can simply deploy Eclipse into a workspace. Workspaces in Che are not only directories hosting code; they also run as Docker containers and may contain tools. So if we install an Eclipse Modeling Tools edition into our modeling workspace, we should be able to generate code using the EMF generator application on the command line.

First we need to get an Eclipse installation into our Che workspace container. As a Che workspace container is based on a Linux image, we can use the shell to download and extract the latest Eclipse Modeling Tools. The download link can be fetched from the official downloads page (copy the link for Linux 32/64 bit). To access the container shell, just click on the “Terminal” tab at the bottom of the screen. The actual shell commands are listed below.

$ sudo su # gain super user privileges (become root)
$ cd / # switch to the root directory
$ wget ${Download Link} -O eclipse.tar.gz # download eclipse (capital -O writes the download to a file; lowercase -o would only redirect wget's log)
$ tar xvf eclipse.tar.gz # extract the downloaded tar.gz file

Afterwards you can trigger the EMF code generator by typing the following command.

$ /eclipse/eclipse \
-noSplash \
-data /projects/makeithappen \
-application org.eclipse.emf.codegen.ecore.Generator \
-model \
-edit \
/projects/makeithappen/org.eclipse.emf.ecp.makeithappen.model/model/task.genmodel

The following screenshot shows the code generation log on the console. We can then open the generated code in the IDE. That means we have reused the existing EMF code generator in Che, just by typing one line on the console!

image01

Now that we can trigger the code generator on the command line, let’s make it more convenient for the user. Instead of typing a complex command, we want to enable code generation in one click. For this, Che allows us to define a “custom command”. Click on the command drop-down in the top-right corner of the IDE and select “Edit Commands”.

image08

Then click on the “+” button next to the “Custom” section and fill in the form on the right side (see screenshot). As you can see, the command uses a Che variable for the current project path. However, the last segment of the path to the genmodel is still static.

image07
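For illustration, the command body we entered looks roughly like the sketch below. Treat the macro name as an assumption (check the macro list in your Che version); the genmodel path is taken from the template project used earlier:

```
/eclipse/eclipse -noSplash \
  -data ${current.project.path} \
  -application org.eclipse.emf.codegen.ecore.Generator \
  -model -edit \
  ${current.project.path}/org.eclipse.emf.ecp.makeithappen.model/model/task.genmodel
```

Note how the project path comes from the macro, while the trailing segments down to task.genmodel remain static.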

Now we can generate the code for the template project with one click, reusing the existing EMF code generator. So far, we did not have to write a single line of code. However, there are multiple open issues and things that need to be improved. For example, we can only trigger the code generation on a fixed project, we still work with a fixed template, and we cannot really modify our model (except as plain XML). All those extensions will require us to implement extensions for the Che browser IDE and the Che server, which we will begin later in this blog series. In the next part, we will look at creating a custom stack to make the adaptations of this part reproducible. The goal is to have the downloaded Eclipse instance (for the code generation) and the custom command available from scratch. So stay tuned!

If you are interested in learning more about the prototype for EMF support, if you want to contribute or sponsor its further development, or if you want support for creating your own extension for Che, please get in contact with us.

Co-Author
Mat Hansen






Fuse Tooling for Fuse Integration Services and Teiid Designer on Neon

by pleacu at February 22, 2017 02:22 PM

Try our complete Eclipse Neon-capable, Devstudio 10.2.0-compatible integration tooling.


JBoss Tools Integration Stack 4.4.1.Final / JBoss Developer Studio Integration Stack 10.1.0.GA

All of the Integration Stack components have been verified to work with the same dependencies as JBoss Tools 4.4 and Developer Studio 10.

What’s new for this release?

Fuse Tooling now provides support for FIS 2.0 projects based on the fabric8-maven-plugin workflow. FIS 2.0 Spring Boot-based projects can be debugged locally and deployed to an OpenShift instance. This release also features the newest production release of Teiid Designer, 11.0.1.Final. There are no more early-access bits for Neon!

Released Tooling Highlights

JBoss Fuse Development Highlights

Fuse Tooling Highlights

See the Fuse Tooling 9.1.0.Final Resolved Issues Section of the Integration Stack 10.1.0.GA release notes.

SwitchYard Highlights

See the SwitchYard 2.3.0.Final Resolved Issues Section of the Integration Stack 10.1.0.GA release notes.

JBoss Business Process and Rules Development

BPMN2 Modeler Known Issues

See the BPMN2 1.3.2.Final Known Issues Section of the Integration Stack 10.1.0.GA release notes.

Drools/jBPM6 Highlights

Data Virtualization Highlights

Teiid Designer Highlights

See the Teiid Designer 11.0.1.Final Resolved Issues Section of the Integration Stack 10.1.0.GA release notes.

What’s an Integration Stack?

Red Hat JBoss Developer Studio Integration Stack is a set of Eclipse-based development tools. It further enhances the IDE functionality provided by JBoss Developer Studio with plug-ins specifically for use when developing for other Red Hat JBoss products. It’s where the Fuse Tooling, DataVirt Tooling and BRMS tooling are aggregated. The following frameworks are supported:

JBoss Fuse Development

  • Fuse Tooling - JBoss Fuse Development provides tooling for Red Hat JBoss Fuse. It features the latest versions of the Fuse Data Transformation tooling, Fuse Integration Services support, SwitchYard and access to the Fuse SAP Tool Suite. Read more about Microservices Solutions for Integration

  • SwitchYard - A lightweight service delivery framework providing full lifecycle support for developing, deploying, and managing service-oriented applications.

JBoss Business Process and Rules Development

JBoss Business Process and Rules Development plug-ins provide design, debug and testing tooling for developing business processes for Red Hat JBoss BRMS and Red Hat JBoss BPM Suite.

  • BPEL Designer - Orchestrating your business processes.

  • BPMN2 Modeler - A graphical modeling tool which allows creation and editing of Business Process Modeling Notation diagrams using Graphiti.

  • Drools - A Business Logic integration Platform which provides a unified and integrated platform for Rules, Workflow and Event Processing including KIE.

  • jBPM6 - A flexible Business Process Management (BPM) suite.

JBoss Data Virtualization Development

JBoss Data Virtualization Development plug-ins provide a graphical interface to manage various aspects of Red Hat JBoss Data Virtualization instances, including the ability to design virtual databases and interact with associated governance repositories.

  • Teiid Designer - A visual tool that enables rapid, model-driven definition, integration, management and testing of data services without programming using the Teiid runtime framework.

JBoss Integration and SOA Development

JBoss Integration and SOA Development plug-ins provide tooling for developing, configuring and deploying BRMS, SwitchYard and Fuse applications to Red Hat JBoss Fuse and Fuse Fabric containers, Apache ServiceMix, and Apache Karaf instances.

  • All of the Business Process and Rules Development plugins, plus…​

  • Fuse Apache Camel Tooling - A graphical tool for integrating software components that works with Apache ServiceMix, Apache ActiveMQ, Apache Camel and the FuseSource distributions.

  • SwitchYard - A lightweight service delivery framework providing full lifecycle support for developing, deploying, and managing service-oriented applications.

The JBoss Tools website features tab

Don’t miss the Features tab for up-to-date information on your favorite Integration Stack components.

Installation

The easiest way to install the Integration Stack components is through the stand-alone installer. If you’re interested specifically in Fuse we have a new all-in-one installer JBoss Fuse Tooling + JBoss Fuse/Karaf runtime.

For a complete set of Integration Stack installation instructions, see the Integration Stack Installation Instructions.

Try it - you’ll like it!

Paul Leacu.


by pleacu at February 22, 2017 02:22 PM

Windup Eclipse Plugin has been released!

by josteele at February 21, 2017 03:14 AM

We are happy to announce the first release of the Windup Eclipse Plugin. It is available now through JBoss Central, and from our update site at http://download.jboss.org/jbosstools/neon/stable/updates/windup/composite/.

What’s Windup?

Windup is a command line tool that aids the process of migrating Java applications. Here are a few examples:

  • You want to move your application from one application server to another, for example:

    • WebLogic to EAP

    • WebSphere to EAP

  • You want to upgrade from one version of a technology to another, for example:

    • Hibernate 3 to Hibernate 4

    • EAP 6 to EAP 7

  • You want to change technologies, for example:

    • Seam 2 UI controls to pure JSF 2 UI Controls

And here’s an example of how you’d run Windup using the CLI:

$ ./windup --input /path/to/jee-example-app-1.0.0.ear --output /path/to/output --source weblogic --target eap:7

The output of running Windup from the command line is an HTML report, which can then be used to help analyze how much effort the migration will take, as well as provide assistance with solving the individual problems.

What do the Windup Eclipse plugins do?

As previously mentioned, the output of running Windup from the command line is an HTML report, which by itself is not very useful for the engineer responsible for making the changes in the code.

That’s where the Eclipse plugins come into play. Once you’ve run Windup from within the IDE, all the source files needing to be changed will be automatically marked, and can be easily organized, searched, and in many cases, fixed using quick fixes.

Let me give you a quick walkthrough of some of the key components. You can find more detailed information here.

Windup Perspective

We’ve created a dedicated perspective containing all the views necessary to use Windup.

Windup Perspective

Run Configuration Dialog

Think of this as a GUI for your command line arguments. Instead of digging deep into the Windup documentation and then tediously typing paths and various other arguments, this dialog simplifies the process of telling Windup what to analyze and how.

Run Configuration

Issue Explorer View

The Issue Explorer gets populated with all the migration issues.

Issue Explorer



You can customize how the issues are organized.

Issue Explorer Grouping



The context menu is dynamic, and will vary per issue.

Issue Explorer Context Menu



Some issues have quick fixes available. Quick fixes can be previewed prior to being applied.

Quick Fix Preview

Issue Details View

The Issue Details View provides more detailed information about migration issues, for example, hints on how to fix them, external documentation that might help with choosing the best solutions, etc.

Issue Details

Report View

You may need to refer back to the generated HTML report, and for that reason, we make it readily available here.

Windup Report

Demo

Here is a short video which demonstrates the basic usage:

Conclusion

We are trying our best to make the Windup tooling as good as possible. Users' feedback is what we are seeking now. We are looking forward to hearing your comments and remarks!

Have fun!
John Steele
github/johnsteele


by josteele at February 21, 2017 03:14 AM

The Anatomy of Smart Homes

by Kai Kreuzer at February 20, 2017 12:00 AM

Scott Jenson published a very good article last week, where he nicely showed how far away we still are from the shiny Jetson-like marketing promises about smart homes. He asked us to take a step back so that we can think about the holistic shape of the “right” solution for smart homes. Let’s do so!

Missing Technology?

Shamelessly shortening his thoughts, he concluded that many issues would be solved if all devices had

  1. a smart setup that allows users a very simple way of commissioning
  2. auto-arrangement through proximity-detection
  3. a globally accepted way of announcing their capabilities
  4. a mechanism to share their data instead of keeping it in a silo

These are the technologies that we as a community still lack and that need to be tackled. Only once these are in place will devices be able to automatically form the distributed ensembles we all aim for.

While I very much agree with all his points, I believe that solving the technological issues alone won’t suffice, but that we need to take yet another step back to see the big picture.

Taking Another Step Back

When talking about adding IoT devices to “the network”, most people don’t even stop to wonder “which network” - since we are talking about the Internet of Things, the network can only be The Internet where everything is connected, right?

Silos and Accounts

If we take a closer look at today’s smart home devices (and also other IoT setups), the truth is that everyone uses IP, i.e. Internet technology. But are the devices available “on the Internet”? No - many of them effectively do not allow you to communicate with them in any direct way. Instead, they open an encrypted tunnel to a cloud service, which then acts as a portal to the device for the user - all data is kept secretly (not to say securely) in the cloud under full control of the manufacturer. Effectively, we have a big number of silos that can be regarded as VPNs. It is important to note that these are deliberate silos - this choice is not (only) the fault of missing interoperability standards!

This setup is a major reason for Scott’s points 1 (smart setup) and 4 (sharing data): the manufacturers do not want to give control to the users, but instead force them to create accounts at their cloud service, accept the manufacturer’s terms and conditions, and let the manufacturer decide how and with whom to share the data.

This is the major issue that needs to be addressed: the mindset of the people building IoT devices must move away from this omnipresent IoT architecture. The outlook that users will have to create web accounts for every single household object they buy in the future (everything’s going to be connected, right?) is ridiculous. This works as long as only nerds with a dozen devices play around with smart homes, but it definitely won’t scale. Manufacturers must let their devices go - only if they are unleashed will the devices have a chance to communicate with each other and form distributed ensembles.

Internet vs. Intranet

Let’s for the moment assume the devices were unleashed. Is the Internet then really the best place for them? Even if all potential authentication and authorization issues were solved, we all know that all those devices are hackable. Do we really want to see them publicly listed on Shodan.io? Having isolated VPNs isn’t such a bad idea after all, and in the case of the smart home the most natural fit is the local Intranet. What you want to have on the public Internet are services (hosted on “real” servers that are hardened against attacks), not household devices. A good compromise is in my opinion the concept of the Physical Web, where discovery and authorization could be done through low-range radio like BLE, while the heavy lifting is done through cloud servers - but this hardly applies to the smart home itself. If services of a home should be shared (this decision should really be up to the user), they could be made available on any standard-compliant cloud service (i.e. no vendor lock-in), once Lego brick 3 (Standard Descriptions) is in place. Unfortunately, I am not as optimistic as Scott about having a “shared, common approach soon” - maybe for technical interoperability, but most likely not for semantic interoperability.

A Matter of Perspective

Taking the perspective of the user instead of the perspective of the manufacturer definitely brings us much closer to the desired solution. But putting the user in the center also causes a dilemma in the context of a smart home: after all, a home usually has multiple inhabitants, and furthermore there are visitors, pets, burglars (hopefully not), etc. For the home to become smart and autonomous, the devices must become an integral part of the home and must not be the property of a single user. This is another clear indicator that we are still in an early phase: most devices are bought off-the-shelf, installed DIY and registered to a certain user, who configures them to meet his needs. The devices are the personal objects of a person; they are not a part of the home. If this person moves house, he will take his personal belongings with him. This works as long as you have only a dozen smart bulbs, a door lock and a web cam. It won’t work - and especially it does not make any sense - for a photovoltaic system, a complex central heating, the in-wall motion sensors, the smart front door,… you get the point. You are also not ripping out the wall switches and your windows when you sell a house. A smart home should be able to exist on its own, even if nobody lives there. What brings it into existence are its bricks and its infrastructure - including the local IP network as the communication backbone for its parts.

Water, Electricity, Smartness

Smart home features and connectivity thus have to become a matter of course, just like electricity or water - they should not be treated as something that is retrofitted by the owner; instead, they must be an integral part of a home, which in turn becomes a smart, self-organizing entity. Having the devices form a local private network would ensure data privacy and the smallest possible attack surface. This should be the long-term vision when designing smart home devices. In consequence, this means that devices must

  1. not require a mandatory user registration
  2. work (at least in a basic way) without a cloud connection
  3. provide a local API for interaction with other devices (and not limiting access to a small set of business-oriented partnerships) and expect the same from the others

In short, things should focus again on being things in the first place and not cloud-connected devices - they must be unleashed to show their full potential.

This might sound trivial, but unfortunately these features are widely ignored in the industry - although they would secure the adoption of smart home technology in the long-term. How can the industry possibly be influenced to see that a bit more communism will eventually grow the cake for everyone?


by Kai Kreuzer at February 20, 2017 12:00 AM

What is new in Eclipse 4.x?

by Andrey Loskutov (noreply@blogger.com) at February 19, 2017 09:39 PM

A look on Eclipse 4.6 from Eclipse 3.8 point of view

The good, the bad and the ugly

Preface

This is my personal, condensed, not comprehensive overview of the changes in the Eclipse platform that happened between the last major 3.x release (3.8.0) from September 2012 and now (4.6.3). This overview is written from a user perspective and is for end users; API changes are not part of the discussion.

Here are official "New and Noteworthy" links for each release:

4.2 N&N (2012)
4.3 N&N (2013)
4.4 N&N (2014)
4.5 N&N (2015)
4.6 N&N (2016)

Please note that Eclipse 3.8.0 was released at the same time as 4.2.1 but did not contain some features related to the UI programming model change. The difference between 3.8.0 and 4.6.3 is not just 5 years or 5 major Eclipse releases; it is also a bigger change of the UI programming model from the 3.x to the 4.x API (aka e4).

Why talk about 3.x to 4.x changes today?

Because Eclipse 3.8.2 was an exceptionally well-made, stable and reliable release. It took five (!) 4.x releases to reach a similarly stable state (maybe not even the same state, but at least a usable one). Also because most plugins until recently still supported the 3.x stream, and because 3.x was good enough to support development of Java 7 based software.

But as soon as your business switches to Java 8, it is time for Eclipse 4.x, because JDT in 3.8 does not know about and does not support Java 8.

Themes aka L&F

Eclipse 4.x introduces a "themed UI", where many UI elements can be "styled" with CSS rules. This is a highly controversial change, which on the one hand enabled features like the "dark theme" but on the other hand ruined UI performance, especially in distributed environments.

A good example of how NOT to design an ergonomic IDE theme (4.2 on Windows)

Same misery (4.2.1) on Linux
3.8.0 on same system
Either way (with or without themes), Eclipse 4.x doesn't look like Eclipse 3.8. The initial 4.2.1 in the default theme was simply ugly, on every operating system. Over time, the 4.x stream got some polish and, starting with 4.6, more or less reached the state of 3.8. From a performance point of view, CSS themes add considerable computational overhead for complex UIs (plus an extra layer of code that can break) and can slow Eclipse down significantly (to a completely unusable state) in specific environments, like remote usage via vncviewer, rdesktop and Co.


Recommendation: don't use CSS themes - you get fewer bugs and a slicker, faster UI experience. Here is how Eclipse looks on Windows 10 without CSS themes:
4.6.3 with themes disabled, Windows 10

High-DPI monitors support

SWT now automatically scales images on high-DPI monitors on Windows and Linux, similar to the Mac's Retina support on OS X. In the absence of high-resolution images, SWT will auto-scale the available images to ensure that SWT-based applications like Eclipse are scaled proportionately to the resolution of the monitor.

Sometimes this automatic scaling doesn't work or doesn't look nice - in these cases there are ways to tweak the default behavior.

GTK3 support 

As if the new themes had not introduced enough UI nonsense, Eclipse 4.5+ uses GTK3 instead of GTK2 on Linux by default. The slick, useful, clean widgets from GTK2 are replaced by pixel-eating monsters from GTK3. The only solution for that is to use GTK3 themes with more "human" button sizes, but good GTK3 themes are very rare. Unfortunately, GTK3 is here to stay and GTK2 will become obsolete sooner or later.

One can still switch to GTK2, however: either export SWT_GTK3=0 in the shell before starting Eclipse or add these two lines to your eclipse.ini:

--launcher.GTK_version
2

Modeled UI

The Eclipse 4.x UI layout is more flexible and removes some restrictions 3.x Eclipse had. Now one can place editors and views in the same stack and move editors outside of the main window. FWIW, I personally never used this flexibility, but there are always corner cases where this could be useful.

Flexible UI

Unfortunately, with flexibility also come complexity and bugs. Migrating a 3.x workspace with opened editors to 4.x results in editors without close buttons and close menus (bug 509712). First-time users who don't know this are surprised and helpless to find out how to close the editor in the new world. Another bug (511798) is also not nice: if you have two editors in split screen mode (like in the example above), you will see an ugly white bar over them.

Quick access <Ctrl+3>

One shortcut every Eclipse user should learn is <Ctrl+3>, or "Quick Access". You can open views, start commands etc. - very handy. To make it more prominent for new users, a permanent text box was added in the top right corner in Eclipse 4.x. Unfortunately, this changed the location and size of the resulting dialog, which is a step backwards compared to 3.x. Instead of opening in front of the user with an appropriate size and location, it opens somewhere far to the right, with a small size into which nothing really fits.

Crippled Quick Access in 4.x
Fortunately, there is a workaround which allows you to restore the 3.x L&F. Right click on the thin text box border and, with a bit of luck, you will be able to click on the "Hide" menu:
Hide that ugly box!

and voila, the good old Quick Access appears at the expected location and with a size where everything fits:

Welcome back, Quick Access with 3.x L&F!

Open resource <Ctrl+Shift+R>

The Open Resource dialog got some love. It can now filter duplicated resources, show or hide derived resources, and show only files from a specific working set. Additionally, files can either be opened with the default editor or the user can choose which editor should be used to open them.




Split editor

Sometimes one wants to see the code of two functions at the same time - in a large file this is of course not possible, therefore there is now a new Window > Editor > Toggle Split Editor menu:

Split editor vertical

Word wrap <Alt+Shift+Y>

Believe it or not, it took only 12 years for Eclipse to implement soft wrapping in text editors. But better late than never: we now have a button on the toolbar and a shortcut <Alt+Shift+Y> to toggle word wrapping in the current editor:
Word wrap!


Configure left and right sides in Compare editors

Another change we waited about 10 years for finally, quietly, made it into the Eclipse 4.6.2 release: one can configure Eclipse to show the new changes on the right side of the diff editor! Unbelievable, but unlike the rest of the world, Eclipse has always shown local changes not on the right but on the left side, which was and is quite surprising for everyone.

Since the old way was there for a decade, by default the compare editor still shows the local changes on the left side, but the user can finally change this! There are two ways to do so: a new "swap left and right view" button in the compare editor toolbar:
Local changes on the right side!

and the Window > Preferences > General > Compare / Patch > Text Compare > Swap left and right:
Preference to see local changes on the right side

Console improvements

The word wrap button has also been added to all I/O consoles, where it enables soft word wrapping for long output. Besides this, I/O consoles got an "automatic scroll lock" mode, which is automatically enabled by scrolling up in the Console view using keys, the mouse wheel, or the scroll bar. When you scroll down to the end of the console, the scroll lock is automatically released again.

Console with word wrap and scroll lock

Root of all evil

Don't drink and drive - and don't run Eclipse as root! This happens seldom, but if it happens, it will keep you busy for the next few hours. Finally, Eclipse helps us and can prevent being started with root rights via a new command line option: -protect root. This option makes sure Eclipse doesn't start if you have root rights on Linux.

Show in system explorer

Right click on any file or folder in Eclipse and say Show In > System Explorer. This will open system default "Explorer" and highlight the selected file or folder. In case the system default "Explorer" doesn't open (Linux), the command line to launch it can be configured under Window > Preferences > General > Workspace > Command for launching system explorer.

Show In System Explorer

Improved Open With ... Other dialog

The Open With > Other... dialog now has:
Open With... extended
  • a filter field
  • remembers last used choice
  • options to remember the selected editor as default for the selected file name or type.

Tabs "Close..." menus

The context menu of editor and view tabs now offer Close Tabs to the Left and Close Tabs to the Right menu:
Tabs menu

MRU or automatic editor shuffling

In classic 2.x versions of Eclipse the editor tab placement strategy was simple: tabs were shown in exactly the order they were opened by the user. This is how all tools in the universe work. Then, in Eclipse 3.0, this simple universal standard was changed to show the most recently used (MRU) tabs first and to automatically reorder tabs so that the last few used are always shown if there is not enough space to show all. This change was highly controversial, see the lengthy discussions in bug 68684. Finally, via bug 461736, the tab visibility strategy has been turned into a separate preference, so everyone can decide whether MRU should be turned on or off.

The new preference can be found under Window > Preferences > General > Appearance > Visible Tabs on overflow > Show most recently used tabs:

MRU off!

UI responsiveness monitoring

This feature might be uninteresting for the end user, but it greatly helps IDE developers recognize misbehaving code which causes UI freezes. UI responsiveness monitoring can be turned on via Window > Preferences > General > UI Responsiveness Monitoring.

UI responsiveness preferences
Please enable monitoring and please report UI freezes (they are reported to the Eclipse error log) to bugzilla.

Copy/paste for files improved

<Ctrl+C> / <Ctrl+V> of files in the same folder in explorer views to create a copy now automatically proposes the old name followed by the digit 2 (or 3, etc., if that name is already taken). This is really handy if one wants to create a few duplicates of a file in the same folder.

Shortcut for Skip All Breakpoints  

A very handy Ctrl+Alt+B has been added as the shortcut for Skip All Breakpoints:

Zoom font in editors <Ctrl++> and <Ctrl+->

In text editors, you can now use Zoom In <Ctrl++> or <Ctrl+=> and Zoom Out <Ctrl+-> commands to increase and decrease the font size. Like a change in the Window > General > Appearance > Colors and Fonts preference page, the commands persistently change the font size in all editors of the same type. If the editor type's font is configured to use a default font, then that default font will be zoomed.

Automatic save of dirty editors

Auto-save of dirty editors is now available in Eclipse. The autosave option is disabled by default. A new autosave preference page (Window > Preferences > General > Editors > Autosave) allows you to enable/disable autosave and change the autosave interval. The countdown is reset on keyboard activity, mouse click, or when a popup is displayed (e.g. content assist, preference page, ...).
Autosave preferences

Workspace selection dialog

The workspace selection dialog now allows you to start a previously selected workspace directly via a link. The path to the workspace is shortened. The full path is available if you hover over the link. You can remove existing entries via the context menu.

Workspace selection dialog

Screen space tweaks

Main toolbar can be hidden (Window > Appearance > Hide Toolbar), and Eclipse window can be put in the "Full Screen" mode without title bar (Window > Appearance > Toggle Full Screen). In 4.7 there will be another tweak: Window > Appearance > Hide Status Bar. Guess what it will do :-)

Print button is hidden

Yep, the global toolbar does not show the "Print" button by default, to save space. It turned out no one actually uses this function, so why should it be on the main toolbar?

Default text editor for unknown files

Eclipse has always tried to open a system editor for files which did not have a dedicated editor in Eclipse. In 4.x there is a possibility to stop this and to choose to always use Eclipse's default text editor instead.

On the Window > Preferences > General > Editors > File Association page, you can now define an editor selection strategy for unassociated file types. Three strategies are proposed out-of-the-box:
  • System Editor; if none: Text Editor (default) will open the system editor associated with the file, if available. If no system editor is associated with the given file, fall back to the Eclipse Text Editor
  • Text Editor will always open Eclipse's Text Editor on unassociated file types
  • Ask via pop-up will open the same dialog as using Open With > Other... on a file and let you choose which editor to use (inside or outside the IDE)
Keep in mind that in any case, it's possible to assign an editor for an unassociated file type either via this same preference page, or via the Open With > Other... context-menu on the file

Eclipse sources are in Git

Along with the 4.x transition, all Eclipse sources were moved to Git (from CVS). This opened a really nice path for contributions. If you are interested in Eclipse hacking, or just want to fix that nasty XYZ bug - please consider contributing back to the community: https://wiki.eclipse.org/Platform_UI/How_to_Contribute.

by Andrey Loskutov (noreply@blogger.com) at February 19, 2017 09:39 PM

Combine Xcore, Xtend, Ecore, and Maven

by Niko Stotz at February 18, 2017 06:00 PM

Update: Christian found a workaround for compiling code that references Ecore types. We adjusted the article.

Lots of thanks to Christian for figuring this out together.

The complete example is available on github.

Also, we included all the files as Listings at the end of the article. They are heavily commented.

Objective

Inside an Eclipse plugin, we have EMF models defined in both Xcore and Ecore, using types from each other. Also, we have a Helper Xtend class that’s called from Xcore and uses types from the same model. We want to build the Eclipse Plugin with Maven. We also don’t want to commit any generated sources (from Xtend, Xcore, or Ecore).

Issue

Usually, we would use xtend-maven-plugin to build the Xtend classes, xtext-maven-plugin to build the Xcore model, and MWE2 to build the Ecore model.

However, we have a cyclic compile-time dependency between AllGreetings calling GreetingsHelper.compileAllGreetings() which receives a parameter of type AllGreetings. We also have a cyclic dependency between AllGreetings calling GreetingsHelper.compileAllPersons() and using IPerson type.

Solution

Java (and Xtend) can solve such cycles in general, but only if they process all members at the same time. Thus, we need to make sure both Xtend and Xcore are generated within the same Maven plugin.

We didn’t find a way to include the regular Ecore generator in the same step, so we keep this in a separate MWE2-based Maven plugin.

For generating persons.ecore, we call an MWE2 workflow from Maven via exec-maven-plugin. The workflow itself gets a bit more complicated, as we use INamedElement from base.xcore as a supertype to IPerson inside persons.ecore. Thus, we need to ensure that base.xcore is loaded, available, and can be understood by the Ecore generator.

Afterwards, we use xtext-maven-plugin to generate both Xtend and the two Xcore models. To do this, we need to include all the required languages and dependencies into one single plugin in our maven pom.

The first version could not compile GreetingsHelper.compileAllPersons() because it referenced types from persons.ecore, and these could not be resolved by the EMF validator. As a workaround, we disabled the validator in the workflow.
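To make the single-plugin step above more concrete, here is a sketch of what the relevant pom fragment could look like. This is an assumption-laden illustration, not the project's actual pom: the plugin version, output directories, and the omitted `<dependencies>` of the plugin must be filled in to match your setup; the two setup classes are the standard Xtext standalone setups for Xtend and Xcore.

```xml
<!-- Sketch only: version, output directories, and plugin dependencies
     are assumptions. Registering the Xtend and Xcore language setups in
     ONE xtext-maven-plugin execution lets both generators process all
     members together, which is what resolves the compile-time cycle. -->
<plugin>
  <groupId>org.eclipse.xtext</groupId>
  <artifactId>xtext-maven-plugin</artifactId>
  <executions>
    <execution>
      <goals>
        <goal>generate</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <languages>
      <!-- Xtend classes, generated into xtend-gen -->
      <language>
        <setup>org.eclipse.xtend.core.XtendStandaloneSetup</setup>
        <outputConfigurations>
          <outputConfiguration>
            <outputDirectory>xtend-gen</outputDirectory>
          </outputConfiguration>
        </outputConfigurations>
      </language>
      <!-- Xcore models, generated in the same step -->
      <language>
        <setup>org.eclipse.emf.ecore.xcore.XcoreStandaloneSetup</setup>
      </language>
    </languages>
  </configuration>
  <!-- All involved languages (Xtend, Xcore, Ecore support) must also be
       listed as dependencies of the plugin itself. -->
</plugin>
```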

Remarks

  • We developed this using Eclipse Mars.2.
  • The example project should be free of errors and warnings, and builds in all of Eclipse, MWE2, and Maven.

    In Maven, you might see some warnings due to an Xtext bug. It should not have any negative impact.

  • When creating the Ecore file, make sure only to use built-in types (like EString) from http://www.eclipse.org/emf/2002/Ecore. They may be listed several times.
  • In our genmodel file, Eclipse tends to replace this way of referring to platform:/resource/org.eclipse.emf.ecore/model/Ecore.genmodel#//ecore by something like ../../org.eclipse.emf.ecore/model/Ecore.genmodel#//ecore. This would lead to ConcurrentModificationException in xtext-maven-plugin or MWE2, or “The referenced packages ”{0}” and ”{1}” have the same namespace URI and cannot be checked at the same time.” when opening the GenModel editor.

    In this case, open the genmodel file with a text editor and use the platform:/resource/org.eclipse.emf.ecore/model/Ecore.genmodel#//ecore form in the genmodel:GenModel#usedGenPackages attribute. Sadly, this needs to be fixed every time the Genmodel Editor saves the file.

  • The Xcore builder inside Eclipse automatically picks up the latest EMF complianceLevel, leading to Java generics support in generated code. The maven plugin does not use the latest complianceLevel, thus we need to set it explicitly.
  • The modelDirectory setting in both Xcore and Ecore seems highly sensitive to leading or trailing slashes. We found it safest not to have them at all.
  • Using all the involved generators (MWE2, Xcore, Xtend, …) requires quite a few dependencies for our plugin. As a neat trick, we can define them in build.properties rather than MANIFEST.MF. This way, they are available at build time, but do not clog our run-time classpath. As we can see on the right, they are also listed separately in the Eclipse dependency editor.
  • maven-clean-plugin seems to be quite sensitive to how its filesets are described. Even when disregarding the .dummy.txt entries in our pom, the *-gen directories were only cleaned if we listed them in separate fileset entries.
  • The workflow should reside within an Eclipse source folder. To separate it from real sources, we created a new source folder named workflow.
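To make the genmodel remark above concrete, this is the form of the reference that should be kept in the `usedGenPackages` attribute (a fragment sketch only; the surrounding GenModel attributes are elided):

```xml
<!-- Keep the platform:/resource form of the Ecore genmodel reference;
     the GenModel editor tends to rewrite it to a relative ../../ path,
     which breaks the headless (MWE2/Maven) build. -->
usedGenPackages="platform:/resource/org.eclipse.emf.ecore/model/Ecore.genmodel#//ecore"
```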

Listings

Project Layout

GreetingsHelper.xtend

package de.nikostotz.xtendxcoremaven.greetings.helper

import de.nikostotz.xtendxcoremaven.greetings.AllGreetings

class GreetingsHelper {
    // This method is called from greetings.xcore and has AllGreetings as parameter type,
    // thus creating a dependency circle
    def static String compileAllGreetings(AllGreetings it) {
        var totalSize = 0
        // We access the AllGreetings.getGreetings() method in two different ways
        // (for-each-loop and map) to demonstrate different error messages if we omit
        // 'complianceLevel="8.0"' in greetings.xcore
        for (greeting : it.getGreetings()) {
            totalSize = totalSize + greeting.getMessage().length
        }
       
        '''
        Greetings:
            «it.getGreetings().map[greeting | greeting.getMessage()].join(", ")»
        '''
    }
   
    def static String compileAllPersons(AllGreetings it) {
        // AllGreetings.getPersons() refers to type IPerson, which is defined in Ecore.
        // This only works if we disable the validator in the MWE2 workflow.
        '''
        Hello Humans:
            «it.getPersons().map[person | person.describeMyself()].join(", ")»
        '''
    }
}

base.xcore

@GenModel(
    // This sets the target directory where to put the generated classes.
    // Make sure NOT to start or end with a slash!
    // Doing so would lead to issues either with Eclipse builder, MWE2 launch, or Maven
    modelDirectory="de.nikostotz.xtendxcoremaven/xcore-gen",

    // required to fix an issue with xcore (see https://www.eclipse.org/forums/index.php/t/367588/)
    operationReflection="false"
)
package de.nikostotz.xtendxcoremaven.base

// This enables usage of the @GenModel annotation above. The annotation would work without
// this line in Eclipse, but Maven would fail.
// (WorkflowInterruptedException: Validation problems: GenModel cannot be resolved.)
annotation "http://www.eclipse.org/emf/2002/GenModel" as GenModel

interface INamedElement {
    String name
}

persons

persons.ecore

<?xml version="1.0" encoding="UTF-8"?>
<ecore:EPackage xmi:version="2.0" xmlns:xmi="http://www.omg.org/XMI" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
   xmlns:ecore="http://www.eclipse.org/emf/2002/Ecore" name="persons" nsURI="de.nikostotz.xtendxcoremaven.persons" nsPrefix="persons">
  <eClassifiers xsi:type="ecore:EClass" name="IPerson" abstract="true" interface="true"
     eSuperTypes="base.xcore#/EPackage/INamedElement">
    <eOperations name="describeMyself" lowerBound="1" eType="ecore:EDataType http://www.eclipse.org/emf/2002/Ecore#//EString"></eOperations>
  </eClassifiers>
  <eClassifiers xsi:type="ecore:EClass" name="Human" eSuperTypes="#//IPerson">
    <eStructuralFeatures xsi:type="ecore:EAttribute" name="knownHumanLanguages" upperBound="-1"
       eType="ecore:EDataType http://www.eclipse.org/emf/2002/Ecore#//EString"></eStructuralFeatures>
  </eClassifiers>
  <eClassifiers xsi:type="ecore:EClass" name="Developer" eSuperTypes="#//IPerson">
    <eStructuralFeatures xsi:type="ecore:EAttribute" name="knownProgrammingLanguages"
       upperBound="-1" eType="ecore:EDataType http://www.eclipse.org/emf/2002/Ecore#//EString"></eStructuralFeatures>
  </eClassifiers>
</ecore:EPackage>

persons.genmodel

<?xml version="1.0" encoding="UTF-8"?>
<genmodel:GenModel xmi:version="2.0"
    xmlns:xmi="http://www.omg.org/XMI" xmlns:ecore="http://www.eclipse.org/emf/2002/Ecore"
    xmlns:genmodel="http://www.eclipse.org/emf/2002/GenModel"
    modelDirectory="de.nikostotz.xtendxcoremaven/emf-gen" modelName="Persons"
    importerID="org.eclipse.emf.importer.ecore" complianceLevel="8.0"
    copyrightFields="false" importOrganizing="true"
    usedGenPackages="base.xcore#/1/base platform:/resource/org.eclipse.emf.ecore/model/Ecore.genmodel#//ecore">
    <!-- Eclipse tends to replace this way of referring to the Ecore genmodel
        by something like "../../org.eclipse.emf.ecore/model/Ecore.genmodel#//ecore".
        This would lead to ConcurrentModificationException in xtext-maven-plugin
        or MWE2, or "The referenced packages ''{0}'' and ''{1}'' have the same namespace
        URI and cannot be checked at the same time." when opening the GenModel editor. -->

    <foreignModel>persons.ecore</foreignModel>
    <genPackages prefix="Persons" basePackage="de.nikostotz.xtendxcoremaven.persons"
        disposableProviderFactory="true" ecorePackage="persons.ecore#/">
        <genClasses image="false" ecoreClass="persons.ecore#//IPerson">
            <genOperations ecoreOperation="persons.ecore#//IPerson/describeMyself"></genOperations>
        </genClasses>
        <genClasses ecoreClass="persons.ecore#//Human">
            <genFeatures createChild="false"
                ecoreFeature="ecore:EAttribute persons.ecore#//Human/knownHumanLanguages"></genFeatures>
        </genClasses>
        <genClasses ecoreClass="persons.ecore#//Developer">
            <genFeatures createChild="false"
                ecoreFeature="ecore:EAttribute persons.ecore#//Developer/knownProgrammingLanguages"></genFeatures>
        </genClasses>
    </genPackages>
</genmodel:GenModel>

greetings.xcore

@GenModel(
    modelDirectory="de.nikostotz.xtendxcoremaven/xcore-gen",
    operationReflection="false",
   
    // This enables Java generics support in EMF (starting with version 6.0).
    // If omitted, we'd get a "Validation Problem: The method or field message is undefined" in Maven
    // because AllGreetings.getGreetings() would return an EList instead of an EList<Greeting>, thus
    // we cannot know about the types of the list elements and whether they have a 'message' property.
    // In other cases, this leads to error messages like "Cannot cast Object to Greeting".
    complianceLevel="8.0"
)

package de.nikostotz.xtendxcoremaven.greetings

// referring to Ecore
import de.nikostotz.xtendxcoremaven.persons.persons.IPerson

// referring to Xtend
import de.nikostotz.xtendxcoremaven.greetings.helper.GreetingsHelper

annotation "http://www.eclipse.org/emf/2002/GenModel" as GenModel

class AllGreetings {
    contains IPerson[] persons
    contains Greeting[] greetings
   
    op String compileAllGreetings() {
        // calling Xtend inside Xcore
        GreetingsHelper.compileAllGreetings(this)
    }
   
    op String compileAllPersons() {
        // calling Xtend inside Xcore
        GreetingsHelper.compileAllPersons(this)
    }
}

class Greeting {
    String message
    refers IPerson person
}

MANIFEST.MF

Manifest-Version: 1.0
Bundle-ManifestVersion: 2
Bundle-Name: %pluginName
Bundle-SymbolicName: de.nikostotz.xtendxcoremaven;singleton:=true
Bundle-Version: 1.0.0.qualifier
Bundle-ClassPath: .
Bundle-Vendor: %providerName
Bundle-Localization: plugin
Bundle-Activator: de.nikostotz.xtendxcoremaven.XtendXcoreMavenActivator
Require-Bundle: org.eclipse.core.runtime,
org.eclipse.emf.ecore;visibility:=reexport,
org.eclipse.xtext.xbase.lib,
org.eclipse.emf.ecore.xcore.lib
Bundle-RequiredExecutionEnvironment: JavaSE-1.8
Export-Package: de.nikostotz.xtendxcoremaven.greetings.impl,
de.nikostotz.xtendxcoremaven.greetings.util,
de.nikostotz.xtendxcoremaven.base,
de.nikostotz.xtendxcoremaven.base.impl,
de.nikostotz.xtendxcoremaven.base.util,
de.nikostotz.xtendxcoremaven.persons.persons,
de.nikostotz.xtendxcoremaven.persons.persons.impl,
de.nikostotz.xtendxcoremaven.persons.persons.util
Bundle-ActivationPolicy: lazy

build.properties

#

bin.includes = .,\
               model/,\
               META-INF/,\
               plugin.xml,\
               plugin.properties
jars.compile.order = .
source.. = src/,\
           emf-gen/,\
           xcore-gen/,\
           xtend-gen/
src.excludes = workflow/
output.. = bin/
# These work the same as entries in MANIFEST.MF#Require-Bundle, but only at build time, not run-time
additional.bundles = org.eclipse.emf.mwe2.launch,\
                     org.apache.log4j,\
                     org.apache.commons.logging,\
                     org.eclipse.xtext.ecore,\
                     org.eclipse.emf.codegen.ecore.xtext,\
                     org.eclipse.emf.ecore.xcore,\
                     org.eclipse.xtend.core,\
                     org.eclipse.emf.codegen.ecore,\
                     org.eclipse.emf.mwe.core,\
                     org.eclipse.emf.mwe.utils,\
                     org.eclipse.emf.mwe2.lib

generateGenModel.mwe2

module GenerateGenModel

var projectName = "de.nikostotz.xtendxcoremaven"
var rootPath = ".."

Workflow {
    // This configures the supported model types for EcoreGenerator.
    // Order is important.
    // Should be the same list as in Reader below.
    bean = org.eclipse.emf.ecore.xcore.XcoreStandaloneSetup {}
    bean = org.eclipse.xtend.core.XtendStandaloneSetup {}
    bean = org.eclipse.xtext.ecore.EcoreSupport {}
    bean = org.eclipse.emf.codegen.ecore.xtext.GenModelSupport {}
   
    bean = org.eclipse.emf.mwe.utils.StandaloneSetup {
        // Required for finding the platform contents (Ecore.ecore, Ecore.genmodel, ...) under all circumstances
        platformUri = "${rootPath}"
       
        // Required for finding above mentioned models inside their Eclipse plugins
        scanClassPath = true
    }
       
    // As persons.ecore refers to a type inside base.xcore (IPerson extends INamedElement),
    // we need to load base.xcore before we can generate persons.ecore.
    component = org.eclipse.xtext.mwe.Reader {
        // This configures the supported model types for this Reader.
        // Order is important.
        // Should be the same list as beans above.
        register = org.eclipse.emf.ecore.xcore.XcoreStandaloneSetup {}
        register = org.eclipse.xtend.core.XtendStandaloneSetup {}
        register = org.eclipse.xtext.ecore.EcoreSupport {}
        register = org.eclipse.emf.codegen.ecore.xtext.GenModelSupport {}
       
        // This asks the Reader to read all models it understands from these directories (and sub-directories).
        path = "model"
        path = "src"
       
        // Put the models inside a ResourceSet that's accessible by the EcoreGenerator.
        loadFromResourceSet =  {}
       
        // This is a workaround to get GreetingsHelper.compileAllPersons() compiled.
        validate = org.eclipse.xtext.mwe.Validator.Disabled {}
    }
   
    // Generate persons.ecore (via persons.genmodel).
    component = org.eclipse.emf.mwe2.ecore.EcoreGenerator {
        genModel = "platform:/resource/${projectName}/model/persons.genmodel"
        srcPath = "platform:/resource/${projectName}/emf-gen"
    }
}

pom.xml

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">

    <modelVersion>4.0.0</modelVersion>
    <groupId>de.nikostotz.xtendxcoremaven</groupId>
    <version>1.0.0-SNAPSHOT</version>
    <artifactId>de.nikostotz.xtendxcoremaven</artifactId>

    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>

        <!-- Java version, will be honored by Xcore / Xtend -->
        <maven.compiler.source>1.8</maven.compiler.source>
        <maven.compiler.target>1.8</maven.compiler.target>

        <!-- Xtend / Xcore -->
        <core-resources-version>3.7.100</core-resources-version>
        <eclipse-text-version>3.5.101</eclipse-text-version>
        <emf-version>2.12.0</emf-version>
        <emf-common-version>2.12.0</emf-common-version>
        <emf-codegen-version>2.11.0</emf-codegen-version>
        <xtext-version>2.10.0</xtext-version>
        <ecore-xtext-version>1.2.0</ecore-xtext-version>
        <ecore-xcore-version>1.3.1</ecore-xcore-version>
        <ecore-xcore-lib-version>1.1.100</ecore-xcore-lib-version>
        <emf-mwe2-launch-version>2.8.3</emf-mwe2-launch-version>
    </properties>

    <dependencies>
        <dependency>
            <groupId>org.eclipse.emf</groupId>
            <artifactId>org.eclipse.emf.common</artifactId>
            <version>${emf-common-version}</version>
        </dependency>
        <dependency>
            <groupId>org.eclipse.emf</groupId>
            <artifactId>org.eclipse.emf.ecore</artifactId>
            <version>${emf-version}</version>
        </dependency>
        <dependency>
            <groupId>org.eclipse.emf</groupId>
            <artifactId>org.eclipse.emf.ecore.xcore.lib</artifactId>
            <version>${ecore-xcore-lib-version}</version>
        </dependency>
        <dependency>
            <groupId>org.eclipse.xtext</groupId>
            <artifactId>org.eclipse.xtext.xbase.lib</artifactId>
            <version>${xtext-version}</version>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.3</version>
            </plugin>

            <plugin>
                <artifactId>maven-clean-plugin</artifactId>
                <version>2.6.1</version>
                <configuration>
                    <filesets>
                        <fileset>
                            <excludes>
                                <exclude>.dummy.txt</exclude>
                            </excludes>
                            <directory>emf-gen</directory>
                        </fileset>
                        <fileset>
                            <excludes>
                                <exclude>.dummy.txt</exclude>
                            </excludes>
                            <directory>xtend-gen</directory>
                        </fileset>
                        <fileset>
                            <excludes>
                                <exclude>.dummy.txt</exclude>
                            </excludes>
                            <directory>xcore-gen</directory>
                        </fileset>
                    </filesets>
                </configuration>
            </plugin>

            <!-- Adds the generated sources to the compiler input -->
            <plugin>
                <groupId>org.codehaus.mojo</groupId>
                <artifactId>build-helper-maven-plugin</artifactId>
                <version>1.9.1</version>
                <executions>
                    <execution>
                        <id>add-source</id>
                        <phase>generate-sources</phase>
                        <goals>
                            <goal>add-source</goal>
                        </goals>
                        <configuration>
                            <!-- This should be in sync with xtext-maven-plugin//source-roots,
                                except for /model directory -->
                            <sources>
                                <source>${basedir}/emf-gen</source>
                                <source>${basedir}/xcore-gen</source>
                                <source>${basedir}/xtend-gen</source>
                            </sources>
                        </configuration>
                    </execution>
                </executions>
            </plugin>

            <!-- Generates the Ecore model via MWE2 -->
            <plugin>
                <groupId>org.codehaus.mojo</groupId>
                <artifactId>exec-maven-plugin</artifactId>
                <version>1.4.0</version>
                <executions>
                    <execution>
                        <id>mwe2Launcher</id>
                        <phase>generate-sources</phase>
                        <goals>
                            <goal>java</goal>
                        </goals>
                    </execution>
                </executions>
                <configuration>
                    <mainClass>org.eclipse.emf.mwe2.launch.runtime.Mwe2Launcher</mainClass>
                    <arguments>
                        <argument>${project.basedir}/workflow/generateGenModel.mwe2</argument>
                        <argument>-p</argument>
                        <argument>rootPath=${project.basedir}/..</argument>
                    </arguments>
                    <classpathScope>compile</classpathScope>
                    <includePluginDependencies>true</includePluginDependencies>
                    <cleanupDaemonThreads>false</cleanupDaemonThreads><!-- see https://bugs.eclipse.org/bugs/show_bug.cgi?id=475098#c3 -->
                </configuration>
                <dependencies>
                    <dependency>
                        <groupId>org.eclipse.emf</groupId>
                        <artifactId>org.eclipse.emf.mwe2.launch</artifactId>
                        <version>${emf-mwe2-launch-version}</version>
                    </dependency>

                    <dependency>
                        <groupId>org.eclipse.xtext</groupId>
                        <artifactId>org.eclipse.xtext.xtext</artifactId>
                        <version>${xtext-version}</version>
                    </dependency>
                    <dependency>
                        <groupId>org.eclipse.text</groupId>
                        <artifactId>org.eclipse.text</artifactId>
                        <version>${eclipse-text-version}</version>
                    </dependency>
                    <dependency>
                        <groupId>org.eclipse.core</groupId>
                        <artifactId>org.eclipse.core.resources</artifactId>
                        <version>${core-resources-version}</version>
                    </dependency>
                    <dependency>
                        <groupId>org.eclipse.xtend</groupId>
                        <artifactId>org.eclipse.xtend.core</artifactId>
                        <version>${xtext-version}</version>
                    </dependency>
                    <dependency>
                        <groupId>org.eclipse.xtext</groupId>
                        <artifactId>org.eclipse.xtext.ecore</artifactId>
                        <version>${xtext-version}</version>
                    </dependency>
                    <dependency>
                        <groupId>org.eclipse.xtext</groupId>
                        <artifactId>org.eclipse.xtext.xbase</artifactId>
                        <version>${xtext-version}</version>
                    </dependency>
                    <dependency>
                        <groupId>org.eclipse.emf</groupId>
                        <artifactId>org.eclipse.emf.codegen.ecore.xtext</artifactId>
                        <version>${ecore-xtext-version}</version>
                    </dependency>
                    <dependency>
                        <groupId>org.eclipse.emf</groupId>
                        <artifactId>org.eclipse.emf.common</artifactId>
                        <version>${emf-common-version}</version>
                    </dependency>
                    <dependency>
                        <groupId>org.eclipse.emf</groupId>
                        <artifactId>org.eclipse.emf.ecore</artifactId>
                        <version>${emf-version}</version>
                    </dependency>
                    <dependency>
                        <groupId>org.eclipse.emf</groupId>
                        <artifactId>org.eclipse.emf.ecore.xmi</artifactId>
                        <version>${emf-version}</version>
                    </dependency>
                    <dependency>
                        <groupId>org.eclipse.emf</groupId>
                        <artifactId>org.eclipse.emf.codegen</artifactId>
                        <version>${emf-codegen-version}</version>
                    </dependency>
                    <dependency>
                        <groupId>org.eclipse.emf</groupId>
                        <artifactId>org.eclipse.emf.codegen.ecore</artifactId>
                        <version>${emf-codegen-version}</version>
                    </dependency>
                    <dependency>
                        <groupId>org.eclipse.emf</groupId>
                        <artifactId>org.eclipse.emf.ecore.xcore</artifactId>
                        <version>${ecore-xcore-version}</version>
                    </dependency>
                    <dependency>
                        <groupId>org.eclipse.emf</groupId>
                        <artifactId>org.eclipse.emf.ecore.xcore.lib</artifactId>
                        <version>${ecore-xcore-lib-version}</version>
                    </dependency>
                </dependencies>
            </plugin>

            <!-- Generates the Xtend and Xcore models -->
            <plugin>
                <groupId>org.eclipse.xtext</groupId>
                <artifactId>xtext-maven-plugin</artifactId>
                <version>${xtext-version}</version>
                <executions>
                    <execution>
                        <phase>generate-sources</phase>
                        <goals>
                            <goal>generate</goal>
                        </goals>
                    </execution>
                </executions>
                <configuration>
                    <languages>
                        <language>
                            <setup>org.eclipse.xtext.ecore.EcoreSupport</setup>
                        </language>
                        <language>
                            <setup>org.eclipse.emf.codegen.ecore.xtext.GenModelSupport</setup>
                        </language>
                        <language>
                            <setup>org.eclipse.xtend.core.XtendStandaloneSetup</setup>
                            <outputConfigurations>
                                <outputConfiguration>
                                    <outputDirectory>${project.basedir}/xtend-gen</outputDirectory>
                                </outputConfiguration>
                            </outputConfigurations>
                        </language>
                        <language>
                            <setup>org.eclipse.emf.ecore.xcore.XcoreStandaloneSetup</setup>
                            <outputConfigurations>
                                <outputConfiguration>
                                    <outputDirectory>${project.basedir}/xcore-gen</outputDirectory>
                                </outputConfiguration>
                            </outputConfigurations>

                        </language>
                    </languages>
                    <!-- This should be in sync with build-helper-maven-plugin//sources,
                        except for /model directory -->
                    <sourceRoots>
                        <root>${basedir}/src</root>
                        <root>${basedir}/emf-gen</root>
                        <!-- Note that we include the /model path here although it's not part
                            of the source directories in Eclipse or Maven -->
                        <root>${basedir}/model</root>
                    </sourceRoots>
                    <!-- This does not work currently, as we can see by the missing lambda
                        in generated code for GreetingsHelper.compileAllGreetings(). It does work,
                        however, for xtend-maven-plugin. (see https://github.com/eclipse/xtext-maven/issues/11)-->
                    <javaSourceVersion>1.8</javaSourceVersion>
                </configuration>
                <dependencies>
                    <dependency>
                        <groupId>org.eclipse.text</groupId>
                        <artifactId>org.eclipse.text</artifactId>
                        <version>${eclipse-text-version}</version>
                    </dependency>
                    <dependency>
                        <groupId>org.eclipse.core</groupId>
                        <artifactId>org.eclipse.core.resources</artifactId>
                        <version>${core-resources-version}</version>
                    </dependency>
                    <dependency>
                        <groupId>org.eclipse.xtend</groupId>
                        <artifactId>org.eclipse.xtend.core</artifactId>
                        <version>${xtext-version}</version>
                    </dependency>
                    <dependency>
                        <groupId>org.eclipse.xtext</groupId>
                        <artifactId>org.eclipse.xtext.ecore</artifactId>
                        <version>${xtext-version}</version>
                    </dependency>
                    <dependency>
                        <groupId>org.eclipse.xtext</groupId>
                        <artifactId>org.eclipse.xtext.xbase</artifactId>
                        <version>${xtext-version}</version>
                    </dependency>
                    <dependency>
                        <groupId>org.eclipse.emf</groupId>
                        <artifactId>org.eclipse.emf.codegen.ecore.xtext</artifactId>
                        <version>${ecore-xtext-version}</version>
                    </dependency>
                    <dependency>
                        <groupId>org.eclipse.emf</groupId>
                        <artifactId>org.eclipse.emf.common</artifactId>
                        <version>${emf-common-version}</version>
                    </dependency>
                    <dependency>
                        <groupId>org.eclipse.emf</groupId>
                        <artifactId>org.eclipse.emf.ecore</artifactId>
                        <version>${emf-version}</version>
                    </dependency>
                    <dependency>
                        <groupId>org.eclipse.emf</groupId>
                        <artifactId>org.eclipse.emf.ecore.xmi</artifactId>
                        <version>${emf-version}</version>
                    </dependency>
                    <dependency>
                        <groupId>org.eclipse.emf</groupId>
                        <artifactId>org.eclipse.emf.codegen</artifactId>
                        <version>${emf-codegen-version}</version>
                    </dependency>
                    <dependency>
                        <groupId>org.eclipse.emf</groupId>
                        <artifactId>org.eclipse.emf.codegen.ecore</artifactId>
                        <version>${emf-codegen-version}</version>
                    </dependency>
                    <dependency>
                        <groupId>org.eclipse.emf</groupId>
                        <artifactId>org.eclipse.emf.ecore.xcore</artifactId>
                        <version>${ecore-xcore-version}</version>
                    </dependency>
                    <dependency>
                        <groupId>org.eclipse.emf</groupId>
                        <artifactId>org.eclipse.emf.ecore.xcore.lib</artifactId>
                        <version>${ecore-xcore-lib-version}</version>
                    </dependency>
                </dependencies>
            </plugin>
        </plugins>
    </build>
</project>

.gitignore

bin/*
target
emf-gen/*
xcore-gen/*
xtend-gen/*


by Niko Stotz at February 18, 2017 06:00 PM

Eclipse Newsletter - Top Eclipse Marketplace Plugins

February 17, 2017 04:52 PM

What are the top Eclipse Marketplace plugins? How can you benefit from them? Find out!

February 17, 2017 04:52 PM


Eclipse Hono : “Connect. Command. Control” … even on OpenShift !

by ppatierno at February 17, 2017 09:04 AM

The Eclipse Foundation is one of the main open source communities with a strong focus on the Internet of Things, and the related Eclipse IoT ecosystem involves many different projects and partners such as Red Hat, Bosch, Eurotech, IBM and more. Recently, in a published white paper, they showed a full stack of available tools for building a complete IoT solution, from devices through gateways to the Cloud.


On the Cloud side, one of the main problems in the IoT world is handling millions of connections from devices (and gateways) in the field, managing their registration for authentication and authorisation and, last but not least, ingesting the data they send, such as telemetry or events. The final challenge is controlling these devices: sending them commands to execute actions in the environment around them, or to upgrade their software and configuration.

The Eclipse Hono™ project is the answer to these problems!

The APIs

From the official web site, we can read :

Eclipse Hono™ provides remote service interfaces for connecting large numbers of IoT devices to a back end and interacting with them in a uniform way regardless of the device communication protocol.

The mantra from the landing page of the project web site is “Connect. Command. Control”, which is made a reality through well-defined APIs for:

  • Registration: for handling requests related to the registration (i.e. creation) of a new device so that it can be authorised to connect. It’s also possible to retrieve information about registered devices or to delete them;
  • Telemetry: for the ingestion of a large volume of data from devices (i.e. sensors), made available for analysis to backend applications;
  • Event: for receiving specific events (i.e. alarms, notifications, …) from devices in order to make decisions on the Cloud side. This API is quite similar to the telemetry path, but it uses a “different” channel so that such events don’t have to go through the busy telemetry one;
  • Command & Control: for sending commands to the devices (from a backend application) to execute operations and actions in the field (receiving the related response) and/or to upgrade local software and configuration.

All the above APIs are accessible through the AMQP 1.0 protocol (the only standardised AMQP version!). This means they are defined in terms of the addresses to which devices connect in order to interact with the system, and of the properties and content of the exchanged messages. Of course, this is true not only for devices but also for business and backend applications, which can receive data from devices or send them commands. Devices which aren’t able to speak this protocol are not excluded: they can leverage the protocol adapters provided by Hono; the current implementation provides an MQTT and an HTTP REST adapter.
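To make the adapter idea concrete, here is a hedged sketch of how a device might publish telemetry through Hono’s MQTT adapter. The topic layout, tenant and device identifiers are assumptions based on a typical example deployment, so check the adapter documentation for your Hono version; the actual publish (shown only in comments) would additionally require an MQTT client library such as paho-mqtt.

```python
# Hedged sketch: publishing telemetry through Hono's MQTT protocol adapter.
# Tenant, device id and topic layout below are ASSUMPTIONS for an example
# deployment, not guaranteed to match every Hono version.
import json

TENANT = "DEFAULT_TENANT"   # hypothetical tenant id
DEVICE = "4711"             # hypothetical device id


def telemetry_topic(tenant, device):
    """Build the adapter's telemetry topic for a given tenant/device."""
    return "telemetry/{}/{}".format(tenant, device)


def make_payload(temperature):
    """Serialize a sensor reading as JSON, as backend consumers typically expect."""
    return json.dumps({"temp": temperature})


if __name__ == "__main__":
    # With paho-mqtt installed, the actual publish would look roughly like:
    #   client = paho.mqtt.client.Client()
    #   client.connect("localhost", 1883)   # MQTT adapter host/port
    #   client.publish(telemetry_topic(TENANT, DEVICE), make_payload(21.5))
    print(telemetry_topic(TENANT, DEVICE))
    print(make_payload(21.5))
```

The point is that the device speaks plain MQTT; the adapter translates the publish into an AMQP 1.0 telemetry message on the device’s behalf.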

All these APIs are designed to allow multi-tenancy, so that a single deployment can handle channels for different tenants, with each tenant unable to see data or messages exchanged by the others.

The Architecture

The main components that make up the Eclipse Hono™ architecture are:

  1. Protocol Adapters: these components adapt a device protocol to the first-class protocol used in Hono, AMQP 1.0. Today, MQTT and HTTP REST adapters are provided out of the box, but thanks to the available interfaces, users can develop new adapters even for custom protocols;
  2. Hono Server: this is the main component to which devices connect, either directly through AMQP 1.0 or through the protocol adapters. It’s in charge of exposing the APIs as endpoints and of handling the authentication and authorisation of devices;
  3. Qpid Dispatch Router: this is an AMQP 1.0 router, part of the Apache Qpid project, which provides the underlying infrastructure for handling millions of connections from devices in the field. The simplest deployment can use a single router, but to guarantee reliability and high-volume ingestion, a router network should be used;
  4. ActiveMQ Artemis: this is the broker mainly used for the Command & Control API, i.e. for storing commands which have to be delivered to devices.

While the devices connect directly to the Hono Server (or through protocol adapters), the backend applications connect to the Qpid Dispatch Router, leveraging direct addressing for receiving telemetry data (if no application is online, no device can send data) or for sending commands (the queues backed by the broker are reachable through the router).

selection_004

The running environment

All the artifacts from the project are provided as Docker images, one for each of the above components, which can run using Docker Compose (Docker Swarm support will be available soon) or on a more focused Cloud platform like OpenShift (compatible with Kubernetes deployments as well).
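To give an idea of what such a deployment descriptor looks like, here is a hedged Compose-style sketch wiring the components together. Service names, image tags and port mappings are illustrative assumptions; the real files are generated by the Hono build, so treat this only as a reading aid.

```yaml
# Hedged sketch of a Compose-style Hono deployment.
# Image names, tags and ports are ASSUMPTIONS, not the generated files.
version: '2'
services:
  qdrouter:
    image: eclipsehono/dispatch-router:latest   # hypothetical tag
    ports:
      - "15672:5672"       # AMQP 1.0 endpoint for backend applications
  hono-server:
    image: eclipsehono/hono-server:latest       # hypothetical tag
    depends_on:
      - qdrouter           # the server attaches to the router network
  rest-adapter:
    image: eclipsehono/hono-adapter-rest-vertx:latest
    depends_on:
      - hono-server
    ports:
      - "8080:8080"        # HTTP REST ingestion from devices
  mqtt-adapter:
    image: eclipsehono/hono-adapter-mqtt-vertx:latest
    depends_on:
      - hono-server
    ports:
      - "1883:1883"        # MQTT ingestion from devices
```

The same topology (router, server, one container per adapter, optionally the broker) is what the OpenShift YAML files describe as well, just expressed as deployment configurations and services instead of Compose services.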

selection_006

Regarding the OpenShift deployment, the building process of the Hono project provides a bunch of YAML files which describe the objects to deploy, like deployment configurations, services, persistent volumes and related claims. All you need is an accessible OpenShift platform instance; deploy these objects and you have Eclipse Hono™ running in a full featured Cloud environment with all the related scaling capabilities.

hono_openshift

The example deployment is based on four pods, one for each of the main components: the router pod, the Hono Server pod and one pod for each protocol adapter. Of course, if you need the Command & Control path, the broker pod needs to be deployed as well.

In order to try it, an OpenShift Origin instance can be used on a local PC to deploy the entire Hono solution; for example, the above picture comes from my tests, where you can see all the pods running on OpenShift (left side) with simulated devices interacting over MQTT and HTTP REST (right side).

So what are you waiting for? Give it a try!

Conclusion

In my mind, every IoT solution should be made of three different layers (a little different from the Eclipse vision): the devices/gateways, the connectivity layer and the service layer.

While the Eclipse Kura project fits in the gateways layer and Eclipse Kapua in the service layer, Eclipse Hono is the glue between them, handling the connections from devices and making their data available to the backend services (and vice versa, in the opposite direction, for command and control). Thanks to the API standardisation and the usage of a standard protocol like AMQP 1.0, Hono can be used to connect any kind of device with any kind of service; of course, leveraging these three projects has the big advantage of having big companies working on them, mainly Red Hat, Bosch and Eurotech.

Finally, the solution is always the same… open source and collaboration! 😉

 



by ppatierno at February 17, 2017 09:04 AM

Prerequisite Dependencies

by waynebeaton at February 17, 2017 06:34 AM

All third party content must be taken through the Eclipse Foundation’s Intellectual Property (IP) Due Diligence Process before being used by an open source project hosted by the Eclipse Foundation. This includes all third party content that is incorporated into project code, included in builds, or otherwise required by the project code to provide functionality.

The Eclipse Foundation’s Guidelines for the Review of Third Party Dependencies defines various classifications of third party content and how project teams should engage in the IP process. The classification of third party content most commonly used is the so-called prerequisite dependency: a prerequisite dependency is third party content that the project either needs to distribute along with the project code, or otherwise have present in order to function.

Third party content is likely a prerequisite dependency if:

  • project code accesses the content’s APIs;
  • project code includes an import statement for a package from a library;
  • project code uses reflection or other means to reference a library’s APIs and implementation;
  • the Java/OSGi manifest for a project bundle makes a direct reference to it; or
  • project code invokes a command line tool to access the functionality.

This list is not intended to be exhaustive, but rather to provide common examples. Fundamentally, if project code makes some kind of use of third-party content, then that content is likely a prerequisite. The guidelines provide for so-called works with and exempt prerequisite dependencies, but I’ll save that discussion for a future post.

Project committers can engage in the IP Due Diligence process by creating a contribution questionnaire (CQ) in the Eclipse Foundation’s IPZilla system.

The term contribution questionnaire is a bit misleading when it comes to third party content. Third party content is not really a contribution to the project, but since the requirements for tracking project code and third party content are very similar, the same mechanism is used for both.

The workflow looks a little something like this:

prereq

screenshot-from-2017-02-08-13-25-01

Committer Tools

There’s an entry point that project committers can use to Create a Contribution Questionnaire in the Committer Tools block that is on every project’s information page (see the image on the right).

In the first step, the committer creates the CQ record with essential information about the third party content and then attaches the source code for that content (this currently happens in two separate steps). The corresponding Project Management Committee (PMC) must sign off on the CQ before any further processing occurs.

As a general rule, a CQ should be created for each separate library. The process permits some flexibility with regard to the definition of library. Content that has source in a single source repository but is distributed as multiple JAR files can very likely be regarded as a single library. There have also been cases where the IP Team has accepted a single CQ to do license certification for a collection of many different JavaScript sources. If you’re not sure, check with the IP Team.

License scanning software will be engaged for third party content that’s to be reviewed for license certification (type A). As we roll out this new type of IP due diligence, the IP Team is engaging the tool manually, evaluating the output, and making a determination. Our plan is to automate the invocation of the tool and make a determination automatically where possible (e.g. in cases where the licenses are clearly indicated in the content) and have the IP Team investigate further when necessary.

For requests to perform the more thorough IP due diligence process (type B), the workflow is different. For content that qualifies for parallel IP processing, the IP Team will do a cursory review and, assuming that the content passes muster, grant check-in, meaning that the content can be pushed into the project’s source code repository and included in builds. The full due diligence review includes verification of the content’s provenance and a scan for all kinds of other anomalies (more on this in future posts). This process is a lot of work that can take quite a lot of time to complete, and very often requires interaction with the committer. When the IP Team completes a full review with a successful outcome, they’ll mark the CQ as approved.

All third party content must be either license certified or approved before a project can issue an official release. Any release that includes license certified content must be marked as type A. A type A release may include type B content; a type B release, however, cannot include type A content.

I’ve thrown out a few concepts in this post without providing explanations. I’ll try and fill in the gaps in future posts, which will all be grouped in the intellectual property category.



by waynebeaton at February 17, 2017 06:34 AM

What is Two? Much more than yet another Eclipse IDE

by Doug Schaefer at February 16, 2017 07:19 PM

It’s been a wild few weeks on the journey I now simply call “Two”. (Until it becomes an Eclipse project, I can’t call it one. And rightly so.) The feedback I’ve received, especially as it hit the presses, was fantastic. It’s driven a lot of traffic to my github repo. I’ve even received a few pull requests from people cleaning up my mess as I plow through it. At the very least, people are intrigued by the idea of an IDE built with Web UI technologies running locally under Electron.

As I work through the vision, I don’t think it’s fair to simply call it yet another Eclipse IDE. It’s much grander than that. For me, Two is about that vision. It’s more than a code editor. It’s more than just taking the existing Eclipse IDE and implementing it in HTML5 running locally. It’s about a whole new way for developers to work with their tools and access the resources available to them on the web.

Developers today have access to many powerful web services. Github is a huge one. At Eclipse we use Bugzilla for bug tracking. A lot of companies use JIRA. People may be using Gitlab as their private Github. You may be using Gerrit for code reviews. I notice LLVM is using Phabricator for code reviews. And there’s the venerable Review Board still active today. Jenkins for continuous integration. And don’t tell me you don’t cut and paste error messages into Google searches to find others who’ve seen the same problem and worked out a solution. The Web is a critical tool in the developer’s belt.

I was intrigued by an article written by the creators of a tool called Ship. It’s a Mac native, i.e. Cocoa, app that integrates with the Github issue tracker. This quote hit me:

“A point of pride for us is that many people we have shown Ship 2.0 haven’t been able to tell where the web-based content ends and the native Cocoa stuff begins.”

That’s exactly it. If we can say the same for Two, where you can’t tell where the web ends and integration with local tools begin, then we’ve hit a home run.

Integrated Development Environments provide the most value when they offer integrations between tools, integrations that developers have had to perform manually for years. The IDE’s role is to automate workflows that cut across all the tools the developer has to use, so that the workflow, not the individual tools, becomes the key. This focus makes it much easier for developers to accomplish their objectives: they don’t have to learn all the intricacies of the underlying environments, but get a unified language instead.

And that’s what Two needs to be, a tools integrator that includes not only the tools the user has installed locally, but also integrates with these super powerful web resources in seamless workflows.

The other important factor with IDEs, one that so many forget, is that experienced developers will always have a bit of mistrust in their IDE no matter how good it is. They need to be able to get close to the iron and run the underlying tools manually at a place where their trust is higher. An IDE that hides that, or makes those tools hard to get to, will struggle to be universal. That’s why I’m not a fan of Cloud or Docker hosted tools. In making deployment easy for the provider, they make it hard for the user to touch the bits they’re creating.

The main reason I don’t want to call this Eclipse Two is that this isn’t just about a next generation Eclipse IDE. That IDE was my One, the first IDE that I’ve had a hand in creating. I want to rethink the whole developer experience and create an IDE that works well in the modern world, now almost 20 years after One. This is Two.


by Doug Schaefer at February 16, 2017 07:19 PM

Tutorial – Eclipse Marketplace & Favourites List

by Antoine THOMAS at February 16, 2017 08:00 AM

A few months ago we introduced the Eclipse Marketplace Favourites List. To explain how it works, I created a video tutorial. This article is a more classic tutorial explaining what the Eclipse Marketplace is and how you can use Favourites Lists to your advantage.

What is Eclipse Marketplace?

The Eclipse Marketplace is a place to discover, share, and install relevant Eclipse plugins and solutions. Think of it like an app store for Eclipse solutions.

One of its main uses is to add plugins to your Eclipse installation. You will also find applications based on the Eclipse Platform, tooling for IoT, and other solutions and services. In this article you’ll find out how to do so; you might even learn a few new tricks along the way.

Favourite list online

One of the great features of the Eclipse Marketplace is the possibility to create and share a list of plugins. This is very helpful if you have many Eclipse installations and want to find and install your favourite ones quickly.

How to do this on the website?

  1. Go to https://marketplace.eclipse.org/ and sign in, or create an account if you don’t have one yet.
  2. Search for your favourite solution
  3. Click on the white star below the logo to add a plugin to your Favourites list.

Visual Example

In this example, the Darkest Dark Theme will be added to my favourite plugins list. Once a plugin has been “starred”, it is automatically added to your list.

To view it, simply go to “My Marketplace”:

And there it is, in your list of favorites!

Marketplace Client in Eclipse

Manage plugins

Did you know? You can install plugins from your Eclipse IDEs using the Marketplace Client. To launch it:

  1. Click on the “Help” tab in the menu.
  2. Click on “Eclipse Marketplace”.

You can also launch it using the Quick Access Bar at the top right of your Eclipse workspace.

  1. Press “Alt + 3” (Windows) or “Command + 3” (Mac) to launch the Quick Access search bar
  2. Type “Eclipse Marketplace”
  3. Hit “Enter” and voila – you’re there!

You can display and manage your favorites here too:

You can of course add, remove, and install plugins. The Eclipse Marketplace Client and the website store your favourites with your eclipse.org account, so they are synchronised.

Copying Another User’s Favourite List

It’s also possible to install plugins from someone else’s list, say, a colleague’s or another project contributor’s. To import a favourites list from another community user in your workspace Marketplace client:

  1. Click on “Import Favourites List”
  2. Copy/paste the link to that person’s favourites list.
  3. All plugins are automatically selected. Unselect any plugins you don’t want by unchecking the checkboxes to the left of each individual plugin.
  4. Click import.

Here is an example with Lars Vogel‘s plugins.

Share

Long story short, to save time or to share your favourite plugins quickly and easily with someone else, you can add plugins to your favourites list. Sharing is simple, all you need is the link to your list. You can find this link on your user profile:

Here are a few Favourites Lists that I would like to share with you. You can import them directly by copying and pasting the link in Eclipse Marketplace Client:

Do not hesitate to share your favourite list on social networks. You can use #EclipseMarketplace on Twitter.

Feedback welcome

As usual, feedback is welcome and discussion is open. Do not hesitate to comment on this article or to open a bug.


by Antoine THOMAS at February 16, 2017 08:00 AM

Announcing Availability of Free Error Reporting Service for Eclipse Plugin Developers

by Content Master at February 15, 2017 03:26 PM

What annoys software developers most? Probably when the tools they use Just Don’t Work. And it doesn’t matter what kind of software it is, or whether it’s open source or commercial. If it fails, it sucks. Users don’t show mercy. The same holds for the Eclipse IDE. In recent years, many users complained that Eclipse was way too buggy and moved on to the competition. To stop that trend, we started the Automated Error Reporting Initiative for Eclipse. Our main objectives were (i) to make reporting problems with Eclipse as easy as possible and (ii) to make Eclipse committers aware which parts […]

The post Announcing Availability of Free Error Reporting Service for Eclipse Plugin Developers appeared first on Ctrlflow Blog.


by Content Master at February 15, 2017 03:26 PM

Eclipse IoT Announces Support for OMA LightweightM2M 1.0 Device Management Standard

February 15, 2017 02:11 PM

We are pleased to announce the Eclipse Leshan project will support OMA's newly ratified LwM2M 1.0 standard.

February 15, 2017 02:11 PM

Upcoming IoT Events in March 2017

by Roxanne on IoT at February 15, 2017 11:07 AM

It’s no surprise that there are many, MANY, Internet of Things (IoT) events happening all over the world every month. But, there are a few…


by Roxanne on IoT at February 15, 2017 11:07 AM

Why we need industrial consortiums for open source projects!

by tevirselrahc at February 15, 2017 09:00 AM

On Friday, February 17th at 16:00 CET, 15:00 GMT, and 10:00 EST, the Papyrus Industry Consortium’s (a.k.a. Papyrus-IC or, as I prefer, Me-IC 😉) Research and Academia committee will host their second webinar of the year:

Industrial usage of open source solutions – Why do we need industrial consortiums?

Ericsson’s Francis Bordeleau, who is also the Me-IC’s chairperson, will be presenting this very interesting topic.

I must say that, as an open source project myself, I have profited from the creation of the Papyrus IC and I certainly owe some of my growth to its members!

If you are curious as to how open source projects can grow and become more palatable for usage in an industrial setting, then you must attend this webinar!

You can find information on how to connect from the Me-IC Research and Academia webinar page!

See you there!


Filed under: Papyrus IC, Research and Academia, webinar Tagged: community, industry, Industry consortium, open-source, PIC

by tevirselrahc at February 15, 2017 09:00 AM