Eclipse Newsletter - Discover the Eclipse Marketplace

August 27, 2015 03:06 PM

The articles highlight some of the most popular solutions including EclEmma, Spring Tool Suite, JBoss Tools, and PyDev.


Exposing a REST service as an OSGi Service

by Scott Lewis (noreply@blogger.com) at August 27, 2015 02:25 PM


I've produced a tutorial describing an additional use case for Remote Services: exposing a JAX-RS REST service as an OSGi Remote Service.

This describes how one can take an existing REST service and expose it to consumers as an OSGi Remote Service.






Eclipse Scout Day 2015

August 27, 2015 12:03 PM

On November 2nd we are organizing the Scout user group meeting as part of the Unconference right before EclipseCon Europe. Participation requires registration, but the event is open to anyone interested.

Access Dart Analysis Server from Java

by Tom Schindl at August 25, 2015 03:24 PM

In my last few blogs (here, here and here) I started implementing a smart Dart editor with syntax highlighting, auto-completion, … for Google Dart.

While the first posts were dedicated to getting syntax highlighting working, we'll now start on the higher-level features you are used to from full-blown IDEs.

In this first blog post we'll look at how we can interface with the Dart Analysis Server, which is the reason I've chosen Dart as the example language.

The communication with the Dart Analysis Server is done through its input and output streams. An example will probably help.

Let's suppose we have a /Users/tomschindl/dart-samples/test.dart with the following content:

class Rectangle {
  num left;   
  num top;
  num width; 
  num height;          
  
  num get right             => left + width;
      set right(num value)  => left = value - width;
  num get bottom            => top + height;
      set bottom(num value) => top = value - height;
}

// This is where the app starts executing.
main() {
  var r = new Rectangle();
  r.
}

If you now want to get all auto-completion proposals after r., you issue the following commands/requests to the server:

  • You launch the Dart analysis server, e.g. on my workstation:
    /Users/tomschindl/Downloads/dart-sdk/bin/dart \
     bin/snapshots/analysis_server.dart.snapshot
  • You inform the server about the root directory:
    { 
      "id" : "default_1", 
      "method" : "analysis.setAnalysisRoots" , 
      "params" :  {
         "included":["/Users/tomschindl/dart-samples/"],
         "excluded":[]
      }
    }
    
  • Which will synchronously return the following result

    {
      "id":"default_1"
    }
    
  • Request all auto-completions at offset 367, which is directly after r.

    { 
      "id" : "default_2", 
      "method" : "completion.getSuggestions" , 
      "params" :  {
        "file":"/Users/tomschindl/dart-samples/test.dart",
        "offset":367
      }
    }
    
  • Which will synchronously return the following result
    {
      "id":"default_2",
      "result":{
        "id":"0"
      }
    }
    
  • And asynchronously the following events will occur

    {
      "event":"completion.results",
      "params":{
        "id":"0",
        "replacementOffset":367,
        "replacementLength":0,
        "results":[],
        "isLast":false
      }
    }
    
    {
      "event":"completion.results",
       "params":{
         "id":"0",
         "replacementOffset":367,
         "replacementLength":0,
         "results": [
           {
             "kind":"INVOCATION",
             "relevance":1000,
             "completion":"left",
             "selectionOffset":4,
             "selectionLength":0,
             "isDeprecated":false,
             "isPotential":false,
             "declaringType":"Rectangle",
             "element":{
               "kind":"FIELD",
               "name":"left",
               "location":{
                 "file":"/Users/tomschindl/dart-samples/test.dart",
                 "offset":24,
                 "length":4,
                 "startLine":2,
                 "startColumn":7
               },
               "flags":0,
               "returnType":"num"
             },"returnType":"num"
           },
           {
             "kind":"INVOCATION",
             "relevance":1000,
             "completion":"right",
             "selectionOffset":5,
             "selectionLength":0,
             "isDeprecated":false,
             "isPotential":false,
             "declaringType":"Rectangle",
             "element":{
               "kind":"GETTER",
               "name":"right",
               "location":{
                 "file":"/Users/tomschindl/dart-samples/test.dart",
                 "offset":95,
                 "length":5,
                 "startLine":7,
                 "startColumn":11
               },
               "flags":0,
               "returnType":"num"
             },
             "returnType":"num"
           }
           // Many more
         ],
        "isLast":true
      }
    }
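
If you wanted to drive this protocol by hand from Java, a minimal sketch might look like the following. This is purely illustrative: the class and helper names are mine, and it assumes (as the walkthrough suggests) that the server consumes one JSON request per line on stdin and emits responses and events line by line on stdout.

```java
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.File;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.OutputStreamWriter;

// Hypothetical low-level client; paths and helper names are illustrative only.
public class RawAnalysisClient {

    // Build a request in the id/method/params shape used in the walkthrough above
    static String buildRequest(String id, String method, String paramsJson) {
        return "{\"id\":\"" + id + "\",\"method\":\"" + method + "\",\"params\":" + paramsJson + "}";
    }

    public static void main(String[] args) throws IOException {
        String setRoots = buildRequest("default_1", "analysis.setAnalysisRoots",
                "{\"included\":[\"/Users/tomschindl/dart-samples/\"],\"excluded\":[]}");
        String complete = buildRequest("default_2", "completion.getSuggestions",
                "{\"file\":\"/Users/tomschindl/dart-samples/test.dart\",\"offset\":367}");

        File dart = new File("/Users/tomschindl/Downloads/dart-sdk/bin/dart");
        if (!dart.canExecute()) {
            // No Dart SDK on this machine: just show the requests we would send
            System.out.println(setRoots);
            System.out.println(complete);
            return;
        }

        // Launch the analysis server and talk to it over stdin/stdout,
        // one JSON request per line
        Process p = new ProcessBuilder(dart.getAbsolutePath(),
                "bin/snapshots/analysis_server.dart.snapshot").start();
        BufferedWriter out = new BufferedWriter(new OutputStreamWriter(p.getOutputStream()));
        BufferedReader in = new BufferedReader(new InputStreamReader(p.getInputStream()));
        out.write(setRoots);
        out.newLine();
        out.write(complete);
        out.newLine();
        out.flush();
        // Responses and completion.results events arrive line by line
        System.out.println(in.readLine());
    }
}
```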
    

I guess you get the idea: not really rocket science, but in Java we don't want to deal with this low-level stuff. So as part of the editor work we also developed a Java interface for the Dart Analysis Server, available from Maven Central.

Let's issue the same commands as above through the Java API.

Create a Maven project and make the pom.xml look like this:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
	<modelVersion>4.0.0</modelVersion>
	<groupId>at.bestsolution</groupId>
	<artifactId>at.bestsolution.dart.server.sample</artifactId>
	<version>0.0.1-SNAPSHOT</version>
	<repositories>
		<repository>
			<id>sonatype-snapshots</id>
			<url>https://oss.sonatype.org/content/repositories/snapshots/</url>
		</repository>
	</repositories>
	<dependencies>
		<dependency>
			<groupId>at.bestsolution.eclipse</groupId>
			<artifactId>org.eclipse.fx.core</artifactId>
			<version>2.1.0-SNAPSHOT</version>
		</dependency>
		<dependency>
			<groupId>at.bestsolution</groupId>
			<artifactId>at.bestsolution.dart.server.api</artifactId>
			<version>1.0.0</version>
		</dependency>
	</dependencies>

	<build>
		<plugins>
			<plugin>
				<artifactId>maven-compiler-plugin</artifactId>
				<configuration>
					<source>1.8</source>
					<target>1.8</target>
				</configuration>
			</plugin>
		</plugins>
	</build>
</project>

And a Java class with the following content:

package at.bestsolution.dart.server.sample;

import java.io.IOException;
import java.util.stream.Stream;

import org.eclipse.fx.core.Util;

import at.bestsolution.dart.server.api.DartServer;
import at.bestsolution.dart.server.api.DartServerFactory;
import at.bestsolution.dart.server.api.Registration;
import at.bestsolution.dart.server.api.model.CompletionResultsNotification;
import at.bestsolution.dart.server.api.services.ServiceAnalysis;
import at.bestsolution.dart.server.api.services.ServiceCompletion;

public class DartServerSample {
	public static void main(String[] args) {
		// Get the server factory from the service registry
		DartServerFactory serverFactory = Util.lookupService(DartServerFactory.class);
		// Create a server instance
		DartServer server = serverFactory.getServer("server");

		// Get the analysis and completion service
		ServiceAnalysis analysisService = server.getService(ServiceAnalysis.class);
		ServiceCompletion completionService = server.getService(ServiceCompletion.class);

		// set the root
		analysisService.setAnalysisRoots(new String[] {"/Users/tomschindl/dart-samples/"}, new String[0], null);
		// register for completion notifications
		Registration proposalRegistration = completionService.results(DartServerSample::handleHandleResults);

		// Request completion at offset 367
		completionService.getSuggestions("/Users/tomschindl/dart-samples/test.dart", 367);

		// Wait for a key press
		try {
			System.in.read();
		} catch (IOException e) {
		}

		// unregister the notification listener
		proposalRegistration.dispose();
		// shutdown the server instance
		server.dispose();
	}

	private static void handleHandleResults(CompletionResultsNotification notification) {
		Stream.of(notification.getResults()).forEach( c -> System.err.println(c.getCompletion()));
	}
}

And then you have to launch the class with

java -Ddart.sdkdir=/Users/tomschindl/Downloads/dart-sdk at.bestsolution.dart.server.sample.DartServerSample

where you replace /Users/tomschindl/Downloads/dart-sdk with the path to your dart-sdk installation.
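
For illustration, the dart.sdkdir property from that launch command can be read like any other system property. The property name comes from the command above; the resolution logic below is my own sketch, not the library's actual implementation:

```java
import java.nio.file.Path;
import java.nio.file.Paths;

// Hypothetical helper; only the property name "dart.sdkdir" is taken from the launch command above.
public class SdkLocator {
    // Resolve the analysis server snapshot relative to the SDK directory
    static Path serverSnapshot(String sdkDir) {
        return Paths.get(sdkDir, "bin", "snapshots", "analysis_server.dart.snapshot");
    }

    public static void main(String[] args) {
        String sdk = System.getProperty("dart.sdkdir", "/Users/tomschindl/Downloads/dart-sdk");
        System.out.println(serverSnapshot(sdk));
    }
}
```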




Gerrit inline editing Feature

August 25, 2015 08:56 AM

We have moved our Scout Blog to a new location:
https://www.bsi-software.com/en/scout-blog/

I have opened Bug 473139 to update our RSS Feed URL in the Eclipse planet configuration file.

This was a perfect opportunity for me to test the new inline editing feature of Gerrit. (See Denis Roy's email on the "eclipse.org-committers" mailing list: New version of Gerrit deployed.)

Here is how I created a new change for my Bug, directly on Gerrit:

1/ Search for the corresponding repository in the projects list.



by Peter Kriens (noreply@blogger.com) at August 24, 2015 07:56 AM

Building the Internet of Things with OSGi, by Tim Verbelen, iMinds, Ghent University. One year ago, I was speaking at my first EclipseCon Europe about the results of my PhD research (Mobilizing the Cloud with AIOLOS). Since then, I have been working for iMinds as a research engineer. iMinds is a digital research centre in Belgium, which joins the 5 Flemish universities in conducting


Presentation: Tasty Recipes for OSGi Bundles

by Gunnar Wagenknecht at August 23, 2015 11:31 PM

Gunnar Wagenknecht introduces the Eclipse Bundle Recipes project, explaining how to turn a library from Maven into an OSGi bundle, and how to deploy recipes and build systems to a local environment.


Presentation: Model Migration with Edapt

by Maximilian Koegel at August 23, 2015 08:50 PM

Maximilian Koegel introduces Edapt, describing its basic features and demonstrating how it can be used for migrating models in real life applications.


Introducing the EclipseCon Europe 2015 Hackathon

by waynebeaton at August 20, 2015 07:52 PM

Alternate title: I Went to EclipseCon Europe and What I Saw There Blew My Mind…

EclipseCon has great tutorials and sessions. The speakers are second to none, and the networking opportunities are fantastic. If you're looking for it, you'll find software developers doing amazing things together. For EclipseCon Europe 2015, we're going to formalize this a bit. At the very least, we're going to give it a nice home and a lot of love.

We're extending our usual notion of a hackathon: instead of just having a single evening session, we're going to host a hacking area for two full days during the conference. We'll have a few things happening in this hacking area.

There will be multiple tables arranged in perfect order for pair programming. The tables will have wired connections so that participants don’t have to wrestle with the wifi. We’re going to set up a server with mirrors of our Git and Mars p2 repositories so you don’t have to make the long trip back to our servers in Ottawa to get started (once you’ve cloned, you can connect to the master repository to get updates and push patches).

We'll have Eclipse Foundation staff and others there to help. We can help with things like creating Eclipse Foundation accounts, setting up Contributor License Agreements (CLAs), configuring development environments, understanding the development and intellectual property (IP) processes, and more. We can even help you find a problem to solve and a partner to solve it with (if you need that sort of help). We'll have a little presentation area set up where we'll run short "how to contribute" sessions throughout the day.

You are most certainly welcome to work on your own problem. Or, you can select from our short list of issues that we feel can be solved with up to two hours of effort: pick something from the list and we’ll help you get started and sort through the development process. If you’re not quite ready to submit patches, you can help us test by downloading one of the packages, the installer, or a feature; we’ll help you create bug reports for issues that you find. Or we can help you just create a bug report for some aspect of Eclipse that’s been driving you batty.

We’re going to set up some office hours: specific times set aside when we’ll have subject matter experts on hand to help with specific sorts of problems. Use the office hours to connect with experts on topics such as building Oomph configurations, testing your plug-ins, or using the Common Build Infrastructure; and committers from specific Eclipse projects.

Eclipse committers are, of course, welcome to use the hacking area to collaborate with fellow committers that you don’t normally get to work with directly.

I think that you’ll find that we have a little something for everybody.

EclipseCon Europe 2015




Improved TextMate bundles integration and instant searches on Eclipse (LiClipse 2.3.0)

by Fabio Zadrozny (noreply@blogger.com) at August 20, 2015 07:02 PM

Yeap, LiClipse 2.3.0 is out now.

The main change in this release was a restructuring of how the parsing works...

This deserves some explanation: in Eclipse, the default parsing structure separates a document into many top-level partitions; later on, those partitions are scanned to provide syntax highlighting.

The issue here is that in LiClipse, when parsing with TextMate rules, a single parse provides all the information at once (top-level partitions and the internal scopes within each partition). So the code was restructured so that a single parse is done and the information is saved to be accessed later on: one parse provides the partitions, colors, outline, etc.

This means that LiClipse became even more responsive, especially when dealing with TextMate rules, and the scope information is now available to better support TextMate snippets for code completion.

Also, the searching structure created in LiClipse 2.2.0 was ironed out (and it's really pleasant getting instant searches even on big codebases).

Enjoy!


Users can now fund development work on Eclipse

by Mike Milinkovich at August 19, 2015 03:00 PM

Today, we are significantly lowering the barriers for companies and individuals to actively invest in the ongoing development of the Eclipse platform. Eclipse has an amazing community of individuals and companies that invest significant amounts of resources in the development of Eclipse open source projects. We also have a huge community of users that benefit from Eclipse technology. They use Eclipse tools and technology to build their software products and applications. Most of these users don't have the time required to participate in an open source project, but they do want to see ongoing improvements and investment in Eclipse. We now have a way for these users to invest in Eclipse improvements.

We are pleased to announce the Eclipse Foundation has begun to fund development work on Eclipse projects.  In fact, there are a number of features and issues in the Mars release that were funded through the Foundation. The initial focus is on improving the core Eclipse platform, JDT and Web Tools. As the program expands we expect the list of projects will grow too. The process by which funds will be allocated is still a work in progress, but will be made available soon. It will be based on the core principles of openness and transparency.

The funding for the development work will come from individuals and corporate users. Earlier this year, Ericsson provided the Eclipse Foundation funds to improve the Eclipse platform which resulted in SWT, GTK3 and PDE improvements available in the Mars release. Ericsson is a large user of Eclipse and they see the value of investing in ongoing improvements. We hope other large corporate users of Eclipse will follow Ericsson’s lead.

We are also pleased to announce that all users’ donations to our Friends of Eclipse program will be used to fund Eclipse development work. Last year we raised over $120,000 from the Friends of Eclipse program, so we hope the ability to directly fund Eclipse development will significantly increase the donations we gain from our individual user community. To make things even easier, we have added Bitcoin as a payment option. Please take this opportunity to help improve Eclipse by making a donation.

Eclipse open source development will continue to move forward through the work of our committer community. Committers are the heart and soul of any open source project. However, we are confident that additional investment from our user community will help accelerate future improvements to Eclipse. If you are a user of Eclipse, individual or corporate, it is now simple to participate in the future of Eclipse.


Filed under: Foundation


OSGi Residential Release 6 Specification

by Kai Hackbarth (noreply@blogger.com) at August 19, 2015 02:45 PM

The OSGi Residential Expert Group is excited about the new specifications introduced in the OSGi Residential Release 6 Specification. You can find it at http://www.osgi.org/Specifications/HomePage. It contains a number of new specifications, mostly dealing with device interoperability and with configuration, monitoring and management services. As with the previous Residential 4.3 Specification, the


Users can now fund development work on Eclipse

August 19, 2015 01:39 PM

We significantly lowered the barriers for companies and individuals to invest in the ongoing development of the Eclipse platform.


More jars on Maven Central and JCenter

by BJ Hargrave (noreply@blogger.com) at August 19, 2015 12:45 PM

In my last blog post, I announced the availability of the Release 6 specifications including their companion code jars, which were released to Maven Central and JCenter. Those companion code jars were collections of companion code associated with the specification documents. For example, the osgi.enterprise jar contained all the packages for all the APIs specified in the OSGi Enterprise


Building Eclipse Plugins with Maven Tycho and Travis-CI

by Andreas Mülder (noreply@blogger.com) at August 19, 2015 12:01 PM



A couple of weeks ago, we moved our open source project Yakindu Statechart Tools from Google Code SVN to GitHub. Until today we packaged and deployed our software using a self-hosted Jenkins server, but since Travis CI integrates seamlessly with GitHub and is free for open source projects, we decided to give it a try. This blog post explains how to set up Travis CI for a Maven Tycho based Eclipse build.

There are plenty of blog posts out there stating how easy it is to set up a Travis CI build for a ten-line 'Hello World' Java example. This is great, but our builds have a lot more work to do:
  • 170,000 lines of Java code
  • Code generation of Xtend classes and Xtext grammars
  • more than 1000 JUnit Tests (including UI Tests)
  • ~100 C and C++ Google Tests to run 
Let's see how well Travis can handle this.
 
Initial Setup

To get started with Travis you first have to log in to travis-ci.org with your GitHub account and grant access to your GitHub repositories. This requires admin access to the repository. It is well documented in the User Guide. Next, you have to create a file .travis.yml in the root folder of your repository to tell Travis what to do. By the way, there is a really useful online syntax checker for yml files available here.

This is the simplest possible .travis.yml file for our build based on Maven Tycho:
language: java
script:
- cd releng/org.yakindu.sct.releng
- mvn clean verify

Immediately after pushing the .travis.yml file to our Git repository, Travis started a new build, and we ended up with a 5 MB log file full of exceptions like this one:

Caused by: org.eclipse.swt.SWTError: No more handles [gtk_init_check() failed]
    at org.eclipse.swt.SWT.error(SWT.java:4517)
    at org.eclipse.swt.widgets.Display.createDisplay(Display.java:908)


org.yakindu.sct.generator.genmodel.ui.SGenExecutableExtensionFactory
    at org.eclipse.swt.SWT.error(SWT.java:4517)
    at org.eclipse.swt.widgets.Display.createDisplay(Display.java:908) 

Running UI Tests

After analyzing the log files, it turns out that our UI-dependent tests are not able to create an SWT Display object. Luckily, Travis allows the use of the virtual framebuffer xvfb for UI tests. It can be started as follows:

language: java
env:
  global:
  - DISPLAY=:99.0

before_install:
- sh -e /etc/init.d/xvfb start
- sleep 10

script:
- cd releng/org.yakindu.sct.releng
- mvn clean verify

After pushing these changes to Git, the build finished successfully!

User Interface

Travis provides a simple yet powerful web-based user interface that displays a list of all your repository builds and the console output. This is really helpful to see what is going wrong with your build. It also allows you to start a build without pushing anything to your repository.
What I really miss in the UI is a better integration for unit tests, like the Jenkins JUnit plugin. Unfortunately, such a feature is not planned, and one has to browse the (huge) log files for failed unit tests.

GitHub Integration

By default, the master branch as well as all pull requests are built. The integration with GitHub works out of the box: the running CI jobs are shown in the GitHub UI as checks and updated automatically when a job finishes. This is really awesome for a configuration file of only 9 lines!


Caching and container-based infrastructure

Every time a new build job starts, a new Linux image is set up. To prevent Maven from downloading the internet every time a build starts, Travis provides a container-based infrastructure that allows the use of caches. One drawback of this is that the use of the sudo command is prohibited.

sudo: false
language: java
cache:
  directories:
  - $HOME/.m2

env:
  global:
  - DISPLAY=:99.0

before_install:
- sh -e /etc/init.d/xvfb start
- sleep 10

script:
- cd releng/org.yakindu.sct.releng
- mvn clean verify

As shown above, starting the .yml configuration file with sudo: false allows the use of caches. When the build starts and a new Linux image is created, all cached directories (in our example the Maven repository .m2) are restored at the beginning. This really speeds up the build.

Performance

Our existing Jenkins CI server is hosted on a Linux KVM virtual machine with a 12-core 2.6 GHz Opteron, 32 GB RAM and RAID 10 HDDs. The average build time is about 13-20 minutes.
Travis, running somewhere on Amazon cloud services, took about 10-11 minutes for a build. I don't know how they do it, but this is incredibly fast!


Adding google test framework for C and C++

Since Yakindu Statechart Tools ships with a C and a C++ code generator, we have a couple of C and C++ tests that have to run during the build. This was the not-so-funny part of the configuration, because Google recently decided to ship libgtest as source code only, without the static libraries. On the Linux machine running Jenkins, it was straightforward to install Google Test, compile it and copy the static library to /usr/lib with the following commands:

sudo apt-get install libgtest-dev
cd /usr/src/gtest
sudo cmake CMakeLists.txt
sudo make
sudo cp *.a /usr/lib

Running these commands at the beginning of the Travis build would be possible, but since we want to use caching we are not allowed to use sudo, as explained above. Fortunately, there is an apt-get whitelist that allows the installation of external libraries via addons.apt.packages:

sudo: false
language: java
addons:
  apt:
    packages:
    - libgtest-dev

...

Now the GTest source code is available on the image, but it still has to be compiled. The (somewhat hacky) solution is to copy the source into the build dir, compile it and create an environment variable GTEST_DIR that is used in our C tests for the library lookup. Copying to /usr/lib did not work, because without sudo the permission is denied.
env:
  global:
  - GTEST_DIR=${TRAVIS_BUILD_DIR}/gtest

before_script:
- mkdir gtest
- cd gtest
- cp -r /usr/src/gtest/. .
- ls
- cmake CMakeLists.txt
- make

...

At least this works; if you know a better solution, please let me know :)
  
Publish Releases

Last but not least, we want to publish successful release builds to GitHub Releases automatically. To publish artifacts to GitHub Releases, a secret api_key is required for Travis to get access. This api_key should be encrypted, especially when the .travis.yml file is part of an open source project. ;-)
To set up the deploy section of the configuration, you need to install the Travis Ruby gem. First, download Ruby version 2.0.0 (I tried to install the Travis gem with a newer version, but it did not work) and install the gem as described here. If everything works properly, run the following command in the root folder of your Git repository:

travis setup releases

This will add a basic deployment section to your .travis.yml file. Do not forget to add skip_cleanup: true, otherwise the build artifacts you want to release get deleted before the deployment step. In our case, a release is indicated by a tag that follows the naming convention release.x.x.x, and we only want those tags to be published automatically, so we added a condition to the deployment section.

...

deploy:
  skip_cleanup: true
  provider: releases
  api_key:
    secure: BSEYtMYXInrXum0eO........
  file: releng/org.yakindu.sct.repository/target/updatesite.zip
  on:
    repo: Yakindu/statecharts
    tags: true
    condition: "$TRAVIS_TAG =~ ^release.*$"

Now, every time Travis sees a release tag, it will automatically publish the release to GitHub after a successful build.

Conclusion: Travis provides an incredibly powerful and customizable infrastructure to open source projects for free. Thank you very much for that and keep up the good work!




Call for Submissions: Modeling Symposium @ EclipseCon Europe 2015

by Maximilian Koegel and Jonas Helming at August 19, 2015 10:05 AM

We are happy to announce that Philip, Ed and I are organizing the Modeling Symposium for EclipseCon Europe 2015. Please support us by sharing the call on your communication channels.

The symposium aims to provide a forum for community members to present a brief overview of their work. We offer 10-minute lightning slots (including questions) to facilitate a broad range of speakers. The primary goal is to introduce interesting new technology and features. We are mainly targeting projects that are not otherwise represented in the conference program.

If you are interested in giving a talk, please send a short description (a few sentences) to jhelming@eclipsesource.com. Depending on the number of submissions, we might have to make a selection. The submission deadline is September 31st.

Please adhere to the following guidelines:

  • Please provide sufficient context. Talks should start with a concise overview of what the presenter plans to demonstrate, or what a certain
    framework offers. Even more important, explain how and why the topic is relevant.
  • Don’t bore us! Get to the point quickly. You don’t have to use all your allocated time. An interesting 3 minute talk will have a bigger
    impact than a boring 10 minute talk. We encourage you to plan for a 5 minute talk, leaving room for 5 minutes of discussion.
  • Keep it short and sweet, and focus on the most important aspects. A conference offers the advantage of getting in contact with people who
    are interested in your work. So consider the talk more as a teaser to prompt follow-up conversations than a forum to demonstrate or discuss technical details in depth.
  • A demo is worth a thousand slides. We prefer to see how your stuff works rather than be told about how it works with illustrative slides. Please restrict the slides to summarize your introduction or conclusion.

Looking forward to your submissions!



Tagged with eclipse, eclipsecon, emf, modeling



EclipseCon Europe 2015 from an Automotive Perspective

by Andreas Graf at August 19, 2015 08:54 AM

As Eclipse is established as a tooling platform in the automotive industry, the EclipseCon Europe conference in Ludwigsburg is an invaluable source of information. This year's EclipseCon is full of interesting talks. Here is a selection from my "automotive tooling / methodology" perspective.

  • The Car – Just Another Thing in the Internet of Things?

    Michael Würtenberger is the head of BMW CarIT. In addition to the interesting talk, BMW CarIT was also one of the strong drivers of the Eclipse/EMF-based AUTOSAR tooling platform "Artop", and it is very interesting to learn about their topics.

  • openETCS – Eclipse in the Rail Domain, 

    Rover Use Case, Specification, design and implementation using Polarsys Tools

    Scenarios@run.time – Modeling, Analyzing, and Executing Specifications of Distributed Systems

    The multiple facets of the PBS (Product Breakdown Structure)

    openMDM5: From a fat client to a scalable, omni-channel architecture

    Enhanced Project Management for Embedded C/C++ Programming using Software Components

    All industries are also looking into what the "other" industries are doing, to learn about strengths and weaknesses.

  • Brace Yourself! With Long Term Support: Long-term support is an important issue when choosing a technology as a strategic platform. This talk should provide information to use in discussions when arguing for/against Eclipse and Open Source.

  • The Eclipse Way: Learn and understand how Eclipse manages to develop such a strong ecosystem

  • My experience as an Eclipse contributor:
    As more and more companies from the automotive domain actively participate in Open Source, you will learn on what that means from a contributor’s view.

On the technical side, a lot of talks are of interest for automotive tooling, including:

  • Ecore Editor- Reloaded

    CDO’s New Clothes

    GEF4 – Sightseeing Mars
    All Sirius Talks
    All Xtext Talks
    Modeling Symposium

    Customizable Automatic Layout for Complex Diagrams Is Coming to Eclipse

    Since EMF is a major factor in Eclipse in automotive, these talks provide interesting information on modeling automotive data.

  • EMF Compare + EGit = Seamless Collaborative Modeling

    News from Git in Eclipse

    Tailor-made model comparison: how to customize EMF Compare for your modeling language

    Storing models in files has many advantages over storing them in a database. These technologies help in setting up a model management infrastructure that satisfies a lot of requirements.

  • 40 features/400 plugins: Operating a build pipeline with high-frequently updating P2 repositories

    POM-less Tycho builds

    Oomph: Eclipse the Way You Want It

    High productivity development with Eclipse and Java 8
    Docker Beginners Tutorial

    These will introduce the latest information on build management and development tools.

Of course there are a lot more talks at EclipseCon that are also of interest. Checkout the program. It is very interesting this year.


by Andreas Graf at August 19, 2015 08:54 AM

Eclipse Neon (Eclipse 4.6) will require a Java 8 runtime

by Lars Vogel at August 18, 2015 11:41 AM

I’m personally happy that the next Eclipse release will require Java 1.8 to run. See official email to the cross mailing list for the public announcement of this.

Several Eclipse projects like m2e and Jetty have already moved to Java 8. This move allows us in the Eclipse platform to use the improved Java 8 APIs to modernize and optimize our code base, and will hopefully make the Eclipse project even more interesting for potential Eclipse contributors.

After all, who wants to spend their unpaid time working with an outdated Java version?


by Lars Vogel at August 18, 2015 11:41 AM

AQL - a new interpreter for Sirius

by Cédric Brun (cedric.brun@obeo.fr) at August 18, 2015 12:00 AM

TL;DR: we’ve been working on a new query interpreter for Sirius which is small, simple, fast, extensible, and brings richer validation. It has been released for early adopters with Sirius 3.0 but will be the recommended interpreter for Sirius 3.1 in October. The MTL interpreter ([/]) will be deprecated at some point, the timing depending on how fast the community adopts the new aql: interpreter.

Background and motivation

One of the key factors making Sirius so flexible is the ability to rely on queries when defining your graphical mappings. Every configurable field rendered with a yellow background in the tooling specification editor can be set either with a literal value or with a query which will be interpreted at runtime.

Sirius can be extended with new query interpreters through Eclipse plugins, each having its own prefix.

AQL's Code completion within the .odesign editor

Some interpreters are available by default notably feature:, var:, or service: which are direct access either to a model element feature, a context variable or a Java service. These interpreters have the tiniest overhead you can think of.
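For illustration, expressions using these default prefixes might look like the following when typed into a yellow field of the .odesign editor (the feature, variable, and service names here are hypothetical, not taken from a real specification):

```
feature:name
var:containerView
service:computeLabel
```

Each prefix selects its interpreter directly: feature: reads the named structural feature of the current element, var: reads a variable from the evaluation context, and service: invokes a Java method contributed by the specifier.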

For everything else, [/] is the reference implementation: an OCL variant used in Acceleo 3.x and based on the MOFM2T OMG standard.

A specific plugin brings the <%%> syntax, which was the first interpreter supported by Sirius (long before Sirius was even named ‘Sirius’) and uses Acceleo 2 behind the scenes. The Acceleo 2 syntax is a mix of XPath and OCL; it was deprecated long ago (and as such has to be installed through a specific update site), but you might still find it in .odesign models which predate Sirius being contributed to Eclipse.

Whether <%%>, [/], or even the “forever experimental” ocl:, every interpreter was implemented by integrating a pre-existing language and trying to make it cope with the dynamic nature of Sirius: during a diagram refresh, a given query might be evaluated thousands of times, each time on different EObjects which do not necessarily share a common type, and each time with different variable values.

This worked, but it sometimes led to fairly complex code just to integrate things. To make the MTL interpreter ([/]) happy enough to evaluate a query, we had to create templates in memory keeping the parameter signatures, compare the signatures with the new context on each variable change, and evict those templates at some point.

Fairly complex code generally means subtle bugs, sub-optimal performance and/or memory usage.

The interpreter is a cornerstone of the flexibility provided by Sirius, and it is also a key player in the performance you’ll get in the end. As heavy users of Sirius who were painfully migrating from <%%> to [/], we quickly noticed that we would be better off with an implementation specifically tailored for Sirius, and that it would come at a fraction of the cost of all those migrations.

A new interpreter

With Sirius 3.0 we started a new interpreter implementation with the goal of being a perfect fit for Sirius:

  • Support static and dynamic Ecore models, no compilation phase required
  • the least possible overhead at evaluation time for an interpreted language: the evaluation actually goes forward and will not even try to validate or compile the expressions. Errors are tracked and captured along the way.
  • strong validation: types are checked at validation time (in the .odesign editor) and metamodels are analyzed to do some basic type inference and avoid false positive errors.
  • union types: in the context of Sirius, a variable in a given query has N potential types, with N often greater than 1. We needed an interpreter embracing this fact, not falling back to EObject as soon as there is more than one type.
  • a straightforward implementation easily extensible with Java classes providing extension methods.
  • a very narrow dependency surface: only the very central parts of EMF, Guava and Antlr so that we could easily deploy it server-side or in standalone scenarios.

Proof of Concept

Last summer, the proof-of-concept phase quickly demonstrated that we would not be able to be 100% compatible with the [/] implementation without inheriting its limits in the context of Sirius.

Here is a specific example:

In MTL, [name/] is a valid expression, though depending on the available variables it might mean “the variable named name” or “the name attribute of the self object”.
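Spelled out, the two readings correspond to two different explicit expressions (illustrative MTL fragments):

```
[name/]
[self.name/]
```

The first resolves to a variable named name when one is in scope, and only falls back to self.name otherwise; the second always reads the name attribute of self.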

This is a useful feature for a template language when 99% of the templates have no parameters besides self: it cuts the clutter and leads to a template which is more readable:

An Acceleo Template

But in the context of an expression defined in a .odesign model, there is always more than one available variable: the selected object, the text value entered by the user, the possible sources or targets of an edge, and so on.

Using the implicit self actually makes your .odesign file way harder to maintain and understand.

Furthermore, from an implementation point of view, “implicit self” induces runtime overhead: when a variable is set or gets a new value, the AST previously parsed from a String has to be invalidated. Again, in the context of Sirius, variable values and types change a lot and do not necessarily share a common type besides EObject (this allows us to mix Ecore-based instances and UML ones in a single editor, for instance).

All these factors led to a runtime which might reparse and re-link a query just because a variable changed its value to a completely new type.

That is just one example (among several others) that led us to reconsider those choices, at the cost of a language which would not be 100% compatible. As such, both interpreters will co-exist, but the benefits the new one brings should be strong enough to keep the transition period reasonable.

Introducing AQL: Acceleo Query Language

As languages, AQL and MTL are very close yet there are some notable differences:

  • there is no implicit variable reference
  • auto-collect and auto-flatten: there is no such thing as a list of lists in AQL.
  • type literals can’t be written in the form somePackage::someSubPackage::SomeType; instead, someSubPackage::SomeType should be used directly
  • you can only have Lists or Sets as collections, and the order of their elements is always deterministic.
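A few aql: expressions over the Ecore metamodel illustrate these rules; the queries are sketches for illustration, not taken from a real .odesign:

```
aql:self.eContents()->filter(ecore::EClass)
aql:self.eClassifiers.eStructuralFeatures
```

The first names its receiver explicitly (no implicit self) and uses a single-qualified type literal, ecore::EClass, rather than a doubly nested package path; the second navigates over a collection and, thanks to auto-collect and auto-flatten, yields one flat list of features rather than a list of lists.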

From an implementation point of view AQL:

  • is small: 100 non-generated Java files
  • works with both dynamic or generated Ecore models.
  • is fast at runtime with minimal overhead
  • is smart at validation time by considering the union types in its analysis
  • is easy to re-use, extend and integrate in other contexts.

AQL is a pure Java library, org.eclipse.acceleo.query.jar, and is part of the Acceleo project. Here are its dependencies:

AQL dependencies

Try it!

It is meant to be reused and has already seen several adoptions at Obeo.

If the Sirius use-case matches yours, give it a try and tell us what you think!

With Sirius 3.0, a first version of the AQL interpreter has been released as experimental so that early adopters can work with it.

Install AQL experimental support in Sirius 3.0

EcoreTools was the earliest adopter and migrated with the Mars release. UML Designer has also been using AQL on the master branch since July, and more migrations will follow in the upcoming months.

First benchmarks

We started to measure performance and overhead compared to the other interpreters; even though performance was one of the important factors, no specific effort had yet been made besides keeping the implementation small and straightforward.

Here is the first benchmark we made to compare the overhead of the different interpreters. Projects to reproduce these measures are published on GitHub if you feel like giving it a shot.

Benchmarking Environment

The benchmark is composed of synchronized diagram descriptions, one for each interpreter. They are strictly equivalent from the end-user point of view.

  • [/] uses the Acceleo MTL interpreter
  • <%%> uses the Acceleo 2/legacy interpreter
  • AQL uses the AQL interpreter
  • service: uses the service: interpreter, which dispatches to a Java method defined by the specifier
  • feature: uses only direct access to attributes and references, or hardcoded logic (for eAllContents, for instance)
  • “Service over AQL” uses the same Java services as service: but dispatches them through aql:, for comparison

I created a diagram instance for every diagram description on top of a simple Ecore model; the corresponding diagrams have 3267 elements (nodes, edges, list items). After a “warmup” phase, I triggered an explicit refresh of the diagram while profiling with YourKit in tracing (non-adaptive) mode.

Then I split the method call tree to break down the time spent into three categories:

  • calling external code/eAllContents: ‘external’ here means EMF. This is the time spent doing eGet() or eAllContents(), for instance.
  • managing variables: during a single refresh the interpreter state changes thousands of times; this is the time spent managing these changes.
  • interpreter dispatch: the CPU time used by the interpreter to decide what to call (eGet, eAllContents, a Java service…).

All these numbers are normalized, with 100% being the total time of a given refresh, which in this case was in the order of 500 ms to 1 s.

Sirius 3.0 Interpreters Overhead

A few small things you can notice already:

  • We are focusing on at most 40% of the refresh time: the 200 to 500 ms range.
  • <%%> and [/] tend to have a bigger overhead managing variables. That is because their implementations eagerly create data structures each time a new value is set.
  • <%%> spends less time in eGet/eAllContents. That is because this 10-year-old implementation benefits from an eAllContents() algorithm which prunes subtrees based on a small analysis of the accessible metamodels.

The main thing you should notice: there are not many reasons to prefer anything other than AQL. Indeed, feature: will always be faster if what you need is direct access to an attribute or reference, but AQL has the same overhead as direct service calls and gives you better analysis and validation capabilities in your .odesign.
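For a plain attribute access, that trade-off looks like this (assuming a hypothetical name attribute on the current element):

```
feature:name
aql:self.name
```

Both return the same value; feature: is a direct eGet with the smallest possible overhead, while the aql: form costs a dispatch but can be checked by the validation in the .odesign editor.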

Plan for Sirius 3.1 (to be released in mid-October)

Since the Sirius 3.0 release we have been ramping up:

  • migrating more Sirius-based products owned by Obeo and creating helper tools to assist those migrations.
  • ironing out bugs, streamlining things.
  • documenting the language and the standard services.
  • implementing missing services.
  • making it smarter at runtime: AQL will now prune model subtrees if an analysis shows that a subtype can’t possibly exist in that branch, which makes AQL even faster.
  • making it smarter at validation time: with static analysis which can consider contextual branches of alternatives.
  • more experiments (stay tuned!)

All of this to make AQL the recommended query language implementation for Sirius 3.1!

AQL - a new interpreter for Sirius was originally published by Cédric Brun at CTO @ Obeo on August 18, 2015.


by Cédric Brun (cedric.brun@obeo.fr) at August 18, 2015 12:00 AM

Introducing the EclipseSource Oomph Profile

by Maximilian Koegel and Jonas Helming at August 17, 2015 12:40 PM

The core strength of the Eclipse IDE has always been its adaptability and extensibility. Even without adding new plug-ins, you can customize almost everything by setting a preference. Additionally, there is a rich ecosystem of plug-ins for almost any imaginable task or activity a developer works on. This ranges from programming language support, SCM integration, and static code analysis to integrations with task and bug trackers. The open architecture of the Eclipse platform and the open source license make it easy to extend and even adapt the IDE to specific needs.

Additionally, if something is really still missing, there is a useful support system to help you develop your own plug-ins to extend Eclipse and make it even greater. Hundreds of projects and developers have done so over the last 15 years. However, this power and variety also create a problem: developers – especially developers new to Eclipse – are easily overwhelmed with choice. The Eclipse Packaging Project (EPP) does a great job of creating pre-defined Eclipse distributions. However, if you are just looking for a simple IDE suited to your use case (whatever that may be), without missing your favorite plug-ins and settings, you typically NEED to adapt the default downloads. Some great tools are not (yet) hosted at Eclipse and cannot be part of an EPP download. Some default settings have been the subject of endless discussions, so it is hard to change them for everyone. While it is always possible to change and adapt your Eclipse, it is pretty cumbersome to repeat this for every developer and every new version.

However, Eclipse would not be Eclipse, if the community did not come up with a great solution to this problem. The Oomph project provides the technology to define profiles, which describe an Eclipse IDE instance as you want it to be. This includes installed plug-ins, but also settings. Oomph even allows you to describe project specifics, such as repositories and projects, to be checked out.

However, even when using Oomph, you still have to make quite a few decisions about the plug-ins to be used and about the standard setting for your IDE. To make your life easier, we at EclipseSource would like to share our experience with the Eclipse IDE by sharing our standard settings and installed plug-ins with you. Of course, we sometimes use different settings for different projects, so this is kind of the common denominator and our starting point for new projects. To make this more useful, we will also describe the settings we apply and the plug-ins we install in a blog series.

To make our standard set-up re-usable by everyone, we created an Oomph profile. With this profile, you can install an Eclipse IDE with all plug-ins and settings with a single click. We will also update this profile, in case there are new things or settings to be adapted, so if you use this profile, you will continuously benefit from our thoughts about the best way of setting up a default IDE. Even if the profile does not fit your needs 100%, you can still use it as a basis and add your own user-specific customizations to it.

In this first blog post, we will describe how to use our profile with Oomph to get a pre-configured Eclipse IDE. In the following weeks, we will describe the plug-ins we included, as well as the settings we adapted, in more detail.

Installation

Our common profile (we call it EclipseSource Oomph Profile) is hosted at GitHub. To use it, you have to add it to Oomph. Subsequently, it can be used to install the pre-configured Eclipse Profile we use and also to retrieve updates on it. To use the profile, you need to complete the following steps once:

  1. Download and install the Oomph Installer.
  2. After launching Oomph, you have to switch to the “advanced mode” in the top right corner.
  3. Select the “+” sign next to the filter dialog on top and add the following URL: https://raw.githubusercontent.com/eclipsesource/oomph/master/EclipseSource.setup
  4. Check the JVM and the bundle pool location. The bundle pool is the location where all bundles will be downloaded to and shared among all Oomph-provisioned Eclipse instances. Click “next”
  5. Skip the next page (project specific setups) and just click next
  6. Fill out the variables. I recommend:
    – Install into a uniquely named folder within the root folder
    – Installation Folder name: either the name of the project you want this Eclipse for or, if you share your Eclipse installation, just “EclipseSourceProfile”
    – Root Installation Folder: this will be used again in case you want to have a second Eclipse instance
    – Workspace: as you wish
  7. Click next and finish

 

After clicking on finish, Oomph will download all necessary bundles, apply the adapted default settings, and start the IDE. From now on, there is no need to use the Oomph installer explicitly, except if you want to create another installation. To start this IDE again, you can simply start the Eclipse launcher from the chosen Installation Folder (on Windows, the eclipse.exe file).

If you want to install a second, independent Eclipse instance, just repeat all steps but choose a different “Installation folder name” in step 6. You will also notice that installing a second instance is a lot faster, thanks to the bundle pool sharing bundles among Eclipse instances.

Now you can start to use the IDE and see whether our set of default plug-ins and settings fits your needs. If you miss something, you can install it at any time using the default mechanism of Eclipse (Help => Install New Software). It is also possible to inherit from our profile and add things permanently; I will describe this in a future blog post. Finally, if you miss something you think is relevant for many users, please get in contact with us, as we might want to add it to the default profile. You can report issues and feedback using this GitHub project.

If you change a preference, a window will ask you whether you want to record that change. If you do, it will be included in your future installations, but only yours. Again, if you think a certain setting might be interesting for the majority of users, please get in touch with us so we might add it to the original profile.

In the following weeks, I will describe our default set of bundles and settings in more detail. For an overview, I will link those posts here. I will also post all updates on the profile here, so please follow us to be notified about them.

Until then, have fun using the EclipseSource Oomph Profile!



Leave a Comment. Tagged with eclipse, egit, git, IDE, oomph, pde


by Maximilian Koegel and Jonas Helming at August 17, 2015 12:40 PM