
The “Next” Eclipse CDT

by Doug Schaefer at April 23, 2019 05:27 PM

CDT Summit at EclipseCon Europe 2016

In my last article, I walked through a bit of the history of the Eclipse CDT project. It’s a great story of the trials and tribulations of community building, working with a group of like-minded platform vendors, even competitors, who just wanted a great IDE and who knew the only way to get that was to work together.

What sticks with me most looking back on my umpteen years on the project is the people. Each of these vendors was led by passionate tools developers who put aside any hint of corporate agenda and just wanted to work with other developers from around the world, to solve hard problems, and to build something they would be proud of. Through the semi-regular face-to-face CDT Summits and the CDT days at past EclipseCons, we were able to build relationships and become great friends. Even after some have moved on, I still remain in contact with many of them and we still share a laugh or two, and even the odd beer!

So I offer no apologies if, as participation has dwindled on the project in recent years, I yearn for those days. But my personal enjoyment working in this environment is one thing; I don’t think the job is done. The C++ language continues to evolve and become more usable and powerful than ever before. Even C continues to receive updates. And more importantly, the IDE landscape is changing and we need to keep up to keep our users and customers happy.

Over the years, I have always had to deal with users, and even colleagues, who are hard-core engineers and refuse to give up their text editors and custom bash scripts. In the end, it’s been hard to argue with them. For these folks, IDEs don’t work the way they want to work. And, initially at least, an IDE does slow them down: they fight the paradigm, end up frustrated, and toss it aside, losing all those gains we are trying to give them.

But in the end, I guess it was only a matter of time before an IDE platform would come along and offer the best of both worlds: be a great unopinionated text editor, facilitate hooking up users’ bash scripts, and be super extensible to add in little accelerators. It’s been brewing for a while. Sublime was the first such thing I saw developers really enjoy. And now, today, we have Visual Studio Code, which is becoming the dominant editor platform.

And funnily enough, an interesting thing happens when you take a bunch of IDE developers and tell them to build a great editor: they end up building an editor so extensible that it slowly becomes an IDE. The VS Code team, many of them former Eclipsers, has managed to create a mechanism to add powerful language services to the editor. And, of all things, to add support for debuggers. OK, once you add debug support, you’re no longer an editor, you are an IDE. Stop pretending :).

But even greater, the VS Code team has done this by creating standardized APIs and protocols that other IDEs can use. You can build common services and then plug them into your favorite IDE, even Eclipse!

With this new architecture, the path forward for the Eclipse CDT project is clear. We need to take all that C/C++ IDE knowledge we’ve built over the years and use it to build common components that can be used by other IDE platforms. It means growing beyond our Eclipse IDE roots to give users that great experience in the IDE that works the way they want to work.

We as the CDT community have a lot to offer, and there really isn’t a community like it that focuses on the needs of the C/C++ developer. The challenge will be to harness that energy again, to work together for the greater good that we can only accomplish together. It really is an exciting time to be an IDE developer, and Eclipse is a great home for us to come together.

In my next article I will start getting into the technical meat of what this new vision entails and what fun we can have building it! Time to roll up our sleeves and get to work.


by Doug Schaefer at April 23, 2019 05:27 PM


The 2019 IoT Developer Survey Results are Live

by Mike Milinkovich at April 17, 2019 11:00 AM

After months of hard work, the 2019 IoT Developer Survey results are live today. This year marks the fifth year the Eclipse IoT Working Group has asked the global IoT developer community to share their perceptions, requirements, and priorities. The survey has proven to be an influential assessment of the IoT market as viewed from the development front lines. Access the full findings of the 2019 IoT Developer Survey here.

Over 1,700 individuals took the survey between February and March 2019. Just like in previous years (see results from 2018, 2017, and earlier here), Eclipse IoT collaborated with key IoT ecosystem players like Bosch and Red Hat to maximize the reach of the survey.

The key findings this year include the following:

  • IoT drives real-world, commercial outcomes today. 65% of respondents are currently working on IoT projects professionally or will be in the next 18 months.
  • IoT developers mostly use C, C++, Java, JavaScript, and Python.
  • AWS, Azure, and GCP are the leading IoT cloud platforms.
  • The top three industry focus areas remain the same as last year: IoT Platforms, Home Automation, and Industrial Automation / IIoT.
  • MQTT remains the dominant IoT communication protocol leveraged by developers.
  • The Eclipse Desktop IDE is the leading IDE for building IoT applications.

IoT gets real(er)

Consistent with our findings last year, two-thirds of the respondents to our survey develop and deploy IoT solutions today or will be doing so within 18 months. This continued focus on building and deploying real-world solutions is reflected in the increases in developers’ focus on performance, connectivity, and standards shown in the survey.

C and Java dominate

C won out as the programming language of choice for constrained devices, while Java was most popular for gateways/edge nodes and the IoT cloud. Neither of those findings is surprising. C and C++ have long been the languages of choice for small embedded systems, where minimizing memory space and power consumption and maximizing processor utilization are key. Java is the dominant language and platform where the memory and processing resources are larger and the complexity of the systems is greater. In particular, Java is the language of choice for most cloud infrastructure projects, so seeing it lead in the IoT cloud is consistent with that.

AWS, Azure, and Google hold on to the lead

As expected, AWS, Azure, and GCP maintain their status as the leading IoT cloud platforms. The list of three and their rankings are entirely consistent with their relative weights in the cloud computing marketplace as a whole, so there is no surprise to see this reflected in our results.

A continued focus on platforms, home automation, and IIoT

IoT Platforms (34%), Home Automation (27%), and Industrial Automation / IIoT (26%) were the respondents’ three most common industry focus areas. These areas are likely to continue to be key targets for IoT developer activity. The fact that IoT Platforms is consistently, year after year, the number one focus for IoT developers is interesting. It implies that enterprises and industrials are putting resources into building their own IoT platforms for use by their companies. To me this suggests that industrial IoT is going to be a huge opportunity for hybrid cloud, as companies build and run IoT solutions on premises using modern, open technologies.

Security is (still) top of mind

Security is still the top concern for IoT developers. Communication Security (38%), Data Encryption (38%), and JSON Web Tokens (JWTs) (26%) were the top three security technologies cited in the survey, with virtualization also starting to play a stronger role in IoT security.

MQTT is still the dominant IoT communication protocol

HTTP (49%), MQTT (42%), and WebSockets (26%) were the top three communication protocols used by IoT developers. The growth in MQTT adoption over the past seven or eight years has been phenomenal, and I like to think that the Eclipse IoT community, with its Eclipse Paho and Eclipse Mosquitto projects, had a small part to play in that. Having robust open source implementations available has certainly been part of MQTT’s success. Looking forward, the main challenge we see for further MQTT adoption is the lack of interoperability built into the protocol. While MQTT is a great lightweight, low-latency protocol, it does not provide any guidance on the topic structures and payload definitions used by any device or application. This means that no two teams using MQTT would expect their systems to be able to reliably exchange data. The Eclipse Tahu project defines the Sparkplug protocol, created by Arlen Nipper, one of the co-inventors of MQTT itself. Sparkplug defines the topic structures and payload definitions necessary for out-of-the-box interoperability of SCADA systems. We are hopeful that Sparkplug could spur MQTT to even greater adoption in industrial IoT use cases.
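
To make the interoperability point concrete, here is a minimal sketch using the Eclipse Paho Java client. The broker URL, client ID, topic layout, and JSON payload are illustrative choices, and that is precisely the point: plain MQTT leaves all of them to the application, which is the gap Sparkplug closes.

import org.eclipse.paho.client.mqttv3.MqttClient;
import org.eclipse.paho.client.mqttv3.MqttException;
import org.eclipse.paho.client.mqttv3.MqttMessage;

public class TelemetryPublisher {
    public static void main(String[] args) throws MqttException {
        // The broker, topic hierarchy, and payload encoding below are all
        // application-defined; the MQTT protocol itself prescribes none of them.
        MqttClient client = new MqttClient("tcp://broker.example.com:1883", "sensor-42");
        client.connect();

        MqttMessage message = new MqttMessage("{\"temperature\": 21.5}".getBytes());
        message.setQos(1);
        client.publish("plant1/line3/sensor42/temperature", message);

        client.disconnect();
    }
}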

The Eclipse Desktop IDE is the leading IoT IDE

45% of respondents use the Eclipse Desktop IDE. It is not at all surprising that the Eclipse IDE has a strong franchise with IoT developers, given the dominance of C and Java. The Eclipse CDT project has long been hugely important in the embedded software space; for the past decade, CDT has been used by virtually every chip, SoC, and RTOS company as the basis for their toolsets. Those developer solutions also typically use additional tools, such as the Target Management Framework and the Remote Systems Explorer, that were specifically designed with the embedded developer in mind. That, coupled with the Eclipse IDE’s broad use amongst professional Java developers, makes its leadership in IoT clear.

In addition, close to 10% also use Eclipse Che, our community’s next generation cloud-based tooling solution. It really seems part of the future of IoT is in the cloud, one way or another.

Access the full findings of the 2019 IoT Developer Survey here.

Thanks to everyone who took the time to fill out this survey, and thanks again to our Eclipse IoT members for their help with the promotion.

We are very interested in hearing your thoughts and feedback about this year’s findings. And, of course, we are always open to suggestions on how to improve the survey in the future!


by Mike Milinkovich at April 17, 2019 11:00 AM

How to reference UML elements from Xtext DSLs

by Tamas Miklossy (miklossy@itemis.de) at April 16, 2019 02:00 PM

With the Xtext framework, you can build DSL workbenches in just a few steps. However, sometimes you want to reuse model elements already defined in other formats or even in other languages. In this blog post I’m going to demonstrate typical scenarios when you’re reusing model elements belonging to a different language.

Firstly, consider having some pre-defined Eclipse UML2 models, and you want to reference classes of these UML models from your Xtext DSL.

If you are interested in reusing model elements belonging to the same language but defined in different formats, take a look at my previous blog post “Combining EMF models with Xtext DSLs”.

Let's get started with the preparatory steps:

Preparatory steps

  1. Install the latest version of the UML2 Extender SDK and the Xtext Complete SDK of the Eclipse release train.

    (Screenshot: installing the Xtext Complete SDK and the UML2 Extender SDK)

  2. Create the Domainmodel project, based on the Xtext 15 Minutes Tutorial. The meta-model of the Domainmodel project

    (Diagram: the meta-model of the Domainmodel project)

    describes that a domain model consists of certain types (data types and entities), that an entity contains features, and that each feature can have a type. To be able to use UML classes in the feature's type definition, the following modifications are necessary:

Modifications1 in the org.example.domainmodel plug-in

  1. Extend the Domainmodel.xtext grammar definition:
    grammar org.example.domainmodel.Domainmodel with org.eclipse.xtext.common.Terminals
    
    ...
    
    import "http://www.eclipse.org/uml2/5.0.0/UML" as uml
    
    ...
    
    Feature:
    	(many?='many')? name=ID ':' type=[uml::Class|FQN] | type=[Type];
    
    ...

     

  2. Extend the GenerateDomainmodel.mwe2 workflow:
    module org.example.domainmodel.GenerateDomainmodel
    
    import org.eclipse.emf.mwe.utils.*
    import org.eclipse.xtext.xtext.generator.*
    import org.eclipse.xtext.xtext.generator.model.project.*
    
    var rootPath = ".."
    
    Workflow {
    
    	bean = StandaloneSetup {
    		
    		scanClassPath = true
    		platformUri = rootPath
    		
    		uriMap = {
    			from = "platform:/plugin/org.eclipse.emf.codegen.ecore/model/GenModel.genmodel"
    			to = "platform:/resource/org.eclipse.emf.codegen.ecore/model/GenModel.genmodel"
    		}
    		uriMap = {
    			from = "platform:/plugin/org.eclipse.emf.ecore/model/Ecore.genmodel"
    			to = "platform:/resource/org.eclipse.emf.ecore/model/Ecore.genmodel"
    		}
    		uriMap = {
    			from = "platform:/plugin/org.eclipse.uml2.codegen.ecore/model/GenModel.genmodel"
    			to = "platform:/resource/org.eclipse.uml2.codegen.ecore/model/GenModel.genmodel"
    		}
    		uriMap = {
    			from = "platform:/plugin/org.eclipse.uml2.uml/model/UML.genmodel"
    			to = "platform:/resource/org.eclipse.uml2.uml/model/UML.genmodel"
    		}
    		uriMap = {
    			from = "platform:/plugin/org.eclipse.emf.codegen.ecore/model/GenModel.ecore"
    			to = "platform:/resource/org.eclipse.emf.codegen.ecore/model/GenModel.ecore"
    		}
    		uriMap = {
    			from = "platform:/plugin/org.eclipse.emf.ecore/model/Ecore.ecore"
    			to = "platform:/resource/org.eclipse.emf.ecore/model/Ecore.ecore"
    		}
    		uriMap = {
    			from = "platform:/plugin/org.eclipse.uml2.codegen.ecore/model/GenModel.ecore"
    			to = "platform:/resource/org.eclipse.uml2.codegen.ecore/model/GenModel.ecore"
    		}
    		uriMap = {
    			from = "platform:/plugin/org.eclipse.uml2.uml/model/UML.ecore"
    			to = "platform:/resource/org.eclipse.uml2.uml/model/UML.ecore"
    		}
    		uriMap = {
    			from = "platform:/plugin/org.eclipse.uml2.types/model/Types.genmodel"
    			to = "platform:/resource/org.eclipse.uml2.types/model/Types.genmodel"
    		}
    		uriMap = {
    			from = "platform:/plugin/org.eclipse.uml2.types/model/Types.ecore"
    			to = "platform:/resource/org.eclipse.uml2.types/model/Types.ecore"
    		}
    		
    		registerGeneratedEPackage = "org.eclipse.emf.ecore.EcorePackage"
    		registerGeneratedEPackage = "org.eclipse.uml2.uml.UMLPackage"
    		registerGeneratedEPackage = "org.eclipse.uml2.types.TypesPackage"
    		registerGeneratedEPackage = "org.eclipse.emf.codegen.ecore.genmodel.GenModelPackage"
    		registerGeneratedEPackage = "org.eclipse.uml2.codegen.ecore.genmodel.GenModelPackage"
    		registerGenModelFile = "platform:/resource/org.eclipse.emf.ecore/model/Ecore.genmodel"
    		registerGenModelFile = "platform:/resource/org.eclipse.emf.codegen.ecore/model/GenModel.genmodel"
    		registerGenModelFile = "platform:/resource/org.eclipse.uml2.uml/model/UML.genmodel"
    		registerGenModelFile = "platform:/resource/org.eclipse.uml2.codegen.ecore/model/GenModel.genmodel"
    	}
    	
    	component = XtextGenerator {
    		...
    	}
    }

     

  3. Add the following plugins to the Require-Bundle section in the MANIFEST.MF file:
    • org.eclipse.uml2.uml
    • org.eclipse.uml2.codegen.ecore
  4. Add the following classes:

Modifications2 in the org.example.domainmodel.ui plug-in

  1. Add the following classes:
  2. Register DomainmodelActivatorEx as Bundle-Activator in the MANIFEST.MF file.
  3. Add the following plugin to the Require-Bundle section in the MANIFEST.MF:
    • org.eclipse.emf.ecore.editor
  4. Add the following section to the plugin.xml file:
    <!-- register the Xtext UI language services to Xtext's registry -->
    <extension
    	point="org.eclipse.xtext.extension_resourceServiceProvider">
    	<resourceServiceProvider
    		class="org.example.domainmodel.ui.UMLExecutableExtensionFactory:org.eclipse.xtext.ui.resource.generic.EmfResourceUIServiceProvider"
    		uriExtension="uml">
    	</resourceServiceProvider>
    </extension>

     

Manual testing

Start an Eclipse runtime to verify that parsing, linking, content assistant, hovering, hyperlink navigation, quickfixes, etc., are working properly.

(Screenshot: manual testing in the Eclipse runtime)

Automated testing
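
A minimal sketch of what an automated parsing check could look like, assuming the JUnit support and the DomainmodelInjectorProvider generated by the Xtext project wizard (the test class and model content are illustrative only; resolving a reference into a .uml file would additionally require loading that file into the test’s resource set):

import static org.junit.Assert.assertNotNull;
import static org.junit.Assert.assertTrue;

import org.eclipse.xtext.testing.InjectWith;
import org.eclipse.xtext.testing.XtextRunner;
import org.eclipse.xtext.testing.util.ParseHelper;
import org.example.domainmodel.domainmodel.Domainmodel;
import org.example.domainmodel.tests.DomainmodelInjectorProvider;
import org.junit.Test;
import org.junit.runner.RunWith;

import com.google.inject.Inject;

@RunWith(XtextRunner.class)
@InjectWith(DomainmodelInjectorProvider.class)
public class DomainmodelParsingTest {

    @Inject
    private ParseHelper<Domainmodel> parseHelper;

    @Test
    public void parsesSimpleEntity() throws Exception {
        // Parse a minimal model written in the tutorial's Domainmodel syntax;
        // linking a feature against a UML class would additionally require
        // the referenced .uml resource to be part of the same resource set.
        Domainmodel model = parseHelper.parse("entity Person {}");
        assertNotNull(model);
        assertTrue(model.eResource().getErrors().isEmpty());
    }
}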

Conclusion

We have done the preparatory steps, modifications and testing necessary to reuse model elements belonging to a different language. This example has been kept simple on purpose.

If you are interested in more advanced use cases of the Xtext/UML integration, I recommend Karsten’s and Holger’s presentation on “How to build Code Generators for Non-Xtext Models with Xtend”.

Do you have questions or feedback? Feel free to leave me a comment below.

 

1,2 Please note that the blog post “Combining EMF Models with Xtext DSLs” explains the necessary modifications in detail.


by Tamas Miklossy (miklossy@itemis.de) at April 16, 2019 02:00 PM

The Journey to a “New Eclipse CDT”

by Doug Schaefer at April 11, 2019 06:08 PM

The Eclipse CDT project has been around for a long time. The first code commits were in June 2002, when QNX contributed the core C/C++ tooling components of its fresh new Momentics IDE. For the time, it was quite visionary. Platform vendors are more often focused on their platforms and provide tools as an enabler for developers to leverage the power of those platforms. Having every company replicate that effort only leads to poor tools and unhappy developers. It doesn’t scale. Why not join together with like-minded vendors and build something we can all benefit from?

As an added bonus, we end up with a standard platform that allows developers to work with multiple environments on the same platform leading to some interesting partnership opportunities. At the very least developers acquire skills working with that IDE they can take to other jobs or even other projects in the same company. It truly is a win/win for everyone and makes for a very rich ecosystem.

But as with all long-lasting projects, it has had its ebbs and flows. As we started, QNX and my team at Rational Software (later part of IBM) were the main contributors. There were a few others lurking around with a few contributions. The first few monthly CDT conference calls were cool as new voices appeared from around the world. It was a great start that led to the first EclipseCon, where we had people standing in the hall trying to listen in on our overflow birds-of-a-feather session. We were overwhelmed by the interest.

But it didn’t last forever, and the powers that be decided my team at IBM was needed elsewhere in the company. Obviously I wasn’t pleased and certainly wasn’t ready to leave the community. So when Sebastien Marineau, the CDT project lead at the time, suggested I come work for him at QNX, I jumped at the chance. It wasn’t the easiest of moves, but it was my chance to do what I could to help keep the CDT machine rolling, and I ended up getting a good start on my pride and joy, CDT’s super-fast indexer.

At the first CDT Summit, held in the fall of 2005 shortly after I had joined QNX, we had a lot of interest and it was well attended, with over a dozen vendors present. We had some good presentations on what we were working on and some interesting presentations from other vendors on what they’d like to see in the project. Well, someone said the wrong thing and I had to stop the proceedings. Of the 20 or so people in the room, I asked the active contributors to stand up. There were four of us. How were they expecting this little team to do their bidding when the whole idea was to share in the effort and do something great together?

That’s probably the proudest and hardest moment of my time on the CDT. It worked. And from the chart above you can see we had healthy growth and lots of vendors coming to help after that. Those were the best years for the project. Though for me personally, I never really did recover from my lost team at Rational/IBM and ended up moving around a bit, never leaving the CDT project but not really giving it my all.

Luckily and happily, for the last four years I’ve been back at QNX, working on that original Momentics IDE and contributing to Eclipse, trying to make it easier to use for C/C++ developers, including my favorite addition, the Launch Bar. Our customers are happy, the community seems happy, and there are more and more vendors delivering products based on our work in open source.

But also from the chart you can see that participation in the CDT has been on a worrying trajectory, to the point where we once again have about four people actively working on it. Despite being as popular as ever, only a small percentage of the people and companies out there are helping with the common cause. It’s made me sad, and frankly angry. And I must apologize to my friends at last year’s EclipseCon for losing control of that a bit, but when you have so many companies leveraging something you are passionately doing for free for them, thanks to my sympathetic employer, you feel taken advantage of.

But as the leader of the project, it’s up to me with the help of my open source colleagues to find a way to turn that around. And we have some ideas that I’m very excited about. My next few posts will talk about that and go into details of some very cool new efforts underway in the CDT project. The IDE world has changed but it’s needed more than ever and we are ready to adapt. We really hope you can join us and make it great, again…


by Doug Schaefer at April 11, 2019 06:08 PM

Automated rename refactoring in N4JS IDE

by n4js dev (noreply@blogger.com) at April 09, 2019 11:48 AM

Refactoring is probably one of the most important tools for us software developers, since we constantly need to change the structure of our code to improve its quality or to prepare it for new features. The most used refactoring operation is arguably rename refactoring. Find and replace could be used for renaming, but the risk of renaming unrelated names is pretty high.

The N4JS IDE provides a powerful way of automatically renaming a definition and all its references, with a user experience comparable to the rename refactoring of the Eclipse Java Development Tools (JDT). The slogan is: I want to rename this thing, do it for me however you like, but please in a safe manner so that I can move on! This is exactly your experience with rename refactoring in the N4JS IDE.

Simple rename example

Let’s have a look at a simple example to see how rename refactoring works in the N4JS IDE in action. Assume that we have an N4JS file with the following content.


When the cursor is at A in the constructor call new A() and we press Cmd + Shift + R to rename A to B, the rename refactoring reports that it would rename A to B at 3 different locations. After entering the new name B and pressing enter, the class A and all its usages are renamed to B, fully automatically :-)

Name conflict detection

Renaming an element may cause name conflicts. The rename refactoring in N4JS IDE provides comprehensive checks for detecting name conflicts. If the new name would cause name conflicts, the rename refactoring is disallowed.

In the example above, renaming class A to class C would cause a name conflict because the name C already exists in the script scope. The rename refactoring provided by the N4JS IDE recognizes this conflict and shows an error message.



In a large code base, these checks are a true life saver. Imagine having to manually verify these kinds of name conflicts across hundreds of files.

Additionally, the N4JS IDE’s rename refactoring is capable of recognizing name conflicts when renaming

  • members of a classifier
  • formal parameters of a function or method
  • fields of a structural type
  • enum literals
  • local variables and constants
  • global variables and constants
  • etc.

Rename composed members

The N4JS language supports composed elements. Renaming a composed element is somewhat special.




In this example, ab.foo is a composed member because ab is of the intersection type A & B which is composed of both A and B. Renaming ab.foo would rename all the definitions that contribute to the creation of ab.foo as well as all references of these definitions.

Preview of changes

When you start a rename refactoring operation, you have the possibility to preview the changes before actually executing the operation.



Note that the preview shows the changes in each file in a very recognizable manner.

Undo changes

After the rename refactoring, if you have second thoughts and would like to undo the operation, simply press Cmd + Z. This will undo all the changes previously made by the rename refactoring in the affected files.

Current limitations

At the time of this writing, the rename refactoring in the N4JS IDE still has several limitations:
  • Renaming aliases is not supported
  • Name conflict checking does not take shadowing into account

By Minh Quang Tran

by n4js dev (noreply@blogger.com) at April 09, 2019 11:48 AM

Specification Scope in Jakarta EE

by waynebeaton at April 08, 2019 02:56 PM

With the Eclipse Foundation Specification Process (EFSP), a single open source specification project has a dedicated project team of committers to create and maintain one or more specifications. The cycle of creation and maintenance extends across multiple versions of the specification, and so while individual members may come and go, the team remains, and it is that team that is responsible for every version of that specification that is created.

The first step in managing how intellectual property rights flow through a specification is to define the range of the work encompassed by the specification. Per the Eclipse Intellectual Property Policy, this range of work (referred to as the scope) needs to be well-defined and captured. Once defined, the scope is effectively locked down: changes to the scope are possible but rare, must be carefully managed, and require approval from the Jakarta EE Working Group’s Specification Committee.

Regarding scope, the EFSP states:

Among other things, the Scope of a Specification Project is intended to inform companies and individuals so they can determine whether or not to contribute to the Specification. Since a change in Scope may change the nature of the contribution to the project, a change to a Specification Project’s Scope must be approved by a Super-majority of the Specification Committee.

As a general rule, a scope statement should not be too precise. Rather, it should describe the intention of the specification in broad terms. Think of the scope statement as an executive summary or “elevator pitch”.

Elevator pitch: You have fifteen seconds before the elevator doors open on your floor; tell me about the problem your specification addresses.

The scope statement must answer the question: what does an implementation of this specification do? The scope statement must be aspirational rather than attempt to capture any particular state at any particular point-in-time. A scope statement must not focus on the work planned for any particular version of the specification, but rather, define the problem space that the specification is intended to address.

For example:

Jakarta Batch describes a means for executing and managing batch processes in Jakarta EE applications.

and:

Jakarta Message Service describes a means for Jakarta EE applications to create, send, and receive messages via loosely coupled, reliable asynchronous communication services.

For the scope statement, you can assume that the reader has a rudimentary understanding of the field. It’s reasonable, for example, to expect the reader to understand what “batch processing” means.

I should note that the two examples presented above are just examples of form. I’m pretty sure that they make sense, but defer to the project teams to work with their communities to sort out the final form.

The scope is “sticky” for the entire lifetime of the specification: it spans versions. The plan for any particular development cycle must describe work that is in scope; and at the checkpoint (progress and release) reviews, the project team must be prepared to demonstrate that the behavior described by the specifications (and tested by the corresponding TCK) cleanly falls within the scope (note that the development life cycle of specification project is described in Eclipse Foundation Specification Process Step-by-Step).

In addition to the specification scope, which is required by the Eclipse Intellectual Property Policy and the EFSP, the specification project that owns and maintains the specification needs a project scope. The project scope is, I think, pretty straightforward: a particular specification project defines and maintains a specification.

For example:

The Jakarta Batch project defines and maintains the Jakarta Batch specification and related artifacts.

Like the specification scope, the project scope should be aspirational. In this regard, the specification project is responsible for the particular specification in perpetuity. Further, the related artifacts, like APIs and TCKs, can be in scope without actually being managed by the project right now.

Today, for example, most of the TCKs for the Jakarta EE specifications are rolled into the Jakarta EE TCK project. But, over time, this single monster TCK may be broken up and individual TCKs moved to corresponding specification projects. Or not. The point is that regardless of where the technical artifacts are currently maintained, they may one day be part of the specification project, so they are in scope.

I should back up a bit and say that our intention right now is to turn the “Eclipse Project for …” projects that we have managing artifacts related to various specifications into actual specification projects. As part of this effort, we’ll add Git repositories to these projects to provide a home for the specification documents (more on this later). A handful of these proto-specification projects currently include artifacts related to multiple specifications, so we’ll have to sort out what we’re going to do about those project scope statements.

We might consider, for example, changing the project scope of the Jakarta EE Stable APIs (note that I’m guessing a future new project name) to something simple like:

Jakarta EE Stable APIs provides a home for stable (legacy) Jakarta EE specifications and related artifacts which are no longer actively developed.

But, all that talk about specification projects aside, our initial focus needs to be on describing the scope of the specifications themselves. With that in mind, the EE4J PMC has created a project board with issues to track this work and we’re going to ask the project teams to start working with their communities to put these scope statements together. If you have thoughts regarding the scope statements for a particular specification, please weigh in.

Note that we’re in a bit of a weird state right now. As we engage in a parallel effort to rename the specifications (and the corresponding specification projects), it’s not entirely clear what we should call things. You’ll notice that the issues that have been created all use the names that we guess we’re going to end up using (there’s more information about that in Renaming Java EE Specifications for Jakarta EE).


by waynebeaton at April 08, 2019 02:56 PM


Develop a React app in N4JS

by n4js dev (noreply@blogger.com) at April 05, 2019 10:07 AM

React is a popular JavaScript library created by Facebook and widely used for developing web user interfaces. N4JS provides full support for React as well as the JavaScript extension JSX for describing UI elements. Internally, we have been using N4JS in combination with React and JSX for years to develop very large e-commerce web applications.

In this blog post, we would like to show you the support of React and JSX in N4JS. In particular, we will implement the game Tic-tac-toe in N4JS. The implementation is heavily based on the pure JavaScript version in the tutorial Tictactoe in React. In this post, we will focus on N4JS specifics as well as on the advantages of using N4JS over pure JavaScript.

As is typical with React applications, the first step is to design a tree of React components to represent the application.



The root React component is Game, which consists of two areas. The left area is the React component Board showing the Tic-tac-toe board, while the right area shows the game information.

N4JS type definitions of React

In order to make use of N4JS’s type checking for React, we need to declare @n4jsd/react as a dev dependency in package.json. @n4jsd/react, provided by us as a public npm package, consists of n4jsd files that contain the type definitions for React.

{
    "name": "tictactoe",
    "devDependencies": {
        "@n4jsd/react": "<=16.6.*"
    },
    "dependencies": {
        "react": "^16.6.0"
    }
}

File extension n4jsx


The standard file extension of N4JS is .n4js. N4JS files containing React and JSX must have the extension .n4jsx

Square React component

The Square React component defines a single square of the Tic-tac-toe board that can be clicked by the current user. Its value is X or O depending on which player has taken it, or null if the square is still empty. In this example, we define Square as a lightweight functional component since it does not have any state.

/**
 * Square props
 */
interface ~SquareProps extends React.ComponentProps {
    public value: string;
    public onClick: {function(): void}
}

/**
 * Square React component
 */
function Square(props: SquareProps): React.Element<?> {
  return (
    <button className="square" onClick={props.onClick}>
      {props.value}
    </button>
  );

}

The functional definition of Square must have a single props parameter and return an instance of type React.Element.

When a Square is instantiated, it expects two mandatory props described by SquareProps

  • value: the value of the square, either X, O or null
  • onClick: the event handler to be called when the square is clicked

SquareProps, as with any data structure describing the props of a React component, must extend React.ComponentProps. In addition to having explicit types, the props can be declared as mandatory (as in this example) or optional with the help of the question mark. For instance, if you declared public value?: string, value would be an optional prop.

Here we start to see the advantages of N4JS over the pure, untyped JavaScript implementation. When a Square component is created, the compiler will enforce the types of the props. Moreover, it will complain if a mandatory prop is missing. And all these checks happen at compile time. In pure JavaScript, we would recognize those mistakes only at runtime.

Board component

The Board React component represents the Tic-tac-toe board. Even though it does not have state, we define it as a class because it contains a helper method.

/**
 * Board props
 */
interface ~BoardProps extends React.ComponentProps {
    public squares: [string];
    public onClick: {function(int): void}
}

/**
 * Board React component
 */
class Board extends React.Component<BoardProps, Object> {
  /**
   * Render the i-th square on the board
   */
  renderSquare(i: int) {
    return (
      <Square
        value={this.props.squares[i]}
        onClick={() => this.props.onClick(i)}
      />
    );
  }

  @Override
  public render(): React.Element<?> {
    return (
      <div>
        <div className="board-row">
          {this.renderSquare(0)}
          {this.renderSquare(1)}
          {this.renderSquare(2)}
        </div>
        <div className="board-row">
          {this.renderSquare(3)}
          {this.renderSquare(4)}
          {this.renderSquare(5)}
        </div>
        <div className="board-row">
          {this.renderSquare(6)}
          {this.renderSquare(7)}
          {this.renderSquare(8)}
        </div>
      </div>
    );
  }

}

The Board class, as any class representing a React component, must extend React.Component. Note that React.Component expects two type arguments: the first type argument is the type of props and the second type argument is the type of state.

Here, in the render method we simply create 9 Squares that make up the board.

Game React component

This is the root React component of this application and hence does not have any props. Instead, it has state represented by GameState which stores the history of the board, the step number and whether the next player is X.

/**
 * Game state
 */
interface ~GameState {
    public history: Array<~Object with { squares: Array<string>}>;
    public stepNumber: int;
    public xIsNext: boolean;

}

/**
 * Game React component (root)
 */
export default public class Game extends React.Component<React.ComponentProps, GameState> {
  public constructor(props: React.ComponentProps) {
    super(props);
    this.state = {
      history: [
        {
          squares: new Array<string>(9)
        }
      ],
      stepNumber: 0,
      xIsNext: true 
    };

  }
  ...
  
  @Override
  public render(): React.Element<?> {
  ...
  }
}

Here, again thanks to type checking, the N4JS compiler will complain at compile time if we access a non-existing field of the state or use the wrong type for a certain field. In pure JavaScript, we would recognize those mistakes only at runtime.

Source code

You can find the source code here

By Minh Quang Tran

by n4js dev (noreply@blogger.com) at April 05, 2019 10:07 AM

Renaming Java EE Specifications for Jakarta EE

by waynebeaton at April 04, 2019 02:17 PM

It’s time to change the specification names…

When we first moved the APIs and TCKs for the Java EE specifications over to the Eclipse Foundation under the Jakarta EE banner, we kept the existing names for the specifications in place, and adopted placeholder names for the open source projects that hold their artifacts. As we prepare to engage in actual specification work (involving an actual specification document), it’s time to start thinking about changing the names of the specifications and the projects that contain their artifacts.

Why change? For starters, it’s just good form to leverage the Jakarta brand. But, more critically, many of the existing specification names use trademarked terms that make it either very challenging or impossible to use those names without violating trademark rules. Motivation for changing the names of the existing open source projects that we’ll turn into specification projects is, I think, a little easier: “Eclipse Project for …” is a terrible name. So, while the current names for our proto-specification projects have served us well to-date, it’s time to change them. To keep things simple, we recommend that we just use the name of the specification as the project name. 

With this in mind, we’ve come up with a naming pattern that we believe can serve as a good starting point for discussion. To start with, in order to keep things as simple as possible, we’ll have the project use the same name as the specification (unless there is a compelling reason to do otherwise).

The naming rules are relatively simple:

  • Replace “Java” with “Jakarta” (e.g. “Java Message Service” becomes “Jakarta Message Service”);
  • Add a space in cases where names are mashed together (e.g. “JavaMail” becomes “Jakarta Mail”);
  • Add “Jakarta” when it is missing (e.g. “Expression Language” becomes “Jakarta Expression Language”); and
  • Rework names to consistently start with “Jakarta” (“Enterprise JavaBeans” becomes “Jakarta Enterprise Beans”).

This presents us with an opportunity to add even more consistency to the various specification names. Some, for example, are more wordy or descriptive than others; some include the term “API” in the name, and others don’t; etc.

We’ll have to sort out what we’re going to do with the Eclipse Project for Stable Jakarta EE Specifications, which provides a home for a small handful of specifications which are not expected to change. I’ll personally be happy if we can at least drop the “Eclipse Project for” from the name (“Jakarta EE Stable”?). We’ll also have to sort out what we’re going to do about the Eclipse Mojarra and Eclipse Metro projects which hold the APIs for some specifications; we may end up having to create new specification projects as homes for development of the corresponding specification documents (regardless of how this ends up manifesting as a specification project, we’re still going to need specification names).

Based on all of the above, here is my suggested starting point for specification (and most project) names (I’ve applied the rules described above; and have suggested tweaks for consistency by strike out):

  • Jakarta APIs for XML Messaging
  • Jakarta Architecture for XML Binding
  • Jakarta API for XML-based Web Services
  • Jakarta Common Annotations
  • Jakarta Enterprise Beans
  • Jakarta Persistence API
  • Jakarta Contexts and Dependency Injection
  • Jakarta EE Platform
  • Jakarta API for JSON Binding
  • Jakarta Servlet
  • Jakarta API for RESTful Web Services
  • Jakarta Server Faces
  • Jakarta API for JSON Processing
  • Jakarta EE Security API
  • Jakarta Bean Validation
  • Jakarta Mail
  • Jakarta Beans Activation Framework
  • Jakarta Debugging Support for Other Languages
  • Jakarta Server Pages Standard Tag Library
  • Jakarta EE Platform Management
  • Jakarta EE Platform Application Deployment
  • Jakarta API for XML Registries
  • Jakarta API for XML-based RPC
  • Jakarta Enterprise Web Services
  • Jakarta Authorization Contract for Containers
  • Jakarta Web Services Metadata
  • Jakarta Authentication Service Provider Interface for Containers
  • Jakarta Concurrency Utilities
  • Jakarta Server Pages
  • Jakarta Connector Architecture
  • Jakarta Dependency Injection
  • Jakarta Expression Language
  • Jakarta Message Service
  • Jakarta Batch
  • Jakarta API for WebSocket
  • Jakarta Transaction API

We’re going to couple renaming with an effort to capture proper scope statements (I’ll cover this in my next post). The Eclipse EE4J PMC Lead, Ivar Grimstad, has blogged about this recently and has created a project board to track the specification and project renaming activity (as of this writing, it has only just been started, so watch that space). We’ll start reaching out to the “Eclipse Project for …” teams shortly to start engaging in this process. When we’ve collected all of the information (names and scopes), we’ll engage in a restructuring review per the Eclipse Development Process (EDP) and make it all happen (more on this later).

Your input is requested. I’ll monitor comments on this post, but it would be better to collect your thoughts in the issues listed on the project board (after we’ve taken the step to create them, of course), on the related issue, or on the EE4J PMC’s mailing list.

 


by waynebeaton at April 04, 2019 02:17 PM

New Release: Python<->Java Remote Services

by Scott Lewis (noreply@blogger.com) at April 04, 2019 05:42 AM

There is a new release (2.9.0) of the ECF distribution provider for OSGi R7 Remote Services between Java and Python.

This release has:

  • An upgraded version of Py4j
  • An upgraded version of Google Protocol Buffers
  • Enhancements to the distribution provider based upon the improved Py4j and Protobuf libs

In this previous blog posting there are links to tutorials and examples showing how to use remote services between Python<->Java.

Python<->Java remote services can be consumed or implemented in either Java or Python.

by Scott Lewis (noreply@blogger.com) at April 04, 2019 05:42 AM

How to participate in advancing Jakarta EE Specification: Technical and Collateral material related work

by Tanja Obradovic at April 03, 2019 07:13 PM

Technical Work

We will need a lot of help on this front as well.

  • Jakarta EE specifications: specification documents and APIs

We have heard from members of the community some suggestions on what they need from the specification, but we can always use more. Get involved in the discussion on GitHub (https://github.com/eclipse-ee4j/jakartaee-platform/issues).

  • Jakarta EE TCK

It’s a goliath, and inconvenient, and we want to slowly begin to break it up into separate TCKs for each specification. That won’t happen for the very first release of Jakarta EE, but we need to start planning and discussing the approach.

  • Compatible Implementations

To bring a final version of a specification to life, we need specification implementations. Whether an implementation is hosted at the Eclipse Foundation or not is not the focus; we need you to implement the specification.

 

 

Collateral material related work

While we encourage everyone to participate in specification development, please keep in mind that this isn’t limited to coding only. Of equal importance is the need for collateral material related to the specification(s). This includes documentation, presentations, videos, demos, examples, blogs, tech talks, etc. This is the type of content we can circulate through the community and use to educate and spread the news about the new specifications. Presenting this material at conferences is yet another way you can help out!


by Tanja Obradovic at April 03, 2019 07:13 PM

Eclipse Vert.x 3.7.0 released!

by vietj at April 02, 2019 12:00 AM

We are extremely pleased to announce that the Eclipse Vert.x version 3.7.0 has been released.

It is an exciting milestone for a couple of reasons:

  1. it comes with great new features like the GraphQL extension for Vert.x Web.
  2. this is the last minor version before Vert.x 4!

Before we go through the most notable new features, we would like to thank all the contributors. Your participation has been essential to this achievement.

Vert.x Web GraphQL

Vert.x Web GraphQL extends Vert.x Web with the GraphQL-Java library so that you can build a GraphQL server.

To use this new module, add the following to the dependencies section of your Maven POM file:

<dependency>
  <groupId>io.vertx</groupId>
  <artifactId>vertx-web-graphql</artifactId>
  <version>3.7.0</version>
</dependency>

Or, if you use Gradle:

compile 'io.vertx:vertx-web-graphql:3.7.0'

Then create a Vert.x Web Route and a GraphQLHandler for it:

// Setup the GraphQL-Java object
GraphQL graphQL = setupGraphQLJava();
// Use it to handle requests on a Vert.x Web route 
router.route("/graphql").handler(GraphQLHandler.create(graphQL));
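
The setupGraphQLJava() call above is left abstract here; a hedged sketch of what it could look like using GraphQL-Java’s schema-first API follows (the schema and data fetcher are illustrative only):

// Possible implementation of setupGraphQLJava() with GraphQL-Java
// (classes come from the graphql and graphql.schema.idl packages)
private static GraphQL setupGraphQLJava() {
  // Illustrative schema with a single query field
  String schema = "type Query { hello: String }";

  TypeDefinitionRegistry typeRegistry = new SchemaParser().parse(schema);
  RuntimeWiring wiring = RuntimeWiring.newRuntimeWiring()
    .type("Query", builder -> builder.dataFetcher("hello", env -> "world"))
    .build();

  GraphQLSchema graphQLSchema = new SchemaGenerator().makeExecutableSchema(typeRegistry, wiring);
  return GraphQL.newGraphQL(graphQLSchema).build();
}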

The GraphQL handler supports out of the box:

  • query context customization
  • GraphQL-Java data loaders
  • batching on POST requests (compatible with the apollo-link-batch-http transport)
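
The batching support mentioned in the last item above is opt-in; a hedged sketch of enabling it through the handler options (assuming this module’s GraphQLHandlerOptions) looks like this:

// Enable request batching on the GraphQL handler
GraphQLHandlerOptions options = new GraphQLHandlerOptions()
  .setRequestBatchingEnabled(true);

router.route("/graphql").handler(GraphQLHandler.create(graphQL, options));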

For detailed usage instructions, please refer to the Vert.x Web GraphQL documentation.

Vert.x Cassandra Client

Object mapper support

Vert.x Cassandra Client now supports the cassandra-driver-mapping module.

To enable this feature, you need to update your classpath by adding:

<dependency>
  <groupId>com.datastax.cassandra</groupId>
  <artifactId>cassandra-driver-mapping</artifactId>
  <version>3.7.1</version>
</dependency>

Then for a given entity:

@Table(keyspace = "test", name = "users")
class User {
  @PartitionKey String name;
  // ... other fields and methods 
}

You can retrieve a mapper and execute CRUD operations:

VertxMappingManager manager = VertxMappingManager.create(cassandraClient);
VertxMapper<User> mapper = manager.mapper(User.class, vertx);
mapper.save(new User("john"), handler -> {});
Collector API

This feature allows you to use Java collectors for query results:

// Create a collector projecting a row set to a string in the form (last_name_1,last_name_2,...)
Collector<Row, ?, String> collector = Collectors.mapping(
    row -> row.getString("last_name"),
    Collectors.joining(",", "(", ")")
);

// Run the query with the collector
client.execute("SELECT * FROM users", collector, ar -> {
  if (ar.succeeded()) {
    // Result in the form (last_name_1,last_name_2,...)
    String result = ar.result();
  } else {
    System.out.println("Failure: " + ar.cause().getMessage());
  }
});
Cursor API

The ResultSet object has been enhanced with ResultSet#several method, allowing you to obtain several rows at once:

resultSet.several(30, ar -> {
  if (ar.succeeded()) {
    List<Row> result = ar.result();
  } else {
    System.out.println("Failure: " + ar.cause().getMessage());
  }
});

A very useful feature for result batch iterations without resorting to streaming or fetching all rows in memory.

Client lifecycle

The client lifecycle has been revisited in 3.7.

Previously, users were expected to connect manually before sending requests. It was also possible to disconnect a shared client, thus failing requests sent from another verticle or another part of the application.

Now it is no longer required to manually connect a client (in fact, the method has been removed).

As soon as you retrieve an instance you can start using it; the lifecycle is automatically managed:

CassandraClientOptions options = new CassandraClientOptions()
  .addContactPoint("node1.address")
  .addContactPoint("node2.address")
  .addContactPoint("node3.address")
  .setKeyspace("my_keyspace");
CassandraClient sharedClient = CassandraClient.createShared(vertx, "sharedClientName", options);
// Start sending requests to Cassandra with the client instance

Similarly, when the new close method is invoked on a shared client, only the last active instance will actually disconnect from Cassandra:

// Disconnects only if this is the last running instance of the shared client
sharedClient.close();

Vert.x Redis Client

The Vert.x Redis client has been reworked internally and now provides a new (more evolution-friendly) API.

The previous API had the limitation of being manually crafted after the Redis API and involved many non-controllable features such as auto-reconnect, unlimited buffering of requests, etc. The new API offers a more vert.x-y experience.

It just exposes the base client:

Redis
  .createClient(vertx, inetSocketAddress(7006, "127.0.0.1"))
  .connect(create -> {
    final Redis redis = create.result();

    redis.send(Request.cmd(Command.PING), send -> {
      // ... should reply with PONG
    });
  });

This has the benefit that you can now connect to Redis in any of its operation modes:

  • Single server
  • HA mode
  • Cluster mode

The API is decoupled from the handcrafted commands, which means that you can use new Redis features as soon as they become available.

A generated helper RedisAPI is available that can wrap the client to provide a similar experience to the old API.

The main difference is that this new wrapper is generated from the COMMAND command, so the correct API is always exposed:

RedisAPI redis = RedisAPI.api(client);

redis.set(Arrays.asList("key1", "value1"), set -> {
  // ...
});

Vert.x AMQP Client

The Vert.x AMQP client allows receiving and sending AMQP messages. It supersedes the existing AMQP bridge and provides a more flexible and much more user-friendly API.

The Vert.x AMQP client allows:

  • Connecting to an AMQP broker or router - SASL and TLS connections are supported
  • Consuming message from a queue or a topic
  • Sending messages to a queue or a topic
  • Checking acknowledgement for sent messages

The AMQP 1.0 protocol supports durable subscriptions, persistence, security, conversations, sophisticated routing… More details on the protocol can be found on the AMQP homepage.

The Vert.x AMQP client is based on Vert.x Proton. If you need fine-grain control, we recommend using Vert.x Proton directly.

To use this new module, add the following to the dependencies section of your Maven POM file:

<dependency>
  <groupId>io.vertx</groupId>
  <artifactId>vertx-amqp-client</artifactId>
  <version>3.7.0</version>
</dependency>

Or, if you use Gradle:

compile 'io.vertx:vertx-amqp-client:3.7.0'

Then, you can connect to an AMQP broker:

AmqpClientOptions options = new AmqpClientOptions()
      .setHost("localhost")
      .setPort(5672)
      .setUsername("user")
      .setPassword("secret");

AmqpClient client = AmqpClient.create(vertx, options);

client.connect(ar -> {
  if (ar.failed()) {
    System.out.println("Unable to connect to the broker");
  } else {
    System.out.println("Connection succeeded");
    AmqpConnection connection = ar.result();

    // You can create receivers and senders
    connection.createReceiver("my-queue",
      msg -> {
        // called on every received message
        System.out.println("Received " + msg.bodyAsString());
      },
      done -> {
        if (done.failed()) {
          System.out.println("Unable to create receiver");
        } else {
          AmqpReceiver receiver = done.result();
        }
      }
    );

    connection.createSender("my-queue", done -> {
      if (done.failed()) {
        System.out.println("Unable to create a sender");
      } else {
        AmqpSender sender = done.result();
        sender.send(AmqpMessage.create().withBody("hello").build());
      }
    });

  }
});

Stream pipes

When it comes to streaming, back-pressure is something you need to care about.

You have very likely heard of or used the Vert.x Pump API to transfer data from a read stream to a write stream while respecting the write stream’s back-pressure.

The Pipe is a new API superseding the Pump. It achieves the same effect and even more: it acts like a pump and handles for you

  • read stream pause/resume
  • write stream termination
  • stream failures handling
  • asynchronous result upon streaming completion

You can simply transfer a read stream to a write stream; the write stream will be ended upon completion of the stream:

readStream.pipeTo(writeStream);

You can also be notified when the pipe completes:

readStream.pipeTo(writeStream, ar -> {
  if (ar.succeeded()) {
    System.out.println("done");
  } else {
    System.out.println("failed " + ar.cause());
  }
});

Creating and using an asynchronous pipe is easy:

// The read stream will be paused until the pipe is used
Pipe pipe = readStream.pipe();
getAsyncPipe(ar -> {
  if (ar.succeeded()) {
    pipe.to(writeStream);
  } else {
    pipe.close();
  }
});

Kafka admin client

The new version brings a first Vert.x based implementation of the native Kafka Admin Client API, which is in Java, instead of the Scala one used in the previous version.

The AdminUtils is now deprecated and the new KafkaAdminClient is available instead. This allows removing the last Scala artifact dependency.

While the AdminUtils implementation needs to connect to Zookeeper for administration purposes, the KafkaAdminClient only uses the Kafka bootstrap brokers connection.

Properties config = new Properties();
config.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "my-kafka-broker:9092");

KafkaAdminClient adminClient = KafkaAdminClient.create(vertx, config);
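
With the client created, operations follow the usual Vert.x asynchronous style; as a minimal illustration, listing the existing topics could look like this (a sketch based on the feature list below):

// List the topics known to the connected brokers (result is a java.util.Set)
adminClient.listTopics(ar -> {
  if (ar.succeeded()) {
    Set<String> topics = ar.result();
    System.out.println("Topics: " + topics);
  } else {
    System.out.println("Failure: " + ar.cause().getMessage());
  }
});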

The features currently supported are:

  • create and delete topics
  • list all the topics
  • describe topics for getting information about leader partition, follower replicas and ISR (in-sync replicas) list
  • alter topics configuration
  • list all consumer groups
  • describe consumer groups for getting information like the state, the coordinator host, consumers per topics and so on

If you are using the AdminUtils today, consider migrating to the new KafkaAdminClient, because the former will be removed in Vert.x 4.0.

And more…

Here are some other important improvements you can find in this release:

  • Shared data structures available in local-only mode even when Vert.x is clustered
  • JSON decoding without prior knowledge of the structure (object, array, string, etc.)
  • Infinispan Cluster Manager upgraded to Infinispan 9.4.10.Final
  • And obviously we have the usual bug fixes!

Finally

The 3.7.0 release notes can be found on the wiki, as well as the list of deprecations and breaking changes.

Docker images are available on Docker Hub.

The Vert.x distribution can be downloaded on the website but is also available from SDKMAN! and Homebrew.

The event bus client using the SockJS bridge is available from:

The release artifacts have been deployed to Maven Central and you can get the distribution on Bintray.

That’s it! Happy coding and see you soon on our user or dev channels.


by vietj at April 02, 2019 12:00 AM

JBoss Tools and Red Hat CodeReady Studio for Eclipse 2019-03

by jeffmaury at April 01, 2019 08:35 PM

JBoss Tools 4.11.0 and Red Hat CodeReady Studio 12.11 for Eclipse 2019-03 are here waiting for you. Check it out!

crstudio12

Installation

Red Hat CodeReady Studio comes with everything pre-bundled in its installer. Simply download it from our Red Hat CodeReady product page and run it like this:

java -jar devstudio-<installername>.jar

JBoss Tools or Bring-Your-Own-Eclipse (BYOE) CodeReady Studio require a bit more:

This release requires at least Eclipse 4.11 (2019-03), but we recommend using the latest Eclipse 4.11 2019-03 JEE Bundle since you then get most of the dependencies pre-installed.

Once you have installed Eclipse, you can either find us on the Eclipse Marketplace under "JBoss Tools" or "Red Hat CodeReady Studio".

For JBoss Tools, you can also use our update site directly.

http://download.jboss.org/jbosstools/photon/stable/updates/

What is new?

Our main focus for this release was improvements for container-based development and bug fixing. Eclipse 2019-03 itself has a lot of new cool stuff, but let me highlight just a few updates in both Eclipse 2019-03 and JBoss Tools plugins that I think are worth mentioning.

OpenShift 3

New OpenShift connection helper

When you need to define a new OpenShift connection, you need to provide the following information:

  • cluster URL

  • username and password or token

If you’ve already logged in to your cluster through the OpenShift Web Console, you can copy an oc login command to the clipboard that contains both the cluster URL and your token.

So, from now on, there is a new option that allows you to initialize the wizard fields from the copied oc command:

connection wizard paste

Click on the Paste Login Command button and the fields will be initialized:

connection wizard paste1

Server tools

EAP 7.2 Server Adapter

A server adapter has been added to work with EAP 7.2.

WildFly 15 Server Adapter

A server adapter has been added to work with WildFly 15. It adds support for Java EE 8.

Related JIRA: JBIDE-26502

WildFly 16 Server Adapter

A server adapter has been added to work with WildFly 16. It adds support for Java EE 8.

Hibernate Tools

New Runtime Provider

The new Hibernate 5.4 runtime provider has been added. It incorporates Hibernate Core version 5.4.1.Final and Hibernate Tools version 5.4.1.Final.

Runtime Provider Updates

The Hibernate 5.3 runtime provider now incorporates Hibernate Core version 5.3.9.Final and Hibernate Tools version 5.3.9.Final.

The Hibernate 5.2 runtime provider now incorporates Hibernate Core version 5.2.18.Final and Hibernate Tools version 5.2.12.Final.

Maven

Maven support updated to M2E 1.11

The Maven support is based on Eclipse M2E 1.11.

Platform

Views, Dialogs and Toolbar

User defined resource filters in Project Explorer

The Filters and Customization…​ menu in Project Explorer now shows an additional User filters tab which can be used to exclude some resources from Project Explorer based on their name.

Full name and regular expressions are supported.

user filters
Error Log view added to Platform

The Error Log view has been moved from the PDE project to the Platform project. See bug 50517 for details.

Copy to clipboard in Installation Details

A copy to clipboard action has been added to all tabs of the Installation Details dialog.

copy installation details
Copy & paste of Environment Variables

The Environment tab in a Launch configuration dialog supports copy & paste actions now. The environment variables are transferred as text data, so it is not only possible to copy & paste between two different launch configurations, but also between the launch configuration and e.g. some text editor or the command line.

env var copy paste

This feature is available in all launch configurations which use the common Environment tab.

When the Eclipse IDE is started for the first time or with a new workspace, it may not be obvious to new users how to proceed. To help users get started, the following useful links are now provided for adding a project to the workspace:

  • Perspective specific project creation wizard

  • Generic New Project wizard

  • Import projects wizard

ProjectExplorer
New mnemonics in Error Log view

New mnemonics have been added for the Export Entry… and Event Detail entries in the context menu of the Error Log view.

mneumonics

Themes and Styling

Improved Dark theme for Mac

The Dark theme for Mac has been improved to use the colors from the macOS system dark appearance. Some of the notable changes in Eclipse IDE are the dark window title bar, menus, file dialogs, combos and buttons.

Note: This change is available on macOS Mojave and later.

Before:

darktheme before

After:

darktheme after
Improved Dark theme for Windows

The drawing operations have been improved on Windows, so the custom-drawn icons look better now. For example, check the close icon below.

Before:

closebutton before

After:

closebutton after

General Updates

Performance improvements

The startup and interactive performance of multiple operations has been improved again in this release.

Java Development Tools (JDT)

Java 12 Support

Java 12

Java 12 is out, and Eclipse JDT support for Java 12 in 4.11 is available via the Marketplace. The release notably includes the following Java 12 feature: JEP 325: Switch Expressions (Preview). Please note that this is a preview language feature, so the enable preview compiler option must be turned on. For an informal introduction to the support, please refer to the Java 12 Examples wiki.
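
As a reminder of what the preview feature looks like, here is a small illustrative snippet (it only compiles on JDK 12 with preview features enabled):

import java.time.DayOfWeek;

class SwitchExpressionDemo {

  // Switch expression (JEP 325, preview): the switch yields a value directly
  static int letterCount(DayOfWeek day) {
    return switch (day) {
      case MONDAY, FRIDAY, SUNDAY -> 6;
      case TUESDAY -> 7;
      case THURSDAY, SATURDAY -> 8;
      case WEDNESDAY -> 9;
    };
  }
}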

JUnit

JUnit 5.4

JUnit 5.4 is here and Eclipse JDT has been updated to use this version.

Test factory template

JUnit Jupiter now allows test factory methods to return a single DynamicNode. The test_factory template has been updated to include DynamicNode in the return type.

junit test template
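
For reference, a test factory returning a single DynamicNode can be as small as the following sketch (class and test names are made up for illustration):

import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.DynamicTest.dynamicTest;

import org.junit.jupiter.api.DynamicNode;
import org.junit.jupiter.api.TestFactory;

class SingleNodeFactoryTest {

  // Since JUnit Jupiter 5.4 a @TestFactory method may return a single DynamicNode
  // instead of a Stream or Collection of them
  @TestFactory
  DynamicNode additionWorks() {
    return dynamicTest("1 + 1 is 2", () -> assertEquals(2, 1 + 1));
  }
}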

Java Editor

Default and constant values in content assist information pop-up

The additional information pop-up of a content assist proposal now shows the default value of an annotation type element:

default value annotation type elelemt

and the value of a constant:

constant value
Create service provider method

If a service defined in a module-info.java file has an invalid service provider implementation, a Quick Fix (Ctrl + 1) is now available to create the new provider method:

service provider proposal
service provider linked proposal
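
To illustrate the situation the quick fix targets, consider a hypothetical service declaration and the static provider method that can be generated for it (all module, package and type names below are made up):

// module-info.java (hypothetical names)
module com.example.app {
  exports com.example.spi;
  provides com.example.spi.Greeter with com.example.internal.GreeterProvider;
}

// com/example/spi/Greeter.java
package com.example.spi;

public interface Greeter {
  String greet(String name);
}

// com/example/internal/GreeterProvider.java
package com.example.internal;

import com.example.spi.Greeter;

public class GreeterProvider {

  // GreeterProvider does not implement Greeter itself, so the service
  // declaration is only valid with a static provider method like this one,
  // which is what the new quick fix can generate.
  public static Greeter provider() {
    return name -> "Hello " + name;
  }
}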

Java Formatter

Line wrapping settings for binary operators

Instead of a single line wrapping setting for binary expressions, there’s now a whole section of settings for various kinds of binary operators (multiplicative, additive, logical, etc.). There are settings for relational (including equality) and shift operators, which were not covered by the old setting. Also, string concatenation can now be treated differently from arithmetic sum.

The settings can be found in the Profile Editor (Preferences > Java > Code Style > Formatter > Edit…​) under the Line Wrapping > Wrapping settings > Binary expressions subsection.

formatter wrap binary expressions
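
As a purely illustrative example of what the finer-grained settings allow, a profile could wrap long string concatenations while leaving arithmetic sums of similar length on one line (the exact output depends entirely on the chosen settings):

class WrappingDemo {

  // Illustrative only: the layout below assumes a profile that wraps
  // string concatenation but not additive (arithmetic) expressions.
  String summary(String name, long millis, int a, int b, int c, int d, int e) {
    String message = "Result for " + name
        + " computed in " + millis + " ms";
    int total = a + b + c + d + e;
    return message + " (total " + total + ")";
  }
}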
White space settings for binary operators

The white space around operators in binary expressions can now be controlled separately for different groups of operators, consistent with the line wrapping settings.

The new Binary operators sub-section has been added under White Space > Expressions in the Formatter profile editor.

formatter spaces binary expressions
Wrapping setting for chained conditional expressions

A chain of nested conditional expressions (using the ternary operator) can now be wrapped as a single group, with all of them indented at the same level. It’s only possible for right-sided nesting.

Find the Chained conditionals setting in the Profile Editor under the Line Wrapping > Wrapping settings > Other expressions subsection.

formatter wrap chained conditionals
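
For example, with the new setting a right-nested chain of conditionals can be wrapped as one group, all at the same indentation level (illustrative layout, depending on the profile):

class ChainedConditionalDemo {

  // Illustrative only: layout assumes the new "Chained conditionals" wrapping setting
  String sizeOf(int value) {
    return value > 100 ? "large"
        : value > 10 ? "medium"
        : value > 0 ? "small"
        : "empty";
  }
}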
Indent Javadoc tag descriptions

The Formatter Profile has a new setting that indents wrapped Javadoc tag descriptions. It’s called Indent other tag descriptions when wrapped, in contrast to the preexisting Indent wrapped @param/@throws descriptions setting. It affects tags like @return or @deprecated.

The settings can be found in the Profile Editor (Preferences > Java > Code Style > Formatter > Edit…​) under the Comments > Javadocs section.

formatter indent tags
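
For instance, with Indent other tag descriptions when wrapped enabled, a wrapped @return description could be laid out like this (illustrative):

class JavadocIndentDemo {

  /**
   * Parses the given input.
   *
   * @return the parsed value, or -1 if the input cannot be parsed; note how
   *             the wrapped part of this description is indented by the new
   *             setting
   */
  int parse(String input) {
    return input == null ? -1 : input.length();
  }
}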

Debug

History for expressions in the Variables view

The Variables view now stores a history of the expressions used in the Detail pane. You can choose a previously entered expression for a variable from the new drop-down menu. The expression will be copied to the Detail pane where you can select it to perform various actions present in the context menu.

expressions history

And more…​

You can find more noteworthy updates on this page.

What is next?

With JBoss Tools 4.11.0 and Red Hat CodeReady Studio 12.11 out, we are already working on the next release for Eclipse 2019-06.

Enjoy!

Jeff Maury


by jeffmaury at April 01, 2019 08:35 PM

Welcome Gabriela!

April 01, 2019 02:40 PM

We'd like to welcome Gabriela Motroc to the Eclipse Foundation as a Content Marketing Specialist based in Germany!

April 01, 2019 02:40 PM

Thank You for Taking the Jakarta EE 2019 Developer Survey!

April 01, 2019 01:40 PM

The survey wrapped up at midnight on Monday, March 25, with over 1,770 responses from developers around the globe.

April 01, 2019 01:40 PM

Welcome Gabriela!

by Thabang Mashologu at April 01, 2019 01:12 PM

I am happy to announce that Gabriela Motroc has joined the Eclipse Foundation as a Content Marketing Specialist based in Germany.

Gabriela joins us from the Software & Support Media Group, where she was an editor of JAXenter.com and JAX Magazine. She is well known to many in the Eclipse community and recently spearheaded JAXenter’s excellent Understanding Jakarta EE series.

Gabriela holds a Master’s degree in International Communication Management and a B.A. in Journalism. Her knowledge and experience will be great assets in developing and sharing updates, news, and content that motivates, educates, and inspires the engagement of our community.

Please join me in welcoming Gabriela to the Eclipse Foundation marketing team.


by Thabang Mashologu at April 01, 2019 01:12 PM

Single-sourcing web & mobile forms with JSON Forms

by Jonas Helming and Maximilian Koegel at April 01, 2019 09:32 AM

JSON Forms is a framework for efficiently developing form-based UIs based on JSON Schema. It provides a simple declarative JSON based...

The post Single-sourcing web & mobile forms with JSON Forms appeared first on EclipseSource.


by Jonas Helming and Maximilian Koegel at April 01, 2019 09:32 AM
