Posts by pjmolina.

StringTemplate: a great template engine for code generation

Paper Templates, licensed CC Attribution by Edinburgh City of Print

When building a new tool for modelling and code generation like Essential, one has to rethink which template engine to use to drive all the machinery. In code generation contexts, template engines are a good field for innovation, and your choice will probably stay with you for the full lifetime of your tool.

In this post, I will try to explain why StringTemplate is a superb engine for code generation and why you should consider it if you are dealing with a code generation scenario.
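As a small taste of the engine, here is an illustrative StringTemplate 4 group-file sketch; the template and attribute names (class, field, and so on) are made up for this example, not taken from any real generator:

```
class(name, fields) ::= <<
public class <name> {
    <fields:field(); separator="\n">
}
>>

field(f) ::= "private <f.type> <f.name>;"
```

Feeding class with a name and a list of field objects renders a plain Java class. The template stays strictly presentation-only: no logic leaks into it, which is precisely the model-view separation StringTemplate enforces.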

Hello World with Essential, the video

Essential Logo

The Hello World sample is a nice starting point to show the syntax and capabilities of any new language. This test is also useful for code generators and Domain Specific Languages (DSLs), as a proof of concept.

Following this honorable tradition, I have created a video showing the capabilities of Essential: the tool I am working on for doing agile Model Driven Development.

In this 10-minute video you will get a general idea of the DSL the tool provides to create:

  • metamodels
  • models
  • templates
  • and control transformations

To see the details, jump to Vimeo, activate the High Definition (HD) mode and set full screen (sorry, the embedded version is not good enough).

Essential IDE – Hello World sample from Pedro J. Molina on Vimeo.

More info about it, plus 12 usage scenarios, in the latest Code Generation 2010 presentation about Tailored Code Generators.

Share your impressions!

Language Workbench Competition 2011

Language Workbenches, as originally defined by Martin Fowler, are tools aiming to cope with DSL creation and code generation in order to raise the level of abstraction of software development.

Currently, the main efforts in MDD, MDE, MDSD (model-driven-whatever you prefer…) are focused on the development of this kind of tool, perceived as a hot research area for Software Engineering.

In this scenario, Cambridge, at Code Generation 2010, was the perfect place for sparking the idea of promoting a contest to show and compare the advances of different language workbenches.

The Language Workbench Competition was born with the objective of serving as a point of comparison between different tools in this exciting and fast-moving area.

The competition is now open to the public, so anyone interested can enroll and implement the just-published challenge.

On the other hand, if you want to know more about Language Workbenches, modeling and code generation, add this page to your bookmarks and come back in a few months to see some proposals.

The promoters of the idea are: Markus Völter, Eelco Visser, Steven Kelly, Angelo Hulshout, Jos Warmer, Bernhard Merkle, Karsten Thoms and myself.

So this is a call to arms, but with sportsmanship!

Angelo and Markus have already started the call.

Tailored Code Generators at CG2010

I presented the talk DSL and tool support for Tailored Code Generators at Code Generation 2010, in Cambridge, UK, on June 18th.

It was also the public presentation of Essential: the tooling supporting my approach to applying MDD. I got very good feedback from the audience and received many requests to test the tool.

People interested in beta testing it can still enroll here.

Introducing MDSD

The slides from my talk yesterday at Code Generation 2010, Introducing Model Driven Software Development:

Essential drop

Essential is going to be presented this week in Code Generation 2010 during the session DSL and tool support for building tailored code generators.

To celebrate this milestone and give more people the chance to try it, an early version is going to be released to those interested.

If this is your case, please enroll using the evaluation request form.

Nature by Numbers

Today I want to share an outstanding video found by my colleague Nico.

This kind of material always shocks and amazes me!

When I was a child, I imagined how multimedia content could be more educational than just the boring traditional books. I remember playing with animated GIFs to show the cyclic nature of glucose and, later on, playing with PowerPoint, Flash, etc. to try to explain complex things visually. I prefer a good picture to a thousand words.

That is why, when I see a video like the one below, I need to watch it two or three more times before I can close my mouth, and that only happens after satisfying my curiosity and gathering all the details. Math, nature and a piece of art, all in one.

Now enjoy it and turn on the full screen mode!

Nature by Numbers from Cristóbal Vila on Vimeo.

The three principles explained in the video:

Intro and the making-of.

After seeing the video, and coming back from the off-topic, isn’t it beautiful to dream that maybe Nature is really model-driven… and actually has a complex and hidden metamodel governing it all?

All credits to Cristóbal Vila, Etérea Studios and his great videos.

How good! Maño, I take my hat off to you.

Additional Model Driven bonus: reviewing the making-of, I found two visual models (DSLs) (this and this, using XPresso) describing algorithms in visual form, driving the animation in two scenes. Wow!

Presenting at Semana Informatica 2010


On April 27, my colleague Nicolas Cornaglia and I will be presenting a talk with some live demos in Valencia, Spain, representing our company Capgemini, within the scope of the event Semana Informatica 2010.

The title of the talk will be: Productivity through frameworks and MDD.

The session will be delivered in Spanish. Full agenda (PDF version) and session details.


Business applications for Enterprise Software usually follow a fixed set of standards (global or in-house) to help keep maintenance costs as low as possible (reducing TCO). In this context, homogeneity and regulatory compliance are frequently a must.

The main aim of our presentation will be to show how an approach based on a good framework, modeling tools and code generation techniques can be the right set of tools to achieve a high degree of standardization, quality, productivity and flexibility in evolving the Enterprise Architecture. Such flexibility is key to providing a better Time to Market when a business process changes or a technical requirement suddenly emerges.

Balancing Variability & Commonality

When creating a DSL (Domain Specific Language), one of the most important choices is deciding which items in your domain are going to be considered variable and changeable, and which ones are going to be considered fixed, carved in stone.

The former need to be specified in your DSL, in your design, or maybe coded. The latter are considered immutable and will remain static across all your derived applications for ages.

Considering that everything is static is obviously useless. On the contrary, considering every aspect variable leads to another dead end, again producing nothing tangible as a result. Therefore, we will have to search for the virtue somewhere in the middle.

The main issue here is to study a domain and weigh the variable parts against the fixed parts. It is not a trivial thing to do from the very beginning. Experience in DSL construction and, especially, experience in the domain help to train your nose, but there are no clear rules for it.

It is not only about knowing your requirements. It is about trying to predict how your requirements will change over time and which types of requirements are more likely to change.

Adding variability

A variable part could be, for example, the background color of your application. If so, you need to add syntax and semantics to your DSL to capture such a property. Let’s say you can express somewhere in your specification:  { background-color = peach; }

We can select the peach color for app1, and maybe ivory for app2.

However, nothing is free, and this freedom comes with the following possible drawbacks:

  • You need to increase the size of your language (DSL), editors, model checkers, compilers and code generators or interpreters.
  • Users have to provide a value for such a property, unless you have also provided a sensible default value for missing information.
  • Homogeneity across applications vanishes with respect to background-color. Now it’s a user choice (the user in control of the modeling tool).
  • Specs are more complex.
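A minimal sketch (in Python, with invented model and property names) of how a generator might consume such a variable property, falling back to a sensible default when the modeler omits it:

```python
# Hypothetical models: one dict per application, as a modeler might fill them in.
app1 = {"name": "app1", "background-color": "peach"}
app2 = {"name": "app2"}  # background-color left unspecified

DEFAULT_BACKGROUND = "white"  # sensible default for missing information

def generate_css(model):
    # The generator reads the variable property, falling back to the default.
    color = model.get("background-color", DEFAULT_BACKGROUND)
    return f"/* {model['name']} */ body {{ background-color: {color}; }}"

print(generate_css(app1))  # uses the modeler's choice: peach
print(generate_css(app2))  # falls back to the default: white
```

Note how even this toy property costs code: the generator must know about it, document it, and handle its absence.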

Adding commonality

On the other hand, if you consider that the background of your application should always be the same, because you are following, for example, a user interface style guide, then the background color is a fixed issue. Its value is provided by design (by a style guide, an architect, or a design choice) and the modeling user has no control over it.

In this scenario, the DSL is smaller. There is no need to specify the background color; it is implicit and is not included in the model/specification.

With this kind of choice, we are betting on standardization. A shared library, a runtime framework or an interpreter will take care of supplying the right color at the right moment.

  • Users cannot change the background color; specs are smaller.
  • Standardization is improved across applications.
  • User has no control on the feature.
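A minimal sketch (Python, invented names) of the commonality approach: the color disappears from every model and lives in the framework instead:

```python
# The style-guide value is carved into the framework, not into any model.
# (The constant name and value are invented for illustration.)
FRAMEWORK_BACKGROUND = "corporate-blue"

def generate_css(model):
    # The model carries only the variable parts; the background is implicit.
    return f"/* {model['name']} */ body {{ background-color: {FRAMEWORK_BACKGROUND}; }}"

# Every application gets exactly the same background, by construction.
print(generate_css({"name": "app1"}))
print(generate_css({"name": "app2"}))
```

Changing the style guide now means touching one constant and regenerating, instead of editing every model.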

But, what is the right choice?

It depends. There is no right choice with the information given so far. To answer the question we need to consider whether the background color is a fundamental feature of our domain that needs to be different from application to application or whether, on the contrary, the color should be used in a homogeneous way following a predefined style guide.

Again, the domain imposes the rules to follow. Studying the domain and its variability is crucial to creating consistent DSLs focused on gathering the key features of the domain in a model: the important and variable ones. The important and fixed ones must also be identified, but they shouldn’t be included in the model; they belong in the framework or the runtime.

Standards, policy assurance, compliance

Everything related to standard procedures, compliance and in-house style guidelines is a first-class candidate for standardization. If done that way, your developers will not have to remember all those weird standard and compliance rules when developing a specific artifact.

A code generator will provide the right value for them. It will do so silently, without errors or oversights. All the boring code dedicated to application plumbing (naming guidelines, service publication, serialization, persistence, adapters, proxies, skeletons, stubs, DAO code) is driven by strict standards and best practices, and is a natural candidate for strong automation by code generators.

Moreover, if the regulation or the standard changes, the change will have an impact on the following assets:

  • a single change to a framework will be enough,
  • or a change to a code generator, followed by a regeneration process and a redeployment.

In both cases, it is cheaper than manually reviewing a set of in-production applications.

For example, think about replacing your data-access layer code from a DAO pattern with SQL to an ORM-based approach like Hibernate.
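As an illustrative sketch (Python, with an invented naming rule), here is a generator that bakes an in-house naming guideline into every emitted artifact. Changing the rule in one place and regenerating updates every application at once:

```python
# Hypothetical in-house naming guideline: DAO classes are named <Entity>Dao.
def dao_class_name(entity):
    return f"{entity.capitalize()}Dao"

def generate_dao(entity):
    # The guideline is applied silently and consistently, with no oversights.
    name = dao_class_name(entity)
    return (f"public class {name} {{\n"
            f"    // CRUD plumbing for {entity}, regenerated on demand\n"
            f"}}\n")

print(generate_dao("invoice"))
```

If the guideline later changes (say, to a Repository suffix), only dao_class_name is edited; a regeneration pass propagates the new standard everywhere.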

Business Know-How

The core of the business Know-How is the important and variable parts we are interested in collecting in a specification. Such features need to be modeled and, if possible, abstracted from the technology that will implement them.

If we do it this way, the model can survive the current technology.

Why would we be interested in doing it this way?

Just because technology evolves like fashion. Today everyone likes red T-shirts; tomorrow blue jeans will be sublime! Basic, Cobol, C, Java, C#, Ruby… what is the next language to use in 5 years’ time?

Make your best bet, whatever platform best fulfills your requirements, but it would be nice to see the business process survive the technology. ;)  We don’t know in which direction, but technology will evolve, and will change for sure.

Maintaining a language or a DSL

When a DSL or a language needs a review, you will probably be considering adding new features to the language.

Each new feature will increase the variability and the complexity of the language. Before deciding to add a new modeling feature, make a cost/benefit analysis and double-check that the value added by the improvement is greater than the cost of implementing it.

I like to follow the golden rule proposed by Gordon S. Novak about automatic programming:

“Automatic Programming is defined as the synthesis of a program from a specification. If automatic programming is to be useful, the specification must be smaller and easier to write than the program would be if written in a conventional programming language.”


Whenever possible:

  • Business Know-How should be captured by models, specs, DSLs.
  • Technical Know-How should be captured by code generators, model interpreters, best practices and patterns.

So, at the end of the day I like the following pair of quotes to sum up about what to include in a model:

  • The Spanish writer Baltasar Gracián said in the XVII century: “Lo bueno, si breve, dos veces bueno.” (a literal translation from Spanish could be: “Good things, if brief, twice as good.”)
  • On the other side, Albert Einstein (XX century) counters: “Things should be as simple as possible, but not simpler.”

Countdown for CG2010

The Programme for Code Generation 2010 has been published.

This year Mark has invited me to give an introductory session on Model Driven Software Development (MDSD) oriented to beginners.

In a second session, I will also discuss creating tailored code generators.

See you in Cambridge in June!