Assistance with Open Source adoption

Liferay Community

How to upgrade my sharded environment to Liferay 7.x?

Fri, 08/10/2018 - 02:35

Hi Liferay Community,

Before answering this question, I would like to explain what sharding is. To overcome the horizontal scalability concerns of open source databases at the time (circa 2008), Liferay implemented physical partitioning support. The solution allowed administrators to configure portal instances to be stored in different database instances and database server processes.

This feature was originally named "sharding" although "data partitioning" is more accurate since it requires a small amount of information sharing to coordinate partitions.

Thus, beginning in 7.0, Liferay removed its own physical partitioning implementation in favor of the capabilities provided natively by database vendors. Please notice that logical partitioning via the "portal instance" concept (a logical set of data grouped by the companyId column, with data security at the portal level) is not affected by this change and is still available in current Liferay versions.

Having explained this, the answer to the question is simple: just follow the official procedure to do it.

Liferay 7.x provides a process which converts all shards into independent database schemas during the upgrade. This can be suitable for those cases where you need to keep information separated for legal reasons. However, if you cannot afford to maintain one complete environment for each of those independent databases, you could try another approach: disable sharding by merging all shards into a single database schema before performing the upgrade to Liferay 7.x.

The option of merging all shard schemas into the default one is feasible because sharding generates unique IDs for every row across all databases. These are the steps you should follow to achieve this:

  1. Create a backup of the shard database schemas in the production environment.
  2. Copy the content of every table in the non-default shards into the default shard. It's recommended to create an SQL script to automate this process.
  3. If a unique index is violated, analyze the data of the two records causing the issue and remove one of them, since it is no longer necessary. (Different reasons could have caused data to be created in the incorrect shard in the past, such as wrong configuration, a bug, issues with custom developments, etc.)
  4. Resume the process from the last point of failure.
  5. Repeat steps 3 and 4 until the default shard database contains all data from the other shards.
  6. Clean up the Shard table, keeping only the default shard record.
  7. Start up a Liferay server using this database with sharding disabled:
    1. Remove all database connections except for the default one.
    2. Comment out the META-INF/shard-data-source-spring.xml entry in the spring.configs property.
  8. Ensure that everything works well and that you can access the different portal instances.
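The SQL script recommended in step 2 can be generated rather than written by hand. A minimal sketch in Java (class, schema, and table names here are hypothetical; a real script would read the table list from information_schema and skip the Shard table itself):

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical generator for the merge script of step 2. Because row IDs
// are unique across all shard databases, a plain INSERT ... SELECT per
// table is enough; unique-index violations are resolved by hand (step 3).
public class ShardMergeScript {

    public static List<String> mergeStatements(
            String defaultSchema, List<String> shardSchemas, List<String> tables) {

        List<String> sql = new ArrayList<>();

        for (String shard : shardSchemas) {
            for (String table : tables) {
                sql.add("INSERT INTO " + defaultSchema + "." + table
                    + " SELECT * FROM " + shard + "." + table + ";");
            }
        }

        return sql;
    }

    public static void main(String[] args) {
        // Example run with made-up schema names.
        mergeStatements("lportal", List.of("lportal_shard1", "lportal_shard2"),
            List.of("User_", "Group_")).forEach(System.out::println);
    }
}
```

Running the generated statements in order, and restarting from the failing statement after each manual fix, implements steps 2 through 5 above.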

It is recommended that you keep a record of the changes made in steps 3 and 6, since you will need to repeat this process once you decide to go live after merging all databases into the default shard. It is also advisable to do this as a separate project before performing the upgrade to Liferay 7.x. Once you have completed this process, you just need to execute the upgrade as for a regular non-sharded environment.

This alternative for upgrading sharded environments is not officially supported, but it has been executed successfully in a couple of installations. For that reason, if you have any questions regarding it, please write a comment on this blog entry or open a new thread in the community forums; other members of the community and I will try to assist you during this process.

Alberto Chaparro 2018-08-10T07:35:00Z
Categories: CMS, ECM

Extending Liferay DXP - User Registration (Part 1)

Sun, 08/05/2018 - 20:19

I am starting a short series of blogs in which I will show how to customise the User Registration / Customer Acquisition process in Liferay.

The inspiration for this blog is one of our DXP portal implementations which had very broad (and frequently changing) user data collection requirements.

There are many scenarios in which you want to customise the user registration and account creation process:

  • creating a custom user registration form allows delivering a more consistent user experience;
  • an approver may need to decide, based on the information a user provided, whether to grant or reject the registration;
  • apart from the default fields, you can add any user account fields;
  • you can collect any other information related to your business and store it in your custom tables;
  • different registration flows might be required for different groups of users or sites;
  • a wizard-like registration form can deliver more information and help users make the right choices;
  • users can apply for different roles or permissions at registration;
  • and it could be a place where you might want to display your marketing material.

For now I plan to write three articles:

1. Custom registration process for a portal with multiple sites. 
The main features:

  • Check if a user already has an account
  • Show and allow the user to select which site to join
  • Collect additional information from the user and send email notification to interested parties (approver for restricted site) on form submission
  • Automatically create the user account on form submission if the membership type of the selected site is open; otherwise create the user account but make it inactive (it will be activated by an approver)

2. Getting more data out of registration form 
The main features:

  • Add fields to the registration form that will be stored in the User Account details
  • Collect information to populate user Custom fields
  • Add fields to store in custom tables
  • Allow users to download and print the entries they made to the form for their records

3. Taking payments at registration 
The main features:

  • If payment is required as part of request to join a site, display amount to pay and collect the payment
  • Allow users to download Receipt for their records


The registration form will be based on SmartForms. Of course, you can build your own form/UI, but here are the reasons why you might consider SmartForms:

  • Flexibility - non-technical staff can edit the form and change wording to adjust or add explanations/instructions/conditions on the form.
  • Styling made easy - fully separated from software code and can be done by web designer.
  • Simple Integration with your other systems via webservices to bring data to the form.
  • Easy field value extraction from the submitted form into Liferay data stores or your other systems. 
  • But most important: cutting costs on portlet development, as form creation and further adjustments/changes can be done without software modifications.

You can download SmartForms from the Liferay Marketplace or from the SmartForms downloads page; additional information can be located here


Even if you are not going to use SmartForms, I hope these blogs will help you, as I will be providing samples of how to use Liferay services/APIs for your user registration purposes.

Victor Zorin 2018-08-06T01:19:00Z
Categories: CMS, ECM

The new Web Experience functionalities released in 7.1

Fri, 08/03/2018 - 05:52

In Liferay Portal 7.1, a lot of effort has been put into improving the experience of building sites and pages and, as a result, some new functionalities have been introduced to empower marketers and other business users in the creation of stunning websites.

We have achieved this by considering the differentiated needs of the marketer, designer and web developer, and by adding a new approach to content and page creation. We will continue working to expand the initiative in future releases, but some exciting features have already been introduced.

The key elements added to Web Experience under what we call Modern Site Building are Page Fragments, Content Pages, Display Pages and Multiple Navigation Menus. This is the first entry in a series of posts in which we will have the opportunity to go into more detail on each of them. But, as a starter, we will briefly introduce the big picture and how every component fits into it.


The context

Authors, marketers, designers, HR teams or whoever is in charge of creating a piece of content or site pages sometimes face repetitive work, but not always, right? Sometimes they are happy to use web content structures, because content reusability sounds like pure magic. Other times they wish there were a faster way to create pages, because it is a one-off and what really matters is minimizing the number of clicks while making sure it looks awesome. Put simply, the needs are not always the same.

Liferay has always used a very content-centric, decoupled approach. The creation of content in the form of different assets and the display of that content have always been two separate things. Thus, the long-established approach to page creation with Liferay is combining content and applications. These pages are flexible, powerful and offer advanced functionality, but they also add complexity to some use cases.


The authoring flow considered

For the new functionalities introduced we have considered a design and authoring flow like the one described below.


The marketer or the business user needs to create an effective landing page that communicates the desired message and increases customer engagement. The marketer does not have front-end knowledge, but would like an interface to create site pages in a few clicks, with editing options, while making sure the result looks great.

In order to achieve this, previous work by the Design and Web Development teams is necessary. First of all, the designer designs several reusable page parts called Page Fragments (lists of content, headers, title sections, footers…).

The Web developer then implements these designs, coding the appearance and behavior of the Page Fragments and loads them into the Page Fragment collection library. Thereafter, the designer can create Page Templates by adding and combining Page Fragments, making a Page Template collection available for the marketer to directly start working on.

Using this approach, marketers will focus on writing content in Content Pages, using Page Templates created by designers, which in turn consist of Page Fragments previously developed by web developers.


Key new elements.

With this authoring flow considered, we have introduced new functionalities attending to the different roles and responsibilities in the creation process:

  • Page Fragments, which are collections of “design blocks” created by web developers using HTML, CSS and JavaScript that are made available to non-technical users to build Pages by composing them.

  • Content Pages, a new way of creating pages in a few clicks. The main benefit of Content Pages is the easy editing of their content directly on the page.

  • Display Pages, which allow controlling how a particular piece of content is displayed as a full page.

  • Multiple Navigation Menus, which improve navigation by allowing you to introduce several menus: in the header, in the footer, in a sidebar...


Themes with Page Fragments available from the Marketplace

To ease the Blank Page Syndrome, you can start from some already defined examples. If you want to leverage existing Page Fragment collections to start exploring the possibilities of the new features right away, you can already download the Fjord theme for Liferay Portal 7.1 from the Liferay Marketplace. It contains a series of Page Fragments and Page Templates that can be used as a starter kit. More themes will follow soon!



I want to know more.

Read along the coming series of posts to learn about Page Fragments, Content Pages, Display Pages and Multiple Navigation Menus.

Also, if you want to learn more about how to empower your team to build modern sites, you can sign up for free for the lesson on “Building Engaging Websites” available in Liferay University, or access the Liferay Documentation.


Ianire Cobeaga 2018-08-03T10:52:00Z
Categories: CMS, ECM

New Project SDK Installers 3.2.0 GA1 Released

Thu, 08/02/2018 - 21:45

We are pleased to announce the first generally available release of the Liferay Project SDK Installers that support Liferay 7.1.

New Installers:

Customers can download all of them from the customer studio download page.

The installer is the full-fledged Liferay Developer Studio installer, which installs Liferay Workspace, Blade and Developer Studio, and comes pre-bundled with the latest Liferay DXP server. It also supports configuring a proxy to use for downloading Gradle dependencies.

Upgrade from a previous 3.1.x:
  1. Download updatesite here

  2. Go to Help > Install New Software… > Add…

  3. Select Archive..., Browse to the downloaded updatesite

  4. Click OK to close the Add Repository dialog

  5. Select all features to upgrade, then click Next; click Next again and accept the license agreements

  6. Finish and restart to complete the upgrade

Release highlights:

Installers Improvements:

1. Better support for HTTP, HTTPS and SOCKS5 proxies

2. Bundle the latest Liferay Portal

   - bundle 7.1.0 GA1 in LiferayProjectSDKwithDevStudioCommunityEdition installers

   - bundle DXP 7.1.10 GA1 in LiferayProjectSDKwithDevStudioDXP installers

Improvements for Deployment:

1. Support Liferay Watch Task

2. Support for Target Platform

3. Improve wizard to download dependencies in background

4. Better deployment support for Liferay DXP/7

   - support DXP 7.1.10 GA1 Tomcat and Wildfly

   - support Liferay 7.1 CE GA1 Tomcat and Wildfly

   - integration of Blade CLI 3.1.1

   - support Plugins sdk 1.0.19

   - support Liferay Workspace Gradle 1.10.2

5. Third party plugins update

   - update m2e to 1.8.3

   - update Gradle plugin Buildship to latest 3.0.0

6. Miscellaneous bug fixes


If you run into any issues or have any suggestions, please come find us on our community forums or report them on JIRA (IDE project); we are always around to try to help you out. Good luck!

Yanan Yuan 2018-08-03T02:45:00Z
Categories: CMS, ECM

Oh no, my URLs disappeared…(and how to get them back)

Thu, 08/02/2018 - 02:30

Recently we got a couple of complaints about the new Web Content article behaviour, specifically about the JournalArticle.getContent() method’s return value. When developers embed an image into a Web Content article, or use a ddm-image or ddm-document-library field in their structures, they expect to see the URL of the object (image or D&M asset) in the raw XML returned by JournalArticle.getContent(). In 7.0 it actually was there, and the raw XML looked like this:

(...)

<dynamic-element name="Image8r1v" type="image" index-type="text" instance-id="ryns">
  <dynamic-content language-id="en_US" alt="" name="blonde.png" title="blonde.png" type="journal" fileEntryId="34506" id="34835">
    /image/journal/article?img_id=34835&amp;t=1531817578959
  </dynamic-content>
</dynamic-element>

(...)


There are two main differences in 7.1:

  • We switched from the internal JournalArticleImage table to the common Documents and Media repository as the storage for Web Content article images
  • DDM fields for Image and D&M assets changed their internal representation from a URL to a JSON object

Now the raw XML of an article with Image or ddm-image (ddm-document-library) fields looks like this:

(...)

<dynamic-element name="Image54q7" type="image" index-type="text" instance-id="wscg">
  <dynamic-content language-id="en_US"><![CDATA[{
    "groupId":"20124","name":"allatonce.png","alt":"",
    "title":"allatonce.png","type":"journal",
    "uuid":"80269faa-dea9-fd5a-cb78-3c7aa9da51ea",
    "fileEntryId":"36774","resourcePrimKey":"36772"}]]>
  </dynamic-content>
</dynamic-element>

(...)


It was an internal decision, and we didn’t realize that there could be developers out there who actually use the raw XML content for their own needs…

First I would like to explain why it was done: not to excuse this case, but to prevent such cases in the future. On the one hand, JournalArticle.getContent() is a public API and its behaviour must be, at least, backward compatible; on the other hand, its behaviour depends on many components behind it, the signature of the method didn’t change, and the implementation details (including the raw XML format of the content) were never published. To avoid such problems, we strongly recommend that developers use the published means for Web Content processing, such as JournalContent and JournalArticleDisplay. Both of them provide the processed content of the article without the need to work with the raw XML. A clear example can be found in the Web Content Display portlet:

(...)

JournalArticleDisplay articleDisplay = _journalContent.getDisplay(
    article, ddmTemplateKey, viewMode, languageId, page,
    new PortletRequestModel(renderRequest, renderResponse), themeDisplay);

String processedContent = articleDisplay.getContent();

(...)

@Reference
private JournalContent _journalContent;

(...)


There is also a taglib which allows rendering a specific journal article using its JournalArticleDisplay instance:

<liferay-journal:journal-article-display articleDisplay="<%= articleDisplay %>" />


Or the developer can use the JournalContent.getContent() method directly; the result must be the same: processed content where all the fields behave as expected.

Now let’s talk about how to get the URLs back, because I understand that it could be a problem to refactor hundreds of lines of your code, and the best way for developers who use the raw XML would be to keep processing the URLs as they were doing before.

Here I have to mention one detail: there is no way to return to the old format of the URLs for embedded images, so if you have some sort of regular expression catching “/image/journal/article...”, there is no way to make it work again.

There are two options to get the URLs back; both need you to adapt your existing code, which works with the raw XML, a little bit.

First option is applicable when you have a concrete file entry ID:

(...)

// here fieldValue is the raw XML field value for your Image/D&M field
JSONObject jsonObject = JSONFactoryUtil.createJSONObject(fieldValue);

long fileEntryId = jsonObject.getLong("fileEntryId");

FileEntry fileEntry = PortletFileRepositoryUtil.getPortletFileEntry(fileEntryId);

String fileEntryURL = PortletFileRepositoryUtil.getDownloadPortletFileEntryURL(
    themeDisplay, fileEntry, StringPool.BLANK);

(...)

And the second option is applicable when you don’t have a specific file entry ID, but have the UUID and group ID of the target entry:

(...)

// here fieldValue is the raw XML field value for your Image/D&M field
JSONObject jsonObject = JSONFactoryUtil.createJSONObject(fieldValue);

long fileEntryGroupId = jsonObject.getLong("groupId");
String fileEntryUuid = jsonObject.getString("uuid");

FileEntry fileEntry = PortletFileRepositoryUtil.getPortletFileEntry(
    fileEntryUuid, fileEntryGroupId);

String fileEntryURL = PortletFileRepositoryUtil.getDownloadPortletFileEntryURL(
    themeDisplay, fileEntry, StringPool.BLANK);

(...)


Hope these few lines of code help you solve the problem. We understand that it can be frustrating to deal with such changes, and we are trying to do our best to avoid them whenever there is no actual need.

Pavel Savinov 2018-08-02T07:30:00Z
Categories: CMS, ECM

Monitor c3p0 in Liferay

Tue, 07/31/2018 - 20:05
The Problem

The c3p0 connection pool exposes MBeans for JMX tools like jconsole to monitor its run-time state. However, both c3p0 and Liferay intentionally generate random MBean names on each startup. The random name of each data source makes it difficult to set up an independent monitoring tool.

The second problem is that, by default, Liferay creates several data sources: one used by most Liferay features, one for the counter service and one for analytics. They all use the same portal property prefix "jdbc.default.", making them identical in every way except for their random bean names.

The Solution

A default c3p0 bean name follows this pattern:

com.mchange.v2.c3p0:identityToken=<random token>,name=<same random token>,type=PooledDataSource

The default behavior sets the value of both identityToken and name to the same random string. According to the c3p0 documentation, the identityToken can be removed with a system property. The second part of the solution is to explicitly name the data sources created by Liferay.

Remove identityToken

Add the following line to TOMCAT_HOME/bin/

JAVA_OPTS="$JAVA_OPTS -Dcom.mchange.v2.c3p0.management.ExcludeIdentityToken=true"

Explicitly Name Data Sources

Add the following lines to LIFERAY_HOME/

jdbc.default.dataSourceName=liferayDataSource
counter.jdbc.prefix=jdbc.counter.
jdbc.counter.dataSourceName=counterDataSource

These lines name the default data source liferayDataSource and the counter service data source counterDataSource; the analytics data source is named similarly (analyticsDataSource). These names are all arbitrary, so you can choose other desired values.

The second line gives the counter service data source a new property prefix "jdbc.counter." Use this prefix to set counterDataSource properties you wish to be different from the liferayDataSource. Conversely, any properties not set with this prefix will use the value set using the "jdbc.default." prefix.
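As an illustration of how the prefix override works (the maxPoolSize values below are made up for the example; any property not set with the "jdbc.counter." prefix falls back to its "jdbc.default." value):

```properties
jdbc.default.maxPoolSize=100
jdbc.counter.maxPoolSize=25
```

With these lines, liferayDataSource allows up to 100 connections while counterDataSource allows only 25, and both share every other setting from the "jdbc.default." prefix.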

Now in JMX, c3p0 beans look like this:

  • com.mchange.v2.c3p0:name=analyticsDataSource,type=PooledDataSource
  • com.mchange.v2.c3p0:name=counterDataSource,type=PooledDataSource
  • com.mchange.v2.c3p0:name=liferayDataSource,type=PooledDataSource

Name Other Data Sources

If your Liferay is configured with other data sources, you must also give each of them a unique name. Otherwise, multiple MBeans will share the liferayDataSource name, and only one of them will be detected by JMX tools.

For example, if you have a third data source defined with prefix "", add this line to LIFERAY_HOME/

Then you should see these data sources:

  • com.mchange.v2.c3p0:name=analyticsDataSource,type=PooledDataSource

  • com.mchange.v2.c3p0:name=counterDataSource,type=PooledDataSource

  • com.mchange.v2.c3p0:name=liferayDataSource,type=PooledDataSource

  • com.mchange.v2.c3p0:name=sapDataSource,type=PooledDataSource

What to Watch

In a JMX monitoring tool, monitor these three properties of each data source's MBean:

  • numConnectionsAllUsers

  • numBusyConnectionsAllUsers

  • numIdleConnectionsAllUsers

An alert should be configured for when numBusyConnectionsAllUsers exceeds a certain percentage of maxPoolSize, which forecasts a possible exhaustion of database connections for that data source.
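If you prefer programmatic monitoring over a GUI tool, the same attributes can be read over JMX. A minimal sketch against the local platform MBean server (the bean and attribute names follow the c3p0 pattern above; the 80% alert threshold is an arbitrary example):

```java
import java.lang.management.ManagementFactory;
import javax.management.MBeanServer;
import javax.management.MalformedObjectNameException;
import javax.management.ObjectName;

public class C3p0Monitor {

    // Pattern matching every named c3p0 pooled data source MBean.
    public static ObjectName pooledDataSourcePattern() {
        try {
            return new ObjectName("com.mchange.v2.c3p0:type=PooledDataSource,name=*");
        } catch (MalformedObjectNameException e) {
            throw new IllegalStateException(e);
        }
    }

    public static void main(String[] args) throws Exception {
        MBeanServer server = ManagementFactory.getPlatformMBeanServer();

        for (ObjectName name : server.queryNames(pooledDataSourcePattern(), null)) {
            int busy = ((Number) server.getAttribute(
                name, "numBusyConnectionsAllUsers")).intValue();
            int max = ((Number) server.getAttribute(name, "maxPoolSize")).intValue();

            System.out.printf("%s: busy=%d, max=%d%n",
                name.getKeyProperty("name"), busy, max);

            // Arbitrary example threshold: alert at 80% of maxPoolSize.
            if (busy > 0.8 * max) {
                System.out.println("ALERT: pool nearing exhaustion");
            }
        }
    }
}
```

Run inside (or attached to) the Liferay JVM; in a standalone JVM the query simply finds no c3p0 beans.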


Michael Chen 2018-08-01T01:05:00Z
Categories: CMS, ECM

Dynamic Widget

Tue, 07/31/2018 - 15:20

My colleague Vagif proposed a new way to develop a Liferay dynamic widget (the best name I can think of for now) with these pieces:

  1. Use the “Basic Web Content” structure or define a new structure.
  2. For each dynamic widget type, define a new display template.
  3. Create an OSGi service that returns a data model to be used by the display template.
  4. Create a web content article with the above structure/template pair.
  5. Place this web content article onto the page with Web Content Display (OOTB).

All business logic is in the OSGi service. The structure can serve as a preference or parameter holder for the display template. The display template draws most meaningful data from the model returned by the service.
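A minimal sketch of the service in step 3 (all names here are hypothetical; in a real Liferay module this would be registered as an OSGi @Component so the display template can find it via serviceLocator):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical model-builder service for a "software project" widget.
// All business logic stays here; the Freemarker display template only
// renders the fields of the returned model.
public class SoftwareProjectWidgetService {

    // One call builds the complete, template-ready model (the Simple Rule).
    public Map<String, Object> buildModel(long companyId) {
        Map<String, Object> model = new LinkedHashMap<>();
        model.put("title", "Annual Sales Figures 2018");
        model.put("companyId", companyId);
        model.put("chartType", "bar");
        return model;
    }
}
```

The display template would then do little more than assign the result of buildModel(companyId) to a variable and render each entry.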

You can build a portlet-less Liferay site with this "Crafter-ish" approach.


A typical Liferay feature has program code spread out among many technologies: service Java code, portlet Java code, template code, template Java code (JSP), language files, property files, XML files, etc. You often find a feature morphing into some JSP files with 100+ lines of Java code, a 2,000-line portlet, plus several service builders, all competing for business logic implementations and MVC roles.

A Simpler Pattern

Dynamic widget may serve as a Liferay implementation pattern that simplifies and promotes good programming practices (I think this is a better pattern, but some may disagree):

  • Write Java code in and only in service modules (no portlet).
  • Implement all business logic in services.
  • The display template calls a single service to retrieve a data model.
  • The display template then renders the model with almost no business logic.

A Simple Rule:

  • Let the services build the most foolproof model, so the template can render it in the simplest way.

There is nothing stopping a template from calling multiple services and then juggling multiple models to render the view. However, do yourself a favor: write a new service method that combines all of them into a single model for the template. You will thank yourself later.

Why Freemarker?

Not being able to use JSP may be a downside of dynamic widgets, or is it?

Compared to JSP, template languages like Freemarker cannot mix Java code with UI code, so they promote a cleaner separation of the view from the model and controller. Some may argue that template code getting the model from services is controller-ish. Agreed, but beyond that, keeping Freemarker code simple is not hard at all, because complex Freemarker code is ugly and painful to write, not to mention tough to debug, log, handle errors in, etc.

Pseudo Web Content

Dynamic widget is a Liferay web content, but its "content" is not in its web content structure. The content comes from the services. If you think of a Web Content Display rendering a dynamic widget on a page as calling a function in a program, then its web content structure is like the parameters for that function call.

The widget's web content may be empty, which is similar to calling a function with no parameters; the service that builds the model for the template has everything it needs. In other cases, the structure can have variables used as parameters for calling the service, or used directly by the template just like any other web content.

Search and Preview

Note that variables in the web content structure can be searchable, making them available to Liferay search and Asset Publisher. For example, a dynamic widget may be titled "Annual Sales Figures 2018", which renders a bar chart from a model provided by a service module. Then this widget may be a search hit, along with its pretty chart if you wish. You can't have that if the chart is implemented in a portlet.

Don't forget, you can easily mark the web content as not searchable too:

Another convenience of dynamic widgets over portlets is preview. Open Liferay Control Panel > (a site) > Web Content. You can preview a widget independently of a page from its Options menu:

Managing Display Templates

For the rest of this topic, let's call the template associated with the web content structure the parent template. The parent template can render the entire widget all by itself, but it's much more likely for the parent template to use other templates with the '<#include />' Freemarker directive.

Here we discuss three ways to develop templates for dynamic widgets.

Liferay Generic Templates

Open Liferay Control Panel > (a site) > Web Content > (corner Options menu) > Templates. Note the first column of the table view, called "ID". The value of that column is the template ID, also known as the template key.

From this Templates view, you can create a template and leave the "Structure" field empty. That creates a generic template. Then in a parent template, you can use a generic template like this:

<#include "${templatesPath}/TEMPLATE_KEY"/>

With this framework, you can implement everything inside the Liferay user interface. However, exporting and then importing generic templates via a LAR file will invalidate all template keys. All parent templates must then be manually updated with the new keys.

Package Templates In A Module

In the attached, "modules/my-templates" demonstrates packaging Freemarker templates in an OSGi module. The key for parent templates to use templates in this module is the "Web-ContextPath" header:

modules/my-templates/bnd.bnd

Bundle-Name: My Templates Bundle
Bundle-SymbolicName: my.liferay.templates
Bundle-Version: 1.0.0
Web-ContextPath: /my-templates

In a parent template, use the Liferay infix _SERVLET_CONTEXT_ like this:

<#include "my-templates_SERVLET_CONTEXT_/widget/software-project.ftl" />

where the part before the infix is the "Web-ContextPath" value, and the part after it is the template path in the module project under "src/main/resources".

The Liferay Freemarker engine allows a template to include sibling templates in the same module using paths relative to the current template. For example,

modules/my-templates/src/main/resources/widget/software-project.ftl

<h3>software-project.ftl</h3>
From parent path: <#include "../common/left-navigation.ftl"/>
From child path: <#include "more/right-navigation.ftl"/>

Now for the fun part: how Freemarker templates find and call services. First of all, you need to remove Liferay's default restriction on template variables by creating this configuration file with the following content:

LIFERAY_HOME/osgi/configs/com.liferay.portal.template.freemarker.configuration.FreeMarkerEngineConfiguration.cfg

restrictedVariables=

Then a template can access services in several ways, for example, using the "staticUtil" or "serviceLocator" variables:

modules/my-templates/src/main/resources/widget/software-project.ftl

Using restricted variables:
<#assign userLocalServiceUtil = staticUtil['com.liferay.portal.kernel.service.UserLocalServiceUtil'] />
${userLocalServiceUtil.getDefaultUserId(companyId)},
<#assign userLocalService = serviceLocator.findService('com.liferay.portal.kernel.service.UserLocalService') />
${userLocalService.getDefaultUser(companyId).originalEmailAddress}

Refer to the "Extras" section for a complete list of all Liferay variables available to Freemarker templates.

Package Templates In A Theme

The attached also includes an example theme in "wars/my-theme". Following the folder conventions of a theme, templates in the project are under the folder "src/main/webapp/templates". The Gradle build automatically generates a "Web-ContextPath" header with the value of the project folder. Therefore, a parent template can reference a template in this theme as:

<#include "my-theme_SERVLET_CONTEXT_/templates/widget/software-project.ftl" />

Other aspects of templates in a theme are identical to ones in a module.


You can deploy both the module JAR and the theme WAR by copying them to the Liferay auto-deploy folder. Here is what they look like when running:

$ telnet localhost 11311
Trying
Connected to localhost.
Escape character is '^]'.
____________________________
Welcome to Apache Felix Gogo

g! lb my
START LEVEL 20
   ID|State  |Level|Name
  759|Active |   10|Liferay My Account Web (1.0.11)
  826|Active |   10|Liferay Portal Security AntiSamy (2.0.12)
  994|Active |   10|Liferay Site My Sites Web (1.0.8)
 1232|Active |   10|My Templates Bundle (1.0.0)
 1237|Active |   10|my-theme (7.0.10)
g!

Extras

A complete list of all Liferay variables available to Freemarker templates, including their implementation class names and snippets of Liferay source code:

@Meta.AD(
    deflt = "serviceLocator|utilLocator|objectUtil|staticFieldGetter|staticUtil",
    required = false
)
public String[] restrictedVariables();

com.liferay.portal.template.TemplateContextHelper.getHelperUtilities(, false) {
    accountPermission=com.liferay.portal.service.permission.AccountPermissionImpl,
    arrayUtil=com.liferay.portal.kernel.util.ArrayUtil_IW,
    auditRouterUtil=null,
    browserSniffer=com.liferay.portal.servlet.BrowserSnifferImpl,
    calendarFactory=com.liferay.portal.util.CalendarFactoryImpl,
    commonPermission=com.liferay.portal.service.permission.CommonPermissionImpl,
    dateFormatFactory=com.liferay.portal.util.FastDateFormatFactoryImpl,
    dateFormats=com.liferay.portal.util.FastDateFormatFactoryImpl,
    dateTool=May 5, 2018 10:34:58 AM,
    dateUtil=com.liferay.portal.kernel.util.DateUtil_IW,
    expandoColumnLocalService=com.liferay.portlet.expando.service.impl.ExpandoColumnLocalServiceImpl,
    expandoRowLocalService=com.liferay.portlet.expando.service.impl.ExpandoRowLocalServiceImpl,
    expandoTableLocalService=com.liferay.portlet.expando.service.impl.ExpandoTableLocalServiceImpl,
    expandoValueLocalService=com.liferay.portlet.expando.service.impl.ExpandoValueLocalServiceImpl,
    getterUtil=com.liferay.portal.kernel.util.GetterUtil_IW,
    groupPermission=com.liferay.portal.service.permission.GroupPermissionImpl,
    htmlUtil=com.liferay.portal.util.HtmlImpl,
    httpUtil=com.liferay.portal.util.HttpImpl,
    imageToken=com.liferay.portal.webserver.WebServerServletTokenImpl,
    imageToolUtil=com.liferay.portal.image.ImageToolImpl,
    jsonFactoryUtil=com.liferay.portal.json.JSONFactoryImpl,
    languageUtil=com.liferay.portal.language.LanguageImpl,
    layoutPermission=com.liferay.portal.service.permission.LayoutPermissionImpl,
    localeUtil=com.liferay.portal.kernel.util.LocaleUtil,
    locationPermission=com.liferay.portal.service.permission.OrganizationPermissionImpl,
    organizationPermission=com.liferay.portal.service.permission.OrganizationPermissionImpl,
    paramUtil=com.liferay.portal.kernel.util.ParamUtil_IW,
    passwordPolicyPermission=com.liferay.portal.service.permission.PasswordPolicyPermissionImpl,
    portal=com.liferay.portal.util.PortalImpl,
    portalPermission=com.liferay.portal.service.permission.PortalPermissionImpl,
    portalUtil=com.liferay.portal.util.PortalImpl,
    portletModeFactory=com.liferay.portal.kernel.portlet.PortletModeFactory_IW,
    portletPermission=com.liferay.portal.service.permission.PortletPermissionImpl,
    portletProviderAction={ADD=ADD, BROWSE=BROWSE, MANAGE=MANAGE, EDIT=EDIT, PREVIEW=PREVIEW, VIEW=VIEW},
    portletURLFactory=com.liferay.portlet.PortletURLFactoryImpl,
    prefsPropsUtil=com.liferay.portal.util.PrefsPropsImpl,
    propsUtil=com.liferay.portal.util.PropsImpl,
    randomizer=com.liferay.portal.kernel.util.Randomizer,
    rolePermission=com.liferay.portal.service.permission.RolePermissionImpl,
    saxReaderUtil=com.liferay.portal.xml.SAXReaderImpl,
    serviceLocator=com.liferay.portal.template.ServiceLocator,
    sessionClicks=com.liferay.portal.kernel.util.SessionClicks_IW,
    staticFieldGetter=com.liferay.portal.kernel.util.StaticFieldGetter,
    stringUtil=com.liferay.portal.kernel.util.StringUtil_IW,
    timeZoneUtil=com.liferay.portal.kernel.util.TimeZoneUtil_IW,
    unicodeFormatter=com.liferay.portal.kernel.util.UnicodeFormatter_IW,
    unicodeLanguageUtil=com.liferay.portal.language.UnicodeLanguageImpl,
    userGroupPermission=com.liferay.portal.service.permission.UserGroupPermissionImpl,
    userPermission=com.liferay.portal.service.permission.UserPermissionImpl,
    utilLocator=com.liferay.portal.template.UtilLocator,
    validator=com.liferay.portal.kernel.util.Validator_IW,
    velocityPortletPreferences=,
    webServerToken=com.liferay.portal.webserver.WebServerServletTokenImpl,
    windowStateFactory=com.liferay.portal.kernel.portlet.WindowStateFactory_IW,
}


Michael Chen 2018-07-31T20:20:00Z
Categories: CMS, ECM

A Simplified Pattern for Liferay 7 Services

Tue, 07/31/2018 - 13:07

This is a simplified OSGi service API and implementation pattern. It follows the traditional Java interface-implementation pattern, in which the programmer is only required to keep the interface class and implementation class in sync. It does not use the Liferay 7 Service Builder.

The attached archive is a fully implemented ORM service based on MyBatis. Unzip it into the modules folder of a Blade-created workspace. A script to create the back-end database will be added soon to make this example fully operational.

Implementation Pattern

In this ORM example, there are two top level packages: 'api' and 'impl'. The 'api' and its children packages are to be exported and used by consumers of this service API. The 'impl' and its children packages are for implementation only and should remain private packages of this OSGi module.

Factory Usage Pattern

The 'api.Factory' class is the access point for consumers to get the services they need. A consumer class uses the Factory like this:

import com.acme.orm.api.bean.Order;
import static com.acme.orm.api.Factory.getOrderLocalService;

class OnlineStore {

    public Order checkOrder(String orderId) {
        Order order = getOrderLocalService().getOrderDetailsById(orderId);

        // Do something else

        return order;
    }
}

In order to preserve OSGi's run-time life cycle management of this module (start, stop, install, uninstall), it is important NOT to keep a reference to the service object obtained from the Factory:

// DO NOT DO THIS
OrderLocalService myService = getOrderLocalService();

// DO NOT DO THIS
OrderLocalService myService = Factory.getOrderLocalService();

The Liferay Service Builder goes to great lengths to prevent programmers from keeping a reference to the service object, by generating, and forcing people to use, the static methods in the XyzServiceUtil class. It also creates other confusing and irrelevant artifacts: XyzServiceBaseImpl, XyzServiceWrapper, and two projects (two jars) for one service.

Instead of making it foolproof with all those complexities, why not just tell programmers, a very intelligent bunch, not to keep references to OSGi service objects? The result is this clean implementation pattern, with no generated artifacts and two easy-to-understand rules:

  • Keep the API interface class in sync with the implementation class.
  • Do not keep a reference to the service object obtained from the Factory.
Understanding ServiceTracker

When an OSGi module (or bundle) is replaced at run-time due to being stopped, started, uninstalled or re-installed, the desired effect is that the services provided by that module are replaced as well. ServiceTracker is the OSGi class that keeps track of module life cycle changes. Module life cycle changes are transparent to service consumer code as long as the consumer code always accesses the service through the ServiceTracker.

OSGi is a component framework running in a JVM instance, and it exhibits the same run-time behavior as any Java program. When consumer code saves a reference to a service object, that service object lives on even after OSGi has replaced its module with a new instance. The saved service object then becomes an orphaned, out-of-date instance known only to that consumer code. This is the reason for not keeping a reference to the service.
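The stale-reference problem can be seen in a tiny self-contained sketch. All names here are hypothetical stand-ins, not Liferay or OSGi APIs: a toy "tracker" always hands out the current service instance, while a cached reference keeps pointing at the replaced one.

```java
import java.util.concurrent.atomic.AtomicReference;

// Hypothetical stand-in for a ServiceTracker: it always returns the
// *current* service instance, which can be swapped out at run-time.
class Tracker {
    private final AtomicReference<String> service = new AtomicReference<>("impl-v1");

    String getService() {
        // Always resolves to whatever implementation is currently registered.
        return service.get();
    }

    void replace(String newImpl) {
        // Simulates a bundle stop/start that registers a new service instance.
        service.set(newImpl);
    }
}

public class StaleReferenceDemo {
    public static void main(String[] args) {
        Tracker tracker = new Tracker();

        // Anti-pattern: caching the service reference.
        String cached = tracker.getService();

        // The module is replaced at run-time.
        tracker.replace("impl-v2");

        System.out.println(cached);               // impl-v1 -- orphaned, out of date
        System.out.println(tracker.getService()); // impl-v2 -- always current
    }
}
```

This is why the Factory in this pattern calls getService() on every access instead of caching the result.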

In this implementation pattern, the Factory class retrieves the service object from its corresponding ServiceTracker. The getService() method of the ServiceTracker shields the module's life cycle changes from the consumer code:

@ProviderType
public class Factory {

    private static ServiceTracker<OrderLocalService, OrderLocalService>
        _OrderLocalService = /* initialization elided in the original */;

    public static OrderLocalService getOrderLocalService() {
        return _OrderLocalService.getService();
    }
}

Local vs. Remote Service

The differences between a Liferay 7.0 local service and remote service are:

  1. The base interface of the API.
  2. Specific annotations for the remote interface.

In the ORM example, OrderLocalService is the local service interface:

Local Service API Declaration

@ProviderType
@Transactional(isolation = Isolation.PORTAL, rollbackFor = {
    PortalException.class, SystemException.class})
public interface OrderLocalService extends BaseLocalService {
}

while OrderService is the remote service interface, exposed as a RESTful web service:

Remote Service API Declaration

@AccessControlled
@JSONWebService
@OSGiBeanProperties(property = {
    "", "json.web.service.context.path=Order"
}, service = OrderService.class)
@ProviderType
@Transactional(isolation = Isolation.PORTAL, rollbackFor = {
    PortalException.class, SystemException.class})
public interface OrderService extends BaseService {

    public Order getOrderDetailsById(String orderId);
}

This RESTful web service can be found in the following catalog under Context Name "acme" (click the drop down box to find "acme" or other context names):


Both the local and remote service implementation classes just implement their corresponding API interfaces. In the ORM example, OrderLocalServiceImpl is the local implementation that does the actual work of mapping to the database. The remote implementation, as shown below, simply calls the local Factory services:

Remote Service Implementation

@ProviderType
public class OrderServiceImpl implements OrderService {

    public Order getOrderDetailsById(String orderId) {
        return Factory.getOrderLocalService().getOrderDetailsById(orderId);
    }
}

Development Details

Here are some key files for creating and implementing a service in this pattern.

Eclipse .project and .classpath

These two files in the example archive must be used to start your service project for Eclipse to recognize it as a Gradle project. You can change the project name in the <name> tag of the .project file before importing the project to Eclipse:
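For reference, a minimal .project sketch is shown below. The project name is illustrative, and the JDT/Buildship builder and nature identifiers are the standard Eclipse values; treat the actual file in the example archive as authoritative.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<projectDescription>
	<!-- Change this name before importing the project into Eclipse. -->
	<name>acme-orm</name>
	<comment></comment>
	<projects></projects>
	<buildSpec>
		<buildCommand>
			<name>org.eclipse.jdt.core.javabuilder</name>
			<arguments></arguments>
		</buildCommand>
		<buildCommand>
			<name>org.eclipse.buildship.core.gradleprojectbuilder</name>
			<arguments></arguments>
		</buildCommand>
	</buildSpec>
	<natures>
		<nature>org.eclipse.jdt.core.javanature</nature>
		<nature>org.eclipse.buildship.core.gradleprojectnature</nature>
	</natures>
</projectDescription>
```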


You must also create these two folder structures to hold your Java and resource files:
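The two folder structures are not reproduced here; judging from the paths used elsewhere in this example (e.g. src/main/resources/META-INF/spring), they are presumably the standard Gradle source layout:

```
src/main/java
src/main/resources
```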


Once imported to Eclipse, be sure to right click on the project and select "Gradle" -> "Refresh Gradle Project". You can also do the same thing with the parent Gradle project created by Liferay Blade.


The two 'org.osgi:org.osgi.*' dependencies are required for OSGi features.

dependencies {
    compile group: "com.liferay", name: "com.liferay.osgi.util", version: "3.0.3"
    compile group: "com.liferay.portal", name: "com.liferay.portal.kernel", version: "2.0.0"
    compile 'org.osgi:org.osgi.core:5.0.0'
    compile 'org.osgi:org.osgi.annotation:6.0.0'
    compile group: 'javax.servlet', name: 'servlet-api', version: '2.5'
    compile group: "org.mybatis", name: "mybatis", version: "3.4.1"
    compile files('./resources/lib/sqljdbc4.jar')
    compileOnly group: "com.liferay", name: "com.liferay.journal.api", version: "1.0.0"
}

bnd.bnd

All packages under 'api' should be exported in the "Export-Package:" setting. The "Liferay-Spring-Context:" setting directs Liferay to load the Spring bean definitions in the module-spring.xml file discussed below. The "Liferay-Require-SchemaVersion:", "Liferay-Service:" and "Require-Capability:" settings are also required.

Bundle-Version: 1.0.0
Bundle-ClassPath: .,lib/sqljdbc4.jar
Export-Package: \
    com.acme.orm.api,\
    com.acme.orm.api.bean,\
    com.acme.orm.api.exception
Import-Package: \
    !*,\
    !microsoft.sql.*,\
    !com.sun.jdi.*,\
    !net.sf.cglib.proxy.*,\
    !org.apache.logging.*,\
    *
Include-Resource: @mybatis-3.4.1.jar
Liferay-Require-SchemaVersion: 1.0.0
Liferay-Service: true
Liferay-Spring-Context: META-INF/spring
Require-Capability: liferay.extender;filter:="(&(liferay.extender=spring.extender)(version>=2.0)(!(version>=3.0)))"

src/main/resources/META-INF/spring/module-spring.xml

For each bean definition, the "class=" value is the implementation class name, and the "id=" value is the interface class name.

<?xml version="1.0"?>

<beans xmlns="" xmlns:xsi=""
    default-destroy-method="destroy"
    default-init-method="afterPropertiesSet"
    xsi:schemaLocation="">

    <bean class="com.acme.orm.impl.CommunicationMediumLocalServiceImpl" id="com.acme.orm.api.CommunicationMediumLocalService" />
    <bean class="com.acme.orm.impl.MessageCenterLocalServiceImpl" id="com.acme.orm.api.MessageCenterLocalService" />
    <bean class="com.acme.orm.impl.NSMUserLocalServiceImpl" id="com.acme.orm.api.NSMUserLocalService" />
    <bean class="com.acme.orm.impl.OrderLocalServiceImpl" id="com.acme.orm.api.OrderLocalService" />
    <bean class="com.acme.orm.impl.OrderServiceImpl" id="com.acme.orm.api.OrderService" />
    <bean class="com.acme.orm.impl.RoutingAreaLocalServiceImpl" id="com.acme.orm.api.RoutingAreaLocalService" />
    <bean class="com.acme.orm.impl.WebContentArticleLocalServiceImpl" id="com.acme.orm.api.WebContentArticleLocalService" />
</beans>


Michael Chen 2018-07-31T18:07:00Z
Categories: CMS, ECM

How to disable XML Log Files in Liferay 7.x

Tue, 07/31/2018 - 11:52

Liferay 7.0 started to produce a pair of log files per day:

ls -l /opt/liferay/home7/logs/
total 3104
-rw-r--r--  1 michael  admin  285201 May 26 13:24 liferay.2016-05-26.log
-rw-r--r--  1 michael  admin  898027 May 26 13:24 liferay.2016-05-26.xml
-rw-r--r--  1 michael  admin  400811 Aug 19 13:08 liferay.2016-08-19.log
-rw-r--r--  1 michael  admin       0 Aug 19 12:26 liferay.2016-08-19.xml

To disable Liferay logging to XML files, create portal-log4j-ext.xml in the following path:

tomcat-8.0.32/webapps/ROOT/WEB-INF/classes/META-INF/portal-log4j-ext.xml

<?xml version="1.0"?>
<!DOCTYPE log4j:configuration SYSTEM "log4j.dtd">

<log4j:configuration xmlns:log4j="">

    <appender name="XML_FILE" class="org.apache.log4j.ConsoleAppender"/>

    <root>
        <priority value="INFO" />
        <!--appender-ref ref="CONSOLE" /-->
        <appender-ref ref="TEXT_FILE" />
        <!--appender-ref ref="XML_FILE" /-->
    </root>
</log4j:configuration>

This file overrides two elements of META-INF/portal-log4j.xml in ROOT/WEB-INF/lib/portal-impl.jar, which stops Liferay from copying log entries to catalina.out and the daily XML log file.

You can also create the extension XML file as a copy of the original in the JAR, as follows:

cd tomcat-8.0.32/webapps/ROOT/WEB-INF
unzip -p lib/portal-impl.jar META-INF/portal-log4j.xml > classes/META-INF/portal-log4j-ext.xml


Michael Chen 2018-07-31T16:52:00Z
Categories: CMS, ECM

Blade Project Version

Mon, 07/30/2018 - 13:08

TL;DR - Add "-v 7.0" to your Blade command line to create modules for Liferay CE 7.0 and Liferay DXP 7.0.

Hey, just a quick blog post here for something that developers may need to know...

I updated my Blade command line tool recently (since I use Intellij, I rely on the command line tool quite a bit for new module project creation), but I hadn't started a new module after updating.

That changed last Friday... Had a hot customer request for a customization, so I used Blade to create a new workspace and started creating my modules. The modules loaded into Intellij just fine, and I happily started cranking out the code.

When I was happy with development, I built my 3 modules (SB API and Service modules and one UI module) and dropped them into my Liferay 7.0 DXP deployment folder.

Normally, I see the happy messages that my modules have started. I mean, these modules were not rocket science and they had no weird transitive dependency issues, so they should have just started.

But none of them did, not even the API module and those never have an issue starting.

So I fire up the Gogo shell, issue my lb command, and there are my modules at the end of the list, all in the Installed state.

So I command Gogo to start my API module, but it won't start, it has unresolved references on portal-kernel 3.0 and other such references.

I realize right away that portal-kernel 3.0 is from Liferay 7.1, but I wasn't building anything for Liferay 7.1, I'm still building for 7.0.

All of my modules suffered from the same problem, all had references to 7.1 artifacts.

Not knowing any better, I went back into the build.gradle files for my 3 modules and updated all of the versions so they were back at the normal 7.0 versions, rebuilt and redeployed and all was good.

So I asked my friend Greg Amerson what the deal was, and he informed me that the new default project version for the Blade tool is now 7.1. To have Blade create a project for Liferay CE 7.0 or Liferay DXP 7.0, you have to add "-v 7.0" to the Blade command line arguments.

So, there you go.  Avoid my Blade update headaches and just remember to tack on the "-v 7.0" for all of your new 7.0 modules.

Update 08/2018

So I checked with the folks that know; unfortunately, there is no way to "default" the new blade to 7.0. There are plans on the roadmap, but I don't know when that will be available.

In the meantime, if you are using and developing for Liferay CE 7.0 or Liferay DXP 7.0, don't upgrade your blade until you move to 7.1 or they get the default setting in place.

If you did upgrade your blade, you can either remember to add the -v 7.0 args to your invoke of blade, or the easier option might be to just revert to an older version of Blade.

All you need to do is:

  1. jpm remove blade
  2. jpm install -f

That should get you back to a pre-7.1 version where the default will be 7.0.

David H Nebinger 2018-07-30T18:08:00Z
Categories: CMS, ECM

Designing animations for a multicultural product

Thu, 07/26/2018 - 05:32

The world is big and diverse. Right now there are around 7.6 billion people in the world, 4 billion of them with internet access. When you’re improving a design system for a product used worldwide, like Lexicon for Liferay, you should take this simple statement into account. Cultural, religious and regional concerns need to be considered. It may sound obvious, but the truth is that we sometimes forget about it.

I’m going to explain how we redesigned and tested empty state illustrations for Liferay products.


1. The past

Let’s put a bit of context around this story. As you may know, an empty state illustration is shown when there is no other info to display, and it captures the user’s attention. A friendly face looking around, smiling, or winking was the animation family for empty states in Lexicon 1, by Marcos Castro. The animations were created to give feedback in a memorable way and provide a pleasurable experience for the user, but the smiling face animation caused some cultural issues.

In some cultures a wink can be understood in a sexual way. Of course, we fixed that ASAP. For our next version, Lexicon 2, we wanted to avoid possible misunderstandings and make sure that our set of animations is universal, appropriate, understandable and polite for every culture, in different geographical locations all over the world.


2. The theme

How did we select a common theme? Firstly, the subject was very carefully chosen. It should suggest emptiness in a pleasurable and descriptive way. Think about it: the way of representing emptiness is not so obvious. In fact, we started crafting ideas around the desert concept, but in the end the selected topic was outer space. The reason behind this decision was that space is usually associated with curiosity, technology and challenges in a good way, unlike the desert, which is normally associated with death.


3. Visual identity

With the topic selected, it was time for visual design, where Emiliano Cicero turned concepts into illustrations.

Regarding the visual guidelines, it was necessary to make a clear, understandable design, with clean shapes that convey the idea through simple animations that catch the user’s attention without being too aggressive. We created 3 illustrations as part of the collection: a satellite surrounded by asteroids, the moon with comets, and a spaceship launch.

The idea was to tell a story that accompanies users on their journey through the product, providing consistency through different animations that depend on the interface status but are all related to a common topic: outer space.


4. The test

The next step was the testing stage, to validate the new concept and design. For the test, we chose different profiles of Liferay workers from all around the world, representing different regions, cultures, and religions. They were then asked to fill out a questionnaire to make sure that the illustrations couldn’t be interpreted as offensive, obscene, inappropriate, menacing, insulting or impolite for anyone in their culture or region. We asked participants to answer the questions with specific comments, ideas or feelings on why they thought the animation could be appropriate and understandable or not. Moreover, we included descriptions of possible future ideas for new illustrations related to outer space, such as an astronaut arriving at the moon, an astronaut planting a flag, a comet, a spaceship flying, or planets orbiting around a star. All these scenes together create the story.

As a summary of the test results, 100% of the answers agreed that the images weren’t offensive, obscene, inappropriate, impolite, menacing or insulting for them, their regions or their cultures. Nevertheless, 40% pointed out that the pictures didn’t necessarily convey an empty state because of the presence of so many elements. And the comments on other possible space-related scenes gave us a lot of information on what to do and what to avoid.


5. Design iteration

Considering all the information provided by participants, we iterated on the design again, refining and polishing details to arrive at the new set of empty state animations. We took into account the importance of having a universal, appropriate, understandable and polite design for every possible user of our product. As an example, we reduced the number of elements based on the feedback, and we changed the powerful bright colors to a grayscale, so as not to be so intense while still being eye-catching.


6. The result

As an outcome of all this study and design process we created three animated illustrations for our empty states.

A telescope looking into the sky, for the empty search/filter action when there are no results.

A satellite when there are no elements yet in a dataset.

A spaceship launch when the user has emptied the dataset for a good end, for example, when the notifications list has been cleared out.

Our set of empty state animations forms a collection that makes users feel comfortable and part of the story. This is just the beginning, and we expect to expand the family as our needs grow. We hope the experience of using Liferay products will be more pleasurable, memorable, understandable and universal.

What do you think?

Susana Vázquez 2018-07-26T10:32:00Z
Categories: CMS, ECM

Unleash the power of Gogo Shell

Sun, 07/22/2018 - 17:30

// The french version of this article can be found here: Libérez la puissance de Gogo Shell.

Gogo Shell opens huge opportunities but some obstacles limit the expression of its potential. Let's see in this post how to remove these barriers.

Few reminders: usual use of Gogo Shell

Apache Felix Gogo Shell is an interactive shell that allows you to interact with an OSGi container.

Gogo Shell can be accessed via telnet:

$ telnet localhost 11311
Trying
Connected to localhost.
Escape character is '^]'.
____________________________
Welcome to Apache Felix Gogo

g!

Warning! Since version 7.1 of Liferay, access to Gogo Shell via telnet is no longer enabled by default. To enable it, you must add the following line in the file:

Once the telnet connection is open, the Gogo Shell commands can be executed one after another without any restriction:

g! lb | grep search
  438|Active     |   10|Liferay Portal Search Elasticsearch (2.1.8) true
g!

Beyond the commands dedicated to the management of the OSGi container, Gogo Shell offers a fairly complete language, including standard control structures such as loop or conditional:

g! each [1 2 3] { echo -- $it -- }
-- 1 --
-- 2 --
-- 3 --
null
null
null

Additionally, Gogo Shell allows you to call any method of any service deployed within the OSGi container:

g! ((($.context getService ($.context getServiceReference com.liferay.portal.kernel.service.UserLocalService)) getUserByScreenName 20116 'test') getFullName)
Test Test

Non-interactive mode

When you want to use Gogo Shell for task automation, typically within a bash script, the telnet connection poses difficulties.

Good news, as indicated in the official Liferay documentation, the Blade development tool can be used to overcome this obstacle:

$ blade sh 'lb | grep search'
lb | grep search
  438|Active     |   10|Liferay Portal Search Elasticsearch (2.1.8) true

Unfortunately, using Blade in a production environment is often impossible. Indeed, at the time these lines are written, the Blade installation procedure is not suited to a production context. Moreover, ops teams generally do not appreciate the installation of a development-only tool on a production server.

Fortunately, there is another way to automate a telnet session. An Expect script, for example, does the job very well:

#!/usr/bin/expect -f
log_user 0
spawn telnet localhost 11311
expect "g!"
send -- "lb | grep search\n"
expect "g!"
send_user $expect_out(buffer)

The Expect interpreter is a battle-tested command (created almost 30 years ago) that can be installed through standard tools of Linux distributions.

For a distro supporting APT:

sudo apt-get install expect

For a distro supporting YUM:

sudo yum install expect

Beyond OSGi: Groovy to the rescue!

In terms of executing portal components, Gogo Shell is confined to the services exposed at the OSGi container level. However, the Liferay portal offers many interesting Java classes that are not exposed in OSGi.

But luckily nothing is ever completely impossible with Liferay!

Indeed, as explained by Piotr Swiniarski in a post published on the blog of Mimacom, it is possible to invoke the Liferay Groovy script execution service from Gogo Shell.

Thus, it is possible to invoke any class of the portal. For example, using this principle, you can change the log level of a particular logger through an Expect script like this:

#!/usr/bin/expect -f
log_user 0
spawn telnet localhost 11311
expect "g!"
set groovy_script " \
import static com.liferay.util.log4j.Log4JUtil.setLevel; \
setLevel('','DEBUG',true); \
";
send -- " \
($.context getService (( \
$.context getServiceReferences \
com.liferay.portal.kernel.scripting.ScriptingExecutor \
\"(scripting.language=groovy)\" \
) 0)) eval null null null \"$groovy_script\" \
\n"
expect "g!"
send_user $expect_out(buffer)

Thanks to this brilliant idea of Piotr's, the possibilities become endless.

It also allows you to use a language familiar to Java developers, which is much more convenient than Gogo Shell's own language.

Go to the next Level

At this point the possibilities are immense, but before your script is ready to go into production with confidence, there are still a lot of things to be done:

  • handle all error cases (telnet connection failure, exception in Groovy code, etc.)

  • manage the return status of the script

  • manage any buffer overflows from Expect interpreter

  • etc.

But don’t be afraid, you will not have to deal with all this yourself, because the repository slemarchand/liferay-gogo-scripts on GitHub contains more than fifteen complete scripts that you can use as-is, or as a source of inspiration for writing your own scripts.

You will find in this git repository, among others, the following scripts:

  • gogo-groovy-script directly executes a Groovy script from a file,

  • gogo-get-bundle-status prints the status of a particular OSGi bundle,

  • gogo-bundle-start and gogo-bundle-stop respectively start and stop an OSGi bundle,

  • gogo-set-log-level sets the log level of a particular logger,

  • gogo-reindex-all-search-indexes rebuilds all search indexes,

  • and gogo-clear-all-caches clears all server caches.

Something to share?

Have you ever encountered these obstacles with Gogo Shell? How did you overcome them? Have you encountered any other challenges?

Feel free to share your experiences in the comments of this post to make some contribution to the Liferay community!


Sébastien Le Marchand
Freelance Technical Consultant in Paris


Sébastien Le Marchand 2018-07-22T22:30:00Z
Categories: CMS, ECM

Libérez la puissance de Gogo Shell

Sun, 07/22/2018 - 16:04

// The english version of this article can be found here: Unleash the power of Gogo Shell.

Gogo Shell ouvre d’énorme possibilités mais certains obstacles limite la plein expression de son potentiel. Voyons au fil de ce billet comment lever ces barrières.

Quelques rappels : utilisation usuelle de Gogo Shell

Apache Felix Gogo Shell est un interpréteur de commandes en mode interactif qui permet d’interagir avec un conteneur OSGi.

Gogo Shell est accessible simplement via telnet :

$ telnet localhost 11311 Trying Connected to localhost. Escape character is '^]'. ____________________________ Welcome to Apache Felix Gogo g!

Attention ! À partir de la version 7.1 de Liferay, l’accès à Gogo Shell via telnet n’est plus activé par défaut. Pour l’activer, il faut ajouter la ligne suivante dans le fichier :

Une fois que la connexion telnet est ouverte, les commandes Gogo Shell peuvent être enchaînées à loisir :

g! lb | grep search 438|Active | 10|Liferay Portal Search Elasticsearch (2.1.8) true g!

Au delà des commandes propres à la gestion du conteneur OSGi, Gogo Shell propose un langage assez complet, doté notamment des structures de contrôle standards comme la boucle ou la conditionnelle :

g! each [1 2 3] { echo -- $it -- } -- 1 -- -- 2 -- -- 3 -- null null null

De plus Gogo Shell permet d’appeler n’importe quelle méthode de n’importe quel service déployé au sein du conteneur OSGi :

g! ((($.context getService ($.context getServiceReference com.liferay.portal.kernel.service.UserLocalService)) getUserByScreenName 20116 'test') getFullName) Test Test S’affranchir du mode interactif

Dès lors que l’on souhaite utiliser Gogo Shell à des fins d’automatisation des tâches, par exemple au sein d’un script bash, la connexion telnet pose des difficultés.

Bonne nouvelle, comme indiqué dans la documentation officielle Liferay, l’outil de développement Blade peut être utilisé pour surmonter cet obstacle :

$ blade sh 'lb | grep search' lb | grep search 438|Active | 10|Liferay Portal Search Elasticsearch (2.1.8) true

Seulement voilà, sur un environnement de production, l’utilisation de Blade est souvent impossible. En effet, au moment où ces lignes sont écrites, la procédure d’installation de Blade n’est pas adaptée à un contexte de production. De plus les équipes d’exploitation apprécient généralement peu l’installation d’un outil dédié au développement sur un serveur de production.

Heureusement, il existe d’autre moyen d’automatiser un session telnet. Un script Expect par exemple, remplit très bien ce rôle :

#!/usr/bin/expect -f log_user 0 spawn telnet localhost 11311 expect "g!" send -- "lb | grep search\n" expect "g!" send_user $expect_out(buffer)

Il suffit ensuite d'exécuter le script :

$ ./ lb | grep search 438|Active | 10|Liferay Portal Search Elasticsearch (2.1.8) true

L’interpréteur Expect est une commande éprouvée (elle existe depuis presque 30 ans) qui peut être installée via les outils d’installation standard des distributions Linux.

Pour une distribution supportant APT :

sudo apt-get install expect

Pour une distribution supportant YUM :

sudo yum install expect Au delà d’OSGi : Groovy à la rescousse !

En terme d’exécution de composants du portail, Gogo Shell se cantonne aux services exposés au niveau du conteneur OSGi. Hors, le portail Liferay propose de nombreuses classes Java très intéressantes qui ne sont pas exposées en OSGi.

Mais heureusement rien n’est jamais complètement impossible avec Liferay !

En effet, comme expliqué par Piotr Swiniarski dans un billet publié sur le blog de Mimacom, il est possible d’invoquer le service d'exécution de script Groovy de Liferay depuis Gogo Shell.

Ainsi, il est possible d’invoquer n’importe quelle classe du portail. Par exemple, en utilisant ce principe, on peut changer le niveau de log d’un logger particulier via un script Expect comme celui-ci :

#!/usr/bin/expect -f log_user 0 spawn telnet localhost 11311 expect "g!" set groovy_script " \ import static com.liferay.util.log4j.Log4JUtil.setLevel; \ setLevel('','DEBUG',true); \ "; send -- " \ ($.context getService (( \ $.context getServiceReferences \ com.liferay.portal.kernel.scripting.ScriptingExecutor \ \"(scripting.language=groovy)\" \ ) 0)) eval null null null \"$groovy_script\" \ \n" expect "g!" send_user $expect_out(buffer)

Grâce à cette brillante idée de Piotr, les possibilités deviennent infinies.

D’autre part cela permet d’utiliser un langage familier des développeurs Java, bien plus pratique à l’usage que le langage de Gogo Shell.

Passer au niveau supérieur

À ce stade les possibilités sont immenses, mais avant que votre script soit prêt à partir en production en toute sérénité, il y a encore pas mal de points à traiter :

  • gérer tous les cas d’erreur (échec de connexion telnet, exception dans le code Groovy, etc),

  • gérer le statut de retour du script,

  • gérer les éventuels dépassements de buffer de l'interpréteur Expect,

  • etc.

Mais n’ayez pas peur, vous n’aurez pas à gérer cela vous-même, car le dépôt slemarchand/liferay-gogo-scripts sur GitHub contient plus d’une quinzaine de scripts complets que vous pourrez utiliser directement, ou sur lesquels vous pourrez vous baser pour réaliser vos propres scripts.

Vous trouverez dans ce dépôt de sources, entre autres, les scripts suivants :

  • gogo-groovy-script, which lets you run a Groovy script directly from a file,

  • gogo-get-bundle-status, which returns the status of a given OSGi bundle,

  • gogo-bundle-start and gogo-bundle-stop, which respectively start and stop an OSGi bundle,

  • gogo-set-log-level, which changes the log level of a particular logger,

  • gogo-reindex-all-search-indexes, which rebuilds all search indexes,

  • and gogo-clear-all-caches, which flushes all the server's caches.

Something to share?

Have you run into these Gogo Shell issues before? How did you solve them on your side? Have you come across other related problems?

Feel free to share your experiences in the comments of this post, and add your own stone to the Liferay community edifice!


Sébastien Le Marchand
Independent Technical Consultant in Paris


Sébastien Le Marchand 2018-07-22T21:04:00Z
Categories: CMS, ECM

Liferay University now available

Fri, 07/20/2018 - 05:00

Good news. Now that Liferay Portal CE 7.1 and Liferay DXP 7.1 are available, your training is also just a fingertip away. No longer do you have to travel far, or juggle your calendar, because these trainings are ready and available whenever you are, at your own pace.

Liferay University is where you can find all of Liferay's trainings in one convenient place. And if you can't find what you're looking for today, it's worth coming back in a bit, because more courses and lessons are already in the pipeline - editing those videos and setting it all up takes quite some time.

About the content: The lessons that you find on University are available for free; just log in with your account, the one that you use here as well. The "full" courses have their price listed, but if you want a special deal for them, we've got you covered as well: just get a Liferay Passport and you'll have flat-rate access to all of the trainings offered.

The free lessons, as of today, consist of

  • Mobile Development
  • Building Engaging Websites
  • Digital Asset Management
  • OSGi Basics (featuring yours truly ;) )

...and there are more to come...

Paid courses that are available as of today run on a pre-release of the current 7.1 release, but will be updated soon to the GA release of Liferay DXP 7.1:

  • Liferay Fundamentals
  • Content Management
  • Backend Developer
  • Frontend Developer

...and there are also more to come...

Of course, if you prefer to have a live trainer in the room: the regular trainings are still available, and are updated to contain all of the courses that you find on Liferay University and Passport. And this way (with a trainer, on- or offline) you can book courses for all of the previous versions of Liferay as well.

And, of course, the fine documentation is still available and updated to contain information about the new version already.

(Photo: CC by 2.0 Hamza Butt)

Olaf Kock 2018-07-20T10:00:00Z
Categories: CMS, ECM

Updating PDFBox

Wed, 07/18/2018 - 19:50

In case you're not aware, Liferay uses Apache PDFBox to look at your PDFs. In order to index the content of a PDF, Liferay uses PDFBox to extract the content and uses that during indexing/searching.

However, Liferay tends to fall behind a bit on updating 3rd party libraries. When you favor stability, you tend to be rather conservative when considering updates like this.

Recently though, I was helping a client bulk-upload documents using the Resources Importer, and wouldn't you know it, almost every one hit one exception after another. Some were just errors about fonts, some were actual PDF errors, but some were from bugs in PDFBox.

I checked my DXP version and found that it was using Apache PDFBox 2.0.3, and this was released 2016-09-17. I saw that they released Apache PDFBox 2.0.11 on 2018-06-28, and I wondered if I would see any difference if I updated my PDFBox version.

Replacing Liferay's Jars

So the first question was "What do I download?" When you check the download page, you see that they have a full source zip, a number of java apps and a number of library jars for versions 1.x and 2.x.

I knew I wanted better than 2.0.3, so I was happy to stay in the 2.x downloads. Since I wasn't using the command line apps, I skipped those and went straight to the "Libraries of each subproject" section.

I ended up grabbing the PDFBox, FontBox, Preflight, XMPBox and PDFBoxTools jars.  I skipped the PDFBoxDebugger because I don't plan on debugging to that level of detail.

After stopping my Liferay appserver, I copied these jars to the webapps/ROOT/WEB-INF/lib directory. This was easy for Tomcat, but you may need to follow a slightly different path depending upon your app server choice. Worst case scenario, you could actually build an EXT plugin to deploy your libs, but I'd avoid this if at all possible.

I didn't bother with renaming the jars, so I had to get rid of the old 2.0.3 versions. Interestingly, Liferay doesn't include all of these jars, only PDFBox and FontBox, so after deleting those old jars I was ready to bring the environment up.


After starting up the environment and trying my PDF loads, I found that many of the errors I had seen before were gone. I still had some, but at this point I think they are bad PDF files (they're marked as generated by some robo-pdf tool). I don't know if Preflight or XMPBox have anything to do with some of the errors disappearing, but I don't believe they hurt anything and don't know if they are actual dependencies for the newer versions. So I'm just going to keep them.

Anyway, since my errors were all gone, I'm declaring this an unqualified success.


Well, it goes without saying that you may need to repeat part of this when you apply a new fixpack or service pack to DXP. If Liferay updates from 2.0.3 to 2.0.4, the patching tool will have no problem adding its jars, but then you'll have the duplicates again. And if you had stripped the version number from the jar names, the patching tool would overwrite your newer 2.0.11 jars with the older ones, probably not something you'd want.

So keep an eye on your ROOT/WEB-INF/lib jars when you apply a fixpack or service pack.

Also, I think it is okay to update as new 2.0 versions get released. But if you find that there's a shiny new PDFBox 2.1 or 3.0 up there, I would resist the temptation to just blindly push those in as that kind of version bump usually points to an API change that may not be compatible with how Liferay uses PDFBox.
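That rule of thumb is easy to encode; the helper below is my own sketch, not anything Liferay or PDFBox ship:

```python
# Sketch: treat a jar update as a safe drop-in only when the major and
# minor version components are unchanged, i.e. a patch-level bump.
def is_drop_in_update(current: str, candidate: str) -> bool:
    cur_major, cur_minor, *_ = current.split(".")
    new_major, new_minor, *_ = candidate.split(".")
    return (cur_major, cur_minor) == (new_major, new_minor)

print(is_drop_in_update("2.0.3", "2.0.11"))  # patch bump: drop-in
print(is_drop_in_update("2.0.3", "2.1.0"))   # minor bump: review first
```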

David H Nebinger 2018-07-19T00:50:00Z
Categories: CMS, ECM

Liferay 7.1 Community Beta Program Ends

Wed, 07/18/2018 - 10:37

As they say: All good things must come to an end. With the official release of Liferay Portal 7.1 CE GA1 we are officially ending the community beta program. We had over 250 sign-ups with 48 active participants and over 500 posts, making this one of the most successful community programs to date. Thank you to all who participated in the program! Active participants who posted feedback in the Feedback Forum will receive a complimentary t-shirt.

How to report bugs

If you believe you have encountered a bug in the new release, you can report your issue in the Liferay issue tracker, selecting "7.1.0 CE GA1" as the value for the "Affects Version/s" field.

Jamie Sammons 2018-07-18T15:37:00Z
Categories: CMS, ECM

Liferay IntelliJ Plugin 1.0.0 Released

Mon, 07/16/2018 - 21:46

The latest release of the Liferay IntelliJ plugin, 1.0.0, was made available today. Head over to this page to download it.


The most notable features of this latest 1.0.0 release are:


  • Support for watch in module development

  • Support for Liferay 7.1 development

  • Improved Editor Support

    • XML file references

    • OSGi code completion

      • portlet-module-hints.xml

      • custom-sql/default.xml  

      • OSGi component properties

  • Support custom portal paths

  • Support for uninstalling bundles during Liferay watch


Using Editors






Custom Liferay Server


Developer mode is enabled by default for Liferay servers. In addition to initializing a server from a Liferay workspace, you can also browse to your own server.



Liferay Watch Support


Liferay watch allows users to see the changes they make without requiring a full redeploy. Right-click a single module, or a folder containing Liferay module projects, to enable Liferay watch. Once you stop the Liferay watch task, the module will be removed automatically.




Liferay watch requires the com.liferay.gradle.plugins.workspace plugin, version 1.9.2 or greater.

The watch task will detect the changes you made once you press CTRL+S.


Special Thanks

Thanks so much to Dominik Marks and danielebb for the improvements.

Yanan Yuan 2018-07-17T02:46:00Z
Categories: CMS, ECM

Liferay joins Open Source leaders in protecting developers

Mon, 07/16/2018 - 04:04

Liferay, Inc. was established on the strong foundation of an Open Source project. The company has since grown to offer many different projects and products, yet we continue to believe in and support Open Source.


To further emphasise our support, today we are publicly committing to offering the much friendlier GPLv3 cure approach also to licensees of our GPLv2-, LGPLv2.1- and LGPLv2-licensed code.

In practical terms, this means that if you accidentally violate the license of Liferay Portal for example, we would give you a grace period in which you could fix your mistake.


With this we are joining a movement of both volunteers and companies who want to make sure that developing new, and evolving existing, Open Source software in a collaborative fashion is not stifled by draconian license termination clauses.


At Liferay we want to encourage innovation and collaboration, and as such it has never been our intention to (mis)use our projects’ Open Source licenses against honest users and developers. We are extremely happy to see the IT industry come together and make this easier with such an elegant legal hack, which we are now a proud user of.

The GPL Cooperation Commitment also allows individuals to show their support, so we would like to encourage all engineers and open source advocates to make the commitment. Please visit the GPL Cooperation Commitment project on GitHub to learn more, and perhaps join us in this movement as well.  

Matija Šuklje 2018-07-16T09:04:00Z
Categories: CMS, ECM


Creating custom DDM field types in Liferay 7.x

Fri, 07/13/2018 - 01:19
DDM... what's DDM?

DDM stands for Dynamic Data Mapping, a set of modules that define various types of fields that can be used to assemble:

  • Web content structures for creating web content articles.
  • Data definitions for creating dynamic data list records.
  • Metadata sets for creating documents based on document types.

Liferay bundles an ever-expanding set of DDM field types out of the box. At the time of writing (Liferay 7.1 has just been released), the following field types are available:

Is it ever enough?

For basic content management needs, the default set of DDM field types is usually sufficient. However, there certainly are use cases which would require you to create custom field types:

  • You want to dynamically populate the options of a dropdown field from the response of a REST endpoint.
  • You want to provide a field that can point to a user, a group or any other entity in Liferay.
  • You want to have a text field with some very specific validation, e.g. email address, URL or phone number validation.

In this article, we will examine how to create such custom fields. As an example, we'll pick the first use case and try to create a field type which takes the response of a REST endpoint as the possible values of a dropdown list.

A new field type was born: REST Select

The default Select field in Liferay allows content authors to pick a value from a predefined list of possible values for an article. This is great for simple purposes, but what if you want to dynamically populate this dropdown list with values from an external platform? Or if the values that should be listed depend on the user that requests them?

Meet the brand new REST Select field. This dropdown field allows you to configure a REST endpoint and how to map the JSON fields in the response onto the labels and values of the possible options.

Install the module

Perform the following actions to install the module in your Liferay installation:

  1. Clone the repository and check out the 7.x/field/ddm-rest-select branch.
  2. Build the project using Maven by executing mvn clean package.
  3. Move the resulting JAR (you can find it in the target folder) into the deploy folder of Liferay.
Define the field in a structure
  1. Create a new web content structure. 
  2. Drag the "REST Select" field type into the structure definition.
  3. Configure the field's endpoint and option mappings by selecting the field.

Apart from the default attributes like "Field Label" and "Required", notice that there are three additional attributes to be configured. The first attribute should be filled with the URL of a REST endpoint that returns a list of JSON objects. E.g. the endpoint http://localhost:8080/api/jsonws/country/get-countries (a built-in endpoint in Liferay) returns the following output:

With the two remaining attributes, you configure what ends up in, respectively, the label and the value of the HTML <option> elements. E.g. in this case, if you pick the "nameCurrentValue" JSON attribute for the option label and the "a3" JSON attribute for the option value, this will produce the following HTML:
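To illustrate the mapping in plain code: the records below are shaped like (and abbreviated from) the get-countries response, and the helper is a sketch of what the field does, not Liferay's actual implementation:

```python
# Sketch: turn a REST JSON response into <option> elements, given which
# JSON attribute supplies the option label and which supplies the value.
# The sample records are abbreviated, shaped like get-countries output.
countries = [
    {"a3": "BEL", "nameCurrentValue": "Belgium"},
    {"a3": "FRA", "nameCurrentValue": "France"},
]

def to_options(records, label_attr, value_attr):
    return "\n".join(
        f'<option value="{r[value_attr]}">{r[label_attr]}</option>'
        for r in records
    )

print(to_options(countries, "nameCurrentValue", "a3"))
```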

Create a new article using the structure

When creating a new web content article based on the structure you've just created, you're able to select a country from the dynamically loaded dropdown list!

Cool, now I want to create my own DDM field type! So... how?

In this part of the post, we'll explore how Liferay was extended to support the new REST Select field. Hopefully, this gives you enough insights to start working on your own custom DDM field type!


It turned out to be quite a journey to create new DDM fields in Liferay, even in 7.x. You would expect every field type to be defined in its own OSGi module, but the woeful truth is that the definition and rendering of these fields are scattered across several Liferay modules, in both client-side and server-side code. So the result is rather hacky, to say the least.

Please only consider this solution if there is absolutely no other way!

With that being said, do read on!

The code

The starting point is this repository, which provides a skeleton project that can be extended to support new DDM field types. Each part that needs to be extended or implemented is marked with a //TODO comment. You can diff its main branch against the 7.x/field/ddm-rest-select branch to see the changes needed to provide your own implementation.

  • Make sure that ALL TODOs are implemented. Forget even one of them and it will not work, and Liferay will give you no clue why.
  • Make sure to start Liferay in developer mode via setenv.sh (Unix) or setenv.bat (Windows).
  • If your changes don't seem to be reflected, uninstall the module from the Gogo shell and then reinstall it to clear any remaining state.

While it was a real challenge to create a custom DDM field, it IS possible, and if you inject enough flexibility into your field, it can serve multiple purposes. Please send me your feedback on how you would do things differently, or on which field types you are currently missing in Liferay 7.x.

Thanks for reading!

Peter Mesotten 2018-07-13T06:19:00Z
Categories: CMS, ECM