

Extending Liferay DXP - User Registration (Part 1)

Liferay - Sun, 08/05/2018 - 20:19

I am starting a short series of blogs in which I will show how to customise the User Registration / Customer Acquisition process in Liferay.

The inspiration for this blog is one of our DXP portal implementations which had very broad (and frequently changing) user data collection requirements.

There are many scenarios in which you may want to customise the user registration and account creation process:

  • creating a custom user registration form allows delivering a more consistent user experience;
  • an approver can grant or reject the registration based on the information the user provided;
  • apart from the default fields, you can add any user account fields;
  • you can collect any other information related to your business and store it in your custom tables;
  • different registration flows might be required for different groups of users or sites;
  • a wizard-like registration form can deliver more information and help users make the right choices;
  • users can apply for different roles or permissions at registration;
  • and it could be a place where you might want to display your marketing material.

For now I plan to write three articles:

1. Custom registration process for a portal with multiple sites. 
The main features:

  • Check if a user already has an account
  • Show and allow the user to select which site to join
  • Collect additional information from the user and send email notification to interested parties (approver for restricted site) on form submission
  • Automatically create the user account on form submission if the membership type of the selected site is open; otherwise create the user account but leave it inactive (to be activated by an approver)

2. Getting more data out of registration form 
The main features:

  •  Add fields to registration form that will be stored in User Account details
  • Collect information to populate user Custom fields
  • Add fields to store in custom tables
  • Allow users to download and print the entries they made to the form for their records

3. Taking payments at registration 
The main features:

  • If payment is required as part of a request to join a site, display the amount to pay and collect the payment
  • Allow users to download a receipt for their records


The registration form will be based on SmartForms. Of course, you can build your own form/UI, but here are some reasons why you might consider SmartForms:

  • Flexibility - non-technical staff can edit the form and change wording to adjust or add explanations/instructions/conditions on the form.
  • Styling made easy - it is fully separated from the software code and can be done by a web designer.
  • Simple integration with your other systems via web services to bring data into the form.
  • Easy extraction of field values from the submitted form into Liferay data stores or your other systems.
  • Most important of all - cutting the cost of portlet development, as form creation and further adjustments/changes can be done without software modifications.

You can download SmartForms from the Liferay Marketplace or from the SmartForms downloads page; additional information can be found here.


Even if you are not going to use SmartForms, I hope these blogs will help you, as I will be providing samples of how to use Liferay services/APIs for your own user registration purposes.

Victor Zorin 2018-08-06T01:19:00Z
Categories: CMS, ECM

The new Web Experience functionalities released in 7.1

Liferay - Fri, 08/03/2018 - 05:52

In Liferay Portal 7.1 a lot of effort has been put into improving the experience of building sites and pages and, as a result, some new functionalities have been introduced to empower marketers and other business users in the creation of stunning websites.

We have achieved this by considering differentiated needs for the marketer, designer and web developer; and adding a new approach to content and page creation. We will continue working to expand the initiative in future releases, but some exciting features have already been introduced.

The key elements added to Web Experience under what we call Modern Site Building are Page Fragments, Content Pages, Display Pages and Multiple Navigation Menus. This is the first entry of a series of posts in which we will have the opportunity to go into more detail of each of those. But, as a starter, we will briefly introduce the big picture and how every component fits in it.


The context

Authors, marketers, designers, HR teams or whoever is in charge of creating a piece of content or site pages sometimes face repetitive work, but not always. Sometimes they are happy to use web content structures because content reusability sounds like pure magic. Other times they wish there were a faster way to create pages, because it is a one-off and what really matters is minimizing the number of clicks while making sure it looks awesome. Put simply, the needs are not always the same.

Liferay has always used a very content-centric and decoupled approach: the creation of content in the form of different assets and the display of that content have always been two separate things. Thus, the long-established approach to page creation in Liferay is to combine content and applications. These pages are flexible, powerful and offer advanced functionality, but they also add complexity to some use cases.


The authoring flow considered

For the new functionalities introduced we have considered a design and authoring flow like the one described below.


The marketer or business user needs to create an effective landing page that communicates the desired message and increases customer engagement. The marketer does not have front-end knowledge but would like an interface to create site pages in a few clicks, with editing options, while making sure the result looks great.

In order to achieve this, prior work by the design and web development team is necessary. First of all, the designer designs several reusable page parts called Page Fragments (lists of contents, headers, title sections, footers…).

The web developer then implements these designs, coding the appearance and behavior of the Page Fragments, and loads them into the Page Fragment collection library. Thereafter, the designer can create Page Templates by adding and combining Page Fragments, making a Page Template collection available for the marketer to directly start working on.

Using this approach, marketers can focus on writing content in Content Pages, using Page Templates created by designers, which in turn consist of Page Fragments previously developed by web developers.


Key new elements

With this authoring flow considered, we have introduced new functionalities attending to the different roles and responsibilities in the creation process:

  • Page Fragments, which are collections of “design blocks” created by web developers using HTML, CSS and JavaScript, made available to non-technical users to build pages by composing them.

  • Content Pages, a new way of creating pages in a few clicks. The main benefit of Content Pages is the easy editing of their content directly on the page.

  • Display Pages, which allow controlling how a particular piece of content is displayed as a full page.

  • Multiple Navigation Menus, which extend navigation by allowing several menus: in the header, in the footer, in a sidebar...


Themes with Page Fragments available from the Marketplace

To ease the blank-page syndrome, you can start from some already-defined examples. If you want to leverage existing Page Fragment collections to start exploring the possibilities of the new features right away, you can already download the Fjord theme for Liferay Portal 7.1 from the Liferay Marketplace. It contains a series of Page Fragments and Page Templates that can be used as a starter kit. More themes will follow soon!



I want to know more

Follow the coming series of posts to learn about Page Fragments, Content Pages, Display Pages and Multiple Navigation Menus.

Also, if you want to learn more about how to empower your team to build modern sites, you can sign up for free for the “Building Engaging Websites” lesson available in Liferay University, or access the Liferay Documentation.


Ianire Cobeaga 2018-08-03T10:52:00Z
Categories: CMS, ECM

New Project SDK Installers 3.2.0 GA1 Released

Liferay - Thu, 08/02/2018 - 21:45

We are pleased to announce the first generally available release of the Liferay Project SDK Installers that support Liferay 7.1.

New Installers:

Customers can download all of them from the customer studio download page.

The installer is the full-fledged Liferay Developer Studio installer, which installs Liferay Workspace, Blade, and Developer Studio, and comes pre-bundled with the latest Liferay DXP server. It also supports configuring a proxy for downloading Gradle dependencies.

Upgrade from previous 3.1.x:
  1. Download the update site here

  2. Go to Help > Install New Software… > Add…

  3. Select Archive..., then browse to the downloaded update site

  4. Click OK to close the Add Repository dialog

  5. Select all features to upgrade, click Next, click Next again, and accept the license agreements

  6. Finish and restart to complete the upgrade

Release highlights:

Installers Improvements:

1. Better support for HTTP, HTTPS and SOCKS5 proxies

2. Bundle the latest Liferay Portal

   - bundle 7.1.0 GA1 in LiferayProjectSDKwithDevStudioCommunityEdition installers

   - bundle DXP 7.1.10 GA1 in LiferayProjectSDKwithDevStudioDXP installers

Improvements for Deployment:

1. Support Liferay Watch Task

2. Support for Target Platform

3. Improve wizard to download dependencies in background

4. Better deployment support for Liferay DXP/7

   - support DXP 7.1.10 GA1 Tomcat and Wildfly

   - support Liferay 7.1 CE GA1 Tomcat and Wildfly

   - integration of Blade CLI 3.1.1

   - support Plugins SDK 1.0.19

   - support Liferay Workspace Gradle 1.10.2

5. Third party plugins update

   - update m2e to 1.8.3

   - update the Gradle plugin Buildship to the latest 3.0.0

6. Miscellaneous bug fixes


If you run into any issues or have any suggestions, please come find us on our community forums or report them on JIRA (IDE project); we are always around to try to help you out. Good luck!

Yanan Yuan 2018-08-03T02:45:00Z
Categories: CMS, ECM

Oh no, my URLs disappeared…(and how to get them back)

Liferay - Thu, 08/02/2018 - 02:30

Recently we got a couple of complaints about the new Web Content article behaviour, specifically about the JournalArticle.getContent() method’s return value. The main problem developers experience: when they embed an image into a Web Content article, or use a ddm-image or ddm-document-library field in their structures, they expect to see the URL of the object (image or D&M asset) in the raw XML returned by JournalArticle.getContent(). It actually was there in 7.0, and the raw XML looked like this:

(...)
<dynamic-element name="Image8r1v" type="image" index-type="text" instance-id="ryns">
	<dynamic-content language-id="en_US" alt="" name="blonde.png" title="blonde.png" type="journal" fileEntryId="34506" id="34835">
		/image/journal/article?img_id=34835&amp;t=1531817578959
	</dynamic-content>
</dynamic-element>
(...)


There are two main differences in 7.1:

  • We switched from the internal JournalArticleImage table to the common Documents and Media repository as the storage for Web Content article images
  • DDM fields for Image and D&M assets changed their internal representation from a URL to a JSON object

Now the raw XML of an article with Image or ddm-image (ddm-document-library) fields looks like this:

(...)
<dynamic-element name="Image54q7" type="image" index-type="text" instance-id="wscg">
	<dynamic-content language-id="en_US">
		<![CDATA[{"groupId":"20124","name":"allatonce.png","alt":"","title":"allatonce.png","type":"journal","uuid":"80269faa-dea9-fd5a-cb78-3c7aa9da51ea","fileEntryId":"36774","resourcePrimKey":"36772"}]]>
	</dynamic-content>
</dynamic-element>
(...)


It was an internal decision, and we didn’t realize that there could be developers out there who actually use the raw XML content for their own needs…

First I would like to explain why it was done, not to excuse this case, but to prevent such cases in the future. On the one hand, JournalArticle.getContent() is a public API and its behaviour must be, at the very least, backward compatible; on the other hand, its behaviour depends on many components behind it, the signature of the method didn’t change, and the implementation details (including the raw XML format of the content) were never published. To avoid such problems we strongly recommend that developers use the published means of Web Content processing, such as JournalContent and JournalArticleDisplay. Both of them provide the processed content of an article without the need to work with the raw XML. A clear example can be found in the Web Content Display portlet:

(...)
JournalArticleDisplay articleDisplay = _journalContent.getDisplay(
	article, ddmTemplateKey, viewMode, languageId, page,
	new PortletRequestModel(renderRequest, renderResponse), themeDisplay);

String processedContent = articleDisplay.getContent();
(...)
@Reference
private JournalContent _journalContent;
(...)


There is also a taglib which renders a specific journal article using its JournalArticleDisplay instance:

<liferay-journal:journal-article-display articleDisplay="<%= articleDisplay %>" />


Alternatively, the developer can use the JournalContent.getContent() method directly; the result must be the same - processed content where all the fields behave as expected.

Now let’s talk about how to get the URLs back, because I understand that it could be a problem to refactor hundreds of lines of your code, and for developers who use the raw XML the best way would be to keep processing the URLs as they did before.

Here I have to mention one detail: there is no way to return to the old format of the URLs for embedded images, so if you have some sort of regular expression catching “/image/journal/article...” - there is no way to make it work again.

There are two options to get the URLs back; both require you to adapt your existing code, which works with the raw XML, a little bit.

First option is applicable when you have a concrete file entry ID:

(...)
// here fieldValue is the raw XML field value for your Image/D&M field
JSONObject jsonObject = JSONFactoryUtil.createJSONObject(fieldValue);

long fileEntryId = jsonObject.getLong("fileEntryId");

FileEntry fileEntry = PortletFileRepositoryUtil.getPortletFileEntry(fileEntryId);

String fileEntryURL = PortletFileRepositoryUtil.getDownloadPortletFileEntryURL(
	themeDisplay, fileEntry, StringPool.BLANK);
(...)

And the second option is applicable when you don’t have a specific file entry ID, but have the UUID and group ID of the target entry:

(...)
// here fieldValue is the raw XML field value for your Image/D&M field
JSONObject jsonObject = JSONFactoryUtil.createJSONObject(fieldValue);

long fileEntryGroupId = jsonObject.getLong("groupId");
String fileEntryUuid = jsonObject.getString("uuid");

FileEntry fileEntry = PortletFileRepositoryUtil.getPortletFileEntry(
	fileEntryUuid, fileEntryGroupId);

String fileEntryURL = PortletFileRepositoryUtil.getDownloadPortletFileEntryURL(
	themeDisplay, fileEntry, StringPool.BLANK);
(...)


I hope these few lines of code help you solve the problem. We understand that it can be frustrating to deal with such changes, and we are trying to do our best to avoid them when there is no actual need.

Pavel Savinov 2018-08-02T07:30:00Z
Categories: CMS, ECM

Monitor c3p0 in Liferay

Liferay - Tue, 07/31/2018 - 20:05
The Problem

The c3p0 connection pool exposes MBeans for JMX tools like jconsole to monitor its run-time state. However, both c3p0 and Liferay intentionally generate random MBean names on each startup. The random name of each data source makes it difficult to set up an independent monitoring tool.

The second problem is that, by default, Liferay creates several data sources: one used by most Liferay features, one for the counter service, and one for analytics. They all use the same portal property prefix "jdbc.default.", making them identical in every way except for their random bean names.

The Solution

A default c3p0 bean name looks something like this (both tokens are random):

com.mchange.v2.c3p0:identityToken=<random token>,name=<random token>,type=PooledDataSource

The default behavior sets the value of both identityToken and name to the same random string. According to the c3p0 documentation, identityToken can be removed with a system property. The second part of the solution is to explicitly name the data sources created by Liferay.

Remove identityToken

Add the following line to TOMCAT_HOME/bin/

JAVA_OPTS="$JAVA_OPTS"

Explicitly Name Data Sources

Add the following lines to LIFERAY_HOME/

jdbc.default.dataSourceName=liferayDataSource

counter.jdbc.prefix=jdbc.counter.
jdbc.counter.dataSourceName=counterDataSource

These lines name the default data source liferayDataSource and the counter service data source counterDataSource (the analytics data source is named analyticsDataSource in the same way). These names are all arbitrary, so you can choose other values.

The second line gives the counter service data source a new property prefix "jdbc.counter.". Use this prefix to set counterDataSource properties that you wish to differ from the liferayDataSource. Conversely, any property not set with this prefix falls back to the value set with the "jdbc.default." prefix.
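This prefix fallback can be illustrated with a small self-contained sketch (a hypothetical helper for illustration, not Liferay code):

```java
import java.util.Map;

public class PrefixFallback {

	// Looks the key up under the specific prefix first, then falls back
	// to "jdbc.default." -- mirroring how counterDataSource properties resolve.
	static String resolve(Map<String, String> props, String prefix, String key) {
		String value = props.get(prefix + key);

		return (value != null) ? value : props.get("jdbc.default." + key);
	}

	public static void main(String[] args) {
		Map<String, String> props = Map.of(
			"jdbc.default.maxPoolSize", "75",
			"jdbc.counter.maxPoolSize", "25",
			"jdbc.default.username", "liferay");

		// Overridden under the counter prefix:
		System.out.println(resolve(props, "jdbc.counter.", "maxPoolSize"));

		// Not overridden -- falls back to "jdbc.default.":
		System.out.println(resolve(props, "jdbc.counter.", "username"));
	}

}
```

So only the properties you actually override under "jdbc.counter." differ; everything else stays shared with the default data source.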

Now in JMX, c3p0 beans look like this:

  • com.mchange.v2.c3p0:name=analyticsDataSource,type=PooledDataSource
  • com.mchange.v2.c3p0:name=counterDataSource,type=PooledDataSource
  • com.mchange.v2.c3p0:name=liferayDataSource,type=PooledDataSource

Name Other Data Sources

If your Liferay is configured with other data source(s), you must also give each of them a unique name. Otherwise, multiple MBeans will have the same name as liferayDataSource, and only one of them will be detected by JMX tools.

For example, if you have a third data source defined with prefix "", add this line to LIFERAY_HOME/
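As a hedged example (the actual prefix name was lost above): assuming the third data source is configured under a hypothetical prefix "jdbc.sap.", the naming line would follow the same pattern as the earlier ones:

```properties
# Hypothetical example: names the data source configured under the "jdbc.sap." prefix
jdbc.sap.dataSourceName=sapDataSource
```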

Then you should see these data sources:

  • com.mchange.v2.c3p0:name=analyticsDataSource,type=PooledDataSource

  • com.mchange.v2.c3p0:name=counterDataSource,type=PooledDataSource

  • com.mchange.v2.c3p0:name=liferayDataSource,type=PooledDataSource

  • com.mchange.v2.c3p0:name=sapDataSource,type=PooledDataSource

What to Watch

In a JMX monitoring tool, for each data source, monitor these three attributes of its MBean:

  • numConnectionsAllUsers

  • numBusyConnectionsAllUsers

  • numIdleConnectionsAllUsers

Configure an alert for when numBusyConnectionsAllUsers exceeds a certain percentage of maxPoolSize; such an alert forecasts a possible exhaustion of database connections for that data source.
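As a minimal sketch of reading these attributes programmatically (using the in-process platform MBeanServer for simplicity; a real monitor would connect to the Liferay JVM through a JMXConnector), the helper below returns null when the named pool MBean is not registered:

```java
import java.lang.management.ManagementFactory;

import javax.management.MBeanServerConnection;
import javax.management.ObjectName;

public class C3p0Monitor {

	// Reads one pool gauge for a named data source; returns null when the
	// MBean is not registered (e.g. when run outside a Liferay JVM).
	static Integer busyConnections(MBeanServerConnection mbs, String dataSourceName)
			throws Exception {

		ObjectName name = new ObjectName(
			"com.mchange.v2.c3p0:name=" + dataSourceName + ",type=PooledDataSource");

		if (!mbs.isRegistered(name)) {
			return null;
		}

		return (Integer)mbs.getAttribute(name, "numBusyConnectionsAllUsers");
	}

	public static void main(String[] args) throws Exception {
		MBeanServerConnection mbs = ManagementFactory.getPlatformMBeanServer();

		System.out.println(busyConnections(mbs, "liferayDataSource"));
	}

}
```

The same call with "numConnectionsAllUsers" and "numIdleConnectionsAllUsers" covers the other two attributes.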


Michael Chen 2018-08-01T01:05:00Z
Categories: CMS, ECM

Dynamic Widget

Liferay - Tue, 07/31/2018 - 15:20

My colleague Vagif proposed a new way to develop a Liferay dynamic widget (the best name I can think of for now) with these pieces:

  1. Use the “Basic Web Content” structure or define a new structure.
  2. For each dynamic widget type, define a new display template.
  3. Create an OSGi service that returns a data model to be used by the display template.
  4. Create a web content article with the above structure/template pair.
  5. Place this web content article onto the page with Web Content Display (OOTB).

All business logic is in the OSGi service. The structure can serve as a preference or parameter holder for the display template. The display template draws most meaningful data from the model returned by the service.

You can build a portlet-less Liferay site with this "Crafter-ish" approach.


A typical Liferay feature has program code spread across many technologies: service Java code, portlet Java code, template code, template Java code (JSP), language files, property files, XML files, etc. You often find that a feature morphs into some JSP files with 100+ lines of Java code, a 2000-line portlet, plus several service builders, all competing for business logic implementations and MVC roles.

A Simpler Pattern

Dynamic widget can serve as a Liferay implementation pattern that simplifies development and promotes good programming practices (I think this is the better pattern, but some may disagree):

  • Write Java code in and only in service modules (no portlet).
  • Implement all business logic in services.
  • The display template calls a single service to retrieve a data model.
  • The display template then renders the model with almost no business logic.

A Simple Rule:

  • Let the services build the most foolproof model so the template can render it in the simplest possible way.

There is nothing stopping a template from calling multiple services and then juggling multiple models to render the view. However, do yourself a favor: write a new service method that combines all of them into a single model for the template. You will thank yourself later.
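As a sketch of that rule, with hypothetical model and service names (not Liferay APIs): one service method merges several sources into a single template-ready model, so the template only has to print fields:

```java
public class ProjectDashboardService {

	// Hypothetical sub-models; in a real service these would come from
	// separate service calls.
	record Project(String name, int openIssues) {}
	record Team(String name, int members) {}

	// The single combined model the display template renders.
	record DashboardModel(Project project, Team team, String headline) {}

	// One service method that merges several sources into one
	// foolproof model, keeping all logic out of the template.
	static DashboardModel getDashboardModel() {
		Project project = new Project("smartforms", 12); // e.g. from a project service
		Team team = new Team("platform", 5);             // e.g. from a team service

		String headline = project.name() + ": " + project.openIssues() +
			" open issues, " + team.members() + " members";

		return new DashboardModel(project, team, headline);
	}

	public static void main(String[] args) {
		System.out.println(getDashboardModel().headline());
	}

}
```

The Freemarker side then reduces to printing ${model.headline} and similar fields, with no juggling of multiple service results.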

Why Freemarker?

Not being able to use JSP may be a downside of dynamic widget, or is it?

Compared to JSP, template languages like Freemarker cannot mix Java code with UI code, so they promote a cleaner separation of the view from the model and controller. Some may argue that template code getting the model from services is controller-ish. Agreed, but beyond that, keeping Freemarker code simple is not hard at all, because complex Freemarker code is ugly and painful to write, not to mention tough to debug, log, handle errors in, etc.

Pseudo Web Content

Dynamic widget is Liferay web content, but its "content" is not in its web content structure; the content comes from the services. If you think of a Web Content Display rendering a dynamic widget on a page as calling a function in a program, then its web content structure is like the parameters for that function call.

The widget's web content may be empty, which is similar to calling a function with no parameters: the service that builds the model for the template has everything it needs. In other instances, the structure can have variables used as parameters for calling the service, or used directly by the template just like any other web content.

Search and Preview

Note that variables in the web content structure can be searchable, making them available to Liferay search and the Asset Publisher. For example, a dynamic widget may be titled "Annual Sales Figures 2018" and render a bar chart from a model provided by a service module. This widget may then be a search hit, along with its pretty chart if you wish. You can't have that if the chart is implemented in a portlet.

Don't forget, you can easily mark the web content not searchable too:

Another convenience of dynamic widget over portlet is preview. Open Liferay Control Panel > (a site) > Web Content. You can preview a widget independent of a page from its Options menu:

Managing Display Templates

For the rest of this topic, let's call the template associated with the web content structure the parent template. The parent template can render the entire widget all by itself, but it is much more likely that the parent template will use other templates via the '<#include />' Freemarker directive.

Here we discuss three ways to develop templates for dynamic widgets.

Liferay Generic Templates

Open Liferay Control Panel > (a site) > Web Content > (corner Options menu) > Templates. Note the first column of the table view, called "ID". The value of that column is the template ID, also known as the template key.

From this Templates view, you can create a template and leave the "Structure" field empty. That creates a generic template. Then in a parent template, you can use a generic template like this:

<#include "${templatesPath}/TEMPLATE_KEY"/>

With this framework, you can implement everything inside the Liferay user interface. However, exporting and then importing generic templates via a LAR file will invalidate all template keys, and all parent templates must then be manually updated with the new keys.

Package Templates In A Module

In the attached archive, "modules/my-templates" demonstrates packaging Freemarker templates in an OSGi module. The key for parent templates to use templates in this module is the "Web-ContextPath" header:

modules/my-templates/bnd.bnd

Bundle-Name: My Templates Bundle
Bundle-SymbolicName: my.liferay.templates
Bundle-Version: 1.0.0
Web-ContextPath: /my-templates

In a parent template, use the Liferay infix _SERVLET_CONTEXT_ like this:

<#include "my-templates_SERVLET_CONTEXT_/widget/software-project.ftl" />

where the part before the infix is the "Web-ContextPath" value, and the part after it is the template path in the module project under "src/main/resources".

The Liferay Freemarker engine allows a template to include sibling templates in the same module using paths relative to the current template. For example:

modules/my-templates/src/main/resources/widget/software-project.ftl

<h3>software-project.ftl</h3>
From parent path: <#include "../common/left-navigation.ftl"/>
From child path: <#include "more/right-navigation.ftl"/>

Now the fun part: how Freemarker templates find and call services. First of all, you need to remove Liferay's default restriction on template variables by creating this configuration file with the following content:

LIFERAY_HOME/osgi/configs/com.liferay.portal.template.freemarker.configuration.FreeMarkerEngineConfiguration.cfg

restrictedVariables=

Then a template can access services in several ways, for example using the "staticUtil" or "serviceLocator" variables:

modules/my-templates/src/main/resources/widget/software-project.ftl

Using restricted variables:
<#assign userLocalServiceUtil = staticUtil['com.liferay.portal.kernel.service.UserLocalServiceUtil'] />
${userLocalServiceUtil.getDefaultUserId(companyId)},
<#assign userLocalService = serviceLocator.findService('com.liferay.portal.kernel.service.UserLocalService') />
${userLocalService.getDefaultUser(companyId).originalEmailAddress}

Refer to the "Extras" section for a complete list of all Liferay variables available to Freemarker templates.

Package Templates In A Theme

The attached archive also includes an example theme in "wars/my-theme". Following the folder conventions of a theme, templates in the project are under the folder "src/main/webapp/templates". The Gradle build automatically generates a "Web-ContextPath" header with the value of the project folder. Therefore, a parent template can reference a template in this theme as:

<#include "my-theme_SERVLET_CONTEXT_/templates/widget/software-project.ftl" />

Other aspects of templates in a theme are identical to ones in a module.


You can deploy both the module JAR and the theme WAR by copying them to the Liferay auto-deploy folder. Here is what they look like when running:

$ telnet localhost 11311
Trying
Connected to localhost.
Escape character is '^]'.
____________________________
Welcome to Apache Felix Gogo

g! lb my
START LEVEL 20
  ID|State  |Level|Name
 759|Active |   10|Liferay My Account Web (1.0.11)
 826|Active |   10|Liferay Portal Security AntiSamy (2.0.12)
 994|Active |   10|Liferay Site My Sites Web (1.0.8)
1232|Active |   10|My Templates Bundle (1.0.0)
1237|Active |   10|my-theme (7.0.10)
g!

Extras

A complete list of all Liferay variables available to Freemarker templates, including their implementation class names and snippets of Liferay source code:

@Meta.AD(
	deflt = "serviceLocator|utilLocator|objectUtil|staticFieldGetter|staticUtil",
	required = false
)
public String[] restrictedVariables();

com.liferay.portal.template.TemplateContextHelper.getHelperUtilities(, false) {
	accountPermission=com.liferay.portal.service.permission.AccountPermissionImpl,
	arrayUtil=com.liferay.portal.kernel.util.ArrayUtil_IW,
	auditRouterUtil=null,
	browserSniffer=com.liferay.portal.servlet.BrowserSnifferImpl,
	calendarFactory=com.liferay.portal.util.CalendarFactoryImpl,
	commonPermission=com.liferay.portal.service.permission.CommonPermissionImpl,
	dateFormatFactory=com.liferay.portal.util.FastDateFormatFactoryImpl,
	dateFormats=com.liferay.portal.util.FastDateFormatFactoryImpl,
	dateTool=May 5, 2018 10:34:58 AM,
	dateUtil=com.liferay.portal.kernel.util.DateUtil_IW,
	expandoColumnLocalService=com.liferay.portlet.expando.service.impl.ExpandoColumnLocalServiceImpl,
	expandoRowLocalService=com.liferay.portlet.expando.service.impl.ExpandoRowLocalServiceImpl,
	expandoTableLocalService=com.liferay.portlet.expando.service.impl.ExpandoTableLocalServiceImpl,
	expandoValueLocalService=com.liferay.portlet.expando.service.impl.ExpandoValueLocalServiceImpl,
	getterUtil=com.liferay.portal.kernel.util.GetterUtil_IW,
	groupPermission=com.liferay.portal.service.permission.GroupPermissionImpl,
	htmlUtil=com.liferay.portal.util.HtmlImpl,
	httpUtil=com.liferay.portal.util.HttpImpl,
	imageToken=com.liferay.portal.webserver.WebServerServletTokenImpl,
	imageToolUtil=com.liferay.portal.image.ImageToolImpl,
	jsonFactoryUtil=com.liferay.portal.json.JSONFactoryImpl,
	languageUtil=com.liferay.portal.language.LanguageImpl,
	layoutPermission=com.liferay.portal.service.permission.LayoutPermissionImpl,
	localeUtil=com.liferay.portal.kernel.util.LocaleUtil,
	locationPermission=com.liferay.portal.service.permission.OrganizationPermissionImpl,
	organizationPermission=com.liferay.portal.service.permission.OrganizationPermissionImpl,
	paramUtil=com.liferay.portal.kernel.util.ParamUtil_IW,
	passwordPolicyPermission=com.liferay.portal.service.permission.PasswordPolicyPermissionImpl,
	portal=com.liferay.portal.util.PortalImpl,
	portalPermission=com.liferay.portal.service.permission.PortalPermissionImpl,
	portalUtil=com.liferay.portal.util.PortalImpl,
	portletModeFactory=com.liferay.portal.kernel.portlet.PortletModeFactory_IW,
	portletPermission=com.liferay.portal.service.permission.PortletPermissionImpl,
	portletProviderAction={ADD=ADD, BROWSE=BROWSE, MANAGE=MANAGE, EDIT=EDIT, PREVIEW=PREVIEW, VIEW=VIEW},
	portletURLFactory=com.liferay.portlet.PortletURLFactoryImpl,
	prefsPropsUtil=com.liferay.portal.util.PrefsPropsImpl,
	propsUtil=com.liferay.portal.util.PropsImpl,
	randomizer=com.liferay.portal.kernel.util.Randomizer,
	rolePermission=com.liferay.portal.service.permission.RolePermissionImpl,
	saxReaderUtil=com.liferay.portal.xml.SAXReaderImpl,
	serviceLocator=com.liferay.portal.template.ServiceLocator,
	sessionClicks=com.liferay.portal.kernel.util.SessionClicks_IW,
	staticFieldGetter=com.liferay.portal.kernel.util.StaticFieldGetter,
	stringUtil=com.liferay.portal.kernel.util.StringUtil_IW,
	timeZoneUtil=com.liferay.portal.kernel.util.TimeZoneUtil_IW,
	unicodeFormatter=com.liferay.portal.kernel.util.UnicodeFormatter_IW,
	unicodeLanguageUtil=com.liferay.portal.language.UnicodeLanguageImpl,
	userGroupPermission=com.liferay.portal.service.permission.UserGroupPermissionImpl,
	userPermission=com.liferay.portal.service.permission.UserPermissionImpl,
	utilLocator=com.liferay.portal.template.UtilLocator,
	validator=com.liferay.portal.kernel.util.Validator_IW,
	velocityPortletPreferences=,
	webServerToken=com.liferay.portal.webserver.WebServerServletTokenImpl,
	windowStateFactory=com.liferay.portal.kernel.portlet.WindowStateFactory_IW,
}


Michael Chen 2018-07-31T20:20:00Z
Categories: CMS, ECM

A Simplified Pattern for Liferay 7 Services

Liferay - Tue, 07/31/2018 - 13:07

This is a simplified OSGi service API and implementation pattern. It follows the traditional Java interface-implementation pattern, in which the programmer is only required to keep the interface class and the implementation class in sync. It does not use the Liferay 7 service builder.

The attached archive is a fully implemented ORM service based on MyBatis. Unzip it into the modules folder of a Blade-created workspace. A script to create the back-end database will be added soon to make this example fully operational.

Implementation Pattern

In this ORM example, there are two top-level packages: 'api' and 'impl'. The 'api' package and its children are exported and used by consumers of this service API. The 'impl' package and its children are for the implementation only and should remain private packages of this OSGi module.

Factory Usage Pattern

The 'api.Factory' class is the access point for consumers to get the services they need. A consumer class uses the Factory like this:

import com.acme.orm.api.bean.Order;
import static com.acme.orm.api.Factory.getOrderLocalService;

class OnlineStore {

    public Order checkOrder(String orderId) {
        Order order = getOrderLocalService().getOrderDetailsById(orderId);
        // Do something else
        return order;
    }
}

In order to preserve OSGi's run-time life cycle management of this module (start, stop, install, uninstall), it is important NOT to keep a reference to the service object obtained from the Factory:

// DO NOT DO THIS
OrderLocalService myService = getOrderLocalService();

// DO NOT DO THIS
OrderLocalService myService = Factory.getOrderLocalService();

The Liferay Service Builder goes to great lengths to prevent programmers from keeping a reference to the service object, by generating static methods in the XyzServiceUtil class and forcing people to use them. It also creates other confusing and irrelevant artifacts: XyzServiceBaseImpl, XyzServiceWrapper, and two projects (two jars) for one service.

Instead of making the pattern foolproof at the cost of all those complexities, why not just tell programmers, a very intelligent bunch, not to keep references to OSGi service objects? The result is this clean implementation pattern, with no generated artifacts and two easy-to-understand rules:

  • Keep the API interface class in sync with the implementation class.
  • Do not keep a reference to the service object obtained from the Factory.
Understanding ServiceTracker

When an OSGi module (or bundle) is replaced at run-time because it is stopped, started, uninstalled or reinstalled, the desired effect is that the services provided by that module are replaced as well. ServiceTracker is the OSGi class that keeps track of module life cycle changes. These changes remain transparent to service consumer code as long as the consumer always accesses the service through the ServiceTracker.

OSGi is a component framework running in a JVM instance, and it exhibits the same run-time behavior as any Java program. When consumer code saves a reference to a service object, that object lives on even after OSGi replaces its module with a new instance. The saved object becomes an orphaned, outdated instance known only to that consumer code. This is the reason for not keeping a reference to the service.
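The stale-reference problem can be demonstrated outside OSGi with plain Java. In the sketch below, Tracker is a hypothetical stand-in for the real OSGi ServiceTracker, and OrderLocalService is reduced to a single method; none of this is Liferay API. The consumer that caches the service object keeps talking to the orphaned instance, while the consumer that always goes through the tracker picks up the replacement:

```java
import java.util.concurrent.atomic.AtomicReference;

// Stand-in for OSGi's ServiceTracker (illustrative only): it always
// hands out the *currently* registered service instance.
class Tracker<T> {
    private final AtomicReference<T> current = new AtomicReference<>();
    void register(T service) { current.set(service); }  // module started or replaced
    T getService() { return current.get(); }            // consumers call this each time
}

interface OrderLocalService {
    String version();
}

public class StaleReferenceDemo {
    public static void main(String[] args) {
        Tracker<OrderLocalService> tracker = new Tracker<>();
        tracker.register(() -> "impl-1.0");

        // DO NOT DO THIS: caching pins the first implementation.
        OrderLocalService cached = tracker.getService();

        // The module is replaced at run-time (stop, uninstall, install, start).
        tracker.register(() -> "impl-2.0");

        System.out.println(cached.version());               // impl-1.0 (orphaned instance)
        System.out.println(tracker.getService().version()); // impl-2.0 (current instance)
    }
}
```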

In this implementation pattern, the Factory class retrieves the service object from its corresponding ServiceTracker. The getService() method of the ServiceTracker shields the consumer code from the module's life cycle changes:

@ProviderType
public class Factory {

    private static ServiceTracker<OrderLocalService, OrderLocalService> _OrderLocalService =;

    public static OrderLocalService getOrderLocalService() {
        return _OrderLocalService.getService();
    }
}

Local vs. Remote Service

The differences between a Liferay 7.0 local service and remote service are:

  1. The base interface of the API.
  2. Specific annotations for the remote interface.

In the ORM example, OrderLocalService is the local service interface:

Local Service API Declaration

@ProviderType
@Transactional(isolation = Isolation.PORTAL, rollbackFor = {
    PortalException.class, SystemException.class})
public interface OrderLocalService extends BaseLocalService {
}

while OrderService is the remote service interface, exposed as a RESTful web service:

Remote Service API Declaration

@AccessControlled
@JSONWebService
@OSGiBeanProperties(property = {
    "",
    "json.web.service.context.path=Order"
}, service = OrderService.class)
@ProviderType
@Transactional(isolation = Isolation.PORTAL, rollbackFor = {
    PortalException.class, SystemException.class})
public interface OrderService extends BaseService {

    public Order getOrderDetailsById(String orderId);
}

This RESTful web service can be found in the following catalog under Context Name "acme" (click the drop down box to find "acme" or other context names):


Both the local and remote service implementation classes just implement their corresponding API interfaces. In the ORM example, OrderLocalServiceImpl is the local implementation that does the actual work of mapping to the database. The remote implementation, as shown below, simply calls the local Factory services:

Remote Service Implementation

@ProviderType
public class OrderServiceImpl implements OrderService {

    public Order getOrderDetailsById(String orderId) {
        return Factory.getOrderLocalService().getOrderDetailsById(orderId);
    }
}

Development Details

Here are some key files for creating and implementing a service in this pattern.

Eclipse .project and .classpath

These two files in the example archive must be used to start your service project for Eclipse to recognize it as a Gradle project. You can change the project name in the <name> tag of the .project file before importing the project to Eclipse:


You must also create these two folder structures to hold your Java and resource files:


Once imported to Eclipse, be sure to right click on the project and select "Gradle" -> "Refresh Gradle Project". You can also do the same thing with the parent Gradle project created by Liferay Blade.


The two 'org.osgi:org.osgi.*' dependencies are required for OSGi features.

dependencies {
    compile group: "com.liferay", name: "com.liferay.osgi.util", version: "3.0.3"
    compile group: "com.liferay.portal", name: "com.liferay.portal.kernel", version: "2.0.0"
    compile 'org.osgi:org.osgi.core:5.0.0'
    compile 'org.osgi:org.osgi.annotation:6.0.0'
    compile group: 'javax.servlet', name: 'servlet-api', version: '2.5'
    compile group: "org.mybatis", name: "mybatis", version: "3.4.1"
    compile files('./resources/lib/sqljdbc4.jar')
    compileOnly group: "com.liferay", name: "com.liferay.journal.api", version: "1.0.0"
}

bnd.bnd

All packages under 'api' should be exported in the "Export-Package:" setting. The "Liferay-Spring-Context:" setting directs Liferay to load the Spring bean definitions in the module-spring.xml file discussed below. The "Liferay-Require-SchemaVersion:", "Liferay-Service:" and "Require-Capability:" settings are also required.

Bundle-Version: 1.0.0
Bundle-ClassPath: .,lib/sqljdbc4.jar
Export-Package: \
    com.acme.orm.api,\
    com.acme.orm.api.bean,\
    com.acme.orm.api.exception
Import-Package: \
    !*,\
    !microsoft.sql.*,\
    !com.sun.jdi.*,\
    !net.sf.cglib.proxy.*,\
    !org.apache.logging.*,\
    *
Include-Resource: @mybatis-3.4.1.jar
Liferay-Require-SchemaVersion: 1.0.0
Liferay-Service: true
Liferay-Spring-Context: META-INF/spring
Require-Capability: liferay.extender;filter:="(&(liferay.extender=spring.extender)(version>=2.0)(!(version>=3.0)))"

src/main/resources/META-INF/spring/module-spring.xml

For each bean definition, the "class=" value is the implementation class name, and the "id=" value is the interface class name.

<?xml version="1.0"?> <beans xmlns="" xmlns:xsi="" default-destroy-method="destroy" default-init-method="afterPropertiesSet" xsi:schemaLocation=""> <bean class="com.acme.orm.impl.CommunicationMediumLocalServiceImpl" id="com.acme.orm.api.CommunicationMediumLocalService" /> <bean class="com.acme.orm.impl.MessageCenterLocalServiceImpl" id="com.acme.orm.api.MessageCenterLocalService" /> <bean class="com.acme.orm.impl.NSMUserLocalServiceImpl" id="com.acme.orm.api.NSMUserLocalService" /> <bean class="com.acme.orm.impl.OrderLocalServiceImpl" id="com.acme.orm.api.OrderLocalService" /> <bean class="com.acme.orm.impl.OrderServiceImpl" id="com.acme.orm.api.OrderService" /> <bean class="com.acme.orm.impl.RoutingAreaLocalServiceImpl" id="com.acme.orm.api.RoutingAreaLocalService" /> <bean class="com.acme.orm.impl.WebContentArticleLocalServiceImpl" id="com.acme.orm.api.WebContentArticleLocalService" /> </beans>


Michael Chen 2018-07-31T18:07:00Z
Categories: CMS, ECM

How to disable XML Log Files in Liferay 7.x

Liferay - Tue, 07/31/2018 - 11:52

Liferay 7.0 started to produce a pair of log files per day:

ls -l /opt/liferay/home7/logs/
total 3104
-rw-r--r--  1 michael  admin  285201 May 26 13:24 liferay.2016-05-26.log
-rw-r--r--  1 michael  admin  898027 May 26 13:24 liferay.2016-05-26.xml
-rw-r--r--  1 michael  admin  400811 Aug 19 13:08 liferay.2016-08-19.log
-rw-r--r--  1 michael  admin       0 Aug 19 12:26 liferay.2016-08-19.xml

To disable Liferay logging to XML files, create portal-log4j-ext.xml in the following path:

tomcat-8.0.32/webapps/ROOT/WEB-INF/classes/META-INF/portal-log4j-ext.xml

<?xml version="1.0"?>
<!DOCTYPE log4j:configuration SYSTEM "log4j.dtd">
<log4j:configuration xmlns:log4j="">

    <appender name="XML_FILE" class="org.apache.log4j.ConsoleAppender"/>

    <root>
        <priority value="INFO" />
        <!--appender-ref ref="CONSOLE" /-->
        <appender-ref ref="TEXT_FILE" />
        <!--appender-ref ref="XML_FILE" /-->
    </root>
</log4j:configuration>

This file overrides two elements of META-INF/portal-log4j.xml in ROOT/WEB-INF/lib/portal-impl.jar, which stops Liferay from copying log entries to catalina.out and the daily XML log file.

You can also create the extension XML file as a copy of the original in the JAR, as follows:

cd tomcat-8.0.32/webapps/ROOT/WEB-INF
unzip -p lib/portal-impl.jar META-INF/portal-log4j.xml > classes/META-INF/portal-log4j-ext.xml


Michael Chen 2018-07-31T16:52:00Z
Categories: CMS, ECM

Blade Project Version

Liferay - Mon, 07/30/2018 - 13:08

TL;DR - Add "-v 7.0" to your Blade command line to create modules for Liferay CE 7.0 and Liferay DXP 7.0.

Hey, just a quick blog post here for something that developers may need to know...

I updated my Blade command line tool recently (since I use Intellij, I rely on the command line tool quite a bit for new module project creation), but I hadn't started a new module after updating.

That changed last Friday... Had a hot customer request for a customization so I used Blade to create a new workspace and started creating my modules. Modules loaded into Intellij just fine, and I happily start cranking out the code.

When I was happy with development, I built my 3 modules (SB API and Service modules and one UI module) and dropped them into my Liferay 7.0 DXP deployment folder.

Normally, I see the happy messages that my modules have started. I mean, these modules were not rocket science and they had no weird transitive dependency issues, so they should have just started.

But none of them did, not even the API module and those never have an issue starting.

So I fire up the Gogo shell, issue my lb command, and there are my modules at the end of the list, all in the Installed state.

So I command Gogo to start my API module, but it won't start, it has unresolved references on portal-kernel 3.0 and other such references.

I realize right away that portal-kernel 3.0 is from Liferay 7.1, but I wasn't building anything for Liferay 7.1, I'm still building for 7.0.

All of my modules suffered from the same problem, all had references to 7.1 artifacts.

Not knowing any better, I went back into the build.gradle files for my 3 modules and updated all of the versions so they were back at the normal 7.0 versions, rebuilt and redeployed and all was good.

So I asked my friend Greg Amerson what the deal was, and he informed me that the new default project version for the Blade tool is now 7.1. To have Blade create a project for Liferay CE 7.0 or Liferay DXP 7.0, you have to add "-v 7.0" to the Blade command line arguments.

So, there you go.  Avoid my Blade update headaches and just remember to tack on the "-v 7.0" for all of your new 7.0 modules.

Update 08/2018

So I checked with the folks that know; unfortunately, there is no way to "default" the new blade to 7.0. There are plans on the roadmap, but I don't know when that will be available.

In the meantime, if you are using and developing for Liferay CE 7.0 or Liferay DXP 7.0, don't upgrade your blade until you move to 7.1 or they get the default setting in place.

If you did upgrade your blade, you can either remember to add the -v 7.0 argument every time you invoke blade, or, the easier option, just revert to an older version of Blade.

All you need to do is:

  1. jpm remove blade
  2. jpm install -f

That should get you back to a pre-7.1 version where the default will be 7.0.

David H Nebinger 2018-07-30T18:08:00Z
Categories: CMS, ECM

The Indigenous Culture Will be Preserved for Years to Come with Alfresco

Alfresco - Wed, 09/02/2015 - 09:21

After an extensive search, the Australian Institute of Aboriginal and Torres Strait Islander Studies (AIATSIS) chose Alfresco One and Parashift, one of Alfresco’s Australian partners, to streamline and digitize their new ECM and digital asset management system.

AIATSIS was looking for a more robust solution to help digitize its repository of materials relating to the Aboriginal and Torres Strait Islander culture.

This is a very exciting project for the Alfresco and Parashift teams, and everyone is eager to get started. AIATSIS holds a collection of nearly one million assets in need of digitization, and it's a race against the clock, as many of these objects are fragile and at risk of deterioration.

Parashift Managing Director Kieren Fitzpatrick said the specially tailored solution will provide many benefits for AIATSIS, including freeing up teams to streamline their digitizing work. “The new DAMs/EDRMs will provide a single index for all of AIATSIS’ information – documents, digital assets and records – allowing for more collaboration across the Institute and more sophisticated workflows,” he said. “Automation will save people from emailing back and forth, or updating spreadsheets, freeing them up to focus on digitizing the collection, which holds approximately one million objects.”

AIATSIS Executive Director of Collections, Lyndall Osborne, said, "our project team met with every business unit to form a detailed set of requirements. A number of the proposals were explored in depth and ultimately it was determined the Alfresco Digital Asset Management system (DAMs) / Electronic Document Records Management system (EDRMs) solution proposed by Parashift would provide the best possible outcomes for AIATSIS."

Alfresco, along with Parashift, understands the responsibility we have not only to AIATSIS, but to all those who are a part of, or are interested in studying, the Aboriginal and Torres Strait Islander culture. We are thrilled to start the digitization project so that these items are preserved for future generations.

Categories: ECM

Alfresco Takes Over San Francisco

Alfresco - Fri, 08/07/2015 - 13:42

We were thrilled to welcome over 100 people to our Alfresco Day San Francisco event – which took place at The Argonaut Hotel.

The day-long event included presentations from Alfresco executives, including VP of Product Management Thomas Deo Meo and VP of Sales Chris Lynch, covering the latest advancements in Alfresco One, how to manage compliance, and business process management with Activiti.

We welcomed Alfresco customers Cisco and National MI to the stage, and they discussed their use cases for Alfresco.

Cisco’s presentation, “Where Do I Put my Stuff?” was focused around how search and content management need to co-exist. After consistent employee survey results, it was determined that employees just couldn’t find anything on their intranet. That is when Cisco turned to Alfresco to deliver a consolidated content management system with highly engaged search.

National MI discussed their road to Alfresco. While looking for an ECM solution, they wanted a cost effective, efficient and scalable solution. They decided to look for Open Source candidates and found that Alfresco gave them what they were looking for.

Quoting one of the Alfresco Day attendees, “To see where Alfresco has come from, just makes it an even more viable candidate for enterprises to consider when choosing an ECM solution.”

I also cannot forget to thank our wonderful sponsors for the event. Ixxus, Rivet Logic, Micro Strategies and Zia were among the many who presented at Alfresco Day San Francisco and exhibited their solutions in the expo hall.

Our Alfresco Day World Tour is just starting! Be sure to check out and register for a city near you!

Categories: ECM

Do you have your Alfresco University Passport?

Alfresco - Mon, 08/03/2015 - 03:00

Most of today’s leading companies either have an enterprise content management (ECM) system in place or are in the process of getting one.

In a recent AIIM Industry Watch survey, over half (52%) of participants said that they are working towards company-wide ECM capabilities, but only 14% said they are currently leveraging full ECM capability.

The biggest hurdle is user adoption.

Of those surveyed, 45% cited user adoption as the biggest roadblock when it comes to fully leveraging their ECM solution company-wide.

This was closely followed by challenges around consolidating multiple repositories (42%) to break away from data silos and an increase in the level of training (30%).

The best way to acquire the right skills for your team to deliver a successful Alfresco implementation is through training with Alfresco University. Alfresco is now offering an even more cost effective way to take advantage of this valuable training with the new Alfresco University Passport.

For just $3,250 per user per year, Passport will give you access to the entire catalog of eLearning and virtual instructor-led (vILT) courses available through Alfresco University for an entire year. With your Alfresco University Passport, you’ll enjoy:

  • Easy, self-service course enrollment for eLearning and vILT
  • The ability to enroll in as many courses as you like and attend the same course multiple times
  • Access to eLearning courses 24 hours a day, 7 days a week
  • Access to Alfresco Certified Instructors during live webcasts of vILT courses
  • Access to hands-on lab exercises to gain practical experience, as well as student guides, lab guides and other rich media content in an electronic format
  • Access to exclusive content only available to Passport holders
  • A jump-start to acquiring the information you need to achieve your Alfresco Certified Engineer and/or Alfresco Certified Administrator certificate!

Passport holders can complete any courses in the current course catalog as often as they like during the subscription period in addition to accessing any new and/or updated courses that are released during the term of their subscription.

Want to learn more? Contact your Alfresco representative today to purchase your Alfresco University Passport or e-mail All major credit cards are accepted. Once your purchase is complete, please allow up to 24 hours for access to your Passport to take effect.

Categories: ECM

Alfresco Activiti 1.3 – Engagement Just Got Easier

Alfresco - Fri, 07/31/2015 - 04:16

It only seems like a few months since the last release of Activiti, and that’ll be because it was.  We’re still working at cloud speed to build out the smartest BPM product on the planet.  With Activiti 1.3, we’re really showing off its potential for building and delivering engaging business solutions: rich processes and forms with simple design and deployment, coupled with dynamic collaboration.

Did I mention any major performance improvement?  No, but it’s there, and you can find out more and who we’ve been collaborating with on it here:

We’ve had a lot of requests for enhanced forms, so we’re giving you tabbed forms; variable numbers of columns on different parts of the same form; saving of forms for coming back to later or passing to someone else to complete; visibility control over whole groups of controls; and Form Stencils.  With these Stencils, you can define your own custom form controls through the UI – a topic worthy of a blog all on its own.

Some of the headline features include better integration with content repositories.  For Alfresco ECM repositories, you can now map metadata into Activiti process variables when you attach a document, or set metadata on content when it's published to Alfresco ECM, on-premise or in the cloud.  You can even choose different Process Apps for different sites in Alfresco One.  We've added Box to the list of integrations, allowing you to attach files from Box, or publish content from Activiti to Box.  And you can specify where in these repositories and by whom.

A really natty capability is Document Generation from Microsoft Word templates, producing PDF or .docx files based on data collected in a process.  Couple that with the ability to publish to other content stores and you have a business solution with only a few clicks.

We’ve also added a bunch of enhancements for the user to take control of their task management.  First, there’s custom filters for the task and process lists, so you can create very specific views of tasks or processes.  Then, when you’re working on a task, as well as involving other people, you can add a Checklist.  So what? Well, a checklist is a set of smaller tasks you want to add to help you solve the main task, and because they’re tasks, they too can be assigned to other people with due dates.  Naturally, they too can have a Checklist added to that subtask and involve further people.  You can organically solve a problem by breaking away from the process, but it is all controlled and captured.  The other bit of magic is that you can attach any form from your Form Library to a task, so you could assign a Checklist item to someone and attach a form to capture their feedback or collect data.  All dynamically and as needed.

There’s quite a few other things too, especially for anyone working with REST interfaces, but I’ll leave you to discover those yourself.  As always, I feel excessively proud of the Activiti team that has brought this release to fruition.  But we’re not done yet.

Get the trial download here:

Categories: ECM

Next-Gen ECM Webinar: Your Questions Answered

Alfresco - Thu, 07/30/2015 - 16:53

Earlier this month, Alfresco hosted the webinar Next-Gen ECM. More than 1000 people joined us for a discussion on the trends that are reshaping ECM requirements and why a new approach to content management is needed.

The webinar generated lots of interest and dozens of questions. We didn’t have time to address everyone’s questions live, so I’ve answered several of them in this blog post. We are also following up individually with anyone who asked a question.

If you missed the webinar, it’s available on demand here. Registrants also receive a copy of my white paper Next-Gen ECM: Why the Time is Right for a New Approach to Enterprise Content Management.

Q: Is Alfresco being used to replace legacy ECM systems? If so, what are the typical drivers for this change?

A: Our customers want value, flexibility and accessibility from their ECM systems, and they don’t want to spend more for a big brand. Often, they keep their most sensitive and important content on premises, but want to store some content in the cloud to support collaboration. This makes Alfresco’s hybrid cloud ECM platform very appealing. Other top drivers for ECM migration are improved search and regulatory compliance. We provide the easiest, most effective platform for their requirements.

Q: How do you see Alfresco competing against—or do you compete against—file sharing services like Box?

A: We generally don’t compete with enterprise file sync and share vendors like Box, but we are often complementary. Some companies use Alfresco to manage their most important business content and Box to manage casual content that can be shared outside the organization. Although there is some overlap in functionality, our use cases tend to be very different. With our process control, high level of security, rules and rich metadata, Alfresco sits at the heart of business processes. In contrast, products like Box and Dropbox are used for more collaborative interactions. Some of our partners have created integrations between Alfresco and these services.

Q: What are the main differences in terms of ECM between Alfresco and SharePoint? Can you give us some examples of how the two applications could be integrated?

A: The main difference between Alfresco and SharePoint is that SharePoint is primarily a content portal, while Alfresco is focused on the integration of ECM and BPM. Alfresco supports rich business processes, a much richer data model, dynamic searches, much higher scalability and robustness as well as integrated records and digital asset management. In addition, users generally find the navigation and searching experience to be easier in Alfresco.

Some customers use Alfresco and SharePoint together. A partner like SeeUnity provides archival capabilities from SharePoint to Alfresco and Web Parts for accessing Alfresco content in the SharePoint portal.

Q: Is it possible to manage confidential information in Alfresco? Is it really secure?

A: Absolutely. Alfresco is trusted by organizations that require the highest level of security. Military, intelligence and civil government organizations use Alfresco to manage highly sensitive information. Alfresco is also used by large corporations to store confidential intellectual property that is subject to export controls.

Q: Large multinational organizations often face local jurisdictional rules that prohibit the transfer of information across borders. How would you handle this in Alfresco with deployment models and search capabilities?

A: Alfresco gives you a few deployment options. It sounds like you are asking the question from a central repository perspective. In this case, each part of the repository can have different access controls. Our search capability through Solr evaluates all documents for relevant security controls so that users see only the information that they have the right to access. In a distributed model, you have a great deal of control over what gets replicated from one repository to another. This means you can set up and enforce policies that only authorized content is replicated.

Q: Does Alfresco in the Cloud offer data center failover capabilities (e.g., with data centers on the East and West coasts of the U.S.)?

A: Yes. Alfresco in the Cloud uses Amazon Web Services (AWS) for data storage. AWS has multiple data centers in each region, but we also have disaster recovery in the West Region. Over time, we will add new Amazon and European regions, with government regions a high priority for us. This is contingent on our rollout of new cloud-based applications.

Q: Legacy document management systems like Documentum were hard to implement from a “business change” perspective. Can you shed any light on how easy it is to implement Alfresco? Issues like content clean up, end user training, new business processes, etc.

A: There are two fundamental concepts in Alfresco—Aspects and Rules—that make organizing information much easier. Since I was involved in the design of Documentum, I can say we made a fundamental mistake in assuming that the information architecture would be predefined. After all, that's how a database works. But content is different. You want to load up the content and then figure out what to do with it. Aspects are like Lego-blocks of metadata that can be added after the content is loaded, so you can figure out what data is needed dynamically. Rules use content context—folder, file type, even data in the document—to apply Aspects, move content around and protect it with the right level of security. This is all done automatically, without the user having to figure these things out.

Q: How do you sell metadata to people who are used to a folder-based interface?

A: You can sell metadata as the ability to search and find things more easily. Folder Rules allow metadata to be added to content without forcing people to adopt a new way of working. When the user adds a document to a folder, rules automatically add the metadata so the document can be found in searches. Using the Office integration in Alfresco 5.0, it's also possible to add metadata directly in the Office file and pick it out just as SharePoint does.

Workflows are another great way to submit files. A simple form can capture the information you need, and the workflow automatically handles the metadata. The user doesn’t have to do anything at all.

Q: What do you think about the trend to integrate more and more content in the application? In the past, the idea was to integrate the application into the content.

A: Content is the moveable part; the application is not. There may be some context in the question that I’m missing, but it’s something I have been thinking about. Content can exist in multiple places at the same time: on premises, in the cloud, in an enterprise app or social web site, on a desktop or mobile repository, and so on. ECM is key to keeping track of content, ensuring that it’s in sync and handling conflicts.

Applications can be anywhere, and on mobile devices they will be harder to integrate into the content. But content can be moved where it is needed and transformed into a format that the application can use.

Q: What capabilities are available for personalized content delivery?

A: Alfresco supports the templating language Freemarker, which runs in the repository and has the full capability of the Alfresco data model. This allows Freemarker to modify and assemble complex, dynamic content as needed.

Q: What capabilities does Alfresco provide for indexing proprietary binary data, assuming that data structures are known or that access APIs exist?

A: We support the open tokenizer and schema capabilities of Solr and have an open architecture for metadata extraction. With these two capabilities, you can add the ability to extract searchable information from binary data. Alfresco Consulting Services can provide assistance with this, if needed.

Q: How will you interoperate with Big Data technologies like MongoDB, HBase, HDFS and graph databases?

A: The Alfresco community and our engineering team have been experimenting with Big Data technologies, particularly MongoDB and HBase. Alfresco lends itself to storing information in these technologies quite well. We are also doing some experimentation with Cassandra—although, we’ve actually been able to scale higher and have full transactional control using new in-memory relational databases. It may be a matter of using different databases for different purposes. As for graph databases, these would be used for very different purposes like keeping track of relationships and deducing connections between users and content for recommendation purposes. This is a longer-term project.

Q: What is Alfresco’s integration proposition for Atlassian products?

A: Partners have created integrations with Atlassian products, including Jira and Confluence. Click here for a full list of partner solutions.

Thanks to everyone who joined the webinar and submitted questions. If you'd like some hands-on experience with Alfresco, I encourage you to try one of these options:

Categories: ECM

Alfresco Drives Digital Content Experience for Award-Winning Canadian Museum for Human Rights

Alfresco - Mon, 07/20/2015 - 12:36

The Canadian Museum for Human Rights (the Museum) recently won four MUSE Awards from the American Alliance of Museums, one of which was a Gold Award for its digital infrastructure. Alfresco One, our enterprise content management (ECM) platform, facilitates the Museum’s digital content experience, which includes 6,500 images and over 100 hours of high-definition video, all complete with metadata. In cooperation with solutions provided by Technology Services Group and the Museum’s digital platforms team, our technology supports all aspects of digital storytelling, including creation, cataloguing, exhibition, web and mobile dissemination, and archival storage.

This digital experience is core to the Museum’s work. It is the first museum in the world solely dedicated to the evolution, celebration and future of human rights. It is designed to encourage reflection and dialogue, and therefore provides visitors with a unique interactive experience including a large digital collection of recorded oral histories and dynamic, multimedia technology exhibits. Because of this, the Museum is one of the most digitally advanced cultural institutions in North America.

I believe Alfresco is perfectly suited to support the Museum now and into the future. By its nature, the Museum is open and ever changing. We founded Alfresco on these principles as well – creating an open architecture that can adapt to unforeseen changes. Our technology is in line with the spirit of the Museum and its mandate of transparency and collaboration, and allows the Museum to continue to evolve with different technologies.

Congratulations to the Museum on winning these prestigious awards. We are proud to help you provide an exceptional digital experience to your visitors and we look forward to working with you as your needs and use cases evolve in the future.

Categories: ECM

Updating the Enterprise Content Management Strategy for the Federal Workforce

Alfresco - Tue, 06/16/2015 - 13:04

I was recently asked to submit a byline to Government Computer News, one of the top government technology trade publications, on how agencies need to think about ECM in light of the many new challenges they are facing. The experience of the federal worker is undergoing a dramatic transformation, as older employees are retiring and are being replaced with a younger demographic that is on the forefront of the rise of mobile, social and cloud technologies. The younger work force expects to be productive anytime, anywhere, using any device or application. This requires that agencies develop systems that deliver easier access to content, or employees may resort to consumer-grade applications that fail to meet compliance and records management mandates.

At the same time, many legacy Enterprise Content Management (ECM) systems have been in place for a decade or more, requiring major upgrades or replacement. They have been architected for a different era, and therefore these outdated applications have been left behind by current working and computing environments.

This has intensified the pressure on federal IT teams to update their ECM strategy or face the control, security and compliance issues that are created by user-established technologies but are unacceptable in the public sector environment. Devising a new approach to ECM will help agencies overcome shadow IT and adhere to required standards. IT teams looking to update their ECM strategies while meeting the needs of both federal workers and federal mandates should take note of these four content management necessities. Here are some of the observations I shared in my GCN byline:

Support New Ways of Working
Today’s federal workforce has been shaped by the use of apps like Instagram, Snapchat and Uber. Modern ECM needs to embrace the applications and devices people use every day, taking cues from consumer apps so that the user experience is fresh, engaging and intuitive. Thirty-seven percent of the world’s workforce is predicted to be mobile in 2015 – a reality felt in the federal market as well. These highly mobile, highly connected employees want IT solutions that allow them to work regardless of location, network or device.

Prepare for the Emergence of the Extended Enterprise
Agencies are no longer stand-alone entities. Instead, they are often an inter-connected web of employees, contractors, suppliers and constituents, extending far beyond the traditional boundaries of the agency, and certainly beyond their network firewalls. This makes the ability to share content and process across the extended enterprise an imperative for modern ECM. Legacy ECM systems were architected at a time when users and content stayed behind the firewall on servers and PCs, but that simply isn’t feasible to support today’s federal worker. Modern ECM needs to support the easy, controlled sharing of content and process both inside and outside of the agency.

Plan for the Explosion in Digital Content
With so much content being created on an ongoing basis, it can be harder than ever for government users to find the information they need quickly. If content is not adequately managed, governed, or secured because it is “in the wild” – spread across network drives, mobile devices, laptops, email, USB sticks and consumer file sharing sites – then it raises security, compliance, operational and other risks. Modern ECM needs to use contextual data – comments, preferences, geo-location data – to drive business processes that move content to the right person at the right time within the right application. This will enable content to be put into context so that people and processes work more efficiently and effectively.

Embrace Today’s IT Infrastructure
Agencies must adopt a strategic platform for meeting the often conflicting needs of users – who want to work remotely and with people outside of the agency – and IT, which must ensure that content is secure and managed in accordance with governance and compliance policies. Unfortunately, legacy ECM platforms are generally not built for cloud scale and offer only limited mobile support. Federal agencies have shifted to a hybrid storage approach, storing content both on premises and in the cloud, which requires seamless syncing between the two locations. An agency’s ECM approach needs to recognize the need for control and compliance and must be able to support traditional on-premises deployments, virtualized environments, private cloud deployments, full-fledged public-cloud SaaS deployments and everything in between. Agencies will need to figure out which content is appropriate for sharing in the cloud and which should remain on premises behind the firewall.

ECM plays a vital role in storing, managing and controlling the content that keeps the federal government moving. Agencies must address the need to allow people to work where and how they want, enable information and process to flow across the extended enterprise, develop a system to handle digital content management and make deployment an easy process for IT teams across the IT infrastructure. Keeping the current workplace in mind when crafting an updated ECM strategy will empower federal employees to do and share great work.

Categories: ECM

Alfresco Wins Omega NorthFace ScoreBoard Award for Outstanding Customer Service

Alfresco - Fri, 06/05/2015 - 18:21

At Alfresco, we love to tell prospects, customers, analysts, prospective employees, anyone really, about how we are a company of more than 375 (and counting) employees who truly want to deliver great experiences to our customers and partners. We can talk about this with confidence because there is a certain truth in business: You cannot fake a passion for delivering great customer experiences.

Today, we want everyone to know how incredibly fortunate we are to have a team who engage with our customers and partners every day and truly embrace that passion for delivering great customer experiences. Here’s why: We just received notification that our Customer Support and Premier Support Services Teams are among a very prestigious group of recipients of the 2014 Omega NorthFace ScoreBoard Award! This highly regarded customer service industry award is based on an independent audit that draws exclusively on the evaluation of customer measures and customer feedback. In Alfresco’s case, that means our customer satisfaction survey, which measures the level of satisfaction expressed by our customers each time a support case is closed.

There is no better recognition for this hard work than when it comes directly from the customers they serve every day and gets further acknowledged by an organization like Omega. At Alfresco, we truly believe that having the best technology alone is not good enough for a leading global company like ours, because at the core of our business are our customers. How we work with them on a daily basis—ensuring that they get prompt, effective and courteous service and that they are meeting their business goals—is what drives our team. Customer service is at the forefront of everything we do.

The Omega Management Group initiated this program 15 years ago. It is the first of its kind in the service industry to recognize organizations that not only offer exemplary service to their customers, but center their very existence on a deep commitment to being customer centric. It is widely recognized as the most prestigious award for customer service excellence.

And the process is not easy. Omega reviews customer satisfaction survey results each year from over 500 companies on average to determine their qualification for the honor. First, just to be considered, each organization’s customer satisfaction survey must meet Omega’s rigorous qualifications and criteria, which include mastering both the experience and the satisfaction that customers want from their service providers. The journey includes a quarterly review of the survey process and results, and a Customer Satisfaction rating of 4.0 out of 5.0 (or an equivalent rating system) over a 12-month calendar year.

We are proud of this achievement not because it’s another trophy for our lobby, but because it is the result of the commitment and care we take to make sure our customers meet their business objectives through the Alfresco platform. Not only does Alfresco create an amazing product, design and implement great solutions for customers using that product, and educate and train those customers to be successful – we back that delivery up with a world-class, and now award-winning, support organization that cares deeply about ensuring our customers have consistently great experiences.

Congratulations, team, on a job very well done.

Categories: ECM

Three extended enterprises embracing the collaborative economy

Alfresco - Mon, 06/01/2015 - 04:42

Businesses are being reshaped by an exponential growth in connected activity and information flow. There are few stand-alone entities these days. Instead, companies are often an inter-connected web of companies, contractors, suppliers, partners and employees. And the challenge for IT is only going to increase as companies begin to take advantage of new economic models such as the Collaborative Economy.

The Collaborative or Sharing Economy involves using technology to connect people to make better use of products, services and resources – reinventing the way we consume and disrupting all aspects of the value chain. According to Jeremiah Owyang, for every car that is rented through a car-sharing service such as Zipcar, the manufacturer loses on average $270,000 as this reduces car ownership by 9-13 vehicles. This then has a knock-on effect on the rest of the value chain (insurance, fuel, parts etc).

Some companies have already recognized these shifts and are exploring how their business models could change forever.

GE has partnered with product crowdsourcing service Quirky, investing over $30m to create Wink, a ‘microfactory’ for connected home products that is currently generating over 4,000 new ideas per week. Wink selects a small number of these to take to production, and the resulting products are sold by retailers such as The Home Depot.

Barclaycard Ring, from Barclays PLC is a credit card service designed and run by its community members. Members propose and vote on new ideas to make the service better. Barclays shares the financial impact that these ideas might have and the Giveback program gives members a share in the profits.

US Government agency NASA launched a contest asking data scientists on crowdsourcing site Kaggle to create new uses for the agency’s wealth of earth sciences informational material and data. The competition uses the NASA Open Earth Exchange platform. NASA has some of the smartest people on the planet, yet competitions on Kaggle are often won by people with little subject-matter expertise who are great at building predictive data models.

Managing the digital content and workflow associated with such a dramatically different extended enterprise requires some careful thought.

Firstly, you will need to design business processes that can extend beyond your organization and that can be adapted on the fly. Secondly, it is vital to use collaboration tools that work as well with your partners as they do inside your organization. Finally, content needs to flow easily between the different parties involved in the process. But in some cases, there could be significant security concerns so think about how to safeguard your critical content, whilst still delivering the content that partners need in the cloud.

For more information:

Check out and download the Modernization of ECM whitepaper.

Learn how Alfresco is solving these types of business process and content challenges.

Categories: ECM

Alfresco Furthers Modernization of ECM and BPM with New Analytics and Multi-Media Tools

Alfresco - Tue, 05/19/2015 - 12:52

The ECM and BPM markets continue to advance at ever increasing speeds, and customers are seeking simple and smart solutions to solve information and content-driven challenges. As we continue to redefine the ECM and BPM markets, Alfresco is leading the charge to bring business insights into unstructured content and associated business processes. We just took another step in that direction, announcing two new tools for Alfresco One – Alfresco Analytics and Alfresco Media Management – as well as key enhancements to our BPM software. These updates highlight Alfresco’s commitment to providing businesses with innovative, easy-to-use solutions to manage all their content needs.

The Alfresco Analytics™ module for Alfresco One provides new insights and reporting capabilities on content, process and people-related events. This gives new visibility into how content is being managed and shared, including:

• Who is accessing which documents
• How often content is being shared and accessed
• What tasks are being completed by which employees

The Analytics module allows businesses to use information about unstructured content to influence business processes and enable process improvements, including the ability to identify star performers, bottlenecks, and resource needs and usage.
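Reporting of this kind boils down to simple aggregations over a stream of audit events. As a rough illustration only – the event shape and field names below are hypothetical, not Alfresco’s actual audit schema – counting who viewed which documents might look like:

```python
from collections import Counter

def summarize_views(events):
    """Aggregate audit events into (user, document) view counts.

    `events` is a list of dicts such as
    {"user": "jdoe", "action": "view", "node": "report.pdf"} --
    a hypothetical shape for illustration, not Alfresco's real audit format.
    """
    views = Counter()
    for event in events:
        if event.get("action") == "view":
            views[(event["user"], event["node"])] += 1
    return views

sample = [
    {"user": "jdoe", "action": "view", "node": "report.pdf"},
    {"user": "jdoe", "action": "view", "node": "report.pdf"},
    {"user": "asmith", "action": "edit", "node": "report.pdf"},
]
```

The same pattern – filter by action, group by user or node – covers the other questions above, such as how often content is shared or which employees complete which tasks.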

The Alfresco Media Management™ module is designed to make it easier for companies to store and manage a range of rich media files, such as high-definition video, without a separate system. The new components allow end users to view, and conduct lightweight editing and transformation of, photos and videos within our platform, removing the requirement to download the files into specialist media editing or viewing software. This saves serious time and removes complexity. It also makes it easier to transcode video files from one format to another (for example, HD to mobile or streaming formats) or to transform media files – for example, converting high-quality print-ready images to lower resolution for inclusion on a company’s website. This new media-handling component enables companies to easily repurpose rich media files to further drive digital business.
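Behind the scenes, transcoding like this is typically delegated to a tool such as FFmpeg. As a sketch of the idea only – the module’s actual implementation and parameters are not documented here – producing a lower-resolution mobile rendition from an HD master might be expressed as:

```python
def build_transcode_cmd(src, dst, height=480, video_bitrate="1M"):
    """Build an ffmpeg command line that downscales a video for mobile.

    Scales to the given height (width follows the aspect ratio) and
    re-encodes with H.264 at the requested bitrate.
    """
    return [
        "ffmpeg", "-i", src,
        "-vf", f"scale=-2:{height}",  # -2 keeps the width divisible by 2
        "-c:v", "libx264",
        "-b:v", video_bitrate,
        dst,
    ]

# e.g. subprocess.run(build_transcode_cmd("master.mov", "mobile.mp4"))
```

The value of the module is that it runs this sort of pipeline server-side, so end users never have to install or learn the underlying tooling.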

We also made significant upgrades to our Alfresco Activiti 1.2 BPM solution, including:

• Integration with Alfresco Share™ – Alfresco’s group collaboration tool – effectively merging the worlds of BPM and ECM through one simple-to-use application
• New process and forms designers aimed at line-of-business users
• Additional languages to support global deployments
• A PDF capture of the process audit to help meet compliance needs
• A process heat map to help analyze and improve process efficiency
• Improved forms layout
• Ease of integration with other line-of-business applications

Alfresco now makes it possible to analyze and act on vast amounts of information in ways never before possible. By continuing to push the boundaries of the definition of “content” and how companies use it to drive digital business, Alfresco enables organizations to advance well beyond traditional ECM benefits – and instead experience an enterprise-wide modernization of business processes.

I am truly excited about these advancements and how they will empower our users. Please check them out and let us know what you think.

Categories: ECM