AdoptOS

Assistance with Open Source adoption

ECM

Boosting Search

Liferay - Thu, 11/15/2018 - 17:43
Introduction

A client was recently moving off of Google Search Appliance (GSA) onto Liferay and Elasticsearch. One key aspect of GSA that they relied on, though, was KeyMatch.

What is KeyMatch? Well, in GSA an administrator can define a list of specific keywords and assign content to them. When a user performs a search that includes one of the specific keywords, the associated content is boosted to the top of the search results.

This way an admin can ensure that a specific piece of content can be promoted as a top result.

For example, say you run a bakery. During holidays, you have specially decorated cakes and cupcakes. You might create a KeyMatch for "cupcake" that points to your specialty cupcakes so that when users search, they see the specialty cupcakes ahead of your regular ones.

Elasticsearch Tuning

Elasticsearch, the heart of Liferay's search facilities, does not have KeyMatch support. In fact, it may often seem that there are few search result tuning capabilities at all, but this is not the case.

There are tuning opportunities for Elasticsearch, but it does take some effort to get the outcomes you're hoping for.

Tag Boosting

So one way to get a result similar to KeyMatch is to boost matches on tags.

In our bakery example above, all of our content related to cupcakes will, of course, appear in the search results for "cupcake", if only because the keyword is part of the content. Tagging content with "cupcake" would also get it to come up as a search result, but might not make it score high enough to stand out.

We could, however, use tag boosting so that a keyword match on a tag would push a match to the top of the search results.

So how do you implement a tag boost? Through a custom IndexerPostProcessor implementation.

Here's one that I whipped up that will boost tag matches by 100.0:

@Component(
    immediate = true,
    property = {
        "indexer.class.name=com.liferay.journal.model.JournalArticle",
        "indexer.class.name=com.liferay.document.library.kernel.model.DLFileEntry"
    },
    service = IndexerPostProcessor.class
)
public class TagBoostIndexerPostProcessor extends BaseIndexerPostProcessor implements IndexerPostProcessor {

    @Override
    public void postProcessFullQuery(BooleanQuery fullQuery, SearchContext searchContext) throws Exception {
        List<BooleanClause<Query>> clauses = fullQuery.clauses();

        if ((clauses == null) || (clauses.isEmpty())) {
            return;
        }

        for (BooleanClause<Query> clause : clauses) {
            updateBoost(clause.getClause());
        }
    }

    protected void updateBoost(final Query query) {
        if (query instanceof BooleanClauseImpl) {
            BooleanClauseImpl<Query> booleanClause = (BooleanClauseImpl<Query>) query;

            updateBoost(booleanClause.getClause());
        } else if (query instanceof BooleanQueryImpl) {
            BooleanQueryImpl booleanQuery = (BooleanQueryImpl) query;

            for (BooleanClause<Query> clause : booleanQuery.clauses()) {
                updateBoost(clause.getClause());
            }
        } else if (query instanceof WildcardQueryImpl) {
            WildcardQueryImpl wildcardQuery = (WildcardQueryImpl) query;

            if (wildcardQuery.getQueryTerm().getField().startsWith(Field.ASSET_TAG_NAMES)) {
                query.setBoost(100.0f);
            }
        } else if (query instanceof MatchQuery) {
            MatchQuery matchQuery = (MatchQuery) query;

            if (matchQuery.getField().startsWith(Field.ASSET_TAG_NAMES)) {
                query.setBoost(100.0f);
            }
        }
    }
}

So this is an IndexerPostProcessor implementation that is bound to all JournalArticles and DLFileEntries. When a search is performed, the postProcessFullQuery() method is invoked with the full query to be processed and the search context. The code above identifies all tag matches and increases their boost.

This implementation uses recursion because the passed in query is actually a tree; processing via recursion is an easy way to visit each node in the tree looking for matches on tag names.

When a match is found, the boost on the query is set to 100.0.

Using this implementation, if an article is tagged with "cupcake", a search for "cupcake" will cause the tagged articles to jump to the top of the search results.

Other Modification Ideas

This is an example of how you can modify the search before it is handed off to Elasticsearch for processing.

It can be used to remove query items, change query items, add query items, etc.

It can also be used to adjust the query filters to exclude items from search results.
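For example, a rough sketch of adding an extra optional clause inside an IndexerPostProcessor like the one above might look like this. The "promoted" field and the boost value are invented purely for illustration, and the exact portal-kernel query classes and method signatures should be verified against your Liferay version:

@Override
public void postProcessFullQuery(BooleanQuery fullQuery, SearchContext searchContext) throws Exception {
    // Hypothetical "promoted" flag field; the field name and values are illustrative only.
    MatchQuery promotedQuery = new MatchQuery("promoted", "true");

    // Raise the score of anything matching the promoted flag.
    promotedQuery.setBoost(50.0f);

    // SHOULD keeps the clause optional: it influences scoring without excluding results.
    fullQuery.add(promotedQuery, BooleanClauseOccur.SHOULD);
}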

Conclusion

The internals of the postProcessFullQuery() method and its arguments are not really documented, at least not in any detail that I could find, when it comes to adjusting query results.

Rather than reading through the code for how the query is built, when I was creating this override, I actually used a debugger to check the nodes of the tree to determine types, fields, etc.

I hope this gives you some ideas about how you, too, might adjust your search queries to manipulate the results into the ordering you're looking for.

David H Nebinger 2018-11-15T22:43:00Z
Categories: CMS, ECM

Liferay And...  Jackson

Liferay - Tue, 11/13/2018 - 18:27
Introduction

Often when discussing how to deal with dependencies, we're looking for ways to package our third-party jars into our custom modules.

There's good reason to do this. It ensures that our modules get a version of a third-party jar that we've tested with. It also eliminates ambiguity over where the dependency will come from, whether it is deployed and available, etc.

That said, there is another option that we don't really talk about much, even though it is still a viable one. Many third party jars are actually OSGi-ready and can be deployed as modules separately from your own custom modules.

The Jackson jars, for example, are actually OSGi modules on their own and can be deployed to Liferay just by dropping them into the deploy folder.

Dependencies Deployed as Modules

So why deploy a third party dependency jar as an OSGi module instead of just as an embedded jar?

Often it comes down to either a concern about class loaders or, less frequently, a (misguided) attempt to shrink overall module size. I say this is misguided because memory and disk are cheap, and the problems (to be discussed below) often are not worth it.

So what about the class loader concern? Well, when you are using a system that relies on class loaders to instantiate Java classes, such as Jackson marshaling JSON into Java objects decorated with Jackson annotations, class loader hierarchies and the normal boundaries between OSGi modules can make things a challenge when the annotations are used in different bundles.

For example, if you have module A and module B and both have POJOs decorated with Jackson annotations, you could run into issues with the annotations. When performing a package scan for classes decorated with the annotations, if the annotation is loaded by a different class loader it is effectively a different class and may not be visible during annotation processing.
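To make that point concrete, here's a small standalone illustration (not from the article; the jar path is hypothetical): the same annotation class loaded through two unrelated class loaders is, at runtime, two different classes, which is why annotation lookups done against one copy won't see the other.

import java.net.URL;
import java.net.URLClassLoader;

public class ClassLoaderIdentityDemo {

    public static void main(String[] args) throws Exception {
        // Hypothetical path to an embedded copy of jackson-annotations.
        URL jar = new URL("file:/path/to/jackson-annotations-2.9.6.jar");

        // Two loaders with no shared parent, simulating two modules that each
        // embed their own copy of the same jar.
        ClassLoader loaderA = new URLClassLoader(new URL[] {jar}, null);
        ClassLoader loaderB = new URLClassLoader(new URL[] {jar}, null);

        Class<?> fromA = Class.forName("com.fasterxml.jackson.annotation.JsonProperty", false, loaderA);
        Class<?> fromB = Class.forName("com.fasterxml.jackson.annotation.JsonProperty", false, loaderB);

        // Same bytes, but two different runtime classes.
        System.out.println(fromA == fromB); // prints false
    }
}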

If your package is deployed as a standalone module, though, then all bundles sharing the dependency will pull from the same module and therefore the same class loader.

A downside of this, though, is versioning. If you deploy both Jackson 2.9.3 and 2.9.7, there are two competing versions available, which can still lead to class loader issues when the different versions are used at the same time. If only a single version is deployed, you have the typical concern of all modules being stuck on an agreed-upon version.

Is My Dependency OSGi Ready?

So the first thing you'll need to know is whether your third party dependency jar is an OSGi module or not.

The most complicated way to find out is by opening up your jar with a zip tool to look at the contents. If the jar is a bundle, the META-INF/MANIFEST.MF file will contain the OSGi headers like Bundle-Name, Bundle-SymbolicName, Bundle-Version, etc. Additionally you may have OSGi-specific files in the META-INF folder for declarative services.
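If you'd rather script that manifest check than open the jar by hand, a quick sketch along these lines works (the jar file name is just an example):

import java.util.jar.Attributes;
import java.util.jar.JarFile;

public class BundleCheck {

    public static void main(String[] args) throws Exception {
        // Example jar name; point this at any jar you want to inspect.
        try (JarFile jar = new JarFile("jackson-core-2.9.6.jar")) {
            Attributes attributes = jar.getManifest().getMainAttributes();

            String symbolicName = attributes.getValue("Bundle-SymbolicName");
            String version = attributes.getValue("Bundle-Version");

            // OSGi-ready jars carry these headers; plain jars do not.
            System.out.println(
                (symbolicName == null) ? "Not an OSGi bundle" : symbolicName + " " + version);
        }
    }
}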

An easier way is just to use one of the Maven repo search tools. When looking for jackson-core 2.9.6 on mvnrepository.com, you come across a page like https://mvnrepository.com/artifact/com.fasterxml.jackson.core/jackson-core/2.9.6:

Under the Files section, the jar is shown as a bundle along with its size. This means it is an OSGi-ready bundle. When a jar is not OSGi-ready, the search tools will typically show it as just a jar.

This and the other Jackson jars are all marked as bundles, so I know I can deploy them as modules.

Deploying Jackson as Modules

So for a future "Liferay And..." blog post, I need Jackson as a module instead of just a dependency, so in this post we're going to focus on deploying Jackson as modules. Sure, this may not be necessary for all deployments or usages of Jackson, but it is for me.

Okay, so our test is going to be to build a couple of modules with some POJOs decorated with Jackson annotations and a module that will be marshaling to/from JSON. In order to do this, we need to have the following Jackson modules deployed:

  • compile group: 'com.fasterxml.jackson.core', name: 'jackson-core', version: '2.9.6'
  • compile group: 'com.fasterxml.jackson.core', name: 'jackson-annotations', version: '2.9.6'
  • compile group: 'com.fasterxml.jackson.core', name: 'jackson-databind', version: '2.9.6'
  • compile group: 'com.fasterxml.jackson.datatype', name: 'jackson-datatype-jdk8', version: '2.9.6'
  • compile group: 'com.fasterxml.jackson.datatype', name: 'jackson-datatype-jsr310', version: '2.9.6'
  • compile group: 'com.fasterxml.jackson.module', name: 'jackson-module-parameter-names', version: '2.9.6'

Download these bundle jars and drop them into the Liferay deploy folder while Liferay is running. The bundles will deploy into Liferay and you'll see the messages in the log that the bundles are deployed and started.
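If you want to double-check from the Gogo shell, the lb command accepts a name filter, so something like the following should list the deployed Jackson bundles and their states (exact output varies by environment):

lb jackson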

Building Test Modules

So in the referenced GitHub repo, we'll build out a Liferay workspace with four module projects:

  1. Animals - Defines the POJOs with Jackson annotations for defining different pet instances.
  2. Persons - Defines a POJO for a person to define their set of pets.
  3. Mappings - Services based upon using Jackson to marshal to/from JSON.
  4. Gogo-Commands - Provides some simple Gogo commands that we can use to test the modules w/o building out a portlet infrastructure.

The Github repo is: https://github.com/dnebing/liferay-and-jackson

The animals and persons modules are nothing fancy, but they do leverage the Jackson annotations from the deployed OSGi Jackson modules.

The mappings module uses the Jackson ObjectMapper to handle the marshaling. It is capable of processing classes from the other modules.
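As a rough sketch of that marshaling (the class shown here is illustrative and assumes the Cat POJO from the animals module; it is not necessarily the code in the repo):

import com.fasterxml.jackson.databind.ObjectMapper;

public class MappingSketch {

    private final ObjectMapper _objectMapper = new ObjectMapper();

    public String toJson(Cat cat) throws Exception {
        // Serialize the annotated POJO to a JSON string.
        return _objectMapper.writeValueAsString(cat);
    }

    public Cat toCat(String json) throws Exception {
        // Parse JSON back into a Cat instance, honoring the Jackson annotations.
        return _objectMapper.readValue(json, Cat.class);
    }
}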

The gogo module contains some simple gogo shell commands:

  • jackson:createCat - Creates a Cat instance and outputs the toString representation of it. Args are [name [breed [age [favorite treat]]]].
  • jackson:createDog - Creates a Dog instance and outputs the toString representation of it. Args are [name [breed [age [likes pigs ears]]]].
  • jackson:catJson - Like createCat, but outputs the JSON representation of the cat.
  • jackson:dogJson - Like createDog, but outputs the JSON representation of the dog.
  • jackson:cat - Parses the given JSON into a Cat object and outputs the toString representation of it. The only argument is the JSON.
  • jackson:dog - Parses the given JSON into a Dog object and outputs the toString representation of it. The only argument is the JSON.

For the cat and dog commands, to pass JSON as a single argument, enclose it in single quotes:

jackson:cat '{"type":"cat","name":"claire","age":6,"breed":"house","treat":"filets"}'
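For reference, here's a minimal sketch of how a Gogo command like jackson:cat can be registered as an OSGi component; the class name is invented and the actual implementation in the repo may differ:

import com.fasterxml.jackson.databind.ObjectMapper;

import org.osgi.service.component.annotations.Component;

@Component(
    immediate = true,
    property = {"osgi.command.scope=jackson", "osgi.command.function=cat"},
    service = Object.class
)
public class CatCommand {

    // Invoked from the Gogo shell as: jackson:cat '<json>'
    public void cat(String json) throws Exception {
        Cat cat = _objectMapper.readValue(json, Cat.class);

        System.out.println(cat);
    }

    private final ObjectMapper _objectMapper = new ObjectMapper();
}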

Conclusion

Seems like an odd place to stop, huh?

I mean, we have identified how to find OSGi-ready modules such as the Jackson modules, we have deployed them to Liferay, and we have built modules that depend upon them.

So why introduce Jackson like this and then stop? Well, it is just a preparatory blog for my next post, Liferay And... MongoDB. We'll be leveraging Jackson as part of that solution, so getting the Jackson deployment out of the way is a good starting point. See you in the next post!

David H Nebinger 2018-11-13T23:27:00Z
Categories: CMS, ECM

Get that old school page editing touch back again

Liferay - Tue, 11/13/2018 - 04:56

At this year's Unconference at DEVCON in Amsterdam, Victor Valle held a session about changing the look and feel of the portal administration: things like removing the default product menu and creating your own. I missed that discussion because so many other interesting discussions were going on, so I have no idea if someone brought this up. In this blog article, I'm not going to do anything "drastic" like removing a side menu. I just want to show you how easy it is to get something back that some of us have missed since version 6.2.

I've heard some people complain about the way the page editing works since version 7.0, and it seems like themelets are still pretty much an underrated or unknown addition to Liferay theming. I hope this blog will tackle both.

By default you'll have to hover over a portlet or widget to get the configuration options:

Nothing wrong with it. The page is displayed as anyone without editing rights would see it. There is no need to toggle the icon that shows/hides the controls. But for those who have to do a lot of page editing or widget configuration... every second counts. We just want to click on the ellipsis icon to configure a widget without having to hover over it first.

Some of us want this:

There are different ways to get this toggle controls feature back

After all, the controls icon did not disappear; it only becomes visible when your screen size is small enough. So it's all a matter of tweaking the styles.

Option 1: Portlet decorators

You can easily add your own portlet decorator and make it the default one in your theme. You just need to add the custom decorator in the  look-and-feel.xml of your theme:

<portlet-decorator id="show-controls" name="Show Controls">
    <default-portlet-decorator>true</default-portlet-decorator>
    <portlet-decorator-css-class>portlet-show-controls</portlet-decorator-css-class>
</portlet-decorator>

Then you will want to add some CSS to the theme so the toggle controls icon is displayed at all times. Toggling the icon will add a CSS class "controls-visible" to the body element. This is easy: just display the "portlet-topper" every time "controls-visible" is present.

Why this option is bad for this purpose: I believe portlet decorators are meant for styling purposes, i.e., how you want the user to see the widget when a certain decorator is selected. If you use portlet decorators for this purpose but also want to display widgets in different ways, with(out) borders or titles, you'll lose the controls as soon as you assign a different decorator to a widget. I bet you'll say: "Just put the extra styles on every decorator." So just forget I even brought this up.

Option 2: Add some css and js

We'll add the custom css to the theme:

/* old school portlet decorators, don't mind the shameless usage of !important */
.control-menu .toggle-controls {
    display: block !important;
}

.controls-visible.has-toggle-controls {
    .portlet-topper {
        display: -webkit-box !important;
        display: -moz-box !important;
        display: box !important;
        display: -webkit-flex !important;
        display: -moz-flex !important;
        display: -ms-flexbox !important;
        display: flex !important;
        position: relative;
        opacity: 1;
        transform: none !important;
    }

    section.portlet {
        border: 1px solid #8b8b8b;
        border-radius: 0.5rem;
    }
}

.controls-hidden {
    .portlet-topper {
        display: none !important;
    }
}

We're not there yet. We'll need to put a "has-toggle-controls" class on the body element because it seems "controls-visible" is there by default, even when you're not logged in. And in this case I want to display a border around the portlet when the controls are active. So I'll be setting my own css class like this inside main.js:

var toggleControls = document.querySelector('.toggle-controls');

if (toggleControls !== null) {
    document.body.classList.add('has-toggle-controls');
}

But what if we had lots of different themes. Are we really going to add the same code over and over again? And what if business decides to add a box-shadow effect after a few weeks?

Option 3: Themelets

A themelet is an extension of a theme. You can extend all your themes with the same themelet. When some style needs to change, you just edit the themelet and rebuild your themes. This will require you to use the liferay-theme-generator. But I bet everyone does by now, right? More information on how to build and use themelets.

You can find the themelet I wrote on github.

Please be sure your theme's _custom.scss imports the css from themelets:

/* inject:imports */
/* endinject */

And your portal_normal.ftl should contain:

<!-- inject:js -->
<!-- endinject -->

Conclusion

When changing the behaviour or design of aspects that are not design-related in the eyes of end users or guest users, I think it's always better to go with themelets. You'll rarely come across a portal with only one theme, so using themelets will make it much easier to maintain your themes.

Michael Adamczyk 2018-11-13T09:56:00Z
Categories: CMS, ECM

The reason we moved to 7zip bundles

Liferay - Mon, 11/12/2018 - 14:51

As some of you may have already discovered, 7.1 GA2 was released as a 7zip bundle instead of the typical zip bundle. This probably caused a ton of issues. Even our own dev tools are not yet equipped to handle 7z files, since all the recent events took up our time.

I will provide the reasons why we had to make this move, and hopefully everyone will come to the conclusion that this was the best decision, albeit one whose communication could have been handled significantly better.

The original goal that led to providing 7z bundles was to improve startup times. We discovered that if we prepopulated the OSGi state, we were able to reduce startup times by 2-3 times. As we began our testing, we found that our zip bundles were not preserving timestamps correctly: they were rounding timestamps to the nearest second, which invalidated our OSGi state. We also found that our bundles had grown to 1.2 gigabytes!

This improvement imposed two requirements:
  • maintain the original timestamps
  • handle a significantly increased number of duplicate files.

We began to look for solutions. Naturally, tar.gz was the first solution that came to mind. It would easily preserve the timestamps, but it did not solve the file size issue. While some people may find a large download acceptable, we did not believe that it would be appropriate for some of our use cases. As a result, someone suggested that we investigate 7zip, because 7zip will actually detect duplicate files and treat them as a single file during compression. This brought the file size down from 1.2 gigabytes to 400 megabytes. It was the perfect solution for us, and that is why we ultimately decided to use 7zip instead of zip.

Since our initial development, we have also fixed the duplicate file issue. This means that tar.gz is also viable as a solution (though the bundles are slightly larger, at 600 megabytes). From now on, we will be providing both 7zip and tar.gz bundles. Internally we will use 7zip because ultimately that 200-megabyte difference is still too significant for our use cases, but everyone else can decide what works best for them.

David Truong 2018-11-12T19:51:00Z
Categories: CMS, ECM

Liferay Portal 7.1 CE GA2 Release

Liferay - Mon, 11/05/2018 - 23:00
What's New Downloads

Download the release now at: https://www.liferay.com/downloads-community

New Features Summary

Web Experience:  Fragments allow a content author to create small reusable content pieces. Fragments can be edited in real time or can be exported and managed with the tooling of your choice. Use content pages from within a site to have complete control over the layout of your pages.  Navigation Menus let you create site navigation in new and interesting ways and have full control over the navigation's visual presentation.
 


Forms Experience: Forms can now have complex grid layouts, numeric fields, and file uploads. They now include new personalization rules that let you customize the default behavior of the form. Using the new Element Sets, form creators can now create groups of reusable components. Form fields can now be translated into any language using any locale and can also be easily duplicated.
 


Redesigned System Settings: System Settings have received a complete overhaul. Configuration options have been logically grouped together making it easier than ever before to find what's configurable. Several options that were located under Server Administration have also been moved to System Settings.
 


User Administration: The User account form has been completely redesigned. Each form section can now be saved independently of the others, minimizing the chance of losing changes. The new ScreensNavigationEntry lets developers add custom forms under user administration.

Improvements to Blogs and Forums: Blog readers can now unsubscribe from notifications via email.  Blog authors now have complete control over the friendly URL of the entry.  Estimated reading time can be enabled in System Settings and will be calculated based on the length of the entry.

Blogs also have a new cards ADT that can be selected from the application configuration.  Videos can now be added inline while writing a new entry from popular services such as YouTube, Vimeo, Facebook Video, and Twitch. Message Boards users can now attach as many files as they want by dragging and dropping them into a post. Message Boards has also had many visual updates.
 


Workflow Improvements: Workflow has received a complete UI overhaul. All workflow configuration is now consolidated under one area in the Control Panel. Workflow definitions are now versioned and previous versions can be restored. Workflow definitions can now be saved in draft form and published live when they are ready.

Infrastructure: Many improvements have been incorporated at the core platform level, including ElasticSearch 6.0 support and the inclusion of Tomcat 9.0. 

Documentation

Official documentation can be found on the Liferay Developer Network.  For information on upgrading, see the Upgrade Guide.

Bug Reporting

If you believe you have encountered a bug in the new release you can report your issue on issues.liferay.com, selecting the "7.1.0 CE GA2" release as the value for the "Affects Version/s" field.

Getting Support

Support is provided by our awesome community. Please visit our community website for more details on how you can receive support.

Liferay and its worldwide partner network also provides services, support, training, and consulting around its flagship enterprise offering, Liferay DXP.

Also note that customers on existing releases such as 6.2 and 7.0 continue to be professionally supported, and the documentation, source, and other ancillary data about these releases will remain in place.

Jamie Sammons 2018-11-06T04:00:00Z
Categories: CMS, ECM

Mitigating RichFaces 4.5.17.Final EOL Vulnerabilities

Liferay - Mon, 11/05/2018 - 18:16
Mitigating RichFaces 4.5.17.Final EOL Vulnerabilities

If you are using RichFaces, you should be aware that Code White has discovered some remote code execution vulnerabilities in the component library. Unfortunately, since RichFaces has reached end-of-life status, these vulnerabilities will not be fixed. Thankfully, there are two easy options to mitigate them:

  1. Migrate to Alberto Fernandez’s fork of RichFaces.

    Alberto has fixed the known security vulnerabilities and other issues with RichFaces, so you should be able to upgrade to his latest release with little trouble:

    <dependency>
        <groupId>com.github.albfernandez.richfaces</groupId>
        <artifactId>richfaces</artifactId>
        <version>4.6.5.ayg</version>
    </dependency>
  2. Disable resource serialization.

    RichFaces has a whitelist of classes that it will deserialize. By setting the whitelist to empty you can avoid this remote code execution vulnerability. Just add the following content to a file named src/main/resources/org/richfaces/resource/resource-serialization.properties in your Maven or Gradle project:

    # Disable resource serialization to disallow remote code execution:
    # CVE-2013-2165, RF-14310, CVE-2015-0279, RF-13977, and RF-14309.
    # See https://codewhitesec.blogspot.com/2018/05/poor-richfaces.html for more details.
    whitelist=

The Liferay Faces team has used the second mitigation method to protect our RichFaces demos and archetypes. We have released new versions of our RichFaces archetypes with the mitigation included. Please see the release notes for more details.

Kyle Joseph Stiemann 2018-11-05T23:16:00Z
Categories: CMS, ECM

For a more conspicuous SPA loading indicator

Liferay - Thu, 11/01/2018 - 17:15

// The french version of this article can be found here: Pour un indicateur de chargement SPA plus visible.

Since version 7.0 of Liferay, you have surely noticed the appearance of a thin loading bar at the top of the screen after most user actions.

This loading bar is part of the new SPA (Single Page Application) mode of Liferay, supported by the Senna.js framework.

Unfortunately, this bar is so inconspicuous that most users do not see it. In general, without visual feedback for their action, they repeat the action several times, which often lengthens the waiting time.

In the end, users are often unnecessarily frustrated just because this load indicator is not visible enough.

Fortunately, it's quite simple to fix this with a few lines of CSS in a custom theme, because this loading bar is just a single HTML tag on which a CSS class is dynamically applied.

<div class="lfr-spa-loading-bar"></div>

As a starting point, we can consider the superb loaders provided by Luke Haas in his project Single Element CSS Spinners. Just make some adaptations to get a CSS loader compatible with Liferay:

/* Reset properties used by the original loader */
.lfr-spa-loading .lfr-spa-loading-bar,
.lfr-spa-loading-bar {
    -moz-animation: none 0 ease 0 1 normal none running;
    -webkit-animation: none 0 ease 0 1 normal none running;
    -o-animation: none 0 ease 0 1 normal none running;
    -ms-animation: none 0 ease 0 1 normal none running;
    animation: none 0 ease 0 1 normal none running;
    display: block;
    -webkit-transform: none;
    -moz-transform: none;
    -ms-transform: none;
    -o-transform: none;
    transform: none;
    background: transparent;
    right: initial;
    bottom: initial;
}

/* Pure CSS loader from https://projects.lukehaas.me/css-loaders */
.lfr-spa-loading .lfr-spa-loading-bar,
.lfr-spa-loading .lfr-spa-loading-bar:after {
    border-radius: 50%;
    width: 10em;
    height: 10em;
    z-index: 1999999;
}

.lfr-spa-loading .lfr-spa-loading-bar {
    margin: 60px auto;
    font-size: 10px;
    text-indent: -9999em;
    border-top: 1.1em solid rgba(47, 164, 245, 0.2);
    border-right: 1.1em solid rgba(47, 164, 245, 0.2);
    border-bottom: 1.1em solid rgba(47, 164, 245, 0.2);
    border-left: 1.1em solid #2FA4F5;
    -webkit-transform: translateZ(0);
    -ms-transform: translateZ(0);
    transform: translateZ(0);
    -webkit-animation: load8 1.1s infinite linear;
    animation: load8 1.1s infinite linear;
}

@-webkit-keyframes load8 {
    0% {
        -webkit-transform: rotate(0deg);
        transform: rotate(0deg);
    }
    100% {
        -webkit-transform: rotate(360deg);
        transform: rotate(360deg);
    }
}

@keyframes load8 {
    0% {
        -webkit-transform: rotate(0deg);
        transform: rotate(0deg);
    }
    100% {
        -webkit-transform: rotate(360deg);
        transform: rotate(360deg);
    }
}

/* Positioning */
.lfr-spa-loading .lfr-spa-loading-bar {
    position: fixed;
    top: 50%;
    left: 50%;
    margin-top: -5em;
    margin-left: -5em;
}

Once the custom theme is applied, we get a clearly visible loader that no user can miss:

This snippet supports Liferay 7.0 and 7.1 and is also available as a gist.

If you also have tips to improve the UX of a portal Liferay, feel free to share them in the comments of this post or in a dedicated blog post.

Sébastien Le Marchand
Freelance Technical Consultant in Paris

Sébastien Le Marchand 2018-11-01T22:15:00Z
Categories: CMS, ECM

Pour un indicateur de chargement SPA plus visible

Liferay - Thu, 11/01/2018 - 17:00

// The english version of this article can be found here: For a more conspicuous SPA loader.

Depuis la version 7.0 de Liferay, vous avez surement remarqué l’apparition d’une fine barre de chargement en haut de l’écran en réponse à la plupart des actions utilisateur.

Cette barre de chargement intervient dans le cadre du nouveau fonctionnement SPA (Single Page Application) de Liferay, assuré par le framework Senna.js.

Malheureusement, cette barre est tellement discrète que la plupart des utilisateurs ne la voient pas. En général, ne constatant pas de retour visuel à leur action, ils réitèrent leur action plusieurs fois, ce qui souvent allonge encore le temps d’attente.

Au final les utilisateurs sont souvent inutilement frustrés juste parce-que cet indicateur de chargement n’est pas assez visible.

Heureusement, il est assez simple de remédier à cela avec quelques lignes de CSS dans un thème personnalisé, car cette barre de chargement est juste une unique balise HTML sur laquelle est appliquée dynamiquement une classe CSS.

<div class="lfr-spa-loading-bar"></div>

Comme point de départ, on pourra considérer les superbes loaders proposé par Luke Haas dans son projet Single Element CSS Spinners. Il suffit alors faire quelques adaptations pour obtenir un loader CSS compatible avec Liferay :

/* Reset properties used by the original loader */
.lfr-spa-loading .lfr-spa-loading-bar,
.lfr-spa-loading-bar {
    -moz-animation: none 0 ease 0 1 normal none running;
    -webkit-animation: none 0 ease 0 1 normal none running;
    -o-animation: none 0 ease 0 1 normal none running;
    -ms-animation: none 0 ease 0 1 normal none running;
    animation: none 0 ease 0 1 normal none running;
    display: block;
    -webkit-transform: none;
    -moz-transform: none;
    -ms-transform: none;
    -o-transform: none;
    transform: none;
    background: transparent;
    right: initial;
    bottom: initial;
}

/* Pure CSS loader from https://projects.lukehaas.me/css-loaders */
.lfr-spa-loading .lfr-spa-loading-bar,
.lfr-spa-loading .lfr-spa-loading-bar:after {
    border-radius: 50%;
    width: 10em;
    height: 10em;
    z-index: 1999999;
}

.lfr-spa-loading .lfr-spa-loading-bar {
    margin: 60px auto;
    font-size: 10px;
    text-indent: -9999em;
    border-top: 1.1em solid rgba(47, 164, 245, 0.2);
    border-right: 1.1em solid rgba(47, 164, 245, 0.2);
    border-bottom: 1.1em solid rgba(47, 164, 245, 0.2);
    border-left: 1.1em solid #2FA4F5;
    -webkit-transform: translateZ(0);
    -ms-transform: translateZ(0);
    transform: translateZ(0);
    -webkit-animation: load8 1.1s infinite linear;
    animation: load8 1.1s infinite linear;
}

@-webkit-keyframes load8 {
    0% {
        -webkit-transform: rotate(0deg);
        transform: rotate(0deg);
    }
    100% {
        -webkit-transform: rotate(360deg);
        transform: rotate(360deg);
    }
}

@keyframes load8 {
    0% {
        -webkit-transform: rotate(0deg);
        transform: rotate(0deg);
    }
    100% {
        -webkit-transform: rotate(360deg);
        transform: rotate(360deg);
    }
}

/* Positioning */
.lfr-spa-loading .lfr-spa-loading-bar {
    position: fixed;
    top: 50%;
    left: 50%;
    margin-top: -5em;
    margin-left: -5em;
}

Une fois le thème personnalisé appliqué on obtient un loader bien visible, qu’aucun utilisateur ne pourra plus manquer :

Ce snippet est compatible avec les versions 7.0 et 7.1 de Liferay et est également disponible sur gist.

Si vous aussi vous avez des astuces pour améliorer l’UX d’un portail Liferay, n’hésitez pas à les partager dans les commentaires de ce billet ou dans un billet dédié.

Sébastien Le Marchand
Consultant Technique indépendant à Paris

Sébastien Le Marchand 2018-11-01T22:00:00Z
Categories: CMS, ECM

Anonymize your custom entities and comply with GDPR the easy way!

Liferay - Wed, 10/31/2018 - 09:54

While helping my colleague Sergio Sanchez with his GDPR talk at the recent Spanish Symposium, I came across a hidden gem in Liferay 7.1. It turns out you can integrate custom Service Builder entities with Liferay's GDPR framework (UAD) to enable anonymization of personal data in your own entities.

I didn't know anything about this feature, but it's really easy to use and works like a charm (after solving a couple of "gotchas"!). Some of these gotchas have been identified in LPS tickets and will be mentioned in this article.

Let’s take a look at how it’s done!

Service Builder to the rescue!

The first step is to create a Service Builder project using the tool you like most (Blade, Liferay IDE...). This will create two projects, as usual, api and service, and here you'll find the first hiccup: Blade doesn't generate the service.xml file using the 7.1 DTD (it still uses 7.0), so the first thing we need to do is update the DTD to 7.1:

<!DOCTYPE service-builder PUBLIC "-//Liferay//DTD Service Builder 7.1.0//EN"
    "http://www.liferay.com/dtd/liferay-service-builder_7_1_0.dtd">

This issue is being tracked in LPS-86544.

The second "gotcha" is that when you update the DTD to version 7.1, Service Builder will generate code that won't compile or run with Liferay CE 7.1 GA1. To make it compile, you need to add a dependency on Petra and update the kernel to at least version 3.23.0 (the kernel version for Liferay CE 7.1 GA2, not yet released at the time of writing), but unfortunately it still won't run if you deploy it to a Liferay CE 7.1 GA1 instance.

Thanks to Minhchau Dang for pointing this out, Minhchau filed this bug in ticket LPS-86835.

I ended up using these dependencies in the build.gradle of the -service project (I'm using Liferay DXP FP2):

dependencies {
    compileOnly group: "biz.aQute.bnd", name: "biz.aQute.bndlib", version: "3.5.0"
    compileOnly group: "com.liferay", name: "com.liferay.portal.spring.extender", version: "2.0.0"
    compileOnly group: "com.liferay.portal", name: "com.liferay.portal.kernel", version: "3.26.0"
    compileOnly group: "com.liferay", name: "com.liferay.petra.string", version: "2.0.0"
}

Also note that throughout the example on GitHub I've used kernel version 3.26.0, not only in the -service project but everywhere.

What exactly do you want to anonymize?

Now your service and api projects should be ready to compile, so the next step is to include the necessary information in your service.xml to make your entities anonymizable.

The first things you need to include are two attributes at the entity level: uad-application-name and uad-package-path. uad-application-name is the name Liferay will use to show your application in the anonymization UI, and uad-package-path is the package Service Builder will use to create the UAD classes. (Pro tip: don't include "uad" in the package name as I did; SB will add it for you.)

In my example I used this entity:

<entity local-service="true" name="Promotion" remote-service="true" uuid="true"
    uad-application-name="Citytour" uad-package-path="com.liferay.symposium.citytour.uad">

Once you have specified those, you can start telling Service Builder how your entity’s data will be anonymized. For this, you can use two attributes at a field level: uad-anonymize-field-name and uad-nonanonymizable.

uad-anonymize-field-name, from the 7.1 service.xml DTD:

“The uad-anonymize-field-name value specifies the anonymous user field that should be used to replace this column's value during auto anonymization. For example, if "fullName" is specified, the anonymous user's full name will replace this column's value during auto anonymization. The uad-anonymize-field-name value should only be used with user name columns (e.g. "statusByUserName").”

For example, if we have a field defined like this:

<column name="userName" type="String" uad-anonymize-field-name="fullName"/>

That means that when the auto-anonymization process runs, it will replace the value of that field with the Anonymous User Full Name.

On the other hand, uad-nonanonymizable, again from the 7.1 DTD:

“The uad-nonanonymizable value specifies whether the column represents data associated with a specific user that should be reviewed by an administrator in the event of a GDPR compliance request. This implies the data cannot be anonymized automatically.”

This means exactly that: the field can't be auto-anonymized and needs manual review when deleting user data. That's only partially true because, even though the anonymization is not automatic, the admin user doesn't actually have to do anything except review the entities and click "anonymize" (provided the anonymization process for the entity is implemented, which we'll do later on).

I used these fields in my "Promotion" entity:

<!-- PK fields -->
<column name="promotionId" primary="true" type="long" />

<!-- Group instance -->
<column name="groupId" type="long" />

<!-- Audit fields -->
<column name="companyId" type="long" />
<column name="userId" type="long" />
<column name="userName" type="String" uad-anonymize-field-name="fullName" />
<column name="createDate" type="Date" />
<column name="modifiedDate" type="Date" />

<!-- Promotion Data -->
<column name="description" type="String" uad-nonanonymizable="true" />
<column name="price" type="double" uad-nonanonymizable="true" />
<column name="destinationCity" type="String" uad-nonanonymizable="true" />

<!-- Personal User Data -->
<column name="offereeFirstName" type="String" />
<column name="offereeLastName" type="String" />
<column name="offereeIdNumber" type="String" />
<column name="offereeTelephone" type="String" />

In my case, I'm telling the framework that the description, price, and destinationCity fields need manual review.

So, what does SB do with these two attributes? If we have an entity with a field marked with uad-anonymize-field-name, running buildService will create two new projects to hold the anonymization, display, and data export logic! Isn't Service Builder awesome?

Build Service Builder Services using buildService

Excellent, you're almost there! Now you're ready to run the buildService Gradle task, and you should see that SB has created two new projects for you: <entity>-uad and <entity>-uad-test. The <entity>-uad project (promotions-uad in my case) contains the custom anonymization and data export logic for your entity, and -uad-test contains, well, the test classes.

And now that we have both projects, we must fix another “gotcha”. If you run the build or deploy gradle tasks now on those projects, they will fail spectacularly. Why? Well, if you take a look at your projects, you’ll see that there’s no build.gradle file! That’s Service Builder being a little petty, but don’t worry, you can create a new one and include these dependencies (again, based on my system):

dependencies {
    compileOnly group: "com.liferay.portal", name: "com.liferay.portal.kernel", version: "3.26.0"
    compileOnly group: "com.liferay", name: "com.liferay.user.associated.data.api", version: "3.0.1"
    compileOnly group: "com.liferay", name: "com.liferay.petra.string", version: "2.0.0"
    compileOnly group: "org.osgi", name: "osgi.cmpn", version: "6.0.0"
    compileOnly project(":modules:promotions:promotions-api")
}

This bug is being tracked in LPS-86814

Now your project should compile without issue, let’s add our custom anonymization logic!

Customize to anonymize

In your -uad project you’ll find three packages (actually four if you count constants):

  • Anonymizer: Holds the custom anonymization logic for your entity.
  • Display: Holds the custom display logic for Liferay’s UAD UI.
  • Exporter: Holds the custom personal data export logic.

In each one, you’ll find a Base class and another class that extends the Base class. For the anonymizer package in my example, I have a BasePromotionUADAnonymizer class and a PromotionUADAnonymizer class. What we’ll do is use the concrete class (PromotionUADAnonymizer in my case) and override the autoAnonymize method. In this method, you tell the framework how to anonymize each field of your custom entity. In my example I did this:

@Component(immediate = true, service = UADAnonymizer.class)
public class PromotionUADAnonymizer extends BasePromotionUADAnonymizer {

    @Override
    public void autoAnonymize(Promotion promotion, long userId, User anonymousUser) throws PortalException {
        if (promotion.getUserId() == userId) {
            promotion.setUserId(anonymousUser.getUserId());
            promotion.setUserName(anonymousUser.getFullName());
            promotion.setOffereeFirstName(anonymousUser.getFirstName());
            promotion.setOffereeLastName(anonymousUser.getLastName());
            promotion.setOffereeIdNumber("XXXXXXXXX");
            promotion.setOffereeTelephone("XXX-XXX-XXX");
        }

        promotionLocalService.updatePromotion(promotion);
    }
}

I’m using the anonymousUser fields for my entities’ first and last name, and setting the ID and telephone fields to anonymous entries, but you can choose what fields you want to anonymize and how.

Done!

Good job, you’re done! Now your entities are integrated with Liferay’s UAD framework, and when deleting a user, you’ll see that it works like a charm. To see how UAD works from a user perspective, you can check out the Managing User Data chapter in the documentation.

I've uploaded this example to my GitHub repo, along with a web portlet to create Promotions. You just need to add the promotions-web portlet to a page, create some promotions with a user other than Test Test, and then delete that user; you should see that all entries created by that user now have anonymized fields.

Hope you like it: https://github.com/ibairuiz/sympo-demo

Ibai Ruiz 2018-10-31T14:54:00Z
Categories: CMS, ECM

LiferayPhotos: a fully functional app in 10 days using Liferay Screens. Final part: Development experience.

Liferay - Mon, 10/29/2018 - 12:06

In the previous posts, we talked about what we were going to do, how we were going to do it, and which tools we were going to use. We introduced you to Liferay Screens' screenlets and the reasons why we were going to use them.

Well, it's time to explain how we developed this application and how a junior developer, with only one year of experience in the mobile world and 2 weeks with Liferay, made this app in 10 days thanks to Liferay Screens and its screenlets. In this post, I will explain what I did and what my experience was like.

Introduction

I am going to detail as much as possible how we built the UserProfile screen, where users can see their photos next to their basic information, and the Feed screen, which shows a timeline of users' photos. Finally, I will show you how we implemented the button that allows uploading new images, either by taking a new photo with the camera or picking one from the gallery. I also want to explain what problems I encountered during development.

First of all, as a starting point, it's necessary to be familiar with creating iOS themes; all of this is explained in this tutorial. Themes are very useful because, as you will see, to build the app we only have to change the screenlets' UI, since the logic is already done.

UserProfile screen

For the UserProfile screen I’ve created a collection of new themes for the screenlets used in this screen. Basically, what I have done is to extend the default theme, which has the name of the screenlet but ending in View_default, e.g., UserPortraitView_default.

The new theme should be given the same name as the default theme, but with the suffix changed to the name we want to give it. For example, in UserProfile we modify the UserPortraitView to show the image inside a circle. For this, as I said before, we just need to copy the default theme and change the name, in my case to UserPortraitView_liferayphotos. Thus, the theme name will be liferayphotos, and you have to put this name into the screenlet's themeName attribute.

The code associated with the view, created in an analogous way, is where I have coded how the image will be displayed; in our case, as I said above, it will appear inside a circle shape.

import Foundation
import LiferayScreens

class UserPortraitView_liferayphotos: UserPortraitView_default {

    override func onShow() {
        super.onShow()

        self.portraitImage?.layer.cornerRadius = self.portraitImage!.frame.size.width / 2
        self.portraitImage?.clipsToBounds = true
        self.borderWidth = 0.0
        self.editButton?.layer.cornerRadius = self.editButton!.frame.size.width / 2
        self.screenlet?.backgroundColor = .clear
    }
}

Just below the UserPortraitScreenlet I have added the username and the user's email. Both are simple text labels.

Behind the user's basic information, I've placed an image using an ImageDisplayScreenlet. This lets you render an image from Documents and Media; pretty straightforward.

Now, I am going to explain the image gallery. Maybe this is the trickiest part of this screen, but with screenlets nothing is difficult :)

To do this I've used an ImageGalleryScreenlet, but, as I did with the UserPortraitScreenlet, I have created a custom theme to give it a more appropriate style. This style consists of 3 columns with a small space between columns and rows.

First of all, I had to specify the number of columns because the default number didn’t fit our design. It’s just one variable called columnNumber:

override var columnNumber: Int {
    get {
        return 3
    }
    set {}
}

The next step was to specify the layout to use in the gallery. I needed to calculate the item size and the space between items. All of this is done in the doCreateLayout() function.

override func doCreateLayout() -> UICollectionViewLayout {
    let layout = UICollectionViewFlowLayout()
    let cellWidth = UIScreen.main.bounds.width / CGFloat(columnNumber) - spacing

    layout.sectionInset = UIEdgeInsets(top: 0, left: 0, bottom: 0, right: 0)
    layout.itemSize = CGSize(width: cellWidth, height: cellWidth)
    layout.minimumInteritemSpacing = spacing
    layout.minimumLineSpacing = spacing

    return layout
}

And that's all for this screen. With three screenlets I have made a fully functional user profile.

Feed screen

Honestly, I thought the Feed screen would be the most complicated one. In fact, the whole screen is just one big screenlet: the ImageGalleryScreenlet. But how? Well, I have to show all the images, so it's an image gallery, and the rest of the information is added to the cell of each gallery item.

As we mentioned in the previous post, in the cell I have two screenlets: asset display screenlet and rating screenlet.

First of all, I have created a theme for the image gallery screenlet. In this theme I've specified that I only need one column, as I did in the previous screen; then I give the screenlet the item size and pass it the custom cell name.

Within the custom cell, I had to build my own cell with the screenlets mentioned above. For the asset display screenlet, I've created a simple view with the user portrait screenlet and a label with the username.

For the rating screenlet, I’ve used another custom view with a new image (a heart) and with the number of likes of the image.

The rest of the cell consists of one image and some labels to give the aspect that we wanted.

As we can see, both screens were made in the same way: creating themes to give a more appropriate style and nothing else. All the logic is handled by the screenlets, so I didn't have to worry about it.

Other screens

The other screens were far easier than the ones mentioned here because their views were less complex, and I just had to tweak the style of the screenlets a little. For example, the sign-up screenlet and the login screenlet follow the same style, as we can see in the pictures below.

                                                     

Final result

In the next video you can see the result of the application.

Conclusion

As we can see throughout this post, the principal and more complex logic of the application is already provided by the screenlets. I only had to focus on customizing the screenlets by creating new themes and nothing else. The new themes depend on the target platform, of course, so we needed iOS knowledge in this case.

Screenlets are a powerful tool for creating native applications that integrate with Liferay, and they are very easy to use.

My humble recommendation is that if you have never used screenlet technology, create a little app as a proof of concept to familiarize yourself with it. This will only take a couple of hours and will help you gain knowledge about screenlets and see how powerful they are.

With this little knowledge and some experience (not much) in the mobile platform, you will be ready to develop an application like the one mentioned in this blog, because all screenlets are used in a similar way.

Related links

Github repository: https://github.com/LuismiBarcos/LiferayPhotos

Luis Miguel Barco 2018-10-29T17:06:00Z
Categories: CMS, ECM

Binding Java and Swift libraries for Xamarin projects

Liferay - Mon, 10/29/2018 - 09:14

Previously, we talked about how the market is really fragmented right now because no single hybrid framework has become dominant. We kept an eye on some of the most popular frameworks like React Native, NativeScript, Ionic 2 and Xamarin. We did a prototype with NativeScript and it worked fine, but we think that the use of native libraries with this technology isn't mature enough. If you want to know how these frameworks work underneath and their pros and cons, read this article.

Since I started with hybrid frameworks, I have gained another point of view on how other developers work with other technologies and languages, and also on how I feel working outside my comfort zone. We usually work with Java for Android and Swift for iOS, and this year we built a hybrid solution for our framework using JavaScript and CSS. But we had never tried a cross-compile framework, so we decided to give Xamarin a chance!

“We want to help people develop their apps easier and faster. Liferay Screens is our solution. Since 2017, you don’t have to choose between native and hybrid, you can use both at the same time!”

When we met Xamarin

The team decided to implement a prototype in Xamarin, but we didn't know this framework very well. So, first of all, we had to find out what language we had to use, how it worked, and how we could use Liferay Screens through the native part so we didn't have to code everything again.

Official documentation

Xamarin has a lot of documentation, and you have to read it at least twice, if not more, just to connect all the dots and realise what you're doing. We needed to know a few things before starting.

When we took our first steps and wrote some C# code, we were prepared to start our binding library, but... surprise, surprise!

Officially, Xamarin does not support binding Swift libraries.

And now, what?

Ok, that’s awful, but it’s not the end of the world. Let’s investigate!

We talked with the community about binding Swift libraries, and everybody seemed to be very lost; you would think that no one had done this before (some people had with Objective-C, sure, but not with Swift). I kept investigating and found a post on StackOverflow related to what I wanted to do. This was really exciting, and I followed it step by step to get my C# library. The post has since moved to Medium: https://medium.com/@Flash3001/binding-swift-libraries-xamarin-ios-ff32adbc7c76

First binding draft

When we had our first draft of the library, with only two or three classes, we decided to test it. We created a new Xamarin project, installed the NuGet package and... it failed.

At this point, we read the documentation about error codes and binding troubleshooting, and added that information to what we had already gathered.

Putting everything together

At that moment, we had everything we really needed to implement our library. So, we started the whole implementation:

  1. Create the .aar (for Java binding) and .framework (for Swift binding) files from native code.
  2. Create an empty binding library for Android in Visual Studio and add the .aar. Mono framework will create the C# classes for us (as many as it can).
  3. Create a fat library for iOS. It will contain x86 and arm architectures.
  4. Use Objective Sharpie (sharpie bind) to autogenerate C# classes for iOS. We can create them from a Pod or from a .h file. This command will create two files: ApiDefinitions.cs and StructsAndEnums.cs.
  5. Create an empty binding library for iOS in Visual Studio and add the files we created on step 4.
  6. Rebuild and check everything is ok. If there are some errors, we can check them directly in Visual Studio or in the documentation.

We worked hard and finally created a binding library from our native one. This library is now available on the NuGet website.

This was really amazing because we didn't know anything about C#, Xamarin, or binding a library, and here we are, with our library released and ready to be used. We finally reached our goal and created a binding library without reimplementing everything. It was hard and took almost 4 months, but now everyone can use it, and that is what we love most!

Binding troubleshooting

The steps to create a binding library are annoying and it’s very easy to get angry and not understand what’s going on, what’s the meaning of an error, why it’s not working and so on.

  1. Do not despair and take a deep breath.
  2. It's necessary to know and use the "Xamarin steps" (that's what our team named them!). Those steps are: clean the project several times, close and reopen the project, close and reopen the simulator, uninstall the app from the simulator, and, most importantly, delete the obj and bin folders.
  3. If Xamarin.iOS crashes unexpectedly without any error messages in the console, check the log file: on macOS, do this via the Console app; on Windows, use the Event Viewer. In the Console app, you must click User Reports and then look for your app's name. Note that there may be more than one log file.
  4. At the moment, it's necessary to add Swift runtime NuGet packages (available for Swift 3 and Swift 4) to our Xamarin project. This will increase the size of the final app.

Sample projects

It may also help to investigate the Xamarin.Android and Xamarin.iOS sample apps developed by Liferay. Both are good examples of how to use the Screenlets, Views (Android), and Themes (iOS).

If you want to read more about Liferay Screens for Xamarin, you can read our official documentation: https://dev.liferay.com/es/develop/tutorials/-/knowledge_base/7-0/using-xamarin-with-liferay-screens

If you have any questions about how to install it or how to use it, just post it in our forums. We will gladly help you. And if you already have a binding library, share your experience with others.

Sarai Diaz 2018-10-29T14:14:00Z
Categories: CMS, ECM

First anniversary of the Liferay Spain User Group

Liferay - Sat, 10/27/2018 - 05:54

//The spanish version of this article can be found here: Primer aniversario de la comunidad Liferay en España, otra vez.

One year ago, on October 25, 2017, the Liferay Community in Spain started up again. During this year, we have made important achievements: we have gathered a great group at all the events and we have held meetups regularly (usually the last Wednesday of every month). In this year alone, the community has held 9 meetups, counting presentations and round tables, covering various topics.

When you compare the Liferay Symposium Spain 2018 agenda with the meetup topics, it's clear that our Community had the chance to see many of the sessions first, exclusively. At this point we should especially thank the work and support of the Liferay team, and in particular my fellow co-organizers of the Liferay Spain User Group: Eduardo García, Javier Gamarra and David Gómez, who have managed to prepare sessions and bring in experts for each of the areas.

Since October marked one year since the user group resumed its activity, and taking advantage of the Liferay Symposium Spain happening in Madrid, we scheduled our monthly meetup on the same date to celebrate that first anniversary. Thus, the Liferay team included the meetup in the official agenda of the symposium as part of the celebration.

Fig. Screenshot of the Liferay Events App showing the LSUG meetup scheduled as part of the Symposium agenda.

 

In this last meetup, we reviewed what we have achieved and the main objectives for next year, highlighting:

  • Holding meetups in other locations, and even getting to create local groups (some attendees showed interest from Asturias and Seville)
  • Encouraging members to give talks
  • Planning other types of talks, more focused on different user levels, (introductory sessions, workshops, etc).
  • Using the forums again; they are very useful for the Community, but they had lost importance.
     

To commemorate the anniversary, the Liferay team gave away T-shirts to Community members.

   

We have also gained visibility from Liferay. Here are two notable examples: the first during the closing keynote of Liferay Symposium 2018, and the second with our honorary member.

    

If you want to join the Liferay Community, just join the Liferay Spain User Group on Meetup and sign in to the Liferay Community Slack, where we have a dedicated #lug-spain channel. If you are reading this from outside Spain, you can probably find a local Liferay User Group near you or, if there isn't one, you can learn how to create a new LUG in your area.
 

Álvaro Saugar 2018-10-27T10:54:00Z
Categories: CMS, ECM

Primer año de la Comunidad Liferay en España, otra vez

Liferay - Sat, 10/27/2018 - 03:36

//The English version of this article can be found here: First anniversary of the Liferay Spain User Group

October 25 marked one year since the start of the new stage of the Liferay Community in Spain. Over this year we have achieved important milestones, such as gathering a steady group at every event and holding meetups regularly throughout the year. In total, between presentations and round tables, nine meetups have been held, covering a wide range of topics.

Comparing the talks we hosted with the Liferay Symposium 2018 agenda, you can see that the community got an early look at many of the topics presented there. Here we must especially thank the Liferay team for their work and support, and in particular those involved in the community: Eduardo García, Javier Gamarra and David Gómez managed to prepare the sessions and bring in the people responsible for each of the areas covered.

Since this October marked one year since we resumed the user group's activity, we took advantage of the Liferay Symposium to bring the celebration forward and make it coincide with our meetup, which was even included in the symposium agenda.

At this latest meetup, we reviewed what has been done and the main objectives for next year, among which we can highlight:

  • Meetups in other locations, and even helping to create local groups (attendees showed interest in Asturias and Seville)
  • Getting members to give talks
  • Other types of talks, more user-focused: introductory sessions, workshops, etc.
  • The need to promote the forums again, since they had lost importance.

To commemorate the anniversary, the Liferay team gave away T-shirts to community members.

We have also gained visibility from Liferay, as can be seen in the closing session of the symposium and with our honorary member.

If you want to join the Liferay community in Spain, you can do so through the Meetup group and the community Slack, where we have a dedicated #lug-spain channel. If you're not reading this from Spain, you may be able to find a Liferay User Group near you, and if there isn't one, you can always start a LUG in your area.

Álvaro Saugar 2018-10-27T08:36:00Z
Categories: CMS, ECM

Customizing Liferay Navigation menus

Liferay - Fri, 10/26/2018 - 07:47

In Liferay 7.1 we introduced a new approach to navigation. The new navigation doesn't depend on the page tree: you can create multiple navigation menus, compose them from different types of elements, and assign them different functions. Out of the box, we provide three types of menu elements: Pages, Submenus, and URLs. But we don't want to limit users to these three types; any developer can create their own navigation menu element type by implementing a simple interface, and in this article I want to explain how to do that.
As the idea for this example, I want to implement a "call back request" feature. Most modern CMSs have plugins that add a "call back request" button and process the user's input in a specific way. The goal of this article is to provide more or less the same functionality (simplified, of course) for Liferay navigation menus.
Let's create a module with the following structure:

The key parts here are the CallbackSiteNavigationMenuItemType class, which implements the SiteNavigationMenuItemType interface, and the edit_callback.jsp view, which provides the UI for adding/editing navigation menu items of this type.
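As a rough orientation before looking at the methods, such a type implementation is registered as an OSGi component. The sketch below is illustrative only: the "callback" type key and the service property shown here are assumptions, not taken from the sample project.

@Component(
    immediate = true,
    // Assumed type key and standard service property; adjust to your project.
    property = "site.navigation.menu.item.type=callback",
    service = SiteNavigationMenuItemType.class
)
public class CallbackSiteNavigationMenuItemType
    implements SiteNavigationMenuItemType {

    // getRegularURL, getTitle, renderAddPage and renderEditPage are discussed below.
}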

CallbackSiteNavigationMenuItemType has four main methods (getRegularURL, getTitle, renderAddPage, and renderEditPage); the first defines the URL that the menu item leads to. In the getRegularURL method, we create a PortletURL that performs a notification action using the userId stored in the SiteNavigationMenuItem properties and the number to call back, taken from the result of a JavaScript prompt:

@Override
public String getRegularURL(
    HttpServletRequest request, SiteNavigationMenuItem siteNavigationMenuItem) {

    UnicodeProperties properties = new UnicodeProperties(true);

    properties.fastLoad(siteNavigationMenuItem.getTypeSettings());

    PortletURL portletURL = PortletURLFactoryUtil.create(
        request, CallbackPortletKeys.CALLBACK, PortletRequest.ACTION_PHASE);

    portletURL.setParameter(ActionRequest.ACTION_NAME, "/callback/notify_user");
    portletURL.setParameter("userId", properties.getProperty("userId"));

    String numberParamName =
        _portal.getPortletNamespace(CallbackPortletKeys.CALLBACK) + "number";

    StringBundler sb = new StringBundler(7);

    sb.append("javascript: var number = prompt('");
    sb.append("Please specify a number to call back','');");
    sb.append("var url = Liferay.Util.addParams({");
    sb.append(numberParamName);
    sb.append(": number}, '");
    sb.append(portletURL.toString());
    sb.append("'); submitForm(document.hrefFm, url);");

    return sb.toString();
}

The getTitle method defines the title shown to the user in the menu; by default it uses the current SiteNavigationMenuItem's name.
In the renderAddPage and renderEditPage methods, we use JSPRenderer to show the appropriate JSP to the user when adding or editing a navigation menu item of this new type.

Our view is also pretty simple:

<%@ include file="/init.jsp" %>

<%
SiteNavigationMenuItem siteNavigationMenuItem =
    (SiteNavigationMenuItem)request.getAttribute(
        SiteNavigationWebKeys.SITE_NAVIGATION_MENU_ITEM);

String name = StringPool.BLANK;
User user = null;

if (siteNavigationMenuItem != null) {
    UnicodeProperties typeSettingsProperties = new UnicodeProperties();

    typeSettingsProperties.fastLoad(siteNavigationMenuItem.getTypeSettings());

    name = typeSettingsProperties.get("name");

    long userId = GetterUtil.getLong(typeSettingsProperties.get("userId"));

    user = UserLocalServiceUtil.getUser(userId);
}

CallbackDisplayContext callbackDisplayContext = new CallbackDisplayContext(
    renderRequest, renderResponse);
%>

<aui:input label="name" maxlength='<%= ModelHintsUtil.getMaxLength(SiteNavigationMenuItem.class.getName(), "name") %>' name="TypeSettingsProperties--name--" placeholder="name" value="<%= name %>">
    <aui:validator name="required" />
</aui:input>

<aui:input name="TypeSettingsProperties--userId--" type="hidden" value="<%= (user != null) ? user.getUserId() : 0 %>" />

<aui:input disabled="<%= true %>" label="User to notify" name="TypeSettingsProperties--userName--" placeholder="user-name" value="<%= (user != null) ? user.getFullName() : StringPool.BLANK %>" />

<aui:button id="chooseUser" value="choose" />

<aui:script use="aui-base,liferay-item-selector-dialog">
    A.one('#<portlet:namespace/>chooseUser').on(
        'click',
        function(event) {
            var itemSelectorDialog = new A.LiferayItemSelectorDialog(
                {
                    eventName: '<%= callbackDisplayContext.getEventName() %>',
                    on: {
                        selectedItemChange: function(event) {
                            if (event.newVal && event.newVal.length > 0) {
                                var user = event.newVal[0];

                                A.one('#<portlet:namespace/>userId').val(user.id);
                                A.one('#<portlet:namespace/>userName').val(user.name);
                            }
                        }
                    },
                    'strings.add': 'Done',
                    title: 'Choose User',
                    url: '<%= callbackDisplayContext.getItemSelectorURL() %>'
                }
            );

            itemSelectorDialog.open();
        });
</aui:script>

We use TypeSettingsProperties-prefixed parameter names to save all necessary fields in the SiteNavigationMenuItem instance properties. We need to save two fields: the name of the navigation menu item, which is used as its title in the menu, and the ID of the user we want to notify about a callback request; in practice this could be a sales manager, account executive, or whoever is responsible for this type of request. The JSP uses the Liferay Item Selector to select the user, and the final UI looks like this:

The remaining functionality in this module is notifications. We want to send a notification both by email and through Liferay's internal notification system. To do that we need two more components: the MVCActionCommand we referenced in our getRegularURL method, and a notification handler that allows delivering this type of notification. NotifyUserMVCActionCommand has no magic: it accepts the userId and number parameters from the request and sends a notification to the user using the SubscriptionSender component:

@Component(
    immediate = true,
    property = {
        "javax.portlet.name=" + CallbackPortletKeys.CALLBACK,
        "mvc.command.name=/callback/notify_user"
    },
    service = MVCActionCommand.class
)
public class NotifyUserMVCActionCommand extends BaseMVCActionCommand {

    @Override
    protected void doProcessAction(
            ActionRequest actionRequest, ActionResponse actionResponse)
        throws Exception {

        String number = ParamUtil.getString(actionRequest, "number");
        long userId = ParamUtil.getLong(actionRequest, "userId");

        SubscriptionSender subscriptionSender = _getSubscriptionSender(
            number, userId);

        subscriptionSender.flushNotificationsAsync();
    }

    private SubscriptionSender _getSubscriptionSender(
            String number, long userId)
        throws Exception {

        User user = _userLocalService.getUser(userId);

        SubscriptionSender subscriptionSender = new SubscriptionSender();

        subscriptionSender.setCompanyId(user.getCompanyId());
        subscriptionSender.setSubject("Call back request");
        subscriptionSender.setBody("Call back request: " + number);
        subscriptionSender.setMailId(StringUtil.randomId());
        subscriptionSender.setPortletId(CallbackPortletKeys.CALLBACK);
        subscriptionSender.setEntryTitle("Call back request: " + number);
        subscriptionSender.addRuntimeSubscribers(
            user.getEmailAddress(), user.getFullName());

        return subscriptionSender;
    }

    @Reference
    private UserLocalService _userLocalService;

}

To make it possible to send notifications from this particular portlet, we implement a UserNotificationHandler that allows delivery and defines the notification body:

@Component( immediate = true, property = "javax.portlet.name=" + CallbackPortletKeys.CALLBACK, service = UserNotificationHandler.class ) public class CallbackUserNotificationHandler extends BaseModelUserNotificationHandler { public CallbackUserNotificationHandler() { setPortletId(CallbackPortletKeys.CALLBACK); } @Override public boolean isDeliver( long userId, long classNameId, int notificationType, int deliveryType, ServiceContext serviceContext) throws PortalException { return true; } @Override protected String getBody( UserNotificationEvent userNotificationEvent, ServiceContext serviceContext) throws Exception { JSONObject jsonObject = JSONFactoryUtil.createJSONObject( userNotificationEvent.getPayload()); return jsonObject.getString("entryTitle"); } }

After deploying this module, we can add an element of our new type to a navigation menu. Unfortunately, the only application display template for navigation menus that supports JavaScript in the URLs is List, so in order to try our callback request element type we have to configure the Navigation Menu widget to use the List template. Clicking the new menu item, we can type a number and request our callback.

Keep in mind that a user cannot send a notification to themselves, so it is necessary to log out and perform this action as a guest or with another account. In the My Account section we can see the list of notifications:

Full code of this example is available here.

Hope it helps! If you need any help implementing your own navigation menu item types, feel free to ask in the comments.

Pavel Savinov 2018-10-26T12:47:00Z
Categories: CMS, ECM

Take Advantage of Elastic Stack to monitor Liferay DXP

Liferay - Tue, 10/23/2018 - 07:49

As many of you probably know, starting with Liferay DXP, Elasticsearch is the default Search Engine. In fact, by default, an Elasticsearch instance is embedded in Liferay DXP (it’s a good moment to remind everyone that this is not supported for Production environments, where a separate Elasticsearch instance must be created).

In this article we’re going to outline how to use the Elastic Stack in order to monitor Liferay DXP.

My first idea for this article was to write a step-by-step guide, but I decided against it: it would require very detailed configurations that would end up slightly different on each installation, and anyone who uses the Elastic Stack should gain a minimum level of knowledge in order to monitor the things that matter in their particular situation, so the learning process itself is beneficial.

 

So what’s the Elastic Stack?

The Elastic Stack is a set of products created by Elastic. It's also known as ELK, after Elasticsearch, Logstash and Kibana:

  • Elasticsearch is the index engine where information is stored.

  • Logstash is a program that sends information to Elasticsearch.

  • Kibana is a server where we can create dashboards and visualizations from the information that Logstash stores in Elasticsearch.

In this example we are not going to alter or consume the indexes that Liferay DXP creates in Elasticsearch; we are going to create our own indexes just for monitoring purposes.

 

Setting up an Elastic Stack environment.

Kibana is a central service of the ELK stack, so for practical purposes we chose to install it on the same server as Elasticsearch. Also, because Logstash has to access the Liferay logs, we installed a Logstash service on each Liferay node. This approach is not mandatory: you can install Kibana on a different server from Elasticsearch and point it to the right address. The following examples are based on a two-node Liferay cluster (node1 and node2).

 

Collecting data with Logstash:

The first step in getting the ELK stack to work is to have Logstash collect data from the Liferay logs and send it to Elasticsearch. We can create different Logstash pipelines to define which data it should ingest and how to store it in Elasticsearch. Each pipeline is a configuration file where you define three sections:

  • input → how the data is going to be collected; here we specify the Logstash plugin we are going to use and set its parameters.
  • filter → where we can alter the data the plugin has collected before sending it to Elasticsearch.
  • output → indicates the endpoint where Elasticsearch lives and the name of the index where the data is going to be stored (if it doesn't exist, it'll be created automatically).

We are going to focus on two different pipelines in this article; general information about how to configure pipelines is available here.

 

Liferay logs pipeline:

This pipeline collects the information from the Liferay logs using the file input plugin. The plugin reads the logs (specified in the input phase) and parses each line according to some conditions. In our case that parsing happens in the filter phase, where we tokenize the log message to extract the log level, Java class, timestamp, and the rest of the data we want from each log line. Finally, in the output phase we choose which index to store the extracted data in:

input {
  file {
    path => "/opt/liferay/logs/liferay*.log"
    start_position => beginning
    ignore_older => 0
    type => "liferaylog"
    sincedb_path => "/dev/null"
    codec => multiline {
      pattern => "^%{TIMESTAMP_ISO8601}"
      negate => true
      what => previous
    }
  }
}
filter {
  if [type] == "liferaylog" {
    grok {
      match => { "message" => "%{TIMESTAMP_ISO8601:realtime}\s*%{LOGLEVEL:loglevel}\s*\[%{DATA:thread}\]\[%{NOTSPACE:javaclass}:%{DATA:linenumber}\]\s*%{GREEDYDATA:logmessage}" }
      tag_on_failure => ["error_message_not_parsed"]
      add_field => { "hostname" => "LIFERAY_SERVER_HOSTNAME" }
    }
    date {
      match => [ "realtime", "ISO8601" ]
      timezone => "UTC"
      remove_field => ["realtime"]
    }
  }
}
output {
  if [type] == "liferaylog" {
    elasticsearch {
      hosts => ["ELASTIC_SERVER_HOSTNAME:9200"]
      index => "logstash-liferay-log-node1-%{+YYYY.MM.dd}"
    }
    #stdout { codec => rubydebug }
  }
}

 

JVM statistics pipeline:

The previous plugin is installed by default when we install Logstash, but the plugin we'll use to take JVM statistics (logstash-input-jmx) has to be installed manually.
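Assuming a standard Logstash installation layout, that manual installation is typically a one-liner with the logstash-plugin tool (the path is illustrative and may differ on your system):

# Run from the Logstash installation directory.
bin/logstash-plugin install logstash-input-jmx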

When using this plugin, we should tell Elasticsearch that the values this plugin sends to a particular index have a decimal format; otherwise Elasticsearch will infer the type from the first value it receives, which could be interpreted as a long.

To configure this we can run a simple curl command. We'll make an HTTP call to Elasticsearch with some JSON data telling it that all indexes whose names begin with "logstash-jmx-node" should treat the metric values as doubles. We only need to do this once; afterwards Elasticsearch will know how to deal with our JMX data:

curl -H "Content-Type: application/json" -XPUT ELASTIC_SERVER_HOSTNAME:9200/_template/template-logstash-jmx-node* -d ' { "template" : "logstash-jmx-node*", "settings" : { "number_of_shards" : 1 }, "mappings": { "doc": { "properties": { "metric_value_number" : { "type" : "double" } } } } }'

Before using this plugin, we will also need to enable the JMX connection in the Liferay JVM. For example, if running on Tomcat, we can add this to setenv.sh:

CATALINA_OPTS="$CATALINA_OPTS -Djava.rmi.server.hostname=LIFERAY_SERVER_HOSTNAME -Dcom.sun.management.jmxremote.port=5000 -Dcom.sun.management.jmxremote -Dcom.sun.management.jmxremote.authenticate=false -Dcom.sun.management.jmxremote.ssl=false"

To set up this plugin we need to create two different configuration files: the usual Logstash pipeline configuration, and a JMX configuration file whose location is specified in the Logstash pipeline's jmx input:

  • Logstash pipeline: here we specify where the jmx configuration file (/opt/logstash/jmx) is and which index we are going to use.
input {
  jmx {
    path => "/opt/logstash/jmx"
    polling_frequency => 30
    type => "jmx"
    nb_thread => 3
  }
}
filter {
  mutate {
    add_field => { "node" => "node1" }
  }
}
output {
  if [type] == "jmx" {
    elasticsearch {
      hosts => [ "ELASTIC_SERVER_HOSTNAME:9200" ]
      index => "logstash-jmx-node1-%{+YYYY.MM.dd}"
    }
  }
}
  • The JMX Configuration file:  where we decide which JMX statistics we want to collect and send to the index:
{ "host" : "localhost", "port" : 5000, "alias" : "reddit.jmx.elasticsearch", "queries" : [ { "object_name" : "java.lang:type=Memory", "object_alias" : "Memory" }, { "object_name" : "java.lang:type=Threading", "object_alias" : "Threading" }, { "object_name" : "java.lang:type=Runtime", "attributes" : [ "Uptime", "StartTime" ], "object_alias" : "Runtime" },{ "object_name" : "java.lang:type=GarbageCollector,name=ParNew", "object_alias" : "ParNew" },{ "object_name" : "java.lang:type=GarbageCollector,name=ConcurrentMarkSweep", "object_alias" : "MarkSweep" },{ "object_name" : "java.lang:type=OperatingSystem", "object_alias" : "OperatingSystem" },{ "object_name" : "com.zaxxer.hikari:type=Pool (HikariPool-1)", "object_alias" : "Hikari1" },{ "object_name" : "com.zaxxer.hikari:type=Pool (HikariPool-2)", "object_alias" : "Hikari2" },{ "object_name" : "Catalina:type=ThreadPool,name=\"http-nio-8080\"", "object_alias" : "HttpThread" },{ "object_name" : "java.lang:type=MemoryPool,name=Metaspace", "object_alias" : "Metaspace" },{ "object_name" : "java.lang:type=MemoryPool,name=Par Eden Space", "object_alias" : "Eden" },{ "object_name" : "java.lang:type=MemoryPool,name=CMS Old Gen", "object_alias" : "Old" },{ "object_name" : "java.lang:type=MemoryPool,name=Par Survivor Space", "object_alias" : "Survivor" }] }

 

Monitoring with Kibana:

Once all this information is being processed and indexed, it's time to create dashboards and visualizations on Kibana.

First we have to point Kibana to the Elasticsearch instance whose data we are going to consume. We'll use the kibana.yml configuration file for this purpose.
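As a minimal sketch of that configuration (assuming Kibana 6.x, where the setting is named elasticsearch.url; newer versions use elasticsearch.hosts), kibana.yml could look like this:

# kibana.yml -- point Kibana at the Elasticsearch instance used by Logstash.
server.host: "0.0.0.0"
elasticsearch.url: "http://ELASTIC_SERVER_HOSTNAME:9200"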

 

Logtrail:

We are also going to install a plugin (Logtrail) that lets us view the logs through Kibana's UI, much like the tail command does. This way we can share logs with developers, sysadmins, devops, project managers… without having to give all those people access to the actual server where the logs live.
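Installation is done with the kibana-plugin tool; the sketch below uses a placeholder URL, since the Logtrail release zip has to match your exact Kibana version:

# Run from the Kibana installation directory; replace the placeholder with the
# Logtrail release zip that matches your Kibana version.
bin/kibana-plugin install <url-of-logtrail-release-zip>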

 

 

How to create visualizations and dashboards

Once we have pointed Kibana to the indexes and installed Logtrail, we can start creating dashboards and visualizations. This is easy in Kibana; we just need to use the UI. The steps to create a visualization are:

  1. Indicate which index patterns we are going to deal with. In our case we wanted to use JMX and log information from two separate Liferay nodes. In our example:

    logstash-jmx-node1-*

    logstash-liferay-log-node1-*

    logstash-jmx-node2-*

    logstash-liferay-log-node2-*
  2. Create a search query in Kibana using Lucene query syntax. In this example we're retrieving the process and system CPU loads collected via JMX:

    metric_path: "reddit.jmx.elasticsearch.OperatingSystem.ProcessCpuLoad" || metric_path: "reddit.jmx.elasticsearch.OperatingSystem.SystemCpuLoad"
  3. Create different visual components to show that information. There are different kinds of visualizations we can create, such as a histogram showing the CPU usage recorded via the JMX plugin:

    Or a table counting how many times a Java class appears in the log at the ERROR log level:

  4. Create dashboards that group all the visualizations you want. Dashboards are collections of visualizations:

To conclude:

The ELK stack is a very powerful way to monitor and extract information from running Liferay instances. In my opinion, the main advantages of this system are:

  • It democratizes logs: everybody can access them, not only sysadmins, and the logs can be searched easily.

  • It keeps a history of JMX stats, so it's possible to know how the CPU, memory, and database pools were doing at any given time.

I hope I have convinced you that ELK can help you monitor your Liferay DXP installations, and that you feel ready to start building the monitoring system that covers your needs.

Eduardo Perez 2018-10-23T12:49:00Z
Categories: CMS, ECM

LiferayPhotos: a fully functional app in 10 days using Liferay Screens. Part 2 Architecture and planning

Liferay - Tue, 10/23/2018 - 05:36

In the previous blog post we covered the beginning of the project, including a description of the app features, the advantages of using Liferay Screens, and the wireframe.

In this one, we will cover how we can "screenletize" the different screens of the application and build them by combining different screenlets. We will also organise the 10 days and plan the tasks we are going to work on each day.

“Screenletizing” the app

In this step, we take each screen and think about which screenlet we can use to build each part; this way we can combine those screenlets to build the final screen. Before we continue, you can check out the available screenlets here.

We will extract these images from the prototype included in the previous post.

Log In

In this case, the solution is very straightforward, we need the Login Screenlet.

Let’s cover a more complex example

Feed

We have a feed screen where the user will see all the pictures uploaded to the platform, with their descriptions and number of likes.

This screen is basically a list of images, but each image has a lot of additional information (user, user image, number of likes, description). To render a list of images we have a dedicated screenlet: the ImageGalleryScreenlet.

The default version of the ImageGalleryScreenlet only displays a list of images in a Grid, but as we learned in the previous post, we can easily customise the UI.

We have to display more information in each image cell, and… exactly, we will also be using screenlets to do this.

 

 

  • To render the user information we will be using the Asset Display Screenlet. This screenlet can show any Liferay asset and, depending on the inner type, it will choose a specific UI. In this case, since it is a user, we will render the name and the profile picture; and of course, to render the profile picture we will use another screenlet, the Userportrait Screenlet.
  • To display the likes, we have another screenlet: the Rating Screenlet. In this case, we don't have a default UI that represents likes, so we will need to create one.
 

The great thing about separating view parts into screenlets is that it allows us to reuse them in other parts of the application, or even in other applications.

Planning

After we have finished extracting screenlets of our app, we can create a schedule for the development of the application.

The app will be developed by two people. One will be working part-time and the other full time. The person that will be developing it full time is Luis Miguel Barcos, our great intern. He doesn’t have previous experience using Liferay Screens, which will make the journey even more interesting. He will write the next blog post talking about the experience and a more technical look at the project.

Here is the detailed planning of the app:

Conclusion

In this blog post, we have shown how we are going to “screenletize” the application in order to divide it into small and reusable pieces. We have also created a planning scheme to organise the 10 days we have to work on this app.

Be sure to check out the next and last blog post with the final result.

 

Victor Galan 2018-10-23T10:36:00Z
Categories: CMS, ECM

LiferayPhotos: a fully functional app in 10 days using Liferay Screens. Part 1 Introduction 

Liferay - Tue, 10/23/2018 - 05:28

Before we begin with this story, a quick introduction for those of you who don't know about Liferay Screens: Liferay Screens is a library for iOS, Android, and Xamarin that aims to speed up the development of native applications that use Liferay as a backend. It's basically a collection of components, called Screenlets, that are very easy to connect to your Liferay instance and provide plenty of features.

This is going to be a series of three blog posts describing the creation of an iOS application, LiferayPhotos. We will describe how we built it from beginning to end and the advantages of using Liferay Screens in your projects, all within a 10-day timeframe.

This first blog post covers the early stages of the application, from the initial idea to the mockups of how the application should look.

Idea

The application that we want to build is inspired by a well-known app: Instagram. We want to build an application where a user can:

  • Log in.

  • Sign up.

  • Upload pictures, and add stickers too!

  • See other people’s uploaded pictures.

  • Comment on pictures.

  • Like pictures.

  • Be able to chat with other users.

  • See your profile with your uploaded pictures.

With all of this in mind, let's move on to the decision of using Liferay Screens to develop this application.

Why use Liferay Screens?

As you read in the title, we are going to build this app in only 10 days. Yes, you read that right. This is one of the greatest benefits of using Liferay Screens: it makes app development really fast. But… how is this even possible? As I mentioned earlier, the library contains a set of ready-to-use components that cover most of our use cases.

Are you tired of implementing a login screen over and over again? By just using the Login Screenlet, adding a couple of lines of code, and configuring it with your Liferay parameters, it's done! You now have a working login screen in your application.

Okay, this seems promising, but what if the UI doesn't fit your brand style? No problem whatsoever: screenlets are designed for this scenario. UI and business logic are decoupled, so you can change the UI easily without having to rewrite the logic.

One last thing you might ask: what if the use case I need isn't supported by the available screenlets? This is not a problem either; Liferay Screens also includes a toolkit to build your own screenlets and reuse them in your projects.

Prototyping

Before working on the app, we have to visualize the application wireframe in order to split and schedule the tasks involved. We have created a basic prototype of the different screens of the application:


Conclusion

As you can see here, there are a lot of advantages to using Liferay Screens in your developments. But this is not all… in the next two posts, we will be unveiling other interesting parts of the project and its lifecycle.

Check out the next post of the series

Victor Galan 2018-10-23T10:28:00Z
Categories: CMS, ECM

Asset Display Contributors in Action

Liferay - Sun, 10/21/2018 - 14:19

Display page functionality in Liferay has always been tightly coupled to Web Content articles; we never planned to support the same technology for other types of assets, even though we have many of them: Documents & Media, Bookmarks, Wiki, etc. Even User is an asset, and every user has a corresponding AssetEntry in the database. For Liferay 7.1 we decided to change this significantly and introduced a new concept for display pages, based on fragments: very flexible and much more attractive than the old ones, and... we still only support Web Content article visualization :). The good news for developers is that the framework is now extensible, and it is easy to implement an AssetDisplayContributor and visualize any type of asset using the new fragment-based display pages. In this article I want to show you how to do that with an example.

Let's imagine that we want to launch a recruitment site, a typical one with tons of job offers, candidate profiles, thematic blogs, etc. One of the main features must be a candidate profile page: a sort of landing page with the candidate's basic information, photo, personal summary, and skills. This task can be solved using the new display pages.

As I mentioned before, User is an asset in Liferay and there is a corresponding AssetEntry for each user, which is convenient since, for now, visualization is supported only for asset entries. To achieve our goal we need two things: first, an AssetDisplayContributor implementation for User, to define which fields are mappable and which values correspond to those fields; and second, a custom friendly URL resolver to be able to reach a user's profile page through a friendly URL containing the user's screen name.

Let's implement the contributor first; it is very simple (some repeated code is skipped, the full class is available on GitHub):

@Component(immediate = true, service = AssetDisplayContributor.class)
public class UserAssetDisplayContributor implements AssetDisplayContributor {

    @Override
    public Set<AssetDisplayField> getAssetDisplayFields(
            long classTypeId, Locale locale)
        throws PortalException {

        Set<AssetDisplayField> fields = new HashSet<>();

        fields.add(
            new AssetDisplayField(
                "fullName", LanguageUtil.get(locale, "full-name"), "text"));

        /* some fields skipped here, see project source for the full implementation */

        fields.add(
            new AssetDisplayField(
                "portrait", LanguageUtil.get(locale, "portrait"), "image"));

        return fields;
    }

    @Override
    public Map<String, Object> getAssetDisplayFieldsValues(
            AssetEntry assetEntry, Locale locale)
        throws PortalException {

        Map<String, Object> fieldValues = new HashMap<>();

        User user = _userLocalService.getUser(assetEntry.getClassPK());

        fieldValues.put("fullName", user.getFullName());

        /* some fields skipped here, see project source for the full implementation */

        ServiceContext serviceContext =
            ServiceContextThreadLocal.getServiceContext();

        fieldValues.put(
            "portrait", user.getPortraitURL(serviceContext.getThemeDisplay()));

        return fieldValues;
    }

    @Override
    public String getClassName() {
        return User.class.getName();
    }

    @Override
    public String getLabel(Locale locale) {
        return LanguageUtil.get(locale, "user");
    }

    @Reference
    private UserLocalService _userLocalService;

}

As you can see, there are two main methods: getAssetDisplayFields, which defines the set of AssetDisplayField objects with the field name, label, and type (for the moment we support two types, text and image, converting all non-text values such as numbers, booleans, dates, and lists of strings to text), and getAssetDisplayFieldsValues, which provides the values for those fields for a specific AssetEntry instance. It is also possible to provide different field sets for different subtypes of entities, as we do for different Web Content structures, using the classTypeId parameter.

The second task is to implement the corresponding friendly URL resolver so that profiles can be reached by the user's screen name. Here I'll show only the implementation of the getActualURL method of the FriendlyURLResolver interface, because it is the method that matters; the full code of the resolver is also available on GitHub.

@Override
public String getActualURL(
        long companyId, long groupId, boolean privateLayout, String mainPath,
        String friendlyURL, Map<String, String[]> params,
        Map<String, Object> requestContext)
    throws PortalException {

    String urlSeparator = getURLSeparator();

    String screenName = friendlyURL.substring(urlSeparator.length());

    User user = _userLocalService.getUserByScreenName(companyId, screenName);

    AssetEntry assetEntry = _assetEntryLocalService.getEntry(
        User.class.getName(), user.getUserId());

    HttpServletRequest request =
        (HttpServletRequest)requestContext.get("request");

    ServiceContext serviceContext = ServiceContextFactory.getInstance(request);

    AssetDisplayPageEntry assetDisplayPageEntry =
        _assetDisplayPageEntryLocalService.fetchAssetDisplayPageEntry(
            assetEntry.getGroupId(), assetEntry.getClassNameId(),
            assetEntry.getClassPK());

    if (assetDisplayPageEntry == null) {
        LayoutPageTemplateEntry layoutPageTemplateEntry =
            _layoutPageTemplateEntryService.fetchDefaultLayoutPageTemplateEntry(
                groupId, assetEntry.getClassNameId(), 0);

        _assetDisplayPageEntryLocalService.addAssetDisplayPageEntry(
            layoutPageTemplateEntry.getUserId(), assetEntry.getGroupId(),
            assetEntry.getClassNameId(), assetEntry.getClassPK(),
            layoutPageTemplateEntry.getLayoutPageTemplateEntryId(),
            serviceContext);
    }

    String requestUri = request.getRequestURI();

    requestUri = StringUtil.replace(requestUri, getURLSeparator(), "/a/");

    return StringUtil.replace(
        requestUri, screenName, String.valueOf(assetEntry.getEntryId()));
}

The key part here is that we need to know which AssetDisplayPageEntry corresponds to the current user. For Web Content articles, there is a UI to define the display page while editing the content. For User it would also be possible to build such a UI and save the page ID in the database, but to keep the example simple I prefer to fetch the default display page for the User class and create the corresponding AssetDisplayPageEntry if it doesn't exist. At the end of the method, we redirect the request to the Asset Display Layout Type Controller to render the page using the corresponding page fragments.
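For context, the resolver also has to declare the URL prefix that identifies these friendly URLs via getURLSeparator(); a minimal sketch with an illustrative separator value (the sample project may use a different one):

@Override
public String getURLSeparator() {

    // Illustrative prefix; the screen name is appended after it, as assumed
    // by the substring() call in getActualURL above.
    return "/user/";
}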

That's it for the code; there are a few tasks left, but there is no need to deploy anything else. Now let's prepare the fragments, create a display page, and try it out! For our display page we need three fragments: Header, Summary, and Skills. You can create your own fragments with editable areas and map them as you like, but if you are still not familiar with the new display page mapping concept, I recommend downloading my fragment collection and importing it into your site.

When your fragments are ready you can create a display page: go to Build -> Pages -> Display Pages, click the plus button, and put the fragments in the order you like. This is how it looks using my fragments:

Clicking on any editable area (marked with a dashed background), you can map it to any available field of the available asset types (there should be two: Web Content Article and User). Choose the User type, map all the fields you would like to show on the display page, and click the Publish button. After publishing, it is necessary to mark our new display page as the default for this asset type; this action is available in the kebab menu of the display page entry:

Now we can create a user and try our new display page. Make sure you fill in all the fields you mapped; in my case the fields are First name, Last name, Job title, Portrait, Birthday, Email, Comments (as the summary), Tags (as the skills list), and Organization (as the company). Save the user and use their screen name to see the final result:

It is possible to create a corresponding AssetDisplayContributor for any type of Asset and use it to visualize your assets in a brand new way using Fragment-based Display Pages.

Full code of this example is available here.

Hope it helps! If you need any help implementing contributors for your assets, feel free to ask in the comments.

Pavel Savinov 2018-10-21T19:19:00Z
Categories: CMS, ECM

Fragments extension: Fragment Entry Processors

Liferay - Fri, 10/19/2018 - 00:02

In Liferay 7.1 we presented a new vision of the page authoring process. The main idea was to empower business users to create pages and display content in a very visual way, without needing to know technical details like FreeMarker or Velocity for Web Content templates. To make this possible we introduced the fragment concept.

In our vision, a fragment is a building block that can be used to build content pages, display pages, or content page templates. A fragment consists of HTML markup, a CSS stylesheet, and JavaScript code.

Even though we really wanted to create a business-user-friendly application, we always keep our strong developer community and its needs in mind. The Fragments API is extensible and allows you to create custom markup tags to enrich your fragment code, and in this article I would like to show you how to create your own processors for fragment markup.

As an example, we are going to create a custom tag which shows a UNIX-style fortune cookie :). Our fortune cookie module has the following structure:

We use the Jsoup library to parse the fragment's HTML markup, so we have to include it in our build file (since it doesn't ship with the Portal core), among other dependencies.

sourceCompatibility = "1.8" targetCompatibility = "1.8" dependencies { compileInclude group: "org.jsoup", name: "jsoup", version: "1.10.2" compileOnly group: "com.liferay", name: "com.liferay.fragment.api", version: "1.0.0" compileOnly group: "com.liferay", name: "com.liferay.petra.string", version: "2.0.0" compileOnly group: "com.liferay.portal", name: "com.liferay.portal.kernel", version: "3.0.0" compileOnly group: "javax.portlet", name: "portlet-api", version: "3.0.0" compileOnly group: "javax.servlet", name: "javax.servlet-api", version: "3.0.1" compileOnly group: "org.osgi", name: "org.osgi.service.component.annotations", version: "1.3.0" }

The OSGi bnd.bnd descriptor has nothing special, because we don't export any packages or provide any capabilities:

Bundle-Name: Liferay Fragment Entry Processor Fortune
Bundle-SymbolicName: com.liferay.fragment.entry.processor.fortune
Bundle-Version: 1.0.0

Every Fragment Entry Processor implementation has two main methods: the first processes the fragment's HTML markup, and the second validates the markup to avoid saving fragments with invalid markup.

/**
 * @param fragmentEntryLink Fragment Entry link object to get editable
 *        values needed for a particular case processing.
 * @param html Fragment markup to process.
 * @param mode Processing mode (@see FragmentEntryLinkConstants)
 * @return Processed Fragment markup.
 * @throws PortalException
 */
public String processFragmentEntryLinkHTML(
        FragmentEntryLink fragmentEntryLink, String html, String mode)
    throws PortalException;

/**
 * @param html Fragment markup to validate.
 * @throws PortalException In case of any invalid content.
 */
public void validateFragmentEntryHTML(String html) throws PortalException;

The FragmentEntryLink object gives us access to the particular fragment usage on a page, display page, or page template, and can be used if we want the result to depend on the parameters of that usage. The mode parameter can be used to add extra processing (or skip unnecessary processing) in EDIT (or VIEW) mode.

In this particular case, we don't need the validation method, but we have a good example in the Portal code.
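So in our processor, validateFragmentEntryHTML can simply be a no-op; a minimal sketch:

@Override
public void validateFragmentEntryHTML(String html) throws PortalException {

    // Nothing to validate for the <fortune> tag; any markup is accepted as-is.
}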

Let's implement our fortune cookie tag processor! The only thing we have to do is iterate over all the fortune tags we find and replace each one with a random cookie text. As mentioned before, we use Jsoup to parse the markup and work with the document:

@Override
public String processFragmentEntryLinkHTML(
    FragmentEntryLink fragmentEntryLink, String html, String mode) {

    Document document = _getDocument(html);

    Elements elements = document.getElementsByTag(_FORTUNE);

    Random random = new Random();

    elements.forEach(
        element -> {
            Element fortuneText = document.createElement("span");

            fortuneText.attr("class", "fortune");
            fortuneText.text(_COOKIES[random.nextInt(7)]);

            element.replaceWith(fortuneText);
        });

    Element bodyElement = document.body();

    return bodyElement.html();
}

private Document _getDocument(String html) {
    Document document = Jsoup.parseBodyFragment(html);

    Document.OutputSettings outputSettings = new Document.OutputSettings();

    outputSettings.prettyPrint(false);

    document.outputSettings(outputSettings);

    return document;
}

private static final String[] _COOKIES = {
    "A friend asks only for your time not your money.",
    "If you refuse to accept anything but the best, you very often get it.",
    "Today it's up to you to create the peacefulness you long for.",
    "A smile is your passport into the hearts of others.",
    "A good way to keep healthy is to eat more Chinese food.",
    "Your high-minded principles spell success.",
    "The only easy day was yesterday."
};

private static final String _FORTUNE = "fortune";

 

That's it. After deploying this module to our Portal instance, the fortune tag is ready to use in the fragments editor:
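For illustration, a fragment's HTML could then use the tag like this (the surrounding markup is just an example):

<div class="fortune-of-the-day">
    <h3>Fortune of the day</h3>

    <fortune></fortune>
</div>

At render time, the processor replaces each fortune element with a span containing one of the random cookie texts.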

It is up to you how to render your own tag, which attributes to use, and which technology to use to process the tag's content. You can even create your own scripting language, or apply the one you already have in your CMS to avoid massive refactoring and keep using existing templates as-is.

Full Fortune Fragment Entry Processor code can be found here.

Hope it helps!

Pavel Savinov 2018-10-19T05:02:00Z
Categories: CMS, ECM

Using BOMs to Manage Liferay Dependency Versions

Liferay - Wed, 10/17/2018 - 15:15

Liferay is a large project, and many developers who are attempting to get their customizations to work with Liferay will often end up asking the question, "What version of module W should I use at compile time when I'm running on Liferay X.Y.Z?" To answer that question, Liferay has some instructions on how to find versions in its document, Configuring Dependencies.

This blog entry is really to talk about what to do in situations where those instructions just aren't very useful.

Path to Unofficial BOMs

First, a little bit of background, because I think context is useful to know, though you can skip it if you want to get right to working with BOMs.

Back in late 2016, I started to feel paranoid that we'd start introducing major version changes to packages in the middle of a release and nobody would notice. To ease those concerns, I wrote a tool that indexed all of the packageinfo files in Liferay at each tag, and then I loaded up these metadata files with a Jupyter notebook and did a check for a major version change.

Then, like many other is it worth the time problems, it evolved into a script that I'd run once a week, and a small web-based tool so that I wouldn't have to fire up Jupyter every time I needed to check what was essentially static information.

Fast forward to February 2017, and our internal knowledge base was updated to allow for a new wiki format which (accidentally?) provided support for HTML with script tags. So, I chose to share my mini web-based tool on the wiki, which then led our support team in Spain to share a related question that they'd been pondering for a while.

Imagine if you happened to need to follow the Liferay document, Configuring Dependencies, for a lot of modules. Doesn't that lookup process get old really fast?

So, given that it was clearly possible to create an unofficial reference document for every Liferay exported package, wouldn't it be nice if we could create an official reference document that recorded every Liferay module version?

Since I had all of the metadata indexed anyway, I put together a separate tool that displayed the information stored in bnd.bnd files at every tag, which sort of let you look up module version changes between releases. This let people get a sense for what an official reference document might look like.

(Note: I learned a year later that bnd.bnd files are not the correct file to look at if you want to know the version at a given tag. Rather, you need to look at the artifact.properties files saved in the modules/.releng folder for that information. So in case it helps anyone feel better, Liferay's release and versioning process isn't obvious to anyone not directly involved with the release process, whether you're a Liferay employee or not.)

From the outside looking in, you might ask, why is it that our team didn't ask for Liferay to officially provide a "bill of materials" (BOMs), as described in the Introduction to Dependency Mechanism in the official Maven documentation? That way, you'd only specify the version of Liferay you're using, and the BOM would take care of the rest. If such a BOM existed, a lookup tool would be completely unnecessary.

Well, that's how the request actually started at the time of DXP's release, but since it wasn't happening, it got downgraded to a reference document which looked immediately achievable.

Fast forward to today. Still no official reference document for module versions, still no official Liferay BOMs.

However, by chance, I learned that the IDE team has been evaluating unofficial BOMs currently located on repository.liferay.com. These BOMs were generated as proofs of concept of what such a BOM might include, and are referenced in some drafts of Liferay IDE tutorials. Since I now had an idea of what the IDE team itself felt a BOM should look like, I updated my web-based tool to use all of the collected metadata to dynamically generate BOMs for all indexed Liferay versions.

Install the Unofficial BOM

For sake of an example, assume that you want to install release-dxp-bom-7.1.10.pom.

The proof of concept for this version exists in the liferay-private-releases repository of repository.liferay.com, and Liferay employees can set up access to that repository to acquire the file. Since there are no fix packs, it is also functionally equivalent to the original 7.1.0 GA1 release, and you can use com.liferay.portal:release.portal.bom:7.1.0 instead.

However, if you wish to use a version for which a proof of concept has not been generated (or if you're a non-employee wanting to use an unofficial DXP BOM), you can try using Module Version Changes Since DXP Release to use the indexed metadata and generate a BOM for your Liferay version. If you go that route, open up the file in a text editor, and you should find something that looks like the following near the top of the file:

<groupId>com.liferay</groupId>
<artifactId>release.dxp.bom</artifactId>
<version>7.1.10</version>
<packaging>pom</packaging>

With those values for the GROUP_ID, ARTIFACT_ID, VERSION, and PACKAGING, you would install the BOM to your local Maven repository by substituting in the appropriate values into the following mvn install:install-file command:

mvn install:install-file -Dfile=release.dxp.bom-7.1.10.pom -DgroupId="${GROUP_ID}" \
    -DartifactId="${ARTIFACT_ID}" -Dversion="${VERSION}" \
    -Dpackaging="${PACKAGING}"

And that's basically all you need to do when installing a BOM that's not available in any repository you can access.

Install Multiple BOMs

If you only have a handful of BOMs, you could repeat the process mentioned above for each of your BOMs. If you have a lot of BOMs to install (for example, you're a Liferay employee that might need to build against arbitrary releases, and you decided to use the earlier linked page and generate something for every Liferay fix pack), you may want to script it.

To keep things simple when pulling values out of an XML file, you should install the Python package yq, which provides the xq tool for XML processing at the command line, similar to the popular jq tool for JSON processing.

pip install yq

Once yq is installed, you can add the following to a Bash script to install the auto-generated BOMs to your local Maven repository:

#!/bin/bash

install_bom() {
    local GROUP_ID=$(cat ${1} | xq '.project.groupId' | cut -d'"' -f 2)
    local ARTIFACT_ID=$(cat ${1} | xq '.project.artifactId' | cut -d'"' -f 2)
    local VERSION=$(cat ${1} | xq '.project.version' | cut -d'"' -f 2)
    local PACKAGING=$(cat ${1} | xq '.project.packaging' | cut -d'"' -f 2)

    mvn install:install-file -Dfile=${1} -DgroupId="${GROUP_ID}" \
        -DartifactId="${ARTIFACT_ID}" -Dversion="${VERSION}" \
        -Dpackaging="${PACKAGING}"

    echo "Installed ${GROUP_ID}:${ARTIFACT_ID}:${VERSION}:${PACKAGING}"
}

for bom in *.pom; do
    install_bom ${bom}
done

Use BOMs in Blade Samples

In case you've never used a BOM before, I'll show you how you would use them if you were to build projects in the Blade Samples repository.

Reference the BOM in Maven

First, update the parent pom.xml so that child projects know which dependency versions are available by simply adding the BOM as a dependency.

<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>com.liferay.portal</groupId>
            <artifactId>release.dxp.bom</artifactId>
            <version>7.1.10</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>

We set this to be an import scope dependency, so that we don't have to download all of Liferay's release artifacts just to have the version numbers (they will be downloaded as-needed when the specific artifacts are resolved).

Then, in order to have a child project use this version, simply update the pom.xml in the child project to not include the version explicitly for any of the dependencies.

<dependency>
    <groupId>com.liferay.portal</groupId>
    <artifactId>com.liferay.portal.kernel</artifactId>
    <scope>provided</scope>
</dependency>

As noted in Introduction to Dependency Mechanism, the version specified in the parent POM will then be chosen as the dependency version. Running mvn package in the child project will then download the actual versions for the noted Liferay release.

Note: If this is the first time you've run any Maven commands in the liferay-blade-samples repository, you'll want to make sure that all of the parent projects are installed or the build will fail. If you are new to Maven and aren't sure how to read pom.xml files, this is achieved with the following steps:

  1. Run mvn -N install in liferay-blade-samples/maven
  2. Run mvn install in liferay-blade-samples/parent.bnd.bundle.plugin.
Reference the BOM in Gradle

Liferay workspace uses an older version of Gradle, and so BOMs aren't supported by default. To get support for BOMs, we'll first need to bring in io.spring.gradle:dependency-management-plugin:1.0.6.RELEASE.

The first step in doing this is to update the parent build.gradle so that Gradle knows where to find the plugin.

buildscript {
    dependencies {
        ...
        classpath group: "io.spring.gradle", name: "dependency-management-plugin", version: "1.0.6.RELEASE"
    }
    ...
}

The next step in doing this is to update the parent build.gradle so that Gradle makes sure to apply the plugin to each child project.

subprojects { subproject ->
    ...
    apply plugin: "io.spring.dependency-management"
    ...
}

Because we're installing the BOMs to our local Maven repository, the next step in doing this is to update the parent build.gradle so that Gradle knows to check that local repository. We can then also add the BOM to a dependencyManagement block.

subprojects { subproject ->
    ...
    repositories {
        mavenLocal()
        ...
    }

    dependencyManagement {
        imports {
            mavenBom "com.liferay.portal:release.dxp.bom:7.1.10"
        }
    }
    ...
}

Then, in order to have a child project use this version, simply update the build.gradle in the child project to not include the version explicitly for any of the dependencies.

dependencies {
    compileOnly group: "com.liferay.portal", name: "com.liferay.portal.kernel"
    ...
}

Minhchau Dang 2018-10-17T20:15:00Z
Categories: CMS, ECM