Assistance with Open Source adoption

Open Source News

Self-Service Big Data and AI/ML at Intelligent Data Summit

SnapLogic - Thu, 08/30/2018 - 12:57

Recently, I presented at the Intelligent Data Summit, a virtual event hosted by IDevNews. I was joined by SoftwareAG, MapR, and others, including one of our technology partners, Reltio. This timely online event was focused on all things AI/ML, data, IoT, and other modern technologies, and it was a pleasure to be a part of[...] Read the full article here.

The post Self-Service Big Data and AI/ML at Intelligent Data Summit appeared first on SnapLogic.

Categories: ETL

Fundamentals of Contribution Management online training - September 5th

CiviCRM - Wed, 08/29/2018 - 14:41

This two-hour training session, taught by Cividesk, is a great introduction to CiviContribute. It will help you optimize this module to record and track contributions to your non-profit and improve your interactions with your donors.

Click here to learn more about the topics covered in the Fundamentals of Contribution Management and to register for this class, held on Wednesday, September 5th at 9 am MT / 11 am ET.

Categories: CRM

Migrate from Salesforce to Microsoft Dynamics 365 for Sales using SnapLogic

SnapLogic - Wed, 08/29/2018 - 12:51

A recent Gartner report announced that Customer Relationship Management (CRM) became the largest software market in 2017 and is the fastest growing software market in 2018. Market growth for cloud CRM increased as well, at twice the rate of the overall market growth, and it will only continue to accelerate at a rapid pace as[...] Read the full article here.

The post Migrate from Salesforce to Microsoft Dynamics 365 for Sales using SnapLogic appeared first on SnapLogic.

Categories: ETL

How to create a community around your e-commerce store

PrestaShop - Wed, 08/29/2018 - 05:13
Today, online shopping is on the rise.
Categories: E-commerce

Protecting eCommerce Stores From Click Fraud

PrestaShop - Wed, 08/29/2018 - 03:57
There are between 12 and 24 million eCommerce stores in the world today.
Categories: E-commerce

Simplifying the Compatibility Matrix

Liferay - Tue, 08/28/2018 - 15:10

One of the initiatives our Quality Assurance Team has undertaken over the past few years has been to automate all of their testing. As a result, all of our compatible environments are tested in the same fashion, and we no longer need to distinguish between Supported and Certified environments.


Starting with Liferay DXP 7.0, we will now list all our compatible environments together.  We hope this will make the decision to use Liferay DXP even easier.


Here is the new Compatibility Matrix for 7.0 and the Compatibility Matrix for 7.1.

David Truong 2018-08-28T20:10:00Z
Categories: CMS, ECM

Six Do’s and Don’ts of Collaborative Data Management

Talend - Tue, 08/28/2018 - 11:29

Data Quality projects are not merely technical projects anymore. They are becoming collaborative and team-driven.

As organizations strive to succeed in their digital transformation, data professionals realize they need to work as a team with business operations, since those are the people who need better data to run their operations. Being in the cockpit, Chief Data Officers need to master some simple but useful Do’s and Don’ts about running their Data Quality projects.

Let’s list a few of these.

DO’S

Set your expectations from the start.

Why Data Quality? What do you target? How deep will you impact your organization’s business performance? Find your Data Quality answers among business people. Make sure you know your finish line, so you can set intermediate goals and milestones on a project calendar.

Build your interdisciplinary team.

Of course, it’s about having the right technical people on board: people who master Data Management platforms. But it’s also about finding the right people who understand how Data Quality impacts the business, and making them your local champions in their respective departments. For example, Digital Marketing experts often struggle with bad leads and low-performing tactics due to the lack of good contact information. Moreover, new regulations such as GDPR have made marketing professionals aware of how important personal data is. By putting tools such as Data Preparation in their hands, you will give them a way to act on their data without losing control. They will be your allies in your Data Quality journey.

Deliver quick wins.

While it’s key to stretch people’s capabilities and set ambitious objectives, it’s also necessary to prove very quickly that your data quality project has positive business value. Don’t spend too much time on heavy planning. You need to prove business impact with immediate results. Some Talend customers achieved business results very quickly by enabling business people with apps such as Data Prep or Data Stewardship. If you deliver better and faster time to insight, you will gain instant credibility and people will support your project. After gaining credibility and confidence, it will be easier to ask for additional means when presenting your projects to the board. In the end, remember: many small wins make a big one.

DON’TS

Don’t underestimate the power of bad communication

We often think technical projects need technical answers. But Data Quality is a strategic topic, and it would be misleading to treat it as a purely technical challenge. To succeed, your project must be widely known within your organization. Take control of your own project story instead of letting bad communication spread across departments. For that, you must master the right mix of know-how and communication skills so that your results are known and properly communicated: marketing suffering from bad leads, operations suffering from missing information, strategists suffering from biased insights. People may then ask you to extend your project to solve their data quality issues, which is a good reason to ask for more budget.

Don’t overengineer your projects, making them too complex and sophisticated.

Talend provides a simple and powerful platform that produces fast results, so you can start small and deliver big. One example of implementing Data Management right from the start is Carhartt, which managed to clean 50,000 records in one day. You don’t necessarily need to wait a long time to see results.

Don’t leave the clock running, or your team without clear direction

Set and meet deadlines as often as possible; it will bolster your credibility. As time runs fast and your organization may shift to short-term business priorities, track your route and stay focused on your end goals. Make sure you deliver projects on time. Then celebrate success: when finishing a project milestone, take time to celebrate with your team and across the organization.


To learn more about Data Quality, please download our Definitive Guide to Data Quality.


The post Six Do’s and Don’ts of Collaborative Data Management appeared first on Talend Real-Time Open Source Data Integration Software.

Categories: ETL

Joomla 3.8.12 Release

Joomla! - Tue, 08/28/2018 - 08:45

Joomla 3.8.12 is now available. This is a security release for the 3.x series of Joomla which addresses 3 security vulnerabilities and contains over 20 bug fixes and improvements.

Categories: CMS

Is self-service big data possible?

SnapLogic - Tue, 08/28/2018 - 07:30

By now, we all know about and are experiencing the rise in the volume of data generated and available to an organization, and the issues it can cause. There is little end in sight to the data tsunami, largely due to the increased variety of data from[...] Read the full article here.

The post Is self-service big data possible? appeared first on SnapLogic.

Categories: ETL

End of life plans for 5.x php versions & planning for 7.0 EOL

CiviCRM - Tue, 08/28/2018 - 04:44
This blog serves as advance notice of our intention to stop supporting PHP versions 5.5 and 5.6, and of our ongoing evaluation of PHP 7.0. For PHP 5.5, we intend to end support in January 2019. This version is already unsupported by the PHP project and we strongly recommend you upgrade off it as soon as possible; the release in February 2019 will be the first release that does not support PHP 5.5. For PHP 5.6, our target is to end support in September 2019 (the October release would support PHP 7.0+).
Categories: CRM

New Liferay Project SDK Installers 3.3.0 GA1 Released

Liferay - Mon, 08/27/2018 - 21:16

We are pleased to announce the new release of Liferay Project SDK Installers version 3.3.0 GA1.




Customers can download all of them from the customer studio download page.


Upgrade from previous 3.2:


  1. Download updatesite here

  2. Go to Help > Install New Software… > Add…

  3. Select Archive..., then browse to the downloaded updatesite

  4. Click OK to close Add repository dialog

  5. Select all the features to upgrade, click Next, click Next again, and accept the license agreements

  6. Finish and restart to complete the upgrade

Release highlights:


Installers Improvements:


Added an option to install Developer Studio only


Developer Studio Improvements and Fixes:


1. Code Upgrade Tool Improvements

  • support for upgrading to Liferay 7.1
    • convert Liferay Plugins SDK 6.2 to Liferay Workspace 7.0 or 7.1
    • convert Liferay Workspace 7.0 to Liferay Workspace 7.1
  • added Liferay DXP/7.1 breaking changes
  • various performance improvements

2. Enabled dependencies management for Target Platform

3. Fixed source lookup during watch task


Using Code Upgrade Tool






If you run into any issues or have any suggestions, please come find us on our community forums or report them on JIRA (IDE project); we are always around to try to help you out. Good luck!

Yanan Yuan 2018-08-28T02:16:00Z
Categories: CMS, ECM

Four sessions to attend at the Strata Data Conference in New York

SnapLogic - Mon, 08/27/2018 - 13:17

The Strata Data Conference in New York is where thousands of cutting-edge companies deep dive into emerging big data technologies and techniques. From hot topics like AI and machine learning to implementing data strategy, this seven-year running conference series is a hotbed for new ideas and strategies to tackle the challenges that have emerged in the[...] Read the full article here.

The post Four sessions to attend at the Strata Data Conference in New York appeared first on SnapLogic.

Categories: ETL

Digital Ad Fraud – What You Need to Know

PrestaShop - Mon, 08/27/2018 - 11:30
Every company that invests in digital advertising has likely been a victim of some form of ad fraud, yet many fail to realize the extent of the problem.
Categories: E-commerce

Revisiting OSGi DS Annotations

Liferay - Sat, 08/25/2018 - 08:51

I've been asked a couple of times recently about different aspects of the @Modified annotation that I'm not sure have been made clear in the documentation, so I wanted to cover the lifecycle annotations in a little more detail so we can use them effectively.

The @Activate, @Deactivate and @Modified annotations are used for lifecycle event notifications for DS components. They get called when the component itself is activated, deactivated or modified, and allow you to take appropriate action as a result.

One important note: these lifecycle events will only be triggered if your @Component is actually alive. I know, sounds kind of weird, but it can happen. If you have a mandatory @Reference which is not satisfied, your @Component will not be alive and your @Activate (and other lifecycle) methods will not be invoked.


@Activate

This annotation is used to notify your component that it is now loaded, resolved and ready to provide service. You use this method to do some final setup in your component. It is the equivalent of Spring's InitializingBean interface and its afterPropertiesSet() method.

One of the cool things about the @Activate method is that the signature of the method you are creating is not fixed. You can have zero, one or more of the following parameters:

  • Map<String, Object> properties: the property map from the @Component properties; it can also contain your Configuration Admin properties.
  • BundleContext bundleContext: the bundle context for the bundle that holds the component being activated. Saves you from having to look it up; great when you want to open a ServiceTracker.
  • ComponentContext componentContext: the component context. It contains the above objects but, most of all, it has context information about the component itself.


So the following @Activate methods would all individually be considered valid:

@Activate
protected void activate() { ... }

@Activate
protected void afterPropertiesSet(Map<String, Object> props, BundleContext bCtx) { ... }

@Activate
public void dontCallMe(BundleContext bundleContext, Map<String, Object> properties, ComponentContext componentContext) { ... }

So we can use any method name (although Liferay tends to stick with activate()) and any combination of the parameters, in any order we want.


@Deactivate

Hopefully it is obvious that the method carrying this annotation is invoked when the component is about to be deactivated. What might not be so obvious is when, exactly, it is called.

Basically, you can be sure that your component context is still valid (nothing has been done to it yet), but deactivation is about to happen. You want to use this lifecycle event to clean up before your component goes away.

So if you have a ServiceTracker open, this is your chance to close it. If you have a file open or a DB connection or any resource, use this as your entry point to clean it all up.

Like the @Activate methods, the method signature for @Deactivate methods is variable. It supports all of the same parameters as @Activate, plus one additional value: an int which holds the deactivation reason.

I've never worried about the deactivation reason myself, but I suppose there are good use cases for receiving it. If you want to see the codes and their explanations, check out the corresponding Felix ticket.
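For illustration, a deactivate method receiving the reason code might look like the sketch below. Everything here is a stand-in: the nested @interface merely marks where the real org.osgi.service.component.annotations.Deactivate would go, and the resource list simulates the trackers or connections you would close.

```java
import java.util.ArrayList;
import java.util.List;

public class ResourceComponent {
    // Stand-in for org.osgi.service.component.annotations.Deactivate;
    // the real annotation requires the OSGi annotations jar on the classpath.
    @interface Deactivate {}

    private final List<AutoCloseable> resources = new ArrayList<>();
    int lastReason = -1;

    void open(AutoCloseable resource) {
        resources.add(resource);
    }

    @Deactivate
    protected void deactivate(int reason) {
        // The container passes one of the DS deactivation reason codes here.
        lastReason = reason;
        // Last chance to clean up: close trackers, files, DB connections, etc.
        for (AutoCloseable r : resources) {
            try {
                r.close();
            } catch (Exception ignored) {
            }
        }
        resources.clear();
    }

    public static void main(String[] args) {
        ResourceComponent c = new ResourceComponent();
        c.open(() -> System.out.println("resource closed"));
        c.deactivate(0); // prints "resource closed"
    }
}
```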


@Modified

So this one is a fun one, and one that is not really documented all that well, IMHO.

You can find blogs, etc where it boils down to "this method is called when the service context changes", but there is an assumption there that you understand what the service context is in the first place.

The tl;dr version is that this method is invoked mostly when your configuration changes, i.e. via Configuration Admin.

For example, most of us will have some kind of code in our @Activate method to receive the properties for the component (from the @Component properties and also from the Config Admin), and we tend to copy the value we need into our component, basically caching it so we don't have to track it down later when we need it.

This is fine, of course, as long as no one goes to the control panel and changes your Config Admin properties.

When that happens, you won't have the updated value. Your only option in this case is a restart; restarting the service, the component, or the container will all result in your getting the new value.

But that's kind of a pain, don't you think? Change a config, restart the container?

Enter @Modified. @Modified is how you get notified of the changes to the Config Admin properties, either via a change in the control panel or a change to the osgi/configs/<my pid>.config files.

When you have an @Modified annotation, you can update your local cache value and then you won't require a restart when the data changes.

Note that sometimes you'll see Liferay code like:

@Activate
@Modified
protected void activate(Map<String, Object> properties) { ... }

where the same method is used for both lifecycle events. This is fine, but you have to ensure that you're not doing anything in the method that you don't want invoked for the @Modified call.

For example, if you're starting a ServiceTracker, an @Modified notification will have you restarting the tracker unless you are careful in your implementation.

I often find it easier just to use separate methods so I don't have to worry about it.
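As a runnable sketch of that separation (purely illustrative: the nested @interface declarations stand in for the real annotations in org.osgi.service.component.annotations, and the trackerOpen flag stands in for a ServiceTracker):

```java
import java.util.HashMap;
import java.util.Map;

public class GreeterComponent {
    // Stand-ins for the real OSGi DS annotations, which live in
    // org.osgi.service.component.annotations and need that jar to compile.
    @interface Activate {}
    @interface Modified {}

    private volatile String greeting;
    private boolean trackerOpen; // stand-in for a ServiceTracker

    @Activate
    protected void activate(Map<String, Object> properties) {
        readConfig(properties);
        trackerOpen = true; // open the tracker exactly once, on activation
    }

    @Modified
    protected void modified(Map<String, Object> properties) {
        // Only refresh the cached config; do NOT touch the tracker,
        // so a config change never restarts it.
        readConfig(properties);
    }

    private void readConfig(Map<String, Object> properties) {
        greeting = (String) properties.getOrDefault("greeting", "hello");
    }

    String greeting() { return greeting; }
    boolean trackerOpen() { return trackerOpen; }

    public static void main(String[] args) {
        GreeterComponent c = new GreeterComponent();
        Map<String, Object> props = new HashMap<>();
        props.put("greeting", "hi");
        c.activate(props);                 // container activates the component
        props.put("greeting", "bonjour");
        c.modified(props);                 // config changed in Config Admin
        System.out.println(c.greeting());  // prints "bonjour"
    }
}
```

The point is simply that modified() refreshes the cached configuration without touching the tracker, so a Config Admin change never restarts it.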

The method signature for your @Modified methods follows the same rules as @Activate, so all of the same parameter types are allowed. You can still get at the bundle context or the component context if you need to, but oftentimes this may not be necessary.


So there you kind of have it. There is not a heck of a lot more to it.

With @Activate, you get the lifecycle notification that your component is starting. With @Deactivate, you can clean up before your component is stopped. And with @Modified, you can avoid the otherwise required and pesky restart.



David H Nebinger 2018-08-25T13:51:00Z
Categories: CMS, ECM

Proven ways to speed data integration

SnapLogic - Fri, 08/24/2018 - 12:58

Data integration is a challenge that keeps getting more difficult. It’s no surprise considering the explosion of cloud-based tools, the proliferation of devices that consume and produce information, and the way information is shared between systems and from systems to humans. Plus, IDC predicts that the volume of data will reach around 40 Zettabytes (1[...] Read the full article here.

The post Proven ways to speed data integration appeared first on SnapLogic.

Categories: ETL

Liferay 7.1 CE GA1 OSGi Service / Portlet / Maven sample

Liferay - Fri, 08/24/2018 - 03:14

Hi everybody,

With version 7.x we have all started to play with OSGi, and for some projects it's sometimes hard to get a clean start with the correct build tool, despite the Blade samples, which are definitely useful.

I wanted to share through this blog a sample with an OSGi service and a portlet, configured to be deployed on Liferay 7.1.

The code given can be compiled with Maven:

A repository has to be added to the settings.xml in order for Maven to find version 3.0.0 of the Liferay kernel.
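As a sketch, the settings.xml addition could look like the profile below; the repository id and URL are assumptions, so check Liferay's documentation for the current public repository address:

```xml
<!-- Fragment of ~/.m2/settings.xml; id and URL are illustrative assumptions -->
<profiles>
  <profile>
    <id>liferay</id>
    <repositories>
      <repository>
        <id>liferay-public</id>
        <url>https://repository.liferay.com/nexus/content/groups/public</url>
      </repository>
    </repositories>
  </profile>
</profiles>
<activeProfiles>
  <activeProfile>liferay</activeProfile>
</activeProfiles>
```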


Let's look at some details of the sample:

1 - OSGi service module

In this module we have an interface called OsgiService and its implementation, OsgiServiceImpl.

OsgiService lives in the xxx.api package, which is declared as an exported package in the bnd file.
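For reference, the export declaration in bnd.bnd might look something like this (the names are illustrative, with xxx.api standing in for the real package as above):

```
Bundle-Name: osgi-simple-module
Bundle-Version: 1.0.0
Export-Package: xxx.api
```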

Here is the MANIFEST.MF generated once compilation is done:

Manifest-Version: 1.0
Bnd-LastModified: 1535096462818
Bundle-ManifestVersion: 2
Bundle-Name: osgi-simple-module
Bundle-SymbolicName:
Bundle-Version:
Created-By: 1.8.0_111 (Oracle Corporation)
Export-Package:;version="0.0.1"
Import-Package:
Private-Package:
Provide-Capability: osgi.service;objectClass:List<String>="  al.osgi.simple.api.OsgiService"
Require-Capability:;filter:="(&("
Service-Component: OSGI-INF/  Impl.xml
Tool: Bnd-


When the OSGi service module is started in Liferay, you will be able to see that the service is correctly exposed (Felix web console view):

Under service ID 9245, you can see the type of the service exposed and its implementation below.


2 - OSGi portlet

The portlet is an MVCPortlet with a JSP, in which we inject the OsgiService and call its log method in the portlet's default render.

Let's look at the MANIFEST.MF of the portlet:

Manifest-Version: 1.0
Bnd-LastModified: 1535097259074
Bundle-ManifestVersion: 2
Bundle-Name: osgi-simple-portlet
Bundle-SymbolicName:
Bundle-Version:
Created-By: 1.8.0_111 (Oracle Corporation)
Import-Package: com.liferay.portal.kernel.portlet.bridges.mvc;version="[1.5,2)",javax.portlet;version="[2.0,3)",  pi;version="[0.0,1)",com.liferay.portal.kernel.log;version="[7.0,8)",javax.servlet;version="[3.0,4)",javax.servlet.http;version="[3.0,4)"
Private-Package:,  nt.web.model,content
Provide-Capability: osgi.service;objectClass:List<String>="javax.portlet.Portlet",liferay.resource.bundle;"  al.osgi-simple-portlet";"content.Language"
Require-Capability: osgi.extender;filter:="(&(osgi.extender=jsp.taglib)(uri=",osgi.extender;filter:="(&(osgi.extender=jsp.taglib)(uri=",osgi.extender;filter:="(&(osgi.extender=jsp.taglib)(uri=  ortlet))",osgi.extender;filter:="(&(osgi.extender=jsp.taglib)(uri=http://",osgi.extender;filter:="(&(osgi.extender=jsp.taglib)(uri=",osgi.extender;filter:="(&(osgi.extender=osgi.component)(version>=1.3.0)(!(version>=2.0.0)))",osgi.service;filter:="(  Service)";effective:=active,;filter:="(&(  on=1.8))"
Service-Component: OSGI-INF/  rtlet.xml
Tool: Bnd-


In the Apache Felix web console, we can see that the OsgiService is a service used by the portlet and satisfied by the OSGi simple module (service #9245):


I hope this quick tutorial will help some of you with your next development on Liferay 7.x.

Feel free to add comments.


Best regards,


David Bougearel 2018-08-24T08:14:00Z
Categories: CMS, ECM

SnapLogic Snaplex management: Powered by Mesosphere

SnapLogic - Thu, 08/23/2018 - 13:01

As enterprises continue to invest heavily to optimize their IT infrastructure to keep up with business demands and become more agile, many embrace container technology and seek out applications that are compatible with it. In previous releases, SnapLogic introduced its support for Dockerization of Snaplex nodes to add flexibility to existing environments. Recently, the SnapLogic[...] Read the full article here.

The post SnapLogic Snaplex management: Powered by Mesosphere appeared first on SnapLogic.

Categories: ETL

Integrating Workday with ERP systems

SnapLogic - Wed, 08/22/2018 - 12:30

As companies modernize their enterprise software environments, many are turning to SaaS applications like Workday for key financial and Human Capital Management (HCM) capabilities. However, the new platforms must still co-exist with old-guard ERP systems, which demands a new approach to seamless integration. Workday HCM covers a lot of ground, including traditional back-office HR functions[...] Read the full article here.

The post Integrating Workday with ERP systems appeared first on SnapLogic.

Categories: ETL

13 Reasons to Love Joomla!

Joomla! - Wed, 08/22/2018 - 06:30

13 years ago, we set out with a vision – to develop a CMS that stood firm on its Open Source values. Instantly, the community was behind us – with more than a thousand people joining our project within just one day. Fast forward 13 years and you could say that we’ve come a long way. 14 (major/minor) versions in, Joomla! now powers millions of websites, with more than 7,950 extensions lining the digital shelves of our extension directory.

Categories: CMS

Why data is no longer just an IT function

Talend - Tue, 08/21/2018 - 12:56

Data – or at least the collection, storage, protection, transfer and processing of it – has traditionally been seen as the role of a modern data-driven technical division. However, as data continues to explode in both volume and importance, it is not enough to gather huge amounts of disparate data into a data lake and expect that it will be properly consumed. With data becoming the defining factor of a business’s strategy, this valuable gold dust needs to be in the hands of the right business function, in the right form, at the right time, to be at its most effective. This means that traditional roles within the organization need to adapt, as CIOs and CTOs oversee digital transformation projects across the business landscape.

The aim of digital transformation is to create an adaptive, dynamic company that is powered by digital technology – it is the perfect marriage of the business and IT function and requires both to collaborate to successfully harness the data at a company’s disposal. This will be imperative to deliver the types of rapid growth and customer-centric developments that modern businesses are determined to achieve. In recent years, the groundwork for this has already been delivered in the increasing use of cloud within businesses – which the Cloud Industry Forum revealed earlier this year stands at 88% in the UK, with 67% of users expecting to increase their cloud usage over the coming years. However, while the cloud provides the perfect platform for scalable, agile digitization, three further challenges stand between organizations and digital transformation success, and the business and IT functions need to work together to ensure their business emerges victorious at the other end.

Challenge 1: Business Wants Data, But IT Can’t Keep Up

With cloud applications, sensors, online data streams and new types of technology emerging week on week, businesses are seeing an explosion of data – both in volume and variety. At the same time, consumers are expecting the very latest products, with personalized services, in real-time. The data businesses have access to can help but frequently ends up siloed, out of context, or of bad quality. Industry estimates predict that working on flawed data costs a business in the region of 10x more than working on perfect data.

Traditionally, employees within the business have maintained this data, but this is no longer feasible in the face of the sheer volume of information that businesses receive. Instead, businesses will need to be empowered by modern technologies such as Big Data and machine learning to ensure that as much data preparation, cleansing and analysis as possible is guided or automated. Without a combined landscape of high-quality data, businesses risk missing opportunities by simply failing to analyze their own data, or even deriving improper insights and taking related actions.

Being data-driven is a mandate for modern business, and the strain cannot be placed on IT to simply keep pace with the latest technological innovations. Instead, the business function must support in creating a digital strategy, focused on the latest business objectives, in order for the company to succeed.

Challenge 2: Digitization is Changing the Job Description

In the not-too-distant past, IT resources were centralized, with a core IT organization managing on-premises data using legacy systems. While this was an effective way of keeping data safe and organized, it resulted in the data being hard to access and even harder to use. As recently as 2015, BARC statistics stated that from a sample of over 2,000 responses, 45% of business users say their companies have less than 10% of employees using business intelligence (BI).

However, in today’s data-centric world, where surveys estimate that 38% of overall job postings require digital skills, empowering 10% of employees to be self-sufficient with data is nowhere near enough. Furthermore, Gartner research asserts that by 2019, citizen data scientists will surpass data scientists in terms of the amount of advanced analysis they produce. Everyone throughout the business, from the CIO to the business process analyst, increasingly needs data right at their fingertips. These people need access to data to ensure they can strategize, execute and deliver for the business with the most relevant and up-to-date insights available. This means the business must fully equip its employees, at every level, to empower their decision-making with highly available and insightful data. As well as providing self-service technologies and applications which offer a turnkey solution for mining insight from data, this involves using training and internal communications to define a data-driven culture throughout business divisions.

Challenge 3: The threats to data, and to businesses, are increasing by the day

The knee-jerk reaction to this might be to make as much data as possible available to as many people as possible. However, any well-versed CIO knows this is not viable. With regulations like the GDPR, organizations have an increasing obligation to make sure only the right people have access to every piece of information or place their entire organization at risk. This is especially important given a backdrop where 71% of users admit to having access to data they should not according to the Ponemon Institute.

The solution to this is successfully implemented self-service IT solutions, which automate functions such as data access requests and data preparation. This is fundamental to giving business employees quicker access to the right data, as well as providing clear lineage of who accessed what information, and when, which will be crucial to monitor under the GDPR. At the same time, automated data preparation tools are essential to reduce the burden on the IT team of performing manual cleansing and formatting tasks. This, in turn, will enable the IT team to focus on delivering new technologies for the organization, rather than troubleshooting legacy issues.

The rise of the cloud has created the possibility for every person in every business to be data driven – but to date, this has not been the case. Instead, organizations experience siloing and limits on innovation. The key is creating an approach to data that is built with the business objectives in mind. A successful digital transformation project is centered on achieving real business outcomes, which is then operationalized by IT – making both vital players in evolving the role and use of data within an organization.

The post Why data is no longer just an IT function appeared first on Talend Real-Time Open Source Data Integration Software.

Categories: ETL