Assistance with Open Source adoption

Open Source News

Fresh news about Bonita BPM Community

Bonita - Fri, 09/13/2013 - 08:31

Dear Community,

I have some fresh news for you, about Bonita BPM and the future of our Community.

First, it’s time for an update: the 6.0.3 version of Bonita BPM is available for download. It fixes some bugs in the Engine and Portal, and a lot of work has been done in Studio. You can read the full release notes in PDF.

With Pablo Alonso, one of the consultants on our Service team, we have started a blog series about a Trello connector and application. It provides an example with a subprocess and a brand-new connector built from scratch as a Java application, shows how to use JSON with Bonita BPM, and more. All resources are already available, and will be updated on our Community GitHub.

Here’s the first part of the series: Trello Application Example – part 1. Pablo has started to work on the second part, which is developer-oriented.

About the future of our Community, I can tell you that we have done a lot of work to create a really good, brand-new website:

  • More social – it will be easy to share your questions and content
  • More practical – for Community technical support and collaboration between members
  • More languages – with the ability to create local groups in other languages on the way
  • More usable – there will be a new way to manage your contributions, from connectors to ideas and feedback.

It should be available before the end of the year. We will try to make a sort of beta opening to give you a chance to tell us what you think about it. If you want to be part of this beta group, please let me know in the comments.

I see on the forum that there are more and more users who are new to BPMN and Bonita BPM. I know that Bonita BPM can be technical for non-developers, so I would like to let you know that we haven’t forgotten you. On our corporate website, there are many ready-made process examples you can download, test and customize for your needs. Also, there are BPMN process diagrams and process examples for use with the Community edition of Bonita BPM here – click on each category to see the list of what’s available.

In order to make it easier to install and learn about Bonita BPM, I have started some cool stuff on the Community Blog:

  • I’m updating “how to set up Bonita BPM on a full Open Source stack” for Bonita BPM 6.x.
  • I was looking for a fun way to learn and talk about Bonita BPM 6. I don’t know about you, but sometimes for me, it’s not so easy to explain to friends and family what BPMN and all that stuff is about. So, with help from our experts, I’ve created a little game in Bonita BPM Studio, to show people that we can make not only business but fun process applications too. Stay tuned, for there may be some Bonitasoft goodies to win!

One of my goals is to try to better respond to community needs, and I have access to resources available at Bonitasoft. This means that if you want the Bonitasoft team to work on a specific How to, please let me know. There are many ways to contact me: Twitter, LinkedIn, and through the Bonita forum.

Have fun with Bonita!


Categories: BPM

Trello Application Example – Part 1

Bonita - Wed, 09/04/2013 - 10:07

In this article, we would like to share with you an example of a nice Bonita BPM application using Trello.
Here is a summary of what it does:

  • Helps you create credentials to connect Bonita BPM to the Trello API
  • Retrieves JSON information from a Trello board
  • Creates a downloadable CSV and displays the table in a Bonita BPM form
  • Stores the credentials, so you can use them later when starting a new case from the application in the Portal.
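To give a feel for what the connector does under the hood, here is a minimal, hypothetical Java sketch of two of the core ideas: building an authenticated Trello API request URL, and turning retrieved card data into CSV. The method names and the simplified CSV handling are ours for illustration only; the actual connector code is on GitHub.

```java
// Illustrative sketch only: building a Trello API request URL and converting
// card data to CSV. Names like buildBoardUrl/toCsv are invented, not the
// connector's real API; CSV quoting is deliberately omitted for brevity.
import java.util.List;
import java.util.stream.Collectors;

public class TrelloCsvSketch {
    // Trello's REST API authenticates each request with an application key
    // and a user-authorized token passed as query parameters
    static String buildBoardUrl(String boardId, String key, String token) {
        return "https://api.trello.com/1/boards/" + boardId
             + "/cards?key=" + key + "&token=" + token;
    }

    // Turn rows of card fields into CSV text (one row per line)
    static String toCsv(List<String[]> rows) {
        return rows.stream()
                   .map(r -> String.join(",", r))
                   .collect(Collectors.joining("\n"));
    }

    public static void main(String[] args) {
        System.out.println(buildBoardUrl("myBoard", "KEY", "TOKEN"));
        System.out.println(toCsv(List.of(
            new String[]{"name", "list"},
            new String[]{"Write article", "Doing"})));
    }
}
```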

The first part of this article is aimed at beginners. We will review the example, see how to set up a subprocess, and how to store credentials. In the second part, aimed at developers, we will see how the Trello Connector has been designed in Bonita BPM Studio. All resources used in this How to will be available on GitHub.

Process overview

Before drawing the entire application, we started with a basic one, containing just two human tasks, in order to test how the Trello API works:

  • The first task asks for the credentials to connect to the Trello API
  • The second task displays retrieved data

In this way, it was easy to work on the Java application of the Trello connector. Once that was done and tested, we built the full application with all the forms. The aim was to create not only an application to retrieve data, but a more complete application that helps people create credentials and store them for future use in Bonita BPM.

Here is the diagram:

So let’s describe this process:

  • The first step is automatic and checks whether Trello credentials already exist. If this is the first time you are starting the application, there are no credentials, in which case you go straight to “Set up your account”. If the information is already available (e.g., the second time you start the application), the second step will display it. If you want to make changes, go to “Set up your account”
  • “Set up your account” is a “Call activity” and it will start the process on the second pool
  • This subprocess helps you to generate a Trello API Key and Token
  • Once done, go to “Choose the board”. Simply provide the link to the Trello board you want to use in the application or leave the one you defined before
  • “Retrieve Trello data” will check if all the information is correct, retrieve JSON data from Trello API and convert it to a CSV table. The Trello connector is on this step
  • If there are no errors, the final step will show you the table, and provide a link to download it as a CSV file. However, if you do get an error, you should check your data.

At the end of the process, you can go back to Bonita Portal and start the application again. The option to change Trello API credentials will be displayed.

Set Users in a Call Activity / Subprocess

The task “Set up your account” is called a “Call activity”. In Bonita BPM, this is the way to start a subprocess. Subprocesses can be in the same diagram, in a separate pool, as in the Trello Application example. You can, of course, also create subprocesses in a separate diagram.

In this case, we want the subprocess (“Set up your Trello account”) to be started by the main process (Retrieve board data from Trello). Once deployed in Bonita BPM Portal, we would like the users to see and only be able to start the main process. We don’t want a user to be able to start the subprocess by itself, once it is deployed in Bonita BPM Portal.
The best practice is:

  • Select the main process Pool, and in “General / Actors”, add a user and set it as the initiator. In this way, users can start the main process.

  • Create your second Pool, give it a name, and draw the steps. Next, select the Call Activity step in the main process (“Set up your account”), and in “Subprocess Name”, click on the list and select the process to start (“Set up your Trello account”)
  • If you leave “Version” empty, Bonita BPM Engine will always use the latest available version of the subprocess. This is useful when you make changes only to the subprocess: you won’t have to redeploy the main process too. But if needed, you can set the version of the subprocess.

  • Next, select the second pool, and in “General / Actors”, add a user. Don’t set it as Initiator: this way, the process can only be started programmatically. In this example, the “Call Activity” step in the main process will start it.

In order to import the application into Bonita Portal to check what you see, in Bonita Studio click on the menu “Server / Build”. Next, select the process you want to build, and click on “Finish”. In this example, you will get two files, one per Pool. In Bonita Portal, import the subprocess first, and then the main process. Once user rights are set and enabled, you should only see “Retrieve Board data from Trello” in the User view.


All resources used in this article are available on our GitHub repository:

The Trello connector and application were developed by our consultant Pablo Alonso de Linaje, and fine-tuned by Antoine Thomas for this article. Some variables or comments in the code may be in Spanish, as Pablo comes from Spain.

Categories: BPM

Bonita BPM 6.x on GitHub

Bonita - Tue, 07/16/2013 - 07:14

At Bonitasoft, we are moving to provide more and more useful resources to our great Bonita BPM Community. We’ve already moved to Jira for issue tracking, and to Crowdin for translations. Now I am pleased to announce that we are moving the sources of Bonita BPM 6.x to GitHub.

So where are our GitHub repositories? Just go to:

We can’t wait to see what you will do with Bonita BPM code on GitHub. Now, it is up to you!

Categories: BPM

Maintenance releases: 6.0.1 and 5.10.1

Bonita - Fri, 07/12/2013 - 04:02

Dear ladies and gentlemen,

If you are using Bonita Open Solution 5.10 and Bonita BPM 6.0, it is time to update: 5.10.1 and 6.0.1 are released and ready to download from our corporate website:

These new versions, as you may expect, fix issues and are more stable, so we recommend that you upgrade.

The release notes are available here:

Let’s be more efficient!

Categories: BPM

Data Integration Challenges: Define Your Customer

Data integration blog - Fri, 04/29/2011 - 07:56

IT and business alignment is a widely discussed challenge in data integration. The major data integration problem boils down to this: define “customer”.

Data from different functional areas doesn’t join up: sales orders are associated with newly contracted customers, but marketing campaign data is associated with prospects. Is a customer someone who’s actually bought something from you, or is a customer someone who’s interested in buying something from you? Should the definition include a certain demographic factor that reflects your typical buyer? If sales, marketing, service, and finance could all agree on a single definition of customer, then all the associated transactions could be easily integrated.

The thing is that all these specialists have their understanding of the word “customer”. That is why it is next to impossible for them to agree on a single definition and you have to somehow manage data integration without it.

To solve this issue, you can define what each functional area (and each CRM system) means by “customer”. This is how we know that customer data coming from a marketing system includes prospects, as well as existing customers. With this information, you can build a semantic model to understand how the different definitions of customer relate to one another.

Using this model, it would be possible to associate supply data with parts, cost data with product class, marketing data with brands, and so on. The relationships among these entities allow for data integration across different functional areas. This semantic model may be complex, but try to accept it and don’t rush to simplify it. The world is complex. Data integration requires a sophisticated understanding of your business, and standardizing vocabulary is not going to be the right answer to this challenge.
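As a toy illustration of such a semantic model, here is a hedged Java sketch that records what “customer” means in each source system and maps records onto canonical categories. The system names and categories are invented examples, not a real schema:

```java
// Hedged sketch: a tiny semantic model recording what each source system
// means by "customer", so records can be tagged and joined consistently.
// System names ("marketing_crm", "sales_orders") and the Party categories
// are illustrative only.
import java.util.Map;

public class CustomerSemantics {
    // Canonical categories that the different "customer" definitions map onto
    enum Party { PROSPECT, BUYER }

    // What "customer" means in each source system
    static final Map<String, Party> DEFINITION = Map.of(
        "marketing_crm", Party.PROSPECT,  // campaign contacts: interested, not yet buying
        "sales_orders",  Party.BUYER      // contracted customers with actual orders
    );

    static Party interpret(String sourceSystem) {
        return DEFINITION.get(sourceSystem);
    }

    public static void main(String[] args) {
        System.out.println(interpret("marketing_crm")); // PROSPECT
        System.out.println(interpret("sales_orders"));  // BUYER
    }
}
```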

Categories: ETL

iPaaS: A New Trend In Data Integration?

Data integration blog - Wed, 04/20/2011 - 09:51

iPaaS (integration platform-as-a-service) is a development platform for building integration applications. It provides a set of capabilities for data integration and application integration in the Cloud and on-premises.

There are very few vendors offering iPaaS solutions at the moment. Although Gartner recognizes and uses the term, it still sounds confusing to researchers and data integration experts. So how does iPaaS work and can it benefit your data integration efforts?

An integration platform delivers a combination of data integration, governance, security and other capabilities to link applications, SOA services, and Cloud services. In addition to the basic features that a Cloud solution should have, such as multi-tenancy, elasticity, and reliability, there are other capabilities relevant for iPaaS:

    1. Intermediation, the ability to integrate applications and services across Cloud scenarios, including SaaS and Cloud services as well as on-premises apps and resources.
    2. Orchestration between services, which requires connectivity and the ability to map data.
    3. Service containers to enable users to publish their own services using either RESTful or SOAP technologies.
    4. Security covers the ability to authenticate and authorize access to any resource on the platform, as well as to manage this access.
    5. Enterprise Data Gateway installed on-premises and used as a proxy to access enterprise resources.

Data integration and application integration with and within the Cloud is the concept that business owners should consider nowadays. As of today, iPaaS would mostly appeal to companies that don’t mind building their own IT solutions or to ISVs that need to integrate Cloud silos they have created previously. It will be interesting to see whether iPaaS will become the next trend in the data integration discipline.

Categories: ETL

Salesforce Integration with QuickBooks: Out-of-the-box Solution on its Way

Data integration blog - Wed, 04/06/2011 - 05:41

Salesforce.com and Intuit have signed a partnership agreement to provide Salesforce integration with QuickBooks to Intuit’s four million customers. The companies promise to finish developing the integrated solution in the summer.

The solution is going to make CRM processes more convenient and transparent by displaying customer data along with financial information. Salesforce integration with QuickBooks will enable businesses to synchronize customer data in CRM with financial data in QuickBooks and QuickBooks Online. This will solve an issue of double data entry in two different systems.

Salesforce integration with QuickBooks will help small business owners to make better decisions. According to Intuit’s survey, more than 50% of small businesses perform CRM activities manually with pen and paper, or with software that is not designed for that purpose.

With thousands of small businesses using both QuickBooks and Salesforce.com, the integration of the two systems is a great way to leverage the power of cloud computing and data integration strategies to help businesses grow.

Categories: ETL

Is Your Data Integration Technology Outdated?

Data integration blog - Sat, 04/02/2011 - 10:49

Spring is a good time to get rid of old stuff and check out something new. This might as well be the time to upgrade your data integration tools. How can you tell whether your data integration solution is outdated and should be replaced by something more productive? Or maybe it just needs a little tuning? Here are the main checkpoints to see if your solution’s performance still meets industry standards.

Data transformation schemas deal with both data structure and content. If data mappings are not as well-organized as possible, a single transformation may take twice as long, and mapping problems can cause small delays that add up. The solution to the transformation issue is to make sure that data maps are written as efficiently as possible. You can compare your data integration solution to similar ones to see whether data transformation runs at the required speed.

Business rules processing covers the specific rules against which data has to be validated. Too many rules can stall your data integration processes. You have to make sure that the number of rules in your data integration system is optimal, meaning that there are not too many of them running at the same time.

Network bandwidth and traffic: in many cases, performance is hindered not by the data integration tool itself, but by the size of the network you use. To avoid this issue, you need to calculate the predicted performance under various loads and make sure you use the fastest network available for your data integration needs.

A data integration solution is like a car: it can run, but it becomes slow if it is not properly tuned and taken care of. As we become more dependent on data integration technology, our ability to understand and optimize performance will make a substantial difference.

Categories: ETL

The Key Data Integration Strategies for Successful CRM

Data integration blog - Thu, 03/10/2011 - 09:39

One of the great values data integration provides is the possibility to gain deeper insight into one’s customers. It is not surprising that data integration with CRM (customer relationship management) systems is one of the main directions of development in the industry. As more companies choose to manage customers electronically, it pays to apply the most effective data integration strategies so that CRM investments pay off.

A recent survey by data integration experts and authors Christopher Barko, Ashfaaq Moosa, and Hamid Nemati explores the significant role of data integration in electronic customer relationship management (e-CRM) analytics. They surveyed 115 organizations, including both B2B and B2C companies, and identified four data integration initiatives that provide for better CRM:

    1. Integrating more data sources. The research shows that the total value of a CRM project increases when you integrate more data sources. As salespeople are using more channels than ever before to reach prospects and customers, it is no wonder that data integrated from all these channels is more useful than data stored in isolated silos.

    2. Integrating offline data with online data gives a better picture of customer’s buying habits. 62 percent of respondents said they integrated these data sources, while 30 percent did not. Not surprisingly, those who integrated the online and offline data experienced greater value from their e-CRM projects.

    3. Integrating external data (e.g., from social media sites) into the central repository. 74 percent integrated external data in some form, while 26 percent did not. The companies that practice external data integration in their e-CRM projects enjoy significantly more benefits.

    4. Using a centralized data warehouse or a CRM-specific data repository does provide deeper customer insight. Those who used a decentralized data repository (legacy databases, operational data stores) experienced significantly fewer benefits than those who centralized their data storage.

As the number of marketing channels used to communicate with customers continues to multiply, so does the number of places used to store the data. The research reveals that the most efficient data integration strategies include integrating different kinds of data from multiple channels and keeping it in the central repository. These data integration best practices help ensure marketing efforts have a positive effect on sales.

Categories: ETL

How Can Data Governance Serve Data Integration Projects?

Data integration blog - Sat, 03/05/2011 - 06:56

Data governance initiatives in an organization are intended to cover data quality, data management, and data policy issues. These activities are carried out by data stewards and a team that develops and implements business rules for administrating the use of data.

The focus on data governance is essential when the company has to implement a successful data integration strategy and use it for analysis, reporting, and decision-making. Here are some ways of making data integration projects more efficient with data governance:

    • It brings IT and business teams together. Data governance identifies what is really important to the business and helps establish business rules that are crucial for data integration.

    • A data governance program can help your company define and measure the potential ROI you get from maintaining data. You can use this information to calculate the ROI for data integration projects.

    • It helps you learn who is responsible for data quality. Data governance provides valuable information that enables you to appoint data stewards and decision makers for data integration projects. Since data governance tells you who is responsible for the data, you know where to go to resolve data quality issues.

    • Data governance can save you money, because it helps establish best practices and select cost-effective data integration and data quality tools.

Data governance and data integration are tightly connected with each other. You are not likely to enjoy data integration benefits without a strong governance program. On the other hand, data governance is only possible if your data is stored in an integrated system. My advice: make sensible use of both.

Categories: ETL

What Is The Difference Between Data Conversion and Data Migration?

Data integration blog - Thu, 02/24/2011 - 11:28

The terms data conversion and data migration are still sometimes used interchangeably on the Internet. However, they do mean different things. Data conversion is the transformation of data from one format to another. It implies extracting data from the source, transforming it, and loading it into the target system based on a set of requirements.
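As a minimal illustration of data conversion in this sense, the hypothetical sketch below transforms one field’s format (a US-style date into ISO 8601) on its way from a source to a target system; the formats are invented examples:

```java
// Minimal data-conversion sketch: transforming a single field's format
// (MM/dd/yyyy to ISO 8601) on the way from source to target system.
// Both formats are illustrative choices, not from any particular system.
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;

public class DateConversion {
    static final DateTimeFormatter SOURCE = DateTimeFormatter.ofPattern("MM/dd/yyyy");
    static final DateTimeFormatter TARGET = DateTimeFormatter.ISO_LOCAL_DATE;

    // Parse the source representation and re-emit it in the target format
    static String convert(String sourceDate) {
        return LocalDate.parse(sourceDate, SOURCE).format(TARGET);
    }

    public static void main(String[] args) {
        System.out.println(convert("02/24/2011")); // 2011-02-24
    }
}
```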

Data migration is the process of transferring data between silos, formats, or systems. Therefore, data conversion is only the first step in this complicated process. Besides data conversion, data migration includes data profiling, data cleansing, data validation, and an ongoing data quality assurance process in the target system.

Both terms are treated as synonyms by many Internet resources. I think the reason might be that there are very few situations in which a company has to convert data without migrating it.

Possible data conversion issues

There are some data conversion issues to consider when data is transferred between different systems. Operating systems have certain alignment requirements that will cause program exceptions if they are not taken into consideration. Converting files to another format can be tricky, as how you convert a file depends on how it was created. These are only a few examples of possible conversion issues.

There are some ways to avoid data conversion problems:

    1. Always transform objects into printable character data types, including numeric data.
    2. Devise an operating system-neutral format for an object transformed into a binary data type.
    3. Include sufficient header information in the transformed data type so that the remainder of the encoded object can be correctly interpreted independent of the operating system.
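To make rules 1 and 3 above concrete, here is a small, hypothetical Java sketch that encodes a numeric value as printable characters with a tiny header, so the value can be interpreted independently of the operating system. The “I:&lt;length&gt;:” header layout is invented purely for illustration:

```java
// Sketch of rules 1 and 3: encode a numeric value as printable characters
// with a small header describing the payload. The "I:<digit count>:" header
// layout is a made-up illustration, not a standard format.
public class PortableEncoding {
    // Encode an integer as "I:<digit count>:<digits>", all printable ASCII
    static String encode(int value) {
        String digits = Integer.toString(value);
        return "I:" + digits.length() + ":" + digits;
    }

    // Read the header, then interpret the remainder accordingly
    static int decode(String encoded) {
        String[] parts = encoded.split(":", 3);
        if (!parts[0].equals("I")) throw new IllegalArgumentException("unknown type");
        return Integer.parseInt(parts[2]);
    }

    public static void main(String[] args) {
        String wire = encode(1205);
        System.out.println(wire);          // I:4:1205
        System.out.println(decode(wire));  // 1205
    }
}
```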

Data conversion is often the most important part of data migration. You have to be very careful during this stage to ensure data quality in your target system.

Categories: ETL

Data Integration in SharePoint 2010

Data integration blog - Thu, 02/17/2011 - 09:23

A survey by AIIM (Association for Information and Image Management) states that although SharePoint is being rapidly adopted by organizations, at least half of the companies implementing the platform don’t have a business use in mind.

This might be a reason we don’t see millions of companies shifting their data integration initiatives into SharePoint. It may be only a question of time, as SharePoint 2010 comes with rich integration capabilities. Here are some of the features that can be leveraged for external data integration and application integration:

    1. Business Connectivity Services (BCS) is a new feature of the SharePoint platform that provides new means of integrating external data into SharePoint 2010. It enables you to create connections to external data sources through SharePoint Designer or, in more complex scenarios, with custom code development.

    2. Web Services can be leveraged by both SharePoint and external systems for data integration and application integration purposes. Common services include the ability to authenticate, search, and manage content. SharePoint 2010 also includes built-in RESTful Web services, which allow the integration of remote systems.

    3. Client Object Models are used to integrate SharePoint and other systems to provide better usability. SharePoint 2010 introduces three new client APIs: the ECMAScript client, Silverlight client, and .NET managed client. These object models enable users to access both SharePoint and other data sources from a single interface that does not have to be, or look like, the SharePoint interface.

    4. The CMIS (Content Management Interoperability Services) connector for SharePoint 2010 enables you to perform content management functions between systems that comply with the CMIS specification.
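As a small, hedged illustration of point 2, the sketch below builds a request URL for SharePoint 2010’s built-in ListData.svc REST service. The site URL and list name are invented examples, and a real client would also need to authenticate before issuing the request:

```java
// Hedged sketch: building a request URL for SharePoint 2010's built-in
// RESTful list service (ListData.svc). The site URL and list name below
// are invented examples; authentication is out of scope here.
public class SharePointRest {
    // SharePoint 2010 exposes list data at <site>/_vti_bin/listdata.svc/<ListName>
    static String listItemsUrl(String siteUrl, String listName) {
        return siteUrl + "/_vti_bin/listdata.svc/" + listName;
    }

    public static void main(String[] args) {
        // A remote system would issue an authenticated HTTP GET against this URL
        // and receive the list items as an Atom or JSON feed.
        System.out.println(listItemsUrl("http://intranet.example.com", "Tasks"));
    }
}
```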

There are many ways in which organizations can leverage SharePoint for their data integration needs. Nevertheless, the question of whether companies will start migrating and integrating data into SharePoint 2010 in the near future remains open.

Categories: ETL