AdoptOS

Assistance with Open Source adoption

Open Source News

Trello Application Example – Part 2

Bonita - Wed, 09/25/2013 - 08:07

In this article, we would like to share with you an example of a nice Bonita BPM application using Trello.
Here is a summary of what it does:

  • Helps you create credentials to connect Bonita BPM to the Trello API
  • Retrieves JSON information from a Trello board
  • Creates a downloadable CSV and displays the table in a Bonita BPM form
  • Stores the credentials, so you can use them later when starting a new case from the application in the Portal.

The first part of this article was aimed at the beginner. We reviewed the example, saw how to set up a subprocess, and showed how to store credentials.

In the second part, aimed at the developer, we will see how the Trello Connector has been designed in Bonita BPM Studio. We need to create a custom connector in order to:

  • connect to the Trello API to get the Trello board info,
  • transform it (a JSON array) into a list, which we will need to populate our table widget, and
  • transform it into a CSV-formatted string to generate the content of the downloadable file.

While doing this, you will:

  • Understand better how connectors work in Bonita BPM
  • Understand how to generate a Bonita BPM document from process content

Connector Structure

First, let’s look at the structure of a connector in Bonita BPM. It has two parts, Definition and Implementation.

The definition allows us to specify the inputs and outputs of our connector. In it, we define how we will ask for the input data: the wizard pages and widgets shown to the user, who fills them in to provide the inputs required to execute the connector.

The implementation is the Java code executed with the inputs and outputs defined in the definition. Keep in mind that we can have several different implementations of the same definition. Imagine that your company has developed a custom connector to retrieve and create invoices in your ERP, and now you are changing the ERP: you need to re-implement your connector. With the Community edition of Bonita BPM, you can redeploy the process using the new implementation of the connector. If you have a Subscription Pack, your processes are not affected, as you are able to change the connector implementation at runtime; if your processes get stuck, you can relaunch the execution of previous tasks with the new implementation.

Create a Trello Connector

To get our connector working from scratch, we have to create a new definition in Bonita BPM Studio (see Figure 2). In the Development menu you can create a new connector definition. In this case, we need three different inputs:

  • the Trello API key,
  • the OAuth token and
  • the board we want to show.

So we define these inputs as text (string) and a wizard page with text widgets to let the user set up the connector.
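
To make these inputs concrete, here is roughly what the connector will do with them at execution time. This is a sketch only: the endpoint follows Trello's public REST API, fetchBoardJson is a hypothetical helper rather than the shipped code, and it relies on the commons-httpclient dependency we declare later.

import org.apache.commons.httpclient.HttpClient;
import org.apache.commons.httpclient.methods.GetMethod;

// Hypothetical helper: fetch the cards of a board as a JSON string,
// using the three connector inputs described above.
String fetchBoardJson(String apiKey, String token, String boardId) throws Exception {
    String url = "https://api.trello.com/1/boards/" + boardId
            + "/cards?key=" + apiKey + "&token=" + token;
    GetMethod get = new GetMethod(url);
    try {
        new HttpClient().executeMethod(get);   // issue the GET request
        return get.getResponseBodyAsString();  // a JSON array of cards
    } finally {
        get.releaseConnection();               // always free the connection
    }
}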

As you can see, we’ve defined a custom category to store this connector with its own icon. You can define your own category to store your custom connectors.

Now we need to define the connector’s outputs. In this case we will produce several outputs:

  • the content of a CSV file (stringCSV),
  • the content of the board as a List<List<String>> (bonitaList), with which we will populate a table widget,
  • the headers of that table widget as a List<String> (bonitaColumn),
  • and a List<Map<String, Object>> (trelloList) for other purposes, such as managing the data inside the process (we don’t use this in our process example).

Later in this blog post we will show how to manage these outputs, and how to provide the inputs the connector requires from a variable, a constant, or a script.

Now our connector definition is fully designed, so the next step is to create the implementation. It will connect to the Trello API to retrieve the data in the board, and transform it into objects that are easier to manage inside our process: a table, and a CSV file ready to download. In summary, we use the inputs to generate our outputs through a piece of code written in the connector implementation. You can create a new connector implementation, or edit this one, in the Development menu > Connectors > Edit connector implementation.
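
To give a rough idea of that transformation, here is a minimal sketch, assuming the response fetched from Trello (the json string) is a JSON array of card objects with "name" and "desc" fields; the real code is in the Community GitHub repository mentioned below. It uses the json-simple dependency declared in the next section:

import java.util.ArrayList;
import java.util.List;
import org.json.simple.JSONArray;
import org.json.simple.JSONObject;
import org.json.simple.parser.JSONParser;

// Parse the JSON array returned by Trello and build two of the outputs:
// bonitaList for the table widget and stringCSV for the document content.
JSONArray cards = (JSONArray) new JSONParser().parse(json);
List<List<String>> bonitaList = new ArrayList<List<String>>();
StringBuilder csv = new StringBuilder("Name;Description\n");
for (Object o : cards) {
    JSONObject card = (JSONObject) o;
    List<String> row = new ArrayList<String>();
    row.add(String.valueOf(card.get("name")));
    row.add(String.valueOf(card.get("desc")));
    bonitaList.add(row);                       // one table row per card
    csv.append(row.get(0)).append(';').append(row.get(1)).append('\n');
}
String stringCSV = csv.toString();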

First we will set up the definition we are going to implement by defining:

  • the name of this implementation and of the class that we are going to create,
  • the package where we will do this,
  • the version, and
  • the dependencies that we will need in our connector code.

In this case we need:

  • commons-httpclient
  • commons-io
  • json-simple
  • json_org and
  • commons-logging.

Once we press Finish, we will be redirected to our new Java file in which we will write our own code. But first we have some general remarks:

Your new class will extend an abstract class that is automatically generated as well; in our case:

public class Trello_Get_BoardImpl extends AbstractTrello_Get_BoardImpl

This abstract class manages all the inputs and outputs set in the definition, as well as the default validation of the inputs, which checks the type of each value given as input. So you access the inputs from your class through get methods, and you return the outputs with set methods.

In Figure 5 you can check which methods will be executed and in which order:

  1. setInputParameters() –> the connector receives input parameter values
  2. validateInputParameters() –> the connector can validate input parameter values
  3. connect() –> the connector can establish a connection to a remote server (if necessary)
  4. executeBusinessLogic() –> execute the connector
  5. getOutputParameters() –> outputs are retrieved from connector
  6. disconnect() –> the connector can close connection to remote server (if any)

Connect and disconnect methods are optional: they are used when the targeted system uses “connected objects,” for instance a datasource. We open the connection (connect method), we execute actions, and we close the connection (disconnect method).
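
Putting the life cycle together, the implementation class ends up looking something like this. Treat it as a sketch: the getter and setter names are generated from our definition, and fetchBoardJson, toTable, toHeaders and toCsv are the hypothetical helpers sketched earlier, not the shipped code.

import org.bonitasoft.engine.connector.ConnectorException;

public class Trello_Get_BoardImpl extends AbstractTrello_Get_BoardImpl {

    @Override
    protected void executeBusinessLogic() throws ConnectorException {
        try {
            // steps 1 and 2 have already set and validated the inputs
            String json = fetchBoardJson(getKey(), getToken(), getBoard());
            // produce the outputs declared in the definition (step 4)
            setBonitaList(toTable(json));      // List<List<String>>
            setBonitaColumn(toHeaders(json));  // List<String>
            setStringCSV(toCsv(json));         // CSV-formatted string
        } catch (Exception e) {
            throw new ConnectorException(e);
        }
    }

    // No connect()/disconnect() overrides here: each execution is a plain,
    // stateless HTTP call to the Trello API.
}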

Remember that you can manage several implementations of the same definition, and then select which one to execute in your process.

You can check the code we developed for this connector in our Community GitHub repository; feel free to reuse it.

Managing outputs

We have finished creating our connector and used it in a task to retrieve data. But how do we manage the lists of content and headers, and the CSV content?

We will start with the easiest one. In the previous explanation we spoke about both lists (bonitaList and bonitaColumn), so we are going to use those outputs to fill two variables and use them in a pageflow. In our pageflow, we define a new table widget and populate it with the variables previously filled in from the connector output, as the initial value and the horizontal header values respectively. Now we have our table with the Trello board data.

But how do we convert the formatted string into a document?

First, define a new document in the process. To do this, click on the pool, go to Documents, and create a new document there. In this case it should have the MIME type “application/csv”. Then we “write” the content generated by our connector into this document. For that we use Operations, where we can manage the document variable: a script returns an org.bonitasoft.engine.bpm.document.DocumentValue, which is stored in the document.

// Build the document content from the connector's stringCSV output
import org.bonitasoft.engine.bpm.document.DocumentValue;
return new DocumentValue(stringCSV.getBytes(), "application/csv", "trello_board.csv");

And there you have it!

Categories: BPM

User Experience

Bonita - Tue, 09/24/2013 - 02:15

Dear Community,

I’m Nathalie, and I recently joined the Bonitasoft adventure as a Usability and Ergonomics Specialist.

My job is to put my 12 years of experience as a usability consultant at the service of Bonita BPM users, whatever their profiles and needs.

We are working to improve the usability of Bonita BPM features – including the way they are displayed – according to your logical needs and expectations. We are taking on these challenges to help make you more efficient.

Here is what I do:

  • Conduct usability evaluations
  • Gather and apply your feedback (you can contact me via comment to this post)
  • Conduct remote or in-house usability testing: observing where what you need to do when building a process is not obvious enough, or does not follow your logic, expectations, or needs

To help me with these various assessments, I need volunteers! Some examples of how I will ask:

  • Polls to help us better understand how you interact with our product, so we can define requirements and specifications with a more accurate “you” in mind.
  • Polls to gather feedback on certain features. I’ll put up a poll whenever we have a specific question to ask, thanks to our Bonita Community manager Antoine.

Here is the first one.

Thank you in advance for spending 5 minutes to answer questions about our Bonita BPM Studio Overview pane today.

I’ll be back soon.

Categories: BPM

Bonitasoft is looking for writers

Bonita - Thu, 09/19/2013 - 10:12

Dear community,

Bonitasoft is hiring some writers, and we would be pleased to choose community members to write about some BPM topics and Bonita BPM.

We are looking for some introductory material for non-expert users - on topics such as “Meet Bonita BPM” (to explain the software suite to a relatively nontechnical person), or “What is a ‘process-based application’?”

We’re also open to your ideas and proposals. Please contact me through the community forum if you would like to discuss this further.

Categories: BPM

Introducing our Vimeo channel with the “Bonita Word”

Bonita - Tue, 09/17/2013 - 09:20

Dear Community,

I am happy to announce that we are opening a Vimeo channel for Bonitasoft videos, for any and all topics. And as we introduce this brand new channel, take a look at some of Bonitasoft’s employees giving you a word about Bonita BPM 6.

Have fun!

PS: All Community related videos will be made available in this specific subchannel:
https://vimeo.com/channels/591152

Categories: BPM

Fresh news about Bonita BPM Community

Bonita - Fri, 09/13/2013 - 08:31

Dear Community,

I have some fresh news for you, about Bonita BPM and the future of our Community.

First, it’s time for an update: version 6.0.3 of Bonita BPM is available for download. It fixes some bugs in the Engine and Portal, and a lot of work has been done in Studio. You can read the full release notes in PDF.

With Pablo Alonso, one of the consultants on our Service team, we have started a blog series about a Trello connector and application. It provides an example with a subprocess and a brand-new connector built from scratch as a Java application, shows how to use JSON with Bonita BPM, and more. All resources are already available, and will be updated on our Community GitHub.

Here’s the first part of the series: Trello Application Example – part 1. Pablo has started to work on the second part, which is developer-oriented.

As for the future of our Community, I can tell you that we have done a lot of work to create a really good, brand-new website:

  • More social – it will be easy to share your questions and content
  • More practical – for Community technical support and collaboration between members
  • More languages – with the ability to create local groups in other languages on the way
  • More usable – there will be a new way to manage your contributions, from connectors to ideas and feedback.

It should be available before the end of the year. We will try to make a sort of beta opening to give you a chance to tell us what you think about it. If you want to be part of this beta group, please let me know in the comments.

I see on the forum that there are more and more users who are new to BPMN and Bonita BPM. I know that Bonita BPM can be technical for non-developers, so I would like to let you know that we haven’t forgotten you. On our corporate website, there are many ready-made process examples you can download, test, and customize for your needs. There are also BPMN process diagrams and process examples for use with the Community edition of Bonita BPM here – click on each category to see the list of what’s available.

In order to make it easier to install and learn about Bonita BPM, I have started some cool stuff on the Community Blog:

  • I’m updating “how to set up Bonita BPM on a full Open Source stack” for Bonita BPM 6.x.
  • I was looking for a fun way to learn and talk about Bonita BPM 6. I don’t know about you, but sometimes it’s not so easy for me to explain to friends and family what BPMN and all that stuff is about. So, with help from our experts, I’ve created a little game in Bonita BPM Studio, to show people that we can make not only business process applications but fun ones too. Stay tuned, for there may be some Bonitasoft goodies to win!

One of my goals is to try to better respond to community needs, and I have access to resources available at Bonitasoft. This means that if you want the Bonitasoft team to work on a specific How to, please let me know. There are many ways to contact me: Twitter, LinkedIn, and through the Bonita forum.

Have fun with Bonita!

Antoine

Categories: BPM

Trello Application Example – Part 1

Bonita - Wed, 09/04/2013 - 10:07

In this article, we would like to share with you an example of a nice Bonita BPM application using Trello.
Here is a summary of what it does:

  • Helps you create credentials to connect Bonita BPM to the Trello API
  • Retrieves JSON information from a Trello board
  • Creates a downloadable CSV and displays the table in a Bonita BPM form
  • Stores the credentials, so you can use them later when starting a new case from the application in the Portal.

The first part of this article is aimed at the beginner. We will review the example, see how to set up a subprocess, and show how to store credentials. In the second part, aimed at the developer, we will see how the Trello Connector has been designed in Bonita BPM Studio. All resources used in this how-to will be available on GitHub.

Process overview

Before drawing the entire application, we started with a basic one, containing just two human tasks, in order to test how the Trello API works:

  • The first task asks for the credentials to connect to the Trello API
  • The second task displays retrieved data

In this way, it was easy to work on the Java side of the Trello connector. Once it was done and tested, we built the full application with all the forms. The aim was to create not just an application that retrieves data, but a more complete application that helps people create the credentials, and stores them for future use in Bonita BPM.

Here is the diagram:

So let’s describe this process:

  • The first step is automatic and checks whether there are already Trello credentials. If this is the first time you are starting the application, there are no credentials, so you go straight to “Set up your account”. If the information is already available (e.g. the second time you start the application), the second step will display it. If you want to make changes, go to “Set up your account”
  • “Set up your account” is a “Call activity”, and it starts the process in the second pool
  • This subprocess helps you to generate a Trello API Key and Token
  • Once done, go to “Choose the board”. Simply provide the link to the Trello board you want to use in the application or leave the one you defined before
  • “Retrieve Trello data” will check that all the information is correct, retrieve JSON data from the Trello API, and convert it to a CSV table. The Trello connector is on this step
  • If there are no errors, the final step will show you the table, and provide a link to download it as a CSV file. However, if you do get an error, you should check your data.

At the end of the process, you can go back to Bonita Portal and start the application again. The option to change Trello API credentials will be displayed.

Set Users in a Call Activity / Subprocess

The task “Set up your account” is called a “Call activity”. In Bonita BPM, this is the way to start a subprocess. Subprocesses can be in the same diagram, in a separate pool as in the Trello Application example, or, of course, in a separate diagram.

In this case, we want the subprocess (“Set up your Trello account”) to be started by the main process (“Retrieve board data from Trello”). Once they are deployed in Bonita BPM Portal, we would like users to see, and be able to start, only the main process. We don’t want a user to be able to start the subprocess on its own once it is deployed in Bonita BPM Portal.
The best practice is:

  • Select the main process Pool, and in “General / Actors”, add a user and set it as the initiator. In this way, users can start the main process.

  • Create your second pool, give it a name, and draw the steps. Next, select the Call Activity step in the main process (“Set up your account”) and in “Subprocess Name”, click on the list and select the process to start (“Set up your Trello account”)
  • If you leave “Version” empty, Bonita BPM Engine will always use the latest available version of the subprocess. This is useful when you make changes to the subprocess only: you won’t have to redeploy the main process too. But if needed, you can set the version of the subprocess.

  • Next, select the second pool, and in “General / Actors”, add a user. Don’t set it as initiator: in this way, the process can only be started programmatically. In this example, the “Call Activity” step in the main process will start it.

In order to import the application into Bonita Portal and check what you see, click on the menu “Server / Build” in Bonita Studio. Next, select the process you want to build, and click on “Finish”. In this example, you will get two files, one per pool. In Bonita Portal, import the subprocess first, and then the main process. Once user rights are set and the processes are enabled, you should see only “Retrieve Board data from Trello” in the User view.

Resources

All resources used in this article are available on our Github repository:
https://github.com/Bonitasoft-Community/Trello-connector

The Trello connector and application were developed by our consultant Pablo Alonso de Linaje, and fine-tuned by Antoine Thomas for this article. Some variables or comments in the code may be in Spanish, as Pablo comes from Spain.

Categories: BPM

Bonita BPM 6.x on GitHub

Bonita - Tue, 07/16/2013 - 07:14

At Bonitasoft, we are moving to provide more and more useful resources to our great Bonita BPM Community. We’ve already moved to Jira for issue tracking, and to Crowdin for translations. Now I am pleased to announce that we are moving the sources of Bonita BPM 6.x to GitHub.

So where are our GitHub repositories? Just go to:

https://github.com/bonitasoft

We can’t wait to see what you will do with Bonita BPM code on GitHub. Now, it is up to you!

Categories: BPM

Maintenance releases: 6.0.1 and 5.10.1

Bonita - Fri, 07/12/2013 - 04:02

Dear ladies and gentlemen,

If you are using Bonita Open Solution 5.10 or Bonita BPM 6.0, it is time to update: 5.10.1 and 6.0.1 have been released and are ready to download from our corporate website.

These new versions, as you may expect, fix issues and are more stable, so we recommend that you upgrade.

The release notes are available here:

Let’s be more efficient!

Categories: BPM

Data Integration Challenges: Define Your Customer

Data integration blog - Fri, 04/29/2011 - 07:56

IT and business alignment is a widely discussed challenge of data integration. The major data integration problem boils down to this: define “customer”.

Data from different functional areas doesn’t join up: sales orders are associated with newly contracted customers, but marketing campaign data is associated with prospects. Is a customer someone who has actually bought something from you, or someone who is interested in buying something from you? Should the definition include a certain demographic factor that reflects your typical buyer? If sales, marketing, service, and finance could all agree on a single definition of customer, then all the associated transactions could be easily integrated.

The thing is that each of these specialists has their own understanding of the word “customer”. That is why it is next to impossible for them to agree on a single definition, and you have to somehow manage data integration without one.

To solve this issue, you can define what each functional area (and each CRM system) means by “customer”. This is how we know that customer data coming from a marketing system includes prospects as well as existing customers. With this information, you can build a semantic model to understand how the different definitions of customer relate to one another.

Using this model, it becomes possible to associate supply data with parts, cost data with product classes, marketing data with brands, and so on. The relationships among these entities allow for data integration across different functional areas. This semantic model may be complex, but try to accept it and resist simplifying it: the world is complex. Data integration requires a sophisticated understanding of your business, and standardizing vocabulary is not the right answer to this challenge.
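
To make the idea concrete, here is a minimal sketch (all names are illustrative, not from any particular CRM) of how a semantic model can record what each source system means by “customer”:

import java.util.EnumSet;
import java.util.Set;

// Each source system tags the people it exports with the roles its own
// definition of "customer" implies; downstream integration code then
// queries roles instead of the ambiguous word "customer".
enum Role { PROSPECT, BUYER, SERVICE_CONTACT }

class Party {
    final String id;
    final Set<Role> roles;
    Party(String id, Set<Role> roles) { this.id = id; this.roles = roles; }
}

class CustomerMappings {
    // Marketing's "customer" includes prospects; sales' means actual buyers.
    static Party fromMarketing(String id, boolean hasBought) {
        return new Party(id, hasBought ? EnumSet.of(Role.PROSPECT, Role.BUYER)
                                       : EnumSet.of(Role.PROSPECT));
    }
    static Party fromSales(String id) {
        return new Party(id, EnumSet.of(Role.BUYER));
    }
}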

Categories: ETL

iPaaS: A New Trend In Data Integration?

Data integration blog - Wed, 04/20/2011 - 09:51

iPaaS (integration platform-as-a-service) is a development platform for building integration applications. It provides a set of capabilities for data integration and application integration in the Cloud and on-premises.

There are very few vendors offering iPaaS solutions at the moment. Although Gartner recognizes and uses the term, it still sounds confusing to researchers and data integration experts. So how does iPaaS work and can it benefit your data integration efforts?

An integration platform delivers a combination of data integration, governance, security, and other capabilities to link applications, SOA services, and Cloud services. In addition to the basic features that a Cloud solution should have, such as multi-tenancy, elasticity, and reliability, there are other capabilities relevant to iPaaS:

    1. Intermediation, the ability to integrate applications and services using Cloud scenarios, which include SaaS and Cloud services, as well as on-premises apps and resources.
    2. Orchestration between services, which requires connectivity and the ability to map data.
    3. Service containers that enable users to publish their own services using either RESTful or SOAP technologies.
    4. Security covers the ability to authenticate and authorize access to any resource on the platform, as well as to manage this access.
    5. An Enterprise Data Gateway, installed on-premises and used as a proxy to access enterprise resources.

Data integration and application integration with and within the Cloud is a concept that business owners should consider nowadays. As of today, iPaaS mostly appeals to companies that don’t mind building their own IT solutions, or to ISVs that need to integrate the Cloud silos they created previously. It will be interesting to see whether iPaaS becomes the next trend in the data integration discipline.

Categories: ETL

Salesforce Integration with QuickBooks: Out-of-the-box Solution on its Way

Data integration blog - Wed, 04/06/2011 - 05:41

Salesforce.com and Intuit have signed a partnership agreement to provide Salesforce integration with QuickBooks to Intuit’s four million customers. The companies promise to finish developing the integrated solution this summer.

The solution is going to make CRM processes more convenient and transparent by displaying customer data alongside financial information. Salesforce integration with QuickBooks will enable businesses to synchronize customer data in Salesforce.com CRM with financial data in QuickBooks and QuickBooks Online. This will solve the issue of double data entry in two different systems.

Salesforce integration with QuickBooks will help small business owners make better decisions. According to Intuit’s survey, more than 50% of small businesses perform CRM activities manually, with pen and paper, or with software that is not designed for the purpose.

With thousands of small businesses using both QuickBooks and Salesforce.com, the integration of the two systems is a great way to leverage the power of cloud computing and data integration strategies to help businesses grow.

Categories: ETL

Is Your Data Integration Technology Outdated?

Data integration blog - Sat, 04/02/2011 - 10:49

Spring is a good time to get rid of the old stuff and check out something new. This might as well be the time to upgrade your data integration tools. How can you tell whether your data integration solution is outdated and should be replaced by something more productive? Maybe it just needs a little tuning? Here are the main checkpoints to see if your solution’s performance still fits industry standards.

Data transformation schemas deal with both data structure and content. If data mappings are not as well-organized as possible, a single transformation may take twice as long, and mapping problems can cause small delays that add up. The solution to the transformation issue is to make sure that data maps are written as efficiently as possible. You can compare your data integration solution to similar ones to see whether data transformation runs at the required speed.

Business rules processing covers the specific rules against which the data has to be validated. Too many rules can stall your data integration processes. You have to make sure that the number of rules in your data integration system is optimal, meaning that there are not too many of them running at the same time.

Network bandwidth and traffic: in many cases performance is hindered not by the data integration tool itself, but by the network you use. To avoid this issue, you need to calculate the predicted performance under various loads and make sure you use the fastest network available for your data integration needs.

A data integration solution is like a car: it runs, but it becomes slow if it is not properly tuned and taken care of. As we become more dependent on data integration technology, our ability to understand and optimize performance will make a substantial difference.

Categories: ETL

The Key Data Integration Strategies for Successful CRM

Data integration blog - Thu, 03/10/2011 - 09:39

One of the great values data integration provides is the possibility to gain deeper insight into one’s customers. It is not surprising that data integration with CRM (customer relationship management) systems is one of the main directions of development in the industry. As more companies choose to manage customers electronically, it is quite useful to apply the most effective data integration strategies to pay off CRM investments.

A recent survey by data integration experts and authors Christopher Barko, Ashfaaq Moosa, and Hamid Nemati explores the significant role of data integration in electronic customer relationship management (e-CRM) analytics. They surveyed 115 organizations, including both B2B and B2C companies, and identified four data integration initiatives that make for better CRM:

    1. Integrating more data sources. The research shows that the total value of a CRM project increases when you integrate more data sources. As salespeople are using more channels than ever before to reach prospects and customers, it is no wonder that data integrated from all these channels is more useful than data stored in isolated silos.

    2. Integrating offline data with online data gives a better picture of customers’ buying habits. 62 percent of respondents said they integrated these data sources, while 30 percent did not. Not surprisingly, those who integrated online and offline data experienced greater value from their e-CRM projects.

    3. Integrating external data (e.g., from social media sites) into the central repository. 74 percent integrated external data in some form, while 26 percent did not. The companies that practice external data integration in their e-CRM projects enjoy significantly more benefits.

    4. Using a centralized data warehouse or a CRM-specific data repository does provide deeper customer insight. Those who used a decentralized data repository (legacy databases, operational data stores) experienced significantly fewer benefits than those who centralized their data storage.

As the number of marketing channels used to communicate with customers continues to multiply, so does the number of places used to store the data. The research reveals that the most efficient data integration strategies include integrating different kinds of data from multiple channels and keeping it in the central repository. These data integration best practices help ensure marketing efforts have a positive effect on sales.

Categories: ETL

How Can Data Governance Serve Data Integration Projects?

Data integration blog - Sat, 03/05/2011 - 06:56

Data governance initiatives in an organization are intended to cover data quality, data management, and data policy issues. These activities are carried out by data stewards and a team that develops and implements business rules for administering the use of data.

The focus on data governance is essential when the company has to implement a successful data integration strategy and use it for analysis, reporting, and decision-making. Here are some ways of making data integration projects more efficient with data governance:

    • It brings IT and business teams together. Data governance identifies what is really important to the business and helps establish business rules that are crucial for data integration.

    • A data governance program can help your company define and measure the potential ROI you get from maintaining data. You can use this information to calculate the ROI for data integration projects.

    • It helps you learn who is responsible for data quality. Data governance provides valuable information that enables you to appoint data stewards and decision makers for data integration projects. Since data governance tells you who is responsible for the data, you know where to go to resolve data quality issues.

    • Data governance can save you money, because it helps establish best practices and select cost-effective data integration and data quality tools.

Data governance and data integration are tightly connected. You are not likely to enjoy data integration benefits without a strong governance program. On the other hand, data governance is only possible if your data is stored in an integrated system. My advice: make sensible use of both.

Categories: ETL

What Is The Difference Between Data Conversion and Data Migration?

Data integration blog - Thu, 02/24/2011 - 11:28

The terms data conversion and data migration are still sometimes used interchangeably on the internet. However, they mean different things. Data conversion is the transformation of data from one format to another. It implies extracting data from the source, transforming it, and loading it into the target system based on a set of requirements.

Data migration is the process of transferring data between silos, formats, or systems, so data conversion is only the first step in this complicated process. Besides data conversion, data migration includes data profiling, data cleansing, data validation, and the ongoing data quality assurance process in the target system.

Both terms are treated as synonyms by many internet resources. I think the reason might be that there are very few situations in which a company has to convert data without migrating it.

Possible data conversion issues

There are some data conversion issues to consider when data is transferred between different systems. Operating systems have certain alignment requirements that will cause program exceptions if they are not taken into consideration. Converting files to another format can be tricky, as how you convert a file depends on how it was created. These are only a few examples of possible conversion issues.

There are some ways to avoid data conversion problems:

    1. Always transform objects into printable character data types, including numeric data.
    2. Devise an operating system-neutral format for an object transformed into a binary data type.
    3. Include sufficient header information in the transformed data type so that the remainder of the encoded object can be correctly interpreted independent of the operating system.
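
As a small illustration of rules 1 and 3 combined (the format and names here are invented for this sketch), a numeric value can travel as printable text behind a self-describing header:

import java.nio.charset.StandardCharsets;

class NeutralEncoder {
    // Encode a number as printable characters preceded by a header that
    // tells the receiver how to interpret the bytes that follow, so the
    // result can be decoded independently of the operating system.
    static byte[] encode(long value) {
        String body = Long.toString(value);  // printable, alignment-free
        String header = "FMT=decimal;LEN=" + body.length() + ";";
        return (header + body).getBytes(StandardCharsets.US_ASCII);
    }
}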

Data conversion is often the most important part of data migration. You have to be very careful during this stage to assure data quality in your target system.

Categories: ETL

Data Integration in SharePoint 2010

Data integration blog - Thu, 02/17/2011 - 09:23

A survey by AIIM (Association for Information and Image Management) states that although SharePoint is being rapidly adopted by organizations, at least half of the companies implementing the platform don’t have a business use in mind.

This might be one reason we don’t see millions of companies shifting their data integration initiatives to SharePoint. It may be only a question of time, as SharePoint 2010 comes with rich integration capabilities. Here are some of the features that can be leveraged for external data integration and application integration:

    1. Business Connectivity Services (BCS) is a new feature of the SharePoint platform that provides new means of integrating external data into SharePoint 2010. It enables you to create connections to external data sources through SharePoint Designer, or to handle more complex scenarios with custom code development.

    2. Web Services can be leveraged by both SharePoint and external systems for data integration and application integration purposes. Common services include the ability to authenticate, search, and manage content. SharePoint 2010 also includes built-in RESTful Web services, which allow the integration of remote systems (see the sketch after this list).

    3. Client Object Models are used to integrate SharePoint with other systems to provide better usability. SharePoint 2010 introduces three new client APIs: ECMAScript, Silverlight, and .NET Managed clients. These object models enable users to access both SharePoint and other data sources from a single interface that does not have to be, or look like, the SharePoint interface.

    4. The CMIS (Content Management Interoperability Services) connector for SharePoint 2010 makes it possible to perform content management functions between systems that comply with the CMIS specification.
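
As a quick illustration of point 2, here is a minimal sketch of reading a SharePoint 2010 list through the built-in REST (OData) endpoint, listdata.svc. The site URL and list name are made-up examples, and authentication, which a real deployment requires, is omitted:

import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

class SharePointRestDemo {
    public static void main(String[] args) throws Exception {
        // listdata.svc exposes SharePoint 2010 lists as a RESTful service
        URL url = new URL("http://intranet.example.com/_vti_bin/listdata.svc/Customers");
        HttpURLConnection con = (HttpURLConnection) url.openConnection();
        con.setRequestProperty("Accept", "application/json"); // JSON instead of ATOM
        try (InputStream in = con.getInputStream()) {
            System.out.println(new String(in.readAllBytes()));
        }
    }
}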

There are many ways in which organizations can leverage SharePoint for their data integration needs. Nevertheless, the question of whether companies will move their data migration and data integration into SharePoint 2010 in the near future remains open.

Categories: ETL