AdoptOS

Assistance with Open Source adoption

Open Source News
  • warning: preg_replace(): The /e modifier is no longer supported, use preg_replace_callback instead in /home/mykidsin/public_adoptos_html/includes/unicode.inc on line 349.
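The warning above comes from code that still uses PHP's `/e` (eval) modifier on `preg_replace()`, which was deprecated in PHP 5.5 and removed in PHP 7. The fix the message recommends is to pass a real callable to `preg_replace_callback()`. A minimal sketch of that migration (the `decode_entities` function name and the numeric-entity pattern are illustrative examples, not Drupal's actual `unicode.inc` code):

```php
<?php
// Old style, removed in PHP 7 — the replacement string was eval'd as code:
//   preg_replace('/&#(\d+);/e', 'chr($1)', $text);
//
// Modern equivalent: preg_replace_callback() takes a callable that
// receives the match array and returns the replacement string.
function decode_entities(string $text): string {
    return preg_replace_callback(
        '/&#(\d+);/',
        function (array $m): string {
            // $m[0] is the full match, $m[1] the captured digit group.
            return chr((int) $m[1]);
        },
        $text
    );
}

echo decode_entities('A&#66;C'), "\n"; // prints "ABC" (chr(66) is 'B')
```

The callback form is also safer: the replacement is ordinary code, so user-supplied input in the subject string can never be executed.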

Social selling: What is it and why should you care

VTiger - Tue, 01/29/2019 - 02:09
With social media channels now influencing 92% of B2B buying decisions, social selling has become an indispensable part of the sales process. Social selling helps you stay in front of your prospects, building recognition and trust to eventually win more deals. It is a long-term strategy that helps build long-lasting relationships. What is social […]
Categories: CRM

KNIME recognized by Gartner as a Leader in Data Science and Machine Learning Platforms

Knime - Mon, 01/28/2019 - 12:30

Zurich, January 28, 2019 - For the sixth year in a row[1], Gartner has placed KNIME®, the open source platform for data-driven innovation, as a Leader for Data Science and Machine Learning Platforms based on ability to execute and completeness of vision.

Gartner, Inc. describes leaders as having “a strong presence and significant mind share in the data science and ML market. They demonstrate strength in depth and breadth across the full data exploration, model development and operationalization process. While providing outstanding service and support, Leaders are also nimble in responding to rapidly changing market conditions. The number of expert and citizen data scientists using Leaders’ platforms is significant and growing.

Leaders are in the strongest position to influence the market’s growth and direction. They address the majority of industries, geographies, data domains and use cases, and therefore have a solid understanding of, and strategy for, this market. Not only can they focus on executing effectively, based on current market conditions, but they also have solid roadmaps to take advantage of new developments and advancing technologies in this rapidly transforming sector. They provide thought leadership and innovative differentiation, often disrupting the market in the process.

Leaders are suitable vendors for most organizations to evaluate. They should not be the only vendors evaluated, however, as other vendors might address an organization’s unique needs more precisely. Leaders provide a benchmark of high standards to which others should be compared.”

“We have always believed it is the data science and machine learning community that ultimately decides what the leading platforms are. We are humbled by the response of KNIME users across all industries, continents, and application areas which has helped us achieve this honor,” says Michael Berthold, CEO, KNIME AG. “That community comprises not only users who use our software, but many who also contribute to it.”

The KNIME approach is unique. Its integrative, open source approach ensures all capabilities required of an individual data scientist are available for free and with no restrictions. This includes leading edge technologies such as text mining, image processing, the newest trends in machine learning / artificial intelligence, as well as integrations with other platforms and engines such as Python, R, Spark, H2O, and Keras/TensorFlow for Deep Learning. “KNIME simply does it all within one platform, allowing you to mix and match and move seamlessly from prototyping to production as required, depending on the problem you are solving,” says Phil Winters, KNIME advocate and “Father of Customer Intelligence”. “For me as an individual, everything I do - regardless of scale - is free. I strongly believe that KNIME has set the standard that others must now scramble to follow.”

“We are continuing to not only add leading-edge methods, but are also placing a strong emphasis on blueprints for Guided Analytics and Guided Automation. These are an extremely practical way of helping the KNIME community get ‘focused AI’ applications into the hands of even more users,” continues Berthold. “Our commitment to open source is unwavering. This transparent business model works for everyone and we will keep it that way.”

For further information, contact KNIME at info@knime.com, or download KNIME and discover it for yourself. KNIME – Open for Innovation®.

Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, express or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.

[1] Reports referenced include: Gartner Magic Quadrant for Advanced Analytics Platforms (2014, 2015, 2016), Gartner Magic Quadrant for Data Science Platforms (2017), Gartner Magic Quadrant for Data Science and Machine Learning Platforms (2018, 2019)

Categories: BI

Businesses must integrate Artificial Intelligence (AI) now or fall further behind

Talend - Mon, 01/28/2019 - 10:50

This article was originally published on VentureBeat.

Artificial intelligence became one of the hottest tech topics in 2017 and is still attracting attention and investment. Although scientists have been working on the technology and heralding its numerous anticipated benefits for more than four decades, it’s only in the past few years that society’s artificial intelligence dreams have come to fruition.

The impact AI applications stand to have on both consumer and business operations is profound. For example, a New York-based Harley-Davidson dealer incorporated Albert, an AI-driven marketing platform, into his marketing mix and saw a 2,930 percent increase in sales leads that helped triple his business over the previous year. Unfortunately, success stories like this are far less common than failed AI pilot projects. However, with growing volumes of raw data about people, places, and things, plus increasing compute power and real-time processing speeds, immediate AI applicability and business benefits are becoming a reality.

According to a survey by Cowen and Company, 81 percent of IT leaders are currently investing in or planning to invest in AI, as CIOs have mandated that their companies integrate AI into their entire technology stacks. Another 43 percent are evaluating and doing an AI proof of concept, and 38 percent already have operational AI applications and are planning to invest more. Additionally, McKinsey research estimates tech giants spent $20 to $30 billion on AI in 2016, with 90 percent of it going to R&D and deployment and 10 percent to AI acquisitions. Industry analyst firm IDC predicts artificial intelligence will grow to be a $47 billion market by 2020, with a CAGR of 55 percent. Of that market, IDC forecasts companies will spend some $18 billion on software applications, $5 billion on software platforms, and $24 billion on services and hardware.
With this level of investment, if your business doesn’t already have a strategy to incorporate AI or machine learning (ML) into your development efforts by 2019, then you risk irrelevancy.

The AI race is heating up

Google, Amazon, and Facebook lead the AI race, with Microsoft Corp. investing a lot of time and resources to catch up. These companies already have thousands of researchers on staff and billions of dollars set aside to invest in capturing the next generation of leading data scientists — giving them a huge head start over the rest of the market. For example:

  • Of Google’s 25,000 engineers, currently only a few thousand are proficient in machine learning — roughly 10 percent — but Jeff Dean, Google Senior Fellow, would like that number to be closer to 100 percent.
  • In its first year of operation, the AI and Research group at Microsoft grew by 60 percent through hiring and acquisitions.
  • Over 750 Facebook engineers and 40 different product teams are using a piece of software called FBLearner Flow, which helps them leverage AI and ML. The company has trained more than a million models on the new software.
These tech giants are just a few of the serious artificial intelligence contenders in the market today. There is only so much talent to go around, making it hard and very expensive for smaller companies to attract and retain the skilled workers required to make their AI dreams a reality. This doesn’t just impact recruiting efforts, but also the time required to conduct new employee onboarding, training, and supervised learning to effectively scale AI programs.

Most companies lack the connected, analytical infrastructure and general knowledge needed to apply AI and ML to their fullest extent. Engineers must be able to securely access data without having to deal with multiple layers of authentication, which is often the case if a company has several siloed data warehouses or enterprise resource planning application systems. Before IT leaders attempt to deploy an enterprise-wide AI strategy, they must be able to bring large data sets together from several disparate and varied data sources into a centralized, scalable, and governed data repository.

An artificial intelligence services marketplace is developing

While it’s clear that the use of AI is becoming more prominent, not all companies have the IT budgets needed to recruit the talent required to build AI-fueled applications in-house. What we can expect to see more immediately, then, is the emergence of an AI services marketplace. We’re already seeing examples of this: many companies are beginning to offer AI self-service tools that are both easier for non-data scientists to use and less expensive to acquire. Much like mobile app stores, these new AI marketplaces will resell specialized AI services and algorithms that companies can instantly buy and implement within their businesses.
This model makes it easier for companies with a more modest budget to keep some skin in the game and remain competitive in the race for AI. The post Businesses must integrate Artificial Intelligence (AI) now or fall further behind appeared first on Talend Real-Time Open Source Data Integration Software.
Categories: ETL

Food delivery service built on Decoupled Drupal Commerce with React Native App

Drupal - Mon, 01/28/2019 - 06:32
Completed Drupal site or project URL: https://www.eldumrett.is/

Eldum Rétt is an Icelandic subscription service that delivers food boxes, complete with recipes and meal plans for home cooking, to private households. The idea is to provide extremely flexible and varied offerings for customers of all family sizes that can be ordered and managed completely online, through a website and mobile apps. Eldum Rétt is the market leader in Iceland, with thousands of customers and an annual turnover of close to 7 million US dollars. As a customer-centric company, Eldum Rétt has very complex requirements for its online services. Customers need to be able, for example, to:
  • individually configure their food boxes with recipes from a selected list
  • adjust the number of portions that they want to cook
  • select the days of the week they want food delivered for
  • manage their delivery times and places
  • change their subscription settings, e.g. to pause deliveries during absences
The original website was built in 2014 on Drupal 7 with Drupal Commerce and wasn’t flexible enough to sustainably accommodate all these requirements. So the decision was made to move the site to Drupal 8 and Drupal Commerce 2, and to add a React Native mobile app that communicates with the main Drupal-driven website. The project went live in November 2018.
Categories: CMS

New Liferay Project SDK Installers 2019.01.21

Liferay - Fri, 01/25/2019 - 01:26

The new release of the Liferay Project SDK and Studio Installers has been made available today. This new package supports Eclipse Photon or greater.

Download

Customers can download all of the installers on the customer studio download page.

Community downloads: https://community.liferay.com/project/-/asset_publisher/TyF2HQPLV1b5/content/ide-installation-instructions

Release highlights

Installers Improvements:

  • Bundle latest Liferay Portal 7.1.1 GA2 in LiferayProjectSDKwithDevStudioCommunityEdition installers
  • Support userHome parameter in command line mode
Development Improvements:
  • Update gradle plugin buildship to latest 3.0.2
  • Better support for liferay workspace
  • Watch support improvements
  • Better deployment support for Liferay 7
    • support Liferay 7.1 CE GA2 Tomcat and Wildfly
    • integration of Blade CLI 3.4.1
    • support Liferay Workspace Gradle 1.10.13
  • Miscellaneous bug fixes
Feedback

If you run into any issues or have any suggestions, please come find us on our community forums or report them on JIRA (IDE project); we are always around to try to help you out. Good luck!

Yanan Yuan 2019-01-25T06:26:00Z
Categories: CMS, ECM

New Liferay Project SDK Installers 2018.11.4

Liferay - Fri, 01/25/2019 - 00:57

Key Features

  • Watch improvements on gradle liferay workspace
  • Add Target Platform in new liferay workspace wizard
  • Bug fixes

 

Upgrade From previous 3.x

  • Download updatesite here
  • Go to Help > Install New Software… > Add…
  • Select Archive... Browse to the downloaded updatesite
  • Click OK to close Add repository dialog
  • Select all features to upgrade then click > Next, again click > Next and accept the license agreements
  • Finish and restart to complete the upgrade
About Watch

Users can find a Liferay workspace node under their servers when they develop on a Gradle Liferay workspace. With this, users can watch a large number of projects at the same time.

Target Platform in Wizard

Dependency management for Target Platform is enabled by default when you create a Gradle Liferay workspace.

Yanan Yuan 2019-01-25T05:57:00Z
Categories: CMS, ECM

Three traps that stifle modern enterprise integration

SnapLogic - Thu, 01/24/2019 - 13:46

A string of good decisions can easily propel a garage-born startup to greatness. But a few bad ones can bring even the most formidable organization to its knees. The key to minimizing costly decisions while maximizing profit-yielding ones is to make data the bedrock of your decision-making. Top companies have been making data-driven decisions for[...] Read the full article here. The post Three traps that stifle modern enterprise integration appeared first on SnapLogic.

Categories: ETL

Mastering Data and Self-Development at Lenovo

Talend - Thu, 01/24/2019 - 11:12

In 2012, when I worked at Lenovo, the company set out on a journey to create the Lenovo Unified Customer Intelligence (LUCI) platform. The decisions we made with regard to the people and technology involved in that project helped to shape my self-development, my relationships with others on the team, and my relationship with executives. Data management leaders today are still facing a problem that has been around for years: How do we create systems and processes to move, transform, and deliver trusted data insights at the speed of business?

To provide an understanding of how we solved this problem at Lenovo, it would help if I shared a bit of background about myself. I come from a non-traditional background, in that data and analytics is not where I started. My first position at Lenovo was as a digital analytics implementation manager, responsible for ensuring that all of the data collected at Lenovo.com integrated with digital solutions. I used quality assurance programs to establish trust. My web analytics team quickly realized that in order to create the value we wanted, we would need to integrate with many online and offline data sources.

Building Your Data Team & Self-Development

This realization, and the understanding that the team needed to build a new kind of platform, was the beginning of a multi-year self-development journey. As I began to evaluate our needs against our internal platforms, I realized that none of them were capable of supporting our key requirements.

“This realization and the understanding that the team needed to build a new kind of platform was the beginning of a multi-year self-development journey.”

We needed an analytical platform that supported batch and streaming on 10-plus terabytes per year. We chose Tableau, R, and Python for the analytics layer and leveraged Amazon Web Services cloud databases for the storage layer. But we still needed to make a decision on the data integration layer.
The 80/20 rule of data management came to mind: I refused to accept that 80% of our time would be spent on data wrangling and 20% on analysis. Our program had more than 60 endpoints, and change management needed to occur within one business day. We wanted 30% of our resources focused on data wrangling and 70% focused on business intelligence (BI) and analytics. To achieve this, we selected Talend for our integration technology, established one-week agile sprints, and leveraged our people to be integrators, implementers, and administrators.

Building the IT and Business Relationship

Organizational support for data integration was decentralized and often leveraged different vendors. It was considered an IT function and put in the background, several layers removed from the business. I wanted to grow my team, and the only way I could do this was by creating value with business stakeholders. At this point, I created data architect roles that would become masters of their domains and cross-trained in others. These roles would be business-facing, so that architects would work directly with stakeholders. They were responsible for architecting, developing, and maintaining their own data solutions. A single data solution, such as the voice-of-the-customer pipeline, could have more than 10 data sources, structured and unstructured data, varying volumes and velocities, translation and natural language processing loops, and multiple analytics and visualization outputs. Empowering data architects over such a large scope enabled them and the business to move at the pace needed for success. Working hand-in-hand, analysts began to understand the data wrangling processes, improving both their performance and quality. Most important, it helped them understand the value of an efficient data integration team.
Relationships with Executives

Business executives didn’t understand, nor were they interested in understanding, how a good data management practice can help drive the business forward. The first two to three years of my role were focused on delivering insights more efficiently. We tackled challenges such as a dashboard that required six people working for a month, copying and pasting in Excel, to give an executive a view once a month. We got that down to half a person, automated, daily, and with quality checks. These wins gave us the credibility and momentum to connect data sets in different ways and to experiment with new analytics models. The larger business impacts and analytical wins came after we had a strong data integration and management practice. Today, many of those business executives understand what ETL is and why it’s important for their business.

Some of my key learnings throughout this experience have been to drive a sense of ownership and business accessibility with the data architect function. The most important was to help my team understand the “why”. Oftentimes the “why” of a business case is return on investment (ROI). I vigorously enforced that the architects and engineers had to articulate how their actions were impacting a business objective, regardless of how far removed they were from the problem. This focus on ROI, understanding the why, empowering technical resources that interface with the business, and giving more end-to-end ownership of these data processes are, in my opinion, the keys to building a successful data integration practice. The post Mastering Data and Self-Development at Lenovo appeared first on Talend Real-Time Open Source Data Integration Software.

Categories: ETL

Liferay IntelliJ Plugin 1.2.1 Released

Liferay - Thu, 01/24/2019 - 03:12

The latest release of the Liferay IntelliJ plugin, 1.2.1, has been made available today. Head over to this page to download it. Release Highlights:

  • Wizards
    • Added new module ext wizard
    • Add target platform option in new liferay workspace wizard
  • Editor Improvements
    • code completion
      • java bean for Liferay Taglib
      • more OSGi component properties
      • additional hints for model-hints xml files
      • more resource bundle keys for Liferay Taglib
    • better support for  bnd.bnd files
  • Support quick fix for gradle dependencies
  • Update embedded blade to 3.4.1
  • Bug Fixes
Wizards

Users can find the new wizard by clicking File > New > Liferay Module. A target platform option has been added to the new Liferay Gradle workspace project wizard.

Using Editors

Quick Fix

Quick fix is enabled by default if a target platform has been set.

Known Tickets

INTELLIJ-34

Special Thanks

Thanks to Dominik Marks for the improvements.

Feedback

If you run into any issues or have any suggestions, please come find us on our community forums or report them on JIRA (INTELLIJ project); we are always around to try to help you out. Good luck!

Yanan Yuan 2019-01-24T08:12:00Z
Categories: CMS, ECM

The true value of data

SnapLogic - Wed, 01/23/2019 - 14:01

Previously published on networkcomputing.co.uk.  Joining the likes of agile and innovative, the term data-centric has become part of the modern business lexicon. Whilst the most cynical might consider data-centricity merely a buzz term, the facts are undeniable: businesses of all sizes are sitting on swelling data stores, with valuable business insights hidden within. More data[...] Read the full article here. The post The true value of data appeared first on SnapLogic.

Categories: ETL

Are APIs becoming the keys to customer experience?

Talend - Wed, 01/23/2019 - 12:51

In recent years, APIs have encouraged the emergence of new services by facilitating collaboration between the applications and databases of one or more companies. Beyond catalyzing innovation, APIs have also revolutionized the customer-company relationship, giving companies an accurate and detailed picture of the consumer at a time when a quality customer experience counts as much as the price or capabilities of the product.

APIs: A Bridge Between the Digital and Physical World

Over the years, customer relationship channels have multiplied, with consumers able to interact with their brands through stores, voice, email, mobile applications, the web, or chatbots. These multiple points of interaction have made the customer journey more complex, forcing companies to consider data from many channels to deliver the most seamless customer experience possible. To do this, they must synchronize data from one channel to another and cross-reference it with the customer’s history with the brand. This is where APIs come into play: these interfaces allow data processing that refines customer knowledge and delivers a personalized experience. Thanks to a 360° customer view, the digital experience can be extended in store; the API acts as a bridge between the digital and physical world.

APIs also allow organizations to work with data in a more operational way, and especially in real time. However, many companies still treat their loyal customers as if they’ve never interacted before. It is therefore not uncommon for a customer to have to repeat information across several requests or to retrace the history of previous interactions, which can seriously damage the customer relationship. The challenge for companies is to deliver a seamless, consistent, and personalized experience through real-time analysis. This will provide relevant information to account managers during an interaction and guide them to the next best action to take, in line with the client’s expectations.
Even better, with APIs, we can predict the customer’s buying behavior and suggest services or products that meet their needs. Indeed, with the data collected, and thanks to the use of artificial intelligence, cross-tabulations and instant analysis make it possible to refine the selection and offer an increasingly relevant and fluid experience, increasing customer loyalty and thus the economic performance of companies.

The Importance of APIs with GDPR

Recently, there has been a trend toward empowering consumers to control their data, after new regulations such as the European Payment Services Directive (PSD2) and GDPR came into force in May 2018. What do they have in common? They both give individuals control over their personal data, with the ability to request, delete, or share it with other organizations. Thus, within the framework of PSD2, it is now possible to manage your bank account or issue payments through an application that is not necessarily your bank’s. Through this, APIs give companies the opportunity to offer their customers a dedicated portal for managing their data autonomously, and to offer new, innovative payment services. For their part, companies will be able to better manage governance and the risk of fraudulent access to data. With an API, a company can proactively detect abnormal or even suspicious data access behavior in near real time.

APIs are the gateways between companies and their business data, and they answer real market needs around customer experience. However, many organizations have not yet understood the importance of implementing an API strategy, an essential part of digital transformation, alongside the cloud and the emergence of increasingly data-driven organizations. APIs are the missing link between data and customer experience, a key that companies need to start using.

Ready to Learn More?
<< Watch the webinar on-demand “APIs for Dummies” >> The post Are APIs becoming the keys to customer experience? appeared first on Talend Real-Time Open Source Data Integration Software.

Categories: ETL

SnapLogic’s journey to becoming more data-driven

SnapLogic - Tue, 01/22/2019 - 16:17

Data is an important asset for every organization and empowers leaders to make fully informed business decisions. However, it has also become a major barrier for those who analyze the health of their organization and seek opportunities to improve business. In part one of my two-part series, I will detail how SnapLogic overcame these barriers[...] Read the full article here. The post SnapLogic’s journey to becoming more data-driven appeared first on SnapLogic.

Categories: ETL

IntelliJ IDEA - Debug Liferay 7.1

Liferay - Tue, 01/22/2019 - 03:22

Problem:

Running Liferay 7.1 from IDEA in Debug mode throws the following error by default:

java.lang.NoClassDefFoundError: com/intellij/rt/debugger/agent/CaptureStorage

However, Run mode works without any issues.

 

Fix:

Disable Instrumenting agent in "Settings | Build, Execution, Deployment | Debugger | Async Stacktraces":

Hope this helps :)

Vitaliy

Vitaliy Koshelenko 2019-01-22T08:22:00Z
Categories: CMS, ECM

Liferay Portal 7.1 CE GA3 Release

Liferay - Mon, 01/21/2019 - 14:40
Overview

New Features Summary
  • Oracle OpenJDK 11 -  GA3 has been tested for use with Oracle OpenJDK 11.  For more information on JDK 11 support in Liferay CE/DXP please see the JDK Roadmap post.  Also check the known issues section below for an issue and workaround related to JDK 11.  
  • Clustering Returns - GA3 now includes clustering support out of the box once again.  For more information see this announcement for clustering.  Also see the official documentation for updated info on configuring clustering.  
  • Liferay Hypermedia REST APIs - We recently announced a beta release for Liferay Hypermedia REST APIs.  In addition to Liferay DXP, Liferay Portal 7.1 CE GA3 now supports the beta release.  For more information see the official Liferay Hypermedia REST APIs site.
Documentation

Official documentation can be found on the Liferay Developer Network. For information on upgrading, see the Upgrade Guide.

Bug Reporting

If you believe you have encountered a bug in the new release, you can report your issue on issues.liferay.com, selecting the "7.1.2 CE GA3" release as the value for the "Affects Version/s" field.

Known Issues
  • LPS-86955: Use alternate browser from IE 11.  
  • LPS-88877: Remove license file from lpkg file and deploy application manually. A more permanent fix is coming to GA4.  
  • LPS-87421: Set properties included in ticket.
Getting Support

Support is provided by our awesome community. Please visit our community website for more details on how you can receive support. Liferay and its worldwide partner network also provide services, support, training, and consulting around its flagship enterprise offering, Liferay DXP. Also note that customers on existing releases such as 6.2 and 7.0 continue to be professionally supported, and the documentation, source, and other ancillary data about these releases will remain in place.

Kudos

A thanks goes out to our engineering and QA teams, who spent countless hours developing, testing, translating, and writing documentation to ensure a stable release that meets and hopefully exceeds expectations!

Jamie Sammons 2019-01-21T19:40:00Z
Categories: CMS, ECM

Custom Fields and Profiles online training - January 22nd, 10 am MT

CiviCRM - Fri, 01/18/2019 - 19:21

Would you like to gather specific information about your contacts, such as their dietary preferences when they register online for an event or industry information when an organization signs up for a membership? Or maybe you need to create a newsletter sign-up form for your website.

Categories: CRM

How AB InBev is Using Data to Brew up the Best Customer Experience

Talend - Fri, 01/18/2019 - 16:01

AB InBev, headquartered in Belgium, is one of the largest fast-moving consumer goods (FMCG) companies in the world, with a diverse portfolio of well over 500 beer brands, including Budweiser, Corona, Stella Artois, Beck’s, Hoegaarden, and Leffe. When companies grow via external acquisitions, integrating the systems and data from acquired companies is always a challenge. For AB InBev, that challenge included a hybrid environment with both on-premises and cloud systems and a host of brewers operating as independent entities with their own internal systems. Also, like other alcoholic beverage producers, AB InBev must abide by strict regulations on gathering consumer information.

Integrating systems and data from acquired companies

AB InBev wanted to embark on a cloud journey, and Talend was built in that world. Talend extracts data from over 100 source systems (real-time and batch, cloud and on-premises, ERP systems, data from IoT devices) and stores it in a data lake on Microsoft Azure. All data management work has to be done for multiple companies under the AB InBev umbrella, and among the biggest benefits of the new IT architecture are simplification and reusability of the processes that rapidly extract and provide access to data.

Selling the best beers and making people happy

Because AB InBev is leveraging reusable code, what used to take six months now takes six weeks. That translates into faster decisions. Now, with Talend Data Preparation, internal users spend only about 30 percent of their time gathering data and can spend 70 percent analyzing it. Data helps the company understand drinker tastes, analyze new consumer demands (for low-calorie beers, for example), and determine seasonal beer preferences. Data also helps improve store and bar experiences, supply chain optimization, product development, and more.
Learn more  The post How AB InBev is Using Data to Brew up the Best Customer Experience appeared first on Talend Real-Time Open Source Data Integration Software.

Categories: ETL

AI adoption: When will it be too late for you?

SnapLogic - Thu, 01/17/2019 - 15:49

Previously published on information-age.com.  While not every company has ambitions to be the next Amazon, none wants to be Sears. Yet to avoid the fate of the fallen, organizations must, to a certain extent, emulate the world’s most dominant firms. A core trait shared by industry titans like Amazon, Netflix, and Google worth imitating is the[...] Read the full article here. The post AI adoption: When will it be too late for you? appeared first on SnapLogic.

Categories: ETL

Session Storage is Evil

Liferay - Thu, 01/17/2019 - 13:19
Introduction

Folks who know me know that one of my favorite soapbox rants is about HTTP and/or Portlet session storage.

Recently my friend Kyle Stiemann wrote a blog about Session Storage in Liferay, and he reached out to me to proof the post before it was published. While it was really well written and informative, I must admit I didn't want it published. Providing a convenient place showing how to use sessions, even with all of the warnings, seemed to me almost like promoting session storage. I am very much against session storage, but could only find one other reference that shares my opinion: https://dzone.com/articles/rules-thumb-dont-use-session

Since I hadn't really made my rant public before, and since I've been getting questions lately about session usage, I thought it was about time to make my case publicly so it's available for all to see (or trash, as the case may be).

Session Storage is Evil

There, I said it. Session storage is evil. I'll go even farther: if a developer uses session storage of any kind, it demonstrates that the developer either doesn't care about the runtime impact or doesn't know about the runtime impact, and I'm not really sure which is worse.

Session Storage as the Siren's Song

For developers, session storage is akin to a siren's song. Sailors, hearing the siren's song, would steer their ships onto the rocks, leading to destruction and death. Session storage is the same for developers. It is so darn easy to use. It has been part of the javax.servlet.http.HttpServletRequest class since the very first release. There are tons of examples of session storage online for developers to reference. It is presented to new Java developers who are learning to build servlets. And compared to other temporary unstructured data stores, it is so simple. So it definitely has its allure. How, then, can it be evil?

How Session Storage is Evil

Session storage is not evil from a developer's perspective, but it is absolutely evil from deployment and runtime perspectives. Here's a short list of why session storage is evil:
1. Session Storage Consumes Server Resources

Although this may sound obvious, it may surprise developers if load and capacity were not considered during development. As a developer, it might seem trivial to store a list of objects retrieved for the user in their session. The code is easy, and unit and system testing will not reveal any obvious defects. Problems may only surface during load testing.

Let's consider a calendar implementation. Imagine a system where, when a user logs in, their list of upcoming calendar events is retrieved and stored in their session. The idea is that this offers a significant performance boost by not retrieving the data from the database every time the user refreshes the page. Such a system would be easy to code and easy to unit and system test. After all, we're going to do our testing with some relatively small number of events, so the session storage aspect will work out fine and performance will be great.

Now say an event averages 250 bytes, and the average user has 20 events on their calendar at any given time. Rough math gives us about 5 KB per user. Since session storage is in-memory only, further rough math says this system will accommodate about 200 users per MB of memory. These kinds of numbers define our capacity for mostly concurrent users at any given time. If the averages increase, e.g. you add a description string to the event and events grow to an average of 500 bytes each, your capacity is cut in half. And it is "mostly concurrent" because session storage is only reclaimed if a) the user logs out or b) the user's session times out. You cannot expect that every user will always log out; in fact you should plan on the worst-case scenario that users never log out and their 5 KB of calendar events will remain in memory until the session expiration timeout.
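The back-of-the-envelope capacity math above can be sketched as a tiny helper. This is a toy calculation under the article's illustrative averages (250 bytes per event, 20 events per user, decimal megabytes), not a real sizing tool; the class and method names are made up for this example.

```java
// Toy session-capacity estimate from the calendar example above.
public class SessionCapacity {

    // Approximate number of users whose session data fits in one (decimal) megabyte.
    static long usersPerMegabyte(long bytesPerEvent, long eventsPerUser) {
        long bytesPerUser = bytesPerEvent * eventsPerUser; // e.g. 250 * 20 = 5 000 bytes
        return 1_000_000L / bytesPerUser;
    }

    public static void main(String[] args) {
        System.out.println(usersPerMegabyte(250, 20)); // roughly 200 users per MB
        System.out.println(usersPerMegabyte(500, 20)); // doubling event size halves capacity: 100
    }
}
```

Run the same arithmetic against your own object sizes and heap budget and you get a hard ceiling on "mostly concurrent" users long before any load test tells you.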
Factoring these things together, you can start to see how session data can consume system resources and negatively affect node capacity.

2. Session Storage Is Implemented Using Object Serialization

All objects stored in the session must implement java.io.Serializable. On the surface, that might not seem like a big deal, and for some use cases it isn't. If you control the classes you will be pushing into the session, serializability is easy to include. The problem comes when you do not control the classes: maybe they come from another team in your organization, from a third-party library, or from Liferay itself. When you don't have control over the classes, you may not be able to make them serializable, so they might not be compatible with session storage.

And honestly, developers are really, really bad about implementing serialization. I guarantee that few developers actually follow the best practices for using Serializable. If you think you're the exception, check out http://www.javapractices.com/topic/TopicAction.do?Id=45 or https://www.javacodegeeks.com/2010/07/java-best-practices-high-performance.html or https://howtodoinjava.com/java/serialization/a-mini-guide-for-implementing-serializable-interface-in-java/ for good serialization usage, then compare against your code (or Liferay's) to see if you can find an instance where serialization is implemented according to best practices.

Did you know that serialized data is not really secure? Serialized objects capture the data the instances contained, but by default it is not encrypted at rest. Look for the .ser files from Tomcat after session storage to see whether your data is exposed. Serialized data has other issues too: OWASP defines a vulnerability inherent in deserializing untrusted data: https://www.owasp.org/index.php/Deserialization_of_untrusted_data.
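To make the best-practice checklist concrete, here is a minimal sketch: an explicit serialVersionUID, a transient field for state that should not be serialized, and validation in readObject() so deserialized bytes are not blindly trusted. CalendarEvent is a hypothetical class for this example, not anything from Liferay.

```java
import java.io.*;

// Sketch of the serialization best practices referenced above (illustrative only).
public class CalendarEvent implements Serializable {
    private static final long serialVersionUID = 1L; // declare explicitly; never rely on the default

    private final String title;
    private final long startMillis;
    private transient Object uiCache; // transient: excluded from the serialized form

    public CalendarEvent(String title, long startMillis) {
        this.title = title;
        this.startMillis = startMillis;
    }

    // Validate on the way back in -- serialized bytes may have been tampered with.
    private void readObject(ObjectInputStream in) throws IOException, ClassNotFoundException {
        in.defaultReadObject();
        if (title == null || startMillis < 0) {
            throw new InvalidObjectException("corrupt CalendarEvent");
        }
    }

    public String title() { return title; }

    // Round-trip helper: note the bytes written here are NOT encrypted at rest.
    public static CalendarEvent roundTrip(CalendarEvent e) {
        try {
            ByteArrayOutputStream bytes = new ByteArrayOutputStream();
            try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
                out.writeObject(e);
            }
            try (ObjectInputStream in =
                     new ObjectInputStream(new ByteArrayInputStream(bytes.toByteArray()))) {
                return (CalendarEvent) in.readObject();
            }
        } catch (IOException | ClassNotFoundException ex) {
            throw new RuntimeException(ex);
        }
    }

    public static void main(String[] args) {
        System.out.println(roundTrip(new CalendarEvent("Standup", 0L)).title());
    }
}
```

Even this "well-behaved" class illustrates the point: the round-tripped bytes contain the field data in the clear, which is exactly what ends up in Tomcat's .ser files.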
If you are set up to persist session data across restarts, the reality is that this data must be considered untrusted, because there is no guarantee that the serialized data has not been tampered with. Finally, as serialization is seen as the source of many Java security defects, there are reports that Oracle plans to drop support for it: https://www.bleepingcomputer.com/news/security/oracle-plans-to-drop-java-serialization-support-the-source-of-most-security-bugs/. When and if this happens, it will likely force changes in how session storage is handled.

3. Session Data Is Bound to One Node

Session data is normally stored on a single node. If you only have a single production application server, then no problem: the stored data is where it needs to be. But in a cluster, data stored in a session on one node is not normally available across the cluster. Without some kind of intervention in the deployment, session data for a user is only available if their request ends up back on the same node; if they land on a different node, the session data is not there. Switching nodes can happen automatically if the node the user stored the data on crashes or is shut down.

In an out-of-the-box session configuration in Tomcat, sessions can be used to store data, but this data is not persisted across restarts, so anything in the session is lost if the node stops or crashes; even if you can restart a failed node, the session data is gone. You can configure Tomcat to persist session data across restarts, but even then, while the node is down the session data is unavailable, since it is bound to that specific node. Plus you still have all of the serialization issues from #2 above to deal with.

The first wave of Java shopping cart implementations used session storage for all of the cart items.
It was super easy to use as a temporary, disposable store of data not worthy of longer-term persistence. But customers had a hard time with these carts because their cart items would sometimes disappear. This happened whenever the customer got switched to a new instance: the node crashed, the load balancer redistributed traffic, or the node was taken down for maintenance.

4. Session Replication Is Costly and Consumes Resources

One solution for losing the node with all of the session data was to introduce session replication, which copies data stored in a session to the other nodes in the cluster. Since this is not an out-of-the-box solution for most app servers, it requires additional server(s) and software. There is no standard for session replication, so each offering is custom and leads to lock-in. Often these additional costs are not planned for at the start of a project; they crop up at the end, when the operations team is trying to fix an implementation that was broken by its use of session storage, so you get a surprise end-of-project implementation cost.

Once the replication pieces are in place, there is still ongoing overhead. Every session data update results in additional network traffic to share the update. In some cases replication copies to all nodes (a mesh), which carries a large amount of overhead; in other cases session storage is centralized to minimize network overhead. Either way, operationally you are adding another possible point of failure to the infrastructure. What happens if your session data container crashes? How will your application recover? Is it even operable at that point? What are the disaster recovery concerns? How do you fail over gracefully? In a distributed cluster, how do you handle latency or network issues between regions?
How do you monitor the availability of your session replication infrastructure? How do you debug issues arising from session replication? Will existing session data be available to a new node coming online, or is there some amount of syncing that needs to be done? As developers, we often don't have to think about these kinds of issues. But I guarantee that they exist and must be planned for from an operations perspective. Remember the math example from item #1 above? For session replication, all of that math gets multiplied by the number of nodes deployed in the cluster. The replication solution needs to be able to store as much session data as is generated by X nodes, each under peak load; anything less could potentially lead to data loss. And the "mostly concurrent" sessions left behind by users who never log out manually are an additional factor that affects the sizing of your replication solution.

5. Sticky Sessions Unbalance Resource Consumption.

Another option often used with session storage is sticky sessions. With sticky sessions, the load balancer is configured to send traffic originating from the same host to the same target node. This ensures that a user will have access to the data stored in their session. It is the lightest-weight solution for stored session data since it requires no additional hardware or software, but it has its own serious drawbacks. If the node crashes or is taken out, the user loses access to the session data and the UX will not be good: the load balancer will switch the user to another node, but the data in the session is still not available. And while the node is up, all traffic originating from the same origin will always go to the same node, so in an autoscaling scenario sticky sessions work against being able to distribute traffic amongst the nodes.
If you have a two-node cluster and both nodes are saturated, a newly spun-up third node will only receive requests from new origins; the two saturated nodes will remain saturated because the sticky session logic binds origins to nodes.

Ideal Solution

So what's the ideal solution? Avoid session storage altogether. Seriously, I mean that. The benefits are tremendous:

  • No resource consumption for session data.
  • No additional network chatter to broadcast session data for replication.
  • Load balancing is able to shift load across the cluster based on capacity.
  • No additional costs in sizing the nodes or session replication solutions.
  • No autoscaling issues.
  • No security concerns.
  • No lingering data waiting for session timeouts.
  • No developer effort spent correctly implementing the Serializable interface for internal and external code.
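As a back-of-the-envelope illustration of the sizing point from item #4 (all numbers here are hypothetical, chosen only for illustration), here is roughly how the replication math multiplies with cluster size:

```java
public class SessionSizing {
    public static void main(String[] args) {
        // Hypothetical deployment numbers; substitute your own measurements.
        long activeSessions = 10_000;       // peak concurrent sessions per node
        long bytesPerSession = 50 * 1024;   // average serialized session size (50 KB)
        int nodes = 4;                      // cluster size

        long perNode = activeSessions * bytesPerSession;
        // Mesh replication: every node also holds every other node's sessions,
        // so the storage requirement multiplies by the node count.
        long perNodeWithReplication = perNode * nodes;

        System.out.printf("Per node without replication: %d MB%n",
                perNode / (1024 * 1024));                 // 488 MB
        System.out.printf("Per node with mesh replication: %d MB%n",
                perNodeWithReplication / (1024 * 1024));  // 1953 MB
    }
}
```

Four nodes turn a manageable ~488 MB of session data into nearly 2 GB held on every node, before accounting for the lingering sessions of users who never log out.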
Most often the pushback I get on this comes from a developer who needs to stash temporary, unstructured data for a short period of time. If session storage is taken off the table, what is left? The database, of course. If your data is a Java object graph, you can marshal it into JSON or XML and store the string in the database if it must be completely unstructured. Or you could use a NoSQL solution like MongoDB, or even Elasticsearch, to hold the data. For wizard-like forms, you can carry form field values forward in hidden fields, letting the temporary data live in the client's browser instead of on your application server. There are plenty of solid, cluster-friendly ways to carry this data around. All it takes is good architecture and design, and the general desire to avoid the evil that comes with session storage... If you are advocating session storage, consider the items above. If a coworker is using session storage, call them out on it as soon as possible. If a potential candidate proposes using sessions to store data, question whether the candidate understands the runtime issues that plague session storage. If a contractor wants to use session storage, get a new contractor. Follow the advice of Odysseus: fill your ears with beeswax and avoid the Sirens' song sung by session storage... David H Nebinger 2019-01-17T18:19:00Z
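A minimal sketch of the database-backed alternative described above (class, method, and key names are all hypothetical, and a HashMap stands in here for a real database table keyed by user id): the object graph is marshaled to a JSON string that any node in the cluster can read back.

```java
import java.util.*;

public class CartStore {
    // In a real deployment this map would be a database table keyed by
    // user id (or a MongoDB/Elasticsearch document store); a HashMap
    // stands in so the sketch is self-contained.
    private final Map<String, String> table = new HashMap<>();

    // Marshal the cart item list into a JSON array string. A real
    // implementation would typically use a JSON library instead.
    static String toJson(List<String> items) {
        if (items.isEmpty()) {
            return "[]";
        }
        StringJoiner sj = new StringJoiner("\",\"", "[\"", "\"]");
        items.forEach(sj::add);
        return sj.toString();
    }

    void save(String userId, List<String> items) {
        table.put(userId, toJson(items));
    }

    String load(String userId) {
        return table.getOrDefault(userId, "[]");
    }

    public static void main(String[] args) {
        CartStore store = new CartStore();
        store.save("user42", List.of("SKU-1001", "SKU-2002"));
        // Any node in the cluster can read the cart back from the store:
        System.out.println(store.load("user42")); // prints ["SKU-1001","SKU-2002"]
    }
}
```

Because the cart lives in shared storage rather than in an HttpSession, it survives node crashes, restarts, and load balancer reshuffles, with no replication or sticky-session machinery required.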
Categories: CMS, ECM

Data skills – Many hands make light work in a world awash with data

Talend - Thu, 01/17/2019 - 12:56

While the transformation to a data-driven culture needs to come from the top of the organization, data skills must permeate through all areas of the business.

Rather than being the responsibility of one person or department, assuring data availability and integrity must be a team sport in modern data-centric businesses. Everyone must be involved and made accountable throughout the process.  

The challenge for enterprises is to effectively enable greater data access among the workforce while maintaining oversight and quality.

The Evolution of the Data Team

Businesses are recognizing the value and opportunities that data creates. There is an understanding that data needs to be handled and processed efficiently. For some companies, this has led to the formation of a new department of data analysts and scientists.

The data team is led by a Chief Data Officer (CDO), a role that is set to become key to business success in the digital era, according to recent research from Gartner. While earlier iterations of roles within the data team centered on data governance, data quality and regulatory issues, the focus is shifting. Data analysts and scientists are now expected to help deliver a data-driven culture across the company while also driving business value. According to the Gartner survey, the skills required for roles within the data team have expanded to span data management, analytics, data science, ethics, and digital transformation. Businesses are clearly recognizing the importance of the data team's functions and are making significant investments in it. Budgets for the data team increased by an impressive 23% between 2016 and 2017, according to Gartner. What's more, some 15% of the CDOs that took part in the study revealed budgets of more than $20 million for their departments, compared with just 7% who said the same in 2016. The increasing popularity and evolution of these new data roles has largely been driven by GDPR in Europe and by new data protection regulations in the US. And the evidence suggests that the position will be essential for ensuring the successful transfer of data skills throughout businesses of all sizes.

The Data Skills Shortage

Data is an incredibly valuable resource, but businesses can only unlock its full potential if they have the talent to analyze that data and produce actionable insights that help them better understand their customers' needs. However, companies are already struggling to cope with the big data ecosystem due to a skills shortage, and the problem shows little sign of improving. In fact, Europe could see a shortage of up to 500,000 IT professionals by 2020, according to the latest research from consultancy firm Empirica.
The rapidly evolving digital landscape is partly to blame, as the skills required have changed radically in recent years. The data science skills needed at today's data-driven companies are more wide-ranging than ever before. The modern workforce is now required to have a firm grasp of computer science, covering everything from databases to the cloud, according to strategic advisor and best-selling author Bernard Marr. In addition, analytical skills are essential to make sense of the ever-increasing data gathered by enterprises, while mathematical skills are also vital, as much of the data captured will be numerical, largely due to IoT and sensor data. These skills must sit alongside more traditional business and communication skills, as well as the ability to be creative and to adapt to developing technologies. The need for these skills is set to increase, with IBM predicting that the number of jobs for data professionals will rise by a massive 28% by 2020. The good news is that businesses are already recognizing the importance of digital skills in the workforce, with the role of Data Scientist taking the number one spot in Glassdoor's Best Jobs in America for the past three years, with a staggering 4,524 positions available in 2018.

Data Training Employees

Data quality management is a task that extends across all functional areas of a company. It therefore makes sense to provide employees in the specialist departments with self-service tools to ensure data quality. Cloud-based tools that can be rolled out quickly and easily in the departments are essential. This way, companies can gradually improve their data quality while also increasing the value of their data. With the number of data workers tripling and GDPR compliance now mandatory, businesses must treat good data management as a team sport.
Investing in the Chief Data Officer role and data skills now will enable forward-thinking businesses to reap the rewards, both in the short-term and further into the future. The post Data skills – Many hands make light work in a world awash with data appeared first on Talend Real-Time Open Source Data Integration Software.

Categories: ETL

More Tracking of Exports

CiviCRM - Thu, 01/17/2019 - 00:39
  CiviCRM does not natively log whenever an export of contacts occurs. This can be problematic for organisations with compliance and auditing obligations. In response, the Australian Greens have developed an extension that creates a log file for each CiviCRM domain in the ConfigAndLog directory and logs every time a CiviCRM user performs an export from within CiviCRM. The current log format is as follows:
Categories: CRM