Good Vibe: The Informatica of Things

Vibe Data Stream [Vibe] and the Virtual Data Machine [VDM] sit at the center of Informatica's Internet of Things strategy. They are aimed primarily at Machine-to-Machine [M2M] data and, by connecting through PowerCenter, ultimately at Machine-to-Human [M2H] data. The goal is to have VDMs residing in mobile devices, in sensor packages, or as part of sensor networks. At this point, however, VDMs require more processing power than is available in most such components, so Vibe and VDM are primarily suited today to data centers, network operations centers, and communication centers.

However, Informatica is seeing a broad range of use cases involving both large machines and sensor networks, from many different sectors including

  • telcos,
  • oil and gas,
  • financial services,
  • government,
  • data center operations, and
  • building services.

The Proof is Out There

One Proof of Concept [PoC] currently underway is with a Heating, Ventilation and Air Conditioning [HVAC] company. In the PoC, the HVAC company is streaming data from all of its installations. Using Informatica products, it is bringing this data into its data center for both streaming and batch analytics. Three use cases are being examined in this PoC:

  • Improving customer service
  • Internal analytics on generic patterns of use for improved design, reliability and maintainability
  • Predictive maintenance from the provider rather than from the building management team

Other field trials examine Vibe and VDM capabilities in Pub/Sub models working with Informatica Ultra Messaging, as well as persisting data in all forms of data stores, from traditional Enterprise Data Warehouses [EDW] to Hadoop [HDFS] and NoSQL databases such as Cassandra. These field trials address ongoing problems in the sectors mentioned above; a generic sketch of the subscribe-and-persist pattern follows the list.

  • In a financial services case, application log data and Financial Information eXchange [FIX] log data are pulled in, in real time, for market, order-flow and trade data.
  • For online retail, Vibe tracks visitor paths through a web site using log data.
  • For data center operations, efficiency is optimized for green IT, sustainability or the bottom line using log data from switches, servers, applications and call centers.
  • For one governmental agency, Informatica Vibe and VDM are maintaining the Service Level Agreement [SLA] in real time, for 800 separate field organizations over more than a million devices, using the industry-standard Security Content Automation Protocol [SCAP] data formats.
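
As a minimal, generic sketch of the subscribe-and-persist pattern described above, the Python below consumes streaming sensor messages and writes them to Cassandra. The messaging layer is a hypothetical stand-in (fake_message_stream), not the Ultra Messaging API, and the keyspace, table and column names are assumptions; only the DataStax Python driver calls are a real library API.

```python
import json
import time
import uuid

from cassandra.cluster import Cluster  # pip install cassandra-driver


def fake_message_stream(limit=10):
    """Hypothetical stand-in for a pub/sub subscription yielding JSON payloads."""
    for _ in range(limit):
        yield json.dumps({"sensor_id": "hvac-042", "ts": time.time(), "temp_c": 21.5})
        time.sleep(0.1)


def main():
    cluster = Cluster(["127.0.0.1"])        # assumed local Cassandra node
    session = cluster.connect("telemetry")  # assumed keyspace
    insert = session.prepare(
        "INSERT INTO readings (id, sensor_id, ts, temp_c) VALUES (?, ?, ?, ?)")

    # Persist each streaming message as soon as it arrives.
    for raw in fake_message_stream():
        msg = json.loads(raw)
        session.execute(insert, (uuid.uuid4(), msg["sensor_id"], msg["ts"], msg["temp_c"]))

    cluster.shutdown()


if __name__ == "__main__":
    main()
```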

Perhaps the most involved trials being done to date with Informatica Vibe and VDM are within the telecommunications space. As one might expect, the explosion of data and customer expectations as cellular goes from 2G to 3G to 4G/LTE requires real-time management of ever-increasing amounts of data. But the wireline/fiber and cable use cases are also exploding as the traditional markets of voice, entertainment and connectivity intertwine.

Out to the Edge

Informatica is aggressively working with partners, such as chip, sensor and package manufacturers, to understand how best to implement Vibe, whether through streaming collection capability on the device itself or as part of the larger infrastructure somewhere in the collection tier. Currently, collecting sensor data can hit performance limits imposed by the sensor or the underlying communication protocols. Thus, in the oil and gas industry, for example, Informatica is working with both vertical-specific sensor manufacturers and large organizations in the industry to determine how Vibe can supplement or even replace the collection tier.

SAE Fit

What Informatica brings to evolving sensor analytics ecosystems [SAE] is not only its specific technologies of Vibe and VDM, but the combination of these with a complete package supporting streaming analytics, operational intelligence, complex event processing [CEP], batch analytics, predictive analytics, reporting, data marts and EDW, through its existing technology families such as Ultra Messaging, PowerCenter, Master Data Management, Data Quality, and more, in both traditional and Cloud deployments. This brings mature market features to the SAE in the form of

  • Guaranteed delivery
  • Automated zero latency fail-over
  • Centralized GUI administration
  • No intermediary staging of data at source, broker, or target
  • Fail-over does not require shared file systems

References

This blog post is based upon both the Informatica press release referenced below and a private briefing from the Informatica team that allowed us to gather more information and get answers to our questions. Several of our other blog posts on IoT and Big Data are also referenced, for context.

  1. Informatica Press Release from Strata + Hadoop World
  2. What does IoT All Mean
  3. The IoT and Change
  4. Big Data: It’s Not the Size, It’s How You Use It
  5. New Hope from Big Data

Salesforce1

Back in July, I wrote:

[An] excellent example of the importance of the Industrial Internet comes from Salesforce.com's use of The Social Machine by Digi International and its Etherios business unit, bringing sensor data into customer relationship management [CRM] by allowing sensors embedded in industrial refrigerators, hot tubs, and heavy and light equipment of all types to open SFDC Chatter sessions and to file cases.

At Dreamforce 2013, Salesforce.com is announcing Salesforce1, its new Internet of Customers ecosystem, bringing together the Force.com, Heroku, and ExactTarget FUEL platforms under a unified set of APIs controlled by the Salesforce1 App.

Today and tomorrow, Dreamforce is all about the Internet of Things, and I'll be providing my analyses of how SFDC is building out its massive existing ecosystem of partners, services and customers into Marc Benioff's evolving vision of the Internet of the Customer. The message here is that Salesforce1 is ready today to prepare SFDC's customers to leverage the opportunities presented by the Internet of Things. As Cisco states, over a trillion dollars in added value was left on the table this year by companies not taking advantage of IoT. For 2014, SFDC's customers won't have an excuse to leave this money behind.

One challenge for Salesforce1 is its dependence on partners for analytics. Are SFDC partners ready to help bring the Internet of Customers to full potential through connected analytics? How will IBM's MQTT, Smarter Planet, and Cognitive Computing, Oracle's Device-to-Data-Center, Teradata's Hub for Monetizing the IoT, Infobright's M2M-optimized ADBMS, and many other data management & analytics initiatives focused on M2M and M2H data fit in?

Will Salesforce1 create, or be integrated into, Sensor Analytics Ecosystems, with the necessary marketplaces for raw data, processed data, and insights from M2M & M2H data? SFDC has never been up to the challenge of analytics in the past. While there are many general BI and analytics partners, SFDC-specific analytics firms have come and gone. Salesforce1 is a broader concept and brings SFDC into a future beyond salesforce automation and customer relationship management.

You can hear more about Salesforce1 on this YouTube video, peruse the official Salesforce1 page, and read a more general account of Salesforce1 by R. Ray Wang.

The IoT keynote at Dreamforce today and the packed sessions on IoT will answer some of these questions. I'll be providing my analysis of how well these questions are answered in an Event Report blog post after the close of Dreamforce 2013.


Paxata Revealed

The week of 2013 October 28 was a big one for Paxata, Inc. Founded in January 2012, and following advisories, beta customers (also known as "Pax Pros"), and 12 sprints, Paxata quietly released its first GA product in May 2013. With panels and debuts at the Strata + Hadoop conference in New York and other events, leading up to announcements and demonstrations at the Constellation Connected Enterprise at the Ritz Carlton in Half Moon Bay, California, Paxata officially left stealth mode, publicly discussing:

  • Five Blue Chip Customers: including UBS, Dannon, Box, Pabst, and a $49 B High Tech Networking Manufacturer
  • Partnerships with Tableau, QlikTech and Cloudera
  • Adaptive Data Preparation Platform
  • Eight Million US Dollars in the latest round of funding led by Accel
  • Filling out the Management Team with Enterprise Software executives having backgrounds from SAP, Tableau and Hyperion

The most wondrous feature of the Paxata Adaptive Data Preparation Platform is how it adds semantic richness to one's data sets by automatically recommending and linking to third-party and freely available data. This allows one to bring in firmographic, demographic, social and machine data within the context of the user's goals. This is what truly allows the Paxata Adaptive Data Preparation Platform to go beyond data exploration and discovery.

Paxata has received a fair amount of press as well, some of which I've referenced below. However, all this press misses what is one of the most important additions Paxata makes to the toolboxes of Data Management & Analytics [DMA] professionals… the ability to present questions to the user that they may not have thought of on their own. Paxata was one of the companies that inspired my DataGrok blog post. Paxata was in stealth at the time, and couldn't be named then. Now, I'm happy to be able to write that Paxata is one of the few companies or projects building tools that allow the creator and user of data to go beyond data discovery, beyond data exploration, to being able to fully, deeply understand their data. Data discovery and data exploration tools allow one to determine if various data sets can answer the questions posed by business, engineering or scientific challenges. These tools go further by exposing data integrity issues among data sets or data quality problems within a data set. Some such tools might help the user find new data sets or how various data sources within an organization might fit together in a data warehouse. Some hark back to grep, sed and awk to parse textual data. Others provide probabilistic and statistical tools to determine the appropriate shape, distribution or density functions of a data set. But Paxata is one tool that does all these and more, and does it through your web browser in a collaborative fashion, maintaining the history of each collaborator's operations on the data sets.
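
As a toy illustration of the statistical side of such tools, the sketch below profiles a single column and fits a candidate distribution to it with generic pandas and SciPy calls. The file and column names are hypothetical, and this is not Paxata's implementation.

```python
import pandas as pd
from scipy import stats

df = pd.read_csv("orders.csv")            # hypothetical input file
values = df["order_amount"].dropna()      # hypothetical numeric column

print(values.describe())                  # quick look at the shape of the data

# Fit a normal distribution and test the goodness of fit.
loc, scale = stats.norm.fit(values)
ks_stat, p_value = stats.kstest(values, "norm", args=(loc, scale))
print(f"KS statistic = {ks_stat:.3f}, p-value = {p_value:.3f}")
```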

When my partner, Clarise, and I were first briefed by Paxata in November 2012, we were so excited that we stayed over three hours. The demonstration of what was then a much rougher product than what you see today incited both of us to exclaim how much we wished we had had this tool back in our DMA practitioner days. We were treated to a demonstration using data from another Constellation Research customer with which we were familiar. Over a year later, we were treated to a pre-launch briefing using current data sets from that same customer. The ease of use, the pleasantness of the user experience, and the simplicity with which one could complete complex tasks, from histograms to column-splitting, showed the maturity that Paxata had gained since our first exposure. What was most important to us was that Paxata could show a solution for every need we would like to see in an Adaptive Data Preparation Platform, drawn from our experience implementing data warehousing and business intelligence programs since 1996, as well as our decades of experience in computational statistics and operations research (a rough sketch of these steps in generic code follows the list):

  • Collect and parse data of disparate types and from disparate sources, including XML, JSON, Excel, flat files and relational databases
  • Pre-analyze and visualize the data sets
  • Combine different data sets
  • Separate data into patterns
  • Verify individual data values for integrity, quality, mastering and governance
  • Allow multiple IT and end-users to prepare and operate upon the data
  • Maintain the history of what each user [a.k.a. Pax Pro] does, and show that history to all other users
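
Here is a minimal sketch of a few of these steps, collect and parse, combine, verify, and keep an operation history, in plain pandas rather than Paxata; the file names, columns and join key are hypothetical.

```python
import json
import pandas as pd

history = []  # shared record of each operation, in the spirit of Paxata's history

customers = pd.read_csv("customers.csv")        # flat-file source (assumed)
history.append("loaded customers.csv")

with open("events.json") as f:                  # JSON source (assumed)
    events = pd.json_normalize(json.load(f))
history.append("loaded and flattened events.json")

combined = customers.merge(events, on="customer_id", how="left")
history.append("joined on customer_id")

missing_email = combined["email"].isna().sum()  # simple per-datum quality check
history.append(f"{missing_email} rows missing email")

print(combined.head())
print(history)
```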

The Paxata platform allows data warehousing and BI extract, transform and load professionals, business analysts, data scientists, chemists, physicists, engineers, researchers, and professionals of all skills who work with data to completely understand and resonate with their data sets. The Paxata Adaptive Data Preparation Platform does what few other tools can do: it provides clues to what you didn't know to ask. It poses questions that the data can answer, but that you didn't think to ask. And it does all of this in a familiar-looking interface, in HTML5, in your favorite web browser, wherever you are, whenever you need it. In Paxata's words:

  1. Connect
  2. Explore
  3. Transform
  4. Combine
  5. Publish

Paxata pricing is published and open. There are three subscriptions available:

  • Pax Personal
  • Pax Share
  • Pax Enterprise

Each of the Paxata subscriptions builds upon the previous one, from an individual subscription, to the ability for individual subscribers to share a single environment, to a full organization-wide subscription. Of course, what makes this possible is that the Paxata Adaptive Data Preparation Platform is available as a Cloud service, accessible through any modern HTML5 web browser, whether from a sophisticated, high-end workstation, a tablet or a smart phone.

The main value comes not from a nice-looking, fairly intuitive interface, but from the underlying technologies that make Paxata so useful: powerful mathematics, semantics and graph theory algorithms. Their results are easily accessible through this Cloud-based web experience, while the complexities stay under the covers, not getting in the way. This is what makes the Adaptive Data Preparation Platform so accessible to business analysts and other creators and users of data who are not PhD statisticians. Paxata uses proprietary algorithms that detect relationships among data sets, using probabilistic techniques to select the best joins, and semantically typing the data so that it can intelligently enrich, clean and merge the data based upon context, not just metadata. All of this is done in an ad hoc fashion, with no predefined models or schemas needed (a toy example of join selection by value overlap follows the list). These proprietary algorithms make use of

  • Latent Semantic Indexing
  • Statistical Cluster Graphing
  • Pattern Recognition
  • Text Analytics
  • Machine Learning
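
The sketch below is a toy version of the probabilistic join selection described above: it scores candidate column pairs by the overlap of their distinct values (Jaccard similarity) and picks the most likely join key. The data and column names are made up; Paxata's actual algorithms are proprietary and far richer.

```python
import pandas as pd


def jaccard(a: pd.Series, b: pd.Series) -> float:
    """Overlap between the distinct values of two columns."""
    left, right = set(a.dropna()), set(b.dropna())
    if not left or not right:
        return 0.0
    return len(left & right) / len(left | right)


def best_join(df_left: pd.DataFrame, df_right: pd.DataFrame):
    """Return the (left column, right column, score) with the highest value overlap."""
    scored = [(jaccard(df_left[lc], df_right[rc]), lc, rc)
              for lc in df_left.columns for rc in df_right.columns]
    score, lc, rc = max(scored)
    return lc, rc, score


customers = pd.DataFrame({"cust_id": [1, 2, 3], "region": ["W", "E", "W"]})
orders = pd.DataFrame({"customer": [2, 3, 3], "amount": [10.0, 5.5, 7.25]})
print(best_join(customers, orders))  # expect ('cust_id', 'customer', ~0.67)
```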

Distributed computing and in-memory technologies allow these computational statistics algorithms to be cost-effectively executed in parallel across massive data sets. Coupled with advancements in visualization technologies, Paxata is able to address a 13.5-16 billion dollar market over the next three years, with extremely attractive pricing. The true return on investment from Paxata comes from flipping the DMA equation around. Currently, a common truism is that 80% of the time on a DMA, Data Science, DW or BI project is spent preparing data and 20% analyzing it. Paxata reduces that data preparation percentage, so that 70% of the time goes to analytics and 30% to preparation. This reduces not only the labor directly involved in preparing the data, but also allows an Agile framework to address significant business needs at the right time, in a sustainable fashion.

Paxata's strategy is to attach to the QlikView and Tableau markets, whose enterprise adoption is being hampered by these very data preparation challenges. Alongside these partnerships is the partnership with Cloudera, providing enterprise-class access to modern, distributed data storage systems. Add connectors to common enterprise and external data sources and the third-party Paxata Enrichment Libraries, and it is obvious to the most casual observer that the Paxata Adaptive Data Preparation Platform addresses the most frustrating complaint of Data Scientists and Business Analysts alike: that too much of their time is spent on plumbing, whether directly or waiting for IT. We have long spoken about the need for IT to give up control of data, and to realize that its most effective role is to provide a framework of success for end-users to fully, deeply understand and use their data to solve real problems. Paxata creates this framework for success.

Other Sources to learn about the Paxata launch:

  1. The Paxata Web Site
  2. Diginomica: Can Business Users control their data destiny? Paxata says yes
  3. GigaOM: With $10M from Accel, Paxata wants to make data prep a breeze
  4. VentureBeat: Paxata grabs $8M to help data scientists skip the dirty work
  5. YouTube: Paxata Customers and Partners Help Launch the Company
  6. YouTube: The Cube: Prakash Nanduri - Big Data NYC

What Does IoT All Mean?

The number of articles about the Internet of Things [IoT], Machine-to-Machine communication [M2M], the Industrial Internet, the Internet of Everything [IoE] and the like has been increasing since I wrote the post introducing my IoT mindmap almost a year ago. I learn from some of them, at some I nod sagely in agreement, and others cause me to scratch my head in confusion. One in particular this past week fell into that last category when it claimed that all the terms listed here mean the same thing.

From my reading, briefings and research over the past year, I've come to a different conclusion. The following definitions are my opinion; I can't say that any authority has certified them. I believe them to be accurate, and if any vendor with an interest in any of these definitions strongly agrees or disagrees, I would be very much interested in talking with you.

Types

The first thing to be considered is Machine-to-Machine communication. M2M is really just one of four types of interchanges that occur over the Internet, intranets and any command, control, communication, computing or intelligence network. The other types are Human-to-Machine [H2M], Human-to-Human [H2H] and Machine-to-Human [M2H]. H2M and H2H interchanges have been around since the beginning of ARPAnet, which evolved to become the Internet. From the many different protocols at the beginning, such as FTP and Gopher [among many more], two have come to dominate Internet traffic:

  • simple mail transfer protocol [SMTP] at the heart of email, and
  • hypertext transfer protocol [HTTP] at the heart of the world wide web [WWW or Web].

Every transaction made using a computer, from online transaction processing [OLTP] and electronic data interchange [EDI] to eCommerce, every purchase you make at your favorite web store, is an example of H2M.

Of course, starting with email [still the dominant form of communication over the Internet for businesses and individuals] and expanding to Twitter, Facebook, Waze, Yelp, Foursquare, Yammer, all the various instant messaging networks, voice over Internet protocol [VoIP] and your favorite public or private social network, we have many examples of Internet-enabled H2H communication.

These two, H2M and H2H, have become so prevalent, and so important to business, governments and our personal lives, that the over-hyped phenomenon "Big Data" was born. But the importance and pervasiveness of M2M, and soon M2H, data will swamp the so-called data tsunami of the past decade. Predictive maintenance, building automation, elastic provisioning, machine logs, software "phoning home" and automated decision support systems are all good examples of direct M2M interchanges, where one sensor, device, embedded computer or system has a productive exchange with another such machine, without concurrent human intervention. Self-quantification, gamification, personalized medicine and augmented reality [AR] are all early examples of M2H interchanges, where sensors, devices, embedded computers or systems directly provide relevant information to an individual, allowing for better-informed decisions.

The Internet of Things

The term Internet of Things was coined in 1999 by Kevin Ashton. Since then, it has come to mean any device that is connected to the Internet. Most people don't consider computers, routers, edge equipment and other Internet infrastructure hardware to be "devices", and usually exclude such hardware from consideration as things that use that infrastructure. For many, the devices are only smart phones, feature phones and tablets. This has led organizations such as Cisco and GSMA to predict that there will be 30 to 50 billion devices connected to the Internet by 2020. However, even these organizations, and most people with whom I speak who have skin in the IoT game, feel that my own prediction of one trillion devices connected to the Internet by 2020 is more likely. These devices span from individual, but connected, sensors to heavy machinery. However, as companies come out with tweeting diapers, glowing clothing and other such silliness, the Internet of Things is in danger of becoming a fad. So, what is the Internet of Things? To my mind, the Internet of Things comprises any sensor, embedded sensor, embedded computer, component, package, sub-system or system that is connected to the Internet and intended to have meaningful interchanges with other such items and with humans. The Internet of Things primarily uses M2M and, increasingly, M2H interchange.

Smarter Planet

The first treatment of the IoT as a large, complex system to which I was exposed was at a networking event in 2008… one of those events where IBM was introducing its new initiative for a Smarter Planet. The Smarter Planet brings complex systems such as the Smart Grid, building automation across facilities, water management, traffic management, Smarter Cities and Smarter Farms under one System: one approach and one initiative that raises the IoT to a new level of importance for world governments, global businesses and individuals, from the poorest village to the most cosmopolitan city. The Smarter Planet initiatives go beyond the IoT, beyond the individual things, to treating all such things, the Internet, the protocols, processes and policies as one very large, complex, possibly cognitive system.

Industrial Internet

The Industrial Internet is a term coined by General Electric [GE] in 2011. At a very simple level, the Industrial Internet can be thought of as connected industrial control systems. But the impact is much more complex, and much more significant. The first thing to realize is that connected sensors and computing power will be embedded in everything, from robots and conveyor belts on the factory floor, to tractors and irrigation on the farm, from heavy equipment to hand drills, from jet engines to bus fleets; every piece of equipment, everywhere. The Industrial Internet also primarily uses M2M and M2H. While this sounds much like the Internet of Things, the purpose is much different. The Industrial Internet is about changing business processes and making data the new coin of the realm. GE is very serious about the Industrial Internet, and while it doesn't yet use the terms, Sensor Analytics Ecosystems and Data Marketplaces are rapidly becoming core to GE's businesses, as shown by its recent 140 million dollar investment in Pivotal, the new Big Data Platform as a Service [PaaS] by EMC. Another excellent example of the importance of the Industrial Internet comes from Salesforce.com's use of The Social Machine by Digi International and its Etherios business unit, bringing sensor data into customer relationship management [CRM] by allowing sensors embedded in industrial refrigerators, hot tubs, and heavy and light equipment of all types to open SFDC Chatter sessions and to file cases.

Internet of Everything

Cisco has recently started two initiatives related to the IoT: the Internet of Everything [IoE] and Fog Computing. IoE seeks to bring together H2H, H2M, M2M and M2H interchanges. On June 19th of this year, Cisco introduced their IoE Value Index [link to PDF]. By bringing together people, processes, data, and things, and with some impressive research to back it up, Cisco feels that the IoE could bring 1.2 trillion dollars in added value in 2013, and 14.4 trillion dollars in added market value to businesses around the world by 2022. Fog Computing tends more to the infrastructure of the IoE, bringing the concepts of Cloud Computing, such as distributed computing and elastic provisioning, to the edge of the network, with an emphasis on wireless connectivity, streaming data, and heterogeneity.

Industry Overview

While some of the above are corporate initiatives, they each represent important and distinct concepts. In addition to these from IBM, Cisco, GE, EMC and Salesforce.com, there are other initiatives and products in this sphere coming from HP, Oracle, SAP, MuleSoft, SnapLogic, Nuance, Splunk, Mocana, Evrythng, Electric Imp, Quirky, reelyActive, Ayla, SmartThings, Withings, Fitbit, Jawbone including BodyMedia, Nike, Basis, Cohda Wireless, AT&T, Verizon, Huawei, Orange, Belkin, DropCam, Gravity Jack, Alcatel-Lucent, and Siemens. Platforms, software, sensor packages and services are being developed by a wide variety of innovative companies.

These innovative companies, and others, are implementing one or more of these concepts in a variety of ways. As I stated at the beginning, I don't think that these concepts are the same. While the IoT was first named 14 years ago, it is still early days in its implementation. There are many ways that the Internet of Things might evolve, and many missteps that could lead the IoT to be a passing fancy, leaving some important changes in its wake, but never reaching its full potential. I think there is one way, and one way only, that all of the concepts and initiatives will come together and change everything that we do, how we make decisions, how we think about ourselves, how governments make policy, how businesses make money: The Sensor Analytics Ecosystem [SAE]. Here's a tease of a mindmap giving a hint of what I mean by the SAE. Look for my upcoming report "Sensor Analytics as an Ecosystem" and a series of research reports delving into each area introduced therein. The companies listed above are building out parts of the SAE, and will feature heavily in these reports.


Pentaho Acquires Webdetails for Great UX

Today, Monday, 2013 April 22, Pentaho completed the acquisition of long-time partner, Webdetails.

Pentaho offers one of the most complete data management and analytics suites available, both as an open source solution, its Community Edition, and as an Enterprise Edition:

  • included target database [HSQL or MySQL], with the ability to use any RDBMS,
  • extract, transform and load servers and clients, KETTLE, Carte, Pan and Spoon,
  • online analytical processing server, Mondrian,
  • metadata management,
  • report development,
  • schema development,
  • dashboard development,
  • data mining, WEKA,
  • and a BI server to tie it all together.

Webdetails is a 20-person-strong consultancy based in Portugal, founded by Pedro Alves, focused on building Pentaho solutions for its customers and on data visualization. In addition to the consulting work, Webdetails has become the major committer for the open source Community Dashboard Framework project, originally developed by Ingo Klose. In the course of their work, as inspired by the muse of customer needs, Webdetails has grown the original CDF project into a full suite of OSS data visualization and dashboard projects, CTools. Over the past year, the talented Webdetails user experience team seems to have put out a new CTool almost monthly.

  • CDF - community dashboard framework
  • CDE - community dashboard editor
  • CBF - community build framework
  • CDA - community data access
  • CCC - community chart components
  • CST - community startup tabs
  • CGG - community graphics generator
  • CDC - community distributed cache
  • CDB - community data browser
  • CDG - community data generator
  • CDV - community data validation

Pedro Alves is an extremely well-respected member of the Pentaho community, leading community events and training, appearing often in the forums and IRC, and staying connected through Twitter and Skype. Recently, Pedro was highly active in helping to create the Pentaho Marketplace, which gives users direct access from the BI Server web interface to a series of plug-ins for the BI Server, including CTools and other community and third-party projects.

I have the pleasure of knowing Pedro, and several other members of the Webdetails and Pentaho teams. This week I was able to speak with Pedro, as well as Davy Nys, Vice President, EMEA & APAC at Pentaho, and Doug Moran, one of Pentaho's founders.

Pedro doesn't feel that the acquisition will change Webdetails, in that both the UX and consulting teams will continue as before. However, both community and enterprise users of Pentaho will feel the impact of both teams, as the lessons learned from Webdetails consulting projects are implemented by the UX team, not only in the dashboard and data visualization tools but also, per Davy, in the overall UX throughout all the Pentaho products. Having worked with Pentaho tools as a practitioner in the past, I know that business users will appreciate this as Pentaho becomes both easier and more pleasant to use. The data scientists will also appreciate more and better tools to draw the story out of the data and present it to subject matter experts and business leaders in an Agile fashion.

As Pedro mentioned, most things won't change, such as the fact that CDF is the underpinning of all of Pentaho Dashboards, or the pace of development of new CTools. Several are currently underway. One that I can mention grew out of a request by the Mozilla Foundation, for a file and data browser for the Hadoop distributed file system [HDFS] that would be as easy as the file browser in any modern operating system. The result is CVB - community VFS browser. One thing that will change is that more of the CTools will make their way into the main branch of the EE product as they reach the appropriate state of maturity and stability.

Pedro has many plans for CTools, and for facilitating data visualization through Pentaho. But in addition to continuing his role as the general manager of Webdetails, and Chief Architect of CTools, Pedro will also be assuming the role of Senior Vice President of Community for Pentaho. As a long time friend of the Pentaho community myself, I have to say that there couldn't be a better choice.

One of Pentaho's founders, Doug Moran, was the "Community Guy", a role he held until the start of 2011, following the original community guy, Gretchen Moran. Doug's philosophy is that any open source community needs to stand on its own to be organic and strong. The Pentaho community is one of the strongest in the OSS DMA space, and as a result, Doug felt comfortable focusing elsewhere, assuming management of all of Pentaho's "big data" products and Instaview initiatives. As SVP of Community, Pedro will be mostly focused within the company, integrating the community internally and helping drive the corporate strategy for community. He'll continue to participate in the community, but per the Pentaho Beekeeper model, developed by Pentaho CTO & Chief Geek James Dixon, his main concern will be to assure that there is a rich environment for community innovation. As part of that, Pedro will also be actively pursuing ways to grow and leverage the Pentaho Marketplace. Doug also pointed out that the Pentaho community is hugely valuable for QA and as a training ground for the best Pentaho developers. This is sure to continue with Pedro in his new role. Doug and Pedro have worked together since the early days of Pentaho, when Pedro decided to quit his job and, with his wife, create a company devoted to professional services for Pentaho projects and products. This strong relationship between the original Community Guy and the new SVP of Community can only help to make an already strong community even better and more creative.

Davy pointed out to me that there has been an increase in customer demand for Dashboards that were in essence, apps within Pentaho. This might happen through a plan that Pedro has to make it very easy to create such dashboard-based apps without any programming ability, and then publish them to the Marketplace. This planned community plugin kick-starter [CPK] will use CDE to create the front-end, and the Pentaho Data Integration software, KETTLE and Spoon, for the backend logic. I believe that both internal and external consultants, integrating Pentaho into an organization's decision making process, will find this ability exciting, as many of these system integrators are not Java developers. The ability to push such apps to the Marketplace will also be embraced by both CE and EE users, as most customers are excited by the idea of openly sharing their solutions, and enjoy the resulting community recognition.

As the innovations related to the big data and data science movements become more important, Davy told me that Pentaho has seen great interest in four areas:

  • EDW optimization,
  • exploratory analytics,
  • machine learning [WEKA, Mahout & R], and
  • leveraging Hadoop to scale.

Webdetails fits very well into creating a finer exploratory analytics experience for the customers, and will make Pentaho a superior choice for big data. Combined with Instaview, and with the proper roadmap, it may even push Pentaho into the new Data Grok market, not only helping users answer the questions they have, but actually pointing out the questions that the data set can answer, even if the user didn't think of it.

Both CE and EE users and customers of Pentaho should welcome this acquisition, and look forward to the better UX and data visualization. Most importantly, they should plan on how they can contribute to, and benefit from the Pentaho Marketplace, as it becomes an important part of the Pentaho ecosystem.

Other Resources:

  1. Pedro Alves on Business Intelligence: "A new challenge - Webdetails Joins Pentaho"
  2. Pentaho page about announcement: Bringing the art of the possible to life
  3. Webdetails page about the announcement: The future of business analytics changes today
  4. Pentaho Blog: Webdetails and the Art of the Possible
  5. Press Release English: Pentaho Acquires Dashboard and UI Specialist Partner Webdetails
