Monday, November 15, 2010
Microsoft’s Announcement of BI Road Map – a Public Relations Nightmare
During PASS Summit 2010, Microsoft announced its strategic plans for its Analysis Services business intelligence platform. But as intriguing as the announcement was from a technological perspective, it also had another, less desirable effect: it left Microsoft’s long-time business partners, namely BI consultants and service providers, with big questions about where they fit in Microsoft’s grand plans for the BI space.
There are two main reasons for this:
1. While Microsoft relies heavily on its partner network to sell more SQL Server licenses, it has been aggressively marketing PowerPivot, which is positioned as self-service BI. Neither the partners nor Microsoft has yet figured out how to position the two offerings in a way that makes sense to a typical customer.
2. Microsoft’s road map clearly shows a shift away from the OLAP architecture toward a new paradigm it calls the BI Semantic Model, which does not align with its partners’ existing, hard-earned training and expertise in implementing and selling Microsoft BI solutions. Microsoft partners would need to realign their entire business around a product that has not even been released yet, let alone implemented anywhere.
Following this announcement, high-profile evangelists of Microsoft BI solutions have openly expressed their concern about this radical move (see examples here and here). Ever since, Microsoft has been working overtime on some major damage control, trying to explain itself and reassure its partner network that OLAP is not going anywhere and that these new plans are complementary to the existing Analysis Services offering (i.e. "bla bla").
Regardless of Microsoft’s after-the-fact attempt to re-establish calm among its partner community from a PR perspective, the cat is already out of the bag. And honestly, Microsoft’s move was not unexpected. Check out the article published two months ago, titled ‘Business Intelligence Vendors and their Partners – Rough Seas Ahead’, which specifically discusses what PowerPivot (and similar technologies) would mean for the BI implementation business.
Whether Microsoft’s ideas are practical or just pipe dreams remains to be seen. One thing is certain, however: considering how heavily Microsoft relies on its dedicated partners for sales and marketing, I would have expected this to be handled with much more finesse. This announcement was a very poor display of public relations.
By: Elad Israeli | The ElastiCube Chronicles - Business Intelligence Blog
Friday, November 12, 2010
Microsoft’s BI Roadmap says NO to OLAP Cubes and MDX
So the Microsoft PASS Summit 2010 kicked off on November 10th, and the burning topic was where Microsoft’s Analysis Services product is headed in light of the new PowerPivot offering. Chris Webb, probably one of Analysis Services’ biggest fans and experts, said it best:
“The last few days have been quite emotional for me. I’ve gone from being very angry, to just feeling sad, to being angry again; I’m grateful to the many members of the SSAS dev team who’ve let me rant and rave at them for hours on end and who have patiently explained their strategy – it’s certainly helped me deal with things. So what’s happened to make me feel like this? I’ll tell you: while it’s not true to say that Analysis Services cubes as we know them today and MDX are dead, they have a terminal illness. I’d give them two, maybe three more releases before they’re properly dead, based on the roadmap that was announced yesterday.”
The full post and the subsequent comments can be found here.
Readers of The ElastiCube Chronicles may recall a previous post titled ‘Is Microsoft Admitting that Analysis Services is not Fit for the Mid-Market?’ published back in August 2010, in response to the official release of PowerPivot. Well, I believe that question has been officially answered - Yes.
By: Elad Israeli | The ElastiCube Chronicles - Business Intelligence Blog
Labels: Analysis Services, Business Intelligence Technology, Elad, microsoft, olap, powerpivot, SiSense
The New Tableau 6.0 Data Engine – First Impressions
Tableau 6.0 is out, and according to Tableau Software’s CEO, one of its main features is a new data engine. Here’s an excerpt from one of the articles covering Tableau’s latest release:
"Our new Tableau Data Engine achieves instant query response on hundreds of millions of data rows, even on hardware as basic as a corporate laptop... No other platform allows companies to choose in-memory analytics on gigabytes of data …" Christian Chabot, CEO of Tableau Software, said in a statement.
These are bombastic claims indeed, and two parts of the CEO’s quote are particularly interesting: the promise of instant query response on hundreds of millions of rows on basic hardware, and the positioning of the engine as in-memory analytics. So, with the help of my friend, colleague and brilliant database technologist Eldad Farkash, I decided to put these claims to a real-life test.
Since this data engine is claimed to use in-memory technology, we set up a 64-bit machine with an adequate amount of RAM (hardly a corporate laptop) and used a real customer’s data set consisting of 560 million rows of raw internet traffic data. To keep the test simple, we imported just a single text field out of the entire data set; a rough sketch of that preparation step follows.
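For readers who want to reproduce the flavor of this preparation step, here is a minimal Python sketch of pulling one text field out of a raw, delimited traffic log before handing it to each product. The file names, delimiter and column position are assumptions for illustration; this is not the exact script we used.

```python
# Illustrative only: extract a single text column from a raw, tab-delimited
# traffic log so every engine is fed the same one-column input.
import csv

RAW_LOG = "raw_traffic.tsv"            # hypothetical source export
SINGLE_FIELD_OUT = "traffic_urls.csv"  # one-column file fed to each product
FIELD_INDEX = 2                        # hypothetical position of the text field

with open(RAW_LOG, newline="", encoding="utf-8") as src, \
     open(SINGLE_FIELD_OUT, "w", newline="", encoding="utf-8") as dst:
    reader = csv.reader(src, delimiter="\t")
    writer = csv.writer(dst)
    for row in reader:
        if len(row) > FIELD_INDEX:     # skip short or malformed rows
            writer.writerow([row[FIELD_INDEX]])
```

The point is simply that every product received the same single-column input, with default settings everywhere else.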
Initial Findings:
1. Surprisingly, and contrary to what Tableau’s CEO claims, Tableau’s new data engine is not really an in-memory technology. In fact, the entire data set is stored on disk after it is imported, and RAM is hardly utilized.
2. It took Tableau 6.0 approximately 5 hours to import this single text field, of which roughly 1.5 hours was the import itself and the rest a process Tableau calls ‘Column Optimization’, which we believe builds an index very similar to that of a regular relational database. For comparison, importing the same field took QlikView 50 minutes and ElastiCube 30 minutes, roughly a 6x to 10x difference. All products were used with their default settings.
3. Once the import process completed, we asked Tableau to count how many distinct values existed in that field, a common query in business intelligence work (the sketch after this list shows the shape of that query). It took 30 minutes to return. For comparison, both QlikView and ElastiCube returned in approximately 10 seconds, a 180x difference. Again, both products were used with their default settings.
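To make finding #3 concrete, here is an engine-agnostic sketch of the distinct-count benchmark, using SQLite from the Python standard library purely to show the shape of the query and the timing. The database, table and column names are hypothetical; none of the engines tested above is involved.

```python
# A minimal sketch of the "count distinct values in one text field" benchmark.
# SQLite stands in as a generic SQL engine; the names below are illustrative.
import sqlite3
import time

def time_distinct_count(db_path: str, table: str, column: str) -> tuple[int, float]:
    """Return (number of distinct values, elapsed seconds) for one column."""
    con = sqlite3.connect(db_path)
    try:
        start = time.perf_counter()
        (distinct,) = con.execute(
            f"SELECT COUNT(DISTINCT {column}) FROM {table}"  # illustrative query
        ).fetchone()
        elapsed = time.perf_counter() - start
        return distinct, elapsed
    finally:
        con.close()

if __name__ == "__main__":
    # "traffic.db", "traffic" and "url" are assumed names for this sketch.
    count, seconds = time_distinct_count("traffic.db", "traffic", "url")
    print(f"{count} distinct values, returned in {seconds:.1f}s")
```

The same distinct-count question was posed to each product through its own interface; the timings reported above are wall-clock times for that single query.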
Initial Conclusions:
Tableau’s new data engine is a step up from its previous engine, which was quite similar to the one Microsoft Access used in Office 2007. That is good news for individual analysts working with non-trivial amounts of data, since earlier versions of Tableau were quite poor in this respect. This release, I imagine, also helps Tableau against SpotFire (Tibco), which until now was the only pure visualization player that could claim to have technology aimed at handling larger data sets.
From a practical perspective, however, the talk of handling hundreds of millions of rows of data and the reference to in-memory analytics are more marketing fluff geared towards riding the in-memory hype than a true depiction of what this technology is or what it is capable of. When it comes to import times and query response times, Tableau’s data engine is not in the same league as in-memory technologies or pure columnar technologies like ElastiCube; in our test it was slower by one to two orders of magnitude.
"Our new Tableau Data Engine achieves instant query response on hundreds of millions of data rows, even on hardware as basic as a corporate laptop... No other platform allows companies to choose in-memory analytics on gigabytes of data …" Christian Chabot, CEO of Tableau Software, said in a statement.
These are bombastic claims indeed and the underlined segments of the CEO’s quote are particularly interesting. So with the help of my friend, colleague and brilliant database technologist Eldad Farkash, I decided to put these claims to a real life test.
Since this data engine was claimed to be utilizing in-memory technology, we set up a 64-bit computer with adequate amounts of RAM (hardly a corporate laptop) and used a real customer’s data set consisting of 560 million rows of raw internet traffic data. To make it easier, we imported just a single text field out of this entire data set.
Initial Findings:
1. Surprisingly, and unlike what Tableau’s CEO claims, Tableau’s new data engine is not really in-memory technology. In fact, their entire data set is stored on disk after it is imported and RAM is hardly utilized.
2. It took Tableau 6.0 approximately 5 hours to import this single text field, out of which 1.5 hours was pure import and the rest a process Tableau calls ‘Column Optimization’ which we believe is creating an index very similar to that of a regular relational database. For comparison, it took QlikView 50 minutes and ElastiCube 30 minutes to import the same field. That is an x7 difference. All products were using their default settings.
3. Once the import process completed, we asked Tableau to count how many distinct values existed in that field, a common query required for business intelligence purposes. That query took 30 minutes to return. For comparison, it took both QlikView and ElastiCube approximately 10 seconds to return. That’s an x180 difference. Again, both products were used with their default settings.
Initial Conclusions:
Tableau’s new data engine is a step up from their previous engine which was quite similar to that which Microsoft Access had been using in Office 2007. That is good news for individual analysts working with non-trivial amounts of data using earlier versions of Tableau, which were quite poor in this respect. This release, I imagine, also helps Tableau against SpotFire (Tibco), which until now was the only pure visualization player who could claim to have technology aimed for handling of larger data sets.
From a practical perspective, however, the handling of hundreds of millions of rows of data as well as the reference to in-memory analytics are more marketing fluff geared towards riding the in-memory hype than a true depiction of what this technology is or what it is capable of. Tableau’s data engine is not in the same league as in-memory technology, or pure columnar technologies like ElastiCube, when it comes to import times or query response times. In fact, it is slower by several orders of magnitude.
Labels: Big Data, Business Intelligence, Elad, ElastiCube, In Memory Databases, powerpivot, QlikView, SiSense, Tableau, Tibco SpotFire