Month: October 2013

Recap of Day 2 at SQL Pass Summit 2013

SQL Pass Summit 2013 day 2 sessions! Another great day at the “office”, with strong sessions and plenty of opportunity for networking. The keynote was the hot topic at the water cooler, and it was interesting to hear how it was perceived among participants (Doug Hoogervorst gives his thoughts on the summit keynote).

Most agreed that the upcoming release of SQL Server 2014 was promising. In particular, “columnstore indexes” and the in-memory OLTP engine were highlighted as great new features, and Quentin Clark emphasized that the improvements were built into the existing engine, so upgrading to SQL Server 2014 demands no modifications to existing applications.

Watch Quentin Clark’s keynote: Microsoft’s Data Platform – The Road Ahead

The demo on Power Query was the epicenter of the greatest disagreements. Kamal Hathi demonstrated how a Skype analyst could use Power Query to connect to HDInsight in Azure (Microsoft’s Hadoop store), pull in the JSON records, and convert the big data into structured tabular data. Then he mashed up this data with online country data via a built-in search feature connected to Bing’s engine. In a few clicks he had created a “calls per country” analysis using two different data sources… easy as that!

I’m not claiming that this wasn’t impressive, or that Power Query doesn’t open up a wide range of opportunities. However, it was interesting to learn that most of the people I talked to who liked it had little or no contact with the business users. I personally belong to the more skeptical group that does not see the average CIO accessing the Hadoop store, knowing how to delimit the unstructured data, and mashing it up with online data with no data validation. I question whether online sources, like the Wikipedia pages Kamal mentions, always get it right and will stay consistent over time as users replicate their analyses. Add spelling errors, differences in pages depending on the country they are accessed from, and many other caveats that make “easy as that!” difficult. I do, however, believe that the talented analyst can leverage value from Power Query.

My favorite session of the day was “From Minutes to Milliseconds: High-Performance SSRS Tuning” by Doug Lane. He demonstrated how one can squeeze every drop of performance out of a constrained environment. The biggest reason this session was my favorite is that I learned how to improve reporting without spending any capital; the second biggest reason was the confirmation that I’m already applying a lot of Doug’s suggestions in our solutions. The best advice, in my opinion, is caching large reports, i.e. essentially pre-loading the report so the user saves the time it takes to retrieve and process the data. I know some of our clients are going to get a pleasant surprise next week… that’s a promise!

I’ll also promise that I won’t grab the microphone at SQL PASS Karaoke tonight… no reason to ruin the evening for everybody. I’ll stick to the beverage department – cheers!

SQL PASS Summit Keynote

THANK GOD! I’m watching the keynote at SQL Pass, and Microsoft SQL Server 2014 will allow us to ask for simple data we would never ask for, and to graphically map or chart it in ways we would never want. No assistance in helping users join data from multiple systems where there is limited commonality. (Unless it is so clear the kids from the AT&T commercials could do it.) And now we can save our queries into a corporate library where my fellow coworkers will never know what my query does, and, unsure that they are getting the right data, will have to build their own anyway. I am so tired of simple things being called powerful. Meanwhile, the in-memory performance stuff is cool, and the concepts around hybrid databases spanning on- and off-premises seem promising.

Business Impact Will Exhibit at SQL Pass Summit 2013!

We’re excited to be hosting a booth at SQL Pass Summit 2013 in Charlotte, NC, from Oct 15-18. It’s our first time (we’ll always remember you, Charlotte), and we’re excited because for the past eight years we’ve believed that you have to understand business to build business intelligence solutions. Now we’ll be able to talk with you and other conference attendees about our experiences with more than 70 customers. Experiences such as delivering product contribution and landed costs. Sales goals and budgets. Dashboards and delivery. Why come by? Well, we hope you will experience with us that business intelligence – while aided by SQL Server and technology – is based on understanding you, what you need, and how it can help your customers. We’re at booth 231 and hope you stop by.

Learn more about the event at the SQL PASS website here!

SQL Pass

SQL PASS Exhibitor

Six Ways to Prove the Business Intelligence Data Matches in Your BI Solution!

Are you struggling or worried that the business intelligence solution you spent money on won’t be fully utilized? Are your employees resistant to change, or are you hearing rumblings in the break room that your staff dreads using the new tool and might not trust your business intelligence data?

One significant way to make a business intelligence solution successful is to prove the data matches. Your staff needs to trust that the data in the BI solution matches, again and again and again! Although BI solutions can surface and point out “statistically relevant” differences, you still need your staff to interpret the data in order to have a successful solution.

We have six recommendations to prove the data matches:

Six ways to prove business intelligence data matches

1) Create a Reference Document

Maintain a living document that lists the key measures in two columns. Column A holds how each measure is derived from the source system – this should be technical code of some kind that one can run in case of emergency. Column B holds how the measure is derived, retrieved, or manifested within the BI tool. Together, these are technical instructions showing how A and B should match.
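As a minimal sketch of what such a two-column reference document might look like in structured form (the measure name, the source-system SQL, and the BI expression below are all invented for illustration, not taken from any real system):

```python
# Hypothetical two-column reference document as a simple mapping.
# Each key measure records Column A (source-system derivation) and
# Column B (BI-tool derivation). All names and queries are illustrative.
reference_document = {
    "Net Sales": {
        # Column A: how the measure is derived from the source system
        "source_system": "SELECT SUM(line_amount) FROM invoice_lines WHERE status = 'posted'",
        # Column B: how the measure is derived within the BI tool
        "bi_tool": "Sum of the [Net Sales] measure on the Sales cube",
    },
}

def is_complete(document):
    """Check that every key measure documents both derivations."""
    return all(
        "source_system" in columns and "bi_tool" in columns
        for columns in document.values()
    )

print(is_complete(reference_document))
```

Keeping the document in a structured form like this (rather than free prose) makes it easy to review for completeness whenever a new measure is added.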

2) Create Comparison Reports

Create reports that calculate the values directly from the source system and from the BI system, and show the comparison. These reports can be set to fire on some sort of trigger when values are out of whack, but it’s also a useful management checkpoint to have someone look at them quarterly, confirm the variance is 0, and confirm what we all come to expect: “Yep, it’s working.”
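The logic behind such a comparison report can be sketched as follows. This is a hedged illustration only: the measure names, totals, and zero-variance tolerance are assumptions for the example, not values from any actual source or BI system.

```python
# Hypothetical comparison check: totals pulled from the source system and
# from the BI solution for the same measures, with a variance flag that
# could drive the trigger described above. All figures are made up.
source_totals = {"Net Sales": 1_204_550.25, "Units Shipped": 48_310}
bi_totals     = {"Net Sales": 1_204_550.25, "Units Shipped": 48_310}

def compare_totals(source, bi, tolerance=0.0):
    """Return (measure, source_value, bi_value, variance, out_of_whack)
    rows, flagging any measure whose variance exceeds the tolerance."""
    rows = []
    for measure in sorted(source):
        bi_value = bi.get(measure, 0)
        variance = bi_value - source[measure]
        rows.append((measure, source[measure], bi_value, variance,
                     abs(variance) > tolerance))
    return rows

for measure, src, bi_val, var, out_of_whack in compare_totals(source_totals, bi_totals):
    status = "ALERT" if out_of_whack else "OK"
    print(f"{measure}: source={src} bi={bi_val} variance={var} [{status}]")
```

In practice the two dictionaries would be populated by running the Column A query against the source system and the Column B calculation against the BI solution; the quarterly checkpoint is simply confirming every row reads OK with a variance of 0.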

To read the next four ways to prove your data matches in your business intelligence solution click here to download the whitepaper!
