September 21, 2018
By Trey Johnson
Welcome to the second part of my blog recapping and exploring the feedback provided by a group of attendees at Azure Data Fest, a great event organized by the Atlanta PASS Local Group Leaders!
It is a great deal of fun bringing advancements in technology to folks, and equally fun to hear their perspectives on change, such as Moving BI Workloads to the Cloud.
I won’t fully restate my last blog post, but suffice it to say there is probably a bit more interest than action among the audience in my session. Last time we concluded with no one in the room raising any significant fears about moving to Azure, but the motivations weren’t exactly defined either.
The Modern Data Platform
We started the last blog with this graphic. It is incredibly fitting and well worth studying, as Azure provides many of these elements well.
The other element to look at is this Microsoft graphic on who manages what across the various flavors of On-Premises, IaaS, PaaS, and SaaS.
The reduction in blue boxes and the increase in orange boxes means management transitions from IT to the cloud provider (Microsoft, in the case of Azure). I spoke about this with the group and recall many heads nodding when we discussed the improvements from reducing some of the management workloads, particularly with IaaS and PaaS.
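The split in that graphic can be sketched as a simple lookup. This is a minimal illustration based on the commonly cited shared-responsibility breakdown; the layer names and exact cutoffs are my assumptions about what the boxes represent, not an official Microsoft API.

```python
# Who manages each layer under each deployment model, mirroring the
# blue-box/orange-box graphic. Layers run top (app) to bottom (network).
LAYERS = [
    "Applications", "Data", "Runtime", "Middleware",
    "OS", "Virtualization", "Servers", "Storage", "Networking",
]

# Index of the first layer the cloud provider manages; everything
# above it (lower index) stays with the customer's IT team.
PROVIDER_TAKES_OVER_FROM = {
    "On-Premises": len(LAYERS),              # customer manages everything
    "IaaS": LAYERS.index("Virtualization"),  # provider runs the hardware/hypervisor
    "PaaS": LAYERS.index("Runtime"),         # provider also runs OS/middleware/runtime
    "SaaS": 0,                               # provider manages everything
}

def managed_by(model: str, layer: str) -> str:
    """Return 'Customer' or 'Cloud Provider' for a layer under a model."""
    cutoff = PROVIDER_TAKES_OVER_FROM[model]
    return "Customer" if LAYERS.index(layer) < cutoff else "Cloud Provider"
```

For example, `managed_by("PaaS", "OS")` comes back as `"Cloud Provider"`, which is exactly the patching and OS-maintenance workload the heads in the room were nodding about.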
I asked a couple of key questions and both Team Blue and Team Red were ready to answer!
Which are you considering for moving your BI Workloads to Azure (IaaS or PaaS) and why?
Generally, it seemed the group was more comfortable with IaaS, which architecturally means Azure VMs that the customer manages. This is VERY understandable, especially if there are licenses folks are trying to continue using, or an incomplete digital transformation. I remember telling people that I hoped BI was not their first foray into the cloud. I only say that because we’ve seen our fair share of customers not think about identities in the cloud and other elements which naturally come up as your physical local network extends into authentication/authorization/resources that sit in the cloud.
Are you “Lifting” and “Shifting” or going through a Re-Engineering exercise, too?
The answers, in this case, were an equal split between wanting to just “move” and needing to re-engineer, as there were influencing factors like new data sources and new means of acquiring data. This, too, makes sense. The Collect, Manage and Transform elements of the Microsoft Data Platform look a bit different depending on which technology is being used to, among many other things, move data around.
The Modern Data Warehouse
Microsoft has a depiction of the Modern Data Warehouse, which I used to discuss the various parts of Azure, notably Storage, Data Manipulation, and delivery of a Data Warehouse plus Tabular Model platform. The graphic is shown below…
The above is an impressive array of technology and may well be a “future” data warehouse architecture where unstructured data is really a valuable asset in the architecture. But, having a sneaking suspicion the audience might not have as robust a view of their first Azure data warehouse, I showed them this view, which seemed a bit more relatable (based on the comments).
We talked about the above and most people understood the relevance of Azure Data Factory (ADF), Azure SQL Data Warehouse (or, for non-MPP scenarios, just Azure SQL DB) and Azure Analysis Services as the underpinnings of a data platform which Power BI and other tools can sit on top of for self-service analysis plus pervasive analytic delivery.
We talked a bit more about ADF and I asked people …
Can you actually do what you need to in ADF, easily?
“No.” That was the consensus. That doesn’t mean ADF isn’t a technology which, as a v2, is better than v1, but it isn’t quite the same toolset as SSIS or ZAP’s Data Hub. Those who are on Azure still use SSIS, and not the SSIS integration runtime in ADF. It was interesting to hear the tone of the group change. I sensed a level of dissatisfaction with ADF, but I also heard people say that SSIS and scripts were their tools right now.
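One place ADF v2 does fit comfortably, even for SSIS shops, is kicking off pipeline runs programmatically via the Azure management REST API’s createRun endpoint. A minimal sketch of building that request URL follows; all the resource identifiers here are placeholders, not real resources.

```python
# Build the ADF v2 "createRun" REST URL for triggering a pipeline.
# POST this URL with an Azure AD bearer token to start the run; the
# response body contains a runId you can then poll for status.
API_VERSION = "2018-06-01"  # ADF v2 GA api-version

def create_run_url(subscription_id: str, resource_group: str,
                   factory_name: str, pipeline_name: str) -> str:
    """URL for POSTing a pipeline run to the Azure management API."""
    return (
        "https://management.azure.com"
        f"/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}"
        "/providers/Microsoft.DataFactory"
        f"/factories/{factory_name}"
        f"/pipelines/{pipeline_name}"
        f"/createRun?api-version={API_VERSION}"
    )
```

Even teams doing the heavy lifting in SSIS or ZAP Data Hub can use this to let ADF act as a lightweight scheduler and orchestrator, which matches how I tend to position it.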
I shared my own early views on ADF which included:
Personally, and yes, I’m biased, I’d prefer to use ADF for simple orchestration and use great tools like ZAP Data Hub to acquire data intelligently and incrementally from my structured applications (like Dynamics 365, Salesforce, Sage, etc.). Once the data is in a store in Azure, if there is a compelling case for marrying this structured data with other data, including unstructured data, you are that much closer to an end product, and ZAP will certainly have smoothed the ramp to data on Azure.
Analysis Services is Dead?
As a lifelong user of Analysis Services (starting with OLAP Services eons ago), I can definitely say Analysis Services is not dead, BUT, interestingly, people are using the smaller-scaled Tabular Model storage of Power BI (limited to 10GB) rather than the enterprise-grade deployments of Tabular Models under Azure Analysis Services. Some folks are still using SQL Server Analysis Services, as Multidimensional storage (MOLAP) can address certain scenarios which Tabular simply cannot.
And you know, Analysis Services is probably the last place to focus if you’re looking to move your BI workloads to the cloud, because Power BI can talk to traditional cubes and quietly unlocks the basic features of Azure Analysis Services (as Tabular Models) along the way.
It seemed like a real shame to have to reduce the drawing to just ONE team. Fortunately, I didn’t have to….
Both teams accumulated 16 points. I had asked a person at the beginning to pick a number between 1 and 5; she picked 4. I asked two people at the end to each pick a number between 1 and 10, and they picked 6 and 1. So, the winner was (4*6)-1, or the 23rd person. It was a fun session and I think the real winner was me!
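For the curious, the winner selection works out like this (the three picks are the ones from the session):

```python
# Winner selection: one pick between 1 and 5, then two picks between
# 1 and 10, combined as (first * second) - third.
a, b, c = 4, 6, 1
winner_position = a * b - c
print(winner_position)  # -> 23, i.e. the 23rd person
```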
Trey Johnson is ZAP’s Chief Evangelist. Based out of Jacksonville, Florida, he brings experience from leading various boutique BI software and national consulting companies. A published author, speaker, and consultant, Trey sat on the PASS Board of Directors over multiple terms, concluding as their Executive Vice President. He was a long-term member of Microsoft’s BI Partner Advisory Council and has spent the last 25 years delivering business intelligence, data warehousing, and data management solutions to businesses of all shapes, sizes and “data challenges.” Follow Trey on Twitter and LinkedIn.