I’m a data geek 🤓 In fact, I like data so much that I have made it my career! I work with Azure Data and the Microsoft Data Platform, focusing on Data Integration using Azure Data Factory (ADF), Azure Synapse Analytics, and SQL Server Integration Services (SSIS).
In this category, I write technical posts and guides, and share my experiences with certification exams. You can also find a few interviews with Azure and SQL Server experts!
Azure Data posts may cover topics like Azure Data Factory, Azure Synapse Analytics, Azure SQL Databases, and Azure Data Lake Storage. Microsoft Data Platform posts may cover topics like SQL Server, T-SQL, SQL Server Management Studio (SSMS), and SQL Server Integration Services (SSIS).
One of the sessions I was most looking forward to at Microsoft Ignite 2017 was “New capabilities for data integration in the cloud” with Mike Flasko. In that session, he talks about Azure Data Factory (ADF) v2 and its new first-class SSIS support.
After the session, I convinced Mike Flasko and Sanjay Krishnamurthi to have a chat with me 🤓 We talked about what’s new in Azure Data Factory v2, including the updated pipeline application model with a new visual design canvas, new Software Development Kits (SDKs) for working with Azure Data Factory, the new Integration Runtime, and the ability to run SSIS packages inside Azure Data Factory v2.
Azure Data Factory v2 with Mike Flasko
At Microsoft Ignite 2017, I had planned an interview with Sunil Agarwal, and was very excited about it. Then Sunil asked if he could bring Kevin Farlee. Of course! Then he asked if he could also bring their customer, Aaron Gerdeman from FIS. Even better! 😁
In this interview, we chat about SQL Server 2017, Resumable Index Builds, Adaptive Query Processing, Columnstore Indexes, High Availability, Real-time Analytics, Real-time Dashboards and the SQL Tiger Team.
During Microsoft Ignite 2017, I got to interview one of the nicest guys in Microsoft, Bob Ward! 🤩
In this interview, we chat about SQL Server 2017, SQL Server on Linux, Adaptive Query Processing, Auto Plan Correction and Columnstore Indexes.
SQL Server 2017 with Bob Ward - Microsoft Ignite 2017
I got to interview Buck Woody about Data Science at Microsoft Ignite 2017! 🥳
In this interview, we chat about Microsoft Business Analytics and AI (formerly known as Cortana Intelligence Suite), Artificial Intelligence in Excel, intent-based programming, Predictive Analytics, DevOps for Data Scientists and life-long learning.
Data Science with Buck Woody - Microsoft Ignite 2017
T-SQL Tuesday #68 is hosted by Andy Yun (@SQLBek). Many SQL Server defaults are not ideal, and most of us have a list of defaults we always change. Andy wants us to Just Say No to Defaults and blog about what, why or how we change defaults.
If you are an SSIS developer like me, there is a good chance that the ProtectionLevel property in SSIS packages is at the top of your list of defaults to change. The default ProtectionLevel is EncryptSensitiveWithUserKey (ugh), but most of the time it is not the best option. Raise your hand if you have ever asked your favorite search engine for advice on issues like “SSIS package fails in SQL Server Agent job”, or if you have ever heard someone exclaim “but it works on my machine!?” 😅
There are many great blog posts about the different ProtectionLevels, why you probably want to make DontSaveSensitive your default, and how to use configurations and parameters instead of encrypted SSIS packages. I will not go into details about any of that in this post. Instead, I will use ProtectionLevel as an example of a default property you may want to change in many SSIS packages at the same time.
So how do you batch update properties in existing SSIS packages? You probably don’t want to open every single package and change it by hand. One quick option is sketched below.
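One way is to script the change directly against the package XML. Here is a minimal sketch in Python (just for illustration), assuming the SQL Server 2012+ .dtsx format where ProtectionLevel is stored as a DTS:ProtectionLevel attribute on the package’s root element, and assuming the value 0 maps to DontSaveSensitive. The folder path is a placeholder, and you should always test on copies of your packages first; packages that already contain encrypted sensitive values need extra care.

```python
# Hypothetical sketch: set ProtectionLevel on every .dtsx package in a folder
# by editing the package XML directly. The namespace, attribute name, and the
# numeric value mapping are assumptions to verify against your own packages.
from pathlib import Path
import xml.etree.ElementTree as ET

DTS_NS = "www.microsoft.com/SqlServer/Dts"        # DTS namespace as seen in .dtsx files
PROTECTION_ATTR = f"{{{DTS_NS}}}ProtectionLevel"  # the DTS:ProtectionLevel attribute
DONT_SAVE_SENSITIVE = "0"                         # assumed value for DontSaveSensitive

ET.register_namespace("DTS", DTS_NS)              # keep the DTS: prefix when saving


def set_protection_level(folder: str, value: str = DONT_SAVE_SENSITIVE) -> None:
    """Set DTS:ProtectionLevel on the root element of every .dtsx file in folder."""
    for package in Path(folder).glob("*.dtsx"):
        tree = ET.parse(package)
        root = tree.getroot()                     # the DTS:Executable element
        old = root.get(PROTECTION_ATTR, "<default>")
        root.set(PROTECTION_ATTR, value)
        tree.write(package, encoding="utf-8", xml_declaration=True)
        print(f"{package.name}: {old} -> {value}")


if __name__ == "__main__":
    set_protection_level(r"C:\Projects\MySsisProject")  # hypothetical project folder
```

The point of the sketch is simply that the change itself is a one-attribute edit per package, so it can be looped over a whole project. For anything beyond a quick fix, the SSIS object model or a dedicated tool is probably a better fit.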