Andy Leonard

About Andy Leonard
Andy Leonard is a Chief Data Engineer at Enterprise Data & Analytics, a Biml (Business Intelligence Markup Language) developer and BimlHero, SSIS architect, consultant, and trainer. He created and maintains the DILM (Data Integration Lifecycle Management) Suite that includes tools and utilities for managing SSIS in the enterprise. Andy is an avid blogger and has co-authored several books on SSIS, ETL, and database technology.
Author Updates
Blog post: This is part of a series of posts. This post focuses on using Azure Data Factory with a GitHub repository.
- One Way to Add an Existing Data Factory to Github, Part 1
- One Way to Add an Existing Data Factory to Github, Part 2
- One Way to Add an Existing Data Factory to Github, Part 3
Making Changes Using the Main Branch
Next, introduce a change to Azure Data Factory. In my Azure Data Factory at the time of this post, I have a single pipeline published (deployed). The pipeline is named “pi…
Blog post: This is part of a series of posts. This post focuses on connecting an Azure Data Factory to a GitHub repository.
- One Way to Add an Existing Data Factory to Github, Part 1
- One Way to Add an Existing Data Factory to Github, Part 2
- One Way to Add an Existing Data Factory to Github, Part 3
Connect ADF to GitHub Repository
Careful readers will note I change GitHub accounts moving forward. I made the change so that I could accurately demonstrate creating a new account on github.com (at…
Blog post: This is part of a series of posts. This post focuses on creating a GitHub account and repository.
- One Way to Add an Existing Data Factory to Github, Part 1
- One Way to Add an Existing Data Factory to Github, Part 2
- One Way to Add an Existing Data Factory to Github, Part 3
You built an Azure Data Factory back in the olde days and now you want to save your ADF pipelines and affiliated artifacts. What to do? What to do? One way to save all that code is to connect your data factory to GitHub…
Blog post: That’s what we sell at Data Integration Lifecycle Management Suite: confidence.
Confidence
There’s no substitute for that feeling of confidence when SSIS Catalog Compare is used to compare two SSIS Catalogs and the comparison yields no differences.
I cannot improve on testimonies from our customers:
The question of comparing a Production SSIS Catalog with a QA (Quality Assurance) SSIS Catalog is a loaded question. The question presumes the enterprise SSIS lifecycle is…
Blog post: This post is a follow-up to my recent post titled Promoting SSIS Catalog Objects Between Lifecycle Management Tiers.
DILM Deployment Utility is a great way to encapsulate SSIS Catalog folders as code in sccpac files. Drop the files in a git repo, add, commit, and push, and voilà: your SSIS Catalog folder(s) and all Catalog objects are scripted and source controlled.
SSIS Catalog Compare is on sale until 31 Aug 2022. Subscribe now and save!
Enjoy the video.
Blog post: Code promotion is essential to DevOps. One problem solved by SSIS Catalog Compare is code promotion via scripting between Development, Test, Pre-Production, and Production lifecycle management tiers.
In the video, I demonstrate:
- One gap (scripting all SSIS objects stored in the SSIS Catalog) in SSIS management tools shipped with SSMS
- One way SSIS Catalog Compare addresses this gap
Subscriptions to SSIS Catalog Compare are on sale this month (August 2022).
Schedule a…
Blog post: For the month of August 2022, the SSIS Catalog Compare Enterprise Edition yearly subscription is on sale for 40% off the regular rate.
SSIS Catalog Compare facilitates Data Integration Lifecycle Management (DILM) and DevOps with SSIS by:
- Comparing SSIS Catalog metadata between data integration lifecycle tiers (Development, Test, QA, Production, etc.)
- Promoting (deploying) SSIS Catalog metadata between data integration lifecycle tiers
- Scripting SSIS Catalog metadata…
Blog post: Azure Automation has been around for a few years now. I just got started because my brother and friend, Aaron Nelson (@SQLvariant), shared some automation he’s been working on. Once I got my head around the piece I’m about to share with you, Azure Automation started to make sense to me.
Set Up an Azure Automation Account
Browse to portal.azure.com. Search for “automation” and then click “Automation Accounts”. The Automation Accounts blade displays. Click the “+ Create” link…
Blog post: I am honored to deliver Master the Fundamentals of ADF at SQL Saturday Boston 2022 (#1031) on 7 Oct 2022!
Abstract
Azure Data Factory, or ADF, is an Azure PaaS (Platform-as-a-Service) that provides hybrid data integration at global scale. Use ADF to build fully managed ETL in the cloud – including SSIS.
Join Andy Leonard – Microsoft Data Platform MVP, author, blogger, and Chief Data Engineer at Enterprise Data & Analytics – as he demonstrates Azure Data Factory in action.
Blog post: I am honored to deliver two training sessions at the PASS Data Community Summit 2022:
- A Day of Azure Data Factory is a full-day pre-conference session scheduled for Monday, 14 Nov 2022. In this session I cover version control integration, copying data, using parameters and metadata, design patterns (of course!), and more!
- Introduction to Azure Data Factory is a 75-minute presentation exploring the ADF web GUI, Azure-SSIS integration runtime, and designing a basic pipeline. You may not…
Titles By Andy Leonard
SQL Server MVP Deep Dives is organized into five parts: Design and Architecture, Development, Administration, Performance Tuning and Optimization, and Business Intelligence. Within each part, you'll find a collection of brilliantly concise and focused chapters that take on key topics like mobile data strategies, Dynamic Management Views, or query performance. The range of subjects covered is comprehensive, from database design tips to data profiling strategies for BI.
Additionally, the authors of this book have generously donated 100% of their royalties to support War Child International. War Child International is a network of independent organizations, working across the world to help children affected by war. War Child was founded upon a fundamental goal: to advance the cause of peace through investing hope in the lives of children caught up in the horrors of war. War Child works in many different conflict areas around the world, helping hundreds of thousands of children every year. Visit www.warchild.org for more information.
Purchase of the print book comes with an offer of a free PDF, ePub, and Kindle eBook from Manning. Also available is all code from the book.
Frameworks not only reduce the time required to deliver enterprise functionality, but can also accelerate troubleshooting and problem resolution. You'll learn in this book how frameworks also improve code quality by using metadata to drive processes. Much of the work performed by data professionals can be classified as “drudge work”—tasks that are repetitive and template-based. The frameworks-based approach shown in this book helps you to avoid that drudgery by turning repetitive tasks into "one and done" operations. Frameworks as described in this book also support enterprise DevOps with built-in logging functionality.
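The book builds its framework from T-SQL stored procedures (see the What You Will Learn list below). Purely as an illustration of the metadata-driven idea it describes, here is a rough C# sketch that reads a hypothetical fwk.Processes metadata table, executes each listed stored procedure in order, and writes start and completion events to a hypothetical fwk.ExecutionLog table. The schema, table, and column names are invented for this sketch and are not the book's.

```csharp
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

// Rough illustration of a metadata-driven execution framework with logging.
// The fwk.Processes and fwk.ExecutionLog tables are hypothetical.
class FrameworkRunner
{
    public static void RunAll(string connectionString)
    {
        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();

            // Read the list of stored procedures to execute, in metadata order.
            var procedures = new List<string>();
            using (var command = new SqlCommand(
                "SELECT ProcedureName FROM fwk.Processes ORDER BY ExecutionOrder;",
                connection))
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    procedures.Add(reader.GetString(0));
                }
            }

            // Execute each procedure, logging before and after execution.
            foreach (var procedureName in procedures)
            {
                LogEvent(connection, procedureName, "Started");
                using (var execCommand = new SqlCommand(procedureName, connection))
                {
                    execCommand.CommandType = CommandType.StoredProcedure;
                    execCommand.ExecuteNonQuery();
                }
                LogEvent(connection, procedureName, "Succeeded");
            }
        }
    }

    static void LogEvent(SqlConnection connection, string procedureName, string status)
    {
        using (var command = new SqlCommand(
            "INSERT INTO fwk.ExecutionLog (ProcedureName, Status, LogTime) " +
            "VALUES (@p, @s, SYSDATETIME());", connection))
        {
            command.Parameters.AddWithValue("@p", procedureName);
            command.Parameters.AddWithValue("@s", status);
            command.ExecuteNonQuery();
        }
    }
}
```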
What You Will Learn
- Create a stored procedure framework to automate SQL process execution
- Base your framework on a working system of stored procedures and execution logging
- Create an SSIS framework to reduce the complexity of executing multiple SSIS packages
- Deploy stored procedure and SSIS frameworks to Azure Data Factory environments in the cloud
Who This Book Is For
Database administrators and developers who are involved in enterprise data projects built around stored procedures and SQL Server Integration Services (SSIS). Readers should have a background in programming along with a desire to optimize their data efforts by implementing repeatable processes that support enterprise DevOps.
Build custom SQL Server Integration Services (SSIS) tasks using Visual Studio Community Edition and C#. Bring all the power of Microsoft .NET to bear on your data integration and ETL processes, and for no added cost over what you’ve already spent on licensing SQL Server. New in this edition is a demonstration deploying a custom SSIS task to the Azure Data Factory (ADF) Azure-SSIS Integration Runtime (IR).
All examples in this new edition are implemented in C#. Custom task developers are shown how to implement custom tasks using the widely accepted and default language for .NET development.
Why are custom components necessary? Because even though the SSIS catalog of built-in tasks and components is a marvel of engineering, gaps remain in the available functionality. One such gap is a constraint of the built-in SSIS Execute Package Task, which does not allow SSIS developers to select SSIS packages from other projects in the SSIS Catalog. Examples in this book show how to create a custom Execute Catalog Package task that allows SSIS developers to execute packages from other projects in the SSIS Catalog. Building on the examples and patterns in this book, SSIS developers may create any task to which they aspire, custom tailored to their specific data integration and ETL needs.
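To give a sense of what custom task development involves, a custom task is a .NET class that derives from Microsoft.SqlServer.Dts.Runtime.Task, carries the DtsTask attribute, and overrides Validate and Execute. The following is a minimal sketch, not the book's sample code; the class, namespace, and property names are illustrative.

```csharp
using Microsoft.SqlServer.Dts.Runtime;

namespace CustomSsisTasks
{
    // Minimal skeleton of a custom SSIS task (illustrative only). The assembly
    // must be strong-named and installed where SSIS can find it before the
    // task appears in the designer.
    [DtsTask(DisplayName = "Execute Catalog Package Task",
             Description = "Executes an SSIS package from another project in the SSIS Catalog.")]
    public class ExecuteCatalogPackageTask : Task
    {
        // Properties surfaced to the SSIS designer (names are illustrative).
        public string ServerName { get; set; }
        public string FolderName { get; set; }
        public string ProjectName { get; set; }
        public string PackageName { get; set; }

        public override DTSExecResult Validate(Connections connections,
            VariableDispenser variableDispenser,
            IDTSComponentEvents componentEvents, IDTSLogging log)
        {
            // Fail validation early if a required property is missing.
            if (string.IsNullOrEmpty(PackageName))
            {
                componentEvents.FireError(0, nameof(ExecuteCatalogPackageTask),
                    "PackageName is required.", string.Empty, 0);
                return DTSExecResult.Failure;
            }
            return DTSExecResult.Success;
        }

        public override DTSExecResult Execute(Connections connections,
            VariableDispenser variableDispenser,
            IDTSComponentEvents componentEvents, IDTSLogging log,
            object transaction)
        {
            // Start the catalog execution here, for example via the SSIS
            // management object model or SSISDB's catalog.create_execution
            // and catalog.start_execution stored procedures.
            return DTSExecResult.Success;
        }
    }
}
```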
What You Will Learn
- Configure and execute Visual Studio in the way that best supports SSIS task development
- Create a class library as the basis for an SSIS task, and reference the needed SSIS assemblies
- Properly sign assemblies that you create in order to invoke them from your task
- Implement source code control via Azure DevOps, or your own favorite tool set
- Troubleshoot and execute custom tasks as part of your own projects
- Create deployment projects (MSIs) for distributing code-complete tasks
- Deploy custom tasks to Azure Data Factory Azure-SSIS IRs in the cloud
- Create advanced editors for custom task parameters
Who This Book Is For
For database administrators and developers who are involved in ETL projects built around SQL Server Integration Services (SSIS). Readers do not need a background in software development with C#. Most important is a desire to optimize ETL efforts by creating custom-tailored tasks for execution in SSIS packages, on-premises or in ADF Azure-SSIS IRs.
SQL Server MVP Deep Dives, Volume 2 is a unique book that lets you learn from the best in the business - 64 SQL Server MVPs offer completely new content in this second volume on topics ranging from testing and policy management to integration services, reporting, and performance optimization techniques...and more.
About this Book
To become an MVP requires deep knowledge and impressive skill. Together, the 64 MVPs who wrote this book bring about 1,000 years of experience in SQL Server administration, development, training, and design. This incredible book captures their expertise and passion in 60 concise, hand-picked chapters and offers valuable insights for readers of all levels.
SQL Server MVP Deep Dives, Volume 2 picks up where the first volume leaves off, with completely new content on topics ranging from testing and policy management to integration services, reporting, and performance optimization. The chapters fall into five parts: Architecture and Design, Database Administration, Database Development, Performance Tuning and Optimization, and Business Intelligence.
Purchase of the print book comes with an offer of a free PDF, ePub, and Kindle eBook from Manning. Also available is all code from the book.
What's Inside
- Discovering servers with PowerShell
- Using regular expressions in SSMS
- Tuning the Transaction Log for OLTP
- Optimizing SSIS for dimensional data
- Real-time BI and much more
Manning Publications and the authors of this book support the children of Operation Smile, an international children's medical charity that mobilizes medical volunteers to perform free reconstructive surgery for children suffering from facial deformities such as cleft lips and cleft palates, and that provides education and training programs to local doctors on the latest surgical techniques.
Table of Contents
PART 1 ARCHITECTURE Edited by Louis Davidson
PART 2 DATABASE ADMINISTRATION Edited by Paul Randal and Kimberly Tripp
PART 3 DATABASE DEVELOPMENT Edited by Paul Nielsen
PART 4 PERFORMANCE TUNING AND OPTIMIZATION Edited by Brad M. McGehee
PART 5 BUSINESS INTELLIGENCE Edited by Greg Low
The first part of the book starts with the basics: configuring your development environment, Biml syntax, and scripting essentials.
Whether you are a beginner or a seasoned Biml expert, the next part of the book guides you through the process of using Biml to build a framework that captures both your design patterns and execution management. Design patterns are reusable code blocks that standardize the approach you use to perform certain types of data integration, logging, and other key data functions. Design patterns solve common problems encountered when developing data integration solutions. Because you do not have to build the code from scratch each time, design patterns improve your efficiency as a Biml developer.
In addition to leveraging design patterns in your framework, you will learn how to build a robust metadata store and how to package your framework into Biml bundles for deployment within your enterprise.
In the last part of the book, we teach you more advanced Biml features and capabilities, such as SSAS development, T-SQL recipes, documentation autogeneration, and Biml troubleshooting.
The Biml Book:
- Provides practical and applicable examples
- Teaches you how to use Biml to reduce development time while improving quality
- Takes you through solutions to common data integration and BI challenges
What You Will Learn
- Master the basics of Business Intelligence Markup Language (Biml)
- Study patterns for automating SSIS package generation
- Build a Biml Framework
- Import and transform database schemas
- Automate generation of scripts and projects
Who This Book Is For
BI developers wishing to quickly locate previously tested solutions, Microsoft BI specialists, those seeking more information about solution automation and code generation, and practitioners of Data Integration Lifecycle Management (DILM) in the DevOps enterprise
SQL Server Integration Services Design Patterns is newly-revised for SQL Server 2014, and is a book of recipes for SQL Server Integration Services (SSIS). Design patterns in the book help to solve common problems encountered when developing data integration solutions. The patterns and solution examples in the book increase your efficiency as an SSIS developer, because you do not have to design and code from scratch with each new problem you face. The book's team of expert authors take you through numerous design patterns that you'll soon be using every day, providing the thought process and technical details needed to support their solutions.
SQL Server Integration Services Design Patterns goes beyond the surface of the immediate problems to be solved, delving into why particular problems should be solved in certain ways. You'll learn more about SSIS as a result, and you'll learn by practical example. Where appropriate, the book provides examples of alternative patterns and discusses when and where they should be used. Highlights of the book include sections on ETL Instrumentation, SSIS Frameworks, Business Intelligence Markup Language, and Dependency Services.
- Takes you through solutions to common data integration challenges
- Provides examples involving Business Intelligence Markup Language
- Teaches SSIS using practical examples
Why are custom components necessary? Because even though the SSIS catalog of built-in tasks and components is a marvel of engineering, gaps remain in the functionality provided. These gaps are especially relevant to enterprises practicing Data Integration Lifecycle Management (DILM) and/or DevOps.
One of the gaps is a limitation of the SSIS Execute Package task. Developers using the stock version of that task are unable to select SSIS packages from other projects. Yet it’s useful to be able to select and execute packages across projects, and the example used throughout this book will help you to create an Execute Catalog Package task that does in fact allow you to execute a package from another project. Building on the example’s pattern, you can create any task that you like, custom tailored to your specific data integration and ETL needs.
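One plausible way for such a task to start a catalog package is through the SSIS management object model (Microsoft.SqlServer.Management.IntegrationServices). This is a sketch under that assumption, not the book's implementation; the server, folder, project, and package names are placeholders, and package names in the catalog typically include the .dtsx extension.

```csharp
using System.Data.SqlClient;
using Microsoft.SqlServer.Management.IntegrationServices;

// Sketch: start a catalog package execution via the SSIS management object model.
public static class CatalogPackageRunner
{
    public static long RunPackage(string serverName, string folderName,
        string projectName, string packageName)
    {
        var connectionString =
            $"Data Source={serverName};Initial Catalog=SSISDB;Integrated Security=SSPI;";

        using (var connection = new SqlConnection(connectionString))
        {
            var integrationServices = new IntegrationServices(connection);
            var catalog = integrationServices.Catalogs["SSISDB"];
            var project = catalog.Folders[folderName].Projects[projectName];
            var package = project.Packages[packageName];

            // Execute returns the execution (operation) identifier, which can
            // be monitored later via the catalog.executions view in SSISDB.
            return package.Execute(false, null);
        }
    }
}
```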
What You Will Learn
- Configure and execute Visual Studio in the way that best supports SSIS task development
- Create a class library as the basis for an SSIS task, and reference the needed SSIS assemblies
- Properly sign assemblies that you create in order to invoke them from your task
- Implement source code control via Visual Studio Team Services, or your own favorite tool set
- Code not only your tasks themselves, but also the associated task editors
- Troubleshoot and then execute your custom tasks as part of your own project
Who This Book Is For
Database administrators and developers who are involved in ETL projects built around SQL Server Integration Services (SSIS). Readers should have a background in programming along with a desire to optimize their ETL efforts by creating custom-tailored tasks for execution from SSIS packages.
Data Integration Life Cycle Management with SSIS shows you how to bring DevOps benefits to SSIS integration projects. Practices in this book enable faster time to market, higher quality of code, and repeatable automation. Code will be created that is easier to support and maintain. The book teaches you how to more effectively manage SSIS in the enterprise environment by drawing on the art and science of modern DevOps practices.
What You'll Learn
- Generate dozens of SSIS packages in minutes to speed your integration projects
- Reduce the execution of related groups of SSIS packages to a single command
- Successfully handle SSIS catalog deployments and their projects
- Monitor the execution and history of SSIS catalog projects (a monitoring query is sketched after this list)
- Manage your enterprise data integration life cycle through automated tools and utilities
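As a rough sketch of the monitoring idea referenced above (not the book's utilities), the following C# snippet queries SSISDB's catalog.executions view to list recent SSIS Catalog executions and their status. The server name is a placeholder.

```csharp
using System;
using System.Data.SqlClient;

// Sketch: review recent SSIS Catalog executions by querying SSISDB directly.
public static class CatalogExecutionMonitor
{
    public static void PrintRecentExecutions()
    {
        const string connectionString =
            "Data Source=MySsisServer;Initial Catalog=SSISDB;Integrated Security=SSPI;";
        const string query = @"
            SELECT TOP (20)
                   execution_id, folder_name, project_name, package_name,
                   status, start_time
            FROM catalog.executions
            ORDER BY start_time DESC;";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(query, connection))
        {
            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    // Status 7 = succeeded, 4 = failed (per the SSISDB documentation).
                    var package = reader["package_name"];
                    var status = reader["status"];
                    Console.WriteLine($"{package}: status {status}");
                }
            }
        }
    }
}
```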
Who This Book Is For
Database professionals working with SQL Server Integration Services in enterprise environments. The book is especially useful to those readers following, or wishing to follow, DevOps practices in their use of SSIS.