A practical guide to orchestrating data workflows with Apache Airflow
Dylan Intorf, Dylan Storey, Kendrick van Doorn

#Data
#Airflow
#Apache_Airflow
#DAG
#CI/CD
#ETL
#ML
#UI
#DevOps
Confidently orchestrate your data pipelines with Apache Airflow by applying industry best practices and scalable strategies
Data professionals face the monumental task of managing complex data pipelines, orchestrating workflows across diverse systems, and ensuring scalable, reliable data processing. This definitive guide to mastering Apache Airflow, written by experts in engineering, data strategy, and problem-solving across the tech, financial, and life sciences industries, is your key to overcoming these challenges. It covers everything from the basics of Airflow and its core components to advanced topics such as custom plugin development, multi-tenancy, and cloud deployment.
Starting with an introduction to data orchestration and the significant updates in Apache Airflow 2.0, this book takes you through the essentials of DAG authoring, managing Airflow components, and connecting to external data sources. Through real-world use cases, you’ll gain practical insights into implementing ETL pipelines and machine learning workflows in your environment. You’ll also learn how to deploy Airflow in cloud environments, tackle operational considerations for scaling, and apply best practices for CI/CD and monitoring.
By the end of this book, you’ll be proficient in operating and using Apache Airflow, authoring high-quality workflows in Python for your specific use cases, and making informed decisions crucial for production-ready implementation.
This book is for data engineers, developers, IT professionals, and data scientists who want to optimize workflow orchestration with Apache Airflow. It's perfect for those who recognize Airflow's potential and want to avoid common implementation pitfalls. Whether you're new to data, an experienced professional, or a manager seeking insights, this guide will support you. A functional understanding of Python, some business experience, and basic DevOps skills are helpful. Prior experience with Airflow is beneficial but not required.
Dylan Intorf is a seasoned technology leader with a B.Sc. in computer science from Arizona State University. With over a decade of experience in software and data engineering, he has delivered tailored solutions to the technology, financial, and insurance sectors. Dylan's expertise in data and infrastructure management has been instrumental in optimizing Airflow deployments and operations for several Fortune 25 companies.
Dylan Storey holds a B.Sc. and M.Sc. in biology from California State University, Fresno, and a Ph.D. in life sciences from the University of Tennessee, Knoxville, where he specialized in leveraging computational methods to study complex biological systems. With over 15 years of experience, Dylan has successfully built, grown, and led teams to drive the development and operation of data products across various scales and industries, including many of the top Fortune-recognized organizations. He is also an expert in leveraging AI and machine learning to automate processes and decisions, enabling businesses to achieve their strategic goals.
Kendrick van Doorn is an accomplished engineering and business leader with a strong foundation in software development, honed through impactful work with federal agencies and consulting technology firms. With over a decade of experience in crafting technology and data strategies for leading brands, he has consistently driven innovation and efficiency. Kendrick holds a B.Sc. in computer engineering from Villanova University, an M.Sc. in systems engineering from George Mason University, and an MBA from Columbia University.









