Apache Airflow Training
14 March, 2024 – Virtual
Unravel Apache Airflow in our training, as we tackle intricate cron job management and data reliability. Engage in hands-on learning, mastering Airflow’s essentials for effortless workflow scheduling, monitoring, and data flow. Develop skills to create efficient pipelines and navigate architecture intricacies, all while making informed decisions. This training goes beyond tools, emphasizing thoughtful decision-making for robust workflows. Join us to demystify Airflow’s potential and equip yourself with skills to streamline your data processes effectively.
09:00 – 17:00
What will you learn?
After the training, you will be able to:
Use basic Airflow concepts, such as DAGs, Operators, and Hooks.
Use Airflow Operators to execute predefined tasks that integrate with many different systems.
Design and create task workflows using Airflow DAGs.
Use the Airflow UI to get insights into the status of your workflows.
In the Course Capstone Project, you will become a rocket scientist tasked with constructing an Apache Airflow data pipeline. Guided by a visionary boss, you leverage The SpaceDevs API to gather insights from rocket launches. Your journey centers on mastering architectural design, creating launch-scheduling operators, and optimizing pipeline efficiency. Embracing files and cloud storage, you will integrate SQL-like databases for comprehensive analysis. This achievement underscores the project’s impact on scientific exploration, offering a unique opportunity to harness Airflow’s power for orchestrating real-world data pipelines. 🚀
- What is Airflow?
- Airflow Installation
- Create Your First DAG
- Scheduling pipelines in Airflow
- Context & Templating
- Branching & Trigger Rules
- Introduction to the Capstone Project
Who is it for?
This training is for you if:
You want to schedule and monitor your data pipelines in a user-friendly way.
You have complex data pipelines with a lot of dependencies.
You want to schedule your data pipelines on external systems.
This training is not for you if:
You want to handle streaming data pipelines.
Your primary focus is on single, isolated tasks rather than orchestrating complex workflows.
You don’t work with data pipelines or recurring processes that require scheduling.
You have less than one year of experience working with Python in the Data Engineering field.
Why should I follow this training?
Get a solid understanding of Apache Airflow’s core concepts and practical hands-on experience.
Learn how to author, schedule, and monitor workflows.
Learn how best to design data pipelines for real-life use cases.
What should I know?
After registering for this training, you will receive a confirmation email with practical information. A week before the training, the trainer will get in touch to ask you about any requirements you may have and any pre-course tasks you will need to do.
See you soon!
All literature and course materials are included in the price.
Also interesting for you
Delve into the world of Large Language Models (LLMs) and state-of-the-art generative AI.
Unlock the power of data with SQL – the ultimate data management system!
Discover what MLOps is and how you can apply it in GCP (Google Cloud Platform) with our MLOps on GCP training course.
29 May, 2024
This MLOps on Azure training is a perfect next step if you’d like to take your Machine Learning models further.
7 Oct, 2024