Apache Airflow Training
16 December 2024 – Amsterdam, The Netherlands
Unravel Apache Airflow in our training as we tackle the complexity of cron job management and data reliability. Through hands-on exercises you will master Airflow’s essentials for scheduling, monitoring, and moving data through your workflows, learn to build efficient pipelines, and get to grips with Airflow’s architecture. The training goes beyond the tool itself, emphasizing thoughtful design decisions that lead to robust workflows. Join us to demystify Airflow and equip yourself with the skills to streamline your data processes.
Looking to upskill your team(s) or organization?
Rozaliia will gladly help you further with custom training solutions for your organization.
Get in touch
Duration
2 days
Time
09:00 – 17:00
Language
English
Lunch
Included
Certification
No
Level
Advanced
What will you learn?
After the training, you will be able to:
Use basic Airflow concepts such as DAGs, Operators, and Hooks (a minimal sketch follows this list).
Use Airflow Operators to execute predefined tasks that integrate with many different systems.
Design and create task workflows using Airflow DAGs.
Use the Airflow UI to get insights into the status of your workflows.
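To give a flavour of these learning goals, here is a minimal sketch of a DAG that wires two Operators into a daily-scheduled workflow. It assumes a recent Airflow 2 release; the dag_id, task names, and commands are illustrative and not taken from the course material.

```python
# A minimal sketch: one DAG, two Operators, a daily schedule.
# Assumes a recent Airflow 2 release; all names are illustrative.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def _transform():
    # Placeholder for real transformation logic.
    print("transforming data")


with DAG(
    dag_id="example_etl",             # unique name of the workflow
    start_date=datetime(2024, 1, 1),  # first logical run date
    schedule="@daily",                # run once per day
    catchup=False,                    # do not backfill missed runs
) as dag:
    extract = BashOperator(
        task_id="extract",
        bash_command="echo 'extracting data'",
    )
    transform = PythonOperator(
        task_id="transform",
        python_callable=_transform,
    )

    # The >> operator defines the dependency: extract runs before transform.
    extract >> transform
```

Once this file is placed in the DAGs folder, the workflow, its schedule, and the status of every run become visible in the Airflow UI.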
Program
- What is Airflow?
- Airflow Installation
- Create Your First DAG
- Scheduling pipelines in Airflow
- Context & Templating
- Branching & Trigger Rules (previewed in the sketch after this program overview)
- Sensors
- Introduction to the Capstone Project
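Two of the program topics, Context & Templating and Branching & Trigger Rules, can be previewed with a short sketch. Again this assumes a recent Airflow 2 release and all names are illustrative: the {{ ds }} template renders the run’s logical date, a BranchPythonOperator selects one downstream path, and a trigger rule lets the final task run even though the other branch was skipped.

```python
# A sketch of templating, branching, and trigger rules in one small DAG.
# Assumes a recent Airflow 2 release; dag_id and task names are illustrative.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.empty import EmptyOperator
from airflow.operators.python import BranchPythonOperator


def _choose_report(ds=None, **_):
    # "ds" is injected from the task context: the logical date as YYYY-MM-DD.
    is_weekday = datetime.fromisoformat(ds).weekday() < 5
    return "weekday_report" if is_weekday else "weekend_report"


with DAG(
    dag_id="example_branching",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Templating: {{ ds }} is rendered to the run's logical date at execution time.
    print_date = BashOperator(
        task_id="print_date",
        bash_command="echo 'running for {{ ds }}'",
    )

    # Branching: only the task whose id is returned by the callable is executed.
    choose_report = BranchPythonOperator(
        task_id="choose_report",
        python_callable=_choose_report,
    )

    weekday_report = EmptyOperator(task_id="weekday_report")
    weekend_report = EmptyOperator(task_id="weekend_report")

    # Trigger rule: run once at least one branch succeeded and none failed,
    # even though the other branch was skipped.
    join = EmptyOperator(
        task_id="join",
        trigger_rule="none_failed_min_one_success",
    )

    print_date >> choose_report >> [weekday_report, weekend_report]
    [weekday_report, weekend_report] >> join
```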
Who is it for?
This training is for you if:
You want to schedule and monitor your data pipelines in a user-friendly way.
You have complex data pipelines with a lot of dependencies.
You want to schedule your data pipelines on external systems.
This training is not for you if:
You want to handle streaming data pipelines.
Your primary focus is on single, isolated tasks rather than orchestrating complex workflows.
You don’t work with data pipelines or recurring processes that require scheduling.
You have less than one year of experience working with Python in the Data Engineering field.