
Apache Airflow

Do you want to develop workflows using Apache Airflow? This 1-day course teaches you the internals, terminology, and best practices of writing DAGs, plus hands-on experience in writing and maintaining data pipelines.

Develop workflows with Apache Airflow!

If you recognize the following situation, this training will surely help: you start with a small cronjob that runs every night. A few weeks later, you write another job that uses the output of the first job. To be safe, you schedule it a couple of hours after midnight. Over the following weeks, people come to your desk complaining that the data is missing and they can't do their jobs. This is where Apache Airflow comes into play.

What is Apache Airflow?

Apache Airflow is a data orchestration tool to monitor, audit, and run your data pipelines. Airflow takes care of scheduling your jobs and graphically visualizes your pipelines. If you do ETL from a database, the database can be unavailable for various reasons. Airflow detects that the job is failing and retries it later on, deferring the jobs that depend on its output until it succeeds.
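To give a taste of what this looks like, here is a minimal sketch (assuming Airflow 2.4+ syntax; the DAG id and bash commands are illustrative placeholders). The two nightly jobs from the scenario above become two dependent tasks with retries, instead of a guessed time buffer between cronjobs:

    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    # Two dependent nightly tasks: Airflow retries a failure and defers the
    # downstream task, instead of relying on a fixed gap between cronjobs.
    with DAG(
        dag_id="nightly_pipeline",  # hypothetical name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
        default_args={
            "retries": 3,                          # retry a failing task...
            "retry_delay": timedelta(minutes=15),  # ...after a short delay
        },
    ) as dag:
        extract = BashOperator(task_id="extract", bash_command="echo 'pull from the database'")
        transform = BashOperator(task_id="transform", bash_command="echo 'process the data'")

        # transform only runs after extract succeeds; if extract fails and is
        # retried, transform is deferred automatically.
        extract >> transform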

This Apache Airflow training is perfect for

The Apache Airflow training is ideal for Data Scientists and Data Engineers. Do you want to bring your data to production? Do you want to monitor, control, and run your data pipelines with Airflow? This training shows you how! Apache Airflow is an Expert-level training, which means there are requirements for participation. To ensure that you get the most out of the day, you must be familiar with the basic principles of data engineering and the Python programming language. If you are comfortable with these basics, you will be all right!

What will you learn during the Apache Airflow training?

You will learn how to develop workflows using Apache Airflow. We will go over the internals, terminology, and best practices of writing DAGs. To increase your confidence and make sure you are comfortable working with Airflow, you will work on a use case that involves writing and maintaining data pipelines.

Program

You will learn about the following topics (a short code sketch illustrating some of the advanced topics follows the list):

  • Airflow components
  • DAGs
  • Operators and sensors
  • Branching
  • Tasks
  • Connections
  • Libraries
  • Common Variables
  • Advanced topics:
      • Logs
      • XCom
      • Writing your own operators and hooks
      • Pools
      • Jinja templating
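As a flavor of the advanced topics, here is a minimal sketch (again assuming Airflow 2.4+ syntax; all names are hypothetical) that combines XCom and Jinja templating: the first task pushes a value to XCom, and the second pulls it inside a templated bash command:

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator
    from airflow.operators.python import PythonOperator

    def count_rows():
        # The return value of a PythonOperator callable is pushed to XCom.
        return 42

    with DAG(
        dag_id="xcom_templating_sketch",  # hypothetical name
        start_date=datetime(2024, 1, 1),
        schedule=None,
    ) as dag:
        count = PythonOperator(task_id="count_rows", python_callable=count_rows)

        # Jinja templating: {{ ds }} renders the logical date, and ti.xcom_pull
        # reads the value pushed to XCom by the upstream task.
        report = BashOperator(
            task_id="report",
            bash_command='echo "{{ ds }}: {{ ti.xcom_pull(task_ids=\'count_rows\') }} rows"',
        )

        count >> report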

Data Engineering Trainers

This Data Engineering training is brought to you by our training partner, GoDataDriven. GoDataDriven works with experts in their field who are always on the lookout for the most innovative ways to get the most out of data. Your trainer is a data guru who enjoys sharing their experience to help you work with the latest tools.

Data Engineering Learning Journey

In addition to this Apache Airflow training, we also offer a 3-day Spark Programming course together with GoDataDriven. Read more about this in-depth, Expert-level training here.

Yes, I want to know more about Apache Airflow

After registering for this training, you will receive a confirmation email with practical information. A week before the training, we will ask you about any dietary requirements and share literature in case you need to prepare. See you soon!

What else should I know?

  • You must bring your laptop and make sure it can access the internet. 
  • This course is brought to you by our training partner, GoDataDriven.
Get in touch

Our team is at your service. Get in touch, or call +31 (0)35 538 1921.