Apache Airflow is a powerful, open-source workflow orchestration platform for programmatically authoring, scheduling, and monitoring workflows. Workflows are written in Python, and Airflow offers extensive integrations with databases, cloud services, and APIs. With the "Ubuntu 24.04 with Airflow" VPS template from Hostinger, Airflow comes pre-installed via Docker Compose, so you can start orchestrating workflows immediately.
Accessing Airflow
Once your VPS is deployed using this template, Airflow is ready to use out of the box. Open your web browser and navigate to:
http://[your-vps-IP]:8080
Log in with the username "admin" and the password you set during installation. After logging in, you'll reach the Airflow web interface, where you can view, trigger, and monitor your DAGs.
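To confirm the webserver is reachable before opening a browser, you can also query its /health endpoint, which reports the status of the scheduler and the metadata database. Below is a minimal Python sketch using only the standard library; the IP placeholder is yours to fill in, and it assumes the default, unauthenticated /health endpoint of a standard Airflow 2.x deployment.
# check_airflow_health.py -- quick reachability check for the Airflow webserver
# Assumes a standard Airflow 2.x setup where /health is enabled and unauthenticated.
import json
import urllib.request

AIRFLOW_URL = "http://YOUR_VPS_IP:8080"  # replace with your VPS IP address

with urllib.request.urlopen(f"{AIRFLOW_URL}/health", timeout=10) as response:
    health = json.load(response)

# The response maps component names (e.g. "metadatabase", "scheduler") to their status
for component, details in health.items():
    print(f"{component}: {details.get('status', 'unknown')}")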
Airflow Directory Structure
The Airflow installation is located in /root/airflow/ with the following structure:
/root/airflow/
├── docker-compose.yml # Docker Compose configuration
├── dags/ # Place your DAG files here
├── logs/ # Airflow execution logs
├── plugins/ # Custom Airflow plugins
└── config/ # Airflow configuration files
Basic Airflow Operations
Here are some essential tasks to get you started with Airflow:
Managing DAGs
Add DAGs: Place your Python DAG files in /root/airflow/dags/ (a minimal example DAG follows this list)
View DAGs: The main dashboard shows all your DAGs with their current status
Trigger DAG: Click the "play" button next to any DAG to trigger it manually
Pause/Unpause: Use the toggle switch to pause or unpause scheduled DAG runs
View Logs: Click on a task instance to view detailed execution logs
Graph View: Click on a DAG name to see its task dependencies and execution flow
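To try the workflow end to end, here is a minimal example DAG you can drop into /root/airflow/dags/. The file name, DAG ID, and schedule are illustrative; the imports assume the Airflow 2.x series that the Docker Compose template ships.
# hello_world.py -- a minimal example DAG (file name and dag_id are illustrative)
# Place this file in /root/airflow/dags/; the scheduler picks it up automatically
# and it appears on the main dashboard shortly afterwards.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def say_hello():
    print("Hello from Airflow!")


with DAG(
    dag_id="hello_world",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # on Airflow older than 2.4, use schedule_interval instead
    catchup=False,      # do not backfill runs between start_date and today
) as dag:
    PythonOperator(task_id="say_hello", python_callable=say_hello)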
Working with Docker Compose
Access your VPS via SSH to manage Airflow using Docker Compose:
# Change to the Airflow directory
cd /root/airflow/
# View running containers
docker compose ps
# View Airflow logs
docker compose logs -f
# Stop Airflow
docker compose down
# Start Airflow
docker compose up -d
# Restart Airflow
docker compose restart
# Execute Airflow CLI commands
docker compose run airflow-worker airflow version
docker compose run airflow-worker airflow dags list
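The same information is also available over Airflow's stable REST API, which is convenient from scripts running outside the VPS. The sketch below mirrors airflow dags list; it assumes the API's basic-auth backend is enabled (the default in the official Docker Compose setup) and reuses the admin credentials from the login step.
# list_dags.py -- list DAGs via the REST API (assumes the basic-auth backend is enabled)
import base64
import json
import urllib.request

AIRFLOW_URL = "http://YOUR_VPS_IP:8080"  # replace with your VPS IP address
USERNAME = "admin"                       # the web UI login
PASSWORD = "your-password"               # the password set during installation

# Send HTTP basic-auth credentials with the request
token = base64.b64encode(f"{USERNAME}:{PASSWORD}".encode()).decode()
request = urllib.request.Request(
    f"{AIRFLOW_URL}/api/v1/dags",
    headers={"Authorization": f"Basic {token}"},
)

with urllib.request.urlopen(request, timeout=10) as response:
    data = json.load(response)

for dag in data.get("dags", []):
    print(dag["dag_id"], "(paused)" if dag["is_paused"] else "(active)")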
Updating Airflow
Keeping Airflow up to date ensures you get the latest features and fixes. Run the following commands over SSH to update Airflow:
# Change directory
cd /root/airflow/
# Pull latest version
docker compose pull
# Stop and remove the existing containers
docker compose down
# Start the updated containers
docker compose up -d
Hostinger's "Ubuntu 24.04 with Airflow" VPS templates let you quickly set up and manage workflow orchestration solutions for data pipelines and automation tasks.
Official website: https://airflow.apache.org/
Official documentation: https://airflow.apache.org/docs/