Installation¶
Getting Airflow¶
The easiest way to install the latest stable version of Airflow is with pip:
pip install apache-airflow
You can also install Airflow with support for extra features like gcp or postgres:
pip install 'apache-airflow[postgres,gcp]'
Extra Packages¶
The apache-airflow PyPI basic package only installs what’s needed to get started.
Subpackages can be installed depending on what will be useful in your
environment. For instance, if you don’t need connectivity with Postgres,
you won’t have to go through the trouble of installing the postgres-devel
yum package, or whatever equivalent applies on the distribution you are using.
Behind the scenes, Airflow does conditional imports of operators that require these extra dependencies.
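For illustration, here is a minimal sketch of that pattern in Python; the module path and fallback are simplified stand-ins, not Airflow's exact code:

try:
    # Succeeds only if the 'postgres' extra (and thus psycopg2) is installed.
    from airflow.hooks.postgres_hook import PostgresHook
except ImportError:
    # Without the extra, the hook is simply unavailable; the rest of
    # Airflow keeps working.
    PostgresHook = None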
Here’s the list of the subpackages and what they enable:
| subpackage | install command | enables |
|---|---|---|
| all | pip install 'apache-airflow[all]' | All Airflow features known to man |
| all_dbs | pip install 'apache-airflow[all_dbs]' | All database integrations |
| atlas | pip install 'apache-airflow[atlas]' | Apache Atlas to use Data Lineage feature |
| async | pip install 'apache-airflow[async]' | Async worker classes for Gunicorn |
| aws | pip install 'apache-airflow[aws]' | Amazon Web Services |
| azure | pip install 'apache-airflow[azure]' | Microsoft Azure |
| cassandra | pip install 'apache-airflow[cassandra]' | Cassandra-related operators & hooks |
| celery | pip install 'apache-airflow[celery]' | CeleryExecutor |
| cgroups | pip install 'apache-airflow[cgroups]' | Needed to use CgroupTaskRunner |
| cloudant | pip install 'apache-airflow[cloudant]' | Cloudant hook |
| crypto | pip install 'apache-airflow[crypto]' | Encrypt connection passwords in metadata db |
| dask | pip install 'apache-airflow[dask]' | DaskExecutor |
| databricks | pip install 'apache-airflow[databricks]' | Databricks hooks and operators |
| datadog | pip install 'apache-airflow[datadog]' | Datadog hooks and sensors |
| devel | pip install 'apache-airflow[devel]' | Minimum dev tools requirements |
| devel_hadoop | pip install 'apache-airflow[devel_hadoop]' | Airflow + dependencies on the Hadoop stack |
| doc | pip install 'apache-airflow[doc]' | Packages needed to build docs |
| docker | pip install 'apache-airflow[docker]' | Docker hooks and operators |
| druid | pip install 'apache-airflow[druid]' | Druid-related operators & hooks |
| elasticsearch | pip install 'apache-airflow[elasticsearch]' | Elastic Log Handler |
| gcp | pip install 'apache-airflow[gcp]' | Google Cloud Platform |
| github_enterprise | pip install 'apache-airflow[github_enterprise]' | GitHub Enterprise auth backend |
| google_auth | pip install 'apache-airflow[google_auth]' | Google auth backend |
| grpc | pip install 'apache-airflow[grpc]' | gRPC hooks and operators |
| hdfs | pip install 'apache-airflow[hdfs]' | HDFS hooks and operators |
| hive | pip install 'apache-airflow[hive]' | All Hive-related operators |
| jdbc | pip install 'apache-airflow[jdbc]' | JDBC hooks and operators |
| jira | pip install 'apache-airflow[jira]' | Jira hooks and operators |
| kerberos | pip install 'apache-airflow[kerberos]' | Kerberos integration for Kerberized Hadoop |
| kubernetes | pip install 'apache-airflow[kubernetes]' | Kubernetes Executor and operator |
| ldap | pip install 'apache-airflow[ldap]' | LDAP authentication for users |
| mongo | pip install 'apache-airflow[mongo]' | Mongo hooks and operators |
| mssql | pip install 'apache-airflow[mssql]' | Microsoft SQL Server operators and hook, support as an Airflow backend |
| mysql | pip install 'apache-airflow[mysql]' | MySQL operators and hook, support as an Airflow backend. The version of MySQL server has to be 5.6.4+. The exact version upper bound depends on the version of the mysqlclient package. |
| oracle | pip install 'apache-airflow[oracle]' | Oracle hooks and operators |
| papermill | pip install 'apache-airflow[papermill]' | Papermill hooks and operators |
| password | pip install 'apache-airflow[password]' | Password authentication for users |
| pinot | pip install 'apache-airflow[pinot]' | Pinot DB hook |
| postgres | pip install 'apache-airflow[postgres]' | PostgreSQL operators and hook, support as an Airflow backend |
| qds | pip install 'apache-airflow[qds]' | Enable QDS (Qubole Data Service) support |
| rabbitmq | pip install 'apache-airflow[rabbitmq]' | RabbitMQ support as a Celery backend |
| redis | pip install 'apache-airflow[redis]' | Redis hooks and sensors |
| salesforce | pip install 'apache-airflow[salesforce]' | Salesforce hook |
| samba | pip install 'apache-airflow[samba]' | Hive2SambaOperator |
| sendgrid | pip install 'apache-airflow[sendgrid]' | Send email using SendGrid |
| segment | pip install 'apache-airflow[segment]' | Segment hooks and sensors |
| slack | pip install 'apache-airflow[slack]' | SlackAPIOperator |
| snowflake | pip install 'apache-airflow[snowflake]' | Snowflake hooks and operators |
| ssh | pip install 'apache-airflow[ssh]' | SSH hooks and operator |
| statsd | pip install 'apache-airflow[statsd]' | Needed for StatsD metrics |
| vertica | pip install 'apache-airflow[vertica]' | Vertica hook support as an Airflow backend |
| webhdfs | pip install 'apache-airflow[webhdfs]' | HDFS hooks and operators |
| winrm | pip install 'apache-airflow[winrm]' | WinRM hooks and operators |
Initializing the Airflow Database¶
Airflow requires a database to be initialized before you can run tasks. If you’re just experimenting and learning Airflow, you can stick with the default SQLite option. If you don’t want to use SQLite, take a look at Initializing a Database Backend to set up a different database.
After configuration, you’ll need to initialize the database before you can run tasks:
airflow db init
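If you configure a backend other than SQLite, point Airflow at it before running the command. A sketch, assuming a local PostgreSQL database named airflow with placeholder credentials:

export AIRFLOW__CORE__SQL_ALCHEMY_CONN='postgresql+psycopg2://airflow:airflow@localhost:5432/airflow'
airflow db init

The AIRFLOW__CORE__SQL_ALCHEMY_CONN environment variable overrides the sql_alchemy_conn setting in airflow.cfg.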