Docs(airflow): Add information for Mac users about installing the Airflow facade (#3662)
erindru authored Jan 20, 2025
1 parent 42b07a0 commit 5d1fe65
Showing 1 changed file with 13 additions and 0 deletions.
13 changes: 13 additions & 0 deletions docs/cloud/features/scheduler/airflow.md
@@ -20,6 +20,18 @@ Make sure to include the `[airflow]` extra in the installation command:
```
$ pip install tobiko-cloud-scheduler-facade[airflow]
```

!!! info "Mac Users"

    On macOS, you may see the following error:

    `zsh: no matches found: tobiko-cloud-scheduler-facade[airflow]`

    This happens because `zsh` treats the square brackets as a glob pattern. In that case, quote the argument to `pip install` like so:

    ```
    $ pip install 'tobiko-cloud-scheduler-facade[airflow]'
    ```

### Connect Airflow to Tobiko Cloud

Next, add an Airflow [connection](https://airflow.apache.org/docs/apache-airflow/stable/howto/connection.html#creating-a-connection-with-the-ui) containing your Tobiko Cloud credentials.
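
As a minimal sketch of one way to create such a connection outside the UI: Airflow can pick up connections from `AIRFLOW_CONN_<CONN_ID>` environment variables, and a `Connection` object can generate the URI to export. The connection id (`tobiko_cloud`), connection type, host, and credential field below are assumptions for illustration only, not the facade's documented requirements.

```
# Illustrative sketch only: the connection id, type, and fields are assumptions;
# use the values required by your Tobiko Cloud account and the facade's documentation.
from airflow.models.connection import Connection

conn = Connection(
    conn_id="tobiko_cloud",                # hypothetical connection id
    conn_type="http",                      # hypothetical connection type
    host="https://cloud.tobikodata.com",   # hypothetical Tobiko Cloud URL
    password="<your Tobiko Cloud token>",  # placeholder credential
)

# Airflow reads connections from AIRFLOW_CONN_<CONN_ID> environment variables,
# so the printed URI can be exported as AIRFLOW_CONN_TOBIKO_CLOUD on the Airflow hosts.
print(conn.get_uri())
```
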
@@ -126,6 +138,7 @@ The local DAG represents your SQLMesh project's models and their activity in Tobiko Cloud.
The DAG is composed of SQLMesh models, but there must be a boundary around those models to separate them from your broader Airflow pipeline. The boundary consists of two tasks that serve as entry and exit nodes for the entire Tobiko Cloud run.

The first and last tasks in the DAG are the boundary tasks, and they are the same in every local DAG instance (see the sketch after this list):

- First task: `Sensor` task that synchronizes with Tobiko Cloud
- Last task: `DummyOperator` task that ensures all models without downstream dependencies have completed before the DAG run is marked complete
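
To make that shape concrete, here is a purely illustrative Airflow DAG with the same boundary pattern: a sensor as the entry node, placeholder model tasks in the middle, and a final no-op task as the exit node. This is not the DAG the facade generates; the dag id, task ids, the stand-in `DateTimeSensor`, and the use of `EmptyOperator` (the newer name for `DummyOperator`) are all assumptions made for the sketch.

```
# Illustrative only: NOT the DAG generated by the Tobiko Cloud facade,
# just a minimal Airflow DAG with the boundary shape described above.
from datetime import datetime

from airflow import DAG
from airflow.operators.empty import EmptyOperator     # renamed from DummyOperator in Airflow 2.3+
from airflow.sensors.date_time import DateTimeSensor  # stand-in for the Tobiko Cloud sync sensor

with DAG(
    dag_id="boundary_pattern_sketch",  # hypothetical dag id
    start_date=datetime(2025, 1, 1),
    schedule=None,                     # assumes Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    # Entry boundary: in the real facade DAG, a sensor synchronizes with Tobiko Cloud.
    entry = DateTimeSensor(task_id="wait_for_tobiko_cloud", target_time="{{ data_interval_end }}")

    # Placeholder tasks standing in for the SQLMesh models.
    model_a = EmptyOperator(task_id="model_a")
    model_b = EmptyOperator(task_id="model_b")

    # Exit boundary: ensures every model without downstream dependencies has finished
    # before the DAG run is declared complete.
    done = EmptyOperator(task_id="all_models_complete")

    entry >> [model_a, model_b] >> done
```
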
