**Congratulations!** You have just completed the full installation of your Anomaly Detection System.
## Getting started with your first Anomaly Detection DAG
**W.I.P.: should this section be moved to another README?**
Now that Airflow is up and running, we can test the Anomaly Detection System and its algorithms on a demo scenario.
Follow these steps:
1. Open the File Browser at http://localhost:5003/ and log in (username = admin, pass = admin). Navigate to the folder **/airflow-compose/dags** and open the file **config_variables.py**. There you have to uncomment the deploy section (the reason for this toggle is sketched just after these steps):
```
# DEPLOY
SYSTEM_FOLDER = "..."
DATALAKE_FOLDER = "..."
TMP_CONFIG = "..."
IMAGE_NAME = "..."
```
and comment out the development section:
```
# DEVELOPMENT
# SYSTEM_FOLDER = "..."
# DATALAKE_FOLDER = "..."
# TMP_CONFIG = "..."
# IMAGE_NAME = "..."
```
1. Open the Airflow UI: http://localhost:8080/
1. Search for the DAG named **dag_ad_demo** and click on its name.
1. Click on the *graph view* tab to see the interconnections between the different tasks.
1. Click on the **on/off switch** next to the header *DAG: dag_ad_demo*.
**Congratulations!** You just started your first Anomaly Detection pipeline. Check its successful termination via the *graph view*: when all the boxes are dark green, the pipeline has completed.
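Why the comment toggling in step 1 matters: **config_variables.py** defines plain module-level constants that the DAG files read when Airflow parses them, so exactly one of the two sections must be active at a time. Below is a minimal sketch of how a DAG module might consume these values; this is hypothetical illustration code, not the actual dag_ad_demo source.
```
# Hypothetical sketch: how a DAG module could consume config_variables.py.
# The file lives in /airflow-compose/dags next to the DAG files, so it can
# be imported directly when Airflow parses the DAGs.
from config_variables import DATALAKE_FOLDER, IMAGE_NAME, SYSTEM_FOLDER, TMP_CONFIG


def describe_deployment():
    """Summarize the currently active (deploy or development) settings."""
    return (
        f"system={SYSTEM_FOLDER} datalake={DATALAKE_FOLDER} "
        f"config={TMP_CONFIG} image={IMAGE_NAME}"
    )
```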
## Additional Documentation
The Anomaly Detection System driven by Airflow can be started in different ways:
1. using Docker Compose on a given VM (our implemented choice!)
1. using Docker Swarm (requires a Swarm cluster that is already up)
1. using Kubernetes (W.I.P.)
Details are in the following paragraphs.
### Docker Compose
This is the standard method. The instructions are in this same README.md.
### Docker Swarm
This is still W.I.P.
To start the system on a Docker Swarm, look into the scripts in the [docker-swarm](./docker-swarm) folder.
### Kubernetes
This is still W.I.P.
Documentation can be found at:
- https://kubernetes.io/blog/2018/06/28/airflow-on-kubernetes-part-1-a-different-kind-of-operator/
- https://airflow.apache.org/docs/stable/kubernetes.html
> **_NOTE:_** The file browser is used to create new Airflow DAGs (Directed Acyclic Graphs) and to modify the configuration files. Access it at http://localhost:5003/ with username = admin, pass = admin.
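A new DAG is simply a Python file dropped into **/airflow-compose/dags**; Airflow picks it up automatically at the next scheduler parse. Below is a minimal sketch of such a file, assuming the classic (pre-2.0) Airflow import paths; all names are hypothetical and unrelated to dag_ad_demo.
```
# Hypothetical minimal DAG: save as /airflow-compose/dags/dag_ad_hello.py and
# Airflow will discover it automatically. All names here are illustrative.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash_operator import BashOperator

with DAG(
    dag_id="dag_ad_hello",
    start_date=datetime(2021, 1, 1),
    schedule_interval=None,  # no schedule: trigger manually from the UI
    catchup=False,
) as dag:
    extract = BashOperator(task_id="extract", bash_command="echo 'fetch metrics'")
    detect = BashOperator(task_id="detect", bash_command="echo 'run anomaly detection'")
    extract >> detect  # detect runs only after extract succeeds
```
Once the file is saved through the File Browser, the new DAG should appear in the Airflow UI at http://localhost:8080/ within a couple of minutes, ready to be switched on like dag_ad_demo.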