diff --git a/README.md b/README.md
index 7878b976ba458b8ad55793a79b862021a3e6dc8c..a9439558c075683b8af39b341b7b46802ed0250e 100644
--- a/README.md
+++ b/README.md
@@ -4,22 +4,6 @@ Docker image for a Dashboard ETL worker - contains software performing NXCALS ex
 
 # How to use
 
-The entrypoint triggers a mvn call.
-Mount your python script folder as a volume to /work and start up the docker image to run py-spark
-You can pass arguments directly on the command line :
-```bash
-    docker run -ti --rm -v `pwd`:/work gitlab-registry.cern.ch/industrial-controls/services/dash/worker:latest my-script.py
-```
-You can also mount `/opt/nxcals-spark/work` as a persistent volume if you wish to collect the output of your build.
-
-You can run a bash session in the container too :
-
-```bash
-   docker run -it -e KPRINCIPAL=bcopy -v ~/nxcals.keytab:/auth/private.keytab -v `pwd`/scripts:/opt/nxcals-spark/work:z etlworker bash
-
-
-# How to use
-
 
-* Generate a keytab with :
+* Generate a keytab with:
 ```
@@ -27,18 +11,17 @@ You can run a bash session in the container too :
 ```
-* Provide Influxdb connectivity env variables
+* Provide InfluxDB connectivity environment variables (see the example after the run command below)
 * Provide parameters to your extraction script
-* Run :a
+* Run:
 
 ```bash
-docker run -e KPRINCIPAL=$USER -v `pwd`/nxcals.keytab:/auth/private.keytab -v `pwd`/myscript.py:/opt/nxcals-spark/work/script.py etlworker
+docker run --net=host -e KPRINCIPAL=$USER -v `pwd`/nxcals.keytab:/auth/private.keytab -v `pwd`/myscript.py:/opt/nxcals-spark/work/script.py etlworker
 ``` 
-
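+
+For illustration, the InfluxDB connectivity settings could be passed as `-e` flags on the same run command. The variable names and host below (`INFLUXDB_HOST`, `INFLUXDB_PORT`, `INFLUXDB_DB`, `influxdb.example.cern.ch`) are placeholders, not names this image is known to expect; substitute whatever your extraction script reads:
+
+```bash
+# Placeholder InfluxDB variable names and host; adapt them to your extraction script
+docker run --net=host \
+  -e KPRINCIPAL=$USER \
+  -e INFLUXDB_HOST=influxdb.example.cern.ch \
+  -e INFLUXDB_PORT=8086 \
+  -e INFLUXDB_DB=dashboards \
+  -v `pwd`/nxcals.keytab:/auth/private.keytab \
+  -v `pwd`/myscript.py:/opt/nxcals-spark/work/script.py \
+  etlworker
+```
+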
-You can also mount `/opt/nxcals-spark/work` as a persistent volume if you wish to collect the output of your build.
+You can also mount `/opt/nxcals-spark/work` as a persistent volume if you wish to collect the output of your job.
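+
+For example, a named Docker volume could be mounted there so the contents of `/opt/nxcals-spark/work` survive the container (the volume name `etl-output` is an arbitrary choice for this sketch):
+
+```bash
+# "etl-output" is an arbitrary volume name chosen for this example
+docker volume create etl-output
+docker run --net=host -e KPRINCIPAL=$USER \
+  -v `pwd`/nxcals.keytab:/auth/private.keytab \
+  -v etl-output:/opt/nxcals-spark/work \
+  etlworker
+```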
 
-You can run a bash session in the container too :
+You can also run a bash session in the container:
 
 ```bash
-   docker run -it -e KPRINCIPAL=$USER -v ~/nxcals.keytab:/auth/private.keytab -v `pwd`/scripts:/opt/nxcals-spark/work:z etlworker bash
+docker run -it --net=host -e KPRINCIPAL=$USER -v ~/nxcals.keytab:/auth/private.keytab -v `pwd`/scripts:/opt/nxcals-spark/work:z etlworker bash
 ```
 
 # Development-related instructions