Commit d1e2cf93 authored by Domenico Giordano, committed by Stiven Metaj

Giordano stiv notebook patch 10430

parent b9dc8982
@@ -16,12 +16,12 @@
"id": "challenging-amazon",
"metadata": {},
"source": [
"**Demo Notebook for Running Cloud Analytics AD System**\n",
"# Demo Notebook for TimeSeries Anomaly Detection",
"\n",
"With this notebook, we want to show to the users that want to try the Cloud Analytics Anomaly Detection system how to use the tools that we provide in our repository.\n",
"This notebook shows the Anomaly Detection system for TimeSeries in action using the tools provided in this repository.\n",
"\n",
"**REMEMBER TO ACTIVATE THE ANALYTIX CLUSTER IN THE SWAN CONFIGURATION!!**\n",
"Activating it you will be able to activate spark (you must click the spark icon in the upper part of the notebook to enable spark before starting to run the notebook cells!)."
"Activating it means to activate spark (click the spark icon in the upper part of the notebook to enable spark before starting to run the notebook cells!)."
]
},
{
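To check that the Analytix connection is actually in place before running the rest of the notebook, a minimal sketch (it assumes SWAN has injected a `spark` SparkSession after you clicked the Spark icon):

```python
# Minimal sanity check, assuming SWAN injects a `spark` SparkSession
# once the Spark connector to the Analytix cluster is enabled.
try:
    print(spark.version)       # version of the attached cluster
    spark.range(5).show()      # trivial job to confirm the connection works
except NameError:
    print("No `spark` session found: click the Spark icon in the toolbar first.")
```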
@@ -120,7 +120,7 @@
"\n",
"Note that we need 2 config files: \n",
"- A config file about the training part.\n",
"- A config file about the data used by the trained model to inference the scores. "
"- A config file about the data used by the trained model to infer the scores. "
]
},
{
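As a rough illustration of this split, a hedged sketch of the two JSON files (all keys and paths below are hypothetical placeholders, not the repository's actual schema):

```python
import json

# Hypothetical sketch of the two configuration files; every key name and
# path here is an illustrative placeholder, not the actual schema.
train_cfg = {
    "date_start": "2021-03-01",
    "date_end": "2021-03-07",
    "hostgroups": ["cloud/batch"],
    "aggregate_every_n_minutes": 10,
}
inference_cfg = {
    "date_start": "2021-03-08",
    "date_end": "2021-03-08",
    "model_path": "models/demo_model",   # produced by the training step
}

with open("train.json", "w") as f:
    json.dump(train_cfg, f, indent=2)
with open("inference.json", "w") as f:
    json.dump(inference_cfg, f, indent=2)
```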
@@ -133,7 +133,7 @@
"- json_data: containing all the main information\n",
"- json_data_train, json_data_inference: containing specific paths for the 2 different purposes\n",
"\n",
"**Note also that you have to be sure that you have the writing rights for all the paths contained here, in particular *HDFS_folder_with_write_rights* should link to your hdfs personal folder to ensure that :)**"
"**Note also that you have to be sure that you have the writing rights for all the paths contained here, in particular *HDFS_folder_with_write_rights* should point to your hdfs personal folder to ensure that**"
]
},
{
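Before launching the pipeline it can save time to verify the write permissions up front; a hedged sketch (the HDFS URI below is a placeholder for your own area, and it assumes the `spark` session from earlier):

```python
# Hedged sketch: confirm you can write to the folder you put in
# HDFS_folder_with_write_rights (the path below is a placeholder).
test_path = "hdfs://analytix/user/<your_username>/ad_demo_write_test"
try:
    spark.range(1).write.mode("overwrite").parquet(test_path)
    print(f"Write OK: {test_path}")
except Exception as exc:
    print(f"Cannot write to {test_path}: {exc}")
```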
@@ -430,7 +430,7 @@
"\n",
"**Note that we use the click python library (it permits to use python functions through the command line but it has a default behavior that calls exit(0) at the end of them). To avoid the exit(0) problem just use the standalone_mode=False option.**\n",
"\n",
"More over, we will skip the first 2 steps of the pipeline that we show in the image above (data_presence and check_normalization) because they will always fail the first time and they are not needed for learning purposes in this first moment."
"Moreover, we will skip the first 2 steps of the pipeline that we show in the image above (data_presence and check_normalization) because they are used in production pipelines to check and avoid the re-processing of already processed time intervals. For the purpose of this example, we will force the reprocessing if data are available"
]
},
{
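To make the click note concrete, a self-contained sketch (the command and option names are invented for illustration, not the repository's real entry points):

```python
import click

@click.command()
@click.option("--step", default="training", help="Pipeline step to run.")
def run_step(step):
    """Toy stand-in for one of the pipeline's click commands."""
    print(f"Running step: {step}")

# Invoked normally, click raises SystemExit (exit code 0) when the function
# returns; standalone_mode=False disables that so the notebook keeps running.
run_step(["--step", "training"], standalone_mode=False)
```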
@@ -1412,8 +1412,8 @@
}
],
"source": [
"# Note that each row of df will be a window of data\n",
"# (48 points, with an aggregation of 10 minutes -> 8 hours).\n",
"# Note that each row of df will be a window of data following the granularity definition in the configuration file. \n",
"# In this example it consists of 48 points representing 8 hours of time interval with an aggregation of 10 minutes.\n",
"# So for every day, for every hostname, we will have 3 rows (3 x 8hours = 24),\n",
"# each row will be composed by timestamp, hostname, hostgroup, ts and\n",
"# for each plugin we will have 48 columns, 1 for each 10 minutes.\n",
@@ -1794,7 +1794,7 @@
"metadata": {},
"source": [
"## Creation\n",
"For the analysis part we need another json configuration file in input! This time in the file we save:\n",
"For the analysis part we need another json configuration file in input. This time in the file we save:\n",
"- information about the window size we want to study\n",
"- the algorithms we will use to elaborate scores\n",
"- for each algo the hyperparameters and some useful metadata\n",