From a8ec85f4855632600df7f456a05356edb95a4c55 Mon Sep 17 00:00:00 2001 From: Felix Hennig Date: Mon, 14 Aug 2023 14:42:32 +0200 Subject: [PATCH 01/21] added airflow --- docs/modules/ROOT/pages/demos/airflow-scheduled-job.adoc | 9 +++++++++ 1 file changed, 9 insertions(+) diff --git a/docs/modules/ROOT/pages/demos/airflow-scheduled-job.adoc b/docs/modules/ROOT/pages/demos/airflow-scheduled-job.adoc index 5c90d526..9522fbc2 100644 --- a/docs/modules/ROOT/pages/demos/airflow-scheduled-job.adoc +++ b/docs/modules/ROOT/pages/demos/airflow-scheduled-job.adoc @@ -22,6 +22,15 @@ You can see the deployed products as well as their relationship in the following image::demo-airflow-scheduled-job/overview.png[] +== System requirements + +To run this demo, your system needs at least: + +* 2.5 https://kubernetes.io/docs/tasks/debug/debug-cluster/resource-metrics-pipeline/#cpu[cpu units] (core/hyperthread) +* 9GiB memory +* 24GiB disk storage + + == List deployed Stackable services To list the installed Stackable services run the following command: From 0a8ad1ffda49e078bad56bda33ad60d552f541b3 Mon Sep 17 00:00:00 2001 From: Felix Hennig Date: Mon, 14 Aug 2023 15:01:50 +0200 Subject: [PATCH 02/21] ... --- .../ROOT/pages/demos/airflow-scheduled-job.adoc | 11 ++++------- 1 file changed, 4 insertions(+), 7 deletions(-) diff --git a/docs/modules/ROOT/pages/demos/airflow-scheduled-job.adoc b/docs/modules/ROOT/pages/demos/airflow-scheduled-job.adoc index 9522fbc2..cc18a84e 100644 --- a/docs/modules/ROOT/pages/demos/airflow-scheduled-job.adoc +++ b/docs/modules/ROOT/pages/demos/airflow-scheduled-job.adoc @@ -1,12 +1,5 @@ = airflow-scheduled-job -[NOTE] -==== -This guide assumes that you already have the demo `airflow-scheduled-job` installed. -If you don't have it installed please follow the xref:commands/demo.adoc#_install_demo[documentation on how to install a demo]. -To put it simply you have to run `stackablectl demo install airflow-scheduled-job`. 
-==== - This demo will * Install the required Stackable operators @@ -30,6 +23,10 @@ To run this demo, your system needs at least: * 9GiB memory * 24GiB disk storage +== Installation + +Please follow the xref:commands/demo.adoc#_install_demo[documentation on how to install a demo]. +To put it simply you have to run `stackablectl demo install airflow-scheduled-job`. == List deployed Stackable services To list the installed Stackable services run the following command: From deb5b1069ce36da53b3e8f9120a5b9d28b8a104a Mon Sep 17 00:00:00 2001 From: Felix Hennig Date: Mon, 14 Aug 2023 15:14:29 +0200 Subject: [PATCH 03/21] ... --- .../pages/demos/airflow-scheduled-job.adoc | 2 + .../data-lakehouse-iceberg-trino-spark.adoc | 38 ++++++++++--------- 2 files changed, 22 insertions(+), 18 deletions(-) diff --git a/docs/modules/ROOT/pages/demos/airflow-scheduled-job.adoc b/docs/modules/ROOT/pages/demos/airflow-scheduled-job.adoc index cc18a84e..1edfea2e 100644 --- a/docs/modules/ROOT/pages/demos/airflow-scheduled-job.adoc +++ b/docs/modules/ROOT/pages/demos/airflow-scheduled-job.adoc @@ -15,6 +15,7 @@ You can see the deployed products as well as their relationship in the following image::demo-airflow-scheduled-job/overview.png[] +[#system-requirements] == System requirements To run this demo, your system needs at least: @@ -23,6 +24,7 @@ To run this demo, your system needs at least: * 9GiB memory * 24GiB disk storage +[#installation] == Installation Please follow the xref:commands/demo.adoc#_install_demo[documentation on how to install a demo]. 
diff --git a/docs/modules/ROOT/pages/demos/data-lakehouse-iceberg-trino-spark.adoc b/docs/modules/ROOT/pages/demos/data-lakehouse-iceberg-trino-spark.adoc index 27efb1fd..81188f5f 100644 --- a/docs/modules/ROOT/pages/demos/data-lakehouse-iceberg-trino-spark.adoc +++ b/docs/modules/ROOT/pages/demos/data-lakehouse-iceberg-trino-spark.adoc @@ -1,36 +1,20 @@ = data-lakehouse-iceberg-trino-spark -[WARNING] +[IMPORTANT] ==== This demo shows a data workload with real world data volumes and uses significant amount of resources to ensure acceptable response times. It will most likely not run on your workstation. There is also the smaller xref:demos/trino-iceberg.adoc[] demo focusing on the abilities a lakehouse using Apache Iceberg offers. The `trino-iceberg` demo has no streaming data part and can be executed on a local workstation. - -The demo was developed and tested on a kubernetes cluster with 10 nodes (4 cores (8 threads), 20GB RAM and 30GB HDD). -Instance types that loosely correspond to this on the Hyperscalers are: - -- *Google*: `e2-standard-8` -- *Azure*: `Standard_D4_v2` -- *AWS*: `m5.2xlarge` - -In addition to these nodes the operators will request multiple persistent volumes with a total capacity of about 1TB. ==== -[WARNING] +[CAUTION] ==== This demo only runs in the `default` namespace, as a `ServiceAccount` will be created. Additionally, we have to use the fqdn service names (including the namespace), so that the used TLS certificates are valid. ==== -[NOTE] -==== -This guide assumes that you already have the demo `data-lakehouse-iceberg-trino-spark` installed. -If you don't have it installed please follow the xref:commands/demo.adoc#_install_demo[documentation on how to install a demo]. -To put it simply you have to run `stackablectl demo install data-lakehouse-iceberg-trino-spark`. 
-====
-
 This demo will
 
 * Install the required Stackable operators
@@ -53,6 +37,24 @@ You can see the deployed products as well as their relationship in the following
 
 image::demo-data-lakehouse-iceberg-trino-spark/overview.png[]
 
+[#system-requirements]
+== System requirements
+
+The demo was developed and tested on a Kubernetes cluster with 10 nodes (4 cores (8 threads), 20GB RAM and 30GB HDD).
+Instance types that loosely correspond to this on the hyperscalers are:
+
+- *Google*: `e2-standard-8`
+- *Azure*: `Standard_D4_v2`
+- *AWS*: `m5.2xlarge`
+
+In addition to these nodes the operators will request multiple persistent volumes with a total capacity of about 1TB.
+
+[#installation]
+== Installation
+
+Please follow the xref:commands/demo.adoc#_install_demo[documentation on how to install a demo].
+To put it simply you have to run `stackablectl demo install data-lakehouse-iceberg-trino-spark`.
+
 == Apache Iceberg
 
 As Apache Iceberg states on their https://iceberg.apache.org/docs/latest/[website]:

From 4e65d7c54857f93f1a83ef57208eb0b18564ef18 Mon Sep 17 00:00:00 2001
From: Felix Hennig
Date: Mon, 14 Aug 2023 15:27:57 +0200
Subject: [PATCH 04/21] added hbase-hdfs

---
 .../demos/hbase-hdfs-load-cycling-data.adoc | 22 +++++++++++++------
 1 file changed, 15 insertions(+), 7 deletions(-)

diff --git a/docs/modules/ROOT/pages/demos/hbase-hdfs-load-cycling-data.adoc b/docs/modules/ROOT/pages/demos/hbase-hdfs-load-cycling-data.adoc
index ae42eb72..560c05cd 100644
--- a/docs/modules/ROOT/pages/demos/hbase-hdfs-load-cycling-data.adoc
+++ b/docs/modules/ROOT/pages/demos/hbase-hdfs-load-cycling-data.adoc
@@ -1,12 +1,5 @@
 = hbase-hdfs-cycling-data
 
-[NOTE]
-====
-This guide assumes that you already have the demo `hbase-hdfs-load-cycling-data` installed.
-If you don't have it installed please follow the xref:commands/demo.adoc#_install_demo[documentation on how to install a demo].
-To put it simply you have to run `stackablectl demo install hbase-hdfs-load-cycling-data`.
-==== - This demo will * Install the required Stackable operators @@ -22,6 +15,21 @@ You can see the deployed products as well as their relationship in the following image::demo-hbase-hdfs-load-cycling-data/overview.png[] +[#system-requirements] +== System requirements + +To run this demo, your system needs at least: + +* 3 https://kubernetes.io/docs/tasks/debug/debug-cluster/resource-metrics-pipeline/#cpu[cpu units] (core/hyperthread) +* 6GiB memory +* 16GiB disk storage + +[#installation] +== Installation + +Please follow the xref:commands/demo.adoc#_install_demo[documentation on how to install a demo]. +To put it simply you have to run `stackablectl demo install hbase-hdfs-load-cycling-data`. + == List deployed Stackable services To list the installed Stackable services run the following command: `stackablectl services list --all-namespaces` From fcd32eea2338fba05f8d7ec0873f9d0b1dc34265 Mon Sep 17 00:00:00 2001 From: Felix Hennig Date: Mon, 14 Aug 2023 15:30:42 +0200 Subject: [PATCH 05/21] trino-taxi-data --- .../ROOT/pages/demos/trino-taxi-data.adoc | 22 +++++++++++++------ 1 file changed, 15 insertions(+), 7 deletions(-) diff --git a/docs/modules/ROOT/pages/demos/trino-taxi-data.adoc b/docs/modules/ROOT/pages/demos/trino-taxi-data.adoc index dc8ba10b..04ed6671 100644 --- a/docs/modules/ROOT/pages/demos/trino-taxi-data.adoc +++ b/docs/modules/ROOT/pages/demos/trino-taxi-data.adoc @@ -1,12 +1,5 @@ = trino-taxi-data -[NOTE] -==== -This guide assumes that you already have the demo `trino-taxi-data` installed. -If you don't have it installed please follow the xref:commands/demo.adoc#_install_demo[documentation on how to install a demo]. -To put it simply you have to run `stackablectl demo install trino-taxi-data`. 
-====
-
 This demo will
 
 * Install the required Stackable operators
@@ -24,6 +17,21 @@ You can see the deployed products as well as their relationship in the following
 
 image::demo-trino-taxi-data/overview.png[]
 
+[#system-requirements]
+== System requirements
+
+To run this demo, your system needs at least:
+
+* 6.8 https://kubernetes.io/docs/tasks/debug/debug-cluster/resource-metrics-pipeline/#cpu[cpu units] (core/hyperthread)
+* 16GiB memory
+* 28GiB disk storage
+
+[#installation]
+== Installation
+
+Please follow the xref:commands/demo.adoc#_install_demo[documentation on how to install a demo].
+To put it simply you have to run `stackablectl demo install trino-taxi-data`.
+
 == List deployed Stackable services
 
 To list the installed Stackable services run the following command:

From baaaf4de021ef9497d9c296f6022c7b975d235e2 Mon Sep 17 00:00:00 2001
From: Felix Hennig
Date: Mon, 14 Aug 2023 15:32:33 +0200
Subject: [PATCH 06/21] added trino-iceberg

---
 .../ROOT/pages/demos/trino-iceberg.adoc | 22 +++++++++++++------
 1 file changed, 15 insertions(+), 7 deletions(-)

diff --git a/docs/modules/ROOT/pages/demos/trino-iceberg.adoc b/docs/modules/ROOT/pages/demos/trino-iceberg.adoc
index b6e51924..7ae710cc 100644
--- a/docs/modules/ROOT/pages/demos/trino-iceberg.adoc
+++ b/docs/modules/ROOT/pages/demos/trino-iceberg.adoc
@@ -7,13 +7,6 @@ It focuses on the Trino and Iceberg integration and should run on your local workstation.
 If you are interested in a more complex lakehouse setup, please have a look at the xref:demos/data-lakehouse-iceberg-trino-spark.adoc[] demo.
 ====
 
-[NOTE]
-====
-This guide assumes that you already have the demo `trino-iceberg` installed.
-If you don't have it installed please follow the xref:commands/demo.adoc#_install_demo[documentation on how to install a demo].
-To put it simply you have to run `stackablectl demo install trino-iceberg`.
-====
-
 This demo will
 
 * Install the required Stackable operators
@@ -22,6 +15,21 @@ This demo will
 * Create multiple data lakehouse tables using Apache Iceberg and data from the https://www.tpc.org/tpch/[TPC-H dataset].
 * Run some queries to show the benefits of Iceberg
 
+[#system-requirements]
+== System requirements
+
+To run this demo, your system needs at least:
+
+* 8.55 https://kubernetes.io/docs/tasks/debug/debug-cluster/resource-metrics-pipeline/#cpu[cpu units] (core/hyperthread)
+* 27GiB memory
+* 110GiB disk storage
+
+[#installation]
+== Installation
+
+Please follow the xref:commands/demo.adoc#_install_demo[documentation on how to install a demo].
+To put it simply you have to run `stackablectl demo install trino-iceberg`.
+
 == List deployed Stackable services
 
 To list the installed Stackable services run the following command:

From fdf6fea4b13d0acf370c16a0c1c03e81b552311d Mon Sep 17 00:00:00 2001
From: Felix Hennig
Date: Mon, 14 Aug 2023 15:55:42 +0200
Subject: [PATCH 07/21] added some more

---
 .../nifi-kafka-druid-earthquake-data.adoc | 24 +++++++++++------
 .../nifi-kafka-druid-water-level-data.adoc | 24 +++++++++++------
 ...spark-k8s-anomaly-detection-taxi-data.adoc | 26 +++++++++++--------
 3 files changed, 47 insertions(+), 27 deletions(-)

diff --git a/docs/modules/ROOT/pages/demos/nifi-kafka-druid-earthquake-data.adoc b/docs/modules/ROOT/pages/demos/nifi-kafka-druid-earthquake-data.adoc
index c8ee3f7f..30361224 100644
--- a/docs/modules/ROOT/pages/demos/nifi-kafka-druid-earthquake-data.adoc
+++ b/docs/modules/ROOT/pages/demos/nifi-kafka-druid-earthquake-data.adoc
@@ -1,18 +1,11 @@
 = nifi-kafka-druid-earthquake-data
 
-[WARNING]
+[CAUTION]
 ====
 This demo only runs in the `default` namespace, as a `ServiceAccount` will be created.
 Additionally, we have to use the fqdn service names (including the namespace), so that the used TLS certificates are valid.
==== -[NOTE] -==== -This guide assumes that you already have the demo `nifi-kafka-druid-earthquake-data` installed. -If you don't have it installed please follow the xref:commands/demo.adoc#_install_demo[documentation on how to install a demo]. -To put it simply you have to run `stackablectl demo install nifi-kafka-druid-earthquake-data`. -==== - This demo will * Install the required Stackable operators @@ -32,6 +25,21 @@ You can see the deployed products as well as their relationship in the following image::demo-nifi-kafka-druid-earthquake-data/overview.png[] +[#system-requirements] +== System requirements + +To run this demo, your system needs at least: + +* 8.7 https://kubernetes.io/docs/tasks/debug/debug-cluster/resource-metrics-pipeline/#cpu[cpu units] (core/hyperthread) +* 28GiB memory +* 75GiB disk storage + +[#installation] +== Installation + +Please follow the xref:commands/demo.adoc#_install_demo[documentation on how to install a demo]. +To put it simply you have to run `stackablectl demo install nifi-kafka-druid-earthquake-data`. + == List deployed Stackable services To list the installed Stackable services run the following command: diff --git a/docs/modules/ROOT/pages/demos/nifi-kafka-druid-water-level-data.adoc b/docs/modules/ROOT/pages/demos/nifi-kafka-druid-water-level-data.adoc index 3d344007..a4740a97 100644 --- a/docs/modules/ROOT/pages/demos/nifi-kafka-druid-water-level-data.adoc +++ b/docs/modules/ROOT/pages/demos/nifi-kafka-druid-water-level-data.adoc @@ -1,18 +1,11 @@ = nifi-kafka-druid-water-level-data -[WARNING] +[CAUTION] ==== This demo only runs in the `default` namespace, as a `ServiceAccount` will be created. Additionally, we have to use the fqdn service names (including the namespace), so that the used TLS certificates are valid. ==== -[NOTE] -==== -This guide assumes that you already have the demo `nifi-kafka-druid-water-level-data` installed. 
-If you don't have it installed please follow the xref:commands/demo.adoc#_install_demo[documentation on how to install a demo]. -To put it simply you have to run `stackablectl demo install nifi-kafka-druid-water-level-data`. -==== - This demo will * Install the required Stackable operators @@ -34,6 +27,21 @@ You can see the deployed products as well as their relationship in the following image::demo-nifi-kafka-druid-water-level-data/overview.png[] +[#system-requirements] +== System requirements + +To run this demo, your system needs at least: + +* 8.9 https://kubernetes.io/docs/tasks/debug/debug-cluster/resource-metrics-pipeline/#cpu[cpu units] (core/hyperthread) +* 28GiB memory +* 75GiB disk storage + +[#installation] +== Installation + +Please follow the xref:commands/demo.adoc#_install_demo[documentation on how to install a demo]. +To put it simply you have to run `stackablectl demo install nifi-kafka-druid-water-level-data`. + == List deployed Stackable services To list the installed Stackable services run the following command: diff --git a/docs/modules/ROOT/pages/demos/spark-k8s-anomaly-detection-taxi-data.adoc b/docs/modules/ROOT/pages/demos/spark-k8s-anomaly-detection-taxi-data.adoc index 8de7319a..39b2d0c9 100644 --- a/docs/modules/ROOT/pages/demos/spark-k8s-anomaly-detection-taxi-data.adoc +++ b/docs/modules/ROOT/pages/demos/spark-k8s-anomaly-detection-taxi-data.adoc @@ -1,16 +1,5 @@ = spark-k8s-anomaly-detection-taxi-data -[WARNING] -==== -This demo should not be run alongside other demos and requires a minimum of 32 GB RAM and 8 CPUs. -==== -[NOTE] -==== -This guide assumes you already have the demo `spark-k8s-anomaly-detection-taxi-data` installed. -If you don't have it installed please follow the xref:commands/demo.adoc#_install_demo[documentation on how to install a demo]. -To put it simply you have to run `stackablectl demo install spark-k8s-anomaly-detection-taxi-data`. 
-==== - This demo will * Install the required Stackable operators @@ -29,6 +18,21 @@ You can see the deployed products as well as their relationship in the following image::spark-k8s-anomaly-detection-taxi-data/overview.png[] +[#system-requirements] +== System requirements + +To run this demo, your system needs at least: + +* 8 https://kubernetes.io/docs/tasks/debug/debug-cluster/resource-metrics-pipeline/#cpu[cpu units] (core/hyperthread) +* 32GiB memory +* 35GiB disk storage + +[#installation] +== Installation + +Please follow the xref:commands/demo.adoc#_install_demo[documentation on how to install a demo]. +To put it simply you have to run `stackablectl demo install spark-k8s-anomaly-detection-taxi-data`. + == List deployed Stackable services To list the installed Stackable services run the following command: From 4d2cf0fd98159c46d26d9e31a16e1111338dd9cc Mon Sep 17 00:00:00 2001 From: Felix Hennig Date: Mon, 14 Aug 2023 16:00:01 +0200 Subject: [PATCH 08/21] added logging --- docs/modules/ROOT/pages/demos/logging.adoc | 15 +++++++++++++++ 1 file changed, 15 insertions(+) diff --git a/docs/modules/ROOT/pages/demos/logging.adoc b/docs/modules/ROOT/pages/demos/logging.adoc index 4ffc3f10..f7ad82d6 100644 --- a/docs/modules/ROOT/pages/demos/logging.adoc +++ b/docs/modules/ROOT/pages/demos/logging.adoc @@ -68,6 +68,21 @@ The following command creates a kind cluster and installs this demo: $ stackablectl demo install logging --kind-cluster ---- +[#system-requirements] +== System requirements + +To run this demo, your system needs at least: + +* 6.5 https://kubernetes.io/docs/tasks/debug/debug-cluster/resource-metrics-pipeline/#cpu[cpu units] (core/hyperthread) +* 5GiB memory +* 27GiB disk storage + +[#installation] +== Installation + +Please follow the xref:commands/demo.adoc#_install_demo[documentation on how to install a demo]. +To put it simply you have to run `stackablectl demo install logging`. 
+
 == List deployed Stackable services
 
 To list the installed Stackable services run the following command:

From 9acb3573223c10236aecbaff8fb9dbc6192756c0 Mon Sep 17 00:00:00 2001
From: Felix Hennig
Date: Mon, 14 Aug 2023 16:04:21 +0200
Subject: [PATCH 09/21] ...

---
 ...park-hdfs-anomaly-detection-taxi-data.adoc | 41 ++++++++++---------
 1 file changed, 21 insertions(+), 20 deletions(-)

diff --git a/docs/modules/ROOT/pages/demos/jupyterhub-pyspark-hdfs-anomaly-detection-taxi-data.adoc b/docs/modules/ROOT/pages/demos/jupyterhub-pyspark-hdfs-anomaly-detection-taxi-data.adoc
index 1aef8d27..9c6c6f16 100644
--- a/docs/modules/ROOT/pages/demos/jupyterhub-pyspark-hdfs-anomaly-detection-taxi-data.adoc
+++ b/docs/modules/ROOT/pages/demos/jupyterhub-pyspark-hdfs-anomaly-detection-taxi-data.adoc
@@ -2,30 +2,10 @@
 This demo showcases the integration between https://jupyter.org[Jupyter] and https://hadoop.apache.org/[Apache Hadoop] deployed on the Stackable Data Platform (SDP) Kubernetes cluster. https://jupyterlab.readthedocs.io/en/stable/[JupyterLab] is deployed using the https://github.com/jupyterhub/zero-to-jupyterhub-k8s[pyspark-notebook stack] provided by the Jupyter community. The SDP makes this integration easy by publishing a discovery `ConfigMap` for the HDFS cluster. This `ConfigMap` is then mounted in all `Pods` running https://spark.apache.org/docs/latest/api/python/getting_started/index.html[PySpark] notebooks so that these have access to HDFS data. For this demo, the HDFS cluster is provisioned with a small sample of the https://www.nyc.gov/site/tlc/about/tlc-trip-record-data.page[NYC taxi trip dataset] which is analyzed with a notebook that is provisioned automatically in the JupyterLab interface.
 
-This demo can be installed on most cloud managed Kubernetes clusters as well as on premise or on a reasonably provisioned laptop.
Install this demo on an existing Kubernetes cluster: - -[source,bash] ----- -stackablectl demo install jupyterhub-pyspark-hdfs-anomaly-detection-taxi-data ----- - -[WARNING] -==== -This demo should not be run alongside other demos and requires a minimum of 32 GB RAM and 8 CPUs. -==== - -[NOTE] -==== -Some container images used by this demo are quite large and some steps may take several minutes to complete. If you install this demo locally, on a developer laptop for example, this can lead to timeouts during the installation. If this happens, it's safe to rerun the `stackablectl` command from above. - -For more details on how to install Stackable demos see the xref:commands/demo.adoc#_install_demo[documentation]. -==== - == Aim / Context This demo does not use the Stackable spark-k8s-operator but rather delegates the creation of executor pods to JupyterHub. The intention is to demonstrate how to interact with SDP components when designing and testing Spark jobs: the resulting script and Spark job definition can then be transferred for use with a Stackable `SparkApplication` resource. When logging in to JupyterHub (described below), a pod will be created with the username as a suffix e.g. `jupyter-admin`. This runs a container that hosts a Jupyter notebook with Spark, Java and Python pre-installed. When the user creates a `SparkSession`, temporary spark executors are created that are persisted until the notebook kernel is shut down or re-started. The notebook can thus be used as a sandbox for writing, testing and benchmarking Spark jobs before they are moved into production. 
- == Overview This demo will: @@ -39,6 +19,27 @@ This demo will: * Train an anomaly detection model using PySpark on the data available in HDFS * Perform some predictions and visualize anomalies +[#system-requirements] +== System requirements + +To run this demo, your system needs at least: + +* 8 https://kubernetes.io/docs/tasks/debug/debug-cluster/resource-metrics-pipeline/#cpu[cpu units] (core/hyperthread) +* 32GiB memory +* 22GiB disk storage + +[#installation] +== Installation + +Please follow the xref:commands/demo.adoc#_install_demo[documentation on how to install a demo]. +To put it simply you have to run `stackablectl demo install jupyterhub-pyspark-hdfs-anomaly-detection-taxi-data`. + +[NOTE] +==== +Some container images used by this demo are quite large and some steps may take several minutes to complete. If you install this demo locally, on a developer laptop for example, this can lead to timeouts during the installation. If this happens, it's safe to rerun the `stackablectl` command from above. + +For more details on how to install Stackable demos see the xref:commands/demo.adoc#_install_demo[documentation]. +==== == HDFS From c0e169aa84380779c89daf80e73159932f8e05a2 Mon Sep 17 00:00:00 2001 From: Felix Hennig Date: Mon, 14 Aug 2023 16:13:26 +0200 Subject: [PATCH 10/21] ... --- docs/modules/ROOT/pages/demos/logging.adoc | 9 --------- 1 file changed, 9 deletions(-) diff --git a/docs/modules/ROOT/pages/demos/logging.adoc b/docs/modules/ROOT/pages/demos/logging.adoc index f7ad82d6..e7e2a1d4 100644 --- a/docs/modules/ROOT/pages/demos/logging.adoc +++ b/docs/modules/ROOT/pages/demos/logging.adoc @@ -59,15 +59,6 @@ vm.max_map_count=262144 Then run `sudo sysctl --load` to reload. 
-== Run the demo - -The following command creates a kind cluster and installs this demo: - -[source,console] ----- -$ stackablectl demo install logging --kind-cluster ----- - [#system-requirements] == System requirements From 58034feb9465592e94768240202357c324040e09 Mon Sep 17 00:00:00 2001 From: Felix Hennig Date: Tue, 15 Aug 2023 08:58:30 +0200 Subject: [PATCH 11/21] Update docs/modules/ROOT/pages/demos/trino-taxi-data.adoc Co-authored-by: Andrew Kenworthy --- docs/modules/ROOT/pages/demos/trino-taxi-data.adoc | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/modules/ROOT/pages/demos/trino-taxi-data.adoc b/docs/modules/ROOT/pages/demos/trino-taxi-data.adoc index 04ed6671..b39518f8 100644 --- a/docs/modules/ROOT/pages/demos/trino-taxi-data.adoc +++ b/docs/modules/ROOT/pages/demos/trino-taxi-data.adoc @@ -30,7 +30,7 @@ To run this demo, your system needs at least: == Installation Please follow the xref:commands/demo.adoc#_install_demo[documentation on how to install a demo]. -To put it simply you have to run `stackablectl demo install trino-taxi-data`. +To put it simply you just have to run `stackablectl demo install trino-taxi-data`. 
== List deployed Stackable services To list the installed Stackable services run the following command: From 2be122b92a6d2a58b2d88c03b2da28654c997538 Mon Sep 17 00:00:00 2001 From: Felix Hennig Date: Tue, 15 Aug 2023 08:58:40 +0200 Subject: [PATCH 12/21] Update docs/modules/ROOT/pages/demos/airflow-scheduled-job.adoc Co-authored-by: Andrew Kenworthy --- docs/modules/ROOT/pages/demos/airflow-scheduled-job.adoc | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/modules/ROOT/pages/demos/airflow-scheduled-job.adoc b/docs/modules/ROOT/pages/demos/airflow-scheduled-job.adoc index 1edfea2e..bc6fc5a7 100644 --- a/docs/modules/ROOT/pages/demos/airflow-scheduled-job.adoc +++ b/docs/modules/ROOT/pages/demos/airflow-scheduled-job.adoc @@ -28,7 +28,7 @@ To run this demo, your system needs at least: == Installation Please follow the xref:commands/demo.adoc#_install_demo[documentation on how to install a demo]. -To put it simply you have to run `stackablectl demo install airflow-scheduled-job`. +To put it simply you just have to run `stackablectl demo install airflow-scheduled-job`. 
== List deployed Stackable services To list the installed Stackable services run the following command: From 53e7d18ec38fd35b43d937b1062e5c4cf8b36114 Mon Sep 17 00:00:00 2001 From: Felix Hennig Date: Tue, 15 Aug 2023 08:58:47 +0200 Subject: [PATCH 13/21] Update docs/modules/ROOT/pages/demos/data-lakehouse-iceberg-trino-spark.adoc Co-authored-by: Andrew Kenworthy --- .../ROOT/pages/demos/data-lakehouse-iceberg-trino-spark.adoc | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/modules/ROOT/pages/demos/data-lakehouse-iceberg-trino-spark.adoc b/docs/modules/ROOT/pages/demos/data-lakehouse-iceberg-trino-spark.adoc index 81188f5f..c2390cc5 100644 --- a/docs/modules/ROOT/pages/demos/data-lakehouse-iceberg-trino-spark.adoc +++ b/docs/modules/ROOT/pages/demos/data-lakehouse-iceberg-trino-spark.adoc @@ -53,7 +53,7 @@ In addition to these nodes the operators will request multiple persistent volume == Installation Please follow the xref:commands/demo.adoc#_install_demo[documentation on how to install a demo]. -To put it simply you have to run `stackablectl demo install data-lakehouse-iceberg-trino-spark`. +To put it simply you just have to run `stackablectl demo install data-lakehouse-iceberg-trino-spark`. 
== Apache Iceberg As Apache Iceberg states on their https://iceberg.apache.org/docs/latest/[website]: From c30e3378ce346736991d5604402832018b85a0a5 Mon Sep 17 00:00:00 2001 From: Felix Hennig Date: Tue, 15 Aug 2023 08:58:55 +0200 Subject: [PATCH 14/21] Update docs/modules/ROOT/pages/demos/hbase-hdfs-load-cycling-data.adoc Co-authored-by: Andrew Kenworthy --- docs/modules/ROOT/pages/demos/hbase-hdfs-load-cycling-data.adoc | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/modules/ROOT/pages/demos/hbase-hdfs-load-cycling-data.adoc b/docs/modules/ROOT/pages/demos/hbase-hdfs-load-cycling-data.adoc index 560c05cd..4a300ab8 100644 --- a/docs/modules/ROOT/pages/demos/hbase-hdfs-load-cycling-data.adoc +++ b/docs/modules/ROOT/pages/demos/hbase-hdfs-load-cycling-data.adoc @@ -28,7 +28,7 @@ To run this demo, your system needs at least: == Installation Please follow the xref:commands/demo.adoc#_install_demo[documentation on how to install a demo]. -To put it simply you have to run `stackablectl demo install hbase-hdfs-load-cycling-data`. +To put it simply you just have to run `stackablectl demo install hbase-hdfs-load-cycling-data`. 
== List deployed Stackable services To list the installed Stackable services run the following command: From 186347cbfc9cb1239e296466b1d8f7e3175d18ff Mon Sep 17 00:00:00 2001 From: Felix Hennig Date: Tue, 15 Aug 2023 08:59:01 +0200 Subject: [PATCH 15/21] Update docs/modules/ROOT/pages/demos/jupyterhub-pyspark-hdfs-anomaly-detection-taxi-data.adoc Co-authored-by: Andrew Kenworthy --- .../jupyterhub-pyspark-hdfs-anomaly-detection-taxi-data.adoc | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/modules/ROOT/pages/demos/jupyterhub-pyspark-hdfs-anomaly-detection-taxi-data.adoc b/docs/modules/ROOT/pages/demos/jupyterhub-pyspark-hdfs-anomaly-detection-taxi-data.adoc index 9c6c6f16..c157f409 100644 --- a/docs/modules/ROOT/pages/demos/jupyterhub-pyspark-hdfs-anomaly-detection-taxi-data.adoc +++ b/docs/modules/ROOT/pages/demos/jupyterhub-pyspark-hdfs-anomaly-detection-taxi-data.adoc @@ -32,7 +32,7 @@ To run this demo, your system needs at least: == Installation Please follow the xref:commands/demo.adoc#_install_demo[documentation on how to install a demo]. -To put it simply you have to run `stackablectl demo install jupyterhub-pyspark-hdfs-anomaly-detection-taxi-data`. +To put it simply you just have to run `stackablectl demo install jupyterhub-pyspark-hdfs-anomaly-detection-taxi-data`. 
[NOTE]
====

From 8baf2ec36162936c18ee0d22abb4e5fa348651cb Mon Sep 17 00:00:00 2001
From: Felix Hennig
Date: Tue, 15 Aug 2023 08:59:08 +0200
Subject: [PATCH 16/21] Update docs/modules/ROOT/pages/demos/logging.adoc

Co-authored-by: Andrew Kenworthy
---
 docs/modules/ROOT/pages/demos/logging.adoc | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/modules/ROOT/pages/demos/logging.adoc b/docs/modules/ROOT/pages/demos/logging.adoc
index e7e2a1d4..e68aa67b 100644
--- a/docs/modules/ROOT/pages/demos/logging.adoc
+++ b/docs/modules/ROOT/pages/demos/logging.adoc
@@ -72,7 +72,7 @@ To run this demo, your system needs at least:
 
 == Installation
 
 Please follow the xref:commands/demo.adoc#_install_demo[documentation on how to install a demo].
-To put it simply you have to run `stackablectl demo install logging`.
+To put it simply you just have to run `stackablectl demo install logging`.
 
 == List deployed Stackable services

From eda430679c6756ad9ef0ddce2e56db5ea98d5184 Mon Sep 17 00:00:00 2001
From: Felix Hennig
Date: Tue, 15 Aug 2023 08:59:15 +0200
Subject: [PATCH 17/21] Update docs/modules/ROOT/pages/demos/nifi-kafka-druid-earthquake-data.adoc

Co-authored-by: Andrew Kenworthy
---
 .../ROOT/pages/demos/nifi-kafka-druid-earthquake-data.adoc | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/modules/ROOT/pages/demos/nifi-kafka-druid-earthquake-data.adoc b/docs/modules/ROOT/pages/demos/nifi-kafka-druid-earthquake-data.adoc
index 30361224..51aad5c6 100644
--- a/docs/modules/ROOT/pages/demos/nifi-kafka-druid-earthquake-data.adoc
+++ b/docs/modules/ROOT/pages/demos/nifi-kafka-druid-earthquake-data.adoc
@@ -38,7 +38,7 @@ To run this demo, your system needs at least:
 == Installation
 
 Please follow the xref:commands/demo.adoc#_install_demo[documentation on how to install a demo].
-To put it simply you have to run `stackablectl demo install nifi-kafka-druid-earthquake-data`.
+To put it simply you just have to run `stackablectl demo install nifi-kafka-druid-earthquake-data`.
 
 == List deployed Stackable services
 To list the installed Stackable services run the following command:

From 86d804b6ca752a80d6ca6f82ef6a27d36e841623 Mon Sep 17 00:00:00 2001
From: Felix Hennig
Date: Tue, 15 Aug 2023 08:59:23 +0200
Subject: [PATCH 18/21] Update docs/modules/ROOT/pages/demos/nifi-kafka-druid-water-level-data.adoc

Co-authored-by: Andrew Kenworthy
---
 .../ROOT/pages/demos/nifi-kafka-druid-water-level-data.adoc | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/modules/ROOT/pages/demos/nifi-kafka-druid-water-level-data.adoc b/docs/modules/ROOT/pages/demos/nifi-kafka-druid-water-level-data.adoc
index a4740a97..25589fd7 100644
--- a/docs/modules/ROOT/pages/demos/nifi-kafka-druid-water-level-data.adoc
+++ b/docs/modules/ROOT/pages/demos/nifi-kafka-druid-water-level-data.adoc
@@ -40,7 +40,7 @@ To run this demo, your system needs at least:
 == Installation
 
 Please follow the xref:commands/demo.adoc#_install_demo[documentation on how to install a demo].
-To put it simply you have to run `stackablectl demo install nifi-kafka-druid-water-level-data`.
+To put it simply you just have to run `stackablectl demo install nifi-kafka-druid-water-level-data`.
 
 == List deployed Stackable services
 To list the installed Stackable services run the following command:

From d7618601a22401e5375864be4bec4fc3c4f3ed64 Mon Sep 17 00:00:00 2001
From: Felix Hennig
Date: Tue, 15 Aug 2023 09:07:51 +0200
Subject: [PATCH 19/21] Update docs/modules/ROOT/pages/demos/spark-k8s-anomaly-detection-taxi-data.adoc

Co-authored-by: Andrew Kenworthy
---
 .../ROOT/pages/demos/spark-k8s-anomaly-detection-taxi-data.adoc | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/modules/ROOT/pages/demos/spark-k8s-anomaly-detection-taxi-data.adoc b/docs/modules/ROOT/pages/demos/spark-k8s-anomaly-detection-taxi-data.adoc
index 39b2d0c9..c08bd2b9 100644
--- a/docs/modules/ROOT/pages/demos/spark-k8s-anomaly-detection-taxi-data.adoc
+++ b/docs/modules/ROOT/pages/demos/spark-k8s-anomaly-detection-taxi-data.adoc
@@ -31,7 +31,7 @@ To run this demo, your system needs at least:
 == Installation
 
 Please follow the xref:commands/demo.adoc#_install_demo[documentation on how to install a demo].
-To put it simply you have to run `stackablectl demo install spark-k8s-anomaly-detection-taxi-data`.
+To put it simply you just have to run `stackablectl demo install spark-k8s-anomaly-detection-taxi-data`.
 
 == List deployed Stackable services
 To list the installed Stackable services run the following command:

From 754fc61b46ac809dec5d40c3a1e1de61120a6d4d Mon Sep 17 00:00:00 2001
From: Felix Hennig
Date: Tue, 15 Aug 2023 09:08:00 +0200
Subject: [PATCH 20/21] Update docs/modules/ROOT/pages/demos/trino-iceberg.adoc

Co-authored-by: Andrew Kenworthy
---
 docs/modules/ROOT/pages/demos/trino-iceberg.adoc | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/modules/ROOT/pages/demos/trino-iceberg.adoc b/docs/modules/ROOT/pages/demos/trino-iceberg.adoc
index 7ae710cc..63065f1b 100644
--- a/docs/modules/ROOT/pages/demos/trino-iceberg.adoc
+++ b/docs/modules/ROOT/pages/demos/trino-iceberg.adoc
@@ -28,7 +28,7 @@ To run this demo, your system needs at least:
 == Installation
 
 Please follow the xref:commands/demo.adoc#_install_demo[documentation on how to install a demo].
-To put it simply you have to run `stackablectl demo install trino-iceberg`.
+To put it simply you just have to run `stackablectl demo install trino-iceberg`.
 
 == List deployed Stackable services
 To list the installed installed Stackable services run the following command:

From 3edb6d70e945f9fb02d5b9b4ecc77a2ffc67f4bd Mon Sep 17 00:00:00 2001
From: Felix Hennig
Date: Tue, 15 Aug 2023 09:23:36 +0200
Subject: [PATCH 21/21] rounded up CPU numbers

---
 .../ROOT/pages/demos/nifi-kafka-druid-earthquake-data.adoc  | 2 +-
 .../ROOT/pages/demos/nifi-kafka-druid-water-level-data.adoc | 2 +-
 docs/modules/ROOT/pages/demos/trino-iceberg.adoc            | 2 +-
 docs/modules/ROOT/pages/demos/trino-taxi-data.adoc          | 2 +-
 4 files changed, 4 insertions(+), 4 deletions(-)

diff --git a/docs/modules/ROOT/pages/demos/nifi-kafka-druid-earthquake-data.adoc b/docs/modules/ROOT/pages/demos/nifi-kafka-druid-earthquake-data.adoc
index 51aad5c6..0a9d5d33 100644
--- a/docs/modules/ROOT/pages/demos/nifi-kafka-druid-earthquake-data.adoc
+++ b/docs/modules/ROOT/pages/demos/nifi-kafka-druid-earthquake-data.adoc
@@ -30,7 +30,7 @@ image::demo-nifi-kafka-druid-earthquake-data/overview.png[]
 
 To run this demo, your system needs at least:
 
-* 8.7 https://kubernetes.io/docs/tasks/debug/debug-cluster/resource-metrics-pipeline/#cpu[cpu units] (core/hyperthread)
+* 9 https://kubernetes.io/docs/tasks/debug/debug-cluster/resource-metrics-pipeline/#cpu[cpu units] (core/hyperthread)
 * 28GiB memory
 * 75GiB disk storage

diff --git a/docs/modules/ROOT/pages/demos/nifi-kafka-druid-water-level-data.adoc b/docs/modules/ROOT/pages/demos/nifi-kafka-druid-water-level-data.adoc
index 25589fd7..6ca2c38e 100644
--- a/docs/modules/ROOT/pages/demos/nifi-kafka-druid-water-level-data.adoc
+++ b/docs/modules/ROOT/pages/demos/nifi-kafka-druid-water-level-data.adoc
@@ -32,7 +32,7 @@ image::demo-nifi-kafka-druid-water-level-data/overview.png[]
 
 To run this demo, your system needs at least:
 
-* 8.9 https://kubernetes.io/docs/tasks/debug/debug-cluster/resource-metrics-pipeline/#cpu[cpu units] (core/hyperthread)
+* 9 https://kubernetes.io/docs/tasks/debug/debug-cluster/resource-metrics-pipeline/#cpu[cpu units] (core/hyperthread)
 * 28GiB memory
 * 75GiB disk storage

diff --git a/docs/modules/ROOT/pages/demos/trino-iceberg.adoc b/docs/modules/ROOT/pages/demos/trino-iceberg.adoc
index 63065f1b..d517fd10 100644
--- a/docs/modules/ROOT/pages/demos/trino-iceberg.adoc
+++ b/docs/modules/ROOT/pages/demos/trino-iceberg.adoc
@@ -20,7 +20,7 @@ This demo will
 
 To run this demo, your system needs at least:
 
-* 8.55 https://kubernetes.io/docs/tasks/debug/debug-cluster/resource-metrics-pipeline/#cpu[cpu units] (core/hyperthread)
+* 9 https://kubernetes.io/docs/tasks/debug/debug-cluster/resource-metrics-pipeline/#cpu[cpu units] (core/hyperthread)
 * 27GiB memory
 * 110GiB disk storage

diff --git a/docs/modules/ROOT/pages/demos/trino-taxi-data.adoc b/docs/modules/ROOT/pages/demos/trino-taxi-data.adoc
index b39518f8..c65bfc8e 100644
--- a/docs/modules/ROOT/pages/demos/trino-taxi-data.adoc
+++ b/docs/modules/ROOT/pages/demos/trino-taxi-data.adoc
@@ -22,7 +22,7 @@ image::demo-trino-taxi-data/overview.png[]
 
 To run this demo, your system needs at least:
 
-* 6.8 https://kubernetes.io/docs/tasks/debug/debug-cluster/resource-metrics-pipeline/#cpu[cpu units] (core/hyperthread)
+* 7 https://kubernetes.io/docs/tasks/debug/debug-cluster/resource-metrics-pipeline/#cpu[cpu units] (core/hyperthread)
 * 16GiB memory
 * 28GiB disk storage
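The final commit, [PATCH 21/21] "rounded up CPU numbers", replaces each fractional cpu-unit requirement with the next whole unit. A minimal sketch of that rounding, assuming a simple ceiling over the pre-patch values shown in the diffs (the `awk` loop is illustrative only and not part of the patch series):

```shell
# Illustrative only: apply a ceiling to the fractional cpu-unit figures
# from the requirement lists. 8.7, 8.9, 8.55 and 6.8 are the pre-patch
# values visible in the [PATCH 21/21] diffs.
for cpu in 8.7 8.9 8.55 6.8; do
  awk -v v="$cpu" 'BEGIN { printf "%s -> %d\n", v, (v == int(v)) ? v : int(v) + 1 }'
done
# -> 8.7 -> 9, 8.9 -> 9, 8.55 -> 9, 6.8 -> 7
```

The outputs match the rounded values the patch writes into the four demo pages (9, 9, 9 and 7 cpu units).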