From cdb0e66b0caa573aea4b9b584e53f9aacc76309b Mon Sep 17 00:00:00 2001
From: francis_tanyx
Date: Wed, 2 Apr 2025 13:13:53 +0800
Subject: [PATCH 1/2] del: removed markdown files for migration to docs

---
 src/common-service/README.md  | 147 ----------------------------------
 src/pipeline-server/readme.md |  65 ---------------
 2 files changed, 212 deletions(-)
 delete mode 100644 src/common-service/README.md
 delete mode 100644 src/pipeline-server/readme.md

diff --git a/src/common-service/README.md b/src/common-service/README.md
deleted file mode 100644
index cbc3b979..00000000
--- a/src/common-service/README.md
+++ /dev/null
@@ -1,147 +0,0 @@
# Common-Service: LiDAR & Weight Sensor Microservice

This microservice manages **Barcode, LiDAR, and Weight sensors** in a single container. It publishes sensor data over **MQTT**, **Kafka**, or **HTTP** (or any combination), controlled entirely by environment variables.

## 1. Overview

- **Sensors**
  - Barcode, LiDAR, and Weight support in the same codebase.
  - Per-sensor configuration (e.g., ID, port, mock mode, publish intervals).
- **Publishing**
  - `publisher.py` handles publishing to one or more protocols:
    - **MQTT**
    - **Kafka**
    - **HTTP**
- **Apps**
  - Three main modules:
    - `barcode_app.py`
    - `lidar_app.py`
    - `weight_app.py`
  - Each uses shared methods from `publisher.py` and `config.py`.

## 2. Environment Variables

All settings are defined in `docker-compose.yml` under the `asc_common_service` section. Key variables include:

### LiDAR

| Variable | Description | Example |
| --- | --- | --- |
| LIDAR_COUNT | Number of LiDAR sensors | 2 |
| LIDAR_SENSOR_ID_1 | Unique ID for the first LiDAR sensor | lidar-001 |
| LIDAR_SENSOR_ID_2 | Unique ID for the second LiDAR sensor (if any) | lidar-002 |
| LIDAR_MOCK_1 | Enable mock data for the first LiDAR sensor (true/false) | true |
| LIDAR_MQTT_ENABLE | Toggle MQTT publishing | true |
| LIDAR_MQTT_BROKER_HOST | MQTT broker host | mqtt-broker or mqtt-broker_1 |
| LIDAR_MQTT_BROKER_PORT | MQTT broker port | 1883 |
| LIDAR_KAFKA_ENABLE | Toggle Kafka publishing | true |
| KAFKA_BOOTSTRAP_SERVERS | Kafka bootstrap server addresses | kafka:9093 |
| LIDAR_KAFKA_TOPIC | Kafka topic name for LiDAR data | lidar-data |
| LIDAR_HTTP_ENABLE | Toggle HTTP publishing | true |
| LIDAR_HTTP_URL | HTTP endpoint URL for LiDAR data | http://localhost:5000/api/lidar_data |
| LIDAR_PUBLISH_INTERVAL | Interval (in seconds) between LiDAR publishes | 1.0 |
| LIDAR_LOG_LEVEL | Logging level (DEBUG, INFO, etc.) | INFO |

### Weight

| Variable | Description | Example |
| --- | --- | --- |
| WEIGHT_COUNT | Number of Weight sensors | 2 |
| WEIGHT_SENSOR_ID_1 | Unique ID for the first Weight sensor | weight-001 |
| WEIGHT_SENSOR_ID_2 | Unique ID for the second Weight sensor (if any) | weight-002 |
| WEIGHT_MOCK_1 | Enable mock data for the first Weight sensor (true/false) | true |
| WEIGHT_MQTT_ENABLE | Toggle MQTT publishing | true |
| WEIGHT_MQTT_BROKER_HOST | MQTT broker host | mqtt-broker_1 |
| WEIGHT_MQTT_BROKER_PORT | MQTT broker port | 1883 |
| WEIGHT_MQTT_TOPIC | MQTT topic name for Weight data | weight/data |
| WEIGHT_KAFKA_ENABLE | Toggle Kafka publishing | false |
| WEIGHT_HTTP_ENABLE | Toggle HTTP publishing | false |
| WEIGHT_PUBLISH_INTERVAL | Interval (in seconds) between Weight publishes | 1.0 |
| WEIGHT_LOG_LEVEL | Logging level (DEBUG, INFO, etc.) | INFO |

### Barcode

| Variable | Description | Example |
| --- | --- | --- |
| BARCODE_COUNT | Number of Barcode sensors | 2 |
| BARCODE_SENSOR_ID_1 | Unique ID for the first Barcode sensor | barcode-001 |
| BARCODE_SENSOR_ID_2 | Unique ID for the second Barcode sensor (if any) | barcode-002 |
| BARCODE_MOCK_1 | Enable mock data for the first Barcode sensor (true/false) | true |
| BARCODE_MQTT_ENABLE | Toggle MQTT publishing | true |
| BARCODE_MQTT_BROKER_HOST | MQTT broker host | mqtt-broker_1 |
| BARCODE_MQTT_BROKER_PORT | MQTT broker port | 1883 |
| BARCODE_MQTT_TOPIC | MQTT topic name for Barcode data | barcode/data |
| BARCODE_KAFKA_ENABLE | Toggle Kafka publishing | false |
| BARCODE_HTTP_ENABLE | Toggle HTTP publishing | false |
| BARCODE_PUBLISH_INTERVAL | Interval (in seconds) between Barcode publishes | 1.0 |
| BARCODE_LOG_LEVEL | Logging level (DEBUG, INFO, etc.) | INFO |

> **Note:** Set each `*_ENABLE` variable to `"true"` or `"false"` to enable or disable that protocol. Adjust intervals, logging levels, or sensor counts as needed.
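The numbered-variable pattern above (`LIDAR_SENSOR_ID_1`, `LIDAR_MOCK_1`, and so on) configures each sensor independently. As a rough illustration of how such variables can be folded into per-sensor settings, here is a sketch in the spirit of `config.py`; the function name and defaults are hypothetical, not the module's actual API:

```python
import os


def load_sensor_configs(prefix: str) -> list:
    """Assemble one settings dict per sensor from <PREFIX>_COUNT plus
    numbered variables such as <PREFIX>_SENSOR_ID_<n> and <PREFIX>_MOCK_<n>.

    Hypothetical sketch in the spirit of config.py, not its real API.
    """
    env = os.environ
    count = int(env.get(f"{prefix}_COUNT", "0"))
    configs = []
    for n in range(1, count + 1):
        configs.append({
            "sensor_id": env.get(f"{prefix}_SENSOR_ID_{n}", f"{prefix.lower()}-{n:03d}"),
            "mock": env.get(f"{prefix}_MOCK_{n}", "false").lower() == "true",
            "publish_interval": float(env.get(f"{prefix}_PUBLISH_INTERVAL", "1.0")),
            "log_level": env.get(f"{prefix}_LOG_LEVEL", "INFO"),
        })
    return configs


if __name__ == "__main__":
    os.environ.setdefault("LIDAR_COUNT", "2")
    print(load_sensor_configs("LIDAR"))
```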
## 3. Usage

1. **Build and Run**

```bash
make build-sensors
make run-sensors
```

   This spins up the `asc_common_service` container (and related services such as Mosquitto or Kafka, depending on your configuration).

2. **Data Flow**

   - By default, LiDAR publishes to `lidar/data` (MQTT, if enabled), `lidar-data` (Kafka), or an HTTP endpoint if configured.
   - The Weight sensor similarly publishes to `weight/data` or `weight-data`.

3. **Mock Mode**

   - Setting `LIDAR_MOCK_1="true"` (or `WEIGHT_MOCK_1="true"`) forces the sensor to generate **random** data rather than reading from actual hardware.

## 4. Testing

### A. MQTT

- **Grafana**: A pre-loaded dashboard named *Retail Analytics Dashboard* is available at [http://localhost:3000](http://localhost:3000/) (default credentials `admin`/`admin`).
- Check that the MQTT data source in Grafana points to `tcp://mqtt-broker_1:1883` (or `tcp://mqtt-broker:1883`, depending on the network).

### B. Kafka

- Enable Kafka for LiDAR/Weight by setting `LIDAR_KAFKA_ENABLE="true"` and/or `WEIGHT_KAFKA_ENABLE="true"`.
- Test from inside the container:

```bash
docker exec asc_common_service python kafka_publisher_test.py --topic lidar-data
```

You should see incoming messages in the console.

### C. HTTP

1. **Local Test (Inside Docker)**

   - Set `LIDAR_HTTP_URL="http://localhost:5000/api/lidar_data"` in the environment.
   - Run `make run-sensors` and wait for all containers to start.
   - Once up, execute:

```bash
docker exec asc_common_service python http_publisher_test.py
```

   - This triggers the HTTP publisher and displays the received data inside the container.

2. **Using an External Webhook Service**

   - Visit [Webhook.site](https://webhook.site/) and copy a unique URL.
   - Set `LIDAR_HTTP_URL` to this URL.
   - Run `make run-sensors`; you should see the HTTP requests arriving on the Webhook.site dashboard.

## 5. Contributing & Development

- **Code Structure**
  - `publisher.py`: Core publishing logic (MQTT, Kafka, HTTP); see the sketch below.
  - `config.py`: Loads environment variables and configures each sensor.
  - `barcode_app.py`, `lidar_app.py`, and `weight_app.py`: Sensor-specific logic.
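To make the multi-protocol fan-out concrete, here is a minimal sketch of the pattern `publisher.py` implements. It is an illustration, not the module's actual code: the function name and env-var defaults are assumptions, and it expects the `paho-mqtt`, `kafka-python`, and `requests` packages.

```python
import json
import os

import paho.mqtt.publish as mqtt_publish
import requests
from kafka import KafkaProducer


def publish_reading(reading: dict) -> None:
    """Send one sensor reading to every protocol enabled via env vars.

    Illustrative sketch of the publisher.py pattern, not its real API.
    """
    payload = json.dumps(reading)

    if os.environ.get("LIDAR_MQTT_ENABLE", "false").lower() == "true":
        # One-shot publish helper; avoids managing a persistent client here.
        mqtt_publish.single(
            "lidar/data",
            payload,
            hostname=os.environ.get("LIDAR_MQTT_BROKER_HOST", "mqtt-broker"),
            port=int(os.environ.get("LIDAR_MQTT_BROKER_PORT", "1883")),
        )

    if os.environ.get("LIDAR_KAFKA_ENABLE", "false").lower() == "true":
        producer = KafkaProducer(
            bootstrap_servers=os.environ.get("KAFKA_BOOTSTRAP_SERVERS", "kafka:9093"))
        producer.send(
            os.environ.get("LIDAR_KAFKA_TOPIC", "lidar-data"),
            payload.encode("utf-8"))
        producer.flush()

    if os.environ.get("LIDAR_HTTP_ENABLE", "false").lower() == "true":
        requests.post(os.environ["LIDAR_HTTP_URL"], json=reading, timeout=5)


if __name__ == "__main__":
    publish_reading({"sensor_id": "lidar-001", "distance_m": 1.23})
```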
\ No newline at end of file
diff --git a/src/pipeline-server/readme.md b/src/pipeline-server/readme.md
deleted file mode 100644
index 338f3ebd..00000000
--- a/src/pipeline-server/readme.md
+++ /dev/null
@@ -1,65 +0,0 @@
* Prepare models

  Use the model downloader [available here](https://github.com/dlstreamer/pipeline-server/tree/main/tools/model_downloader) to download new models. Point `MODEL_DIR` to the directory containing the new models. The following section assumes that the new models are available under `$(pwd)/models`.
  ```bash
  $ export MODEL_DIR=$(pwd)/models
  ```
  For example, a YOLOv5s detection model appears under `MODEL_DIR` as:
  ```
  object_detection/yolov5s/yolov5s.json
  object_detection/yolov5s/FP32
  ```

* Prepare pipelines

  Use [these docs](https://github.com/dlstreamer/pipeline-server/blob/main/docs/defining_pipelines.md) to get started with defining new pipelines. Once the new pipelines have been defined, point `PIPELINE_DIR` to the directory containing the new pipelines. The following section assumes that the new pipelines are available under `$(pwd)/pipelines`.
  ```bash
  $ export PIPELINE_DIR=$(pwd)/pipelines
  ```

* Run the image with the new models and pipelines mounted into the container
  ```bash
  $ docker run -itd \
    --privileged \
    --device=/dev:/dev \
    --device-cgroup-rule='c 189:* rmw' \
    --device-cgroup-rule='c 209:* rmw' \
    --group-add 109 \
    --name evam \
    -p 8080:8080 \
    -p 8554:8554 \
    -e ENABLE_RTSP=true \
    -e RTSP_PORT=8554 \
    -e ENABLE_WEBRTC=true \
    -e WEBRTC_SIGNALING_SERVER=ws://localhost:8443 \
    -e RUN_MODE=EVA \
    -e DETECTION_DEVICE=CPU \
    -e CLASSIFICATION_DEVICE=CPU \
    -v $MODEL_DIR:/home/pipeline-server/models \
    -v $PIPELINE_DIR:/home/pipeline-server/pipelines \
    dlstreamer:dev
  ```

## Starting pipelines

* Pipelines are triggered through the *pipeline server's* REST endpoints. Here is an example cURL command; the output is available as an RTSP stream at `rtsp://<host>:8554/pipeline-server`. A Python equivalent follows below.
  ```bash
  $ curl localhost:8080/pipelines/object_detection/yolov5 -X POST -H \
  'Content-Type: application/json' -d \
  '{
    "source": {
        "uri": "rtsp://192.168.1.141:8555/camera_0",
        "type": "uri"
    },
    "destination": {
        "metadata": {
            "type": "file",
            "path": "/tmp/results.jsonl",
            "format": "json-lines"
        },
        "frame": {
            "type": "rtsp",
            "path": "pipeline-server"
        }
    },
    "parameters": {
        "detection-device": "CPU",
        "network": "FP16-INT8"
    }
  }'
  ```
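If you prefer to script pipeline launches, the same request can be sent with Python's `requests` package. This is a sketch: the URL and body simply mirror the cURL example above, not any additional API contract.

```python
import requests

# Mirrors the cURL example above: POST a pipeline request to the
# pipeline server's REST endpoint and print the returned instance id.
PIPELINE_URL = "http://localhost:8080/pipelines/object_detection/yolov5"

request_body = {
    "source": {"uri": "rtsp://192.168.1.141:8555/camera_0", "type": "uri"},
    "destination": {
        "metadata": {"type": "file", "path": "/tmp/results.jsonl", "format": "json-lines"},
        "frame": {"type": "rtsp", "path": "pipeline-server"},
    },
    "parameters": {"detection-device": "CPU", "network": "FP16-INT8"},
}

resp = requests.post(PIPELINE_URL, json=request_body, timeout=10)
resp.raise_for_status()
print("pipeline instance:", resp.text.strip())
```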
\ No newline at end of file

From 27bddb5d9f09b805a3661e591020072f9638220e Mon Sep 17 00:00:00 2001
From: "Yan Xue, Tan (Francis)" <118970371+francis-tanyx@users.noreply.github.com>
Date: Wed, 16 Apr 2025 18:10:06 +0800
Subject: [PATCH 2/2] Delete src/pipeline-server/README.md

---
 src/pipeline-server/README.md | 105 ----------------------------------
 1 file changed, 105 deletions(-)
 delete mode 100644 src/pipeline-server/README.md

diff --git a/src/pipeline-server/README.md b/src/pipeline-server/README.md
deleted file mode 100644
index b8245568..00000000
--- a/src/pipeline-server/README.md
+++ /dev/null
@@ -1,105 +0,0 @@
## Deep Learning Streamer Pipeline Server (EVAM)

1. Clone the repository.

2. Build and run the pipelines from the repository root:

```bash
make run-pipeline-server
```

3. Validate that the Docker containers are running:

```bash
docker ps --format 'table{{.Names}}\t{{.Image}}\t{{.Status}}'
```

Result:

| NAMES | IMAGE | STATUS |
|------------------------------------------|-------------------------------------------------------------|----------------------------------------|
| camera-simulator0 | jrottenberg/ffmpeg:4.1-alpine | Up 5 seconds |
| camera-simulator1 | jrottenberg/ffmpeg:4.1-alpine | Up 5 seconds |
| camera-simulator2 | jrottenberg/ffmpeg:4.1-alpine | Up 5 seconds |
| edge-video-analytics-microservice | intel/edge-video-analytics-microservice:2.3.0 | Up 5 seconds |
| multimodal-data-visualization | intel/multimodal-data-visualization:5.0.0 | Up 5 seconds (health: starting) |
| multimodal-data-visualization-streaming | intel/multimodal-data-visualization-streaming:5.0.0 | Up 5 seconds (health: starting) |
| mqtt-broker | eclipse-mosquitto:2.0.18 | Up 5 seconds |
| pipeline-init | postman/newman | Up 5 seconds |
| webrtc-signaling-server | intel/simple-signaling-server:5.0.0 | Up 5 seconds (health: starting) |
| camera-simulator | aler9/rtsp-simple-server | Up 5 seconds |

4. Open your browser, go to [http://127.0.0.1:3000](http://127.0.0.1:3000), and log in with the following credentials:
   - **Username:** `root`
   - **Password:** `evam123`

   Once logged in, navigate to the **default dashboard** from the homepage.

5. Validate the MQTT inference output:

```bash
mosquitto_sub -v -h localhost -p 1883 -t 'AnalyticsData0'
mosquitto_sub -v -h localhost -p 1883 -t 'AnalyticsData1'
mosquitto_sub -v -h localhost -p 1883 -t 'AnalyticsData2'
```

Result per sub command:

```
AnalyticsData0 {"objects":[{"detection":{"bounding_box":{"x_max":0.3163176067521043,"x_min":0.20249048400491532,"y_max":0.7995593662281202,"y_min":0.12237883070032396},"confidence":0.868196964263916,"label":"bottle","label_id":39},"h":731,"region_id":6199,"roi_type":"bottle","w":219,"x":389,"y":132},{"detection":{"bounding_box":{"x_max":0.7833052431819754,"x_min":0.6710088227893136,"y_max":0.810283140877349,"y_min":0.1329853767638305},"confidence":0.8499506711959839,"label":"bottle","label_id":39},"h":731,"region_id":6200,"roi_type":"bottle","w":216,"x":1288,"y":144}],"resolution":{"height":1080,"width":1920},"tags":{},"timestamp":67297301635}
AnalyticsData0 {"objects":[{"detection":{"bounding_box":{"x_max":0.3163306922646063,"x_min":0.20249845268772138,"y_max":0.7984013488063937,"y_min":0.12254781445953},"confidence":0.8666459321975708,"label":"bottle","label_id":39},"h":730,"region_id":6201,"roi_type":"bottle","w":219,"x":389,"y":132},{"detection":{"bounding_box":{"x_max":0.7850104587729607,"x_min":0.6687324296210857,"y_max":0.7971464600783804,"y_min":0.13681757042794374},"confidence":0.8462932109832764,"label":"bottle","label_id":39},"h":713,"region_id":6202,"roi_type":"bottle","w":223,"x":1284,"y":148}],"resolution":{"height":1080,"width":1920},"tags":{},"timestamp":67330637174}
```
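If `mosquitto_sub` is not available on the host, a small Python subscriber gives the same check. This is a sketch, not part of the repo: it assumes the `paho-mqtt` package (1.x callback style) and parses the `objects` list shown in the payloads above.

```python
import json

import paho.mqtt.client as mqtt

# The three per-camera analytics topics used above.
TOPICS = ["AnalyticsData0", "AnalyticsData1", "AnalyticsData2"]


def on_connect(client, userdata, flags, rc):
    # Subscribe once the connection to the broker is established.
    for topic in TOPICS:
        client.subscribe(topic)


def on_message(client, userdata, msg):
    # Each payload carries an "objects" list; print one line per detection.
    data = json.loads(msg.payload)
    for obj in data.get("objects", []):
        det = obj["detection"]
        print(f"{msg.topic}: {det['label']} confidence={det['confidence']:.2f}")


client = mqtt.Client()  # paho-mqtt 1.x style; 2.x also expects a CallbackAPIVersion
client.on_connect = on_connect
client.on_message = on_message
client.connect("localhost", 1883)
client.loop_forever()
```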
6. Run the status command script:

```bash
./src/pipeline-server/status.sh
```

```
--------------------- Pipeline Status ---------------------
----------------8080----------------
[
  {
    "avg_fps": 11.862402507697258,
    "avg_pipeline_latency": 0.5888091060475129,
    "elapsed_time": 268.07383918762207,
    "id": "95204aba458211efa9080242ac180006",
    "message": "",
    "start_time": 1721361269.6349292,
    "state": "RUNNING"
  }
]
----------------8081----------------
[
  {
    "avg_fps": 11.481329713987789,
    "avg_pipeline_latency": 0.6092195660469542,
    "elapsed_time": 262.33892583847046,
    "id": "98233952458211efb5090242ac180007",
    "message": "",
    "start_time": 1721361275.3886085,
    "state": "RUNNING"
  }
]
----------------8082----------------
[
  {
    "avg_fps": 11.374176117139063,
    "avg_pipeline_latency": 0.6153032569996222,
    "elapsed_time": 256.985634803772,
    "id": "9b2385a8458211efa46f0242ac180005",
    "message": "",
    "start_time": 1721361280.7602823,
    "state": "RUNNING"
  }
]
```

7. Stop the services:

```bash
make down-pipeline-server
```
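As an aside, `status.sh` (step 6) is reading the pipeline server's REST status endpoint on each instance. If you'd rather poll it programmatically, here is a minimal standard-library sketch; it assumes the three instances from the output above listen on ports 8080 through 8082 and expose a `GET /pipelines/status` endpoint, so adjust both if your build differs.

```python
import json
import urllib.request

# Rough equivalent of status.sh: query each pipeline-server instance's
# status endpoint and print one summary line per running pipeline.
# Ports come from the status.sh output above; the /pipelines/status
# path is an assumption about the server's REST API.
for port in (8080, 8081, 8082):
    url = f"http://localhost:{port}/pipelines/status"
    with urllib.request.urlopen(url, timeout=5) as resp:
        for status in json.load(resp):
            print(f"{port}: {status['id']} {status['state']} "
                  f"avg_fps={status['avg_fps']:.1f}")
```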