
Points created by the Home Assistant Integration don't get reverse geocoded unless triggered manually #1242


Open
noway42 opened this issue May 20, 2025 · 6 comments


@noway42

noway42 commented May 20, 2025

OS & Hardware
Proxmox VM

Version
0.26.3

Describe the bug
Points that were created by the Home Assistant integration (https://github.com/AlbinLind/dawarich-home-assistant) do not get reverse geocoded after they are created. Points created by OwnTracks do, so the problem is not reverse geocoding itself but how it is triggered. If you run "Continue reverse geocoding" in the background jobs settings, all points, including those from the HA integration, get reverse geocoded.

To Reproduce
Steps to reproduce the behavior:

  1. Install Home Assistant integration
  2. Configure a location tracker for it and wait until it sends a data point to Dawarich (this can be confirmed in the log of the dawarich_app container)
  3. Look up the point in the DB and find that reverse geocoding did not start.

Expected behavior
All points, including those from HA, get reverse geocoded after creation.

Screenshots

Logs
No errors can be found.

Additional context
I'm not sure whether this is an issue in Dawarich or in the HA integration, so I will report it there as well.

@AlbinLind

For context, we are using POST on the /api/v1/points endpoint if that helps with debugging.
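For reference, the kind of request involved can be sketched with curl. This is a sketch only: the payload shape is inferred from the raw_data column visible in the dawarich_app log later in this thread, not from API documentation; the host, port, coordinates, and API key are placeholders, and the exact wrapper key the integration sends is unknown (the log shows an unpermitted "point" parameter).

```shell
# Hypothetical payload mirroring the GeoJSON Feature stored in raw_data.
# All values are illustrative placeholders.
PAYLOAD='{
  "type": "Feature",
  "geometry": {"type": "Point", "coordinates": [13.40, 52.52]},
  "properties": {
    "timestamp": "2025-05-21T08:36:04+02:00",
    "altitude": 138.0,
    "speed": 0,
    "horizontal_accuracy": 100,
    "vertical_accuracy": 100,
    "device_id": "Dawarich"
  }
}'

# The request itself (printed rather than executed here, since the host
# and API key are placeholders; port 3033 is the mapping from the compose
# file in this thread):
echo curl -X POST -H "Content-Type: application/json" \
  -d "$PAYLOAD" "http://localhost:3033/api/v1/points?api_key=YOUR_API_KEY"
```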

@Freika
Owner

Freika commented May 20, 2025

Please provide logs and your compose file

@noway42
Author

noway42 commented May 21, 2025

Example from today:

Point created at 8:36 (or 6:36 without the timezone offset):

Log dawarich_app:

D, [2025-05-21T06:36:04.293901 #126] DEBUG -- :   ↳ app/controllers/api_controller.rb:22:in 'ApiController#current_api_user'
D, [2025-05-21T06:36:04.321222 #126] DEBUG -- :   Point Upsert (3.6ms)  INSERT INTO "points" ("lonlat","battery_status","battery","timestamp","altitude","tracker_id","velocity","ssid","accuracy","vertical_accuracy","course_accuracy","course","raw_data","user_id","created_at","updated_at") VALUES ('**sensitive data removed**, NULL, 1747809364, 138, 'Dawarich', '0', 'unknown', 100, 100, 0.0, 0.0, '{"type":"Feature","geometry":{"type":"Point","coordinates":[**sensitive data removed**]},"properties":{"timestamp":"2025-05-21T08:36:04.256291+02:00","altitude":138.0,"speed":0,"horizontal_accuracy":100,"vertical_accuracy":100,"significant_change":"unknown","device_id":"Dawarich","wifi":"unknown","battery_state":"unknown","battery_level":0,"course":0,"course_accuracy":0}}', 1, CURRENT_TIMESTAMP, CURRENT_TIMESTAMP) ON CONFLICT ("lonlat","timestamp","user_id") DO UPDATE SET updated_at=(CASE WHEN ("points"."battery_status" IS NOT DISTINCT FROM excluded."battery_status" AND "points"."battery" IS NOT DISTINCT FROM excluded."battery" AND "points"."altitude" IS NOT DISTINCT FROM excluded."altitude" AND "points"."tracker_id" IS NOT DISTINCT FROM excluded."tracker_id" AND "points"."velocity" IS NOT DISTINCT FROM excluded."velocity" AND "points"."ssid" IS NOT DISTINCT FROM excluded."ssid" AND "points"."accuracy" IS NOT DISTINCT FROM excluded."accuracy" AND "points"."vertical_accuracy" IS NOT DISTINCT FROM excluded."vertical_accuracy" AND "points"."course_accuracy" IS NOT DISTINCT FROM excluded."course_accuracy" AND "points"."course" IS NOT DISTINCT FROM excluded."course" AND "points"."raw_data" IS NOT DISTINCT FROM excluded."raw_data") THEN "points".updated_at ELSE CURRENT_TIMESTAMP 
END),"battery_status"=excluded."battery_status","battery"=excluded."battery","altitude"=excluded."altitude","tracker_id"=excluded."tracker_id","velocity"=excluded."velocity","ssid"=excluded."ssid","accuracy"=excluded."accuracy","vertical_accuracy"=excluded."vertical_accuracy","course_accuracy"=excluded."course_accuracy","course"=excluded."course","raw_data"=excluded."raw_data" RETURNING id, timestamp, ST_X(lonlat::geometry) AS longitude, ST_Y(lonlat::geometry) AS latitude
D, [2025-05-21T06:36:04.321629 #126] DEBUG -- :   ↳ app/services/points/create.rb:21:in 'block in Points::Create#call'
I, [2025-05-21T06:36:04.322915 #126]  INFO -- : {"method":"POST","path":"/api/v1/points","format":"json","controller":"Api::V1::PointsController","action":"create","status":200,"allocations":4117,"duration":31.39,"view":0.16,"db":13.92,"unpermitted_params":["point"]}
D, [2025-05-21T06:36:04.766999 #126] DEBUG -- :   User Load (0.3ms)  SELECT "users".* FROM "users" WHERE "users"."api_key" IS NULL LIMIT $1  [["LIMIT", 1]]
D, [2025-05-21T06:36:04.767535 #126] DEBUG -- :   ↳ app/controllers/api_controller.rb:22:in 'ApiController#current_api_user'
I, [2025-05-21T06:36:04.767836 #126]  INFO -- : {"method":"GET","path":"/api/v1/health","format":"*/*","controller":"Api::V1::HealthController","action":"index","status":200,"allocations":903,"duration":1.94,"view":0.08,"db":0.26}

Log dawarich_sidekiq

I, [2025-05-21T06:19:58.626408 #8]  INFO -- : start
I, [2025-05-21T06:19:58.634666 #8]  INFO -- : Performing Cache::PreheatingJob (Job ID: d54492cc-a261-47ba-8b4f-1337a971331f) from Sidekiq(default) enqueued at 2025-05-21T06:19:58.305203574Z
D, [2025-05-21T06:19:58.637518 #8] DEBUG -- :   User Load (0.4ms)  SELECT "users".* FROM "users" ORDER BY "users"."id" ASC LIMIT $1  [["LIMIT", 1000]]
D, [2025-05-21T06:19:58.638066 #8] DEBUG -- :   ↳ app/jobs/cache/preheating_job.rb:7:in 'Cache::PreheatingJob#perform'
D, [2025-05-21T06:19:58.714705 #8] DEBUG -- :    (73.3ms)          SELECT DISTINCT
          EXTRACT(YEAR FROM TO_TIMESTAMP(timestamp)) AS year,
          TO_CHAR(TO_TIMESTAMP(timestamp), 'Mon') AS month
        FROM points
        WHERE user_id = 1
        ORDER BY year DESC, month ASC
D, [2025-05-21T06:19:58.715741 #8] DEBUG -- :   ↳ app/models/user.rb:92:in 'block in User#years_tracked'
I, [2025-05-21T06:19:58.720497 #8]  INFO -- : Performed Cache::PreheatingJob (Job ID: d54492cc-a261-47ba-8b4f-1337a971331f) from Sidekiq(default) in 85.88ms
I, [2025-05-21T06:19:58.721374 #8]  INFO -- : done
D, [2025-05-21T06:20:00.778727 #8] DEBUG -- : Flushed 4 metrics
I, [2025-05-21T06:36:20.780127 #8]  INFO -- : start
I, [2025-05-21T06:36:20.797723 #8]  INFO -- : Performing Owntracks::PointCreatingJob (Job ID: 83f35c3f-53f4-4188-a32c-631ee0ec388f) from Sidekiq(default) enqueued at 2025-05-21T06:36:20.774681136Z with arguments: {"_type" => "location", "_id" => "592c7c61", "acc" => 4, "alt" => 134, "batt" => 96, "bs" => 1, "cog" => 34, "conn" => "m", "created_at" => 1747809381, "lat" => **sensitive data removed**, "lon" => **sensitive data removed**, "m" => 1, "t" => "p", "tid" => "nx", "topic" => "owntracks/user/**sensitive data removed**", "tst" => 1747806765, "vac" => 1, "vel" => 55, "api_key" => "**sensitive data removed**", "controller" => "api/v1/owntracks/points", "action" => "create", "point" => {"topic" => "owntracks/user/**sensitive data removed**", "created_at" => 1747809381}}, 1
D, [2025-05-21T06:36:20.855975 #8] DEBUG -- :   Point Exists? (2.5ms)  SELECT 1 AS one FROM "points" WHERE "points"."lonlat" = $1 AND "points"."timestamp" = $2 AND "points"."user_id" = $3 LIMIT $4  [["lonlat", "[FILTERED]"], ["timestamp", 1747806765], ["user_id", 1], ["LIMIT", 1]]
D, [2025-05-21T06:36:20.856462 #8] DEBUG -- :   ↳ app/models/concerns/point_validation.rb:11:in 'PointValidation#point_exists?'
I, [2025-05-21T06:36:20.856768 #8]  INFO -- : Performed Owntracks::PointCreatingJob (Job ID: 83f35c3f-53f4-4188-a32c-631ee0ec388f) from Sidekiq(default) in 59.21ms
I, [2025-05-21T06:36:20.857558 #8]  INFO -- : done
D, [2025-05-21T06:36:21.043912 #8] DEBUG -- : Flushed 2 metrics

Compose file:
networks:
  dawarich:
services:
  dawarich_redis:
    image: redis:7.0-alpine
    container_name: dawarich_redis
    command: redis-server
    networks:
      - dawarich
    volumes:
      - /share/data/dawarich/dawarich_shared:/data
    restart: always
    healthcheck:
      test: [ "CMD", "redis-cli", "--raw", "incr", "ping" ]
      interval: 10s
      retries: 5
      start_period: 30s
      timeout: 10s

  dawarich_db:
    image: postgis/postgis:17-3.5-alpine
    shm_size: 1G
    container_name: dawarich_db
    volumes:
    - /share/data/dawarich/dawarich_db_data:/var/lib/postgresql/data
    - /share/data/dawarich/dawarich_shared:/var/shared
    networks:
      - dawarich
    environment:
      POSTGRES_USER: **sensitive data removed**
      POSTGRES_PASSWORD: **sensitive data removed**
    restart: always
    healthcheck:
      test: [ "CMD-SHELL", "pg_isready -U postgres -d dawarich_development" ]
      interval: 10s
      retries: 5
      start_period: 30s
      timeout: 10s

  dawarich_app:
    image: freikin/dawarich:latest
    container_name: dawarich_app
    volumes:
      - /share/data/dawarich/dawarich_public:/var/app/public
      - /share/data/dawarich/dawarich_watched:/var/app/tmp/imports/watched
      - /share/data/dawarich/dawarich_storage:/var/app/storage
    networks:
      - dawarich
    ports:
      - 3033:3000
    stdin_open: true
    tty: true
    entrypoint: web-entrypoint.sh
    command: ['bin/rails', 'server', '-p', '3000', '-b', '::']
    restart: on-failure
    environment:
      RAILS_ENV: development
      REDIS_URL: redis://dawarich_redis:6379/0
      DATABASE_HOST: dawarich_db
      DATABASE_USERNAME: **sensitive data removed**
      DATABASE_PASSWORD: **sensitive data removed**
      DATABASE_NAME: dawarich_development
      MIN_MINUTES_SPENT_IN_CITY: 60
      APPLICATION_HOST: localhost
      APPLICATION_HOSTS: **sensitive data removed**
      TIME_ZONE: Europe/Berlin
      APPLICATION_PROTOCOL: http
      NOMINATIM_API_HOST: nominatim.openstreetmap.org
      NOMINATIM_API_USE_HTTPS: true
      DISTANCE_UNIT: km
      PROMETHEUS_EXPORTER_ENABLED: false
      PROMETHEUS_EXPORTER_HOST: 0.0.0.0
      PROMETHEUS_EXPORTER_PORT: 9394
      ENABLE_TELEMETRY: false # More on telemetry: https://dawarich.app/docs/tutorials/telemetry
      SELF_HOSTED: "true"
      SIDEKIQ_USERNAME: **sensitive data removed**
      SIDEKIQ_PASSWORD: **sensitive data removed**
    logging:
      driver: "json-file"
      options:
        max-size: "100m"
        max-file: "5"
    healthcheck:
      test: [ "CMD-SHELL", "wget -qO - http://127.0.0.1:3000/api/v1/health | grep -q '\"status\"\\s*:\\s*\"ok\"'" ]
      interval: 10s
      retries: 30
      start_period: 30s
      timeout: 10s
    depends_on:
      dawarich_db:
        condition: service_healthy
        restart: true
      dawarich_redis:
        condition: service_healthy
        restart: true
    deploy:
      resources:
        limits:
          cpus: '0.50'    # Limit CPU usage to 50% of one core
          memory: '4G'    # Limit memory usage to 4GB
  dawarich_sidekiq:
    image: freikin/dawarich:latest
    container_name: dawarich_sidekiq
    volumes:
      - /share/data/dawarich/dawarich_public:/var/app/public
      - /share/data/dawarich/dawarich_watched:/var/app/tmp/imports/watched
      - /share/data/dawarich/dawarich_storage:/var/app/storage
    networks:
      - dawarich
    stdin_open: true
    tty: true
    entrypoint: sidekiq-entrypoint.sh
    command: ['sidekiq']
    restart: on-failure
    environment:
      RAILS_ENV: development
      REDIS_URL: redis://dawarich_redis:6379/0
      DATABASE_HOST: dawarich_db
      DATABASE_USERNAME: **sensitive data removed**
      DATABASE_PASSWORD: **sensitive data removed**
      DATABASE_NAME: dawarich_development
      APPLICATION_HOSTS: localhost
      BACKGROUND_PROCESSING_CONCURRENCY: 1
      APPLICATION_PROTOCOL: http
      NOMINATIM_API_HOST: nominatim.openstreetmap.org
      NOMINATIM_API_USE_HTTPS: true
      DISTANCE_UNIT: km
      PROMETHEUS_EXPORTER_ENABLED: false
      PROMETHEUS_EXPORTER_HOST: dawarich_app
      PROMETHEUS_EXPORTER_PORT: 9394
      ENABLE_TELEMETRY: false # More on telemetry: https://dawarich.app/docs/tutorials/telemetry
      SELF_HOSTED: "true"
      SIDEKIQ_USERNAME: **sensitive data removed**
      SIDEKIQ_PASSWORD: **sensitive data removed**
    logging:
      driver: "json-file"
      options:
        max-size: "100m"
        max-file: "5"
    healthcheck:
      test: [ "CMD-SHELL", "ps axu|grep -i [s]idekiq" ]
      interval: 10s
      retries: 30
      start_period: 30s
      timeout: 10s
    depends_on:
      dawarich_db:
        condition: service_healthy
        restart: true
      dawarich_redis:
        condition: service_healthy
        restart: true
      dawarich_app:
        condition: service_healthy
        restart: true
    deploy:
      resources:
        limits:
          cpus: '0.50'    # Limit CPU usage to 50% of one core
          memory: '4G'    # Limit memory usage to 4GB

The data which was processed by Sidekiq at this time seems to be another point, from my Android OwnTracks app, since the location and other data are slightly different.

This is how the DB looks; you can see at first glance which points come from which source:

[Screenshot: points table showing the source of each point]

@noway42
Author

noway42 commented May 23, 2025

Does anyone know how to trigger the "Continue Reverse Geocoding" job via the console? Then I could set up a cron job to trigger it every hour as a workaround.

@Freika
Owner

Freika commented May 24, 2025

Does anyone know how to trigger the "Continue Reverse Geocoding" job via the console? Then I could set up a cron job to trigger it every hour as a workaround.

EnqueueBackgroundJob.perform_later("continue_reverse_geocoding", User.find_by(email: "[email protected]").id)
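Until the trigger is fixed, that console command can be run non-interactively on a schedule with `bin/rails runner` inside the app container. A sketch, assuming the compose setup from this thread (container name dawarich_app); the compose path and the email are placeholders:

```shell
# Hourly crontab entry for the Docker host. /path/to/compose and
# user@example.com are placeholders; -T disables TTY allocation so the
# command works from cron. Printed here rather than installed.
CRON_LINE='0 * * * * cd /path/to/compose && docker compose exec -T dawarich_app bin/rails runner "EnqueueBackgroundJob.perform_later(\"continue_reverse_geocoding\", User.find_by(email: \"user@example.com\").id)"'
echo "$CRON_LINE"
```

Add the printed line via `crontab -e` on the host to enqueue the job at the top of every hour.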

@sbcrumb

sbcrumb commented May 30, 2025

Thanks for this. This explains why I had to manually reverse geocode my points, since I am using HA. Adding an hourly cron job to do it is a good workaround.
