tiles.btrfs.gz download fails and cannot resume due to missing Range support #72

Open
pbov opened this issue Apr 1, 2025 · 0 comments

pbov commented Apr 1, 2025

tiles.btrfs.gz download fails due to instability and missing resume support

When running http-host-static with SKIP_PLANET=false in .env, we repeatedly run into a critical problem with downloading the large tiles.btrfs.gz file (~83 GB) from:

https://btrfs.openfreemap.com/areas/planet/20250326_001001_pt/tiles.btrfs.gz

💥 The real issue

The download frequently aborts mid-transfer, possibly due to Cloudflare timeouts or instability (this also happens on my local computer, which has a very stable internet connection). Here's an example:

[#067e13 18GiB/83GiB(21%) CN:1 DL:23MiB ETA:48m13s]
[#067e13 18GiB/83GiB(21%) CN:1 DL:21MiB ETA:51m57s]

04/01 16:31:38 [ERROR] CUID#7 - Download aborted. URI=https://btrfs.openfreemap.com/areas/planet/20250326_001001_pt/tiles.btrfs.gz
Exception: [AbstractCommand.cc:351] errorCode=8 URI=https://btrfs.openfreemap.com/areas/planet/20250326_001001_pt/tiles.btrfs.gz
  -> [HttpResponse.cc:81] errorCode=8 Invalid range header. Request: 19500875776-89946993411/89946993412, Response: 0-89946993411/89946993412

04/01 16:31:38 [NOTICE] Download GID#067e132e16c88ec2 not complete: /data/ofm/http_host/runs/_tmp/tiles.btrfs.gz

Download Results:
gid   |stat|avg speed  |path/URI
======+====+===========+=======================================================
067e13|ERR |    22MiB/s|/data/ofm/http_host/runs/_tmp/tiles.btrfs.gz

Status Legend:
(ERR):error occurred.

aria2 will resume download if the transfer is restarted.

🔁 Why resuming is not possible

Normally, aria2c or similar tools can resume large downloads using HTTP Range headers. In this case, however, the server ignores the Range header, presumably because Cloudflare serves the file from its cache and refuses partial content delivery.

Here's a manual test with curl:

curl -s -D - -H "Range: bytes=100000000-100000100" \
"https://btrfs.openfreemap.com/areas/planet/20250326_001001_pt/tiles.btrfs.gz" -o /dev/null

Expected: 206 Partial Content
Actual: 200 OK, full file returned – Range ignored.
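
A quicker way to re-check this once anything changes on the Cloudflare side is a header-only probe. The helper below is only a sketch: the function name and the HEAD-based check are my own assumptions, and some servers answer Range differently on HEAD than on GET.

# Hypothetical helper: succeeds only if the URL answers a Range request with 206.
check_range_support() {
  url="$1"
  # --head avoids pulling the 83 GB body; treat this as a smoke test only,
  # since the cache may handle HEAD and GET differently.
  code=$(curl -s -o /dev/null -w '%{http_code}' --head -H "Range: bytes=0-0" "$url")
  [ "$code" = "206" ]
}

check_range_support "https://btrfs.openfreemap.com/areas/planet/20250326_001001_pt/tiles.btrfs.gz" \
  && echo "Range honored (206)" \
  || echo "Range ignored or unsupported"

Right now this prints "Range ignored or unsupported", matching the 200 OK above.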


⚠️ Consequences

  • Every time the download fails, it starts over from scratch (currently the script also does not retry the download; see the retry sketch after this list).
  • Resume is not possible.
  • Parallel segmented download (--split) fails immediately.
  • The deployment script fails with an unrecoverable error.
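
Until Range requests are honored end-to-end, the only client-side mitigation I can think of is wrapping the download in a retry loop. This is just a sketch (paths, file name and retry count are illustrative, not taken from the actual deployment script); note that -c can only resume once the server honors Range, so until then every retry still starts from zero:

#!/usr/bin/env sh
URL="https://btrfs.openfreemap.com/areas/planet/20250326_001001_pt/tiles.btrfs.gz"
DEST_DIR="/data/ofm/http_host/runs/_tmp"   # illustrative, matches the log above
MAX_ATTEMPTS=10

attempt=1
# -x1 -s1 disables segmented download, which fails immediately anyway;
# -c asks aria2 to resume, which only works once Range requests return 206.
until aria2c -c -x1 -s1 -d "$DEST_DIR" -o tiles.btrfs.gz "$URL"; do
  if [ "$attempt" -ge "$MAX_ATTEMPTS" ]; then
    echo "Download failed after $MAX_ATTEMPTS attempts" >&2
    exit 1
  fi
  attempt=$((attempt + 1))
  echo "Download aborted, retrying (attempt $attempt/$MAX_ATTEMPTS)..." >&2
  sleep 10
done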

Maybe this could help: https://community.cloudflare.com/t/range-header-is-ignore-on-large-files/650787
