Remove VoxelType Enum, use ArrayDataType instead for WKW #8559


Open · wants to merge 3 commits into master

Conversation

fm3 (Member) commented Apr 24, 2025

Removes the VoxelType enum and adapts the WKW code to use ArrayDataType directly.

I decided to leave Zarr3DataType, N5DataType, and PrecomputedDataType untouched, as their JSON formats make parsing the respective headers easy.

Steps to test:

  • Load WKW dataset, should still work
  • Download volume annotation as WKW + reupload, should still work

TODOs:

  • Looks like I broke WKW loading; debug
  • How many of the other dtype classes listed in the issue can we avoid?

Issues:


@fm3 fm3 self-assigned this Apr 24, 2025
coderabbitai bot (Contributor) commented Apr 24, 2025

📝 Walkthrough

This change removes the VoxelType abstraction and refactors code to use a unified ArrayDataType for representing voxel data types throughout the codebase. Methods and logic previously dependent on VoxelType are updated to utilize ArrayDataType and explicit channel counts. The WKWHeader class and related serialization/deserialization logic are revised to reflect this, and new mapping methods between ArrayDataType, WKW IDs, and ElementClass are introduced. Some redundant or now-unnecessary methods are removed, and imports are cleaned up to match the new structure.

Changes

  • .../dataformats/wkw/VoxelType.scala — Removed the VoxelType enumeration and all associated utility methods.
  • .../dataformats/wkw/WKWHeader.scala — Refactored to use ArrayDataType and an explicit numChannels instead of VoxelType and numBytesPerVoxel. Updated parsing, serialization, and method signatures accordingly.
  • .../datareaders/ArrayDataType.scala — Renamed bytesPerElementFor to bytesPerElement. Added toWKWId and fromWKWTypeId for mapping between ArrayDataType and WKW IDs.
  • .../datareaders/BytesConverter.scala, .../datareaders/DatasetHeader.scala — Updated usage from bytesPerElementFor to ArrayDataType.bytesPerElement. Cleaned up imports.
  • .../models/datasource/DataLayer.scala — Removed Category.guessFromElementClass. Added ElementClass.toArrayDataTypeAndChannel for mapping to ArrayDataType and channel count.
  • .../tracingstore/tracings/volume/WKWBucketStreamSink.scala — Updated logic to use ElementClass.toArrayDataTypeAndChannel and refactored header construction. Cleaned up imports.

Assessment against linked issues

  • Unify DataType classes by removing redundancy (e.g., VoxelType, N5DataType) (#6484)
  • Use ArrayDataType as the unified type for voxel data and update mapping logic (#6484)
  • Optionally merge wkw ElementClass into ArrayDataType (#6484) — ElementClass is still present; only mapping methods were added, not a full merge.

Poem

In fields of code where types once sprawled,
The rabbits hopped and logic called.
VoxelType is gone, ArrayDataType reigns,
Channel counts explicit, no more pains.
With mappings neat and headers clear,
The codebase hops with less to fear!
🐇✨


@fm3 fm3 changed the title WIP: unify backend dtype classes Remove VoxelType Enum, use ArrayDataType instead for WKW Apr 29, 2025
@fm3 fm3 marked this pull request as ready for review April 29, 2025 13:35
@fm3 fm3 requested a review from frcroth April 29, 2025 13:35
coderabbitai bot (Contributor) left a comment

Actionable comments posted: 2

🧹 Nitpick comments (4)
webknossos-tracingstore/app/com/scalableminds/webknossos/tracingstore/tracings/volume/WKWBucketStreamSink.scala (1)

21-23: Confirm parameter order for the WKWHeader factory

WKWHeader’s companion apply expects
(numChunksPerShardDimension, numVoxelsPerChunkDimension, blockType, dataType, numChannels).
Here we pass

WKWHeader(1, DataLayer.bucketLength, ChunkType.LZ4, dataType, numChannels)

This maps 1 → numChunksPerShardDimension and DataLayer.bucketLength → numVoxelsPerChunkDimension, i.e. we create shards that contain exactly one chunk whose side length equals bucketLength.
That is exactly what the old implementation did, so it is probably intended – but please double-check that:

  1. DataLayer.bucketLength still equals the side length (in voxels) of one chunk, not of the whole shard.
  2. All readers/writers that consume these WKW files rely on numChunksPerShardDimension == 1.
    If we ever decide to pack several chunks per file, this hard-coded 1 will become a silent bug.

If the assumption is correct, consider replacing the magic constant with a named value for self-documentation, e.g.

- val header = WKWHeader(1, DataLayer.bucketLength, ChunkType.LZ4, dataType, numChannels)
+ val NumChunksPerShard = 1
+ val header = WKWHeader(NumChunksPerShard, DataLayer.bucketLength, ChunkType.LZ4, dataType, numChannels)
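The geometry the comment describes can be made concrete with a small sketch. The names and the value bucketLength = 32 are assumptions for illustration (the real constant is DataLayer.bucketLength); this is not the production code.

```scala
// Sketch of the shard/chunk geometry implied by the WKWHeader arguments above.
// With numChunksPerShardDimension == 1, each shard file holds exactly one
// chunk, so the shard side length equals the chunk side length.
object WkwShardGeometrySketch {
  val bucketLength = 32              // assumed chunk side length in voxels
  val numChunksPerShardDimension = 1 // the hard-coded value in the sink

  val shardSideLength: Int = numChunksPerShardDimension * bucketLength
  val voxelsPerChunk: Long =
    bucketLength.toLong * bucketLength * bucketLength // cube of the side length
}
```

If several chunks per shard were ever packed into one file, shardSideLength would no longer equal bucketLength, which is why the reviewer asks to name the constant rather than hard-code 1.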
webknossos-datastore/app/com/scalableminds/webknossos/datastore/dataformats/wkw/WKWHeader.scala (3)

34-39: Channel count now explicit – great, but rename parameters in docs

Switching from the implicit numBytesPerVoxel to the explicit (dataType, numChannels) clears up previous ambiguity – nice!
Minor nit: the scaladoc / error messages inside the file still refer to voxelType in a few places (e.g. line 133).
Renaming those to dataType will prevent confusion for future maintainers.


150-152: Variable name still references removed concept

voxelTypeId and voxelType variables are now holding an ArrayDataType.
Renaming to dataTypeId / dataType would align the implementation with the refactor and avoid cognitive friction.


175-184: Factory apply overload: consider defaulting numChannels to 1

Most call-sites will create single-channel headers. Providing

def apply(numChunksPerShardDimension: Int,
          numVoxelsPerChunkDimension: Int,
          blockType: ChunkType.Value,
          dataType: ArrayDataType,
          numChannels: Int = 1): WKWHeader =

reduces boilerplate and makes the common case explicit.

📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 3e7e5c3 and 3e64cd3.

📒 Files selected for processing (7)
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/dataformats/wkw/VoxelType.scala (0 hunks)
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/dataformats/wkw/WKWHeader.scala (8 hunks)
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/datareaders/ArrayDataType.scala (2 hunks)
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/datareaders/BytesConverter.scala (2 hunks)
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/datareaders/DatasetHeader.scala (2 hunks)
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/models/datasource/DataLayer.scala (1 hunks)
  • webknossos-tracingstore/app/com/scalableminds/webknossos/tracingstore/tracings/volume/WKWBucketStreamSink.scala (2 hunks)
💤 Files with no reviewable changes (1)
  • webknossos-datastore/app/com/scalableminds/webknossos/datastore/dataformats/wkw/VoxelType.scala
🧰 Additional context used
🧬 Code Graph Analysis (3)
webknossos-datastore/app/com/scalableminds/webknossos/datastore/datareaders/BytesConverter.scala (1)
webknossos-datastore/app/com/scalableminds/webknossos/datastore/datareaders/ArrayDataType.scala (2)
  • ArrayDataType (5-78)
  • bytesPerElement (9-23)
webknossos-tracingstore/app/com/scalableminds/webknossos/tracingstore/tracings/volume/WKWBucketStreamSink.scala (3)
webknossos-datastore/app/com/scalableminds/webknossos/datastore/dataformats/wkw/WKWHeader.scala (3)
  • ChunkType (25-29)
  • WKWHeader (31-113)
  • WKWHeader (115-186)
webknossos-datastore/app/com/scalableminds/webknossos/datastore/dataformats/wkw/WKWFile.scala (1)
  • WKWFile (68-115)
webknossos-datastore/app/com/scalableminds/webknossos/datastore/models/datasource/DataLayer.scala (3)
  • DataLayer (283-329)
  • ElementClass (37-195)
  • toArrayDataTypeAndChannel (182-194)
webknossos-datastore/app/com/scalableminds/webknossos/datastore/models/datasource/DataLayer.scala (2)
frontend/javascripts/types/api_flow_types.ts (1)
  • ElementClass (33-44)
webknossos-datastore/app/com/scalableminds/webknossos/datastore/datareaders/ArrayDataType.scala (1)
  • ArrayDataType (5-78)
🔇 Additional comments (8)
webknossos-datastore/app/com/scalableminds/webknossos/datastore/models/datasource/DataLayer.scala (1)

182-194: Looks good: Implementation of toArrayDataTypeAndChannel is well-structured and correct

The new method properly maps each ElementClass value to appropriate ArrayDataType and channel count pairs, with special handling for uint24 which uses 3 channels. This implementation supports the PR's goal of replacing VoxelType with direct ArrayDataType usage.
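The shape of this mapping can be sketched with simplified stand-ins; the ElementClass and ArrayDataType variants below are illustrative subsets of the real enums in DataLayer.scala and ArrayDataType.scala, not the actual definitions.

```scala
// Simplified stand-ins for the real enums; only a few variants are modeled.
object ElementClassSketch {
  sealed trait ArrayDataType
  case object u1 extends ArrayDataType
  case object u2 extends ArrayDataType
  case object f4 extends ArrayDataType

  sealed trait ElementClass
  case object uint8 extends ElementClass
  case object uint16 extends ElementClass
  case object uint24 extends ElementClass
  case object float extends ElementClass

  // Mirrors the idea of ElementClass.toArrayDataTypeAndChannel: every element
  // class maps to a scalar dtype plus an explicit channel count; uint24 is the
  // special case, stored as three one-byte channels.
  def toArrayDataTypeAndChannel(ec: ElementClass): (ArrayDataType, Int) = ec match {
    case `uint8`  => (u1, 1)
    case `uint16` => (u2, 1)
    case `uint24` => (u1, 3) // RGB-style data: 3 channels of one byte each
    case `float`  => (f4, 1)
  }
}
```

Making the channel count explicit is what lets WKWHeader drop the ambiguous numBytesPerVoxel-derived representation.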

webknossos-datastore/app/com/scalableminds/webknossos/datastore/datareaders/BytesConverter.scala (2)

3-3: Import statement updated correctly

The import has been streamlined to only include the necessary ArrayDataType type definition.


13-13: Method call updated to use ArrayDataType.bytesPerElement

Changed from the removed bytesPerElementFor to align with the refactoring in ArrayDataType.scala.

webknossos-datastore/app/com/scalableminds/webknossos/datastore/datareaders/DatasetHeader.scala (2)

6-6: Import simplified correctly

Import statement updated to only include ArrayDataType.ArrayDataType, removing unnecessary imports.


31-31: Method call updated to ArrayDataType.bytesPerElement

Correctly changed to use the renamed method in ArrayDataType object.

webknossos-datastore/app/com/scalableminds/webknossos/datastore/datareaders/ArrayDataType.scala (3)

9-9: Method renamed from bytesPerElementFor to bytesPerElement

The method name has been updated for better consistency while preserving the same functionality.


51-63: Well-implemented toWKWId mapping method

This method provides a clean mapping from ArrayDataType values to WKW type ID integers, supporting the refactoring away from VoxelType. The pattern matching is exhaustive and covers all data types.


65-77: Well-implemented fromWKWTypeId conversion method

This method correctly maps WKW type IDs back to ArrayDataType values, providing the reverse functionality of toWKWId. The pattern matching is exhaustive and handles all possible WKW type IDs.
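A sketch of the toWKWId / fromWKWTypeId pair illustrates the round-trip requirement. The numeric ids below are assumptions for illustration only; the authoritative assignments are in ArrayDataType.scala and must match what existing wkw files on disk use.

```scala
// Illustrative two-way mapping between a dtype enum and WKW header type ids.
object WKWIdMappingSketch {
  sealed trait ArrayDataType
  case object u1 extends ArrayDataType
  case object u2 extends ArrayDataType
  case object u4 extends ArrayDataType
  case object u8 extends ArrayDataType
  case object f4 extends ArrayDataType
  case object f8 extends ArrayDataType

  val all: Seq[ArrayDataType] = Seq(u1, u2, u4, u8, f4, f8)

  def toWKWId(dt: ArrayDataType): Int = dt match {
    case `u1` => 1
    case `u2` => 2
    case `u4` => 3
    case `u8` => 4
    case `f4` => 5
    case `f8` => 6
  }

  def fromWKWTypeId(id: Int): ArrayDataType = id match {
    case 1 => u1
    case 2 => u2
    case 3 => u4
    case 4 => u8
    case 5 => f4
    case 6 => f8
    case other => throw new IllegalArgumentException(s"Unknown WKW type id $other")
  }
}
```

A round-trip property check (fromWKWTypeId(toWKWId(dt)) == dt for every dt) is a cheap way to keep the two methods from drifting apart.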

Comment on lines +48 to +49
private def numBytesPerVoxel = numChannels * ArrayDataType.bytesPerElement(dataType)

⚠️ Potential issue

Potential overflow & spec violation for numBytesPerVoxel

numBytesPerVoxel is written with output.writeByte, i.e. only one unsigned byte is stored in the header.
With the new formula

private def numBytesPerVoxel = numChannels * ArrayDataType.bytesPerElement(dataType)

values above 255 will silently overflow or truncate, corrupting every subsequent reader.

Typical cases (u1×3, f4×1, …) are safe, but nothing prevents a caller from creating, say, an f8 layer with 64 channels (512 bytes).

+ require(
+   numBytesPerVoxel <= 0xFF,
+   s"numBytesPerVoxel=$numBytesPerVoxel exceeds 1-byte limit of WKW header"
+ )

Add the guard (ideally right after the calculation) and document the limit in the companion apply that builds headers.
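The truncation this comment warns about can be reproduced directly with DataOutputStream (whose writeByte stores only the low 8 bits of its argument). The guard below follows the suggested require; the object and method names are hypothetical.

```scala
import java.io.{ByteArrayOutputStream, DataOutputStream}

object WkwHeaderByteDemo {
  // writeByte keeps only the low 8 bits: 260 = 0x104, so only 0x04 survives.
  val out = new ByteArrayOutputStream()
  val data = new DataOutputStream(out)
  data.writeByte(260)
  data.flush()
  val stored: Int = out.toByteArray.head & 0xFF // a reader would see 4, not 260

  // Fail fast instead of writing a corrupt header, as the review suggests:
  def checkedNumBytesPerVoxel(numChannels: Int, bytesPerElement: Int): Int = {
    val n = numChannels * bytesPerElement
    require(n <= 0xFF, s"numBytesPerVoxel=$n exceeds 1-byte limit of WKW header")
    n
  }
}
```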

Comment on lines 67 to 70

  output.writeByte(blockType.id)
- output.writeByte(voxelType.id)
+ output.writeByte(ArrayDataType.toWKWId(dataType))
  output.writeByte(numBytesPerVoxel)
  if (isHeaderFile) {
⚠️ Potential issue

Missing validation for numChannels when reading header

During deserialization we do:

val numChannels = numBytesPerVoxel / ArrayDataType.bytesPerElement(voxelType)

If numBytesPerVoxel is not an exact multiple, integer division truncates and we silently drop fractional channels.
Please validate divisibility:

+ val bytesPerElem = ArrayDataType.bytesPerElement(voxelType)
+ require(numBytesPerVoxel % bytesPerElem == 0,
+         s"Inconsistent header: $numBytesPerVoxel not divisible by $bytesPerElem")
+ val numChannels = numBytesPerVoxel / bytesPerElem
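The truncation risk is easy to demonstrate: integer division floors silently, so 7 / 4 yields 1 with no error, and a corrupt header would produce a plausible-looking but wrong channel count. A guarded version along the lines of the suggested diff (object and method names hypothetical):

```scala
object WkwChannelCheckSketch {
  // Validates that the stored bytes-per-voxel is an exact multiple of the
  // element size before deriving the channel count, as the review suggests.
  def numChannelsChecked(numBytesPerVoxel: Int, bytesPerElem: Int): Int = {
    require(numBytesPerVoxel % bytesPerElem == 0,
            s"Inconsistent header: $numBytesPerVoxel not divisible by $bytesPerElem")
    numBytesPerVoxel / bytesPerElem
  }
}
```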

frcroth (Member) left a comment

👍

Development

Successfully merging this pull request may close these issues.

Unify DataType classes in wk
2 participants