Spatially chunked and sharding compatible annotation writer #522

Closed
fcollman wants to merge 49 commits into master from annotationwriter_improvements
Changes from 32 commits
Commits (49)
40d27b9
initial sharding compatible version using cloudvolume
fcollman Jan 28, 2024
aa0b558
make cloud volume import optional
fcollman Jan 30, 2024
04e623b
changed to write via tensorstore
fcollman Jan 31, 2024
b058f78
formatting fixes
fcollman Jan 31, 2024
331b523
spatial sharding WIP
fcollman Feb 1, 2024
196f8b5
fix big endian encoding of spatial index and description
fcollman Feb 2, 2024
074b6e2
formatting fixes
fcollman Feb 2, 2024
7862de0
change shard spec to NamedTuple
fcollman Feb 2, 2024
4e6ebb8
make tensorstore import optional
fcollman Feb 2, 2024
29e9f75
fixing typing errors
fcollman Feb 3, 2024
6efd7bc
fixing type hinting
fcollman Feb 3, 2024
8c9169a
fixing lint issues
fcollman Feb 3, 2024
949f22b
fix more linting
fcollman Feb 3, 2024
fd77f40
fixing import linting
fcollman Feb 3, 2024
3bc1a64
fix: numpy.cast removal
fcollman Feb 3, 2024
dc2fd57
another import block fix
fcollman Feb 3, 2024
cd1e0dd
ruff formatting
fcollman Feb 4, 2024
5ce8665
feat: dynamic lower bounds
fcollman Feb 4, 2024
8f08c24
removing annotation reader
fcollman Feb 4, 2024
f147932
remove unused max annotations
fcollman Feb 4, 2024
e6211c0
make chunk_size default dynamic to size of the coordinate system
fcollman Feb 4, 2024
38eb529
fixing mypy with cast
fcollman Feb 4, 2024
073573b
mypy fix: value error print formatting
fcollman Feb 4, 2024
aece02d
fix doc string
fcollman Feb 7, 2024
d8aa858
fix chunk_size typing
fcollman Feb 7, 2024
34ac02a
renaming chunk size
fcollman Feb 7, 2024
b15286b
add better handling of multiple point annotations
fcollman Feb 8, 2024
d073bea
adding ellipsoid
fcollman Feb 8, 2024
75ba944
fix ellipsoid logic
fcollman Feb 8, 2024
a2c1932
fixing typing
fcollman Feb 8, 2024
ed4761a
ruff formatting
fcollman Feb 8, 2024
bbabf64
fix: relationship key encoded incorrectly
fcollman Feb 11, 2024
524cca2
fixing dtypes of chunk size
fcollman Feb 13, 2024
5f80c3c
fix: missing related IDs in by_id index
fcollman Feb 14, 2024
987659e
fix: typo
fcollman Feb 14, 2024
ca9f066
bugfix: byid endian encoding fix
fcollman Feb 14, 2024
8119007
fixing np.asarray case
fcollman Feb 14, 2024
fcd47a6
fix single layer chunks
fcollman Feb 16, 2024
667d203
fixing chunk size for single planes
fcollman Feb 16, 2024
cf5579e
fix num_chunks
fcollman Feb 16, 2024
20c40e6
fixing spatial indices for points
fcollman Feb 18, 2024
e6398e8
fixed sharded spatial index writing
fcollman Feb 18, 2024
afbc23d
generlizing spatial bins with rtree
fcollman Feb 18, 2024
a8c7a0f
improved generalization of upper and lower bound
fcollman Feb 19, 2024
2a5eeea
remove comments, add sharded option, fix generator
fcollman Feb 19, 2024
1d502ee
adding rtree dependancy
fcollman Feb 19, 2024
a781732
switch to sharded writing as default
fcollman Feb 20, 2024
066b801
removing chunks with no items
fcollman Feb 22, 2024
109726a
Merge branch 'master' into annotationwriter_improvements
fcollman Apr 13, 2024
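
For orientation, the writer these commits extend is the AnnotationWriter in python/neuroglancer/write_annotations.py; the spatial chunking and sharding added here happen when write() lays out the precomputed annotation directory, so the call site stays essentially the same. A rough usage sketch under that assumption (the coordinate space, points, and output path are illustrative, and the chunk-size/shard-spec options introduced by this PR are not shown):

    import neuroglancer
    from neuroglancer.write_annotations import AnnotationWriter

    # Illustrative coordinate space; names, units, and scales are made up.
    coordinate_space = neuroglancer.CoordinateSpace(
        names=["x", "y", "z"], units="nm", scales=[1, 1, 1]
    )

    writer = AnnotationWriter(
        coordinate_space=coordinate_space,
        annotation_type="point",
    )
    writer.add_point([10, 20, 30])
    writer.add_point([40, 50, 60])

    # Per the "switch to sharded writing as default" commit, write() emits a
    # spatially chunked, sharded precomputed annotation layout by default.
    writer.write("/tmp/example_annotations")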
5 changes: 3 additions & 2 deletions python/examples/example.py
@@ -14,8 +14,9 @@ def add_example_layers(state):
     a[1, :, :, :] = np.abs(np.sin(4 * (iy + iz))) * 255
     a[2, :, :, :] = np.abs(np.sin(4 * (ix + iz))) * 255

-    b = np.cast[np.uint32](
-        np.floor(np.sqrt((ix - 0.5) ** 2 + (iy - 0.5) ** 2 + (iz - 0.5) ** 2) * 10)
+    b = np.asarray(
+        np.floor(np.sqrt((ix - 0.5) ** 2 + (iy - 0.5) ** 2 + (iz - 0.5) ** 2) * 10),
+        dtype=np.uint32,
     )
     b = np.pad(b, 1, "constant")
     dimensions = neuroglancer.CoordinateSpace(
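
The example-script diffs above and below all make the same substitution: the np.cast[dtype](x) indexing syntax, which current NumPy releases no longer provide, becomes np.asarray(x, dtype=...). A minimal sketch of the equivalence (the input array is made up):

    import numpy as np

    x = np.sqrt(np.linspace(0.0, 1.0, 5)) * 10  # some float data

    # Old spelling, removed from current NumPy:
    #   b = np.cast[np.uint32](x)

    # Replacement used throughout this PR:
    b = np.asarray(x, dtype=np.uint32)
    print(b.dtype)  # uint32; fractional values are truncated toward zero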
5 changes: 3 additions & 2 deletions python/examples/example_coordinate_arrays.py
@@ -14,8 +14,9 @@ def add_example_layers(state):
     a[1, :, :, :] = np.abs(np.sin(4 * (iy + iz))) * 255
     a[2, :, :, :] = np.abs(np.sin(4 * (ix + iz))) * 255

-    b = np.cast[np.uint32](
-        np.floor(np.sqrt((ix - 0.5) ** 2 + (iy - 0.5) ** 2 + (iz - 0.5) ** 2) * 10)
+    b = np.asarray(
+        np.floor(np.sqrt((ix - 0.5) ** 2 + (iy - 0.5) ** 2 + (iz - 0.5) ** 2) * 10),
+        dtype=np.uint32,
     )
     b = np.pad(b, 1, "constant")
     dimensions = neuroglancer.CoordinateSpace(
5 changes: 3 additions & 2 deletions python/examples/example_coordinate_transform.py
@@ -14,8 +14,9 @@
 ix, iy, iz = np.meshgrid(
     *[np.linspace(0, 1, n) for n in [100, 100, 100]], indexing="ij"
 )
-data = np.cast[np.uint32](
-    np.floor(np.sqrt((ix - 0.5) ** 2 + (iy - 0.5) ** 2 + (iz - 0.5) ** 2) * 10)
+data = np.asarray(
+    np.floor(np.sqrt((ix - 0.5) ** 2 + (iy - 0.5) ** 2 + (iz - 0.5) ** 2) * 10),
+    dtype=np.uint32,
 )
 data = np.pad(data, 1, "constant")
 dimensions = neuroglancer.CoordinateSpace(
5 changes: 3 additions & 2 deletions python/examples/example_local_volume_coordinate_arrays.py
@@ -14,8 +14,9 @@ def add_example_layers(state):
     a[1, :, :, :] = np.abs(np.sin(4 * (iy + iz))) * 255
     a[2, :, :, :] = np.abs(np.sin(4 * (ix + iz))) * 255

-    b = np.cast[np.uint32](
-        np.floor(np.sqrt((ix - 0.5) ** 2 + (iy - 0.5) ** 2 + (iz - 0.5) ** 2) * 10)
+    b = np.asarray(
+        np.floor(np.sqrt((ix - 0.5) ** 2 + (iy - 0.5) ** 2 + (iz - 0.5) ** 2) * 10),
+        dtype=np.uint32,
     )
     b = np.pad(b, 1, "constant")
     dimensions = neuroglancer.CoordinateSpace(
5 changes: 3 additions & 2 deletions python/examples/example_signed_int.py
@@ -10,8 +10,9 @@ def add_example_layer(state):
         *[np.linspace(0, 1, n) for n in [100, 100, 100]], indexing="ij"
     )
     b = (
-        np.cast[np.int32](
-            np.floor(np.sqrt((ix - 0.5) ** 2 + (iy - 0.5) ** 2 + (iz - 0.5) ** 2) * 10)
+        np.asarray(
+            np.floor(np.sqrt((ix - 0.5) ** 2 + (iy - 0.5) ** 2 + (iz - 0.5) ** 2) * 10),
+            dtype=np.int32,
         )
         - 2
     )
5 changes: 3 additions & 2 deletions python/examples/flood_filling_simulation.py
@@ -169,8 +169,9 @@ def process_pos(pos):
                enqueue(tuple(new_pos))

            dist_transform = scipy.ndimage.morphology.distance_transform_edt(~mask)
-           inf_results[slice_expr] = 1 + np.cast[np.uint8](
-               np.minimum(dist_transform, 5) / 5.0 * 254
+           inf_results[slice_expr] = 1 + np.asarray(
+               np.minimum(dist_transform, 5) / 5.0 * 254,
+               dtype=np.uint8,
            )

            self.viewer.defer_callback(update_view)
4 changes: 2 additions & 2 deletions python/examples/interactive_inference.py
@@ -96,8 +96,8 @@ def _do_inference(self, action_state):
         boundary_mask[:-1, :, :] |= gt_data[:-1, :, :] != gt_data[1:, :, :]
         boundary_mask[1:, :, :] |= gt_data[:-1, :, :] != gt_data[1:, :, :]
         dist_transform = scipy.ndimage.morphology.distance_transform_edt(~boundary_mask)
-        self.inf_results[slice_expr] = 1 + np.cast[np.uint8](
-            np.minimum(dist_transform, 5) / 5.0 * 254
+        self.inf_results[slice_expr] = 1 + np.asarray(
+            np.minimum(dist_transform, 5) / 5.0 * 254, np.uint8
         )
         self.inf_volume.invalidate()
2 changes: 1 addition & 1 deletion python/neuroglancer/downsample.py
@@ -31,7 +31,7 @@ def downsample_with_averaging(array, factor):
         indexing_expr = tuple(np.s_[:s] for s in part.shape)
         temp[indexing_expr] += part
         counts[indexing_expr] += 1
-    return np.cast[array.dtype](temp / counts)
+    return np.asarray(temp / counts, dtype=array.dtype)


 def downsample_with_striding(array, factor):
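
downsample_with_averaging accumulates each block of the input, divides by per-voxel counts, and the changed return line casts the mean back to the input dtype. A rough usage sketch (the array and factor are illustrative):

    import numpy as np
    from neuroglancer.downsample import downsample_with_averaging

    a = np.arange(16, dtype=np.float32).reshape(4, 4)
    d = downsample_with_averaging(a, (2, 2))
    # Each output value is the mean of one 2x2 block, returned in the input dtype:
    # [[2.5, 4.5], [10.5, 12.5]]
    print(d)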
6 changes: 4 additions & 2 deletions python/neuroglancer/local_volume.py
@@ -193,7 +193,9 @@ def get_encoded_subvolume(self, data_format, start, end, scale_key):
             or np.prod(downsample_factor) > self.max_downsampling
         ):
             raise ValueError("Invalid downsampling factor.")
-        downsampled_shape = np.cast[np.int64](np.ceil(self.shape / downsample_factor))
+        downsampled_shape = np.asarray(
+            np.ceil(self.shape / downsample_factor, dtype=np.int64)
+        )
         if np.any(end < start) or np.any(start < 0) or np.any(end > downsampled_shape):
             raise ValueError("Out of bounds data request.")

Review comment on the added lines above:

@mikejhuang (Feb 14, 2024)
I'm running into a snag here.

    TypeError: No loop matching the specified signature and casting was found for ufunc ceil

I moved the dtype to the asarray and it seemed to have worked.

    downsampled_shape = np.asarray(np.ceil(self.shape / downsample_factor), dtype=np.int64)

@fcollman (Contributor, Author)
thanks, fixed!

@@ -208,7 +210,7 @@ def get_encoded_subvolume(self, data_format, start, end, scale_key):
         )
         subvol = np.array(self.data[indexing_expr], copy=False)
         if subvol.dtype == "float64":
-            subvol = np.cast[np.float32](subvol)
+            subvol = np.asarray(subvol, dtype=np.float32)

         if np.any(downsample_factor != 1):
             if self.volume_type == "image":
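
The snag reported in the review thread above is easy to reproduce: np.ceil is a ufunc with only floating-point loops, so passing dtype=np.int64 to it raises, whereas casting the result afterwards works. A minimal sketch (the shape and factor values are made up):

    import numpy as np

    shape = np.array([100.0, 100.0, 100.0])
    downsample_factor = np.array([3.0, 3.0, 3.0])

    # As written in this revision of the diff: the dtype goes to the ufunc and fails.
    try:
        np.ceil(shape / downsample_factor, dtype=np.int64)
    except TypeError as e:
        print(e)  # no loop matching the specified signature ... for ufunc ceil

    # As suggested in the review (and later adopted): ceil first, then cast.
    downsampled_shape = np.asarray(np.ceil(shape / downsample_factor), dtype=np.int64)
    print(downsampled_shape)  # [34 34 34]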