Merge pull request #6 from p2p-ld/dynamictable

Implement DynamicTable
This commit is contained in:
Jonny Saunders 2024-08-15 01:57:39 -07:00 committed by GitHub
commit ff77f0a2b8
No known key found for this signature in database
GPG key ID: B5690EEEBB952194
451 changed files with 41658 additions and 8541 deletions


@ -2,6 +2,8 @@ name: Lint
on:
push:
branches:
- main
pull_request:
branches: [main]


@ -2,6 +2,11 @@ name: Tests
on:
push:
branches:
- main
pull_request:
branches:
- main
jobs:
test:
@ -34,8 +39,20 @@ jobs:
run: pytest
working-directory: nwb_linkml
- name: Report coverage
working-directory: nwb_linkml
run: "coveralls --service=github"
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
- name: Coveralls Parallel
uses: coverallsapp/github-action@v2.3.0
if: runner.os != 'macOS'
with:
flag-name: run-${{ join(matrix.*, '-') }}
parallel: true
debug: true
finish-coverage:
needs: test
if: ${{ always() }}
runs-on: ubuntu-latest
steps:
- name: Coveralls Finished
uses: coverallsapp/github-action@v2.3.0
with:
parallel-finished: true


@ -284,6 +284,7 @@ api/nwb_linkml/schema/index
meta/todo
meta/changelog
meta/references
genindex
```


@ -20,6 +20,11 @@
### DynamicTable
```{note}
See the [DynamicTable](https://hdmf-common-schema.readthedocs.io/en/stable/format_description.html#dynamictable)
reference docs
```
One of the major special cases in NWB is the use of `DynamicTable` to hold tabular data
whose columns are not in the base spec.
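A minimal sketch of what such an extension looks like in nwb schema language — the type and column names here are hypothetical, not taken from the core spec:

```yaml
# hypothetical extension: a DynamicTable subtype with a lab-specific column
groups:
  - neurodata_type_def: MyTrialsTable
    neurodata_type_inc: DynamicTable
    doc: Trials table with an extra, non-core column
    datasets:
      - name: reaction_time
        neurodata_type_inc: VectorData
        dtype: float32
        doc: Reaction time per trial, in seconds
```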
@ -284,8 +289,35 @@ When generating pydantic models we...
There are several different ways to create references between objects in nwb/hdmf:
- ...
- [`links`](https://schema-language.readthedocs.io/en/latest/description.html#sec-link-spec) are group-level
properties that can reference other groups or datasets like this:
```yaml
links:
- name: Link name
doc: Required string with the description of the link
target_type: Type of target
quantity: Optional quantity identifier for the group (default=1).
```
- [Reference `dtype`](https://schema-language.readthedocs.io/en/latest/description.html#reference-dtype)s are
dataset- and attribute-level properties that can reference both other objects and regions within other objects:
```yaml
dtype:
target_type: ElectrodeGroup
reftype: object
```
- `TimeSeriesReferenceVectorData` is a compound dtype that behaves like VectorData and VectorIndex combined
into a single type. It is slightly different in that each row of the vector can refer to a different table,
and has a different way of handling selection (with `start` and `count`
rather than a series of indices for the end of each cell)
- Implicitly, hdmf creates references between objects according to some naming conventions, eg.
an attribute/dataset that is a `VectorIndex` named `mydata_index` will be linked to a `VectorData`
object `mydata`.
- There is currently a note in the schema language docs that there will be an additional
[Relationships](https://schema-language.readthedocs.io/en/latest/description.html#relationships) system
that explicitly models relationships, but it is unclear how that would differ from references.
We represent all of these by referring directly to the object type, preserving the source type
in an annotation when necessary.
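As a rough sketch (the slot and annotation layout here is illustrative, not the exact generated output), a link to an `ElectrodeGroup` might be kept as a LinkML slot whose range is the target class, with the original mechanism recorded in an annotation:

```yaml
# hypothetical generated slot for a link
electrode_group:
  name: electrode_group
  description: Reference to the electrode group
  any_of:
    - range: ElectrodeGroup
    - range: string
  annotations:
    source_type:
      tag: source_type
      value: link
```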
## LinkML to Everything

docs/meta/references.md Normal file

@ -0,0 +1,11 @@
# References
## Documentation
- [hdmf](https://hdmf.readthedocs.io/en/stable/)
- [hdmf-common-schema](https://hdmf-common-schema.readthedocs.io/en/stable/)
- [pynwb](https://pynwb.readthedocs.io/en/latest/)
```{todo}
Add the bibtex refs to NWB papers :)
```


@ -7,6 +7,7 @@ NWB schema translation
- handle compound `dtype` like in ophys.PlaneSegmentation.pixel_mask
- handle compound `dtype` like in TimeSeriesReferenceVectorData
- Create a validator that checks if all the lists in a compound dtype dataset are same length
- [ ] Move making `target` optional in vectorIndex from pydantic generator to linkml generators!
Cleanup
- [ ] Update pydantic generator
@ -22,7 +23,7 @@ Cleanup
- [ ] Make a minimal pydanticgen-only package to slim linkml deps?
- [ ] Disambiguate "maps" terminology - split out simple maps from the eg. dataset mapping classes
- [ ] Remove unnecessary imports
- [x] Remove unnecessary imports
- dask
- nptyping
- [ ] Adapt the split generation to the new split generator style

File diff suppressed because it is too large


@ -14,7 +14,6 @@ dependencies = [
"furo>=2023.8.19",
"myst-parser>=2.0.0",
"autodoc-pydantic>=2.0.1",
"nptyping>=2.5.0",
"sphinx-autobuild>=2021.3.14",
"sphinx-design>=0.5.0",
"sphinx-togglebutton>=0.3.2",

File diff suppressed because it is too large


@ -15,15 +15,14 @@ dependencies = [
"rich>=13.5.2",
#"linkml>=1.7.10",
"linkml @ git+https://github.com/sneakers-the-rat/linkml@nwb-linkml",
"nptyping>=2.5.0",
"pydantic>=2.3.0",
"h5py>=3.9.0",
"pydantic-settings>=2.0.3",
"dask>=2023.9.2",
"tqdm>=4.66.1",
'typing-extensions>=4.12.2;python_version<"3.11"',
"numpydantic>=1.2.1",
"numpydantic>=1.3.3",
"black>=24.4.2",
"pandas>=2.2.2",
]
[project.urls]
@ -37,21 +36,16 @@ plot = [
"dash-cytoscape<1.0.0,>=0.3.0",
]
tests = [
"nwb-linkml[plot]",
"pytest<8.0.0,>=7.4.0",
"nwb-linkml",
"pytest>=8.0.0",
"pytest-depends<2.0.0,>=1.0.1",
"coverage<7.0.0,>=6.1.1",
"pytest-md<1.0.0,>=0.2.0",
"pytest-cov<5.0.0,>=4.1.0",
"coveralls<4.0.0,>=3.3.1",
"pytest-profiling<2.0.0,>=1.7.0",
"sybil<6.0.0,>=5.0.3",
"sybil>=6.0.3",
"requests-cache>=1.2.1",
]
dev = [
"nwb-linkml[tests]",
"ruff>=0.5.0",
"black>=24.4.2",
]
[tool.pdm]
@ -75,7 +69,9 @@ addopts = [
]
markers = [
"dev: tests that are just for development rather than testing correctness",
"provider: tests for providers!"
"provider: tests for providers!",
"linkml: tests related to linkml generation",
"pydantic: tests related to pydantic generation"
]
testpaths = [
"src/nwb_linkml",


@ -5,16 +5,8 @@ Base class for adapters
import sys
from abc import abstractmethod
from dataclasses import dataclass, field
from typing import (
Any,
Generator,
List,
Optional,
Tuple,
Type,
TypeVar,
Union,
)
from logging import Logger
from typing import Any, Generator, List, Literal, Optional, Tuple, Type, TypeVar, Union, overload
from linkml_runtime.dumpers import yaml_dumper
from linkml_runtime.linkml_model import (
@ -26,7 +18,8 @@ from linkml_runtime.linkml_model import (
)
from pydantic import BaseModel
from nwb_schema_language import Attribute, Dataset, Group, Schema
from nwb_linkml.logging import init_logger
from nwb_schema_language import Attribute, CompoundDtype, Dataset, Group, Schema
if sys.version_info.minor >= 11:
from typing import TypeVarTuple, Unpack
@ -107,6 +100,15 @@ class BuildResult:
class Adapter(BaseModel):
"""Abstract base class for adapters"""
_logger: Optional[Logger] = None
@property
def logger(self) -> Logger:
"""A logger with the name of the adapter class! See :class:`.config`"""
if self._logger is None:
self._logger = init_logger(self.__class__.__name__)
return self._logger
@abstractmethod
def build(self) -> "BuildResult":
"""
@ -196,6 +198,14 @@ class Adapter(BaseModel):
if isinstance(item, tuple) and item[0] in field and item[1] is not None:
yield item[1]
@overload
def walk_field_values(
self,
input: Union[BaseModel, dict, list],
field: Literal["neurodata_type_def"],
value: Optional[Any] = None,
) -> Generator[Group | Dataset, None, None]: ...
def walk_field_values(
self, input: Union[BaseModel, dict, list], field: str, value: Optional[Any] = None
) -> Generator[BaseModel, None, None]:
@ -238,3 +248,43 @@ class Adapter(BaseModel):
for item in self.walk(input):
if any([type(item) is atype for atype in get_type]):
yield item
def is_1d(cls: Dataset | Attribute) -> bool:
"""
Check if the values of a dataset are 1-dimensional.
Specifically:
* a single-layer dim/shape list of length 1, or
* a nested dim/shape list where every nested spec is of length 1
"""
if cls.dims is None:
return False
return (
not any([isinstance(dim, list) for dim in cls.dims]) and len(cls.dims) == 1
) or ( # nested list
all([isinstance(dim, list) for dim in cls.dims])
and len(cls.dims) == 1
and len(cls.dims[0]) == 1
)
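A standalone sketch of the same dimensionality check, using `SimpleNamespace` as a stand-in for the `Dataset`/`Attribute` models:

```python
from types import SimpleNamespace

def is_1d(cls) -> bool:
    # mirror of the helper above: 1D means one flat dim,
    # or a nested dim list with exactly one length-1 spec
    if cls.dims is None:
        return False
    return (
        not any(isinstance(dim, list) for dim in cls.dims) and len(cls.dims) == 1
    ) or (
        all(isinstance(dim, list) for dim in cls.dims)
        and len(cls.dims) == 1
        and len(cls.dims[0]) == 1
    )

print(is_1d(SimpleNamespace(dims=["x"])))       # True: single flat dim
print(is_1d(SimpleNamespace(dims=[["x"]])))     # True: one nested length-1 spec
print(is_1d(SimpleNamespace(dims=["x", "y"])))  # False: 2D
print(is_1d(SimpleNamespace(dims=None)))        # False: no dims at all
```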
def is_compound(cls: Dataset) -> bool:
"""Check if dataset has a compound dtype"""
return (
isinstance(cls.dtype, list)
and len(cls.dtype) > 0
and isinstance(cls.dtype[0], CompoundDtype)
)
def has_attrs(cls: Dataset) -> bool:
"""
Check if a dataset has attributes, none of which have a default value
"""
return (
cls.attributes is not None
and len(cls.attributes) > 0
and all([not a.value for a in cls.attributes])
)


@ -0,0 +1,197 @@
"""
Adapters for attribute types
"""
from abc import abstractmethod
from typing import ClassVar, Optional, Type, TypedDict
from linkml_runtime.linkml_model.meta import SlotDefinition
from nwb_linkml.adapters.adapter import Adapter, BuildResult, is_1d
from nwb_linkml.adapters.array import ArrayAdapter
from nwb_linkml.maps import Map
from nwb_linkml.maps.dtype import handle_dtype
from nwb_schema_language import Attribute
def _make_ifabsent(val: str | int | float | None) -> str | None:
if val is None:
return None
elif isinstance(val, str):
return f"string({val})"
elif isinstance(val, int):
return f"integer({val})"
elif isinstance(val, float):
return f"float({val})"
else:
return str(val)
class AttrDefaults(TypedDict):
"""Default fields for an attribute"""
equals_string: str | None
equals_number: float | int | None
ifabsent: str | None
class AttributeMap(Map):
"""Base class for attribute mapping transformations :)"""
@classmethod
def handle_defaults(cls, attr: Attribute) -> AttrDefaults:
"""
Construct arguments for linkml slot default metaslots from nwb schema lang attribute props
"""
equals_string = None
equals_number = None
default_value = None
if attr.value:
if isinstance(attr.value, (int, float)):
equals_number = attr.value
elif attr.value:
equals_string = str(attr.value)
if equals_number:
default_value = _make_ifabsent(equals_number)
elif equals_string:
default_value = _make_ifabsent(equals_string)
elif attr.default_value:
default_value = _make_ifabsent(attr.default_value)
return AttrDefaults(
equals_string=equals_string, equals_number=equals_number, ifabsent=default_value
)
@classmethod
@abstractmethod
def check(cls, attr: Attribute) -> bool:
"""
Check if this map applies
"""
pass # pragma: no cover
@classmethod
@abstractmethod
def apply(
cls, attr: Attribute, res: Optional[BuildResult] = None, name: Optional[str] = None
) -> BuildResult:
"""
Apply this mapping
"""
pass # pragma: no cover
class MapScalar(AttributeMap):
"""
Map a simple scalar value
"""
@classmethod
def check(cls, attr: Attribute) -> bool:
"""
Check if we are a scalar value!
"""
return not attr.dims and not attr.shape
@classmethod
def apply(cls, attr: Attribute, res: Optional[BuildResult] = None) -> BuildResult:
"""
Make a slot for us!
"""
slot = SlotDefinition(
name=attr.name,
range=handle_dtype(attr.dtype),
description=attr.doc,
required=attr.required,
**cls.handle_defaults(attr),
)
return BuildResult(slots=[slot])
class MapArray(AttributeMap):
"""
Map an array value!
"""
@classmethod
def check(cls, attr: Attribute) -> bool:
"""
Check that we have some array specification!
"""
return bool(attr.dims or attr.shape)
@classmethod
def apply(cls, attr: Attribute, res: Optional[BuildResult] = None) -> BuildResult:
"""
Make a slot with an array expression!
If we're just a 1D array, use a list (set multivalued: true).
If more than that, make an array descriptor
"""
expressions = {}
multivalued = False
if is_1d(attr):
multivalued = True
else:
# ---------------------------------
# SPECIAL CASE: Some old versions of HDMF don't have ``dims``, only shape
# ---------------------------------
shape = attr.shape
dims = attr.dims
if shape and not dims:
dims = ["null"] * len(shape)
array_adapter = ArrayAdapter(dims, shape)
expressions = array_adapter.make_slot()
slot = SlotDefinition(
name=attr.name,
range=handle_dtype(attr.dtype),
multivalued=multivalued,
description=attr.doc,
required=attr.required,
**expressions,
**cls.handle_defaults(attr),
)
return BuildResult(slots=[slot])
class AttributeAdapter(Adapter):
"""
Create slot definitions from nwb schema language attributes
"""
TYPE: ClassVar[Type] = Attribute
cls: Attribute
def build(self) -> "BuildResult":
"""
Build the slot definitions; every attribute should have a map.
"""
map = self.match()
return map.apply(self.cls)
def match(self) -> Optional[Type[AttributeMap]]:
"""
Find the map class that applies to this attribute
Returns:
:class:`.AttributeMap`
Raises:
RuntimeError - if more than one map matches
"""
# find a map to use
matches = [m for m in AttributeMap.__subclasses__() if m.check(self.cls)]
if len(matches) > 1: # pragma: no cover
raise RuntimeError(
"Only one map should apply to an attribute, you need to refactor the maps! Got maps:"
f" {matches}"
)
elif len(matches) == 0:
return None
else:
return matches[0]
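The `match` pattern — scanning `__subclasses__()` for the single map whose `check` passes — can be sketched independently of the NWB types:

```python
class Map:
    @classmethod
    def check(cls, value) -> bool: ...

class ScalarMap(Map):
    @classmethod
    def check(cls, value):
        return not isinstance(value, list)

class ArrayMap(Map):
    @classmethod
    def check(cls, value):
        return isinstance(value, list)

def match(value):
    # exactly one registered map may claim a value
    matches = [m for m in Map.__subclasses__() if m.check(value)]
    if len(matches) > 1:
        raise RuntimeError(f"ambiguous maps: {matches}")
    return matches[0] if matches else None

print(match([1, 2]).__name__)  # ArrayMap
print(match(3).__name__)       # ScalarMap
```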


@ -9,9 +9,10 @@ from linkml_runtime.linkml_model import ClassDefinition, SlotDefinition
from pydantic import field_validator
from nwb_linkml.adapters.adapter import Adapter, BuildResult
from nwb_linkml.adapters.attribute import AttributeAdapter
from nwb_linkml.maps import QUANTITY_MAP
from nwb_linkml.maps.naming import camel_to_snake
from nwb_schema_language import CompoundDtype, Dataset, DTypeType, FlatDtype, Group, ReferenceDtype
from nwb_schema_language import Dataset, Group
T = TypeVar("T", bound=Type[Dataset] | Type[Group])
TI = TypeVar("TI", bound=Dataset | Group)
@ -118,22 +119,35 @@ class ClassAdapter(Adapter):
Returns:
list[:class:`.SlotDefinition`]
"""
attrs = [
SlotDefinition(
name=attr.name,
description=attr.doc,
range=self.handle_dtype(attr.dtype),
)
for attr in cls.attributes
]
return attrs
if cls.attributes is not None:
results = [AttributeAdapter(cls=attr).build() for attr in cls.attributes]
slots = [r.slots[0] for r in results]
return slots
else:
return []
def _get_full_name(self) -> str:
"""The full name of the object in the generated linkml
Distinct from 'name' which is the thing that's used to define position in
a hierarchical data setting
a hierarchical data setting.
Combines names from ``parent``, if present, using ``"__"`` .
Rather than concatenating the full series of names with ``__`` like
* ``Parent``
* ``Parent__child1``
* ``Parent__child1__child2``
we only keep the last parent, so
* ``Parent``
* ``Parent__child1``
* ``child1__child2``
The assumption is that a child name may not be unique, but the combination of
a parent/child pair should be unique enough to avoid name shadowing without
making humongous and cumbersome names.
"""
if self.cls.neurodata_type_def:
name = self.cls.neurodata_type_def
@ -141,7 +155,8 @@ class ClassAdapter(Adapter):
# not necessarily a unique name, so we combine parent names
name_parts = []
if self.parent is not None:
name_parts.append(self.parent._get_full_name())
parent_name = self.parent._get_full_name().split("__")[-1]
name_parts.append(parent_name)
name_parts.append(self.cls.name)
name = "__".join(name_parts)
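The parent-name truncation described in the docstring can be sketched as a hypothetical standalone helper (not the adapter method itself):

```python
def combine_names(parent_full_name, child_name):
    # keep only the last segment of the parent's full name, avoiding
    # unbounded Parent__child1__child2__... chains
    if parent_full_name is None:
        return child_name
    return "__".join([parent_full_name.split("__")[-1], child_name])

print(combine_names(None, "Parent"))              # Parent
print(combine_names("Parent", "child1"))          # Parent__child1
print(combine_names("Parent__child1", "child2"))  # child1__child2
```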
@ -187,37 +202,6 @@ class ClassAdapter(Adapter):
return name
@classmethod
def handle_dtype(cls, dtype: DTypeType | None) -> str:
"""
Get the string form of a dtype
Args:
dtype (:class:`.DTypeType`): Dtype to stringify
Returns:
str
"""
if isinstance(dtype, ReferenceDtype):
return dtype.target_type
elif dtype is None or dtype == []:
# Some ill-defined datasets are "abstract" despite that not being in the schema language
return "AnyType"
elif isinstance(dtype, FlatDtype):
return dtype.value
elif isinstance(dtype, list) and isinstance(dtype[0], CompoundDtype):
# there is precisely one class that uses compound dtypes:
# TimeSeriesReferenceVectorData
# compoundDtypes are able to define a ragged table according to the schema
# but are used in this single case equivalently to attributes.
# so we'll... uh... treat them as slots.
# TODO
return "AnyType"
else:
# flat dtype
return dtype
def build_name_slot(self) -> SlotDefinition:
"""
If a class has a name, then that name should be a slot with a


@ -7,13 +7,13 @@ from typing import ClassVar, Optional, Type
from linkml_runtime.linkml_model.meta import ArrayExpression, SlotDefinition
from nwb_linkml.adapters.adapter import BuildResult
from nwb_linkml.adapters.adapter import BuildResult, has_attrs, is_1d, is_compound
from nwb_linkml.adapters.array import ArrayAdapter
from nwb_linkml.adapters.classes import ClassAdapter
from nwb_linkml.maps import QUANTITY_MAP, Map
from nwb_linkml.maps.dtype import flat_to_linkml
from nwb_linkml.maps.dtype import flat_to_linkml, handle_dtype
from nwb_linkml.maps.naming import camel_to_snake
from nwb_schema_language import CompoundDtype, Dataset
from nwb_schema_language import Dataset
class DatasetMap(Map):
@ -106,7 +106,7 @@ class MapScalar(DatasetMap):
this_slot = SlotDefinition(
name=cls.name,
description=cls.doc,
range=ClassAdapter.handle_dtype(cls.dtype),
range=handle_dtype(cls.dtype),
**QUANTITY_MAP[cls.quantity],
)
res = BuildResult(slots=[this_slot])
@ -154,10 +154,14 @@ class MapScalarAttributes(DatasetMap):
name: rate
description: Sampling rate, in Hz.
range: float32
required: true
unit:
name: unit
description: Unit of measurement for time, which is fixed to 'seconds'.
ifabsent: string(seconds)
range: text
required: true
equals_string: seconds
value:
name: value
range: float64
@ -203,9 +207,7 @@ class MapScalarAttributes(DatasetMap):
"""
Map to a scalar attribute with an adjoining "value" slot
"""
value_slot = SlotDefinition(
name="value", range=ClassAdapter.handle_dtype(cls.dtype), required=True
)
value_slot = SlotDefinition(name="value", range=handle_dtype(cls.dtype), required=True)
res.classes[0].attributes["value"] = value_slot
return res
@ -216,8 +218,8 @@ class MapListlike(DatasetMap):
Used exactly once in the core schema, in ``ImageReferences`` -
an array of references to other ``Image`` datasets. We ignore the
usual array structure and unnest the implicit array into a slot names from the
target type rather than the oddly-named ``num_images`` dimension so that
usual array structure and unnest the implicit array into a slot named "value"
rather than the oddly-named ``num_images`` dimension so that
ultimately in the pydantic model we get a nicely behaved single-level list.
Examples:
@ -245,12 +247,16 @@ class MapListlike(DatasetMap):
name: name
range: string
required: true
image:
name: image
value:
name: value
annotations:
source_type:
tag: source_type
value: reference
description: Ordered dataset of references to Image objects.
multivalued: true
range: Image
required: true
multivalued: true
tree_root: true
"""
@ -271,7 +277,7 @@ class MapListlike(DatasetMap):
* - ``dtype``
- ``Class``
"""
dtype = ClassAdapter.handle_dtype(cls.dtype)
dtype = handle_dtype(cls.dtype)
return (
cls.neurodata_type_inc != "VectorData"
and is_1d(cls)
@ -286,15 +292,15 @@ class MapListlike(DatasetMap):
"""
Map to a list of the given class
"""
dtype = camel_to_snake(ClassAdapter.handle_dtype(cls.dtype))
slot = SlotDefinition(
name=dtype,
name="value",
multivalued=True,
range=ClassAdapter.handle_dtype(cls.dtype),
range=handle_dtype(cls.dtype),
description=cls.doc,
required=cls.quantity not in ("*", "?"),
annotations=[{"source_type": "reference"}],
)
res.classes[0].attributes[dtype] = slot
res.classes[0].attributes["value"] = slot
return res
@ -378,7 +384,7 @@ class MapArraylike(DatasetMap):
- ``False``
"""
dtype = ClassAdapter.handle_dtype(cls.dtype)
dtype = handle_dtype(cls.dtype)
return (
cls.name
and (all([cls.dims, cls.shape]) or cls.neurodata_type_inc == "VectorData")
@ -409,7 +415,7 @@ class MapArraylike(DatasetMap):
SlotDefinition(
name=name,
multivalued=False,
range=ClassAdapter.handle_dtype(cls.dtype),
range=handle_dtype(cls.dtype),
description=cls.doc,
required=cls.quantity not in ("*", "?"),
**expressions,
@ -478,12 +484,14 @@ class MapArrayLikeAttributes(DatasetMap):
name: resolution
description: Pixel resolution of the image, in pixels per centimeter.
range: float32
required: false
description:
name: description
description: Description of the image.
range: text
array:
name: array
required: false
value:
name: value
range: numeric
any_of:
- array:
@ -513,7 +521,7 @@ class MapArrayLikeAttributes(DatasetMap):
"""
Check that we're an array with some additional metadata
"""
dtype = ClassAdapter.handle_dtype(cls.dtype)
dtype = handle_dtype(cls.dtype)
return (
all([cls.dims, cls.shape])
and cls.neurodata_type_inc != "VectorData"
@ -532,10 +540,8 @@ class MapArrayLikeAttributes(DatasetMap):
array_adapter = ArrayAdapter(cls.dims, cls.shape)
expressions = array_adapter.make_slot()
# make a slot for the arraylike class
array_slot = SlotDefinition(
name="array", range=ClassAdapter.handle_dtype(cls.dtype), **expressions
)
res.classes[0].attributes.update({"array": array_slot})
array_slot = SlotDefinition(name="value", range=handle_dtype(cls.dtype), **expressions)
res.classes[0].attributes.update({"value": array_slot})
return res
@ -572,7 +578,7 @@ class MapClassRange(DatasetMap):
name=cls.name,
description=cls.doc,
range=f"{cls.neurodata_type_inc}",
annotations=[{"named": True}],
annotations=[{"named": True}, {"source_type": "neurodata_type_inc"}],
**QUANTITY_MAP[cls.quantity],
)
res = BuildResult(slots=[this_slot])
@ -596,7 +602,7 @@ class MapVectorClassRange(DatasetMap):
Check that we are a VectorData object without any additional attributes
with a dtype that refers to another class
"""
dtype = ClassAdapter.handle_dtype(cls.dtype)
dtype = handle_dtype(cls.dtype)
return (
cls.neurodata_type_inc == "VectorData"
and cls.name
@ -617,7 +623,7 @@ class MapVectorClassRange(DatasetMap):
name=cls.name,
description=cls.doc,
multivalued=True,
range=ClassAdapter.handle_dtype(cls.dtype),
range=handle_dtype(cls.dtype),
required=cls.quantity not in ("*", "?"),
)
res = BuildResult(slots=[this_slot])
@ -672,7 +678,7 @@ class MapVectorClassRange(DatasetMap):
# this_slot = SlotDefinition(
# name=cls.name,
# description=cls.doc,
# range=ClassAdapter.handle_dtype(cls.dtype),
# range=handle_dtype(cls.dtype),
# multivalued=True,
# )
# # No need to make a class for us, so we replace the existing build results
@ -686,17 +692,28 @@ class MapNVectors(DatasetMap):
Most commonly: ``VectorData`` is subclassed without a name and with a '*' quantity to indicate
arbitrary columns.
Used twice:
- Images
- DynamicTable (and all its uses)
DynamicTable (and the slot VectorData where this is called for)
is handled specially and just dropped, because we handle the possibility for
arbitrary extra VectorData in the :mod:`nwb_linkml.includes.hdmf` module mixin classes.
So really this is just a handler for the `Images` case
"""
@classmethod
def check(c, cls: Dataset) -> bool:
"""
Check for being an unnamed multivalued vector class
Check for being an unnamed multivalued vector class that isn't VectorData
"""
return (
cls.name is None
and cls.neurodata_type_def is None
and cls.neurodata_type_inc
and cls.neurodata_type_inc != "VectorData"
and cls.quantity in ("*", "+")
)
@ -725,6 +742,10 @@ class MapCompoundDtype(DatasetMap):
We render them just as a class with each of the dtypes as slots - they are
typically used by other datasets to create a table.
Since there is exactly one class (``TimeSeriesReferenceVectorData``) that uses compound dtypes
meaningfully, we just hardcode the behavior of inheriting the array shape from the VectorData
parent classes. Otherwise, linkml schemas correctly propagate the ``value`` property.
Eg. ``base.TimeSeriesReferenceVectorData``
.. code-block:: yaml
@ -772,10 +793,14 @@ class MapCompoundDtype(DatasetMap):
slots[a_dtype.name] = SlotDefinition(
name=a_dtype.name,
description=a_dtype.doc,
range=ClassAdapter.handle_dtype(a_dtype.dtype),
range=handle_dtype(a_dtype.dtype),
array=ArrayExpression(exact_number_dimensions=1),
**QUANTITY_MAP[cls.quantity],
)
res.classes[0].attributes.update(slots)
if "value" in res.classes[0].attributes:
del res.classes[0].attributes["value"]
return res
@ -825,36 +850,3 @@ class DatasetAdapter(ClassAdapter):
return None
else:
return matches[0]
def is_1d(cls: Dataset) -> bool:
"""
Check if the values of a dataset are 1-dimensional.
Specifically:
* a single-layer dim/shape list of length 1, or
* a nested dim/shape list where every nested spec is of length 1
"""
return (
not any([isinstance(dim, list) for dim in cls.dims]) and len(cls.dims) == 1
) or ( # nested list
all([isinstance(dim, list) for dim in cls.dims])
and len(cls.dims) == 1
and len(cls.dims[0]) == 1
)
def is_compound(cls: Dataset) -> bool:
"""Check if dataset has a compound dtype"""
return (
isinstance(cls.dtype, list)
and len(cls.dtype) > 0
and isinstance(cls.dtype[0], CompoundDtype)
)
def has_attrs(cls: Dataset) -> bool:
"""
Check if a dataset has any attributes at all without defaults
"""
return len(cls.attributes) > 0 and all([not a.value for a in cls.attributes])


@ -2,7 +2,7 @@
Adapter for NWB groups to linkml Classes
"""
from typing import Type
from typing import List, Type
from linkml_runtime.linkml_model import SlotDefinition
@ -28,25 +28,13 @@ class GroupAdapter(ClassAdapter):
Do the translation, yielding the BuildResult
"""
# Handle container groups with only * quantity unnamed groups
if len(self.cls.groups) > 0 and all(
[self._check_if_container(g) for g in self.cls.groups]
if (
len(self.cls.groups) > 0
and not self.cls.links
and all([self._check_if_container(g) for g in self.cls.groups])
): # and \
# self.parent is not None:
return self.handle_container_group(self.cls)
# Or you can have groups like /intervals where there are some named groups, and some unnamed
# but they all have the same type
elif (
len(self.cls.groups) > 0
and all(
[
g.neurodata_type_inc == self.cls.groups[0].neurodata_type_inc
for g in self.cls.groups
]
)
and self.cls.groups[0].neurodata_type_inc is not None
and all([g.quantity in ("?", "*") for g in self.cls.groups])
):
return self.handle_container_group(self.cls)
# handle if we are a terminal container group without making a new class
if (
@ -58,17 +46,42 @@ class GroupAdapter(ClassAdapter):
return self.handle_container_slot(self.cls)
nested_res = self.build_subclasses()
# add links
links = self.build_links()
# we don't propagate slots up to the next level since they are meant for this
# level (ie. a way to refer to our children)
res = self.build_base(extra_attrs=nested_res.slots)
res = self.build_base(extra_attrs=nested_res.slots + links)
# we do propagate classes tho
res.classes.extend(nested_res.classes)
return res
def build_links(self) -> List[SlotDefinition]:
"""
Build links specified in the ``links`` field as slots that refer to other
classes, with an additional annotation specifying that they are in fact links.
Link slots can take either the object itself or the path to that object in the
file hierarchy as a string.
"""
if not self.cls.links:
return []
slots = [
SlotDefinition(
name=link.name,
any_of=[{"range": link.target_type}, {"range": "string"}],
annotations=[{"tag": "source_type", "value": "link"}],
**QUANTITY_MAP[link.quantity],
)
for link in self.cls.links
]
return slots
def handle_container_group(self, cls: Group) -> BuildResult:
"""
Make a special LinkML `children` slot that can
Make a special LinkML `value` slot that can
have any number of the objects that are of `neurodata_type_inc` class
Examples:
@ -84,14 +97,11 @@ class GroupAdapter(ClassAdapter):
doc: Images objects containing images of presented stimuli.
quantity: '*'
Args:
children (List[:class:`.Group`]): Child groups
"""
# don't build subgroups as their own classes, just make a slot
# that can contain them
name = cls.name if self.cls.name else "children"
name = cls.name if self.cls.name else "value"
slot = SlotDefinition(
name=name,


@ -13,7 +13,7 @@ from typing import Dict, List, Optional
from linkml_runtime.dumpers import yaml_dumper
from linkml_runtime.linkml_model import Annotation, SchemaDefinition
from pydantic import Field, PrivateAttr
from pydantic import Field, model_validator
from nwb_linkml.adapters.adapter import Adapter, BuildResult
from nwb_linkml.adapters.schema import SchemaAdapter
@ -31,12 +31,6 @@ class NamespacesAdapter(Adapter):
schemas: List[SchemaAdapter]
imported: List["NamespacesAdapter"] = Field(default_factory=list)
_imports_populated: bool = PrivateAttr(False)
def __init__(self, **kwargs: dict):
super().__init__(**kwargs)
self._populate_schema_namespaces()
@classmethod
def from_yaml(cls, path: Path) -> "NamespacesAdapter":
"""
@ -70,8 +64,6 @@ class NamespacesAdapter(Adapter):
"""
Build the NWB namespace to the LinkML Schema
"""
if not self._imports_populated and not skip_imports:
self.populate_imports()
sch_result = BuildResult()
for sch in self.schemas:
@ -129,6 +121,7 @@ class NamespacesAdapter(Adapter):
return sch_result
@model_validator(mode="after")
def _populate_schema_namespaces(self) -> None:
"""
annotate for each schema which namespace imports it
@ -143,6 +136,7 @@ class NamespacesAdapter(Adapter):
sch.namespace = ns.name
sch.version = ns.version
break
return self
def find_type_source(self, name: str) -> SchemaAdapter:
"""
@ -182,7 +176,8 @@ class NamespacesAdapter(Adapter):
else:
raise KeyError(f"No schema found that define {name}")
def populate_imports(self) -> None:
@model_validator(mode="after")
def populate_imports(self) -> "NamespacesAdapter":
"""
Populate the imports that are needed for each schema file
@ -199,11 +194,7 @@ class NamespacesAdapter(Adapter):
if depends_on not in sch.imports:
sch.imports.append(depends_on)
# do so recursively
for imported in self.imported:
imported.populate_imports()
self._imports_populated = True
return self
def to_yaml(self, base_dir: Path) -> None:
"""
@ -266,10 +257,7 @@ class NamespacesAdapter(Adapter):
else:
ns = ns[0]
schema_names = []
for sch in ns.schema_:
if sch.source is not None:
schema_names.append(sch.source)
schema_names = [sch.source for sch in ns.schema_ if sch.source is not None]
return schema_names
def schema_namespace(self, name: str) -> Optional[str]:


@ -42,7 +42,8 @@ class SchemaAdapter(Adapter):
"""
The namespace.schema name for a single schema
"""
return ".".join([self.namespace, self.path.with_suffix("").name])
namespace = self.namespace if self.namespace is not None else ""
return ".".join([namespace, self.path.with_suffix("").name])
def __repr__(self):
out_str = "\n" + self.name + "\n"


@ -4,8 +4,10 @@ Manage the operation of nwb_linkml from environmental variables
import tempfile
from pathlib import Path
from typing import Literal, Optional
from pydantic import (
BaseModel,
DirectoryPath,
Field,
FieldValidationInfo,
@ -15,15 +17,68 @@ from pydantic import (
)
from pydantic_settings import BaseSettings, SettingsConfigDict
LOG_LEVELS = Literal["DEBUG", "INFO", "WARNING", "ERROR"]
class LogConfig(BaseModel):
"""
Configuration for logging
"""
level: LOG_LEVELS = "INFO"
"""
Severity of log messages to process.
"""
level_file: Optional[LOG_LEVELS] = None
"""
Severity for file-based logging. If unset, use ``level``
"""
level_stdout: Optional[LOG_LEVELS] = "WARNING"
"""
Severity for stream-based logging. If unset, use ``level``
"""
file_n: int = 5
"""
Number of log files to rotate through
"""
file_size: int = 2**22 # roughly 4MB
"""
Maximum size of log files (bytes)
"""
@field_validator("level", "level_file", "level_stdout", mode="before")
@classmethod
def uppercase_levels(cls, value: Optional[str] = None) -> Optional[str]:
"""
Ensure log level strings are uppercased
"""
if value is not None:
value = value.upper()
return value
@model_validator(mode="after")
def inherit_base_level(self) -> "LogConfig":
"""
If loglevels for specific output streams are unset, set from base :attr:`.level`
"""
levels = ("level_file", "level_stdout")
for level_name in levels:
if getattr(self, level_name) is None:
setattr(self, level_name, self.level)
return self
class Config(BaseSettings):
"""
Configuration for nwb_linkml, populated by default but can be overridden
by environment variables.
Nested models can be assigned from .env files with a __ (see examples)
Examples:
export NWB_LINKML_CACHE_DIR="/home/mycache/dir"
export NWB_LINKML_LOGS__LEVEL="debug"
"""
@ -32,6 +87,11 @@ class Config(BaseSettings):
default_factory=lambda: Path(tempfile.gettempdir()) / "nwb_linkml__cache",
description="Location to cache generated schema and models",
)
log_dir: Path = Field(
Path("logs"),
description="Location to store logs. If a relative directory, relative to ``cache_dir``",
)
logs: LogConfig = Field(LogConfig(), description="Log configuration")
@computed_field
@property
@ -62,6 +122,15 @@ class Config(BaseSettings):
assert v.exists()
return v
@model_validator(mode="after")
def log_dir_relative_to_cache_dir(self) -> "Config":
"""
If log dir is relative, put it beneath the cache_dir
"""
if not self.log_dir.is_absolute():
self.log_dir = self.cache_dir / self.log_dir
return self
@model_validator(mode="after")
def folders_exist(self) -> "Config":
"""

View file

@ -1,74 +1,48 @@
"""
Subclass of :class:`linkml.generators.PydanticGenerator`
customized to support NWB models.
The pydantic generator is a subclass of
- :class:`linkml.utils.generator.Generator`
- :class:`linkml.generators.oocodegen.OOCodeGenerator`
The default `__main__` method
- Instantiates the class
- Calls :meth:`~linkml.generators.PydanticGenerator.serialize`
The `serialize` method:
- Accepts an optional jinja-style template, otherwise it uses the default template
- Uses :class:`linkml_runtime.utils.schemaview.SchemaView` to interact with the schema
- Generates linkML Classes
- `generate_enums` runs first
.. note::
This module is heinous. We have mostly copied and pasted the existing :class:`linkml.generators.PydanticGenerator`
and overridden what we need to make this work for NWB, but the source is...
a little messy. We will be tidying this up and trying to pull changes upstream,
but for now this is just our hacky little secret.
See class and module docstrings for details :)
"""
# FIXME: Remove this after we refactor this generator
# ruff: noqa
import inspect
import pdb
import re
import sys
import warnings
from copy import copy
from dataclasses import dataclass, field
from pathlib import Path
from types import ModuleType
from typing import ClassVar, Dict, List, Optional, Tuple
from linkml.generators import PydanticGenerator
from linkml.generators.pydanticgen.array import ArrayRepresentation, NumpydanticArray
from linkml.generators.pydanticgen.build import ClassResult, SlotResult
from linkml.generators.pydanticgen.template import Import, Imports, PydanticModule
from linkml_runtime.linkml_model.meta import (
Annotation,
AnonymousSlotExpression,
ArrayExpression,
ClassDefinition,
ClassDefinitionName,
ElementName,
SchemaDefinition,
SlotDefinition,
SlotDefinitionName,
)
from linkml_runtime.utils.compile_python import file_text
from linkml_runtime.utils.formatutils import remove_empty_items
from linkml_runtime.utils.schemaview import SchemaView
from pydantic import BaseModel
from nwb_linkml.maps import flat_to_nptyping
from nwb_linkml.maps.naming import module_case, version_module_case
from nwb_linkml.includes.types import ModelTypeString, _get_name, NamedString, NamedImports
from nwb_linkml.includes.base import BASEMODEL_GETITEM
from nwb_linkml.includes.hdmf import (
DYNAMIC_TABLE_IMPORTS,
DYNAMIC_TABLE_INJECTS,
TSRVD_IMPORTS,
TSRVD_INJECTS,
)
from nwb_linkml.includes.types import ModelTypeString, NamedImports, NamedString, _get_name
OPTIONAL_PATTERN = re.compile(r"Optional\[([\w\.]*)\]")
@dataclass
class NWBPydanticGenerator(PydanticGenerator):
"""
Subclass of pydantic generator, custom behavior is in overridden lifecycle methods :)
"""
injected_fields: List[str] = (
(
@ -76,6 +50,7 @@ class NWBPydanticGenerator(PydanticGenerator):
' is stored in an NWB file")'
),
'object_id: Optional[str] = Field(None, description="Unique UUID for each object")',
BASEMODEL_GETITEM,
)
split: bool = True
imports: list[Import] = field(default_factory=lambda: [Import(module="numpy", alias="np")])
@ -95,7 +70,10 @@ class NWBPydanticGenerator(PydanticGenerator):
def _check_anyof(
self, s: SlotDefinition, sn: SlotDefinitionName, sv: SchemaView
) -> None: # pragma: no cover
"""
Overridden to allow `array` in any_of
"""
# Confirm that the original slot range (ignoring the default that comes in from
# induced_slot) isn't in addition to setting any_of
allowed_keys = ("array",)
@ -104,7 +82,7 @@ class NWBPydanticGenerator(PydanticGenerator):
allowed = True
for option in s.any_of:
items = remove_empty_items(option)
if not all([key in allowed_keys for key in items]):
allowed = False
if allowed:
return
@ -116,6 +94,14 @@ class NWBPydanticGenerator(PydanticGenerator):
if not base_range_subsumes_any_of:
raise ValueError("Slot cannot have both range and any_of defined")
def before_generate_slot(self, slot: SlotDefinition, sv: SchemaView) -> SlotDefinition:
"""
Force some properties to be optional
"""
if slot.name == "target" and "index" in slot.description:
slot.required = False
return slot
def after_generate_slot(self, slot: SlotResult, sv: SchemaView) -> SlotResult:
"""
- strip unwanted metadata
@ -127,7 +113,16 @@ class NWBPydanticGenerator(PydanticGenerator):
return slot
def after_generate_class(self, cls: ClassResult, sv: SchemaView) -> ClassResult:
"""Customize dynamictable behavior"""
cls = AfterGenerateClass.inject_dynamictable(cls)
cls = AfterGenerateClass.wrap_dynamictable_columns(cls, sv)
return cls
def before_render_template(self, template: PydanticModule, sv: SchemaView) -> PydanticModule:
"""
Remove source file from metadata
"""
if "source_file" in template.meta:
del template.meta["source_file"]
return template
@ -159,6 +154,9 @@ class AfterGenerateSlot:
@staticmethod
def skip_meta(slot: SlotResult, skip_meta: tuple[str]) -> SlotResult:
"""
Skip additional metadata slots
"""
for key in skip_meta:
if key in slot.attribute.meta:
del slot.attribute.meta[key]
@ -227,13 +225,91 @@ class AfterGenerateSlot:
return slot
class AfterGenerateClass:
"""
Container class for class-modification methods
"""
@staticmethod
def inject_dynamictable(cls: ClassResult) -> ClassResult:
"""
Modify dynamictable class bases and inject needed objects :)
Args:
cls: class build result whose bases, injected classes, and imports should be modified
Returns:
the modified ``ClassResult``
"""
if cls.cls.name == "DynamicTable":
cls.cls.bases = ["DynamicTableMixin"]
if cls.injected_classes is None:
cls.injected_classes = DYNAMIC_TABLE_INJECTS.copy()
else:
cls.injected_classes.extend(DYNAMIC_TABLE_INJECTS.copy())
if isinstance(cls.imports, Imports):
cls.imports += DYNAMIC_TABLE_IMPORTS
elif isinstance(cls.imports, list):
cls.imports = Imports(imports=cls.imports) + DYNAMIC_TABLE_IMPORTS
else:
cls.imports = DYNAMIC_TABLE_IMPORTS.model_copy()
elif cls.cls.name == "VectorData":
cls.cls.bases = ["VectorDataMixin"]
elif cls.cls.name == "VectorIndex":
cls.cls.bases = ["VectorIndexMixin"]
elif cls.cls.name == "DynamicTableRegion":
cls.cls.bases = ["DynamicTableRegionMixin", "VectorData"]
elif cls.cls.name == "AlignedDynamicTable":
cls.cls.bases = ["AlignedDynamicTableMixin", "DynamicTable"]
elif cls.cls.name == "TimeSeriesReferenceVectorData":
# in core.nwb.base, so need to inject and import again
cls.cls.bases = ["TimeSeriesReferenceVectorDataMixin", "VectorData"]
if cls.injected_classes is None:
cls.injected_classes = TSRVD_INJECTS.copy()
else:
cls.injected_classes.extend(TSRVD_INJECTS.copy())
if isinstance(cls.imports, Imports):
cls.imports += TSRVD_IMPORTS
elif isinstance(cls.imports, list):
cls.imports = Imports(imports=cls.imports) + TSRVD_IMPORTS
else:
cls.imports = TSRVD_IMPORTS.model_copy()
return cls
@staticmethod
def wrap_dynamictable_columns(cls: ClassResult, sv: SchemaView) -> ClassResult:
"""
Wrap NDArray columns inside of dynamictables with ``VectorData`` or
``VectorIndex``, which are generic classes whose value slot is
parameterized by the NDArray
"""
if cls.source.is_a == "DynamicTable" or "DynamicTable" in sv.class_ancestors(
cls.source.name
):
for an_attr in cls.cls.attributes:
if "NDArray" in (slot_range := cls.cls.attributes[an_attr].range):
if an_attr.endswith("_index"):
cls.cls.attributes[an_attr].range = "".join(
["VectorIndex[", slot_range, "]"]
)
else:
cls.cls.attributes[an_attr].range = "".join(
["VectorData[", slot_range, "]"]
)
return cls
def compile_python(
text_or_fn: str, package_path: Path = None, module_name: str = "test"
) -> ModuleType:
"""
Compile the text or file and return the resulting module
@param text_or_fn: Python text or file name that references python file
@param package_path: Root package path. If omitted and we've got a python file,
the package is the containing
directory
@return: Compiled module
"""

View file

@ -0,0 +1,14 @@
"""
Modifications to the ConfiguredBaseModel used by all generated classes
"""
BASEMODEL_GETITEM = """
def __getitem__(self, val: Union[int, slice]) -> Any:
\"\"\"Try and get a value from value or "data" if we have it\"\"\"
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
"""
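For illustration, a standalone sketch of what the injected `__getitem__` does (plain Python, no pydantic; the `Container` class here is a hypothetical stand-in for a generated model):

```python
# Hypothetical stand-in for a generated model: prefer `value`, fall back
# to `data`, otherwise raise KeyError, mirroring BASEMODEL_GETITEM above.
class Container:
    def __init__(self, value=None, data=None):
        self.value = value
        self.data = data

    def __getitem__(self, val):
        if self.value is not None:
            return self.value[val]
        if self.data is not None:
            return self.data[val]
        raise KeyError("No value or data field to index from")

print(Container(value=[1, 2, 3])[0])  # 1
print(Container(data=[4, 5])[1])      # 5
```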

View file

@ -2,38 +2,913 @@
Special types for mimicking HDMF special case behavior
"""
import sys
from typing import (
TYPE_CHECKING,
Any,
ClassVar,
Dict,
Generic,
Iterable,
List,
Optional,
Tuple,
TypeVar,
Union,
overload,
)
import numpy as np
import pandas as pd
from linkml.generators.pydanticgen.template import Import, Imports, ObjectImport
from numpydantic import NDArray, Shape
from pydantic import (
BaseModel,
ConfigDict,
Field,
ValidationError,
ValidationInfo,
ValidatorFunctionWrapHandler,
field_validator,
model_validator,
)
if TYPE_CHECKING: # pragma: no cover
from nwb_linkml.models import VectorData, VectorIndex
T = TypeVar("T", bound=NDArray)
T_INJECT = 'T = TypeVar("T", bound=NDArray)'
class DynamicTableMixin(BaseModel):
"""
Mixin to make DynamicTable subclasses behave like tables/dataframes
Mimicking some of the behavior from :class:`hdmf.common.table.DynamicTable`
but simplifying along the way :)
"""
model_config = ConfigDict(extra="allow", validate_assignment=True)
__pydantic_extra__: Dict[str, Union["VectorDataMixin", "VectorIndexMixin", "NDArray", list]]
NON_COLUMN_FIELDS: ClassVar[tuple[str]] = (
"id",
"name",
"colnames",
"description",
)
# @model_validator(mode='after')
# def ensure_equal_length(cls, model: 'DynamicTableMixin') -> 'DynamicTableMixin':
# """
# Ensure all vectors are of equal length
# """
# raise NotImplementedError('TODO')
#
# @model_validator(mode="after")
# def create_index_backrefs(cls, model: 'DynamicTableMixin') -> 'DynamicTableMixin':
# """
# Ensure that vectordata with vectorindexes know about them
# """
# raise NotImplementedError('TODO')
# overridden by subclass but implemented here for testing and typechecking purposes :)
colnames: List[str] = Field(default_factory=list)
id: Optional[NDArray[Shape["* num_rows"], int]] = None
@property
def _columns(self) -> Dict[str, Union[list, "NDArray", "VectorDataMixin"]]:
return {k: getattr(self, k) for k in self.colnames}
@overload
def __getitem__(self, item: str) -> Union[list, "NDArray", "VectorDataMixin"]: ...
@overload
def __getitem__(self, item: int) -> pd.DataFrame: ...
@overload
def __getitem__(self, item: Tuple[int, Union[int, str]]) -> Any: ...
@overload
def __getitem__(self, item: Tuple[Union[int, slice], ...]) -> Union[
pd.DataFrame,
list,
"NDArray",
"VectorDataMixin",
]: ...
@overload
def __getitem__(self, item: Union[slice, "NDArray"]) -> pd.DataFrame: ...
def __getitem__(
self,
item: Union[
str,
int,
slice,
"NDArray",
Tuple[int, Union[int, str]],
Tuple[Union[int, slice], ...],
],
) -> Any:
"""
Get an item from the table
If item is...
- ``str`` : get the column with this name
- ``int`` : get the row at this index
- ``tuple[int, int]`` : get a specific cell value, e.g. (0,1) gets the 0th row and 1st column
- ``tuple[int, str]`` : get a specific cell value, e.g. (0, 'colname')
gets the 0th row from ``colname``
- ``tuple[int | slice, int | slice]`` : get a range of cells from a range of columns.
returns as a :class:`pandas.DataFrame`
"""
if isinstance(item, str):
return self._columns[item]
if isinstance(item, (int, slice, np.integer, np.ndarray)):
data = self._slice_range(item)
index = self.id[item]
elif isinstance(item, tuple):
if len(item) != 2:
raise ValueError(
"DynamicTables are 2-dimensional, can't index with more than 2 indices like"
f" {item}"
)
# all other cases are tuples of (rows, cols)
rows, cols = item
if isinstance(cols, (int, slice, np.integer)):
cols = self.colnames[cols]
if isinstance(rows, int) and isinstance(cols, str):
# single scalar value
return self._columns[cols][rows]
data = self._slice_range(rows, cols)
index = self.id[rows]
else:
raise ValueError(f"Unsure how to get item with key {item}")
# cast to DF
if not isinstance(index, Iterable):
index = [index]
index = pd.Index(data=index)
return pd.DataFrame(data, index=index)
def _slice_range(
self, rows: Union[int, slice, np.ndarray], cols: Optional[Union[str, List[str]]] = None
) -> Dict[str, Union[list, "NDArray", "VectorData"]]:
if cols is None:
cols = self.colnames
elif isinstance(cols, str):
cols = [cols]
data = {}
for k in cols:
if isinstance(rows, np.ndarray):
# help wanted - this is probably cr*zy slow
val = [self._columns[k][i] for i in rows]
else:
val = self._columns[k][rows]
# scalars need to be wrapped in series for pandas
# do this by the iterability of the rows index not the value because
# we want all lengths from this method to be equal, and if the rows are
# scalar, that means length == 1
if not isinstance(rows, (Iterable, slice)):
val = [val]
data[k] = val
return data
def __setitem__(self, key: str, value: Any) -> None:
raise NotImplementedError("TODO") # pragma: no cover
def __setattr__(self, key: str, value: Union[list, "NDArray", "VectorData"]):
"""
Add a column, appending it to ``colnames``
"""
# don't use this while building the model
if not getattr(self, "__pydantic_complete__", False): # pragma: no cover
return super().__setattr__(key, value)
if key not in self.model_fields_set and not key.endswith("_index"):
self.colnames.append(key)
# we get a recursion error if we setattr without having first added to
# extras if we need it to be there
if key not in self.model_fields and key not in self.__pydantic_extra__:
self.__pydantic_extra__[key] = value
return super().__setattr__(key, value)
def __getattr__(self, item: str) -> Any:
"""Try and use pandas df attrs if we don't have them"""
try:
return BaseModel.__getattr__(self, item)
except AttributeError as e:
try:
return getattr(self[:, :], item)
except AttributeError:
raise e from None
def __len__(self) -> int:
"""
Use the id column to determine length.
If the id column doesn't represent length accurately, it's a bug
"""
return len(self.id)
@model_validator(mode="before")
@classmethod
def create_id(cls, model: Dict[str, Any]) -> Dict:
"""
Create ID column if not provided
"""
if not isinstance(model, dict):
return model
if "id" not in model:
lengths = []
for key, val in model.items():
# don't get lengths of columns with an index
if (
f"{key}_index" in model
or (isinstance(val, VectorData) and val._index)
or key in cls.NON_COLUMN_FIELDS
):
continue
lengths.append(len(val))
model["id"] = np.arange(np.max(lengths))
return model
@model_validator(mode="before")
@classmethod
def create_colnames(cls, model: Dict[str, Any]) -> Dict:
"""
Construct colnames from arguments.
The model dict preserves insertion order (guaranteed since Python 3.7), so we can use that,
minus anything in :attr:`.NON_COLUMN_FIELDS`, to infer column order from argument order
"""
if not isinstance(model, dict):
return model
if "colnames" not in model:
colnames = [
k
for k in model
if k not in cls.NON_COLUMN_FIELDS
and not k.endswith("_index")
and not isinstance(model[k], VectorIndexMixin)
]
model["colnames"] = colnames
else:
# add any columns not explicitly given an order at the end
colnames = model["colnames"].copy()
colnames.extend(
[
k
for k in model
if k not in cls.NON_COLUMN_FIELDS
and not k.endswith("_index")
and k not in model["colnames"]
and not isinstance(model[k], VectorIndexMixin)
]
)
model["colnames"] = colnames
return model
@model_validator(mode="before")
@classmethod
def cast_extra_columns(cls, model: Dict[str, Any]) -> Dict:
"""
If extra columns are passed as just lists or arrays, cast to VectorData
before we resolve targets for VectorData and VectorIndex pairs.
See :meth:`.cast_specified_columns` for handling columns in the class specification
"""
# if columns are not in the specification, cast to a generic VectorData
if isinstance(model, dict):
for key, val in model.items():
if key in cls.model_fields:
continue
if not isinstance(val, (VectorData, VectorIndex)):
try:
if key.endswith("_index"):
model[key] = VectorIndex(name=key, description="", value=val)
else:
model[key] = VectorData(name=key, description="", value=val)
except ValidationError as e: # pragma: no cover
raise ValidationError(
f"field {key} cannot be cast to VectorData from {val}"
) from e
return model
@model_validator(mode="after")
def resolve_targets(self) -> "DynamicTableMixin":
"""
Ensure that any implicitly indexed columns are linked, and create backlinks
"""
for key, col in self._columns.items():
if isinstance(col, VectorData):
# find an index
idx = None
for field_name in self.model_fields_set:
if field_name in self.NON_COLUMN_FIELDS or field_name == key:
continue
# implicit name-based index
field = getattr(self, field_name)
if isinstance(field, VectorIndex) and (
field_name == f"{key}_index" or field.target is col
):
idx = field
break
if idx is not None:
col._index = idx
idx.target = col
return self
@model_validator(mode="after")
def ensure_equal_length_cols(self) -> "DynamicTableMixin":
"""
Ensure that all columns are equal length
"""
lengths = [len(v) for v in self._columns.values()] + [len(self.id)]
assert all([length == lengths[0] for length in lengths]), (
"Columns are not of equal length! "
f"Got colnames:\n{self.colnames}\nand lengths: {lengths}"
)
return self
@field_validator("*", mode="wrap")
@classmethod
def cast_specified_columns(
cls, val: Any, handler: ValidatorFunctionWrapHandler, info: ValidationInfo
) -> Any:
"""
If columns *in* the model specification are supplied as arrays,
try casting them to the type before validating.
Columns that are not in the spec are handled separately in
:meth:`.cast_extra_columns`
"""
try:
return handler(val)
except ValidationError as e:
annotation = cls.model_fields[info.field_name].annotation
if type(annotation).__name__ == "_UnionGenericAlias":
annotation = annotation.__args__[0]
try:
# should pass if we're supposed to be a VectorData column
# don't want to override intention here by insisting that it is
# *actually* a VectorData column in case an NDArray has been specified for now
return handler(
annotation(
val,
name=info.field_name,
description=cls.model_fields[info.field_name].description,
)
)
except Exception:
raise e from None
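To make the `__getitem__` dispatch above concrete, here is a minimal toy sketch of the same `str` / `(row, col)` / `int` indexing scheme using plain dicts (illustrative only; the real mixin returns a `pandas.DataFrame` for row access, this toy returns a dict):

```python
# Toy version of the DynamicTableMixin.__getitem__ dispatch:
# str -> whole column, (row, col) -> single cell, int -> one row.
columns = {"a": [1, 2, 3], "b": ["x", "y", "z"]}

def table_get(item):
    if isinstance(item, str):
        return columns[item]                          # column by name
    if isinstance(item, tuple):
        row, col = item
        return columns[col][row]                      # scalar cell
    return {k: v[item] for k, v in columns.items()}   # row as dict

print(table_get("a"))       # [1, 2, 3]
print(table_get((0, "b")))  # x
print(table_get(1))         # {'a': 2, 'b': 'y'}
```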
class VectorDataMixin(BaseModel, Generic[T]):
"""
Mixin class to give VectorData indexing abilities
"""
_index: Optional["VectorIndex"] = None
# redefined in `VectorData`, but included here for testing and type checking
value: Optional[T] = None
def __init__(self, value: Optional[NDArray] = None, **kwargs):
if value is not None and "value" not in kwargs:
kwargs["value"] = value
super().__init__(**kwargs)
def __getitem__(self, item: Union[str, int, slice, Tuple[Union[str, int, slice], ...]]) -> Any:
if self._index:
# Following hdmf, VectorIndex is the thing that knows how to do the slicing
return self._index[item]
else:
return self.value[item]
def __setitem__(self, key: Union[int, str, slice], value: Any) -> None:
if self._index:
# Following hdmf, VectorIndex is the thing that knows how to do the slicing
self._index[key] = value
else:
self.value[key] = value
def __getattr__(self, item: str) -> Any:
"""
Forward getattr to ``value``
"""
try:
return BaseModel.__getattr__(self, item)
except AttributeError as e:
try:
return getattr(self.value, item)
except AttributeError:
raise e from None
def __len__(self) -> int:
"""
Use index as length, if present
"""
if self._index:
return len(self._index)
else:
return len(self.value)
class VectorIndexMixin(BaseModel, Generic[T]):
"""
Mixin class to give VectorIndex indexing abilities
"""
# redefined in `VectorData`, but included here for testing and type checking
value: Optional[T] = None
target: Optional["VectorData"] = None
def __init__(self, value: Optional[NDArray] = None, **kwargs):
if value is not None and "value" not in kwargs:
kwargs["value"] = value
super().__init__(**kwargs)
def _slice(self, arg: int) -> slice:
"""
Mimicking :func:`hdmf.common.table.VectorIndex.__getitem_helper`
"""
start = 0 if arg == 0 else self.value[arg - 1]
end = self.value[arg]
return slice(start, end)
def __getitem__(self, item: Union[int, slice, Iterable]) -> Any:
if self.target is None:
return self.value[item]
else:
if isinstance(item, (int, np.integer)):
return self.target.value[self._slice(item)]
elif isinstance(item, (slice, Iterable)):
if isinstance(item, slice):
item = range(*item.indices(len(self.value)))
return [self.target.value[self._slice(i)] for i in item]
else: # pragma: no cover
raise AttributeError(f"Could not index with {item}")
def __setitem__(self, key: Union[int, slice], value: Any) -> None:
"""
Set a value on the :attr:`.target` .
.. note::
Even though we correct the indexing logic from HDMF where the
_data_ is the thing that is provided by the API when one accesses
table.data (rather than table.data_index as hdmf does),
we will set to the target here (rather than to the index)
to be consistent. To modify the index, modify `self.value` directly
"""
if self.target:
if isinstance(key, (int, np.integer)):
self.target.value[self._slice(key)] = value
elif isinstance(key, (slice, Iterable)):
if isinstance(key, slice):
key = range(*key.indices(len(self.value)))
if isinstance(value, Iterable):
if len(key) != len(value):
raise ValueError(
"Can only assign equal-length iterable to a slice, manually index the"
" ragged values of the target VectorData object if you need more"
" control"
)
for i, subval in zip(key, value):
self.target.value[self._slice(i)] = subval
else:
for i in key:
self.target.value[self._slice(i)] = value
else: # pragma: no cover
raise AttributeError(f"Could not index with {key}")
else:
self.value[key] = value
def __getattr__(self, item: str) -> Any:
"""
Forward getattr to ``value``
"""
try:
return BaseModel.__getattr__(self, item)
except AttributeError as e:
try:
return getattr(self.value, item)
except AttributeError:
raise e from None
def __len__(self) -> int:
"""
Get length from value
"""
return len(self.value)
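The ragged-array scheme implemented by `_slice` above can be sketched standalone: each entry of the index is the exclusive end offset of its row in the flat data array (toy names and plain lists, not the pydantic mixin):

```python
# index[i] is the exclusive end offset of row i in the flat `data` list,
# so row i spans data[index[i-1]:index[i]] (with a start of 0 for row 0).
def ragged_get(index, data, i):
    start = 0 if i == 0 else index[i - 1]
    return data[start:index[i]]

index = [2, 5, 6]
data = [10, 11, 20, 21, 22, 30]
rows = [ragged_get(index, data, i) for i in range(len(index))]
print(rows)  # [[10, 11], [20, 21, 22], [30]]
```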
class DynamicTableRegionMixin(BaseModel):
"""
Mixin to allow indexing references to regions of dynamictables
"""
_index: Optional["VectorIndex"] = None
table: "DynamicTableMixin"
value: Optional[NDArray[Shape["*"], int]] = None
@overload
def __getitem__(self, item: int) -> pd.DataFrame: ...
@overload
def __getitem__(self, item: Union[slice, Iterable]) -> List[pd.DataFrame]: ...
def __getitem__(
self, item: Union[int, slice, Iterable]
) -> Union[pd.DataFrame, List[pd.DataFrame]]:
"""
Use ``value`` to index the table. Works analogously to ``VectorIndex`` despite
this being a subclass of ``VectorData``
"""
if self._index:
if isinstance(item, (int, np.integer)):
# index returns an array of indices,
# and indexing table with an array returns a list of rows
return self.table[self._index[item]]
elif isinstance(item, slice):
# index returns a list of arrays of indices,
# so we index table with an array to construct
# a list of lists of rows
return [self.table[idx] for idx in self._index[item]]
else: # pragma: no cover
raise ValueError(f"Don't know how to index with {item}, need an int or a slice")
else:
if isinstance(item, (int, np.integer)):
return self.table[self.value[item]]
elif isinstance(item, (slice, Iterable)):
# Return a list of dataframe rows because this is most often used
# as a column in a DynamicTable, so while it would normally be
# ideal to just return the slice as above as a single df,
# we need each row to be separate to fill the column
if isinstance(item, slice):
item = range(*item.indices(len(self.value)))
return [self.table[self.value[i]] for i in item]
else: # pragma: no cover
raise ValueError(f"Don't know how to index with {item}, need an int or a slice")
def __setitem__(self, key: Union[int, str, slice], value: Any) -> None:
# self.table[self.value[key]] = value
raise NotImplementedError(
"Assigning values to tables is not implemented yet!"
) # pragma: no cover
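Conceptually, a region column just stores row indices into another table, and indexing the region resolves those rows; a minimal sketch (rows as plain dicts, names hypothetical):

```python
# `region` plays the role of DynamicTableRegion.value: row indices into a
# target table. region[i] resolves to the referenced row of that table.
table_rows = [{"x": 10}, {"x": 20}, {"x": 30}]
region = [2, 0]

print(table_rows[region[0]])                 # {'x': 30}
print([table_rows[i]["x"] for i in region])  # [30, 10]
```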
class AlignedDynamicTableMixin(BaseModel):
"""
Mixin to allow indexing multiple tables that are aligned on a common ID
There is a great deal of code duplication here because we need to avoid diamond inheritance,
and it is not easy to copy a pydantic validator method.
"""
model_config = ConfigDict(extra="allow", validate_assignment=True)
__pydantic_extra__: Dict[str, Union["DynamicTableMixin", "VectorDataMixin", "VectorIndexMixin"]]
NON_CATEGORY_FIELDS: ClassVar[tuple[str]] = (
"name",
"categories",
"colnames",
"description",
)
name: str = "aligned_table"
categories: List[str] = Field(default_factory=list)
id: Optional[NDArray[Shape["* num_rows"], int]] = None
@property
def _categories(self) -> Dict[str, "DynamicTableMixin"]:
return {k: getattr(self, k) for k in self.categories}
def __getitem__(
self, item: Union[int, str, slice, NDArray[Shape["*"], int], Tuple[Union[int, slice], str]]
) -> pd.DataFrame:
"""
Mimic hdmf:
https://github.com/hdmf-dev/hdmf/blob/dev/src/hdmf/common/alignedtable.py#L261
Args:
item:
Returns:
"""
if isinstance(item, str):
# get a single table
return self._categories[item][:]
elif isinstance(item, tuple) and len(item) == 2 and isinstance(item[1], str):
# get a slice of a single table
return self._categories[item[1]][item[0]]
elif isinstance(item, (int, slice, Iterable)):
# get a slice of all the tables
ids = self.id[item]
if not isinstance(ids, Iterable):
ids = pd.Series([ids])
ids = pd.DataFrame({"id": ids})
tables = [ids]
for category_name, category in self._categories.items():
table = category[item]
if isinstance(table, pd.DataFrame):
table = table.reset_index()
elif isinstance(table, np.ndarray):
table = pd.DataFrame({category_name: [table]})
elif isinstance(table, Iterable):
table = pd.DataFrame({category_name: table})
else:
raise ValueError(
f"Don't know how to construct category table for {category_name}"
)
tables.append(table)
names = [self.name] + self.categories
# construct below in case we need to support array indexing in the future
else:
raise ValueError(
f"Don't know how to index with {item}, "
"need an int, string, slice, ndarray, or tuple[int | slice, str]"
)
df = pd.concat(tables, axis=1, keys=names)
df.set_index((self.name, "id"), drop=True, inplace=True)
return df
def __getattr__(self, item: str) -> Any:
"""Try and use pandas df attrs if we don't have them"""
try:
return BaseModel.__getattr__(self, item)
except AttributeError as e:
try:
return getattr(self[:], item)
except AttributeError:
raise e from None
def __len__(self) -> int:
"""
Use the id column to determine length.
If the id column doesn't represent length accurately, it's a bug
"""
return len(self.id)
@model_validator(mode="before")
@classmethod
def create_id(cls, model: Dict[str, Any]) -> Dict:
"""
Create ID column if not provided
"""
if "id" not in model:
lengths = []
for key, val in model.items():
# don't get lengths of columns with an index
if (
f"{key}_index" in model
or (isinstance(val, VectorData) and val._index)
or key in cls.NON_CATEGORY_FIELDS
):
continue
lengths.append(len(val))
model["id"] = np.arange(np.max(lengths))
return model
@model_validator(mode="before")
@classmethod
def create_categories(cls, model: Dict[str, Any]) -> Dict:
"""
Construct categories from arguments.
The model dict preserves insertion order (guaranteed since Python 3.7), so we can use that,
minus anything in :attr:`.NON_CATEGORY_FIELDS`, to infer category order from argument order
"""
if "categories" not in model:
categories = [
k for k in model if k not in cls.NON_CATEGORY_FIELDS and not k.endswith("_index")
]
model["categories"] = categories
else:
# add any columns not explicitly given an order at the end
categories = [
k
for k in model
if k not in cls.NON_CATEGORY_FIELDS
and not k.endswith("_index")
and k not in model["categories"]
]
model["categories"].extend(categories)
return model
@model_validator(mode="after")
def resolve_targets(self) -> "DynamicTableMixin":
"""
Ensure that any implicitly indexed columns are linked, and create backlinks
"""
for key, col in self._categories.items():
if isinstance(col, VectorData):
# find an index
idx = None
for field_name in self.model_fields_set:
if field_name in self.NON_CATEGORY_FIELDS or field_name == key:
continue
# implicit name-based index
field = getattr(self, field_name)
if isinstance(field, VectorIndex) and (
field_name == f"{key}_index" or field.target is col
):
idx = field
break
if idx is not None:
col._index = idx
idx.target = col
return self
@model_validator(mode="after")
def ensure_equal_length_cols(self) -> "DynamicTableMixin":
"""
Ensure that all columns are equal length
"""
lengths = [len(v) for v in self._categories.values()] + [len(self.id)]
assert all([length == lengths[0] for length in lengths]), (
"Columns are not of equal length! "
f"Got colnames:\n{self.categories}\nand lengths: {lengths}"
)
return self
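The `pd.concat(..., keys=...)` trick used in `__getitem__` above builds one DataFrame with a MultiIndex column level per category; a minimal sketch with made-up category data:

```python
import pandas as pd

# One id table plus one category table, concatenated side by side with
# `keys` so each category gets its own top-level column label, then the
# id column becomes the index (as in AlignedDynamicTableMixin.__getitem__).
ids = pd.DataFrame({"id": [0, 1]})
cat_a = pd.DataFrame({"x": [1, 2]})

df = pd.concat([ids, cat_a], axis=1, keys=["aligned_table", "a"])
df = df.set_index(("aligned_table", "id"), drop=True)

print(list(df.columns))       # [('a', 'x')]
print(df.loc[0, ("a", "x")])  # 1
```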
class TimeSeriesReferenceVectorDataMixin(VectorDataMixin):
"""
Mixin class for TimeSeriesReferenceVectorData -
very simple, just indexing the given timeseries object.
These shouldn't have additional fields in them, just the three columns
for index, span, and timeseries
"""
idx_start: NDArray[Shape["*"], int]
count: NDArray[Shape["*"], int]
timeseries: NDArray
@model_validator(mode="after")
def ensure_equal_length(self) -> "TimeSeriesReferenceVectorDataMixin":
"""
Each of the three indexing columns must be the same length to work!
"""
assert len(self.idx_start) == len(self.timeseries) == len(self.count), (
f"Columns have differing lengths: idx: {len(self.idx_start)}, count: {len(self.count)},"
f" timeseries: {len(self.timeseries)}"
)
return self
def __len__(self) -> int:
"""Since we have ensured equal length, just return idx_start"""
return len(self.idx_start)
@overload
def _slice_helper(self, item: int) -> slice: ...
@overload
def _slice_helper(self, item: slice) -> List[slice]: ...
def _slice_helper(self, item: Union[int, slice]) -> Union[slice, List[slice]]:
if isinstance(item, (int, np.integer)):
return slice(self.idx_start[item], self.idx_start[item] + self.count[item])
else:
starts = self.idx_start[item]
ends = starts + self.count[item]
return [slice(start, end) for start, end in zip(starts, ends)]
def __getitem__(self, item: Union[int, slice, Iterable]) -> Any:
if self._index is not None:
raise NotImplementedError(
"VectorIndexing with TimeSeriesReferenceVectorData is not supported because it is"
" never done in the core schema."
)
if isinstance(item, (int, np.integer)):
return self.timeseries[item][self._slice_helper(item)]
elif isinstance(item, (slice, Iterable)):
if isinstance(item, slice):
item = range(*item.indices(len(self.idx_start)))
return [self.timeseries[subitem][self._slice_helper(subitem)] for subitem in item]
else:
raise ValueError(
f"Don't know how to index with {item}, must be an int, slice, or iterable"
)
def __setitem__(self, key: Union[int, slice, Iterable], value: Any) -> None:
if self._index is not None:
raise NotImplementedError(
"VectorIndexing with TimeSeriesReferenceVectorData is not supported because it is"
" never done in the core schema."
)
if isinstance(key, (int, np.integer)):
self.timeseries[key][self._slice_helper(key)] = value
elif isinstance(key, (slice, Iterable)):
if isinstance(key, slice):
key = range(*key.indices(len(self.idx_start)))
if isinstance(value, Iterable):
if len(key) != len(value):
raise ValueError(
"Can only assign equal-length iterable to a slice, manually index the"
" target Timeseries object if you need more control"
)
for subitem, subvalue in zip(key, value):
self.timeseries[subitem][self._slice_helper(subitem)] = subvalue
else:
for subitem in key:
self.timeseries[subitem][self._slice_helper(subitem)] = value
else:
raise ValueError(
f"Don't know how to index with {key}, must be an int, slice, or iterable"
)
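The `(idx_start, count)` region encoding that `_slice_helper` and `__getitem__` implement can be restated without pydantic or numpy. This is a minimal sketch with assumed data, plain lists in place of `NDArray` columns: row `i` selects `timeseries[i][idx_start[i] : idx_start[i] + count[i]]`.

```python
# Minimal sketch (assumed data) of the region encoding used by
# TimeSeriesReferenceVectorDataMixin: row i covers
# timeseries[i][idx_start[i] : idx_start[i] + count[i]].
idx_start = [0, 2, 5]
count = [2, 3, 1]
timeseries = [
    [10, 11, 12],                  # row 0 -> slice [0:2]
    [20, 21, 22, 23, 24],          # row 1 -> slice [2:5]
    [30, 31, 32, 33, 34, 35, 36],  # row 2 -> slice [5:6]
]

def select(i: int) -> list:
    """Equivalent of __getitem__ with an integer index."""
    return timeseries[i][idx_start[i] : idx_start[i] + count[i]]
```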
DYNAMIC_TABLE_IMPORTS = Imports(
imports=[
Import(module="pandas", alias="pd"),
Import(
module="typing",
objects=[
ObjectImport(name="ClassVar"),
ObjectImport(name="Generic"),
ObjectImport(name="Iterable"),
ObjectImport(name="Tuple"),
ObjectImport(name="TypeVar"),
ObjectImport(name="overload"),
],
),
Import(
module="numpydantic", objects=[ObjectImport(name="NDArray"), ObjectImport(name="Shape")]
),
Import(
module="pydantic",
objects=[
ObjectImport(name="model_validator"),
ObjectImport(name="field_validator"),
ObjectImport(name="ValidationInfo"),
ObjectImport(name="ValidatorFunctionWrapHandler"),
ObjectImport(name="ValidationError"),
],
),
Import(module="numpy", alias="np"),
]
)
"""
Imports required for the dynamic table mixin
VectorData is purposefully excluded as an import or an inject so that it will be
resolved to the VectorData definition in the generated module
"""
DYNAMIC_TABLE_INJECTS = [
T_INJECT,
VectorDataMixin,
VectorIndexMixin,
DynamicTableRegionMixin,
DynamicTableMixin,
AlignedDynamicTableMixin,
]
TSRVD_IMPORTS = Imports(
imports=[
Import(
module="typing",
objects=[
ObjectImport(name="Generic"),
ObjectImport(name="Iterable"),
ObjectImport(name="Tuple"),
ObjectImport(name="TypeVar"),
ObjectImport(name="overload"),
],
),
Import(module="pydantic", objects=[ObjectImport(name="model_validator")]),
]
)
"""Imports for TimeSeriesReferenceVectorData"""
TSRVD_INJECTS = [T_INJECT, VectorDataMixin, TimeSeriesReferenceVectorDataMixin]
if "pytest" in sys.modules:
# during testing define concrete subclasses...
class VectorData(VectorDataMixin):
"""VectorData subclass for testing"""
pass
class VectorIndex(VectorIndexMixin):
"""VectorIndex subclass for testing"""
pass
class DynamicTableRegion(DynamicTableRegionMixin, VectorData):
"""DynamicTableRegion subclass for testing"""
pass
class TimeSeriesReferenceVectorData(TimeSeriesReferenceVectorDataMixin):
"""TimeSeriesReferenceVectorData subclass for testing"""
pass


@ -19,7 +19,7 @@ ModelTypeString = """ModelType = TypeVar("ModelType", bound=Type[BaseModel])"""
def _get_name(item: ModelType | dict, info: ValidationInfo) -> Union[ModelType, dict]:
"""Get the name of the slot that refers to this object"""
assert isinstance(item, (BaseModel, dict))
assert isinstance(item, (BaseModel, dict)), f"{item} was not a BaseModel or a dict!"
name = info.field_name
if isinstance(item, BaseModel):
item.name = name


@ -242,10 +242,7 @@ def find_references(h5f: h5py.File, path: str) -> List[str]:
def _find_references(name: str, obj: h5py.Group | h5py.Dataset) -> None:
pbar.update()
refs = []
for attr in obj.attrs.values():
if isinstance(attr, h5py.h5r.Reference):
refs.append(attr)
refs = [attr for attr in obj.attrs.values() if isinstance(attr, h5py.h5r.Reference)]
if isinstance(obj, h5py.Dataset):
# dataset is all references


@ -2,6 +2,7 @@
Loading/saving NWB Schema yaml files
"""
import warnings
from pathlib import Path
from pprint import pprint
from typing import Optional
@ -70,6 +71,7 @@ def load_namespace_adapter(
namespace: Path | NamespaceRepo | Namespaces,
path: Optional[Path] = None,
version: Optional[str] = None,
imported: Optional[list[NamespacesAdapter]] = None,
) -> NamespacesAdapter:
"""
Load all schema referenced by a namespace file
@ -81,6 +83,8 @@ def load_namespace_adapter(
version (str): Optional: tag or commit to check out namespace is a
:class:`.NamespaceRepo`. If ``None``, use ``HEAD`` if not already checked out,
or otherwise use whatever version is already checked out.
imported (list[:class:`.NamespacesAdapter`]): Optional: override discovered imports
with already-loaded namespaces adapters
Returns:
:class:`.NamespacesAdapter`
@ -110,17 +114,56 @@ def load_namespace_adapter(
for ns in namespaces.namespaces:
for schema in ns.schema_:
if schema.source is None:
# this is normal, we'll resolve later
if imported is None and schema.namespace == "hdmf-common" and ns.name == "core":
# special case - hdmf-common is imported by name without location or version,
# so to get the correct version we have to handle it separately
imported = _resolve_hdmf(namespace, path)
if imported is not None:
imported = [imported]
else:
continue
else:
yml_file = (path / schema.source).resolve()
sch.append(load_schema_file(yml_file))
if imported is not None:
adapter = NamespacesAdapter(namespaces=namespaces, schemas=sch, imported=imported)
else:
adapter = NamespacesAdapter(namespaces=namespaces, schemas=sch)
return adapter
def load_nwb_core(core_version: str = "2.7.0", hdmf_version: str = "1.8.0") -> NamespacesAdapter:
def _resolve_hdmf(
namespace: Path | NamespaceRepo | Namespaces, path: Optional[Path] = None
) -> Optional[NamespacesAdapter]:
if path is None and isinstance(namespace, Namespaces):
# can't get any more information from already-loaded namespaces without a path
return None
if isinstance(namespace, NamespaceRepo):
# easiest route is if we got a NamespaceRepo
if namespace.name == "core":
hdmf_path = (path / namespace.imports["hdmf-common"]).resolve()
return load_namespace_adapter(namespace=hdmf_path)
# otherwise this is the hdmf-common adapter itself, which loads hdmf-common directly
else:
return None
elif path is not None:
# otherwise try and get it from relative paths
# pretty much a hack, but we are compensating for the absence of a versioning system here
maybe_repo_root = path / NWB_CORE_REPO.imports["hdmf-common"]
if maybe_repo_root.exists():
return load_namespace_adapter(namespace=maybe_repo_root)
warnings.warn(
f"Could not locate hdmf-common from namespace {namespace} and path {path}", stacklevel=1
)
return None
def load_nwb_core(
core_version: str = "2.7.0", hdmf_version: str = "1.8.0", hdmf_only: bool = False
) -> NamespacesAdapter:
"""
Convenience function for loading the NWB core schema + hdmf-common as a namespace adapter.
@ -136,14 +179,16 @@ def load_nwb_core(core_version: str = "2.7.0", hdmf_version: str = "1.8.0") -> N
Args:
core_version (str): an entry in :attr:`.NWB_CORE_REPO.versions`
hdmf_version (str): an entry in :attr:`.NWB_CORE_REPO.versions`
hdmf_only (bool): Only return the hdmf common schema
Returns:
"""
# First get hdmf-common:
hdmf_schema = load_namespace_adapter(HDMF_COMMON_REPO, version=hdmf_version)
schema = load_namespace_adapter(NWB_CORE_REPO, version=core_version)
schema.imported.append(hdmf_schema)
if hdmf_only:
schema = hdmf_schema
else:
schema = load_namespace_adapter(NWB_CORE_REPO, version=core_version, imported=[hdmf_schema])
return schema


@ -12,7 +12,7 @@ from linkml_runtime.linkml_model import (
TypeDefinition,
)
from nwb_linkml.maps import flat_to_linkml, flat_to_np
from nwb_linkml.maps import flat_to_linkml
def _make_dtypes() -> List[TypeDefinition]:
@ -27,12 +27,15 @@ def _make_dtypes() -> List[TypeDefinition]:
if nwbtype.startswith("uint"):
amin = 0
np_type = flat_to_np[nwbtype]
# FIXME: Restore numpy types when we wrap them :)
# np_type = flat_to_np[nwbtype]
repr_string = f"np.{np_type.__name__}" if np_type.__module__ == "numpy" else None
# repr_string = f"np.{np_type.__name__}" if np_type.__module__ == "numpy" else None
atype = TypeDefinition(
name=nwbtype, minimum_value=amin, typeof=linkmltype, repr=repr_string
name=nwbtype,
minimum_value=amin,
typeof=linkmltype, # repr=repr_string
)
DTypeTypes.append(atype)
return DTypeTypes


@ -0,0 +1,100 @@
"""
Logging factory and handlers
"""
import logging
from logging.handlers import RotatingFileHandler
from pathlib import Path
from typing import Optional, Union
from rich.logging import RichHandler
from nwb_linkml.config import LOG_LEVELS, Config
def init_logger(
name: str,
log_dir: Union[Optional[Path], bool] = None,
level: Optional[LOG_LEVELS] = None,
file_level: Optional[LOG_LEVELS] = None,
log_file_n: Optional[int] = None,
log_file_size: Optional[int] = None,
) -> logging.Logger:
"""
Make a logger.
Log to a set of rotating files in the ``log_dir`` according to ``name`` ,
as well as using the :class:`~rich.RichHandler` for pretty-formatted stdout logs.
Args:
name (str): Name of this logger. Ideally names are hierarchical
and indicate what they are logging for, e.g. ``miniscope_io.sdcard``
and don't contain metadata like timestamps, etc. (which are in the logs)
log_dir (:class:`pathlib.Path`): Directory to store file-based logs in. If ``None``,
get from :class:`.Config`. If ``False`` , disable file logging.
level (:class:`.LOG_LEVELS`): Level to use for stdout logging. If ``None`` ,
get from :class:`.Config`
file_level (:class:`.LOG_LEVELS`): Level to use for file-based logging.
If ``None`` , get from :class:`.Config`
log_file_n (int): Number of rotating file logs to use.
If ``None`` , get from :class:`.Config`
log_file_size (int): Maximum size of logfiles before rotation.
If ``None`` , get from :class:`.Config`
Returns:
:class:`logging.Logger`
"""
config = Config()
if log_dir is None:
log_dir = config.log_dir
if level is None:
level = config.logs.level_stdout
if file_level is None:
file_level = config.logs.level_file
if log_file_n is None:
log_file_n = config.logs.file_n
if log_file_size is None:
log_file_size = config.logs.file_size
if not name.startswith("nwb_linkml"):
name = "nwb_linkml." + name
logger = logging.getLogger(name)
logger.setLevel(level)
# Add handlers for stdout and file
if log_dir is not False:
logger.addHandler(_file_handler(name, file_level, log_dir, log_file_n, log_file_size))
logger.addHandler(_rich_handler())
return logger
def _file_handler(
name: str,
file_level: LOG_LEVELS,
log_dir: Path,
log_file_n: int = 5,
log_file_size: int = 2**22,
) -> RotatingFileHandler:
# See init_logger for arg docs
filename = Path(log_dir) / ".".join([name, "log"])
file_handler = RotatingFileHandler(
str(filename), mode="a", maxBytes=log_file_size, backupCount=log_file_n
)
file_formatter = logging.Formatter("[%(asctime)s] %(levelname)s [%(name)s]: %(message)s")
file_handler.setLevel(file_level)
file_handler.setFormatter(file_formatter)
return file_handler
def _rich_handler() -> RichHandler:
rich_handler = RichHandler(rich_tracebacks=True, markup=True)
rich_formatter = logging.Formatter(
"[bold green]\[%(name)s][/bold green] %(message)s",
datefmt="[%y-%m-%dT%H:%M:%S]",
)
rich_handler.setFormatter(rich_formatter)
return rich_handler
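The file-handler half of the wiring above can be reproduced with only the stdlib. This is a hedged sketch, with the function name, defaults, and directory handling assumed rather than taken from `nwb_linkml.config.Config`:

```python
import logging
import tempfile
from logging.handlers import RotatingFileHandler
from pathlib import Path

def make_file_logger(name: str, log_dir: Path, file_level: int = logging.DEBUG,
                     backups: int = 5, max_bytes: int = 2**22) -> logging.Logger:
    # Namespace the logger under "nwb_linkml", as init_logger does
    if not name.startswith("nwb_linkml"):
        name = "nwb_linkml." + name
    logger = logging.getLogger(name)
    logger.setLevel(logging.DEBUG)
    handler = RotatingFileHandler(
        str(Path(log_dir) / f"{name}.log"), mode="a",
        maxBytes=max_bytes, backupCount=backups,
    )
    handler.setLevel(file_level)
    handler.setFormatter(
        logging.Formatter("[%(asctime)s] %(levelname)s [%(name)s]: %(message)s")
    )
    logger.addHandler(handler)
    return logger
```

Calling `make_file_logger("adapters", some_dir)` then logging to it appends formatted records to `nwb_linkml.adapters.log`, rotating once the file exceeds `max_bytes`.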


@ -2,7 +2,7 @@
Mapping from one domain to another
"""
from nwb_linkml.maps.dtype import flat_to_linkml, flat_to_np, flat_to_nptyping
from nwb_linkml.maps.dtype import flat_to_linkml, flat_to_np
from nwb_linkml.maps.map import Map
from nwb_linkml.maps.postload import MAP_HDMF_DATATYPE_DEF, MAP_HDMF_DATATYPE_INC
from nwb_linkml.maps.quantity import QUANTITY_MAP
@ -14,5 +14,4 @@ __all__ = [
"Map",
"flat_to_linkml",
"flat_to_np",
"flat_to_nptyping",
]


@ -3,11 +3,12 @@ Dtype mappings
"""
from datetime import datetime
from typing import Any, Type
from typing import Any
import nptyping
import numpy as np
from nwb_schema_language import CompoundDtype, DTypeType, FlatDtype, ReferenceDtype
flat_to_linkml = {
"float": "float",
"float32": "float",
@ -38,37 +39,6 @@ flat_to_linkml = {
Map between the flat data types and the simpler linkml base types
"""
flat_to_nptyping = {
"float": "Float",
"float32": "Float32",
"double": "Double",
"float64": "Float64",
"long": "LongLong",
"int64": "Int64",
"int": "Int",
"int32": "Int32",
"int16": "Int16",
"short": "Short",
"int8": "Int8",
"uint": "UInt",
"uint32": "UInt32",
"uint16": "UInt16",
"uint8": "UInt8",
"uint64": "UInt64",
"numeric": "Number",
"text": "String",
"utf": "Unicode",
"utf8": "Unicode",
"utf_8": "Unicode",
"string": "Unicode",
"str": "Unicode",
"ascii": "String",
"bool": "Bool",
"isodatetime": "Datetime64",
"AnyType": "Any",
"object": "Object",
}
flat_to_np = {
"float": float,
"float32": np.float32,
@ -130,10 +100,9 @@ np_to_python = {
np.float64,
np.single,
np.double,
np.float_,
)
},
**{n: str for n in (np.character, np.str_, np.string_, np.unicode_)},
**{n: str for n in (np.character, np.str_)},
}
allowed_precisions = {
@ -173,15 +142,32 @@ https://github.com/hdmf-dev/hdmf/blob/ddc842b5c81d96e0b957b96e88533b16c137e206/s
"""
def struct_from_dtype(dtype: np.dtype) -> Type[nptyping.Structure]:
def handle_dtype(dtype: DTypeType | None) -> str:
"""
Create a nptyping Structure from a compound numpy dtype
Get the string form of a dtype
nptyping structures have the form::
Structure["name: Str, age: Int"]
Args:
dtype (:class:`.DTypeType`): Dtype to stringify
Returns:
str
"""
struct_pieces = [f"{k}: {flat_to_nptyping[v[0].name]}" for k, v in dtype.fields.items()]
struct_dtype = ", ".join(struct_pieces)
return nptyping.Structure[struct_dtype]
if isinstance(dtype, ReferenceDtype):
return dtype.target_type
elif dtype is None or dtype == []:
# Some ill-defined datasets are "abstract" despite that not being in the schema language
return "AnyType"
elif isinstance(dtype, FlatDtype):
return dtype.value
elif isinstance(dtype, list) and isinstance(dtype[0], CompoundDtype):
# there is precisely one class that uses compound dtypes:
# TimeSeriesReferenceVectorData
# compoundDtypes are able to define a ragged table according to the schema
# but are used in this single case equivalently to attributes.
# so we'll... uh... treat them as slots.
# TODO
return "AnyType"
else:
# flat dtype
return dtype
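The resolution order in `handle_dtype` can be paraphrased in a standalone sketch. Here plain stand-ins are assumed (a dict for `ReferenceDtype`, a list for compound dtypes, a plain string for flat dtypes) instead of the real `nwb_schema_language` classes, so this is illustrative only:

```python
# Illustrative-only sketch of handle_dtype's branching, with assumed
# stand-in types instead of nwb_schema_language classes.
def handle_dtype_sketch(dtype):
    if isinstance(dtype, dict) and "target_type" in dtype:
        return dtype["target_type"]  # reference dtype -> referenced class name
    elif dtype is None or dtype == []:
        return "AnyType"             # "abstract" dataset with no declared dtype
    elif isinstance(dtype, list):
        return "AnyType"             # compound dtype, currently treated as AnyType
    else:
        return dtype                 # flat dtype is already its string form
```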


@ -23,7 +23,6 @@ from pydantic import BaseModel, ConfigDict, Field
from nwb_linkml.annotations import unwrap_optional
from nwb_linkml.maps import Map
from nwb_linkml.maps.hdmf import dynamictable_to_model
from nwb_linkml.types.hdf5 import HDF5_Path
if sys.version_info.minor >= 11:
@ -234,63 +233,64 @@ class PruneEmpty(HDF5Map):
return H5ReadResult.model_construct(path=src.path, source=src, completed=True)
class ResolveDynamicTable(HDF5Map):
"""
Handle loading a dynamic table!
Dynamic tables are sort of odd in that their models don't include their fields
(except as a list of strings in ``colnames`` ),
so we need to create a new model that includes fields for each column,
and then we include the datasets as :class:`~numpydantic.interface.hdf5.H5ArrayPath`
objects which lazy load the arrays in a thread/process safe way.
This map also resolves the child elements,
indicating so by the ``completes`` field in the :class:`.ReadResult`
"""
phase = ReadPhases.read
priority = 1
@classmethod
def check(
cls, src: H5SourceItem, provider: "SchemaProvider", completed: Dict[str, H5ReadResult]
) -> bool:
if src.h5_type == "dataset":
return False
if "neurodata_type" in src.attrs:
if src.attrs["neurodata_type"] == "DynamicTable":
return True
# otherwise, see if it's a subclass
model = provider.get_class(src.attrs["namespace"], src.attrs["neurodata_type"])
# just inspect the MRO as strings rather than trying to check subclasses because
# we might replace DynamicTable in the future, and there isn't a stable DynamicTable
# class to inherit from anyway because of the whole multiple versions thing
parents = [parent.__name__ for parent in model.__mro__]
return "DynamicTable" in parents
else:
return False
@classmethod
def apply(
cls, src: H5SourceItem, provider: "SchemaProvider", completed: Dict[str, H5ReadResult]
) -> H5ReadResult:
with h5py.File(src.h5f_path, "r") as h5f:
obj = h5f.get(src.path)
# make a populated model :)
base_model = provider.get_class(src.namespace, src.neurodata_type)
model = dynamictable_to_model(obj, base=base_model)
completes = [HDF5_Path(child.name) for child in obj.values()]
return H5ReadResult(
path=src.path,
source=src,
result=model,
completes=completes,
completed=True,
applied=["ResolveDynamicTable"],
)
#
# class ResolveDynamicTable(HDF5Map):
# """
# Handle loading a dynamic table!
#
# Dynamic tables are sort of odd in that their models don't include their fields
# (except as a list of strings in ``colnames`` ),
# so we need to create a new model that includes fields for each column,
# and then we include the datasets as :class:`~numpydantic.interface.hdf5.H5ArrayPath`
# objects which lazy load the arrays in a thread/process safe way.
#
# This map also resolves the child elements,
# indicating so by the ``completes`` field in the :class:`.ReadResult`
# """
#
# phase = ReadPhases.read
# priority = 1
#
# @classmethod
# def check(
# cls, src: H5SourceItem, provider: "SchemaProvider", completed: Dict[str, H5ReadResult]
# ) -> bool:
# if src.h5_type == "dataset":
# return False
# if "neurodata_type" in src.attrs:
# if src.attrs["neurodata_type"] == "DynamicTable":
# return True
# # otherwise, see if it's a subclass
# model = provider.get_class(src.attrs["namespace"], src.attrs["neurodata_type"])
# # just inspect the MRO as strings rather than trying to check subclasses because
# # we might replace DynamicTable in the future, and there isn't a stable DynamicTable
# # class to inherit from anyway because of the whole multiple versions thing
# parents = [parent.__name__ for parent in model.__mro__]
# return "DynamicTable" in parents
# else:
# return False
#
# @classmethod
# def apply(
# cls, src: H5SourceItem, provider: "SchemaProvider", completed: Dict[str, H5ReadResult]
# ) -> H5ReadResult:
# with h5py.File(src.h5f_path, "r") as h5f:
# obj = h5f.get(src.path)
#
# # make a populated model :)
# base_model = provider.get_class(src.namespace, src.neurodata_type)
# model = dynamictable_to_model(obj, base=base_model)
#
# completes = [HDF5_Path(child.name) for child in obj.values()]
#
# return H5ReadResult(
# path=src.path,
# source=src,
# result=model,
# completes=completes,
# completed=True,
# applied=["ResolveDynamicTable"],
# )
class ResolveModelGroup(HDF5Map):


@ -1,84 +0,0 @@
"""
Mapping functions for handling HDMF classes like DynamicTables
"""
from typing import Any, List, Optional, Type
import dask.array as da
import h5py
import numpy as np
from numpydantic import NDArray
from numpydantic.interface.hdf5 import H5ArrayPath
from pydantic import BaseModel, create_model
from nwb_linkml.maps.dtype import struct_from_dtype
from nwb_linkml.types.hdf5 import HDF5_Path
def model_from_dynamictable(group: h5py.Group, base: Optional[BaseModel] = None) -> Type[BaseModel]:
"""
Create a pydantic model from a dynamic table
"""
colnames = group.attrs["colnames"]
types = {}
for col in colnames:
nptype = group[col].dtype
nptype = struct_from_dtype(nptype) if nptype.type == np.void else nptype.type
type_ = Optional[NDArray[Any, nptype]]
# FIXME: handling nested column types that appear only in some versions?
# types[col] = (List[type_ | None], ...)
types[col] = (type_, None)
model = create_model(group.name.split("/")[-1], **types, __base__=base)
return model
def dynamictable_to_model(
group: h5py.Group,
model: Optional[Type[BaseModel]] = None,
base: Optional[Type[BaseModel]] = None,
) -> BaseModel:
"""
Instantiate a dynamictable model
Calls :func:`.model_from_dynamictable` if ``model`` is not provided.
"""
if model is None:
model = model_from_dynamictable(group, base)
items = {}
for col, col_type in model.model_fields.items():
if col not in group:
if col in group.attrs:
items[col] = group.attrs[col]
continue
if col_type.annotation is HDF5_Path:
items[col] = [HDF5_Path(group[d].name) for d in group[col][:]]
else:
try:
items[col] = da.from_array(group[col])
except NotImplementedError:
items[col] = H5ArrayPath(file=group.file.filename, path=group[col].name)
return model.model_construct(hdf5_path=group.name, name=group.name.split("/")[-1], **items)
def dereference_reference_vector(dset: h5py.Dataset, data: Optional[List[Any]]) -> List:
"""
Given a compound dataset with indices, counts, and object references, dereference to values
Data is of the form
(idx_start, count, target)
"""
# assume all these references are to the same target
# and the index is in the 3rd position
if data is None:
data = dset[:]
target = dset.parent.get(data[0][-1])
res = [target[d[0] : d[0] + d[1]] for d in data]
return res


@ -28,6 +28,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
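The `value`-then-`data` fallback added to `ConfiguredBaseModel` can be sketched with plain classes standing in for the generated pydantic models; the class and function names here are hypothetical:

```python
# Sketch of the fallback indexing: prefer ``value``, then ``data``,
# otherwise fail, as in ConfiguredBaseModel.__getitem__.
class WithValue:
    def __init__(self):
        self.value = [1, 2, 3]

class WithData:
    def __init__(self):
        self.value = None
        self.data = ["a", "b"]

def getitem(obj, val):
    if getattr(obj, "value", None) is not None:
        return obj.value[val]
    elif getattr(obj, "data", None) is not None:
        return obj.data[val]
    else:
        raise KeyError("No value or data field to index from")
```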
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@ -83,15 +92,15 @@ class Image(NWBData):
)
name: str = Field(...)
resolution: Optional[np.float32] = Field(
resolution: Optional[float] = Field(
None, description="""Pixel resolution of the image, in pixels per centimeter."""
)
description: Optional[str] = Field(None, description="""Description of the image.""")
array: Optional[
value: Optional[
Union[
NDArray[Shape["* x, * y"], np.number],
NDArray[Shape["* x, * y, 3 r_g_b"], np.number],
NDArray[Shape["* x, * y, 4 r_g_b_a"], np.number],
NDArray[Shape["* x, * y"], float],
NDArray[Shape["* x, * y, 3 r_g_b"], float],
NDArray[Shape["* x, * y, 4 r_g_b_a"], float],
]
] = Field(None)
@ -130,10 +139,15 @@ class TimeSeries(NWBDataInterface):
)
name: str = Field(...)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
data: TimeSeriesData = Field(
...,
@ -143,12 +157,12 @@ class TimeSeries(NWBDataInterface):
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@ -177,19 +191,21 @@ class TimeSeriesData(ConfiguredBaseModel):
"data",
json_schema_extra={"linkml_meta": {"equals_string": "data", "ifabsent": "string(data)"}},
)
conversion: Optional[np.float32] = Field(
None,
conversion: Optional[float] = Field(
1.0,
description="""Scalar to multiply each element in data to convert it to the specified 'unit'. If the data are stored in acquisition system units or other units that require a conversion to be interpretable, multiply the data by 'conversion' to convert the data to the specified 'unit'. e.g. if the data acquisition system stores values in this object as signed 16-bit integers (int16 range -32,768 to 32,767) that correspond to a 5V range (-2.5V to 2.5V), and the data acquisition system gain is 8000X, then the 'conversion' multiplier to get from raw data acquisition values to recorded volts is 2.5/32768/8000 = 9.5367e-9.""",
json_schema_extra={"linkml_meta": {"ifabsent": "float(1.0)"}},
)
resolution: Optional[np.float32] = Field(
None,
resolution: Optional[float] = Field(
-1.0,
description="""Smallest meaningful difference between values in data, stored in the specified by unit, e.g., the change in value of the least significant bit, or a larger number if signal noise is known to be present. If unknown, use -1.0.""",
json_schema_extra={"linkml_meta": {"ifabsent": "float(-1.0)"}},
)
unit: Optional[str] = Field(
None,
unit: str = Field(
...,
description="""Base unit of measurement for working with the data. Actual stored values are not necessarily stored in these units. To access the data in these units, multiply 'data' by 'conversion'.""",
)
array: Optional[
value: Optional[
Union[
NDArray[Shape["* num_times"], Any],
NDArray[Shape["* num_times, * num_dim2"], Any],
@ -212,11 +228,15 @@ class TimeSeriesStartingTime(ConfiguredBaseModel):
"linkml_meta": {"equals_string": "starting_time", "ifabsent": "string(starting_time)"}
},
)
rate: Optional[np.float32] = Field(None, description="""Sampling rate, in Hz.""")
unit: Optional[str] = Field(
None, description="""Unit of measurement for time, which is fixed to 'seconds'."""
rate: float = Field(..., description="""Sampling rate, in Hz.""")
unit: Literal["seconds"] = Field(
"seconds",
description="""Unit of measurement for time, which is fixed to 'seconds'.""",
json_schema_extra={
"linkml_meta": {"equals_string": "seconds", "ifabsent": "string(seconds)"}
},
)
value: np.float64 = Field(...)
value: float = Field(...)
class TimeSeriesSync(ConfiguredBaseModel):
@ -241,7 +261,7 @@ class ProcessingModule(NWBContainer):
{"from_schema": "core.nwb.base", "tree_root": True}
)
children: Optional[List[Union[DynamicTable, NWBDataInterface]]] = Field(
value: Optional[List[Union[DynamicTable, NWBDataInterface]]] = Field(
None,
json_schema_extra={
"linkml_meta": {"any_of": [{"range": "NWBDataInterface"}, {"range": "DynamicTable"}]}
@ -260,9 +280,7 @@ class Images(NWBDataInterface):
)
name: str = Field("Images", json_schema_extra={"linkml_meta": {"ifabsent": "string(Images)"}})
description: Optional[str] = Field(
None, description="""Description of this collection of images."""
)
description: str = Field(..., description="""Description of this collection of images.""")
image: List[Image] = Field(..., description="""Images stored in this collection.""")


@ -34,6 +34,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@ -84,21 +93,26 @@ class SpatialSeries(TimeSeries):
reference_frame: Optional[str] = Field(
None, description="""Description defining what exactly 'straight-ahead' means."""
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@ -128,13 +142,14 @@ class SpatialSeriesData(ConfiguredBaseModel):
json_schema_extra={"linkml_meta": {"equals_string": "data", "ifabsent": "string(data)"}},
)
unit: Optional[str] = Field(
None,
"meters",
description="""Base unit of measurement for working with the data. The default value is 'meters'. Actual stored values are not necessarily stored in these units. To access the data in these units, multiply 'data' by 'conversion'.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(meters)"}},
)
array: Optional[
value: Optional[
Union[
NDArray[Shape["* num_times"], np.number],
NDArray[Shape["* num_times, * num_features"], np.number],
NDArray[Shape["* num_times"], float],
NDArray[Shape["* num_times, * num_features"], float],
]
] = Field(None)
@ -148,7 +163,7 @@ class BehavioralEpochs(NWBDataInterface):
{"from_schema": "core.nwb.behavior", "tree_root": True}
)
children: Optional[List[IntervalSeries]] = Field(
value: Optional[List[IntervalSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "IntervalSeries"}]}}
)
name: str = Field(...)
@ -163,7 +178,7 @@ class BehavioralEvents(NWBDataInterface):
{"from_schema": "core.nwb.behavior", "tree_root": True}
)
children: Optional[List[TimeSeries]] = Field(
value: Optional[List[TimeSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "TimeSeries"}]}}
)
name: str = Field(...)
@ -178,7 +193,7 @@ class BehavioralTimeSeries(NWBDataInterface):
{"from_schema": "core.nwb.behavior", "tree_root": True}
)
children: Optional[List[TimeSeries]] = Field(
value: Optional[List[TimeSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "TimeSeries"}]}}
)
name: str = Field(...)
@ -193,7 +208,7 @@ class PupilTracking(NWBDataInterface):
{"from_schema": "core.nwb.behavior", "tree_root": True}
)
children: Optional[List[TimeSeries]] = Field(
value: Optional[List[TimeSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "TimeSeries"}]}}
)
name: str = Field(...)
@ -208,7 +223,7 @@ class EyeTracking(NWBDataInterface):
{"from_schema": "core.nwb.behavior", "tree_root": True}
)
children: Optional[List[SpatialSeries]] = Field(
value: Optional[List[SpatialSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "SpatialSeries"}]}}
)
name: str = Field(...)
@ -223,7 +238,7 @@ class CompassDirection(NWBDataInterface):
{"from_schema": "core.nwb.behavior", "tree_root": True}
)
children: Optional[List[SpatialSeries]] = Field(
value: Optional[List[SpatialSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "SpatialSeries"}]}}
)
name: str = Field(...)
@ -238,7 +253,7 @@ class Position(NWBDataInterface):
{"from_schema": "core.nwb.behavior", "tree_root": True}
)
children: Optional[List[SpatialSeries]] = Field(
value: Optional[List[SpatialSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "SpatialSeries"}]}}
)
name: str = Field(...)

View file

@ -27,6 +27,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}

View file

@ -16,6 +16,7 @@ from pydantic import (
ValidationInfo,
BeforeValidator,
)
from ...core.v2_2_0.core_nwb_device import Device
from ...core.v2_2_0.core_nwb_base import (
TimeSeries,
TimeSeriesStartingTime,
@ -43,6 +44,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@ -68,7 +78,7 @@ ModelType = TypeVar("ModelType", bound=Type[BaseModel])
def _get_name(item: ModelType | dict, info: ValidationInfo) -> Union[ModelType, dict]:
"""Get the name of the slot that refers to this object"""
assert isinstance(item, (BaseModel, dict))
assert isinstance(item, (BaseModel, dict)), f"{item} was not a BaseModel or a dict!"
name = info.field_name
if isinstance(item, BaseModel):
item.name = name
@ -108,37 +118,47 @@ class ElectricalSeries(TimeSeries):
name: str = Field(...)
data: Union[
NDArray[Shape["* num_times"], np.number],
NDArray[Shape["* num_times, * num_channels"], np.number],
NDArray[Shape["* num_times, * num_channels, * num_samples"], np.number],
NDArray[Shape["* num_times"], float],
NDArray[Shape["* num_times, * num_channels"], float],
NDArray[Shape["* num_times, * num_channels, * num_samples"], float],
] = Field(..., description="""Recorded voltage data.""")
electrodes: Named[DynamicTableRegion] = Field(
...,
description="""DynamicTableRegion pointer to the electrodes that this time series was generated from.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
channel_conversion: Optional[NDArray[Shape["* num_channels"], np.float32]] = Field(
channel_conversion: Optional[NDArray[Shape["* num_channels"], float]] = Field(
None,
description="""Channel-specific conversion factor. Multiply the data in the 'data' dataset by these values along the channel axis (as indicated by axis attribute) AND by the global conversion factor in the 'conversion' attribute of 'data' to get the data values in Volts, i.e, data in Volts = data * data.conversion * channel_conversion. This approach allows for both global and per-channel data conversion factors needed to support the storage of electrical recordings as native values generated by data acquisition systems. If this dataset is not present, then there is no channel-specific conversion factor, i.e. it is 1 for all channels.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_channels"}]}}},
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
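The `channel_conversion` docstring above spells out the scaling rule: `data in Volts = data * data.conversion * channel_conversion`, applied along the channel axis. A small numeric sketch of that formula with hypothetical ADC counts and factors (plain lists, no array library):

```python
# Hypothetical raw ADC counts, shape (num_times, num_channels)
raw = [[100, 200],
       [300, 400]]

conversion = 1e-6                 # global factor from data.conversion
channel_conversion = [1.0, 2.0]   # per-channel factors, one per channel

# data in Volts = data * data.conversion * channel_conversion,
# multiplying each column by its channel's factor
volts = [
    [sample * conversion * chan for sample, chan in zip(row, channel_conversion)]
    for row in raw
]
print(volts)  # [[0.0001, 0.0004], [0.0003, 0.0008]]
```

If `channel_conversion` is absent, the per-channel factor is 1 for all channels and only the global `conversion` applies.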
@ -167,10 +187,10 @@ class SpikeEventSeries(ElectricalSeries):
name: str = Field(...)
data: Union[
NDArray[Shape["* num_events, * num_samples"], np.number],
NDArray[Shape["* num_events, * num_channels, * num_samples"], np.number],
NDArray[Shape["* num_events, * num_samples"], float],
NDArray[Shape["* num_events, * num_channels, * num_samples"], float],
] = Field(..., description="""Spike waveforms.""")
timestamps: NDArray[Shape["* num_times"], np.float64] = Field(
timestamps: NDArray[Shape["* num_times"], float] = Field(
...,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time. Timestamps are required for the events. Unlike for TimeSeries, timestamps are required for SpikeEventSeries and are thus re-specified here.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@ -179,24 +199,34 @@ class SpikeEventSeries(ElectricalSeries):
...,
description="""DynamicTableRegion pointer to the electrodes that this time series was generated from.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
channel_conversion: Optional[NDArray[Shape["* num_channels"], np.float32]] = Field(
channel_conversion: Optional[NDArray[Shape["* num_channels"], float]] = Field(
None,
description="""Channel-specific conversion factor. Multiply the data in the 'data' dataset by these values along the channel axis (as indicated by axis attribute) AND by the global conversion factor in the 'conversion' attribute of 'data' to get the data values in Volts, i.e, data in Volts = data * data.conversion * channel_conversion. This approach allows for both global and per-channel data conversion factors needed to support the storage of electrical recordings as native values generated by data acquisition systems. If this dataset is not present, then there is no channel-specific conversion factor, i.e. it is 1 for all channels.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_channels"}]}}},
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@ -232,7 +262,7 @@ class FeatureExtraction(NWBDataInterface):
description="""Description of features (eg, ''PC1'') for each of the extracted features.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_features"}]}}},
)
features: NDArray[Shape["* num_events, * num_channels, * num_features"], np.float32] = Field(
features: NDArray[Shape["* num_events, * num_channels, * num_features"], float] = Field(
...,
description="""Multi-dimensional array of features extracted from each event.""",
json_schema_extra={
@ -247,7 +277,7 @@ class FeatureExtraction(NWBDataInterface):
}
},
)
times: NDArray[Shape["* num_events"], np.float64] = Field(
times: NDArray[Shape["* num_events"], float] = Field(
...,
description="""Times of events that features correspond to (can be a link).""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_events"}]}}},
@ -256,7 +286,12 @@ class FeatureExtraction(NWBDataInterface):
...,
description="""DynamicTableRegion pointer to the electrodes that this time series was generated from.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
@ -277,16 +312,25 @@ class EventDetection(NWBDataInterface):
...,
description="""Description of how events were detected, such as voltage threshold, or dV/dT threshold, as well as relevant values.""",
)
source_idx: NDArray[Shape["* num_events"], np.int32] = Field(
source_idx: NDArray[Shape["* num_events"], int] = Field(
...,
description="""Indices (zero-based) into source ElectricalSeries::data array corresponding to time of event. ''description'' should define what is meant by time of event (e.g., .25 ms before action potential peak, zero-crossing time, etc). The index points to each event from the raw data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_events"}]}}},
)
times: NDArray[Shape["* num_events"], np.float64] = Field(
times: NDArray[Shape["* num_events"], float] = Field(
...,
description="""Timestamps of events, in seconds.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_events"}]}}},
)
source_electricalseries: Union[ElectricalSeries, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "ElectricalSeries"}, {"range": "string"}],
}
},
)
class EventWaveform(NWBDataInterface):
@ -298,7 +342,7 @@ class EventWaveform(NWBDataInterface):
{"from_schema": "core.nwb.ecephys", "tree_root": True}
)
children: Optional[List[SpikeEventSeries]] = Field(
value: Optional[List[SpikeEventSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "SpikeEventSeries"}]}}
)
name: str = Field(...)
@ -313,7 +357,7 @@ class FilteredEphys(NWBDataInterface):
{"from_schema": "core.nwb.ecephys", "tree_root": True}
)
children: Optional[List[ElectricalSeries]] = Field(
value: Optional[List[ElectricalSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "ElectricalSeries"}]}}
)
name: str = Field(...)
@ -328,7 +372,7 @@ class LFP(NWBDataInterface):
{"from_schema": "core.nwb.ecephys", "tree_root": True}
)
children: Optional[List[ElectricalSeries]] = Field(
value: Optional[List[ElectricalSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "ElectricalSeries"}]}}
)
name: str = Field(...)
@ -344,14 +388,23 @@ class ElectrodeGroup(NWBContainer):
)
name: str = Field(...)
description: Optional[str] = Field(None, description="""Description of this electrode group.""")
location: Optional[str] = Field(
None,
description: str = Field(..., description="""Description of this electrode group.""")
location: str = Field(
...,
description="""Location of electrode group. Specify the area, layer, comments on estimation of area/layer, etc. Use standard atlas names for anatomical regions when possible.""",
)
position: Optional[ElectrodeGroupPosition] = Field(
None, description="""stereotaxic or common framework coordinates"""
)
device: Union[Device, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "Device"}, {"range": "string"}],
}
},
)
class ElectrodeGroupPosition(ConfiguredBaseModel):
@ -367,9 +420,21 @@ class ElectrodeGroupPosition(ConfiguredBaseModel):
"linkml_meta": {"equals_string": "position", "ifabsent": "string(position)"}
},
)
x: Optional[np.float32] = Field(None, description="""x coordinate""")
y: Optional[np.float32] = Field(None, description="""y coordinate""")
z: Optional[np.float32] = Field(None, description="""z coordinate""")
x: Optional[NDArray[Shape["*"], float]] = Field(
None,
description="""x coordinate""",
json_schema_extra={"linkml_meta": {"array": {"exact_number_dimensions": 1}}},
)
y: Optional[NDArray[Shape["*"], float]] = Field(
None,
description="""y coordinate""",
json_schema_extra={"linkml_meta": {"array": {"exact_number_dimensions": 1}}},
)
z: Optional[NDArray[Shape["*"], float]] = Field(
None,
description="""z coordinate""",
json_schema_extra={"linkml_meta": {"array": {"exact_number_dimensions": 1}}},
)
class ClusterWaveforms(NWBDataInterface):
@ -388,7 +453,7 @@ class ClusterWaveforms(NWBDataInterface):
waveform_filtering: str = Field(
..., description="""Filtering applied to data before generating mean/sd"""
)
waveform_mean: NDArray[Shape["* num_clusters, * num_samples"], np.float32] = Field(
waveform_mean: NDArray[Shape["* num_clusters, * num_samples"], float] = Field(
...,
description="""The mean waveform for each cluster, using the same indices for each wave as cluster numbers in the associated Clustering module (i.e, cluster 3 is in array slot [3]). Waveforms corresponding to gaps in cluster sequence should be empty (e.g., zero- filled)""",
json_schema_extra={
@ -397,7 +462,7 @@ class ClusterWaveforms(NWBDataInterface):
}
},
)
waveform_sd: NDArray[Shape["* num_clusters, * num_samples"], np.float32] = Field(
waveform_sd: NDArray[Shape["* num_clusters, * num_samples"], float] = Field(
...,
description="""Stdev of waveforms for each cluster, using the same indices as in mean""",
json_schema_extra={
@ -406,6 +471,15 @@ class ClusterWaveforms(NWBDataInterface):
}
},
)
clustering_interface: Union[Clustering, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "Clustering"}, {"range": "string"}],
}
},
)
class Clustering(NWBDataInterface):
@ -424,17 +498,17 @@ class Clustering(NWBDataInterface):
...,
description="""Description of clusters or clustering, (e.g. cluster 0 is noise, clusters curated using Klusters, etc)""",
)
num: NDArray[Shape["* num_events"], np.int32] = Field(
num: NDArray[Shape["* num_events"], int] = Field(
...,
description="""Cluster number of each event""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_events"}]}}},
)
peak_over_rms: NDArray[Shape["* num_clusters"], np.float32] = Field(
peak_over_rms: NDArray[Shape["* num_clusters"], float] = Field(
...,
description="""Maximum ratio of waveform peak to RMS on any channel in the cluster (provides a basic clustering metric).""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_clusters"}]}}},
)
times: NDArray[Shape["* num_events"], np.float64] = Field(
times: NDArray[Shape["* num_events"], float] = Field(
...,
description="""Times of clustered events, in seconds. This may be a link to times field in associated FeatureExtraction module.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_events"}]}}},

View file

@ -15,9 +15,9 @@ from pydantic import (
ValidationInfo,
BeforeValidator,
)
from numpydantic import NDArray, Shape
from ...hdmf_common.v1_1_0.hdmf_common_table import DynamicTable, VectorIndex, VectorData
from ...core.v2_2_0.core_nwb_base import TimeSeries
from numpydantic import NDArray, Shape
metamodel_version = "None"
version = "2.2.0"
@ -37,6 +37,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@ -62,7 +71,7 @@ ModelType = TypeVar("ModelType", bound=Type[BaseModel])
def _get_name(item: ModelType | dict, info: ValidationInfo) -> Union[ModelType, dict]:
"""Get the name of the slot that refers to this object"""
assert isinstance(item, (BaseModel, dict))
assert isinstance(item, (BaseModel, dict)), f"{item} was not a BaseModel or a dict!"
name = info.field_name
if isinstance(item, BaseModel):
item.name = name
@ -96,7 +105,7 @@ class TimeIntervals(DynamicTable):
)
name: str = Field(...)
start_time: NDArray[Any, np.float32] = Field(
start_time: VectorData[NDArray[Any, float]] = Field(
...,
description="""Start time of epoch, in seconds.""",
json_schema_extra={
@ -105,7 +114,7 @@ class TimeIntervals(DynamicTable):
}
},
)
stop_time: NDArray[Any, np.float32] = Field(
stop_time: VectorData[NDArray[Any, float]] = Field(
...,
description="""Stop time of epoch, in seconds.""",
json_schema_extra={
@ -114,7 +123,7 @@ class TimeIntervals(DynamicTable):
}
},
)
tags: Optional[NDArray[Any, str]] = Field(
tags: VectorData[Optional[NDArray[Any, str]]] = Field(
None,
description="""User-defined tags that identify or categorize events.""",
json_schema_extra={
@ -127,7 +136,12 @@ class TimeIntervals(DynamicTable):
None,
description="""Index for tags.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
timeseries: Optional[TimeIntervalsTimeseries] = Field(
@ -137,17 +151,20 @@ class TimeIntervals(DynamicTable):
None,
description="""Index for timeseries.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
colnames: Optional[str] = Field(
None,
colnames: List[str] = Field(
...,
description="""The names of the columns in this table. This should be used to specify an order to the columns.""",
)
description: Optional[str] = Field(
None, description="""Description of what is in this dynamic table."""
)
id: NDArray[Shape["* num_rows"], int] = Field(
description: str = Field(..., description="""Description of what is in this dynamic table.""")
id: VectorData[NDArray[Shape["* num_rows"], int]] = Field(
...,
description="""Array of unique identifiers for the rows of this dynamic table.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_rows"}]}}},
@ -173,20 +190,22 @@ class TimeIntervalsTimeseries(VectorData):
"linkml_meta": {"equals_string": "timeseries", "ifabsent": "string(timeseries)"}
},
)
idx_start: Optional[np.int32] = Field(
idx_start: Optional[NDArray[Shape["*"], int]] = Field(
None,
description="""Start index into the TimeSeries 'data' and 'timestamp' datasets of the referenced TimeSeries. The first dimension of those arrays is always time.""",
json_schema_extra={"linkml_meta": {"array": {"exact_number_dimensions": 1}}},
)
count: Optional[np.int32] = Field(
count: Optional[NDArray[Shape["*"], int]] = Field(
None,
description="""Number of data samples available in this time series, during this epoch.""",
json_schema_extra={"linkml_meta": {"array": {"exact_number_dimensions": 1}}},
)
timeseries: Optional[TimeSeries] = Field(
None, description="""the TimeSeries that this index applies to."""
)
description: Optional[str] = Field(
None, description="""Description of what these vectors represent."""
timeseries: Optional[NDArray[Shape["*"], TimeSeries]] = Field(
None,
description="""the TimeSeries that this index applies to.""",
json_schema_extra={"linkml_meta": {"array": {"exact_number_dimensions": 1}}},
)
description: str = Field(..., description="""Description of what these vectors represent.""")
# Model rebuild
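Throughout this file the diff wraps column types as `VectorData[...]` (e.g. `VectorData[NDArray[Any, float]]`), i.e. `VectorData` has become a generic whose parameter is the column's array type. A toy sketch of that parameterization, with a hypothetical stand-in class rather than the real hdmf-common `VectorData`:

```python
from typing import Generic, List, TypeVar

T = TypeVar("T")


class VectorData(Generic[T]):
    """Toy stand-in: a described column whose value type is the generic parameter."""

    def __init__(self, description: str, value: T):
        self.description = description
        self.value = value

    def __getitem__(self, idx):
        # Same delegation as the generated models: indexing falls through to value
        return self.value[idx]


start_time: VectorData[List[float]] = VectorData(
    "Start time of epoch, in seconds.", [0.0, 1.5, 3.0]
)
print(start_time[2])  # -> 3.0
```

Note how `VectorData[Optional[NDArray[...]]]` in the diff keeps optionality inside the generic parameter, so the column object itself can exist while its array is absent.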

View file

@ -7,7 +7,6 @@ import sys
from typing import Any, ClassVar, List, Literal, Dict, Optional, Union
from pydantic import BaseModel, ConfigDict, Field, RootModel, field_validator
import numpy as np
from ...core.v2_2_0.core_nwb_epoch import TimeIntervals
from ...core.v2_2_0.core_nwb_misc import Units
from ...core.v2_2_0.core_nwb_device import Device
from ...core.v2_2_0.core_nwb_ogen import OptogeneticStimulusSite
@ -22,6 +21,7 @@ from ...core.v2_2_0.core_nwb_ecephys import ElectrodeGroup
from numpydantic import NDArray, Shape
from ...hdmf_common.v1_1_0.hdmf_common_table import DynamicTable, VectorData, VectorIndex
from ...core.v2_2_0.core_nwb_icephys import IntracellularElectrode, SweepTable
from ...core.v2_2_0.core_nwb_epoch import TimeIntervals
metamodel_version = "None"
version = "2.2.0"
@ -41,6 +41,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@ -98,11 +107,12 @@ class NWBFile(NWBContainer):
"root",
json_schema_extra={"linkml_meta": {"equals_string": "root", "ifabsent": "string(root)"}},
)
nwb_version: Optional[str] = Field(
None,
nwb_version: Literal["2.1.0"] = Field(
"2.1.0",
description="""File version string. Use semantic versioning, e.g. 1.2.1. This will be the name of the format with trailing major, minor and patch numbers.""",
json_schema_extra={"linkml_meta": {"equals_string": "2.1.0", "ifabsent": "string(2.1.0)"}},
)
file_create_date: NDArray[Shape["* num_modifications"], np.datetime64] = Field(
file_create_date: NDArray[Shape["* num_modifications"], datetime] = Field(
...,
description="""A record of the date the file was created and of subsequent modifications. The date is stored in UTC with local timezone offset as ISO 8601 extended formatted strings: 2018-09-28T14:43:54.123+02:00. Dates stored in UTC end in \"Z\" with no timezone offset. Date accuracy is up to milliseconds. The file can be created after the experiment was run, so this may differ from the experiment start time. Each modification to the nwb file adds a new entry to the array.""",
json_schema_extra={
@ -116,11 +126,11 @@ class NWBFile(NWBContainer):
session_description: str = Field(
..., description="""A description of the experimental session and data in the file."""
)
session_start_time: np.datetime64 = Field(
session_start_time: datetime = Field(
...,
description="""Date and time of the experiment/session start. The date is stored in UTC with local timezone offset as ISO 8601 extended formatted string: 2018-09-28T14:43:54.123+02:00. Dates stored in UTC end in \"Z\" with no timezone offset. Date accuracy is up to milliseconds.""",
)
timestamps_reference_time: np.datetime64 = Field(
timestamps_reference_time: datetime = Field(
...,
description="""Date and time corresponding to time zero of all timestamps. The date is stored in UTC with local timezone offset as ISO 8601 extended formatted string: 2018-09-28T14:43:54.123+02:00. Dates stored in UTC end in \"Z\" with no timezone offset. Date accuracy is up to milliseconds. All times stored in the file use this time as reference (i.e., time zero).""",
)
@ -158,19 +168,9 @@ class NWBFile(NWBContainer):
...,
description="""Experimental metadata, including protocol, notes and description of hardware device(s). The metadata stored in this section should be used to describe the experiment. Metadata necessary for interpreting the data is stored with the data. General experimental metadata, including animal strain, experimental protocols, experimenter, devices, etc, are stored under 'general'. Core metadata (e.g., that required to interpret data fields) is stored with the data itself, and implicitly defined by the file specification (e.g., time is in seconds). The strategy used here for storing non-core metadata is to use free-form text fields, such as would appear in sentences or paragraphs from a Methods section. Metadata fields are text to enable them to be more general, for example to represent ranges instead of numerical values. Machine-readable metadata is stored as attributes to these free-form datasets. All entries in the below table are to be included when data is present. Unused groups (e.g., intracellular_ephys in an optophysiology experiment) should not be created unless there is data to store within them.""",
)
intervals: Optional[List[TimeIntervals]] = Field(
intervals: Optional[NWBFileIntervals] = Field(
None,
description="""Experimental intervals, whether that be logically distinct sub-experiments having a particular scientific goal, trials (see trials subgroup) during an experiment, or epochs (see epochs subgroup) deriving from analysis of data.""",
json_schema_extra={
"linkml_meta": {
"any_of": [
{"range": "TimeIntervals"},
{"range": "TimeIntervals"},
{"range": "TimeIntervals"},
{"range": "TimeIntervals"},
]
}
},
)
units: Optional[Units] = Field(None, description="""Data about sorted spike units.""")
@ -256,7 +256,7 @@ class NWBFileGeneral(ConfiguredBaseModel):
None,
description="""Description of slices, including information about preparation thickness, orientation, temperature, and bath solution.""",
)
source_script: Optional[NWBFileGeneralSourceScript] = Field(
source_script: Optional[GeneralSourceScript] = Field(
None,
description="""Script file or link to public source code used to create this NWB file.""",
)
@ -284,10 +284,10 @@ class NWBFileGeneral(ConfiguredBaseModel):
None,
description="""Information about the animal or person from which the data was measured.""",
)
extracellular_ephys: Optional[NWBFileGeneralExtracellularEphys] = Field(
extracellular_ephys: Optional[GeneralExtracellularEphys] = Field(
None, description="""Metadata related to extracellular electrophysiology."""
)
intracellular_ephys: Optional[NWBFileGeneralIntracellularEphys] = Field(
intracellular_ephys: Optional[GeneralIntracellularEphys] = Field(
None, description="""Metadata related to intracellular electrophysiology."""
)
optogenetics: Optional[List[OptogeneticStimulusSite]] = Field(
@ -302,7 +302,7 @@ class NWBFileGeneral(ConfiguredBaseModel):
)
class NWBFileGeneralSourceScript(ConfiguredBaseModel):
class GeneralSourceScript(ConfiguredBaseModel):
"""
Script file or link to public source code used to create this NWB file.
"""
@ -315,7 +315,7 @@ class NWBFileGeneralSourceScript(ConfiguredBaseModel):
"linkml_meta": {"equals_string": "source_script", "ifabsent": "string(source_script)"}
},
)
file_name: Optional[str] = Field(None, description="""Name of script file.""")
file_name: str = Field(..., description="""Name of script file.""")
value: str = Field(...)
@ -335,7 +335,7 @@ class Subject(NWBContainer):
age: Optional[str] = Field(
None, description="""Age of subject. Can be supplied instead of 'date_of_birth'."""
)
date_of_birth: Optional[np.datetime64] = Field(
date_of_birth: Optional[datetime] = Field(
None, description="""Date of birth of subject. Can be supplied instead of 'age'."""
)
description: Optional[str] = Field(
@ -357,7 +357,7 @@ class Subject(NWBContainer):
)
class NWBFileGeneralExtracellularEphys(ConfiguredBaseModel):
class GeneralExtracellularEphys(ConfiguredBaseModel):
"""
Metadata related to extracellular electrophysiology.
"""
@ -376,12 +376,12 @@ class NWBFileGeneralExtracellularEphys(ConfiguredBaseModel):
electrode_group: Optional[List[ElectrodeGroup]] = Field(
None, description="""Physical group of electrodes."""
)
electrodes: Optional[NWBFileGeneralExtracellularEphysElectrodes] = Field(
electrodes: Optional[ExtracellularEphysElectrodes] = Field(
None, description="""A table of all electrodes (i.e. channels) used for recording."""
)
class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
class ExtracellularEphysElectrodes(DynamicTable):
"""
A table of all electrodes (i.e. channels) used for recording.
"""
@ -394,7 +394,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
"linkml_meta": {"equals_string": "electrodes", "ifabsent": "string(electrodes)"}
},
)
x: NDArray[Any, np.float32] = Field(
x: VectorData[NDArray[Any, float]] = Field(
...,
description="""x coordinate of the channel location in the brain (+x is posterior).""",
json_schema_extra={
@ -403,7 +403,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
}
},
)
y: NDArray[Any, np.float32] = Field(
y: VectorData[NDArray[Any, float]] = Field(
...,
description="""y coordinate of the channel location in the brain (+y is inferior).""",
json_schema_extra={
@ -412,7 +412,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
}
},
)
z: NDArray[Any, np.float32] = Field(
z: VectorData[NDArray[Any, float]] = Field(
...,
description="""z coordinate of the channel location in the brain (+z is right).""",
json_schema_extra={
@ -421,7 +421,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
}
},
)
imp: NDArray[Any, np.float32] = Field(
imp: VectorData[NDArray[Any, float]] = Field(
...,
description="""Impedance of the channel.""",
json_schema_extra={
@ -430,7 +430,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
}
},
)
location: NDArray[Any, str] = Field(
location: VectorData[NDArray[Any, str]] = Field(
...,
description="""Location of the electrode (channel). Specify the area, layer, comments on estimation of area/layer, stereotaxic coordinates if in vivo, etc. Use standard atlas names for anatomical regions when possible.""",
json_schema_extra={
@ -439,7 +439,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
}
},
)
filtering: NDArray[Any, np.float32] = Field(
filtering: VectorData[NDArray[Any, float]] = Field(
...,
description="""Description of hardware filtering.""",
json_schema_extra={
@@ -451,7 +451,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
group: List[ElectrodeGroup] = Field(
..., description="""Reference to the ElectrodeGroup this electrode is a part of."""
)
group_name: NDArray[Any, str] = Field(
group_name: VectorData[NDArray[Any, str]] = Field(
...,
description="""Name of the ElectrodeGroup this electrode is a part of.""",
json_schema_extra={
@@ -460,7 +460,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
}
},
)
rel_x: Optional[NDArray[Any, np.float32]] = Field(
rel_x: VectorData[Optional[NDArray[Any, float]]] = Field(
None,
description="""x coordinate in electrode group""",
json_schema_extra={
@@ -469,7 +469,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
}
},
)
rel_y: Optional[NDArray[Any, np.float32]] = Field(
rel_y: VectorData[Optional[NDArray[Any, float]]] = Field(
None,
description="""y coordinate in electrode group""",
json_schema_extra={
@@ -478,7 +478,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
}
},
)
rel_z: Optional[NDArray[Any, np.float32]] = Field(
rel_z: VectorData[Optional[NDArray[Any, float]]] = Field(
None,
description="""z coordinate in electrode group""",
json_schema_extra={
@@ -487,7 +487,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
}
},
)
reference: Optional[NDArray[Any, str]] = Field(
reference: VectorData[Optional[NDArray[Any, str]]] = Field(
None,
description="""Description of the reference used for this electrode.""",
json_schema_extra={
@@ -496,14 +496,12 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
}
},
)
colnames: Optional[str] = Field(
None,
colnames: List[str] = Field(
...,
description="""The names of the columns in this table. This should be used to specify an order to the columns.""",
)
description: Optional[str] = Field(
None, description="""Description of what is in this dynamic table."""
)
id: NDArray[Shape["* num_rows"], int] = Field(
description: str = Field(..., description="""Description of what is in this dynamic table.""")
id: VectorData[NDArray[Shape["* num_rows"], int]] = Field(
...,
description="""Array of unique identifiers for the rows of this dynamic table.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_rows"}]}}},
@@ -516,7 +514,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
)
class NWBFileGeneralIntracellularEphys(ConfiguredBaseModel):
class GeneralIntracellularEphys(ConfiguredBaseModel):
"""
Metadata related to intracellular electrophysiology.
"""
@@ -544,13 +542,43 @@ class NWBFileGeneralIntracellularEphys(ConfiguredBaseModel):
)
class NWBFileIntervals(ConfiguredBaseModel):
"""
Experimental intervals, whether that be logically distinct sub-experiments having a particular scientific goal, trials (see trials subgroup) during an experiment, or epochs (see epochs subgroup) deriving from analysis of data.
"""
linkml_meta: ClassVar[LinkMLMeta] = LinkMLMeta({"from_schema": "core.nwb.file"})
name: Literal["intervals"] = Field(
"intervals",
json_schema_extra={
"linkml_meta": {"equals_string": "intervals", "ifabsent": "string(intervals)"}
},
)
epochs: Optional[TimeIntervals] = Field(
None,
description="""Divisions in time marking experimental stages or sub-divisions of a single recording session.""",
)
trials: Optional[TimeIntervals] = Field(
None, description="""Repeated experimental events that have a logical grouping."""
)
invalid_times: Optional[TimeIntervals] = Field(
None, description="""Time intervals that should be removed from analysis."""
)
time_intervals: Optional[List[TimeIntervals]] = Field(
None,
description="""Optional additional table(s) for describing other experimental time intervals.""",
)
# Model rebuild
# see https://pydantic-docs.helpmanual.io/usage/models/#rebuilding-a-model
NWBFile.model_rebuild()
NWBFileStimulus.model_rebuild()
NWBFileGeneral.model_rebuild()
NWBFileGeneralSourceScript.model_rebuild()
GeneralSourceScript.model_rebuild()
Subject.model_rebuild()
NWBFileGeneralExtracellularEphys.model_rebuild()
NWBFileGeneralExtracellularEphysElectrodes.model_rebuild()
NWBFileGeneralIntracellularEphys.model_rebuild()
GeneralExtracellularEphys.model_rebuild()
ExtracellularEphysElectrodes.model_rebuild()
GeneralIntracellularEphys.model_rebuild()
NWBFileIntervals.model_rebuild()


@@ -11,6 +11,7 @@ from ...core.v2_2_0.core_nwb_base import (
TimeSeriesSync,
NWBContainer,
)
from ...core.v2_2_0.core_nwb_device import Device
from typing import Any, ClassVar, List, Literal, Dict, Optional, Union, Annotated, Type, TypeVar
from pydantic import (
BaseModel,
@@ -42,6 +43,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@@ -67,7 +77,7 @@ ModelType = TypeVar("ModelType", bound=Type[BaseModel])
def _get_name(item: ModelType | dict, info: ValidationInfo) -> Union[ModelType, dict]:
"""Get the name of the slot that refers to this object"""
assert isinstance(item, (BaseModel, dict))
assert isinstance(item, (BaseModel, dict)), f"{item} was not a BaseModel or a dict!"
name = info.field_name
if isinstance(item, BaseModel):
item.name = name
@@ -106,32 +116,46 @@ class PatchClampSeries(TimeSeries):
)
name: str = Field(...)
stimulus_description: Optional[str] = Field(
None, description="""Protocol/stimulus name for this patch-clamp dataset."""
stimulus_description: str = Field(
..., description="""Protocol/stimulus name for this patch-clamp dataset."""
)
sweep_number: Optional[np.uint32] = Field(
sweep_number: Optional[int] = Field(
None, description="""Sweep number, allows to group different PatchClampSeries together."""
)
data: PatchClampSeriesData = Field(..., description="""Recorded voltage or current.""")
gain: Optional[np.float32] = Field(
gain: Optional[float] = Field(
None,
description="""Gain of the recording, in units Volt/Amp (v-clamp) or Volt/Volt (c-clamp).""",
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
electrode: Union[IntracellularElectrode, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "IntracellularElectrode"}, {"range": "string"}],
}
},
)
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -160,11 +184,11 @@ class PatchClampSeriesData(ConfiguredBaseModel):
"data",
json_schema_extra={"linkml_meta": {"equals_string": "data", "ifabsent": "string(data)"}},
)
unit: Optional[str] = Field(
None,
unit: str = Field(
...,
description="""Base unit of measurement for working with the data. Actual stored values are not necessarily stored in these units. To access the data in these units, multiply 'data' by 'conversion'.""",
)
array: Optional[NDArray[Shape["* num_times"], np.number]] = Field(
value: Optional[NDArray[Shape["* num_times"], float]] = Field(
None, json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}}
)
@@ -180,36 +204,50 @@ class CurrentClampSeries(PatchClampSeries):
name: str = Field(...)
data: CurrentClampSeriesData = Field(..., description="""Recorded voltage.""")
bias_current: Optional[np.float32] = Field(None, description="""Bias current, in amps.""")
bridge_balance: Optional[np.float32] = Field(None, description="""Bridge balance, in ohms.""")
capacitance_compensation: Optional[np.float32] = Field(
bias_current: Optional[float] = Field(None, description="""Bias current, in amps.""")
bridge_balance: Optional[float] = Field(None, description="""Bridge balance, in ohms.""")
capacitance_compensation: Optional[float] = Field(
None, description="""Capacitance compensation, in farads."""
)
stimulus_description: Optional[str] = Field(
None, description="""Protocol/stimulus name for this patch-clamp dataset."""
stimulus_description: str = Field(
..., description="""Protocol/stimulus name for this patch-clamp dataset."""
)
sweep_number: Optional[np.uint32] = Field(
sweep_number: Optional[int] = Field(
None, description="""Sweep number, allows to group different PatchClampSeries together."""
)
gain: Optional[np.float32] = Field(
gain: Optional[float] = Field(
None,
description="""Gain of the recording, in units Volt/Amp (v-clamp) or Volt/Volt (c-clamp).""",
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
electrode: Union[IntracellularElectrode, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "IntracellularElectrode"}, {"range": "string"}],
}
},
)
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -238,9 +276,10 @@ class CurrentClampSeriesData(ConfiguredBaseModel):
"data",
json_schema_extra={"linkml_meta": {"equals_string": "data", "ifabsent": "string(data)"}},
)
unit: Optional[str] = Field(
None,
unit: Literal["volts"] = Field(
"volts",
description="""Base unit of measurement for working with the data. which is fixed to 'volts'. Actual stored values are not necessarily stored in these units. To access the data in these units, multiply 'data' by 'conversion'.""",
json_schema_extra={"linkml_meta": {"equals_string": "volts", "ifabsent": "string(volts)"}},
)
value: Any = Field(...)
@@ -255,39 +294,51 @@ class IZeroClampSeries(CurrentClampSeries):
)
name: str = Field(...)
bias_current: np.float32 = Field(..., description="""Bias current, in amps, fixed to 0.0.""")
bridge_balance: np.float32 = Field(
..., description="""Bridge balance, in ohms, fixed to 0.0."""
)
capacitance_compensation: np.float32 = Field(
bias_current: float = Field(..., description="""Bias current, in amps, fixed to 0.0.""")
bridge_balance: float = Field(..., description="""Bridge balance, in ohms, fixed to 0.0.""")
capacitance_compensation: float = Field(
..., description="""Capacitance compensation, in farads, fixed to 0.0."""
)
data: CurrentClampSeriesData = Field(..., description="""Recorded voltage.""")
stimulus_description: Optional[str] = Field(
None, description="""Protocol/stimulus name for this patch-clamp dataset."""
stimulus_description: str = Field(
..., description="""Protocol/stimulus name for this patch-clamp dataset."""
)
sweep_number: Optional[np.uint32] = Field(
sweep_number: Optional[int] = Field(
None, description="""Sweep number, allows to group different PatchClampSeries together."""
)
gain: Optional[np.float32] = Field(
gain: Optional[float] = Field(
None,
description="""Gain of the recording, in units Volt/Amp (v-clamp) or Volt/Volt (c-clamp).""",
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
electrode: Union[IntracellularElectrode, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "IntracellularElectrode"}, {"range": "string"}],
}
},
)
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -316,31 +367,45 @@ class CurrentClampStimulusSeries(PatchClampSeries):
name: str = Field(...)
data: CurrentClampStimulusSeriesData = Field(..., description="""Stimulus current applied.""")
stimulus_description: Optional[str] = Field(
None, description="""Protocol/stimulus name for this patch-clamp dataset."""
stimulus_description: str = Field(
..., description="""Protocol/stimulus name for this patch-clamp dataset."""
)
sweep_number: Optional[np.uint32] = Field(
sweep_number: Optional[int] = Field(
None, description="""Sweep number, allows to group different PatchClampSeries together."""
)
gain: Optional[np.float32] = Field(
gain: Optional[float] = Field(
None,
description="""Gain of the recording, in units Volt/Amp (v-clamp) or Volt/Volt (c-clamp).""",
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
electrode: Union[IntracellularElectrode, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "IntracellularElectrode"}, {"range": "string"}],
}
},
)
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -369,9 +434,12 @@ class CurrentClampStimulusSeriesData(ConfiguredBaseModel):
"data",
json_schema_extra={"linkml_meta": {"equals_string": "data", "ifabsent": "string(data)"}},
)
unit: Optional[str] = Field(
None,
unit: Literal["amperes"] = Field(
"amperes",
description="""Base unit of measurement for working with the data. which is fixed to 'amperes'. Actual stored values are not necessarily stored in these units. To access the data in these units, multiply 'data' by 'conversion'.""",
json_schema_extra={
"linkml_meta": {"equals_string": "amperes", "ifabsent": "string(amperes)"}
},
)
value: Any = Field(...)
@@ -408,31 +476,45 @@ class VoltageClampSeries(PatchClampSeries):
whole_cell_series_resistance_comp: Optional[VoltageClampSeriesWholeCellSeriesResistanceComp] = (
Field(None, description="""Whole cell series resistance compensation, in ohms.""")
)
stimulus_description: Optional[str] = Field(
None, description="""Protocol/stimulus name for this patch-clamp dataset."""
stimulus_description: str = Field(
..., description="""Protocol/stimulus name for this patch-clamp dataset."""
)
sweep_number: Optional[np.uint32] = Field(
sweep_number: Optional[int] = Field(
None, description="""Sweep number, allows to group different PatchClampSeries together."""
)
gain: Optional[np.float32] = Field(
gain: Optional[float] = Field(
None,
description="""Gain of the recording, in units Volt/Amp (v-clamp) or Volt/Volt (c-clamp).""",
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
electrode: Union[IntracellularElectrode, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "IntracellularElectrode"}, {"range": "string"}],
}
},
)
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -461,9 +543,12 @@ class VoltageClampSeriesData(ConfiguredBaseModel):
"data",
json_schema_extra={"linkml_meta": {"equals_string": "data", "ifabsent": "string(data)"}},
)
unit: Optional[str] = Field(
None,
unit: Literal["amperes"] = Field(
"amperes",
description="""Base unit of measurement for working with the data. which is fixed to 'amperes'. Actual stored values are not necessarily stored in these units. To access the data in these units, multiply 'data' by 'conversion'.""",
json_schema_extra={
"linkml_meta": {"equals_string": "amperes", "ifabsent": "string(amperes)"}
},
)
value: Any = Field(...)
@@ -484,11 +569,14 @@ class VoltageClampSeriesCapacitanceFast(ConfiguredBaseModel):
}
},
)
unit: Optional[str] = Field(
None,
unit: Literal["farads"] = Field(
"farads",
description="""Unit of measurement for capacitance_fast, which is fixed to 'farads'.""",
json_schema_extra={
"linkml_meta": {"equals_string": "farads", "ifabsent": "string(farads)"}
},
)
value: np.float32 = Field(...)
value: float = Field(...)
class VoltageClampSeriesCapacitanceSlow(ConfiguredBaseModel):
@@ -507,11 +595,14 @@ class VoltageClampSeriesCapacitanceSlow(ConfiguredBaseModel):
}
},
)
unit: Optional[str] = Field(
None,
unit: Literal["farads"] = Field(
"farads",
description="""Unit of measurement for capacitance_fast, which is fixed to 'farads'.""",
json_schema_extra={
"linkml_meta": {"equals_string": "farads", "ifabsent": "string(farads)"}
},
)
value: np.float32 = Field(...)
value: float = Field(...)
class VoltageClampSeriesResistanceCompBandwidth(ConfiguredBaseModel):
@@ -530,11 +621,12 @@ class VoltageClampSeriesResistanceCompBandwidth(ConfiguredBaseModel):
}
},
)
unit: Optional[str] = Field(
None,
unit: Literal["hertz"] = Field(
"hertz",
description="""Unit of measurement for resistance_comp_bandwidth, which is fixed to 'hertz'.""",
json_schema_extra={"linkml_meta": {"equals_string": "hertz", "ifabsent": "string(hertz)"}},
)
value: np.float32 = Field(...)
value: float = Field(...)
class VoltageClampSeriesResistanceCompCorrection(ConfiguredBaseModel):
@@ -553,11 +645,14 @@ class VoltageClampSeriesResistanceCompCorrection(ConfiguredBaseModel):
}
},
)
unit: Optional[str] = Field(
None,
unit: Literal["percent"] = Field(
"percent",
description="""Unit of measurement for resistance_comp_correction, which is fixed to 'percent'.""",
json_schema_extra={
"linkml_meta": {"equals_string": "percent", "ifabsent": "string(percent)"}
},
)
value: np.float32 = Field(...)
value: float = Field(...)
class VoltageClampSeriesResistanceCompPrediction(ConfiguredBaseModel):
@@ -576,11 +671,14 @@ class VoltageClampSeriesResistanceCompPrediction(ConfiguredBaseModel):
}
},
)
unit: Optional[str] = Field(
None,
unit: Literal["percent"] = Field(
"percent",
description="""Unit of measurement for resistance_comp_prediction, which is fixed to 'percent'.""",
json_schema_extra={
"linkml_meta": {"equals_string": "percent", "ifabsent": "string(percent)"}
},
)
value: np.float32 = Field(...)
value: float = Field(...)
class VoltageClampSeriesWholeCellCapacitanceComp(ConfiguredBaseModel):
@@ -599,11 +697,14 @@ class VoltageClampSeriesWholeCellCapacitanceComp(ConfiguredBaseModel):
}
},
)
unit: Optional[str] = Field(
None,
unit: Literal["farads"] = Field(
"farads",
description="""Unit of measurement for whole_cell_capacitance_comp, which is fixed to 'farads'.""",
json_schema_extra={
"linkml_meta": {"equals_string": "farads", "ifabsent": "string(farads)"}
},
)
value: np.float32 = Field(...)
value: float = Field(...)
class VoltageClampSeriesWholeCellSeriesResistanceComp(ConfiguredBaseModel):
@@ -622,11 +723,12 @@ class VoltageClampSeriesWholeCellSeriesResistanceComp(ConfiguredBaseModel):
}
},
)
unit: Optional[str] = Field(
None,
unit: Literal["ohms"] = Field(
"ohms",
description="""Unit of measurement for whole_cell_series_resistance_comp, which is fixed to 'ohms'.""",
json_schema_extra={"linkml_meta": {"equals_string": "ohms", "ifabsent": "string(ohms)"}},
)
value: np.float32 = Field(...)
value: float = Field(...)
class VoltageClampStimulusSeries(PatchClampSeries):
@@ -640,31 +742,45 @@ class VoltageClampStimulusSeries(PatchClampSeries):
name: str = Field(...)
data: VoltageClampStimulusSeriesData = Field(..., description="""Stimulus voltage applied.""")
stimulus_description: Optional[str] = Field(
None, description="""Protocol/stimulus name for this patch-clamp dataset."""
stimulus_description: str = Field(
..., description="""Protocol/stimulus name for this patch-clamp dataset."""
)
sweep_number: Optional[np.uint32] = Field(
sweep_number: Optional[int] = Field(
None, description="""Sweep number, allows to group different PatchClampSeries together."""
)
gain: Optional[np.float32] = Field(
gain: Optional[float] = Field(
None,
description="""Gain of the recording, in units Volt/Amp (v-clamp) or Volt/Volt (c-clamp).""",
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
electrode: Union[IntracellularElectrode, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "IntracellularElectrode"}, {"range": "string"}],
}
},
)
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -693,9 +809,10 @@ class VoltageClampStimulusSeriesData(ConfiguredBaseModel):
"data",
json_schema_extra={"linkml_meta": {"equals_string": "data", "ifabsent": "string(data)"}},
)
unit: Optional[str] = Field(
None,
unit: Literal["volts"] = Field(
"volts",
description="""Base unit of measurement for working with the data. which is fixed to 'volts'. Actual stored values are not necessarily stored in these units. To access the data in these units, multiply 'data' by 'conversion'.""",
json_schema_extra={"linkml_meta": {"equals_string": "volts", "ifabsent": "string(volts)"}},
)
value: Any = Field(...)
@@ -726,6 +843,15 @@ class IntracellularElectrode(NWBContainer):
slice: Optional[str] = Field(
None, description="""Information about slice used for recording."""
)
device: Union[Device, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "Device"}, {"range": "string"}],
}
},
)
class SweepTable(DynamicTable):
@@ -738,7 +864,7 @@ class SweepTable(DynamicTable):
)
name: str = Field(...)
sweep_number: NDArray[Any, np.uint32] = Field(
sweep_number: VectorData[NDArray[Any, int]] = Field(
...,
description="""Sweep number of the PatchClampSeries in that row.""",
json_schema_extra={
@@ -754,17 +880,20 @@ class SweepTable(DynamicTable):
...,
description="""Index for series.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
colnames: Optional[str] = Field(
None,
colnames: List[str] = Field(
...,
description="""The names of the columns in this table. This should be used to specify an order to the columns.""",
)
description: Optional[str] = Field(
None, description="""Description of what is in this dynamic table."""
)
id: NDArray[Shape["* num_rows"], int] = Field(
description: str = Field(..., description="""Description of what is in this dynamic table.""")
id: VectorData[NDArray[Shape["* num_rows"], int]] = Field(
...,
description="""Array of unique identifiers for the rows of this dynamic table.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_rows"}]}}},


@@ -28,6 +28,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@ -71,15 +80,15 @@ class GrayscaleImage(Image):
)
name: str = Field(...)
resolution: Optional[np.float32] = Field(
resolution: Optional[float] = Field(
None, description="""Pixel resolution of the image, in pixels per centimeter."""
)
description: Optional[str] = Field(None, description="""Description of the image.""")
array: Optional[
value: Optional[
Union[
NDArray[Shape["* x, * y"], np.number],
NDArray[Shape["* x, * y, 3 r_g_b"], np.number],
NDArray[Shape["* x, * y, 4 r_g_b_a"], np.number],
NDArray[Shape["* x, * y"], float],
NDArray[Shape["* x, * y, 3 r_g_b"], float],
NDArray[Shape["* x, * y, 4 r_g_b_a"], float],
]
] = Field(None)
@ -94,15 +103,15 @@ class RGBImage(Image):
)
name: str = Field(...)
resolution: Optional[np.float32] = Field(
resolution: Optional[float] = Field(
None, description="""Pixel resolution of the image, in pixels per centimeter."""
)
description: Optional[str] = Field(None, description="""Description of the image.""")
array: Optional[
value: Optional[
Union[
NDArray[Shape["* x, * y"], np.number],
NDArray[Shape["* x, * y, 3 r_g_b"], np.number],
NDArray[Shape["* x, * y, 4 r_g_b_a"], np.number],
NDArray[Shape["* x, * y"], float],
NDArray[Shape["* x, * y, 3 r_g_b"], float],
NDArray[Shape["* x, * y, 4 r_g_b_a"], float],
]
] = Field(None)
@ -117,15 +126,15 @@ class RGBAImage(Image):
)
name: str = Field(...)
resolution: Optional[np.float32] = Field(
resolution: Optional[float] = Field(
None, description="""Pixel resolution of the image, in pixels per centimeter."""
)
description: Optional[str] = Field(None, description="""Description of the image.""")
array: Optional[
value: Optional[
Union[
NDArray[Shape["* x, * y"], np.number],
NDArray[Shape["* x, * y, 3 r_g_b"], np.number],
NDArray[Shape["* x, * y, 4 r_g_b_a"], np.number],
NDArray[Shape["* x, * y"], float],
NDArray[Shape["* x, * y, 3 r_g_b"], float],
NDArray[Shape["* x, * y, 4 r_g_b_a"], float],
]
] = Field(None)
@ -142,11 +151,11 @@ class ImageSeries(TimeSeries):
name: str = Field(...)
data: Optional[
Union[
NDArray[Shape["* frame, * x, * y"], np.number],
NDArray[Shape["* frame, * x, * y, * z"], np.number],
NDArray[Shape["* frame, * x, * y"], float],
NDArray[Shape["* frame, * x, * y, * z"], float],
]
] = Field(None, description="""Binary data representing images across frames.""")
dimension: Optional[NDArray[Shape["* rank"], np.int32]] = Field(
dimension: Optional[NDArray[Shape["* rank"], int]] = Field(
None,
description="""Number of pixels on x, y, (and z) axes.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "rank"}]}}},
@ -159,21 +168,26 @@ class ImageSeries(TimeSeries):
None,
description="""Format of image. If this is 'external', then the attribute 'external_file' contains the path information to the image files. If this is 'raw', then the raw (single-channel) binary data is stored in the 'data' dataset. If this attribute is not present, then the default format='raw' case is assumed.""",
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@ -204,11 +218,11 @@ class ImageSeriesExternalFile(ConfiguredBaseModel):
"linkml_meta": {"equals_string": "external_file", "ifabsent": "string(external_file)"}
},
)
starting_frame: Optional[np.int32] = Field(
None,
starting_frame: List[int] = Field(
...,
description="""Each external image may contain one or more consecutive frames of the full ImageSeries. This attribute serves as an index to indicate which frames each file contains, to facilitate random access. The 'starting_frame' attribute, hence, contains a list of frame numbers within the full ImageSeries of the first frame of each file listed in the parent 'external_file' dataset. Zero-based indexing is used (hence, the first element will always be zero). For example, if the 'external_file' dataset has three paths to files and the first file has 5 frames, the second file has 10 frames, and the third file has 20 frames, then this attribute will have values [0, 5, 15]. If there is a single external file that holds all of the frames of the ImageSeries (and so there is a single element in the 'external_file' dataset), then this attribute should have value [0].""",
)
array: Optional[NDArray[Shape["* num_files"], str]] = Field(
value: Optional[NDArray[Shape["* num_files"], str]] = Field(
None, json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_files"}]}}}
)
@ -223,13 +237,22 @@ class ImageMaskSeries(ImageSeries):
)
name: str = Field(...)
masked_imageseries: Union[ImageSeries, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "ImageSeries"}, {"range": "string"}],
}
},
)
data: Optional[
Union[
NDArray[Shape["* frame, * x, * y"], np.number],
NDArray[Shape["* frame, * x, * y, * z"], np.number],
NDArray[Shape["* frame, * x, * y"], float],
NDArray[Shape["* frame, * x, * y, * z"], float],
]
] = Field(None, description="""Binary data representing images across frames.""")
dimension: Optional[NDArray[Shape["* rank"], np.int32]] = Field(
dimension: Optional[NDArray[Shape["* rank"], int]] = Field(
None,
description="""Number of pixels on x, y, (and z) axes.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "rank"}]}}},
@ -242,21 +265,26 @@ class ImageMaskSeries(ImageSeries):
None,
description="""Format of image. If this is 'external', then the attribute 'external_file' contains the path information to the image files. If this is 'raw', then the raw (single-channel) binary data is stored in the 'data' dataset. If this attribute is not present, then the default format='raw' case is assumed.""",
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@ -284,13 +312,12 @@ class OpticalSeries(ImageSeries):
)
name: str = Field(...)
distance: Optional[np.float32] = Field(
distance: Optional[float] = Field(
None, description="""Distance from camera/monitor to target/eye."""
)
field_of_view: Optional[
Union[
NDArray[Shape["2 width_height"], np.float32],
NDArray[Shape["3 width_height_depth"], np.float32],
NDArray[Shape["2 width_height"], float], NDArray[Shape["3 width_height_depth"], float]
]
] = Field(None, description="""Width, height and depth of image, or imaged area, in meters.""")
orientation: Optional[str] = Field(
@ -299,11 +326,11 @@ class OpticalSeries(ImageSeries):
)
data: Optional[
Union[
NDArray[Shape["* frame, * x, * y"], np.number],
NDArray[Shape["* frame, * x, * y, * z"], np.number],
NDArray[Shape["* frame, * x, * y"], float],
NDArray[Shape["* frame, * x, * y, * z"], float],
]
] = Field(None, description="""Binary data representing images across frames.""")
dimension: Optional[NDArray[Shape["* rank"], np.int32]] = Field(
dimension: Optional[NDArray[Shape["* rank"], int]] = Field(
None,
description="""Number of pixels on x, y, (and z) axes.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "rank"}]}}},
@ -316,21 +343,26 @@ class OpticalSeries(ImageSeries):
None,
description="""Format of image. If this is 'external', then the attribute 'external_file' contains the path information to the image files. If this is 'raw', then the raw (single-channel) binary data is stored in the 'data' dataset. If this attribute is not present, then the default format='raw' case is assumed.""",
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@ -358,26 +390,40 @@ class IndexSeries(TimeSeries):
)
name: str = Field(...)
data: NDArray[Shape["* num_times"], np.int32] = Field(
data: NDArray[Shape["* num_times"], int] = Field(
...,
description="""Index of the frame in the referenced ImageSeries.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
indexed_timeseries: Union[ImageSeries, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "ImageSeries"}, {"range": "string"}],
}
},
)
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},


@ -43,6 +43,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@ -68,7 +77,7 @@ ModelType = TypeVar("ModelType", bound=Type[BaseModel])
def _get_name(item: ModelType | dict, info: ValidationInfo) -> Union[ModelType, dict]:
"""Get the name of the slot that refers to this object"""
assert isinstance(item, (BaseModel, dict))
assert isinstance(item, (BaseModel, dict)), f"{item} was not a BaseModel or a dict!"
name = info.field_name
if isinstance(item, BaseModel):
item.name = name
@ -120,21 +129,26 @@ class AbstractFeatureSeries(TimeSeries):
description="""Description of the features represented in TimeSeries::data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_features"}]}}},
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@ -164,13 +178,14 @@ class AbstractFeatureSeriesData(ConfiguredBaseModel):
json_schema_extra={"linkml_meta": {"equals_string": "data", "ifabsent": "string(data)"}},
)
unit: Optional[str] = Field(
None,
"see 'feature_units'",
description="""Since there can be different units for different features, store the units in 'feature_units'. The default value for this attribute is \"see 'feature_units'\".""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(see 'feature_units')"}},
)
array: Optional[
value: Optional[
Union[
NDArray[Shape["* num_times"], np.number],
NDArray[Shape["* num_times, * num_features"], np.number],
NDArray[Shape["* num_times"], float],
NDArray[Shape["* num_times, * num_features"], float],
]
] = Field(None)
@ -190,21 +205,26 @@ class AnnotationSeries(TimeSeries):
description="""Annotations made during an experiment.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@ -232,26 +252,31 @@ class IntervalSeries(TimeSeries):
)
name: str = Field(...)
data: NDArray[Shape["* num_times"], np.int8] = Field(
data: NDArray[Shape["* num_times"], int] = Field(
...,
description="""Use values >0 if interval started, <0 if interval ended.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@ -287,21 +312,35 @@ class DecompositionSeries(TimeSeries):
...,
description="""Table for describing the bands that this series was generated from. There should be one row in this table for each band.""",
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
comments: Optional[str] = Field(
source_timeseries: Optional[Union[TimeSeries, str]] = Field(
None,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "TimeSeries"}, {"range": "string"}],
}
},
)
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@ -330,11 +369,12 @@ class DecompositionSeriesData(ConfiguredBaseModel):
"data",
json_schema_extra={"linkml_meta": {"equals_string": "data", "ifabsent": "string(data)"}},
)
unit: Optional[str] = Field(
None,
unit: str = Field(
"no unit",
description="""Base unit of measurement for working with the data. Actual stored values are not necessarily stored in these units. To access the data in these units, multiply 'data' by 'conversion'.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no unit)"}},
)
array: Optional[NDArray[Shape["* num_times, * num_channels, * num_bands"], np.number]] = Field(
value: Optional[NDArray[Shape["* num_times, * num_channels, * num_bands"], float]] = Field(
None,
json_schema_extra={
"linkml_meta": {
@ -361,7 +401,7 @@ class DecompositionSeriesBands(DynamicTable):
"bands",
json_schema_extra={"linkml_meta": {"equals_string": "bands", "ifabsent": "string(bands)"}},
)
band_name: NDArray[Any, str] = Field(
band_name: VectorData[NDArray[Any, str]] = Field(
...,
description="""Name of the band, e.g. theta.""",
json_schema_extra={
@ -370,7 +410,7 @@ class DecompositionSeriesBands(DynamicTable):
}
},
)
band_limits: NDArray[Shape["* num_bands, 2 low_high"], np.float32] = Field(
band_limits: VectorData[NDArray[Shape["* num_bands, 2 low_high"], float]] = Field(
...,
description="""Low and high limit of each band in Hz. If it is a Gaussian filter, use 2 SD on either side of the center.""",
json_schema_extra={
@ -384,24 +424,22 @@ class DecompositionSeriesBands(DynamicTable):
}
},
)
band_mean: NDArray[Shape["* num_bands"], np.float32] = Field(
band_mean: VectorData[NDArray[Shape["* num_bands"], float]] = Field(
...,
description="""The mean Gaussian filters, in Hz.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_bands"}]}}},
)
band_stdev: NDArray[Shape["* num_bands"], np.float32] = Field(
band_stdev: VectorData[NDArray[Shape["* num_bands"], float]] = Field(
...,
description="""The standard deviation of Gaussian filters, in Hz.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_bands"}]}}},
)
colnames: Optional[str] = Field(
None,
colnames: List[str] = Field(
...,
description="""The names of the columns in this table. This should be used to specify an order to the columns.""",
)
description: Optional[str] = Field(
None, description="""Description of what is in this dynamic table."""
)
id: NDArray[Shape["* num_rows"], int] = Field(
description: str = Field(..., description="""Description of what is in this dynamic table.""")
id: VectorData[NDArray[Shape["* num_rows"], int]] = Field(
...,
description="""Array of unique identifiers for the rows of this dynamic table.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_rows"}]}}},
@ -428,7 +466,12 @@ class Units(DynamicTable):
None,
description="""Index into the spike_times dataset.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
spike_times: Optional[UnitsSpikeTimes] = Field(
@ -438,10 +481,16 @@ class Units(DynamicTable):
None,
description="""Index into the obs_intervals dataset.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
obs_intervals: Optional[NDArray[Shape["* num_intervals, 2 start_end"], np.float64]] = Field(
obs_intervals: VectorData[Optional[NDArray[Shape["* num_intervals, 2 start_end"], float]]] = (
Field(
None,
description="""Observation intervals for each unit.""",
json_schema_extra={
@ -455,43 +504,56 @@ class Units(DynamicTable):
}
},
)
)
electrodes_index: Named[Optional[VectorIndex]] = Field(
None,
description="""Index into electrodes.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
electrodes: Named[Optional[DynamicTableRegion]] = Field(
None,
description="""Electrode that each spike unit came from, specified using a DynamicTableRegion.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
electrode_group: Optional[List[ElectrodeGroup]] = Field(
None, description="""Electrode group that each spike unit came from."""
)
waveform_mean: Optional[
waveform_mean: VectorData[
Optional[
Union[
NDArray[Shape["* num_units, * num_samples"], np.float32],
NDArray[Shape["* num_units, * num_samples, * num_electrodes"], np.float32],
NDArray[Shape["* num_units, * num_samples"], float],
NDArray[Shape["* num_units, * num_samples, * num_electrodes"], float],
]
]
] = Field(None, description="""Spike waveform mean for each spike unit.""")
waveform_sd: Optional[
waveform_sd: VectorData[
Optional[
Union[
NDArray[Shape["* num_units, * num_samples"], np.float32],
NDArray[Shape["* num_units, * num_samples, * num_electrodes"], np.float32],
NDArray[Shape["* num_units, * num_samples"], float],
NDArray[Shape["* num_units, * num_samples, * num_electrodes"], float],
]
]
] = Field(None, description="""Spike waveform standard deviation for each spike unit.""")
colnames: Optional[str] = Field(
None,
colnames: List[str] = Field(
...,
description="""The names of the columns in this table. This should be used to specify an order to the columns.""",
)
description: Optional[str] = Field(
None, description="""Description of what is in this dynamic table."""
)
id: NDArray[Shape["* num_rows"], int] = Field(
description: str = Field(..., description="""Description of what is in this dynamic table.""")
id: VectorData[NDArray[Shape["* num_rows"], int]] = Field(
...,
description="""Array of unique identifiers for the rows of this dynamic table.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_rows"}]}}},
@ -517,13 +579,11 @@ class UnitsSpikeTimes(VectorData):
"linkml_meta": {"equals_string": "spike_times", "ifabsent": "string(spike_times)"}
},
)
resolution: Optional[np.float64] = Field(
resolution: Optional[float] = Field(
None,
description="""The smallest possible difference between two spike times. Usually 1 divided by the acquisition sampling rate from which spike times were extracted, but could be larger if the acquisition time series was downsampled or smaller if the acquisition time series was smoothed/interpolated and it is possible for the spike time to be between samples.""",
)
description: Optional[str] = Field(
None, description="""Description of what these vectors represent."""
)
description: str = Field(..., description="""Description of what these vectors represent.""")
# Model rebuild
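The hunks above tighten `DynamicTable`'s contract: `colnames` becomes a required ordered list, `description` and `id` become required, and `id` is wrapped in `VectorData`. A toy stand-in (not the generated pydantic class) sketches what the ordered `colnames` buys for row access:

```python
# Toy stand-in for the stricter DynamicTable contract: an explicit column
# order (colnames), a required description, and a required per-row id column.
class ToyDynamicTable:
    def __init__(self, description: str, colnames: list, columns: dict, ids: list):
        self.description = description
        self.colnames = colnames      # explicit column order, now required
        self.columns = columns        # column name -> per-row values
        self.id = ids                 # unique row identifiers, now required

    def row(self, i: int) -> dict:
        # colnames fixes the iteration order, so rows come back consistently
        return {name: self.columns[name][i] for name in self.colnames}

units = ToyDynamicTable(
    description="spike units",
    colnames=["spike_times", "quality"],
    columns={"spike_times": [[0.1, 0.2], [0.3]], "quality": ["good", "mua"]},
    ids=[0, 1],
)
print(units.row(1))  # {'spike_times': [0.3], 'quality': 'mua'}
```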

View file

@@ -14,6 +14,7 @@ from ...core.v2_2_0.core_nwb_base import (
TimeSeriesSync,
NWBContainer,
)
from ...core.v2_2_0.core_nwb_device import Device
metamodel_version = "None"
version = "2.2.0"
@@ -33,6 +34,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
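The `__getitem__` added to `ConfiguredBaseModel` forwards indexing to whichever of `value` or `data` is populated. A minimal dependency-free sketch of that behavior (plain classes stand in for the generated pydantic models):

```python
# Sketch of the __getitem__ passthrough added to ConfiguredBaseModel.
# Plain classes stand in for the pydantic models so the behavior is easy to see.
from typing import Any, Union

class ConfiguredBase:
    def __getitem__(self, val: Union[int, slice]) -> Any:
        """Index into `value` or `data`, whichever is populated."""
        if getattr(self, "value", None) is not None:
            return self.value[val]
        elif getattr(self, "data", None) is not None:
            return self.data[val]
        else:
            raise KeyError("No value or data field to index from")

class WithData(ConfiguredBase):
    def __init__(self, data):
        self.value = None
        self.data = data

ts = WithData([0.0, 0.5, 1.0])
print(ts[1])     # indexing the model forwards to .data -> 0.5
print(ts[0:2])   # slices work the same way -> [0.0, 0.5]
```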
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@@ -76,26 +86,40 @@ class OptogeneticSeries(TimeSeries):
)
name: str = Field(...)
data: NDArray[Shape["* num_times"], np.number] = Field(
data: NDArray[Shape["* num_times"], float] = Field(
...,
description="""Applied power for optogenetic stimulus, in watts.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
site: Union[OptogeneticStimulusSite, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "OptogeneticStimulusSite"}, {"range": "string"}],
}
},
)
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -124,11 +148,20 @@ class OptogeneticStimulusSite(NWBContainer):
name: str = Field(...)
description: str = Field(..., description="""Description of stimulation site.""")
excitation_lambda: np.float32 = Field(..., description="""Excitation wavelength, in nm.""")
excitation_lambda: float = Field(..., description="""Excitation wavelength, in nm.""")
location: str = Field(
...,
description="""Location of the stimulation site. Specify the area, layer, comments on estimation of area/layer, stereotaxic coordinates if in vivo, etc. Use standard atlas names for anatomical regions when possible.""",
)
device: Union[Device, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "Device"}, {"range": "string"}],
}
},
)
# Model rebuild
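The new link slots such as `device: Union[Device, str]` can hold either the linked object itself or a string path to it. A hedged sketch of resolving such a link against a registry of objects (the helper and registry here are illustrative, not the nwb_linkml API):

```python
# Illustrative resolver for link slots typed Union[Device, str]: either the
# linked object is stored directly, or a string path that must be looked up.
from typing import Union

class Device:
    def __init__(self, name: str):
        self.name = name

def resolve_link(link: Union[Device, str], registry: dict) -> Device:
    """Return the Device, looking string links up by path in `registry`."""
    if isinstance(link, Device):
        return link
    return registry[link]

registry = {"/general/devices/laser0": Device("laser0")}
print(resolve_link("/general/devices/laser0", registry).name)  # laser0
```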

View file

@@ -17,6 +17,7 @@ from pydantic import (
BeforeValidator,
)
from ...hdmf_common.v1_1_0.hdmf_common_table import DynamicTableRegion, DynamicTable
from ...core.v2_2_0.core_nwb_device import Device
from numpydantic import NDArray, Shape
from ...core.v2_2_0.core_nwb_base import (
TimeSeriesStartingTime,
@@ -44,6 +45,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@@ -69,7 +79,7 @@ ModelType = TypeVar("ModelType", bound=Type[BaseModel])
def _get_name(item: ModelType | dict, info: ValidationInfo) -> Union[ModelType, dict]:
"""Get the name of the slot that refers to this object"""
assert isinstance(item, (BaseModel, dict))
assert isinstance(item, (BaseModel, dict)), f"{item} was not a BaseModel or a dict!"
name = info.field_name
if isinstance(item, BaseModel):
item.name = name
@@ -109,24 +119,30 @@ class TwoPhotonSeries(ImageSeries):
)
name: str = Field(...)
pmt_gain: Optional[np.float32] = Field(None, description="""Photomultiplier gain.""")
scan_line_rate: Optional[np.float32] = Field(
pmt_gain: Optional[float] = Field(None, description="""Photomultiplier gain.""")
scan_line_rate: Optional[float] = Field(
None,
description="""Lines imaged per second. This is also stored in /general/optophysiology but is kept here as it is useful information for analysis, and so good to be stored w/ the actual data.""",
)
field_of_view: Optional[
Union[
NDArray[Shape["2 width_height"], np.float32],
NDArray[Shape["3 width_height"], np.float32],
]
Union[NDArray[Shape["2 width_height"], float], NDArray[Shape["3 width_height"], float]]
] = Field(None, description="""Width, height and depth of image, or imaged area, in meters.""")
imaging_plane: Union[ImagingPlane, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "ImagingPlane"}, {"range": "string"}],
}
},
)
data: Optional[
Union[
NDArray[Shape["* frame, * x, * y"], np.number],
NDArray[Shape["* frame, * x, * y, * z"], np.number],
NDArray[Shape["* frame, * x, * y"], float],
NDArray[Shape["* frame, * x, * y, * z"], float],
]
] = Field(None, description="""Binary data representing images across frames.""")
dimension: Optional[NDArray[Shape["* rank"], np.int32]] = Field(
dimension: Optional[NDArray[Shape["* rank"], int]] = Field(
None,
description="""Number of pixels on x, y, (and z) axes.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "rank"}]}}},
@@ -139,21 +155,26 @@ class TwoPhotonSeries(ImageSeries):
None,
description="""Format of image. If this is 'external', then the attribute 'external_file' contains the path information to the image files. If this is 'raw', then the raw (single-channel) binary data is stored in the 'data' dataset. If this attribute is not present, then the default format='raw' case is assumed.""",
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -182,31 +203,40 @@ class RoiResponseSeries(TimeSeries):
name: str = Field(...)
data: Union[
NDArray[Shape["* num_times"], np.number],
NDArray[Shape["* num_times, * num_rois"], np.number],
NDArray[Shape["* num_times"], float], NDArray[Shape["* num_times, * num_rois"], float]
] = Field(..., description="""Signals from ROIs.""")
rois: Named[DynamicTableRegion] = Field(
...,
description="""DynamicTableRegion referencing into an ROITable containing information on the ROIs stored in this timeseries.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -233,7 +263,7 @@ class DfOverF(NWBDataInterface):
{"from_schema": "core.nwb.ophys", "tree_root": True}
)
children: Optional[List[RoiResponseSeries]] = Field(
value: Optional[List[RoiResponseSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "RoiResponseSeries"}]}}
)
name: str = Field(...)
@@ -248,7 +278,7 @@ class Fluorescence(NWBDataInterface):
{"from_schema": "core.nwb.ophys", "tree_root": True}
)
children: Optional[List[RoiResponseSeries]] = Field(
value: Optional[List[RoiResponseSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "RoiResponseSeries"}]}}
)
name: str = Field(...)
@@ -263,7 +293,7 @@ class ImageSegmentation(NWBDataInterface):
{"from_schema": "core.nwb.ophys", "tree_root": True}
)
children: Optional[List[DynamicTable]] = Field(
value: Optional[List[DynamicTable]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "DynamicTable"}]}}
)
name: str = Field(...)
@@ -280,8 +310,8 @@ class ImagingPlane(NWBContainer):
name: str = Field(...)
description: Optional[str] = Field(None, description="""Description of the imaging plane.""")
excitation_lambda: np.float32 = Field(..., description="""Excitation wavelength, in nm.""")
imaging_rate: np.float32 = Field(..., description="""Rate that images are acquired, in Hz.""")
excitation_lambda: float = Field(..., description="""Excitation wavelength, in nm.""")
imaging_rate: float = Field(..., description="""Rate that images are acquired, in Hz.""")
indicator: str = Field(..., description="""Calcium indicator.""")
location: str = Field(
...,
@@ -306,6 +336,15 @@ class ImagingPlane(NWBContainer):
optical_channel: OpticalChannel = Field(
..., description="""An optical channel used to record from an imaging plane."""
)
device: Union[Device, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "Device"}, {"range": "string"}],
}
},
)
class ImagingPlaneManifold(ConfiguredBaseModel):
@@ -321,18 +360,20 @@ class ImagingPlaneManifold(ConfiguredBaseModel):
"linkml_meta": {"equals_string": "manifold", "ifabsent": "string(manifold)"}
},
)
conversion: Optional[np.float32] = Field(
None,
conversion: Optional[float] = Field(
1.0,
description="""Scalar to multiply each element in data to convert it to the specified 'unit'. If the data are stored in acquisition system units or other units that require a conversion to be interpretable, multiply the data by 'conversion' to convert the data to the specified 'unit'. e.g. if the data acquisition system stores values in this object as pixels from x = -500 to 499, y = -500 to 499 that correspond to a 2 m x 2 m range, then the 'conversion' multiplier to get from raw data acquisition pixel units to meters is 2/1000.""",
json_schema_extra={"linkml_meta": {"ifabsent": "float(1.0)"}},
)
unit: Optional[str] = Field(
None,
"meters",
description="""Base unit of measurement for working with the data. The default value is 'meters'.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(meters)"}},
)
array: Optional[
value: Optional[
Union[
NDArray[Shape["* height, * width, 3 x_y_z"], np.float32],
NDArray[Shape["* height, * width, * depth, 3 x_y_z"], np.float32],
NDArray[Shape["* height, * width, 3 x_y_z"], float],
NDArray[Shape["* height, * width, * depth, 3 x_y_z"], float],
]
] = Field(None)
@@ -350,10 +391,12 @@ class ImagingPlaneOriginCoords(ConfiguredBaseModel):
"linkml_meta": {"equals_string": "origin_coords", "ifabsent": "string(origin_coords)"}
},
)
unit: Optional[str] = Field(
None, description="""Measurement units for origin_coords. The default value is 'meters'."""
unit: str = Field(
"meters",
description="""Measurement units for origin_coords. The default value is 'meters'.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(meters)"}},
)
array: Optional[NDArray[Shape["2 x_y, 3 x_y_z"], np.float32]] = Field(
value: Optional[NDArray[Shape["2 x_y, 3 x_y_z"], float]] = Field(
None,
json_schema_extra={
"linkml_meta": {
@@ -381,10 +424,12 @@ class ImagingPlaneGridSpacing(ConfiguredBaseModel):
"linkml_meta": {"equals_string": "grid_spacing", "ifabsent": "string(grid_spacing)"}
},
)
unit: Optional[str] = Field(
None, description="""Measurement units for grid_spacing. The default value is 'meters'."""
unit: str = Field(
"meters",
description="""Measurement units for grid_spacing. The default value is 'meters'.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(meters)"}},
)
array: Optional[NDArray[Shape["2 x_y, 3 x_y_z"], np.float32]] = Field(
value: Optional[NDArray[Shape["2 x_y, 3 x_y_z"], float]] = Field(
None,
json_schema_extra={
"linkml_meta": {
@@ -408,9 +453,7 @@ class OpticalChannel(NWBContainer):
name: str = Field(...)
description: str = Field(..., description="""Description or other notes about the channel.""")
emission_lambda: np.float32 = Field(
..., description="""Emission wavelength for channel, in nm."""
)
emission_lambda: float = Field(..., description="""Emission wavelength for channel, in nm.""")
class MotionCorrection(NWBDataInterface):
@@ -422,7 +465,7 @@ class MotionCorrection(NWBDataInterface):
{"from_schema": "core.nwb.ophys", "tree_root": True}
)
children: Optional[List[NWBDataInterface]] = Field(
value: Optional[List[NWBDataInterface]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "NWBDataInterface"}]}}
)
name: str = Field(...)
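The `Named[...]` slots in this file rely on the `_get_name` BeforeValidator to stamp each child with the name of the field it is assigned to. The same idea can be sketched without pydantic using Python's descriptor protocol (`__set_name__`); this is an analogy to illustrate the pattern, not the generated implementation:

```python
# Analogy for the "Named" slot pattern: a child object receives the name of
# the attribute it is assigned to. The generated models do this with a
# pydantic BeforeValidator (_get_name); plain descriptors show the same idea.
class NamedSlot:
    def __set_name__(self, owner, name):
        self.slot_name = name               # learn the attribute name at class creation

    def __set__(self, obj, value):
        value["name"] = self.slot_name      # mirror of `item.name = name`
        obj.__dict__[self.slot_name] = value

    def __get__(self, obj, objtype=None):
        return obj.__dict__[self.slot_name]

class RoiSeriesSketch:
    rois = NamedSlot()

series = RoiSeriesSketch()
series.rois = {"table": "electrodes", "region": [0, 1, 2]}
print(series.rois["name"])  # "rois" -- stamped from the attribute name
```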

View file

@@ -37,6 +37,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@@ -62,7 +71,7 @@ ModelType = TypeVar("ModelType", bound=Type[BaseModel])
def _get_name(item: ModelType | dict, info: ValidationInfo) -> Union[ModelType, dict]:
"""Get the name of the slot that refers to this object"""
assert isinstance(item, (BaseModel, dict))
assert isinstance(item, (BaseModel, dict)), f"{item} was not a BaseModel or a dict!"
name = info.field_name
if isinstance(item, BaseModel):
item.name = name
@@ -96,14 +105,12 @@ class RetinotopyMap(NWBData):
)
name: str = Field(...)
dimension: Optional[np.int32] = Field(
None,
dimension: List[int] = Field(
...,
description="""Number of rows and columns in the image. NOTE: row, column representation is equivalent to height, width.""",
)
field_of_view: Optional[np.float32] = Field(
None, description="""Size of viewing area, in meters."""
)
array: Optional[NDArray[Shape["* num_rows, * num_cols"], np.float32]] = Field(
field_of_view: List[float] = Field(..., description="""Size of viewing area, in meters.""")
value: Optional[NDArray[Shape["* num_rows, * num_cols"], float]] = Field(
None,
json_schema_extra={
"linkml_meta": {"array": {"dimensions": [{"alias": "num_rows"}, {"alias": "num_cols"}]}}
@@ -121,22 +128,18 @@ class AxisMap(RetinotopyMap):
)
name: str = Field(...)
unit: Optional[str] = Field(
None, description="""Unit that axis data is stored in (e.g., degrees)."""
)
array: Optional[NDArray[Shape["* num_rows, * num_cols"], np.float32]] = Field(
unit: str = Field(..., description="""Unit that axis data is stored in (e.g., degrees).""")
value: Optional[NDArray[Shape["* num_rows, * num_cols"], float]] = Field(
None,
json_schema_extra={
"linkml_meta": {"array": {"dimensions": [{"alias": "num_rows"}, {"alias": "num_cols"}]}}
},
)
dimension: Optional[np.int32] = Field(
None,
dimension: List[int] = Field(
...,
description="""Number of rows and columns in the image. NOTE: row, column representation is equivalent to height, width.""",
)
field_of_view: Optional[np.float32] = Field(
None, description="""Size of viewing area, in meters."""
)
field_of_view: List[float] = Field(..., description="""Size of viewing area, in meters.""")
class RetinotopyImage(GrayscaleImage):
@@ -149,29 +152,25 @@ class RetinotopyImage(GrayscaleImage):
)
name: str = Field(...)
bits_per_pixel: Optional[np.int32] = Field(
None,
bits_per_pixel: int = Field(
...,
description="""Number of bits used to represent each value. This is necessary to determine maximum (white) pixel value.""",
)
dimension: Optional[np.int32] = Field(
None,
dimension: List[int] = Field(
...,
description="""Number of rows and columns in the image. NOTE: row, column representation is equivalent to height, width.""",
)
field_of_view: Optional[np.float32] = Field(
None, description="""Size of viewing area, in meters."""
)
format: Optional[str] = Field(
None, description="""Format of image. Right now only 'raw' is supported."""
)
resolution: Optional[np.float32] = Field(
field_of_view: List[float] = Field(..., description="""Size of viewing area, in meters.""")
format: str = Field(..., description="""Format of image. Right now only 'raw' is supported.""")
resolution: Optional[float] = Field(
None, description="""Pixel resolution of the image, in pixels per centimeter."""
)
description: Optional[str] = Field(None, description="""Description of the image.""")
array: Optional[
value: Optional[
Union[
NDArray[Shape["* x, * y"], np.number],
NDArray[Shape["* x, * y, 3 r_g_b"], np.number],
NDArray[Shape["* x, * y, 4 r_g_b_a"], np.number],
NDArray[Shape["* x, * y"], float],
NDArray[Shape["* x, * y, 3 r_g_b"], float],
NDArray[Shape["* x, * y, 4 r_g_b_a"], float],
]
] = Field(None)
@@ -193,35 +192,60 @@ class ImagingRetinotopy(NWBDataInterface):
...,
description="""Phase response to stimulus on the first measured axis.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
axis_1_power_map: Named[Optional[AxisMap]] = Field(
None,
description="""Power response on the first measured axis. Response is scaled so 0.0 is no power in the response and 1.0 is maximum relative power.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
axis_2_phase_map: Named[AxisMap] = Field(
...,
description="""Phase response to stimulus on the second measured axis.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
axis_2_power_map: Named[Optional[AxisMap]] = Field(
None,
description="""Power response to stimulus on the second measured axis.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
sign_map: Named[RetinotopyMap] = Field(
...,
description="""Sine of the angle between the direction of the gradient in axis_1 and axis_2.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
axis_descriptions: NDArray[Shape["2 num_axes"], str] = Field(
@@ -241,7 +265,12 @@ class ImagingRetinotopy(NWBDataInterface):
...,
description="""Gray-scale anatomical image of cortical surface. Array structure: [rows][columns]""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
@@ -262,32 +291,26 @@ class ImagingRetinotopyFocalDepthImage(RetinotopyImage):
}
},
)
focal_depth: Optional[np.float32] = Field(
None, description="""Focal depth offset, in meters."""
)
bits_per_pixel: Optional[np.int32] = Field(
None,
focal_depth: float = Field(..., description="""Focal depth offset, in meters.""")
bits_per_pixel: int = Field(
...,
description="""Number of bits used to represent each value. This is necessary to determine maximum (white) pixel value.""",
)
dimension: Optional[np.int32] = Field(
None,
dimension: List[int] = Field(
...,
description="""Number of rows and columns in the image. NOTE: row, column representation is equivalent to height, width.""",
)
field_of_view: Optional[np.float32] = Field(
None, description="""Size of viewing area, in meters."""
)
format: Optional[str] = Field(
None, description="""Format of image. Right now only 'raw' is supported."""
)
resolution: Optional[np.float32] = Field(
field_of_view: List[float] = Field(..., description="""Size of viewing area, in meters.""")
format: str = Field(..., description="""Format of image. Right now only 'raw' is supported.""")
resolution: Optional[float] = Field(
None, description="""Pixel resolution of the image, in pixels per centimeter."""
)
description: Optional[str] = Field(None, description="""Description of the image.""")
array: Optional[
value: Optional[
Union[
NDArray[Shape["* x, * y"], np.number],
NDArray[Shape["* x, * y, 3 r_g_b"], np.number],
NDArray[Shape["* x, * y, 4 r_g_b_a"], np.number],
NDArray[Shape["* x, * y"], float],
NDArray[Shape["* x, * y, 3 r_g_b"], float],
NDArray[Shape["* x, * y, 4 r_g_b_a"], float],
]
] = Field(None)

View file

@@ -128,11 +128,12 @@ from ...core.v2_2_0.core_nwb_file import (
NWBFile,
NWBFileStimulus,
NWBFileGeneral,
NWBFileGeneralSourceScript,
GeneralSourceScript,
Subject,
NWBFileGeneralExtracellularEphys,
NWBFileGeneralExtracellularEphysElectrodes,
NWBFileGeneralIntracellularEphys,
GeneralExtracellularEphys,
ExtracellularEphysElectrodes,
GeneralIntracellularEphys,
NWBFileIntervals,
)
from ...core.v2_2_0.core_nwb_epoch import TimeIntervals, TimeIntervalsTimeseries
@@ -154,6 +155,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}

View file

@@ -28,6 +28,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@@ -83,15 +92,15 @@ class Image(NWBData):
)
name: str = Field(...)
resolution: Optional[np.float32] = Field(
resolution: Optional[float] = Field(
None, description="""Pixel resolution of the image, in pixels per centimeter."""
)
description: Optional[str] = Field(None, description="""Description of the image.""")
array: Optional[
value: Optional[
Union[
NDArray[Shape["* x, * y"], np.number],
NDArray[Shape["* x, * y, 3 r_g_b"], np.number],
NDArray[Shape["* x, * y, 4 r_g_b_a"], np.number],
NDArray[Shape["* x, * y"], float],
NDArray[Shape["* x, * y, 3 r_g_b"], float],
NDArray[Shape["* x, * y, 4 r_g_b_a"], float],
]
] = Field(None)
@@ -130,10 +139,15 @@ class TimeSeries(NWBDataInterface):
)
name: str = Field(...)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
data: TimeSeriesData = Field(
...,
@@ -143,12 +157,12 @@ class TimeSeries(NWBDataInterface):
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -177,19 +191,21 @@ class TimeSeriesData(ConfiguredBaseModel):
"data",
json_schema_extra={"linkml_meta": {"equals_string": "data", "ifabsent": "string(data)"}},
)
conversion: Optional[np.float32] = Field(
None,
conversion: Optional[float] = Field(
1.0,
description="""Scalar to multiply each element in data to convert it to the specified 'unit'. If the data are stored in acquisition system units or other units that require a conversion to be interpretable, multiply the data by 'conversion' to convert the data to the specified 'unit'. e.g. if the data acquisition system stores values in this object as signed 16-bit integers (int16 range -32,768 to 32,767) that correspond to a 5V range (-2.5V to 2.5V), and the data acquisition system gain is 8000X, then the 'conversion' multiplier to get from raw data acquisition values to recorded volts is 2.5/32768/8000 = 9.5367e-9.""",
json_schema_extra={"linkml_meta": {"ifabsent": "float(1.0)"}},
)
resolution: Optional[np.float32] = Field(
None,
resolution: Optional[float] = Field(
-1.0,
description="""Smallest meaningful difference between values in data, stored in the specified by unit, e.g., the change in value of the least significant bit, or a larger number if signal noise is known to be present. If unknown, use -1.0.""",
json_schema_extra={"linkml_meta": {"ifabsent": "float(-1.0)"}},
)
unit: Optional[str] = Field(
None,
unit: str = Field(
...,
description="""Base unit of measurement for working with the data. Actual stored values are not necessarily stored in these units. To access the data in these units, multiply 'data' by 'conversion'.""",
)
array: Optional[
value: Optional[
Union[
NDArray[Shape["* num_times"], Any],
NDArray[Shape["* num_times, * num_dim2"], Any],
@@ -212,11 +228,15 @@ class TimeSeriesStartingTime(ConfiguredBaseModel):
"linkml_meta": {"equals_string": "starting_time", "ifabsent": "string(starting_time)"}
},
)
rate: Optional[np.float32] = Field(None, description="""Sampling rate, in Hz.""")
unit: Optional[str] = Field(
None, description="""Unit of measurement for time, which is fixed to 'seconds'."""
rate: float = Field(..., description="""Sampling rate, in Hz.""")
unit: Literal["seconds"] = Field(
"seconds",
description="""Unit of measurement for time, which is fixed to 'seconds'.""",
json_schema_extra={
"linkml_meta": {"equals_string": "seconds", "ifabsent": "string(seconds)"}
},
)
value: np.float64 = Field(...)
value: float = Field(...)
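The `conversion` docstring above works through a concrete case: signed int16 samples spanning a ±2.5 V range behind an 8000x gain, giving a multiplier of 2.5/32768/8000. A quick check of that arithmetic:

```python
# Verifying the worked example in the `conversion` docstring: int16 raw values
# over a ±2.5 V range with 8000x amplifier gain.
conversion = 2.5 / 32768 / 8000          # volts per raw unit, ≈ 9.5367e-9
raw = [-32768, 0, 32767]                  # raw acquisition values (int16 range)
volts = [r * conversion for r in raw]     # multiply by `conversion` to get volts
print(volts[0])  # ≈ -3.125e-4, i.e. -2.5 V / 8000 at the amplifier input
```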
class TimeSeriesSync(ConfiguredBaseModel):
@@ -241,7 +261,7 @@ class ProcessingModule(NWBContainer):
{"from_schema": "core.nwb.base", "tree_root": True}
)
children: Optional[List[Union[DynamicTable, NWBDataInterface]]] = Field(
value: Optional[List[Union[DynamicTable, NWBDataInterface]]] = Field(
None,
json_schema_extra={
"linkml_meta": {"any_of": [{"range": "NWBDataInterface"}, {"range": "DynamicTable"}]}
@@ -260,9 +280,7 @@ class Images(NWBDataInterface):
)
name: str = Field("Images", json_schema_extra={"linkml_meta": {"ifabsent": "string(Images)"}})
description: Optional[str] = Field(
None, description="""Description of this collection of images."""
)
description: str = Field(..., description="""Description of this collection of images.""")
image: List[Image] = Field(..., description="""Images stored in this collection.""")


@@ -34,6 +34,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@@ -84,21 +93,26 @@ class SpatialSeries(TimeSeries):
reference_frame: Optional[str] = Field(
None, description="""Description defining what exactly 'straight-ahead' means."""
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -128,13 +142,14 @@ class SpatialSeriesData(ConfiguredBaseModel):
json_schema_extra={"linkml_meta": {"equals_string": "data", "ifabsent": "string(data)"}},
)
unit: Optional[str] = Field(
None,
"meters",
description="""Base unit of measurement for working with the data. The default value is 'meters'. Actual stored values are not necessarily stored in these units. To access the data in these units, multiply 'data' by 'conversion'.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(meters)"}},
)
array: Optional[
value: Optional[
Union[
NDArray[Shape["* num_times"], np.number],
NDArray[Shape["* num_times, * num_features"], np.number],
NDArray[Shape["* num_times"], float],
NDArray[Shape["* num_times, * num_features"], float],
]
] = Field(None)
@@ -148,7 +163,7 @@ class BehavioralEpochs(NWBDataInterface):
{"from_schema": "core.nwb.behavior", "tree_root": True}
)
children: Optional[List[IntervalSeries]] = Field(
value: Optional[List[IntervalSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "IntervalSeries"}]}}
)
name: str = Field(...)
@@ -163,7 +178,7 @@ class BehavioralEvents(NWBDataInterface):
{"from_schema": "core.nwb.behavior", "tree_root": True}
)
children: Optional[List[TimeSeries]] = Field(
value: Optional[List[TimeSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "TimeSeries"}]}}
)
name: str = Field(...)
@@ -178,7 +193,7 @@ class BehavioralTimeSeries(NWBDataInterface):
{"from_schema": "core.nwb.behavior", "tree_root": True}
)
children: Optional[List[TimeSeries]] = Field(
value: Optional[List[TimeSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "TimeSeries"}]}}
)
name: str = Field(...)
@@ -193,7 +208,7 @@ class PupilTracking(NWBDataInterface):
{"from_schema": "core.nwb.behavior", "tree_root": True}
)
children: Optional[List[TimeSeries]] = Field(
value: Optional[List[TimeSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "TimeSeries"}]}}
)
name: str = Field(...)
@@ -208,7 +223,7 @@ class EyeTracking(NWBDataInterface):
{"from_schema": "core.nwb.behavior", "tree_root": True}
)
children: Optional[List[SpatialSeries]] = Field(
value: Optional[List[SpatialSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "SpatialSeries"}]}}
)
name: str = Field(...)
@@ -223,7 +238,7 @@ class CompassDirection(NWBDataInterface):
{"from_schema": "core.nwb.behavior", "tree_root": True}
)
children: Optional[List[SpatialSeries]] = Field(
value: Optional[List[SpatialSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "SpatialSeries"}]}}
)
name: str = Field(...)
@@ -238,7 +253,7 @@ class Position(NWBDataInterface):
{"from_schema": "core.nwb.behavior", "tree_root": True}
)
children: Optional[List[SpatialSeries]] = Field(
value: Optional[List[SpatialSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "SpatialSeries"}]}}
)
name: str = Field(...)


@@ -27,6 +27,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}


@@ -16,6 +16,7 @@ from pydantic import (
ValidationInfo,
BeforeValidator,
)
from ...core.v2_2_1.core_nwb_device import Device
from ...core.v2_2_1.core_nwb_base import (
TimeSeries,
TimeSeriesStartingTime,
@@ -43,6 +44,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@@ -68,7 +78,7 @@ ModelType = TypeVar("ModelType", bound=Type[BaseModel])
def _get_name(item: ModelType | dict, info: ValidationInfo) -> Union[ModelType, dict]:
"""Get the name of the slot that refers to this object"""
assert isinstance(item, (BaseModel, dict))
assert isinstance(item, (BaseModel, dict)), f"{item} was not a BaseModel or a dict!"
name = info.field_name
if isinstance(item, BaseModel):
item.name = name
@@ -108,37 +118,47 @@ class ElectricalSeries(TimeSeries):
name: str = Field(...)
data: Union[
NDArray[Shape["* num_times"], np.number],
NDArray[Shape["* num_times, * num_channels"], np.number],
NDArray[Shape["* num_times, * num_channels, * num_samples"], np.number],
NDArray[Shape["* num_times"], float],
NDArray[Shape["* num_times, * num_channels"], float],
NDArray[Shape["* num_times, * num_channels, * num_samples"], float],
] = Field(..., description="""Recorded voltage data.""")
electrodes: Named[DynamicTableRegion] = Field(
...,
description="""DynamicTableRegion pointer to the electrodes that this time series was generated from.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
channel_conversion: Optional[NDArray[Shape["* num_channels"], np.float32]] = Field(
channel_conversion: Optional[NDArray[Shape["* num_channels"], float]] = Field(
None,
description="""Channel-specific conversion factor. Multiply the data in the 'data' dataset by these values along the channel axis (as indicated by axis attribute) AND by the global conversion factor in the 'conversion' attribute of 'data' to get the data values in Volts, i.e, data in Volts = data * data.conversion * channel_conversion. This approach allows for both global and per-channel data conversion factors needed to support the storage of electrical recordings as native values generated by data acquisition systems. If this dataset is not present, then there is no channel-specific conversion factor, i.e. it is 1 for all channels.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_channels"}]}}},
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -167,10 +187,10 @@ class SpikeEventSeries(ElectricalSeries):
name: str = Field(...)
data: Union[
NDArray[Shape["* num_events, * num_samples"], np.number],
NDArray[Shape["* num_events, * num_channels, * num_samples"], np.number],
NDArray[Shape["* num_events, * num_samples"], float],
NDArray[Shape["* num_events, * num_channels, * num_samples"], float],
] = Field(..., description="""Spike waveforms.""")
timestamps: NDArray[Shape["* num_times"], np.float64] = Field(
timestamps: NDArray[Shape["* num_times"], float] = Field(
...,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time. Timestamps are required for the events. Unlike for TimeSeries, timestamps are required for SpikeEventSeries and are thus re-specified here.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -179,24 +199,34 @@ class SpikeEventSeries(ElectricalSeries):
...,
description="""DynamicTableRegion pointer to the electrodes that this time series was generated from.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
channel_conversion: Optional[NDArray[Shape["* num_channels"], np.float32]] = Field(
channel_conversion: Optional[NDArray[Shape["* num_channels"], float]] = Field(
None,
description="""Channel-specific conversion factor. Multiply the data in the 'data' dataset by these values along the channel axis (as indicated by axis attribute) AND by the global conversion factor in the 'conversion' attribute of 'data' to get the data values in Volts, i.e, data in Volts = data * data.conversion * channel_conversion. This approach allows for both global and per-channel data conversion factors needed to support the storage of electrical recordings as native values generated by data acquisition systems. If this dataset is not present, then there is no channel-specific conversion factor, i.e. it is 1 for all channels.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_channels"}]}}},
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -232,7 +262,7 @@ class FeatureExtraction(NWBDataInterface):
description="""Description of features (eg, ''PC1'') for each of the extracted features.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_features"}]}}},
)
features: NDArray[Shape["* num_events, * num_channels, * num_features"], np.float32] = Field(
features: NDArray[Shape["* num_events, * num_channels, * num_features"], float] = Field(
...,
description="""Multi-dimensional array of features extracted from each event.""",
json_schema_extra={
@@ -247,7 +277,7 @@ class FeatureExtraction(NWBDataInterface):
}
},
)
times: NDArray[Shape["* num_events"], np.float64] = Field(
times: NDArray[Shape["* num_events"], float] = Field(
...,
description="""Times of events that features correspond to (can be a link).""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_events"}]}}},
@@ -256,7 +286,12 @@ class FeatureExtraction(NWBDataInterface):
...,
description="""DynamicTableRegion pointer to the electrodes that this time series was generated from.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
@@ -277,16 +312,25 @@ class EventDetection(NWBDataInterface):
...,
description="""Description of how events were detected, such as voltage threshold, or dV/dT threshold, as well as relevant values.""",
)
source_idx: NDArray[Shape["* num_events"], np.int32] = Field(
source_idx: NDArray[Shape["* num_events"], int] = Field(
...,
description="""Indices (zero-based) into source ElectricalSeries::data array corresponding to time of event. ''description'' should define what is meant by time of event (e.g., .25 ms before action potential peak, zero-crossing time, etc). The index points to each event from the raw data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_events"}]}}},
)
times: NDArray[Shape["* num_events"], np.float64] = Field(
times: NDArray[Shape["* num_events"], float] = Field(
...,
description="""Timestamps of events, in seconds.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_events"}]}}},
)
source_electricalseries: Union[ElectricalSeries, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "ElectricalSeries"}, {"range": "string"}],
}
},
)
class EventWaveform(NWBDataInterface):
@@ -298,7 +342,7 @@ class EventWaveform(NWBDataInterface):
{"from_schema": "core.nwb.ecephys", "tree_root": True}
)
children: Optional[List[SpikeEventSeries]] = Field(
value: Optional[List[SpikeEventSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "SpikeEventSeries"}]}}
)
name: str = Field(...)
@@ -313,7 +357,7 @@ class FilteredEphys(NWBDataInterface):
{"from_schema": "core.nwb.ecephys", "tree_root": True}
)
children: Optional[List[ElectricalSeries]] = Field(
value: Optional[List[ElectricalSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "ElectricalSeries"}]}}
)
name: str = Field(...)
@@ -328,7 +372,7 @@ class LFP(NWBDataInterface):
{"from_schema": "core.nwb.ecephys", "tree_root": True}
)
children: Optional[List[ElectricalSeries]] = Field(
value: Optional[List[ElectricalSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "ElectricalSeries"}]}}
)
name: str = Field(...)
@@ -344,14 +388,23 @@ class ElectrodeGroup(NWBContainer):
)
name: str = Field(...)
description: Optional[str] = Field(None, description="""Description of this electrode group.""")
location: Optional[str] = Field(
None,
description: str = Field(..., description="""Description of this electrode group.""")
location: str = Field(
...,
description="""Location of electrode group. Specify the area, layer, comments on estimation of area/layer, etc. Use standard atlas names for anatomical regions when possible.""",
)
position: Optional[ElectrodeGroupPosition] = Field(
None, description="""stereotaxic or common framework coordinates"""
)
device: Union[Device, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "Device"}, {"range": "string"}],
}
},
)
class ElectrodeGroupPosition(ConfiguredBaseModel):
@@ -367,9 +420,21 @@ class ElectrodeGroupPosition(ConfiguredBaseModel):
"linkml_meta": {"equals_string": "position", "ifabsent": "string(position)"}
},
)
x: Optional[np.float32] = Field(None, description="""x coordinate""")
y: Optional[np.float32] = Field(None, description="""y coordinate""")
z: Optional[np.float32] = Field(None, description="""z coordinate""")
x: Optional[NDArray[Shape["*"], float]] = Field(
None,
description="""x coordinate""",
json_schema_extra={"linkml_meta": {"array": {"exact_number_dimensions": 1}}},
)
y: Optional[NDArray[Shape["*"], float]] = Field(
None,
description="""y coordinate""",
json_schema_extra={"linkml_meta": {"array": {"exact_number_dimensions": 1}}},
)
z: Optional[NDArray[Shape["*"], float]] = Field(
None,
description="""z coordinate""",
json_schema_extra={"linkml_meta": {"array": {"exact_number_dimensions": 1}}},
)
class ClusterWaveforms(NWBDataInterface):
@@ -388,7 +453,7 @@ class ClusterWaveforms(NWBDataInterface):
waveform_filtering: str = Field(
..., description="""Filtering applied to data before generating mean/sd"""
)
waveform_mean: NDArray[Shape["* num_clusters, * num_samples"], np.float32] = Field(
waveform_mean: NDArray[Shape["* num_clusters, * num_samples"], float] = Field(
...,
description="""The mean waveform for each cluster, using the same indices for each wave as cluster numbers in the associated Clustering module (i.e, cluster 3 is in array slot [3]). Waveforms corresponding to gaps in cluster sequence should be empty (e.g., zero- filled)""",
json_schema_extra={
@@ -397,7 +462,7 @@ class ClusterWaveforms(NWBDataInterface):
}
},
)
waveform_sd: NDArray[Shape["* num_clusters, * num_samples"], np.float32] = Field(
waveform_sd: NDArray[Shape["* num_clusters, * num_samples"], float] = Field(
...,
description="""Stdev of waveforms for each cluster, using the same indices as in mean""",
json_schema_extra={
@@ -406,6 +471,15 @@ class ClusterWaveforms(NWBDataInterface):
}
},
)
clustering_interface: Union[Clustering, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "Clustering"}, {"range": "string"}],
}
},
)
class Clustering(NWBDataInterface):
@@ -424,17 +498,17 @@ class Clustering(NWBDataInterface):
...,
description="""Description of clusters or clustering, (e.g. cluster 0 is noise, clusters curated using Klusters, etc)""",
)
num: NDArray[Shape["* num_events"], np.int32] = Field(
num: NDArray[Shape["* num_events"], int] = Field(
...,
description="""Cluster number of each event""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_events"}]}}},
)
peak_over_rms: NDArray[Shape["* num_clusters"], np.float32] = Field(
peak_over_rms: NDArray[Shape["* num_clusters"], float] = Field(
...,
description="""Maximum ratio of waveform peak to RMS on any channel in the cluster (provides a basic clustering metric).""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_clusters"}]}}},
)
times: NDArray[Shape["* num_events"], np.float64] = Field(
times: NDArray[Shape["* num_events"], float] = Field(
...,
description="""Times of clustered events, in seconds. This may be a link to times field in associated FeatureExtraction module.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_events"}]}}},


@@ -15,9 +15,9 @@ from pydantic import (
ValidationInfo,
BeforeValidator,
)
from numpydantic import NDArray, Shape
from ...hdmf_common.v1_1_2.hdmf_common_table import DynamicTable, VectorIndex, VectorData
from ...core.v2_2_1.core_nwb_base import TimeSeries
from numpydantic import NDArray, Shape
metamodel_version = "None"
version = "2.2.1"
@@ -37,6 +37,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@@ -62,7 +71,7 @@ ModelType = TypeVar("ModelType", bound=Type[BaseModel])
def _get_name(item: ModelType | dict, info: ValidationInfo) -> Union[ModelType, dict]:
"""Get the name of the slot that refers to this object"""
assert isinstance(item, (BaseModel, dict))
assert isinstance(item, (BaseModel, dict)), f"{item} was not a BaseModel or a dict!"
name = info.field_name
if isinstance(item, BaseModel):
item.name = name
@@ -96,7 +105,7 @@ class TimeIntervals(DynamicTable):
)
name: str = Field(...)
start_time: NDArray[Any, np.float32] = Field(
start_time: VectorData[NDArray[Any, float]] = Field(
...,
description="""Start time of epoch, in seconds.""",
json_schema_extra={
@@ -105,7 +114,7 @@ class TimeIntervals(DynamicTable):
}
},
)
stop_time: NDArray[Any, np.float32] = Field(
stop_time: VectorData[NDArray[Any, float]] = Field(
...,
description="""Stop time of epoch, in seconds.""",
json_schema_extra={
@@ -114,7 +123,7 @@ class TimeIntervals(DynamicTable):
}
},
)
tags: Optional[NDArray[Any, str]] = Field(
tags: VectorData[Optional[NDArray[Any, str]]] = Field(
None,
description="""User-defined tags that identify or categorize events.""",
json_schema_extra={
@@ -127,7 +136,12 @@ class TimeIntervals(DynamicTable):
None,
description="""Index for tags.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
timeseries: Optional[TimeIntervalsTimeseries] = Field(
@@ -137,17 +151,20 @@ class TimeIntervals(DynamicTable):
None,
description="""Index for timeseries.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
colnames: Optional[str] = Field(
None,
colnames: List[str] = Field(
...,
description="""The names of the columns in this table. This should be used to specify an order to the columns.""",
)
description: Optional[str] = Field(
None, description="""Description of what is in this dynamic table."""
)
id: NDArray[Shape["* num_rows"], int] = Field(
description: str = Field(..., description="""Description of what is in this dynamic table.""")
id: VectorData[NDArray[Shape["* num_rows"], int]] = Field(
...,
description="""Array of unique identifiers for the rows of this dynamic table.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_rows"}]}}},
@@ -173,20 +190,22 @@ class TimeIntervalsTimeseries(VectorData):
"linkml_meta": {"equals_string": "timeseries", "ifabsent": "string(timeseries)"}
},
)
idx_start: Optional[np.int32] = Field(
idx_start: Optional[NDArray[Shape["*"], int]] = Field(
None,
description="""Start index into the TimeSeries 'data' and 'timestamp' datasets of the referenced TimeSeries. The first dimension of those arrays is always time.""",
json_schema_extra={"linkml_meta": {"array": {"exact_number_dimensions": 1}}},
)
count: Optional[np.int32] = Field(
count: Optional[NDArray[Shape["*"], int]] = Field(
None,
description="""Number of data samples available in this time series, during this epoch.""",
json_schema_extra={"linkml_meta": {"array": {"exact_number_dimensions": 1}}},
)
timeseries: Optional[TimeSeries] = Field(
None, description="""the TimeSeries that this index applies to."""
)
description: Optional[str] = Field(
None, description="""Description of what these vectors represent."""
timeseries: Optional[NDArray[Shape["*"], TimeSeries]] = Field(
None,
description="""the TimeSeries that this index applies to.""",
json_schema_extra={"linkml_meta": {"array": {"exact_number_dimensions": 1}}},
)
description: str = Field(..., description="""Description of what these vectors represent.""")
# Model rebuild


@@ -7,7 +7,6 @@ import sys
from typing import Any, ClassVar, List, Literal, Dict, Optional, Union
from pydantic import BaseModel, ConfigDict, Field, RootModel, field_validator
import numpy as np
from ...core.v2_2_1.core_nwb_epoch import TimeIntervals
from ...core.v2_2_1.core_nwb_misc import Units
from ...core.v2_2_1.core_nwb_device import Device
from ...core.v2_2_1.core_nwb_ogen import OptogeneticStimulusSite
@@ -22,6 +21,7 @@ from ...core.v2_2_1.core_nwb_ecephys import ElectrodeGroup
from numpydantic import NDArray, Shape
from ...hdmf_common.v1_1_2.hdmf_common_table import DynamicTable, VectorData, VectorIndex
from ...core.v2_2_1.core_nwb_icephys import IntracellularElectrode, SweepTable
from ...core.v2_2_1.core_nwb_epoch import TimeIntervals
metamodel_version = "None"
version = "2.2.1"
@@ -41,6 +41,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@@ -98,11 +107,12 @@ class NWBFile(NWBContainer):
"root",
json_schema_extra={"linkml_meta": {"equals_string": "root", "ifabsent": "string(root)"}},
)
nwb_version: Optional[str] = Field(
None,
nwb_version: Literal["2.2.1"] = Field(
"2.2.1",
description="""File version string. Use semantic versioning, e.g. 1.2.1. This will be the name of the format with trailing major, minor and patch numbers.""",
json_schema_extra={"linkml_meta": {"equals_string": "2.2.1", "ifabsent": "string(2.2.1)"}},
)
file_create_date: NDArray[Shape["* num_modifications"], np.datetime64] = Field(
file_create_date: NDArray[Shape["* num_modifications"], datetime] = Field(
...,
description="""A record of the date the file was created and of subsequent modifications. The date is stored in UTC with local timezone offset as ISO 8601 extended formatted strings: 2018-09-28T14:43:54.123+02:00. Dates stored in UTC end in \"Z\" with no timezone offset. Date accuracy is up to milliseconds. The file can be created after the experiment was run, so this may differ from the experiment start time. Each modification to the nwb file adds a new entry to the array.""",
json_schema_extra={
@@ -116,11 +126,11 @@ class NWBFile(NWBContainer):
session_description: str = Field(
..., description="""A description of the experimental session and data in the file."""
)
session_start_time: np.datetime64 = Field(
session_start_time: datetime = Field(
...,
description="""Date and time of the experiment/session start. The date is stored in UTC with local timezone offset as ISO 8601 extended formatted string: 2018-09-28T14:43:54.123+02:00. Dates stored in UTC end in \"Z\" with no timezone offset. Date accuracy is up to milliseconds.""",
)
timestamps_reference_time: np.datetime64 = Field(
timestamps_reference_time: datetime = Field(
...,
description="""Date and time corresponding to time zero of all timestamps. The date is stored in UTC with local timezone offset as ISO 8601 extended formatted string: 2018-09-28T14:43:54.123+02:00. Dates stored in UTC end in \"Z\" with no timezone offset. Date accuracy is up to milliseconds. All times stored in the file use this time as reference (i.e., time zero).""",
)
@@ -158,19 +168,9 @@ class NWBFile(NWBContainer):
...,
description="""Experimental metadata, including protocol, notes and description of hardware device(s). The metadata stored in this section should be used to describe the experiment. Metadata necessary for interpreting the data is stored with the data. General experimental metadata, including animal strain, experimental protocols, experimenter, devices, etc, are stored under 'general'. Core metadata (e.g., that required to interpret data fields) is stored with the data itself, and implicitly defined by the file specification (e.g., time is in seconds). The strategy used here for storing non-core metadata is to use free-form text fields, such as would appear in sentences or paragraphs from a Methods section. Metadata fields are text to enable them to be more general, for example to represent ranges instead of numerical values. Machine-readable metadata is stored as attributes to these free-form datasets. All entries in the below table are to be included when data is present. Unused groups (e.g., intracellular_ephys in an optophysiology experiment) should not be created unless there is data to store within them.""",
)
intervals: Optional[List[TimeIntervals]] = Field(
intervals: Optional[NWBFileIntervals] = Field(
None,
description="""Experimental intervals, whether that be logically distinct sub-experiments having a particular scientific goal, trials (see trials subgroup) during an experiment, or epochs (see epochs subgroup) deriving from analysis of data.""",
json_schema_extra={
"linkml_meta": {
"any_of": [
{"range": "TimeIntervals"},
{"range": "TimeIntervals"},
{"range": "TimeIntervals"},
{"range": "TimeIntervals"},
]
}
},
)
units: Optional[Units] = Field(None, description="""Data about sorted spike units.""")
@@ -256,7 +256,7 @@ class NWBFileGeneral(ConfiguredBaseModel):
None,
description="""Description of slices, including information about preparation thickness, orientation, temperature, and bath solution.""",
)
source_script: Optional[NWBFileGeneralSourceScript] = Field(
source_script: Optional[GeneralSourceScript] = Field(
None,
description="""Script file or link to public source code used to create this NWB file.""",
)
@@ -284,10 +284,10 @@ class NWBFileGeneral(ConfiguredBaseModel):
None,
description="""Information about the animal or person from which the data was measured.""",
)
extracellular_ephys: Optional[NWBFileGeneralExtracellularEphys] = Field(
extracellular_ephys: Optional[GeneralExtracellularEphys] = Field(
None, description="""Metadata related to extracellular electrophysiology."""
)
intracellular_ephys: Optional[NWBFileGeneralIntracellularEphys] = Field(
intracellular_ephys: Optional[GeneralIntracellularEphys] = Field(
None, description="""Metadata related to intracellular electrophysiology."""
)
optogenetics: Optional[List[OptogeneticStimulusSite]] = Field(
@@ -302,7 +302,7 @@ class NWBFileGeneral(ConfiguredBaseModel):
)
class NWBFileGeneralSourceScript(ConfiguredBaseModel):
class GeneralSourceScript(ConfiguredBaseModel):
"""
Script file or link to public source code used to create this NWB file.
"""
@@ -315,7 +315,7 @@ class NWBFileGeneralSourceScript(ConfiguredBaseModel):
"linkml_meta": {"equals_string": "source_script", "ifabsent": "string(source_script)"}
},
)
file_name: Optional[str] = Field(None, description="""Name of script file.""")
file_name: str = Field(..., description="""Name of script file.""")
value: str = Field(...)
@@ -335,7 +335,7 @@ class Subject(NWBContainer):
age: Optional[str] = Field(
None, description="""Age of subject. Can be supplied instead of 'date_of_birth'."""
)
date_of_birth: Optional[np.datetime64] = Field(
date_of_birth: Optional[datetime] = Field(
None, description="""Date of birth of subject. Can be supplied instead of 'age'."""
)
description: Optional[str] = Field(
@@ -357,7 +357,7 @@ class Subject(NWBContainer):
)
class NWBFileGeneralExtracellularEphys(ConfiguredBaseModel):
class GeneralExtracellularEphys(ConfiguredBaseModel):
"""
Metadata related to extracellular electrophysiology.
"""
@@ -376,12 +376,12 @@ class NWBFileGeneralExtracellularEphys(ConfiguredBaseModel):
electrode_group: Optional[List[ElectrodeGroup]] = Field(
None, description="""Physical group of electrodes."""
)
electrodes: Optional[NWBFileGeneralExtracellularEphysElectrodes] = Field(
electrodes: Optional[ExtracellularEphysElectrodes] = Field(
None, description="""A table of all electrodes (i.e. channels) used for recording."""
)
class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
class ExtracellularEphysElectrodes(DynamicTable):
"""
A table of all electrodes (i.e. channels) used for recording.
"""
@@ -394,7 +394,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
"linkml_meta": {"equals_string": "electrodes", "ifabsent": "string(electrodes)"}
},
)
x: NDArray[Any, np.float32] = Field(
x: VectorData[NDArray[Any, float]] = Field(
...,
description="""x coordinate of the channel location in the brain (+x is posterior).""",
json_schema_extra={
@@ -403,7 +403,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
}
},
)
y: NDArray[Any, np.float32] = Field(
y: VectorData[NDArray[Any, float]] = Field(
...,
description="""y coordinate of the channel location in the brain (+y is inferior).""",
json_schema_extra={
@@ -412,7 +412,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
}
},
)
z: NDArray[Any, np.float32] = Field(
z: VectorData[NDArray[Any, float]] = Field(
...,
description="""z coordinate of the channel location in the brain (+z is right).""",
json_schema_extra={
@@ -421,7 +421,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
}
},
)
imp: NDArray[Any, np.float32] = Field(
imp: VectorData[NDArray[Any, float]] = Field(
...,
description="""Impedance of the channel.""",
json_schema_extra={
@@ -430,7 +430,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
}
},
)
location: NDArray[Any, str] = Field(
location: VectorData[NDArray[Any, str]] = Field(
...,
description="""Location of the electrode (channel). Specify the area, layer, comments on estimation of area/layer, stereotaxic coordinates if in vivo, etc. Use standard atlas names for anatomical regions when possible.""",
json_schema_extra={
@@ -439,7 +439,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
}
},
)
filtering: NDArray[Any, np.float32] = Field(
filtering: VectorData[NDArray[Any, float]] = Field(
...,
description="""Description of hardware filtering.""",
json_schema_extra={
@@ -451,7 +451,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
group: List[ElectrodeGroup] = Field(
..., description="""Reference to the ElectrodeGroup this electrode is a part of."""
)
group_name: NDArray[Any, str] = Field(
group_name: VectorData[NDArray[Any, str]] = Field(
...,
description="""Name of the ElectrodeGroup this electrode is a part of.""",
json_schema_extra={
@@ -460,7 +460,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
}
},
)
rel_x: Optional[NDArray[Any, np.float32]] = Field(
rel_x: VectorData[Optional[NDArray[Any, float]]] = Field(
None,
description="""x coordinate in electrode group""",
json_schema_extra={
@@ -469,7 +469,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
}
},
)
rel_y: Optional[NDArray[Any, np.float32]] = Field(
rel_y: VectorData[Optional[NDArray[Any, float]]] = Field(
None,
description="""y coordinate in electrode group""",
json_schema_extra={
@@ -478,7 +478,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
}
},
)
rel_z: Optional[NDArray[Any, np.float32]] = Field(
rel_z: VectorData[Optional[NDArray[Any, float]]] = Field(
None,
description="""z coordinate in electrode group""",
json_schema_extra={
@@ -487,7 +487,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
}
},
)
reference: Optional[NDArray[Any, str]] = Field(
reference: VectorData[Optional[NDArray[Any, str]]] = Field(
None,
description="""Description of the reference used for this electrode.""",
json_schema_extra={
@@ -496,14 +496,12 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
}
},
)
colnames: Optional[str] = Field(
None,
colnames: List[str] = Field(
...,
description="""The names of the columns in this table. This should be used to specify an order to the columns.""",
)
description: Optional[str] = Field(
None, description="""Description of what is in this dynamic table."""
)
id: NDArray[Shape["* num_rows"], int] = Field(
description: str = Field(..., description="""Description of what is in this dynamic table.""")
id: VectorData[NDArray[Shape["* num_rows"], int]] = Field(
...,
description="""Array of unique identifiers for the rows of this dynamic table.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_rows"}]}}},
@@ -516,7 +514,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
)
class NWBFileGeneralIntracellularEphys(ConfiguredBaseModel):
class GeneralIntracellularEphys(ConfiguredBaseModel):
"""
Metadata related to intracellular electrophysiology.
"""
@@ -544,13 +542,43 @@ class NWBFileGeneralIntracellularEphys(ConfiguredBaseModel):
)
class NWBFileIntervals(ConfiguredBaseModel):
"""
Experimental intervals, whether that be logically distinct sub-experiments having a particular scientific goal, trials (see trials subgroup) during an experiment, or epochs (see epochs subgroup) deriving from analysis of data.
"""
linkml_meta: ClassVar[LinkMLMeta] = LinkMLMeta({"from_schema": "core.nwb.file"})
name: Literal["intervals"] = Field(
"intervals",
json_schema_extra={
"linkml_meta": {"equals_string": "intervals", "ifabsent": "string(intervals)"}
},
)
epochs: Optional[TimeIntervals] = Field(
None,
description="""Divisions in time marking experimental stages or sub-divisions of a single recording session.""",
)
trials: Optional[TimeIntervals] = Field(
None, description="""Repeated experimental events that have a logical grouping."""
)
invalid_times: Optional[TimeIntervals] = Field(
None, description="""Time intervals that should be removed from analysis."""
)
time_intervals: Optional[List[TimeIntervals]] = Field(
None,
description="""Optional additional table(s) for describing other experimental time intervals.""",
)
# Model rebuild
# see https://pydantic-docs.helpmanual.io/usage/models/#rebuilding-a-model
NWBFile.model_rebuild()
NWBFileStimulus.model_rebuild()
NWBFileGeneral.model_rebuild()
NWBFileGeneralSourceScript.model_rebuild()
GeneralSourceScript.model_rebuild()
Subject.model_rebuild()
NWBFileGeneralExtracellularEphys.model_rebuild()
NWBFileGeneralExtracellularEphysElectrodes.model_rebuild()
NWBFileGeneralIntracellularEphys.model_rebuild()
GeneralExtracellularEphys.model_rebuild()
ExtracellularEphysElectrodes.model_rebuild()
GeneralIntracellularEphys.model_rebuild()
NWBFileIntervals.model_rebuild()


@@ -11,6 +11,7 @@ from ...core.v2_2_1.core_nwb_base import (
TimeSeriesSync,
NWBContainer,
)
from ...core.v2_2_1.core_nwb_device import Device
from typing import Any, ClassVar, List, Literal, Dict, Optional, Union, Annotated, Type, TypeVar
from pydantic import (
BaseModel,
@@ -42,6 +43,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@@ -67,7 +77,7 @@ ModelType = TypeVar("ModelType", bound=Type[BaseModel])
def _get_name(item: ModelType | dict, info: ValidationInfo) -> Union[ModelType, dict]:
"""Get the name of the slot that refers to this object"""
assert isinstance(item, (BaseModel, dict))
assert isinstance(item, (BaseModel, dict)), f"{item} was not a BaseModel or a dict!"
name = info.field_name
if isinstance(item, BaseModel):
item.name = name
@@ -106,32 +116,46 @@ class PatchClampSeries(TimeSeries):
)
name: str = Field(...)
stimulus_description: Optional[str] = Field(
None, description="""Protocol/stimulus name for this patch-clamp dataset."""
stimulus_description: str = Field(
..., description="""Protocol/stimulus name for this patch-clamp dataset."""
)
sweep_number: Optional[np.uint32] = Field(
sweep_number: Optional[int] = Field(
None, description="""Sweep number, allows to group different PatchClampSeries together."""
)
data: PatchClampSeriesData = Field(..., description="""Recorded voltage or current.""")
gain: Optional[np.float32] = Field(
gain: Optional[float] = Field(
None,
description="""Gain of the recording, in units Volt/Amp (v-clamp) or Volt/Volt (c-clamp).""",
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
electrode: Union[IntracellularElectrode, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "IntracellularElectrode"}, {"range": "string"}],
}
},
)
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -160,11 +184,11 @@ class PatchClampSeriesData(ConfiguredBaseModel):
"data",
json_schema_extra={"linkml_meta": {"equals_string": "data", "ifabsent": "string(data)"}},
)
unit: Optional[str] = Field(
None,
unit: str = Field(
...,
description="""Base unit of measurement for working with the data. Actual stored values are not necessarily stored in these units. To access the data in these units, multiply 'data' by 'conversion'.""",
)
array: Optional[NDArray[Shape["* num_times"], np.number]] = Field(
value: Optional[NDArray[Shape["* num_times"], float]] = Field(
None, json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}}
)
@@ -180,36 +204,50 @@ class CurrentClampSeries(PatchClampSeries):
name: str = Field(...)
data: CurrentClampSeriesData = Field(..., description="""Recorded voltage.""")
bias_current: Optional[np.float32] = Field(None, description="""Bias current, in amps.""")
bridge_balance: Optional[np.float32] = Field(None, description="""Bridge balance, in ohms.""")
capacitance_compensation: Optional[np.float32] = Field(
bias_current: Optional[float] = Field(None, description="""Bias current, in amps.""")
bridge_balance: Optional[float] = Field(None, description="""Bridge balance, in ohms.""")
capacitance_compensation: Optional[float] = Field(
None, description="""Capacitance compensation, in farads."""
)
stimulus_description: Optional[str] = Field(
None, description="""Protocol/stimulus name for this patch-clamp dataset."""
stimulus_description: str = Field(
..., description="""Protocol/stimulus name for this patch-clamp dataset."""
)
sweep_number: Optional[np.uint32] = Field(
sweep_number: Optional[int] = Field(
None, description="""Sweep number, allows to group different PatchClampSeries together."""
)
gain: Optional[np.float32] = Field(
gain: Optional[float] = Field(
None,
description="""Gain of the recording, in units Volt/Amp (v-clamp) or Volt/Volt (c-clamp).""",
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
electrode: Union[IntracellularElectrode, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "IntracellularElectrode"}, {"range": "string"}],
}
},
)
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -238,9 +276,10 @@ class CurrentClampSeriesData(ConfiguredBaseModel):
"data",
json_schema_extra={"linkml_meta": {"equals_string": "data", "ifabsent": "string(data)"}},
)
unit: Optional[str] = Field(
None,
unit: Literal["volts"] = Field(
"volts",
description="""Base unit of measurement for working with the data. which is fixed to 'volts'. Actual stored values are not necessarily stored in these units. To access the data in these units, multiply 'data' by 'conversion'.""",
json_schema_extra={"linkml_meta": {"equals_string": "volts", "ifabsent": "string(volts)"}},
)
value: Any = Field(...)
@@ -255,39 +294,51 @@ class IZeroClampSeries(CurrentClampSeries):
)
name: str = Field(...)
bias_current: np.float32 = Field(..., description="""Bias current, in amps, fixed to 0.0.""")
bridge_balance: np.float32 = Field(
..., description="""Bridge balance, in ohms, fixed to 0.0."""
)
capacitance_compensation: np.float32 = Field(
bias_current: float = Field(..., description="""Bias current, in amps, fixed to 0.0.""")
bridge_balance: float = Field(..., description="""Bridge balance, in ohms, fixed to 0.0.""")
capacitance_compensation: float = Field(
..., description="""Capacitance compensation, in farads, fixed to 0.0."""
)
data: CurrentClampSeriesData = Field(..., description="""Recorded voltage.""")
stimulus_description: Optional[str] = Field(
None, description="""Protocol/stimulus name for this patch-clamp dataset."""
stimulus_description: str = Field(
..., description="""Protocol/stimulus name for this patch-clamp dataset."""
)
sweep_number: Optional[np.uint32] = Field(
sweep_number: Optional[int] = Field(
None, description="""Sweep number, allows to group different PatchClampSeries together."""
)
gain: Optional[np.float32] = Field(
gain: Optional[float] = Field(
None,
description="""Gain of the recording, in units Volt/Amp (v-clamp) or Volt/Volt (c-clamp).""",
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
electrode: Union[IntracellularElectrode, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "IntracellularElectrode"}, {"range": "string"}],
}
},
)
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -316,31 +367,45 @@ class CurrentClampStimulusSeries(PatchClampSeries):
name: str = Field(...)
data: CurrentClampStimulusSeriesData = Field(..., description="""Stimulus current applied.""")
stimulus_description: Optional[str] = Field(
None, description="""Protocol/stimulus name for this patch-clamp dataset."""
stimulus_description: str = Field(
..., description="""Protocol/stimulus name for this patch-clamp dataset."""
)
sweep_number: Optional[np.uint32] = Field(
sweep_number: Optional[int] = Field(
None, description="""Sweep number, allows to group different PatchClampSeries together."""
)
gain: Optional[np.float32] = Field(
gain: Optional[float] = Field(
None,
description="""Gain of the recording, in units Volt/Amp (v-clamp) or Volt/Volt (c-clamp).""",
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
electrode: Union[IntracellularElectrode, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "IntracellularElectrode"}, {"range": "string"}],
}
},
)
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -369,9 +434,12 @@ class CurrentClampStimulusSeriesData(ConfiguredBaseModel):
"data",
json_schema_extra={"linkml_meta": {"equals_string": "data", "ifabsent": "string(data)"}},
)
unit: Optional[str] = Field(
None,
unit: Literal["amperes"] = Field(
"amperes",
description="""Base unit of measurement for working with the data. which is fixed to 'amperes'. Actual stored values are not necessarily stored in these units. To access the data in these units, multiply 'data' by 'conversion'.""",
json_schema_extra={
"linkml_meta": {"equals_string": "amperes", "ifabsent": "string(amperes)"}
},
)
value: Any = Field(...)
@@ -408,31 +476,45 @@ class VoltageClampSeries(PatchClampSeries):
whole_cell_series_resistance_comp: Optional[VoltageClampSeriesWholeCellSeriesResistanceComp] = (
Field(None, description="""Whole cell series resistance compensation, in ohms.""")
)
stimulus_description: Optional[str] = Field(
None, description="""Protocol/stimulus name for this patch-clamp dataset."""
stimulus_description: str = Field(
..., description="""Protocol/stimulus name for this patch-clamp dataset."""
)
sweep_number: Optional[np.uint32] = Field(
sweep_number: Optional[int] = Field(
None, description="""Sweep number, allows to group different PatchClampSeries together."""
)
gain: Optional[np.float32] = Field(
gain: Optional[float] = Field(
None,
description="""Gain of the recording, in units Volt/Amp (v-clamp) or Volt/Volt (c-clamp).""",
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
electrode: Union[IntracellularElectrode, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "IntracellularElectrode"}, {"range": "string"}],
}
},
)
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -461,9 +543,12 @@ class VoltageClampSeriesData(ConfiguredBaseModel):
"data",
json_schema_extra={"linkml_meta": {"equals_string": "data", "ifabsent": "string(data)"}},
)
unit: Optional[str] = Field(
None,
unit: Literal["amperes"] = Field(
"amperes",
description="""Base unit of measurement for working with the data. which is fixed to 'amperes'. Actual stored values are not necessarily stored in these units. To access the data in these units, multiply 'data' by 'conversion'.""",
json_schema_extra={
"linkml_meta": {"equals_string": "amperes", "ifabsent": "string(amperes)"}
},
)
value: Any = Field(...)
@@ -484,11 +569,14 @@ class VoltageClampSeriesCapacitanceFast(ConfiguredBaseModel):
}
},
)
unit: Optional[str] = Field(
None,
unit: Literal["farads"] = Field(
"farads",
description="""Unit of measurement for capacitance_fast, which is fixed to 'farads'.""",
json_schema_extra={
"linkml_meta": {"equals_string": "farads", "ifabsent": "string(farads)"}
},
)
value: np.float32 = Field(...)
value: float = Field(...)
class VoltageClampSeriesCapacitanceSlow(ConfiguredBaseModel):
@@ -507,11 +595,14 @@ class VoltageClampSeriesCapacitanceSlow(ConfiguredBaseModel):
}
},
)
unit: Optional[str] = Field(
None,
unit: Literal["farads"] = Field(
"farads",
description="""Unit of measurement for capacitance_fast, which is fixed to 'farads'.""",
json_schema_extra={
"linkml_meta": {"equals_string": "farads", "ifabsent": "string(farads)"}
},
)
value: np.float32 = Field(...)
value: float = Field(...)
class VoltageClampSeriesResistanceCompBandwidth(ConfiguredBaseModel):
@@ -530,11 +621,12 @@ class VoltageClampSeriesResistanceCompBandwidth(ConfiguredBaseModel):
}
},
)
unit: Optional[str] = Field(
None,
unit: Literal["hertz"] = Field(
"hertz",
description="""Unit of measurement for resistance_comp_bandwidth, which is fixed to 'hertz'.""",
json_schema_extra={"linkml_meta": {"equals_string": "hertz", "ifabsent": "string(hertz)"}},
)
value: np.float32 = Field(...)
value: float = Field(...)
class VoltageClampSeriesResistanceCompCorrection(ConfiguredBaseModel):
@@ -553,11 +645,14 @@ class VoltageClampSeriesResistanceCompCorrection(ConfiguredBaseModel):
}
},
)
unit: Optional[str] = Field(
None,
unit: Literal["percent"] = Field(
"percent",
description="""Unit of measurement for resistance_comp_correction, which is fixed to 'percent'.""",
json_schema_extra={
"linkml_meta": {"equals_string": "percent", "ifabsent": "string(percent)"}
},
)
value: np.float32 = Field(...)
value: float = Field(...)
class VoltageClampSeriesResistanceCompPrediction(ConfiguredBaseModel):
@@ -576,11 +671,14 @@ class VoltageClampSeriesResistanceCompPrediction(ConfiguredBaseModel):
}
},
)
unit: Optional[str] = Field(
None,
unit: Literal["percent"] = Field(
"percent",
description="""Unit of measurement for resistance_comp_prediction, which is fixed to 'percent'.""",
json_schema_extra={
"linkml_meta": {"equals_string": "percent", "ifabsent": "string(percent)"}
},
)
value: np.float32 = Field(...)
value: float = Field(...)
class VoltageClampSeriesWholeCellCapacitanceComp(ConfiguredBaseModel):
@@ -599,11 +697,14 @@ class VoltageClampSeriesWholeCellCapacitanceComp(ConfiguredBaseModel):
}
},
)
unit: Optional[str] = Field(
None,
unit: Literal["farads"] = Field(
"farads",
description="""Unit of measurement for whole_cell_capacitance_comp, which is fixed to 'farads'.""",
json_schema_extra={
"linkml_meta": {"equals_string": "farads", "ifabsent": "string(farads)"}
},
)
value: np.float32 = Field(...)
value: float = Field(...)
class VoltageClampSeriesWholeCellSeriesResistanceComp(ConfiguredBaseModel):
@@ -622,11 +723,12 @@ class VoltageClampSeriesWholeCellSeriesResistanceComp(ConfiguredBaseModel):
}
},
)
unit: Optional[str] = Field(
None,
unit: Literal["ohms"] = Field(
"ohms",
description="""Unit of measurement for whole_cell_series_resistance_comp, which is fixed to 'ohms'.""",
json_schema_extra={"linkml_meta": {"equals_string": "ohms", "ifabsent": "string(ohms)"}},
)
value: np.float32 = Field(...)
value: float = Field(...)
class VoltageClampStimulusSeries(PatchClampSeries):
@@ -640,31 +742,45 @@ class VoltageClampStimulusSeries(PatchClampSeries):
name: str = Field(...)
data: VoltageClampStimulusSeriesData = Field(..., description="""Stimulus voltage applied.""")
stimulus_description: Optional[str] = Field(
None, description="""Protocol/stimulus name for this patch-clamp dataset."""
stimulus_description: str = Field(
..., description="""Protocol/stimulus name for this patch-clamp dataset."""
)
sweep_number: Optional[np.uint32] = Field(
sweep_number: Optional[int] = Field(
None, description="""Sweep number, allows to group different PatchClampSeries together."""
)
gain: Optional[np.float32] = Field(
gain: Optional[float] = Field(
None,
description="""Gain of the recording, in units Volt/Amp (v-clamp) or Volt/Volt (c-clamp).""",
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
electrode: Union[IntracellularElectrode, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "IntracellularElectrode"}, {"range": "string"}],
}
},
)
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -693,9 +809,10 @@ class VoltageClampStimulusSeriesData(ConfiguredBaseModel):
"data",
json_schema_extra={"linkml_meta": {"equals_string": "data", "ifabsent": "string(data)"}},
)
unit: Optional[str] = Field(
None,
unit: Literal["volts"] = Field(
"volts",
description="""Base unit of measurement for working with the data. which is fixed to 'volts'. Actual stored values are not necessarily stored in these units. To access the data in these units, multiply 'data' by 'conversion'.""",
json_schema_extra={"linkml_meta": {"equals_string": "volts", "ifabsent": "string(volts)"}},
)
value: Any = Field(...)
@@ -726,6 +843,15 @@ class IntracellularElectrode(NWBContainer):
slice: Optional[str] = Field(
None, description="""Information about slice used for recording."""
)
device: Union[Device, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "Device"}, {"range": "string"}],
}
},
)
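The `Union[Device, str]` slots added in hunks like the one above implement hdmf-style links: a field accepts either the inline container object or a string reference to it. A minimal sketch of how that union behaves under pydantic v2 (class names and the reference path are hypothetical, for illustration only):

```python
from typing import Union

from pydantic import BaseModel


class Device(BaseModel):
    name: str


class Electrode(BaseModel):
    # Link slot: accepts an inline Device object or a string reference
    device: Union[Device, str]


inline = Electrode(device=Device(name="amp1"))
by_ref = Electrode(device="/general/devices/amp1")

assert isinstance(inline.device, Device)
assert by_ref.device == "/general/devices/amp1"
```

The `source_type: link` annotation in `json_schema_extra` preserves the fact that the string variant is a reference rather than ordinary data, so downstream tooling can resolve it.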
class SweepTable(DynamicTable):
@@ -738,7 +864,7 @@ class SweepTable(DynamicTable):
)
name: str = Field(...)
sweep_number: NDArray[Any, np.uint32] = Field(
sweep_number: VectorData[NDArray[Any, int]] = Field(
...,
description="""Sweep number of the PatchClampSeries in that row.""",
json_schema_extra={
@@ -754,17 +880,20 @@ class SweepTable(DynamicTable):
...,
description="""Index for series.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
colnames: Optional[str] = Field(
None,
colnames: List[str] = Field(
...,
description="""The names of the columns in this table. This should be used to specify an order to the columns.""",
)
description: Optional[str] = Field(
None, description="""Description of what is in this dynamic table."""
)
id: NDArray[Shape["* num_rows"], int] = Field(
description: str = Field(..., description="""Description of what is in this dynamic table.""")
id: VectorData[NDArray[Shape["* num_rows"], int]] = Field(
...,
description="""Array of unique identifiers for the rows of this dynamic table.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_rows"}]}}},


@@ -28,6 +28,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
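The `__getitem__` added above lets any generated model be indexed directly, passing the index through to its `value` or `data` field. A stdlib-only sketch of the same logic (the class is a hypothetical stand-in, not the pydantic `ConfiguredBaseModel`):

```python
from typing import Any, Union


class Indexable:
    """Stand-in mirroring the __getitem__ pass-through on ConfiguredBaseModel."""

    def __init__(self, value=None, data=None):
        self.value = value
        self.data = data

    def __getitem__(self, val: Union[int, slice]) -> Any:
        # Prefer `value`, fall back to `data`, matching the diff above
        if self.value is not None:
            return self.value[val]
        elif self.data is not None:
            return self.data[val]
        raise KeyError("No value or data field to index from")


m = Indexable(data=[10, 20, 30])
assert m[1] == 20
assert m[0:2] == [10, 20]
```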
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@@ -71,15 +80,15 @@ class GrayscaleImage(Image):
)
name: str = Field(...)
resolution: Optional[np.float32] = Field(
resolution: Optional[float] = Field(
None, description="""Pixel resolution of the image, in pixels per centimeter."""
)
description: Optional[str] = Field(None, description="""Description of the image.""")
array: Optional[
value: Optional[
Union[
NDArray[Shape["* x, * y"], np.number],
NDArray[Shape["* x, * y, 3 r_g_b"], np.number],
NDArray[Shape["* x, * y, 4 r_g_b_a"], np.number],
NDArray[Shape["* x, * y"], float],
NDArray[Shape["* x, * y, 3 r_g_b"], float],
NDArray[Shape["* x, * y, 4 r_g_b_a"], float],
]
] = Field(None)
@@ -94,15 +103,15 @@ class RGBImage(Image):
)
name: str = Field(...)
resolution: Optional[np.float32] = Field(
resolution: Optional[float] = Field(
None, description="""Pixel resolution of the image, in pixels per centimeter."""
)
description: Optional[str] = Field(None, description="""Description of the image.""")
array: Optional[
value: Optional[
Union[
NDArray[Shape["* x, * y"], np.number],
NDArray[Shape["* x, * y, 3 r_g_b"], np.number],
NDArray[Shape["* x, * y, 4 r_g_b_a"], np.number],
NDArray[Shape["* x, * y"], float],
NDArray[Shape["* x, * y, 3 r_g_b"], float],
NDArray[Shape["* x, * y, 4 r_g_b_a"], float],
]
] = Field(None)
@@ -117,15 +126,15 @@ class RGBAImage(Image):
)
name: str = Field(...)
resolution: Optional[np.float32] = Field(
resolution: Optional[float] = Field(
None, description="""Pixel resolution of the image, in pixels per centimeter."""
)
description: Optional[str] = Field(None, description="""Description of the image.""")
array: Optional[
value: Optional[
Union[
NDArray[Shape["* x, * y"], np.number],
NDArray[Shape["* x, * y, 3 r_g_b"], np.number],
NDArray[Shape["* x, * y, 4 r_g_b_a"], np.number],
NDArray[Shape["* x, * y"], float],
NDArray[Shape["* x, * y, 3 r_g_b"], float],
NDArray[Shape["* x, * y, 4 r_g_b_a"], float],
]
] = Field(None)
@@ -142,11 +151,11 @@ class ImageSeries(TimeSeries):
name: str = Field(...)
data: Optional[
Union[
NDArray[Shape["* frame, * x, * y"], np.number],
NDArray[Shape["* frame, * x, * y, * z"], np.number],
NDArray[Shape["* frame, * x, * y"], float],
NDArray[Shape["* frame, * x, * y, * z"], float],
]
] = Field(None, description="""Binary data representing images across frames.""")
dimension: Optional[NDArray[Shape["* rank"], np.int32]] = Field(
dimension: Optional[NDArray[Shape["* rank"], int]] = Field(
None,
description="""Number of pixels on x, y, (and z) axes.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "rank"}]}}},
@ -159,21 +168,26 @@ class ImageSeries(TimeSeries):
None,
description="""Format of image. If this is 'external', then the attribute 'external_file' contains the path information to the image files. If this is 'raw', then the raw (single-channel) binary data is stored in the 'data' dataset. If this attribute is not present, then the default format='raw' case is assumed.""",
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -204,11 +218,11 @@ class ImageSeriesExternalFile(ConfiguredBaseModel):
"linkml_meta": {"equals_string": "external_file", "ifabsent": "string(external_file)"}
},
)
starting_frame: Optional[np.int32] = Field(
None,
starting_frame: List[int] = Field(
...,
        description="""Each external image may contain one or more consecutive frames of the full ImageSeries. This attribute serves as an index to indicate which frames each file contains, to facilitate random access. The 'starting_frame' attribute, hence, contains a list of frame numbers within the full ImageSeries of the first frame of each file listed in the parent 'external_file' dataset. Zero-based indexing is used (hence, the first element will always be zero). For example, if the 'external_file' dataset has three paths to files and the first file has 5 frames, the second file has 10 frames, and the third file has 20 frames, then this attribute will have values [0, 5, 15]. If there is a single external file that holds all of the frames of the ImageSeries (and so there is a single element in the 'external_file' dataset), then this attribute should have value [0].""",
)
array: Optional[NDArray[Shape["* num_files"], str]] = Field(
value: Optional[NDArray[Shape["* num_files"], str]] = Field(
None, json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_files"}]}}}
)
@@ -223,13 +237,22 @@ class ImageMaskSeries(ImageSeries):
)
name: str = Field(...)
masked_imageseries: Union[ImageSeries, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "ImageSeries"}, {"range": "string"}],
}
},
)
data: Optional[
Union[
NDArray[Shape["* frame, * x, * y"], np.number],
NDArray[Shape["* frame, * x, * y, * z"], np.number],
NDArray[Shape["* frame, * x, * y"], float],
NDArray[Shape["* frame, * x, * y, * z"], float],
]
] = Field(None, description="""Binary data representing images across frames.""")
dimension: Optional[NDArray[Shape["* rank"], np.int32]] = Field(
dimension: Optional[NDArray[Shape["* rank"], int]] = Field(
None,
description="""Number of pixels on x, y, (and z) axes.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "rank"}]}}},
@@ -242,21 +265,26 @@ class ImageMaskSeries(ImageSeries):
None,
description="""Format of image. If this is 'external', then the attribute 'external_file' contains the path information to the image files. If this is 'raw', then the raw (single-channel) binary data is stored in the 'data' dataset. If this attribute is not present, then the default format='raw' case is assumed.""",
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -284,13 +312,12 @@ class OpticalSeries(ImageSeries):
)
name: str = Field(...)
distance: Optional[np.float32] = Field(
distance: Optional[float] = Field(
None, description="""Distance from camera/monitor to target/eye."""
)
field_of_view: Optional[
Union[
NDArray[Shape["2 width_height"], np.float32],
NDArray[Shape["3 width_height_depth"], np.float32],
NDArray[Shape["2 width_height"], float], NDArray[Shape["3 width_height_depth"], float]
]
] = Field(None, description="""Width, height and depth of image, or imaged area, in meters.""")
orientation: Optional[str] = Field(
@@ -299,11 +326,11 @@ class OpticalSeries(ImageSeries):
)
data: Optional[
Union[
NDArray[Shape["* frame, * x, * y"], np.number],
NDArray[Shape["* frame, * x, * y, * z"], np.number],
NDArray[Shape["* frame, * x, * y"], float],
NDArray[Shape["* frame, * x, * y, * z"], float],
]
] = Field(None, description="""Binary data representing images across frames.""")
dimension: Optional[NDArray[Shape["* rank"], np.int32]] = Field(
dimension: Optional[NDArray[Shape["* rank"], int]] = Field(
None,
description="""Number of pixels on x, y, (and z) axes.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "rank"}]}}},
@@ -316,21 +343,26 @@ class OpticalSeries(ImageSeries):
None,
description="""Format of image. If this is 'external', then the attribute 'external_file' contains the path information to the image files. If this is 'raw', then the raw (single-channel) binary data is stored in the 'data' dataset. If this attribute is not present, then the default format='raw' case is assumed.""",
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -358,26 +390,40 @@ class IndexSeries(TimeSeries):
)
name: str = Field(...)
data: NDArray[Shape["* num_times"], np.int32] = Field(
data: NDArray[Shape["* num_times"], int] = Field(
...,
description="""Index of the frame in the referenced ImageSeries.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
indexed_timeseries: Union[ImageSeries, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "ImageSeries"}, {"range": "string"}],
}
},
)
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},


@@ -43,6 +43,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@@ -68,7 +77,7 @@ ModelType = TypeVar("ModelType", bound=Type[BaseModel])
def _get_name(item: ModelType | dict, info: ValidationInfo) -> Union[ModelType, dict]:
"""Get the name of the slot that refers to this object"""
assert isinstance(item, (BaseModel, dict))
assert isinstance(item, (BaseModel, dict)), f"{item} was not a BaseModel or a dict!"
name = info.field_name
if isinstance(item, BaseModel):
item.name = name
@@ -120,21 +129,26 @@ class AbstractFeatureSeries(TimeSeries):
description="""Description of the features represented in TimeSeries::data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_features"}]}}},
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -164,13 +178,14 @@ class AbstractFeatureSeriesData(ConfiguredBaseModel):
json_schema_extra={"linkml_meta": {"equals_string": "data", "ifabsent": "string(data)"}},
)
unit: Optional[str] = Field(
None,
"see 'feature_units'",
description="""Since there can be different units for different features, store the units in 'feature_units'. The default value for this attribute is \"see 'feature_units'\".""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(see 'feature_units')"}},
)
array: Optional[
value: Optional[
Union[
NDArray[Shape["* num_times"], np.number],
NDArray[Shape["* num_times, * num_features"], np.number],
NDArray[Shape["* num_times"], float],
NDArray[Shape["* num_times, * num_features"], float],
]
] = Field(None)
@@ -190,21 +205,26 @@ class AnnotationSeries(TimeSeries):
description="""Annotations made during an experiment.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -232,26 +252,31 @@ class IntervalSeries(TimeSeries):
)
name: str = Field(...)
data: NDArray[Shape["* num_times"], np.int8] = Field(
data: NDArray[Shape["* num_times"], int] = Field(
...,
description="""Use values >0 if interval started, <0 if interval ended.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -287,21 +312,35 @@ class DecompositionSeries(TimeSeries):
...,
description="""Table for describing the bands that this series was generated from. There should be one row in this table for each band.""",
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
comments: Optional[str] = Field(
source_timeseries: Optional[Union[TimeSeries, str]] = Field(
None,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "TimeSeries"}, {"range": "string"}],
}
},
)
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -330,11 +369,12 @@ class DecompositionSeriesData(ConfiguredBaseModel):
"data",
json_schema_extra={"linkml_meta": {"equals_string": "data", "ifabsent": "string(data)"}},
)
unit: Optional[str] = Field(
None,
unit: str = Field(
"no unit",
description="""Base unit of measurement for working with the data. Actual stored values are not necessarily stored in these units. To access the data in these units, multiply 'data' by 'conversion'.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no unit)"}},
)
array: Optional[NDArray[Shape["* num_times, * num_channels, * num_bands"], np.number]] = Field(
value: Optional[NDArray[Shape["* num_times, * num_channels, * num_bands"], float]] = Field(
None,
json_schema_extra={
"linkml_meta": {
@@ -361,7 +401,7 @@ class DecompositionSeriesBands(DynamicTable):
"bands",
json_schema_extra={"linkml_meta": {"equals_string": "bands", "ifabsent": "string(bands)"}},
)
band_name: NDArray[Any, str] = Field(
band_name: VectorData[NDArray[Any, str]] = Field(
...,
description="""Name of the band, e.g. theta.""",
json_schema_extra={
@@ -370,7 +410,7 @@ class DecompositionSeriesBands(DynamicTable):
}
},
)
band_limits: NDArray[Shape["* num_bands, 2 low_high"], np.float32] = Field(
band_limits: VectorData[NDArray[Shape["* num_bands, 2 low_high"], float]] = Field(
...,
description="""Low and high limit of each band in Hz. If it is a Gaussian filter, use 2 SD on either side of the center.""",
json_schema_extra={
@@ -384,24 +424,22 @@ class DecompositionSeriesBands(DynamicTable):
}
},
)
band_mean: NDArray[Shape["* num_bands"], np.float32] = Field(
band_mean: VectorData[NDArray[Shape["* num_bands"], float]] = Field(
...,
description="""The mean Gaussian filters, in Hz.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_bands"}]}}},
)
band_stdev: NDArray[Shape["* num_bands"], np.float32] = Field(
band_stdev: VectorData[NDArray[Shape["* num_bands"], float]] = Field(
...,
description="""The standard deviation of Gaussian filters, in Hz.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_bands"}]}}},
)
colnames: Optional[str] = Field(
None,
colnames: List[str] = Field(
...,
description="""The names of the columns in this table. This should be used to specify an order to the columns.""",
)
description: Optional[str] = Field(
None, description="""Description of what is in this dynamic table."""
)
id: NDArray[Shape["* num_rows"], int] = Field(
description: str = Field(..., description="""Description of what is in this dynamic table.""")
id: VectorData[NDArray[Shape["* num_rows"], int]] = Field(
...,
description="""Array of unique identifiers for the rows of this dynamic table.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_rows"}]}}},
@ -428,7 +466,12 @@ class Units(DynamicTable):
None,
description="""Index into the spike_times dataset.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
spike_times: Optional[UnitsSpikeTimes] = Field(
@ -438,10 +481,16 @@ class Units(DynamicTable):
None,
description="""Index into the obs_intervals dataset.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
obs_intervals: Optional[NDArray[Shape["* num_intervals, 2 start_end"], np.float64]] = Field(
obs_intervals: VectorData[Optional[NDArray[Shape["* num_intervals, 2 start_end"], float]]] = (
Field(
None,
description="""Observation intervals for each unit.""",
json_schema_extra={
@ -455,43 +504,56 @@ class Units(DynamicTable):
}
},
)
)
electrodes_index: Named[Optional[VectorIndex]] = Field(
None,
description="""Index into electrodes.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
electrodes: Named[Optional[DynamicTableRegion]] = Field(
None,
description="""Electrode that each spike unit came from, specified using a DynamicTableRegion.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
electrode_group: Optional[List[ElectrodeGroup]] = Field(
None, description="""Electrode group that each spike unit came from."""
)
waveform_mean: Optional[
waveform_mean: VectorData[
Optional[
Union[
NDArray[Shape["* num_units, * num_samples"], np.float32],
NDArray[Shape["* num_units, * num_samples, * num_electrodes"], np.float32],
NDArray[Shape["* num_units, * num_samples"], float],
NDArray[Shape["* num_units, * num_samples, * num_electrodes"], float],
]
]
] = Field(None, description="""Spike waveform mean for each spike unit.""")
waveform_sd: Optional[
waveform_sd: VectorData[
Optional[
Union[
NDArray[Shape["* num_units, * num_samples"], np.float32],
NDArray[Shape["* num_units, * num_samples, * num_electrodes"], np.float32],
NDArray[Shape["* num_units, * num_samples"], float],
NDArray[Shape["* num_units, * num_samples, * num_electrodes"], float],
]
]
] = Field(None, description="""Spike waveform standard deviation for each spike unit.""")
colnames: Optional[str] = Field(
None,
colnames: List[str] = Field(
...,
description="""The names of the columns in this table. This should be used to specify an order to the columns.""",
)
description: Optional[str] = Field(
None, description="""Description of what is in this dynamic table."""
)
id: NDArray[Shape["* num_rows"], int] = Field(
description: str = Field(..., description="""Description of what is in this dynamic table.""")
id: VectorData[NDArray[Shape["* num_rows"], int]] = Field(
...,
description="""Array of unique identifiers for the rows of this dynamic table.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_rows"}]}}},
@ -517,13 +579,11 @@ class UnitsSpikeTimes(VectorData):
"linkml_meta": {"equals_string": "spike_times", "ifabsent": "string(spike_times)"}
},
)
resolution: Optional[np.float64] = Field(
resolution: Optional[float] = Field(
None,
description="""The smallest possible difference between two spike times. Usually 1 divided by the acquisition sampling rate from which spike times were extracted, but could be larger if the acquisition time series was downsampled or smaller if the acquisition time series was smoothed/interpolated and it is possible for the spike time to be between samples.""",
)
description: Optional[str] = Field(
None, description="""Description of what these vectors represent."""
)
description: str = Field(..., description="""Description of what these vectors represent.""")
# Model rebuild


@ -14,6 +14,7 @@ from ...core.v2_2_1.core_nwb_base import (
TimeSeriesSync,
NWBContainer,
)
from ...core.v2_2_1.core_nwb_device import Device
metamodel_version = "None"
version = "2.2.1"
@ -33,6 +34,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@ -76,26 +86,40 @@ class OptogeneticSeries(TimeSeries):
)
name: str = Field(...)
data: NDArray[Shape["* num_times"], np.number] = Field(
data: NDArray[Shape["* num_times"], float] = Field(
...,
description="""Applied power for optogenetic stimulus, in watts.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
site: Union[OptogeneticStimulusSite, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "OptogeneticStimulusSite"}, {"range": "string"}],
}
},
)
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@ -124,11 +148,20 @@ class OptogeneticStimulusSite(NWBContainer):
name: str = Field(...)
description: str = Field(..., description="""Description of stimulation site.""")
excitation_lambda: np.float32 = Field(..., description="""Excitation wavelength, in nm.""")
excitation_lambda: float = Field(..., description="""Excitation wavelength, in nm.""")
location: str = Field(
...,
description="""Location of the stimulation site. Specify the area, layer, comments on estimation of area/layer, stereotaxic coordinates if in vivo, etc. Use standard atlas names for anatomical regions when possible.""",
)
device: Union[Device, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "Device"}, {"range": "string"}],
}
},
)
# Model rebuild

View file

@ -17,6 +17,7 @@ from pydantic import (
BeforeValidator,
)
from ...hdmf_common.v1_1_2.hdmf_common_table import DynamicTableRegion, DynamicTable
from ...core.v2_2_1.core_nwb_device import Device
from numpydantic import NDArray, Shape
from ...core.v2_2_1.core_nwb_base import (
TimeSeriesStartingTime,
@ -44,6 +45,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@ -69,7 +79,7 @@ ModelType = TypeVar("ModelType", bound=Type[BaseModel])
def _get_name(item: ModelType | dict, info: ValidationInfo) -> Union[ModelType, dict]:
"""Get the name of the slot that refers to this object"""
assert isinstance(item, (BaseModel, dict))
assert isinstance(item, (BaseModel, dict)), f"{item} was not a BaseModel or a dict!"
name = info.field_name
if isinstance(item, BaseModel):
item.name = name
@ -109,24 +119,30 @@ class TwoPhotonSeries(ImageSeries):
)
name: str = Field(...)
pmt_gain: Optional[np.float32] = Field(None, description="""Photomultiplier gain.""")
scan_line_rate: Optional[np.float32] = Field(
pmt_gain: Optional[float] = Field(None, description="""Photomultiplier gain.""")
scan_line_rate: Optional[float] = Field(
None,
description="""Lines imaged per second. This is also stored in /general/optophysiology but is kept here as it is useful information for analysis, and so good to be stored w/ the actual data.""",
)
field_of_view: Optional[
Union[
NDArray[Shape["2 width_height"], np.float32],
NDArray[Shape["3 width_height"], np.float32],
]
Union[NDArray[Shape["2 width_height"], float], NDArray[Shape["3 width_height"], float]]
] = Field(None, description="""Width, height and depth of image, or imaged area, in meters.""")
imaging_plane: Union[ImagingPlane, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "ImagingPlane"}, {"range": "string"}],
}
},
)
data: Optional[
Union[
NDArray[Shape["* frame, * x, * y"], np.number],
NDArray[Shape["* frame, * x, * y, * z"], np.number],
NDArray[Shape["* frame, * x, * y"], float],
NDArray[Shape["* frame, * x, * y, * z"], float],
]
] = Field(None, description="""Binary data representing images across frames.""")
dimension: Optional[NDArray[Shape["* rank"], np.int32]] = Field(
dimension: Optional[NDArray[Shape["* rank"], int]] = Field(
None,
description="""Number of pixels on x, y, (and z) axes.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "rank"}]}}},
@ -139,21 +155,26 @@ class TwoPhotonSeries(ImageSeries):
None,
description="""Format of image. If this is 'external', then the attribute 'external_file' contains the path information to the image files. If this is 'raw', then the raw (single-channel) binary data is stored in the 'data' dataset. If this attribute is not present, then the default format='raw' case is assumed.""",
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@ -182,31 +203,40 @@ class RoiResponseSeries(TimeSeries):
name: str = Field(...)
data: Union[
NDArray[Shape["* num_times"], np.number],
NDArray[Shape["* num_times, * num_rois"], np.number],
NDArray[Shape["* num_times"], float], NDArray[Shape["* num_times, * num_rois"], float]
] = Field(..., description="""Signals from ROIs.""")
rois: Named[DynamicTableRegion] = Field(
...,
description="""DynamicTableRegion referencing into an ROITable containing information on the ROIs stored in this timeseries.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@ -233,7 +263,7 @@ class DfOverF(NWBDataInterface):
{"from_schema": "core.nwb.ophys", "tree_root": True}
)
children: Optional[List[RoiResponseSeries]] = Field(
value: Optional[List[RoiResponseSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "RoiResponseSeries"}]}}
)
name: str = Field(...)
@ -248,7 +278,7 @@ class Fluorescence(NWBDataInterface):
{"from_schema": "core.nwb.ophys", "tree_root": True}
)
children: Optional[List[RoiResponseSeries]] = Field(
value: Optional[List[RoiResponseSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "RoiResponseSeries"}]}}
)
name: str = Field(...)
@ -263,7 +293,7 @@ class ImageSegmentation(NWBDataInterface):
{"from_schema": "core.nwb.ophys", "tree_root": True}
)
children: Optional[List[DynamicTable]] = Field(
value: Optional[List[DynamicTable]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "DynamicTable"}]}}
)
name: str = Field(...)
@ -280,8 +310,8 @@ class ImagingPlane(NWBContainer):
name: str = Field(...)
description: Optional[str] = Field(None, description="""Description of the imaging plane.""")
excitation_lambda: np.float32 = Field(..., description="""Excitation wavelength, in nm.""")
imaging_rate: np.float32 = Field(..., description="""Rate that images are acquired, in Hz.""")
excitation_lambda: float = Field(..., description="""Excitation wavelength, in nm.""")
imaging_rate: float = Field(..., description="""Rate that images are acquired, in Hz.""")
indicator: str = Field(..., description="""Calcium indicator.""")
location: str = Field(
...,
@ -306,6 +336,15 @@ class ImagingPlane(NWBContainer):
optical_channel: OpticalChannel = Field(
..., description="""An optical channel used to record from an imaging plane."""
)
device: Union[Device, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "Device"}, {"range": "string"}],
}
},
)
class ImagingPlaneManifold(ConfiguredBaseModel):
@ -321,18 +360,20 @@ class ImagingPlaneManifold(ConfiguredBaseModel):
"linkml_meta": {"equals_string": "manifold", "ifabsent": "string(manifold)"}
},
)
conversion: Optional[np.float32] = Field(
None,
conversion: Optional[float] = Field(
1.0,
description="""Scalar to multiply each element in data to convert it to the specified 'unit'. If the data are stored in acquisition system units or other units that require a conversion to be interpretable, multiply the data by 'conversion' to convert the data to the specified 'unit'. e.g. if the data acquisition system stores values in this object as pixels from x = -500 to 499, y = -500 to 499 that correspond to a 2 m x 2 m range, then the 'conversion' multiplier to get from raw data acquisition pixel units to meters is 2/1000.""",
json_schema_extra={"linkml_meta": {"ifabsent": "float(1.0)"}},
)
unit: Optional[str] = Field(
None,
"meters",
description="""Base unit of measurement for working with the data. The default value is 'meters'.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(meters)"}},
)
array: Optional[
value: Optional[
Union[
NDArray[Shape["* height, * width, 3 x_y_z"], np.float32],
NDArray[Shape["* height, * width, * depth, 3 x_y_z"], np.float32],
NDArray[Shape["* height, * width, 3 x_y_z"], float],
NDArray[Shape["* height, * width, * depth, 3 x_y_z"], float],
]
] = Field(None)
@ -350,10 +391,12 @@ class ImagingPlaneOriginCoords(ConfiguredBaseModel):
"linkml_meta": {"equals_string": "origin_coords", "ifabsent": "string(origin_coords)"}
},
)
unit: Optional[str] = Field(
None, description="""Measurement units for origin_coords. The default value is 'meters'."""
unit: str = Field(
"meters",
description="""Measurement units for origin_coords. The default value is 'meters'.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(meters)"}},
)
array: Optional[NDArray[Shape["2 x_y, 3 x_y_z"], np.float32]] = Field(
value: Optional[NDArray[Shape["2 x_y, 3 x_y_z"], float]] = Field(
None,
json_schema_extra={
"linkml_meta": {
@ -381,10 +424,12 @@ class ImagingPlaneGridSpacing(ConfiguredBaseModel):
"linkml_meta": {"equals_string": "grid_spacing", "ifabsent": "string(grid_spacing)"}
},
)
unit: Optional[str] = Field(
None, description="""Measurement units for grid_spacing. The default value is 'meters'."""
unit: str = Field(
"meters",
description="""Measurement units for grid_spacing. The default value is 'meters'.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(meters)"}},
)
array: Optional[NDArray[Shape["2 x_y, 3 x_y_z"], np.float32]] = Field(
value: Optional[NDArray[Shape["2 x_y, 3 x_y_z"], float]] = Field(
None,
json_schema_extra={
"linkml_meta": {
@ -408,9 +453,7 @@ class OpticalChannel(NWBContainer):
name: str = Field(...)
description: str = Field(..., description="""Description or other notes about the channel.""")
emission_lambda: np.float32 = Field(
..., description="""Emission wavelength for channel, in nm."""
)
emission_lambda: float = Field(..., description="""Emission wavelength for channel, in nm.""")
class MotionCorrection(NWBDataInterface):
@ -422,7 +465,7 @@ class MotionCorrection(NWBDataInterface):
{"from_schema": "core.nwb.ophys", "tree_root": True}
)
children: Optional[List[NWBDataInterface]] = Field(
value: Optional[List[NWBDataInterface]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "NWBDataInterface"}]}}
)
name: str = Field(...)


@ -37,6 +37,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@ -62,7 +71,7 @@ ModelType = TypeVar("ModelType", bound=Type[BaseModel])
def _get_name(item: ModelType | dict, info: ValidationInfo) -> Union[ModelType, dict]:
"""Get the name of the slot that refers to this object"""
assert isinstance(item, (BaseModel, dict))
assert isinstance(item, (BaseModel, dict)), f"{item} was not a BaseModel or a dict!"
name = info.field_name
if isinstance(item, BaseModel):
item.name = name
@ -96,14 +105,12 @@ class RetinotopyMap(NWBData):
)
name: str = Field(...)
dimension: Optional[np.int32] = Field(
None,
dimension: List[int] = Field(
...,
description="""Number of rows and columns in the image. NOTE: row, column representation is equivalent to height, width.""",
)
field_of_view: Optional[np.float32] = Field(
None, description="""Size of viewing area, in meters."""
)
array: Optional[NDArray[Shape["* num_rows, * num_cols"], np.float32]] = Field(
field_of_view: List[float] = Field(..., description="""Size of viewing area, in meters.""")
value: Optional[NDArray[Shape["* num_rows, * num_cols"], float]] = Field(
None,
json_schema_extra={
"linkml_meta": {"array": {"dimensions": [{"alias": "num_rows"}, {"alias": "num_cols"}]}}
@ -121,22 +128,18 @@ class AxisMap(RetinotopyMap):
)
name: str = Field(...)
unit: Optional[str] = Field(
None, description="""Unit that axis data is stored in (e.g., degrees)."""
)
array: Optional[NDArray[Shape["* num_rows, * num_cols"], np.float32]] = Field(
unit: str = Field(..., description="""Unit that axis data is stored in (e.g., degrees).""")
value: Optional[NDArray[Shape["* num_rows, * num_cols"], float]] = Field(
None,
json_schema_extra={
"linkml_meta": {"array": {"dimensions": [{"alias": "num_rows"}, {"alias": "num_cols"}]}}
},
)
dimension: Optional[np.int32] = Field(
None,
dimension: List[int] = Field(
...,
description="""Number of rows and columns in the image. NOTE: row, column representation is equivalent to height, width.""",
)
field_of_view: Optional[np.float32] = Field(
None, description="""Size of viewing area, in meters."""
)
field_of_view: List[float] = Field(..., description="""Size of viewing area, in meters.""")
class RetinotopyImage(GrayscaleImage):
@ -149,29 +152,25 @@ class RetinotopyImage(GrayscaleImage):
)
name: str = Field(...)
bits_per_pixel: Optional[np.int32] = Field(
None,
bits_per_pixel: int = Field(
...,
description="""Number of bits used to represent each value. This is necessary to determine maximum (white) pixel value.""",
)
dimension: Optional[np.int32] = Field(
None,
dimension: List[int] = Field(
...,
description="""Number of rows and columns in the image. NOTE: row, column representation is equivalent to height, width.""",
)
field_of_view: Optional[np.float32] = Field(
None, description="""Size of viewing area, in meters."""
)
format: Optional[str] = Field(
None, description="""Format of image. Right now only 'raw' is supported."""
)
resolution: Optional[np.float32] = Field(
field_of_view: List[float] = Field(..., description="""Size of viewing area, in meters.""")
format: str = Field(..., description="""Format of image. Right now only 'raw' is supported.""")
resolution: Optional[float] = Field(
None, description="""Pixel resolution of the image, in pixels per centimeter."""
)
description: Optional[str] = Field(None, description="""Description of the image.""")
array: Optional[
value: Optional[
Union[
NDArray[Shape["* x, * y"], np.number],
NDArray[Shape["* x, * y, 3 r_g_b"], np.number],
NDArray[Shape["* x, * y, 4 r_g_b_a"], np.number],
NDArray[Shape["* x, * y"], float],
NDArray[Shape["* x, * y, 3 r_g_b"], float],
NDArray[Shape["* x, * y, 4 r_g_b_a"], float],
]
] = Field(None)
@ -193,35 +192,60 @@ class ImagingRetinotopy(NWBDataInterface):
...,
description="""Phase response to stimulus on the first measured axis.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
axis_1_power_map: Named[Optional[AxisMap]] = Field(
None,
description="""Power response on the first measured axis. Response is scaled so 0.0 is no power in the response and 1.0 is maximum relative power.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
axis_2_phase_map: Named[AxisMap] = Field(
...,
description="""Phase response to stimulus on the second measured axis.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
axis_2_power_map: Named[Optional[AxisMap]] = Field(
None,
description="""Power response to stimulus on the second measured axis.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
sign_map: Named[RetinotopyMap] = Field(
...,
description="""Sine of the angle between the direction of the gradient in axis_1 and axis_2.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
axis_descriptions: NDArray[Shape["2 num_axes"], str] = Field(
@ -241,7 +265,12 @@ class ImagingRetinotopy(NWBDataInterface):
...,
description="""Gray-scale anatomical image of cortical surface. Array structure: [rows][columns]""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
@ -262,32 +291,26 @@ class ImagingRetinotopyFocalDepthImage(RetinotopyImage):
}
},
)
focal_depth: Optional[np.float32] = Field(
None, description="""Focal depth offset, in meters."""
)
bits_per_pixel: Optional[np.int32] = Field(
None,
focal_depth: float = Field(..., description="""Focal depth offset, in meters.""")
bits_per_pixel: int = Field(
...,
description="""Number of bits used to represent each value. This is necessary to determine maximum (white) pixel value.""",
)
dimension: Optional[np.int32] = Field(
None,
dimension: List[int] = Field(
...,
description="""Number of rows and columns in the image. NOTE: row, column representation is equivalent to height, width.""",
)
field_of_view: Optional[np.float32] = Field(
None, description="""Size of viewing area, in meters."""
)
format: Optional[str] = Field(
None, description="""Format of image. Right now only 'raw' is supported."""
)
resolution: Optional[np.float32] = Field(
field_of_view: List[float] = Field(..., description="""Size of viewing area, in meters.""")
format: str = Field(..., description="""Format of image. Right now only 'raw' is supported.""")
resolution: Optional[float] = Field(
None, description="""Pixel resolution of the image, in pixels per centimeter."""
)
description: Optional[str] = Field(None, description="""Description of the image.""")
array: Optional[
value: Optional[
Union[
NDArray[Shape["* x, * y"], np.number],
NDArray[Shape["* x, * y, 3 r_g_b"], np.number],
NDArray[Shape["* x, * y, 4 r_g_b_a"], np.number],
NDArray[Shape["* x, * y"], float],
NDArray[Shape["* x, * y, 3 r_g_b"], float],
NDArray[Shape["* x, * y, 4 r_g_b_a"], float],
]
] = Field(None)


@ -128,11 +128,12 @@ from ...core.v2_2_1.core_nwb_file import (
NWBFile,
NWBFileStimulus,
NWBFileGeneral,
NWBFileGeneralSourceScript,
GeneralSourceScript,
Subject,
NWBFileGeneralExtracellularEphys,
NWBFileGeneralExtracellularEphysElectrodes,
NWBFileGeneralIntracellularEphys,
GeneralExtracellularEphys,
ExtracellularEphysElectrodes,
GeneralIntracellularEphys,
NWBFileIntervals,
)
from ...core.v2_2_1.core_nwb_epoch import TimeIntervals, TimeIntervalsTimeseries
@ -154,6 +155,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}


@ -28,6 +28,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@@ -83,15 +92,15 @@ class Image(NWBData):
)
name: str = Field(...)
resolution: Optional[np.float32] = Field(
resolution: Optional[float] = Field(
None, description="""Pixel resolution of the image, in pixels per centimeter."""
)
description: Optional[str] = Field(None, description="""Description of the image.""")
array: Optional[
value: Optional[
Union[
NDArray[Shape["* x, * y"], np.number],
NDArray[Shape["* x, * y, 3 r_g_b"], np.number],
NDArray[Shape["* x, * y, 4 r_g_b_a"], np.number],
NDArray[Shape["* x, * y"], float],
NDArray[Shape["* x, * y, 3 r_g_b"], float],
NDArray[Shape["* x, * y, 4 r_g_b_a"], float],
]
] = Field(None)
@@ -130,10 +139,15 @@ class TimeSeries(NWBDataInterface):
)
name: str = Field(...)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
data: TimeSeriesData = Field(
...,
@@ -143,12 +157,12 @@ class TimeSeries(NWBDataInterface):
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -177,19 +191,21 @@ class TimeSeriesData(ConfiguredBaseModel):
"data",
json_schema_extra={"linkml_meta": {"equals_string": "data", "ifabsent": "string(data)"}},
)
conversion: Optional[np.float32] = Field(
None,
conversion: Optional[float] = Field(
1.0,
description="""Scalar to multiply each element in data to convert it to the specified 'unit'. If the data are stored in acquisition system units or other units that require a conversion to be interpretable, multiply the data by 'conversion' to convert the data to the specified 'unit'. e.g. if the data acquisition system stores values in this object as signed 16-bit integers (int16 range -32,768 to 32,767) that correspond to a 5V range (-2.5V to 2.5V), and the data acquisition system gain is 8000X, then the 'conversion' multiplier to get from raw data acquisition values to recorded volts is 2.5/32768/8000 = 9.5367e-9.""",
json_schema_extra={"linkml_meta": {"ifabsent": "float(1.0)"}},
)
resolution: Optional[np.float32] = Field(
None,
resolution: Optional[float] = Field(
-1.0,
description="""Smallest meaningful difference between values in data, stored in the specified by unit, e.g., the change in value of the least significant bit, or a larger number if signal noise is known to be present. If unknown, use -1.0.""",
json_schema_extra={"linkml_meta": {"ifabsent": "float(-1.0)"}},
)
unit: Optional[str] = Field(
None,
unit: str = Field(
...,
description="""Base unit of measurement for working with the data. Actual stored values are not necessarily stored in these units. To access the data in these units, multiply 'data' by 'conversion'.""",
)
array: Optional[
value: Optional[
Union[
NDArray[Shape["* num_times"], Any],
NDArray[Shape["* num_times, * num_dim2"], Any],
@@ -212,11 +228,15 @@ class TimeSeriesStartingTime(ConfiguredBaseModel):
"linkml_meta": {"equals_string": "starting_time", "ifabsent": "string(starting_time)"}
},
)
rate: Optional[np.float32] = Field(None, description="""Sampling rate, in Hz.""")
unit: Optional[str] = Field(
None, description="""Unit of measurement for time, which is fixed to 'seconds'."""
rate: float = Field(..., description="""Sampling rate, in Hz.""")
unit: Literal["seconds"] = Field(
"seconds",
description="""Unit of measurement for time, which is fixed to 'seconds'.""",
json_schema_extra={
"linkml_meta": {"equals_string": "seconds", "ifabsent": "string(seconds)"}
},
)
value: np.float64 = Field(...)
value: float = Field(...)
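The switch above from a free-form `Optional[str]` unit to `Literal["seconds"]` pins the attribute to its spec-fixed value, so a mismatched unit fails validation instead of passing silently. A minimal sketch of the equivalent check outside pydantic (`validate_unit` is a hypothetical helper, not part of the generated code):

```python
from typing import Literal, get_args

SecondsUnit = Literal["seconds"]  # mirrors the generated equals_string constraint


def validate_unit(value: str) -> str:
    # reject anything other than the spec-fixed value, as the
    # Literal["seconds"] field does under pydantic validation
    allowed = get_args(SecondsUnit)
    if value not in allowed:
        raise ValueError(f"unit must be one of {allowed}, got {value!r}")
    return value


assert validate_unit("seconds") == "seconds"
```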
class TimeSeriesSync(ConfiguredBaseModel):
@@ -241,7 +261,7 @@ class ProcessingModule(NWBContainer):
{"from_schema": "core.nwb.base", "tree_root": True}
)
children: Optional[List[Union[DynamicTable, NWBDataInterface]]] = Field(
value: Optional[List[Union[DynamicTable, NWBDataInterface]]] = Field(
None,
json_schema_extra={
"linkml_meta": {"any_of": [{"range": "NWBDataInterface"}, {"range": "DynamicTable"}]}
@@ -260,9 +280,7 @@ class Images(NWBDataInterface):
)
name: str = Field("Images", json_schema_extra={"linkml_meta": {"ifabsent": "string(Images)"}})
description: Optional[str] = Field(
None, description="""Description of this collection of images."""
)
description: str = Field(..., description="""Description of this collection of images.""")
image: List[Image] = Field(..., description="""Images stored in this collection.""")


@@ -34,6 +34,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@@ -84,21 +93,26 @@ class SpatialSeries(TimeSeries):
reference_frame: Optional[str] = Field(
None, description="""Description defining what exactly 'straight-ahead' means."""
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -128,13 +142,14 @@ class SpatialSeriesData(ConfiguredBaseModel):
json_schema_extra={"linkml_meta": {"equals_string": "data", "ifabsent": "string(data)"}},
)
unit: Optional[str] = Field(
None,
"meters",
description="""Base unit of measurement for working with the data. The default value is 'meters'. Actual stored values are not necessarily stored in these units. To access the data in these units, multiply 'data' by 'conversion'.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(meters)"}},
)
array: Optional[
value: Optional[
Union[
NDArray[Shape["* num_times"], np.number],
NDArray[Shape["* num_times, * num_features"], np.number],
NDArray[Shape["* num_times"], float],
NDArray[Shape["* num_times, * num_features"], float],
]
] = Field(None)
@@ -148,7 +163,7 @@ class BehavioralEpochs(NWBDataInterface):
{"from_schema": "core.nwb.behavior", "tree_root": True}
)
children: Optional[List[IntervalSeries]] = Field(
value: Optional[List[IntervalSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "IntervalSeries"}]}}
)
name: str = Field(...)
@@ -163,7 +178,7 @@ class BehavioralEvents(NWBDataInterface):
{"from_schema": "core.nwb.behavior", "tree_root": True}
)
children: Optional[List[TimeSeries]] = Field(
value: Optional[List[TimeSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "TimeSeries"}]}}
)
name: str = Field(...)
@@ -178,7 +193,7 @@ class BehavioralTimeSeries(NWBDataInterface):
{"from_schema": "core.nwb.behavior", "tree_root": True}
)
children: Optional[List[TimeSeries]] = Field(
value: Optional[List[TimeSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "TimeSeries"}]}}
)
name: str = Field(...)
@@ -193,7 +208,7 @@ class PupilTracking(NWBDataInterface):
{"from_schema": "core.nwb.behavior", "tree_root": True}
)
children: Optional[List[TimeSeries]] = Field(
value: Optional[List[TimeSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "TimeSeries"}]}}
)
name: str = Field(...)
@@ -208,7 +223,7 @@ class EyeTracking(NWBDataInterface):
{"from_schema": "core.nwb.behavior", "tree_root": True}
)
children: Optional[List[SpatialSeries]] = Field(
value: Optional[List[SpatialSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "SpatialSeries"}]}}
)
name: str = Field(...)
@@ -223,7 +238,7 @@ class CompassDirection(NWBDataInterface):
{"from_schema": "core.nwb.behavior", "tree_root": True}
)
children: Optional[List[SpatialSeries]] = Field(
value: Optional[List[SpatialSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "SpatialSeries"}]}}
)
name: str = Field(...)
@@ -238,7 +253,7 @@ class Position(NWBDataInterface):
{"from_schema": "core.nwb.behavior", "tree_root": True}
)
children: Optional[List[SpatialSeries]] = Field(
value: Optional[List[SpatialSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "SpatialSeries"}]}}
)
name: str = Field(...)


@@ -27,6 +27,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}


@@ -16,6 +16,7 @@ from pydantic import (
ValidationInfo,
BeforeValidator,
)
from ...core.v2_2_2.core_nwb_device import Device
from ...core.v2_2_2.core_nwb_base import (
TimeSeries,
TimeSeriesStartingTime,
@@ -43,6 +44,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@@ -68,7 +78,7 @@ ModelType = TypeVar("ModelType", bound=Type[BaseModel])
def _get_name(item: ModelType | dict, info: ValidationInfo) -> Union[ModelType, dict]:
"""Get the name of the slot that refers to this object"""
assert isinstance(item, (BaseModel, dict))
assert isinstance(item, (BaseModel, dict)), f"{item} was not a BaseModel or a dict!"
name = info.field_name
if isinstance(item, BaseModel):
item.name = name
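The strengthened assertion in `_get_name` above guards a pydantic BeforeValidator that stamps the parent slot's name onto an incoming child value. The core move, stripped of the pydantic machinery (`inject_name` is an illustrative stand-in, not the generated function):

```python
from typing import Any, Union


def inject_name(item: Union[dict, Any], field_name: str) -> Union[dict, Any]:
    # stamp the slot name onto the incoming value, whether it arrives
    # as a raw dict or an already-parsed object with a mutable `name`
    if isinstance(item, dict):
        item["name"] = field_name
    else:
        item.name = field_name
    return item


assert inject_name({}, "electrodes") == {"name": "electrodes"}
```

In the generated models, `field_name` comes from `ValidationInfo.field_name`, so `Named[DynamicTableRegion]` slots like `electrodes` get their name filled in automatically during validation.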
@@ -108,37 +118,47 @@ class ElectricalSeries(TimeSeries):
name: str = Field(...)
data: Union[
NDArray[Shape["* num_times"], np.number],
NDArray[Shape["* num_times, * num_channels"], np.number],
NDArray[Shape["* num_times, * num_channels, * num_samples"], np.number],
NDArray[Shape["* num_times"], float],
NDArray[Shape["* num_times, * num_channels"], float],
NDArray[Shape["* num_times, * num_channels, * num_samples"], float],
] = Field(..., description="""Recorded voltage data.""")
electrodes: Named[DynamicTableRegion] = Field(
...,
description="""DynamicTableRegion pointer to the electrodes that this time series was generated from.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
channel_conversion: Optional[NDArray[Shape["* num_channels"], np.float32]] = Field(
channel_conversion: Optional[NDArray[Shape["* num_channels"], float]] = Field(
None,
description="""Channel-specific conversion factor. Multiply the data in the 'data' dataset by these values along the channel axis (as indicated by axis attribute) AND by the global conversion factor in the 'conversion' attribute of 'data' to get the data values in Volts, i.e, data in Volts = data * data.conversion * channel_conversion. This approach allows for both global and per-channel data conversion factors needed to support the storage of electrical recordings as native values generated by data acquisition systems. If this dataset is not present, then there is no channel-specific conversion factor, i.e. it is 1 for all channels.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_channels"}]}}},
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -167,10 +187,10 @@ class SpikeEventSeries(ElectricalSeries):
name: str = Field(...)
data: Union[
NDArray[Shape["* num_events, * num_samples"], np.number],
NDArray[Shape["* num_events, * num_channels, * num_samples"], np.number],
NDArray[Shape["* num_events, * num_samples"], float],
NDArray[Shape["* num_events, * num_channels, * num_samples"], float],
] = Field(..., description="""Spike waveforms.""")
timestamps: NDArray[Shape["* num_times"], np.float64] = Field(
timestamps: NDArray[Shape["* num_times"], float] = Field(
...,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time. Timestamps are required for the events. Unlike for TimeSeries, timestamps are required for SpikeEventSeries and are thus re-specified here.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -179,24 +199,34 @@ class SpikeEventSeries(ElectricalSeries):
...,
description="""DynamicTableRegion pointer to the electrodes that this time series was generated from.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
channel_conversion: Optional[NDArray[Shape["* num_channels"], np.float32]] = Field(
channel_conversion: Optional[NDArray[Shape["* num_channels"], float]] = Field(
None,
description="""Channel-specific conversion factor. Multiply the data in the 'data' dataset by these values along the channel axis (as indicated by axis attribute) AND by the global conversion factor in the 'conversion' attribute of 'data' to get the data values in Volts, i.e, data in Volts = data * data.conversion * channel_conversion. This approach allows for both global and per-channel data conversion factors needed to support the storage of electrical recordings as native values generated by data acquisition systems. If this dataset is not present, then there is no channel-specific conversion factor, i.e. it is 1 for all channels.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_channels"}]}}},
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -232,7 +262,7 @@ class FeatureExtraction(NWBDataInterface):
description="""Description of features (eg, ''PC1'') for each of the extracted features.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_features"}]}}},
)
features: NDArray[Shape["* num_events, * num_channels, * num_features"], np.float32] = Field(
features: NDArray[Shape["* num_events, * num_channels, * num_features"], float] = Field(
...,
description="""Multi-dimensional array of features extracted from each event.""",
json_schema_extra={
@@ -247,7 +277,7 @@ class FeatureExtraction(NWBDataInterface):
}
},
)
times: NDArray[Shape["* num_events"], np.float64] = Field(
times: NDArray[Shape["* num_events"], float] = Field(
...,
description="""Times of events that features correspond to (can be a link).""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_events"}]}}},
@@ -256,7 +286,7 @@ class FeatureExtraction(NWBDataInterface):
...,
description="""DynamicTableRegion pointer to the electrodes that this time series was generated from.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
@@ -277,16 +312,25 @@ class EventDetection(NWBDataInterface):
...,
description="""Description of how events were detected, such as voltage threshold, or dV/dT threshold, as well as relevant values.""",
)
source_idx: NDArray[Shape["* num_events"], np.int32] = Field(
source_idx: NDArray[Shape["* num_events"], int] = Field(
...,
description="""Indices (zero-based) into source ElectricalSeries::data array corresponding to time of event. ''description'' should define what is meant by time of event (e.g., .25 ms before action potential peak, zero-crossing time, etc). The index points to each event from the raw data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_events"}]}}},
)
times: NDArray[Shape["* num_events"], np.float64] = Field(
times: NDArray[Shape["* num_events"], float] = Field(
...,
description="""Timestamps of events, in seconds.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_events"}]}}},
)
source_electricalseries: Union[ElectricalSeries, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "ElectricalSeries"}, {"range": "string"}],
}
},
)
class EventWaveform(NWBDataInterface):
@@ -298,7 +342,7 @@ class EventWaveform(NWBDataInterface):
{"from_schema": "core.nwb.ecephys", "tree_root": True}
)
children: Optional[List[SpikeEventSeries]] = Field(
value: Optional[List[SpikeEventSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "SpikeEventSeries"}]}}
)
name: str = Field(...)
@@ -313,7 +357,7 @@ class FilteredEphys(NWBDataInterface):
{"from_schema": "core.nwb.ecephys", "tree_root": True}
)
children: Optional[List[ElectricalSeries]] = Field(
value: Optional[List[ElectricalSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "ElectricalSeries"}]}}
)
name: str = Field(...)
@@ -328,7 +372,7 @@ class LFP(NWBDataInterface):
{"from_schema": "core.nwb.ecephys", "tree_root": True}
)
children: Optional[List[ElectricalSeries]] = Field(
value: Optional[List[ElectricalSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "ElectricalSeries"}]}}
)
name: str = Field(...)
@@ -344,14 +388,23 @@ class ElectrodeGroup(NWBContainer):
)
name: str = Field(...)
description: Optional[str] = Field(None, description="""Description of this electrode group.""")
location: Optional[str] = Field(
None,
description: str = Field(..., description="""Description of this electrode group.""")
location: str = Field(
...,
description="""Location of electrode group. Specify the area, layer, comments on estimation of area/layer, etc. Use standard atlas names for anatomical regions when possible.""",
)
position: Optional[ElectrodeGroupPosition] = Field(
None, description="""stereotaxic or common framework coordinates"""
)
device: Union[Device, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "Device"}, {"range": "string"}],
}
},
)
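The `device: Union[Device, str]` field above encodes this PR's link convention: an hdmf link slot may hold either the resolved target model or a string reference to it. A hedged sketch of how such a slot might be consumed (`resolve_link`, the registry, and the example path are hypothetical, not part of the generated models):

```python
from dataclasses import dataclass
from typing import Dict, Union


@dataclass
class Device:
    name: str


# a link slot may hold either the resolved object or a string reference
DeviceLink = Union[Device, str]


def resolve_link(link: DeviceLink, registry: Dict[str, Device]) -> Device:
    # pass a resolved Device through unchanged; look a string path up in a registry
    if isinstance(link, Device):
        return link
    return registry[link]


registry = {"/general/devices/probe0": Device(name="probe0")}
assert resolve_link("/general/devices/probe0", registry).name == "probe0"
```

Keeping `str` in the union means models can round-trip files without eagerly dereferencing every link.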
class ElectrodeGroupPosition(ConfiguredBaseModel):
@@ -367,9 +420,21 @@ class ElectrodeGroupPosition(ConfiguredBaseModel):
"linkml_meta": {"equals_string": "position", "ifabsent": "string(position)"}
},
)
x: Optional[np.float32] = Field(None, description="""x coordinate""")
y: Optional[np.float32] = Field(None, description="""y coordinate""")
z: Optional[np.float32] = Field(None, description="""z coordinate""")
x: Optional[NDArray[Shape["*"], float]] = Field(
None,
description="""x coordinate""",
json_schema_extra={"linkml_meta": {"array": {"exact_number_dimensions": 1}}},
)
y: Optional[NDArray[Shape["*"], float]] = Field(
None,
description="""y coordinate""",
json_schema_extra={"linkml_meta": {"array": {"exact_number_dimensions": 1}}},
)
z: Optional[NDArray[Shape["*"], float]] = Field(
None,
description="""z coordinate""",
json_schema_extra={"linkml_meta": {"array": {"exact_number_dimensions": 1}}},
)
class ClusterWaveforms(NWBDataInterface):
@@ -388,7 +453,7 @@ class ClusterWaveforms(NWBDataInterface):
waveform_filtering: str = Field(
..., description="""Filtering applied to data before generating mean/sd"""
)
waveform_mean: NDArray[Shape["* num_clusters, * num_samples"], np.float32] = Field(
waveform_mean: NDArray[Shape["* num_clusters, * num_samples"], float] = Field(
...,
description="""The mean waveform for each cluster, using the same indices for each wave as cluster numbers in the associated Clustering module (i.e, cluster 3 is in array slot [3]). Waveforms corresponding to gaps in cluster sequence should be empty (e.g., zero- filled)""",
json_schema_extra={
@@ -397,7 +462,7 @@ class ClusterWaveforms(NWBDataInterface):
}
},
)
waveform_sd: NDArray[Shape["* num_clusters, * num_samples"], np.float32] = Field(
waveform_sd: NDArray[Shape["* num_clusters, * num_samples"], float] = Field(
...,
description="""Stdev of waveforms for each cluster, using the same indices as in mean""",
json_schema_extra={
@@ -406,6 +471,15 @@ class ClusterWaveforms(NWBDataInterface):
}
},
)
clustering_interface: Union[Clustering, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "Clustering"}, {"range": "string"}],
}
},
)
class Clustering(NWBDataInterface):
@@ -424,17 +498,17 @@ class Clustering(NWBDataInterface):
...,
description="""Description of clusters or clustering, (e.g. cluster 0 is noise, clusters curated using Klusters, etc)""",
)
num: NDArray[Shape["* num_events"], np.int32] = Field(
num: NDArray[Shape["* num_events"], int] = Field(
...,
description="""Cluster number of each event""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_events"}]}}},
)
peak_over_rms: NDArray[Shape["* num_clusters"], np.float32] = Field(
peak_over_rms: NDArray[Shape["* num_clusters"], float] = Field(
...,
description="""Maximum ratio of waveform peak to RMS on any channel in the cluster (provides a basic clustering metric).""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_clusters"}]}}},
)
times: NDArray[Shape["* num_events"], np.float64] = Field(
times: NDArray[Shape["* num_events"], float] = Field(
...,
description="""Times of clustered events, in seconds. This may be a link to times field in associated FeatureExtraction module.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_events"}]}}},


@@ -37,6 +37,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@@ -62,7 +71,7 @@ ModelType = TypeVar("ModelType", bound=Type[BaseModel])
def _get_name(item: ModelType | dict, info: ValidationInfo) -> Union[ModelType, dict]:
"""Get the name of the slot that refers to this object"""
assert isinstance(item, (BaseModel, dict))
assert isinstance(item, (BaseModel, dict)), f"{item} was not a BaseModel or a dict!"
name = info.field_name
if isinstance(item, BaseModel):
item.name = name
@@ -96,7 +105,7 @@ class TimeIntervals(DynamicTable):
)
name: str = Field(...)
start_time: NDArray[Any, np.float32] = Field(
start_time: VectorData[NDArray[Any, float]] = Field(
...,
description="""Start time of epoch, in seconds.""",
json_schema_extra={
@@ -105,7 +114,7 @@ class TimeIntervals(DynamicTable):
}
},
)
stop_time: NDArray[Any, np.float32] = Field(
stop_time: VectorData[NDArray[Any, float]] = Field(
...,
description="""Stop time of epoch, in seconds.""",
json_schema_extra={
@@ -114,7 +123,7 @@ class TimeIntervals(DynamicTable):
}
},
)
tags: Optional[NDArray[Any, str]] = Field(
tags: VectorData[Optional[NDArray[Any, str]]] = Field(
None,
description="""User-defined tags that identify or categorize events.""",
json_schema_extra={
@@ -127,7 +136,12 @@ class TimeIntervals(DynamicTable):
None,
description="""Index for tags.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
timeseries: Optional[TimeIntervalsTimeseries] = Field(
@@ -137,17 +151,20 @@ class TimeIntervals(DynamicTable):
None,
description="""Index for timeseries.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
colnames: Optional[str] = Field(
None,
colnames: List[str] = Field(
...,
description="""The names of the columns in this table. This should be used to specify an order to the columns.""",
)
description: Optional[str] = Field(
None, description="""Description of what is in this dynamic table."""
)
id: NDArray[Shape["* num_rows"], int] = Field(
description: str = Field(..., description="""Description of what is in this dynamic table.""")
id: VectorData[NDArray[Shape["* num_rows"], int]] = Field(
...,
description="""Array of unique identifiers for the rows of this dynamic table.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_rows"}]}}},
@@ -173,21 +190,23 @@ class TimeIntervalsTimeseries(VectorData):
"linkml_meta": {"equals_string": "timeseries", "ifabsent": "string(timeseries)"}
},
)
idx_start: Optional[np.int32] = Field(
idx_start: Optional[NDArray[Shape["*"], int]] = Field(
None,
description="""Start index into the TimeSeries 'data' and 'timestamp' datasets of the referenced TimeSeries. The first dimension of those arrays is always time.""",
json_schema_extra={"linkml_meta": {"array": {"exact_number_dimensions": 1}}},
)
count: Optional[np.int32] = Field(
count: Optional[NDArray[Shape["*"], int]] = Field(
None,
description="""Number of data samples available in this time series, during this epoch.""",
json_schema_extra={"linkml_meta": {"array": {"exact_number_dimensions": 1}}},
)
timeseries: Optional[TimeSeries] = Field(
None, description="""the TimeSeries that this index applies to."""
timeseries: Optional[NDArray[Shape["*"], TimeSeries]] = Field(
None,
description="""the TimeSeries that this index applies to.""",
json_schema_extra={"linkml_meta": {"array": {"exact_number_dimensions": 1}}},
)
description: Optional[str] = Field(
None, description="""Description of what these vectors represent."""
)
array: Optional[
description: str = Field(..., description="""Description of what these vectors represent.""")
value: Optional[
Union[
NDArray[Shape["* dim0"], Any],
NDArray[Shape["* dim0, * dim1"], Any],


@ -7,7 +7,6 @@ import sys
from typing import Any, ClassVar, List, Literal, Dict, Optional, Union
from pydantic import BaseModel, ConfigDict, Field, RootModel, field_validator
import numpy as np
from ...core.v2_2_2.core_nwb_epoch import TimeIntervals
from ...core.v2_2_2.core_nwb_misc import Units
from ...core.v2_2_2.core_nwb_device import Device
from ...core.v2_2_2.core_nwb_ogen import OptogeneticStimulusSite
@ -22,6 +21,7 @@ from ...core.v2_2_2.core_nwb_ecephys import ElectrodeGroup
from numpydantic import NDArray, Shape
from ...hdmf_common.v1_1_3.hdmf_common_table import DynamicTable, VectorData, VectorIndex
from ...core.v2_2_2.core_nwb_icephys import IntracellularElectrode, SweepTable
from ...core.v2_2_2.core_nwb_epoch import TimeIntervals
metamodel_version = "None"
version = "2.2.2"
@ -41,6 +41,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
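The `__getitem__` added above forwards indexing to whichever of `value` or `data` is populated, raising `KeyError` when neither exists. A minimal dependency-free sketch of that delegation (class name hypothetical, pydantic omitted):

```python
class IndexDelegator:
    """Mimics ConfiguredBaseModel.__getitem__: index `value` first,
    fall back to `data`, else raise KeyError."""

    def __init__(self, value=None, data=None):
        self.value = value
        self.data = data

    def __getitem__(self, val):
        if getattr(self, "value", None) is not None:
            return self.value[val]
        elif getattr(self, "data", None) is not None:
            return self.data[val]
        raise KeyError("No value or data field to index from")
```

Both integer and slice indices pass straight through to the underlying container, so `IndexDelegator(data=[1, 2, 3])[1:]` behaves exactly like `[1, 2, 3][1:]`.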
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@ -98,11 +107,12 @@ class NWBFile(NWBContainer):
"root",
json_schema_extra={"linkml_meta": {"equals_string": "root", "ifabsent": "string(root)"}},
)
nwb_version: Optional[str] = Field(
None,
nwb_version: Literal["2.2.2"] = Field(
"2.2.2",
description="""File version string. Use semantic versioning, e.g. 1.2.1. This will be the name of the format with trailing major, minor and patch numbers.""",
json_schema_extra={"linkml_meta": {"equals_string": "2.2.2", "ifabsent": "string(2.2.2)"}},
)
file_create_date: NDArray[Shape["* num_modifications"], np.datetime64] = Field(
file_create_date: NDArray[Shape["* num_modifications"], datetime] = Field(
...,
description="""A record of the date the file was created and of subsequent modifications. The date is stored in UTC with local timezone offset as ISO 8601 extended formatted strings: 2018-09-28T14:43:54.123+02:00. Dates stored in UTC end in \"Z\" with no timezone offset. Date accuracy is up to milliseconds. The file can be created after the experiment was run, so this may differ from the experiment start time. Each modification to the nwb file adds a new entry to the array.""",
json_schema_extra={
@ -116,11 +126,11 @@ class NWBFile(NWBContainer):
session_description: str = Field(
..., description="""A description of the experimental session and data in the file."""
)
session_start_time: np.datetime64 = Field(
session_start_time: datetime = Field(
...,
description="""Date and time of the experiment/session start. The date is stored in UTC with local timezone offset as ISO 8601 extended formatted string: 2018-09-28T14:43:54.123+02:00. Dates stored in UTC end in \"Z\" with no timezone offset. Date accuracy is up to milliseconds.""",
)
timestamps_reference_time: np.datetime64 = Field(
timestamps_reference_time: datetime = Field(
...,
description="""Date and time corresponding to time zero of all timestamps. The date is stored in UTC with local timezone offset as ISO 8601 extended formatted string: 2018-09-28T14:43:54.123+02:00. Dates stored in UTC end in \"Z\" with no timezone offset. Date accuracy is up to milliseconds. All times stored in the file use this time as reference (i.e., time zero).""",
)
@ -158,19 +168,9 @@ class NWBFile(NWBContainer):
...,
description="""Experimental metadata, including protocol, notes and description of hardware device(s). The metadata stored in this section should be used to describe the experiment. Metadata necessary for interpreting the data is stored with the data. General experimental metadata, including animal strain, experimental protocols, experimenter, devices, etc, are stored under 'general'. Core metadata (e.g., that required to interpret data fields) is stored with the data itself, and implicitly defined by the file specification (e.g., time is in seconds). The strategy used here for storing non-core metadata is to use free-form text fields, such as would appear in sentences or paragraphs from a Methods section. Metadata fields are text to enable them to be more general, for example to represent ranges instead of numerical values. Machine-readable metadata is stored as attributes to these free-form datasets. All entries in the below table are to be included when data is present. Unused groups (e.g., intracellular_ephys in an optophysiology experiment) should not be created unless there is data to store within them.""",
)
intervals: Optional[List[TimeIntervals]] = Field(
intervals: Optional[NWBFileIntervals] = Field(
None,
description="""Experimental intervals, whether that be logically distinct sub-experiments having a particular scientific goal, trials (see trials subgroup) during an experiment, or epochs (see epochs subgroup) deriving from analysis of data.""",
json_schema_extra={
"linkml_meta": {
"any_of": [
{"range": "TimeIntervals"},
{"range": "TimeIntervals"},
{"range": "TimeIntervals"},
{"range": "TimeIntervals"},
]
}
},
)
units: Optional[Units] = Field(None, description="""Data about sorted spike units.""")
@ -256,7 +256,7 @@ class NWBFileGeneral(ConfiguredBaseModel):
None,
description="""Description of slices, including information about preparation thickness, orientation, temperature, and bath solution.""",
)
source_script: Optional[NWBFileGeneralSourceScript] = Field(
source_script: Optional[GeneralSourceScript] = Field(
None,
description="""Script file or link to public source code used to create this NWB file.""",
)
@ -284,10 +284,10 @@ class NWBFileGeneral(ConfiguredBaseModel):
None,
description="""Information about the animal or person from which the data was measured.""",
)
extracellular_ephys: Optional[NWBFileGeneralExtracellularEphys] = Field(
extracellular_ephys: Optional[GeneralExtracellularEphys] = Field(
None, description="""Metadata related to extracellular electrophysiology."""
)
intracellular_ephys: Optional[NWBFileGeneralIntracellularEphys] = Field(
intracellular_ephys: Optional[GeneralIntracellularEphys] = Field(
None, description="""Metadata related to intracellular electrophysiology."""
)
optogenetics: Optional[List[OptogeneticStimulusSite]] = Field(
@ -302,7 +302,7 @@ class NWBFileGeneral(ConfiguredBaseModel):
)
class NWBFileGeneralSourceScript(ConfiguredBaseModel):
class GeneralSourceScript(ConfiguredBaseModel):
"""
Script file or link to public source code used to create this NWB file.
"""
@ -315,7 +315,7 @@ class NWBFileGeneralSourceScript(ConfiguredBaseModel):
"linkml_meta": {"equals_string": "source_script", "ifabsent": "string(source_script)"}
},
)
file_name: Optional[str] = Field(None, description="""Name of script file.""")
file_name: str = Field(..., description="""Name of script file.""")
value: str = Field(...)
@ -335,7 +335,7 @@ class Subject(NWBContainer):
age: Optional[str] = Field(
None, description="""Age of subject. Can be supplied instead of 'date_of_birth'."""
)
date_of_birth: Optional[np.datetime64] = Field(
date_of_birth: Optional[datetime] = Field(
None, description="""Date of birth of subject. Can be supplied instead of 'age'."""
)
description: Optional[str] = Field(
@ -357,7 +357,7 @@ class Subject(NWBContainer):
)
class NWBFileGeneralExtracellularEphys(ConfiguredBaseModel):
class GeneralExtracellularEphys(ConfiguredBaseModel):
"""
Metadata related to extracellular electrophysiology.
"""
@ -376,12 +376,12 @@ class NWBFileGeneralExtracellularEphys(ConfiguredBaseModel):
electrode_group: Optional[List[ElectrodeGroup]] = Field(
None, description="""Physical group of electrodes."""
)
electrodes: Optional[NWBFileGeneralExtracellularEphysElectrodes] = Field(
electrodes: Optional[ExtracellularEphysElectrodes] = Field(
None, description="""A table of all electrodes (i.e. channels) used for recording."""
)
class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
class ExtracellularEphysElectrodes(DynamicTable):
"""
A table of all electrodes (i.e. channels) used for recording.
"""
@ -394,7 +394,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
"linkml_meta": {"equals_string": "electrodes", "ifabsent": "string(electrodes)"}
},
)
x: NDArray[Any, np.float32] = Field(
x: VectorData[NDArray[Any, float]] = Field(
...,
description="""x coordinate of the channel location in the brain (+x is posterior).""",
json_schema_extra={
@ -403,7 +403,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
}
},
)
y: NDArray[Any, np.float32] = Field(
y: VectorData[NDArray[Any, float]] = Field(
...,
description="""y coordinate of the channel location in the brain (+y is inferior).""",
json_schema_extra={
@ -412,7 +412,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
}
},
)
z: NDArray[Any, np.float32] = Field(
z: VectorData[NDArray[Any, float]] = Field(
...,
description="""z coordinate of the channel location in the brain (+z is right).""",
json_schema_extra={
@ -421,7 +421,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
}
},
)
imp: NDArray[Any, np.float32] = Field(
imp: VectorData[NDArray[Any, float]] = Field(
...,
description="""Impedance of the channel.""",
json_schema_extra={
@ -430,7 +430,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
}
},
)
location: NDArray[Any, str] = Field(
location: VectorData[NDArray[Any, str]] = Field(
...,
description="""Location of the electrode (channel). Specify the area, layer, comments on estimation of area/layer, stereotaxic coordinates if in vivo, etc. Use standard atlas names for anatomical regions when possible.""",
json_schema_extra={
@ -439,7 +439,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
}
},
)
filtering: NDArray[Any, np.float32] = Field(
filtering: VectorData[NDArray[Any, float]] = Field(
...,
description="""Description of hardware filtering.""",
json_schema_extra={
@ -451,7 +451,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
group: List[ElectrodeGroup] = Field(
..., description="""Reference to the ElectrodeGroup this electrode is a part of."""
)
group_name: NDArray[Any, str] = Field(
group_name: VectorData[NDArray[Any, str]] = Field(
...,
description="""Name of the ElectrodeGroup this electrode is a part of.""",
json_schema_extra={
@ -460,7 +460,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
}
},
)
rel_x: Optional[NDArray[Any, np.float32]] = Field(
rel_x: VectorData[Optional[NDArray[Any, float]]] = Field(
None,
description="""x coordinate in electrode group""",
json_schema_extra={
@ -469,7 +469,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
}
},
)
rel_y: Optional[NDArray[Any, np.float32]] = Field(
rel_y: VectorData[Optional[NDArray[Any, float]]] = Field(
None,
description="""y coordinate in electrode group""",
json_schema_extra={
@ -478,7 +478,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
}
},
)
rel_z: Optional[NDArray[Any, np.float32]] = Field(
rel_z: VectorData[Optional[NDArray[Any, float]]] = Field(
None,
description="""z coordinate in electrode group""",
json_schema_extra={
@ -487,7 +487,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
}
},
)
reference: Optional[NDArray[Any, str]] = Field(
reference: VectorData[Optional[NDArray[Any, str]]] = Field(
None,
description="""Description of the reference used for this electrode.""",
json_schema_extra={
@ -496,14 +496,12 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
}
},
)
colnames: Optional[str] = Field(
None,
colnames: List[str] = Field(
...,
description="""The names of the columns in this table. This should be used to specify an order to the columns.""",
)
description: Optional[str] = Field(
None, description="""Description of what is in this dynamic table."""
)
id: NDArray[Shape["* num_rows"], int] = Field(
description: str = Field(..., description="""Description of what is in this dynamic table.""")
id: VectorData[NDArray[Shape["* num_rows"], int]] = Field(
...,
description="""Array of unique identifiers for the rows of this dynamic table.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_rows"}]}}},
@ -516,7 +514,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
)
class NWBFileGeneralIntracellularEphys(ConfiguredBaseModel):
class GeneralIntracellularEphys(ConfiguredBaseModel):
"""
Metadata related to intracellular electrophysiology.
"""
@ -544,13 +542,43 @@ class NWBFileGeneralIntracellularEphys(ConfiguredBaseModel):
)
class NWBFileIntervals(ConfiguredBaseModel):
"""
Experimental intervals, whether that be logically distinct sub-experiments having a particular scientific goal, trials (see trials subgroup) during an experiment, or epochs (see epochs subgroup) deriving from analysis of data.
"""
linkml_meta: ClassVar[LinkMLMeta] = LinkMLMeta({"from_schema": "core.nwb.file"})
name: Literal["intervals"] = Field(
"intervals",
json_schema_extra={
"linkml_meta": {"equals_string": "intervals", "ifabsent": "string(intervals)"}
},
)
epochs: Optional[TimeIntervals] = Field(
None,
description="""Divisions in time marking experimental stages or sub-divisions of a single recording session.""",
)
trials: Optional[TimeIntervals] = Field(
None, description="""Repeated experimental events that have a logical grouping."""
)
invalid_times: Optional[TimeIntervals] = Field(
None, description="""Time intervals that should be removed from analysis."""
)
time_intervals: Optional[List[TimeIntervals]] = Field(
None,
description="""Optional additional table(s) for describing other experimental time intervals.""",
)
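The new `NWBFileIntervals` container above replaces the flat `Optional[List[TimeIntervals]]` slot with named sub-slots plus a catch-all list. The shape of that grouping in a dependency-free sketch (`TimeIntervals` is stubbed here; the real generated model carries pydantic validation):

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class TimeIntervals:  # stub standing in for the generated model
    name: str


@dataclass
class NWBFileIntervals:
    """Named interval slots plus an overflow list, mirroring the generated class."""

    epochs: Optional[TimeIntervals] = None
    trials: Optional[TimeIntervals] = None
    invalid_times: Optional[TimeIntervals] = None
    time_intervals: Optional[List[TimeIntervals]] = None
```

Consumers can now address `nwbfile.intervals.trials` directly instead of scanning a list for a table named "trials".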
# Model rebuild
# see https://pydantic-docs.helpmanual.io/usage/models/#rebuilding-a-model
NWBFile.model_rebuild()
NWBFileStimulus.model_rebuild()
NWBFileGeneral.model_rebuild()
NWBFileGeneralSourceScript.model_rebuild()
GeneralSourceScript.model_rebuild()
Subject.model_rebuild()
NWBFileGeneralExtracellularEphys.model_rebuild()
NWBFileGeneralExtracellularEphysElectrodes.model_rebuild()
NWBFileGeneralIntracellularEphys.model_rebuild()
GeneralExtracellularEphys.model_rebuild()
ExtracellularEphysElectrodes.model_rebuild()
GeneralIntracellularEphys.model_rebuild()
NWBFileIntervals.model_rebuild()
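Throughout the file above, plain `NDArray[...]` columns are now wrapped as `VectorData[NDArray[...]]`. A minimal sketch of such a generic, indexable column wrapper (hypothetical simplification; the real `VectorData` lives in `hdmf_common_table` and is a pydantic model):

```python
from typing import Generic, TypeVar

T = TypeVar("T")


class VectorData(Generic[T]):
    """Column wrapper parameterized by its array type; indexing passes through."""

    def __init__(self, value: T, description: str = ""):
        self.value = value
        self.description = description

    def __getitem__(self, idx):
        return self.value[idx]
```

Parameterizing by the array type keeps the dtype/shape constraint (`NDArray[Any, float]`, say) visible in the annotation while the wrapper adds table-column semantics.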


@ -11,6 +11,7 @@ from ...core.v2_2_2.core_nwb_base import (
TimeSeriesSync,
NWBContainer,
)
from ...core.v2_2_2.core_nwb_device import Device
from typing import Any, ClassVar, List, Literal, Dict, Optional, Union, Annotated, Type, TypeVar
from pydantic import (
BaseModel,
@ -42,6 +43,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@ -67,7 +77,7 @@ ModelType = TypeVar("ModelType", bound=Type[BaseModel])
def _get_name(item: ModelType | dict, info: ValidationInfo) -> Union[ModelType, dict]:
"""Get the name of the slot that refers to this object"""
assert isinstance(item, (BaseModel, dict))
assert isinstance(item, (BaseModel, dict)), f"{item} was not a BaseModel or a dict!"
name = info.field_name
if isinstance(item, BaseModel):
item.name = name
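The `_get_name` validator above stamps the name of the referring slot onto the child object during validation, so a field annotated `named` always knows what its parent called it. The core logic in a dependency-free sketch (the `Info` dataclass is a hypothetical stand-in for pydantic's `ValidationInfo`):

```python
from dataclasses import dataclass
from typing import Any, Union


@dataclass
class Info:
    """Stand-in for pydantic's ValidationInfo; only field_name is used."""

    field_name: str


def get_name(item: Union[Any, dict], info: Info):
    """Copy the referring slot's name onto the item, as _get_name does."""
    name = info.field_name
    if isinstance(item, dict):
        item["name"] = name
    else:
        item.name = name  # assumes the model exposes a mutable `name` attribute
    return item
```

In the generated models this runs as a `BeforeValidator`, so a bare dict passed for a `timeseries` slot arrives at model construction already carrying `name="timeseries"`.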
@ -106,32 +116,46 @@ class PatchClampSeries(TimeSeries):
)
name: str = Field(...)
stimulus_description: Optional[str] = Field(
None, description="""Protocol/stimulus name for this patch-clamp dataset."""
stimulus_description: str = Field(
..., description="""Protocol/stimulus name for this patch-clamp dataset."""
)
sweep_number: Optional[np.uint32] = Field(
sweep_number: Optional[int] = Field(
None, description="""Sweep number, allows to group different PatchClampSeries together."""
)
data: PatchClampSeriesData = Field(..., description="""Recorded voltage or current.""")
gain: Optional[np.float32] = Field(
gain: Optional[float] = Field(
None,
description="""Gain of the recording, in units Volt/Amp (v-clamp) or Volt/Volt (c-clamp).""",
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
electrode: Union[IntracellularElectrode, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "IntracellularElectrode"}, {"range": "string"}],
}
},
)
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@ -160,11 +184,11 @@ class PatchClampSeriesData(ConfiguredBaseModel):
"data",
json_schema_extra={"linkml_meta": {"equals_string": "data", "ifabsent": "string(data)"}},
)
unit: Optional[str] = Field(
None,
unit: str = Field(
...,
description="""Base unit of measurement for working with the data. Actual stored values are not necessarily stored in these units. To access the data in these units, multiply 'data' by 'conversion'.""",
)
array: Optional[NDArray[Shape["* num_times"], np.number]] = Field(
value: Optional[NDArray[Shape["* num_times"], float]] = Field(
None, json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}}
)
@ -180,36 +204,50 @@ class CurrentClampSeries(PatchClampSeries):
name: str = Field(...)
data: CurrentClampSeriesData = Field(..., description="""Recorded voltage.""")
bias_current: Optional[np.float32] = Field(None, description="""Bias current, in amps.""")
bridge_balance: Optional[np.float32] = Field(None, description="""Bridge balance, in ohms.""")
capacitance_compensation: Optional[np.float32] = Field(
bias_current: Optional[float] = Field(None, description="""Bias current, in amps.""")
bridge_balance: Optional[float] = Field(None, description="""Bridge balance, in ohms.""")
capacitance_compensation: Optional[float] = Field(
None, description="""Capacitance compensation, in farads."""
)
stimulus_description: Optional[str] = Field(
None, description="""Protocol/stimulus name for this patch-clamp dataset."""
stimulus_description: str = Field(
..., description="""Protocol/stimulus name for this patch-clamp dataset."""
)
sweep_number: Optional[np.uint32] = Field(
sweep_number: Optional[int] = Field(
None, description="""Sweep number, allows to group different PatchClampSeries together."""
)
gain: Optional[np.float32] = Field(
gain: Optional[float] = Field(
None,
description="""Gain of the recording, in units Volt/Amp (v-clamp) or Volt/Volt (c-clamp).""",
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
electrode: Union[IntracellularElectrode, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "IntracellularElectrode"}, {"range": "string"}],
}
},
)
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@ -238,9 +276,10 @@ class CurrentClampSeriesData(ConfiguredBaseModel):
"data",
json_schema_extra={"linkml_meta": {"equals_string": "data", "ifabsent": "string(data)"}},
)
unit: Optional[str] = Field(
None,
unit: Literal["volts"] = Field(
"volts",
description="""Base unit of measurement for working with the data. which is fixed to 'volts'. Actual stored values are not necessarily stored in these units. To access the data in these units, multiply 'data' by 'conversion'.""",
json_schema_extra={"linkml_meta": {"equals_string": "volts", "ifabsent": "string(volts)"}},
)
value: Any = Field(...)
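Pinning `unit` to a `Literal` with a matching default, as in the hunk above, moves the fixed-unit constraint into the type itself. A stdlib-only sketch of the same check (class name hypothetical; the generated model gets this for free from pydantic's `Literal` validation):

```python
from typing import Literal, get_args


class FixedUnitData:
    """Rejects any unit other than the fixed 'volts', mirroring a
    Literal["volts"] field with default "volts"."""

    UNIT = Literal["volts"]

    def __init__(self, unit: str = "volts"):
        allowed = get_args(self.UNIT)
        if unit not in allowed:
            raise ValueError(f"unit must be one of {allowed}, got {unit!r}")
        self.unit = unit
```

Compared with the old `Optional[str]`, an instance can no longer silently carry a wrong or missing unit: omitting the argument yields `"volts"`, and anything else fails validation.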
@ -255,39 +294,51 @@ class IZeroClampSeries(CurrentClampSeries):
)
name: str = Field(...)
bias_current: np.float32 = Field(..., description="""Bias current, in amps, fixed to 0.0.""")
bridge_balance: np.float32 = Field(
..., description="""Bridge balance, in ohms, fixed to 0.0."""
)
capacitance_compensation: np.float32 = Field(
bias_current: float = Field(..., description="""Bias current, in amps, fixed to 0.0.""")
bridge_balance: float = Field(..., description="""Bridge balance, in ohms, fixed to 0.0.""")
capacitance_compensation: float = Field(
..., description="""Capacitance compensation, in farads, fixed to 0.0."""
)
data: CurrentClampSeriesData = Field(..., description="""Recorded voltage.""")
stimulus_description: Optional[str] = Field(
None, description="""Protocol/stimulus name for this patch-clamp dataset."""
stimulus_description: str = Field(
..., description="""Protocol/stimulus name for this patch-clamp dataset."""
)
sweep_number: Optional[np.uint32] = Field(
sweep_number: Optional[int] = Field(
None, description="""Sweep number, allows to group different PatchClampSeries together."""
)
gain: Optional[np.float32] = Field(
gain: Optional[float] = Field(
None,
description="""Gain of the recording, in units Volt/Amp (v-clamp) or Volt/Volt (c-clamp).""",
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
electrode: Union[IntracellularElectrode, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "IntracellularElectrode"}, {"range": "string"}],
}
},
)
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@ -316,31 +367,45 @@ class CurrentClampStimulusSeries(PatchClampSeries):
name: str = Field(...)
data: CurrentClampStimulusSeriesData = Field(..., description="""Stimulus current applied.""")
stimulus_description: Optional[str] = Field(
None, description="""Protocol/stimulus name for this patch-clamp dataset."""
stimulus_description: str = Field(
..., description="""Protocol/stimulus name for this patch-clamp dataset."""
)
sweep_number: Optional[np.uint32] = Field(
sweep_number: Optional[int] = Field(
None, description="""Sweep number, allows to group different PatchClampSeries together."""
)
gain: Optional[np.float32] = Field(
gain: Optional[float] = Field(
None,
description="""Gain of the recording, in units Volt/Amp (v-clamp) or Volt/Volt (c-clamp).""",
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
electrode: Union[IntracellularElectrode, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "IntracellularElectrode"}, {"range": "string"}],
}
},
)
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@ -369,9 +434,12 @@ class CurrentClampStimulusSeriesData(ConfiguredBaseModel):
"data",
json_schema_extra={"linkml_meta": {"equals_string": "data", "ifabsent": "string(data)"}},
)
unit: Optional[str] = Field(
None,
unit: Literal["amperes"] = Field(
"amperes",
description="""Base unit of measurement for working with the data. which is fixed to 'amperes'. Actual stored values are not necessarily stored in these units. To access the data in these units, multiply 'data' by 'conversion'.""",
json_schema_extra={
"linkml_meta": {"equals_string": "amperes", "ifabsent": "string(amperes)"}
},
)
value: Any = Field(...)
@ -408,31 +476,45 @@ class VoltageClampSeries(PatchClampSeries):
whole_cell_series_resistance_comp: Optional[VoltageClampSeriesWholeCellSeriesResistanceComp] = (
Field(None, description="""Whole cell series resistance compensation, in ohms.""")
)
stimulus_description: Optional[str] = Field(
None, description="""Protocol/stimulus name for this patch-clamp dataset."""
stimulus_description: str = Field(
..., description="""Protocol/stimulus name for this patch-clamp dataset."""
)
sweep_number: Optional[np.uint32] = Field(
sweep_number: Optional[int] = Field(
None, description="""Sweep number, allows to group different PatchClampSeries together."""
)
gain: Optional[np.float32] = Field(
gain: Optional[float] = Field(
None,
description="""Gain of the recording, in units Volt/Amp (v-clamp) or Volt/Volt (c-clamp).""",
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
electrode: Union[IntracellularElectrode, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "IntracellularElectrode"}, {"range": "string"}],
}
},
)
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -461,9 +543,12 @@ class VoltageClampSeriesData(ConfiguredBaseModel):
"data",
json_schema_extra={"linkml_meta": {"equals_string": "data", "ifabsent": "string(data)"}},
)
unit: Optional[str] = Field(
None,
unit: Literal["amperes"] = Field(
"amperes",
description="""Base unit of measurement for working with the data. which is fixed to 'amperes'. Actual stored values are not necessarily stored in these units. To access the data in these units, multiply 'data' by 'conversion'.""",
json_schema_extra={
"linkml_meta": {"equals_string": "amperes", "ifabsent": "string(amperes)"}
},
)
value: Any = Field(...)
@@ -484,11 +569,14 @@ class VoltageClampSeriesCapacitanceFast(ConfiguredBaseModel):
}
},
)
unit: Optional[str] = Field(
None,
unit: Literal["farads"] = Field(
"farads",
description="""Unit of measurement for capacitance_fast, which is fixed to 'farads'.""",
json_schema_extra={
"linkml_meta": {"equals_string": "farads", "ifabsent": "string(farads)"}
},
)
value: np.float32 = Field(...)
value: float = Field(...)
class VoltageClampSeriesCapacitanceSlow(ConfiguredBaseModel):
@@ -507,11 +595,14 @@ class VoltageClampSeriesCapacitanceSlow(ConfiguredBaseModel):
}
},
)
unit: Optional[str] = Field(
None,
unit: Literal["farads"] = Field(
"farads",
description="""Unit of measurement for capacitance_fast, which is fixed to 'farads'.""",
json_schema_extra={
"linkml_meta": {"equals_string": "farads", "ifabsent": "string(farads)"}
},
)
value: np.float32 = Field(...)
value: float = Field(...)
class VoltageClampSeriesResistanceCompBandwidth(ConfiguredBaseModel):
@@ -530,11 +621,12 @@ class VoltageClampSeriesResistanceCompBandwidth(ConfiguredBaseModel):
}
},
)
unit: Optional[str] = Field(
None,
unit: Literal["hertz"] = Field(
"hertz",
description="""Unit of measurement for resistance_comp_bandwidth, which is fixed to 'hertz'.""",
json_schema_extra={"linkml_meta": {"equals_string": "hertz", "ifabsent": "string(hertz)"}},
)
value: np.float32 = Field(...)
value: float = Field(...)
class VoltageClampSeriesResistanceCompCorrection(ConfiguredBaseModel):
@@ -553,11 +645,14 @@ class VoltageClampSeriesResistanceCompCorrection(ConfiguredBaseModel):
}
},
)
unit: Optional[str] = Field(
None,
unit: Literal["percent"] = Field(
"percent",
description="""Unit of measurement for resistance_comp_correction, which is fixed to 'percent'.""",
json_schema_extra={
"linkml_meta": {"equals_string": "percent", "ifabsent": "string(percent)"}
},
)
value: np.float32 = Field(...)
value: float = Field(...)
class VoltageClampSeriesResistanceCompPrediction(ConfiguredBaseModel):
@@ -576,11 +671,14 @@ class VoltageClampSeriesResistanceCompPrediction(ConfiguredBaseModel):
}
},
)
unit: Optional[str] = Field(
None,
unit: Literal["percent"] = Field(
"percent",
description="""Unit of measurement for resistance_comp_prediction, which is fixed to 'percent'.""",
json_schema_extra={
"linkml_meta": {"equals_string": "percent", "ifabsent": "string(percent)"}
},
)
value: np.float32 = Field(...)
value: float = Field(...)
class VoltageClampSeriesWholeCellCapacitanceComp(ConfiguredBaseModel):
@@ -599,11 +697,14 @@ class VoltageClampSeriesWholeCellCapacitanceComp(ConfiguredBaseModel):
}
},
)
unit: Optional[str] = Field(
None,
unit: Literal["farads"] = Field(
"farads",
description="""Unit of measurement for whole_cell_capacitance_comp, which is fixed to 'farads'.""",
json_schema_extra={
"linkml_meta": {"equals_string": "farads", "ifabsent": "string(farads)"}
},
)
value: np.float32 = Field(...)
value: float = Field(...)
class VoltageClampSeriesWholeCellSeriesResistanceComp(ConfiguredBaseModel):
@@ -622,11 +723,12 @@ class VoltageClampSeriesWholeCellSeriesResistanceComp(ConfiguredBaseModel):
}
},
)
unit: Optional[str] = Field(
None,
unit: Literal["ohms"] = Field(
"ohms",
description="""Unit of measurement for whole_cell_series_resistance_comp, which is fixed to 'ohms'.""",
json_schema_extra={"linkml_meta": {"equals_string": "ohms", "ifabsent": "string(ohms)"}},
)
value: np.float32 = Field(...)
value: float = Field(...)
class VoltageClampStimulusSeries(PatchClampSeries):
@@ -640,31 +742,45 @@ class VoltageClampStimulusSeries(PatchClampSeries):
name: str = Field(...)
data: VoltageClampStimulusSeriesData = Field(..., description="""Stimulus voltage applied.""")
stimulus_description: Optional[str] = Field(
None, description="""Protocol/stimulus name for this patch-clamp dataset."""
stimulus_description: str = Field(
..., description="""Protocol/stimulus name for this patch-clamp dataset."""
)
sweep_number: Optional[np.uint32] = Field(
sweep_number: Optional[int] = Field(
None, description="""Sweep number, allows to group different PatchClampSeries together."""
)
gain: Optional[np.float32] = Field(
gain: Optional[float] = Field(
None,
description="""Gain of the recording, in units Volt/Amp (v-clamp) or Volt/Volt (c-clamp).""",
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
electrode: Union[IntracellularElectrode, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "IntracellularElectrode"}, {"range": "string"}],
}
},
)
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -693,9 +809,10 @@ class VoltageClampStimulusSeriesData(ConfiguredBaseModel):
"data",
json_schema_extra={"linkml_meta": {"equals_string": "data", "ifabsent": "string(data)"}},
)
unit: Optional[str] = Field(
None,
unit: Literal["volts"] = Field(
"volts",
description="""Base unit of measurement for working with the data. which is fixed to 'volts'. Actual stored values are not necessarily stored in these units. To access the data in these units, multiply 'data' by 'conversion'.""",
json_schema_extra={"linkml_meta": {"equals_string": "volts", "ifabsent": "string(volts)"}},
)
value: Any = Field(...)
@@ -726,6 +843,15 @@ class IntracellularElectrode(NWBContainer):
slice: Optional[str] = Field(
None, description="""Information about slice used for recording."""
)
device: Union[Device, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "Device"}, {"range": "string"}],
}
},
)
class SweepTable(DynamicTable):
@@ -738,7 +864,7 @@ class SweepTable(DynamicTable):
)
name: str = Field(...)
sweep_number: NDArray[Any, np.uint32] = Field(
sweep_number: VectorData[NDArray[Any, int]] = Field(
...,
description="""Sweep number of the PatchClampSeries in that row.""",
json_schema_extra={
@@ -754,17 +880,20 @@
...,
description="""Index for series.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
colnames: Optional[str] = Field(
None,
colnames: List[str] = Field(
...,
description="""The names of the columns in this table. This should be used to specify an order to the columns.""",
)
description: Optional[str] = Field(
None, description="""Description of what is in this dynamic table."""
)
id: NDArray[Shape["* num_rows"], int] = Field(
description: str = Field(..., description="""Description of what is in this dynamic table.""")
id: VectorData[NDArray[Shape["* num_rows"], int]] = Field(
...,
description="""Array of unique identifiers for the rows of this dynamic table.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_rows"}]}}},


@@ -28,6 +28,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@@ -71,15 +80,15 @@ class GrayscaleImage(Image):
)
name: str = Field(...)
resolution: Optional[np.float32] = Field(
resolution: Optional[float] = Field(
None, description="""Pixel resolution of the image, in pixels per centimeter."""
)
description: Optional[str] = Field(None, description="""Description of the image.""")
array: Optional[
value: Optional[
Union[
NDArray[Shape["* x, * y"], np.number],
NDArray[Shape["* x, * y, 3 r_g_b"], np.number],
NDArray[Shape["* x, * y, 4 r_g_b_a"], np.number],
NDArray[Shape["* x, * y"], float],
NDArray[Shape["* x, * y, 3 r_g_b"], float],
NDArray[Shape["* x, * y, 4 r_g_b_a"], float],
]
] = Field(None)
@@ -94,15 +103,15 @@ class RGBImage(Image):
)
name: str = Field(...)
resolution: Optional[np.float32] = Field(
resolution: Optional[float] = Field(
None, description="""Pixel resolution of the image, in pixels per centimeter."""
)
description: Optional[str] = Field(None, description="""Description of the image.""")
array: Optional[
value: Optional[
Union[
NDArray[Shape["* x, * y"], np.number],
NDArray[Shape["* x, * y, 3 r_g_b"], np.number],
NDArray[Shape["* x, * y, 4 r_g_b_a"], np.number],
NDArray[Shape["* x, * y"], float],
NDArray[Shape["* x, * y, 3 r_g_b"], float],
NDArray[Shape["* x, * y, 4 r_g_b_a"], float],
]
] = Field(None)
@@ -117,15 +126,15 @@ class RGBAImage(Image):
)
name: str = Field(...)
resolution: Optional[np.float32] = Field(
resolution: Optional[float] = Field(
None, description="""Pixel resolution of the image, in pixels per centimeter."""
)
description: Optional[str] = Field(None, description="""Description of the image.""")
array: Optional[
value: Optional[
Union[
NDArray[Shape["* x, * y"], np.number],
NDArray[Shape["* x, * y, 3 r_g_b"], np.number],
NDArray[Shape["* x, * y, 4 r_g_b_a"], np.number],
NDArray[Shape["* x, * y"], float],
NDArray[Shape["* x, * y, 3 r_g_b"], float],
NDArray[Shape["* x, * y, 4 r_g_b_a"], float],
]
] = Field(None)
@@ -142,11 +151,11 @@ class ImageSeries(TimeSeries):
name: str = Field(...)
data: Optional[
Union[
NDArray[Shape["* frame, * x, * y"], np.number],
NDArray[Shape["* frame, * x, * y, * z"], np.number],
NDArray[Shape["* frame, * x, * y"], float],
NDArray[Shape["* frame, * x, * y, * z"], float],
]
] = Field(None, description="""Binary data representing images across frames.""")
dimension: Optional[NDArray[Shape["* rank"], np.int32]] = Field(
dimension: Optional[NDArray[Shape["* rank"], int]] = Field(
None,
description="""Number of pixels on x, y, (and z) axes.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "rank"}]}}},
@@ -159,21 +168,26 @@ class ImageSeries(TimeSeries):
None,
description="""Format of image. If this is 'external', then the attribute 'external_file' contains the path information to the image files. If this is 'raw', then the raw (single-channel) binary data is stored in the 'data' dataset. If this attribute is not present, then the default format='raw' case is assumed.""",
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -204,11 +218,11 @@ class ImageSeriesExternalFile(ConfiguredBaseModel):
"linkml_meta": {"equals_string": "external_file", "ifabsent": "string(external_file)"}
},
)
starting_frame: Optional[np.int32] = Field(
None,
starting_frame: List[int] = Field(
...,
description="""Each external image may contain one or more consecutive frames of the full ImageSeries. This attribute serves as an index to indicate which frames each file contains, to faciliate random access. The 'starting_frame' attribute, hence, contains a list of frame numbers within the full ImageSeries of the first frame of each file listed in the parent 'external_file' dataset. Zero-based indexing is used (hence, the first element will always be zero). For example, if the 'external_file' dataset has three paths to files and the first file has 5 frames, the second file has 10 frames, and the third file has 20 frames, then this attribute will have values [0, 5, 15]. If there is a single external file that holds all of the frames of the ImageSeries (and so there is a single element in the 'external_file' dataset), then this attribute should have value [0].""",
)
array: Optional[NDArray[Shape["* num_files"], str]] = Field(
value: Optional[NDArray[Shape["* num_files"], str]] = Field(
None, json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_files"}]}}}
)
@@ -223,13 +237,22 @@ class ImageMaskSeries(ImageSeries):
)
name: str = Field(...)
masked_imageseries: Union[ImageSeries, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "ImageSeries"}, {"range": "string"}],
}
},
)
data: Optional[
Union[
NDArray[Shape["* frame, * x, * y"], np.number],
NDArray[Shape["* frame, * x, * y, * z"], np.number],
NDArray[Shape["* frame, * x, * y"], float],
NDArray[Shape["* frame, * x, * y, * z"], float],
]
] = Field(None, description="""Binary data representing images across frames.""")
dimension: Optional[NDArray[Shape["* rank"], np.int32]] = Field(
dimension: Optional[NDArray[Shape["* rank"], int]] = Field(
None,
description="""Number of pixels on x, y, (and z) axes.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "rank"}]}}},
@@ -242,21 +265,26 @@ class ImageMaskSeries(ImageSeries):
None,
description="""Format of image. If this is 'external', then the attribute 'external_file' contains the path information to the image files. If this is 'raw', then the raw (single-channel) binary data is stored in the 'data' dataset. If this attribute is not present, then the default format='raw' case is assumed.""",
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -284,24 +312,23 @@ class OpticalSeries(ImageSeries):
)
name: str = Field(...)
distance: Optional[np.float32] = Field(
distance: Optional[float] = Field(
None, description="""Distance from camera/monitor to target/eye."""
)
field_of_view: Optional[
Union[
NDArray[Shape["2 width_height"], np.float32],
NDArray[Shape["3 width_height_depth"], np.float32],
NDArray[Shape["2 width_height"], float], NDArray[Shape["3 width_height_depth"], float]
]
] = Field(None, description="""Width, height and depth of image, or imaged area, in meters.""")
data: Union[
NDArray[Shape["* frame, * x, * y"], np.number],
NDArray[Shape["* frame, * x, * y, 3 r_g_b"], np.number],
NDArray[Shape["* frame, * x, * y"], float],
NDArray[Shape["* frame, * x, * y, 3 r_g_b"], float],
] = Field(..., description="""Images presented to subject, either grayscale or RGB""")
orientation: Optional[str] = Field(
None,
description="""Description of image relative to some reference frame (e.g., which way is up). Must also specify frame of reference.""",
)
dimension: Optional[NDArray[Shape["* rank"], np.int32]] = Field(
dimension: Optional[NDArray[Shape["* rank"], int]] = Field(
None,
description="""Number of pixels on x, y, (and z) axes.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "rank"}]}}},
@@ -314,21 +341,26 @@ class OpticalSeries(ImageSeries):
None,
description="""Format of image. If this is 'external', then the attribute 'external_file' contains the path information to the image files. If this is 'raw', then the raw (single-channel) binary data is stored in the 'data' dataset. If this attribute is not present, then the default format='raw' case is assumed.""",
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -356,26 +388,40 @@ class IndexSeries(TimeSeries):
)
name: str = Field(...)
data: NDArray[Shape["* num_times"], np.int32] = Field(
data: NDArray[Shape["* num_times"], int] = Field(
...,
description="""Index of the frame in the referenced ImageSeries.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
indexed_timeseries: Union[ImageSeries, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "ImageSeries"}, {"range": "string"}],
}
},
)
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},


@@ -43,6 +43,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@@ -68,7 +77,7 @@ ModelType = TypeVar("ModelType", bound=Type[BaseModel])
def _get_name(item: ModelType | dict, info: ValidationInfo) -> Union[ModelType, dict]:
"""Get the name of the slot that refers to this object"""
assert isinstance(item, (BaseModel, dict))
assert isinstance(item, (BaseModel, dict)), f"{item} was not a BaseModel or a dict!"
name = info.field_name
if isinstance(item, BaseModel):
item.name = name
@@ -120,21 +129,26 @@ class AbstractFeatureSeries(TimeSeries):
description="""Description of the features represented in TimeSeries::data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_features"}]}}},
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -164,13 +178,14 @@ class AbstractFeatureSeriesData(ConfiguredBaseModel):
json_schema_extra={"linkml_meta": {"equals_string": "data", "ifabsent": "string(data)"}},
)
unit: Optional[str] = Field(
None,
"see 'feature_units'",
description="""Since there can be different units for different features, store the units in 'feature_units'. The default value for this attribute is \"see 'feature_units'\".""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(see 'feature_units')"}},
)
array: Optional[
value: Optional[
Union[
NDArray[Shape["* num_times"], np.number],
NDArray[Shape["* num_times, * num_features"], np.number],
NDArray[Shape["* num_times"], float],
NDArray[Shape["* num_times, * num_features"], float],
]
] = Field(None)
@@ -190,21 +205,26 @@ class AnnotationSeries(TimeSeries):
description="""Annotations made during an experiment.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -232,26 +252,31 @@ class IntervalSeries(TimeSeries):
)
name: str = Field(...)
data: NDArray[Shape["* num_times"], np.int8] = Field(
data: NDArray[Shape["* num_times"], int] = Field(
...,
description="""Use values >0 if interval started, <0 if interval ended.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -287,21 +312,35 @@ class DecompositionSeries(TimeSeries):
...,
description="""Table for describing the bands that this series was generated from. There should be one row in this table for each band.""",
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
comments: Optional[str] = Field(
source_timeseries: Optional[Union[TimeSeries, str]] = Field(
None,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "TimeSeries"}, {"range": "string"}],
}
},
)
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -330,11 +369,12 @@ class DecompositionSeriesData(ConfiguredBaseModel):
"data",
json_schema_extra={"linkml_meta": {"equals_string": "data", "ifabsent": "string(data)"}},
)
unit: Optional[str] = Field(
None,
unit: str = Field(
"no unit",
description="""Base unit of measurement for working with the data. Actual stored values are not necessarily stored in these units. To access the data in these units, multiply 'data' by 'conversion'.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no unit)"}},
)
array: Optional[NDArray[Shape["* num_times, * num_channels, * num_bands"], np.number]] = Field(
value: Optional[NDArray[Shape["* num_times, * num_channels, * num_bands"], float]] = Field(
None,
json_schema_extra={
"linkml_meta": {
@@ -361,7 +401,7 @@ class DecompositionSeriesBands(DynamicTable):
"bands",
json_schema_extra={"linkml_meta": {"equals_string": "bands", "ifabsent": "string(bands)"}},
)
band_name: NDArray[Any, str] = Field(
band_name: VectorData[NDArray[Any, str]] = Field(
...,
description="""Name of the band, e.g. theta.""",
json_schema_extra={
@@ -370,7 +410,7 @@ class DecompositionSeriesBands(DynamicTable):
}
},
)
band_limits: NDArray[Shape["* num_bands, 2 low_high"], np.float32] = Field(
band_limits: VectorData[NDArray[Shape["* num_bands, 2 low_high"], float]] = Field(
...,
description="""Low and high limit of each band in Hz. If it is a Gaussian filter, use 2 SD on either side of the center.""",
json_schema_extra={
@@ -384,24 +424,22 @@ class DecompositionSeriesBands(DynamicTable):
}
},
)
band_mean: NDArray[Shape["* num_bands"], np.float32] = Field(
band_mean: VectorData[NDArray[Shape["* num_bands"], float]] = Field(
...,
description="""The mean Gaussian filters, in Hz.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_bands"}]}}},
)
band_stdev: NDArray[Shape["* num_bands"], np.float32] = Field(
band_stdev: VectorData[NDArray[Shape["* num_bands"], float]] = Field(
...,
description="""The standard deviation of Gaussian filters, in Hz.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_bands"}]}}},
)
colnames: Optional[str] = Field(
None,
colnames: List[str] = Field(
...,
description="""The names of the columns in this table. This should be used to specify an order to the columns.""",
)
description: Optional[str] = Field(
None, description="""Description of what is in this dynamic table."""
)
id: NDArray[Shape["* num_rows"], int] = Field(
description: str = Field(..., description="""Description of what is in this dynamic table.""")
id: VectorData[NDArray[Shape["* num_rows"], int]] = Field(
...,
description="""Array of unique identifiers for the rows of this dynamic table.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_rows"}]}}},
@@ -428,7 +466,12 @@ class Units(DynamicTable):
None,
description="""Index into the spike_times dataset.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
spike_times: Optional[UnitsSpikeTimes] = Field(
@@ -438,10 +481,16 @@ class Units(DynamicTable):
None,
description="""Index into the obs_intervals dataset.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
obs_intervals: Optional[NDArray[Shape["* num_intervals, 2 start_end"], np.float64]] = Field(
obs_intervals: VectorData[Optional[NDArray[Shape["* num_intervals, 2 start_end"], float]]] = (
Field(
None,
description="""Observation intervals for each unit.""",
json_schema_extra={
@@ -455,43 +504,56 @@ class Units(DynamicTable):
}
},
)
)
electrodes_index: Named[Optional[VectorIndex]] = Field(
None,
description="""Index into electrodes.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
electrodes: Named[Optional[DynamicTableRegion]] = Field(
None,
description="""Electrode that each spike unit came from, specified using a DynamicTableRegion.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
electrode_group: Optional[List[ElectrodeGroup]] = Field(
None, description="""Electrode group that each spike unit came from."""
)
waveform_mean: Optional[
waveform_mean: VectorData[
Optional[
Union[
NDArray[Shape["* num_units, * num_samples"], np.float32],
NDArray[Shape["* num_units, * num_samples, * num_electrodes"], np.float32],
NDArray[Shape["* num_units, * num_samples"], float],
NDArray[Shape["* num_units, * num_samples, * num_electrodes"], float],
]
]
] = Field(None, description="""Spike waveform mean for each spike unit.""")
waveform_sd: Optional[
waveform_sd: VectorData[
Optional[
Union[
NDArray[Shape["* num_units, * num_samples"], np.float32],
NDArray[Shape["* num_units, * num_samples, * num_electrodes"], np.float32],
NDArray[Shape["* num_units, * num_samples"], float],
NDArray[Shape["* num_units, * num_samples, * num_electrodes"], float],
]
]
] = Field(None, description="""Spike waveform standard deviation for each spike unit.""")
colnames: Optional[str] = Field(
None,
colnames: List[str] = Field(
...,
description="""The names of the columns in this table. This should be used to specify an order to the columns.""",
)
description: Optional[str] = Field(
None, description="""Description of what is in this dynamic table."""
)
id: NDArray[Shape["* num_rows"], int] = Field(
description: str = Field(..., description="""Description of what is in this dynamic table.""")
id: VectorData[NDArray[Shape["* num_rows"], int]] = Field(
...,
description="""Array of unique identifiers for the rows of this dynamic table.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_rows"}]}}},
@@ -517,14 +579,12 @@ class UnitsSpikeTimes(VectorData):
"linkml_meta": {"equals_string": "spike_times", "ifabsent": "string(spike_times)"}
},
)
resolution: Optional[np.float64] = Field(
resolution: Optional[float] = Field(
None,
description="""The smallest possible difference between two spike times. Usually 1 divided by the acquisition sampling rate from which spike times were extracted, but could be larger if the acquisition time series was downsampled or smaller if the acquisition time series was smoothed/interpolated and it is possible for the spike time to be between samples.""",
)
description: Optional[str] = Field(
None, description="""Description of what these vectors represent."""
)
array: Optional[
description: str = Field(..., description="""Description of what these vectors represent.""")
value: Optional[
Union[
NDArray[Shape["* dim0"], Any],
NDArray[Shape["* dim0, * dim1"], Any],

View file

@@ -14,6 +14,7 @@ from ...core.v2_2_2.core_nwb_base import (
TimeSeriesSync,
NWBContainer,
)
from ...core.v2_2_2.core_nwb_device import Device
metamodel_version = "None"
version = "2.2.2"
@@ -33,6 +34,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
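The `__getitem__` helper added to `ConfiguredBaseModel` across these generated modules can be sketched in isolation like this (a plain-Python stand-in class, not the generated pydantic model):

```python
# Minimal stand-in for the generated ConfiguredBaseModel.__getitem__:
# indexing the model delegates to its `value` field if set, then to
# `data`, and raises KeyError when neither is present.
class Indexable:
    def __init__(self, value=None, data=None):
        self.value = value
        self.data = data

    def __getitem__(self, val):
        if getattr(self, "value", None) is not None:
            return self.value[val]
        elif getattr(self, "data", None) is not None:
            return self.data[val]
        else:
            raise KeyError("No value or data field to index from")
```

So `Indexable(data=[1, 2, 3])[0]` returns `1`, and slices pass straight through to the underlying array or list.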
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@@ -76,26 +86,40 @@ class OptogeneticSeries(TimeSeries):
)
name: str = Field(...)
data: NDArray[Shape["* num_times"], np.number] = Field(
data: NDArray[Shape["* num_times"], float] = Field(
...,
description="""Applied power for optogenetic stimulus, in watts.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
site: Union[OptogeneticStimulusSite, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "OptogeneticStimulusSite"}, {"range": "string"}],
}
},
)
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -124,11 +148,20 @@ class OptogeneticStimulusSite(NWBContainer):
name: str = Field(...)
description: str = Field(..., description="""Description of stimulation site.""")
excitation_lambda: np.float32 = Field(..., description="""Excitation wavelength, in nm.""")
excitation_lambda: float = Field(..., description="""Excitation wavelength, in nm.""")
location: str = Field(
...,
description="""Location of the stimulation site. Specify the area, layer, comments on estimation of area/layer, stereotaxic coordinates if in vivo, etc. Use standard atlas names for anatomical regions when possible.""",
)
device: Union[Device, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "Device"}, {"range": "string"}],
}
},
)
# Model rebuild

View file

@@ -16,8 +16,9 @@ from pydantic import (
ValidationInfo,
BeforeValidator,
)
from numpydantic import NDArray, Shape
from ...hdmf_common.v1_1_3.hdmf_common_table import DynamicTableRegion, DynamicTable
from ...core.v2_2_2.core_nwb_device import Device
from numpydantic import NDArray, Shape
from ...core.v2_2_2.core_nwb_base import (
TimeSeriesStartingTime,
TimeSeriesSync,
@@ -44,6 +45,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@@ -69,7 +79,7 @@ ModelType = TypeVar("ModelType", bound=Type[BaseModel])
def _get_name(item: ModelType | dict, info: ValidationInfo) -> Union[ModelType, dict]:
"""Get the name of the slot that refers to this object"""
assert isinstance(item, (BaseModel, dict))
assert isinstance(item, (BaseModel, dict)), f"{item} was not a BaseModel or a dict!"
name = info.field_name
if isinstance(item, BaseModel):
item.name = name
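The `Named[...]` slots in these models (`rois`, `electrodes`, `spike_times_index`, etc.) rely on this validator; a standalone sketch of the behavior (hypothetical helper name, without pydantic) looks like:

```python
# Sketch of what the _get_name BeforeValidator does: stamp the pydantic
# field name onto the incoming object before validation, so e.g. a
# DynamicTableRegion assigned to the `rois` slot ends up named "rois".
def stamp_name(item, field_name):
    assert isinstance(item, dict) or hasattr(item, "__dict__"), (
        f"{item} was not a model or a dict!"
    )
    if isinstance(item, dict):
        item["name"] = field_name
    else:
        item.name = field_name
    return item
```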
@@ -109,24 +119,30 @@ class TwoPhotonSeries(ImageSeries):
)
name: str = Field(...)
pmt_gain: Optional[np.float32] = Field(None, description="""Photomultiplier gain.""")
scan_line_rate: Optional[np.float32] = Field(
pmt_gain: Optional[float] = Field(None, description="""Photomultiplier gain.""")
scan_line_rate: Optional[float] = Field(
None,
description="""Lines imaged per second. This is also stored in /general/optophysiology but is kept here as it is useful information for analysis, and so good to be stored w/ the actual data.""",
)
field_of_view: Optional[
Union[
NDArray[Shape["2 width_height"], np.float32],
NDArray[Shape["3 width_height"], np.float32],
]
Union[NDArray[Shape["2 width_height"], float], NDArray[Shape["3 width_height"], float]]
] = Field(None, description="""Width, height and depth of image, or imaged area, in meters.""")
imaging_plane: Union[ImagingPlane, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "ImagingPlane"}, {"range": "string"}],
}
},
)
data: Optional[
Union[
NDArray[Shape["* frame, * x, * y"], np.number],
NDArray[Shape["* frame, * x, * y, * z"], np.number],
NDArray[Shape["* frame, * x, * y"], float],
NDArray[Shape["* frame, * x, * y, * z"], float],
]
] = Field(None, description="""Binary data representing images across frames.""")
dimension: Optional[NDArray[Shape["* rank"], np.int32]] = Field(
dimension: Optional[NDArray[Shape["* rank"], int]] = Field(
None,
description="""Number of pixels on x, y, (and z) axes.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "rank"}]}}},
@@ -139,21 +155,26 @@ class TwoPhotonSeries(ImageSeries):
None,
description="""Format of image. If this is 'external', then the attribute 'external_file' contains the path information to the image files. If this is 'raw', then the raw (single-channel) binary data is stored in the 'data' dataset. If this attribute is not present, then the default format='raw' case is assumed.""",
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -182,31 +203,40 @@ class RoiResponseSeries(TimeSeries):
name: str = Field(...)
data: Union[
NDArray[Shape["* num_times"], np.number],
NDArray[Shape["* num_times, * num_rois"], np.number],
NDArray[Shape["* num_times"], float], NDArray[Shape["* num_times, * num_rois"], float]
] = Field(..., description="""Signals from ROIs.""")
rois: Named[DynamicTableRegion] = Field(
...,
description="""DynamicTableRegion referencing into an ROITable containing information on the ROIs stored in this timeseries.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -233,7 +263,7 @@ class DfOverF(NWBDataInterface):
{"from_schema": "core.nwb.ophys", "tree_root": True}
)
children: Optional[List[RoiResponseSeries]] = Field(
value: Optional[List[RoiResponseSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "RoiResponseSeries"}]}}
)
name: str = Field(...)
@@ -248,7 +278,7 @@ class Fluorescence(NWBDataInterface):
{"from_schema": "core.nwb.ophys", "tree_root": True}
)
children: Optional[List[RoiResponseSeries]] = Field(
value: Optional[List[RoiResponseSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "RoiResponseSeries"}]}}
)
name: str = Field(...)
@@ -263,7 +293,7 @@ class ImageSegmentation(NWBDataInterface):
{"from_schema": "core.nwb.ophys", "tree_root": True}
)
children: Optional[List[DynamicTable]] = Field(
value: Optional[List[DynamicTable]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "DynamicTable"}]}}
)
name: str = Field(...)
@@ -278,10 +308,152 @@ class ImagingPlane(NWBContainer):
{"from_schema": "core.nwb.ophys", "tree_root": True}
)
children: Optional[List[NWBContainer]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "NWBContainer"}]}}
)
name: str = Field(...)
description: Optional[str] = Field(None, description="""Description of the imaging plane.""")
excitation_lambda: float = Field(..., description="""Excitation wavelength, in nm.""")
imaging_rate: float = Field(..., description="""Rate that images are acquired, in Hz.""")
indicator: str = Field(..., description="""Calcium indicator.""")
location: str = Field(
...,
description="""Location of the imaging plane. Specify the area, layer, comments on estimation of area/layer, stereotaxic coordinates if in vivo, etc. Use standard atlas names for anatomical regions when possible.""",
)
manifold: Optional[ImagingPlaneManifold] = Field(
None,
description="""DEPRECATED Physical position of each pixel. 'xyz' represents the position of the pixel relative to the defined coordinate space. Deprecated in favor of origin_coords and grid_spacing.""",
)
origin_coords: Optional[ImagingPlaneOriginCoords] = Field(
None,
description="""Physical location of the first element of the imaging plane (0, 0) for 2-D data or (0, 0, 0) for 3-D data. See also reference_frame for what the physical location is relative to (e.g., bregma).""",
)
grid_spacing: Optional[ImagingPlaneGridSpacing] = Field(
None,
description="""Space between pixels in (x, y) or voxels in (x, y, z) directions, in the specified unit. Assumes imaging plane is a regular grid. See also reference_frame to interpret the grid.""",
)
reference_frame: Optional[str] = Field(
None,
description="""Describes reference frame of origin_coords and grid_spacing. For example, this can be a text description of the anatomical location and orientation of the grid defined by origin_coords and grid_spacing or the vectors needed to transform or rotate the grid to a common anatomical axis (e.g., AP/DV/ML). This field is necessary to interpret origin_coords and grid_spacing. If origin_coords and grid_spacing are not present, then this field is not required. For example, if the microscope takes 10 x 10 x 2 images, where the first value of the data matrix (index (0, 0, 0)) corresponds to (-1.2, -0.6, -2) mm relative to bregma, the spacing between pixels is 0.2 mm in x, 0.2 mm in y and 0.5 mm in z, and larger numbers in x means more anterior, larger numbers in y means more rightward, and larger numbers in z means more ventral, then enter the following -- origin_coords = (-1.2, -0.6, -2) grid_spacing = (0.2, 0.2, 0.5) reference_frame = \"Origin coordinates are relative to bregma. First dimension corresponds to anterior-posterior axis (larger index = more anterior). Second dimension corresponds to medial-lateral axis (larger index = more rightward). Third dimension corresponds to dorsal-ventral axis (larger index = more ventral).\"""",
)
optical_channel: List[OpticalChannel] = Field(
..., description="""An optical channel used to record from an imaging plane."""
)
device: Union[Device, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "Device"}, {"range": "string"}],
}
},
)
class ImagingPlaneManifold(ConfiguredBaseModel):
"""
DEPRECATED Physical position of each pixel. 'xyz' represents the position of the pixel relative to the defined coordinate space. Deprecated in favor of origin_coords and grid_spacing.
"""
linkml_meta: ClassVar[LinkMLMeta] = LinkMLMeta({"from_schema": "core.nwb.ophys"})
name: Literal["manifold"] = Field(
"manifold",
json_schema_extra={
"linkml_meta": {"equals_string": "manifold", "ifabsent": "string(manifold)"}
},
)
conversion: Optional[float] = Field(
1.0,
description="""Scalar to multiply each element in data to convert it to the specified 'unit'. If the data are stored in acquisition system units or other units that require a conversion to be interpretable, multiply the data by 'conversion' to convert the data to the specified 'unit'. e.g. if the data acquisition system stores values in this object as pixels from x = -500 to 499, y = -500 to 499 that correspond to a 2 m x 2 m range, then the 'conversion' multiplier to get from raw data acquisition pixel units to meters is 2/1000.""",
json_schema_extra={"linkml_meta": {"ifabsent": "float(1.0)"}},
)
unit: Optional[str] = Field(
"meters",
description="""Base unit of measurement for working with the data. The default value is 'meters'.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(meters)"}},
)
value: Optional[
Union[
NDArray[Shape["* height, * width, 3 x_y_z"], float],
NDArray[Shape["* height, * width, * depth, 3 x_y_z"], float],
]
] = Field(None)
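The `conversion` semantics described in the docstring above (raw stored values times `conversion` yield values in the declared `unit`) amount to a simple scale, sketched here with a hypothetical helper:

```python
# Applying the documented conversion: multiply raw stored values by the
# `conversion` scalar to get values in the declared unit. Using the
# docstring's example, pixel counts spanning 1000 steps over a 2 m
# range give conversion = 2 / 1000.
def to_unit(raw_values, conversion=1.0):
    return [v * conversion for v in raw_values]
```

For instance, `to_unit([-500, 0, 499], conversion=2 / 1000)` maps the raw pixel extremes onto roughly -1.0 m to 0.998 m.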
class ImagingPlaneOriginCoords(ConfiguredBaseModel):
"""
Physical location of the first element of the imaging plane (0, 0) for 2-D data or (0, 0, 0) for 3-D data. See also reference_frame for what the physical location is relative to (e.g., bregma).
"""
linkml_meta: ClassVar[LinkMLMeta] = LinkMLMeta({"from_schema": "core.nwb.ophys"})
name: Literal["origin_coords"] = Field(
"origin_coords",
json_schema_extra={
"linkml_meta": {"equals_string": "origin_coords", "ifabsent": "string(origin_coords)"}
},
)
unit: str = Field(
"meters",
description="""Measurement units for origin_coords. The default value is 'meters'.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(meters)"}},
)
value: Optional[NDArray[Shape["2 x_y, 3 x_y_z"], float]] = Field(
None,
json_schema_extra={
"linkml_meta": {
"array": {
"dimensions": [
{"alias": "x_y", "exact_cardinality": 2},
{"alias": "x_y_z", "exact_cardinality": 3},
]
}
}
},
)
class ImagingPlaneGridSpacing(ConfiguredBaseModel):
"""
Space between pixels in (x, y) or voxels in (x, y, z) directions, in the specified unit. Assumes imaging plane is a regular grid. See also reference_frame to interpret the grid.
"""
linkml_meta: ClassVar[LinkMLMeta] = LinkMLMeta({"from_schema": "core.nwb.ophys"})
name: Literal["grid_spacing"] = Field(
"grid_spacing",
json_schema_extra={
"linkml_meta": {"equals_string": "grid_spacing", "ifabsent": "string(grid_spacing)"}
},
)
unit: str = Field(
"meters",
description="""Measurement units for grid_spacing. The default value is 'meters'.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(meters)"}},
)
value: Optional[NDArray[Shape["2 x_y, 3 x_y_z"], float]] = Field(
None,
json_schema_extra={
"linkml_meta": {
"array": {
"dimensions": [
{"alias": "x_y", "exact_cardinality": 2},
{"alias": "x_y_z", "exact_cardinality": 3},
]
}
}
},
)
class OpticalChannel(NWBContainer):
"""
An optical channel used to record from an imaging plane.
"""
linkml_meta: ClassVar[LinkMLMeta] = LinkMLMeta({"from_schema": "core.nwb.ophys"})
name: str = Field(...)
description: str = Field(..., description="""Description or other notes about the channel.""")
emission_lambda: float = Field(..., description="""Emission wavelength for channel, in nm.""")
class MotionCorrection(NWBDataInterface):
@@ -293,7 +465,7 @@ class MotionCorrection(NWBDataInterface):
{"from_schema": "core.nwb.ophys", "tree_root": True}
)
children: Optional[List[NWBDataInterface]] = Field(
value: Optional[List[NWBDataInterface]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "NWBDataInterface"}]}}
)
name: str = Field(...)
@@ -307,4 +479,8 @@ DfOverF.model_rebuild()
Fluorescence.model_rebuild()
ImageSegmentation.model_rebuild()
ImagingPlane.model_rebuild()
ImagingPlaneManifold.model_rebuild()
ImagingPlaneOriginCoords.model_rebuild()
ImagingPlaneGridSpacing.model_rebuild()
OpticalChannel.model_rebuild()
MotionCorrection.model_rebuild()

View file

@@ -28,6 +28,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@@ -127,17 +136,13 @@ class ImagingRetinotopyAxis1PhaseMap(ConfiguredBaseModel):
}
},
)
dimension: Optional[np.int32] = Field(
None,
dimension: List[int] = Field(
...,
description="""Number of rows and columns in the image. NOTE: row, column representation is equivalent to height, width.""",
)
field_of_view: Optional[np.float32] = Field(
None, description="""Size of viewing area, in meters."""
)
unit: Optional[str] = Field(
None, description="""Unit that axis data is stored in (e.g., degrees)."""
)
array: Optional[NDArray[Shape["* num_rows, * num_cols"], np.float32]] = Field(
field_of_view: List[float] = Field(..., description="""Size of viewing area, in meters.""")
unit: str = Field(..., description="""Unit that axis data is stored in (e.g., degrees).""")
value: Optional[NDArray[Shape["* num_rows, * num_cols"], float]] = Field(
None,
json_schema_extra={
"linkml_meta": {"array": {"dimensions": [{"alias": "num_rows"}, {"alias": "num_cols"}]}}
@@ -161,17 +166,13 @@ class ImagingRetinotopyAxis1PowerMap(ConfiguredBaseModel):
}
},
)
dimension: Optional[np.int32] = Field(
None,
dimension: List[int] = Field(
...,
description="""Number of rows and columns in the image. NOTE: row, column representation is equivalent to height, width.""",
)
field_of_view: Optional[np.float32] = Field(
None, description="""Size of viewing area, in meters."""
)
unit: Optional[str] = Field(
None, description="""Unit that axis data is stored in (e.g., degrees)."""
)
array: Optional[NDArray[Shape["* num_rows, * num_cols"], np.float32]] = Field(
field_of_view: List[float] = Field(..., description="""Size of viewing area, in meters.""")
unit: str = Field(..., description="""Unit that axis data is stored in (e.g., degrees).""")
value: Optional[NDArray[Shape["* num_rows, * num_cols"], float]] = Field(
None,
json_schema_extra={
"linkml_meta": {"array": {"dimensions": [{"alias": "num_rows"}, {"alias": "num_cols"}]}}
@@ -195,17 +196,13 @@ class ImagingRetinotopyAxis2PhaseMap(ConfiguredBaseModel):
}
},
)
dimension: Optional[np.int32] = Field(
None,
dimension: List[int] = Field(
...,
description="""Number of rows and columns in the image. NOTE: row, column representation is equivalent to height, width.""",
)
field_of_view: Optional[np.float32] = Field(
None, description="""Size of viewing area, in meters."""
)
unit: Optional[str] = Field(
None, description="""Unit that axis data is stored in (e.g., degrees)."""
)
array: Optional[NDArray[Shape["* num_rows, * num_cols"], np.float32]] = Field(
field_of_view: List[float] = Field(..., description="""Size of viewing area, in meters.""")
unit: str = Field(..., description="""Unit that axis data is stored in (e.g., degrees).""")
value: Optional[NDArray[Shape["* num_rows, * num_cols"], float]] = Field(
None,
json_schema_extra={
"linkml_meta": {"array": {"dimensions": [{"alias": "num_rows"}, {"alias": "num_cols"}]}}
@@ -229,17 +226,13 @@ class ImagingRetinotopyAxis2PowerMap(ConfiguredBaseModel):
}
},
)
dimension: Optional[np.int32] = Field(
None,
dimension: List[int] = Field(
...,
description="""Number of rows and columns in the image. NOTE: row, column representation is equivalent to height, width.""",
)
field_of_view: Optional[np.float32] = Field(
None, description="""Size of viewing area, in meters."""
)
unit: Optional[str] = Field(
None, description="""Unit that axis data is stored in (e.g., degrees)."""
)
array: Optional[NDArray[Shape["* num_rows, * num_cols"], np.float32]] = Field(
field_of_view: List[float] = Field(..., description="""Size of viewing area, in meters.""")
unit: str = Field(..., description="""Unit that axis data is stored in (e.g., degrees).""")
value: Optional[NDArray[Shape["* num_rows, * num_cols"], float]] = Field(
None,
json_schema_extra={
"linkml_meta": {"array": {"dimensions": [{"alias": "num_rows"}, {"alias": "num_cols"}]}}
@@ -263,24 +256,18 @@ class ImagingRetinotopyFocalDepthImage(ConfiguredBaseModel):
}
},
)
bits_per_pixel: Optional[np.int32] = Field(
None,
bits_per_pixel: int = Field(
...,
description="""Number of bits used to represent each value. This is necessary to determine maximum (white) pixel value.""",
)
dimension: Optional[np.int32] = Field(
None,
dimension: List[int] = Field(
...,
description="""Number of rows and columns in the image. NOTE: row, column representation is equivalent to height, width.""",
)
field_of_view: Optional[np.float32] = Field(
None, description="""Size of viewing area, in meters."""
)
focal_depth: Optional[np.float32] = Field(
None, description="""Focal depth offset, in meters."""
)
format: Optional[str] = Field(
None, description="""Format of image. Right now only 'raw' is supported."""
)
array: Optional[NDArray[Shape["* num_rows, * num_cols"], np.uint16]] = Field(
field_of_view: List[float] = Field(..., description="""Size of viewing area, in meters.""")
focal_depth: float = Field(..., description="""Focal depth offset, in meters.""")
format: str = Field(..., description="""Format of image. Right now only 'raw' is supported.""")
value: Optional[NDArray[Shape["* num_rows, * num_cols"], int]] = Field(
None,
json_schema_extra={
"linkml_meta": {"array": {"dimensions": [{"alias": "num_rows"}, {"alias": "num_cols"}]}}
@@ -301,14 +288,12 @@ class ImagingRetinotopySignMap(ConfiguredBaseModel):
"linkml_meta": {"equals_string": "sign_map", "ifabsent": "string(sign_map)"}
},
)
dimension: Optional[np.int32] = Field(
None,
dimension: List[int] = Field(
...,
description="""Number of rows and columns in the image. NOTE: row, column representation is equivalent to height, width.""",
)
field_of_view: Optional[np.float32] = Field(
None, description="""Size of viewing area, in meters."""
)
array: Optional[NDArray[Shape["* num_rows, * num_cols"], np.float32]] = Field(
field_of_view: List[float] = Field(..., description="""Size of viewing area, in meters.""")
value: Optional[NDArray[Shape["* num_rows, * num_cols"], float]] = Field(
None,
json_schema_extra={
"linkml_meta": {"array": {"dimensions": [{"alias": "num_rows"}, {"alias": "num_cols"}]}}
@@ -332,21 +317,17 @@ class ImagingRetinotopyVasculatureImage(ConfiguredBaseModel):
}
},
)
bits_per_pixel: Optional[np.int32] = Field(
None,
bits_per_pixel: int = Field(
...,
description="""Number of bits used to represent each value. This is necessary to determine maximum (white) pixel value""",
)
dimension: Optional[np.int32] = Field(
None,
dimension: List[int] = Field(
...,
description="""Number of rows and columns in the image. NOTE: row, column representation is equivalent to height, width.""",
)
field_of_view: Optional[np.float32] = Field(
None, description="""Size of viewing area, in meters."""
)
format: Optional[str] = Field(
None, description="""Format of image. Right now only 'raw' is supported."""
)
array: Optional[NDArray[Shape["* num_rows, * num_cols"], np.uint16]] = Field(
field_of_view: List[float] = Field(..., description="""Size of viewing area, in meters.""")
format: str = Field(..., description="""Format of image. Right now only 'raw' is supported.""")
value: Optional[NDArray[Shape["* num_rows, * num_cols"], int]] = Field(
None,
json_schema_extra={
"linkml_meta": {"array": {"dimensions": [{"alias": "num_rows"}, {"alias": "num_cols"}]}}

View file

@@ -52,6 +52,10 @@ from ...core.v2_2_2.core_nwb_ophys import (
Fluorescence,
ImageSegmentation,
ImagingPlane,
ImagingPlaneManifold,
ImagingPlaneOriginCoords,
ImagingPlaneGridSpacing,
OpticalChannel,
MotionCorrection,
)
from ...core.v2_2_2.core_nwb_device import Device
@@ -127,11 +131,12 @@ from ...core.v2_2_2.core_nwb_file import (
NWBFile,
NWBFileStimulus,
NWBFileGeneral,
NWBFileGeneralSourceScript,
GeneralSourceScript,
Subject,
NWBFileGeneralExtracellularEphys,
NWBFileGeneralExtracellularEphysElectrodes,
NWBFileGeneralIntracellularEphys,
GeneralExtracellularEphys,
ExtracellularEphysElectrodes,
GeneralIntracellularEphys,
NWBFileIntervals,
)
from ...core.v2_2_2.core_nwb_epoch import TimeIntervals, TimeIntervalsTimeseries
@@ -153,6 +158,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}

View file

@@ -28,6 +28,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@@ -83,15 +92,15 @@ class Image(NWBData):
)
name: str = Field(...)
resolution: Optional[np.float32] = Field(
resolution: Optional[float] = Field(
None, description="""Pixel resolution of the image, in pixels per centimeter."""
)
description: Optional[str] = Field(None, description="""Description of the image.""")
array: Optional[
value: Optional[
Union[
NDArray[Shape["* x, * y"], np.number],
NDArray[Shape["* x, * y, 3 r_g_b"], np.number],
NDArray[Shape["* x, * y, 4 r_g_b_a"], np.number],
NDArray[Shape["* x, * y"], float],
NDArray[Shape["* x, * y, 3 r_g_b"], float],
NDArray[Shape["* x, * y, 4 r_g_b_a"], float],
]
] = Field(None)
@@ -130,10 +139,15 @@ class TimeSeries(NWBDataInterface):
)
name: str = Field(...)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
data: TimeSeriesData = Field(
...,
@@ -143,12 +157,12 @@
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -177,19 +191,21 @@ class TimeSeriesData(ConfiguredBaseModel):
"data",
json_schema_extra={"linkml_meta": {"equals_string": "data", "ifabsent": "string(data)"}},
)
conversion: Optional[np.float32] = Field(
None,
conversion: Optional[float] = Field(
1.0,
description="""Scalar to multiply each element in data to convert it to the specified 'unit'. If the data are stored in acquisition system units or other units that require a conversion to be interpretable, multiply the data by 'conversion' to convert the data to the specified 'unit'. e.g. if the data acquisition system stores values in this object as signed 16-bit integers (int16 range -32,768 to 32,767) that correspond to a 5V range (-2.5V to 2.5V), and the data acquisition system gain is 8000X, then the 'conversion' multiplier to get from raw data acquisition values to recorded volts is 2.5/32768/8000 = 9.5367e-9.""",
json_schema_extra={"linkml_meta": {"ifabsent": "float(1.0)"}},
)
resolution: Optional[np.float32] = Field(
None,
resolution: Optional[float] = Field(
-1.0,
description="""Smallest meaningful difference between values in data, stored in the specified by unit, e.g., the change in value of the least significant bit, or a larger number if signal noise is known to be present. If unknown, use -1.0.""",
json_schema_extra={"linkml_meta": {"ifabsent": "float(-1.0)"}},
)
unit: Optional[str] = Field(
None,
unit: str = Field(
...,
description="""Base unit of measurement for working with the data. Actual stored values are not necessarily stored in these units. To access the data in these units, multiply 'data' by 'conversion'.""",
)
array: Optional[
value: Optional[
Union[
NDArray[Shape["* num_times"], Any],
NDArray[Shape["* num_times, * num_dim2"], Any],
@@ -212,11 +228,15 @@ class TimeSeriesStartingTime(ConfiguredBaseModel):
"linkml_meta": {"equals_string": "starting_time", "ifabsent": "string(starting_time)"}
},
)
rate: Optional[np.float32] = Field(None, description="""Sampling rate, in Hz.""")
unit: Optional[str] = Field(
None, description="""Unit of measurement for time, which is fixed to 'seconds'."""
rate: float = Field(..., description="""Sampling rate, in Hz.""")
unit: Literal["seconds"] = Field(
"seconds",
description="""Unit of measurement for time, which is fixed to 'seconds'.""",
json_schema_extra={
"linkml_meta": {"equals_string": "seconds", "ifabsent": "string(seconds)"}
},
)
value: np.float64 = Field(...)
value: float = Field(...)
class TimeSeriesSync(ConfiguredBaseModel):
@@ -241,7 +261,7 @@ class ProcessingModule(NWBContainer):
{"from_schema": "core.nwb.base", "tree_root": True}
)
children: Optional[List[Union[DynamicTable, NWBDataInterface]]] = Field(
value: Optional[List[Union[DynamicTable, NWBDataInterface]]] = Field(
None,
json_schema_extra={
"linkml_meta": {"any_of": [{"range": "NWBDataInterface"}, {"range": "DynamicTable"}]}
@@ -260,9 +280,7 @@ class Images(NWBDataInterface):
)
name: str = Field("Images", json_schema_extra={"linkml_meta": {"ifabsent": "string(Images)"}})
description: Optional[str] = Field(
None, description="""Description of this collection of images."""
)
description: str = Field(..., description="""Description of this collection of images.""")
image: List[Image] = Field(..., description="""Images stored in this collection.""")

View file

@@ -34,6 +34,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@@ -84,21 +93,26 @@ class SpatialSeries(TimeSeries):
reference_frame: Optional[str] = Field(
None, description="""Description defining what exactly 'straight-ahead' means."""
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -128,13 +142,14 @@ class SpatialSeriesData(ConfiguredBaseModel):
json_schema_extra={"linkml_meta": {"equals_string": "data", "ifabsent": "string(data)"}},
)
unit: Optional[str] = Field(
None,
"meters",
description="""Base unit of measurement for working with the data. The default value is 'meters'. Actual stored values are not necessarily stored in these units. To access the data in these units, multiply 'data' by 'conversion'.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(meters)"}},
)
array: Optional[
value: Optional[
Union[
NDArray[Shape["* num_times"], np.number],
NDArray[Shape["* num_times, * num_features"], np.number],
NDArray[Shape["* num_times"], float],
NDArray[Shape["* num_times, * num_features"], float],
]
] = Field(None)
@@ -148,7 +163,7 @@ class BehavioralEpochs(NWBDataInterface):
{"from_schema": "core.nwb.behavior", "tree_root": True}
)
children: Optional[List[IntervalSeries]] = Field(
value: Optional[List[IntervalSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "IntervalSeries"}]}}
)
name: str = Field(...)
@@ -163,7 +178,7 @@ class BehavioralEvents(NWBDataInterface):
{"from_schema": "core.nwb.behavior", "tree_root": True}
)
children: Optional[List[TimeSeries]] = Field(
value: Optional[List[TimeSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "TimeSeries"}]}}
)
name: str = Field(...)
@@ -178,7 +193,7 @@ class BehavioralTimeSeries(NWBDataInterface):
{"from_schema": "core.nwb.behavior", "tree_root": True}
)
children: Optional[List[TimeSeries]] = Field(
value: Optional[List[TimeSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "TimeSeries"}]}}
)
name: str = Field(...)
@@ -193,7 +208,7 @@ class PupilTracking(NWBDataInterface):
{"from_schema": "core.nwb.behavior", "tree_root": True}
)
children: Optional[List[TimeSeries]] = Field(
value: Optional[List[TimeSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "TimeSeries"}]}}
)
name: str = Field(...)
@@ -208,7 +223,7 @@ class EyeTracking(NWBDataInterface):
{"from_schema": "core.nwb.behavior", "tree_root": True}
)
children: Optional[List[SpatialSeries]] = Field(
value: Optional[List[SpatialSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "SpatialSeries"}]}}
)
name: str = Field(...)
@@ -223,7 +238,7 @@ class CompassDirection(NWBDataInterface):
{"from_schema": "core.nwb.behavior", "tree_root": True}
)
children: Optional[List[SpatialSeries]] = Field(
value: Optional[List[SpatialSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "SpatialSeries"}]}}
)
name: str = Field(...)
@@ -238,7 +253,7 @@ class Position(NWBDataInterface):
{"from_schema": "core.nwb.behavior", "tree_root": True}
)
children: Optional[List[SpatialSeries]] = Field(
value: Optional[List[SpatialSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "SpatialSeries"}]}}
)
name: str = Field(...)

View file

@@ -27,6 +27,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}

View file

@@ -16,6 +16,7 @@ from pydantic import (
ValidationInfo,
BeforeValidator,
)
from ...core.v2_2_4.core_nwb_device import Device
from ...core.v2_2_4.core_nwb_base import (
TimeSeries,
TimeSeriesStartingTime,
@@ -43,6 +44,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@@ -68,7 +78,7 @@ ModelType = TypeVar("ModelType", bound=Type[BaseModel])
def _get_name(item: ModelType | dict, info: ValidationInfo) -> Union[ModelType, dict]:
"""Get the name of the slot that refers to this object"""
assert isinstance(item, (BaseModel, dict))
assert isinstance(item, (BaseModel, dict)), f"{item} was not a BaseModel or a dict!"
name = info.field_name
if isinstance(item, BaseModel):
item.name = name
@@ -108,37 +118,47 @@ class ElectricalSeries(TimeSeries):
name: str = Field(...)
data: Union[
NDArray[Shape["* num_times"], np.number],
NDArray[Shape["* num_times, * num_channels"], np.number],
NDArray[Shape["* num_times, * num_channels, * num_samples"], np.number],
NDArray[Shape["* num_times"], float],
NDArray[Shape["* num_times, * num_channels"], float],
NDArray[Shape["* num_times, * num_channels, * num_samples"], float],
] = Field(..., description="""Recorded voltage data.""")
electrodes: Named[DynamicTableRegion] = Field(
...,
description="""DynamicTableRegion pointer to the electrodes that this time series was generated from.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
channel_conversion: Optional[NDArray[Shape["* num_channels"], np.float32]] = Field(
channel_conversion: Optional[NDArray[Shape["* num_channels"], float]] = Field(
None,
description="""Channel-specific conversion factor. Multiply the data in the 'data' dataset by these values along the channel axis (as indicated by axis attribute) AND by the global conversion factor in the 'conversion' attribute of 'data' to get the data values in Volts, i.e, data in Volts = data * data.conversion * channel_conversion. This approach allows for both global and per-channel data conversion factors needed to support the storage of electrical recordings as native values generated by data acquisition systems. If this dataset is not present, then there is no channel-specific conversion factor, i.e. it is 1 for all channels.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_channels"}]}}},
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -167,10 +187,10 @@ class SpikeEventSeries(ElectricalSeries):
name: str = Field(...)
data: Union[
NDArray[Shape["* num_events, * num_samples"], np.number],
NDArray[Shape["* num_events, * num_channels, * num_samples"], np.number],
NDArray[Shape["* num_events, * num_samples"], float],
NDArray[Shape["* num_events, * num_channels, * num_samples"], float],
] = Field(..., description="""Spike waveforms.""")
timestamps: NDArray[Shape["* num_times"], np.float64] = Field(
timestamps: NDArray[Shape["* num_times"], float] = Field(
...,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time. Timestamps are required for the events. Unlike for TimeSeries, timestamps are required for SpikeEventSeries and are thus re-specified here.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -179,24 +199,34 @@ class SpikeEventSeries(ElectricalSeries):
...,
description="""DynamicTableRegion pointer to the electrodes that this time series was generated from.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
channel_conversion: Optional[NDArray[Shape["* num_channels"], np.float32]] = Field(
channel_conversion: Optional[NDArray[Shape["* num_channels"], float]] = Field(
None,
description="""Channel-specific conversion factor. Multiply the data in the 'data' dataset by these values along the channel axis (as indicated by axis attribute) AND by the global conversion factor in the 'conversion' attribute of 'data' to get the data values in Volts, i.e, data in Volts = data * data.conversion * channel_conversion. This approach allows for both global and per-channel data conversion factors needed to support the storage of electrical recordings as native values generated by data acquisition systems. If this dataset is not present, then there is no channel-specific conversion factor, i.e. it is 1 for all channels.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_channels"}]}}},
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -232,7 +262,7 @@ class FeatureExtraction(NWBDataInterface):
description="""Description of features (eg, ''PC1'') for each of the extracted features.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_features"}]}}},
)
features: NDArray[Shape["* num_events, * num_channels, * num_features"], np.float32] = Field(
features: NDArray[Shape["* num_events, * num_channels, * num_features"], float] = Field(
...,
description="""Multi-dimensional array of features extracted from each event.""",
json_schema_extra={
@@ -247,7 +277,7 @@ class FeatureExtraction(NWBDataInterface):
}
},
)
times: NDArray[Shape["* num_events"], np.float64] = Field(
times: NDArray[Shape["* num_events"], float] = Field(
...,
description="""Times of events that features correspond to (can be a link).""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_events"}]}}},
@@ -256,7 +286,12 @@ class FeatureExtraction(NWBDataInterface):
...,
description="""DynamicTableRegion pointer to the electrodes that this time series was generated from.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
@@ -277,16 +312,25 @@ class EventDetection(NWBDataInterface):
...,
description="""Description of how events were detected, such as voltage threshold, or dV/dT threshold, as well as relevant values.""",
)
source_idx: NDArray[Shape["* num_events"], np.int32] = Field(
source_idx: NDArray[Shape["* num_events"], int] = Field(
...,
description="""Indices (zero-based) into source ElectricalSeries::data array corresponding to time of event. ''description'' should define what is meant by time of event (e.g., .25 ms before action potential peak, zero-crossing time, etc). The index points to each event from the raw data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_events"}]}}},
)
times: NDArray[Shape["* num_events"], np.float64] = Field(
times: NDArray[Shape["* num_events"], float] = Field(
...,
description="""Timestamps of events, in seconds.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_events"}]}}},
)
source_electricalseries: Union[ElectricalSeries, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "ElectricalSeries"}, {"range": "string"}],
}
},
)
class EventWaveform(NWBDataInterface):
@@ -298,7 +342,7 @@ class EventWaveform(NWBDataInterface):
{"from_schema": "core.nwb.ecephys", "tree_root": True}
)
children: Optional[List[SpikeEventSeries]] = Field(
value: Optional[List[SpikeEventSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "SpikeEventSeries"}]}}
)
name: str = Field(...)
@@ -313,7 +357,7 @@ class FilteredEphys(NWBDataInterface):
{"from_schema": "core.nwb.ecephys", "tree_root": True}
)
children: Optional[List[ElectricalSeries]] = Field(
value: Optional[List[ElectricalSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "ElectricalSeries"}]}}
)
name: str = Field(...)
@@ -328,7 +372,7 @@ class LFP(NWBDataInterface):
{"from_schema": "core.nwb.ecephys", "tree_root": True}
)
children: Optional[List[ElectricalSeries]] = Field(
value: Optional[List[ElectricalSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "ElectricalSeries"}]}}
)
name: str = Field(...)
@@ -344,14 +388,23 @@ class ElectrodeGroup(NWBContainer):
)
name: str = Field(...)
description: Optional[str] = Field(None, description="""Description of this electrode group.""")
location: Optional[str] = Field(
None,
description: str = Field(..., description="""Description of this electrode group.""")
location: str = Field(
...,
description="""Location of electrode group. Specify the area, layer, comments on estimation of area/layer, etc. Use standard atlas names for anatomical regions when possible.""",
)
position: Optional[ElectrodeGroupPosition] = Field(
None, description="""stereotaxic or common framework coordinates"""
)
device: Union[Device, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "Device"}, {"range": "string"}],
}
},
)
class ElectrodeGroupPosition(ConfiguredBaseModel):
@@ -367,9 +420,21 @@ class ElectrodeGroupPosition(ConfiguredBaseModel):
"linkml_meta": {"equals_string": "position", "ifabsent": "string(position)"}
},
)
x: Optional[np.float32] = Field(None, description="""x coordinate""")
y: Optional[np.float32] = Field(None, description="""y coordinate""")
z: Optional[np.float32] = Field(None, description="""z coordinate""")
x: Optional[NDArray[Shape["*"], float]] = Field(
None,
description="""x coordinate""",
json_schema_extra={"linkml_meta": {"array": {"exact_number_dimensions": 1}}},
)
y: Optional[NDArray[Shape["*"], float]] = Field(
None,
description="""y coordinate""",
json_schema_extra={"linkml_meta": {"array": {"exact_number_dimensions": 1}}},
)
z: Optional[NDArray[Shape["*"], float]] = Field(
None,
description="""z coordinate""",
json_schema_extra={"linkml_meta": {"array": {"exact_number_dimensions": 1}}},
)
class ClusterWaveforms(NWBDataInterface):
@@ -388,7 +453,7 @@ class ClusterWaveforms(NWBDataInterface):
waveform_filtering: str = Field(
..., description="""Filtering applied to data before generating mean/sd"""
)
waveform_mean: NDArray[Shape["* num_clusters, * num_samples"], np.float32] = Field(
waveform_mean: NDArray[Shape["* num_clusters, * num_samples"], float] = Field(
...,
description="""The mean waveform for each cluster, using the same indices for each wave as cluster numbers in the associated Clustering module (i.e, cluster 3 is in array slot [3]). Waveforms corresponding to gaps in cluster sequence should be empty (e.g., zero- filled)""",
json_schema_extra={
@@ -397,7 +462,7 @@ class ClusterWaveforms(NWBDataInterface):
}
},
)
waveform_sd: NDArray[Shape["* num_clusters, * num_samples"], np.float32] = Field(
waveform_sd: NDArray[Shape["* num_clusters, * num_samples"], float] = Field(
...,
description="""Stdev of waveforms for each cluster, using the same indices as in mean""",
json_schema_extra={
@@ -406,6 +471,15 @@ class ClusterWaveforms(NWBDataInterface):
}
},
)
clustering_interface: Union[Clustering, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "Clustering"}, {"range": "string"}],
}
},
)
class Clustering(NWBDataInterface):
@@ -424,17 +498,17 @@ class Clustering(NWBDataInterface):
...,
description="""Description of clusters or clustering, (e.g. cluster 0 is noise, clusters curated using Klusters, etc)""",
)
num: NDArray[Shape["* num_events"], np.int32] = Field(
num: NDArray[Shape["* num_events"], int] = Field(
...,
description="""Cluster number of each event""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_events"}]}}},
)
peak_over_rms: NDArray[Shape["* num_clusters"], np.float32] = Field(
peak_over_rms: NDArray[Shape["* num_clusters"], float] = Field(
...,
description="""Maximum ratio of waveform peak to RMS on any channel in the cluster (provides a basic clustering metric).""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_clusters"}]}}},
)
times: NDArray[Shape["* num_events"], np.float64] = Field(
times: NDArray[Shape["* num_events"], float] = Field(
...,
description="""Times of clustered events, in seconds. This may be a link to times field in associated FeatureExtraction module.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_events"}]}}},


@@ -37,6 +37,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
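A minimal standalone sketch of the `__getitem__` delegation added to the generated base model (illustrative only — `Wrapper` and its list-typed fields are hypothetical stand-ins for the generated pydantic classes):

```python
from dataclasses import dataclass
from typing import Any, Optional, Union


@dataclass
class Wrapper:
    """Hypothetical stand-in for a generated model with value/data fields."""

    value: Optional[list] = None
    data: Optional[list] = None

    def __getitem__(self, val: Union[int, slice]) -> Any:
        # Delegate indexing to `value` first, then `data`, as in the base model
        if self.value is not None:
            return self.value[val]
        elif self.data is not None:
            return self.data[val]
        else:
            raise KeyError("No value or data field to index from")


print(Wrapper(value=[1, 2, 3])[0])   # -> 1
print(Wrapper(data=[4, 5])[1:])      # -> [5]
```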
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@@ -62,7 +71,7 @@ ModelType = TypeVar("ModelType", bound=Type[BaseModel])
def _get_name(item: ModelType | dict, info: ValidationInfo) -> Union[ModelType, dict]:
"""Get the name of the slot that refers to this object"""
assert isinstance(item, (BaseModel, dict))
assert isinstance(item, (BaseModel, dict)), f"{item} was not a BaseModel or a dict!"
name = info.field_name
if isinstance(item, BaseModel):
item.name = name
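The pattern behind `_get_name` can be sketched standalone like this (a hypothetical minimal version; in the generated models it runs as a pydantic `field_validator` and reads the slot name from `ValidationInfo.field_name`):

```python
from dataclasses import dataclass
from typing import Union


@dataclass
class Named:
    """Hypothetical stand-in for a BaseModel with a name slot."""

    name: str = ""


def get_name(item: Union[Named, dict], field_name: str) -> Union[Named, dict]:
    # Stamp the name of the referring slot onto the object, as _get_name does
    assert isinstance(item, (Named, dict)), f"{item} was not a Named or a dict!"
    if isinstance(item, Named):
        item.name = field_name
    else:
        item["name"] = field_name
    return item
```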
@@ -96,7 +105,7 @@ class TimeIntervals(DynamicTable):
)
name: str = Field(...)
start_time: NDArray[Any, np.float32] = Field(
start_time: VectorData[NDArray[Any, float]] = Field(
...,
description="""Start time of epoch, in seconds.""",
json_schema_extra={
@@ -105,7 +114,7 @@ class TimeIntervals(DynamicTable):
}
},
)
stop_time: NDArray[Any, np.float32] = Field(
stop_time: VectorData[NDArray[Any, float]] = Field(
...,
description="""Stop time of epoch, in seconds.""",
json_schema_extra={
@@ -114,7 +123,7 @@ class TimeIntervals(DynamicTable):
}
},
)
tags: Optional[NDArray[Any, str]] = Field(
tags: VectorData[Optional[NDArray[Any, str]]] = Field(
None,
description="""User-defined tags that identify or categorize events.""",
json_schema_extra={
@@ -127,7 +136,12 @@ class TimeIntervals(DynamicTable):
None,
description="""Index for tags.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
timeseries: Optional[TimeIntervalsTimeseries] = Field(
@@ -137,17 +151,20 @@ class TimeIntervals(DynamicTable):
None,
description="""Index for timeseries.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
colnames: Optional[str] = Field(
None,
colnames: List[str] = Field(
...,
description="""The names of the columns in this table. This should be used to specify an order to the columns.""",
)
description: Optional[str] = Field(
None, description="""Description of what is in this dynamic table."""
)
id: NDArray[Shape["* num_rows"], int] = Field(
description: str = Field(..., description="""Description of what is in this dynamic table.""")
id: VectorData[NDArray[Shape["* num_rows"], int]] = Field(
...,
description="""Array of unique identifiers for the rows of this dynamic table.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_rows"}]}}},
@@ -173,21 +190,23 @@ class TimeIntervalsTimeseries(VectorData):
"linkml_meta": {"equals_string": "timeseries", "ifabsent": "string(timeseries)"}
},
)
idx_start: Optional[np.int32] = Field(
idx_start: Optional[NDArray[Shape["*"], int]] = Field(
None,
description="""Start index into the TimeSeries 'data' and 'timestamp' datasets of the referenced TimeSeries. The first dimension of those arrays is always time.""",
json_schema_extra={"linkml_meta": {"array": {"exact_number_dimensions": 1}}},
)
count: Optional[np.int32] = Field(
count: Optional[NDArray[Shape["*"], int]] = Field(
None,
description="""Number of data samples available in this time series, during this epoch.""",
json_schema_extra={"linkml_meta": {"array": {"exact_number_dimensions": 1}}},
)
timeseries: Optional[TimeSeries] = Field(
None, description="""the TimeSeries that this index applies to."""
timeseries: Optional[NDArray[Shape["*"], TimeSeries]] = Field(
None,
description="""the TimeSeries that this index applies to.""",
json_schema_extra={"linkml_meta": {"array": {"exact_number_dimensions": 1}}},
)
description: Optional[str] = Field(
None, description="""Description of what these vectors represent."""
)
array: Optional[
description: str = Field(..., description="""Description of what these vectors represent.""")
value: Optional[
Union[
NDArray[Shape["* dim0"], Any],
NDArray[Shape["* dim0, * dim1"], Any],


@@ -7,7 +7,6 @@ import sys
from typing import Any, ClassVar, List, Literal, Dict, Optional, Union
from pydantic import BaseModel, ConfigDict, Field, RootModel, field_validator
import numpy as np
from ...core.v2_2_4.core_nwb_epoch import TimeIntervals
from ...core.v2_2_4.core_nwb_misc import Units
from ...core.v2_2_4.core_nwb_device import Device
from ...core.v2_2_4.core_nwb_ogen import OptogeneticStimulusSite
@@ -16,6 +15,7 @@ from ...core.v2_2_4.core_nwb_ecephys import ElectrodeGroup
from numpydantic import NDArray, Shape
from ...hdmf_common.v1_1_3.hdmf_common_table import DynamicTable, VectorData, VectorIndex
from ...core.v2_2_4.core_nwb_icephys import IntracellularElectrode, SweepTable
from ...core.v2_2_4.core_nwb_epoch import TimeIntervals
from ...core.v2_2_4.core_nwb_base import (
NWBData,
NWBContainer,
@@ -42,6 +42,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@@ -96,9 +105,7 @@ class ScratchData(NWBData):
)
name: str = Field(...)
notes: Optional[str] = Field(
None, description="""Any notes the user has about the dataset being stored"""
)
notes: str = Field(..., description="""Any notes the user has about the dataset being stored""")
class NWBFile(NWBContainer):
@@ -114,11 +121,12 @@ class NWBFile(NWBContainer):
"root",
json_schema_extra={"linkml_meta": {"equals_string": "root", "ifabsent": "string(root)"}},
)
nwb_version: Optional[str] = Field(
None,
nwb_version: Literal["2.2.4"] = Field(
"2.2.4",
description="""File version string. Use semantic versioning, e.g. 1.2.1. This will be the name of the format with trailing major, minor and patch numbers.""",
json_schema_extra={"linkml_meta": {"equals_string": "2.2.4", "ifabsent": "string(2.2.4)"}},
)
file_create_date: NDArray[Shape["* num_modifications"], np.datetime64] = Field(
file_create_date: NDArray[Shape["* num_modifications"], datetime] = Field(
...,
description="""A record of the date the file was created and of subsequent modifications. The date is stored in UTC with local timezone offset as ISO 8601 extended formatted strings: 2018-09-28T14:43:54.123+02:00. Dates stored in UTC end in \"Z\" with no timezone offset. Date accuracy is up to milliseconds. The file can be created after the experiment was run, so this may differ from the experiment start time. Each modification to the nwb file adds a new entry to the array.""",
json_schema_extra={
@@ -132,11 +140,11 @@ class NWBFile(NWBContainer):
session_description: str = Field(
..., description="""A description of the experimental session and data in the file."""
)
session_start_time: np.datetime64 = Field(
session_start_time: datetime = Field(
...,
description="""Date and time of the experiment/session start. The date is stored in UTC with local timezone offset as ISO 8601 extended formatted string: 2018-09-28T14:43:54.123+02:00. Dates stored in UTC end in \"Z\" with no timezone offset. Date accuracy is up to milliseconds.""",
)
timestamps_reference_time: np.datetime64 = Field(
timestamps_reference_time: datetime = Field(
...,
description="""Date and time corresponding to time zero of all timestamps. The date is stored in UTC with local timezone offset as ISO 8601 extended formatted string: 2018-09-28T14:43:54.123+02:00. Dates stored in UTC end in \"Z\" with no timezone offset. Date accuracy is up to milliseconds. All times stored in the file use this time as reference (i.e., time zero).""",
)
@@ -174,19 +182,9 @@ class NWBFile(NWBContainer):
...,
description="""Experimental metadata, including protocol, notes and description of hardware device(s). The metadata stored in this section should be used to describe the experiment. Metadata necessary for interpreting the data is stored with the data. General experimental metadata, including animal strain, experimental protocols, experimenter, devices, etc, are stored under 'general'. Core metadata (e.g., that required to interpret data fields) is stored with the data itself, and implicitly defined by the file specification (e.g., time is in seconds). The strategy used here for storing non-core metadata is to use free-form text fields, such as would appear in sentences or paragraphs from a Methods section. Metadata fields are text to enable them to be more general, for example to represent ranges instead of numerical values. Machine-readable metadata is stored as attributes to these free-form datasets. All entries in the below table are to be included when data is present. Unused groups (e.g., intracellular_ephys in an optophysiology experiment) should not be created unless there is data to store within them.""",
)
intervals: Optional[List[TimeIntervals]] = Field(
intervals: Optional[NWBFileIntervals] = Field(
None,
description="""Experimental intervals, whether that be logically distinct sub-experiments having a particular scientific goal, trials (see trials subgroup) during an experiment, or epochs (see epochs subgroup) deriving from analysis of data.""",
json_schema_extra={
"linkml_meta": {
"any_of": [
{"range": "TimeIntervals"},
{"range": "TimeIntervals"},
{"range": "TimeIntervals"},
{"range": "TimeIntervals"},
]
}
},
)
units: Optional[Units] = Field(None, description="""Data about sorted spike units.""")
@@ -272,7 +270,7 @@ class NWBFileGeneral(ConfiguredBaseModel):
None,
description="""Description of slices, including information about preparation thickness, orientation, temperature, and bath solution.""",
)
source_script: Optional[NWBFileGeneralSourceScript] = Field(
source_script: Optional[GeneralSourceScript] = Field(
None,
description="""Script file or link to public source code used to create this NWB file.""",
)
@@ -300,10 +298,10 @@ class NWBFileGeneral(ConfiguredBaseModel):
None,
description="""Information about the animal or person from which the data was measured.""",
)
extracellular_ephys: Optional[NWBFileGeneralExtracellularEphys] = Field(
extracellular_ephys: Optional[GeneralExtracellularEphys] = Field(
None, description="""Metadata related to extracellular electrophysiology."""
)
intracellular_ephys: Optional[NWBFileGeneralIntracellularEphys] = Field(
intracellular_ephys: Optional[GeneralIntracellularEphys] = Field(
None, description="""Metadata related to intracellular electrophysiology."""
)
optogenetics: Optional[List[OptogeneticStimulusSite]] = Field(
@@ -318,7 +316,7 @@ class NWBFileGeneral(ConfiguredBaseModel):
)
class NWBFileGeneralSourceScript(ConfiguredBaseModel):
class GeneralSourceScript(ConfiguredBaseModel):
"""
Script file or link to public source code used to create this NWB file.
"""
@@ -331,11 +329,11 @@ class NWBFileGeneralSourceScript(ConfiguredBaseModel):
"linkml_meta": {"equals_string": "source_script", "ifabsent": "string(source_script)"}
},
)
file_name: Optional[str] = Field(None, description="""Name of script file.""")
file_name: str = Field(..., description="""Name of script file.""")
value: str = Field(...)
class NWBFileGeneralExtracellularEphys(ConfiguredBaseModel):
class GeneralExtracellularEphys(ConfiguredBaseModel):
"""
Metadata related to extracellular electrophysiology.
"""
@@ -354,12 +352,12 @@ class NWBFileGeneralExtracellularEphys(ConfiguredBaseModel):
electrode_group: Optional[List[ElectrodeGroup]] = Field(
None, description="""Physical group of electrodes."""
)
electrodes: Optional[NWBFileGeneralExtracellularEphysElectrodes] = Field(
electrodes: Optional[ExtracellularEphysElectrodes] = Field(
None, description="""A table of all electrodes (i.e. channels) used for recording."""
)
class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
class ExtracellularEphysElectrodes(DynamicTable):
"""
A table of all electrodes (i.e. channels) used for recording.
"""
@@ -372,7 +370,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
"linkml_meta": {"equals_string": "electrodes", "ifabsent": "string(electrodes)"}
},
)
x: NDArray[Any, np.float32] = Field(
x: VectorData[NDArray[Any, float]] = Field(
...,
description="""x coordinate of the channel location in the brain (+x is posterior).""",
json_schema_extra={
@@ -381,7 +379,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
}
},
)
y: NDArray[Any, np.float32] = Field(
y: VectorData[NDArray[Any, float]] = Field(
...,
description="""y coordinate of the channel location in the brain (+y is inferior).""",
json_schema_extra={
@@ -390,7 +388,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
}
},
)
z: NDArray[Any, np.float32] = Field(
z: VectorData[NDArray[Any, float]] = Field(
...,
description="""z coordinate of the channel location in the brain (+z is right).""",
json_schema_extra={
@@ -399,7 +397,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
}
},
)
imp: NDArray[Any, np.float32] = Field(
imp: VectorData[NDArray[Any, float]] = Field(
...,
description="""Impedance of the channel.""",
json_schema_extra={
@@ -408,7 +406,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
}
},
)
location: NDArray[Any, str] = Field(
location: VectorData[NDArray[Any, str]] = Field(
...,
description="""Location of the electrode (channel). Specify the area, layer, comments on estimation of area/layer, stereotaxic coordinates if in vivo, etc. Use standard atlas names for anatomical regions when possible.""",
json_schema_extra={
@@ -417,7 +415,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
}
},
)
filtering: NDArray[Any, np.float32] = Field(
filtering: VectorData[NDArray[Any, float]] = Field(
...,
description="""Description of hardware filtering.""",
json_schema_extra={
@@ -429,7 +427,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
group: List[ElectrodeGroup] = Field(
..., description="""Reference to the ElectrodeGroup this electrode is a part of."""
)
group_name: NDArray[Any, str] = Field(
group_name: VectorData[NDArray[Any, str]] = Field(
...,
description="""Name of the ElectrodeGroup this electrode is a part of.""",
json_schema_extra={
@@ -438,7 +436,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
}
},
)
rel_x: Optional[NDArray[Any, np.float32]] = Field(
rel_x: VectorData[Optional[NDArray[Any, float]]] = Field(
None,
description="""x coordinate in electrode group""",
json_schema_extra={
@@ -447,7 +445,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
}
},
)
rel_y: Optional[NDArray[Any, np.float32]] = Field(
rel_y: VectorData[Optional[NDArray[Any, float]]] = Field(
None,
description="""y coordinate in electrode group""",
json_schema_extra={
@@ -456,7 +454,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
}
},
)
rel_z: Optional[NDArray[Any, np.float32]] = Field(
rel_z: VectorData[Optional[NDArray[Any, float]]] = Field(
None,
description="""z coordinate in electrode group""",
json_schema_extra={
@@ -465,7 +463,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
}
},
)
reference: Optional[NDArray[Any, str]] = Field(
reference: VectorData[Optional[NDArray[Any, str]]] = Field(
None,
description="""Description of the reference used for this electrode.""",
json_schema_extra={
@@ -474,14 +472,12 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
}
},
)
colnames: Optional[str] = Field(
None,
colnames: List[str] = Field(
...,
description="""The names of the columns in this table. This should be used to specify an order to the columns.""",
)
description: Optional[str] = Field(
None, description="""Description of what is in this dynamic table."""
)
id: NDArray[Shape["* num_rows"], int] = Field(
description: str = Field(..., description="""Description of what is in this dynamic table.""")
id: VectorData[NDArray[Shape["* num_rows"], int]] = Field(
...,
description="""Array of unique identifiers for the rows of this dynamic table.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_rows"}]}}},
@@ -494,7 +490,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
)
class NWBFileGeneralIntracellularEphys(ConfiguredBaseModel):
class GeneralIntracellularEphys(ConfiguredBaseModel):
"""
Metadata related to intracellular electrophysiology.
"""
@@ -522,6 +518,35 @@ class NWBFileGeneralIntracellularEphys(ConfiguredBaseModel):
)
class NWBFileIntervals(ConfiguredBaseModel):
"""
Experimental intervals, whether that be logically distinct sub-experiments having a particular scientific goal, trials (see trials subgroup) during an experiment, or epochs (see epochs subgroup) deriving from analysis of data.
"""
linkml_meta: ClassVar[LinkMLMeta] = LinkMLMeta({"from_schema": "core.nwb.file"})
name: Literal["intervals"] = Field(
"intervals",
json_schema_extra={
"linkml_meta": {"equals_string": "intervals", "ifabsent": "string(intervals)"}
},
)
epochs: Optional[TimeIntervals] = Field(
None,
description="""Divisions in time marking experimental stages or sub-divisions of a single recording session.""",
)
trials: Optional[TimeIntervals] = Field(
None, description="""Repeated experimental events that have a logical grouping."""
)
invalid_times: Optional[TimeIntervals] = Field(
None, description="""Time intervals that should be removed from analysis."""
)
time_intervals: Optional[List[TimeIntervals]] = Field(
None,
description="""Optional additional table(s) for describing other experimental time intervals.""",
)
class LabMetaData(NWBContainer):
"""
Lab-specific meta-data.
@@ -547,7 +572,7 @@ class Subject(NWBContainer):
age: Optional[str] = Field(
None, description="""Age of subject. Can be supplied instead of 'date_of_birth'."""
)
date_of_birth: Optional[np.datetime64] = Field(
date_of_birth: Optional[datetime] = Field(
None, description="""Date of birth of subject. Can be supplied instead of 'age'."""
)
description: Optional[str] = Field(
@@ -575,9 +600,10 @@ ScratchData.model_rebuild()
NWBFile.model_rebuild()
NWBFileStimulus.model_rebuild()
NWBFileGeneral.model_rebuild()
NWBFileGeneralSourceScript.model_rebuild()
NWBFileGeneralExtracellularEphys.model_rebuild()
NWBFileGeneralExtracellularEphysElectrodes.model_rebuild()
NWBFileGeneralIntracellularEphys.model_rebuild()
GeneralSourceScript.model_rebuild()
GeneralExtracellularEphys.model_rebuild()
ExtracellularEphysElectrodes.model_rebuild()
GeneralIntracellularEphys.model_rebuild()
NWBFileIntervals.model_rebuild()
LabMetaData.model_rebuild()
Subject.model_rebuild()


@@ -11,6 +11,7 @@ from ...core.v2_2_4.core_nwb_base import (
TimeSeriesSync,
NWBContainer,
)
from ...core.v2_2_4.core_nwb_device import Device
from typing import Any, ClassVar, List, Literal, Dict, Optional, Union, Annotated, Type, TypeVar
from pydantic import (
BaseModel,
@@ -42,6 +43,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@@ -67,7 +77,7 @@ ModelType = TypeVar("ModelType", bound=Type[BaseModel])
def _get_name(item: ModelType | dict, info: ValidationInfo) -> Union[ModelType, dict]:
"""Get the name of the slot that refers to this object"""
assert isinstance(item, (BaseModel, dict))
assert isinstance(item, (BaseModel, dict)), f"{item} was not a BaseModel or a dict!"
name = info.field_name
if isinstance(item, BaseModel):
item.name = name
@@ -106,32 +116,46 @@ class PatchClampSeries(TimeSeries):
)
name: str = Field(...)
stimulus_description: Optional[str] = Field(
None, description="""Protocol/stimulus name for this patch-clamp dataset."""
stimulus_description: str = Field(
..., description="""Protocol/stimulus name for this patch-clamp dataset."""
)
sweep_number: Optional[np.uint32] = Field(
sweep_number: Optional[int] = Field(
None, description="""Sweep number, allows to group different PatchClampSeries together."""
)
data: PatchClampSeriesData = Field(..., description="""Recorded voltage or current.""")
gain: Optional[np.float32] = Field(
gain: Optional[float] = Field(
None,
description="""Gain of the recording, in units Volt/Amp (v-clamp) or Volt/Volt (c-clamp).""",
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
electrode: Union[IntracellularElectrode, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "IntracellularElectrode"}, {"range": "string"}],
}
},
)
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -160,11 +184,11 @@ class PatchClampSeriesData(ConfiguredBaseModel):
"data",
json_schema_extra={"linkml_meta": {"equals_string": "data", "ifabsent": "string(data)"}},
)
unit: Optional[str] = Field(
None,
unit: str = Field(
...,
description="""Base unit of measurement for working with the data. Actual stored values are not necessarily stored in these units. To access the data in these units, multiply 'data' by 'conversion'.""",
)
array: Optional[NDArray[Shape["* num_times"], np.number]] = Field(
value: Optional[NDArray[Shape["* num_times"], float]] = Field(
None, json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}}
)
@@ -180,36 +204,50 @@ class CurrentClampSeries(PatchClampSeries):
name: str = Field(...)
data: CurrentClampSeriesData = Field(..., description="""Recorded voltage.""")
bias_current: Optional[np.float32] = Field(None, description="""Bias current, in amps.""")
bridge_balance: Optional[np.float32] = Field(None, description="""Bridge balance, in ohms.""")
capacitance_compensation: Optional[np.float32] = Field(
bias_current: Optional[float] = Field(None, description="""Bias current, in amps.""")
bridge_balance: Optional[float] = Field(None, description="""Bridge balance, in ohms.""")
capacitance_compensation: Optional[float] = Field(
None, description="""Capacitance compensation, in farads."""
)
stimulus_description: Optional[str] = Field(
None, description="""Protocol/stimulus name for this patch-clamp dataset."""
stimulus_description: str = Field(
..., description="""Protocol/stimulus name for this patch-clamp dataset."""
)
sweep_number: Optional[np.uint32] = Field(
sweep_number: Optional[int] = Field(
None, description="""Sweep number, allows to group different PatchClampSeries together."""
)
gain: Optional[np.float32] = Field(
gain: Optional[float] = Field(
None,
description="""Gain of the recording, in units Volt/Amp (v-clamp) or Volt/Volt (c-clamp).""",
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
electrode: Union[IntracellularElectrode, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "IntracellularElectrode"}, {"range": "string"}],
}
},
)
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@ -238,9 +276,10 @@ class CurrentClampSeriesData(ConfiguredBaseModel):
"data",
json_schema_extra={"linkml_meta": {"equals_string": "data", "ifabsent": "string(data)"}},
)
unit: Literal["volts"] = Field(
"volts",
description="""Base unit of measurement for working with the data, which is fixed to 'volts'. Actual stored values are not necessarily stored in these units. To access the data in these units, multiply 'data' by 'conversion'.""",
json_schema_extra={"linkml_meta": {"equals_string": "volts", "ifabsent": "string(volts)"}},
)
value: Any = Field(...)
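The hunks above repeatedly change `unit` from a free-text `Optional[str]` into a `Literal` type whose only legal value is also the default. A minimal sketch of how that behaves in pydantic (`Data` here is a hypothetical stand-in, not part of the generated schema):

```python
from typing import Literal

from pydantic import BaseModel, ValidationError


class Data(BaseModel):
    # Fixed-value attribute: the only legal value doubles as the default.
    unit: Literal["volts"] = "volts"


# Omitting the field fills in the fixed value...
assert Data().unit == "volts"

# ...and any other value is rejected at validation time.
try:
    Data(unit="amperes")
except ValidationError:
    pass
else:
    raise AssertionError("expected ValidationError")
```

This makes the `ifabsent`/`equals_string` constraints from the LinkML schema enforceable by the model itself rather than only documented in `json_schema_extra`.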
@ -255,39 +294,51 @@ class IZeroClampSeries(CurrentClampSeries):
)
name: str = Field(...)
bias_current: float = Field(..., description="""Bias current, in amps, fixed to 0.0.""")
bridge_balance: float = Field(..., description="""Bridge balance, in ohms, fixed to 0.0.""")
capacitance_compensation: float = Field(
..., description="""Capacitance compensation, in farads, fixed to 0.0."""
)
data: CurrentClampSeriesData = Field(..., description="""Recorded voltage.""")
stimulus_description: str = Field(
..., description="""Protocol/stimulus name for this patch-clamp dataset."""
)
sweep_number: Optional[int] = Field(
None, description="""Sweep number, allows to group different PatchClampSeries together."""
)
gain: Optional[float] = Field(
None,
description="""Gain of the recording, in units Volt/Amp (v-clamp) or Volt/Volt (c-clamp).""",
)
electrode: Union[IntracellularElectrode, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "IntracellularElectrode"}, {"range": "string"}],
}
},
)
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@ -316,31 +367,45 @@ class CurrentClampStimulusSeries(PatchClampSeries):
name: str = Field(...)
data: CurrentClampStimulusSeriesData = Field(..., description="""Stimulus current applied.""")
stimulus_description: str = Field(
..., description="""Protocol/stimulus name for this patch-clamp dataset."""
)
sweep_number: Optional[int] = Field(
None, description="""Sweep number, allows to group different PatchClampSeries together."""
)
gain: Optional[float] = Field(
None,
description="""Gain of the recording, in units Volt/Amp (v-clamp) or Volt/Volt (c-clamp).""",
)
electrode: Union[IntracellularElectrode, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "IntracellularElectrode"}, {"range": "string"}],
}
},
)
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@ -369,9 +434,12 @@ class CurrentClampStimulusSeriesData(ConfiguredBaseModel):
"data",
json_schema_extra={"linkml_meta": {"equals_string": "data", "ifabsent": "string(data)"}},
)
unit: Literal["amperes"] = Field(
"amperes",
description="""Base unit of measurement for working with the data, which is fixed to 'amperes'. Actual stored values are not necessarily stored in these units. To access the data in these units, multiply 'data' by 'conversion'.""",
json_schema_extra={
"linkml_meta": {"equals_string": "amperes", "ifabsent": "string(amperes)"}
},
)
value: Any = Field(...)
@ -408,31 +476,45 @@ class VoltageClampSeries(PatchClampSeries):
whole_cell_series_resistance_comp: Optional[VoltageClampSeriesWholeCellSeriesResistanceComp] = (
Field(None, description="""Whole cell series resistance compensation, in ohms.""")
)
stimulus_description: str = Field(
..., description="""Protocol/stimulus name for this patch-clamp dataset."""
)
sweep_number: Optional[int] = Field(
None, description="""Sweep number, allows to group different PatchClampSeries together."""
)
gain: Optional[float] = Field(
None,
description="""Gain of the recording, in units Volt/Amp (v-clamp) or Volt/Volt (c-clamp).""",
)
electrode: Union[IntracellularElectrode, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "IntracellularElectrode"}, {"range": "string"}],
}
},
)
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@ -461,9 +543,12 @@ class VoltageClampSeriesData(ConfiguredBaseModel):
"data",
json_schema_extra={"linkml_meta": {"equals_string": "data", "ifabsent": "string(data)"}},
)
unit: Literal["amperes"] = Field(
"amperes",
description="""Base unit of measurement for working with the data, which is fixed to 'amperes'. Actual stored values are not necessarily stored in these units. To access the data in these units, multiply 'data' by 'conversion'.""",
json_schema_extra={
"linkml_meta": {"equals_string": "amperes", "ifabsent": "string(amperes)"}
},
)
value: Any = Field(...)
@ -484,11 +569,14 @@ class VoltageClampSeriesCapacitanceFast(ConfiguredBaseModel):
}
},
)
unit: Literal["farads"] = Field(
"farads",
description="""Unit of measurement for capacitance_fast, which is fixed to 'farads'.""",
json_schema_extra={
"linkml_meta": {"equals_string": "farads", "ifabsent": "string(farads)"}
},
)
value: float = Field(...)
class VoltageClampSeriesCapacitanceSlow(ConfiguredBaseModel):
@ -507,11 +595,14 @@ class VoltageClampSeriesCapacitanceSlow(ConfiguredBaseModel):
}
},
)
unit: Literal["farads"] = Field(
"farads",
description="""Unit of measurement for capacitance_fast, which is fixed to 'farads'.""",
json_schema_extra={
"linkml_meta": {"equals_string": "farads", "ifabsent": "string(farads)"}
},
)
value: float = Field(...)
class VoltageClampSeriesResistanceCompBandwidth(ConfiguredBaseModel):
@ -530,11 +621,12 @@ class VoltageClampSeriesResistanceCompBandwidth(ConfiguredBaseModel):
}
},
)
unit: Literal["hertz"] = Field(
"hertz",
description="""Unit of measurement for resistance_comp_bandwidth, which is fixed to 'hertz'.""",
json_schema_extra={"linkml_meta": {"equals_string": "hertz", "ifabsent": "string(hertz)"}},
)
value: float = Field(...)
class VoltageClampSeriesResistanceCompCorrection(ConfiguredBaseModel):
@ -553,11 +645,14 @@ class VoltageClampSeriesResistanceCompCorrection(ConfiguredBaseModel):
}
},
)
unit: Literal["percent"] = Field(
"percent",
description="""Unit of measurement for resistance_comp_correction, which is fixed to 'percent'.""",
json_schema_extra={
"linkml_meta": {"equals_string": "percent", "ifabsent": "string(percent)"}
},
)
value: float = Field(...)
class VoltageClampSeriesResistanceCompPrediction(ConfiguredBaseModel):
@ -576,11 +671,14 @@ class VoltageClampSeriesResistanceCompPrediction(ConfiguredBaseModel):
}
},
)
unit: Literal["percent"] = Field(
"percent",
description="""Unit of measurement for resistance_comp_prediction, which is fixed to 'percent'.""",
json_schema_extra={
"linkml_meta": {"equals_string": "percent", "ifabsent": "string(percent)"}
},
)
value: float = Field(...)
class VoltageClampSeriesWholeCellCapacitanceComp(ConfiguredBaseModel):
@ -599,11 +697,14 @@ class VoltageClampSeriesWholeCellCapacitanceComp(ConfiguredBaseModel):
}
},
)
unit: Literal["farads"] = Field(
"farads",
description="""Unit of measurement for whole_cell_capacitance_comp, which is fixed to 'farads'.""",
json_schema_extra={
"linkml_meta": {"equals_string": "farads", "ifabsent": "string(farads)"}
},
)
value: float = Field(...)
class VoltageClampSeriesWholeCellSeriesResistanceComp(ConfiguredBaseModel):
@ -622,11 +723,12 @@ class VoltageClampSeriesWholeCellSeriesResistanceComp(ConfiguredBaseModel):
}
},
)
unit: Literal["ohms"] = Field(
"ohms",
description="""Unit of measurement for whole_cell_series_resistance_comp, which is fixed to 'ohms'.""",
json_schema_extra={"linkml_meta": {"equals_string": "ohms", "ifabsent": "string(ohms)"}},
)
value: float = Field(...)
class VoltageClampStimulusSeries(PatchClampSeries):
@ -640,31 +742,45 @@ class VoltageClampStimulusSeries(PatchClampSeries):
name: str = Field(...)
data: VoltageClampStimulusSeriesData = Field(..., description="""Stimulus voltage applied.""")
stimulus_description: str = Field(
..., description="""Protocol/stimulus name for this patch-clamp dataset."""
)
sweep_number: Optional[int] = Field(
None, description="""Sweep number, allows to group different PatchClampSeries together."""
)
gain: Optional[float] = Field(
None,
description="""Gain of the recording, in units Volt/Amp (v-clamp) or Volt/Volt (c-clamp).""",
)
electrode: Union[IntracellularElectrode, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "IntracellularElectrode"}, {"range": "string"}],
}
},
)
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@ -693,9 +809,10 @@ class VoltageClampStimulusSeriesData(ConfiguredBaseModel):
"data",
json_schema_extra={"linkml_meta": {"equals_string": "data", "ifabsent": "string(data)"}},
)
unit: Literal["volts"] = Field(
"volts",
description="""Base unit of measurement for working with the data, which is fixed to 'volts'. Actual stored values are not necessarily stored in these units. To access the data in these units, multiply 'data' by 'conversion'.""",
json_schema_extra={"linkml_meta": {"equals_string": "volts", "ifabsent": "string(volts)"}},
)
value: Any = Field(...)
@ -726,6 +843,15 @@ class IntracellularElectrode(NWBContainer):
slice: Optional[str] = Field(
None, description="""Information about slice used for recording."""
)
device: Union[Device, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "Device"}, {"range": "string"}],
}
},
)
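The `device` slot above follows the link convention used throughout these models: a slot annotated with `source_type: link` accepts either the target object itself or a string reference to it. A rough illustration of the two forms, using hypothetical stand-in models rather than the generated classes:

```python
from typing import Union

from pydantic import BaseModel


class Device(BaseModel):
    name: str


class Electrode(BaseModel):
    # "link" slot: either an in-memory Device or a path-like string
    # naming the linked object elsewhere in the file.
    device: Union[Device, str]


by_object = Electrode(device=Device(name="amp1"))
by_path = Electrode(device="/general/devices/amp1")

assert isinstance(by_object.device, Device)
assert by_path.device == "/general/devices/amp1"
```

Accepting the string form lets a model round-trip an unresolved HDF5-style reference without requiring the linked object to be instantiated first.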
class SweepTable(DynamicTable):
@ -738,7 +864,7 @@ class SweepTable(DynamicTable):
)
name: str = Field(...)
sweep_number: VectorData[NDArray[Any, int]] = Field(
...,
description="""Sweep number of the PatchClampSeries in that row.""",
json_schema_extra={
@ -754,17 +880,20 @@ class SweepTable(DynamicTable):
...,
description="""Index for series.""",
json_schema_extra={
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
colnames: List[str] = Field(
...,
description="""The names of the columns in this table. This should be used to specify an order to the columns.""",
)
description: str = Field(..., description="""Description of what is in this dynamic table.""")
id: VectorData[NDArray[Shape["* num_rows"], int]] = Field(
...,
description="""Array of unique identifiers for the rows of this dynamic table.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_rows"}]}}},
View file
@ -28,6 +28,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
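The `__getitem__` added to `ConfiguredBaseModel` above lets any generated model that wraps its payload in a `value` or `data` field be indexed directly. A simplified sketch of the same pass-through (`Wrapped` is a hypothetical model; the real base class additionally guards each attribute access with `hasattr`):

```python
from typing import Any, Optional, Union

from pydantic import BaseModel


class Wrapped(BaseModel):
    value: Optional[list] = None
    data: Optional[list] = None

    def __getitem__(self, val: Union[int, slice]) -> Any:
        """Index into `value` if present, falling back to `data`."""
        if self.value is not None:
            return self.value[val]
        elif self.data is not None:
            return self.data[val]
        else:
            raise KeyError("No value or data field to index from")


# Indexing is forwarded to whichever payload field is populated.
assert Wrapped(value=[1, 2, 3])[0] == 1
assert Wrapped(data=[4, 5, 6])[1:] == [5, 6]
```

This keeps container-like models (e.g. the `*SeriesData` classes) sliceable without callers needing to know which attribute holds the array.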
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@ -71,15 +80,15 @@ class GrayscaleImage(Image):
)
name: str = Field(...)
resolution: Optional[float] = Field(
None, description="""Pixel resolution of the image, in pixels per centimeter."""
)
description: Optional[str] = Field(None, description="""Description of the image.""")
value: Optional[
Union[
NDArray[Shape["* x, * y"], float],
NDArray[Shape["* x, * y, 3 r_g_b"], float],
NDArray[Shape["* x, * y, 4 r_g_b_a"], float],
]
] = Field(None)
@ -94,15 +103,15 @@ class RGBImage(Image):
)
name: str = Field(...)
resolution: Optional[float] = Field(
None, description="""Pixel resolution of the image, in pixels per centimeter."""
)
description: Optional[str] = Field(None, description="""Description of the image.""")
value: Optional[
Union[
NDArray[Shape["* x, * y"], float],
NDArray[Shape["* x, * y, 3 r_g_b"], float],
NDArray[Shape["* x, * y, 4 r_g_b_a"], float],
]
] = Field(None)
@ -117,15 +126,15 @@ class RGBAImage(Image):
)
name: str = Field(...)
resolution: Optional[float] = Field(
None, description="""Pixel resolution of the image, in pixels per centimeter."""
)
description: Optional[str] = Field(None, description="""Description of the image.""")
value: Optional[
Union[
NDArray[Shape["* x, * y"], float],
NDArray[Shape["* x, * y, 3 r_g_b"], float],
NDArray[Shape["* x, * y, 4 r_g_b_a"], float],
]
] = Field(None)
@ -142,11 +151,11 @@ class ImageSeries(TimeSeries):
name: str = Field(...)
data: Optional[
Union[
NDArray[Shape["* frame, * x, * y"], float],
NDArray[Shape["* frame, * x, * y, * z"], float],
]
] = Field(None, description="""Binary data representing images across frames.""")
dimension: Optional[NDArray[Shape["* rank"], int]] = Field(
None,
description="""Number of pixels on x, y, (and z) axes.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "rank"}]}}},
@ -159,21 +168,26 @@ class ImageSeries(TimeSeries):
None,
description="""Format of image. If this is 'external', then the attribute 'external_file' contains the path information to the image files. If this is 'raw', then the raw (single-channel) binary data is stored in the 'data' dataset. If this attribute is not present, then the default format='raw' case is assumed.""",
)
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@ -204,11 +218,11 @@ class ImageSeriesExternalFile(ConfiguredBaseModel):
"linkml_meta": {"equals_string": "external_file", "ifabsent": "string(external_file)"}
},
)
starting_frame: List[int] = Field(
...,
description="""Each external image may contain one or more consecutive frames of the full ImageSeries. This attribute serves as an index to indicate which frames each file contains, to facilitate random access. The 'starting_frame' attribute, hence, contains a list of frame numbers within the full ImageSeries of the first frame of each file listed in the parent 'external_file' dataset. Zero-based indexing is used (hence, the first element will always be zero). For example, if the 'external_file' dataset has three paths to files and the first file has 5 frames, the second file has 10 frames, and the third file has 20 frames, then this attribute will have values [0, 5, 15]. If there is a single external file that holds all of the frames of the ImageSeries (and so there is a single element in the 'external_file' dataset), then this attribute should have value [0].""",
)
value: Optional[NDArray[Shape["* num_files"], str]] = Field(
None, json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_files"}]}}}
)
@ -223,13 +237,22 @@ class ImageMaskSeries(ImageSeries):
)
name: str = Field(...)
masked_imageseries: Union[ImageSeries, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "ImageSeries"}, {"range": "string"}],
}
},
)
data: Optional[
Union[
NDArray[Shape["* frame, * x, * y"], float],
NDArray[Shape["* frame, * x, * y, * z"], float],
]
] = Field(None, description="""Binary data representing images across frames.""")
dimension: Optional[NDArray[Shape["* rank"], int]] = Field(
None,
description="""Number of pixels on x, y, (and z) axes.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "rank"}]}}},
@ -242,21 +265,26 @@ class ImageMaskSeries(ImageSeries):
None,
description="""Format of image. If this is 'external', then the attribute 'external_file' contains the path information to the image files. If this is 'raw', then the raw (single-channel) binary data is stored in the 'data' dataset. If this attribute is not present, then the default format='raw' case is assumed.""",
)
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@ -284,24 +312,23 @@ class OpticalSeries(ImageSeries):
)
name: str = Field(...)
distance: Optional[float] = Field(
None, description="""Distance from camera/monitor to target/eye."""
)
field_of_view: Optional[
Union[
NDArray[Shape["2 width_height"], float], NDArray[Shape["3 width_height_depth"], float]
]
] = Field(None, description="""Width, height and depth of image, or imaged area, in meters.""")
data: Union[
NDArray[Shape["* frame, * x, * y"], float],
NDArray[Shape["* frame, * x, * y, 3 r_g_b"], float],
] = Field(..., description="""Images presented to subject, either grayscale or RGB""")
orientation: Optional[str] = Field(
None,
description="""Description of image relative to some reference frame (e.g., which way is up). Must also specify frame of reference.""",
)
dimension: Optional[NDArray[Shape["* rank"], int]] = Field(
None,
description="""Number of pixels on x, y, (and z) axes.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "rank"}]}}},
@ -314,21 +341,26 @@ class OpticalSeries(ImageSeries):
None,
description="""Format of image. If this is 'external', then the attribute 'external_file' contains the path information to the image files. If this is 'raw', then the raw (single-channel) binary data is stored in the 'data' dataset. If this attribute is not present, then the default format='raw' case is assumed.""",
)
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -356,26 +388,40 @@ class IndexSeries(TimeSeries):
)
name: str = Field(...)
data: NDArray[Shape["* num_times"], np.int32] = Field(
data: NDArray[Shape["* num_times"], int] = Field(
...,
description="""Index of the frame in the referenced ImageSeries.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
indexed_timeseries: Union[ImageSeries, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "ImageSeries"}, {"range": "string"}],
}
},
)
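Link slots such as `indexed_timeseries` are now typed `Union[ImageSeries, str]`: a slot can hold either the resolved target object or just a string path to it. A hedged sketch of how a consumer of these models might branch on that union (`LinkedSeries` and `resolve_link` are illustrative names, not part of the generated API):

```python
from dataclasses import dataclass
from typing import Union


@dataclass
class LinkedSeries:
    """Hypothetical stand-in for a linked ImageSeries target."""

    name: str


def resolve_link(link: Union[LinkedSeries, str]) -> str:
    """Links arrive either as the target object or as a path string."""
    if isinstance(link, str):
        return link  # unresolved hdf5-style path reference
    return link.name


by_path = resolve_link("/acquisition/frames")
by_object = resolve_link(LinkedSeries(name="frames"))
```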
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},

View file

@@ -43,6 +43,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
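The new `__getitem__` makes any generated model indexable through whichever of its `value` or `data` fields is populated. A minimal sketch of that fallback (plain classes standing in for the generated pydantic models; the pydantic machinery is omitted):

```python
from typing import Any, Union


class IndexableSketch:
    """Sketch of the __getitem__ fallback added to ConfiguredBaseModel."""

    def __getitem__(self, val: Union[int, slice]) -> Any:
        if hasattr(self, "value") and self.value is not None:
            return self.value[val]
        elif hasattr(self, "data") and self.data is not None:
            return self.data[val]
        else:
            raise KeyError("No value or data field to index from")


class SeriesSketch(IndexableSketch):
    def __init__(self, data=None, value=None):
        self.data = data
        self.value = value


# indexing falls through to whichever field actually holds the array
second = SeriesSketch(data=[10, 20, 30])[1]
head = SeriesSketch(value=[1, 2, 3])[0:2]
```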
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@@ -68,7 +77,7 @@ ModelType = TypeVar("ModelType", bound=Type[BaseModel])
def _get_name(item: ModelType | dict, info: ValidationInfo) -> Union[ModelType, dict]:
"""Get the name of the slot that refers to this object"""
assert isinstance(item, (BaseModel, dict))
assert isinstance(item, (BaseModel, dict)), f"{item} was not a BaseModel or a dict!"
name = info.field_name
if isinstance(item, BaseModel):
item.name = name
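`_get_name` backs the `Named[...]` annotation: when a value is assigned to a slot such as `electrodes_index`, the validator stamps the slot's name onto it. A rough standalone sketch under stated assumptions (`InfoSketch` stands in for pydantic's `ValidationInfo`, and the dict branch is assumed since the hunk is truncated here):

```python
class InfoSketch:
    """Hypothetical stand-in for pydantic's ValidationInfo (field_name only)."""

    def __init__(self, field_name: str):
        self.field_name = field_name


def get_name_sketch(item, info):
    """Mirror of the generated _get_name: stamp the slot's name on the value."""
    if isinstance(item, dict):
        item.setdefault("name", info.field_name)  # assumed dict branch
    else:
        item.name = info.field_name
    return item


named = get_name_sketch(
    {"description": "Index into electrodes."}, InfoSketch("electrodes_index")
)
```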
@@ -120,21 +129,26 @@ class AbstractFeatureSeries(TimeSeries):
description="""Description of the features represented in TimeSeries::data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_features"}]}}},
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -164,13 +178,14 @@ class AbstractFeatureSeriesData(ConfiguredBaseModel):
json_schema_extra={"linkml_meta": {"equals_string": "data", "ifabsent": "string(data)"}},
)
unit: Optional[str] = Field(
None,
"see 'feature_units'",
description="""Since there can be different units for different features, store the units in 'feature_units'. The default value for this attribute is \"see 'feature_units'\".""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(see 'feature_units')"}},
)
array: Optional[
value: Optional[
Union[
NDArray[Shape["* num_times"], np.number],
NDArray[Shape["* num_times, * num_features"], np.number],
NDArray[Shape["* num_times"], float],
NDArray[Shape["* num_times, * num_features"], float],
]
] = Field(None)
@@ -190,21 +205,26 @@ class AnnotationSeries(TimeSeries):
description="""Annotations made during an experiment.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -232,26 +252,31 @@ class IntervalSeries(TimeSeries):
)
name: str = Field(...)
data: NDArray[Shape["* num_times"], np.int8] = Field(
data: NDArray[Shape["* num_times"], int] = Field(
...,
description="""Use values >0 if interval started, <0 if interval ended.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -287,21 +312,35 @@ class DecompositionSeries(TimeSeries):
...,
description="""Table for describing the bands that this series was generated from. There should be one row in this table for each band.""",
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
comments: Optional[str] = Field(
source_timeseries: Optional[Union[TimeSeries, str]] = Field(
None,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "TimeSeries"}, {"range": "string"}],
}
},
)
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -330,11 +369,12 @@ class DecompositionSeriesData(ConfiguredBaseModel):
"data",
json_schema_extra={"linkml_meta": {"equals_string": "data", "ifabsent": "string(data)"}},
)
unit: Optional[str] = Field(
None,
unit: str = Field(
"no unit",
description="""Base unit of measurement for working with the data. Actual stored values are not necessarily stored in these units. To access the data in these units, multiply 'data' by 'conversion'.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no unit)"}},
)
array: Optional[NDArray[Shape["* num_times, * num_channels, * num_bands"], np.number]] = Field(
value: Optional[NDArray[Shape["* num_times, * num_channels, * num_bands"], float]] = Field(
None,
json_schema_extra={
"linkml_meta": {
@@ -361,7 +401,7 @@ class DecompositionSeriesBands(DynamicTable):
"bands",
json_schema_extra={"linkml_meta": {"equals_string": "bands", "ifabsent": "string(bands)"}},
)
band_name: NDArray[Any, str] = Field(
band_name: VectorData[NDArray[Any, str]] = Field(
...,
description="""Name of the band, e.g. theta.""",
json_schema_extra={
@@ -370,7 +410,7 @@
}
},
)
band_limits: NDArray[Shape["* num_bands, 2 low_high"], np.float32] = Field(
band_limits: VectorData[NDArray[Shape["* num_bands, 2 low_high"], float]] = Field(
...,
description="""Low and high limit of each band in Hz. If it is a Gaussian filter, use 2 SD on either side of the center.""",
json_schema_extra={
@@ -384,24 +424,22 @@
}
},
)
band_mean: NDArray[Shape["* num_bands"], np.float32] = Field(
band_mean: VectorData[NDArray[Shape["* num_bands"], float]] = Field(
...,
description="""The mean Gaussian filters, in Hz.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_bands"}]}}},
)
band_stdev: NDArray[Shape["* num_bands"], np.float32] = Field(
band_stdev: VectorData[NDArray[Shape["* num_bands"], float]] = Field(
...,
description="""The standard deviation of Gaussian filters, in Hz.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_bands"}]}}},
)
colnames: Optional[str] = Field(
None,
colnames: List[str] = Field(
...,
description="""The names of the columns in this table. This should be used to specify an order to the columns.""",
)
description: Optional[str] = Field(
None, description="""Description of what is in this dynamic table."""
)
id: NDArray[Shape["* num_rows"], int] = Field(
description: str = Field(..., description="""Description of what is in this dynamic table.""")
id: VectorData[NDArray[Shape["* num_rows"], int]] = Field(
...,
description="""Array of unique identifiers for the rows of this dynamic table.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_rows"}]}}},
@@ -428,7 +466,12 @@ class Units(DynamicTable):
None,
description="""Index into the spike_times dataset.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
spike_times: Optional[UnitsSpikeTimes] = Field(
@@ -438,10 +481,16 @@ class Units(DynamicTable):
None,
description="""Index into the obs_intervals dataset.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
obs_intervals: Optional[NDArray[Shape["* num_intervals, 2 start_end"], np.float64]] = Field(
obs_intervals: VectorData[Optional[NDArray[Shape["* num_intervals, 2 start_end"], float]]] = (
Field(
None,
description="""Observation intervals for each unit.""",
json_schema_extra={
@@ -455,43 +504,56 @@ class Units(DynamicTable):
}
},
)
)
electrodes_index: Named[Optional[VectorIndex]] = Field(
None,
description="""Index into electrodes.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
electrodes: Named[Optional[DynamicTableRegion]] = Field(
None,
description="""Electrode that each spike unit came from, specified using a DynamicTableRegion.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
electrode_group: Optional[List[ElectrodeGroup]] = Field(
None, description="""Electrode group that each spike unit came from."""
)
waveform_mean: Optional[
waveform_mean: VectorData[
Optional[
Union[
NDArray[Shape["* num_units, * num_samples"], np.float32],
NDArray[Shape["* num_units, * num_samples, * num_electrodes"], np.float32],
NDArray[Shape["* num_units, * num_samples"], float],
NDArray[Shape["* num_units, * num_samples, * num_electrodes"], float],
]
]
] = Field(None, description="""Spike waveform mean for each spike unit.""")
waveform_sd: Optional[
waveform_sd: VectorData[
Optional[
Union[
NDArray[Shape["* num_units, * num_samples"], np.float32],
NDArray[Shape["* num_units, * num_samples, * num_electrodes"], np.float32],
NDArray[Shape["* num_units, * num_samples"], float],
NDArray[Shape["* num_units, * num_samples, * num_electrodes"], float],
]
]
] = Field(None, description="""Spike waveform standard deviation for each spike unit.""")
colnames: Optional[str] = Field(
None,
colnames: List[str] = Field(
...,
description="""The names of the columns in this table. This should be used to specify an order to the columns.""",
)
description: Optional[str] = Field(
None, description="""Description of what is in this dynamic table."""
)
id: NDArray[Shape["* num_rows"], int] = Field(
description: str = Field(..., description="""Description of what is in this dynamic table.""")
id: VectorData[NDArray[Shape["* num_rows"], int]] = Field(
...,
description="""Array of unique identifiers for the rows of this dynamic table.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_rows"}]}}},
@@ -517,14 +579,12 @@ class UnitsSpikeTimes(VectorData):
"linkml_meta": {"equals_string": "spike_times", "ifabsent": "string(spike_times)"}
},
)
resolution: Optional[np.float64] = Field(
resolution: Optional[float] = Field(
None,
description="""The smallest possible difference between two spike times. Usually 1 divided by the acquisition sampling rate from which spike times were extracted, but could be larger if the acquisition time series was downsampled or smaller if the acquisition time series was smoothed/interpolated and it is possible for the spike time to be between samples.""",
)
description: Optional[str] = Field(
None, description="""Description of what these vectors represent."""
)
array: Optional[
description: str = Field(..., description="""Description of what these vectors represent.""")
value: Optional[
Union[
NDArray[Shape["* dim0"], Any],
NDArray[Shape["* dim0, * dim1"], Any],

View file

@@ -14,6 +14,7 @@ from ...core.v2_2_4.core_nwb_base import (
TimeSeriesSync,
NWBContainer,
)
from ...core.v2_2_4.core_nwb_device import Device
metamodel_version = "None"
version = "2.2.4"
@@ -33,6 +34,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@@ -76,26 +86,40 @@ class OptogeneticSeries(TimeSeries):
)
name: str = Field(...)
data: NDArray[Shape["* num_times"], np.number] = Field(
data: NDArray[Shape["* num_times"], float] = Field(
...,
description="""Applied power for optogenetic stimulus, in watts.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
site: Union[OptogeneticStimulusSite, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "OptogeneticStimulusSite"}, {"range": "string"}],
}
},
)
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -124,11 +148,20 @@ class OptogeneticStimulusSite(NWBContainer):
name: str = Field(...)
description: str = Field(..., description="""Description of stimulation site.""")
excitation_lambda: np.float32 = Field(..., description="""Excitation wavelength, in nm.""")
excitation_lambda: float = Field(..., description="""Excitation wavelength, in nm.""")
location: str = Field(
...,
description="""Location of the stimulation site. Specify the area, layer, comments on estimation of area/layer, stereotaxic coordinates if in vivo, etc. Use standard atlas names for anatomical regions when possible.""",
)
device: Union[Device, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "Device"}, {"range": "string"}],
}
},
)
# Model rebuild

View file

@@ -21,8 +21,8 @@ from ...hdmf_common.v1_1_3.hdmf_common_table import (
VectorIndex,
VectorData,
)
from ...core.v2_2_4.core_nwb_device import Device
from numpydantic import NDArray, Shape
from ...core.v2_2_4.core_nwb_image import ImageSeries, ImageSeriesExternalFile
from ...core.v2_2_4.core_nwb_base import (
TimeSeriesStartingTime,
TimeSeriesSync,
@ -30,6 +30,7 @@ from ...core.v2_2_4.core_nwb_base import (
NWBDataInterface,
NWBContainer,
)
from ...core.v2_2_4.core_nwb_image import ImageSeries, ImageSeriesExternalFile
metamodel_version = "None"
version = "2.2.4"
@@ -49,6 +50,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@@ -74,7 +84,7 @@ ModelType = TypeVar("ModelType", bound=Type[BaseModel])
def _get_name(item: ModelType | dict, info: ValidationInfo) -> Union[ModelType, dict]:
"""Get the name of the slot that refers to this object"""
assert isinstance(item, (BaseModel, dict))
assert isinstance(item, (BaseModel, dict)), f"{item} was not a BaseModel or a dict!"
name = info.field_name
if isinstance(item, BaseModel):
item.name = name
@@ -114,24 +124,30 @@ class TwoPhotonSeries(ImageSeries):
)
name: str = Field(...)
pmt_gain: Optional[np.float32] = Field(None, description="""Photomultiplier gain.""")
scan_line_rate: Optional[np.float32] = Field(
pmt_gain: Optional[float] = Field(None, description="""Photomultiplier gain.""")
scan_line_rate: Optional[float] = Field(
None,
description="""Lines imaged per second. This is also stored in /general/optophysiology but is kept here as it is useful information for analysis, and so good to be stored w/ the actual data.""",
)
field_of_view: Optional[
Union[
NDArray[Shape["2 width_height"], np.float32],
NDArray[Shape["3 width_height"], np.float32],
]
Union[NDArray[Shape["2 width_height"], float], NDArray[Shape["3 width_height"], float]]
] = Field(None, description="""Width, height and depth of image, or imaged area, in meters.""")
imaging_plane: Union[ImagingPlane, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "ImagingPlane"}, {"range": "string"}],
}
},
)
data: Optional[
Union[
NDArray[Shape["* frame, * x, * y"], np.number],
NDArray[Shape["* frame, * x, * y, * z"], np.number],
NDArray[Shape["* frame, * x, * y"], float],
NDArray[Shape["* frame, * x, * y, * z"], float],
]
] = Field(None, description="""Binary data representing images across frames.""")
dimension: Optional[NDArray[Shape["* rank"], np.int32]] = Field(
dimension: Optional[NDArray[Shape["* rank"], int]] = Field(
None,
description="""Number of pixels on x, y, (and z) axes.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "rank"}]}}},
@@ -144,21 +160,26 @@ class TwoPhotonSeries(ImageSeries):
None,
description="""Format of image. If this is 'external', then the attribute 'external_file' contains the path information to the image files. If this is 'raw', then the raw (single-channel) binary data is stored in the 'data' dataset. If this attribute is not present, then the default format='raw' case is assumed.""",
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -187,31 +208,40 @@ class RoiResponseSeries(TimeSeries):
name: str = Field(...)
data: Union[
NDArray[Shape["* num_times"], np.number],
NDArray[Shape["* num_times, * num_rois"], np.number],
NDArray[Shape["* num_times"], float], NDArray[Shape["* num_times, * num_rois"], float]
] = Field(..., description="""Signals from ROIs.""")
rois: Named[DynamicTableRegion] = Field(
...,
description="""DynamicTableRegion referencing into an ROITable containing information on the ROIs stored in this timeseries.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -238,7 +268,7 @@ class DfOverF(NWBDataInterface):
{"from_schema": "core.nwb.ophys", "tree_root": True}
)
value: Optional[List[RoiResponseSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "RoiResponseSeries"}]}}
)
name: str = Field(...)
@ -253,7 +283,7 @@ class Fluorescence(NWBDataInterface):
{"from_schema": "core.nwb.ophys", "tree_root": True}
)
value: Optional[List[RoiResponseSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "RoiResponseSeries"}]}}
)
name: str = Field(...)
@ -268,7 +298,7 @@ class ImageSegmentation(NWBDataInterface):
{"from_schema": "core.nwb.ophys", "tree_root": True}
)
value: Optional[List[PlaneSegmentation]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "PlaneSegmentation"}]}}
)
name: str = Field(...)
@ -292,7 +322,12 @@ class PlaneSegmentation(DynamicTable):
None,
description="""Index into pixel_mask.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
pixel_mask: Optional[PlaneSegmentationPixelMask] = Field(
@ -303,7 +338,12 @@ class PlaneSegmentation(DynamicTable):
None,
description="""Index into voxel_mask.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
voxel_mask: Optional[PlaneSegmentationVoxelMask] = Field(
@ -315,14 +355,21 @@ class PlaneSegmentation(DynamicTable):
description="""Image stacks that the segmentation masks apply to.""",
json_schema_extra={"linkml_meta": {"any_of": [{"range": "ImageSeries"}]}},
)
imaging_plane: Union[ImagingPlane, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "ImagingPlane"}, {"range": "string"}],
}
},
)
colnames: List[str] = Field(
...,
description="""The names of the columns in this table. This should be used to specify an order to the columns.""",
)
description: str = Field(..., description="""Description of what is in this dynamic table.""")
id: VectorData[NDArray[Shape["* num_rows"], int]] = Field(
...,
description="""Array of unique identifiers for the rows of this dynamic table.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_rows"}]}}},
@ -348,10 +395,8 @@ class PlaneSegmentationImageMask(VectorData):
"linkml_meta": {"equals_string": "image_mask", "ifabsent": "string(image_mask)"}
},
)
description: str = Field(..., description="""Description of what these vectors represent.""")
value: Optional[
Union[
NDArray[Shape["* dim0"], Any],
NDArray[Shape["* dim0, * dim1"], Any],
@ -374,13 +419,23 @@ class PlaneSegmentationPixelMask(VectorData):
"linkml_meta": {"equals_string": "pixel_mask", "ifabsent": "string(pixel_mask)"}
},
)
x: Optional[NDArray[Shape["*"], int]] = Field(
None,
description="""Pixel x-coordinate.""",
json_schema_extra={"linkml_meta": {"array": {"exact_number_dimensions": 1}}},
)
y: Optional[NDArray[Shape["*"], int]] = Field(
None,
description="""Pixel y-coordinate.""",
json_schema_extra={"linkml_meta": {"array": {"exact_number_dimensions": 1}}},
)
weight: Optional[NDArray[Shape["*"], float]] = Field(
None,
description="""Weight of the pixel.""",
json_schema_extra={"linkml_meta": {"array": {"exact_number_dimensions": 1}}},
)
description: str = Field(..., description="""Description of what these vectors represent.""")
value: Optional[
Union[
NDArray[Shape["* dim0"], Any],
NDArray[Shape["* dim0, * dim1"], Any],
@ -403,14 +458,28 @@ class PlaneSegmentationVoxelMask(VectorData):
"linkml_meta": {"equals_string": "voxel_mask", "ifabsent": "string(voxel_mask)"}
},
)
x: Optional[NDArray[Shape["*"], int]] = Field(
None,
description="""Voxel x-coordinate.""",
json_schema_extra={"linkml_meta": {"array": {"exact_number_dimensions": 1}}},
)
y: Optional[NDArray[Shape["*"], int]] = Field(
None,
description="""Voxel y-coordinate.""",
json_schema_extra={"linkml_meta": {"array": {"exact_number_dimensions": 1}}},
)
z: Optional[NDArray[Shape["*"], int]] = Field(
None,
description="""Voxel z-coordinate.""",
json_schema_extra={"linkml_meta": {"array": {"exact_number_dimensions": 1}}},
)
weight: Optional[NDArray[Shape["*"], float]] = Field(
None,
description="""Weight of the voxel.""",
json_schema_extra={"linkml_meta": {"array": {"exact_number_dimensions": 1}}},
)
description: str = Field(..., description="""Description of what these vectors represent.""")
value: Optional[
Union[
NDArray[Shape["* dim0"], Any],
NDArray[Shape["* dim0, * dim1"], Any],
@ -429,10 +498,143 @@ class ImagingPlane(NWBContainer):
{"from_schema": "core.nwb.ophys", "tree_root": True}
)
name: str = Field(...)
description: Optional[str] = Field(None, description="""Description of the imaging plane.""")
excitation_lambda: float = Field(..., description="""Excitation wavelength, in nm.""")
imaging_rate: Optional[float] = Field(
None,
description="""Rate that images are acquired, in Hz. If the corresponding TimeSeries is present, the rate should be stored there instead.""",
)
indicator: str = Field(..., description="""Calcium indicator.""")
location: str = Field(
...,
description="""Location of the imaging plane. Specify the area, layer, comments on estimation of area/layer, stereotaxic coordinates if in vivo, etc. Use standard atlas names for anatomical regions when possible.""",
)
manifold: Optional[ImagingPlaneManifold] = Field(
None,
description="""DEPRECATED Physical position of each pixel. 'xyz' represents the position of the pixel relative to the defined coordinate space. Deprecated in favor of origin_coords and grid_spacing.""",
)
origin_coords: Optional[ImagingPlaneOriginCoords] = Field(
None,
description="""Physical location of the first element of the imaging plane (0, 0) for 2-D data or (0, 0, 0) for 3-D data. See also reference_frame for what the physical location is relative to (e.g., bregma).""",
)
grid_spacing: Optional[ImagingPlaneGridSpacing] = Field(
None,
description="""Space between pixels in (x, y) or voxels in (x, y, z) directions, in the specified unit. Assumes imaging plane is a regular grid. See also reference_frame to interpret the grid.""",
)
reference_frame: Optional[str] = Field(
None,
description="""Describes reference frame of origin_coords and grid_spacing. For example, this can be a text description of the anatomical location and orientation of the grid defined by origin_coords and grid_spacing or the vectors needed to transform or rotate the grid to a common anatomical axis (e.g., AP/DV/ML). This field is necessary to interpret origin_coords and grid_spacing. If origin_coords and grid_spacing are not present, then this field is not required. For example, if the microscope takes 10 x 10 x 2 images, where the first value of the data matrix (index (0, 0, 0)) corresponds to (-1.2, -0.6, -2) mm relative to bregma, the spacing between pixels is 0.2 mm in x, 0.2 mm in y and 0.5 mm in z, and larger numbers in x means more anterior, larger numbers in y means more rightward, and larger numbers in z means more ventral, then enter the following -- origin_coords = (-1.2, -0.6, -2) grid_spacing = (0.2, 0.2, 0.5) reference_frame = \"Origin coordinates are relative to bregma. First dimension corresponds to anterior-posterior axis (larger index = more anterior). Second dimension corresponds to medial-lateral axis (larger index = more rightward). Third dimension corresponds to dorsal-ventral axis (larger index = more ventral).\"""",
)
optical_channel: List[OpticalChannel] = Field(
..., description="""An optical channel used to record from an imaging plane."""
)
device: Union[Device, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "Device"}, {"range": "string"}],
}
},
)
class ImagingPlaneManifold(ConfiguredBaseModel):
"""
DEPRECATED Physical position of each pixel. 'xyz' represents the position of the pixel relative to the defined coordinate space. Deprecated in favor of origin_coords and grid_spacing.
"""
linkml_meta: ClassVar[LinkMLMeta] = LinkMLMeta({"from_schema": "core.nwb.ophys"})
name: Literal["manifold"] = Field(
"manifold",
json_schema_extra={
"linkml_meta": {"equals_string": "manifold", "ifabsent": "string(manifold)"}
},
)
conversion: Optional[float] = Field(
1.0,
description="""Scalar to multiply each element in data to convert it to the specified 'unit'. If the data are stored in acquisition system units or other units that require a conversion to be interpretable, multiply the data by 'conversion' to convert the data to the specified 'unit'. e.g. if the data acquisition system stores values in this object as pixels from x = -500 to 499, y = -500 to 499 that correspond to a 2 m x 2 m range, then the 'conversion' multiplier to get from raw data acquisition pixel units to meters is 2/1000.""",
json_schema_extra={"linkml_meta": {"ifabsent": "float(1.0)"}},
)
unit: Optional[str] = Field(
"meters",
description="""Base unit of measurement for working with the data. The default value is 'meters'.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(meters)"}},
)
value: Optional[
Union[
NDArray[Shape["* height, * width, 3 x_y_z"], float],
NDArray[Shape["* height, * width, * depth, 3 x_y_z"], float],
]
] = Field(None)
class ImagingPlaneOriginCoords(ConfiguredBaseModel):
"""
Physical location of the first element of the imaging plane (0, 0) for 2-D data or (0, 0, 0) for 3-D data. See also reference_frame for what the physical location is relative to (e.g., bregma).
"""
linkml_meta: ClassVar[LinkMLMeta] = LinkMLMeta({"from_schema": "core.nwb.ophys"})
name: Literal["origin_coords"] = Field(
"origin_coords",
json_schema_extra={
"linkml_meta": {"equals_string": "origin_coords", "ifabsent": "string(origin_coords)"}
},
)
unit: str = Field(
"meters",
description="""Measurement units for origin_coords. The default value is 'meters'.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(meters)"}},
)
value: Optional[NDArray[Shape["2 x_y, 3 x_y_z"], float]] = Field(
None,
json_schema_extra={
"linkml_meta": {
"array": {
"dimensions": [
{"alias": "x_y", "exact_cardinality": 2},
{"alias": "x_y_z", "exact_cardinality": 3},
]
}
}
},
)
class ImagingPlaneGridSpacing(ConfiguredBaseModel):
"""
Space between pixels in (x, y) or voxels in (x, y, z) directions, in the specified unit. Assumes imaging plane is a regular grid. See also reference_frame to interpret the grid.
"""
linkml_meta: ClassVar[LinkMLMeta] = LinkMLMeta({"from_schema": "core.nwb.ophys"})
name: Literal["grid_spacing"] = Field(
"grid_spacing",
json_schema_extra={
"linkml_meta": {"equals_string": "grid_spacing", "ifabsent": "string(grid_spacing)"}
},
)
unit: str = Field(
"meters",
description="""Measurement units for grid_spacing. The default value is 'meters'.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(meters)"}},
)
value: Optional[NDArray[Shape["2 x_y, 3 x_y_z"], float]] = Field(
None,
json_schema_extra={
"linkml_meta": {
"array": {
"dimensions": [
{"alias": "x_y", "exact_cardinality": 2},
{"alias": "x_y_z", "exact_cardinality": 3},
]
}
}
},
)
class OpticalChannel(NWBContainer):
@ -446,9 +648,7 @@ class OpticalChannel(NWBContainer):
name: str = Field(...)
description: str = Field(..., description="""Description or other notes about the channel.""")
emission_lambda: float = Field(..., description="""Emission wavelength for channel, in nm.""")
class MotionCorrection(NWBDataInterface):
@ -460,7 +660,7 @@ class MotionCorrection(NWBDataInterface):
{"from_schema": "core.nwb.ophys", "tree_root": True}
)
value: Optional[List[CorrectedImageStack]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "CorrectedImageStack"}]}}
)
name: str = Field(...)
@ -483,6 +683,15 @@ class CorrectedImageStack(NWBDataInterface):
...,
description="""Stores the x,y delta necessary to align each frame to the common coordinates, for example, to align each frame to a reference image.""",
)
original: Union[ImageSeries, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "ImageSeries"}, {"range": "string"}],
}
},
)
# Model rebuild
@ -497,6 +706,9 @@ PlaneSegmentationImageMask.model_rebuild()
PlaneSegmentationPixelMask.model_rebuild()
PlaneSegmentationVoxelMask.model_rebuild()
ImagingPlane.model_rebuild()
ImagingPlaneManifold.model_rebuild()
ImagingPlaneOriginCoords.model_rebuild()
ImagingPlaneGridSpacing.model_rebuild()
OpticalChannel.model_rebuild()
MotionCorrection.model_rebuild()
CorrectedImageStack.model_rebuild()


@ -28,6 +28,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@ -127,17 +136,13 @@ class ImagingRetinotopyAxis1PhaseMap(ConfiguredBaseModel):
}
},
)
dimension: List[int] = Field(
...,
description="""Number of rows and columns in the image. NOTE: row, column representation is equivalent to height, width.""",
)
field_of_view: List[float] = Field(..., description="""Size of viewing area, in meters.""")
unit: str = Field(..., description="""Unit that axis data is stored in (e.g., degrees).""")
value: Optional[NDArray[Shape["* num_rows, * num_cols"], float]] = Field(
None,
json_schema_extra={
"linkml_meta": {"array": {"dimensions": [{"alias": "num_rows"}, {"alias": "num_cols"}]}}
@ -161,17 +166,13 @@ class ImagingRetinotopyAxis1PowerMap(ConfiguredBaseModel):
}
},
)
dimension: List[int] = Field(
...,
description="""Number of rows and columns in the image. NOTE: row, column representation is equivalent to height, width.""",
)
field_of_view: List[float] = Field(..., description="""Size of viewing area, in meters.""")
unit: str = Field(..., description="""Unit that axis data is stored in (e.g., degrees).""")
value: Optional[NDArray[Shape["* num_rows, * num_cols"], float]] = Field(
None,
json_schema_extra={
"linkml_meta": {"array": {"dimensions": [{"alias": "num_rows"}, {"alias": "num_cols"}]}}
@ -195,17 +196,13 @@ class ImagingRetinotopyAxis2PhaseMap(ConfiguredBaseModel):
}
},
)
dimension: List[int] = Field(
...,
description="""Number of rows and columns in the image. NOTE: row, column representation is equivalent to height, width.""",
)
field_of_view: List[float] = Field(..., description="""Size of viewing area, in meters.""")
unit: str = Field(..., description="""Unit that axis data is stored in (e.g., degrees).""")
value: Optional[NDArray[Shape["* num_rows, * num_cols"], float]] = Field(
None,
json_schema_extra={
"linkml_meta": {"array": {"dimensions": [{"alias": "num_rows"}, {"alias": "num_cols"}]}}
@ -229,17 +226,13 @@ class ImagingRetinotopyAxis2PowerMap(ConfiguredBaseModel):
}
},
)
dimension: List[int] = Field(
...,
description="""Number of rows and columns in the image. NOTE: row, column representation is equivalent to height, width.""",
)
field_of_view: List[float] = Field(..., description="""Size of viewing area, in meters.""")
unit: str = Field(..., description="""Unit that axis data is stored in (e.g., degrees).""")
value: Optional[NDArray[Shape["* num_rows, * num_cols"], float]] = Field(
None,
json_schema_extra={
"linkml_meta": {"array": {"dimensions": [{"alias": "num_rows"}, {"alias": "num_cols"}]}}
@ -263,24 +256,18 @@ class ImagingRetinotopyFocalDepthImage(ConfiguredBaseModel):
}
},
)
bits_per_pixel: int = Field(
...,
description="""Number of bits used to represent each value. This is necessary to determine maximum (white) pixel value.""",
)
dimension: List[int] = Field(
...,
description="""Number of rows and columns in the image. NOTE: row, column representation is equivalent to height, width.""",
)
field_of_view: List[float] = Field(..., description="""Size of viewing area, in meters.""")
focal_depth: float = Field(..., description="""Focal depth offset, in meters.""")
format: str = Field(..., description="""Format of image. Right now only 'raw' is supported.""")
value: Optional[NDArray[Shape["* num_rows, * num_cols"], int]] = Field(
None,
json_schema_extra={
"linkml_meta": {"array": {"dimensions": [{"alias": "num_rows"}, {"alias": "num_cols"}]}}
@ -301,14 +288,12 @@ class ImagingRetinotopySignMap(ConfiguredBaseModel):
"linkml_meta": {"equals_string": "sign_map", "ifabsent": "string(sign_map)"}
},
)
dimension: List[int] = Field(
...,
description="""Number of rows and columns in the image. NOTE: row, column representation is equivalent to height, width.""",
)
field_of_view: List[float] = Field(..., description="""Size of viewing area, in meters.""")
value: Optional[NDArray[Shape["* num_rows, * num_cols"], float]] = Field(
None,
json_schema_extra={
"linkml_meta": {"array": {"dimensions": [{"alias": "num_rows"}, {"alias": "num_cols"}]}}
@ -332,21 +317,17 @@ class ImagingRetinotopyVasculatureImage(ConfiguredBaseModel):
}
},
)
bits_per_pixel: int = Field(
...,
description="""Number of bits used to represent each value. This is necessary to determine maximum (white) pixel value""",
)
dimension: List[int] = Field(
...,
description="""Number of rows and columns in the image. NOTE: row, column representation is equivalent to height, width.""",
)
field_of_view: List[float] = Field(..., description="""Size of viewing area, in meters.""")
format: str = Field(..., description="""Format of image. Right now only 'raw' is supported.""")
value: Optional[NDArray[Shape["* num_rows, * num_cols"], int]] = Field(
None,
json_schema_extra={
"linkml_meta": {"array": {"dimensions": [{"alias": "num_rows"}, {"alias": "num_cols"}]}}


@ -56,6 +56,9 @@ from ...core.v2_2_4.core_nwb_ophys import (
PlaneSegmentationPixelMask,
PlaneSegmentationVoxelMask,
ImagingPlane,
ImagingPlaneManifold,
ImagingPlaneOriginCoords,
ImagingPlaneGridSpacing,
OpticalChannel,
MotionCorrection,
CorrectedImageStack,
@ -134,10 +137,11 @@ from ...core.v2_2_4.core_nwb_file import (
NWBFile,
NWBFileStimulus,
NWBFileGeneral,
GeneralSourceScript,
GeneralExtracellularEphys,
ExtracellularEphysElectrodes,
GeneralIntracellularEphys,
NWBFileIntervals,
LabMetaData,
Subject,
)
@ -161,6 +165,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}


@ -28,6 +28,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@ -83,15 +92,15 @@ class Image(NWBData):
)
name: str = Field(...)
resolution: Optional[float] = Field(
None, description="""Pixel resolution of the image, in pixels per centimeter."""
)
description: Optional[str] = Field(None, description="""Description of the image.""")
value: Optional[
Union[
NDArray[Shape["* x, * y"], np.number],
NDArray[Shape["* x, * y, 3 r_g_b"], np.number],
NDArray[Shape["* x, * y, 4 r_g_b_a"], np.number],
NDArray[Shape["* x, * y"], float],
NDArray[Shape["* x, * y, 3 r_g_b"], float],
NDArray[Shape["* x, * y, 4 r_g_b_a"], float],
]
] = Field(None)
@ -130,10 +139,15 @@ class TimeSeries(NWBDataInterface):
)
name: str = Field(...)
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
data: TimeSeriesData = Field(
...,
@ -143,12 +157,12 @@ class TimeSeries(NWBDataInterface):
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@ -177,19 +191,21 @@ class TimeSeriesData(ConfiguredBaseModel):
"data",
json_schema_extra={"linkml_meta": {"equals_string": "data", "ifabsent": "string(data)"}},
)
conversion: Optional[float] = Field(
1.0,
description="""Scalar to multiply each element in data to convert it to the specified 'unit'. If the data are stored in acquisition system units or other units that require a conversion to be interpretable, multiply the data by 'conversion' to convert the data to the specified 'unit'. e.g. if the data acquisition system stores values in this object as signed 16-bit integers (int16 range -32,768 to 32,767) that correspond to a 5V range (-2.5V to 2.5V), and the data acquisition system gain is 8000X, then the 'conversion' multiplier to get from raw data acquisition values to recorded volts is 2.5/32768/8000 = 9.5367e-9.""",
json_schema_extra={"linkml_meta": {"ifabsent": "float(1.0)"}},
)
resolution: Optional[float] = Field(
-1.0,
description="""Smallest meaningful difference between values in data, stored in the specified by unit, e.g., the change in value of the least significant bit, or a larger number if signal noise is known to be present. If unknown, use -1.0.""",
json_schema_extra={"linkml_meta": {"ifabsent": "float(-1.0)"}},
)
unit: str = Field(
...,
description="""Base unit of measurement for working with the data. Actual stored values are not necessarily stored in these units. To access the data in these units, multiply 'data' by 'conversion'.""",
)
value: Optional[
Union[
NDArray[Shape["* num_times"], Any],
NDArray[Shape["* num_times, * num_dim2"], Any],
@ -212,11 +228,15 @@ class TimeSeriesStartingTime(ConfiguredBaseModel):
"linkml_meta": {"equals_string": "starting_time", "ifabsent": "string(starting_time)"}
},
)
rate: float = Field(..., description="""Sampling rate, in Hz.""")
unit: Literal["seconds"] = Field(
"seconds",
description="""Unit of measurement for time, which is fixed to 'seconds'.""",
json_schema_extra={
"linkml_meta": {"equals_string": "seconds", "ifabsent": "string(seconds)"}
},
)
value: float = Field(...)
class TimeSeriesSync(ConfiguredBaseModel):
@ -241,7 +261,7 @@ class ProcessingModule(NWBContainer):
{"from_schema": "core.nwb.base", "tree_root": True}
)
value: Optional[List[Union[DynamicTable, NWBDataInterface]]] = Field(
None,
json_schema_extra={
"linkml_meta": {"any_of": [{"range": "NWBDataInterface"}, {"range": "DynamicTable"}]}
@ -260,9 +280,7 @@ class Images(NWBDataInterface):
)
name: str = Field("Images", json_schema_extra={"linkml_meta": {"ifabsent": "string(Images)"}})
description: str = Field(..., description="""Description of this collection of images.""")
image: List[Image] = Field(..., description="""Images stored in this collection.""")


@ -34,6 +34,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@ -84,21 +93,26 @@ class SpatialSeries(TimeSeries):
reference_frame: Optional[str] = Field(
None, description="""Description defining what exactly 'straight-ahead' means."""
)
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -128,13 +142,14 @@ class SpatialSeriesData(ConfiguredBaseModel):
json_schema_extra={"linkml_meta": {"equals_string": "data", "ifabsent": "string(data)"}},
)
unit: Optional[str] = Field(
None,
"meters",
description="""Base unit of measurement for working with the data. The default value is 'meters'. Actual stored values are not necessarily stored in these units. To access the data in these units, multiply 'data' by 'conversion'.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(meters)"}},
)
array: Optional[
value: Optional[
Union[
NDArray[Shape["* num_times"], np.number],
NDArray[Shape["* num_times, * num_features"], np.number],
NDArray[Shape["* num_times"], float],
NDArray[Shape["* num_times, * num_features"], float],
]
] = Field(None)
@@ -148,7 +163,7 @@ class BehavioralEpochs(NWBDataInterface):
{"from_schema": "core.nwb.behavior", "tree_root": True}
)
children: Optional[List[IntervalSeries]] = Field(
value: Optional[List[IntervalSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "IntervalSeries"}]}}
)
name: str = Field(...)
@@ -163,7 +178,7 @@ class BehavioralEvents(NWBDataInterface):
{"from_schema": "core.nwb.behavior", "tree_root": True}
)
children: Optional[List[TimeSeries]] = Field(
value: Optional[List[TimeSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "TimeSeries"}]}}
)
name: str = Field(...)
@@ -178,7 +193,7 @@ class BehavioralTimeSeries(NWBDataInterface):
{"from_schema": "core.nwb.behavior", "tree_root": True}
)
children: Optional[List[TimeSeries]] = Field(
value: Optional[List[TimeSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "TimeSeries"}]}}
)
name: str = Field(...)
@@ -193,7 +208,7 @@ class PupilTracking(NWBDataInterface):
{"from_schema": "core.nwb.behavior", "tree_root": True}
)
children: Optional[List[TimeSeries]] = Field(
value: Optional[List[TimeSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "TimeSeries"}]}}
)
name: str = Field(...)
@@ -208,7 +223,7 @@ class EyeTracking(NWBDataInterface):
{"from_schema": "core.nwb.behavior", "tree_root": True}
)
children: Optional[List[SpatialSeries]] = Field(
value: Optional[List[SpatialSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "SpatialSeries"}]}}
)
name: str = Field(...)
@@ -223,7 +238,7 @@ class CompassDirection(NWBDataInterface):
{"from_schema": "core.nwb.behavior", "tree_root": True}
)
children: Optional[List[SpatialSeries]] = Field(
value: Optional[List[SpatialSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "SpatialSeries"}]}}
)
name: str = Field(...)
@@ -238,7 +253,7 @@ class Position(NWBDataInterface):
{"from_schema": "core.nwb.behavior", "tree_root": True}
)
children: Optional[List[SpatialSeries]] = Field(
value: Optional[List[SpatialSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "SpatialSeries"}]}}
)
name: str = Field(...)


@@ -27,6 +27,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
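The `__getitem__` added to `ConfiguredBaseModel` in this hunk forwards indexing to a `value` or `data` field when present. A minimal standalone sketch of that behavior (the `IndexableModel` name is illustrative, not one of the generated classes):

```python
from typing import Any, Optional, Union

from pydantic import BaseModel


class IndexableModel(BaseModel):
    """Minimal sketch of the __getitem__ forwarding added to ConfiguredBaseModel."""

    value: Optional[list] = None
    data: Optional[list] = None

    def __getitem__(self, val: Union[int, slice]) -> Any:
        # Prefer "value", fall back to "data", otherwise refuse to index
        if self.value is not None:
            return self.value[val]
        elif self.data is not None:
            return self.data[val]
        else:
            raise KeyError("No value or data field to index from")


series = IndexableModel(data=[10, 20, 30])
print(series[1])  # → 20
```

This lets container-like models be indexed directly (`model[0]`, `model[1:3]`) instead of reaching into `.value` or `.data` by hand.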


@@ -16,6 +16,7 @@ from pydantic import (
ValidationInfo,
BeforeValidator,
)
from ...core.v2_2_5.core_nwb_device import Device
from ...core.v2_2_5.core_nwb_base import (
TimeSeries,
TimeSeriesStartingTime,
@@ -43,6 +44,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@@ -68,7 +78,7 @@ ModelType = TypeVar("ModelType", bound=Type[BaseModel])
def _get_name(item: ModelType | dict, info: ValidationInfo) -> Union[ModelType, dict]:
"""Get the name of the slot that refers to this object"""
assert isinstance(item, (BaseModel, dict))
assert isinstance(item, (BaseModel, dict)), f"{item} was not a BaseModel or a dict!"
name = info.field_name
if isinstance(item, BaseModel):
item.name = name
@@ -108,37 +118,47 @@ class ElectricalSeries(TimeSeries):
name: str = Field(...)
data: Union[
NDArray[Shape["* num_times"], np.number],
NDArray[Shape["* num_times, * num_channels"], np.number],
NDArray[Shape["* num_times, * num_channels, * num_samples"], np.number],
NDArray[Shape["* num_times"], float],
NDArray[Shape["* num_times, * num_channels"], float],
NDArray[Shape["* num_times, * num_channels, * num_samples"], float],
] = Field(..., description="""Recorded voltage data.""")
electrodes: Named[DynamicTableRegion] = Field(
...,
description="""DynamicTableRegion pointer to the electrodes that this time series was generated from.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
channel_conversion: Optional[NDArray[Shape["* num_channels"], np.float32]] = Field(
channel_conversion: Optional[NDArray[Shape["* num_channels"], float]] = Field(
None,
description="""Channel-specific conversion factor. Multiply the data in the 'data' dataset by these values along the channel axis (as indicated by axis attribute) AND by the global conversion factor in the 'conversion' attribute of 'data' to get the data values in Volts, i.e, data in Volts = data * data.conversion * channel_conversion. This approach allows for both global and per-channel data conversion factors needed to support the storage of electrical recordings as native values generated by data acquisition systems. If this dataset is not present, then there is no channel-specific conversion factor, i.e. it is 1 for all channels.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_channels"}]}}},
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -167,10 +187,10 @@ class SpikeEventSeries(ElectricalSeries):
name: str = Field(...)
data: Union[
NDArray[Shape["* num_events, * num_samples"], np.number],
NDArray[Shape["* num_events, * num_channels, * num_samples"], np.number],
NDArray[Shape["* num_events, * num_samples"], float],
NDArray[Shape["* num_events, * num_channels, * num_samples"], float],
] = Field(..., description="""Spike waveforms.""")
timestamps: NDArray[Shape["* num_times"], np.float64] = Field(
timestamps: NDArray[Shape["* num_times"], float] = Field(
...,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time. Timestamps are required for the events. Unlike for TimeSeries, timestamps are required for SpikeEventSeries and are thus re-specified here.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -179,24 +199,34 @@ class SpikeEventSeries(ElectricalSeries):
...,
description="""DynamicTableRegion pointer to the electrodes that this time series was generated from.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
channel_conversion: Optional[NDArray[Shape["* num_channels"], np.float32]] = Field(
channel_conversion: Optional[NDArray[Shape["* num_channels"], float]] = Field(
None,
description="""Channel-specific conversion factor. Multiply the data in the 'data' dataset by these values along the channel axis (as indicated by axis attribute) AND by the global conversion factor in the 'conversion' attribute of 'data' to get the data values in Volts, i.e, data in Volts = data * data.conversion * channel_conversion. This approach allows for both global and per-channel data conversion factors needed to support the storage of electrical recordings as native values generated by data acquisition systems. If this dataset is not present, then there is no channel-specific conversion factor, i.e. it is 1 for all channels.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_channels"}]}}},
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -232,7 +262,7 @@ class FeatureExtraction(NWBDataInterface):
description="""Description of features (eg, ''PC1'') for each of the extracted features.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_features"}]}}},
)
features: NDArray[Shape["* num_events, * num_channels, * num_features"], np.float32] = Field(
features: NDArray[Shape["* num_events, * num_channels, * num_features"], float] = Field(
...,
description="""Multi-dimensional array of features extracted from each event.""",
json_schema_extra={
@@ -247,7 +277,7 @@ class FeatureExtraction(NWBDataInterface):
}
},
)
times: NDArray[Shape["* num_events"], np.float64] = Field(
times: NDArray[Shape["* num_events"], float] = Field(
...,
description="""Times of events that features correspond to (can be a link).""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_events"}]}}},
@@ -256,7 +286,12 @@ class FeatureExtraction(NWBDataInterface):
...,
description="""DynamicTableRegion pointer to the electrodes that this time series was generated from.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
@@ -277,16 +312,25 @@ class EventDetection(NWBDataInterface):
...,
description="""Description of how events were detected, such as voltage threshold, or dV/dT threshold, as well as relevant values.""",
)
source_idx: NDArray[Shape["* num_events"], np.int32] = Field(
source_idx: NDArray[Shape["* num_events"], int] = Field(
...,
description="""Indices (zero-based) into source ElectricalSeries::data array corresponding to time of event. ''description'' should define what is meant by time of event (e.g., .25 ms before action potential peak, zero-crossing time, etc). The index points to each event from the raw data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_events"}]}}},
)
times: NDArray[Shape["* num_events"], np.float64] = Field(
times: NDArray[Shape["* num_events"], float] = Field(
...,
description="""Timestamps of events, in seconds.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_events"}]}}},
)
source_electricalseries: Union[ElectricalSeries, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "ElectricalSeries"}, {"range": "string"}],
}
},
)
class EventWaveform(NWBDataInterface):
@@ -298,7 +342,7 @@ class EventWaveform(NWBDataInterface):
{"from_schema": "core.nwb.ecephys", "tree_root": True}
)
children: Optional[List[SpikeEventSeries]] = Field(
value: Optional[List[SpikeEventSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "SpikeEventSeries"}]}}
)
name: str = Field(...)
@@ -313,7 +357,7 @@ class FilteredEphys(NWBDataInterface):
{"from_schema": "core.nwb.ecephys", "tree_root": True}
)
children: Optional[List[ElectricalSeries]] = Field(
value: Optional[List[ElectricalSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "ElectricalSeries"}]}}
)
name: str = Field(...)
@@ -328,7 +372,7 @@ class LFP(NWBDataInterface):
{"from_schema": "core.nwb.ecephys", "tree_root": True}
)
children: Optional[List[ElectricalSeries]] = Field(
value: Optional[List[ElectricalSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "ElectricalSeries"}]}}
)
name: str = Field(...)
@@ -344,14 +388,23 @@ class ElectrodeGroup(NWBContainer):
)
name: str = Field(...)
description: Optional[str] = Field(None, description="""Description of this electrode group.""")
location: Optional[str] = Field(
None,
description: str = Field(..., description="""Description of this electrode group.""")
location: str = Field(
...,
description="""Location of electrode group. Specify the area, layer, comments on estimation of area/layer, etc. Use standard atlas names for anatomical regions when possible.""",
)
position: Optional[ElectrodeGroupPosition] = Field(
None, description="""stereotaxic or common framework coordinates"""
)
device: Union[Device, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "Device"}, {"range": "string"}],
}
},
)
class ElectrodeGroupPosition(ConfiguredBaseModel):
@@ -367,9 +420,21 @@ class ElectrodeGroupPosition(ConfiguredBaseModel):
"linkml_meta": {"equals_string": "position", "ifabsent": "string(position)"}
},
)
x: Optional[np.float32] = Field(None, description="""x coordinate""")
y: Optional[np.float32] = Field(None, description="""y coordinate""")
z: Optional[np.float32] = Field(None, description="""z coordinate""")
x: Optional[NDArray[Shape["*"], float]] = Field(
None,
description="""x coordinate""",
json_schema_extra={"linkml_meta": {"array": {"exact_number_dimensions": 1}}},
)
y: Optional[NDArray[Shape["*"], float]] = Field(
None,
description="""y coordinate""",
json_schema_extra={"linkml_meta": {"array": {"exact_number_dimensions": 1}}},
)
z: Optional[NDArray[Shape["*"], float]] = Field(
None,
description="""z coordinate""",
json_schema_extra={"linkml_meta": {"array": {"exact_number_dimensions": 1}}},
)
class ClusterWaveforms(NWBDataInterface):
@@ -388,7 +453,7 @@ class ClusterWaveforms(NWBDataInterface):
waveform_filtering: str = Field(
..., description="""Filtering applied to data before generating mean/sd"""
)
waveform_mean: NDArray[Shape["* num_clusters, * num_samples"], np.float32] = Field(
waveform_mean: NDArray[Shape["* num_clusters, * num_samples"], float] = Field(
...,
description="""The mean waveform for each cluster, using the same indices for each wave as cluster numbers in the associated Clustering module (i.e, cluster 3 is in array slot [3]). Waveforms corresponding to gaps in cluster sequence should be empty (e.g., zero- filled)""",
json_schema_extra={
@@ -397,7 +462,7 @@ class ClusterWaveforms(NWBDataInterface):
}
},
)
waveform_sd: NDArray[Shape["* num_clusters, * num_samples"], np.float32] = Field(
waveform_sd: NDArray[Shape["* num_clusters, * num_samples"], float] = Field(
...,
description="""Stdev of waveforms for each cluster, using the same indices as in mean""",
json_schema_extra={
@@ -406,6 +471,15 @@ class ClusterWaveforms(NWBDataInterface):
}
},
)
clustering_interface: Union[Clustering, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "Clustering"}, {"range": "string"}],
}
},
)
class Clustering(NWBDataInterface):
@@ -424,17 +498,17 @@ class Clustering(NWBDataInterface):
...,
description="""Description of clusters or clustering, (e.g. cluster 0 is noise, clusters curated using Klusters, etc)""",
)
num: NDArray[Shape["* num_events"], np.int32] = Field(
num: NDArray[Shape["* num_events"], int] = Field(
...,
description="""Cluster number of each event""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_events"}]}}},
)
peak_over_rms: NDArray[Shape["* num_clusters"], np.float32] = Field(
peak_over_rms: NDArray[Shape["* num_clusters"], float] = Field(
...,
description="""Maximum ratio of waveform peak to RMS on any channel in the cluster (provides a basic clustering metric).""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_clusters"}]}}},
)
times: NDArray[Shape["* num_events"], np.float64] = Field(
times: NDArray[Shape["* num_events"], float] = Field(
...,
description="""Times of clustered events, in seconds. This may be a link to times field in associated FeatureExtraction module.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_events"}]}}},
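The new link slots in this file (e.g. `device: Union[Device, str]`, `clustering_interface: Union[Clustering, str]`) accept either an inline object or a string reference to an object stored elsewhere. A minimal sketch of how such a slot can be resolved against a lookup table (the `Probe`, `resolve_device`, and registry path names are illustrative, not part of the generated models):

```python
from typing import Dict, Union

from pydantic import BaseModel


class Device(BaseModel):
    name: str


class Probe(BaseModel):
    # A link slot: either the Device itself or a string path to it
    device: Union[Device, str]


def resolve_device(probe: Probe, registry: Dict[str, Device]) -> Device:
    """Return the linked Device, resolving string links against a registry."""
    if isinstance(probe.device, Device):
        return probe.device
    return registry[probe.device]


registry = {"/general/devices/probe0": Device(name="probe0")}
linked = Probe(device="/general/devices/probe0")
print(resolve_device(linked, registry).name)  # → probe0
```

Keeping the string form in the union means a model can validate before the linked object has been loaded, with resolution deferred to whoever holds the containing file.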


@@ -37,6 +37,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@@ -62,7 +71,7 @@ ModelType = TypeVar("ModelType", bound=Type[BaseModel])
def _get_name(item: ModelType | dict, info: ValidationInfo) -> Union[ModelType, dict]:
"""Get the name of the slot that refers to this object"""
assert isinstance(item, (BaseModel, dict))
assert isinstance(item, (BaseModel, dict)), f"{item} was not a BaseModel or a dict!"
name = info.field_name
if isinstance(item, BaseModel):
item.name = name
@@ -96,7 +105,7 @@ class TimeIntervals(DynamicTable):
)
name: str = Field(...)
start_time: NDArray[Any, np.float32] = Field(
start_time: VectorData[NDArray[Any, float]] = Field(
...,
description="""Start time of epoch, in seconds.""",
json_schema_extra={
@@ -105,7 +114,7 @@ class TimeIntervals(DynamicTable):
}
},
)
stop_time: NDArray[Any, np.float32] = Field(
stop_time: VectorData[NDArray[Any, float]] = Field(
...,
description="""Stop time of epoch, in seconds.""",
json_schema_extra={
@@ -114,7 +123,7 @@ class TimeIntervals(DynamicTable):
}
},
)
tags: Optional[NDArray[Any, str]] = Field(
tags: VectorData[Optional[NDArray[Any, str]]] = Field(
None,
description="""User-defined tags that identify or categorize events.""",
json_schema_extra={
@@ -127,7 +136,12 @@ class TimeIntervals(DynamicTable):
None,
description="""Index for tags.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
timeseries: Optional[TimeIntervalsTimeseries] = Field(
@@ -137,17 +151,20 @@ class TimeIntervals(DynamicTable):
None,
description="""Index for timeseries.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
colnames: Optional[str] = Field(
None,
colnames: List[str] = Field(
...,
description="""The names of the columns in this table. This should be used to specify an order to the columns.""",
)
description: Optional[str] = Field(
None, description="""Description of what is in this dynamic table."""
)
id: NDArray[Shape["* num_rows"], int] = Field(
description: str = Field(..., description="""Description of what is in this dynamic table.""")
id: VectorData[NDArray[Shape["* num_rows"], int]] = Field(
...,
description="""Array of unique identifiers for the rows of this dynamic table.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_rows"}]}}},
@@ -173,21 +190,23 @@ class TimeIntervalsTimeseries(VectorData):
"linkml_meta": {"equals_string": "timeseries", "ifabsent": "string(timeseries)"}
},
)
idx_start: Optional[np.int32] = Field(
idx_start: Optional[NDArray[Shape["*"], int]] = Field(
None,
description="""Start index into the TimeSeries 'data' and 'timestamp' datasets of the referenced TimeSeries. The first dimension of those arrays is always time.""",
json_schema_extra={"linkml_meta": {"array": {"exact_number_dimensions": 1}}},
)
count: Optional[np.int32] = Field(
count: Optional[NDArray[Shape["*"], int]] = Field(
None,
description="""Number of data samples available in this time series, during this epoch.""",
json_schema_extra={"linkml_meta": {"array": {"exact_number_dimensions": 1}}},
)
timeseries: Optional[TimeSeries] = Field(
None, description="""the TimeSeries that this index applies to."""
timeseries: Optional[NDArray[Shape["*"], TimeSeries]] = Field(
None,
description="""the TimeSeries that this index applies to.""",
json_schema_extra={"linkml_meta": {"array": {"exact_number_dimensions": 1}}},
)
description: Optional[str] = Field(
None, description="""Description of what these vectors represent."""
)
array: Optional[
description: str = Field(..., description="""Description of what these vectors represent.""")
value: Optional[
Union[
NDArray[Shape["* dim0"], Any],
NDArray[Shape["* dim0, * dim1"], Any],


@@ -7,7 +7,6 @@ import sys
from typing import Any, ClassVar, List, Literal, Dict, Optional, Union
from pydantic import BaseModel, ConfigDict, Field, RootModel, field_validator
import numpy as np
from ...core.v2_2_5.core_nwb_epoch import TimeIntervals
from ...core.v2_2_5.core_nwb_misc import Units
from ...core.v2_2_5.core_nwb_device import Device
from ...core.v2_2_5.core_nwb_ogen import OptogeneticStimulusSite
@@ -16,6 +15,7 @@ from ...core.v2_2_5.core_nwb_ecephys import ElectrodeGroup
from numpydantic import NDArray, Shape
from ...hdmf_common.v1_1_3.hdmf_common_table import DynamicTable, VectorData, VectorIndex
from ...core.v2_2_5.core_nwb_icephys import IntracellularElectrode, SweepTable
from ...core.v2_2_5.core_nwb_epoch import TimeIntervals
from ...core.v2_2_5.core_nwb_base import (
NWBData,
NWBContainer,
@@ -42,6 +42,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@@ -96,9 +105,7 @@ class ScratchData(NWBData):
)
name: str = Field(...)
notes: Optional[str] = Field(
None, description="""Any notes the user has about the dataset being stored"""
)
notes: str = Field(..., description="""Any notes the user has about the dataset being stored""")
class NWBFile(NWBContainer):
@@ -114,11 +121,12 @@ class NWBFile(NWBContainer):
"root",
json_schema_extra={"linkml_meta": {"equals_string": "root", "ifabsent": "string(root)"}},
)
nwb_version: Optional[str] = Field(
None,
nwb_version: Literal["2.2.5"] = Field(
"2.2.5",
description="""File version string. Use semantic versioning, e.g. 1.2.1. This will be the name of the format with trailing major, minor and patch numbers.""",
json_schema_extra={"linkml_meta": {"equals_string": "2.2.5", "ifabsent": "string(2.2.5)"}},
)
file_create_date: NDArray[Shape["* num_modifications"], np.datetime64] = Field(
file_create_date: NDArray[Shape["* num_modifications"], datetime] = Field(
...,
description="""A record of the date the file was created and of subsequent modifications. The date is stored in UTC with local timezone offset as ISO 8601 extended formatted strings: 2018-09-28T14:43:54.123+02:00. Dates stored in UTC end in \"Z\" with no timezone offset. Date accuracy is up to milliseconds. The file can be created after the experiment was run, so this may differ from the experiment start time. Each modification to the nwb file adds a new entry to the array.""",
json_schema_extra={
@@ -132,11 +140,11 @@ class NWBFile(NWBContainer):
session_description: str = Field(
..., description="""A description of the experimental session and data in the file."""
)
session_start_time: np.datetime64 = Field(
session_start_time: datetime = Field(
...,
description="""Date and time of the experiment/session start. The date is stored in UTC with local timezone offset as ISO 8601 extended formatted string: 2018-09-28T14:43:54.123+02:00. Dates stored in UTC end in \"Z\" with no timezone offset. Date accuracy is up to milliseconds.""",
)
timestamps_reference_time: np.datetime64 = Field(
timestamps_reference_time: datetime = Field(
...,
description="""Date and time corresponding to time zero of all timestamps. The date is stored in UTC with local timezone offset as ISO 8601 extended formatted string: 2018-09-28T14:43:54.123+02:00. Dates stored in UTC end in \"Z\" with no timezone offset. Date accuracy is up to milliseconds. All times stored in the file use this time as reference (i.e., time zero).""",
)
@@ -174,19 +182,9 @@ class NWBFile(NWBContainer):
...,
description="""Experimental metadata, including protocol, notes and description of hardware device(s). The metadata stored in this section should be used to describe the experiment. Metadata necessary for interpreting the data is stored with the data. General experimental metadata, including animal strain, experimental protocols, experimenter, devices, etc, are stored under 'general'. Core metadata (e.g., that required to interpret data fields) is stored with the data itself, and implicitly defined by the file specification (e.g., time is in seconds). The strategy used here for storing non-core metadata is to use free-form text fields, such as would appear in sentences or paragraphs from a Methods section. Metadata fields are text to enable them to be more general, for example to represent ranges instead of numerical values. Machine-readable metadata is stored as attributes to these free-form datasets. All entries in the below table are to be included when data is present. Unused groups (e.g., intracellular_ephys in an optophysiology experiment) should not be created unless there is data to store within them.""",
)
intervals: Optional[List[TimeIntervals]] = Field(
intervals: Optional[NWBFileIntervals] = Field(
None,
description="""Experimental intervals, whether that be logically distinct sub-experiments having a particular scientific goal, trials (see trials subgroup) during an experiment, or epochs (see epochs subgroup) deriving from analysis of data.""",
json_schema_extra={
"linkml_meta": {
"any_of": [
{"range": "TimeIntervals"},
{"range": "TimeIntervals"},
{"range": "TimeIntervals"},
{"range": "TimeIntervals"},
]
}
},
)
units: Optional[Units] = Field(None, description="""Data about sorted spike units.""")
@@ -272,7 +270,7 @@ class NWBFileGeneral(ConfiguredBaseModel):
None,
description="""Description of slices, including information about preparation thickness, orientation, temperature, and bath solution.""",
)
source_script: Optional[NWBFileGeneralSourceScript] = Field(
source_script: Optional[GeneralSourceScript] = Field(
None,
description="""Script file or link to public source code used to create this NWB file.""",
)
@@ -300,10 +298,10 @@ class NWBFileGeneral(ConfiguredBaseModel):
None,
description="""Information about the animal or person from which the data was measured.""",
)
extracellular_ephys: Optional[NWBFileGeneralExtracellularEphys] = Field(
extracellular_ephys: Optional[GeneralExtracellularEphys] = Field(
None, description="""Metadata related to extracellular electrophysiology."""
)
intracellular_ephys: Optional[NWBFileGeneralIntracellularEphys] = Field(
intracellular_ephys: Optional[GeneralIntracellularEphys] = Field(
None, description="""Metadata related to intracellular electrophysiology."""
)
optogenetics: Optional[List[OptogeneticStimulusSite]] = Field(
@@ -318,7 +316,7 @@ class NWBFileGeneral(ConfiguredBaseModel):
)
class NWBFileGeneralSourceScript(ConfiguredBaseModel):
class GeneralSourceScript(ConfiguredBaseModel):
"""
Script file or link to public source code used to create this NWB file.
"""
@@ -331,11 +329,11 @@ class NWBFileGeneralSourceScript(ConfiguredBaseModel):
"linkml_meta": {"equals_string": "source_script", "ifabsent": "string(source_script)"}
},
)
file_name: Optional[str] = Field(None, description="""Name of script file.""")
file_name: str = Field(..., description="""Name of script file.""")
value: str = Field(...)
class NWBFileGeneralExtracellularEphys(ConfiguredBaseModel):
class GeneralExtracellularEphys(ConfiguredBaseModel):
"""
Metadata related to extracellular electrophysiology.
"""
@@ -354,12 +352,12 @@ class NWBFileGeneralExtracellularEphys(ConfiguredBaseModel):
electrode_group: Optional[List[ElectrodeGroup]] = Field(
None, description="""Physical group of electrodes."""
)
electrodes: Optional[NWBFileGeneralExtracellularEphysElectrodes] = Field(
electrodes: Optional[ExtracellularEphysElectrodes] = Field(
None, description="""A table of all electrodes (i.e. channels) used for recording."""
)
class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
class ExtracellularEphysElectrodes(DynamicTable):
"""
A table of all electrodes (i.e. channels) used for recording.
"""
@ -372,7 +370,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
"linkml_meta": {"equals_string": "electrodes", "ifabsent": "string(electrodes)"}
},
)
x: NDArray[Any, np.float32] = Field(
x: VectorData[NDArray[Any, float]] = Field(
...,
description="""x coordinate of the channel location in the brain (+x is posterior).""",
json_schema_extra={
@ -381,7 +379,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
}
},
)
y: NDArray[Any, np.float32] = Field(
y: VectorData[NDArray[Any, float]] = Field(
...,
description="""y coordinate of the channel location in the brain (+y is inferior).""",
json_schema_extra={
@ -390,7 +388,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
}
},
)
z: NDArray[Any, np.float32] = Field(
z: VectorData[NDArray[Any, float]] = Field(
...,
description="""z coordinate of the channel location in the brain (+z is right).""",
json_schema_extra={
@ -399,7 +397,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
}
},
)
imp: NDArray[Any, np.float32] = Field(
imp: VectorData[NDArray[Any, float]] = Field(
...,
description="""Impedance of the channel.""",
json_schema_extra={
@ -408,7 +406,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
}
},
)
location: NDArray[Any, str] = Field(
location: VectorData[NDArray[Any, str]] = Field(
...,
description="""Location of the electrode (channel). Specify the area, layer, comments on estimation of area/layer, stereotaxic coordinates if in vivo, etc. Use standard atlas names for anatomical regions when possible.""",
json_schema_extra={
@ -417,7 +415,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
}
},
)
filtering: NDArray[Any, np.float32] = Field(
filtering: VectorData[NDArray[Any, float]] = Field(
...,
description="""Description of hardware filtering.""",
json_schema_extra={
@ -429,7 +427,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
group: List[ElectrodeGroup] = Field(
..., description="""Reference to the ElectrodeGroup this electrode is a part of."""
)
group_name: NDArray[Any, str] = Field(
group_name: VectorData[NDArray[Any, str]] = Field(
...,
description="""Name of the ElectrodeGroup this electrode is a part of.""",
json_schema_extra={
@ -438,7 +436,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
}
},
)
rel_x: Optional[NDArray[Any, np.float32]] = Field(
rel_x: VectorData[Optional[NDArray[Any, float]]] = Field(
None,
description="""x coordinate in electrode group""",
json_schema_extra={
@ -447,7 +445,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
}
},
)
rel_y: Optional[NDArray[Any, np.float32]] = Field(
rel_y: VectorData[Optional[NDArray[Any, float]]] = Field(
None,
description="""y coordinate in electrode group""",
json_schema_extra={
@ -456,7 +454,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
}
},
)
rel_z: Optional[NDArray[Any, np.float32]] = Field(
rel_z: VectorData[Optional[NDArray[Any, float]]] = Field(
None,
description="""z coordinate in electrode group""",
json_schema_extra={
@ -465,7 +463,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
}
},
)
reference: Optional[NDArray[Any, str]] = Field(
reference: VectorData[Optional[NDArray[Any, str]]] = Field(
None,
description="""Description of the reference used for this electrode.""",
json_schema_extra={
@ -474,14 +472,12 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
}
},
)
colnames: Optional[str] = Field(
None,
colnames: List[str] = Field(
...,
description="""The names of the columns in this table. This should be used to specify an order to the columns.""",
)
description: Optional[str] = Field(
None, description="""Description of what is in this dynamic table."""
)
id: NDArray[Shape["* num_rows"], int] = Field(
description: str = Field(..., description="""Description of what is in this dynamic table.""")
id: VectorData[NDArray[Shape["* num_rows"], int]] = Field(
...,
description="""Array of unique identifiers for the rows of this dynamic table.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_rows"}]}}},
@ -494,7 +490,7 @@ class NWBFileGeneralExtracellularEphysElectrodes(DynamicTable):
)
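Throughout this table the bare `NDArray[Any, np.float32]` column annotations become `VectorData[NDArray[Any, float]]`: numpy scalar dtypes are swapped for builtin Python types, and each column is wrapped in the generic `VectorData` container. Indexing still reaches the underlying array because the wrapper delegates to it, consistent with the `__getitem__` this changeset adds to the generated base model. A toy stand-in (not the real nwb_linkml `VectorData`) sketching that delegation:

```python
from typing import Generic, TypeVar

T = TypeVar("T")

class ToyVectorData(Generic[T]):
    """Toy stand-in for the generated VectorData wrapper."""

    def __init__(self, value: T):
        self.value = value

    def __getitem__(self, idx):
        # Delegate indexing to the wrapped array, so code like
        # electrodes.x[0] behaves the same after the type change.
        return self.value[idx]

x = ToyVectorData([1.5, 2.5, 3.5])
print(x[0])  # -> 1.5
```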
class NWBFileGeneralIntracellularEphys(ConfiguredBaseModel):
class GeneralIntracellularEphys(ConfiguredBaseModel):
"""
Metadata related to intracellular electrophysiology.
"""
@ -522,6 +518,35 @@ class NWBFileGeneralIntracellularEphys(ConfiguredBaseModel):
)
class NWBFileIntervals(ConfiguredBaseModel):
"""
Experimental intervals, whether that be logically distinct sub-experiments having a particular scientific goal, trials (see trials subgroup) during an experiment, or epochs (see epochs subgroup) deriving from analysis of data.
"""
linkml_meta: ClassVar[LinkMLMeta] = LinkMLMeta({"from_schema": "core.nwb.file"})
name: Literal["intervals"] = Field(
"intervals",
json_schema_extra={
"linkml_meta": {"equals_string": "intervals", "ifabsent": "string(intervals)"}
},
)
epochs: Optional[TimeIntervals] = Field(
None,
description="""Divisions in time marking experimental stages or sub-divisions of a single recording session.""",
)
trials: Optional[TimeIntervals] = Field(
None, description="""Repeated experimental events that have a logical grouping."""
)
invalid_times: Optional[TimeIntervals] = Field(
None, description="""Time intervals that should be removed from analysis."""
)
time_intervals: Optional[List[TimeIntervals]] = Field(
None,
description="""Optional additional table(s) for describing other experimental time intervals.""",
)
class LabMetaData(NWBContainer):
"""
Lab-specific meta-data.
@ -547,7 +572,7 @@ class Subject(NWBContainer):
age: Optional[str] = Field(
None, description="""Age of subject. Can be supplied instead of 'date_of_birth'."""
)
date_of_birth: Optional[np.datetime64] = Field(
date_of_birth: Optional[datetime] = Field(
None, description="""Date of birth of subject. Can be supplied instead of 'age'."""
)
description: Optional[str] = Field(
@ -575,9 +600,10 @@ ScratchData.model_rebuild()
NWBFile.model_rebuild()
NWBFileStimulus.model_rebuild()
NWBFileGeneral.model_rebuild()
NWBFileGeneralSourceScript.model_rebuild()
NWBFileGeneralExtracellularEphys.model_rebuild()
NWBFileGeneralExtracellularEphysElectrodes.model_rebuild()
NWBFileGeneralIntracellularEphys.model_rebuild()
GeneralSourceScript.model_rebuild()
GeneralExtracellularEphys.model_rebuild()
ExtracellularEphysElectrodes.model_rebuild()
GeneralIntracellularEphys.model_rebuild()
NWBFileIntervals.model_rebuild()
LabMetaData.model_rebuild()
Subject.model_rebuild()


@ -11,6 +11,7 @@ from ...core.v2_2_5.core_nwb_base import (
TimeSeriesSync,
NWBContainer,
)
from ...core.v2_2_5.core_nwb_device import Device
from typing import Any, ClassVar, List, Literal, Dict, Optional, Union, Annotated, Type, TypeVar
from pydantic import (
BaseModel,
@ -42,6 +43,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
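The `__getitem__` added here makes every generated model indexable by forwarding to whichever of its `value` or `data` fields is populated. A minimal sketch of that behavior, using a stripped-down stand-in class rather than the real pydantic `ConfiguredBaseModel`:

```python
from typing import Any, Optional, Union

class IndexableModel:
    """Stand-in for the __getitem__ passthrough on ConfiguredBaseModel."""

    def __init__(self, value: Optional[list] = None, data: Optional[list] = None):
        self.value = value
        self.data = data

    def __getitem__(self, val: Union[int, slice]) -> Any:
        # Prefer an explicit `value` field, fall back to `data`,
        # and fail loudly when neither is populated.
        if hasattr(self, "value") and self.value is not None:
            return self.value[val]
        elif hasattr(self, "data") and self.data is not None:
            return self.data[val]
        else:
            raise KeyError("No value or data field to index from")

print(IndexableModel(data=[1, 2, 3])[0])  # -> 1
```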
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@ -67,7 +77,7 @@ ModelType = TypeVar("ModelType", bound=Type[BaseModel])
def _get_name(item: ModelType | dict, info: ValidationInfo) -> Union[ModelType, dict]:
"""Get the name of the slot that refers to this object"""
assert isinstance(item, (BaseModel, dict))
assert isinstance(item, (BaseModel, dict)), f"{item} was not a BaseModel or a dict!"
name = info.field_name
if isinstance(item, BaseModel):
item.name = name
@ -106,32 +116,46 @@ class PatchClampSeries(TimeSeries):
)
name: str = Field(...)
stimulus_description: Optional[str] = Field(
None, description="""Protocol/stimulus name for this patch-clamp dataset."""
stimulus_description: str = Field(
..., description="""Protocol/stimulus name for this patch-clamp dataset."""
)
sweep_number: Optional[np.uint32] = Field(
sweep_number: Optional[int] = Field(
None, description="""Sweep number, allows to group different PatchClampSeries together."""
)
data: PatchClampSeriesData = Field(..., description="""Recorded voltage or current.""")
gain: Optional[np.float32] = Field(
gain: Optional[float] = Field(
None,
description="""Gain of the recording, in units Volt/Amp (v-clamp) or Volt/Volt (c-clamp).""",
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
electrode: Union[IntracellularElectrode, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "IntracellularElectrode"}, {"range": "string"}],
}
},
)
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
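The new `electrode: Union[IntracellularElectrode, str]` slots encode HDF5-style links: the field may hold either the target object itself or a string path naming it, as the `source_type: link` annotation records. A hypothetical consumer of such a union (`resolve_link` and the registry dict are illustrative, not part of the generated API):

```python
from typing import Union

class IntracellularElectrode:
    """Toy stand-in for the generated IntracellularElectrode model."""

    def __init__(self, name: str):
        self.name = name

def resolve_link(link: Union[IntracellularElectrode, str],
                 registry: dict) -> IntracellularElectrode:
    # A link slot may hold either the object or a path-like string
    # naming it; resolve strings through a lookup table.
    if isinstance(link, IntracellularElectrode):
        return link
    return registry[link]

reg = {"/general/intracellular_ephys/elec0": IntracellularElectrode("elec0")}
print(resolve_link("/general/intracellular_ephys/elec0", reg).name)  # -> elec0
```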
@ -160,11 +184,11 @@ class PatchClampSeriesData(ConfiguredBaseModel):
"data",
json_schema_extra={"linkml_meta": {"equals_string": "data", "ifabsent": "string(data)"}},
)
unit: Optional[str] = Field(
None,
unit: str = Field(
...,
description="""Base unit of measurement for working with the data. Actual stored values are not necessarily stored in these units. To access the data in these units, multiply 'data' by 'conversion'.""",
)
array: Optional[NDArray[Shape["* num_times"], np.number]] = Field(
value: Optional[NDArray[Shape["* num_times"], float]] = Field(
None, json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}}
)
@ -180,36 +204,50 @@ class CurrentClampSeries(PatchClampSeries):
name: str = Field(...)
data: CurrentClampSeriesData = Field(..., description="""Recorded voltage.""")
bias_current: Optional[np.float32] = Field(None, description="""Bias current, in amps.""")
bridge_balance: Optional[np.float32] = Field(None, description="""Bridge balance, in ohms.""")
capacitance_compensation: Optional[np.float32] = Field(
bias_current: Optional[float] = Field(None, description="""Bias current, in amps.""")
bridge_balance: Optional[float] = Field(None, description="""Bridge balance, in ohms.""")
capacitance_compensation: Optional[float] = Field(
None, description="""Capacitance compensation, in farads."""
)
stimulus_description: Optional[str] = Field(
None, description="""Protocol/stimulus name for this patch-clamp dataset."""
stimulus_description: str = Field(
..., description="""Protocol/stimulus name for this patch-clamp dataset."""
)
sweep_number: Optional[np.uint32] = Field(
sweep_number: Optional[int] = Field(
None, description="""Sweep number, allows to group different PatchClampSeries together."""
)
gain: Optional[np.float32] = Field(
gain: Optional[float] = Field(
None,
description="""Gain of the recording, in units Volt/Amp (v-clamp) or Volt/Volt (c-clamp).""",
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
electrode: Union[IntracellularElectrode, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "IntracellularElectrode"}, {"range": "string"}],
}
},
)
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@ -238,9 +276,10 @@ class CurrentClampSeriesData(ConfiguredBaseModel):
"data",
json_schema_extra={"linkml_meta": {"equals_string": "data", "ifabsent": "string(data)"}},
)
unit: Optional[str] = Field(
None,
unit: Literal["volts"] = Field(
"volts",
description="""Base unit of measurement for working with the data. which is fixed to 'volts'. Actual stored values are not necessarily stored in these units. To access the data in these units, multiply 'data' by 'conversion'.""",
json_schema_extra={"linkml_meta": {"equals_string": "volts", "ifabsent": "string(volts)"}},
)
value: Any = Field(...)
@ -255,39 +294,51 @@ class IZeroClampSeries(CurrentClampSeries):
)
name: str = Field(...)
bias_current: np.float32 = Field(..., description="""Bias current, in amps, fixed to 0.0.""")
bridge_balance: np.float32 = Field(
..., description="""Bridge balance, in ohms, fixed to 0.0."""
)
capacitance_compensation: np.float32 = Field(
bias_current: float = Field(..., description="""Bias current, in amps, fixed to 0.0.""")
bridge_balance: float = Field(..., description="""Bridge balance, in ohms, fixed to 0.0.""")
capacitance_compensation: float = Field(
..., description="""Capacitance compensation, in farads, fixed to 0.0."""
)
data: CurrentClampSeriesData = Field(..., description="""Recorded voltage.""")
stimulus_description: Optional[str] = Field(
None, description="""Protocol/stimulus name for this patch-clamp dataset."""
stimulus_description: str = Field(
..., description="""Protocol/stimulus name for this patch-clamp dataset."""
)
sweep_number: Optional[np.uint32] = Field(
sweep_number: Optional[int] = Field(
None, description="""Sweep number, allows to group different PatchClampSeries together."""
)
gain: Optional[np.float32] = Field(
gain: Optional[float] = Field(
None,
description="""Gain of the recording, in units Volt/Amp (v-clamp) or Volt/Volt (c-clamp).""",
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
electrode: Union[IntracellularElectrode, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "IntracellularElectrode"}, {"range": "string"}],
}
},
)
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@ -316,31 +367,45 @@ class CurrentClampStimulusSeries(PatchClampSeries):
name: str = Field(...)
data: CurrentClampStimulusSeriesData = Field(..., description="""Stimulus current applied.""")
stimulus_description: Optional[str] = Field(
None, description="""Protocol/stimulus name for this patch-clamp dataset."""
stimulus_description: str = Field(
..., description="""Protocol/stimulus name for this patch-clamp dataset."""
)
sweep_number: Optional[np.uint32] = Field(
sweep_number: Optional[int] = Field(
None, description="""Sweep number, allows to group different PatchClampSeries together."""
)
gain: Optional[np.float32] = Field(
gain: Optional[float] = Field(
None,
description="""Gain of the recording, in units Volt/Amp (v-clamp) or Volt/Volt (c-clamp).""",
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
electrode: Union[IntracellularElectrode, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "IntracellularElectrode"}, {"range": "string"}],
}
},
)
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@ -369,9 +434,12 @@ class CurrentClampStimulusSeriesData(ConfiguredBaseModel):
"data",
json_schema_extra={"linkml_meta": {"equals_string": "data", "ifabsent": "string(data)"}},
)
unit: Optional[str] = Field(
None,
unit: Literal["amperes"] = Field(
"amperes",
description="""Base unit of measurement for working with the data. which is fixed to 'amperes'. Actual stored values are not necessarily stored in these units. To access the data in these units, multiply 'data' by 'conversion'.""",
json_schema_extra={
"linkml_meta": {"equals_string": "amperes", "ifabsent": "string(amperes)"}
},
)
value: Any = Field(...)
@ -408,31 +476,45 @@ class VoltageClampSeries(PatchClampSeries):
whole_cell_series_resistance_comp: Optional[VoltageClampSeriesWholeCellSeriesResistanceComp] = (
Field(None, description="""Whole cell series resistance compensation, in ohms.""")
)
stimulus_description: Optional[str] = Field(
None, description="""Protocol/stimulus name for this patch-clamp dataset."""
stimulus_description: str = Field(
..., description="""Protocol/stimulus name for this patch-clamp dataset."""
)
sweep_number: Optional[np.uint32] = Field(
sweep_number: Optional[int] = Field(
None, description="""Sweep number, allows to group different PatchClampSeries together."""
)
gain: Optional[np.float32] = Field(
gain: Optional[float] = Field(
None,
description="""Gain of the recording, in units Volt/Amp (v-clamp) or Volt/Volt (c-clamp).""",
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
electrode: Union[IntracellularElectrode, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "IntracellularElectrode"}, {"range": "string"}],
}
},
)
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@ -461,9 +543,12 @@ class VoltageClampSeriesData(ConfiguredBaseModel):
"data",
json_schema_extra={"linkml_meta": {"equals_string": "data", "ifabsent": "string(data)"}},
)
unit: Optional[str] = Field(
None,
unit: Literal["amperes"] = Field(
"amperes",
description="""Base unit of measurement for working with the data. which is fixed to 'amperes'. Actual stored values are not necessarily stored in these units. To access the data in these units, multiply 'data' by 'conversion'.""",
json_schema_extra={
"linkml_meta": {"equals_string": "amperes", "ifabsent": "string(amperes)"}
},
)
value: Any = Field(...)
@ -484,11 +569,14 @@ class VoltageClampSeriesCapacitanceFast(ConfiguredBaseModel):
}
},
)
unit: Optional[str] = Field(
None,
unit: Literal["farads"] = Field(
"farads",
description="""Unit of measurement for capacitance_fast, which is fixed to 'farads'.""",
json_schema_extra={
"linkml_meta": {"equals_string": "farads", "ifabsent": "string(farads)"}
},
)
value: np.float32 = Field(...)
value: float = Field(...)
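The `unit` attributes with schema-fixed values move from `Optional[str] = None` to `Literal[...]` fields that default to the fixed string, so omitting the unit yields the correct value while a wrong one fails validation. A stdlib sketch of the same constraint (a dataclass stand-in for the pydantic `Literal["farads"]` field, not the generated code):

```python
from dataclasses import dataclass

@dataclass
class CapacitanceFast:
    """Toy sketch of the fixed-unit pattern from the generated models."""

    value: float
    unit: str = "farads"  # pinned: the schema fixes this to 'farads'

    def __post_init__(self):
        # Mirrors Literal["farads"]: any other unit is rejected.
        if self.unit != "farads":
            raise ValueError(f"unit must be 'farads', got {self.unit!r}")

print(CapacitanceFast(1e-12).unit)  # -> farads
```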
class VoltageClampSeriesCapacitanceSlow(ConfiguredBaseModel):
@ -507,11 +595,14 @@ class VoltageClampSeriesCapacitanceSlow(ConfiguredBaseModel):
}
},
)
unit: Optional[str] = Field(
None,
unit: Literal["farads"] = Field(
"farads",
description="""Unit of measurement for capacitance_fast, which is fixed to 'farads'.""",
json_schema_extra={
"linkml_meta": {"equals_string": "farads", "ifabsent": "string(farads)"}
},
)
value: np.float32 = Field(...)
value: float = Field(...)
class VoltageClampSeriesResistanceCompBandwidth(ConfiguredBaseModel):
@ -530,11 +621,12 @@ class VoltageClampSeriesResistanceCompBandwidth(ConfiguredBaseModel):
}
},
)
unit: Optional[str] = Field(
None,
unit: Literal["hertz"] = Field(
"hertz",
description="""Unit of measurement for resistance_comp_bandwidth, which is fixed to 'hertz'.""",
json_schema_extra={"linkml_meta": {"equals_string": "hertz", "ifabsent": "string(hertz)"}},
)
value: np.float32 = Field(...)
value: float = Field(...)
class VoltageClampSeriesResistanceCompCorrection(ConfiguredBaseModel):
@ -553,11 +645,14 @@ class VoltageClampSeriesResistanceCompCorrection(ConfiguredBaseModel):
}
},
)
unit: Optional[str] = Field(
None,
unit: Literal["percent"] = Field(
"percent",
description="""Unit of measurement for resistance_comp_correction, which is fixed to 'percent'.""",
json_schema_extra={
"linkml_meta": {"equals_string": "percent", "ifabsent": "string(percent)"}
},
)
value: np.float32 = Field(...)
value: float = Field(...)
class VoltageClampSeriesResistanceCompPrediction(ConfiguredBaseModel):
@ -576,11 +671,14 @@ class VoltageClampSeriesResistanceCompPrediction(ConfiguredBaseModel):
}
},
)
unit: Optional[str] = Field(
None,
unit: Literal["percent"] = Field(
"percent",
description="""Unit of measurement for resistance_comp_prediction, which is fixed to 'percent'.""",
json_schema_extra={
"linkml_meta": {"equals_string": "percent", "ifabsent": "string(percent)"}
},
)
value: np.float32 = Field(...)
value: float = Field(...)
class VoltageClampSeriesWholeCellCapacitanceComp(ConfiguredBaseModel):
@ -599,11 +697,14 @@ class VoltageClampSeriesWholeCellCapacitanceComp(ConfiguredBaseModel):
}
},
)
unit: Optional[str] = Field(
None,
unit: Literal["farads"] = Field(
"farads",
description="""Unit of measurement for whole_cell_capacitance_comp, which is fixed to 'farads'.""",
json_schema_extra={
"linkml_meta": {"equals_string": "farads", "ifabsent": "string(farads)"}
},
)
value: np.float32 = Field(...)
value: float = Field(...)
class VoltageClampSeriesWholeCellSeriesResistanceComp(ConfiguredBaseModel):
@ -622,11 +723,12 @@ class VoltageClampSeriesWholeCellSeriesResistanceComp(ConfiguredBaseModel):
}
},
)
unit: Optional[str] = Field(
None,
unit: Literal["ohms"] = Field(
"ohms",
description="""Unit of measurement for whole_cell_series_resistance_comp, which is fixed to 'ohms'.""",
json_schema_extra={"linkml_meta": {"equals_string": "ohms", "ifabsent": "string(ohms)"}},
)
value: np.float32 = Field(...)
value: float = Field(...)
class VoltageClampStimulusSeries(PatchClampSeries):
@ -640,31 +742,45 @@ class VoltageClampStimulusSeries(PatchClampSeries):
name: str = Field(...)
data: VoltageClampStimulusSeriesData = Field(..., description="""Stimulus voltage applied.""")
stimulus_description: Optional[str] = Field(
None, description="""Protocol/stimulus name for this patch-clamp dataset."""
stimulus_description: str = Field(
..., description="""Protocol/stimulus name for this patch-clamp dataset."""
)
sweep_number: Optional[np.uint32] = Field(
sweep_number: Optional[int] = Field(
None, description="""Sweep number, allows to group different PatchClampSeries together."""
)
gain: Optional[np.float32] = Field(
gain: Optional[float] = Field(
None,
description="""Gain of the recording, in units Volt/Amp (v-clamp) or Volt/Volt (c-clamp).""",
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
electrode: Union[IntracellularElectrode, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "IntracellularElectrode"}, {"range": "string"}],
}
},
)
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@ -693,9 +809,10 @@ class VoltageClampStimulusSeriesData(ConfiguredBaseModel):
"data",
json_schema_extra={"linkml_meta": {"equals_string": "data", "ifabsent": "string(data)"}},
)
unit: Optional[str] = Field(
None,
unit: Literal["volts"] = Field(
"volts",
        description="""Base unit of measurement for working with the data, which is fixed to 'volts'. Actual stored values are not necessarily stored in these units. To access the data in these units, multiply 'data' by 'conversion'.""",
json_schema_extra={"linkml_meta": {"equals_string": "volts", "ifabsent": "string(volts)"}},
)
value: Any = Field(...)
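The hunk above pins the fixed `unit` attribute to a `Literal["volts"]` with a matching default, where the old model accepted any string (or `None`). A minimal sketch of the runtime behavior this buys — the names here are illustrative, not taken from the generated module:

```python
from typing import Literal, get_args

VoltsUnit = Literal["volts"]

def validate_unit(value: str) -> str:
    # Mirrors what pydantic does for a Literal field: reject anything
    # other than the spec's fixed value.
    if value not in get_args(VoltsUnit):
        raise ValueError(f"unit must be 'volts', got {value!r}")
    return value
```

With `Optional[str] = None` the old model silently accepted any unit; the `Literal` form both documents the spec constant and enforces it at validation time.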
@@ -726,6 +843,15 @@ class IntracellularElectrode(NWBContainer):
slice: Optional[str] = Field(
None, description="""Information about slice used for recording."""
)
device: Union[Device, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "Device"}, {"range": "string"}],
}
},
)
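Link slots like `device` above are typed `Union[Device, str]`: either the resolved object itself or a string reference to it. A sketch of how such a link might be resolved — `resolve_link` and the `registry` mapping are hypothetical, for illustration only:

```python
from dataclasses import dataclass
from typing import Dict, Union

@dataclass
class Device:
    name: str

def resolve_link(link: Union[Device, str], registry: Dict[str, Device]) -> Device:
    # A link is stored either inline as the object itself or as a
    # string reference (e.g. an HDF5 path) looked up elsewhere.
    if isinstance(link, Device):
        return link
    return registry[link]
```

The `source_type: link` annotation in `json_schema_extra` preserves the fact that the slot came from an NWB link, so downstream tooling can tell links apart from contained objects.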
class SweepTable(DynamicTable):
@@ -738,7 +864,7 @@ class SweepTable(DynamicTable):
)
name: str = Field(...)
sweep_number: NDArray[Any, np.uint32] = Field(
sweep_number: VectorData[NDArray[Any, int]] = Field(
...,
description="""Sweep number of the PatchClampSeries in that row.""",
json_schema_extra={
@@ -754,17 +880,20 @@ class SweepTable(DynamicTable):
...,
description="""Index for series.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
colnames: Optional[str] = Field(
None,
colnames: List[str] = Field(
...,
description="""The names of the columns in this table. This should be used to specify an order to the columns.""",
)
description: Optional[str] = Field(
None, description="""Description of what is in this dynamic table."""
)
id: NDArray[Shape["* num_rows"], int] = Field(
description: str = Field(..., description="""Description of what is in this dynamic table.""")
id: VectorData[NDArray[Shape["* num_rows"], int]] = Field(
...,
description="""Array of unique identifiers for the rows of this dynamic table.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_rows"}]}}},
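The hunks above make `colnames`, `description`, and `id` required on generated `DynamicTable` classes, with `colnames` as a proper `List[str]` instead of a single string. A minimal sketch of what an ordered `colnames` is for, using made-up column data:

```python
# Columns may live in unordered storage (e.g. an HDF5 group);
# colnames fixes the iteration/display order of the table.
columns = {"series": ["s0", "s1"], "sweep_number": [1, 2]}
colnames = ["sweep_number", "series"]

ordered = {name: columns[name] for name in colnames}
```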


@@ -28,6 +28,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
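The new `__getitem__` above lets any generated model be indexed directly, delegating to its `value` or `data` field. A runnable sketch of the same logic, using a dataclass stand-in for the pydantic base class:

```python
from dataclasses import dataclass
from typing import Any, Optional, Union

@dataclass
class IndexableModel:
    # Stand-in for the generated ConfiguredBaseModel (pydantic in the real code)
    value: Optional[list] = None
    data: Optional[list] = None

    def __getitem__(self, val: Union[int, slice]) -> Any:
        # Prefer `value`, then fall back to `data`, as in the hunk above
        if self.value is not None:
            return self.value[val]
        elif self.data is not None:
            return self.data[val]
        raise KeyError("No value or data field to index from")
```

This makes wrapper models (e.g. a `*Data` dataset class) transparent to slicing: `model[0:10]` works whether the array lives in `value` or `data`.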
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@@ -71,15 +80,15 @@ class GrayscaleImage(Image):
)
name: str = Field(...)
resolution: Optional[np.float32] = Field(
resolution: Optional[float] = Field(
None, description="""Pixel resolution of the image, in pixels per centimeter."""
)
description: Optional[str] = Field(None, description="""Description of the image.""")
array: Optional[
value: Optional[
Union[
NDArray[Shape["* x, * y"], np.number],
NDArray[Shape["* x, * y, 3 r_g_b"], np.number],
NDArray[Shape["* x, * y, 4 r_g_b_a"], np.number],
NDArray[Shape["* x, * y"], float],
NDArray[Shape["* x, * y, 3 r_g_b"], float],
NDArray[Shape["* x, * y, 4 r_g_b_a"], float],
]
] = Field(None)
@@ -94,15 +103,15 @@ class RGBImage(Image):
)
name: str = Field(...)
resolution: Optional[np.float32] = Field(
resolution: Optional[float] = Field(
None, description="""Pixel resolution of the image, in pixels per centimeter."""
)
description: Optional[str] = Field(None, description="""Description of the image.""")
array: Optional[
value: Optional[
Union[
NDArray[Shape["* x, * y"], np.number],
NDArray[Shape["* x, * y, 3 r_g_b"], np.number],
NDArray[Shape["* x, * y, 4 r_g_b_a"], np.number],
NDArray[Shape["* x, * y"], float],
NDArray[Shape["* x, * y, 3 r_g_b"], float],
NDArray[Shape["* x, * y, 4 r_g_b_a"], float],
]
] = Field(None)
@@ -117,15 +126,15 @@ class RGBAImage(Image):
)
name: str = Field(...)
resolution: Optional[np.float32] = Field(
resolution: Optional[float] = Field(
None, description="""Pixel resolution of the image, in pixels per centimeter."""
)
description: Optional[str] = Field(None, description="""Description of the image.""")
array: Optional[
value: Optional[
Union[
NDArray[Shape["* x, * y"], np.number],
NDArray[Shape["* x, * y, 3 r_g_b"], np.number],
NDArray[Shape["* x, * y, 4 r_g_b_a"], np.number],
NDArray[Shape["* x, * y"], float],
NDArray[Shape["* x, * y, 3 r_g_b"], float],
NDArray[Shape["* x, * y, 4 r_g_b_a"], float],
]
] = Field(None)
@@ -142,11 +151,11 @@ class ImageSeries(TimeSeries):
name: str = Field(...)
data: Optional[
Union[
NDArray[Shape["* frame, * x, * y"], np.number],
NDArray[Shape["* frame, * x, * y, * z"], np.number],
NDArray[Shape["* frame, * x, * y"], float],
NDArray[Shape["* frame, * x, * y, * z"], float],
]
] = Field(None, description="""Binary data representing images across frames.""")
dimension: Optional[NDArray[Shape["* rank"], np.int32]] = Field(
dimension: Optional[NDArray[Shape["* rank"], int]] = Field(
None,
description="""Number of pixels on x, y, (and z) axes.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "rank"}]}}},
@@ -159,21 +168,26 @@ class ImageSeries(TimeSeries):
None,
description="""Format of image. If this is 'external', then the attribute 'external_file' contains the path information to the image files. If this is 'raw', then the raw (single-channel) binary data is stored in the 'data' dataset. If this attribute is not present, then the default format='raw' case is assumed.""",
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -204,11 +218,11 @@ class ImageSeriesExternalFile(ConfiguredBaseModel):
"linkml_meta": {"equals_string": "external_file", "ifabsent": "string(external_file)"}
},
)
starting_frame: Optional[np.int32] = Field(
None,
starting_frame: List[int] = Field(
...,
        description="""Each external image may contain one or more consecutive frames of the full ImageSeries. This attribute serves as an index to indicate which frames each file contains, to facilitate random access. The 'starting_frame' attribute, hence, contains a list of frame numbers within the full ImageSeries of the first frame of each file listed in the parent 'external_file' dataset. Zero-based indexing is used (hence, the first element will always be zero). For example, if the 'external_file' dataset has three paths to files and the first file has 5 frames, the second file has 10 frames, and the third file has 20 frames, then this attribute will have values [0, 5, 15]. If there is a single external file that holds all of the frames of the ImageSeries (and so there is a single element in the 'external_file' dataset), then this attribute should have value [0].""",
)
array: Optional[NDArray[Shape["* num_files"], str]] = Field(
value: Optional[NDArray[Shape["* num_files"], str]] = Field(
None, json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_files"}]}}}
)
@@ -223,13 +237,22 @@ class ImageMaskSeries(ImageSeries):
)
name: str = Field(...)
masked_imageseries: Union[ImageSeries, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "ImageSeries"}, {"range": "string"}],
}
},
)
data: Optional[
Union[
NDArray[Shape["* frame, * x, * y"], np.number],
NDArray[Shape["* frame, * x, * y, * z"], np.number],
NDArray[Shape["* frame, * x, * y"], float],
NDArray[Shape["* frame, * x, * y, * z"], float],
]
] = Field(None, description="""Binary data representing images across frames.""")
dimension: Optional[NDArray[Shape["* rank"], np.int32]] = Field(
dimension: Optional[NDArray[Shape["* rank"], int]] = Field(
None,
description="""Number of pixels on x, y, (and z) axes.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "rank"}]}}},
@@ -242,21 +265,26 @@ class ImageMaskSeries(ImageSeries):
None,
description="""Format of image. If this is 'external', then the attribute 'external_file' contains the path information to the image files. If this is 'raw', then the raw (single-channel) binary data is stored in the 'data' dataset. If this attribute is not present, then the default format='raw' case is assumed.""",
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -284,24 +312,23 @@ class OpticalSeries(ImageSeries):
)
name: str = Field(...)
distance: Optional[np.float32] = Field(
distance: Optional[float] = Field(
None, description="""Distance from camera/monitor to target/eye."""
)
field_of_view: Optional[
Union[
NDArray[Shape["2 width_height"], np.float32],
NDArray[Shape["3 width_height_depth"], np.float32],
NDArray[Shape["2 width_height"], float], NDArray[Shape["3 width_height_depth"], float]
]
] = Field(None, description="""Width, height and depth of image, or imaged area, in meters.""")
data: Union[
NDArray[Shape["* frame, * x, * y"], np.number],
NDArray[Shape["* frame, * x, * y, 3 r_g_b"], np.number],
NDArray[Shape["* frame, * x, * y"], float],
NDArray[Shape["* frame, * x, * y, 3 r_g_b"], float],
] = Field(..., description="""Images presented to subject, either grayscale or RGB""")
orientation: Optional[str] = Field(
None,
description="""Description of image relative to some reference frame (e.g., which way is up). Must also specify frame of reference.""",
)
dimension: Optional[NDArray[Shape["* rank"], np.int32]] = Field(
dimension: Optional[NDArray[Shape["* rank"], int]] = Field(
None,
description="""Number of pixels on x, y, (and z) axes.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "rank"}]}}},
@@ -314,21 +341,26 @@ class OpticalSeries(ImageSeries):
None,
description="""Format of image. If this is 'external', then the attribute 'external_file' contains the path information to the image files. If this is 'raw', then the raw (single-channel) binary data is stored in the 'data' dataset. If this attribute is not present, then the default format='raw' case is assumed.""",
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -356,26 +388,40 @@ class IndexSeries(TimeSeries):
)
name: str = Field(...)
data: NDArray[Shape["* num_times"], np.int32] = Field(
data: NDArray[Shape["* num_times"], int] = Field(
...,
description="""Index of the frame in the referenced ImageSeries.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
indexed_timeseries: Union[ImageSeries, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "ImageSeries"}, {"range": "string"}],
}
},
)
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},


@@ -43,6 +43,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@@ -68,7 +77,7 @@ ModelType = TypeVar("ModelType", bound=Type[BaseModel])
def _get_name(item: ModelType | dict, info: ValidationInfo) -> Union[ModelType, dict]:
"""Get the name of the slot that refers to this object"""
assert isinstance(item, (BaseModel, dict))
assert isinstance(item, (BaseModel, dict)), f"{item} was not a BaseModel or a dict!"
name = info.field_name
if isinstance(item, BaseModel):
item.name = name
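The `_get_name` validator above backs the `Named[...]` wrapper used throughout these models: when a slot holds a sub-object, the validator stamps the slot's own name onto that object. A simplified stand-in, without pydantic's `ValidationInfo` machinery (`Child` and `get_name` are illustrative names):

```python
from dataclasses import dataclass
from typing import Union

@dataclass
class Child:
    name: str = ""

def get_name(item: Union[Child, dict], field_name: str) -> Union[Child, dict]:
    # Copy the name of the slot that refers to this object onto the
    # object itself, as the generated validator does.
    assert isinstance(item, (Child, dict)), f"{item} was not a Child or a dict!"
    if isinstance(item, Child):
        item.name = field_name
    else:
        item["name"] = field_name
    return item
```

This is why slots like `spike_times_index: Named[Optional[VectorIndex]]` need no explicit `name` argument at construction time: the field name itself supplies it.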
@@ -120,21 +129,26 @@ class AbstractFeatureSeries(TimeSeries):
description="""Description of the features represented in TimeSeries::data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_features"}]}}},
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -164,13 +178,14 @@ class AbstractFeatureSeriesData(ConfiguredBaseModel):
json_schema_extra={"linkml_meta": {"equals_string": "data", "ifabsent": "string(data)"}},
)
unit: Optional[str] = Field(
None,
"see 'feature_units'",
description="""Since there can be different units for different features, store the units in 'feature_units'. The default value for this attribute is \"see 'feature_units'\".""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(see 'feature_units')"}},
)
array: Optional[
value: Optional[
Union[
NDArray[Shape["* num_times"], np.number],
NDArray[Shape["* num_times, * num_features"], np.number],
NDArray[Shape["* num_times"], float],
NDArray[Shape["* num_times, * num_features"], float],
]
] = Field(None)
@@ -190,21 +205,26 @@ class AnnotationSeries(TimeSeries):
description="""Annotations made during an experiment.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -232,26 +252,31 @@ class IntervalSeries(TimeSeries):
)
name: str = Field(...)
data: NDArray[Shape["* num_times"], np.int8] = Field(
data: NDArray[Shape["* num_times"], int] = Field(
...,
description="""Use values >0 if interval started, <0 if interval ended.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -287,21 +312,35 @@ class DecompositionSeries(TimeSeries):
...,
description="""Table for describing the bands that this series was generated from. There should be one row in this table for each band.""",
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
comments: Optional[str] = Field(
source_timeseries: Optional[Union[TimeSeries, str]] = Field(
None,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "TimeSeries"}, {"range": "string"}],
}
},
)
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -330,11 +369,12 @@ class DecompositionSeriesData(ConfiguredBaseModel):
"data",
json_schema_extra={"linkml_meta": {"equals_string": "data", "ifabsent": "string(data)"}},
)
unit: Optional[str] = Field(
None,
unit: str = Field(
"no unit",
description="""Base unit of measurement for working with the data. Actual stored values are not necessarily stored in these units. To access the data in these units, multiply 'data' by 'conversion'.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no unit)"}},
)
array: Optional[NDArray[Shape["* num_times, * num_channels, * num_bands"], np.number]] = Field(
value: Optional[NDArray[Shape["* num_times, * num_channels, * num_bands"], float]] = Field(
None,
json_schema_extra={
"linkml_meta": {
@@ -361,7 +401,7 @@ class DecompositionSeriesBands(DynamicTable):
"bands",
json_schema_extra={"linkml_meta": {"equals_string": "bands", "ifabsent": "string(bands)"}},
)
band_name: NDArray[Any, str] = Field(
band_name: VectorData[NDArray[Any, str]] = Field(
...,
description="""Name of the band, e.g. theta.""",
json_schema_extra={
@@ -370,7 +410,7 @@ class DecompositionSeriesBands(DynamicTable):
}
},
)
band_limits: NDArray[Shape["* num_bands, 2 low_high"], np.float32] = Field(
band_limits: VectorData[NDArray[Shape["* num_bands, 2 low_high"], float]] = Field(
...,
description="""Low and high limit of each band in Hz. If it is a Gaussian filter, use 2 SD on either side of the center.""",
json_schema_extra={
@@ -384,24 +424,22 @@ class DecompositionSeriesBands(DynamicTable):
}
},
)
band_mean: NDArray[Shape["* num_bands"], np.float32] = Field(
band_mean: VectorData[NDArray[Shape["* num_bands"], float]] = Field(
...,
description="""The mean Gaussian filters, in Hz.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_bands"}]}}},
)
band_stdev: NDArray[Shape["* num_bands"], np.float32] = Field(
band_stdev: VectorData[NDArray[Shape["* num_bands"], float]] = Field(
...,
description="""The standard deviation of Gaussian filters, in Hz.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_bands"}]}}},
)
colnames: Optional[str] = Field(
None,
colnames: List[str] = Field(
...,
description="""The names of the columns in this table. This should be used to specify an order to the columns.""",
)
description: Optional[str] = Field(
None, description="""Description of what is in this dynamic table."""
)
id: NDArray[Shape["* num_rows"], int] = Field(
description: str = Field(..., description="""Description of what is in this dynamic table.""")
id: VectorData[NDArray[Shape["* num_rows"], int]] = Field(
...,
description="""Array of unique identifiers for the rows of this dynamic table.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_rows"}]}}},
@@ -428,7 +466,12 @@ class Units(DynamicTable):
None,
description="""Index into the spike_times dataset.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
spike_times: Optional[UnitsSpikeTimes] = Field(
@@ -438,10 +481,16 @@ class Units(DynamicTable):
None,
description="""Index into the obs_intervals dataset.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
obs_intervals: Optional[NDArray[Shape["* num_intervals, 2 start_end"], np.float64]] = Field(
obs_intervals: VectorData[Optional[NDArray[Shape["* num_intervals, 2 start_end"], float]]] = (
Field(
None,
description="""Observation intervals for each unit.""",
json_schema_extra={
@@ -455,43 +504,56 @@
}
},
)
)
electrodes_index: Named[Optional[VectorIndex]] = Field(
None,
description="""Index into electrodes.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
electrodes: Named[Optional[DynamicTableRegion]] = Field(
None,
description="""Electrode that each spike unit came from, specified using a DynamicTableRegion.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
electrode_group: Optional[List[ElectrodeGroup]] = Field(
None, description="""Electrode group that each spike unit came from."""
)
waveform_mean: Optional[
waveform_mean: VectorData[
Optional[
Union[
NDArray[Shape["* num_units, * num_samples"], np.float32],
NDArray[Shape["* num_units, * num_samples, * num_electrodes"], np.float32],
NDArray[Shape["* num_units, * num_samples"], float],
NDArray[Shape["* num_units, * num_samples, * num_electrodes"], float],
]
]
] = Field(None, description="""Spike waveform mean for each spike unit.""")
waveform_sd: Optional[
waveform_sd: VectorData[
Optional[
Union[
NDArray[Shape["* num_units, * num_samples"], np.float32],
NDArray[Shape["* num_units, * num_samples, * num_electrodes"], np.float32],
NDArray[Shape["* num_units, * num_samples"], float],
NDArray[Shape["* num_units, * num_samples, * num_electrodes"], float],
]
]
] = Field(None, description="""Spike waveform standard deviation for each spike unit.""")
colnames: Optional[str] = Field(
None,
colnames: List[str] = Field(
...,
description="""The names of the columns in this table. This should be used to specify an order to the columns.""",
)
description: Optional[str] = Field(
None, description="""Description of what is in this dynamic table."""
)
id: NDArray[Shape["* num_rows"], int] = Field(
description: str = Field(..., description="""Description of what is in this dynamic table.""")
id: VectorData[NDArray[Shape["* num_rows"], int]] = Field(
...,
description="""Array of unique identifiers for the rows of this dynamic table.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_rows"}]}}},
@@ -517,14 +579,12 @@ class UnitsSpikeTimes(VectorData):
"linkml_meta": {"equals_string": "spike_times", "ifabsent": "string(spike_times)"}
},
)
resolution: Optional[np.float64] = Field(
resolution: Optional[float] = Field(
None,
description="""The smallest possible difference between two spike times. Usually 1 divided by the acquisition sampling rate from which spike times were extracted, but could be larger if the acquisition time series was downsampled or smaller if the acquisition time series was smoothed/interpolated and it is possible for the spike time to be between samples.""",
)
description: Optional[str] = Field(
None, description="""Description of what these vectors represent."""
)
array: Optional[
description: str = Field(..., description="""Description of what these vectors represent.""")
value: Optional[
Union[
NDArray[Shape["* dim0"], Any],
NDArray[Shape["* dim0, * dim1"], Any],

View file

@@ -14,6 +14,7 @@ from ...core.v2_2_5.core_nwb_base import (
TimeSeriesSync,
NWBContainer,
)
from ...core.v2_2_5.core_nwb_device import Device
metamodel_version = "None"
version = "2.2.5"
@@ -33,6 +34,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@@ -76,26 +86,40 @@ class OptogeneticSeries(TimeSeries):
)
name: str = Field(...)
data: NDArray[Shape["* num_times"], np.number] = Field(
data: NDArray[Shape["* num_times"], float] = Field(
...,
description="""Applied power for optogenetic stimulus, in watts.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
site: Union[OptogeneticStimulusSite, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "OptogeneticStimulusSite"}, {"range": "string"}],
}
},
)
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -124,11 +148,20 @@ class OptogeneticStimulusSite(NWBContainer):
name: str = Field(...)
description: str = Field(..., description="""Description of stimulation site.""")
excitation_lambda: np.float32 = Field(..., description="""Excitation wavelength, in nm.""")
excitation_lambda: float = Field(..., description="""Excitation wavelength, in nm.""")
location: str = Field(
...,
description="""Location of the stimulation site. Specify the area, layer, comments on estimation of area/layer, stereotaxic coordinates if in vivo, etc. Use standard atlas names for anatomical regions when possible.""",
)
device: Union[Device, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "Device"}, {"range": "string"}],
}
},
)
# Model rebuild

View file

@@ -21,8 +21,8 @@ from ...hdmf_common.v1_1_3.hdmf_common_table import (
VectorIndex,
VectorData,
)
from ...core.v2_2_5.core_nwb_device import Device
from numpydantic import NDArray, Shape
from ...core.v2_2_5.core_nwb_image import ImageSeries, ImageSeriesExternalFile
from ...core.v2_2_5.core_nwb_base import (
TimeSeriesStartingTime,
TimeSeriesSync,
@@ -30,6 +34,15 @@ from ...core.v2_2_5.core_nwb_base import (
NWBDataInterface,
NWBContainer,
)
from ...core.v2_2_5.core_nwb_image import ImageSeries, ImageSeriesExternalFile
metamodel_version = "None"
version = "2.2.5"
@@ -49,6 +50,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@@ -74,7 +84,7 @@ ModelType = TypeVar("ModelType", bound=Type[BaseModel])
def _get_name(item: ModelType | dict, info: ValidationInfo) -> Union[ModelType, dict]:
"""Get the name of the slot that refers to this object"""
assert isinstance(item, (BaseModel, dict))
assert isinstance(item, (BaseModel, dict)), f"{item} was not a BaseModel or a dict!"
name = info.field_name
if isinstance(item, BaseModel):
item.name = name
@@ -114,24 +124,32 @@ class TwoPhotonSeries(ImageSeries):
)
name: str = Field(...)
pmt_gain: Optional[np.float32] = Field(None, description="""Photomultiplier gain.""")
scan_line_rate: Optional[np.float32] = Field(
pmt_gain: Optional[float] = Field(None, description="""Photomultiplier gain.""")
scan_line_rate: Optional[float] = Field(
None,
description="""Lines imaged per second. This is also stored in /general/optophysiology but is kept here as it is useful information for analysis, and so good to be stored w/ the actual data.""",
)
field_of_view: Optional[
Union[
NDArray[Shape["2 width_height"], np.float32],
NDArray[Shape["3 width_height_depth"], np.float32],
NDArray[Shape["2 width_height"], float], NDArray[Shape["3 width_height_depth"], float]
]
] = Field(None, description="""Width, height and depth of image, or imaged area, in meters.""")
imaging_plane: Union[ImagingPlane, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "ImagingPlane"}, {"range": "string"}],
}
},
)
data: Optional[
Union[
NDArray[Shape["* frame, * x, * y"], np.number],
NDArray[Shape["* frame, * x, * y, * z"], np.number],
NDArray[Shape["* frame, * x, * y"], float],
NDArray[Shape["* frame, * x, * y, * z"], float],
]
] = Field(None, description="""Binary data representing images across frames.""")
dimension: Optional[NDArray[Shape["* rank"], np.int32]] = Field(
dimension: Optional[NDArray[Shape["* rank"], int]] = Field(
None,
description="""Number of pixels on x, y, (and z) axes.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "rank"}]}}},
@@ -144,21 +162,26 @@ class TwoPhotonSeries(ImageSeries):
None,
description="""Format of image. If this is 'external', then the attribute 'external_file' contains the path information to the image files. If this is 'raw', then the raw (single-channel) binary data is stored in the 'data' dataset. If this attribute is not present, then the default format='raw' case is assumed.""",
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -187,31 +210,40 @@ class RoiResponseSeries(TimeSeries):
name: str = Field(...)
data: Union[
NDArray[Shape["* num_times"], np.number],
NDArray[Shape["* num_times, * num_rois"], np.number],
NDArray[Shape["* num_times"], float], NDArray[Shape["* num_times, * num_rois"], float]
] = Field(..., description="""Signals from ROIs.""")
rois: Named[DynamicTableRegion] = Field(
...,
description="""DynamicTableRegion referencing into an ROITable containing information on the ROIs stored in this timeseries.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -238,7 +270,7 @@ class DfOverF(NWBDataInterface):
{"from_schema": "core.nwb.ophys", "tree_root": True}
)
children: Optional[List[RoiResponseSeries]] = Field(
value: Optional[List[RoiResponseSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "RoiResponseSeries"}]}}
)
name: str = Field(...)
@@ -253,7 +285,7 @@ class Fluorescence(NWBDataInterface):
{"from_schema": "core.nwb.ophys", "tree_root": True}
)
children: Optional[List[RoiResponseSeries]] = Field(
value: Optional[List[RoiResponseSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "RoiResponseSeries"}]}}
)
name: str = Field(...)
@@ -268,7 +300,7 @@ class ImageSegmentation(NWBDataInterface):
{"from_schema": "core.nwb.ophys", "tree_root": True}
)
children: Optional[List[PlaneSegmentation]] = Field(
value: Optional[List[PlaneSegmentation]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "PlaneSegmentation"}]}}
)
name: str = Field(...)
@@ -292,7 +324,12 @@ class PlaneSegmentation(DynamicTable):
None,
description="""Index into pixel_mask.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
pixel_mask: Optional[PlaneSegmentationPixelMask] = Field(
@@ -303,7 +340,12 @@ class PlaneSegmentation(DynamicTable):
None,
description="""Index into voxel_mask.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
voxel_mask: Optional[PlaneSegmentationVoxelMask] = Field(
@@ -315,14 +357,21 @@ class PlaneSegmentation(DynamicTable):
description="""Image stacks that the segmentation masks apply to.""",
json_schema_extra={"linkml_meta": {"any_of": [{"range": "ImageSeries"}]}},
)
colnames: Optional[str] = Field(
None,
imaging_plane: Union[ImagingPlane, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "ImagingPlane"}, {"range": "string"}],
}
},
)
colnames: List[str] = Field(
...,
description="""The names of the columns in this table. This should be used to specify an order to the columns.""",
)
description: Optional[str] = Field(
None, description="""Description of what is in this dynamic table."""
)
id: NDArray[Shape["* num_rows"], int] = Field(
description: str = Field(..., description="""Description of what is in this dynamic table.""")
id: VectorData[NDArray[Shape["* num_rows"], int]] = Field(
...,
description="""Array of unique identifiers for the rows of this dynamic table.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_rows"}]}}},
@@ -348,10 +397,8 @@ class PlaneSegmentationImageMask(VectorData):
"linkml_meta": {"equals_string": "image_mask", "ifabsent": "string(image_mask)"}
},
)
description: Optional[str] = Field(
None, description="""Description of what these vectors represent."""
)
array: Optional[
description: str = Field(..., description="""Description of what these vectors represent.""")
value: Optional[
Union[
NDArray[Shape["* dim0"], Any],
NDArray[Shape["* dim0, * dim1"], Any],
@@ -374,13 +421,23 @@ class PlaneSegmentationPixelMask(VectorData):
"linkml_meta": {"equals_string": "pixel_mask", "ifabsent": "string(pixel_mask)"}
},
)
x: Optional[np.uint32] = Field(None, description="""Pixel x-coordinate.""")
y: Optional[np.uint32] = Field(None, description="""Pixel y-coordinate.""")
weight: Optional[np.float32] = Field(None, description="""Weight of the pixel.""")
description: Optional[str] = Field(
None, description="""Description of what these vectors represent."""
x: Optional[NDArray[Shape["*"], int]] = Field(
None,
description="""Pixel x-coordinate.""",
json_schema_extra={"linkml_meta": {"array": {"exact_number_dimensions": 1}}},
)
array: Optional[
y: Optional[NDArray[Shape["*"], int]] = Field(
None,
description="""Pixel y-coordinate.""",
json_schema_extra={"linkml_meta": {"array": {"exact_number_dimensions": 1}}},
)
weight: Optional[NDArray[Shape["*"], float]] = Field(
None,
description="""Weight of the pixel.""",
json_schema_extra={"linkml_meta": {"array": {"exact_number_dimensions": 1}}},
)
description: str = Field(..., description="""Description of what these vectors represent.""")
value: Optional[
Union[
NDArray[Shape["* dim0"], Any],
NDArray[Shape["* dim0, * dim1"], Any],
@@ -403,14 +460,28 @@ class PlaneSegmentationVoxelMask(VectorData):
"linkml_meta": {"equals_string": "voxel_mask", "ifabsent": "string(voxel_mask)"}
},
)
x: Optional[np.uint32] = Field(None, description="""Voxel x-coordinate.""")
y: Optional[np.uint32] = Field(None, description="""Voxel y-coordinate.""")
z: Optional[np.uint32] = Field(None, description="""Voxel z-coordinate.""")
weight: Optional[np.float32] = Field(None, description="""Weight of the voxel.""")
description: Optional[str] = Field(
None, description="""Description of what these vectors represent."""
x: Optional[NDArray[Shape["*"], int]] = Field(
None,
description="""Voxel x-coordinate.""",
json_schema_extra={"linkml_meta": {"array": {"exact_number_dimensions": 1}}},
)
array: Optional[
y: Optional[NDArray[Shape["*"], int]] = Field(
None,
description="""Voxel y-coordinate.""",
json_schema_extra={"linkml_meta": {"array": {"exact_number_dimensions": 1}}},
)
z: Optional[NDArray[Shape["*"], int]] = Field(
None,
description="""Voxel z-coordinate.""",
json_schema_extra={"linkml_meta": {"array": {"exact_number_dimensions": 1}}},
)
weight: Optional[NDArray[Shape["*"], float]] = Field(
None,
description="""Weight of the voxel.""",
json_schema_extra={"linkml_meta": {"array": {"exact_number_dimensions": 1}}},
)
description: str = Field(..., description="""Description of what these vectors represent.""")
value: Optional[
Union[
NDArray[Shape["* dim0"], Any],
NDArray[Shape["* dim0, * dim1"], Any],
@@ -429,10 +500,123 @@ class ImagingPlane(NWBContainer):
{"from_schema": "core.nwb.ophys", "tree_root": True}
)
children: Optional[List[OpticalChannel]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "OpticalChannel"}]}}
)
name: str = Field(...)
description: Optional[str] = Field(None, description="""Description of the imaging plane.""")
excitation_lambda: float = Field(..., description="""Excitation wavelength, in nm.""")
imaging_rate: Optional[float] = Field(
None,
description="""Rate that images are acquired, in Hz. If the corresponding TimeSeries is present, the rate should be stored there instead.""",
)
indicator: str = Field(..., description="""Calcium indicator.""")
location: str = Field(
...,
description="""Location of the imaging plane. Specify the area, layer, comments on estimation of area/layer, stereotaxic coordinates if in vivo, etc. Use standard atlas names for anatomical regions when possible.""",
)
manifold: Optional[ImagingPlaneManifold] = Field(
None,
description="""DEPRECATED Physical position of each pixel. 'xyz' represents the position of the pixel relative to the defined coordinate space. Deprecated in favor of origin_coords and grid_spacing.""",
)
origin_coords: Optional[ImagingPlaneOriginCoords] = Field(
None,
description="""Physical location of the first element of the imaging plane (0, 0) for 2-D data or (0, 0, 0) for 3-D data. See also reference_frame for what the physical location is relative to (e.g., bregma).""",
)
grid_spacing: Optional[ImagingPlaneGridSpacing] = Field(
None,
description="""Space between pixels in (x, y) or voxels in (x, y, z) directions, in the specified unit. Assumes imaging plane is a regular grid. See also reference_frame to interpret the grid.""",
)
reference_frame: Optional[str] = Field(
None,
description="""Describes reference frame of origin_coords and grid_spacing. For example, this can be a text description of the anatomical location and orientation of the grid defined by origin_coords and grid_spacing or the vectors needed to transform or rotate the grid to a common anatomical axis (e.g., AP/DV/ML). This field is necessary to interpret origin_coords and grid_spacing. If origin_coords and grid_spacing are not present, then this field is not required. For example, if the microscope takes 10 x 10 x 2 images, where the first value of the data matrix (index (0, 0, 0)) corresponds to (-1.2, -0.6, -2) mm relative to bregma, the spacing between pixels is 0.2 mm in x, 0.2 mm in y and 0.5 mm in z, and larger numbers in x means more anterior, larger numbers in y means more rightward, and larger numbers in z means more ventral, then enter the following -- origin_coords = (-1.2, -0.6, -2) grid_spacing = (0.2, 0.2, 0.5) reference_frame = \"Origin coordinates are relative to bregma. First dimension corresponds to anterior-posterior axis (larger index = more anterior). Second dimension corresponds to medial-lateral axis (larger index = more rightward). Third dimension corresponds to dorsal-ventral axis (larger index = more ventral).\"""",
)
optical_channel: List[OpticalChannel] = Field(
..., description="""An optical channel used to record from an imaging plane."""
)
device: Union[Device, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "Device"}, {"range": "string"}],
}
},
)
class ImagingPlaneManifold(ConfiguredBaseModel):
"""
DEPRECATED Physical position of each pixel. 'xyz' represents the position of the pixel relative to the defined coordinate space. Deprecated in favor of origin_coords and grid_spacing.
"""
linkml_meta: ClassVar[LinkMLMeta] = LinkMLMeta({"from_schema": "core.nwb.ophys"})
name: Literal["manifold"] = Field(
"manifold",
json_schema_extra={
"linkml_meta": {"equals_string": "manifold", "ifabsent": "string(manifold)"}
},
)
conversion: Optional[float] = Field(
1.0,
description="""Scalar to multiply each element in data to convert it to the specified 'unit'. If the data are stored in acquisition system units or other units that require a conversion to be interpretable, multiply the data by 'conversion' to convert the data to the specified 'unit'. e.g. if the data acquisition system stores values in this object as pixels from x = -500 to 499, y = -500 to 499 that correspond to a 2 m x 2 m range, then the 'conversion' multiplier to get from raw data acquisition pixel units to meters is 2/1000.""",
json_schema_extra={"linkml_meta": {"ifabsent": "float(1.0)"}},
)
unit: Optional[str] = Field(
"meters",
description="""Base unit of measurement for working with the data. The default value is 'meters'.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(meters)"}},
)
value: Optional[
Union[
NDArray[Shape["* height, * width, 3 x_y_z"], float],
NDArray[Shape["* height, * width, * depth, 3 x_y_z"], float],
]
] = Field(None)
class ImagingPlaneOriginCoords(ConfiguredBaseModel):
"""
Physical location of the first element of the imaging plane (0, 0) for 2-D data or (0, 0, 0) for 3-D data. See also reference_frame for what the physical location is relative to (e.g., bregma).
"""
linkml_meta: ClassVar[LinkMLMeta] = LinkMLMeta({"from_schema": "core.nwb.ophys"})
name: Literal["origin_coords"] = Field(
"origin_coords",
json_schema_extra={
"linkml_meta": {"equals_string": "origin_coords", "ifabsent": "string(origin_coords)"}
},
)
unit: str = Field(
"meters",
description="""Measurement units for origin_coords. The default value is 'meters'.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(meters)"}},
)
value: Optional[Union[NDArray[Shape["2 x_y"], float], NDArray[Shape["3 x_y_z"], float]]] = (
Field(None)
)
class ImagingPlaneGridSpacing(ConfiguredBaseModel):
"""
Space between pixels in (x, y) or voxels in (x, y, z) directions, in the specified unit. Assumes imaging plane is a regular grid. See also reference_frame to interpret the grid.
"""
linkml_meta: ClassVar[LinkMLMeta] = LinkMLMeta({"from_schema": "core.nwb.ophys"})
name: Literal["grid_spacing"] = Field(
"grid_spacing",
json_schema_extra={
"linkml_meta": {"equals_string": "grid_spacing", "ifabsent": "string(grid_spacing)"}
},
)
unit: str = Field(
"meters",
description="""Measurement units for grid_spacing. The default value is 'meters'.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(meters)"}},
)
value: Optional[Union[NDArray[Shape["2 x_y"], float], NDArray[Shape["3 x_y_z"], float]]] = (
Field(None)
)
class OpticalChannel(NWBContainer):
@@ -446,9 +630,7 @@ class OpticalChannel(NWBContainer):
name: str = Field(...)
description: str = Field(..., description="""Description or other notes about the channel.""")
emission_lambda: np.float32 = Field(
..., description="""Emission wavelength for channel, in nm."""
)
emission_lambda: float = Field(..., description="""Emission wavelength for channel, in nm.""")
class MotionCorrection(NWBDataInterface):
@@ -460,7 +642,7 @@ class MotionCorrection(NWBDataInterface):
{"from_schema": "core.nwb.ophys", "tree_root": True}
)
children: Optional[List[CorrectedImageStack]] = Field(
value: Optional[List[CorrectedImageStack]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "CorrectedImageStack"}]}}
)
name: str = Field(...)
@@ -483,6 +665,15 @@ class CorrectedImageStack(NWBDataInterface):
...,
description="""Stores the x,y delta necessary to align each frame to the common coordinates, for example, to align each frame to a reference image.""",
)
original: Union[ImageSeries, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "ImageSeries"}, {"range": "string"}],
}
},
)
# Model rebuild
@@ -497,6 +688,9 @@ PlaneSegmentationImageMask.model_rebuild()
PlaneSegmentationPixelMask.model_rebuild()
PlaneSegmentationVoxelMask.model_rebuild()
ImagingPlane.model_rebuild()
ImagingPlaneManifold.model_rebuild()
ImagingPlaneOriginCoords.model_rebuild()
ImagingPlaneGridSpacing.model_rebuild()
OpticalChannel.model_rebuild()
MotionCorrection.model_rebuild()
CorrectedImageStack.model_rebuild()

View file

@@ -28,6 +28,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@@ -127,17 +136,13 @@ class ImagingRetinotopyAxis1PhaseMap(ConfiguredBaseModel):
}
},
)
dimension: Optional[np.int32] = Field(
None,
dimension: List[int] = Field(
...,
description="""Number of rows and columns in the image. NOTE: row, column representation is equivalent to height, width.""",
)
field_of_view: Optional[np.float32] = Field(
None, description="""Size of viewing area, in meters."""
)
unit: Optional[str] = Field(
None, description="""Unit that axis data is stored in (e.g., degrees)."""
)
array: Optional[NDArray[Shape["* num_rows, * num_cols"], np.float32]] = Field(
field_of_view: List[float] = Field(..., description="""Size of viewing area, in meters.""")
unit: str = Field(..., description="""Unit that axis data is stored in (e.g., degrees).""")
value: Optional[NDArray[Shape["* num_rows, * num_cols"], float]] = Field(
None,
json_schema_extra={
"linkml_meta": {"array": {"dimensions": [{"alias": "num_rows"}, {"alias": "num_cols"}]}}
@@ -161,17 +166,13 @@ class ImagingRetinotopyAxis1PowerMap(ConfiguredBaseModel):
}
},
)
dimension: Optional[np.int32] = Field(
None,
dimension: List[int] = Field(
...,
description="""Number of rows and columns in the image. NOTE: row, column representation is equivalent to height, width.""",
)
field_of_view: Optional[np.float32] = Field(
None, description="""Size of viewing area, in meters."""
)
unit: Optional[str] = Field(
None, description="""Unit that axis data is stored in (e.g., degrees)."""
)
array: Optional[NDArray[Shape["* num_rows, * num_cols"], np.float32]] = Field(
field_of_view: List[float] = Field(..., description="""Size of viewing area, in meters.""")
unit: str = Field(..., description="""Unit that axis data is stored in (e.g., degrees).""")
value: Optional[NDArray[Shape["* num_rows, * num_cols"], float]] = Field(
None,
json_schema_extra={
"linkml_meta": {"array": {"dimensions": [{"alias": "num_rows"}, {"alias": "num_cols"}]}}
@@ -195,17 +196,13 @@ class ImagingRetinotopyAxis2PhaseMap(ConfiguredBaseModel):
}
},
)
dimension: Optional[np.int32] = Field(
None,
dimension: List[int] = Field(
...,
description="""Number of rows and columns in the image. NOTE: row, column representation is equivalent to height, width.""",
)
field_of_view: Optional[np.float32] = Field(
None, description="""Size of viewing area, in meters."""
)
unit: Optional[str] = Field(
None, description="""Unit that axis data is stored in (e.g., degrees)."""
)
array: Optional[NDArray[Shape["* num_rows, * num_cols"], np.float32]] = Field(
field_of_view: List[float] = Field(..., description="""Size of viewing area, in meters.""")
unit: str = Field(..., description="""Unit that axis data is stored in (e.g., degrees).""")
value: Optional[NDArray[Shape["* num_rows, * num_cols"], float]] = Field(
None,
json_schema_extra={
"linkml_meta": {"array": {"dimensions": [{"alias": "num_rows"}, {"alias": "num_cols"}]}}
@@ -229,17 +226,13 @@ class ImagingRetinotopyAxis2PowerMap(ConfiguredBaseModel):
}
},
)
dimension: Optional[np.int32] = Field(
None,
dimension: List[int] = Field(
...,
description="""Number of rows and columns in the image. NOTE: row, column representation is equivalent to height, width.""",
)
field_of_view: Optional[np.float32] = Field(
None, description="""Size of viewing area, in meters."""
)
unit: Optional[str] = Field(
None, description="""Unit that axis data is stored in (e.g., degrees)."""
)
array: Optional[NDArray[Shape["* num_rows, * num_cols"], np.float32]] = Field(
field_of_view: List[float] = Field(..., description="""Size of viewing area, in meters.""")
unit: str = Field(..., description="""Unit that axis data is stored in (e.g., degrees).""")
value: Optional[NDArray[Shape["* num_rows, * num_cols"], float]] = Field(
None,
json_schema_extra={
"linkml_meta": {"array": {"dimensions": [{"alias": "num_rows"}, {"alias": "num_cols"}]}}
@@ -263,24 +256,18 @@ class ImagingRetinotopyFocalDepthImage(ConfiguredBaseModel):
}
},
)
bits_per_pixel: Optional[np.int32] = Field(
None,
bits_per_pixel: int = Field(
...,
description="""Number of bits used to represent each value. This is necessary to determine maximum (white) pixel value.""",
)
dimension: Optional[np.int32] = Field(
None,
dimension: List[int] = Field(
...,
description="""Number of rows and columns in the image. NOTE: row, column representation is equivalent to height, width.""",
)
field_of_view: Optional[np.float32] = Field(
None, description="""Size of viewing area, in meters."""
)
focal_depth: Optional[np.float32] = Field(
None, description="""Focal depth offset, in meters."""
)
format: Optional[str] = Field(
None, description="""Format of image. Right now only 'raw' is supported."""
)
array: Optional[NDArray[Shape["* num_rows, * num_cols"], np.uint16]] = Field(
field_of_view: List[float] = Field(..., description="""Size of viewing area, in meters.""")
focal_depth: float = Field(..., description="""Focal depth offset, in meters.""")
format: str = Field(..., description="""Format of image. Right now only 'raw' is supported.""")
value: Optional[NDArray[Shape["* num_rows, * num_cols"], int]] = Field(
None,
json_schema_extra={
"linkml_meta": {"array": {"dimensions": [{"alias": "num_rows"}, {"alias": "num_cols"}]}}
@@ -301,14 +288,12 @@ class ImagingRetinotopySignMap(ConfiguredBaseModel):
"linkml_meta": {"equals_string": "sign_map", "ifabsent": "string(sign_map)"}
},
)
dimension: Optional[np.int32] = Field(
None,
dimension: List[int] = Field(
...,
description="""Number of rows and columns in the image. NOTE: row, column representation is equivalent to height, width.""",
)
field_of_view: Optional[np.float32] = Field(
None, description="""Size of viewing area, in meters."""
)
array: Optional[NDArray[Shape["* num_rows, * num_cols"], np.float32]] = Field(
field_of_view: List[float] = Field(..., description="""Size of viewing area, in meters.""")
value: Optional[NDArray[Shape["* num_rows, * num_cols"], float]] = Field(
None,
json_schema_extra={
"linkml_meta": {"array": {"dimensions": [{"alias": "num_rows"}, {"alias": "num_cols"}]}}
@@ -332,21 +317,17 @@ class ImagingRetinotopyVasculatureImage(ConfiguredBaseModel):
}
},
)
bits_per_pixel: Optional[np.int32] = Field(
None,
bits_per_pixel: int = Field(
...,
description="""Number of bits used to represent each value. This is necessary to determine maximum (white) pixel value""",
)
dimension: Optional[np.int32] = Field(
None,
dimension: List[int] = Field(
...,
description="""Number of rows and columns in the image. NOTE: row, column representation is equivalent to height, width.""",
)
field_of_view: Optional[np.float32] = Field(
None, description="""Size of viewing area, in meters."""
)
format: Optional[str] = Field(
None, description="""Format of image. Right now only 'raw' is supported."""
)
array: Optional[NDArray[Shape["* num_rows, * num_cols"], np.uint16]] = Field(
field_of_view: List[float] = Field(..., description="""Size of viewing area, in meters.""")
format: str = Field(..., description="""Format of image. Right now only 'raw' is supported.""")
value: Optional[NDArray[Shape["* num_rows, * num_cols"], int]] = Field(
None,
json_schema_extra={
"linkml_meta": {"array": {"dimensions": [{"alias": "num_rows"}, {"alias": "num_cols"}]}}

View file

@@ -56,6 +56,9 @@ from ...core.v2_2_5.core_nwb_ophys import (
PlaneSegmentationPixelMask,
PlaneSegmentationVoxelMask,
ImagingPlane,
ImagingPlaneManifold,
ImagingPlaneOriginCoords,
ImagingPlaneGridSpacing,
OpticalChannel,
MotionCorrection,
CorrectedImageStack,
@@ -134,10 +137,11 @@ from ...core.v2_2_5.core_nwb_file import (
NWBFile,
NWBFileStimulus,
NWBFileGeneral,
NWBFileGeneralSourceScript,
NWBFileGeneralExtracellularEphys,
NWBFileGeneralExtracellularEphysElectrodes,
NWBFileGeneralIntracellularEphys,
GeneralSourceScript,
GeneralExtracellularEphys,
ExtracellularEphysElectrodes,
GeneralIntracellularEphys,
NWBFileIntervals,
LabMetaData,
Subject,
)
@@ -161,6 +165,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}

View file

@@ -29,6 +29,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@@ -88,15 +97,15 @@ class Image(NWBData):
)
name: str = Field(...)
resolution: Optional[np.float32] = Field(
resolution: Optional[float] = Field(
None, description="""Pixel resolution of the image, in pixels per centimeter."""
)
description: Optional[str] = Field(None, description="""Description of the image.""")
array: Optional[
value: Optional[
Union[
NDArray[Shape["* x, * y"], np.number],
NDArray[Shape["* x, * y, 3 r_g_b"], np.number],
NDArray[Shape["* x, * y, 4 r_g_b_a"], np.number],
NDArray[Shape["* x, * y"], float],
NDArray[Shape["* x, * y, 3 r_g_b"], float],
NDArray[Shape["* x, * y, 4 r_g_b_a"], float],
]
] = Field(None)
@@ -135,10 +144,15 @@ class TimeSeries(NWBDataInterface):
)
name: str = Field(...)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
data: TimeSeriesData = Field(
...,
@@ -148,12 +162,12 @@ class TimeSeries(NWBDataInterface):
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -182,23 +196,25 @@ class TimeSeriesData(ConfiguredBaseModel):
"data",
json_schema_extra={"linkml_meta": {"equals_string": "data", "ifabsent": "string(data)"}},
)
conversion: Optional[np.float32] = Field(
None,
conversion: Optional[float] = Field(
1.0,
description="""Scalar to multiply each element in data to convert it to the specified 'unit'. If the data are stored in acquisition system units or other units that require a conversion to be interpretable, multiply the data by 'conversion' to convert the data to the specified 'unit'. e.g. if the data acquisition system stores values in this object as signed 16-bit integers (int16 range -32,768 to 32,767) that correspond to a 5V range (-2.5V to 2.5V), and the data acquisition system gain is 8000X, then the 'conversion' multiplier to get from raw data acquisition values to recorded volts is 2.5/32768/8000 = 9.5367e-9.""",
json_schema_extra={"linkml_meta": {"ifabsent": "float(1.0)"}},
)
resolution: Optional[np.float32] = Field(
None,
resolution: Optional[float] = Field(
-1.0,
description="""Smallest meaningful difference between values in data, stored in the specified by unit, e.g., the change in value of the least significant bit, or a larger number if signal noise is known to be present. If unknown, use -1.0.""",
json_schema_extra={"linkml_meta": {"ifabsent": "float(-1.0)"}},
)
unit: Optional[str] = Field(
None,
unit: str = Field(
...,
description="""Base unit of measurement for working with the data. Actual stored values are not necessarily stored in these units. To access the data in these units, multiply 'data' by 'conversion'.""",
)
continuity: Optional[str] = Field(
None,
description="""Optionally describe the continuity of the data. Can be \"continuous\", \"instantaneous\", or \"step\". For example, a voltage trace would be \"continuous\", because samples are recorded from a continuous process. An array of lick times would be \"instantaneous\", because the data represents distinct moments in time. Times of image presentations would be \"step\" because the picture remains the same until the next timepoint. This field is optional, but is useful in providing information about the underlying data. It may inform the way this data is interpreted, the way it is visualized, and what analysis methods are applicable.""",
)
array: Optional[
value: Optional[
Union[
NDArray[Shape["* num_times"], Any],
NDArray[Shape["* num_times, * num_dim2"], Any],
@@ -221,11 +237,15 @@ class TimeSeriesStartingTime(ConfiguredBaseModel):
"linkml_meta": {"equals_string": "starting_time", "ifabsent": "string(starting_time)"}
},
)
rate: Optional[np.float32] = Field(None, description="""Sampling rate, in Hz.""")
unit: Optional[str] = Field(
None, description="""Unit of measurement for time, which is fixed to 'seconds'."""
rate: float = Field(..., description="""Sampling rate, in Hz.""")
unit: Literal["seconds"] = Field(
"seconds",
description="""Unit of measurement for time, which is fixed to 'seconds'.""",
json_schema_extra={
"linkml_meta": {"equals_string": "seconds", "ifabsent": "string(seconds)"}
},
)
value: np.float64 = Field(...)
value: float = Field(...)
class TimeSeriesSync(ConfiguredBaseModel):
@@ -250,7 +270,7 @@ class ProcessingModule(NWBContainer):
{"from_schema": "core.nwb.base", "tree_root": True}
)
children: Optional[List[Union[DynamicTable, NWBDataInterface]]] = Field(
value: Optional[List[Union[DynamicTable, NWBDataInterface]]] = Field(
None,
json_schema_extra={
"linkml_meta": {"any_of": [{"range": "NWBDataInterface"}, {"range": "DynamicTable"}]}
@@ -269,9 +289,7 @@ class Images(NWBDataInterface):
)
name: str = Field("Images", json_schema_extra={"linkml_meta": {"ifabsent": "string(Images)"}})
description: Optional[str] = Field(
None, description="""Description of this collection of images."""
)
description: str = Field(..., description="""Description of this collection of images.""")
image: List[Image] = Field(..., description="""Images stored in this collection.""")

View file

@@ -34,6 +34,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@@ -84,21 +93,26 @@ class SpatialSeries(TimeSeries):
reference_frame: Optional[str] = Field(
None, description="""Description defining what exactly 'straight-ahead' means."""
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -128,13 +142,14 @@ class SpatialSeriesData(ConfiguredBaseModel):
json_schema_extra={"linkml_meta": {"equals_string": "data", "ifabsent": "string(data)"}},
)
unit: Optional[str] = Field(
None,
"meters",
description="""Base unit of measurement for working with the data. The default value is 'meters'. Actual stored values are not necessarily stored in these units. To access the data in these units, multiply 'data' by 'conversion'.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(meters)"}},
)
array: Optional[
value: Optional[
Union[
NDArray[Shape["* num_times"], np.number],
NDArray[Shape["* num_times, * num_features"], np.number],
NDArray[Shape["* num_times"], float],
NDArray[Shape["* num_times, * num_features"], float],
]
] = Field(None)
@@ -148,7 +163,7 @@ class BehavioralEpochs(NWBDataInterface):
{"from_schema": "core.nwb.behavior", "tree_root": True}
)
children: Optional[List[IntervalSeries]] = Field(
value: Optional[List[IntervalSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "IntervalSeries"}]}}
)
name: str = Field(...)
@@ -163,7 +178,7 @@ class BehavioralEvents(NWBDataInterface):
{"from_schema": "core.nwb.behavior", "tree_root": True}
)
children: Optional[List[TimeSeries]] = Field(
value: Optional[List[TimeSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "TimeSeries"}]}}
)
name: str = Field(...)
@@ -178,7 +193,7 @@ class BehavioralTimeSeries(NWBDataInterface):
{"from_schema": "core.nwb.behavior", "tree_root": True}
)
children: Optional[List[TimeSeries]] = Field(
value: Optional[List[TimeSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "TimeSeries"}]}}
)
name: str = Field(...)
@@ -193,7 +208,7 @@ class PupilTracking(NWBDataInterface):
{"from_schema": "core.nwb.behavior", "tree_root": True}
)
children: Optional[List[TimeSeries]] = Field(
value: Optional[List[TimeSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "TimeSeries"}]}}
)
name: str = Field(...)
@@ -208,7 +223,7 @@ class EyeTracking(NWBDataInterface):
{"from_schema": "core.nwb.behavior", "tree_root": True}
)
children: Optional[List[SpatialSeries]] = Field(
value: Optional[List[SpatialSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "SpatialSeries"}]}}
)
name: str = Field(...)
@@ -223,7 +238,7 @@ class CompassDirection(NWBDataInterface):
{"from_schema": "core.nwb.behavior", "tree_root": True}
)
children: Optional[List[SpatialSeries]] = Field(
value: Optional[List[SpatialSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "SpatialSeries"}]}}
)
name: str = Field(...)
@@ -238,7 +253,7 @@ class Position(NWBDataInterface):
{"from_schema": "core.nwb.behavior", "tree_root": True}
)
children: Optional[List[SpatialSeries]] = Field(
value: Optional[List[SpatialSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "SpatialSeries"}]}}
)
name: str = Field(...)

View file

@@ -27,6 +27,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}

View file

@@ -16,6 +16,7 @@ from pydantic import (
ValidationInfo,
BeforeValidator,
)
from ...core.v2_3_0.core_nwb_device import Device
from ...core.v2_3_0.core_nwb_base import (
TimeSeries,
TimeSeriesStartingTime,
@@ -43,6 +44,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@@ -68,7 +78,7 @@ ModelType = TypeVar("ModelType", bound=Type[BaseModel])
def _get_name(item: ModelType | dict, info: ValidationInfo) -> Union[ModelType, dict]:
"""Get the name of the slot that refers to this object"""
assert isinstance(item, (BaseModel, dict))
assert isinstance(item, (BaseModel, dict)), f"{item} was not a BaseModel or a dict!"
name = info.field_name
if isinstance(item, BaseModel):
item.name = name
@@ -112,37 +122,47 @@ class ElectricalSeries(TimeSeries):
description="""Filtering applied to all channels of the data. For example, if this ElectricalSeries represents high-pass-filtered data (also known as AP Band), then this value could be \"High-pass 4-pole Bessel filter at 500 Hz\". If this ElectricalSeries represents low-pass-filtered LFP data and the type of filter is unknown, then this value could be \"Low-pass filter at 300 Hz\". If a non-standard filter type is used, provide as much detail about the filter properties as possible.""",
)
data: Union[
NDArray[Shape["* num_times"], np.number],
NDArray[Shape["* num_times, * num_channels"], np.number],
NDArray[Shape["* num_times, * num_channels, * num_samples"], np.number],
NDArray[Shape["* num_times"], float],
NDArray[Shape["* num_times, * num_channels"], float],
NDArray[Shape["* num_times, * num_channels, * num_samples"], float],
] = Field(..., description="""Recorded voltage data.""")
electrodes: Named[DynamicTableRegion] = Field(
...,
description="""DynamicTableRegion pointer to the electrodes that this time series was generated from.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
channel_conversion: Optional[NDArray[Shape["* num_channels"], np.float32]] = Field(
channel_conversion: Optional[NDArray[Shape["* num_channels"], float]] = Field(
None,
description="""Channel-specific conversion factor. Multiply the data in the 'data' dataset by these values along the channel axis (as indicated by axis attribute) AND by the global conversion factor in the 'conversion' attribute of 'data' to get the data values in Volts, i.e, data in Volts = data * data.conversion * channel_conversion. This approach allows for both global and per-channel data conversion factors needed to support the storage of electrical recordings as native values generated by data acquisition systems. If this dataset is not present, then there is no channel-specific conversion factor, i.e. it is 1 for all channels.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_channels"}]}}},
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
timestamps: Optional[NDArray[Shape["* num_times"], np.float64]] = Field(
timestamps: Optional[NDArray[Shape["* num_times"], float]] = Field(
None,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -171,10 +191,10 @@ class SpikeEventSeries(ElectricalSeries):
name: str = Field(...)
data: Union[
NDArray[Shape["* num_events, * num_samples"], np.number],
NDArray[Shape["* num_events, * num_channels, * num_samples"], np.number],
NDArray[Shape["* num_events, * num_samples"], float],
NDArray[Shape["* num_events, * num_channels, * num_samples"], float],
] = Field(..., description="""Spike waveforms.""")
timestamps: NDArray[Shape["* num_times"], np.float64] = Field(
timestamps: NDArray[Shape["* num_times"], float] = Field(
...,
description="""Timestamps for samples stored in data, in seconds, relative to the common experiment master-clock stored in NWBFile.timestamps_reference_time. Timestamps are required for the events. Unlike for TimeSeries, timestamps are required for SpikeEventSeries and are thus re-specified here.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -187,24 +207,34 @@ class SpikeEventSeries(ElectricalSeries):
...,
description="""DynamicTableRegion pointer to the electrodes that this time series was generated from.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
channel_conversion: Optional[NDArray[Shape["* num_channels"], np.float32]] = Field(
channel_conversion: Optional[NDArray[Shape["* num_channels"], float]] = Field(
None,
description="""Channel-specific conversion factor. Multiply the data in the 'data' dataset by these values along the channel axis (as indicated by axis attribute) AND by the global conversion factor in the 'conversion' attribute of 'data' to get the data values in Volts, i.e, data in Volts = data * data.conversion * channel_conversion. This approach allows for both global and per-channel data conversion factors needed to support the storage of electrical recordings as native values generated by data acquisition systems. If this dataset is not present, then there is no channel-specific conversion factor, i.e. it is 1 for all channels.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_channels"}]}}},
)
description: Optional[str] = Field(None, description="""Description of the time series.""")
description: Optional[str] = Field(
"no description",
description="""Description of the time series.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no description)"}},
)
comments: Optional[str] = Field(
None,
"no comments",
description="""Human-readable comments about the TimeSeries. This second descriptive field can be used to store additional information, or descriptive information if the primary description field is populated with a computer-readable string.""",
json_schema_extra={"linkml_meta": {"ifabsent": "string(no comments)"}},
)
starting_time: Optional[TimeSeriesStartingTime] = Field(
None,
description="""Timestamp of the first sample in seconds. When timestamps are uniformly spaced, the timestamp of the first sample can be specified and all subsequent ones calculated from the sampling rate attribute.""",
)
control: Optional[NDArray[Shape["* num_times"], np.uint8]] = Field(
control: Optional[NDArray[Shape["* num_times"], int]] = Field(
None,
description="""Numerical labels that apply to each time point in data for the purpose of querying and slicing data by these values. If present, the length of this array should be the same size as the first dimension of data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_times"}]}}},
@@ -240,7 +270,7 @@ class FeatureExtraction(NWBDataInterface):
description="""Description of features (eg, ''PC1'') for each of the extracted features.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_features"}]}}},
)
features: NDArray[Shape["* num_events, * num_channels, * num_features"], np.float32] = Field(
features: NDArray[Shape["* num_events, * num_channels, * num_features"], float] = Field(
...,
description="""Multi-dimensional array of features extracted from each event.""",
json_schema_extra={
@@ -255,7 +285,7 @@ class FeatureExtraction(NWBDataInterface):
}
},
)
times: NDArray[Shape["* num_events"], np.float64] = Field(
times: NDArray[Shape["* num_events"], float] = Field(
...,
description="""Times of events that features correspond to (can be a link).""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_events"}]}}},
@@ -264,7 +294,12 @@ class FeatureExtraction(NWBDataInterface):
...,
description="""DynamicTableRegion pointer to the electrodes that this time series was generated from.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
@@ -285,16 +320,25 @@ class EventDetection(NWBDataInterface):
...,
description="""Description of how events were detected, such as voltage threshold, or dV/dT threshold, as well as relevant values.""",
)
source_idx: NDArray[Shape["* num_events"], np.int32] = Field(
source_idx: NDArray[Shape["* num_events"], int] = Field(
...,
description="""Indices (zero-based) into source ElectricalSeries::data array corresponding to time of event. ''description'' should define what is meant by time of event (e.g., .25 ms before action potential peak, zero-crossing time, etc). The index points to each event from the raw data.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_events"}]}}},
)
times: NDArray[Shape["* num_events"], np.float64] = Field(
times: NDArray[Shape["* num_events"], float] = Field(
...,
description="""Timestamps of events, in seconds.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_events"}]}}},
)
source_electricalseries: Union[ElectricalSeries, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "ElectricalSeries"}, {"range": "string"}],
}
},
)
class EventWaveform(NWBDataInterface):
@@ -306,7 +350,7 @@ class EventWaveform(NWBDataInterface):
{"from_schema": "core.nwb.ecephys", "tree_root": True}
)
children: Optional[List[SpikeEventSeries]] = Field(
value: Optional[List[SpikeEventSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "SpikeEventSeries"}]}}
)
name: str = Field(...)
@@ -321,7 +365,7 @@ class FilteredEphys(NWBDataInterface):
{"from_schema": "core.nwb.ecephys", "tree_root": True}
)
children: Optional[List[ElectricalSeries]] = Field(
value: Optional[List[ElectricalSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "ElectricalSeries"}]}}
)
name: str = Field(...)
@@ -336,7 +380,7 @@ class LFP(NWBDataInterface):
{"from_schema": "core.nwb.ecephys", "tree_root": True}
)
children: Optional[List[ElectricalSeries]] = Field(
value: Optional[List[ElectricalSeries]] = Field(
None, json_schema_extra={"linkml_meta": {"any_of": [{"range": "ElectricalSeries"}]}}
)
name: str = Field(...)
@@ -352,14 +396,23 @@ class ElectrodeGroup(NWBContainer):
)
name: str = Field(...)
description: Optional[str] = Field(None, description="""Description of this electrode group.""")
location: Optional[str] = Field(
None,
description: str = Field(..., description="""Description of this electrode group.""")
location: str = Field(
...,
description="""Location of electrode group. Specify the area, layer, comments on estimation of area/layer, etc. Use standard atlas names for anatomical regions when possible.""",
)
position: Optional[ElectrodeGroupPosition] = Field(
None, description="""stereotaxic or common framework coordinates"""
)
device: Union[Device, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "Device"}, {"range": "string"}],
}
},
)
class ElectrodeGroupPosition(ConfiguredBaseModel):
@@ -375,9 +428,21 @@ class ElectrodeGroupPosition(ConfiguredBaseModel):
"linkml_meta": {"equals_string": "position", "ifabsent": "string(position)"}
},
)
x: Optional[np.float32] = Field(None, description="""x coordinate""")
y: Optional[np.float32] = Field(None, description="""y coordinate""")
z: Optional[np.float32] = Field(None, description="""z coordinate""")
x: Optional[NDArray[Shape["*"], float]] = Field(
None,
description="""x coordinate""",
json_schema_extra={"linkml_meta": {"array": {"exact_number_dimensions": 1}}},
)
y: Optional[NDArray[Shape["*"], float]] = Field(
None,
description="""y coordinate""",
json_schema_extra={"linkml_meta": {"array": {"exact_number_dimensions": 1}}},
)
z: Optional[NDArray[Shape["*"], float]] = Field(
None,
description="""z coordinate""",
json_schema_extra={"linkml_meta": {"array": {"exact_number_dimensions": 1}}},
)
class ClusterWaveforms(NWBDataInterface):
@@ -396,7 +461,7 @@ class ClusterWaveforms(NWBDataInterface):
waveform_filtering: str = Field(
..., description="""Filtering applied to data before generating mean/sd"""
)
waveform_mean: NDArray[Shape["* num_clusters, * num_samples"], np.float32] = Field(
waveform_mean: NDArray[Shape["* num_clusters, * num_samples"], float] = Field(
...,
description="""The mean waveform for each cluster, using the same indices for each wave as cluster numbers in the associated Clustering module (i.e, cluster 3 is in array slot [3]). Waveforms corresponding to gaps in cluster sequence should be empty (e.g., zero- filled)""",
json_schema_extra={
@@ -405,7 +470,7 @@ class ClusterWaveforms(NWBDataInterface):
}
},
)
waveform_sd: NDArray[Shape["* num_clusters, * num_samples"], np.float32] = Field(
waveform_sd: NDArray[Shape["* num_clusters, * num_samples"], float] = Field(
...,
description="""Stdev of waveforms for each cluster, using the same indices as in mean""",
json_schema_extra={
@@ -414,6 +479,15 @@ class ClusterWaveforms(NWBDataInterface):
}
},
)
clustering_interface: Union[Clustering, str] = Field(
...,
json_schema_extra={
"linkml_meta": {
"annotations": {"source_type": {"tag": "source_type", "value": "link"}},
"any_of": [{"range": "Clustering"}, {"range": "string"}],
}
},
)
class Clustering(NWBDataInterface):
@@ -432,17 +506,17 @@ class Clustering(NWBDataInterface):
...,
description="""Description of clusters or clustering, (e.g. cluster 0 is noise, clusters curated using Klusters, etc)""",
)
num: NDArray[Shape["* num_events"], np.int32] = Field(
num: NDArray[Shape["* num_events"], int] = Field(
...,
description="""Cluster number of each event""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_events"}]}}},
)
peak_over_rms: NDArray[Shape["* num_clusters"], np.float32] = Field(
peak_over_rms: NDArray[Shape["* num_clusters"], float] = Field(
...,
description="""Maximum ratio of waveform peak to RMS on any channel in the cluster (provides a basic clustering metric).""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_clusters"}]}}},
)
times: NDArray[Shape["* num_events"], np.float64] = Field(
times: NDArray[Shape["* num_events"], float] = Field(
...,
description="""Times of clustered events, in seconds. This may be a link to times field in associated FeatureExtraction module.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_events"}]}}},
View file

@@ -37,6 +37,15 @@ class ConfiguredBaseModel(BaseModel):
)
object_id: Optional[str] = Field(None, description="Unique UUID for each object")
def __getitem__(self, val: Union[int, slice]) -> Any:
"""Try and get a value from value or "data" if we have it"""
if hasattr(self, "value") and self.value is not None:
return self.value[val]
elif hasattr(self, "data") and self.data is not None:
return self.data[val]
else:
raise KeyError("No value or data field to index from")
class LinkMLMeta(RootModel):
root: Dict[str, Any] = {}
@@ -62,7 +71,7 @@ ModelType = TypeVar("ModelType", bound=Type[BaseModel])
def _get_name(item: ModelType | dict, info: ValidationInfo) -> Union[ModelType, dict]:
"""Get the name of the slot that refers to this object"""
assert isinstance(item, (BaseModel, dict))
assert isinstance(item, (BaseModel, dict)), f"{item} was not a BaseModel or a dict!"
name = info.field_name
if isinstance(item, BaseModel):
item.name = name
@@ -96,7 +105,7 @@ class TimeIntervals(DynamicTable):
)
name: str = Field(...)
start_time: NDArray[Any, np.float32] = Field(
start_time: VectorData[NDArray[Any, float]] = Field(
...,
description="""Start time of epoch, in seconds.""",
json_schema_extra={
@@ -105,7 +114,7 @@ class TimeIntervals(DynamicTable):
}
},
)
stop_time: NDArray[Any, np.float32] = Field(
stop_time: VectorData[NDArray[Any, float]] = Field(
...,
description="""Stop time of epoch, in seconds.""",
json_schema_extra={
@@ -114,7 +123,7 @@ class TimeIntervals(DynamicTable):
}
},
)
tags: Optional[NDArray[Any, str]] = Field(
tags: VectorData[Optional[NDArray[Any, str]]] = Field(
None,
description="""User-defined tags that identify or categorize events.""",
json_schema_extra={
@@ -127,7 +136,12 @@ class TimeIntervals(DynamicTable):
None,
description="""Index for tags.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
timeseries: Optional[TimeIntervalsTimeseries] = Field(
@@ -137,17 +151,20 @@ class TimeIntervals(DynamicTable):
None,
description="""Index for timeseries.""",
json_schema_extra={
"linkml_meta": {"annotations": {"named": {"tag": "named", "value": True}}}
"linkml_meta": {
"annotations": {
"named": {"tag": "named", "value": True},
"source_type": {"tag": "source_type", "value": "neurodata_type_inc"},
}
}
},
)
colnames: Optional[str] = Field(
None,
colnames: List[str] = Field(
...,
description="""The names of the columns in this table. This should be used to specify an order to the columns.""",
)
description: Optional[str] = Field(
None, description="""Description of what is in this dynamic table."""
)
id: NDArray[Shape["* num_rows"], int] = Field(
description: str = Field(..., description="""Description of what is in this dynamic table.""")
id: VectorData[NDArray[Shape["* num_rows"], int]] = Field(
...,
description="""Array of unique identifiers for the rows of this dynamic table.""",
json_schema_extra={"linkml_meta": {"array": {"dimensions": [{"alias": "num_rows"}]}}},
@@ -170,21 +187,23 @@ class TimeIntervalsTimeseries(VectorData):
"linkml_meta": {"equals_string": "timeseries", "ifabsent": "string(timeseries)"}
},
)
idx_start: Optional[np.int32] = Field(
idx_start: Optional[NDArray[Shape["*"], int]] = Field(
None,
description="""Start index into the TimeSeries 'data' and 'timestamp' datasets of the referenced TimeSeries. The first dimension of those arrays is always time.""",
json_schema_extra={"linkml_meta": {"array": {"exact_number_dimensions": 1}}},
)
count: Optional[np.int32] = Field(
count: Optional[NDArray[Shape["*"], int]] = Field(
None,
description="""Number of data samples available in this time series, during this epoch.""",
json_schema_extra={"linkml_meta": {"array": {"exact_number_dimensions": 1}}},
)
timeseries: Optional[TimeSeries] = Field(
None, description="""the TimeSeries that this index applies to."""
timeseries: Optional[NDArray[Shape["*"], TimeSeries]] = Field(
None,
description="""the TimeSeries that this index applies to.""",
json_schema_extra={"linkml_meta": {"array": {"exact_number_dimensions": 1}}},
)
description: Optional[str] = Field(
None, description="""Description of what these vectors represent."""
)
array: Optional[
description: str = Field(..., description="""Description of what these vectors represent.""")
value: Optional[
Union[
NDArray[Shape["* dim0"], Any],
NDArray[Shape["* dim0, * dim1"], Any],

Some files were not shown because too many files have changed in this diff.