This repository has been archived by the owner on Nov 19, 2024. It is now read-only.

Merge upstream changes from dandi-cli for pydantic 2.0
Aaron Kanzer authored and Aaron Kanzer committed Mar 13, 2024
2 parents 2a49b7f + ee18a15 commit 82294f0
Showing 26 changed files with 368 additions and 171 deletions.
2 changes: 1 addition & 1 deletion .pre-commit-config.yaml
@@ -24,7 +24,7 @@ repos:
- id: codespell
exclude: ^(dandi/_version\.py|dandi/due\.py|versioneer\.py)$
- repo: https://github.com/PyCQA/flake8
-    rev: 4.0.1
+    rev: 7.0.0
hooks:
- id: flake8
exclude: ^(dandi/_version\.py|dandi/due\.py|versioneer\.py)$
6 changes: 3 additions & 3 deletions lincbrain/cli/cmd_download.py
@@ -9,7 +9,7 @@
from ..dandiarchive import _dandi_url_parser, parse_dandi_url
from ..dandiset import Dandiset
from ..download import DownloadExisting, DownloadFormat, PathType
-from ..utils import get_instance
+from ..utils import get_instance, joinurl


# The use of f-strings apparently makes this not a proper docstring, and so
@@ -131,9 +131,9 @@ def download(
pass
else:
if instance.gui is not None:
-            url = [f"{instance.gui}/#/dandiset/{dandiset_id}/draft"]
+            url = [joinurl(instance.gui, f"/#/dandiset/{dandiset_id}/draft")]
else:
-            url = [f"{instance.api}/dandisets/{dandiset_id}/"]
+            url = [joinurl(instance.api, f"/dandisets/{dandiset_id}/")]

return download.download(
url,
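The switch from f-string concatenation to `joinurl` above guards against doubled or missing slashes when an instance URL carries (or lacks) a trailing `/`. A hypothetical stand-in for `lincbrain.utils.joinurl` — not the real implementation — might look like:

```python
# Hypothetical sketch of a joinurl-style helper (illustrative only, not
# the actual lincbrain.utils.joinurl): join a base URL and a path
# without producing doubled or missing slashes.
def joinurl(base: str, path: str) -> str:
    if path.startswith(("http://", "https://")):
        return path  # already an absolute URL; leave it untouched
    return base.rstrip("/") + "/" + path.lstrip("/")


url = joinurl("https://api.dandiarchive.org/api/", "/dandisets/000027/")
print(url)  # https://api.dandiarchive.org/api/dandisets/000027/
```

Naive f-string concatenation would have produced `.../api//dandisets/...` for the same inputs.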
6 changes: 3 additions & 3 deletions lincbrain/cli/cmd_ls.py
@@ -96,8 +96,8 @@ def ls(
all_fields = tuple(
sorted(
set(common_fields)
-            | models.Dandiset.__fields__.keys()
-            | models.Asset.__fields__.keys()
+            | models.Dandiset.model_fields.keys()
+            | models.Asset.model_fields.keys()
)
)
else:
@@ -345,7 +345,7 @@ def fn():
path,
schema_version=schema,
digest=Digest.dandi_etag(digest),
-                ).json_dict()
+                ).model_dump(mode="json", exclude_none=True)
else:
if path.endswith(tuple(ZARR_EXTENSIONS)):
if use_fake_digest:
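The `__fields__` → `model_fields` rename above follows pydantic v2's field-introspection API: `model_fields` is a plain dict on the model class, so its keys union cleanly with sets. A minimal sketch, assuming pydantic >= 2 is installed — the `Dandiset`/`Asset` models here are illustrative stand-ins, not the actual dandischema classes:

```python
# Minimal sketch of the pydantic v1 -> v2 field-introspection change,
# assuming pydantic >= 2; the models are illustrative, not dandischema's.
from pydantic import BaseModel


class Dandiset(BaseModel):
    identifier: str
    name: str


class Asset(BaseModel):
    identifier: str
    path: str


common_fields = {"identifier"}

# v1:  Dandiset.__fields__.keys() | Asset.__fields__.keys()
# v2:  model_fields is a plain dict, so .keys() unions with sets directly
all_fields = tuple(
    sorted(common_fields | Dandiset.model_fields.keys() | Asset.model_fields.keys())
)
print(all_fields)  # ('identifier', 'name', 'path')
```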
2 changes: 1 addition & 1 deletion lincbrain/cli/cmd_service_scripts.py
@@ -104,7 +104,7 @@ def reextract_metadata(url: str, diff: bool, when: str) -> None:
lgr.info("Extracting new metadata for asset")
metadata = nwb2asset(asset.as_readable(), digest=digest)
metadata.path = asset.path
-    mddict = metadata.json_dict()
+    mddict = metadata.model_dump(mode="json", exclude_none=True)
if diff:
oldmd = asset.get_raw_metadata()
oldmd_str = yaml_dump(oldmd)
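The `json_dict()` → `model_dump(mode="json", exclude_none=True)` change above swaps a dandischema helper for pydantic v2's built-in serializer: `mode="json"` coerces values to JSON-compatible types, and `exclude_none=True` drops unset optional fields. A hedged sketch with an illustrative model (not the real asset metadata class):

```python
# Sketch of pydantic v2's model_dump replacing a json_dict() helper,
# assuming pydantic >= 2; AssetMeta is illustrative, not dandischema's.
from typing import Optional

from pydantic import BaseModel


class AssetMeta(BaseModel):
    path: str
    digest: Optional[str] = None


meta = AssetMeta(path="sub-01/sub-01.nwb")

# mode="json" renders JSON-compatible values (e.g. datetimes as strings);
# exclude_none=True omits the unset optional digest field
mddict = meta.model_dump(mode="json", exclude_none=True)
print(mddict)  # {'path': 'sub-01/sub-01.nwb'}
```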
@@ -386,7 +386,7 @@
"includeInCitation": true
}
],
-  "dateCreated": "2023-04-25T16:28:26.500181+00:00",
+  "dateCreated": "2023-04-25T16:28:26.500181Z",
"description": "<jats:p>Progress in science requires standardized assays whose results can be readily shared, compared, and reproduced across laboratories. Reproducibility, however, has been a concern in neuroscience, particularly for measurements of mouse behavior. Here we show that a standardized task to probe decision-making in mice produces reproducible results across multiple laboratories. We designed a task for head-fixed mice that combines established assays of perceptual and value-based decision making, and we standardized training protocol and experimental hardware, software, and procedures. We trained 140 mice across seven laboratories in three countries, and we collected 5 million mouse choices into a publicly available database. Learning speed was variable across mice and laboratories, but once training was complete there were no significant differences in behavior across laboratories. Mice in different laboratories adopted similar reliance on visual stimuli, on past successes and failures, and on estimates of stimulus prior probability to guide their choices. These results reveal that a complex mouse behavior can be successfully reproduced across multiple laboratories. They establish a standard for reproducible rodent behavior, and provide an unprecedented dataset and open-access tools to study decision-making in mice. More generally, they indicate a path towards achieving reproducibility in neuroscience through collaborative open-science approaches.</jats:p>",
"assetsSummary": {
"schemaKey": "AssetsSummary",
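The fixture change from `+00:00` to `Z` above reflects pydantic v2's serializer, which renders UTC datetimes with a trailing `Z` rather than an explicit `+00:00` offset. A small check, assuming pydantic >= 2 is installed — the `Record` model is illustrative:

```python
# Illustrative check of pydantic v2's UTC datetime serialization,
# assuming pydantic >= 2; Record is not a real dandischema model.
from datetime import datetime, timezone

from pydantic import BaseModel


class Record(BaseModel):
    dateCreated: datetime


rec = Record(
    dateCreated=datetime(2023, 4, 25, 16, 28, 26, 500181, tzinfo=timezone.utc)
)
# pydantic v2 emits a trailing "Z" for UTC, not "+00:00" as v1 did
print(rec.model_dump(mode="json")["dateCreated"])
```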
@@ -105,7 +105,7 @@
"includeInCitation": true
}
],
-  "dateCreated": "2023-04-25T16:28:30.453019+00:00",
+  "dateCreated": "2023-04-25T16:28:30.453019Z",
"description": "<jats:p>Proprioception, the sense of body position, movement, and associated forces, remains poorly understood, despite its critical role in movement. Most studies of area 2, a proprioceptive area of somatosensory cortex, have simply compared neurons\u2019 activities to the movement of the hand through space. Using motion tracking, we sought to elaborate this relationship by characterizing how area 2 activity relates to whole arm movements. We found that a whole-arm model, unlike classic models, successfully predicted how features of neural activity changed as monkeys reached to targets in two workspaces. However, when we then evaluated this whole-arm model across active and passive movements, we found that many neurons did not consistently represent the whole arm over both conditions. These results suggest that 1) neural activity in area 2 includes representation of the whole arm during reaching and 2) many of these neurons represented limb state differently during active and passive movements.</jats:p>",
"assetsSummary": {
"schemaKey": "AssetsSummary",
@@ -45,7 +45,7 @@
"includeInCitation": true
}
],
-  "dateCreated": "2023-04-25T16:28:28.308094+00:00",
+  "dateCreated": "2023-04-25T16:28:28.308094Z",
"description": "<jats:p>Reinforcement learning theory plays a key role in understanding the behavioral and neural mechanisms of choice behavior in animals and humans. Especially, intermediate variables of learning models estimated from behavioral data, such as the expectation of reward for each candidate choice (action value), have been used in searches for the neural correlates of computational elements in learning and decision making. The aims of the present study are as follows: (1) to test which computational model best captures the choice learning process in animals and (2) to elucidate how action values are represented in different parts of the corticobasal ganglia circuit. We compared different behavioral learning algorithms to predict the choice sequences generated by rats during a free-choice task and analyzed associated neural activity in the nucleus accumbens (NAc) and ventral pallidum (VP). The major findings of this study were as follows: (1) modified versions of an action\u2013value learning model captured a variety of choice strategies of rats, including win-stay\u2013lose-switch and persevering behavior, and predicted rats' choice sequences better than the best multistep Markov model; and (2) information about action values and future actions was coded in both the NAc and VP, but was less dominant than information about trial types, selected actions, and reward outcome. The results of our model-based analysis suggest that the primary role of the NAc and VP is to monitor information important for updating choice behaviors. Information represented in the NAc and VP might contribute to a choice mechanism that is situated elsewhere.</jats:p>",
"assetsSummary": {
"schemaKey": "AssetsSummary",
@@ -46,7 +46,7 @@
"includeInCitation": true
}
],
-  "dateCreated": "2023-04-25T16:28:31.601155+00:00",
+  "dateCreated": "2023-04-25T16:28:31.601155Z",
"description": "<jats:title>Abstract</jats:title><jats:p>Spatial cognition depends on an accurate representation of orientation within an environment. Head direction cells in distributed brain regions receive a range of sensory inputs, but visual input is particularly important for aligning their responses to environmental landmarks. To investigate how population-level heading responses are aligned to visual input, we recorded from retrosplenial cortex (RSC) of head-fixed mice in a moving environment using two-photon calcium imaging. We show that RSC neurons are tuned to the animal\u2019s relative orientation in the environment, even in the absence of head movement. Next, we found that RSC receives functionally distinct projections from visual and thalamic areas and contains several functional classes of neurons. While some functional classes mirror RSC inputs, a newly discovered class coregisters visual and thalamic signals. Finally, decoding analyses reveal unique contributions to heading from each class. Our results suggest an RSC circuit for anchoring heading representations to environmental visual landmarks.</jats:p>",
"assetsSummary": {
"schemaKey": "AssetsSummary",
@@ -45,7 +45,7 @@
"includeInCitation": true
}
],
-  "dateCreated": "2023-04-25T16:28:29.373034+00:00",
+  "dateCreated": "2023-04-25T16:28:29.373034Z",
"description": "A test Dandiset",
"assetsSummary": {
"schemaKey": "AssetsSummary",
