
[BUG] select_gradients node breaks custom ss3t yaml pipeline. #195

Closed
chiuhoward opened this issue Dec 7, 2024 · 2 comments
Labels
bug Something isn't working

Comments

@chiuhoward
Contributor

Summary

When I use the new YAML files that include this chunk, I get an error:

-   action: select_gradients
    input: qsirecon
    name: select_single_shell
    parameters:
        requested_shells:
            - 0
            - highest
        bval_distance_cutoff: 100
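
For context, my reading of these parameters (this is an assumption about the intended semantics, not QSIRecon's actual implementation): keep the b=0 volumes and the highest shell, treating any b-value within `bval_distance_cutoff` of a requested shell as belonging to it. A minimal sketch:

```python
import numpy as np

def select_gradients(bvals, requested_shells, bval_distance_cutoff):
    """Hypothetical sketch: return a mask of volumes whose b-value falls
    within `bval_distance_cutoff` of any requested shell."""
    # Resolve the symbolic shell name "highest" to a numeric b-value.
    shells = [max(bvals) if s == "highest" else s for s in requested_shells]
    bvals = np.asarray(bvals)
    mask = np.zeros(len(bvals), dtype=bool)
    for shell in shells:
        mask |= np.abs(bvals - shell) <= bval_distance_cutoff
    return mask

# With requested_shells [0, "highest"] and a cutoff of 100, the b=0 and
# b~2000 volumes are kept and the intermediate b=1000 shell is dropped.
print(select_gradients([0, 5, 1000, 2000, 1995], [0, "highest"], 100))
```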

I reran the exact same shell script with the old YAML file (without this chunk) and it executed successfully.

Additional details

  • QSIRecon version: 1.0.0rc3.dev0+g619341b.d20241121
  • Docker version: NA
  • Singularity version: 1.3.2-1.el7 (Apptainer)

What were you trying to do?

I was trying to run a custom ss3t_noACT pipeline by customizing my YAML file. I downloaded the latest YAML file from the qsirecon repo but did not notice that an additional node had been included, and my code crashed.

What did you expect to happen?

I expected my code to run as per usual.

What actually happened?

Process Process-2:
Traceback (most recent call last):
  File "/opt/conda/envs/qsiprep/lib/python3.10/multiprocessing/process.py", line 314, in _bootstrap
    self.run()
  File "/opt/conda/envs/qsiprep/lib/python3.10/multiprocessing/process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/qsirecon/cli/workflow.py", line 133, in build_workflow
    retval["workflow"] = init_qsirecon_wf()
  File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/qsirecon/workflows/base.py", line 44, in init_qsirecon_wf
    single_subject_wf = init_single_subject_recon_wf(subject_id=subject_id)
  File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/qsirecon/workflows/base.py", line 268, in init_single_subject_recon_wf
    dwi_recon_wfs[dwi_file] = init_dwi_recon_workflow(
  File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/qsirecon/workflows/recon/build_workflow.py", line 70, in init_dwi_recon_workflow
    new_node = workflow_from_spec(
  File "/opt/conda/envs/qsiprep/lib/python3.10/site-packages/qsirecon/workflows/recon/build_workflow.py", line 290, in workflow_from_spec
    raise Exception("Unknown node %s" % node_spec)
Exception: Unknown node {'action': 'select_gradients', 'input': 'qsirecon', 'name': 'select_single_shell', 'parameters': {'requested_shells': [0, 'highest'], 'bval_distance_cutoff': 100}}
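
The traceback points to a spec-dispatch failure: the rc2 container's workflow builder has no handler registered for the new `select_gradients` action, so parsing the node spec can only fail. A minimal sketch of that failure mode (names here are illustrative, not QSIRecon's actual internals):

```python
# Hypothetical sketch of spec-driven node dispatch. An older container's
# action registry predates select_gradients, so any spec using it is rejected.
KNOWN_ACTIONS = {"csd", "tractography"}  # assumed rc2-era registry

def workflow_from_spec(node_spec):
    action = node_spec.get("action")
    if action not in KNOWN_ACTIONS:
        # Mirrors the error raised in build_workflow.py above.
        raise Exception("Unknown node %s" % node_spec)
    return f"built-{action}"

print(workflow_from_spec({"action": "csd"}))  # dispatches fine
```

This is why the same YAML works once the container is updated: the newer image registers a builder for `select_gradients`.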

Reproducing the bug

This shell script ran with the first YAML file but not the second (I modified the absolute YAML path in the shell script to point to either):

export APPTAINERENV_FS_LICENSE=$GROUP_HOME/freesurferlicense.txt

singularity run --writable-tmpfs --cleanenv \
/home/groups/jyeatman/software/singularity_images/qsirecon-1.0.0rc2.sif \
/scratch/groups/jyeatman/howard/DBPFullwT1derivNov24-mrdegibbs-ignorefmap \
/scratch/groups/jyeatman/howard/DBPFullwT1derivNov24-mrdegibbs-ignorefmap-recon-sl/ participant --participant-label sub-$SLURM_ARRAY_TASK_ID \
-w /scratch/groups/jyeatman/howard/DBPFullwT1derivNov24-mrdegibbs-ignorefmap-recon-sl-work \
--infant \
--recon-spec /home/groups/jyeatman/software/howard/mrtrix_singleshell_ss3t_noACT_noselectgrad.yaml \
--fs-license-file $GROUP_HOME/freesurferlicense.txt \
--output-resolution 2 \
-v -v

This works:

anatomical: []
name: mrtrix_singleshell_ss3t_noACT
nodes:
-   action: csd
    input: qsirecon
    name: ss3t_csd
    parameters:
        fod:
            algorithm: ss3t
        mtnormalize: true
        response:
            algorithm: dhollander
    qsirecon_suffix: MRtrix3_fork-SS3T_act-None
    software: MRTrix3
-   action: tractography
    input: ss3t_csd
    name: track_ifod2
    parameters:
        sift2: {}
        tckgen:
            algorithm: iFOD2
            quiet: true
            select: 20000000.0
        use_5tt: false
        use_sift2: true
    qsirecon_suffix: MRtrix3_fork-SS3T_act-None
    software: MRTrix3
space: T1w

This didn't:

anatomical: []
name: mrtrix_singleshell_ss3t_noACT
nodes:

-   action: select_gradients
    input: qsirecon
    name: select_single_shell
    parameters:
        requested_shells:
            - 0
            - highest
        bval_distance_cutoff: 100

-   action: csd
    input: select_single_shell
    name: ss3t_csd
    parameters:
        fod:
            algorithm: ss3t
        mtnormalize: true
        response:
            algorithm: dhollander
    qsirecon_suffix: MRtrix3_fork-SS3T_act-None
    software: MRTrix3
-   action: tractography
    input: ss3t_csd
    name: track_ifod2
    parameters:
        sift2: {}
        tckgen:
            algorithm: iFOD2
            max_length: 250
            min_length: 30
            quiet: true
            select: 20000000.0
        use_5tt: false
        use_sift2: true
    qsirecon_suffix: MRtrix3_fork-SS3T_act-None
    software: MRTrix3
space: T1w
@chiuhoward added the bug label on Dec 7, 2024
@chiuhoward
Contributor Author

Sorry, I think #194 will fix this; I'll try again after pulling the latest container and update here.

@chiuhoward
Contributor Author

Error goes away in 1.0.0rc3.dev2+gf95211a.d20241206!
