diff --git a/Makefile b/Makefile
index d0f6e4a4ce..8c7856e7ca 100644
--- a/Makefile
+++ b/Makefile
@@ -15,8 +15,6 @@ sdist: zipdoc
 	python setup.py sdist
 	@echo "Done building source distribution."
 	# XXX copy documentation.zip to dist directory.
-	# XXX Somewhere the doc/_build directory is removed and causes
-	# this script to fail.
 
 egg: zipdoc
 	@echo "Building egg..."
@@ -45,7 +43,10 @@ clean-build:
 clean-ctags:
 	rm -f tags
 
-clean: clean-build clean-pyc clean-so clean-ctags
+clean-doc:
+	rm -rf doc/_build
+
+clean: clean-build clean-pyc clean-so clean-ctags clean-doc
 
 in: inplace # just a shortcut
 inplace:
diff --git a/doc/users/install.rst b/doc/users/install.rst
index c46865ff19..5f96c040c7 100644
--- a/doc/users/install.rst
+++ b/doc/users/install.rst
@@ -76,8 +76,14 @@ nose_ installed, then do the following::
 
 you can also test with nosetests::
 
-    nosetests --with-doctest /software/nipy-repo/masternipype/nipype
-    --exclude=external --exclude=testing
+    nosetests --with-doctest /nipype --exclude=external --exclude=testing
+
+A successful test run should complete in a few minutes and end with
+something like::
+
+    Ran 13053 tests in 126.618s
+
+    OK (SKIP=66)
 
 All tests should pass (unless you're missing a dependency). If SUBJECTS_DIR
 variable is not set some FreeSurfer related tests will fail. If any tests
@@ -89,9 +95,9 @@ tests::
 
     export MATLABCMD=$pathtomatlabdir/bin/$platform/MATLAB
 
-where, $pathtomatlabdir is the path to your matlab installation and
-$platform is the directory referring to x86 or x64 installations
-(typically glnxa64 on 64-bit installations).
+where ``$pathtomatlabdir`` is the path to your matlab installation and
+``$platform`` is the directory referring to x86 or x64 installations
+(typically ``glnxa64`` on 64-bit installations).
 
 Avoiding any MATLAB calls from testing
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
@@ -119,7 +125,7 @@ Must Have
 Nibabel_ 1.0 - 1.4
   Neuroimaging file i/o library.
 
-Python_ 2.7
+Python_ 2.7 or Python_ 3
 
 NetworkX_ 1.0 - 1.8
   Python package for working with complex networks.
diff --git a/doc/users/pipeline_tutorial.rst b/doc/users/pipeline_tutorial.rst
index 85abf384ed..463c879386 100644
--- a/doc/users/pipeline_tutorial.rst
+++ b/doc/users/pipeline_tutorial.rst
@@ -59,29 +59,12 @@ Checklist for analysis tutorials
 For the analysis tutorials, we will be using a slightly modified version of
 the FBIRN Phase I travelling data set.
 
-Step 0
-~~~~~~
+# Download and extract the `Pipeline tutorial data (429MB).
+`_
+(md5: d175083784c5167de4ea11b43b37c166)
 
-Download and extract the `Pipeline tutorial data (429MB).
-`_
-
-(checksum: 56ed4b7e0aac5627d1724e9c10cd26a7)
-
-
-Step 1.
-~~~~~~~
-
-Ensure that all programs are available by calling ``bet``, ``matlab``
-and then ``which spm`` within matlab to ensure you have spm5/8 in your
+# Ensure that all programs are available by calling ``bet``, ``matlab``
+and then ``which spm`` within matlab to ensure you have spm5/8/12 in your
 matlab path.
 
-Step 2.
-~~~~~~~
-
-You can now run the tutorial by typing ``python tutorial_script.py``
-within the nipype-tutorial directory. This will run a full first level
-analysis on two subjects following by a 1-sample t-test on their first
-level results. The next section goes through each section of the
-tutorial script and describes what it is doing.
-
 .. include:: ../links_names.txt
diff --git a/doc/users/tutorial_101.rst b/doc/users/tutorial_101.rst
index 64b9be2a3c..a6b7a1333e 100644
--- a/doc/users/tutorial_101.rst
+++ b/doc/users/tutorial_101.rst
@@ -75,7 +75,8 @@ realigner to the smoother in step 5.
 **3. Creating and configuring a workflow**
 
 Here we create an instance of a workflow and indicate that it should operate in
-the current directory.
+the current directory. The workflow's output will be placed in the ``preproc``
+directory.
 
 .. testcode::
 
@@ -128,11 +129,13 @@ above were generated using this.
 
    workflow.write_graph()
 
-This creates two files graph.dot and graph_detailed.dot and if
+This creates two files ``graph.dot`` and ``graph_detailed.dot`` inside
+``./preproc`` and if
 graphviz_ is installed on your system it automatically converts it to png
 files. If graphviz is not installed you can take the dot files and load them
 in a graphviz visualizer elsewhere. You can specify how detailed
-the graph is going to be, by using "graph2use" argument which takes the following
+the graph is going to be, by using the ``graph2use`` argument which takes
+the following
 options:
 
 * hierarchical - creates a graph showing all embedded workflows (default)
@@ -152,9 +155,11 @@ above pipeline.
    import nipype.algorithms.rapidart as ra
    artdetect = pe.Node(interface=ra.ArtifactDetect(), name='artdetect')
   artdetect.inputs.use_differences = [True, False]
-   art.inputs.use_norm = True
-   art.inputs.norm_threshold = 0.5
-   art.inputs.zintensity_threshold = 3
+   artdetect.inputs.use_norm = True
+   artdetect.inputs.norm_threshold = 0.5
+   artdetect.inputs.zintensity_threshold = 3
+   artdetect.inputs.parameter_source = "SPM"
+   artdetect.inputs.mask_type = "spm_global"
   workflow.connect([(realigner, artdetect,
                      [('realigned_files', 'realigned_files'),
                       ('realignment_parameters','realignment_parameters')]
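
For anyone reviewing the ``tutorial_101.rst`` change above, here is a minimal, self-contained sketch of how the renamed ``artdetect`` node fits into the tutorial's ``preproc`` workflow. It is illustrative only: the ``realign`` node and the placeholder input ``somefuncrun.nii`` are assumed from the tutorial's earlier steps, and actually running it requires nipype with SPM/MATLAB available::

    # Sketch only: mirrors the tutorial_101 snippet being patched above.
    # 'somefuncrun.nii' is a placeholder and must point at real NIfTI data
    # for the workflow to execute.
    import nipype.interfaces.spm as spm
    import nipype.pipeline.engine as pe
    import nipype.algorithms.rapidart as ra

    workflow = pe.Workflow(name='preproc')
    workflow.base_dir = '.'

    # Motion-correction node from the earlier tutorial steps.
    realigner = pe.Node(interface=spm.Realign(), name='realign')
    realigner.inputs.in_files = 'somefuncrun.nii'
    realigner.inputs.register_to_mean = True

    # Artifact-detection node configured as in the updated snippet.
    artdetect = pe.Node(interface=ra.ArtifactDetect(), name='artdetect')
    artdetect.inputs.use_differences = [True, False]
    artdetect.inputs.use_norm = True
    artdetect.inputs.norm_threshold = 0.5
    artdetect.inputs.zintensity_threshold = 3
    artdetect.inputs.parameter_source = "SPM"
    artdetect.inputs.mask_type = "spm_global"

    # Feed realigned images and motion parameters into artifact detection.
    workflow.connect([(realigner, artdetect,
                       [('realigned_files', 'realigned_files'),
                        ('realignment_parameters', 'realignment_parameters')])])

    # With base_dir set to '.', this writes graph.dot and graph_detailed.dot
    # under ./preproc, matching the updated wording in the docs.
    workflow.write_graph()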