{ "cells": [ { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false }, "outputs": [], "source": [ "%matplotlib inline" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "\n# Creating a new combined workflow\n\nA ``CombinedWorkflow`` is a series of DIPY_ workflows organized together in a\nway that the output of a workflow serves as input for the next one.\n\nFirst create your ``CombinedWorkflow`` class. Your ``CombinedWorkflow`` class\nfile is usually located in the ``dipy/workflows`` directory.\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false }, "outputs": [], "source": [ "from dipy.workflows.combined_workflow import CombinedWorkflow" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "``CombinedWorkflow`` is the base class that will be extended to create our\ncombined workflow.\n\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false }, "outputs": [], "source": [ "from dipy.workflows.denoise import NLMeansFlow\nfrom dipy.workflows.segment import MedianOtsuFlow" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "``MedianOtsuFlow`` and ``NLMeansFlow`` will be combined to create our\nprocessing section.\n\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false }, "outputs": [], "source": [ "class DenoiseAndSegment(CombinedWorkflow):\n\n \"\"\"\n ``DenoiseAndSegment`` is the name of our combined workflow. Note that\n it needs to extend CombinedWorkflow for everything to work properly.\n \"\"\"\n\n def _get_sub_flows(self):\n return [\n NLMeansFlow,\n MedianOtsuFlow\n ]\n\n \"\"\"\n It is mandatory to implement this method if you want to make all the sub\n workflows parameters available in commandline.\n \"\"\"\n\n def run(self, input_files, out_dir='', out_file='processed.nii.gz'):\n \"\"\"\n Parameters\n ----------\n input_files : string\n Path to the input files. This path may contain wildcards to\n process multiple inputs at once.\n\n out_dir : string, optional\n Where the resulting file will be saved. (default '')\n\n out_file : string, optional\n Name of the result file to be saved. (default 'processed.nii.gz')\n \"\"\"\n\n \"\"\"\n Just like a normal workflow, it is mandatory to have out_dir as a\n parameter. It is also mandatory to put 'out_' in front of every\n parameter that is going to be an output. Lastly, all out_ params needs\n to be at the end of the params list.\n\n The class docstring part is very important, you need to document\n every parameter as they will be used with inspection to build the\n command line argument parser.\n \"\"\"\n\n io_it = self.get_io_iterator()\n\n for in_file, out_file in io_it:\n nl_flow = NLMeansFlow()\n self.run_sub_flow(nl_flow, in_file, out_dir=out_dir)\n denoised = nl_flow.last_generated_outputs['out_denoised']\n\n me_flow = MedianOtsuFlow()\n self.run_sub_flow(me_flow, denoised, out_dir=out_dir)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Use ``self.get_io_iterator()`` in every workflow you create. This creates\nan ``IOIterator`` object that create output file names and directory structure\nbased on the inputs and some other advanced output strategy parameters.\n\nIterating on the ``IOIterator`` object you created previously you\nconveniently get all input and output paths for every input file\nfound when globbin the input parameters.\n\nIn the ``IOIterator`` loop you can see how we create a new ``NLMeans`` workflow\nthen run it using ``self.run_sub_flow``. 
, { "cell_type": "markdown", "metadata": {}, "source": [ "That is it for the combined workflow class! Now, to be able to call it easily\nvia the command line, you need this last bit of code. It usually lives in an\nexecutable file located in ``bin``.\n\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false }, "outputs": [], "source": [ "from dipy.workflows.flow_runner import run_flow" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "``run_flow`` is the method that wraps everything needed to make a workflow\nready and then runs it.\n\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false }, "outputs": [], "source": [ "# if __name__ == \"__main__\":\n#     run_flow(DenoiseAndSegment())" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "This is the only thing needed to make your workflow available through the\ncommand line.\n\nNow call the script you just made with ``-h`` to see the argparser help\ntext::\n\n    python combined_workflow_creation.py --help\n\nYou should see all your parameters available along with some extra common\nones like a log file and force overwrite. All the documentation you wrote\nabout each parameter is there, and every sub-workflow's optional parameters\nare available as well.\n\nNow call it for real with a NIfTI file and experiment with the parameters to\nsee the results::\n\n    python combined_workflow_creation.py volume.nii.gz\n\n.. include:: ../links_names.inc\n\n\n" ] }
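, { "cell_type": "markdown", "metadata": {}, "source": [ "For completeness, here is a sketch of what such an executable file could\ncontain. The module path ``dipy.workflows.my_combined`` is hypothetical;\npoint the import at wherever you placed the ``DenoiseAndSegment`` class. The\nbody is commented out so the notebook runs without it.\n\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false }, "outputs": [], "source": [ "# Hypothetical executable script, e.g. saved as bin/dipy_denoise_segment:\n#\n# from dipy.workflows.flow_runner import run_flow\n# from dipy.workflows.my_combined import DenoiseAndSegment  # assumed path\n#\n# if __name__ == \"__main__\":\n#     run_flow(DenoiseAndSegment())" ] } ], "metadata": { "kernelspec": { "display_name": "Python 3", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.9.13" } }, "nbformat": 4, "nbformat_minor": 0 }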