nn

bench([label, verbose, extra_argv])

Run benchmarks for module using nose.

test([label, verbose, extra_argv, doctests, ...])

Run tests for module using nose.

Module: nn.histo_resdnn

Class and helper functions for fitting the Histological ResDNN model.

HemiSphere([x, y, z, theta, phi, xyz, ...])

Points on the unit sphere.

HistoResDNN([sh_order, basis_type, verbose])

This class is intended for the ResDNN Histology Network model.

Version(version)

This class abstracts handling of a project's versions.

doctest_skip_parser(func)

Decorator replaces custom skip test markup in doctests.

get_bval_indices(bvals, bval[, tol])

Get indices where the b-value is bval

get_fnames([name])

Provide full paths to example or test datasets.

get_sphere([name])

Provide triangulated spheres.

optional_package(name[, trip_msg])

Return package-like thing and module setup for package name

set_logger_level(log_level)

Change the log level of the HistoResDNN logger to one of the following: DEBUG, INFO, WARNING, CRITICAL, ERROR

sf_to_sh(sf, sphere[, sh_order, basis_type, ...])

Spherical function to spherical harmonics (SH).

sh_to_sf(sh, sphere[, sh_order, basis_type, ...])

Spherical harmonics (SH) to spherical function (SF).

sph_harm_ind_list(sh_order[, full_basis])

Returns the degree (m) and order (n) of all the symmetric spherical harmonics of degree less than or equal to sh_order.

unique_bvals_magnitude(bvals[, bmag, rbvals])

This function gives the unique rounded b-values of the data

Module: nn.model

MultipleLayerPercepton([input_shape, ...])

Multiple Layer Perceptron with Dropout.

SingleLayerPerceptron([input_shape, ...])

Single Layer Perceptron with Dropout.

Version(version)

This class abstracts handling of a project's versions.

optional_package(name[, trip_msg])

Return package-like thing and module setup for package name

bench

dipy.nn.bench(label='fast', verbose=1, extra_argv=None)

Run benchmarks for module using nose.

Parameters:
label : {‘fast’, ‘full’, ‘’, attribute identifier}, optional

Identifies the benchmarks to run. This can be a string to pass to the nosetests executable with the ‘-A’ option, or one of several special values. Special values are:

  • ‘fast’ - the default - which corresponds to the nosetests -A option of ‘not slow’.

  • ‘full’ - fast (as above) and slow benchmarks as in the ‘no -A’ option to nosetests - this is the same as ‘’.

  • None or ‘’ - run all tests.

  • attribute_identifier - string passed directly to nosetests as ‘-A’.

verbose : int, optional

Verbosity value for benchmark outputs, in the range 1-10. Default is 1.

extra_argv : list, optional

List with any extra arguments to pass to nosetests.

Returns:
success : bool

Returns True if running the benchmarks works, False if an error occurred.

Notes

Benchmarks are like tests, but have names starting with “bench” instead of “test”, and can be found under the “benchmarks” sub-directory of the module.

Each NumPy module exposes bench in its namespace to run all benchmarks for it.

Examples

>>> success = np.lib.bench() 
Running benchmarks for numpy.lib
...
using 562341 items:
unique:
0.11
unique1d:
0.11
ratio: 1.0
nUnique: 56230 == 56230
...
OK
>>> success 
True

test

dipy.nn.test(label='fast', verbose=1, extra_argv=None, doctests=False, coverage=False, raise_warnings=None, timer=False)

Run tests for module using nose.

Parameters:
label : {‘fast’, ‘full’, ‘’, attribute identifier}, optional

Identifies the tests to run. This can be a string to pass to the nosetests executable with the ‘-A’ option, or one of several special values. Special values are:

  • ‘fast’ - the default - which corresponds to the nosetests -A option of ‘not slow’.

  • ‘full’ - fast (as above) and slow tests as in the ‘no -A’ option to nosetests - this is the same as ‘’.

  • None or ‘’ - run all tests.

  • attribute_identifier - string passed directly to nosetests as ‘-A’.

verbose : int, optional

Verbosity value for test outputs, in the range 1-10. Default is 1.

extra_argv : list, optional

List with any extra arguments to pass to nosetests.

doctests : bool, optional

If True, run doctests in module. Default is False.

coverage : bool, optional

If True, report coverage of NumPy code. Default is False. (This requires the coverage module).

raise_warnings : None, str or sequence of warnings, optional

This specifies which warnings to configure as ‘raise’ instead of being shown once during the test execution. Valid strings are:

  • “develop” : equals (Warning,)

  • “release” : equals (), do not raise on any warnings.

timer : bool or int, optional

Timing of individual tests with nose-timer (which needs to be installed). If True, time tests and report on all of them. If an integer (say N), report timing results for N slowest tests.

Returns:
result : object

Returns the result of running the tests as a nose.result.TextTestResult object.

Notes

Each NumPy module exposes test in its namespace to run all tests for it. For example, to run all tests for numpy.lib:

>>> np.lib.test() 

Examples

>>> result = np.lib.test() 
Running unit tests for numpy.lib
...
Ran 976 tests in 3.933s

OK

>>> result.errors 
[]
>>> result.knownfail 
[]
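
The same entry points exist on this module. A minimal sketch (nose and its plugins must be installed, so every call is marked as skipped):

>>> import dipy.nn
>>> result = dipy.nn.test(label='fast', verbose=1)  # doctest: +SKIP
>>> result.wasSuccessful()  # doctest: +SKIP
True
>>> dipy.nn.bench()  # doctest: +SKIP
True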

HemiSphere

class dipy.nn.histo_resdnn.HemiSphere(x=None, y=None, z=None, theta=None, phi=None, xyz=None, faces=None, edges=None, tol=1e-05)

Bases: Sphere

Points on the unit sphere.

A HemiSphere is similar to a Sphere but it takes antipodal symmetry into account. Antipodal symmetry means that point v on a HemiSphere is the same as the point -v. Duplicate points are discarded when constructing a HemiSphere (including antipodal duplicates). edges and faces are remapped to the remaining points as closely as possible.

The HemiSphere can be constructed using one of three conventions:

HemiSphere(x, y, z)
HemiSphere(xyz=xyz)
HemiSphere(theta=theta, phi=phi)
Parameters:
x, y, z : 1-D array_like

Vertices as x-y-z coordinates.

theta, phi : 1-D array_like

Vertices as spherical coordinates. Theta and phi are the inclination and azimuth angles respectively.

xyz : (N, 3) ndarray

Vertices as x-y-z coordinates.

faces : (N, 3) ndarray

Indices into vertices that form triangular faces. If unspecified, the faces are computed using a Delaunay triangulation.

edges : (N, 2) ndarray

Edges between vertices. If unspecified, the edges are derived from the faces.

tol : float

Angle in degrees. Vertices that are less than tol degrees apart are treated as duplicates.

See also

Sphere
Attributes:
x
y
z

Methods

find_closest(xyz)

Find the index of the vertex in the Sphere closest to the input vector, taking into account antipodal symmetry

from_sphere(sphere[, tol])

Create instance from a Sphere

mirror()

Create a full Sphere from a HemiSphere

subdivide([n])

Create a more subdivided HemiSphere

edges

faces

vertices

__init__(x=None, y=None, z=None, theta=None, phi=None, xyz=None, faces=None, edges=None, tol=1e-05)

Create a HemiSphere from points

faces()
find_closest(xyz)

Find the index of the vertex in the Sphere closest to the input vector, taking into account antipodal symmetry

Parameters:
xyz : array-like, 3 elements

A unit vector

Returns:
idx : int

The index into the Sphere.vertices array that gives the closest vertex (in angle).

classmethod from_sphere(sphere, tol=1e-05)

Create instance from a Sphere

mirror()

Create a full Sphere from a HemiSphere

subdivide(n=1)

Create a more subdivided HemiSphere

See Sphere.subdivide for full documentation.
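
Below is a minimal usage sketch, assuming only NumPy and DIPY are installed; HemiSphere is imported here from its canonical location, dipy.core.sphere, and the three example points are illustrative:

>>> import numpy as np
>>> from dipy.core.sphere import HemiSphere
>>> theta = np.array([0.0, np.pi / 2, np.pi / 2])
>>> phi = np.array([0.0, 0.0, np.pi / 2])
>>> hemi = HemiSphere(theta=theta, phi=phi)
>>> hemi.vertices.shape
(3, 3)
>>> int(hemi.find_closest(np.array([0.0, 0.0, -1.0])))  # antipode of the first vertex
0
>>> hemi.mirror().vertices.shape  # full Sphere with both hemispheres
(6, 3)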

HistoResDNN

class dipy.nn.histo_resdnn.HistoResDNN(sh_order=8, basis_type='tournier07', verbose=False)

Bases: object

This class is intended for the ResDNN Histology Network model.

Methods

fetch_default_weights()

Load the model pre-training weights to use for the fitting.

load_model_weights(weights_path)

Load the custom pre-training weights to use for the fitting.

predict(data, gtab[, mask, chunk_size])

Wrapper function to facilitate prediction of larger datasets.

__init__(sh_order=8, basis_type='tournier07', verbose=False)

The model was re-trained for usage with a different basis function (‘tournier07’) like the proposed model in [1, 2].

To obtain the pre-trained model, use:

>>> resdnn_model = HistoResDNN()  # doctest: +SKIP
>>> fetch_model_weights_path = get_fnames('histo_resdnn_weights')  # doctest: +SKIP
>>> resdnn_model.load_model_weights(fetch_model_weights_path)  # doctest: +SKIP

This model is designed to take as input raw DWI signal on a sphere (ODF) represented as SH of order 8 in the tournier basis and predict fODF of order 8 in the tournier basis. Effectively, this model is mimicking a CSD fit.

Parameters:
sh_order : int, optional

Maximum SH order in the SH fit. For sh_order, there will be (sh_order + 1) * (sh_order + 2) / 2 SH coefficients for a symmetric basis. Default: 8

basis_type : {‘tournier07’, ‘descoteaux07’}, optional

tournier07 (default) or descoteaux07.

verbose : bool (optional)

Whether to show information about the processing. Default: False

References

fetch_default_weights()

Load the model pre-training weights to use for the fitting. Will not work if the declared SH_ORDER does not match the weights’ expected input.

load_model_weights(weights_path)

Load the custom pre-training weights to use for the fitting. Will not work if the declared SH_ORDER does not match the weights’ expected input.

The weights for a sh_order of 8 can be obtained via the function:

get_fnames(‘histo_resdnn_weights’).

Parameters:
weights_path : str

Path to the file containing the weights (hdf5, saved by tensorflow)

predict(data, gtab, mask=None, chunk_size=1000)

Wrapper function to facilitate prediction of larger datasets. The function will mask, normalize, split, predict and ‘re-assemble’ the data as a volume.

Parameters:
data : np.ndarray

DWI signal in a 4D array

gtab : GradientTable class instance

The acquisition scheme matching the data (must contain at least one b0)

mask : np.ndarray (optional)

Binary mask of the brain to avoid unnecessary computation and unreliable prediction outside the brain. Default: Compute prediction only for nonzero voxels (with at least one nonzero DWI value).

Returns:
pred_sh_coef : np.ndarray (x, y, z, M)

Predicted fODF (as SH). The volume has matching shape to the input data, but with (sh_order + 1) * (sh_order + 2) / 2 as a last dimension.
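
The sketch below shows the intended end-to-end use. It is a hedged example, not part of the original documentation: it assumes TensorFlow is installed and the pre-trained weights can be downloaded, and it reuses the ‘small_101D’ dataset from the get_fnames example further down purely for illustration, so the model-related lines are marked as skipped:

>>> from dipy.core.gradients import gradient_table
>>> from dipy.data import get_fnames
>>> from dipy.io.image import load_nifti
>>> from dipy.io.gradients import read_bvals_bvecs
>>> from dipy.nn.histo_resdnn import HistoResDNN  # doctest: +SKIP
>>> fimg, fbvals, fbvecs = get_fnames('small_101D')  # doctest: +SKIP
>>> data, affine = load_nifti(fimg)  # doctest: +SKIP
>>> bvals, bvecs = read_bvals_bvecs(fbvals, fbvecs)  # doctest: +SKIP
>>> gtab = gradient_table(bvals, bvecs)  # doctest: +SKIP
>>> model = HistoResDNN(sh_order=8, basis_type='tournier07')  # doctest: +SKIP
>>> model.fetch_default_weights()  # doctest: +SKIP
>>> pred_sh_coef = model.predict(data, gtab)  # doctest: +SKIP
>>> pred_sh_coef.shape[-1]  # (8 + 1) * (8 + 2) // 2  # doctest: +SKIP
45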

Version

class dipy.nn.histo_resdnn.Version(version: str)

Bases: _BaseVersion

This class abstracts handling of a project’s versions.

A Version instance is comparison aware and can be compared and sorted using the standard Python interfaces.

>>> v1 = Version("1.0a5")
>>> v2 = Version("1.0")
>>> v1
<Version('1.0a5')>
>>> v2
<Version('1.0')>
>>> v1 < v2
True
>>> v1 == v2
False
>>> v1 > v2
False
>>> v1 >= v2
False
>>> v1 <= v2
True
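
Because comparisons work, sorting works too; a short sketch:

>>> versions = [Version("1.0"), Version("1.0a5"), Version("0.9.1")]
>>> [str(v) for v in sorted(versions)]
['0.9.1', '1.0a5', '1.0']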
Attributes:
base_version

The “base version” of the version.

dev

The development number of the version.

epoch

The epoch of the version.

is_devrelease

Whether this version is a development release.

is_postrelease

Whether this version is a post-release.

is_prerelease

Whether this version is a pre-release.

local

The local version segment of the version.

major

The first item of release or 0 if unavailable.

micro

The third item of release or 0 if unavailable.

minor

The second item of release or 0 if unavailable.

post

The post-release number of the version.

pre

The pre-release segment of the version.

public

The public portion of the version.

release

The components of the “release” segment of the version.

__init__(version: str) → None

Initialize a Version object.

Parameters:

version – The string representation of a version which will be parsed and normalized before use.

Raises:

InvalidVersion – If the version does not conform to PEP 440 in any way then this exception will be raised.

property base_version: str

The “base version” of the version.

>>> Version("1.2.3").base_version
'1.2.3'
>>> Version("1.2.3+abc").base_version
'1.2.3'
>>> Version("1!1.2.3+abc.dev1").base_version
'1!1.2.3'

The “base version” is the public version of the project without any pre or post release markers.

property dev: int | None

The development number of the version.

>>> print(Version("1.2.3").dev)
None
>>> Version("1.2.3.dev1").dev
1
property epoch: int

The epoch of the version.

>>> Version("2.0.0").epoch
0
>>> Version("1!2.0.0").epoch
1
property is_devrelease: bool

Whether this version is a development release.

>>> Version("1.2.3").is_devrelease
False
>>> Version("1.2.3.dev1").is_devrelease
True
property is_postrelease: bool

Whether this version is a post-release.

>>> Version("1.2.3").is_postrelease
False
>>> Version("1.2.3.post1").is_postrelease
True
property is_prerelease: bool

Whether this version is a pre-release.

>>> Version("1.2.3").is_prerelease
False
>>> Version("1.2.3a1").is_prerelease
True
>>> Version("1.2.3b1").is_prerelease
True
>>> Version("1.2.3rc1").is_prerelease
True
>>> Version("1.2.3dev1").is_prerelease
True
property local: str | None

The local version segment of the version.

>>> print(Version("1.2.3").local)
None
>>> Version("1.2.3+abc").local
'abc'
property major: int

The first item of release or 0 if unavailable.

>>> Version("1.2.3").major
1
property micro: int

The third item of release or 0 if unavailable.

>>> Version("1.2.3").micro
3
>>> Version("1").micro
0
property minor: int

The second item of release or 0 if unavailable.

>>> Version("1.2.3").minor
2
>>> Version("1").minor
0
property post: int | None

The post-release number of the version.

>>> print(Version("1.2.3").post)
None
>>> Version("1.2.3.post1").post
1
property pre: Tuple[str, int] | None

The pre-release segment of the version.

>>> print(Version("1.2.3").pre)
None
>>> Version("1.2.3a1").pre
('a', 1)
>>> Version("1.2.3b1").pre
('b', 1)
>>> Version("1.2.3rc1").pre
('rc', 1)
property public: str

The public portion of the version.

>>> Version("1.2.3").public
'1.2.3'
>>> Version("1.2.3+abc").public
'1.2.3'
>>> Version("1.2.3+abc.dev1").public
'1.2.3'
property release: Tuple[int, ...]

The components of the “release” segment of the version.

>>> Version("1.2.3").release
(1, 2, 3)
>>> Version("2.0.0").release
(2, 0, 0)
>>> Version("1!2.0.0.post0").release
(2, 0, 0)

Includes trailing zeroes but not the epoch or any pre-release / development / post-release suffixes.

doctest_skip_parser

dipy.nn.histo_resdnn.doctest_skip_parser(func)

Decorator replaces custom skip test markup in doctests.

Say a function has a docstring:

>>> something # skip if not HAVE_AMODULE
>>> something + else
>>> something # skip if HAVE_BMODULE

This decorator will evaluate the expression after ‘skip if’. If it evaluates to True, then the comment is replaced by # doctest: +SKIP. If False, then the comment is just removed. The expression is evaluated in the globals scope of func.

For example, if the module global HAVE_AMODULE is False, and module global HAVE_BMODULE is False, the returned function will have docstring:

>>> something 
>>> something + else
>>> something
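
A minimal, hypothetical usage sketch (HAVE_AMODULE and check_amodule are illustrative names, not DIPY objects; the decorator is assumed to be importable from dipy.testing, where it is defined):

>>> from dipy.testing import doctest_skip_parser
>>> HAVE_AMODULE = False
>>> @doctest_skip_parser
... def check_amodule():
...     """
...     >>> import amodule  # skip if not HAVE_AMODULE
...     """
>>> '+SKIP' in check_amodule.__doc__
True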

get_bval_indices

dipy.nn.histo_resdnn.get_bval_indices(bvals, bval, tol=20)

Get indices where the b-value is bval

Parameters:
bvals: ndarray

Array containing the b-values

bval: float or int

b-value to extract indices

tol: int

The tolerated gap between the b-values to extract and the actual b-values.

Returns:
Array of indices where the b-value is bval
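
A small sketch of the tolerance behaviour (the function is imported here from dipy.core.gradients, its canonical location):

>>> import numpy as np
>>> from dipy.core.gradients import get_bval_indices
>>> bvals = np.array([0, 995, 1000, 1005, 2000])
>>> get_bval_indices(bvals, 1000, tol=20)
array([1, 2, 3])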

get_fnames

dipy.nn.histo_resdnn.get_fnames(name='small_64D')

Provide full paths to example or test datasets.

Parameters:
name : str

the filename/s of which dataset to return, one of:

  • ‘small_64D’ small region of interest nifti,bvecs,bvals 64 directions

  • ‘small_101D’ small region of interest nifti, bvecs, bvals 101 directions

  • ‘aniso_vox’ volume with anisotropic voxel size as Nifti

  • ‘fornix’ 300 tracks in Trackvis format (from Pittsburgh Brain Competition)

  • ‘gqi_vectors’ the scanner wave vectors needed for a GQI acquisitions of 101 directions tested on Siemens 3T Trio

  • ‘small_25’ small ROI (10x8x2) DTI data (b value 2000, 25 directions)

  • ‘test_piesno’ slice of N=8, K=14 diffusion data

  • ‘reg_c’ small 2D image used for validating registration

  • ‘reg_o’ small 2D image used for validation registration

  • ‘cb_2’ two vectorized cingulum bundles

Returns:
fnames : tuple

filenames for dataset

Examples

>>> import numpy as np
>>> from dipy.io.image import load_nifti
>>> from dipy.data import get_fnames
>>> fimg, fbvals, fbvecs = get_fnames('small_101D')
>>> bvals=np.loadtxt(fbvals)
>>> bvecs=np.loadtxt(fbvecs).T
>>> data, affine = load_nifti(fimg)
>>> data.shape == (6, 10, 10, 102)
True
>>> bvals.shape == (102,)
True
>>> bvecs.shape == (102, 3)
True

get_sphere

dipy.nn.histo_resdnn.get_sphere(name='symmetric362')

Provide triangulated spheres.

Parameters:
name : str

which sphere - one of:

  • ‘symmetric362’

  • ‘symmetric642’

  • ‘symmetric724’

  • ‘repulsion724’

  • ‘repulsion100’

  • ‘repulsion200’

Returns:
sphere : a dipy.core.sphere.Sphere class instance

Examples

>>> import numpy as np
>>> from dipy.data import get_sphere
>>> sphere = get_sphere('symmetric362')
>>> verts, faces = sphere.vertices, sphere.faces
>>> verts.shape == (362, 3)
True
>>> faces.shape == (720, 3)
True
>>> verts, faces = get_sphere('not a sphere name') 
Traceback (most recent call last):
    ...
DataError: No sphere called "not a sphere name"

optional_package

dipy.nn.histo_resdnn.optional_package(name, trip_msg=None)

Return package-like thing and module setup for package name

Parameters:
name : str

package name

trip_msg : None or str

message to give when someone tries to use the return package, but we could not import it, and have returned a TripWire object instead. Default message if None.

Returns:
pkg_like : module or TripWire instance

If we can import the package, return it. Otherwise return an object raising an error when accessed

have_pkg : bool

True if import for package was successful, False otherwise

module_setup : function

callable usually set as setup_module in calling namespace, to allow skipping tests.

Examples

Typical use would be something like this at the top of a module using an optional package:

>>> from dipy.utils.optpkg import optional_package
>>> pkg, have_pkg, setup_module = optional_package('not_a_package')

Of course in this case the package doesn’t exist, and so, in the module:

>>> have_pkg
False

and

>>> pkg.some_function() 
Traceback (most recent call last):
    ...
TripWireError: We need package not_a_package for these functions, but
``import not_a_package`` raised an ImportError

If the module does exist - we get the module

>>> pkg, _, _ = optional_package('os')
>>> hasattr(pkg, 'path')
True

Or a submodule if that’s what we asked for

>>> subpkg, _, _ = optional_package('os.path')
>>> hasattr(subpkg, 'dirname')
True

set_logger_level

dipy.nn.histo_resdnn.set_logger_level(log_level)

Change the log level of the HistoResDNN logger to one of the following: DEBUG, INFO, WARNING, CRITICAL, ERROR

Parameters:
log_level : str

Log level for the HistoResDNN only
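
For example, to only see warnings and errors from the network (a minimal sketch; TensorFlow availability is assumed, hence the skip markers):

>>> from dipy.nn.histo_resdnn import set_logger_level  # doctest: +SKIP
>>> set_logger_level('WARNING')  # doctest: +SKIP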

sf_to_sh

dipy.nn.histo_resdnn.sf_to_sh(sf, sphere, sh_order=4, basis_type=None, full_basis=False, legacy=True, smooth=0.0)

Spherical function to spherical harmonics (SH).

Parameters:
sf : ndarray

Values of a function on the given sphere.

sphere : Sphere

The points on which the sf is defined.

sh_order : int, optional

Maximum SH order in the SH fit. For sh_order, there will be (sh_order + 1) * (sh_order + 2) / 2 SH coefficients for a symmetric basis and (sh_order + 1) * (sh_order + 1) coefficients for a full SH basis.

basis_type : {None, ‘tournier07’, ‘descoteaux07’}, optional

None for the default DIPY basis, tournier07 for the Tournier 2007 [2] [3] basis, descoteaux07 for the Descoteaux 2007 [1] basis (None defaults to descoteaux07).

full_basis: bool, optional

True for using a SH basis containing even and odd order SH functions. False for using a SH basis consisting only of even order SH functions.

legacy: bool, optional

True to use a legacy basis definition for backward compatibility with previous tournier07 and descoteaux07 implementations.

smooth : float, optional

Lambda-regularization in the SH fit.

Returns:
sh : ndarray

SH coefficients representing the input function.

References

[1]

Descoteaux, M., Angelino, E., Fitzgibbons, S. and Deriche, R. Regularized, Fast, and Robust Analytical Q-ball Imaging. Magn. Reson. Med. 2007;58:497-510.

[2]

Tournier J.D., Calamante F. and Connelly A. Robust determination of the fibre orientation distribution in diffusion MRI: Non-negativity constrained super-resolved spherical deconvolution. NeuroImage. 2007;35(4):1459-1472.

[3]

Tournier J-D, Smith R, Raffelt D, Tabbara R, Dhollander T, Pietsch M, et al. MRtrix3: A fast, flexible and open software framework for medical image processing and visualisation. NeuroImage. 2019 Nov 15;202:116-137.
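
A round-trip sketch with sh_to_sf (documented next), fitting a smooth, antipodally symmetric function sampled on one of the built-in spheres; both functions are imported here from dipy.reconst.shm, their canonical location:

>>> import numpy as np
>>> from dipy.data import get_sphere
>>> from dipy.reconst.shm import sf_to_sh, sh_to_sf
>>> sphere = get_sphere('repulsion724')
>>> sf = 1.0 + 0.3 * sphere.vertices[:, 2] ** 2  # depends only on z**2
>>> sh = sf_to_sh(sf, sphere, sh_order=4, basis_type='descoteaux07')
>>> sh.shape  # (4 + 1) * (4 + 2) / 2 coefficients
(15,)
>>> sf_back = sh_to_sf(sh, sphere, sh_order=4, basis_type='descoteaux07')
>>> np.allclose(sf, sf_back, atol=1e-3)
True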

sh_to_sf

dipy.nn.histo_resdnn.sh_to_sf(sh, sphere, sh_order=4, basis_type=None, full_basis=False, legacy=True)

Spherical harmonics (SH) to spherical function (SF).

Parameters:
sh : ndarray

SH coefficients representing a spherical function.

sphere : Sphere

The points on which to sample the spherical function.

sh_order : int, optional

Maximum SH order in the SH fit. For sh_order, there will be (sh_order + 1) * (sh_order + 2) / 2 SH coefficients for a symmetric basis and (sh_order + 1) * (sh_order + 1) coefficients for a full SH basis.

basis_type : {None, ‘tournier07’, ‘descoteaux07’}, optional

None for the default DIPY basis, tournier07 for the Tournier 2007 [2] [3] basis, descoteaux07 for the Descoteaux 2007 [1] basis (None defaults to descoteaux07).

full_basis: bool, optional

True to use a SH basis containing even and odd order SH functions. Else, use a SH basis consisting only of even order SH functions.

legacy: bool, optional

True to use a legacy basis definition for backward compatibility with previous tournier07 and descoteaux07 implementations.

Returns:
sf : ndarray

Spherical function values on the sphere.

References

[1]

Descoteaux, M., Angelino, E., Fitzgibbons, S. and Deriche, R. Regularized, Fast, and Robust Analytical Q-ball Imaging. Magn. Reson. Med. 2007;58:497-510.

[2]

Tournier J.D., Calamante F. and Connelly A. Robust determination of the fibre orientation distribution in diffusion MRI: Non-negativity constrained super-resolved spherical deconvolution. NeuroImage. 2007;35(4):1459-1472.

[3]

Tournier J-D, Smith R, Raffelt D, Tabbara R, Dhollander T, Pietsch M, et al. MRtrix3: A fast, flexible and open software framework for medical image processing and visualisation. NeuroImage. 2019 Nov 15;202:116-137.

sph_harm_ind_list

dipy.nn.histo_resdnn.sph_harm_ind_list(sh_order, full_basis=False)

Returns the degree (m) and order (n) of all the symmetric spherical harmonics of degree less than or equal to sh_order. The results, m_list and n_list, are kx1 arrays, where k depends on sh_order. They can be passed to real_sh_descoteaux_from_index() and real_sh_tournier_from_index().

Parameters:
sh_order : int

even int > 0, max order to return

full_basis: bool, optional

True for SH basis with even and odd order terms

Returns:
m_list : array

degrees of even spherical harmonics

n_list : array

orders of even spherical harmonics

See also

shm.real_sh_descoteaux_from_index, shm.real_sh_tournier_from_index
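
A quick sketch for sh_order=4 (the function is imported here from dipy.reconst.shm, its canonical location):

>>> from dipy.reconst.shm import sph_harm_ind_list
>>> m_list, n_list = sph_harm_ind_list(4)
>>> len(m_list) == (4 + 1) * (4 + 2) // 2
True
>>> sorted({int(n) for n in n_list})
[0, 2, 4]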

unique_bvals_magnitude

dipy.nn.histo_resdnn.unique_bvals_magnitude(bvals, bmag=None, rbvals=False)

This function gives the unique rounded b-values of the data

Parameters:
bvals : ndarray

Array containing the b-values

bmag : int

The order of magnitude by which b-values have to differ to be considered unique b-values. B-values are also rounded up to this order of magnitude. Default: derive this value from the maximal b-value provided: \(bmag=log_{10}(max(bvals)) - 1\).

rbvals : bool, optional

If True, the function also returns all individual rounded b-values. Default: False

Returns:
ubvals : ndarray

Array containing the rounded unique b-values
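
A small sketch of the rounding behaviour (the function is imported here from dipy.core.gradients, its canonical location):

>>> import numpy as np
>>> from dipy.core.gradients import unique_bvals_magnitude
>>> bvals = np.array([0, 0, 995, 1000, 1995, 2000])
>>> [int(b) for b in unique_bvals_magnitude(bvals)]
[0, 1000, 2000]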

MultipleLayerPercepton

class dipy.nn.model.MultipleLayerPercepton(input_shape=(28, 28), num_hidden=(128,), act_hidden='relu', dropout=0.2, num_out=10, act_out='softmax', loss='sparse_categorical_crossentropy', optimizer='adam')

Bases: object

Methods

evaluate(x_test, y_test[, verbose])

Evaluate the model on test dataset.

fit(x_train, y_train[, epochs])

Train the model on train dataset.

predict(x_test)

Predict the output from input samples.

summary()

Get the summary of the model.

__init__(input_shape=(28, 28), num_hidden=(128,), act_hidden='relu', dropout=0.2, num_out=10, act_out='softmax', loss='sparse_categorical_crossentropy', optimizer='adam')

Multiple Layer Perceptron with Dropout.

Parameters:
input_shape : tuple

Shape of data to be trained

num_hidden : array-like

List of number of nodes in hidden layers

act_hidden : string

Activation function used in hidden layer

dropout : float

Dropout ratio

num_out : int

Number of nodes in output layer

act_out : string

Activation function used in output layer

optimizer : string

Select optimizer. Default adam.

loss : string

Select loss function for measuring accuracy. Default sparse_categorical_crossentropy.

evaluate(x_test, y_test, verbose=2)

Evaluate the model on test dataset.

The evaluate method will evaluate the model on a test dataset.

Parameters:
x_test : ndarray

the x_test is the test dataset

y_test : ndarray, shape=(BatchSize,)

the y_test is the labels of the test dataset

verbose : int (Default = 2)

Verbosity mode; setting verbose to 0, 1 or 2 controls how much progress output is displayed for each epoch.

Returns:
evaluate : List

return list of loss value and accuracy value on test dataset

fit(x_train, y_train, epochs=5)

Train the model on train dataset.

The fit method will train the model for a fixed number of epochs (iterations) on a dataset.

Parameters:
x_train : ndarray

the x_train is the train dataset

y_train : ndarray, shape=(BatchSize,)

the y_train is the labels of the train dataset

epochs : int (Default = 5)

the number of epochs

Returns:
hist : object

A History object. Its History.history attribute is a record of training loss values and metrics values at successive epochs

predict(x_test)

Predict the output from input samples.

The predict method generates output predictions for the input samples.

Parameters:
x_test : ndarray

the x_test is the test dataset or input samples

Returns:
predict : ndarray, shape=(TestSize, OutputSize)

Numpy array(s) of predictions.

summary()

Get the summary of the model.

The summary is textual and includes information about the layers and their order in the model, and the output shape of each layer.

Returns:
summary : NoneType

the summary of the model
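
A hedged training sketch on random toy data follows. It is not part of the original docs: TensorFlow must be installed, the shapes simply mirror the defaults input_shape=(28, 28) and num_out=10, and the model-related calls are marked as skipped:

>>> import numpy as np
>>> from dipy.nn.model import MultipleLayerPercepton  # doctest: +SKIP
>>> x = np.random.rand(64, 28, 28)
>>> y = np.random.randint(0, 10, size=64)
>>> mlp = MultipleLayerPercepton(num_hidden=(128, 64))  # doctest: +SKIP
>>> hist = mlp.fit(x, y, epochs=1)  # doctest: +SKIP
>>> loss, accuracy = mlp.evaluate(x, y, verbose=0)  # doctest: +SKIP
>>> mlp.predict(x).shape  # doctest: +SKIP
(64, 10)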

SingleLayerPerceptron

class dipy.nn.model.SingleLayerPerceptron(input_shape=(28, 28), num_hidden=128, act_hidden='relu', dropout=0.2, num_out=10, act_out='softmax', optimizer='adam', loss='sparse_categorical_crossentropy')

Bases: object

Methods

evaluate(x_test, y_test[, verbose])

Evaluate the model on test dataset.

fit(x_train, y_train[, epochs])

Train the model on train dataset.

predict(x_test)

Predict the output from input samples.

summary()

Get the summary of the model.

__init__(input_shape=(28, 28), num_hidden=128, act_hidden='relu', dropout=0.2, num_out=10, act_out='softmax', optimizer='adam', loss='sparse_categorical_crossentropy')

Single Layer Perceptron with Dropout.

Parameters:
input_shape : tuple

Shape of data to be trained

num_hidden : int

Number of nodes in hidden layer

act_hidden : string

Activation function used in hidden layer

dropout : float

Dropout ratio

num_out : int

Number of nodes in output layer

act_out : string

Activation function used in output layer

optimizer : string

Select optimizer. Default adam.

loss : string

Select loss function for measuring accuracy. Default sparse_categorical_crossentropy.

evaluate(x_test, y_test, verbose=2)

Evaluate the model on test dataset.

The evaluate method will evaluate the model on a test dataset.

Parameters:
x_test : ndarray

the x_test is the test dataset

y_test : ndarray, shape=(BatchSize,)

the y_test is the labels of the test dataset

verbose : int (Default = 2)

Verbosity mode; setting verbose to 0, 1 or 2 controls how much progress output is displayed for each epoch.

Returns:
evaluate : List

return list of loss value and accuracy value on test dataset

fit(x_train, y_train, epochs=5)

Train the model on train dataset.

The fit method will train the model for a fixed number of epochs (iterations) on a dataset.

Parameters:
x_train : ndarray

the x_train is the train dataset

y_train : ndarray, shape=(BatchSize,)

the y_train is the labels of the train dataset

epochs : int (Default = 5)

the number of epochs

Returns:
hist : object

A History object. Its History.history attribute is a record of training loss values and metrics values at successive epochs

predict(x_test)

Predict the output from input samples.

The predict method generates output predictions for the input samples.

Parameters:
x_test : ndarray

the x_test is the test dataset or input samples

Returns:
predict : ndarray, shape=(TestSize, OutputSize)

Numpy array(s) of predictions.

summary()

Get the summary of the model.

The summary is textual and includes information about the layers and their order in the model, and the output shape of each layer.

Returns:
summary : NoneType

the summary of the model
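
An analogous sketch for the single-layer variant, with the same assumptions and skip markers as the MultipleLayerPercepton example above:

>>> import numpy as np
>>> from dipy.nn.model import SingleLayerPerceptron  # doctest: +SKIP
>>> x = np.random.rand(64, 28, 28)
>>> y = np.random.randint(0, 10, size=64)
>>> slp = SingleLayerPerceptron(num_hidden=128)  # doctest: +SKIP
>>> hist = slp.fit(x, y, epochs=1)  # doctest: +SKIP
>>> slp.predict(x).shape  # doctest: +SKIP
(64, 10)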

Version

class dipy.nn.model.Version(version: str)

Bases: _BaseVersion

This class abstracts handling of a project’s versions.

A Version instance is comparison aware and can be compared and sorted using the standard Python interfaces.

>>> v1 = Version("1.0a5")
>>> v2 = Version("1.0")
>>> v1
<Version('1.0a5')>
>>> v2
<Version('1.0')>
>>> v1 < v2
True
>>> v1 == v2
False
>>> v1 > v2
False
>>> v1 >= v2
False
>>> v1 <= v2
True
Attributes:
base_version

The “base version” of the version.

dev

The development number of the version.

epoch

The epoch of the version.

is_devrelease

Whether this version is a development release.

is_postrelease

Whether this version is a post-release.

is_prerelease

Whether this version is a pre-release.

local

The local version segment of the version.

major

The first item of release or 0 if unavailable.

micro

The third item of release or 0 if unavailable.

minor

The second item of release or 0 if unavailable.

post

The post-release number of the version.

pre

The pre-release segment of the version.

public

The public portion of the version.

release

The components of the “release” segment of the version.

__init__(version: str) → None

Initialize a Version object.

Parameters:

version – The string representation of a version which will be parsed and normalized before use.

Raises:

InvalidVersion – If the version does not conform to PEP 440 in any way then this exception will be raised.

property base_version: str

The “base version” of the version.

>>> Version("1.2.3").base_version
'1.2.3'
>>> Version("1.2.3+abc").base_version
'1.2.3'
>>> Version("1!1.2.3+abc.dev1").base_version
'1!1.2.3'

The “base version” is the public version of the project without any pre or post release markers.

property dev: int | None

The development number of the version.

>>> print(Version("1.2.3").dev)
None
>>> Version("1.2.3.dev1").dev
1
property epoch: int

The epoch of the version.

>>> Version("2.0.0").epoch
0
>>> Version("1!2.0.0").epoch
1
property is_devrelease: bool

Whether this version is a development release.

>>> Version("1.2.3").is_devrelease
False
>>> Version("1.2.3.dev1").is_devrelease
True
property is_postrelease: bool

Whether this version is a post-release.

>>> Version("1.2.3").is_postrelease
False
>>> Version("1.2.3.post1").is_postrelease
True
property is_prerelease: bool

Whether this version is a pre-release.

>>> Version("1.2.3").is_prerelease
False
>>> Version("1.2.3a1").is_prerelease
True
>>> Version("1.2.3b1").is_prerelease
True
>>> Version("1.2.3rc1").is_prerelease
True
>>> Version("1.2.3dev1").is_prerelease
True
property local: str | None

The local version segment of the version.

>>> print(Version("1.2.3").local)
None
>>> Version("1.2.3+abc").local
'abc'
property major: int

The first item of release or 0 if unavailable.

>>> Version("1.2.3").major
1
property micro: int

The third item of release or 0 if unavailable.

>>> Version("1.2.3").micro
3
>>> Version("1").micro
0
property minor: int

The second item of release or 0 if unavailable.

>>> Version("1.2.3").minor
2
>>> Version("1").minor
0
property post: int | None

The post-release number of the version.

>>> print(Version("1.2.3").post)
None
>>> Version("1.2.3.post1").post
1
property pre: Tuple[str, int] | None

The pre-release segment of the version.

>>> print(Version("1.2.3").pre)
None
>>> Version("1.2.3a1").pre
('a', 1)
>>> Version("1.2.3b1").pre
('b', 1)
>>> Version("1.2.3rc1").pre
('rc', 1)
property public: str

The public portion of the version.

>>> Version("1.2.3").public
'1.2.3'
>>> Version("1.2.3+abc").public
'1.2.3'
>>> Version("1.2.3+abc.dev1").public
'1.2.3'
property release: Tuple[int, ...]

The components of the “release” segment of the version.

>>> Version("1.2.3").release
(1, 2, 3)
>>> Version("2.0.0").release
(2, 0, 0)
>>> Version("1!2.0.0.post0").release
(2, 0, 0)

Includes trailing zeroes but not the epoch or any pre-release / development / post-release suffixes.

optional_package

dipy.nn.model.optional_package(name, trip_msg=None)

Return package-like thing and module setup for package name

Parameters:
name : str

package name

trip_msg : None or str

message to give when someone tries to use the return package, but we could not import it, and have returned a TripWire object instead. Default message if None.

Returns:
pkg_like : module or TripWire instance

If we can import the package, return it. Otherwise return an object raising an error when accessed

have_pkg : bool

True if import for package was successful, False otherwise

module_setup : function

callable usually set as setup_module in calling namespace, to allow skipping tests.

Examples

Typical use would be something like this at the top of a module using an optional package:

>>> from dipy.utils.optpkg import optional_package
>>> pkg, have_pkg, setup_module = optional_package('not_a_package')

Of course in this case the package doesn’t exist, and so, in the module:

>>> have_pkg
False

and

>>> pkg.some_function() 
Traceback (most recent call last):
    ...
TripWireError: We need package not_a_package for these functions, but
``import not_a_package`` raised an ImportError

If the module does exist - we get the module

>>> pkg, _, _ = optional_package('os')
>>> hasattr(pkg, 'path')
True

Or a submodule if that’s what we asked for

>>> subpkg, _, _ = optional_package('os.path')
>>> hasattr(subpkg, 'dirname')
True