...
 
......@@ -46,6 +46,7 @@
publisher = {Springer},
year = {2007},
isbn = {3540262393},
doi = {10.1007/3-540-27752-8},
}
@article{Krzywinski2009,
......
......@@ -26,7 +26,7 @@ bibliography: paper.bib
Autoregressive modelling provides a powerful and flexible parametric approach
to modelling uni- or multi-variate time-series data. AR models have
mathematical links to linear time-invariant systems, digital filters and Fourier
based frequency analysis. As such, a wide range of time-domain and
based frequency analyses. As such, a wide range of time-domain and
frequency-domain metrics can be readily derived from the fitted autoregressive
parameters. These approaches are fundamental in a wide range of science and
engineering fields and are still undergoing active development. ``SAILS``
......@@ -42,7 +42,7 @@ application, but was originally written by the authors with an intended use in
human neuroscience. The package provides functionality for model fitting,
(for example Ordinary Least-Squares and Vieira-Morf [@Marple1987] approaches),
model selection (Akaike's Information Criterion and Bayesian Information
Criterion), model validation (eg Stabililty Index [@Lutkepohl2007],
Criterion), model validation (e.g. Stability Index [@Lutkepohl2007],
Durbin-Watson criteria [@DurbinWatson1950] and Percent Consistency [@Ding2000]).
Once fitted, a range of spectral features such as the power spectral density
and the transfer function can be estimated from the fitted model parameters. An
......@@ -68,24 +68,45 @@ information across multiple nodes to routines to simplify the use of external
plotting programs such as Circos [@Krzywinski2009] for circular plots.
Examples of such plots are included in the Screenshots section of this paper.
Several Python packages related functionality to SAILS. Firstly, there are a
range of packages implementing multivariate regression fits (including
# Target Audience
SAILS is intended for use in scientific and engineering data analysis of
multivariate oscillating systems. As this opens up a wide range of potential
use cases, we include an incomplete set of examples for illustration. An
economist might use power spectra to quantify oscillatory or
cyclical changes in economic variables such as financial markets. A biomedical
engineer analysing M/EEG data may be interested in the spectral connectivity
metrics for quantifying how oscillatory synchrony in brain networks changes
between clinical populations. Similarly, a cognitive neuroscientist could use
the sliding window model fits on MEG data to explore how brain networks change
during a cognitive task. As a final example, an engineer or physicist may use
the functions to explore resonant vibrations in a physical system. In practice,
these features can be used on any time-series to quantify oscillatory
features, their spatial distribution and network interactions.
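As a minimal sketch of this generic workflow (the channel count, model order and frequency grid below are illustrative placeholders, not a prescribed recipe):

```python
import numpy as np
import sails

# Any [nchannels x nsamples x ntrials] time-series array
X = np.random.randn(4, 1024, 1)
sample_rate = 128

# Fit an autoregressive model of order 5 using the Vieira-Morf method
delay_vect = np.arange(6)
m = sails.VieiraMorfLinearModel.fit_model(X, delay_vect)

# Derive frequency-domain metrics from the fitted parameters
freq_vect = np.linspace(0, sample_rate / 2, 64)
F = sails.FourierMvarMetrics.initialise(m, sample_rate, freq_vect)

psd = F.PSD                          # power spectral density per node
pdc = F.partial_directed_coherence   # directed connectivity metric
```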
# State of the field
Several Python packages provide related functionality to SAILS. Firstly, there
are a range of packages implementing multivariate regression fits (including
[numpy.linalg](https://docs.scipy.org/doc/numpy/reference/routines.linalg.html#solving-equations-and-inverting-matrices)
and [statsmodels.tsa](https://www.statsmodels.org/stable/tsa.html)). SAILS is
readily extensible to work with model fits from other packages by creating a
class inheriting from AbstractLinearModel and implementing the fit method to
call the external package. Other packages (such as
call the external package. Some spectral connectivity measures (based on
multi-taper or Wavelet transforms and not currently including MVAR based
measures such as Partial Directed Coherence) are implemented in
[MNE-python](https://mne.tools/stable/generated/mne.connectivity.spectral_connectivity.html),
a specialised electrophysiology analysis package. Other packages (such as
[pyspectrum](https://pyspectrum.readthedocs.io)) provide autoregressive
spectrum estimation methods but are not readily extensible to multivariate
datasets or connectivity metrics. Finally, some packages implement a varying
range of autoregressive connectivity estimates (such as
[nitime](http://nipy.org/nitime/), [SCoT](https://github.com/scot-dev/scot),
[connectivipy](https://connectivipy.readthedocs.io) and
[spectral_connectivity](https://github.com/Eden-Kramer-Lab/spectral_connectivity).
[spectral_connectivity](https://github.com/Eden-Kramer-Lab/spectral_connectivity)).
SAILS provides overlapping functionality to these packages alongside advanced
methods such as the modal decomposition.
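For illustration, such a wrapper might look like the following sketch (this assumes `AbstractLinearModel` is importable from the top-level `sails` namespace and that the `parameters`, `resid_cov` and `delay_vect` attributes mirror those set by the built-in model classes; the exact interface should be checked against the API documentation):

```python
import numpy as np
import sails


class ExternalFitModel(sails.AbstractLinearModel):
    """Hypothetical model class delegating the fit to an external package."""

    @classmethod
    def fit_model(cls, data, delay_vect):
        ret = cls()
        nsignals = data.shape[0]
        # Call the external fitting routine here and convert its output into
        # the [nsignals x nsignals x nlags] parameter array used by SAILS
        ret.parameters = np.zeros((nsignals, nsignals, len(delay_vect)))
        ret.resid_cov = np.eye(nsignals)  # residual covariance of the fit
        ret.delay_vect = delay_vect
        return ret
```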
# Availability and Installation
SAILS is released under a GPLv2 or later license.
......@@ -124,12 +145,12 @@ All screenshots are based around material found in the online tutorials
![Example of plot using AIC to determine model order](aic.png)
![Example of a Partial Directed Coherence matrix plot](F_PDC.png)
![Example of a Partial Directed Coherence matrix plot](F_PDC.png)
![Example of plot showing the pole representation of the AR system](poleplot.png)
![Example of circular connectivity plot generated using Circos](aal_circular.png)
![Example of netmat connectivity plot generated using Circos](aal_netmat.png)
![Example of netmat connectivity plot](aal_netmat.png)
# References
......@@ -68,9 +68,9 @@ author = 'Andrew Quinn, Mark Hymers'
# built documents.
#
# The short X.Y version.
version = '0.0.1'
version = '1.0'
# The full version, including alpha/beta/rc tags.
release = '0.0.1'
release = '1.0.0'
# The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages.
......
......@@ -41,6 +41,7 @@ SAILS can be installed from `PyPI <https://pypi.org/project/sails/>`_ using pip:
and used to model and describe frequency content in networks of time-series::
import sails
import numpy as np
# Create a simulated signal
sample_rate = 100
......
......@@ -16,4 +16,6 @@ various plotting routines.
tutorials/tutorial6
tutorials/tutorial7
tutorials/tutorial8
tutorials/tutorial9
tutorials/tutorial10
......@@ -104,22 +104,67 @@ take this into account for you.
M_H = M.modes.per_mode_transfer_function(sample_rate, freq_vect)
Finally, we can plot our modes in both forms to examine the relationship
between them.
We can plot our modes in both forms to examine the relationship between them.
Firstly, we sum the modal transfer function across all modes to recover the
Fourier based transfer function (to within computational precision).
.. code-block:: python
fourier_H = np.abs(F_H).squeeze()
modal_H = np.abs(M_H.sum(axis=1)).squeeze()
# Check the two forms are equivalent
equiv = np.allclose( fourier_H, modal_H )
ssdiff = np.sum( (fourier_H - modal_H)**2 )
print('Fourier and Modal transfer functions equivalent: {0}'.format(equiv))
print('Sum-square residual difference: {0}'.format(ssdiff))
# Plot our fourier and modal spectra
f2 = plt.figure()
plt.plot(freq_vect, np.abs(F_H).squeeze(), 'o');
plt.plot(freq_vect, np.abs(M_H).squeeze());
plt.plot(freq_vect, fourier_H, 'o');
plt.plot(freq_vect, modal_H);
plt.xlabel('Frequency (Hz)')
plt.ylabel('Frequency Response')
plt.legend(['Fourier H', 'Modal H'])
plt.savefig('tutorial1_2.png', dpi=300)
f2.show()
.. image:: tutorial1_2.png
Finally, we can see how each mode contributes to the overall transfer function
shape by plotting each mode separately. Each mode contributes a single
uni-modal resonance to the spectrum. In this case there are no clear peaks in
the spectrum, just the 1/f shape - as such, all the mode peaks sum together to
make a smooth spectrum. Note that we're plotting the modes on a log y-scale as
some modes make a very small contribution to the overall transfer function.
.. code-block:: python
fourier_H = np.abs(F_H).squeeze()
modal_H = np.abs(M_H).squeeze()
# Plot our fourier and modal spectra
f2 = plt.figure()
plt.semilogy(freq_vect, fourier_H, 'o');
plt.semilogy(freq_vect, modal_H);
plt.xlabel('Frequency (Hz)')
plt.ylabel('Frequency Response')
legend = ['Mode' + str(ii) for ii in range(modal_H.shape[1])]
plt.legend(['Fourier H'] + legend)
plt.savefig('tutorial1_3.png', dpi=300)
f2.show()
.. image:: tutorial1_3.png
[Image diffs collapsed: doc/source/tutorials/tutorial1_1.png, tutorial1_2.png and tutorial2_1.png replaced with updated versions.]
......@@ -37,7 +37,7 @@ parameters and use it to compute the connectivity metrics.
m = siggen.generate_model()
freq_vect = np.linspace(0,sample_rate/2,36)
freq_vect = np.linspace(0, sample_rate/2, 36)
F = sails.FourierMvarMetrics.initialise(m, sample_rate, freq_vect)
......@@ -49,11 +49,9 @@ our simulation parameter.
.. code-block:: python
import matplotlib.pyplot as plt
sails.plotting.plot_vector( np.abs(F.S), freq_vect, diag=True )
fig = sails.plotting.plot_vector(np.abs(F.S), freq_vect, diag=True)
plt.show()
fig.show()
.. image:: tutorial5_1.png
......@@ -70,9 +68,9 @@ able to show when connections are not symmetrical
.. code-block:: python
sails.plotting.plot_vector( F.directed_transfer_function, freq_vect, diag=True )
fig = sails.plotting.plot_vector(F.directed_transfer_function, freq_vect, diag=True)
plt.show()
fig.show()
.. image:: tutorial5_2.png
......@@ -87,9 +85,9 @@ The Partial Directed Coherence aims to address this problem.
.. code-block:: python
sails.plotting.plot_vector( F.partial_directed_coherence, freq_vect, diag=True )
fig = sails.plotting.plot_vector(F.partial_directed_coherence, freq_vect, diag=True)
plt.show()
fig.show()
.. image:: tutorial5_3.png
......@@ -131,17 +129,19 @@ sample_rate and number of samples to generate
.. code-block:: python
X = siggen.generate_signal(sample_rate=128,num_samples=640)
X = siggen.generate_signal(sample_rate=128, num_samples=640)
``X`` is a ``(nchannels x nsamples)`` array containing our simulated data. We can
plot ``X`` using matplotlib
.. code-block:: python
import matplotlib.pyplot as plt
plt.figure()
for ii in range(5):
plt.plot(X[ii,:]+(ii*10))
plt.plot(X[ii, :] + (ii * 10))
plt.show()
......@@ -162,7 +162,7 @@ did in previous tutorials.
F = sails.FourierMvarMetrics.initialise(m, sample_rate, freq_vect)
diag = m.compute_diagnostics( X )
diag = m.compute_diagnostics(X)
We check that our model is fitting well by interrogating the diagnostics. Here
we see that we are explaining around 56% of the total variance in the signal
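As a quick sanity check, the diagnostics computed above can be printed directly (a minimal sketch; the exact value depends on the random realisation of the simulated data):

.. code-block:: python

# Per-channel variance explained by the fitted model
print(diag.R_square)
print('Mean R_square: {0}'.format(diag.R_square.mean()))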
......@@ -175,13 +175,13 @@ Partial Directed Coherence from the 'true' model.
m0 = siggen.generate_model() # This is our true model
F0 = sails.FourierMvarMetrics.initialise( m0, sample_rate, freq_vect )
F0 = sails.FourierMvarMetrics.initialise(m0, sample_rate, freq_vect)
pdc = np.concatenate( (F0.partial_directed_coherence,F.partial_directed_coherence), axis=3 )
pdc = np.concatenate((F0.partial_directed_coherence, F.partial_directed_coherence), axis=3)
sails.plotting.plot_vector( pdc, freq_vect, diag=True,line_labels=['True','Fitted'])
fig = sails.plotting.plot_vector(pdc, freq_vect, diag=True, line_labels=['True', 'Fitted'])
plt.show()
fig.show()
.. image:: tutorial5_5.png
......
......@@ -10,25 +10,45 @@ We start by importing our modules and finding and loading the example data.
.. code-block:: python
from os.path import join
import os
from os.path import join
import h5py
import h5py
import numpy as np
import matplotlib.pyplot as plt
import numpy as np
import matplotlib.pyplot as plt
from sails import find_example_path
import sails
plt.style.use('ggplot')
plt.style.use('ggplot')
ex_dir = find_example_path()
sample_rate = 2034.51 / 24
nyq = sample_rate / 2.
SAILS will automatically detect the example data if downloaded into your home
directory. If you've used a different location, you can specify this in an
environment variable named ``SAILS_EXAMPLE_DATA``.
freq_vect = np.linspace(0, nyq, 64)
.. code-block:: python
# Specify environment variable with example data location
# This is only necessary if you have not checked out the
# example data into $HOME/sails-example-data
os.environ['SAILS_EXAMPLE_DATA'] = '/path/to/sails-example-data'
# Locate and validate example data directory
example_path = sails.find_example_path()
# Load data using h5py
data_path = os.path.join(sails.find_example_path(), 'meg_occipital_ve.hdf5')
X = h5py.File(data_path, 'r')['X'][:, 122500:130000, :]
X = sails.utils.fast_resample(X, 5/2)
sample_rate = 250 / (5/2)
nyq = sample_rate / 2.
time_vect = np.linspace(0, X.shape[1] // sample_rate, X.shape[1])
freq_vect = np.linspace(0, nyq, 64)
X = h5py.File(join(ex_dir, 'meg_single.hdf5'), 'r')['X'][:, 0:11000, :]
As a reminder, our data is (nsignals, nsamples, ntrials):
......@@ -38,7 +58,7 @@ As a reminder, our data is (nsignals, nsamples, ntrials):
.. code-block:: console
(1, 30517, 1)
(1, 3000, 1)
The idea behind a sliding window analysis is to fit a separate MVAR model
to short windows of the data. For the purposes of this example, we will
......@@ -63,7 +83,7 @@ We could also choose to examine our Fourier MVAR metrics.
.. code-block:: console
0.170369632033
0.6695709340724934
We can then repeat this procedure for the next, overlapping, window:
......@@ -79,7 +99,7 @@ We can then repeat this procedure for the next, overlapping, window:
.. code-block:: python
0.172925597904
0.6567768337622513
This is obviously a time-consuming task, so we provide a helper function
which will take a set of time-series data and compute a model and
......@@ -104,7 +124,9 @@ which contains the diagnostic information for all of the windows. Use of the
from sails import sliding_window_fit
M, D = sliding_window_fit(VieiraMorfLinearModel, X, delay_vect, 200, 1)
M, D = sliding_window_fit(VieiraMorfLinearModel, X, delay_vect, 100, 1)
model_time_vect = time_vect[M.time_vect.astype(int)]
Once we have constructed our models, we can go ahead and construct our
:class:`~sails.mvar_metrics.FourierMvarMetrics` class which will allow us to,
......@@ -128,9 +150,10 @@ On the top row, we plot our data.
plt.subplot(5, 1, 1)
plt.plot(X[0,:,0])
plt.plot(time_vect, X[0, :, 0])
plt.xlim(0,10000); plt.grid(True)
plt.xlim(0, 30)
plt.grid(True)
plt.ylabel('Amplitude')
......@@ -141,9 +164,10 @@ windows:
plt.subplot(5, 1, (2, 3))
plt.contourf(M.time_vect, F.freq_vect[3:], F.H[0, 0, 3:, :])
plt.contourf(model_time_vect, F.freq_vect[3:], np.sqrt(np.abs(F.PSD[0, 0, 3:, :])))
plt.xlim(0,10000); plt.grid(True)
plt.xlim(0, 30)
plt.grid(True)
plt.ylabel('Frequency (Hz)')
......@@ -154,11 +178,12 @@ Underneath the transfer function, we examine the stability index
plt.subplot(5, 1, 4)
plt.plot(D.SI)
plt.plot(model_time_vect, D.SI)
plt.plot(D.R_square)
plt.plot(model_time_vect, D.R_square)
plt.xlim(0, 10000); plt.grid(True)
plt.xlim(0, 30)
plt.grid(True)
plt.ylabel('SI / $R^2$')
......@@ -168,11 +193,13 @@ On the bottom row, we plot the AIC value for each of the models:
plt.subplot(5, 1, 5)
plt.plot(D.AIC)
plt.plot(model_time_vect, D.AIC)
plt.xlim(0,10000); plt.grid(True)
plt.xlim(0, 30)
plt.grid(True)
plt.xlabel('Time (samples)')
plt.ylabel('AIC')
Finally, we can look at our overall figure:
......@@ -181,7 +208,7 @@ Finally, we can look at our overall figure:
f1.tight_layout()
plt.show()
f1.show()
.. image:: tutorial6_1.png
[Image diff collapsed: doc/source/tutorials/tutorial6_1.png replaced with an updated version.]
......@@ -14,12 +14,12 @@ We start by importing our modules and finding and finding our example path
from os.path import join
from sails import find_example_path
from sails import find_support_path
ex_dir = find_example_path()
support_dir = find_support_path()
group_csv = join(ex_dir, 'aal_cortical_plotting_groups.csv')
region_csv = join(ex_dir, 'aal_cortical_plotting_regions.csv')
group_csv = join(support_dir, 'aal_cortical_plotting_groups.csv')
region_csv = join(support_dir, 'aal_cortical_plotting_regions.csv')
The two files which are imported above describe the layout of the plots which
......
Tutorial 9 - Modal decomposition of a simulated system
======================================================
One of the main features of SAILS is the ability to perform a modal
decomposition on fitted MVAR models. This tutorial will outline the use of
modal decomposition on some simulated data. This example is included
as part of an upcoming paper from the authors of SAILS.
The system which is being simulated consists of 10 "nodes". Each node is made
up of a mixture of two components, each of which is distributed differently
amongst the nodes. We start by simulating the data for a single realisation:
.. code-block:: python
import numpy as np
import scipy.signal as signal
import matplotlib.pyplot as plt
import sails
plt.style.use('ggplot')
# General configuration for the generation and analysis
sample_rate = 128
num_samples = sample_rate*50
f1 = 10
f2 = 0
siggen = sails.simulate.SailsTutorialExample()
num_sources = siggen.get_num_sources()
weight1, weight2 = siggen.get_connection_weights()
# Calculate the base signals for reference later on
sig1 = siggen.generate_basesignal(f1, .8+.05, sample_rate, num_samples)
sig2 = siggen.generate_basesignal(f2, .61, sample_rate, num_samples)
sig1 = (sig1 - sig1.mean(axis=0)) / sig1.std(axis=0)
sig2 = (sig2 - sig2.mean(axis=0)) / sig2.std(axis=0)
# Generate a network realisation
X = siggen.generate_signal(f1, f2, sample_rate, num_samples)
At this point the variable `X` contains the simulated data. We start by
visualising this data. `sig1` and `sig2` contain the base signals which are
used for visualisation purposes only. Finally, `weight1` and `weight2` contain
the connection weights (the amount to which each signal is spread across the
nodes).
Next we visualise the system before fitting our model. This section of
the code is for plotting purposes only and can be skipped if you are happy
to just examine the plot following it.
.. code-block:: python
# Calculate the Welch PSD of the signals
f, pxx1 = signal.welch(sig1.T, fs=sample_rate, nperseg=128, nfft=512)
f, pxx2 = signal.welch(sig2.T, fs=sample_rate, nperseg=128, nfft=512)
# Visualise the signal and the mixing of it
x_label_pos = -2.25
x_min = -1.75
plot_seconds = 5
plot_samples = int(plot_seconds * sample_rate)
# Plot the two base signals
plt.figure(figsize=(7.5, 2.5))
plt.axes([.15, .8, .65, .15], frameon=False)
plt.plot(sig1[:plot_samples], 'r')
plt.plot(sig2[:plot_samples] + 7.5, 'b')
plt.xlim(-10, plot_samples)
yl = plt.ylim()
plt.xticks([])
plt.yticks([])
# Add labelling to the base signals
plt.axes([.05, .8, .1, .15], frameon=False)
plt.text(x_label_pos, 7.5, 'Mode 1', verticalalignment='center', color='b')
plt.text(x_label_pos, 0, 'Mode 2', verticalalignment='center', color='r')
plt.xlim(x_label_pos, 1)
plt.ylim(*yl)
plt.xticks([])
plt.yticks([])
# Add a diagram of the Welch PSD to each signal
plt.fill_between(np.linspace(0, 2, 257), np.zeros((257)), 20 * pxx1[0, :], color='r')
plt.fill_between(np.linspace(0, 2, 257), np.zeros((257)) + 7.5, 20 * pxx2[0, :]+7.5, color='b')
plt.xlim(x_min, 1)
# Plot the mixed signals in the network
plt.axes([.15, .1, .65, .7], frameon=False)
plt.plot(X[:, :plot_samples, 0].T + np.arange(num_sources) * 7.5, color='k', linewidth=.5)
plt.hlines(np.arange(num_sources)*7.5, -num_sources, plot_samples, linewidth=.2)
plt.xlim(-num_sources, plot_samples)
plt.xticks([])
plt.yticks([])
yl = plt.ylim()
# Add labelling and the weighting information
plt.axes([.05, .1, .1, .7], frameon=False)
plt.barh(np.arange(num_sources)*7.5+1, weight1, color='r', height=2)
plt.barh(np.arange(num_sources)*7.5-1, weight2, color='b', height=2)
plt.xlim(x_min, 1)
plt.ylim(*yl)
for ii in range(num_sources):
plt.text(x_label_pos, 7.5 * ii, 'Node {0}'.format(ii+1), verticalalignment='center')
plt.xticks([])
plt.yticks([])
# Plot the mixing matrices
wmat1, wmat2 = siggen.get_connection_weights('matrix')
plt.axes([.85, .6, .13, .34])
plt.pcolormesh(wmat2, cmap='Blues')
plt.xticks(np.arange(num_sources, 0, -1) - .5, [''] * num_sources, fontsize=6)
plt.yticks(np.arange(num_sources, 0, -1) - .5, np.arange(num_sources, 0, -1), fontsize=6)
plt.axis('scaled')
plt.axes([.85, .2, .13, .34])
plt.pcolormesh(wmat1, cmap='Reds')
plt.xticks(np.arange(num_sources, 0, -1)-.5, np.arange(num_sources, 0, -1), fontsize=6)
plt.yticks(np.arange(num_sources, 0, -1)-.5, np.arange(num_sources, 0, -1), fontsize=6)
plt.axis('scaled')
.. image:: tutorial9_1.png
Next we fit an MVAR model and examine the Modal decomposition:
.. code-block:: python
model_order = 5
delay_vect = np.arange(model_order + 1)
freq_vect = np.linspace(0, sample_rate/2, 512)
# Fit our MVAR model and perform the modal decomposition
m = sails.VieiraMorfLinearModel.fit_model(X, delay_vect)
modes = sails.MvarModalDecomposition.initialise(m, sample_rate)
# Now extract the metrics for the modal model
M = sails.ModalMvarMetrics.initialise(m, sample_rate,
freq_vect, sum_modes=False)
Each mode in the decomposition consists of either a single pole
(where it is a DC component) or a pair of poles (where it is
an oscillator). We can extract the indices of the pole pairs
using the ``mode_indices`` property. Given the nature of
our simulation, we would expect to find one DC pole and one
pole pair at 10Hz with a stronger effect than all other poles.
For the purposes of this tutorial, we are going to set an arbitrary
threshold on the modes; in reality a permutation testing scheme
can be used to determine this threshold (which will be the
subject of a future tutorial).
.. code-block:: python
# For real data and simulations this can be estimated using
# a permutation scheme
pole_threshold = 2.3
# Extract the pairs of mode indices
pole_idx = modes.mode_indices
# Find which modes pass threshold
surviving_modes = []
for idx, poles in enumerate(pole_idx):
# The dampening times of a pair of poles will be identical,
# so we can just check the first one
if modes.dampening_time[poles[0]] > pole_threshold:
surviving_modes.append(idx)
# Use root plot to plot all modes and then replot
# the surviving modes on top of them
ax = sails.plotting.root_plot(modes.evals)
low_mode = None
high_mode = None
for mode in surviving_modes:
# Pick the colour based on the peak frequency
if modes.peak_frequency[pole_idx[mode][0]] < 0.001:
color = 'b'
low_mode = mode
else:
color = 'r'
high_mode = mode
for poleidx in pole_idx[mode]:
ax.plot(modes.evals[poleidx].real, modes.evals[poleidx].imag,
marker='+', color=color)
.. image:: tutorial9_2.png
From this, we can see that the modal decomposition has clearly extracted a set
of poles of importance. We can now visualise the connectivity patterns
for these modes as well as their spectra:
.. code-block:: python
# The frequency location simply acts as a scaling factor at this
# point, but we pick the closest bin in freq_vect to the
# peak frequency of the mode
low_freq_idx = np.argmin(np.abs(freq_vect - modes.peak_frequency[pole_idx[low_mode][0]]))
high_freq_idx = np.argmin(np.abs(freq_vect - modes.peak_frequency[pole_idx[high_mode][0]]))
# We can now plot the two graphs
plt.figure()
low_psd = np.sum(M.PSD[:, :, :, pole_idx[low_mode]], axis=3)
high_psd = np.sum(M.PSD[:, :, :, pole_idx[high_mode]], axis=3)
# Plot the connectivity patterns as well as the spectra
plt.subplot(2, 2, 1)
plt.pcolormesh(low_psd[:, :, low_freq_idx], cmap='Blues')
plt.xticks(np.arange(num_sources, 0, -1)-.5, np.arange(num_sources, 0, -1), fontsize=6)
plt.yticks(np.arange(num_sources, 0, -1)-.5, np.arange(num_sources, 0, -1), fontsize=6)
plt.subplot(2, 2, 2)
for k in range(num_sources):
plt.plot(freq_vect, low_psd[k, k, :])
plt.subplot(2, 2, 3)
plt.pcolormesh(high_psd[:, :, high_freq_idx], cmap='Reds')
plt.xticks(np.arange(num_sources, 0, -1)-.5, np.arange(num_sources, 0, -1), fontsize=6)
plt.yticks(np.arange(num_sources, 0, -1)-.5, np.arange(num_sources, 0, -1), fontsize=6)
plt.subplot(2, 2, 4)
for k in range(num_sources):
plt.plot(freq_vect, high_psd[k, k, :])
plt.xlabel('Frequency (Hz)')
.. image:: tutorial9_3.png
In this tutorial we have demonstrated how to use the modal decomposition
functionality in SAILS to decompose the parameters of an MVAR model and then to
extract spectral and connectivity patterns from just the modes of significance.
This approach can be extended across participants and, with the additional use
of PCA, used to simultaneously estimate spatial and spectral patterns in data.
More details of this approach will be available in an upcoming paper from
the authors of SAILS.
#!/usr/bin/python
# vim: set expandtab ts=4 sw=4:
# Avoid having code to make anamnesis optional in multiple places
try:
from anamnesis import AbstractAnam, register_class
except ImportError:
AbstractAnam = object
def register_class(x):
return None
__all__ = ['AbstractAnam', 'register_class']
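# Illustrative usage pattern (as used by the classes elsewhere in sails):
# serialisable classes inherit from AbstractAnam, list their persistent
# attributes in ``hdf5_outputs`` and register themselves after definition.
#
#     from .anam import AbstractAnam, register_class
#
#     class Thing(AbstractAnam):
#         hdf5_outputs = ['value']
#
#         def __init__(self):
#             AbstractAnam.__init__(self)
#             self.value = None
#
#     register_class(Thing)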
......@@ -4,19 +4,29 @@
import numpy as np
from .anam import AbstractAnam, register_class
from .utils import get_valid_samples
from .stats import find_cov_multitrial, durbin_watson, \
stability_index, stability_ratio, \
rsquared_from_residuals, percent_consistency
from .modelvalidation import approx_model_selection_criteria
__all__ = []
class DelayDiagnostics(object):
class DelayDiagnostics(AbstractAnam):
"""Class which computes the mutual information as a function of lag from zero
lag to the first zero crossing of the autocorrelation function.
"""
# This is only used within NAF
# This is only used within Anamnesis
hdf5_outputs = ['MI', 'MI_diff', 'delay_vect_samples', 'delay_vect_ms',
'time_vect', 'autocorrelation', 'maxdelay', 'first_zero']
def __init__(self):
AbstractAnam.__init__(self)
# TODO: Add docstrings for these
self.MI = None
......@@ -146,42 +156,210 @@ class DelayDiagnostics(object):
__all__.append('DelayDiagnostics')
register_class(DelayDiagnostics)
class ModelDiagnostics(AbstractAnam):
"""
Class to store and display LinearModel diagnostic information. Several
diagnostic criteria are computed including:
R_square (``R_square``)
the percentage of variance explained in each channel
Stability Index (``SI``)
Indicator of the stability of the MVAR parameters (SI<1 indicates a
stable model)
Stability Ratio (``SR``)
A stronger test of stability computed from the ratio of the largest
eigenvalue of A to all others.
Durbin-Watson (``DW``)
A test of autocorrelation in the residuals of the model fit. Values
should be close to 2, indicating no autocorrelation; values close to 0
indicate a positive autocorrelation and values close to 4 a negative
autocorrelation.
class ModelDiagnostics(object):
"""Class to store and display LinearModel diagnostic information"""
Log Likelihood (``LL``)
The log-likelihood of the model.
# This is only used within NAF
Akaike's Information Criterion (``AIC``)
An indication of the model 'quality', lower values indicate a more
accurate, less complex model.
Bayesian Information Criterion (``BIC``)
An indication of the model 'quality', lower values indicate a more
accurate, less complex model.
Percent Consistency (``PC``)
Indicates how well a model captures the auto and cross correlation in a
time-series. Only computed if ``compute_pc`` is passed to the relevant
function.
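Illustrative usage (a sketch; ``model`` is a fitted linear model and ``data``
a [nchannels x nsamples x ntrials] array):

>>> diag = ModelDiagnostics.compute(model, data)  # doctest: +SKIP
>>> diag.summary()  # doctest: +SKIP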
"""
# This is only used within Anamnesis
hdf5_outputs = ['R_square', 'SR', 'DW', 'PC', 'AIC', 'BIC', 'LL', 'SI']
def __init__(self):
# TODO: Add docstrings for these
self.R_square = None
"""Constructor for ModelDiagnostics.
self.SR = None
This fills the diagnostic variables with placeholders. The diagnostics
should be computed using the ModelDiagnostics.compute classmethod.
self.DW = None
"""
self.PC = None
AbstractAnam.__init__(self)
self.R_square = None
self.SR = None
self.DW = None
self.PC = None
self.AIC = None
self.BIC = None
self.LL = None
self.SI = None
def __str__(self):
"""Show the measures in a pre-formatted manner"""
template = "{0:^10}{1:^10}{2:^10}{3:^10}{4:^10}{5:^10}"
self.is_list = False
@classmethod
def compute(cls, model, data, compute_pc=False):
"""
Classmethod for computing a set of model diagnostics from a fitted
model applied to a time-series dataset.
Parameters
----------
model: sails LinearModel class
A fitted linear model
data: ndarray
A 3d time-series of size [nchannels x nsamples x ntrials]
compute_pc: bool
Flag indicating whether to compute the percent consistency, this
can be time-consuming for large datasets (Default=False).
Returns
-------
sails ModelDiagnostics instance
"""
ret = cls()
# Get residuals - adjust valid samples next
resid = model.get_residuals(data, mode='full')
# Adjust data and resids for valid samples
data = get_valid_samples(data, model.delay_vect)
resid = get_valid_samples(resid, model.delay_vect)
# Get covariance of the residuals
ret.resid_cov = find_cov_multitrial(resid, resid)
# Compute Durbin-Watson test for residual autocorrelation
ret.DW = durbin_watson(resid, step=np.diff(model.delay_vect)[0])[0]
# Compute the stability index and ratio
ret.SI = stability_index(model.parameters)
ret.SR = stability_ratio(model.parameters)
# Estimate the R^2 - per signal
ret.R_square = rsquared_from_residuals(data, resid, per_signal=True)
# Estimate Percent Consistency - Ding et al 2000.
if compute_pc:
# This can cause memory issues with large datasets
ret.PC = percent_consistency(data, resid)
else:
ret.PC = None
# Compute LL, AIC and BIC
# LL, AIC, BIC = est_model_selection_criteria(model.parameters, resid)
# Approximate AIC and BIC
LL, AIC, BIC = approx_model_selection_criteria(model.parameters, resid)
ret.LL = LL
ret.AIC = AIC
ret.BIC = BIC
return ret
@classmethod
def combine_diag_list(cls, diags):
"""
Helper function for combining diagnostics from a list of
ModelDiagnostics instances for easy comparison and visualisation.
Parameters
----------
diags: list of ModelDiagnostics instances
The ModelDiagnostics to concatenate
Returns
-------
sails ModelDiagnostics instance
"""
ret = cls()
ret.resid_cov = np.concatenate([d.resid_cov[..., None] for d in diags], axis=2)
ret.DW = np.r_[[d.DW for d in diags]]
ret.SI = np.r_[[d.SI for d in diags]]
ret.SR = np.r_[[d.SR for d in diags]]
if diags[0].PC is not None:
ret.PC = np.r_[[d.PC for d in diags]]
ret.R_square = np.c_[[d.R_square for d in diags]]
ret.LL = np.r_[[d.LL for d in diags]]
ret.AIC = np.r_[[d.AIC for d in diags]]
ret.BIC = np.r_[[d.BIC for d in diags]]
ret.is_list = True
return ret
return template.format(np.round(self.SI, 3),
np.round(self.SR, 3),
np.round(self.DW, 3),
np.round(self.AIC, 3),
np.round(self.BIC, 3),
np.round(self.R_square.mean(), 3))
def summary(self, all_models=True):
"""Print the ModelDiagnostics in a pre-formatted table"""
if self.is_list:
template = "{0:<10}{1:<10}{2:<10}{3:<10}{4:<10}{5:<10}{6:<10}"
print(template.format('Model', 'SI', 'SR', 'DW', 'AIC', 'BIC', 'R Square'))
print('_'*70)
for ii in range(self.SI.shape[0]):
print(template.format(ii,
np.round(self.SI[ii], 3),
np.round(self.SR[ii], 3),
np.round(self.DW[ii], 3),
np.round(self.AIC[ii], 3),
np.round(self.BIC[ii], 3),
np.round(self.R_square[ii, :].mean(), 3)))
print('_'*70)
print(template.format('Total',
np.round(self.SI.mean(), 3),
np.round(self.SR.mean(), 3),
np.round(self.DW.mean(), 3),
np.round(self.AIC.mean(), 3),
np.round(self.BIC.mean(), 3),
np.round(self.R_square.mean(), 3)))
else:
template = "{0:<10}{1:<10}{2:<10}{3:<10}{4:<10}{5:<10}"
print(template.format('SI', 'SR', 'DW', 'AIC', 'BIC', 'R Square'))
print('_'*60)
print(template.format(np.round(self.SI, 3),
np.round(self.SR, 3),
np.round(self.DW, 3),
np.round(self.AIC, 3),
np.round(self.BIC, 3),
np.round(self.R_square.mean(), 3)))
__all__.append('ModelDiagnostics')
register_class(ModelDiagnostics)
......@@ -7,6 +7,7 @@ from scipy import signal
import numpy as np
import matplotlib.pyplot as plt
from .anam import AbstractAnam, register_class
from sails.support import ensure_leading_negative
from sails.modelfit import coloured_noise_fit
......@@ -54,22 +55,22 @@ def adjust_phase(x):
__all__.append('adjust_phase')
class MvarModalDecomposition(object):
class MvarModalDecomposition(AbstractAnam):
"""Object which calculates the modal decomposition of a fitted model
derived from the AbstractLinearModel class. This is typically created using
the :meth:`initialise` classmethod which computes the decomposition and returns a
populated object instance.
"""
# This is only used when used within NAF
# This is only used when used within Anamnesis
hdf5_outputs = ['nsignals', 'order', 'companion', 'evals', 'evecsr',
'evecsl', 'decoupled_cov', 'abs_lambda_sq', 'excitation',
'modeinds', 'freq_vect', 'period', 'dampening_time',
'freq_vect_hz', 'peak_frequency', 'dampening_time_hz',
'freq_response', 'phase_response', 'pole_pairs', 'residue',
'freq_response_modal', 'phase_response_modal', 'nmodes',
'evecsl', 'pole_pairs', 'modeinds', 'period',
'dampening_time', 'peak_frequency', 'dampening_time_hz', 'nmodes',
'delay_vect']
def __init__(self):
AbstractAnam.__init__(self)
@classmethod
def initialise(cls, model, sample_rate=None, normalise=False):
"""Compute the modal pole-residue decomposition of the A matrix from a
......@@ -657,6 +658,8 @@ class MvarModalDecomposition(object):
ax.annotate('Frequency', xy=(.56, 1.02))
ax.set_aspect('equal')
elif plottype == 'eigenspectrum':
mags = np.abs(self.evals)
......@@ -683,6 +686,7 @@ class MvarModalDecomposition(object):
__all__.append('MvarModalDecomposition')
register_class(MvarModalDecomposition)
def find_noise_poles(modes, model, X, sample_rate, metric='diff', include_real_poles=True):
......
......@@ -4,16 +4,22 @@
import numpy as np
from .anam import AbstractAnam, register_class
from sails.support import ensure_leading_positive, ensure_leading_negative
from sails import plotting
__all__ = []
class AbstractMVARMetrics(object):
class AbstractMVARMetrics(AbstractAnam):
"""This is a base class for computing various connectivity metrics from
fitted autoregressive models"""
hdf5_outputs = []
def __init__(self):
AbstractAnam.__init__(self)
@property
def S(self):
"""Spectral Matrix of shape [nsignals, nsignals, order, epochs]"""
......@@ -166,6 +172,7 @@ class AbstractMVARMetrics(object):
__all__.append('AbstractMVARMetrics')
register_class(AbstractMVARMetrics)
class ModalMvarMetrics(AbstractMVARMetrics):
......@@ -178,8 +185,11 @@ class ModalMvarMetrics(AbstractMVARMetrics):
"""
# This is only used when used within NAF
hdf5_outputs = ['resid_cov', 'freq_vect', 'H', 'modes']
def __init__(self):
AbstractMVARMetrics.__init__(self)
# This is only used when used within Anamnesis
hdf5_outputs = ['resid_cov', 'freq_vect', 'sample_rate', 'H', 'modes']
@classmethod
def initialise(cls, model, sample_rate, freq_vect, sum_modes=True):
......@@ -280,7 +290,7 @@ class ModalMvarMetrics(AbstractMVARMetrics):
ret = cls()
ret.modes = modes
ret.freq_vect = freq_vect
ret.sample_rate = sample_rate
ret.sample_rate = float(sample_rate)
A = model.parameters
resid_cov = model.resid_cov
......@@ -314,6 +324,7 @@ class ModalMvarMetrics(AbstractMVARMetrics):
__all__.append('ModalMvarMetrics')
register_class(ModalMvarMetrics)
class FourierMvarMetrics(AbstractMVARMetrics):
......@@ -326,8 +337,11 @@ class FourierMvarMetrics(AbstractMVARMetrics):
"""
# This is only used when used within NAF
hdf5_outputs = ['resid_cov', 'freq_vect', 'H', '_A', 'Af']
# This is only used when used within anamnesis
hdf5_outputs = ['resid_cov', 'freq_vect', 'sample_rate', 'H', '_A', 'Af']
def __init__(self):
AbstractMVARMetrics.__init__(self)
@classmethod
def initialise(cls, model, sample_rate, freq_vect, nmodes=None):
......@@ -375,7 +389,7 @@ class FourierMvarMetrics(AbstractMVARMetrics):
ret.resid_cov = resid_cov
ret.freq_vect = freq_vect
ret.sample_rate = sample_rate
ret.sample_rate = float(sample_rate)
# Get frequency transform of A
ret.Af = ar_spectrum(A, resid_cov, sample_rate, freq_vect)
......@@ -387,6 +401,7 @@ class FourierMvarMetrics(AbstractMVARMetrics):
__all__.append('FourierMvarMetrics')
register_class(FourierMvarMetrics)
def modal_transfer_function(evals, evecl, evecr, nchannels,
......
......@@ -56,7 +56,6 @@ def root_plot(rts, ax=None, figargs=dict(), plotargs=dict()):
head_width=.05, color='k')
# Add poles
print(plotargs)
ax.plot(rts.real, rts.imag, 'k+', **plotargs)
# Labels
......@@ -65,6 +64,7 @@ def root_plot(rts, ax=None, figargs=dict(), plotargs=dict()):
ax.set_xlabel('Real')
ax.set_ylabel('Imaginary')
ax.annotate('Frequency', xy=(.56, 1.02))
ax.set_aspect('equal')
return ax
......@@ -377,7 +377,7 @@ def plot_vector(metric, x_vect, y_vect=None, x_label=None, y_label=None,
def plot_matrix(metric, x_vect, y_vect, x_label=None, y_label=None,
z_vect=None, title=None, labels=None, F=None,
z_vect=None, title=None, labels=None, F=None, vlines=None,
cmap=plt.cm.jet, font_size=8, use_latex=False, diag=True):
"""Function for plotting frequency domain connectivity over many time points
......@@ -404,6 +404,8 @@ def plot_matrix(metric, x_vect, y_vect, x_label=None, y_label=None,
list of node labels for columns and vectors (Default value = None)
F : figurehandle [optional]
handle of existing figure to plot within (Default value = None)
vlines : list
List of x-axis values to plot a dashed vertical line (Default value = None)
cmap : matplotlib colormap [optional]
matplotlib.cm.<colormapname> to use for
colourscale (redundant for plot vector??) (Default value = plt.cm.jet)
......@@ -436,16 +438,25 @@ def plot_matrix(metric, x_vect, y_vect, x_label=None, y_label=None,
# Set up colour scale
if z_vect is None:
cbounds = np.linspace(0, 1)
col = np.linspace(0, 1)
if diag is True:
mx = np.abs(metric).max()
else:
upper_tri = np.tril(np.abs(metric).max(axis=(2, 3)), k=-1).max()
lower_tri = np.triu(np.abs(metric).max(axis=(2, 3)), k=1).max()
mx = np.max([upper_tri, lower_tri])
if metric.min() < 0:
cbounds = np.linspace(-mx, mx)
else:
cbounds = np.linspace(0, mx)
else:
cbounds = z_vect
col = z_vect
# Make figure if we don't have one
if F is None:
F = plt.figure(figsize=(8.3, 5.8))
plt.axis('off')
plt.axis('off')
# Write labels to plot
step = 1. / nbSignals
......@@ -462,6 +473,7 @@ def plot_matrix(metric, x_vect, y_vect, x_label=None, y_label=None,
plt.text(-.08, (step*i)-step/10 - .06,
r'$\mathbf{%s}$' % (labels[-i]),
fontsize=font_size,
verticalalignment='center',
rotation='vertical')
# Make grid with appropriate amount of subplots
......@@ -479,37 +491,25 @@ def plot_matrix(metric, x_vect, y_vect, x_label=None, y_label=None,
grid[g].contourf(x_vect, y_vect, metric[i, j, :, :],
np.linspace(cbounds[0], cbounds[-1], 10),
cmap=cmap)
grid[g].vlines(0, y_vect[0], y_vect[-1])
if vlines is not None:
grid[g].vlines(vlines, y_vect[0], y_vect[-1], linestyles='dashed')
else:
grid[g].set_visible(False)
if j == 0:
grid[g].set_ylabel(y_label)
if i == nbSignals-1:
grid[g].set_xlabel(x_label)
# Set labels
grid.axes_llc.set_ylim(y_vect[0], y_vect[-1])
n_ticks = len(grid[g].xaxis.get_majorticklocs())
grid.axes_llc.set_xticklabels(np.round(np.linspace(x_vect[0], x_vect[-1],
n_ticks), 2))
# grid.axes_llc.set_xticks(x_vect[::4])
# Set colourbar
ax = F.add_axes([.92, .12, .02, .7])
if col[-1] < 2:
# Round to nearest .1
rcol = np.array(.1 * np.round(col/.1))
elif col[-1] < 10:
# Round to nearest 1
rcol = np.array(1 * np.round(col/1)).astype(int)
else:
# Round to nearest 10
rcol = np.array(10 * np.round(col/10)).astype(int)
cb.ColorbarBase(ax, boundaries=cbounds, cmap=cmap, ticks=rcol)
ax = F.add_axes([.92, .12, .02, .7], visible=True)
cb.ColorbarBase(ax, boundaries=cbounds, cmap=cmap)
if title is not None:
plt.suptitle(title)
if x_label is not None:
grid[nbSignals*(nbSignals-1)].set_xlabel(x_label)
if y_label is not None:
grid[nbSignals*(nbSignals-1)].set_ylabel(y_label)
plt.suptitle(title, fontsize=16)
return F
......
......@@ -11,14 +11,7 @@ __all__ = []
class AbstractSigGen():
"""Abstract base class for signal generation"""
refs = []
hdf5_outputs = []
shortdesc = "Undescribed signal generator"
def __init__(self, extraargs=None):
if extraargs is not None:
self.parse_arguments(extraargs)
......@@ -79,12 +72,6 @@ __all__.append('AbstractSigGen')
class AbstractLagSigGen(AbstractSigGen):
"""Generic class for making lagged signals"""
refs = []
hdf5_outputs = []
shortdesc = "MVAR signal generator"
def get_num_sources(self):
""" """
print("Warning! get_num_sources is not implemented")
......@@ -210,12 +197,6 @@ __all__.append('AbstractLagSigGen')
class AbstractMVARSigGen(AbstractSigGen):
"""Generic class for making MVAR signals"""
refs = []
hdf5_outputs = []
shortdesc = "MVAR signal generator"
def get_num_sources(self):
""" """
print("Warning! get_num_sources is not implemented")
......
......@@ -30,6 +30,10 @@ def find_cov_multitrial(a, b, ddof=1):
A covariance matrix between a and b of shape nsignals x nsignals
"""
if a.dtype != b.dtype:
raise TypeError('Data types of input a ({0}) and b ({1}) not matched'.format(a.dtype, b.dtype))
nsignals = a.shape[0]
nsamples = a.shape[1]
ntrials = a.shape[2]
......@@ -38,7 +42,7 @@ def find_cov_multitrial(a, b, ddof=1):
a = a - a.mean(axis=1)[slice(None), np.newaxis]
b = b - b.mean(axis=1)[slice(None), np.newaxis]
cov = np.zeros((nsignals, nsignals, ntrials), dtype=np.complex)
cov = np.zeros((nsignals, nsignals, ntrials), dtype=a.dtype)
for i in range(ntrials):
b_h = b[:, :, i].T.conj()
......@@ -73,6 +77,9 @@ def find_cov(a, b, ddof=1):
"""
if a.dtype != b.dtype:
raise TypeError('Data types of input a ({0}) and b ({1}) not matched'.format(a.dtype, b.dtype))
nsamples = float(a.shape[1])
# Make sure datasets are demeaned before calculating covariance
......@@ -80,7 +87,7 @@ def find_cov(a, b, ddof=1):
b = b - b.mean(axis=1)[slice(None), np.newaxis]
b_h = b.T.conj()
cov = a.astype(np.complex).dot(b_h)
cov = a.astype(a.dtype).dot(b_h)
if ddof is not None:
return cov / (nsamples - ddof)
......
......@@ -2,7 +2,8 @@
# vim: set expandtab ts=4 sw=4:
from os.path import abspath, join, split
import os
from os.path import abspath, expanduser, isdir, isfile, join, split
import numpy as np
......@@ -67,11 +68,48 @@ def model_from_roots(roots):
__all__.append('model_from_roots')
def find_example_path():
"""Returns the path to the example data directory"""
def find_support_path():
"""Returns the path to the support files directory"""
d, f = split(__file__)
return join(abspath(d), 'data')
return join(abspath(d), 'support')
__all__.append('find_support_path')
def find_example_path():
"""Returns the path to the tutorial example data directory (if available)"""
example_path = os.getenv('SAILS_EXAMPLE_DATA', None)
if example_path is None:
# Try the user's home directory
example_path = join(expanduser("~"), "sails-example-data")
# A sentinel file which we ensure is in the git repo for the example data
sentinel = join(example_path, 'SAILS_EXAMPLE_DATA')
# Check for directory and sentinel file
if not isdir(example_path) or not isfile(sentinel):
error = """
Error: Could not find the sails example data (looked in
{})
If you wish to use the example data, download it from git
using:
git clone https://vcs.ynic.york.ac.uk/analysis/sails-example-data
If the git repository is in your home directory, this function
will then find the path. If you wish to store the data in another
location, set the SAILS_EXAMPLE_DATA environment variable to
the location of the git checkout
""".format(example_path)
raise Exception(error)
return example_path
__all__.append('find_example_path')
......