
Commit c85e8b7

EXA rework writing in multiple examples
1 parent 4036b8e commit c85e8b7

9 files changed

Lines changed: 450 additions & 302 deletions

doc/README.md

Lines changed: 17 additions & 0 deletions
@@ -0,0 +1,17 @@
+# Tutorials website
+
+## Requirements
+
+```
+numpydoc
+sphinx
+sphinx_gallery
+```
+
+## Build the website
+
+```bash
+cd doc
+make html
+firefox _build/html/index.html
+```

doc/index.rst

Lines changed: 2 additions & 1 deletion
@@ -2,7 +2,8 @@ Voxelwise modeling tutorials
 ============================
 
 
-Welcome to the voxelwise modeling tutorial from the Gallant lab.
+Welcome to the voxelwise modeling tutorial from the
+`Gallantlab <https://gallantlab.org>`_.
 
 Getting started
 ---------------

tutorials/movies_3T/00_download_vim4.py

Lines changed: 2 additions & 2 deletions
@@ -35,9 +35,9 @@
 ###############################################################################
 # We will only use the first subject in this tutorial, but you can run the same
 # analysis on the four other subjects. Uncomment the lines in `DATAFILES` to
-# download more subjects, or to download the stimuli images.
+# download more subjects.
 #
-# We also skip the stimuli files, since the dataset provides two processed
+# We also skip the stimuli files, since the dataset provides two preprocessed
 # feature spaces to perform voxelwise modeling without requiring the original
 # stimuli.

tutorials/movies_3T/01_plot_explainable_variance.py

Lines changed: 76 additions & 7 deletions
@@ -37,35 +37,60 @@
 ###############################################################################
 # First, we load the fMRI responses on the test set, which contains 10 repeats.
 file_name = os.path.join(directory, 'responses', f'{subject}_responses.hdf')
-Y_test = load_hdf5_array(file_name, "Y_test")
+Y_test = load_hdf5_array(file_name, key="Y_test")
 
 ###############################################################################
 # Then, we compute the explainable variance per voxel.
 # The variance of the signal is estimated by taking the average variance over
 # repeats. The variance of the component shared across repeats is estimated by
 # taking the variance of the average response. Then, we compute the
 # explainable variance by dividing these two quantities.
-# Finally, an correction can be applied to account for small numbers of repeat.
+# Finally, a correction can be applied to account for small numbers of repeats
+# (parameter ``bias_correction``).
 
 from voxelwise.utils import explainable_variance
-
 ev = explainable_variance(Y_test, bias_correction=False)
 
 ###############################################################################
-# Plot the distribution of explainable variance over voxels.
+# We can plot the distribution of explainable variance over voxels.
+
 import matplotlib.pyplot as plt
 
-plt.hist(ev, bins=np.linspace(0, 1, 100), log=True,
-         histtype='step')
+plt.hist(ev, bins=np.linspace(0, 1, 100), log=True, histtype='step')
 plt.xlabel("Explainable variance")
 plt.ylabel("Number of voxels")
 plt.title('Histogram of explainable variance')
 plt.grid('on')
 plt.show()
 
+###############################################################################
+# We see that most voxels have a rather low explainable variance, around 0.1
+# (when not using the bias correction). This is expected, since most voxels are
+# not directly driven by a visual stimulus.
+# We also see that some voxels reach an explainable variance of 0.7, which is
+# quite high. It means that these voxels consistently record the same activity
+# across a repeated stimulus, and thus are good targets for encoding models.
+
 ###############################################################################
 # Map to subject flatmap
 # ----------------------
+#
+# To better understand the distribution of explainable variance, we map the
+# values to the subject's brain. This can be done with
+# `pycortex <https://gallantlab.github.io/pycortex/>`_, which can create
+# interactive 3D viewers displayed in any modern browser.
+# ``Pycortex`` can also display flattened maps of the cortical surface, to
+# visualize the entire cortical surface at once.
+#
+# Here, we do not share the anatomical information of the subjects for privacy
+# reasons. Instead, we provide two mappers: (i) to map the voxels to a
+# subject-specific flatmap, or (ii) to map the voxels to the FreeSurfer average
+# cortical surface ("fsaverage").
+#
+# The first mapper is a sparse CSR matrix that maps each voxel to a set of
+# pixels in a flatmap. To ease its use, we provide here an example function
+# ``plot_flatmap_from_mapper``.
+
 from voxelwise.viz import plot_flatmap_from_mapper
 
 mapper_file = os.path.join(directory, 'mappers', f'{subject}_mappers.hdf')
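[Editor's note] The explainable-variance computation described in this tutorial (average variance over repeats versus variance of the average response) can be illustrated on synthetic data. This is a minimal sketch under an assumed additive-noise model, not the implementation of `voxelwise.utils.explainable_variance`:

```python
import numpy as np

# Synthetic responses of shape (n_repeats, n_timepoints, n_voxels):
# a component shared across repeats, plus independent noise per repeat.
rng = np.random.default_rng(0)
n_repeats, n_times, n_voxels = 10, 200, 4
shared = rng.standard_normal((1, n_times, n_voxels))
Y = shared + rng.standard_normal((n_repeats, n_times, n_voxels))

# Variance of the signal: average over repeats of the per-repeat variance.
var_total = Y.var(axis=1).mean(axis=0)
# Variance of the shared component: variance of the average response.
var_shared = Y.mean(axis=0).var(axis=0)

# Explainable variance: one value per voxel, in [0, 1].
ev = var_shared / var_total
```

With equal signal and noise variance, the averaged response still contains some residual noise, so `ev` lands above the ideal 0.5 here; the ``bias_correction`` parameter mentioned in the diff exists to correct for this kind of residual noise when the number of repeats is small.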
@@ -74,4 +99,48 @@
 
 ###############################################################################
 # We can see that the explainable variance is mainly located in the visual
-# cortex, which was expected since this is a purely visual experiment.
+# cortex, in early regions like V1, V2, V3, or in higher-level regions like
+# EBA, FFA, or IPS. This was expected, since this is a purely visual experiment.
+
+###############################################################################
+# Map to fsaverage
+# ----------------
+#
+# The second mapper we provide maps the voxel data to the FreeSurfer average
+# surface ("fsaverage"), which can be used in ``pycortex``.
+# First, let's download the fsaverage surface if it does not exist.
+
+import cortex
+
+surface = "fsaverage_pycortex"  # ("fsaverage" outside the Gallant lab)
+
+if not hasattr(cortex.db, surface):
+    cortex.utils.download_subject(subject_id=surface)
+
+###############################################################################
+# Then, we load the fsaverage mapper. The mapper is a sparse CSR matrix, which
+# maps each voxel to some vertices on the fsaverage surface.
+# The mapper is applied with a dot product ``@``.
+from voxelwise.io import load_hdf5_sparse_array
+voxel_to_fsaverage = load_hdf5_sparse_array(mapper_file,
+                                            key='voxel_to_fsaverage')
+ev_projected = voxel_to_fsaverage @ ev
+
+###############################################################################
+# We can then create a ``Vertex`` object with the projected data.
+# This object can be used either in a ``pycortex`` interactive 3D viewer, or
+# in a ``matplotlib`` figure directly showing the flatmap.
+
+vertex = cortex.Vertex(ev_projected, surface, vmin=0, vmax=0.7, cmap='inferno')
+
+###############################################################################
+# To start an interactive 3D viewer in the browser, use the following function:
+if False:
+    cortex.webshow(vertex, open_browser=True)
+
+###############################################################################
+# Alternatively, to plot a flatmap in a ``matplotlib`` figure, use the
+# following function:
+
+fig = cortex.quickshow(vertex, colorbar_location='right')
+plt.show()
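[Editor's note] Applying a mapper, as in the diff above, amounts to a sparse matrix-vector product. A toy sketch (the mapper's shape and weights below are made up for illustration; real mappers are loaded from the subject's ``_mappers.hdf`` file):

```python
import numpy as np
from scipy import sparse

# Toy CSR mapper: 4 surface vertices x 3 voxels. Each row holds the weights
# of the voxels contributing to one vertex.
rows = [0, 0, 1, 2, 3]
cols = [0, 1, 1, 2, 2]
weights = [0.5, 0.5, 1.0, 1.0, 1.0]
voxel_to_vertices = sparse.csr_matrix((weights, (rows, cols)), shape=(4, 3))

ev = np.array([0.2, 0.4, 0.6])         # one value per voxel
ev_projected = voxel_to_vertices @ ev  # one value per vertex
# ev_projected -> [0.3, 0.4, 0.6, 0.6]
```

The dot product distributes each voxel's value over the vertices it contributes to, which is why the projected array has one entry per vertex rather than per voxel.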
