Commit 3c7fd88 (parent de692ed)
MNT use markdown in notebooks

8 files changed: 15 additions & 17 deletions
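The change across all eight notebooks is mechanical: each reStructuredText hyperlink of the form `` `text <url>`_ `` is rewritten as a Markdown link `[text](url)`. A conversion of this kind can be sketched with a regular expression; note this is a hypothetical helper for illustration, not the script actually used in this commit:

```python
import re

# rST inline link: `link text <https://example.com>`_ (or `...`__ for
# anonymous links). The character class also matches newlines, so links
# wrapped across lines (as in these notebook sources) are handled too.
RST_LINK = re.compile(r"`([^`<]+?)\s*<([^>]+)>`__?")

def rst_links_to_markdown(text: str) -> str:
    """Rewrite rST hyperlinks as Markdown links: `text <url>`_ -> [text](url)."""
    return RST_LINK.sub(lambda m: f"[{m.group(1)}]({m.group(2)})", text)
```

For example, `rst_links_to_markdown("see \`pymoten <https://github.com/gallantlab/pymoten>\`_.")` yields `"see [pymoten](https://github.com/gallantlab/pymoten)."`. A real conversion pass would also need to handle rST roles and directives (``.. Note::``, citations like ``[1]_``), which this commit leaves unchanged.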

tutorials/notebooks/shortclips/00_download_shortclips.ipynb
1 addition & 1 deletion

@@ -15,7 +15,7 @@
   "cell_type": "markdown",
   "metadata": {},
   "source": [
-   "\n# Download the data set\n\nIn this script, we download the data set from Wasabi or GIN. No account is\nrequired.\n\n## Cite this data set\n\nThis tutorial is based on publicly available data `published on GIN\n<https://gin.g-node.org/gallantlab/shortclips>`_. If you publish any work using\nthis data set, please cite the original publication [1]_, and the data set\n[2]_.\n"
+   "\n# Download the data set\n\nIn this script, we download the data set from Wasabi or GIN. No account is\nrequired.\n\n## Cite this data set\n\nThis tutorial is based on publicly available data [published on GIN](https://gin.g-node.org/gallantlab/shortclips). If you publish any work using\nthis data set, please cite the original publication [1]_, and the data set\n[2]_.\n"
   ]
  },
  {

tutorials/notebooks/shortclips/00_setup_colab.ipynb
1 addition & 1 deletion

@@ -15,7 +15,7 @@
   "cell_type": "markdown",
   "metadata": {},
   "source": [
-   "\n# Setup Google Colab\n\nIn this script, we setup a Google Colab environment. This script will only work\nwhen run from `Google Colab <https://colab.research.google.com/>`_). You can\nskip it if you run the tutorials on your machine.\n"
+   "\n# Setup Google Colab\n\nIn this script, we setup a Google Colab environment. This script will only work\nwhen run from [Google Colab](https://colab.research.google.com/)). You can\nskip it if you run the tutorials on your machine.\n"
   ]
  },
  {

tutorials/notebooks/shortclips/01_plot_explainable_variance.ipynb
2 additions & 2 deletions

@@ -177,7 +177,7 @@
   "cell_type": "markdown",
   "metadata": {},
   "source": [
-   "## Map to subject flatmap\n\nTo better understand the distribution of explainable variance, we map the\nvalues to the subject brain. This can be done with `pycortex\n<https://gallantlab.github.io/pycortex/>`_, which can create interactive 3D\nviewers to be displayed in any modern browser. ``pycortex`` can also display\nflattened maps of the cortical surface to visualize the entire cortical\nsurface at once.\n\nHere, we do not share the anatomical information of the subjects for privacy\nconcerns. Instead, we provide two mappers:\n\n- to map the voxels to a (subject-specific) flatmap\n- to map the voxels to the Freesurfer average cortical surface (\"fsaverage\")\n\nThe first mapper is 2D matrix of shape (n_pixels, n_voxels) that maps each\nvoxel to a set of pixel in a flatmap. The matrix is efficiently stored in a\n``scipy`` sparse CSR matrix. The function ``plot_flatmap_from_mapper``\nprovides an example of how to use the mapper and visualize the flatmap.\n\n"
+   "## Map to subject flatmap\n\nTo better understand the distribution of explainable variance, we map the\nvalues to the subject brain. This can be done with [pycortex](https://gallantlab.github.io/pycortex/), which can create interactive 3D\nviewers to be displayed in any modern browser. ``pycortex`` can also display\nflattened maps of the cortical surface to visualize the entire cortical\nsurface at once.\n\nHere, we do not share the anatomical information of the subjects for privacy\nconcerns. Instead, we provide two mappers:\n\n- to map the voxels to a (subject-specific) flatmap\n- to map the voxels to the Freesurfer average cortical surface (\"fsaverage\")\n\nThe first mapper is 2D matrix of shape (n_pixels, n_voxels) that maps each\nvoxel to a set of pixel in a flatmap. The matrix is efficiently stored in a\n``scipy`` sparse CSR matrix. The function ``plot_flatmap_from_mapper``\nprovides an example of how to use the mapper and visualize the flatmap.\n\n"
   ]
  },
  {
@@ -195,7 +195,7 @@
   "cell_type": "markdown",
   "metadata": {},
   "source": [
-   "This figure is a flattened map of the cortical surface. A number of regions\nof interest (ROIs) have been labeled to ease interpretation. If you have\nnever seen such a flatmap, we recommend taking a look at a `pycortex brain\nviewer <https://www.gallantlab.org/brainviewer/Deniz2019>`_, which displays\nthe brain in 3D. In this viewer, press \"I\" to inflate the brain, \"F\" to\nflatten the surface, and \"R\" to reset the view (or use the ``surface/unfold``\ncursor on the right menu). Press \"H\" for a list of all keyboard shortcuts.\nThis viewer should help you understand the correspondance between the flatten\nand the folded cortical surface of the brain.\n\n"
+   "This figure is a flattened map of the cortical surface. A number of regions\nof interest (ROIs) have been labeled to ease interpretation. If you have\nnever seen such a flatmap, we recommend taking a look at a [pycortex brain\nviewer](https://www.gallantlab.org/brainviewer/Deniz2019), which displays\nthe brain in 3D. In this viewer, press \"I\" to inflate the brain, \"F\" to\nflatten the surface, and \"R\" to reset the view (or use the ``surface/unfold``\ncursor on the right menu). Press \"H\" for a list of all keyboard shortcuts.\nThis viewer should help you understand the correspondance between the flatten\nand the folded cortical surface of the brain.\n\n"
   ]
  },
  {

tutorials/notebooks/shortclips/07_extract_motion_energy.ipynb
1 addition & 1 deletion

@@ -15,7 +15,7 @@
   "cell_type": "markdown",
   "metadata": {},
   "source": [
-   "\n# Extract motion energy features from the stimuli\n\nThis script describes how to extract motion-energy features from the stimuli.\n\n.. Note:: The public data set already contains precomputed motion-energy.\n   Therefore, you do not need to run this script to fit motion-energy models\n   in other part of this tutorial.\n\n*Motion-energy features:* Motion-energy features result from filtering a video\nstimulus with spatio-temporal Gabor filters. A pyramid of filters is used to\ncompute the motion-energy features at multiple spatial and temporal scales.\nMotion-energy features were introduced in [1]_.\n\nThe motion-energy extraction is performed by the package `pymoten\n<https://github.com/gallantlab/pymoten>`_. Check the pymoten `gallery of\nexamples <https://gallantlab.github.io/pymoten/auto_examples/index.html>`_ for\nvisualizing motion-energy filters, and for pymoten API usage examples.\n\n## Running time\nExtracting motion energy is a bit longer than the other examples. It typically\ntakes a couple hours to run.\n"
+   "\n# Extract motion energy features from the stimuli\n\nThis script describes how to extract motion-energy features from the stimuli.\n\n.. Note:: The public data set already contains precomputed motion-energy.\n   Therefore, you do not need to run this script to fit motion-energy models\n   in other part of this tutorial.\n\n*Motion-energy features:* Motion-energy features result from filtering a video\nstimulus with spatio-temporal Gabor filters. A pyramid of filters is used to\ncompute the motion-energy features at multiple spatial and temporal scales.\nMotion-energy features were introduced in [1]_.\n\nThe motion-energy extraction is performed by the package [pymoten](https://github.com/gallantlab/pymoten). Check the pymoten [gallery of\nexamples](https://gallantlab.github.io/pymoten/auto_examples/index.html) for\nvisualizing motion-energy filters, and for pymoten API usage examples.\n\n## Running time\nExtracting motion energy is a bit longer than the other examples. It typically\ntakes a couple hours to run.\n"
   ]
  },
  {

tutorials/notebooks/shortclips/merged_for_colab.ipynb
4 additions & 5 deletions

@@ -27,7 +27,7 @@
  "# Setup Google Colab\n",
  "\n",
  "In this script, we setup a Google Colab environment. This script will only work\n",
- "when run from `Google Colab <https://colab.research.google.com/>`_). You can\n",
+ "when run from [Google Colab](https://colab.research.google.com/)). You can\n",
  "skip it if you run the tutorials on your machine.\n"
  ]
 },
@@ -454,8 +454,7 @@
  "## Map to subject flatmap\n",
  "\n",
  "To better understand the distribution of explainable variance, we map the\n",
- "values to the subject brain. This can be done with `pycortex\n",
- "<https://gallantlab.github.io/pycortex/>`_, which can create interactive 3D\n",
+ "values to the subject brain. This can be done with [pycortex](https://gallantlab.github.io/pycortex/), which can create interactive 3D\n",
  "viewers to be displayed in any modern browser. ``pycortex`` can also display\n",
  "flattened maps of the cortical surface to visualize the entire cortical\n",
  "surface at once.\n",
@@ -494,8 +493,8 @@
  "source": [
  "This figure is a flattened map of the cortical surface. A number of regions\n",
  "of interest (ROIs) have been labeled to ease interpretation. If you have\n",
- "never seen such a flatmap, we recommend taking a look at a `pycortex brain\n",
- "viewer <https://www.gallantlab.org/brainviewer/Deniz2019>`_, which displays\n",
+ "never seen such a flatmap, we recommend taking a look at a [pycortex brain\n",
+ "viewer](https://www.gallantlab.org/brainviewer/Deniz2019), which displays\n",
  "the brain in 3D. In this viewer, press \"I\" to inflate the brain, \"F\" to\n",
  "flatten the surface, and \"R\" to reset the view (or use the ``surface/unfold``\n",
  "cursor on the right menu). Press \"H\" for a list of all keyboard shortcuts.\n",

tutorials/notebooks/shortclips/merged_for_colab_model_fitting.ipynb
4 additions & 5 deletions

@@ -27,7 +27,7 @@
  "# Setup Google Colab\n",
  "\n",
  "In this script, we setup a Google Colab environment. This script will only work\n",
- "when run from `Google Colab <https://colab.research.google.com/>`_). You can\n",
+ "when run from [Google Colab](https://colab.research.google.com/)). You can\n",
  "skip it if you run the tutorials on your machine.\n"
  ]
 },
@@ -454,8 +454,7 @@
  "## Map to subject flatmap\n",
  "\n",
  "To better understand the distribution of explainable variance, we map the\n",
- "values to the subject brain. This can be done with `pycortex\n",
- "<https://gallantlab.github.io/pycortex/>`_, which can create interactive 3D\n",
+ "values to the subject brain. This can be done with [pycortex](https://gallantlab.github.io/pycortex/), which can create interactive 3D\n",
  "viewers to be displayed in any modern browser. ``pycortex`` can also display\n",
  "flattened maps of the cortical surface to visualize the entire cortical\n",
  "surface at once.\n",
@@ -494,8 +493,8 @@
  "source": [
  "This figure is a flattened map of the cortical surface. A number of regions\n",
  "of interest (ROIs) have been labeled to ease interpretation. If you have\n",
- "never seen such a flatmap, we recommend taking a look at a `pycortex brain\n",
- "viewer <https://www.gallantlab.org/brainviewer/Deniz2019>`_, which displays\n",
+ "never seen such a flatmap, we recommend taking a look at a [pycortex brain\n",
+ "viewer](https://www.gallantlab.org/brainviewer/Deniz2019), which displays\n",
  "the brain in 3D. In this viewer, press \"I\" to inflate the brain, \"F\" to\n",
  "flatten the surface, and \"R\" to reset the view (or use the ``surface/unfold``\n",
  "cursor on the right menu). Press \"H\" for a list of all keyboard shortcuts.\n",

tutorials/notebooks/vim2/00_download_vim2.ipynb
1 addition & 1 deletion

@@ -15,7 +15,7 @@
   "cell_type": "markdown",
   "metadata": {},
   "source": [
-   "\n# Download the data set from CRCNS\n\nIn this script, we download the data set from CRCNS.\nA (free) account is required.\n\n## Cite this data set\n\nThis tutorial is based on publicly available data\n`published on CRCNS <https://crcns.org/data-sets/vc/vim-2/about-vim-2>`_.\nIf you publish any work using this data set, please cite the original\npublication [1]_, and the data set [2]_.\n"
+   "\n# Download the data set from CRCNS\n\nIn this script, we download the data set from CRCNS.\nA (free) account is required.\n\n## Cite this data set\n\nThis tutorial is based on publicly available data\n[published on CRCNS](https://crcns.org/data-sets/vc/vim-2/about-vim-2).\nIf you publish any work using this data set, please cite the original\npublication [1]_, and the data set [2]_.\n"
   ]
  },
  {

tutorials/notebooks/vim2/01_extract_motion_energy.ipynb
1 addition & 1 deletion

@@ -15,7 +15,7 @@
   "cell_type": "markdown",
   "metadata": {},
   "source": [
-   "\n# Extract motion energy features from the stimuli\n\nThis script describes how to extract motion-energy features from the stimuli.\n\n*Motion-energy features:* Motion-energy features result from filtering a video\nstimulus with spatio-temporal Gabor filters. A pyramid of filters is used to\ncompute the motion-energy features at multiple spatial and temporal scales.\nMotion-energy features were introduced in [1]_.\n\nThe motion-energy extraction is performed by the package `pymoten\n<https://github.com/gallantlab/pymoten>`_. Check the pymoten `gallery of\nexamples <https://gallantlab.github.io/pymoten/auto_examples/index.html>`_ for\nvisualizing motion-energy filters, and for pymoten API usage examples.\n\n## Running time\nExtracting motion energy is a bit longer than the other examples. It typically\ntakes a couple hours to run.\n"
+   "\n# Extract motion energy features from the stimuli\n\nThis script describes how to extract motion-energy features from the stimuli.\n\n*Motion-energy features:* Motion-energy features result from filtering a video\nstimulus with spatio-temporal Gabor filters. A pyramid of filters is used to\ncompute the motion-energy features at multiple spatial and temporal scales.\nMotion-energy features were introduced in [1]_.\n\nThe motion-energy extraction is performed by the package [pymoten](https://github.com/gallantlab/pymoten). Check the pymoten [gallery of\nexamples](https://gallantlab.github.io/pymoten/auto_examples/index.html) for\nvisualizing motion-energy filters, and for pymoten API usage examples.\n\n## Running time\nExtracting motion energy is a bit longer than the other examples. It typically\ntakes a couple hours to run.\n"
   ]
  },
  {
