
Commit de692ed

Merge branch 'main' of github.com:gallantlab/voxelwise_tutorials into main
* 'main' of github.com:gallantlab/voxelwise_tutorials: FIX ngrok now requires authentication, remove it from tutorial for simplicity (#18)
2 parents 8594144 + e480017 commit de692ed

10 files changed: 36 additions & 176 deletions

tutorials/notebooks/shortclips/00_download_shortclips.ipynb

Lines changed: 1 addition & 1 deletion
@@ -15,7 +15,7 @@
  "cell_type": "markdown",
  "metadata": {},
  "source": [
- "\n# Download the data set\n\nIn this script, we download the data set from Wasabi or GIN. No account is\nrequired.\n\n## Cite this data set\n\nThis tutorial is based on publicly available data [published on GIN](https://gin.g-node.org/gallantlab/shortclips). If you publish any work using\nthis data set, please cite the original publication [1]_, and the data set\n[2]_.\n"
+ "\n# Download the data set\n\nIn this script, we download the data set from Wasabi or GIN. No account is\nrequired.\n\n## Cite this data set\n\nThis tutorial is based on publicly available data `published on GIN\n<https://gin.g-node.org/gallantlab/shortclips>`_. If you publish any work using\nthis data set, please cite the original publication [1]_, and the data set\n[2]_.\n"
  ]
 },
 {

tutorials/notebooks/shortclips/00_setup_colab.ipynb

Lines changed: 3 additions & 3 deletions
@@ -15,7 +15,7 @@
  "cell_type": "markdown",
  "metadata": {},
  "source": [
- "\n# Setup Google Colab\n\nIn this script, we setup a Google Colab environment. This script will only work\nwhen run from [Google Colab](https://colab.research.google.com/)). You can\nskip it if you run the tutorials on your machine.\n"
+ "\n# Setup Google Colab\n\nIn this script, we setup a Google Colab environment. This script will only work\nwhen run from `Google Colab <https://colab.research.google.com/>`_). You can\nskip it if you run the tutorials on your machine.\n"
  ]
 },
 {
@@ -51,7 +51,7 @@
 },
  "outputs": [],
  "source": [
- "#!git config --global user.email \"you@example.com\" && git config --global user.name \"Your Name\"\n#!wget -O- http://neuro.debian.net/lists/impish.us-ca.libre | sudo tee /etc/apt/sources.list.d/neurodebian.sources.list\n#!apt-key adv --recv-keys --keyserver hkps://keyserver.ubuntu.com 0xA5D32F012649A5A9 > /dev/null\n#!apt-get -qq update > /dev/null\n#!apt-get install -qq inkscape git-annex-standalone > /dev/null\n#!pip install -q voxelwise_tutorials\n#![ -f \"ngrok-stable-linux-amd64.zip\" ] || wget -q https://bin.equinox.io/c/4VmDzA7iaHb/ngrok-stable-linux-amd64.zip\n#![ -f \"ngrok\" ] || unzip ngrok-stable-linux-amd64.zip"
+ "#!git config --global user.email \"you@example.com\" && git config --global user.name \"Your Name\"\n#!wget -O- http://neuro.debian.net/lists/impish.us-ca.libre | sudo tee /etc/apt/sources.list.d/neurodebian.sources.list\n#!apt-key adv --recv-keys --keyserver hkps://keyserver.ubuntu.com 0xA5D32F012649A5A9 > /dev/null\n#!apt-get -qq update > /dev/null\n#!apt-get install -qq inkscape git-annex-standalone > /dev/null\n#!pip install -q voxelwise_tutorials"
  ]
 },
 {
@@ -69,7 +69,7 @@
 },
  "outputs": [],
  "source": [
- "# - Set up an email and user name to use git, git-annex, and datalad (required to download the data)\n# - Add NeuroDebian to the package sources\n# - Update the gpg keys to use NeuroDebian\n# - Update the list of available packages\n# - Install Inkscape to use more features from Pycortex, and install git-annex to download the data\n# - Install the tutorial helper package, and all the required dependencies\n# - Download ngrok to create a tunnel for pycortex 3D brain viewer\n# - Extract the ngrok archive"
+ "# - Set up an email and username to use git, git-annex, and datalad (required to download the data)\n# - Add NeuroDebian to the package sources\n# - Update the gpg keys to use NeuroDebian\n# - Update the list of available packages\n# - Install Inkscape to use more features from Pycortex, and install git-annex to download the data\n# - Install the tutorial helper package, and all the required dependencies"
  ]
 },
 {

tutorials/notebooks/shortclips/01_plot_explainable_variance.ipynb

Lines changed: 3 additions & 21 deletions
@@ -177,7 +177,7 @@
  "cell_type": "markdown",
  "metadata": {},
  "source": [
- "## Map to subject flatmap\n\nTo better understand the distribution of explainable variance, we map the\nvalues to the subject brain. This can be done with [pycortex](https://gallantlab.github.io/pycortex/), which can create interactive 3D\nviewers to be displayed in any modern browser. ``pycortex`` can also display\nflattened maps of the cortical surface to visualize the entire cortical\nsurface at once.\n\nHere, we do not share the anatomical information of the subjects for privacy\nconcerns. Instead, we provide two mappers:\n\n- to map the voxels to a (subject-specific) flatmap\n- to map the voxels to the Freesurfer average cortical surface (\"fsaverage\")\n\nThe first mapper is 2D matrix of shape (n_pixels, n_voxels) that maps each\nvoxel to a set of pixel in a flatmap. The matrix is efficiently stored in a\n``scipy`` sparse CSR matrix. The function ``plot_flatmap_from_mapper``\nprovides an example of how to use the mapper and visualize the flatmap.\n\n"
+ "## Map to subject flatmap\n\nTo better understand the distribution of explainable variance, we map the\nvalues to the subject brain. This can be done with `pycortex\n<https://gallantlab.github.io/pycortex/>`_, which can create interactive 3D\nviewers to be displayed in any modern browser. ``pycortex`` can also display\nflattened maps of the cortical surface to visualize the entire cortical\nsurface at once.\n\nHere, we do not share the anatomical information of the subjects for privacy\nconcerns. Instead, we provide two mappers:\n\n- to map the voxels to a (subject-specific) flatmap\n- to map the voxels to the Freesurfer average cortical surface (\"fsaverage\")\n\nThe first mapper is 2D matrix of shape (n_pixels, n_voxels) that maps each\nvoxel to a set of pixel in a flatmap. The matrix is efficiently stored in a\n``scipy`` sparse CSR matrix. The function ``plot_flatmap_from_mapper``\nprovides an example of how to use the mapper and visualize the flatmap.\n\n"
  ]
 },
 {
@@ -195,7 +195,7 @@
  "cell_type": "markdown",
  "metadata": {},
  "source": [
- "This figure is a flattened map of the cortical surface. A number of regions\nof interest (ROIs) have been labeled to ease interpretation. If you have\nnever seen such a flatmap, we recommend taking a look at a [pycortex brain\nviewer](https://www.gallantlab.org/brainviewer/Deniz2019), which displays\nthe brain in 3D. In this viewer, press \"I\" to inflate the brain, \"F\" to\nflatten the surface, and \"R\" to reset the view (or use the ``surface/unfold``\ncursor on the right menu). Press \"H\" for a list of all keyboard shortcuts.\nThis viewer should help you understand the correspondance between the flatten\nand the folded cortical surface of the brain.\n\n"
+ "This figure is a flattened map of the cortical surface. A number of regions\nof interest (ROIs) have been labeled to ease interpretation. If you have\nnever seen such a flatmap, we recommend taking a look at a `pycortex brain\nviewer <https://www.gallantlab.org/brainviewer/Deniz2019>`_, which displays\nthe brain in 3D. In this viewer, press \"I\" to inflate the brain, \"F\" to\nflatten the surface, and \"R\" to reset the view (or use the ``surface/unfold``\ncursor on the right menu). Press \"H\" for a list of all keyboard shortcuts.\nThis viewer should help you understand the correspondance between the flatten\nand the folded cortical surface of the brain.\n\n"
  ]
 },
 {
@@ -263,25 +263,7 @@
  "cell_type": "markdown",
  "metadata": {},
  "source": [
- "To start an interactive 3D viewer in the browser, we can use the ``webshow``\nfunction in pycortex. If you are running the notebook on Colab, you first\nneed to tunnel the pycortex application out of Colab. To do so, use the\nfollowing cell to start a tunnel with ``ngrok`` and to get an address where\nthe pycortex viewer will be made accessible.\n\n"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "metadata": {
- "collapsed": false
- },
- "outputs": [],
- "source": [
- "try:\n import google.colab # noqa\n in_colab = True\nexcept ImportError:\n in_colab = False\nprint(in_colab)\n\nif in_colab:\n from IPython import get_ipython\n get_ipython().system_raw('./ngrok http 8050 &')\n plt.pause(1)\n\n command = \"\"\"\n curl -s http://localhost:4040/api/tunnels | python3 -c \\\n \"import sys, json; print(json.load(sys.stdin)['tunnels'][0]['public_url'])\"\n \"\"\"\n result = get_ipython().getoutput(command, split=True)\n print(\"Use the following address to connect to the brain viewer:\\n\"\n f\"{result}\\n\"\n \"and not the one proposed by pycortex ('Open viewer: ...')\\n\")"
- ]
- },
- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "Now you can start an interactive 3D viewer by changing ``run_webshow`` to\n``True`` and running the following cell. If you are using Colab, remember to\nuse the address returned by ngrok in the cell above rather than the address\nreturned by this cell.\n\n"
+ "To start an interactive 3D viewer in the browser, we can use the ``webshow``\nfunction in pycortex. (Note that this method works only if you are running the\nnotebooks locally.) You can start an interactive 3D viewer by changing\n``run_webshow`` to ``True`` and running the following cell.\n\n"
  ]
 },
 {
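For context, the changed cell above describes the flatmap mapper as a sparse CSR matrix of shape (n_pixels, n_voxels). The snippet below is a minimal sketch of that idea using only numpy, scipy, and matplotlib; the toy mapper, its shape, and its values are made up for illustration and do not reproduce the tutorial's `plot_flatmap_from_mapper` helper.

```python
# Illustrative sketch only: a toy (n_pixels, n_voxels) sparse mapper, as
# described in the notebook text above. The real mappers are shipped with the
# tutorial data set; all shapes and values here are made up.
import numpy as np
from scipy import sparse
import matplotlib.pyplot as plt

n_voxels = 1000
height, width = 120, 200          # flatmap image size (arbitrary)
n_pixels = height * width

# Build a random sparse CSR matrix mapping each voxel to a few pixels.
rng = np.random.default_rng(0)
rows = rng.integers(0, n_pixels, size=5 * n_voxels)
cols = np.repeat(np.arange(n_voxels), 5)
weights = rng.random(5 * n_voxels)
mapper = sparse.csr_matrix((weights, (rows, cols)), shape=(n_pixels, n_voxels))

# Project a vector of voxel values (e.g. explainable variance) to pixels.
voxel_values = rng.random(n_voxels)
pixel_values = mapper @ voxel_values            # shape (n_pixels,)
flatmap = pixel_values.reshape(height, width)   # 2D image of the flatmap

plt.imshow(flatmap, cmap="viridis")
plt.axis("off")
plt.show()
```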

tutorials/notebooks/shortclips/07_extract_motion_energy.ipynb

Lines changed: 1 addition & 1 deletion
@@ -15,7 +15,7 @@
  "cell_type": "markdown",
  "metadata": {},
  "source": [
- "\n# Extract motion energy features from the stimuli\n\nThis script describes how to extract motion-energy features from the stimuli.\n\n.. Note:: The public data set already contains precomputed motion-energy.\n Therefore, you do not need to run this script to fit motion-energy models\n in other part of this tutorial.\n\n*Motion-energy features:* Motion-energy features result from filtering a video\nstimulus with spatio-temporal Gabor filters. A pyramid of filters is used to\ncompute the motion-energy features at multiple spatial and temporal scales.\nMotion-energy features were introduced in [1]_.\n\nThe motion-energy extraction is performed by the package [pymoten](https://github.com/gallantlab/pymoten). Check the pymoten [gallery of\nexamples](https://gallantlab.github.io/pymoten/auto_examples/index.html) for\nvisualizing motion-energy filters, and for pymoten API usage examples.\n\n## Running time\nExtracting motion energy is a bit longer than the other examples. It typically\ntakes a couple hours to run.\n"
+ "\n# Extract motion energy features from the stimuli\n\nThis script describes how to extract motion-energy features from the stimuli.\n\n.. Note:: The public data set already contains precomputed motion-energy.\n Therefore, you do not need to run this script to fit motion-energy models\n in other part of this tutorial.\n\n*Motion-energy features:* Motion-energy features result from filtering a video\nstimulus with spatio-temporal Gabor filters. A pyramid of filters is used to\ncompute the motion-energy features at multiple spatial and temporal scales.\nMotion-energy features were introduced in [1]_.\n\nThe motion-energy extraction is performed by the package `pymoten\n<https://github.com/gallantlab/pymoten>`_. Check the pymoten `gallery of\nexamples <https://gallantlab.github.io/pymoten/auto_examples/index.html>`_ for\nvisualizing motion-energy filters, and for pymoten API usage examples.\n\n## Running time\nExtracting motion energy is a bit longer than the other examples. It typically\ntakes a couple hours to run.\n"
  ]
 },
 {
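For context, the notebook above delegates the actual extraction to pymoten. The snippet below is only a toy numpy illustration of the underlying idea (a quadrature pair of spatio-temporal Gabor filters whose squared responses are summed into an energy); the fake video, the filter parameters, and the use of a single filter are arbitrary choices, and this is not the pymoten implementation, which applies a full pyramid of filters frame by frame.

```python
# Toy illustration of motion energy: project a video onto a quadrature pair of
# spatio-temporal Gabor filters and sum the squared responses. Not pymoten;
# all parameters below are arbitrary, and a real pyramid would use many
# filters at multiple scales, directions, and time points.
import numpy as np

rng = np.random.default_rng(0)
n_frames, height, width = 32, 24, 24
video = rng.random((n_frames, height, width))   # fake luminance video

# One spatio-temporal Gabor quadrature pair (single scale and direction).
t = np.arange(n_frames)[:, None, None]
y = np.arange(height)[None, :, None]
x = np.arange(width)[None, None, :]
carrier = 2 * np.pi * (0.1 * x + 0.05 * t)       # drifting grating phase
envelope = np.exp(
    -((x - width / 2) ** 2 + (y - height / 2) ** 2) / (2 * 6.0 ** 2)
    - (t - n_frames / 2) ** 2 / (2 * 8.0 ** 2))
gabor_even = envelope * np.cos(carrier)
gabor_odd = envelope * np.sin(carrier)

# Motion energy = sum of squared responses of the quadrature pair.
even_response = np.sum(video * gabor_even)
odd_response = np.sum(video * gabor_odd)
energy = even_response ** 2 + odd_response ** 2
print(energy)
```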

tutorials/notebooks/shortclips/merged_for_colab.ipynb

Lines changed: 11 additions & 56 deletions
@@ -27,7 +27,7 @@
  "# Setup Google Colab\n",
  "\n",
  "In this script, we setup a Google Colab environment. This script will only work\n",
- "when run from [Google Colab](https://colab.research.google.com/)). You can\n",
+ "when run from `Google Colab <https://colab.research.google.com/>`_). You can\n",
  "skip it if you run the tutorials on your machine.\n"
  ]
 },
@@ -77,9 +77,7 @@
  "#!apt-key adv --recv-keys --keyserver hkps://keyserver.ubuntu.com 0xA5D32F012649A5A9 > /dev/null\n",
  "#!apt-get -qq update > /dev/null\n",
  "#!apt-get install -qq inkscape git-annex-standalone > /dev/null\n",
- "#!pip install -q voxelwise_tutorials\n",
- "#![ -f \"ngrok-stable-linux-amd64.zip\" ] || wget -q https://bin.equinox.io/c/4VmDzA7iaHb/ngrok-stable-linux-amd64.zip\n",
- "#![ -f \"ngrok\" ] || unzip ngrok-stable-linux-amd64.zip"
+ "#!pip install -q voxelwise_tutorials"
  ]
 },
 {
@@ -98,14 +96,12 @@
 },
  "outputs": [],
  "source": [
- "# - Set up an email and user name to use git, git-annex, and datalad (required to download the data)\n",
+ "# - Set up an email and username to use git, git-annex, and datalad (required to download the data)\n",
  "# - Add NeuroDebian to the package sources\n",
  "# - Update the gpg keys to use NeuroDebian\n",
  "# - Update the list of available packages\n",
  "# - Install Inkscape to use more features from Pycortex, and install git-annex to download the data\n",
- "# - Install the tutorial helper package, and all the required dependencies\n",
- "# - Download ngrok to create a tunnel for pycortex 3D brain viewer\n",
- "# - Extract the ngrok archive"
+ "# - Install the tutorial helper package, and all the required dependencies"
  ]
 },
 {
@@ -458,7 +454,8 @@
  "## Map to subject flatmap\n",
  "\n",
  "To better understand the distribution of explainable variance, we map the\n",
- "values to the subject brain. This can be done with [pycortex](https://gallantlab.github.io/pycortex/), which can create interactive 3D\n",
+ "values to the subject brain. This can be done with `pycortex\n",
+ "<https://gallantlab.github.io/pycortex/>`_, which can create interactive 3D\n",
  "viewers to be displayed in any modern browser. ``pycortex`` can also display\n",
  "flattened maps of the cortical surface to visualize the entire cortical\n",
  "surface at once.\n",
@@ -497,8 +494,8 @@
  "source": [
  "This figure is a flattened map of the cortical surface. A number of regions\n",
  "of interest (ROIs) have been labeled to ease interpretation. If you have\n",
- "never seen such a flatmap, we recommend taking a look at a [pycortex brain\n",
- "viewer](https://www.gallantlab.org/brainviewer/Deniz2019), which displays\n",
+ "never seen such a flatmap, we recommend taking a look at a `pycortex brain\n",
+ "viewer <https://www.gallantlab.org/brainviewer/Deniz2019>`_, which displays\n",
  "the brain in 3D. In this viewer, press \"I\" to inflate the brain, \"F\" to\n",
  "flatten the surface, and \"R\" to reset the view (or use the ``surface/unfold``\n",
  "cursor on the right menu). Press \"H\" for a list of all keyboard shortcuts.\n",
@@ -603,51 +600,9 @@
  "metadata": {},
  "source": [
  "To start an interactive 3D viewer in the browser, we can use the ``webshow``\n",
- "function in pycortex. If you are running the notebook on Colab, you first\n",
- "need to tunnel the pycortex application out of Colab. To do so, use the\n",
- "following cell to start a tunnel with ``ngrok`` and to get an address where\n",
- "the pycortex viewer will be made accessible.\n",
- "\n"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "metadata": {
- "collapsed": false
- },
- "outputs": [],
- "source": [
- "try:\n",
- " import google.colab # noqa\n",
- " in_colab = True\n",
- "except ImportError:\n",
- " in_colab = False\n",
- "print(in_colab)\n",
- "\n",
- "if in_colab:\n",
- " from IPython import get_ipython\n",
- " get_ipython().system_raw('./ngrok http 8050 &')\n",
- " plt.pause(1)\n",
- "\n",
- " command = \"\"\"\n",
- " curl -s http://localhost:4040/api/tunnels | python3 -c \\\n",
- " \"import sys, json; print(json.load(sys.stdin)['tunnels'][0]['public_url'])\"\n",
- " \"\"\"\n",
- " result = get_ipython().getoutput(command, split=True)\n",
- " print(\"Use the following address to connect to the brain viewer:\\n\"\n",
- " f\"{result}\\n\"\n",
- " \"and not the one proposed by pycortex ('Open viewer: ...')\\n\")"
- ]
- },
- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "Now you can start an interactive 3D viewer by changing ``run_webshow`` to\n",
- "``True`` and running the following cell. If you are using Colab, remember to\n",
- "use the address returned by ngrok in the cell above rather than the address\n",
- "returned by this cell.\n",
+ "function in pycortex. (Note that this method works only if you are running the\n",
+ "notebooks locally.) You can start an interactive 3D viewer by changing\n",
+ "``run_webshow`` to ``True`` and running the following cell.\n",
  "\n"
  ]
 },
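For context, after this change the pycortex 3D viewer is only launched locally. A rough, hypothetical sketch of such a `run_webshow` cell is shown below; the subject name, transform name, data, and port are placeholders, and the tutorial's actual cell may build the volume differently (for example via the mappers shipped with the data set).

```python
# Hedged sketch of starting the pycortex 3D viewer locally (the tutorial's
# actual cell may differ). "my_subject" and "my_transform" are placeholders
# for a subject/transform present in your local pycortex database.
import numpy as np
import cortex

run_webshow = False  # set to True on your own machine to open the viewer

if run_webshow:
    volume_shape = (31, 100, 100)                  # placeholder volume shape
    values = np.random.rand(*volume_shape)         # e.g. explainable variance
    volume = cortex.Volume(values, "my_subject", "my_transform",
                           vmin=0, vmax=1, cmap="viridis")
    # Starts a local web server and opens the interactive viewer in a browser.
    cortex.webgl.show(data={"explainable_variance": volume}, port=8050,
                      open_browser=True)
```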
