
Commit 4092fe4 (parent 574947e)

Commit message: fixing links to point back to master

1 file changed

Lines changed: 5 additions & 5 deletions

File tree

lab1/solutions/PT_Part1_Intro_Solution.ipynb

@@ -10,9 +10,9 @@
 " <td align=\"center\"><a target=\"_blank\" href=\"http://introtodeeplearning.com\">\n",
 " <img src=\"https://i.ibb.co/Jr88sn2/mit.png\" style=\"padding-bottom:5px;\" />\n",
 " Visit MIT Deep Learning</a></td>\n",
-" <td align=\"center\"><a target=\"_blank\" href=\"https://colab.research.google.com/github/aamini/introtodeeplearning/blob/2025/lab1/solutions/PT_Part1_Intro_Solution.ipynb\">\n",
+" <td align=\"center\"><a target=\"_blank\" href=\"https://colab.research.google.com/github/aamini/introtodeeplearning/blob/master/lab1/solutions/PT_Part1_Intro_Solution.ipynb\">\n",
 " <img src=\"https://i.ibb.co/2P3SLwK/colab.png\" style=\"padding-bottom:5px;\" />Run in Google Colab</a></td>\n",
-" <td align=\"center\"><a target=\"_blank\" href=\"https://github.com/aamini/introtodeeplearning/blob/2025/lab1/solutions/PT_Part1_Intro_Solution.ipynb\">\n",
+" <td align=\"center\"><a target=\"_blank\" href=\"https://github.com/aamini/introtodeeplearning/blob/master/lab1/solutions/PT_Part1_Intro_Solution.ipynb\">\n",
 " <img src=\"https://i.ibb.co/xfJbPmL/github.png\" height=\"70px\" style=\"padding-bottom:5px;\" />View Source on GitHub</a></td>\n",
 "</table>\n",
 "\n",
@@ -205,7 +205,7 @@
 "\n",
 "A convenient way to think about and visualize computations in a machine learning framework like PyTorch is in terms of graphs. We can define this graph in terms of tensors, which hold data, and the mathematical operations that act on these tensors in some order. Let's look at a simple example, and define this computation using PyTorch:\n",
 "\n",
-"![alt text](https://raw.githubusercontent.com/aamini/introtodeeplearning/2025/lab1/img/add-graph.png)"
+"![alt text](https://raw.githubusercontent.com/aamini/introtodeeplearning/master/lab1/img/add-graph.png)"
 ]
 },
 {
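For context, the addition graph referenced in the hunk above can be sketched in PyTorch roughly as follows; the tensor values are illustrative assumptions, not taken from the notebook:

```python
import torch

# Two constant input tensors (values chosen for illustration)
a = torch.tensor(15)
b = torch.tensor(61)

# A single "add" node acting on the two input tensors
c = torch.add(a, b)
print(c)  # tensor(76)
```

Because PyTorch builds its computation graph dynamically as operations execute (define-by-run), the result `c` is available immediately after the call.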
@@ -237,7 +237,7 @@
 "\n",
 "Now let's consider a slightly more complicated example:\n",
 "\n",
-"![alt text](https://raw.githubusercontent.com/aamini/introtodeeplearning/2025/lab1/img/computation-graph.png)\n",
+"![alt text](https://raw.githubusercontent.com/aamini/introtodeeplearning/master/lab1/img/computation-graph.png)\n",
 "\n",
 "Here, we take two inputs, `a, b`, and compute an output `e`. Each node in the graph represents an operation that takes some input, does some computation, and passes its output to another node.\n",
 "\n",
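One plausible instantiation of a two-input graph computing an output `e` from inputs `a, b`, as described in the hunk above, is sketched below; the specific intermediate operations are assumptions for illustration and not necessarily those in the notebook's figure:

```python
import torch

def compute_graph(a, b):
    # The two inputs flow through intermediate nodes c and d
    # before a final node produces the output e.
    c = torch.add(a, b)       # node 1: addition
    d = torch.subtract(b, 1)  # node 2: subtraction
    e = torch.multiply(c, d)  # node 3: multiplication
    return e

e = compute_graph(torch.tensor(2.0), torch.tensor(3.0))
print(e)  # tensor(10.)
```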
@@ -311,7 +311,7 @@
 "\n",
 "Let's consider the example of a simple perceptron defined by just one dense (aka fully-connected or linear) layer: $ y = \\sigma(Wx + b) $, where $W$ represents a matrix of weights, $b$ is a bias, $x$ is the input, $\\sigma$ is the sigmoid activation function, and $y$ is the output.\n",
 "\n",
-"![alt text](https://raw.githubusercontent.com/aamini/introtodeeplearning/2025/lab1/img/computation-graph-2.png)\n",
+"![alt text](https://raw.githubusercontent.com/aamini/introtodeeplearning/master/lab1/img/computation-graph-2.png)\n",
 "\n",
 "We will use `torch.nn.Module` to define layers -- the building blocks of neural networks. Layers implement common neural networks operations. In PyTorch, when we implement a layer, we subclass `nn.Module` and define the parameters of the layer as attributes of our new class. We also define and override a function [``forward``](https://pytorch.org/docs/stable/generated/torch.nn.Module.html#torch.nn.Module.forward), which will define the forward pass computation that is performed at every step. All classes subclassing `nn.Module` should override the `forward` function.\n",
 "\n",
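The dense layer $y = \sigma(Wx + b)$ and the `nn.Module` pattern described in this hunk can be sketched as a minimal subclass; the class name, initialization scheme, and dimensions below are assumptions for illustration, not the notebook's own code:

```python
import torch
import torch.nn as nn

class SimpleDense(nn.Module):
    """Hypothetical dense layer computing y = sigmoid(W x + b)."""
    def __init__(self, num_inputs, num_outputs):
        super().__init__()
        # Weight matrix W and bias b, registered as learnable
        # parameters by wrapping them in nn.Parameter.
        self.W = nn.Parameter(torch.randn(num_inputs, num_outputs))
        self.b = nn.Parameter(torch.zeros(num_outputs))

    def forward(self, x):
        # Forward pass: linear transform then sigmoid activation.
        # x is a batch of row vectors, so we compute x @ W + b.
        return torch.sigmoid(torch.matmul(x, self.W) + self.b)

layer = SimpleDense(2, 3)
y = layer(torch.ones(1, 2))  # shape (1, 3), entries in (0, 1)
```

Defining the parameters as attributes in `__init__` and the computation in `forward` is exactly the subclassing pattern the context lines describe.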
