
Commit bc451ad

Merge remote branch '2025' with local branch
2 parents: 5db8e93 + f0ec8c6

3 files changed

Lines changed: 10 additions & 10 deletions


lab1/solutions/PT_Part1_Intro_Solution.ipynb

Lines changed: 4 additions & 4 deletions
@@ -205,7 +205,7 @@
 "\n",
 "A convenient way to think about and visualize computations in a machine learning framework like PyTorch is in terms of graphs. We can define this graph in terms of tensors, which hold data, and the mathematical operations that act on these tensors in some order. Let's look at a simple example, and define this computation using PyTorch:\n",
 "\n",
-"![alt text](https://raw.githubusercontent.com/aamini/introtodeeplearning/master/lab1/img/add-graph.png)"
+"![alt text](https://raw.githubusercontent.com/aamini/introtodeeplearning/2025/lab1/img/add-graph.png)"
 ]
 },
 {
@@ -237,7 +237,7 @@
 "\n",
 "Now let's consider a slightly more complicated example:\n",
 "\n",
-"![alt text](https://raw.githubusercontent.com/aamini/introtodeeplearning/master/lab1/img/computation-graph.png)\n",
+"![alt text](https://raw.githubusercontent.com/aamini/introtodeeplearning/2025/lab1/img/computation-graph.png)\n",
 "\n",
 "Here, we take two inputs, `a, b`, and compute an output `e`. Each node in the graph represents an operation that takes some input, does some computation, and passes its output to another node.\n",
 "\n",
@@ -311,7 +311,7 @@
 "\n",
 "Let's consider the example of a simple perceptron defined by just one dense (aka fully-connected or linear) layer: $ y = \\sigma(Wx + b) $, where $W$ represents a matrix of weights, $b$ is a bias, $x$ is the input, $\\sigma$ is the sigmoid activation function, and $y$ is the output.\n",
 "\n",
-"![alt text](https://raw.githubusercontent.com/aamini/introtodeeplearning/master/lab1/img/computation-graph-2.png)\n",
+"![alt text](https://raw.githubusercontent.com/aamini/introtodeeplearning/2025/lab1/img/computation-graph-2.png)\n",
 "\n",
 "We will use `torch.nn.Module` to define layers -- the building blocks of neural networks. Layers implement common neural networks operations. In PyTorch, when we implement a layer, we subclass `nn.Module` and define the parameters of the layer as attributes of our new class. We also define and override a function [``forward``](https://pytorch.org/docs/stable/generated/torch.nn.Module.html#torch.nn.Module.forward), which will define the forward pass computation that is performed at every step. All classes subclassing `nn.Module` should override the `forward` function.\n",
 "\n",
@@ -721,4 +721,4 @@
 },
 "nbformat": 4,
 "nbformat_minor": 0
-}
+}
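The diff context above describes the notebook's pattern of subclassing `nn.Module`, storing parameters as attributes, and overriding `forward`. As an illustrative sketch of that pattern for the perceptron $y = \sigma(Wx + b)$ mentioned in the context lines (the class and variable names below are hypothetical, not taken from the notebook):

```python
import torch
import torch.nn as nn

class DensePerceptron(nn.Module):
    """A single dense layer computing y = sigmoid(x W + b)."""
    def __init__(self, num_inputs: int, num_outputs: int):
        super().__init__()
        # nn.Parameter registers W and b so .parameters() and optimizers see them.
        self.W = nn.Parameter(torch.randn(num_inputs, num_outputs))
        self.b = nn.Parameter(torch.zeros(num_outputs))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Forward pass: affine transform followed by sigmoid activation.
        return torch.sigmoid(torch.matmul(x, self.W) + self.b)

layer = DensePerceptron(num_inputs=2, num_outputs=3)
y = layer(torch.tensor([[1.0, 2.0]]))
print(tuple(y.shape))  # (1, 3)
```

Calling `layer(x)` routes through `nn.Module.__call__`, which invokes `forward` — which is why, as the context notes, subclasses override `forward` rather than `__call__`.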

lab1/solutions/TF_Part1_Intro_Solution.ipynb

Lines changed: 4 additions & 4 deletions
@@ -210,7 +210,7 @@
 "\n",
 "A convenient way to think about and visualize computations in TensorFlow is in terms of graphs. We can define this graph in terms of Tensors, which hold data, and the mathematical operations that act on these Tensors in some order. Let's look at a simple example, and define this computation using TensorFlow:\n",
 "\n",
-"![alt text](https://raw.githubusercontent.com/aamini/introtodeeplearning/master/lab1/img/add-graph.png)"
+"![alt text](https://raw.githubusercontent.com/aamini/introtodeeplearning/2025/lab1/img/add-graph.png)"
 ]
 },
 {
@@ -242,7 +242,7 @@
 "\n",
 "Now let's consider a slightly more complicated example:\n",
 "\n",
-"![alt text](https://raw.githubusercontent.com/aamini/introtodeeplearning/master/lab1/img/computation-graph.png)\n",
+"![alt text](https://raw.githubusercontent.com/aamini/introtodeeplearning/2025/lab1/img/computation-graph.png)\n",
 "\n",
 "Here, we take two inputs, `a, b`, and compute an output `e`. Each node in the graph represents an operation that takes some input, does some computation, and passes its output to another node.\n",
 "\n",
@@ -316,7 +316,7 @@
 "\n",
 "Let's first consider the example of a simple perceptron defined by just one dense layer: $ y = \\sigma(Wx + b)$, where $W$ represents a matrix of weights, $b$ is a bias, $x$ is the input, $\\sigma$ is the sigmoid activation function, and $y$ is the output. We can also visualize this operation using a graph:\n",
 "\n",
-"![alt text](https://raw.githubusercontent.com/aamini/introtodeeplearning/master/lab1/img/computation-graph-2.png)\n",
+"![alt text](https://raw.githubusercontent.com/aamini/introtodeeplearning/2025/lab1/img/computation-graph-2.png)\n",
 "\n",
 "Tensors can flow through abstract types called [```Layers```](https://www.tensorflow.org/api_docs/python/tf/keras/layers/Layer) -- the building blocks of neural networks. ```Layers``` implement common neural networks operations, and are used to update weights, compute losses, and define inter-layer connectivity. We will first define a ```Layer``` to implement the simple perceptron defined above."
 ]
@@ -711,4 +711,4 @@
 },
 "nbformat": 4,
 "nbformat_minor": 0
-}
+}
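The TensorFlow diff context above describes `Layer` subclasses implementing the same perceptron. A minimal sketch of such a subclass, assuming TensorFlow 2.x / Keras (the class and variable names are illustrative, not the notebook's):

```python
import tensorflow as tf

class SimpleDenseLayer(tf.keras.layers.Layer):
    """A single dense layer computing y = sigmoid(x W + b)."""
    def __init__(self, n_output_nodes: int):
        super().__init__()
        self.n_output_nodes = n_output_nodes

    def build(self, input_shape):
        # Weights are created lazily in build(), once the input size is known.
        d = int(input_shape[-1])
        self.W = self.add_weight(name="weight", shape=[d, self.n_output_nodes])
        self.b = self.add_weight(name="bias", shape=[1, self.n_output_nodes])

    def call(self, x):
        # Forward pass: affine transform followed by sigmoid activation.
        return tf.math.sigmoid(tf.matmul(x, self.W) + self.b)

layer = SimpleDenseLayer(3)
y = layer(tf.constant([[1.0, 2.0]]))
print(tuple(y.shape))  # (1, 3)
```

Deferring weight creation to `build` is the Keras convention: the layer can be constructed before its input dimensionality is known, and the framework calls `build` on first use.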

lab1/solutions/TF_Part2_Music_Generation_Solution.ipynb

Lines changed: 2 additions & 2 deletions
@@ -10,9 +10,9 @@
 " <td align=\"center\"><a target=\"_blank\" href=\"http://introtodeeplearning.com\">\n",
 " <img src=\"https://i.ibb.co/Jr88sn2/mit.png\" style=\"padding-bottom:5px;\" />\n",
 " Visit MIT Deep Learning</a></td>\n",
-" <td align=\"center\"><a target=\"_blank\" href=\"https://colab.research.google.com/github/aamini/introtodeeplearning/blob/2025/lab1/solutions/TF_Part2_Music_Generation_Solution.ipynb\">\n",
+" <td align=\"center\"><a target=\"_blank\" href=\"https://colab.research.google.com/github/aamini/introtodeeplearning/blob/master/lab1/solutions/TF_Part2_Music_Generation_Solution.ipynb\">\n",
 " <img src=\"https://i.ibb.co/2P3SLwK/colab.png\" style=\"padding-bottom:5px;\" />Run in Google Colab</a></td>\n",
-" <td align=\"center\"><a target=\"_blank\" href=\"https://github.com/aamini/introtodeeplearning/blob/2025/lab1/solutions/TF_Part2_Music_Generation_Solution.ipynb\">\n",
+" <td align=\"center\"><a target=\"_blank\" href=\"https://github.com/aamini/introtodeeplearning/blob/master/lab1/solutions/TF_Part2_Music_Generation_Solution.ipynb\">\n",
 " <img src=\"https://i.ibb.co/xfJbPmL/github.png\" height=\"70px\" style=\"padding-bottom:5px;\" />View Source on GitHub</a></td>\n",
 "</table>\n",
 "\n",
