Commit 574947e

committed: fixing links to point back to master

1 parent 4da01b4 · commit 574947e

1 file changed: 5 additions & 5 deletions

File tree

lab1/solutions/TF_Part1_Intro_Solution.ipynb

@@ -10,9 +10,9 @@
 " <td align=\"center\"><a target=\"_blank\" href=\"http://introtodeeplearning.com\">\n",
 " <img src=\"https://i.ibb.co/Jr88sn2/mit.png\" style=\"padding-bottom:5px;\" />\n",
 " Visit MIT Deep Learning</a></td>\n",
-" <td align=\"center\"><a target=\"_blank\" href=\"https://colab.research.google.com/github/aamini/introtodeeplearning/blob/2025/lab1/solutions/TF_Part1_Intro_Solution.ipynb\">\n",
+" <td align=\"center\"><a target=\"_blank\" href=\"https://colab.research.google.com/github/aamini/introtodeeplearning/blob/master/lab1/solutions/TF_Part1_Intro_Solution.ipynb\">\n",
 " <img src=\"https://i.ibb.co/2P3SLwK/colab.png\" style=\"padding-bottom:5px;\" />Run in Google Colab</a></td>\n",
-" <td align=\"center\"><a target=\"_blank\" href=\"https://github.com/aamini/introtodeeplearning/blob/2025/lab1/solutions/TF_Part1_Intro_Solution.ipynb\">\n",
+" <td align=\"center\"><a target=\"_blank\" href=\"https://github.com/aamini/introtodeeplearning/blob/master/lab1/solutions/TF_Part1_Intro_Solution.ipynb\">\n",
 " <img src=\"https://i.ibb.co/xfJbPmL/github.png\" height=\"70px\" style=\"padding-bottom:5px;\" />View Source on GitHub</a></td>\n",
 "</table>\n",
 "\n",
@@ -210,7 +210,7 @@
 "\n",
 "A convenient way to think about and visualize computations in TensorFlow is in terms of graphs. We can define this graph in terms of Tensors, which hold data, and the mathematical operations that act on these Tensors in some order. Let's look at a simple example, and define this computation using TensorFlow:\n",
 "\n",
-"![alt text](https://raw.githubusercontent.com/aamini/introtodeeplearning/2025/lab1/img/add-graph.png)"
+"![alt text](https://raw.githubusercontent.com/aamini/introtodeeplearning/master/lab1/img/add-graph.png)"
 ]
 },
 {
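The notebook text above describes defining a simple addition as a graph with a single "add" node, pictured in add-graph.png. As a hedged sketch (the lab itself uses `tf.constant` and `tf.add`; NumPy is substituted here so the snippet runs without TensorFlow, and the input values are arbitrary):

```python
import numpy as np

# Two constant input nodes feeding one "add" node, mirroring the
# add-graph figure. Values are illustrative, not from the lab.
a = np.array(15)
b = np.array(61)
c = a + b  # the single computation node of the graph
print(c)   # 76
```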
@@ -242,7 +242,7 @@
 "\n",
 "Now let's consider a slightly more complicated example:\n",
 "\n",
-"![alt text](https://raw.githubusercontent.com/aamini/introtodeeplearning/2025/lab1/img/computation-graph.png)\n",
+"![alt text](https://raw.githubusercontent.com/aamini/introtodeeplearning/master/lab1/img/computation-graph.png)\n",
 "\n",
 "Here, we take two inputs, `a, b`, and compute an output `e`. Each node in the graph represents an operation that takes some input, does some computation, and passes its output to another node.\n",
 "\n",
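The context above describes a two-input graph taking `a, b` to an output `e` through several operation nodes. A minimal sketch of that idea, where the specific intermediate operations are illustrative assumptions (the actual graph is in the computation-graph.png figure), might look like:

```python
import numpy as np

# Each line below stands for one node of a small computation graph:
# two intermediate nodes combine into the output e. The particular
# operations chosen are hypothetical, for illustration only.
def func(a, b):
    c = np.add(a, b)       # node: c = a + b
    d = np.subtract(b, 1)  # node: d = b - 1
    e = np.multiply(c, d)  # node: e = c * d
    return e

print(func(1.5, 2.5))  # 6.0
```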
@@ -316,7 +316,7 @@
 "\n",
 "Let's first consider the example of a simple perceptron defined by just one dense layer: $ y = \\sigma(Wx + b)$, where $W$ represents a matrix of weights, $b$ is a bias, $x$ is the input, $\\sigma$ is the sigmoid activation function, and $y$ is the output. We can also visualize this operation using a graph:\n",
 "\n",
-"![alt text](https://raw.githubusercontent.com/aamini/introtodeeplearning/2025/lab1/img/computation-graph-2.png)\n",
+"![alt text](https://raw.githubusercontent.com/aamini/introtodeeplearning/master/lab1/img/computation-graph-2.png)\n",
 "\n",
 "Tensors can flow through abstract types called [```Layers```](https://www.tensorflow.org/api_docs/python/tf/keras/layers/Layer) -- the building blocks of neural networks. ```Layers``` implement common neural networks operations, and are used to update weights, compute losses, and define inter-layer connectivity. We will first define a ```Layer``` to implement the simple perceptron defined above."
 ]
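The perceptron $y = \sigma(Wx + b)$ described in the context lines above can be sketched directly. The notebook implements it by subclassing `tf.keras.layers.Layer`; the NumPy version below mirrors only the forward computation, and the shapes and random seed are illustrative assumptions:

```python
import numpy as np

# Forward pass of a one-layer perceptron y = sigmoid(W x + b).
# Shapes are illustrative: 2 input features mapped to 3 outputs.
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
x = rng.normal(size=(1, 2))   # one input example with 2 features
W = rng.normal(size=(2, 3))   # weight matrix: 2 inputs -> 3 outputs
b = np.zeros((1, 3))          # bias vector

y = sigmoid(x @ W + b)
print(y.shape)  # (1, 3)
```

Because the sigmoid squashes its input, every entry of `y` lies strictly between 0 and 1, which is what makes the layer's output interpretable as an activation.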
