Commit 8974543

Authored Sep 11, 2024
Merge pull request #1074 from pritesh2000/gram-1/00
00_pytorch_fundamentals.ipynb
2 parents c2ae892 + 20a51fd commit 8974543

File tree: 1 file changed (+12 −12 lines changed)
00_pytorch_fundamentals.ipynb (+12 −12)

@@ -30,11 +30,11 @@
 "\n",
 "## Who uses PyTorch?\n",
 "\n",
-"Many of the worlds largest technology companies such as [Meta (Facebook)](https://ai.facebook.com/blog/pytorch-builds-the-future-of-ai-and-machine-learning-at-facebook/), Tesla and Microsoft as well as artificial intelligence research companies such as [OpenAI use PyTorch](https://openai.com/blog/openai-pytorch/) to power research and bring machine learning to their products.\n",
+"Many of the world's largest technology companies such as [Meta (Facebook)](https://ai.facebook.com/blog/pytorch-builds-the-future-of-ai-and-machine-learning-at-facebook/), Tesla and Microsoft as well as artificial intelligence research companies such as [OpenAI use PyTorch](https://openai.com/blog/openai-pytorch/) to power research and bring machine learning to their products.\n",
 "\n",
 "![pytorch being used across industry and research](https://raw.githubusercontent.com/mrdbourke/pytorch-deep-learning/main/images/00-pytorch-being-used-across-research-and-industry.png)\n",
 "\n",
-"For example, Andrej Karpathy (head of AI at Tesla) has given several talks ([PyTorch DevCon 2019](https://youtu.be/oBklltKXtDE), [Tesla AI Day 2021](https://youtu.be/j0z4FweCy4M?t=2904)) about how Tesla use PyTorch to power their self-driving computer vision models.\n",
+"For example, Andrej Karpathy (head of AI at Tesla) has given several talks ([PyTorch DevCon 2019](https://youtu.be/oBklltKXtDE), [Tesla AI Day 2021](https://youtu.be/j0z4FweCy4M?t=2904)) about how Tesla uses PyTorch to power their self-driving computer vision models.\n",
 "\n",
 "PyTorch is also used in other industries such as agriculture to [power computer vision on tractors](https://medium.com/pytorch/ai-for-ag-production-machine-learning-for-agriculture-e8cfdb9849a1).\n",
 "\n",
@@ -66,7 +66,7 @@
 "| **Creating tensors** | Tensors can represent almost any kind of data (images, words, tables of numbers). |\n",
 "| **Getting information from tensors** | If you can put information into a tensor, you'll want to get it out too. |\n",
 "| **Manipulating tensors** | Machine learning algorithms (like neural networks) involve manipulating tensors in many different ways such as adding, multiplying, combining. | \n",
-"| **Dealing with tensor shapes** | One of the most common issues in machine learning is dealing with shape mismatches (trying to mixed wrong shaped tensors with other tensors). |\n",
+"| **Dealing with tensor shapes** | One of the most common issues in machine learning is dealing with shape mismatches (trying to mix wrong shaped tensors with other tensors). |\n",
 "| **Indexing on tensors** | If you've indexed on a Python list or NumPy array, it's very similar with tensors, except they can have far more dimensions. |\n",
 "| **Mixing PyTorch tensors and NumPy** | PyTorch plays with tensors ([`torch.Tensor`](https://pytorch.org/docs/stable/tensors.html)), NumPy likes arrays ([`np.ndarray`](https://numpy.org/doc/stable/reference/generated/numpy.ndarray.html)) sometimes you'll want to mix and match these. | \n",
 "| **Reproducibility** | Machine learning is very experimental and since it uses a lot of *randomness* to work, sometimes you'll want that *randomness* to not be so random. |\n",
@@ -501,7 +501,7 @@
 "id": "LhXXgq-dTGe3"
 },
 "source": [
-"`MATRIX` has two dimensions (did you count the number of square brakcets on the outside of one side?).\n",
+"`MATRIX` has two dimensions (did you count the number of square brackets on the outside of one side?).\n",
 "\n",
 "What `shape` do you think it will have?"
 ]
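
For context on the `MATRIX` line fixed in this hunk, a minimal sketch (the values are assumptions, not copied from the notebook) of counting dimensions by the outer square brackets and then checking `shape`:

```python
# Minimal sketch: a 2-dimensional tensor and its ndim/shape attributes.
import torch

MATRIX = torch.tensor([[7, 8],
                       [9, 10]])
print(MATRIX.ndim)   # 2 -> two square brackets on the outside of one side
print(MATRIX.shape)  # torch.Size([2, 2]) -> 2 rows, 2 columns
```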
@@ -697,7 +697,7 @@
 "\n",
 "And machine learning models such as neural networks manipulate and seek patterns within tensors.\n",
 "\n",
-"But when building machine learning models with PyTorch, it's rare you'll create tensors by hand (like what we've being doing).\n",
+"But when building machine learning models with PyTorch, it's rare you'll create tensors by hand (like what we've been doing).\n",
 "\n",
 "Instead, a machine learning model often starts out with large random tensors of numbers and adjusts these random numbers as it works through data to better represent it.\n",
 "\n",
@@ -984,7 +984,7 @@
 "\n",
 "Some are specific for CPU and some are better for GPU.\n",
 "\n",
-"Getting to know which is which can take some time.\n",
+"Getting to know which one can take some time.\n",
 "\n",
 "Generally if you see `torch.cuda` anywhere, the tensor is being used for GPU (since Nvidia GPUs use a computing toolkit called CUDA).\n",
 "\n",
@@ -1901,7 +1901,7 @@
 "id": "bXKozI4T0hFi"
 },
 "source": [
-"Without the transpose, the rules of matrix mulitplication aren't fulfilled and we get an error like above.\n",
+"Without the transpose, the rules of matrix multiplication aren't fulfilled and we get an error like above.\n",
 "\n",
 "How about a visual? \n",
 "\n",
@@ -1988,7 +1988,7 @@
 "id": "zIGrP5j1pN7j"
 },
 "source": [
-"> **Question:** What happens if you change `in_features` from 2 to 3 above? Does it error? How could you change the shape of the input (`x`) to accomodate to the error? Hint: what did we have to do to `tensor_B` above?"
+"> **Question:** What happens if you change `in_features` from 2 to 3 above? Does it error? How could you change the shape of the input (`x`) to accommodate to the error? Hint: what did we have to do to `tensor_B` above?"
 ]
 },
 {
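
For context on the `in_features` question in this hunk, a minimal sketch (the shapes and layer sizes are assumptions, not necessarily the notebook's exact code) of how `in_features` must match the last dimension of the input `x`:

```python
# Minimal sketch: in_features of nn.Linear must equal x's last dimension.
import torch
from torch import nn

torch.manual_seed(42)
x = torch.rand(3, 2)                             # last dimension is 2
linear = nn.Linear(in_features=2, out_features=6)
print(linear(x).shape)                           # torch.Size([3, 6])

# With in_features=3 this layer would error on x; transposing x to shape (2, 3)
# (as with tensor_B above) is one way to make the shapes line up again.
```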
@@ -2188,7 +2188,7 @@
 "\n",
 "You can change the datatypes of tensors using [`torch.Tensor.type(dtype=None)`](https://pytorch.org/docs/stable/generated/torch.Tensor.type.html) where the `dtype` parameter is the datatype you'd like to use.\n",
 "\n",
-"First we'll create a tensor and check it's datatype (the default is `torch.float32`)."
+"First we'll create a tensor and check its datatype (the default is `torch.float32`)."
 ]
 },
 {
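
For context on the `torch.Tensor.type(dtype=None)` sentence in this hunk, a minimal sketch (illustrative values only) of checking the default datatype and converting it:

```python
# Minimal sketch: inspecting the default dtype and changing it with type().
import torch

tensor = torch.tensor([1.0, 2.0, 3.0])
print(tensor.dtype)                      # torch.float32 (the default)

tensor_float16 = tensor.type(torch.float16)
print(tensor_float16.dtype)              # torch.float16
```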
@@ -2289,7 +2289,7 @@
 }
 ],
 "source": [
-"# Create a int8 tensor\n",
+"# Create an int8 tensor\n",
 "tensor_int8 = tensor.type(torch.int8)\n",
 "tensor_int8"
 ]
@@ -3139,7 +3139,7 @@
 "source": [
 "Just as you might've expected, the tensors come out with different values.\n",
 "\n",
-"But what if you wanted to created two random tensors with the *same* values.\n",
+"But what if you wanted to create two random tensors with the *same* values.\n",
 "\n",
 "As in, the tensors would still contain random values but they would be of the same flavour.\n",
 "\n",
@@ -3220,7 +3220,7 @@
 "It looks like setting the seed worked. \n",
 "\n",
 "> **Resource:** What we've just covered only scratches the surface of reproducibility in PyTorch. For more, on reproducibility in general and random seeds, I'd checkout:\n",
-"> * [The PyTorch reproducibility documentation](https://pytorch.org/docs/stable/notes/randomness.html) (a good exericse would be to read through this for 10-minutes and even if you don't understand it now, being aware of it is important).\n",
+"> * [The PyTorch reproducibility documentation](https://pytorch.org/docs/stable/notes/randomness.html) (a good exercise would be to read through this for 10-minutes and even if you don't understand it now, being aware of it is important).\n",
 "> * [The Wikipedia random seed page](https://en.wikipedia.org/wiki/Random_seed) (this'll give a good overview of random seeds and pseudorandomness in general)."
 ]
 },
