Fix typos in tutorials 1 and 2 (#355)
Signed-off-by: LateNightIceCream <[email protected]>
LateNightIceCream authored Apr 15, 2024
1 parent 7eb04f4 commit 29dba8d
Showing 2 changed files with 7 additions and 7 deletions.
10 changes: 5 additions & 5 deletions examples/Sionna_tutorial_part1.ipynb
@@ -411,7 +411,7 @@
"source": [
"As can be seen, the `Mapper` class inherits from `Layer`, i.e., implements a Keras layer.\n",
"\n",
"This allows to simply built complex systems by using the [Keras functional API](https://keras.io/guides/functional_api/) to stack layers."
"This allows to simply build complex systems by using the [Keras functional API](https://keras.io/guides/functional_api/) to stack layers."
]
},
{
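The sentence corrected in this hunk is about stacking layers with the Keras functional API. A minimal sketch of what that can look like with Sionna's `Mapper` (the input shape and modulation order below are assumptions, not part of the notebook):

```python
import tensorflow as tf
from sionna.mapping import Mapper

NUM_BITS_PER_SYMBOL = 2  # QPSK

# Keras functional API: call Sionna layers on symbolic inputs and stack them.
bits_in = tf.keras.Input(shape=(1024,), dtype=tf.float32)     # 1024 bits per example
symbols = Mapper("qam", NUM_BITS_PER_SYMBOL)(bits_in)         # 512 QPSK symbols
model = tf.keras.Model(inputs=bits_in, outputs=symbols)
model.summary()
```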
@@ -521,7 +521,7 @@
"id": "aca7a98b",
"metadata": {},
"source": [
"In *Eager* mode, we can directly access the values of each tensor. This simplify debugging."
"In *Eager* mode, we can directly access the values of each tensor. This simplifies debugging."
]
},
{
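As an illustration of the Eager-mode point fixed here, a small sketch (none of these tensors or shapes appear in the diff itself):

```python
from sionna.mapping import Mapper
from sionna.utils import BinarySource

# In Eager mode every operation executes immediately,
# so intermediate tensors can be inspected right away.
binary_source = BinarySource()
mapper = Mapper("qam", num_bits_per_symbol=2)

bits = binary_source([4, 8])   # 4 examples, 8 bits each
symbols = mapper(bits)         # 4 x 4 QPSK symbols

print(bits.numpy())            # direct access to the values -> easy debugging
print(symbols.numpy())
```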
@@ -613,7 +613,7 @@
"id": "bfc184ba-c090-4443-9cd6-c217b3f64052",
"metadata": {},
"source": [
"It is typically more convenient to wrap a Sionna-based communication system into a [Keras models](https://keras.io/api/models/model/).\n",
"It is typically more convenient to wrap a Sionna-based communication system into a [Keras model](https://keras.io/api/models/model/).\n",
"\n",
"These models can be simply built by using the [Keras functional API](https://keras.io/guides/functional_api/) to stack layers.\n",
"\n",
@@ -854,10 +854,10 @@
"metadata": {},
"source": [
"One of the fundamental paradigms of Sionna is batch-processing.\n",
"Thus, the example above could be executed with for arbitrary batch-sizes to simulate `batch_size` codewords in parallel.\n",
"Thus, the example above could be executed for arbitrary batch-sizes to simulate `batch_size` codewords in parallel.\n",
"\n",
"However, Sionna can do more - it supports *N*-dimensional input tensors and, thereby, allows the processing of multiple samples of multiple users and several antennas in a single command line.\n",
"Let's say we want to encoded `batch_size` codewords of length `n` for each of the `num_users` connected to each of the `num_basestations`. \n",
"Let's say we want to encode `batch_size` codewords of length `n` for each of the `num_users` connected to each of the `num_basestations`. \n",
"This means in total we transmit `batch_size` * `n` * `num_users` * `num_basestations` bits."
]
},
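The multi-dimensional call this hunk describes, sketched with the 5G LDPC encoder from `sionna.fec.ldpc` (all sizes below are made up for illustration):

```python
from sionna.fec.ldpc import LDPC5GEncoder
from sionna.utils import BinarySource

batch_size, num_basestations, num_users = 32, 4, 5
k, n = 100, 200                     # information bits per codeword / codeword length

source = BinarySource()
encoder = LDPC5GEncoder(k, n)

# One call encodes batch_size * num_basestations * num_users codewords in parallel;
# only the last dimension (the k information bits) is consumed by the encoder.
u = source([batch_size, num_basestations, num_users, k])
c = encoder(u)
print(c.shape)                      # (32, 4, 5, 200)
```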
4 changes: 2 additions & 2 deletions examples/Sionna_tutorial_part2.ipynb
@@ -290,7 +290,7 @@
"id": "318aa681",
"metadata": {},
"source": [
"`gradient` is a list of tensor, each tensor corresponding to a trainable variable of our model.\n",
"`gradient` is a list of tensors, each tensor corresponding to a trainable variable of our model.\n",
"\n",
"For this model, we only have a single trainable tensor: The constellation of shape [`2`, `2^NUM_BITS_PER_SYMBOL`], the first dimension corresponding to the real and imaginary components of the constellation points.\n",
"\n",
@@ -357,7 +357,7 @@
"id": "d9ef8110",
"metadata": {},
"source": [
"Let compare the constellation before and after the gradient application"
"Let's compare the constellation before and after the gradient application"
]
},
{
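Continuing the previous sketch (so `constellation` and `gradient` are reused), the kind of before/after comparison the corrected sentence refers to; the optimizer and learning rate are assumptions:

```python
import tensorflow as tf

# Reuses `constellation` and `gradient` from the sketch above.
optimizer = tf.keras.optimizers.SGD(learning_rate=0.1)

points_before = tf.identity(constellation.points)   # snapshot before the update
optimizer.apply_gradients(zip(gradient, constellation.trainable_variables))
points_after = constellation.points

# Largest displacement of any constellation point after this single step.
print(tf.reduce_max(tf.abs(points_after - points_before)))
```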
