
Commit a4aabf5

Update README.md
1 parent 206544d commit a4aabf5

1 file changed

Lines changed: 17 additions & 18 deletions

examples/CoreML/ONNXLive/README.md
@@ -38,35 +38,34 @@ You can also convert models in Linux, however to run the iOS app itself, you wil
## Download (or train) PyTorch style transfer models

For this tutorial, we will use the style transfer models that are published with PyTorch at https://github.com/pytorch/examples/tree/master/fast_neural_style. If you would like to use a different PyTorch or ONNX model, feel free to skip this step.

These models are meant for applying style transfer to still images and are not optimized to be fast enough for video. However, if we reduce the resolution enough, they can also work well on videos.

Let's download the models:

    git clone https://github.com/pytorch/examples
    cd examples/fast_neural_style

If you would like to train the models yourself, the pytorch/examples repository you just cloned has more information on how to do this. For now, we'll just download pre-trained models with the script provided by the repository:

    ./download_saved_models.sh

This script downloads the pre-trained PyTorch models and puts them into the `saved_models` folder. There should now be 4 files, `candy.pth`, `mosaic.pth`, `rain_princess.pth` and `udnie.pth`, in your directory.

## Converting the PyTorch models to the ONNX model format

Now that we have the pre-trained PyTorch models as `.pth` files in the `saved_models` folder, we need to convert them to the ONNX format. The model definition is in the pytorch/examples repository we cloned previously, and with a few lines of Python we can export it to ONNX. Instead of running the neural net, we would call `torch.onnx._export`, an API provided with PyTorch for exporting ONNX-formatted models directly from PyTorch. However, in this case we don't even need to do that, because a script, `neural_style/neural_style.py`, already exists that does this for us. You can take a look at that script if you would like to apply it to other models.

Exporting the ONNX format from PyTorch is based on tracing your neural network, so this API call will internally run the network on dummy data in order to generate the traced graph. For this, it needs an input image to apply the style transfer to, which can simply be a blank image. However, the pixel size of this image is important, as this will be the input size of the exported style transfer model. To get good performance, we'll use a resolution of 250x540. Feel free to choose a larger resolution if you care less about FPS and more about style transfer quality.
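A blank `dummy.jpg` at this resolution can be created with Pillow; this is a sketch under the assumptions that Pillow is installed and that 250x540 means width 250, height 540:

```python
# Create a blank (black) 250x540 dummy image for the export step.
# Pillow and the width/height reading are assumptions of this sketch.
from PIL import Image

Image.new("RGB", (250, 540)).save("dummy.jpg")
```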
@@ -82,7 +81,7 @@ and use that to export the PyTorch models
    python ./neural_style/neural_style.py eval --content-image dummy.jpg --output-image dummy-out.jpg --model ./saved_models/rain_princess.pth --cuda 0 --export_onnx ./saved_models/rain_princess.onnx
    python ./neural_style/neural_style.py eval --content-image dummy.jpg --output-image dummy-out.jpg --model ./saved_models/mosaic.pth --cuda 0 --export_onnx ./saved_models/mosaic.onnx
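Since the same command runs once per model, a shell loop over the four downloaded `.pth` files is a convenient sketch (assuming a POSIX shell, run from `examples/fast_neural_style`):

```shell
# Export every downloaded style model to ONNX with one loop.
for model in candy mosaic rain_princess udnie; do
  python ./neural_style/neural_style.py eval \
    --content-image dummy.jpg \
    --output-image dummy-out.jpg \
    --model "./saved_models/${model}.pth" \
    --cuda 0 \
    --export_onnx "./saved_models/${model}.onnx"
done
```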

You should end up with 4 files, `candy.onnx`, `mosaic.onnx`, `rain_princess.onnx` and `udnie.onnx`, created from the corresponding `.pth` files.

## Convert the ONNX models to CoreML models
