examples/CoreML/ONNXLive/README.md
## Download (or train) PyTorch style transfer models
For this tutorial, we will use the style transfer models that are published with PyTorch at https://github.com/pytorch/examples/tree/master/fast_neural_style.
If you would like to use a different PyTorch or ONNX model, feel free to skip this step.
These models are designed for applying style transfer to still images and are not optimized to be fast enough for video. However, if we reduce the resolution enough, they can also work well on video.
Let's download the models:
```shell
git clone https://github.com/pytorch/examples
cd examples/fast_neural_style
```
If you would like to train the models yourself, the pytorch/examples repository you just cloned has more information on how to do this.
For now, we'll just download pre-trained models with the script provided by the repository:
```shell
./download_saved_models.sh
```
This script downloads the pre-trained PyTorch models and puts them into the `saved_models` folder.
There should now be four files, `candy.pth`, `mosaic.pth`, `rain_princess.pth`, and `udnie.pth`, in the `saved_models` folder.
## Convert the PyTorch models to the ONNX format
Now that we have the pre-trained PyTorch models as `.pth` files in the `saved_models` folder, we will need to convert them to ONNX format.
The model definition lives in the pytorch/examples repository we cloned previously, and with a few lines of Python we can export it to ONNX.
Instead of actually running the neural net, we call `torch.onnx._export`, an API provided with PyTorch for exporting ONNX-formatted models directly from PyTorch.
However, in this case we don't even need to do that ourselves, because the script `neural_style/neural_style.py` already does this for us.
You can also take a look at that script if you would like to apply it to other models.
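As a sketch, an export with that script looks roughly like the command below. The exact flags come from the version of `neural_style.py` current at the time of writing and may have changed, and the input/output file names are placeholders, so check `python ./neural_style/neural_style.py eval --help` if it fails:

```shell
# Export candy.pth to ONNX by "evaluating" it once on a dummy image.
# dummy.jpg, dummy-out.jpg, and candy.onnx are placeholder file names.
python ./neural_style/neural_style.py eval \
  --content-image dummy.jpg \
  --output-image dummy-out.jpg \
  --model ./saved_models/candy.pth \
  --cuda 0 \
  --export_onnx ./saved_models/candy.onnx
```

Repeat the command once per `.pth` file to convert all four models.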
Exporting to ONNX from PyTorch works by tracing your neural network, so this API call will internally run the network on dummy data in order to generate the graph.
For this, it needs an input image to apply the style transfer to, which can simply be a blank image.
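For instance, a blank input image can be created with a couple of lines of Pillow. This snippet is not part of the tutorial's scripts; the file name is a placeholder, and the 250x540 size matches the resolution chosen below:

```python
from PIL import Image

# A solid black 250x540 image; the content doesn't matter, since only
# the size is baked into the exported model.
blank = Image.new("RGB", (250, 540))
blank.save("dummy.jpg")
```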
However, the pixel size of this image is important, as this will be the size for the exported style transfer model.
To get good performance, we'll use a resolution of 250x540. Feel free to pick a larger resolution if you care less about FPS and more about style transfer quality.