Commit 7b549ae

Roshrini authored and lupesko committed
Update mxnet doc and adding MXNet to ONNX export tutorial (#50)
* mxnet export tutorial added
* doc fix
* small fixes
* adding details for supported onnx operator opset
1 parent 0f308c2 commit 7b549ae

3 files changed (277 additions, 6 deletions)


README.md

Lines changed: 1 addition & 1 deletion
@@ -7,7 +7,7 @@
 | [Caffe2](http://caffe2.ai) | [part of caffe2 package](https://github.com/caffe2/caffe2/tree/master/caffe2/python/onnx) | [Exporting](tutorials/Caffe2OnnxExport.ipynb) | [Importing](tutorials/OnnxCaffe2Import.ipynb) |
 | [PyTorch](http://pytorch.org/) | [part of pytorch package](http://pytorch.org/docs/master/onnx.html) | [Exporting](tutorials/PytorchOnnxExport.ipynb), [Extending support](tutorials/PytorchAddExportSupport.md) | coming soon |
 | [Cognitive Toolkit (CNTK)](https://www.microsoft.com/en-us/cognitive-toolkit/) | [built-in](https://docs.microsoft.com/en-us/cognitive-toolkit/setup-cntk-on-your-machine) | [Exporting](tutorials/CntkOnnxExport.ipynb) | [Importing](tutorials/OnnxCntkImport.ipynb) |
-| [Apache MXNet](http://mxnet.incubator.apache.org/) | part of mxnet package [docs](http://mxnet.incubator.apache.org/versions/1.2.0/api/python/contrib/onnx.html) [github](https://github.com/apache/incubator-mxnet/tree/master/python/mxnet/contrib/onnx) | coming soon | [Importing](tutorials/OnnxMxnetImport.ipynb) [experimental] |
+| [Apache MXNet](http://mxnet.incubator.apache.org/) | part of mxnet package [docs](http://mxnet.incubator.apache.org/api/python/contrib/onnx.html) [github](https://github.com/apache/incubator-mxnet/tree/master/python/mxnet/contrib/onnx) | [Exporting](tutorials/MXNetONNXExport.ipynb) | [Importing](tutorials/OnnxMxnetImport.ipynb) |
 | [Chainer](https://chainer.org/) | [chainer/onnx-chainer](https://github.com/chainer/onnx-chainer) | [Exporting](tutorials/ChainerOnnxExport.ipynb) | coming soon |
 | [TensorFlow](https://www.tensorflow.org/) | [onnx/onnx-tensorflow](https://github.com/onnx/onnx-tensorflow) | [Exporting](tutorials/OnnxTensorflowExport.ipynb) | [Importing](tutorials/OnnxTensorflowImport.ipynb) [experimental] |
 | [Apple CoreML](https://developer.apple.com/documentation/coreml) | [onnx/onnx-coreml](https://github.com/onnx/onnx-coreml) and [onnx/onnxmltools](https://github.com/onnx/onnxmltools) | [Exporting](https://github.com/onnx/onnxmltools) | [Importing](tutorials/OnnxCoremlImport.ipynb) |

tutorials/MXNetONNXExport.ipynb

Lines changed: 265 additions & 0 deletions
@@ -0,0 +1,265 @@
+{
+ "cells": [
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "# Exporting MXNet models to ONNX\n",
+    "\n",
+    "In this tutorial, we will show how you can save MXNet models to the ONNX format.\n",
+    "The ONNX exporter is part of the [MXNet repository](https://github.com/apache/incubator-mxnet/tree/master/python/mxnet/contrib/onnx/mx2onnx).\n",
+    "\n",
+    "Current MXNet-ONNX import and export operator support and coverage can be found [here](https://cwiki.apache.org/confluence/display/MXNET/ONNX).\n"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Step 1: Installations\n",
+    "#### Install **ONNX 1.2.1**. Follow the instructions on the [ONNX repo](https://github.com/onnx/onnx).\n",
+    "\n",
+    "The MXNet ONNX importer and exporter follow version 7 of the ONNX operator set.\n",
+    "\n",
+    "#### Make sure to install the latest MXNet, either from source or via pip.\n",
+    "\n",
+    "```bash\n",
+    "pip install mxnet --pre\n",
+    "```\n",
+    "\n",
+    "* Note: the ONNX exporter will be released as part of MXNet v1.3."
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Step 2: Prepare an MXNet model to convert to ONNX\n",
+    "\n",
+    "Let's try out a pretrained ResNet-18 model from the [MXNet model zoo](http://data.mxnet.io/models/)."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 1,
+   "metadata": {
+    "collapsed": true
+   },
+   "outputs": [],
+   "source": [
+    "import mxnet as mx\n",
+    "import numpy as np\n",
+    "from mxnet.contrib import onnx as onnx_mxnet"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 2,
+   "metadata": {},
+   "outputs": [
+    {
+     "data": {
+      "text/plain": [
+       "['resnet-18-0000.params', 'resnet-18-symbol.json', 'synset.txt']"
+      ]
+     },
+     "execution_count": 2,
+     "metadata": {},
+     "output_type": "execute_result"
+    }
+   ],
+   "source": [
+    "# Download the pretrained resnet model - json and params from the mxnet model zoo.\n",
+    "path='http://data.mxnet.io/models/imagenet/'\n",
+    "[mx.test_utils.download(path+'resnet/18-layers/resnet-18-0000.params'),\n",
+    " mx.test_utils.download(path+'resnet/18-layers/resnet-18-symbol.json'),\n",
+    " mx.test_utils.download(path+'synset.txt')]"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Step 3: Use the MXNet-to-ONNX exporter\n",
+    "\n",
+    "MXNet's ONNX \"export_model\" API accepts the following inputs:\n"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 3,
+   "metadata": {},
+   "outputs": [
+    {
+     "name": "stdout",
+     "output_type": "stream",
+     "text": [
+      "Help on function export_model in module mxnet.contrib.onnx.mx2onnx.export_model:\n",
+      "\n",
+      "export_model(sym, params, input_shape, input_type=<type 'numpy.float32'>, onnx_file_path=u'model.onnx', verbose=False)\n",
+      "    Exports the MXNet model file, passed as a parameter, into ONNX model.\n",
+      "    Accepts both symbol,parameter objects as well as json and params filepaths as input.\n",
+      "    Operator support and coverage - https://cwiki.apache.org/confluence/display/MXNET/ONNX\n",
+      "    \n",
+      "    Parameters\n",
+      "    ----------\n",
+      "    sym : str or symbol object\n",
+      "        Path to the json file or Symbol object\n",
+      "    params : str or symbol object\n",
+      "        Path to the params file or params dictionary. (Including both arg_params and aux_params)\n",
+      "    input_shape : List of tuple\n",
+      "        Input shape of the model e.g [(1,3,224,224)]\n",
+      "    input_type : data type\n",
+      "        Input data type e.g. np.float32\n",
+      "    onnx_file_path : str\n",
+      "        Path where to save the generated onnx file\n",
+      "    verbose : Boolean\n",
+      "        If true will print logs of the model conversion\n",
+      "    \n",
+      "    Returns\n",
+      "    -------\n",
+      "    onnx_file_path : str\n",
+      "        Onnx file path\n",
+      "\n"
+     ]
+    }
+   ],
+   "source": [
+    "help(onnx_mxnet.export_model)"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "From the API description, you can see that the `export_model` API accepts two kinds of inputs:\n",
+    "\n",
+    "### MXNet sym, params objects:\n",
+    "\n",
+    "This is useful if you are training a model. At the end of training, you just need to invoke the `export_model` function and provide the sym and params objects as inputs, along with the other attributes, to save the model in ONNX format.\n",
+    "\n",
+    "### MXNet's exported json and params files:\n",
+    "\n",
+    "This is useful if you have pretrained models and want to convert them to ONNX format.\n",
+    "\n",
+    "In this tutorial, we will show the second use case:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 4,
+   "metadata": {
+    "collapsed": true
+   },
+   "outputs": [],
+   "source": [
+    "# Downloaded input symbol and params files\n",
+    "sym = 'resnet-18-symbol.json'\n",
+    "params = 'resnet-18-0000.params'\n",
+    "# Standard Imagenet input - 3 channels, 224 x 224\n",
+    "input_shape = (1,3,224,224)\n",
+    "# Path of the output file\n",
+    "onnx_file = 'mxnet_exported_resnet50.onnx'"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 5,
+   "metadata": {
+    "collapsed": true
+   },
+   "outputs": [],
+   "source": [
+    "# Invoke the export model API. It returns the path of the converted onnx model.\n",
+    "converted_model_path = onnx_mxnet.export_model(sym, params, [input_shape], np.float32, onnx_file)"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 6,
+   "metadata": {},
+   "outputs": [
+    {
+     "name": "stdout",
+     "output_type": "stream",
+     "text": [
+      "mxnet_exported_resnet50.onnx\n"
+     ]
+    }
+   ],
+   "source": [
+    "print(converted_model_path)"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Step 4: Check validity\n",
+    "\n",
+    "You can check the validity of the converted ONNX model using the ONNX checker tool as follows:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 7,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "from onnx import checker\n",
+    "import onnx\n",
+    "# Load the onnx model\n",
+    "model_proto = onnx.load(converted_model_path)\n",
+    "\n",
+    "# Check whether the converted ONNX protobuf is valid\n",
+    "checker.check_graph(model_proto.graph)"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "This method confirms that the exported model protobuf is valid."
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## What's next\n",
+    "\n",
+    "Take a look at [other tutorials, including importing ONNX models into other frameworks](https://github.com/onnx/tutorials/tree/master/tutorials)."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {
+    "collapsed": true
+   },
+   "outputs": [],
+   "source": []
+  }
+ ],
+ "metadata": {
+  "kernelspec": {
+   "display_name": "onnx_mxnet",
+   "language": "python",
+   "name": "onnx_mxnet"
+  },
+  "language_info": {
+   "codemirror_mode": {
+    "name": "ipython",
+    "version": 2
+   },
+   "file_extension": ".py",
+   "mimetype": "text/x-python",
+   "name": "python",
+   "nbconvert_exporter": "python",
+   "pygments_lexer": "ipython2",
+   "version": "2.7.14"
+  }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 2
+}

tutorials/OnnxMxnetImport.ipynb

Lines changed: 11 additions & 5 deletions
@@ -6,7 +6,7 @@
    "source": [
     "# Importing ONNX models to MXNet\n",
     "\n",
-    "We'll show how you can use ONNX-MXNet to import ONNX models into MXNet, and use the imported model for inference, benefiting from MXNet's optimized execution engine.\n",
+    "We'll show how you can import ONNX models into MXNet, and use the imported model for inference, benefiting from MXNet's optimized execution engine.\n",
     "\t\n",
     "## Step 1: Installations\n",
     "\t\n",
@@ -31,13 +31,15 @@
     "\n",
     "## Step 3: Import the ONNX model into MXNet\n",
     "\n",
-    "Now that we have an ONNX model file ready, let's import it into MXNet using ONNX-MXNet's import API."
+    "Now that we have an ONNX model file ready, let's import it into MXNet:"
    ]
   },
   {
    "cell_type": "code",
    "execution_count": 12,
-   "metadata": {},
+   "metadata": {
+    "collapsed": true
+   },
    "outputs": [],
    "source": [
     "import mxnet.contrib.onnx as onnx_mxnet\n",
@@ -95,7 +97,9 @@
   {
    "cell_type": "code",
    "execution_count": 14,
-   "metadata": {},
+   "metadata": {
+    "collapsed": true
+   },
    "outputs": [],
    "source": [
     "mod = mx.mod.Module(symbol=sym, data_names=['1'], label_names=None)\n",
@@ -114,7 +118,9 @@
   {
    "cell_type": "code",
    "execution_count": 15,
-   "metadata": {},
+   "metadata": {
+    "collapsed": true
+   },
    "outputs": [],
    "source": [
     "from collections import namedtuple\n",
