
Conversation

dboyliao
Member

@dboyliao dboyliao commented Aug 27, 2020

demo multiple inferences

Simple MNIST end-to-end uTensor cli example (device)
pred label: 8, expecting: 8
pred label: 3, expecting: 3
pred label: 5, expecting: 5
pred label: 5, expecting: 5
pred label: 1, expecting: 1
pred label: 9, expecting: 9
pred label: 3, expecting: 3
pred label: 1, expecting: 1

@dboyliao dboyliao requested a review from neil-tan August 27, 2020 13:24

@neil-tan neil-tan left a comment


Thanks for the work!
One minor change requested in the main() for readability.

That said, this example is meant to run on all devices, and the additional image samples might not fit on the tiniest ones. My recommendation is to hold back this PR until tests have been conducted on Cortex-M0 devices.

// create the input/output tensors
Tensor input_image = new RomTensor({1, 28, 28, 1}, flt, arr_input_image);
Tensor logits = new RamTensor({1, 10}, flt);
// pointer-arithmetic idiom: computes the element count of the ref_labels array
size_t num_samples = *(&ref_labels + 1) - ref_labels;

A macro definition or a const variable would be a better idea here.
