Can't allocate memory for the interpreter in tflite #19982

@shiftsayan

Description

System information

  • Have I written custom code (as opposed to using a stock example script provided in TensorFlow): Nope, using tensorflow-for-poets
  • OS Platform and Distribution (e.g., Linux Ubuntu 16.04): macOS High Sierra
  • TensorFlow installed from (source or binary): binary
  • TensorFlow version (use command below): 1.7.1
  • Python version: Python 2.7.10
  • Bazel version (if compiling from source): N/A
  • GCC/Compiler version (if compiling from source): N/A
  • CUDA/cuDNN version: N/A
  • GPU model and memory: N/A
  • Exact command to reproduce: I converted a custom .pb model to TFLite using TOCO, replaced graph.lite in the demo app with the new model, and rebuilt the app; it crashes at runtime.

Detailed Description

I created a custom TensorFlow model and converted it to TensorFlow Lite using TOCO (as described in the tensorflow-for-poets tutorial), replaced the old graph.lite file with my custom model, and changed nothing else in the code. When I run the app, I get the following runtime error:

Process: android.example.com.tflitecamerademo, PID: 29160
    java.lang.RuntimeException: Unable to start activity ComponentInfo{android.example.com.tflitecamerademo/com.example.android.tflitecamerademo.CameraActivity}: java.lang.NullPointerException: Can not allocate memory for the interpreter
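
For context, this error surfaces where the camera demo builds its TFLite interpreter from the bundled model file. Below is a minimal sketch of that loading path, assuming the stock ImageClassifier code from the TF Lite camera demo; the class and method names here are illustrative, and only org.tensorflow.lite.Interpreter is the real API:

```java
import android.app.Activity;
import android.content.res.AssetFileDescriptor;
import java.io.FileInputStream;
import java.io.IOException;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import org.tensorflow.lite.Interpreter;

class ModelLoader {
  // Memory-map the model from assets; TFLite expects an uncompressed,
  // memory-mappable flatbuffer file.
  static MappedByteBuffer loadModelFile(Activity activity, String modelPath) throws IOException {
    AssetFileDescriptor fd = activity.getAssets().openFd(modelPath);
    FileInputStream inputStream = new FileInputStream(fd.getFileDescriptor());
    FileChannel fileChannel = inputStream.getChannel();
    return fileChannel.map(FileChannel.MapMode.READ_ONLY, fd.getStartOffset(), fd.getDeclaredLength());
  }

  static Interpreter createInterpreter(Activity activity) throws IOException {
    // "Can not allocate memory for the interpreter" is thrown from this constructor
    // when the native side cannot build an interpreter from the mapped buffer.
    return new Interpreter(loadModelFile(activity, "graph.lite"));
  }
}
```

The NullPointerException above is what the Java wrapper surfaces when native interpreter creation fails on that buffer, so the replacement graph.lite itself is the first thing to inspect (for example, whether it is a valid TFLite flatbuffer and whether it is stored uncompressed in the APK, which the demo's build.gradle normally ensures via aaptOptions noCompress).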

Fixes Already Tried
