
Feature Request: Enable Batch Processing #231

Open
saurabhkumar8112 opened this issue Apr 15, 2023 · 0 comments

Right now there is no way to do batch processing.
I was trying to run a classification model on a batch of images instead of one at a time.

The standard way to run inference on a single image is to convert the Image to a TensorImage and run the interpreter on it:

TensorImage img = TensorImage.fromImage(image);
interpreter.run(img.buffer, outputBuffer.buffer);

I have tried resizing the input tensor to [batchSize, H, W, C]:

_interpreter.resizeInputTensor(0, [batchSize, H, W, 3]);
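For what it's worth, my understanding is that after a resize the interpreter's buffers also need to be reallocated before the new shape takes effect. A minimal sketch, assuming the tflite_flutter Interpreter API (batchSize, h, w are placeholders for the model's actual dimensions):

```dart
// Resize input tensor 0 to a batched shape, then reallocate so the
// interpreter's internal buffers match the new shape before run().
_interpreter.resizeInputTensor(0, [batchSize, h, w, 3]);
_interpreter.allocateTensors();
```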

Now if I want to run the interpreter on a batch of, say, 8 images, I can't run it on a List, because:

  1. A List can't be converted into a TensorImage (the tflite helper plugin doesn't support it).
  2. If I call interpreter.runForMultipleInputs(List, output_buffer), the interpreter throws an exception: runForMultipleInputs maps each element of the input list to a separate input tensor, so inputTensors has length 1 (the model's single input, shape [batch_size, H, W, C]) while the input list has length batch_size, and the two won't line up. The relevant code from the Interpreter class:
  var inputTensors = getInputTensors();
 
  for (int i = 0; i < inputs.length; i++) {
    var tensor = inputTensors.elementAt(i);
    final newShape = tensor.getInputShapeIfDifferent(inputs[i]);
    if (newShape != null) {
      resizeInputTensor(i, newShape);
    }
  }

I have looked through StackOverflow and the source code, and my understanding is that batch processing isn't available right now.
Can we get a feature to enable batch processing?
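In the meantime, a possible workaround is to concatenate the per-image buffers into one contiguous buffer that matches the resized [batchSize, H, W, 3] shape and call run() once, instead of going through runForMultipleInputs. This is only a sketch against the tflite_flutter / tflite_flutter_helper APIs as I understand them; images, h, w, and numClasses are placeholders, and it assumes a float32 model input:

```dart
import 'dart:typed_data';

// Sketch: batch inference by packing images into one input buffer.
// `images` is a List<TensorImage> of length batchSize, all preprocessed
// to the same H x W x 3 float32 layout the model expects.
final batchSize = images.length;
interpreter.resizeInputTensor(0, [batchSize, h, w, 3]);
interpreter.allocateTensors();

// Concatenate the per-image float buffers into one contiguous batch buffer.
final perImage = h * w * 3;
final batchInput = Float32List(batchSize * perImage);
for (var i = 0; i < batchSize; i++) {
  final imgFloats = images[i].buffer.asFloat32List();
  batchInput.setRange(i * perImage, (i + 1) * perImage, imgFloats);
}

// Output buffer sized for the whole batch, e.g. [batchSize, numClasses].
final batchOutput = Float32List(batchSize * numClasses);
interpreter.run(batchInput.buffer, batchOutput.buffer);
```

Note that whether this works at all depends on the model: some TFLite models are converted with a fixed batch dimension of 1 and will reject the resize.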
