Added ResetVariableTensor #3298
base: main
Conversation
TfLiteStatus MicroInterpreterGraph::ResetVariableTensor(int tensor_index,
                                                        int subgraph_index) {
  const SubGraph* subgraph = (*subgraphs_)[subgraph_index];
  auto* tensor = subgraph->tensors()->Get(tensor_index);
  if (tensor->is_variable()) {
    size_t buffer_size;
    TF_LITE_ENSURE_STATUS(TfLiteEvalTensorByteLength(
        &subgraph_allocations_[subgraph_index].tensors[tensor_index],
        &buffer_size));

    int value = 0;
    if (tensor->type() == tflite::TensorType_INT8) {
      value = tensor->quantization()->zero_point()->Get(0);
    }
    memset(subgraph_allocations_[subgraph_index].tensors[tensor_index].data.raw,
           value, buffer_size);
  }
  return kTfLiteOk;
}
Since this only works for TensorFlow 1.0 variable type tensors, the default return value should be failure. Also TensorType_INT8 should not be used for quantization checks. Instead the existence of the quantization field in the schema should be used. Multi-channel quantization checks also need to be here, or alternatively fail this method.
Since this only works for TensorFlow 1.0 variable type tensors, the default return value should be failure.
Done.
Also TensorType_INT8 should not be used for quantization checks. Instead the existence of the quantization field in the schema should be used. Multi-channel quantization checks also need to be here, or alternatively fail this method.
This reset logic is identical to the existing ResetVariableTensors. We can improve it in a separate PR if this is needed.
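For reference, a rough sketch of the shape the reviewer is asking for (illustrative only, not code from this PR): fail by default for non-variable tensors, key the zero-point lookup off the presence of quantization metadata rather than the INT8 type, and bail out on per-channel quantization.

TfLiteStatus MicroInterpreterGraph::ResetVariableTensor(int tensor_index,
                                                        int subgraph_index) {
  const SubGraph* subgraph = (*subgraphs_)[subgraph_index];
  auto* tensor = subgraph->tensors()->Get(tensor_index);
  if (!tensor->is_variable()) {
    // Sketch: only TF 1.x variable tensors are supported, so fail by default.
    return kTfLiteError;
  }
  size_t buffer_size;
  TF_LITE_ENSURE_STATUS(TfLiteEvalTensorByteLength(
      &subgraph_allocations_[subgraph_index].tensors[tensor_index],
      &buffer_size));
  int value = 0;
  const auto* quantization = tensor->quantization();
  if (quantization != nullptr && quantization->zero_point() != nullptr &&
      quantization->zero_point()->size() > 0) {
    if (quantization->zero_point()->size() > 1) {
      // Sketch: per-channel quantization is not handled here.
      return kTfLiteError;
    }
    value = static_cast<int>(quantization->zero_point()->Get(0));
  }
  memset(subgraph_allocations_[subgraph_index].tensors[tensor_index].data.raw,
         value, buffer_size);
  return kTfLiteOk;
}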
@veblush This will not work for RESOURCE variable tensors (TensorFlow 2.0)
Noted. If we need a way to reset a RESOURCE variable, then we can add a new API for that.
Only the use of data.data is allowed from the TfLitePtrUnion struct. All other members have been deprecated by the LiteRT team (formerly the TfLite folks).
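As a sketch, the reset in this PR would then write through the data.data member instead of data.raw:

    // Sketch: same reset as above, routed through the non-deprecated
    // data.data member of TfLitePtrUnion.
    memset(subgraph_allocations_[subgraph_index].tensors[tensor_index].data.data,
           value, buffer_size);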
Please be advised that TF_LITE_MICRO_EXPECT_ macros do not end the test. This could cause the test to crash on null pointer checks, and the logs for the test will be lost.
Perhaps we should add a new macro:
#define TF_LITE_MICRO_CHECK_FAIL_AND_STOP if (micro_test::did_test_fail) continue
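A usage sketch, assuming the macro were added to micro_test.h (the tensor lookup helper below is hypothetical, not part of this PR):

TF_LITE_MICRO_TEST(ResetVariableTensorNullChecks) {
  // Hypothetical helper; a real test would fetch the tensor from the
  // interpreter under test.
  TfLiteEvalTensor* eval_tensor = GetVariableEvalTensorForTest();
  TF_LITE_MICRO_EXPECT(eval_tensor != nullptr);
  // Stop here if the null check above failed, instead of dereferencing a
  // null pointer below and losing the test logs.
  TF_LITE_MICRO_CHECK_FAIL_AND_STOP;
  TF_LITE_MICRO_EXPECT_EQ(static_cast<int8_t>(0),
                          static_cast<int8_t*>(eval_tensor->data.data)[0]);
}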
TF_LITE_ENSURE_STATUS(TfLiteEvalTensorByteLength(eval_tensor, &buffer_size));

int value = 0;
if (tensor->type() == tflite::TensorType_INT8) {
You're right, a multi-channel tensor variable probably doesn't exist. But this line would be better as:
if (tensor->type() == tflite::TensorType_INT8 && tensor->quantization() &&
    tensor->quantization()->zero_point() &&
    tensor->quantization()->zero_point()->size() > 0) {
Brainstorming.
BUG=N/a