Commit 8c6f391

karimnosseir authored and tensorflower-gardener committed

[lite] Add a check that bias_size is nonzero to avoid division by zero. This shouldn't happen for properly converted models; it is just a safety check.
PiperOrigin-RevId: 416383645 Change-Id: If8e508bf696ae8ecfb927e69c139a8ccf7fe60cb
1 parent c8dafc9 commit 8c6f391

File tree

  • tensorflow/lite/kernels/internal/common.h

1 file changed: 1 addition, 0 deletions

tensorflow/lite/kernels/internal/common.h (1 addition, 0 deletions)

@@ -75,6 +75,7 @@ float ActivationFunction(float x) {
 inline void BiasAndClamp(float clamp_min, float clamp_max, int bias_size,
                          const float* bias_data, int array_size,
                          float* array_data) {
+  if (bias_size == 0) return;
   // Note: see b/132215220: in May 2019 we thought it would be OK to replace
   // this with the Eigen one-liner:
   // return (array.colwise() + bias).cwiseMin(clamp_max).cwiseMin(clamp_max).
