Closed
Labels: api: bigtable — Issues related to the Bigtable API.
Description
The Bigtable documentation says the maximum cell size is 100 MB. However, when I try to read a row containing a 10 MB cell using the Bigtable Python client, I get the following error:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/local/lib/python2.7/dist-packages/google/cloud/happybase/table.py", line 190, in row
row, filter_=filter_)
File "/usr/local/lib/python2.7/dist-packages/google/cloud/bigtable/table.py", line 234, in read_row
rows_data.consume_all()
File "/usr/local/lib/python2.7/dist-packages/google/cloud/bigtable/row_data.py", line 323, in consume_all
self.consume_next()
File "/usr/local/lib/python2.7/dist-packages/google/cloud/bigtable/row_data.py", line 261, in consume_next
response = six.next(self._response_iterator)
File "/usr/local/lib/python2.7/dist-packages/grpc/_channel.py", line 344, in next
return self._next()
File "/usr/local/lib/python2.7/dist-packages/grpc/_channel.py", line 335, in _next
raise self
grpc._channel._Rendezvous: <_Rendezvous of RPC that terminated with (StatusCode.INTERNAL, Max message size exceeded)>
This maximum size appears to be hard-coded in the grpc library. Has anybody been able to read large rows using the Bigtable Python client? Any ideas for workarounds, or for how I can raise the maximum message size?
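One avenue I have been looking at, sketched below, is gRPC's channel options: the Python gRPC runtime accepts `grpc.max_receive_message_length` / `grpc.max_send_message_length` options when a channel is created, which override the default per-message cap. Whether the Bigtable client exposes a way to pass these through is an open question (the channel construction shown in the comment is hypothetical), but the option names themselves are the standard gRPC ones:

```python
# Sketch of a possible workaround: raise gRPC's per-message size cap via
# channel options. The option keys are standard gRPC channel arguments;
# how (or whether) to inject them into the Bigtable client's channel is
# an assumption, not a documented client API.

MAX_MESSAGE_BYTES = 100 * 1024 * 1024  # match Bigtable's documented 100 MB cell limit

channel_options = [
    ("grpc.max_receive_message_length", MAX_MESSAGE_BYTES),
    ("grpc.max_send_message_length", MAX_MESSAGE_BYTES),
]

# These options would be supplied when the underlying channel is built, e.g.:
#   channel = grpc.secure_channel(target, credentials, options=channel_options)
# and that channel would then need to be handed to the Bigtable client.
```

If the client builds its channel internally with no hook for options, this would require patching or subclassing, which is part of why I'm asking here.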