I have a BigQuery table with about 200,000,000 rows. I also have an external table holding the rows that need updating: each row contains the unique id of the row to be updated and the string value it should be updated to. The external table is about 1m rows.
When we run an update query to update all the rows in the target table with the values from the external table on matching IDs, we get the following error:
"Resources exceeded during query execution: The query could not be executed in the allotted memory. Peak usage: 110% of limit."
Query:
UPDATE `target_table` AS target
SET target.string_to_update = source.string_to_update
FROM `external_table` AS source
WHERE target.id = source.id;
This is a simple update query that I would expect to be distributed across workers, so I'm guessing the join against the external table is causing the problem. I've also tried splitting the update into ID-range batches, adding predicates like AND target.id BETWEEN 10000 AND 999999, and so on, to the WHERE clause. What can I do to have this update complete as expected?
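If the batched approach is viable, a BigQuery scripting loop can run the per-range updates automatically instead of issuing each one by hand. A minimal sketch, assuming id is a reasonably dense INT64 column; the table and column names are taken from the query above, while batch_size is a hypothetical value to tune until each batch fits in memory:

DECLARE batch_start INT64 DEFAULT 0;
DECLARE batch_size INT64 DEFAULT 10000000;  -- hypothetical; shrink if a batch still exceeds memory
DECLARE max_id INT64;

SET max_id = (SELECT MAX(id) FROM `target_table`);

-- Update one id range at a time so no single UPDATE has to
-- hold the full 200M-row join in memory at once.
WHILE batch_start <= max_id DO
  UPDATE `target_table` AS target
  SET target.string_to_update = source.string_to_update
  FROM `external_table` AS source
  WHERE target.id = source.id
    AND target.id BETWEEN batch_start AND batch_start + batch_size - 1;
  SET batch_start = batch_start + batch_size;
END WHILE;

Note that each batched UPDATE still scans the whole target table unless it is integer-range partitioned or clustered on id, in which case the BETWEEN predicate also prunes the data each batch reads.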