Looking at http://googlecloudplatform.github.io/gcloud-python/latest/datastore-batches.html#gcloud.datastore.batch.Batch
The last example for this method has a snippet:
>>> from gcloud import datastore
>>> dataset = datastore.get_dataset('dataset-id')
>>> with Batch() as batch:
...     do_some_work(batch)
...     raise Exception()  # rolls back
I'm not sure why we need the dataset variable here, and I'm not sure that code even works as written... Might be worth chopping it out? I.e., it would look like:
>>> from gcloud import datastore
>>> with datastore.Batch() as batch:  # Note datastore.Batch vs Batch?
...     do_some_work(batch)
...     raise Exception()  # rolls back