Advice
0 votes
2 replies
39 views

I'm looking to optimize cloud storage costs for my iOS app. The app generates a lot of data that is essentially "write once, seldom read." Right now, everything is sitting in the default ...
asked by Cheok Yan Cheng
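A common fix for "write once, seldom read" data is a lifecycle rule that moves objects to a colder storage class after some age. A minimal sketch that just builds the policy JSON (the 30-day threshold and ARCHIVE class are assumptions, tune them to the read pattern); the output can be applied with `gcloud storage buckets update gs://YOUR-BUCKET --lifecycle-file=lifecycle.json`:

```python
import json

def archive_after(days: int) -> dict:
    """Build a GCS lifecycle policy that moves objects to the ARCHIVE
    storage class once they are `days` old (age counts from creation)."""
    return {
        "rule": [
            {
                "action": {"type": "SetStorageClass", "storageClass": "ARCHIVE"},
                "condition": {"age": days},
            }
        ]
    }

policy = archive_after(30)
print(json.dumps(policy, indent=2))
```

Nearline or Coldline may fit better if the data is read occasionally, since retrieval fees rise as the class gets colder.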
Advice
1 vote
1 reply
107 views

I have an Apache Beam pipeline (running on Dataflow) that normally performs a daily batch load from Cloud Storage to BigQuery. The source team has provided 1 year of historical data that needs to be ...
asked by Saravana Kumar
Advice
2 votes
0 replies
47 views

I have a question about Conversational Agents in Google. I tried to get the agent to recognize images from the RAG (using OCR and the Layout Parser), but it doesn't work. I am using the JSONL format ...
asked by Marco Romero
0 votes
0 answers
63 views

I have a Dockerfile that builds an image based on golang:bookworm. It installs the google-cloud-cli package from the https://packages.cloud.google.com/apt repository. I used it to build an image that'...
asked by derat
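For reference, the apt setup Google documents for `google-cloud-cli` on Debian-based images looks roughly like this (keyring path and repo line follow the install docs; treat it as a sketch, not the asker's exact Dockerfile):

```dockerfile
FROM golang:bookworm

# Install google-cloud-cli from Google's apt repository.
RUN apt-get update && apt-get install -y curl gnupg && \
    curl -fsSL https://packages.cloud.google.com/apt/doc/apt-key.gpg \
      | gpg --dearmor -o /usr/share/keyrings/cloud.google.gpg && \
    echo "deb [signed-by=/usr/share/keyrings/cloud.google.gpg] https://packages.cloud.google.com/apt cloud-sdk main" \
      > /etc/apt/sources.list.d/google-cloud-sdk.list && \
    apt-get update && apt-get install -y google-cloud-cli && \
    rm -rf /var/lib/apt/lists/*
```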
Advice
0 votes
1 reply
80 views

I receive ZIP files every day to a bucket. Part of my Python pipeline is to extract these into the individual CSVs. However, I'm wondering if there's a quicker way. There are roughly 20 files in each ZIP,...
asked by Aaron
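One way to avoid an extract-to-disk round trip is to unpack each ZIP entirely in memory and write the members straight back to the bucket. A stdlib-only sketch of the in-memory part; the GCS calls around it (e.g. `blob.download_as_bytes()` on the way in, `bucket.blob(name).upload_from_string(data)` on the way out) are left to the surrounding pipeline:

```python
import io
import zipfile

def iter_zip_members(zip_bytes: bytes):
    """Yield (name, data) for each member of a ZIP held in memory."""
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as zf:
        for name in zf.namelist():
            yield name, zf.read(name)

# Build a small ZIP in memory to demonstrate.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("a.csv", "id,val\n1,2\n")
    zf.writestr("b.csv", "id,val\n3,4\n")

members = dict(iter_zip_members(buf.getvalue()))
print(sorted(members))
```

For ~20 members per ZIP, the uploads can also be fanned out with a thread pool, since each member upload is independent.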
Advice
0 votes
0 replies
39 views

I am setting up lifecycle policies for some of my buckets via terraform. I have this strategy: standard > nearline > deletion. The thing is, I am not sure if the AGE property of a file in the ...
asked by Dasph
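On the AGE question: GCS lifecycle `age` is always measured from the object's creation time, and a SetStorageClass transition does not reset it. A hedged Terraform sketch of the standard > nearline > deletion strategy (bucket name, location, and thresholds are placeholders):

```hcl
# `age` counts from object creation, so the Delete rule fires 365 days
# after creation, not 365 days after the SetStorageClass transition.
resource "google_storage_bucket" "example" {
  name     = "my-example-bucket"
  location = "EU"

  lifecycle_rule {
    condition { age = 30 }
    action {
      type          = "SetStorageClass"
      storage_class = "NEARLINE"
    }
  }

  lifecycle_rule {
    condition { age = 365 }
    action { type = "Delete" }
  }
}
```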
Best practices
0 votes
0 replies
72 views

My use case is simple in nature. I have a platform where users can upload any files up to 20GB. My current solution is: Frontend Client asks for a presignedURL which the Backend generates return ...
asked by Asif Alam
1 vote
0 answers
49 views

I have a Frontend Client which lets users upload any number of files of any size (think up to a 100 GB file). Currently I am using GCS bucket presigned URLs to upload the files. My current ...
asked by Asif Alam
0 votes
0 answers
34 views

I'm trying to deploy Cloud Storage CORS settings using the Firebase CLI, but it consistently fails with a generic "Error: An unexpected error has occurred." Project ID: oa-maintenance-v2 ...
asked by プラス田口
0 votes
0 answers
47 views

I am trying to create a virtual image from a tar file that is present in the storage bucket. When I try it from my Java SDK code, which is my actual requirement, I get this: Required ‘read’ permission for ‘${...
asked by ROHAN ACHAR V
2 votes
1 answer
103 views

I wrote this code that uses Google's Cloud API to get an object from my bucket and download it. It worked perfectly when I had my bucket set to public (allUsers added to Principal w/ all the required ...
asked by Art T.
0 votes
1 answer
81 views

Context: using distcp, I am trying to copy HDFS directory including files to GCP bucket. I am using hadoop distcp -Dhadoop.security.credential.provider.path=jceks://$JCEKS_FILE hdfs://nameservice1/...
asked by Jhon
1 vote
2 answers
117 views

I was trying to use google cloud storage in a python virtual environment. I tried installing google-cloud-storage and whenever I run the code I always get the error ModuleNotFoundError: No module ...
asked by Asem Shaath
0 votes
1 answer
101 views

I have a requirement to move files from a source folder to destination folder in different GCS buckets. I am using GCSToGCSOperator with following config: source_bucket: "source_bucket" ...
asked by A B
1 vote
0 answers
429 views

I have been trying to run some models from huggingface locally. The script is being hosted on google cloud run. Since running the instance multiple times triggers rate limiting, I have downloaded the ...
asked by GentleClash
0 votes
0 answers
54 views

I have a small app where I am using firebase functions to upload an image into firebase storage. Once done, I store this image url against an object in firebase db and then reuse this image in the app ...
asked by feeyam
1 vote
0 answers
122 views

I'm trying to upload files to a Google Cloud Storage bucket from a Ballerina application. Right now, the only way I’ve found to authenticate is by manually generating an access token using a service-...
asked by Virul Nirmala Wickremesinghe
0 votes
1 answer
93 views

On my Django project I have a model with a property used to store videos in a single, specific Google Cloud Storage bucket, using FileField. The model is defined like this: from storages....
asked by Raul Chiarella
0 votes
1 answer
102 views

I'm using Google Cloud Storage Signed URLs with the NodeJS SDK to allow my client app to directly upload to Cloud Storage. I want to limit the size of the file the client is allowed to upload. How do ...
asked by yoonicode
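One mechanism GCS offers for this is the `x-goog-content-length-range` extension header: when it is included at signing time, the client must send it too, and GCS rejects uploads whose size falls outside the range. A small sketch of building the header value (the 10 MiB cap is an arbitrary example); with the Python client it would be passed as `blob.generate_signed_url(version="v4", method="PUT", ..., headers=headers)`, and the Node SDK exposes a similar `extensionHeaders` option on `getSignedUrl`:

```python
def size_limit_header(min_bytes: int, max_bytes: int) -> dict:
    """Extension header that makes GCS enforce an upload size range.
    Must be included when signing AND sent by the uploading client."""
    return {"x-goog-content-length-range": f"{min_bytes},{max_bytes}"}

headers = size_limit_header(0, 10 * 1024 * 1024)  # allow up to 10 MiB
print(headers)
```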
0 votes
1 answer
106 views

I am migrating a generation 1 Node.js Google Cloud Function to a generation 2 Cloud Run function. It uses an onDocumentCreated function to create and save a file to Cloud Storage built from a collection of data ...
asked by Steve Klock
0 votes
1 answer
102 views

I'm trying to create unit tests for this method but I'm getting an error saying credentials were not found: from io import BytesIO from google.auth.exceptions import DefaultCredentialsError from google....
asked by willie revillame
1 vote
1 answer
115 views

First time trying anything like this. I want to create a custom viz in Looker Studio with d3.js. I have created a bucket, d3js-bucket, and made the bucket public. IAM has Organization Administrator ...
asked by Einarr
1 vote
1 answer
83 views

I'm using Django with django-storages and GoogleCloudStorage backend. My model has a FileField like this: raw_file_gcp = models.FileField(storage=GoogleCloudStorage(bucket_name='videos-raw')) At ...
asked by Raul Chiarella
0 votes
0 answers
61 views

How can I extract the first n rows of a block-compressed tab-separated value text file (.tsv.bgz), which can only be accessed with gsutil, into a text file? The original file is very large, so I wonder ...
asked by Xunzhi Zhang
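BGZF (.bgz) is a valid gzip stream, so one approach is to pipe `gsutil cat` into a small stdlib reader that stops after n lines, e.g. `gsutil cat gs://bucket/file.tsv.bgz | python head_tsv.py > out.tsv` (paths hypothetical); closing the pipe early avoids downloading the whole object. A sketch of the reader logic, demonstrated on an in-memory gzip stream:

```python
import gzip
import io
import itertools
from typing import BinaryIO

def first_n_lines(raw: BinaryIO, n: int) -> list[str]:
    """Lazily decompress a gzip/BGZF stream and return its first n text lines."""
    with gzip.open(raw, "rt") as fh:
        return list(itertools.islice(fh, n))

# Demonstrate with an in-memory gzip-compressed TSV.
data = gzip.compress(b"a\t1\nb\t2\nc\t3\n")
print(first_n_lines(io.BytesIO(data), 2))
```

In a real script, `raw` would be `sys.stdin.buffer` fed by `gsutil cat`.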
1 vote
0 answers
660 views

Context I'm trying to deploy a custom multi-agent app on Vertex AI Reasoning Engine (using Google ADK / Agent Builder). I'm using a .whl file that includes my entire custom agent code, organized under ...
asked by cryptickey
0 votes
0 answers
76 views

I am attempting to use Google Cloud Document AI's asynchronous batch processing feature in a new Google Cloud project, but I am consistently encountering a ValueError. I also notice a limitation in ...
asked by R34
0 votes
1 answer
101 views

When I use -P options with gcloud storage cp -r -P gs://some/bucket/object ./ It tries to convert the user id also and fails with the following message ERROR: Root permissions required to set UID ...
asked by amisax
1 vote
3 answers
797 views

I am trying to upload unstructured data to a Google Cloud Platform (GCP) data store from a GCP Storage Bucket using the Python SDK. I want to use unstructured data with metadata, which is mentioned ...
asked by Fruity Fritz
0 votes
1 answer
147 views

I'm trying to import documents from Firebase Storage into Google Cloud Document AI but I'm getting a persistent permission error even after adding the Storage Admin role to the Document AI service ...
asked by Kyrylo Petrenko
1 vote
0 answers
156 views

I’m trying to download a .tar.gz file from a public Google Cloud Storage bucket using the gcloud CLI tool. Here’s the command I use: gcloud storage cp gs://my-bucket/large-file.tar.gz ./ The file is ...
asked by Maximilian Burr
0 votes
1 answer
432 views

While running test cases, I am encountering the following error: pyspark.errors.exceptions.captured.IllegalArgumentException: Cannot initialize FileIO implementation org.apache.iceberg.gcp.gcs....
asked by Prashant Kumar
0 votes
1 answer
257 views

I am trying to load a 1GB+ csv file from GCS and encountering memory issues. So I am trying to use read_csv_batched per Memory issues sorting larger than memory file with polars The documentation for ...
asked by sicsmpr
1 vote
0 answers
50 views

I was trying to load multiple gcs files to bigquery via Airflow using api load_table_from_uri uri="gs://b5a34db6ab213379-eu-pm-test-uat-data-ingest-temp/test_pm_cm_data/"+str(run_id).strip()+...
asked by Vikrant Singh Rana
0 votes
1 answer
139 views

➜ datasets gsutil --version gsutil version: 5.27 checksum: 5cf9fcad0f47bc86542d009bbe69f297 (OK) boto version: 2.49.0 python version: 3.10.13 | packaged by conda-forge | (main, Dec 23 2023, 15:35:25) ...
asked by WurmD
0 votes
1 answer
111 views

I have set the CORS configuration on my firebase storage bucket like this : [ { "origin": ["https://arptc-connect.web.app", "*"], "method": [&...
asked by Armando Sudi
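Worth noting that including "*" alongside a specific origin makes the specific entry redundant, since "*" already matches every origin. A minimal, hedged example of a complete CORS file (methods, response headers, and max-age are placeholders), applied with `gsutil cors set cors.json gs://YOUR-BUCKET`:

```json
[
  {
    "origin": ["https://arptc-connect.web.app"],
    "method": ["GET", "PUT", "POST"],
    "responseHeader": ["Content-Type"],
    "maxAgeSeconds": 3600
  }
]
```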
0 votes
0 answers
41 views

Can Google Cloud Storage (GCS) be directly accessed using OAuth 2.0 Client Credentials flow (client ID + client secret) for file uploads? ...
asked by Shyam Singh
1 vote
0 answers
83 views

I am trying to trigger a simple Python 3.11 Gen2 Cloud Function (parse-ticket-v2) in project loadsnap-prod (ID: 266229951076, region us-central1) when a file is finalized in the GCS bucket gs://...
asked by Brian Murphy
-1 votes
1 answer
88 views

Short Version: I have configured a backend bucket on my load balancer and mapped it to /__/auth/, that bucket contains a publicly accessible file named handler, but when I hit /__/auth/handler I get an ...
asked by David
0 votes
1 answer
69 views

I'm hitting a wall trying to create a GCP certificate authority in Pulumi (Python). The issue happens while creating the authority: I get a 404 that it cannot find the authority (that it is creating). ...
asked by Paul Forgey
0 votes
2 answers
62 views

I'm subscribing to Google Cloud Storage events with Firebase Functions (2nd gen) on the Node.js runtime, as described here. My bucket has Hierarchical Namespace enabled. For most operations, I ...
asked by mgw
2 votes
1 answer
113 views

Background I am using the google cloud sdk which implements a provider to its storage service. Inside this provider, I can gci and cd around just fine, but when I want to cp an object onto my local ...
asked by Blaisem
-1 votes
1 answer
154 views

I have a use case where I want users to download a file from a public URL, but with a custom filename. For private files (e.g., in AWS S3 or GCS), I can generate a signed URL and use the Content-...
asked by Sudhanshu Bansal
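For GCS specifically, a signed URL can carry a `response_disposition` argument (`Blob.generate_signed_url(..., response_disposition=...)`), and for an unsigned public URL the object's own `Content-Disposition` metadata can be set instead. Either way the value is a standard header string; a tiny sketch of building it (the quoting here is deliberately minimal and an assumption; non-ASCII filenames need RFC 5987 encoding):

```python
def attachment_disposition(filename: str) -> str:
    """Content-Disposition value that forces download under a custom name."""
    # Minimal quoting only: strip embedded double quotes.
    safe = filename.replace('"', "")
    return f'attachment; filename="{safe}"'

print(attachment_disposition("report-2024.pdf"))
```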
0 votes
1 answer
269 views

I encountered an error while trying to execute an ETL task to export data from a BigQuery table to Google Cloud Storage (GCS). Here is the exact error message: raise self._exception google.api_core....
asked by user22239200
-1 votes
1 answer
51 views

How can I receive large images/videos in my backend Express API? I have a project that requires users to be able to upload many large images and videos and click save, and if the user refreshes ...
asked by Sree ram Sekar
0 votes
0 answers
291 views

I am super new to using google cloud and a very novice coder. I am trying to create an automated system that saves a graph as a jpeg in a cloud storage bucket (this will be a cloud run function that ...
asked by Amy Lock
1 vote
1 answer
448 views

I want to read Parquet files stored in a GCS bucket via DuckDB as CLI i.e. duckdb in an environment where I setup a Service Account and I created the HMAC credentials like gcloud storage hmac create \ ...
asked by TPPZ
1 vote
0 answers
153 views

We are designing a data ingestion pipeline where Parquet files are delivered weekly into a GCS bucket. The bucket structure is: gs://my-bucket/YYYY/MM/DD/<instance-version>/<instance-id>/&...
asked by dadadima
0 votes
1 answer
68 views

I have an application deployed on Heroku that I want to connect to GCS using ActiveStorage. I am explicitly specifying credentials in config/storage.yml as specified in the JSON key file and the ...
asked by aec
0 votes
1 answer
85 views

Context: Trying to download a file from Google Cloud Storage using a pre-signed URL with Spring Boot 3.4.4's RestClient. The same URL works perfectly with both RestTemplate and raw HttpURLConnection. ...
asked by tschi
0 votes
1 answer
291 views

I'm attempting to deploy an instance of FoundryVTT as a container on Cloud Run. I've set it up to mount a Cloud Storage bucket as a volume so that its data will persist when it restarts. The app ...
asked by splatman73
