I need to upload a large batch of small files, around 70,000 of them, using the Google Cloud Storage client library for Python. None of the files is larger than 1 KB. At the start of the upload process I managed about 20 files per second, but then I suddenly started getting HTTP 429 (Too Many Requests) responses. After adding logic that retries an upload after waiting a few seconds whenever it gets a 429, each upload now consistently takes more than 10 seconds.
I checked whether we were hitting a quota, but nothing indicates anything wrong with the setup. Any suggestions on what could be causing this?
My guess is that you are running into the limits on how fast you can ramp up the request rate: Google Cloud Storage expects clients to increase load gradually (the documented best practice is to start well below the per-bucket limit and roughly double the request rate every 20 minutes), and uploading objects with sequential names can concentrate requests on one index range and trigger throttling.
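If that is what is happening, retrying 429s with capped exponential backoff, rather than a fixed wait, backs off only as far as the service requires. The client library can do this for you via google-api-core's `Retry`, but here is a minimal standalone sketch in plain Python (the function names are mine, not part of any API):

```python
import time


def call_with_backoff(fn, is_rate_limited, initial=1.0, maximum=60.0,
                      multiplier=2.0, max_attempts=8, sleep=time.sleep):
    """Call fn(); on a rate-limit error, wait and retry with doubling delays.

    is_rate_limited(exc) decides whether an exception is a 429-style error
    worth retrying; anything else is re-raised immediately.
    """
    delay = initial
    for attempt in range(max_attempts):
        try:
            return fn()
        except Exception as exc:
            if attempt == max_attempts - 1 or not is_rate_limited(exc):
                raise
            sleep(delay)
            delay = min(delay * multiplier, maximum)  # cap the wait
```

In real use, `fn` would wrap something like `blob.upload_from_filename(...)` and `is_rate_limited` would check for `google.api_core.exceptions.TooManyRequests`.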
When I call upload_many_from_filenames with all of the required values, the code keeps running for more than 11 minutes without producing any result, and I had to forcibly interrupt it. I have only 4 files in the source directory. Why does it take so long?
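For reference, a sketch of the call in question (bucket and directory names are placeholders). The default worker type is process-based, which on platforms that spawn worker processes can hang when the call is not protected by a `__main__` guard, so forcing thread workers and setting a deadline are worth trying:

```python
def upload_directory(bucket_name, source_directory, filenames):
    # Deferred imports so the sketch loads without google-cloud-storage
    # installed; assumes google-cloud-storage >= 2.7.
    from google.cloud import storage
    from google.cloud.storage import transfer_manager

    bucket = storage.Client().bucket(bucket_name)
    return transfer_manager.upload_many_from_filenames(
        bucket,
        filenames,
        source_directory=source_directory,
        worker_type=transfer_manager.THREAD,  # avoid process-pool hangs
        max_workers=4,
        deadline=60,            # seconds; fail instead of waiting forever
        raise_exception=False,  # return per-file results/exceptions
    )
```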
It sounds like you have already implemented a mechanism that handles 429 errors by retrying with a delay. If your uploads are still slow, several things could be responsible. Here are some suggestions to help you troubleshoot:
Check for Rate Limiting or Quota Exceedance: review the project's quota page and the bucket's Cloud Monitoring metrics to see whether the 429s line up with a specific limit.
Review Your Retry Strategy: a fixed multi-second sleep on every 429 can dominate total upload time; exponential backoff with jitter waits only as long as the service actually requires.
Network Latency: for 1 KB files the per-request overhead dwarfs the transfer itself, so measure the round-trip time to the storage endpoint.
Check Server-Side Logging: Cloud Audit Logs can show which requests were throttled and when.
Google Cloud Storage Client Configuration: reuse a single client instance (and its underlying connection pool) rather than creating a new one per upload.
Consider Batch Operations: for many small files, the client library's bulk-upload helpers reduce per-call overhead compared to uploading one blob at a time.
Monitor Resource Utilization: check CPU, memory, and open-connection counts on the machine performing the uploads.
Review API Request and Response Times: log per-request latency so you can tell whether time is spent waiting on the server or sleeping in your retry loop.
Consider Asynchronous Operations: uploading from several threads keeps overall throughput up while individual requests wait.
Contact Google Cloud Support: if none of the above explains the slowdown, support can inspect throttling on the service side.
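Combining the batching and asynchronous suggestions above, here is a minimal sketch of thread-based parallel uploads with a pluggable upload function (the names are hypothetical; in practice `upload_one` would wrap a call such as `blob.upload_from_filename`):

```python
from concurrent.futures import ThreadPoolExecutor


def upload_all(paths, upload_one, max_workers=8):
    """Upload files concurrently; upload_one(path) performs a single upload.

    Failures are collected per path instead of aborting the whole batch,
    and the dict of errors is returned (empty means everything succeeded).
    """
    errors = {}

    def run(path):
        try:
            upload_one(path)
        except Exception as exc:
            errors[path] = exc

    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        list(pool.map(run, paths))  # drain the iterator to wait for completion
    return errors
```

Keep `max_workers` modest at first so the parallelism itself does not trip the same rate limits the retries are meant to absorb.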
Remember to carefully review your code and configuration, and consider the specific characteristics of your workload and network environment.