Error 403 rate limit exceeded

To protect users and Google systems from abuse, applications that use OAuth and Google Identity have certain quota restrictions based on the risk level of the OAuth scopes an app uses. These limits include the following:

  • A new user authorization rate limit that limits how quickly your application can get new users.
  • A total new user cap. To learn more, see the Unverified apps page.

When an application exceeds the rate limit, Error 403: rate_limit_exceeded is displayed to users, as in the screenshot below:

[Screenshot: Error 403 "Authorization Error" window]

Application developers

As a developer of an application, you can view the current user authorization grant rate (or token grant rate) in the Google API Console OAuth consent screen page before your application displays this error.

If the Google API Console shows that your application will reach the rate limit soon, or you see this error being displayed, you should take action to preserve your application’s user experience. You can request a rate limit quota increase for the application. Expect a response within 5 business days.

OAuth rate limit quotas

  New user authorization rate limit

    • Applicable apps: Apps that request access to user data, including verified apps
    • Quota: Dependent on application history, developer reputation, and riskiness
    • Appeal: Request a rate limit quota increase

To learn more about the total new user cap, see the Unverified apps page.


The failure is likely something to do with pytorch=1.9:

  • Works with python=3.7 and pytorch=1.7
Python 3.7.10 | packaged by conda-forge | (default, Feb 19 2021, 16:07:37) 
[GCC 9.3.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import torch
>>> torch.__version__
'1.7.0'
>>> model = torch.hub.load('pytorch/vision:v0.10.0', 'resnet50', pretrained=True, force_reload=True)
Downloading: "https://github.com/pytorch/vision/archive/v0.10.0.zip" to /home/ml/farleylai/.cache/torch/hub/v0.10.0.zip
  • Works with python=3.9 and pytorch=1.7.1 and 1.8.1.
Python 3.9.5 (default, Jun  4 2021, 12:28:51) 
[GCC 7.5.0] :: Anaconda, Inc. on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import torch
>>> torch.__version__
'1.7.1'
>>> model = torch.hub.load('pytorch/vision:v0.10.0', 'resnet50', pretrained=True, force_reload=True)
Downloading: "https://github.com/pytorch/vision/archive/v0.10.0.zip" to /home/ml/farleylai/.cache/torch/hub/v0.10.0.zip
Python 3.9.5 (default, Jun  4 2021, 12:28:51) 
[GCC 7.5.0] :: Anaconda, Inc. on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import torch
>>> torch.__version__
'1.8.1'
>>> model = torch.hub.load('pytorch/vision:v0.10.0', 'resnet50', pretrained=True, force_reload=True)
Downloading: "https://github.com/pytorch/vision/archive/v0.10.0.zip" to /home/ml/farleylai/.cache/torch/hub/v0.10.0.zip
  • Failed with python=3.9 and pytorch=1.9
Python 3.9.5 (default, Jun  4 2021, 12:28:51) 
[GCC 7.5.0] :: Anaconda, Inc. on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import torch
>>> torch.__version__
'1.9.0'
>>> model = torch.hub.load('pytorch/vision:v0.10.0', 'resnet50', pretrained=True, force_reload=True)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/ml/farleylai/miniconda3/envs/sinet39/lib/python3.9/site-packages/torch/hub.py", line 362, in load
    repo_or_dir = _get_cache_or_reload(repo_or_dir, force_reload, verbose)
  File "/home/ml/farleylai/miniconda3/envs/sinet39/lib/python3.9/site-packages/torch/hub.py", line 162, in _get_cache_or_reload
    _validate_not_a_forked_repo(repo_owner, repo_name, branch)
  File "/home/ml/farleylai/miniconda3/envs/sinet39/lib/python3.9/site-packages/torch/hub.py", line 124, in _validate_not_a_forked_repo
    with urlopen(url) as r:
  File "/home/ml/farleylai/miniconda3/envs/sinet39/lib/python3.9/urllib/request.py", line 214, in urlopen
    return opener.open(url, data, timeout)
  File "/home/ml/farleylai/miniconda3/envs/sinet39/lib/python3.9/urllib/request.py", line 523, in open
    response = meth(req, response)
  File "/home/ml/farleylai/miniconda3/envs/sinet39/lib/python3.9/urllib/request.py", line 632, in http_response
    response = self.parent.error(
  File "/home/ml/farleylai/miniconda3/envs/sinet39/lib/python3.9/urllib/request.py", line 561, in error
    return self._call_chain(*args)
  File "/home/ml/farleylai/miniconda3/envs/sinet39/lib/python3.9/urllib/request.py", line 494, in _call_chain
    result = func(*args)
  File "/home/ml/farleylai/miniconda3/envs/sinet39/lib/python3.9/urllib/request.py", line 641, in http_error_default
    raise HTTPError(req.full_url, code, msg, hdrs, fp)
urllib.error.HTTPError: HTTP Error 403: rate limit exceeded

The Google Drive API returns 2 levels of error information:

  • HTTP error codes and messages in the header.
  • A JSON object in the response body with additional details that can help you
    determine how to handle the error.

Drive apps should catch and handle all errors that might be encountered when
using the REST API. This guide provides instructions on how to resolve specific
API errors.

Resolve a 400 error: Bad request

This error can result from any one of the following issues in your code:

  • A required field or parameter hasn’t been provided.
  • The value supplied or a combination of provided fields is invalid.
  • You tried to add a duplicate parent to a Drive file.
  • You tried to add a parent that would create a cycle in the directory graph.

Following is a sample JSON representation of this error:

{
  "error": {
    "code": 400,
    "errors": [
      {
        "domain": "global",
        "location": "orderBy",
        "locationType": "parameter",
        "message": "Sorting is not supported for queries with fullText terms. Results are always in descending relevance order.",
        "reason": "badRequest"
      }
    ],
    "message": "Sorting is not supported for queries with fullText terms. Results are always in descending relevance order."
  }
}

To fix this error, check the message field and adjust your code accordingly.
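Every Drive error body in this guide shares this same envelope, so a small standard-library helper can pull out the code, reason, and message a handler needs. A minimal sketch (the function name and the shortened sample body are illustrative, not part of any Google client library):

```python
import json

def parse_drive_error(body: str):
    """Return (code, reason, message) from a Drive API error body
    shaped like the samples in this guide."""
    error = json.loads(body)["error"]
    details = error.get("errors") or [{}]
    first = details[0]
    return error["code"], first.get("reason"), first.get("message")

# Sample body mirroring the 400 error above (message shortened):
sample = json.dumps({
    "error": {
        "code": 400,
        "errors": [{
            "domain": "global",
            "location": "orderBy",
            "locationType": "parameter",
            "message": "Sorting is not supported for queries with fullText terms.",
            "reason": "badRequest",
        }],
        "message": "Sorting is not supported for queries with fullText terms.",
    }
})
code, reason, message = parse_drive_error(sample)
```

Dispatching on the reason field rather than on the HTTP status alone is what distinguishes, for example, userRateLimitExceeded from appNotAuthorizedToFile, both of which arrive as 403s.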

Resolve a 400 error: Invalid sharing request

This error can occur for several reasons. To determine the specific cause,
evaluate the reason field of the returned JSON. This error most commonly occurs
because:

  • Sharing succeeded, but the notification email was not correctly delivered.
  • The Access Control List (ACL) change is not allowed for this user.

The message field indicates the actual error.

Sharing succeeded, but the notification email was not correctly delivered

Following is the JSON representation of this error:

{
  "error": {
    "errors": [
      {
        "domain": "global",
        "reason": "invalidSharingRequest",
        "message": "Bad Request. User message: \"Sorry, the items were successfully shared but emails could not be sent to email@domain.com.\""
      }
    ],
    "code": 400,
    "message": "Bad Request"
  }
}

To fix this error, inform the user (sharer) they were unable to share because
the notification email couldn’t be sent to the email address they want to share
with. The user should ensure they have the correct email address and that it can
receive email.

The ACL change is not allowed for this user

Following is the JSON representation of this error:

{
  "error": {
    "errors": [
      {
        "domain": "global",
        "reason": "invalidSharingRequest",
        "message": "Bad Request. User message: \"ACL change not allowed.\""
      }
    ],
    "code": 400,
    "message": "Bad Request"
  }
}

To fix this error, check the sharing settings of the Google Workspace domain to
which the file belongs. The settings might prohibit sharing outside of the
domain, or sharing a shared drive might not be permitted.

Resolve a 401 error: Invalid credentials

A 401 error indicates the access token that you’re using is either expired or
invalid. This error can also be caused by missing authorization for the
requested scopes. Following is the JSON representation of this error:

{
  "error": {
    "errors": [
      {
        "domain": "global",
        "reason": "authError",
        "message": "Invalid Credentials",
        "locationType": "header",
        "location": "Authorization"
      }
    ],
    "code": 401,
    "message": "Invalid Credentials"
  }
}

To fix this error, refresh the access token using the long-lived refresh token.
If this fails, direct the user through the OAuth flow, as described in
API-specific authorization and authentication information.
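The refresh step itself is a POST to Google’s OAuth 2.0 token endpoint with the grant_type=refresh_token grant. A standard-library sketch that only constructs the request (the credential values are placeholders; actually sending it requires real client credentials and a valid refresh token):

```python
from urllib.parse import urlencode
from urllib.request import Request

def build_refresh_request(client_id: str, client_secret: str,
                          refresh_token: str) -> Request:
    """Build (but do not send) the token-refresh POST request."""
    body = urlencode({
        "client_id": client_id,
        "client_secret": client_secret,
        "refresh_token": refresh_token,
        "grant_type": "refresh_token",
    }).encode()
    # Passing data= makes urllib issue a POST.
    return Request(
        "https://oauth2.googleapis.com/token",
        data=body,
        headers={"Content-Type": "application/x-www-form-urlencoded"},
    )

req = build_refresh_request("my-client-id", "my-client-secret", "my-refresh-token")
```

On success the endpoint returns a JSON body containing a fresh access_token; on failure, fall back to sending the user through the full OAuth flow as described above.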

Resolve a 403 error

An error 403 occurs when a usage limit has been exceeded or the user doesn’t
have the correct privileges. To determine the specific type of error, evaluate
the reason field of the returned JSON. This error occurs for the following
situations:

  • The daily limit was exceeded.
  • The user rate limit was exceeded.
  • The project rate limit was exceeded.
  • The sharing rate limit was exceeded.
  • The user hasn’t granted your app rights to a file.
  • The user doesn’t have sufficient permissions for a file.
  • Your app can’t be used within the signed-in user’s domain.
  • Number of items in a folder was exceeded.

For information on Drive API limits, refer to Usage limits. For information on
Drive folder limits, refer to Folder limits in Google Drive.

Resolve a 403 error: Daily limit exceeded

A dailyLimitExceeded error indicates the courtesy API limit for your project
has been reached. Following is the JSON representation of this error:

{
  "error": {
    "errors": [
      {
        "domain": "usageLimits",
        "reason": "dailyLimitExceeded",
        "message": "Daily Limit Exceeded"
      }
    ],
    "code": 403,
    "message": "Daily Limit Exceeded"
  }
}

This error appears when the application’s owner has set a quota limit to cap
usage of a particular resource. To fix this error, remove any usage caps for the
"Queries per day" quota.

Resolve a 403 error: User rate limit exceeded

A userRateLimitExceeded error indicates the per-user limit has been reached.
This might be a limit from the Google API Console or a limit from the Drive
backend. Following is the JSON representation of this error:

{
 "error": {
  "errors": [
   {
    "domain": "usageLimits",
    "reason": "userRateLimitExceeded",
    "message": "User Rate Limit Exceeded"
   }
  ],
  "code": 403,
  "message": "User Rate Limit Exceeded"
 }
}

To fix this error, try any of the following:

  • Raise the per-user quota in the Google Cloud project. For more information,
    request a quota increase.
  • If one user is making numerous requests on behalf of many users of a Google
    Workspace account, consider using a service account with domain-wide
    delegation and set the quotaUser parameter to attribute each request to the
    user it’s made for.
  • Use exponential backoff to retry the
    request.

For information on Drive API limits, refer to
Usage limits.
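The exponential backoff recommended above (and in several later sections) can be sketched generically: retry the call, doubling the wait after each rate-limit failure and adding random jitter so concurrent clients don’t retry in lockstep. The helper and its predicate argument are illustrative, not part of any Google client library:

```python
import random
import time

def call_with_backoff(request_fn, is_rate_limit_error, max_retries=5):
    """Retry request_fn, doubling the wait after each rate-limit failure."""
    for attempt in range(max_retries):
        try:
            return request_fn()
        except Exception as err:
            # Re-raise immediately for non-retryable errors or on the last try.
            if not is_rate_limit_error(err) or attempt == max_retries - 1:
                raise
            # Wait 2**attempt seconds plus up to one second of random jitter.
            time.sleep(2 ** attempt + random.random())
```

For the Drive errors in this guide, the predicate would return True for a 403 whose reason is userRateLimitExceeded, rateLimitExceeded, or sharingRateLimitExceeded, and for HTTP 429.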

Resolve a 403 error: Project rate limit exceeded

A rateLimitExceeded error indicates the project’s rate limit has been reached.
This limit varies depending on the type of requests. Following is the JSON
representation of this error:

{
 "error": {
  "errors": [
   {
    "domain": "usageLimits",
    "message": "Rate Limit Exceeded",
    "reason": "rateLimitExceeded"
   }
  ],
  "code": 403,
  "message": "Rate Limit Exceeded"
 }
}

To fix this error, try any of the following:

  • Raise the per-user quota in the Google Cloud project. For more information,
    request a quota increase.
  • Batch requests to make fewer
    API calls.
  • Use exponential backoff to retry the
    request.

Resolve a 403 error: Sharing rate limit exceeded

A sharingRateLimitExceeded error occurs when the user has reached a sharing
limit. This error is often linked with an email limit. Following is the JSON
representation of this error:

{
 "error": {
  "errors": [
   {
    "domain": "global",
    "message": "Rate limit exceeded. User message: \"These item(s) could not be shared because a rate limit was exceeded: filename\"",
    "reason": "sharingRateLimitExceeded"
   }
  ],
  "code": 403,
  "message": "Rate Limit Exceeded"
 }
}

To fix this error:

  1. Do not send notification emails when sharing large numbers of files.
  2. If one user is making numerous requests on behalf of many users of a Google
    Workspace account, consider using a service account with domain-wide
    delegation and set the quotaUser parameter to attribute each request to the
    user it’s made for.

Resolve a 403 error: Storage quota exceeded

A storageQuotaExceeded error occurs when the user has reached their storage
limit. Following is the JSON representation of this error:

{
 "error": {
  "errors": [
   {
    "domain": "global",
    "message": "The user's Drive storage quota has been exceeded.",
    "reason": "storageQuotaExceeded"
   }
  ],
  "code": 403,
  "message": "The user's Drive storage quota has been exceeded."
 }
}

To fix this error:

  1. Review the storage limits for your Drive account. For more information,
    refer to Drive storage and upload limits.
  2. Free up space or get more storage from Google One.

Resolve a 403 error: The user has not granted the app {appId} {verb} access to the file {fileId}

An appNotAuthorizedToFile error occurs when your app is not on the ACL for the
file. This error prevents the user from opening the file with your app.
Following is the JSON representation of this error:

{
  "error": {
    "errors": [
      {
        "domain": "global",
        "reason": "appNotAuthorizedToFile",
        "message": "The user has not granted the app {appId} {verb} access to the file {fileId}."
      }
    ],
    "code": 403,
    "message": "The user has not granted the app {appId} {verb} access to the file {fileId}."
  }
}

To fix this error, try any of the following:

  • Open the Google Drive picker
    and prompt the user to open the file.
  • Instruct the user to open the file using the
    Open with context menu in the Drive
    UI of your app.

You can also check the isAppAuthorized field on a file to verify that your app
created or opened the file.

Resolve a 403 error: The user does not have sufficient permissions for file {fileId}

An insufficientFilePermissions error occurs when the user doesn’t have write
access to a file, and your app is attempting to modify the file. Following is
the JSON representation of this error:

{
  "error": {
    "errors": [
      {
        "domain": "global",
        "reason": "insufficientFilePermissions",
        "message": "The user does not have sufficient permissions for file {fileId}."
      }
    ],
    "code": 403,
    "message": "The user does not have sufficient permissions for file {fileId}."
  }
}

To fix this error, instruct the user to contact the file’s owner and request
edit access. You can also check user access levels in the metadata retrieved by
files.get and display a read-only UI when
permissions are missing.
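One way to make that read-only decision concrete: Drive v3 file metadata can include a capabilities object whose boolean fields (such as canEdit) describe what the current user may do. A sketch over a plain dict standing in for a files.get response requested with fields="capabilities" (the sample dicts are illustrative):

```python
def should_render_read_only(file_metadata: dict) -> bool:
    """True when the current user lacks write access per the file's
    capabilities; defaults to read-only when capabilities are absent."""
    capabilities = file_metadata.get("capabilities", {})
    return not capabilities.get("canEdit", False)

# Stand-ins for files.get(fileId=..., fields="capabilities") results:
viewer_file = {"capabilities": {"canEdit": False, "canComment": True}}
owner_file = {"capabilities": {"canEdit": True}}
```

Defaulting to read-only when the field is missing errs on the safe side: the app then never attempts a write that would fail with insufficientFilePermissions.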

Resolve a 403 error: App with id {appId} cannot be used within the authenticated user’s domain

A domainPolicy error occurs when the policy for the user’s domain doesn’t
allow access to Drive by your app. Following is the JSON representation of this
error:

{
  "error": {
    "errors": [
      {
        "domain": "global",
        "reason": "domainPolicy",
        "message": "The domain administrators have disabled Drive apps."
      }
    ],
    "code": 403,
    "message": "The domain administrators have disabled Drive apps."
  }
}

To fix this error:

  1. Inform the user the domain doesn’t allow your app to access files in Drive.
  2. Instruct the user to contact the domain Admin to request access for your
    app.

Resolve a 403 error: Number of items in folder was exceeded

A numChildrenInNonRootLimitExceeded error occurs when the limit for a folder’s
number of children (folders, files, and shortcuts) has been exceeded. There’s a
500,000 item limit for folders, files, and shortcuts directly in a folder. Items
nested in subfolders don’t count against this 500,000 item limit. For more
information on Drive folder limits, refer to Folder limits in Google Drive.

Resolve a 403 error: Number of items created by account was exceeded

An activeItemCreationLimitExceeded error occurs when the limit for the number
of items, whether trashed or not, created by this account has been exceeded. All
item types, including folders, count towards the limit.

Resolve a 404 error: File not found: {fileId}

The notFound error occurs when the user doesn’t have read access to a file, or
the file doesn’t exist.

{
  "error": {
    "errors": [
      {
        "domain": "global",
        "reason": "notFound",
        "message": "File not found {fileId}"
      }
    ],
    "code": 404,
    "message": "File not found: {fileId}"
  }
}

To fix this error:

  1. Inform the user they don’t have read access to the file or the file doesn’t
    exist.
  2. Instruct the user to contact the file’s owner and request permission to the
    file.

Resolve a 429 error: Too many requests

A rateLimitExceeded error occurs when the user has sent too many requests in a
given amount of time.

{
  "error": {
    "errors": [
      {
        "domain": "usageLimits",
        "reason": "rateLimitExceeded",
        "message": "Rate Limit Exceeded"
      }
    ],
    "code": 429,
    "message": "Rate Limit Exceeded"
  }
}

To fix this error, use
exponential backoff to retry the request.

Resolve a 5xx error

A 5xx error occurs when an unexpected error arises while processing the request.
This can be caused by various issues, including a request’s timing overlapping
with another request or a request for an unsupported action, such as attempting
to update permissions for a single page in a Google Site instead of the site
itself.

To fix this error, use
exponential backoff to retry the request.
Following is a list of 5xx errors:

  • 500 Backend error
  • 502 Bad Gateway
  • 503 Service Unavailable
  • 504 Gateway Timeout

After the recent Thunderbird update to version 91.2.1, I was asked to sign in again. After signing in, I was told that the app has reached its sign-in rate limit and the developer has to increase it. The full message is as follows:

Authorization Error
Error 403: rate_limit_exceeded

This app has reached its sign-in rate limit for now.

Google limits how quickly an app can get new users. You can try signing in again later or ask the developer (provider-for-google-calendar@googlegroups.com) to increase this app's sign-in rate limit.

If you are the developer for this app, you can request a sign-in rate limit increase.

These are the (censored) displayed request details:
login_hint:
hl: en-GB
response_type: code
redirect_uri: urn:ietf:wg:oauth:2.0:oob:auto
client_id: <maybe an identifying string?>.apps.googleusercontent.com
access_type: offline
scope: https://www.googleapis.com/auth/calendar https://www.googleapis.com/auth/tasks

403 userRateLimitExceeded is basically flood protection. Your application can make a maximum of 10 requests per second for a given user, and a "user" is identified by IP address unless you send the quotaUser parameter along with your request.

The per-user limit from the Developer Console has been reached.

{
 "error": {
  "errors": [
   {
    "domain": "usageLimits",
    "reason": "userRateLimitExceeded",
    "message": "User Rate Limit Exceeded"
   }
  ],
  "code": 403,
  "message": "User Rate Limit Exceeded"
 }
}

403 rateLimitExceeded is the same thing under a different name; why there are two, I can’t tell you.

{
 "error": {
  "errors": [
   {
    "domain": "usageLimits",
    "message": "Rate Limit Exceeded",
    "reason": "rateLimitExceeded",
   }
  ],
  "code": 403,
  "message": "Rate Limit Exceeded"
 }
}

In both cases you should implement exponential backoff and retry the request, just more slowly.
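As for quotaUser, it is just an extra query-string parameter naming which end user a request should count against, so the per-user limit stops being keyed on your server’s IP. A standard-library sketch of attaching it to a request URL (the endpoint and user id values are illustrative):

```python
from urllib.parse import urlencode

def with_quota_user(base_url: str, params: dict, quota_user: str) -> str:
    """Append quotaUser so the per-user limit is keyed on this id, not the IP."""
    query = dict(params, quotaUser=quota_user)
    return f"{base_url}?{urlencode(query)}"

url = with_quota_user(
    "https://www.googleapis.com/drive/v3/files",
    {"pageSize": 10},
    "user-1234",
)
```

The id can be any stable, application-chosen string per end user; Google only uses it to partition quota accounting, not for authorization.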

In this article you will learn how to get rid of the rate_limit_exceeded error that appears in the Thunderbird mail client.

Enjoy the read!



Initial setup:

  • Thunderbird 91.8.0 (64-bit)

Problem description

When syncing mail in Thunderbird, Google accounts ask for the gmail account login and password. After the credentials are entered, this error pops up:

Authorization Error
Error 403: rate_limit_exceeded

This app has reached its sign-in rate limit for now.

Google limits how quickly an app can get new users. You can try signing in again later or ask the developer (provider-for-google-calendar@googlegroups.com) to increase this app’s sign-in rate limit.

If you are the developer for this app, you can request a sign-in rate limit increase.

The request details show:

login_hint: xxxxxxx.yyyyyyyyyy@gmail.com
hl: de
response_type: code
redirect_uri: urn:ietf:wg:oauth:2.0:oob:auto
client_id: .apps.googleusercontent.com
access_type: offline
scope: https://www.googleapis.com/auth/calendar https://www.googleapis.com/auth/tasks

Root cause

The problem comes from the gmail account calendars: the mail client’s requests to sync the account calendars trip the rate limit.

Solution

  1. Sign in to the Google account that the Thunderbird mail client collects mail from. To do this, go to google.by (.com, .ru) and sign in to the account. Click the nine grey dots in the top right corner (Google apps) and choose Calendars there.

  2. At the bottom left, in the "My calendars" tab, select the calendar whose name matches the account login and click the three grey dots.

  3. In the menu that appears, choose "Settings and sharing".

  4. On the settings page, scroll all the way down and find the "Secret address in iCal format" setting. It is hidden by default. Click the "Copy" icon.

  5. Next, open Thunderbird, click the three bars in the top right corner, then "New".

  6. Choose "Calendar…".

  7. Choose "On the Network".

  8. Paste the address you copied in step 4 of these instructions into the "Location" field and click "Find Calendars".

  9. Select the calendar and click "Subscribe".

  10. Go back to the Google account calendars and repeat steps 2 through 9 for the "Праздники Беларуси" (Belarus holidays) calendar.

  11. The mail retrieval error is fixed.

Похожие

The Google Drive API returns 2 levels of error information:

  • HTTP error codes and messages in the header.
  • A JSON object in the response body with additional details that can help you
    determine how to handle the error.

Drive apps should catch and handle all errors that might be encountered when
using the REST API. This guide provides instructions on how to resolve specific
API errors.

Resolve a 400 error: Bad request

This error can result from any one of the following issues in your code:

  • A required field or parameter hasn’t been provided.
  • The value supplied or a combination of provided fields is invalid.
  • You tried to add a duplicate parent to a Drive file.
  • You tried to add a parent that would create a cycle in the directory graph.

Following is a sample JSON representation of this error:

{
  "error": {
    "code": 400,
    "errors": [
      {
        "domain": "global",
        "location": "orderBy",
        "locationType": "parameter",
        "message": "Sorting is not supported for queries with fullText terms. Results are always in descending relevance order.",
        "reason": "badRequest"
      }
    ],
    "message": "Sorting is not supported for queries with fullText terms. Results are always in descending relevance order."
  }
}

To fix this error, check the message field and adjust your code accordingly.

Resolve a 400 error: Invalid sharing request

This error can occur for several reasons. To determine the limit that has been
exceeded, evaluate the reason field of the returned JSON. This error most
commonly occurs because:

  • Sharing succeeded, but the notification email was not correctly delivered.
  • The Access Control List (ACL) change is not allowed for this user.

The message field indicates the actual error.

Sharing succeeded, but the notification email was not correctly delivered

Following is the JSON representation of this error:

{
  "error": {
    "errors": [
      {
        "domain": "global",
        "reason": "invalidSharingRequest",
        "message": "Bad Request. User message: "Sorry, the items were successfully shared but emails could not be sent to email@domain.com.""
      }
    ],
    "code": 400,
    "message": "Bad Request"
  }
}

To fix this error, inform the user (sharer) they were unable to share because
the notification email couldn’t be sent to the email address they want to share
with. The user should ensure they have the correct email address and that it can
receive email.

The ACL change is not allowed for this user

Following is the JSON representation of this error:

{
  "error": {
    "errors": [
      {
        "domain": "global",
        "reason": "invalidSharingRequest",
        "message": "Bad Request. User message: "ACL change not allowed.""
      }
    ],
    "code": 400,
    "message": "Bad Request"
  }
}

To fix this error, check the sharing settings of the Google Workspace domain to which the file belongs. The settings might
prohibit sharing outside of the domain or sharing a shared drive might not be
permitted.

Resolve a 401 error: Invalid credentials

A 401 error indicates the access token that you’re using is either expired or
invalid. This error can also be caused by missing authorization for the
requested scopes. Following is the JSON representation of this error:

{
  "error": {
    "errors": [
      {
        "domain": "global",
        "reason": "authError",
        "message": "Invalid Credentials",
        "locationType": "header",
        "location": "Authorization",
      }
    ],
    "code": 401,
    "message": "Invalid Credentials"
  }
}

To fix this error, refresh the access token using the long-lived refresh token.
If this fails, direct the user through the OAuth flow, as described in
API-specific authorization and authentication information.

Resolve a 403 error

An error 403 occurs when a usage limit has been exceeded or the user doesn’t
have the correct privileges. To determine the specific type of error, evaluate
the reason field of the returned JSON. This error occurs for the following
situations:

  • The daily limit was exceeded.
  • The user rate limit was exceeded.
  • The project rate limit was exceeded.
  • The sharing rate limit was exceeded.
  • The user hasn’t granted your app rights to a file.
  • The user doesn’t have sufficient permissions for a file.
  • Your app can’t be used within the signed in user’s domain.
  • Number of items in a folder was exceeded.

For information on Drive API limits, refer to
Usage limits. For information on Drive folder limits,
refer to
Folder limits in Google Drive.

Resolve a 403 error: Daily limit exceeded

A dailyLimitExceeded error indicates the courtesy API limit for your project
has been reached. Following is the JSON representation of this error:

{
  "error": {
    "errors": [
      {
        "domain": "usageLimits",
        "reason": "dailyLimitExceeded",
        "message": "Daily Limit Exceeded"
      }
    ],
    "code": 403,
    "message": "Daily Limit Exceeded"
  }
}

This error appears when the application’s owner has set a quota limit to cap
usage of a particular resource. To fix this error,
remove any usage caps for the «Queries per day» quota.

Resolve a 403 error: User rate limit exceeded

A userRateLimitExceeded error indicates the per-user limit has been reached.
This might be a limit from the Google API Console or a limit from the Drive
backend. Following is the JSON representation of this error:

{
 "error": {
  "errors": [
   {
    "domain": "usageLimits",
    "reason": "userRateLimitExceeded",
    "message": "User Rate Limit Exceeded"
   }
  ],
  "code": 403,
  "message": "User Rate Limit Exceeded"
 }
}

To fix this error, try any of the following:

  • Raise the per-user quota in the Google Cloud project. For more information,
    request a quota increase.
  • If one user is making numerous requests on behalf of many users of a Google Workspace account, consider a
    service account with domain-wide delegation
    using the
    quotaUser parameter.
  • Use exponential backoff to retry the
    request.

For information on Drive API limits, refer to
Usage limits.

Resolve a 403 error: Project rate limit exceeded

A rateLimitExceeded error indicates the project’s rate limit has been reached.
This limit varies depending on the type of requests. Following is the JSON
representation of this error:

{
 "error": {
  "errors": [
   {
    "domain": "usageLimits",
    "message": "Rate Limit Exceeded",
    "reason": "rateLimitExceeded",
   }
  ],
  "code": 403,
  "message": "Rate Limit Exceeded"
 }
}

To fix this error, try any of the following:

  • Raise the per-user quota in the Google Cloud project. For more information,
    request a quota increase.
  • Batch requests to make fewer
    API calls.
  • Use exponential backoff to retry the
    request.

Resolve a 403 error: Sharing rate limit exceeded

A sharingRateLimitExceeded error occurs when the user has reached a sharing
limit. This error is often linked with an email limit. Following is the JSON
representation of this error:

{
 "error": {
  "errors": [
   {
    "domain": "global",
    "message": "Rate limit exceeded. User message: "These item(s) could not be shared because a rate limit was exceeded: filename",
    "reason": "sharingRateLimitExceeded",
   }
  ],
  "code": 403,
  "message": "Rate Limit Exceeded"
 }
}

To fix this error:

  1. Do not send emails when sharing large amounts of files.
  2. If one user is making numerous requests on behalf of many users of a Google Workspace account, consider a
    service account with domain-wide delegation
    using the
    quotaUser parameter.

Resolve a 403 error: Storage quota exceeded

A storageQuotaExceeded error occurs when the user has reached their storage
limit. Following is the JSON representation of this error:

{
 "error": {
  "errors": [
   {
    "domain": "global",
    "message": "The user's Drive storage quota has been exceeded.",
    "reason": "storageQuotaExceeded",
   }
  ],
  "code": 403,
  "message": "The user's Drive storage quota has been exceeded."
 }
}

To fix this error:

  1. Review the storage limits for your Drive account. For more information,
    refer to
    Drive storage and upload limits

  2. Free up space

    or get more storage from Google One.

Resolve a 403 error: The user has not granted the app {appId} {verb} access to the file {fileId}

An appNotAuthorizedToFile error occurs when your app is not on the ACL for the
file. This error prevents the user from opening the file with your app.
Following is the JSON representation of this error:

{
  "error": {
    "errors": [
      {
        "domain": "global",
        "reason": "appNotAuthorizedToFile",
        "message": "The user has not granted the app {appId} {verb} access to the file {fileId}."
      }
    ],
    "code": 403,
    "message": "The user has not granted the app {appId} {verb} access to the file {fileId}."
  }
}

To fix this error, try any of the following:

  • Open the Google Drive picker
    and prompt the user to open the file.
  • Instruct the user to open the file using the
    Open with context menu in the Drive
    UI of your app.

You can also check the isAppAuthorized field on a file to verify that your app
created or opened the file.

Resolve a 403 error: The user does not have sufficient permissions for file {fileId}

An insufficientFilePermissions error occurs when the user doesn’t have write
access to a file, and your app is attempting to modify the file. Following is
the JSON representation of this error:

{
  "error": {
    "errors": [
      {
        "domain": "global",
        "reason": "insufficientFilePermissions",
        "message": "The user does not have sufficient permissions for file {fileId}."
      }
    ],
    "code": 403,
    "message": "The user does not have sufficient permissions for file {fileId}."
  }
}

To fix this error, instruct the user to contact the file’s owner and request
edit access. You can also check user access levels in the metadata retrieved by
files.get and display a read-only UI when
permissions are missing.
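One way to implement the read-only fallback is to branch on the capabilities field returned by files.get. A minimal sketch, where the function name and sample responses are illustrative:

```python
# Sketch: choose a UI mode from the capabilities field of a files.get response
# (fields="capabilities/canEdit"). The function name is illustrative.
def ui_mode(file_metadata):
    caps = file_metadata.get("capabilities", {})
    return "read-write" if caps.get("canEdit") else "read-only"

editor = {"capabilities": {"canEdit": True}}
viewer = {"capabilities": {"canEdit": False}}
print(ui_mode(editor))  # read-write
print(ui_mode(viewer))  # read-only
```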

Resolve a 403 error: App with id {appId} cannot be used within the authenticated user’s domain

A domainPolicy error occurs when the policy for the user’s domain doesn’t
allow access to Drive by your app. Following is the JSON representation of this
error:

{
  "error": {
    "errors": [
      {
        "domain": "global",
        "reason": "domainPolicy",
        "message": "The domain administrators have disabled Drive apps."
      }
    ],
    "code": 403,
    "message": "The domain administrators have disabled Drive apps."
  }
}

To fix this error:

  1. Inform the user the domain doesn’t allow your app to access files in Drive.
  2. Instruct the user to contact the domain Admin to request access for your
    app.

Resolve a 403 error: Number of items in folder was exceeded

A numChildrenInNonRootLimitExceeded error occurs when the limit for a folder’s
number of children (folders, files, and shortcuts) has been exceeded. There’s a
500,000 item limit for folders, files, and shortcuts directly in a folder. Items
nested in subfolders don’t count against this 500,000 item limit. For more
information on Drive folder limits, refer to
Folder limits in Google Drive
.

Resolve a 403 error: Number of items created by account was exceeded

An activeItemCreationLimitExceeded error occurs when the limit for the number
of items, whether trashed or not, created by this account has been exceeded. All
item types, including folders, count towards the limit.

Resolve a 404 error: File not found: {fileId}

The notFound error occurs when the user doesn’t have read access to a file, or
the file doesn’t exist.

{
  "error": {
    "errors": [
      {
        "domain": "global",
        "reason": "notFound",
        "message": "File not found {fileId}"
      }
    ],
    "code": 404,
    "message": "File not found: {fileId}"
  }
}

To fix this error:

  1. Inform the user they don’t have read access to the file or the file doesn’t
    exist.
  2. Instruct the user to contact the file’s owner and request access to the
    file.

Resolve a 429 error: Too many requests

A rateLimitExceeded error occurs when the user has sent too many requests in a
given amount of time.

{
  "error": {
    "errors": [
      {
        "domain": "usageLimits",
        "reason": "rateLimitExceeded",
        "message": "Rate Limit Exceeded"
      }
    ],
    "code": 429,
    "message": "Rate Limit Exceeded"
  }
}

To fix this error, use
exponential backoff to retry the request.
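A backoff loop of the kind recommended here can be sketched in Python as follows. The helper is illustrative, not part of any Google client library (client libraries often provide built-in retry support).

```python
import random
import time

# Sketch of exponential backoff: double the delay after each retryable
# failure, adding a little jitter so many clients don't retry in lockstep.
def retry_with_backoff(request_fn, max_attempts=5, base_delay=1.0):
    """Retry request_fn, doubling the delay after each retryable failure."""
    for attempt in range(max_attempts):
        try:
            return request_fn()
        except Exception:
            if attempt == max_attempts - 1:
                raise  # Out of attempts: surface the error to the caller.
            # Delay grows base_delay * 1, 2, 4, ... plus proportional jitter.
            time.sleep(base_delay * (2 ** attempt) + random.random() * base_delay)

# Simulated request that fails twice with a rate-limit error, then succeeds.
calls = {"n": 0}
def flaky_request():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("429 Rate Limit Exceeded")
    return "ok"

result = retry_with_backoff(flaky_request, base_delay=0.01)
print(result)  # ok
```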

Resolve a 5xx error

A 5xx error occurs when an unexpected error arises while processing the request.
This can be caused by various issues, including a request’s timing overlapping
with another request or a request for an unsupported action, such as attempting
to update permissions for a single page in a Google Site instead of the site
itself.

To fix this error, use
exponential backoff to retry the request.
Following is a list of 5xx errors:

  • 500 Backend error
  • 502 Bad Gateway
  • 503 Service Unavailable
  • 504 Gateway Timeout

How to fix the Runtime Code 403 Rate-limit exceeded

This article covers error code 403, commonly known as "Rate-limit exceeded" and described as "Google Error 403. Rate-limit exceeded."

About Runtime Code 403

Runtime Code 403 happens when Picasa fails or crashes while it’s running, hence its name. It doesn’t necessarily mean that the code was corrupt in some way, but just that it did not work during its run-time. This kind of error appears as an intrusive notification on your screen unless handled and corrected. Here are symptoms, causes, and ways to troubleshoot the problem.

Definitions (Beta)

Here we list some definitions for the words contained in your error, in an attempt to help you understand your problem. This is a work in progress, so we might sometimes define a word incorrectly; feel free to skip this section!

  • Limit — Relates to any sort of limit applied to data or resources, e.g. limiting the size or value of a variable, limiting the rate of incoming traffic, or capping CPU usage
  • Rate — A measure, quantity, or frequency, typically one measured against some other quantity or measure.
  • Google+ — Integrate applications or websites with the Google+ platform
Symptoms of Code 403 — Rate-limit exceeded

Runtime errors happen without warning. The error message can come up on the screen anytime Picasa is run. In fact, the error message or some other dialog box can come up again and again if not addressed early on.

There may be instances of files being deleted or new files appearing. Though this symptom is largely due to virus infection, it can also be a symptom of a runtime error, since virus infection is one of the causes of runtime errors. Users may also experience a sudden drop in internet connection speed, but again, this is not always the case.


Causes of Rate-limit exceeded — Code 403

During software design, programmers write code anticipating the occurrence of errors. However, there are no perfect designs, as errors can be expected even with the best program design. Glitches can happen during runtime if a certain error is not encountered and addressed during design and testing.

Runtime errors are generally caused by incompatible programs running at the same time. They may also occur because of a memory problem, a bad graphics driver, or a virus infection. Whatever the case may be, the problem must be resolved immediately to avoid further problems. Here are ways to remedy the error.

Repair Methods

Runtime errors may be annoying and persistent, but the situation is not hopeless; repairs are available. Here are ways to do it.


Method 7 — IE related Runtime Error

If the error you are getting is related to the Internet Explorer, you may do the following:

  1. Reset your browser.
    • For Windows 7, you may click Start, go to Control Panel, then click Internet Options on the left side. Then you can click Advanced tab then click the Reset button.
    • For Windows 8 and 10, you may click search and type Internet Options, then go to Advanced tab and click Reset.
  2. Disable script debugging and error notifications.
    • On the same Internet Options window, go to the Advanced tab and look for Disable script debugging
    • Select its check box
    • At the same time, uncheck the «Display a Notification about every Script Error» item, then click Apply and OK, and reboot your computer.

If these quick fixes do not work, you can always back up your files and run a repair reinstall on your computer. However, you can do that later if the solutions listed here do not do the job.

Method 1 — Close Conflicting Programs

When you get a runtime error, keep in mind that it is happening due to programs that are conflicting with each other. The first thing you can do to resolve the problem is to stop these conflicting programs.

  • Open Task Manager by clicking Ctrl-Alt-Del at the same time. This will let you see the list of programs currently running.
  • Go to the Processes tab and stop the programs one by one by highlighting each program and clicking the End Process button.
  • You will need to observe if the error message will reoccur each time you stop a process.
  • Once you get to identify which program is causing the error, you may go ahead with the next troubleshooting step, reinstalling the application.

Method 2 — Update / Reinstall Conflicting Programs

Using Control Panel

  • For Windows 7, click the Start Button, then click Control panel, then Uninstall a program
  • For Windows 8, click the Start Button, then scroll down and click More Settings, then click Control panel > Uninstall a program.
  • For Windows 10, just type Control Panel on the search box and click the result, then click Uninstall a program
  • Once inside Programs and Features, click the problem program and click Update or Uninstall.
  • If you chose to update, then you will just need to follow the prompt to complete the process, however if you chose to Uninstall, you will follow the prompt to uninstall and then re-download or use the application’s installation disk to reinstall the program.

Using Other Methods

  • For Windows 7, you may find the list of all installed programs when you click Start and scroll your mouse over the list that appears on the tab. You may see on that list a utility for uninstalling programs. You may go ahead and uninstall using the utilities available in this tab.
  • For Windows 10, you may click Start, then Settings, then choose Apps.
  • Scroll down to see the list of Apps and features installed in your computer.
  • Click the Program which is causing the runtime error, then you may choose to uninstall or click Advanced options to reset the application.

Method 3 — Update your Virus protection program or download and install the latest Windows Update

A virus infection causing a runtime error on your computer must immediately be prevented, quarantined, or deleted. Make sure you update your virus program and run a thorough scan of the computer, or run Windows Update so you can get the latest virus definitions and fixes.

Method 4 — Re-install Runtime Libraries

You might be getting the error because of an update, like the MS Visual C++ package which might not be installed properly or completely. What you can do then is to uninstall the current package and install a fresh copy.

  • Uninstall the package by going to Programs and Features, find and highlight the Microsoft Visual C++ Redistributable Package.
  • Click Uninstall on top of the list, and when it is done, reboot your computer.
  • Download the latest redistributable package from Microsoft then install it.

Method 5 — Run Disk Cleanup

You might also be experiencing a runtime error because of very little free space on your computer.

  • You should consider backing up your files and freeing up space on your hard drive
  • You can also clear your cache and reboot your computer
  • You can also run Disk Cleanup, open your explorer window and right click your main directory (this is usually C: )
  • Click Properties and then click Disk Cleanup

Method 6 — Reinstall Your Graphics Driver

If the error is related to a bad graphics driver, then you may do the following:

  • Open your Device Manager, locate the graphics driver
  • Right click the video card driver then click uninstall, then restart your computer

The Author About The Author: Phil Hart has been a Microsoft Community Contributor since 2010. With a current point score over 100,000, they’ve contributed more than 3000 answers in the Microsoft Support forums and have created almost 200 new help articles in the Technet Wiki.




Article ID: ACX010161EN

Applies To: Windows 10, Windows 8.1, Windows 7, Windows Vista, Windows XP, Windows 2000




BigQuery has various quotas and limits
that limit the rate and volume of different requests and operations. They exist
both to protect the infrastructure and to help guard against unexpected
customer usage. This document describes how to diagnose and mitigate
specific errors resulting from quotas and limits.

If your error message is not listed in this document, then refer to
the list of error messages, which contains more generic error
information.

Overview

If a BigQuery operation fails because of exceeding a quota, the API
returns the HTTP 403 Forbidden status code. The response body contains more
information about the quota that was reached. The response body looks similar to
the following:

{
  "code" : 403,
  "errors" : [ {
    "domain" : "global",
    "message" : "Quota exceeded: ...",
    "reason" : "quotaExceeded"
  } ],
  "message" : "Quota exceeded: ..."
}

The message field in the payload describes which limit was exceeded. For
example, the message field might say Exceeded rate limits: too many table
update operations for this table.

In general, quota limits fall into two categories, indicated by the reason
field in the response payload.

  • rateLimitExceeded. This value indicates a short-term
    limit. To resolve these limit issues, retry the operation after a few seconds.
    Use exponential backoff between retry attempts. That is, exponentially
    increase the delay between each retry.

  • quotaExceeded. This value indicates a longer-term limit. If you reach a
    longer-term quota limit, you should wait 10 minutes or longer before trying
    the operation again. If you consistently reach one of these longer-term
    quota limits, you should analyze your workload for ways to mitigate the
    issue. Mitigations can include optimizing your workload or requesting a
    quota increase.

For quotaExceeded errors, examine the error message to understand which quota
limit was exceeded. Then, analyze your workload to see if you can avoid reaching
the quota. For example, optimizing query performance can mitigate quota errors
for concurrent queries.
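As a sketch, a client can branch on the reason field to pick between the two strategies above. The function name and return strings below are illustrative, not part of the BigQuery API.

```python
# Sketch: map the reason field of a BigQuery 403 payload to a retry strategy,
# following the two categories described above.
def quota_error_strategy(error_payload):
    reason = error_payload.get("errors", [{}])[0].get("reason")
    if reason == "rateLimitExceeded":
        return "retry with exponential backoff"
    if reason == "quotaExceeded":
        return "wait at least 10 minutes, then retry or request more quota"
    return "not a quota error"

payload = {
    "code": 403,
    "errors": [{"domain": "global",
                "message": "Quota exceeded: ...",
                "reason": "quotaExceeded"}],
    "message": "Quota exceeded: ...",
}
print(quota_error_strategy(payload))
```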

In some cases, the quota can be raised by
contacting BigQuery support or
contacting Google Cloud sales, but we recommend trying the
suggestions in this document first.

Diagnosis

To diagnose issues, do the following:

  • Use INFORMATION_SCHEMA views to analyze the underlying issue. These views contain
    metadata about your BigQuery resources, including jobs,
    reservations, and streaming inserts.

    For example, the following query uses the
    INFORMATION_SCHEMA.JOBS view to list all
    quota-related errors within the past day:

    SELECT
     job_id,
     creation_time,
     error_result
    FROM `region-us`.INFORMATION_SCHEMA.JOBS
    WHERE creation_time > TIMESTAMP_SUB(CURRENT_TIMESTAMP, INTERVAL 1 DAY) AND
          error_result.reason IN ('rateLimitExceeded', 'quotaExceeded')
    
  • View errors in Cloud Audit Logs.

    For example, using
    Logs Explorer, the following query
    returns errors with either Quota exceeded or limit in the message string:

    resource.type = ("bigquery_project" OR "bigquery_dataset")
    protoPayload.status.code ="7"
    protoPayload.status.message: ("Quota exceeded" OR "limit")
    

    In this example, the status code 7 indicates
    PERMISSION_DENIED, which
    corresponds to the HTTP 403 status code.

    For additional Cloud Audit Logs query samples, see BigQuery
    queries.

Concurrent queries quota errors

If a project is simultaneously running more interactive queries
than the assigned limit for that project, you might encounter this error.

For more information about this limit, see the Maximum number of concurrent interactive
queries limit.

Error message

Exceeded rate limits: too many concurrent queries for this project_and_region

Diagnosis

If you haven’t identified the query jobs that are returning this error, do the
following:

  • Check for other queries that are running concurrently with the failed queries.

    For example, if your failed query was submitted on 2021-06-08 12:00:00 UTC in
    the us region, run the following query against the
    INFORMATION_SCHEMA.JOBS_BY_PROJECT view in the project where the failed
    query was submitted:

    DECLARE failed_query_submission_time DEFAULT CAST('2021-06-08 12:00:00' AS TIMESTAMP);
    
    SELECT
     job_id,
     state,
     user_email,
     query
    FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
    WHERE creation_time >= date_sub(failed_query_submission_time, INTERVAL 1 DAY)
    AND job_type = 'QUERY'
    AND priority = 'INTERACTIVE'
    AND start_time <= failed_query_submission_time
    AND (end_time >= failed_query_submission_time OR state != 'DONE')
    
  • If your query to INFORMATION_SCHEMA.JOBS_BY_PROJECT fails with the same
    error, then run the bq ls
    command in the
    Cloud Shell terminal to list the queries that are running:

    bq ls -j --format=prettyjson -n 200 | jq '.[] | select(.status.state=="RUNNING")| {configuration: .query, id: .id, jobReference: .jobReference, user_email: .user_email}'

Resolution

To resolve this quota error, do the following:

  • Pause the job. If your preceding diagnosis identifies a process or a
    workflow responsible for an increase in queries, then pause that process or
    workflow.

  • Use jobs with batch priority. Batch queries
    don’t count towards your concurrent rate limit. Running batch queries can
    allow you to start many queries at once. Batch queries use the same resources
    as interactive (on-demand) queries. BigQuery queues each batch
    query on your behalf, and starts the query when idle resources are
    available in the BigQuery shared resource pool.

    Alternatively, you can also create a separate project to run queries.

  • Distribute queries. Organize and distribute the load across different
    projects as informed by the nature of your queries and your business needs.

  • Distribute run times. Distribute the load across a larger time frame. If
    your
    reporting solution needs to run many queries, try to introduce some randomness
    for when queries start. For example, don’t start all reports at the same time.

  • Use BigQuery BI Engine. If you have encountered this error while using a
    business intelligence (BI) tool
    to create dashboards that query data in BigQuery, then we
    recommend that you use BigQuery BI Engine, which is optimal for this use
    case.

  • Optimize queries and data model. Oftentimes, a query can be rewritten so
    that it runs more efficiently. For example, if your query contains a common
    table expression (CTE), that is, a WITH clause, that is referenced in more
    than one place in the query, then its computation is done multiple times.
    It is better to persist the calculation done by the CTE in a temporary
    table, and then reference that table in the query.

    Multiple joins can also be a source of inefficiency. In this case, you
    might want to consider using
    nested and repeated columns. Doing so often improves locality of the
    data, eliminates the need for some joins, and overall reduces resource
    consumption and query run time.

    Optimizing queries makes them cheaper, so when
    you use flat-rate pricing, you can run more queries with your slots. For more
    information, see Introduction to optimizing query
    performance.

  • Optimize the query model. BigQuery is not a relational
    database. It is not optimized for an infinite number of small queries.
    Running a large number of small queries quickly depletes your quotas. Such
    queries don’t run as efficiently as they would on smaller database products.
    BigQuery is a large data warehouse, and that is its primary use
    case. It performs best with analytical queries over large amounts of data.

  • Persist data (Saved tables). Pre-process the data in
    BigQuery and store it in additional tables. For example, if you
    execute many similar, computationally-intensive queries with different WHERE
    conditions, then their results are not cached. Such queries also consume
    resources each time they run. You can improve the performance of such queries
    and decrease their processing time by pre-computing the data and storing it in
    a table. This pre-computed data in the table can be queried by SELECT
    queries. It can often be done during ingestion within the ETL
    process,
    or by using scheduled queries or
    materialized views.

  • Increase quota limits. You can also increase the quota limits to resolve
    this error. To raise the limit, contact support or
    contact sales. Requesting a quota increase might take several days
    to process. To provide more information for your request, we recommend that
    your request includes the priority of the job, the user running the query, and
    the affected method.

    Limits are applied at the project level. However, increasing the number of
    concurrent jobs per project reduces the number of available slots for each
    concurrently running query, which might reduce performance of individual
    queries. To improve the performance, we recommend that you increase the
    number of slots if the concurrent queries limit is increased.

    To learn more about raising this limit, see
    Quotas and limits. For more information
    about slots, see slot reservation.

Number of partition modifications for column-partitioned tables quota errors

BigQuery returns this error when your column-partitioned table
reaches the
quota of the number of partition modifications permitted per day.
Partition modifications include the total of all load jobs,
copy jobs, and query jobs
that append or overwrite a destination partition, or that
use a DML DELETE, INSERT,
MERGE, TRUNCATE TABLE, or UPDATE statement to write data to a table.

To see the value of the Number of partition
modifications per column-partitioned table per day
limit, see Partitioned
tables.

Error message

Quota exceeded: Your table exceeded quota for
Number of partition modifications to a column partitioned table

Resolution

This quota cannot be increased. To resolve this quota error, do the following:

  • Change the partitioning on the table to have more data in each partition, in
    order to decrease the total number of partitions. For example, change from
    partitioning by day to partitioning by month
    or change how you partition the table.
  • Use clustering
    instead of partitioning.
  • If you frequently load data from multiple small files stored in Cloud Storage that uses
    a job per file, then combine multiple load jobs into a single job. You can load from multiple
    Cloud Storage URIs with
    a comma-separated list (for example, gs://my_path/file_1,gs://my_path/file_2), or by
    using wildcards (for example, gs://my_path/*).

    For more information, see Batch
    loading data.

  • If you use single-row queries (that is, INSERT statements) to write data to a
    table, consider batching multiple queries into one to reduce the number of jobs.
    BigQuery doesn’t perform well when used as a relational
    database, so single-row INSERT statements executed at high speed are not a
    recommended best practice.
  • If you intend to insert data at a high rate, consider using
    BigQuery Storage Write API. It is a
    recommended solution for high-performance data ingestion. The BigQuery Storage Write API has robust
    features, including exactly-once delivery semantics. To learn about limits and
    quotas, see Storage Write
    API and to see costs of using this API, see
    BigQuery
    data ingestion pricing.
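The batching advice above can be sketched as follows: fold many single-row INSERTs into one multi-row statement. The table, columns, and rows are hypothetical, and real code should prefer query parameters over string interpolation.

```python
# Sketch: combine many single-row INSERTs into one multi-row statement to cut
# the number of jobs. Table and column names here are hypothetical.
def build_batched_insert(table, columns, rows):
    cols = ", ".join(columns)
    values = ",\n  ".join(
        "(" + ", ".join(repr(v) for v in row) + ")" for row in rows
    )
    return f"INSERT INTO {table} ({cols}) VALUES\n  {values}"

sql = build_batched_insert(
    "my_dataset.events",           # hypothetical destination table
    ["user_id", "action"],
    [(1, "click"), (2, "view"), (3, "click")],
)
print(sql)
```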

Streaming insert quota errors

This section gives some tips for troubleshooting quota errors related to
streaming data into BigQuery.

In certain regions, streaming inserts have a higher quota if you don’t populate
the insertId field for each row. For more information about quotas for
streaming inserts, see Streaming inserts.
The quota-related errors for BigQuery streaming depend on the
presence or absence of insertId.

Error message

If the insertId field is empty, the following quota error is possible:

  • Quota limit: Bytes per second per project
    Error message: Your entity with gaia_id: GAIA_ID,
    project: PROJECT_ID in region: REGION exceeded quota for
    insert bytes per second.

If the insertId field is populated, the following quota errors are possible:

  • Quota limit: Rows per second per project
    Error message: Your project: PROJECT_ID in REGION exceeded
    quota for streaming insert rows per second.
  • Quota limit: Rows per second per table
    Error message: Your table: TABLE_ID exceeded quota for
    streaming insert rows per second.
  • Quota limit: Bytes per second per table
    Error message: Your table: TABLE_ID exceeded quota for
    streaming insert bytes per second.

The purpose of the insertId field is to deduplicate inserted rows. If multiple
inserts with the same insertId arrive within a few minutes’ window,
BigQuery writes a single version of the record. However, this
automatic deduplication is not guaranteed. For maximum streaming throughput, we
recommend that you don’t include insertId and instead use
manual deduplication.
For more information, see
Ensuring data consistency.
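Manual deduplication can be sketched as follows. The key fields (event_id, ts) are hypothetical and would be whatever uniquely identifies a record in your schema.

```python
# Sketch of client-side deduplication when streaming without insertId: build a
# key per row and drop rows whose key has already been sent.
def dedupe_rows(rows, seen_keys):
    unique = []
    for row in rows:
        key = (row["event_id"], row["ts"])  # hypothetical record key
        if key not in seen_keys:
            seen_keys.add(key)
            unique.append(row)
    return unique

seen = set()
batch1 = [{"event_id": "a", "ts": 1}, {"event_id": "b", "ts": 1}]
batch2 = [{"event_id": "b", "ts": 1}, {"event_id": "c", "ts": 2}]  # "b" repeats
first = dedupe_rows(batch1, seen)
second = dedupe_rows(batch2, seen)
print(len(first), len(second))  # 2 1
```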

Diagnosis

Use the STREAMING_TIMELINE_BY_*
views to analyze the streaming traffic. These views aggregate streaming
statistics over one-minute intervals, grouped by error code. Quota errors appear
in the results with error_code equal to RATE_LIMIT_EXCEEDED or
QUOTA_EXCEEDED.

Depending on the specific quota limit that was reached, look at total_rows or
total_input_bytes. If the error is a table-level quota, filter by table_id.

For example, the following query shows total bytes ingested per minute, and the
total number of quota errors:

SELECT
 start_timestamp,
 error_code,
 SUM(total_input_bytes) as sum_input_bytes,
 SUM(IF(error_code IN ('QUOTA_EXCEEDED', 'RATE_LIMIT_EXCEEDED'),
     total_requests, 0)) AS quota_error
FROM
 `region-us`.INFORMATION_SCHEMA.STREAMING_TIMELINE_BY_PROJECT
WHERE
  start_timestamp > TIMESTAMP_SUB(CURRENT_TIMESTAMP, INTERVAL 1 DAY)
GROUP BY
 start_timestamp,
 error_code
ORDER BY 1 DESC

Resolution

To resolve this quota error, do the following:

  • If you are using the insertId field for deduplication, and your project is
    in a region that supports the higher streaming quota, we recommend removing the
    insertId field. This solution may require some additional steps to manually
    deduplicate the data. For more information, see
    Manually removing duplicates.

  • If you are not using insertId, or if it’s not feasible to remove it, monitor
    your streaming traffic over a 24-hour period and analyze the quota errors:

    • If you see mostly RATE_LIMIT_EXCEEDED errors rather than QUOTA_EXCEEDED
      errors, and your overall traffic is below 80% of quota, the errors probably
      indicate temporary spikes. You can address these errors by retrying the
      operation using exponential backoff between retries.

    • If you are using a Dataflow job to insert data, consider using
      load jobs instead of streaming
      inserts. For more information, see Setting the insertion
      method.
      If you are using Dataflow with a custom I/O connector, consider
      using a built-in I/O connector instead. For more information, see Custom
      I/O patterns.

    • If you see QUOTA_EXCEEDED errors or the overall traffic consistently
      exceeds 80% of the quota, submit a request for a quota increase. For more
      information, see
      Requesting a higher quota limit.

    • You may also want to consider replacing streaming inserts with the newer
      Storage Write API which has higher throughput,
      lower price, and many useful features.

Loading CSV files quota errors

If you load a large CSV file using the bq load command with the
--allow_quoted_newlines flag,
you might encounter this error.

Error message

Input CSV files are not splittable and at least one of the files is larger than
the maximum allowed size. Size is: ...

Resolution

To resolve this quota error, do the following:

  • Set the --allow_quoted_newlines flag to false.
  • Split the CSV file into smaller chunks that are each less than 4 GB.
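The splitting step can be sketched as follows. This toy version works on in-memory text with a small byte limit; a real 4 GB split would stream from disk, but the header-per-chunk logic is the same.

```python
# Sketch: split CSV text into chunks, each under max_bytes and each carrying
# the header row, so that every chunk can be loaded independently.
def split_csv(text, max_bytes):
    lines = text.splitlines(keepends=True)
    header, body = lines[0], lines[1:]
    chunks, current, size = [], [header], len(header.encode())
    for line in body:
        line_size = len(line.encode())
        # Start a new chunk when this row would overflow the current one.
        if size + line_size > max_bytes and len(current) > 1:
            chunks.append("".join(current))
            current, size = [header], len(header.encode())
        current.append(line)
        size += line_size
    chunks.append("".join(current))
    return chunks

data = "id,name\n1,a\n2,b\n3,c\n4,d\n"
parts = split_csv(data, max_bytes=20)
print(len(parts))  # 2
```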

For more information about limits that apply when you load data into
BigQuery, see Load jobs.

Table imports or query appends quota errors

BigQuery returns this error message when your table reaches the
limit for
table operations per day for Standard tables. Table operations include the
combined total of all load jobs,
copy jobs, and query jobs
that append or overwrite a destination table or that use
a DML DELETE, INSERT,
MERGE, TRUNCATE TABLE, or UPDATE statement to write data to a table.

To see the value of the Table operations per day limit, see Standard
tables.

Error message

Your table exceeded quota for imports or query appends per table

Diagnosis

If you have not identified the source from where most table operations are
originating, do the following:

  1. Make a note of the project, dataset, and table that the failed query, load,
    or the copy job is writing to.

  2. Use INFORMATION_SCHEMA.JOBS_BY_* tables to learn more about jobs
    that modify the table.

    The following example finds the hourly count of jobs grouped by job type for
    a 24-hour period using JOBS_BY_PROJECT. If you expect multiple
    projects to write to the table, replace JOBS_BY_PROJECT with
    JOBS_BY_ORGANIZATION.

    SELECT
      TIMESTAMP_TRUNC(creation_time, HOUR),
      job_type,
      count(1)
    FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
    #Adjust time
    WHERE creation_time BETWEEN "2021-06-20 00:00:00" AND "2021-06-21 00:00:00"
    AND destination_table.project_id = "my-project-id"
    AND destination_table.dataset_id = "my_dataset"
    AND destination_table.table_id = "my_table"
    GROUP BY 1, 2
    ORDER BY 1 DESC
    

Resolution

This quota cannot be increased. To resolve this quota error, do the following:

  • If you frequently load data from multiple small files stored in Cloud Storage using
    one load job per file, combine multiple files into a single load job. You can load from multiple
    Cloud Storage URIs with
    a comma-separated list (for example, gs://my_path/file_1,gs://my_path/file_2), or by
    using wildcards (for example, gs://my_path/*).

    For more information, see Batch
    loading data.

  • If you use single-row queries (that is, INSERT statements) to write data to a
    table, consider batching multiple queries into one to reduce the number of jobs.
    BigQuery doesn’t perform well when used as a relational
    database, so executing single-row INSERT statements at a high rate is not a
    recommended practice.
  • If you intend to insert data at a high rate, consider using
    BigQuery Storage Write API. It is a
    recommended solution for high-performance data ingestion. The BigQuery Storage Write API has robust
    features, including exactly-once delivery semantics. To learn about limits and
    quotas, see Storage Write
    API and to see costs of using this API, see
    BigQuery
    data ingestion pricing.
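
The batching advice above can be sketched as follows. This is a hedged illustration, not Google client code: build_batched_insert, the table name, and the use of repr() for literals are invented for the example (real code should use query parameters), but it shows how one multi-row INSERT per batch replaces one job per row:

```python
def build_batched_insert(table, rows, batch_size=500):
    """Turn a list of row tuples into multi-row INSERT statements.

    One statement per batch_size rows means one query job per batch
    instead of one job per row, which keeps the daily table-operation
    count down.  Illustrative only: production code should format
    values with query parameters, not repr().
    """
    for start in range(0, len(rows), batch_size):
        batch = rows[start:start + batch_size]
        values = ",\n  ".join(
            "(" + ", ".join(repr(v) for v in row) + ")" for row in batch
        )
        yield f"INSERT INTO `{table}` VALUES\n  {values}"
```

With batch_size=500, inserting 10,000 rows costs 20 jobs instead of 10,000, a 500-fold reduction in table operations.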

Table metadata update limit errors

BigQuery returns this error when your table reaches the limit for
maximum rate of table metadata update operations per table for Standard tables.
Table operations include the combined total of all load jobs,
copy jobs, and query jobs
that append to or overwrite a destination table or that use
a DML DELETE, INSERT,
MERGE, TRUNCATE TABLE, or UPDATE to write data to a table.

To see the value of the Maximum rate of table metadata update
operations per table
limit, see Standard tables.

Error message

Exceeded rate limits: too many table update operations for this table

Diagnosis

Metadata table updates can originate from API calls that modify a table’s
metadata or from jobs that modify a table’s content. If you have not
identified the source from where most update operations to a table’s metadata
are originating, do the following:

Identify API calls

  1. Go to the Google Cloud navigation
    menu and
    select Logging > Logs Explorer:

    Go to the Logs Explorer

  2. Filter logs to view table operations by running the following query:

    resource.type="bigquery_dataset"
    protoPayload.resourceName="projects/my-project-id/datasets/my_dataset/tables/my_table"
    (protoPayload.methodName="google.cloud.bigquery.v2.TableService.PatchTable" OR
    protoPayload.methodName="google.cloud.bigquery.v2.TableService.UpdateTable" OR
    protoPayload.methodName="google.cloud.bigquery.v2.TableService.InsertTable")
    

Identify jobs

The following query returns a list of jobs that modify the affected table in
the project. If you expect multiple projects in an organization
to write to the table, replace JOBS_BY_PROJECT with JOBS_BY_ORGANIZATION.

SELECT
 job_id,
 user_email,
 query
#Adjust region
FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
#Adjust time
WHERE creation_time BETWEEN "2021-06-21 10:00:00" AND "2021-06-21 20:00:00"
AND destination_table.project_id = "my-project-id"
AND destination_table.dataset_id = "my_dataset"
AND destination_table.table_id = "my_table"

For more information, see BigQuery audit logs
overview.

Resolution

This quota cannot be increased. To resolve this quota error, do the following:

  • Reduce the update rate for the table metadata.
  • Add a delay between jobs or table operations to make sure that the update rate
    is within the limit.
  • For data inserts or modification, consider using DML operations. DML
    operations are not affected by the Maximum rate of table metadata update
    operations per table
    rate limit.

    DML operations have other limits
    and quotas. For more information, see Using
    data manipulation language (DML).

  • If you frequently load data from multiple small files stored in Cloud Storage using
    one load job per file, combine multiple files into a single load job. You can load from multiple
    Cloud Storage URIs with
    a comma-separated list (for example, gs://my_path/file_1,gs://my_path/file_2), or by
    using wildcards (for example, gs://my_path/*).

    For more information, see Batch
    loading data.

  • If you use single-row queries (that is, INSERT statements) to write data to a
    table, consider batching multiple queries into one to reduce the number of jobs.
    BigQuery doesn’t perform well when used as a relational
    database, so executing single-row INSERT statements at a high rate is not a
    recommended practice.
  • If you intend to insert data at a high rate, consider using
    BigQuery Storage Write API. It is a
    recommended solution for high-performance data ingestion. The BigQuery Storage Write API has robust
    features, including exactly-once delivery semantics. To learn about limits and
    quotas, see Storage Write
    API and to see costs of using this API, see
    BigQuery
    data ingestion pricing.

Maximum number of API requests limit errors

BigQuery returns this error when you hit the rate limit for the
number of API requests to a BigQuery API per user per method—for
example, the tables.get
method calls from a service account, or the jobs.insert
method calls from a different user email.
For more information, see the Maximum number of API requests per second per
user per method
rate limit in
All BigQuery API.

Error message

Too many API requests per user per method for this user_method

Diagnosis

If you have not identified the method that has reached this rate limit, do the
following:

For service account

  1. Go to the project
    that hosts the service account.

  2. In the Google Cloud console, go to the API Dashboard.

    For instructions on how to view the detailed usage information of an API,
    see Using the API Dashboard.

  3. In the API Dashboard, select BigQuery API.

  4. To view more detailed usage information, select Metrics, and then do
    the following:

    1. For Select Graphs, select Traffic by API method.

    2. Filter the chart by the service account’s credentials. You might see
      spikes for a method in the time range where you noticed the error.

For API calls

Some API calls log errors in BigQuery audit
logs in Cloud Logging. To identify the method that reached the limit, do the
following:

  1. In the Google Cloud console, go to the Google Cloud navigation
    menu and then
    select Logging > Logs Explorer for your project:

    Go to the Logs Explorer

  2. Filter logs by running the following query:

     resource.type="bigquery_resource"
     protoPayload.authenticationInfo.principalEmail="<user email or service account>"
     "Too many API requests per user per method for this user_method"

    In the log entry, you can find the method name under the protoPayload.method_name property.

    For more information, see BigQuery audit logs
    overview.

Resolution

To resolve this quota error, do the following:

  • Reduce the number of API requests or add a delay between multiple API requests
    so that the number of requests stays under this limit.

  • If the limit is only exceeded occasionally, you can implement retries on this
    specific error with exponential backoff.

  • If you frequently insert data, consider using
    streaming inserts because streaming
    inserts are
    not affected by the BigQuery API quota. However, the streaming inserts API
    has costs associated with it and has its own set of limits and quotas.

    To learn about the cost of streaming inserts, see
    BigQuery pricing.

  • While loading data to BigQuery using Dataflow
    with the BigQuery I/O connector, you
    might encounter this error for the tables.get
    method. To resolve this issue, do the following:

    • Set the destination table’s create disposition to CREATE_NEVER. For more
      information, see Create disposition.

    • Use the Apache Beam SDK version 2.24.0 or higher. In the
      previous versions of the SDK, the CREATE_IF_NEEDED disposition
      calls the tables.get method to check if the table exists.

  • You can request a quota increase by contacting support or sales. For
    additional quota, see Request a quota increase.
    Requesting a quota increase might take several days to process. To provide more
    information for your request, we recommend including the priority of the job,
    the user running the query, and the affected method.
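
The retry-with-exponential-backoff suggestion above can be sketched like this. It is a generic sketch: is_rate_limit_error and the wrapped call are placeholders for your own client's error handling, not part of any Google library:

```python
import random
import time

def call_with_backoff(fn, max_retries=5, base_delay=1.0, max_delay=60.0):
    """Retry fn on rate-limit errors with exponential backoff and jitter."""
    for attempt in range(max_retries + 1):
        try:
            return fn()
        except Exception as err:
            if attempt == max_retries or not is_rate_limit_error(err):
                raise
            # Sleep base * 2^attempt, capped at max_delay, with random
            # jitter so concurrent clients do not retry in lockstep.
            delay = min(max_delay, base_delay * (2 ** attempt))
            time.sleep(delay * random.uniform(0.5, 1.5))

def is_rate_limit_error(err):
    # Placeholder: inspect your client's error for a 403/429 rate-limit
    # reason; this substring check is only for the sketch.
    return "rate" in str(err).lower()
```

Only rate-limit errors are retried; any other exception propagates immediately so real bugs are not hidden behind retries.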

Your project exceeded quota for free query bytes scanned

BigQuery returns this error when you run a query in the free
usage tier and the account reaches the monthly limit of data size that can be
queried. For more information about Queries (analysis), see Free
usage tier.

Error message

Your project exceeded quota for free query bytes scanned

Resolution

To continue using BigQuery, you need to upgrade the account to a
paid Cloud Billing account.

Maximum tabledata.list bytes per second per project quota errors

BigQuery returns this error when the project number mentioned
in the error message reaches the maximum size of data that can be read through
the tabledata.list API call in a project per second. For more information, see
Maximum tabledata.list bytes per minute.

Error message

Your project:[project number] exceeded quota for tabledata.list bytes per second per project

Resolution

To resolve this error, do the following:

  • In general, we recommend trying to stay below this limit, for example by
    spacing out requests over a longer period with delays. If the error doesn’t
    happen frequently, implementing retries with exponential backoff
    solves this issue.
  • If your use case expects fast and frequent reading of large amounts of data from
    a table, we recommend using the BigQuery Storage Read API
    instead of the tabledata.list API.
  • If the preceding suggestions do not work, you can request a quota increase from
    Google Cloud console API dashboard by doing the following:

    1. Go to the Google Cloud console API dashboard.
    2. In the dashboard, filter for Quota: Tabledata list bytes per minute (default quota).
    3. Select the quota and follow the instruction in Requesting higher quota limit.

    It might take several days to review and process the request.
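
"Spacing out requests" can be made concrete with a small byte-budget pacer: after each tabledata.list page, account for the bytes read and sleep once the per-window budget is spent. This is an illustrative sketch only; the class and the budget figures are invented, not part of any Google client library:

```python
import time

class BytePacer:
    """Pace reads so at most budget_bytes are consumed per window_s seconds."""

    def __init__(self, budget_bytes, window_s=1.0):
        self.budget = budget_bytes
        self.window = window_s
        self.used = 0
        self.window_start = time.monotonic()

    def account(self, nbytes):
        now = time.monotonic()
        if now - self.window_start >= self.window:
            # The window has rolled over on its own; reset the budget.
            self.window_start = now
            self.used = 0
        self.used += nbytes
        if self.used > self.budget:
            # Budget exhausted: sleep until the current window ends.
            time.sleep(self.window_start + self.window - now)
            self.window_start = time.monotonic()
            self.used = nbytes
```

You would call account(len(page_bytes)) after each page read, with a budget just under the documented per-minute limit.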

Maximum number of copy jobs per day per project quota errors

BigQuery returns this error when the number of copy jobs running
in a project has exceeded the daily limit.
To learn more about the limit for Copy jobs per day, see Copy jobs.

Error message

Your project exceeded quota for copies per project

Diagnosis

If you’d like to gather more data about where the copy jobs are coming from,
you can try the following:

  • If your copy jobs are located in a single region or only a few regions, you can try
    querying the INFORMATION_SCHEMA.JOBS
    table for those specific regions. For example:

    SELECT
      creation_time, job_id, user_email,
      destination_table.project_id, destination_table.dataset_id, destination_table.table_id
    FROM `PROJECT_ID`.`REGION_NAME`.INFORMATION_SCHEMA.JOBS
    WHERE
      creation_time BETWEEN TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 2 DAY) AND CURRENT_TIMESTAMP()
      AND job_type = "COPY"
    ORDER BY creation_time DESC
    

    The REGION_NAME part should be replaced with the region name,
    including the region- prefix: for example, region-us
    or region-asia-south1. You can also adjust the time interval
    depending on the time range you’re interested in.

  • To see all copy jobs in all regions, you can use the following filter in
    Cloud Logging:

    resource.type="bigquery_resource"
    protoPayload.methodName="jobservice.insert"
    protoPayload.serviceData.jobInsertRequest.resource.jobConfiguration.tableCopy:*
    

Resolution

  • If the goal of the frequent copy operations is to create a snapshot of data,
    consider using table snapshots
    instead. Table snapshots are a cheaper and faster alternative to copying full tables.
  • You can request a quota increase by contacting support or
    sales. It might take several days to review and
    process the request. We recommend stating the priority, use case, and the
    project ID in the request.

Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.

Last updated 2023-02-07 UTC.


I’m building a web application on top of the Google Drive API. Basically, the web application displays photos and videos. The media is stored in a Google Drive folder: once authenticated, the application makes requests to the Google Drive API to get a URL for each piece of media and displays them. For the moment, I have only 16 images to display. These images are hard-coded in the application (for the demo).

I have encountered an issue with my application accessing the Google Drive API. After multiple tries, I get this error for random requests:

User Rate Limit Exceeded. Rate of requests for user exceed configured project quota.
You may consider re-evaluating expected per-user traffic to the API and
adjust project quota limits accordingly.
You may monitor aggregate quota usage and adjust limits in the API Console:
https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?project=XXXXXXX

So I looked at the API Console and saw nothing special; as far as I can tell, I don’t exceed the rate limit. Maybe I’m using the Google API incorrectly; I honestly don’t know…

the bar on the right is my last try: 32 queries

I followed the Google Drive API documentation to check whether I did something wrong. Each API request contains the access token, so it should work correctly!

A demonstration of the app is available: https://poc-drive-api.firebaseapp.com

The source code is also available: https://github.com/Mcdostone/poc-google-drive-api (file App.js)

asked Dec 6, 2018 at 13:47 by Yann PRONO

403: User Rate Limit Exceeded is flood protection: a user can only make so many requests at a time. Unfortunately, the user rate limit is not shown in the graph you are looking at; that graph is actually really bad at showing what is truly happening. Google checks in the background and returns the error if you are exceeding your limit; they are not required to actually show that in the graph.

403: User Rate Limit Exceeded

The per-user limit has been reached. This may be the limit from the Developer Console or a limit from the Drive backend.

{
  "error": {
    "errors": [
      {
        "domain": "usageLimits",
        "reason": "userRateLimitExceeded",
        "message": "User Rate Limit Exceeded"
      }
    ],
    "code": 403,
    "message": "User Rate Limit Exceeded"
  }
}

Suggested actions:

  • Raise the per-user quota in the Developer Console project.
  • If one user is making a lot of requests on behalf of many users of a G Suite domain, consider a Service Account with authority delegation (setting the quotaUser parameter).
  • Use exponential backoff.

IMO, the main thing to do when you begin to encounter this error message is to implement exponential backoff; this way your application will be able to slow down and make the request again.
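
Client-side throttling works alongside backoff: a minimal throttle that enforces a minimum gap between requests can be sketched in Python (the Drive service call in the usage comment is hypothetical; only the pacing logic is shown):

```python
import time

class Throttle:
    """Enforce a minimum interval between successive calls."""

    def __init__(self, min_interval=0.11):
        self.min_interval = min_interval
        self._last = None

    def wait(self):
        """Block until at least min_interval has passed since the last call."""
        if self._last is not None:
            sleep_for = self._last + self.min_interval - time.monotonic()
            if sleep_for > 0:
                time.sleep(sleep_for)
        self._last = time.monotonic()

# Usage sketch (the Drive client call is hypothetical):
# throttle = Throttle(min_interval=0.11)
# for file_id in file_ids:
#     throttle.wait()
#     service.files().get(fileId=file_id).execute()
```

The 0.11 s default assumes the commonly cited 1,000-requests-per-100-seconds quota with a small safety margin.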

answered Dec 6, 2018 at 14:08 by Linda Lawton - DaImTo
In my case, I was recursing through Google Drive folders in parallel and getting this error. I solved the problem by implementing client-side rate limiting using the Bottleneck library with a 110ms delay between requests:

const limiter = new Bottleneck({
    // Google allows 1000 requests per 100 seconds per user,
    // which is 100ms per request on average. Adding a delay
    // of 100ms still triggers "rate limit exceeded" errors,
    // so going with 110ms.
    minTime: 110,
});

// Wrap every API request with the rate limiter
await limiter.schedule(() => drive.files.list({
    // Params...
}));

answered Nov 3, 2019 at 1:19 by Sam

I was using the limiter library to enforce the "1,000 queries per 100 seconds" limit, but I was still getting 403 errors. I finally stumbled upon this page where it mentions that:

In the API Console, there is a similar quota referred to as Requests per 100 seconds per user. By default, it is set to 100 requests per 100 seconds per user and can be adjusted to a maximum value of 1,000. But the number of requests to the API is restricted to a maximum of 10 requests per second per user.

So I updated the limiter library to only allow 10 requests every second instead of 1,000 every 100 seconds and it worked like a charm.

const RateLimiter = require('limiter').RateLimiter;

// 10 requests per 1,000 ms, i.e. 10 requests per second
const limiter = new RateLimiter(10, 1000);

answered Dec 4, 2019 at 22:55 by Alex

You can use this zero-dependency library I’ve created, called rate-limited-queue, to limit the execution rate of tasks in a queue.

Limiting 10 requests per second can be achieved like so:

const createQueue = require("rate-limited-queue");

const queue = createQueue(
  1000 /* time based sliding window */,
  10 /* max concurrent tasks in the sliding window */);

const results = await queue([
  () => { /* a request code goes here */ },
  () => { /* another request code goes here */ }
  // ...
]);

answered Aug 26, 2021 at 13:51 by Arik

Error number: Error 403
Error name: Rate-limit exceeded
Error description: Google Error 403. Rate-limit exceeded.
Developer: Google Inc.
Software: Picasa
Applies to: Windows XP, Vista, 7, 8, 10, 11

People usually refer to "Rate-limit exceeded" as a runtime error (bug). Developers such as Google Inc. typically put software like Picasa through several checkpoints before release. Errors such as error 403 sometimes slip through these checks, leaving the problem unresolved in the software.

Error 403, seen as "Google Error 403. Rate-limit exceeded.", can occur for Picasa users during normal use of the program. When the error appears, users can notify the developer of error 403 through error reports. Google Inc. then fixes the defective code entries and makes an update available for download. If there is an update notification for Picasa, installing it may resolve problems such as error 403 and other detected issues.

Why does runtime error 403 occur?

"Rate-limit exceeded" most often occurs while Picasa is loading. Let's look at the main causes of error 403:

Error 403 crash: a very common form of error 403 that terminates the whole program. It typically happens when the product (Picasa) or the computer cannot handle unique input data.

"Rate-limit exceeded" memory leak: an error 403 memory leak causes Picasa to use more and more memory, bogging down the system. A potential factor is Google Inc.'s code, if the error prevents the program from terminating.

Error 403 logic error: a logic error occurs when Picasa produces incorrect output from correct input. Faulty Google Inc. source code can lead to these input-handling problems.

Such Rate-limit exceeded problems are usually caused by corruption of a file associated with Picasa or, in some cases, by its accidental or deliberate deletion. As a rule, the problem can be solved by replacing the Google Inc. file. As a last resort, we recommend using a registry cleaner to repair all invalid Rate-limit exceeded entries, Google Inc. file extensions, and other file-path references that may be causing the error message.

Rate-limit exceeded errors

Rate-limit exceeded problems related to Picasa:

  • "Rate-limit exceeded error."
  • "Rate-limit exceeded is not a Win32 program."
  • "Rate-limit exceeded must be closed."
  • "Unfortunately, we can't find Rate-limit exceeded."
  • "Rate-limit exceeded cannot be found."
  • "Application launch error: Rate-limit exceeded."
  • "Unable to run Rate-limit exceeded."
  • "Rate-limit exceeded has stopped."
  • "Invalid application path: Rate-limit exceeded."

Rate-limit exceeded EXE errors occur during Picasa installation, while running applications associated with Rate-limit exceeded (Picasa), during startup or shutdown, or during Windows installation. Noting when Rate-limit exceeded errors occur is paramount for finding the cause of the Picasa problems and reporting them to Google Inc. for help.

Sources of Rate-limit exceeded difficulties

Rate-limit exceeded problems can be attributed to corrupted or missing files, invalid registry entries associated with Rate-limit exceeded, or a virus/malware infection.

In particular, Rate-limit exceeded problems are caused by:

  • An invalid Rate-limit exceeded entry or a corrupted registry key.
  • Malware that has infected Rate-limit exceeded, causing corruption.
  • Another program (unrelated to Picasa) that deleted Rate-limit exceeded maliciously or by mistake.
  • Another software application conflicting with Rate-limit exceeded.
  • An incomplete or corrupted Picasa (Rate-limit exceeded) download or installation.


If you are getting googleapi User Rate Limit Exceeded or gdrive 403 Rate Limit Exceeded errors, we have a solution for you.

We have been using Gdrive to upload some of our essential files for many months. Recently, we noticed that our daily backup was not working as expected. The Gdrive error logs showed Failed to get file: googleapi: Error 403: Rate Limit Exceeded, rateLimitExceeded and Failed to get file: googleapi: Error 404: File not found: Failed., notFound errors.

We tried rebooting our servers, refreshing our auth logins, and so on; none of it fixed our gdrive User Rate Limit Exceeded errors. The problem was API-related, so we needed to create a new API project and build Gdrive from source. The solution is simple but takes time if you have to figure it out yourself. It took us a couple of hours, but it will take only minutes of your time; if you follow this guide, you will solve your google drive 403 Rate Limit Exceeded error.

PS: We applied all the steps on our CentOS server, but it will be the same on all platforms.

Part 1

Carefully follow the steps to fix google drive User Rate Limit Exceeded Error.

Downloading and Installing GO

You will need root privileges or sudo for ubuntu.

Download the files :

To download the Go binary on your linux server you can use wget or curl:

wget https://dl.google.com/go/go1.11.5.linux-amd64.tar.gz

You need to extract the Go binary files from go1.11.5.linux-amd64.tar.gz. After successful extraction, you will have a folder named go. Move it to /usr/local, as recommended by the publisher.

tar -xzf go1.11.5.linux-amd64.tar.gz
mv go /usr/local

Creating Workspace Folder for Go

For better-organized projects, create a projects folder with bin and src folders inside it, in the user home directory.

mkdir -p ~/projects/{bin,src}

Setting Environment Variables for Go

We need to set the $PATH environment variable for Go, so we can use it like any other command on our UNIX system.

Create a path.sh script in the /etc/profile.d directory:

nano /etc/profile.d/path.sh

Add the following line to the file, then save and exit (/etc/profile.d/path.sh):

export PATH=$PATH:/usr/local/go/bin

Additionally, we need to define the GOPATH and GOBIN Go environment variables in the user’s .bash_profile file to point to the recently created projects folder. GOPATH holds our Go source files; GOBIN holds our compiled Go binaries. Open the .bash_profile file:

nano ~/.bash_profile

Append the following to the end of the file, save and exit: (~/.bash_profile)

export GOBIN="$HOME/projects/bin"
export GOPATH="$HOME/projects/src"

To apply the changes to our system, we need to reload the profiles with the source command:

source /etc/profile && source ~/.bash_profile

Let’s test whether Go is working:

$ go version
go version go1.11.5 linux/amd64

We needed Go on our system to compile Gdrive, so that’s all for installing Go. We can continue to Part 2, where we will compile Gdrive from source.

Part 2

We will continue solving the googleapi 403 Rate Limit Exceeded error. Keep following the steps…

Creating Google API for Gdrive

If you see these errors while running Gdrive on your system:

Failed to get file: googleapi: Error 403: Rate Limit Exceeded, rateLimitExceeded

You need your own Google Drive API project to use with Gdrive, so you can get information about your own usage. Google APIs provide quota information, which is very helpful in our situation: we need to know whether we exceed our limits.

Visit https://console.developers.google.com/apis/dashboard

At the top of the page, click Select a project, then New Project.

Enter a project name of your choice.

Google API New Project

Choose your newly created project at the top of the page. On the dashboard, click ENABLE APIS AND SERVICES.

It will redirect you to the API Library. Search for the Drive keyword and find Google Drive API. You need to enable the Google Drive API to use it in your project.

Add and Enable Google Drive API

We have successfully added the Google Drive API to our project. Gdrive requires Google Drive API credentials, so let’s create some.

Google Drive API Credentials

Go to the credentials details and choose the option "Help me choose."

Google API Credentials

Choose the settings as I did:

Google API Credentials Settings

I set the client name to be the same as my API name.

Google API Name

In the next step, enter your email address and write a product name.

Google API OAuth Name

After that, it will give us the credentials we need.

Google Drive Error 403 User Rate Limit Exceeded Solution

Click download and done.

It will download a JSON file which contains our credentials for Gdrive. Open the client_id.json file with a text editor; Notepad++ is a good option.

You will see,

"client_id": "205xxxxxxxxx-22imoxxxxxxxxxxxxxxxxpsm.apps.googleusercontent.com"

"client_secret": "NxxxxxxG-4HxxxxxxxxxxxxxxxwZA"

We need those two values, so note them down.

Getting Source Files of Gdrive

We need the Gdrive project files from GitHub so that we can compile them with Go. Let’s download the files into the ~/projects/ folder that we created earlier.

cd ~/projects/

Use Go to download the Gdrive source files from GitHub:

go get github.com/prasmussen/gdrive

We need to change the credentials in handlers_drive.go, which is located in the gdrive folder:

cd ~/projects/src/src/github.com/prasmussen/gdrive/

nano handlers_drive.go

const ClientId = "3671xxxxxxxxxxxxxxxxxxxxxxeg.apps.googleusercontent.com"
const ClientSecret = "1qsNxxxxxxxxhoO"

No More googleapi User Rate Limit Exceeded

Change ClientId and ClientSecret to your own Google Drive API credentials from client_id.json.

Save and exit.

We are ready to build Gdrive.

Let’s build it:

cd ~/projects/src/src/github.com/prasmussen/gdrive/
go build

After the build, you will see a gdrive executable file. Copy it to the /usr/bin/ folder to use it:

cp gdrive /usr/bin/gdrive

Note: If you already had gdrive on your system, you need to delete the old token_v2.json:

cd ~/.gdrive

rm token_v2.json

Now, we have gdrive installed in our system with our Google Drive API settings.

Let’s test it.

gdrive list

If this is your first time using Gdrive, or you deleted the token file, Gdrive needs authentication from you:

Authentication needed
Go to the following url in your browser:

Enter verification code:

Open the URL in your browser and get the verification code.

Execute gdrive list again.

We Solved google drive 403 Rate Limit Exceeded

That’s it: it is working without errors!

Let’s check whether our API is working too.

Gdrive Quotas – gdrive User Rate Limit Exceeded

Yes! It is working too. Now we can see our quota limit.

For Windows, please check this: https://github.com/prasmussen/gdrive/issues/426#issue-404775200

If you have any questions, please leave a comment below. We will answer them ASAP.

Thanks.

TUTORIAL: How to get rid of 403 Errors #426

Open · TanukiAI opened this issue Jan 30, 2019 · 63 comments

@TanukiAI

How to get rid of 403 errors

Many people get errors like
Failed to get file: googleapi: Error 403: Rate Limit Exceeded, rateLimitExceeded
and thanks to LINKIWI (#392 (comment)) I found a way to fix it properly.

What causes the problem?

The developer shipped one API project for this program, and a Google API project can "only" make 10 million requests per day. So many people use this program that the shared quota fills up.

How much does an own API cost?

Nothing, Google made them free for everyone.

How do I fix it then?

You have to create your own API project, download the Go programming language, change the API credentials to your own, and compile the program to an executable.
(By the way, editing the binary directly resulted in errors.)

And now step by step:

  1. Download Go for your platform from https://golang.org/dl/ and install it following the instructions for your platform, then download Git from https://git-scm.com/downloads and install it with the standard settings.

  2. Download the repository as a .zip from GitHub and unzip it into a new folder, or simply execute
    git clone https://github.com/gdrive-org/gdrive.git

  3. Log into your Google Account (or create one) and go to the Google Developer API Website (and if needed accept the ToS)

  4. Click on «Create Project»
    image

  5. and then click «CREATE»
    image

  6. Give it a name (in my case «Google CLI TA40») and click «Create»
    image

  7. Now go to Google Drive API and click «ENABLE»
    image

  8. Now on the left side, click «Credentials»
    image

  9. And click «CREATE CREDENTIAL»
    image

  10. Fill the things like I did and then click on «What credentials do I need?»
    image

  11. Give it a name (I took the name of the API)
    image

  12. Next select your e-mail and then give it a name (I again took the name of the API)
    image

  13. Click on «Download» (A .json file will be downloaded)
    image

  14. Open the .json file in the editor of your choice.(Notepad++ prefered)
    You should see :

  • «client_id» which looks like this: 81915486XXXX-XXXX22bh62ql2rbnaqtpds82od4ql976.apps.googleusercontent.com
  • «client_secret» which are random characters like this: lnA7ZFg5NEGOMpFhd6e4Pqny
  1. In the unzipped repository open the file names «handlers_drive.go» (via Notepad++) and change these 2 variables to the ones you got in step 14
    image
    and save it

  16. Open CMD/terminal and go to the folder where «handlers_drive.go» is.
    First type this: go get github.com/prasmussen/gdrive (thanks to mbenlioglu TUTORIAL: How to get rid of 403 Errors #426 (comment))
    Now type this: go build -ldflags '-w -s'
    Now you should have an executable for your platform which you can use normally

You can reset your data by deleting %APPDATA%\.gdrive on Windows, or $HOME/.gdrive on other platforms
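For reference, the two variables changed in step 15 are plain Go constants in handlers_drive.go, so the edit is just swapping the string literals. A minimal sketch (the values below are placeholders, not real credentials):

```go
package main

import "fmt"

// The two identifiers edited in step 15 are ordinary Go constants
// in handlers_drive.go (placeholder values, not real credentials).
const ClientId = "81915486XXXX-XXXXplaceholder.apps.googleusercontent.com"

const ClientSecret = "placeholderSecretValue0"

func main() {
	// After rebuilding, the binary carries your own credentials.
	fmt.Println("building with client id:", ClientId)
}
```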

Cross-compiling a Linux/OS X/Windows etc. version of gdrive from your device:

If you want to compile the binary for an OS other than the one you’re using, you should first set the GOOS and GOARCH environment variables for that system.
Look up your target OS here: https://golang.org/doc/install/source#environment

Now do this in terminal:
On Windows:

SET GOOS=your os
SET GOARCH=your arch

On Mac OS/Linux:

export GOOS=your os
export GOARCH=your arch

then do go build -ldflags '-w -s'

Example variables if target platform is a 64 bit Linux machine:
GOOS=linux
GOARCH=amd64


Edit: Updated to cover all platforms


@mbenlioglu

Thanks for the tutorial. Closes #424 and #392. Should be included in the repo imo

Also I think Client ID/Secret should be read from client_id.json directly instead of hard-coding and simply replacing that file would solve it. I might create a pull request but not sure if the repo is still maintained.
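The idea of reading the Client ID/Secret from client_id.json instead of hard-coding could look roughly like this — a sketch only, assuming the downloaded file uses the «installed» wrapper Google produces for desktop-app OAuth clients (the helper name is made up, not gdrive code):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// clientFile mirrors the shape of a downloaded client_id.json
// for an "installed" (desktop) OAuth client.
type clientFile struct {
	Installed struct {
		ClientID     string `json:"client_id"`
		ClientSecret string `json:"client_secret"`
	} `json:"installed"`
}

// parseClientFile extracts the client id and secret from raw JSON.
func parseClientFile(data []byte) (id, secret string, err error) {
	var cf clientFile
	if err := json.Unmarshal(data, &cf); err != nil {
		return "", "", err
	}
	return cf.Installed.ClientID, cf.Installed.ClientSecret, nil
}

func main() {
	raw := []byte(`{"installed":{"client_id":"abc.apps.googleusercontent.com","client_secret":"s3cret"}}`)
	id, secret, err := parseClientFile(raw)
	if err != nil {
		panic(err)
	}
	fmt.Println(id, secret)
}
```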

@TanukiAI

Last update was a year ago, so yeah, I don’t think so. But this step-by-step tutorial should be fine, if anything’s missing, let me know

@scrumbee

TA40, isn’t it possible to compile a Linux amd64 version of this even if you’re doing it with golang installed on Windows? I tried to follow your steps and everything is good so far; I just need to figure out how to compile it for Ubuntu 64-bit, so amd64. I’m trying to do this from Windows right now. If you know whether this is possible and how, maybe let me know and add it to the guide?

@TanukiAI

I don’t know how to compile it for Linux etc.
I can test it if you give me some commands, but I don’t know how to cross-compile on a Windows machine

@scrumbee

Yeah, I’ve been trying to google it myself, but it might be better to install Ubuntu in VMware and go from there. I’ve never done much compiling, so I don’t really know what I’m doing. I tried to follow your steps for the Windows compile, but it’s giving me errors like this at the moment:

C:\Users\zibiago\gdrive-master>go build
handlers_drive.go:12:2: cannot find package "github.com/prasmussen/gdrive/auth" in any of:
C:\Go\src\github.com\prasmussen\gdrive\auth (from $GOROOT)
C:\Users\zibiago\src\github.com\prasmussen\gdrive\auth (from $GOPATH)
gdrive.go:7:2: cannot find package "github.com/prasmussen/gdrive/cli" in any of:
C:\Go\src\github.com\prasmussen\gdrive\cli (from $GOROOT)
C:\Users\zibiago\src\github.com\prasmussen\gdrive\cli (from $GOPATH)
compare.go:5:2: cannot find package "github.com/prasmussen/gdrive/drive" in any of:
C:\Go\src\github.com\prasmussen\gdrive\drive (from $GOROOT)
C:\Users\zibiago\src\github.com\prasmussen\gdrive\drive (from $GOPATH)

@mbenlioglu

After you’ve changed the credentials, running _release/build-all.sh on Linux (or bash on Windows) should build all versions defined in the «PLATFORMS» variable.

Two prerequisites:
    1. Install Go first.
    2. Execute $ go get github.com/prasmussen/gdrive

@mbenlioglu

Tried to follow your steps for the Windows compile but it’s giving me errors like this: go build → cannot find package «github.com/prasmussen/gdrive/auth» (and cli, drive) in either $GOROOT or $GOPATH

execute go get github.com/prasmussen/gdrive first, then you should be able to build

@TanukiAI

Okay, I figured it out.
On Windows you can do SET GOOS=linux and SET GOARCH=amd64 and then run go build
This will create a Linux executable file
You can look for your OS here: https://golang.org/doc/install/source#environment

execute go get github.com/prasmussen/gdrive first you should be able to build then

will add this one to the guide, thanks

@scrumbee

Will give this a try and get back to you asap.

@scrumbee

Okay, so based on all this info I managed to compile a working version for my Ubuntu server. I have tested it and confirmed it’s using the new API info I put in, and no more 403 errors. Thanks for putting this together, TA40. But if someone knows how to code this, maybe the next step would be to move the Google API credentials to a config file rather than having them hard-coded like this. That would be ideal for regular users, so they don’t have to compile it themselves.

@TanukiAI

Nice, good to know that it’s working. Anything to add to the tutorial where you had problems or so?
And well, thanks to LINKIWI for finding this out #392 (comment)

@scrumbee

Well, nothing big, but I installed golang first and then Git (newly reinstalled Windows, so I didn’t have it), and you will need to restart your computer for Git to work with golang, so maybe add that. But besides that, everything was straightforward and painless.

@TanukiAI

I think for that you just had to restart your command line ^^

@scrumbee

Yeah, I might have forgotten to do that after I installed Git; I know I did it after I installed golang. But it’s so lovely to see it upload without any errors. Are you interested in maybe working on this some more? I’m no code expert, but the idea would be to remove the hard-coded API bits and put them into a config file instead.

@TanukiAI

I myself have no idea how to do this because I don’t know how golang works^^

@scrumbee

Yeah, I read up on it a little, but I’m in the same boat, so it would just be trial and error. When I have some time to spend I might test a few things and see if I can figure something out. At least this post is here, and hopefully those who need it will find it. And by all means, if anyone doesn’t know how to get this done, I’d be more than willing to help out and compile it for someone.

@rohfle

For the lazy people who don’t want to recompile (myself included) — How to change the API credentials in gdrive binary without recompile on Linux

NOTE — This ONLY works if the client_id and client_secret are both the same length as the ones defined originally in the gdrive binary. If they are not these lengths, this method will not work and you will have to recompile from source.


STEP 1
Generate API credentials for Google Drive (see TA40’s OP here — follow steps 3-14)

STEP 2
Get the client_id and client_secret from your client_id.json file. The client_id MUST be 72 chars long and the client_secret MUST be 24 chars long. If they are not these lengths, following this method will probably break gdrive, and you will have to recompile gdrive from source instead.

Use your favourite method — I used wc to get the length

echo -n **CLIENT_ID** | wc -c
echo -n **CLIENT_SECRET** | wc -c

STEP 3
Backup your gdrive binary
cp gdrive gdrive.old

STEP 4
Edit binary in-place with sed (Replace **CLIENT_ID** and **CLIENT_SECRET** with your newly generated details from client_id.json).

sed -i -e 's/367116221053-7n0vf5akeru7on6o2fjinrecpdoe99eg.apps.googleusercontent.com/**CLIENT_ID**/g' gdrive
sed -i -e 's/1qsNodXNaWq1mQuBjUjmvhoO/**CLIENT_SECRET**/g' gdrive

STEP 4a
Delete the old .gdrive/token_v2.json (Thanks leekung)

cd ~ && rm .gdrive/token_v2.json

STEP 5
Run gdrive list to confirm that it has worked. It will ask you to go through the OAuth login process again to generate a new token, but afterwards you should have a list of files in your Google Drive.

END HOWTO


Also I think Client ID/Secret should be read from client_id.json directly instead of hard-coding and simply replacing that file would solve it. I might create a pull request but not sure if the repo is still maintained.

This would be the best way forward. There is a note in the README about the state of maintenance though:

This tool is not being actively maintained at the moment, ymmv.


Edit 2019-03-25: There are reports that these steps do not work on FreeBSD (see this comment). I only tested this on Linux. If it’s not working for you, follow other people’s instructions and rebuild from source.
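The same-length constraint exists because sed is overwriting bytes inside a compiled binary: the string literals there are referenced by offset and length, so the replacement must occupy exactly the same number of bytes. A hedged Go sketch of that check-and-patch step (patchCredential is a made-up helper for illustration, not part of gdrive):

```go
package main

import (
	"bytes"
	"fmt"
)

// patchCredential replaces oldCred with newCred inside a binary image,
// refusing to touch it unless the lengths match exactly — a length
// change would shift every byte that follows the string.
func patchCredential(binary, oldCred, newCred []byte) ([]byte, error) {
	if len(oldCred) != len(newCred) {
		return nil, fmt.Errorf("length mismatch: old=%d new=%d", len(oldCred), len(newCred))
	}
	if !bytes.Contains(binary, oldCred) {
		return nil, fmt.Errorf("old credential not found in binary")
	}
	return bytes.ReplaceAll(binary, oldCred, newCred), nil
}

func main() {
	img := []byte(`...ClientSecret = "1qsNodXNaWq1mQuBjUjmvhoO"...`)
	patched, err := patchCredential(img,
		[]byte("1qsNodXNaWq1mQuBjUjmvhoO"),
		[]byte("abcdefghijklmnopqrstuvwx")) // same length: 24 bytes
	if err != nil {
		panic(err)
	}
	fmt.Println(string(patched))
}
```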

@scrumbee

Well, if you know how to code and fix it, I would be interested in this; even if you just make a pull request with the changes, I’ll be more than happy to compile it for all the different OSes. Assuming he actually checks this, he could just merge the new stuff. Maybe have the default config use the hard-coded Client ID/Secret so it will still work, but users can then easily change it as they wish, since it would be in a config file.

@rohfle

@mbenlioglu

@mbenlioglu @scrumbee I have submitted a pull request with working external client credential support — #428

Enjoy

Thanks for the effort @rohfle. Just checked it out looks nice

@scrumbee

Looks very good, thanks. Just a suggestion, not sure if it’s needed or not, but add an example client_id.json config file to the README, just to make it clear for everyone how it should look, as I’m sure there will be some who have questions regarding it.

@tripLr

Someone can fork this repo, like I did, and start maintaining it. I am willing, but I am new to this:
github.com/tripLr

@cnrting

I only created the credentials (did not rebuild yet), and everything already works fine; the 403 does not show up anymore

@Th0masT

@vashti

For the lazy people who don’t want to recompile (myself included) — How to change the API credentials in gdrive binary without recompile on Linux

You mighty king. This worked perfectly, and I’m back to my uploads after they failed for a week.

@sukawatd

Try re-creating a new token; maybe it solves your problem.

If you mean removing the current token: gdrive then asks me to get a token again by going to a web URL. I have done that a few times, still the same.

Let me know if you mean anything else; simply tell me the process if you don’t mind.

I had the same problem as you. I did the 2 things below:

  1. After I modified handlers_drive.go, I also copied this file into ~/go/src/github.com/prasmussen/gdrive/handlers_drive.go before building the package.
  2. I re-created a new token.

After that I opened the Google console to monitor the API requests.
Good luck.

@rakeshr

I am creating on Windows machine.. so what I need to do ?


@rakeshr

I have downloaded gdrive master files on computer and editing there only.

@sukawatd

I am creating on Windows machine.. so what I need to do ?


I also tested on Windows 10. It’s working properly.

  1. Install Git for Windows https://git-scm.com/download/win
  2. Install Go for Windows https://golang.org/doc/install?download=go1.12.1.windows-amd64.msi
  3. Set the Go environment variables in the advanced system settings
    • GOPATH = C:\Projects\Go
    • GOROOT = C:\Go
  4. Open Git Bash, then type these commands
    $ cd /c/Projects/Go
    $ git clone https://github.com/gdrive-org/gdrive.git
    $ go get github.com/prasmussen/gdrive
  5. Edit handlers_drive.go in the folder «C:/Projects/Go/gdrive/» (follow the instructions above), then also copy this file to «C:\Projects\Go\src\github.com\prasmussen\gdrive»
  6. In the Git Bash terminal run this command
    $ env GOOS=windows GOARCH=amd64 go build -ldflags '-w -s'
    After you run this command you will see gdrive.exe in the C:\Projects\Go\gdrive directory.

Good luck.
This link is very useful: https://medium.freecodecamp.org/setting-up-go-programming-language-on-windows-f02c8c14e2f

@TanukiAI

Whoever added the cross-compiling instructions for Linux and Windows, big thanks!

@rakeshr

6. env GOOS=windows GOARCH=amd64 go build -ldflags '-w -s'

Something is wrong, I am still getting the 403 error.. a couple of things I noticed about your steps:

[1] How do you set «GOPATH» and «GOROOT»? [I used SET GOPATH, is that right?]
[2] I did go get github.com/prasmussen/gdrive but it made no folder like «src» and so on in the «Go» folder.

To be clear, I am using a Windows machine to generate a gdrive binary to be used on a Linux server. I did set GOOS to linux and GOARCH to amd64.

@fredx181

Thanks a lot for this tutorial, gdrive works fine on Linux for me now !

Fred

@leekung

For the lazy people who don’t want to recompile (myself included) — How to change the API credentials in gdrive binary without recompile on Linux […]

Best for me, but after step 4 you need to delete the old .gdrive/token_v2.json:
cd ~ && rm .gdrive/token_v2.json

@avgcoderlife

For the lazy people who don’t want to recompile (myself included) — How to change the API credentials in gdrive binary without recompile on Linux […]

YOU ARE AWESOME !!

I am one of those lazy guys who does not want to build it manually :p Your detailed steps worked like a charm!

@fredx181

Get the client_id and client_secret from your client_id.json file. The client_id MUST be 72 chars long and the client_secret MUST be 24 chars long. If they are not these lengths, following this method will probably break gdrive, and you will have to recompile gdrive from source instead.

Use your favourite method — I used wc to get the length

echo -n CLIENT_ID | wc -c
echo -n CLIENT_SECRET | wc -c

For me the client_id is 73 chars; despite that, I tried to edit the binary with the sed commands.
But indeed, as mentioned in the how-to, it breaks gdrive (segmentation fault), so for me compiling was the way to go, and it now works fine.

@ordimans

Would it be possible to pass the API key as an argument, to avoid each person compiling their own executable?

@mbenlioglu

Would it be possible to pass the API key as an argument, to avoid each person compiling their own executable?

It’ll be one of the first issues to be addressed, since it’s making the program unstable. I currently have tight deadlines; I’ll inspect the PRs and reorganize the repo as soon as I catch a break.

@rohfle

@ordimans

Would it be possible to pass the API key as an argument, to avoid each person compiling their own executable?

Not on the command line, but as a file on disk; there is a currently unmerged pull request — #428

The pull request allows you to override credentials with a client_id.json file in the ~/.gdrive directory.

@Amendill

Try to re-create a new token may be it solve your problem.

If you mean.. remove current token..gDrive give me to get token again by going web URL then.. I have done that few times still same.
Let me know if you mean anything else simple tell me the process if you don’t mind.

I have same your problem. I’ve done 2 things below

  1. After I modified handlers_drive.go then I also copy this file into ~/go/src/github.com/prasmussen/gdrive/handlers_drive.go before build package.
  2. I’ve re-create new token.

After that I open google console to monitor the api request.
Good luck.

Thank you it worked for me !

@ppbrown

I made a patch that allows direct use of the credentials.json file.

#476

This is better than the prior pull request #428, because «credentials.json» is the exact filename used by Google when you generate new creds from a project. So everything becomes drag-and-drop with no fiddling around.
(edit: and, umm.. it’s a LOT smaller patch ;) )

@rohfle

I agree with @ppbrown — #476 is way simpler. The only thing better about #428 is that it loads and uses the entire credentials file as exported from Google API console. #476 loads the credentials, extracts the client id and secret, and then recreates the credentials json with constant values for login endpoints, scopes etc that are set inside gdrive.

Do we really care about this? Probably not unless these constant values or the format of the credentials file are changed by Google.

One other thing to note — when I download credentials from Google API Console, I get a randomly named file like client_secret_892342893429834_[...].json.

@michaeledi

A typo here:

open CMD/terminal and go to the folder where the «handerls_drive.go» is

should be «handlers_drive.go» lol

@marufshidiq

Hello everyone, I created a repository, https://github.com/marufshidiq/gdrive-cli-builder, to build this application with our own Google API credentials. The build process uses GitHub Actions, so we don’t need to set up anything on our computer. I have been using this to build my own app; I hope it can help anyone who wants to build this app more easily.

@avatar-lavventura

@rohfle It still asked me to enter the verification code (Enter verification code:). Should it still ask for it after we made all the changes in the source code and compiled it?

@tcel2

I wonder if someone could create a fork of this.
Then, the OP steps could be made 10 times.
Then, the executable would be generated 10 times.
Then, if running executable1 fails with 403 error, it would automatically try the executable2 and so on.
I think it is not too complicated to implement, just will require some effort to make it 10 times.

And if you are used to code/compile on this language, may be, it does not need to be 10 big executables, but just 10 small libraries that the code would internally switch to let the magic happen! :)

Yes, each one would require each user to allow the permissions, but that is not a big deal at all.

@mesutalturk

Hello, the application has not been working for the last month. What has changed? It gives the following error:

Failed getting oauth client: Failed to exchange auth code for token: oauth2: cannot fetch token: 400 Bad Request
Response: {
"error": "invalid_grant",
"error_description": "Bad Request"
}

@ppbrown

possibly google has changed the required syntax on their side


@mesutalturk

@ppbrown how do we fix this. developer no longer supports :(

@leekung

@ppbrown how do we fix this. developer no longer supports :(

I think #570 fixes the 403 issue. I am using it without problems.

Contents

  • 1 Google Analytics Error rate limiting
    • 1.1 Management API change
    • 1.2 What are the 500 and 503 errors?
      • 1.2.1 500 – internalError
      • 1.2.2 500 – Backend Error
      • 1.2.3 503 – Backend Error
      • 1.2.4 403: User Rate Limit Exceeded
    • 1.3 What to do when you encounter those errors.
    • 1.4 So what does this change mean for me?
      • 1.4.1 Documentation bug

So what is the error rate limit? The error rate limit is a cap on the number of errors you can receive from the Google API before the system blocks you. The main point of this, I have been told, is to prevent people from abusing the system and to encourage developers to fix their code.

Management API change

Yesterday there was a change made to the Google Analytics Management API. You can read the change log here.

Google Analytics Management API Changelog search for Release 2016-02-25 (February 25, 2016)

Error rate limiting

It has always been our policy that developers should implement exponential backoff, when handling 500 or 503 responses. Today we are introducing a rate limit on 500 and 503 errors to enforce this policy.

  • 50 failed write requests per hour.

Thu Feb 25 2016

What are the 500 and 503 errors?

When we make a request to any of the Google APIs (not just Google Analytics), there are some standard errors that the request can result in. There are quite a few of them, but let’s just look at 500 and 503.

500 – internalError

{  
   "code":500,
   "errors":[  
      {  
         "domain":"global",
         "message":"There was an internal error.",
         "reason":"internalError"
      }
   ],
   "message":"There was an internal error."
}

500 – Backend Error

{  
   "code":500,
   "errors":[  
      {  
         "domain":"global",
         "message":"Backend Error",
         "reason":"backendError"
      }
   ],
   "message":"Backend Error"
}

503 – Backend Error

{  
   "code":503,
   "errors":[  
      {  
         "domain":"global",
         "message":"Backend Error",
         "reason":"backendError"
      }
   ],
   "message":"Backend Error"
}

That is what the JSON looks like if you encounter one of those errors.

403: User Rate Limit Exceeded

There is one more that I am aware of.

{
 "error": {
  "errors": [
   {
    "domain": "usageLimits",
    "reason": "userRateLimitExceeded",
    "message": "User Rate Limit Exceeded"
   }
  ],
  "code": 403,
  "message": "User Rate Limit Exceeded"
 }
}
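Since the retry decision hinges on the reason field inside that envelope, here is a hedged Go sketch of classifying a response body. The struct mirrors the 403 example just shown (note the 500/503 bodies earlier in this post use a slightly different envelope without the outer "error" wrapper, so they would need separate handling):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// apiError mirrors the 403 error envelope shown above.
type apiError struct {
	Error struct {
		Code   int `json:"code"`
		Errors []struct {
			Domain string `json:"domain"`
			Reason string `json:"reason"`
		} `json:"errors"`
	} `json:"error"`
}

// isRetryable reports whether a response body describes an error worth
// retrying with backoff: a 5xx code, or a 403 rate-limit reason.
func isRetryable(body []byte) bool {
	var e apiError
	if err := json.Unmarshal(body, &e); err != nil {
		return false
	}
	if e.Error.Code >= 500 {
		return true
	}
	for _, item := range e.Error.Errors {
		if item.Reason == "userRateLimitExceeded" || item.Reason == "rateLimitExceeded" {
			return true
		}
	}
	return false
}

func main() {
	body := []byte(`{"error":{"errors":[{"domain":"usageLimits","reason":"userRateLimitExceeded","message":"User Rate Limit Exceeded"}],"code":403,"message":"User Rate Limit Exceeded"}}`)
	fmt.Println(isRetryable(body)) // the rate-limit reason makes this retryable
}
```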

What to do when you encounter those errors.

If you ever encounter a 500, 503 or 403, you should simply try the request again; however, it’s not quite that simple. You need to slow down, your code is running too fast. When it comes to the Google Analytics API we are allowed 10 requests per second per user. Remember, a user is denoted by either userIp or quotaUser.
I normally run my code up to 6 times and add half a second of extra sleep between each iteration of the loop.

  • sleep half a second try again
  • sleep a second try again
  • sleep a second and a half try again
  • sleep two seconds try again
  • sleep two and a half seconds try again
  • sleep three seconds try again

If it hasn’t been able to return the results after that, I fail the application. This rarely happens; most of the time, in my experience, it works after the second or third sleep. However, I like to add a few extra in there to be sure it doesn’t cause a problem. Also, Google recommends retrying up to six times, so it is probably a good idea to follow their recommendation.

So what does this change mean for me?

Well, that is a really good question and why I am posting this here. It is unclear to me what they mean by

Error rate limiting 50 failed write requests per hour.

  • Does this mean that they are going to block all of your requests for an hour if you hit this?
  • Is it user specific? If I have an application with 60 users and they each hit it once, will my application be blocked?

I am going to send an email off to the Google Analytics API developers and see if I can’t get us a little clarification on this. I will update this post when I hear back from them.

Documentation bug

Also, there is a slight issue with the documentation. I have always gone by the documentation. If you check here, it says:

  • 500 internalServerError Unexpected internal server error occurred. Do not retry this query more than once.
  • 503 backendError Server returned an error. Do not retry this query more than once.

Only try it once?

However, if you scroll down to here, it says:

A 500 or 503 error might result during heavy load or for larger, more complex requests. For larger requests consider requesting data for a shorter time period. Also consider implementing exponential backoff. The frequency of these errors can be dependent on the view (profile) and the amount of reporting data associated with that view; a query that causes a 500 or 503 error for one view (profile) will not necessarily cause an error for the same query with a different view (profile).

I am going to report that little bug as well. They need to make up their mind: try it once, or more than once.


If you are getting googleapi User Rate Limit Exceeded or gdrive 403 Rate Limit Exceeded, we have a solution for you.

We have been using gdrive to upload some of our essential files for many months. Recently, we noticed that our daily backup was not working as expected. The gdrive error logs showed us Failed to get file: googleapi: Error 403: Rate Limit Exceeded, rateLimitExceeded and Failed to get file: googleapi: Error 404: File not found: Failed., notFound errors.

We tried rebooting our servers, refreshing our auth logins, etc.; none of it fixed our gdrive User Rate Limit Exceeded errors. The problem was API related, so we needed to create a new API project and build gdrive from source. The solution is simple but takes time if you are figuring it out on your own. It took us a couple of hours, but it should take only minutes of your time; if you follow this guide you will solve your Google Drive 403 Rate Limit Exceeded error.

PS: We applied all the steps on our CentOS server, but it will be the same on all platforms.

Part 1

Carefully follow these steps to fix the Google Drive User Rate Limit Exceeded error.

Downloading and Installing GO

You will need root privileges, or sudo on Ubuntu.

Download the files :

To download the Go binary on your Linux server you can use wget or curl:

wget https://dl.google.com/go/go1.11.5.linux-amd64.tar.gz

You need to extract the Go binary files from go1.11.5.linux-amd64.tar.gz. After successful extraction you will have a folder named go. You should move it to /usr/local, as recommended by the publishers.

tar -xzf go1.11.5.linux-amd64.tar.gz
mv go /usr/local

Creating Workspace Folder for Go

For better-organized projects, create a projects folder with bin and src folders together in the user home directory.

mkdir -p ~/projects/{bin,src}

Setting Environment Variables for Go

We need to add Go to the $PATH environment variable to use it like any other command in our UNIX system.

Create a path.sh script in the /etc/profile.d directory:

nano /etc/profile.d/path.sh

Add the following to the file, save and exit. (/etc/profile.d/path.sh)

export PATH=$PATH:/usr/local/go/bin

Additionally, we need to define the GOPATH and GOBIN Go environment variables in the user’s .bash_profile file to point to the recently created projects folder. GOPATH is where our Go source files live; GOBIN is where our compiled Go binaries go. Open the .bash_profile file:

nano ~/.bash_profile

Append the following to the end of the file, save and exit: (~/.bash_profile)

export GOBIN="$HOME/projects/bin"
export GOPATH="$HOME/projects/src"

Apply to changes in our system; we need to update profiles with source command

source /etc/profile && source ~/.bash_profile

Let’s test whether Go is working:

# go version
go version go1.11.5 linux/amd64

We needed Go in our system to compile gdrive, so that’s all for installing Go. We can continue to Part 2, where we will compile gdrive from the source files.

Part 2

We will continue to solve googleapi 403 Rate Limit Exceeded error. Keep following the steps…

Creating Google API for Gdrive

If you see these errors while running Gdrive on your system:

Failed to get file: googleapi: Error 403: Rate Limit Exceeded, rateLimitExceeded

You need your own Google Drive API credentials to use with gdrive so you can get information about your usage. The Google API console provides quota information, which is very helpful in our situation: we need to know if we exceed our limits.

Visit https://console.developers.google.com/apis/dashboard

At the top of the page, click Select a project, then New Project.

Enter any Project name you like.

Google API New Project

Choose your newly created project at the top of the page. On the dashboard, click ENABLE APIS AND SERVICES.

It will redirect you to the API Library. Search for the keyword Drive and find Google Drive API. You need to enable the Google Drive API to use it in your project.

Add and Enable Google Drive API

We have successfully added the Google Drive API to our project. Gdrive requires Google Drive API credentials, so let’s create them.

Google Drive API Credentials

On the Create Credentials page, choose the option "Help me choose."

Google API Credentials

Choose the settings as I did:

Google API Credentials Settings

I set the Client Name to the same value as my API name.

Google API Name

In the next step, fill in your email address and enter a Product name.

Google API OAuth Name

After that, it will give us the credentials we need.

Google Drive Error 403 User Rate Limit Exceeded Solution

Click Download, and you are done.

It will download a JSON file containing our credentials for Gdrive. Open the client_id.json file with a text editor; Notepad++ is a good option.

You will see,

"client_id": "205xxxxxxxxx-22imoxxxxxxxxxxxxxxxxpsm.apps.googleusercontent.com"

"client_secret": "NxxxxxxG-4HxxxxxxxxxxxxxxxwZA"

We need those two values, so note them down.
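If you prefer not to open an editor, a small helper can pull a string value out of the JSON with grep and cut. This is only a sketch: it assumes the compact key layout shown above ("client_id":"…" with no spaces), which is how Google's downloaded credentials file is usually formatted; if your file differs, just copy the values by hand.

```shell
#!/bin/sh
# json_field KEY FILE -> prints the string value of "KEY" in FILE.
# Assumes compact JSON of the form "key":"value", as in the downloaded
# client_id.json; not a general-purpose JSON parser.
json_field() {
    grep -o "\"$1\":\"[^\"]*\"" "$2" | cut -d'"' -f4
}

# Usage (after downloading client_id.json):
#   json_field client_id     client_id.json
#   json_field client_secret client_id.json
```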

Getting Source Files of Gdrive

We need the Gdrive project files from GitHub so that we can compile them with Go. Let’s download the files into the ~/projects/ folder we created earlier.

cd ~/projects/

Use Go to download the Gdrive source files from GitHub:

go get github.com/prasmussen/gdrive

We need to change the credentials in handlers_drive.go, which is located in the gdrive folder.

cd ~/projects/src/src/github.com/prasmussen/gdrive/

nano handlers_drive.go

const ClientId = "3671xxxxxxxxxxxxxxxxxxxxxxeg.apps.googleusercontent.com"
const ClientSecret = "1qsNxxxxxxxxhoO"

No More googleapi User Rate Limit Exceeded

Replace ClientId and ClientSecret with your own Google Drive API credentials from client_id.json.

Save and exit.
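As a non-interactive alternative to editing the file in nano, a sed substitution can patch the two constants in place. This sketch assumes the `const ClientId = "…"` line layout shown above; the credential values you pass in are of course your own.

```shell
#!/bin/sh
# patch_credentials FILE ID SECRET
# Rewrites the ClientId/ClientSecret const lines in handlers_drive.go.
# Assumes the lines start with 'const ClientId = ' and
# 'const ClientSecret = ' as in the gdrive source shown above.
patch_credentials() {
    file="$1"; id="$2"; secret="$3"
    sed -i \
        -e "s|^const ClientId = .*|const ClientId = \"$id\"|" \
        -e "s|^const ClientSecret = .*|const ClientSecret = \"$secret\"|" \
        "$file"
}

# Usage:
#   patch_credentials handlers_drive.go \
#       "your-id.apps.googleusercontent.com" "your-secret"
```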

We are ready to build Gdrive.

Let’s build it:

cd ~/projects/src/src/github.com/prasmussen/gdrive/
go build

After the build, you will see a gdrive executable file. Copy it to the /usr/bin/ folder so it can be run from anywhere:

cp gdrive /usr/bin/gdrive

Note: If you already had gdrive on your system, you need to delete the old token_v2.json:

cd ~/.gdrive

rm token_v2.json

Now we have gdrive installed on our system with our own Google Drive API settings.

Let’s test it.

gdrive list

If this is your first time using Gdrive, or you deleted the token file, Gdrive will ask you to authenticate:

Authentication needed
Go to the following url in your browser:

Enter verification code:

Open the URL in your browser, authorize the application, and copy the verification code back into the prompt.

Execute gdrive list again.

We Solved Google Drive 403 Rate Limit Exceeded

That’s it: it is working without errors!

Let’s check if our API is working too.

Gdrive Quotas – gdrive User Rate Limit Exceeded

Yes, it is working too. Now we can see our quota limit.
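Even with your own API credentials, individual Drive calls can occasionally still return a transient 403 rateLimitExceeded under bursty usage. A simple retry wrapper with increasing sleeps usually rides this out; the sketch below is a generic helper, not part of gdrive itself, and the attempt count and delays are arbitrary choices.

```shell
#!/bin/sh
# with_retry COMMAND [ARGS...]
# Runs the command, retrying up to 5 times with exponential backoff
# (1s, 2s, 4s, 8s) between attempts. Returns 0 on the first success.
with_retry() {
    delay=1
    for attempt in 1 2 3 4 5; do
        "$@" && return 0    # success, stop retrying
        echo "attempt $attempt failed, retrying in ${delay}s..." >&2
        sleep "$delay"
        delay=$((delay * 2))
    done
    return 1
}

# Usage:
#   with_retry gdrive list
```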

For Windows, please check this: https://github.com/prasmussen/gdrive/issues/426#issue-404775200

If you have any questions, please leave a comment below. We will answer them as soon as possible.

Thanks.
