I'm getting the following error when I try to process a single 10-minute audio file. I'm just getting started with Google Cloud products, and I'm the only person accessing this resource. How could I have exceeded the quota? The quota is set at its default values, and I don't think I'm anywhere near the limits. Is there another reason for this?
I'm using the transcribe_async.py demo code. The audio file (22 MB) is stored in a bucket and accessed through a URI audio source; otherwise the demo code is unchanged.
Waiting for server processing...
Traceback (most recent call last):
  File "/Users/kevin/Downloads/python-docs-samples-master/speech/api-client/transcribe_async.py", line 116, in <module>
    main(args.speech_file)
  File "/Users/kevin/Downloads/python-docs-samples-master/speech/api-client/transcribe_async.py", line 93, in main
    response = service_request.execute()
  File "/Users/kevin/anaconda2/lib/python2.7/site-packages/oauth2client/_helpers.py", line 133, in positional_wrapper
    return wrapped(*args, **kwargs)
  File "/Users/kevin/anaconda2/lib/python2.7/site-packages/googleapiclient/http.py", line 840, in execute
    raise HttpError(resp, content, uri=self.uri)
googleapiclient.errors.HttpError: <HttpError 429 when requesting https://speech.googleapis.com/v1beta1/operations/596739883637256586?alt=json returned "Insufficient tokens for quota group and limit 'Default_GroupCLIENT_PROJECT-100s' of service 'speech.googleapis.com', using the limit by ID '764086051850'.">
I've been having this problem too. I'm still trying to understand how GCP credentials work generally, but in the meantime I think I've figured out enough to make this work. I'm also using the example Python scripts. I followed the instructions at this page.
The gist of it is:
Create a "private key" using the Credentials page of the Google Cloud Console. It's really more than simply a "key", it's a (JSON) config file with many values such as 'type', 'project_id', and so forth.
Download that file and save it somewhere. I put mine in the ~/.config/gcloud/ folder which seems to also have a lot of relevant config files.
Set the environment variable GOOGLE_APPLICATION_CREDENTIALS to point at that file, i.e. export GOOGLE_APPLICATION_CREDENTIALS=/path/to/key.json
There's also a way to do this from within the code shown on that page, but the environment variable route made more sense for me.
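For reference, here's a minimal sketch of that in-code route, assuming the oauth2client stack these samples use (the key path is a placeholder):
import httplib2
from oauth2client.client import GoogleCredentials

# Load the downloaded key file explicitly instead of relying on the env var
credentials = GoogleCredentials.from_stream('/path/to/key.json')
# Wrap an HTTP client with the credentials and pass it to discovery.build(..., http=http)
http = credentials.authorize(httplib2.Http())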
It seems that the process above sets the "default" credentials, and in the example code those are fetched on line 39.
There's additional documentation on the Google sites; I found the page on Google Cloud Storage authentication the most useful so far.
I think you get this error because you're using the Application Default Credentials specified by the command gcloud auth application-default login. Try creating a Service Account for your project. Save the JSON key in a private folder, then specify the path to the key, like this:
var speech = require('@google-cloud/speech')({
  keyFilename: '/path/to/keyfile.json'
});
Importantly, billing must be enabled for your project. To enable billing, you can activate the Free Trial period.
I have this app that I've built, which scrapes recipes and adds them to Evernote. Link here. In the EvernotePy folder, there is a Python 3 script called Add_to_Evernote.py that interfaces with the API. When I have Sandbox = True, it works normally, but when I set Sandbox = False it breaks when calling client.get_authorize_url(). I just received an email today saying that my API key got activated for their production servers, so I don't know what's wrong.
The error I'm getting is:
Traceback (most recent call last):
  File "./EvernotePy/Add_to_evernote.py", line 69, in <module>
    webbrowser.open(client.get_authorize_url(request_token))
  File "/usr/local/lib/python3.8/dist-packages/evernote3-1.25.0-py3.8.egg/evernote/api/client.py", line 58, in get_authorize_url
KeyError: 'oauth_token'
What do I have to do?
This happened to me too. It turned out that although I had changed the sandbox value to False, I hadn't redeployed the app. Make sure that after changing the sandbox value you redeploy the app so the change takes effect.
I know there is a way to call Document AI from a Python environment on a local system. In that process, one needs to upload the local file to a GCS bucket so that Document AI can access it from there. Is there any way to give Document AI direct access to local files (i.e., without uploading them to a GCS bucket) using Python? [Note that it's a mandatory requirement for me to run the Python code on my local system, not in GCP.]
Document AI cannot "open" files by itself from your local filesystem.
If you don't want to (or cannot) upload the documents to a bucket, you can send them inline as part of the REST request. But in that case you cannot use batch processing: you must process the files one by one and wait for each response.
The relevant REST API documentation is here: https://cloud.google.com/document-ai/docs/reference/rest/v1/projects.locations.processors/process
In the Python quickstart documentation you've got this sample code that reads a file and sends it inline as part of the request:
# Minimal setup for this sample, assuming the google-cloud-documentai client:
from google.api_core.client_options import ClientOptions
from google.cloud import documentai

opts = ClientOptions(api_endpoint=f"{location}-documentai.googleapis.com")
client = documentai.DocumentProcessorServiceClient(client_options=opts)

# The full resource name of the processor, e.g.:
# projects/project-id/locations/location/processors/processor-id
# You must create new processors in the Cloud Console first
name = f"projects/{project_id}/locations/{location}/processors/{processor_id}"

# Read the file into memory
with open(file_path, "rb") as image:
    image_content = image.read()

# Send the file contents inline instead of pointing at a GCS object
document = {"content": image_content, "mime_type": "application/pdf"}
# Configure the process request
request = {"name": name, "raw_document": document}
result = client.process_document(request=request)
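If you only need the recognized text, it's available directly on the returned document:
document = result.document
print(document.text)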
I am trying to run the Vorto dashboard on a Raspberry Pi to visualize my Bosch IoT "things" data.
In order to run the Vorto dashboard, I installed npm and Node.js and created the config.json file.
I am getting the error below whenever I try to run the dashboard using the command sudo vorto-dashboard config.json, even though I have already added the OAuth2 client credentials.
No credentials given, can not get things
Could not get the token with given credentials. - StatusCodeError: 400 -
{"error":"unauthorized_client","error_description":"INVALID_CREDENTIALS: Invalid client credentials"}
I am currently contributing to the Vorto project as an intern at Bosch. As part of recent changes to the Vorto-Dashboard, we combined and merged the functionality of a previous dashboard with another coexisting, updated UI, providing more advanced ways to visualize the existing devices.
As the uploaded state was work in progress, we temporarily disabled the config.json methodology and removed existing references from the documentation. Apparently the reference in the tutorial you found was missed; sorry for that!
Today I deployed a new version 0.5.0 of the vorto-dashboard, which should work as usual. You are now able to work with either process.env.[...] variables or a config.json file. Thank you Mena for the quick response!
Feel free to let me know if you need any further help or have additional feedback.
TL;DR
To resolve your issue, store your OAuth credentials as environment variables.
E.g. on Debian et al., export BOSCH_CLIENT_ID=... etc., then start the dashboard in the same terminal, as spelled out below.
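In full (the variable names come from the README quoted further down; the values are those of your OAuth2 client):
export BOSCH_CLIENT_ID=<your-client-id>
export BOSCH_CLIENT_SECRET=<your-client-secret>
export BOSCH_SCOPE=<your-scope>
One caveat: plain sudo typically resets the environment, so if you start the dashboard with sudo as in your command, use sudo -E to preserve the exported variables.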
Context
I was about to ask the same question, as I got the same error message no matter how I referenced the config.json file (relative path, absolute path, no reference, etc.).
For clarification, the tutorial pointing to a config.json resource for storing OAuth credentials is here.
Quoting:
While the dependencies are being installed, create the config.json file and insert client_id, secret and scope from your Already created
OAuth2 Client. The content of the file has to look like this:
{
  "client_id": "<YOUR_CLIENT_ID>",
  "client_secret": "<YOUR_CLIENT_SECRET>",
  "scope": "<YOUR_SCOPE>",
  "intervalMS": 10000
}
The reference to the config.json file has been removed from the README.md resource in the vorto-dashboard module of vorto-examples.
The latest README.md suggests providing the OAuth credentials through environmental variables:
You can provide your OAuth2 credentials through environment variables.
The three environment variables you have to provide are:
BOSCH_CLIENT_ID
BOSCH_CLIENT_SECRET
BOSCH_SCOPE
[...]
Looking at the source, I can only find an explicit reference to a config.json in the start script entry of package_for_deployment.json (and nothing around the source seems to consume, say, argv[2] either).
The AuthToken.js resource in charge of handling OAuth credentials only seems to reference environment variables, through the process.env.[...] references.
Elaboration
This is only speculation at the time of writing, but I suspect the reason why the config.json methodology has been abandoned might have something to do with strengthening security, i.e. not storing OAuth credentials permanently in a file.
If that much is true, then the tutorial page should probably be amended with the latest instructions from the README.md.
I would like to upload a file by invoking a REST endpoint with a multipart request.
In particular, I am looking at this API: Google Cloud Storage: Objects: insert
I did read about using multer; however, I did not find any complete example showing how to perform this operation.
Could someone help me with that?
https://cloud.google.com/nodejs/getting-started/using-cloud-storage#uploading_to_cloud_storage
This is a good example of how to use multer to upload a single image to Google Cloud Storage. Use multer to create a file stream for each file (storage: multer.memoryStorage()), and handle the stream by sending it to your GCS bucket in your callback.
However, the link only shows an example for one image. If you want to handle an array of images, create a for loop in which you create a stream for each file in your request, but only call next() after the loop ends. If you keep next() inside each loop iteration, you will get the error: Error: Can't set headers after they are sent.
There is an example of uploading files with the Node.js client library and multer. You can modify this example to set the multipart option:
Download the sample code and cd into the folder:
git clone https://github.com/GoogleCloudPlatform/nodejs-docs-samples/
cd nodejs-docs-samples/appengine/storage
Edit the app.yaml file and include your bucket name:
GCLOUD_STORAGE_BUCKET: YOUR_BUCKET_NAME
Then in the source code, you can modify the publicUrl variable according to the Objects: insert example:
const publicUrl = format(`https://www.googleapis.com/upload/storage/v1/b/${bucket.name}/o?uploadType=multipart`);
Download a key file for your service account and set the environment variable:
Go to the Create service account key page in the GCP Console.
From the Service account drop-down list, select New service account.
Input a name into the Service account name field.
From the Role drop-down list, select Project > Owner.
Click Create. A JSON file containing your key will be downloaded to your computer. Finally, export the environment variable:
export GOOGLE_APPLICATION_CREDENTIALS=/path/to/your/key/file
After that, you're ready to run npm start, go to the app's frontend, and upload your file.
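If you're curious what the multipart request itself looks like on the wire, here is a hypothetical Python sketch of the raw protocol from the Objects: insert docs (the bucket, object name, file, and token are all placeholders):
import json
import requests  # third-party HTTP library

access_token = "ya29...."  # placeholder OAuth2 token with a storage scope
boundary = "===============boundary=="
metadata = json.dumps({"name": "uploads/photo.jpg"})  # object metadata part

with open("photo.jpg", "rb") as f:
    media = f.read()

# multipart/related body: a JSON metadata part followed by the media part
body = (
    f"--{boundary}\r\n"
    f"Content-Type: application/json; charset=UTF-8\r\n\r\n"
    f"{metadata}\r\n"
    f"--{boundary}\r\n"
    f"Content-Type: image/jpeg\r\n\r\n"
).encode() + media + f"\r\n--{boundary}--".encode()

resp = requests.post(
    "https://www.googleapis.com/upload/storage/v1/b/my-bucket/o",
    params={"uploadType": "multipart"},
    headers={
        "Authorization": f"Bearer {access_token}",
        "Content-Type": f"multipart/related; boundary={boundary}",
    },
    data=body,
)
print(resp.json())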
I am trying to programmatically (from Python) update the metadata of several CSV/JSON files that are exported from BigQuery. The application that exports the data is the same as the one modifying the files (thus using the same server certificate). The export goes well, that is until I try to use the objects.patch() method to set the metadata I want. The problem is that I keep getting the following error:
apiclient.errors.HttpError: <HttpError 403 when requesting https://www.googleapis.com/storage/v1/b/<bucket>/<file>?alt=json returned "Forbidden">
Obviously, this has something to do with bucket or file permissions, but I can't manage to get around it. How come, if the same certificate is used for writing the files and updating their metadata, I'm unable to update it? The bucket was created with the same certificate.
If that's the exact URL you're using, it's a URL problem: you're missing the /o/ between the bucket name and the object name.
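For what it's worth, the discovery-based Python client builds that URL (including the /o/) for you; here's a minimal sketch assuming the same apiclient/oauth2client stack, with the bucket, object name, and metadata as placeholders:
from apiclient import discovery
from oauth2client.client import GoogleCredentials

credentials = GoogleCredentials.get_application_default()
service = discovery.build('storage', 'v1', credentials=credentials)

# The client assembles .../b/<bucket>/o/<object> correctly for you
request = service.objects().patch(
    bucket='my-bucket',            # placeholder bucket name
    object='exports/data.csv',     # placeholder object name
    body={'contentType': 'text/csv'})
response = request.execute()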