I'd like to know the best practice for storing or referencing GOOGLE_APPLICATION_CREDENTIALS and GOOGLE_CLOUD_PROJECT in production:
export GOOGLE_APPLICATION_CREDENTIALS="~/Download/key.json"
export GOOGLE_CLOUD_PROJECT=`gcloud config get-value project`
In dev these are typically set in .zshrc, so when running:
poetry run python3 samples/snippets/quickstart/pub.py $PROJECT hello_topic
to publish a message works fine, and we can see the messages in the Cloud Console. However:
poetry run pytest samples/snippets/quickstart/quickstart_test.py
ERROR samples/snippets/quickstart/quickstart_test.py::test_pub - google.api_core.exceptions.PermissionDenied: 403 User not authorized to perform this...
.zshrc already sets:
export GOOGLE_APPLICATION_CREDENTIALS="$HOME/Documents/python-pubsub/key.json"
export GOOGLE_CLOUD_PROJECT=`gcloud config get-value project`
source .zshrc
These environment variables are intended for development. Do not use them in production.
Each compute service offers the ability to attach a service account. By default, compute services have a Google-created service account (aka the default service account) attached. Best practice is to create a user-managed service account with only the required IAM roles and attach that to the service; the client libraries then pick up its credentials automatically, as shown in the sketch after the links below.
User-managed service accounts
Compute Engine Service accounts
Best practices for securing service accounts
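For illustration, a minimal sketch (assuming the Pub/Sub quickstart from the question, with a user-managed service account attached to the compute service): in production no environment variables are needed, because Application Default Credentials resolves the attached account automatically.
import google.auth
from google.cloud import pubsub_v1

# ADC finds the attached service account; no GOOGLE_APPLICATION_CREDENTIALS needed.
credentials, project_id = google.auth.default()

publisher = pubsub_v1.PublisherClient()  # also uses ADC implicitly
topic_path = publisher.topic_path(project_id, 'hello_topic')  # topic name from the question
publisher.publish(topic_path, b'hello').result()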
Related
I want to deploy a service on Google Cloud Run. It would be a Python Flask app that connects to Datastore (Firestore in Datastore mode) to either write or read a small blob.
The problem is that the docs (Accessing your Database) don't explain how to reach Datastore from within GCP when you're not on GCE or App Engine. Is there a seamless way to achieve this, or should I provide service account credentials as if it were an external platform?
Thank you in advance for your help and answers.
When your Cloud Run logic executes, it executes with the identity of a GCP Service Account. You can configure which service account it runs as at configuration time. You can create and configure a Service Account that has the correct roles to allow/provide access to your datastore. This means that when your Cloud Run logic executes, it will have the correct authority to perform the desired operations. This story is documented here:
Using per-service identity
If for some reason you don't find this sufficient, an alternative is to save the tokens necessary for access in compute metadata and then dynamically retrieve these explicitly within your cloud run logic. This is described here:
Fetching identity and access tokens
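For illustration, a minimal sketch of the metadata-server approach (assuming the requests package; the endpoint is the standard compute metadata path): it fetches an access token for the service account the Cloud Run revision runs as.
import requests

# Cloud Run exposes the same metadata server as other compute services.
METADATA_URL = ('http://metadata.google.internal/computeMetadata/v1/'
                'instance/service-accounts/default/token')

resp = requests.get(METADATA_URL, headers={'Metadata-Flavor': 'Google'})
resp.raise_for_status()
access_token = resp.json()['access_token']  # send as 'Authorization: Bearer <token>'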
Hopefully this covers the fundamentals of what you are looking for. If new questions arise after reading these areas, feel free to create new questions that are more specific and detailed, and we'll follow up there.
To connect to Cloud Datastore from your Flask app deployed to Cloud Run...
Ensure you've got both services enabled in a project with an active billing account.
Ensure you've got at least both Flask & Datastore packages in your requirements.txt file (w/any desired versioning):
flask
google-cloud-datastore
Integrate Datastore usage into your app... here's some sample usage in my demo main.py (Flask code dropped for simplicity):
from google.cloud import datastore

ds_client = datastore.Client()
KEY_TYPE = 'Record'

def insert(**data):
    # data = dict/JSON of key-value pairs
    entity = datastore.Entity(key=ds_client.key(KEY_TYPE))
    entity.update(data)
    ds_client.put(entity)

def query(limit):
    return ds_client.query(kind=KEY_TYPE).fetch(limit=limit)
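For illustration, a hedged sketch of how those helpers might be wired into Flask routes (the route names and payload shape are mine, not part of the original demo):
import os
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route('/records', methods=['POST'])
def create_record():
    insert(**request.get_json())  # store the posted JSON as one entity
    return jsonify(status='ok'), 201

@app.route('/records')
def list_records():
    return jsonify([dict(e) for e in query(10)])  # entities behave like dicts

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=int(os.environ.get('PORT', 8080)))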
You can have a Dockerfile (minimal one below), but better yet, skip it and let Google (Cloud Buildpacks) build your container for you so you don't have extra stuff like this to worry about.
FROM python:3-slim
WORKDIR /app
COPY . .
RUN pip install -r requirements.txt
CMD ["python", "main.py"]
Come up with an app/service name SVC_NAME then build & deploy your prototype container with gcloud beta run deploy SVC_NAME --source . --platform managed --allow-unauthenticated. (Think docker build followed by docker push and then docker run, all from 1 command!) If you have a Dockerfile, Buildpacks will use it, but if not, it'll introspect your code and dependencies to build the most efficient container it can.
That's it. Some of you will get distracted by service accounts and making a public/private key-pair, both of which are fine. However to keep things simple, especially during prototyping, just use the default service account you get for free on Cloud Run. The snippet above works without any service account or IAM code present.
BTW, the above is for a prototype to get you going. If you were deploying to production, you wouldn't use the Flask dev server. You'd probably add gunicorn to your requirements.txt and Dockerfile, and you'd probably create a unique service account key w/specific IAM permissions, perhaps adding other requirements like IAP, VPC, and/or a load-balancer.
In order to use the aws command-line tool, I have aws credentials stored in ~/.aws/credentials.
When I run my app locally, I want it to require the correct IAM permissions for the app; I want it to read these permissions from environment variables.
What has happened is that even without those environment variables defined - even without the permissions defined - my app allows calls to aws which should not be allowed, because it's running on a system with developer credentials.
How can I run my app on my system (not in a container), without blowing away the credentials I need for the aws command-line, but have the app ignore those credentials? I've tried setting the AWS_PROFILE environment variable to a non-existent value in my start script but that did not help.
I like to use named profiles and run two sets of credentials, e.g. DEV and PROD.
When you want to run with the production profile, run export AWS_PROFILE=PROD.
Then switch back to the DEV credentials in the same way.
The trick here is to have no default credentials at all, and only use named profiles. Remove the credentials named default in the credentials file, and replace with only the named profiles.
See
https://docs.aws.amazon.com/cli/latest/userguide/cli-configure-profiles.html
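If the app is in Python, a minimal sketch (assuming boto3, which resolves AWS_PROFILE and ~/.aws/credentials the same way the aws CLI does) to confirm which credentials the app actually picks up:
import boto3

# With no default profile in ~/.aws/credentials, nothing is resolved unless
# AWS_PROFILE selects one of the named profiles.
session = boto3.Session()
creds = session.get_credentials()
if creds is None:
    raise SystemExit('No AWS credentials resolved - set AWS_PROFILE to DEV or PROD')
print('Using access key:', creds.access_key)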
Can anyone help? This one is really driving me crazy... Thank you!
I'm trying to use the Google Cloud Platform Speech-to-Text API.
Tools: Windows 10 && GCP && Python (PyCharm IDE)
I've created a service account as an owner of my Speech-to-Text project and generated a JSON key from the GCP console, then I set the environment variables.
Commands I ran in Windows 10 PowerShell and CMD:
$env:GOOGLE_APPLICATION_CREDENTIALS="D:\GCloud speech-to-text\Speech To Text Series-93e03f36bc9d.json"
set GOOGLE_APPLICATION_CREDENTIALS=D:\GCloud speech-to-text\Speech To Text Series-93e03f36bc9d.json
PS: the added environment variables disappear from CMD and PowerShell after I reboot my laptop, but they do show in the env list if added again.
I've enabled the Google Storage API and the Google Speech-to-Text API in the GCP console.
I've also tried explicitly passing the credentials in Python; same problem.
I've installed the Google Cloud SDK shell and initialized it by logging in to my account.
PYTHON SPEECH-TO-TEXT CODE (from the GCP demo)
import io
import os

# Imports the Google Cloud client library
from google.cloud import speech
from google.cloud.speech import enums
from google.cloud.speech import types

# Instantiates a client
client = speech.SpeechClient()

# The name of the audio file to transcribe
file_name = os.path.join(
    os.path.dirname(__file__),
    'test_cre.m4a')

# Loads the audio into memory
with io.open(file_name, 'rb') as audio_file:
    content = audio_file.read()
    audio = types.RecognitionAudio(content=content)

config = types.RecognitionConfig(
    encoding=enums.RecognitionConfig.AudioEncoding.LINEAR16,
    sample_rate_hertz=16000,
    language_code='en-US')

# Detects speech in the audio file
response = client.recognize(config, audio)

for result in response.results:
    print('Transcript: {}'.format(result.alternatives[0].transcript))
----Expected to receive a "200 OK" and the transcribed text when running the code above (a demo of the short speech-to-text API from the GCP documentation)
----But got:
D:\Python\main program\lib\site-packages\google\auth_default.py:66: UserWarning: Your application has authenticated using end user credentials from Google Cloud SDK. We recommend that most server applications use service accounts instead. If your application continues to use end user credentials from Cloud SDK, you might receive a "quota exceeded" or "API not enabled" error. For more information about service accounts, see https://cloud.google.com/docs/authentication/
warnings.warn(_CLOUD_SDK_CREDENTIALS_WARNING)
google.api_core.exceptions.ResourceExhausted: 429 Quota exceeded for quota metric 'speech.googleapis.com/default_requests' and limit 'DefaultRequestsPerMinutePerProject' of service 'speech.googleapis.com' for consumer 'project_number:764086051850'.
ANOTHER WEIRD THING: the error info shows 'project_number:764086051850', which is different from my Speech-to-Text project_number on GCP (I do distinguish project number and project ID), and the project_number shown in the error info also varies every time the code runs. It seems I was sending the request to the wrong project?
My GOOGLE_APPLICATION_CREDENTIALS system environment variable disappears after I restart my laptop. After adding it again, it appears in the env list but is not persisted across the next reboot.
Appreciate it if someone can help, thank you!
Try this:
Run gcloud init -> authenticate with your account and choose your project
Run gcloud auth activate-service-account <service account email> --key-file=<JSON key file>
Run gcloud config list to validate your configuration.
Run your script and see if it works better.
Otherwise, try the same thing on a micro VM to validate your code, service account and environment (and to confirm the problem only exists on Windows).
As for the Windows issue: I'm on a Chromebook, so I can't test or help with that directly. However, from what I've read, persisting an environment variable updates the registry, so check whether anything (antivirus, ...) is blocking registry updates.
D:\Python\main program\lib\site-packages\google\auth_default.py:66:
UserWarning: Your application has authenticated using end user
credentials from Google Cloud SDK. We recommend that most server
applications use service accounts instead. If your application
continues to use end user credentials from Cloud SDK, you might
receive a "quota exceeded" or "API not enabled" error. For more
information about service accounts, see
https://cloud.google.com/docs/authentication/
warnings.warn(_CLOUD_SDK_CREDENTIALS_WARNING)
This error means that your code is not using a service account. Your code is configured to use ADC (Application Default Credentials). Most likely your code is using the Google Cloud SDK credentials configured and stored by the CLI gcloud.
To determine what credentials the Cloud SDK is using, execute this command:
gcloud auth list
The IAM Member ID, displayed as ACCOUNT, with the asterisk is the account used by the CLI and any applications that do not specify credentials.
To learn more about ADC, read this article that I wrote:
Google Cloud Application Default Credentials
google.api_core.exceptions.ResourceExhausted: 429 Quota exceeded for
quota metric 'speech.googleapis.com/default_requests' and limit
'DefaultRequestsPerMinutePerProject' of service
'speech.googleapis.com' for consumer 'project_number:764086051850'.
The Cloud SDK has the concept of default values. Execute gcloud config list. This will display various items. Look for project. Most likely this project does not have the Cloud Speech-to-Text API enabled.
ANOTHER WEIRD THING: the error info shows that
'project_number:764086051850', which is different from my
speech-to-text project_number on GCP (I do distinguish project number
and project ID), the project_number shown in the error info also
varies every time the code runs. It seems I was sending cloud
requirement of the wrong project?
To see the list of projects, Project IDs and Project Numbers that your current credentials can see (access) execute:
gcloud projects list.
This command will display the Project Number given a Project ID:
gcloud projects list --filter="REPLACE_WITH_PROJECT_ID" --format="value(PROJECT_NUMBER)"
My GOOGLE_APPLICATION_CREDENTIALS system environment variables
disappear after I restart my laptop next time. After adding again, it
will appear in the env list but can't be stored after reboot again.
When you execute this command in a Command Prompt, it only persists for the life of the Command Prompt: set GOOGLE_APPLICATION_CREDENTIALS=D:\GCloud speech-to-text\Speech To Text Series-93e03f36bc9d.json. When you exit the Command Prompt, reboot, etc., the environment variable is destroyed.
To create persistent environment variables on Windows, edit the System Properties -> Environment Variables. You can launch this command as follows from a Command Prompt:
SystemPropertiesAdvanced.exe
Suggestions to make your life easier:
Do NOT use long path names with spaces for your service account files. Create a directory such as C:\Config and place the file there with no spaces in the file name.
Do NOT use ADC (Application Default Credentials) when developing on your desktop. Specify the actual credentials that you want to use.
Change this line:
client = speech.SpeechClient()
To this:
client = speech.SpeechClient.from_service_account_json('c:/config/service-account.json')
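An equivalent, more explicit sketch (assuming the google-auth package, which the client libraries already depend on): load the service account file yourself and pass the credentials to the client.
from google.cloud import speech
from google.oauth2 import service_account

# Load the key file explicitly instead of relying on ADC.
credentials = service_account.Credentials.from_service_account_file(
    'c:/config/service-account.json')
client = speech.SpeechClient(credentials=credentials)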
Service accounts have a Project ID inside them. Create the service account in the same project that you intend to use it in (until you understand IAM and service accounts well).
Every now and then I lose the permissions needed to deploy a project via Google App Engine PHP.
HttpException: Permissions error fetching application [apps/PROJECT_ID]. Please make sure you are using the correct project ID and that you have permission to view applications on the project.
I use Jenkins, and I can see via the config history that nothing changed. Even if I escalate the privileges to "project owner" in console.cloud.google.com, the result is the same: permission denied.
The only way I am able to solve this issue is to create a completely new App Engine project.
Question: why do IAM accounts expire, and what is the recommended way of using credentials to automate deploys with Google App Engine PHP?
I had this error message and for me the error was due to switching projects by using "gcloud config set project PROJECT_NAME" instead of "gcloud config set project PROJECT_ID".
I was able to solve it by additionally appending --configuration CONFIG_NAME to my gcloud app deploy command. I don't know why some projects work by just specifying the --project arg and others don't.
In any case, it seems cleaner to always explicitly set the configuration arg per gcloud command.
I have a local git repo and I am trying to do continuous integration and deployment using Codeship. https://documentation.codeship.com
I have GitHub hooked up to the continuous integration and it seems to work fine.
I have an AWS account and a bucket on there with my access keys and permissions.
When I run the deploy script I get this error:
How can I fix the error?
I had this very issue when using aws-cli and relying on the following files to hold AWS credentials and config for the default profile:
~/.aws/credentials
~/.aws/config
I suspect there is an issue with this technique; as reported in github: Unable to locate credentials
I ended up using the Codeship project's environment variables for the following:
AWS_ACCESS_KEY_ID
AWS_SECRET_ACCESS_KEY
Now, this is not ideal. However, my AWS IAM user has very limited access, just enough to perform the specific task of uploading to the bucket used for the deployment.
Alternatively, depending on your needs, you could also check out the Codeship Pro platform; it allows you to have an encrypted file with environment variables that are decrypted at runtime, during your build.
On both the Basic and Pro platforms, if you want/need to use credentials in a file, you can store the credentials in environment variables (as suggested by Nigel) and then echo them into the file as part of your test setup, as sketched below.
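For illustration, a minimal sketch of that setup step in Python (a one-line shell echo achieves the same thing; the file layout is the standard one aws-cli and boto3 read):
import os
from pathlib import Path

# Write ~/.aws/credentials from the environment variables configured in the
# Codeship project, so tools that expect the file can find the credentials.
aws_dir = Path.home() / '.aws'
aws_dir.mkdir(exist_ok=True)
(aws_dir / 'credentials').write_text(
    '[default]\n'
    'aws_access_key_id = {}\n'
    'aws_secret_access_key = {}\n'.format(
        os.environ['AWS_ACCESS_KEY_ID'],
        os.environ['AWS_SECRET_ACCESS_KEY']))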