Python & Firebase - Unable to Read Uploaded File in Storage - python-3.x

I'm trying to upload a file to my Firebase Storage bucket.
Many months ago I could add files using this same method; now the upload succeeds, but I can't see the file's preview and direct link.
My code is:
import firebase_admin
from firebase_admin import credentials
from firebase_admin import db
from firebase_admin import storage
cred = credentials.Certificate('C:\\Users\\Wicaledon\\PycharmProjects\\ABC\\XXXXXX.json')
# Initialize the app with a service account, granting admin privileges
firebase_admin.initialize_app(cred, {
    'databaseURL': 'https://XXXXXXXXXX.firebaseio.com/',
    'storageBucket': 'XXXXXXXXXX.appspot.com'
})
database_url = 'https://XXXXXXXXXX.firebaseio.com/'
bucket = storage.bucket()
blob2 = bucket.blob('my_image.jpg')
blob2.upload_from_filename('C:\\Users\\Wicaledon\\PycharmProjects\\ABC\\my_image.jpg')
The file now appears in the console like this, but I can't see its preview and link on the right side.
However, if I upload the same file manually through the console, the preview and link do appear.
What is the problem? Can you solve it?
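A likely cause (an assumption, not confirmed in the question): the Firebase console only shows a preview and download link for objects that carry a firebaseStorageDownloadTokens metadata entry, which the console and client SDKs set automatically but the Admin SDK does not. A minimal sketch that attaches such a token at upload time:

import uuid

blob2 = bucket.blob('my_image.jpg')
# Assumption: the console uses this metadata key to build the download link.
blob2.metadata = {'firebaseStorageDownloadTokens': str(uuid.uuid4())}
blob2.upload_from_filename('C:\\Users\\Wicaledon\\PycharmProjects\\ABC\\my_image.jpg')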

Related

Connect to GCP SQL using the .json credentials file

I have a PostgreSQL DB at GCP. Right now I can log in using a username and password, e.g.:
import pandas as pd
import pyodbc
conn_str = (
    "DRIVER={PostgreSQL Unicode};"
    "DATABASE=test;"
    "UID=user;"
    "PWD=a_very_strong_password;"
    "SERVER=34.76.yy.xxxx;"
    "PORT=5432;"
)
with pyodbc.connect(conn_str) as con:
    print(pd.read_sql("SELECT * FROM entries", con=con))
Is there a way to use the .json credentials file, which is downloaded when I created my IAM user, instead of hard-typing the credentials like above? I reckon I could use the file to connect to GCP storage, save my DB credentials there, and write a script which loads the username, password, etc. from storage, but that feels like a clunky workaround.
From the guide here it seems you can create IAM roles for this, but that only grants access for an hour at a time, and you need to create a token pair each time.
Short answer: yes, you can connect to a Cloud SQL instance using a service account key (JSON file), but only with PostgreSQL, and you need to refresh the token every hour.
Long answer: the JSON key is intended more for operations on the instance at the resource level, or for use with the Cloud SQL proxy.
For example, when you use the Cloud SQL proxy with a service account, you create a "magical bridge" to the instance, but in the end you still authenticate the way you're doing right now, just with SERVER = 127.0.0.1. This is the recommended method in most cases.
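For illustration, a minimal sketch of the proxy flow, assuming the legacy cloud_sql_proxy binary and a placeholder instance connection name:

# Start the proxy in a shell first (project, region, instance, and key path are placeholders):
#   ./cloud_sql_proxy -instances=my-project:europe-west1:my-instance=tcp:5432 \
#       -credential_file=/path/to/key.json
import pandas as pd
import pyodbc

conn_str = (
    "DRIVER={PostgreSQL Unicode};"
    "DATABASE=test;"
    "UID=user;"
    "PWD=a_very_strong_password;"
    "SERVER=127.0.0.1;"  # the proxy listens locally and tunnels to the instance
    "PORT=5432;"
)
with pyodbc.connect(conn_str) as con:
    print(pd.read_sql("SELECT * FROM entries", con=con))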
As you've mentioned, IAM authentication can also work, but this approach only holds for one hour at a time, since it depends on token refresh. If you're okay with this, just keep in mind you need to keep refreshing the token.
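For completeness, a rough sketch of that token-based approach, assuming the instance has IAM database authentication enabled and the service account was added as a Cloud SQL IAM user (the scope and the username format are assumptions based on the Cloud SQL docs):

import pyodbc
from google.auth.transport.requests import Request
from google.oauth2 import service_account

credentials = service_account.Credentials.from_service_account_file(
    '/path/to/key.json',
    scopes=['https://www.googleapis.com/auth/sqlservice.admin'])
credentials.refresh(Request())  # the token expires after ~1 hour; refresh before it does

conn_str = (
    "DRIVER={PostgreSQL Unicode};"
    "DATABASE=test;"
    # Assumption: for a service account the DB user is the SA email
    # without the ".gserviceaccount.com" suffix.
    "UID=my-sa@my-project.iam;"
    "PWD=" + credentials.token + ";"  # the access token acts as the password
    "SERVER=34.76.yy.xxxx;"
    "PORT=5432;"
)
con = pyodbc.connect(conn_str)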
Another approach I can think of is to use Secret Manager. The steps are as follows:
1. Create a service account and a key for it.
2. Create a secret which contains your password.
3. Grant access to this particular secret to the service account created in step 1:
   - Go to Secret Manager.
   - Select the secret and click on Show info panel.
   - Click on Add member and type or paste the email of the service account.
   - Grant the Secret Manager Secret Accessor role.
   - Click on Save.
Now in your code you can fetch the secret content (which is the password) with sample code like this:
import pandas as pd
import pyodbc
from google.cloud import secretmanager
from google.oauth2 import service_account

credentials = service_account.Credentials.from_service_account_file('/path/to/key.json')
client = secretmanager.SecretManagerServiceClient(credentials=credentials)

# Fill in your own project ID, secret ID, and secret version
name = f"projects/{project_id}/secrets/{secret_id}/versions/{version_id}"
response = client.access_secret_version(request={"name": name})
secret_password = response.payload.data.decode("UTF-8")

conn_str = (
    "DRIVER={PostgreSQL Unicode};"
    "DATABASE=test;"
    "UID=user;"
    "PWD=" + secret_password + ";"
    "SERVER=34.76.yy.xxxx;"
    "PORT=5432;"
)
with pyodbc.connect(conn_str) as con:
    print(pd.read_sql("SELECT * FROM entries", con=con))
BTW, you can install the library with pip install google-cloud-secret-manager.
Finally, you can also use this approach to keep the instance IP, user, DB name, etc. by creating more secrets, if you prefer.

Generating Cloud Storage Signed URL from Google Cloud Function without using explicit key file

I'd like to create a pre-signed upload URL to a storage bucket, and would like to avoid an explicit reference to a JSON key.
Currently, I'm attempting to do this with the default App Engine service account.
I'm attempting to follow along with this answer, but am getting this error:
AttributeError: you need a private key to sign credentials.the
credentials you are currently using <class
'google.auth.compute_engine.credentials.Credentials'> just contains a
token. see
https://googleapis.dev/python/google-api-core/latest/auth.html#setting-up-a-service-account
for more details.
My Cloud Function code looks like this:
from google.cloud import storage
import datetime
import google.auth
from google.auth.transport import requests


def generate_upload_url(blob_name, additional_metadata: dict = {}):
    credentials, project_id = google.auth.default()
    # Perform a refresh request to get the access token of the current credentials (else it's None)
    r = requests.Request()
    credentials.refresh(r)

    client = storage.Client()
    bucket = client.get_bucket("my_bucket")
    blob = bucket.blob(blob_name)

    service_account_email = credentials.service_account_email
    print(f"attempting to create signed url for {service_account_email}")
    url = blob.generate_signed_url(
        version="v4",
        service_account_email=service_account_email,
        access_token=credentials.token,
        # This URL is valid for 120 minutes
        expiration=datetime.timedelta(minutes=120),
        # Allow PUT requests using this URL.
        method="PUT",
        content_type="application/octet-stream",
    )
    return url


def get_upload_url(request):
    blob_name = get_param(request, "blob_name")
    url = generate_upload_url(blob_name)
    return url
When you use version v4 of the signed URL, the first line of the method calls the ensure_signed_credentials method, which checks whether the current service account can generate a signature in standalone mode (i.e. with a private key). That is what breaks the current behavior.
The function's comment clearly describes that a service account JSON file is required:
If you are on Google Compute Engine, you can't generate a signed URL.
Follow `Issue 922`_ for updates on this. If you'd like to be able to
generate a signed URL from GCE, you can use a standard service account
from a JSON file rather than a GCE service account.
So, use the v2 version instead.
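As a sketch, that means the same call with version="v2" and the other parameters unchanged (blob, service_account_email, and credentials as defined in the question's code):

url = blob.generate_signed_url(
    version="v2",
    service_account_email=service_account_email,
    access_token=credentials.token,
    expiration=datetime.timedelta(minutes=120),
    method="PUT",
    content_type="application/octet-stream",
)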

Authenticate calls to Google Cloud Functions programmatically

I am trying to authenticate to Google Cloud Functions from SAP CPI to fetch some data from a database. To push data we use Pub/Sub with a service account access token, and it works perfectly. But the functions need an identity token instead of an access token. We obtain the access token with a Groovy script (no Jenkins). Is it possible to authenticate to the functions with an access token as well? Or to get the identity token without building a whole IAP layer?
You have to call your Cloud Functions (or Cloud Run, it's the same) with a signed identity token.
So you can use a Groovy script to generate a signed identity token. Here is an example:
import com.google.api.client.http.GenericUrl
import com.google.api.client.http.HttpRequest
import com.google.api.client.http.HttpRequestFactory
import com.google.api.client.http.HttpResponse
import com.google.api.client.http.javanet.NetHttpTransport
import com.google.auth.Credentials
import com.google.auth.http.HttpCredentialsAdapter
import com.google.auth.oauth2.IdTokenCredentials
import com.google.auth.oauth2.IdTokenProvider
import com.google.auth.oauth2.ServiceAccountCredentials
import com.google.common.base.Charsets
import com.google.common.io.CharStreams
String myUri = "YOUR_URL";

Credentials credentials = ServiceAccountCredentials
        .fromStream(new FileInputStream(new File("YOUR_SERVICE_ACCOUNT_KEY_FILE")))
        .createScoped("https://www.googleapis.com/auth/cloud-platform");
String token = ((IdTokenProvider) credentials)
        .idTokenWithAudience(myUri, Collections.EMPTY_LIST).getTokenValue();
System.out.println(token);

IdTokenCredentials idTokenCredentials = IdTokenCredentials.newBuilder()
        .setIdTokenProvider((ServiceAccountCredentials) credentials)
        .setTargetAudience(myUri).build();

HttpRequestFactory factory = new NetHttpTransport()
        .createRequestFactory(new HttpCredentialsAdapter(idTokenCredentials));
HttpRequest request = factory.buildGetRequest(new GenericUrl(myUri));
HttpResponse httpResponse = request.execute();
System.out.println(CharStreams.toString(new InputStreamReader(httpResponse.getContent(), Charsets.UTF_8)));
The service account key file is required only if you are outside GCP. Otherwise, the default service account is enough, but it must be a service account; your personal user account won't work.
Add this dependency (here in Maven):
<dependency>
    <groupId>com.google.auth</groupId>
    <artifactId>google-auth-library-oauth2-http</artifactId>
    <version>0.20.0</version>
</dependency>
Alternatively, you can use a tool that I wrote and open-sourced. I also wrote a Medium article explaining the use cases.
You can only access your secured Cloud Function using an identity token.
1. Create a service account with the roles/cloudfunctions.invoker role.
2. Create a Cloud Function that allows only authenticated requests:
https://REGION-PROJECT_ID.cloudfunctions.net/FUNCTION_NAME
from google.oauth2 import service_account
from google.auth.transport.requests import AuthorizedSession

target_audience = 'https://REGION-PROJECT_ID.cloudfunctions.net/FUNCTION_NAME'
creds = service_account.IDTokenCredentials.from_service_account_file(
    '/path/to/svc.json', target_audience=target_audience)
authed_session = AuthorizedSession(creds)

# Make an authenticated request and print the response and status code
resp = authed_session.get(target_audience)
print(resp.status_code)
print(resp.text)

GCP Cloud Storage file push by python using service account json file

I have written a simple Python program as per the Google documentation. It throws an error saying the given account does not have access. I tried different combinations, but they didn't work.
I have cross-checked the given access by supplying it to a Java program and to gsutil. In both places I am able to access the bucket and upload a file, so the issue is with the Python program. Kindly shed some light on this issue.
from google.oauth2 import service_account
from google.cloud import storage
credentials = service_account.Credentials.from_service_account_file(
    'C:/Users/AWS/python/sit.json',
    scopes=['https://www.googleapis.com/auth/cloud-platform'])
storage_client = storage.Client(credentials=credentials, project='proj-sit')
bucket = storage_client.get_bucket('b-sit')
blob = bucket.blob('myfile')
blob.upload_from_string('New contents!. This is test.')
and I have received the error below:
Traceback (most recent call last):
  File "C:\Users\AWS\python\mypgm.py", line 21, in <module>
    bucket = storage_client.get_bucket('pearson-gcss-sit') # pearson-gcss-sit pearson-bbi-dev global-integration-nonprod
  File "D:\Development_Avecto\Python36\lib\site-packages\google\cloud\storage\client.py", line 227, in get_bucket
    bucket.reload(client=self)
  File "D:\Development_Avecto\Python36\lib\site-packages\google\cloud\storage\_helpers.py", line 106, in reload
    method="GET", path=self.path, query_params=query_params, _target_object=self
  File "D:\Development_Avecto\Python36\lib\site-packages\google\cloud\_http.py", line 319, in api_request
    raise exceptions.from_http_response(response)
google.api_core.exceptions.Forbidden: 403 GET https://www.googleapis.com/storage/v1/b/b-sit?projection=noAcl: someid-sit@someinfo-sit.iam.gserviceaccount.com does not have storage.buckets.get access to b-sit.
[Finished in 10.6s]
Note: I can see the role 'storage.objectAdmin' in console.cloud.google.com.
For more information, I can upload files using the Java program below.
GoogleCredentials credentials = GoogleCredentials.fromStream(new FileInputStream(connectionKeyPath))
        .createScoped(Lists.newArrayList("https://www.googleapis.com/auth/cloud-platform"));
Storage storage = StorageOptions.newBuilder().setCredentials(credentials).build().getService();
BlobId blobId = BlobId.of("some-sit", "cloudDirectory/file.zip");
BlobInfo blobInfo = BlobInfo.newBuilder(blobId).setContentType("application/zip").build();
Blob blob = storage.create(blobInfo, fileContent);
I found the root cause of the issue.
Root cause: my service account has the 'roles/storage.objectAdmin' role, which does not include the 'storage.buckets.get' permission. Hence I get the above error on the line with the get_bucket call. I found this in the Google documentation.
All the sample code in the documentation uses the get_bucket function to upload files. My question is: how can we upload files to the bucket without this permission (storage.buckets.get)? We uploaded to the same bucket from Java without it.
Can you shed some light on this, please?
The service account you are using does not have the proper permissions.
You can solve this issue by granting at least the roles/storage.objectAdmin role at the bucket or project level.
The roles/storage.objectAdmin role:
Grants full control over objects, including listing, creating, viewing, and deleting objects.
To grant it at the bucket level, run:
gsutil iam ch serviceAccount:someid-sit@someinfo-sit.iam.gserviceaccount.com:roles/storage.objectAdmin gs://[BUCKET_NAME]
To grant it at the project level, run:
gcloud projects add-iam-policy-binding yourProject --member serviceAccount:someid-sit@someinfo-sit.iam.gserviceaccount.com --role roles/storage.objectAdmin
EDIT:
You need to pass the credentials to the storage_client:
storage_client = storage.Client('proj-sit', credentials=credentials)
I removed the get_bucket line from my code and added the line below instead, which did the trick. I can now upload the file with just the 'roles/storage.objectAdmin' role:
bucket = storage_client.bucket('b-sit')
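This works because Client.bucket() only builds a local reference to the bucket without making any API call, whereas get_bucket() issues a GET on the bucket itself and therefore requires storage.buckets.get. A minimal sketch of the resulting flow:

bucket = storage_client.bucket('b-sit')  # no API call, so no storage.buckets.get needed
blob = bucket.blob('myfile')
blob.upload_from_string('New contents!. This is test.')  # only needs object-level permissions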

python- Cannot create signed url for Google cloud storage object

I use the code below to generate a signed URL for a file on Google Cloud Storage, but when I click the link I get SignatureDoesNotMatch.
from google.cloud.storage._signing import generate_signed_url
from google.oauth2.service_account import Credentials

signed_url = generate_signed_url(
    credentials=Credentials.from_service_account_file(google_cloud_platform__key_path),
    resource=self.canonicalized_resource,
    expiration=self.expiration,
    content_type=self.content_type,
)
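No answer is shown here, but a common fix (an assumption, not a confirmed solution for this question) is to build the URL through the public Blob.generate_signed_url API instead of the private _signing helper, so that the canonical resource, expiration, and headers are guaranteed to match what actually gets signed. A minimal sketch with placeholder project, bucket, and object names:

import datetime

from google.cloud import storage
from google.oauth2.service_account import Credentials

credentials = Credentials.from_service_account_file(google_cloud_platform__key_path)
client = storage.Client(project='my-project', credentials=credentials)  # placeholders
blob = client.bucket('my-bucket').blob('path/to/file')
signed_url = blob.generate_signed_url(
    expiration=datetime.timedelta(hours=1),  # how long the link stays valid
    method="GET",
)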
