I'm fairly new to Vue.js, but I've built some basic CRUD apps using axios.
What I want to do is use Google Cloud BigQuery to pull in raw data and then display or manipulate it in Vue. My goal is to make a sort of simple data dashboard where you can filter things or display some different insights from a handful of BigQuery queries.
I can install the BigQuery API client as a dependency from the Vue GUI. But after that I'm a little lost: how do I import BigQuery into my code, and how do I run the example code to fetch some public data?
I'm also unsure how to include the Google credentials. I currently have this line in vue.config.js, but I'm not sure whether it is correct:
process.env.VUE_APP_GOOGLE_APPLICATION_CREDENTIALS = '/Google_Cloud_Key/Sandbox-f6ae6239297e.json'
Given the lack of any resources out there for doing this, I also wonder, should I not be trying to retrieve data this way? Should I make an intermediate API that runs the BigQuery queries and then returns JSON to my Vue app?
In order to make requests to the BigQuery API, you need to use a service account. A service account belongs to your project, and it is what the Google BigQuery Node.js client library uses to make BigQuery API requests.
First, set an environment variable with the PROJECT_ID you will use:
export GOOGLE_CLOUD_PROJECT=$(gcloud config get-value core/project)
Next, create a new service account to access the BigQuery API by using:
gcloud iam service-accounts create my-bigquery-sa --display-name "my bigquery service account"
Next, create credentials that your code will use to log in as your new service account, and save them as a JSON file at ~/key.json by using the following command:
gcloud iam service-accounts keys create ~/key.json --iam-account my-bigquery-sa@${GOOGLE_CLOUD_PROJECT}.iam.gserviceaccount.com
Set the GOOGLE_APPLICATION_CREDENTIALS environment variable, which the BigQuery client library uses to find your credentials. It should be set to the full path of the credentials JSON file you created:
export GOOGLE_APPLICATION_CREDENTIALS="/home/${USER}/key.json"
You can read more about authenticating to the BigQuery API in the official documentation.
In the samples/ directory of the client library repository you can find a lot of examples, such as Extract Table JSON, Get Dataset, and many more. The following example shows how to initialize a client and perform a query on a BigQuery public dataset.
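Here is a minimal sketch of that sample (it assumes GOOGLE_APPLICATION_CREDENTIALS is set as described above and that the @google-cloud/bigquery package is installed):

// Query a public dataset with the BigQuery Node.js client library.
const {BigQuery} = require('@google-cloud/bigquery');

async function queryPublicData() {
  // The client picks up the project and key from GOOGLE_APPLICATION_CREDENTIALS.
  const bigquery = new BigQuery();
  const query = `SELECT name, SUM(number) AS total
    FROM \`bigquery-public-data.usa_names.usa_1910_2013\`
    WHERE name = 'William'
    GROUP BY name`;
  const [rows] = await bigquery.query({query});
  rows.forEach(row => console.log(`${row.name}: ${row.total}`));
}

queryPublicData().catch(console.error);

Running this from Node should print the aggregated counts from the public usa_names dataset.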
I hope you find the above pieces of information useful.
Spark allows us to read directly from Google BigQuery, as shown below:
df = spark.read.format("bigquery") \
.option("credentialsFile", "googleKey.json") \
.option("parentProject", "projectId") \
.option("table", "project.table") \
.load()
However, having the key saved on the virtual machine isn't a great idea. I have the Google key saved as JSON securely in a credential management tool; the key is read on demand and saved into a variable called googleKey.
Is it possible to pass the JSON into spark.read, or to pass the credentials in as a dictionary?
The other option is credentials: you can base64-encode the JSON you already hold in the googleKey variable and pass the resulting string. From the spark-bigquery-connector docs:
How do I authenticate outside GCE / Dataproc?
Credentials can also be provided explicitly, either as a parameter or from Spark runtime configuration. They should be passed in as a base64-encoded string directly.
// Globally
spark.conf.set("credentials", "<SERVICE_ACCOUNT_JSON_IN_BASE64>")
// Per read/Write
spark.read.format("bigquery").option("credentials", "<SERVICE_ACCOUNT_JSON_IN_BASE64>")
This is more of a chicken-and-egg situation: if you are storing the credential file in a secret manager (I hope that's not your credential management tool), how would you access the secret manager? You might need a key for that, and then where would you store that key?
For this, Azure created managed identities, through which two different services can talk to each other without explicitly providing any keys (credentials).
If you are running from Dataproc, then the node has a built-in service account, which you can control at cluster creation. In this case you do not need to pass any credentials/credentialsFile option.
If you are running on another cloud or on-prem, you can use the local secret manager, or implement the connector's AccessTokenProvider, which gives you full control over how the credentials are created.
When trying to run queries from python (boto3) to AWS Athena, the following error is raised:
botocore.exceptions.ClientError: An error occurred
(AccessDeniedException) when calling the StartQueryExecution
operation: User: arn:aws:iam::account-id:user/sa.prd is not
authorized to perform: athena:StartQueryExecution on resource:
arn:aws:athena:us-east-1:account-id:workgroup/primary
I don't have access to the AWS console. I was also informed that there is another user, "sa.prd.athena", that has the right permissions (which does not seem to be the case for "sa.prd").
Is it possible to specify a different user with boto3? Right now I don't use any specific user.
If it is not possible to use a different user, is it possible to set some kind of policy for boto3 to use at runtime? (I ask because I don't have access to the AWS management console.)
Thanks,
BR
The User in AWS is determined by the credentials that are used to sign the API call to the AWS API. There are several ways to pass these credentials to AWS SDKs in general (and boto3 in particular).
Boto3 looks for credentials in these places and takes them from the first one where they're present:
Hard-Coded credentials while instantiating a client
Credentials stored in environment variables
Credentials stored in ~/.aws/credentials (By default it uses those of the default profile)
In the instance metadata service on EC2/ECS/Lambda
Since you're not setting up credentials directly, I assume it takes them from the SDK configuration file (option 3 above), so you could just override them while instantiating your Athena client like this:
import boto3
athena_client = boto3.client(
'athena',
aws_access_key_id=ACCESS_KEY,
aws_secret_access_key=SECRET_KEY,
aws_session_token=SESSION_TOKEN
)
This is an adapted example from the documentation; you need to specify your credentials in place of the uppercase variables.
Hardcoding these is considered bad practice though, so you might want to look into option (2), using environment variables, or set up another profile in your local SDK configuration and tell the client to use it (for example, via boto3.Session(profile_name='other-profile')). More information on that can be found in the boto3 documentation.
I am using the Node library to integrate my application with BigQuery. I am planning to accept a projectId, email, and private key from the user, and then validate the credentials by making a call to the getDatasets operation with a limit of 1. This should ensure that all three parameters passed by the user are proper.
But then I realized that even if I pass a different, valid project ID, my call to getDatasets passes and the operation returns datasets from that project. So I was wondering whether a service account is not linked to a project. Any idea how I can validate these three parameters?
A service account key has some attributes inside it, including project_id, private_key, client_email, and many others. On this page you can find how to configure the credentials used by the client libraries.
Basically, the first step is creating a service account and downloading a JSON key (I suppose you have already completed this step).
Then you need to set an environment variable in your system so your application can access the credentials.
For Linux/Mac you can do that running:
export GOOGLE_APPLICATION_CREDENTIALS="[PATH]"
For Windows (using CMD):
set GOOGLE_APPLICATION_CREDENTIALS=[PATH]
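Since the question accepts the projectId, email, and private key at runtime, one hedged way to validate them without relying on the environment variable is to check the values against each other and then pass the credentials to the client explicitly. The helper below is illustrative, not an official validation flow:

const {BigQuery} = require('@google-cloud/bigquery');

async function validateCredentials(projectId, clientEmail, privateKey) {
  // A user-created service account email has the form
  // <name>@<project-id>.iam.gserviceaccount.com, so you can check that
  // the supplied email actually belongs to the supplied project.
  if (!clientEmail.endsWith(`@${projectId}.iam.gserviceaccount.com`)) {
    throw new Error('Service account does not belong to this project');
  }
  // Pass the credentials explicitly so the client cannot silently fall
  // back to application default credentials.
  const bigquery = new BigQuery({
    projectId,
    credentials: {client_email: clientEmail, private_key: privateKey},
  });
  // This call fails if the key pair is invalid or the account has no
  // access to the project.
  await bigquery.getDatasets({autoPaginate: false, maxResults: 1});
}

This also explains why the original check passed with a different project ID: the identity comes from the key alone, and getDatasets simply lists datasets in whichever project the account has been granted access to.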
Does anyone here have experience running a NodeJS application on GAE flex environment?
I have a specific question about authenticating to non-Cloud-Platform Google APIs (e.g. Google Drive, Google Sheets, etc., as opposed to Cloud Storage).
I've been using Google's ADC and a following the method described here: https://github.com/google/google-api-nodejs-client#choosing-the-correct-credential-type-automatically
This worked for me previously - I was able to use ADC to authenticate with Cloud KMS & cloud storage.
But, I am unable to ship a feature using the Google Sheets API (it writes data to a sheet based on user interaction, so that I can hand this data easily to the business team): it runs locally with ADC and the service account key file that creates a JWT, but it does not run in production.
As there aren't any concrete examples on this for NodeJS I ended up going down the rabbit hole of looking through the separate node docs to solve my issue.
In particular, the Sheets API states that I need to use OAuth 2, but these docs say it's better for me to stick with ADC:
- https://cloud.google.com/appengine/docs/flexible/nodejs/authorizing-apps#oauth_20_authorization
- https://developers.google.com/identity/protocols/OAuth2ServiceAccount
And why would it work with my local copy of my GAE service account's JWT, but not with the Compute Engine service account? I've added both the App Engine and Compute Engine service account emails to the spreadsheet with write access, but it doesn't seem to work for the latter.
I keep running into the same error: my code works locally and I can append values to a sheet my App Engine email has write access to, but in production I keep getting the same error saying the app has insufficient permissions: "The Sheets API returned an error: Error: A Forbidden error was returned while attempting to retrieve an access token for the Compute Engine built-in service account. This may be because the Compute Engine instance does not have the correct permission scopes specified. Request had insufficient authentication scopes."
So I've also set up domain-wide delegation of authority, and done every single thing I can find about adding the right access.
My main question is, will I be able to continue using the method laid out in this code (https://github.com/google/google-api-nodejs-client#choosing-the-correct-credential-type-automatically), or should I be using OAuth2 as in this documentation (https://developers.google.com/identity/protocols/OAuth2ServiceAccount#authorizingrequests)? And if I need to use OAuth2, what's the point of even being on GAE if I can't even access Google's Sheets API without loading my own credentials in my env variables?
Am I doing something wrong?
I recently wrote NodeJS code that writes to a spreadsheet. It uses a service account and creates a JWT from a key file. My code runs as a Cloud Function, but perhaps it would give you ideas for how to solve your problem in GAE.
As mentioned by Martin, you can access Google Sheets via the Google Sheets API with the Node Client library. I'd follow that link for details.
Here's a summary:
Create a service account and download its key as a JSON file; name the file credentials.json.
Enable the Google Sheets API
Copy the email address found in step 1.
Share the Google Sheet with the email address.
Call the Google Sheets API, authorizing with the credentials created above.
Links:
Sheets API: https://developers.google.com/sheets
Node Client: https://github.com/googleapis/google-api-nodejs-client
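For illustration, a minimal sketch of that flow with the Node client (the spreadsheet ID, range, and values below are placeholders):

const {google} = require('googleapis');

async function appendRow() {
  // Authorize as the service account using the downloaded key file.
  const auth = new google.auth.GoogleAuth({
    keyFile: 'credentials.json',
    scopes: ['https://www.googleapis.com/auth/spreadsheets'],
  });
  const sheets = google.sheets({version: 'v4', auth});
  // Append a row to a sheet that was shared with the service account's email.
  await sheets.spreadsheets.values.append({
    spreadsheetId: 'YOUR_SPREADSHEET_ID',
    range: 'Sheet1!A1',
    valueInputOption: 'USER_ENTERED',
    requestBody: {values: [['some', 'data']]},
  });
}

appendRow().catch(console.error);

Because the credentials come from an explicit key file rather than the Compute Engine metadata server, this sidesteps the scope problem described in the question.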
I have set up a private cloud with two compute nodes, and the back end is working successfully. OpenStack and Horizon are used to create and manage instances.
Basically, Horizon would be used at an administrator level and not by the user. So, for the user to enter inputs like RAM, disk storage, image, etc., I want to create a frontend (ReactJS, NodeJS) website through which they can provide the details for VM creation.
The flow would be:
User provides all details through an HTML form.
Those details go to the administrator, who will check them and then approve the request.
Once the request is approved, VM is created and user can manage the VM.
I want to achieve somewhat like the above.
The goal is to not allow the user to create and manage VM instances through Horizon.
Any help?
EDITS
The flow should be the following:
1. Log in to the front-end user dashboard.
2. Allow the user to create a flavor by accepting the values.
3. Store the values in a MySQL DB.
4. The admin gets the request in his login, with an 'Accept' and a 'Decline' button on each flavor entry. If the admin accepts, the openstack flavor-creation command is run using shelljs and the output is saved in the DB.
5. The user gets the accepted flavor on his side.
6. The user creates a VM instance by selecting the flavor he received above.
7. Using the openstack server create command, the VM is created and a token URL is generated, which is displayed to the user.
8. When the user clicks on the URL, the VM is launched.
So the above is the goal to be achieved using React, NodeJS and MySQL.
The difficult parts would be:
Running the openstack CLI commands using shelljs
Generating a token URL.
I hope there might be a way to do this.
Thanks
Not sure how to help you on this. Some ideas:
You will need a Python backend unless you want your backend to talk to the OpenStack REST API directly. Like Horizon, using the Django web framework will save you some time when using the Python client APIs.
Then you can build a frontend app with ReactJS and BackboneJS, since ReactJS by itself will not handle the HTTP responses coming from your backend. Also, you should look into Redux to manage the data flow.
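If you do want to stay on NodeJS, a rough sketch of talking to the REST API directly (authenticating against Keystone, then asking Nova to create a server) could look like this; the endpoints, credentials, and payload values below are placeholders to adapt to your cloud:

// Hedged sketch for Node 18+: get a scoped token from Keystone v3,
// then create a server through the Nova API.
const KEYSTONE = 'http://controller:5000/v3';
const NOVA = 'http://controller:8774/v2.1';

async function createServer() {
  // 1. Authenticate and obtain a scoped token.
  const authRes = await fetch(`${KEYSTONE}/auth/tokens`, {
    method: 'POST',
    headers: {'Content-Type': 'application/json'},
    body: JSON.stringify({
      auth: {
        identity: {
          methods: ['password'],
          password: {user: {name: 'demo', domain: {id: 'default'}, password: 'secret'}},
        },
        scope: {project: {name: 'demo', domain: {id: 'default'}}},
      },
    }),
  });
  const token = authRes.headers.get('X-Subject-Token');

  // 2. Create the server with the approved flavor and image.
  const res = await fetch(`${NOVA}/servers`, {
    method: 'POST',
    headers: {'Content-Type': 'application/json', 'X-Auth-Token': token},
    body: JSON.stringify({
      server: {name: 'user-vm', imageRef: '<image-uuid>', flavorRef: '<flavor-id>'},
    }),
  });
  console.log(await res.json());
}

createServer().catch(console.error);

This also avoids shelling out to the openstack CLI from shelljs, whose output is fragile to parse and hard to secure.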
Here is an open source project which does what you want mostly: https://github.com/cyverse/atmosphere/
HTH