I typically deploy new updates to my cloud run instances on GCP using the CLI:
gcloud run deploy CLOUD_RUN_INSTANCE --image gcr.io/ORGANIZATION/IMAGE --region us-east1 --platform managed --allow-unauthenticated --quiet
How would I run this same command as an http request using [axios][1] from my firebase functions?
The answers above solve the task of deploying Cloud Run by calling Cloud Build, which then deploys the new revision. Nevertheless, the question is very specific:
gcloud run deploy CLOUD_RUN_INSTANCE --image gcr.io/ORGANIZATION/IMAGE --region us-east1 --platform managed --allow-unauthenticated --quiet
How would I run this same command as an http request using axios from my firebase functions?
So we need to use the replaceService method. The following code does this purely with axios and HTTP requests. It's worth mentioning that this is a snippet, but it can be adapted to different approaches such as Firebase Functions:
const {GoogleAuth} = require('google-auth-library');
const axios = require('axios');

const create_revision = async () => {
  const auth = new GoogleAuth({
    scopes: 'https://www.googleapis.com/auth/cloud-platform'
  });
  const token = await auth.getAccessToken();

  //TODO: Replace as needed
  const region = 'REGION';
  const project_id = 'PROJECT_ID';
  const service_name = 'SERVICE_NAME';
  const image = 'gcr.io/PROJECT_ID/IMAGE';

  try {
    //Get the current details of the service
    const resp = await axios.get(
      `https://${region}-run.googleapis.com/apis/serving.knative.dev/v1/namespaces/${project_id}/services/${service_name}`,
      {headers: {'Authorization': `Bearer ${token}`}}
    );
    const service = resp.data;

    //Create the body to create a new revision
    const body = {
      "apiVersion": service.apiVersion,
      "kind": service.kind,
      "metadata": {
        "annotations": {
          "client.knative.dev/user-image": image,
          'run.googleapis.com/ingress': service.metadata.annotations['run.googleapis.com/ingress'],
          'run.googleapis.com/ingress-status': service.metadata.annotations['run.googleapis.com/ingress-status']
        },
        "generation": service.metadata.generation,
        "labels": (service.metadata.labels === undefined) ? {} : service.metadata.labels,
        "name": service.metadata.name,
      },
      "spec": {
        "template": {
          "metadata": {
            "annotations": {
              "autoscaling.knative.dev/maxScale": service.spec.template.metadata.annotations['autoscaling.knative.dev/maxScale'],
              "client.knative.dev/user-image": image,
            },
            "labels": (service.spec.template.metadata.labels === undefined) ? {} : service.spec.template.metadata.labels,
          },
          "spec": {
            "containerConcurrency": service.spec.template.spec.containerConcurrency,
            "containers": [{
              "image": image,
              "ports": service.spec.template.spec.containers[0].ports,
              "resources": {
                "limits": service.spec.template.spec.containers[0].resources.limits
              }
            }],
            "serviceAccountName": service.spec.template.spec.serviceAccountName,
            "timeoutSeconds": service.spec.template.spec.timeoutSeconds
          }
        },
        "traffic": service.spec.traffic[0]
      }
    };

    //Make the PUT request to replace the service with the new revision
    const create_service_response = await axios.put(
      `https://${region}-run.googleapis.com/apis/serving.knative.dev/v1/namespaces/${project_id}/services/${service_name}`,
      body,
      {headers: {'Authorization': `Bearer ${token}`}}
    );
    console.log(create_service_response.status);
  } catch (err) {
    console.error(err.response.data);
  }
};
This is the minimum body needed to create a new revision without modifying any previous configuration. For further customization, the API docs can be helpful. The code was created by analyzing the output of the gcloud command with the --log-http flag added.
Of course this is a little more complicated than the Cloud Build approach, but it answers the question and can be helpful for others.
Got it working using the following:
try {
  async function updateTheme() {
    let token: any
    const jwtClient = new google.auth.JWT(firebaseClientEmail, '', firebaseKey, [
      'https://www.googleapis.com/auth/cloud-platform',
    ])
    await jwtClient.authorize(async function (err: any, _token: any) {
      if (err) {
        console.log('JWT ERROR: ', err)
        return err
      } else {
        token = _token.access_token
        const deploySteps = [
          {
            name: 'gcr.io/cloud-builders/gcloud',
            args: [
              'run',
              'deploy',
              `${name}`,
              '--image',
              `gcr.io/${googleCloudProject}/theme-${theme}`,
              '--region',
              'us-east1',
              '--allow-unauthenticated',
              '--platform',
              'managed',
              '--quiet',
            ],
          },
        ]
        const deployRevisions = async () => {
          await axios({
            method: 'post',
            url: `https://cloudbuild.googleapis.com/v1/projects/${googleCloudProject}/builds`,
            headers: {
              Authorization: `Bearer ${token}`,
            },
            data: {
              steps: deploySteps,
              timeout: '1200s',
            },
          })
            .then(function (response: any) {
              console.log('SUCCESSFULLY DEPLOYED THEME UPDATE')
            })
            .catch(function (error: any) {
              console.log('ERROR UPDATING THEME: ', error)
            })
        }
        if (token) {
          deployRevisions()
        } else {
          console.log('MISSING TOKEN')
        }
      }
    })
  }
  await updateTheme()
} catch (e) {
  console.log('tried updating theme but something went wrong')
  return
}
I am having trouble setting up Partner Referrals when calling the PayPal API using Node.
Every time I attempt to call the API I receive the following error:
error: "invalid_token"
error_description: "The token passed in was not found in the system"
According to the documentation the URL to call is https://api-m.sandbox.paypal.com/v2/customer/partner-referrals
Looking at the URL and the error message, I believe I am getting this error because I am using production credentials, not sandbox. However, I cannot find any documentation showing the production URL for this.
Am I correct in believing this is the sandbox URL? What is the production URL if so?
I've followed the onboarding checklist but can't seem to make this work.
Here is my code:
getAuthToken = async () => {
  const clientIdAndSecret = "mylongsecret";
  const authUrl = "https://api-m.paypal.com/v1/oauth2/token";
  const base64 = Buffer.from(clientIdAndSecret).toString('base64')
  const response = await fetch(authUrl, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'Accept': 'application/json',
      'Accept-Language': 'en_US',
      'Authorization': `Basic ${base64}`,
    },
    body: 'grant_type=client_credentials'
  });
  const data = await response.json();
  return data;
}

setUpMerchant = async () => {
  let authData = await this.getAuthToken();
  const partnerUrl = "https://api-m.sandbox.paypal.com/v2/customer/partner-referrals";
  let data = {
    "operations": [
      {
        "operation": "API_INTEGRATION",
        "api_integration_preference": {
          "rest_api_integration": {
            "integration_method": "PAYPAL",
            "integration_type": "THIRD_PARTY",
            "third_party_details": {
              "features": [
                "PAYMENT",
                "REFUND"
              ]
            }
          }
        }
      }
    ],
    "products": [
      "EXPRESS_CHECKOUT"
    ],
    "legal_consents": [
      {
        "type": "SHARE_DATA_CONSENT",
        "granted": true
      }
    ]
  };
  const request = await fetch(partnerUrl, {
    method: 'POST',
    headers: {
      'Authorization': 'Bearer ' + authData.access_token,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify(data), // the payload goes in the body, not the headers
  });
  const partnerData = await request.json();
  return partnerData;
}
Edit: I discovered the issue: I was running a GET request instead of a POST. The accepted answer has the correct URL.
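The fix described in the edit can be sketched as a small helper (the helper name and payload are illustrative, not from the original post): the call must be a POST, with the JSON payload stringified into the request body.

```javascript
// Illustrative helper: build the fetch options for the partner-referrals call.
// The method must be 'POST' (an accidental GET was the root cause above), and
// the payload is serialized into `body` rather than placed among the headers.
function buildPartnerReferralRequest(accessToken, payload) {
  return {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${accessToken}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify(payload),
  };
}
```

This keeps the request construction inspectable in tests, separate from the actual network call.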
According to the documentation the URL to call is https://api-m.sandbox.paypal.com/v2/customer/partner-referrals
The production URL does not have sandbox. in the domain.
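A minimal sketch of keeping credentials and endpoints in sync (the `PAYPAL_ENV` variable name is an assumption, not part of PayPal's API): derive one base URL from the environment and build every endpoint from it, so a live token is never sent to a sandbox URL or vice versa.

```javascript
// Pick the PayPal base URL once; mixing a live token with a sandbox endpoint
// (or the reverse) is a common cause of "invalid_token" responses.
const PAYPAL_BASE =
  process.env.PAYPAL_ENV === 'live'
    ? 'https://api-m.paypal.com'
    : 'https://api-m.sandbox.paypal.com';

const authUrl = `${PAYPAL_BASE}/v1/oauth2/token`;
const partnerReferralsUrl = `${PAYPAL_BASE}/v2/customer/partner-referrals`;
```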
I can't figure out how to change the "shared" state to false with the Google Drive API.
Here is what I do:
I fetch all my folders & files which are public with [this one][1]
(I use the q:"'me' in owners and visibility != 'limited'" filter)
I take the file/folder ID and put it inside [this other one][2]
Inside the response object I get this line I want to change: "shared": true
I don't know where I can set it to false. Does someone have any idea?
Have a nice day
Edit: I use NodeJs (netlify function), here is my code to get my files & folders :
const { google } = require('googleapis')

const CLIENT_ID = process.env.CLIENT_ID
const CLIENT_SECRET = process.env.CLIENT_SECRET
const REDIRECT_URI = 'https://developers.google.com/oauthplayground'
const REFRESH_TOKEN = process.env.REFRESH_TOKEN

exports.handler = async () => {
  const oauth2Client = new google.auth.OAuth2(CLIENT_ID, CLIENT_SECRET, REDIRECT_URI)
  oauth2Client.setCredentials({ refresh_token: REFRESH_TOKEN })
  const drive = google.drive({
    version: 'v3',
    auth: oauth2Client,
  })
  try {
    const result = await drive.files.list({
      q: "'me' in owners and visibility != 'limited'"
    })
    return {
      statusCode: 200,
      headers: {
        'Access-Control-Allow-Origin': '*',
      },
      body: JSON.stringify({ ...result, Body: result.toString('utf-8') })
    }
  } catch (e) {
    console.log(e.message)
    return { statusCode: 500, body: e.message }
  }
}
To change the visibility ("shared": true -> "shared": false),
I tried @Tanaike's answer with:
const fetch = require('node-fetch')

const API_ENDPOINT_A = 'https://www.googleapis.com/drive/v3/files/'
const API_ENDPOINT_B = '/permissions/anyoneWithLink'

exports.handler = async (event) => {
  try {
    const itemId = '1-7ESYk_zKJ5Sdfg_z-XiuoXxrKKpHwSa' // event.queryStringParameters.itemId
    // The HTTP verb goes in the options, not in the URL string
    const response = await fetch(API_ENDPOINT_A + itemId + API_ENDPOINT_B, { method: 'DELETE' })
    const data = await response.json()
    return {
      statusCode: 200,
      headers: {
        'Access-Control-Allow-Origin': '*'
      },
      body: JSON.stringify({ data })
    }
  } catch (error) {
    console.log(error)
    return {
      statusCode: 500,
      body: JSON.stringify({ error: 'Failed fetching data' })
    }
  }
}
But I don't know how I can pass my private info (api key...). I used OAuth 2 for fetching; should I use it too for editing visibility?
[1]: https://developers.google.com/drive/api/v2/reference/files/list
[2]: https://developers.google.com/drive/api/v2/reference/files/patch
If I understand correctly, you are unable to find the place where you can change the file's "shared" state. If so, in the same link you provided from the official documentation, you can find it in the "Request body" section.
Regarding "I fetch all my folders & files which are public with this one (I use q:"'me' in owners and visibility != 'limited'" filter)": when you want to change a file's permission from publicly shared to restricted, you can achieve this using the Drive API as follows.
Request:
Permissions: delete is used.
DELETE https://www.googleapis.com/drive/v3/files/###fileId###/permissions/anyoneWithLink
Sample curl:
curl --request DELETE \
-H 'Authorization: Bearer [YOUR_ACCESS_TOKEN]' \
'https://www.googleapis.com/drive/v3/files/###fileId###/permissions/anyoneWithLink'
If the file is publicly shared and not shared with other specific users, when this request is run, "shared": true is changed to "shared": false.
Note:
If you want to remove the permission of the specific user, you can achieve this as follows.
Retrieve the permission ID using Permissions: list.
curl \
-H 'Authorization: Bearer [YOUR_ACCESS_TOKEN]' \
'https://www.googleapis.com/drive/v3/files/###fileId###/permissions'
Using the retrieved permission ID, you can delete the permission as follows.
curl --request DELETE \
-H 'Authorization: Bearer [YOUR_ACCESS_TOKEN]' \
'https://www.googleapis.com/drive/v3/files/###fileId###/permissions/###permissionId###'
References:
Permissions: delete
Permissions: list
Added:
From your following reply,
I actualy use nodejs for drive API, I tried the http solution, but idk how can i pass my clientId and secretId (error 500), You can find my code on my question, i edited it
When you want to delete permissions with googleapis for Node.js, you can use the following script. In this case, please include the scope https://www.googleapis.com/auth/drive.
Sample script:
const fileId = "###"; // Please set your file ID.
const drive = google.drive({ version: "v3", auth }); // Please use your client here.
drive.permissions.delete(
  {
    fileId: fileId,
    permissionId: "anyoneWithLink",
  },
  (err, res) => {
    if (err) {
      console.error(err);
      return;
    }
    console.log(res.data); // In this case, no values are returned.
  }
);
In this sample script, the permission of the publicly shared file is deleted.
I got it working with the following code:
const result = await drive.permissions.delete({
  fileId: "YOUR FILE ID",
  permissionId: "anyoneWithLink"
})
Here is the full updated code:
const { google } = require('googleapis')

const CLIENT_ID = process.env.CLIENT_ID
const CLIENT_SECRET = process.env.CLIENT_SECRET
const REDIRECT_URI = 'https://developers.google.com/oauthplayground'
const REFRESH_TOKEN = process.env.REFRESH_TOKEN

exports.handler = async () => {
  const oauth2Client = new google.auth.OAuth2(CLIENT_ID, CLIENT_SECRET, REDIRECT_URI)
  oauth2Client.setCredentials({ refresh_token: REFRESH_TOKEN })
  const drive = google.drive({
    version: 'v3',
    auth: oauth2Client,
  })
  try {
    const result = await drive.permissions.delete({
      fileId: "YOUR FILE ID",
      permissionId: "anyoneWithLink"
    })
    return {
      statusCode: 200,
      headers: {
        'Access-Control-Allow-Origin': '*',
      },
      body: JSON.stringify({ ...result, Body: result.toString('utf-8') })
    }
  } catch (e) {
    console.log(e.message)
    return { statusCode: 500, body: e.message }
  }
}
For most of you reading this, it is probably the most basic question, done within 2 minutes.
Maybe someone has the time to provide the code for me or can recommend a resource where this is explained for an absolute beginner.
I want to call an API from IBM cloud functions that requires authentication.
I got this code from an IBM video tutorial, with which I can call any open API:
let rp = require('request-promise')

function main(params) {
  if (params.actionA == 'joke') {
    const options = {
      uri: "http://api.icndb.com/jokes/random",
      json: true
    }
    return rp(options)
      .then(res => {
        return { response: res }
      })
  } else if (params.actionB == 'fact') {
    const options = {
      uri: "https://catfact.ninja/fact",
      json: true
    }
    return rp(options)
      .then(res => {
        return { response: res }
      })
  }
}
I want to keep the joke API but want to exchange the cat fact API for this inspirational quote API (which needs authentication): https://english.api.rakuten.net/HealThruWords/api/universal-inspirational-quotes/details
I can get this Node.js code from Rakuten to call the quote API:
var http = require("https");

var options = {
  "method": "GET",
  "hostname": "healthruwords.p.rapidapi.com",
  "port": null,
  "path": "/v1/quotes/?id=731&t=Wisdom&maxR=1&size=medium",
  "headers": {
    "x-rapidapi-host": "healthruwords.p.rapidapi.com",
    "x-rapidapi-key": "api key here",
    "useQueryString": true
  }
};

var req = http.request(options, function (res) {
  var chunks = [];
  res.on("data", function (chunk) {
    chunks.push(chunk);
  });
  res.on("end", function () {
    var body = Buffer.concat(chunks);
    console.log(body.toString());
  });
});

req.end();
How can I incorporate it into the if statement? I want to use that function with Watson Assistant, which works well with the current code. I just need the cat fact API exchanged for the quote API.
The two code snippets that you have provided use different modules to make the HTTP request. That is probably why it looks a bit complicated.
The first step to change the behaviour in the way that you have described is to replace the complete options in the else branch with the options from your second code snippet.
The second step is to provide the necessary API key for the quote URL as a parameter to the function.
Can you give the code snippet below a try, after adding an additional parameter apiKey to the function? I have not tested the code below, but that's how I would do it. Please let me know if it worked for you and I can improve the answer based on your feedback.
let rp = require('request-promise')

function main(params) {
  if (params.actionA == 'joke') {
    const options = {
      uri: "http://api.icndb.com/jokes/random",
      json: true
    }
    return rp(options)
      .then(res => {
        return { response: res }
      })
  } else if (params.actionB == 'fact') {
    const options = {
      method: 'GET',
      // request-promise expects the full URL in `uri`, not hostname/path pairs
      uri: "https://healthruwords.p.rapidapi.com/v1/quotes/?id=731&t=Wisdom&maxR=1&size=medium",
      json: true,
      headers: {
        "x-rapidapi-host": "healthruwords.p.rapidapi.com",
        "x-rapidapi-key": params.apiKey,
        "useQueryString": true
      }
    }
    return rp(options)
      .then(res => {
        return { response: res }
      })
  }
}
I'm trying to call AppSync from a Lambda function that I set up using AWS Amplify. I can tell that my Lambda function has read/write permission for AppSync, but when I make the POST request from Lambda to AppSync, I get an Unable to parse JWT token error. The weird thing is that when I look at the header, I don't see the authorization JWT that I see when requesting from the web application, which could be why I'm seeing this error. Instead, I see an x-amz-security-token and a different type of authorization string.
My code is pulled from a blog I found from Adrian Hall:
const env = require('process').env
const fetch = require('node-fetch')
const URL = require('url')
const AWS = require('aws-sdk')

AWS.config.update({
  region: env.AWS_REGION,
  credentials: new AWS.Credentials(
    env.AWS_ACCESS_KEY_ID,
    env.AWS_SECRET_ACCESS_KEY,
    env.AWS_SESSION_TOKEN
  ),
})

exports.handler = (event, context, callback) => {
  const ListCourses = `query ListCourses(
    $filter: ModelTodoFilterInput
    $limit: Int
    $nextToken: String
  ) {
    listCourses(filter: $filter, limit: $limit, nextToken: $nextToken) {
      items {
        id
      }
      nextToken
    }
  }`

  // const details = {
  //   userId: event.request.userAttributes.sub,
  //   userDetails: {
  //     name: event.request.userAttributes.name,
  //   },
  // }

  const post_body = {
    query: ListCourses,
    operationName: 'ListCourses',
    variables: {}, // `details` is commented out above, so send empty variables
  }
  console.log(env)
  console.log(`Posting: ${JSON.stringify(post_body, null, 2)}`)

  // POST the GraphQL query to AWS AppSync using a signed connection
  const uri = URL.parse(env.API_GRAPHQLAPIENDPOINTOUTPUT)
  const httpRequest = new AWS.HttpRequest(uri.href, env.REGION)
  httpRequest.headers.host = uri.host
  httpRequest.headers['Content-Type'] = 'application/json'
  httpRequest.method = 'POST'
  httpRequest.body = JSON.stringify(post_body)

  AWS.config.credentials.get(err => {
    const signer = new AWS.Signers.V4(httpRequest, 'appsync', true)
    signer.addAuthorization(AWS.config.credentials, AWS.util.date.getDate())
    const options = {
      method: httpRequest.method,
      body: httpRequest.body,
      headers: httpRequest.headers,
    }
    console.log('here is the uri and options')
    console.log(uri.href)
    console.log(options)
    fetch(uri.href, options)
      .then(res => res.json())
      .then(json => {
        console.log(`JSON Response = ${JSON.stringify(json, null, 2)}`)
        callback(null, event)
      })
      .catch(err => {
        console.error(`FETCH ERROR: ${JSON.stringify(err, null, 2)}`)
        callback(err)
      })
  })
}
Does anyone know why the credentials methods are authorizing the way that they are, and how I can fix this UnauthorizedException error? Just to sanity-check me: in Amplify, I did select that I wanted this Lambda function to have read/write access, and I can see in the CloudFormation template that:
"PolicyDocument": {
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "appsync:Create*",
        "appsync:StartSchemaCreation",
        "appsync:GraphQL",
        "appsync:Get*",
        "appsync:List*"
      ],
      "Resource": [
        {
          "Fn::Join": [
            "",
            [
              "arn:aws:appsync:",
              { "Ref": "AWS::Region" },
              ":",
              { "Ref": "AWS::AccountId" },
              ":apis/",
              { "Ref": "apiGraphQLAPIIdOutput" },
              "/*"
            ]
          ]
        }
      ]
    }
  ]
}
You would have to use AWS_IAM as the authorization mode for your API if you want to call it from the Lambda. Based on the error, it seems your API is set up to use AMAZON_COGNITO_USER_POOLS as the authorization mode. If you want to mix the two in your API, you might want to look at the following blog:
https://aws.amazon.com/blogs/mobile/using-multiple-authorization-types-with-aws-appsync-graphql-apis/
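As a rough illustration of what the blog describes (this fragment is hypothetical; the type and field names are borrowed from the question's ListCourses query, and the directives are AppSync's built-in auth directives): Cognito User Pools can stay the default mode while @aws_iam additionally permits IAM-signed callers such as the Lambda function.

```graphql
# Illustrative schema fragment, not taken from the question's API.
type Course @aws_cognito_user_pools @aws_iam {
  id: ID!
}

type Query {
  listCourses(limit: Int, nextToken: String): [Course]
    @aws_cognito_user_pools
    @aws_iam
}
```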
Might be a dumb question, but I couldn't find a clear answer on Stack Overflow or the AWS docs. My assumption is that it should be built in to Lambda.
I am running Node 10.x, with the Axios module, in AWS Lambda. I have a working function which checks a DNS/EC2/endpoint pathway and returns the proper response. I want to automate it so it checks, let's say, every 10 minutes. It will notify me via SES if there is an error, and do nothing on a good response.
All the functionality works, except that I am having trouble getting SES integrated. Inside the if statement below, I have added this code; the console.log works, so it's just the SES part I'm having issues with.
exports.handler = async (event, context) => {
  let aws = require('aws-sdk');
  let ses = new aws.SES({
    region: 'us-east-1'
  });
  let data = "document_contents=<?xml version=\"1.0\" encoding=\"UTF-8\"?><auth><user>fake</user><pass>info</pass></auth>";
  var axios = require("axios");
  var config = {
    headers: { 'Content-Type': 'text/xml' },
  };
  let res = await axios.post('https://awebsiteidontwanttogiveout.com', data, config);
  let eParams;
  if (res.status === 200) {
    console.log("Success");
    eParams = {
      Destination: {
        ToAddresses: ["banana@apples.com"]
      },
      Message: {
        Body: {
          Text: {
            Data: "Test SUCCESSFUL"
          }
        },
        Subject: {
          Data: "Test SUCCESSFUL"
        }
      },
      Source: "banana@apples.com"
    };
    ses.sendEmail(eParams);
  }
  if (res.status != 200) {
    console.log("Failure");
    eParams = {
      Destination: {
        ToAddresses: ["banana@apples.com"]
      },
      Message: {
        Body: {
          Text: {
            Data: "Test FAIL"
          }
        },
        Subject: {
          Data: "Test FAIL"
        }
      },
      Source: "banana@apples.com"
    };
    ses.sendEmail(eParams);
  }
};
I'm getting a timeout after 3 seconds. I uploaded a zip file with dependencies. Do I need to install the AWS SDK and upload that with the file? Shouldn't it be built in somehow? Am I missing something in my SES call?
Thanks
There are two issues to address:
sendEmail is an asynchronous operation in the AWS SDK, so you have to use:
await ses.sendEmail(eParams).promise()
or else Lambda ends execution before the sendEmail call completes.
Lambda's default timeout is 3 seconds. This can be increased to a maximum of 15 minutes in the Lambda configuration.
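The first point can be illustrated without AWS at all (the stub below mimics the shape of an AWS SDK v2 request object; names are invented for the example): an async handler that does not `await ….promise()` returns before the send finishes.

```javascript
// Stub mimicking AWS SDK v2: a request object whose .promise() resolves later.
function sendEmailStub(params) {
  return {
    promise: () =>
      new Promise((resolve) =>
        setTimeout(() => resolve({ MessageId: 'stub-message-id' }), 10)
      ),
  };
}

async function handler() {
  // The crucial pattern: convert the request to a promise and await it,
  // so the handler only returns after the "email" has been sent.
  const res = await sendEmailStub({ /* Destination, Message, Source */ }).promise();
  return res.MessageId;
}
```

Dropping the `await` (or the `.promise()`) makes `handler` resolve immediately, which in Lambda means the send is silently abandoned.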