Google Drive REST API: folder not visible after creation by Node.js

When I create a folder via the Google Drive REST API, I cannot see the newly created folder in the Google Drive web UI. The code works: it returns the JSON with all the data of the folder just created.
function createFolder(nameProduct, folderId) {
  console.log("folder name: " + nameProduct);
  console.log("parent id: " + folderId);
  var fileMetadata = {
    'name': nameProduct,
    'mimeType': 'application/vnd.google-apps.folder',
    // Drive v3 expects "parents" (an array of parent folder IDs), not "parent"
    'parents': [folderId]
  };
  let objFolder = {
    auth: jwToken,
    resource: fileMetadata,
    fields: 'id'
  };
  return new Promise((resolve, reject) => {
    drive.files.create(objFolder).then(function (response) {
      // Handle the results here (response.data has the parsed body).
      console.log("Response", response);
      resolve(response);
    },
    function (err) {
      // Handle the error here.
      console.error("Execute error", err);
      reject(err);
    });
  });
}
var jwToken = new google.auth.JWT(
  key.client_email,
  null,
  key.private_key,
  ["https://www.googleapis.com/auth/drive"],
  null
);
jwToken.authorize((authErr) => {
  if (authErr) {
    console.log("error: " + authErr);
    return;
  } else {
    console.log("Authorization granted");
  }
});
The authentication works fine, and I'm able to create files. I have this problem only with folders.
Why is the folder not visible on Google Drive? The method correctly returns the id.

You appear to be using a service account to authenticate. A service account is a dummy user: it has its own Google Drive account, its own Google Calendar account, and probably a few more.
When you created that folder, it was created in the service account's Google Drive account. There is no web view for a service account; the only access you have to it is programmatic.
Option: upload to your account
Create a directory in your own Google Drive account, share that directory with the service account, and upload to that directory.
Option: share the service account's directory with you
Have the service account create a directory and then grant yourself permission on the directory. You will then be able to see that directory in your account.
Note
When the service account uploads files, it will be the owner of those files. Make sure that you grant yourself permissions on these files, or you won't be able to access them. Yes, you can have a file in your Drive account that you don't have permission to access. 😉
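To make that concrete, here is a minimal sketch of granting your own account access to a folder the service account created, via drive.permissions.create. The email address is a placeholder, and jwToken and drive are assumed to be the same objects set up in the question.
// Sketch: grant a personal account access to a folder owned by the service account.
// 'you@example.com' is a placeholder; jwToken and drive come from the question's setup.
function shareFolderWithMe(folderId) {
  return drive.permissions.create({
    auth: jwToken,
    fileId: folderId,
    resource: {
      role: 'writer',      // or 'reader'
      type: 'user',
      emailAddress: 'you@example.com'
    },
    fields: 'id'
  });
}
// Usage: createFolder('MyProduct', parentId).then(res => shareFolderWithMe(res.data.id));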

Related

Google service account unable to list calendars?

Can anyone indicate the right settings to allow a service account access to a Google calendar?
We have a Node.js project that is meant to provide access to the Google calendars it has been given access to, but we can't work out why it can't see the calendars.
At the same time, we tried a different service account that has been working in another environment, and that one worked. The problem is that no one who previously worked on the project has access to that account's configuration, so we can't compare.
The response.data we are getting with the problem account:
{
  "kind": "calendar#calendarList",
  "etag": "\"p33k9lxxjsrxxa0g\"",
  "nextSyncToken": "COias9Pm4vUCEjxvdGurdXRob24tZGV2LXNlcnZpY2UxQG90YxxxdGhvbi1kZXYuaWFtLmdzZXJ2aWNlYWNxb3VudC5jb20=",
  "items": []
}
Either way, this suggests the issue is with the configuration of the service account rather than the code itself, but I will share the code anyhow:
import { Auth, calendar_v3 as calendarV3 } from 'googleapis';

const Calendar = calendarV3.Calendar;

async getCalendarApi () {
  const jwtClient = new Auth.JWT(
    this.keyConfig.clientEmail,
    undefined,
    this.keyConfig.privateKey.replace(/\\n/g, '\n'),
    [
      'https://www.googleapis.com/auth/calendar.readonly',
      'https://www.googleapis.com/auth/calendar.events'
    ]
  );
  const calendarApi = new Calendar({
    auth: jwtClient
  });
  await jwtClient.authorize();
  return calendarApi;
}

async init () {
  // loads the configuration from our app's configuration file
  this.keyConfig = getKeyConfig('google-calendar');
  const calendarApi = await this.getCalendarApi();
  const response = await calendarApi.calendarList.list();
  console.log(JSON.stringify(response.data, undefined, 2));
}
As for the service account, this is how it was set up in the Google console:
Open up https://console.cloud.google.com/
Select the appropriate project at the top
Search for 'Google Calendar API' in the search bar
Validate it is enabled
Click 'Manage', which takes me to https://console.cloud.google.com/apis/api/calendar-json.googleapis.com/metrics?project=xxxxxx ('xxxxxx' is the project id, but masked here)
Click on 'Credentials'
Click on 'Create Credentials' at the top and select 'service account'
Provide a name of the service account
Click 'Create and Continue'
Add a role: Basic -> Viewer
Click 'Done'
In the 'Credentials' page (https://console.cloud.google.com/apis/credentials?project=xxxxxx) click edit for the account we just created
In the 'keys' section: Add key -> Create New Key and then specify JSON
From the above steps we take the client_email and private_key field values to use with our Node.js app.
Then in the calendar we want it to access, we add the email address of the service account as a viewer.
Trying all the above still results in an empty list of calendars visible to the service account.
Any ideas?
From your explanation and the returned value, I suspect that you have never inserted the shared Calendar into the service account's calendar list. If my understanding is correct, how about the following modification?
Modification points:
Insert the shared Calendar into the service account's calendar list.
In this case, please modify the scope https://www.googleapis.com/auth/calendar.readonly to https://www.googleapis.com/auth/calendar.
async init () {
  // loads the configuration from our app's configuration file
  this.keyConfig = getKeyConfig('google-calendar');
  const calendarApi = await this.getCalendarApi();
  const response = await calendarApi.calendarList.insert({ resource: { id: "####group.calendar.google.com" } }); // Please set the Calendar ID here. Or { requestBody: { id: "####group.calendar.google.com" } }
  console.log(JSON.stringify(response.data, undefined, 2));
}
Retrieve the calendar list using your original script.
After the Calendar has been inserted for the service account using the above script, the shared Calendar will appear in the calendar list.
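For reference, here is a minimal sketch of the question's getCalendarApi with the broader scope this answer calls for; it assumes the same keyConfig fields as in the question.
// Sketch: same JWT client as in the question, but with the full calendar scope
// required by calendarList.insert (keyConfig fields assumed as above).
async getCalendarApi () {
  const jwtClient = new Auth.JWT(
    this.keyConfig.clientEmail,
    undefined,
    this.keyConfig.privateKey.replace(/\\n/g, '\n'),
    [
      'https://www.googleapis.com/auth/calendar',        // was calendar.readonly
      'https://www.googleapis.com/auth/calendar.events'
    ]
  );
  await jwtClient.authorize();
  return new Calendar({ auth: jwtClient });
}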
Reference:
CalendarList: insert

How to use googleapis google.auth.GoogleAuth() for google API service account in Twilio serverless function?

How to use googleapis google.auth.GoogleAuth() for google API service account in Twilio serverless function, since there is no FS path to provide as a keyFile value?
Based on the examples here ( https://www.section.io/engineering-education/google-sheets-api-in-nodejs/ ), in the Google API Node.js client documentation, and in the Twilio 'Receive an inbound SMS' example, my code looks like...
const { google } = require('googleapis')
const fs = require('fs')

exports.handler = async function (context, event, callback) {
  const twiml = new Twilio.twiml.MessagingResponse()

  // console.log(Runtime.getAssets()["/gservicecreds.private.json"].path)
  console.log('Opening google API creds for examination...')
  const creds = JSON.parse(
    fs.readFileSync(Runtime.getAssets()["/gservicecreds.private.json"].path, "utf8")
  )
  console.log(creds)

  // connect to google sheet
  console.log("Getting googleapis connection...")
  const auth = new google.auth.GoogleAuth({
    keyFile: Runtime.getAssets()["/gservicecreds.private.json"].path,
    scopes: "https://www.googleapis.com/auth/spreadsheets",
  })
  const authClientObj = await auth.getClient()
  const sheets = google.sheets({ version: 'v4', auth: authClientObj })
  const spreadsheetId = "myspreadsheetID"

  console.log("Processing message...")
  if (String(event.Body).trim().toLowerCase() == 'keyword') { // compare against lowercase, since the input is lowercased
    console.log('DO SOMETHING...')
    try {
      // see https://developers.google.com/sheets/api/guides/values#reading_a_single_range
      let response = await sheets.spreadsheets.values.get({
        spreadsheetId: spreadsheetId,
        range: "'My Sheet'!B2:B1000"
      })
      console.log("Got data...")
      console.log(response)
      console.log(response.data)        // the Node client puts the parsed body on response.data
      console.log(response.data.values)
    } catch (error) {
      console.log('An error occurred...')
      console.log(error)
      console.log(error.response)
      console.log(error.errors)
    }
  }

  // Return the TwiML as the second argument to `callback`
  // This will render the response as XML in reply to the webhook request
  return callback(null, twiml)
}
...where the Asset referenced in the code is the JSON generated when creating a key pair for a Google APIs service account, manually copy/pasted as an Asset in the serverless function editor web UI.
I see error messages like...
An error occurred...
{ response: '[Object]', config: '[Object]', code: 403, errors: '[Object]' }
{ config: '[Object]', data: '[Object]', headers: '[Object]', status: 403, statusText: 'Forbidden', request: '[Object]' }
[ { message: 'The caller does not have permission', domain: 'global', reason: 'forbidden' } ]
I am assuming this is due to the keyFile not being read correctly at the auth declaration (I don't know how to do it, since all the examples I see assume a local file path as the value, and I don't know how to have a serverless function access that file; my attempt in the code block is really just a shot in the dark).
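As an aside, a minimal sketch of an alternative worth checking: google-auth-library's GoogleAuth can also take the parsed key material directly via its credentials option instead of a keyFile path (reusing the creds object parsed above; not verified inside Twilio).
// Sketch (not verified in Twilio): pass the parsed service-account key directly,
// avoiding any dependency on a filesystem path; "creds" is parsed from the Asset above.
const auth = new google.auth.GoogleAuth({
  credentials: {
    client_email: creds.client_email,
    private_key: creds.private_key,
  },
  scopes: ["https://www.googleapis.com/auth/spreadsheets"],
})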
FYI, I can see that the service account has an Editor role in the Google APIs console (though I notice the "Resources this service account can access" section shows the error
"Could not find an ancestor of the selected project where you have access to view a policy report on at least one ancestor"
and I really have no idea what that means or implies; I'm very new to this).
Can anyone help with what could be going wrong here?
(BTW, if there is something really dumb/obvious that I am missing (e.g. a typo), just let me know in a comment so I can delete this post, as it would then not serve any future value to others.)
'The caller does not have permission', domain: 'global', reason: 'forbidden'
This actually means that the currently authenticated user (the service account) does not have access to do what you are asking it to do.
You are trying to access a spreadsheet.
Is this sheet in the service account's Google Drive account? If not, did you share the sheet with the service account?
The service account is just like any other user: if it doesn't have access to something, it can't access it. Go to the Google Drive web application and share the sheet with the service account like you would share it with any other user; just use the service account's email address (the client_email value, the one with an @ in it).
Delegate to a user on your domain
If you set up domain-wide delegation properly, then you can have the service account act as a user on your domain who does have access to the file.
delegated_credentials = credentials.with_subject('userWithAccess@YourDomain.org')
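The line above is the Python google-auth idiom; a rough Node.js equivalent, assuming domain-wide delegation has been set up for the service account, is to set the subject on the JWT client (the user email is a placeholder).
// Sketch: impersonate a domain user who has access to the sheet.
// Requires domain-wide delegation; 'userWithAccess@YourDomain.org' is a placeholder.
const jwtClient = new google.auth.JWT({
  email: creds.client_email,
  key: creds.private_key,
  scopes: ["https://www.googleapis.com/auth/spreadsheets"],
  subject: "userWithAccess@YourDomain.org",
})
const sheets = google.sheets({ version: "v4", auth: jwtClient })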

Node JS Google Drive api error: The user's Drive storage quota has been exceeded

I want to upload files to Google Drive using the Node.js API. I have enabled the Google Drive API and created a service account, then shared one folder with this account. The problem is that I can see the files I have uploaded using Node.js, but my free space doesn't change when I upload files via the API, so I can't monitor how much space is left. What is more, when I uploaded about 7 GB via the API, this error appeared (I have 14 GB free on Google Drive):
code: 403,
errors: [
  {
    domain: 'global',
    reason: 'storageQuotaExceeded',
    message: "The user's Drive storage quota has been exceeded."
  }
]
Why can I see these files on Google Drive even though they don't use my Google Drive space? How can I make them use my Google Drive space?
Uploading function:
const { google } = require('googleapis');
const path = require('path');
const fs = require('fs');

const SCOPES = ['https://www.googleapis.com/auth/drive'];
const KEYFILEPATH = "key.json";
let main_dir_id = "1oT2Fxi1L6iHl9pDNGyqwBDyUHyHUmCJJ";

const auth = new google.auth.GoogleAuth({
  keyFile: KEYFILEPATH,
  scopes: SCOPES
});

let createAndUploadFile = async (auth, file_path, mimeType, folder_id, i = 0) => {
  const driveService = google.drive({ version: 'v3', auth });
  let fileMetaData = {
    'name': file_path.slice(file_path.lastIndexOf("/") + 1),
    'parents': [folder_id]
  };
  let media = {
    mimeType: mimeType,
    body: fs.createReadStream(file_path)
  };
  let res = await driveService.files.create({
    resource: fileMetaData,
    media: media,
  });
  if (res.status === 200) {
    console.log('Created file id: ', res.data.id);
    return res.data.id;
  } else {
    // the error details are in res
    return 0;
  }
};
Issue:
As you can see at Share folders in Google Drive, the storage space is counted against the account that uploaded the file (that is, the owner of the file), not the owner of the shared folder:
Storage is counted against the person who uploaded the file, not the owner of the folder.
Therefore, if you upload the file with the service account, the file will take up storage space in the service account's Drive. That is to say, when you see that there are still 14 GB free in your regular account's Drive, you are looking in the wrong place.
Possible solutions:
Call About: get with your service account to check how much space you have left in that account's Drive. Chances are you can delete some files in order to free up some storage space.
If that is not a possibility, I'd suggest granting domain-wide authority to the service account and using that to impersonate your regular account (and upload the files on its behalf).
Of course, transferring ownership of the file would free up space in the service account's Drive, but since you cannot finish uploading the file in the first place, I don't think that is a feasible solution to this problem.
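A minimal sketch of that About: get check, reusing the auth object from the question (storageQuota is the Drive v3 About field):
// Sketch: check the service account's own storage quota (reuses the question's auth).
let checkQuota = async (auth) => {
  const driveService = google.drive({ version: 'v3', auth });
  const res = await driveService.about.get({ fields: 'storageQuota' });
  // storageQuota contains limit, usage, usageInDrive and usageInDriveTrash, as strings in bytes
  console.log(res.data.storageQuota);
};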
I had the same problem, and I created this tool to delete some files in my service account's Drive:
https://github.com/quangvinh2080/google-drive-cli
Set your credentials
export GDRIVE_CLIENT_EMAIL=example@email.com
export GDRIVE_PRIVATE_KEY="service account's private key"
Check quota
google-drive about:get --fields=storageQuota --rawOutput
List & delete files & empty trash:
google-drive files:list
google-drive files:delete --fileId=<fileId>
google-drive files:emptyTrash

Using Google Drive API

I am in the process of setting up a data scraper that writes data to an Excel file, and I want to upload those files to a folder in my own Drive account. This will be done once a day via a scheduler, so fully automated is the aim here.
I am looking at this quickstart guide, and it goes through the OAuth2 process. I don't want to access any other user's data, just push up the files I create. This may be a stupid question, but do I have to go through the OAuth process rather than just use an API key and secret, for example?
In Step 4 it says "The first time you run the sample, it will prompt you to authorize access." How would I do that if this is running on an EC2 instance, for example?
Thanks
If you are only going to be accessing your own account, then you should look into a service account. Service accounts are preauthorized, so you won't get a window popping up asking for access. I am not a Node.js programmer, so I can't help much with code. I don't see a service account example for Node.js for Google Drive; you may be able to find one for one of the other APIs, or check the client library samples.
A service account is not you; it's a dummy user. You can take the service account's email address, share a folder or file on your personal Google Drive account with the service account, and it will have access. Read more about service accounts.
I thought I would post how I got this to work (I need to look into checking if the file exists and then replacing it, but that's later). Firstly, @DalmTo and @JoeClay had good points, so I looked into them further and came across this blog post.
// index.js
const { google } = require('googleapis');
const fs = require('fs');
const config = require('./creds.json');

const drive = google.drive('v3');
const targetFolderId = "123456789";

const jwtClient = new google.auth.JWT(
  config.client_email,
  null,
  config.private_key,
  ['https://www.googleapis.com/auth/drive'],
  null
);

jwtClient.authorize((authErr) => {
  if (authErr) {
    console.log(authErr);
    return;
  }
  const fileMetadata = {
    name: 'file.txt',
    parents: [targetFolderId]
  };
  const media = {
    mimeType: 'application/vnd.ms-excel',
    body: fs.createReadStream('./file.txt')
  };
  drive.files.create({
    auth: jwtClient,
    resource: fileMetadata,
    media,
    fields: 'id'
  }, (err, file) => {
    if (err) {
      console.log(err);
      return;
    }
    console.log('Uploaded File Id: ', file.data.id);
  });
});
As I said previously, my next step is to check if the file exists and, if it does, replace it.
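A minimal sketch of that next step, assuming the same drive, jwtClient and targetFolderId as above: look the file up by name inside the target folder, then update its content if it exists, or create it otherwise.
// Sketch: look for an existing 'file.txt' in the target folder and replace its
// content if found; otherwise create it. Assumes drive, jwtClient and targetFolderId from above.
async function uploadOrReplace() {
  const media = {
    mimeType: 'application/vnd.ms-excel',
    body: fs.createReadStream('./file.txt')
  };
  const list = await drive.files.list({
    auth: jwtClient,
    q: `name = 'file.txt' and '${targetFolderId}' in parents and trashed = false`,
    fields: 'files(id, name)'
  });
  if (list.data.files.length > 0) {
    // Replace the content of the existing file
    const res = await drive.files.update({
      auth: jwtClient,
      fileId: list.data.files[0].id,
      media,
      fields: 'id'
    });
    return res.data.id;
  }
  const res = await drive.files.create({
    auth: jwtClient,
    resource: { name: 'file.txt', parents: [targetFolderId] },
    media,
    fields: 'id'
  });
  return res.data.id;
}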

Access Denied when creating signedUrl on Google Storage Object in App engine

Exact text: "Access Denied. Caller does not have storage.objects.get access to object on (url)"
I currently have two projects on Google services. One is a resource for an Android/iOS app that uploads images to Google Cloud Storage (more specifically, Firebase Storage). The other is a Node.js server that detects when an image is uploaded and sends an email with a signed URL to that image.
The Node.js server has the following Google Storage setup and function that is called:
var storage = require("@google-cloud/storage")({
  keyFilename: "google_secret.json",
  projectId: 'xxxxxxxxxxx'
});

function getImageFromGcloud(x, y, time, date_string) {
  var bucket = storage.bucket('bucket-name');
  var storage_ref = 'Incomplete_Scans/' + date_string + "/" + x + "_" + y + "_" + time + ".jpeg";
  bucket.file(storage_ref).getSignedUrl({
    action: 'read',
    expires: '03-17-2025'
  }, function (err, url) {
    if (err) {
      console.error(err);
      return;
    }
    console.log("Sent email");
  });
}
The link provided by getSignedUrl used to work: when clicked, it would open a new tab with the image. However, recently it just stopped working. Perhaps it has to do with permissions, but nothing I'm doing is working. Any ideas?
Thank you in advance.
If you are getting a 403 (permission denied) instead of a 401 (invalid authentication), it's likely that the service account that is signing the requests has lost its access to the objects for some reason. Add read access for that service account.
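As a hedged sketch of what "add read access" could look like with the current @google-cloud/storage client (run with credentials that can administer the bucket; the bucket and service-account names are placeholders):
// Sketch: add roles/storage.objectViewer for the signing service account on the bucket.
// Assumes "admin_secret.json" can set the bucket's IAM policy; names are placeholders.
const { Storage } = require("@google-cloud/storage");
const adminStorage = new Storage({ keyFilename: "admin_secret.json", projectId: "xxxxxxxxxxx" });

async function grantObjectViewer() {
  const bucket = adminStorage.bucket("bucket-name");
  const [policy] = await bucket.iam.getPolicy();
  policy.bindings.push({
    role: "roles/storage.objectViewer",
    members: ["serviceAccount:signer@xxxxxxxxxxx.iam.gserviceaccount.com"],
  });
  await bucket.iam.setPolicy(policy);
}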
