Description:
I have created a Firebase app where a user can insert a Firestore document. When this document is created, a timestamp is added so that the document can be automatically deleted by a Cloud Function after x amount of time.
After the document is created, an onCreate Cloud Function is triggered successfully, and it creates a Cloud Task, which then deletes the document at the scheduled time.
export const onCreatePost = functions
  .region(region)
  .firestore.document('/boxes/{id}')
  .onCreate(async (snapshot) => {
    const data = snapshot.data() as ExpirationDocData;
    // Box creation timestamp.
    const { timestamp } = data;
    // The path of the Firestore document ('/myCollection/{docId}').
    const docPath = snapshot.ref.path;
    await scheduleCloudTask(timestamp, docPath)
      .then(() => {
        console.log('onCreate: cloud task created successfully.');
      })
      .catch((error) => {
        console.error(error);
      });
  });
export const scheduleCloudTask = async (timestamp: number, docPath: string) => {
  // Convert the timestamp to seconds.
  const timestampToSeconds = timestamp / 1000;
  // Document time to live in seconds.
  const documentLifeTime = 20;
  const expirationAtSeconds = timestampToSeconds + documentLifeTime;
  // The Firebase project ID.
  const project = 'my-project';
  // Cloud Tasks -> Firestore time-to-live queue.
  const queue = 'my-queue';
  const queuePath: string = tasksClient.queuePath(project, region, queue);
  // The URL of the callback function that gets invoked by
  // Google Cloud Tasks when the deadline is reached.
  const url = `https://${region}-${project}.cloudfunctions.net/callbackFn`;
  const payload: ExpirationTaskPayload = { docPath };
  // Google Cloud IAM & Admin principal account.
  const serviceAccountEmail = 'myServiceAccount@appspot.gserviceaccount.com';
  // Configuration for the Cloud Task.
  const task = {
    httpRequest: {
      httpMethod: 'POST',
      url,
      oidcToken: {
        serviceAccountEmail,
      },
      body: Buffer.from(JSON.stringify(payload)).toString('base64'),
      headers: {
        'Content-Type': 'application/json',
      },
    },
    scheduleTime: {
      seconds: expirationAtSeconds,
    },
  };
  await tasksClient.createTask({
    parent: queuePath,
    task,
  });
};
export const callbackFn = functions
  .region(region)
  .https.onRequest(async (req, res) => {
    const payload = req.body as ExpirationTaskPayload;
    try {
      await admin.firestore().doc(payload.docPath).delete();
      res.sendStatus(200);
    } catch (error) {
      console.error(error);
      res.status(500).send(error);
    }
  });
Problem:
The user can also extend the document's time to live. When that happens, the timestamp is successfully updated in the Firestore document, and an onUpdate Cloud Function runs as expected.
As shown below, I tried to update the Cloud Task's "time to live" by calling the scheduleCloudTask function again, which obviously does not work and, I guess, just creates another task for the document.
export const onDocTimestampUpdate = functions
  .region(region)
  .firestore.document('/myCollection/{docId}')
  .onUpdate(async (change, context) => {
    const before = change.before.data() as ExpirationDocData;
    const after = change.after.data() as ExpirationDocData;
    if (before.timestamp < after.timestamp) {
      const docPath = change.before.ref.path;
      await scheduleCloudTask(after.timestamp, docPath)
        .then(() => {
          console.log('onUpdate: cloud task created successfully.');
        })
        .catch((error) => {
          console.error(error);
        });
    }
  });
I have not been able to find documentation or examples where an updateTask() or similar method is used to update an existing task.
Should I use the deleteTask() method and then create a new task with createTask() after the document's timestamp is updated?
Thanks in advance,
Cheers!
Yes, that's how you have to do it. There is no API to update a task.
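A minimal sketch of that delete-then-recreate flow, reusing the question's tasksClient and scheduleCloudTask; the rescheduleCloudTask name and the idea of storing the task name on the document are illustrative assumptions, not part of the original code:

export const rescheduleCloudTask = async (
  oldTaskName: string,
  timestamp: number,
  docPath: string
) => {
  // Delete the previously scheduled task; tolerate the case where it
  // has already fired and no longer exists.
  await tasksClient.deleteTask({ name: oldTaskName }).catch((error) => {
    console.warn('deleteTask failed (task may have already run):', error);
  });
  // Create a fresh task with the updated expiration.
  await scheduleCloudTask(timestamp, docPath);
};

For this to work, scheduleCloudTask would need to return the created task's name (createTask resolves with the created task, so: const [response] = await tasksClient.createTask({ parent: queuePath, task }); return response.name;), and the onCreate handler would store that name on the document so the onUpdate handler can pass it back in here.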
Related
I was trying to deploy a function to Firebase to send notifications to all admin accounts when a new user signs up to the app. This is my current code:
const functions = require("firebase-functions");
const admin = require("firebase-admin");
admin.initializeApp();

exports.newDoctorNotification = functions.database.ref("/doctors/{pushId}")
  .onCreate((snapshot, context) => {
    const newDoctorID = context.params.pushId;
    const notificationContent = {
      notification: {
        title: "New Doctor",
        body: "A new doctor just signed up! uid: " + newDoctorID,
        icon: "default",
        sound: "default",
      },
    };
    const adminTokensRef = functions.database.ref("device_tokens/admin");
    const tokens = [];
    adminTokensRef.once("value", (querySnapshot) => {
      querySnapshot.forEach((adminToken) => {
        tokens.push(adminToken.val());
      });
    });
    if (tokens.length > 0) {
      return admin.messaging().sendToDevice(tokens, notificationContent)
        .then(function(result) {
          console.log("Notification sent");
          return null;
        })
        .catch(function(error) {
          console.log("Notification failed ", error);
          return null;
        });
    }
  });
I have tried many variations, such as the get() function and on(), but they all give me the same error. I was trying to check the docs on this, but they only talk about database triggers, so I'm not sure whether normal retrieval can work or not.
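For reference, a minimal sketch (an illustrative rewrite, not the original code) that reads the token list with the Admin SDK instead of functions.database; once("value") returns a promise, so awaiting it guarantees the tokens are loaded before anything is sent:

import * as functions from "firebase-functions";
import * as admin from "firebase-admin";

admin.initializeApp();

export const newDoctorNotification = functions.database
  .ref("/doctors/{pushId}")
  .onCreate(async (snapshot, context) => {
    const newDoctorID = context.params.pushId;
    const notificationContent = {
      notification: {
        title: "New Doctor",
        body: "A new doctor just signed up! uid: " + newDoctorID,
        icon: "default",
        sound: "default",
      },
    };
    // Admin SDK read: the promise resolves once the data is actually loaded.
    const tokensSnapshot = await admin
      .database()
      .ref("device_tokens/admin")
      .once("value");
    const tokens: string[] = [];
    tokensSnapshot.forEach((adminToken) => {
      tokens.push(adminToken.val());
    });
    if (tokens.length > 0) {
      // Legacy FCM API, mirroring the question's call.
      await admin.messaging().sendToDevice(tokens, notificationContent);
      console.log("Notification sent");
    }
    return null;
  });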
EDIT:
I updated my code to reach the database node through the snapshot given in the onCreate event, and now it works. However, I am facing another problem: if I push a new doctor under the "doctors" node, the function is not called. But if I hit "test" in the Google Cloud console, the function runs and I get "null" in my snapshot.val() and "undefined" in the newDoctorID above, whereas snapshot.key gives "doctors". Why is it not calling the onCreate function?
While not a front-end developer, I'm trying to set up a web app to serve as a demo for a product. That app is based on the Sigma.js demo repository.
You'll notice that this app relies on a graph dataset which is hosted locally and loaded as follows:
/src/views/Root.tsx :
useEffect(() => {
  fetch(`${process.env.PUBLIC_URL}/dataset.json`)
    .then((res) => res.json())
    .then((dataset: Dataset) => {
      // do things ....
and I wish to replace this with a call to another service which I also host on Cloud Run.
My first guess was to use the gcloud-auth-library, but I could not make it work, especially since it does not seem to support Webpack > 5 (I might be wrong here). The point is that this lib introduces many problems in the app, and I thought I'd be better off trying the other way GCP suggests to handle auth tokens: calling the metadata server.
So I replaced the code above with:
Root.tsx :
import { getData } from "../getGraphData";
useEffect(() => {
  getData()
    .then((res) => res.json())
    .then((dataset: Dataset) => {
      // do even more things!
getGraphData.js :
import { getToken } from "./tokens";
const graphProviderUrl = '<my graph provider service URL>';
export const getData = async () => {
  try {
    const token = await getToken();
    console.log("getGraphData.js :: getData : received token", token);
    const request = await fetch(`${graphProviderUrl}`, {
      headers: {
        Authorization: `Bearer ${token}`,
      },
    });
    const data = await request.json();
    console.log("getGraphData.js :: getData : received graph", data);
    return data;
  } catch (error) {
    console.log("getGraphData.js :: getData : error getting graph data", error);
    return error.message;
  }
};
tokens.js :
const targetAudience = '<my graph provider service base URL>'; // base URL as audience
const metadataServerAddress = "169.254.169.254"; // use this to shortcut DNS call to metadata.google.internal
export const getToken = async () => {
  if (tokenExpired()) {
    const token = await getValidTokenFromServer();
    sessionStorage.setItem("accessToken", token.accessToken);
    sessionStorage.setItem("expirationDate", newExpirationDate());
    return token.accessToken;
  } else {
    console.log("tokens.js 11 | token not expired");
    return sessionStorage.getItem("accessToken");
  }
};

const newExpirationDate = () => {
  var expiration = new Date();
  expiration.setHours(expiration.getHours() + 1);
  return expiration;
};

const tokenExpired = () => {
  const now = Date.now();
  const expirationDate = sessionStorage.getItem("expirationDate");
  const expDate = new Date(expirationDate);
  if (now > expDate.getTime()) {
    return true; // token expired
  }
  return false; // valid token
};

const getValidTokenFromServer = async () => {
  // get a new token from the metadata server
  try {
    const request = await fetch(
      `http://${metadataServerAddress}/computeMetadata/v1/instance/service-accounts/default/token?audience=${targetAudience}`,
      {
        headers: {
          "Metadata-Flavor": "Google",
        },
      }
    );
    const token = await request.json();
    return token;
  } catch (error) {
    throw new Error("Issue getting new token", error.message);
  }
};
I know that this kind of call needs to be done server-side. What I don't know is how to make that happen in a React + Node app. I've tried my best to follow good practices, but most questions related to this topic (requesting credentials through an HTTP (not HTTPS!) API call) end with answers that just say "you need to do this server-side", without providing more insight into the implementation.
There is a question with a similar formulation and setting here, but the single answer, with no upvotes or comments, is a bit underwhelming. If the actual answer to the question is "you cannot ever call the metadata server from a React app and need to set up a third-party service to do so (e.g. Firebase)", I'd be keen on having it said explicitly!
Please assume I have only a very superficial understanding of Node.js and React!
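For what it's worth, here is a minimal sketch of the server-side piece, assuming the React build is served by a small Node/Express server (Node 18+, for the global fetch) on Cloud Run; the /api/token route and the audience value are illustrative assumptions. The metadata server is only reachable from inside the Cloud Run container, which is why the browser cannot call it directly:

import express from "express";

const app = express();
const targetAudience = "<my graph provider service base URL>";

// Proxy endpoint the React app calls instead of the metadata server.
app.get("/api/token", async (_req, res) => {
  try {
    // The identity endpoint returns a signed ID token for the audience.
    const metadataRes = await fetch(
      "http://metadata.google.internal/computeMetadata/v1/instance/" +
        `service-accounts/default/identity?audience=${targetAudience}`,
      { headers: { "Metadata-Flavor": "Google" } }
    );
    res.send(await metadataRes.text());
  } catch (error) {
    res.status(500).send("Could not fetch identity token");
  }
});

app.listen(Number(process.env.PORT) || 8080);

The React code would then fetch /api/token from its own origin and use the result as the Bearer token, instead of talking to 169.254.169.254 from the browser.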
The error occurs on the batchProcessDocuments line:
{
code: 3,
details: 'Request contains an invalid argument.',
metadata: Metadata {
internalRepr: Map { 'grpc-server-stats-bin' => [Array] },
options: {}
},
note: 'Exception occurred in retry method that was not classified as transient'
}
I've tried to copy the example as closely as possible but without success. Is there a way of finding out more information regarding the required input parameters? There are very few examples of using Document AI on the web, as it is a new product.
Here is my code sample:
const projectId = "95715XXXXX";
const location = "eu"; // Format is 'us' or 'eu'
const processorId = "a1e1f6a3XXXXXXXX";
const gcsInputUri = "gs://nmm-storage/test.pdf";
const gcsOutputUri = "gs://nmm-storage";
const gcsOutputUriPrefix = "out_";

// Imports the Google Cloud client library
const {
  DocumentProcessorServiceClient,
} = require("@google-cloud/documentai").v1beta3;
const { Storage } = require("@google-cloud/storage");

// Instantiates Document AI, Storage clients
const client = new DocumentProcessorServiceClient();
const storage = new Storage();
const { default: PQueue } = require("p-queue");

async function batchProcessDocument() {
  const name = `projects/${projectId}/locations/${location}/processors/${processorId}`;
  // Configure the batch process request.
  const request = {
    name,
    inputConfigs: [
      {
        gcsSource: gcsInputUri,
        mimeType: "application/pdf",
      },
    ],
    outputConfig: {
      gcsDestination: `${gcsOutputUri}/${gcsOutputUriPrefix}/`,
    },
  };
  // Batch process document using a long-running operation.
  // You can wait for now, or get results later.
  // Note: first request to the service takes longer than subsequent requests.
  const [operation] = await client.batchProcessDocuments(request); //.catch(err => console.log('err', err));
  // Wait for operation to complete.
  await operation.promise();
  console.log("Document processing complete.");
}

batchProcessDocument();
I think this is the solution (you have to set the apiEndpoint parameter): https://stackoverflow.com/a/66765483/15461811
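A minimal sketch of that fix, assuming the 'eu' regional endpoint naming used by the Google Cloud client libraries:

const {
  DocumentProcessorServiceClient,
} = require("@google-cloud/documentai").v1beta3;

// Match the client's endpoint to the processor's 'eu' location;
// the default endpoint only routes to 'us' processors.
const client = new DocumentProcessorServiceClient({
  apiEndpoint: "eu-documentai.googleapis.com",
});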
I have an iteration that can take up to hours to complete.
Example:
do {
  // this is an api action
  let response = await fetch_some_data;
  // other database action
  await perform_operation();
  next = response.next;
} while (next);
I am assuming that the operation doesn't time out, but I don't know for sure. Any explanation of how Node.js behaves under this condition is highly appreciated. Thanks.
Update:
The actual development code is as follows:
const Shopify = require('shopify-api-node');
const shopServices = require('../../../../services/shop_services/shop');
const { create } = require('../../../../controllers/products/Products');
exports.initiate = async (redis_client) => {
  redis_client.lpop(['sync'], async function (err, reply) {
    if (reply === null) {
      console.log("Queue Empty");
      return true;
    }
    let data = JSON.parse(reply),
      shopservices = new shopServices(data),
      shop_data = await shopservices.get()
        .catch(error => {
          console.log(error);
        });
    const shopify = new Shopify({
      shopName: shop_data.name,
      accessToken: shop_data.access_token,
      apiVersion: '2020-04',
      autoLimit: false,
      timeout: 60 * 1000
    });
    let params = { limit: 250 };
    do {
      try {
        let response = await shopify.product.list(params);
        if (await create(response, shop_data)) {
          console.log(`${data.current}`);
        }
        data.current += data.offset;
        params = response.nextPageParameters;
      } catch (error) {
        console.log("here");
        console.log(error);
        params = false;
      }
    } while (params);
  });
}
Everything is working fine so far; I just want to make sure that the execution will actually run to completion in Node. This function is called by a cron job every minute, and the data for processing is provided by a queue.
I have used the code from the Firebase documentation to schedule a backup of the data in my Firestore project to a bucket every 6 hours. See the link and the code here:
https://firebase.google.com/docs/firestore/solutions/schedule-export
const functions = require('firebase-functions');
const firestore = require('@google-cloud/firestore');
const client = new firestore.v1.FirestoreAdminClient();

// Replace BUCKET_NAME
const bucket = 'gs://BUCKET_NAME';

exports.scheduledFirestoreExport = functions.pubsub
  .schedule('every 24 hours')
  .onRun((context) => {
    const projectId = process.env.GCP_PROJECT || process.env.GCLOUD_PROJECT;
    const databaseName = client.databasePath(projectId, '(default)');
    return client.exportDocuments({
      name: databaseName,
      outputUriPrefix: bucket,
      // Leave collectionIds empty to export all collections
      // or set to a list of collection IDs to export,
      // collectionIds: ['users', 'posts']
      collectionIds: []
    })
      .then(responses => {
        const response = responses[0];
        console.log(`Operation Name: ${response['name']}`);
      })
      .catch(err => {
        console.error(err);
        throw new Error('Export operation failed');
      });
  });
Everything works well and my data is saved the way I want, but I am nevertheless getting an error:
Error serializing return value: TypeError: Converting circular structure to JSON
Can someone tell me what I should change? I would be glad to get a hint.
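A hedged guess at a workaround, not a confirmed fix: the message suggests the Functions runtime is trying to JSON-serialize whatever value the handler resolves with, so resolving with an explicitly serializable value (null here) instead of whatever the promise chain happens to produce may help. The same export, rewritten with async/await:

exports.scheduledFirestoreExport = functions.pubsub
  .schedule('every 24 hours')
  .onRun(async (context) => {
    const projectId = process.env.GCP_PROJECT || process.env.GCLOUD_PROJECT;
    const databaseName = client.databasePath(projectId, '(default)');
    try {
      const [response] = await client.exportDocuments({
        name: databaseName,
        outputUriPrefix: bucket,
        collectionIds: [],
      });
      console.log(`Operation Name: ${response['name']}`);
      return null; // return a plain serializable value, not the operation object
    } catch (err) {
      console.error(err);
      throw new Error('Export operation failed');
    }
  });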