Generating rss.xml for Angular 8 app locally works fine, but not on prod - node.js

I am trying to generate an rss.xml file for my Angular 8 app with SSR + Firebase + GCP inside the domain.
I've created an RssComponent which can be reached at the /rss route. There I call the getNews() method and receive an array of objects. Then I make an HTTP request to /api/rss, and in server.ts I handle that request:
app.post('/api/rss', (req, res) => {
  const data = req.body.data;
  const feedOptions = { /* defining options here */ };
  const feed = new RSS(feedOptions);
  data.forEach((item) => {
    feed.item({
      title: item.headingUa,
      description: item.data[0].dataUa,
      url: item.rssLink,
      guid: item.id,
      date: item.utcDate,
      // escape ampersands so the enclosure URL stays valid XML
      enclosure: {url: item.mainImg.url.toString().replace(/&/g, '&amp;'), type: 'image/jpeg'}
    });
  });
  const xml = feed.xml({indent: true});
  fs.chmod('dist/browser/rss.xml', 0o600, () => {
    fs.writeFile('dist/browser/rss.xml', xml, 'utf8', function() {
      res.status(200).end();
    });
  });
});
And finally, on response, I'm opening the recently generated rss.xml file in the RssComponent. Locally everything works fine, but on Google Cloud Platform the file is never generated.

As explained in the Cloud Functions docs:
The only writeable part of the filesystem is the /tmp directory
Try changing the path to the file to the /tmp directory.
Nevertheless, relying on local files in a serverless environment is a really bad idea. You should assume the instance handling the next request will not be the same one that handled the previous one.
The best way to handle this would be to avoid writing local files and instead storing the generated file in GCP Storage or Firebase Storage, and then retrieving it from there when needed.
This will ensure your functions are idempotent, and also will comply with the best practices.
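As a sketch of that approach, assuming firebase-admin is already initialized in server.ts (publishRssFeed is a hypothetical helper name, not part of the question's code):

```javascript
// Upload the generated XML to the default bucket instead of the local disk.
async function publishRssFeed(xml) {
  const admin = require("firebase-admin"); // loaded lazily for illustration
  const file = admin.storage().bucket().file("rss.xml");
  await file.save(xml, { contentType: "application/rss+xml" });
  // The feed can later be read back with file.download() or served via a signed URL.
}
```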

Related

How to process JS file returned from Express response.sendFile()

I have an API which uses Node.js + Express on the backend.
For one of the API endpoints, I'd like to use the Express response object method of "sendFile", documented here:
https://expressjs.com/en/api.html#res.sendFile
The API should return a Javascript file through the sendFile method.
What I can't figure out is how to read in the .js file on the front end so that I can use the JavaScript functions defined in the file. The sendFile portion appears to be working -- it's just the use of the file which I can't figure out.
Here's what I'm doing on the backend:
app.get("/api/member", async (req, res, next) => {
  // "next" must be declared as a parameter for the error branch below to work
  const options = {
    root: path.join(__dirname, '/static'),
    dotfiles: 'deny'
  }
  res.sendFile("member.js", options, (err) => {
    if (err) {
      console.log(err)
      next(err)
    } else {
      console.log('Sent file')
    }
  })
});
This seems to be working fine, as I can navigate to the endpoint on my localhost and it loads the JS file. The file member.js simply contains some javascript function definitions.
But, I can't figure out how to consume/use the file once it arrives to the front end.
Here's what I have currently on the frontend:
async function refreshJS() {
  const url = `${baseUrl}/member`;
  const response = await fetch(url, { credentials: "include" });
  const script = document.createElement("script")
  script.type = "text/javascript"
  script.src = response.body
  document.head.appendChild(script)
  eval(script)
}
I've spent a lot of time looking through the console/debugger to find the text associated with the JS functions -- but they're nowhere to be found.
I've tested this general framework by loading JS files locally through the console and it worked, so I think it's wrapped up in a misunderstanding of where the JS functions live in the API response. For example, if I replace the command above of:
script.src = response.body
with
script.src = "member.js"
then everything works fine provided I have the file locally.
The examples that I've reviewed seem to deal exclusively with sending an HTML file which is loaded on the frontend. But, I can't find supporting documentation from the fetch API to understand how to use the JS file contents.
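One way to make this work, assuming the goal is simply to execute the fetched code in the page: read the response body as text and inject it as an inline script (script.src expects a URL, not a response body stream, which is why the attempt above finds nothing). A sketch:

```javascript
async function refreshJS() {
  const response = await fetch(`${baseUrl}/member`, { credentials: "include" });
  const code = await response.text();        // the .js file's contents as a string
  const script = document.createElement("script");
  script.text = code;                        // inline the code; no src needed
  document.head.appendChild(script);         // functions defined in member.js are now global
}
```

Alternatively, skip fetch entirely and set script.src = `${baseUrl}/member`, letting the browser load and run the endpoint's response directly.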

FastAPI: file uploaded bytes on client ProgressEvent done twice before completing upload

I'm trying to implement a multiple large file uploader within a Vue.js application. The backend is a FastAPI application.
The issue is about a strange behavior for the ProgressEvent associated with an axios POST for uploading a file to the backend.
The ProgressEvent.loaded value is not incremental and resets when the file is almost entirely uploaded into the backend. It starts back from a low number of uploaded bytes and finally completes the upload. It seems like the file is uploaded twice.
I have this simple FastAPI path operation function implementing the file upload endpoint:
@router.post(
    "/upload/",
    status_code=status.HTTP_200_OK,
    summary="Upload job-specific file",
    description=(
        "Accepts file uploads. Files can be uploaded in chunks to allow pausing/resuming uploads"
    ),
    dependencies=[Depends(get_current_user)],
)
async def upload_file_chunk(chunk: UploadFile, custom_header_job_id=Header(...), settings: Settings = Depends(get_settings)):
    filename = compose_upload_filename(custom_header_job_id, chunk.filename)
    # filename = '_'.join([custom_header_job_id, chunk.filename])
    path_to_file = os.path.join(settings.UPLOAD_FOLDER, filename)
    try:
        async with aiofiles.open(path_to_file, "ab") as input_file:
            while content := await chunk.read(1024):
                await input_file.write(content)
    except FileNotFoundError:
        raise HTTPException(
            status_code=status.HTTP_404_NOT_FOUND,
            detail="File chunk not found",
        )
    return {"hello": "world"}
The endpoint is not completed yet, since it is supposed to do other things besides receiving the file.
The frontend request starts from a Vue component:
uploadFileChunk({ file, jobId, startByte, onProgressUpdate = undefined }) {
  const chunk = file.slice(startByte);
  const formData = new FormData();
  formData.append("chunk", chunk, file.name);
  //formData.append("jobId", jobId);
  /* return axios.post(`http://localhost:1234/upload`, formData, {
    headers: {
      "Custom-Header-Job-Id": jobId,
      "Content-Disposition": `form-data; name="chunk"; filename="${file.name}"`,
      "Content-Range": `bytes=${startByte}-${startByte + chunk.size}/${file.size}`,
    },
    onUploadProgress: onProgressUpdate,
  }); */
  return instance.post(`/files/upload`, formData, {
    headers: {
      "Custom-Header-Job-Id": jobId,
      "Content-Disposition": `form-data; name="chunk"; filename="${file.name}"`,
      "Content-Range": `bytes=${startByte}-${startByte + chunk.size}/${file.size}`,
    },
    onUploadProgress: onProgressUpdate,
  });
}
const onProgressUpdate = (progress) => {
  console.log("loaded: ", progress.loaded);
  const percentage = Math.round((progress.loaded * 100) / file.size);
  console.log("percentage: ", percentage);
};
The commented-out request points to a different Node.js backend with a file upload endpoint made exclusively to assess whether the issue I'm facing depends on the client code or on the backend. Here is its implementation (I'm not an expert in Node.js and Express):
const express = require("express");
const cors = require("cors");
const multer = require("multer");

const app = express();
app.use(express.json());
app.use(cors());

const upload = multer({ dest: "uploads/" });

app.post("/upload", upload.single("chunk"), (req, res) => {
  res.json({ message: "Successfully uploaded files" });
});

app.listen(1234);
console.log("listening on port 1234");
In addition, the client code actually lives in a much more elaborate setup involving XState for managing the uploader component. Nevertheless, the attached snippets should be enough to give an idea of the main parts discussed here.
Here is a screenshot of the request to the FastAPI endpoint:
FastAPI
Here we can see the file is almost entirely uploaded, then the loaded counter drops to a lower upload percentage, eventually finalizing the upload (not shown here).
The same experiment repeated against the Node.js endpoint, which splits the upload into far fewer packets, does not show the issue:
Node.js express
It seems like the Node.js backend works fine, whereas the FastAPI one doesn't. In my opinion there are some issues with how FastAPI/Starlette manages large files. It could be something related to the spooled file that Starlette creates, or maybe something happening when it switches from storing the file in main memory to mass storage. Unfortunately, Starlette's UploadFile class seems very hermetic and not easy to customize or inspect.
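One plausible cause worth ruling out before blaming Starlette: the route above is declared as "/upload/" (trailing slash), while the client posts to /files/upload. FastAPI answers such a mismatch with a 307 redirect, and axios transparently re-sends the whole body to the redirected URL, which would look exactly like the progress counter resetting and the file being uploaded twice. A tiny defensive helper (hypothetical name) to keep the client URL in the declared trailing-slash form:

```javascript
// Ensure the request URL matches the trailing-slash form FastAPI declared,
// so no 307 redirect (and body re-send) is triggered.
function withTrailingSlash(url) {
  return url.endsWith("/") ? url : url + "/";
}

// e.g. instance.post(withTrailingSlash("/files/upload"), formData, { ... })
```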
DETAILS
FastAPI backend running on a Debian Bullseye docker image
FastAPI version 0.78.0
python-multipart version 0.0.5
client and server running in localhost and tested with Chrome 103.0.5060.53 (Official Build) (x86_64)
system: Mac OSX 11.3.1
Thank you so much for your help!

Firebase Storage-How to delete file from storage with node.js?

I want to delete a folder in Firebase Storage with Node.js, because this runs inside a Firebase function.
For example:
storageRef.child(child1).child(child2).delete();
something like this, but the Firebase documentation doesn't say anything about it.
One more question:
When initializing Storage, the Node.js documentation requires my admin JSON credentials, but the Realtime Database doesn't; I wonder why?
Have a look at the Node.js client API Reference for Google Cloud Storage and in particular at the delete() method for a File.
You can do it like this using Node.js:
const firebase = require('firebase-admin');
async function deleteImageFromFirebase(imageName) {
await firebase.storage().bucket().file("folderName/"+imageName).delete();
}
And like this client side:
// Create a reference to the file to delete
var desertRef = storageRef.child('images/desert.jpg');
// Delete the file
desertRef.delete().then(function() {
// File deleted successfully
}).catch(function(error) {
// Uh-oh, an error occurred!
});
View this info on the Firebase website:
how to delete files Firebase-storage
This might be late, but at least on Web (so basically what you need), there is a new API to delete a whole folder.
I tested deleting a folder with 2 pictures inside, and it works. I then tried a folder-A containing folder-B plus picture-A, where folder-B also has a picture-B inside; it still deleted folder-A with all of its contents.
Solution:
const bucket = admin.storage().bucket();
return bucket.deleteFiles({
  prefix: `posts/${postId}`
});
I couldn't find this in the official documentation (perhaps it's a really new API), but here is a really cool article where I found the solution:
Automatically delete your Firebase Storage Files from Firestore with Cloud Functions for Firebase
import { storage } from "./firebaseClient";
import { bucket } from "./firebaseServer";

// Let's assume this is the URL of the image we want to delete
const downloadUrl = "https://storage.googleapis.com/storage/v1/b/<projectID>.appspot.com/o/<location>?"

// firebase delete function
const deleteImages = async ({ downloadUrl }) => {
  const httpsRef = storage.refFromURL(downloadUrl).fullPath;
  return await bucket
    .file(httpsRef)
    .delete()
    .then(() => "success")
    .catch(() => "error")
}

// call deleteImages inside an async function
const deleteStatus = await deleteImages({ downloadUrl: oldImage });
console.log(deleteStatus) //=> "success"

Get source url for image stored in Bluemix Object Storage container using Node.js app

I have an Object Storage instance on Bluemix where I am storing images in a container. I need a source URL for the images stored there so that I can use them. To do this, I'm thinking of creating a Node.js app with a POST endpoint that takes the name of an image in Object Storage as the request and returns the image URL as the response.
Is this possible or not? If possible, can anyone suggest npm modules that provide this functionality? If not, are there any other suggestions for getting the URL of an image?
Any help is appreciated. Thanks!
Start the server with the command node app.js. You also need the package pkgcloud to perform this operation. You can get the Object Storage credentials by creating a key on the IBM console inside the Storage module.
Inside app.js, insert a new route for the download:
var objectStorageHandler = require("./lib/objectStorageHandler.js");

app.get('/download', function(req, res) {
  (new objectStorageHandler()).download('YourContainerName', 'imagenamewithextension', function(download){
    console.log(res);
    download.pipe(res);
  });
});
Inside the lib folder, create a module named objectStorageHandler.js with the following code:
var pkgcloud = require('pkgcloud');

var objectStorageHandler = function(){
}

objectStorageHandler.prototype.download = function(container, file, callback) {
  var config = {
    provider: 'openstack',
    useServiceCatalog: true,
    useInternal: false,
    keystoneAuthVersion: 'v3',
    authUrl: 'https://identity.open.softlayer.com',
    tenantId: 'YOURPROJECTID', // projectId from credentials
    domainId: 'YOURDOMAINID',
    username: 'YOURUSRNAME',
    password: 'YOURPASSWORD',
    region: 'dallas' // dallas or london region
  };
  var client = pkgcloud.storage.createClient(config);
  client.auth(function (error) {
    if (error) {
      console.error("Authorization error for storage client (pkgcloud): ", error);
    }
    else {
      var request = client.download({
        container: container,
        remote: file
      });
      callback(request);
    }
  });
}

module.exports = objectStorageHandler;
After the server has started (let's assume on port 3000), simply call localhost:3000/download and it will download the image. We can also pass the image name as a parameter to download images dynamically.

Use Cordova's Filetransfer plugin with express/blob storage

I am using Typescript and Cordova 4.0.
I have the following sample code:
uploadImages(imageUris: Array<string>): any {
  var fileTransfer = new FileTransfer();
  for (var i = 0; i < imageUris.length; i++) {
    fileTransfer.upload(imageUris[i], encodeURI('http://3187cf3.ngrok.com/test/photos'), (success) => {
      alert('success');
    }, (err) => {
      alert('error');
    });
  }
}
This corresponds to an express route:
var router = express.Router(),
    test = test.controller;

router
  .post('/test/photos', bind(test.uploadPhotos, test));
Which corresponds to a controller method:
uploadPhotos(req: express.Request, res: express.Response) {
  console.log(req);
}
I can't seem to figure out how to, inside of my controller, grab the "file" or image I'm posting to my server using Filetransfer. It's not on req.body or req.query, and when I look through the entire req I can't seem to locate the file. The app flow is working enough to actually make the POST request to test/photos, I just don't know how to or if I can access the file at that point.
How does Filetransfer work, and how can I access the data I need in my controller so that I can push it to Azure Blob Storage?
It looks like you have everything set up correctly to send the data through to your controller. The issue is that you need to put the file on the request, since Cordova's FileTransfer plugin doesn't do that by default.
You can do that with a popular library multer.
Run npm install multer --save to install multer and add it to the dependencies in your package.json file.
In your express config file, add something like the following:
var multer = require('multer');
app.use(multer({ dest: path.resolve(config.root, 'public/img/') }))
'public/img/' is the path where your uploaded images will be saved.
Now your req will have files on it. To access a single uploaded file, use req.files.file. You'll want to pass this file to Azure's Blob Storage using something like blobSvc.createBlockBlobFromLocalFile(containerName, fileName, filePath).
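A sketch of that upload step with the legacy azure-storage SDK (container and blob names are placeholders; credentials are assumed to come from the AZURE_STORAGE_CONNECTION_STRING environment variable):

```javascript
// Push the file multer saved locally up to Azure Blob Storage.
function uploadToAzure(containerName, blobName, localPath, done) {
  const azure = require("azure-storage");     // loaded lazily for illustration
  const blobSvc = azure.createBlobService();  // reads credentials from the environment
  blobSvc.createBlockBlobFromLocalFile(containerName, blobName, localPath, done);
}
```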
Since you're using Azure for remote storage, chances are you will want to remove the local file that multer has saved. I'd recommend using fs or rimraf to remove the file stored in public/img/ (or whatever you set the path to in your express config). With fs, you'll want the .unlink command.