Node.js Express save SVG file stream to file

I am new to Express and I need your help.
How do I save an SVG to a file on the server using Express?
const qr = require('qr-image');
const fs = require('fs');

exports.qr = function (req, res) {
  var code = qr.image(new Date().toString(), { type: 'svg' });
  // I would like to do something like this:
  fs.writeFile('qr.svg', code, (er) => console.log(er));
  // and download using express.static
  res.type('svg');
  code.pipe(res);
};
Currently I am returning the image as a stream and it works fine.
I have an API with MongoDB built on Express, and I would like to store the QR codes on the server side. The API is for an application built with Xamarin that manages event tickets.
The QR images are going to be downloaded more than once, which is why I would like to keep them on the server.
Would a better approach be to store them locally on the client device with SQLite? Or should I just send JSON data to be parsed into an SVG on the client?
What do you think?
At the moment I am not implementing a local database.

After some time, when I came back to this topic, I found the solution.
You just need to pipe the stream into a file write stream as well. Simple as that:
var code = qr.image(ticket.code, { type: 'svg' });
code.pipe(fs.createWriteStream('i_love_qr.svg'));
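For a fuller picture, here is a minimal sketch (not from the original answer) of a route that both persists the SVG to disk and streams it to the client, with the saved files served later by express.static. The qr-output directory, the route paths, and the ticket code parameter are assumptions for illustration.

const express = require('express');
const fs = require('fs');
const path = require('path');
const qr = require('qr-image');

const app = express();
const qrDir = path.join(__dirname, 'qr-output'); // must exist before writing into it

// Serve previously generated codes as static files, e.g. GET /qr/TICKET123.svg
app.use('/qr', express.static(qrDir));

app.get('/tickets/:code/qr', (req, res) => {
  const svgStream = qr.image(req.params.code, { type: 'svg' });

  // A readable stream can be piped to several destinations:
  // one pipe writes the file, the other sends the same bytes to the client.
  svgStream.pipe(fs.createWriteStream(path.join(qrDir, req.params.code + '.svg')));
  res.type('svg');
  svgStream.pipe(res);
});

app.listen(3000);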

Related

Read data from DB and send it as a downloadable zip file

I have a use case where I want to read data from a database and send it to the frontend as a downloadable zip file. I am stuck on how to achieve this using Node.js and Express.
For now I was trying to send just a downloadable JSON file and was confused about how to achieve it.
What I have tried so far:
const { Readable } = require('stream');

const data = db.read(); // fetch an array of objects
const myBuffer = Buffer.from(JSON.stringify(data)); // create a buffer from the JSON string
const readableStream = Readable.from(myBuffer);
res.setHeader('Content-Type', 'application/octet-stream');
res.setHeader('Content-Disposition', 'attachment; filename="my json file.json"');
readableStream.pipe(res);
Trying this from Postman gives me the JSON directly. My question is how to create a downloadable zip file of the data here and send it to the client. I want to make sure I don't use fs to write a file on the server and then send it. Any help and guidance would be great, thanks!
As @ardritkrasniqi suggested, I used a third-party npm package: with archiver I was able to build an in-memory zip file and send it to the client.
I created it like this:
const archiver = require('archiver');

const data = db.read(); // fetch an array of objects
const myBuffer = Buffer.from(JSON.stringify(data)); // create a buffer from the JSON string
const archive = archiver.create('zip', {});
archive.append(myBuffer, { name: 'test.json' });
res.setHeader('Content-Type', 'application/zip');
res.setHeader('Content-Disposition', 'attachment; filename="test.zip"');
archive.pipe(res);
archive.finalize();
I hope this will be helpful for others who are trying to achieve the same thing.
PS: I am using Express as my backend Node framework.
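For completeness, here is a minimal sketch of the same idea wrapped in an Express route, with an error listener on the archive so a failure while zipping does not silently truncate the download. The /export path and the db.read() helper are assumptions carried over from the snippets above.

const express = require('express');
const archiver = require('archiver');

const app = express();

app.get('/export', async (req, res) => {
  const data = await db.read(); // assumed helper returning an array of objects, as in the question

  const archive = archiver('zip', { zlib: { level: 9 } });
  archive.on('error', (err) => {
    console.error(err);
    res.destroy(err); // headers may already be sent, so abort the connection
  });

  res.setHeader('Content-Type', 'application/zip');
  res.setHeader('Content-Disposition', 'attachment; filename="export.zip"');

  archive.pipe(res);
  archive.append(Buffer.from(JSON.stringify(data)), { name: 'data.json' });
  archive.finalize();
});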

How to fetch multiple images from an Express API into a React application?

I am using the MERN stack to build an application. I used the multer package with Express for file uploading and uploaded images to a directory in the Node.js application. Now I want to fetch the images from that directory. How can I do this? I have found
res.sendFile()
but there are cases where I will need to fetch multiple files from the server at once. I have also found the approach of just sending the path from the API to React and serving from a folder on the React side, which I don't find secure. How do I go about it?
You should decouple the back end from the front end: separate the Express part from the React part and expose a simple API. Express can also serve files as static assets (search for "static file serving" in the Express docs). Then call the API from your React app and just give the React app the image URL, e.g. example.com/static/image1.png (<img src={"example.com/static/image1.png"} />).
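As a rough sketch of that suggestion (the uploads directory name and the route paths are assumptions), the static middleware plus a small listing endpoint could look like this:

const express = require('express');
const fs = require('fs');
const path = require('path');

const app = express();
const uploadsDir = path.join(__dirname, 'uploads'); // directory multer writes into (assumed)

// Every uploaded image becomes reachable at /static/<filename>
app.use('/static', express.static(uploadsDir));

// Optional: let the React app ask which images exist, then request each one by URL.
app.get('/api/images', (req, res) => {
  fs.readdir(uploadsDir, (err, files) => {
    if (err) return res.sendStatus(500);
    res.json(files.map((name) => '/static/' + name));
  });
});

app.listen(5000);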
I ended up using streams for my problem; the following snippet might come in handy for someone with a similar issue.
const fs = require('fs')
const stream = require('stream')

app.get('/path', (req, res) => {
  const r = fs.createReadStream('path to file') // or any other way to get a readable stream
  const ps = new stream.PassThrough() // <---- the PassThrough is the trick for stream error handling
  stream.pipeline(
    r,
    ps,
    (err) => {
      if (err) {
        console.log(err) // no such file, or any other kind of error
        return res.sendStatus(400)
      }
    }
  )
  ps.pipe(res) // <---- pipe the PassThrough (not the raw file stream) into the response
})
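Since the question also asks about fetching several images at once, one option (not from the original answer) is to stream multiple files into a single zip with the archiver package, much like the zip answer earlier on this page; the file names, the uploads directory, and the existing app object are assumed here:

const archiver = require('archiver');
const path = require('path');

app.get('/images.zip', (req, res) => {
  const archive = archiver('zip');
  archive.on('error', (err) => {
    console.log(err);
    res.destroy(err);
  });

  res.setHeader('Content-Type', 'application/zip');
  res.setHeader('Content-Disposition', 'attachment; filename="images.zip"');
  archive.pipe(res);

  // Illustrative file list; in practice it might come from fs.readdir or the database.
  ['a.png', 'b.png'].forEach((name) => {
    archive.file(path.join(__dirname, 'uploads', name), { name });
  });
  archive.finalize();
});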

Node.js JSON endpoint from an API JSON stream

I am connecting to an API and pulling a set of JSON data. The JavaScript outputs the JSON as the variable feedData, and when I include that in an HTML page I get the JSON rendered as expected. What I want to do is expose it as a JSON endpoint. I tried to get fancy with:
var express = require('express');
var app = express();

app.get('/', function (req, res) {
  res.send(feedData);
});

app.listen(3000, function () {
  console.log('posting data');
});
The problem is that I need to import an api.js file, and when I attempt to load it I get an error related to api.js: window is not defined. Like I said, the HTML
document.getElementById('mydiv').innerHTML += JSON.stringify(feedData, undefined, 2);
together with
<pre id="mydiv"></pre>
works fine, but I will also need to re-pull it every x seconds because this is a live JSON feed.
Currently, I just decided to connect via Python, load the data into MongoDB and create a Node.js endpoint from there, which works fine, but it seems there should be a way to do it here.
Try using res.json: res.json(feedData); sends objects as JSON. If you want to send just a string, use res.send.
Instead of
res.send(feedData);
use
res.json(feedData);
Hope this helps
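Since the question also mentions that the feed is live and needs to be re-pulled every x seconds, one hedged approach is to refresh a cached copy in the background and serve that from the endpoint. The fetchFeed helper and the 10-second interval below are assumptions, not part of the original api.js:

const express = require('express');
const app = express();

let feedData = null;

// fetchFeed is a hypothetical async function that pulls and parses the upstream feed.
async function refreshFeed() {
  try {
    feedData = await fetchFeed();
  } catch (err) {
    console.error('feed refresh failed', err);
  }
}

refreshFeed();
setInterval(refreshFeed, 10 * 1000); // re-pull the live feed every 10 seconds

app.get('/', (req, res) => {
  if (!feedData) return res.sendStatus(503); // feed not loaded yet
  res.json(feedData);
});

app.listen(3000, function () {
  console.log('posting data');
});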

Upload a photo to a Facebook album

I have a Node.js (+ Express + MongoDB, GridStore) backend, and I want to upload a photo to a Facebook album.
I came across two methods. The first ( https://developers.facebook.com/blog/post/526/ ) needs the full URL of my picture, which I don't have, as the data is pulled from GridStore.
The second ( https://developers.facebook.com/docs/reference/api/album/ ), via the Graph API, is very poorly documented: I can't figure out what my request should look like (the form data, what fields it should have, how to convert my data blob/stream from GridStore into it).
Here is what I currently have, and it doesn't work:
facebook.uploadPhoto = function (token, albumId, photo, callback) {
  var fb = fermata.json('https://graph.facebook.com/' + albumId);
  fb.photos({ access_token: token }).post({ 'Content-Type': "multipart/form-data" }, { source: { data: photo } }, callback);
};
Any help would be much appreciated
There is a good chance the file is not properly serialized. Fermata will take a Node file Buffer via data. Have you tried passing that instead?
fs.readFile("/path/to/photo.jpg", function (err, data) {
  fermata.json("https://graph.facebook.com/graph/api").post({ "Content-Type": "multipart/form-data" }, { fileField: { data: data, name: "", type: "" } }, callback);
});
Add your access token, etc.
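If fermata proves awkward here, a hedged alternative (not part of the original answer) is to build the multipart request with the form-data package and append the photo buffer pulled from GridStore as the source field; the album ID, token, and buffer arguments are placeholders:

const FormData = require('form-data');

function uploadPhotoToAlbum(token, albumId, photoBuffer, callback) {
  const form = new FormData();
  form.append('access_token', token);
  form.append('message', 'Uploaded from Node');
  // The Graph API expects the binary image in a multipart field named "source".
  form.append('source', photoBuffer, { filename: 'photo.jpg', contentType: 'image/jpeg' });

  form.submit('https://graph.facebook.com/' + albumId + '/photos', (err, res) => {
    if (err) return callback(err);
    res.resume(); // drain the response body
    callback(null, res.statusCode);
  });
}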
I solved this problem by doing a simple POST to the Facebook Graph API using the poster module.
var options = {
  uploadUrl: 'https://graph.facebook.com/' + user + '/photos?access_token=' + accessToken,
  method: 'POST',
  fileId: 'source',
  fields: { 'message': '' } // Additional fields according to the Graph API
};
var fileName = ''; // Local or remote URL where to find the image

poster.post(fileName, options, function (err, data) {
  if (err) {
    // Something went wrong
  } else {
    // Everything OK
  }
});
Honestly, I have limited experience with the Facebook Graph API, and mostly from PHP and Java.
Here are some threads that you might find helpful:
Upload Photo To Album with Facebook's Graph API
Facebook Graph API - upload photo using JavaScript
Basically, I recommend you punt a little in your implementation and code it the following way:
1. Create a REST web service endpoint in Node.js that outputs a single image from GridStore given an internal UID.
2. Code your uploadToFacebook function to use an image URL that points at that REST endpoint.
This would allow you to validate the image output by pointing your browser at the REST endpoint, and it avoids any blob/stream conversions inside your uploadToFacebook function. I'm assuming you store the image in GridStore rather than directly in MongoDB. A sketch of the serving endpoint follows.
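A minimal sketch of step 1, serving a single image out of GridFS by id so Facebook can fetch it by URL, might look like the following. This assumes the modern MongoDB driver's GridFSBucket rather than the GridStore API mentioned in the question, and the database name, bucket name, and route path are placeholders:

const express = require('express');
const { MongoClient, GridFSBucket, ObjectId } = require('mongodb');

const app = express();

MongoClient.connect('mongodb://localhost:27017').then((client) => {
  const bucket = new GridFSBucket(client.db('mydb'), { bucketName: 'photos' });

  // GET /images/<uid> streams the stored image, which uploadToFacebook can reference by URL.
  app.get('/images/:id', (req, res) => {
    res.type('jpg'); // or look the real content type up in the file metadata
    bucket.openDownloadStream(new ObjectId(req.params.id))
      .on('error', () => res.sendStatus(404))
      .pipe(res);
  });

  app.listen(3000);
});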
hope that helps...

Express response body to buffer

I'm trying to build a quick and simple image uploading service with Node that takes the received images and saves them to MongoDB's GridFS.
GridFS put requires a Buffer object (see: NodeJS Mongo Driver GridFS put).
The question is pretty simple: how exactly do I cast/transform the received request body into a proper Buffer?
My code so far (only the important pieces):
api.js
var express = require('express');
var connect = require('connect');
var app = module.exports = express.createServer();

app.configure(function () {
  app.use(express.bodyParser());
  app.use(express.methodOverride());
  app.use(app.router);
});

var upload = require('./upload.js');
app.post('/upload', upload.upload);
upload.js
exports.upload = function (req, res, next) {
  console.log("Uploading image...");
  // Create buffer
  // Rest of the code
};
I've tried:
var buffer = new Buffer(util.inspect(req.body),'binary');
This creates the buffer, but it has the wrong size and probably not the correct content, since util.inspect is obviously not the right way to go.
And:
var buffer = new Buffer(req.body);
Result:
[Decode error - output not utf-8][Decode error - output not utf-8]
Buffer length = 0
I'm quite new to both Node and JavaScript development in general, so I'm probably missing something quite simple; don't hesitate to point out the obvious :)
Thanks!
First, remember that Express is built on top of Connect, which is the library that handles a large amount of the lower-level HTTP work, and it's where bodyParser() comes from.
The body parser middleware internally uses Formidable to parse file uploads.
Formidable's default behavior is to write uploaded files directly to disk – in other words, you don't actually have access to the uploaded file stream within your route handler. You get the values of any regular form fields (<input>s) sent along in req.body, and you get uploaded file paths via req.files, but you don't get file content.
The easy answer here is to simply read the file from disk and use that to save into Mongo, remembering to delete the temporary file when done. Of course, this introduces the unnecessary intermediate step of writing the file upload to a temporary folder and then loading to Mongo.
If you want to stream file data directly into Mongo, you have more of a challenge in front of you. You'll have to write your own middleware to parse the upload stream.
This is actually relatively easy: you can start from the internal Connect body parser implementation, using Formidable to do the heavy lifting, and use Formidable's onPart API to pass the part stream back to your route handler so that you can hand it off to the Mongo driver, as sketched below.
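As a rough sketch of that idea, assuming the legacy Formidable API (the onPart hook, as used by Connect at the time) and leaving the GridFS write itself to the route handler, custom middleware could look roughly like this:

var formidable = require('formidable');

// Hypothetical middleware: intercepts multipart uploads and exposes each file part
// as a readable stream on req, instead of letting Formidable buffer it to disk.
exports.streamingUpload = function (req, res, next) {
  if (!/multipart\/form-data/i.test(req.headers['content-type'] || '')) return next();

  var form = new formidable.IncomingForm();

  form.onPart = function (part) {
    if (!part.filename) {
      // Let Formidable handle ordinary form fields as usual.
      form.handlePart(part);
      return;
    }
    // "part" is a readable stream of the file's bytes; hand it to the route handler,
    // which can pipe it into the Mongo driver's GridFS writer.
    req.filePart = part;
    next();
  };

  form.on('error', next);
  form.parse(req);
};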
