Streaming a remote image to a buffer using Koa - Node.js

First time playing around with Node.js streams. I feel like I'm missing something fundamental here about how streams work. When I make the request to the URL it logs a 404, and if I try to write to the buffer it throws an error:
TypeError [ERR_INVALID_ARG_TYPE]: The "path" argument must be one of type string, Buffer, or URL. Received type object
const Koa = require('koa')
const app = new Koa()
const fs = require('fs')

const url = 'http://4.bp.blogspot.com/-cDeYCsNL-ZQ/UozsUJ7EqfI/AAAAAAAAGSk/EtuzOVpHoS0/s400/andy.png'
app.use(ctx => {
  const buffer = new Buffer.alloc(1000)
  ctx.request.get(url).pipe(fs.createWriteStream(buffer))
  console.log(ctx.request)
  console.log(ctx.response)
})

app.listen(1337)

The error is caused by this part: fs.createWriteStream(buffer). That method expects a string containing a filesystem path, e.g. fs.createWriteStream("image.png").
On a general level, your use case doesn't require explicitly creating any Buffers, and even Koa is redundant here. If you'd like to stream the contents of a URL to disk, with Buffers handled behind the scenes, you can write:
const request = require("request")
const fs = require("fs")
const url = 'http://4.bp.blogspot.com/-cDeYCsNL-ZQ/UozsUJ7EqfI/AAAAAAAAGSk/EtuzOVpHoS0/s400/andy.png'
request(url).pipe(fs.createWriteStream('image.png'))
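If you do want to keep serving the image through Koa rather than saving it to disk, note that Koa accepts a readable stream as the response body. A minimal sketch of that approach, assuming the same request package:
const Koa = require('koa')
const request = require('request')
const app = new Koa()

const url = 'http://4.bp.blogspot.com/-cDeYCsNL-ZQ/UozsUJ7EqfI/AAAAAAAAGSk/EtuzOVpHoS0/s400/andy.png'

app.use(ctx => {
  // request(url) returns a readable stream; Koa pipes it to the client
  ctx.type = 'image/png'
  ctx.body = request(url)
})

app.listen(1337)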

Related

React gives Error: not supported error when I try and import local module

I have a local module (speech.js) in my create-react-app src folder that is the Google Text-to-Speech sample code from their website. I adjusted it to be an arrow function and to use that specific export syntax.
const textToSpeech = require('@google-cloud/text-to-speech');
// Import other required libraries
const fs = require('fs');
const util = require('util');

export const main = async () => {
  // Creates a client
  const client = new textToSpeech.TextToSpeechClient();

  // The text to synthesize
  const text = "Hello world";

  // Construct the request
  const request = {
    input: {text: text},
    // Select the language and SSML Voice Gender (optional)
    voice: {languageCode: 'en-US', ssmlGender: 'NEUTRAL'},
    // Select the type of audio encoding
    audioConfig: {audioEncoding: 'MP3'},
  };

  // Performs the Text-to-Speech request
  const [response] = await client.synthesizeSpeech(request);

  // Write the binary audio content to a local file
  const writeFile = util.promisify(fs.writeFile);
  await writeFile('output.mp3', response.audioContent, 'binary');
  console.log('Audio content written to file: output.mp3');
};
What I'm not understanding is why this syntax isn't working in App.js.
import {main} from './speech';
I get the error "Error: not supported" and "4 stack frames were collapsed". Quite informative!
Does anyone know what the error could be here? I thought as long as I used ES6-style imports and exports I wouldn't receive errors. Could this be due to the first require() statement in speech.js? Any help would be appreciated. I've felt like banging my head against the wall for the past 40 minutes.
This may not be the correct answer, but I believe it has a good chance of being right. Since Node is a runtime environment and not part of the actual browser, you aren't able to use Node-only modules (like fs) with React, a frontend framework. One solution to this quandary would be to use something like Electron; another common approach is sketched below.
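A hedged sketch of that second approach, not part of the original answer: keep speech.js on a Node server and have the React app fetch the result over HTTP. It assumes speech.js is converted to module.exports = { main } so CommonJS can require it, and the endpoint path and port are made up for illustration.
const express = require('express');
const path = require('path');
const { main } = require('./speech'); // assumes speech.js now uses module.exports = { main }

const app = express();

// Hypothetical endpoint: runs the Node-only Text-to-Speech code server-side
app.get('/api/speech', async (req, res) => {
  await main(); // writes output.mp3 on the server
  res.sendFile(path.join(__dirname, 'output.mp3'));
});

app.listen(4000);
The React app would then call fetch('/api/speech') instead of importing the module directly.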

I do not see the reason I am getting an ENOENT returned when I can see the file at the exact spot I am calling for it to be

I know this is very similar to other questions that have been asked about the same error. In the cases I have seen, though, the file name had been left off of the URL. In my case (as far as I know) the URL is specified as it should be, and I can see the file on my localhost using other tools.
I have a need in a Node.js app to perform I/O on JSON files without the benefit of using Express routing. This is an API that has only one route (processor.js). It is accessed by a menu selection on the GUI, by selecting 'Process'. From that point on, everything happens within that route, including multiple GETs/PUTs to JSON (for ids to data, then using the ids to get the data) and the building of SQL rows for populating SQL Server tables from the parsed JSON data. That, at least, is the concept I am testing now. It is the hand I have been dealt, so I don't have other options.
I am using fs-extra rather than request or axios etc., because they all seem to expect Express routes to accomplish the I/O. I appear to be able to read and write the JSON directly using fs-extra. I am using Sequelize (or will be) for the SQL side.
That's the background.
Here is my processor.js (I am merely validating that I can in fact get idsList returned to me at this point):
'use strict';

// node_modules
const express = require('express');
const router = express.Router();
const fse = require('fs-extra')

// local modules
const idsList = require('../functions/getIds');

router.get('/', (req, res) => {
  console.log(idsList);
});

module.exports = router;
Here is my getIds function:
'use strict';

// library modules
const express = require('express');
const router = express.Router();
const fse = require('fs-extra');
const uri = require('../uri');

// initialize general variables
let baseURL = `http://localhost:5000${uri}/`;
let idsID = 'ids.json';

const getIds = async () => {
  let url = `${baseURL}${idsID}`;
  try {
    const idsList = await fse.readJson(url);
    console.log('fse.readJson', idsList);
  } catch (err) {
    console.error(err);
  }
}

module.exports = getIds();
And here is my error, output to the console:
Listening on port 5000...
{ [Error: ENOENT: no such file or directory, open 'http://localhost:5000/Users/doug5solas/sandbox/libertyMutual/playground/api/ids.json']
  errno: -2,
  code: 'ENOENT',
  syscall: 'open',
  path: 'http://localhost:5000/Users/doug5solas/sandbox/libertyMutual/playground/api/ids.json' }
What am I missing?
You can use fs-extra to manipulate files and directories in your local file system only.
If you want to read files hosted on another machine over HTTP, try using an HTTP client like axios.
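A minimal sketch of both approaches; the local path and URL here are assumptions for illustration:
const fse = require('fs-extra');
const axios = require('axios');

// Local file: give fs-extra a filesystem path, not a URL
const readFromDisk = async () => {
  const idsList = await fse.readJson('./api/ids.json');
  console.log(idsList);
};

// Remote file: fetch it over HTTP; axios parses the JSON into response.data
const readFromServer = async () => {
  const response = await axios.get('http://localhost:5000/api/ids.json');
  console.log(response.data);
};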
I moved away from fs-extra to fs.readFileSync and solved the problem. It is not my preference, but it does work; the file is small and is read only once.

Uncaught TypeError: URL is not a constructor using WHATWG URL object support for electron

I am trying to read a file using WHATWG URL object support here
and I am getting this error: Uncaught TypeError: URL is not a constructor
here is my code:
var fs = require("fs");
const { URL } = require('url');
var dbPath = 'file://192.168.5.2/db/db.sqlite';
const fileUrl = new URL(dbPath);
I faced the same issue, then I looked into the url module and found a solution.
For Node v6 use:
const URL = require('url').Url;
or
const { Url } = require('url');
If you look into the module, it exports five members, one of which is Url; so if you need to access Url, you can use either of the two forms above.
Are you using Node 6 instead of Node 8?
Node 6
const url = require('url');
const myUrl = url.parse('http://example.com');
const myUrlString = url.format(myUrl);
https://nodejs.org/dist/latest-v6.x/docs/api/url.html#url_url
Node 8
const { URL } = require('url');
const myUrl = new URL('http://example.com');
const myUrlString = myUrl.toString();
https://nodejs.org/dist/latest-v8.x/docs/api/url.html#url_url
Node v10.0.0 and newer (currently Node v19.x)
Since v10.0.0, the URL class is available on the global object, as mentioned here: Node.js Documentation - Class: URL.
So this should work without require('url'):
const myUrl = new URL('http://example.com');
The docs you took this info from are for Node version 8.4.0.
If it does not work for you, that means your Node is an older version such as 6.11.2. Then, just change the letter case of URL:
const { Url } = require('url');
const myUrl = new Url('http://example.com');
because in that version the url module exports Url, not URL.

How to pass PDFKit readable stream into request's post method?

My app needs to create a PDF file and then upload it to another server. The upload happens down the line via the post method from the request NPM package. Everything works fine if I pass in an fs.createReadStream:
const fs = require('fs');
const params = {file: fs.createReadStream('test.pdf')};
api.uploadFile(params);
Since PDFKit instantiates a read stream as well, I'm trying to pass that directly into the post params like this:
const PDFDocument = require('pdfkit');
const doc = new PDFDocument();
doc.text('steam test');
doc.end();
const params = {file: doc};
api.uploadFile(params);
However, this produces an error:
TypeError: Path must be a string. Received [Function]
If I look at the PDFKit source code I see (in CoffeeScript):
class PDFDocument extends stream.Readable
I'm new to streams and it's clear I'm not understanding the difference here. To me, if they are both readable streams, they should both be able to be passed in the same way.
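One difference worth noting: the stream returned by fs.createReadStream carries a path property that request's form handling can use to name the uploaded file, while PDFKit's stream has no such property, which is a plausible source of the error. A hedged workaround sketch, assuming the same api object from the question accepts a Buffer, is to collect the document into memory first:
const PDFDocument = require('pdfkit');

const doc = new PDFDocument();
doc.text('steam test');

// Collect the PDF chunks as the document stream emits them
const chunks = [];
doc.on('data', chunk => chunks.push(chunk));
doc.on('end', () => {
  // Join the chunks into a single Buffer for the upload
  const params = {file: Buffer.concat(chunks)};
  api.uploadFile(params);
});

doc.end(); // finalize the document so 'end' eventually fires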

How to continuously put data into a stream and transmit it while compressing it in Node.js

I am a newbie to JavaScript.
What I am trying to do is fetch data from the database and then transmit it over the internet.
Right now I can only read one entry at a time, but I want to compress all the entries together rather than compressing one entry at a time.
I could store all of them in an array and then pass that array to the zlib function, but this takes up a lot of time and memory.
Is it somehow possible to compress the data while transmitting it in Node.js with the Express API, at the same time as it is being read? Sort of like streaming servers, which compress data in real time while retrieving it from memory and then transmitting it to the client.
It's certainly possible. You can play around with this example:
var express = require('express')
  , app = express()
  , zlib = require('zlib')

app.get('/*', function(req, res) {
  res.status(200)

  // Pipe a gzip stream into the response, then write entries into it
  var stream = zlib.createGzip()
  stream.pipe(res)

  var count = 0
  stream.write('[')

  // setTimeout stands in for asynchronous database calls
  ;(function fetch_entry() {
    if (count > 10) return stream.end(']')
    stream.write((count ? ',' : '') + JSON.stringify({
      _id: count,
      some_random_garbage: Math.random(),
    }))
    count++
    setTimeout(fetch_entry, 100)
  })()
})

app.listen(1337)
console.log('run `curl http://localhost:1337/ | zcat` to see the output')
I assume you're streaming JSON; the setTimeout calls would of course need to be replaced with actual database calls. But the idea stays the same.
I'd recommend using Node.js's pipe().
Here is an example of pipe streaming with zlib (compression): it reads a file, compresses it, and writes it to a new file.
var zlib = require('zlib');
var fs = require('fs');

var gzip = zlib.createGzip();
var inp = fs.createReadStream('input.txt');
var out = fs.createWriteStream('input.txt.gz');
inp.pipe(gzip).pipe(out);
You can change the input to come from your database and the output to be the HTTP response, as sketched below.
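A hedged sketch of that idea inside an Express route; the ./db module and its streamEntries() method are hypothetical placeholders for whatever readable stream your database driver exposes:
var express = require('express');
var zlib = require('zlib');
var db = require('./db'); // hypothetical module exposing streamEntries()
var app = express();

app.get('/entries', function (req, res) {
  // Tell the client the body is gzip-compressed JSON
  res.setHeader('Content-Encoding', 'gzip');
  res.setHeader('Content-Type', 'application/json');

  db.streamEntries()          // readable stream of JSON from the database (hypothetical)
    .pipe(zlib.createGzip())  // compress on the fly
    .pipe(res);               // transmit while compressing
});

app.listen(1337);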
Ref: http://nodejs.org/api/stream.html
Ref: http://nodejs.org/api/zlib.html
