React gives "Error: not supported" when I try to import a local module - node.js

I have a local module (speech.js) in my create-react-app src folder that is the Google Text-to-Speech sample code from their website. I adjusted it to be an arrow function and to use that specific export syntax.
const textToSpeech = require('@google-cloud/text-to-speech');
// Import other required libraries
const fs = require('fs');
const util = require('util');

export const main = async () => {
  // Creates a client
  const client = new textToSpeech.TextToSpeechClient();

  // The text to synthesize
  const text = "Hello world";

  // Construct the request
  const request = {
    input: {text: text},
    // Select the language and SSML Voice Gender (optional)
    voice: {languageCode: 'en-US', ssmlGender: 'NEUTRAL'},
    // Select the type of audio encoding
    audioConfig: {audioEncoding: 'MP3'},
  };

  // Performs the Text-to-Speech request
  const [response] = await client.synthesizeSpeech(request);

  // Write the binary audio content to a local file
  const writeFile = util.promisify(fs.writeFile);
  await writeFile('output.mp3', response.audioContent, 'binary');
  console.log('Audio content written to file: output.mp3');
};
What I'm not understanding is why this syntax isn't working in App.js.
import {main} from './speech';
I get the error Error: not supported, plus "4 stack frames were collapsed". Quite informative!
Does anyone know what the error could be here? I thought that as long as I used ES6-style imports and exports I wouldn't receive errors. Could this be due to the first require() statement in speech.js? Any help would be appreciated. I've felt like banging my head against the wall for the past 40 minutes.

This may not be the correct answer, but I believe it has a good chance of being right: since Node is a server-side runtime environment and not part of the actual browser, you aren't able to use Node core modules (such as fs) in React (a frontend framework). The solution to this quandary would be to use something like Electron.
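Another common approach, if Electron isn't an option, is to keep the Text-to-Speech call on a small Node backend and have the React app call it over HTTP. Below is a minimal sketch, assuming an Express server is acceptable; the /api/speak route name and port are made up for illustration.

// server.js - minimal sketch, not the asker's actual setup
const express = require('express');
const textToSpeech = require('@google-cloud/text-to-speech');

const app = express();
const client = new textToSpeech.TextToSpeechClient();

app.get('/api/speak', async (req, res) => {
  const [response] = await client.synthesizeSpeech({
    input: { text: req.query.text || 'Hello world' },
    voice: { languageCode: 'en-US', ssmlGender: 'NEUTRAL' },
    audioConfig: { audioEncoding: 'MP3' },
  });
  res.set('Content-Type', 'audio/mpeg');
  res.send(response.audioContent); // audioContent is a Buffer
});

app.listen(5000);

The React side then calls fetch('/api/speak?text=Hello') instead of importing speech.js directly.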

Related

How can I transpile a JSX file imported dynamically in Node.js?

I am currently working on a Node.js command-line package that should dynamically import a React component based on the path of the component passed as a parameter. My code is the following:
const path = require('path')
const pathArg = process.argv[2]
const componentPath = path.join(filePath, '../../../', pathArg)
const { default: component } = await import(componentPath)
The component I want to import would be something like
// src/components/Button/index.jsx
function Button () {
  return <button>Click me!</button>
}

module.exports = Button
However, when I run my script passing src/components/Button/index.jsx as a parameter, I receive unexpected token "<" while reading <button>, since the file is JSX.
How can I transpile the React file dynamically (e.g. with Babel) and get rid of the error?

Extract WAV header on javascript frontend (ReactJS)

I'm trying to analyze a file I'll be uploading from React; I need to know if it can be uploaded based on several factors.
I found https://github.com/TooTallNate/node-wav
It works great on Node.js and I'm trying to use it in React. The sample creates a readable stream and pipes it to the WAV reader.
var fs = require('fs');
var wav = require('wav');

var file = fs.createReadStream('track01.wav');
var reader = new wav.Reader();

// the "format" event gets emitted at the end of the WAVE header
reader.on('format', function (format) {
  // Format of the file
  console.log(format);
});

file.pipe(reader);
Using the FilePond controller I'm able to get a Base64 string of the file, but I can't figure out how to pass it to the reader.
This is what I have so far in ReactJS:
var wav = require('wav');
var { Readable } = require('stream');

var reader = new wav.Reader();
reader.on('format', function (format) {
  // Format of file
  console.log('format', format);
});

// decode the Base64 string and feed it to the reader as a stream
const buffer = Buffer.from(base64String, 'base64');
const readable = new Readable();
readable._read = () => { };
readable.push(buffer);
readable.push(null);
readable.pipe(reader);
But I get Error: bad "chunk id": expected "RIFF" or "RIFX", got "u+Zj"
Since this file works on Node.js with the same lib, it's obvious I'm doing something wrong.
EDIT:
This was a problem with my Base64 string; this method works if anyone needs to analyze a WAV on the frontend.
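For reference, one common cause of that "bad chunk id" error is decoding a full data URL rather than the raw Base64 payload. A minimal sketch of the working pipeline, with the prefix-stripping as an assumption about the input shape:

const wav = require('wav');
const { Readable } = require('stream');

function readWavFormat(base64String, onFormat) {
  // strip a "data:audio/wav;base64," prefix if one is present (assumption about the input)
  const raw = base64String.replace(/^data:.*;base64,/, '');

  const reader = new wav.Reader();
  reader.on('format', onFormat);

  // wrap the decoded bytes in a readable stream and pipe it to the WAV reader
  const readable = new Readable({ read() {} });
  readable.push(Buffer.from(raw, 'base64'));
  readable.push(null);
  readable.pipe(reader);
}

// usage: readWavFormat(base64String, (format) => console.log(format));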

Do not see the reason I am getting an ENOENT returned when I can see the file at the exact spot I am calling for it to be

I know this is very similar to other questions that have been asked about the same error. In the cases I have seen, though, the file name had been left off of the URL. In my case (as far as I know) the URL is specified as it should be, and I can see the file on my localhost using other tools.
I have a need in a Node.js app to perform I/O on JSON files without the benefit of using Express routing. This is an API that has only one route (processor.js). It is accessed from the GUI by selecting 'Process' from a menu. From that point on everything happens within that route, including multiple GETs/PUTs to JSON (first for the IDs, then using the IDs to get the data) and the building of SQL rows for populating SQL Server tables from the parsed JSON data. That, at least, is the concept I am testing now. It is the hand I have been dealt, so I don't have other options.
I am using fs-extra rather than request or axios etc., because they all seem to expect Express routes to accomplish the I/O. I appear to be able to read and write the JSON directly using fs-extra. I am using Sequelize (or will be) for the SQL side.
That's the background.
Here is my processor.js (I am merely validating that I can in fact get idsList returned to me at this point):
'use strict';

// node_modules
const express = require('express');
const router = express.Router();
const fse = require('fs-extra');

// local modules
const idsList = require('../functions/getIds');

router.get('/', (req, res) => {
  console.log(idsList);
});

module.exports = router;
Here is my getIds function:
'use strict';

// library modules
const express = require('express');
const router = express.Router();
const fse = require('fs-extra');
const uri = require('../uri');

// initialize general variables
let baseURL = `http://localhost:5000${uri}/`;
let idsID = 'ids.json';

const getIds = async () => {
  let url = `${baseURL}${idsID}`;
  try {
    const idsList = await fse.readJson(url);
    console.log('fse.readJson', idsList);
  } catch (err) {
    console.error(err);
  }
};

module.exports = getIds();
And, here is my error, output to the console (it didn't format very well):
Listening on port 5000...
{ [Error: ENOENT: no such file or directory, open
    'http://localhost:5000/Users/doug5solas/sandbox/libertyMutual/playground/api/ids.json']
  errno: -2,
  code: 'ENOENT',
  syscall: 'open',
  path:
    'http://localhost:5000/Users/doug5solas/sandbox/libertyMutual/playground/api/ids.json' }
What am I missing?
fs-extra can only manipulate files and directories on your local file system.
If you want to read files hosted on another machine over HTTP, use an HTTP client such as axios.
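For example, a minimal sketch of the same getIds function using axios instead of fse.readJson (assuming the server really does serve ids.json at that URL):

'use strict';
const axios = require('axios');
const uri = require('../uri');

const baseURL = `http://localhost:5000${uri}/`;
const idsID = 'ids.json';

const getIds = async () => {
  try {
    // axios resolves with a response object; the parsed JSON body is on .data
    const { data: idsList } = await axios.get(`${baseURL}${idsID}`);
    console.log('axios.get', idsList);
    return idsList;
  } catch (err) {
    console.error(err);
  }
};

// export the function itself rather than the result of calling it
module.exports = getIds;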
I moved away from fs-extra to fs.readFileSync and solved the problem. It is not my preference, but it does work; the file is small and read only once.

How to use i18next in serverless Node.js?

I am using Node.js Azure Functions. I am trying to internationalize the error messages returned by the functions with i18next. I could find examples with Express or a plain Node server; in those cases the middleware pattern can be used.
But for functions, I need a way to call i18next.t('key') with, presumably, a language parameter, which I am not able to find. Calling i18next.changeLanguage() before every call to i18next.t('key') doesn't seem practical.
My skeleton code is as follows
const i18next = require("i18next");
const backend = require("i18next-node-fs-backend");

const options = {
  // path where resources get loaded from
  loadPath: '../locales/{{lng}}/{{ns}}.json',
  // path to post missing resources
  addPath: '../locales/{{lng}}/{{ns}}.missing.json',
  // jsonIndent to use when storing json files
  jsonIndent: 4
};

i18next.use(backend).init(options);

exports.getString = (key, lang) => {
  //i18next.changeLanguage(lang,
  return i18next.t(key);
};
Is it possible to fetch translations without calling changeLanguage each time?
As pointed out in the comments, you need to call the i18next.changeLanguage(lang) function whenever the language needs to be defined or changed.
You can take a look at the documentation here.
The code could look like this
const i18next = require('i18next')
const backend = require('i18next-node-fs-backend')

const options = {
  // path where resources get loaded from
  loadPath: '../locales/{{lng}}/{{ns}}.json',
  // path to post missing resources
  addPath: '../locales/{{lng}}/{{ns}}.missing.json',
  // jsonIndent to use when storing json files
  jsonIndent: 4
}

i18next.use(backend).init(options)

exports.getString = (key, lang) => {
  return i18next
    .changeLanguage(lang)
    .then((t) => {
      return t(key) // -> same as i18next.t
    })
}
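Hypothetical usage from an Azure Function handler; since changeLanguage is asynchronous, getString now returns a promise (the require path, translation key, and query parameter name are made up for illustration):

const { getString } = require('./getString') // hypothetical module path

module.exports = async function (context, req) {
  const lang = (req.query && req.query.lang) || 'en'
  const message = await getString('errors.notFound', lang) // hypothetical key
  context.res = { status: 404, body: message }
}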

How to pass PDFKit readable stream into request's post method?

My app needs to create a PDF file and then upload it to another server. The upload happens down the line via the post method from the request npm package. Everything works fine if I pass in an fs.createReadStream:
const fs = require('fs');
const params = {file: fs.createReadStream('test.pdf')};
api.uploadFile(params);
Since PDFKit instantiates a read stream as well, I'm trying to pass that directly into the post params like this:
const PDFDocument = require('pdfkit');
const doc = new PDFDocument();
doc.text('steam test');
doc.end();
const params = {file: doc};
api.uploadFile(params);
However, this produces an error:
TypeError: Path must be a string. Received [Function]
If I look at the PDFKit source code I see (in CoffeeScript):
class PDFDocument extends stream.Readable
I'm new to streams and it's clear I'm not understanding the difference here. To me, if they are both readable streams, they should both be able to be passed in the same way.
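One workaround, sketched below under the assumption that api.uploadFile also accepts a Buffer in place of a stream, is to collect the PDFKit output into memory first and upload the resulting Buffer:

const PDFDocument = require('pdfkit');

function buildPdfBuffer() {
  return new Promise((resolve, reject) => {
    const doc = new PDFDocument();
    const chunks = [];
    // PDFDocument is a Readable, so it emits data/end events as it renders
    doc.on('data', (chunk) => chunks.push(chunk));
    doc.on('end', () => resolve(Buffer.concat(chunks)));
    doc.on('error', reject);
    doc.text('stream test');
    doc.end();
  });
}

buildPdfBuffer().then((pdfBuffer) => {
  // api.uploadFile is the app's own helper from the question; accepting a Buffer is an assumption
  api.uploadFile({ file: pdfBuffer });
});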
