Runtime import in TypeScript - Node.js

I want to import a JSON file at runtime.
My code:
const path = "../../assets/data/" + fileprefix + ".json"
const data: Information = await import(path);
fileprefix here changes at runtime.
I'm getting a module not found error, as below:
Error: Cannot find module '../../assets/data/US.json'
US here is the fileprefix that comes in at runtime.
Is there any way I can make Node find the module at runtime?

It's a little unclear what you mean by import. If you are referring to the static ES6 import (import something from 'something'), then no, you can't do that. However, I see that you're doing import(path). Is import() a library of some sort, or are you expecting that to work like an ES6 import?
Here's what you should do (IMO):
What you can do is serve your JSON files as static files and then use the Fetch API to bring the JSON in.
Ex:
fetch('https://jsonplaceholder.typicode.com/todos/1')
  .then(response => response.json())
  .then(json => console.log(json))
Obviously, instead of the URL being https://jsonplaceholder.typicode.com/todos/1, it would be the URL to your static JSON file.
One other option would be to gather all of your JSON files (or references to your JSON files) into a single module. Then you can import that module and call whatever method you need so that the correct JSON file comes back. Here's an article on how to go about this: https://medium.com/@leonardobrunolima/javascript-tips-dynamically-importing-es-modules-with-import-f0093dbba8e1
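For that second option, here is a minimal sketch of such a registry module (the file names US.json and CA.json, and the getData name, are assumptions for illustration):
dataRegistry.js:
// map each known prefix to its eagerly loaded JSON file
const registry = {
  US: require('../../assets/data/US.json'),
  CA: require('../../assets/data/CA.json')
};
// returns the parsed JSON for a given prefix, or undefined if it is unknown
module.exports = function getData(fileprefix) {
  return registry[fileprefix];
};
Usage: const getData = require('./dataRegistry'); const data = getData('US');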

How to read JSONL line-by-line after hitting a URL in Node.js?

From the Shopify API, I receive a link to a large amount of JSONL. Using Node.js, I need to read this data line-by-line, as loading it all at once would use a lot of memory. When I hit the JSONL URL from the web browser, it automatically downloads the JSONL file to my downloads folder.
Example of JSONL:
{"id":"gid:\/\/shopify\/Customer\/6478758936817","firstName":"Joe"}
{"id":"gid:\/\/shopify\/Order\/5044232028401","name":"#1001","createdAt":"2022-09-16T16:30:50Z","__parentId":"gid:\/\/shopify\/Customer\/6478758936817"}
{"id":"gid:\/\/shopify\/Order\/5044244480241","name":"#1003","createdAt":"2022-09-16T16:37:27Z","__parentId":"gid:\/\/shopify\/Customer\/6478758936817"}
{"id":"gid:\/\/shopify\/Order\/5057425703153","name":"#1006","createdAt":"2022-09-27T17:24:39Z","__parentId":"gid:\/\/shopify\/Customer\/6478758936817"}
{"id":"gid:\/\/shopify\/Customer\/6478771093745","firstName":"John"}
{"id":"gid:\/\/shopify\/Customer\/6478771126513","firstName":"Jane"}
I'm unsure how to process this data in Node.js. Do I need to hit the URL, download all of the data, store it in a temporary file, and then process the data line-by-line? Or can I read the data line-by-line directly after hitting the URL (via some sort of stream?) and process it without storing it in a temporary file on the server?
(The JSONL comes from https://storage.googleapis.com/ if that helps.)
Thanks.
Using axios you can set the response type to a stream, and then using the built-in readline module you can process your data line by line.
import axios from 'axios'
import { createInterface } from 'node:readline'

const response = await axios.get('https://raw.githubusercontent.com/zaibacu/thesaurus/master/en_thesaurus.jsonl', {
  responseType: 'stream'
})

const rl = createInterface({
  input: response.data
})

for await (const line of rl) {
  // do something with the current line
  const { word, synonyms } = JSON.parse(line)
  console.log('word, synonyms: ', word, synonyms)
}
Testing this, there is barely any memory usage.
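If you would rather avoid the axios dependency, here is a minimal sketch of the same approach using only Node's built-in https and readline modules (same example URL as above):
import https from 'node:https'
import { createInterface } from 'node:readline'

https.get('https://raw.githubusercontent.com/zaibacu/thesaurus/master/en_thesaurus.jsonl', res => {
  // res is a readable stream, so readline can consume it line by line
  const rl = createInterface({ input: res })
  rl.on('line', line => {
    const { word, synonyms } = JSON.parse(line)
    console.log('word, synonyms: ', word, synonyms)
  })
})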
You can also use a great CLI tool called jq. Magic.
Unlike code tied to the browser, it can be run in any context where you need to parse JSONL.
jq -cs '.' doodoo.myshopify.com.export.jsonl > out.json
This takes the bulk file just downloaded from a query and produces a pure JSON array to play with, or save.

How to Properly Import a Module Into A File (Node.js)

I am trying to make a video chat using WebRTC and Node.js. I am currently trying to add a selectable microphone (e.g. being able to change mics and webcams). I made the function, but when I try to import a function from the file that generates the IDs of the devices, it doesn't work. Note that I am not currently getting any errors; instead, when I add the import statement to the file, nothing shows up (except for the dropdowns that change the mic and webcam).
Is there a reason that Node won't let me import a function?
Note that the file that I am trying to import into exports a bunch of functions (that's the purpose of it), RTC.js. However, I also tried importing into another file, and it didn't work either (the file that imports the first file, rtc.js).
Thanks in advance
The GitHub repository is located at https://github.com/divinelemon/VideoChatAppSwitchMics
You already have an export on this line: https://github.com/divinelemon/VideoChatAppSwitchMics/blob/master/ws/stream.js#L34
module.exports = stream;
And you already have the matching import here: https://github.com/divinelemon/VideoChatAppSwitchMics/blob/master/app.js#L5
let stream = require( './ws/stream' );
You can also use the ES6 import/export syntax:
const someFunction = () => {
  ...
}
export default someFunction (in the case of a single function)
When you want to export multiple functions:
export { someFunction1, someFunction2 }
Now, in the place where you want to import:
import someFunction from 'fileLocation' (note that with a default export, the importing side can choose any name it likes)
In the case of multiple named exports, the imported names must match the exported names (you can rename them with as); the order does not matter:
import { myFunction1, myFunction2 } from 'fileLocation'
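Putting it together, a minimal sketch with hypothetical file names (Node.js needs "type": "module" in package.json, or the .mjs extension, to use ES module syntax natively):
mathUtils.mjs:
// named exports
export const add = (a, b) => a + b
export const subtract = (a, b) => a - b
// default export
export default function multiply(a, b) {
  return a * b
}
main.mjs:
import multiply, { add, subtract } from './mathUtils.mjs'
console.log(multiply(add(1, 2), subtract(5, 3))) // prints 6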

Is it possible to import a script from a URL using Node.js?

This question already exists, but I don't know if that answer is still the right choice after the ES2015 update to JavaScript. Could someone recommend an easy way to do this in Node LTS?
You can download it with your favorite HTTP library (such as got(), axios(), or node-fetch()) and, once you have it downloaded locally into memory, you can use eval() to run it. If it's actually a module, then you can save it to disk and load it with require(). The latest import syntax will support URLs, but I don't think Node.js yet allows it to be anything other than a file URL.
Here's an example, using the got() library to download the script and save it to a file:
const { promisify } = require('util');
const pipeline = promisify(require('stream').pipeline);
const fs = require('fs');
const got = require('got');

pipeline(
  got.stream('http://yourURLhere'),
  fs.createWriteStream('yourfile.js')
).then(() => {
  // load the script
  require('./yourfile.js');
}).catch(err => {
  console.log(err);
});
Note, it can be done in slightly fewer steps with .pipe() (as commonly illustrated), but .pipe() does not forward errors from the read stream, whereas pipeline() does.
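For completeness, here is a minimal sketch of the in-memory eval() approach mentioned above, again using got (it assumes the downloaded script is plain, non-module JavaScript):
const got = require('got');

got('http://yourURLhere').then(response => {
  // response.body holds the downloaded script as a string
  eval(response.body);
}).catch(err => {
  console.log(err);
});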

How to eval ES6 code from Node.js at runtime with Babel?

I'm trying to build a Node.js tool to help me analyze another AngularJS project's source code.
The idea is to :
read some of the angular project javascript files
for each file, grab the content
eval the content from the file
do some stuff
The problem I'm facing is that my Angular source code uses ES6 features like import, export, arrow functions, etc., and I am using Node.js, which does not support these features yet.
So I tried to use @babel/core's transform() from my Node app code, but it doesn't work. I keep getting errors like Unexpected identifier, which means it doesn't understand the import {stuff} from 'here'; syntax.
srcFiles.forEach(content => {
  try {
    (function() {
      eval(require("@babel/core").transform(content.text).code)
    }.call(window, angular));
  } catch (e) {
    console.log(e);
  }
});
A sample test file:
import _ from 'lodash';
console.log("I'm a file with import and export");
export const answer = 42;
Any idea how I can get this working? Or maybe another approach?
You can pass options as the second parameter of the transform method. By default Babel applies no transformations, so to handle import/export syntax you need a preset such as @babel/preset-env; see the @babel/core options documentation for details.
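A minimal sketch, assuming @babel/preset-env is installed (the transpiled output uses require() and exports, which are in scope when eval() runs inside a CommonJS module):
const babel = require('@babel/core');

srcFiles.forEach(content => {
  // transpile import/export and other ES6+ syntax down to CommonJS
  const { code } = babel.transformSync(content.text, {
    presets: ['@babel/preset-env']
  });
  eval(code);
});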

Node.js require() JSON from file garbage collection

I'm using a file for storing JSON data. My module makes CRUD actions on the file, and I'm using require() to load the JSON instead of fs.readFile(). The issue is, if the file is deleted using fs.unlink(), then calling require for the file again still loads it... even though it has just been deleted. I'm a bit lost on how to get around this; is it possibly related to garbage collection?
Example:
fs.writeFile('foo.json', JSON.stringify({ foo: "bar" }), function() {
  var j = require('./foo.json')
  fs.unlink('./foo.json', function() {
    console.log('File deleted')
    var j = require('./foo.json')
    console.log(j)
  })
})
When loading a module using require, Node.js caches the loaded module internally so that subsequent calls to require do not need to access the drive again. The same is true for .json files, when loaded using require.
That's why your file still is "loaded", although you deleted it.
The solution to this issue is to use the appropriate function for loading a file, which you already mentioned: fs.readFile(). Once you use that, everything will work as expected.
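For example, a minimal sketch of the fs.readFile() version (and if you ever do need require() again after the file changes, you can evict the cached entry with delete require.cache[require.resolve('./foo.json')]):
const fs = require('fs');

fs.readFile('./foo.json', 'utf8', function(err, raw) {
  if (err) throw err; // fails with ENOENT once the file has been deleted
  var j = JSON.parse(raw);
  console.log(j);
});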
