Sharing code between React Native and Node.js

I am using React Native and Node.js. I want to share code between the two. My folder structure is as follows:
myreactnativeapp/
mynodeserver/
myshared/
In both the React Native and Node apps I have included the shared module in
package.json
"dependencies": {
  "myshared": "git+https://myrepository/ugoshared.git"
}
This can then be included in each project via require/import etc. This all works fine and for production I'm happy with it. (Though I'd love to know a better way?)
The issue I'm facing is that in development it's really slow.
The steps for a change to propagate are:
Make changes in the shared module
Commit the changes to git
Update the npm module in each app
In development, I really want the same codebase to be used rather than this long update process. I tried the following:
Adding a symlink in node_modules/shared - doesn't work with the React Native packager
Using relative paths ../../../shared - doesn't work with the React Native packager
Any other ideas?
Update 1
I created a start.sh script which I run to copy the files into a local directory before the packager starts. It's not ideal, but at least I only have to restart the packager instead of messing with git etc.
#!/bin/bash
# myreactnativeapp/start.sh
SOURCE=../myshared
MODULE=myshared
rm -rf ./$MODULE
mkdir ./$MODULE
find $SOURCE -maxdepth 1 -name \*.js -exec cp -v {} "./$MODULE/" \;
# create the package.json
echo '{ "name": "'$MODULE'" }' > ./$MODULE/package.json
# start the packager
node node_modules/react-native/local-cli/cli.js start
Then in my package.json I update the scripts section to:
"scripts": {
  "start": "./start.sh"
},
So, the process is now:
Make a change
Start/restart the packager
Automatically:
The script copies all .js files under myshared/ to myreactnativeapp/myshared/
The script creates a package.json with the name of the module
Because I've added a package.json with the name of the module to the copied files, in my project I can include the items just as I would if the module were installed via the package manager above. In theory, when I switch to using the package in production, I won't have to change anything.
import MyModule from 'myshared/MyModule'
Update 2
My first idea got tiresome, restarting the packager all the time. Instead, I created a small Node script in the shared directory to watch for changes. Whenever there is a change, it copies the file to the React Native working directory.
const watch = require('node-watch')
const fs = require('fs')
const path = require('path')

const targetPath = '../reactnativeapp/myshared/'

watch('.', { recursive: false, filter: /\.js$/ }, function(evt, name) {
  console.log('File changed: ' + name)
  // don't copy this watcher script itself
  if (path.basename(__filename) === name) {
    return
  }
  console.log(`Copying file: ${name} --> ${targetPath + name}`)
  fs.copyFile(name, targetPath + name, err => {
    if (err) {
      console.log('Error:', err)
      return
    }
    console.log('Success')
  })
})
console.log(`Starting to watch: ${__dirname}. All files will be copied to: ${targetPath}`)
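To avoid launching the watcher by hand, it can be wired into the shared module's package.json (the watch.js file name is an assumption for wherever the script above lives):
#myshared/package.json
"scripts": {
  "watch": "node watch.js"
}
Then npm run watch in myshared/ keeps the React Native copy in sync while you work.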

Related

NPM CLI application relative path doesn't work

I'm creating a CLI application in NodeJS and the package is going to be published on NPM. The application is very simple, as it has only two files. Here is the structure of the application:
package.json
{
  "name": "mycliapp",
  "version": "1.0.0",
  "description": "Some description",
  "main": "./bin/cli.js",
  "preferGlobal": true,
  "bin": {
    "mycliapp": "bin/cli.js"
  },
}
bin/cli.js
const nodePlop = require('node-plop');
const configPath = './bin/config.js';
const plop = nodePlop(configPath, {
  force: argv.force || argv.f
});
bin/config.js
{
// some configuration
}
Now, if I create a symlink with npm install -g from this directory and run the mycliapp command from the same development directory, it works absolutely fine. But if I run mycliapp from any other directory on my computer, const configPath = './bin/config.js' is resolved against the current working directory rather than the installed npm package, so the config file is not found.
How can I solve this issue? I tried using __dirname and __filename with path.join but nothing seemed to work.
I also published this package on npm and installed it from there; the same issue occurs.
The JSON in your package.json is malformed -- you should remove the trailing comma after you set the bin parameter.
Also, if your configuration file is in the same directory as your script, you should reference it within your script as ./config.js.
Finally, you need to include a shebang (#!/usr/bin/env node) at the top of your cli.js file or any file you intend to use as your point of entry into the app so that your system knows what interpreter to use to execute the file.
See this post on the npm blog for more information.
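Putting those fixes together, a minimal sketch of bin/cli.js (the argument handling is illustrative, not from the original question):
#!/usr/bin/env node
const path = require('path');
const nodePlop = require('node-plop');

// Resolve the config relative to this file rather than the current working
// directory, so the CLI works no matter where it is invoked from.
const configPath = path.join(__dirname, 'config.js');

const plop = nodePlop(configPath, {
  force: process.argv.includes('--force') || process.argv.includes('-f')
});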

Electron package - how to write/read files

I have a file test.txt in the root directory of my app. When I run the app with npm start, I can write to the file without any problem, but when I build a package using electron packager, writing to the file is no longer possible - I get the error:
Error: EACCES: permission denied, open './test.txt'
For this I'm using the Node.js filesystem API:
fs.writeFile("./test.txt", text, function(err) {
  if (err) {
    return alert(err);
  }
  alert("saved");
});
How can I make this work? And is it possible to include an extra folder in my app after the packaging process? Thanks for your help!
There are a lot of options to choose from to package your electron app in 2019, so in case you come to this question like I did and are using electron-builder, please try my suggestion below.
If you are using electron-builder to package your application and need to read/write a file that is stored within your solution, you can add it to the files property in your package.json. The entries in this files property are the files that get copied when packaging your electron app - reference.
In my example, I was reading/writing to file.json.
let fs = require("fs");
fs.writeFile("./file.json", "data to file", "utf-8", (error) => {
  if (error) {
    console.error("error: " + error);
  }
});
My folder structure looked like this.
parent-folder
app/
assets/
configs/
images/
resources/
...
file.json
My app was not working after I packaged it until I added file.json to the "files" list in the "build" property of my package.json.
"build": {
"productName": "MyApp",
"appId": "org.dev.MyApp",
"files": [
"app/dist/",
"app/app.html",
"app/main.prod.js",
"app/main.prod.js.map",
"package.json",
"file.json", // << added this line
],
//...
}
I never really found out what the problem was, so I tried another solution, which works for me (my main aim was to save data to some local storage for the app).
I used the npm package electron-store, which is really easy to use.
You can install it from the terminal:
npm install electron-store
More info about it here: Electron store
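A minimal sketch of what the electron-store usage looks like (the key name is illustrative):
const Store = require('electron-store');
const store = new Store();

// electron-store persists to a JSON file in the app's userData directory,
// which stays writable after the app is packaged.
store.set('text', 'some data to save');
console.log(store.get('text'));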
Hope it helps someone else too :-)

Module not found error when trying to use a module as a local module

I am trying to understand how to make a local module. At the root of my node application I have a directory named lib. Inside the lib directory I have a .js file which looks like:
var Test = function() {
  return {
    say: function() {
      console.log('Good morning!');
    }
  };
}();
module.exports = Test;
I have modified my package.json with an entry of the path to the local module:
"dependencies": {
"chat-service": "^0.13.1",
"greet-module": "file:lib/Test"
}
Now, if I try to run a test script like:
var greet = require('greet-module');
console.log(greet.say());
it throws an error saying:
Error: Cannot find module 'greet-module'
What mistake am I making here?
modules.export is incorrect. It should be module.exports with an s.
Also, make sure after you add the dependency to do an npm install. This will copy the file over to your node_modules and make it available to the require function.
See here for a good reference.
Update:
After going through some examples to figure this out, I noticed most projects have the structure I laid out below. You should format your local modules as their own standalone packages, with their own folders and package.json files specifying their dependencies and name. Then you can include one with npm install -S lib/test.
It worked for me once I did it, and it'll be a good structure moving forward. Cheers.
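As a sketch, a minimal standalone package under lib/test could look like this (names are illustrative):
#lib/test/package.json
{
  "name": "greet-module",
  "version": "1.0.0",
  "private": true,
  "main": "index.js"
}
After npm install -S lib/test, the parent app's package.json gains a "greet-module": "file:lib/test" entry and require('greet-module') resolves from node_modules.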
See here for the code.

How can I bundle a precompiled binary with electron

I am trying to include a precompiled binary with an electron app. I began with the electron quick start app and modified my renderer.js file to include this code, which is triggered when a file is dropped on the body:
const spawn = require('child_process').spawn,
  ffmpeg = spawn('node_modules/.bin/ffmpeg', ['-i', clips[0], '-an', '-q:v', '1', '-vcodec', 'libx264', '-y', '-pix_fmt', 'yuv420p', '-vf', 'setsar=1,scale=trunc(iw/2)*2:trunc(ih/2)*2,crop=in_w:in_h-50:0:50', '/tmp/out21321.mp4']);
ffmpeg.stdout.on('data', data => {
  console.log(`stdout: ${data}`);
});
ffmpeg.stderr.on('data', data => {
  console.log(`stderr: ${data}`);
});
I have placed my precompiled ffmpeg binary in node_modules/.bin/. Everything works great in the dev panel, but when I use electron-packager to set up the app, it throws a spawn error ENOENT to the console when triggered. I did find a very similar question on SO, but the question doesn't seem to be definitively answered. The npm page on electron-packager does show that they can be bundled, but I cannot find any documentation on how to do so.
The problem is that electron-builder or electron-packager will bundle your dependency into the asar file. It seems that if the dependency has a binary in node_modules/.bin it is smart enough not to package it.
This is the documentation for asar packaging for electron-builder on that topic. It says
Node modules, that must be unpacked, will be detected automatically
I understand that it is related to existing binaries in node_modules/.bin.
If the module you are using is not automatically unpacked you can disable asar archiving completely or explicitly tell electron-builder to not pack certain files. You do so in your package.json file like this:
"build": {
"asarUnpack": [
"**/app/node_modules/some-module/*"
],
For your particular case
I ran into the same issue with ffmpeg and this is what I've done:
Use ffmpeg-static. This package bundles statically compiled ffmpeg binaries for Windows, Mac and Linux. It also provides a way to get the full path of the binary for the OS you are running: require('ffmpeg-static').path
This will work fine in development, but we still need to troubleshoot the distribution problem.
Tell electron-builder to not pack the ffmpeg-static module:
"build": {
"asarUnpack": [
"**/app/node_modules/ffmpeg-static/*"
],
Now we need to slightly change the code to get the right path to ffmpeg: require('ffmpeg-static').path.replace('app.asar', 'app.asar.unpacked') (in development the replace() won't change anything, which is fine).
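As a sketch, the resulting spawn call could look like this (the input and output file names are illustrative):
const { spawn } = require('child_process');

// electron-builder places unpacked modules under app.asar.unpacked;
// in development the replace() is a no-op.
const ffmpegPath = require('ffmpeg-static').path
  .replace('app.asar', 'app.asar.unpacked');

const ffmpeg = spawn(ffmpegPath, ['-i', 'input.mp4', 'output.mp4']);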
If you are using webpack (or other javascript bundler)
I ran into the issue that require('ffmpeg-static').path was returning a relative path in the renderer process. The issue seemed to be that webpack changes the way the module is required, which prevents ffmpeg-static from providing a full path. In the Dev Tools, require('ffmpeg-static').path worked fine when run manually, but in the bundled code I always got a relative path. So this is what I did.
In the main process add this before opening the BrowserWindow: global.ffmpegpath = require('ffmpeg-static').path.replace('app.asar', 'app.asar.unpacked'). The code that runs in the main process is not bundled by webpack so I always get a full path with this code.
In the renderer process pick the value this way: require('electron').remote.getGlobal('ffmpegpath')
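Put together, the two sides of that workaround look like this (a sketch of the steps above):
// main process, before opening the BrowserWindow
global.ffmpegpath = require('ffmpeg-static').path
  .replace('app.asar', 'app.asar.unpacked');

// renderer process
const ffmpegPath = require('electron').remote.getGlobal('ffmpegpath');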
I know I'm a bit late, but I just wanted to mention the ffbinaries npm package I created a while ago exactly for this purpose.
It'll allow you to download ffmpeg/ffplay/ffserver/ffprobe binaries to a specified location, either during application boot (so you don't need to bundle them with your application) or in a CI setup. It can autodetect the platform; you can also specify it manually.
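A minimal sketch of downloading the binaries at boot (the destination directory is illustrative):
const ffbinaries = require('ffbinaries');

// Downloads the binaries for the current platform unless they already exist.
ffbinaries.downloadBinaries(['ffmpeg', 'ffprobe'], { destination: './bin' }, function (err) {
  if (err) return console.error(err);
  console.log('ffmpeg and ffprobe are ready in ./bin');
});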
If anyone happens to need an answer to this question: I do have a solution to this, but I have no idea if this is considered best practice. I couldn't find any good documentation for including 3rd party precompiled binaries, so I just fiddled with it until it finally worked. Here's what I did (starting with the electron quick start, node.js v6):
From the app directory I ran the following commands to include the ffmpeg binary as a module:
mkdir node_modules/ffmpeg
cp /usr/local/bin/ffmpeg node_modules/ffmpeg/
ln -s ../ffmpeg/ffmpeg node_modules/.bin/ffmpeg
(replace /usr/local/bin/ffmpeg with your current binary path; download it from here). Placing the link allowed electron-packager to include the binary I saved to node_modules/ffmpeg/.
Then to get the bundled app path I installed the npm package app-root-dir by running the following command:
npm i -S app-root-dir
Since I could then get the app path, I just appended the subfolder for my binary and spawned from there. This is the code that I placed in renderer.js:
var appRootDir = require('app-root-dir').get();
var ffmpegpath = appRootDir + '/node_modules/ffmpeg/ffmpeg';
console.log(ffmpegpath);
const spawn = require('child_process').spawn,
  ffmpeg = spawn(ffmpegpath, ['-i', clips_input[0]]); // add whatever switches you need here
ffmpeg.stdout.on('data', data => {
  console.log(`stdout: ${data}`);
});
ffmpeg.stderr.on('data', data => {
  console.log(`stderr: ${data}`);
});
This is how I would do it:
Taking cues from tsuriga's answer, here is my code:
Note: replace or add OS path accordingly.
Create a directory ./resources/mac/bin
Place your binaries inside this folder
Create file ./app/binaries.js and paste the following code:
'use strict';

import path from 'path';
import { remote } from 'electron';
import getPlatform from './get-platform';

const IS_PROD = process.env.NODE_ENV === 'production';
const root = process.cwd();
const { isPackaged, getAppPath } = remote.app;

const binariesPath =
  IS_PROD && isPackaged
    ? path.join(path.dirname(getAppPath()), '..', './Resources', './bin')
    : path.join(root, './resources', getPlatform(), './bin');

export const execPath = path.resolve(path.join(binariesPath, './exec-file-name'));
Create file ./app/get-platform.js and paste the following code:
'use strict';

import { platform } from 'os';

export default () => {
  switch (platform()) {
    case 'aix':
    case 'freebsd':
    case 'linux':
    case 'openbsd':
    case 'android':
      return 'linux';
    case 'darwin':
    case 'sunos':
      return 'mac';
    case 'win32':
      return 'win';
  }
};
Add the following code inside the ./package.json file:
"build": {
....
"extraFiles": [
{
"from": "resources/mac/bin",
"to": "Resources/bin",
"filter": [
"**/*"
]
}
],
....
},
Import the binary file path as:
import { execPath } from './binaries';
// your program code:
var command = spawn(execPath, arg, {});
Why is this better?
Most of the other answers require an additional package called app-root-dir.
The original answer doesn't handle the (env=production) build or the pre-packaged versions properly; it only takes care of development and post-packaged versions.

How to share code between node.js apps?

I have several apps in node that all share a few modules that I've written. These modules are not available via npm. I would like to be able to share freely between apps, but I don't want to copy directories around, nor rely on Git to do so. And I'm not really big on using symlinks to do this either.
I would like to arrange directories something like this:
app1
server.js
node_modules
(public modules from npm needed for app1)
lib
(my own modules specific to app1)
app2
server.js
node_modules
(public modules from npm needed for app2)
lib
(my own modules specific to app2)
shared_lib
(my own modules that are used in both app1 and app2)
The problem I'm seeing is that the modules in shared_lib seem to get confused as to where to find the modules that will be in the node_modules directory of whichever app they are running in. At least I think that is the problem.
So... what is a good way to do this that avoids having duplicate files? (Note that I don't care about duplicates of things in node_modules, since those aren't my code, I don't check them into Git, etc.)
The npm documentation recommends using npm link to create your own Node.js packages locally and then make them available to other Node.js applications. It's a simple four-step process.
A typical procedure would be to first create a package with the following structure:
hello
| index.js
| package.json
A typical implementation of these files would be:
index.js
exports.world = function() {
  return 'Hello World';
}
package.json
{
  "name": "hello",
  "version": "0.0.1",
  "private": true,
  "main": "index.js",
  "dependencies": {
  },
  "engines": {
    "node": "v0.6.x"
  }
}
"private:true" ensures that npm will refuse to publish the package. This is a way to prevent accidental publication of private packages.
Next, navigate to the root of your Node.js package folder and run npm link to link the package globally so it can be used in other applications.
To use this package in another application, e.g., "hello-world", with the following directory structure:
hello-world
| app.js
Navigate to the hello-world folder and run:
npm link hello
Now you can use it like any other npm package like so:
app.js
var http = require('http');
var hello = require('hello');

var server = http.createServer(function(req, res) {
  res.writeHead(200);
  res.end(hello.world());
});

server.listen(8080);
I've got this working by having node_modules folders at different levels - node then automatically traverses upwards until it finds the module.
Note you don't have to publish to npm to have a module inside of node_modules - just use:
"private": true
Inside each of your private package.json files - for your project I would have the following:
app1
server.js
node_modules
(public modules from npm needed for app1)
(private modules locally needed for app1)
app2
server.js
node_modules
(public modules from npm needed for app2)
(private modules locally needed for app2)
node_modules
(public modules from npm needed for app1 & app2)
(private modules locally for app1 & app2)
The point is node.js has a mechanism for dealing with this already and it's awesome. Just combine it with the 'private not on NPM' trick and you are good to go.
In short a:
require('somemodule')
From app A or B would cascade upwards until it found the module - regardless of whether it lived lower down or higher up. Indeed, this lets you hot-swap the location without changing any of the require(...) statements.
node.js module documentation
Just use the correct path in your require call
For example in server.js that would be:
var moduleName = require('../shared_lib/moduleName/module.js');
It's important to know that when your path is prefixed with './' or '../', it is resolved relative to the calling file (a '/' prefix is an absolute path) instead of being looked up in node_modules.
For further information about nodes module loading visit:
http://nodejs.org/docs/latest/api/modules.html
Yes, you can reference shared_lib from app1, but then you run into a problem if you want to package and deploy app1 to some other environment, such as a web server on AWS.
In this case, you're better off installing your modules in shared_lib to app1 and app2 using "npm install shared_lib/module". It will also install all the dependencies of the shared_lib modules in app1 and app2 and deal with conflicts/duplicates.
See this:
How to install a private NPM module without my own registry?
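With recent npm versions, installing a folder like that records a file: dependency in the consuming app's package.json, along these lines (the module name is illustrative):
"dependencies": {
  "module": "file:../shared_lib/module"
}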
If you check out the node.js docs, you'll see that Node.js understands the package.json file format as well, at least cursorily.
Basically, if you have a directory named foo, and in that directory is a package.json file with the key-value pair: "main": "myCode.js", then if you try to require("foo") and it finds this directory with a package.json file inside, it will then use foo/myCode.js for the foo module.
So, with your directory structure, if each shared lib has its own directory with such a simple package.json file inside, then your apps can get the shared libs by:
var lib1 = require('../shared_lib/lib1');
var lib2 = require('../shared_lib/lib2');
And that should work for both of these apps.
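For instance, a minimal package.json for one of the shared libs could look like this (the file name follows the example above):
#shared_lib/lib1/package.json
{
  "name": "lib1",
  "private": true,
  "main": "myCode.js"
}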
Another solution is to clone files from the other places into this repo:
clone.js:
const path = require('path')
const fs = require('fs')

const shared = [
  {
    type: 'file',
    source: '../app1',
    files: [
      'src/file1',
      'src/file2',
      '...'
    ],
  },
]

function cloneFiles(source, files) {
  const Reset = '\x1b[0m'
  const FgGreen = '\x1b[32m'
  console.log(`---------- Cloning ${files.length} files from ${source} ----------`)
  for (const file of files) {
    const sourceFile = path.join(__dirname, '..', source, file)
    const targetFile = path.join(__dirname, '..', file)
    process.stdout.write(`📁 ${file} ... `)
    fs.copyFileSync(sourceFile, targetFile)
    console.log(`${FgGreen}Done!${Reset}`)
  }
  console.log(`---------- All done successfully ----------\n`)
}

;(() => {
  for (const item of shared) {
    switch (item.type) {
      case 'file':
        cloneFiles(item.source, item.files)
        break
    }
  }
})()
Then, in the package.json you can add this script and call it when you want to clone / sync files:
"clone": "node clone.js"
