I wrote a simple npm module to precompile my Handlebars templates when using django compressor to do post-processing for some client-side components, and found that I need to ship the npm module with a few js files.
Currently I just assume no one is installing this with the global flag, because I've "hard coded" the path to these dependencies in the npm module itself.
example layout of my npm module
/
* /bin
* /lib/main.js
* /vendor/ember.js
Now, inside main.js, I want to use the ember.js file. Currently my hard-coded approach looks like this:
var emberjs = fs.readFileSync('node_modules/django-ember-precompile/vendor/ember.js', 'utf8');
Again, this only works because I assume the module is installed locally, but I'd like to think Node.js has a more legitimate way to get at locally embedded files.
Anyone know how I can improve this to be more "global" friendly?
What you can do is get the directory of the current file and make your file paths relative to that.
var path = require('path')
, fs = require('fs');
// Resolve the vendor directory relative to this file, not the process's working directory
var vendor = path.join(path.dirname(fs.realpathSync(__filename)), '../vendor');
var emberjs = fs.readFileSync(path.join(vendor, 'ember.js'), 'utf8');
Hope that helps!
One of the great strengths of Node.js is how quickly you can get up and running. The downside to this approach is that you are forced to fit the design patterns it was built around.
This is an example where your approach differs too much from Node's approach.
Node expects everything in a module to be exposed from the module's exports, including templates.
Move the readFileSync into the django-ember-precompile module, then expose the returned value via a module export in lib/main.js.
Example:
package.json
{
    "name": "django-ember-precompile",
    "main": "lib/main.js"
}
lib/main.js
var fs = require('fs'), path = require('path');
module.exports.ember = fs.readFileSync(path.join(__dirname, '../vendor/ember.js'), 'utf8');
vendor/ember.js
You obtain your template via
var template = require('django-ember-precompile').ember
This example can be refined, but the core idea is the same.
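For example, one refinement would be to read the file lazily and cache it, so the disk read happens only on first access (a sketch of one option, not part of the original answer):
lib/main.js
var fs = require('fs');
var path = require('path');

var cached = null;

// hypothetical lazy variant: defer the disk read until first use
module.exports.ember = function () {
    if (cached === null) {
        cached = fs.readFileSync(path.join(__dirname, '../vendor/ember.js'), 'utf8');
    }
    return cached;
};
With this variant the call site becomes require('django-ember-precompile').ember().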
Related
I've decided to try out ReactJS. Along with that, I've decided to use Gulp for compiling .jsx to .js, also for the first time.
I can compile it without a problem for client-side use with browserify. Here's my gulp task:
browserify("./scripts/main.jsx")
.transform(
babelify.configure({
presets: ["react"]
}))
.bundle()
.pipe(source('bundle.js'))
.pipe(gulp.dest('./scripts/'));
But since I use PHP to generate the data, I need to get that data to Node. If I use browserify, it will prevent me from using process.argv in Node. I could save the data to a file and read that file in Node, so I wouldn't need to pass the whole state to Node, but I still need to pass the identifying arguments so that Node knows which file to load.
What should I use instead of browserify?
If you need to compile a React module to es5 for use on the server, use Babel itself.
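For instance, one common approach (a sketch, assuming Babel 6 with babel-register and the react preset installed) is to hook Babel into require() so .jsx files are compiled on the fly:
// server entry point (hypothetical filename)
require('babel-register')({
    presets: ['react']
});

// every module required after this point is transpiled automatically,
// including .jsx files
var Server = require('./scripts/server.jsx');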
The built-in fs module may help with reading and writing files: https://nodejs.org/api/fs.html
Have you considered posting to and reading from a database instead?
Here's how I solved it:
I have learnt that you can create standalone bundles with browserify, so I've compiled all the server code I need (components + rendering) as a standalone bundle. Then I created a small Node script which is responsible only for reading arguments, loading data and passing it to the rendering code.
I'm not sure if this is the proper way to do it, but it works.
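For reference, here's a sketch of the bundle task (the entry path ./scripts/server.jsx is an assumption; browserify's standalone option wraps the bundle as UMD so plain Node can require() it):
browserify("./scripts/server.jsx", { standalone: "Server" })
    .transform(
        babelify.configure({
            presets: ["react"]
        }))
    .bundle()
    .pipe(source('server.js'))
    .pipe(gulp.dest('./scripts/'));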
Here's code for the "setup" script:
var fs = require('fs');
var Server = require('./server.js');
if (process.argv[2]) {
    // sanitize the argument before using it in a file path
    // (note the g flag so every disallowed character is stripped)
    var region = process.argv[2].toLowerCase().replace(/[^a-z0-9]/g, '');
    if (region != '') {
        var data = JSON.parse(fs.readFileSync(__dirname + '/../tmp/' + region + '.json', 'utf8'));
        console.log(Server.render(data.deal, data.region));
    }
}
This way I only need to deploy two files, and I can still easily compile jsx to js.
I'm building an isomorphic React/React-Router/Redux/Webpack application and I'm attempting to implement server side rendering.
My directory looks like:
/client
/actions
/components
/containers
/server
/server.js
In my webpack config, I have aliases set up for all the folders inside client:
var path = require('path');

var path_base = path.resolve(__dirname, '..');
const resolve = path.resolve;

const base = function() {
    var args = [path_base];
    args.push.apply(args, arguments);
    return resolve.apply(resolve, args);
};

const resolve_alias = base.bind(null, 'src/client');

const aliases = [
    'actions',
    'components',
    'constants',
    'containers',
    'middleware',
    'reducers',
    'routes',
    'store',
    'styles',
    'utils',
    'validation'
];
so that inside the code that gets bundled by webpack, I can do:
import { Widget } from 'components';
and that import gets resolved by webpack.
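(The aliases array is presumably folded into webpack's resolve.alias along these lines; the exact wiring is omitted above:)
// assumed wiring, not shown in the question: map each name to its absolute path
module.exports = {
    resolve: {
        alias: aliases.reduce(function (map, name) {
            map[name] = resolve_alias(name);
            return map;
        }, {})
    }
};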
Now in my server code, in order to do the rendering, I have to import some of my client files, like routes/index.js. The problem I'm running into is that when I import my routes file, it uses a webpack alias to another file, say components or containers, so naturally the Node.js require system can't resolve it.
How do I fix something like that? I looked at this question and it talks about essentially setting up the same aliases that exist in webpack with mock-require. But then the issue becomes that my routes file imports all my components, which then all import things like stylesheets, images, etc. Should I then be using something like webpack-isomorphic-tools?
The guides I've been looking at (this for example) are all great at showing how server-side rendering is accomplished, but none of them really talk about how to resolve all the requires and whatnot.
After battling with this issue for 2 days I settled on babel-plugin-webpack-alias.
What you need to do to resolve paths with that is:
$ npm install --save-dev babel-plugin-webpack-alias
Add the plugin to your .babelrc
Add the aliases to your webpack.config (make sure you use path.join(); a sketch of both files follows these steps)
Refer to this post if you have problems loading styles
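A minimal sketch of the two pieces (filenames and alias targets are assumptions):
.babelrc
{
    "plugins": [
        ["babel-plugin-webpack-alias", { "config": "./webpack.config.js" }]
    ]
}
webpack.config.js
// build alias targets with path.join so they resolve on any platform
var path = require('path');

module.exports = {
    resolve: {
        alias: {
            components: path.join(__dirname, 'src/client/components'),
            containers: path.join(__dirname, 'src/client/containers')
        }
    }
};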
The other option I tried was universal-webpack but I found it to be a bit verbose. If you want to see roughly how the whole server-side loading works, you can check out this video.
If you really want aliases, run your server-side code through Babel and use this plugin: https://www.npmjs.com/package/babel-plugin-module-alias which will let you do the same thing as webpack.
Edit: This one works a lot better: https://github.com/jagrem/babel-resolve-relative-module (it allows multiple paths).
Try using NODE_PATH. Node will always look for a module in this path during require calls. It lets you shortcut your relative paths as you want.
// turn this
import {Widget} from '../../components';
// into this
import {Widget} from 'components';
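For example, assuming the aliased folders live under src/client as in the question, you could start the server with:
$ NODE_PATH=./src/client node server/server.js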
See the Node.js docs for more information.
P.S. This mechanism is very sensitive, so use it carefully: your code now depends tightly on the environment and may break when it changes.
If you use webpack-isomorphic-tools, it'll take your webpack config into account on the server side, which will make all your aliases work.
https://www.npmjs.com/package/webpack-isomorphic-tools
I need to "read" an array of some paperjs paths in nodejs and get their dimension. I wanted to use paper npm module but saw that it has a dependency to Cairo.
As I'm deploying to heroku it is a little difficult to use Cairo. I know its possible but I want to know if its really necessary just for "reading" the dimensions of a path group.
A present-day answer: yes, it's possible.
———
To quote from the docs:
Paper.js comes in three different versions on NPM: paper, paper-jsdom and paper-jsdom-canvas. Depending on your use case, you need to require a different one:
paper is the main library, and can be used directly in a browser context, e.g. a web browser or worker.
paper-jsdom is a shim module for Node.js, offering headless use with SVG importing and exporting through jsdom.
paper-jsdom-canvas is a shim module for Node.js, offering canvas rendering through Node-Canvas as well as SVG importing and exporting through jsdom.
→ If you don't require rendering to canvas, paper-jsdom (which doesn't need Cairo) should do the trick
———
Important note:
If I understood correctly, this means you can't use PaperScript but have to use JavaScript directly. This will affect how you have to write code:
new paper.Path() instead of new Path()
Create project manually using paper.setup()
No mathematical operators (+ - * / %); see the example after this list
... (More info in the docs)
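For instance, point arithmetic that PaperScript writes with operators becomes method calls in plain JavaScript (a small illustrative sketch):
// PaperScript: var end = start + new Point(100, 100);
// plain JavaScript equivalent:
var end = start.add(new paper.Point(100, 100));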
———
A simple example to create a line, log its bounding box size (and also export the SVG):
Note: I used this with the current Node LTS (v14.16.1)
package.json
{
    "name": "paper-jsdom-example",
    "main": "index.js",
    "dependencies": {
        "paper-jsdom": "^0.12.15"
    }
}
index.js
const paper = require('paper');
const fs = require('fs');

// set up an empty 300x300 project (no canvas rendering involved)
var size = new paper.Size(300, 300);
paper.setup(size);

// draw a simple diagonal line from (100, 100) to (200, 200)
var path = new paper.Path();
path.strokeColor = '#348BF0';
var start = new paper.Point(100, 100);
var end = new paper.Point(200, 200);
path.moveTo(start);
path.lineTo(end);

// the bounding box is available without rendering anything
console.log('width', path.bounds.width, 'height', path.bounds.height);

// export the drawing as an SVG string and write it to disk
var svg = paper.project.exportSVG({ asString: true });
fs.writeFileSync('punchline.svg', svg);
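Running node index.js should log width 100 height 100 (the line spans 100 units in each direction) and write punchline.svg next to the script.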
I'm trying to add the gm module to my cloud code. Since Parse is not a Node.js environment, I made small changes to the other modules I used, but the gm module depends on many Node core modules. Do I have to push all of the sub-modules to Parse? Also, how can I add the core modules? Changing require('xxx') to require('xxx/index.js') or require('xxx/xxx.js') failed.
These are the modules I could find that gm depends on. I included all the files in each module but only changed the following ones:
- gm/index.js
- events/events.js
- util/util.js
- stream/index.js
- emitter/index.js (also depends on util)
- asynclist/index.js
- eventproxy/index.js
Changing all of these gives this error:
Result: Error: Module ./lib/eventproxy not found
at libs/eventproxy/index.js:1:18
at libs/asynclist/index.js:1:18
at libs/emitter/index.js:1:17
at libs/stream/index.js:22:15
at libs/gm/index.js:6:14
at main.js:118:14
// This error is caused by ./lib/eventproxy.
// It must be cloud/libs/asynclist/node_modules/eventproxy/lib/eventproxy.js
// Parse doesn't recognize './' as this folder in cloud code
where gm/index.js's require part is
var Stream = require('cloud/libs/stream/index.js').Stream;
var EventEmitter = require('cloud/libs/events/events.js').EventEmitter;
var util = require('cloud/libs/util/util.js');
My cloud folder structure is
cloud/libs/gm
cloud/libs/events
cloud/libs/util
cloud/libs/stream
cloud/libs/emitter
cloud/libs/asynclist
cloud/libs/eventproxy
EDIT: I found more dependencies. According to this, gm has 36 dependent libraries; I clearly need a simpler solution.
EDIT: For the relative path problem with Parse, this is a solution, but as I said, I need a simpler way to address the whole problem.
In my project folder I have a file utils.js like this:
global.root_path = process.cwd();

global.custom_require = function() {
    console.log('custom require called');
}
I would like to include this file in every execution of Node so that in every js file in the project I can call custom_require, or access global.root_path, as I'm already doing with the built-in function require.
Do you know if there is a solution (command line options, environment variable or whatever) to achieve this?
UPDATE
I don't want to include utils.js in every file; in that case I wouldn't need to modify the global object.
Thanks!
Simply require it. Example app.js:
require('./utils.js');

// custom_require is available here because utils.js put it on the global
// object (note: the stub above only logs, so a real version would need to
// wrap and return require(...))
var express = custom_require('express');
var app = express();

var port = process.env.PORT || 3000;
app.listen(port);
Then, you could just do (in terminal)
node app
As long as you require the file containing the global declarations first, you can use those globals from any file that's required after the fact.
You most probably don't want a custom "plugin" altering the behavior of all your applications when run locally, so it'd be best to stay away from that kind of pattern. Instead, you might want to create a module, use npm link on it, and then npm link something in your project's directory, where something is the module name.
Then each project could just add one line like require('something'); at the very beginning. Changing something would immediately impact all the projects that include it, thanks to the behavior of npm link.
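A sketch of that workflow (folder and module names are placeholders):
$ cd ~/dev/something        # the shared module
$ npm link                  # register it globally via a symlink
$ cd ~/dev/my-project
$ npm link something        # symlink it into this project's node_modules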