electron/muon: require not defined in renderer - node.js

I'm currently working on an IPFS/Ethereum dapp in Muon.
Because I need MetaMask, I started with this boilerplate: https://github.com/SwapyNetwork/electron-metamask-boilerplate
Everything is working fine so far.
However, I cannot use require('anything') in the renderer process or in HTML script tags (see below).
There seems to be a problem with the boilerplate code, but I can't find it.
Or is node code in the renderer not supported in Muon?
My only changes while testing were setting node-integration explicitly to true and
inserting require('fs') in index.js (I installed fs, of course).
I have tried many different suggestions from Stack Overflow and other sites but couldn't find a solution yet.
Error Message
Thank you

As per Muon's GitHub repo:
Some of Muon's goals include:
Use the Chromium source directly (eliminating electron's copy of chrome_src) with minor patches
make integrating chrome components less painful
faster and more streamlined end-to-end build process (see browser-laptop-bootstrap).
add support for Chrome extensions
add security focused features for the renderer:
remove node completely (from the renderer process)
full sandbox
scriptable window.opener support
As you can see there, Muon does not support Node code in the renderer. That is by design, for security purposes. Muon may be great for certain applications, but I recommend switching to Electron if you really need to use require in the renderer.
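For completeness, if you do switch to Electron, a minimal sketch of a main process that opts the renderer back into Node integration might look like this. The exact defaults for nodeIntegration and contextIsolation have changed across Electron versions, so treat the flags below as an assumption to check against your version:

// main.js - Electron main process (sketch; this is Electron, not Muon)
const { app, BrowserWindow } = require('electron')
const path = require('path')

function createWindow () {
  const win = new BrowserWindow({
    width: 800,
    height: 600,
    webPreferences: {
      // Lets the renderer (and <script> tags in index.html) use require(...)
      nodeIntegration: true,
      // Newer Electron versions isolate the renderer by default; disabling it
      // here is only for illustration and weakens the security model.
      contextIsolation: false
    }
  })
  win.loadURL(`file://${path.join(__dirname, 'index.html')}`)
}

app.on('ready', createWindow)

With that in place, require('fs') works in index.html's scripts; in Muon it will not, for the reasons quoted above.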

Related

Bazel nodejs liveserver

I've been going through the documentation at https://bazelbuild.github.io/rules_nodejs/ in order to put together a small web-based application. I've got Babel building the JS code, http_server serving it, and ibazel watching it, and everything is working as expected: when I make a change, ibazel notices it and restarts the http_server rule.
The next thing I wanted to look at is getting auto-reload in the browser, so that the browser would automatically refresh when the change was compiled. My understanding is that this requires the HTTP server not to be killed by ibazel, but instead to stay up and trigger a refresh via the ibazel_live_reload mechanism. I believe that http_server doesn't support this, but ts_devserver is explicitly mentioned in several places. However, ts_devserver doesn't seem to be maintained anymore (although I did find a devserver executable in the npm package, I didn't see a Bazel rule to use it).
Is there a third party live development server that supports the ibazel reload mechanism - or am I missing something completely obvious?
Disclosure: I'm a core maintainer of rules_nodejs.
As of rules_nodejs v3.0.0, ts_devserver has been renamed to concatjs_devserver to better namespace it (it has little to nothing to do with TypeScript). Its docs can be found here.
Note, though, that concatjs_devserver comes with some compatibility gotchas: for example, all dependencies have to be in named AMD/UMD or goog.module format, and it may be tricky to use unless you follow the rest of the google3 toolchain.
We (as the maintainers of rules_nodejs) have so far chosen not to wrap an existing devserver and publish it, for various reasons, but it's something that has come up in discussion. I'm currently investigating some options in this space.
I'm not aware of any published devservers that currently support the ibazel protocol; there is a wrapper around Browsersync in the Angular Components repo which you may find useful.

Package node.js app as cross-platform executable, not for desktop app

There are a lot of questions on this topic, but they don't seem to distinguish between executables for desktop apps and for server-side apps. I suppose my first question would be: what's the difference? For example, Zeit/pkg says it is a "node.js binary compiler", whereas nwjs (formerly node-webkit) says it is "an app runtime based on Chromium and node.js".
I tried zeit/pkg and it works great, but I have read that there can be performance issues unless it's configured properly. I wanted to make sure I was choosing the right tool and came across nwjs. It seems to do a lot of the same stuff pkg does, but it has a larger following, as well as more docs and a robust API. Can I use nwjs as a server-side executable (i.e., not using the desktop features) the same way I would use pkg?
This answer states that nwjs "is an option, but it really isn't set-up to do a server - client type relationship", but then a comment says "you can launch a server from node-webkit just in the way you launch it in Node.js. It's just that node-webkit provide another way beyond B/S architecture".
So, is nwjs effectively the same as pkg, or fundamentally different?
I realize that there's also Electron, which states "build cross platform desktop apps" and appears similar to nwjs. I'm not trying to get into an Electron vs. nwjs debate, but rather desktop vs. server, if there's a difference.
You've got most of it already; only a few clarifications are needed. The reason nw.js / Electron declare themselves as frameworks for desktop applications is that their core architectural design integrates node.js with Chromium so you can build applications that have a UI. You can still use part of those frameworks (the node.js side) without opening a visible UI, in which case the behavior will be similar to plain node.js. There is still a caveat: since they are tightly integrated with Chromium at the core, in some cases you need a screen for Chromium to initialize correctly (or you create a virtual framebuffer, as many CI setups do).
Also, if your concern is performance, I doubt using a UI framework for server-side work will achieve what you desire: while the difference won't be huge, the integration between node and Chromium obviously has overhead compared to bare node.js.
Getting back to the original question, I feel the question itself is somewhat vague. If the intention is truly a server-side application, you probably won't need to package it at all; instead, deploy node and its dependency modules correctly, or package it in some installable manner rather than creating a single binary as pkg does.
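To make the desktop-vs-server distinction concrete, here is a minimal sketch of the kind of plain Node server that a tool like zeit/pkg is meant to bundle; nothing in it needs Chromium or a window. The pkg command in the comment is only illustrative, so check the pkg docs for the exact invocation and targets you need:

// server.js - plain Node.js, no Chromium, no UI
const http = require('http')

const server = http.createServer((req, res) => {
  res.writeHead(200, { 'Content-Type': 'text/plain' })
  res.end('hello from a packaged server\n')
})

server.listen(process.env.PORT || 3000, () => {
  console.log('listening on port', server.address().port)
})

// Bundled into a single executable with something like (illustrative):
//   npx pkg server.js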

What are the trade-offs of writing conventional node modules in ES6 with a babel workflow?

When developing front-end code for the browser, I often use the es2017 preset when transpiling down to a distribution bundle, which gives me all the conveniences of the included transformers. For conventional modules, I usually stick to whatever the required node engine I've specified for that particular module supports.
I would like to start developing these "conventional" modules using babel transformers as well, but I can foresee drawbacks to this, including:
It might inhibit the debugging workflow (more specifically when working with an IDE)
The performance of the module might suffer
What's the current state on this matter - would you say it makes sense to use babel in conventional modules given the aforementioned and other trade-offs? What are the pros/cons for your preferred workflow?
Bonus question: What are some reputable modules and/or module authors out there that are already using this technique? I've seen Facebook do it for their react ecosystem but I guess that makes sense since those are mostly modules for the browser.
It is converted back to vanilla JS (babel does that part).
What you get is that you can use ES6 classes, which I found useful.
Hopefully, with time, browsers will support ES6 and we will not need Babel.
The only drawback is that when debugging you have to produce a source map, but that is temporary (see above).
To answer your second question: I'm using React in one of the websites, and most of the modules I needed (from npm) are using ES6.
I believe that neither of the trade-offs or drawbacks you mention applies to developing nodejs code using Babel as an ES7 transpiler. Personally, I find using ES7 features with node tremendously productive.
There is source map support for debugging. I use Karma for testing and it comes with excellent source map support (I use IntelliJ, but I believe most IDEs will do fine). You can check out this REST API repository on GitHub; it's a nice stack for building a nodejs data backend. It uses Karma for testing and even comes with code coverage support. It also integrates with pm2 for scaling and availability.
Regarding performance: I think transpiled code has been shown to run faster in many scenarios than the code a developer would write when not having advanced language features available. I will post some links later.
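As a rough illustration of that workflow, a Babel configuration that only transpiles what your Node version is missing and emits source maps could look like the sketch below. Note that babel.config.js and @babel/preset-env are the Babel 7 names; older setups used .babelrc and babel-preset-es2017, so adjust to your Babel version:

// babel.config.js - sketch for transpiling a "conventional" Node module (Babel 7 naming assumed)
module.exports = {
  presets: [
    ['@babel/preset-env', {
      // Only transpile syntax the currently running Node version doesn't support,
      // which keeps the runtime cost of transpilation to a minimum.
      targets: { node: 'current' }
    }]
  ],
  // Emit source maps so stack traces and IDE breakpoints map back to the original source.
  sourceMaps: true
}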

Node js plugin permissions

I am creating a web server with Sails.js and want to allow third-party developers to create node.js plugins installable from a web page (store).
My problem is that I don't want these plugins to be able to require sails (or other critical modules), get access to the database and services, and do whatever they want.
For example, using fs to delete all files.
How can I do that? I have no idea whether node.js can lock a script to its own directory.
I don't think that node exposes any sandboxing functionality, so when you load JS code into node, that code can do whatever it wants.
From your description, your plugins are more like browser JavaScript code, so I think you could use a headless browser to execute the code and retrieve the result. I've never tried it myself, but it should work. You just have to figure out how to pass parameters to the plugin and get the result back; also, performance will be quite bad because a headless browser is heavy. Try looking at
http://phantomjs.org/
Another solution is to run the plugins directly inside node but sanitize the code before running it. There are some projects like:
http://gf3.github.io/sandbox/
https://github.com/asvd/jailed
They can help you limit the powers of the plugins.
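To illustrate the general idea behind those projects, here is a rough sketch using Node's built-in vm module. Note that vm is explicitly not a security boundary, so treat this only as a way to restrict what a plugin sees by default, not as protection against hostile code:

// plugin-runner.js - sketch only; vm is NOT a real security sandbox
const vm = require('vm')

function runPlugin (pluginSource, exposedApi) {
  // The plugin only sees what we put in the sandbox:
  // no require, no fs, no process, no sails.
  const sandbox = { api: exposedApi, console }
  vm.createContext(sandbox)
  vm.runInContext(pluginSource, sandbox, { timeout: 1000 })
  return sandbox
}

// Hypothetical usage: expose a tiny, explicit API surface to the plugin code.
runPlugin('api.log("hello from a plugin")', {
  log: (msg) => console.log('[plugin]', msg)
})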
Anyway, are you sure about this? In every major CMS platform I've seen (WordPress, Joomla, Drupal, Liferay, ...), the platform's authors trust plugin authors, and plugins can always do what they want.

JS library that provides simple utilities for browsers and the nodejs environment?

I'm looking for a JavaScript library that attempts to provide the same simple utilities in both the browser environment AND nodejs (iteration, mapping, maybe control flow) so that code can more easily be re-used across server and client. I know you can hack out parts of any JS library (YUI, jQuery, ...) and get them to work in both environments; I'm just wondering if it's already been done or standardized.
The closest I've seen is this: https://github.com/kof/sharedjs
But it's incomplete and has some odd stuff. I'm wondering if there is something more polished before I fork and hack.
The Underscore library was built to add more functional-programming helpers to complement jQuery: things like mapping, and also templating.
Because it doesn't rely on the DOM (it leaves that to jQuery), it works well in node.
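To show what that reuse can look like, the snippet below loads Underscore in an environment-agnostic way. The typeof-require check is one common (if crude) pattern, and it assumes Underscore comes from npm on the server and from a <script> tag (the global _) in the browser:

// shared.js - the same utility code for both Node and the browser
var _ = (typeof require === 'function') ? require('underscore') : window._

function summarize (users) {
  // Pure data manipulation, no DOM access: behaves identically on server and client.
  return _.map(users, function (u) {
    return { name: u.name, active: !!u.lastLogin }
  })
}

// Export for Node; in the browser, summarize is simply a page-level function.
if (typeof module !== 'undefined' && module.exports) {
  module.exports = summarize
}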
The RightJS library has a server build that has node.js in mind.
From the download page:
RightJS is also available as a server-side library. In this case it contains only the native JavaScript unit extensions and the Class, Observer, Options units along with all the non-DOM utility functions from the Util module.
Our server-side build follows the CommonJS principles and is ready for use with the node.js framework.
Node's GitHub wiki has a list of CommonJS-compatible modules which will run in Node and browsers.
Some of the other modules on that page may also run in a browser environment. For example, the excellent DateJS works fine in Node. (It is available as an npm package.)
By the way, RightJS is also available on npm.
