Configure Bower with Sails.js 0.10 - node.js

I installed the grunt-bower module and followed the instructions in the link below:
https://stackoverflow.com/a/22456574/194345
but I'm still not able to access the js or css files placed in the assets folder.
I installed jQuery with Bower, which is in "assets\lib\jquery\dist\jquery.js", and tried the following URLs in the browser to access it:
http://localhost:1337/js/jquery.js
http://localhost:1337/lib/js/jquery.js
http://localhost:1337/lib/jquery/dist/jquery.js
but each one displays "not found".
What is the correct path?

I think you may not have fully implemented the solution from the link you posted, specifically the part about configuring the grunt-bower task. I've edited that otherwise excellent answer to be a bit clearer: you need to wrap the configuration in a function and save it as tasks/config/bower.js:
module.exports = function(grunt) {
  grunt.config.set('bower', {
    dev: {
      dest: '.tmp/public',
      js_dest: '.tmp/public/js',
      css_dest: '.tmp/public/styles'
    }
  });

  grunt.loadNpmTasks('grunt-bower');
};
Then, next time you lift Sails, jquery.js will be copied to .tmp/public/js and therefore be available at http://localhost:1337/js/jquery.js.
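If you rely on Sails' automatic asset injection (sails-linker), you can also control load order in tasks/pipeline.js so that jQuery is injected before your own scripts. A minimal sketch, assuming the stock Sails 0.10 pipeline file (only the js list is shown, and your globs may differ):

var jsFilesToInject = [
  // load the copied jQuery before everything else
  'js/jquery.js',

  // then the rest of the js assets
  'js/**/*.js'
];

Paths here are relative to .tmp/public, which is where grunt-bower copies the files.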

Related

can't load files when trying to publish my threejs project on github pages

I've been experimenting with react three fiber and wanted to publish a project on GitHub Pages. I did a build of my project and deployed it on Pages. I get an error in the console that it cannot load my files:
main.bb26e057.js:1 Failed to load resource: the server responded with a status of 404 ()
Here is a link to the repository: https://github.com/olvard/repossumtory/tree/main/possumtory
Here is a link to the pages site: https://olvard.github.io/repossumtory/possumtory/build/
I've tried fiddling with different file paths, but I don't understand why npm run build would give me incorrect file paths anyway.
Would be grateful for any help.
Try providing the homepage property in the package.json of the React App.
In your case, https://olvard.github.io/repossumtory/possumtory/build/ is the start url.
So include this line in your package.json:
"homepage": "https://olvard.github.io/repossumtory/possumtory/build/",
Edit:
Regarding your comment: the model fails to load on the first attempt because of the last line in your model.js.
It should be useGLTF.preload('opossum.glb') instead of useGLTF.preload('/opossum.glb')
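For context, here is a minimal sketch of where the homepage field sits in package.json; the other fields are placeholders, not taken from the repository:

{
  "name": "possumtory",
  "version": "0.1.0",
  "homepage": "https://olvard.github.io/repossumtory/possumtory/build/",
  "scripts": {
    "build": "react-scripts build"
  }
}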
I would recommend using a package called gh-pages. It's a scripted way of uploading your site using the pages functions of GitHub and also supports custom configuration options on commit.
All you need to do is run:
npm install gh-pages --save-dev
Then create a new file. Inside this file simply insert this code:
var ghpages = require('gh-pages');

ghpages.publish('path-of-dir-to-commit', function(err) {
  if (err) {
    console.log(err);
  } else {
    console.log("success");
  }
});
You can read more on the npm site about the configuration options, such as naming the repo it generates if none is found, etc.
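Alternatively, a common pattern is to drive gh-pages from npm scripts instead of a separate file. A minimal sketch, assuming a Create React App style build output in build/:

"scripts": {
  "predeploy": "npm run build",
  "deploy": "gh-pages -d build"
}

Then npm run deploy builds the app and pushes the contents of build/ to the gh-pages branch.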

Module not found error when trying to use a module as a local module

I am trying to understand how to make a local module. At the root of the node application, I have a directory named lib. Inside the lib directory I have a .js file which looks like:
var Test = function() {
  return {
    say: function() {
      console.log('Good morning!');
    }
  };
}();

module.exports = Test;
I have modified my package.json with an entry of the path to the local module:
"dependencies": {
"chat-service": "^0.13.1",
"greet-module": "file:lib/Test"
}
Now, if I try to run a test script like:
var greet = require('greet-module');
console.log(greet.say());
it throws an error saying:
Error: Cannot find module 'greet-module'
What mistake am I making here?
modules.export is incorrect. It should be module.exports with an s.
Also, make sure after you add the dependency to do an npm install. This will copy the file over to your node_modules and make it available to the require function.
See here for a good reference.
Update:
After going through some examples to figure this out, I noticed most projects have the structure I laid out below. You should probably format your local modules as their own standalone packages, with their own folders and package.json files specifying their dependencies and name. Then you can include one with npm install -S lib/test.
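A minimal sketch of that layout, assuming the module lives in lib/greet-module (the names are illustrative, not from the original project). lib/greet-module/index.js holds the Test code shown above, and lib/greet-module/package.json looks like:

{
  "name": "greet-module",
  "version": "1.0.0",
  "main": "index.js"
}

The dependency then becomes "greet-module": "file:lib/greet-module", and npm install links it into node_modules.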
It worked for me once I did it, and it'll be a good structure moving forward. Cheers.
See here for the code.

How to not bundle node_modules, but use them normally in node.js?

Architecture
I would like to share code between client and server side. I have defined aliases in the webpack config:
resolve: {
  // Absolute paths: https://github.com/webpack/webpack/issues/109
  alias: {
    server: absPath('/src/server/'),
    app:    absPath('/src/app/'),
    client: absPath('/src/client/')
  }
},
Problem
Now on the server side I need to include webpack in order to recognize the correct paths when I require a file. For example
require('app/somefile.js')
will fail in pure node.js because it can't find the app folder.
What I need (read the What I need updated section)
I need to be able to use the webpack aliases. I was thinking about making a bundle of the whole server part without any files from node_modules. That way, when the server starts, it will load modules from the node_modules folder instead of from a minified js file. (Why? First, bundling them doesn't work. Second, it's bad, because node_modules are compiled per platform, and I don't want my Windows builds ending up on a Unix server.)
Output:
A compiled server.js file without any node_modules included.
Let server.js use node_modules from the node_modules folder.
What I need updated
As I've noticed in https://github.com/webpack/webpack/issues/135, making a bundled server.js will mess up all the file paths used in io operations.
A better idea would be to leave the node.js server files as they are, but replace the provided require method with a custom webpack require that takes configuration such as aliases (and maybe more) into account. This could be done the way require.js does it when running on a node.js server.
What I've tried
By adding this plugin in webpack
new webpack.optimize.CommonsChunkPlugin(/* chunkName= */"ignore", /* filename= */"server.bundle.js")
Entries:
entry: {
  client: "./src/client/index.js",
  server: "./src/server/index.js",
  ignore: ['the_only_node_module'] // But I need to do that for every node_module
},
It creates a server.js file which only contains my server code, and then a server.bundle.js which is not used. The problem is that webpack puts the webpackJsonp function in the server.bundle.js file, so neither the client nor the server works.
There should be a way to just disable node_modules for one entry.
What I've tried # 2
I've managed to exclude the path, but the requires don't work because they have already been rewritten: the source looks like require(3) instead of require('my-module'), since each require string has been converted to a module id.
To make it work I would also need to patch the require function that webpack exports so it falls back to node.js's native require (this is easy to do manually, but should be done automatically).
What I've tried # 3
In the webpack configuration:
{target: "node"}
This only seems to add an exports variable (I diffed the output and didn't notice anything else it changes).
What I've tried # 4 (almost there)
Using
require.ensure('my_module')
and then replacing all occurrences of r(2).ensure with require. I don't know if the r(2) part is always the same, so this might not be automatable.
Solved
Thanks to ColCh for enlightening me on how to do it here.
require = require('enhanced-require')(module, require('../../webpack.config'));
Overriding the require method in node.js like this makes node.js pass all requires through the webpack require function, which lets us use aliases and other goodies. Thanks ColCh!
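A quick sketch of what this enables (the file path is illustrative, not from the project): after the override above, aliased requires resolve through the webpack config.

// later in the same server file
var somefile = require('app/somefile.js'); // resolved via the 'app' alias from webpack.config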
Related
https://www.bountysource.com/issues/1660629-what-s-the-right-way-to-use-webpack-specific-functionality-in-node-js
https://github.com/webpack/webpack/issues/135
http://webpack.github.io/docs/configuration.html#target
https://github.com/webpack/webpack/issues/458
How to simultaneously create both 'web' and 'node' versions of a bundle with Webpack?
http://nerds.airbnb.com/isomorphic-javascript-future-web-apps/
My solution was:
{
  // make sure that webpack will externalize
  // modules using Node's module API (CommonJS 2)
  output: { ...output, libraryTarget: 'commonjs2' },

  // externalize all require() calls to non-relative modules.
  // Unless you do something funky, every time you import a module
  // from node_modules, it should match the regex below
  externals: /^[a-z0-9-]/,

  // Optional: use this if you want to be able to require() the
  // server bundles from Node.js later
  target: 'node'
}
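Put together, a minimal sketch of a complete config using those settings (the entry and output paths are assumptions, not taken from the question):

// webpack.config.js (sketch)
var path = require('path');

module.exports = {
  entry: './src/server/index.js',
  target: 'node',
  output: {
    path: path.resolve(__dirname, 'build'),
    filename: 'server.js',
    libraryTarget: 'commonjs2' // expose the bundle via Node's module API
  },
  // leave bare module names to be resolved from node_modules at runtime
  externals: /^[a-z0-9-]/
};

You can then run the result with node build/server.js, with node_modules installed on the target machine as usual.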

How to use Gulp to create a separate vendor bundle with Browserify from Bower components

I'm using Gulp and Browserify to package my JavaScript into two separate bundles: application.js and vendor.js.
How do I bundle the vendor package if my vendor libraries are installed with Bower?
In my gulpfile, I'm using the following modules:
var gulp = require("gulp");
var browserify = require("browserify");
var debowerify = require("debowerify");
var source = require("vinyl-source-stream");
Assuming that I have only the Phaser framework installed with bower (for this example), my Gulp task to create the application package looks like this:
gulp.task("scripts-app", function () {
browserify("./app/javascripts/index.js")
.external("phaser")
.pipe(source("application.js"))
.pipe(gulp.dest("./tmp/assets"));
});
Meanwhile, the vendor task looks like this:
gulp.task("scripts-vendor", function () {
browserify()
.transform(debowerify)
.require("phaser")
.pipe(source("vendor.js"))
.pipe(gulp.dest("./tmp/assets"));
});
When I run this Gulp task, I get an error that states Error: Cannot find module 'phaser' from, followed by all the directories it searched through (none of which is the bower_components directory).
Any ideas about how to package these up successfully are greatly appreciated. Thanks!
Answered my own question:
When using require in the Gulp task, you need to supply a path to a file, not just a name.
gulp.task("scripts-vendor", function () {
browserify()
.transform(debowerify)
.require("./bower_components/phaser/phaser.js")
.pipe(source("vendor.js"))
.pipe(gulp.dest("./tmp/assets"));
});
Notice that require("phaser") became require("./bower_components/phaser/phaser.js").
Doing this works, although the bundle takes forever to build (around 20 seconds). You're probably better off just loading giant libraries/frameworks directly into your app through a <script> tag and then using Browserify Shim.
This lets you require() global variables (in the NodeJS/Browserify sense); see the documentation.
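A minimal sketch of what that shim configuration might look like in package.json, assuming Phaser is loaded globally through a <script> tag and exposes window.Phaser:

"browserify": {
  "transform": ["browserify-shim"]
},
"browserify-shim": {
  "phaser": "global:Phaser"
}

With that in place, require('phaser') in your app code resolves to the global instead of being pulled into the bundle.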
Seems like you figured out how to require the Bower file. Hopefully you'll only have to bundle it once initially, and not on every build. Including the library via a script tag isn't a bad idea. Another technique I'm using is scriptjs (a polyfill would work too) to async-load whatever vendor libraries I need, making sure that any requires come after the script loads. For example, your index.js could look like:
$script('/assets/vendor', function() {
  var phaser = require('phaser');
  // rest of code
});
It's especially nice for loading cdn files or having the ability to defer loading certain libraries that aren't necessarily used in the core app by every user, or loading libraries after client-side routing.

Server side includes (SSI) with grunt connect web server

We are using yeoman for our dev process and currently using the "grunt server" command to run the grunt connect web server for local development. Every time we save a file, grunt will run all its tasks and reload the browser.
The problem is with the server side includes we use for the header and footer. We previously had this working with Apache, IIS and Tomcat, but have no idea how to get connect to do the same. It just treats the include as an html comment.
eg include:
<!--#include virtual="header.html" -->
So,
1. Is there a way to get grunt/connect to include these files?
2. If not can we use Apache with yeoman/grunt?
3. If all fails, is there another way to include files with connect?
You can have express handle SSI with the help of the ssi node module.
I put together a github repo with this simple example: https://github.com/sfarthin/express-ssi-example .
I deployed this app to heroku so you can see it in action: http://intense-basin-9464.herokuapp.com/
// the snippet assumes fs and an ssi parser are set up elsewhere in the linked repo, roughly:
// var fs = require('fs');
// var ssi = require('ssi');
// var parser = new ssi(__dirname, __dirname, '/**/*.shtml');

app.use(function(req, res, next) {
  var filename = __dirname + (req.path == "/" ? "/index.shtml" : req.path);
  if (fs.existsSync(filename)) {
    res.send(parser.parse(filename, fs.readFileSync(filename, {encoding: "utf8"})).contents);
  } else {
    next();
  }
});
You can easily use connect-ssi:
https://github.com/soenkekluth/connect-ssi
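A rough sketch of wiring that into grunt-contrib-connect's middleware hook; the connect-ssi option names below are assumptions from memory, so check its README before using them:

// Gruntfile.js (sketch)
connect: {
  livereload: {
    options: {
      middleware: function (connect, options, middlewares) {
        // process SSI directives before the default static middleware runs
        middlewares.unshift(require('connect-ssi')({
          baseDir: 'app',  // assumed option: directory containing the files with includes
          ext: '.html'     // assumed option: which file extension to process
        }));
        return middlewares;
      }
    }
  }
}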
I also used the ssi module for that.
For now, includes are only allowed for .shtml files; I'll change that soon.
Thanks a lot for all your help @steve-farthing and @soenke. I finally ended up using a much simpler solution, which was to install Apache with SSI enabled and add the following JS tag to the footer:
<script type="text/javascript">
  document.write('<script src="//localhost:35729/livereload.js?snipver=1" type="text/javascript"><\/script>')
</script>
Now when we run grunt serve we still need to manually navigate to http://localhost/app/, but everything else seems to work fine after that.
