How to deploy Ganttlab in a path different than "/" - node.js

I am installing Ganttlab locally and I want to be able to use it at https://[mygitlabserver]/ganttlab/. The configuration and build process are pretty straightforward. However, I noticed that some files like "dist/index.html" and "dist/js/app.xxxxxx.js" have been built with "/" as the default path, resulting in a lot of 302s and 404s.
Is there a way I can configure this path before running build:webapp?
Thanks in advance.

The webapp uses Vue, so create a Vue config file at packages/ganttlab-adapter-webapp/vue.config.js:
module.exports = {
  publicPath: './'
}
The code above will make all paths relative.
Depending on your setup, you can also set the path explicitly, for example publicPath: '/ganttlab/', as sketched below.
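A minimal sketch of that explicit variant, assuming you want to serve the webapp from the /ganttlab/ sub-path mentioned in the question:
// packages/ganttlab-adapter-webapp/vue.config.js
module.exports = {
  // prefix all built asset URLs with /ganttlab/ instead of /
  publicPath: '/ganttlab/'
}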
For more info, see github.com/ganttlab/ganttlab (download and install instructions).

Related

Deploy VueJS App in a sub-directory or sub-path

I'm experiencing problems getting a Vue.js app built with the Webpack CLI to work when deployed.
If uploaded to the root directory everything renders fine, but inside a subfolder, all the links break.
I want to deploy the Vue.js app to this URL:
https://event.domain.net/webinar
I have added publicPath in vue.config.js:
var path = require('path')
module.exports = {
  publicPath: './'
}
But only the css and js folders point to the path /webinar.
Assets, fonts and others still point to the root of the subdomain https://event.domain.net.
Use /webinar as the value of publicPath; that should work.
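A minimal sketch of that suggestion, with the sub-path taken from the URL in the question:
// vue.config.js
module.exports = {
  // serve the built assets from the /webinar/ sub-path instead of /
  publicPath: '/webinar/'
}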
More details are here: https://cli.vuejs.org/config/#publicpath
You can also configure publicPath based on the environment.
Sagar Rabadiya pointed you to the right link:
Create a file called vue.config.js in the project root (where your package.json is located).
Put the following code snippet inside:
module.exports = {
  publicPath: process.env.NODE_ENV === 'production' ? '/your-sub-directory/' : '/'
}
and save the file.
Open a terminal, navigate to your project, and run npm run build to generate a production build.
As soon as the production build has been generated, copy its contents and paste them into the sub-directory you created in the server's root folder. For example, if you use Apache, the default root directory is the htdocs folder. I also created a virtual host on the server; you may need to do that too.
Open the browser and go to the address where your sub-directory lives, for example: http://your-server-url:your-port/your-sub-directory/ You should see your app now.

How can I change the name of the _nuxt folder?

Hello, I've got an issue with a Nuxt.js app that I can't seem to resolve. What I want to do is change the name of the generated _nuxt folder to some other name. So far I've updated nuxt.config.js and added this snippet:
build: {
  publicPath: '/new-folder'
},
As far as I understand, this publicPath variable expects a CDN link, so this is probably not the correct way of changing the default _nuxt folder name.
I have also tried adding buildDir: 'new-folder', but when I run the build command it doesn't show up in the project. No matter what changes I made in nuxt.config, when I deployed to Heroku all the assets were still in the _nuxt folder, which causes issues in my project. Am I missing something, or am I doing something wrong?
Since the default value in the Nuxt.js documentation is /_nuxt/, the value should be /yourCustomName/ - be aware that you need both forward slashes.
In nuxt.config.js
build: {
  publicPath: '/customName/'
}
It's simple: just change publicPath in build.
buildDir changes the folder used for development - where the generated files go while you are coding.
nuxt.config.js:
build: {
  publicPath: 'new-folder/',
},
In my case, my publicPath is 'leo/'.
You can check more about it here:
https://medium.com/@andrejsabrickis/how-to-set-custom-configuration-for-nuxt-js-generate-task-5055e53c2da5
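Putting the two answers together, a combined nuxt.config.js sketch (the folder names here are just example values, not from the question):
// nuxt.config.js
export default {
  // buildDir changes where Nuxt writes its local build files
  // (the default is '.nuxt'); it does not affect the public URL
  buildDir: 'nuxt-build',
  build: {
    // publicPath replaces the default '/_nuxt/' prefix used for the bundled assets
    publicPath: '/custom-assets/'
  }
}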

How to resolve path to a file within node_modules

Problem loading a file using a relative path when my Node app is initialised by another Node app.
I have created an npm package which relies on a file stored relative to the project root, something like this:
index.js
res/
    config.json
Now I read config.json using the following code:
const pathToConfig = path.resolve(__dirname, '../res/config.json')
This works great locally.
But in my prod setup, this app is initialised by another Node app,
and __dirname resolves to the root of that app, so all my logic to find config.json gets messed up.
Is there any way I can read the file without worrying about how the Node app was initialised?
Have you tried process.cwd()? It is similar to __dirname, but they differ slightly.
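A small sketch of the difference; the paths in the comments are hypothetical examples:
const path = require('path');

// __dirname: the directory of the file this code lives in,
// e.g. /srv/main-app/node_modules/my-package
console.log(__dirname);

// process.cwd(): the working directory of the Node process that started
// the app, e.g. /srv/main-app - it depends on where the app was launched
// from, not on where this file lives
console.log(process.cwd());

// the question resolves the config relative to __dirname:
const fromDirname = path.resolve(__dirname, '../res/config.json');

// the suggestion above resolves it relative to the working directory instead:
const fromCwd = path.resolve(process.cwd(), 'res/config.json');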

How to not bundle node_modules, but use them normally in node.js?

Architecture
I would like to share code between client and server side. I have defined aliases in the webpack config:
resolve: {
  // Absolute paths: https://github.com/webpack/webpack/issues/109
  alias: {
    server : absPath('/src/server/'),
    app : absPath('/src/app/'),
    client : absPath('/src/client/'),
  }
},
Problem
Now on the server side I need to include webpack in order to recognize the correct paths when I require a file. For example
require('app/somefile.js')
will fail in pure Node.js because it can't find the app folder.
What I need (read the What I need updated section)
I need to be able to use the webpack aliases. I was thinking about making a bundle of the entire server part without any files from node_modules. That way, when the server starts, it will use modules from the node_modules folder instead of a minified JS file. (Why? First: it doesn't work. Second: it's bad, because node_modules are compiled per platform, so I don't want my Windows files to end up on a Unix server.)
Output:
a compiled server.js file without any node_modules included;
server.js should still be able to use node_modules.
What I need updated
As I've noticed in https://github.com/webpack/webpack/issues/135, making a bundled server.js will mess up all the file paths used in IO operations.
A better idea would be to leave the Node.js server files as they are, but replace the provided require method with a custom webpack-aware require which takes configuration such as aliases (anything else?) into account. This could be done the way require.js did it to run on a Node.js server.
What I've tried
By adding this plugin in webpack
new webpack.optimize.CommonsChunkPlugin(/* chunkName= */"ignore", /* filename= */"server.bundle.js")
Entries:
entry: {
  client: "./src/client/index.js",
  server: "./src/server/index.js",
  ignore: ['the_only_node_module'] // But I need to do that for every node_module
},
It creates a file server.js which only contains my server code, and a server.bundle.js which is not used. The problem is that webpack includes the webpackJsonp function in the server.bundle.js file, so neither the client nor the server works.
There should be a way to just disable node_modules for a single entry.
What I've tried # 2
I've managed to exclude the path, but the requires don't work because they are already minified: the source looks like require(3) instead of require('my-module'). Each require string has been converted to an integer, so it doesn't work.
To make this work, I would also need to patch the require function that webpack exports so that it falls back to Node's native require (this is easy to do manually, but it should happen automatically).
What I've tried # 3
In the webpack configuration:
{target: "node"}
This only adds an exports variable (I'm not sure what else it does, because I've only diffed the output).
What I've tried # 4 (almost there)
Using
require.ensure('my_module')
and then replacing all occurrences of r(2).ensure with require. I don't know whether the r(2) part is always the same, so this might not be automatable.
Solved
Thanks to ColCh for enlightening me on how to do it here.
require = require('enhanced-require')(module, require('../../webpack.config'));
By replacing the require method in Node.js, all requires are passed through the webpack require function, which lets us use aliases and other goodies! Thanks ColCh!
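A short usage sketch of that idea; the relative path to webpack.config and the aliased module name are taken from the snippets above and are placeholders for your own paths:
// at the top of the server entry file, before any aliased requires
require = require('enhanced-require')(module, require('../../webpack.config'));

// from here on, aliases defined in the webpack config resolve in plain Node.js
var somefile = require('app/somefile.js');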
Related
https://www.bountysource.com/issues/1660629-what-s-the-right-way-to-use-webpack-specific-functionality-in-node-js
https://github.com/webpack/webpack/issues/135
http://webpack.github.io/docs/configuration.html#target
https://github.com/webpack/webpack/issues/458
How to simultaneously create both 'web' and 'node' versions of a bundle with Webpack?
http://nerds.airbnb.com/isomorphic-javascript-future-web-apps/
My solution was:
{
  // make sure that webpack will externalize
  // modules using Node's module API (CommonJS 2)
  output: { ...output, libraryTarget: 'commonjs2' },
  // externalize all require() calls to non-relative modules.
  // Unless you do something funky, every time you import a module
  // from node_modules, it should match the regex below
  externals: /^[a-z0-9-]/,
  // Optional: use this if you want to be able to require() the
  // server bundles from Node.js later
  target: 'node'
}
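Putting those options together with the aliases from the question, a server-side webpack config could look roughly like this - a sketch rather than the poster's exact setup; the absPath helper is defined here only to keep the snippet self-contained:
// webpack.server.config.js (sketch)
const path = require('path');
const absPath = (p) => path.join(__dirname, p);

module.exports = {
  target: 'node', // build for Node.js rather than the browser
  entry: { server: './src/server/index.js' },
  output: {
    path: absPath('/build'),
    filename: 'server.js',
    libraryTarget: 'commonjs2' // expose exports via Node's module API
  },
  // treat non-relative requires (i.e. node_modules) as externals,
  // so they are loaded at runtime instead of being bundled
  externals: /^[a-z0-9-]/,
  resolve: {
    alias: {
      server: absPath('/src/server/'),
      app: absPath('/src/app/'),
      client: absPath('/src/client/')
    }
  }
};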

Require.js optimizer supposed to copy all files over into the output directory?

I am trying to integrate the r.js optimizer on the server side (Apache Sling) and face one problem: when resolving modules it always looks them up under the output directory (dir), not from within the source directory (baseUrl or appDir), doesn't find them and thus fails.
/project/build.js
({
  name: "modules/main",
  dir: "/target",
  baseUrl: "/sources"
})
If you wonder, the root path / is inside the server's JCR repository, not a file system. Also I simplified the example a bit (hopefully without concealing the issue).
It will resolve and read the main file properly:
/sources/modules/main.js
require(["modules/foo"]);
However, when it then tries to resolve modules/foo, it tries to read it from /target/modules/foo.js instead of /sources/modules/foo.js as I would expect; that file does not exist, so the whole r.js execution fails and stops.
I tried using appDir and all kinds of combinations, but the issue is always the same. I am fairly sure it is not related to my integration code... AFAIU from documentation and googling around, it should either copy them to the target before building the optimized file or simply pick them up from the source directory automatically.
Am I supposed to copy all the raw source files to /target myself before running r.js?
Maybe the problem is that baseUrl=/overlay is different from build.js residing inside /project?
Maybe r.js also looks at the current working directory of the r.js process (which is so far undefined in my case)?
Can the output directory (dir) live outside appDir or baseUrl?
My require.js configuration looks like so:
({
  appDir: "../app",
  baseUrl: "js/lib", // means the base URL is ../app/js/lib
  dir: "../app-built", // target
  // offtopic, but a very handy option
  mainConfigFile: "../app/config.js",
  // I'm not 100% sure if it's equivalent to your version
  // where you're not using "modules" and just "name"
  modules: [{
    name: "../some/main" // this is ../app/js/some/main.js
  }]
})
Reading through https://github.com/jrburke/r.js/blob/master/build/example.build.js#L15 - it seems you do want an appDir specified if you want the files to be copied to the target dir before optimization.
To answer your other questions:
you don't need to manually copy files over
baseUrl should point to the same place as the baseUrl used in your app's config - however, you have to adjust it depending on which appDir you use (e.g. appDir="../app" and baseUrl="js/lib", or appDir="../app/js" and baseUrl="lib", etc.)
appDir and dir should be relative to the build config file - I don't know what happens when you use absolute paths
yes - the output dir does (has to?) live outside appDir. baseUrl lives within appDir/dir (all these names are really confusing...)
I would say:
use the "appDir" setting
try using "modules" like I did instead of just "name"
make "appDir" and "dir" relative paths to the build file if you can - these absolute paths might be what's breaking things, because other than that the config looks very similar to the one I use
I know there's a different way of configuring it where the output is a single file, in which case the files are read from the source dir - but I haven't used that much myself.
Hope this helps.
Answering myself: I got it to work with the single output file approach using out instead of appDir or dir:
({
  name: "modules/main",
  baseUrl: "/sources",
  out: "/target/out.js"
})
In this case it reads all the modules from the sources and creates a /target/out-temp.js which it then moves to /target/out.js when done.
This seems to suit my needs so far.
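For reference, the same single-file build can also be driven programmatically from Node.js with the requirejs package - a sketch reusing the paths from the config above; adjust them to your environment:
var requirejs = require('requirejs');

var config = {
  name: 'modules/main',
  baseUrl: '/sources',
  out: '/target/out.js'
};

// runs the same optimization as r.js with the build file above
requirejs.optimize(config, function (buildResponse) {
  // buildResponse is a text report listing the bundled modules
  console.log(buildResponse);
}, function (err) {
  console.error(err);
});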
