Laravel Mix not immediately exiting/terminating after compiled success - node.js

I have a project that currently uses Laravel Jetstream with Inertia (inertia-vue2). It was fine at first, but now I have an irritating issue that also affects my GitHub Actions. When I run, say, yarn dev or yarn prod, the build compiles successfully but the process doesn't exit; it hangs until I press Ctrl+C to terminate it. Doing that every time I run yarn dev on my machine is tolerable (though honestly, annoying), but it has become a real problem on GitHub Actions because I can't manually terminate Laravel Mix inside the workflow. I waited three hours for the build step to terminate on its own, but it never did. Any way to fix this? Below are my webpack.mix.js and my webpack.config.js.
This is the 8th time I have run this job, with the same result every time. I've modified webpack.mix.js, removed yarn.lock and node_modules on my machine, and pushed a fresh yarn.lock, but nothing changes. I'm running out of ideas.
And here's my webpack.mix.js (and yes, I have both PostCSS and Sass; I was intending to go full Sass, but since Vuetify struggles with it, I'm using PostCSS for the Vuetify stuff for now and Sass for everything else):
const mix = require('laravel-mix');
require('laravel-mix-favicon');
require('vuetifyjs-mix-extension');

/*
 |--------------------------------------------------------------------------
 | Mix Asset Management
 |--------------------------------------------------------------------------
 |
 | Mix provides a clean, fluent API for defining some Webpack build steps
 | for your Laravel applications. By default, we are compiling the CSS
 | file for the application as well as bundling up all the JS files.
 |
 */

mix.js('resources/js/app.js', 'public/js').vue()
    .postCss('resources/css/app.css', 'public/css', [
        require('postcss-import'),
        require('autoprefixer'),
    ])
    .sass('resources/sass/print.scss', 'public/css')
    .webpackConfig(require('./webpack.config'))
    .vuetify()
    .sourceMaps()
    .favicon({
        blade: 'resources/views/favicon.blade.php',
    })
    .disableNotifications();

if (mix.inProduction()) {
    mix.version();
}
And here's my webpack.config.js
const path = require('path');

module.exports = {
    resolve: {
        alias: {
            '#': path.resolve('resources/js'),
        },
    },
    output: {
        chunkFilename: 'js/[name].js?id=[chunkhash]',
    },
};
Any help would be appreciated. Thanks!
[EDIT]
In GitHub Actions, I always have to cancel the workflow manually, and I don't want to wait three-plus hours anymore. This is a pain T-T.
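A possible stopgap, not a confirmed fix: Laravel Mix exposes a mix.then() callback that runs once webpack compilation finishes, so in CI you could force the process to exit after a successful build in case one of the plugins keeps the event loop alive. A minimal sketch, assuming all artifacts are flushed to disk by the time the callback fires:
// At the bottom of webpack.mix.js — hedged CI workaround: force Node
// to exit once Mix reports that compilation is complete.
// (GitHub Actions sets CI=true automatically.)
if (process.env.CI) {
    mix.then(() => process.exit(0));
}
On the GitHub Actions side, setting timeout-minutes on the build step at least caps a hung job instead of letting it run for hours.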

Related

Gatsby Source Drupal not fetching data when trying to deploy to netlify/heroku

I have a site running Gatsby with Gatsby-Source-Drupal7, a plugin that makes an axios GET request to https://stagingsupply.htm-mbs.com/restws_resource.json and exposes the JSON data to GraphQL queries. It runs just fine on my computer at localhost:8000 and creates over 200k nodes, but when I try to deploy to any cloud service provider like Gatsby Cloud or Netlify, it doesn't fetch any nodes or data at all from the site.
Warning from the console:
Starting to fetch data from Drupal
warn The gatsby-source-drupal7 plugin has generated no Gatsby nodes. Do you need it?
Code from gatsby-config.js:
module.exports = {
  siteMetadata: {
    title: `new`,
    siteUrl: `https://www.yourdomain.tld`,
  },
  plugins: [
    {
      resolve: `gatsby-source-drupal7`,
      options: {
        baseUrl: `https://stagingsupply.htm-mbs.com/`,
        apiBase: `restws_resource.json`, // optional, defaults to `restws_resource.json`
      },
    },
  ],
}
Excerpt from the plugin source in node_modules/gatsby-source-drupal7:
const createNode = actions.createNode; // Default apiBase to `jsonapi`
apiBase = apiBase || `restws_resource.json`; // Fetch articles.
// console.time(`fetch Drupal data`)
console.log(`Starting to fetch data from Drupal`);
const data = yield axios.get(`${baseUrl}/${apiBase}`, {
  auth: basicAuth
});
const allData = yield Promise.all(_.map(data.data.list,
Link to the repo that works on my local computer: https://github.com/nicholastorr/gatsby-d7
Any and all help will be appreciated!
As you pointed out, you've played around with Node versions using NODE_ENV and engines workarounds. My guess is also a mismatched Node version between environments, but as the Netlify docs suggest, there are only two ways of customizing the Node version used to manage dependencies:
Set a NODE_VERSION environment variable.
Add a .node-version or .nvmrc file to the site's base directory in your repository. This will also tell any other developer using the repository which version of Node.js it depends on.
I can't see your Netlify build command (so I can't check NODE_VERSION), but there's no .node-version or .nvmrc in your repository. I'd try creating one at the root of the project containing v14.17.1 and triggering a fresh install.
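For example, a .nvmrc at the project root whose entire content is the single version line (14.17.1 being just the version suggested above):
14.17.1
Netlify reads this file at build time, and anyone using nvm locally can run nvm use to match it.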
In addition, double-check other server-related conflicts like IP-blocking, etc.
The error was nothing Gatsby- or Node-related: my site was blocking the IP of the build server :>

Smarter webpack bundling with react and express

I've got a React app that runs on an Express server and bundles with webpack. My issue is that every time I restart the server, like when I'm making changes to it, it takes forever to rebuild the frontend bundle, even though I haven't changed anything on the frontend.
It would be nice to reload just the server portion and leave the current frontend bundle intact when making server/API changes that don't involve the frontend bundle.
Here is the code that runs in a dev environment:
const compiler = webpack(webpackConfig)
const middleware = webpackMiddleware(compiler, {
  publicPath: webpackConfig.output.publicPath,
  contentBase: 'src',
  stats: {
    colors: true,
    hash: false,
    timings: true,
    chunks: false,
    chunkModules: false,
    modules: false
  }
})

app.use(middleware)
app.use(webpackHotMiddleware(compiler))
app.get('*', (req, res) => {
  res.write(middleware.fileSystem.readFileSync(path.join(__dirname, 'build/app.html')))
  res.end()
})
Is there a smarter way to do this? Is it possible to keep the current frontend bundle in memory and reload just the server? Or can I detect whether the bundle needs to be rebuilt and skip the process when it doesn't?
Any tips, advice and suggestions are welcome! Let me know if you need any other info. Thanks!
Chokidar solution
If you're using webpack dev tooling, a great library for watching changes is chokidar:
Chokidar does still rely on the Node.js core fs module, but when using fs.watch and fs.watchFile for watching, it normalizes the events it receives, often checking for truth by getting file stats and/or dir contents.
Here is a small example of using chokidar to watch only targetfolder. By targeting just one specific folder, you can leave the frontend intact. I haven't tried this for your exact use case, but at first sight it seems it may suit your needs.
var production = process.env.NODE_ENV === 'production'

if (!production) {
  var chokidar = require('chokidar')
  var watcher = chokidar.watch('./targetfolder')
  watcher.on('ready', function () {
    watcher.on('all', function () {
      console.log('Clearing /targetfolder/ module cache from server')
      Object.keys(require.cache).forEach(function (id) {
        if (/[\/\\]targetfolder[\/\\]/.test(id)) delete require.cache[id]
      })
    })
  })
}
There's a great example on GitHub called Ultimate Hot Reloading Example.
NB: webpack-dev-server doesn't write files to disk; it serves the result from memory through an Express instance. But webpack --watch does write files to disk.
Flag solution
You can use webpack's --watch flag.
In your package.json, in the script that starts your server (or the one that runs webpack), add webpack --progress --colors --watch.
See the webpack documentation, which says:
We don’t want to manually recompile after every change…
When using watch mode, webpack installs file watchers to all files, which were used in the compilation process. If any change is detected, it’ll run the compilation again. When caching is enabled, webpack keeps each module in memory and will reuse it if it isn’t changed.
Example in package.json:
"scripts": {
"dev": "webpack --progress --colors --watch"
}
I have this problem in a Spring Boot app, where I can rebuild my bundle rapidly, but that doesn't necessarily make the server look inside an actual folder and pick it up live in real time. So really, what you need is a way to configure your server to always look in a local folder for the bundle.js file instead of pulling it from the WAR/JAR or wherever it normally pulls it from. It's not a webpack issue; it's an issue of how to make the server read bundle.js directly from the folder. I would give you the Spring way of doing it, but that's not your architecture.

Unexpected token import - using react and node

I'm getting an error when running my React app: Uncaught SyntaxError: Unexpected token import.
I know there are a plethora of similar issues on here, but I think mine is a little different. First of all, here is the repository, since I'm not sure where exactly the error is: repo
I'm using create-react-app, and in a separate backend directory I'm using Babel (with a .babelrc file containing the es2015 preset). The app worked fine until I added another file in a new directory in the backend folder (/backend/shared/validations/signup.js).
I was using ES6 before that too, and it was working perfectly fine. At first I thought it was some problem with Windows, but I cloned the repo on my Ubuntu laptop and I'm getting the same error there.
Some things I've tried already:
Moving the entire folder from /backend to the root folder
Moving just the file (signup.js) just about everywhere
No matter where the file is, the error stays the same. If I remove the entire file and all references to it, the app works again.
I think this error is pretty weird, considering I'm using ES6 everywhere else in the app without trouble. It would be great if anyone could help me with this error.
Edit: If you want to test this on your own machine, just clone the repo and run npm start in the root folder (and also npm start in the backend folder, but that isn't required for the app to run, or for the error to show up).
Here's what's happening: your webpack config is set to run the Babel loader only on the src folder (the include directive).
To change that, one approach is:
1) Extract the webpack configuration with npm run eject (I don't know whether there's another way to override settings in create-react-app). From the docs:
Running npm run eject copies all the configuration files and the
transitive dependencies (Webpack, Babel, ESLint, etc) right into your
project so you have full control over them.
2) In config/paths.js, add an appShared key, like this:
module.exports = {
  /* ... */
  appSrc: resolveApp('src'),
  appShared: resolveApp('backend/shared'),
  /* ... */
};
3) In config/webpack.config.dev.js, add the path to the Babel loader, like this:
{
  test: /\.(js|jsx)$/,
  include: [paths.appSrc, paths.appShared],
  loader: 'babel',
  /* ... */
},
It works now!

How to not bundle node_modules, but use them normally in node.js?

Architecture
I would like to share code between the client and the server side. I have defined aliases in the webpack config:
resolve: {
  // Absolute paths: https://github.com/webpack/webpack/issues/109
  alias: {
    server: absPath('/src/server/'),
    app: absPath('/src/app/'),
    client: absPath('/src/client/'),
  }
},
Problem
Now on the server side I need to include webpack in order for the correct paths to be recognized when I require a file. For example,
require('app/somefile.js')
will fail in pure Node.js because it can't find the app folder.
What I need (read the What I need updated section)
I need to be able to use the webpack aliases. I was thinking about making a bundle of the whole server part without any files from node_modules. That way, when the server starts, it will use modules from the node_modules folder instead of a minified JS file. (Why? First: bundling them doesn't work. Second: it's bad, because node_modules are compiled per platform, so I don't want my Windows builds ending up on a Unix server.)
Desired output:
A compiled server.js file without any node_modules included;
server.js still able to load node_modules at runtime.
What I need updated
As I've noticed in https://github.com/webpack/webpack/issues/135, making a bundled server.js will mess up all the file paths used in I/O operations.
A better idea would be to leave the Node.js server files as they are, but replace the provided require method with a custom webpack require that takes configuration such as aliases into account (anything else?)... This could be done the way require.js did it to run on a Node.js server.
What I've tried
Adding this plugin in webpack:
new webpack.optimize.CommonsChunkPlugin(/* chunkName= */ "ignore", /* filename= */ "server.bundle.js")
Entries:
entry: {
  client: "./src/client/index.js",
  server: "./src/server/index.js",
  ignore: ['the_only_node_module'] // But I need to do that for every node_module
},
This creates a server.js file containing only my server code, plus a server.bundle.js which is not used. But the problem is that webpack puts the webpackJsonp function in the server.bundle.js file, so neither the client nor the server will work.
There should be a way to just disable node_modules for one entry.
What I've tried # 2
I've managed to exclude the path, but the requires don't work because they are already minified: the source looks like require(3) instead of require('my-module'). Each require string has been converted to an integer, so it doesn't work.
To make it work I would also need to patch the require function that webpack exports so it falls back to Node.js's native require (this is easy to do manually, but it should happen automatically).
What I've tried # 3
In the webpack configuration:
{ target: "node" }
This only adds an exports variable (I'm not sure what else it does; I only diffed the output).
What I've tried # 4 (almost there)
Using
require.ensure('my_module')
and then replacing all occurrences of r(2).ensure with require. I don't know whether the r(2) part is always the same, so this might not be automatable.
Solved
Thanks to ColCh for enlightening me on how to do this here.
require = require('enhanced-require')(module, require('../../webpack.config'));
Replacing the require method in Node.js makes Node pass all requires through the webpack require function, which lets us use aliases and other goodies. Thanks ColCh!
Related
https://www.bountysource.com/issues/1660629-what-s-the-right-way-to-use-webpack-specific-functionality-in-node-js
https://github.com/webpack/webpack/issues/135
http://webpack.github.io/docs/configuration.html#target
https://github.com/webpack/webpack/issues/458
How to simultaneously create both 'web' and 'node' versions of a bundle with Webpack?
http://nerds.airbnb.com/isomorphic-javascript-future-web-apps/
My solution was:
{
  // Make sure that webpack will externalize
  // modules using Node's module API (CommonJS 2)
  output: { ...output, libraryTarget: 'commonjs2' },

  // Externalize all require() calls to non-relative modules.
  // Unless you do something funky, every time you import a module
  // from node_modules, it should match the regex below
  externals: /^[a-z0-9-]/,

  // Optional: use this if you want to be able to require() the
  // server bundles from Node.js later
  target: 'node'
}
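A related sketch, my assumption rather than part of the answer above: the webpack-node-externals package automates that externals regex by marking everything under node_modules as external.
// webpack.config.js — minimal sketch assuming webpack-node-externals
// is installed (npm install --save-dev webpack-node-externals)
const nodeExternals = require('webpack-node-externals');

module.exports = {
  target: 'node',
  output: { libraryTarget: 'commonjs2' },
  // Leave every node_modules import as a runtime require() call
  // instead of bundling it into server.js
  externals: [nodeExternals()],
};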

Sails.js application not refreshing files from assets after start

I have a Sails.js application with an Angular.js front end.
The Angular files are stored in /assets/linker and they are injected properly on start. My issue is that when I change a CSS or JS file in assets, the change doesn't appear on the server; the loaded JS file is the same as when the server started. I cleared my browser cache and tried another browser, but it's still the same.
I also tried running the application with forever -w and with nodemon, but still nothing. The application is in dev mode anyway; starting with sails lift --dev does not solve the issue either.
I have a feeling that I'm missing something in the configuration. Is there any way to force reloading of assets?
You need to check your Gruntfile configuration; that's where the magic happens in terms of linking and livereload.
Specifically, you'll need to look at the watch task and its related tasks.
By default it looks like this:
watch: {
  api: {
    // API files to watch:
    files: ['api/**/*']
  },
  assets: {
    // Assets to watch:
    files: ['assets/**/*'],
    // When assets are changed:
    tasks: ['compileAssets', 'linkAssets']
  }
}
I found the problem. I scaffolded the Angular.js structure with an Angular generator, which adds not only the JS structure but also a Karma test environment containing shell and bat scripts, the Karma framework, and more.
Building a Sails application with all these files in the watched folder breaks the refresh functionality. There are no errors in the console and nothing in the running application, but the files from assets are not reloaded anymore.
Tip of the day: be careful with the files you keep in assets, and take a look at what your generators generate! One possible way to exclude such files is sketched below.
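A hedged sketch of keeping the watch intact without moving those files: exclude the generated scaffolding from the watched globs. The folder and extension patterns below are illustrative guesses, not taken from the question:
watch: {
  assets: {
    // Watch assets, but skip the generated test scaffolding.
    // grunt-contrib-watch accepts negated '!' glob patterns:
    files: ['assets/**/*', '!assets/**/karma*/**', '!assets/**/*.sh', '!assets/**/*.bat'],
    // When assets are changed:
    tasks: ['compileAssets', 'linkAssets']
  }
}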
I came here looking for livereload. After a little searching: Live Reloading, Enabling Live Reload in Your HTML. In the current version of Sails (v0.10) the watch task lives in its own file: tasks/config/watch.js.
