npm build result still has the source code? - node.js

I was using npm run build to build my page for production, and when I checked the /build folder the source code wasn't there. But when I hosted the website via IIS or httpd and opened the page in a browser, I found the source code is right there! Does that mean anyone can grab my code and build their own website? Weird.

In your build directory, you'll see some .map files. Those are source maps, and they contain all the data necessary to reconstruct the bundle into the files as you see them on disk, even though everything is bundled for the browser.
Source maps are useful development tools: they allow you to set breakpoints in your original source, and you can even see code from other languages that was transpiled to JavaScript.
You should disable source maps for your production builds.
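How you disable them depends on the build tool. If this is a create-react-app project (the /build output folder suggests it might be; this is an assumption, not something your question confirms), a .env file in the project root is enough:

# .env — create-react-app reads this at build time (assumes CRA)
GENERATE_SOURCEMAP=false

After the next npm run build, the bundle should no longer ship .map files. For a hand-rolled webpack config, the devtool option controls the same thing.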

Related

How to npm init/install/run build properly in a wordpress plugin?

It may be a dumb question, but I feel like I'm floundering around attempting to edit the WordPress plugin that I downloaded from the GitHub repo: https://github.com/WordPress/gutenberg-examples.
Right now, I'm following the block tutorial from https://developer.wordpress.org/block-editor/how-to-guides/block-tutorial/.
So what I did:
download the pre-built plugin as a zip file
upload it to the wordpress site that I created in docker (https://developer.yoast.com/blog/set-up-wordpress-development-environment-in-docker/)
extract the zip and move it to my plugins directory
open, for example, the "01-basic-esnext" folder inside the gutenberg-examples folder and edit block.build.js. The changes I make show up in the block editor in the wordpress post that I created.
However, what I'm confused about is the npm stuff that's mentioned in the "Development" section.
For each of the examples that include an esnext example, the following commands are required to build the plugins:
To install the node packages:
npm install
To build the production version of the plugin:
npm run build
To build a development version, change to the local directory of the block you are working on, and run npm start to watch for changes and automatically rebuild as you develop.
cd 01-basic-esnext/
npm start
Before that, the "Development" section already mentioned building a docker/wordpress environment for the plugin right inside the gutenberg-examples folder.
However, the way I set up my stuff is different. I already have docker running for my wordpress folder, following https://developer.yoast.com/blog/set-up-wordpress-development-environment-in-docker/, and this plugin is already in my plugins directory. I can easily edit the files in Visual Studio Code and see the changes on my local wordpress site.
So should I be doing something to install the npm stuff or leave it alone?
npm is used for these WordPress plugins because the -esnext versions of the examples are built from JavaScript modules. The build process runs on npm and Node.js. That means, if you work on that -esnext code, you're living in a hybrid world: you have PHP and Apache running your development web server, and you have Node.js and npm handling your builds.
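Concretely, each esnext example carries its own package.json whose scripts section wires up those commands. A minimal sketch of what it typically looks like, assuming the examples use the @wordpress/scripts toolchain (the actual file in the example's folder is authoritative):

{
  "scripts": {
    "start": "wp-scripts start",
    "build": "wp-scripts build"
  },
  "devDependencies": {
    "@wordpress/scripts": "*"
  }
}

npm install downloads that toolchain into node_modules; npm start and npm run build then invoke it to produce the block.build.js you've been editing.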
The WordPress team carefully rigged a docker setup to support the process of edit / run for you. So if you use your own docker setup, you won't get the benefit of theirs.
When you have finished your development effort, you can use npm run build to build a .zip file which you can then install in your own WordPress instances using the Upload Plugin button at the top of the Add Plugins page.
Code is poetry, for sure. But development environments are not.

npm doesn't compile changes made on production server

I have a Laravel application deployed on a shared hosting server. I managed to deploy the app and install all the composer/node dependencies, and it all runs with no errors. I'm trying to make some minor changes to one of my components, but for some reason, after npm run dev (and npm run production) everything seems to be compiled with no errors, yet the actual application in the browser does not reflect the changes. I tried clearing all the caches in the app and in the browsers I'm using. I also tried running npm run watch. I replaced files, and I even replaced the whole folder. If I remove something, npm does display an error about the missing files, but the changes are not compiled. I've been googling for 2 hours now, but I cannot find anything useful. Any idea is welcome. Thanks in advance.
Using Laravel Mix's .version() could help with cache busting. Don't forget to use mix('path') instead of asset('path') in your Blade files.
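A minimal sketch, assuming a default Laravel Mix setup (adjust the paths to match your actual webpack.mix.js):

// webpack.mix.js — sketch, paths assumed
const mix = require('laravel-mix');

mix.js('resources/js/app.js', 'public/js')
   .version(); // appends a content hash as a query string to the compiled asset

Then resolve the hashed URL in your Blade layout through the mix() helper:

<script src="{{ mix('js/app.js') }}"></script>

Because the query string changes whenever the compiled file changes, browsers are forced to fetch the fresh asset instead of serving a stale cached copy.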

Netlify: How do you deploy sites that are nested in a folder?

I have a repo that has the backend and frontend (create-react-app) in two separate folders. For the build command, I have something like cd frontend && npm run build and for the publish directory, I have something like frontend/build, but this is not working.
disclaimer: I work for Netlify.
If you were to clone a new copy of your project (no node modules installed, for instance) onto a fresh laptop with nothing on it except node and npm, how would you build it? Imagine Netlify's build process works like that. So you're missing at least an "npm install" step in there :)
Anything else missing, like globally installed npm packages? Need to specify them in package.json so that Netlify's build network knows to grab them for you. Ruby gems? Better have a Gemfile in your repo!
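For example, if your build relies on a CLI you normally install globally (grunt-cli here is just an illustration), declare it in package.json so the build network installs it locally:

"devDependencies": {
  "grunt-cli": "^1.3.2"
}

and call it from an npm script, which resolves binaries from the local node_modules/.bin.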
Netlify tries to npm install (and bundle install) automatically for you, assuming there is a package.json either in the root of your repository (I'm guessing yours is in frontend/?) OR, if you set the "base" parameter, in the base directory where the build starts. This is probably a good pattern for you: set "base" to frontend, and then set your publish directory to build.
You can specify that base parameter in netlify.toml something like this:
[build]
base = "frontend"
Note that netlify.toml must reside in the root of your repository.
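Putting the whole suggestion together, a sketch of a netlify.toml for this layout, following the answer above (the command assumes a stock create-react-app setup, and publish is given relative to the base):

[build]
  base = "frontend"          # run the build from the frontend folder
  command = "npm run build"  # Netlify runs npm install first
  publish = "build"          # create-react-app's output folder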
For more details on how Netlify builds, check out the following articles:
Overview of how our build network works. This article also shows how you can download our build image to test locally.
Settings that affect our build environment. Useful for telling us about what node version to use, for instance.
Some frequently experienced problems
If after some reading and experimenting, you still can't figure things out, ping the helpdesk.
The top answer is correct ^. For anyone looking to simply change the base directory (let's say there is only one npm install/start), you need to change the BASE DIRECTORY, which you will find in the build settings. Simply go to: Site settings -> Build & deploy, and you will see the setting there. Hopefully that helps someone in need of this.

NPM errors and control in Azure Websites

I want to build my Node.js application in an Azure Website.
It uses various npm packages via my package.json file.
My problem is that I often receive error messages related to missing npm packages.
Normally I put my files on the server via FTP, or edit them directly there with the VS Studio 15 Azure plugin. This may be the reason why npm isn't triggering as Microsoft intended.
I would prefer a way in which I can just run commands with elevated privileges, so I have full control over npm myself.
Which ways are possible to avoid these problems?
If you're publishing your Node.js application 'manually' via FTP, there are a few things to keep in mind.
First of all, 'manually' means manually.
Git
If you use continuous deployment via Git, the final deployment step is to call npm install in your current application folder, which installs all the packages listed in the package.json file.
The node_modules folder is excluded by default by the .gitignore file, so all packages are downloaded by the server.
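That exclusion is a single line in .gitignore:

# keep installed dependencies out of the repository
node_modules/

so the server never receives your local copies and always resolves the dependencies itself.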
Web deployment
If you're using web deployment from Visual Studio or the command line, all the files contained in your solution are copied to the hosting environment, including the node_modules folder. Because of this, the deployment can take a long time to finish, due to the huge number of dependencies and files that the folder contains.
Even worse: this scenario could lead you to the very problem you're facing right now.
FTP deployment
You're copying everything yourself, so the same thing that occurs with web deployment happens with the FTP deployment method.
--
The thing is that when you copy all those node_modules folder contents, you're assuming that those dependencies remain the same in the target environment. In most cases that's true, but not always.
Some dependencies are platform dependent, so maybe a dependency works fine on the x86 architecture in your dev environment, but what if your target machine or website (or some mix between them) is x64? (A real case; I've already suffered it.)
Other related issues can happen. Maybe your direct dependencies don't have the problem, but the dependencies linked to them could.
So it is always strongly recommended to run npm install in your target environment, and to avoid copying the dependencies directly from your dev environment.
That means you copy the folder structure to your target environment, excluding the node_modules folder. Then, once the files are copied, you run npm install on the server.
To achieve that you could go to
yoursitename.scm.azurewebsites.net
There you can go to the "Debug Console" tab, navigate to the directory D:\home\site\wwwroot, and run
npm install
After that the packages and dependencies are downloaded for the server/website architecture.
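It can also help to pin the Node.js version the site runs on, so npm install resolves binaries for the right runtime. Azure can pick this up from the engines field of package.json (the version below is only an example; check Microsoft's documentation for the versions your App Service supports):

"engines": {
  "node": "10.x"
}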
Hope this helps.
Azure tweaks the Kudu output settings; in local Kudu installations the output looks normalized.
A workaround (not perfect) could be to turn up npm's log verbosity:
npm install --dd
Or, with even more detail:
npm install --ddd
The most closely related answer from Microsoft itself is this:
Using Node.js Modules with Azure applications
Regarding control via a console with elevated privileges, there is the Kudu console. But the error output is quite weird; it's kind of like putting commands into the console blindly, without much feedback.
Maybe this is a way to go, but I haven't tried it yet.
Regarding deployment, it looks like Azure wants you to prefer continuous deployment.
The suggested way is described here.

Best way to set up a node.js web project in a closed environment

We build a web application and our project uses various npm packages for development, testing and run-time.
The project is built as part of a larger project in TFS, which runs Ant to do the build. Our build.xml first runs npm install, then transpiles and minifies the TypeScript and Sass files (using Grunt tasks), and then builds the final war file.
This all works OK, but our TFS is not allowed to access the Internet during the build, only our local network. Therefore, we keep copies of all the npm libraries we use on a file server in our network, and our package.json dependencies point to paths on that file server.
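For illustration, a dependency entry of that kind looks something like this (the path and version are made up; the exact shape depends on how the file server is mounted):

"dependencies": {
  "grunt": "file:../npm-mirror/grunt-1.0.4.tgz"
}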
Does this seem like a reasonable solution?
The problem we have is that npm install takes about 10 minutes to get all of the >50 packages we use (which include karma, grunt, sass, tslint, etc. – 170MB in total).
We are now looking for ways to reduce the TFS build time. One option is to put node_modules in our source control and skip the npm install step, but it seems wrong to put third-party code in our source control.
I’d love to hear other ideas for handling this that would give us a shorter build time.
Note that on a developer's machine the project builds in no time, since all the packages are already installed, but TFS builds start by getting a clean environment from source control, so nothing is installed.
Tough problem. You could have TFS check if your package.json checksum has changed in order to determine if a "clean" is necessary. You'd still have a 10 minute build whenever package.json is updated, but package.json changes are usually infrequent.
The lines become blurred when you host your own npm libraries, since this is essentially taking a snapshot of only the dependencies you need. Therefore, if you added a dependency, say colors, you'd have to update your npm repo. That could be viewed as updating the node_modules folder on your npm repo. It's a static list of available dependencies, which essentially defeats the purpose of a package.json (unless, of course, other internal apps use the internal npm repo).
BUT, I digress. I'd argue that the best option is to have a package.json checksum that tells TFS whether it should bother rebuilding node_modules.
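A minimal sketch of that check as a shell step (the marker file name and hashing tool are assumed; the TFS build would wrap something like this):

# compare package.json's checksum with the one recorded by the previous build
NEW_SUM=$(md5sum package.json | cut -d' ' -f1)
OLD_SUM=$(cat .package.json.md5 2>/dev/null || true)
if [ "$NEW_SUM" != "$OLD_SUM" ]; then
  npm install                        # dependencies changed: pay for the full install
  echo "$NEW_SUM" > .package.json.md5
else
  echo "package.json unchanged, skipping npm install"
fi

For the skip to pay off, node_modules and the .md5 marker have to survive between builds, e.g. in a workspace on the build agent that isn't wiped by the clean checkout.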
