Building React App inside of Node server best practice - node.js

I have a React repo and a Node repo. They are in the same root directory, like so:
--myReactApp
--myNodeServer
I have a script in the React app that builds it and copies the generated files into the Node server's public directory.
npm run-script build
# Remove the old files from the server's public directory
rm -r ../myNodeServer/public/*
# Copy over the new files
cp -r ./build/. ../myNodeServer/public
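As a side note, the remove-and-copy step can be folded into package.json as a postbuild hook, which npm runs automatically after the build script finishes (a sketch):

"scripts": {
    "build": "react-scripts build",
    "postbuild": "rm -r ../myNodeServer/public/* && cp -r ./build/. ../myNodeServer/public"
}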
This is all good, but now I am doing CI/CD with a Jenkins pipeline, and it pulls the myNodeServer repo, which does not have the React app in its public folder. I don't think it is best practice to commit generated files to myNodeServer's public directory, so I don't want to check the built React app into the Node repo.
What do developers generally do in this situation? Is it possible to pull from two different repos in a pipeline script?
Right now my script looks like this:
pipeline {
    agent any
    tools { nodejs "Node8" }
    environment {
        DBPASSWORD = credentials('DBPASSWORD')
        DEPLOYSECRET = credentials('DEPLOYSECRET')
    }
    stages {
        stage('NPM Install') {
            steps {
                sh 'npm install'
            }
        }
        stage('Unit Test') {
            steps {
                sh 'env PORT=3090 DBUSER=username DBPASSWORD=$DBPASSWORD DEPLOYSecret=$DEPLOYSECRET ENV=unitTest ./node_modules/mocha/bin/_mocha --timeout 10000 --timeout 0 --ui bdd --recursive ./test'
            }
        }
    }
}
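Yes, a pipeline can check out a second repository into a subdirectory and build it there. A minimal sketch of an extra stage (the repository URL and branch are placeholders):

stage('Build React App') {
    steps {
        dir('myReactApp') {
            // hypothetical URL; add credentialsId if the repo is private
            git url: 'https://your-scm.example.com/myReactApp.git', branch: 'master'
            sh 'npm install && npm run build'
        }
        // copy the fresh build into this repo's public directory
        sh 'rm -rf public/* && cp -r myReactApp/build/. public/'
    }
}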

Related

MeteorUp volumes and how Meteor can access their contents

First, thank you for reading my question. This is my first time on Stack Overflow, and I did a lot of research for answers that could help me.
CONTEXT
I'm developing a Meteor App that is used as a CMS: I create content and store the data in MongoDB collections. The goal is to use this data and a React project to build a static website, which is sent to an AWS S3 bucket for hosting.
I'm using MeteorUp to deploy my Meteor App (on an AWS EC2 instance), and according to the MeteorUp documentation (http://meteor-up.com/docs.html#volumes), I added a Docker volume in my mup.js:
module.exports = {
    ...
    meteor: {
        ...
        volumes: {
            '/opt/front': '/front'
        },
        ...
    },
    ...
};
Once deployed, the volume is correctly set in '/opt/myproject/config/start.sh':
sudo docker run \
  -d \
  --restart=always \
  $VOLUME \
  \
  --expose=3000 \
  \
  --hostname="$HOSTNAME-$APPNAME" \
  --env-file=$ENV_FILE \
  \
  --log-opt max-size=100m --log-opt max-file=10 \
  -v /opt/front:/front \
  --memory-reservation 600M \
  \
  --name=$APPNAME \
  $IMAGE
echo "Ran abernix/meteord:node-8.4.0-base"

# When using a private docker registry, the cleanup run in
# Prepare Bundle is only done on one server, so we also
# cleanup here so the other servers don't run out of disk space
if [[ $VOLUME == "" ]]; then
  # The app starts much faster when prepare bundle is enabled,
  # so we do not need to wait as long
  sleep 3s
else
  sleep 15s
fi
On my EC2 instance, '/opt/front' contains the React project used to generate the static website.
This folder includes a package.json file, and all modules are available in the 'node_modules' directory. 'react-scripts' is one of them, and package.json contains the following script line:
"build": "react-scripts build",
React Project
The React app is fed a JSON file located at 'opt/front/src/assets/datas/publish.json'.
This JSON file can be hand-written (so the project can be developed independently) or generated by my Meteor App.
Meteor App
Client-side, on the user interface, we have a 'Publish' button that the administrator can click to generate the static website (using the CMS data) and deploy it to the S3 bucket.
It calls a server-side Meteor method whose action is separated into 3 steps:
1. Collect all useful data and save it into a Publish collection
2. JSON creation
a. Get the Publish collection's first entry into a JavaScript object.
b. Write that object to a JSON file in the React project directory ('opt/front/src/assets/datas/publish.json').
Here's the code:
import fs from 'fs';

let publishDatas = Publish.find({}, { sort: { createdAt: -1 } }).fetch();
let jsonDatasString = JSON.stringify(publishDatas[0]);
fs.writeFile('/front/src/assets/datas/publish.json', jsonDatasString, 'utf8', function (err) {
    if (err) {
        return console.log(err);
    }
});
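One detail worth noting: fs.writeFile is asynchronous, so the method can move on to the build step before the JSON file is flushed to disk. A synchronous write avoids that potential race (a minimal sketch of the same step, not the author's code):

import fs from 'fs';

const publishDatas = Publish.find({}, { sort: { createdAt: -1 } }).fetch();

try {
    // Blocks until the file is on disk, so the build step cannot start too early
    fs.writeFileSync('/front/src/assets/datas/publish.json', JSON.stringify(publishDatas[0]), 'utf8');
} catch (err) {
    console.log(err);
}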
3. Static website build
a. Run a cd command to reach the React project's directory, then run the 'build' script using this code:
process_exec_sync = function (command) {
    // Load Future from fibers
    var Future = Npm.require("fibers/future");
    // Load child_process for exec
    var child = Npm.require("child_process");
    // Create a new future
    var future = new Future();
    // Run the command synchronously
    child.exec(command, { maxBuffer: 1024 * 10000 }, function (error, stdout, stderr) {
        // Return an object to identify error and success
        var result = {};
        // Test for error
        if (error) {
            result.error = error;
        }
        // Return stdout
        result.stdout = stdout;
        future.return(result);
    });
    // Wait for the future
    return future.wait();
}
var build = process_exec_sync('(cd front && npm run build)');
b. If 'build' is OK, I then send the 'front/build' content to my S3 bucket.
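As an aside, on Node versions where child_process.execSync is available (which includes the Node 8 image used here), the same synchronous call can be made without fibers (a sketch, not what the author used):

const child = Npm.require('child_process');

try {
    // Throws on a non-zero exit code; returns the command's stdout
    const stdout = child.execSync('(cd /front && npm run build)', {
        maxBuffer: 1024 * 10000,
        encoding: 'utf8'
    });
    console.log(stdout);
} catch (error) {
    console.log('Build failed:', error.message);
}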
Behaviors:
In the local environment (Meteor running in development mode):
FYI: the React project directory's name and location are slightly different.
It's located inside my Meteor project directory, and instead of 'front' it's named '.#front', because I don't want Meteor to restart every time a file is modified, added, or deleted.
Everything works well, but I'm fully aware that I'm in development mode and benefiting from my local environment.
In the production environment (Meteor running in production mode in a Docker container):
Step 2.b: It works well; I can see the newly generated file in 'opt/front/src/assets/datas/'.
Step 3.a: I get the following error:
"Error running ls: Command failed: (cd /front && npm run build)
(node:39) ExperimentalWarning: The WHATWG Encoding Standard
implementation is an experimental API. It should not yet be used in
production applications.
npm ERR! code ELIFECYCLE npm ERR! errno 1 npm
ERR! front#0.1.0 build: react-scripts build npm ERR! Exit status 1
npm ERR! npm ERR! Failed at the front#0.1.0 build script. npm ERR!
This is probably not a problem with npm. There is likely additional
logging output above.
npm ERR! A complete log of this run can be found in: npm ERR!
/root/.npm/_logs/2021-09-16T13_55_24_043Z-debug.log [exec-fail]"
So here's my question:
In production mode, is it possible to use Meteor to reach another directory and run a script from a package.json?
I've been searching for an answer for months and can't find a similar case.
Am I doing something wrong?
Am I using a wrong approach?
Am I crazy? :D
Thank you so much for reading to the end.
Thank you for your answers!
!!!!! UPDATE !!!!!
I found the solution!
In fact, I had to check a few things on my EC2 instance over SSH:
once connected, I went to '/opt/front/' and tried to build the React app with 'npm run build'
I got a first error because the permissions on that directory were not set to 777 (noob!)
then I got an error because of node-sass.
The reason is that my Docker image uses Node v8, while my EC2 instance uses Node v16.
I had to install NVM and switch to Node v8, then delete my React app's node_modules (and package-lock.json) and reinstall, roughly as the command recap below shows.
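For reference, the fix on the EC2 instance amounted to roughly these commands (a sketch; it assumes NVM is already installed):

cd /opt/front
# match the Node version used inside the Docker container
nvm install 8 && nvm use 8
rm -rf node_modules package-lock.json
npm install
npm run build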
Once that was done, everything worked perfectly!
I now have a Meteor App acting as a CMS / preview website, hosted on an EC2 instance, that can publish a static website to an S3 bucket.
Thank you for reading me!

Jenkins NodeJS Plugin fails during install phase with npm ci

I'm trying to build an npm project (an Angular project) on the Jenkins master (in a Docker container).
Last week I set up the project and everything worked just fine. This morning I made a change on a branch, and the build kept failing during the first phase, which includes a node command invoked via the NodeJS plugin. There's no log on Jenkins, so I can't really tell what's going on.
Here's the stage code:
stages {
    stage('Install') {
        steps {
            script {
                lastRunningStage = "Install"
            }
            nodejs(nodeJSInstallationName: 'node14.16', configId: '320662bf-2907-4c12-87f1-225abaa8d503') {
                sh 'npm ci'
            }
        }
    }
}
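When the install phase fails with no useful log, a first debugging step (a sketch, not a confirmed fix) is to print the tool versions and raise npm's log level inside the same nodejs block:

nodejs(nodeJSInstallationName: 'node14.16', configId: '320662bf-2907-4c12-87f1-225abaa8d503') {
    // Sanity-check which Node/npm the plugin actually provides on this agent
    sh 'node -v && npm -v'
    // Verbose logging surfaces npm ci failures in the console output
    sh 'npm ci --loglevel verbose'
}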

Sharing code between React Native + Node

I am using React Native and Node.js. I want to share code between the two. My folder structure is like so:
myreactnativeapp/
mynodeserver/
myshared/
In the React Native and Node apps, I have included the following in package.json:
"dependencies": {
    "myshared": "git+https://myrepository/ugoshared.git"
}
The module can then be included in each project via require/import, etc. This all works fine, and for production I'm happy with it. (Though I'd love to know a better way?)
The issue I'm facing is that in development it's really slow.
The steps for a change to propagate are:
Make changes in Shared
Commit Changes to git
Update the npm module
In development, I really want the same codebase to be used rather than this long update process. I tried the following:
Adding a symlink at node_modules/shared - doesn't work with the React Native package manager
Using relative paths (../../../shared) - doesn't work with the React Native package manager either
Any other ideas?
Update 1
I created a shell script that I run to copy the files into a local directory before the package manager starts. It's not ideal, but at least I only have to restart the packager instead of messing with git, etc.
#myreactnativeapp/start.sh
SOURCE=../myshared
MODULE=myshared
rm -rf ./$MODULE
mkdir ./$MODULE
find $SOURCE -maxdepth 1 -name \*.js -exec cp -v {} "./$MODULE/" \;
# create the package.json
echo '{ "name": "'$MODULE'" }' > ./$MODULE/package.json
# start the packager
node node_modules/react-native/local-cli/cli.js start
Then in my package.json I updated the start script:
"scripts": {
    "start": "./start.sh"
}
So the process is now:
Make a change
Start/Restart the packager
Automatic:
Script copies all .js files under myshared/ -> myreactnativeapp/myshared/
Script creates a package.json with the name of the module
Because I've added a package.json with the module's name to the copied files, I can include items in my project the same way I would if the module were installed via the package manager above. In theory, when I switch to using the package in production, I won't have to change anything.
import MyModule from 'myshared/MyModule'
Update 2
Restarting the packager all the time made my first idea tiresome. Instead, I created a small Node script in the shared directory to watch for changes. Whenever a file changes, it is copied to the React Native working directory.
var watch = require('node-watch')
var fs = require('fs')
var path = require('path')

let targetPath = '../reactnativeapp/myshared/'

watch('.', { recursive: false, filter: /\.js$/ }, function(evt, name) {
    console.log('File changed: ' + name + path.basename(__filename))
    // don't copy this file
    if (path.basename(__filename) === name) {
        return
    }
    console.log(`Copying file: ${name} --> ${targetPath + name}`);
    fs.copyFile(name, targetPath + name, err => {
        if (err) {
            console.log('Error:', err)
            return;
        }
        console.log('Success');
    })
});

console.log(`Starting to watch: ${__dirname}. All files to be copied to: ${targetPath}`)
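Assuming the watcher above is saved as watch.js inside myshared/ (the filename is hypothetical), it runs alongside the packager with:

cd myshared && node watch.js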

How to execute Node.js API test cases using a Jenkinsfile

I am new to Jenkins. I have a small Node.js server, and the test cases are written using Mocha (integration test cases, not unit test cases). I am trying to create a CI pipeline for this using Jenkins. My Jenkinsfile looks as follows:
#!/usr/bin/env groovy
pipeline {
    agent {
        docker {
            image 'node'
            args '-u root'
        }
    }
    stages {
        stage('Build') {
            steps {
                echo 'Installing Dependencies...'
                sh 'npm install'
            }
        }
        stage('Run') {
            steps {
                echo 'Starting application...'
                sh 'npm start'
            }
        }
        stage('Test') {
            steps {
                echo 'Testing...'
                sh 'npm test'
            }
        }
    }
}
In the Run stage, the server is started using the command node server.js. Once the server is up, I want the test cases to be executed against it. But I notice that Jenkins never executes the Test stage, since the server keeps running (which is what I want) and the stage never exits.
How can I have the server started and also have the test stage run against this server?
You should run the tests before running the server. The tests should not depend on a running server; they should require whatever they need and run on their own. Then you can start the server.
https://github.com/jenkinsci/pipeline-examples/tree/master/jenkinsfile-examples/nodejs-build-test-deploy-docker-notify
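That said, since these are integration tests that do need a live server, one common workaround (a sketch; the sleep is a crude stand-in for a proper health check) is to start the server in the background so the Run stage can complete:

stage('Run') {
    steps {
        echo 'Starting application...'
        // 'dontKillMe' tells Jenkins' ProcessTreeKiller to leave the background server alone
        sh 'JENKINS_NODE_COOKIE=dontKillMe nohup npm start > server.log 2>&1 &'
        sh 'sleep 5' // wait for the server to come up before testing
    }
}
stage('Test') {
    steps {
        echo 'Testing...'
        sh 'npm test'
    }
}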
I resolved this by creating separate build jobs and linking them together. In the run job, I change to the build folder using the cd command and start the server. In the test job, I do the same, but execute the test cases against the server started by the run job.
Thank you everyone for your inputs.

Writing a Jenkins Pipeline Shared Library to publish to Nexus NPM repository

I used to publish my NPM projects to Nexus using a DSL pipeline containing a publish stage with this kind of step:
stage('Publish') {
    nodejs(nodeJSInstallationName: 'Node LTS', configId: '123456ab-1234-abcd-1234-f123d45e6789') {
        sh 'npm publish'
    }
}
I have a NodeJS installation named "Node LTS" on my Jenkins and an npmrc config file with this configId.
Now I want to extract this stage into a Groovy shared library.
According to the Declarative Pipeline documentation and this nodejs-plugin issue, I could write this:
stage('Publish') {
    tools {
        nodejs 'Node LTS'
    }
    steps {
        sh 'npm publish'
    }
}
But this does not set the authentication configuration that is currently in my npmrc configuration file:
registry=http://my-nexus/repository/npm-private/
_auth="some=base=64=credential=="
always-auth=true
Any idea how to retrieve this configuration with declarative syntax and prevent this error message?
npm ERR! code ENEEDAUTH
npm ERR! need auth auth required for publishing
npm ERR! need auth You need to authorize this machine using `npm adduser`
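One option that keeps the npmrc config file from the original DSL step is to call the nodejs block inside steps, which declarative pipeline accepts (a sketch reusing the installation name and configId from above; whether it resolves ENEEDAUTH depends on the auth settings in that config file):

stage('Publish') {
    steps {
        nodejs(nodeJSInstallationName: 'Node LTS', configId: '123456ab-1234-abcd-1234-f123d45e6789') {
            sh 'npm publish'
        }
    }
}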
Taking a look at the npm log files and reading the documentation, I finally found that the best solution was to specify the following publish configuration in my package.json file:
{
    "name": "@my-company/my-project",
    ...
    "publishConfig": {
        "registry": "http://my-nexus/repository/npm-private/"
    },
    ...
}
I left the .npmrc configuration as-is:
registry=http://my-nexus/repository/npm-private/
_auth="some=base=64=credential=="
always-auth=true
Note: always-auth is needed, in my case, for automation scripts: https://docs.npmjs.com/misc/config
I struggled to publish a node package to Nexus 3 from a Jenkins pipeline, and here is what worked for me. It might help someone.
pipeline {
    agent any
    environment {
        registryCredentials = "nexus"
        registryPrivate = "http://nexus:8081/repository/your-nexus-repo/" // nexus repository
    }
    stages {
        stage('Publish') {
            steps {
                script {
                    nodejs('your-jenkins-nodejs-name') {
                        sh("rm ~/.npmrc || echo 'trying to remove .npmrc'") // remove any stale .npmrc
                        // this token is copied from the ~/.npmrc file after an interactive npm login
                        // do an npm login to your Nexus npm hosted private repo and grab the token
                        sh 'echo "//nexus:8081/repository/vinsystems-npm/:_authToken=NpmToken.302af6fb-9ad4-38cf-bb71-57133295c7ca" >> ~/.npmrc'
                        sh("cd ./WebClientWorkspace && yarn install")
                        sh("cd ..")
                        sh("yarn publish ./path/to/your/js-library --registry=${registryPrivate} --non-interactive --verbose")
                    }
                }
            }
        }
    }
}
