Can webpack-dev-server create files in my project root?

I have a project set up and running with Webpack 5.28.0 and webpack-dev-server 4.11.1.
It's all working nicely, but I would like the dev server to write some files back to my project root. These are debug/log files that I'd like to save as JSON.
I'd also like this to be automatic; I don't want to have to click anything or trigger the action manually.
So the ideal flow would be: I run npm start, my build kicks off in a browser, the page generates a load of log data, and this is then written back to my project root, either using some browser function or by calling back to a Node script in my build.
Is this possible with dev-server?

You could set up the dev-server middleware to add an API endpoint that accepts data and writes it to your filesystem:
// webpack.config.js
const { writeFile } = require("node:fs/promises");
const bodyParser = require("body-parser");

module.exports = {
  // ...
  devServer: {
    setupMiddlewares: (middlewares, devServer) => {
      devServer.app?.post(
        "/__log",
        bodyParser.json(),
        async (req, res, next) => {
          try {
            await writeFile(
              "debug-log.json",
              JSON.stringify(req.body, null, 2)
            );
            res.sendStatus(202);
          } catch (err) {
            next(err);
          }
        }
      );
      return middlewares;
    },
  },
};
Then your front-end app only needs to construct the payload and POST it to the dev server:
const debugData = { /* ... */ };

fetch("/__log", {
  method: "POST",
  body: JSON.stringify(debugData),
  headers: { "content-type": "application/json" },
});
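Note that writeFile("debug-log.json", ...) resolves the path against the directory the dev server was started from. If you want the file pinned to the project root regardless of where npm start runs, you could resolve it against the config file's location instead (a small sketch; the extra path require is the only addition):

const path = require("node:path");

// inside the handler: write next to webpack.config.js rather than the CWD
await writeFile(
  path.join(__dirname, "debug-log.json"),
  JSON.stringify(req.body, null, 2)
);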

Related

How to expose static directories stored in AWS Lambda code over API Gateway?

I am trying to serve a static React app bundle using AWS Lambda only.
I have used a NodejsFunction and applied commandHooks during bundling to bundle a React build along with my code, as shown below. I have also attached it to an API Gateway, as you can see.
private uiLmbda = new NodejsFunction(this, "renderer", {
  entry: path.join(__dirname, "..", "handlers", "ui-handler.ts"),
  handler: "handler",
  bundling: {
    commandHooks: {
      beforeBundling(inputDir: string, outputDir: string) {
        const staticAssets = path.join(__dirname, "..", "build");
        const relativePath = path.relative(inputDir, staticAssets);
        return [`cp -r ${relativePath} ${outputDir}`];
      },
      afterBundling(inputDir: string, outputDir: string) {
        return [];
      },
      beforeInstall() {
        return [];
      },
    },
  },
});

private scanAPI = new RestApi(this, "scan-api");

private uiGatewayIntegration: LambdaIntegration = new LambdaIntegration(
  this.uiLmbda
);
And in the constructor I am calling this:
this.scanAPI.root.addMethod("GET", this.uiGatewayIntegration, {});
Now, I have an index.js as my Lambda handler and a build folder with index.html and the other referenced static files.
The handler code is as follows:
import * as fs from "fs";
import * as path from "path";

export const handler = async (event: any) => {
  try {
    console.log(path.resolve("./build/index.html"));
    return {
      statusCode: 200,
      headers: {
        "content-type": "text/html",
      },
      body: fs
        .readFileSync(path.join(__dirname, "build", "index.html"))
        .toString("utf-8"),
      isBase64Encoded: false,
    };
  } catch (error) {
    console.log(error);
  }
};
So I am able to send the HTML using the above handler, but not the relative files: it seems API Gateway is not aware that I wish to serve subdirectories from the GET method. Any help on how I can do that? For example, main.js (referenced from the HTML as <script src='/task/var/build/main'></script>) gets a 403 from API Gateway.
Okay, it turns out I needed to handle this case in two places: API Gateway and the Lambda handler.
I changed the Lambda to serve the static files using express.static, and in API Gateway I enabled proxy integration, and it worked.
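A minimal sketch of the two pieces, assuming serverless-http is bundled to adapt the Express app to the Lambda event format and that the proxy is enabled in CDK with addProxy (the answer above doesn't show its exact code, so the names here are illustrative):

// handlers/ui-handler.ts -- serve the bundled build folder with express.static
import * as path from "path";
import express from "express";
import serverless from "serverless-http";

const app = express();
// everything under /build (index.html, main.js, ...) becomes reachable
app.use(express.static(path.join(__dirname, "build")));

export const handler = serverless(app);

// in the CDK stack: forward every sub-path (e.g. /main.js) to the same Lambda
this.scanAPI.root.addProxy({
  defaultIntegration: this.uiGatewayIntegration,
  anyMethod: true,
});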

How to dynamically add CORS sites to Google Cloud App Engine Node API

I am new to API deployment.
I have a Node Express API that has CORS enabled in the root app.js, for the API and a socket.io implementation:
var app = express();
app.use(cors({
  origin: ["http://localhost:8080", "http://localhost:8081"],
  credentials: true
}))
and
const httpServer = createServer(app);
const io = new Server(httpServer, {
  cors: {
    origin: ["http://localhost:8080", "http://localhost:8081"],
    credentials: true,
    methods: ["GET"]
  }
});
I will set up a sales website that allows a customer to pay for a license to use the API with their site, i.e. https://www.customersite.com
My question is: how can I dynamically add a customer's website (say, after they submit a form from another site) to the CORS list? Ideally it would be via an API call. The only option I can think of (which is not automated) is to manually maintain a global js file (i.e. config.js) with the CORS list from within the Google platform using the file explorer/editor, and to iterate over it as an array, similar to process.env.customerList. This will not work for me, as I need this step to happen automatically.
Any and all suggestions are appreciated.
Solution: use a process manager like pm2 to 'reload' the API gracefully with close to no downtime.
PM2 reloads can be triggered programmatically. I made a PUT endpoint for modifying the CORS list, /cors/modify, that sent a programmatic pm2 message whenever a modification succeeded.
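A sketch of what that endpoint could look like (the names are hypothetical; it assumes the allowed-origin list is persisted in a JSON file that app.js reads at startup):

// hypothetical PUT /cors/modify handler
const fs = require("fs/promises");

app.put("/cors/modify", async (req, res) => {
  // persist the new origin so the next boot picks it up
  const list = JSON.parse(await fs.readFile("cors-list.json", "utf8"));
  list.push(req.body.origin);
  await fs.writeFile("cors-list.json", JSON.stringify(list, null, 2));
  res.sendStatus(200);
  // then trigger the graceful pm2 reload shown below
});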
Note: on Windows you must use programmatic messaging:
pm2.list(function(err, list) {
  if (err) {
    console.log(err);
    pm2.disconnect(); // Disconnects from PM2
    return;
  }
  pm2.sendDataToProcessId(
    list[0].pm2_env.pm_id,
    {
      type: 'process:msg',
      data: {
        msg: 'shutdown'
      },
      topic: true
    },
    function(err, res) {
      console.log(err);
      pm2.disconnect(); // Disconnects from PM2
    }
  );
});
which can then be caught with:
process.on('message', async function(msg) {
  if (msg == "shutdown" || msg.data.msg == 'shutdown') {
    console.log("Disconnecting from DB...");
    mongoose.disconnect((e) => {
      if (e) {
        process.exit(1);
      } else {
        console.log("Mongoose connection removed");
        httpServer.close((err) => {
          if (err) {
            console.error(err);
            process.exit(1);
          }
          process.exit(0);
        });
      }
    });
  }
});
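On platforms where signal-based reloads work, you can skip the custom messaging and ask pm2 for a graceful zero-downtime restart directly (a sketch, assuming the pm2 process is named "api"):

const pm2 = require("pm2");

pm2.connect(function(err) {
  if (err) {
    console.error(err);
    process.exit(2);
  }
  // gracefully restart the named process
  pm2.reload("api", function(err) {
    if (err) console.error(err);
    pm2.disconnect();
  });
});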

Angular and Nodejs basic file download

Could someone show me an example of a basic user file download using Node and Angular, please? I understand it like this, but it is not working:
Nodejs:
// Does it matter that it is a post and not a get?
app.post('/some-api', someData, (request, response) => {
  response.download('file/path/mytext.txt');
});
Angular 2+:
this.httpclient.post<any>('.../some-api', {...data...}).subscribe(response => {
  console.log(response);
  // This throws an error, but even if it doesn't,
  // how do I download the Node.js `response.download(...)`?
});
Here are possible answers, but they are quite complex; could someone give me a super basic example (basically what I have here, but a working version)? The easiest solution, please.
How do I download a file with Angular2
Angular download node.js response.sendFile
There you go:
Node.js Server:
const express = require("express");
const router = express.Router();
router.post("/experiment/resultML/downloadReport",downloadReport);
const downloadReport = function(req, res) {
res.sendFile(req.body.filename);
};
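For completeness, a minimal bootstrap that mounts this router (an assumption, since the answer doesn't show its server setup):

const app = express();
app.use(express.json()); // so req.body.filename is populated
app.use("/", router);
app.listen(3000);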
Component Angular:
import { saveAs } from "file-saver"
download() {
let filename = "/Path/to/your/report.pdf";
this.api.downloadReport(filename).subscribe(
data => {
saveAs(data, filename);
},
err => {
alert("Problem while downloading the file.");
console.error(err);
}
);
}
Service Angular:
public downloadReport(file): Observable<any> {
  // Create url
  let url = `${baseUrl}${"/experiment/resultML/downloadReport"}`;
  var body = { filename: file };
  return this.http.post(url, body, {
    responseType: "blob",
    headers: new HttpHeaders().append("Content-Type", "application/json")
  });
}

Load custom configuration at runtime

I have a Nuxt application in which I need to append data from a generated configuration file when the application is first started. The reason I cannot do this in the actual build is that the configuration file does not exist at that point; it is generated just before calling npm start by a bootstrap script.
Why don't I generate the configuration file before starting the application, you may ask? Because the application runs in a Docker container, and the built image cannot include environment-specific configuration files, since it should be usable in different environments such as testing, staging and production.
Currently I am trying to use a hook to solve this, but I am not really sure how to actually set the configuration data in the application so it can be used everywhere:
// part of nuxt.config.js
hooks: {
  listen(server, listener) {
    // load the custom configuration file.
    fs.readFile('./config.json', (err, data) => {
      let configData = JSON.parse(data);
    });
  }
},
The above hook is fired when the application first starts listening for client connections. Not sure this is the best, or even a workable, way to go.
I also made an attempt of using a plugin to solve this:
import axios from 'axios';

export default function (ctx, inject) {
  // server-side logic
  if (ctx.isServer) {
    // here I would like to simply use fs.readFile to load the configuration,
    // but this is not working
  } else {
    // client-side logic
    axios.get('/config.json')
      .then((res) => {
        inject('storeViews', res.data);
      });
  }
};
In the above code I have problems both with using the fs module and with axios.
I was also thinking about using a middleware to do this, but not sure on how to proceed.
If someone else has this kind of problem, here is the solution I came up with in the end:
// plugins/config.js
class Settings {
  constructor (app, req) {
    if (process.server) {
      // Server side we load the file simply by using fs
      const fs = require('fs');
      this.json = fs.readFileSync('config.json');
    } else {
      // Client side we make a request to the server
      fetch('/config')
        .then((response) => {
          if (response.ok) {
            return response.json();
          }
        })
        .then((json) => {
          this.json = json;
        });
    }
  }
}

export default function ({ req, app }, inject) {
  inject('config', new Settings(app, req));
};
For this to work we need to use a server middleware:
// api/config.js
const fs = require('fs');
const express = require('express');
const app = express();

// Here we pick up requests to /config and read and return the
// contents of the configuration file
app.get('/', (req, res) => {
  fs.readFile('config.json', (err, contents) => {
    if (err) {
      throw err;
    }
    res.set('Content-Type', 'application/json');
    res.end(contents);
  });
});

module.exports = {
  path: '/config',
  handler: app
};
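Both pieces then need to be registered in nuxt.config.js; a sketch of what that could look like (the paths are assumptions based on the file headers above):

// nuxt.config.js
export default {
  // ...
  plugins: ['~/plugins/config.js'],
  serverMiddleware: ['~/api/config.js'],
};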

How can I use body-parser with LoopBack?

I see that LoopBack has the Express 3.x middleware built in. Indeed, body-parser is in loopback/node_modules. But I cannot figure out how to use it as middleware. I have never worked with Express 3.x, so maybe it's just that. require does not work, obviously, unless I install body-parser as a dependency in my project.
What should I do in server.js to use body-parser so that web forms are parsed into req.params? That's what it does, right?
After hours of frustration, I just added it to middleware.json like so:
"parse": {
"body-parser#json": {},
"body-parser#urlencoded": {"params": { "extended": true }}
}
It is installed as a dependency. Now I have form data in req.body in my routes. My server/boot/routes.js looks like this:
module.exports = function(app) {
  app.post('/mailing_list', function(req, res) {
    console.log(req.body.email);
    res.send({"status": 1, "message": "Successfully added to mailing list."});
  });
}
Just to be more clear about what it takes to get this working (because I still struggled for a while after finding this answer!), here are the steps I took:
As described above, in $APP_HOME/server/middleware.json, add the body-parser to the "parse" section:
{
  "initial:before": {
    "loopback#favicon": {}
  },
  "initial": {
    "compression": {},
    "cors": {
      "params": {
        "origin": true,
        "credentials": true,
        "maxAge": 86400
      }
    }
  },
  "session": {},
  "auth": {},
  "parse": {
    "body-parser#json": {},
    "body-parser#urlencoded": {"params": { "extended": true }}
  },
  "routes": {},
  "files": {},
  "final": {
    "loopback#urlNotFound": {}
  },
  "final:after": {
    "errorhandler": {}
  }
}
Next, I added the parser setup to $APP_HOME/server/server.js:
var loopback = require('loopback');
var bodyParser = require('body-parser');
var multer = require('multer');
var boot = require('loopback-boot');

var app = module.exports = loopback();

app.use(bodyParser.json()); // for parsing application/json
app.use(bodyParser.urlencoded({ extended: true })); // for parsing application/x-www-form-urlencoded
app.use(multer()); // for parsing multipart/form-data

app.start = function() {
  // ... (rest of the generated server.js continues unchanged)
Then, since I didn't want to mess with custom routes, I added the following to $APP_HOME/common/models/model.js:
module.exports = function(Model) {
  Model.incoming = function(req, cb) {
    cb(null, 'Hey there, ' + req.body.sender);
  }
  Model.remoteMethod(
    'incoming',
    {
      accepts: [
        {
          arg: 'req', type: 'object', http: function(ctx) {
            return ctx.req;
          }
        }],
      returns: {arg: 'summary', type: 'string'}
    }
  );
};
I can now run my app with $> slc run .
When I post to the endpoint, it now gets parsed properly, and all is well with the world. I hope this helps someone else!
I'm using loopback 2.14.0:
To make use of the body-parser in your custom boot script routes you should only need to:
1) install body-parser
npm install body-parser --save
2) Register the module in middleware.json
"parse": {
"body-parser#json": {},
"body-parser#urlencoded": {"params": { "extended": true }}
},
There is no need to require the parser setup in server.js; loopback does this for you when you register the middleware.
Please note body-parser is now installed in your source "node_modules" directory as well as in the loopback modules directory.
If at all possible, try to register custom remote methods as described in the loopback documentation.
Registering routes this way gives you access to loopback's body-parser out of the box and is the 'cleanest' implementation.
Based on this answer https://stackoverflow.com/a/29813184/605586 from Ben Carlson you have to
npm install --save body-parser multer
then in your server.js require the modules:
var bodyParser = require('body-parser');
var multer = require('multer');
and use them before app.start:
app.use(bodyParser.json()); // for parsing application/json
app.use(bodyParser.urlencoded({ extended: true })); // for parsing application/x-www-form-urlencoded
app.use(multer().any()); // for parsing multipart/form-data
Then you can create a remote method:
App.incoming = function (req, cb) {
  console.log(req);
  // the files are available as req.files.
  // the body fields are available in req.body
  cb(null, 'Hey there, ' + req.body.sender);
}

App.remoteMethod(
  'incoming',
  {
    accepts: [
      {
        arg: 'req', type: 'object', http: function (ctx) {
          return ctx.req;
        }
      }],
    returns: { arg: 'summary', type: 'string' }
  }
);
Using this you can upload files and additional data fields to loopback with multipart/form-data.
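A hypothetical client-side call to exercise this (it assumes the method is exposed on a model named App at LoopBack's default POST /api/Apps/incoming route; adjust the path to your model name and REST root):

// build a multipart/form-data request; the browser sets the boundary header
const form = new FormData();
form.append("sender", "Alice");
form.append("file", fileInput.files[0]);

fetch("/api/Apps/incoming", { method: "POST", body: form })
  .then((res) => res.json())
  .then((json) => console.log(json.summary));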
I'm posting this just for informational purposes. I ran into this same issue and found this works as well. You can add a file in the server/boot/ directory with the following:
var bodyParser = require('body-parser');

module.exports = function(app) {
  app.use(bodyParser.urlencoded({ extended: true }));
}
Of course, you have to install the package by running:
npm install --save body-parser
That will save the package under the node_modules directory.
If you want it to be the first thing to run, you can start the file name with a "0" since these are loaded in alphabetical order.
That being said, I figure it is more 'correct' and elegant to use the middleware configuration approach mentioned above than this one, but I share it in the event someone else finds it useful.
In Loopback ^3.22.0, it is sufficient to add
"parse": {
"body-parser#json": {}
},
to server/middleware.json in order to consume application/json POST bodies in server/boot/routes.js:
module.exports = function(app) {
  app.post('/api/sayhello', function(req, res, next) {
    console.log(req.body);
    // ...
  });
};
One could also use the built-in parser of the Express framework inside loopback, for example for JSON parsing:
app.use(app.loopback.json());
I have a different test result.
1) For the json and urlencoded types, there is NO need to add their parsers in middleware.json. I can get data from req.body successfully without adding body-parser#json and body-parser#urlencoded; Loopback already supports them.
Loopback-related source code (I think):
1. In the strong-remoting repo, rest-adapter.js, there is a body parser for json and urlencoded:
line 35:
var json = bodyParser.json;
var urlencoded = bodyParser.urlencoded;
line 315:
root.use(urlencoded(urlencodedOptions));
root.use(json(jsonOptions));
2. In remote-object.js:
line 33:
require('./rest-adapter');
line 97:
RemoteObjects.prototype.handler = function(nameOrClass, options) {
  var Adapter = this.adapter(nameOrClass);
  var adapter = new Adapter(this, options);
  var handler = adapter.createHandler();
  if (handler) {
    // allow adapter reference from handler
    handler.adapter = adapter;
  }
  return handler;
};
2) For the raw type, we can add body-parser#raw in the "parse" section of middleware.json; of course, it needs npm install body-parser.
My test code:
1. My readable stream is from the file uploadRaw.txt; the content is:
GreenTeaGreenTeaGreenTeaGreenTeaGreenTeaGreenTeaGreenTeaGreenTeaGreenTeaGreenTeaGreenTeaGreenTeaGreenTeaGreenTeaGreenTeaGreenTeaEeeeend
2. middleware.json:
"parse": {
  "body-parser#raw": {
    "paths": [
      "/api/v1/Buckets/?/upload"
    ]
  }
},
3.
it('application/octet-stream -- upload non-form', () =>
  new Promise((resolve) => {
    const options = {
      method: 'POST',
      host: testConfig.server.host,
      port: testConfig.server.port,
      path: `${appconfig.restApiRoot}/Buckets/${TEST_CONTAINER}/upload`,
      headers: {
        'Content-Type': 'application/octet-stream',
      },
    };
    const request = http.request(options);
    request.on('error', (e) => {
      logger.debug(`problem with request: ${e.message}`);
    });
    const readStream = fs.createReadStream('tests/resources/uploadRaw.txt');
    readStream.pipe(request);
    resolve();
  }));
4.
Bucket.upload = (req, res, options, cb) => {
  logger.debug('sssssss in uploadFileToContainer');
  fs.writeFile('/Users/caiyufei/TEA/green.txt', req.body, (err) => {
    if (err) {
      logger.debug('oh, failed to write file');
      return;
    }
    logger.debug('green file is saved!');
  });
};
OR
Bucket.upload = (req, res, options, cb) => {
  logger.debug('sssssss in uploadFileToContainer');
  const writeStream = fs.createWriteStream('/Users/caiyufei/TEA/green.txt');
  const streamOptions = {
    highWaterMark: 16384,
    encoding: null,
  };
  streamifier.createReadStream(Buffer.from(req.body), streamOptions).pipe(writeStream);
};
5. package.json
"body-parser": "^1.17.1",
"streamifier": "^0.1.1",
