Error with multipart/form-data in express.js running on Azure - node.js

So I've got an express site running on Windows Azure. I'm currently having problems submitting forms that are marked as enctype="multipart/form-data".
The error I'm getting in the logs is: TypeError: Object #<Object> has no method 'tmpDir'
When running natively (initiated via node.exe) it works absolutely fine; it's only when using the Azure emulator or on Azure itself that it fails.
Now I expect this has something to do with Azure's infrastructure, but I'm wondering whether anyone has managed to work around this?

So, a multi-pronged issue here. I'll explain my findings as best I can, so please bear with me.
Connect uses node-formidable for its multipart form parsing, specifically the IncomingForm class. The constructor of IncomingForm sets the upload directory either to the value you pass in or, by default, to the operating system's temp directory as returned by os.tmpDir(). However, that method is missing from the Windows implementation of node's "os" module.
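For illustration, the behaviour described above boils down to something like this (a simplified sketch, not formidable's actual source):

var os = require('os');

// Simplified sketch of the constructor default described above:
// the upload directory falls back to os.tmpDir(), which the Windows build of node lacks here.
function IncomingForm(opts) {
  opts = opts || {};
  this.uploadDir = opts.uploadDir || os.tmpDir(); // fails with "has no method 'tmpDir'" where os.tmpDir is missing
}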
After reading copious posts and threads, I discovered that you should be able to get around this by setting the uploadDir property of the bodyParser:
app.use(express.bodyParser({ uploadDir: 'path/to/dir' }));
However, there is (at the time of writing) a bug in connect's multipart form processing: it creates an IncomingForm object without passing any parameters into the constructor, and only sets the properties further down:
var form = new formidable.IncomingForm
  , data = {}
  , files = {}
  , done;

Object.keys(options).forEach(function(key){
  form[key] = options[key];
});
So I've forked both express & connect and updated the code to read:
var form = new formidable.IncomingForm(options)
  , data = {}
  , files = {}
  , done;

Object.keys(options).forEach(function(key){
  form[key] = options[key];
});
You can find the forked versions here: not a shameless plug

Fix for Windows Environments (Azure web sites + node.js application).
server.js:
Make sure it does not set an upload dir or tmp dir:
app.use(express.bodyParser());
package.json:
Force node 0.10.21 or above:
"engines": { "node": "v0.10.24" }
Force express 3.4.8 or above:
"express": "3.4.8"
This should move you to the fixed library versions, and the problem should be gone.
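Put together, the relevant part of package.json would look something like this (a minimal sketch; the rest of your existing package.json stays as it is):

{
  "engines": { "node": "v0.10.24" },
  "dependencies": {
    "express": "3.4.8"
  }
}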

Related

Detect whether an npm package can run on browser, node or both

I'm building a Next.js application which uses a set of abstractions to let code from both React (component logic etc.) and node (API logic) use the same types and classes. The motive here is to make the development process feel seamless across client side and server side.
For example, a call to the User.create method of the User class will behave differently based on the runtime environment (i.e. browser or node): in the browser it will call an API with a POST request, and on the server it will persist data to a database.
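As a rough sketch of that pattern (the class, collection, and endpoint names here are illustrative only, not the actual code):

// Rough sketch of the pattern described above; names are illustrative only.
class User {
  static async create(data) {
    if (typeof window !== 'undefined') {
      // Browser: hit the API route with a POST request.
      const res = await fetch('/api/users', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify(data),
      });
      return res.json();
    }
    // Server: persist directly, e.g. through a server-only module that wraps firebase-admin.
    const { db } = require('./server/db'); // hypothetical wrapper around firebase-admin
    const ref = await db.collection('users').add(data);
    return { id: ref.id, ...data };
  }
}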
So far this pattern has worked just fine, and I like how it all turned out in terms of code structure. However, for this to work, I have to import the modules responsible for server and browser work into a single module (which in this case is the User class), and this leads to a critical error when trying to resolve dependencies in each module.
For example, I'm using firebase-admin on the server to persist data to Firebase Firestore, but it uses fs, which cannot be resolved when the same code runs in the browser.
As a workaround, I set the resolve alias of firebase-admin to false inside the webpack configuration when the bundle targets the browser; see the code below.
/** next.config.js **/
module.exports = {
  webpack: (config, { isServer }) => {
    if (!isServer) {
      // set alias of node specific modules to false
      // eg: service dependencies
      config.resolve.alias = {
        ...config.resolve.alias,
        'firebase-admin': false,
      }
    } else {
      // set alias of browser only modules to false.
      config.resolve.alias = {
        ...config.resolve.alias,
      }
    }
    return config
  },
}
While this does the trick, it won't be long before it gets really tedious to include all such dependencies in the resolve aliases.
So, my approach is to write a script that runs prior to npm run dev (or manually), reads all the dependencies in package.json, and SOMEHOW identifies packages that will not run in a specific runtime environment, then adds them to the webpack config. To do this there needs to be a way to identify this about each dependency, which I don't think is something that comes out of the box from npm or the package itself.
Any suggestion on how this can be achieved is really appreciated. Thanks
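One hedged sketch of the kind of pre-build script described above: it reads the dependencies from package.json and uses each package's own manifest as a heuristic (a "browser" field usually signals browser support, while an "engines.node" entry with no "browser" field suggests a server-only package). This heuristic is an assumption, not an official npm feature:

// Hypothetical pre-build script (e.g. run before `npm run dev`); the heuristic is an assumption,
// not something npm exposes out of the box.
const fs = require('fs');
const path = require('path');

const pkg = JSON.parse(fs.readFileSync('package.json', 'utf8'));
const nodeOnly = [];

for (const dep of Object.keys(pkg.dependencies || {})) {
  const manifestPath = path.join('node_modules', dep, 'package.json');
  if (!fs.existsSync(manifestPath)) continue;
  const manifest = JSON.parse(fs.readFileSync(manifestPath, 'utf8'));
  // Heuristic: no "browser" field but an "engines.node" entry suggests a server-only package.
  if (!manifest.browser && manifest.engines && manifest.engines.node) {
    nodeOnly.push(dep);
  }
}

// Write the list out so next.config.js can turn it into resolve.alias entries.
fs.writeFileSync('node-only-deps.json', JSON.stringify(nodeOnly, null, 2));
console.log('Probable node-only dependencies:', nodeOnly);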

Electron Node.js node localstorage osx mkdir permission denied

I am working with Electron and Node.js. We have developed an application that works fine on Windows, and as a requirement had to package it for macOS. I packaged the application using electron-packager; the packaging process completes and a package is generated. Double-clicking it throws a "permission denied" error for mkdir, as I am using node-localstorage to maintain some settings on the user's local machine. Somehow macOS doesn't allow local storage to create a folder in the root of the application. Any help in this matter will be great. Thanks
First off, is the code in question in the main process or in a renderer process? If it is the latter, you don't need to use 'node-localstorage', because you can use the renderer's native LocalStorage. If you are in the main process, then you need to provide your own storage strategy so using 'node-localstorage' is a viable option.
In any case, you need to carefully consider where to store the data; for starters, let's look at where Electron's renderer processes would store its LocalStorage data: this differs based on the OS, but you can get and set the paths using the app module -- the path in question is userData, which on OS X would default to ~/Library/Application Support/<App Name>. Electron uses that folder to persist cookies, caches, LocalStorage etc. so I would suggest using that folder as well. (Otherwise, refer to XDG defaults for good defaults)
What your example above was trying to do is store your 'errorLogDb' in the current working directory, which might depend on your OS, where your App is installed, how you executed it, etc.
Finally, it's a good idea to differentiate between your 'production' app and your app during development and testing, because you might not want to use the same storage folders for every environment. In any case, just writing to './errorLogDb' is likely to cause lots of headaches so I'd be thankful for the permission denied error.
This strategy worked for me:
const { LocalStorage } = require('node-localstorage');
let ls;

mb.on('ready', () => {
  let prefsPath = mb.app.getPath('userData') + '/prefs';
  ls = new LocalStorage(prefsPath);
  loadPrefs();
});

mb.on('after-create-window', () => { /* ls... */ });

exports.togglePref = () => { /* ls... */ };

How to add static files to an Electron app

How can I add JSON or TOML files in an Electron app for deployment? The following code works in the development environment, but does not work after packaging with electron-packager.
var presets = toml.parse(fs.readFileSync('presets.toml','utf8'));
According to the guide that took me way too long to find, the Electron team patched the fs module to provide a "virtual file system" under the root (/).
That means your file is accessible at fs.readFileSync('/presets.toml'); (notice the forward slash).
I went crazy until I found this one. The problem is not adding the file, because it's already added; the problem is finding the base path after packaging. So here it is:
const { app } = window.require('electron').remote;
console.log(app.getAppPath());
Also, as you can see, if you're using React you need to use window.require instead of regular require (it throws a nasty error otherwise).
Found out about this here:
https://github.com/electron/electron/issues/3204#issuecomment-151000897
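Putting the two together, a minimal sketch for the original presets.toml case (assuming the file is packaged next to the app's package.json, that the remote module is available, and using whatever TOML parser the project already requires; window.require as noted above for a React renderer):

const { app } = window.require('electron').remote;
const fs = window.require('fs');
const path = window.require('path');
const toml = require('toml'); // or whichever TOML parser the project already uses

// Resolve the file against the packaged app's base path instead of the current working directory.
const presetsPath = path.join(app.getAppPath(), 'presets.toml');
const presets = toml.parse(fs.readFileSync(presetsPath, 'utf8'));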

Serve out swagger-ui from nodejs/express project

I would like to use the swagger-ui dist 'as-is'...well almost as-is.
Pulled down the latest release from GitHub (2.0.24) and stuck it in a folder in my app. I then serve it out statically with express:
app.use('/swagger', express.static('./node_modules/swagger-ui/dist'));
That works as expected when I go to:
https://mydomain.com/swagger
However, I want to populate the url field for my swagger JSON dynamically, i.e. I may deploy to different domains:
https://mydomain.com/api-docs
https://otherdomain.com/api-docs
And when I visit:
https://mydomain.com/swagger
https://otherdomain.com/swagger
I would like to dynamically set the url.
Is that possible?
Assuming the /api-docs (or swagger.json) is always on the same path and only the domain changes, you can set the url parameter of the SwaggerUi object to "/path/to/api-docs" or "/path/to/swagger.json" instead of a full URL. That makes the UI load that path relative to the domain the UI is hosted on.
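For example, the initialisation in swagger-ui's index.html could use a relative path (a sketch based on the 2.x dist's stock index.html; the option names are taken from that file):

window.swaggerUi = new SwaggerUi({
  url: "/api-docs",            // relative path: resolved against whatever domain serves the UI
  dom_id: "swagger-ui-container",
  supportedSubmitMethods: ['get', 'post', 'put', 'delete']
});
window.swaggerUi.load();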
For reference, I'm leaving the original answer as well, as it may prove useful in some cases.
You can use the url parameter to set the URL the UI should load.
That is, if you're hosting it under https://mydomain.com/swagger you can use https://mydomain.com/swagger?url=https://mydomain.com/api-docs and https://mydomain.com/swagger?url=https://otherdomain.com/api-docs to point at the different locations.
However, as far as I know, this functionality is only available at the current alpha version (which should be stable enough) and not with 2.0.24 that you use (though it's worth testing).
Another method would be to use the swagger-ui middleware located in swagger-tools.
let swaggerUi = require('../node_modules/swagger-tools/middleware/swagger-ui');
app.use(swaggerUi(config.swagger));
The variable config.swagger contains the swagger.yaml or swagger.json. I have this in my settings:
let config = {
  appRoot: __dirname,
  swagger: require('./api/swagger/swagger.js')
};
Note: I am using the require('swagger-express-mw') module
You could try this in the index.html file of the swagger-ui... It works for me.
// "url" here is the query-string match from the stock index.html,
// e.g. var url = window.location.search.match(/url=([^&]+)/);
if (url && url.length > 1) {
  url = decodeURIComponent(url[1]);
} else {
  url = window.location.origin + "/path/to/swagger.json";
}

Temporary File Download

Is there a service that creates basically a one-time download of a file, preferably something I can use from NodeJS?
I've done some research on FilePicker, and haven't found anything about regenerating the link it gives you for a file. There may be a way to do this with NodeJS, but I'm using Meteor at the same time so many Node things probably will conflict.
You could build it with Meteor, using meteor-router (installed via Meteorite) and server-side routing to deliver the files.
You need a collection to keep track of downloaded files:
Server JS
var downloads = new Meteor.Collection("downloads");

// create a link
downloads.insert({ url: "/mydownload.zip", downloaded: false });

Meteor.Router.add('/file/:id', 'GET', function(id) {
  var download = downloads.findOne(id);
  if (download) {
    if (download.downloaded) {
      this.response.send("You've already downloaded me");
    } else {
      // mark it as used so the link only works once
      downloads.update(download._id, { $set: { downloaded: true } });
      // I guess you could just redirect or stream the file for an extra layer of surety
      this.response.redirect(download.url);
    }
  }
});
On the client you can use /file/{{_id}} as the link, where _id is the _id of the document in the downloads collection for the file the person should get.
My recommendation would also be to add custom server-side logic to count the number of downloads (or just flag a file as downloaded/not downloaded) and respond accordingly. The closest you could do with Filepicker.io would be using the security policies to restrict downloading the file to a specific time interval.
In addition to using the router package, in Meteor.startup you can add:
var require = __meteor_bootstrap__.require;
fs = require( 'fs' );
The fs variable should be declared on the server only. The fs package is used by Meteor and does not need to be added separately.
Once you have done this, you can create files with Meteor.uuid() as their name, which makes them unique and very difficult to guess. It is also possible to delete the file after a certain amount of time by using Meteor.setTimeout.
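For example, a sketch of that expiry idea (the /tmp path and the 30-minute window are just placeholders):

var filePath = '/tmp/' + Meteor.uuid() + '.zip'; // unique, hard-to-guess name
// ... write the file to filePath and hand the link out ...
Meteor.setTimeout(function () {
  // remove the file once the download window has passed
  fs.unlink(filePath, function (err) {
    if (err) console.log('cleanup failed:', err);
  });
}, 30 * 60 * 1000); // e.g. 30 minutes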
The question is: where do the files to be downloaded come from?
Solution using Heroku Cloud and NodeJS Meteor Hooks
Heroku in particular is actually great for temporary file download links: they offer a "temporary scratchpad" filesystem that is reset every time the program restarts, and each running Node server cannot see the files other instances have created.
Each dyno gets its own ephemeral filesystem, with a fresh copy of the most recently deployed code. During the dyno's lifetime its running processes can use the filesystem as a temporary scratchpad, but no files that are written are visible to processes in any other dyno and any files written will be discarded the moment the dyno is stopped or restarted.
Taken from the Heroku documentation: https://devcenter.heroku.com/articles/dynos#ephemeral-filesystem
Thus, any files written to the "filesystem" will be temporary.
This allows for a very easy solution to this problem: you can simply use NodeJS filesystem manipulation to create temporary files on the server, serve them once (or for a limited time), and then remove them so they cannot be downloaded again.
This, in combination with something like $.download(), will make for a seamless experience and in turn prevent unauthorized downloads.
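A minimal Node/Express sketch of that serve-once-and-delete idea (the route name and /tmp path are illustrative only):

const express = require('express');
const fs = require('fs');
const path = require('path');

const app = express();

// Serve a temporary file exactly once, then remove it from the ephemeral filesystem.
app.get('/download/:name', (req, res) => {
  const filePath = path.join('/tmp', path.basename(req.params.name)); // basename guards against path traversal
  if (!fs.existsSync(filePath)) {
    return res.status(404).send('Link expired');
  }
  res.download(filePath, (err) => {
    if (!err) fs.unlink(filePath, () => {}); // one-time: delete as soon as it has been sent
  });
});

app.listen(process.env.PORT || 3000);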
