Saving a file on Linux with a FileStream in Node.js

I want to save a PDF file on a Meteor server using a Node.js FileStream:
var fs = Npm.require('fs');
var writeStream = fs.createWriteStream(fileName);

pdf.create(html).toStream(function(err, stream) {
  stream.pipe(writeStream);
});
There is no problem when I run the code locally on Windows, but after deploying the app with mup to a DigitalOcean server running Ubuntu, it crashes my app.
I checked the directory with ls -l, and meteoruser is the owner of the directory I want to save the file in, yet it still crashes my app.
Why is it crashing? Is there a way of checking what the problem is?
Edit:
I found that the error I get is ENOENT.
Before piping the stream I tried writing some data to the write stream directly, and that succeeded, so the problem must be this line:
stream.pipe(writeStream);
It gives ENOENT and crashes the server.
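ENOENT means some path component does not exist at the moment the write happens, so a relative fileName resolved against an unexpected working directory on the deployed server is a likely culprit. Here is a minimal diagnostic sketch, assuming the html-pdf-style API from the question: resolve the path explicitly, and attach 'error' handlers to both streams, since an unhandled stream 'error' event is exactly what crashes a Node process.
var fs = Npm.require('fs');
var path = Npm.require('path');

// Resolve the target explicitly so a relative fileName does not
// silently depend on the deployed process's working directory.
var target = path.resolve(process.cwd(), fileName);
console.log('writing to', target);

var writeStream = fs.createWriteStream(target);
writeStream.on('error', function (err) {
  // Logs the exact failing path instead of crashing the server.
  console.error('write stream error:', err);
});

pdf.create(html).toStream(function (err, stream) {
  if (err) return console.error('pdf error:', err);
  stream.on('error', function (err) {
    console.error('pdf stream error:', err);
  });
  stream.pipe(writeStream);
});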

Related

Where will the file be saved on Heroku for a Node.js app?

I am creating a file from a buffer and saving it to my Heroku virtual machine:
var fs = require('fs');
var wstream = fs.createWriteStream(name + '.wav');
wstream.write(buffer);
wstream.end();
I need to send that file to Google Cloud Storage using its absolute path. Do you know what the path will be?
path = '?'
The application root on Heroku should be /app. So if you have a file named foo.txt in the root of your repo, it would be /app/foo.txt. You can confirm this by opening up a console (e.g. heroku run bash) and running pwd.
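Since the write above used a relative name, the file lands in the process's current working directory (/app for a dyno started from the app root). Rather than hard-coding that, you can compute the absolute path at runtime; a minimal sketch using Node's core path module:
var path = require('path');

// Resolves the relative name against the current working directory,
// which is /app for a Heroku dyno started from the app root.
var absolutePath = path.resolve(process.cwd(), name + '.wav');
console.log(absolutePath); // e.g. /app/<name>.wav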

readFileSync throws an error when the server is launched as a Linux service

I'm trying to make a simple API for myself using a Node/Express server running on DigitalOcean. In the server file I have something like this:
var data = fs.readFileSync('path/to/data.json','utf8');
which works perfectly fine when I launch the server manually from the command line:
node server
But what I have set up is a Linux service, so that every time I restart my DigitalOcean machine it automatically launches the server. The service (kept in /etc/init/) looks like this:
start on filesystem and started networking
respawn
exec node /path/to/server.js
The issue is that the request which triggers the readFileSync call works fine if the server was launched manually from the command line, but when the server was launched via the service, readFileSync throws the following error:
Error: ENOENT, no such file or directory 'path/to/data.json'
at Error (native)
at Object.fs.openSync (fs.js:500:18)
at Object.fs.readFileSync (fs.js:352:15)
The file and the directory do exist (if I make a request for the data.json file directly in my browser, I can see it).
What am I missing? Is there something about launching the server as a service that conflicts with using readFileSync? Is there an alternative approach to what I'm trying to do? Should I use some kind of request/fetch resource module for accessing that JSON file?
You're using a relative path but the process is not being started from where you think it is. Instead of using relative paths, use absolute paths.
So if your layout looks like:
server.js
path/
  to/
    data.json
Then inside your server.js, you can just do something like:
var path = require('path');
// ...
var data = fs.readFileSync(path.join(__dirname, 'path/to/data.json'), 'utf8');
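Alternatively (my suggestion, not part of the original answer), you can pin the process's working directory itself at startup, so any existing relative paths keep working no matter how the init system starts the process:
// At the very top of server.js: make the working directory the
// directory containing this file, regardless of where the
// service launcher started the process from.
process.chdir(__dirname);

var fs = require('fs');
var data = fs.readFileSync('path/to/data.json', 'utf8');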

nodeJS: fs.write callback and fs.writeFile not working

I knew nothing about fs until I was learning to use CasperJS to scrape some content from a website and save it to a file. Following some examples on the web, I wrote this file, scrape.js (the JSON data has been tested, so it has nothing to do with the issue):
var fs = require('fs');
var url = "http://w.nycweb.io/index.php?option=com_k2&view=itemlist&id=4&Itemid=209&format=json";
var casper = require('casper').create();

casper.start(url, function() {
  var json = JSON.parse(this.fetchText('pre'));
  var jsonOfItems = {}, items = json.items;
  items.forEach(function(item) {
    jsonOfItems[item.id] = item.introtext.split('\n');
  });
  fs.write('videoLinks.json', JSON.stringify(jsonOfItems), function(err) {
    if (err) console.log(err);
    console.log('videoLinks.json saved');
  });
});

casper.run();
When I run casperjs scrape.js on the command line of my Ubuntu 14.04 server, I don't see the "file saved" message as expected, although the file is properly saved. So this is the first question: why isn't the callback running at all?
Secondly, I also tried fs.writeFile, but when I replace fs.write with it, the file isn't saved at all, nor is there any error information.
I do notice that the Casper documentation says Casper is not a Node.js module and that some Node.js modules won't be available, but I doubt that has anything to do with my issues. It's also worth mentioning that previously, when I ran this script, I only got a response like
I'm 'fs' module.
and I had to follow this question to reinstall the fs module globally to get it working.
fs.write expects a file descriptor, but you are trying to give it a filename. Try fs.writeFile: https://nodejs.org/dist/latest-v4.x/docs/api/fs.html#fs_fs_writefile_file_data_options_callback
Edit: Oh, you tried that. Are you sure it didn't write it somewhere like the root directory? Have you tried a full path in there?
And what version of Node are you running?
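For reference, here is how the two calls differ in Node's own fs module (a minimal sketch with placeholder file names; an aside: inside CasperJS, fs is PhantomJS's module, whose fs.write(path, content, mode) is synchronous and takes a mode string rather than a callback, which would explain a callback that never fires):
var fs = require('fs');
var data = JSON.stringify({ sample: true }); // placeholder payload

// fs.writeFile takes a path and handles open/write/close itself.
fs.writeFile('via-writefile.json', data, function (err) {
  if (err) return console.error(err);
  console.log('via-writefile.json saved');
});

// fs.write takes a numeric file descriptor obtained from fs.open.
fs.open('via-write.json', 'w', function (err, fd) {
  if (err) return console.error(err);
  fs.write(fd, data, function (err) {
    if (err) return console.error(err);
    fs.close(fd, function () {
      console.log('via-write.json saved');
    });
  });
});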

How to stop WSH execution while working with Node?

I am running the following test.js file using
>node test.js
test.js is this simple code:
var fs = require("fs");

fs.readFile('2.rtf', function (err, data) {
  if (err) return console.error(err);
  console.log(data.toString());
});

console.log("Program Ended");
It shows a Windows Script Host error.
So naturally I removed the extension and ran the following command, which worked fine:
node test
I am running Windows 7 32-bit. How do I stop WSH while using the node command?
The problem seems to be that you have a file named node.js in the same directory, and due to the way Windows resolves commands, it's trying to execute that node.js file (using the default handler for that file type, which is typically JScript via WSH) instead of the node executable. Rename the node.js file to something else and it should work.
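One way to see why this happens (assuming default Windows settings): cmd.exe checks the current directory before PATH when resolving a bare command name, and it tries each extension listed in PATHEXT, which includes .JS by default, so node matches .\node.js before node.exe is found:
C:\> echo %PATHEXT%
.COM;.EXE;.BAT;.CMD;.VBS;.VBE;.JS;.JSE;.WSF;.WSH;.MSC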

How to upload a file using easy-ftp in Node?

I am trying to upload a file to my hosting server using Node and easy-ftp.
I tried the following code:
var EasyFtp = require("easy-ftp");
var ftp = new EasyFtp();

var config = {
  host: 'homexxxxx.1and1-data.host',
  type: 'SFTP',
  port: '22',
  username: 'u90xxxx',
  password: "mypass"
};

ftp.connect(config);
ftp.upload("/test/test.txt", "/test.txt", function(err) {
  if (err) throw err;
  ftp.close();
});
No error message, but no file is uploaded.
I tried the same using promises:
const EasyFTP = require('easy-ftp-extra');
const ftp = new EasyFTP();

const config = {
  host: 'homexxxxx.1and1-data.host',
  type: 'SFTP',
  port: '22',
  username: 'u90xxxx',
  password: "mypass"
};

ftp.connect(config);
ftp.upload('/test.txt', '/test.txt')
  .then(console.log)
  .catch(console.error);
The same issue: no file is uploaded, and there is no error in the Node console.
The config is the same one I use in FileZilla to transfer files (SFTP protocol), and everything works well with FileZilla.
What am I doing wrong?
Looks like you may have a path problem here.
"/test/test.txt"
The path specified will try to take the file from the root folder, like "C:\test\test.txt".
Assuming you want the file to be taken from your project folder, try this path:
"./test/test.txt"
Everything else in your code is precisely the same as in mine, and mine works.
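A sketch of the corrected call, additionally resolving the local path against the script's own directory so it does not depend on the working directory (path.join and __dirname are Node core; the easy-ftp calls are the ones from the question):
var path = require('path');
var EasyFtp = require('easy-ftp');
var ftp = new EasyFtp();

ftp.connect(config); // same config object as in the question

// Resolve the local file relative to this script, not the cwd.
var localFile = path.join(__dirname, 'test', 'test.txt');

ftp.upload(localFile, '/test.txt', function (err) {
  if (err) throw err;
  ftp.close();
});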
For me, it was just silently failing, and IntelliSense was not available.
npm remove easy-ftp
npm install easy-ftp
npm audit fix --force (run repeatedly until no more vulnerabilities are reported)
Afterwards, IntelliSense was available and it started working.
