Node.js issues with Meteor's file system

I have been trying to figure out what I am missing in this puzzle between Node.js and Meteor.js. I know Meteor is built on Node.js, but Meteor does not behave the same way. Either there are extra steps I need to take to get the same result (and I don't know what they are), or there is a serious bug between the two. Standalone Node.js runs the examples below just fine; running the same code in Meteor causes errors or undefined results. I wish I had a way to solve this, or that it would be patched so it works the way it should.
Example #1
var fs = require('fs');
fs.readFile('file.txt', 'utf8', function (err, data) {
  if (err) {
    return console.log(err);
  }
  console.log(data);
});
Example #2
var jetpack = require('fs-jetpack');
var data = jetpack.read('file.txt');
console.log(data);
Example #3
var fs = require('fs');
var readMe = fs.readFileSync('file.txt', 'utf8');
console.log(readMe);

You shouldn't try to load files like this because you don't know what the folder structure looks like. Meteor creates builds from your project directory, both in development and production mode. This means that even though you have a file.txt in your project folder, it doesn't end up in the same place in the build (or it isn't even included in the build at all).
For example, your code tries to read the file from the development build folder .meteor/local/build/programs/server. However, this folder doesn't contain file.txt.
Solution: Store file.txt in the private folder of your project and use Assets.getText to read it. If you still want to use the functions from fs to load the file, you can retrieve the absolute path with Assets.absoluteFilePath.
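For example, with file.txt stored in the private folder, a minimal sketch of both approaches (server-side only; this assumes a Meteor release recent enough to have Assets.absoluteFilePath, and the Meteor.startup wrapper is just one possible place to run it):
// server-side code, with the file at private/file.txt
Meteor.startup(function () {
  // read the bundled asset directly
  var text = Assets.getText('file.txt');
  console.log(text);

  // or, to keep using fs, resolve the real path of the bundled asset first
  var fs = require('fs');
  var absolutePath = Assets.absoluteFilePath('file.txt');
  console.log(fs.readFileSync(absolutePath, 'utf8'));
});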

Related

Best way to copy a directory from an external drive to a local folder with electronjs?

Just wondering if anyone has ever attempted to copy a directory from an external drive (connected via USB) to a local folder.
I am using ElectronJS so I can use my JavaScript and HTML/CSS skills to create a desktop application without using a C-family language (i.e. C# or C++). With ElectronJS there's a lot less to worry about.
Here is the list of things I've tried so far:
basic fs.copyFile (using copyFile initially, then looping round the directory to copy all files)
var fs = require('fs');
window.test = () => {
  fs.moveSync("targetFile", "destDir", function (err) {
    if (err) {
      console.log(err);
    } else {
      console.log("copy complete");
    }
  });
}
fs.moveSync is not a function, even though Visual Studio Code suggested moveSync when I typed fs. (Ctrl + Space)
using child_process functions to copy files via the command line.
Code is:
var process = require('child_process')
window.test = function () {
  process.exec('ipconfig', function (err, stdout, stderr) {
    if (err) {
      console.log(err);
    } else {
      console.log(stdout)
    }
  })
}
Then bundled with browserify. Bundle.js is then imported into the HTML file and the test function is called on the click of a button. I'm aware the command is ipconfig for now; this was merely used to see if a command could be executed. It appears it could not, because I was getting "process.exec is not defined".
use the node-hid node module to read and transfer data from the external drive.
The exposed functions within this module were also reported as being undefined. And as I thought about the use case longer, I decided a simple copy process would suffice, because an external drive can be accessed like any other folder in the file explorer.
Unfortunately, all of the above have failed, and I've spent the better part of the day looking for alternative modules and/or solutions.
Thanks in advance because any help to achieve this would be much appreciated.
Thanks
Patrick
The npm package fs-extra should solve your problem.
It has the move function, which
Moves a file or directory, even across devices
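For instance, a minimal sketch using fs-extra's move (the source and destination paths are placeholders; copy has the same error-first callback shape if you would rather keep the original file):
var fse = require('fs-extra');

window.test = function () {
  // move a file from the external drive into a local folder
  fse.move('/media/usb/targetFile', 'destDir/targetFile', function (err) {
    if (err) {
      console.log(err);
    } else {
      console.log('copy complete');
    }
  });
};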
Ended up adding this to my preload.js:
window.require = require;
It will work for now but is due to be deprecated.
I'll use this for now and make other updates when I have to.

How to compile ReactJS for use on server with command line arguments?

I've decided to try out ReactJS. Along with that, I've decided to use Gulp for compiling .jsx to .js, also for the first time.
I can compile it no problem for client use with browserify. Here's my gulp task:
browserify("./scripts/main.jsx")
.transform(
babelify.configure({
presets: ["react"]
}))
.bundle()
.pipe(source('bundle.js'))
.pipe(gulp.dest('./scripts/'));
But since I use PHP to generate the data, I need to get that data to Node. If I use browserify, it will prevent me from using process.argv in Node. I can save the data to a file and read that file in Node, so I wouldn't need to pass the whole state, but I still need to pass the identifying arguments so Node knows which file to load.
What should I use instead of browserify?
If you need to compile a React module to es5 for use on the server, use Babel itself.
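One way to do that, sketched here on the assumption that gulp-babel is installed (the task name and paths are illustrative, mirroring the gulp setup above):
var gulp = require('gulp');
var babel = require('gulp-babel');

// compile the server-side JSX to plain JavaScript without bundling it
gulp.task('server-scripts', function () {
  return gulp.src('./scripts/server.jsx')
    .pipe(babel({ presets: ['react'] }))
    .pipe(gulp.dest('./scripts/'));
});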
The built-in fs module may help with reading and writing files: https://nodejs.org/api/fs.html
Have you considered posting and getting from a database?
Here's how I solved it:
I have learnt that you can create standalone bundles with browserify, so I've compiled all the server code I need (components + rendering) as a standalone bundle. Then I created a small Node script which is responsible only for reading arguments, loading data and sending it to the rendering code.
I'm not sure if this is the proper way to do it, but it works.
Here's code for the "setup" script:
var fs = require('fs');
var Server = require('./server.js');

if (process.argv[2]) {
  // sanitise the region argument before using it in the file path
  var region = process.argv[2].toLowerCase().replace(/[^a-z0-9]/g, '');
  if (region != '') {
    var data = JSON.parse(fs.readFileSync(__dirname + '/../tmp/' + region + '.json', 'utf8'));
    console.log(Server.render(data.deal, data.region));
  }
}
This way I only need to deploy two files and I still can easily compile jsx to js.
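For reference, the standalone bundle mentioned above can be produced by passing browserify's standalone option; a sketch based on the earlier gulp task (the exposed name Server and the file names are just examples):
browserify("./scripts/server.jsx", { standalone: "Server" })
  .transform(
    babelify.configure({
      presets: ["react"]
    }))
  .bundle()
  .pipe(source('server.js'))
  .pipe(gulp.dest('./scripts/'));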

Node js module mkdirp only creates half the directories

I'm trying to use mkdirp for a project, but when I feed it a variable with the directory path I want created, it only creates the first half of it. I've installed the module locally with npm. I'm using Node v0.10.20 on a Raspberry Pi.
This is how it looks:
var filePath = "upload/home/pi/app/temp";
mkdirp(filePath, function (error) {
  if (error) {
    console.log(error);
  } else {
    ...
  }
});
I don't get an error creating the path, but it only creates "upload/home/pi"; however, if I run my script again, it creates the rest of the directory structure. "upload" is a directory in the current working directory, which is the user's home.
I emailed the author of the module, who suggested that it could be because I'm using a flash drive as my medium, which in turn lies about when I/O operations are complete; I guess that leads Node.js to think it has successfully written the path to disk. How should I tackle my problem? I guess I could check whether the directory was created and loop until it has been, but that feels like the wrong thing to do. Any suggestions welcome.
Thanks.
Try doing this synchronously:
var filePath = "upload/home/pi/app/temp";
mkdirp.sync(filePath);
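A slightly fuller sketch of that idea, including the kind of existence check the question mentions (paths as in the question; mkdirp.sync throws on failure, so it is wrapped in try/catch):
var fs = require('fs');
var mkdirp = require('mkdirp');

var filePath = "upload/home/pi/app/temp";
try {
  mkdirp.sync(filePath);                  // creates every missing segment of the path
  console.log(fs.existsSync(filePath));   // sanity check: should log true
} catch (error) {
  console.log(error);
}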

Grunt-Karma: Use Node.js fs-framework in Jasmine Testfile

I'm writing unit-tests with the Jasmine-framework.
I use Grunt and Karma for running the Jasmine testfiles.
I simply want to load the content of a file on my local file-system (e.g. example.xml).
I thought I can do this:
var fs = require('fs');
var fileContent = fs.readFileSync("test/resources/example.xml").toString();
console.log(fileContent);
This works well in my Gruntfile.js and even in my karma.conf.js file, but not in my Jasmine file. My test file looks like this:
describe('Some tests', function () {
  it('load xml file', function () {
    var fs = require("fs");
    fileContent = fs.readFileSync("test/resources/example.xml").toString();
    console.log(fileContent);
  });
});
The first error I get is:
'ReferenceError: require is not defined'.
I do not know why I cannot use RequireJS here, because I can use it in Gruntfile.js and even in karma.conf.js.
Okay, but when I manually add require.js to the files property in the karma.conf.js file, I get the following message:
Module name "fs" has not been loaded yet for context: _. Use require([])
With the array syntax of RequireJS, nothing happens.
I guess it is not possible to access Node.js functionality in Jasmine when running the test files with Karma. But since Karma runs on Node.js, why is it not possible to access the fs module of Node.js?
Any comment/advice is welcome.
Thanks.
Your test does not work because Karma is a test runner for client-side JavaScript (JavaScript that runs in the browser), but you are trying to test Node.js code (which runs on the server side) with it. Karma simply can't run server-side tests. You need a different test runner; for example, take a look at jasmine-node.
Since this comes up first in a Google search: I received a similar error but wasn't using any Node.js-style code in my project. It turned out one of my bower components had a full copy of Jasmine in it, including its Node.js-style code, and I had
{ pattern: 'src/**/*.js', included: false },
in my karma.conf.js.
So unfortunately Karma doesn't provide the best debugging for this sort of thing, dumping you out without telling you which file caused the issue. I had to narrow that pattern down to individual directories to find the offender.
Anyway, just be wary of bower installs; they bring a lot of code down into your project directory that you might not really care to have.
I think you're missing the point of unit testing here, because it seems to me that you're copying application logic into your test suite. This defeats the purpose of a unit test, which is supposed to run your existing functions through a test suite, not to test that fs can load an XML file. In your scenario, if your XML-handling code changed (and introduced a bug) in the source file, it would still pass the unit test.
Think of unit testing as a way to run your functions through lots of sample data to make sure they don't break. Set up your file reader to accept input and then simply write the Jasmine test:
describe('My XML reader', function () {
  beforeEach(function () {
    this.xmlreader = new XMLReader();
  });

  it('can load some xml', function () {
    var xmldump = this.xmlreader.loadXML('inputFile.xml');
    expect(xmldump).toBeTruthy();
  });
});
Test the methods that are exposed on the object you are testing. Don't make more work for yourself. :-)

using streamlinejs with nodejs express framework

I am new to the Node.js world. Wanting to explore the various technologies and frameworks involved, I am building a simple user posts system (users post something, everybody else sees the posts) backed by Redis. I am using the Express framework, which is recommended by most tutorials. But I have some difficulty getting data from the Redis server: I need to make three queries to display the posts, which means nesting a callback after each Redis call. So I wanted to use streamline.js to simplify the callbacks, but I am unable to get it to work, even after running npm install streamline -g and adding require('streamline').register(); before calling
var keys = ['comments', 'timestamp', 'id'];
var posts = [];
for (var key in keys) {
  var post = client.sort("posts", 'by', 'nosort', "get", "POST:*->" + keys[key], _);
  posts.push(post);
}
I get the error ReferenceError: _ is not defined.
Please point me in the right direction or point to any resources I might have missed.
The require('streamline').register() call should be in the file that starts your application (with a .js extension). The streamline code should be in another file with a ._js extension, which is required by the main script.
Streamline only allows you to have async calls (calls with an _ argument) at the top level in a main script. Here, your streamline code is in a module required by the main script, so you need to put it inside a function. Something like:
exports.myFunction = function (_) {
  var keys = ['comments', 'timestamp', 'id'];
  var posts = [];
  for (var key in keys) {
    var post = client.sort("posts", 'by', 'nosort', "get", "POST:*->" + keys[key], _);
    posts.push(post);
  }
  return posts;
}
This is because require is synchronous. So you cannot put asynchronous code at the top level of a script which is required by another script.
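To make the split concrete, here is a sketch of what the plain-JS main script could look like, assuming the function above lives in a file named posts._js (the file name and call site are illustrative; when called from plain JavaScript, the trailing _ parameter is supplied as a regular node-style callback):
// main.js: plain JavaScript entry point
require('streamline').register();      // registers the ._js compiler with require()

var posts = require('./posts._js');    // the streamline module shown above

// the _ parameter becomes an (err, result) callback when called from plain JS
posts.myFunction(function (err, result) {
  if (err) return console.log(err);
  console.log(result);                 // the value returned by myFunction (the posts array above)
});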
