Node.js: upload a model to the Shapeways API

I am trying to upload an OBJ file to the Shapeways API. I tried using the request module and building the URL myself, but it doesn't seem to be working. Could someone show me some code examples? The documentation only has PHP examples.
Thanks.
fs.readFile("/models/temp.obj",function(err,data){
var params = {file:data,
filename:"temp.obj",
hasRightsToModel: 1,
acceptTermsAndConditions: 1}
var client = new shapeways.client({ consumerKey: config.app.key,
consumerSecret: config.app.secret,
oauthToken: req.session.oauthToken,
oauthSecret: req.session.oauthSecret, });
client.addModel(params,callback);
})
Here's the code. fs.readFile gives me undefined for data.

Looks like your code has two issues.
First, you don't specify your root dir; I doubt you really mean /models.
Second, you should never ignore the err parameter.
fs.readFile(__dirname + "/models/temp.obj", function (err, data) {
    if (err) {
        throw err;
    }
    // . . .
});

My params had filename. It should have been fileName. Silly mistake. Everything else I had already incorporated anyway. Thanks for all your help.
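For reference, here is the original snippet with all three fixes from this thread folded in; a sketch, assuming the same shapeways client module, config object, req session, and callback as above:

fs.readFile(__dirname + "/models/temp.obj", function (err, data) {
    if (err) throw err; // don't ignore the read error
    var params = {
        file: data,
        fileName: "temp.obj", // fileName, not filename
        hasRightsToModel: 1,
        acceptTermsAndConditions: 1
    };
    var client = new shapeways.client({
        consumerKey: config.app.key,
        consumerSecret: config.app.secret,
        oauthToken: req.session.oauthToken,
        oauthSecret: req.session.oauthSecret
    });
    client.addModel(params, callback);
});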

Related

DiscordJS + NodeJS: SyntaxError: Unexpected end of JSON input at JSON.parse (<anonymous>)

Having an issue where I will randomly get this error after about an hour of my code running.
SyntaxError: Unexpected end of JSON input
at JSON.parse (<anonymous>)
Here is my code:
function matchMaking() {
    setInterval(function () {
        fs.readFile('./rankedQueue.json', 'utf8', function (err, data) {
            const file = JSON.parse(data); // The line the error occurs on
            if (err) {
                console.log(err)
            } else {
                // lots of code here
            }
        })
    }, 10 * 1000)
}
Edit: This is the content of the JSON file.
{
    "queue": [],
    "waiting": [],
    "lowLevel": [],
    "placeHolder": [
        {
            "test": "test"
        }
    ]
}
The arrays are being pushed to, and then spliced, a couple of times a minute.
After searching here and on some forums, I've tried using fs.readFileSync, which makes the code not run at all. And now I'm finding some specific examples of this error whose solutions I can't quite seem to make apply to me. If anyone has any idea of what I should be changing, it would be appreciated.
Thanks in advance.
As the error says, the data is not valid JSON.
Can you make sure the file is valid and is something JSON.parse() can actually parse?
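Given that the file is rewritten a couple of times a minute, the likeliest cause is a read racing with a write, which returns truncated JSON. A minimal defensive sketch (same file and callback shape as the question): check err before touching data, and guard JSON.parse with try/catch:

fs.readFile('./rankedQueue.json', 'utf8', function (err, data) {
    if (err) return console.log(err);
    let file;
    try {
        file = JSON.parse(data);
    } catch (parseErr) {
        // a partially written file produces exactly "Unexpected end of JSON input"
        return console.log('Skipping this tick, invalid JSON:', parseErr.message);
    }
    // lots of code here
});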
How to use fs.readFileSync
Using fs.readFileSync returns a buffer. You can easily convert the buffer into something readable with buffer.toString('ascii'). Also, fs.readFile takes a callback argument while readFileSync does NOT. What I think you were doing with readFileSync was:
fs.readFileSync('./rankedQueue.json', 'utf8', function (err, data) {
    const file = JSON.parse(data); // The line the error occurs on
    if (err) {
        console.log(err)
    } else {
        // lots of code here
    }
})
But you should actually be doing this:
var data = fs.readFileSync('./rankedQueue.json');
var parsed = JSON.parse(data.toString('ascii')); // don't name this variable JSON, or it shadows the global JSON object
You could drop the 'ascii'.
Incorrect / Hacky Answer
Use require instead. Node.js can parse a JSON file for you. Keep in mind that the way you resolve a path with require is different from fs.
If your code is located at /project/foobar/code.js
and the JSON is at /project/rankedQueue.json:
fs resolves relative paths from the process's current working directory (usually the folder you launched node from), e.g. /project, so ./rankedQueue.json works.
require does not do this; it resolves relative to the folder the requiring file is in. You have to add ../ for every folder you want to go up. So: ./../rankedQueue.json.
I'd suggest also running your JSON through a validator so it can tell you what's wrong.
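One more caveat with the require approach (part of why it is hacky): Node caches modules, so a JSON file that changes at runtime, like rankedQueue.json here, keeps returning the first parse:

// require() parses the JSON once and caches it per resolved path
const queueA = require('./../rankedQueue.json');
// ...the file is rewritten on disk a few times...
const queueB = require('./../rankedQueue.json'); // still the same cached object, not the new contents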

Express.js: adding routes in multiple files

Bit of context: I am learning Node.js/Express and have a small application that in the end should function as an API. I have a routes directory with a few subdirectories containing files such as Post.js or Users.js, each file defining a few routes for posts, users, etc.
I have the following bit of code in my index.js, placed directly in routes:
public readDir(path, app) {
    let dir = path != null ? path : __dirname;
    fs.readdir(dir, (err, elements) => {
        if (err) throw err;
        if (!elements) return;
        elements.forEach(element => {
            if (element === "init.js") return;
            let new_path = x.join(dir, "/", element); // x is the path module
            fs.lstat(new_path, (err, stat) => {
                if (err) throw err;
                if (stat.isDirectory()) {
                    this.readDir(new_path, app);
                } else if (stat.isFile()) {
                    require(new_path)(app); // was require(PATH), which is undefined
                }
            });
        });
    });
}
What it does is the following: it reads the routes directory and each subdirectory by calling itself recursively, and requires any file that is found (the path module is imported as x; I should probably change that sometime). Fortunately this works: every route is mapped properly and can be accessed by making a call with Postman/Insomnia.
My question is how this could be done better, primarily performance-wise, whilst still keeping the structure of multiple files and/or directories.
I have already seen this answer and this one, and though both seem like great and functional answers, I was wondering which would be the better option.
Any pointers would be great!
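For comparison, a minimal sketch of one common alternative, the express.Router pattern, assuming each route file can be changed to export a registration function (file names taken from the question, placed directly in routes for brevity). Performance-wise both approaches only pay their cost once at startup, but this version avoids the async fs scanning and makes the route list explicit:

// routes/Users.js
const express = require('express');
const router = express.Router();

router.get('/', (req, res) => res.json([])); // hypothetical route

module.exports = app => app.use('/users', router);

// routes/index.js
module.exports = app => {
    require('./Users')(app);
    require('./Post')(app);
    // one line per route file
};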

Reading a packaged file in an AWS Lambda package

I have a very simple Node Lambda function which reads the contents of a file packaged with it. I upload the code as a zip file. The directory structure is as follows:
index.js
readme.txt
Then I have in my index.js file:
fs.readFile('/var/task/readme.txt', function (err, data) {
    if (err) throw err;
});
I keep getting the following error: ENOENT: no such file or directory, open '/var/task/readme.txt'.
I tried ./readme.txt also.
What am I missing?
Try this, it works for me:
'use strict'
let fs = require("fs");
let path = require("path");

exports.handler = (event, context, callback) => {
    // To debug your problem
    console.log(path.resolve("./readme.txt"));
    // Solution is to use an absolute path using `__dirname`
    fs.readFile(__dirname + '/readme.txt', function (err, data) {
        if (err) throw err;
    });
};
To debug why your code is not working, add the line below in your handler:
console.log(path.resolve("./readme.txt"));
On AWS Lambda the node process might be running from some other folder, and it looks for the readme.txt file relative to that folder because you provided a relative path; the solution is to use an absolute path.
What worked for me was the comment by Vadorrequest to use process.env.LAMBDA_TASK_ROOT. I wrote a function to get a template file in a /templates directory, using __dirname when I'm running locally on my machine and the process.env.LAMBDA_TASK_ROOT variable when running on Lambda:
// assumes the usual imports:
const fs = require('fs')
const path = require('path')

function loadTemplateFile(templateName) {
  const fileName = `./templates/${templateName}`
  let resolved
  if (process.env.LAMBDA_TASK_ROOT) {
    resolved = path.resolve(process.env.LAMBDA_TASK_ROOT, fileName)
  } else {
    resolved = path.resolve(__dirname, fileName)
  }
  console.log(`Loading template at: ${resolved}`)
  try {
    const data = fs.readFileSync(resolved, 'utf8')
    return data
  } catch (error) {
    const message = `Could not load template at: ${resolved}, error: ${JSON.stringify(error, null, 2)}`
    console.error(message)
    throw new Error(message)
  }
}
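For illustration, a hypothetical call (the template name welcome.html is made up): locally this reads templates/welcome.html next to the code via __dirname, and on Lambda it reads the same path under LAMBDA_TASK_ROOT:

const html = loadTemplateFile('welcome.html')
console.log(html.length, 'characters loaded')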
This is an oldish question but comes up first when attempting to sort out what's going on with file paths on Lambda.
Additional Steps for Serverless Framework
For anyone using the Serverless framework to deploy (which probably uses webpack to build), you will also need to add the following to your webpack config file (just after target: 'node'):
// assume target: 'node', is here
node: {
  __dirname: false,
},
Without this piece, using __dirname with Serverless will still not get you the desired absolute directory path.
I went through this using the Serverless framework, and in my case it really was that the file was not included in the package. Just add the following lines in serverless.yml:
package:
  individually: false
  include:
    - src/**
// join against the task root instead of concatenating a resolved absolute path
const filepath = path.join(process.env.LAMBDA_TASK_ROOT, 'filename.text');
const fileData2 = fs.readFileSync(filepath, 'utf-8');
I was using fs.promises.readFile() and couldn't get it to error out at all. The file was there, and LAMBDA_TASK_ROOT seemed right to me as well. After I changed to fs.readFileSync(), it worked.
I had the same problem and I tried applying all these wonderful solutions above, which didn't work.
The problem was that I had set up one of the folder names with one letter in upper case when it was really lowercase.
So when I tried to fetch the content of /src/SOmething/some_file.txt
while the folder was really /src/Something/, I got this error...
Windows (the local environment) is case-insensitive, while AWS is not!

Node.js method chaining of fs.mkdir and fs.rename

I'm working on a file uploader script which creates a new folder (based on the timestamp) and moves the uploaded file into the created folder.
Sometimes it works, and sometimes I get an ENOENT rename error (file/folder does not exist).
The following code is in my post route:
var form = new multiparty.Form({
    uploadDir: "C:" + path.sep + "files"
});

form.parse(req, function (err, fields, files) {
    var dirPath = "C:" + path.sep + "files" + path.sep + +new Date;
    fs.mkdir(dirPath, 0777, function (err, dirPath) {
        if (err) console.error(err);
        console.log("Created new folder");
        fs.rename(files.file[i].path, dirPath + path.sep + files.file[i].originalFilename, function (err) {
            if (err) console.error(err);
            console.log("Moved file");
        });
    }(err, dirPath));
    next();
});
I'm using Express (4) and the multiparty module.
As you can see, I'm using async functions.
So the question is: what is wrong with my code?
Edit
The error I'm talking about: "Error: ENOENT, rename 'C:\files\7384-1r41cry.png'"
It has something to do with a race condition. With fs.mkdirSync everything works fine.
My guess would be some sort of race condition happening here.
This kind of stuff is easy to get wrong and hard to get right.
I normally use gulp for this kind of stuff, and maybe you should too :)
Copying a whole directory tree into some other directory couldn't be easier:
gulp.src('./inputDir/**/*').pipe(gulp.dest('./outputDir'))
And all files from inputDir would be copied into outputDir.
But maybe copying is not an option. The files could be too large, right?
Let's hack it a bit to make it work the way we want.
var fs = require('fs')
  , gulp = require('gulp')
  , thr = require('through2').obj
  , SRC = './test/**/*.{css,html,js}' // it could be 'my/file/located/here'
  , OUT = './output' // it could be 'my/newer/file/located/there'
  ;

gulp.src(SRC)
  .pipe(thr(function (chunk, enc, next) {
    chunk.contents = new Buffer('') // cleaning the contents of the file
    chunk._originalPath = chunk.path
    next(null, chunk)
  }))
  .pipe(gulp.dest(OUT))
  .pipe(thr(function (chunk, enc, next) {
    // now a placeholder file exists at our destination,
    // so we can write to it, confident that it exists
    console.log('moving file from', chunk._originalPath, 'to', chunk.path)
    fs.rename(chunk._originalPath, chunk.path, function (err) {
      if (err) return next(err)
      next()
    })
  }))
This moves all css, html and js files from input to output, regardless of how many nested directories there are.
gulp is awesome :)
OK, a few things...
You really should start using promises; it makes the code easier to read and the error handling is way superior. I usually use when.js, but there are other alternatives.
You should throw the errors, or return on errors, or you will try to continue running the function even when the previous operations failed.
You do
if (err) console.error(err);
which should be
if (err) throw err;
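One more thing worth pointing out: the original snippet invokes the mkdir callback immediately, via the trailing }(err, dirPath)), so the rename can run before the directory exists, which explains the intermittent ENOENT. Putting the advice above together, a minimal corrected sketch of the route using promises (fs.promises is used here for brevity; when.js would look similar), assuming the same multiparty form and Express next as in the question:

const fs = require('fs').promises;
const path = require('path');

form.parse(req, async function (err, fields, files) {
    if (err) return next(err);
    var dirPath = path.join('C:', 'files', String(Date.now()));
    try {
        await fs.mkdir(dirPath); // wait until the folder actually exists
        for (const file of files.file) {
            await fs.rename(file.path, path.join(dirPath, file.originalFilename));
        }
        next();
    } catch (e) {
        next(e); // stop instead of continuing after a failed operation
    }
});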

NodeJS: Asynchronous file read problems

New to NodeJS.
Yes, I know I could use a framework, but I want to get a good grok on it before delving into the myriad of fine tools that are out there.
My problem:
var img = fs.readFileSync(path);
the above works;
fs.readFile(path, function (err, data) {
    if (err) throw err;
    console.log(data);
});
the above doesn't work;
The input path is: 'C:\NodeSite\chrome.jpg'
Oh, and I'm working on Windows 7.
Any help would be much appreciated.
Fixed
Late night/morning programming introduces errors that are hard to spot. The path was being set from two different places, and so the source path was different in both cases. Thank you for your help. I am a complete numpty. :)
If you are not setting an encoding when reading a file, you will get the binary content.
So for example, the following snippet will output the content of the test file using UTF-8 encoding. If you don't use an encoding, you will get the raw binary Buffer printed on your console.
var fs = require('fs');
var path = "C:\\tmp\\testfile.txt";

fs.readFile(path, 'utf8', function (err, data) {
    if (err) throw err;
    console.log(data);
});
Another issue (especially on Windows-based OSs) can be the correct escaping of the target path. The above example shows how paths on Windows have to be escaped.
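As a side note, Node's fs APIs on Windows also accept forward slashes, which sidesteps the escaping problem entirely; a minimal sketch:

var fs = require('fs');

// forward slashes need no escaping, even on Windows
fs.readFile("C:/tmp/testfile.txt", 'utf8', function (err, data) {
    if (err) throw err;
    console.log(data);
});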
Java folks can just use this JavaScript call as if it were pure Java, trouble-free (note that readFileSync is synchronous, not asynchronous):
var fs = require('fs');
var Contenu = fs.readFileSync(fileFullName, 'utf8');
console.log(Contenu);
That should take care of small and big files.
