I am trying to pass an image which the user has uploaded to the Microsoft Cognitive Services Face API. The image is available on the server in the uploads folder.
Microsoft expects the image to be 'application/octet-stream' and passed as binary data.
I am currently unable to find a way to pass the image to the API in a form it will accept, and I keep receiving "decoding error, image format unsupported". As far as I'm aware the image must be uploaded as a blob or file, but being new to Node.js I'm really unsure how to achieve this.
So far I have the code below and have looked at a few options, but none have worked; the other options I tried returned similar errors such as 'file too small or large', yet when I manually test the same image via Postman it works fine.
image.mv('./uploads/' + req.files.image.name, function(err) {
    if (err)
        return res.status(500).send(err);
});

var encodedImage = new Buffer(req.files.image.data, 'binary').toString('hex');

let addAPersonFace = cognitive.addAPersonFace(personGroupId, personId, encodedImage);
addAPersonFace.then(function(data) {
    res.render('pages/persons/face', { data: data, personGroupId: req.params.persongroupid, personId: req.params.personid });
});
The package it looks like you're using, cognitive-services, does not appear to support file uploads. You might choose to raise an issue on the GitHub page.
Alternative NPM packages do exist, though, if that's an option. With project-oxford, you would do something like the following:
var oxford = require('project-oxford'),
    client = new oxford.Client(YOUR_FACE_API_KEY),
    uuid = require('uuid');

var personGroupId = uuid.v4();
var personGroupName = 'my-person-group-name';
var personName = 'my-person-name';
var facePath = './images/face.jpg';

// Skip the person-group creation if you already have one
console.log(JSON.stringify({personGroupId: personGroupId}));
client.face.personGroup.create(personGroupId, personGroupName, '')
    .then(function(createPersonGroupResponse) {
        // Skip the person creation if you already have one
        client.face.person.create(personGroupId, personName)
            .then(function(createPersonResponse) {
                console.log(JSON.stringify(createPersonResponse));
                personId = createPersonResponse.personId;
                // Associate an image with the person
                client.face.person.addFace(personGroupId, personId, {path: facePath})
                    .then(function(addFaceResponse) {
                        console.log(JSON.stringify(addFaceResponse));
                    });
            });
    });
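If using a package is not an option, you can also call the Face API REST endpoint directly and send the raw file bytes as application/octet-stream, which is what the service expects; the "decoding error" typically means the body was an encoded string (hex/base64) rather than the raw binary. Below is a minimal sketch, assuming node-fetch and placeholder values for the region, key, file name and IDs:
var fs = require('fs');
var fetch = require('node-fetch');

// Read the uploaded file as a raw Buffer and send it unencoded
var imageBuffer = fs.readFileSync('./uploads/' + imageName);
var url = 'https://westus.api.cognitive.microsoft.com/face/v1.0/persongroups/' +
    personGroupId + '/persons/' + personId + '/persistedFaces';

fetch(url, {
    method: 'POST',
    headers: {
        'Ocp-Apim-Subscription-Key': YOUR_FACE_API_KEY,
        'Content-Type': 'application/octet-stream'
    },
    body: imageBuffer // raw binary, no hex or base64 conversion
})
    .then(function(response) { return response.json(); })
    .then(function(data) { console.log(data); });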
Please update to version 0.2.0; this should work now.
I'm trying to create pages that will take a user's information and save it to the database. The user information is {name, age....... picture}. When I submit the information without a picture it works fine and the data is saved to the database, but when I try to include the picture it gives me the error. I'm sorry for the picture quality.
Can anyone help me with this?
I'm using Node.js and React.
Thanks :)
You would need to send the picture as Base64 in your object, as follows:
var data = {
    name: 'John',
    age: 27,
    picture: 'data:image/png;base64,R0lGODlhPQBEAJos...',
}
In Node.js, if you're using Express, the picture will be in req.body.picture. So all you need to do is store the file, then use the temp path to do whatever you need.
You can store the Base64 file like this:
var fs = require('fs');

// Strip the 'data:image/png;base64,' prefix before decoding,
// otherwise the written file will be corrupted
var base64Data = req.body.picture.replace(/^data:image\/\w+;base64,/, '');
var filePath = './tmp/myPicture.png';

fs.writeFile(filePath, base64Data, 'base64', (err) => {
    if (err) {
        res.json({ err: 'Error while creating temp file from base64.' });
    } else {
        // Your file was uploaded, so you can read your file here.
    }
});
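On the React side, a minimal sketch of producing that Base64 value from a file input and posting it along with the other fields (the /users endpoint and field names here are only placeholders):
function submitUser(file, name, age) {
    var reader = new FileReader();
    reader.onload = function() {
        // reader.result is a data URL like 'data:image/png;base64,...'
        fetch('/users', {
            method: 'POST',
            headers: { 'Content-Type': 'application/json' },
            body: JSON.stringify({ name: name, age: age, picture: reader.result })
        });
    };
    reader.readAsDataURL(file);
}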
I have created a Node JS server which does the following:
Uploads media files (videos and images) to the server using multer
If the media is an image, then resize it using sharp
If the media is a video, then resize and compress it using fluent-ffmpeg
Upload files to Firebase storage for backup
All of this is currently working smoothly. The problem is that when the size of an uploaded file is large, processing the request takes a long time. So I want to show some progress on the client side, as below:
State 1. The media is uploading -> n%
State 2. The media is compressing
State 3. The media is uploading to cloud -> n%
State 4. Result -> JSON = {status: "ok", uri: .., cloudURI: .., ..}
The Firebase Storage API has functionality like this when creating an upload task, as shown below:
let uploadTask = imageRef.put(blob, { contentType: mime });
uploadTask.on('state_changed', (snapshot) => {
    if (typeof snapshot.bytesTransferred == "number") {
        let progress = (snapshot.bytesTransferred / snapshot.totalBytes) * 100;
        console.log('Upload is ' + progress + '% done');
    }
});
I have found that it is possible to achieve this using WebSockets, but I am interested in whether there are other methods to do it.
The problem is also described here: http://www.tugberkugurlu.com/archive/long-running-asynchronous-operations-displaying-their-events-and-progress-on-clients
One of the methods is covered in Accessing partial response using AJAX or WebSockets?, but I am looking for a more flexible and professional solution.
I have solved this problem using GraphQL Subscriptions. The same approach can be realized using WebSockets. The steps to solve this problem are as below:
Post files to upload server
Generate a unique operation ID and send it back to the client in the response
Ex: response = {op: "A78HNDGS89NSNBDV7826HDJ"}
Create a subscription by opID
Ex: subscription { uploadStatus(op: "A78HNDGS89NSNBDV7826HDJ") { status }}
Every time the status changes, send a request to the GraphQL endpoint, which publishes the data to the pubsub. To send a GraphQL request from the Node.js server you can use https://github.com/prisma-labs/graphql-request
Ex:
const { request } = require('graphql-request');

const GQL_URL = "YOUR_GQL_ENDPOINT";
const query = `query {
    notify(op: "A78HNDGS89NSNBDV7826HDJ", status: "Status text goes here")
}`;

request(GQL_URL, query).then(data =>
    console.log(data)
);
The notify resolver function then publishes the data to the pubsub:
context.pubsub.publish('uploadStatus', {
    status: "Status text"
});
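For completeness, a minimal sketch of what the matching resolvers could look like, assuming a PubSub from graphql-subscriptions (the field and argument names mirror the examples above):
const { PubSub, withFilter } = require('graphql-subscriptions');
const pubsub = new PubSub();

const resolvers = {
    Query: {
        // Publishes a status update for a given operation ID
        notify: (_, { op, status }) => {
            pubsub.publish('uploadStatus', { uploadStatus: { op, status } });
            return true;
        }
    },
    Subscription: {
        uploadStatus: {
            // Deliver only the events that belong to the requested operation ID
            subscribe: withFilter(
                () => pubsub.asyncIterator('uploadStatus'),
                (payload, variables) => payload.uploadStatus.op === variables.op
            )
        }
    }
};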
If you have a more complicated architecture, you can use message brokers like RabbitMQ, Kafka, etc.
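As mentioned above, the same pattern also works with plain WebSockets; here is a minimal sketch using socket.io (the event names and the room-per-operation idea are assumptions):
// Server side: one room per operation ID, progress events emitted into it
const io = require('socket.io')(httpServer);

io.on('connection', (socket) => {
    // The client joins a room named after the operation ID it got in the upload response
    socket.on('watch', (opId) => socket.join(opId));
});

// Call this from the upload/compression pipeline whenever the status changes
function reportProgress(opId, status, percent) {
    io.to(opId).emit('uploadStatus', { status: status, percent: percent });
}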
If someone knows other solutions, please let us know )
I'm working on building a snippet manager app. Through the interface you can create new snippets and edit them using a code editor, but what I'm stuck on is how to send the snippet code to my server via POST so it can create a new file for that snippet.
For example:
const getUser = async (name) => {
    let response = await fetch(`https://api.github.com/users/${name}`);
    let data = await response.json();
    return data;
}
One solution I can think of is to parse the code into a JSON equivalent that contains all the tokens, but for that I'd have to add a parser for every language and select one based on the language the user chose. I'm trying to figure out a way to avoid adding all those parsers, unless there isn't any other solution.
Another solution I can think of is to generate the file from the frontend and send that file through POST request.
My current stack is Node+React
The second solution is working for me right now. I've written the code below for it:
app.post("/create", isFileAttached, function(req, res) {
const { file } = req.files;
const saveLocation = `${saveTo}/${file.mimetype.split("/")[1]}`;
const savePath = `${saveLocation}/${file.name}`;
if (!fs.existsSync(saveLocation)) {
fs.mkdirSync(saveLocation, { recursive: true });
}
fs.writeFile(savePath, file.data.toString(), err => {
if (err) throw err;
res.status(200).send({ message: "The file has been saved!" });
});
});
With this solution I no longer have to add any parsers, since whatever's written in the files is no longer a concern.
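For reference, the matching client-side call can build the file in the browser, so no parser is needed there either. A rough sketch (the MIME type and endpoint are assumptions; the field name matches the handler above):
async function saveSnippet(code, filename) {
    // Wrap the editor contents in a File; the MIME type decides the target folder on the server
    const file = new File([code], filename, { type: 'text/javascript' });

    const formData = new FormData();
    formData.append('file', file); // matches `const { file } = req.files`

    const response = await fetch('/create', { method: 'POST', body: formData });
    return response.json();
}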
I am using multer-gridfs-storage and gridfs-stream to store my video in the backend (Express/Node). When I try to retrieve the file to play on my front end (React) the player refuses to recognize the source.
I am using Video-React to display the video on download. The download is successful: I get a binary string back from the backend, which I converted to a Blob.
try {
    fileBlob = new Blob([res.data], { type: res.headers['content-type'] });
} catch(err) {
    console.log('Error converting to blob');
    console.log(err);
}
This is my Video-React player being rendered
<Player
autoPlay
ref="player"
>
<source src={this.state.fileURL} />
<ControlBar autoHide={false} />
</Player>
Then I tried two techniques
readAsDataURL
let reader = new FileReader();
// rThis is just a reference to the parent function's `this`
reader.onload = function(event) {
    rThis.setState({fileURL: reader.result}, () => {
        rThis.refs.player.load();
    });
}

try {
    reader.readAsDataURL(fileBlob);
} catch(err) {
    console.log('Error trying readAsDataURL');
    console.log(err);
}
src is being set correctly but the video never loads
URL.createObjectURL
let vidURL = URL.createObjectURL(fileBlob);
rThis.setState({fileURL: vidURL}, () => {
    rThis.refs.player.load();
});
src is set to a blob: url but still nothing
Is this an issue with Video-React, or should I be doing something else? Any pointers to references I could look at would also help. What am I doing wrong? A data URL works in the case of images (I checked), but not video.
So after some more reading, I finally figured out the problem. Since I'm using gridfs-stream I'm actually piping the response from the server. So I was never getting the whole file, and trying to convert res.data, which is just a chunk, was a mistake. Instead, in my res object, I found the source url within the config property.
res.config.url
This contained my source url to which my server was piping the chunks. Should have figured it out earlier, considering I picked GridFS storage for precisely this reason.
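In other words, the FileReader/Blob conversion can be skipped entirely and the request URL itself handed to the player as the source (a sketch based on the same axios response object as above):
// res.config.url is the URL the request was made to, which the server streams the video from
rThis.setState({ fileURL: res.config.url }, () => {
    rThis.refs.player.load();
});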
I've been working on a small Twitter-like website to teach myself React. It's going fairly well, and I want to allow users to take photos and attach them to their posts. I found a library called React-Camera that seems to do what I want it to do: it brings up the camera and manages to save something.
I say something because I am very confused about what to actually do with what I save. This is the client-side code for the image capturing, which I basically just copied from the documentation:
takePicture() {
    try {
        this.camera.capture()
            .then(blob => {
                this.setState({
                    show_camera: "none",
                    image: URL.createObjectURL(blob)
                });
                console.log(this.state);

                this.img.src = URL.createObjectURL(blob);
                this.img.onload = () => { URL.revokeObjectURL(this.src); };

                var details = {
                    'img': this.img.src,
                };

                var formBody = [];
                for (var property in details) {
                    var encodedKey = encodeURIComponent(property);
                    var encodedValue = encodeURIComponent(details[property]);
                    formBody.push(encodedKey + "=" + encodedValue);
                }
                formBody = formBody.join("&");

                fetch('/newimage', {
                    method: 'post',
                    headers: {'Content-type': 'application/x-www-form-urlencoded;charset=UTF-8'},
                    body: formBody
                });
                console.log("Reqd post");
            });
    } catch (err) {
        console.log(err);
    }
}
But what am I actually saving here? For testing I tried adding an image to the site and setting src={this.state.img}, but that doesn't work. I can store this blob URL (which looks like, for example, blob:http://localhost:4000/dacf7a61-f8a7-484f-adf3-d28d369ae8db) or the image itself in my DB, but again the problem is I'm not sure what the correct way to go about this is.
Basically, what I want to do is this:
1. Grab a picture using React-Camera
2. Send this in a POST to /newimage
3. The image will then, in some form, be stored in the database
4. Later, a client may request an image that will be part of a post (i.e. a tweet can have an image). This will then display the image on the website.
Any help would be greatly appreciated, as I feel I am just getting more confused the more libraries I look at!
From your question I understand that you are storing the image in the DB itself.
If my understanding is correct, then that is a bad approach.
Instead:
You need to store the images in a project directory using your Node application.
You need to store the paths of the images in the DB.
Using these paths you can fetch the images and display them on the web page.
For uploading images with Node.js you can use the Multer package.
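A minimal sketch of that approach with Multer (the uploads folder, the 'image' field name, and the way the path is persisted are assumptions):
const express = require('express');
const multer = require('multer');
const path = require('path');

// Store uploaded images on disk with a unique filename
const storage = multer.diskStorage({
    destination: (req, file, cb) => cb(null, 'uploads/'),
    filename: (req, file, cb) => cb(null, Date.now() + path.extname(file.originalname))
});
const upload = multer({ storage: storage });

const app = express();

app.post('/newimage', upload.single('image'), (req, res) => {
    // Save only req.file.path in the database, e.g. as part of the post document
    res.json({ path: req.file.path });
});

// Serve the stored files so the React client can display them by path
app.use('/uploads', express.static('uploads'));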