How do I push an object to a JSON file with javascript? - node.js

So, I have my JSON file:
{
  HomeWork: []
}
And I want to push an object:
{Title: T, Due: D, Description: Desc} to the JSON file. How should I do that? I am using this right now:
async function AddWork(obj) {
  workData.HomeWork.push(obj);
  // RNFS.writeFile('../../json/work.js', JSON.stringify(workData));
}
I tried with the RNFS line, but it did not help.

There is no in-place "edit mode" for a file, so be careful.
When you read and modify the JSON, you are doing it in memory only, so you need to overwrite the file you are working on: read it using fs's read functions, apply your modification, and then write the result back over the original file.
I hope it helps <3

Related

Is it possible in Node to read in a file and export its contents (without exporting asynchronously)?

TLDR: I want to read in a file's contents and then export a function which relies on those contents ... without making that exported function use promises or some other form of asynchronicity.
I'm trying to write an XML-validating module, and in order for it to do its thing I need to read in an XSD file. However, this only needs to happen once at "load time", so ideally I'd rather not have other modules that use my function have to wait for a promise to resolve to get their results. If I were using Webpack this would be easy, as I could use its text file loader to bring in the XSD as if it were any other module ... but unfortunately I'm not.
In other words, currently I have to do (borderline pseudo-code):
module.exports.validate = () =>
  new Promise((resolve) =>
    fs.readFile(path, (err, file) => {
      // use file to validate, then:
      resolve(validationResult);
    })
  );
and instead I'd like to do:
fs.readFile(path, (err, file) => {
  module.exports.validate = myValidationFunction;
});
But the above doesn't work because you can't export from callbacks, so my question is, is there any other way to accomplish this?
The https://github.com/jonschlinkert/to-exports library seems to offer exactly this, so it seems like it's possible ... but it doesn't work for me :(
P.S. At worst I could literally wrap the contents of the file inside the template string characters, rename the file to be .js, and export it that way:
module.exports = `*XSD contents go here*`;
However, that seems very kludgy, so I'm hoping there is a better way.
If you want to read a file synchronously, then use fs.readFileSync. It returns the contents of the file or throws an error.

How can I display files that have been saved as a Buffer?

I am saving files as Buffers in my mongo database (using mongoose, nodejs, electron). For now, I'm keeping it simple with text-only files. I read a file in using
fs.readFile(filePath, function(err, data) {
  if (err) { console.log(err); }
  typeof callback == "function" && callback(data);
});
Then I create a new file in my database using the data variable. And, now I have something that looks like BinData(0,"SGVsbG8gV29ybGQK") stored in my mongodb. All is fine so far.
Now, what if I wanted to display that file in the UI? In this case, in Electron? I think there are two steps.
Step 1 The first is bringing this variable out of the DB and into the front-end. FYI: The model is called File and the variable that stores the file contents is called content.
So, I've tried File.content.toString() which gives me Object {type: "Buffer", data: Array[7]} which is not the string I'm expecting. What is happening here? I read here that this should work.
Step 2 Display this file. Now, since I'm only using text files right now, I can just display the string I get once Step 1 is working. But, is there a good way to do this for more complex files? Images and GIFs and such?
You should save the file's MIME type along with its content, and then set that MIME type in the response header before writing the binary data:
response.setHeader("Content-Type", "image/jpeg");
response.write(binary);
response.end();
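For Step 1, the Object {type: "Buffer", data: Array[7]} shape suggests the buffer was JSON-serialized on its way to the front-end; a sketch of rebuilding it (the byte values here are illustrative, not the asker's actual data):

```javascript
// What a Buffer looks like after a round trip through JSON:
const serialized = { type: 'Buffer', data: [72, 101, 108, 108, 111] };

// Rebuild a real Buffer from the plain object, then decode it as text.
const buf = Buffer.from(serialized.data);
console.log(buf.toString('utf8')); // "Hello"
```

Calling .toString() on the plain object itself is what produces the unexpected result: by that point it is no longer a Buffer.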

Jenkins: extended choice parameter - groovy - how to create file on the master

I have a JSON string defined in the Groovy script part of the 'extended choice parameter' plugin. Additionally, I want to write the JSON config to a file on the master side, from inside the Groovy script area. I thought maybe the job directory would be the best place?
http://hudson/hudson/job/MY_JOB/config.json
If you ask why I should do this: the reason is that I don't want the config pre-saved somewhere else. I don't like the idea of configuring the file outside of the job config. I want to see/adjust configs in one place: the job config.
I also need a lot of other information from the JSON config for later use in a Python code section within the same job.
My questions are:
Am I following a wrong path here? Any suggestions?
Can I write the JSON config directly on the master side? It doesn't have to be the Jenkins job directory; I don't care about the device/directory.
If the approach is acceptable, how can I do this?
The following code doesn't work:
def filename = "config.json"
def targetFile = new File(filename)
if (targetFile.createNewFile()) {
    println "Successfully created file $targetFile"
} else {
    println "Failed to create file $targetFile"
}
Remark:
hudson.FilePath looks interesting!
http://javadoc.jenkins-ci.org/hudson/FilePath.html
Thanks for your help, Simon
I got it:
import groovy.json.*

// location on the master: /srv/raid1/hudson/jobs
jsonConfigFile = new File("/srv/raid1/hudson/jobs/MY_JOB/config.json")
jsonConfigFileOnMaster = new hudson.FilePath(jsonConfigFile)
if (jsonConfigFileOnMaster.exists()) {
    jsonConfigFileOnMaster.delete()
}
jsonConfigFileOnMaster.touch(System.nanoTime())
jsonFormatted = JsonOutput.toJson(localJsonString)
jsonConfigFile.write jsonFormatted

Nodejs: parsing XML, editing values, saving the end result using sax-js module

What I'd like to achieve is:
parsing a chunk of XML
editing some values
saving the end result in a
new xml file
The module is sax-js: https://github.com/isaacs/sax-js#readme
The module has some built-in mechanisms for reading from and writing to streams.
I thought the task would be a piece of cake; on the contrary I have been struggling with it for the whole day.
Here is my code:
var fs = require('fs');
var saxStream = require("sax").createStream(true);

saxStream.on("text", function (node) {
  if (node === 'foo') { // the content I want to update
    node = 'blabla';
  }
});

fs.createReadStream("mysongs.xml")
  .pipe(saxStream)
  .pipe(fs.createWriteStream("mysongs-copy.xml"));
I did think that updating some content (see the comment above) would suffice to write the updated stream into a new file.
What's wrong with this code?
Thanks for your help,
Roland
The sax module doesn't let you modify nodes like that. If you take a look at this bit of code, you'll see that the input is passed indiscriminately to the output.
All hope is not, however, lost! Check out the pretty-print example - it would be a good starting point for what you want to do. You'd have to do a bit of work to implement the readable part of the stream, though, if you still want to be able to .pipe() out of it.
If you know the general structure of the XML, you can try xml-flow. It converts an XML stream into objects, but has a utility to convert them back to xml strings:
https://github.com/matthewmatician/xml-flow
Based on deoxxa's answer, I wrote an NPM module for this: https://www.npmjs.com/package/sax-streamer

How can I secure use of fopen on a CSV file?

I have a PHP script which allows a user to upload a CSV, and then make some changes via an API.
I use fopen to open and access the file after it's been uploaded. I check for size, name, presence of known bad extensions etc using the $_FILES array on upload.
The data is simply a grid of ID's and corresponding action codes.
It's a closed group of users, and nothing from this input is being include()'ed or require()'d, but I am still concerned that something bad could happen through a manipulated upload.
if (($han = fopen($fileloc, "r")) !== false) {
    while (($data = fgetcsv($han, 50, ",")) !== false) {
        array_push($stack, $data);
    }
    fclose($han);
}
The only thing I can see is: when echoing the data back as HTML (echo or print), use htmlspecialchars(), just to be on the safe side.
Hope that helps! ^_^