Append to an existing array of JSON objects on file using rapidjson

I have an array of JSON objects similar to below:
[
  {"hello": "rapidjson", "t": true, "f": false, "n": null, "i": 2, "pi": 3.1416},
  {"hello": "rapidjson", "t": true, "f": false, "n": null, "i": 12, "pi": 3.88},
  {"hello": "rapidjson", "t": true, "f": false, "n": null, "i": 14, "pi": 3.99}
]
My application spits out a bunch of JSON objects that I need to add to a JSON file, let's say every 30 seconds.
Each round I need to append to the same file, adding the new JSON objects to the array of JSON objects already in it. The first entry of each JSON file is the JSON schema.
The problem I am facing is that I do not know how to read the previous file each time, add the new objects to the existing array in the file, and write the updated file back.
Could you please provide guidance on what needs to be done, or point me to a couple of examples? (I couldn't find a similar example in the tutorial.)

Assuming the current array of JSON objects on file is:
[{"one": "1"}, {"two": "2"}]
and we want to add a JSON object {"three": "3"} and write it back to the same file, so that the final file looks like this: [{"one": "1"}, {"two": "2"}, {"three": "3"}]
Here is the complete list of steps to take:
using namespace rapidjson;

// read and parse the existing array from file
FILE* fp = fopen(json_file_name.c_str(), "r");
char readBuffer[65536];
FileReadStream is(fp, readBuffer, sizeof(readBuffer));
Document d, d2;
d.ParseStream(is);
assert(d.IsArray());
fclose(fp);

// build the new object; d2 is only used to supply an allocator here
// (d.GetAllocator() would work just as well)
d2.SetObject();
Value json_objects(kObjectType);
json_objects.AddMember("three", "3", d2.GetAllocator());
d.PushBack(json_objects, d2.GetAllocator());

// write the updated array back to the same file
FILE* outfile = fopen(json_file_name.c_str(), "w");
char writeBuffer[65536];
FileWriteStream os(outfile, writeBuffer, sizeof(writeBuffer));
Writer<FileWriteStream> writer(os);
d.Accept(writer);
fclose(outfile);
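
As a follow-up (not part of the original answer): the same approach works for objects with several members, like the ones in the question. The sketch below is an assumption on my part; it uses a hypothetical helper name appendRecord, the parsed document's own allocator (so no second Document is needed), and PrettyWriter from rapidjson/prettywriter.h for readable output.

#include "rapidjson/document.h"
#include "rapidjson/filereadstream.h"
#include "rapidjson/filewritestream.h"
#include "rapidjson/prettywriter.h"
#include <cassert>
#include <cstdio>
#include <string>

using namespace rapidjson;

// Hypothetical helper: append one record shaped like the objects in the question.
void appendRecord(const std::string& json_file_name, int i, double pi)
{
    // read and parse the existing array
    FILE* fp = fopen(json_file_name.c_str(), "r");
    char readBuffer[65536];
    FileReadStream is(fp, readBuffer, sizeof(readBuffer));
    Document d;
    d.ParseStream(is);
    fclose(fp);
    assert(d.IsArray());

    // build the new object with the document's own allocator
    Document::AllocatorType& a = d.GetAllocator();
    Value obj(kObjectType);
    obj.AddMember("hello", "rapidjson", a);
    obj.AddMember("i", i, a);
    obj.AddMember("pi", pi, a);
    d.PushBack(obj, a);

    // write the updated array back, pretty-printed
    FILE* out = fopen(json_file_name.c_str(), "w");
    char writeBuffer[65536];
    FileWriteStream os(out, writeBuffer, sizeof(writeBuffer));
    PrettyWriter<FileWriteStream> writer(os);
    d.Accept(writer);
    fclose(out);
}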

This post is from some years ago, but my answer is still relevant. I had the same problem as @SamZ. This answer improves on @SamZ's answer in several respects:
No reading or parsing of the existing file is needed
The new object is appended directly to the existing file, without merging documents
Here is the code:
bool appendToFile(const std::string& filename, const rapidjson::Document& document)
{
    using namespace rapidjson;

    // create the file with an empty array if it doesn't exist yet
    if (FILE* fp = std::fopen(filename.c_str(), "r"); !fp)
    {
        fp = std::fopen(filename.c_str(), "w");
        if (!fp)
            return false;
        std::fputs("[]", fp);
        std::fclose(fp);
    }
    else
    {
        std::fclose(fp); // the file already exists; just close the probe handle
    }

    // add the document to the file
    if (FILE* fp = std::fopen(filename.c_str(), "rb+"); fp)
    {
        // the file must start with '['
        std::fseek(fp, 0, SEEK_SET);
        if (std::getc(fp) != '[')
        {
            std::fclose(fp);
            return false;
        }
        // is the array still empty?
        bool isEmpty = (std::getc(fp) == ']');

        // the file must end with ']'
        std::fseek(fp, -1, SEEK_END);
        if (std::getc(fp) != ']')
        {
            std::fclose(fp);
            return false;
        }

        // position on the closing ']'; if the array already has elements,
        // overwrite it with ',' so the new element is separated correctly
        std::fseek(fp, -1, SEEK_END);
        if (!isEmpty)
            std::fputc(',', fp);

        // append the document
        char writeBuffer[65536];
        FileWriteStream os(fp, writeBuffer, sizeof(writeBuffer));
        Writer<FileWriteStream> writer(os);
        document.Accept(writer);
        os.Flush(); // make sure the buffered JSON reaches fp before writing to it directly

        // close the array again
        std::fputc(']', fp);
        std::fclose(fp);
        return true;
    }
    return false;
}
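
For completeness, a small usage sketch (my own assumption, not from the original answer): build a single record as its own Document and hand it to appendToFile. The filename records.json is arbitrary, and the rapidjson headers used by appendToFile are assumed to be included.

#include "rapidjson/document.h"
#include "rapidjson/filewritestream.h"
#include "rapidjson/writer.h"
#include <cstdio>
#include <string>

int main()
{
    rapidjson::Document record;
    record.SetObject();
    auto& a = record.GetAllocator();
    record.AddMember("hello", "rapidjson", a);
    record.AddMember("i", 14, a);
    record.AddMember("pi", 3.99, a);

    // each call appends one more element to the array stored in records.json
    return appendToFile("records.json", record) ? 0 : 1;
}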

Related

Mongoose handle base64/binary encode and decode

I'm struggling to wrap my head around this; I've read all the posts I can find on it, but I'm still not making sense of the results I'm getting. I'm using react-signature-canvas, which produces its output as a base64 string. I then seem to be successfully saving this through Mongoose with:
let data = req.body.submittedSig.data;
let split = data.split(',');
let base64string = split[1];
const buffer = Buffer.from(base64string, 'base64');
//...//
submittedSig: {
data: buffer,
contentType: plan.submittedSig.contentType
},
this shows in the db as:
submittedSig: {
    data: BinData(0, 'iVBORw0KGgoAAAANSUhEUgAAAJYAAABsCAYAAACICEudAAANIElEQVR4Xu2dSwgsRxWGjaj4CKjgxpVzjYrowidowOAoxMcmKgqi…'),
    contentType: "image/png"
}
When trying to decode, I've tried several approaches, such as the .toBase64() mongo function (https://www.mongodb.com/docs/atlas/app-services/functions/globals/#bson--binary-json-), but I get a 'not a function' error; otherwise I've played with various combinations of Buffer.from(data, 'base64') and .toString('base64') (https://nodejs.org/api/buffer.html#static-method-bufferfromstring-encoding).
const servicePlan = await ServicePlan.findOne({ _id: req.params.planId })
// let sig = Buffer.from(servicePlan.submittedSig.data, 'base64').toString('base64');
// sig = servicePlan.submittedSig.data.toString('base64');
sig = servicePlan.submittedSig.data.toBase64();
servicePlan.submittedSig.data = sig
console.log("sig plan: ", servicePlan)
res.send(servicePlan)
For any combination of Buffer.from or .toString, whether used separately or together, the console.log output is
data: new Binary(Buffer.from("6956424f5277304b ... 414141456c46546b5375516d4343", "hex"), 0),
and the return comes as:
"submittedSig": {
"data": {
"type": "Buffer",
"data": [
137,
80,
78,
71,
13,
10,
26,
10,
0,
0,
0,
13,
...
]
So as far as I can tell, the encoding takes it into binary, but none of my output is decoding back to base64. On top of that, the end of the console.log shows 'hex', which had me trying to convert the binary as if it were stored as a hex object, but that hasn't worked either. I'm sure it's something simple that I just don't get with these; I appreciate any guidance.

Deserialize json one record at a time

I am working with large json files and memory is a concern. I would like to read one object into memory at a time from file. Is this possible?
The ServiceStack.Text docs say there is an API using a reader/stream, but I can't see how to get that working. The files are too large to deserialize in one go. Is it possible to handle this scenario with SS?
Thanks
No, you'll want to use a streaming JSON parser such as System.Text.Json's Utf8JsonReader. This is the example from the System.Text.Json introductory page:
byte[] data = Encoding.UTF8.GetBytes(json);
Utf8JsonReader reader = new Utf8JsonReader(data, isFinalBlock: true, state: default);

while (reader.Read())
{
    Console.Write(reader.TokenType);

    switch (reader.TokenType)
    {
        case JsonTokenType.PropertyName:
        case JsonTokenType.String:
        {
            string text = reader.GetString();
            Console.Write(" ");
            Console.Write(text);
            break;
        }

        case JsonTokenType.Number:
        {
            int value = reader.GetInt32();
            Console.Write(" ");
            Console.Write(value);
            break;
        }

        // Other token types elided for brevity
    }

    Console.WriteLine();
}

Insert a mgo query []M.bson result into a file.txt as a string

I have to insert into a file the result of an mgo MongoDB query in Go, in order to get the ids of the images.
var path = "/home/Medo/text.txt"

pipe := cc.Pipe([]bson.M{
    {"$unwind": "$images"},
    {"$group": bson.M{"_id": "null", "images": bson.M{"$push": "$images"}}},
    {"$project": bson.M{"_id": 0}}})

response := []bson.M{}
errResponse := pipe.All(&response)
if errResponse != nil {
    fmt.Println("error Response: ", errResponse)
}
fmt.Println(response) // print to make sure it is working

data, err := bson.Marshal(&response)
if err != nil {
    fmt.Println("error insertion ", err)
}
s := string(data)
Here is the part where I have to create a file and write to it.
The problem is that when I write the result of the query to the text file, each value ends with an enumeration number, for example:
id of images
23456678`0`
24578689`1`
23678654`2`
12890762`3`
76543890`4`
64744848`5`
So each value ends with a number, and I can't figure out why. After getting the response from the query I converted the BSON to []byte and then to string, but I keep getting those enumerated values at the end of each result.
I'd like to drop those 0 1 2 3 4 5.
var _, errExistFile = os.Stat(path)
if os.IsNotExist(errExistFile) {
    var file, errCreateFile = os.Create(path)
    if isError(errCreateFile) {
        return
    }
    defer file.Close()
}
fmt.Println("==> done creating file", path)

var file, errii = os.OpenFile(path, os.O_RDWR, 0644)
if isError(errii) {
    return
}
defer file.Close()

// write some text line-by-line to file
_, erri := file.WriteString(s)
if isError(erri) {
    return
}
erri = file.Sync()
if isError(erri) {
    return
}
fmt.Println("==> done writing to file")
fmt.Println("==> done writing to file")
You could declare a simple struct, e.g.
type simple struct {
    ID    idtype `bson:"_id"`
    Image int    `bson:"images"`
}
The function to put the image ids into the file would be
// open file stuff…
result := simple{}
iter := collection.Find(nil).Iter()
for iter.Next(&result) {
    file.WriteString(fmt.Sprintf("%d\n", result.Image))
}
iter.Close()

Emit to specific sockets (like a whisper) which contain a specific socket.name

I have my sockets stored like this in an object "people", but now I would like to find matches between people.name and an array like ["4323","9","43535"], for example 9, meaning in this case extract the "OGyF_FMFbsr0ldcbAAAK" socket.
In a few words: go through ["4323","9","43535"], find which of them are in people, and then emit a notification to the socket(s) whose people.name matches (e.g. people.name === "9"). It could be more than one socket.
So.
for each "attending"
["4323","9","43535"]
in "people"
{
"ZA-CJOc1PtiwDVxkAAAD":
{"name":"4","owns":"2-0-62","inroom":"2-0-62","device":"desktop"},
"wKg2rcFSHgcl4m3WAAAG":
{"name":"3","owns":"2-0-110","inroom":"2-0-110","device":"desktop"},
"OGyF_FMFbsr0ldcbAAAK":
{"name":"9","owns":null,"inroom":null,"device":"desktop"}
}
then emit
io.sockets.socket(id).emit("notification", result);
QUESTIONS:
How do I write the code to select the sockets to send the notification to?
How would I then emit the notification to each one?
Thanks in advance
If I understand what you're asking correctly, then one way to do this is to iterate over the keys of your people object, compare the name properties of each of them with the elements in your attending array, and push any matching keys into a new array found to get a list of people whose name is found in your attending list.
You can then iterate over the found array to emit messages to clients in your people object that match your search criteria.
var attending = ['4323', '9', '43535'],
found = [];
var people = {
'ZA-CJOc1PtiwDVxkAAAD': {
'name': '4', 'owns': '2-0-62', 'inroom': '2-0-62', 'device': 'desktop'
},
'wKg2rcFSHgcl4m3WAAAG': {
'name': '3', 'owns': '2-0-110', 'inroom': '2-0-110', 'device': 'desktop'
},
'OGyF_FMFbsr0ldcbAAAK': {
'name': '9', 'owns': null, 'inroom': null, 'device': 'desktop'
}
};
for (var person in people) {
for (var i = 0, numAttending = attending.length; i < numAttending; i++) {
if (people[person].name === attending[i]) {
found.push(person);
}
}
}
for (var i = 0, numFound = found.length; i < numFound; i++) {
io.sockets.socket(found[i]).emit('notification', result);
};
Edit
If you want to push whole objects onto your found array, you could do it like this. Since the emit loop still needs the client id, this version keeps the socket id together with the matching object instead of storing the object alone.
for (var person in people) {
  for (var i = 0, numAttending = attending.length; i < numAttending; i++) {
    if (people[person].name === attending[i]) {
      // keep the socket id together with the matching person object, e.g.
      // { id: "OGyF_FMFbsr0ldcbAAAK", info: { "name": "9", "owns": null, "inroom": null, "device": "desktop" } }
      found.push({ id: person, info: people[person] });
    }
  }
}
for (var i = 0, numFound = found.length; i < numFound; i++) {
  io.sockets.socket(found[i].id).emit('notification', result);
}

Get previously bound data

Assume I'm using a slightly modified version of the example code from the selection.data() API docs,
var matrix = [
[11975, 5871, 8916, 2868],
[ 1951, 10048, 2060, 6171],
[ 8010, 16145, 8090, 8045],
[ 1013, 990, 940, 6907]
];
var tr = d3.select("body").append("table").selectAll("tr")
.data(matrix, function(d) { return d[0]; })
.enter().append("tr");
var td = tr.selectAll("td")
.data(function(d) { return d; })
.enter().append("td")
.text(function(d) { return d; });
On a subsequent update of my matrix 2d array, I want to catch (and do something with...) any table cell that changes. Eg.
// updated matrix
var matrix2 = [
[11975, 5871, 8916, 2868],
[ 1951, 10048, 2060, 6171],
[ 8010, 16145, 8090, 999999999],
[ 1013, 990, 940, 6907]
];
// bind the new data
var tr = d3.select("table").selectAll("tr")
.data(matrix2, function(d) { return d[0]; });
var cells = tr.selectAll("td")
.data(function(d) { return d; });
var updatedCells = cells.filter(function(d, i) {
  // HOWTO get previously bound value for cell???
  var prevCellValue = null;
  return prevCellValue != d;
});
In the update selection resulting from a join, is there a way to retrieve the previously bound value for a given selection? Once I've called selection.data(newData), it seems like I've lost the previously bound data. I can call selection.data() and temporarily store the output to a variable before binding new data to the DOM element, but it seems awkward (esp. for this 2D array example) to index the previously bound data within the anonymous function passed to, for example, the selection.filter().
(BTW, I tagged "svg" because my actual example uses SVG elements, so I previously tried this.textContent in my selection.filter() function. Unfortunately, this.textContent already had the newly bound data value for the given cell.)
EDIT: this.textContent "sort of" has the previously bound data, but it's potentially processed. I'd prefer the raw, unaltered data if possible.
D3 doesn't provide a way to get back the previously bound data. In your case, you might want to consider storing the data value in an attribute of the element it is bound to so that you can compare it later, i.e. something like
.data(...)
.attr("myvalue", function(d) { return d; });
Then you should be able to do something like
cells.filter(function(d) { return d3.select(this).attr("myvalue") != d; });
