I have a search in SuiteScript 2.0 that's working fine. But for each record the search brings back, I want to update a particular field (I use it elsewhere to determine that the record has been examined). It's potentially a very large result set, so I have to page it. Here's my code:
var sResult = mySearch.runPaged({ pageSize: 10 });
for (var pageIndex = 0; pageIndex < sResult.pageRanges.length; pageIndex++)
{
    var searchPage = sResult.fetch({ index: sResult.pageRanges[pageIndex].index });
    searchPage.data.forEach(function (result)
    {
        var name = result.getValue({ name: "altname" });
        result.setValue({
            name: 'tracker',
            value: new Date()
        });
    });
}
You can see where I have a call to result.setValue(), which is a non-existent function. But it shows where I want to update the 'tracker' field and what data I want to use.
Am I barking up the wrong tree entirely here? How do I update the field for each result returned?
As Simon says, you can't directly update a search result, but you can use the submitFields method.
This example is from the NetSuite documentation:
// 'record' here is the N/record module
var otherId = record.submitFields({
    type: 'customrecord_book', // record type
    id: '4',                   // record id
    values: {
        'custrecord_rating': '2'
    }
});
This approach uses fewer governance units than loading and saving the record.
AFAIK you can't directly update a search result and write the value back to the record; the record needs to be loaded first. Your snippet doesn't say what type of record you're searching for or want to load, but the general idea is (in place of your result.setValue line):
var loadedRecord = Record.load({type:'myrecordtype', id:result.id});
loadedRecord.setValue( {fieldId:'tracker', value: new Date() });
loadedRecord.save();
Keep in mind SuiteScript governance and the number of records you're modifying. Load and save too many and your script will terminate earlier than you expect.
That's a bad design: instead of using result.setValue inside the iteration, push the internal IDs into an "update" array; then, after the data.forEach, have another function loop through the update array and process each entry with record.submitFields() (see the sketch below).
Be careful of governance....
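For illustration, a minimal sketch of that pattern. It assumes mySearch is the search from the question, that record is the N/record module loaded in the script's define(), and a hypothetical record type 'customrecord_myrecord' and field id 'tracker':
// Inside define(['N/record'], function (record) { ... })
var toUpdate = [];
var sResult = mySearch.runPaged({ pageSize: 1000 }); // 1000 is the maximum page size
sResult.pageRanges.forEach(function (pageRange) {
    var searchPage = sResult.fetch({ index: pageRange.index });
    searchPage.data.forEach(function (result) {
        toUpdate.push(result.id); // collect internal IDs only
    });
});
toUpdate.forEach(function (id) {
    record.submitFields({
        type: 'customrecord_myrecord', // hypothetical - use your own record type
        id: id,
        values: { tracker: new Date() }
    });
});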
I am designing an item inventory system for a website that I am building.
The user's inventory is loaded from a Web API. This information is then processed so that it is more suited to my web app. I am trying to combine all the item records into one MongoDB collection - so other user inventories will be cached in the same place. What I have to deal with is deleting old item records if they are missing from the user's inventory (i.e. they sold it to someone) and also upserting the new items. Please note I have looked through several Stack Overflow questions about bulk upserts but I was unable to find anything about conditional updates.
Each item has two identifiers (classId and instanceId) that together allow me to look an item up (I have to use both IDs to match it), and these remain constant. Some information about the item, such as its name, can change, and I want to be able to update those records when I fetch new inventory information. I also want new items that my site hasn't seen before to be added to my database.
Once the data returned from the Web API has been processed, it is left in a large array of objects. This means I am able to use bulk writes; however, I am unsure how to perform conditional upserts across multiple records.
Here is part of my item schema:
const ItemSchema = new mongoose.Schema({
    ownerId: {
        type: String,
        required: true
    },
    classId: {
        type: String,
        required: true
    },
    instanceId: {
        type: String,
        required: true
    },
    name: {
        type: String,
        required: true
    }
    // rest of item attributes...
});
User inventories typically contain 600 or more items, with a max count of 2500.
What is the most efficient way of upserting this much data? Thank you
Update:
I have had trouble implementing the solution to the bulk upsert problem. I made a few assumptions and I don't know if they were right: I interpreted _ as lodash, response.body as the JSON returned by the API, and myListOfItems as that same array of items.
import Item from "../models/item.model";
import _ from 'lodash';

async function storeInventory(items) {
    let bulkUpdate = Item.collection.initializeUnorderedBulkOp();
    _.forEach(items, (data) => {
        if (data !== null) {
            let newItem = new Item(data);
            bulkUpdate.find({
                classId: newItem.classId,
                instanceId: newItem.instanceId
            }).upsert().updateOne(newItem);
            items.push(newItem);
        }
    });
    await bulkUpdate.execute();
}
Whenever I run this code, it throws an error complaining that the _id field is being changed, even though the objects I created don't specify anything to do with _id, and the few nested schema objects don't make a difference to the outcome when I change them to plain objects.
I understand that if no _id is sent to MongoDB it auto generates one, but if it is updating a record it wouldn't do that anyway. I also tried setting _id to null on each item but to no avail.
Have I misunderstood anything about the accepted answer? Or is my problem elsewhere in my code?
This is how I do it:
let bulkUpdate = MyModel.collection.initializeUnorderedBulkOp();
// myItems is your array of items
_.forEach(myItems, (item) => {
    if (item !== null) {
        let newItem = new MyModel(item);
        bulkUpdate.find({ yyy: newItem.yyy }).upsert().updateOne(newItem);
    }
});
await bulkUpdate.execute();
I think the code is pretty readable and understandable. You can adjust it to make it work with your case :)
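For the specific case in the question, one possible adaptation (my own sketch, not part of the original answer) is to match on classId and instanceId and send the plain item data in a $set rather than a full mongoose document, so the _id that mongoose generates for a new document never appears in the update; sending a full document with its own _id is one common cause of the "_id field cannot be changed" type of error:
import Item from "../models/item.model";
import _ from 'lodash';

async function storeInventory(items) {
    let bulkUpdate = Item.collection.initializeUnorderedBulkOp();
    _.forEach(items, (data) => {
        if (data !== null) {
            bulkUpdate.find({
                classId: data.classId,
                instanceId: data.instanceId
            }).upsert().updateOne({ $set: data }); // plain object from the API, assumed to carry no _id
        }
    });
    await bulkUpdate.execute();
}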
I have been looking for a way to pull records from System Notes in NetSuite. The lines below throw an 'INVALID_RCRD_TYPE' error:
var columns = new Array();
columns[0] = new nlobjSearchColumn('internalid').setSort();
var results = nlapiSearchRecord('systemnote', null, null, columns);
I wonder how to reference System Notes as the first argument of the nlapiSearchRecord API. Obviously, it's not called systemnote.
A similar question has been posted here, but System Notes is referenced incorrectly there.
System notes aren't available as a record type, which is evident from the Records Browser.
However, you can still get system notes fields using joined searches on any record type in NetSuite, e.g.:
var x = nlapiSearchRecord('vendor', null, null, [
    new nlobjSearchColumn('date', 'systemNotes'),
    new nlobjSearchColumn('name', 'systemNotes'), // Set By
    new nlobjSearchColumn('context', 'systemNotes'),
    new nlobjSearchColumn('newvalue', 'systemNotes'),
    new nlobjSearchColumn('oldvalue', 'systemNotes')
]);
x[0].getValue('name', 'systemNotes'); // gives the Set By value
Thanks for your responses, guys. I finally managed to query System Notes using the code below, and I thought I should share it in case someone else needs to accomplish the same job. I created a RESTlet in NetSuite that returns the list of customer records merged after a given date.
I created a new saved search with ID customsearch_mergedrecords; in the Criteria tab I added a filter on 'System Notes: New Value' set to 'starts with Merged with duplicates:', and in the Results tab I added the columns I needed.
Note that you need to create the new search on Customer, not on System Notes. System Notes is hooked into the search using a join (the second argument of the nlobjSearchFilter constructor).
function GetMergedRecordsAfter(input) {
    var systemNotesSearch = nlapiLoadSearch('customer', 'customsearch_mergedrecords');
    var filters = new Array();
    filters.push(new nlobjSearchFilter('date', 'systemNotes', 'notbefore', input.fromdate));
    systemNotesSearch.addFilters(filters);
    var resultSet = systemNotesSearch.runSearch();
    var searchResultJson = [];
    resultSet.forEachResult(function (searchResult) {
        var searchColumns = resultSet.getColumns();
        searchResultJson.push({
            ID: searchResult.getValue(searchColumns[0]),
            Name: searchResult.getValue(searchColumns[1]),
            Context: searchResult.getValue(searchColumns[2]),
            Date: searchResult.getValue(searchColumns[3]),
            Field: searchResult.getValue(searchColumns[4]),
            NewValue: searchResult.getValue(searchColumns[5]),
            OldValue: searchResult.getValue(searchColumns[6]),
            Record: searchResult.getValue(searchColumns[7]),
            Setby: searchResult.getValue(searchColumns[8]),
            Type: searchResult.getValue(searchColumns[9]),
            InternalId: searchResult.getValue(searchColumns[10])
        });
        return true;
    });
    return searchResultJson;
}
var Poll = mongoose.model('Poll', {
    title: String,
    votes: {
        type: Array,
        'default': []
    }
});
I have the above schema for my simple poll, and I am uncertain of the best method to change the value of the elements in my votes array.
app.put('/api/polls/:poll_id', function(req, res){
    Poll.findById(req.params.poll_id, function(err, poll){
        // I see the official MongoDB website use something like
        // db.collection.update(), but that doesn't apply here, right?
        // I have direct access to the "poll" object here.
        // Can I do something like
        // poll.votes[1] = poll.votes[1] + 1;
        // poll.save(); ?
    });
});
Help much appreciated.
You can do it as you have above, but of course this involves "retrieving" the document from the server, then making the modification and saving it back.
If you have a lot of concurrent operations doing this, then your results are not going to be consistent, as there is a high potential for "overwriting" the work of another operation that is trying to modify the same content. So your increments can go out of "sync" here.
A better approach is to use the standard .update() type of operations. These make a single request to the server and modify the document, and can even return the modified document, as with .findByIdAndUpdate():
Poll.findByIdAndUpdate(req.params.poll_id,
    { "$inc": { "votes.1": 1 } },
    function (err, doc) {
        // handle err / doc here
    }
);
So the $inc update operator does the work of modifying the array at the specified position using "dot notation". The operation is atomic, so no other operation can modify the field at the same time; if something had been issued just before, the result would be correctly incremented by that operation and then by this one, returning the correct data in the result document.
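If you don't need the document back, a minimal sketch of the same increment with a plain update (my own illustration, using the same Poll model and route parameter as above) would be:
Poll.update(
    { _id: req.params.poll_id },
    { "$inc": { "votes.1": 1 } },
    function (err, result) {
        // check err; result describes the write outcome rather than returning the document
    }
);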
I'm trying to figure out the best way to track changes to fields when using mongoose.js. For example, every time the name field on an object is set, I want to add a new entry to that object's history (as an embedded document) that looks something like { field: 'name', previous: 'foo', current: 'bar', date: '3/06/2012 9:06 am' }.
I started by trying to use a plug-in that hooks .pre('save'), but I can't figure out which fields have been modified without grabbing the old values from the database and comparing them myself. Then I thought I could use custom setters, but I ran into the same problem - I don't know which field was modified. Currently I'm left with doing something like this, which hard-codes the field name into the setter:
var comment = new Schema({
    name    : { type: String, set: trackName },
    history : [Change]
});

// A function declaration, so trackName is hoisted and defined when the schema above is created.
function trackName(val) {
    var change = new Change;
    change.field = 'name';
    change.previous = this.name;
    change.current = val;
    change.date = Date.now();
    this.history.push(change);
    return val;
}
But this means I need a custom setter for each field name that I want to track. I'm guessing there must be a better way to accomplish this.
Looks like I missed Document.modifiedPaths(). This does exactly what I need to determine which fields have been modified.
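For anyone else landing here, a rough sketch of how modifiedPaths() could be combined with a pre('save') hook (my own illustration, not from the original post). It assumes the comment schema and Change sub-schema from the question, snapshots the loaded values in a post('init') hook so "previous" values are available, and only handles simple top-level paths:
comment.post('init', function (doc) {
    // keep a copy of the values as loaded so "previous" can be recorded later
    doc._original = doc.toObject();
});

comment.pre('save', function (next) {
    var doc = this;
    if (!doc.isNew && doc._original) {
        doc.modifiedPaths().forEach(function (path) {
            if (path === 'history') return; // don't track changes to the history array itself
            doc.history.push({
                field: path,
                previous: doc._original[path],
                current: doc.get(path),
                date: Date.now()
            });
        });
    }
    next();
});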
NodeJS + Express, MongoDB + Mongoose
I have a JSON feed where each record has a set of "venue" attributes (things like "venue name" "venue location" "venue phone" etc). I want to create a collection of all venues in the feed -- one instance of each venue, no dupes.
I loop through the JSON and test whether the venue exists in my venue collection. If it doesn't, save it.
jsonObj.events.forEach(function(element, index, array){
    Venue.findOne({ 'name': element.vname }, function(err, doc){
        if (doc == null){
            var instance = new Venue();
            instance.name = element.vname;
            instance.location = element.location;
            instance.phone = element.vphone;
            instance.save();
        }
    });
});
Desired: A list of all venues (no dupes).
Result: Plenty of dupes in the venue collection.
Basically, the loop created a new Venue record for every record in the JSON feed.
I'm learning Node and its async qualities, so I believe the loop finishes before even the first save() call finishes, meaning the if statement is always checking against an empty collection. Console logging backs this claim up.
I'm not sure how to rework this so that it performs the desired task. I've tried caolan's async module but I can't get it to help. There's a good chance I'm using it incorrectly.
Thanks so much for pointing me in the right direction -- I've searched to no avail. If the async module is the right answer, I'd love your help with how to implement it in this specific case.
Thanks again!
Why not go the other way with it? You didn't say what your persistence layer is, but it looks like mongoose or possibly FastLegS. In either case, you can create a Unique Index on your Name field. Then, you can just try to save anything, and handle the error if it's a unique index violation.
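A minimal sketch of that approach with mongoose (my own illustration; the schema fields are assumed from the attributes mentioned in the question):
var mongoose = require('mongoose');

// A unique index on name means MongoDB itself rejects duplicates.
var VenueSchema = new mongoose.Schema({
    name:     { type: String, unique: true },
    location: String,
    phone:    String
});
var Venue = mongoose.model('Venue', VenueSchema);

jsonObj.events.forEach(function (element) {
    new Venue({
        name: element.vname,
        location: element.location,
        phone: element.vphone
    }).save(function (err) {
        // 11000/11001 are MongoDB's duplicate key error codes - ignore those, surface anything else
        if (err && err.code !== 11000 && err.code !== 11001) console.error(err);
    });
});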
Whatever you do, you must do as #Paul suggests and make a unique index in the database. That's the only way to ensure uniqueness.
But the main problem with your code is that in the instance.save() call, you need a callback that triggers the next iteration, otherwise the database will not have had time to save the new record. It's a race condition. You can solve that problem with caolan's forEachSeries function.
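A rough sketch of that with caolan's async module (my own illustration), where each save must finish before the next event is checked:
var async = require('async');

async.forEachSeries(jsonObj.events, function (element, callback) {
    Venue.findOne({ 'name': element.vname }, function (err, doc) {
        if (err) return callback(err);
        if (doc) return callback(); // venue already exists, move on
        var instance = new Venue({
            name: element.vname,
            location: element.location,
            phone: element.vphone
        });
        instance.save(callback); // only continue once the save has completed
    });
}, function (err) {
    if (err) console.error(err);
});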
Alternatively, you could get an array of records already in the Venue collection that match an item in your JSON object, then filter the matches out of the object, then iteratively add each item left in the filtered JSON object. This will minimize the number of database operations by not trying to create duplicates in the first place.
Venue.find({ 'name': { $in: jsonObj.events.map(function(event){ return event.vname; }) } }, function (err, docs){
    var existingVnames = docs.map(function(doc){ return doc.name; });
    var filteredEvents = jsonObj.events.filter(function(event){
        return existingVnames.indexOf(event.vname) === -1;
    });
    filteredEvents.forEach(function(event){
        var venue = new Venue();
        venue.name = event.vname;
        venue.location = event.location;
        venue.phone = event.vphone;
        venue.save(function (err){
            // Optionally, do some logging here, perhaps.
            if (err) return console.error('Something went wrong!');
            else return console.log('Successfully created new venue %s', venue.name);
        });
    });
});