I'm using the solr-client module for nodejs to query my solr collections.
Now I'm trying to add to and update collections in my backend code using solr-client.
I've tried http://lbdremy.github.io/solr-node-client/code/add.js.html successfully to add data to a collection, but I don't know how to update records.
I've tried using this method (all methods can be found here: http://lbdremy.github.io/solr-node-client/code/solr.js.html):
/**
 * Send an update command to the Solr server with the given `data` stringified in the body.
 *
 * @param {Object} data - data sent to the Solr server
 * @param {Function} callback(err,obj) - a function executed when the Solr server responds or an error occurs
 * @param {Error} callback().err
 * @param {Object} callback().obj - JSON response sent by the Solr server deserialized
 *
 * @return {Client}
 * @api private
 */
Client.prototype.update = function (data, callback) {
    var self = this;
    this.options.json = JSON.stringify(data);
    this.options.fullPath = [this.options.path, this.options.core, 'update/json?commit=' + this.autoCommit + '&wt=json']
        .filter(function (element) {
            if (element) {
                return true;
            }
            return false;
        })
        .join('/');
    updateRequest(this.options, callback);
    return self;
};
But how does this method know which records to update? Does it search for PKs in the data parameter, and does a record get updated when one matches a PK in the collection? And does it need an extra commit?
But how does this method know which records to update? SEE BELOW
Does it search for PKs in the data parameter, and does a record get updated when one matches a PK in the collection? - YES
And does it need an extra commit? - YES
Technically, you can use INSERT as well as UPDATE; they are the same in Solr.
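For illustration, here is a minimal sketch (assuming a core named 'mycollection' whose uniqueKey field is `id`; the field names are made up). Re-adding a document whose `id` already exists replaces the stored record, and an explicit commit makes the change visible:

var solr = require('solr-client');
var client = solr.createClient({ host: 'localhost', port: 8983, core: 'mycollection' });

// Adding a document whose `id` already exists overwrites (updates) the old record.
var doc = { id: '1234', title_t: 'Updated title' };
client.add(doc, function (err, obj) {
    if (err) return console.error(err);
    // Without autoCommit, an explicit commit is needed to make the change visible.
    client.commit(function (err, res) {
        if (err) return console.error(err);
        console.log('Document updated and committed');
    });
});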
I have a UserEvent script for Purchase Orders. When a Purchase Order is created from the NetSuite UI, the script works fine, but when a Purchase Order is created via a SOAP XML request, the script does not write anything to the SuiteScript Execution Log.
I have checked all of the following parameters for executing the script:
1- Setup > Integration > SOAP Web Services Preferences > RUN SERVER SUITESCRIPT AND TRIGGER WORKFLOWS (checked true)
2- APPLIES TO -> PurchaseOrder
3- Log Level -> Debug
4- Status -> Testing/Release (checked in Both case)
5- Deployed -> checked
6- Inactive -> false
7- Roles -> All Roles / Specific Roles (checked with both)
Meanwhile, a UserEvent script for Items works fine in both the UI and SOAP request cases.
TIA
/**
 * @NApiVersion 2.x
 * @NScriptType UserEventScript
 * @NModuleScope Public
 */
define(["N/log"], function (log) {
    var exports = {};

    /**
     * Function definition to be triggered before the record is submitted.
     *
     * @param {Object} scriptContext
     * @param {Record} scriptContext.newRecord - New record
     * @param {Record} scriptContext.oldRecord - Old record
     * @param {string} scriptContext.type - Trigger type
     * @Since 2015.2
     */
    function beforeSubmit(scriptContext) {
        log.debug({ "title": "Before Submit", "details": "Before Submit Event" });
    }

    function afterSubmit(scriptContext) {
        log.debug({ "title": "After Submit", "details": "After Submit Event" });
    }

    exports.beforeSubmit = beforeSubmit;
    exports.afterSubmit = afterSubmit;
    return exports;
});
I've configured a GCS bucket to automatically delete objects stored within it after 20 days (I did this via the GCP web UI). When I reference the bucket object in Node.js, how can I get the number of days it's configured for aging out?
The GCS Lifecycle reference is here, but it does not provide examples.
To get the number of days the bucket is configured for aging objects out, that is, the value you set for your bucket in the Console, you must obtain it from the reference to the bucket, not from an object within the bucket. You can use the bucket.getMetadata() method. This method calls the API, which you can try here.
If you would like to know when an object in that bucket will die, you can just get the metadata of that object using the object.getMetadata() method, check the creation date, and do simple maths with the value you configured for your bucket.
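For example, a minimal sketch (assuming @google-cloud/storage and a bucket named 'bucket-name'; the delete-after-20-days rule appears under metadata.lifecycle.rule):

const { Storage } = require('@google-cloud/storage');
const storage = new Storage();
const bucket = storage.bucket('bucket-name');

bucket.getMetadata().then(([metadata]) => {
    // lifecycle.rule is an array of { action, condition } objects.
    const rules = (metadata.lifecycle && metadata.lifecycle.rule) || [];
    rules.forEach((rule) => {
        if (rule.action.type === 'Delete' && rule.condition.age !== undefined) {
            console.log('Objects are deleted after ' + rule.condition.age + ' days');
        }
    });
});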
Have you tried working with the addLifecycleRule method to set the bucket's lifecycle policy?
@example
* const {Storage} = require('@google-cloud/storage');
* const storage = new Storage();
* const bucket = storage.bucket('bucket-name');
*
* //-
* // Automatically have an object deleted from this bucket
* // once it is more than 20 days of age.
* //-
* bucket.addLifecycleRule({
*   action: 'delete',
*   condition: {
*     age: 20 // Specified in days.
*   }
* }, function(err, apiResponse) {
*   if (err) {
*     // Error handling omitted.
*   }
* });
I am trying to send text to all my peers and I found this function "sendDirectlyToAll".
For your convenience, I put the function information here:
sendDirectlyToAll(channelLabel, messageType, payload) - broadcasts a
message to all peers in the room via a dataChannel.
string channelLabel - the label for the dataChannel to send on.
string messageType - the key for the type of message being sent.
object payload - an arbitrary value or object to send to peers.
I do not understand the meaning of the 2nd and 3rd parameters. Could you kindly show me an example of how to use this function?
Thanks
Derek
Here is my example showing how I managed to get it working:
/**
 * Send directly to all other peers.
 */
oSimpleWebRTC.sendDirectlyToAll(
    'meta',           // sLabel
    'info',           // sType - will become oData.type
    { "foo": "bar" }  // payload - will become oData.payload
);

/**
 * Handle incoming dataChannel messages sent by "sendDirectlyToAll".
 * @param {object} oPeer  The remote sending peer object
 * @param {string} sLabel A label, e.g. 'meta'
 * @param {object} oData  Object containing the relevant data
 */
oSimpleWebRTC.on('channelMessage', function (oPeer, sLabel, oData) {
    // e.g. we want the label "hark" to be ignored, as it fires continuously.
    if ('hark' === sLabel) {
        return true;
    }
    if ('meta' === sLabel) {
        if ('info' === oData.type) {
            // do your stuff
            console.log(oData.payload.foo);
        }
    }
});
Also, there are answers to this question on the official SimpleWebRTC issue tracker: https://github.com/andyet/SimpleWebRTC/issues/450
See my blog post to this example: https://blog.ueffing.net/post/2017/02/22/simplewebrtc-usage-example-of-senddirectlytoall/
I am using Kue, a priority job queue backed by Redis, in a Node.js app.
I want to push a method of an instance to the queue so that it can be called directly when the queue is processed, something similar to what is shown below.
This is an example from Laravel:
class UserController extends Controller
{
    /**
     * Send a reminder e-mail to a given user.
     *
     * @param Request $request
     * @param int $id
     * @return Response
     */
    public function sendReminderEmail(Request $request, $id)
    {
        $user = User::findOrFail($id);
        $this->dispatch(new SendReminderEmail($user));
    }
}
I'm a little confused as to how this can be achieved in my node app with Kue.
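For what it's worth, a minimal sketch of the usual Kue pattern: job data is serialized to Redis as JSON, so you cannot push a bound method itself; instead you enqueue the identifying data and call the method inside the processor. Here findUser and sendReminderEmail are hypothetical stand-ins for your own code:

var kue = require('kue');
var queue = kue.createQueue();

// Producer: enqueue the user id instead of the method itself.
queue.create('sendReminderEmail', { userId: 42 })
    .priority('high')
    .save();

// Consumer: look up the instance and invoke its method when the job runs.
queue.process('sendReminderEmail', function (job, done) {
    findUser(job.data.userId, function (err, user) {
        if (err) return done(err);
        user.sendReminderEmail(done);
    });
});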
I'm creating my first Yeoman generator. I want to download an external zip containing a CMS and unzip it in the root. According to this thread this should be possible. Has this not been implemented yet? What do I need to copy over to my generator if not?
I have run the generator generator and got my basic generator up. This is my code so far:
Generator.prototype.getVersion = function getVersion() {
    var cb = this.async(),
        self = this;

    this.log.writeln('Downloading Umbraco version 6.1.6');
    this.download('http://our.umbraco.org/ReleaseDownload?id=92348', '.');
};
This generates an error telling me that it "cannot find module 'download'". What is the correct syntax?
I did a little investigation for you.
There are two methods to download something with Yeoman...
/**
 * Download a string or an array of files to a given destination.
 *
 * @param {String|Array} url
 * @param {String} destination
 * @param {Function} cb
 */
this.fetch(url, destination, cb)

/**
 * Fetch a string or an array of archives and extract it/them to a given
 * destination.
 *
 * @param {String|Array} archive
 * @param {String} destination
 * @param {Function} cb
 */
this.extract(archive, destination, cb)
The callback will pass an error if something went wrong.
There's also a method to download packages from GitHub.
/**
 * Remotely fetch a package from GitHub (or an archive), store this into a _cache
 * folder, and provide a "remote" object as a facade API to ourself (part of
 * generator API, copy, template, directory). It's possible to remove local cache,
 * and force a new remote fetch of the package.
 *
 * ### Examples:
 *
 *     this.remote('user', 'repo', function(err, remote) {
 *         remote.copy('.', 'vendors/user-repo');
 *     });
 *
 *     this.remote('user', 'repo', 'branch', function(err, remote) {
 *         remote.copy('.', 'vendors/user-repo');
 *     });
 *
 *     this.remote('http://foo.com/bar.zip', function(err, remote) {
 *         remote.copy('.', 'vendors/user-repo');
 *     });
 *
 * When fetching from GitHub:
 * @param {String} username
 * @param {String} repo
 * @param {String} branch
 * @param {Function} cb
 * @param {Boolean} refresh
 *
 * @also
 * When fetching an archive:
 * @param {String} url
 * @param {Function} cb
 * @param {Boolean} refresh
 */
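Putting that together, a minimal sketch of the asker's getVersion using this.extract (same Umbraco URL, and assuming the documented signature above):

Generator.prototype.getVersion = function getVersion() {
    var cb = this.async();
    this.log.writeln('Downloading Umbraco version 6.1.6');
    // Fetch the zip archive, extract it into the destination root,
    // and signal the async step as done (cb receives an error, if any).
    this.extract('http://our.umbraco.org/ReleaseDownload?id=92348', '.', cb);
};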