Cloudant change notifications - couchdb

I am new to Cloudant but have found it useful for a first stage of IoT data. However, I need to subscribe to changes based on an ID field that is separate from the _id and is unique to the sensor sending the data. The examples I've seen so far haven't helped with this problem. What I'm doing now is sending a separate JSON doc for each POST, so new data should always arrive as new docs carrying this sensor ID. The docs sometimes come in every second, but hours can also pass between them.
I'm using C# in a .NET web app. The code below makes a call to the Cloudant database and returns the data I want, based on an index created for the field SensorID:
json =
{
    "selector": {
        "SensorID": "h7365cf3-17bc-4422-b436-f7bcf12b2e2a"
    },
    "fields": [
        "Data"
    ]
}
url = my Cloudant URL + "/_find".
This returns all docs whose SensorID field matches the value in the JSON query, but only the JSON object nested in each doc's Data field.
using (WebClient client = new WebClient())
{
    // Encode the JSON query as UTF-8 bytes for the POST body.
    byte[] postBytes = System.Text.Encoding.UTF8.GetBytes(json.ToString());
    client.Credentials = new NetworkCredential(username, password);
    client.Headers[HttpRequestHeader.ContentType] = "application/json";
    var response = client.UploadData(url, "POST", postBytes);
    JObject iJson = JObject.Parse(client.Encoding.GetString(response));
    return parseIncoming(iJson);
}
When the call is a GET to my Cloudant URL + "/_db_updates", it returns information about changes to the whole database. This can be set up as a continuous feed.
I was hoping this meant that I could subscribe to changes in documents to get new data as it arrives, like Redis Pub/Sub. I'm starting to think this might not be the case, but if anybody can show me how to do it I would be grateful.

As @adasilva70 said, you need to use the _changes feed.
You can filter changes with an appropriate filter function (so that only changes regarding the documents you're interested in show up).
You can get all updates since a given sequence point (everything since the last data you got) and/or you can use long polling or continuous mode for instant notifications.
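For example, a filter function keyed on your SensorID field might look like this (a sketch, not tested against your database; the design doc name sensors, the filter name by_sensor, and the sensorID query parameter are all illustrative):

// Stored as a string under filters.by_sensor in a design doc, e.g. _design/sensors
function (doc, req) {
    // Pass only changes for the sensor requested via ?sensorID=...
    return doc.SensorID === req.query.sensorID;
}

You could then subscribe with something like GET <my Cloudant URL>/_changes?filter=sensors/by_sensor&sensorID=h7365cf3-17bc-4422-b436-f7bcf12b2e2a&feed=continuous&include_docs=true and receive each matching document as soon as it is written.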


Use an array of values to query Firestore and setup a snapshot listener

Here is my problem:
I have a firestore collection that has a number of documents. There are about 500 documents generated/updated every hour and saved to the collection.
I would like to query the collection and setup a real-time snapshot listener for a subset of document IDs, that are provided by the client.
I think maybe I could do something like this (this syntax is likely not correct... just trying to get a feel for whether it's even possible... but isn't the "in" operator limited to an array of 10 items?):
const subbedDocs = ["doc1", "doc2", "doc3", "doc4", "doc5"];
docsRef.where('docID', 'in', subbedDocs).onSnapshot((doc) => {
    handleSnapshot(doc);
});
I'm sorry, that code probably doesn't make sense... I'm still trying to learn all the ins and outs of Firestore.
Essentially, what I am trying to do is take an array of IDs and set up an .onSnapshot listener for those IDs. This list of IDs could be upwards of 40-50 items. Is this even possible? I am trying to avoid setting up a listener on the whole collection and filtering out things I am not "subscribed" to, as that seems wasteful from a resources perspective.
If you have the doc IDs in your array (it looks like you do), you can loop over them and start a listener for each one:
const subbedDocs = ["doc1", "doc2", "doc3", "doc4", "doc5"];
for (let i = 0; i < subbedDocs.length; i++) {
    const docID = subbedDocs[i];
    docsRef.doc(docID).onSnapshot((doc) => {
        handleSnapshot(doc);
    });
}
It would be better to listen to a query and get all filtered docs at once. But if you want to listen to each of them with an explicit listener, this will do the trick.
As you've discovered, Firestore's in operator only allows up to 10 entries in the array. I'm also guessing you've added docID as a field in the document, since I don't believe 'docID' references the actual document ID.
I would not take this approach, because of the 10-entry limitation. Instead, as the client selects documents to follow, write a marker for that client into each chosen document, so your query avoids the limitation entirely. If you make the marker field an array (called something like ListenerArray) and add each client's unique ID to it as the client selects documents, you can allow an unlimited number of client listeners (up to Firestore's implementation limits). Your query would be more like:
docsRef.where('ListenerArray', 'array-contains', clientID).onSnapshot((doc) => {
    handleSnapshot(doc);
});
array-contains checks a single value against all entries in a document array, without limit. Every client can mark any number of documents to subscribe to.
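Maintaining that array could look something like this (a minimal sketch using the v8-style JavaScript SDK; docsRef, ListenerArray, and clientID come from the query above, while docID is whatever document the client chose):

// Subscribe this client: add its ID to the document's ListenerArray.
docsRef.doc(docID).update({
    ListenerArray: firebase.firestore.FieldValue.arrayUnion(clientID)
});

// Unsubscribe: remove the ID again.
docsRef.doc(docID).update({
    ListenerArray: firebase.firestore.FieldValue.arrayRemove(clientID)
});

Because arrayUnion and arrayRemove are atomic transforms on that one field, concurrent clients subscribing to the same document will not overwrite each other.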

immutable _id error when performing MongoDB bulkWrite replaceOne on first attempt only

I'm working on a little web application that will crawl and update baseball standings by day and track teams' positions (among other things) over time.
I have an API I grab all of this from and a collection in MongoDB that stores all the team data and information for the current day. Right now I just run this manually but eventually it'll be automated to run at like 3am or whenever.
The API supplies a unique ID for each team that never changes. So what I'm doing is taking in the team data from the API and passing it to a function that extracts the team data I need (there is other data in the response object I don't need), puts it into an object for replacement, and then, wherever that team ID exists in the collection, replaces the document in a bulkWrite.
async function currentStandings(db, team_standings, callback) {
    const current_standings = db.collection('current_standings');
    let replacePool = [];
    for (const single_team of team_standings.data.standing) {
        let replaceOnePusher = {
            replaceOne: {
                "filter": {"team_id": single_team.team_id},
                "replacement": single_team
            }
        };
        replacePool.push(replaceOnePusher);
    }
    await current_standings.bulkWrite(replacePool);
    callback();
}
However, when I execute this code for the first time each day, I get an error reading BulkWriteError: After applying the update, the (immutable) field '_id' was found to have been altered to _id: ObjectId('5f26e57b6831761ac840bf1d') (not the same ID every day), and if I look in Compass the data isn't updated. If I immediately run the script again, it goes through successfully without error, and refreshing the data in Compass shows the correct data.
Can someone explain to me what is going wrong here? This is actually my first time using MongoDB since I wanted to learn it and this pet project seemed like a good place to start.

infusionsoft contact field query with python

I know how to connect to Infusionsoft with Python 3 and how to process the following simple example:
# Set up the contact data we want to add
contact = {}  # blank dictionary
contact["FirstName"] = "John"
contact["LastName"] = "Doe"
contact["Email"] = "john@doe.com"
contact["Company"] = "ACME"
But how do I mass update the WHOLE database? E.g., what if I want to update ALL the Phone1 fields with an extra bit of code using IF statements?
Using the Infusionsoft API you can only update contact data one contact at a time, sending a separate request per contact. The exact request depends on which type of API you use: REST or XML-RPC.

How to count number of activities in a getstream feed?

Does getstream provide a way to retrieve the number of activities within a feed? I have a notification feed set up. I can retrieve the activities using a paginated get. However, I would like to display the number of items within the feed.
Unfortunately not; there is no API endpoint to retrieve the number of activities within a feed.
At the moment there's no way to count activities directly. You can try:
Use a custom feed with reactions, so you can get the count from the reactions.
Or store the count in your own app (cache/db/etc.).
This worked for me:
require 'net/http'
require 'json'
require 'stream'  # stream-ruby gem

# Sign a JWT that is valid for reading activities.
signature = Stream::Signer.create_jwt_token('activities', '*', '<your_secret>', '*')

# Ask the enrich endpoint for the activities, including reaction counts.
uri = URI("https://us-east-api.stream-io-api.com/api/v1.0/enrich/activities/?api_key=<api_key>&ids=<id1,id2,...>&withReactionCounts=true")
req = Net::HTTP::Get.new(uri)
req['Content-Type'] = "application/json"
req['Stream-Auth-Type'] = "jwt"
req['Authorization'] = signature

res = Net::HTTP.start(uri.hostname, uri.port, :use_ssl => true) { |http|
    http.request(req)
}
puts JSON.parse(res.body)
References: Retrieve, reactions_read-feeds, activities, ruby client
Update 2022:
You can only get the number of activities in groups within aggregated or notification feeds; flat feeds are still not supported.
The best solution is to store the important counts yourself in your own database.

Change notification in CouchDB when a field is set

I'm trying to get notifications from a CouchDB changes poll as soon as a pre-defined field is set or changed. I've already had a look at filters that can be used for filtering change events (db/_changes?filter=myfilter). However, I've not yet found a way to include this temporal information, because you can only see the current version of the document in these filter functions.
Is there any way to create such a filter?
If it does not work, I could export my field to a separate database and then only poll for changes in that db, but I'd prefer to keep my data together for obvious reasons.
Thanks in advance!
You are correct: filters and _changes feeds can only see snapshots of a document. What you need is a function which can see the old document and the new document and act correctly. But that is unavailable in _filters and _changes.
Obviously your client code knows when it updates that field, so you could change the client code; however, there is a better solution.
Update functions can access both documents. I suggest you make an _update
function which notices the field change and flags that in the document. Next you
have a simple filter checking for that flag. The best part is, you can use a
rewrite function to make the HTTP API exactly the same as before.
1. Create an update function to flag interesting updates
Your _design/myapp would contain {"updates": {"smart_updater": "(see below)"}}.
Update functions are very flexible (see my recent update handlers
walkthrough). However we only want to mimic the normal HTTP/JSON API.
Your updates.smart_updater field would look like this:
function (doc, req) {
    var INTERESTING = 'dollars'; // Set me to the interesting field.
    var newDoc = JSON.parse(req.body);

    if (newDoc.hasOwnProperty(INTERESTING)) {
        // dollars was set (which includes 0, false, null, and undefined
        // values). You might test newDoc[INTERESTING] if those
        // values should not trigger this code.
        if ((doc === null) || (doc[INTERESTING] !== newDoc[INTERESTING])) {
            // The field was changed or created!
            newDoc.i_was_changed = true;
        }
    }

    if (!newDoc._id) {
        // A UUID generator would be better here.
        newDoc._id = req.id || Math.random().toString();
    }

    // Return the same JSON the vanilla Couch API does.
    return [newDoc, {json: {'id': newDoc._id}}];
}
Now you can PUT or POST to /db/_design/myapp/_update/smart_updater/[doc_id] and it will feel just like the normal API, except if you update the dollars field, it will add an additional flag, i_was_changed. That is how you will find this change later.
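For example, a sketch against a local CouchDB (the database name db and document ID some_doc are placeholders):

curl -X PUT http://localhost:5984/db/_design/myapp/_update/smart_updater/some_doc \
     -H 'Content-Type: application/json' \
     -d '{"dollars": 100}'

If some_doc previously had a different dollars value (or did not exist), the stored document will now carry the i_was_changed flag.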
2. Filter for documents with the changed field
This is very straightforward:
function (doc, req) {
    return doc.i_was_changed;
}
Now you can query the _changes feed with a ?filter= parameter. (Replication also supports this filter, so you could pull to your local system all documents which most recently changed/created the field.)
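A continuous feed using that filter might look like this (again a sketch, assuming the filter above is stored as filters.i_was_changed in _design/myapp):

curl 'http://localhost:5984/db/_changes?filter=myapp/i_was_changed&feed=continuous&include_docs=true'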
That is the basic idea. The remaining steps will make your life easier if you
already have lots of client code and do not want to change the URLs.
3. Use rewriting to keep the HTTP API the same
This is available in CouchDB 0.11, and the best resource is Jan's blog post,
nice URLs in CouchDB.
Briefly, you want a vhost which sends all traffic to your rewriter (which itself
is a flexible "bouncer" to all design doc functionality based on the URL).
curl -X PUT http://example.com:5984/_config/vhosts/example.com \
-d '"/db/_design/myapp/_rewrite"'
Then you want a rewrites field in your design doc, something like this (not tested):
[
    {
        "comment": "Updates should go through the update function",
        "method": "PUT",
        "from": "db/*",
        "to": "_update/smart_updater/*"
    },
    {
        "comment": "Creates should go through the update function",
        "method": "POST",
        "from": "db/*",
        "to": "_update/smart_updater/*"
    },
    {
        "comment": "Everything else is just like normal",
        "from": "*",
        "to": "../../../*"
    }
]
(Once again, this code is adapted from examples and existing code I have lying around, but it's not 100% debugged. However, I think it makes the idea very clear. Also remember this step is optional; the advantage is that you never have to change your client code.)
