Azure Mobile Services offline synchronisation - Xamarin.iOS

I'm trying to do some offline synchronisation from a Xamarin.iOS app. When I call PullAsync on the IMobileServiceSyncTable, the call never returns.
I've tried with a regular IMobileServiceTable, which seems to work fine. The sync table seems to be the thing that doesn't work for me.
Code that doesn't work:
var client = new MobileServiceClient(ApiUrl);
var store = new MobileServiceSQLiteStore("syncstore.db");
store.DefineTable<Entity>();
await client.SyncContext.InitializeAsync(store);
table = client.GetSyncTable<Entity>();
try
{
    await table.PullAsync("all", table.CreateQuery());
}
catch (Exception e)
{
    Debug.WriteLine(e.StackTrace);
}
return await table.ToListAsync();
Code that works:
var client = new MobileServiceClient(configuration.BaseApiUrl);
return await table.ToListAsync();
Can anyone point out something that seems to be wrong? I do not get any exception or anything else that points me in a direction - it just never completes.
UPDATE 1:
I've seen some other SO questions where people had a similar issue because somewhere in their call stack they didn't await, but instead used Task.Result or Task.Wait(). However, I do await this call throughout my whole chain. Here's e.g. a unit test I've written which shows the exact same behaviour as described above: it hangs and never returns.
[Fact]
public async Task GetAllAsync_ReturnsData()
{
    var result = await sut.GetAllAsync();
    Assert.NotNull(result);
}
UPDATE 2:
I've been sniffing the requests sent by the unit test. It seems that it hangs because it keeps issuing the HTTP request over and over, several hundreds of times, and never finishes that operation.

Finally! I've found the issue.
The problem was that the server returned an IEnumerable from the GetAll operation. Instead it should have been an IQueryable, so that the query options PullAsync sends (such as the paging parameters) actually get applied on the server; with IEnumerable they were ignored, which is why the client kept requesting the same data over and over.
The answer in this question pointed me in the right direction
IMobileServiceClient.PullAsync deadlock when trying to sync with Azure Mobile Services
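For reference, a minimal sketch of what the fixed GetAll action can look like, assuming a TableController<Entity> from the Azure Mobile Services .NET backend (the names here are illustrative, not from the original project):

public IQueryable<Entity> GetAllEntity()
{
    // Query() comes from TableController<Entity>; returning IQueryable lets the
    // OData query options (including the paging parameters PullAsync sends) be applied.
    return Query();
}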

Related

How to add continuous running code into nodejs postgresql client?

I'm stuck on a problem of wiring some logic into a Node.js pg client. The main logic has two parts; the first one connects to the Postgres server and listens for notifications, as follows:
const { Client } = require('pg')

var rules = {} // a rules object we are monitoring...
const pg_cli = new Client({
    ....
})
pg_cli.connect()
pg_cli.query('LISTEN zone_rules') // listen to the zone_rules channel
pg_cli.on('notification', msg => {
    rules = msg.payload
})
This part is easy and runs without any issue. Now what I'm trying to implement is another function that keeps monitoring the rules: when an object is received and put into the rules, the function starts accumulating the time the object stays in the rules (it may be deleted by another notification from the pg server), and the monitoring function sends an alert to another server if the object has been there longer than a certain time. I tried to write the code in the following style:
function check() {
    // watch and time accumulating code...
    process.nextTick(check)
}
check()
But I found that the notification handler then never got a chance to run! Does anybody have any idea about my problem, or should I be doing it another way?
Thanks!!!
Well, I found that changing the nextTick to setImmediate solves the problem. process.nextTick callbacks are processed before the event loop is allowed to continue, so a recursive nextTick loop starves I/O events (including the pg notifications), whereas setImmediate yields back to the event loop between iterations.
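A minimal sketch of what that looks like (the timing/alerting logic is only indicative):

function check() {
    // watch and time-accumulating code, e.g. compare Date.now() against the time
    // each rule was first seen and alert another server if it has been there too long
    setImmediate(check) // yield to the event loop so 'notification' events can fire
}
check()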

Azure function run code on startup for Node

I am developing a chatbot using Azure Functions. I want to load some of the conversations for the chatbot from a file. I am looking for a way to load this conversation data before the function app starts, with some function callback. Is there a way to load the conversation data only once when the function app is started?
This question is actually a duplicate of Azure Function run code on startup. But that question is asked for C# and I wanted a way to do the same thing in NodeJS.
After like a week of messing around I got a working solution.
First some context:
The question at hand: running custom code at App Start for Node.js Azure Functions.
The issue is currently being discussed here and has been open for almost 5 years, and doesn't seem to be going anywhere.
As of now there is an Azure Functions "warmup" trigger feature, found here: AZ Funcs Warm Up Trigger. However, this trigger only runs on scale-out, so the first, initial instance of your app won't run the "warmup" code.
Solution:
I created a start.js file and put the following code in there
const ErrorHandler = require('./Classes/ErrorHandler');
const Validator = require('./Classes/Validator');
const delay = require('delay');
let flag = false;

module.exports = async () =>
{
    if (flag) return; // only initialize once, even if several functions require this file
    flag = true;

    console.log('Initializing Globals')
    global.ErrorHandler = ErrorHandler;
    global.Validator = Validator;
    //this is just to test if it will work with async funcs
    await delay(5000)
    //add additional logic...
    //await db.connect(); etc // initialize a db connection
    console.log('Done Waiting')
}
To run this code I just have to do
require('../start')();
in any of my functions. Just one function is fine. Since all of the function dependencies are loaded when you deploy your code, as long as this line is in one of the functions, start.js will run and initialize all of your global/singleton variables or whatever else you want it to do on func start. I made a literal function called "startWarmUp" and it is just a timer triggered function that runs once a day.
My use case is that almost every function relies on ErrorHandler and Validator class. And though generally making something a global variable is bad practice, in this case I didn't see any harm in making these 2 classes global so they're available in all of the functions.
Side Note: when developing locally you will have to include that function in your func start --functions <function requiring start.js> <other funcs> in order to have that startup code actually run.
Additionally, there is a feature request for this functionality that can be voted on here: Azure Feedback
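For illustration, a function that pulls in the startup code might look like this (a minimal sketch; the HTTP trigger and the logging are just examples, not from the original answer):

// index.js of any one function
require('../start')(); // runs start.js when this function's module is first loaded

module.exports = async function (context, req) {
    // the globals set up in start.js (ErrorHandler, Validator) are available here;
    // note the async initialisation isn't awaited, so a very early invocation may still see it in progress
    context.log(typeof global.Validator);
    context.res = { body: 'OK' };
};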
I have a similar use case that I am also stuck on.
Based on this resource I have found a good way to approach the structure of my code. It is simple enough: you just need to run your initialization code before you declare your module.exports.
https://github.com/rcarmo/azure-functions-bot/blob/master/bot/index.js
I also read this thread, but it does not look like there is a recommended solution.
https://github.com/Azure/azure-functions-host/issues/586
However, in my case I have an additional complication in that I need to use promises, as I am waiting on external services to come back. These promises run within bot.initialise(). Initialise() only seems to run when the first call to the bot occurs, which would be fine, but as it is running a promise my code doesn't block - which means that when the function calls 'listener(req, context.res)', the listener doesn't yet exist.
The next thing I will try is to restructure my code so that bot.initialise returns a promise (see the sketch after the code below), but the code would be much simpler if there was an initialisation webhook that guaranteed that the code within it was executed at startup before everything else.
Has anyone found a good workaround?
My code looks something like this:
var listener = null;

if (process.env.FUNCTIONS_EXTENSION_VERSION) {
    // If we are inside Azure Functions, export the standard handler.
    listener = bot.initialise(true);
    module.exports = function (context, req) {
        context.log("Passing body", req.body);
        listener(req, context.res);
    }
} else {
    // Local server for testing
    listener = bot.initialise(false);
}
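Here is a sketch (not from the original post) of the promise-caching restructure mentioned above, assuming bot.initialise is reworked to return a promise that resolves to the listener:

var listenerPromise = null;

function getListener() {
    if (!listenerPromise) {
        // initialise once and cache the promise; true when running inside Azure Functions
        listenerPromise = bot.initialise(!!process.env.FUNCTIONS_EXTENSION_VERSION);
    }
    return listenerPromise;
}

module.exports = function (context, req) {
    // every invocation waits for initialisation to finish before using the listener
    getListener().then(function (listener) {
        context.log("Passing body", req.body);
        listener(req, context.res);
    }).catch(function (err) {
        context.log.error(err);
        context.done(err);
    });
};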
You can use a global variable to load data before function execution.
var data = [1, 2, 3];
module.exports = function (context, req) {
    context.log(data[0]);
    context.done();
};
The data variable is initialized only once and is reused across function calls.

range.address throws context related errors

We've been developing with the Excel JavaScript API for quite a few months now. We kept coming across context-related issues which got resolved for unknown reasons; we weren't able to replicate these issues and wondered how they got resolved. Recently, similar issues have started popping up again.
Error we consistently get:
property 'address' is not available. Before reading the property's
value, call the load method on the containing object and call
"context.sync()" on the associated request context.
We thought that, as we have multiple functions defined to modularise the code in the project, maybe the context differs somewhere among these functions and this has gone unnoticed. So we came up with a single-context solution implemented via the JavaScript module pattern.
var ContextManager = (function () {
    var xlContext; // single context for entire project/application.

    function loadContext() {
        xlContext = new Excel.RequestContext();
    }

    function sync(object) {
        return (object === undefined) ? xlContext.sync() : xlContext.sync(object);
    }

    function getWorksheetByName(name) {
        return xlContext.workbook.worksheets.getItem(name.toString());
    }

    //public
    return {
        loadContext: loadContext,
        sync: sync,
        getWorksheetByName: getWorksheetByName
    };
})();
NOTE: the above code is shortened. There are other methods added to ensure that the single context gets used throughout the application.
While implementing the single context this time round, though, we have been able to replicate the issue.
Office.initialize = function (reason) {
    $(document).ready(function () {
        ContextManager.loadContext();

        function loadRangeAddress(rng, index) {
            rng.load("address");
            ContextManager.sync().then(function () {
                console.log("Address: " + rng.address);
            }).catch(function (e) {
                console.log("Failed address for index: " + index);
            });
        }

        for (var i = 1; i <= 1000; i++) {
            var sheet = ContextManager.getWorksheetByName("Sheet1");
            loadRangeAddress(sheet.getRange("A" + i), i); // I expect to see A1 to A1000 addresses in console. Order doesn't matter.
        }
    });
}
In the above case, only "A1" gets printed as a range address to the console. I can't see any of the other addresses (A2 to A1000) being printed; only the catch block executes. Can anyone explain why this happens?
Although I've written a for loop above, that isn't my real use case. In the real use case, situations occur where one range object in function a needs to load a range address while, in the meantime, another function b also wants to load a range address. Both function a and function b work asynchronously on separate tasks, e.g. one creates a table object (the table needs an address) and the other pastes data to the sheet (there's a debug statement to see where the data was pasted).
This is something our team hasn't been able to figure out or find a solution for.
There is a lot packed into this code, but the issue you have is that you're calling sync a whole bunch of times without awaiting the previous sync.
There are several problems with this:
If you were using different contexts, you would actually see that there is a limit of ~50 simultaneous requests, after which you'll get errors.
In your case, you're running into a different (and almost opposite) problem. Given the async nature of the APIs, and the fact that you're not awaiting on the sync-s, your first sync request (which you'd think is for just A1) will actually contain all the load requests from the execution of the entire for loop. Now, once this first sync is dispatched, the action queue will be cleared. Which means that your second, third, etc. sync will see that there is no pending work, and will no-op, executing before the first sync ever came back with the values!
[This might be considered a bug, and I'll discuss with the team about fixing it. But it's still a very dangerous thing to not await the completion of a sync before moving on to the next batch of instructions that use the same context.]
The fix is to await the sync. This is far and away simplest to do with TypeScript 2.1 and its async/await feature; otherwise you need to do the async version of the for loop, which you can look up, but it's rather unintuitive (it requires creating an uber-promise that keeps chaining a bunch of .then-s).
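For reference, that chaining approach looks roughly like this in plain JavaScript (a sketch built on the ContextManager from the question):

var chain = Promise.resolve();
for (var i = 1; i <= 1000; i++) {
    (function (index) {
        chain = chain.then(function () {
            var sheet = ContextManager.getWorksheetByName("Sheet1");
            var rng = sheet.getRange("A" + index);
            rng.load("address");
            // each sync completes before the next load is queued on the shared context
            return ContextManager.sync().then(function () {
                console.log("Address: " + rng.address);
            });
        });
    })(i);
}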
So, your modified TypeScript-ified code would be
ContextManager.loadContext();

async function loadRangeAddress(rng, index) {
    rng.load("address");
    await ContextManager.sync().then(function () {
        console.log("Address: " + rng.address);
    }).catch(function (e) {
        OfficeHelpers.Utilities.log(e);
    });
}

for (var i = 1; i <= 1000; i++) {
    var sheet = ContextManager.getWorksheetByName("Sheet1");
    await loadRangeAddress(sheet.getRange("A" + i), i); // I expect to see A1 to A1000 addresses in console. Order doesn't matter.
}
Note the async in front of the loadRangeAddress function, and the two await-s in front of ContextManager.sync() and loadRangeAddress.
Note that this code will also run quite slowly, as you're making an async roundtrip for each cell, which means you're not using batching - and batching is at the very core of the object model for the new APIs.
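For comparison, a batched version (a sketch using the same ContextManager from the question) queues all of the loads first and then makes a single round trip:

var sheet = ContextManager.getWorksheetByName("Sheet1");
var ranges = [];
for (var i = 1; i <= 1000; i++) {
    var rng = sheet.getRange("A" + i);
    rng.load("address"); // just queues the load; nothing is sent to Excel yet
    ranges.push(rng);
}
ContextManager.sync().then(function () {
    // one round trip; every range's address is now populated
    ranges.forEach(function (rng) {
        console.log("Address: " + rng.address);
    });
});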
For completeness sake, I should also note that creating a "raw" RequestContext instead of using Excel.run has some disadvantages. Excel.run does a number of useful things, the most important of which is automatic object tracking and un-tracking (not relevant here, since you're only reading back data; but would be relevant if you were loading and then wanting to write back into the object).
Finally, if I may recommend (full disclosure: I am the author of the book), you will probably find a good bit of useful info about Office.js in the e-book "Building Office Add-ins using Office.js", available at https://leanpub.com/buildingofficeaddins. In particular, it has a very detailed (10-page) section on the internal workings of the object model ("Section 5.5: Implementation details, for those who want to know how it really works"). It also offers advice on using TypeScript, has a general Promise/async-await primer, describes what .run does, and has a bunch more info about the OM. Also, though not available yet, it will soon offer information on how to resume using the same context (using a newer technique than what was originally described in How can a range be used across different Word.run contexts?). The book is a lean-published "evergreen" book, so once I write that topic in the coming weeks, an update will be available to all existing readers.
Hope this helps!

Meteor blocking clarification

According to the meteor docs, inserts block:
On the server, if you don't provide a callback, then insert blocks
until the database acknowledges the write, or throws an exception if
something went wrong. If you do provide a callback, insert still
returns the ID immediately.
So this would be wrong:
Meteor.methods({
    post: function (options) {
        return Stories.insert(options)
    }
});
I need to do this:
Meteor.methods({
    post: function (options) {
        return Stories.insert(options, function(){})
    }
});
Can somebody confirm that this is the case? The former will block the ENTIRE SERVER until the db returns?
Yeah, it will block, but not the entire server.
In Meteor, your server code runs in a single thread per request, not in the asynchronous callback style typical of Node. We find the linear execution model a better fit for the typical server code in a Meteor application.
So, if you are worried that it will block the entire server, don't be.
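For illustration, the blocking insert only holds up this client's method queue (its fiber); if you want other method calls from the same client to run in the meantime, you can call this.unblock() (a sketch, not something the question requires):

Meteor.methods({
    post: function (options) {
        this.unblock();                 // let other method calls from this client run concurrently
        return Stories.insert(options); // blocks only this fiber until the write is acknowledged
    }
});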

SPSite.Exists() returns true although the site collection doesn't exists

I'm currently working on a TimerJob which does some site collection management. When the job runs it looks into a list to retrieve the URL of a site collection, then it calls SPSite.Exists() to check whether the site still exists or not.
To test the TimerJob I deleted a site collection but left the corresponding entry in the list. Then I started the TimerJob and stepped through its code in debug mode. When it came to the point of checking whether the site exists, SPSite.Exists() returned true.
When I run the TimerJob a second time for the same site collection, the SPSite.Exists() method returns false as it should.
So now I'm wondering why SPSite.Exists() returns the wrong result when I run the job for the first time. Could this be caused by caching?
When I run the same code outside of the TimerJob SPSite.Exists() returns the correct result every time.
UPDATE
So I did some more debugging and it seems as if this problem really is caused by some caching mechanism, as it doesn't occur when the Windows SharePoint Services Timer Service is restarted after the test site collection has been deleted and before the TimerJob is started.
At the moment I can't imagine another solution than trying to access the deleted site and catching the exception that will be thrown to determine if the site really exists.
UPDATE 2
After some more tests I realized that the problem doesn't occur for the first call of SPSite.Exists() (within the TimerJob) after the Timer Service is restarted. The second call (for a different site collection) still leads to the known problem.
UPDATE 3
At the moment I'm using an ugly hack to solve my problem. When SPSite.Exists() returns true although the site actually doesn't exist, I create an SPSite object and try to provoke a FileNotFoundException by calling its Usage property. When the exception arises I know that the site doesn't exist. Strangely enough, after the exception was thrown SPSite.Exists() returns the correct result (false).
Any other suggestions out there?
Bye,
Flo
I had this same issue and tried the HTTP request method, but found it to be somewhat slow for checking a large number of sites at once. Instead I ended up using something like this:
public bool SPSiteExists(string url) {
    Uri uri = new Uri(url);
    SPSite.InvalidateCacheEntry(uri, Guid.Empty); // drop the Timer Service's cached entry for this URL first
    return SPSite.Exists(uri);
}
Same for me. I had a similar problem: after I deleted a site collection I still got true from SPSite.Exists().
The strange thing was that if I opened the deleted site collection's URL in a browser, the first request resulted in an HTTP 400 error message, whereas the second request gave the expected HTTP 404.
My workaround was to just issue an HTTP GET for the URL to make that first request and then check for site existence again.
private void touchWeb(string url, System.Net.ICredentials credentials)
{
    try
    {
        Uri uri = new Uri(url);
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(uri);
        request.Credentials = credentials;
        request.Method = "GET";
        string result = "";
        using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
        {
            using (Stream responseStream = response.GetResponseStream())
            {
                using (StreamReader readStream = new StreamReader(responseStream, System.Text.Encoding.UTF8))
                {
                    result = readStream.ReadToEnd();
                }
            }
        }
    }
    catch (Exception) { }
}
