NodeInvocationException: The Node invocation timed out after 60000ms - node.js

I have an ASP.NET Core web application with Angular 5. In my repository layer I have a simple LINQ query that gets data from a table. Everything worked well until I changed the query to join another entity and fetch data from two tables.
The join query itself gets data from the DB quickly; there is no delay.
But now when I run the app I get this error:
NodeInvocationException: The Node invocation timed out after 60000ms.
You can change the timeout duration by setting the
InvocationTimeoutMilliseconds property on NodeServicesOptions
When I run the API alone, it works well and returns JSON data without any problem.
Any help would be appreciated.

Related

Dealing with Azure Cosmos DB cross-partition queries in REST API

I'm talking to Cosmos DB via the (SQL) REST API, so existing questions that refer to various SDKs are of limited use.
When I run a simple query on a partitioned container, like
select value count(1) from foo
I run into an HTTP 400 error:
The provided cross partition query can not be directly served by the gateway. This is a first chance (internal) exception that all newer clients will know how to handle gracefully. This exception is traced, but unless you see it bubble up as an exception (which only
happens on older SDK clients), then you can safely ignore this message.
How can I get rid of this error? Is it a matter of running separate queries by partition key? If so, would I have to keep track of what the existing key values are?

Node js REST Client Scaling the Data collection

I have a scenario where my Node.js client collects data from a REST API.
Scenario: my API endpoint is like this: http://url/{project}
where project is a parameter. The project names come from a database table.
Here is my procedure:
I get all the project names from the database into a list
then loop over the list, calling the REST endpoint for each project
My query: if I have a small number of projects in the database, this procedure works fine. But if I have around 1000 projects to collect, the requests take a long time and sometimes fail with timeout errors.
How can I scale this process so that it finishes collecting the data in a reasonable amount of time?
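One common approach is to cap the number of requests in flight with a small worker pool, instead of looping sequentially or firing all 1000 requests at once. Below is a minimal Node.js sketch; `fetchProject` is a hypothetical stand-in for the real REST call (e.g. an HTTP GET to `http://url/{project}`), and the pool size of 10 is only an assumption to tune against what the API tolerates:

```javascript
// Stand-in for the real HTTP call; resolves with the project's data.
async function fetchProject(project) {
  return { project, data: `data-for-${project}` };
}

// Run a fixed number of workers that pull project names from a shared
// queue. JavaScript is single-threaded, so queue.shift() between awaits
// is safe without extra locking.
async function collectAll(projects, concurrency = 10) {
  const queue = [...projects];
  const results = [];
  const workers = Array.from({ length: concurrency }, async () => {
    while (queue.length > 0) {
      const project = queue.shift();
      results.push(await fetchProject(project));
    }
  });
  await Promise.all(workers);
  return results;
}
```

Sequentially, 1000 requests at ~200 ms each would take over three minutes; with 10 workers the wall-clock time drops roughly tenfold while keeping load on the API bounded. Adding per-request retry with backoff would also help with the intermittent timeout failures.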

mongodb Atlas server - slow return

So I understand how some queries can take a while, and querying the same information many times can just eat up RAM.
I am wondering: is there a way to make the following query more friendly for real-time requests?
const LNowPlaying = require('mongoose').model('NowPlaying');
var query = LNowPlaying.findOne({"history":[y]}).sort({"_id":-1})
Our iOS and Android apps request this information every second, which takes a toll on MongoDB Atlas.
We are wondering if there is a way in Node.js to cache the returned data for at least 30 seconds, and then fetch the new now-playing data once it has changed.
(Note: we have a listener script that listens for song metadata changes and updates NowPlaying for every listener.)
MongoDB will try to cache queried data in memory where possible, but the frequent queries mentioned may still put too much load on the database.
You could use Redis, Memcached, or even an in-memory cache on the Node.js side to hold the query results for a time. The listener script you mention could invalidate the cache each time a song's metadata updates, so clients still get the most up-to-date data. One example of a backend-agnostic cache client for Node.js is catbox.
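As a sketch of the in-memory option: values are served from memory until a TTL expires, and the listener script calls `invalidate()` whenever song metadata changes so the next read hits the database. Here `loader` is a hypothetical stand-in for the real `LNowPlaying.findOne(...).sort({"_id": -1})` query:

```javascript
// Minimal in-memory TTL cache with explicit invalidation.
function makeTtlCache(loader, ttlMs = 30000) {
  let cached = null;   // last loaded value
  let loadedAt = 0;    // timestamp of the last load
  return {
    async get() {
      const now = Date.now();
      if (cached === null || now - loadedAt >= ttlMs) {
        cached = await loader();   // refresh from the database
        loadedAt = now;
      }
      return cached;
    },
    invalidate() {
      // Called by the listener script when song metadata changes,
      // so the next get() reloads fresh data immediately.
      cached = null;
    },
  };
}
```

With a 30-second TTL, the per-second polling from the apps results in at most a couple of database reads per minute per Node process instead of sixty, and invalidation keeps the "fetch when the data has changed" behaviour.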

Entity framework core stress testing is slow

I built a .NET Core 2.1 application with EF Core.
I use transactions with the read-uncommitted isolation level.
I built an async API with a simple async EF query (get 5 fields of the first user, no references to other tables).
[query user][1]
When I send a single request, the query takes little time.
When I stress test with 10 threads (ramp-up: 5, loop forever, using JMeter), the query time is the same.
However, when I stress test the API with 100 threads (ramp-up: 20s, loop forever), some queries take little time, some take a long time (maybe 5s, 10s, 25s, ...), and others throw a connection timeout exception.
What should I do?
Issue resolved: it took some days of investigating, but I tried the following solution and it's working well, so I will share it in this post. If you have other solutions to increase performance, please tell me about them.
Creating database connections is an expensive process that takes time. You can specify a minimum pool of connections that should be created and kept open for the lifetime of the application; these are then reused for each database call.
Use the "Read Uncommitted" transaction isolation level
Use the same database connection for multiple operations within one request
Make all APIs and methods async, and make sure not to mix async with sync
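As a sketch of the pooling point, minimum and maximum pool sizes can be set directly in the connection string (assuming SQL Server and the standard ADO.NET keywords; server name, database, and credentials below are placeholders):

```
Server=myserver;Database=mydb;User Id=appuser;Password=...;Min Pool Size=20;Max Pool Size=200;
```

With a minimum pool, connections are opened once at startup and reused, so requests under load don't pay the connection-creation cost.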
Thanks all !!!
First, using JMeter, run your test in non-GUI mode to ensure you don't get skewed results, and follow best practices; see:
https://www.ubik-ingenierie.com/blog/jmeter_performance_tuning_tips/
Once you have confirmed the issues are real, check multiple things:
No N+1 select issue (loops of queries)
Granularity of retrieved data: are you retrieving too much data?
Performance of the SQL queries issued, by looking at the DB
Pool size
See some interesting blogs:
http://www.progware.org/Blog/post/Slow-Performance-Is-it-the-Entity-Framework-or-you.aspx
https://www.thereformedprogrammer.net/entity-framework-core-performance-tuning-a-worked-example/
https://medium.com/@hoagsie/youre-all-doing-entity-framework-wrong-ea0c40e20502

BigQuery connector for excel - Request failed: Error. Unable to execute query. Timeout while fetching URL

I am trying to pull data from BigQuery into Excel.
When I run simple and fast queries, everything runs fine. When I run a "heavy" query that takes long to retrieve, I get the following error:
Request failed: Error. Unable to execute query. Timeout while fetching URL: https://www.googleapis.com/bigquery/v2/projects/{my-project}/queries.
I can see the query and retrieve its results in the browser tool's query history.
I manage to retrieve data for simpler queries.
Any ideas?
I believe it has to do with the default timeout configuration. Is there a way to set the timeout parameters for the connector?
Many thanks for your support.
It looks like the BigQuery web connector was not setting a timeout correctly. We have now updated it to 60 seconds, from 15 seconds. 60 seconds is the longest timeout we can use without major restructuring, because the connector is hosted in an App Engine app.
Your 8-10 minute query, unfortunately, will not work. One alternative is to run the query yourself and save the result in a BigQuery table (i.e. set a destination table for the query) and then just read that table from Excel (i.e. select *).