It's an ASP.NET web service project.
API_A: a simple API.
API_B: generates an RDLC report via the WebForms LocalReport.
API_A takes about a second under normal conditions.
But when I call API_B from 20 threads (a SoapUI load test), the whole website becomes very slow. All APIs slow down; for example, API_A then takes 30 seconds.
Does anyone know why? Thanks.
Related
I am trying to use Logic Apps to ping our website every 10 minutes, and I would like to get the response time of that call to make sure the website is not slow.
Currently I am doing this:
1. Recurrence (every 10 minutes)
2. Get Current Time
3. HTTP GET call
4. Get Current Time 2
5. Difference of (Current Time 2 - Current Time)
6. Condition to see if it is greater than a threshold
This does not look like a clean solution. I'm wondering if there is an easier way to get the time/latency of the HTTP call in step 3.
According to the official docs, it is not possible to get the response time with the connector you're using. You'd better use Azure Functions for that. More info:
https://learn.microsoft.com/en-us/azure/connectors/connectors-native-http
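Following the answer above, a timed check could be written as a small Node helper inside an Azure Function. A sketch (the helper itself is plain JavaScript; the `fetch` call, URL, and threshold handling are assumptions about how it would be wired up):

```javascript
// Measure how long an async call takes, in milliseconds.
// The call is injected, so the helper works with any HTTP client.
async function measureResponseTime(call) {
  const start = process.hrtime.bigint();
  const response = await call();
  const elapsedMs = Number(process.hrtime.bigint() - start) / 1e6;
  return { response, elapsedMs };
}

// Example wiring: time a (hypothetical) fetch of the monitored site
// and compare against a threshold, replacing steps 2-6 of the workflow.
async function checkSite(url, thresholdMs) {
  const { response, elapsedMs } = await measureResponseTime(() => fetch(url));
  return { ok: response.ok && elapsedMs <= thresholdMs, elapsedMs };
}
```

The function could then be called on the same 10-minute recurrence, with the alerting condition applied to the returned `elapsedMs`.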
You can use Azure Application Insights for this kind of situation; it's the best solution.
https://learn.microsoft.com/en-us/azure/azure-monitor/app/app-insights-overview
I'm running a web scraper in my React (MERN stack) web app. I'm using the request-promise (rp) and cheerio libraries to fetch the URL/HTML.
I have this method run in componentWillMount() every time a user goes to page X. The array it fetches is around 80-150 elements long, with 4-5 objects each. But it doesn't seem very efficient to run it every time a user enters page X. So is there a better way to do it? Sometimes it takes a while before the array loads: from 5 seconds up to 30-40 seconds at most.
One option I wondered about is a fetch method running every 15 minutes (for the whole server) or so, posting the results to my MongoDB, and then retrieving them when a user enters page X instead. Is that possible in any way, like an external method that runs without anyone on the page?
Or is there any script you could run on your desktop every 15 minutes to push data to the database?
Ended up using Heroku Scheduler to set up a cron job; works great.
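A standalone script along these lines could be run by Heroku Scheduler (or any cron). The scrape and store steps are stubbed here; `scrapePage` and `saveToDb` are placeholders standing in for the real request-promise/cheerio and MongoDB calls:

```javascript
// refreshCache: fetch the data once and persist it, so page X can
// read the cached copy from MongoDB instead of scraping on every visit.
async function refreshCache(scrapePage, saveToDb) {
  const items = await scrapePage();   // e.g. request-promise + cheerio parsing
  await saveToDb(items);              // e.g. collection.replaceOne(..., { upsert: true })
  return items.length;
}

// When run by the scheduler, wire in the real implementations:
// refreshCache(realScrape, realSave).then((n) => console.log(`cached ${n} items`));
```

The React page then only reads the cached collection, so its load time no longer depends on the scrape.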
I have a very large Node.js controller that runs through huge tables and checks data against other huge tables. This takes quite a while to run.
Today my page only shows the final result after 20 minutes.
Is there any way to stream the result continuously from the controller to the web page, like a real-time scrolling log of what's going on?
(Or is Socket.IO my only option?)
You can try node-cron: set up a dynamic query that fetches the data with different limits, and append the data on the front-end side.
I am not sure whether this is a proper way to do it.
The goal of the app is to generate a PDF using Puppeteer: we fetch the data, build the HTML template, then generate the PDF with headless Chrome and return a link to the newly generated file.
The issue is that it takes about 7000 ms to generate a PDF, mainly because of three Puppeteer functions: launch() (launching the headless browser), goto() (navigating to the HTML template) and pdf() (generating the PDF).
So with around 7-8 seconds to answer one request, more incoming requests or a sudden spike could easily push it to 40-50 seconds for 30 simultaneous requests, which I find unacceptable.
After much time spent on research, I will implement the cluster module to take advantage of multiple processes.
But besides clustering, are there any other options to optimize the time on a single instance?
There are some things to consider:
Consider calling puppeteer.launch() once per application start. Your conversion script then just checks whether a browser instance already exists and reuses it by calling newPage(), which basically creates a new tab, instead of launching a new browser every time.
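That first point can be sketched as a lazy singleton. Here the launcher is injected so the snippet stays self-contained; with Puppeteer it would be `makeBrowserPool(() => puppeteer.launch())` (an assumption about your setup):

```javascript
// Lazily create one shared browser instance and hand out new pages (tabs).
// The expensive launch happens once, on first use, not per request.
function makeBrowserPool(launch) {
  let browserPromise = null;
  return {
    async newPage() {
      if (!browserPromise) browserPromise = launch(); // launch exactly once
      const browser = await browserPromise;
      return browser.newPage();                        // a new tab, not a new browser
    },
  };
}
```

Each request then calls `pool.newPage()`, renders its PDF, and closes only the page.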
You may also intercept requests with page.on('request', this.onPageRequest); before calling goto(), and filter out certain types of files the page is loading that you don't need for PDF rendering; you may filter out external resources as well, if that applies to your case.
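The interception in this point reduces to a predicate over resource types. A sketch (the resource-type strings match Puppeteer's request.resourceType() values, but which types to block is an assumption; tune it to what your template actually needs):

```javascript
// Decide whether a request is unnecessary for PDF rendering.
// 'media' and 'websocket' are example choices; keep 'image' etc.
// if your PDFs embed them.
const BLOCKED_TYPES = new Set(['media', 'websocket']);

function shouldAbort(resourceType) {
  return BLOCKED_TYPES.has(resourceType);
}

// With Puppeteer, after await page.setRequestInterception(true):
// page.on('request', (req) =>
//   shouldAbort(req.resourceType()) ? req.abort() : req.continue());
```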
When using pdf(), you may return the Buffer directly from your service instead of writing to the file system and returning a link to the PDF file's location. This may or may not speed things up, depending on your service setup; in any case, less I/O should be better.
This is probably all you can do for a single instance of your app. With the implementation above, a regular (couple of pages) PDF with a few images renders for me in 1-2 seconds.
To speed things up further, use clustering. Rather than embedding it inside your application, you may consider using the PM2 process manager to start and scale multiple instances of your service.
I am working on my client's pure HTML/CSS website, which binds data to JSON datasets using Knockout.js. For the tables I have used the DataTables library.
I have hosted the website on Windows Azure Websites.
Here is the link to the website: http://bit.ly/(REMOVED SINCE IT IS CONFIDENTIAL)
It takes around 4 seconds to load the website, even though I have used a CDN for the common JS libraries.
It should not take that long, and I am unable to find the culprit. I am fetching data from 4 different datasets; does that impact performance? Or is there a problem with the Windows Azure datacenter, since it takes a while to get a response from the Azure server? Is Azure the culprit?
You can examine the page load time at the website link given above.
Any help would be appreciated.
Solution:
Instead of using synchronous calls, I used:
$.getJSON(url, function (data) {
    // whole Knockout.js logic and bindings
});
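Since the page pulls from 4 datasets, the now-asynchronous calls can also be issued in parallel rather than one after another. A sketch with plain Promises, where `loadJson` stands in for `$.getJSON` (or fetch) and the URLs are placeholders:

```javascript
// Fetch all datasets concurrently and resolve once all have arrived,
// so the Knockout bindings can be applied a single time afterwards.
async function loadAllDatasets(loadJson, urls) {
  const datasets = await Promise.all(urls.map((url) => loadJson(url)));
  return datasets; // e.g. pass these into ko.applyBindings(...)
}
```

The total wait then approaches the slowest single request instead of the sum of all four.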
All model .js files (starting with patientMedicationChart-Index.js) are loaded synchronously (async:false is set in that file). This means that the browser has to wait for each script file to be loaded before continuing to load the next.
I count about 10 files loaded like that for your demo, which (for me) each take about 200ms to load (about 95% of that 200ms is spent waiting for a response, which also seems rather slow; that might be a server issue with Azure). So times 10 is already 2 seconds spent loading those files, and only after loading all of them will the ready event for the page be triggered.
There might be a reason for wanting to load those files synchronously, but as it is, it's causing a significant part of the loading time for the entire page.