How can I load test my website on Azure? [closed]

I need to measure how many concurrent users my current Azure subscription will accept, to determine my cost per user. How can I do this?

This is quite a big area within capacity planning of a product/solution, but effectively you need to script up a user scenario, say using a tool like JMeter (VS2012 Ultimate has a similar feature), then fire off lots of requests to your site and monitor the results.
Visual Studio can deploy your Azure project in a profiling mode, which is great for detecting bottlenecks in your code for optimisation. But if you just want to see how many requests per role it takes before something breaks, JMeter should work.
There are also lots of products on offer, like http://loader.io/, which saves you worrying about bandwidth issues, scripting, etc., and should just work.
If you do roll your own manual load-testing scripts, be careful to avoid false negatives and false positives. By this I mean that if your internet connection is slow and you send out millions of requests, the limited bandwidth of your connection may make your site appear VERY slow, when in fact it's not your site at all...
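If you do go the roll-your-own route, the core of such a script is small. Below is a minimal sketch in Python (standard library only); the URL, concurrency and request counts are placeholder assumptions, and for the bandwidth reasons above it is best run from a machine close to, or inside, Azure.

import time
import statistics
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

URL = "https://your-app.example.net/"  # placeholder endpoint, not a real site
CONCURRENCY = 50                       # simulated concurrent users (assumption)
REQUESTS = 500                         # total requests to send (assumption)

def timed_get(_):
    # Time a single GET; count failures instead of raising.
    start = time.perf_counter()
    try:
        with urlopen(URL, timeout=10) as resp:
            resp.read()
        ok = True
    except Exception:
        ok = False
    return ok, time.perf_counter() - start

with ThreadPoolExecutor(max_workers=CONCURRENCY) as pool:
    results = list(pool.map(timed_get, range(REQUESTS)))

latencies = sorted(t for ok, t in results if ok)
errors = sum(1 for ok, _ in results if not ok)
print(f"errors: {errors}/{REQUESTS}")
if latencies:
    print(f"median: {statistics.median(latencies):.3f}s")
    print(f"p95:    {latencies[int(len(latencies) * 0.95)]:.3f}s")

Ramp CONCURRENCY up between runs and watch where the error rate or the p95 latency starts to climb; that knee is the capacity figure the question is after.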

This has been answered numerous times. I suggest searching [Azure] 'load testing' and start reading. You'll need to decide between installing a tool to a virtual machine or Cloud Service (Visual Studio Test, JMeter, etc.) and subscribing to a service (LoadStorm)... For the latter, if you're focused on maximum app load, you'll probably want to use a service that runs within Azure, and make sure they have load generators in the same data center as your system-under-test.
Announced at TechEd 2013, the Team Foundation Test Service will be available in Preview on June 26 (coincident with the //build conference). This will certainly give you load testing from Azure-based load generators. Read this post for more details.
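Whichever tool or service you settle on, the cost-per-user figure the question asks for is then simple arithmetic. A sketch with made-up numbers (both inputs are assumptions you would replace with your own bill and your own measured ceiling):

monthly_cost = 450.00         # what the subscription bills per month (made up)
max_concurrent_users = 1200   # sustained users at acceptable latency, from your load test (made up)

cost_per_concurrent_user = monthly_cost / max_concurrent_users
print(f"~${cost_per_concurrent_user:.2f} per concurrent user per month")  # ~$0.38 here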

Related

Single or multiple instances of Application insights resource? [closed]

We have a microservice project with multiple applications consisting of frontends (Angular, AngularJS), backend apps (ASP.NET Core, PHP), gateways, etc.
I was wondering whether it's the correct approach to have an Application Insights resource per project, or whether there should be just one per environment for all the applications? It seems that if I create multiple Application Insights resources and assign them to separate projects, Azure can somehow figure out that they are all linked (routes are visible on the application map). I'm not sure what the correct approach is.
There are a few things to take into account here, like the number of events you're tracking and whether that 'fits' into one instance of Application Insights, or whether you're OK with using Sampling.
As per the FAQ: use one instance:
Should I use single or multiple Application Insights resources?
Use a single resource for all the components or roles in a single business system. Use separate resources for development, test, and release versions, and for independent applications.
See the discussion here: Should I use single or multiple Application Insights resources?
I would have one Application Insights resource per service. The reason is that Application Insights doesn't cost anything until you hit the threshold, so if you use one resource to log everything, it's likely that you will hit the threshold pretty quickly.
Also, it is good practice to separate out the logs for each service, as the data they hold can differ with regard to personal information.
You can, however, track a request across all services via the application map, or by writing a query that combines the logs across multiple Application Insights resources.
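To sketch that last point, a cross-resource Kusto query can union the request logs of several Application Insights resources. This is a hypothetical example, assuming the azure-monitor-query and azure-identity Python packages; the resource names 'ai-frontend' and 'ai-backend' and the workspace ID are placeholders.

from datetime import timedelta
from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

WORKSPACE_ID = "<log-analytics-workspace-id>"  # placeholder

# app() lets one query span multiple Application Insights resources.
QUERY = """
union app('ai-frontend').requests, app('ai-backend').requests
| summarize count(), avg(duration) by cloud_RoleName, name
| order by count_ desc
"""

client = LogsQueryClient(DefaultAzureCredential())
response = client.query_workspace(WORKSPACE_ID, QUERY, timespan=timedelta(hours=24))
for table in response.tables:
    for row in table.rows:
        print(row)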

Learning Windows Azure at home? [closed]

I want to learn Windows Azure to prepare for the MCSD Web Development certification. Assuming that I have access to Visual Studio, VMware, SQL Server, etc., is it possible to develop and test Azure applications locally? I want to run Azure on my virtual machine without registering on the Microsoft website, applying for trial periods, etc. Any suggestions?
TL;DR: You can learn a lot about Azure without an account. Probably enough to pass the test, but maybe not enough to manage a production deployment.
You can learn a lot about how applications run inside Azure using the emulators (express and full) that are included with the Azure tools for Visual Studio. Microsoft has several decent articles on getting started with the Azure tools. However, there is some tacit knowledge about actually using Azure -- things like how to navigate the management portals (or the fact that there are currently two portals) -- that can probably only be learned through actually using the infrastructure. Those kinds of questions may not be on the test, but the knowledge will definitely be helpful if you ever have to deal with Azure in a professional context. Start with the emulator, build some things that run on Azure, and once you have a few samples, consider using a 30-day trial to actually run something in Azure and get a "feel" for the real thing.
As a side note, the Azure platform has evolved quite a bit over the last several years... if you find yourself reading an article from 2011 or '12, you may want to check again for newer information, as the recommended tools/APIs/etc may be deprecated or just plain gone in the newest SDKs.
The best way to understand Azure without an Azure account is to install the Windows Azure Pack.
https://technet.microsoft.com/en-us/library/dn296435.aspx
Try Microsoft Virtual Academy. It's free, and if you set up a Microsoft Account you can track your progress. They have a lot of courses on different Microsoft products, and I just searched and found a few for Azure.
The thing I like about the courses is that they are presented by MVPs, MCTs and Microsoft Evangelists, so they know what they are talking about.

Where are the Windows Azure datacenter locations? [closed]

I am looking for some sort of map showing the physical locations of Microsoft's worldwide Windows Azure data centers.
Can you help?
Here is a public Google Map of Azure datacenter locations - https://maps.google.com/maps/ms?msid=214511169319669615866.0004d04e018a4727767b8&msa=0&ll=-3.513421,-145.195312&spn=147.890481,316.054688
Microsoft does not disclose the exact locations of its data centres, for obvious reasons, although the internet does have some information you may have seen, such as http://matthew.sorvaag.net/2011/06/windows-azure-data-centre-locations/
Worth noting, though, that this refers only to the 'main' Windows/SQL Azure data centres; in addition there are many CDN nodes around the world in smaller data centres.
I am curious though - why do you ask?
The link below will also give you the locations of the data centers.
http://azure.microsoft.com/en-in/regions/
The exact physical location of a data centre isn't usually relevant for users of applications. What's more important is the latency that they see when reaching the application.
But the most important thing is usually the speed of your own application.
For example, at my particular location in the UK I see somewhat better responses from the Northern Europe Azure site than the Western Europe site. This will be down to the particular route taken by packets from my PC through the local network and out to the point on the wider Internet where it peers with the Microsoft Azure systems.
If I'm dialled in through a VPN to an office in the US then I'll see better responses from a US-hosted Azure site.
However, compared to the ~60 millisecond ping time I see to the data centre, the ~200 millisecond response time of the SQL Azure queries on my site is something I can control, and it matters more.
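To put numbers on this for your own users, you can simply time a small request against a deployment in each candidate region. A minimal sketch in Python; the hostnames are placeholders for test deployments you would stand up yourself, and the median of a few samples smooths out routing noise.

import time
import statistics
from urllib.request import urlopen

ENDPOINTS = {
    "North Europe": "https://myapp-ne.example.net/",  # placeholder
    "West Europe": "https://myapp-we.example.net/",   # placeholder
}

for region, url in ENDPOINTS.items():
    samples = []
    for _ in range(5):
        start = time.perf_counter()
        with urlopen(url, timeout=10) as resp:
            resp.read()
        samples.append(time.perf_counter() - start)
    print(f"{region}: median {statistics.median(samples) * 1000:.0f} ms")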
Better ways to make your Web application faster include:
Cache, cache, cache. Use CDN-hosted versions of libraries such as jQuery where possible.
Minify your scripts and CSS, and merge them if possible.
Only perform postbacks as a last resort. Use JavaScript / AJAX to load data into your application.
... all of which applies to Web applications whether they're on Azure or other hosts.

Simple self-hosted website monitoring [closed]

I'm looking for a simple self-hosted website monitoring tool.
It should be something similar to watchmouse.com or pingdom.com, with a nice UI, colorful charts and so on (customers like that :)).
At the moment we also use Zabbix for HTTP monitoring, but since our hoster now takes care of hardware and software monitoring directly on the machine, we don't need Zabbix anymore.
For pure HTTP monitoring, Zabbix or another monitoring suite is really overkill.
So what I'm not looking for is:
Zabbix
Nagios
Hyperic
...
Sad but true: after some hours of research I wasn't able to find a fitting application. My hope now rests on you.
I realize this is an old question, but I was looking for something like this today and came across Cabot, which is self-hosted and free and, according to the project's description, "provides some of the best features of PagerDuty, Server Density, Pingdom and Nagios".
Hope this helps someone in the future.
I found this a while ago for my purposes. Nice and simple, and self-hosted.
You do need shell access to set up cron jobs for it, so it probably won't work in a shared environment.
php Server Monitor
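For anyone curious what such a cron-driven check boils down to, here is a minimal sketch in Python rather than PHP; the URLs and log path are placeholders. Schedule it with something like */5 * * * * in crontab.

import time
from datetime import datetime, timezone
from urllib.request import urlopen

SITES = ["https://example.com/", "https://example.org/"]  # placeholders
LOGFILE = "/var/log/site-checks.log"                      # placeholder path

with open(LOGFILE, "a") as log:
    for url in SITES:
        start = time.perf_counter()
        try:
            with urlopen(url, timeout=10) as resp:
                status = resp.status
        except Exception:
            status = "DOWN"
        elapsed = (time.perf_counter() - start) * 1000
        log.write(f"{datetime.now(timezone.utc).isoformat()} {url} {status} {elapsed:.0f}ms\n")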
Hope this helps.
Peter
I had a lot of success with Groundwork in the past. It's a BEAST and does just about everything imaginable, and it can be configured in so many ways. It might be overkill if you are just looking for something to schedule some HTTP checks and then graph the logs.
Groundwork is more for enterprise-level deployments and has both paid and community editions, with a pretty active community behind it too.
Not sure if you have already found a solution to this or not, but give Apica Systems' Synthetic Monitoring a shot. You can use the full SaaS, full on-premise, or hybrid model of this system. Take a look at the free trial, and if you like what you see, the full portal as well as the monitoring agents (with tons more features than the trial) can be hosted behind your firewall in your own network. As for monitoring, you can monitor websites/mobile apps, API endpoints, DNS, etc. You can also run complex use cases and see how the web app responds using Selenium or ZebraTester scripts.
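For a flavour of what such a scripted use case looks like, here is a minimal Selenium sketch in Python (not Apica's own script format; the URL and expected title are placeholder assumptions, and it needs the selenium package plus a local Chrome):

import time
from selenium import webdriver

driver = webdriver.Chrome()
try:
    start = time.perf_counter()
    driver.get("https://example.com/login")  # placeholder URL
    elapsed = time.perf_counter() - start
    # A synthetic check asserts on page content, not just HTTP status.
    assert "Login" in driver.title, "unexpected page title"
    print(f"page loaded in {elapsed:.2f}s")
finally:
    driver.quit()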
If all you want to monitor is website uptime/downtime and response time, I'd have a look at TurboMonitor - it doesn't have all the bells and whistles provided by some other monitoring websites but it's quick and accurate for those two things.
Price-wise, I wouldn't take what they have on their website too seriously. I only found out about them when I met them in person, and they were very happy to give me a "professional" account for free, which is supposedly around 5€/month on their website.

Monitoring Bandwidth on your server [closed]

I used to be on a shared host where I could use their standard tools to look at bandwidth graphs.
I now have my sites running on a dedicated server and I have no idea what's going on :P sigh
I have installed Webmin on my Fedora Core 10 machine and I would like to monitor bandwidth. I was about to set up the bandwidth module when it gave me this warning:
Warning - this module will log ALL network traffic sent or received on the
selected interface. This will consume a large amount of disk space and CPU
time on a fast network connection.
Isn't there anything I can use that is more lightweight and suitable for a NOOB? 'cough' Free tool 'cough'
Thanks for any help.
vnStat is about as lightweight as they come. (There are plenty of front ends around if the graphs the command-line tool gives aren't pretty enough.)
I use munin. It makes pretty graphs and can set up alerts if you're so inclined.
Unfortunately this is not for *nix, but I have an automated process to analyse my IIS logs: it moves them off the web server and analyses them with Web Log Expert. Provided the appropriate counter is turned on, it gives me the bandwidth consumed for every element of the site.
The free version of their tool won't allow scripting, but it does the same analysis. It supports W3C Extended and Apache (Common and Combined) log formats.
Take a look at mrtg. It's fairly easy to set up: it runs a simple cron job to collect SNMP stats from your router and shows some reasonable, simple graphs. Data is stored in an RRD database (see the mrtg page for details) and can be mined for other uses as well.
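If all you want is a quick number rather than graphs, the raw counters these tools sample are already sitting in /proc/net/dev on Linux. A minimal Python sketch; 'eth0' is an assumption, so adjust it to your interface.

import time

IFACE = "eth0"  # assumption: change to your interface

def read_bytes(iface):
    # /proc/net/dev lines look like: "eth0: rx_bytes rx_packets ... tx_bytes ..."
    with open("/proc/net/dev") as f:
        for line in f:
            if line.strip().startswith(iface + ":"):
                fields = line.split(":", 1)[1].split()
                return int(fields[0]), int(fields[8])  # rx_bytes, tx_bytes
    raise ValueError(f"interface {iface!r} not found")

rx1, tx1 = read_bytes(IFACE)
time.sleep(5)
rx2, tx2 = read_bytes(IFACE)
print(f"rx: {(rx2 - rx1) / 5 / 1024:.1f} KiB/s  tx: {(tx2 - tx1) / 5 / 1024:.1f} KiB/s")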
