I want to learn Windows Azure to prepare for the MCSD Web Development certification. Assuming that I have access to Visual Studio, VMware, SQL Server, etc., is it possible to develop and test Azure applications locally? I want to run Azure on my virtual machine without registering on Microsoft's website, signing up for trial periods, etc. Any suggestions?
TL;DR: You can learn a lot about Azure without an account. That's probably enough to pass the exam, but maybe not enough to manage a production deployment.
You can learn a lot about how applications run inside of Azure using the emulators (express and full) that are included with the Azure tools for Visual Studio. Microsoft has several decent articles on getting started with the Azure tools. However, there is some tacit knowledge about actually using Azure -- things like how to navigate the management portals (or the fact that there are currently two portals) -- that can probably only be learned by actually using the infrastructure. Those kinds of questions may not be on the test, but the knowledge will definitely be helpful if you ever have to deal with Azure in a professional context. Start with the emulator, build some things that run on Azure, and once you have a few samples, consider using a 30-day trial to actually run something in Azure and get a "feel" for the real thing.
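As a concrete example, a project targeting the local storage emulator can point its connection string at development storage instead of a real account, so no Azure subscription is needed. This assumes the classic .NET app.config approach; adapt to however your project reads configuration:

```xml
<appSettings>
  <!-- Targets the local storage emulator; swap in a real account
       connection string when you eventually deploy to Azure -->
  <add key="StorageConnectionString" value="UseDevelopmentStorage=true" />
</appSettings>
```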
As a side note, the Azure platform has evolved quite a bit over the last several years... if you find yourself reading an article from 2011 or '12, you may want to check again for newer information, as the recommended tools/APIs/etc may be deprecated or just plain gone in the newest SDKs.
The best way to understand Azure without an Azure account is to install Windows Azure Pack.
https://technet.microsoft.com/en-us/library/dn296435.aspx
Try Microsoft Virtual Academy. It's free, and if you set up a Microsoft Account you can track your progress. They have a lot of courses on different Microsoft products, and a quick search turned up a few for Azure.
What I like about the courses is that they are presented by MVPs, MCTs, and Microsoft Evangelists, so they know what they are talking about.
I already use the Bot Framework with C#, hosted in Azure. Our bot is integrated with LUIS and uses a SharePoint list to answer FAQs. We want to upgrade our bot, and we now see many new bot solutions like Power Virtual Agents and Composer. Is it worth switching to any of the new solutions? If yes, which one is better, and what are the fees?
TL;DR: if you are happy with your current solution, there isn't any reason to switch.
Estimated costs:
Bot Framework - $60/mo (10,000 messages)
Power Virtual Agents - $1,000/mo (2,000 conversations)
Composer - Assumed to be the same or similar to Bot Framework, $60/mo
This is largely a matter of preference, but I can provide my personal insight. This post appears to be a little out of date, but it gives a good high-level overview of the three options. I personally don't have experience with Bot Framework Composer.
First, a general opinion on switching: if you already have this working on Bot Framework and are happy with it, I see little reason to switch. Bot Framework is going to be your cheapest option. There are obviously a lot of variables to cost, but if you are using the free tier for everything except your App Service plan, you could probably run a Bot Framework bot for under $60/month for 10,000 messages. Compare that to Power Virtual Agents, which costs $1,000/month per 2,000 sessions. At 5 messages per session, 2,000 sessions is the same 10,000-message volume, so you'd be paying $1,000 instead of $60 for equivalent usage; adjust for your expected conversation length. If Bot Framework Composer is just a front end for a traditional Bot Framework bot, I would expect the pricing to be the same, around $60/month.
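To make the comparison concrete, here is a back-of-envelope cost sketch using the estimates above. The prices are illustrative and may be out of date; always check the current Azure and Power Platform pricing pages:

```python
# Rough monthly-cost comparison between Bot Framework and Power Virtual
# Agents, using the (possibly stale) price points quoted in this answer.
BOT_FRAMEWORK_COST = 60.0          # USD/month for ~10,000 messages
BOT_FRAMEWORK_MESSAGES = 10_000
PVA_COST = 1_000.0                 # USD/month for 2,000 sessions
PVA_SESSIONS = 2_000

def monthly_cost(messages_per_month, messages_per_session=5):
    """Return (bot_framework_usd, pva_usd) for a given monthly message volume."""
    bot_framework = BOT_FRAMEWORK_COST * messages_per_month / BOT_FRAMEWORK_MESSAGES
    sessions = messages_per_month / messages_per_session
    pva = PVA_COST * sessions / PVA_SESSIONS
    return bot_framework, pva

bf, pva = monthly_cost(10_000)
print(f"Bot Framework: ${bf:.0f}/mo vs Power Virtual Agents: ${pva:.0f}/mo")
# -> Bot Framework: $60/mo vs Power Virtual Agents: $1000/mo
```

Plugging in longer conversations (a higher `messages_per_session`) narrows the gap, which is why expected conversation length matters so much here.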
Here are my thoughts on each of the platforms.
Bot Framework gives you total control over the features of your bot. It's 100% code, which will turn some people off, but it's ideal for organizations with existing developer competencies, or for people who prefer not to fight with sometimes-finicky builder UIs. In my experience, it is MUCH easier to build integrations this way, especially to enterprise systems that may be using SOAP. If you can code it, this bot should be able to execute it. The biggest con is that some functionality that is out of the box in other options (notably transcript/session management and reporting) has only token support via modules and poor documentation. There is also no visual builder for your dialogs, though I actually find that those visuals can sometimes be more of a hindrance. For me, the flexibility outweighed the drawbacks, and this is what my organization has selected as our enterprise platform.
Power Virtual Agents is part of the Power Platform and goes completely the other way. It is completely no-code. There are a number of out of the box connectors (integrations), and you can build your own using custom connectors and flows. However, that's harder than it sounds. I have found this a very poor option if you need to create your own custom APIs. Our organization has determined that this would be the best option if you want to allow citizen developers to create their own bots without help from a developer or IT team. It excels at Q&A and simple dialogs, but beyond that I've found it to be frustrating to work with.
Bot Framework Composer seems to be a middle ground between the two. I don't have any personal experience with it. It seems there is still some coding required, but it does have a visual builder for dialogs, and it appears to have some other nice out-of-the-box features. I'm not sure if it's any easier to create integrations in this platform. I would see the niche here being an accelerator for coded Bot Framework bots, and it's also good for less technical people who would appreciate being able to visualize their process flows. It could also be a better tool if your customers are heavily involved in the development process, allowing them to see process flows as they are developed instead of only through test conversations.
If any of the MSFT guys have any additions or corrections to my analysis, let me know and I'll edit them into my response.
I have recently begun automating the deployment of all of the Azure resources and other modifications needed to build the dev environments at my company. I started working with PowerShell on Linux using the .NET Core release of the AzureRM module. As it turns out, half of the cmdlets for interacting with Azure live in another module, Azure, which doesn't have a .NET Core release yet. See my other recent post for additional details on that issue.
I tried running my script today on Windows and it bombed horribly, probably due to some subtle syntactic differences between the platforms; I haven't begun troubleshooting yet. But this got me thinking about whether PowerShell was even the best solution. Can anyone recommend an alternative method?
Preferably something less proprietary, with better cross-platform support. I recognize there are similar questions on Stack Overflow, but they address entire applications and CI/CD pipelines. I'm mostly referring to the underlying resource groups, security rules, etc. However, I will likely also leverage this script to deploy k8s, Couchbase, etc., so perhaps an entire design change is in order.
I'm looking forward to your insight, friends.
I'm using PowerShell on Linux and Windows to deploy resources to Azure without much hassle. But for resource provisioning I'd go with ARM templates to automate deployments: they can be deployed with almost anything, are fairly easy to understand when you scan through them, and they're just JSON.
ARM templates can be plugged into Ansible (which I'm using currently) and some other tools (VSTS, Terraform, etc.).
Also, Invoke-AzureRmResourceAction and New-/Get-/Remove-AzureRmResource are available on Linux and Windows and can be used to do pretty much anything in Azure (though they are a lot trickier than the native cmdlets).
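For illustration, a minimal ARM template for a storage account might look like this. The API version, SKU, and parameter name here are examples, not a canonical recipe; check the current template schema reference before using it:

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "storageAccountName": { "type": "string" }
  },
  "resources": [
    {
      "type": "Microsoft.Storage/storageAccounts",
      "name": "[parameters('storageAccountName')]",
      "apiVersion": "2016-01-01",
      "location": "[resourceGroup().location]",
      "sku": { "name": "Standard_LRS" },
      "kind": "Storage",
      "properties": {}
    }
  ]
}
```

It can then be deployed from PowerShell on either platform with something like `New-AzureRmResourceGroupDeployment -ResourceGroupName my-rg -TemplateFile storage.json` (resource group and file names are placeholders).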
Currently I am trying to learn various services of Amazon Web Services and Microsoft Windows Azure, like Amazon SNS, Amazon Storage, and Amazon Search.
I have been wondering why cloud platforms are so much more popular nowadays than the old traditional approach. For example, previously we stored our files (img, txt, doc, etc.) inside the web application project itself, but now many web applications store their files on Amazon or Azure storage.
What are the benefits of storing files and folders on a cloud platform?
Why are Amazon Search and Azure Search preferred, when searching was done before they existed and they are not freely available?
For push notifications, why use Amazon or Azure push notifications when we can easily send notifications through code available on the internet?
In general, I just want to know why web applications nowadays use cloud platforms (Azure or Amazon) even though they are costly. Can anybody explain this in some detail?
Among the many reasons, the most important and common ones I can think of are:
High Availability - When you manage your own services, there is always the operational challenge of ensuring that they do not go down, i.e., crash. This may cause downtime for your application or even data loss, depending on the situation. The cloud services you have mentioned offer reliable solutions that guarantee maximum uptime and data safety (by backup, for example). They often replicate your data across multiple servers, so that even if one of their servers goes down, you do not lose any data.
Ease of use - Cloud services make it very easy to use a specific service by providing detailed documentation and client libraries to use their services. The dashboard or console of many cloud services are often user friendly and do not require extensive technical background to use. You could deploy a Hadoop cluster in Google Compute Engine in less than a minute, for instance. They offer many pre-built solutions which you can take advantage of.
Auto-Scale - Many cloud services nowadays are designed to scale automatically with increasing traffic, so you do not have to worry about traffic spikes or application load.
Security - Cloud services are secure. They offer sophisticated security solutions that you can use to protect your service from misuse.
Cost - Hosting your own services requires extensive resources: high-end servers, dedicated system administrators, good network connectivity, and so on. Cloud services are quite cheap these days by comparison.
Of course you could solve these problems yourself, but smaller organizations often do not prefer to do so because of the operational overhead. It would take more time and resources to reach a stage where your solution is both reliable and functional. People would often prefer to work on the actual problem their application is trying to solve and abstract away most operational problems which cloud services readily offer.
p.s. These are some opinions from an early stage startup perspective.
I need to measure how many concurrent users my current azure subscription will accept, to determine my cost per user. How can I do this?
This is quite a big area within capacity planning of a product/solution, but effectively you need to script up a user scenario, say using a tool like JMeter (VS2012 Ultimate has a similar feature), then fire off lots of requests at your site and monitor the results.
Visual Studio can deploy your Azure project in a profiling mode, which is great for detecting bottlenecks in your code for optimisation. But if you just want to see how many requests per role it takes before something breaks, a tool like JMeter should work.
There are also lots of products on offer, like http://loader.io/ which is great for not having to worry about bandwidth issues, scripting, etc... it should just work.
If you do roll your own manual load-testing scripts, please be careful to avoid false negatives or false positives. By this I mean that if your internet connection is slow and you send out millions of requests, the bandwidth of your connection may cause your site to appear VERY slow, when in fact it's not your site at all...
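To make the idea concrete, here is a minimal hand-rolled load-test sketch in Python. It's illustrative only (real tools like JMeter handle ramp-up, distributed generators, and reporting), and it runs against a throwaway local server so the demo is self-contained; in practice you'd point it at your deployed role:

```python
# Minimal concurrent load test: fire GETs in parallel and report latency.
import http.server
import statistics
import threading
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

def start_stub_server(port):
    """Start a disposable local HTTP server in a background thread."""
    server = http.server.ThreadingHTTPServer(
        ("127.0.0.1", port), http.server.SimpleHTTPRequestHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server

def timed_get(url):
    """Fetch url once; return (status_code, elapsed_seconds)."""
    start = time.perf_counter()
    with urllib.request.urlopen(url) as resp:
        resp.read()
        return resp.status, time.perf_counter() - start

def load_test(url, total_requests=50, concurrency=10):
    """Fire total_requests GETs with the given concurrency; summarise latency."""
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        results = list(pool.map(timed_get, [url] * total_requests))
    latencies = [elapsed for _, elapsed in results]
    return {
        "ok": sum(1 for status, _ in results if status == 200),
        "total": total_requests,
        "median_s": statistics.median(latencies),
        "max_s": max(latencies),
    }

server = start_stub_server(8765)
print(load_test("http://127.0.0.1:8765/"))
server.shutdown()
```

You would raise `total_requests` and `concurrency` until response times degrade; per the bandwidth caveat above, run the generator from a machine close to the system under test.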
This has been answered numerous times. I suggest searching [Azure] 'load testing' and start reading. You'll need to decide between installing a tool to a virtual machine or Cloud Service (Visual Studio Test, JMeter, etc.) and subscribing to a service (LoadStorm)... For the latter, if you're focused on maximum app load, you'll probably want to use a service that runs within Azure, and make sure they have load generators in the same data center as your system-under-test.
Announced at TechEd 2013, the Team Foundation Test Service will be available in Preview on June 26 (coincident with the //build conference). This will certainly give you load testing from Azure-based load generators. Read this post for more details.
I'm looking for a simple self-hosted website monitoring tool.
It should be something similar to watchmouse.com or pingdom.com, with a nice UI, colorful charts and so on (customers like that :)).
At the moment we also use Zabbix for HTTP monitoring, but since our hosting provider now takes care of hardware and software monitoring directly on the machine, we don't need Zabbix anymore.
For pure HTTP monitoring, Zabbix or another full monitoring suite is really overkill.
So what I'm not looking for is:
Zabbix
Nagios
Hyperic
...
Sad but true: after some hours of research I wasn't able to find a fitting application. My hope now rests on you.
I realize this is an old question but I was looking for something like this today and came across Cabot which is self hosted and free, and according to the project's description: "provides some of the best features of PagerDuty, Server Density, Pingdom and Nagios".
Hope this helps someone in the future.
I found this a while ago for my purposes. Nice and simple and self hosted.
You do need shell access to set up cron jobs for it, so it probably won't work in a shared hosting environment.
php Server Monitor
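If you go this route, the cron side is a single entry along these lines; the install path, PHP binary location, and interval here are placeholders, so check the project's own install docs for the exact script name:

```
*/5 * * * * /usr/bin/php /var/www/phpservermon/cron/status.cron.php
```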
Hope this helps.
Peter
I had a lot of success with Groundwork in the past. It's a BEAST that does just about everything imaginable and can be configured in so many ways. It might be overkill if you are just looking for something to schedule some HTTP checks and graph the logs.
Groundwork is more for enterprise level deployments and has both Paid and Community editions with a pretty active community behind it too.
Not sure if you have already found a solution, but give Apica Systems' Synthetic Monitoring a shot. You can use the full SaaS, full on-premise, or hybrid model of this system. Take a look at the free trial, and if you like what you see, the full portal as well as monitoring agents (with many more features than the trial) can be hosted behind your firewall in your own network. As for monitoring, you can monitor websites/mobile apps, API endpoints, DNS, etc. You can also run complex use cases and see how the web app responds using Selenium or ZebraTester scripts.
If all you want to monitor is website uptime/downtime and response time, I'd have a look at TurboMonitor - it doesn't have all the bells and whistles provided by some other monitoring websites but it's quick and accurate for those two things.
Price-wise, I wouldn't take what they have on their website too seriously. I only actually found out about them when I met them in person and they were very happy to give me a "professional" account for free, supposedly like 5€/month or something on their website.
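For what it's worth, the core of an uptime/response-time check like these services run is small enough to sketch yourself; here is a hedged Python version (the URL you probe and any alert thresholds are up to you):

```python
# Bare-bones uptime/response-time probe: the essence of an HTTP monitor.
import time
import urllib.request

def check_site(url, timeout=10):
    """Return (is_up, response_time_seconds); is_up means HTTP 2xx/3xx."""
    start = time.perf_counter()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return 200 <= resp.status < 400, time.perf_counter() - start
    except Exception:
        # DNS failure, refused connection, timeout, or HTTP 4xx/5xx
        return False, time.perf_counter() - start
```

Run it from cron every few minutes, log the tuples, and you have the raw data that the colorful charts in these products are built from.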