Is Azure appropriate for building an entire database-backed website, with custom tables on the backend, and custom pages, forms, and CSS on the frontend?
Like any database-backed website, it needs a facility for backend logic that runs in response to client-browser GET & POST requests.
Which Azure resources are appropriate? Logic Apps?
Sharing an answer I received on MSDN (not sure yet if I will mark this one as the accepted answer):
While it is possible, there are better options to consider.
If you are building a JS-powered frontend (using Angular/React), you could host the static assets directly on blob storage and expose them as a static website. The static website documentation also covers how you could add a CDN in front of it for faster delivery to your customers.
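For illustration, here is a minimal sketch (one of several possible approaches; the portal and CLI work too) of enabling the static-website endpoint with the @azure/storage-blob SDK. The account URL is a placeholder, and the credential is assumed to have the necessary permissions:

```typescript
import { DefaultAzureCredential } from "@azure/identity";
import { BlobServiceClient } from "@azure/storage-blob";

// Turn on the static-website endpoint so files uploaded to the special
// $web container are served over HTTP(S).
async function enableStaticSite(): Promise<void> {
  const service = new BlobServiceClient(
    "https://mystorageaccount.blob.core.windows.net", // placeholder account
    new DefaultAzureCredential()
  );
  await service.setProperties({
    staticWebsite: {
      enabled: true,
      indexDocument: "index.html",
      errorDocument404Path: "404.html",
    },
  });
}

enableStaticSite().catch(console.error);
```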
The backend could be built entirely using Azure Functions. If you are using Table Storage or Cosmos DB, there are bindings available that you can use to simplify the code you would otherwise have to manage.
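As an illustration of those bindings, here is a minimal sketch assuming the Node.js v4 programming model for Azure Functions; the database/container names (mydb, items) and the CosmosDbConnection app setting are assumptions:

```typescript
import { app, input, HttpRequest, HttpResponseInit, InvocationContext } from "@azure/functions";

// Declarative Cosmos DB input binding: the runtime fetches the document whose
// id matches the {id} route parameter, so the handler contains no client code.
const itemInput = input.cosmosDB({
  connection: "CosmosDbConnection", // app setting holding the connection string
  databaseName: "mydb",
  containerName: "items",
  id: "{id}",
  partitionKey: "{id}",
});

app.http("getItem", {
  methods: ["GET"],
  route: "items/{id}",
  extraInputs: [itemInput],
  handler: async (req: HttpRequest, context: InvocationContext): Promise<HttpResponseInit> => {
    const item = context.extraInputs.get(itemInput);
    return item ? { jsonBody: item } : { status: 404 };
  },
});
```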
If you have background workflows (like batch jobs) that you have to run, then Logic Apps are indeed a good option. For complex use cases, Durable Functions might be a better fit though.
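To give a feel for Durable Functions, here is a minimal fan-out/fan-in sketch using the durable-functions npm package; the orchestration name, the processItem activity, and the work items are all assumptions:

```typescript
import * as df from "durable-functions";
import { OrchestrationContext } from "durable-functions";

// Orchestrator: fan out one activity call per work item, then wait for all
// of them to finish (a typical batch-job shape).
df.app.orchestration("batchJob", function* (context: OrchestrationContext) {
  const items = ["a", "b", "c"]; // placeholder work items
  const tasks = items.map((item) => context.df.callActivity("processItem", item));
  return yield context.df.Task.all(tasks);
});

// Activity: does the actual work for a single item.
df.app.activity("processItem", {
  handler: async (item: string): Promise<string> => {
    return `done: ${item}`;
  },
});
```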
I have an application that we implemented with a microservices-style architecture. The application consists of 6 services (6 Docker containers). I need to load test this application, and as I don't have much experience in the testing field, I'm not sure which method to use.
Right now, I use Gatling for the load test. I record the test script by starting the recorder and navigating around my application to capture all its routes. I went through most of the routes in that single recording in order to mimic a realistic user session. My thinking was that this is how users normally use the application, and that I can load test at 1000 times that level by editing the number of threads/users.
Later I read about API testing, which focuses on the APIs themselves, putting each API under heavy load. So I'm confused about which testing method I should use. If we go for API testing, it will only tell us how far that particular API can scale, right? (Not sure.)
Is there any issue with my method of load testing?
It depends entirely on what you hope to achieve...
If you're looking to validate that your entire application (code + production infrastructure) can handle a given load, then driving load as though users are going through the full website is the right path.
However, if you're looking to see how a particular API scales, or want to help developers explore the ramifications of changes, then you will probably want to drive that API directly, to avoid other limitations your system may have.
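To make the full-journey approach concrete, here is a rough sketch of a recorded journey scaled up to 1000 users, assuming Gatling's TypeScript SDK rather than the classic Scala DSL; the base URL and routes are placeholders:

```typescript
import { simulation, scenario, rampUsers } from "@gatling.io/core";
import { http } from "@gatling.io/http";

// A recorded-journey style simulation: chain the routes a real user would
// visit, then scale the number of injected virtual users.
export default simulation((setUp) => {
  const protocol = http.baseUrl("https://my-app.example.com"); // placeholder

  const journey = scenario("FullUserJourney")
    .exec(http("Home").get("/"))
    .pause(1)
    .exec(http("Products").get("/products"))
    .pause(1)
    .exec(http("Cart").get("/cart"));

  // Ramp 1000 users over 60 seconds instead of editing thread counts by hand.
  setUp(journey.injectOpen(rampUsers(1000).during(60))).protocols(protocol);
});
```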
I want to make sure I am taking the right approach.
I am building virtual environments in Azure on a regular basis, anywhere from 3 to 5 servers at a time. Each server needs 1 of 4 different resource profiles (RAM/CPU/...). Obviously I could script out each VM and just use PowerShell to deploy each individual VM each time.
What I really want, though, is a utility or webpage where I can say "I need to create x servers, and here are their specifications," see how much it will cost, and have it start building them.
Is there any tool like this, or what would be the best approach?
You could automatically create Azure resources from a Resource Manager template. You create a template file that deploys the resources and a parameters file that supplies parameter values to the template.
You can also easily edit and deploy the template in the Azure portal: search for Template, choose Deploy from a custom template, and then Build your own template in the editor. You can reuse the template after you save it, and you can find multiple guides and samples for whatever kind of template you want to deploy.
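If you want the utility the question asks about, a thin layer over the Resource Manager SDK is one way to start. Here is a hedged sketch using @azure/arm-resources; the resource group, template file, and parameter names (vmCount, vmSize) are assumptions about your template:

```typescript
import { readFileSync } from "fs";
import { DefaultAzureCredential } from "@azure/identity";
import { ResourceManagementClient } from "@azure/arm-resources";

// Deploy an ARM template with a parameters object, as a "create x servers
// with these specs" button could do behind the scenes.
async function deployBatch(): Promise<void> {
  const client = new ResourceManagementClient(
    new DefaultAzureCredential(),
    process.env.AZURE_SUBSCRIPTION_ID!
  );

  await client.deployments.beginCreateOrUpdateAndWait("my-rg", "vm-batch-1", {
    properties: {
      mode: "Incremental",
      template: JSON.parse(readFileSync("azuredeploy.json", "utf8")),
      parameters: {
        vmCount: { value: 5 }, // hypothetical template parameters
        vmSize: { value: "Standard_D2s_v3" },
      },
    },
  });
}

deployBatch().catch(console.error);
```

(Cost estimation isn't part of the deployment API; you would have to price the chosen sizes separately against Azure's published pricing.)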
When creating a bot in Azure Bot Service you have two hosting options according to this article: https://learn.microsoft.com/en-us/bot-framework/bot-service-overview-introduction
App Service Plan (Standard Azure web app, Web App Bot)
Consumption Plan (Azure Functions, Functions Bot)
I'm trying to understand the strengths and weaknesses of each. The billing model of a Function Bot would work best for my use case, but I seem to be finding limitations. It also seems that Microsoft's documentation is biased towards Web App Bots being the standard.
Here is what I know so far:
1. The billing model is different: App Service plans are billed more like an always-running VM, whereas Functions are pay-per-run.
2. App Service uses the standard ASP.NET MVC model; Functions use C# scripts.
3. Visual Studio seems to have better support for debugging and publishing App Service plan bots.
One thing I think is related to #2 is global message handling. The examples Microsoft gives for implementing a global message handler seem to require using global.asax.cs to register the handler, and this file isn't present in a Function Bot.
Are Web App bots the preferred option from Microsoft?
Is .NET Core better supported in either option?
Is there a way to implement global handlers in Function Bots?
Are there other specific weaknesses of Function Bots?
Is one option more "modern" than the other?
I have a client who is very protective of her data, and she asked me to replace my bot's default storage with a custom storage implementation that saves all the data in an on-premises database.
If I replace the storage, will the Bot Framework still permanently save any conversation data anywhere else (say, somewhere in Azure)? That's something my client would like to avoid for security reasons.
Thanks!
Saving and loading of all session data is handled by the ChatConnector's getData() and saveData() unless you provide your own via settings.storage. In non-emulator, real-life scenarios it will go to https://state.botframework.com/v3/botstate/...
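For reference, here is a sketch of what plugging in your own storage via settings.storage could look like with the Node.js SDK (v3). The Map is a stand-in for the on-premises database, and a real implementation should honor the persistUserData/persistConversationData flags on the context:

```typescript
import * as builder from "botbuilder"; // Bot Framework SDK v3 for Node.js

// Stand-in for the on-premises database; replace with real data-access code.
const onPremDb = new Map<string, builder.IBotStorageData>();

class OnPremStorage implements builder.IBotStorage {
  public getData(
    context: builder.IBotStorageContext,
    callback: (err: Error, data: builder.IBotStorageData) => void
  ): void {
    const key = `${context.userId}:${context.conversationId || ""}`;
    callback(null as any, onPremDb.get(key) || {});
  }

  public saveData(
    context: builder.IBotStorageContext,
    data: builder.IBotStorageData,
    callback?: (err: Error) => void
  ): void {
    const key = `${context.userId}:${context.conversationId || ""}`;
    onPremDb.set(key, data);
    if (callback) callback(null as any);
  }
}

const connector = new builder.ChatConnector({
  appId: process.env.MICROSOFT_APP_ID,
  appPassword: process.env.MICROSOFT_APP_PASSWORD,
});

// Registering custom storage here is what keeps session state out of the
// default state.botframework.com service.
const bot = new builder.UniversalBot(connector, (session) => {
  session.send("You said: %s", session.message.text);
}).set("storage", new OnPremStorage());
```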
The bot framework doesn't store anything else, I believe. I explored this exact question very recently. Take a look:
http://www.pveller.com/smarter-conversations-part-3-breadcrumbs/
http://www.pveller.com/smarter-conversations-part-4-transcript/
I had to read the source (many times, actually) to trace the inner workings of the Bot Framework, and I didn't see anything that would make me think there's another persistence layer somewhere.
You are probably better off asking on the official support channel to confirm and assure your client but I think you're good.
As to how reasonable it is... companies do far crazier things for all kinds of reasons :) By the way, will you also use Microsoft's LUIS for NLU? Does your client have similar concerns about all incoming messages going through that service? It's a deep rabbit hole. I think of engagement bots (vs. back-office automation) as very much cloud-native. It's not easy to shield yourself from the cloud and still benefit from all the new tech built for it.
All these Azure technologies (Bots, FaaS, Logic Apps, and Runbooks) can be used to run scheduled jobs.
I don't know when we should use each of them, or in which scenarios.
YMMV, but here are some pretty good rules of thumb:
Are you doing PowerShell based Automation work? If Yes, consider Azure Automation Runbooks.
Are you building a bot? If Yes, consider the Azure Bot Framework service.
Are you building a workflow that executes on a timer, especially one that integrates with other services? If Yes, consider Logic Apps.
Are you writing generic application code? If Yes, consider Azure Functions (see the timer-trigger sketch below).
If none of those fit, I'd be surprised, but you might try starting with Azure Functions, since we're kind of an "everything as a service". Still, there is a reason we have the different products: they specialize to enable better productivity within their specialty (Bots, Automation, and Integration).
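For the Functions case, here is a minimal sketch of a scheduled job as a timer-triggered function, assuming the Node.js v4 programming model; the function name and schedule are placeholders:

```typescript
import { app, InvocationContext, Timer } from "@azure/functions";

// NCRONTAB format: {second} {minute} {hour} {day} {month} {day-of-week}
// -> this one fires every day at 06:00 UTC.
app.timer("nightlyJob", {
  schedule: "0 0 6 * * *",
  handler: async (timer: Timer, context: InvocationContext): Promise<void> => {
    context.log(`Scheduled job fired; past due: ${timer.isPastDue}`);
    // ...generic application code goes here...
  },
});
```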
Note: I'm one of the PMs on the Azure Functions team here at Microsoft.