I have recently begun automating the deployment of all of the Azure resources and other modifications needed to build the dev environments at my company. I started working with PowerShell on Linux using the .NET Core release of the AzureRM module, only to find out that half of the cmdlets for interacting with Azure live in another module, Azure, which doesn't have a .NET Core release yet. See my other recent post for additional details on that issue.
I tried running my script today on Windows and it bombed horribly, probably due to some subtle syntactic differences between the platforms; I haven't begun troubleshooting yet. But this led me to wonder whether PowerShell is even the best solution. Can anyone recommend an alternative method?
Preferably something less proprietary with better cross-platform support. I recognize there are similar questions on Stack Overflow, but they address entire applications and CI/CD pipelines. I'm mostly referring to the underlying resource groups, security rules, etc. However, I will likely also leverage this script to deploy k8s, Couchbase, and so on, so perhaps an entire design change is in order.
I'm looking forward to your insight, friends.
I'm using PowerShell on Linux/Windows to deploy resources to Azure without much of a hassle. But for resource provisioning I'd go with ARM templates to automate deployments, as they can be deployed with almost anything, are fairly easy to understand when you scan through them, and are just JSON.
ARM templates can be plugged into Ansible (which I'm using currently) and some other tools (like VSTS, Terraform, etc.).
Also, Invoke-AzureRmResourceAction and New-/Get-/Remove-AzureRmResource are available on Linux and Windows and can be used to do pretty much anything in Azure (but they are a lot trickier to use than the native cmdlets).
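To give a feel for what "just a bunch of JSON" means, here is a minimal ARM template sketch that declares one storage account (the parameter name and SKU are illustrative):

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "storageAccountName": { "type": "string" }
  },
  "resources": [
    {
      "type": "Microsoft.Storage/storageAccounts",
      "name": "[parameters('storageAccountName')]",
      "apiVersion": "2016-01-01",
      "location": "[resourceGroup().location]",
      "sku": { "name": "Standard_LRS" },
      "kind": "Storage",
      "properties": {}
    }
  ]
}
```

A template like this can then be deployed from PowerShell on either platform with New-AzureRmResourceGroupDeployment (pointing -TemplateFile at the JSON file), or handed to Ansible, VSTS, etc.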
Related
Is there a way to specify the memory used by Firebase functions when running locally through the emulator? I know it can be done in Google Cloud (which I have done, and I can see my functions are working), but I'm not able to find anything in the documentation, and I suspect low memory is causing performance issues when running locally.
It seems this is not publicly documented, so we can assume it is not currently possible. You can file a feature request in the Public Issue Tracker. With every feature request, the engineering team gains more visibility into your needs and can prioritize work according to its impact on users. That is why it is important to look first for an existing request; I have searched and not found any, so I think the best way to proceed is to create a new one. Please include as many details as possible about how you would like this feature to work, and mention any workaround you found, so the community in the PIT can benefit from it.
I am from the QA team. My dev team has created pipelines in Azure Data Factory and wants me to test them. I need to write manual test cases now, and later I will also need to automate this. Please guide me on how and what to test with manual test cases, and suggest an automation tool for creating automated test cases at the later stage. Selenium?
You can take a look at this blog post, it really helped me when I started with testing in ADF: https://blogs.msdn.microsoft.com/karang/2018/11/18/azure-data-factory-v2-pipeline-functional-testing/
You won't be able to test everything in Data Factory. At most you can check that connection strings are correct, queries don't break, objects are present (in the database, blob storage, or whatever your data source is), etc. Testing whether the end result of a pipeline is what you intended is highly dependent on the use case, and most of the time it's not worth it.
I'm not an expert, but as far as I know, Selenium is used to automate browser-based testing. Here you won't need a complex framework; you can get away with a PowerShell script as described in the blog post, but you also have other options like Python, .NET, or the REST API.
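As a rough sketch of the PowerShell approach (the resource group, factory, and pipeline names are placeholders), the AzureRM Data Factory V2 cmdlets let a test trigger a run and then poll it until it finishes:

```powershell
# Trigger a pipeline run and wait for the outcome (all names are placeholders)
$runId = Invoke-AzureRmDataFactoryV2Pipeline `
    -ResourceGroupName "my-rg" `
    -DataFactoryName "my-adf" `
    -PipelineName "CopyPipeline"

do {
    Start-Sleep -Seconds 30
    $run = Get-AzureRmDataFactoryV2PipelineRun `
        -ResourceGroupName "my-rg" `
        -DataFactoryName "my-adf" `
        -PipelineRunId $runId
} while ($run.Status -eq "InProgress")

if ($run.Status -ne "Succeeded") {
    throw "Pipeline run $runId ended with status: $($run.Status)"
}
```

From there your test code can assert on the outputs (row counts, files landed in blob storage, etc.) rather than on the run status alone.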
Hope this helped!!
Our QA team just changes the settings to see the pipeline behavior, pushes abnormal data through the pipeline, tries different time zones and timestamps, etc. But the majority of the tests are on the final pipeline results.
I have used a SpecFlow project (https://specflow.org/) and supporting .NET code to set up the tests and execute the pipeline on test files held in the project. You can automate this in your build or release pipelines.
I want to learn Windows Azure to prepare for MCSD Web Development certification. Assuming that I have access to Visual Studio, VMWare, SQL Server etc., is it possible to develop and test Azure applications locally? I want to run Azure on my virtual machine without registering at Microsoft website, applying for any trial periods etc. Any suggestions?
TL;DR: You can learn a lot about Azure without an account. Probably enough to pass the exam, but maybe not enough to manage a production deployment.
You can learn a lot about how applications run inside of Azure using the emulators (express and full) that are included with the Azure tools for Visual Studio. Microsoft has several decent articles on getting started with the Azure tools. However, there is some tacit knowledge about actually using Azure -- things like how to navigate the management portals (or the fact that there currently are two portals) -- that can probably only be learned through actually using the infrastructure. Those kinds of questions may not be on the test, but the knowledge will definitely be helpful if you ever have to deal with Azure in a professional context. Start with the emulator, build some things that will run on Azure, and once you have a few samples, reconsider using a 30 day trial to actually run something in Azure and get a "feel" for the real thing.
As a side note, the Azure platform has evolved quite a bit over the last several years... if you find yourself reading an article from 2011 or '12, you may want to check again for newer information, as the recommended tools/APIs/etc may be deprecated or just plain gone in the newest SDKs.
The best way to understand Azure without an Azure account is to install Windows Azure Pack.
https://technet.microsoft.com/en-us/library/dn296435.aspx
Try Microsoft Virtual Academy. It's free, and if you set up a Microsoft account you can track your progress. They have a lot of courses on different Microsoft products, and I just searched and found a few for Azure.
The good thing I like about the courses is that they are presented by MVPs, MCTs, and Microsoft evangelists, so they know what they are talking about.
Deployment
I currently work for a company that deploys through GitHub. However, we have to log in to all 3 servers and update them manually with a shell script. When talking to the CTO, he made it very clear that auto-deployment is like voodoo to him, which is understandable. We have developers in 4 different countries working remotely. If someone were to accidentally push to the wrong branch, we could experience downtime, and with our service we cannot be down for more than 10 minutes. And with all of our developers in different time zones, our CTO wouldn't know until the next morning, and we'd have trouble meeting with the developers who caused the issue because of the vast time differences.
My Problem: Why I want auto-deploy
While working on my personal project I decided that it may be in my best interest to use auto-deployment; still, my project is mission-critical, and I'd like to mitigate downtime and human error as much as possible. The problem with manual deployment is that I simply cannot deploy to up to 20 servers via SSH in a reasonable amount of time. The problem compounds when I consider auto-scaling: I'd need to spin up a new server from an image and deploy to it.
My Stack
My service is built on the Node.js Express framework. These environments are rich in deployment and bootstrapping utilities. My project uses npm's package.json to uglify my scripts on deploy, and it runs my service as a daemon using forever-monitor. I'm also considering grunt.js to further bootstrap my environments for both production and testing.
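For context, the package.json side of that setup might look roughly like the fragment below (the script names, file paths, and task names are hypothetical, not from my actual project):

```json
{
  "scripts": {
    "build": "grunt uglify",
    "start": "forever start server.js",
    "stop": "forever stop server.js"
  }
}
```

The point is that whatever deployment method I pick only needs to run `npm install` and `npm run build && npm start` on each server; the per-project details stay in package.json.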
Deployment Methods
I've considered so far:
Auto-deploy with git, using webhooks
Deploying manually with git via shell
Deploying with npm via shell
Docker
I'm not well versed in technologies like Docker, but I'm interested, and I'd definitely give points to whoever gives me a good explanation of why I should or shouldn't use Docker, because I'm very interested in its use. Other methods are welcome.
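For a sense of scale, containerizing a Node.js service is mostly a matter of a small Dockerfile along these lines (the entry-point file name and port are assumptions):

```dockerfile
# Hypothetical Dockerfile for a Node.js/Express service
FROM node:lts
WORKDIR /app
# Copy the manifests first so the dependency layer is cached between builds
COPY package*.json ./
RUN npm install --production
COPY . .
EXPOSE 3000
CMD ["node", "server.js"]
```

The deployment argument for Docker is that every server then runs the exact same image: deploying to 20 servers (or a freshly auto-scaled one) becomes pulling and running one tested artifact, instead of re-running a bootstrap script per machine.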
My Problem: Why I fear auto-deploy
In a mission-critical environment downtime can put your business on hold, and to make matters worse there's a fleet of end users hitting the refresh button. If someone pushes something to the production branch that isn't passing the build and it's auto-deployed, then I'm looking at a very messy situation.
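One common mitigation is to make the webhook handler itself refuse anything that isn't a green build on the release branch. A minimal sketch of that gate (the branch name and payload fields are assumptions, modelled loosely on a GitHub-style push payload):

```javascript
// Decide whether an incoming webhook event should trigger a deploy.
// Only pushes to the production branch whose CI status is green qualify;
// everything else is ignored, so a push to the wrong branch is a no-op.
function shouldDeploy(event) {
  const isProductionBranch = event.ref === 'refs/heads/production';
  const ciPassed = event.ciStatus === 'success';
  return isProductionBranch && ciPassed;
}

module.exports = { shouldDeploy };
```

The deploy script only ever runs behind this check, so "auto" really means "auto, after the same gates a careful human would apply".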
I love the elegance of auto-deployment, but the risks make me skeptical. I'm very much in favor of making myself as productive as possible, so I'm looking for a way to deploy to many servers with ease, and in a very efficient manner.
The Answer I'm Looking For
Explain to me how to mitigate the risks of auto-deployment, or explain to me an alternative which is better suited to my project. Feel free to ask for any missing details in the comments.
No simple answer here. I offer a set of slides published by Mike Brittain from Etsy, a company that practices continuous deployment:
http://www.slideshare.net/mikebrittain/mbrittain-continuous-deploymentalm3public
Selected highlights:
Deploy frequently and in small batches
Use config/feature flags to control system behaviour and "dark release" major features
Code review all changes to the production branch
Invest in monitoring and improve the feedback loop
Manage "services" separately from the "application", and be mindful of run-time versions and backward-compatible changes.
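The "dark release" idea from the list above can be as simple as a flag check around the new code path, so the code is deployed continuously but only exercised when the flag is switched on. A minimal sketch (the flag and handler names are made up):

```javascript
// Minimal feature-flag helper: a flag is off unless explicitly set to true,
// so newly deployed code paths stay dark by default.
function isEnabled(flags, name) {
  return flags[name] === true;
}

function checkoutHandler(flags) {
  // The new checkout ships dark: it is deployed with every release,
  // but only taken once someone flips the 'newCheckout' flag on.
  return isEnabled(flags, 'newCheckout') ? 'new-checkout' : 'legacy-checkout';
}

module.exports = { isEnabled, checkoutHandler };
```

In practice the flags object would come from config or an admin endpoint, so a bad release can be turned off in seconds without redeploying anything.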
Hope this helps
I'm looking for examples that show Azure scales up well. Does anyone know of any large, high-volume sites that run on Azure?
This type of question seems out of scope. That said, try looking at some Windows Azure case studies; that is where you're likely to find information on deployment details. Typically, deployment details aren't revealed to the general public, and those of us who do know about such deployments are usually bound by NDAs.