IntelliJ IDEA Azure ARM template IntelliSense

I am new to Azure ARM templates... I would like to use IntelliJ IDEA, which has an Azure support plugin from Microsoft. I have the whole project (Java + Maven + Azure) in IDEA and would like to work with the related ARM templates in the same place. The ARM template documentation is huge... so any in-IDE highlighting/IntelliSense would be appreciated.
Problems:
When I open the ARM template as a regular JSON file (with its schema):
CTRL+SPACE shows possible values from the schema, but it is really slow (around 8 seconds) every single time, which makes it unusable. Is there no cache of any kind?
It does not offer all supported values; the latest "2019-04-01" is missing, even though the schema contains it: "$ref": "https://schema.management.azure.com/schemas/2019-04-01/Microsoft.Storage.json#/resourceDefinitions/storageAccounts"
CTRL+SPACE on location does show a list of possible locations. Unfortunately, it shows their display names instead of their IDs ("West Europe" vs. "westeurope"), so it creates an invalid template (see the snippet after this list).
When I open a deployment from Azure Explorer, it shows the ARM template and its parameters side by side in a split view. It is very fancy.
BUT... it does nothing for CTRL+SPACE.
And the split view keeps changing widths all the time )-:
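For reference, this is roughly the kind of resource I am editing: a minimal storage account with the newer apiVersion and the location ID rather than the display name (the account name is just a placeholder):

    {
      "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
      "contentVersion": "1.0.0.0",
      "resources": [
        {
          "type": "Microsoft.Storage/storageAccounts",
          "apiVersion": "2019-04-01",
          "name": "mystorageacct123",
          "location": "westeurope",
          "sku": { "name": "Standard_LRS" },
          "kind": "StorageV2"
        }
      ]
    }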
Questions:
How do you work with ARM templates?
Is it this slow for you too?
Is there a newer schema definition somewhere? (The MS docs say that https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json# is the latest.)
I understand that half of this goes to the IDEA team and half to the Azure plugin team... but it just seems to me that I am doing something wrong.

... for anybody interested in ARM templates in IntelliJ IDEA ...
IDEA is not currently (2020) ready for effective work with them, unfortunately. The Azure Toolkit for IntelliJ plugin needs more work... let's hope for future releases.
The free Visual Studio Code with the Azure Resource Manager extensions is smooth and fully integrated.

Related

Where can I find the code of an Azure Function?

I'm new to Azure Functions and have been thrown into a project without a proper introduction, and anybody I could ask is out of office. My simple, most likely stupid, question is: where can I find the actual code?
In the Azure portal, the functions are listed as "read-only" and only contain a function.json. The resource is an App Service and it has a couple of functions. There is no link to any git repository in its properties.
Read-only Functions have been compiled and published elsewhere (e.g. through Visual Studio or a CI/CD pipeline). The Azure Functions Portal engineers are working on a new, improved experience for this, but for now, if your Function app has a deployment source configured, you can view it from the Portal in two ways from your Function app:
Platform Features > Deployment Options.
Platform Features > Resource Explorer. In the file tree on the left-hand side, find your Function app's name, and under that, sourcecontrols. Click to expand it in the right-hand window.
If your Function app doesn't have a deployment source configured (e.g. your team has been publishing code manually), then things get harder. Depending on how your company has set up its source control and what you already have access to, the function.json you see might help: the entryPoint property in a build-generated function.json will give you the full assembly name of that function (e.g. VSSample.HelloSequence.Run). That, or the assembly name of the uploaded DLL in the scriptFile property, might help you locate the project.
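For illustration, a build-generated function.json for a precompiled function looks roughly like this (names and paths below are placeholders); scriptFile and entryPoint are the properties that point back to the compiled project:

    {
      "configurationSource": "attributes",
      "bindings": [
        {
          "type": "httpTrigger",
          "authLevel": "function",
          "methods": [ "get", "post" ],
          "name": "req"
        }
      ],
      "disabled": false,
      "scriptFile": "../bin/MyFunctionApp.dll",
      "entryPoint": "MyFunctionApp.MyFunction.Run"
    }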
Good luck! Keep posting if you have further questions; we're here to help.
AFAIK, if we create the Azure Function in Visual Studio and publish it to Azure (there may be other ways), it will end up in the situation you described, like the screenshot.
Actually, the code does exist in the portal, but it has been compiled; you can access it under Platform features -> Advanced tools (Kudu) -> Debug console.
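In the debug console, the compiled output typically sits under something like this (the exact layout depends on how the app was published; the names below are placeholders):

    D:\home\site\wwwroot\
        host.json
        MyFunction\function.json
        bin\MyFunctionApp.dll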
If you want the .cs files, I think you should ask your coworkers for them; they will not be in Azure.
You said that the Azure Function is listed as "read-only". Did you check in the Application Settings menu whether you can change the Function app edit mode to Read/Write?

TFS2017 Release Management Template Configuration tips

Good Afternoon,
I'm a little new to the Release Management department, and I've been tasked with converting our Release Management 2013 templates over to 2017. I've recently run into a roadblock that I've been unable to figure out how to get past, and I wanted to consult the experts for tips or suggestions.
The biggest issue is that Release Management 2017 doesn't offer the ability to set up environment tags, meaning I can't set up a "Production" environment, tag all of our production IIS servers in it, and run a single command against all of those servers. This hinders me greatly.
I've done some research on this issue and have come up with two possible workarounds: either 1) creating a release template for each IIS site, or 2) creating a release template for each environment (Test/Stage/Prod). The problem is that we manage over 100 different IIS sites and databases, so creating an individual template for each site would be... astronomical. Whereas creating one bulk template per environment could lead to issues down the line if we needed to release a specific site.
I figure that we are not the only company working with this many IIS sites, and that someone must have figured out a better solution than the two above. Do we need to look elsewhere, other than TFS 2017 (with its built-in RM features)?
I appreciate any and all advice on the issue.
The tasks you'd be using to target your servers (such as "PowerShell on Target Machines") take a list of servers to execute against. You can store the server names in a variable on the release definition at the environment level.
In TFS 2017 Update 1, you'll be able to store related variables in variable groups and share them across release definitions.

Azure deployment to stage ignores service configuration

I created a cloud service and tested it successfully locally. I added service configurations for stage and production. Here is a snippet of my staging configuration:
and here are my configuration settings:
Then, when I publish, I set up the deployment as follows:
All of this worked fine about two weeks ago. But now it deploys from VS, and when I look into the Azure Service Configure area, it looks like this:
I played around a bit with the "Update development ..." checkbox on the second screen, but the result is the same.
So it ignores all the settings I made and just won't transition my configuration to the one I named "CloudStage". My current Web PI tells me that I am using Windows Azure SDK for .NET (VS 2013) 2.3. I don't get it.
Edit
Some more things I observed:
No WADLogsTable or WADWindowsEventLogsTable is generated automatically in the staging storage.
I deactivated Remote Desktop because it was one of the changes I had made in order to monitor the event log (which wasn't useful here).
I manually changed the connection strings in the Azure Portal, but it seems as if the worker is totally unaware of that storage (rebooting it had no success).
Edit
I noticed another thing. Here you can see a running deployment of my service:
See the warning mark on the left? If I go to my Error List, this is shown:
This warning is pointless, since it is telling me that I did everything the right way: my *.Local.cscfg files are supposed to point to the local storage. So?!?
This seems weird. Please check the settings in your ServiceConfiguration.CloudStage.cscfg to verify that they contain the expected values.
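For example, the staging configuration should contain something along these lines (placeholder names and values), with connection strings that really point at the staging storage account rather than the emulator:

    <ServiceConfiguration serviceName="MyCloudService" osFamily="4" osVersion="*"
        xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration">
      <Role name="MyWorkerRole">
        <Instances count="1" />
        <ConfigurationSettings>
          <!-- Placeholder values: these must reference the staging account, not UseDevelopmentStorage=true -->
          <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString"
                   value="DefaultEndpointsProtocol=https;AccountName=mystagestorage;AccountKey=..." />
          <Setting name="StorageConnectionString"
                   value="DefaultEndpointsProtocol=https;AccountName=mystagestorage;AccountKey=..." />
        </ConfigurationSettings>
      </Role>
    </ServiceConfiguration>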
Have you tried updating some other property, like enabling Remote Desktop? Does that get updated on your deployment? You should select the "Deployment Update" check box in the publish dialog. Then, when deploying to an existing cloud service, it should ask you whether you want to replace it.
If you get the Object reference error every time you right-click the project, there might be some issue with the Azure SDK setup.
I'm a little bit further now. What I did was:
Deleted all services in Azure.
Deleted all storage accounts in Azure.
Removed the service project completely from the solution (not the library containing the worker logic).
Re-added the storage accounts in Azure.
Re-added the services in Azure.
Re-added a project to the solution and put the worker logic inside it.
Built up all the publishing stuff again.
Published it.
The first publish ended like the one described in my question. After I checked the "Update development..." option in the properties of my worker, it finally took my transitions into the stage!
Then I noticed that WADLogsTable was still empty. I right-clicked the instance in Server Explorer and chose "Update diagnostic settings...". There the option "Transfer period" was suddenly set to "None". This explained why my table was empty, and after I set it back to "1" my table started filling again!
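A side note: with the pre-2.5 SDKs the transfer period can also be pinned in code in the worker role's OnStart, so it does not depend on what was last set through Server Explorer. A rough sketch with placeholder names:

    using System;
    using Microsoft.WindowsAzure.Diagnostics;
    using Microsoft.WindowsAzure.ServiceRuntime;

    public class WorkerRole : RoleEntryPoint
    {
        public override bool OnStart()
        {
            // Sketch: configure diagnostics in code so the transfer period cannot
            // silently fall back to "None" after a redeployment.
            DiagnosticMonitorConfiguration config = DiagnosticMonitor.GetDefaultInitialConfiguration();
            config.Logs.ScheduledTransferPeriod = TimeSpan.FromMinutes(1);
            config.WindowsEventLog.DataSources.Add("Application!*");
            config.WindowsEventLog.ScheduledTransferPeriod = TimeSpan.FromMinutes(1);

            // The setting name refers to the storage connection string in the .cscfg.
            DiagnosticMonitor.Start("Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString", config);

            return base.OnStart();
        }
    }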
Another funny thing on the side: when I right-click my cloud project in the solution, I get "Object reference not set to an instance...". When I just left-click it and choose Build -> Publish, it works.
I just hope that I can help somebody with this. Let's see if it's stable now.
Edit: Yesterday it worked - today it's the same issue again :-(.
When you get "Object reference not set to an instance.." for a CloudService project you usually have some kind of mismatch. It could be that a setting in the ServiceConfiguration is not defined in the ServiceDefinition. It could also be that there is a publish profile defined in the .ccproj file for the CloudService that doesn't exist. This might also be what is causing your problems with the different configurations.
So it turns out that the problem is completely on the client side. My Visual Studio (now with SDK 2.4) is doing something wrong. I set up a fresh installation with all the required tooling :-( and there it works perfectly. I'll try to determine whether one of my extensions is causing the strange "Object reference not set..." bug.
A repair installation of VS does not solve the problem, by the way.

Difference in output on Azure

I've run into a little problem here. What I get in my local environment and my cloud result are different... I've tried using IntelliTrace, but every time I want to debug a track it gives me a No source available message.
There aren't any exceptions or anything like that, and everything loads perfectly fine... it just seems like the 4th case of the switch statement is screwed up. I'm using 4 const ints in a static Common.cs file to populate these 4 possibilities; I know I could be using an enum, but that shouldn't really matter, right?
If it helps, I am also using Telerik's RadChart control. In other words, these 4 options manipulate the data in 4 different ways. People have told me that there is no way to debug code hosted within Azure, and that I could probably use Azure Diagnostics and keep tracing every few lines or so...
Does anyone have any pointers on which direction I should go, or has anyone faced similar problems before? Many thanks... I am pretty much clueless here.
EDIT: The problem lay with the localization on Azure. On my local machine the date format is dd/MM/yyyy, whereas on Azure it is MM/dd/yyyy. Hence the problem arose...
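For anyone hitting the same thing, the usual fix is to stop relying on the machine's current culture when parsing or formatting dates. A minimal sketch (hypothetical values):

    using System;
    using System.Globalization;

    class DateCultureExample
    {
        static void Main()
        {
            string raw = "04/07/2012"; // ambiguous: 4 July or April 7?

            // Culture-dependent: parsed as dd/MM/yyyy on a European machine but as
            // MM/dd/yyyy on a US-localized Azure instance, so results differ.
            DateTime cultureDependent = DateTime.Parse(raw);

            // Culture-independent: pin the expected format so local and cloud agree.
            DateTime pinned = DateTime.ParseExact(raw, "dd/MM/yyyy", CultureInfo.InvariantCulture);

            Console.WriteLine(cultureDependent.ToString("yyyy-MM-dd"));
            Console.WriteLine(pinned.ToString("yyyy-MM-dd"));
        }
    }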
It seems to me you're using a web role. If that's the case, the quickest way to explore differences between the local deployment and the Azure deployment is to enable Web Deploy on your cloud project.
Once you've done that, use the Publish option on the web project (NOT on the cloud project) to quickly upload your code changes to Azure, and investigate with old-fashioned Response.Write calls.
Ugly, but quite efficient when you don't get what's happening.
Pierre

Best practices for applying changes to a SharePoint application

I feel like I need a better-defined framework for updating my SharePoint (MOSS 2007) application with custom code changes. I am creating WSP solution files with features and new types and such, but once those get tested and deployed, it feels like a bit of a leap of faith, and that makes me nervous and occasionally reluctant to deploy changes. After deployment, it's difficult to correlate the current state of the SharePoint application with the specific code that is deployed on that SharePoint server. Which features are actually installed, and on which sites? Which features are activated or deactivated? Which version of this custom field or content type is really there? Things like that. If an error crops up, I have to rely on my assumptions about what code is there and actually running, or I have to spend time digging through deployed assemblies and the 12 hive -- not impossible, but pretty unpleasant.
What steps should I take to improve my ability to unambiguously determine the state of the application and find the code that truly represents that state? Are there third-party tools that can help with this?
I feel your pain... the application development lifecycle with SharePoint 2007 leaves a bitter taste in my mouth.
To answer your question: we built our own deployment utility that does a few things for us.
Checks the state of key timer jobs (too many times we would do a deployment only to find one WFE that did not get the deployment).
Checks the state of key services on all our web front ends (again, we want to know the health of the farm before we start kicking off timer jobs).
Shows the file version and date of selected assemblies in the GAC (across all web front ends). We have seen problems before where assemblies did not get installed correctly across the farm.
Updates web.config settings based on a custom XML schema we provide. We ran into some problems with web.config updates, so we have thought about creating a utility to validate the web.config (specifically, to make sure there are no duplicate entries for specific keys).
Pushes content type updates (the first time content types are deployed via a feature it works great, but as soon as you need to update that content type it gets tough).
Checks the status of the WSP package after deployment or upgrade.
This utility uses the SharePoint API to do most of this work (a sketch of the kind of check is shown below); some of it is done by checking WMI events.
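As a rough sketch (not the actual utility), checking which WSPs are deployed, and to which servers, only takes a few lines against the object model:

    using System;
    using Microsoft.SharePoint.Administration;

    class SolutionStatusCheck
    {
        static void Main()
        {
            // Walk every solution package known to the farm and report whether it
            // is deployed and on which servers, as a quick post-deployment check.
            foreach (SPSolution solution in SPFarm.Local.Solutions)
            {
                Console.WriteLine("{0}: deployed={1}, pending job={2}",
                    solution.Name, solution.Deployed, solution.JobExists);

                foreach (SPServer server in solution.DeployedServers)
                {
                    Console.WriteLine("  -> {0}", server.Name);
                }
            }
        }
    }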
Unfortunately, the SharePoint development experience is lacking in this regard. As long as you are "namespacing" all features deployed using solution packages, you can use solution management in Central Admin to keep track of versions and of what gets deployed to which site collection.
Features are scoped at all levels, from the farm down to an individual web, so maintenance at that level is a little tough. I just try to organize all deployed code from the (top-down) solution level.
It gets even more complicated when deploying custom timer jobs, event handlers, etc. I really hope that the next version will address a lot of these common developer concerns.
Isn't the only way to have a planned/controlled deployment process and a version management system like TFS?
In the current project I am involved in, we have:
Continuous builds
Daily builds on a development server
When we release something to test, we merge the code to the main branch in the version management system (TFS)
When it is tested and ready for production, we merge the main branch to the release branch
Using this structured approach we always know what is deployed in which environment, and we can also track all changes based on the environment or on changes in requirements (which are also tracked in TFS).
