Windows Azure Table: C# API for Update/Merge?

Windows Azure Table has two distinct mechanisms for altering an existing entity: Update, which modifies properties in place, and Merge, which replaces the entire entity.
Which of these is used when you call TableServiceContext.UpdateObject()? (I'm guessing Update.) And is the other one exposed at all through this API?
(Apologies if this is right under my nose in the docs and I'm not seeing it.)

Actually, it's Merge that modifies properties in place, and Update that replaces the entire entity.
I believe the storage client library does a merge by default, but I think you can use SaveChangesOptions.ReplaceOnUpdate to modify this behavior.
An easy way to test/verify this is to run a debugging proxy like Fiddler and just see what happens over the wire.
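A minimal sketch against the v1 storage client (Microsoft.WindowsAzure.StorageClient on top of System.Data.Services.Client); the entity type, table name and keys are illustrative:

```csharp
using System;
using System.Data.Services.Client;
using System.Linq;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

public class MyEntity : TableServiceEntity
{
    public string SomeProperty { get; set; }
}

class Program
{
    static void Main()
    {
        CloudStorageAccount account = CloudStorageAccount.Parse("UseDevelopmentStorage=true");
        TableServiceContext context = account.CreateCloudTableClient().GetDataServiceContext();

        MyEntity entity = context.CreateQuery<MyEntity>("mytable")
            .Where(e => e.PartitionKey == "pk" && e.RowKey == "rk")
            .First();
        entity.SomeProperty = "new value";

        context.UpdateObject(entity);
        // Default behavior: HTTP MERGE - properties the local entity
        // doesn't carry are left untouched on the server.
        context.SaveChanges();

        context.UpdateObject(entity);
        // HTTP PUT - the stored entity is replaced wholesale; properties
        // missing from the local entity are removed on the server.
        context.SaveChanges(SaveChangesOptions.ReplaceOnUpdate);
    }
}
```

In Fiddler you would see a MERGE request on the wire for the first save and a PUT for the second.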

Related

How can I use one parameter file for templates that use some different parameters? (Azure LogicApp deployment using PowerShell in DevOps Pipelines)

So I am working on a project right now and I am facing an issue. In the company I work at, we use two different resource groups: one for demo and one for production. Until now we would manually copy every new Logic App from the demo account and change the parameters so that it uses the correct ones for production. We have around 80 logic apps as of now, and we separate them into groups. The objective is to make this much easier, with as little manual work as possible.
We are using them to sync SQL tables, CRM data and a lot of other stuff. So I have many logic apps that use different parameters: for example, one syncs from the Calendar to the SQL server, while another syncs two SQL tables where each table has to be accessed with a different user. What I want to do is have 6-7 parameter files, depending on the sync. But when the deployment sees that I have parameter values that aren't being used by a template, it fails with the following error, which makes it necessary to create a new parameter file for almost every new logic app:
Code=InvalidTemplate; Message=Deployment template validation failed: 'The template parameters 'sql_server......' in the parameters file are not valid; they are not present in the original template and can therefore not be provided at deployment time. The only supported parameters for this template are 'logicAppName, logicAppLocation........ sql-8_username, sql-8_password, sql-8_sqlConnectionString'. Please see https://aka.ms/arm-deploy/#parameter-file for usage details.'.
Is there a way to make these parameters optional so that each template uses the ones it needs? I googled around, but the main thing I found did not help much: https://learn.microsoft.com/en-us/azure/azure-resource-manager/templates/template-best-practices#parameters
Thanks a lot for any help you guys can provide!
You can make it optional to provide a value for a parameter by giving it a defaultValue; a minimal example follows the reference below. The template best-practices guidance adds:
Any defaultValue supplied for a parameter must be valid for all users in the default deployment configuration.
Do not provide default values for user names, passwords (or anything that requires a secureString) or anything that will increase the attack surface area of the application
Do not use empty strings as default values (use language expressions to facilitate the scenario)
Template expressions can be used to create default values
Reference: https://github.com/Azure/azure-quickstart-templates/blob/master/1-CONTRIBUTION-GUIDE/best-practices.md#parameters
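For illustration, a template whose location parameter is optional (the parameter names echo the error message above; the defaultValue expression is an assumption, not your actual value):

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "logicAppName": {
      "type": "string"
    },
    "logicAppLocation": {
      "type": "string",
      "defaultValue": "[resourceGroup().location]",
      "metadata": {
        "description": "Optional; falls back to the resource group's location when omitted."
      }
    }
  },
  "resources": []
}
```

With the defaultValue in place, logicAppLocation can simply be left out of the parameter file; logicAppName, which has no default, must still be supplied.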

The best way to publish a new version to Azure apps/services?

Say I have an Azure web app that calls an Azure API service. Now I need to update both applications to a newer version, in the most extreme case: the database is not backward compatible, and the API has breaking changes to existing method signatures, so old-version calls fail too. I use Visual Studio's publish profile to update directly. The problem I've been facing is that during the publish process, although it's only a few seconds, there are still active end users doing things on the web app and making API calls. I've personally seen the results in such situations: unstable, unpredictable, and the saved data might simply be corrupt.
So is there a better way to achieve some sort of 'flash update' that causes absolutely no side effects for end users? Thanks.
You should look at a different deployment strategy. First update the database (e.g. by accepting null values in new columns so the old version keeps working), then deploy the new API next to the current one and validate it. Switch the traffic from the current one to the new one, and do the same for the website. This is the blue-green deployment strategy; it requires some more effort but solves the downtime and the errors. https://www.martinfowler.com/bliki/BlueGreenDeployment.html
For the web app, you should use deployment slots: deploy your new version to a staging slot and, once you are ready, swap it into production. The swap itself takes effectively no time at all.
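A sketch of that swap with the Azure CLI (the resource group, app and slot names are illustrative):

```sh
# Deploy and warm up the new version in the 'staging' slot first, then
# swap it with the production slot; the swap is near-instant and reversible.
az webapp deployment slot swap \
  --resource-group my-rg \
  --name my-webapp \
  --slot staging \
  --target-slot production
```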
For the database, I believe you should freeze updates, take a backup and let the users work in read-only mode; once you finish all your DB migration and changes, point the application to the new database and that is it.

How to synchronize source code from one TFS to another TFS

We are maintaining code for one of our clients.
Initially, we copied all the source code that they have and added it to our TFS 2012.
We modify the code any time they need a bug fix and give the client deployment packages.
Now, the client wants all the latest code in their TFS 2012 as well.
Is there a way to update their source code with our changes? ...
preferably automatically (e.g. a PowerShell script), and preferably with the history of changes.
There are many approaches, each with some pros and cons. The following are the main options I would suggest.
Database backup and restore
This is the only path that guarantees full fidelity. It has some technical difficulties (e.g. SQL Server versions and editions) and political ones (how much information you care to expose, how much effort you want to put into sanitizing your data).
Project synchronization
There are some tools, most notably the TFS Integration Platform, that use the API to read and replay the changes from one system onto the other. This requires that the syncing tool can see both systems via HTTP(S).
It gives you the flexibility to sync only some data (say, source code but not work items).
Keep in mind that you will always lose something in the process: the changeset numbers will never match, and some user details will be lost.
Dumb dump
Give up conserving full history and be content with sharing the code.
This is the simplest to implement: get all the code, ship it, and check it into the other system. You can reference the release notes in the check-in comment.
Two simple scripts using tf.exe are all you need; a sketch follows.
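A rough sketch of those scripts (PowerShell with tf.exe on the PATH; the paths and workspace layout are assumptions, and this ships snapshots only, so the client gets no changeset history):

```powershell
# Assumes two local workspaces: C:\ws\ours mapped to our collection,
# C:\ws\client mapped to the client's collection.

# 1. Pull the latest version from our server.
tf get C:\ws\ours\Project /recursive /version:T

# 2. Mirror the files into the client-mapped workspace, excluding the
#    local-workspace metadata folder.
robocopy C:\ws\ours\Project C:\ws\client\Project /MIR /XD '$tf'

# 3. Promote new files (with a TFS 2012 local workspace, edits become
#    pending changes automatically; deletions may still need 'tf delete'),
#    then check everything in.
tf add C:\ws\client\Project\* /recursive /noprompt
tf checkin C:\ws\client\Project /recursive /noprompt /comment:"Sync from internal TFS - release X.Y"
```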
You can use the TFS Integration Tools to achieve the code migration (TFS-to-TFS). They move data between two different servers; the migration is done through the TFS APIs, and there are also some limitations (check the link above for more info).
For detailed steps, please see my answer to this question: Move Team Project to another Project Collection TFS 2013

Bootstrapping an application: are triggers a good idea?

I'm building an internal web application for components of building parts. I have a table of projects which is tied to some other tables. When a user creates a new project, I want to "bootstrap" the project with a default categorization schema, which the user can then modify for his/her project. So I need to copy from a default schema and tie the copy to the user's project.
I'm running NodeJS on the backend, AngularJS on the frontend and Postgres as the db. Where is the best place to put this logic? Either I use triggers in the db, where the trigger fires when a new row is inserted into the project table, or I do it with complicated queries in Node. Or is there some other way? Is there a best practice? A trigger is probably "easier", but I worry about the maintenance and testing of the app.
Since the issue that you have is related to the state of the database, you should solve it inside the database. There are basically two ways of solving this:
Revoke the insert privilege on the project table. Create a function new_project() that has parameters for all the required initial state of the project. Inside that function you create the schema, do the copying, set up privileges and populate the tables with the parameter values.
Revoke the insert privilege on the project table. Create a view that has all the required columns from all relevant tables to make up a valid initial project, and create an INSTEAD OF INSERT trigger on the view. In the trigger function you perform all the required steps as above.
Debugging code in PostgreSQL is not very advanced, but whether you place your code in PostgreSQL or on the application side, you will have the same issues. The advantage of PostgreSQL is that the bug - if any - is never far away from where your code operates.
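A minimal sketch of the first option (all table and column names are illustrative):

```sql
-- new_project() inserts the project row and copies the default
-- categorization into project-specific rows in a single transaction.
CREATE FUNCTION new_project(p_name text, p_owner_id integer)
RETURNS integer AS $$
DECLARE
    v_project_id integer;
BEGIN
    INSERT INTO project (name, owner_id)
    VALUES (p_name, p_owner_id)
    RETURNING id INTO v_project_id;

    -- Copy the default categorization schema for this project.
    INSERT INTO project_category (project_id, name, sort_order)
    SELECT v_project_id, name, sort_order
    FROM default_category;

    RETURN v_project_id;
END;
$$ LANGUAGE plpgsql;

-- From Node, creating a project then becomes a single call:
-- SELECT new_project('Office block A', 42);
```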

Difference in output on Azure

I've run into a little problem here. What I get in my local environment and what I get in the cloud are different... I've tried using IntelliTrace, but every time I want to debug a track it gives me a "No source available" message.
There aren't any exceptions or anything like that, and everything loads perfectly fine... it just seems like the 4th case of the switch statement is broken. I'm using 4 const ints in a static Common.cs file to populate these 4 possibilities; I know I could be using an enum, but it shouldn't really matter, right?
If this helps, I am also using Telerik's RadChart control. In other words, these 4 options manipulate the data in 4 different ways. People have told me that there is no way to debug code hosted within Azure, and that I could probably use Azure Diagnostics and keep tracing every few lines or so...
Does anyone have any pointers on which direction I should go, or has anyone faced similar problems before? Many thanks... I am pretty much clueless here.
EDIT: The problem lay with the localization on Azure. On my local machine the date format is dd/MM/yyyy, whereas on Azure it is MM/dd/yyyy. Hence the problem arose...
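For reference, a small sketch of the usual fix: parse and format dates with an explicit format and culture instead of relying on the machine's regional settings (the default culture on Azure instances is typically en-US; the input string here is illustrative):

```csharp
using System;
using System.Globalization;

class DateDemo
{
    static void Main()
    {
        string input = "25/07/2012"; // meant as dd/MM/yyyy

        // Culture-sensitive parsing: under en-US this throws (or, for
        // ambiguous dates like 05/07/2012, silently swaps day and month).
        // DateTime risky = DateTime.Parse(input);

        // Culture-independent parsing gives the same result everywhere.
        DateTime safe = DateTime.ParseExact(
            input, "dd/MM/yyyy", CultureInfo.InvariantCulture);

        Console.WriteLine(safe.ToString("yyyy-MM-dd")); // 2012-07-25
    }
}
```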
It seems to me that you're using a web role. If that's the case, the quickest way to explore differences between the local deployment and the Azure deployment is to enable Web Deploy on your cloud project.
Once you've done that, use the Publish option on the web project (NOT on the cloud project) to quickly upload your code changes to Azure, and explore with old-fashioned Response.Write calls.
Ugly, but quite efficient when you don't get what's happening.
Pierre
