Auto format Azure Bicep files

How do I auto-format Azure Bicep files so there is a consistent way of writing/formatting code? For example, from:
param instLocation array = [
'euw'
]
to:
param instLocation array = ['euw']

To auto-format an Azure Bicep file, you can:
1. Format via the VS Code UI: View -> Command Palette..., then type "Format Document".
2. Use the CLI command az bicep format, which will ship with the next release: https://github.com/Azure/bicep/pull/8580
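Assuming the command ships as described in the linked PR, running az bicep format against a file would apply changes like the one above; a sketch of the before/after:

```bicep
// Before formatting
param instLocation array = [
  'euw'
]

// After formatting: short single-element arrays are collapsed onto one line
param instLocation array = ['euw']
```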

Related

Terraform script execution, on the basis of input parameter values

I have created a Terraform script for a Google storage bucket.
I have added multiple optional parameters to the script, for which I provide values from a tfvars file.
Is it possible that if I provide only a few specific required values in tfvars, the script will execute based on those input values and ignore the rest of the properties that are not provided?
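One common pattern for this (a sketch with hypothetical variable names, not a definitive answer) is to default optional variables to null; Terraform 0.12+ treats an argument set to null the same as omitting it, so the provider falls back to its own default:

```hcl
variable "bucket_name" {
  type = string            # required: must come from tfvars
}

variable "storage_class" {
  type    = string
  default = null           # optional: leave out of tfvars to leave it unset
}

resource "google_storage_bucket" "example" {
  name          = var.bucket_name
  location      = "EU"
  storage_class = var.storage_class   # null is treated as "not provided"
}
```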

Azure Data Factory to create an empty csv file

I have a requirement where I want to check whether a file exists in a folder. If the file exists I want to skip it; otherwise I want to create an empty CSV file.
One approach I have tried: using a Get Metadata activity and then a Copy activity, I am able to move an empty file to the destination.
I want to check whether there is a better way of creating an empty CSV file on the fly, without using a Copy activity.
Yes, there is another approach, but it involves coding.
You can create an Azure Function in your preferred language and trigger it from Azure Data Factory using the Azure Function activity.
The Azure Function activity allows you to run Azure Functions in an Azure Data Factory or Synapse pipeline. To run an Azure Function, you must create a linked service connection. Then you can use the linked service with an activity that specifies the Azure Function that you plan to execute.
In the Azure Function, you can access the directory where you want to check for the files and can also create/delete/update the CSV files, schema included.
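A minimal local sketch of that check-and-create logic (in a real Function you would work against the storage account with the azure-storage-blob SDK, e.g. BlobClient.exists() and upload_blob(); the path and header names below are hypothetical):

```python
import csv
from pathlib import Path

def ensure_empty_csv(path, headers):
    """Create a CSV containing only a header row, unless the file already exists."""
    p = Path(path)
    if p.exists():
        return False  # file is already there -> skip, as the Metadata check would
    with p.open("w", newline="") as f:
        csv.writer(f).writerow(headers)  # header row only, no data rows
    return True
```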
You can use a Copy activity.
Source: write an in-line SQL query that includes all the columns as NULL and a WHERE clause that returns no rows. Note that no tables are involved, and you can issue the query against any SQL source, including Serverless.
Target: the CSV file, with headers included.
Use an If activity to create the file only when the Metadata activity shows the file is absent.
Example query:
SELECT
    CAST(NULL AS DATE) AS CreatedDate,
    CAST(NULL AS INT) AS EmployeeId,
    CAST(NULL AS VARCHAR(20)) AS FirstName,
    CAST(NULL AS VARCHAR(50)) AS LastName,
    CAST(NULL AS CHAR(20)) AS PostalCode
WHERE
    1 = 2;

Get blob contents from last modified folder in Azure container via Azure logic apps

I have an Azure logic app that's getting blob contents from my Azure storage account on a regular basis. However, my blobs are getting stored in sub-directories.
Eg. MyContainer > Invoice > 20200101 > Invoice1.csv
Every month the third-level sub-directory, currently '20200101', will change to '20200201', '20200301', and so forth.
I need my Logic app to return the blob contents of the latest folder that gets created in my container.
Any advice regarding this?
Thanks!!
For this requirement, please refer to my logic app below:
1. List all of the folders under /mycontainer/Invoice/ with a "List blobs" action.
2. Initialize two variables of type Integer, one named maxNum and the other named numberFormatOfName.
3. Use a "For each" to loop over the value from the "List blobs" above. In the "For each" loop, first set numberFormatOfName with the expression int(replace(items('For_each')?['Name'], '/', '')). Then add an "If" condition to check whether numberFormatOfName is greater than maxNum; if true, set maxNum to numberFormatOfName.
4. After the "For each" loop, use another "List blobs" to list all of the blobs in the latest (max number) folder; the folder path expression is string(variables('maxNum')).
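The max-folder selection in steps 2-4 boils down to a numeric max over the folder names; the same logic sketched in Python (folder names are examples):

```python
def latest_folder(folder_names):
    """Pick the folder whose yyyymmdd-style name is numerically largest."""
    # "List blobs" returns names like "20200101/"; strip the slash, compare as integers
    return str(max(int(name.replace("/", "")) for name in folder_names))
```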
If you do not want to list the blobs but want to get the blob content instead, you can get the content of the blob in the latest folder in the same way.
Update:
Running the logic app, I get the expected result. I created three folders, 20200101, 20200202, and 20200303, under /mycontainer/Invoice in my blob storage; the contents of the three CSV files are 111,111, 222,222, and 333,333 respectively. The logic app responds with the third file's content, 333,333.

In Azure powershell set output to table format instead of json

The default JSON output is hard to read at times; you need to scroll back and decipher it.
How do I change the default to --output table for all az commands?
Using the Azure CLI, type the following to change the default output to table:
az configure
It will prompt you with the following question; select 3:
Do you wish to change your settings? (y/N): y
What default output format would you like?
[1] json - JSON formatted output that most closely matches API responses.
[2] jsonc - Colored JSON formatted output that most closely matches API responses.
[3] table - Human-readable output format.
..
Please enter a choice [Default choice(1)]: 3
This is for users who don't like the default JSON format.
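Under the hood, az configure simply persists this choice; equivalently (a sketch, the path may vary by platform) you can set it directly in ~/.azure/config:

```ini
[core]
output = table
```

Newer CLI versions also accept az config set core.output=table to write the same setting.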
Since you've given a solution on how to configure default table output for Azure CLI, this question should also have a solution for Azure PowerShell.
The easiest solution I can think of is just piping the output to Format-Table e.g. Get-AzResourceGroup | Format-Table -AutoSize.
For more information, have a look at Format Azure PowerShell cmdlet output.

Add Dynamic Content - Azure Data Factory ADF V2

I need to add dynamic content in ADF such that it reads files from the folder whose name is in 'StartDateOfMonth-EndDateOfMonth' format, e.g.:
Result: 20190601-20190630
Here are a few steps for how you can achieve that:
In the Dataset:
create a parameter "Date"
set up a connection to one selected file
now, replace the "File" field with an expression similar to the following:
@concat('filedata_', dataset().Date, '.csv')
In the Pipeline:
when using the above Dataset, you just need to pass the value, which you can set with a 'Set Variable' activity
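As for building the 'StartDateOfMonth-EndDateOfMonth' value itself, it is just the first and last day of the month; a sketch of that logic in Python (in an ADF expression you would assemble the same string with date functions such as formatDateTime and startOfMonth; verify those names against the expression-language docs):

```python
import calendar
from datetime import date

def month_folder(d):
    """Return 'yyyymmdd-yyyymmdd' for the first and last day of d's month."""
    last_day = calendar.monthrange(d.year, d.month)[1]  # number of days in the month
    return f"{date(d.year, d.month, 1):%Y%m%d}-{date(d.year, d.month, last_day):%Y%m%d}"
```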
