Generate release notes and send via email - Azure

I've created a release pipeline which generates release notes and later publishes them to the Wiki.
The only issue I am having is that the Wiki page cannot be updated; a new Wiki page/sub-page needs to be created manually each time. Because of this, I was wondering if there's any way I can send my release notes to an email address instead?
My PowerShell:
$content = [IO.File]::ReadAllText("$(System.DefaultWorkingDirectory)\releasenotes.md")
$data = @{content=$content} | ConvertTo-Json
$connectionToken = '[token]'
$base64AuthInfo = [System.Convert]::ToBase64String([System.Text.Encoding]::ASCII.GetBytes(":$($connectionToken)"))
$params = @{
    Uri         = '$(WikiPath)'
    Method      = 'PUT'
    Headers     = @{ Authorization = "Basic $base64AuthInfo" }
    ContentType = "application/json"
    Body        = $data
}
Invoke-WebRequest @params
Any guidance will be appreciated; I'm quite new to all this.

This might depend on whether you're using an on-premises agent or a hosted agent, as you will need access to an SMTP server from the agent your job is running on. You can then simply use PowerShell to send your email:
Send-MailMessage -From $From -To $To -Subject $Subject -Body $Body -BodyAsHTML -SmtpServer $SmtpServer -Priority High -Encoding UTF8
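For example, a minimal sketch that reads the generated releasenotes.md from the working directory and mails it as the message body (the SMTP server and addresses below are placeholders you would swap for your own; the notes are Markdown, so they will arrive as plain text unless you convert them to HTML first):
# Read the release notes generated earlier in the pipeline
$Body = [IO.File]::ReadAllText("$(System.DefaultWorkingDirectory)\releasenotes.md")
# Placeholder values - replace with your own SMTP server and addresses
$SmtpServer = "smtp.example.com"
$From = "build@example.com"
$To = "releases@example.com"
$Subject = "Release notes $(Build.BuildNumber)"
Send-MailMessage -From $From -To $To -Subject $Subject -Body $Body -SmtpServer $SmtpServer -Priority High -Encoding UTF8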
Wes

Related

Azure Devops Run Result Steps and Run Summary Details Attachments API

I am trying to fetch the attachments of the Test Run Result steps and Test Run Result Summary Details, but I am unable to find any API related to that.
I am attaching the image below: Rectangle 1 is the attachments for the Test Run Result Steps and Rectangle 2 is the attachments for the Test Run Result Summary Details.
If anyone has any knowledge about these particular APIs, please let me know.
I have checked the Azure API documentation but couldn't find the specific API; if I have missed something please let me know.
Thanks
By calling the Get Test Result Attachments REST API, we can get all the IDs of the attachments:
GET https://dev.azure.com/{organization}/{project}/_apis/test/Runs/{runId}/Results/{testCaseResultId}/attachments?api-version=6.0-preview.1
After that, if you want to get the attachments you can call Attachments - Get Test Result Attachment Zip REST API with the specific Attachment ID.
GET https://dev.azure.com/{organization}/{project}/_apis/test/Runs/{runId}/Results/{testCaseResultId}/attachments/{attachmentId}?api-version=6.0-preview.1
Please note that the REST API Attachments - Get Test Result Attachment Zip will display the content of the attachments instead of downloading them directly. If you want to download the attachments, you can write a script to save them to a local directory. The following PowerShell script is for your reference:
$AttachmentsOutfile = "D:\Test\HellWorld.java"
$connectionToken = "Your PAT Here"
$base64AuthInfo = [System.Convert]::ToBase64String([System.Text.Encoding]::ASCII.GetBytes(":$($connectionToken)"))
$AuditLogURL = "https://dev.azure.com/{organization}/{project}/_apis/test/Runs/{runId}/Results/{testCaseResultId}/attachments/{attachmentId}?api-version=6.0-preview.1"
$AuditInfo = Invoke-RestMethod -Uri $AuditLogURL -Headers @{authorization = "Basic $base64AuthInfo"} -Method Get -OutFile $AttachmentsOutfile
UPDATE:
However, the Get Test Result Attachments REST API can only get the attachments attached from the test run UI (attached by clicking the Add attachment button).
To get the attachments of the Test Run Result steps and Test Run Result Summary, we can call the Results - Get REST API with the parameter detailsToInclude=iterations added:
GET https://dev.azure.com/{organization}/{project}/_apis/test/Runs/{runId}/results/{testCaseResultId}?detailsToInclude=iterations&api-version=6.0
After that we can download the attachments by their IDs. The following PowerShell script, for your reference, downloads them in a loop:
Param(
    [string]$orgurl = "https://dev.azure.com/{org}",
    [string]$project = "Test0924",
    [string]$downloadlocation = "C:\temp\1025\",
    [string]$TestRunId = "1000294",
    [string]$ResultId = "100000",
    [string]$user = "",
    [string]$token = "PAT"
)
# Base64-encodes the Personal Access Token (PAT) appropriately
$base64AuthInfo = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $user,$token)))
# List test result and test step attachments:
$testresultUrl = "$orgurl/$project/_apis/test/Runs/$TestRunId/Results/$($ResultId)?detailsToInclude=iterations&api-version=6.0"
$attachments = (Invoke-RestMethod -Uri $testresultUrl -Method Get -Headers @{Authorization=("Basic {0}" -f $base64AuthInfo)}).iterationDetails.attachments
ForEach ($attachment in $attachments) {
    # Get test result and step attachments:
    $attachmentid = $attachment.id
    $attachmentname = $attachment.name
    $attachmenturl = "$orgurl/$project/_apis/test/Runs/$TestRunId/Results/$ResultId/attachments/$($attachmentid)?api-version=6.0"
    Invoke-RestMethod -Uri $attachmenturl -Method Get -Headers @{Authorization=("Basic {0}" -f $base64AuthInfo)} -OutFile $downloadlocation\$attachmentname
}
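If you save the script above as, say, Download-TestAttachments.ps1 (a hypothetical file name, just for illustration), you can run it with your own values like this:
.\Download-TestAttachments.ps1 -orgurl "https://dev.azure.com/{org}" -project "Test0924" `
    -downloadlocation "C:\temp\1025\" -TestRunId "1000294" -ResultId "100000" -token "PAT"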

Azure Cognitive Services Read Request with PowerShell

I'm trying to use the Azure Cognitive Services API with Powershell to 'read' the contents of a .jpg in a blob store. Everything I am trying to do works perfectly using the Azure API demo/test page, so I am fairly sure that this, to a degree, proves that some elements I'm using in my code are valid. Well, at least they are when using the API testing tool.
Here is my PowerShell:
Clear-Host
$myUri = "<ENDPOINT value from the 'Keys and Endpoint' blade>/vision/v3.0/read/analyze?language=en"
$imagePath = "<path to image in blob, accessible online and anonymously>"
$subKey = "<KEY #1 from the 'Keys and Endpoint' blade>"
$headersHash = @{}
$headersHash.Add( "Host", "westeurope.api.cognitive.microsoft.com" )
$headersHash.Add( "Ocp-Apim-Subscription-Key", $subKey )
$headersHash.Add( "Content-Type","application/json" )
$bodyHash = @{ "url" = $imagePath }
Out-Host -InputObject "Sending request:"
$response = Invoke-WebRequest -Uri $myUri `
    -Method Post `
    -Headers $headersHash `
    -Body $bodyHash `
    -Verbose
"Response: $response"
When I send that, all I ever get is a:
Invoke-WebRequest : The remote server returned an error: (400) Bad Request.
At C:\scratch\testy.ps1:15 char:13
+ $response = Invoke-WebRequest -uri $myUri `
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidOperation: (System.Net.HttpWebRequest:HttpWebRequest) [Invoke-W
ebRequest], WebException
+ FullyQualifiedErrorId :
WebCmdletWebResponseException,Microsoft.PowerShell.Commands.InvokeWebRequestCommand
I must be missing something basic, but I cannot see what. There are no published examples of using PowerShell to access the Cognitive Services APIs that I can find, but there is an example with Python (requests) and I'm pretty sure I'm emulating and representing what goes into the Python example correctly. But then again, it's not working, so something isn't right.
Strangely, when I try to recreate this in Postman, I get a 202 but no response body, so it's not possible for me to view or extract the apim-request-id in order to build the next request to retrieve the results.
Found the issue. The problem was made clear when I wrapped the call in a try/catch block and put this in the catch block:
$streamReader = [System.IO.StreamReader]::new($_.Exception.Response.GetResponseStream())
$ErrResp = $streamReader.ReadToEnd() | ConvertFrom-Json
$streamReader.Close()
I was then able to look at the contents of the $ErrResp variable, and there was a fragment of a string which said "unable to download target image.." or something to that effect. Odd, because I could use the URL I was supplying to instantly connect to and get the image, so it had to be the way the URL was being injected into the body.
It was.
When using a hashtable as the body, where your Content-Type is 'application/json', all you need to do, it seems, is run ConvertTo-Json on your hash first. This worked and I instantly got my 202 and the pointer to where to collect my results.
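For anyone following along, here is a minimal sketch of the corrected request, assuming the same endpoint/key/image placeholders as above. The polling loop at the end follows the documented Read v3.0 flow (the 202 response carries an Operation-Location header pointing at the analyze results), so treat the exact property names as illustrative:
# Same placeholders as in the question
$myUri = "<ENDPOINT>/vision/v3.0/read/analyze?language=en"
$subKey = "<KEY #1>"
$imagePath = "<publicly accessible image URL>"
$headersHash = @{ "Ocp-Apim-Subscription-Key" = $subKey; "Content-Type" = "application/json" }
# The fix: serialize the hashtable to JSON before sending
$bodyJson = @{ "url" = $imagePath } | ConvertTo-Json
$response = Invoke-WebRequest -Uri $myUri -Method Post -Headers $headersHash -Body $bodyJson
# The 202 response includes an Operation-Location header; poll it until the analysis finishes
$resultUri = $response.Headers["Operation-Location"]
do {
    Start-Sleep -Seconds 1
    $result = Invoke-RestMethod -Uri $resultUri -Headers @{ "Ocp-Apim-Subscription-Key" = $subKey }
} while ($result.status -in "notStarted", "running")
$result.analyzeResult.readResults.lines.text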
Hopefully this will help someone, somewhere, sometime.

How to use Azure App Configuration REST API

Trying to figure out how to use Azure AppConfiguration REST API (mostly to retrieve and create key-values). So far I found two sources of information: Configuration Stores REST API docs and this GitHub repo Azure App Configuration.
How do these two sources correspond with each other? They apparently describe different AppConfig REST APIs.
I managed to retrieve values from my AppConfig store using this type of URI and AAD authorization:
https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.AppConfiguration/configurationStores/{configStoreName}/listKeyValue?api-version=2019-10-01
But it only allows getting the value of one particular key.
The other approach uses a URI based on the AppConfig endpoint {StoreName}.azconfig.io/kv/... and must have more flexible ways to retrieve data. But I can't make it work. I tried to follow the instructions, and I tried to make a request to this URI using an AAD token as I did for the first type of API. In both cases I get a 401 auth error.
Could anyone share some detailed working examples (Powershell, Postman)? Any help would be appreciated.
https://management.azure.com/ is the Azure Resource Management API, while the azconfig.io one is App Configuration's own API.
I think you should use App Configuration's own API. The same Azure AD token will not work for this API, however. You need to request another access token with resource=https://yourstorename.azconfig.io or scope=https://yourstorename.azconfig.io/.default, depending on whether you use the v1 or v2 token endpoint of Azure AD.
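For illustration only, a client-credentials token request against the v2.0 endpoint could look like the sketch below. The tenant ID, client ID, secret and store name are placeholders, and it assumes your service principal has been granted a data-plane role (e.g. App Configuration Data Reader) on the store:
$tenantId = "<tenant id>"
$clientId = "<app registration client id>"
$clientSecret = "<client secret>"
$store = "https://yourstorename.azconfig.io"
# v2.0 token endpoint; the scope is the store endpoint plus /.default
$tokenResponse = Invoke-RestMethod -Method Post `
    -Uri "https://login.microsoftonline.com/$tenantId/oauth2/v2.0/token" `
    -Body @{
        grant_type    = "client_credentials"
        client_id     = $clientId
        client_secret = $clientSecret
        scope         = "$store/.default"
    }
# Call the App Configuration data plane with the bearer token
Invoke-RestMethod -Uri "$store/kv?api-version=1.0" `
    -Headers @{ Authorization = "Bearer $($tokenResponse.access_token)" }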
Alternatively, use the $headers produced by the script below (HMAC authentication with an access key) to authenticate your API calls:
function Sign-Request(
[string] $hostname,
[string] $method, # GET, PUT, POST, DELETE
[string] $url, # path+query
[string] $body, # request body
[string] $credential, # access key id
[string] $secret # access key value (base64 encoded)
)
{
$verb = $method.ToUpperInvariant()
$utcNow = (Get-Date).ToUniversalTime().ToString("R", [Globalization.DateTimeFormatInfo]::InvariantInfo)
$contentHash = Compute-SHA256Hash $body
$signedHeaders = "x-ms-date;host;x-ms-content-sha256"; # Semicolon separated header names
$stringToSign = $verb + "`n" +
$url + "`n" +
$utcNow + ";" + $hostname + ";" + $contentHash # Semicolon separated signedHeaders values
$signature = Compute-HMACSHA256Hash $secret $stringToSign
# Return request headers
return @{
    "x-ms-date" = $utcNow;
    "x-ms-content-sha256" = $contentHash;
    "Authorization" = "HMAC-SHA256 Credential=" + $credential + "&SignedHeaders=" + $signedHeaders + "&Signature=" + $signature
}
}
function Compute-SHA256Hash(
[string] $content
)
{
$sha256 = [System.Security.Cryptography.SHA256]::Create()
try {
return [Convert]::ToBase64String($sha256.ComputeHash([Text.Encoding]::ASCII.GetBytes($content)))
}
finally {
$sha256.Dispose()
}
}
function Compute-HMACSHA256Hash(
[string] $secret, # base64 encoded
[string] $content
)
{
$hmac = [System.Security.Cryptography.HMACSHA256]::new([Convert]::FromBase64String($secret))
try {
return [Convert]::ToBase64String($hmac.ComputeHash([Text.Encoding]::ASCII.GetBytes($content)))
}
finally {
$hmac.Dispose()
}
}
# Stop if any error occurs
$ErrorActionPreference = "Stop"
$uri = [System.Uri]::new("https://{myconfig}.azconfig.io/kv?api-version=1.0")
$method = "GET"
$body = $null
$credential = "<Credential>"
$secret = "<Secret>"
$headers = Sign-Request $uri.Authority $method $uri.PathAndQuery $body $credential $secret
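To actually issue the request with those signed headers (not part of the original snippet, just the obvious next step; the /kv response is a JSON document whose items property lists the key-values):
$result = Invoke-RestMethod -Uri $uri -Method $method -Headers $headers
$result.items | Select-Object key, value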
Source: https://github.com/Azure/AppConfiguration/blob/master/docs/REST/authentication/hmac.md#JavaScript

How to configure Webhook activity for runbooks execution in Azure Data Factory v2?

I am working on running runbooks (PowerShell and graphical) from ADF. One of the ways I found to accomplish this task is to use webhooks. I will have runbooks running in parallel and in series (if a dependency exists on a previous runbook).
Overall,
If a flat file is dropped in Azure Blob storage, it triggers the pipeline that contains the respective runbook(s). This part is working.
The webhooks of the runbook(s) are used in the ADF Webhook activity. This is where I am facing the problem: I am unsure about what should be in the body of the Webhook activity.
After some research I found something about a callback URI that needs to be added (or somehow generated) in the body of the webhook. How can I get this callback URI? If I don't add a proper callback URI, the activity runs until timeout. I believe the Webhook activity should complete when the runbook it's running finishes successfully, so we can move on to the next Webhook activity in the pipeline. I have tried the Web activity as well, but it's the same issue.
The body I am using right now is just the JSON below.
{"body":{"myMessage":"Sample"}}
I have referenced:
https://vanishedgradient.com/2019/04/25/webhooks-with-azure-data-factory/
https://mrpaulandrew.com/2019/06/18/azure-data-factory-web-hook-vs-web-activity/
https://social.msdn.microsoft.com/Forums/en-US/2effcefb-e65b-4d5c-8b01-138c95126b79/in-azure-data-factory-v2-how-to-process-azure-analysis-service-cube?forum=AzureDataFactory
Thanks for the links, they are useful sources. I've managed to get this working for a pipeline that calls a runbook to resize Azure Analysis Services. Having the runbook return failure and success information was not well documented.
Here's some code to assist a little, which I've taken from several places, but a lot from the open issue (https://github.com/MicrosoftDocs/azure-docs/issues/43897) on this Microsoft page: https://learn.microsoft.com/en-us/azure/data-factory/control-flow-webhook-activity
The Data Factory Webhook activity passes in some headers: SourceHost, which is @pipeline().DataFactory, and SourceProcess, which is @pipeline().Pipeline. This is so we can do some checking to confirm that the runbook is being run by acceptable processes.
The Body of the call then contains the other variables we required:
@json(concat('{"AnalysisServer":"', pipeline().parameters.AASName, '", "MinimumSKU":"', pipeline().parameters.SKU,'"}') )
Your runbook needs the WebhookData parameter:
param
(
[Parameter (Mandatory=$false)]
[object] $WebhookData
)
You can then grab all the bits you need, including checking if a callbackuri was provided:
if ($WebhookData)
{
    # Split apart the WebhookData
    $WebhookName = $WebhookData.WebhookName
    $WebhookHeaders = $WebhookData.RequestHeader
    $WebhookBody = $WebhookData.RequestBody | ConvertFrom-Json
    $WebhookADF = $WebhookHeaders.SourceHost
    $WebhookPipeline = $WebhookHeaders.SourceProcess
    Write-Output -InputObject ('Runbook started through webhook {0} called by {1} on {2}.' -f $WebhookName, $WebhookPipeline, $WebhookADF)
    # if there's a callBackURI then we've been called by something that is waiting for a response
    If ($WebhookBody.callBackUri)
    {
        $WebhookCallbackURI = $WebhookBody.callBackUri
    }
    ...
}
The variable $WebHookHeaders contains: @{Connection=Keep-Alive; Expect=100-continue; Host=sXXevents.azure-automation.net; SourceHost=**MYDATAFACTORYNAME**; SourceProcess=**MYPIPELINENAME**; x-ms-request-id=**UNIQUEIDENTIFIER**}
You can then grab information out of your json body: $AzureAnalysisServerName = $WebHookBody.AnalysisServer
Passing an error/failure back from your runbook is relatively easy. Note that I put my success/update message into $Outputmessage and only have content in $ErrorMessage if there's been an error:
$ErrorMessage = "Failed to do stuff I wanted"
if ($ErrorMessage)
{
    $Output = [ordered]@{
        output = @{
            AzureAnalysisServerResize = "Failed"
        }
        error = @{
            ErrorCode = "ResizeError"
            Message = $ErrorMessage
        }
        statusCode = "500"
    }
} else {
    $Output = [ordered]@{
        output = @{
            "AzureAnalysisServerResize" = "Success"
            "message" = $Outputmessage
        }
        statusCode = "200"
    }
}
$OutputJson = $Output | ConvertTo-Json -Depth 10
# if we have a callbackuri let the ADF Webhook activity know that the script is complete
# Otherwise it waits until its timeout
If ($WebhookCallBackURI)
{
    $WebhookCallbackHeaders = @{
        "Content-Type" = "application/json"
    }
    Invoke-WebRequest -UseBasicParsing -Uri $WebhookCallBackURI -Method Post -Body $OutputJson -Header $WebhookCallbackHeaders
}
I then end the if ($WebhookData) { block with an else to say the runbook shouldn't be running if it was not called from a webhook:
} else {
Write-Error -Message 'Runbook was not started from Webhook' -ErrorAction stop
}
Passing back an error message was quite easy; passing back a success message has been traumatic, but the above seems to work, and in my Data Factory pipeline I can access the results.
Output
{
"message": "Analysis Server MYSERVERNAME which is SKU XX is already at or above required SKU XX.",
"AzureAnalysisServerResize": "Success"
}
Note that with the Invoke-WebRequest, some examples online don't specify -UseBasicParsing but we had to as the runbook complained: Invoke-WebRequest : The response content cannot be parsed because the Internet Explorer engine is not available, or Internet Explorer's first-launch configuration is not complete.
I'm not sure if this is best practice, but I have something that is working in a PowerShell Workflow runbook.
If the runbook has a webhook defined, then you use the WebhookData parameter. Your request body needs to be in JSON format and the $WebhookData param picks it up. For example, suppose the Body in your Webhook activity looks like this:
{"MyParam":1, "MyOtherParam":"Hello"}
In your runbook you pick up the parameters this way:
Param([object]$WebhookData)
if($WebhookData){
$parameters=(ConvertFrom-Json -InputObject $WebhookData.RequestBody)
if($parameters.MyParam) {$ParamOne = $parameters.MyParam}
if($parameters.MyOtherParam) {$ParamTwo = $parameters.MyOtherParam}
}
The variables in your runbook $ParamOne and $ParamTwo are populated from the parsed JSON Body string. The data factory automatically appends the callBackUri to the Body string. You don't need to create it.
You have to use the $WebhookData name. It's a defined property.
I hope this helps.
Apologies for the delay; I found the complete solution a few months back. Thanks to Nick and Sara for adding the pieces. I used similar code as the return code. We were using graphical runbooks with limited changes allowed, so I just added return code (PowerShell) at the end of the runbook to have little to no impact. I plugged in the code below:
if ($WebhookData)
{
    Write-Output $WebhookData
    $parameters = (ConvertFrom-Json -InputObject $WebhookData.RequestBody)
    if ($parameters.callBackUri)
    {
        $callbackuri = $parameters.callBackUri
    }
}
if ($callbackuri)
{
    Invoke-WebRequest -Uri $callbackuri -UseBasicParsing -Method POST
}
Write-Output $callbackuri
After this I added an input parameter using the "Input and Output" button available in the runbook. I named the input parameter "WebhookData" with type "Object". The name of the input parameter is case-sensitive and should match the parameter used in the PowerShell code.
This resolved my issue. The runbook started when called from the ADF pipeline, and it moved on to the next pipeline only when the underlying runbook called by the webhook had completed.

Sending FPDF attachments does not work on my Linux SUSE server but it does on my shared hosting account

I have a PHP program I have developed on the internet. So far I have used a shared hosting package. Everything worked until I moved to a VPS (Apache 2, SUSE 9.1, Plesk). I have found certain PHP functions have not been activated. I have solved most of them by using the internet.
My main problem is emailing PDFs with FPDF, i.e.:
<?php
// download fpdf class (http://fpdf.org)
require("fpdf.php");
// fpdf object
$pdf = new FPDF();
// generate a simple PDF (for more info, see http://fpdf.org/en/tutorial/)
$pdf->AddPage();
$pdf->SetFont("Arial","B",14);
$pdf->Cell(40,10, "this is a pdf example");
// email stuff (change data below)
$to = "steven#siteaddress.co.uk";
$from = "me#domain.com";
$subject = "send email with pdf attachment";
$message = "<p>Please see the attachment.</p>";
// a random hash will be necessary to send mixed content
$separator = md5(time());
// carriage return type (we use a PHP end of line constant)
$eol = PHP_EOL;
// attachment name
$filename = "example.pdf";
// encode data (puts attachment in proper format)
$pdfdoc = $pdf->Output("", "S");
$attachment = chunk_split(base64_encode($pdfdoc));
// main header (multipart mandatory)
$headers = "From: ".$from.$eol;
$headers .= "MIME-Version: 1.0".$eol;
$headers .= "Content-Type: multipart/mixed; boundary=\"".$separator."\"".$eol.$eol;
$headers .= "Content-Transfer-Encoding: 7bit".$eol;
$headers .= "This is a MIME encoded message.".$eol.$eol;
// message
$headers .= "--".$separator.$eol;
$headers .= "Content-Type: text/html; charset=\"iso-8859-1\"".$eol;
$headers .= "Content-Transfer-Encoding: 8bit".$eol.$eol;
$headers .= $message.$eol.$eol;
// attachment
$headers .= "--".$separator.$eol;
$headers .= "Content-Type: application/octet-stream; name=\"".$filename."\"".$eol;
$headers .= "Content-Transfer-Encoding: base64".$eol;
$headers .= "Content-Disposition: attachment".$eol.$eol;
$headers .= $attachment.$eol.$eol;
$headers .= "--".$separator."--";
// send message
//mail($to, $subject, "", $headers);
if (@mail($to, $subject, "",$headers)) {
echo('<p>Mail sent successfully.</p>');
} else {
echo('<p>Mail could not be sent.</p>');
}
?>
The file above works on my shared hosting, but when it comes to sending from my VPS I get this error message:
Mar 23 19:16:56 h1871885 suhosin[64630]: ALERT - mail() - double newline in headers, possible injection, mail dropped (attacker '86.137.40.199', file '/srv/www/vhosts/sitename.co.uk/httpdocs/main/email.php', line 111)
After much trial and error, the problem is from this line:
if (@mail($to, $subject, "",$headers))
If I remove the "", it sends the email on my VPS but there is no attachment; this also happens on my shared account. The attachment ends up in the message as a whole load of characters.
So I definitely need it in there. Does anyone have a clue how to overcome this problem?
Many thanks.
After setting the relevant option in suhosin.ini to 0:
Mar 23 20:52:48 h1871885 suhosin[60778]: ALERT - mail() - double newline in headers, possible injection, mail dropped (attacker '86.137.40.199', file '/srv/www/vhosts/sitename.co.uk/httpdocs/main/email1.php', line 56)
You have an awful lot of .$eol.$eol in your $headers, and I imagine suhosin is forbidding the mail on the second instance. But I presume you've looked enough at RFC2822 to know exactly where you need blank lines in your message formatting, so you can turn off suhosin's mail() protection, assuming you're confident that you don't have any remotely exploitable injection vulnerabilities.
