How to configure and install Nano Server using DSC PowerShell on Windows Server 2019 - Azure

I have Windows Server 2019, where I want to set up a Nano Server installation and Docker using DSC PowerShell scripts.
This requirement is for an Azure VM using State Configuration from Azure Automation.
The script:
configuration Myconfig
{
    Import-DscResource -ModuleName DockerMsftProvider
    {
        Ensure = 'present'
        Module_Name = 'DockerMsftProvider'
        Repository = 'PSGallery'
    }
}
I know I am missing a few parameters here; please help me complete this script.
Similarly, I need it to set up Nano Server as well, if possible.
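For what it's worth, here is a minimal sketch of the direction this could take, assuming you use the PSModuleResource resource from the PowerShellModule module (the same pattern used in the Azure Automation DSC answer further down this page) to install DockerMsftProvider; installing Docker itself and setting up Nano Server would still need additional resources:
Configuration MyConfig
{
    # Assumes the PowerShellModule DSC module (which provides PSModuleResource)
    # has been added to the Azure Automation account.
    Import-DscResource -ModuleName PowerShellModule

    Node 'localhost'
    {
        # Installs the DockerMsftProvider PowerShell module on the node
        PSModuleResource DockerMsftProvider
        {
            Ensure      = 'Present'
            Module_Name = 'DockerMsftProvider'
        }
    }
}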

Related

ASP.NET Core NodeServices on Azure Linux WebApp

I am used to publishing Azure WebApps on Windows, but now I am trying to deploy an ASP.NET Core 3 app (with NodeServices) to a Linux WebApp and I am receiving the following error message:
InvalidOperationException: Failed to start Node process. To resolve this:.
[1] Ensure that Node.js is installed and can be found in one of the PATH directories.
Current PATH enviroment variable is: /opt/dotnetcore-tools:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/home/site/wwwroot
Make sure the Node executable is in one of those directories, or update your PATH.
On Windows WebApps I have a lot of other apps and they are all fine.
On Kudu I typed node -v and the output was v12.13.0.
Can anybody please help me?
Thank you very much.
After long research and with the assistance of a Microsoft engineer (https://github.com/caroe2014), this is the final answer in three steps:
1) Startup.cs
services.AddNodeServices(options =>
{
    if (RuntimeInformation.IsOSPlatform(OSPlatform.Linux))
    {
        options.ProjectPath = Path.GetFullPath("/home/site/wwwroot");
    }
});
2) What I also found is that Node is not present in the container, so it is necessary to have a script that installs and starts it before starting the app itself. So I have this start1.sh file:
#!/bin/bash
apt-get install curl
curl -sL https://deb.nodesource.com/setup_12.x | bash
apt-get install -y nodejs
set -e
export PORT=8080
export ASPNETCORE_URLS=http://*:$PORT
dotnet "Web.Identity.dll"
Here Web.Identity.dll is the DLL of my app.
3) Set the startup command to /home/site/wwwroot/start1.sh (in the Azure Portal under App Service Configuration, or in Azure DevOps).
That's all.
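If you want to script step 3 rather than set it in the portal, a rough equivalent with the Azure CLI could look like this (the resource group and app name are placeholders for your own values):
az webapp config set --resource-group <myResourceGroup> --name <myAppName> --startup-file "/home/site/wwwroot/start1.sh"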
Try specifying the path in the code. This is how NodeServices was configured in Startup.cs:
services.AddNodeServices(options =>
{
    options.ProjectPath = "Path\That\Doesnt\Exist";
});

How to automate Azure P2S VPN connection on Windows 10 with Jenkins pipeline

I have installed an Azure P2S VPN on my Windows computer and I can connect to it manually. I also have a PowerShell script to do the job.
Here's the script:
rasphone "Azure-VPN"
$wshell = New-Object -ComObject wscript.shell;
$wshell.AppActivate('Network Connections')
Sleep 2
$wshell.SendKeys('~')
Sleep 2
$wshell.SendKeys('~')
The $wshell.SendKeys('~') calls replace pressing the Enter key when I connect manually.
I can run this script from the command line and it connects the VPN successfully:
> powershell C:\myScript.ps1
True
Now I want to run this script from a Jenkins pipeline, but it seems this cannot be achieved.
stage('VPN') {
    bat "powershell C:\\myScript.ps1"
}
It returns False in the Jenkins console output.
I also tried following the accepted answer here, but still no luck (it cannot be run either from the command line or on Jenkins):
> rasdial Azure-VPN /phonebook:%userprofile%\AppData\Roaming\Microsoft\Network\Connections\Cm\<aLongNumber>\<aLongNumber>.pbk
Remote Access error 623 - The system could not find the phone book entry for this connection.
Is there any workaround for this? My purpose is to use Jenkins pipeline to turn on the VPN, send some files over the network and then turn it off.
You could use Jenkins' PowerShell plugin to run PowerShell scripts on Windows directly via Jenkins. You can find more references in this blog.
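As a rough sketch of that first option, assuming the plugin's powershell pipeline step is available on the agent, the stage could look like this:
stage('VPN') {
    // run the script through PowerShell directly instead of wrapping it in bat
    powershell 'C:\\myScript.ps1'
}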
Alternatively, as in this SO answer, you could invoke a batch file with Jenkins like this for Windows paths:
stage('build') {
    dir("build_folder") {
        bat "run_build_windows.bat"
    }
}
or
stage('build') {
    bat "c://some/folder/run_build_windows.bat"
}

Update application using Azure Automation DSC

How do I update an application using Azure Automation DSC?
When I change the configuration, then upload and compile it, the status of the VM node goes from Compliant to Pending.
Then I have to wait 30 minutes for the node to pick up the new configuration, which then updates the application. I changed the package version too. Is there a way to force-trigger the update?
Following is my code:
Configuration Deploy
{
    Import-DscResource -ModuleName cWebPackageDeploy
    Import-DscResource -ModuleName PowerShellModule

    Node "localhost"
    {
        cWebPackageDeploy depwebpackage
        {
            Name           = "website.zip"
            StorageAccount = "testdeploy"
            StorageKey     = "xxxxxxxxxxxxxxxxxxxxxxx"
            Ensure         = "Present"
            PackageVersion = "1.0"
            DeployPath     = "C:\Temp\Testdeploy"
            DependsOn      = "[PSModuleResource]Azure.Storage"
        }

        PSModuleResource Azure.Storage
        {
            Ensure      = 'present'
            Module_Name = 'Azure.Storage'
        }
    }
}
Deploy
There is no way of doing that natively with Azure Automation.
That being said, you can always work around it by telling the VM to pull its configuration with Update-DscConfiguration.
You can create a script that uploads the configuration, compiles it and forces the VM to pull from the pull server.
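For example, something along these lines (the computer name is a placeholder) should make the node's LCM pull and apply the newly compiled configuration immediately instead of waiting for the next refresh interval:
# Trigger an immediate pull-and-apply on the target node
# (omit -ComputerName to run it locally on the VM itself)
Update-DscConfiguration -ComputerName 'MyVmName' -Wait -Verbose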

OctopusDSC module in Azure DSC not found

I'm trying to install an Octopus Tentacle as part of an Azure deployment using the PowerShell DSC extension.
I've installed OctopusDSC under the Automation account and it appears in the module list:
ResourceGroupName : RESOURCEGROUP
AutomationAccountName : AUTOMATIONUSER
Name : OctopusDSC
IsGlobal : False
Version :
SizeInBytes : 0
ActivityCount : 0
CreationTime : 22/02/2017 14:03:07 +00:00
LastModifiedTime : 22/02/2017 14:04:42 +00:00
ProvisioningState : Succeeded
I've then created a PowerShell script with a basic install that tries to import the module (first few lines below):
Configuration installoctopus
{
    Import-DscResource -ModuleName OctopusDSC
But then I get the error during deployment:
Unable to load resource 'OctopusDSC': Resource not found.\r\n\r\nAt C:\Packages\Plugins\Microsoft.Powershell.DSC\2.22.0.0\DSCWork\installoctopus2.0\installoctopus2.ps1:8 char:7\r\n+ cTentacleAgent OctopusTentacle\r\n+
I've tried with Import-DscResource -Module OctopusDSC as well as Import-DscResource -Module * but get the same errors.
One of the first parts of the OctopusDSC documentation is:
First, ensure the OctopusDSC module is on your $env:PSModulePath. Then you can create and apply configuration like this.
but I didn't have to do this for the cChoco DSC module (and I'm unsure how to do it as part of a DSC configuration?), which works fine. Is this a different type of module that requires extra import options? Is it actually a PowerShell module that has to be on the guest VM despite being listed in the Azure Automation modules?
The OctopusDSC resource needs to be on the guest VM for the Import-DscResource -ModuleName OctopusDSC command to succeed there. So make sure it's in the ZIP file that contains your configuration script.
The easiest way to get all the needed resources into the zip file is to create it with the Publish-AzureRmVMDscConfiguration cmdlet and use the -OutputArchivePath parameter. But for that cmdlet to find the module, it must be in $env:PSModulePath on the machine where you run the cmdlet. So 1) install OctopusDSC into the PSModulePath (on the "build" machine) and then 2) run the cmdlet.
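A minimal sketch of that, assuming the configuration lives in installoctopus.ps1 and OctopusDSC is already installed on the build machine:
# Bundles installoctopus.ps1 plus the DSC resource modules it imports into one archive
Publish-AzureRmVMDscConfiguration -ConfigurationPath .\installoctopus.ps1 -OutputArchivePath .\installoctopus.ps1.zip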
Alternatively, you can manually add the OctopusDSC module to the zip file. Usually this just means putting the module folder into the zip file, but depending on the resource it can mean more than that (I don't know of a good doc on manually creating it); still, it's trivial to try this route and see if it works.
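If you go the manual route, one way to add the module folder to an existing archive from PowerShell might be (paths are illustrative):
# Add the OctopusDSC module folder to the root of the existing configuration archive
Compress-Archive -Path 'C:\Program Files\WindowsPowerShell\Modules\OctopusDSC' -DestinationPath .\installoctopus.ps1.zip -Update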

Chef WebPI cookbook fails install in Azure

I set up a new Win2012 VM in Azure with the Chef plugin and have it connected to manage.chef.io. I added a cookbook which uses the WebPI cookbook to install ServiceBus and its dependencies. The install fails with the following error:
“Error opening installation log file. Verify that the specified log file location exists and is writable.”
After some searching it looks like this is not new in Azure, based on this 2013 blog post: https://nemetht.wordpress.com/2013/02/27/web-platform-installer-in-windows-azure-startup-tasks/
It offers a hack to disable security on the folder temporarily, but I'm looking for a better solution.
Any ideas?
More of the log output -
Started installing: 'Microsoft Windows Fabric V1 RTM'
.
Install completed (Failure): 'Microsoft Windows Fabric V1 RTM'
.
WindowsFabric_1_0_960_0 : Failed.
Error opening installation log file. Verify that the specified log file location exists and is writable.
DependencyFailed: Microsoft Windows Fabric V1 CU1
DependencyFailed: Windows Azure Pack: Service Bus 1.1
.
..
Verifying successful installation...
Microsoft Visual C++ 2012 SP1 Redistributable Package (x64) True
Microsoft Windows Fabric V1 RTM False
Log Location: C:\Windows\system32\config\systemprofile\AppData\Local\Microsoft\Web Platform Installer\logs\install\2015-05-11T14.15.51\WindowsFabric.txt
Microsoft Windows Fabric V1 CU1 False
Windows Azure Pack: Service Bus 1.1 False
Install of Products: FAILURE
STDERR:
---- End output of "WebpiCmd.exe" /Install /products:ServiceBus_1_1 /suppressreboot /accepteula /Log:c:/chef/cache/WebPI.log ----
Ran "WebpiCmd.exe" /Install /products:ServiceBus_1_1 /suppressreboot /accepteula /Log:c:/chef/cache/WebPI.log returned -1
A Chef contact (thanks Bryan!) helped me understand this issue better. Some WebPI packages do not respect the explicit log path provided to WebPIcmd.exe. The author should fix the package to use the provided log path when it is set. So the options became:
1) Have the author fix the package.
2) Run Chef in a new scheduled task as a different user which has access to the AppData folder.
3) Edit the cookbook to perform/unperform a registry edit that temporarily moves the AppData folder to a location the System user has access to, either in my custom cookbook or in a fork of the WebPI cookbook.
Obviously, waiting on the author (Microsoft in this case) to fix the package would not happen quickly.
Changing how the Azure VM runs Chef doesn't make sense, considering the whole idea is to provide the configuration at provisioning time and have it just work. Plus, changing the default setup may have unintended consequences and puts us in a non-standard environment.
In the short term, I decided to alter the registry in my custom cookbook.
registry_key 'HKEY_USERS\.DEFAULT\Software\Microsoft\Windows\CurrentVersion\Explorer\User Shell Folders' do
  values [{
    :name => "Local AppData",
    :type => :expand_string,
    :data => "%~dp0appdata"
  }]
  action :create
end

webpi_product 'ServiceBus_1_1' do
  accept_eula true
  action :install
end

webpi_product 'ServiceBus_1_1_CU1' do
  accept_eula true
  action :install
end

registry_key 'HKEY_USERS\.DEFAULT\Software\Microsoft\Windows\CurrentVersion\Explorer\User Shell Folders' do
  values [{
    :name => "Local AppData",
    :type => :expand_string,
    :data => '%%USERPROFILE%%\AppData\Local'
  }]
end
This change could also be made in the WebPI cookbook itself to fix the issue for all dependent cookbooks. I decided not to pursue that until the WebPI team responds to a feature request for the framework to verify that packages respect the log path:
http://forums.iis.net/t/1225061.aspx?WebPI+Feature+Request+Validate+product+package+log+path+usage
Please go and reply to that thread to try to get the team to help protect against this common package issue.
Here is the solution with PowerShell.
I had the same error while installing the Service Fabric SDK during VMSS VM creation; the System user was used there as well.
Issue: when I connected with RDP as my "admin" user and ran the same install, it worked.
Solution: change the registry entry as above, install, and reset it back.
Here is my solution using PowerShell.
I placed 2 .reg files into the %TEMP% folder. Their content is the old and the new exported value of the "Local AppData" entry under User Shell Folders:
plugin-sf-SDK-temp.reg
Windows Registry Editor Version 5.00
[HKEY_USERS\.DEFAULT\Software\Microsoft\Windows\CurrentVersion\Explorer\User Shell Folders]
"Local AppData"=hex(2):25,00,54,00,45,00,4d,00,50,00,25,00,00,00
plugin-sf-SDK-orig.reg
Windows Registry Editor Version 5.00
[HKEY_USERS\.DEFAULT\Software\Microsoft\Windows\CurrentVersion\Explorer\User Shell Folders]
"Local AppData"=hex(2):25,00,55,00,53,00,45,00,52,00,50,00,52,00,4f,00,46,00,\
49,00,4c,00,45,00,25,00,5c,00,41,00,70,00,70,00,44,00,61,00,74,00,61,00,5c,\
00,4c,00,6f,00,63,00,61,00,6c,00,00,00
Integrate the following code into your custom PowerShell script:
Write-Output "Reset LocalApp Folder to TEMP"
Start-Process "$($env:windir)\regedit.exe" `
-ArgumentList "/s", "$($env:TEMP)\plugin-sf-SDK-temp.reg"
## replace the following lines with your installation - here my SF SDK installation via WebWPIcmd
Write-Output "Installing /Products:MicrosoftAzure-ServiceFabric-CoreSDK"
Start-Process "$($env:programfiles)\microsoft\web platform installer\WebPICMD.exe" `
-ArgumentList '/Install', `
'/Products:"MicrosoftAzure-ServiceFabric-CoreSDK"', `
'/AcceptEULA', "/Log:$($env:TEMP)\WebPICMD-install-service-fabric-sdk.log" `
-NoNewWindow -Wait `
-RedirectStandardOutput "$($env:TEMP)\WebPICMD.log" `
-RedirectStandardError "$($env:TEMP)\WebPICMD.error.log"
Write-Output "Reset LocalApp Folder to ORIG"
Start-Process "$($env:windir)\regedit.exe" `
-ArgumentList "/s", "$($env:TEMP)\plugin-sf-SDK-orig.reg"
