Deploying to a private IIS server from a build in Visual Studio Team Services - iis

Having done what is suggested here: Deploy from Visual Studio Online build to private IIS server
... how do I set up automatic deploys as part of my build when I build a whole branch (**/*.sln)?
What I have tried ...
In VS I can get the latest version of the code, open a solution and then ...
right click > publish > pick publish profile > deploy
I have named my publish profiles things like "dev", "qa" and "production"; these refer to the environments into which the project will be deployed, and the profiles contain all of the configuration information needed for VS to deploy the application (via WebDeploy/MSDeploy) using "one click deploy".
I want to have Team Services on the build server do the exact same thing for projects that have publish profiles defined, after it has built the code.
My understanding was that I could just add the msbuild args like this ...
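(These are roughly the standard web-publishing properties; the profile name and credentials below are placeholders for my real values.)
/p:DeployOnBuild=true /p:PublishProfile=dev /p:Configuration=Release /p:UserName=deployUser /p:Password=********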
This results in the deployment part of the build throwing the following error into the build log ...
C:\Program Files (x86)\MSBuild\Microsoft\VisualStudio\v14.0\Web\Microsoft.Web.Publishing.targets(4288,5):
Error ERROR_USER_NOT_ADMIN: Web deployment task failed.
(Connected to 'server' using the Web Deployment Agent Service, but could not authorize. Make sure you are an administrator on 'server'.
Learn more at: http://go.microsoft.com/fwlink/?LinkId=221672#ERROR_USER_NOT_ADMIN.)
What user is this using if not the user defined in the publish profile?
Related issues:
Publishing via TFS Build Service fails with "User not Admin"
TFS Builds: Running the builds as administrator
I added an account to the server in question (since the build server and the server being deployed to are the same machine, this made things easier). I also added a group to the server called "MSDepSvcUsers" and added the new account both to it and to the admins group on the box.
I then told both the Web Deployment Agent service and the Team Services Agent service to run under this account (and restarted them).
Unfortunately the result is the same ... I now really want to know how to ensure that the account used for the msdeploy command is one I expect, without relying on loads of scripting ... or maybe that's why Microsoft hasn't set this up as a default deploy step option in Team Services already!

OK, so I had some long conversations with the VSTS team over at Microsoft about this, and the long and short of it is ...
Microsoft:
We understand your frustration with this area and a big project is
about to spin up to resolve this issue
...
Me being me, I came up with a "trick to make it happen".
I managed to figure out that, for some odd reason, the build box can't be the same server that you are deploying to (no idea why), but having figured that out I wrote a simple console app that, with some additional feedback from Microsoft, came out pretty good.
It even reports progress back to the build process and can log problems in the deployment as build errors in order to fail the build, by calling "internal commands" (neat how this works, by the way; kudos to the team for that).
There are some hacks in here and it's not perfect, but hopefully it'll help someone else. Because it's part of the code that gets built in my repo, I am able to add a step to the build process that calls it from within the build output, passing the environment name I want to deploy to.
This in turn grabs all the packages (as per the settings above), uses their publish profiles to figure out where the packages need to go, and sends them to the right servers to be deployed ...
using System;
using System.Diagnostics;
using System.IO;
using System.Reflection;

namespace Deploy
{
    class Program
    {
        static string msDeployExe = @"C:\Program Files\IIS\Microsoft Web Deploy V3\msdeploy.exe";

        static void Main(string[] args)
        {
            var env = args[0];
            var buildRoot = Path.Combine(Assembly.GetExecutingAssembly().Location.Replace("Deploy.exe", ""), env);
            //var commands = GetCommands(buildRoot);
            var packages = new DirectoryInfo(buildRoot).GetFiles("*.zip", SearchOption.AllDirectories);
            bool success = true;
            for (int i = 0; i < packages.Length; i++)
            {
                if (!Deploy(packages[i], env)) success = false;
                // task.setprogress expects a value=<percentage> property
                Console.WriteLine("##vso[task.setprogress value=" + (int)(((decimal)i / (decimal)packages.Length) * 100m) + "]");
            }
            Console.WriteLine("##vso[task.setprogress value=100]");
            if (success) Console.WriteLine("##vso[task.complete result=Succeeded]");
            else Console.WriteLine("##vso[task.complete result=SucceededWithIssues]");
        }

        static bool Deploy(FileInfo package, string environment)
        {
            bool succeeded = true;
            Console.WriteLine("Deploying " + package.FullName);
            var procArgs = new ProcessStartInfo
            {
                FileName = msDeployExe,
                UseShellExecute = false,
                RedirectStandardOutput = true,
                RedirectStandardError = true,
                Arguments =
                    "-source:package='" + package.FullName + "' " +
                    "-dest:auto,ComputerName='" + environment + ".YourDomain.com',UserName='deployment user',Password='password',AuthType='ntlm',IncludeAcls='False' " +
                    "-verb:sync " +
                    "-disableLink:AppPoolExtension " +
                    "-disableLink:ContentExtension " +
                    "-disableLink:CertificateExtension " +
                    "-setParamFile:\"" + package.FullName.Replace("zip", "SetParameters.xml") + "\""
            };
            try
            {
                Console.WriteLine(msDeployExe + " " + procArgs.Arguments);
                using (var process = Process.Start(procArgs))
                {
                    var result = process.StandardOutput.ReadToEnd().Split('\n');
                    var error = process.StandardError.ReadToEnd();
                    process.WaitForExit();
                    if (!string.IsNullOrEmpty(error))
                    {
                        Console.WriteLine("##vso[task.logissue type=error]" + error);
                        succeeded = false;
                    }
                    foreach (var l in result)
                        if (l.ToLowerInvariant().StartsWith("error"))
                        {
                            Console.WriteLine("##vso[task.logissue type=error]" + l);
                            succeeded = false;
                        }
                        else
                            Console.WriteLine(l);
                }
            }
            catch (Exception ex)
            {
                succeeded = false;
                Console.WriteLine("##vso[task.logissue type=error]" + ex.Message);
                Console.WriteLine("##vso[task.logissue type=error]" + ex.StackTrace);
            }
            return succeeded;
        }
    }
}
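For reference, the build step just runs this executable with the target environment as its only argument (for example Deploy.exe qa); the tool then expects a sub-folder with that name next to the exe, containing each package's .zip and its matching SetParameters.xml.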

No, you don't need a ton of PowerShell scripts to achieve this. MSDeploy.exe is an incredibly useful tool that can probably cover your needs. Add the /t:Package build argument to your VS build task to create a package. Then use a Command Line task to deploy the MSDeploy package to your IIS site. Here are more details about how WebDeploy/MSDeploy works:
http://www.dotnetcatch.com/2016/02/25/the-anatomy-of-a-webdeploy-package/
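For example, when the package is produced you get a generated <Project>.deploy.cmd next to the .zip and SetParameters.xml, so the command-line step can be roughly this (server name and credentials are placeholders; /T does a what-if run, /Y does the real sync):
MyWebApp.deploy.cmd /Y /M:https://targetserver:8172/msdeploy.axd /U:deployUser /P:password /A:basic -allowUntrusted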

I do this all of the time. What I did was set up a Release in the Release tab and signed up to enable Deployment Groups. Once you have the Deployment Groups feature enabled on your account (I needed to contact MS to get this enabled), I could download a PS script that I run on each of the machines that I want to deploy to. Then in the Release screen I can set up the steps to run in a Deployment Group, and the various publish tasks run on the local server, allowing them to work.
Using Deployment Groups is an excellent solution because, if you have it load balanced, it will deploy to only a portion of the load-balanced servers at a time, allowing the app to stay up the whole time.

Related

Getting Error When Starting Node in ASP.NET Core 2.0

I first create a vanilla ASP.NET Core 2.0 web site with VS2017. Then I add Node support by updating Startup.cs with these lines:
public void ConfigureServices(IServiceCollection services)
{
    services.AddMvc();
    services.AddNodeServices(options => {
        options.LaunchWithDebugging = true;
        options.DebuggingPort = 9229;
    });
}
I create a view that returns some data and when I run it locally, it works. Here is that code:
public async Task<IActionResult> NodeTest([FromServices] INodeServices nodeServices)
{
    ViewData["ResultFromNode"] =
        await nodeServices.InvokeAsync<string>("NodeSrc/myNodeModule.js");
    return View(viewName: "NodeTest");
}
My myNodeModule.js is as follows:
module.exports = function (callback) {
    var message = 'Hello from Node js script at ' +
        new Date().toString() + ' process.versions ' +
        process.version;
    callback(/* error */ null, message);
};
When I run this locally with VS2017 it works. When I use VSTS and deploy, even after adding a process variable setting ASPNETCORE_ENVIRONMENT to Development, I still get an error when I try to browse to the page. The error is below.
My 2 questions are:
1. How can I get better errors reported?
2. How can I get Node to run on the server in both debug and production modes?
Error:
Error.
An error occurred while processing your request.
Request ID: 0HL87NKEP2GL6:00000002
Development Mode
Swapping to Development environment will display more detailed information about the error that occurred.
Development environment should not be enabled in deployed applications, as it can result in sensitive information from exceptions being displayed to end users. For local debugging, development environment can be enabled by setting the ASPNETCORE_ENVIRONMENT environment variable to Development, and restarting the application.
You need to set Copy to Output Directory to Copy if newer in the JS file's properties (right-click the JS file => Properties => change Copy to Output Directory to Copy if newer).
BTW: it works fine locally without changing Copy to Output Directory because there it looks for the file in the project root path. I tested with the Using Node Services in ASP.NET Core article.
Regarding the development environment in an Azure Web App, you can set it in the Application settings of the web app:
Go to the Azure portal
Select your web app
Select Application Settings
Add ASPNETCORE_ENVIRONMENT with Development value in App Settings.
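On question 1 (better errors): the detailed error page is only wired up for the Development environment by the default template, which is why that environment setting matters. Roughly, the generated Startup.Configure looks like this:
public void Configure(IApplicationBuilder app, IHostingEnvironment env)
{
    if (env.IsDevelopment())
    {
        // Detailed exception page, including errors thrown by NodeServices invocations
        app.UseDeveloperExceptionPage();
    }
    else
    {
        app.UseExceptionHandler("/Home/Error");
    }

    app.UseStaticFiles();
    app.UseMvcWithDefaultRoute();
}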

SSL based webserver on Windows IoT

I am working on a project which involves gathering some sensor data and building a GUI on top of it, with control of the sensors. It has the following two basic requirements.
It should be a web-based solution (although it will only be used on the LAN, or even the same PC).
It should be executable on both Windows IoT Core and a standard Windows PC (Windows 7 and above).
I have decided to use Embedded webserver for Windows IoT, which seems to be a good embedded server based on PCL targeting .NET 4.5 and UWP, so I can execute it in both environments. That is great! But the problem is that this web server doesn't support SSL. I have tried to search for other servers and have come up with Restup for UWP, which is also a good REST-based web server, but it doesn't support SSL either.
I need an expert opinion on whether there is any possibility of using SSL with these web servers. Could it be implemented using a library like OpenSSL? (Although I think that would be too complex and time-consuming to implement correctly.)
Edit
I would even like to know about ASP.NET Core on Windows 10 IoT Core, and whether I can build one application for both versions of Windows. I found one example, but it is DNX-based and I don't want to go that way, as DNX is deprecated.
Any help is highly appreciated.
Late answer, but .NET Core 2.0 looks promising with Kestrel. I successfully created a .NET Core 2.0 app on the Pi 3 this morning. Pretty nifty, and if you already have an Apache web server, you're almost done. I'm actually going to embed (might not be the right term) my .NET Core 2.0 web application into a UWP app, rather than create multiple unique apps for the touchscreens around the house.
.NET Core 2.0 is still in preview though.
https://learn.microsoft.com/en-us/aspnet/core/fundamentals/servers/kestrel?tabs=aspnetcore2x
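As a minimal sketch of the SSL part (assuming ASP.NET Core 2.0, a standard Startup class, and a PFX certificate deployed next to the app; the file name and password here are placeholders), Kestrel can serve HTTPS directly:
using System.Net;
using Microsoft.AspNetCore.Hosting;

public class Program
{
    public static void Main(string[] args)
    {
        // Listen on 443 and serve HTTPS straight from Kestrel using a local PFX file.
        var host = new WebHostBuilder()
            .UseKestrel(options =>
            {
                options.Listen(IPAddress.Any, 443, listenOptions =>
                {
                    listenOptions.UseHttps("myCert.pfx", "certPassword");
                });
            })
            .UseStartup<Startup>()
            .Build();

        host.Run();
    }
}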
I know this post is pretty old, but I have built the solution which you are asking about. I'm currently running .NET 5.0 on a Raspberry Pi. When you build the .NET Core web project, select the correct target framework and set the target runtime to win-arm. Copy the output to some directory on the Pi, and you will have to access the device using PowerShell to create a scheduled task to start the web project. Something like this:
schtasks /create /tn "Startup Web" /tr c:\startup.bat /sc onstart /ru SYSTEM
That starts a bat file which runs a PowerShell command; the command file contains the following (the .\VradTrackerWeb.exe is on a second line in the file, and it is the name of the web app):
Set-Location C:\apps\vradWebServer\
.\VradTrackerWeb.exe
That starts the server. If you have any web pages or apps posting to the web server, you will need an SSL cert. I used No-IP and Let's Encrypt for this. For Let's Encrypt to work, you will need an external-facing web server and have the domain name point to it. Run Let's Encrypt on the external server, then copy out the cert and place it in your web directory on the Pi. I then have a UWP program that runs on the Pi; when it starts, it gets its local address and then updates No-IP with the local address, so the local devices communicating will be correctly routed and have the SSL cert. Side note: my UWP app is the startup app on the device. The scheduled task is important because it allows you to run your app and the web server. The following snippet is how I get the IP address and then update No-IP.
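// Note: the snippet below assumes using System.Net, System.Net.Sockets and System.Net.Http,
// and that SentDynamicIP / LastIPAddress are properties defined elsewhere in the app.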
private string GetLocalIP()
{
    string localIP = "";
    using (Socket socket = new Socket(AddressFamily.InterNetwork, SocketType.Dgram, 0))
    {
        socket.Connect("8.8.8.8", 65530);
        IPEndPoint endPoint = socket.LocalEndPoint as IPEndPoint;
        localIP = endPoint.Address.ToString();
    }
    return localIP;
}//GetLocalIP

private async void UpdateIP()
{
    string localIP = "";
    string msg = "";
    var client = new HttpClient(new HttpClientHandler { Credentials = new NetworkCredential("YourUserName", "YourPassword") });
    try
    {
        localIP = GetLocalIP();
        string noipuri = "http://dynupdate.no-ip.com/nic/update?hostname=YourDoman.hopto.org&myip=" + localIP;
        using (var response = await client.GetAsync(noipuri))
        using (var content = response.Content)
        {
            msg = await content.ReadAsStringAsync();
        }
        if (msg.Contains("good") == true || msg.Contains("nochg") == true)
        {
            SentDynamicIP = true;
            LastIPAddress = localIP;
        }
        else
        {
            SentDynamicIP = false;
        }
    }
    catch (Exception ex)
    {
        string x = ex.Message;
    }
    finally
    {
        client.Dispose();
    }
}//UpdateIP

How to establish a continuous deployment of non-.NET project/solution to Azure?

I have connected Visual Studio Online to my Azure website. This is not a .NET ASP.NET MVC project, just several static HTML files.
Now I want to get my files uploaded to Azure and available 'online' after my commits/pushes to the TFS.
When a build definition (based on GitContinuousDeploymentTemplate.12.xaml) is executed it fails with an obvious message:
Exception Message: The process parameter ProjectsToBuild is required but no value was set.
My question: how do I set up a build definition so that it automatically copies my static files to Azure on commits?
Or do I need to use different tooling for this task (like WebMatrix)?
Update
I ended up creating an empty website and deploying it manually from Visual Studio using WebDeploy. Another possible option to consider is to create a local Git repository at Azure.
Alright, let me try to give you an answer:
I was having quite a similar issue. I had a static HTML, JS and CSS site which I needed to have in TFS due to the project, and I wanted to make my life easier using continuous deployment. So what I did was the following:
When you have a Git repository in TFS, you get a URL for the repository, something like:
https://yoursite.visualstudio.com/COLLECTION/PROJECT/_git/REPOSITORY
However, in order to access the repository itself you need to authenticate, which is not currently possible; if you try to put the URL with authentication into Azure, like this:
https://username:password@TFS_URL
it will not accept it. So what you do in order to bind the deployment is just put the repository URL there (the deployment will fail; however, it will prepare the environment for us to proceed).
Once you link it there, though, you can get the DEPLOYMENT TRIGGER URL on the Configure tab of the Website. What it is for is this: when you push a change to your repository (say to GitHub), GitHub makes an HTTP POST request to that link, which tells Azure to deploy the new code onto the site.
Now I went to Kudu, which is the underlying system of Azure Websites that handles the deployments. I figured out that if you send the correct contents in the HTTP POST (JSON format) to the DEPLOYMENT TRIGGER URL, you can have it deploy code from any repository, and it even authenticates!
So the thing left to do is to generate the alternative authentication credentials on the TFS site and put the whole request together. I wrapped this entire process into the following PowerShell script:
# Windows Azure Website Configuration
#
# WAWS_username: The user account which has access to the website, can be obtained from https://manage.windowsazure.com portal on the Configure tab under DEPLOYMENT TRIGGER URL
# WAWS_password: The password for the account specified above
# WAWS: The Azure site name
$WAWS_username = ''
$WAWS_password = ''
$WAWS = ''
# Visual Studio Online Repository Configuration
#
# VSO_username: The user account used for basic authentication in VSO (has to be manually enabled)
# VSO_password: The password for the account specified above
# VSO_URL: The URL to the Git repository (the branch is specified on the https://manage.windowsazure.com Configuration tab, BRANCH TO DEPLOY)
$VSO_username = ''
$VSO_password = ''
$VSO_URL = ''
# DO NOT EDIT ANY OF THE CODE BELOW
$WAWS_URL = 'https://' + $WAWS + '.scm.azurewebsites.net/deploy'
$BODY = '
{
"format": "basic",
"url": "https://' + $VSO_username + ':' + $VSO_password + '#' + $VSO_URL + '"
}'
$authorization = "Basic "+[System.Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes($WAWS_username+":"+$WAWS_password ))
$bytes = [System.Text.Encoding]::ASCII.GetBytes($BODY)
$webRequest = [System.Net.WebRequest]::Create($WAWS_URL)
$webRequest.Method = "POST"
$webRequest.Headers.Add("Authorization", $authorization)
$webRequest.ContentLength = $bytes.Length
$webRequestStream = $webRequest.GetRequestStream();
$webRequestStream.Write($bytes, 0, $bytes.Length);
$webRequest.GetResponse()
I hope that what I wrote here makes sense. The last thing you would need is to bind this script to a hook in Git, so that when you perform a push the script gets automatically triggered afterwards and the site is deployed. I haven't figured this piece out yet, though.
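One possible way to wire up that last piece (untested on my side) is a client-side post-commit hook that simply calls the script, e.g. a .git/hooks/post-commit file along these lines, where deploy.ps1 is the script above saved in the repository:
#!/bin/sh
powershell.exe -NoProfile -ExecutionPolicy Bypass -File ./deploy.ps1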
This should also work to deploy PHP/Node.js and similar code.
The easiest way would be to add them to an empty ASP .NET project, set them to be copied to the output folder, and then "build" the project.
Failing that, you could modify the build process template, but that's a "last resort" option.

Strange behavior of Windows Azure Compute Emulator (SDK 1.8) with multiple role instances on a clean machine with VS2012 but WITHOUT VS2010

Have you ever tried to run a hosted service in the Windows Azure emulator with full IIS and multiple role instances? Some days ago I noticed that only one of the multiple instances of a web role is started in IIS at a time. The following screenshot illustrates the behavior, and the message box in front of the screenshot shows the reason for this behavior. The message box appears on trying to start one of the stopped websites in the IIS Manager.
Screenshot: IIS with stopped Websites
The sample cloud application contains two web roles, MvcWebRole1 and WCFServiceWebRole1, each configured to use three instances. My first thought was: "Sure! No port collision will happen in the real Azure world because every role instance is its own virtual machine. It cannot work in the emulator!" But after some research and analyzing many parts of the Azure compute emulator, I found out that the compute emulator creates a unique IP for each role instance (in my example from 127.255.0.0 up to 127.255.0.5). This MSDN blog article (http://blogs.msdn.com/b/avkashchauhan/archive/2011/09/16/whats-new-in-windows-azure-sdk-1-5-each-instance-in-any-role-gets-its-own-ip-address-to-match-compute-emulator-close-the-cloud-environment.aspx) by Microsoft employee Avkash Chauhan describes this behavior as well. After that conclusion I came to the following question: why the hell does the compute emulator (more precisely DevFC.exe) not add the IP of the appropriate role to the binding information of each website???
I added the IP to each Website by hand and tadaaaaa: every Website can be started without any collisions. The next screenshot demonstrates it with the changed binding information highlighted.
Screenshot: IIS with started Websites
Once again: Why the hell does the emulator not do it for me? I wrote a small static helper method to do the binding extension thing for me on every role start. Maybe someone wants to use it:
public static class Emulator
{
    public static void RepairBinding(string siteNameFromServiceModel, string endpointName)
    {
        // Use a mutex to mutually exclude the manipulation of the iis configuration.
        // Otherwise server.CommitChanges() will throw an exception!
        using (var mutex = new System.Threading.Mutex(false, "AzureTools.Emulator.RepairBinding"))
        {
            mutex.WaitOne();
            using (var server = new Microsoft.Web.Administration.ServerManager())
            {
                var siteName = string.Format("{0}_{1}", Microsoft.WindowsAzure.ServiceRuntime.RoleEnvironment.CurrentRoleInstance.Id, siteNameFromServiceModel);
                var site = server.Sites[siteName];

                // Add the IP of the role to the binding information of the website
                foreach (Binding binding in site.Bindings)
                {
                    //"*:82:"
                    if (binding.BindingInformation[0] == '*')
                    {
                        var instanceEndpoint = RoleEnvironment.CurrentRoleInstance.InstanceEndpoints[endpointName];
                        string bindingInformation = instanceEndpoint.IPEndpoint.Address.ToString() + binding.BindingInformation.Substring(1);
                        binding.BindingInformation = bindingInformation;
                        server.CommitChanges();
                    }
                    else
                    {
                        throw new InvalidOperationException();
                    }
                }
            }

            // Start all websites of the role if all bindings of all websites of the role are prepared.
            using (var server = new Microsoft.Web.Administration.ServerManager())
            {
                var sitesOfRole = server.Sites.Where(site => site.Name.Contains(RoleEnvironment.CurrentRoleInstance.Role.Name));
                if (sitesOfRole.All(site => site.Bindings.All(binding => binding.BindingInformation[0] != '*')))
                {
                    foreach (Site site in sitesOfRole)
                    {
                        if (site.State == ObjectState.Stopped)
                        {
                            site.Start();
                        }
                    }
                }
            }
            mutex.ReleaseMutex();
        }
    }
}
I call the helper method as follows
public class WebRole : RoleEntryPoint
{
    public override bool OnStart()
    {
        if (RoleEnvironment.IsEmulated)
        {
            AzureTools.Emulator.RepairBinding("Web", "ServiceEndpoint");
        }
        return base.OnStart();
    }
}
I got it!
I have this behavior on three different machines, all of which were recently formatted and set up with fresh, clean Windows 8, Visual Studio 2012, Azure SDK 1.8 and Azure Tools installations. So a reinstallation of the Azure SDK and Tools (as Anton suggests) should not change anything. But the cleanliness of my three machines is the crucial point! Anton, do you have Visual Studio 2010 on your machine with at least VS2010 SP1 installed? I analyzed IISConfigurator.exe with ILSpy and found the code which sets the IP in the binding information of the websites to '*' (instead of 127.255.0.*). It depends on the static property Microsoft.WindowsAzure.Common.Workarounds.BindToAllIpsWorkaroundEnabled. This property internally uses Microsoft.WindowsAzure.Common.Workarounds.TryGetVS2010SPVersion and leads to setting the IP binding to '*' if the SP level of Visual Studio 2010 is smaller than 1. TryGetVS2010SPVersion checks four registry keys, and I don't know why, but one of the keys exists in my registry and returns Visual Studio 2010 SP level 0 (I never installed VS2010 on any of the three machines!!!). Once I changed the value of HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\DevDiv\vs\Servicing\10.0\SP from 0 to 10 (anything greater than 0 should do it), the Azure emulator started to set the 127.255.0.* IPs of the roles in the binding information of all of the websites in IIS, and all websites started correctly.
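If you want to apply the same workaround, the change is just that one registry value; for example, from an elevated PowerShell prompt (set it back if you ever need the original behaviour):
Set-ItemProperty -Path 'HKLM:\SOFTWARE\Wow6432Node\Microsoft\DevDiv\vs\Servicing\10.0' -Name SP -Value 10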

Unable to create Website using powershell in window service

I want to create a website in IIS using a PowerShell script. I write this code in a Windows service.
protected override void OnStart(string[] args)
{
    string sitename = "rajesh";
    //To create a New site
    //Powershell script
    string script = "cd\\\n" + "import-Module WebAdministration \n" + "IIS:\n" + "New-item iis:\\Sites\\rajesh -PhysicalPath C:\\inetpub\\wwwroot\test -bindings @{Protocol='http';bindingInformation='*:8080:" + sitename + "'}" + "\n add-content C:\\Windows\\System32\\drivers\\etc\\Hosts '127.0.0.1 " + sitename + "'";
    string s = RunScript(script);
}

private static string RunScript(string scriptText)
{
    Runspace runspace = RunspaceFactory.CreateRunspace();
    runspace.Open();
    Pipeline pipeline = runspace.CreatePipeline();
    pipeline.Commands.AddScript(scriptText);
    pipeline.Commands.Add("Out-String");
    Collection<PSObject> results = pipeline.Invoke();
    runspace.Close();
    StringBuilder stringBuilder = new StringBuilder();
    foreach (PSObject obj in results)
    {
        stringBuilder.AppendLine(obj.ToString());
    }
    return stringBuilder.ToString();
}
But when I try to start this Windows service from the services list, it throws an error: "The service on Local Computer started and then stopped." When I change the PowerShell script to create a folder or some other small task, the Windows service works, but when I try to create a website through this PowerShell script it throws an error and the Windows service can't be started. The PowerShell script works when run in PowerShell, just not in the Windows service.
The error produced would be helpful.
Sitename isn't a variable either; is that a typo here or in the code?
You could try replacing the \n with ; as well; PowerShell doesn't use that as a new line, though I guess the C# does.
There are a couple of things to note here, and I think you maybe don't understand some aspects of Windows services:
OnStart is not the place to be putting any kind of logic. This event is there so you can initialise the service, for example by starting a background worker thread that runs the main logic of the service and responds to external requests (such as starting a TCP listener or initialising .NET Remoting or WCF); see the sketch at the end of this answer.
OnStart is only called once when the service starts.
OnStart must return within 30 seconds although your initialisation code can request additional time.
You're calling out to PowerShell and the WebAdministration module to create websites. Whilst this is doable, it's clunky; perhaps you should consider calling directly into the IIS 7 managed APIs:
Microsoft.Web.Administration Namespace
Here's an example of creating a site using the managed APIs:
using (ServerManager serverManager = new ServerManager())
{
    Site site = serverManager.Sites.Add("My Web Site", "C:\\inetpub\\wwwroot\\test", 80);
    site.Bindings.Clear();
    site.Bindings.Add("*:8080:", "http");
    serverManager.CommitChanges();
}
It should also be noted that any configuration tasks performed on IIS require that the account doing this has elevated privileges, i.e. is an Administrator or, if it's a service, runs under the SYSTEM account.
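Tying those points together, OnStart itself can stay thin by pushing the work onto a background thread. A rough sketch (CreateSite is a hypothetical method wrapping the ServerManager code above):
protected override void OnStart(string[] args)
{
    // Return from OnStart quickly; do the real work on a background thread.
    var worker = new System.Threading.Thread(() =>
    {
        try
        {
            CreateSite("rajesh"); // hypothetical wrapper around the ServerManager example
        }
        catch (Exception ex)
        {
            // Log instead of crashing the service; Trace is just a placeholder sink.
            System.Diagnostics.Trace.TraceError(ex.ToString());
        }
    })
    { IsBackground = true };
    worker.Start();
}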
