SSL-based web server on Windows IoT - security

I am working on a project which involves gathering some sensor data and building a GUI on top of it, with control of the sensors. It has the following two basic requirements.
It should be a web-based solution (although it will only be used on a LAN, or even on the same PC).
It should run on both Windows IoT Core and a standard Windows PC (Windows 7 and above).
I have decided to use Embedded webserver for Windows IoT, which seems to be a good embedded server based on a PCL targeting .NET 4.5 and UWP, so I can run it in both environments. That is great! But the problem is that this web server doesn't support SSL. I have searched for other servers and came up with Restup for UWP, which is also a good REST-based web server, but it doesn't support SSL either.
I need an expert opinion: is there any possibility of using SSL with these web servers? Could it be implemented using a library like OpenSSL? (Although I suspect that would be too complex and time-consuming to implement correctly.)
Edit
I would also like to know about ASP.NET Core on Windows 10 IoT Core, and whether I can build one application for both Windows variants. I found one example, but it is DNX-based, and I don't want to go that way since DNX is deprecated.
Any help is highly appreciated.

Late answer, but .NET Core 2.0 looks promising with Kestrel. I successfully created a .NET Core 2.0 app on the Pi 3 this morning. Pretty nifty, and if you already have an Apache web server, you're almost done. I'm actually going to embed (might not be the right term) my .NET Core 2.0 web application into a UWP app, rather than create multiple unique apps for the touchscreens around the house.
.NET Core 2.0 is still in preview, though.
https://learn.microsoft.com/en-us/aspnet/core/fundamentals/servers/kestrel?tabs=aspnetcore2x
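Since the original question was about SSL: with .NET Core 2.0, Kestrel can terminate TLS itself, so no separate web server is needed for HTTPS. A minimal sketch (the certificate file name, password, port and the Startup class are placeholders/assumptions, not from the original project):
using System.Net;
using Microsoft.AspNetCore;
using Microsoft.AspNetCore.Hosting;
public class Program
{
    public static void Main(string[] args)
    {
        WebHost.CreateDefaultBuilder(args)
            .UseKestrel(options =>
            {
                // Bind HTTPS on port 5001 using a PFX certificate;
                // "server.pfx" and its password are placeholder values.
                options.Listen(IPAddress.Any, 5001,
                    listenOptions => listenOptions.UseHttps("server.pfx", "CertPasswordHere"));
            })
            .UseStartup<Startup>()
            .Build()
            .Run();
    }
}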

I know this post is pretty old, but I have built the solution you are asking about. I'm currently running .NET 5.0 on a Raspberry Pi. When you build the .NET Core web project, select the correct target framework and set the target runtime to win-arm. Copy the output to a directory on the Pi, then access the device with PowerShell and create a scheduled task to start the web project. Something like this:
schtasks /create /tn "Startup Web" /tr c:\startup.bat /sc onstart /ru SYSTEM
That starts a bat file, which runs a PowerShell script containing the following commands:
Set-Location C:\apps\vradWebServer\
.\VradTrackerWeb.exe
(.\VradTrackerWeb.exe, the name of the web app, is on a second line in the file.)
That starts the server. If you have any websites or apps posting to the web server, you will need an SSL cert. I used no-ip and Let's Encrypt for this. For Let's Encrypt to work, you will need an externally facing web server, with the domain name pointing to it. Run Let's Encrypt on the external server, then copy the cert out and place it in your web directory on the Pi. I then have a UWP program that runs on the Pi; when it starts, it gets its local address and updates no-ip with it, so the local devices that communicate with the server are routed correctly and get the SSL cert. Side note: my UWP app is the startup app on the device. The scheduled task is important because it allows you to run both your app and the web server. The following snippet is how I get the IP address and then update no-ip.
// Required namespaces for these methods (plus System for Exception):
using System.Net;          // IPEndPoint, NetworkCredential
using System.Net.Http;     // HttpClient, HttpClientHandler
using System.Net.Sockets;  // Socket

private string GetLocalIP()
{
    string localIP = "";
    // Connecting a UDP socket to a public address sends no traffic, but it
    // makes the OS choose the local interface/address that would be used.
    using (Socket socket = new Socket(AddressFamily.InterNetwork, SocketType.Dgram, 0))
    {
        socket.Connect("8.8.8.8", 65530);
        IPEndPoint endPoint = socket.LocalEndPoint as IPEndPoint;
        localIP = endPoint.Address.ToString();
    }
    return localIP;
}//GetLocalIP

private async void UpdateIP()
{
    string localIP = "";
    string msg = "";
    var client = new HttpClient(new HttpClientHandler { Credentials = new NetworkCredential("YourUserName", "YourPassword") });
    try
    {
        localIP = GetLocalIP();
        string noipuri = "http://dynupdate.no-ip.com/nic/update?hostname=YourDomain.hopto.org&myip=" + localIP;
        using (var response = await client.GetAsync(noipuri))
        using (var content = response.Content)
        {
            msg = await content.ReadAsStringAsync();
        }
        // no-ip replies "good" or "nochg" when the update was accepted.
        if (msg.Contains("good") || msg.Contains("nochg"))
        {
            SentDynamicIP = true;    // SentDynamicIP and LastIPAddress are class-level members
            LastIPAddress = localIP;
        }
        else
        {
            SentDynamicIP = false;
        }
    }
    catch (Exception ex)
    {
        string x = ex.Message;
    }
    finally
    {
        client.Dispose();
    }
}//UpdateIP
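Nothing in the post shows how UpdateIP is actually triggered; one hypothetical way to wire it up in the UWP startup app is to call it once at launch and then refresh periodically. The DispatcherTimer and the 30-minute interval below are my assumptions, not part of the original code:
using Windows.UI.Xaml; // DispatcherTimer (UWP)

private DispatcherTimer _ipRefreshTimer;

private void StartIpUpdates()
{
    UpdateIP(); // initial registration of the current local address with no-ip

    // Hypothetical refresh loop so no-ip stays current if the local address changes.
    _ipRefreshTimer = new DispatcherTimer { Interval = TimeSpan.FromMinutes(30) };
    _ipRefreshTimer.Tick += (s, e) => UpdateIP();
    _ipRefreshTimer.Start();
}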

Related

Use NetOffice.PowerPointApi on Azure App Service

I have written code to save all the slides in a presentation as JPEGs. It works well in Visual Studio locally on my system, but when I deploy it to Azure App Service, I get a 500 internal server error.
IIS received the request; however, an internal error occurred during the processing of the request. The root cause of this error depends on which module handles the request and what was happening in the worker process when this error occurred. The likely causes listed are: IIS was not able to access the web.config file for the Web site or application (this can occur if the NTFS permissions are set incorrectly); IIS was not able to process configuration for the Web site or application; the authenticated user does not have permission to use this DLL; the request is mapped to a managed handler but the .NET Extensibility Feature is not installed.
The code:
using pptd = NetOffice.PowerPointApi;
using NetOffice.PowerPointApi.Enums;
using NetOffice.OfficeApi.Enums;
public void genThumbnails(string originalfileName, string renamedFilename, string dirPath)
{
    pptd.Application pptApplication = new pptd.Application();
    pptd.Presentation pptPresentation = pptApplication.Presentations.Open(
        dirPath + renamedFilename, MsoTriState.msoFalse, MsoTriState.msoFalse, MsoTriState.msoFalse);
    int i = 0;
    foreach (pptd.Slide pptSlide in pptPresentation.Slides)
    {
        // Export each slide as a 1280x720 JPEG next to the uploaded file.
        pptSlide.Export(dirPath + originalfileName + "_slide" + i + ".jpg", "jpg", 1280, 720);
        i++;
    }
    pptPresentation.Close();
}
What am I doing wrong? Does the NetOffice package also need MS Office installed on the server, like Office.Interop does?
The standard Windows and Linux web apps use blessed operating system images. As part of the PaaS design, customers are limited in what they can run: there is no MS Office interop present, and Azure Web Apps is a sandbox.
My suggestion would be to create a container image that has the necessary dependencies and then deploy your custom container to an Azure Web App for Containers.

Azure Speech Recognition not detecting microphone SPXERR_MIC_NOT_FOUND

I have a small sample application to test speech recognition. It works on some machines but not on others. In my dev environment, where I first installed the necessary packages, it all worked 100% with no issues. But my teammates are unable to get it working with the installation of our software that has this code in it. We have mixed environments: in some cases we use Remote Desktop with the application running on the remote machine (so with device integration via RDP), and in others we run locally without RDP. It does not detect the mic in either case. Windows detects the mic, the recorder app works, and all tests pass, so we know the mic is recognized by Windows.
However, the Speech SDK does not recognize it.
I have tried two ways. First, using FromDefaultMicrophoneInput; when that did not work, I changed it to FromMicrophoneInput instead and specified the microphone ID.
Using NAudio to enumerate the microphones, the mic is detected and listed:
var enumerator = new MMDeviceEnumerator();
string specifiedMicID = string.Empty;
foreach (var endpoint in enumerator.EnumerateAudioEndPoints(DataFlow.Capture, DeviceState.Active))
{
    if (endpoint.FriendlyName != this.MicName)
        continue;
    else
    {
        specifiedMicID = endpoint.ID;
        break;
    }
}
audioConfig = AudioConfig.FromMicrophoneInput(specifiedMicID);
But, when trying to instantiate the SpeechRecognizer with that audio config:
using (var recognizer = new SpeechRecognizer(config, audioConfig))
{
    ...
}
We get SPXERR_MIC_NOT_FOUND, even though the mic is clearly there, works everywhere else in Windows, and is detected fine by NAudio.
Any ideas what is going on here?
Thank you.
Are you creating a UWP application? If so, you'll need to retrieve the audio device IDs differently:
var devices = await DeviceInformation.FindAllAsync(DeviceClass.AudioCapture);
foreach (var device in devices)
{
    Console.WriteLine($"{device.Name}, {device.Id}\n");
}
Please refer to the documentation here for more information:
https://learn.microsoft.com/en-us/azure/cognitive-services/speech-service/how-to-select-audio-input-devices#audio-device-ids-on-uwp
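Once you have the UWP device Id, it can be passed to AudioConfig.FromMicrophoneInput in place of the NAudio endpoint ID. A rough sketch (selecting the device by the same MicName field used in the question is my assumption):
using System.Linq;
using Windows.Devices.Enumeration;
using Microsoft.CognitiveServices.Speech;
using Microsoft.CognitiveServices.Speech.Audio;

var devices = await DeviceInformation.FindAllAsync(DeviceClass.AudioCapture);
var device = devices.FirstOrDefault(d => d.Name == this.MicName);
if (device != null)
{
    // Hand the UWP device Id to the Speech SDK and recognize once.
    var audioConfig = AudioConfig.FromMicrophoneInput(device.Id);
    using (var recognizer = new SpeechRecognizer(config, audioConfig))
    {
        var result = await recognizer.RecognizeOnceAsync();
    }
}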
If you're still having issues, we'd need to get the SDK logs to debug further. Instructions on how to turn on logging can be found here:
https://learn.microsoft.com/en-us/azure/cognitive-services/speech-service/how-to-use-logging
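For reference, file logging can be switched on directly on the SpeechConfig; the log path below is just an example location:
// Write Speech SDK diagnostic logs to a file for troubleshooting.
config.SetProperty(PropertyId.Speech_LogFilename, @"C:\temp\speechsdk.log");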

SOAP NTLM login using .NET Core 2.1.300 failed on Ubuntu

I recently needed to use .NET Core to do a SOAP NTLM login. To my horror, I realized .NET Core does not come with SOAP support. Fumbling around, I came across the SOAPCore NuGet package, which provides SOAP middleware for .NET Core. My console app, targeting the .NET Core 2.1 SDK, tries to do an NTLM login. Below is the code, very simple; it's trying to log in to a Milestone VMS.
<----------codes--------------->
int MAX_BUFFERSIZE = 2 * 1024 * 1024;
string strURL = "http://192.168.51.207/ServerAPI/ServerCommandService.asmx";

BasicHttpBinding httpBinding = new BasicHttpBinding();
httpBinding.MaxBufferSize = MAX_BUFFERSIZE;
httpBinding.MaxReceivedMessageSize = MAX_BUFFERSIZE;
httpBinding.Security.Mode = BasicHttpSecurityMode.TransportCredentialOnly;
httpBinding.Security.Transport.ClientCredentialType = HttpClientCredentialType.Ntlm; // changing Ntlm to Windows also doesn't help

EndpointAddress endpoint = new EndpointAddress(strURL);
// ServerCommandServiceSoap is the proxy interface generated from the Milestone WSDL with svcutil.exe
var factory = new ChannelFactory<ServerCommandServiceSoap>(httpBinding, endpoint);

CredentialCache cc = new CredentialCache();
NetworkCredential ntcc = new NetworkCredential("user", "password", "domain");
cc.Add(strURL, 80, "ntlm", ntcc);
factory.Credentials.Windows.ClientCredential = cc.GetCredential(strURL, 80, "ntlm");

var client = factory.CreateChannel();
Guid guid = Guid.NewGuid();
LoginInfo lo = client.Login(guid, "");
Console.Write("\ntoken=" + lo.Token);
ConfigurationInfo config = client.GetConfiguration(lo.Token);
... // do something
client.Logout(guid, lo.Token); // logout
<-------------end of code segment------------>
Now, running this on Windows 10 works fine; it's able to log in and get the info needed. The funny thing is when it runs on Linux: I installed Bash for Windows 10 (Ubuntu 18.04) and installed .NET Core 2.1.300, and it gave an HTTP 401 exception: "The HTTP request is unauthorized with client authentication scheme 'Ntlm'. The authentication header received from the server was 'Negotiate, Ntlm'."
I've previously read something quite similar on Stack Overflow, and it was said that .NET Core 2.1 would resolve it. I'm already using 2.1.300. Is it still not resolved, or am I doing something wrong?
Another question I have pertaining to this: the "ServerCommandServiceSoap" interface is generated using svcutil.exe from the WSDL file provided by Milestone in their SDK. What is the difference between svcutil.exe and wsdl.exe? I noticed that the proxy classes generated are not the same using these two tools: svcutil.exe produces an interface class with dependencies on System.ServiceModel, while wsdl.exe produces no interface class and depends on System.Web.Services, which is not available in .NET Core. Why is the output different when they are run against the same WSDL document?
Can someone please enlighten me on this? thanks a lot.. :)

ASP.NET Core WebSockets on IIS 7.5

I know that WebSockets are supported only on Windows 8 and higher. But sometimes you just can't upgrade the system in a large organization, so I tried to implement WebSockets in an ASP.NET Core app.
I took the NuGet package "AspNetCore.WebSockets.Server" and ran it as a self-hosted app on Windows 7, and everything works well. But hosting on IIS 7.5 on the same machine won't allow me to upgrade the HTTP connection to a WebSocket. Even if I try to simulate the handshake, IIS simply removes my "Sec-WebSocket-Accept" header.
static async Task Acceptor(HttpContext hc, Func<Task> next)
{
    StringValues secWebSocketKey;
    if (hc.Request.Headers.TryGetValue("Sec-WebSocket-Key", out secWebSocketKey))
    {
        hc.Response.StatusCode = 101;
        hc.Response.Headers.Clear();
        hc.Response.Headers.Add("Upgrade", new StringValues("websocket"));
        hc.Response.Headers.Add("Connection", new StringValues("Upgrade"));
        // Disappears on the client
        hc.Response.Headers.Add("Sec-WebSocket-Accept", new StringValues(GetSecWebSocketAccept(secWebSocketKey[0])));
    }
    await next();
}
I am definitely sure IIS 7.5 could physically manage WebSockets if this had been implemented by the developers, and that behavior (removing the header) looks like a dirty trick from Microsoft.
I am afraid you need IIS 8
With the release of Windows Server 2012 and Windows 8, Internet Information Services (IIS) 8.0 has added support for the WebSocket Protocol.
https://www.iis.net/learn/get-started/whats-new-in-iis-8/iis-80-websocket-protocol-support
The new http.sys is the one that can turn a regular HTTP connection into a binary communication for websockets. Although you can implement your own thing, you cannot hook it into http.sys.
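For completeness: where the host does support the upgrade (the self-hosted case that already works, or IIS 8+ with the WebSocket feature enabled), the WebSockets middleware performs the handshake itself, so there is no need to craft the 101 response by hand. A minimal sketch, not the poster's code:
app.UseWebSockets();
app.Use(async (context, next) =>
{
    if (context.WebSockets.IsWebSocketRequest)
    {
        // The middleware/host completes the handshake and hands back the socket.
        using (var socket = await context.WebSockets.AcceptWebSocketAsync())
        {
            // Echo/processing loop would go here.
        }
    }
    else
    {
        await next();
    }
});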

Strange behavior of Windows Azure Compute Emulator (SDK 1.8) with multiple role instances on a clean machine with VS2012 but WITHOUT VS2010

Have you ever tried to run a hosted service in the Windows Azure emulator with full IIS and multiple role instances? Some days ago I noticed that only one of the multiple instances of a web role is started in IIS at a time. The following screenshot illustrates the behavior, and the message box in front of the screenshot shows the reason for it. The message box appears when trying to start one of the stopped websites in IIS Manager.
Screenshot: IIS with stopped Websites
The sample cloud application contains two web roles, MvcWebRole1 and WCFServiceWebRole1, each configured to use three instances. My first thought was: "Sure! No port collision will happen in the real Azure world because every role instance is its own virtual machine. It cannot work in the emulator!" But after some research and analyzing many parts of the Azure compute emulator, I found out that the compute emulator creates a unique IP for each role instance (in my example from 127.255.0.0 up to 127.255.0.5). This MSDN blog article (http://blogs.msdn.com/b/avkashchauhan/archive/2011/09/16/whats-new-in-windows-azure-sdk-1-5-each-instance-in-any-role-gets-its-own-ip-address-to-match-compute-emulator-close-the-cloud-environment.aspx) by the Microsoft employee Avkash Chauhan describes this behavior as well. After that conclusion I came to the following question: why the hell does the compute emulator (more precisely, DevFC.exe) not add the IP of the appropriate role to the binding information of each website???
I added the IP to each Website by hand and tadaaaaa: every Website can be started without any collisions. The next screenshot demonstrates it with the changed binding information highlighted.
Screenshot: IIS with started Websites
Once again: Why the hell does the emulator not do it for me? I wrote a small static helper method to do the binding extension thing for me on every role start. Maybe someone wants to use it:
public static class Emulator
{
    public static void RepairBinding(string siteNameFromServiceModel, string endpointName)
    {
        // Use a mutex to mutually exclude the manipulation of the IIS configuration.
        // Otherwise server.CommitChanges() will throw an exception!
        using (var mutex = new System.Threading.Mutex(false, "AzureTools.Emulator.RepairBinding"))
        {
            mutex.WaitOne();
            using (var server = new Microsoft.Web.Administration.ServerManager())
            {
                var siteName = string.Format("{0}_{1}", Microsoft.WindowsAzure.ServiceRuntime.RoleEnvironment.CurrentRoleInstance.Id, siteNameFromServiceModel);
                var site = server.Sites[siteName];

                // Add the IP of the role to the binding information of the website
                foreach (Binding binding in site.Bindings)
                {
                    // e.g. "*:82:"
                    if (binding.BindingInformation[0] == '*')
                    {
                        var instanceEndpoint = RoleEnvironment.CurrentRoleInstance.InstanceEndpoints[endpointName];
                        string bindingInformation = instanceEndpoint.IPEndpoint.Address.ToString() + binding.BindingInformation.Substring(1);
                        binding.BindingInformation = bindingInformation;
                        server.CommitChanges();
                    }
                    else
                    {
                        throw new InvalidOperationException();
                    }
                }
            }

            // Start all websites of the role if all bindings of all websites of the role are prepared.
            using (var server = new Microsoft.Web.Administration.ServerManager())
            {
                var sitesOfRole = server.Sites.Where(site => site.Name.Contains(RoleEnvironment.CurrentRoleInstance.Role.Name));
                if (sitesOfRole.All(site => site.Bindings.All(binding => binding.BindingInformation[0] != '*')))
                {
                    foreach (Site site in sitesOfRole)
                    {
                        if (site.State == ObjectState.Stopped)
                        {
                            site.Start();
                        }
                    }
                }
            }

            mutex.ReleaseMutex();
        }
    }
}
I call the helper method as follows
public class WebRole : RoleEntryPoint
{
    public override bool OnStart()
    {
        if (RoleEnvironment.IsEmulated)
        {
            AzureTools.Emulator.RepairBinding("Web", "ServiceEndpoint");
        }
        return base.OnStart();
    }
}
I got it!
I have this behavior on three different machines, all recently formatted and set up with clean installations of Windows 8, Visual Studio 2012, Azure SDK 1.8 and the Azure Tools. So a reinstallation of the Azure SDK and Tools (as Anton suggests) should not change anything. But the cleanliness of my three machines is the crucial point! Anton, do you have Visual Studio 2010 with at least SP1 installed on your machine? I analyzed IISConfigurator.exe with ILSpy and found the code that sets the IP in the binding information of the websites to '*' (instead of 127.255.0.*). It depends on the static property Microsoft.WindowsAzure.Common.Workarounds.BindToAllIpsWorkaroundEnabled. This property internally uses Microsoft.WindowsAzure.Common.Workarounds.TryGetVS2010SPVersion and leads to setting the IP binding to '*' if the SP level of Visual Studio 2010 is smaller than 1. TryGetVS2010SPVersion checks four registry keys, and I don't know why, but one of the keys exists in my registry and returns a Visual Studio 2010 SP level of 0 (I never installed VS2010 on any of the three machines!!!). After I changed the value of HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\DevDiv\vs\Servicing\10.0\SP from 0 to 10 (anything greater than 0 should do it), the Azure emulator started setting the 127.255.0.* IPs of the roles in the binding information of all the websites in IIS, and all websites start correctly.
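If you want to see the value the emulator sees before editing the registry by hand, it can be read with a couple of lines; a sketch using the key path quoted above:
using Microsoft.Win32;

// Read the SP value that TryGetVS2010SPVersion inspects; null means the value is absent.
var sp = Registry.GetValue(
    @"HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\DevDiv\vs\Servicing\10.0",
    "SP",
    null);
Console.WriteLine("VS2010 SP level: " + (sp ?? "not present"));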
