Get a website's ID in IIS 6 using the command line

I need to get a website's ID by its name in IIS 6 on Windows Server 2003. I did it using C#, but later discovered that the server doesn't have the .NET Framework installed. All I know is the name of the website.
string directoryEntry = "IIS://localhost/w3svc";
var w3Svc = new DirectoryEntry(directoryEntry);
foreach (DirectoryEntry de in w3Svc.Children)
{
    // Each IIsWebServer child represents a site; its Name is the numeric site ID.
    if (de.SchemaClassName == "IIsWebServer" && de.Properties["ServerComment"][0].ToString() == "MyWebsiteName")
    {
        Environment.ExitCode = Convert.ToInt32(de.Name);
        return;
    }
}

// Caller side: launch the helper and read the site ID back from its exit code.
var helper = Process.Start("MyApp.exe");
helper.WaitForExit();
int id = helper.ExitCode;
I have to run a batch file to start and stop the website on a remote machine. I am able to run the batch file to start and stop the website using PsExec, but I don't know how to get the website's ID dynamically. Can anybody please help me with this?
Please let me know if I am not clear enough or if I need to provide more info on this.

It looks like IisApp.vbs is the recommended option. Check out the documentation for it. Sorry if this isn't what you needed.

SSL based webserver on Windows IoT

I am working on a project that involves gathering some sensor data, building a GUI on top of it, and controlling the sensors. It has the following two basic requirements:
It should be a web-based solution (although it will only be used on a LAN, or even on the same PC).
It should run on both Windows IoT Core and a standard Windows PC (Windows 7 and above).
I have decided to use Embedded webserver for Windows IoT, which seems to be a good embedded server based on a PCL targeting .NET 4.5 and UWP, so I can run it in both environments. That is great! The problem is that this web server doesn't support SSL. I have searched for other servers and found Restup for UWP, which is also a good REST-based web server, but it doesn't support SSL either.
I need an expert opinion on whether there is any possibility of using SSL with these web servers. Could it be implemented using a library such as OpenSSL? (Although I suspect that would be too complex and time-consuming to implement correctly.)
Edit
I would also like to know about ASP.NET Core on Windows 10 IoT Core, and whether I can build one application for both versions of Windows. I found one example, but it is DNX-based, and I don't want to go that way, as DNX is deprecated.
Any help is highly appreciated.
Late answer, but .NET Core 2.0 looks promising with Kestrel. I successfully created a .NET Core 2.0 app on the Pi 3 this morning. Pretty nifty, and if you already have an Apache web server, you're almost done. I'm actually going to embed (might not be the right term) my .NET Core 2.0 web application into a UWP app, rather than create multiple unique apps for the touchscreens around the house.
.NET Core 2.0 is still in preview, though.
https://learn.microsoft.com/en-us/aspnet/core/fundamentals/servers/kestrel?tabs=aspnetcore2x
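To show what the SSL part looks like with Kestrel, here is a minimal ASP.NET Core 2.x sketch; the port, certificate file name, and password are placeholder assumptions, not taken from the answer above:
using System.Net;
using Microsoft.AspNetCore;
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Hosting;
using Microsoft.AspNetCore.Http;

public class Program
{
    public static void Main(string[] args)
    {
        WebHost.CreateDefaultBuilder(args)
            .UseKestrel(options =>
            {
                // Bind to port 5001 and serve HTTPS from a PFX certificate.
                // The file name and password here are placeholders.
                options.Listen(IPAddress.Any, 5001, listenOptions =>
                {
                    listenOptions.UseHttps("cert.pfx", "certPassword");
                });
            })
            .Configure(app =>
            {
                // Tiny pipeline just to prove the HTTPS endpoint works.
                app.Run(context => context.Response.WriteAsync("Hello over HTTPS"));
            })
            .Build()
            .Run();
    }
}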
I know this post is pretty old, but I have built the solution you are asking about. I'm currently running .NET 5.0 on a Raspberry Pi. When you build the .NET Core web project, select the correct target framework and set the target runtime to win-arm. Copy the output to some directory on the Pi, and you will have to access the device using PowerShell to create a scheduled task to start the web project. Something like this:
schtasks /create /tn "Startup Web" /tr c:\startup.bat /sc onstart /ru SYSTEM
That starts a bat file, which runs a PowerShell script containing the following two lines (VradTrackerWeb.exe is the name of the web app):
Set-Location C:\apps\vradWebServer\
.\VradTrackerWeb.exe
That starts the server. If you have any web pages or apps posting to the web server, you will need an SSL cert. I used No-IP and Let's Encrypt for this. For Let's Encrypt to work, you will need an external-facing web server and have the domain name point to it. Run Let's Encrypt on the external server, then copy out the cert and place it in your web directory on the Pi. I then have a UWP program that runs on the Pi; when it starts, it gets its local address and updates No-IP with it, so the local devices communicating with it are routed correctly and covered by the SSL cert. Side note: my UWP app is the startup app on the device. The scheduled task is important because it allows you to run both your app and the web server. The following snippet is how I get the IP address and then update No-IP.
private string GetLocalIP()
{
    string localIP = "";
    using (Socket socket = new Socket(AddressFamily.InterNetwork, SocketType.Dgram, 0))
    {
        // "Connecting" a UDP socket sends nothing; it just selects the local
        // interface/address that would be used to reach the target.
        socket.Connect("8.8.8.8", 65530);
        IPEndPoint endPoint = socket.LocalEndPoint as IPEndPoint;
        localIP = endPoint.Address.ToString();
    }
    return localIP;
}//GetLocalIP

private async void UpdateIP()
{
    string localIP = "";
    string msg = "";
    var client = new HttpClient(new HttpClientHandler { Credentials = new NetworkCredential("YourUserName", "YourPassword") });
    try
    {
        localIP = GetLocalIP();
        // No-IP dynamic DNS update endpoint; it answers "good" or "nochg" on success.
        string noipuri = "http://dynupdate.no-ip.com/nic/update?hostname=YourDomain.hopto.org&myip=" + localIP;
        using (var response = await client.GetAsync(noipuri))
        using (var content = response.Content)
        {
            msg = await content.ReadAsStringAsync();
        }
        if (msg.Contains("good") || msg.Contains("nochg"))
        {
            SentDynamicIP = true;
            LastIPAddress = localIP;
        }
        else
        {
            SentDynamicIP = false;
        }
    }
    catch (Exception ex)
    {
        string x = ex.Message;
    }
    finally
    {
        client.Dispose();
    }
}//UpdateIP

Connect local USB printer to work on azure website

I am a student, working on a project as part of my practical training.
I have a database on Azure, and my website, which is connected to that database, is also hosted on Azure.
Now I want to print a POS label from the website. The label I want to print is built in code in Visual Studio.
But when I try to print from the website, it says: "Settings to access printer are not valid."
How can I fix that? I have read that some people create a virtual machine on Azure and work from there.
I can't use JavaScript's window.print() because I don't want to print the web page itself, only from code.
var ps = new PrinterSettings();
ps.PrinterName = "Brother QL-500";
// PaperSize dimensions are specified in hundredths of an inch.
var size = new PaperSize("My Size", 2, 1);
ps.DefaultPageSettings.PaperSize = size;
recordDoc.PrinterSettings = ps;
recordDoc.Print();

Access denied upon doing a GetDirectories() but Dir in Powershell works

I have a problem I hope someone might help me with.
I've created a custom action page where, among other things, I scan a directory on a remote server for a set of directories, and inside those directories I search for a set of files.
However, when I execute the code on the production server I get an access denied exception.
If I use the same code on my test server (accessing the same remote server) it works just fine.
If I use PowerShell or Explorer on the production server I can access the remote directory and files with no problems.
I am using the same account in all scenarios: if I print out Page.User.Identity.Name and SPContext.Current.Web.CurrentUser.LoginName, they are the same and equal to the account I use on the test server, and also the one I am logged on with on the production server when accessing the remote server from the command line or Explorer.
The code looks like this:
string user = SPContext.Current.Web.CurrentUser.LoginName.Remove(0, 7);
string user_path = "\\\\srv\\share1\\subdir\\dir\\" + user;
// The line below will raise an exception on the production server.
foreach (string board_path in Directory.GetDirectories(user_path, "Board*"))
{
    foreach (string board_file in Directory.GetFiles(board_path, "Board*.xml"))
    {
        // ...
    }
}
I can't figure out why the code runs on the test server but not on the production machine. I am using SharePoint 2010 Standard.
Thanks in advance for any kind of help I can get.
/Fredrik
The problem was solved by using SPSecurity.RunWithElevatedPrivileges()!
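For reference, a minimal sketch of that fix, reusing the paths from the question (the work inside the inner loop is still elided):
// Requires Microsoft.SharePoint (SPSecurity, SPContext) and System.IO (Directory).
SPSecurity.RunWithElevatedPrivileges(delegate()
{
    // The delegate runs under the application pool / system account
    // instead of the impersonated request user.
    string user = SPContext.Current.Web.CurrentUser.LoginName.Remove(0, 7);
    string user_path = "\\\\srv\\share1\\subdir\\dir\\" + user;
    foreach (string board_path in Directory.GetDirectories(user_path, "Board*"))
    {
        foreach (string board_file in Directory.GetFiles(board_path, "Board*.xml"))
        {
            // ...
        }
    }
});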
/Fredrik

How do I obtain Central Administration site url programmatically?

I have some code (a console app) running on a SharePoint farm machine, and I need the app to figure out the URL of the Central Administration site for that farm. I remember seeing a SharePoint API that does exactly that, but I can't find it now.
I've seen a bunch of hacks people use for this, like looking it up in the Windows registry, but I need a way via the SharePoint API.
In C#:
Microsoft.SharePoint.Administration.SPAdministrationWebApplication centralWeb =
SPAdministrationWebApplication.Local;
To expand on the answer from @RDeVaney:
Microsoft.SharePoint.Administration.SPAdministrationWebApplication centralWeb =
SPAdministrationWebApplication.Local;
string centralAdminUrl = centralWeb.Sites[0].Url;
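Put together as a minimal console-app sketch (assuming a reference to Microsoft.SharePoint.dll and that it runs on a farm server), this could look like:
using System;
using Microsoft.SharePoint.Administration;

class Program
{
    static void Main()
    {
        // Central Administration is exposed as the farm-local administration web application.
        SPAdministrationWebApplication centralWeb = SPAdministrationWebApplication.Local;
        Console.WriteLine(centralWeb.Sites[0].Url);
    }
}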
Here is the code from MSDN; please check whether it answers your question:
SPWebServiceCollection webServices = new SPWebServiceCollection(SPFarm.Local);
foreach (SPWebService webService in webServices)
{
    foreach (SPWebApplication webApp in webService.WebApplications)
    {
        if (webApp.IsAdministrationWebApplication)
        {
            // This is the Central Administration web application; get its URL here.
            string centralAdminUrl = webApp.Sites[0].Url;
        }
    }
}

Do I need to replace localhost in IIS://localhost/MimeMap when reading the MimeMap?

I'm reading out the MIME types from IIS's MimeMap using the following code:
_mimeTypes = new Dictionary<string, string>();
// Load from the IIS store.
DirectoryEntry Path = new DirectoryEntry("IIS://localhost/MimeMap");
PropertyValueCollection PropValues = Path.Properties["MimeMap"];
IISOle.MimeMap MimeTypeObj;
foreach (var item in PropValues)
{
    // IISOle -> add a reference to the "Active DS IIS Namespace Provider".
    MimeTypeObj = (IISOle.MimeMap)item;
    _mimeTypes.Add(MimeTypeObj.Extension, MimeTypeObj.MimeType);
}
Do I need to replace the localhost part when I deploy it to my live server? If not, why not, and what are the implications of not doing so?
Cheers
It should not be an issue to leave the host as 'localhost'.
After all, you want to get the MimeMap of the machine your app is running on, correct?
A possible complication that I can foresee is if you are using a third party as a host: they can do anything they want with host headers, and it may be that localhost is not available for whatever reason.
But you should simply give it a shot and adjust if necessary.
If you leave it as 'localhost', you will have to run this code directly on the server.
If you change it to use the machine name instead, you could also run it remotely.
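A rough sketch of that second suggestion, assuming the account running the code has access to the IIS metabase on the target machine:
// Instead of hard-coding "localhost", take the host from the environment or configuration.
// Environment.MachineName keeps the behaviour identical when running on the server itself.
string host = Environment.MachineName; // or the name of a remote IIS server
DirectoryEntry mimeMapEntry = new DirectoryEntry(string.Format("IIS://{0}/MimeMap", host));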
