Executable to launch an Azure Virtual Machine

I need to create tools so that non-technical users can use (meaning connect to and start/stop) a virtual machine on Azure. For the connection, RDP does a good enough job and is easy to pick up. To start/stop a virtual machine, on the other hand, you normally need access to the Azure portal, which (on top of not being straightforward for a non-technical user) raises some access-policy problems. One option would be to leave the virtual machine on permanently, but then we are billed for 100% of the time even though the user only needs it for a couple of hours a week.
That's why I investigated the possibility of creating a script, packaged as an executable, that would start the virtual machine automatically when the user simply double-clicks it. I have already seen this Stack Overflow question:
Start azure virtual machine without azure portal
which suggests creating an Azure PowerShell script that starts the virtual machine. The only problem is that launching a PowerShell script is beyond the technical level of the people who would use it. On top of that, the Azure module for PowerShell needs to be installed (if I understand correctly), which may not be possible depending on the machine and the rights the user has on it.
So my question: do you have any idea how I could make a simple program (for example, an executable that would run on any machine without any dependencies) that starts an Azure virtual machine?
One solution I thought about, though it seemed very complicated: create a "super low cost" virtual machine that is on 100% of the time, and an executable that instructs this VM to start the other virtual machine on demand.
Thanks for your help

I have a problem with the idea that a PowerShell script is outside the scope of a user who can run an exe file. If built properly, a .ps1 should just be a double-click, exactly like an exe.
Aside from that, you have a couple hurdles to look at.
Your user doesn't have access to the resources they need to interact with.
This can be done by building custom PSCredential objects in the script and pulling the credentials from a file. You would build the credential file with ConvertFrom-SecureString and then import it with ConvertTo-SecureString. The biggest problem with this is that if the user can see where that file is stored, they could potentially write a script to access that file and gain privileged access.
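For illustration, a minimal sketch of that pattern (the file path and user name are placeholders):

    # One-time setup, run as the account that will later run the script:
    # encrypt the password and store it in a file
    Read-Host "Password" -AsSecureString | ConvertFrom-SecureString | Out-File "C:\Tools\cred.txt"

    # Inside the distributed script: rebuild a PSCredential from the file
    $secure = Get-Content "C:\Tools\cred.txt" | ConvertTo-SecureString
    $cred = New-Object System.Management.Automation.PSCredential("azureUser@contoso.com", $secure)

Note that without an explicit -Key, ConvertFrom-SecureString encrypts with DPAPI, so the file can only be decrypted by the same user account on the same machine that created it.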
Your user doesn't have permission to run the PowerShell resources needed to execute the script. For this, you'd need to build runas permissions into the script, and I think creating an exe might be the best avenue for that, although you could also have the initial script call another shell with elevated permissions and work through that.
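That elevation step could look something like this sketch (relaunching the current script with a UAC prompt):

    # If not already elevated, relaunch this script in an elevated PowerShell
    $identity = [Security.Principal.WindowsIdentity]::GetCurrent()
    $principal = New-Object Security.Principal.WindowsPrincipal($identity)
    if (-not $principal.IsInRole([Security.Principal.WindowsBuiltInRole]::Administrator)) {
        Start-Process powershell.exe -Verb RunAs -ArgumentList "-ExecutionPolicy Bypass -File `"$PSCommandPath`""
        exit
    }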
There are tools out there, like PowerGUI, that will compile a .ps1 file into an .exe. A properly compiled and secured exe would hide the scripts that call out to secure-string files and also allow custom runas permissions to be built into the program.
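Putting the pieces together, the script you would compile might look like this minimal sketch, assuming the classic Azure PowerShell module and placeholder service/VM names:

    # Start-MyVm.ps1 -- sketch only; requires the Azure PowerShell module
    Import-Module Azure

    # Log in with the stored credential (organizational accounts only)
    $secure = Get-Content "C:\Tools\cred.txt" | ConvertTo-SecureString
    $cred = New-Object System.Management.Automation.PSCredential("azureUser@contoso.com", $secure)
    Add-AzureAccount -Credential $cred

    # Start the VM only if it is not already running
    $vm = Get-AzureVM -ServiceName "MyCloudService" -Name "MyVM"
    if ($vm.InstanceStatus -ne "ReadyRole") {
        Start-AzureVM -ServiceName "MyCloudService" -Name "MyVM"
    }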

Related

Allowing custom scp/sftp/rsync in sandbox shell

I am currently building a sandbox environment using Docker, where upon login all users are directed straight into a Docker container with their home directory linked as a data volume. This is achieved through a custom user shell instead of /bin/bash, csh, etc.
However, scp/rsync/sftp fail with this custom shell, unfortunately. My current solution is to make a separate no-login account with the same home directory as the sandbox user and only allow scp/rsync/sftp through rssh on this account, where users can then upload their data.
I'm just wondering if I can streamline this process somehow and use only one account to redirect users into Docker containers directly AND allow sftp/scp processes as well?
Thanks!
Edit:
Upon more poking around, I have discovered that the newer sshd internal-sftp subsystem does not invoke the user's login shell, and I can enable sftp this way. However, I don't think rsync and scp work this way, unfortunately.
The ultimate goal is basically to sandbox users directly into a container upon login and to allow sftp/scp/rsync uploads, ideally using the SAME credentials instead of setting up separate ones for them.
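For reference, switching to the in-process SFTP server is typically a one-line change (sketch; the file location varies by distribution):

    # /etc/ssh/sshd_config -- serve SFTP in-process, without invoking the login shell
    Subsystem sftp internal-sftp

scp and rsync, by contrast, execute a command line through the remote user's login shell, which is why they still break: the custom shell would have to recognize those command lines and pass them through.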

Azure Worker Role Calling 3rd Party Command Line Component

For my system, I have a back-end process that uses a 3rd party command line tool to do some occasional processing. This tool writes to and reads from the file system (I point it at some files, it works its magic, and then it writes out the results to another file).
This is obviously easy to do with an Azure Virtual Machine. Just write a Windows Service to manage this command line tool and have it read from a Queue to get the processing jobs.
To this point, however, I've been able to do everything in Azure without having to resort to a full blown VM. I like that. I like not having to worry about applying patches and other maintenance, downtime and the like.
So, my question is, is there something in Azure that would let me have this service without resorting to a VM? Would a "Worker Role" be able to accomplish this? Can it read and write to/from the file system? Can it handle 3rd party tools with a bunch of arbitrary dependencies? Can I launch another process from C# code within the worker role?
Would a "Worker Role" be able to accomplish this?
Absolutely! Remember that a Worker Role is also a full-blown VM (running the same OS that powers an Azure Virtual Machine).
Can it read and write to/from the file system?
Yes. However, there's a catch: you can't read/write to any arbitrary location on the VM. You do have full access to a special folder on that VM called Local Storage. You can read more about it here: http://msdn.microsoft.com/en-us/library/azure/ee758708.aspx
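For context, Local Storage is declared in the service definition; a minimal sketch with placeholder names:

    <!-- ServiceDefinition.csdef -->
    <LocalResources>
      <LocalStorage name="ToolScratch" sizeInMB="1024" cleanOnRoleRecycle="false" />
    </LocalResources>

Your code then asks the role environment for the folder's path at runtime (RoleEnvironment.GetLocalResource("ToolScratch").RootPath).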
Can it handle 3rd party tools with a bunch of arbitrary dependencies?
Yes again! But again, there's a catch. Since these are stateless VMs, anything you install after Microsoft stands the VM up for you is not guaranteed to still be there if Microsoft decides to tear that VM down for whatever reason. If you need to install any additional software, you have to install it via a process called Startup Tasks. You can read about them here: http://msdn.microsoft.com/en-us/library/azure/hh180155.aspx.
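A startup task is declared in ServiceDefinition.csdef and points at a script packaged with the role; a hedged sketch (file names are placeholders):

    <!-- ServiceDefinition.csdef -->
    <Startup>
      <Task commandLine="startup\install.cmd" executionContext="elevated" taskType="simple" />
    </Startup>

Here install.cmd would run your tool's unattended installer; because the task runs again whenever the instance is re-provisioned, the software gets reinstalled automatically.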
Can I launch another process from C# code within the worker role?
Though I have not tried it personally, I think it is possible (e.g. via System.Diagnostics.Process.Start), because you get a VM running the latest version of Windows Server.

When running SharePoint solution deployment through a remote shell, solution deployment hangs

I have a remote shell (using PowerShell) running a solution deployment on SharePoint 2010. It hangs on the deployment and never finishes. When I do this locally on the box, there are no issues deploying.
I originally thought this could have to do with paging due to the remote shell memory limit (which is 512 MB by default). I increased that to 2 GB:
Set-Item WSMan:\localhost\Shell\MaxMemoryPerShellMB 2048
This hasn't made any difference. I also ensured that the SharePoint Administration and Timer services were started before attempting to deploy. The credentials I used to open the remote shell are also for the farm account.
What would cause a solution deployment to hang only when run through a remote shell?
I attempted the same thing recently and found that remote shells have a limited amount of access to the remote environment, depending on the credential provider used. Security policies inhibit some I/O operations with the file system, etc. I was able to work around this by leveraging a package-management solution such as Chocolatey.org, which can kick off PowerShell commands in another process on the remote box. From what I understand of the issue, though, it's a double-hop credential problem that can be solved by using CredSSP when establishing the remote connection.
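A sketch of the CredSSP approach (machine and account names are placeholders; CredSSP must be enabled on both ends):

    # On the client machine
    Enable-WSManCredSSP -Role Client -DelegateComputer "sharepoint01" -Force

    # On the SharePoint server
    Enable-WSManCredSSP -Role Server -Force

    # Open the remote shell so the credential can be delegated onwards
    $cred = Get-Credential "CONTOSO\spfarm"
    Enter-PSSession -ComputerName "sharepoint01" -Authentication CredSSP -Credential $cred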

Running an existing program on the Windows Azure cloud platform

I have an existing program that I would like to upload to the cloud without rewriting it, and I'm wondering if that is possible.
For example, can I upload and run a Photoshop instance in the cloud and use it?
Not the GUI, of course, but Photoshop has a communication SDK, so a web program should be able to control it!
As far as I can see, worker roles look good, but they have to be written in a specific way, and I can't rewrite Photoshop!
Thanks for your attention!
As long as your existing program is 64-bit compatible and either has an installer that supports unattended/silent installation or is xcopy-deployable, you can use it in Azure.
For a program that requires installation and supports unattended/silent install, you can use a Startup Task.
For a program that is just xcopy-deployable, put it in a folder of your worker role and make sure the "Copy to Output Directory" attribute of all required files is set to "Copy always". Then you can use it.
However, the bigger question is what you are going to do with that "existing program" in Azure if you do not have APIs to work with.
Here's the thing: the worker role should be what you need. It's essentially a virtual machine running a slightly different version of Windows that you can RDP to and use normally. You can safely run more or less anything up there, but you need to automate the deployment (e.g. using startup tasks). As this can prove a bit problematic, Microsoft has created the Virtual Machine Role: you create your own deployment, and that's what gets booted when you instantiate the machine.
However! This machine is stateless, meaning that files it creates aren't saved if it gets restarted. So you need to ensure the files are saved somewhere else, e.g. in blob storage (intended for just such a purpose).
What I would do in your case is create a Virtual Machine Role with Photoshop installed and a custom piece of software next to it that accepts requests via Azure queues, does the processing, saves the file to blob storage, and then sends the file onwards to whoever requested it.
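The "save to blob storage" step might look like this sketch, assuming the classic Azure storage cmdlets and placeholder names:

    # Push a finished file from the role's disk into blob storage
    $ctx = New-AzureStorageContext -StorageAccountName "myaccount" -StorageAccountKey $key
    Set-AzureStorageBlobContent -File "C:\work\result.psd" -Container "results" -Context $ctx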

Run CGI script as a different user

I have a tool written in Perl which is used by different users in my company. Each user has their own disk space allocated to them, and they run the tool in that disk space. This works fine without any issues. As a next step, I wanted to make the tool available through the web, and created a web application through which users can run it. The issue I have is that the tool always runs as a single user. I know the user name through authentication; is there a way I can run the tool as the user who is running the web application?
Yes, suexec.
Also see questions tagged suexec.
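With suexec compiled in and enabled, the user/group mapping is a per-virtual-host Apache directive; a minimal sketch with placeholder names:

    # Inside the relevant <VirtualHost> block:
    # CGI scripts for this vhost run as user "alice", group "engineers"
    SuexecUserGroup alice engineers

Note that suexec maps per virtual host (or per ~user directory), not per authenticated web user, so running the tool as the logged-in user generally means one vhost or userdir per user.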
