I am trying to open a browser from Cloud Shell. I have Firefox installed in Cloud Shell but can't launch it.
I'm getting this error:
XPCOMGlueLoad error for file ...../firefox/libxul.so:
libgtk-x11-2.0.so.0: cannot open shared object file: No such file or directory
Couldn't load XPCOM.
I found a couple of solutions on Google, but they didn't work, so I thought I'd check whether launching a browser from Azure Cloud Shell is allowed at all.
It's probably not allowed, since Cloud Shell offers a browser-accessible, pre-configured shell experience for managing Azure resources without the overhead of installing, versioning, and maintaining a machine yourself.
You can find more details under Features & tools for Azure Cloud Shell.
No, this is not possible, because it's a GUI-less container: there is no video output in the container, so you can only use the tty.
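If you want your own scripts to fail gracefully instead of dying on a missing libgtk, a minimal Python sketch is to test for an X display before launching anything graphical (the checks below are my own assumption, not something Cloud Shell documents):

import os
import shutil
import subprocess

# Cloud Shell sessions are tty-only, with no X server, so GUI applications
# such as Firefox cannot start there. Checking for DISPLAY first makes the
# failure mode obvious instead of surfacing as an XPCOM/libgtk error.
if not os.environ.get("DISPLAY"):
    print("No X display available; this is a tty-only session, GUI apps will not start.")
elif shutil.which("firefox") is None:
    print("firefox is not on the PATH.")
else:
    subprocess.Popen(["firefox"])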
I am trying to mirror on Windows a process that I use on Linux for creating an automated build of an application web server. The only way to have full automation is to have some sort of expect-style utility to handle the interactions on the CLI.
We use a script to set up all the configuration and settings for our application web servers; it is easily portable to batch, since the web server is Windows-compatible as well, with the same commands.
I've basically got it down for Linux; the flip side is that Windows doesn't seem to have a clear expect alternative.
Any suggestions for how to send a response when prompted and continue the rest of the scripted process would be most helpful.
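If your installer reads its answers from standard input, a plain stdin pipe may already be enough on Windows; here is a minimal, hedged Python sketch (the command, its flag, and the answers are placeholders for your own tool). Tools that talk to the console directly rather than reading stdin need a real expect-style port such as wexpect, or an AutoIt/PowerShell approach.

import subprocess

# Hedged sketch: only works when the CLI reads its prompt answers from stdin.
# "setup.exe", "/configure", and the answers below are placeholders.
answers = "\n".join([
    "yes",                 # e.g. accept license
    r"C:\inetpub\myapp",   # e.g. install path
    "admin",               # e.g. service account
]) + "\n"

result = subprocess.run(
    ["setup.exe", "/configure"],
    input=answers,
    capture_output=True,
    text=True,
)
print(result.stdout)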
Either this isn't possible, or it's so simple that I'm missing the trick, or I'm going about it the wrong way. It's similar to this question.
I prefer working with VS Code, and basically I want to treat the home path in the cloud CLI as a local folder exposed to VS Code.
I have installed the following VS Code extensions:
Azure Account
Azure Storage
Azure CLI Tools
If I connect to cloud shell via VS Code (F1 > Azure:Open Bash in Cloud Shell) (as explained here) or through the Portal, I have a home directory /home/john, where I can put files. It is this area I want to connect to from my PC (via VS Code).
My first thought was that this area would be exposed in Azure Storage Explorer; however, the only thing in my Cloud Shell storage account is File Shares: azclishare > .cloudconsole > acc_john.img. There is no sign of any of the files from /home/john, so I'm guessing they're wrapped up in acc_john.img.
I also thought about using SCP, but I can't find any reference to this either, and I can't find any "connection strings" in the portal.
If anyone has any ideas, I'd be grateful if you could share...
P.S. I am using Windows 10.
It's always the same: post a question on SO, then find the answer!
The full answer is here: https://learn.microsoft.com/en-us/azure/cloud-shell/persisting-shell-storage
The short answer is that Cloud Shell does map to the storage account (Azure Files), but at /usr/john/clouddrive.
In fact, there is a symlink to clouddrive in /home/john.
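If you also want to read those files from your local machine, one hedged option is the Azure Files SDK for Python; this is an assumption on my part rather than anything the linked article prescribes. The share name azclishare comes from the question above, and the connection string is a placeholder you would copy from the storage account's Access keys blade.

from azure.storage.fileshare import ShareClient

# Hedged sketch using the azure-storage-file-share package
# (pip install azure-storage-file-share). The connection string is a placeholder.
CONNECTION_STRING = "<cloud-shell-storage-account-connection-string>"

share = ShareClient.from_connection_string(CONNECTION_STRING, share_name="azclishare")

# Anything you copy into ~/clouddrive in Cloud Shell should show up here.
for item in share.list_directories_and_files():
    print(item["name"])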
I'm developing a small script in Python 3.6, and I'll most likely run it from Google Cloud Shell. In this script, I want to make some API calls and then open the web browser with a URL that results from those calls. The following code works in other environments I have tested, but not in Cloud Shell:
import webbrowser as wb
# different calls and process here, not relevant to the issue
wb.open('URL_HERE')
# This just echoes the URL.
Is there any way to make Python 'tell' Cloud Shell to use the browser it is being accessed from? That is, if I'm using Chrome to open Cloud Shell, is there any way to open the link in Chrome? It doesn't matter whether it's done with webbrowser or another library.
Cloud Shell is just a “window” displaying a command line from a remote, temporary Compute Engine virtual machine instance. This means that when you run the script, you are actually running it on the remote VM (not in Chrome), and that VM does not have a browser of its own.
For example, when you try to run an app in Cloud Shell (here you can find a quick example using "mvn appengine:run"), once the application is running, you will see a message in Cloud Shell, something like:
[INFO] GCLOUD: INFO: Module instance default is running at http://localhost:8080/
If you click on http://localhost:8080/ , you will actually be redirected to the temporary address assigned for the Cloud Shell VM instance (something like 8080-dot-VM-ID-devshell.appspot.com).
In summary, you can't tell Cloud Shell to open your browser to a specific URL shown on the remote VM's command line. Also, keep in mind that there are limitations on outgoing connections and that Cloud Shell is intended for interactive use only.
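A hedged workaround along those lines: since a URL printed in the Cloud Shell terminal is clickable, the script can fall back to printing the link whenever no graphical browser is available. The DISPLAY check below is my own assumption about how to detect that situation, not part of the answer above.

import os
import webbrowser as wb

url = "URL_HERE"  # placeholder from the question's snippet

if os.environ.get("DISPLAY"):
    # Running on a desktop where a graphical browser is available.
    wb.open(url)
else:
    # Cloud Shell: the remote VM has no browser, so print the link and
    # let the user click it in the terminal instead.
    print("Open this link in your browser: " + url)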
I wasn't sure whether to ask this in an Inkscape-specific forum or here under Azure. I tagged both.
My goal is to run a Windows build of Inkscape, preferably in a cloud function or otherwise in an App Service, to open different vector files and send them back to the user as plain SVG.
I've downloaded the binary archive (https://inkscape.org/en/release/0.92.2/windows/32-bit/) and extracted it in Kudu on both a paid App Service and a Function App.
When I run inkview.com it seems to work: it outputs info to cmd.
But when I run inkscape.com it just stays open for a couple of seconds and quits (it just outputs a blank line and exits). I've tried -V and -? and many other options (also the -Z without-GUI switch).
Does anybody have an idea of what's going on here? Is Azure perhaps missing some dependencies that Inkscape needs to run? Any ideas on how to troubleshoot?
Thanks in advance.
Azure Functions, like Web Apps and Mobile Apps, run in an App Service. The App Service runs in a secure environment called a sandbox, which imposes certain limitations. Among them is the use of GDI+.
Inkscape being a graphics program, I can only imagine that it makes use of GDI+, so it would be blocked.
You can see the list of limitations here: https://github.com/projectkudu/kudu/wiki/Azure-Web-App-sandbox#unsupported-frameworks
In order to run Inkscape in Azure, you need to host it in something other than App Service, such as a VM, Cloud Service, Service Fabric, containers, etc.
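For what it's worth, once Inkscape is hosted somewhere it can actually run (a VM or a container), the conversion itself can be scripted against the 0.92 command line; a hedged sketch with placeholder paths:

import subprocess

# Hedged sketch: drive the Inkscape 0.92.x CLI without the GUI (-z) to
# convert an uploaded vector file to plain SVG. All paths are placeholders.
subprocess.run(
    [
        r"C:\inkscape\inkscape.com",              # extracted binary
        "-z",                                     # --without-gui
        r"C:\work\input.eps",                     # uploaded vector file
        r"--export-plain-svg=C:\work\output.svg",
    ],
    check=True,
)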
I now have my Windows Azure environment set up so that I can access my Worker Role with Remote Desktop. However, I'm not sure how to proceed at the moment. After much digging I found a web site that was offline, but in Google's cache there was mention of attaching the Visual Studio debugger to the Worker Role running in the Azure Cloud. I only have Visual Developer (not Studio) 2010, though, and I have searched all over; as far as I can see there is no option to attach to a remote server. I am able to publish my project to the Azure Cloud without error, and I have a "healthy" instance of my Worker Role showing as active and running.
I did connect with RDP through the Azure Management portal. The login worked fine and up came the remote desktop window. I searched through much of what I could find and was unable to find my Worker Role. I must have the wrong impression of RDP, because I had hoped to see the Worker Role's main display form when I logged in, just like I do when I debug it locally in the Cloud Emulator. But instead all I saw was a blank desktop with some base level server inspection and management routines. I even checked the Event Viewer for Application related messages and saw none.
So now I'm stuck wondering if my Worker Role is actually running or not, despite the seemingly positive status messages from the Management Portal, and I still want to attach to my Worker Role for debugging through Visual Developer, if it's possible, but I am unable to figure out how.
Anyone with experience in this area that can give me some solid tips on what to do next, please respond.
UPDATE: I believe my Worker Role may be running, because I opened a command window, ran Netstat, and saw it listening on the correct port. However, that may just be my Worker Role shell class, which starts the custom EXE I have it launch as a spawned process. I still haven't confirmed whether my custom EXE is running yet.
UPDATE-2: Just ran TaskList from a command window and the custom EXE is listed.
UPDATE-3: Everything is working as I just ran a remote test of the service so that's not a problem. Still want to know how to attach to the Worker Role from Visual Developer 2010 for remote debugging, and if it's possible to see the custom EXE's display form like I do when doing local debugging in the Cloud Emulator.
-- roschler
There is a set of articles here that goes into detail on how to set up remote debugging in Azure:
http://blogs.u2u.be/peter/post/2011/06/21/Remote-debugging-an-Azure-Worker-role-using-Azure-Connect-Remote-desktop-and-the-remote-debugger.aspx
http://blogs.u2u.be/peter/post/2011/06/24/Remote-debugging-an-Azure-worker-role-using-Azure-Connect-remote-desktop-and-remote-debugger-part-2.aspx
http://blogs.u2u.be/peter/post/2011/06/26/Remote-debugging-a-Windows-Azure-Worker-Role-using-Azure-Connect-Remote-desktop-and-the-remote-debugger-part-3.aspx
The key takeaway is that you don't need to actually install Visual Studio on Azure, you only need to copy the Remote Debugger bits and then use Azure Connect to add your developer machine to the Virtual Network.
You can set up Remote Debugging with Visual Studio 2012:
http://code.msdn.microsoft.com/Remote-Debugging-Windows-dedaaec9
When you say:
But instead all I saw was a blank desktop with some base level server inspection and management routines.
this is exactly what you get with an Azure VM. It's a basic OS install, plus the bare minimum of Azure stuff it needs to run and the code you've uploaded. There's no fancy monitoring or health checks available on the machine by default; you're expected to provide those yourself if you want them available without having to RDP into the machine to check on it.
RDP is very good for tracking down certain problems, like checking that a startup task will run, checking which directories items are installed in and just generally being nosey. If you need extra tools to track down a problem, you can just install them while you're connected to the server. For example I have RDPed into a server and installed the Microsoft Debugging Tools, to track down a memory issue.
I suppose you could remote into your VM, install Visual Studio there, and debug the process...
I also suppose it might be possible to enable remote debugging (not sure what's involved there, but such a thing exists, and it works over TCP) and debug from a local instance of Visual Studio.
To my knowledge, neither is commonly done.
Based on other answers, you would be better off writing a log file to local storage. You can read the file over RDP if you really have to. Keep in mind that debugging on Azure isn't really simple, and rightly so.
What I was thinking, though, was that maybe you could run the process using the user's credentials. I can't verify this at the moment, but you'd have a better shot at seeing the UI when you RDP.