Unzip file uploaded to Azure Websites

I uploaded a zip of my WordPress site to an Azure website. When I try to FTP in with WinSCP it works, but I can't use unzip transfer.zip in the command interface.
How do I unzip my zip file that is now on the server?

This is possible using the Azure portal's console.
Navigate to portal.azure.com.
Locate your website (Browse -> Websites, then click the name of the website you uploaded your ZIP file to).
Scroll down until you see the "Console" button. By default, this is on the right-hand side, about three-quarters of the way down the icon list (or "blade" in Azure portal parlance).
Navigate to the directory you uploaded your ZIP to using the cd command, just as you would on a typical console or shell window.
Execute unzip archive.zip where archive.zip is the name of your ZIP file.
Note that as of today, the unzip command does not output any progress reports; it runs silently. So if you have a large archive that takes some time to unzip, it may appear as if the command has hung, but in fact it is working and you just need to wait.
Update Sep 2018:
The unzip command now outputs its progress to the console, e.g.:
Archive: archive.zip
creating: archive/
inflating: archive/203439A9-33EE-480B-9151-80E72F7F2148_PPM01.xml
creating: archive/bin/
inflating: archive/bin/some.dll
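Putting the steps together, a minimal sketch of a console session (the path and archive name are assumptions; D:\home\site\wwwroot is the conventional content root on Windows App Service, so adjust to wherever you uploaded the file):
cd D:\home\site\wwwroot
unzip archive.zip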

One way is to upload the command-line version of 7-Zip; it's a standalone .EXE file.
Next, from the Azure portal (the 2014 preview portal), navigate to your website, click on the Console tile, and type the extraction command:
7za x thezipfile.zip
An alternative to the portal is to use the console from Kudu. Insert "scm" between your website name and azurewebsites.net, like this, to launch Kudu:
yoursitename.scm.azurewebsites.net
One advantage of using Kudu is that you can upload files directly in the browser just by dragging and dropping them.
Pretty cool.
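If you'd rather script the upload, Kudu also exposes a REST endpoint that expands a ZIP as it uploads it. A hedged sketch using curl and the site's deployment credentials (site name, user, and password are placeholders):
curl -X PUT --data-binary @thezipfile.zip -u '$yoursitename:password' https://yoursitename.scm.azurewebsites.net/api/zip/site/wwwroot/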

It seems that the earlier answers to this question are now obsolete:
the Console button is no longer available in the Azure portal, so there is no way to access unzip this way.
You have to use the Kudu site instead to access a console and run the unzip command.
Simply navigate to https://your-web-app-name.scm.azurewebsites.net and click CMD,
or navigate directly to https://your-web-app-name.scm.azurewebsites.net/DebugConsole.
Then run unzip yourfile.zip.

unzip from info-zip.org is currently available on Azure in the console.
As noted, it produces a red "Bad Request" error message, but appears to work correctly nevertheless.
I contacted the supplier and they said:
I know approximately nothing about Microsoft Azure, and even less about
exactly what you did, but I took a quick look at the UnZip source, and
didn't see anything like a "Bad Request" message. My first guess
would be that UnZip seems to be working correctly because it is
working correctly, and that the "Bad Request" message comes from
somewhere else.

Since all answers above explain how to unzip files in a Web App running on Windows, let me add how to unzip in a Linux environment.
Head to https://mysite.scm.azurewebsites.net and select "Bash" ("SSH" is much better/faster to use, but doesn't support unzip or 7za to date).
Now simply type unzip myfile.zip and the files inflate; the progress is printed to the console.
Hope this helps.
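For completeness, a minimal sketch of the whole operation in the Bash console (the archive name is an assumption, and /home/site/wwwroot is the conventional content root on Linux App Service; -o overwrites any existing files without prompting):
cd /home/site/wwwroot
unzip -o myfile.zip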

There is now unzip logic built into the Kudu interface. Drag a file to the correct spot in Kudu and it will upload and unzip automatically.

Related

Failure to unzip application packages in Azure Batch

I followed the guidance in this wiki page to create an application package for my Azure Batch pool, but now my nodes are stuck in an unusable state because it fails to unzip. I can't find anything in the documentation that talks about what kind of compressed file is acceptable here, other than "a zip file".
I have a collection of database files used for some genomic sequencing tools, stored in a folder structure, which I compressed using tar -zvcf and gave a .zip extension. That did not work, so I tried uploading the same file with a .tar.gz extension, and it also failed.
The Batch Node is running the CentOS image Azure Batch recommends for container applications, and my startup task is not running in the context of the container.
Can anyone point me to documentation or personal experience that helps clarify what kind of files can be used for this? Thank you in advance!
Yes, you are on the right track, but let me highlight the source of the confusion: tar is a different compressed-archive format from zip (more detail here: What is the difference between tar and zip?). As mentioned several times in the documentation you linked, the Batch application package feature only supports the *.zip format, so changing the file extension from *.tar to *.zip is not the right way; they are two different compression formats.
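In other words, recreate the archive with zip instead of renaming a tarball. A minimal sketch, assuming the database files live in a folder named dbfiles (the folder name and path are placeholders):
cd /path/to/parent
zip -r dbfiles.zip dbfiles/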
Extra docs:
https://azure.microsoft.com/en-au/blog/application-packages-and-task-dependencies-now-available-on-azure-batch/
https://kb.winzip.com/help/winzip/AboutZIPsAndOtherArchives_4.htm
Thanks and hope it helps.

How to download a folder to my local PC from Google Cloud console

I have a folder I want to download from Google Cloud Console using the Linux Ubuntu command terminal. I have logged in to my SSH console and so far I can only list the contents of my files as follows.
cd /var/www/html/staging
Now I want to download all the files from that staging folder.
Sorry if I'm missing the point. Anyway, I came here seeking a way to download files from the Google Cloud console. I didn't have the ability to create an additional bucket as suggested above, but I accidentally noticed that there is a button for exactly what I needed:
look for the kebab-style (⋮) menu button; in the dropdown that appears you should find a Download button.
If you mean Cloud Shell, then I typically use the Cloud Storage tools (gsutil).
In summary, I transfer from Cloud Shell to Cloud Storage, then from Storage to my workstation.
First, have the Google Cloud SDK (which includes gsutil) installed on your local system.
Make a bucket to transfer into: gsutil mb gs://MySweetBucket
From within Cloud Shell, copy the file to the bucket: gsutil cp /path/to/file gs://MySweetBucket/
On your local system, pull the file down: gsutil cp gs://MySweetBucket/filename .
Done!
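Putting it together for the staging folder in the question, a hedged sketch (the bucket name is a placeholder, and -r is needed because staging is a folder rather than a single file). In Cloud Shell:
gsutil mb gs://my-staging-bucket
gsutil cp -r /var/www/html/staging gs://my-staging-bucket/
Then on the local workstation:
gsutil cp -r gs://my-staging-bucket/staging .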

How to list CACLS (rights) for a folder and log to disk

What I want to know is: how can I write the CACLS rights of a folder, recursively (folders and files), out to disk? I want to write out two different folders and compare the two output files in a tool like WinMerge.
I have a website that works when it's set up using some manual steps, but when I publish from the build server, something gets set in the CACLS automatically and the site gets access denied again when you try to browse to it. I know the TFS build server is doing something when it publishes, and I am trying to figure out what is different afterwards. I made a backup of the good folder that works, so I need to output the CACLS of each folder and do a text compare.
I already went through by eye and checked all the folders and files with the file properties viewer, looking for missing files or altered web.configs; it's not that. I have looked all over Google and can't find a good solution. Can someone help me?
From PowerShell you could run something like this:
# recurse through every file and folder and log each item's ACL
gci -Recurse | % { cacls $_.FullName } > C:\temp\cacls.txt
That should give you something you can compare in a diff tool.
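To pin down what the build server changes, you might capture the ACLs of the good backup and the broken deployment separately and diff the two logs (the folder paths are placeholders):
cd C:\backup\goodsite
gci -Recurse | % { cacls $_.FullName } > C:\temp\cacls-good.txt
cd C:\inetpub\badsite
gci -Recurse | % { cacls $_.FullName } > C:\temp\cacls-bad.txt
Then open the two text files in WinMerge.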

Linux (CLI): download files via shared Dropbox (folder) link without an account

I was thinking of using Dropbox to upload the source code of my web application. For this folder I would create a shared link, which I'd like to use to download all the latest source files onto my test server (instead of using S/FTP).
Now, I know you can use Dropbox on Linux by installing their client, but it requires creating an account. I don't want to use an account, and certainly don't want to use my own account.
Is there any way to use a shared (folder) link and download all the files in that folder from the command line, without an account (maybe something like wget)? There is no need for live syncing; it would be fine to trigger the download with some bash script.
Thanks.
If you're OK with your links being public (which I think is not a good idea), then you can just create a file with a list of links to your files and write a bash script that loops over each line of the file and fetches the link with wget, as sketched below.
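A minimal sketch of that loop, assuming a file called links.txt with one shared link per line, stored without any query string of their own (appending ?dl=1 asks Dropbox for the file itself rather than the preview page):
while read -r url; do
  wget "${url}?dl=1"
done < links.txt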
If you want to use authentication, you'll have to register for a Dropbox API key and then write a script (in Python, Ruby, Java, etc.) to authenticate and fetch the files.
If you don't have a specific need for Dropbox, I'd recommend using git (or similar). With git you just create the repository on your server and clone it on your desktop. Then you can edit your files and push them to the server... it's so much easier.
Rogier, GitHub has become the norm for hosting code. There are other options (SourceForge, Google Code, Beanstalk), or you can set up a private git repository on your own computer.
Somewhere deep in my browser history there's an article about how to do that.
However, a little googling turned up http://news.ycombinator.com/item?id=1652414. Let me know if you can't find satisfactory instructions on your own for setting up a git repo on your computer.

Cruisecontrol, deployment, folder permissions

We're using CruiseControl.NET; it builds the version, creates a zip file, then 15 minutes later unzips the file on the integration server. But when the folder gets to the integration server, often the security permissions on one of the folders are totally hosed: the domain admin and folder owner can't even open the folder in Explorer. After a reboot the folder permissions are good again; we can delete the folder, redeploy the zip file, and it's okay.
Does anyone have any idea what or how the folder permissions are getting so messed up?
Any tools to use to diagnose/watch what exactly is messing it up?
Have you tried using PsExec from Sysinternals to unzip the file on the remote machine rather than on the build machine?
Also, it seems to me that rather than unzipping the zip, you could just copy the files directly to the remote server. I'm not seeing the reason to zip it and then immediately unzip it.
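A hedged sketch of the PsExec approach (server name, credentials, and paths are placeholders, and it assumes the standalone 7-Zip binary 7za.exe already exists on the integration server):
psexec \\INTEGRATION01 -u DOMAIN\builduser -p secret C:\tools\7za.exe x C:\deploy\build.zip -oC:\deploy\site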
