I have a WordPress installation on an Azure Website (not a Web Role). I have FTP access to the site, but downloading everything can literally take an hour, which is insane: if I could just zip the thousands of files on the site (there are so many because of all the plugins, etc.), the zip itself might take 5 seconds to run. Downloading that would then be far more reliable, since it's just 1 file rather than 10,000 that could get messed up in transfer. Traditional hosts let you zip folders from their control panel, for instance, but FTP doesn't allow this.
So can I do this on an Azure website in any way, shape, or form? I've looked a bit into SFTP, which seems to have some such capabilities, but it doesn't appear to be implemented for Azure websites. What can I do? This whole workflow is miserable, I can't live with it, and it discourages backups. It pushes one toward a traditional shared host, but I would rather avoid that if possible.
Use the Kudu Console. To access it, simply go to {yoursite}.scm.azurewebsites.net.
You will then be prompted for your Microsoft Azure account credentials. Once logged in, click 'Debug Console' at the top of the page.
Within the UI, next to each file and folder there is a down-arrow icon that lets you download the item.
For files, it directly downloads the file by navigating to it.
For directories, it downloads a zip file containing the full content of the folder.
Detailed instructions can be found here: https://github.com/projectkudu/kudu/wiki/Kudu-console
Another solution is the Kudu API. You can accomplish a lot with it, such as downloading a folder from the App Service as a ZIP, and it can also be automated from a script. For example, use the following URL:
If you are logged in to Kudu in the browser, just use:
https://{{your-site}}.scm.azurewebsites.net/api/zip/{{folder-path}}
If using a script or the command line, pass your credentials as follows (see the sketch after this list):
https://user:pass@{{your-site}}.scm.azurewebsites.net/api/zip/{{folder-path}}
where user and pass can be obtained by going to your App Service in the Azure Portal and clicking Get publish profile on the Overview tab. See the Deployment Credentials documentation for more details.
Note: The folder path starts from D:\home.
For more information, consult the Kudu REST API documentation.
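For completeness, here is a minimal Python sketch of scripting that call with the requests library. The site name, folder path, and credentials below are placeholders, not values from the answer above; substitute the deployment user and password from your app's publish profile.

```python
import requests

# Placeholders: replace with your site name and the deployment
# credentials from the publish profile (Overview tab in the portal).
SITE = "your-site"
USER = "$your-site"         # deployment username
PASSWORD = "your-password"  # deployment password
FOLDER = "site/wwwroot/"    # path relative to D:\home

url = f"https://{SITE}.scm.azurewebsites.net/api/zip/{FOLDER}"

# Stream the ZIP of the folder to a local file.
resp = requests.get(url, auth=(USER, PASSWORD), stream=True)
resp.raise_for_status()

with open("wwwroot.zip", "wb") as f:
    for chunk in resp.iter_content(chunk_size=1 << 20):
        f.write(chunk)

print("Saved wwwroot.zip")
```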
So my uncle asked me to update something on his website, and I found out he uses Azure. I've never used this service before. After looking around, I thought it was as easy as searching App Services, clicking on the only app running, choosing Deployment Center in the side panel, and then choosing FTP access. I then used FileZilla, edited a file, and re-uploaded it, but nothing gets updated on the site. Am I missing something obvious?
Thanks.
If you want to update your website, you can use FTP, Git, or other tools. You can refer to this document, which explains how to deploy your site and covers the other deployment options. Based on your description, you can also check the last-modified time of the files in Kudu. The following paragraphs introduce Kudu.
If you just want to update the static files in your site, the easiest way is to log in to the portal and find your app.
First, click Advanced Tools in the left-hand menu and follow the link to open the Kudu management site.
Second, find the Debug console option and choose CMD or PowerShell. You will then see your site's folders in the page; click site -> wwwroot.
Third, you can add, delete, or modify your static files using the buttons on the page. Clicking the pen icon lets you edit a file in place.
If you want to copy local files up, you can also just drag and drop them onto this page, which is very convenient. For more on what Kudu can do, see this site. If you would rather script the update, see the sketch below.
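As a scripted alternative to the steps above, a minimal sketch using Kudu's VFS API might look like the following. The site name, credentials, and target file path are placeholders (my assumptions, not values from this answer); use the deployment credentials from the app's publish profile.

```python
import requests

# Placeholders: site name and deployment credentials from the publish profile.
SITE = "yoursite"
USER = "$yoursite"          # deployment username
PASSWORD = "your-password"  # deployment password

KUDU = f"https://{SITE}.scm.azurewebsites.net"
REMOTE_PATH = "site/wwwroot/css/style.css"  # hypothetical file to update

# PUT the local file over the existing one via the Kudu VFS API.
with open("style.css", "rb") as f:
    resp = requests.put(
        f"{KUDU}/api/vfs/{REMOTE_PATH}",
        data=f,
        auth=(USER, PASSWORD),
        headers={"If-Match": "*"},  # overwrite regardless of the current ETag
    )

resp.raise_for_status()
print("Uploaded", REMOTE_PATH)
```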
Hope my answer works for you.
I've got an Umbraco site which I deploy to an Azure Web App. The data is in an Azure SQL database. I have been able to deploy this successfully, and I can verify that all the data I expect is present in the content view.
However, I have added content on various pages in rich text editors, and on my local site I can see this content on the site itself. On my deployed site, the rich text editor content is only visible in the content view, not on the site. I've tried publishing each item, but nothing appears.
What else can I try?
Umbraco needs some additional configuration to run properly on Azure. This especially affects the indexes and the XML cache file.
Please check the following blog post by one of the Umbraco HQ core developers, Sebastiaan Janssen: https://cultiv.nl/blog/making-sure-your-umbraco-site-performs-on-azure/. Go through it step by step to ensure your app is properly configured.
Going further, you may also need to ensure proper configuration for load balancing, which you can find here: https://our.umbraco.org/documentation/getting-started/setup/server-setup/load-balancing/flexible
I found the answer after much experimenting.
I had not manually included Views/Partials/Grid/fanoe.cshtml in my project (and thus was not deploying it). I guess I was using some default grid template that relies on this file, rather than the other grid templates in the same folder.
I have some files on S3 and would like to view them on the web. The problem is that the files are not public, and I don't want them to be public. Google Docs Viewer works, but on the condition that the files are public.
Can I use Office Web Apps to show them in the browser? Since the files are private, I do not want to store any data on Microsoft's servers. It looks like even Google Docs Viewer stores the content while parsing it.
What is the cleanest way?
Thanks.
I have looked around for something similar before, and there are some apps you can install locally (Cyberduck, S3 Browser, etc.). In-browser options have been limited until recently (full disclosure: I worked on this project).
S3 LENS - https://www.s3lens.com/
I'll probably get a minus here, but Microsoft also has an online viewer, and it works the same way: the file needs to be publicly accessible.
Here is the link: https://view.officeapps.live.com/op/view.aspx
What I could add is that those files only need to be publicly accessible for a short period, i.e. until the page gets opened. So you could trick the viewer by uploading the file to a public temporary storage location under a randomly generated folder name and giving that URL to the online viewer.
Of course this is not that safe, since the file does end up, at some point, in the temp storage and then with Google or Microsoft, but the random path names offer some degree of safety.
I recently created a small Glitch app which demonstrates what I just explained: https://honeysuckle-eye.glitch.me/
It uploads local files to a temp storage and then opens the viewer from that temp storage; the temp storage only lasts for one download, so it is pretty safe.
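As an illustration of the idea (not the actual Glitch app's code), a minimal Python sketch of building such a link could look like this. The temp-storage host is hypothetical, and it assumes the Office online viewer takes the document URL in its src query parameter, which is how it is commonly invoked.

```python
import uuid
from urllib.parse import quote

# Hypothetical temporary storage host; you would need your own upload step.
TEMP_STORAGE_BASE = "https://example-temp-storage.com"

def make_viewer_link(filename: str) -> tuple[str, str]:
    # A random folder name makes the temporary public URL hard to guess.
    folder = uuid.uuid4().hex
    public_url = f"{TEMP_STORAGE_BASE}/{folder}/{filename}"
    viewer_url = (
        "https://view.officeapps.live.com/op/view.aspx?src="
        + quote(public_url, safe="")
    )
    return public_url, viewer_url

upload_to, view_at = make_viewer_link("report.docx")
print("Upload the file to:", upload_to)
print("Open the viewer at:", view_at)
```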
I have two Azure websites in the same subscription, and I want to copy the site from one to the other. I know I can copy the entire site down to my local machine via FTP and then upload it to the other site, but it seems like there should be an easier way, especially considering the FTP hostname is the same for both sites.
These are not deployment slots, so I can't just swap them in the interface.
Thanks.
I think you can use the SiteReplicator site extension for that. You can find it here: https://www.siteextensions.net/packages/sitereplicator/.
The SCM site can be found at URL_OF_Your_SITE.scm.azurewebsites.net; go to Site Extensions there to install it if it's not visible in the portal's Site Extensions gallery.
You can use the backup/restore feature. That will create a ZIP file from your site, store it in your storage account (with the configuration serialized to XML), and then let you restore it to a different or new site. In the end it is basically copying files around anyway, but it is more convenient than doing it manually through FTP, and it also copies the website's configuration. It is a one-time operation, though. It is not clear from your question whether you want to copy once or periodically; for the latter I would suggest the SiteReplicator mentioned in the other answer. A scripted alternative is sketched after the links below.
Some links which might help:
Backup - http://azure.microsoft.com/en-us/documentation/articles/web-sites-backup/
Restore - http://azure.microsoft.com/en-us/documentation/articles/web-sites-restore/
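If you do want to script a one-off copy of the files yourself, a rough sketch using Kudu's zip and zipdeploy endpoints might look like the following. The site names and credentials are placeholders (take them from each app's publish profile), and note this copies files only, not the site configuration (app settings, connection strings, etc.).

```python
import requests

# Placeholders: source/target site names and their deployment credentials.
SRC_SITE, SRC_USER, SRC_PASS = "source-site", "$source-site", "src-password"
DST_SITE, DST_USER, DST_PASS = "target-site", "$target-site", "dst-password"

# Download the source site's wwwroot as a ZIP via the Kudu zip API.
zip_bytes = requests.get(
    f"https://{SRC_SITE}.scm.azurewebsites.net/api/zip/site/wwwroot/",
    auth=(SRC_USER, SRC_PASS),
).content

# Push the ZIP to the target site via the Kudu zipdeploy API.
resp = requests.post(
    f"https://{DST_SITE}.scm.azurewebsites.net/api/zipdeploy",
    data=zip_bytes,
    auth=(DST_USER, DST_PASS),
    headers={"Content-Type": "application/zip"},
)
resp.raise_for_status()
print("Copied wwwroot from", SRC_SITE, "to", DST_SITE)
```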
Alright, I am obviously missing something here. I have moved several websites over to Azure to take advantage of all that it has to offer. Traditionally our team has always used Dreamweaver to FTP files up and down. What I don't understand is how to get hooked up to an EXISTING site on Azure. I can easily set up and Web Deploy to a NEW site, but I am trying to give the rest of the team access to the sites I have set up, and I am lost as to how to approach this.
I have tried the File > Open Web Site route, but the issue is that it never saves the project info anywhere in VS, so we have to hook back up to the site each time.
All of our local sites are on a shared network drive, so we all access the same local resources. I thought I could simply pass everyone the publish profiles and they could then import them, get the files, and then edit and publish... but it never gives the option to "get all files" from the server.
Hope this makes sense?! Thanks in advance! :)
For multiple-developer scenarios, it is in your best interest to use a source control system such as Git or TFS. This not only lets you share the source across the team, but also gives you the benefit of tracking changes and merging files that are modified by different team members.
If you aren't comfortable with source control, you do still have access to the files via FTP or Secure FTP.
You could also use WebMatrix, which has download-from-server built directly into the tooling.