Is there a way to easily enable and download Windows Azure IIS logs for an ASP web site? I found a few articles about transferring logs to storage, but they all seem too complicated for something that should be readily available.
Read this part of the Azure documentation on How to: Configure diagnostics and download logs for a web site. Basically:
Configure logging in the Diagnostics section of the Configure management page
Download the log files for your web site via FTP (see the website's Dashboard management page and make note of the FTP hostname); a download sketch follows after this list
Read the log files using one of the several methods described
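If you'd rather script the FTP step than use an FTP client, here is a minimal Python sketch using ftplib. The hostname, credentials, and log path are placeholders; take the real values from your site's dashboard and deployment credentials, and check where your IIS logs actually land (it is usually /LogFiles/http/RawLogs, but verify on your own site).

```python
# Minimal sketch: pull IIS logs over FTP with Python's ftplib.
# Host, username, password, and the log folder are placeholders -
# take the real values from your site's Dashboard / deployment credentials.
from ftplib import FTP
import os

FTP_HOST = "waws-prod-xx-000.ftp.azurewebsites.windows.net"  # placeholder
FTP_USER = "yoursite\\$yoursite"                             # placeholder
FTP_PASS = "your-deployment-password"                        # placeholder
REMOTE_DIR = "/LogFiles/http/RawLogs"  # typical IIS log location; verify on your site
LOCAL_DIR = "iis-logs"

os.makedirs(LOCAL_DIR, exist_ok=True)

ftp = FTP(FTP_HOST)
ftp.login(FTP_USER, FTP_PASS)
ftp.cwd(REMOTE_DIR)

# Download every .log file in the raw IIS log folder.
for name in ftp.nlst():
    if name.lower().endswith(".log"):
        with open(os.path.join(LOCAL_DIR, name), "wb") as f:
            ftp.retrbinary("RETR " + name, f.write)

ftp.quit()
```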
We use an app builder to create our applications, download them as a .zip, and then just click "Add Application" in IIS, point it to the application folder, and it works.
Our applications require no server logic or database; they are self-contained, and the database is external and accessed via REST from JavaScript.
I can't seem to find the equivalent of the above procedure on Azure; it has options like Python/Node/Java/.NET, but I'm not sure which of these fits our application's needs.
How can I just easily upload a web app on Azure without any backend required?
If I'm understanding your question correctly, you can drag/drop a zip file into the "kudu" console of an Azure webapp to upload the file. Azure takes care of unzipping the file and putting the files in the web app.
To get to that portal, change your web app url to contain ".scm." and append /ZipDeploy to the end of the url:
https://yourwebapp.scm.azurewebsites.net/zipdeploy
Then drag/drop your file onto the web page.
From the main kudu page (remove zipdeploy from your url) you can upload entire directories of files if you need to. I regularly deploy webapps this way and it works great.
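If you want to script this rather than drag and drop, the same Kudu site exposes the zipdeploy endpoint over plain HTTP. A minimal Python sketch using requests; the site name and deployment credentials are placeholders:

```python
# Rough sketch: push a zip to the Kudu zipdeploy API instead of using the
# drag/drop page. Site name, username, and password are placeholders - use
# your app's deployment (publish profile) credentials.
import requests

SITE = "yourwebapp"                      # placeholder
USER = "$yourwebapp"                     # placeholder, from the publish profile
PASSWORD = "your-deployment-password"    # placeholder

url = f"https://{SITE}.scm.azurewebsites.net/api/zipdeploy"

with open("app.zip", "rb") as f:
    resp = requests.post(url, data=f, auth=(USER, PASSWORD))

# Kudu unzips the package into the web app's wwwroot on success.
resp.raise_for_status()
print("Deployed:", resp.status_code)
```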
One of the easiest approaches is to make use of Azure App Service. This is Azure's Platform as a Service (PaaS) offering that allows you to get a web site up fairly quickly, whether that is a static site that is mostly client-side logic or a more traditional app such as ASP.NET/PHP/etc. You can deploy your app through various means: CI/CD, GitHub, FTP, etc. You can find more details and a 5-minute quickstart at https://learn.microsoft.com/en-us/azure/app-service/.
More recently, Azure has offered the ability to host static web sites right in Azure Blob storage. It doesn't come with all the features of App Service and does not have any server-side execution, but it has the advantage of price and simplicity, and when used in concert with a CDN it can be attractive for static, client-side-only sites. More details at https://learn.microsoft.com/en-us/azure/storage/blobs/storage-blob-static-website
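For the Blob storage option, publishing a client-side-only app is just a matter of copying your built files into the special $web container. A rough Python sketch using the azure-storage-blob package, with the connection string and local folder as placeholders:

```python
# Minimal sketch: copy a local build folder into the $web container used by
# Azure Blob storage static website hosting. The connection string is a
# placeholder; get the real one from the storage account's Access keys blade.
# Requires: pip install azure-storage-blob
import mimetypes
import os
from azure.storage.blob import BlobServiceClient, ContentSettings

CONN_STR = "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...;"  # placeholder
LOCAL_DIR = "dist"  # placeholder: your built client-side app

service = BlobServiceClient.from_connection_string(CONN_STR)
container = service.get_container_client("$web")

for root, _, files in os.walk(LOCAL_DIR):
    for name in files:
        path = os.path.join(root, name)
        blob_name = os.path.relpath(path, LOCAL_DIR).replace(os.sep, "/")
        content_type = mimetypes.guess_type(name)[0] or "application/octet-stream"
        with open(path, "rb") as data:
            container.upload_blob(
                name=blob_name,
                data=data,
                overwrite=True,
                content_settings=ContentSettings(content_type=content_type),
            )
        print("uploaded", blob_name)
```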
I have created a .NET Core 1.1 web app and hosted it in Azure, with the logging concepts covered briefly in https://blogs.msdn.microsoft.com/webdev/2016/10/25/announcing-asp-net-core-1-1-preview-1/
I have then set up the logs to write to Azure Blob storage, which I can see working; however, from what I can see, the only way to actually view these logs is by downloading individual files.
Is there a way to see all of these logs in a stream with date filtering, etc., or to feed that data into a different Azure application that could provide a usable GUI like Kibana?
The log files are grouped by date. If you want to view the logs online, you could install a site extension named Azure Web Site Logs Browser, which allows you to view all the logs (from both Blob storage and the file system). You can also fork the project on GitHub and apply your own queries to the logs.
Steps to add the Azure Web Site Logs Browser extension:
Azure portal -> Open your Web App -> Choose the [Extensions] menu -> Click the [Add] link -> Choose the extension you want to add.
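If you would rather pull the blob logs down in bulk and filter them yourself (for example by date), the folder layout lends itself to prefix queries. A rough Python sketch with azure-storage-blob; the connection string, container, and prefix are placeholders, and the year/month/day layout is what App Service blob logging typically uses, so check it against your own container:

```python
# Rough sketch: list and download all log blobs for a given day by prefix,
# instead of downloading files one by one. Connection string, container name,
# and prefix are placeholders - adjust the prefix to the folder structure you
# actually see in your container.
from azure.storage.blob import BlobServiceClient

CONN_STR = "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...;"  # placeholder
CONTAINER = "app-logs"                                                        # placeholder
PREFIX = "mywebapp/2017/01/15/"  # placeholder: app name + date path

service = BlobServiceClient.from_connection_string(CONN_STR)
container = service.get_container_client(CONTAINER)

for blob in container.list_blobs(name_starts_with=PREFIX):
    print("downloading", blob.name)
    data = container.download_blob(blob.name).readall()
    with open(blob.name.replace("/", "_"), "wb") as f:
        f.write(data)
```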
If you have enabled 'Application logging' under 'App Service -> Diagnostics logs', you will be able to view real-time log information under 'App Service -> Log stream'.
For GUI filtering capabilities, check out Azure OMS Log Analytics. It can do comprehensive log filtering and analytics.
For application specific details, you can check 'Application Insights'.
Hope that helps,
Mihir
Microsoft has a desktop application to show the contents of various Azure storage containers. It is called Azure Storage Explorer.
I used the Azure Websites Migration Assistant to migrate a web service that was running in IIS on my local VM. The migration was successful, and I am able to use the web service. But I can't find the migrated source code in the Azure portal; all I can see is some 20 MB of data on the dashboard graph. If I need to change some of my code, where do I do that?
What is on the Azure Web App should now match what was on your IIS server. Now, to update the web app, you can use the deployment techniques here: https://azure.microsoft.com/en-us/documentation/articles/web-sites-deploy/
The simplest way to check what content is on your web app is to use the SCM site, available at https://your-site-name.scm.azurewebsites.net. Go to Debug Console > CMD and then into the site > wwwroot folder to see your web app content. You can also upload to the site via drag and drop.
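If you prefer to script that check, the SCM site also exposes a simple file API (api/vfs) that you can call with your deployment credentials. A quick Python sketch; the site name and credentials are placeholders:

```python
# Quick sketch: list the migrated content in site/wwwroot through the Kudu
# VFS API instead of clicking through the Debug Console. Site name and
# deployment credentials are placeholders.
import requests

SITE = "your-site-name"               # placeholder
USER = "$your-site-name"              # placeholder, from the publish profile
PASSWORD = "your-deployment-password" # placeholder

url = f"https://{SITE}.scm.azurewebsites.net/api/vfs/site/wwwroot/"
resp = requests.get(url, auth=(USER, PASSWORD))
resp.raise_for_status()

# Each entry describes a file or folder in wwwroot.
for entry in resp.json():
    print(entry["name"], entry["size"])
```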
Alternatively, you can download the publishing settings for your web app via the portal and then re-use the migration tool, select the site, and then upload the publishing settings. However I would suggest using the deployment techniques above first. (Disclaimer: I wrote the migration tool.)
There are multiple ways to push changes to your Azure Website/Web App. They are listed here: https://azure.microsoft.com/en-us/documentation/articles/web-sites-deploy/
One simple way is to use an FTP client like FileZilla. In the classic portal, you will find the FTP address (hostname) and the credentials in the dashboard tab. In the new portal, select your Web App and the FTP address will be displayed in the Essentials section at the top of the page. Click on Settings and Deployment credentials to set your FTP user password.
Another simple alternative is to use Dropbox. Take a look at this video for setup instructions: https://channel9.msdn.com/Series/Windows-Azure-Web-Sites-Tutorials/Dropbox-Deployment-to-Windows-Azure-Web-Sites
I need to add some GeoServer data to my Azure Website. I'm just wondering whether I need to install GeoServer on my account for that (if that is even possible), or is there some other way?
While looking around, I found that this link works:
http://geoserver.azurewebsites.net/
If this link is working, does that mean we can configure GeoServer on Azure?
There are a couple of ways to do this.
One option is to use a plain old Azure Web App with the GeoServer web archive (WAR file).
To do this, download the web archive version of GeoServer. Then create a web app in the Azure portal. Under application settings, you need to specify Java version 8 and a Tomcat version above 7.0.65. Then all you need to do is FTP the GeoServer WAR file to /site/wwwroot/webapps.
Once you have done that, you should be able to access GeoServer by visiting your web app with the path /geoserver appended to the URL. But be patient; the initial start-up is slow and can take a few minutes.
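For reference, the FTP upload itself can be scripted. A rough Python sketch using ftplib; the hostname and credentials are placeholders taken from your web app's publish profile:

```python
# Rough sketch: FTP the GeoServer WAR into the Tomcat webapps folder of an
# App Service Java web app. Host and credentials are placeholders - take them
# from the portal / your publish profile.
from ftplib import FTP

FTP_HOST = "waws-prod-xx-000.ftp.azurewebsites.windows.net"  # placeholder
FTP_USER = "yourwebapp\\$yourwebapp"                         # placeholder
FTP_PASS = "your-deployment-password"                        # placeholder

ftp = FTP(FTP_HOST)
ftp.login(FTP_USER, FTP_PASS)
ftp.cwd("/site/wwwroot/webapps")  # Tomcat picks up WARs dropped here

with open("geoserver.war", "rb") as f:
    ftp.storbinary("STOR geoserver.war", f)

ftp.quit()
# GeoServer should then come up at https://yourwebapp.azurewebsites.net/geoserver
# after a few minutes, once Tomcat has expanded and started the webapp.
```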
Another option is to deploy a container image. I have ended up creating one myself that is tailored for use in an Azure Web App.
To use it you will need a Linux App Service plan. When creating your web app, select Docker Hub as the container image source and use coderpatros/geoserver-azure-web-app as the image. You can also specify a tag to use a particular GeoServer version, e.g. coderpatros/geoserver-azure-web-app:2.14.2. The available tags can be found on Docker Hub at https://hub.docker.com/r/coderpatros/geoserver-azure-web-app/tags. After you've created the web app, go to application settings and set WEBSITES_ENABLE_APP_SERVICE_STORAGE to true to enable persistent storage.
There is an OpenGeo Suite offering in the Azure Marketplace which helps with publishing maps and data from a variety of formats and sources using GeoServer.
Not sure if it will help you, but here ... OpenGeo Suite
I have been looking through the C# and REST APIs for Microsoft Azure web sites, but I cannot find a way of executing the Dropbox sync command that can be done through the Azure portal. Is this possible from an API that anyone knows of?
There is support for deploying a WebDeploy package to a Website using PowerShell, so there must be a corresponding API for it.
If you download the publish settings for the website, you'll see that it has a publishUrl, which is the WebDeploy endpoint for the server, along with an msdeploySite, which is your unique site on the server. A WebDeploy package is nothing more than a fancy zip (AFAIK), so by digging into this it should be possible to come up with something that can talk to the WebDeploy endpoint and thereby publish.
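As a starting point for experimenting, the relevant values are easy to pull out of the downloaded .PublishSettings file. A hedged Python sketch; the publishData/publishProfile layout and attribute names match the files I've seen, but verify against your own download:

```python
# Hedged sketch: extract the WebDeploy details from a downloaded
# .PublishSettings file. The expected layout is a <publishData> root with
# <publishProfile> children carrying publishUrl, msdeploySite, userName, and
# userPWD attributes - double-check against your own file.
import xml.etree.ElementTree as ET

tree = ET.parse("yoursite.PublishSettings")  # placeholder filename

for profile in tree.getroot().findall("publishProfile"):
    if profile.get("publishMethod") == "MSDeploy":
        print("publishUrl:  ", profile.get("publishUrl"))
        print("msdeploySite:", profile.get("msdeploySite"))
        print("userName:    ", profile.get("userName"))
```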
I don't think that API is something Microsoft publishes though, so you might have to dig deep.