After I upload another file to Assets in the Media Service section of the Azure portal, this feature becomes completely broken. The section shows me an error and does not work at all.
This error does not depend on which browser I use, and it is not a browser-cache problem.
Console logs - http://static.realgrad.ru/portal.azure.com-1487781825223.log.txt
I am unsure whether this is happening by design or something has gone wrong. Using the Visual Studio Publish option, I published my ASP.NET Core website to Azure App Service. Now, when I open the file manager at https://mywebsite.scm.azurewebsites.net/fileManager, the majority of the files are missing. I can only browse some log files and several zipped packages.
The website itself works fine when I access it at its URL; however, I am unable to browse all of its files. I was wondering whether this is intended behaviour or I have done something wrong.
Kind Regards.
localStorage is an HTML5 feature that allows web pages to store named key/value pairs locally; the value below is simply a setting stored for the Kudu web app.
To solve the issue of files not being viewable via Kudu:
Open the browser -> go to Developer Tools (or hit F12) -> in the console, run the following command:
window.localStorage['maxViewItems'] = 1000
If you are still facing the issue, please compare the files shown in the Azure App Service Editor with those shown in Kudu. I checked both, and Kudu and the App Service Editor show the same files that make up the web app.
For reference, see: Kudu error with maxViewItems in localstorage
Good day everyone. I have a web app built with Entity Framework that has an upload feature, and the upload feature works on localhost. After publishing to Azure, the site works, but the uploaded images are not displaying, and when I try to upload a new image I get an "An error occurred while processing your request" error. The connection string is correct, since other data is showing. I changed the permissions of the upload folder and republished, but the result is the same. Looking at the dev console, it shows "failed to load resource". But why? Do I need to configure something in Azure? Any help is appreciated, thank you!
What I did was look at the Azure website's Kudu console (something I was not aware of before), and there I realized that my upload directory was not included in the deployment, which is why the images were not loading: the directory did not exist. I don't know why it was not included, but I created a new upload directory in the Kudu console and manually transferred the photos; after that the photos are displaying and uploads are working. For references: reference 1 reference 2
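If you want to avoid recreating the folder by hand after every publish (empty folders are generally not deployed), below is a minimal sketch of an ASP.NET Core upload action that creates the directory before saving the file. The controller name, the "uploads" folder name, and the use of IWebHostEnvironment (ASP.NET Core 3.0+) are assumptions, since the original code is not shown.

using Microsoft.AspNetCore.Hosting;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using System.IO;
using System.Threading.Tasks;

public class UploadController : Controller
{
    private readonly IWebHostEnvironment _env;

    public UploadController(IWebHostEnvironment env)
    {
        _env = env;
    }

    [HttpPost]
    public async Task<IActionResult> Upload(IFormFile file)
    {
        if (file == null || file.Length == 0)
        {
            return BadRequest("No file was uploaded.");
        }

        // The "uploads" folder may be missing after publishing, because empty
        // folders are not deployed; create it before writing to it.
        string uploadDir = Path.Combine(_env.WebRootPath, "uploads");
        Directory.CreateDirectory(uploadDir); // no-op if the folder already exists

        string filePath = Path.Combine(uploadDir, Path.GetFileName(file.FileName));
        using (FileStream stream = new FileStream(filePath, FileMode.Create))
        {
            await file.CopyToAsync(stream);
        }

        return Ok();
    }
}

The Directory.CreateDirectory call is harmless when the folder already exists, so the same action works both on localhost and after deployment.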
I'm managing a small web app that I always deploy via FTP. My teammate asked me to upload a folder containing a small teaser website. As per the usual routine, I opened the Azure Web App Deployment Center FTP dashboard, copied the server address and credentials, and pasted them into FileZilla (latest version). The login went fine and I could see the file list and folders. However, when I tried to upload files or remove existing ones, all write attempts failed with a "550 Access is denied" error. What could be the cause of this?
On the other hand, I could remove existing files via the Kudu debug console.
I checked the FTP link the Deployment Center provides; it looks like the portal provides a read-only FTP endpoint. So go to your web app, click Get publish profile, and use the publishUrl (and credentials) from the profile with profileName=<yourwebname-FTP>.
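For orientation, the downloaded .PublishSettings file looks roughly like the sketch below; the host name, user name, and password are placeholder values and several attributes are omitted. Take publishUrl, userName, and userPWD from the FTP profile and enter them in FileZilla, which gives you a writable connection to /site/wwwroot.

<publishData>
  <publishProfile profileName="mywebsite - FTP"
                  publishMethod="FTP"
                  publishUrl="ftp://waws-prod-xx-001.ftp.azurewebsites.windows.net/site/wwwroot"
                  ftpPassiveMode="True"
                  userName="mywebsite\$mywebsite"
                  userPWD="(generated password)"
                  destinationAppUrl="http://mywebsite.azurewebsites.net" />
  <!-- a second profile, "mywebsite - Web Deploy", is omitted here -->
</publishData>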
I have a script that is supposed to upload my customers' files from this page, demo.dupleplay.com, to my Google Cloud Platform Storage. The point of the script is to let my customers upload their files to my storage without giving them access to the bucket, just like Uptobox, MediaFire, and similar services, except that I connected my service to Google Cloud Storage instead of a normal server.
The problem only occurs when uploading remotely via the upload page (link provided above): Chrome gives a network error, just as if the server were under heavy traffic, but it is not. Every other browser handles the upload correctly without any issue. How can I fix this in Chrome?
This is the script in use: https://storage.googleapis.com/duple-play.appspot.com/Archive.zip
I have an Azure Web App configured for backups. The backup runs once a day and backs up both the website and the database. I have checked that the database size is just 170 MB, but now, while taking a backup, I am facing the error "The website + database size exceeds the 10 GB limit for backups. Your content size is 10 GB".
It is not necessarily your database that causes the size to be exceeded. The LogFiles folder can also contribute to the size. If you are sure that the database is not at fault, then look at the other areas that can cause the size to bloat. You can determine folder content sizes using the Azure Web Apps Disk Usage site extension.
Installing the Site Extension:
Browse the Kudu site for your web app: https://sitename.scm.azurewebsites.net
Click on Site extensions and then click on Gallery.
Search for Azure Web Apps Disk Usage site extension.
Click on the + icon to install the site extension.
Once installed, click on the Restart Site button.
Now, the extension should show up under Installed, and the + icon should change to a play (>) button. Clicking it should take you to the following URL: https://sitename.scm.azurewebsites.net/DiskUsage/ (replace sitename with the name of the web app).
Excluding specific folders from the backup:
There is a way to exclude specific folders from your backup.
Create a file called _backup.filter. Place it here: D:\home\site\wwwroot\_backup.filter.
Specify the files and folders that you want to exclude in this file.
For example, I would add the following to my _backup.filter file:
\site\wwwroot\Logs
\LogFiles
\Data
For more information on how backup works, refer to this blog post: http://zainrizvi.io/2015/06/05/creating-partial-backups-of-your-site-with-azure-web-apps/
If none of this helps, create an Azure Support Ticket: How to create an Azure support request
This is very odd. Please first check the web app's disk usage under Quotas in the web portal, and then try the following two approaches.
1. Scale your web app up and then scale it back down, to refresh the web app.
2. Create another App Service plan and move this web app to that plan.
If these two approaches still don't solve the error, I suggest you contact Azure support.