azure shared storage file not found - azure-web-app-service

In this project I am using Azure App Service with Linux containers. There is a certain file that must persist and be shared across all instances. However, when the container starts it fails with the following error:
Interop+Crypto+OpenSslCryptographicException: error:2006D080:BIO routines:BIO_new_file:no such file
Using FileZilla and the App Service FTPS credentials, I am able to upload the file I need to the default shared storage, ending up with this folder structure:
/
|_ ASP.NET
|_ LogFiles
|_ site
|_ thefileineed.txt
As you can see, it is a C# project, so it has an appsettings.json file in which the path to this file is declared:
{
  "PathToFile": "/home/thefileineed.txt"
}
Because it uses containers, I assume that I must mount the shared storage inside the container with the compose file, and following the documentation I use the following setup:
...
volumes:
  - ${WEBAPP_STORAGE_HOME}:/home
What am I missing? Or how is it supposed to access the file?

Your setting should look like the one below. Create a storage mount (for example one named MyExternalStorage); then, in the docker compose configuration, set:

volumes:
  - MyExternalStorage:/var/www/html/contao
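Put together, a minimal sketch of the whole compose file (service name, image, and container path are placeholders; the key point is that the volume name must match the storage mount name from Path mappings):

services:
  app:                               # placeholder service name
    image: myregistry/myapp:latest   # placeholder image
    volumes:
      # "MyExternalStorage" must match the storage mount name configured
      # under Settings > Configuration > Path mappings; App Service
      # resolves it to the mounted Azure Files share
      - MyExternalStorage:/home/data # container path of your choice

PathToFile in appsettings.json would then point somewhere under whatever container path you choose here.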
For more details, please read these related posts:
1. Azure Web App usage of WEBAPP_STORAGE_HOME variable in docker-compose
2. [Q&A] Web App Docker Compose Persistent Storage

Related

Wrong path for File Share in Azure App Service

I'm deploying my multi-container application on Azure App Service (Web App for Containers) and the time has come to add persistent storage.
I made it work, but the result is not what I expected.
So I have a Storage Account with a File Share (Transaction optimized).
In that File Share, I created a directory called media.
In my App Service, under Settings > Configuration > Path mappings, I created a storage mount record as below:
In my docker-compose.yml file, I have the following:
backend:
  container_name: my_container_name
  image: my_image
  volumes:
    - filestore:/whatever/media
  # ...
volumes:
  filestore:
The backend of my application stores files in the /whatever/media folder, and it works: my files are in Azure, but not at the correct path. In Azure, I'd like to have everything under the media directory that I created. Instead, files and directories are created at the same level as my Azure media directory, not in it.
You can see the result in the screenshot above:
helpdesk
media
Whereas I'd like to have only
media
With media/helpdesk/...
How can I achieve that? What am I missing?
Thanks in advance for your help.
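For reference, here is a sketch of how the pieces appear to line up, based on the description above (the comments are my reading of the behavior, not a confirmed explanation):

volumes:
  # the named volume resolves to the storage mount's target, which is
  # the ROOT of the File Share, not the media directory inside it
  - filestore:/whatever/media
# hypothetical example: a file written to /whatever/media/helpdesk/file.pdf
# lands at <share-root>/helpdesk/file.pdf, next to the pre-created media directory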

How do I deploy Grafana on an Azure Web App with a Docker-compose file?

I want to deploy monitoring dashboards using Grafana as web apps on Azure and share them with my team members.
But I ran into some problems:
(1) In Docker-compose, Grafana needs volumes to store data.
(2) So I made an Azure Storage account & File share, and mapped this storage to the web app.
The storage mount is as follows:
name: namename
mapping path: /var/lib/grafana
format: AzureFiles
(3) And this is my docker-compose.yml
services:
  grafana:
    image: grafana/grafana
    ports:
      - 3001:3000
    volumes:
      - namename:/var/lib/grafana
(4) After I built it, my web app was down and showed me the screen below.
The error log is this:
service init failed: migration failed: database is locked
Logging is not enabled for this container.
I don't know what the problem is or how to fix it.
Also, I want to attach the storage and inspect its contents.
How do I do that?
When you mount the Azure File Share into the container, the mounted path will have root as its owner and group. But the image runs as the grafana user, so it does not have permission to run the database migration.
The solution is to mount to a new path that does not exist in the image, for example /var/lib/grafana-data; then it will work well. You then need to copy the data yourself from /var/lib/grafana to the new path to persist it.
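A minimal sketch of that change in the compose file; GF_PATHS_DATA is Grafana's standard environment override for its data directory, and using it here is my assumption about how to make Grafana pick up the new path:

services:
  grafana:
    image: grafana/grafana
    ports:
      - 3001:3000
    environment:
      # point Grafana's data directory at the new mount path
      - GF_PATHS_DATA=/var/lib/grafana-data
    volumes:
      # mount to a path that does not exist in the image
      - namename:/var/lib/grafana-data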

How to run a YAML file from Azure Cloud Shell

I have created a Windows Server container on an Azure Kubernetes Service (AKS) cluster using the Azure CLI. While trying to deploy my ASP.NET Core app to the AKS cluster, I am stuck on this step of the above link. I have a sample.yaml file on my Windows 10 hard drive that needs to run in the Azure Cloud Shell using the following command:
kubectl apply -f sample.yaml
Question: Where can I place the above sample.yaml file so I can run the above command in Azure Cloud Shell? I am assuming it probably has to be somewhere in my Azure storage account, but where exactly should it be placed so the above command can recognize its path? Currently it gives the expected error: the path "sample.yaml" does not exist
You can directly create a file named sample.yaml using vi, nano, or code sample.yaml in the Azure Cloud Shell, then copy in your YAML definition.
For example, type code sample.yaml in the Azure Bash. It opens a sample.yaml file; copy in the YAML content and save it. The file is automatically stored in your current working path, /home/user.
Or, you can upload your sample.yaml from your local machine to the Azure path.
Or, you could also persistently store your file in the Azure file share. To find the Azure file share, you can type the df command.
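Putting the first option together, a quick sketch of the whole flow in Cloud Shell Bash:

code sample.yaml              # opens the Cloud Shell editor; paste the YAML and save
kubectl apply -f sample.yaml  # sample.yaml is now in the current working directory
df                            # shows mounted file shares, including clouddrive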

YAML configuration to mount an Azure Blob container share

How do I configure an Azure Blob Storage container in YAML?
- name: scripts-file-share
  azureFile:
    secretName: dev-blobstorage-secret
    shareName: logs
    readOnly: false
The above is the YAML configuration for the logs file share.
But what if I need to mount a blob container? How do I configure that?
Instead of azureFile, do I need to use azureBlob?
And what configuration do I need under azureBlob? Please help.
After the responses I got to the post above, and after going through articles online, I see there is no option to mount Azure Blob storage on AKS for my problem, except azcopy or REST API integration, considering the limitations I have in my environment.
So, after a little research, and taking the articles below as references, I was able to create a Docker image.
1.) Created the Docker image following the reference article. But I also needed to run a bash script, since I run the azcopy command from a bash file, so I copied the azcopy tool to /usr/bin.
2.) Created SAS tokens for Azure File Share & Azure Blob. (Make sure you give required access permissions only)
3.) Created a bash file that runs the below command.
azcopy <FileShareSASTokenConnectionUrl> <BlobSASTokenConnectionUrl> --recursive=true
4.) Created a deployment YAML that runs on AKS, and added the command to run the bash file to it (a sketch follows after the references).
This gave me the ability to copy files from Azure File Share folders to an Azure Blob container.
References:
1.) https://learn.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-v10?toc=%2fazure%2fstorage%2fblobs%2ftoc.json#obtain-a-static-download-link
2.) https://github.com/Azure/azure-storage-azcopy/issues/423
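For illustration, a rough sketch of what step 4's deployment YAML could look like (name, labels, image, and script path are all hypothetical):

apiVersion: apps/v1
kind: Deployment
metadata:
  name: azcopy-sync                 # hypothetical name
spec:
  replicas: 1
  selector:
    matchLabels:
      app: azcopy-sync
  template:
    metadata:
      labels:
        app: azcopy-sync
    spec:
      containers:
        - name: azcopy
          image: myregistry.azurecr.io/azcopy:latest   # the image from step 1
          # runs the bash file from step 3, which calls azcopy with the SAS URLs
          command: ["/bin/bash", "/scripts/copy.sh"]

A Kubernetes Job or CronJob may fit a one-shot copy better, but this mirrors the deployment approach described above.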

How to ship Airflow logs to Azure Blob Store

I'm having trouble following this guide, section 3.6.5.3, "Writing Logs to Azure Blob Storage".
The documentation states you need an active hook to Azure Blob storage. I'm not sure how to create this. Some sources say you need to create the hook in the UI, and some say you can use an environment variable. Either way, none of my logs are getting written to blob store and I'm at my wits' end.
An Azure Blob Store hook (or any hook, for that matter) tells Airflow how to write into Azure Blob Store. This is already included in recent versions of Airflow as wasb_hook.
You will need to make sure that the hook is able to write to Azure Blob Store. Also note that the REMOTE_BASE_LOG_FOLDER container should be named like wasb-xxx. Once you take care of these two things, the instructions work without a hitch.
I achieved writing logs to blob storage using the steps below.
Create a folder named config inside the airflow folder.
Create empty __init__.py and log_config.py files inside the config folder.
Search for airflow_local_settings.py on your machine; you will find:
/home/user/env/lib/python2.7/site-packages/airflow/config_templates/airflow_local_settings.py
/home/user/env/lib/python2.7/site-packages/airflow/config_templates/airflow_local_settings.pyc
Run:
cp /home/user/env/lib/python2.7/site-packages/airflow/config_templates/airflow_local_settings.py config/log_config.py
Edit the airflow.cfg [core] section:
remote_logging = True
remote_log_conn_id = log_sync
remote_base_log_folder = wasb://airflow-logs@storage-account.blob.core.windows.net/logs/
logging_config_class = log_config.DEFAULT_LOGGING_CONFIG
Add the log_sync connection object as below.
Install the Airflow Azure dependency:
pip install apache-airflow[azure]
Restart the webserver and scheduler.
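As an alternative to creating the connection in the UI, Airflow also reads connections from AIRFLOW_CONN_<CONN_ID> environment variables; the exact URI below is my assumption of the wasb form (storage account name as login, URL-encoded access key as password):

# hypothetical URI; replace the placeholders with your real values
export AIRFLOW_CONN_LOG_SYNC='wasb://<storage-account>:<url-encoded-access-key>@'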
