Upload backup files to Azure File Storage from Ubuntu server - azure

I need to upload my backup files from my Ubuntu server to Azure file storage, but I am unable to do so. Please share any ideas or suggestions.
Thank you in advance!

Do you just want to store your GitLab backup files, or do you want to store and share them?
If you just want to store them, I think you can create Azure Storage blobs to hold the backup files. On Linux, you can install Azure CLI 1.0 or Azure CLI 2.0 and use it to upload files to Azure blobs.
For more information about how to use CLI 1.0 or CLI 2.0 to upload files to Azure, please refer to the link.
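With Azure CLI 2.0, the upload can be sketched roughly like this (a sketch only: the storage account name "mystorage", container name "backups", and file paths are placeholders, and it assumes you have already run `az login`):

```shell
# Create a container to hold the backups (placeholder names throughout)
az storage container create \
    --account-name mystorage \
    --name backups

# Upload one backup file into the container
az storage blob upload \
    --account-name mystorage \
    --container-name backups \
    --name gitlab_backup.tar \
    --file /var/opt/gitlab/backups/gitlab_backup.tar
```

You could run the upload command from a cron job after each backup completes.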
If you want to store and share the backup files, I think you can use the Azure Files service. Azure file shares speak the SMB 3.0 protocol, so you can mount a file share on your Ubuntu server and upload the backup files to it. You can then mount the same file share on other machines to share the backups.
For more information about the Azure Files service, please refer to the link.
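The mount step can be sketched as follows (a sketch under assumptions: the storage account name, share name, and account key are placeholders you would substitute with your own):

```shell
# Install the SMB/CIFS client tools and create a mount point
sudo apt-get install -y cifs-utils
sudo mkdir -p /mnt/backups

# Mount the Azure file share over SMB 3.0 (placeholder account/share/key)
sudo mount -t cifs //mystorage.file.core.windows.net/backups /mnt/backups \
    -o vers=3.0,username=mystorage,password=<storage-account-key>,dir_mode=0777,file_mode=0777,serverino

# Copy the backup files onto the share
cp /var/backups/*.tar /mnt/backups/
```

Once mounted, the share behaves like any local directory, so existing backup scripts need no changes.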

Have you considered using an agent tool to back up the data from Ubuntu to Azure cloud storage? I think it could be a way forward. Have a look at CloudBerry; it may help you. I see no other approach that would not take a lot of time and effort.

Azure File Storage on-premises access across all regions is now supported out of the box on Ubuntu 17.04; no extra setup is needed.
https://azure.microsoft.com/en-us/blog/azure-file-storage-on-premises-access-for-ubuntu/

Related

Azure Databricks integration with Unix File systems

I am looking for help understanding how to integrate a Unix file system with Azure Databricks. I would like to connect to on-premises Unix file systems, access the relevant files, process them through Databricks, and load them into ADLS Gen2.
I understand that if the files are available in DBFS, we should be able to process them. But my requirement is specifically to process files that live on an on-premises Unix file system using Azure technologies such as Azure Databricks or Azure Data Factory.
Any suggestions or help in this regard would be appreciated.
Unfortunately, it is not possible to connect directly to on-premises Unix file systems from Azure Databricks.
However, you can try the following workarounds:
You can upload the files to DBFS and then access them there. Browse DBFS using the UI.
To copy large files, use AzCopy. AzCopy is a command-line utility that you can use to copy blobs or files to or from a storage account.
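An AzCopy upload from the on-premises machine into the storage account can be sketched like this (AzCopy v10 syntax; the account name, container, source path, and SAS token are all placeholders):

```shell
# Copy a local directory of files into a blob container, authenticating
# with a SAS token appended to the destination URL (all names are placeholders)
azcopy copy "/data/onprem" \
    "https://mystorage.blob.core.windows.net/staging?<SAS-token>" \
    --recursive
```

From there, Databricks or Data Factory can read the files out of the storage account.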

Move file from Azure VM to Azure File Storage

I have an Azure VM (Windows Server 2016) with a folder into which a file arrives every 5 minutes.
Now I want to create a Windows service that runs on the Azure VM and, if any file exists, moves it to Azure File Storage.
Could someone advise what needs to be done, or suggest another approach?
As I see it, you have two options:
Mount the File Storage share as a network drive. Once you mount the share as a network drive (and get a drive letter), you can simply use the System.IO namespace to perform IO operations. Please see this link for more details: https://learn.microsoft.com/en-us/azure/storage/files/storage-how-to-use-files-windows.
Use Microsoft's Azure Storage SDK, which is a wrapper over the Azure Storage REST API, to upload files from the local folder to the share in Azure File Storage. Please note that once a file is uploaded to Azure File Storage, you need to delete it from your server manually to achieve a true move operation.
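The copy-then-delete "move" semantics from option 2 can be sketched in shell terms (a sketch only: `/tmp/incoming` stands in for the VM folder and `/tmp/share` for the mounted file share, and the sample file just makes the sketch self-contained):

```shell
# Stand-in directories for the VM folder and the mounted share
SRC=/tmp/incoming
DEST=/tmp/share
mkdir -p "$SRC" "$DEST"
echo demo > "$SRC/sample.csv"        # sample file so the sketch runs end to end

# Move = copy, then delete the original only if the copy succeeded
for f in "$SRC"/*; do
    [ -e "$f" ] || continue
    cp "$f" "$DEST"/ && rm "$f"
done
```

Deleting only after a successful copy ensures a failed upload never loses the source file.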

Microsoft Azure backup VM database every hour

I would like some help with this. What I want to achieve is for the PostgreSQL instance installed in a Docker container on my VM to be backed up every hour, with each backup deleted after 2 hours.
I read about Microsoft Azure blobs, and I think that's what I'm looking for, but the question for now is: how do I transfer the PostgreSQL database backup to a blob? Is there a shell command for that?
I found that the PostgreSQL-Backup tool provides a way to back up PostgreSQL, and it seems to support backing up to Azure Storage, but that feature is not free.
Besides, even if you back up PostgreSQL to Azure Blob storage, Azure Blob storage does not currently provide an auto-delete feature. You could instead mount an Azure file share, use the basic (free) features of the PostgreSQL-Backup tool to back up PostgreSQL to that file share, and then run a WebJob that detects backup files and deletes them programmatically.

Using AzCopy in azure virtual machine

I have an Azure virtual machine that holds some application-specific CSV files (retrieved via FTP from on-premises) that need to be stored in a blob (and will eventually be read and pushed into an Azure SQL DB via a worker role). The question is about pushing the files from the VM to the blob. Is it possible to get AzCopy without installing the SDK to have the files copied to the blob? Is there a better solution than this? Please read the points below for further information.
Points to note:
1) Though the files could be uploaded directly to a blob rather than being pulled into the VM first and copied from there, for security reasons the files have to be pulled into the VM, and this cannot be changed.
2) I also thought about a worker role talking to a VM folder share (via a common virtual network) to pull the files and upload them to the blob, but after reading some blogs this does not appear to be the right solution, as it requires changes to both VMs (the worker role VM and the IaaS VM).
3) Azure File Service is still in preview (?) and hence cannot be used.
Is it possible to get AzCopy without installing the SDK to have the files copied to the blob?
Absolutely yes. You can directly download AzCopy binaries without installing SDK using the following links:
Version 3.1.0: http://aka.ms/downloadazcopy
Version 4.1.0: http://aka.ms/downloadazcopypr
Source: http://blogs.msdn.com/b/windowsazurestorage/archive/2015/01/13/azcopy-introducing-synchronous-copy-and-customized-content-type.aspx
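For the AzCopy versions linked above (the classic 3.x/4.x Windows command-line tool), the upload can be sketched roughly like this (a sketch only: the local folder, storage account, container, and key are placeholders):

```shell
AzCopy /Source:C:\csvfiles /Dest:https://mystorage.blob.core.windows.net/csvcontainer /DestKey:<storage-account-key> /S
```

The `/S` switch recurses into subdirectories, so the whole CSV folder lands in the container in one command.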

How to share Azure Storage Emulator among multiple development pc?

Hi, I am new to Azure development. We are planning to use blobs to store images. At development time, the local storage emulator stores blobs on the local PC. Can we make it shared so that all developers working on this project can use it to store and retrieve those blobs?
I dug around a lot but could not find an answer.
Any help would be highly appreciated.
Thanks in advance.
Hi, the storage emulator just uses LocalDB to save the data.
See these answers: Windows Azure Blob Storage Emulator File Storage Location
DSInit has disappeared on upgrading Azure SDK to 2.3
You can change the save location to a SQL Server instance, which you can host on a server and hence share among others:
WAStorageEmulator init /sqlInstance <shared sql server instance>
