I have a scenario where I need to synchronize multiple on-premises Microsoft SQL Server databases from business datacenters to cloud storage (let's call it blob storage).
In the past, I've used the Azure Data Factory on-premises client to bypass firewall considerations, avoid requiring a VPN, and deliver data directly to Azure Blob Storage.
I need to do the same thing using Google tools (destination: Google Cloud Storage). Is there an equivalent GCP tool that does not require a VPN? If not, any low-priced tool recommendations?
To send files from on-premises to Google Cloud Storage (GCS), you need two things:
the gcloud SDK installed in your local environment
a service account key file. Keep it secret!
Then follow these steps:
Authenticate your service account in the gcloud SDK: gcloud auth activate-service-account <Service Account email> --key-file=<Service Account File>
Extract your data locally
Use gsutil to send the file to the cloud
The connection is authenticated and does not require a VPN.
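A minimal sketch of those three steps, assuming hypothetical names for the service account, key file, bucket, and source table (bcp is just one way to do the local extract):

    # 1. Authenticate the service account (hypothetical account and key file)
    gcloud auth activate-service-account sync-sa@my-project.iam.gserviceaccount.com \
        --key-file=./sync-sa-key.json

    # 2. Extract the data locally, e.g. with SQL Server's bcp utility
    bcp "SELECT * FROM MyDb.dbo.Orders" queryout orders.csv -c -t, -S localhost -T

    # 3. Copy the file to a GCS bucket over HTTPS; no VPN needed
    gsutil cp orders.csv gs://my-sync-bucket/exports/orders.csv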
I am trying to redeploy an Azure Cloud Service (classic) as an "extended support" one, since the former is being deprecated. Following this guide and its prerequisites, I have created a virtual network and a new storage account. I set up a bunch of permissions, and the Connectivity Check for my storage account indicates no problems. However, when I try to create and deploy a new Cloud Service (extended support) using my (updated) .cscfg, .csdef and .cspkg files, I get this error:
Error:AuthorizationFailure, message:This request is not authorized to perform this operation. (Code: PackageUriForbidden)
I've tried setting the container and blob access to public for the deployment files, and I have added the Network Contributor and Storage Blob Data Contributor roles for my user account on both the subscription and the storage resources. What am I missing?
I tried deploying a Cloud Service (extended support) via the Azure Portal and it deployed successfully. I uploaded all my cloud service packages to my storage account, referenced those packages from my storage blobs, and created the Cloud Service (extended support) instance. Because I enabled connections from all networks for my storage account, I did not receive any authorization error.
It looks like your storage account has the firewall and virtual network option enabled for selected networks, or there is an IP address rule restricting the storage account.
To reproduce this, I created a service endpoint in my virtual network to allow Microsoft.Storage, added that virtual network to the selected networks in the storage account's firewall, and checked the option that allows Azure services to access the storage account. Now, when I try to deploy another cloud service with the same storage account having the firewall and virtual network enabled, I get the same error as yours. After allowing my client machine's IP in the storage account, I was able to add the packages without any error while deploying the Cloud Service.
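If you want to script those firewall changes instead of clicking through the portal, an approximate Azure CLI equivalent looks like this (resource names and the IP address are placeholders):

    # Enable the Microsoft.Storage service endpoint on the subnet
    az network vnet subnet update --resource-group my-rg --vnet-name my-vnet \
        --name my-subnet --service-endpoints Microsoft.Storage

    # Allow that subnet through the storage account firewall
    az storage account network-rule add --resource-group my-rg \
        --account-name mystorageacct --vnet-name my-vnet --subnet my-subnet

    # Allow the client machine's public IP so package uploads succeed
    az storage account network-rule add --resource-group my-rg \
        --account-name mystorageacct --ip-address 203.0.113.10

    # Keep trusted Azure services allowed while denying everything else
    az storage account update --resource-group my-rg --name mystorageacct \
        --bypass AzureServices --default-action Deny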
I need an SFTP server hosted on Azure. The important thing is that it should support multiple user accounts and their management at runtime (via an interface or API), i.e. password reset and account blocking/unblocking (preferably user groups as well).
I have found some guides on how to set up an SFTP server on Azure:
https://learn.microsoft.com/en-us/samples/azure-samples/sftp-creation-template/sftp-on-azure/
https://charbelnemnom.com/how-to-deploy-sftp-service-on-microsoft-azure/
Their major drawback is that every change (adding a new user, updating a password, etc.) requires a new deployment, which is not acceptable.
Also, there is an SFTP functionality for the Azure Blob Storage:
https://learn.microsoft.com/en-us/azure/storage/blobs/secure-file-transfer-protocol-support
This functionality is still in preview and only allows adding local users in the Azure portal. Unfortunately, other, more advanced features, such as account blocking, are missing. It is also not possible to manage the user accounts from code.
There are some products on the Azure Marketplace provided by external companies, like Azure SFTP Gateway, which support all the functionality listed above. However, I am not sure about the long-term maintenance of these products, and I couldn't find any information about an SLA or similar guarantees for them.
I would like to ask if there is a reliable resource, or a way to set one up, that could serve as an SFTP server and meet the requirements listed above. If there is no ready-made solution, maybe there is a way to integrate an SFTP server with a database, or something similar?
Presently, we only support SFTP for Azure Blob Storage.
SFTP for Azure Blob Storage supports multiple local user accounts and container-level permissions. You can use the portal, ARM, PowerShell, or the CLI to manage users, permissions, and passwords/keys. You can also disable individual users by turning off their SSH key and password authentication (see the CLI sketch after the links below).
Connect to Azure Blob Storage using SFTP (preview) | Microsoft Docs
az storage account local-user | Microsoft Docs
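As a rough sketch of the management story, the local-user commands below cover user creation, a password "reset" (regeneration), and "blocking" by disabling both authentication methods. The account, user, and container names are placeholders:

    # Create a local user scoped to one container with read/write access
    az storage account local-user create --resource-group my-rg \
        --account-name mystorageacct -n partner1 --home-directory uploads \
        --permission-scope permissions=rw service=blob resource-name=uploads \
        --has-ssh-password true

    # "Reset" the password by regenerating it
    az storage account local-user regenerate-password --resource-group my-rg \
        --account-name mystorageacct -n partner1

    # "Block" the account by disabling SSH key and password authentication
    az storage account local-user update --resource-group my-rg \
        --account-name mystorageacct -n partner1 \
        --has-ssh-password false --has-ssh-key false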
Is this even possible? I have a couple of web apps and a couple of Azure Functions running under the same App Service Plan. I'd like to (ideally) have them use a specific storage account, so I can keep everything in one place. I envision them in different containers under the same account.
If that's not possible...then where are the files? Are they on the storage that's built into the App Service Plan itself? If so, can I connect to this somehow, so I can manage the files through something like Storage Explorer?
Today, when playing with the Azure Az PowerShell module, I found I was able to provision a Function App without an Azure Storage back-end. This cannot be done via the UI; an easy way to provision a Function App with a storage account back-end is to use the Azure portal.
When a Function App is provisioned via the command line, the bits seem to be stored within the Function App itself. An FTP URL is given if you download the publish profile, and the files can be read and written with an FTP tool like WinSCP (as an alternative to Kudu).
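If you'd rather not download the publish profile by hand, one way to pull the FTP endpoint and user name is the Azure CLI; a sketch, assuming a hypothetical app name and resource group:

    # List the publishing profiles and keep only the FTP entry
    az functionapp deployment list-publishing-profiles \
        --name my-function-app --resource-group my-rg \
        --query "[?publishMethod=='FTP'].{url:publishUrl,user:userName}" -o table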
I'd like to (ideally) have them use a specific storage account, so I can keep everything in one place. I envision them in different containers under the same account. If that's not possible... then where are the files?
Every Azure Web App has a home directory stored on (backed by) Azure Storage. For more detail, refer to the Azure Web App sandbox documentation. That storage is owned by the Web App service; we are currently not able to choose the Azure Storage account backing a Web App ourselves. We can, however, configure a storage account for Azure Web App diagnostic logs.
Are they on the storage that's built into the App Service Plan itself? If so, can I connect to this somehow, so I can manage the files through something like Storage Explorer?
Different App Service plans offer different amounts of storage. We can use the Kudu tool (https://yoursite.scm.azurewebsites.net) to manage the files. For more detail about Kudu, please refer to its documentation.
Update:
We can access the home directory with the Kudu tool.
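Beyond the browser UI, Kudu also exposes a REST API (the VFS endpoint) that can be scripted. A sketch, assuming a hypothetical site name and using the deployment credentials from the publish profile:

    # List the wwwroot directory (the trailing slash means "directory")
    curl -u '$yoursite:<deployment-password>' \
        "https://yoursite.scm.azurewebsites.net/api/vfs/site/wwwroot/"

    # Download a single file from the home directory
    curl -u '$yoursite:<deployment-password>' \
        "https://yoursite.scm.azurewebsites.net/api/vfs/site/wwwroot/web.config" \
        -o web.config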
Is it possible to list the cloud services for an Azure storage account without using a certificate thumbprint? And also to get the deployment ID for a particular cloud service?
I have connected to a storage account (using the account name and key) with the Azure Storage client library and listed the tables and containers. My question is: how do I display the cloud services for a particular storage account without using a certificate?
Note: I saw a REST API to list the storage accounts and services using the subscription ID with a certificate.
Is it possible to list the cloud services for an Azure storage account without using a certificate thumbprint? And also to get the deployment ID for a particular cloud service?
To achieve this, you would need to use the Azure Service Management API, and its calls need to be authenticated. Using an X509 certificate is one option (which you don't want to use). The other way is to use Azure Active Directory. You can read more about it here: http://msdn.microsoft.com/en-us/library/azure/ee460782.aspx#bk_ad. However, authenticating API requests using Azure AD is more complicated than using a certificate, IMHO.
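For reference, once you have an Azure AD access token for the management endpoint, the Service Management calls are plain REST. A sketch with curl (the subscription ID and service name are placeholders; the deployment ID is returned in the response's PrivateID element):

    # Acquiring a token for https://management.core.windows.net/ is assumed done
    SUBSCRIPTION_ID="00000000-0000-0000-0000-000000000000"   # placeholder

    # List the cloud (hosted) services; the x-ms-version header is mandatory
    curl -H "Authorization: Bearer $TOKEN" -H "x-ms-version: 2014-06-01" \
        "https://management.core.windows.net/$SUBSCRIPTION_ID/services/hostedservices"

    # Get the production deployment of one service (includes the deployment ID)
    curl -H "Authorization: Bearer $TOKEN" -H "x-ms-version: 2014-06-01" \
        "https://management.core.windows.net/$SUBSCRIPTION_ID/services/hostedservices/my-service/deploymentslots/production"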
Is it possible to use Azure Cloud Storage with Windows Azure websites?
All of the code samples for Cloud Storage that I have found use Azure Cloud Services with a Web role.
I am using embedded RavenDB, so I need Azure Cloud Storage, right?
I am currently using Azure Cloud Services + Azure Cloud Storage.
P.S. This is for a small personal website with almost no traffic.
Azure Cloud Storage (Blobs, Tables, etc.) is an ordinary network service that uses a REST protocol under the hood. The storage services can be accessed from Azure Cloud Services as well as Azure Websites. You can also use Azure Cloud Storage from your on-premises applications (although, due to network latency, it might be a bit slower than from the cloud).
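To illustrate that it is plain REST, a blob can be fetched with nothing but an HTTP client. A sketch with curl, where the account, container, blob name, and SAS token are all hypothetical:

    # Download a blob using a pre-generated SAS token for authorization
    curl -o photo.jpg \
        "https://myaccount.blob.core.windows.net/mycontainer/photo.jpg?<sas-token>"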
The Azure Cloud Storage API (which you probably mentioned) seems to be the best option to use on Azure Websites.
Please note that the same applies to Azure SQL: it can be used from Azure Cloud Services as well as Azure Websites (and from on-premises applications, too).