Azure Services - Data Security

May I ask what security protocol (HTTPS, TCP/IP, etc.) applies in the following scenarios in Azure? I need these details to write my design document.
Between Azure Services
Azure Data Factory interacting with Azure Storage
Azure Databricks interacting with Azure Storage
Azure Python SDK connecting to Storage Account (Is it TCP/IP?)
If there is any support page in MS Azure, please direct me there.

Inside Azure data centers, TLS/SSL is used for communication between services; you can read about it in the "Encryption of data in transit" section on this page.
The main SDK implementations are wrappers around the REST API, and the Python SDK is one of them.
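For illustration, here is a minimal Python sketch (assuming the azure-storage-blob and azure-identity packages, with a placeholder account name) showing the SDK talking to a storage account; every call it makes goes out as a REST request over HTTPS/TLS:

    # Minimal sketch: the Python SDK wraps the Blob REST API, so each call below
    # is issued as an HTTPS (TLS) request. "mystorageacct" is a placeholder name.
    from azure.identity import DefaultAzureCredential
    from azure.storage.blob import BlobServiceClient

    credential = DefaultAzureCredential()  # managed identity, service principal, CLI login, etc.
    service = BlobServiceClient(
        account_url="https://mystorageacct.blob.core.windows.net",  # note the https:// scheme
        credential=credential,
    )

    # Listing containers translates into HTTPS GET requests against the Blob REST API.
    for container in service.list_containers():
        print(container.name)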

Data Ingestion Patterns in Data Factory using REST API

I am reaching out to gather best practices around ingesting data from various possible APIs into Blob Storage. I am considering interfacing with all of these APIs using Data Factory, and here are the API scenarios I currently have:
1. Ingesting from an API service within the same resource group in the Azure cloud
2. Ingesting from an API service in a different resource group but within the same Azure subscription
3. Ingesting from an API service in a different Azure subscription / VNET
4. Ingesting from an API service available publicly, such as Twitter or Facebook
5. Ingesting from an API service which is available on-premises
6. Any other possible API services
My questions around the above API services are:
a) What are the specific security-related settings I need to take care of in order to interface with the above APIs (managed identity, service principal, etc.)?
b) When should I use which security setting?
c) Along with Azure Data Factory, is there any other Azure service which can be leveraged for the above ingestion from the APIs?
d) What are the specifics of runtimes / linked services which I should be taking care of?
e) Any specifics around AAD resource and authentication type?
Just wanted to add that for #5, you will have to use a SHIR (self-hosted integration runtime). Please read about this here.
For c) Along with Azure Data Factory, is there any other Azure service which can be leveraged for the above ingestion from the APIs?
Logic Apps has connectors baked in for many existing APIs.
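To make a) and b) a bit more concrete, here is a rough Python sketch, not a definitive pattern: it uses DefaultAzureCredential (which resolves to a managed identity when running inside Azure, or a service principal via environment variables) to write an API response into Blob Storage. The API endpoint, account, container, and blob names below are placeholders.

    # Hedged sketch: pull JSON from a (hypothetical) REST API and land it in Blob Storage.
    # DefaultAzureCredential picks up a managed identity when running inside Azure, or a
    # service principal when AZURE_CLIENT_ID / AZURE_CLIENT_SECRET / AZURE_TENANT_ID are set.
    import requests
    from azure.identity import DefaultAzureCredential
    from azure.storage.blob import BlobClient

    API_URL = "https://api.example.com/v1/records"               # placeholder source API
    ACCOUNT_URL = "https://mystorageacct.blob.core.windows.net"  # placeholder storage account

    # 1. Call the source API (public APIs such as Twitter would use their own auth scheme).
    response = requests.get(API_URL, timeout=30)
    response.raise_for_status()

    # 2. Write the raw payload to Blob Storage using the Azure AD credential.
    blob = BlobClient(
        account_url=ACCOUNT_URL,
        container_name="raw-ingest",
        blob_name="records/latest.json",
        credential=DefaultAzureCredential(),
    )
    blob.upload_blob(response.content, overwrite=True)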

Migrate Qlik reports to Azure

I did some research and found that there are some options to migrate on-premises QlikView reports to Azure Data Lake using the IaaS approach.
Is there a PaaS component for QlikView in Azure?
As of today, there is no PaaS component in Azure for QlikView. We have to go for the IaaS option while migrating.
I can't find any Azure PaaS component to bind Qlik and ADLS, except using the REST API from this link. You can refer to that link for the details of REST API authentication.
Here are some third-party tools to implement the transmission:
1. Dremio: https://www.dremio.com/, which supports an ADLS connector.
2. Panoply: https://panoply.io/integrations/azure-blob-storage/, which supports an Azure Blob Storage connector. As a next step, you could move data from Azure Blob Storage into ADLS with an ADF copy activity.
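If you end up going the REST API / SDK route mentioned above, a rough Python sketch could look like the following, assuming the azure-storage-file-datalake package; the account, filesystem, and file names are placeholders.

    # Hedged sketch: push an exported file into ADLS Gen2 with the Python SDK, which
    # wraps the same REST API referenced above. All names below are placeholders.
    from azure.identity import DefaultAzureCredential
    from azure.storage.filedatalake import DataLakeServiceClient

    service = DataLakeServiceClient(
        account_url="https://mydatalake.dfs.core.windows.net",
        credential=DefaultAzureCredential(),
    )

    filesystem = service.get_file_system_client("qlik-exports")
    file_client = filesystem.get_file_client("reports/sales_report.qvd")

    # Upload a locally exported QlikView file into the data lake.
    with open("sales_report.qvd", "rb") as data:
        file_client.upload_data(data, overwrite=True)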

Clarification regarding storage account in web applications

I have an on-premises MVC application with database calls to another server.
When I deploy this application to Windows Azure, what will be stored in the storage account for this cloud service?
Is it database records or something else?
Given you mentioned creating a Cloud Service (so, I'm assuming Web Role for your MVC app): The deployment needs a storage account, at a minimum, for storing diagnostic log information, as well as your cloud service package and configuration.
A storage account is mostly used for "blob" storage. In an Azure environment we should not store blob data (like images and doc/PDF files) in the database; the best practice is to store a link to the blob instead.
Azure Storage provides the flexibility to store and retrieve large amounts of unstructured data, such as documents and media files, with Azure Blobs; structured NoSQL data with Azure Tables; reliable messages with Azure Queues; and SMB-based Azure Files for migrating on-premises applications to the cloud.
For an overview and reference: http://azure.microsoft.com/en-in/documentation/services/storage/
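As a small illustration of the "store the blob, keep only its link" practice, here is a rough Python sketch; the connection string, container, and file names are placeholders.

    # Hedged sketch: upload an image to Blob Storage and keep only its URL for the database.
    from azure.storage.blob import BlobServiceClient

    service = BlobServiceClient.from_connection_string("<storage-connection-string>")
    container = service.get_container_client("user-uploads")

    with open("profile.jpg", "rb") as image:
        blob_client = container.upload_blob(name="images/profile.jpg", data=image, overwrite=True)

    # Persist only this URL in your database record, not the binary itself.
    print(blob_client.url)  # e.g. https://<account>.blob.core.windows.net/user-uploads/images/profile.jpg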

How can I get the data from the Azure monitoring tab using the SDK (or PowerShell)?

I'm trying to monitor all our resources in a single place, and after reading the MSDN pages on monitoring web apps, cloud services, and Azure SQL databases, I can't seem to understand how to query Azure (through the SDK or PowerShell) to give me the same data I can see in the Azure monitoring page.
Is there some programmatic way to get this data?
I'm afraid there isn't an easy answer to this. There isn't one API where you can fetch all the data: it's mostly service specific.
A couple of pointers:
Metrics for web sites, web/worker roles and VMs can be accessed through the metrics API. See here: https://convective.wordpress.com/2014/06/22/using-azure-monitoring-service-with-azure-virtual-machines/
Metrics for SQL Database are available through the sys.resource_stats and sys.dm_db_resource_stats DMVs.
Windows Azure Diagnostics (for web/worker roles) performance counters can be fetched through the table storage API, in the WADPerformanceCountersTable table.
There are probably APIs for other services as well. I've built a tool myself that fetches data from a couple of these services, and can both plot it and expose it through a unified API; see: https://github.com/WadGraphEs/AzurePlot
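For the Windows Azure Diagnostics case, a rough Python sketch of reading those counters out of table storage might look like this, assuming the azure-data-tables package and a placeholder connection string; the counter name in the filter must match whatever your WAD configuration actually collects.

    # Hedged sketch: read WAD performance counters straight from the diagnostics storage
    # account's WADPerformanceCountersTable via the table storage API.
    from azure.data.tables import TableServiceClient

    service = TableServiceClient.from_connection_string("<diagnostics-storage-connection-string>")
    table = service.get_table_client("WADPerformanceCountersTable")

    # Filter on a specific counter; adjust to the counters your WAD config collects.
    query = "CounterName eq '\\Processor(_Total)\\% Processor Time'"
    for entity in table.query_entities(query):
        print(entity["RoleInstance"], entity["CounterName"], entity["CounterValue"])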

cocos2d-x connection to Windows Azure Storage

I am writing an application using cocos2d-x. Now I want to store some data in Windows Azure Storage and retrieve it later; how can I do that?
As written, it's difficult to answer such a broad question. Having said that: I'll do my best to give you an objective answer describing Azure's storage options from a service perspective.
Azure Mobile Services. This gives you a CRUD interface to storage and is built to provide a REST-based API which fronts storage. It defaults to SQL Database, but you can easily override this by creating your own custom API and using server-side JavaScript / Node.js to read/write to any storage system.
Azure blobs/tables/queues. This is the collective set of Azure large-scale storage, with up to 200 TB per account namespace. You can access storage directly from your game, or through your own service tier - that's up to you. You do need to think about security, as you don't want your blobs exposed publicly unless you intend them to be. Fortunately you can use something called a Shared Access Signature to grant access to your app while keeping these resources private to the rest of the world (see the sketch after this list).
SQL Database. Azure provides database-as-a-service, largely compatible with SQL Server. As long as you have a proper connection string, it's just like having a local database.
3rd-party hosted solutions. There are companies that host data services in Azure, such as ClearDB (MySQL) and MongoLab (MongoDB).
One other option: Custom database solutions. If you're not using a built-in or 3rd-party storage service, you can always install a database server within a Virtual Machine. You're now managing the server, but this would give you ultimate choice.
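To make the Shared Access Signature idea from the blobs/tables/queues option concrete, here is a rough server-side Python sketch (account, container, blob names, and key are placeholders); the cocos2d-x client would simply fetch the resulting URL over HTTPS, so no storage key ever ships inside the game binary.

    # Hedged sketch: generate a short-lived, read-only SAS URL for one blob on your server.
    from datetime import datetime, timedelta, timezone
    from azure.storage.blob import BlobSasPermissions, generate_blob_sas

    sas_token = generate_blob_sas(
        account_name="mygamestorage",
        container_name="savegames",
        blob_name="player42/save.json",
        account_key="<storage-account-key>",
        permission=BlobSasPermissions(read=True),
        expiry=datetime.now(timezone.utc) + timedelta(hours=1),
    )

    # Hand this URL to the game client; it is valid for one hour and grants read-only access.
    url = (
        "https://mygamestorage.blob.core.windows.net/"
        f"savegames/player42/save.json?{sas_token}"
    )
    print(url)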
