Connecting QlikView to Azure Data Lake Store Gen 1

I am working on a new requirement and need to connect Qlik Sense and QlikView to Azure Data Lake Store Gen 1. I have searched a lot but couldn't find any useful information confirming connectivity between Qlik and Microsoft Azure Data Lake.

We are deploying QlikView to Azure and received the following assistance from Microsoft, so hopefully it leads you in the correct direction:
https://help.qlik.com/en-US/sense/September2018/Subsystems/PlanningQlikSenseDeployments/Content/Sense_Deployment/Azure-architecture.htm
Azure deployment
In a Microsoft Azure deployment, you install Qlik Sense Enterprise on an Azure cloud infrastructure that is flexible, high-performance, and quick to set up.
Deploying Qlik Sense Enterprise on Azure will enable you to quickly add new applications in a simple and scalable manner. You can do this with a basic knowledge of Azure security and scalability options, but without the need to follow complex on-premises installation and configuration procedures. Using Azure will enable you to get your Qlik Sense infrastructure up and running in a fraction of the time required for an on-premises deployment, and will enable you to scale your deployment quickly and easily, regardless of unexpected changes in demand.
You can deploy Qlik Sense to Azure manually, or you can use a Virtual Hard Disk (VHD) available in the Azure Marketplace that includes Qlik Sense preinstalled. However, predefined images do not include a file share, so they can only support single-node Qlik Sense deployments.
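Coming back to the connectivity part of the question: no native Qlik connector for ADLS Gen1 is mentioned above, so one hedged workaround is to stage data out of the lake with the azure-datalake-store Python SDK and let a Qlik load script read the exported file from disk. This is only a sketch under the assumption that a service principal has access to the store; the tenant, store name, and paths below are placeholders.

```python
# Sketch: stage a file out of Azure Data Lake Store Gen1 so Qlik can load it locally.
# Assumes: pip install azure-datalake-store, plus a service principal with access
# to the store. Tenant, client id/secret, store name, and paths are placeholders.
from azure.datalake.store import core, lib, multithread

token = lib.auth(
    tenant_id="<tenant-guid>",
    client_id="<app-registration-id>",
    client_secret="<app-secret>",
)

# Connect to the Gen1 account (store_name is the account name without the domain suffix).
adl = core.AzureDLFileSystem(token, store_name="<adls-gen1-account>")

# List what is available, then pull a file down for Qlik to load from disk.
print(adl.ls("/curated/sales"))
multithread.ADLDownloader(
    adl,
    rpath="/curated/sales/sales_2018.csv",  # path in the data lake
    lpath="C:/QlikData/sales_2018.csv",     # local path a Qlik load script can read
    nthreads=4,
    overwrite=True,
)
```

Keeping the extract step outside Qlik like this also makes it straightforward to schedule the refresh independently of the Qlik reload task.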

Related

When should we use an Azure file share as compared to Azure Blobs?

Could someone please give some examples of when to use an Azure file share instead of Azure Blobs? Whenever I search the internet, all I find is that it can be mounted or that it follows the SMB protocol, but I still don't see a single case where an Azure file share is the right choice.
I also looked at When to use Azure blob storage versus Azure file share?
- That is a similar question, but it doesn't answer mine.
Azure provides a variety of storage tools and services, including Azure Storage. To determine which Azure technology is best suited for your scenario, see Review your storage options in the Azure Cloud Adoption Framework.
For detailed information and examples refer to this article: https://learn.microsoft.com/en-us/azure/storage/common/storage-introduction
It depends mostly on your use case and how you plan to access the data. If you simply want to mount and access your files, Azure Files will be your best fit. If you are looking for the lowest cost and want to access your data programmatically through your application, Azure Blob Storage would be a better fit. Both are accessible through the portal or Azure Storage Explorer.
I also recommend this Learn module which covers the difference in data types and solutions.
Additional information: Azure Blob Storage vs Azure File Storage
Cost details of Azure Blob Storage pricing & Azure Files pricing
In short: if you...
- have an application that needs to store or access files in the cloud, use Blob Storage
- need a file share that can be used by, for instance, a server, use File Shares
Azure Files shares can be mounted concurrently by cloud or on-premises deployments of Windows, Linux, and macOS. Azure Files shares can also be cached on Windows Servers with Azure File Sync for fast access near where the data is being used.
This means a File Share is, somewhat simplified, similar to a network share you would have in a local environment.
Azure Blob Storage helps you create data lakes for your analytics needs, and provides storage to build powerful cloud-native and mobile apps. Optimize costs with tiered storage for your long-term data, and flexibly scale up for high-performance computing and machine learning workloads.
This means Blob Storage is what you need when you're building powerful cloud-native and mobile apps.
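To make the difference concrete on the programmatic side, here is a minimal sketch assuming the azure-storage-blob and azure-storage-file-share Python packages; the connection string, container, and share names are placeholders. Blob Storage is addressed per object through the SDK or REST, while a file share keeps SMB semantics and can equally be mounted as a drive.

```python
# Sketch: the same small payload written to Blob Storage vs. an Azure file share.
# Assumes: pip install azure-storage-blob azure-storage-file-share.
# Connection string, container, share, and file names are placeholders.
from azure.storage.blob import BlobServiceClient
from azure.storage.fileshare import ShareFileClient

CONN_STR = "<storage-account-connection-string>"
data = b"hello from the app"

# Blob: object storage addressed as container/blob; "folders" are just name prefixes.
blob_service = BlobServiceClient.from_connection_string(CONN_STR)
blob_client = blob_service.get_blob_client(container="app-data", blob="logs/hello.txt")
blob_client.upload_blob(data, overwrite=True)

# File share: real directories and SMB semantics; the same share can be mounted
# as a network drive by Windows, Linux, or macOS clients.
file_client = ShareFileClient.from_connection_string(
    CONN_STR, share_name="team-share", file_path="hello.txt"
)
file_client.upload_file(data)
```

The mount-versus-SDK distinction is usually the deciding factor: an existing application that only knows ordinary file paths can use a mounted Azure Files share with no code changes, whereas Blob Storage expects you to go through the SDK or REST API.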

Is it possible to use Azure Data Factory on-premise without data running through the cloud?

Is it possible to use Azure Data Factory on-premises without letting the data run through the cloud? I know Talend has a product where the data is transferred only on our machines and not through the cloud.
I read the documentation on Microsoft.com but didn't find any useful information.
You may use a self-hosted integration runtime to transfer data entirely through your on-premises infrastructure, as long as both the data source and sink are on-premises.
However, the control flow will still happen through the cloud, even if the data itself never leaves your data center. For this reason, setting up a self-hosted integration runtime will still require outbound network access from your infrastructure to Azure.
Check out this piece of documentation for more information: https://learn.microsoft.com/en-us/azure/data-factory/create-self-hosted-integration-runtime?tabs=data-factory#command-flow-and-data-flow
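For reference, the self-hosted integration runtime is a resource you register in the factory and then link to from the machine in your data center. A minimal sketch, assuming the azure-identity and azure-mgmt-datafactory packages (exact model names can vary between SDK versions); subscription, resource group, factory, and runtime names are placeholders.

```python
# Sketch: register a self-hosted integration runtime and fetch its auth key.
# Assumes: pip install azure-identity azure-mgmt-datafactory.
# Subscription, resource group, factory, and runtime names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    IntegrationRuntimeResource,
    SelfHostedIntegrationRuntime,
)

adf = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Create the runtime definition in the factory (this is the cloud-side control plane).
adf.integration_runtimes.create_or_update(
    "<resource-group>",
    "<factory-name>",
    "OnPremIR",
    IntegrationRuntimeResource(
        properties=SelfHostedIntegrationRuntime(description="Runs inside our data center")
    ),
)

# The auth key is what you paste into the integration runtime installer on the
# on-premises machine. After that, copies between two on-premises stores execute
# on that machine, and only control messages flow to Azure.
keys = adf.integration_runtimes.list_auth_keys("<resource-group>", "<factory-name>", "OnPremIR")
print(keys.auth_key1)
```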

What does the Create Web App + Database option offer in the Azure portal

I'm looking at creating a web app in the Azure portal and came across this option: Create Web App + Database.
My question is: if I select the DB engine to be SQL Azure, how big is the database?
Also, what's the difference between the Basic and Standard hosting plans?
Thanks in advance.
How big is the database?
The SQL Azure option refers to the serverless Azure SQL Database offering. Source
Like most of Azure's managed SQL offerings, it scales up based on how much data you put into it over time, but the storage limit appears to be 2 TB. Source
Also, what's the difference between the Basic and Standard hosting plans?
This is pretty explicitly addressed on the App Service Pricing page. Among other differences, the Standard plan comes with more disk space for your app and supports auto-scaling of the underlying resources.
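If you want to confirm the configured maximum size once the app and database exist, you can query it directly. A hedged sketch using pyodbc; the server, database, and credentials are placeholders.

```python
# Sketch: check the configured max size of an Azure SQL Database.
# Assumes: pip install pyodbc, plus the ODBC Driver 17 for SQL Server installed.
# Server, database, and credentials are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=<server-name>.database.windows.net;"
    "DATABASE=<database-name>;"
    "UID=<sql-user>;PWD=<sql-password>"
)

# DATABASEPROPERTYEX reports the database's configured size cap in bytes.
row = conn.cursor().execute(
    "SELECT DATABASEPROPERTYEX(DB_NAME(), 'MaxSizeInBytes')"
).fetchone()
print(f"Configured max size: {int(row[0]) / 1024**3:.0f} GiB")
```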

Azure backup lifecycle management

Looking for a link on Azure Backup lifecycle management; a help file will also do, or any guidance on how to design backup lifecycle management.
The Azure Backup service provides simple, secure, and cost-effective solutions to back up your data and recover it from Azure. You can back up anything from on-premises data to Azure file shares and VMs, including Azure PostgreSQL databases.
To start off, you can learn more about Azure Backup here. This article summarizes Azure Backup architecture, components, and processes. While it’s easy to start protecting infrastructure and applications on Azure, you must ensure that the underlying Azure resources are set up correctly and being used optimally in order to accelerate your time to value.
To learn more about the capabilities of Azure Backup and how to efficiently implement solutions that better protect your deployments, see the detailed guidance and best practices for designing your backup solution on Azure.
For additional reading, also refer to some Frequently asked questions about Azure Backup.

Copy files from on-prem to azure

I'm new to the Azure ecosystem. I'm doing some research on copying data from on-prem to Azure. I found the following options:
AzCopy
Azure Data Factory (Copy Data Tool)
Data Management Gateway
Ours is a Microsoft shop, so I'm looking for tools that gel with the MS platform. Also, down the line, we want to automate the entire thing as much as we can, so I think Azure Storage Explorer is out of the question. Is there a preference among the above three? Or are there any better tools?
I think you are mixing things up: the Copy Data Tool is just an Azure Data Factory wizard for setting up simple data movement between resources. Azure Data Factory uses the Data Management Gateway to reach on-premises resources such as files and databases.
What you want to do can be done with Azure Data Factory. I recommend using version 2 (even in its preview version) because its authoring experience is easier to understand if you are new to the tool. You can graphically configure linked services, datasets, and pipelines from there.
I hope this helped, if you need further help just ask away!
If you're already familiar with SSIS, there's also the option to run SSIS in ADF, which enables on-prem data access via a VNet.
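If automation is the main goal and the target is Blob Storage, another option alongside AzCopy and ADF is a small script using the azure-storage-blob package that any scheduler can run. A sketch with placeholder connection string and container names:

```python
# Sketch: recursively upload a local folder to Blob Storage for scheduled, scripted copies.
# Assumes: pip install azure-storage-blob; connection string and container are placeholders.
from pathlib import Path
from azure.storage.blob import ContainerClient

SOURCE_DIR = Path(r"D:\exports")  # on-prem folder to copy
container = ContainerClient.from_connection_string(
    "<storage-account-connection-string>", container_name="landing"
)

for path in SOURCE_DIR.rglob("*"):
    if path.is_file():
        # Keep the folder structure as the blob name, e.g. exports/2018/10/file.csv
        blob_name = path.relative_to(SOURCE_DIR.parent).as_posix()
        with path.open("rb") as data:
            container.upload_blob(name=blob_name, data=data, overwrite=True)
        print(f"uploaded {blob_name}")
```

AzCopy does the same job from the command line; a script like this is simply easier to extend when you later want logging, filtering, or integration with the rest of your automation.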
