Azure Table Storage and ADO.NET for CRUD operations

Is there any guidance available on how to use ADO.NET for CRUD operations on Windows Azure Table Storage?

Table Storage is sometimes a confusing name, as it has nothing to do with a relational, table-based database such as SQL Server.
If you want to get your hands dirty with Table Storage, follow How to Use the Table Storage Service. Another option is a library such as Simple.Data.Azure, which you can install as a NuGet package.
However, if you want to keep using ADO.NET against a relational database as you do today, take a look at SQL Databases: How to Use SQL Database in .NET applications.
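If you do go the Table Storage route, here is a minimal CRUD sketch in C#. It uses the current Azure.Data.Tables NuGet package rather than the older client covered by the linked article, and the connection string, table name and keys are placeholders you would adjust to your own setup:

    // CRUD against Azure Table Storage with the Azure.Data.Tables package.
    // Connection string, table name and keys below are placeholders.
    using Azure;
    using Azure.Data.Tables;

    var client = new TableClient("<storage-connection-string>", "Customers");
    await client.CreateIfNotExistsAsync();

    // Create
    var entity = new TableEntity(partitionKey: "contoso", rowKey: "cust-001")
    {
        ["Email"] = "jane@example.com",
        ["Active"] = true
    };
    await client.AddEntityAsync(entity);

    // Read
    TableEntity fetched = await client.GetEntityAsync<TableEntity>("contoso", "cust-001");

    // Update (the ETag gives you optimistic concurrency)
    fetched["Active"] = false;
    await client.UpdateEntityAsync(fetched, fetched.ETag);

    // Delete
    await client.DeleteEntityAsync("contoso", "cust-001");

Note that entities are addressed by the PartitionKey/RowKey pair rather than by SQL-style queries, which is the main mental shift coming from ADO.NET.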

Related

How can I implement the Row Level Security feature on external tables in the Microsoft Azure / Microsoft Synapse serverless SQL Pool service?

I am looking at a Data Lake CSV file and want to create an external table in the serverless SQL pool of Microsoft Synapse. The goal is to query this file with Row Level Security constraints in place.
When the external table is created on a dedicated SQL pool, I am able to query the file with Row Level Security constraints in place.
How can I get Row Level Security for external tables on a serverless SQL pool?
Unfortunately, row-level security is not supported in serverless SQL pool at the moment.
Can you please vote for this on our User Voice?
https://feedback.azure.com/forums/307516-sql-data-warehouse?category_id=171048
You can't use the feature as-is; T-SQL support on serverless is limited.
E.g. CREATE FUNCTION isn't supported:
"This syntax is not supported by serverless SQL pool in Azure Synapse Analytics."
You could of course try to DIY using views, which are supported in serverless.
In that setup, Entitlements would become another CSV and EXTERNAL TABLE that you would create.
You'll have to either find the right function to get the current user and/or role for the view's SELECT query, or provide it via some wrapper code from some other place where you maintain your own context.
Disclaimer: I've not done this in Serverless so can't say for sure.
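To make the view idea a bit more concrete, here is a rough, untested sketch: a small C# deployment snippet that creates a view joining the data's external table to a hypothetical Entitlements external table and filtering on the caller's login. All object names, columns and the connection string are invented for the example, and SUSER_SNAME() support should be verified against your serverless pool.

    // Deploys a filtering view to the serverless SQL pool endpoint.
    // Everything named here (tables, columns, connection string) is a placeholder.
    using Microsoft.Data.SqlClient;

    const string createView = @"
    CREATE VIEW dbo.SalesFiltered AS
    SELECT s.*
    FROM dbo.SalesExternal AS s                -- external table over the data CSV
    JOIN dbo.EntitlementsExternal AS e         -- external table over the entitlements CSV
        ON e.Region = s.Region
    WHERE e.UserLogin = SUSER_SNAME();";       // SUSER_SNAME() returns the caller's login

    using var conn = new SqlConnection("<serverless-sql-pool-connection-string>");
    conn.Open();
    using var cmd = new SqlCommand(createView, conn);
    cmd.ExecuteNonQuery();

Users would then be granted access to the view rather than to the underlying external tables.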

Near real-time ETL of Oracle data to Azure SQL

I have an Oracle DB with data that I need to load and transform into an Azure SQL Database. I have no control over either the DB or the application that updates its data.
I'm looking at Azure Data Factory, but I really need data changes in Oracle to be reflected as near to real-time as possible.
I would appreciate any suggestions / insights.
Is ADF the correct tool for the job? If so, what is a good approach to use? If not suitable, what should I consider using instead?
For real-time you don't really want an ELT/ETL tool like ADF. Consider a replication agent like Attunity or (gulp at the licensing costs) GoldenGate.
I don't think Data Factory is a good fit for you. Yes, you can copy data from Oracle to an Azure SQL database with it, but as #Thiago Custodio said, you would need to do it for each table you have. That's too complicated.
Just reference: Copy data from and to Oracle by using Azure Data Factory.
As you said, you really need data changes in Oracle to be reflected in Azure as near to real time as possible.
That means the migration/copy time must be very short, so that the data in Oracle and the Azure SQL database stays the same until the Oracle data changes again. I searched a lot and didn't find any real-time copy tools. Actually, I think what you want is something like 'data sync'.
I found this link, Sync Oracle Database with SQL Azure; hope it gives you some good ideas.
For the data migration or copy, you can use the following:
SQL Server Migration Assistant for Oracle (OracleToSQL)
Azure Database Migration Service (DMS)
Reference tutorial:
Migrating Oracle Databases to SQL Server (OracleToSQL): SQL Server Migration Assistant (SSMA) for Oracle is a comprehensive environment that helps you quickly migrate Oracle databases to Azure SQL database.
How to migrate Oracle to Azure SQL Database with minimum downtime:
Hope this helps.
For the record, we went with a product named Qlik Replicate (aka Attunity) and it is working very well!

How to manage CosmosDB stored procedures, functions and triggers like a SQL DB project

I am developing a SaaS-based application which has a hybrid DB architecture (Azure SQL Server and Azure Cosmos DB).
To manage SQL Server tables, stored procedures, triggers and functions, we create a SQL Database project (.sqlproj); we can also generate a .dacpac and deploy it to the SQL server.
Just as in SQL, we have collections, stored procedures, triggers and functions in Azure Cosmos DB.
How do we manage Cosmos DB collections, procedures and triggers? Is there any project template available for this? Please suggest a way to proceed.
Based on my experience with Cosmos DB, I believe there is no project template available for it, because it is not as straightforward as a SQL DB project.
I suggest storing them as JSON files in your solution's version control and versioning them accordingly.
You could then write the necessary programming logic to execute these scripts / Cosmos DB logic using the SQL API for .NET or another platform. This way you control the collections, UDFs, triggers etc. from your code, and you can version that code accordingly.
More references here: https://learn.microsoft.com/en-us/azure/cosmos-db/programming
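As a rough illustration of that approach, here is a sketch that pushes a stored procedure kept under source control into a container with the Microsoft.Azure.Cosmos .NET SDK's Scripts API; the account endpoint, database/container names and file path are placeholders.

    // Deploy a stored procedure (kept as a .js file in source control) to a container.
    // Endpoint, key, database/container names and the file path are placeholders.
    using System.IO;
    using System.Net;
    using Microsoft.Azure.Cosmos;
    using Microsoft.Azure.Cosmos.Scripts;

    var client = new CosmosClient("<account-endpoint>", "<account-key>");
    Container container = client.GetContainer("appdb", "orders");

    var sprocProps = new StoredProcedureProperties
    {
        Id = "bulkImport",
        Body = await File.ReadAllTextAsync("CosmosScripts/bulkImport.js")
    };

    // Replace the script if it already exists, otherwise create it.
    try
    {
        await container.Scripts.ReplaceStoredProcedureAsync(sprocProps);
    }
    catch (CosmosException ex) when (ex.StatusCode == HttpStatusCode.NotFound)
    {
        await container.Scripts.CreateStoredProcedureAsync(sprocProps);
    }

Triggers and UDFs can be handled the same way through container.Scripts, so a small console tool or build step like this can play roughly the role a .dacpac deployment plays for SQL Server.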
Azure Cosmos DB accounts can also be managed through ARM templates. You can use these to version your databases/collections/etc. See the Microsoft.DocumentDB resource types documentation.

Is it possible to use Cosmos DB instead of Azure SQL Database?

I am very excited to use Cosmos DB in my current application instead of an Azure SQL database.
Before using Cosmos DB as the backend of my current application, I have a few questions in mind:
In my current application I use Entity Framework.
I also use the column encryption and dynamic data masking features.
So, if I move to Cosmos DB instead of Azure SQL Database, how can I achieve those features with Cosmos DB?
The documentation doesn't specify details about encryption, masking and Entity Framework.
Can you please tell me: is it possible to use Cosmos DB with the above requirements instead of Azure SQL Database?
Entity Framework is specific to relational databases, so it doesn't fit with Cosmos DB's document store (or graph, or tables).
Regarding encryption: Cosmos DB provides encryption-at-rest, built-in. There is no per-property data-masking feature built-in; you'd have to do your own data masking.
Whether you migrate to a document (or graph, or table) store is really up to you, and depends on whether you want to re-shape your data to fit such a storage model rather than a relational model. There's no real way to answer that for you. (TL;DR: you cannot just switch from relational to, say, document storage without any changes, as they are fundamentally different storage concepts.)
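For the masking point specifically, a do-it-yourself approach usually means masking fields in your own data-access code after reading the document. Here is a tiny, hypothetical sketch; the Customer type, keys and masking rule are all made up:

    // Read a document with the Cosmos .NET SDK and mask a sensitive field
    // before returning it; the type, keys and masking rule are hypothetical.
    using System.Threading.Tasks;
    using Microsoft.Azure.Cosmos;

    public record Customer(string id, string Name, string Email);

    public static class CustomerReader
    {
        public static async Task<Customer> ReadMaskedAsync(Container container, string id)
        {
            var response = await container.ReadItemAsync<Customer>(id, new PartitionKey(id));
            var customer = response.Resource;
            return customer with { Email = Mask(customer.Email) };
        }

        // e.g. "jane.doe@example.com" -> "j*******@example.com"
        private static string Mask(string email)
        {
            var at = email.IndexOf('@');
            return at <= 1 ? email : email[0] + new string('*', at - 1) + email[at..];
        }
    }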

Upload SQL Database and its Data to Azure

I created an SQL database using ASP.NET Core 1.1 Migrations.
After I created the database I added some data to the database.
What options do I have to upload this database to Azure?
I need to send the schema and the initial data.
Is it possible to run Entity Framework migrations on Azure?
This article describes the possibilities to migrate an existing database to SQL Azure.
https://learn.microsoft.com/en-us/azure/sql-database/sql-database-cloud-migrate
However, in your scenario, going through the steps of a real migration might be overkill.
If your number of tables and data is rather small, why not create a SQL script to create the tables & insert the data?
Connect to your SQL Azure using SQL Server Management Studio and execute the script.
As for Entity Framework: yes, you can run EF migrations against SQL Azure as well.
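One way to do that from code is sketched below, with a placeholder model and connection string: point the context at the Azure SQL database, apply the migrations, then seed the initial data. With the EF Core tooling, running "dotnet ef database update" against the Azure connection string achieves the same schema step.

    // Apply EF Core migrations and seed initial data against the Azure SQL database.
    // AppDbContext, Product and the connection string are placeholders for this sketch.
    using System.Linq;
    using Microsoft.EntityFrameworkCore;

    public class Product
    {
        public int Id { get; set; }
        public string Name { get; set; } = "";
    }

    public class AppDbContext : DbContext
    {
        public AppDbContext(DbContextOptions<AppDbContext> options) : base(options) { }
        public DbSet<Product> Products => Set<Product>();
    }

    public static class Program
    {
        public static void Main()
        {
            var options = new DbContextOptionsBuilder<AppDbContext>()
                .UseSqlServer("<azure-sql-connection-string>")
                .Options;

            using var db = new AppDbContext(options);

            // Runs any pending migrations, creating the schema in Azure SQL.
            db.Database.Migrate();

            // Seed the initial data only once.
            if (!db.Products.Any())
            {
                db.Products.Add(new Product { Name = "Sample" });
                db.SaveChanges();
            }
        }
    }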
