Table Replication and Synchronization in Azure

I am pretty new to the Azure cloud and stuck at a place where I want to replicate one table into another database with the same schema and table name.
By replication I mean that the new table in the other database should automatically be synced with the original table. I can do this using an elastic table, but the queries are taking way too long and sometimes getting timed out, so I am thinking of having a local table in the other database instead of an elastic table, but I am not sure how I can do this in Azure.
Note: Both databases reside on the same DB server.
Any examples or links will be helpful.
Thanks

To achieve this you can use a DACPAC (data-tier application package). A DACPAC can be created in Visual Studio or extracted from an existing database. It contains the database creation scripts and manages your deltas for you. More information can be found here. For information about how to build and deploy a DACPAC, both from VS and extracted from a database, see this answer.
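For illustration, here is a minimal sketch of that extract-and-publish flow, assuming the cross-platform sqlpackage tool is installed and on your PATH; the server, database and credential values are placeholders:

```python
import subprocess

# Placeholder connection strings -- substitute your own server, databases and credentials.
SOURCE = "Server=tcp:myserver.database.windows.net;Database=SourceDb;User Id=admin;Password=secret;"
TARGET = "Server=tcp:myserver.database.windows.net;Database=TargetDb;User Id=admin;Password=secret;"

# Extract the schema of the source database into a .dacpac file.
subprocess.run(
    ["sqlpackage", "/Action:Extract",
     f"/SourceConnectionString:{SOURCE}",
     "/TargetFile:source.dacpac"],
    check=True,
)

# Publish the .dacpac to the target database; sqlpackage computes and applies the schema deltas.
subprocess.run(
    ["sqlpackage", "/Action:Publish",
     "/SourceFile:source.dacpac",
     f"/TargetConnectionString:{TARGET}"],
    check=True,
)
```

Note that this keeps the two schemas in step; it does not copy rows or keep the data continuously synced.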

Related

How to transfer data from old database to new modified database in django?

I have an old Django project and a new Django project. I created a dump file from the old project's database, and I have also changed some tables and created new ones.
Now I want to load that dump file into my new Django app, but I am facing errors whether I migrate first and then restore the data, or restore first and then migrate.
When I migrate first, it says the tables already exist.
When I restore first, it says django.db.utils.ProgrammingError: relation "django_content_type" already exists.
When I use migrate --fake the error goes away, but the new tables are not created in the database.
I spent 3-4 days but could not succeed.
Please help me if you can.
PS: my database is PostgreSQL
This is not straightforward, will need some manual intervention, and depends on what you want to do in the future.
If the tables that already exist in the database have a stable design and won't be changed, or you can make the changes manually using SQL statements, then set managed = False in the models' Meta; this will make Django skip creating migrations for those models.
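As a minimal sketch of that option, with placeholder model and table names:

```python
from django.db import models

class Student(models.Model):
    # Columns mirror the existing table in the old database.
    name = models.CharField(max_length=100)
    email = models.EmailField()

    class Meta:
        managed = False       # Django will not create, alter or drop this table in migrations
        db_table = "student"  # name of the pre-existing table
```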
If you want to keep the power of migrations in the new project for all models, then this will be more complex:
Delete all your migrations
You need to make your models match your database; you can set managed = False for models such as Users.
Run python manage.py makemigrations; this will create the structure of the initial database.
Fake-run the migrations: python manage.py migrate --fake
Dump the records of the django_migrations table.
Create a new empty migration (with --empty) and add the SQL statements of the django_migrations table to it using migrations.RunSQL() (a sketch of such a migration is shown after these steps).
Now fake again so you skip that new migration.
Now you are ready to use migrations as usual.
When installing a new database, you will just need to run python manage.py migrate
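For the empty-migration step above, the idea is roughly the following; the app name, migration number and the SQL are placeholders, and the real SQL would be the INSERT statements you dumped from your old django_migrations table:

```python
# myapp/migrations/0002_restore_migration_history.py
# created with: python manage.py makemigrations myapp --empty
from django.db import migrations

# Placeholder -- paste the dumped INSERT statements for django_migrations here.
RESTORE_MIGRATION_HISTORY = """
INSERT INTO django_migrations (app, name, applied)
VALUES ('someoldapp', '0001_initial', NOW());
"""

class Migration(migrations.Migration):

    dependencies = [
        ("myapp", "0001_initial"),
    ]

    operations = [
        migrations.RunSQL(RESTORE_MIGRATION_HISTORY),
    ]
```

On the existing database this migration is then faked (the step above), so the SQL only actually runs when the whole chain is applied to a fresh database.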

Azure Synapse Polybase/External tables - return only latest file

We have files partitioned in the data lake and are using an Azure Synapse SQL serverless pool to query them using external tables before visualising in Power BI.
Files are stored in the following partition format {source}/{year}/{month}/{filename}_{date}.parquet
We then have an external table that loads all files for that source.
For files that increment each day this is working great, as we want all files to be included. However, we have some integrations where we want to return only the latest file (i.e. the latest file sent to us is the current state that we want to load into Power BI).
Is it possible in the external table statement to only return the latest file? Or do we have to add extra logic?
We could load all the files in, and then filter for the latest filename and save that in a new location. Alternatively we could try to create an external table that changes every day.
Is there a better way to approach this?
If you are using dedicated pools, then I would alter the location of your external table to point at the folder with the latest files.
Load every day into a new folder and then alter the LOCATION of the external table to look at the current/latest day, but you might need additional logic to track the latest successful load date in a control table.
Unfortunately I have not found a better way to do this myself.
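As a sketch of that daily repointing, assuming a dedicated pool reached through pyodbc, a folder per day, and placeholder names for the table, data source, file format and columns; on dedicated pools the usual way to change the LOCATION is to drop and recreate the external table:

```python
from datetime import date
import pyodbc

# Placeholder endpoint and credentials -- replace with your dedicated SQL pool details.
conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=tcp:mysynapse.sql.azuresynapse.net,1433;"
    "Database=mypool;UID=loader;PWD=secret;",
    autocommit=True,
)
cursor = conn.cursor()

# In practice this date would come from the control table of successful loads,
# not simply from today's date.
latest_folder = f"/mysource/{date.today():%Y/%m/%d}/"

# Drop and recreate the external table so it points at the latest folder.
cursor.execute(
    "IF OBJECT_ID('dbo.MySourceLatest') IS NOT NULL DROP EXTERNAL TABLE dbo.MySourceLatest;"
)
cursor.execute(f"""
    CREATE EXTERNAL TABLE dbo.MySourceLatest (
        Id INT,
        Payload NVARCHAR(4000)
    )
    WITH (
        LOCATION = '{latest_folder}',
        DATA_SOURCE = MyDataLake,      -- placeholder external data source
        FILE_FORMAT = ParquetFormat    -- placeholder parquet file format
    );
""")
conn.close()
```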

Continuous Integration of SQL Server in Visual Studio Online

I was just about to set up continuous integration of SQL Server scripts with VSTS. I have two script files in my Visual Studio 2015 database project.
createStudentTable.sql => simple create table script
Script.PostDeployment1.sql => :r .\createStudentTable.sql (pointing to the above script)
Now, after the successful build in Visual Studio Online, I suddenly noticed that a .dacpac file is also created.
My database has around 100 tables plus views and stored procedures. Does this .dacpac file contain the entire schema details? If so, wouldn't it be a huge overhead to carry this .dacpac with every build?
Please advise.
The dacpac file only contains the schema model definition of your database; it does not contain any table data unless you add insert statements to the post-deployment script (Script.PostDeployment1.sql).
The overhead of a dacpac is that it compares the model in the dacpac against your target database when the actual deployment happens.
This is a trade-off. If you don't use a dacpac, you will end up handling all the database versioning and version migrations yourself, either manually or with another tool that makes managing those database changes with ALTER statements somewhat easier.
BTW, a scale of 100 tables is handled well by a dacpac.

Is Core Data migration really needed?

I have a SQLite database with two columns that is bundled in the app. There is no write or save interaction with the database; it is fixed and read-only. I read some documents and tutorials about lightweight/manual migration, and all make it clear that you have to preserve the user's data when migrating. That is not my case: I don't need to preserve user data, and I will deploy a new app version with a new database. I want to add two new attributes to my database and use them in the app. Why do I have to migrate? Why can't I just delete the three old SQLite database files, add the new one, and use the new attributes as needed? I tried that and it did not work. Can anyone give me the steps to make the app recognize the new database?
Actually, deleting the SQLite database files is the right way.
But you have to do that before the Core Data stack is initialized.

Bulk Load Files into SQL Azure?

I have an ASP.NET app that takes multi-megabyte file uploads, writes them to disk, and later loads them into MSSQL 2008 with BCP.
I would like to move the whole thing to Azure, but since there are no "files" for BCP, can anyone comment on how to get bulk data from an Azure app into SQL Azure?
I did see http://msdn.microsoft.com/en-us/library/system.data.sqlclient.sqlbulkcopy.aspx but am not sure if that applies.
Thanks.
BCP is one way to do it.
This post explains it in three easy steps:
Bulk insert with Azure SQL
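As a rough sketch of the BCP route, assuming the bcp command-line utility is available and using placeholder server, database, table, credential and file names:

```python
import subprocess

# Placeholder values -- replace with your Azure SQL server, database, table, credentials and data file.
subprocess.run(
    ["bcp", "dbo.Uploads", "in", r"C:\uploads\batch_0001.dat",
     "-S", "tcp:myserver.database.windows.net",
     "-d", "MyDatabase",
     "-U", "loader",
     "-P", "secret",
     "-c"],              # character-format data file; use -n for native format
    check=True,
)
```

This can run from any machine that can reach the SQL Azure server; inside the application itself, the SqlBulkCopy route described in the next answer avoids shelling out to an external tool.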
You are on the right track. The Bulk Copy API will work; I am using it, and it's the fastest way to import data because it uses INSERT BULK statements (not to be confused with the BULK INSERT statement, which is not supported in SQL Azure). In essence, BCP and the SqlBulkCopy API use the same method.
See http://www.solidq.com/sqj/Pages/2011-May-Issue/Migrating-Data-into-Microsofts-Data-Platform-SQL-Azure.aspx for a detailed analysis of the options available.
I think it's important to note that BCP cannot handle source files that are Unicode while using a format file to do your import.
