Deploy two database projects with different schemas to a single database - visual-studio-2012

I have two database projects. Both have different schemas (dbo and nts). I want to deploy both projects to a single database. When I publish my nts project, the stored procedures are created as [nts].[StoredProcedureName]. When I then publish the dbo project to the same database, the previous nts stored procedures get deleted. If I reverse the order (dbo first, then nts), my nts stored procedures end up with dbo as their schema. I am new to databases and SQL. I tried going into the database and adding the nts schema, but it doesn't solve my problem. Let me know if you need any other information.

When you publish your database projects, make sure these settings in the Advanced publish options are set as follows:
Always re-create database = unchecked
Drop objects in target but not in source = unchecked
Side note: You should consider combining the two projects into one. A single database project can support multiple schemas.
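For reference, those same two options map to SqlPackage publish properties if you ever script the deployment outside Visual Studio. A minimal sketch, assuming SqlPackage is on the PATH; the .dacpac name and connection string are placeholders:

import subprocess

# Minimal sketch: publish one project's .dacpac with the two Advanced publish
# options above expressed as SqlPackage publish properties. The file name and
# connection string below are placeholders.
subprocess.run(
    [
        "SqlPackage",
        "/Action:Publish",
        "/SourceFile:nts.dacpac",  # built output of the nts database project
        "/TargetConnectionString:Server=.;Database=MyDb;Integrated Security=True;",
        "/p:CreateNewDatabase=False",       # "Always re-create database" unchecked
        "/p:DropObjectsNotInSource=False",  # keep the other project's objects
    ],
    check=True,
)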

Related

How to transfer data from an old database to a new, modified database in Django?

I have an old Django project and a new Django project. I created a dump file from the old project's database, and I have also changed some tables and created new ones.
Now I want to load that dump file into my new Django app. I am facing errors whether I migrate first and then restore the data, or restore first and then migrate.
When I migrate first, it says the tables already exist.
When I restore first, it says django.db.utils.ProgrammingError: relation "django_content_type" already exists
If I use migrate --fake the error goes away, but the new tables are not created in the database.
I spent 3-4 days but could not succeed.
Please help me if you can.
PS: my database is PostgreSQL
This is not straightforward; it will need some manual intervention, and it depends on what you want to do in the future.
If the tables that already exist in the database have a stable design and won't be changed, or if you can make any changes manually with SQL statements, then set managed = False in those models' Meta. This makes Django skip creating migrations for those models (see the sketch below).
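For example (a hypothetical model; the class and table names are only illustrative), an unmanaged model might look like this:

from django.db import models

class LegacyOrder(models.Model):
    # Hypothetical model mapped onto a table that already exists in the
    # database; with managed = False, Django will not create, alter, or drop
    # this table in migrations.
    order_number = models.CharField(max_length=32)
    created_at = models.DateTimeField()

    class Meta:
        managed = False            # skip schema management for this model
        db_table = "legacy_order"  # name of the existing table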
If you want to keep the power of migrations in the new project for all models, then it is more complex:
Delete all your migrations.
Make your models equivalent to your database; you can set managed = False for some models, such as Users.
Run python manage.py makemigrations; this will create the structure of the initial database.
Fake-apply the migrations with python manage.py migrate --fake.
Dump the records of the django_migrations table.
Create a new empty migration (with --empty) and add the SQL statements for the django_migrations rows to it using migrations.RunSQL() (a sketch follows after these steps).
Fake again so you skip that new migration.
Now you are ready to use migrations as usual.
When installing a new database, you will just need to run python manage.py migrate.
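A rough sketch of the empty-migration step above, assuming you dumped the rows of django_migrations as INSERT statements; the app name and values are placeholders:

# yourapp/migrations/0002_restore_migration_history.py
# created with: python manage.py makemigrations yourapp --empty
from django.db import migrations

class Migration(migrations.Migration):

    dependencies = [
        ("yourapp", "0001_initial"),
    ]

    operations = [
        # Re-insert the rows dumped from the old database's django_migrations
        # table; the single row below is a placeholder.
        migrations.RunSQL(
            """
            INSERT INTO django_migrations (app, name, applied)
            VALUES ('yourapp', '0001_initial', NOW());
            """,
            reverse_sql=migrations.RunSQL.noop,
        ),
    ]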

Table Replication and Synchronization in Azure

I am pretty new to the Azure cloud and stuck at a point where I want to replicate one table into another database with the same schema and table name.
By replication I mean that the new table in the other database should automatically stay synced with the original table. I can do this using an elastic table, but the queries are taking way too long and sometimes getting timed out, so I am thinking of having a local table in the other database instead of an elastic table, but I am not sure how I can do this in Azure.
Note: Both databases reside on the same DB server.
Any examples or links will be helpful.
Thanks
To achieve this you can use a DACPAC (data-tier application package). A DACPAC can be created in Visual Studio or extracted from an existing database. It contains the database creation scripts and manages your deltas for you. More information can be found here. For information about how to build and deploy a DACPAC both from Visual Studio and by extracting it from a database, see this answer.
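As a rough sketch (placeholder server, database names, and credentials; SqlPackage assumed to be installed), extracting the schema from the source database and publishing it to the other database on the same server could look like this:

import subprocess

# Extract the schema of the source database into a .dacpac. The connection
# strings and file name are placeholders.
subprocess.run(
    [
        "SqlPackage",
        "/Action:Extract",
        "/SourceConnectionString:Server=myserver.database.windows.net;Database=SourceDb;User ID=myuser;Password=mypassword;",
        "/TargetFile:SourceDb.dacpac",
    ],
    check=True,
)

# Publish that .dacpac to the target database, creating or updating the table
# there. Note this deploys the schema; it does not copy or sync the table data.
subprocess.run(
    [
        "SqlPackage",
        "/Action:Publish",
        "/SourceFile:SourceDb.dacpac",
        "/TargetConnectionString:Server=myserver.database.windows.net;Database=TargetDb;User ID=myuser;Password=mypassword;",
    ],
    check=True,
)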

Continuous Integration of SQL Server in Visual Studio Online

I was just about to set up continuous integration of SQL Server scripts with VSTS. I have two script files in my Visual Studio 2015 database project.
createStudentTable.sql => simple create table script
Script.PostDeployment1.sql => :r .\createStudentTable.sql (pointing to the above script)
Now after a successful build in Visual Studio Online I suddenly noticed that a .dacpac file is also created - see this screenshot:
Now my database has around 100 tables plus views and stored procedures. Does this .dacpac file contain the entire schema details? If so, it would be a huge overhead to carry this .dacpac with every build.
Please advise.
A dacpac file only contains the schema model definition of your database; it does not contain any table data unless you add insert statements in the post-deployment script.
The overhead of a dacpac is that it compares the model in the dacpac with your target database when the actual deployment happens (see the sketch below).
This is a trade-off. If you don't use a dacpac, you will end up managing all the database versions and version migrations yourself, manually or with another tool that makes those database changes with ALTER statements somewhat easier.
BTW, a scale of 100 tables can be handled well by a dacpac.
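For example, you can see exactly what that comparison would change before deploying by asking SqlPackage for a deploy report; a minimal sketch with placeholder paths and connection string:

import subprocess

# Minimal sketch: generate an XML report of the changes SqlPackage would make
# by comparing the dacpac's model with the target database. Paths and the
# connection string are placeholders.
subprocess.run(
    [
        "SqlPackage",
        "/Action:DeployReport",
        "/SourceFile:MyDatabaseProject.dacpac",
        "/TargetConnectionString:Server=.;Database=MyDb;Integrated Security=True;",
        "/OutputPath:deploy_report.xml",
    ],
    check=True,
)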

Is Core Data migration really needed?

I have a SQLite database with two columns that is bundled in the app. There is no write or save interaction with the database; it is fixed and read-only. I read some documents and tutorials about lightweight/manual migration, and they all make it clear that you have to preserve the user's data when migrating. That is not my case: I don't need to save user data, and I will deploy a new app version with a new database. I want to add two new attributes to my database and use them in the app. Why do I have to migrate? Why can't I just delete the three old SQLite database files, add the new one, and use the new attributes as needed? I tried that and it did not work. Can anyone give me the steps to make the app recognize the new database?
Actually, deleting the SQLite database files is the right way.
But you have to do that before the Core Data stack is initialized.

Kofax project and batch class

Kofax Capture Version 9
I have an existing Project and Batch class that work, built previously by a Kofax engineer.
What I need to do is change the script in the project to use a new DB connection. This seemed simple enough.
Using project builder I copied the existing project, altered the script and saved the project. Using Capture Administration I copied the existing batch class and then used Synchronize Kofax Transformation Project and pointed to the new project. All this seemed to work without error.
However, the script being executed is the original, not my altered one; any guidance would be great.
Make sure you are creating a new batch after publishing your change. The batch class update function works in very limited scenarios, so I don't generally recommend it.
There are many ways that a database connection might be handled in script. Usually I would expect that a function at the project script level handles the connection and is called from any sub class, but you might want to check any sub classes to make sure they aren't using locally defined connection strings.
Even if you are making a connection in script (which you've now changed), you might also be using product features that use databases. Open Project Settings and check the Databases tab.
If there are relational databases listed, simply change as needed.
If you are actually using "Remote Fuzzy" databases then these might be using Kofax Search and Matching Server which connects to a relational database to build the fuzzy db. In this case you would need to use KSMS Admin to change the connection on the KSMS server.
If you are using "Local Fuzzy" databases then the info is based on the content of a text file. You might have some external process (possibly Markview) that dumps this text file from a database.
