Visual Studio database project two-way synonyms - visual-studio-2012

We have a few databases running together, with synonyms between them, including two-way synonyms between databases. VS database projects can't seem to handle these. Two-way synonyms don't work in VS: I can only reference another database project one way, otherwise there's a circular reference. I tried creating a snapshot of the database project in VS, but to take a snapshot I need to build the project, and to build the project I need to reference the other database project, which doesn't compile because it doesn't recognise the synonyms, and so on. It seems multiple databases (on the same server) with two-way synonyms on each other are too complicated for VS to manage. Has anyone managed to get something like this to work?

Use the SqlPackage command line to create the dacpac (it's a bit more forgiving of cross-database references than the GUI), then add those dacpacs as database references.
There's a section here about using SqlPackage to extract a dacpac from an existing database:
http://schottsql.blogspot.com/2012/10/ssdt-importing-existing-database.html
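For example, extracting a dacpac from an existing database is a single command; the server, database, and file names here are placeholders:
SqlPackage.exe /Action:Extract /SourceServerName:"MyServer" /SourceDatabaseName:"OtherDb" /TargetFile:"OtherDb.dacpac"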
I've written about external references here:
http://schottsql.blogspot.com/2012/10/ssdt-external-database-references.html
We have a lot of cross-DB dependencies, and once we get past the initial builds or start from a restored DB, we don't have any issues with the references.

Related

Continuous Integration of SQL Server in Visual Studio Online

I was just about to set up continuous integration of SQL Server scripts with VSTS. I have two script files in my Visual Studio 2015 database project.
createStudentTable.sql => simple create table script
Script.PostDeployment1.sql => :r .\createStudentTable.sql (pointing to the above script)
Now after a successful build in Visual Studio Online I suddenly noticed that a .dacpac file is also created.
My database has around 100 tables plus views and stored procedures. Does this .dacpac file contain the entire schema? If so, it would be a huge overhead to carry this .dacpac with every build.
Please advise.
A dacpac contains only the schema model definition of your database; it does not contain any table data unless you add insert statements to the post-deployment script.
The real overhead of a dacpac comes at deployment time, when the model in the dacpac is compared against the target database.
This is a trade-off: if you don't use a dacpac, you end up managing database versions and version migrations yourself, either manually or with another tool that makes change management with ALTER statements somewhat easier.
By the way, a scale of 100 tables is handled well by dacpac.
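As a rough sketch, deploying a dacpac is a single publish command, and the model comparison described above happens inside this step (server and database names are placeholders):
SqlPackage.exe /Action:Publish /SourceFile:"MyDatabase.dacpac" /TargetServerName:"MyServer" /TargetDatabaseName:"MyDatabase"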

Kofax project and batch class

Kofax Capture Version 9
I have an existing Project and Batch class that works, built previously by a Kofax engineer.
What I need to do is change the script in the project to use a new DB connection. This seemed simple enough.
Using project builder I copied the existing project, altered the script and saved the project. Using Capture Administration I copied the existing batch class and then used Synchronize Kofax Transformation Project and pointed to the new project. All this seemed to work without error.
However, the script being executed is the original, not my altered one. Any guidance would be great.
Make sure you are creating a new batch after publishing your change. The batch class update function works in very limited scenarios, so I don't generally recommend it.
There are many ways that a database connection might be handled in script. Usually I would expect that a function at the project script level handles the connection and is called from any sub class, but you might want to check any sub classes to make sure they aren't using locally defined connection strings.
Even if you are making a connection in script (which you've now changed), you might also be using product features that use databases. Open Project Settings and check the Databases tab.
If there are relational databases listed, simply change as needed.
If you are actually using "Remote Fuzzy" databases then these might be using Kofax Search and Matching Server which connects to a relational database to build the fuzzy db. In this case you would need to use KSMS Admin to change the connection on the KSMS server.
If you are using "Local Fuzzy" databases then the info is based on the content of a text file. You might have some external process (possibly Markview) that dumps this text file from a database.

How should I go about using a temporarily changed copy of a DLL locally when it's been checked in to TFS?

We have a Libraries folder where we keep third-party DLLs and our own utility DLLs for all applications to reference. I want to do development against one of our utility DLLs and an application that consumes it at the same time. But if I check out the library DLL to change it for temporary local use, TFS insists on checking it out exclusively, which trips other people up. I understand the reasoning behind it doing that (hard/impossible to merge a DLL, so two people shouldn't be working on one at the same time), but I just want to mess with my local copy while I'm working on the library it represents.
I suppose I could delete my application's reference to the DLL and recreate the reference pointing to some other place, but of course this just begs for me to forget and check it in like that, which would obviously be bad. Not to mention that this is a pain in the neck.
How should I proceed in such a situation?
You are using a server workspace, which does not allow editing outside TFS. TFS 2012 introduced local workspaces, which do not set the read-only flag on files, so you are free to edit at will.
You can change your existing workspace in a few clicks: http://msdn.microsoft.com/en-us/library/bb892960.aspx
You could just go into the file system and mark the file as writeable. Once you are happy the binary is good you could check it out, copy the new version of the file over and check it back in again. TFS marks binary files like this as locked for good reason, as you can't merge them in the way you can with textual content.
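Clearing the read-only flag from the command line is a one-liner (the path here is a placeholder):
attrib -R Libraries\MyUtility.dll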
The best approach would be to use a NuGet repository to manage your binary dependencies, instead of relying on binaries checked into source control.
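As a sketch, packaging and publishing a utility library with the NuGet command line might look like this (the project name and feed URL are placeholders):
nuget pack MyUtility.csproj -Build -Properties Configuration=Release
nuget push MyUtility.1.0.0.nupkg -Source http://internal-feed/nuget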

How do you handle code promotion in a Sharepoint environment?

In a typical enterprise scenario with in-house development, you might have dev, staging, and production environments. You might use SVN to contain ongoing development work in a trunk, with patches being stored in branches, and your released code going into appropriately named tags. Migrating binaries from one environment to the next may be as simple as copying them to middle-ware servers, GAC'ing things that need to be GAC'ed, etc. In coordination with new revisions of binaries, databases are updated, usually by adding stored procedures, views, and adding/adjusting table schema.
In a Sharepoint environment, you might use a similar version control scheme. Custom code (assemblies) ends up in features that get installed either manually or via various setup programs. However, some of what needs to be promoted from dev to staging, and then onto production might be database content that supports the custom code bits.
If you've managed an enterprise Sharepoint environment, please share thoughts on how you manage promotion of code and content changes between environments, while protecting your work and your users, and keeping your sanity.
I assume that when you talk about database content you are referring to the actual contents of a site or a list.
Probably the best way to do this is to use the stsadm import and export commands to export and import content from one environment to another. (Don't use backup/restore when going from one environment to another.)
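For example, exporting from one environment and importing into another looks roughly like this (URLs and file names are placeholders):
stsadm -o export -url http://dev/sites/mysite -filename mysite.cmp
stsadm -o import -url http://staging/sites/mysite -filename mysite.cmp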
For any file changes (assemblies, aspx) you can use Features and then keep track of the installers. You would install the feature and do an upgrade to push changes.
There's no easy way to sync the data. You can use the stsadm import/export commands as John pointed out, but this may not be straightforward, especially if the servers are configured differently.
There's also Data Sync Studio product (http://www.simego.net/DataSync_Studio.aspx) you can try.
Depending on what form the database content takes, I would keep its creation in code so it's all in one place (your Visual Studio project) and can also be managed via source control. Deployment of the content could be via a console application or, even better, a feature receiver.
You might also like to read this blog post and look at the tool mentioned there for another approach.
The best resource I can point you to is Eric's paper:
http://msdn.microsoft.com/en-us/library/bb428899.aspx
I was part of a team working to better the story around development of WSS and MOSS solutions with TFS, but I don't know where that stands.

What is a good deployment tool for websites on Windows?

I'm looking for something that can copy (preferably only changed) files from a development machine to a staging machine and finally to a set of production machines.
A "what if" mode would be nice as would the capability to "rollback" the last deployment. Database migrations aren't a necessary feature.
UPDATE: A free/low-cost tool would be great, but cost isn't the only concern. A tool that could actually manage deployment from one environment to the next (dev->staging->production instead of from a development machine to each environment) would also be ideal.
The other big nice-to-have is the ability to only copy changed files - some of our older sites contain hundreds of .asp files.
@Sean Carpenter, can you tell us a little more about your environment? Should the solution be free? Simple?
I find robocopy to be pretty slick for this sort of thing. Wrap it up in a batch file and you are good to go. It's a glorified xcopy, but deploying my website isn't really hard: just copy out the files.
As far as rollbacks go: you are using source control, right? Just pull the old source out of there. Or, in your batch file, ALSO copy the deployment to another folder called website yyyy.mm.dd so you have a lovely folder ready to go in an emergency.
Look at the for command for details on how to get the parts of the date; the sketch below shows one way.
robocopy.exe
for /?
Yeah, it's a total "hack" but it moves the files nicely.
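A rough sketch of such a batch file, with placeholder paths (and assuming a US-style "Day MM/DD/YYYY" %DATE% format for the timestamp):
@echo off
rem deploy.bat - rough sketch; all paths are placeholders
rem Build a yyyy.mm.dd stamp from %DATE% (locale-dependent)
for /f "tokens=2-4 delims=/ " %%a in ("%DATE%") do set STAMP=%%c.%%a.%%b
rem Keep a dated copy of the live site for emergency rollback
robocopy \\webserver\wwwroot D:\backups\website-%STAMP% /MIR
rem Push new and changed files to the server
robocopy C:\build\output \\webserver\wwwroot /MIR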
For some scenarios I have used a freeware product called SyncBack.
It provides complex, multi-step file synchronization (file system or FTP, with compression and so on). The program has a nice graphical user interface, and you can define profiles and group/execute them together.
You can set filters on file types, names, etc., and execute commands or programs after the job runs. A job log is also provided as an HTML report, which can be emailed to you if you schedule the job.
There is also a professional version of the software, but for common tasks the freeware should do fine.
You don't specify if you are using Visual Studio .NET, but there are a few built-in tools in Visual Studio 2005 and 2008:
Copy Website tool -- basically a visual synchronization tool, it highlights files and lets you copy from one to the other. Manual, built into Visual Studio.
aspnet_compiler.exe -- lets you precompile websites (a sample invocation is below).
Of course you can create a web deployment package and deploy as an MSI as well.
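For instance, precompiling a site to an output folder looks like this (the paths are placeholders):
aspnet_compiler -p C:\src\MySite -v / C:\build\MySite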
I have used a combination of CruiseControl.NET, NAnt and MSBuild to compile, swap out configuration files for specific environments, and copy the files to a build output directory. Then we had another NAnt script to do the file copying (and run database scripts if necessary).
For a rollback, we would save all prior deployments, so theoretically rolling back just involved redeploying the last working build (and restoring the database).
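As an illustration, the file-copy step was then just a one-line NAnt invocation from the build server (the build file, property, and target names here are hypothetical):
nant -buildfile:deploy.build -D:environment=staging deploy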
We used UnleashIt (unfortunate name, I know), which was nicely customizable and allowed you to save profiles for deploying to different servers. It also has a "backup" feature which will back up your production files before deployment, so rollback should be pretty easy.
I've given up trying to find a good free product that works.
I then found Microsoft's SyncToy 2.0, which, while lacking in options, works well.
BUT I need to deploy to a remote server.
Since I connect with Terminal Services, I realized I can select my local hard drive when I connect, and then in Explorer on the remote server I can open \\tsclient\S\MyWebsite.
I then use SyncToy with that path and synchronize it with my server. It seems to work pretty well and fast so far.
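SyncToy 2.0 also ships with a command-line runner, so the synchronization can be scripted once a folder pair is set up (the pair name here is a placeholder):
SyncToyCmd.exe -R "MyWebsite"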
Maybe rsync plus some custom scripts will do the trick.
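For example, a dry run first (the "what if" mode asked about) and then the real copy of only changed files (the paths and host are placeholders):
rsync -avz --delete --dry-run ./site/ user@staging:/var/www/site/
rsync -avz --delete ./site/ user@staging:/var/www/site/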
Try RepliWeb. It handles full rollback to previous versions of files. I used it whilst working for a client who demanded its use, and I've become a big fan of it, particularly for:
Rollback to previous versions of code
Authentication and rules for different user roles
Deploy to multiple environments
Full reporting to the user via email/logs stating what has changed, what the current version is, etc.
