I am using the DAC framework's Import/Export Service Client tool to export a BACPAC file from a SQL Database on Azure. But when I restore the BACPAC file to an Azure database, it loses the comment on a stored procedure or view that was originally entered above the CREATE statement. Is this because the export tool ignores comments outside of the CREATE VIEW or CREATE PROCEDURE statement?
/*
XXXX Procedure
Version 1.0, */
CREATE Procedure [dbo].XXXX
I am not familiar with DAC, but I think this is expected behavior. Comments are not used at runtime; if they are removed, performance can be slightly improved because the engine does not need to parse them.
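One workaround worth trying (this is my assumption, not documented behavior of the export tool): move the comment inside the procedure body. The body has to be reproduced verbatim for the procedure to keep working, so a comment placed there is far more likely to survive the export than one sitting above the CREATE statement. A rough sketch, with a made-up body:
-- Hypothetical example: the version comment is inside the module text,
-- so any tool that preserves the procedure's behavior keeps it as well.
CREATE PROCEDURE [dbo].[XXXX]
AS
BEGIN
    /*
    XXXX Procedure
    Version 1.0
    */
    SET NOCOUNT ON;
    -- actual procedure body goes here
END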
Best Regards,
Ming Xu.
Related
I am pretty new to the Azure cloud and am stuck at a point where I want to replicate one table into another database with the same schema and table name.
By replication I mean that the new table in the other database should automatically stay in sync with the original table. I can do this using an elastic (external) table, but the queries are taking far too long and sometimes time out, so I am thinking of having a local table in the other database instead of an elastic table. I am not sure how I can do this in Azure.
Note: both databases reside on the same database server.
Any examples or links would be helpful.
Thanks
To achieve this you can use a DACPAC (data-tier application package). A DACPAC can be created in Visual Studio or extracted from an existing database; it contains the database creation scripts and manages your deltas for you. More information can be found here. For information about how to build and deploy a DACPAC, both from Visual Studio and by extracting it from a database, see this answer.
I have not been able to find a solution to this, so I will ask the experts.
A co-worker has a .txt file on his laptop that we want to load into Azure SQL DB using SSMS and BULK INSERT. We can open the local file easily enough, but we don't know how to reference the file in the FROM clause.
Assuming a file named myData.txt is saved to
c:\Users\Someone
how do we tell Azure SQL DB where that file is?
You don't. :) You have to upload the file to Azure Blob Storage and then, from there, you can use BULK INSERT or OPENROWSET to read it.
https://learn.microsoft.com/en-us/sql/t-sql/statements/bulk-insert-transact-sql?view=sql-server-2017
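A minimal sketch of that flow, assuming the file has already been uploaded to a blob container and that you have a SAS token for it (the storage account, container, credential, and table names below are all placeholders):
-- A database master key must already exist before creating the credential.
CREATE DATABASE SCOPED CREDENTIAL MyBlobCredential
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
     SECRET = '<sas-token-without-the-leading-question-mark>';

CREATE EXTERNAL DATA SOURCE MyBlobStore
WITH (
    TYPE = BLOB_STORAGE,
    LOCATION = 'https://mystorageaccount.blob.core.windows.net/mycontainer',
    CREDENTIAL = MyBlobCredential
);

-- Load the uploaded file into an existing table.
BULK INSERT dbo.MyTable
FROM 'myData.txt'
WITH (
    DATA_SOURCE = 'MyBlobStore',
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
);
OPENROWSET(BULK ...) can read from the same external data source if you want to query the file without loading it into a table first.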
I've written an article that describes the steps to open a JSON file here:
https://medium.com/@mauridb/work-with-json-files-with-azure-sql-8946f066ddd4
I fixed this problem by loading the file into a local database and then using a linked server to my Azure DB to insert or update the records. Much easier than setting up Blob Storage. However, if the file is very big or you have a lot of files to upload, you might not want to use my method, as a linked server is not the quickest connection.
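For reference, a rough sketch of that linked-server route, run from the on-premises SQL Server (the server, login, database, and table names here are invented):
-- Create a linked server that points at the Azure SQL database.
-- (Newer installations may use the MSOLEDBSQL provider instead of SQLNCLI.)
EXEC master.dbo.sp_addlinkedserver
     @server     = N'AzureDbLink',
     @srvproduct = N'',
     @provider   = N'SQLNCLI',
     @datasrc    = N'myserver.database.windows.net',
     @catalog    = N'MyAzureDb';

EXEC master.dbo.sp_addlinkedsrvlogin
     @rmtsrvname  = N'AzureDbLink',
     @useself     = 'FALSE',
     @rmtuser     = N'mylogin',
     @rmtpassword = N'mypassword';

-- Load the text file into a local staging table first.
-- (The path must be visible to the local SQL Server service, not just to the laptop.)
BULK INSERT dbo.StagingTable
FROM 'c:\Users\Someone\myData.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

-- Then push the rows across the linked server into the Azure table.
INSERT INTO [AzureDbLink].[MyAzureDb].[dbo].[MyTable]
SELECT * FROM dbo.StagingTable;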
I am using the Xamarin Azure SDK to download and manage the local database for my Xamarin.Forms app.
We are facing download time issues because we have a lot of data, so I am thinking of taking a backup of the SQLite file from one device once and restoring that same SQLite file on other devices.
The plan is to use Azure Blob Storage to store the backup of the SQLite file; each other device would then download that blob and restore it locally.
Any help will be appreciated.
Thanks :)
An approach I have used in the past is to create a controller method on the Azure end which the client app can call. It generates a pre-filled SQLite database, or "snapshot", on the server (making sure you include all the extra Azure tables and columns) and then returns a download URL for the file to the client. We also zip up the snapshot database to reduce download times. You could store this snapshot in Azure Blob Storage if you wanted.
Please refer to the link below. The only thing SQLite does not support here is relationships such as foreign keys.
Memory Stream as DB
You can upload the backup file to Blob Storage together with the respective user details, and when a call comes in with the same user details you can download it from the blob.
These links provide the code and background needed to use Azure Blob Storage with Xamarin:
https://learn.microsoft.com/en-us/azure/storage/blobs/storage-xamarin-blob-storage
https://learn.microsoft.com/en-us/azure/storage/blobs/storage-dotnet-how-to-use-blobs
As this question is very general, I can only provide these general links. There are many details to implementing this in your case; if you face a specific problem, I recommend asking another question with an exact description of that problem.
EDIT: According to your comment, you are having problems replacing the local file. The only requirement is that you replace it before you initialize SQLite; otherwise it is a simple file operation.
Kofax Capture Version 9
I have an existing Project and Batch class that works, built previously by a Kofax engineer.
What I need to do is change the script in the project to use a new DB connection. This seemed simple enough.
Using Project Builder I copied the existing project, altered the script, and saved the project. Using Capture Administration I copied the existing batch class, then used Synchronize Kofax Transformation Project and pointed it at the new project. All of this seemed to work without error.
However, the script being executed is the original, not my altered one. Any guidance would be great.
Make sure you are creating a new batch after publishing your change. The batch class update function works in very limited scenarios, so I don't generally recommend it.
There are many ways that a database connection might be handled in script. Usually I would expect that a function at the project script level handles the connection and is called from any sub class, but you might want to check any sub classes to make sure they aren't using locally defined connection strings.
Even if you are making a connection in script (which you've now changed), you might also be using product features that use databases. Open Project Settings and check the Databases tab.
If there are relational databases listed, simply change as needed.
If you are actually using "Remote Fuzzy" databases then these might be using Kofax Search and Matching Server which connects to a relational database to build the fuzzy db. In this case you would need to use KSMS Admin to change the connection on the KSMS server.
If you are using "Local Fuzzy" databases then the info is based on the content of a text file. You might have some external process (possibly Markview) that dumps this text file from a database.
I have an ASP.NET app that takes multi-megabyte file uploads and writes them to disk, and later MSSQL 2008 loads them with BCP.
I would like to move the whole thing to Azure, but since there are no "files" for BCP, can anyone comment on how to get bulk data from an Azure app into SQL Azure?
I did see http://msdn.microsoft.com/en-us/library/system.data.sqlclient.sqlbulkcopy.aspx but am not sure if that applies.
Thanks.
BCP is one way to do it.
This post explains it in three easy steps:
Bulk insert with Azure SQL
You are on the right track. The bulk copy API will work; I am using it, and it's the fastest way to import data because it uses INSERT BULK statements. Not to be confused with the BULK INSERT statement, which is not supported in SQL Azure. In essence, BCP and the SqlBulkCopy API use the same method.
See http://www.solidq.com/sqj/Pages/2011-May-Issue/Migrating-Data-into-Microsofts-Data-Platform-SQL-Azure.aspx for a detailed analysis of the options available.
I think it's important to note that BCP cannot handle Unicode source files when you are using a format file to do the import.