I am trying to copy data from specific tables in an Azure SQL database into new tables of the same structure in another existing SQL database. I don't want a full backup, and both Azure SQL databases are on the same Azure SQL server. I get security errors when I try, so I think I might be doing something wrong. Are there any tools or automated ways to achieve this in Azure?
Try using SQL Server Management Studio to generate a script of the data, then copy it from table A in database A to table A in database B, as described in the steps below.
Right-click the source database and select "Generate Scripts...".
Click "Next" to skip the introduction. Choose "Select specific database objects", tick the desired table (here, "Newsletters") and click "Next".
Now comes the important "Advanced" part: choose whether to output the script to a file, to the clipboard, or to a new query window. Then click "Advanced" and, under the option "Types of data to script", select "Data only". Click "OK".
Select the "Types of data to script" as needed...
Now confirm the script generation with "Next".
The script has been generated. Click "Finish" to close the wizard. You will get the generated script in a new query window in the background.
Open a new query window in the destination database.
Then copy the generated script into that query window connected to the destination database and execute it (F5).
Check the result: all the data should now be in the table in the destination database.
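For reference, the "Data only" script that the wizard produces is essentially a batch of INSERT statements that you run against the destination database. A rough sketch of what it looks like for the "Newsletters" example (the column names and values below are placeholders; your generated script will use the real ones):

    -- One INSERT per source row; the IDENTITY_INSERT lines appear only if the table has an identity column
    SET IDENTITY_INSERT [dbo].[Newsletters] ON;
    INSERT INTO [dbo].[Newsletters] ([Id], [Title], [SentDate])
        VALUES (1, N'January issue', N'2016-01-05');
    INSERT INTO [dbo].[Newsletters] ([Id], [Title], [SentDate])
        VALUES (2, N'February issue', N'2016-02-03');
    SET IDENTITY_INSERT [dbo].[Newsletters] OFF;

Because the script contains data only, the table must already exist in the destination database; if it doesn't, pick "Schema and data" instead of "Data only" in the same dialog.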
The company that I work for recently moved all of our medical claims into a SQL Server database, and we now need software to retrieve these records and export them to a CSV or TXT file. Presently, our claims table has 11 million records, and it grows daily.
Can this be done in Power BI Desktop, the Power BI service, desktop Excel, or SQL Server Management Studio? I know that worksheets in desktop Excel are capped at about 1 million rows, but we're thinking of loading the data into a data model and exporting that way (perhaps with DAX Studio).
Thank you.
You don't need Power BI or DAX Studio; exporting to CSV can be done entirely within SQL Server Management Studio.
Right-click the database -> Tasks -> Export Data...
Select your data source, probably SQL Server Native Client, and enter the server name, authentication details, and database name.
On the Destination step, choose Flat File Destination, enter a file name and location, and select CSV as the file type.
Select "Copy data from one or more tables or views".
Select the table to export and configure the CSV output as needed.
On the next screen, choose Run Immediately (you can also save a re-runnable package from here) and click Finish.
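As a side note, if this export needs to be repeated or scheduled, the same result can also be scripted from the command line with the bcp utility that ships with SQL Server, rather than clicking through the wizard each time. A rough sketch (server name, database, column list, and output path are all placeholders):

    rem Export the claims table in character mode with comma delimiters, using Windows authentication
    bcp "SELECT ClaimID, MemberID, ServiceDate, PaidAmount FROM ClaimsDB.dbo.Claims" queryout "C:\exports\claims.csv" -c -t"," -S YourSqlServer -T

Note that bcp does not write a header row, and that you would use -U/-P instead of -T to connect with SQL authentication.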
As of now, I have done this using the 'Advanced Editor' option for each table. I need to change the database for the whole project itself.
The data is in the cloud only.
Please post your solution.
Thanks
From File -> Options and settings -> Data source settings, click the "Change Source..." button and enter new values for Server and Database.
I'm working on an Excel report that collects data from a local copy of a SQL Server database on my machine, using Power Query to retrieve the data. The results are then loaded into a Power Pivot data model. I've now finished development and want to put the report into production on the customer's server. The Excel workbook must connect to the new database with a SQL Server database user (not integrated security). I had hoped I could change the database connection properties on the Data tab, but there is no easy way to change the connection string to the new server there. Right now I can't see any option other than going through every Power Query query and changing it manually. I hope you Power Query experts have a nice explanation and an example of how I can solve this.
Hope to hear from you soon
Regards Geir F
There isn't a great solution for bulk server rename today, but we're very aware of the customer demand! (I can't promise anything about upcoming features, but at some point in the past I heard the dev team discuss this feature.) I'd recommend showing your support for this feature at https://excel.uservoice.com/
If you need to solve this soon, manually opening each query and editing the server string is what you need to do, sorry :\
(If you're building new reports again, Power BI Desktop lets you parameterize the server name to a top-level query, which would allow for quick rename operations!)
Do you only need to change the server name? If you go to the Data Source Settings window, you can select the SQL Server source you are using and click on the "Change Source..." button. If you change the server name in that dialog, it will change the server name in all of the queries that use that source (assuming it's the first step in the query).
Is there functionality in Azure SQL Data Warehouse similar to SQL Server's right-click -> Modify in SSMS for stored procedures?
Is there functionality in Azure SQL Data Warehouse similar to SQL Server Management Studio's right-click -> Script Table As... for tables?
I am running into inconveniences when trying to make modifications to my SPs and tables in my Azure SQL Data Warehouse because I cannot do either of these things... I have to script out each SP/table and keep that script saved somewhere so I can make modifications without having to rewrite it.
What I have tried:
In Visual Studio (2015 Enterprise, Update 1, Installed latest version of Data Tools yesterday), I right-click on the asset in the SQL Server Object Explorer and select "View Code"...the result of this is an error popup that says "Object reference not set to an instance of an object."
In SSMS (2014, v12.0.4213.0), none of my tables show up in Object Explorer, and if I right-click -> Modify on a stored procedure, I get an error.
The only way I can think of right now to get the code is to write SELECTs against sys.sql_modules, sys.tables, and the other catalog views.
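Something like this, for example (the procedure name below is just a placeholder):

    -- Pull a stored procedure's definition straight from the catalog views
    SELECT p.name, m.definition
    FROM sys.procedures AS p
    JOIN sys.sql_modules AS m
        ON m.object_id = p.object_id
    WHERE p.name = N'usp_MyProcedure';

    -- Reconstruct a table's column list from sys.tables / sys.columns / sys.types
    SELECT t.name  AS table_name,
           c.name  AS column_name,
           ty.name AS data_type,
           c.max_length,
           c.is_nullable
    FROM sys.tables  AS t
    JOIN sys.columns AS c  ON c.object_id = t.object_id
    JOIN sys.types   AS ty ON ty.user_type_id = c.user_type_id
    ORDER BY t.name, c.column_id;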
Any insight would be great!
SQL Data Warehouse does not currently support SSMS. This is a high priority work item for the service and we are working to enable support soon.
I can script out objects from Visual Studio Community 2015 (which is free, as in beer).
I've got the Azure DW registered in SQL Server Object Explorer - which is a bit clunky if you're used to SSMS - and can script an object by right-clicking and selecting "View Code".
If you change anything, you generally need to refresh the tree at the database level before the right-click functions work again.
For SSDT support, including view code, you must be running the preview version.
https://msdn.microsoft.com/en-us/library/mt204009.aspx
I am trying to make modifications to a named query in a cube, and the data source view (.dsv) won't allow me to make edits or create a new named query. It gives me a login error, even though I can successfully test the connection in the data source dialog and can log in successfully with SSMS.
The error is: "Login failed for user "
If I instead use Windows authentication, I am able to do everything; however, I would like to use the SQL Server Authentication option.
Are there privileges that need to be set for this? I'm pretty sure I've tried giving this user everything up to and including db_owner in my local environment, and still no luck.
I was wondering if maybe SSAS requires a specific role to allow SQL Server Authentication to be used for editing/creating named queries in the data source view.
Please refer to the following steps to solve this issue:
In the SSDT development interface, double-click on your data source.
In the Data Source Designer dialog, click "Edit".
Select "SqlClient Data Provider", and then use your SQL account for the data source.
Follow this link: https://social.msdn.microsoft.com/Forums/sqlserver/en-US/b5f05388-42e0-4fb6-92a9-d7d3e08aa98c/ssdt-2012-named-query-problem?forum=sqlanalysisservices