Is there any way to automatically download files from MS SharePoint when a file is created or modified, without installing anything (e.g. the connection gateway for Power Automate)?
To download files from MS SharePoint when a file is created or modified, you can build a Power Automate flow like the one below:
I have not been able to find a solution to this, so I will ask the experts.
A co-worker has a .txt file on his laptop that we want to load into Azure SQL DB using SSMS and Bulk Insert. We can open the local file easily enough, but we don't know how to reference it in the FROM clause.
Assuming a file named myData.txt is saved to
c:\Users\Someone
how do we tell Azure SQL DB where that file is?
You don't. :) You have to upload the file to Azure Blob Storage and then, from there, you can use BULK INSERT or OPENROWSET to open it.
https://learn.microsoft.com/en-us/sql/t-sql/statements/bulk-insert-transact-sql?view=sql-server-2017
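For illustration, here is a rough T-SQL sketch of that approach. The storage account, container, table name, and SAS token are placeholders, and the file-format options will depend on your actual data:

-- Run once per database (skip if a master key already exists).
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password>';

-- Credential holding a SAS token for the blob container (without the leading '?').
CREATE DATABASE SCOPED CREDENTIAL MyBlobCredential
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
     SECRET = '<SAS token>';

-- External data source pointing at the container that holds myData.txt.
CREATE EXTERNAL DATA SOURCE MyBlobStorage
WITH ( TYPE = BLOB_STORAGE,
       LOCATION = 'https://<storageaccount>.blob.core.windows.net/<container>',
       CREDENTIAL = MyBlobCredential );

-- Load the uploaded file into a table.
BULK INSERT dbo.MyTable
FROM 'myData.txt'
WITH ( DATA_SOURCE = 'MyBlobStorage',
       FIELDTERMINATOR = ',',
       ROWTERMINATOR = '\n' );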
I've written an article that describes the steps to open a JSON file here:
https://medium.com/@mauridb/work-with-json-files-with-azure-sql-8946f066ddd4
I fixed this problem by loading the file into a local database and then using a linked server to my Azure DB to insert or update the records. Much easier than creating Blob Storage. However, if the file is very big or you have a lot of files to upload, you might not want to use my method, as a linked server is not the quickest connection.
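A rough sketch of that linked-server route, with made-up names (dbo.StagingData, AzureDbLink, MyAzureDb, dbo.MyTable) standing in for the real objects:

-- On the local SQL Server: load the text file into a staging table first.
BULK INSERT dbo.StagingData
FROM 'C:\Users\Someone\myData.txt'
WITH ( FIELDTERMINATOR = ',', ROWTERMINATOR = '\n' );

-- Then push the rows to the Azure database over the linked server.
INSERT INTO [AzureDbLink].[MyAzureDb].[dbo].[MyTable] (Col1, Col2)
SELECT Col1, Col2
FROM dbo.StagingData;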
I get a few Excel files via an FTP server, and I want to save those files into my table in a CLOB column using PL/SQL Developer.
There is an SSIS project which takes data from an Oracle table and exports it into an xls file. But before exporting, there are a File System Task and an Execute SQL Task which are supposed to delete the xls file and then create it again. The project looks like this in the images:
It works perfectly on localhost. But the task is to export the file to another server. The Excel file path for the "Excel connection manager" is
C:\Users\YIskende\Desktop\New folder\report.xlsx
but how do I change that so it works with an external server?
There are 2 solutions (it also depends on the SSIS and SQL Server version):
1) Package Deployment Model: if you use package configurations (older SSIS and SQL Server versions), you can create a configuration database on your server in which to store paths and variable values (Menu: SSIS -> Package Configurations).
2) Project Deployment Model: in SQL Server 2014 you have an Integration Services Catalogs node in SQL Server Management Studio where you can deploy your project and save your values.
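As an illustration of option 2, if the Excel file path were exposed as a project parameter, it could be overridden per environment in the SSIS Catalog instead of being hard-coded in the package. The folder, project, and parameter names below are made up for the example:

EXEC [SSISDB].[catalog].[set_object_parameter_value]
     @object_type     = 20,    -- 20 = project-level parameter
     @folder_name     = N'Reports',
     @project_name    = N'OracleExport',
     @parameter_name  = N'ExcelFilePath',
     @parameter_value = N'\\OtherServer\Share\report.xlsx';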
I've created a BACPAC backup of my Azure SQL Database using the "Export" option within the Azure Management Console.
Having downloaded this to my machine I'm a little stuck on how I can restore this to a local SQL Server instance. I came across the DacImportExportCli tool but couldn't find an example of a local restore.
Also if anyone has written a script that does this (so it can be scheduled) that would be awesome.
This can be done simply through SQL Server Management Studio 2012
Right click on the Connection > Databases node and select "Import Data-tier application..."
Select "Next" on the introduction step.
Browse, or connect to a storage account where backups are kept.
Try "SqlPackage.exe"
I needed to export a SQL Azure database and then import it into a local SQL 2008 R2 server (note: I am also using Visual Studio 2010). Microsoft certainly went out of their way to make this a painful task; however, I was able to do it by doing the following:
1) Go to this link http://msdn.microsoft.com/en-us/jj650014 and install the SQL Server Data Tools for Visual Studio 2010.
2) This will install on your local drive. In my case here is where it put it: C:\Program Files (x86)\Microsoft SQL Server\110\DAC\bin
3) Browse to this via the command line or PowerShell.
4) You are going to want to execute SqlPackage.exe.
5) Open up this link to see a list of all the parameter options for SqlPackage.exe (http://msdn.microsoft.com/en-us/library/hh550080(v=vs.103).aspx).
6) Here is my command line that I needed to execute to import a .bacpac file into my local SQL 2008 R2 server:
.\SqlPackage.exe /a:Import /sf:C:\mydatabasefile.bacpac /tdn:NorthWind /tsn:BINGBONG
/tdn is the name of the database you want your bacpac file to restore to.
/tsn is the name of your SQL server.
You can see all these parameter descriptions at the link in step 5.
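Since the original question also mentioned scheduling, the same tool can export the .bacpac straight from Azure before importing it locally. A rough example, where the server name, credentials, and target path are placeholders:

.\SqlPackage.exe /a:Export /ssn:yourserver.database.windows.net /sdn:MyAzureDb /su:youruser /sp:yourpassword /tf:C:\backups\MyAzureDb.bacpac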
You can restore the BACPAC by using the client side tools. Videos are here:
http://dacguy.wordpress.com/2011/09/09/importexport-services/
The tools are available here:
http://sqldacexamples.codeplex.com/documentation
Seems my prayers were answered. Redgate launched their SQL Azure Backup tool for FREE today - http://www.red-gate.com/products/dba/sql-azure-backup/download
If you're using SSMS 2012, it is as easy as right-clicking on the Databases folder under a server in the Object Explorer and choosing "Import Data-tier Application...".
There is one bump in the road to watch out for: as of Mar 26 2013 (when I needed to find out how to do this myself), when you export a .bacpac from Azure, it will be downloaded as a .zip file, not a .bacpac file, and the file dialog that is opened by the Browse button in the import wizard will only show either *.bacpac or *.* in the file filters, implying that .zip is not supported. However, if you change the filter to *.*, select your downloaded .zip, and click Next, the wizard will proceed normally.
Here's a script to restore a bunch of bacpac files at once:
Bulk restore bacpac files locally:
cd [FOLDERPATH]
# Only pick up the .bacpac files in the folder
$bacpacs = Get-ChildItem -Filter *.bacpac
cd 'C:\Program Files (x86)\Microsoft SQL Server\110\DAC\bin'
foreach ($i in $bacpacs) {
    $name = $i.Name
    # Strip the ".bacpac" extension (7 characters) so the file name becomes the database name
    $dbName = $i.Name.Substring(0, $i.Name.Length - 7)
    .\SqlPackage.exe /a:Import /sf:[FOLDERPATH]\$name /tdn:$dbName /tsn:[SERVERNAME]
}