Is it possible to transfer data from SSIS to SharePoint and place the CSV in a SharePoint list?
I have tried automating this with SSIS, but for some reason the package will not run when I execute it under a scheduled task.
If I create a scheduled task that runs dtexec (with the package path) directly, it will not run under the scheduled task, but it will run if I use a .bat file containing the same command. I am using credentials that have access to the SharePoint site. It seems there is just no way to automate placing CSV files onto SharePoint.
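For reference, the working .bat file is just a thin wrapper around the same dtexec call; a minimal sketch (the package path is a placeholder, not the real one):
@echo off
rem Hypothetical package path; dtexec may also need to be fully qualified
rem (e.g. "C:\Program Files\Microsoft SQL Server\<version>\DTS\Binn\dtexec.exe") for the task account.
dtexec /F "C:\SSIS\ExportToSharePoint.dtsx"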
http://social.msdn.microsoft.com/Forums/en/sharepoint2010programming/thread/905fd9fb-ae70-4335-9628-d28d040f0bdc
http://social.msdn.microsoft.com/Forums/en/sharepointdevelopment/thread/d59bbc46-27b4-468e-9ed6-70435200bef2
Although I haven't had the need to use it in a production environment yet, I'm sure this custom component will suit your needs :)
http://ssisctc.codeplex.com/wikipage?title=SharePoint List Destination&referringTitle=Home
I have a pipeline in Azure that runs a script once per month. The script invokes a VBA macro. The problem is that I can't run this VBA in Azure, since the script requires a copy of Excel in order to run. Is there any way to automate executing the VBA, either within Azure or somewhere else, and then grab the resulting Excel files? I'm open to any ideas. Where else can I run VBA macros external to Azure and then pull those files into Azure Blob Storage?
Thanks
First, use Windows Task Scheduler to run the Excel workbook every month and have the macro save the result to some network location, OneDrive, or SharePoint; or send it by email; or push it into a database, etc. (a minimal sketch of such a scheduled job follows the second option below).
Second, use Power Automate and an Office Script every month to handle the result file if necessary.
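For the first option, a minimal sketch of the batch file a scheduled task could run every month; the Excel and workbook paths are placeholders, and it assumes the workbook's Workbook_Open macro builds and saves the output:
@echo off
rem Placeholder paths - adjust to the installed Office version and the real workbook.
rem /wait keeps the task running until Excel exits; the Workbook_Open macro is assumed
rem to save the result (to a network share, OneDrive, etc.) and then call Application.Quit.
start "" /wait "C:\Program Files\Microsoft Office\root\Office16\EXCEL.EXE" "C:\Jobs\MonthlyReport.xlsm"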
Is there any way to automatically download files from Microsoft SharePoint when a file is created or modified, without installing anything (e.g. the on-premises data gateway for Power Automate)?
To download files from SharePoint when a file is created or modified, you can build a Power Automate flow that is triggered when a file is created or modified in the SharePoint library and then retrieves the file content.
I have an SSIS package that reads Excel files from a shared network drive, using an Excel source in the data flow.
In Visual Studio the package works fine even if the Excel file is open and being used by another user.
When I deploy the package to the SSIS catalog and execute it, the package fails if the file is open by someone else.
Is there any way to make the Excel source work even when the file is open?
This is controlled by the Excel provider, which by default does not open the file in shared mode.
As a workaround, you can use a File System Task to copy the Excel file from the shared location to a private one and read the copy in your Data Flow.
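If you would rather script that copy (for example from an Execute Process Task) than configure a File System Task, a minimal sketch with placeholder paths:
rem Copy the shared workbook to a private working folder before the Data Flow reads it.
copy /Y "\\fileserver\reports\Input.xlsx" "C:\SSIS\Work\Input.xlsx"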
I'm in the process of evaluating the possibilities of lifting and shifting my SSIS packages to ADF v2, but without testing I'm finding it hard to tell whether all SSIS functionality is supported.
For example, my package unzips files, modifies the contents of files (Script Task), saves the new versions in a different directory, loads the modified files into the database, updates data, etc.
What I'm not sure about is unzipping the files (I don't want to transfer unzipped files from on-premises) and modifying files with the Script Task. I believe these would have to be moved outside of SSIS and implemented as ADF activities, leaving only the file loads, data updates, etc. in my SSIS package, probably with the files stored in Blob Storage?
Or can all this still be done directly in SSIS?
Thanks
What you currently do using SSIS on premises you can also do using SSIS in ADF. For example, you can install additional (un)zip programs using a custom setup and utilize the %TEMP% folder / current working directory (".") of your SSIS IR to modify files; see:
https://learn.microsoft.com/en-us/azure/data-factory/how-to-configure-azure-ssis-ir-custom-setup
https://learn.microsoft.com/en-us/sql/integration-services/lift-shift/ssis-azure-files-file-shares?view=sql-server-2017
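As a rough illustration of the custom setup approach in the first link: a standard custom setup is driven by a script named main.cmd in a blob container, which could silently install an (un)zip tool such as 7-Zip (the installer file name here is a placeholder, uploaded to the same container):
rem main.cmd - entry point of a standard custom setup for the Azure-SSIS IR.
rem Silently installs 7-Zip so packages can shell out to 7z.exe for (un)zipping.
msiexec /i "7z-x64.msi" /qn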
The tasks I do manually for updating my web site:
Stop IIS 7
Copy source files from a folder to the virtual directory of my web site
Start IIS 7
There are many ways to approach this, but here is one way.
I am assuming you don't want every single file in your source repository to exist on your destination server. The best way to reliably extract what you need from your source on a regular basis is through a build file. Two options for achieving this are NAnt and MSBuild.
Once you have the set of files you want to deploy, you need a way to distribute them to your destination server and to stop and start IIS. Again, there are options, but I would personally recommend PowerShell (with the IIS snap-in) for this.
If you want this to happen regularly, consider a batch file executed by some timer, such as a scheduled task, or even better, a CI solution such as TeamCity.
For a full rundown, there are examples within my PowerUp project that do this.
It depends on where you are updating from, but you could have your virtual directory point to a local read-only working copy of your source code and create a task that runs a batch file/PowerShell script/etc. every day to update that working copy (via svn update, git pull, etc.), as sketched below.
This assumes you have a branch that always contains the latest releasable code.
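A minimal sketch of such a daily batch file, assuming a Git working copy at a placeholder path and a branch named release:
@echo off
rem The site's virtual directory points at this working copy (placeholder path and branch name).
cd /d "C:\Sites\MySite\working-copy"
git pull origin release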
You have to create a batch file with the following content:
Stop WWW publishing service
Delete your old files
Copy the new files
Start WWW publishing service
You can start/stop services like this:
net stop "World Wide Web Publishing Service"
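Putting those steps together, a minimal sketch of such a batch file; the staging and site paths are placeholders:
@echo off
net stop "World Wide Web Publishing Service"
rem Remove the old files from the site folder (placeholder path).
rmdir /S /Q "C:\inetpub\wwwroot\MySite"
rem Copy the new files from the staging share (placeholder path).
xcopy /E /I /Y "\\buildserver\drops\MySite" "C:\inetpub\wwwroot\MySite"
net start "World Wide Web Publishing Service"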
Once you have your batch file, you can create a task in the Task Scheduler that calls it at a regular interval (e.g. each day).