Can you use a SharePoint trigger to export an uploaded Excel file to a network drive as CSV?

I have a business requirement to process files uploaded by regional businesses for import into another system. It's envisaged that users will upload to SharePoint 2007 (soon to be SharePoint 2013), the upload event will trigger an export to CSV, and the process will then run against those files.
Is this possible in either SharePoint version?
Would that be an app, or a standalone service that I would create and schedule?
Does anyone have a more elegant solution? Essentially the CSV export feeds into a program that allows a user to visually validate the data and, after tweaking, press a button to push it to the other system.

With custom code, you could create an event receiver on the library where the uploaded file lives that will run some code whenever a file is added or updated. Here's a starter:
http://elczara.wordpress.com/2011/02/16/sharepoint-2010-event-receiver/
Make it a farm solution (sandbox solutions can't write to the filesystem directly), and you'll probably want to look up RunWithElevatedPrivileges, since the user doing the uploading may not have permission to write to the file system.
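To make that concrete, here is a minimal sketch of such a receiver against the server object model. It is not a drop-in implementation: the UNC path is a placeholder, and the actual Excel-to-CSV conversion (e.g. with the Open XML SDK) is left as a stub.

    using System;
    using System.IO;
    using Microsoft.SharePoint;

    // Sketch only: fires when a file is added to the library. The UNC path
    // below is hypothetical and the XLSX-to-CSV conversion is stubbed out.
    public class CsvExportReceiver : SPItemEventReceiver
    {
        private const string NetworkDrive = @"\\fileserver\exports"; // placeholder share

        public override void ItemAdded(SPItemEventProperties properties)
        {
            base.ItemAdded(properties);
            Guid siteId = properties.SiteId;
            Guid webId = properties.Web.ID;
            string fileUrl = properties.ListItem.Url;

            // The uploading user may not have rights on the share, so elevate.
            SPSecurity.RunWithElevatedPrivileges(delegate
            {
                using (SPSite site = new SPSite(siteId))
                using (SPWeb web = site.OpenWeb(webId))
                {
                    SPFile file = web.GetFile(fileUrl);
                    byte[] contents = file.OpenBinary();

                    // A real implementation would convert the workbook to CSV
                    // here (e.g. with the Open XML SDK); writing the raw bytes
                    // is just a stand-in for that step.
                    string target = Path.Combine(NetworkDrive,
                        Path.ChangeExtension(file.Name, ".csv"));
                    File.WriteAllBytes(target, contents);
                }
            });
        }
    }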
Steve's suggestion of rethinking the end-to-end solution is a good one, although I'm not sure how you can trigger the other system to "do its business".

Yes, it is possible with both the 2007 and the 2013 versions.
Depending on your deployment scenario, you can:
Create a custom timer job that performs the export.
Create a custom site workflow, with a loop and a delay, that does the same.
The first is easier to build and maintain, but offers less flexibility if you need to apply custom processing.
But if you can control the application that consumes the feed, why don't you consume SharePoint directly? From the 2010 version, you can very easily get data using the listdata.svc web service. With older versions, you can still get data using a simple web service.
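For illustration, consuming listdata.svc from .NET could look roughly like this; the server URL and list name ("Uploads") are made up:

    using System;
    using System.IO;
    using System.Net;

    // Minimal sketch of reading list data over listdata.svc (SharePoint 2010+).
    class ListDataClient
    {
        static void Main()
        {
            var request = (HttpWebRequest)WebRequest.Create(
                "http://sharepoint/sites/regional/_vti_bin/listdata.svc/Uploads");
            request.Credentials = CredentialCache.DefaultCredentials; // NTLM/Kerberos
            request.Accept = "application/json"; // ask for JSON instead of Atom

            using (var response = request.GetResponse())
            using (var reader = new StreamReader(response.GetResponseStream()))
            {
                // Feed this JSON straight into the consuming application,
                // instead of exporting an intermediate CSV.
                Console.WriteLine(reader.ReadToEnd());
            }
        }
    }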

Related

Automating Spotfire

I have some industrial data that I wish to present in a Spotfire dashboard to a client. I want to make the dashboard update automatically, but I have not been able to find a tutorial on Tibco's site or here on how to do this.
It would be great if someone could tell me how to make spotfire look in a particular place (server, desktop, wherever) for a new .csv file, open it and create a defined set of visualizations and then mail a pdf to the client.
I have been through the Spotfire Automation Services manual but I can't find a specific guide to what I need it to do.
All help appreciated.
I went back and read through the whole question again, including the other person's answer. In order to do what you want, you need to have Spotfire Server and Spotfire Automation Services. Automation Services is a product that you have to purchase separately from Spotfire, although it is bundled with the Analyst client.
In your question you said -- It would be great if someone could tell me how to make spotfire look in a particular place (server, desktop, wherever) for a new .csv file, open it and create a defined set of visualizations and then mail a pdf to the client.
I made the assumption that you knew you needed Automation Services to do this. All of my answers have been based around the use of Automation Services; that's the only way I know of to push an email to a user. After you set up the Automation Services job, you also have to use Active Batch to schedule it, which I noted as part of the original three-step process.
I want to make the dashboard so that it would update automatically and I have not been able to find a tutorial on Tibco's site or here for how to do this.
What you want to do is schedule updates to your linked data. This will re-query the data source on the schedule you specify (once a day, twice an hour, etc.) and cache the results on the web server.
Here is the documentation for that.
Schedule Updates
Scheduling updates using Spotfire Server (be sure to navigate down the sub items on the left)
Monitoring Schedule Updates
It would be great if someone could tell me how to make spotfire look in a particular place (server, desktop, wherever) for a new .csv file, open it and create a defined set of visualizations and then mail a pdf to the client.
For this, you still want to use schedule updates for the first part, after you have linked your analysis to your csv file. Your file name will have to remain the same for Spotfire to pick it up, unless you customize this with some scripting. Once that is complete, you'll want to use Automation Services to handle mailing the pdf.
Automation Services Tutorial
Automation Services User Manual
Generally speaking, this is a three step process.
1. In the desktop app, create a report (File -- Export -- To PDF -- Prepared report). In this step you are creating the export and telling Spotfire specifically what to export (see: Where to find the report).
2. In the desktop app, create the Automation Services job (Tools -- Automation Services Job Builder). All jobs start with opening the file; then you create the export; then send the email (see: Sample Active Batch job).
3. Now, you have to automate the task. This can be done with Windows Task Scheduler or Active Batch.
Those are the high-level steps. There is a lot of syntax and detail in each of the steps, but this should get you started. Please reply with more detailed questions on any one of the steps.
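As a rough illustration of step 3, the scheduled task ends up looking something like the batch command below. Treat the executable name, service URL, and paths as assumptions based on the Automation Services manual, and verify them against your own installation:

    rem Assumed names and paths -- verify the ClientJobSender executable and
    rem the Automation Services URL against your version before using this.
    schtasks /create /tn "DailySpotfirePdf" /sc daily /st 06:00 ^
      /tr "C:\Automation\Spotfire.Dxp.Automation.ClientJobSender.exe http://spotfireserver/spotfireautomation/JobExecutor.asmx C:\Automation\jobs\daily_pdf.xml"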

Spotfire Automation

Is it possible to automate publishing a dxp file to the server? What I want to achieve is to build a command line tool: the user navigates to the dxp file and runs a command (let's say "publish"), and this saves the file to the library without opening the Spotfire client (something similar to running Spotfire in headless mode).
I got to know that Spotfire Automation Services can be of some help in this task, but I have never used Automation Services and don't know how to install or find the module. Any help or direction is highly appreciated. Thanks
Automation Services is a licensed framework from TIBCO that lets you automate several tasks including Open or Save analysis to library, replace or remap data sources, run alerts etc.
But if your sole requirement is to publish an analysis to the library, then use import-library-content on the command line, or save the command in a batch file and use IronPython to trigger it from a button in the DXP.
See API docs of import-library-content for usage.
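I don't have an install in front of me, so treat this batch file as a sketch only: the flag names are from memory and the paths are placeholders. Check the import-library-content API docs (or the tool's built-in help) for the exact options in your version:

    rem Sketch only: flag names and paths are assumptions -- check the
    rem import-library-content API docs for the real ones.
    cd /d "C:\tibco\tss\tomcat\bin"
    config import-library-content --tool-password=secret ^
      --file-path="C:\exports\analysis.part0.zip" ^
      --library-path="/Reports" --user=admin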
you can check Google for Automation Services and see more details on its capabilities, but I'm pretty sure your use case is covered. Spotfire does not offer this feature out of the box (except maybe using the admin command tool, but that'd only be for admins).
your organization will need to buy a license for AS, since it's a separate product in the Spotfire suite. talk to your TIBCO rep or send me a PM.
Yes, you can use Automation Services for this. Automation Services needs to be licensed from TIBCO, but it provides a framework that will allow you to create job.xml files which specify a list of tasks for Automation Services to execute. You can then submit the job file to a web service when you want the tasks to be executed.

Comparing a SharePoint list and a SQL Server table

I have a list in SharePoint which maintains a particular month's on-call roster, and we maintain the employee directory in SQL Server. My requirement is to get the complete data from SQL Server, show it in SharePoint, compare it with the SharePoint list, and show a small icon for the employees who are on call for that particular month. Can anyone please suggest ways of implementing this?
Thanks in advance.
Update: I have finished the part where I connect to the SQL Server database and get the employee information. For this we are using a 3rd party web part to connect to SQL Server and pull the data from the table. Now I have to show some kind of image on the employee name to show that he is on call for that week. We are going to create a custom list for maintaining the list of people who are on call. Can anyone please advise me on how to accomplish this?
Write a custom web part which will pull the data from the list using the SharePoint object model and from SQL Server using ADO.NET, and do the comparison, as sketched below.
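A rough sketch of that approach; the list name ("OnCall"), field names, and connection string are all placeholders, not something from your environment:

    using System;
    using System.Collections.Generic;
    using System.Data.SqlClient;
    using System.Web.UI;
    using System.Web.UI.WebControls.WebParts;
    using Microsoft.SharePoint;

    public class OnCallDirectoryPart : WebPart
    {
        protected override void CreateChildControls()
        {
            // 1. Collect this month's on-call employees from the SharePoint list.
            var onCall = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
            SPList list = SPContext.Current.Web.Lists["OnCall"];
            foreach (SPListItem item in list.Items)
                onCall.Add(Convert.ToString(item["EmployeeId"]));

            // 2. Pull the full directory from SQL Server via ADO.NET and render
            //    each row, flagging anyone who is on call.
            using (var conn = new SqlConnection(
                "Data Source=sqlserver;Initial Catalog=HR;Integrated Security=SSPI"))
            using (var cmd = new SqlCommand(
                "SELECT EmployeeId, FullName FROM dbo.Employees", conn))
            {
                conn.Open();
                using (SqlDataReader reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        string id = reader.GetString(0);
                        string name = reader.GetString(1);
                        string icon = onCall.Contains(id)
                            ? "<img src='/_layouts/images/oncall.gif' alt='on call'/> "
                            : string.Empty;
                        Controls.Add(new LiteralControl(icon + name + "<br/>"));
                    }
                }
            }
        }
    }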
If you were looking for an out-of-the-box solution, I am afraid there is too little information given here to analyze whether it's feasible or not.
If you have the SharePoint Enterprise version, you can look at using the Business Data Catalog. This will let you bind columns to external data sources. This might provide you with the functionality you're looking for.
If you do not have the Enterprise features, do you have access to deploy WSP packages and custom code?
You will have to write your own data access to your external data source. Your options would be to have a job that pulls data from the external data source and populates SharePoint list(s) or create a custom view that pulls the external data on-demand.
You'll have to come up with synchronization strategies. Meaning, is the data in the external SQL data source static, reference information that does not need to be updated depending on what a user does in SharePoint? This seems to be the case based on your question. If you do need to update the external data source, you'll have to hook into the on save event (so probably a custom event handler that listens for ItemAdding) to update the data, validate, and optionally cancel the operation with an error message.
If you can't deploy WSP packages / DLLs, you could take a look at the jQuery SharePoint library. This will let you interact with lists using jQuery. If you also write a WCF or Web Service wrapper around the data you need from your external data source, accessible from the SharePoint environment, you can hack together a solution.
To accomplish this you'd need to place a Content Editor Web Part on the page where you need custom data access. In there you will write the code to reference the jQuery JavaScript library and the jQuery SharePoint library. The code will have to make the calls to your external data service and make any updates you need.
This is the least reliable method to accomplish what you want since it's entirely page-based and can be broken by simply disabling script or someone editing the CEWP or removing it altogether.
If you don't have access to place a CEWP or any of the other solutions, then you have no options at all.
It is relatively easy now to pull all the data using the third-party web part and save it into a custom list. I would recommend not only creating the custom list but also creating content types for it. Take a look at the SharePoint MVP's post about creating a custom list with content types.

SharePoint Online file storage

We have a requirement to store documents in SharePoint Online as people copy files to a shared network directory.
Is there a way of automating this? I was thinking of a windows service which will poll the directories, find any changes like new subdirectories or new files, then upload them to a SharePoint Online document library.
You don't have to poll if you use a FileSystemWatcher inside your Windows service for real-time notifications.
http://msdn.microsoft.com/en-us/library/system.io.filesystemwatcher.aspx
However, if your requirement is 100% accuracy, you will need to build in some sort of tracking/checksum mechanism to make sure that every document was 1) detected and 2) successfully moved to SharePoint.
You may want to have your service check the delta every time it starts up, and then subsequently only respond to FileSystemWatcher events.
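Something along these lines, as a skeleton only: the share path is a placeholder, and the upload itself is stubbed out (with SharePoint Online you would implement it via the client object model or the web services):

    using System;
    using System.IO;
    using System.ServiceProcess;

    public class UploadWatcherService : ServiceBase
    {
        private FileSystemWatcher _watcher;

        static void Main()
        {
            ServiceBase.Run(new UploadWatcherService());
        }

        protected override void OnStart(string[] args)
        {
            // TODO on startup: compare the share against what has already been
            // uploaded (the "delta check") before trusting events alone.
            _watcher = new FileSystemWatcher(@"\\fileserver\shared")
            {
                IncludeSubdirectories = true,  // catch new subdirectories too
                EnableRaisingEvents = true
            };
            _watcher.Created += (s, e) => UploadToSharePoint(e.FullPath);
            _watcher.Changed += (s, e) => UploadToSharePoint(e.FullPath);
        }

        protected override void OnStop()
        {
            _watcher.Dispose();
        }

        private void UploadToSharePoint(string path)
        {
            // Stub: push the file to the document library and record a
            // checksum so missed events can be detected and retried later.
        }
    }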
EDIT: Per Tony's question below, here are some additional thoughts on getting files to SharePoint.
First, try a simple test.
1) Copy the URL of a document library within the BPOS SharePoint site. Make sure you're on a machine that has the Office Online sign in app on it.
2) Open Notepad. Type some random text.
3) Click on File -> Save As.
4) Paste the URL.
5) Attempt to save the file.
This works great on "regular" SharePoint (done it many times). If this works with BPOS, it opens up several options.
File system replication to a SharePoint Online or Office 365 document library is planned for release with the "Cloud Connector for Office 365". The current version supports database content only, but V2.0 will be bi-directional.

Importing bulk data into SharePoint

I have an issue with a new SharePoint install that we've recently deployed to replace an ageing content management system that I implemented a few years ago.
What I'd really like is to save my colleagues as much effort as possible by transferring the content from my CMS into sharepoint.
I'm not very good with SharePoint yet, and my development platform of choice is PHP/MySQL, so basically I'm wondering if SharePoint has any facility to import sites. I can easily build filters to reformat the content in my CMS into whatever format SharePoint will accept (please let it be XML), but I have no idea if SharePoint will even let me do this.
I have limited access to the SharePoint server, although in this case I can probably negotiate more if that's the only way.
Mostly I just need some pointers - does SharePoint have any facility to do this, and where do I start?
Thanks
SharePoint has the ability to import data from an Excel spreadsheet (Site Actions > Create > Import Spreadsheet).
The only problem you may run into with this method is that you don't necessarily have full control over what column types the importer uses for your data--if that's important, then it will take some trial and error.
If you're familiar with .NET and you can get access to run a program on the server, you can write a program to import data into existing lists using the SharePoint object model.
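For example, a console importer along these lines; the site URL, list name, and field names are placeholders, and in practice you would loop over your CMS export instead of the hard-coded values:

    using System;
    using Microsoft.SharePoint;

    // Sketch of an importer run on the SharePoint server itself.
    class CmsImporter
    {
        static void Main()
        {
            using (SPSite site = new SPSite("http://sharepoint/sites/cms"))
            using (SPWeb web = site.OpenWeb())
            {
                SPList list = web.Lists["Articles"];
                SPListItem item = list.Items.Add();
                item["Title"] = "Imported article";
                item["Body"] = "<p>Content from the old CMS</p>";
                item.Update(); // commit the new item to the list
            }
        }
    }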
the fastest way to bulk import data into SharePoint is through the ProcessBatchData method
http://msdn.microsoft.com/en-us/library/microsoft.sharepoint.spweb.processbatchdata.aspx
it is aimed at importing list data, but it seems there are some workarounds to make it work with publishing pages
http://social.msdn.microsoft.com/Forums/sharepoint/en-US/f8fe190d-c1ed-4e15-bda2-7792211973cc/bulk-publishing-page-creation-using-processbatchdata?forum=sharepointdevelopmentlegacy
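To give an idea of the shape of it, here is a sketch; the site URL, list name, and loop are placeholders. The point is that each Method element in the batch XML adds one item, so many rows go to the server in a single call instead of one round trip per item:

    using System;
    using System.Text;
    using Microsoft.SharePoint;

    class BatchImporter
    {
        static void Main()
        {
            using (SPSite site = new SPSite("http://sharepoint/sites/cms"))
            using (SPWeb web = site.OpenWeb())
            {
                string listId = web.Lists["Articles"].ID.ToString();
                var batch = new StringBuilder(
                    "<?xml version=\"1.0\" encoding=\"UTF-8\"?>");
                batch.Append("<ows:Batch OnError=\"Continue\">");
                for (int i = 1; i <= 100; i++)
                {
                    batch.AppendFormat(
                        "<Method ID=\"{0}\">" +
                        "<SetList>{1}</SetList>" +
                        "<SetVar Name=\"Cmd\">Save</SetVar>" +
                        "<SetVar Name=\"ID\">New</SetVar>" +
                        "<SetVar Name=\"urn:schemas-microsoft-com:office:office#Title\">Item {0}</SetVar>" +
                        "</Method>", i, listId);
                }
                batch.Append("</ows:Batch>");
                string result = web.ProcessBatchData(batch.ToString());
                Console.WriteLine(result); // per-method results come back as XML
            }
        }
    }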
