How to permit users to load their own file in Spotfire?

I would like to permit users to load their own data (an Excel file) through the user interface, and then retrieve the file and use it in my data function.
I'm working on a Spotfire dashboard. I need the user to be able to import their own data, and I want to use this table later in a data function (R embedded in Spotfire). I know we can use IronPython or JavaScript in Spotfire, but I have no idea how to let the user load their data.
Can you help me, please? Any help would be appreciated!

Sorry to tell you, but this is not a feature that's available in Spotfire. You might be able to accomplish this with some serious Python scripting, but I wouldn't put in the time to bother.
There are a lot of potential issues that could crop up, but essentially they come down to: users are stupid; do not trust them.
You will spend far too much time defining edge cases such as: what if a user uploads a file that is missing a needed column? What if a user uploads some other file type (a .pdf)? What if a user uploads data that is not actually valid, or that has been manipulated to show different results?
My strong suggestion is that you consider what you are trying to accomplish by letting users provide their own data, and try to find some other way to do that. Otherwise, HIC SVNT DRACONES ;)
As an alternative, you could probably use an information link connected to a database. But the answer to your question is: you cannot, and you should not want to.
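That said, if you decide to experiment anyway, the scripting route starts roughly like this. This is a minimal IronPython sketch, assuming the Windows Analyst client (so System.Windows.Forms is available) and a delimited text file rather than Excel (an Excel file would need the corresponding Excel data source, not shown here); the table name "UserData" is a made-up example that a data function could then use as its input:

    # Minimal IronPython sketch (Spotfire Analyst client assumed).
    # Prompts the user for a delimited text file and adds it as a data table.
    import clr
    clr.AddReference("System.Windows.Forms")
    from System.Windows.Forms import OpenFileDialog, DialogResult
    from Spotfire.Dxp.Data.Import import TextFileDataSource

    dialog = OpenFileDialog()
    dialog.Filter = "Text files (*.csv;*.txt)|*.csv;*.txt"
    if dialog.ShowDialog() == DialogResult.OK:
        source = TextFileDataSource(dialog.FileName)
        # "UserData" is a hypothetical table name; point your R data
        # function at this table once it exists.
        Document.Data.Tables.Add("UserData", source)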

Related

Subscribe to a Google Form's submit event from a program, without ownership

In the wake of the pandemic causing schools to go to distance learning, many classes take attendance using a simple Google Form sent out to students to complete for each class every day. While this seems like a simple solution, it is a pain for students to complete and keep track of. One way I thought I could make this easier would be to keep track of which forms I have submitted each day.
As of now, my problem is that I need a way to subscribe to the submission of a Google Form (based on a link). When that Google Form is submitted, all I need is a way to convey that to a program. What I do not understand is how I would be able to do that without having ownership of the form or making a teacher recreate it. Is there a way I can check whether a Google Form has been submitted?
A couple of ideas I have had would be to sniff network traffic for a POST request from a Google Form, grab that link, and compare it to other links in the program to see which one was submitted, but I would think there is an easier way to do this (a rough sketch of the sniffing idea follows below). Any ideas or code are welcome.
I understand Stack Overflow is for already-written code, so if you do not agree with this post, either ignore it or point me to the correct place where it should be posted. Thank you.
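For what it's worth, here is a rough Python sketch of the sniffing idea using scapy (my assumption; capturing packets needs root/administrator privileges). The big caveat: form submissions travel over HTTPS, so the POST body and the exact form URL are not visible; at best you can observe DNS lookups for Google Docs/Forms hosts, which only tells you that some form or document was contacted:

    # Rough sketch of the traffic-sniffing idea (scapy assumed installed).
    # HTTPS hides the POST itself; DNS lookups are about all you can see.
    from scapy.all import sniff, DNSQR

    def on_packet(pkt):
        if pkt.haslayer(DNSQR):
            name = pkt[DNSQR].qname.decode(errors="replace")
            if "docs.google.com" in name:
                print("Lookup for a Google Docs/Forms host observed:", name)

    sniff(filter="udp port 53", prn=on_packet, store=False)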

Can a form's onload script access entities other than the primary one?

I have a requirement to add fields onto a form based on data from another set of entities. Is this possible using an event script or does it require a plugin?
Assuming I understand your requirement correctly, it can be done using JavaScript as well as with a plugin. There is a significant difference that you need to take into consideration.
Is the change to the other entities to be made only when an actual user loads a form? If so, JS is the right way.
Or perhaps you need to ensure that those values are written even if a console client or system process retrieves the value of the primary entity? In that case, C# is your only option.
EDIT:
Simply accessing the values of any entity in the onload event can be done using a call to oData. I believe someone else asked a similar question recently. The basic format will look like this (a code sketch of such a call follows at the end of this answer).
http://Server:Port/Organization/XrmServices/2011/OrganizationData.svc/TheEntityLogicalNameOfYoursSet()?$filter=FieldName eq 'ValueOfIt'
Some extra remarks.
If you're targeting an on-line installation, the syntax will differ, of course, because the schema-server-port-organization part is provided in a different pattern (https, orgName.crm4.something.something.com, etc.). You can look it up under Settings.
Perhaps it should go without saying, and I'm sure you realize it, but for completeness' sake: TheEntityLogicalNameOfYours needs to be substituted with the actual name (unless that is your actual name, in which case I'll be worried, haha).
If you're new to this whole oData thingy, keep asking. I got the impression that the info I'm giving you is appreciated but not really producing an "aha!" experience for you. You might want to ask separate questions, though. Some examples right off the top of my head:
a. "How do I perform oData call in JavaScript?"
b. "How do I access the fetched data?"
c. "How do I add/remove/hide a field programmatically on a form?"
d. "How do I combine data from...?"
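In the meantime, purely to illustrate the shape of the query (a real onload handler would issue it as an XMLHttpRequest from JavaScript), here is the same request sketched in Python; the server, entity set, and field names are the placeholders from the URL format above:

    # Sketch of the 2011 oData endpoint query shape; placeholders only.
    # In a form's onload script this would be an XMLHttpRequest instead.
    import requests

    base = "http://Server:Port/Organization/XrmServices/2011/OrganizationData.svc"
    resp = requests.get(
        base + "/TheEntityLogicalNameOfYoursSet()",
        params={"$filter": "FieldName eq 'ValueOfIt'"},
        headers={"Accept": "application/json"},
    )
    print(resp.json())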

Generate XML feeds or auto-export files from SAP?

Is there any way to automatically generate the result of an SAP transaction? Let's say I want to see the production orders for one MRP controller (I have the COOIS transaction for this). Is there any way to generate an XML feed with the result of that transaction and refresh it, say, every 10 minutes?
Or to auto-export an .xls file with the result somewhere? I know about jobs and spools, but I have to download the result from the SAP GUI manually.
I don't have access to ABAP, so I would like to know whether there are other methods to get data out of SAP.
Since "a transaction" might be anything from a simple report to a complex interactive application that does not even have a simple "result", I doubt that there's a way to provide any generic tool for this. You might try the following:
Schedule a job and have the result sent to some mailbox instead of printing it. Then use the programming language of your choice to grab and process the mail (see the first sketch below).
Check whether there are BAPIs available (BAPI_PRODORD_* or something like that - I'm not a CO expert, so I wouldn't know which one to use). You can call these BAPIs from an external program without having to write ABAP yourself; however, you'll most likely need the help of someone who knows ABAP in order to get the interface documentation and understand the concepts (see the second sketch below).
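For the first option, the mail-grabbing side might look roughly like this in Python (the server, credentials, and subject filter are all placeholders):

    # Sketch: poll a mailbox for the report that the SAP job mailed out.
    import email
    import imaplib

    mail = imaplib.IMAP4_SSL("imap.example.com")  # placeholder server
    mail.login("user", "password")
    mail.select("INBOX")
    _, data = mail.search(None, '(UNSEEN SUBJECT "COOIS")')
    for num in data[0].split():
        _, msg_data = mail.fetch(num, "(RFC822)")
        msg = email.message_from_bytes(msg_data[0][1])
        for part in msg.walk():
            if part.get_filename():  # save the report attachment
                with open(part.get_filename(), "wb") as f:
                    f.write(part.get_payload(decode=True))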
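For the second option, here is a sketch using the pyrfc library (which wraps the SAP NetWeaver RFC SDK); the connection details are placeholders, and the BAPI name is only the guess from above - check the interface documentation in your system:

    # Sketch: call a BAPI from outside SAP via pyrfc (NW RFC SDK required).
    from pyrfc import Connection

    conn = Connection(ashost="sap.example.com", sysnr="00",  # placeholders
                      client="100", user="RFC_USER", passwd="secret")
    # BAPI name guessed per the answer; parameters omitted - look them up
    # in the interface documentation for your release.
    result = conn.call("BAPI_PRODORD_GET_LIST")
    print(result)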

Is there a way to download all the statistics on events at once from Flurry?

We are bumping into limitations with Flurry. We use events and parameters to track some gameplay info (like the number of KOs per map), but 1) the limit of 15 parameters per event is a problem, and 2) the visualisation is not good (for instance, KOs per map are shown map by map, so we have to open each event one after another).
We are trying to build a better visualisation in Excel using the CSV files provided by Flurry, but then again we need to download the 50+ CSV files, and that's really not convenient.
Is there a way to get all the information in one CSV, or to get the information another way?
As a side note, Flurry support is not answering any of our emails. :(
Thanks for your help!
Have you tried checking out Playtomic instead? It sounds like it might match your problem better.
They have an API to access your data, so you should be able to access it in real time.
You might also want to check out www.parse.com.
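If you do end up scripting the 50+ CSV downloads rather than clicking through the dashboard, a small loop can consolidate them into a single file. The export URLs below are placeholders, since the actual export links are whatever your Flurry dashboard exposes per event:

    # Sketch: fetch per-event CSV exports and concatenate them into one file.
    import csv
    import io
    import requests

    export_urls = {  # placeholder URLs, one per event
        "ko_per_map": "https://example.com/export/ko_per_map.csv",
        "match_time": "https://example.com/export/match_time.csv",
    }

    with open("all_events.csv", "w", newline="") as out:
        writer = csv.writer(out)
        for event, url in export_urls.items():
            reader = csv.reader(io.StringIO(requests.get(url).text))
            next(reader, None)  # skip each file's header row
            for row in reader:
                writer.writerow([event] + row)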

Integrating with 500+ applications

Our customers use 500+ applications and we would like to integrate these applications with ours. What is the best way to do that? These applications are time registration applications, and common to most of them is that they can export to CSV or similar; some of them are actually home-brewed Excel sheets in which time is registered.
The best idea so far is to create our own Excel sheet which can be used to integrate with all these applications. The integrations could be in the form of cells containing something like ='[c:\export.csv]rawdata'!$A$3, where export.csv is the CSV file exported from the time registration application. Can you see a better way to integrate with all these applications? It should be mentioned that almost all our customers have Microsoft Office.
Edit: Answers to the excellent questions from Pontus Gagge:
How similar are the data in the different applications?
I assume that since they are time registration applications, they will have some similarities, but some will register the total time worked for a whole month, while others will specify it for each day. If Excel is chosen, I believe that many of the differences could be ironed out using basic formulas.
What quality is the data?
The quality of the data can vary, so basic validation must be undertaken. A good approach is also to make it transparent to the customers how our application understands their input, so that they are responsible for it.
How large amounts of data are you talking about?
There will be information about the time worked for up to 50 employees.
Is the integration one-way only?
Yes
With what frequency should information be transferred?
Once per month (when they need to pay salaries).
How often do the applications themselves change, and how often does your product change?
If their application is a home-brewed Excel sheet, then I assume it will change about once a year (due, for example, to someone's mistake). If it is a proper standard time registration application, then I do not believe they are updated more often than every fifth year or so, as it is a very stable concept.
Should the integration be fully automatic or can your end users trigger a data transfer?
They can certainly trigger the data transfer. The users are often dedicated to the process, so they can be trained to do it, which means they could make up to, say, 30 mouse clicks in order to integrate each month.
Will the customers have somebody to monitor the integrations?
As we have many customers, many of them should be able to undertake the integration themselves. We will, though, be able to assist them over the telephone. We cannot undertake the integration ourselves, though, because we would then be responsible for any errors due to user mistakes, etc.
Does the phrase 'integration spaghetti' mean anything to you...?
I am looking for ideas from the best chefs to cook a nice large portion of that.
You need to come up with a common data format, and a way to translate the individual data formats to the common format. There's really no way around this: any solution you come up with will have to do this in one way or another. It's the essential complexity of what you're doing.
The bigger issue is actually the variance within the source data, in terms of how things like dates are stored, missing columns, etc. Doing a generic conversion of CSV data to move columns around is comparatively easy.
I would also look at CSV, and then use an OLEDB connection against the CSV file for importing.
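To make the translate-to-common-format step concrete, here is a rough Python sketch; the per-source column mappings and the field names are invented for the example:

    # Sketch: translate one source CSV into the common format using a
    # per-source column mapping (all names invented for illustration).
    import csv

    MAPPINGS = {  # common field -> source column name, one map per source app
        "app_a": {"employee": "Name", "date": "Day", "hours": "Worked"},
        "app_b": {"employee": "Emp", "date": "Date", "hours": "Duration"},
    }

    def normalize(source, in_path, out_path):
        mapping = MAPPINGS[source]
        with open(in_path, newline="") as fin, \
             open(out_path, "w", newline="") as fout:
            writer = csv.DictWriter(fout, fieldnames=list(mapping))
            writer.writeheader()
            for row in csv.DictReader(fin):
                writer.writerow({common: row[src]
                                 for common, src in mapping.items()})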
If you try to make something that can interface with any data structure in the universe (and 500 is plenty close enough), it is guaranteed to be a maintenance nightmare. Instead, I would approach this from multiple angles:
Devise an interface into which a human can enter this data already in the proper format. With 500+ clients, I'd make this a small, raw but functional browser-based site that users can use to enter this information manually. This is the fall-back: at the end of the day, a human can re-key the information into the site and solve the import issue. Ideally, everyone would use this instead of their own format. Data entry people are cheap.
Similar to the above, but expanded: I would develop a standard application, or standardize on an off-the-shelf application, that can be used to replace their existing formats. This might take more time than #1. The goal would be to do only one-time imports of these varying data schemas into the application and be done with them for good.
The nice thing about spreadsheets is that you can do anything anywhere. The bad thing about spreadsheets is that you can do anything anywhere. With CSV or a spreadsheet there is simply no way to enforce data integrity, and thus consistency (which is the primary goal), on the data. If the source data is already in a database, then that is obviously simpler.
I would be inclined to use a database format into which each of these files needs to be converted, rather than a spreadsheet (e.g. something like Jet (MDB)). If you have non-Windows users, that will make it harder, and you might have to use a spreadsheet. The problem is that it is too easy for the user to change their source structure, break their upload, and come crying to you. If a given end user has a resident expert, they can find a way of importing the data into that database format. If you are that expert, then I would, on a case-by-case basis, write something that imports into that database format. XML would be the other choice, but that will likely take more coding than an import/export into a database format.
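As a sketch of the import side of that database approach (assuming Windows, the Access ODBC driver, and an invented table schema):

    # Sketch: push normalized rows into a Jet/Access (.mdb) table via ODBC.
    # Path, table name, and columns are invented for the example.
    import csv
    import pyodbc

    conn = pyodbc.connect(
        r"Driver={Microsoft Access Driver (*.mdb, *.accdb)};"
        r"DBQ=C:\data\timesheets.mdb"
    )
    cur = conn.cursor()
    with open("common_format.csv", newline="") as f:
        for row in csv.DictReader(f):
            cur.execute(
                "INSERT INTO Timesheet (employee, workdate, hours) "
                "VALUES (?, ?, ?)",
                row["employee"], row["date"], float(row["hours"]),
            )
    conn.commit()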
Standardization of the apps (even having all the sources in a database format instead of a spreadsheet would help) and control over the data schema are the ultimate goal, rather than permitting a gazillion formats. There really is no nice answer other than standardization. Otherwise, you end up writing a converter for every Tom-Dick-and-Harry format, and again whenever someone changes the source format.
With a multitude of data sources, mapping each one correctly to an intermediate format is not trivial. Regular expressions are good with a finite set of known data formats. Multiple passes can help when data is ambiguous without context (month and day fields, with several days of data), and can also help defeat data entry errors. But since this data is connected to salaries, the transfer needs to be reliable.
An import-configuration trick
Get the customer to create a set of training data in their application. It should have a "predefined unique date", and each subsequent data field should contain a number corresponding to the target data field in your application. On import, your application needs to recognise the predefined date, determine the unique translation required, effect the displaying/saving of this "mapping key", and stop the import. E.g. if you expect "Duration hours" in field two, then get the user to enter 2 in the relevant field, which might be called "Attendance hours".
On subsequent runs, with the mapping definition key in place, import becomes a fairly easy process of translation (a sketch follows the notes below).
Notes on terms
"Predefined date": must be historical, say the founding date of your company? It might need to be within the PC's settable clock range.
"Mapping key": could be a string of hex digits, nybble-based, so it is tractable to work out.
The entered code can be extended to signify required conversions, e.g. the customer's application has durations in days while your application expects hours.
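A rough Python sketch of the trick (the marker date, field numbers, and names are invented for the example):

    # Sketch of the training-row trick: the customer submits one row whose
    # date equals a predefined marker and whose other fields hold the
    # *number* of the target field in our application.
    TRAINING_DATE = "1901-01-01"  # hypothetical "predefined unique date"
    TARGET_FIELDS = {1: "employee", 2: "duration_hours", 3: "project"}

    def derive_mapping(rows):
        """Return {source column index: target field} from the training row."""
        for row in rows:
            if row[0] == TRAINING_DATE:
                return {col: TARGET_FIELDS[int(val)]
                        for col, val in enumerate(row[1:], start=1) if val}
        raise ValueError("no training row found")

    def translate(rows, mapping):
        for row in rows:
            if row[0] == TRAINING_DATE:
                continue  # skip the training row itself
            yield {target: row[col] for col, target in mapping.items()}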
Interfacing with Windows programs (in order of increasing fragility):
Ye Olde saving as a CSV file
Print to an operating-system printer that is set up as a text file/PDF, then scavenge the data out of that
Extract data via the application's interface control, typically ActiveX for several Windows programs, e.g. Matlab's Spreadsheet Link
Read the native file format directly (xls), e.g. Matlab's xlsread (see the sketch after this list)
Add an additional intermediate spreadsheet sheet that has extended cell references, e.g. ='[filename]rawdata'!$A$3
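For the native-format option, reading a legacy .xls directly is straightforward with, say, the xlrd library (file name and cell layout invented):

    # Sketch: read an .xls workbook directly (xlrd handles the legacy
    # .xls format); positions are invented for the example.
    import xlrd

    book = xlrd.open_workbook("export.xls")
    sheet = book.sheet_by_index(0)
    for r in range(1, sheet.nrows):  # skip the header row
        employee = sheet.cell_value(r, 0)
        hours = sheet.cell_value(r, 1)
        print(employee, hours)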
Have a look at Teiid by JBoss: http://jboss.org/teiid
Also consider using SOA - e.g., if you're on Java, try JBoss SOA platform: http://www.jboss.com/resources/soa/?intcmp=1004
Use a simple XML format. A non-technical person can easily understand a simple XML format (and could even identify basic problems with XML documents that are not well-formed).
Maybe use a DTD (or, even better, an XML schema) to do very basic validation, and then supplement this with an XSL stylesheet to do more validation with better error reporting. (An XSL stylesheet simply converts XML to something else, and so can generate readable error messages.)
The advantage of this approach is that web browsers such as Internet Explorer can apply the XSL stylesheets. A customer need only spend at most a day enhancing their applications or writing Excel macros to generate the XML data in the format that you specify.
Recent versions of Excel have support for converting spreadsheet data to XML, and can even validate against schemas.
Once the data passes the XSL validation checks, you have validated XML data.
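As a sketch of that two-stage check in Python with lxml (the file names stand in for the schema and stylesheet you would publish to customers):

    # Sketch: schema validation first, then an XSL stylesheet that emits
    # readable error messages. File names are placeholders.
    from lxml import etree

    doc = etree.parse("timesheet.xml")

    schema = etree.XMLSchema(etree.parse("timesheet.xsd"))
    if not schema.validate(doc):
        for error in schema.error_log:
            print("schema error:", error.message)

    report = etree.XSLT(etree.parse("extra_checks.xsl"))(doc)
    print(str(report))  # human-readable validation report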
If you have heaps of data and heaps of money, you could look at existing data management and cleansing tools:
http://www-01.ibm.com/software/data/infosphere/datastage
http://www-01.ibm.com/software/data/infosphere/qualitystage
But even then, you'll likely need to follow kyoryu's suggestion, assuming you have 500+ data formats. The problem isn't on your side: you need them to standardize their output formats if you have no control over their apps. CSV is likely the easiest. You could even send them an Excel template to help them along.
