We'd like to distribute a Spotfire dashboard to different users; however, the data source file lives on SharePoint and for every user it is mapped to:
C:\Users\myusername\Sharepoint\spotfiretable.csv
where myusername is the Windows login username. If I gave someone this Spotfire file, they would be prompted to redirect Spotfire to the same folder mapped under their own username. Is there any way to avoid this and have Spotfire resolve the data source location automatically by reading the username from the system environment variables?
I would try embedding the data source in the .dxp analysis with no data, just the column names. That way the visualizations etc. don't have to be recreated.
Then set up a Python script to update the data source to the per-user path and refresh the table. You can even trigger this on document load so it is seamless for the end user.
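The per-user path can be built from environment variables; here is a minimal sketch (the folder layout is assumed from the question, and the function name is just for illustration):

```python
import os

def user_sharepoint_path(filename="spotfiretable.csv"):
    """Build the per-user SharePoint sync path, assuming the layout
    C:\\Users\\<username>\\Sharepoint\\<filename> from the question."""
    # USERNAME is set on Windows; USER is a fallback on other systems.
    username = os.environ.get("USERNAME") or os.environ.get("USER", "")
    return "C:\\Users\\{0}\\Sharepoint\\{1}".format(username, filename)
```

Inside Spotfire this logic would live in an IronPython script that passes the resulting path to the data-table reload; the linked post below shows the Spotfire-specific API calls.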
Here is a link showing a Python script that updates a data source:
https://spotfired.blogspot.com/2014/05/replace-data-tables-from-file.html?m=1
If you require assistance with the Python script, post a new question with the code you've tried to implement.
I've been trying to set up a flow that automatically mirrors (read: copies and creates anew) Word templates from one Teams channel to another.
The goal is to have them appear in the "+ new" dropdown in SharePoint and Teams (see screenshot).
"+new"-dropdown in Teams
I believe I managed to find the path to the folder in which to put the template file, which is: https://yourcompany.sharepoint.com/sites/YourSiteName/Freigegebene Dokumente/Forms
("Freigegebene Dokumente" is German for "Shared Documents", as far as I know.)
Here comes the caveat:
When I upload a template file via Power Automate, it does not show up in the "+ new" dropdown. When I then try to manually add the same file via the "+ new" dropdown, it tells me that a file with that name already exists.
Am I missing something? I fully expected this to just work.
What I am trying to get running so that the bigger solution can work is this:
Manually trigger a Microsoft PowerAutomate Cloudflow
Get information about a specific file in SharePoint (a .dotx MS-Word template)
Get the content of that specific file
Create a new File in the */Forms folder with the same name and content as seen in the screenshot.
Be able to create a new File in SharePoint (and Teams) from that template via the "+new"-dropdown
The two parameters used are:
#outputs('Dateieigenschaften_abrufen')?['body/{FilenameWithExtension}']
and
#body('Dateiinhalt_abrufen')
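For comparison, the same upload could be done outside Power Automate through the Microsoft Graph API. This sketch only builds the request URL; the site ID, folder path, and file name are placeholders, and a real request would URL-encode the path segments and send the file bytes with a bearer token:

```python
def graph_upload_url(site_id, folder_path, filename):
    # Microsoft Graph endpoint for writing a file's content into a
    # document-library folder via the site's default drive.
    # (A real request would URL-encode the path segments.)
    return ("https://graph.microsoft.com/v1.0/sites/{0}"
            "/drive/root:/{1}/{2}:/content").format(site_id, folder_path, filename)

# Example with placeholder values, not real IDs:
url = graph_upload_url("contoso.sharepoint.com,1234",
                       "Freigegebene Dokumente/Forms",
                       "template.dotx")
```

Note that this only performs the raw upload; whether the file then appears in the "+ new" dropdown is exactly the open question here.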
Screenshot of the flow
After running this flow, I expect to have a new template to choose from in the "+new"-dropdown in SharePoint. That is not the case.
I know that something was uploaded, because when I try to manually add the template via the "+ new" dropdown
manual upload in "+new"-dropdown
I get this error message:
Error Message on manual upload
In a previous version I had functionality to download a .wiq file, which opened inside Visual Studio. From there I could go to the web version of the query editor.
The link is similar to: http://tfs/_queries/query/?tempQueryId={Guid}&resultsEditorContext=query-edit
Now the application is a TFS extension, and I want to open the web query editor from there.
So the problem is how to get or generate this tempQueryId.
Thank you for your help!
At present, you can still download the .wiq file:
Select the Edit query wiql
In the pop-up web dialog, click Export
Choose Save and it will be downloaded as New Query 1.wiq in the local file system.
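Incidentally, a downloaded .wiq file is plain XML with the query text inside a <Wiql> element, so it can be read back programmatically; a small sketch:

```python
import xml.etree.ElementTree as ET

def read_wiql(wiq_xml):
    # Extract the WIQL query text from the XML content of a .wiq file.
    root = ET.fromstring(wiq_xml)
    node = root.find("Wiql")
    return node.text.strip() if node is not None and node.text else ""

# Minimal example of the .wiq layout:
sample = """<WorkItemQuery Version="1">
  <Wiql>SELECT [System.Id] FROM WorkItems</Wiql>
</WorkItemQuery>"""
```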
Besides, there are two ways to share a query: Email query items or share a query URL.
If you share a query with Copy query URL, it will generate a temp query ID, which does not allow others to access the query path. This should be the same temp query ID you mentioned.
However, there isn't a method or property to get the temp query ID. Also, the collection database has no table for that information either. Take a look at this similar question here.
I've created a tool which is being used by multiple users who all have access to the shared folder the tool is saved in. In the tool I use the function UserNameWindows to pull in the username of the person using it. The function I use is =VLOOKUP(UserNameWindows(),T25:U34,2,FALSE), where the T25:U34 range is a mapping of usernames to actual names.
The issue is that it works for me, but it's not pulling the usernames of other users. Could it be that read-only mode is preventing the calculation from running? I sat with one of the users to check, and it's as if the tool remembers my Windows username from when I saved it.
You probably used Application.UserName in the function, and that is the username set in Microsoft Office. It will be blank if they have not set it under File > Options > General > User name. Use Environ("Username") in your function instead and it will return the Windows login username.
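A minimal VBA sketch of the suggested change (the function name is taken from the question; making it volatile is an addition that addresses the "remembers my username" symptom):

```vba
Public Function UserNameWindows() As String
    ' Recalculate on every calculation pass, so a value cached at
    ' save time on another user's machine is not reused.
    Application.Volatile
    ' Environ("Username") reads the Windows login name directly,
    ' independent of the Office user name setting.
    UserNameWindows = Environ("Username")
End Function
```

The worksheet formula =VLOOKUP(UserNameWindows(),T25:U34,2,FALSE) will then look up the live Windows username rather than a stale cached value.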
Goal:
Create additional functionality in the report server: a button or an address link that, when pressed, opens Excel with a customized template. The Excel sheet is connected to a cube in SSAS.
Problem:
Is it possible to fulfill this goal?
Information:
*The report server is located on a server and users work on client computers. If end users want to review a report on the report server, they have to type the IP address plus "reportserver" in the web browser's address field.
*Using SSAS as a data source.
*End users should be able to save the document on their own computers.
You can upload non-SSRS files to the SSRS report server with the Upload File button.
This means you can upload a template workbook configured however you'd like. As long as the user does not have the ability to publish to SSRS, the only locations they'd be able to save it to are the locations where they could save any file on the network.
In Lotus Notes, for every user an .nsf file is created with the user ID as the file name (if my user ID is user1, the file created is user1.nsf). I want to extract the contact details from that .nsf file using the Java Lotus Notes API. Is it possible to extract all of a user's contacts using that .nsf file?
The tricky part here isn't reading the contact documents, it's finding the database itself. Depending on the installation, the contacts could be either on the server or local on their workstation.
If you're running from a server agent, you can only access the databases on the current server, or another server your credentials have access to. However, sometimes by default the user's contacts are put into a local database on their workstation and you can only reach them from code running in the user's context.
If that's the case, you have no choice but to find a way to run something on each user's workstation. You could
a) have the user replicate the names.nsf to the server, or
b) synchronize the contacts using the mail action.
For "a", you might send a special email with a LotusScript button in it to automate the replication. I've seen that method used in email migrations with Quest Software's migration tool, and it works well.
For "b", if you have a recent enough version of Notes you can follow these instructions to enable the Synchronize Contacts task on the replicator. Otherwise you'll need to instruct the users how to synchronize contacts using the Actions menu. Once the contacts are synchronized, a copy of the contact information will be stored in the user's mail file, which is available on the mail server. You can then simply access the contacts view to read the documents.