I created a Power Automate flow. When I need to check the run history for my flow, I click on "All runs" and can see the full history.
This history is kept for only 28 days, after which Power Automate automatically deletes it. I want to create a flow that automatically emails me the full history of runs from those 28 days.
For that I need to create a recurrence flow, but within that flow I cannot find any action that exposes the run history. Is there any way to download or email the full history automatically?
To my knowledge, the out-of-the-box solution only enables you to manually download the Power Automate run history. If you want to download it on a schedule, you will need to use third-party tools or create a custom solution.
A reference which may help: https://2die4it.com/2020/07/08/custom-connector-to-get-flow-run-history/
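The blog above wraps the underlying Power Automate web API in a custom connector. As a rough illustration of the same idea, here is a minimal Python sketch that pages through a flow's runs; the endpoint shape, the api-version value, and the bearer-token acquisition are assumptions drawn from that post, so verify them against the article before relying on this.

```python
import requests

# Assumptions (verify against the blog post above): flow runs are exposed
# at this endpoint, and you have already obtained an OAuth bearer token
# for the Flow service. All placeholder values below are hypothetical.
TOKEN = "<bearer-token>"
ENVIRONMENT = "<environment-id>"
FLOW = "<flow-id>"

url = (
    "https://api.flow.microsoft.com/providers/Microsoft.ProcessSimple"
    f"/environments/{ENVIRONMENT}/flows/{FLOW}/runs?api-version=2016-11-01"
)

runs = []
while url:
    resp = requests.get(url, headers={"Authorization": f"Bearer {TOKEN}"})
    resp.raise_for_status()
    body = resp.json()
    runs.extend(body.get("value", []))
    url = body.get("nextLink")  # follow paging until no more pages

# Print one line per run; feed this into an email step of your choosing.
for run in runs:
    props = run.get("properties", {})
    print(run.get("name"), props.get("status"), props.get("startTime"))
```

A recurrence flow could call the same endpoint through a custom connector, as the article shows, and pass the collected runs to a "Send an email" action.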
I have a PowerApp that:
takes a SharePoint file,
filters the data,
and calls a Power Automate flow to:
execute an Excel script,
get the file contents,
and create a file with it.
The script is shared with all PowerApps users, and the same goes for the source file: everyone can read, edit, and delete it.
When anyone but me runs the flow, either from the PowerApp or from the flow itself, they get the error "Script not found. It may have been unshared or deleted."
The weird thing is that if I run the PowerApp/flow myself, everything works fine.
Do you know how I can get users other than me to run it, either from PowerApps or Power Automate?
This issue is somewhat similar to this one, but it differs in the source file.
I found out how to make this work. Sharing the solution here:
Firstly, you'll need to create all your flows and PowerApps apps within a "Solution". You can create a new solution from "Solutions/New solution".
Within this new solution, create a button-triggered flow that runs Office Scripts. Let's call it "Child flow". Note that this flow's last action must be one that responds to a PowerApp or flow.
Configure this child flow's run-only users settings as described in my previous response. Make sure to choose "Use this connection ...".
Within the same solution, create another flow with the PowerApps trigger. This flow will be triggered by your PowerApps app. In this flow, add a step to run the child flow you've just created above.
Within the same solution, create your PowerApps app. Configure your PowerApps button to run the second flow (the parent flow).
Now share the PowerApps app with another user, ask them to try it out, and see whether the Run script action in the child flow works properly.
Thanks to Yutao from Microsoft for the solution!
I'm looking for a way to collect and save activity and security logs from Azure DevOps Server (on-prem 2019.1).
The logs include user logins, build events, work item events, security changes, etc.
I'm aware of this option: https://server_name/tfs/_oi/_diagnostics/activityLog. But it's not an API interface.
Any idea how this can be implemented?
Thanks.
TFS keeps an activity log of all recent activities. This information is stored in two tables, tbl_Command and tbl_Parameter, inside the Tfs_Configuration and Tfs_<CollectionName> databases. These tables keep a record of every single command that every single user has executed against TFS for the last 14 days.
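If you do go the database route, a minimal read-only sketch along these lines could pull recent commands. Note that querying the TFS operational databases directly is unsupported, and the column names used here (Command, IdentityName, StartTime, ExecutionTime) are assumptions you should verify against your own schema first.

```python
import pyodbc

# Unsupported, read-only diagnostics against the TFS operational store.
# Server/database names and column names below are assumptions -- verify
# them against your own Tfs_<CollectionName> database before use.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=your_sql_server;"          # hypothetical SQL Server host
    "DATABASE=Tfs_DefaultCollection;"  # hypothetical collection database
    "Trusted_Connection=yes;"
)

cursor = conn.cursor()
cursor.execute(
    """
    SELECT TOP (100) Command, IdentityName, StartTime, ExecutionTime
    FROM dbo.tbl_Command
    ORDER BY StartTime DESC
    """
)
for row in cursor.fetchall():
    print(row.Command, row.IdentityName, row.StartTime, row.ExecutionTime)
conn.close()
```

Run it on a schedule (Windows Task Scheduler, for example) and write the rows wherever you need them archived, since the tables themselves only hold 14 days of data.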
If you don't want to get the activity log through the tbl_Command table or the hidden activity log page (http://server:port/tfs/_oi), I'm afraid there is no other way at present.
You could add a request for this feature on our UserVoice site, which is our main forum for product suggestions. After the suggestion is raised, you can vote and comment on it. The product team will provide updates if they review it.
I have some industrial data that I wish to present to a client in a Spotfire dashboard. I want to make the dashboard update automatically, and I have not been able to find a tutorial on TIBCO's site or here for how to do this.
It would be great if someone could tell me how to make Spotfire look in a particular place (server, desktop, wherever) for a new .csv file, open it, create a defined set of visualizations, and then mail a PDF to the client.
I have been through the Spotfire Automation Services manual, but I can't find a specific guide for what I need it to do.
All help appreciated.
I went back and read through the whole question again, including the other person's answer. In order to do what you want, you need both Spotfire Server and Spotfire Automation Services. Automation Services is a product that you have to purchase separately from Spotfire, although it is bundled with the Analyst client.
In your question you said -- It would be great if someone could tell me how to make Spotfire look in a particular place (server, desktop, wherever) for a new .csv file, open it, create a defined set of visualizations, and then mail a PDF to the client.
I made the assumption that you knew you needed Automation Services to do this. All of my answers have been based around the use of Automation Services; that's the only way I know of to push an email to a user. After you set up the Automation Services job, you also have to use Active Batch to schedule it, which I noted as part of the original three-step process.
I want to make the dashboard update automatically, and I have not been able to find a tutorial on TIBCO's site or here for how to do this.
What you want to do is schedule updates to your linked data. This will re-query the data source on the schedule you specify (once a day, twice an hour, etc.) and cache the results on the web server.
Here is the documentation for that.
Schedule Updates
Scheduling updates using Spotfire Server (be sure to navigate down the sub items on the left)
Monitoring Schedule Updates
It would be great if someone could tell me how to make Spotfire look in a particular place (server, desktop, wherever) for a new .csv file, open it, create a defined set of visualizations, and then mail a PDF to the client.
For this, you still want to use schedule updates for the first part, after you have linked your analysis to your .csv file. Your file name will have to remain the same for Spotfire to pick it up, unless you customize this with some scripting. Once that is complete, you'll want to use Automation Services to handle mailing the PDF.
Automation Services Tutorial
Automation Services User Manual
Generally speaking, this is a three-step process.
1. In the desktop app, create a prepared report (File -- Export -- To PDF -- Prepared report). In this step you are creating the export and telling Spotfire specifically what to export. Where to find the report
2. In the desktop app, create the Automation Services job (Tools -- Automation Services Job Builder). All jobs start with opening the file; then you create the export; then you send the email. Sample Active Batch job
3. Now, you have to automate the task. This can be done with Windows Task Scheduler or Active Batch.
Those are the high-level steps. There is a lot of syntax and detail in each of the steps, but this should get you started. Please reply with more detailed questions on any one of the steps.
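For step 3, the scheduler typically just invokes the Automation Services client job sender with your job file. As a minimal sketch (the executable name, install path, and argument order here are assumptions; confirm them in the Automation Services manual for your version), a scheduled wrapper might look like this in Python:

```python
import subprocess
from pathlib import Path

# Assumptions: Automation Services ships a client job sender executable
# that takes the server URL and a job file path as arguments. The path,
# executable name, and argument order are placeholders -- confirm them
# in your Automation Services manual.
JOB_SENDER = Path(
    r"C:\Program Files\TIBCO\Spotfire Automation Services"
    r"\Spotfire.Dxp.Automation.ClientJobSender.exe"
)
SERVER_URL = "http://your-spotfire-server"     # hypothetical server URL
JOB_FILE = Path(r"C:\jobs\daily_report.xml")   # hypothetical job file

result = subprocess.run(
    [str(JOB_SENDER), SERVER_URL, str(JOB_FILE)],
    capture_output=True,
    text=True,
)
print(result.stdout)
result.check_returncode()  # non-zero exit means the job submission failed
```

Point Windows Task Scheduler or Active Batch at this script on whatever cadence you need.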
Is it possible to automate publishing a .dxp file to the server? What I want to achieve is to build a command-line tool: the user navigates to the .dxp file and runs a command (let's say publish), which saves the file to the library without opening the Spotfire client (something similar to running Spotfire in headless mode).
I gather that Spotfire Automation Services can help with this task, but I have never used Automation Services and don't know how to install or find the module. Any help or direction is highly appreciated. Thanks.
Automation Services is a licensed framework from TIBCO that lets you automate several tasks, including opening or saving analyses to the library, replacing or remapping data sources, running alerts, etc.
But if your sole requirement is to publish an analysis to the library, then use the import-library-content command on the command line, or save it in a batch file and use IronPython to trigger it from a button in the DXP.
See API docs of import-library-content for usage.
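For illustration only, a small wrapper that drives the server's command-line tool from Python might look like the sketch below. The tool location and the option names (--tool-password, --file-path, --library-parent-path, --conflict-resolution-mode) are assumptions; confirm them in the import-library-content documentation for your server version, and note that the command typically imports a library export bundle rather than a bare .dxp.

```python
import subprocess

# Assumptions: the Spotfire Server command-line tool (config.bat) exposes
# an import-library-content command with the options used below. All paths,
# option names, and values are hypothetical -- verify against the docs.
CONFIG_TOOL = r"C:\tibco\tss\<version>\tomcat\bin\config.bat"

subprocess.run(
    [
        CONFIG_TOOL,
        "import-library-content",
        "--tool-password=<tool-password>",             # hypothetical secret
        r"--file-path=C:\exports\analysis.part0.zip",  # hypothetical export bundle
        "--library-parent-path=/Reports/",             # hypothetical library folder
        "--conflict-resolution-mode=KEEP_NEW",
    ],
    check=True,  # raise if the import fails
)
```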
You can check Google for Automation Services and see more details on its capabilities, but I'm pretty sure your use case is covered. Spotfire does not offer this feature out of the box (except maybe via the admin command tool, but that would only be for admins).
Your organization will need to buy a license for Automation Services, since it's a separate product in the Spotfire suite. Talk to your TIBCO rep or send me a PM.
Yes, you can use Automation Services for this. Automation Services needs to be licensed from TIBCO, but it provides a framework that lets you create job.xml files specifying a list of tasks for Automation Services to execute. You can then submit the job file to a web service whenever you want the tasks to be executed.
I have a business requirement to process files uploaded by regional businesses for import into another system. It's envisaged that users will upload to SharePoint 2007 (soon to be SharePoint 2013), an event will trigger an export to CSV, and the process will then run against those files.
Is this possible in either SharePoint version?
Would that be an app, or a standalone service that I would create and schedule?
Does anyone have a more elegant solution? Essentially the CSV export feeds into a program that lets a user visually validate the data and press a button to push it to the other system after tweaking.
With custom code, you could create an event receiver on the list where the CSV file lives that will run whenever the CSV file is updated. Here's a starter:
http://elczara.wordpress.com/2011/02/16/sharepoint-2010-event-receiver/
Make it a farm solution (sandbox solutions can't write to the filesystem directly), and you'll probably want to look up RunWithElevatedPrivileges, since the user doing the uploading may not have permission to write to the file system.
Steve's suggestion of rethinking the end-to-end solution is a good one, although I'm not sure how you can trigger the other system to "do its business".
Yes, it is possible, with both the 2007 and the 2013 versions.
Depending on your deployment scenario, you can:
Create a custom timer job that will execute your process.
Create a custom site workflow, with a loop and a delay, that will do the job.
The first is easier to build and maintain, but offers less flexibility if you need to apply a custom process.
But if you can control the application that consumes the feed, why not consume SharePoint directly? From the 2010 version onward, you can very easily get data using the listdata.svc web service. With older versions, you can still get data using a simple web service.
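For example, if the consuming application can make HTTP calls, reading a list through listdata.svc is a few lines. In the sketch below, the site URL, list name, and credentials are placeholders, and on-prem farms usually need NTLM authentication (provided here by the requests-ntlm package):

```python
import requests
from requests_ntlm import HttpNtlmAuth  # NTLM auth for on-prem SharePoint

# Placeholders: adjust the site URL, list name, and credentials to your farm.
SITE = "http://sharepoint/sites/regional"  # hypothetical site URL
LIST_NAME = "UploadedFiles"                # hypothetical list name

resp = requests.get(
    f"{SITE}/_vti_bin/listdata.svc/{LIST_NAME}",
    auth=HttpNtlmAuth("DOMAIN\\user", "password"),
    headers={"Accept": "application/json"},
)
resp.raise_for_status()

# listdata.svc wraps collections in {"d": {"results": [...]}}.
for item in resp.json()["d"]["results"]:
    print(item.get("Title"), item.get("Modified"))
```

With SharePoint 2007 you would target the older Lists.asmx SOAP service instead, since listdata.svc only exists from 2010 onward.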