Splunk - Finalize and delete current activity - search

I am using the Splunk community edition for a monitoring dashboard. I display a page of real-time charts and refresh the browser periodically via a script.
This all works well; however, the community edition limits how many live searches can run at once, and the repeated browser refreshes eventually push the background search jobs up to that limit. I am therefore looking for a way to finalize and delete the jobs running in the background.
So how can I schedule the finalizing and deleting of the old jobs? (Possibly a file I can remove from time to time?)

I overcame it by putting this at the top of my dashboard view (in the surrounding XML):
<view autoCancelInterval="5" template="dashboard.html">
The auto-cancel interval here is set to 5 seconds: once a search has been inactive for 5 seconds or more, it is automatically cancelled.
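If the auto-cancel alone doesn't keep you under the limit, you could also clean up old jobs on a schedule through Splunk's REST API. A minimal Python sketch; the search-job endpoints are standard Splunk REST endpoints, but the host, port, and credentials below are placeholder assumptions:

import requests

BASE = "https://localhost:8089"  # Splunk management port (assumed default)
AUTH = ("admin", "changeme")     # placeholder credentials

# List all current search jobs
resp = requests.get(
    f"{BASE}/services/search/jobs",
    params={"output_mode": "json", "count": 0},
    auth=AUTH,
    verify=False,  # many Splunk installs use a self-signed certificate
)
resp.raise_for_status()

for entry in resp.json().get("entry", []):
    sid = entry["content"]["sid"]
    # Finalize the job, then delete it
    requests.post(
        f"{BASE}/services/search/jobs/{sid}/control",
        data={"action": "finalize"},
        auth=AUTH,
        verify=False,
    )
    requests.delete(f"{BASE}/services/search/jobs/{sid}", auth=AUTH, verify=False)

Run from cron or Task Scheduler, this does the periodic cleanup the question asks about; add a check on each job's age if you only want to remove stale ones.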

Related

Is there a way to save progress in a desktop application developed in Python and PyQt5?

Our application processes a large audio file into a number of smaller segments and displays them in a table in the GUI. The user can listen to, label, and comment on each segment in the table. Is there a way to save the user's progress so it can be resumed from the last accessed row in the table?
For example, if the table has 700 rows and the user has worked through 100 of them and closes the application, then the next time they open the application they should be able to start at the 101st row, with the previous work saved.
The pickle module (import pickle) lets you serialize and deserialize Python objects, so you can save a progress state from one session to the next.
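A minimal sketch of that idea (the file name and the shape of the saved state are assumptions, not from the question):

import pickle

PROGRESS_FILE = "progress.pkl"  # hypothetical location for the saved state

def save_progress(last_row, annotations):
    # Persist the last accessed row plus the user's labels/comments
    with open(PROGRESS_FILE, "wb") as f:
        pickle.dump({"last_row": last_row, "annotations": annotations}, f)

def load_progress():
    # Return the saved state, or sensible defaults on first run
    try:
        with open(PROGRESS_FILE, "rb") as f:
            return pickle.load(f)
    except FileNotFoundError:
        return {"last_row": -1, "annotations": {}}

Call save_progress from the PyQt window's closeEvent (or after every edit), and at startup scroll the table to load_progress()["last_row"] + 1.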

Kentico 8.2 - Is there a way to delete Attachment History from the GUI

Our attachments have been stored in our database for a very long time, and it has caused the database to become huge and our backups to be extremely unreliable. We have moved our attachments to the file system and that shaved off a great deal of size.
Now our largest table is CMS_AttachmentHistory. I've been able to test brute-force deleting every row in SQL (and every row in the CMS_VersionAttachment junction table). But is there a way to accomplish this in the Kentico admin GUI without having to resort to that?
When I say brute force delete, I mean:
DELETE FROM dbo.CMS_VersionAttachment
DELETE FROM dbo.CMS_AttachmentHistory
There is an option in the GUI that will do this, but it will also impact page version history. Go to Settings > Content > Content management and look in the Workflow section for a setting named Version history length. Reducing it to a lower number (I believe 20 is the default) deletes the unneeded rows so that the stored version history reflects the new value.
This affects all version history, though: not just the attachments but the pages themselves. You therefore need to decide whether you want to keep the pages' version history.
If you don't want to lose that history, a good option would be to write a script that sets the AttachmentBinary column to null for the records you no longer need. Given that you now store the files on the filesystem, any current versions will have the correct value, so that is probably all of them.
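As a rough sketch of such a script (Python with pyodbc and a placeholder connection string; a small C# tool using the Kentico API would work just as well):

import pyodbc

# Placeholder connection string - point it at your Kentico database
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost;DATABASE=Kentico;Trusted_Connection=yes;"
)
cur = conn.cursor()

# Clear the stored binaries but keep the history rows themselves;
# the files now live on the filesystem, so nothing current is lost.
cur.execute(
    "UPDATE dbo.CMS_AttachmentHistory "
    "SET AttachmentBinary = NULL "
    "WHERE AttachmentBinary IS NOT NULL"
)
print(cur.rowcount, "rows cleared")
conn.commit()
conn.close()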
I'm not sure about 8.2, but you can experiment with the recycle bin for objects. There are a couple of relevant topics: topic 1 and topic 2. I just checked: Kentico puts the attachment in the recycle bin every time you delete one, even though I have the "files on disk" setting enabled. You can either do what Kentico recommends and set the binary field to null, or write a script using the API.

How to avoid saving report output versions in Cognos?

I am working on Cognos, and the user wants report output versions not to be saved with the report.
I looked for options and found the screen below, which has a radio button that cannot be unchecked. Setting the value to zero means an unlimited number of outputs.
I believe setting Duration to 0 days will stop output versions being saved.
If not, set it to 1 day and it will clear itself each night.
There is a default setting in Cognos that can be accessed from the report properties.
It lets you set the default action taken when a user clicks a report link in Cognos Connection.
As in the screen below, I changed the default action from "View most recent report" to "Run the report".

Execute SQL job based on excel or text file message

We have a SQL Server Agent job which we need to run manually on user request. The requests are random, and unfortunately we cannot predict when users will make them. Also, we have to run it after hours (and it takes over an hour to run).
Anyway, I wanted to see if it is possible to run this job automatically based on a text file we can put on a share drive. Users could update this file to say "Run" or "Stop", along with a few parameter values. I could set up a schedule that runs daily: if the status changes to "Run", the job runs each day until the text file is updated to say "Stop".
You can create an extremely simple small desktop app for the users, where they can start/stop the job (simple C#: connect to the database and execute the job by name).
Alternatively, you can create a PowerShell script that watches for the text file and, depending on its value, runs or stops the job; a sketch of that idea follows.
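Here is that watcher sketched in Python with pyodbc rather than PowerShell. The file path, job name, and "Run"/"Stop" file format are assumptions taken from the question; sp_start_job and sp_stop_job are the standard msdb procedures:

import time
import pyodbc

FLAG_FILE = r"\\share\jobs\run_flag.txt"  # the text file on the share drive (assumption)
JOB_NAME = "AfterHoursJob"                # placeholder SQL Server Agent job name

def read_flag():
    # The first word of the file is expected to be "Run" or "Stop"
    try:
        with open(FLAG_FILE) as f:
            words = f.read().split()
        return words[0].lower() if words else ""
    except OSError:
        return ""

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost;DATABASE=msdb;Trusted_Connection=yes;",
    autocommit=True,
)
cur = conn.cursor()

while True:
    flag = read_flag()
    try:
        if flag == "run":
            # Raises an error if the job is already running, hence the try/except
            cur.execute("EXEC msdb.dbo.sp_start_job @job_name = ?", JOB_NAME)
        elif flag == "stop":
            cur.execute("EXEC msdb.dbo.sp_stop_job @job_name = ?", JOB_NAME)
    except pyodbc.Error:
        pass
    time.sleep(300)  # check the share every 5 minutes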

Excel 2007 Refresh Imported CSV File From Web

Log data from a test is uploaded to a web service, and the processed CSV is downloaded back into Excel for viewing in charts. At the moment, this is done via copy and paste for short CSV files and the Data > From Text feature for larger CSV files. Unfortunately, this takes a bunch of time for every test, and I need to make the process very simple for someone else to update the Excel spreadsheet.
The Excel spreadsheet contains 5 raw-data pages which are used to store the CSV from the server. I have no issues selecting Data > From Text, entering the website URL, and completing the format to import. This process can be repeated (same as the Copy and Paste) for all 5 pages to import the data.
This process only allows me to put in one filename, so I am using the same URL for the data, and having PHP return the CSV of the latest (or a specifically configured) test whenever the website is accessed. I've verified that this process is working correctly.
Unfortunately, when I do 'Refresh All', it prompts for a filename unless I go to Data > Connections > Properties, and uncheck 'Prompt for file name on refresh'.
However, even when I do that, I'm getting mixed results. Sometimes only one of the pages will update. (Seems to be the last one I set up.) Sometimes none of them do. I need a solution which updates all 5 pages based on the current CSV from the server without having to set up the connections again every time. Ideally I'd like to just hide these raw data sheets so we can have an Excel file that's just the final charts.
Surely this is a common function and I am doing something wrong, yet none of the guides I've tried from the Internet seem to work. For example, this one:
http://www.kimgentes.com/worshiptech-web-tools-page/2010/8/18/web-connecting-csv-files-as-external-data-to-excel-spreadshe.html
Seems like they only set up one connection. I can get one working to refresh, but not more than one.
I have seen this happen and finally figured it out. There are actually 3 things that can happen to give this result, and a separate solution for each:
First, Excel uses the IE 11 web object when it retrieves data from the web, which means it is "sticky" to sessions established in IE 11. Most websites these days run on cloud servers, which create sessions on whichever server has the most load. This normally has no impact on users in a browser, since they log in and enter their credentials visually, but when a program accesses a website through a specific browser engine, it is bound to that browser's behavior. I ran into this a lot: I could generate and download my CSV files from the website in Chrome, yet importing the same files through Excel wouldn't work (it would say they weren't there). The solution, at least for now, is to use IE 11: log in to the website, generate the CSV files, and check that they can be downloaded. Then run the web import from Excel, and it should pick up the same sticky session and find the CSV files.
Second, password entry is a separate issue, but it also comes down to session stickiness. For some reason Excel will not cache your credentials for a website until you have entered them three times. Your experience may differ, but I found I had to enter a new credential set (for a new web import of a CSV) three times before Excel cached it permanently. After that, the problem went away.
Third, most complex Excel workbooks that import from the web may also need to import local data downloaded from a website, import data from a website into a sheet, or run more complex objects such as macros. All of these need the proper permissions, so you may have to adjust your Trust Center settings (part of MS Office) to allow the workbook to run this way on your computer. You can add and update trusted locations per Microsoft's documentation here:
https://support.microsoft.com/en-us/office/add-remove-or-change-a-trusted-location-7ee1cdc2-483e-4cbb-bcb3-4e7c67147fb4
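If the web queries keep misbehaving, one workaround is to sidestep them entirely: script the downloads, and point the workbook's Data > From Text connections at stable local files. A minimal Python sketch, with hypothetical URLs and file names standing in for the question's PHP endpoint that serves the latest test:

import requests

# Hypothetical mapping of local file name -> CSV endpoint, one per raw-data sheet
SOURCES = {
    "raw1.csv": "https://example.com/latest.php?page=1",
    "raw2.csv": "https://example.com/latest.php?page=2",
    # ... entries for the remaining three sheets
}

for filename, url in SOURCES.items():
    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    with open(filename, "wb") as f:
        f.write(resp.content)
    print("saved", filename)

With the connections reading local files, Refresh All no longer depends on the IE session, and the raw-data sheets can stay hidden behind the chart sheets.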
