Paraview: Create a state file in an external program - vtk

My C++ code outputs a number of vtu files and stl files. Each file has a different mesh and a different number of fields. I want the user to be able to open those vtu files in ParaView together so that they all appear in the same pipeline. Currently, the user has to open each vtu file separately, or group-select them in the Open File dialog box and open them that way. But I want to give the user a better experience. I'd like the user to not have to worry about all the different files and to open just one "combined file". Is there a way to create one single file from all these vtu and stl files? Or to create a single "reference" file that points to those other vtu and stl files, so that the user has to open only the reference file?

If you have a way to get the list of files to load, you can create a Python script alongside your data, where you basically put:
from paraview.simple import *
# recover the file list
# ...
for file in files:
    OpenDataFile(file)
Then one can just load this script as a state in ParaView.
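Since the question says the exporting program already knows which files it wrote, it can generate that loader script itself, right next to the data. Here is a sketch in Python of that generation step (the original exporter is C++, but the logic is the same; the script name load_all.py is an assumption, not from the original):

```python
import glob
import os


def write_loader_script(data_dir, script_name="load_all.py"):
    """Generate a ParaView loader script next to the exported data.

    Lists every .vtu and .stl file in data_dir and emits one
    OpenDataFile() call per file, so the user only has to open
    the generated script as a state in ParaView.
    """
    files = sorted(glob.glob(os.path.join(data_dir, "*.vtu"))
                   + glob.glob(os.path.join(data_dir, "*.stl")))
    lines = ["from paraview.simple import *", ""]
    lines += ["OpenDataFile({!r})".format(os.path.abspath(f)) for f in files]
    script_path = os.path.join(data_dir, script_name)
    with open(script_path, "w") as fh:
        fh.write("\n".join(lines) + "\n")
    return script_path
```

The user then loads the single generated load_all.py instead of picking the vtu and stl files by hand.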

Related

Embed Large Text File into C++Builder Windows App

I want to embed a large dictionary text file (more than 66k words of no more than 30 characters per word, with one word per line) into a C++Builder executable to run on Windows. How can that be done?
Rather than embed the text file into the executable itself, you could just store the file in the same folder as the executable (or any other folder of your choosing), and then load it at runtime into a TStringList object using its LoadFromFile() method.
Otherwise, if you really want to embed the file, you can store its content in a resource inside the executable, by referring to the file in an .rc script added to the project (or by using the Project > Resources and Images... dialog), and then load the resource data into a TStringList object at runtime using its LoadFromStream() method with a TResourceStream. See Resource Files Support in Embarcadero's documentation for further details and an example.
I successfully used the Resource Files Support link to embed a short text list file into a sample exe.
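The first suggestion, shipping the dictionary next to the program and reading it at runtime, can be sketched language-neutrally. This Python version stands in for the TStringList.LoadFromFile() call in the answer above (the file name words.txt is illustrative, not from the original):

```python
from pathlib import Path


def load_word_list(directory, name="words.txt"):
    """Load a one-word-per-line dictionary file that ships alongside
    the program, reading it into a list at runtime rather than
    embedding it in the executable."""
    path = Path(directory) / name
    return path.read_text(encoding="utf-8").splitlines()
```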

azure data factory: iterate over millions of files

Previously I had a problem with merging several JSON files into one single file, which I was able to resolve with the answer to this question.
At first, I tried with just some files, using wildcards in the file name in the connection section of the input dataset. When I remove the file name, in theory all of the files in all folders should be loaded recursively, since I checked the "copy recursively" option in the source section of the copy activity.
The problem is that when I manually trigger the pipeline after removing the file name from the input dataset, only some of the files get loaded: the task ends successfully but loads only around 400+ files, while each folder has 1M+ files. I want to create big CSV files by merging all the small JSON files in the source (I was already able to create a CSV file by mapping the schemas in the copy activity).
It is probably stopping due to a timeout or out of memory exception.
One solution is to loop over the contents of the directory using
Directory.EnumerateFiles(searchDir)
This way you can process all the files without having the list / contents of all files in memory at the same time.
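Directory.EnumerateFiles is a .NET API; the same streaming idea in Python looks like the sketch below, which yields one path at a time instead of materializing a multi-million-entry list in memory:

```python
import os


def iter_files(root):
    """Lazily walk a directory tree, yielding file paths one at a time.

    Like Directory.EnumerateFiles, this streams results rather than
    building the full file list up front, so millions of files can be
    processed (e.g. merged into a CSV) with constant memory use.
    """
    stack = [root]
    while stack:
        current = stack.pop()
        with os.scandir(current) as entries:
            for entry in entries:
                if entry.is_dir(follow_symlinks=False):
                    stack.append(entry.path)
                else:
                    yield entry.path
```

Each yielded path can be read, appended to the output file, and discarded before the next one is fetched.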

Shortcut with environment variable

I'm at work, with a folder in which we create a daily Excel sheet to manage our clients. For the sake of understanding, let's imagine this file is in a folder called OCTOBER and the files are named MD01, MD02, MD03... based on the day of the month.
I was trying to set up a shortcut on my desktop that will call the correct file every time so I don't have to go through the file structure to access it. Something like this:
"....\OCTOBER\MD%DAY%.xls"
But the moment I try to set up the path this way, I get an error saying this is not a valid path. Well, either I am missing something here or what? Can't this be done?
A typical method is to create a link to a master file in the folder (say master.xlsm). The master spreadsheet would automatically:
determine the date
determine the appropriate file to open
open the file (say MD10.xls)
close itself
Alternatively, you could create a little VBScript, PowerShell script, or .bat file to do the same thing.
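The script alternative boils down to computing today's file name and opening it. A Python sketch of the path-building step (folder and file naming follow the question; the base directory is an assumption, and strftime("%B") assumes an English locale):

```python
import datetime
import os


def workbook_path(base_dir, day=None):
    """Build the path of the day's sheet, e.g. ...\\OCTOBER\\MD05.xls.

    The month folder is the upper-cased English month name and the
    files are MD01, MD02, ... per the question's naming scheme.
    """
    if day is None:
        day = datetime.date.today()
    folder = day.strftime("%B").upper()          # e.g. "OCTOBER"
    name = "MD{:02d}.xls".format(day.day)        # e.g. "MD05.xls"
    return os.path.join(base_dir, folder, name)
```

On Windows, a call such as os.startfile(workbook_path(r"C:\clients")) would then open the sheet in Excel; pointing the desktop shortcut at a small script doing this replaces the (unsupported) environment variable in the .lnk target.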

Make Infragistics UltraGrid Document Exporter launch file

We have an existing application that allows exporting an Infragistics data grid to either Excel or PDF format. Currently, when the user clicks the Export button, it asks them where to save the file, then exports and saves it. To launch it, they then go to where they saved it and open it from there.
The user wants the application to instead launch the export directly in Adobe Acrobat or Excel, and THEN let the user opt to save the file from there. They don't want it to ask where to save the file before it exports, like it currently does.
Is this possible with the Infragistics Document Exporter? I couldn't find any information on this on the Infragistics web site.
I'm thinking, instead of giving it a filename, I could maybe use a stream to the console or something like that and let the OS give the user the option to launch it?
Is there an example somewhere of this being done? I see there is an overload of the Export member function that allows you to pass in a stream.
Thanks!
The Infragistics Excel engine and documents engine need to write to a file for it to be opened in Excel or Adobe Acrobat, so you will still need to save a file before it can be opened.
To open the file, you can use the System.Diagnostics.Process.Start method; if there is a program associated with the file type, you can pass it the file that you just saved.
As there is a dependency on the file system for opening the file in Excel or Adobe Acrobat, you will not be able to avoid saving the file first. While it may be an option to save the file in a temporary location and then open it from there, that has its own issue: if the user clicks Save in Excel, it will still save to the temporary location, so they would need to know to use Save As to save somewhere else.
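The save-to-temp-then-launch flow described above can be sketched as follows. This is a Python analogue of the C# Export-then-Process.Start sequence, not the Infragistics API itself (the function name and byte-string input are illustrative):

```python
import os
import subprocess
import sys
import tempfile


def export_and_open(data, suffix=".pdf", launch=True):
    """Save exported bytes to a temporary file, then hand the file to
    whatever program is associated with its type -- the counterpart of
    calling Process.Start(path) after Export(path). The file must exist
    on disk first, as the answer explains."""
    fd, path = tempfile.mkstemp(suffix=suffix)
    with os.fdopen(fd, "wb") as fh:
        fh.write(data)
    if launch:
        if sys.platform == "win32":
            os.startfile(path)               # uses the file-type association
        elif sys.platform == "darwin":
            subprocess.Popen(["open", path])
        else:
            subprocess.Popen(["xdg-open", path])
    return path
```

Note the caveat from the answer still applies: the document opens from the temporary location, so a plain Save in Excel or Acrobat writes back there.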

difference between data writing to existing file and copy entire content from another in C#

I have developed a Windows application which checks for newly updated or generated files in a directory and copies their data to a temporary file. My application is doing its job perfectly. But there is a third-party application, "POS Text Sender", which reads all the text from the temporary file and displays it on a CCTV camera, and it works if I use the Notepad text editor to copy the data. When my application does the copying instead, POS Text Sender reads the contents from the first file and keeps tracking that file's updates over time. Once a new file is generated in the directory, my application, as usual, copies its entire contents to the temporary file, but POS Text Sender does not read that data, and for it to display any new contents I have to restart it. I really don't know how POS Text Sender detects that my application is copying from a newly generated file, or how to stop that. What is the difference between writing data to an existing file and copying the entire contents from another?
What is the difference between data writing to an existing file and copying the entire contents from another?
It sounds like maybe the 3rd-party app is looking at either the created or modified date stamp on the files, and not taking both into account. The difference is as follows:
when you create a new file, created and modified will be the same
when you edit an existing file, modified will be later than created
when you copy an existing file, the new file's created will be later than its modified timestamp
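The third case, a copy whose "created" stamp is later than its "modified" stamp, can be demonstrated directly. A small Python sketch (note the platform caveat: on Windows st_ctime is the creation time; on Unix it is the inode change time, which behaves the same way for a fresh copy):

```python
import os
import shutil
import tempfile
import time

# Make a source file, wait so the clock moves on, then copy it.
# shutil.copy2 preserves the source's modified time, while the copy's
# created/changed time is "now" -- so the copy ends up with
# created > modified, the combination the answer suggests the
# third-party app may not handle.
src = tempfile.NamedTemporaryFile(delete=False)
src.write(b"receipt text")
src.close()
time.sleep(1.1)

dst = src.name + ".copy"
shutil.copy2(src.name, dst)      # preserves mtime, not ctime

st = os.stat(dst)
print(st.st_ctime > st.st_mtime)
```

An app that keys only on the modified stamp would therefore treat the freshly copied file as old data, which matches the behavior described in the question.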
