How to use fetch to append output to a txt file, instead of overwriting the file? - linux

I want to use fetch to gather a line of info from multiple nodes and store them all in the same txt file.
Right now I have:
fetch:
  src: /path/to/file.txt
  dest: /ansible/path/to/file.txt
  flat: yes
Instead of appending info to the existing txt file, it overwrites the file and deletes the old info.

According to the official documentation of the fetch module:
Files that already exist at dest will be overwritten if they are different than the src.
https://docs.ansible.com/ansible/latest/collections/ansible/builtin/fetch_module.html
You could use the lineinfile or blockinfile module instead.

Fetch all the files you want to append, renaming each one so that the destination name includes some node identifier, such as ansible_hostname or the IP address. All the files need to land in the same folder, and since every node produces the same source file name, it is the destination name that makes the difference; without a unique name, each fetch overwrites the last. So:
1. Use a per-host variable like ansible_hostname when building the file name for each fetched file.
2. Get a list of all the fetched files in a variable.
3. Iterate through that variable and use a file lookup for each one:
block={{ lookup('file', 'sourceFile') }}
You can also iterate over all the files in a folder while appending to the end of the destination file. In your case, I believe blockinfile is appropriate for this operation.
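A minimal sketch of the fetch-then-append approach described above (all paths, the fetched-files folder, and the marker format are illustrative placeholders, not from the original post):

```yaml
# Sketch: fetch each node's file under a host-specific name, then append
# every fetched file into one combined file on the control node.
- name: fetch the file from every node under a unique name
  fetch:
    src: /path/to/file.txt
    dest: "/ansible/path/to/fetched/{{ ansible_hostname }}.txt"
    flat: yes

- name: append each fetched file to the combined file
  blockinfile:
    path: /ansible/path/to/file.txt
    create: yes
    # A distinct marker per source file keeps the blocks from replacing
    # each other on repeated runs.
    marker: "# {mark} {{ item | basename }}"
    block: "{{ lookup('file', item) }}"
  loop: "{{ lookup('fileglob', '/ansible/path/to/fetched/*.txt', wantlist=True) }}"
  delegate_to: localhost
  run_once: yes
```

Because blockinfile is idempotent per marker, re-running the play updates each host's block in place rather than duplicating it.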


SSIS won't execute foreach loop for dynamic xlsx filename [duplicate]

I have an xlsx file that will be dropped into a folder on a monthly basis. The filename will change every month (filename_8292019) based on the date, which I cannot change.
I want to build a foreach loop to pick up the xlsx file and manipulate it (load it into a SQL Server table, then move the file to an archive folder). I cannot figure out how to do this with a dynamic filename (where the date changes).
I was able to successfully run the package when converting the xlsx to CSV, and also when pointing directly to the xlsx filename.
[Flat File Destination [219]] Error: Cannot open the datafile "filename"
OR errors relating to file not found
The Files: entry on the Collection tab of the Foreach Loop container will accept wildcard characters.
The general pattern here is to create a variable, say, FileName. Set your Files: to something like:
Files:
BaseFileName*
or, if you want to be sure to only pick up spreadsheets, maybe:
Files:
BaseFileName*.xlsx
Select either Name and extension or Fully qualified, which will include the full file path. I usually just use Name and extension and put the file path into another variable so when Ops tells me they're moving my drop location, I can change a parameter instead of editing the package. This step tells the container to remember the name of the file it just found so you can use it later for a variable mapping.
On the Variable Mappings tab, select your variable name and assign it to Index 0.
Then, for each spreadsheet, the container will loop, pick up the name of the first file it finds that matches your pattern, and assign the full name, with the date extension (and path, if you go that way), to your variable. Pass the variable as an input parameter to the tasks inside the loop and use that to process the file, including moving it to the archive, or you'll get yourself into an infinite loop, processing the same file(s) over and over. <--Does that sound like the voice of experience? Yeah. Been there, done that.
Edit:
Here, the FullFilePath variable is just the folder name, without a file reference. (Red variable to red entry in the Folder box).
The FileBaseName variable drives what shows up in the Files box. (Blue to blue).
Another variable picks up the actual file name, with the date extension. Later, say in a File System Task, if I need the folder & file name together, I concatenate the variables.
As far as the Excel Connection Manager error you're getting, unfortunately I'm no help. I don't use it. We have SentryOne's Task Factory for SSIS which includes a much more resilient Excel connector.

Renaming multiple zip files with complex names using a single command in Linux

I have multiple zip files in a folder with names like below:
"abc.zip-20181002084936558425"
How to rename all of them with one command to get result like below:
"abc-20181002084936558425.zip"
I want the timestamp before the extension for multiple filenames. Every file has a different timestamp, so the renaming should account for that. Can I rename multiple files like this using a single command?
Provided your files really all follow the same naming convention and you are in the right directory:
for i in *.zip-*; do newName="${i//.zip/}"; mv "$i" "$newName.zip"; done
should do the trick.
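To see why this works, here is the same loop run against a throwaway directory (the directory and the second file name are made up for the demo; only abc.zip-20181002084936558425 comes from the original post). The parameter expansion ${i//.zip/} deletes the ".zip" from the middle of the name, and the loop then re-appends it at the end:

```shell
# Demo in a scratch directory so nothing real is touched.
mkdir -p /tmp/zipdemo && cd /tmp/zipdemo
touch "abc.zip-20181002084936558425" "def.zip-20181002090000000000"

for i in *.zip-*; do
  # ${i//.zip/} strips ".zip" from the name; quoting guards against spaces.
  mv -- "$i" "${i//.zip/}.zip"
done
```

After the loop, the directory contains abc-20181002084936558425.zip and def-20181002090000000000.zip.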

How to access a Puppet variable in custom facts?

I am extracting a zip at a location; in the extracted directory there is an install.jar. I want to copy a file to the directory where install.jar is located. The zip I am extracting may have a different folder structure every time, so I cannot use
file { 'EXTRACTED_PATH/ant.properties':
  ensure => present,
}
So I wrote a custom fact that finds the path of install.jar, and I access its value in the manifest like
$install_jar_location=$::getinstallerpath
Now in the facts file I have to give the path to search, and I want to pass this path as a parameter.
For that I declared a variable in the manifest file:
$installer_location="/home/oracle/Installer"
How can I access it in my custom fact? I tried the following, but the $installer_location variable value comes up blank.
Facter.add(:getinstallerpath) do
  setcode do
    Facter::Util::Resolution.exec("echo \"$(dirname \"$(find $installer_location -name \*install*.jar)\")\" ")
  end
end
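The variable comes up blank because facts are resolved on the agent before the manifest is compiled, so a fact cannot see variables declared in a .pp file; the usual workarounds are to pass the path in via an environment variable or to hard-code a default in the fact. As a sketch of the search logic itself (the helper name and the demo directory layout are made up for illustration, not from the original post), the find-and-dirname step can also be done in pure Ruby inside the setcode block, avoiding the shell-out:

```ruby
require 'fileutils'
require 'tmpdir'

# Return the directory containing the first *install*.jar found under +base+,
# or nil if none exists. A custom fact's setcode block could call this with a
# path taken from ENV or a hard-coded default.
def installer_path(base)
  jar = Dir.glob(File.join(base, '**', '*install*.jar')).first
  jar && File.dirname(jar)
end

# Demo against a throwaway layout (illustrative names only):
Dir.mktmpdir do |base|
  FileUtils.mkdir_p(File.join(base, 'extracted', 'lib'))
  FileUtils.touch(File.join(base, 'extracted', 'lib', 'myinstall.jar'))
  puts installer_path(base)
end
```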

How can I use Ansible lineinfile to remove all but a few specific lines?

I'm attempting to ensure all but a few specific lines are removed from a file. I can get halfway there with the following task.
- name: ensure only the correct lines are present
  lineinfile: dest=/path/to/file
              regexp="pattern1|pattern2[0-9]*|pattern3[0-9]*"
              state=present
              backup=yes
Ultimately I want to ensure that pattern1, pattern2[0-9]*, and pattern3[0-9]* are the only lines that remain in this file. I've attempted to come up with a regex that negates this one and then specify state=absent but I've been unsuccessful so far.
If you want only specific lines in your file, why don't you just transfer a file with your desired lines to the remote server? You can use the copy module to transfer that file as is, the template module to substitute some variables inside dynamically, or even the assemble module to generate a file from fragments (such as config snippets).
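A minimal sketch of the copy-based approach (the destination path matches the question; the line contents are placeholders standing in for the real allowed lines):

```yaml
# Sketch: declare the entire desired file content so anything else is removed.
- name: ensure only the correct lines are present
  copy:
    dest: /path/to/file
    content: |
      pattern1
      pattern2
      pattern3
    backup: yes
```

This sidesteps the negated-regex problem entirely: instead of describing what to delete, you declare the whole desired state of the file.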

Node.js: rename files incrementally

I have been using the Node.js file system module for various file-related operations. I need to check whether a file name already exists in a directory and, if it does, add a suffix to the end of the file name, much like Windows does with duplicate file names.
If TestFile.txt already exists and another file with the same name comes in during processing, the new file should be renamed TestFile (1).txt, and the next file with the same name should be renamed TestFile (2).txt.
What would be the best way to achieve this? Do I have to keep all the file names in a temporary array and traverse it for each file? This is a multi-threaded environment and there could be 50,000+ documents coming in for processing.
Thanks a ton.
