I am trying to copy all the secrets, along with the subfolders, from one Vault path to another.
Example:
source = "/path/namespace/TEAM1/jenkins"
(Note: the source path above contains subfolders like job1, job2, job3, ..., and each of these subfolders contains its respective secrets as key-value pairs.)
destination="/path/namespace/team1/jenkins"
I am able to manually copy each secret to the destination folder, but I am wondering whether a code snippet could help me achieve this, i.e. recursively copy all the secrets, along with the respective sub-folders, to the destination path.
This takes a Vault secret backup from one path to another, e.g.:
input_path: secret/tmp1
output_path: secret/tmp2
With this Python script you can sync all the secrets from secret/tmp1 to secret/tmp2.
Just set input_path and output_path in the script and run it.
Link to the Python script:
https://github.com/vinamra1502/vault-backup-restore
With this script you can copy all secrets, along with the subfolders, from one Vault path to another, e.g. copy the secrets under secret/tmp1 to secret/tmp2.
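If you prefer a self-contained snippet, here is a minimal sketch of the recursive copy using the hvac library. It assumes a KV version 2 secrets engine mounted at secret; the Vault URL and token are placeholders.

import hvac

# Placeholders: point the client at your Vault server and token.
client = hvac.Client(url="https://vault.example.com:8200", token="<VAULT_TOKEN>")

def copy_secrets(src_path, dst_path, mount_point="secret"):
    """Recursively copy every secret under src_path to dst_path."""
    listing = client.secrets.kv.v2.list_secrets(path=src_path, mount_point=mount_point)
    for key in listing["data"]["keys"]:
        if key.endswith("/"):  # sub-folder: recurse into it
            sub = key.rstrip("/")
            copy_secrets(f"{src_path}/{sub}", f"{dst_path}/{sub}", mount_point)
        else:  # leaf secret: read from source, write to destination
            secret = client.secrets.kv.v2.read_secret_version(
                path=f"{src_path}/{key}", mount_point=mount_point
            )["data"]["data"]
            client.secrets.kv.v2.create_or_update_secret(
                path=f"{dst_path}/{key}", secret=secret, mount_point=mount_point
            )

copy_secrets("tmp1", "tmp2")  # e.g. secret/tmp1 -> secret/tmp2

Calling copy_secrets("tmp1", "tmp2") walks every sub-folder under secret/tmp1 and writes each secret to the same relative path under secret/tmp2.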
I am new to Synapse and I have to make a pipeline that will delete files from folders in a hierarchy like the attached image (expected hierarchy). The red half circles mark the files I would like to delete, for example files older than 2 months.
So far I have made a pipeline for a single folder: using a ForEach loop I can get to the files and delete the corresponding ones, and it works. Since I have about 60-70 folders and even more files, I wanted to go a level higher and make a pipeline that executes for each folder. And here is the problem: when I use a Get Metadata activity on the top folder and a ForEach loop to take the folder names, I cannot access the files in each folder, only the folder itself. Could someone help me solve this?
Deleting pipeline for a single folder using a ForEach loop
We can achieve this using nested ForEach activities with the help of the Execute Pipeline activity. As mentioned, Get Metadata with wildcards returns all files without folders, and the Delete activity is unable to recognize wildcard folder paths (Folder/*).
I have created a similar folder structure for the demo. In my pipeline, I first created an array parameter req_files (sample1.csv and sample2.csv) with the names of the required files.
Note: If you want to do this dynamically, you can use an Append Variable activity to build the required file names (file09/22 and file08/22).
I used one Get Metadata activity to get the folder names (which are inside the root folder). I am iterating through the output of that Get Metadata in my ForEach activity (the items value is @activity('root folder contents').output.childItems).
Inside my ForEach, I used another Get Metadata activity on each of the sub-folders (to get their file contents).
Now I have the folder name and the list of files inside it. I am going to use Execute Pipeline to implement the nested ForEach. Create 3 parameters in a new pipeline called delete_pipeline (where I perform the delete): current_folder, folder_files and files_needed.
Pass the following dynamic content for each of them from the parent pipeline.
current_folder: @item().name
folder_files: @activity('sub folder contents').output.childItems
files_needed: @pipeline().parameters.req_files
Now in delete_pipeline, I have a ForEach loop to iterate through the list of files we are passing (the items value is @pipeline().parameters.folder_files).
Inside this ForEach, I am using an If Condition activity, because I want to delete only the files which are not in my req_files parameter (the array from the parent pipeline which we passed to the files_needed parameter of delete_pipeline). The expression for the If Condition activity is the following:
@contains(pipeline().parameters.files_needed,item().name)
We need to delete a file only when it is not present in req_files (files_needed). So, when the condition is false, we perform the delete.
I have created 2 parameters, file_namepath_of_file_to_delete and file_name_to_delete, in the dataset I am using for the Delete activity, with the following dynamic content.
file_namepath_of_file_to_delete: Folder/@{pipeline().parameters.current_folder}
file_name_to_delete: @item().name
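For clarity, the combined logic of the parent pipeline and delete_pipeline is equivalent to this illustrative Python sketch (the dict stands in for the Get Metadata outputs; all names here are made up):

# The dict stands in for Get Metadata output: folder names -> child files.
folders = {
    "folder09-22": ["sample1.csv", "sample2.csv", "old1.csv"],
    "folder08-22": ["sample1.csv", "old2.csv"],
}
req_files = ["sample1.csv", "sample2.csv"]  # files to keep

for folder, files in folders.items():      # outer ForEach over folder names
    for name in files:                     # inner ForEach inside delete_pipeline
        if name not in req_files:          # the If condition evaluates to false
            print(f"delete Folder/{folder}/{name}")  # Delete activity runs here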
When I run the pipeline, it keeps the required files and deletes the rest. The following are output images for reference.
Debug output: https://i.imgur.com/E6GNVHW.png
My folder after I run the pipeline: https://i.imgur.com/bqN00Dw.png
I'm using Python to download an application from a content distribution network. The application downloads as a self-extracting file cabinet. When executed, it creates a directory using a version naming format, for example app-version2/../app.exe. Thus I cannot rely on the folder name, as it may change in the future. I'm trying to find the best way to work with the content inside the folder without depending on the actual folder name.
My idea was to find the folder with os.listdir() and then rename it with os.rename('app-version2', 'myapp'). This would work but is not automated. What would be the best automated method to find a folder name that contains version numbers and change it to something more static?
Assuming you want to find the path of the directory which begins with app, you can accomplish this using pathlib.Path:
from pathlib import Path
app_path = next(Path().glob('app*'))
This will give you the path to the first file or directory in your current directory whose name begins with "app".
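If you also want to automate the rename step from the question, here is a hedged sketch along the same lines (the version pattern and the target name myapp are assumptions; adjust them to your layout):

import re
from pathlib import Path

# Find the first directory whose name is "app" plus a version-like
# suffix (e.g. app-version2, app-2.1) and rename it to a static name.
for path in Path().iterdir():
    if path.is_dir() and re.fullmatch(r"app\D*\d[\w.-]*", path.name):
        path.rename("myapp")
        break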
My folder structure is as below:
Container/xx56585/DST_1/2021-03-26/xxxxxxxx.csv
Container/xx56585/DST_1/2021-03-26/xxxxxxxx.ctl
Container/xx56585/DST_2/2021-03-26/yyyyyyyyy.csv
Container/xx56585/DST_2/2021-03-26/yyyyyyyyy.ctl
Container/xx56585/DST_3/2021-03-26/zzzzzzzzz.csv
Container/xx56585/DST_3/2021-03-26/zzzzzzzzz.ctl
Container/xx56585/DST_4/2021-03-26/sssssssssss.csv
Container/xx56585/DST_4/2021-03-26/sssssssssss.ctl
I need to copy the .csv and .ctl files to an sFTP target and move these files to an archive folder (in the blob storage, after the copy activity).
Please help me with this.
Update:
We can use Get Metadata1 to check whether the ctl file exists.
Add the dynamic content @concat('xx56585/',item(),'/',substring(adddays(utcnow(),-3),0,10),'/') to the path.
I created a simple test to copy files under the <rundate> folders to a target folder.
My folder structure
Input/xx56585/DST_1/2021-03-26/xxxxxxxx.csv
Input/xx56585/DST_2/2021-03-26/yyyyyyyyy.csv
Input/xx56585/DST_3/2021-03-26/zzzzzzzzz.csv
Input/xx56585/DST_4/2021-03-26/sssssssssss.csv
Output:
Define an Array type variable Array1 and assign it the value ["DST_1","DST_2","DST_3","DST_4"].
In the ForEach1 activity, we can add the dynamic content @variables('Array1') to traverse this array.
Inside the ForEach1 activity, we can use a Copy activity to copy files under the dynamic path via the expression @concat('xx56585/',item(),'/',substring(adddays(utcnow(),-3),0,10),'/').
My current date is 2021-03-29, so I use adddays(utcnow(),-3) to get 2021-03-26 in the steps above.
That's all.
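For reference, the date arithmetic in that expression corresponds to this small Python sketch (illustrative only; the pipeline evaluates the expression itself):

from datetime import datetime, timedelta

# Equivalent of substring(adddays(utcnow(),-3),0,10): today minus 3 days,
# truncated to the yyyy-MM-dd part.
run_date = (datetime.utcnow() - timedelta(days=3)).strftime("%Y-%m-%d")
for dst in ["DST_1", "DST_2", "DST_3", "DST_4"]:  # the Array1 variable
    print(f"xx56585/{dst}/{run_date}/")  # folder path used by the Copy activity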
I think we can add a Filter activity before the Copy activity, in which we can use a string function to find whether the file name contains .ctl or .csv.
E.g.: I have a file mycode.py which contains 2 secrets:
myfakesecret : "ANSAJHSAKDKDMKADKAMCKSMKSMCKSCC"
MyOriginalSecret: "H%&&^DBSHDBHBBBS%^&&&DSD2343"
I want to ignore myfakesecret but not MyOriginalSecret in the truffleHog scan.
If I use --exclude_paths exclude-patterns.txt, where exclude-patterns.txt contains mycode.py, then the truffleHog scan will ignore both secrets.
Can I specify a secret hash or name, or any other way to exclude a particular secret rather than the complete file?
Ideally, your code does not include the sensitive secret at all.
That way, the truffleHog scan has nothing to ignore/exclude.
mycode.py should read that secret from a file/source outside the repository, at runtime (when you are executing the program).
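For example, a minimal sketch that reads the secret from an environment variable at runtime (the variable name MY_ORIGINAL_SECRET is an assumption):

import os

# The real secret never appears in the repository, so truffleHog has
# nothing to flag; it is injected into the environment at runtime.
my_original_secret = os.environ["MY_ORIGINAL_SECRET"]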
I am extracting a zip at a location; in the extracted directory there is an install.jar. I want to copy a file to the directory where install.jar is available. Now, the zip I am extracting may have a different folder structure every time, because of which I cannot use:
file { 'EXTRACTED_PATH/ant.properties':
  ensure => present,
}
So I wrote a custom fact that finds the path of the install jar, and I accessed its value in the manifest like:
$install_jar_location = $::getinstallerpath
Now in the fact's file I have to give the path to search, and I want to pass this path as a parameter.
For that I have declared one variable in the manifest file:
$installer_location = "/home/oracle/Installer"
How can I access it in my custom fact? I tried the following, but the $installer_location variable's value comes up blank.
Facter.add(:getinstallerpath) do
  setcode do
    Facter::Util::Resolution.exec("echo \"$(dirname \"$(find $installer_location -name \*install*.jar)\")\" ")
  end
end