How to sync two folders with Gradle? - groovy

I need to keep a folder called inside synced with a folder called outside. The inside folder needs to be an exact copy of the outside folder - all subdirectories, files, etc.
The Copy task works great, except that it will only overwrite files - it does not delete files that are still in the inside folder if those files are no longer in the outside folder.
Right now I am using the Delete task, which the Copy task depends on. The Delete task fails every other build, with the below error. The inside folder does get deleted, but the new files from the Copy task are not copied over.
Error:(117) A problem occurred evaluating project ':android'.
> Cannot convert the provided notation to a File or URI: true.
The following types/formats are supported:
- A String or CharSequence path, e.g 'src/main/java' or '/usr/include'
- A String or CharSequence URI, e.g 'file:/usr/include'
- A File instance.
- A URI or URL instance.
I am guessing this happens because of some kind of Gradle caching issue - how do I fix this, or how can I design the process better? Thanks!

Looks like there's a big difference between declaring a task's type/dependencies in the task parameters and doing so in the task method body.
Something like this worked great:
task deleteFiles(type: Delete) {
    delete destinationHtmlFolder
}

task copyFiles(dependsOn: tasks.withType(Copy)) {
    println "copying all files"
}
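For this exact use case, Gradle also ships a built-in Sync task type: it copies everything from the source and deletes anything in the destination that no longer exists in the source, so no separate Delete task is needed. A minimal sketch, assuming outside and inside are paths relative to the project directory:

// Sync makes 'inside' an exact mirror of 'outside': files that
// disappear from the source are also removed from the destination.
task syncFolders(type: Sync) {
    from 'outside'
    into 'inside'
}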

Related

Azure Synapse Analytics - deleting pipeline Folder

I am new to Synapse and I have to make a pipeline that will delete files from folders in a hierarchy like the attached image (expected hierarchy). The red half circles mark the files I would like to delete, for example files older than 2 months.
So far I have made a pipeline for a single folder, and using a ForEach loop I can get to the files and delete the corresponding ones. That works, but since I have about 60-70 folders and even more files, I wanted to go a level higher up and make a pipeline that executes for each folder. And here is the problem: when I use a Get Metadata activity on the top folder and a ForEach loop to take the folder names, I cannot access the files inside the folders, only the folders themselves. Could someone help me solve this?
Deleting pipeline for a single folder using a ForEach loop
We can achieve this using nested ForEach activities with the help of the Execute Pipeline activity. As mentioned, Get Metadata with wildcards returns all files without folders, and the Delete activity is unable to recognize wildcard folder paths (Folder/*).
I have created a similar folder structure for a demo. In my pipeline, I first created an array parameter req_files (sample1.csv and sample2.csv) with the names of the required files.
Note: If you want to do this dynamically, you can use an Append Variable activity to build the required file names (file09/22 and file08/22).
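For example, a hedged sketch of such an Append Variable expression that builds one of those names from the current date (the 'file' prefix and the MM/yy format are assumptions based on the names above):

@concat('file', formatDateTime(utcnow(), 'MM/yy'))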
I used one Get Metadata activity to get the folder names (which are inside the root folder). I am iterating through the output of Get Metadata in my ForEach activity (the items value is @activity('root folder contents').output.childItems).
Inside my ForEach, I used another Get Metadata activity to loop through each of the sub folders (to get the file contents).
Now I have the folder name and the list of files inside it. I am going to use Execute Pipeline to implement the nested ForEach. Create 3 parameters in a new pipeline called delete_pipeline (where I perform the delete): current_folder, folder_files and files_needed.
Pass the following dynamic content for each of them from the parent pipeline:
current_folder: @item().name
folder_files: @activity('sub folder contents').output.childItems
files_needed: @pipeline().parameters.req_files
Now in delete_pipeline, I have a ForEach loop over the list of files we are passing (the items value is @pipeline().parameters.folder_files).
Inside this ForEach, I am using an If Condition activity, because I want to delete only the files which are not in my req_files parameter (the array from the parent pipeline which we passed to the files_needed parameter of delete_pipeline). The expression for the If Condition activity is the following:
@contains(pipeline().parameters.files_needed, item().name)
We need to delete the file only when it is not present in req_files (files_needed). So, when the condition is false, we perform the delete.
I have created 2 parameters, file_namepath_of_file_to_delete and file_name_to_delete, in the dataset I am using for the Delete activity, with the following dynamic content:
file_namepath_of_file_to_delete: Folder/@{pipeline().parameters.current_folder}
file_name_to_delete: @item().name
When I run the pipeline, it keeps the required files and deletes the rest. The following are output images for reference.
Debug output: https://i.imgur.com/E6GNVHW.png
My folder after I run the pipeline: https://i.imgur.com/bqN00Dw.png

Node : What is the right way to delete all the files from a directory?

So I was trying to delete all the files inside a folder using Node.
I came across 2 methods.
Method 1
Delete the folder using rmdir. If I then plan on adding images to the same folder, I use mkdir to create the same folder again and append the files to it.
Example: I have an Add Files and a Delete All button. When I click Delete All, the folder gets deleted; when I click Add, the folder gets created and the file gets added to it.
Method 2
Using readdir, I loop through the files, store them in an array, and then delete only the files instead of the folder.
Which is the best way to do it? If it's not among these, please advise me of a better solution.
The rm function of ShellJS will do the trick. It works as a one-liner, it works cross-platform, and it is well tested and documented. It even supports recursive deletes.
Basically, something such as:
const { rm } = require('shelljs');
rm('-rf', '/tmp/*');
(Sample code taken from ShellJS' documentation.)
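If you would rather avoid a dependency, Method 2 maps directly onto Node's built-in fs module. A minimal synchronous sketch (the directory path is a placeholder):

const fs = require('fs');
const path = require('path');

const dir = '/tmp/uploads'; // placeholder directory

// Delete every file in the directory, but keep the
// directory itself (and any subdirectories) in place.
for (const name of fs.readdirSync(dir)) {
  const fullPath = path.join(dir, name);
  if (fs.statSync(fullPath).isFile()) {
    fs.unlinkSync(fullPath);
  }
}

Method 1 is also covered by built-ins on newer Node versions: fs.rmSync(dir, { recursive: true, force: true }) followed by fs.mkdirSync(dir).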

Multiple Excel files using SSIS [duplicate]

I have a source location from which files are to be processed. Multiple files arrive at that location at random times (the package should run every 2 hours). I have to process only the new files; I cannot delete or move the already processed files from that location. I can only copy the files to an Archive location. How can I achieve this?
You can achieve this using the following steps.
1. Use the Foreach File enumerator on your incoming folder and save the file name in an "IncomingFile" variable. Configure it to retrieve "Name and Extension" [in my code I have used that; otherwise you need to modify the script].
2. Create two SSIS variables: "ArchivePath" as String and "IsLoaded" as Boolean [defaulted to False].
3. Create an SSIS Script Task and use "IncomingFile" and "ArchivePath" as the read-only variables. "IsLoaded" should be the read/write variable.
4. Write the following code in the Script Task. If the file already exists in the archive it returns True, otherwise False.
// Requires: using System.IO;
public void Main()
{
    var archivePath = Dts.Variables["ArchivePath"].Value.ToString();
    var incomingFile = Dts.Variables["IncomingFile"].Value.ToString();

    // Full path of the file as it would appear in the archive.
    var fileFullPath = string.Format(@"{0}\{1}", archivePath, incomingFile);

    // True when the file was already copied to the archive, i.e. already processed.
    bool isLoaded = File.Exists(fileFullPath);
    Dts.Variables["IsLoaded"].Value = isLoaded;

    Dts.TaskResult = (int)ScriptResults.Success;
}
Use a precedence constraint to call the Data Flow Task, with the evaluation operation set to "Expression". Put something like the following in the expression box:
@IsLoaded == False
Hope this helps.
Your package should process the files in a given directory, then move them to another directory once processed. That way, each time the package runs, it can safely process everything that is in the source directory.
To process each file in a directory, use the Foreach Loop Container. You can specify a folder to look in and some expressions to filter. If, for instance, your filename contains a timestamp, you could use that timestamp to filter files in or out.
Use a Flat File Source to read the files, then use the File System Task to move them around.
To start, take a look at the answer here: Enumerate files in a folder using SSIS Script Task
The SSIS Script Task should enumerate all the files in a given folder, take a snapshot of the already processed files from a table where you keep a log of what has been processed, ignore the already processed ones, and return only the non-processed files in an object variable for a Foreach Loop to consume.
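A hedged C# sketch of that enumeration step, assuming the already processed names have been loaded from the log table into a HashSet beforehand (the variable names are placeholders):

// Requires: using System.IO; using System.Linq; using System.Collections.Generic;
var folder = Dts.Variables["SourceFolder"].Value.ToString(); // placeholder variable
var processed = new HashSet<string>(); // assume populated from the log table

// Keep only the files that have not been processed yet.
string[] newFiles = Directory.GetFiles(folder)
    .Select(Path.GetFileName)
    .Where(name => !processed.Contains(name))
    .ToArray();

// Hand the list to a Foreach Loop via an object variable.
Dts.Variables["NewFiles"].Value = newFiles;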

Can gradle do substitutions as it copies resources?

For a group of developers, all the differences are stored in a normal property file:
token1=some value
token2=9000
etc.
The 'tokens' are used in a series of XML files that reside in the normal src/main/resources directory. When Gradle copies these files into the build directory (and I don't know for sure what task that is), is there any opportunity to execute custom code? Specifically, I would like to have the token values from the property file substituted into the copy. Thus, the original copy remains untouched, but the version in the runtime has the desired values for the given developer.
Finally, I know this can be done brute force with two or three steps that change the file after it is copied. I really want to know if there is an elegant way to do this in a single step.
As part of the build, Gradle runs the processResources task, which copies the resources into the build directory. While copying, processResources can be configured to do the filtering (or possibly to execute custom code by adding a doLast):
processResources {
    filter org.apache.tools.ant.filters.ReplaceTokens, tokens: [
        ...
    ]
}
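Since the question keeps the per-developer values in a normal property file, here is a minimal sketch of feeding that file into the filter (the dev.properties file name is an assumption; ReplaceTokens expects the markers in the XML files to be written as @token1@):

processResources {
    def props = new Properties()
    file('dev.properties').withInputStream { props.load(it) }
    // Each entry becomes a token: @token1@ in the copied XML
    // is replaced with the value of token1 from the file.
    filter org.apache.tools.ant.filters.ReplaceTokens, tokens: props
}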
These two links can provide more help:
http://java.dzone.com/articles/resource-filtering-gradle
http://mrhaki.blogspot.in/2010/11/gradle-goodness-add-filtering-to.html

How do I delete a directory with cc.net / CruiseControl? [duplicate]

Possible duplicate: Pre-build task - deleting the working copy in CruiseControl.NET
I would like to delete my working directory during the CruiseControl build process... I'm sure this is easy, but I have been unable to find an example of it.
If you know how to create a directory, that would be useful as well.
Thanks.
One of two ways:
1. If you're already using an MSBuild file or something similar, add the action to the MSBuild file.
2. Instead of directly executing some command, create a batch file that executes that command and then deletes the directory, and have CC.NET call that batch file instead.
My guess is that you want to delete the working directory before CruiseControl.NET gets the latest code from source control. If this is the case, then the only way to accomplish this is to write a custom source control provider for CruiseControl.NET that first deletes the working directory and then gets the latest code. Have a look at CruiseControl.NET's source code for examples of how to write a source control provider.
If you want to delete the working directory after the latest code is retrieved from source control, then you can use CruiseControl.NET's executable task by running "cmd /c del directoryname".
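A hedged sketch of that executable task in ccnet.config (the path is a placeholder; note that rd /s /q removes the directory tree itself, whereas del only deletes the files inside it):

<exec>
    <executable>cmd.exe</executable>
    <buildArgs>/c rd /s /q C:\projects\myproject\working</buildArgs>
</exec>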
In the ASP.NET world, for me, the easiest way to do it (which lets me hit either MSBuild or NAnt depending upon the project) was to roll my own exe that takes arguments, which I pass in with a bat file fired by CC.NET. It's not the safest thing in the world, but if you have total control over your automated build machine, it's not too shabby. Quick and reusable.
Drop in the exe somewhere that does the recursive delete:
// Requires: using System.IO;
static void Main(string[] args)
{
    // Each argument is a directory path; recursively delete any that exist.
    for (int n = 0; n < args.Length; n++)
    {
        if (Directory.Exists(args[n]))
        {
            // The 'true' flag deletes subdirectories and files as well.
            Directory.Delete(args[n], true);
        }
    }
}
Drop it in somewhere multiple files can pass arguments to it and just write a custom .bat file for each project. So my task block looks like this:
<tasks>
    <msbuild>
        <executable>C:\WINDOWS\Microsoft.NET\Framework\v3.5\MSBuild.exe</executable>
        <workingDirectory>Z:\WorkingDirectory</workingDirectory>
        <projectFile>YourSolution.sln</projectFile>
        <logger>C:\Program Files\CruiseControl.NET\server\ThoughtWorks.CruiseControl.MsBuild.dll</logger>
    </msbuild>
    <exec>
        <executable>Z:\SomePathToBuildScripts\YourCustomBat.bat</executable>
    </exec>
</tasks>
Then the final step is setting up that .bat file to perform the delete/rebuild functions after use. In the bat file, just make sure you rebuild ("MD") the directories you deleted if you expect to publish a site back to them. On our dev boxes I found this to be the best way to prevent the beloved Frankenbuild.
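A hedged sketch of such a .bat file (the RecursiveDelete.exe name and the paths are placeholders):

REM Delete the publish target, then recreate it empty for the next build.
Z:\SomePathToBuildScripts\RecursiveDelete.exe Z:\WorkingDirectory\PublishedSite
md Z:\WorkingDirectory\PublishedSite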
The way I've done this in the past is to not have CC.Net check out the source itself. Instead, there are two <msbuild> elements for the project: the first one calls a build target that runs svn-clean.pl (compiled to .exe) and then updates the source using svn.exe; the second <msbuild> element starts the main build process.
You can easily replace svn-clean with a delete command. For my projects, deleting chaff from a checkout has always been faster than checking out a fresh working copy.
The two msbuild elements are necessary because the main project build file is often updated. This is important because updates to your build file(s) will only be reloaded if you start a new msbuild process.
This setup breaks down when I (very rarely) move or change the dependencies of that clean-and-update build target to the extent that the msbuild process would need to reload for valid instructions to run the clean-and-update target. When this happens, I stop CC.Net before committing, go into the CC.Net server, and do an 'svn update' by hand.
Sidelight: It could well be that CC.Net has a natural clean-before-build operation by now. I've since moved to TeamCity, which is configurable to do this every build or only when the developer chooses (e.g., when you know you've made a change that would not update cleanly--svn moves of directories with build products comes to mind).
