How do I delete about 5,000 cron job log files in cPanel?
The files are numerous, and I need to remove the ones whose names match the pattern shown in the image linked here.
https://i.imgur.com/YKsNfhk.png
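Assuming you have shell access (SSH or cPanel's Terminal) and the files match a single glob pattern, a `find`-based sketch works; the exact filenames are only visible in the linked screenshot, so `cron-*.log` below is a placeholder you must replace. With ~5,000 files, a plain `rm` glob can hit the shell's "argument list too long" limit, which `find -delete` avoids:

```shell
# Minimal sketch -- 'cron-*.log' is a placeholder pattern; the real names
# are in the linked screenshot. LOGDIR stands in for the real log directory.
LOGDIR="$(mktemp -d)"
touch "$LOGDIR/cron-1.log" "$LOGDIR/cron-2.log" "$LOGDIR/keep.txt"   # demo files

# Preview the matches first (same command without -delete), then delete.
# find handles any number of matches, unlike a single rm invocation:
find "$LOGDIR" -maxdepth 1 -type f -name 'cron-*.log' -delete
```

Running the `find` once without `-delete` first lets you confirm the pattern matches only what you intend to remove.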
I have an Azure File Copy task as part of my build. A directory needs to be copied recursively to a blob container: the "cdn" directory from sources should end up in the cdn blob container.
So, as the "Source" for the task, I specified "$/Website/AzureWebsite/www.die.de/cdn-content/cdn/*".
As the "Container Name", I specified "cdn".
The task works: my files do get copied. However, after the copying ends, I also see a directory named "$tf" that contains various subdirectories with numbers as names (0, 1, 2, etc.). All of those contain files named "*.gz" or ".rw".
Where is this coming from, and how do I get rid of it?
I found this thread: https://developercommunity.visualstudio.com/content/problem/391618/tf-file-is-still-created-in-a-release-delete-all-s.html. The $tf folder is generated when mapping sources for a TFVC repository; it's by design. When you queue a build, a temporary workspace is created and the sources are mapped first.
If you want to get rid of these folders, set the workspace type to a server workspace, but you then lose the advantages of local workspaces. See "TFS creates a $tf folder with gigabytes of .gz files. Can I safely delete it?" for guidance.
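If switching to a server workspace isn't an option, one workaround (my assumption, not something the linked thread prescribes) is to delete the folder in a cleanup step after the copy finishes. Note that $tf is a literal directory name, so it must be quoted to stop the shell expanding it as a variable:

```shell
# Sketch of a post-copy cleanup step. WORKDIR stands in for the build's
# sources directory; the mkdir/touch lines just simulate what TFVC leaves behind.
WORKDIR="$(mktemp -d)"
mkdir -p "$WORKDIR/"'$tf'/0 "$WORKDIR/"'$tf'/1
touch "$WORKDIR/"'$tf'/0/a.gz

# '$tf' is single-quoted so the shell treats it as a literal name,
# not an (empty) variable expansion:
rm -rf "$WORKDIR/"'$tf'
```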
My actual requirement is to upload a file to GitLab artifacts containing data generated at run time by a specific job (whether the job succeeds or fails). If the job fails, I'll retry that particular job, and I want to access the data in the file uploaded to artifacts during the previous run. Can you please suggest a way to do this?
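One possible starting point, sketched as a hypothetical .gitlab-ci.yml fragment (the job name, script, and file name are all placeholders): artifacts declared with `when: always` are uploaded whether the job passes or fails, so the file from a failed run is preserved and can be downloaded from the job's page before you retry.

```yaml
# Hypothetical fragment -- names are placeholders, not from the question.
generate_data:
  stage: build
  script:
    - ./generate.sh > runtime-data.txt   # assumed script producing the run-time data
  artifacts:
    when: always            # upload even if the job fails
    paths:
      - runtime-data.txt
    expire_in: 1 week
```

Note that a retried job does not automatically receive its own previous run's artifacts; you would fetch them manually from the UI or via the API.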
I have a Jenkins job in which, amongst other things, I want to dynamically create a .foo and a .bar file in my Jenkins workspace home directory while the job is running.
How can I do that with bash as part of the job?
The job will run after a Docker container has been created with the Docker Plugin.
So the order of operations I want is:
Job starts:
the Docker container gets created
create the .foo file (with foobarbaz as the file's content)
create the .bar file (with foobarbaz as the file's content)
How can I achieve this?
One simple solution would be to associate your Jenkins job with a source-controlled repository URL that already includes the files with the right content.
Each job execution would then create a workspace populated with that repo's content and launch your Docker container.
If not, you would need to add a script step, like this one in Groovy.
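Since the question asks for bash, the script step could also be an "Execute shell" build step along these lines. Jenkins sets $WORKSPACE for the job; the mktemp fallback below is only there so the snippet also runs outside Jenkins:

```shell
# Sketch of an "Execute shell" build-step body. Inside a real job,
# $WORKSPACE is already set by Jenkins; the fallback is for testing only.
WORKSPACE="${WORKSPACE:-$(mktemp -d)}"
mkdir -p "$WORKSPACE"

# Create the two files with the content given in the question:
printf 'foobarbaz\n' > "$WORKSPACE/.foo"
printf 'foobarbaz\n' > "$WORKSPACE/.bar"
```

Placed as a build step after the Docker Plugin has provisioned the container, this runs in exactly the order listed above.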
I am using Jenkins to automate some tasks on a remote server. During these tasks, a script creates a lot of log files. How can I make these log files available in Jenkins for others? Jenkins won't be creating these files; a script running on my server will. The job will take ~15 days to complete, and I would like users to be able to take a look at the log files in Jenkins at any time.
Jenkins has a mechanism known as "User Content", where administrators can place files inside $JENKINS_HOME/userContent, and these files are served from http://yourhost/jenkins/userContent. This can be thought of as a mini HTTP server to serve images, stylesheets, and other static resources that you can use from various description fields inside Jenkins.
So just put your log files under the userContent directory, and others will be able to see them.
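A sketch of that approach: a periodic step (cron, or a small Jenkins job) that pulls the remote logs into userContent. The host, paths, and the rsync transfer are assumptions for illustration; the transfer is simulated with a local write so the snippet is self-contained:

```shell
# Sketch: copy remote task logs into Jenkins's userContent directory so they
# are served over HTTP. The mktemp fallback is only for running outside Jenkins.
JENKINS_HOME="${JENKINS_HOME:-$(mktemp -d)}"
DEST="$JENKINS_HOME/userContent/task-logs"    # "task-logs" is a made-up subdir
mkdir -p "$DEST"

# On the real setup this would be something like:
#   rsync -az user@remote-server:/var/log/mytask/ "$DEST/"
# Simulated here with a local file:
printf 'log line\n' > "$DEST/run-001.log"
```

The files then become reachable at http://yourhost/jenkins/userContent/task-logs/ while the 15-day job is still running.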
We have a website with all the media (CSS/images) stored in a media folder. The media folder and its 95 subdirectories contain about 400 files in total. We have a CruiseControl project that monitors just the media directory for changes and, when triggered, copies those files to our integration server.
Unfortunately, our integration server is at a remote location, so even when copying 2-3 files the NAnt task takes 4+ minutes. I believe the combination of the sheer number of directories/files and our network latency is causing the NAnt task to run slowly, since it appears to compare the modified dates of the local and remote copies of every file.
I really want to speed this up. My initial thought was: instead of trying to copy the whole media folder, can I get the list of file modifications from CruiseControl and copy just those files, saving the NAnt task the work of comparing them all for changes?
Is there a way to do what I am asking, or is there a better way to achieve the same performance gains?
This sounds like a job for RoboCopy. Use NAnt to bootstrap the execution and let RC do the file synchronization.
Update: digging deeper into the CCNet documentation, you'll find the <modificationWriter/> task. Adding this task to your CCNet project will write out an XML file containing information about all the modifications. You should be able to read the contents of that file in your NAnt script. A suggestion here is to use the NAnt <style/> task to convert the modification XML into a NAnt script containing copy and delete tasks.
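A rough sketch of what the RoboCopy bootstrap could look like as a NAnt target (the paths, share name, and property name are placeholders). RoboCopy treats exit codes below 8 as success, so failonerror is disabled and the exit code is checked explicitly:

```xml
<!-- Hypothetical NAnt target; adjust paths and retry settings to taste. -->
<target name="sync-media">
  <!-- /MIR mirrors the tree so only changed files cross the slow WAN link;
       /R:2 /W:5 limit retries and wait time on transient network errors. -->
  <exec program="robocopy.exe" failonerror="false" resultproperty="rc.exit">
    <arg value="C:\site\media" />
    <arg value="\\integration-server\share\media" />
    <arg line="/MIR /R:2 /W:5" />
  </exec>
  <!-- RoboCopy exit codes 0-7 indicate success; 8+ indicate failures. -->
  <fail if="${int::parse(rc.exit) >= 8}" message="robocopy failed" />
</target>
```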