Launch a Python script based on a dynamic config file - python-3.x

Each morning, a config file is fed with a list of hours. The hours in the JSON file are different each day.
I would like a Python script to be executed at each hour listed in the configuration file. What is the best way to do this?
Do I have to use a while loop? What is the best practice?

You can set up a cron job that reads the config file every morning and executes the Python script at the listed hours.

I would recommend this video.
Basically, you can schedule a Python script daily so that it executes automatically.
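To make that concrete, the cron job could start a small runner each morning that reads the day's hours and sleeps until each one. This is a minimal sketch, not the only way to do it: the config file name hours.json, its shape {"hours": [9, 13, 17]}, and the task script path are all hypothetical placeholders.

```python
import json
import subprocess
import time
from datetime import datetime

def seconds_until(hour):
    """Seconds from now until today's occurrence of `hour` (0-23); negative if already past."""
    now = datetime.now()
    target = now.replace(hour=hour, minute=0, second=0, microsecond=0)
    return (target - now).total_seconds()

def run_hours(config_path="hours.json", script="/path/to/task.py"):
    """Read the day's hours from the config file and run `script` at each one."""
    with open(config_path) as f:
        hours = sorted(json.load(f)["hours"])   # e.g. [9, 13, 17]
    for hour in hours:
        wait = seconds_until(hour)
        if wait > 0:
            time.sleep(wait)                    # block until the next configured hour
        subprocess.run(["python", script])      # hypothetical script to launch

# Started once a day from cron, e.g.:
#   0 6 * * * python /path/to/runner.py
```

Starting it from cron each morning means there is no permanent while loop to babysit: the process lives only for that day's schedule.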

Related

Do I need multiple run configurations - one per Python file - in PyCharm, even though the only difference between them is the script?

I created a Python project in PyCharm which contains multiple Python files. As of just now, I need to create a run configuration for each Python file in my project, even though they're all exactly the same - with the exception of the script.
This seems unnecessary and laborious and I would love to just use one run configuration for multiple Python files.
That said, I'm a novice Python programmer just getting started and so still unfamiliar with large parts of the language.
My Project Files:
My Run Configuration - Used for all Python files:
Some Research Carried Out
I've searched for a solution and explanation to this, but have been unable to find anything. Some of the places I've tried:
JetBrainsTV on youtube (https://www.youtube.com/watch?v=JLfd9LOdu_U)
JetBrains Website (https://www.jetbrains.com/help/pycharm/run-debug-configuration-python.html)
Stack Overflow
I hope there is sufficient detail here, if not I'd be happy to elaborate.
If those files are independent and you have nothing specific to them, then I see two simple ways of running them:
You don't have to manually create a run configuration for every file. You can just right-click the file in the project tree and click "Run ".
You can use the Terminal and run the files with the Python interpreter as needed.
I was facing a similar situation when I started competitive programming. In my case I wanted to redirect my Test Cases from an input.txt file rather than manually typing the test cases for every run of my code. Using the above solution was not feasible, as I would need to manually change the Script Path and Redirect Input path in the Run Configuration window for every script I was running.
So what I wanted was, one run configuration, that would run all the scripts with Redirect Input path being set to input.txt.
To do that,
I created a main.py file with the following content:
import sys

if __name__ == '__main__':
    fname = sys.argv[1]          # path of the script to run, passed as an argument
    exec(open(fname).read())     # execute that script in this process
This main.py file is going to run my other python scripts.
Created this run configuration for the main.py file.
Now, every time I needed to run any code, I ran this configuration with that code's window open. It actually executed main.py with the current file name passed as its argument, and main.py would then run the script with input redirected from input.txt.
Hope this helps you or anyone trying to run multiple python scripts with a single run configuration in PyCharm.

Linux bash to compare two files, but the second file must be found first

I have a batch job that integrates an XML file from time to time, though it can happen daily. After integrating, it puts the file in a folder like /archives/YYMMDD (the current day). The problem is that the same file could be integrated twice. So I need a script that verifies the file (it's possible with the diff command, but risky because it could become a bottleneck). The part I can't resolve is how to work out the second file's location.
P.S. I can't install on the server anything.
Thanks in advance.
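One cheap approach, sketched below: build the archived file's path from today's date and the incoming file's name (both assumptions about your layout), then compare with cmp -s, which stops at the first differing byte and is part of coreutils, so nothing needs to be installed. All paths here are placeholders.

```shell
#!/bin/sh
# Sketch: has this file already been integrated today?
is_duplicate() {
    incoming="$1"
    archive_root="${2:-/archives}"   # default matches the layout in the question
    archived="$archive_root/$(date +%y%m%d)/$(basename "$incoming")"
    # Duplicate only if today's archived copy exists AND is byte-identical
    [ -f "$archived" ] && cmp -s "$incoming" "$archived"
}

# Hypothetical usage with a placeholder incoming path:
if is_duplicate "/incoming/data.xml"; then
    echo "already integrated today - skipping"
else
    echo "new file - safe to integrate"
fi
```

Since cmp exits as soon as the files diverge, the full-file cost is only paid for true duplicates, which limits the bottleneck risk you mention.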

bash & inotify - monitoring and moving file

I'm not an advanced user of Linux, but I'm looking for a simple bash script, run from cron or in some other way every 5-10 minutes, that watches for new files; once a new directory or files have been uploaded to the directory, the script should move them to another location.
I found inotify, which could be a great solution for this, but the issue is how to use it.
I've been using inotifywait to recognize that some filesystem change occurred in a certain path.
Take a look at:
http://linux.die.net/man/1/inotifywait
You can specify on what changes you are interested (delete, create, modify, etc.) and whether the script should output it or simply exit after change.
I've been using this tool in such a way that my script starts inotifywait and, when it exits, performs some action and restarts inotifywait again.
Hope this helps.
Martin
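As one concrete illustration of that start/act/restart pattern, the sketch below blocks on inotifywait and moves each newly arrived entry. The watch and destination paths are placeholders, the inotify-tools package must already be installed, and it is written as a function because it runs until killed:

```shell
#!/bin/sh
# Sketch: block on inotifywait, move whatever arrives, repeat.
watch_and_move() {
    watch_dir="$1"
    dest_dir="$2"
    while true; do
        # -q: quiet, -e: events of interest, --format '%f': print only the file name
        name=$(inotifywait -q -e create -e moved_to --format '%f' "$watch_dir") || break
        mv "$watch_dir/$name" "$dest_dir/"
    done
}

# Example start (runs until killed):
#   watch_and_move /path/to/uploads /path/to/processed
```

Unlike a 5-10 minute cron poll, this reacts the moment the upload lands; the trade-off is that you need something (cron's @reboot, a systemd unit, etc.) to keep the watcher itself running.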

Bash script to tar.gz file by day

I have a few catalogues (directories) in my Linux directory; their names are logs-YYMMDD:
/logs-140617
/logs-140616
/logs-140615
Can somebody tell me how to write a bash script that repeats every day and packs the files from one day back (today the script would create an archive of logs-140616 and delete the unpacked files)?
Thanks for the answers.
Use logrotate for that purpose; that's what it is for.
For periodically executing stuff, a cron job is the way to do that.
logrotate already gets executed by cron on a daily basis. Therefore, when using logrotate, there is no need to fiddle with the daily-execution aspect of your problem.
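If logrotate genuinely isn't an option, a direct sketch of the one-day-back packing could look like the following. It assumes GNU date (standard on Linux) and that logs-YYMMDD are directories; the base path is a placeholder.

```shell
#!/bin/sh
# Sketch: archive yesterday's logs-YYMMDD directory, then delete the original.
pack_yesterday() {
    base="$1"
    day=$(date -d "yesterday" +%y%m%d)     # GNU date; BSD would need: date -v-1d +%y%m%d
    if [ -d "$base/logs-$day" ]; then
        tar -czf "$base/logs-$day.tar.gz" -C "$base" "logs-$day" \
            && rm -rf "$base/logs-$day"    # remove the unpacked copy only if tar succeeded
    fi
}

# Example call with a placeholder path:
#   pack_yesterday /path/to/catalogues
# Run daily from cron, e.g.:  30 0 * * * /path/to/pack-logs.sh
```

Chaining the rm with && after tar means a failed archive never causes data loss, which is the main thing to get right in a script like this.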

How to auto-purge perl script logs from inside a cron entry?

I have a crontab setup to run a perl script every hour, at 5 minutes past the hour (so 2:05, 3:05, 10:05, etc.):
5 * * * * perl /abs/path/to/my/script.pl >> /abs/path/two/my/script-log.txt 2>&1
As you can see, it's also redirecting both STDOUT and STDERR to the same log file.
I've been asked to refactor either the Perl script, the crontab entry, or to add any new code/scripts necessary so that every night, at midnight, the script-log.txt gets cleared/flushed/emptied.
That is, every night at midnight, if script-log.txt has 20MB of text in it, clean it out so that it now has nothing (0 bytes) in it; then at 12:05 AM the crontab would kick back in, run script.pl, and start adding more text to the same script-log.txt log file.
It would be enormously easier if there was a way to modify the crontab entry with some Linux/Perl magic to set up such a "daily rolling log file". In a worst-case scenario, we can always write a new script to purge script-log.txt and cron it to run at midnight. But my higher-ups would greatly prefer to not have yet-another cron job, and are looking for a way to do this from the entry itself (if possible).
In reality, we have dozens of entries that work like this, and so the problem with writing a purging script and cronning it to run at midnight is that we'll constantly be updating the purging script as we add/delete other scripts that generate these kinds of log files. Thus, if we can set such purging up at the entry level, each cron entry cleans itself. Thanks for any insight/pointers/suggestions in advance.
You might want to look into logrotate.
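A logrotate rule for this could be a small drop-in like the sketch below. The file name and option choices are assumptions; the key directive is copytruncate, which truncates the log in place so the cron job's open >> redirect keeps writing to the same file instead of a rotated-away one.

```
# /etc/logrotate.d/script-log  (hypothetical drop-in file)
/abs/path/two/my/script-log.txt {
    # once a day, keep no old copies, tolerate a missing file,
    # and truncate in place so the running >> redirect stays valid
    daily
    rotate 0
    missingok
    copytruncate
}
```

Alternatively, if everything really must live in the crontab itself, a second entry such as `0 0 * * * : > /abs/path/two/my/script-log.txt` truncates the file at midnight with no extra script: it is one line per log file, not a separate cron job to maintain.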
