Cron Job with dompdf

I'm running a daily cron job that checks whether an invoice has to be created (using dompdf).
The PDF should be created and saved in a folder.
So far cron_script.php executes fine: all database entries are written and the emails are sent.
The problem is the dompdf library:
PHP Warning: realpath(): SAFE MODE Restriction in effect. The script whose uid is 10000 is not allowed to access /var/www/vhosts/domain.com owned by uid 0 in /var/www/vhosts/domain.com/dompdf/include/functions.inc.php on line 135
PHP Warning: file_put_contents(): SAFE MODE Restriction in effect. The script whose uid is 10000 is not allowed to access /var/www/vhosts/domain.com/files owned by uid 33 in /var/www/vhosts/domain.com/cron_script.php on line 162
PHP Warning: file_put_contents(/var/www/vhosts/domain.com/files/invoice_2145.pdf): failed to open stream: No such file or directory in /var/www/vhosts/domain.com/cron_script.php on line 162
It must have something to do with permissions. But safe_mode is completely off (both the master and slave values).
Thanks for help
Toni
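The warnings actually point at two separate things: the third one says the files folder does not exist at all, while the first two say the uid running the cron job does not match the directory owners (uid 0 and uid 33). A minimal pre-flight sketch, assuming the paths from the warnings:

```shell
#!/bin/bash
# Minimal pre-flight check for the cron job. OUT_DIR is the folder the
# PDFs should be saved into, taken from the warnings; adjust as needed.
OUT_DIR="${OUT_DIR:-/var/www/vhosts/domain.com/files}"

if [ ! -d "$OUT_DIR" ]; then
    # Matches the third warning: file_put_contents fails because the
    # target folder itself is missing.
    echo "missing directory: $OUT_DIR"
elif [ ! -w "$OUT_DIR" ]; then
    # Matches the first two warnings: the cron user's uid does not match
    # the directory owner, so writes are refused.
    echo "not writable by uid $(id -u): $OUT_DIR"
else
    echo "ok: $OUT_DIR"
fi
```

If the directory is missing, create it and `chown` it to the uid the cron job runs as. Also note that safe mode was removed in PHP 5.4, and the CLI binary that cron invokes may read a different php.ini than the web server, which would explain safe-mode warnings despite the setting being off in the panel.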

Related

Snakemake cannot write metadata

I am having trouble getting snakemake-minimal=7.8.5 to run on Windows 10. I can execute rules, but snakemake terminates with an error regarding the metadata:
Failed to set marker file for job started ([Errno 2] No such file or directory: 'C:\\test\\project\\.snakemake\\incomplete\\cnVucy9leHBlcmltZW50XzAzL2RmX2ludGVuc2l0aWVzX3Byb3RlaW5Hcm91cHNfbG9uZ18yMDE3XzIwMThfMjAxOV8yMDIwX04wNTAxNV9NMDQ1NDcvUV9FeGFjdGl2ZV9IRl9YX09yYml0cmFwX0V4YWN0aXZlX1Nlcmllc19zbG90XyM2MDcwLzE0X2V4cGVyaW1lbnRfMDNfZGF0YS5pcHluYg=='). Snakemake will work, but cannot ensure that output files are complete in case of a kill signal or power loss. Please ensure write permissions for the directory C:\test\project\.snakemake
I tried the following to troubleshoot:
Changing the folder: Documents, my user folder, and, as above, the root folder of my C: drive
Manipulating the security settings: Controlled Folder Access (ransomware protection) is deactivated
If I erase the .snakemake folder it is re-created upon execution, so I assume I have write access. However, some security setting seems to disallow the long filename containing the hash.
I tried the same workflow on a different Windows 10 machine and there I don't get the error, so I assume it is some Windows issue.
Did anyone encounter the same error and found a solution?
I agree it is due to the length of the filename. The default maximum path length on Windows is 260 characters, and the path you pasted has a length of 262. You can edit the registry (the LongPathsEnabled value under HKLM\SYSTEM\CurrentControlSet\Control\FileSystem) to allow longer paths. Also consider opening an issue with snakemake to improve the documentation or otherwise address this for Windows machines.
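The 262 figure is easy to verify by rebuilding the marker path from the error message, a quick shell sketch (the base64 component is copied verbatim from the error):

```shell
#!/bin/bash
# Recompute the length of the failing marker path from the error message.
# Windows' default MAX_PATH limit is 260 characters.
BASE='C:\test\project\.snakemake\incomplete'
MARKER='cnVucy9leHBlcmltZW50XzAzL2RmX2ludGVuc2l0aWVzX3Byb3RlaW5Hcm91cHNfbG9uZ18yMDE3XzIwMThfMjAxOV8yMDIwX04wNTAxNV9NMDQ1NDcvUV9FeGFjdGl2ZV9IRl9YX09yYml0cmFwX0V4YWN0aXZlX1Nlcmllc19zbG90XyM2MDcwLzE0X2V4cGVyaW1lbnRfMDNfZGF0YS5pcHluYg=='

# Join base path, separator, and marker name, then count characters.
FULL="$BASE"'\'"$MARKER"
echo "${#FULL}"   # 262, two over the 260-character limit
```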

Generate hash of newly downloaded file

I'd like my bash script to perform an action every time a new file is downloaded to /Downloads (generate a hash of the downloaded file and send it to an API). So far I've been trying to make use of inotify-tools, but it only works for newly created files and that won't do.
The script should work like this:
I download a file via the browser (the normal way)
The script notices the new file and is executed automatically
Thanks in advance for the help :D
You can use /etc/crontab to check the ~/Downloads folder at startup and every n minutes. The script that runs every nth minute can do either of the following:
Keep a count of the files. If the count decreases, the script updates its cache; if it increases, it takes the most recently created (or modified) file and sends that file's hash to the API via curl.
Keep the names of the files. If a file no longer exists, the script updates its cache of file names; if a new file appears, it hashes it and sends the hash to the API via curl.
You can keep the cache of files under /tmp.
If you can provide an example scenario I can write a simple script
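Along the lines of the second option (keeping file names), a minimal sketch; WATCH_DIR, the cache location, and the API endpoint are placeholders:

```shell
#!/bin/bash
# Cron-driven sketch of the "keep the names of files" approach.
# WATCH_DIR, CACHE and the API call are placeholders; adjust to taste.
WATCH_DIR="${WATCH_DIR:-$HOME/Downloads}"
CACHE="${CACHE:-/tmp/downloads.cache}"

mkdir -p "$WATCH_DIR"
touch "$CACHE"

# Current file names, sorted so comm(1) can diff them against the cache.
ls -1 "$WATCH_DIR" | sort > /tmp/downloads.now

# Names present in the current listing but not in the cache are new files.
comm -13 "$CACHE" /tmp/downloads.now | while IFS= read -r f; do
    hash=$(sha256sum "$WATCH_DIR/$f" | cut -d' ' -f1)
    echo "new file: $f sha256: $hash"
    # curl -s -X POST -d "hash=$hash" "$API_URL"   # hypothetical endpoint
done

# Replace the cache; this also forgets files that were deleted.
mv /tmp/downloads.now "$CACHE"
```

One caveat: browsers usually download to a temporary name (e.g. .part or .crdownload) and rename the file on completion, so you may want to filter those suffixes out before hashing.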

Apache commons net FTPClient storeFile() creating file with the username prefixed

Happy New Year! :)
So, I'm using the Apache Commons Net FTPClient for the first time to FTP a file to a mainframe FTP server. The code below sets up the required configuration and eventually transfers the file using storeFile(), which works except for one small problem described below.
ftpClient.login(username, password);
System.out.println(ftpClient.getReplyString());
File blankFile = new File("./src/main/resources/BLANK.DAT");
System.out.println("File exists: "+ blankFile.exists());
InputStream inputStream = new FileInputStream(blankFile);
ftpClient.sendSiteCommand("sbdataconn=****.****.*****"); // value hidden from post
System.out.println(ftpClient.getReplyString());
ftpClient.sendCommand(FTPCmd.PWD);
System.out.println(ftpClient.getReplyString());
ftpClient.sendSiteCommand("lrecl=80 blksize=3120 recfm=FB");
System.out.println(ftpClient.getReplyString());
ftpClient.sendSiteCommand("pri=5 sec=15");
System.out.println(ftpClient.getReplyString());
ftpClient.storeFile("Destination.File.Name",inputStream);
System.out.println(ftpClient.getReplyString());
The corresponding console output logs say:
Connected to ****ftp.****.com.
220-FTPD1 IBM FTP CS V2R3 at *****, 06:46:21 on 2021-01-05.
220 Connection will close if idle for more than 15 minutes.
230 USERNAME is logged on. Working directory is "USERNAME.".
File exists: true
200 SITE command was accepted
257 "'USERNAME.'" is working directory.
200 SITE command was accepted
200 SITE command was accepted
250 Transfer completed successfully.
The mainframe team confirms that they can see the file, but it is named 'USERNAME.Destination.File.Name'; we need it to be named 'Destination.File.Name' to kick off the next step in processing. Am I missing something in my configuration? Or is this expected behaviour when FTPing to a mainframe using Apache Commons? How do I go further from here? Any help is appreciated. Thanks!
Well, it turns out the filename passed to FTPClient's storeFile() method has to be enclosed in single quotes ('') for the file to be created with exactly that name. On z/OS, an unquoted dataset name is treated as relative to the working directory and is prefixed with its high-level qualifier ('USERNAME.' here), while a quoted name is taken as fully qualified. So the fix was to simply replace
ftpClient.storeFile("Destination.File.Name",inputStream);
with
ftpClient.storeFile("'Destination.File.Name'",inputStream);
and it worked as expected.
Thanks and have a wonderful 2021.

Gnome online accounts

I successfully installed gnome-online-accounts on my PC running Debian 9. Everything works fine from an X terminal, logged in as the default user. The command:
gio list google-drive://XXXXXXXXXXX#gmail.com/
gives the expected results.
But it doesn't work if the same command is run via crontab, even as the same default user. Here is the message:
gio: google-drive://XXXXXXXXXXX#gmail.com/: Operation not supported
If the problem were caused by an unmounted file system, due to a loss of connectivity, the message would be:
gio: google-drive://XXXXXXXXXXX#gmail.com/: The specified location is not mounted
It seems as if the command were run by another user.
Does anyone have an idea where the trick is?
As hinted at the end of this page, the bash script executed by crontab should add the following before the gio call:
declare -x DBUS_SESSION_BUS_ADDRESS=unix:path=/run/user/XXXX/bus
Replace the XXXX value with the UID of the user whose GOA connection is enabled. This value is often 1000.
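Putting the answer together, the script called from crontab could look like this sketch; the account address is elided as in the post, and `$(id -u)` fills in the UID so it does not have to be hard-coded:

```shell
#!/bin/bash
# Wrapper script to call from crontab. gio talks to the GVfs / GNOME
# Online Accounts daemons over the session D-Bus, which cron does not
# provide, so export the address of the logged-in user's bus first.
export DBUS_SESSION_BUS_ADDRESS="unix:path=/run/user/$(id -u)/bus"

# Same command that works from the X terminal; guarded so the script
# does not abort on machines without gio installed.
if command -v gio >/dev/null 2>&1; then
    gio list google-drive://XXXXXXXXXXX#gmail.com/
fi
```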

Torque : pbs_Server No such file or directory (2) in recov_attr, read2

I am trying to start the pbs_server daemon and I get this message:
05/30/2016 10:26:57;0001;PBS_Server;Svr;PBS_Server;LOG_ERROR::No such file or directory (2) in recov_attr, read2
05/30/2016 10:26:57;0001;PBS_Server;Svr;PBS_Server;LOG_ERROR::que_recov, recov_attr[common] failed
The pbs daemon appears to be running:
root 4670 1 0 10:27 ? 00:00:40 /usr/local/sbin/pbs_server -d /var/torque
but jobs are stuck.
Do you have any idea what the problem is?
Thank you very much for the help
Vince
It looks like you might have a problem with your queues, possibly file corruption. I would check the files in server_priv/queues. If your version is new enough, the files will be in XML format, in which case you can revise them manually. If you have doubts about any of them, you can simply move them out of the way; however, if jobs are running in a queue that the server cannot find when it loads, you may lose those jobs.
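A diagnostic sketch of the checks suggested above; `TORQUE_HOME` follows the `-d /var/torque` flag from the question, and `xmllint` stands in for any XML well-formedness check:

```shell
#!/bin/bash
# Look for corrupt queue files, as suggested above. TORQUE_HOME follows
# the "-d /var/torque" server home from the question; adjust as needed.
TORQUE_HOME="${TORQUE_HOME:-/var/torque}"
QUEUES="$TORQUE_HOME/server_priv/queues"
BACKUP="${BACKUP:-/tmp/queues.bak}"

mkdir -p "$BACKUP"

for q in "$QUEUES"/*; do
    [ -f "$q" ] || continue
    # Newer Torque versions store queue attributes as XML; a file that
    # is not well-formed XML is a corruption suspect.
    if ! xmllint --noout "$q" 2>/dev/null; then
        echo "suspect queue file: $q"
        # Move it aside rather than deleting it; jobs in a queue the
        # server cannot recover at startup may be lost.
        # mv "$q" "$BACKUP/"
    fi
done
```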
