How to run PyPDF2 PdfFileMerger within a shared folder on a network drive (file permissions)

I am trying to merge multiple PDF files with PyPDF2 inside a folder on my office's shared drive. However, my program never finishes running, which I believe is because it does not have permission to access the folder. Is there a way to allow access to it?
from PyPDF2 import PdfFileMerger
import os

path = r"H:\Accounting\ME\Attachments\Pdf"
pdf_files = ["file1.pdf", "file2.pdf"]

merger = PdfFileMerger()
for name in pdf_files:
    merger.append(os.path.join(path, name))

out_path = os.path.join(path, "newMerged.pdf")
if not os.path.exists(out_path):
    merger.write(out_path)
merger.close()
print("done")
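Since the question is about permissions, one quick diagnostic is to check what the process itself can see on the share. A minimal sketch (note that os.access is only an approximation on Windows network shares, where it mainly reflects the read-only attribute):

```python
import os

def check_share_access(path):
    """Report whether this process can see, read, and write `path`."""
    return {
        "exists": os.path.exists(path),
        "readable": os.access(path, os.R_OK),
        "writable": os.access(path, os.W_OK),
    }

# e.g. check_share_access(r"H:\Accounting\ME\Attachments\Pdf")
```

If "writable" comes back False even though your own account can write in Explorer, the script is probably running as a different user (for instance, under a scheduled task) that lacks rights on the share.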

Related

Accessing the sub-folder of a folder in Python

What I am trying to do is access the images saved under my path, E:/project/plane, but I am unable to get access.
I tried using glob.glob, but all I get is the sub-folder itself, not the images inside it.
I also tried taking the name of the sub-folder as input and combining it with the path, but I still can't access the folder.
Can anyone help me with how to achieve this task?
Here is my Python code:
import os
import glob
import cv2

path = os.path.join("E:\\project", input("filename"))
print(path)

# '*' matches only the immediate contents of path, not files in deeper folders
for folder in glob.glob(os.path.join(path, '*')):
    print(folder)
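To reach the images inside the sub-folder rather than the sub-folder itself, the pattern needs to descend a level. A sketch using '**' with recursive=True (the extensions are assumptions; adjust to your files):

```python
import glob
import os

def list_images(root, patterns=("*.jpg", "*.png")):
    """Collect image paths under `root`, including nested sub-folders."""
    found = []
    for pattern in patterns:
        # '**' only recurses into sub-directories when recursive=True is passed
        found.extend(glob.glob(os.path.join(root, "**", pattern), recursive=True))
    return sorted(found)
```

Each returned path can then be loaded with cv2.imread.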

Different behavior with send_file and tempfile on local machine vs production server

I have a small Flask application that utilizes Python's TemporaryDirectory to create a zip file and return it to the client. The code snippet looks like this:
import tempfile, os, shutil
from flask import send_file

def process():
    with tempfile.TemporaryDirectory(dir=os.getcwd()) as tmpdir:
        with tempfile.TemporaryDirectory(dir=tmpdir) as wrapper_dir:
            # some processing
            zipfile = shutil.make_archive("azipfile", "zip", wrapper_dir)
            return send_file(zipfile, mimetype="zip",
                             attachment_filename="azipfile.zip",
                             as_attachment=True)
It nests two temporary directories, zips the inner directory, and sends the zipped file to the user. If I run the app on my machine locally, it behaves as expected and leaves no directories or files behind, since both context managers should clean up after the zip file is served.
However when I run this on PythonAnywhere, it will throw this error:
OSError: [Errno 39] Directory not empty: '/home/{user}/{app}/{tempdirectory}'
Logically, the error makes sense since the temp directory still contains the zip file, but why is there a difference in behavior between running it in production and locally?
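One way to avoid depending on cleanup order is to read the archive into memory before the context managers exit, so nothing on disk is still needed when the response is sent. A sketch using the question's names (the send_file call is shown as a comment, assuming the same Flask setup):

```python
import io
import os
import shutil
import tempfile

def zip_to_memory(src_dir):
    """Build the zip inside its own temp dir, then return its bytes,
    so every on-disk file can be removed before the response goes out."""
    with tempfile.TemporaryDirectory() as tmpdir:
        archive = shutil.make_archive(
            os.path.join(tmpdir, "azipfile"), "zip", src_dir)
        with open(archive, "rb") as f:
            return io.BytesIO(f.read())

# in the view:
# return send_file(zip_to_memory(wrapper_dir), mimetype="application/zip",
#                  attachment_filename="azipfile.zip", as_attachment=True)
```

Because send_file accepts a file-like object, the temporary directories can finish their cleanup before the response is streamed, which sidesteps the "Directory not empty" race entirely.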

Have Dialogflow's fulfillment webhook enabled on every intent

I'm using fulfillment webhooks to store analytics data on my servers, so I need it enabled on every possible intent. So far I've been doing it by manually checking "Enable webhook call for this intent" on every intent. That is kinda dangerous though, as it would be easy to forget doing it on an intent. Is there any global way to have it enabled for all intents?
There is no direct way to do this, but I have written a Python script for it. Follow these steps:

1. Export your agent: go to your agent's settings, open the Export and Import tab, and select Export as zip. This gives you a zip file of your agent.
2. Put the zip file in the same folder as the Python script below.
3. Run the script; a folder named zipped will be created.
4. Go inside that folder, select all the files and folders in it, and zip them.
5. Restore your agent: in the Export and Import tab, select Restore from zip and choose the zip file you created in the previous step.
Python code:
import zipfile
import json
import os
import glob

cwd = os.getcwd()
with zipfile.ZipFile(os.path.join(cwd, 'your_agent.zip'), 'r') as zip_ref:
    zip_ref.extractall('zipped')

intents_dir = os.path.join(cwd, 'zipped', 'intents')
for file in glob.glob(os.path.join(intents_dir, "*.json")):
    print(file)
    # skip the *_usersays_*.json files; only intent definitions take the flag
    if "usersay" not in file:
        with open(file) as f:
            json_data = json.load(f)
        json_data['webhookUsed'] = True
        with open(file, 'w') as outfile:
            json.dump(json_data, outfile)
print('Done')
Hope it helps.
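The manual re-zipping step can also be scripted instead of selecting the files by hand. A sketch using shutil.make_archive (root_dir keeps the files at the archive root rather than inside a wrapping folder, which is what the restore expects; the names are examples):

```python
import shutil

def rezip_agent(extracted_dir, out_name="your_agent_updated"):
    """Zip the contents of `extracted_dir` so the files sit at the
    archive root, not inside a wrapping folder; returns the zip path."""
    return shutil.make_archive(out_name, "zip", root_dir=extracted_dir)
```

The returned path is the file to pick in the Restore from zip dialog.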

Can Dirsync for Python sync files and folders in two directions

I want to create a script to sync files between two directories, and was going to utilise Dirsync and Python 3 for this.
from dirsync import sync

sync('C:/03py/Sync/Sync1', 'C:/03py/Sync/Sync2', 'sync', twoway=True, create=True)
After running the file for the first time, the folders are synced. I then put a dummy file and folder into the target directory and reran the script, hoping the file and folder would be copied back into the source directory. However, I get the following:
Only in C:/03py/Sync/Sync2
<< TESTTWOFOLDER
<< _TESTTWOWAY.txt
I am not certain if I am using the above commands correctly.
I don't know if this helps...
from dirsync import sync

source_path = 'C:\\wamp\\www\\first-python-app\\http'
target_path = 'C:\\wamp\\www\\first-python-app\\dev'

# sync the target back to the source first, then the source to the target
sync(target_path, source_path, 'sync', twoway=True, purge=True)
sync(source_path, target_path, 'sync')

Zip files in the cloud (Amazon S3) without writing them to a local file first (no write privileges)

I need to zip some files in Amazon S3 without writing them to local files first. My code worked in development, but I don't have write privileges in production.
import os
import zipfile
from io import BytesIO

folder = output_dir
files = fs.glob(folder)

f = BytesIO()
zf = zipfile.ZipFile(f, 'a', zipfile.ZIP_DEFLATED)
for file in files:
    filename = os.path.basename(file)
    image = fs.get(file, filename)  # downloads the object to a local file
    zf.write(filename)
zf.close()
The problem in production is at this line:
image = fs.get(file, filename)
because I don't have write privileges, and fs.get downloads the object to a local file.
My last resort is to write to the /tmp/ directory, which I do have access to.
Is there a way to zip files from a URL path, or directly in the cloud?
I ended up using Python's tempfile module, which turned out to be a perfect solution.
NamedTemporaryFile gave me named, system-visible temporary files that are deleted automatically; no manual cleanup.
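For completeness, a fully in-memory variant is also possible: ZipFile.writestr takes raw bytes, so nothing ever touches the local filesystem. A minimal sketch with literal bytes standing in for the S3 reads (with an s3fs-style fs object, as assumed in the question, you would replace them with fs.open(key, "rb").read()):

```python
import zipfile
from io import BytesIO

# stand-ins for the S3 objects; replace with fs.open(key, "rb").read()
files = {"image1.png": b"fake-bytes-1", "image2.png": b"fake-bytes-2"}

buf = BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
    for name, data in files.items():
        zf.writestr(name, data)  # write the bytes straight into the archive

buf.seek(0)  # buf now holds the complete zip, ready to upload or return
```

The finished buffer can then be uploaded back to S3 or returned from a request handler without any write privileges on the host.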
