flask wtf upload image file to aws s3 - python-3.x

I want to upload an image to AWS S3 using a presigned URL, where the image is selected by the user.
So I made a file input field using a Flask-WTF FileField.
After the user submits the form (PostForm), I want to get the image's data and send it to the presigned URL.
But I don't know how to get the image file's information from the form.
Please help me!
I used the example code from the tutorial below, but the difference is that I use Flask-WTF, not a local image file.
https://boto3.amazonaws.com/v1/documentation/api/latest/guide/s3-presigned-urls.html
In the following code, the problem is with these lines:
files = {'file': (save_objectname, image.read())}
http_response = requests.post(response['url'], data=response['fields'], files=files)
When I print(image.read()), it shows
b'' <--- nothing...
How can I fix it?
def create_post():
    form = PostForm()
    save_post_foldername = 'post_images'
    if form.validate_on_submit():
        if form.post_image.data:
            image_file = form.post_image.data
            save_post_objectname = generate_filename(image_file)
            # Generate a presigned S3 POST URL
            post_to_aws_s3(image_file, save_post_foldername, save_post_objectname)
def post_to_aws_s3(image, save_foldername, save_objectname):
    fields = {
        "acl": "public-read",
        "Content-Type": image.content_type
    }
    conditions = [
        {"acl": "public-read"},
        {"Content-Type": image.content_type}
    ]
    try:
        response = s3_client.generate_presigned_post(
            Bucket=S3_BUCKET,
            Key=save_foldername + '/' + save_objectname,
            Fields=fields,
            Conditions=conditions,
            ExpiresIn=600
        )
        print(response)
        print(response['url'])
        print(image.content_type)
        # with open(image, 'rb') as f:
        #     files = {'file': ('abc.png', f)}
        #     files = {'file': (image, f)}
        files = {'file': (save_objectname, image.read())}
        http_response = requests.post(response['url'], data=response['fields'], files=files)
        print(image.read())
    except ClientError as e:
        print("error")

ACCESS_ID and ACCESS_SECRET_ID refer to your AWS user's access key ID and secret access key; boto3 will use them to access your S3 bucket as the user you assign to it.
s3 = boto3.resource(
    "s3",
    aws_access_key_id=os.getenv("ACCESS_ID"),
    aws_secret_access_key=os.getenv("ACCESS_SECRET_ID")
)
s3.Bucket("[NAME OF BUCKET]").put_object(
    Key="images/" + request.files["image"].filename,
    Body=request.files["image"]
)
Make sure you include "enctype" on the form in your Jinja template, e.g.
<form method="POST" action="" enctype="multipart/form-data">
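If you would rather keep the presigned-URL approach from the question, the empty b'' strongly suggests the FileStorage stream was already consumed: the stream can only be read once, so any earlier read() (including the one inside files=...) leaves the next read() returning nothing. A minimal sketch of the fix, reusing response and save_objectname from the question's post_to_aws_s3 and rewinding before reading:
import requests

image.seek(0)  # rewind: a consumed FileStorage returns b'' on the next read()
files = {'file': (save_objectname, image.read(), image.content_type)}
http_response = requests.post(response['url'], data=response['fields'], files=files)
# don't print(image.read()) afterwards; the stream is exhausted again,
# so seek(0) first if you need the bytes a second time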

Related

PYTHON FLASK - request.files displays as file not found even though it exists

I am trying to trigger an external API from Postman by passing uploadFile in the body as form-data. The code below throws 'FileNotFoundError: [Errno 2] No such file or directory:'
Note: In Postman, uploadFile takes a file from my local desktop as input. I have also modified the Postman settings to allow access to files outside the working directory.
Any help would be highly appreciated.
Below is the code:
@app.route('/upload', methods=['POST'])
@auth.login_required
def post_upload():
    payload = {
        'table_name': 'incident', 'table_sys_id': request.form['table_sys_id']
    }
    files = {'file': (request.files['uploadFile'], open(request.files['uploadFile'].filename, 'rb'),
                      'image/jpg', {'Expires': '0'})}
    response = requests.post(url, headers=headers, files=files, data=payload)
    return jsonify("Success- Attachment uploaded successfully ", 200)
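For what it's worth, the likely cause: request.files['uploadFile'] is an in-memory FileStorage, so open(request.files['uploadFile'].filename, 'rb') looks for that filename on the server's local disk, where it does not exist. A sketch that passes the uploaded stream straight through instead, assuming the external API accepts an ordinary multipart upload:
@app.route('/upload', methods=['POST'])
@auth.login_required
def post_upload():
    payload = {'table_name': 'incident', 'table_sys_id': request.form['table_sys_id']}
    f = request.files['uploadFile']
    # hand requests the FileStorage stream directly; nothing is re-opened from disk
    files = {'file': (f.filename, f.stream, f.mimetype)}
    response = requests.post(url, headers=headers, files=files, data=payload)
    return jsonify("Success- Attachment uploaded successfully", 200)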
Have you defined UPLOAD_FOLDER? Please see: https://flask.palletsprojects.com/en/latest/patterns/fileuploads/#a-gentle-introduction
I am passing the attribute (uploadFile) in the body as form-data; can this be passed as raw JSON?
You cannot upload files with JSON. But one hacky way to achieve this is to base64-encode the file before sending it. This way you do not upload the file itself; instead you send the file content encoded in base64 format.
Server side:
import base64
file_content = base64.b64decode(request.json['file_buffer_b64'])
Client side:
-> Javascript:
const response = await axios.post(uri, {file_buffer_b64: window.btoa(file)})
-> Python:
import base64

with open("path/to/uploadFile", "rb") as myfile:  # local path of the file to send
    encoded_string = base64.b64encode(myfile.read()).decode("ascii")
payload = {"file_buffer_b64": encoded_string}
response = requests.post(url, json=payload)
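To round this out, a minimal sketch of the receiving Flask endpoint that decodes and saves the payload (the route name and target filename are illustrative assumptions):
import base64
from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route('/upload', methods=['POST'])
def upload_b64():
    # decode the base64 payload back into the raw file bytes
    file_content = base64.b64decode(request.json['file_buffer_b64'])
    with open('uploaded_file', 'wb') as out:  # illustrative target path
        out.write(file_content)
    return jsonify("Success"), 200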

Using local image for Read 3.0, Azure Cognitive Service, Computer Vision

I am attempting to use a local image in my text recognition script. The documentation has the following example (https://learn.microsoft.com/en-us/azure/cognitive-services/computer-vision/quickstarts/python-hand-text):
But when I change image_url to a local file path, it raises HTTPError: 400 Client Error: Bad Request for url. I have tried following other tutorials but nothing seems to work.
Any help would be greatly appreciated :)
The Cognitive Services API will not be able to locate an image via the URL of a file on your local machine. Instead, you can call the same endpoint with the binary data of your image in the body of the request.
Replace the following lines in the sample Python code
image_url = "https://raw.githubusercontent.com/MicrosoftDocs/azure-docs/master/articles/cognitive-services/Computer-vision/Images/readsample.jpg"
headers = {'Ocp-Apim-Subscription-Key': subscription_key}
data = {'url': image_url}
response = requests.post(
    text_recognition_url, headers=headers, json=data)
with
headers = {'Ocp-Apim-Subscription-Key': subscription_key, 'Content-Type': 'application/octet-stream'}
with open('YOUR_LOCAL_IMAGE_FILE', 'rb') as f:
    data = f.read()
response = requests.post(
    text_recognition_url, headers=headers, data=data)
And replace the following line:
image = Image.open(BytesIO(requests.get(image_url).content))
with
image = Image.open('./YOUR_LOCAL_IMAGE_FILE.png')
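Note that Read 3.0 is asynchronous: the POST returns 202 Accepted with an Operation-Location header that you then poll for the result. A short sketch of that step under the modified request above (the field names follow the v3.0 response shape):
import time

operation_url = response.headers["Operation-Location"]
poll_headers = {'Ocp-Apim-Subscription-Key': subscription_key}

analysis = {}
while True:
    analysis = requests.get(operation_url, headers=poll_headers).json()
    if analysis.get("status") in ("succeeded", "failed"):
        break
    time.sleep(1)  # the operation usually finishes within a few seconds

for page in analysis.get("analyzeResult", {}).get("readResults", []):
    for line in page["lines"]:
        print(line["text"])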

Download Zoom Recordings to Local Machine and use Resumable Upload Method to upload the video to Google Drive

I have built a function in Python 3 that retrieves data from a Google Spreadsheet. The data contains each recording's Download_URL and other information.
The function downloads the recording and stores it on the local machine. Once the video is saved, the function uploads it to Google Drive using the resumable upload method.
Even though the response from the resumable upload method is 200 and it also gives me the ID of the file, I can't seem to find the file anywhere in my Google Drive. Below is my code.
import os
import requests
import json
import gspread
from oauth2client.service_account import ServiceAccountCredentials

DOWNLOAD_DIRECTORY = 'Parent_Folder'

def upload_recording(file_location, file_name):
    filesize = os.path.getsize(file_location)
    # Retrieve session for resumable upload.
    headers = {"Authorization": "Bearer " + access_token, "Content-Type": "application/json"}
    params = {
        "name": file_name,
        "mimeType": "video/mp4"
    }
    r = requests.post(
        "https://www.googleapis.com/upload/drive/v3/files?uploadType=resumable",
        headers=headers,
        data=json.dumps(params)
    )
    print(r)
    location = r.headers['Location']
    # Upload the file.
    headers = {"Content-Range": "bytes 0-" + str(filesize - 1) + "/" + str(filesize)}
    r = requests.put(
        location,
        headers=headers,
        data=open(file_location, 'rb')
    )
    print(r.text)
    return True

def download_recording(download_url, foldername, filename):
    upload_success = False
    dl_dir = os.sep.join([DOWNLOAD_DIRECTORY, foldername])
    full_filename = os.sep.join([dl_dir, filename])
    os.makedirs(dl_dir, exist_ok=True)
    response = requests.get(download_url, stream=True)
    try:
        with open(full_filename, 'wb') as fd:
            for chunk in response.iter_content(chunk_size=512 * 1024):
                fd.write(chunk)
        upload_success = upload_recording(full_filename, filename)
        return upload_success
    except Exception as e:
        # if there was some exception, print the error and return False
        print(e)
        return upload_success

def main():
    scope = ["https://spreadsheets.google.com/feeds", "https://www.googleapis.com/auth/spreadsheets",
             "https://www.googleapis.com/auth/drive.file", "https://www.googleapis.com/auth/drive"]
    creds = ServiceAccountCredentials.from_json_keyfile_name('creds.json', scope)
    client = gspread.authorize(creds)
    sheet = client.open("Zoom Recordings Data").sheet1
    data = sheet.get_all_records()
    # Get the Recordings information that are needed to download
    for index in range(len(sheet.col_values(9)) + 1, len(data) + 2):
        success = False
        getRow = sheet.row_values(index)
        session_name = getRow[0]
        topic = getRow[1]
        topic = topic.replace('/', '')
        topic = topic.replace(':', '')
        account_name = getRow[2]
        start_date = getRow[3]
        file_size = getRow[4]
        file_type = getRow[5]
        url_token = getRow[6] + '?access_token=' + getRow[7]
        file_name = start_date + ' - ' + topic + '.' + file_type.lower()
        file_destination = session_name + '/' + account_name + '/' + topic
        success |= download_recording(url_token, file_destination, file_name)
        # Update status on Google Sheet
        if success:
            cell = 'I' + str(index)
            sheet.update_acell(cell, 'success')

if __name__ == "__main__":
    credentials = ServiceAccountCredentials.from_json_keyfile_name(
        'creds.json',
        scopes='https://www.googleapis.com/auth/drive'
    )
    delegated_credentials = credentials.create_delegated('Service_Account_client_email')
    access_token = delegated_credentials.get_access_token().access_token
    main()
I'm still trying to figure out how to upload the video to the folder it needs to be in. I'm very new to Python and the Drive API, so I would appreciate any suggestions.
How about this answer?
Issue and solution:
Even though the response from the Resumable Upload method is 200 and it also gives me the Id of the file, I can't seem to find the file anywhere in my Google Drive. Below is my code.
I think that your script is correct for the resumable upload. From your situation and your script, I understand that the script worked and the file was uploaded to Google Drive with the resumable upload.
And when I looked at your issue of I can't seem to find the file anywhere in my Google Drive alongside your script, I noticed that you are uploading the file using an access token retrieved by the service account. In this case, the uploaded file is put in the Drive of the service account. Your Google Drive is different from the Drive of the service account, so you cannot see the uploaded file in the browser. For this situation, I would like to propose the following two methods.
Pattern 1:
The owner of the uploaded file is the service account. In this pattern, share the uploaded file with your Google account. The function upload_recording is modified as follows; please set the email address of your Google account in emailAddress.
Modified script:
def upload_recording(file_location, file_name):
    filesize = os.path.getsize(file_location)
    # Retrieve session for resumable upload.
    headers1 = {"Authorization": "Bearer " + access_token, "Content-Type": "application/json"}  # Modified
    params = {
        "name": file_name,
        "mimeType": "video/mp4"
    }
    r = requests.post(
        "https://www.googleapis.com/upload/drive/v3/files?uploadType=resumable",
        headers=headers1,  # Modified
        data=json.dumps(params)
    )
    print(r)
    location = r.headers['Location']
    # Upload the file.
    headers2 = {"Content-Range": "bytes 0-" + str(filesize - 1) + "/" + str(filesize)}  # Modified
    r = requests.put(
        location,
        headers=headers2,  # Modified
        data=open(file_location, 'rb')
    )
    # I added below script.
    fileId = r.json()['id']
    permissions = {
        "role": "writer",
        "type": "user",
        "emailAddress": "###"  # <--- Please set your email address of your Google account.
    }
    r2 = requests.post(
        "https://www.googleapis.com/drive/v3/files/" + fileId + "/permissions",
        headers=headers1,
        data=json.dumps(permissions)
    )
    print(r2.text)
    return True
When you run the modified script above, you can see the uploaded file under "Shared with me" in your Google Drive.
Pattern 2:
In this pattern, the file is uploaded to a shared folder using the resumable upload with the service account. So at first, please prepare a folder in your Google Drive and share it with the email address of the service account.
Modified script:
Please modify the function upload_recording as follows, and set the ID of the folder you shared with the service account.
From:
params = {
    "name": file_name,
    "mimeType": "video/mp4"
}
To:
params = {
    "name": file_name,
    "mimeType": "video/mp4",
    "parents": ["###"]  # <--- Please set the folder ID you shared with the service account.
}
When you run the modified script above, you can see the uploaded file in the shared folder of your Google Drive.
Note:
In this case, the owner is the service account. Of course, the owner can also be changed; a sketch of that call follows the reference below.
Reference:
Permissions: create
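For completeness, a hedged sketch of that ownership change via Permissions: create; note that Google generally restricts ownership transfer to accounts in the same domain, so treat this as an assumption about your setup:
import json
import requests

def transfer_ownership(file_id, new_owner_email, access_token):
    # transferOwnership=true is required when granting the "owner" role
    url = ("https://www.googleapis.com/drive/v3/files/" + file_id +
           "/permissions?transferOwnership=true")
    headers = {"Authorization": "Bearer " + access_token,
               "Content-Type": "application/json"}
    permission = {"role": "owner", "type": "user", "emailAddress": new_owner_email}
    r = requests.post(url, headers=headers, data=json.dumps(permission))
    print(r.text)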

How can I upload or store images in a folder using Flask?

I currently have my images in binary format along with each image's extension. I would like to store the images in a folder using Flask. How can I do that?
Format of the image.
image = {
'img_src': binary_format_of_image,
'ext': image_extension,
'id': image_id
}
Flask has an intro to uploading files through your Flask application.
An approach from that page:
def allowed_file(filename):
    return '.' in filename and \
           filename.rsplit('.', 1)[1].lower() in ALLOWED_EXTENSIONS

@app.route('/upload-my-image', methods=['POST'])
def upload_file():
    # check if the post request has the file part
    if 'file' not in request.files:
        flash('No file part')
        return redirect(request.url)
    file = request.files['file']
    # if user does not select file, browser also
    # submit an empty part without filename
    if file.filename == '':
        flash('No selected file')
        return redirect(request.url)
    if file and allowed_file(file.filename):
        filename = secure_filename(file.filename)
        file.save(os.path.join(ABSOLUTE_PATH_TO_YOUR_FOLDER, filename))
        new_image = Image(
            path=PATH_TO_YOUR_FOLDER,
            filename=filename,
            ext=filename.rsplit('.', 1)[1].lower()
        )
        # Save new_image model
        return redirect(url_for('uploaded_file', filename=filename))
The data structure that Flask's request.files uses is a FileStorage.
Also note that the <form> tag has to be marked with enctype=multipart/form-data and contain an <input type=file>; a sketch of the remaining setup follows below.
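The snippet above references a few names it never defines; here is a minimal sketch of that surrounding setup (the constant values are illustrative assumptions, and Image stands in for whatever model your application uses):
import os
from flask import Flask, flash, redirect, request, url_for
from werkzeug.utils import secure_filename

app = Flask(__name__)
app.secret_key = "change-me"  # required for flash()

# illustrative values; point these at your own storage folder
ALLOWED_EXTENSIONS = {'png', 'jpg', 'jpeg', 'gif'}
ABSOLUTE_PATH_TO_YOUR_FOLDER = os.path.abspath('uploads')
PATH_TO_YOUR_FOLDER = 'uploads'
os.makedirs(ABSOLUTE_PATH_TO_YOUR_FOLDER, exist_ok=True)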

Upload file to AWS Device Farm

I am trying to use Python + boto3 to create an upload in Device Farm (uploading a test or an application). The create_upload method works fine, as it returns an upload ARN and a URL to upload to.
When I try to use requests to upload a file to this URL, I get an error:
<Error><Code>SignatureDoesNotMatch</Code><Message>The request signature we calculated does not match the signature you provided. Check your key and signing method.</Message><AWSAccessKeyId>AKIAJV4C3CWPBUMBC3GA</AWSAccessKeyId><StringToSign>AWS4-HMAC-SHA256
My code:
response = client.create_upload(
    projectArn=df_project,
    name="test.zip",
    type="APPIUM_JAVA_TESTNG_TEST_PACKAGE",
    contentType='application/octet-stream'
)
test_url = response["upload"]["url"]
files = {'upload_file': open('/tmp/test.zip', 'rb')}
r = requests.post(test_url, files=files, data={})
I also tried using curl, and requests.put passing the file to the data attribute:
r = requests.put(test_url, data=open("/tmp/test.zip", "rb").read())
print(r.text)
and
cmd = "curl --request PUT --upload-file /tmp/test.zip \""+test_url+"\""
result = subprocess.call(cmd, shell=True)
print(result)
I did this before in a past project. Here is a code snippet of how I did it:
# create a new upload
# http://boto3.readthedocs.io/en/latest/reference/services/devicefarm.html#DeviceFarm.Client.create_upload
print('Creating the upload presigned url')
response = client.create_upload(projectArn=args.projectARN,
                                name=str(args.testPackageZip),
                                type='APPIUM_WEB_JAVA_TESTNG_TEST_PACKAGE')

# the s3 bucket to store the uploaded test suite
uploadArn = response['upload']['arn']
preSignedUrl = response['upload']['url']
print('uploadArn: ' + uploadArn + '\n')
print('pre-signed url: ' + preSignedUrl + '\n')

# print the status of the upload
# http://boto3.readthedocs.io/en/latest/reference/services/devicefarm.html#DeviceFarm.Client.get_upload
status = client.get_upload(arn=uploadArn)
print("S3 Upload Bucket Status: " + status['upload']['status'] + '\n')

print("Uploading file...")
# open the file and make the payload
payload = {'file': open(args.testPackageZip, 'rb')}
# try to perform the upload
r = requests.put(url=preSignedUrl, files=payload)
# print the response code back
print(r)
Hope that helps
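One follow-up: Device Farm processes the package after the PUT, so it can be worth polling get_upload until the status settles before scheduling a run; a small sketch built on the snippet above:
import time

# poll until Device Farm finishes processing the uploaded package
while True:
    status = client.get_upload(arn=uploadArn)['upload']['status']
    if status in ('SUCCEEDED', 'FAILED'):
        break
    time.sleep(5)
print('Final upload status: ' + status)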
I work for AWS Device Farm. This problem occurs when some of the request parameters don't match what was used to sign the URL. In the cases I have seen, it has to do with the content type. I would suggest not passing it in the CreateUpload request; if you do pass it, you'll also need to include it as a request header when making the PUT call, as in the sketch below.
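A sketch of that second option, sending the same Content-Type the URL was signed with (this assumes the create_upload call passed contentType='application/octet-stream', as in the question):
# the Content-Type header must match the contentType given to create_upload
with open('/tmp/test.zip', 'rb') as f:
    r = requests.put(
        test_url,
        data=f,
        headers={'Content-Type': 'application/octet-stream'}
    )
print(r.status_code)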
