We have a Cosmos DB from which we are trying to retrieve data, using different parameters, through an HTTP-triggered function API. I'm aware of the Cosmos input binding, but for that I need to put my SQL query in the function.json file, which is fine if all the parameters are present in every query. The problem is that I would like to fetch data based on different parameters, and not all of them will be sent with each request. Is there a way to make this function API dynamic enough to build the SQL query at run time and fetch the data from Cosmos?
Instead of using the input binding, I tried using the Cosmos Python SDK but got the error below: "Exception: AttributeError: module 'azure.functions' has no attribute 'In'".
When I run a Python program that accesses Cosmos outside of Functions, it works fine, which tells me the necessary libraries are already installed. It only fails when called from a Python function. I've already checked that I'm pointing to the correct interpreter (Python 3.8).
Is there something I'm missing? Below is my code:
import logging
from azure.cosmos import exceptions, CosmosClient, PartitionKey
import azure.functions as func


def main(req: func.HttpRequest) -> func.HttpResponse:
    logging.info('Python HTTP trigger function processed a request.')
    name = req.params.get('name')
    if not name:
        try:
            req_body = req.get_json()
        except ValueError:
            pass
        else:
            name = req_body.get('name')

    if name:
        endpoint = "https://xxxx.documents.azure.com:443/"
        key = '******=='
        client = CosmosClient(endpoint, key)
        database_name = 'mydb'
        database = client.create_database_if_not_exists(id=database_name)
        container_name = 'mycoll'
        container = database.create_container_if_not_exists(
            id=container_name,
            partition_key=PartitionKey(path="/name"),
            offer_throughput=400)
        query = 'SELECT * FROM c WHERE c.name = "Anupam"'
        items = list(container.query_items(query=query, enable_cross_partition_query=True))
        print(items)
        return func.HttpResponse(items, status_code=200)
    else:
        return func.HttpResponse(
            "This HTTP triggered function executed successfully. Pass a name in the query string or in the request body for a personalized response.",
            status_code=200
        )
I added the below to requirements.txt:
azure-functions
azure-cosmos
[2021-02-10T06:34:16.248Z] Worker failed to load function id 4af477f8-eff0-4937-b87f-98f7828d95ec.
[2021-02-10T06:34:16.250Z] Result: Failure
Exception: AttributeError: module 'azure.functions' has no attribute 'In'
Stack: File "C:\Program Files\Microsoft\Azure Functions Core Tools\workers\python\3.8/WINDOWS/X64\azure_functions_worker\dispatcher.py", line 271, in _handle__function_load_request
func = loader.load_function(
File "C:\Program Files\Microsoft\Azure Functions Core Tools\workers\python\3.8/WINDOWS/X64\azure_functions_worker\utils\wrappers.py", line 32, in call
return func(*args, **kwargs)
File "C:\Program Files\Microsoft\Azure Functions Core Tools\workers\python\3.8/WINDOWS/X64\azure_functions_worker\loader.py", line 76, in load_function
mod = importlib.import_module(fullmodname)
File "C:\Users\dell\AppData\Local\Programs\Python\Python38\lib\importlib\__init__.py", line 127, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
File "<frozen importlib._bootstrap>", line 991, in _find_and_load
File "<frozen importlib._bootstrap>", line 961, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
File "<frozen importlib._bootstrap>", line 991, in _find_and_load
File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 671, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 783, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "D:\Visual Studio\Projects\Functions1\SBTopicTrigger1\__init__.py", line 5, in <module>
def main(message: func.ServiceBusMessage, inputdocument: func.In[func.Document], outputSbMsg:func.ServiceBusMessage):
Below is my function.json
{
    "scriptFile": "__init__.py",
    "bindings": [
        {
            "authLevel": "anonymous",
            "type": "httpTrigger",
            "direction": "in",
            "name": "req",
            "methods": [
                "get",
                "post"
            ]
        },
        {
            "type": "http",
            "direction": "out",
            "name": "$return"
        }
    ]
}
Any ideas how to get rid of this "Exception: AttributeError: module 'azure.functions' has no attribute 'In'" error?
RTM
The Cosmos input binding is missing from your function.json. E.g.:
{
    "type": "cosmosDB",
    "name": "todoitems",
    "databaseName": "ToDoItems",
    "collectionName": "Items",
    "connectionStringSetting": "CosmosDBConnection",
    "direction": "in",
    "Id": "{Query.id}",
    "PartitionKey": "{Query.partitionKeyValue}"
}
The input parameter type in your Python code is also wrong: it should be func.DocumentList, not func.In.
import logging
import azure.functions as func


def main(req: func.HttpRequest, todoitems: func.DocumentList) -> str:
    if not todoitems:
        logging.warning("ToDo item not found")
    else:
        logging.info("Found ToDo item, Description=%s", todoitems[0]['description'])
    return 'OK'
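As for the dynamic-query part of the question: if you go with the SDK rather than the binding, you can build a parameterized Cosmos SQL query at run time from whichever request parameters were actually supplied. The sketch below is just that, a sketch; the filter fields ("name", "city") are hypothetical placeholders for your own schema, and the commented call assumes a `container` set up as in the question.

```python
# Sketch: build a parameterized Cosmos SQL query from optional request params.
# The field names ("name", "city") are placeholders for your own schema.

def build_query(params):
    """Return (query_text, parameters) suitable for container.query_items()."""
    clauses = []
    parameters = []
    for field in ("name", "city"):  # whichever filters your API supports
        value = params.get(field)
        if value is not None:
            clauses.append(f"c.{field} = @{field}")
            parameters.append({"name": f"@{field}", "value": value})
    query = "SELECT * FROM c"
    if clauses:
        query += " WHERE " + " AND ".join(clauses)
    return query, parameters


# Inside the function body you would then call (container as in the question):
# query, parameters = build_query(req.params)
# items = list(container.query_items(
#     query=query, parameters=parameters, enable_cross_partition_query=True))
```

Passing the values through the `parameters` list (rather than concatenating them into the query string) also avoids injection issues with user-supplied input.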
Related
I'm trying out Azure Container Instances (ACI) but ran straight into problems:
I'm trying to get the Bitpoll Docker image chessmasterrr/bitpoll running in ACI and am struggling with how to mount the settings file.
The manual on Docker Hub says to mount the file as /path/on/host/settings_local.py:/Bitpoll/bitpoll/settings_local.py.
Does anybody have an idea how to get that running in Azure?
I tried putting it in an Azure file share and mounting it as a base64-encoded secret.
The following configuration helped me deploy the ACI with the Docker image. However, issues remain with the container termination loop...
To deploy the ACI container 'successfully':
Create a storage account with two file shares:
The names are arbitrary and can be anything.
What is important is that one of them will hold the 'settings.py' file and the other the database content.
In one of the file shares, put the 'settings.py' file. The Docker Hub page mentions a 'settings_local.py' file, but as far as I can tell that does not work, as this output shows:
Traceback (most recent call last):
File "/usr/local/lib/python3.9/site-packages/django/core/management/base.py", line 323, in run_from_argv
self.execute(*args, **cmd_options)
File "/usr/local/lib/python3.9/site-packages/django/core/management/base.py", line 361, in execute
self.check()
File "/usr/local/lib/python3.9/site-packages/django/core/management/base.py", line 387, in check
all_issues = self._run_checks(
File "/usr/local/lib/python3.9/site-packages/django/core/management/commands/migrate.py", line 64, in _run_checks
issues = run_checks(tags=[Tags.database])
File "/usr/local/lib/python3.9/site-packages/django/core/checks/registry.py", line 72, in run_checks
new_errors = check(app_configs=app_configs)
File "/usr/local/lib/python3.9/site-packages/django/core/checks/database.py", line 9, in check_database_backends
for conn in connections.all():
File "/usr/local/lib/python3.9/site-packages/django/db/utils.py", line 216, in all
return [self[alias] for alias in self]
File "/usr/local/lib/python3.9/site-packages/django/db/utils.py", line 213, in __iter__
return iter(self.databases)
File "/usr/local/lib/python3.9/site-packages/django/utils/functional.py", line 80, in __get__
res = instance.__dict__[self.name] = self.func(instance)
File "/usr/local/lib/python3.9/site-packages/django/db/utils.py", line 147, in databases
self._databases = settings.DATABASES
File "/usr/local/lib/python3.9/site-packages/django/conf/__init__.py", line 79, in __getattr__
self._setup(name)
File "/usr/local/lib/python3.9/site-packages/django/conf/__init__.py", line 66, in _setup
self._wrapped = Settings(settings_module)
File "/usr/local/lib/python3.9/site-packages/django/conf/__init__.py", line 157, in __init__
mod = importlib.import_module(self.SETTINGS_MODULE)
File "/usr/local/lib/python3.9/importlib/__init__.py", line 127, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "<frozen importlib._bootstrap>", line 1030, in _gcd_import
File "<frozen importlib._bootstrap>", line 1007, in _find_and_load
File "<frozen importlib._bootstrap>", line 984, in _find_and_load_unlocked
ModuleNotFoundError: No module named 'bitpoll.settings'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/Bitpoll/./manage.py", line 10, in <module>
execute_from_command_line(sys.argv)
File "/usr/local/lib/python3.9/site-packages/django/core/management/__init__.py", line 381, in execute_from_command_line
utility.execute()
File "/usr/local/lib/python3.9/site-packages/django/core/management/__init__.py", line 375, in execute
self.fetch_command(subcommand).run_from_argv(self.argv)
File "/usr/local/lib/python3.9/site-packages/django/core/management/base.py", line 336, in run_from_argv
connections.close_all()
File "/usr/local/lib/python3.9/site-packages/django/db/utils.py", line 219, in close_all
for alias in self:
File "/usr/local/lib/python3.9/site-packages/django/db/utils.py", line 213, in __iter__
return iter(self.databases)
File "/usr/local/lib/python3.9/site-packages/django/utils/functional.py", line 80, in __get__
res = instance.__dict__[self.name] = self.func(instance)
File "/usr/local/lib/python3.9/site-packages/django/db/utils.py", line 147, in databases
self._databases = settings.DATABASES
File "/usr/local/lib/python3.9/site-packages/django/conf/__init__.py", line 79, in __getattr__
self._setup(name)
File "/usr/local/lib/python3.9/site-packages/django/conf/__init__.py", line 66, in _setup
self._wrapped = Settings(settings_module)
File "/usr/local/lib/python3.9/site-packages/django/conf/__init__.py", line 157, in __init__
mod = importlib.import_module(self.SETTINGS_MODULE)
File "/usr/local/lib/python3.9/importlib/__init__.py", line 127, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "<frozen importlib._bootstrap>", line 1030, in _gcd_import
File "<frozen importlib._bootstrap>", line 1007, in _find_and_load
File "<frozen importlib._bootstrap>", line 984, in _find_and_load_unlocked
ModuleNotFoundError: No module named 'bitpoll.settings'
And then the ARM template:
{
    "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "containerGroupName": {
            "type": "string",
            "metadata": {
                "description": "Name of the Azure Container Instance group"
            },
            "defaultValue": "demo_container_group_unique"
        },
        "location": {
            "type": "string",
            "metadata": {
                "description": "Location of the resources to be deployed"
            },
            "defaultValue": "westeurope"
        },
        "storageAccountName": {
            "type": "string",
            "metadata": {
                "description": "Name of the storage account"
            },
            "defaultValue": "storage_account_unique"
        },
        "storageAccountKeyValue": {
            "type": "string",
            "metadata": {
                "description": "The storage account key (value)"
            }
        }
    },
    "functions": [],
    "variables": {},
    "resources": [
        {
            "type": "Microsoft.ContainerInstance/containerGroups",
            "apiVersion": "2021-10-01",
            "name": "[parameters('containerGroupName')]",
            "location": "[parameters('location')]",
            "properties": {
                "sku": "Standard",
                "containers": [
                    {
                        "name": "publiccmbp3",
                        "properties": {
                            "command": ["./manage.py", "migrate"],
                            "image": "chessmasterrr/bitpoll",
                            "ports": [
                                {
                                    "port": 8000
                                }
                            ],
                            "environmentVariables": [],
                            "resources": {
                                "requests": {
                                    "memoryInGB": 4,
                                    "cpu": 2
                                }
                            },
                            "volumeMounts": [
                                {
                                    "name": "onlyfile",
                                    "mountPath": "/Bitpoll/bitpoll/",
                                    "readOnly": false
                                },
                                {
                                    "name": "database",
                                    "mountPath": "/Bitpoll/database/",
                                    "readOnly": false
                                }
                            ]
                        }
                    }
                ],
                "restartPolicy": "Never",
                "ipAddress": {
                    "ports": [
                        {
                            "port": 8000
                        }
                    ],
                    "ip": "10.0.2.4",
                    "type": "Public"
                },
                "osType": "Linux",
                "volumes": [
                    {
                        "name": "onlyfile",
                        "azureFile": {
                            "shareName": "onlyfile",
                            "readOnly": false,
                            "storageAccountName": "[parameters('storageAccountName')]",
                            "storageAccountKey": "[parameters('storageAccountKeyValue')]"
                        }
                    },
                    {
                        "name": "database",
                        "azureFile": {
                            "shareName": "database",
                            "readOnly": false,
                            "storageAccountName": "[parameters('storageAccountName')]",
                            "storageAccountKey": "[parameters('storageAccountKeyValue')]"
                        }
                    }
                ]
            }
        }
    ],
    "outputs": {}
}
This template deploys the ACI with the image. It also executes the initial command that is mentioned on docker hub:
"command": ["./manage.py", "migrate"]
Next will be the deployment of the ARM template.
I wrote a small powershell script that assists with deployment:
$ErrorActionPreference = "Stop"
$RESOURCEGROUPNAME = "az204demo"
$STORAGEACCOUNTNAME = "az204ferdi"
$NEWRESOURCEGROUP = "testres110"
$ARMTEMPLATE = "TemplateACiExample.json"
$ACINAME = "myacitest11"
Connect-AzAccount
$STORAGEACCOUNT = Get-AzResource -ResourceGroupName $RESOURCEGROUPNAME -Name $STORAGEACCOUNTNAME
$STORAGERESOURCEID = $STORAGEACCOUNT.ResourceId
$STORAGEACCOUNTKEYS = Get-AzStorageAccountKey -ResourceGroupName $RESOURCEGROUPNAME -Name $STORAGEACCOUNTNAME
New-AzResourceGroup -Name $NEWRESOURCEGROUP -Location "WestEurope"
New-AzResourceGroupDeployment -ResourceGroupName $NEWRESOURCEGROUP `
-containerGroupName $ACINAME `
-storageAccountName $STORAGEACCOUNTNAME `
-storageAccountKeyValue $STORAGEACCOUNTKEYS[0].Value `
-TemplateFile $ARMTEMPLATE
After deployment, a new SQLite file is created in the database file share.
Unfortunately, the ACI then enters an endless restart/termination loop. The log service provided with ACI also does not offer any information.
Hopefully this answers your question regarding the volumes and helps you successfully use the Docker image.
Kind regards
The following code works fine locally, but fails in AWS Lambda:
import datetime as dt
import json
import os

import pandas as pd
import urllib3 as url  # imports inferred from the calls below

authURL = os.environ['authURL_env']
reportURL = os.environ['reportURL_env']
FirstDayOfPreviousMonth = dt.date.today() - dt.timedelta(days=1)
LastDayOfPreviousMonth = dt.date.today() - dt.timedelta(days=1)
payload = json.dumps({
    "email": os.environ['email_env'],
    "password": os.environ['password_env']
})
headers = {
    'Content-Type': 'application/json'
}
apiConn = url.PoolManager()
# try:
tokenResponse = apiConn.request('POST', authURL, headers=headers, body=payload)
authToken = json.loads(tokenResponse.data)
payload = json.dumps({
    "start_date": str(FirstDayOfPreviousMonth),
    "end_date": str(LastDayOfPreviousMonth),
    "interval": "day",
    "dimensions": [
        "supply_tag_id",
        "demand_partner_id"
    ]
})
headers = {
    'Authorization': authToken['token'],
    'Content-Type': 'application/json'
}
response = apiConn.request('POST', reportURL, headers=headers, body=payload)
vendor_df = pd.read_json(response.data)
In AWS Lambda I get the following error:
{
"errorMessage": "Expected file path name or file-like object, got <class 'bytes'> type",
"errorType": "TypeError",
"requestId": "9f594e9e-7703-420b-b981-e4352f1d64db",
"stackTrace": [
" File \"/var/task/lambda_function.py\", line 56, in lambda_handler\n springserve_df = pd.read_json(response.data)\n",
" File \"/opt/python/pandas/util/_decorators.py\", line 207, in wrapper\n return func(*args, **kwargs)\n",
" File \"/opt/python/pandas/util/_decorators.py\", line 311, in wrapper\n return func(*args, **kwargs)\n",
" File \"/opt/python/pandas/io/json/_json.py\", line 588, in read_json\n json_reader = JsonReader(\n",
" File \"/opt/python/pandas/io/json/_json.py\", line 673, in __init__\n data = self._get_data_from_filepath(filepath_or_buffer)\n",
" File \"/opt/python/pandas/io/json/_json.py\", line 710, in _get_data_from_filepath\n self.handles = get_handle(\n",
" File \"/opt/python/pandas/io/common.py\", line 823, in get_handle\n raise TypeError(\n"
]
}
To add more mystery, this exact code works in an older Lambda. It's only when I try to create it in a new Lambda that it begins to fail.
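For context on that error message: newer pandas versions no longer accept raw bytes in read_json, which would fit the symptom that the new Lambda (presumably on a newer pandas layer) fails while the older one works. A minimal sketch of the usual workaround, assuming response.data is UTF-8 encoded JSON, is to decode the bytes into a file-like object first:

```python
import io


def as_file_like(raw: bytes) -> io.StringIO:
    """Decode an HTTP response body into a file-like object that
    pandas.read_json accepts (newer pandas rejects raw bytes)."""
    return io.StringIO(raw.decode("utf-8"))


# In the Lambda handler this would become (pandas import assumed):
# vendor_df = pd.read_json(as_file_like(response.data))
```

Simply calling response.data.decode("utf-8") and passing the string would work the same way on pandas versions that still accept JSON strings directly.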
I have documents like this:
class Languages(Document):
    name = StringField(required=True, unique=True)
    active = BooleanField(default=True)


class Preferences(Document):
    name = DictField(required=True, unique=True)
    active = BooleanField(default=True)


class Users(Document):
    name = StringField(required=True)
    email = EmailField(required=True, primary_key=True)
    preferences = ListField(ReferenceField(Preferences, required=True))
    languages = ListField(ReferenceField(Languages), required=True)
I am trying to retrieve information from these 3 collections.
Here is the relevant code from my Python view:
output = Users.objects.aggregate([
    {
        '$lookup': {
            "from": "languages",
            "localField": "languages",
            "foreignField": "_id",
            "as": "languages"
        }
    },
    {
        '$lookup': {
            "from": "preferences",
            "localField": "preferences",
            "foreignField": "_id",
            "as": "preferences"
        }
    }
])
return jsonify({'result': output})
but I am getting the error below:
File "D:\user.py", line 51, in get
return jsonify({'result': output})
File "C:\Users\Ideapad\AppData\Local\Programs\Python\Python310\Lib\site-packages\flask\json\__init__.py", line 355, in jsonify
f"{dumps(data, indent=indent, separators=separators)}\n",
File "C:\Users\Ideapad\AppData\Local\Programs\Python\Python310\Lib\site-packages\flask\json\__init__.py", line 133, in dumps
rv = json.dumps(obj, **kwargs)
File "C:\Users\Ideapad\AppData\Local\Programs\Python\Python310\Lib\json\__init__.py", line 238, in dumps
**kw).encode(obj)
File "C:\Users\Ideapad\AppData\Local\Programs\Python\Python310\Lib\json\encoder.py", line 201, in encode
chunks = list(chunks)
File "C:\Users\Ideapad\AppData\Local\Programs\Python\Python310\Lib\json\encoder.py", line 431, in _iterencode
yield from _iterencode_dict(o, _current_indent_level)
File "C:\Users\Ideapad\AppData\Local\Programs\Python\Python310\Lib\json\encoder.py", line 405, in _iterencode_dict
yield from chunks
File "C:\Users\Ideapad\AppData\Local\Programs\Python\Python310\Lib\json\encoder.py", line 438, in _iterencode
o = default(o)
File "C:\Users\Ideapad\AppData\Local\Programs\Python\Python310\Lib\site-packages\flask_mongoengine\json.py", line 19, in default
return superclass.default(self, obj)
File "C:\Users\Ideapad\AppData\Local\Programs\Python\Python310\Lib\site-packages\flask\json\__init__.py", line 57, in default
return super().default(o)
File "C:\Users\Ideapad\AppData\Local\Programs\Python\Python310\Lib\json\encoder.py", line 179, in default
raise TypeError(f'Object of type {o.__class__.__name__} '
TypeError: Object of type CommandCursor is not JSON serializable
It works fine with dumps, though:
dumps(output, ensure_ascii=False)
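The error message points at the cause: jsonify cannot serialize a live CommandCursor. A minimal sketch of the usual fix is to exhaust the cursor into a plain list before handing it to jsonify (the view shape below mirrors the question and is hypothetical):

```python
# jsonify/json.dumps can serialize lists and dicts, but not iterators
# such as pymongo's CommandCursor; materialize the cursor first.

def serializable(cursor):
    """Drain an aggregate cursor into a plain list of documents."""
    return list(cursor)


# In the Flask view this would become:
# output = serializable(Users.objects.aggregate([...]))
# return jsonify({'result': output})
```

Note that non-JSON types inside the documents (e.g. ObjectId values) would still need converting, which is why bson's json_util dumps succeeds where plain jsonify fails.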
My goal is to restrict access to EC2 using a tag key. It works fine if I remove the condition from the IAM policy. However, if I add the aws:TagKeys condition I get an UnauthorizedOperation error. I need some assistance fixing either the IAM policy or the code to work with the tag key.
Here's the IAM policy:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "ec2:DescribeInstances",
                "ec2:DescribeKeyPairs"
            ],
            "Resource": "*",
            "Condition": {
                "ForAnyValue:StringEquals": {
                    "aws:TagKeys": "mytag"
                }
            }
        }
    ]
}
Here's my python code:
import os
import boto3
import json

os.environ['AWS_DEFAULT_REGION'] = 'ap-south-1'
os.environ['AWS_ACCESS_KEY_ID'] = 'myacceskey'
os.environ['AWS_SECRET_ACCESS_KEY'] = 'secret'


def list_instances_by_tag_value(tagkey, tagvalue):
    # When passed a tag key and tag value, this will return a list of InstanceIds that were found.
    ipdict = {}
    ec2client = boto3.client('ec2')
    #response = ec2client.describe_key_pairs()
    #print(response)
    response = ec2client.describe_instances(
        Filters=[
            {
                'Name': 'tag:' + tagkey,
                'Values': [tagvalue]
            }
        ]
    )
    client_dict = {}
    for reservation in (response["Reservations"]):
        print(reservation)


#boto3.set_stream_logger(name='botocore')
output = list_instances_by_tag_value("mytag", "abcd")
Here's the exception:
Traceback (most recent call last):
File "test.py", line 29, in <module>
output = list_instances_by_tag_value("mytag", "abcd")
File "test.py", line 20, in list_instances_by_tag_value
'Values':[tagvalue]
File "C:\python35\lib\site-packages\botocore\client.py", line 272, in _api_call
return self._make_api_call(operation_name, kwargs)
File "C:\python35\lib\site-packages\botocore\client.py", line 576, in _make_api_call
raise error_class(parsed_response, operation_name)
botocore.exceptions.ClientError: An error occurred (UnauthorizedOperation) when calling the DescribeInstances operation: You are not authorized to perform this operation.
I have checked that the tag-key filter is supported by DescribeInstances - https://docs.aws.amazon.com/AWSEC2/latest/APIReference/API_DescribeInstances.html
I also checked a couple of SO threads, after which I changed my action from Describe* to the very specific DescribeInstances.
But it's still not working for me.
Got it: Why does applying a condition to ec2:DescribeInstances in an IAM policy fail?
DescribeInstances does not support resource-level permissions.
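Since DescribeInstances ignores tag conditions, one common pattern (a sketch, not verified against your account) is to allow the Describe* calls unconditionally and put the ec2:ResourceTag condition only on actions that do support resource-level permissions, such as StartInstances/StopInstances:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["ec2:DescribeInstances", "ec2:DescribeKeyPairs"],
            "Resource": "*"
        },
        {
            "Effect": "Allow",
            "Action": ["ec2:StartInstances", "ec2:StopInstances"],
            "Resource": "arn:aws:ec2:*:*:instance/*",
            "Condition": {
                "StringEquals": {"ec2:ResourceTag/mytag": "abcd"}
            }
        }
    ]
}

The tag-based filtering of the Describe output then has to happen client-side, which is what the Filters argument in the Python code above already does.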
Azure Cognitive Services has an OCR demo (westcentralus endpoint) at
https://azure.microsoft.com/en-us/services/cognitive-services/computer-vision/?v=18.05
On a poor test image (which I'm afraid I can't post because it's an identity document), I get OCR results that 100% match the actual text - in three test cases, in fact - remarkable.
However, when I follow the sample at the URL below, with the westeurope endpoint, I get poorer OCR results - some text is missing:
https://learn.microsoft.com/en-us/azure/cognitive-services/Computer-vision/quickstarts/python-print-text
Why is this? More to the point - how do I access the v=18.05 endpoint?
Thanks for all speedy help.
I think I got your point: you are not using the same operation between the 2 pages you mention.
If you read the paragraph just above the working demo you are mentioning here it says:
Get started with the OCR service in general availability, and discover
below a sneak peek of the new preview OCR engine (through "Recognize
Text" API operation) with even better text recognition results for
English.
And if you have a look at the other documentation you are pointing at (this one), they are using the OCR operation:
vision_base_url = "https://westcentralus.api.cognitive.microsoft.com/vision/v2.0/"
ocr_url = vision_base_url + "ocr"
So if you want to use this new preview version, change the operation to recognizeText.
It is available in the West Europe region (see here), and I made a quick test: the samples provided on the Azure demo page work with this operation, but not with the other one.
But this time the operation needs 2 calls:
One POST operation to submit your request (recognizeText operation), where you will get a 202 Accepted answer with an operationId
One GET operation to get the results (textOperations operation), with your operationId from the previous step. For example: https://westeurope.api.cognitive.microsoft.com/vision/v2.0/textOperations/yourOperationId
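A rough Python sketch of that two-step flow, using only the standard library. The endpoint path, the mode query parameter, and the polling details are taken from the v2.0 preview behavior described above and should be treated as assumptions rather than a definitive client:

```python
# Sketch of the two-step Recognize Text call flow: POST the image,
# then poll the textOperations endpoint with the returned operationId.
import json
import time
import urllib.request

ENDPOINT = "https://westeurope.api.cognitive.microsoft.com/vision/v2.0"


def operation_id_from_location(operation_location: str) -> str:
    """The POST answers 202 with an Operation-Location header;
    its last path segment is the operationId used for the GET."""
    return operation_location.rstrip("/").rsplit("/", 1)[-1]


def recognize_text(image_url: str, key: str) -> dict:
    # Step 1: POST the request; the result is NOT in this response.
    req = urllib.request.Request(
        ENDPOINT + "/recognizeText?mode=Printed",
        data=json.dumps({"url": image_url}).encode(),
        headers={"Ocp-Apim-Subscription-Key": key,
                 "Content-Type": "application/json"},
        method="POST")
    with urllib.request.urlopen(req) as resp:
        op_id = operation_id_from_location(resp.headers["Operation-Location"])
    # Step 2: GET /textOperations/{operationId} until a terminal status.
    while True:
        poll = urllib.request.Request(
            ENDPOINT + "/textOperations/" + op_id,
            headers={"Ocp-Apim-Subscription-Key": key})
        with urllib.request.urlopen(poll) as resp:
            result = json.load(resp)
        if result.get("status") in ("Succeeded", "Failed"):
            return result
        time.sleep(1)
```

The "status" field in the polled response goes through Running before reaching Succeeded, which is why the loop is needed.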
DEMO :
For the CLOSED sign from Microsoft Demos:
Result with OCR operation:
{
    "language": "unk",
    "orientation": "NotDetected",
    "textAngle": 0.0,
    "regions": []
}
Result with RecognizeText:
{
    "status": "Succeeded",
    "recognitionResult": {
        "lines": [{
            "boundingBox": [174, 488, 668, 675, 617, 810, 123, 622],
            "text": "CLOSED",
            "words": [{
                "boundingBox": [164, 494, 659, 673, 621, 810, 129, 628],
                "text": "CLOSED"
            }]
        }, {
            "boundingBox": [143, 641, 601, 811, 589, 843, 132, 673],
            "text": "WHEN ONE DOOR CLOSES, ANOTHER",
            "words": [{
                "boundingBox": [147, 646, 217, 671, 205, 698, 134, 669],
                "text": "WHEN"
            }, {
                "boundingBox": [230, 675, 281, 694, 269, 724, 218, 703],
                "text": "ONE"
            }, {
                "boundingBox": [291, 697, 359, 722, 348, 754, 279, 727],
                "text": "DOOR"
            }, {
                "boundingBox": [370, 726, 479, 767, 469, 798, 359, 758],
                "text": "CLOSES,"
            }, {
                "boundingBox": [476, 766, 598, 812, 588, 839, 466, 797],
                "text": "ANOTHER"
            }]
        }, {
            "boundingBox": [56, 668, 645, 886, 633, 919, 44, 700],
            "text": "OPENS.ALL YOU HAVE TO DO IS WALK IN",
            "words": [{
                "boundingBox": [74, 677, 223, 731, 213, 764, 65, 707],
                "text": "OPENS.ALL"
            }, {
                "boundingBox": [233, 735, 291, 756, 280, 789, 223, 767],
                "text": "YOU"
            }, {
                "boundingBox": [298, 759, 377, 788, 367, 821, 288, 792],
                "text": "HAVE"
            }, {
                "boundingBox": [387, 792, 423, 805, 413, 838, 376, 824],
                "text": "TO"
            }, {
                "boundingBox": [431, 808, 472, 824, 461, 855, 420, 841],
                "text": "DO"
            }, {
                "boundingBox": [479, 826, 510, 838, 499, 869, 468, 858],
                "text": "IS"
            }, {
                "boundingBox": [518, 841, 598, 872, 587, 901, 506, 872],
                "text": "WALK"
            }, {
                "boundingBox": [606, 875, 639, 887, 627, 916, 594, 904],
                "text": "IN"
            }]
        }]
    }
}