Azure function to read a blob schema - python-3.x

I am currently making an Azure Function that reads a blob's schema whenever a new file is uploaded. I use a Blob trigger so the function fires on each new upload, but from there I'm stuck. Can anyone help?

Below is sample code for a Blob trigger that reads data. Compare it with your code; it might help you resolve the issue.
{
  "bindings": [
    {
      "queueName": "myqueue-items",
      "connection": "MyStorageConnectionAppSetting",
      "name": "queuemsg",
      "type": "queueTrigger",
      "direction": "in"
    },
    {
      "name": "inputblob",
      "type": "blob",
      "dataType": "binary",
      "path": "samples-workitems/{queueTrigger}",
      "connection": "MyStorageConnectionAppSetting",
      "direction": "in"
    },
    {
      "name": "$return",
      "type": "blob",
      "path": "samples-workitems/{queueTrigger}-Copy",
      "connection": "MyStorageConnectionAppSetting",
      "direction": "out"
    }
  ],
  "disabled": false,
  "scriptFile": "__init__.py"
}
Here is the sample Python code.
import logging
import azure.functions as func

# The input binding field inputblob can be either 'bytes' or 'str', depending
# on the dataType in function.json: 'binary' or 'string'.
def main(queuemsg: func.QueueMessage, inputblob: bytes) -> bytes:
    logging.info(f'Python Queue trigger function processed {len(inputblob)} bytes')
    return inputblob
For complete information, see the Blob trigger documentation and the Blob trigger input binding documentation.
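To address the original question of reading a blob's schema on upload, the trigger payload can be inspected directly. Below is a minimal sketch that infers column names and rough types from a CSV blob; the helper is plain Python, and a hypothetical blob-triggered entry point that would call it is shown in comments (the function and binding names there are assumptions, not from the original post).

```python
import csv
import io

def infer_csv_schema(blob_bytes: bytes, sample_rows: int = 100) -> dict:
    """Guess column names and rough types from the first rows of a CSV blob."""
    def is_int(v):
        try:
            int(v)
            return True
        except (TypeError, ValueError):
            return False

    def is_float(v):
        try:
            float(v)
            return True
        except (TypeError, ValueError):
            return False

    reader = csv.DictReader(io.StringIO(blob_bytes.decode("utf-8-sig")))
    # Start optimistic ("int") and widen to "float" or "str" as values disagree.
    schema = {name: "int" for name in reader.fieldnames or []}
    for i, row in enumerate(reader):
        if i >= sample_rows:
            break
        for name, value in row.items():
            if schema[name] == "int" and not is_int(value):
                schema[name] = "float" if is_float(value) else "str"
            elif schema[name] == "float" and not is_float(value):
                schema[name] = "str"
    return schema

# Hypothetical blob-triggered entry point (needs the azure-functions package
# and a blobTrigger binding named "myblob" in function.json):
#
# import logging
# import azure.functions as func
#
# def main(myblob: func.InputStream):
#     schema = infer_csv_schema(myblob.read())
#     logging.info("Schema of %s: %s", myblob.name, schema)

print(infer_csv_schema(b"id,name,score\n1,alice,3.5\n2,bob,4\n"))
# {'id': 'int', 'name': 'str', 'score': 'float'}
```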

Related

What are the differences between the first and second object in the created azure function?

When I run the command
func new --name getTodo
I get the following code in my function.json file:
{
  "bindings": [
    {
      "authLevel": "function",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "methods": [
        "get"
      ]
    },
    {
      "type": "http",
      "direction": "out",
      "name": "res"
    }
  ]
}
So there are two objects in the array. What is the difference between the first and the second one? What do "direction": "in" and "direction": "out" mean?
One is the trigger and the other is an output binding; both are HTTP.
The trigger supplies input data from the HTTP request, and the output binding writes data to the HTTP response.

How can i query data from my azure cosmos db?

I have Azure Functions written in Node.js. I can't find a way to get data from, for example, my Azure Cosmos DB. I know there is an Azure Cosmos SDK, but I don't want to go that route; I want to learn to do it through Azure Functions bindings, because it is possible with them as well.
I tried this:
function.json
{
  "bindings": [
    {
      "authLevel": "anonymous",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "methods": [
        "get"
      ]
    },
    {
      "type": "http",
      "direction": "out",
      "name": "res"
    },
    {
      "type": "cosmosDB",
      "name": "inputDocument",
      "databaseName": "dbtodos",
      "collectionName": "items",
      "connectionStringSetting": "todos_DOCUMENTDB",
      "partitionKey": "/all",
      "direction": "in"
    }
  ],
  "disabled": false
}
index.js
module.exports = async function (context, req) {
    context.res = {
        // status: 200, /* Defaults to 200 */
        body: context.bindings.inputDocument
    };
};
After deploying, when I visit the automatically generated URL, I can't even open the link; no response comes back.
If I do a basic example where I don't try to pull data from the DB, the URL works after deploy.
How can I get the data?
My data in local.settings.json was wrong: I had the connection string for a different table, not the one I wanted to query. The code works perfectly fine.
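For anyone hitting the same issue: the connectionStringSetting name in function.json ("todos_DOCUMENTDB" above) must match a key under Values in local.settings.json locally, or an application setting on the deployed Function App. A sketch of the relevant local.settings.json, with placeholder values:

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "<storage-account-connection-string>",
    "FUNCTIONS_WORKER_RUNTIME": "node",
    "todos_DOCUMENTDB": "AccountEndpoint=https://<account>.documents.azure.com:443/;AccountKey=<key>;"
  }
}
```

If the name or value is wrong, the Cosmos DB input binding fails to bind and the function never responds, which matches the symptom described above.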

How to specify a VS Code binding to upload a file from /tmp to blob storage in an Azure Function

I have an Azure Function written in Python 3.8 that creates a text file in /tmp, and I want to upload that file to Blob Storage. I develop locally in VS Code and need a binding in function.json. The problem is that the binding wants me to specify a data item that represents the blob, and since I am creating the blob from scratch by uploading a text file to the container in the storage account, I do not have any such data item in my code. So I always get an error.
Specifically, I have a container named "swsw-2020" in my storage account. Here is the code I am using to upload the file from /tmp to that container.
try:
    from azure.storage.blob import BlobServiceClient, BlobClient  # noqa
    # Create the BlobServiceClient that is used to call the Blob service for the storage account
    blob_service_client = BlobServiceClient.from_connection_string(conn_str=connection_string)
    # Upload the output file, using destination for the blob name
    blob_client = blob_service_client.get_blob_client(
        container=container_name, blob=destination)
    with open(filename, "rb") as data:
        blob_client.upload_blob(data)
except Exception as err:
    log_error(f"Failed to upload '(unknown)' to Azure Blob Storage: {err}")
And here is my function.json snippet, which is obviously wrong but I have absolutely no idea how to make it right.
{
  "type": "blob",
  "direction": "out",
  "name": "data",
  "path": "swsw-2020",
  "connection": "AzureWebJobsStorage"
}
I am completely open to better ways to do this. I just want to get my TXT file in /tmp uploaded to a blob in the "swsw-2020" container in my Storage account. Thanks!
Update:
You can use a simple way to dynamically set the blob name.
This is the function.json:
{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "authLevel": "anonymous",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "route": "{test}",
      "methods": [
        "get",
        "post"
      ]
    },
    {
      "type": "http",
      "direction": "out",
      "name": "$return"
    },
    {
      "name": "outputblob",
      "type": "blob",
      "path": "test1/{test}.txt",
      "connection": "str",
      "direction": "out"
    }
  ]
}
Then, for example, if you want to create a blob named 1.txt, you can hit the function like this: http://localhost:7071/api/1
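To make the mechanism concrete: the Functions runtime substitutes the {test} route parameter into the blob path before writing. A rough illustration of that substitution in plain Python (expand_blob_path is a hypothetical helper for explanation only; the runtime does this internally):

```python
def expand_blob_path(path_template: str, route_params: dict) -> str:
    """Replace {name} tokens in a binding path with route parameter values."""
    for name, value in route_params.items():
        path_template = path_template.replace("{" + name + "}", value)
    return path_template

# GET http://localhost:7071/api/1 binds route parameter "test" to "1",
# so the output blob path "test1/{test}.txt" becomes:
print(expand_blob_path("test1/{test}.txt", {"test": "1"}))  # test1/1.txt
```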
Original Answer:
You can save the file to some folder before uploading it (otherwise you may run into access-permission problems).
It seems you are not using an output binding; you are just connecting to storage manually. The output binding should be used like this:
__init__.py
import logging
import azure.functions as func

def main(req: func.HttpRequest, outputblob: func.Out[func.InputStream]) -> func.HttpResponse:
    logging.info('Python HTTP trigger function processed a request.')
    testdata = 'this is test.'
    outputblob.set(testdata)
    name = req.params.get('name')
    return func.HttpResponse(f"This is output binding test, {name}")
function.json
{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "authLevel": "anonymous",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "methods": [
        "get",
        "post"
      ]
    },
    {
      "type": "http",
      "direction": "out",
      "name": "$return"
    },
    {
      "name": "outputblob",
      "type": "blob",
      "path": "test1/test.txt",
      "connection": "str",
      "direction": "out"
    }
  ]
}
Let me know if you have more doubts.

Azure function prints out the http error message

I have been trying to execute an Azure Function. The function does some calculation and returns a JSON response. If I just print out the JSON, it gets printed, but the log also contains the error below:
2020-06-05T05:03:35.256 [Error] Executed 'Functions.curvefitting' (Failed, Id=8b47fa39-746e-4153-9451-d18bb79ed4cd)
Unable to cast object of type 'System.Byte[]' to type 'Microsoft.AspNetCore.Http.HttpRequest'.
I have coded my main function as:
import azure.functions as af

def main(myblob: af.InputStream) -> str:
    json_response = <some calculations>
    return json_response
And here's my function.json file:
{
  "scriptFile": "xyz.py",
  "entryPoint": "main",
  "bindings": [
    {
      "authLevel": "function",
      "type": "blobTrigger",
      "direction": "in",
      "name": "myblob",
      "path": "xyz.xlsx",
      "connection": "AzureWebJobsStorage",
      "methods": [
        "get",
        "post"
      ]
    },
    {
      "type": "http",
      "direction": "out",
      "name": "$return"
    }
  ]
}
Please let me know what I am doing wrong here. I am new to azure functions.
Based on your JSON file, it looks like you started out with an HTTP-triggered function but changed it to a blob-triggered function. Your input binding defines a blobTrigger, but there are also methods in it (which are HTTP methods), and the output binding is an HTTP binding.
The most important question is: what are you trying to achieve? If this should be an HTTP-triggered function that uses a blob as input, define an HttpTrigger and an input binding for the blob.
This would be an example of an HttpTriggered Function with an input Blob binding:
{
  "scriptFile": "xyz.py",
  "entryPoint": "main",
  "bindings": [
    {
      "authLevel": "anonymous",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req"
    },
    {
      "name": "myblob",
      "type": "blob",
      "dataType": "binary",
      "path": "xyz.xlsx",
      "connection": "MyStorageConnectionAppSetting",
      "direction": "in"
    },
    {
      "type": "http",
      "direction": "out",
      "name": "res"
    }
  ]
}
Your main entry point would look something like
def main(req: func.HttpRequest, myblob: func.InputStream) -> func.HttpResponse:
For more information and examples, see Azure Blob storage input binding for Azure Functions and Azure Functions HTTP trigger.

Table storage RowKey, PartitionKey

Could someone please tell me whether it is possible to assign the same value to both PartitionKey and RowKey in an Azure Function App?
Many thanks in advance.
Based on your description, I created an HTTP trigger for Node.js to check this issue.
function.json:
{
  "bindings": [
    {
      "authLevel": "function",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req"
    },
    {
      "type": "http",
      "direction": "out",
      "name": "res"
    },
    {
      "type": "table",
      "name": "outputTable",
      "tableName": "emails",
      "connection": "AzureWebJobsDashboard",
      "direction": "out"
    }
  ],
  "disabled": false
}
index.js:
var date = Date.now();
var key = date + '-' + Math.ceil(Math.random() * 1000);
context.bindings.outputTable = {
    "partitionKey": key,
    "rowKey": key,
    "GPIOPin": 2,
    "status": true
};
I used Azure Storage Explorer to check my table.
For more details about the output sample for the Table storage binding, you can refer to the documentation.
It is possible. The design consequence is that each partition will contain exactly one entity. Remember that batch operations and transaction support are limited to entities in the same partition, so with one entity per partition you give up those benefits.
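A sketch of the same idea in Python, using a key scheme similar to the JavaScript above (make_entity_key is illustrative, and the entity dict mirrors what a table output binding would receive; note that bindings typically expect PartitionKey/RowKey casing):

```python
import random
import time

def make_entity_key() -> str:
    """Timestamp in milliseconds plus a random suffix, like the JS sample above."""
    return f"{int(time.time() * 1000)}-{random.randint(1, 1000)}"

key = make_entity_key()
entity = {
    "PartitionKey": key,  # using the same value for both keys is valid;
    "RowKey": key,        # each partition then holds exactly one entity
    "GPIOPin": 2,
    "status": True,
}
print(entity["PartitionKey"] == entity["RowKey"])  # True
```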
