Lambda AWS Rekognition to DynamoDB - Error - node.js

I am using this tutorial to link Rekognition results to a DynamoDB table.
It is giving me this error:
{
"errorMessage": "Unable to get object metadata from S3. Check object key, region and/or access permissions.",
"errorType": "InvalidS3ObjectException",
"stackTrace": [
"Request.extractError (/var/runtime/node_modules/aws-sdk/lib/protocol/json.js:48:27)",
"Request.callListeners (/var/runtime/node_modules/aws-sdk/lib/sequential_executor.js:105:20)",
"Request.emit (/var/runtime/node_modules/aws-sdk/lib/sequential_executor.js:77:10)",
"Request.emit (/var/runtime/node_modules/aws-sdk/lib/request.js:683:14)",
"Request.transition (/var/runtime/node_modules/aws-sdk/lib/request.js:22:10)",
"AcceptorStateMachine.runTo (/var/runtime/node_modules/aws-sdk/lib/state_machine.js:14:12)",
"/var/runtime/node_modules/aws-sdk/lib/state_machine.js:26:10",
"Request.<anonymous> (/var/runtime/node_modules/aws-sdk/lib/request.js:38:9)",
"Request.<anonymous> (/var/runtime/node_modules/aws-sdk/lib/request.js:685:12)",
"Request.callListeners (/var/runtime/node_modules/aws-sdk/lib/sequential_executor.js:115:18)"
]
}
The code I used from GitHub is this.
I made sure the region is the same for the Lambda function, the S3 bucket and the DynamoDB table.
I am a beginner at this, so any help will be appreciated!
Thanks!
Edit:
I made some modifications and now it is giving me this:
{
"errorMessage": "Requested resource not found",
"errorType": "ResourceNotFoundException",
"stackTrace": [
"Request.extractError (/var/runtime/node_modules/aws-sdk/lib/protocol/json.js:48:27)",
"Request.callListeners (/var/runtime/node_modules/aws-sdk/lib/sequential_executor.js:105:20)",
"Request.emit (/var/runtime/node_modules/aws-sdk/lib/sequential_executor.js:77:10)",
"Request.emit (/var/runtime/node_modules/aws-sdk/lib/request.js:683:14)",
"Request.transition (/var/runtime/node_modules/aws-sdk/lib/request.js:22:10)",
"AcceptorStateMachine.runTo (/var/runtime/node_modules/aws-sdk/lib/state_machine.js:14:12)",
"/var/runtime/node_modules/aws-sdk/lib/state_machine.js:26:10",
"Request.<anonymous> (/var/runtime/node_modules/aws-sdk/lib/request.js:38:9)",
"Request.<anonymous> (/var/runtime/node_modules/aws-sdk/lib/request.js:685:12)",
"Request.callListeners (/var/runtime/node_modules/aws-sdk/lib/sequential_executor.js:115:18)"
]
}

The fact that you're seeing ResourceNotFoundException suggests a couple of potential causes:
the Lambda function could not find the DynamoDB table: make sure that you modified config.js to include the name of the DynamoDB table correctly, by setting config.dynamo.tableName = '<your table>'
Rekognition could not read the image from S3: make sure that the image filename is of the form faces.jpg rather than test faces.jpg (the space gets escaped to test+faces.jpg in the S3 event); see the sketch below
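If your object keys can contain spaces or other special characters, a common extra step (a sketch, not part of the linked tutorial) is to URL-decode the key from the S3 event before handing it to Rekognition, since S3 event notifications encode spaces as '+':

// Sketch (aws-sdk v2): decode the object key from the S3 event before calling
// Rekognition. "test faces.jpg" arrives in the event as "test+faces.jpg".
const AWS = require('aws-sdk');
const rekognition = new AWS.Rekognition();

exports.handler = (event, context, callback) => {
  const record = event.Records[0].s3;
  const key = decodeURIComponent(record.object.key.replace(/\+/g, ' '));
  const params = {
    Image: { S3Object: { Bucket: record.bucket.name, Name: key } }
  };
  // detectLabels is only a stand-in; the tutorial's actual Rekognition call goes here.
  rekognition.detectLabels(params, callback);
};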

There are a couple of reasons why this could be happening:
1) The resource definitely does not exist. Triple-check the bucket name, the DynamoDB table name, the regions, etc.
2) It's very likely that your function lacks permissions. Check the IAM role that your Lambda function is using and attach the right policies to it. In this case, your function needs access to S3, DynamoDB and Rekognition. Make sure all of these policies are attached to the IAM role.
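As a rough sketch (the bucket, table and action names below are placeholders, and the exact actions depend on what the function does), the role's policy should contain statements along these lines:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::<your-bucket>/*"
    },
    {
      "Effect": "Allow",
      "Action": ["rekognition:DetectLabels", "rekognition:IndexFaces"],
      "Resource": "*"
    },
    {
      "Effect": "Allow",
      "Action": "dynamodb:PutItem",
      "Resource": "arn:aws:dynamodb:<region>:<account-id>:table/<your-table>"
    }
  ]
}

The function also needs the usual CloudWatch logging permissions, e.g. via the AWSLambdaBasicExecutionRole managed policy.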

Cannot get Azure ML Model Management API to work in GermanyWestCentral as in EastUS, for example

Considering the Azure ML API and how to manage models with it, I've got the following problem.
Here is the sequence that works:
Obtain a token (OK) -> POST https://login.microsoftonline.com/{{TenantId}}/oauth2/token
List Workspaces (OK) -> GET https://management.azure.com/subscriptions/{{SubscriptionId}}/providers/Microsoft.MachineLearningServices/workspaces?api-version=2018-03-01-preview
I've got several workspaces, some created in the EastUS region and some in GermanyWestCentral.
In the JSON returned from the previous API call, there is an attribute for each workspace called discoveryUrl, which is either 'https://germanywestcentral.api.azureml.ms/discovery' or 'https://eastus.api.azureml.ms/discovery'.
Invoking GET https://germanywestcentral.api.azureml.ms/discovery returns
{
"api": "https://germanywestcentral.api.azureml.ms",
"catalog": "https://catalog.cortanaanalytics.com",
"experimentation": "https://germanywestcentral.api.azureml.ms",
"gallery": "https://gallery.cortanaintelligence.com/project",
"history": "https://germanywestcentral.api.azureml.ms",
"hyperdrive": "https://germanywestcentral.api.azureml.ms",
"labeling": "https://germanywestcentral.api.azureml.ms",
"modelmanagement": "https://germanywestcentral.api.azureml.ms",
"pipelines": "https://germanywestcentral.aether.ms",
"studio": "https://ml.azure.com"
}
Invoking GET https://eastus.api.azureml.ms/discovery returns
{
"api": "https://eastus.api.azureml.ms",
"catalog": "https://catalog.cortanaanalytics.com",
"experimentation": "https://eastus.experiments.azureml.net",
"gallery": "https://gallery.cortanaintelligence.com/project",
"history": "https://eastus.experiments.azureml.net",
"hyperdrive": "https://eastus.experiments.azureml.net",
"labeling": "https://eastus.experiments.azureml.net",
"modelmanagement": "https://eastus.modelmanagement.azureml.net",
"pipelines": "https://eastus.aether.ms",
"studio": "https://ml.azure.com"
}
The modelmanagement URLs do not have the same structure in the two regions:
"modelmanagement": "https://germanywestcentral.api.azureml.ms"
versus "modelmanagement": "https://eastus.modelmanagement.azureml.net"
(well, fine, that should not be an issue by itself)
Now invoking GET https://eastus.modelmanagement.azureml.net/api/subscriptions/{{SubscriptionId}}/resourceGroups/{{resourceGroupName}}/providers/Microsoft.MachineLearningServices/workspaces/{{workspaceName}}/services?api-version=2018-03-01-preview&count=100
does return data describing the services available under the workspace.
But the problem is that invoking:
GET https://germanywestcentral.api.azureml.ms/api/subscriptions/{{SubscriptionId}}/resourceGroups/{{resourceGroupName}}/providers/Microsoft.MachineLearningServices/workspaces/{{workspaceName}}/services?api-version=2018-03-01-preview&count=100 returns a 530 ERROR with "unknown to the cluster" in the body of the response
Any ideas or hints on why this happens and how to get around it?
As far as I know, error 530 is a kind of authentication error. Please check the input variables you used for the failing URL below:
https://germanywestcentral.api.azureml.ms/api/subscriptions/{{SubscriptionId}}/resourceGroups/{{resourceGroupName}}/providers/Microsoft.MachineLearningServices/workspaces/{{workspaceName}}/services?api-version=2018-03-01-preview&count=100
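Independently of that, one thing worth trying (a sketch in Python, built only from the endpoints shown above; the <...> placeholders are yours to fill) is to construct the services URL from the modelmanagement entry of each workspace's discovery document, rather than assuming one region's URL structure:

import requests

# Sketch: resolve the model-management endpoint per workspace from its
# discoveryUrl instead of hard-coding an eastus-style host.
headers = {"Authorization": "Bearer <token>"}

discovery = requests.get("<discoveryUrl of the workspace>").json()
base = discovery["modelmanagement"]  # differs per region, as shown above

url = (
    f"{base}/api/subscriptions/<SubscriptionId>/resourceGroups/<resourceGroupName>"
    "/providers/Microsoft.MachineLearningServices/workspaces/<workspaceName>/services"
)
resp = requests.get(url, headers=headers,
                    params={"api-version": "2018-03-01-preview", "count": 100})
print(resp.status_code, resp.text)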

Postgres error 23505: Key (extname)=(pgcrypto) already exists

I am using graphile-worker to schedule jobs, storing the information in Postgres. Whenever I create a new instance of Postgres and, just after that, create a new DB, I keep getting this error:
{
"length": 229,
"name": "error",
"severity": "ERROR",
"code": "23505",
"detail": "Key (extname)=(pgcrypto) already exists.",
"schema": "pg_catalog",
"table": "pg_extension",
"constraint": "pg_extension_name_index",
"file": "nbtinsert.c",
"line": "656",
"routine": "_bt_check_unique",
"level": "error",
"message": "========> duplicate key value violates unique constraint \"pg_extension_name_index\"",
"stack": "error: duplicate key value violates unique constraint \"pg_extension_name_index\"\n
at Parser.parseErrorMessage (/home/baqir/WebstormProjects/nektar/node_modules/pg-protocol/dist/parser.js:287:98)\n
at Parser.handlePacket (/home/baqir/WebstormProjects/nektar/node_modules/pg-protocol/dist/parser.js:126:29)\n
at Parser.parse (/home/baqir/WebstormProjects/nektar/node_modules/pg-protocol/dist/parser.js:39:38)\n
at Socket.<anonymous> (/home/baqir/WebstormProjects/nektar/node_modules/pg-protocol/dist/index.js:11:42)\n
at Socket.emit (node:events:390:28)\n
at addChunk (node:internal/streams/readable:315:12)\n
at readableAddChunk (node:internal/streams/readable:289:9)\n
at Socket.Readable.push (node:internal/streams/readable:228:10)\n
at TCP.onStreamRead (node:internal/stream_base_commons:199:23)"
}
I have tried different versions of Postgres using Docker, and even tried to create an instance on AWS. It's always the same error.
My teammates, however, just create the DB and start the workers, and graphile does the migrations by itself. I do not understand what is wrong here.
You get this error when two sessions try to create the same extension at exactly the same time. (If they aren't at the same time, then you get a friendlier error message, or no error at all if the create was done with "IF NOT EXISTS".)
So apparently you are launching two things that run the same (or overlapping) migrations at the same time, while your colleagues are not.
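One way to avoid the race (a sketch using the plain pg client; prepareDatabase and the connection string are placeholders) is to create the extension once, up front, before launching several workers in parallel; graphile-worker applies its migrations when a worker starts, so concurrent first starts can collide:

// Sketch: create pgcrypto once so workers that start in parallel don't race
// on the CREATE EXTENSION inside graphile-worker's first migration.
const { Client } = require('pg');

async function prepareDatabase(connectionString) {
  const client = new Client({ connectionString });
  await client.connect();
  // A no-op when the extension is already installed.
  await client.query('CREATE EXTENSION IF NOT EXISTS pgcrypto;');
  await client.end();
}

Alternatively, start a single worker first and let it finish the migrations before scaling out.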

Creating Azure Recovery Service Vault - Python SDK - bad request

I am trying to create a Recovery Service Vault in Azure using Python SDK.
Package version: azure-mgmt-recoveryservices==2.0.0
Code snippet:
client = RecoveryServicesClient(client_secret_credential, subscription_id)
client.vaults.begin_create_or_update(
    resource_group_name="my-custom-rg",
    vault_name="name_of_the_vault",
    vault={
        "location": "centralus",
        "sku": {
            "name": "Standard",
        },
        "identity": {
            "type": "SystemAssigned",
        },
    },
)
I got the following error:
File "/<my-computer-path>/azure/lib/python3.8/site-packages/azure/mgmt/recoveryservices/operations/_vaults_operations.py", line 293, in _create_or_update_initial
raise HttpResponseError(response=response, error_format=ARMErrorFormat)
azure.core.exceptions.HttpResponseError: Operation returned an invalid status 'Bad Request'
What am I doing wrong? How could I do more "error investigation"?
Thank you Robert. Posting your suggestion as an answer to help other community members.
This error azure.core.exceptions.HttpResponseError: Operation returned an invalid status 'Bad Request' is caused by missing backup_policy_resource properties.
# BaseBackupPolicyResource is imported from azure.mgmt.dataprotection.models in
# the linked example; adjust the import to whichever backup SDK you are using.
from azure.mgmt.dataprotection.models import BaseBackupPolicyResource

policy_resource = BaseBackupPolicyResource(properties=backup_policy)
poller = client.backup_policies.create_or_update(
    vault_name="my-vault-name",
    resource_group_name="my-resource-group",
    backup_policy_name="my-daily-backup",
    parameters=policy_resource,
)
You can refer to How to create disk Backup Policy using azure python SDK?
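As for doing more "error investigation": a small sketch (assuming the same client as in the question; vault_definition stands for the dict shown there) is to catch azure.core.exceptions.HttpResponseError and print the response body, which usually carries ARM's concrete validation message for the Bad Request:

from azure.core.exceptions import HttpResponseError

try:
    poller = client.vaults.begin_create_or_update(
        resource_group_name="my-custom-rg",
        vault_name="name_of_the_vault",
        vault=vault_definition,  # the dict from the question
    )
    vault = poller.result()
except HttpResponseError as e:
    # The ARM error payload explains why the request was rejected.
    print(e.status_code)
    print(e.message)
    print(e.response.text())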

Azure Data Factory - Copy files to SFTP resolving destination from foreach item

Another Azure Data Factory question.
I'm trying to use a 'Copy Data' activity within a ForEach, setting the destination sink to an item of the foreach.
My setup is as follows:
Lookup activity to read a json file.
The format of the json file:
{
  "OutputFolders": [
    {
      "Source": "aaa/bb1/Output",
      "Destination": "Dest002/bin"
    },
    {
      "Source": "aaa/bbb2/Output",
      "Destination": "Dest002/bin"
    },
    {
      "Source": "aaa/bb3/Output",
      "Destination": "Dest002/bin"
    }
  ]
}
Foreach activity with items set to #activity('Read json config').output.value[0].OutputFolders
Within the foreach activity a 'Copy Data' activity
This Sink has the following Sink dataset:
When I run this pipeline, however, I get the following error message:
{
"errorCode": "2200",
"message": "Failure happened on 'Sink' side. ErrorCode=SftpPermissionDenied,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Permission denied to access '/#item().Destination'.,Source=Microsoft.DataTransfer.ClientLibrary.SftpConnector,''Type=Renci.SshNet.Common.SftpPermissionDeniedException,Message=Permission denied,Source=Renci.SshNet,'",
"failureType": "UserError",
"target": "Copy output files",
"details": []
}
So Message=Permission denied to access '/#item().Destination' seems to indicate that the destination folder is not resolved. Since this folder does not exist, I get an SftpPermissionDenied.
I used the same method to copy files to a file share and there it seemed to work.
Does somebody have an idea how to make this destination resolve correctly?
What you would usually do in this type of situation is create a Parameter on the Dataset, which you then reference in the file path you are trying to construct.
This way, you can pass your '#item().Destination' to this Parameter in your Copy Activity, as the parameter will appear on the Dataset in the pipeline.
There is also an example here: https://www.mssqltips.com/sqlservertip/6187/azure-data-factory-foreach-activity-example/
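A minimal sketch of such a parameterised SFTP dataset (the names and the Binary/SftpLocation types here are assumptions; the point is the Destination parameter referenced through @dataset()):

{
  "name": "SftpSinkDataset",
  "properties": {
    "linkedServiceName": {
      "referenceName": "MySftpLinkedService",
      "type": "LinkedServiceReference"
    },
    "parameters": {
      "Destination": { "type": "string" }
    },
    "type": "Binary",
    "typeProperties": {
      "location": {
        "type": "SftpLocation",
        "folderPath": {
          "value": "@dataset().Destination",
          "type": "Expression"
        }
      }
    }
  }
}

In the Copy activity's sink you then set the dataset's Destination parameter to @item().Destination, so it is resolved per iteration of the ForEach.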
OK, I tried some more, and apparently it works if I use a concat function.
So #concat(item().Destination)
I do get a warning that 'item' is not a recognized function, but it does the trick.
Not very straightforward, and I wonder why the initial approach doesn't work.

Azure Graph API - ClaimsMappingPolicy with ClaimsTransformation

I'm trying to automate the configuration of an enterprise application via the Azure Graph API.
Specifically, it's the Azure Palo Alto Admin UI - https://learn.microsoft.com/en-us/azure/active-directory/saas-apps/paloaltoadmin-tutorial#configure-azure-ad-sso
I've managed to get this working via the frontend, but I'm having trouble configuring the custom claims via the Graph API.
For now, I just want to use a string claim in the custom claim as the customadmin value, with a hardcoded value for the admin role.
When creating via the portal, you can easily enter a string value as the source type of the claim.
However, via the Graph API, the source type must be user, resource, audience, company or transformation.
https://learn.microsoft.com/en-us/azure/active-directory/develop/active-directory-claims-mapping#claim-schema-entry-elements
It seems that you can create a string type of transformation and then link the transformation into the main ClaimsSchema.
There is a similar example documented here: https://learn.microsoft.com/en-us/graph/api/resources/claimsmappingpolicy?view=graph-rest-1.0#example-definition-that-uses-a-claims-transformation
But I cannot get the example to work. Even with a bit of massaging, it fails. This is what I've been trying:
cat <<- EOF > claims.json
{
"definition": [
"{\"ClaimsMappingPolicy\":{
\"Version\":1,
\"IncludeBasicClaimSet\":\"true\",
\"ClaimsSchema\":[
{\"Source\":\"user\",\"ID\":\"extensionattribute1\"},{\"Source\":\"transformation\",\"ID\":\"DataJoin\",\"TransformationId\":\"JoinTheData\",\"JwtClaimType\":\"JoinedData\"}
],
\"ClaimsTransformation\":[
{\"ID\":\"JoinTheData\",\"TransformationMethod\":\"Join\",\"InputClaims\":[{\"ClaimTypeReferenceId\":\"extensionattribute1\",\"TransformationClaimType\":\"string1\"}], \"InputParameters\": [{\"ID\":\"string2\",\"Value\":\"sandbox\"},{\"ID\":\"separator\",\"Value\":\".\"}],\"OutputClaims\":[{\"ClaimTypeReferenceId\":\"DataJoin\",\"TransformationClaimType\":\"outputClaim\"}]}
]
}}"
],
"displayName": "Azure Reference Claim",
"isOrganizationDefault": false
}
EOF
az rest --method post --headers Content-type="application/json" --url "https://graph.microsoft.com/v1.0/policies/claimsMappingPolicies" --body #claims.json
I've tried both the v1.0 and beta APIs, but they both have the same behaviour.
The request returns the following error:
Bad Request({
"error": {
"code": "Request_BadRequest",
"message": "Property has an invalid value.",
"innerError": {
"date": "2020-09-01T13:03:10",
"request-id": "bc7cf58e-fe6d-47d1-b1e5-cae43326864f"
}
}
})
I was able to get the rest of the Palo Alto claim working (excluding the custom string) with the following:
{
"definition": [
"{\"ClaimsMappingPolicy\":{
\"Version\":1,
\"IncludeBasicClaimSet\":\"true\",
\"ClaimsSchema\": [{\"Source\":\"user\",\"ID\":\"userprincipalname\",\"SamlClaimType\":\"http://schemas.xmlsoap.org/ws/2005/05/identity/claims/nameidentifier\"},{\"Source\":\"user\",\"ID\":\"givenname\",\"SamlClaimType\":\"http://schemas.xmlsoap.org/ws/2005/05/identity/claims/givenname\"},{\"Source\":\"user\",\"ID\":\"displayname\",\"SamlClaimType\":\"http://schemas.xmlsoap.org/ws/2005/05/identity/claims/name\"},{\"Source\":\"user\",\"ID\":\"surname\",\"SamlClaimType\":\"http://schemas.xmlsoap.org/ws/2005/05/identity/claims/surname\"},{\"Source\":\"user\",\"ID\":\"userprincipalname\",\"SamlClaimType\":\"username\"}]
}}"
],
"displayName": "Palo Alto Claims Policy",
"isOrganizationDefault": false
}
And I was able to create a CustomString transformation, which isn't linked to anything, with the following:
{
"definition": [
"{\"ClaimsMappingPolicy\":{
\"Version\":1,
\"IncludeBasicClaimSet\":\"true\",
\"ClaimsTransformation\":[{\"ID\":\"CreateTermsOfService\",\"TransformationMethod\":\"CreateStringClaim\",\"InputParameters\": [{\"ID\":\"value\",\"DataType\":\"string\", \"Value\":\"sandbox\"}],\"OutputClaims\":[{\"ClaimTypeReferenceId\":\"TOS\",\"TransformationClaimType\":\"createdClaim\"}]}]
}}",
],
"displayName": "sdfa",
"isOrganizationDefault": false
}
However, when I try them together, in the format of the example, I get an error.
{
"definition": [
"{\"ClaimsMappingPolicy\":{
\"Version\":1,
\"IncludeBasicClaimSet\":\"true\",
\"ClaimsSchema\": [
{\"Source\":\"user\",\"ID\":\"userprincipalname\",\"SamlClaimType\":\"http://schemas.xmlsoap.org/ws/2005/05/identity/claims/nameidentifier\"},{\"Source\":\"user\",\"ID\":\"givenname\",\"SamlClaimType\":\"http://schemas.xmlsoap.org/ws/2005/05/identity/claims/givenname\"},{\"Source\":\"user\",\"ID\":\"displayname\",\"SamlClaimType\":\"http://schemas.xmlsoap.org/ws/2005/05/identity/claims/name\"},{\"Source\":\"user\",\"ID\":\"surname\",\"SamlClaimType\":\"http://schemas.xmlsoap.org/ws/2005/05/identity/claims/surname\"},{\"Source\":\"user\",\"ID\":\"userprincipalname\",\"SamlClaimType\":\"username\"},{\"Source\":\"transformation\",\"TransformationID\":\"xxxxxxxxx\",\"ID\":\"DataJoin\",\"SamlClaimType\":\"test\"}
],
\"ClaimsTransformation\":[
{\"ID\":\"xxxxxxxxx\",\"TransformationMethod\":\"CreateStringClaim\",\"InputParameters\": [{\"ID\":\"value\",\"DataType\":\"string\", \"Value\":\"sandbox\"}],\"OutputClaims\":[{\"ClaimTypeReferenceId\":\"DataJoin\",\"TransformationClaimType\":\"createdClaim\"}]}
]
}}"
],
"displayName": "Palo Alto Claims Policy",
"isOrganizationDefault": false
}
Which returns the same unhelpful error:
Bad Request({
"error": {
"code": "Request_BadRequest",
"message": "Property has an invalid value.",
"innerError": {
"date": "2020-09-01T13:03:10",
"request-id": "bc7cf58e-fe6d-47d1-b1e5-cae43326864f"
}
}
})
Any ideas what I am doing wrong? I'm trying to base this off of the example, which I can't get working.
I do not want to use PowerShell; I want to be able to automate from my desktop terminal.
I imagine I can avoid this situation and get the PA to integrate with AAD without a hardcoded value, but I feel that I should be able to get this working this way.
The mandatory encoding of the ClaimsMappingPolicy object makes it quite fiddly to develop, so it's possible there is a problem there somewhere.
I've also tried creating just the ClaimsSchema without the ClaimsTransformation and then running a PATCH to amend the object with the transformation, but it just overwrites the whole ClaimsMappingPolicy object rather than adding the extra field.
When I remove the transformation source from the ClaimsSchema the request succeeds.
cat <<- EOF > claims.json
{
"definition": [
"{\"ClaimsMappingPolicy\":{
\"Version\":1,
\"IncludeBasicClaimSet\":\"true\",
\"ClaimsSchema\": [
{\"Source\":\"user\",\"ID\":\"userprincipalname\",\"SamlClaimType\":\"http://schemas.xmlsoap.org/ws/2005/05/identity/claims/nameidentifier\"},{\"Source\":\"user\",\"ID\":\"givenname\",\"SamlClaimType\":\"http://schemas.xmlsoap.org/ws/2005/05/identity/claims/givenname\"},{\"Source\":\"user\",\"ID\":\"displayname\",\"SamlClaimType\":\"http://schemas.xmlsoap.org/ws/2005/05/identity/claims/name\"},{\"Source\":\"user\",\"ID\":\"surname\",\"SamlClaimType\":\"http://schemas.xmlsoap.org/ws/2005/05/identity/claims/surname\"},{\"Source\":\"user\",\"ID\":\"userprincipalname\",\"SamlClaimType\":\"username\"}
],
\"ClaimsTransformation\":[
{\"ID\":\"xxxxxxxxx\",\"TransformationMethod\":\"CreateStringClaim\",\"InputParameters\": [{\"ID\":\"value\",\"DataType\":\"string\", \"Value\":\"sandbox\"}],\"OutputClaims\":[{\"ClaimTypeReferenceId\":\"DataJoin\",\"TransformationClaimType\":\"createdClaim\"}]}
]
}}"
],
"displayName": "Palo Alto Claims Policy",
"isOrganizationDefault": false
}
EOF
But there isn't an association between the ClaimsSchema and the ClaimsTransformation. This hints at a problem with the ClaimsSchema object
{\"Source\":\"transformation\",\"TransformationID\":\"xxxxxxxxx\",\"ID\":\"DataJoin\",\"SamlClaimType\":\"test\"}
But this looks suitable when looking at the documentation and the (possibly broken) reference example.
Providing this information as an answer, as it is too long for a comment. Please try the queries below in Graph Explorer. (Note that in the first, working definition the transformations array is keyed ClaimsTransformations, plural, whereas the failing attempts above use ClaimsTransformation.)
Post https://graph.microsoft.com/beta/policies/claimsMappingPolicies
{"definition":["{\"ClaimsMappingPolicy\":{\"Version\":1,\"IncludeBasicClaimSet\":\"true\", \"ClaimsSchema\":[{\"Source\":\"user\",\"ID\":\"extensionattribute1\"},{\"Source\":\"transformation\",\"ID\":\"DataJoin\",\"TransformationId\":\"JoinTheData\",\"JwtClaimType\":\"JoinedData\"}],\"ClaimsTransformations\":[{\"ID\":\"JoinTheData\",\"TransformationMethod\":\"Join\",\"InputClaims\":[{\"ClaimTypeReferenceId\":\"extensionattribute1\",\"TransformationClaimType\":\"string1\"}], \"InputParameters\": [{\"ID\":\"string2\",\"Value\":\"sandbox\"},{\"ID\":\"separator\",\"Value\":\".\"}],\"OutputClaims\":[{\"ClaimTypeReferenceId\":\"DataJoin\",\"TransformationClaimType\":\"outputClaim\"}]}]}}"],"displayName":"TestclaimsPolicy","isOrganizationDefault":false}
Post https://graph.microsoft.com/beta/policies/claimsMappingPolicies
{"definition":["{\"ClaimsMappingPolicy\":{\"Version\":1,\"IncludeBasicClaimSet\":\"true\",\"ClaimsSchema\": [{\"Source\":\"user\",\"ID\":\"userprincipalname\",\"SamlClaimType\":\"http://schemas.xmlsoap.org/ws/2005/05/identity/claims/nameidentifier\"},{\"Source\":\"user\",\"ID\":\"givenname\",\"SamlClaimType\":\"http://schemas.xmlsoap.org/ws/2005/05/identity/claims/givenname\"},{\"Source\":\"user\",\"ID\":\"displayname\",\"SamlClaimType\":\"http://schemas.xmlsoap.org/ws/2005/05/identity/claims/name\"},{\"Source\":\"user\",\"ID\":\"surname\",\"SamlClaimType\":\"http://schemas.xmlsoap.org/ws/2005/05/identity/claims/surname\"},{\"Source\":\"user\",\"ID\":\"userprincipalname\",\"SamlClaimType\":\"username\"}],\"ClaimsTransformation\":[{\"ID\":\"CreateTermsOfService\",\"TransformationMethod\":\"CreateStringClaim\",\"InputParameters\": [{\"ID\":\"value\",\"DataType\":\"string\", \"Value\":\"sandbox\"}],\"OutputClaims\":[{\"ClaimTypeReferenceId\":\"TOS\",\"TransformationClaimType\":\"createdClaim\"}]}]}}"],"displayName":"Test1234","isOrganizationDefault":false}
For more information on CreateTermsOfService, please refer to this document.
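For readability, the escaped definition string in the first query above decodes to the following (same content, pretty-printed):

{
  "ClaimsMappingPolicy": {
    "Version": 1,
    "IncludeBasicClaimSet": "true",
    "ClaimsSchema": [
      { "Source": "user", "ID": "extensionattribute1" },
      { "Source": "transformation", "ID": "DataJoin", "TransformationId": "JoinTheData", "JwtClaimType": "JoinedData" }
    ],
    "ClaimsTransformations": [
      {
        "ID": "JoinTheData",
        "TransformationMethod": "Join",
        "InputClaims": [
          { "ClaimTypeReferenceId": "extensionattribute1", "TransformationClaimType": "string1" }
        ],
        "InputParameters": [
          { "ID": "string2", "Value": "sandbox" },
          { "ID": "separator", "Value": "." }
        ],
        "OutputClaims": [
          { "ClaimTypeReferenceId": "DataJoin", "TransformationClaimType": "outputClaim" }
        ]
      }
    ]
  }
}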
