RestKit error 1001 "Could not find an object mapping for keyPath: ''" - iOS 4

Thanks in advance for all your help, community! I had an earlier problem that was pointed out to me, and I fixed it ([Person object] should be [[Person alloc] init]); now I am able to add a Person object to my server. But here is the problem: all the values on the server are null and it throws back a 1001 error: Code=1001 "Could not find an object mapping for keyPath: ''" UserInfo=0x5938ce0 {=RKObjectMapperKeyPath, NSLocalizedDescription=Could not find an object mapping for keyPath: ''}
What am I mapping wrong? Any ideas? Newbie here and grateful for any help provided!
Oh yeah, I am willing to pay for lessons on this matter if you're in SoCal. Thanks again!
RKObjectMapping* userMapping = [RKObjectMapping mappingForClass:[Person class]];
[userMapping mapKeyPath:@"updated_at" toAttribute:@"updatedAt"];
[userMapping mapKeyPath:@"created_at" toAttribute:@"createdAt"];
[userMapping mapKeyPath:@"name" toAttribute:@"name"];
[userMapping mapKeyPath:@"id" toAttribute:@"personId"];

RKObjectMapping* dogMapping = [RKObjectMapping mappingForClass:[Dog class]];
[dogMapping mapKeyPath:@"created_at" toAttribute:@"createdAt"];
[dogMapping mapKeyPath:@"person_id" toAttribute:@"spersonId"];
[dogMapping mapKeyPath:@"name" toAttribute:@"name"];
[dogMapping mapKeyPath:@"updated_at" toAttribute:@"updatedAt"];
[dogMapping mapKeyPath:@"id" toAttribute:@"dogId"];

RKObjectMapping *dataMapping = [RKObjectMapping mappingForClass:[Data class]];
[dataMapping mapKeyPath:@"dog" toAttribute:@"dogs"];
[dataMapping mapKeyPath:@"person" toRelationship:@"person" withMapping:userMapping];
[[RKObjectManager sharedManager].mappingProvider addObjectMapping:dataMapping];
[[RKObjectManager sharedManager].mappingProvider setMapping:userMapping forKeyPath:@"people"];

RKObjectRouter *router = [RKObjectManager sharedManager].router;
[router routeClass:[Person class] toResourcePath:@"/people/:personId"];
[router routeClass:[Person class] toResourcePath:@"/people" forMethod:RKRequestMethodPOST];

RKObjectMapping *personSerializationMapping = [RKObjectMapping mappingForClass:[NSMutableDictionary class]];
[personSerializationMapping mapAttributes:@"name", nil];
[[RKObjectManager sharedManager].mappingProvider setSerializationMapping:personSerializationMapping forClass:[Person class]];

Person* daveLiu = [[[Person alloc] init] autorelease];
daveLiu.name = @"dave";
[[RKObjectManager sharedManager] postObject:daveLiu delegate:self];

All of my objects from my server are "anonymous" and I had this same problem with RestKit not being able to find the correct mapping. The way I fixed it was to use the objectManager methods that let me pass the mapping explicitly:
[objectManager postObject:object mapResponseWith:[object objectMapping] delegate:self];
[objectManager loadObjectsAtResourcePath:path objectMapping:mapping delegate:self];
So now my contracts, which all come from the server like this:
{
"Id": "12345678-1234-1234-1234-1234567890ab",
"Type": "SURVEY",
"Code": "1234",
"Description": "blah blah blah",
"Message": "blah blah blah",
"Url": "http://nowhere.com/",
"NextAttemptHours": 4
}
can be mapped into the correct object. It's a bit of extra work to keep your object mappings available, but it does the job. I did it by implementing a class method on my objects that returns the appropriate objectMapping, which keeps things very clean and understandable.


Attempt to create simple Contact gives error: PATCH requests require components to be updated

I am currently using SDK version 3.39.0 and version 0004 of the API_MKT_CONTACT service definition to create a new Contact in Marketing Cloud with the following code:
ContactOriginData contact =
ContactOriginData.builder()
.originOfContact(origin)
.originTimestamp(ZonedDateTime.now())
.externalContactID(pii.getId().toString())
.firstName(pii.getFirstName())
.lastName(pii.getLastName())
.language(pii.getLanguage())
.countryReg(pii.getRegion())
.build();
// use low level API as a work around for https://github.com/SAP/cloud-sdk/issues/156
ODataRequestUpdate contactRequest = service
.updateContactOriginData(contact)
.withHeader("Sap-Cuan-RequestTimestamp", getFormattedTime(System.currentTimeMillis()))
.withHeader("Sap-Cuan-SequenceId", "UpdatePatch")
.withHeader("Sap-Cuan-SourceSystemType", "EXT")
.withHeader("Sap-Cuan-SourceSystemId", "sdk-test")
.toRequest();
String servicePath = "/sap/opu/odata/SAP/API_MKT_CONTACT_SRV;v=0004";
ODataRequestBatch requestBatch = new ODataRequestBatch(servicePath, ODataProtocol.V2);
requestBatch.beginChangeset().addUpdate(contactRequest).endChangeset();
HttpClient httpClient = HttpClientAccessor.getHttpClient(destination);
ODataRequestResultMultipartGeneric batchResult = requestBatch.execute(httpClient);
Running this produces the following error:
{
"error": {
"code": "/IWFND/CM_MGW/096",
"message": {
"lang": "en",
"value": "PATCH requests require components to be updated"
},
"innererror": {
"application": {
"component_id": "CEC-MKT-DM-IC",
"service_namespace": "/SAP/",
"service_id": "API_MKT_CONTACT_SRV",
"service_version": "0004"
},
"transactionid": "3B63A2A6CC920630E0060492A51E7EE7",
"timestamp": "20210310210334.4378960",
"Error_Resolution": {
"SAP_Transaction": "For backend administrators: use ADT feed reader \"SAP Gateway Error Log\" or run transaction /IWFND/ERROR_LOG on SAP Gateway hub system and search for entries with the timestamp above for more details",
"SAP_Note": "See SAP Note 1797736 for error analysis (https://service.sap.com/sap/support/notes/1797736)",
"Batch_SAP_Note": "See SAP Note 1869434 for details about working with $batch (https://service.sap.com/sap/support/notes/1869434)"
},
"errordetails": []
}
}
}
However, if I execute a similar request in Postman, it works without issue:
Request Payload:
--batch
Content-Type: multipart/mixed; boundary=changeset
--changeset
Content-Type: application/http
Content-Transfer-Encoding: binary
PATCH ContactOriginData(ContactOrigin='<ContactOrigin>',ContactID='24D8F7F6-440D-44F8-A24B-552435477688') HTTP/1.1
Accept: application/json
Content-Type: application/json
Content-Length: 172
Sap-Cuan-RequestTimestamp: '2021-03-10T14:07:00.000'
Sap-Cuan-SequenceId: UpdatePatch
Sap-Cuan-SourceSystemType: EXT
Sap-Cuan-SourceSystemId: postman-test
{"OriginDataLastChgUTCDateTime":"/Date(1615410479885)/","EmailAddress":"samantha.cook#theoasis.com","FirstName":"Samantha","LastName":"Cook","Country":"US","Language":"EN"}
--changeset--
--batch--
Response Payload:
--1D7E85E6BC66B34E61ACF0EF3964CBD90
Content-Type: multipart/mixed; boundary=1D7E85E6BC66B34E61ACF0EF3964CBD91
Content-Length: 430
--1D7E85E6BC66B34E61ACF0EF3964CBD91
Content-Type: application/http
Content-Length: 262
content-transfer-encoding: binary
HTTP/1.1 204 No Content
Content-Length: 0
dataserviceversion: 2.0
sap-message: {"code":"HPA_STAGING_AREA/037","message":"Payload is processed via staging area. See Import Monitor for details.","target":"","severity":"info","transition":false,"details":[]}
--1D7E85E6BC66B34E61ACF0EF3964CBD91--
--1D7E85E6BC66B34E61ACF0EF3964CBD90--
I should note that I have also tried using .replacingEntity(), which doesn't work either and produces a completely different error:
Inline component is not defined or not allowed (HTTP PUT)
Is there something with the SDK that I am missing or not using correctly?
Any help would be appreciated!
Cheers!
To update an entity, you should get it from the service first. That is regardless of whether you are using:
PATCH, which will update only the changed fields,
or PUT, which will send the full entity object.
Currently you are creating a new entity object via the builder: ContactOriginData.builder(). Instead, please use the corresponding getContactOriginDataByKey() method of your service to first retrieve the entity you want to update. Many services will actually force you to do this to ensure you are always editing the latest version of your data. This often happens via ETags, which the SDK also handles for you automatically.
You can find more information about the update strategies of the SDK in the documentation.
Edit:
As you pointed out in the comments, the actual goal is to create an entity, and the specific service in question only allows PUT and PATCH to create objects.
In that case, using replacingEntity() (which translates to PUT) should already work with your code. You can make PATCH work as well by replacing the builder approach with a constructor call plus setters.

TypeORM RepositoryNotFoundError when searching for entities using the class in Jest

I am having an issue that is difficult to debug. I upgraded all my project dependencies and suddenly all my tests (Jest 25.5.4 or 26.x) started to fail with "RepositoryNotFoundError".
The strange behavior is that all the entities are loaded in the metadata storage:
import { Connection, createConnection, getMetadataArgsStorage, getRepository } from 'typeorm';
let connection = await createConnection(); //<- The connection is created according to my config
console.log(getMetadataArgsStorage()); //<- All the entities are here
console.log(getRepository('User')); //<- This works
console.log(getRepository(User)); //<- But this will raise the error
After some time debugging, I noticed the error is at https://github.com/typeorm/typeorm/blob/0.2.24/src/connection/Connection.ts#L482 and I created a repository for you to replicate the issue.
The comparison (metadata.target === target) always returns false. The targets refer to the same class, but they are somewhat different: calling toString() on them returns two different versions of the class, one with comments stripped and one with comments intact (I am using removeComments: true in my tsc config):
const targetMetadata = connection.entityMetadatas[7].target; // 7 is the index in my debug, it can be anything else in the array
console.log(targetMetadata === User); // prints false
I still have not figured out what caused the issue after the upgrade. Unfortunately I cannot share the code of the project, but I can give more information if you need. Could you help me figure out what the problem is?
My jest config (in package.json):
"jest": {
"moduleFileExtensions": [
"js",
"json",
"ts"
],
"rootDir": "src",
"testRegex": ".spec.ts$",
"transform": {
"^.+\\.(t|j)s$": "ts-jest"
},
"coverageDirectory": "../coverage",
"testEnvironment": "node"
}
I still haven't tried tinkering with the example repository you provided, but I see that the paths in your TypeORM configuration files start with dist/. If you're using ts-jest and have also set src as your rootDir, I think that could be the cause of your trouble.
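For example, a minimal sketch of an ormconfig.js that switches the entity paths for tests; the environment check, file layout, and database settings here are assumptions for illustration, not taken from your repository:
// ormconfig.js - point TypeORM at the TypeScript sources when running under ts-jest,
// so the entity classes the connection registers are the same objects the tests import from src/.
const isTest = process.env.NODE_ENV === 'test'; // assumed switch; adjust to your setup

module.exports = {
  type: 'postgres',                // assumed database type
  host: 'localhost',
  // ... other connection options ...
  entities: isTest ? ['src/**/*.entity.ts'] : ['dist/**/*.entity.js'],
  migrations: isTest ? ['src/migration/*.ts'] : ['dist/migration/*.js'],
};
With the connection and the tests resolving entities from the same files, metadata.target === User compares the same class object again.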

AJV schema validation fails

I am using Jsonix. I used the command below to generate the Jsonix mappings and JSON Schema:
java -jar jsonix-schema-compiler-full.jar -generateJsonSchema -d mappings books.xsd
It generates the mappings and schema properly. I want to validate the JSON against the generated JSON Schema using AJV, so I tried this:
var fs = require('fs');
var Ajv = require('ajv');
var XMLSchemaJsonSchema = JSON.parse(fs.readFileSync('../node_modules/jsonix/jsonschemas/w3c/2001/XMLSchema.jsonschema').toString());
var JsonixJsonSchema = JSON.parse(fs.readFileSync('../node_modules/jsonix/jsonschemas/jsonix/Jsonix.jsonschema').toString());
var booksJsonSchema = JSON.parse(fs.readFileSync('./books.jsonschema').toString());
var ajv = new Ajv();
ajv.addSchema(XMLSchemaJsonSchema, 'http://www.jsonix.org/jsonschemas/w3c/2001/XMLSchema.jsonschema');
ajv.addSchema(JsonixJsonSchema, 'http://www.jsonix.org/jsonschemas/jsonix/Jsonix.jsonschema');
var validate = ajv.compile(booksJsonSchema);
var data ={
"book": [
{
"#id": "bk001",
"author": "Writer",
"title": "The First Book",
"genre": "Fiction",
"price": "44.95",
"pub_date":2000-10-01,
"review": "An amazing story of nothing."
},
{
"#id": "bk002",
"author": "Poet",
"title": "The Poet's First Poem",
"genre": "Poem",
"price": "24.95",
"pub_date":2000-10-02,
"review": "Least poetic poems."
}
]
};
var valid = validate(data);
if (!valid) {
console.log('Validation failed errors:');
console.log(validate.errors);
}else{
console.log("successfully done validation");
}
But it throws this error:
/Users/qliktag/Desktop/QAGG/qagUI2/Scripts/QLIKTAG-2602/node_modules/ajv/lib/ajv.js:183
else throw new Error(message);
^
Error: schema is invalid: data.definitions['nonPositiveInteger'].anyOf[0].exclusiveMaximum should be number
at Ajv.validateSchema (/Users/qliktag/Desktop/QAGG/qagUI2/testScripts/node_modules/ajv/lib/ajv.js:185:16)
at Ajv._addSchema (/Users/qliktag/Desktop/QAGG/qagUI2/Scripts/QLIKTAG-2602/node_modules/ajv/lib/ajv.js:316:10)
at Ajv.addSchema (/Users/qliktag/Desktop/QAGG/qagUI2/Scripts/QLIKTAG-2602/node_modules/ajv/lib/ajv.js:136:29)
at Object.<anonymous> (/Users/qliktag/Desktop/QAGG/qagUI2/Scripts/QLIKTAG-2602/mappings/ajvSample.js:248:5)
at Module._compile (module.js:456:26)
at Object.Module._extensions..js (module.js:474:10)
at Module.load (module.js:356:32)
at Function.Module._load (module.js:312:12)
at Function.Module.runMain (module.js:497:10)
at startup (node.js:119:16)
The error actually occurs during ajv.addSchema. Did I do anything wrong?
To continue using draft-04 schemas, add meta: false to prevent adding the draft-06 meta-schema (see https://github.com/epoberezkin/ajv/releases/tag/5.0.0):
var ajv = new Ajv({
schemaId: 'id',
meta: false,
});
var metaSchema = require('../node_modules/ajv/lib/refs/json-schema-draft-04.json');
ajv.addMetaSchema(metaSchema);
ajv._opts.defaultMeta = metaSchema.id;
ajv._refs['http://json-schema.org/schema'] = 'http://json-schema.org/draft-04/schema';
After adding this, addSchema will accept the schemas that use a boolean exclusiveMaximum:
ajv.addSchema(XMLSchemaJsonSchema, 'http://www.jsonix.org/jsonschemas/w3c/2001/XMLSchema.jsonschema');
ajv.addSchema(JsonixJsonSchema, 'http://www.jsonix.org/jsonschemas/jsonix/Jsonix.jsonschema');
The change of exclusiveMaximum from boolean to number happened with Draft-06/07 of JSON Schema.
// For draft-04 schemas only: var ajv = new Ajv({schemaId: 'id'});
// If you want to use both draft-04 and draft-06/07 schemas:
var ajv = new Ajv({schemaId: 'auto'});
ajv.addMetaSchema(require('ajv/lib/refs/json-schema-draft-04.json'));
With this configuration in place, addSchema will accept the schemas that use a boolean exclusiveMaximum:
ajv.addSchema(XMLSchemaJsonSchema, 'http://www.jsonix.org/jsonschemas/w3c/2001/XMLSchema.jsonschema');
ajv.addSchema(JsonixJsonSchema, 'http://www.jsonix.org/jsonschemas/jsonix/Jsonix.jsonschema');
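Putting the pieces together with the code from the question, a minimal sketch of the full setup could look like this; the file paths come from the question, and Ajv 5.x/6.x is assumed (where schemaId: 'auto' accepts both draft-04 and draft-06/07 schemas):
var fs = require('fs');
var Ajv = require('ajv');

// Accept both draft-04 ("id") and draft-06/07 ("$id") schemas.
var ajv = new Ajv({ schemaId: 'auto' });
ajv.addMetaSchema(require('ajv/lib/refs/json-schema-draft-04.json'));

// Register the referenced schemas, then compile the generated books schema as before.
var XMLSchemaJsonSchema = JSON.parse(fs.readFileSync('../node_modules/jsonix/jsonschemas/w3c/2001/XMLSchema.jsonschema').toString());
var JsonixJsonSchema = JSON.parse(fs.readFileSync('../node_modules/jsonix/jsonschemas/jsonix/Jsonix.jsonschema').toString());
ajv.addSchema(XMLSchemaJsonSchema, 'http://www.jsonix.org/jsonschemas/w3c/2001/XMLSchema.jsonschema');
ajv.addSchema(JsonixJsonSchema, 'http://www.jsonix.org/jsonschemas/jsonix/Jsonix.jsonschema');

var booksJsonSchema = JSON.parse(fs.readFileSync('./books.jsonschema').toString());
var validate = ajv.compile(booksJsonSchema);
var valid = validate(data); // data: the books object from the question
console.log(valid ? 'successfully done validation' : validate.errors);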
Author of Jsonix here.
As pointed out in the documentation, JSON Schema generation is an experimental feature, so it would not be surprising if it fails. You are welcome to file issues.

Trying to sort MarkLogic collection query result on a dateTime index

I am using Node.js to set up an application and to test some things. I use the marklogic module in Node.js to query the database and return JSON. For this I wrote a transform on the XML content; that works.
Now I want to order the result of a collection query on the timestamp in the data. The timestamp has an index on it of type xs:dateTime and lives in the namespace "http://example.com/sccs".
Example document:
<?xml version="1.0"?>
<obj:object xmlns:obj="http://marklogic.com/solutions/obi/object">
<obi:metadata xmlns:obi="http://marklogic.com/solutions/obi" createdBy="user-app-user" createdDateTime="2015-10-26T16:42:30.302458Z" lastUpdatedBy="user-app-user" lastUpdatedDateTime="2015-10-26T16:45:01.621435Z">
</obi:metadata>
<obj:label>This alert was send based on the First Time Seen RuleThis subject was spotted first time at this sensor location</obj:label>
<obj:type>alert</obj:type>
<obj:id>c2151ee0-f0a9-4eb5-85c2-1c5b3c7c7a65</obj:id>
<obj:content>
<alert xmlns="http://example.com/sccs/alert">
<obj:property sourceCount="1">
<actions type="array" elementType="string">
<action>This alert was send based on the First Time Seen Rule</action>
<action>This subject was spotted first time at this sensor location</action>
</actions>
</obj:property>
<obj:property sourceCount="1">
<status>Inactive</status>
</obj:property>
<obj:property sourceCount="1">
<sensor-id>test-sensor-id</sensor-id>
</obj:property>
<obj:property sourceCount="1">
<device-id>00:00:00:00:00:04</device-id>
</obj:property>
<obj:property sourceCount="1">
<alertType>trespasser-alert</alertType>
</obj:property>
<obj:property sourceCount="1">
<position>{"type":"Point", "coordinates":[52.2, 4.3]}</position>
</obj:property>
<obj:property sourceCount="1">
<scc:id xmlns:scc="http://example.com/sccs">04fdef0a-9d3f-4743-9e88-04da279a0c37</scc:id>
</obj:property>
<obj:property sourceCount="1">
<scc:timestamp xmlns:scc="http://example.com/sccs">2015-10-01T13:06:00Z</scc:timestamp>
</obj:property>
</alert>
</obj:content>
<obj:workspaces>
<obj:workspace id="Public">
</obj:workspace>
</obj:workspaces>
<obj:sourceIds count="1">
<source:id xmlns:source="http://marklogic.com/solutions/obi/source">57358890-8d71-4515-90c1-5cacc54347f7</source:id>
</obj:sourceIds>
</obj:object>
Now my node script:
var marklogic = require('marklogic');
var my = require('./my-connection.js');
var db = marklogic.createDatabaseClient(my.connInfo);
var qb = marklogic.queryBuilder;
var options = {
  "sort-order": [
    {
      "direction": "descending",
      "type": "xs:dateTime",
      "collation": "http://marklogic.com/collation/codepoint",
      "element": {
        "name": "timestamp",
        "ns": "http://sensingclues.com/sccs"
      },
      "annotation": [ "some user comment can go here" ]
    },
    {
      "direction": "ascending",
      "score": null
    }
  ]
};
db.documents.query(
qb.where(
qb.collection("alert")
).orderBy(qb.sort('timestamp'))//.withOptions(options)//.orderBy(qb.sort('timestamp'))
//.slice(qb.transform('alerts-query-transform')) // HK :use transform
).result( function(documents) {
var arrAlerts = new Array();
console.log('The alerts collection:')
documents.forEach( function(document) {
arrAlerts.push(document.content);
});
console.log(arrAlerts);
}, function(error) {
console.log(JSON.stringify(error, null, 2));
});
Gives:
node alerts-no-transform-test.js
{
"message": "query documents: response with invalid 400 status",
"statusCode": 400,
"body": {
"errorResponse": {
"statusCode": 400,
"status": "Bad Request",
"messageCode": "SEARCH-BADORDERBY",
"message": "SEARCH-BADORDERBY: (err:FOER0000) Indexes are required to support element, element-attribute, json-property, or field sort specifications."
}
}
}
If I try to use the options defined above, I get:
node alerts-no-transform-test.js
/home/hugo/git/sccss-middletier/cluey-app/node_modules/marklogic/lib/query-builder.js:4807
throw new Error('unknown option '+key);
^
Error: unknown option sort-order
at QueryBuilder.withOptions (/home/hugo/git/sccss-middletier/cluey-app/node_modules/marklogic/lib/query-builder.js:4807:15)
at Object.<anonymous> (/home/hugo/git/sccss-middletier/cluey-app/alerts-no-transform-test.js:33:6)
at Module._compile (module.js:460:26)
at Object.Module._extensions..js (module.js:478:10)
at Module.load (module.js:355:32)
at Function.Module._load (module.js:310:12)
at Function.Module.runMain (module.js:501:10)
at startup (node.js:129:16)
at node.js:814:3
Question : What is the correct way to sort the result based on a timestamp dateTime index?
You mention this collation in the sample sort order configuration:
"collation": "http://marklogic.com/collation/codepoint",
However, you mention no specific collation in your index configuration.
Yet in MarkLogic 8 the default collation is not codepoint but the UCA Root Collation.
It is possible that you are simply trying to sort on a non-existent index (because the index was created with the default collation while your code asks for the codepoint collation).
I suspect this because of the message:
"message": "SEARCH-BADORDERBY: (err:FOER0000) Indexes are required to support element, element-attribute, json-property, or field sort specifications."
In situations like this, I always use cts:element-values() or cts:values() etc. in Query Console to test my index and make sure it is exactly as I expect before I try to refer to it in code. This may help you ensure that the index is what you expect.
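For example, a quick server-side JavaScript check in Query Console could look like the sketch below; the namespace and element name are taken from your documents, while the index-type option is an assumption about how the range index was configured:
// Query Console (server-side JavaScript): list the values the element range index holds.
// An error about a missing range index means the index the sort needs does not exist
// with these exact settings (element name, namespace, type, collation).
cts.elementValues(
  fn.QName('http://example.com/sccs', 'timestamp'),
  null,
  ['type=dateTime']
);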
OK, I found a pointer on SO here.
Now this works:
// get all the devices from ML
db.documents.query(
qb.where(
qb.collection("alert")
).orderBy(qb.sort(qb.element(qb.qname('http://example.com/sccs', 'timestamp')),
'descending'))
.slice(qb.transform('alerts-query-transform')) // HK: use transform
).result( function(documents) {
Apparently I have to explicitly point to the element? Is there any useful documentation on how to use the query builder in detail?
As an aside: note the transform applied to map the XML to JSON ...

Nodemailer with Gmail on Loopback error - Object #<Object> has no method 'getToken'

I am learning LoopBack and decided to try sending some email. I want to use a Gmail account.
I created a remote method and configured the datasource. Here is how it looks:
"myEmailDataSource": {
"name": "myEmailDataSource",
"connector": "mail",
"transports": [
{
"type": "smtp",
"host": "smtp.gmail.com",
"auth": {
"xoauth2": {
"user": "myMail#gmail.com",
"clientId": "myClientId.apps.googleusercontent.com",
"clientSecret": "mySecret",
"refreshToken": "myToken"
}
}
}
]
}
But when I want to send an email, it throws this error:
TypeError: Object #<Object> has no method 'getToken'
at SMTPConnection._handleXOauth2Token (/home/arth95/Projects/firstCMS/node_modules/loopback/node_modules/nodemailer/node_modules/nodemailer-smtp-transport/node_modules/smtp-connection/src/smtp-connection.js:961:67)
at SMTPConnection.login (/home/arth95/Projects/firstCMS/node_modules/loopback/node_modules/nodemailer/node_modules/nodemailer-smtp-transport/node_modules/smtp-connection/src/smtp-connection.js:233:18)
at SMTPTransport.<anonymous> (/home/arth95/Projects/firstCMS/node_modules/loopback/node_modules/nodemailer/node_modules/nodemailer-smtp-transport/src/smtp-transport.js:96:24)
at SMTPConnection.g (events.js:180:16)
at SMTPConnection.EventEmitter.emit (events.js:92:17)
at SMTPConnection._actionEHLO (/home/arth95/Projects/firstCMS/node_modules/loopback/node_modules/nodemailer/node_modules/nodemailer-smtp-transport/node_modules/smtp-connection/src/smtp-connection.js:692:10)
at SMTPConnection._processResponse (/home/arth95/Projects/firstCMS/node_modules/loopback/node_modules/nodemailer/node_modules/nodemailer-smtp-transport/node_modules/smtp-connection/src/smtp-connection.js:511:16)
at SMTPConnection._onData (/home/arth95/Projects/firstCMS/node_modules/loopback/node_modules/nodemailer/node_modules/nodemailer-smtp-transport/node_modules/smtp-connection/src/smtp-connection.js:357:10)
at CleartextStream.EventEmitter.emit (events.js:95:17)
at CleartextStream.<anonymous> (_stream_readable.js:746:14)
Why is that?
I had the exact same problem. Did you find any solution for this?
As a workaround I've done the following:
Create a boot script in server/boot.
In the boot script, add the following code:
module.exports = function(app) {
  var email = app.models.Email;
  var auth = email.dataSource.connector.transports[0].transporter.options.auth;
  auth.xoauth2 = require('xoauth2').createXOAuth2Generator(auth.xoauth2);
};
This converts the xoauth2 object that you defined in the datasource into the XOAuth2Generator object that nodemailer needs.
You need to have the xoauth2 module installed.
There should be a better way to handle this, but so far I have not found one, so I am using this workaround.
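For reference, a minimal way to verify the workaround after boot is to send a test message through the LoopBack Email model; the addresses below are placeholders, not taken from the original setup:
// e.g. in another boot script that runs after the one above
module.exports = function(app) {
  var Email = app.models.Email;
  Email.send({
    to: 'recipient@example.com',   // placeholder address
    from: 'myMail@gmail.com',      // should match the authenticated Gmail user
    subject: 'XOAuth2 test',
    text: 'If this arrives, the XOAuth2 generator workaround is in place.'
  }, function(err) {
    if (err) return console.error('send failed:', err);
    console.log('email sent');
  });
};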
