Post Service call with Multiple records - sap-cloud-sdk

I would like to know how to post multiple records to SAP using "BatchRequestBuilder" along with ChangeSet. I am using a custom OData service call (ODataCreateRequestBuilder), not the VDM model. I didn't find any blog or documentation to start with.
Can you please help me in this regard.
Updated:
Below is what I am trying to post to SAP
[{"purchaseSchAgrmntNo":"","customerMaterialNumber":"","plant":"","vendorNo":""},{"purchaseSchAgrmntNo":"","customerMaterialNumber":"","plant":"","vendorNo":""}]
SAP SDK version : 3.9.0
I have added the below code with only one create request.
ChangeSet changeSet = new ChangeSetBuilder()
        .addCreateRequest(ODataCreateRequestBuilder
                .withEntity(sapConfig.getServiceUrlRepriceList(), sapConfig.getEntityRepriceList())
                .withBodyAsMap(responseBody)
                .build())
        .build();
BatchResult batchResult = BatchRequestBuilder
        .withService("URL?")
        .addChangeSet(changeSet)
        .build()
        .execute(httpClient);
Can you let me know if this is correct? Also, let me know what I have to pass in the service. Is it the service URL?
Thanks,
Arun Pai

The BatchRequestBuilder is actually not directly part of the SAP Cloud SDK but a dependency that the SDK internally uses to execute batch requests. That is why there is no documentation on how to use it at the SDK level.
Roughly, a batch request comprises multiple change sets, which in turn group together multiple operations. The ChangeSetBuilder allows you to build up change sets which you can then pass to a BatchRequestBuilder.
So if you want to run create requests in batch mode, you would want to leverage public ChangeSetBuilder addCreateRequest(ODataCreateRequest oDataCreateRequest).
You can take a look at how the SAP Cloud SDK uses these classes to build up batch requests to get an idea of how it works in detail. As a starting point, look at BatchFluentHelperBasic. However, unless the service you want to query is unknown at compile time, I recommend that you leverage the generator to generate this code so that you can use the VDM instead, which simplifies this.
If you extend your question to include more specific information on what you actually want to achieve, I can expand my answer to give a more concrete example. Also, please include the SDK version you are using.
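Based on your updated question, a rough, untested sketch that reuses the classes from your own snippet could look as follows. Here firstRecord and secondRecord stand for the two payload maps from your JSON array, and passing the same service URL to withService is an assumption on my side, not confirmed API guidance:
// One create request per record to be posted (the payload maps are placeholders for your two records)
ODataCreateRequest firstCreate = ODataCreateRequestBuilder
        .withEntity(sapConfig.getServiceUrlRepriceList(), sapConfig.getEntityRepriceList())
        .withBodyAsMap(firstRecord)
        .build();
ODataCreateRequest secondCreate = ODataCreateRequestBuilder
        .withEntity(sapConfig.getServiceUrlRepriceList(), sapConfig.getEntityRepriceList())
        .withBodyAsMap(secondRecord)
        .build();
// Group both create requests into a single change set so they are processed together
ChangeSet changeSet = new ChangeSetBuilder()
        .addCreateRequest(firstCreate)
        .addCreateRequest(secondCreate)
        .build();
// Execute the change set as one batch request against the service
BatchResult batchResult = BatchRequestBuilder
        .withService(sapConfig.getServiceUrlRepriceList())
        .addChangeSet(changeSet)
        .build()
        .execute(httpClient);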

Related

How to create a low-code based workflow setup in Node.js?

I want to create a workflow automation where an activity comes in and the user can set up a multilevel workflow.
For the frontend I am using https://reactflow.dev
How should I structure things in the Node.js backend? Things like the database, accessing control flow statements, and statements which require crons.
You also may want to have a look at node-red.
It's an open-source product that does exactly that.
There's a set of built-in nodes.
You can develop your own nodes or import third-party ones, which are stored in NPM.
You can also just create a node with JavaScript or TypeScript code in it, on the fly.
You should check Flumejs: https://flume.dev/
https://flume.dev/docs/quick-start/
Also, you should see this CodeSandbox example. Try to read the code and all the dependencies: https://codesandbox.io/s/node-based-code-generation-test-forked-ll9flz?file=/src/App.tsx
I hope you find this helpful.

Unable to set mandatory headers for ContactOriginData PATCH in batch mode

According to the integration guide for Contacts OData, the Sap-Cuan-SequenceId header is mandatory when updating a ContactOriginData record. When updating in singleton mode, I am able to set this header as follows and it works without issue:
service
.updateContactOriginData(contact)
.withHeader("Sap-Cuan-SequenceId", "PatchUpdate")
.executeRequest(destination);
However, there is no option to set this header when performing the same update in batch mode:
service
.batch()
.beginChangeSet()
.updateContactOriginData(contact)
.withHeader(...) // this option does not exist
.endChangeSet()
.executeRequest(destination);
When I run the batch version, my SAP Import Monitor shows the error:
Invalid content in field Sap-Cuan-SequenceId
Is it possible to set this header in batch mode and I'm just not seeing how? I am using version 3.39.0 of the SDK. Any help would be greatly appreciated!
Thanks!
This clearly looks like an implementation shortcoming. The SDK has a new API for OData batch requests in the OData v4 client which shouldn't have this issue. The mentioned service exposes OData v2 only, and the OData v2 batch implementation has historically been different; for compatibility reasons, it has to be kept like this. We plan to provide a parallel implementation to align it with OData v4 and fix many minor and major inconsistencies.
If this is super urgent, we can try to provide a workaround using the SDK's generic OData client; otherwise, create an issue in this GitHub repository and the SDK team will update you when the fix for adding headers is released.
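In the meantime, a rough workaround sketch that bypasses the typed batch API entirely and sends the PATCH as a plain HTTP request via the destination's HttpClient might look as follows; the service path, entity key, and JSON body are placeholders, not the actual ContactOriginData addressing scheme:
// Workaround sketch: send the PATCH manually so the mandatory header can be set.
// servicePath, entityKey and updateJson are placeholders for your concrete service and payload.
HttpClient httpClient = HttpClientAccessor.getHttpClient(destination);
HttpPatch patch = new HttpPatch(servicePath + "/ContactOriginData(" + entityKey + ")");
patch.setHeader("Sap-Cuan-SequenceId", "PatchUpdate");
patch.setHeader("Accept", "application/json");
patch.setEntity(new StringEntity(updateJson, ContentType.APPLICATION_JSON));
HttpResponse response = httpClient.execute(patch);
Depending on the gateway configuration, you may also need to fetch and send an X-CSRF-Token first.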

SAP Cloud SDK CI/CD Pipeline: Usage with Non-S/4 Services

I am using SAP Cloud SDK (Java flavour) to create an extension application of SuccessFactors.
I sadly discovered that the Jenkins pipeline does not allow me to use any other service than the ones listed here: SCN Blog (scroll to the Appendix).
This does not make much sense to me, as the SDK can now also be used with SaaS products in its ecosystem (and SAP promotes this), SuccessFactors being one of them.
Any hint? Can this check be somehow "bypassed"?
Thanks,
Roberto.
Please note that the blog post is quite old; have you verified your assumption that it does not work with the SuccessFactors API?
Nonetheless, we recently introduced a configuration option which allows you to disable certain checks, cf https://github.com/SAP/cloud-s4-sdk-pipeline/blob/master/configuration.md#s4sdkqualitychecks
checkServices is what you would want to disable in your scenario.
As stated by Florian in the comment, and following the Project Piper documentation, the parameters "checkServices" and "customODataServices" can be used to customize the behavior of the pipeline when running against a non-Business Hub API.
"checkServices: false" will completely deactivate the check, whereas "customODataServices: [ yourApiName ]" will skip the check just for the specified services.

Azure Machine Learning Recommendations API: Delete Build fails

I'm using Azure's Recommendations API to generate product recommendations. I'm keeping Recommendations up to date via an SSIS package that updates the data, creates a new build for the data, and if the build succeeds, deletes any previously existing builds.
The API documentation is straightforward, and everything works fine except that one call in particular fails for me, returning an HTTP 406 error (Not Acceptable: the Accept header doesn't match a response type supported by the server).
Has anyone successfully deleted a build via this API Call?
According to the documentation a Delete Build call should look like this:
Method=DELETE
{rootURI}/DeleteBuild?buildId=%27%27&apiVersion=%271.0%27
Andrew,
I just ran into this question... Did you resolve this issue?
In contrast with most of the functions, the datamarket version of the API requires a DELETE method to actually delete a build (as opposed to a GET); that may be the first thing to check.
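For illustration, a minimal sketch in Java with Apache HttpClient of issuing that call as a DELETE; the root URI, build ID, and authorization header value are placeholders for your account's values:
// Issue the delete as an HTTP DELETE rather than a GET.
HttpClient client = HttpClientBuilder.create().build();
HttpDelete delete = new HttpDelete(rootUri + "/DeleteBuild?buildId=%27" + buildId + "%27&apiVersion=%271.0%27");
delete.setHeader("Authorization", authorizationHeaderValue);
// An explicit Accept header can also matter here, since 406 means the server rejected the Accept header.
delete.setHeader("Accept", "application/json");
HttpResponse response = client.execute(delete);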
Also, I want to make sure you are aware that the datamarket version of the API will be deprecated (please read the migration guide at https://azure.microsoft.com/en-us/documentation/articles/cognitive-services-migration-from-dm/ for more details). You should now be using the Recommendations API Cognitive Service (see https://www.microsoft.com/cognitive-services/en-us/recommendations-api ).
Either way, if you are still running into issue, feel free to contact us at mlapi#microsoft.com and we can check what may be wrong with your call.
Thanks!
Luis Cabrera
Recommendations API Program Manager

Connect AppServer of Progress OpenEdge through Node.js

Does anyone have experience, information, or some (coding) examples of a solution to establish a connection between the AppServer of Progress and Node.js? The aim is to create REST services to the DB which can be accessed from the web, for example by an Angular app.
Thanks for any advice
Christian
Starting with 11.2 (and enhanced in later versions) you can create REST-based applications utilizing the AppServer as a platform. ProDatasets are used as output (they convert easily to XML and/or JSON).
This is all explained in the Web Services part of the documentation. I'm providing a link below.
Basic steps
You need to consult the manual for all these steps...
Create an ABL program with input parameters (could be a simple parameter, a temp-table, or a dataset) and a single output parameter (could be a temp-table, a dataset, or a single character or longchar parameter).
Add ABL-specific REST annotation to the program
Map the parameters in OpenEdge Studio
Setup REST agents with the restman utility
Export a WAR file and deploy your web service.
Calling the web service from node.js should be no greater problem than calling any REST based web service.
In versions prior to 11.2 you can "fake it and make it" utilizing WebSpeed. You can create a WebSpeed program that reads parameters from the query string (using get-field()) and then writes a response to the "webstream". Use either the WRITE-XML or WRITE-JSON method on a temp-table or a dataset for writing the result. Don't forget to add a good MIME type though... This might not be as robust and customizable, but it will work...
OE 11.4 Product Documentation - Web Services See chapter "II Creating OpenEdge REST Web Services"
These might also be useful:
OE 11.4 Product Documentation - Working with XML
OE 11.4 Product Documentation - Working with JSON
