What fields should be set in the FileSpec object passed into IClient.sync in the Perforce Java API?
I've set:
The pathname passed to the constructor to the absolute path of the local workspace followed by /...
setClient(myIClient)
the action to FileAction.SYNC
No exception is thrown, but one IFileSpec comes back in the result, and it complains that the client 'unknownclient' is (not surprisingly) unknown.
Sounds like you forgot to call:
IClient client = server.getClient(clientName);
server.setCurrentClient(client);
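For completeness, a minimal sketch of how the pieces fit together with p4java; the client name and depot path are placeholders, and server is assumed to be an already connected and logged-in IOptionsServer:
import com.perforce.p4java.client.IClient;
import com.perforce.p4java.core.file.FileSpecBuilder;
import com.perforce.p4java.core.file.FileSpecOpStatus;
import com.perforce.p4java.core.file.IFileSpec;
import com.perforce.p4java.option.client.SyncOptions;
import java.util.List;
// server: an IOptionsServer that is already connected and logged in
// Look up the client spec by name and make it the server's current client
IClient client = server.getClient("my-client-name");   // placeholder client name
server.setCurrentClient(client);
// Build the file spec from a depot or local syntax path; no FileAction needs to be set for a sync
List<IFileSpec> files = FileSpecBuilder.makeFileSpecList("//depot/project/...");
// Run the sync and inspect each returned spec for errors
List<IFileSpec> results = client.sync(files, new SyncOptions());
for (IFileSpec spec : results) {
    if (spec.getOpStatus() != FileSpecOpStatus.VALID) {
        System.err.println(spec.getStatusMessage());
    }
}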
We are currently struggling with a batch query, which seems to ignore the filter expressions on the S4 side because of wrong URL encoding.
/sap/opu/odata/sap/ZP2M_A_CONTRACT_SEARCH_HDR_CDS/ZP2M_A_CONTRACT_SEARCH_HDR?$filter=PurchaseContractID eq %274600002020%27&$select=*&$format=json
Executing the query using FluentHelperRead.execute(HttpClient), the returned list of entities contains the expected result with exactly one entity.
Executing the query as a batch query, the following request is logged to the console:
GET ZP2M_A_CONTRACT_SEARCH_HDR?%24filter%3DPurchaseContractID+eq+%25274600002020%2527%26%24select%3D*%26%24format%3Djson HTTP/1.1
The list collected from all batch result parts contains all entities.
It seems that the query URL is encoded in the wrong way and that S4 ignores the filter expressions when they are encoded like this, e.g. $filter is encoded to %24filter, which is ignored by S4.
This seems to be a bug in the BatchRequestImpl.getRequest(ODataQueryImpl) method, where URL encoding is applied a second time to already encoded URL parts:
if(systemQuery.indexOf("$format=json&$count=true") != -1)
{
    systemQuery = systemQuery.substring(0, systemQuery.indexOf("$format=json&$count=true") -1);
    keysUrl.append("/$count");
}
systemQuery = URLEncoder.encode(systemQuery, "UTF-8"); // this line encodes the query a second time
keysUrl.append("?");
The code line systemQuery = URLEncoder.encode(systemQuery, "UTF-8"); located in
BatchRequestImpl (1.38.0) - line 295
BatchRequestImpl (1.42.2) - line 307
encodes the systemQuery string again, including the already encoded parts of the filter expression.
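To illustrate the effect in isolation, here is a small standalone sketch; the query string is the one from the logged request above, and the class name is just for the demo:
import java.net.URLEncoder;
public class DoubleEncodingDemo {
    public static void main(String[] args) throws Exception {
        // Query string as built for the single (non-batch) request, quotes already encoded as %27
        String systemQuery = "$filter=PurchaseContractID eq %274600002020%27&$select=*&$format=json";
        // Encoding it a second time mangles the already encoded parts:
        // $ -> %24, = -> %3D, space -> +, & -> %26, and the existing %27 becomes %2527
        System.out.println(URLEncoder.encode(systemQuery, "UTF-8"));
        // prints: %24filter%3DPurchaseContractID+eq+%25274600002020%2527%26%24select%3D*%26%24format%3Djson
    }
}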
When undoing the change on the encoding line in the debugger and replacing the spaces with %20 or '+', the batch query looks like this:
GET ZP2M_A_CONTRACT_SEARCH_HDR?$filter=PurchaseContractID%20eq%20%274600002020%27&$select=*&$format=json HTTP/1.1
GET ZP2M_A_CONTRACT_SEARCH_HDR?$filter=PurchaseContractID+eq+%274600002020%27&$select=*&$format=json HTTP/1.1
and it returns the expected result (exactly 1 entity).
This wrong encoding appears when using these library versions:
sdk-bom: 3.16.1
connectivity: 1.38.0
The issue also appears in the newest SDK version:
sdk-bom: 3.21.0
connectivity: 1.39.0
And with the connectivity JAR in its newest version:
sdk-bom: 3.21.0
connectivity: 1.40.2
Debugging together with an ABAP/S4 colleague showed that S4 only applies filter expressions if the keyword $filter is found in the request; %24filter%3D is ignored (which is why we get all entities when running the batch query).
My suggestion for solving it would be:
// decode the query first (to undo the encoding of the filter expression)
systemQuery = URLDecoder.decode(systemQuery, "UTF-8");
// then encode the query once
systemQuery = org.apache.commons.httpclient.util.URIUtil.encodeQuery(systemQuery, "UTF-8");
My code for calling the batch request:
FluentHelperRead<?, MyEntity, ?> queryApi = myService.getAll... // with some filter expression added
BatchRequestBuilder batchRequestBuilder = BatchRequestBuilder.withService(MyService.DEFAULT_SERVICE_PATH);
ODataQuery query = queryApi.toQuery();
batchRequestBuilder.addQueryRequest(query);
HttpClient httpClient = HttpClientAccessor
.getHttpClient(DefaultErpHttpDestinationAccessor.get());
BatchRequest request = batchRequestBuilder.build();
BatchResult result = request.execute(httpClient);
// ... evaluate response
I think this is a general issue in the Cloud SDK.
Would it be possible to get this fixed in the next Cloud SDK release?
Can you share your code for the batch request? Do you use BatchRequestImpl directly?
The thing is, the SAP Cloud SDK relies on some dependencies, one of which provides BatchRequestImpl, and if it's called directly the bug is on the dependency side. I have already asked them to investigate this double-encoding issue. Unfortunately, we can't directly influence how fast it is resolved, and sometimes it takes longer than we'd like.
The good news is that we're working on replacing this dependency with our own implementation to solve exactly this kind of problem. Batch support is work in progress and should be available in beta around the end of next month for OData V4, and hopefully around the same time for OData V2 (this is not a hard commitment and depends on other priorities).
From here we have to wait for whichever happens first:
The bug is fixed on the dependency side
The internal OData client implementation, including batch support, is ready
I hope this helps and explains the current solution path. If you share a bit about your deadlines and the potential impact, we'll be happy to take that into account.
This has been fixed within the dependency and as of version 3.25.0 the SAP Cloud SDK includes the fix.
I am attempting to set the database/document context programmatically via the Python API. My steps are as follows:
session = BaseXClient.Session("localhost", 1984, "admin", "admin")
query = session.query("//node")
query.context("doc('dbname')") # **NOT SURE HOW TO SET THE DB TO USE**
query.execute()
I already know that I can simply use the session object as follows and it works fine:
session.execute("xquery doc('dbname')//node/child")
But I am looking for a way to OPEN a database within the scope of the program call, separate from the query string. I am not able to find documentation on explicitly setting the database prior to executing the query using the context object. I have looked at the source code for the Python BaseXClient, and there is a context method on the Query() instance that is not well documented. I am attempting to use this to set the database and not having much luck.
The context you have supplied is just a string; it is not evaluated. In a client-server setup it is difficult to see how one could pass in a database here.
I think your alternatives are to use the execute command to open a database before running the query (this will set the context), e.g.:
var q = session.execute("open mydatabase",log.print)
var q = session.query("count(*)")
or to use the query bind command to pass parameters:
var q = session.query("declare variable $db external; count(collection($db))")
q.bind("db", "mydatabase","",log.print);
q.execute(log.print);
Sorry, these examples use JavaScript and my BaseX Node.js client, as I am not familiar with the Python API, but I am sure the same approach applies there.
In spring-batch, data can be passed between various steps via the ExecutionContext: you can set the details in one step and retrieve them in the next. Do we have anything of this sort in spring-integration?
My use case is that I have to pick up a file from an FTP location, then split it based on certain business logic, and then process the parts. The client id would be derived from the file name, and it would be used in the splitter, service activator, and aggregator components.
With my newbie level of expertise in Spring, I could not find anything that helps me share state for a particular run. I wanted to know if spring-integration provides this state-sharing context in some way. Please let me know if there is a way to do this in spring-integration.
In Spring Integration applications there is no single ExecutionContext for state sharing. Instead, as Gary Russell mentioned, each message carries all the information within its payload or its headers.
If you use the Spring Integration Java DSL and want to transport the clientId in a message header, you can use the enrichHeaders transformer. Supplied with a HeaderEnricherSpec, it can accept a function which returns a dynamically determined value for the specified header. For your use case this might look like:
return IntegrationFlows
.from(/*ftp source*/)
.enrichHeaders(e -> e.headerFunction("clientId", this::deriveClientId))
./*split, aggregate, etc the file according to clientId*/
where the deriveClientId method might be something like:
private String deriveClientId(Message<File> fileMessage) {
String fileName = fileMessage.getHeaders().get(FileHeaders.FILENAME, String.class);
String clientId = /*some other logic for deriving clientId from*/fileName;
return clientId;
}
(the FILENAME header is provided by the FTP message source)
When you need to access the clientId header somewhere in the downstream flow, you can do it the same way as the file name above:
String clientId = message.getHeaders().get("clientId", String.class);
But make sure that the message still contains such a header, as it could have been lost somewhere among the intermediate flow items. This is likely to happen if at some point you construct a message manually and send it further. In order not to lose any headers from the preceding message, you can copy them while building the new one:
Message<PayloadType> newMessage = MessageBuilder
.withPayload(payloadValue)
.copyHeaders(precedingMessage.getHeaders())
.build();
Please note that message headers are immutable in Spring Integration. This means you can't just add or change a header of an existing message; you should create a new message or use a HeaderEnricher for that purpose. Examples of both approaches are presented above.
Typically you convey information between components in the message payload itself, or often via message headers - see Message Construction and Header Enricher
Following these guides https://developers.google.com/apps-script/guides/rest/quickstart/target-script and https://developers.google.com/apps-script/guides/rest/quickstart/nodejs, I am trying to use the Execution API in Node to return some data that is in a Google Spreadsheet.
I have set the script ID to be the Project Key of the Apps Script file. I have also verified that running the function in the Script Editor works successfully.
However, when running the script locally with node, I get this error:
The API returned an error: Error: ScriptError
I have also made sure the script is associated with the project that I use to authenticate with the Google APIs.
Does anyone have any suggestions on what I can do to debug or fix this issue? The error is so generic that I am not sure where to look.
UPDATE: I've included a copy of the code in this JSBin (the year function is the entry point)
https://jsbin.com/zanefitasi/edit?js
UPDATE 2: The error seems to be caused by the inclusion of this line
var spreadsheet = SpreadsheetApp.open(DriveApp.getFileById(docID));
It seems that I didn't request the right scopes. The Node.js example includes 'https://www.googleapis.com/auth/drive', but I also needed to include 'https://www.googleapis.com/auth/spreadsheets' in the SCOPES array. It seems the error message ScriptError is not very informative here.
To find out which scopes you need, go to the Script Editor > File > Project Properties > Scopes. Remember to delete the old credentials (~/.credentials/old-credential.json) so that the script will request new ones.
EDIT: With the updated information I took a closer look and saw that you are returning a non-basic type. Specifically, you are returning a Sheet object.
The basic types in Apps Script are similar to the basic types in JavaScript: strings, arrays, objects, numbers and booleans. The Execution API can only take and return values corresponding to these basic types -- more complex Apps Script objects (like a Document or Sheet) cannot be passed by the API.
https://developers.google.com/apps-script/guides/rest/api
In your Account "class":
this.report = spreadsheet.getSheetByName(data.reportSheet);
old answer:
'data.business_exp' will be null in this context. You need to load the data from somewhere. Every time a script is called, a new instance of the script is created; at the end of the execution chain it is destroyed, and any data stored in global objects is lost. You need to save that data to a permanent location, such as the script/user properties, and reload it on each script execution.
https://developers.google.com/apps-script/reference/properties/
I am getting the below error when I run the orchestration and try to assign a value to a promoted property by reading the value of another promoted property.
Error in Suspended Orchestration:
Inner exception: There is no value associated with the property 'BankProcesses.Schemas.Internal_ID' in the message.
Detail:
I have two XSD schemas: one for calling a stored procedure and reading its response, and another for writing to a flat file. The internal ID returned in the response from the stored procedure needs to be passed to a node in the flat-file schema so it can be written out in the flat-file format.
I have promoted an element from the response schema and also promoted an element from the flat-file schema. I am assigning the value to the promoted properties as below:
strInternalId = msgCallHeaderSP_Response(BankProcesses.Schemas.Internal_ID);
msgCallSP(BankProcesses.Schemas.Header_Internal_ID) = strInternalId;
But when I run the orchestration I get the error mentioned above. I have checked the response from the stored procedure, and the response XML does contain a value, but I am unable to assign that value to the other schema. Please advise.
Thanks,
Mayur
You can use exists to check the existence of the property:
if (BankProcesses.Schemas.Internal_ID exists msgCallHeaderSP_Response)
{
    strInternalId = msgCallHeaderSP_Response(BankProcesses.Schemas.Internal_ID);
    msgCallSP(BankProcesses.Schemas.Header_Internal_ID) = strInternalId;
}
One scenario that might cause this error is that there is no Header_Internal_ID element in the message you are trying to modify. Can you inspect the message before modification to ensure that there is an element whose value should be changed - drop the message out to a file location, maybe.
If this is the case, then just ensure that you create this element when you instantiate your message for the first time - even if you initially set it to an empty element.
HTH
To check if the property exists, you can use this syntax:
BMWFS.LS.BizTalk.CFS.BankProcesses.Schemas.Internal_ID exists msgCallHeaderSP_Response
However, if the source field should always be there, you have to work backwards to find out why the Property is not appearing in the Context.
If it's coming from a Port, is the message passing through an XmlDisassembler Component? If it's coming from another Orchestration, are you actually setting the Property?
The easiest way to look at the Context is to route the Message, msgCallHeaderSP_Response, to a Stopped Send Port. You can then view the Context in BizTalk Administrator.