I am working as an automation tester on a banking domain account. I have a query and need your help.
Current approach:
1. The framework we use for API services testing is a Java-based Serenity-Cucumber framework using Rest Assured.
2. All script development and maintenance activities for this framework are handled by the automation testers in our team.
3. A few weeks ago we learned about the Karate framework and completed a PoC (proof of concept).
4. All went well, and we now plan to migrate our existing Rest Assured Java code to the Karate framework.
The reason for the migration: with Karate, API services testing can be done by manual testers as well, so we are planning the move.
Query
We have almost 80 web services already automated in Rest Assured and running successfully.
Also, the services are inter-dependent, so we have to use both Rest Assured and Karate code together until the migration is complete.
We can't migrate all the services immediately; it is a time-consuming effort.
Is it possible to run Karate and Rest Assured Java code in the same scenario?
Scenario in the Karate feature file:
Given url Customerservices
When method get
Then status 200
* def getCustIDfromUserservices = new callJavaFunction().getcustid('user', 'password')
* print getCustIDfromUserservices
This "getcustid" have the Rest assured java code for "Post" call service to get the customer number.
When I run this in the Karate framework, I get this error:
"io.restassured.internal.RequestSpecificationImpl.invokeMethod(Ljava/lang/String;Ljava/lang/Object;)Ljava/lang/Object;"
Could anyone help with this? Can we run both Karate and Rest Assured code together in the same scenario in the Karate framework? If yes, why am I getting this error when I try to read the response in Rest Assured?
First I'll say that this is not something we claim to support :) So you are on your own.
That said, it sounds like a simple library conflict. My guess is that your existing Maven pom.xml has a lot of libraries floating around. You will need to do some investigation and find out which Maven exclusions you need to add or which libraries need explicit versions specified. If you are lucky, switching from karate-apache to karate-jersey may do the trick.
Also, I strongly recommend creating a Karate quickstart project, then adding the extra stuff one by one in "hello world" mode to see what causes the problem. Use the mvn dependency:tree command and compare the differences between a plain Karate project and yours. If you know how to use Maven profiles, that may be one way to go. All the best!
Worst case, fall back to two Maven modules and run two test suites; that is fine and not the end of the world. You can migrate gradually.
EDIT - also see https://stackoverflow.com/a/65628686/143475
I am using SAP Cloud SDK (Java flavour) to create an extension application of SuccessFactors.
I sadly discovered that the Jenkins pipeline does not allow me to use any services other than the ones listed here: SCN Blog (scroll to the Appendix).
This does not make much sense to me, since the SDK can now also be used - and SAP encourages this - with the SaaS products in its ecosystem, SuccessFactors being one of them.
Any hint? Can this check be somehow "bypassed"?
Thanks,
Roberto.
Please note that the blog post is quite old; have you verified your assumption that it does not work with the SuccessFactors API?
Nonetheless, we recently introduced a configuration option which allows you to disable certain checks, cf https://github.com/SAP/cloud-s4-sdk-pipeline/blob/master/configuration.md#s4sdkqualitychecks
checkServices is what you would want to disable in your scenario.
As stated by Florian in the comment, and following the Project Piper documentation, the parameters "checkServices" and "customODataServices" can be used to customize the behavior of the pipeline when running against a non-Business Hub API.
"checkServices: false" will completely deactivate the check, whereas "customODataServices: [ yourApiName ]" will skip the check only for the specified services.
I am using ODM 8.10 and want to automate building RuleApp files. The code is currently organized in the old Classic Rule Project format, and we are trying to avoid migrating to Decision Services at this time. I have found build jars for Decision Services but nothing so far for Classic Rule Projects. There must be a way to do this, as the RuleApp jar files are created in the Eclipse IDE when you deploy/export a RuleApp. I am trying to find out which jar files the IDE uses and which commands it invokes to build the RuleApp.
Re: "There must be a way to do this"
But you will not necessarily have access to it. The ODM product developers have experience, source code, documentation, and other tools that you do not have access to.
Having said that, there is a build/deploy API that you may be able to access via Ant. I haven't used it since switching to Decision Services when that became feasible in ODM 8.7. Standard practice before that time was to automate deployments via Ant and a "headless" version of Eclipse. If the latest online docs don't describe it, you might try the older docs.
WARNING: Classic Rule Projects are a dead end! Not only will the effort you spend building them in a non-standard way be wasted; I believe it will likely be more trouble than simply migrating to Decision Services (which is usually not that difficult).
I am working on a project where we use Spring Integration, and we have many flows which together make up the full flow of the system.
Now we need to create a main flow with abstract components which internally call the sub-flows. I found the spring-integration-flow project for creating a subflow: https://github.com/spring-projects/spring-integration-flow/tree/master.
But when I looked for the latest jar, I found it was built in 2015 (https://mvnrepository.com/artifact/org.springframework.integration/spring-integration-flow). Now I am confused whether we should use this project or some other approach that Spring Integration provides.
e.g.:
We have 3 flow files:
1) prepare-file.xml
2) prepare-database.xml
3) enrich-object.xml
which are eventually called in sequence: prepare-file.xml --> prepare-database.xml --> enrich-object.xml.
Now we would like to create a master-flow.xml file which shows all the components at a very high level, like a diagram.
Thanks,
Nishit C.
Well, that project hasn't had enough interest from the community for a while. And now that most people are stepping away from XML configuration in favor of Java and annotation configuration with Spring Boot on top, such a project is no longer attractive.
On the other hand, we have provided a Java DSL for Spring Integration flows for several years already: https://docs.spring.io/spring-integration/docs/current/reference/html/#java-dsl
I would say its IntegrationFlow definitions with the sub-flow functionality may serve your requirements.
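Purely as an illustration of that idea, a composed flow in the Java DSL could look roughly like the sketch below; the bean names, channel names, and no-op handlers are assumptions that loosely mirror your prepare-file / prepare-database / enrich-object split:

// Hypothetical Java DSL sketch of a master flow composed of sub-flows.
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;

@Configuration
public class MasterFlowConfiguration {

    @Bean
    public IntegrationFlow masterFlow() {
        // The master flow only wires the sub-flows together, which gives the
        // high-level view you wanted from master-flow.xml.
        return IntegrationFlows.from("masterInputChannel")
                .gateway("prepareFile.input")      // hand off to the prepare-file sub-flow
                .gateway("prepareDatabase.input")  // then the prepare-database sub-flow
                .gateway("enrichObject.input")     // then the enrich-object sub-flow
                .get();
    }

    @Bean
    public IntegrationFlow prepareFile() {
        return f -> f.handle((payload, headers) -> payload); // placeholder for the real file preparation
    }

    @Bean
    public IntegrationFlow prepareDatabase() {
        return f -> f.handle((payload, headers) -> payload); // placeholder for the real database preparation
    }

    @Bean
    public IntegrationFlow enrichObject() {
        return f -> f.handle((payload, headers) -> payload); // placeholder for the real enrichment
    }
}

Each lambda-style IntegrationFlow bean gets an implicit input channel named "<beanName>.input", which is what the gateway() steps in the master flow send to.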
I understand that this might not be the answer you are looking for, but at least it should give you some food for thought.
How do I automate web services testing using Cucumber? Please guide me on how to start it for my web services project.
You cannot start automation testing with Cucumber alone. Cucumber is just a BDD framework which lets you execute test cases.
One commonly recommended option is to use the Ruby language with the Cucumber framework. If you are using Ruby and your project has REST APIs, you can use the gems rest-client and airborne.
You can read up on these and learn how to send requests with parameters, analyse the responses, and call that code from steps written in the Gherkin language. All of these together constitute your Cucumber framework, which you can use to execute the tests.
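The suggestion above uses Ruby, but the same pattern works with any Cucumber implementation. Purely as an illustration, a JVM equivalent with Cucumber-JVM and Rest Assured (the endpoint, step wording, and field names are assumptions) could look roughly like this:

// Hypothetical Cucumber-JVM step definitions backing Gherkin steps such as:
//   When I GET the resource "/users/1"
//   Then the response status should be 200
import io.cucumber.java.en.Then;
import io.cucumber.java.en.When;
import io.restassured.RestAssured;
import io.restassured.response.Response;
import static org.junit.Assert.assertEquals;

public class WebServiceSteps {

    private Response response; // shared between the steps of one scenario

    @When("I GET the resource {string}")
    public void iGetTheResource(String path) {
        // The base URI would normally come from configuration; hard-coded here for illustration.
        RestAssured.baseURI = "https://example.com/api";
        response = RestAssured.given().accept("application/json").get(path);
    }

    @Then("the response status should be {int}")
    public void theResponseStatusShouldBe(int expectedStatus) {
        assertEquals(expectedStatus, response.getStatusCode());
    }
}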
On a side note: the next time you post, please do some research first and include the error you are facing.
At the moment, I'm using two different frameworks for REST API integration testing and for load/stress testing: respectively, Geb (or Cucumber) and Gatling. But most of the time I end up rewriting, in the load/performance scenarios, pieces of code I have already written for the integration tests.
So the question is: is there a framework (running on the JVM), or simply a way, to write integration tests (for a strictly REST API use case), preferably programmatically, and then assemble load testing scenarios from these integration tests?
I've read that Cucumber might be able to do that, but I'm lacking a proper example.
The requirements :
write integration tests programmatically
for any integration test, have the ability to "extract" values (the same way Gatling can extract JSON paths, for instance)
assemble the integration tests into a load test scenario
If anyone has experience to share, I'd be happy to read any blog article, GitHub repository, or other source dealing with such an approach.
Thanks in advance for your help.
It sounds like you want to extract a library that you can use both for your integration tests and for your load tests.
Both tools you are referring to are able to use an external jar.
Assuming you use Maven or Gradle as your build tool, create a new module that you reference from both your integration tests and your load tests. Place all the API interaction logic in this new module. This should allow you to reuse the code you need.
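As a minimal sketch of that idea (the module name, endpoint, and payload are assumptions), the shared module could expose a plain client class that both test suites call:

// Hypothetical class in a shared module (e.g. api-client) that both the
// integration-test and load-test modules depend on.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class CustomerApiClient {

    private final HttpClient http = HttpClient.newHttpClient();
    private final String baseUrl;

    public CustomerApiClient(String baseUrl) {
        this.baseUrl = baseUrl;
    }

    // POST a customer payload and return the raw JSON body, so each test
    // framework can extract whatever values it needs from the response.
    public String createCustomer(String jsonPayload) throws Exception {
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(baseUrl + "/customers"))   // assumed endpoint
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(jsonPayload))
                .build();
        HttpResponse<String> response = http.send(request, HttpResponse.BodyHandlers.ofString());
        if (response.statusCode() != 201) {
            throw new IllegalStateException("Unexpected status: " + response.statusCode());
        }
        return response.body();
    }
}

Your integration tests can assert on the returned body, and your load test scenarios can reuse the same request-building and extraction logic instead of duplicating it.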