While writing integration tests in SAP Hybris, I am getting exceptions implying that the model objects are not available during the test case.
It seems that the ImpEx scripts which normally run during initialization are not running here. Creating objects with the model service is becoming hectic.
Is there some other way? What about the custom objects I defined (like ABCProduct extending Product), and values for them? Is it possible to mock them as well? What about BaseSite and PriceRow?
There are some things you need to know about the test system.
Tenants
Usually you work with the master tenant. The test system, however, has its own tenant called junit. A tenant is essentially a separate data set for a Hybris server running the same code. That way you can run different shops on the same infrastructure, but every shop can only access the data that belongs to its tenant. How does it work? Every tenant has a table prefix; only the master tenant has an empty one. So the products table for the master tenant is called "products", while the products table for the junit tenant is called "junit_products".
Further reading: https://help.sap.com/viewer/d0224eca81e249cb821f2cdf45a82ace/1905/en-US/8c14e7ae866910148e59ebf4a2685857.html
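As an illustration of that prefixing scheme (a sketch only, not actual Hybris internals; the class and method names here are made up):

```java
// Illustration only: how a tenant prefix maps a logical table name
// to a physical one. TenantTables/physicalTableName are hypothetical.
public class TenantTables {

    // The master tenant has an empty prefix; every other tenant
    // prepends "<tenant>_" to the table name.
    public static String physicalTableName(String tenantPrefix, String table) {
        return tenantPrefix.isEmpty() ? table : tenantPrefix + "_" + table;
    }
}
```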
Initialization
When you initialize using ant initialize or the admin console, you usually only initialize the master tenant. To initialize the junit tenant, you need to either switch to the junit tenant in the admin console or run ant initialize -Dtenant=junit. However, this creates only the most basic data.
More on how to execute initialization in admin console in section "Executing Tests": https://help.sap.com/viewer/d0224eca81e249cb821f2cdf45a82ace/1905/en-US/aae25ecb74ab4bd69cc5270ffd455459.html
Creating test data
There are a few classes you can inherit from to create an integration test, but only ServicelayerTest provides methods to create sample data. All those methods import impex files located in /hybris/bin/platform/ext/core/resources/servicelayer/test/
createCoreData() Creates languages, currencies, units etc. See: testBasics.csv
createDefaultCatalog() Creates a sample product catalog with an online catalog version and basic sample products. See: testCatalog.csv
createHardwareCatalog() Creates a sample product catalog with staged and online version, products and classifications. See testHwcatalog.csv and testClassification.csv
createDefaultUsers() Creates sample customers with addresses etc. See testUser.csv
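Put together, a minimal test class using these helpers might look like the following sketch (hedged: exact package names and required annotations vary between Hybris versions, and MyProductTest itself is made up):

```java
// Sketch of an integration test using the ServicelayerTest helpers.
// Everything except ServicelayerTest, ModelService and the create*
// methods is hypothetical; your version may also require the
// @IntegrationTest annotation on the class.
import de.hybris.platform.servicelayer.ServicelayerTest;
import de.hybris.platform.servicelayer.model.ModelService;
import javax.annotation.Resource;
import org.junit.Before;
import org.junit.Test;

public class MyProductTest extends ServicelayerTest {

    @Resource
    private ModelService modelService;

    @Before
    public void setUp() throws Exception {
        createCoreData();        // languages, currencies, units...
        createDefaultCatalog();  // sample catalog with basic products
    }

    @Test
    public void sampleDataIsAvailable() {
        // The sample catalog created above can now be queried through
        // the usual services, and modelService can create new models.
    }
}
```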
Importing custom data
To import data not covered by the ServicelayerTest methods, I recommend one of two approaches.
Using ModelService and other services to create your data. E.g. you can use the OrderService to create sample orders. You can also create utility classes that build sample data for you. You can wire every service you need by annotating it with the @Resource annotation.
Using impex files to create all the data you need. You can split those up into different files that serve different needs (e.g. customers, orders, products...). The method importCsv(String pathToFile, String encoding) in ServicelayerTest lets you import them.
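For example, a small impex file for sample products could look like this (a sketch; the file path, product codes, and the testCatalog:Online catalog version are assumptions based on the default test catalog):

```
# resources/test/testProjectData.impex (hypothetical path)
INSERT_UPDATE Product;code[unique=true];catalogVersion(catalog(id),version)
;myProduct1;testCatalog:Online
;myProduct2;testCatalog:Online
```

You would then load it in your test with importCsv("/test/testProjectData.impex", "UTF-8").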
Related
I want to make a webservice and it looks like Loopback is good starting point.
To explain my question, I will describe situation
I have 2 MySQL Tables:
Users
Companies
Every User has its Company; it is like the master User for its Company.
I want to create a Products table for each company, like this:
company1_Products,
company2_Products,
company3_Products
Each company has internal Users, like:
company1_Users,
company2_Users,
company3_Users
Internal users log in from the corresponding subdomain, like
company1.myservice.com
company2.myservice.com
For the API, I want the datasource to fetch Products from the corresponding table. So the question is: how do I change the datasource dynamically?
And how do I handle Users? Storing them in one table is not good, because internal company users could be in different companies...
Maybe there's a better way to model this?
Disclaimer: I am co-author and one of current maintainers of LoopBack.
how to change datasource dynamically?
The following Stack Overflow answer describes how to attach a single model (e.g. Product) to multiple datasources: https://stackoverflow.com/a/28327323/69868 This solution would work if you were creating one MySQL database per company instead of using the company name as a prefix of the Product table name.
To achieve what you described, you can use model subclassing. For each company, define a new company-specific Product model inheriting from the shared Product model and changing the table name.
// common/models/company1-product.json
{
  "name": "Company1_Product",
  "base": "Product",
  "mysql": {
    "tableName": "company1_Products"
  }
  // etc.
}
You can even create these models on the fly using the app.registry.createModel() and app.model() APIs, and then run dataSource.autoupdate() to create the SQL tables for the new model(s).
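A sketch of that on-the-fly approach (assuming a LoopBack 3.x app with a MySQL datasource named mysqlDs; the company list and names are illustrative):

```javascript
// Illustrative only: create company-specific Product models at runtime.
var ds = app.dataSources.mysqlDs; // your MySQL datasource

['company1', 'company2'].forEach(function (company) {
  var model = app.registry.createModel({
    name: company + '_Product',
    base: 'Product',
    mysql: { tableName: company + '_Products' }
  });
  // Attach the new model to the app and the datasource.
  app.model(model, { dataSource: ds, public: true });
});

// Create the missing SQL tables for the newly defined models.
ds.autoupdate(function (err) {
  if (err) throw err;
});
```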
And how to handle Users? Storing in one table is not good, because internal company users could be in different companies...
I suppose you can use the same approach as you do for Products and as you described in your question.
Maybe there's better way to do such models?
The problem you are facing is called multi-tenancy. I am afraid we haven't figured out an easy-to-use solution yet; there are many possible ways to implement multi-tenancy.
For example, you can create one LoopBack application per company (tenant) and then create a top-level LoopBack or Express application that routes incoming requests to the appropriate tenant-specific LoopBack app instance. See the following repository for a proof-of-concept implementation: https://github.com/strongloop/loopback-multitenant-poc
How can I prevent the demo site from loading the catalog of the Heat Clinic demo?
All I need is an empty shop with admin roles and permissions: no catalog data, no translations.
This is configured with a property, import.sql.enabled. If you set it to false, then no data will be loaded at all.
To include or exclude specific import SQL, we do not yet have a great way to manage that piece by piece. Specifically for admin roles/permissions you would need this import to execute, which would be as easy as moving this @Bean method inside one of your own @Configuration classes in your project (like CorePersistenceConfig).
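So in your properties file (hedged: the exact file name, e.g. development.properties, depends on your project setup):

```
# Disable all demo data import (Heat Clinic catalog, translations, ...)
import.sql.enabled=false
```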
I want to add my project's endpoint in the project TearDown script. What is the syntax to get the endpoint for all requests and test requests, given that the user will assign their endpoint via "all requests" and "test requests" before running the project?
I have seen an example using a test step, but I don't want to retrieve it via the test-step route:
testRunner.testCase.getTestStepByName("dd").getHttpRequest().getEndpoint();
The TearDown script can use the log, context, runner, and project variables.
Thanks
Based on the information updated in the question, it looks like you have to access the endpoint in the TearDown script of the project.
It also appears that you would need to execute the same set of tests against different base URLs of the endpoint and domain, and possibly use different credentials accordingly.
Considering the above, it would be easiest to use project-level properties.
Here is how:
Create a project-level custom property for the base URL, say BASE_URL as the property name and http://10.0.0.1:8008 as its value. Of course, change it to the actual value needed for the tests to be executed.
Similarly, create another project-level property for the domain, say DOMAIN_NAME, and set its value according to the test.
Double-click on the service / interface and click on the Service Endpoints tab.
Remove all the existing values.
Add a new endpoint by clicking the + icon.
Add ${#Project#BASE_URL} as the endpoint and ${#Project#DOMAIN_NAME} as the domain value.
If required, use the same approach for the credentials.
Now click the Assign button there and choose the "All requests and tests" option from the dropdown.
Do the same if you have multiple services / interfaces.
How to access the above values in TearDown Script?
log.info "Endpoint : ${project.getPropertyValue('BASE_URL')}"
log.info "Domain : ${project.getPropertyValue('DOMAIN_NAME')}"
When you want to change the domain or base URL, just change the values of the respective project properties before you execute the tests against different servers / environments.
EDIT:
The values for the endpoint or domain can be passed dynamically (without even changing the values saved in the project) from the command line using the SOAPUI_HOME/bin/testrunner utility while executing the tests. For more details, refer to the documentation.
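For example (a sketch; the project file name and the values are placeholders, and the -P flag sets a project property for the run):

```
SOAPUI_HOME/bin/testrunner.sh -PBASE_URL=http://10.0.0.1:8008 -PDOMAIN_NAME=example.com MyProject.xml
```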
We are already using loopback as our backend server for REST APIs.
Now our product demands multi-tenancy in our system, i.e. a separate database per user.
So after searching for an hour, we found the Loopback-MultiTenancy POC Sample.
The sample looks nice and is exactly what we need, though there are some issues we are facing using this POC, also at the architecture level.
The POC creates separate folders per tenant. Each tenant folder has its own config, its own datasources, and its own models, which is fine. But in our case, all users share common models.
So whenever a new user is created, we would have to create a new tenant folder and copy all the models into it, either manually or using some script.
But when we have hundreds of users and, say, want to change a particular model schema, we would have to reflect the change in all the other tenant folders too, which is very troublesome for us.
So we are looking for a better solution that avoids this duplication and still serves the purpose, within LoopBack.
We are kind of stuck , need some help or advice.
Thanks,
Following is my requirement :
Whenever a site is created, we add some custom attributes to it with the help of a GroupListener.
So assume you are creating a "Liferay Test" site; it will then have some fixed custom attribute "XYZ" whose value is set in the GroupListener's onAfterCreate method.
I can see this value in custom fields under site settings.
Now, based on these values, we create groups in another system (outside Liferay, using web services).
So far so good.
Whenever we delete the site, we need to remove the equivalent groups from the other system via the web service.
But while deleting the site, we are not able to retrieve the custom attributes in the GroupListener.
On further debugging, by adding an expando listener, I observed that the Expando listeners are called first, and only then the delete method of GroupLocalService/GroupListener.
Hence we are not able to delete the groups present in the other system.
So I was wondering if we can define an ordering for the listeners.
Note: since we were not getting the custom attributes in the listeners, we extended GroupLocalServiceImpl; with this we do get the custom attributes in the delete method on our local environment, but not on our stage environment, which uses clustering.
You shouldn't use ModelListeners for this kind of change; instead, create ServiceWrappers, e.g. wrap the interesting methods of GroupLocalService (for creation as well as deletion).
This will also enable you to react to failures when creating records in your external system, etc.
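A sketch of such a wrapper (hedged: package names and the exact deleteGroup signature vary between Liferay versions, and the "XYZ" attribute plus the notifyExternalSystem helper are made up):

```java
// Sketch: wrap GroupLocalService so the custom attribute is still
// readable before the group (and its expando rows) are deleted.
import com.liferay.portal.kernel.exception.PortalException;
import com.liferay.portal.kernel.model.Group;
import com.liferay.portal.kernel.service.GroupLocalService;
import com.liferay.portal.kernel.service.GroupLocalServiceWrapper;

public class MyGroupLocalServiceWrapper extends GroupLocalServiceWrapper {

    public MyGroupLocalServiceWrapper(GroupLocalService groupLocalService) {
        super(groupLocalService);
    }

    @Override
    public Group deleteGroup(Group group) throws PortalException {
        // Read the custom attribute while the group still exists.
        Object xyz = group.getExpandoBridge().getAttribute("XYZ");

        // Call the external web service first; if it fails, throwing
        // here aborts the deletion, so the two systems stay in sync.
        // notifyExternalSystem(xyz);  // hypothetical helper

        return super.deleteGroup(group);
    }
}
```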