I'm developing a database DSL which has multiple backends. To avoid forcing unwanted dependencies upon users, the DSL is split up into a "core" package, containing the DSL itself, and one package for each backend. The backend packages all depend on the core package, as it defines the API each backend needs to provide.
Now, I want to add a test suite for my DSL. Since most of the functionality being tested lives in the core package, that's where I want to put the test suite. However, in order to actually run any tests, at least one backend is needed. This means that the test suite depends on both the core package and a backend package, but the backend package in turn depends on the core package, creating a circular dependency.
The obvious solution is to create yet another package for the tests only which depends on both the core and a backend, or to move the backend API into its own package which the core and backend packages can depend on (allowing the backends to not depend on the core package). However, if possible I'd like to keep the package structure as it is, and have the test suite as part of the core package.
Is this possible?
One possible solution that keeps the package layout intact is to have the core package provide a test suite module, e.g. YourDSLLib.Tests, implementing the common tests generically for anything that satisfies the defined API.
You could then add a very simple test to each backend, which just calls the testing function in YourDSLLib.Tests.
An advantage is that you can maintain the common testing code in one place, while still customizing tests per backend if necessary.
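A minimal sketch of this idea, with all names hypothetical: a Backend class stands in for whatever API the core package defines, the generic test suite lives in the core (e.g. YourDSLLib.Tests), and each backend package only has to hand its own type to the shared tests. A toy in-memory backend plays the role of a real one here.

```haskell
module Main where

import Control.Monad (unless)

-- Stand-in for the API the core package defines for its backends.
class Backend b where
  runQuery :: b -> String -> IO [String]

-- Generic test suite living in the core package (e.g. YourDSLLib.Tests):
-- written once against the API, reusable by every backend.
backendTests :: Backend b => b -> IO ()
backendTests b = do
  rows <- runQuery b "SELECT name FROM empty_table"
  unless (null rows) (error "expected no rows from an empty table")
  putStrLn "common DSL tests passed"

-- A toy in-memory backend standing in for a real one; in each backend
-- package the test executable would just call backendTests on its type.
data MemoryBackend = MemoryBackend

instance Backend MemoryBackend where
  runQuery _ _ = pure []

main :: IO ()
main = backendTests MemoryBackend
```

Each backend package's test suite then shrinks to a one-liner that applies the shared tests to its own backend value.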
In our setup we are building and deploying our UI5 app as an embedded static resource within our Spring Boot Maven-based application. During the CI build with the SAP Cloud SDK pipeline, the frontend tests are however not being executed.
Looking at the pipeline code, it seems to me that those stages are only executed for HTML5 modules and not for Java modules. However, the npm modules should be available as they are collected during initialization stage as far as I can see.
So my question is whether there is a way to execute the frontend tests in this scenario as well, or, if not, whether this is intentionally not done due to other constraints I am not aware of.
For projects using MTA/Cloud Application Programming Model this is correct. Currently, we expect only html5 modules to contain frontends and the corresponding tests. The reason is that MTA brings that structure by default and there have been no other requests for this yet. However, as this also looks like a valid setup, we will discuss whether to implement it in one of the future releases. You are also invited to create pull requests.
If you are using a plain Maven project generated with the SAP Cloud SDK, you can use this setup with the frontend embedded in the webapp folder. In this case, you only need to configure the npm script ci-frontend-unit-test in the package.json in the root of the project.
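A minimal package.json fragment in the project root might look like this; the karma command is only an illustration, so point the script at whatever runner your frontend tests actually use:

```json
{
  "scripts": {
    "ci-frontend-unit-test": "karma start karma.conf.js --single-run"
  }
}
```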
My boss asked me to delete unused files in my project, including ExampleUnitTest and ExampleInstrumentedTest. What are these files used for, and will it be a problem if I delete them?
The simple answer is: no, there is no problem if you delete these files.
You can evaluate your app's logic using local unit tests when you need to run tests more quickly and don't need the fidelity and confidence associated with running tests on a real device. With this approach, you normally fulfill your dependency relationships using either Robolectric or a mocking framework, such as Mockito. Usually, the types of dependencies associated with your tests determine which tool you use:
If you have dependencies on the Android framework, particularly those that create complex interactions with the framework, it's better to include framework dependencies using Robolectric.
If your tests have minimal dependencies on the Android framework, or if the tests depend only on your own objects, it's fine to include mock dependencies using a mocking framework like Mockito.
Instrumented unit tests, on the other hand, are tests that run on physical devices and emulators. Instrumented tests provide more fidelity than local unit tests, but they run much more slowly. Therefore, we recommend using instrumented unit tests only in cases where you must test against the behavior of a real device.
I'm new to Node, coming from a Java background. These days I'm experimenting with each part of a full application: database, REST API, UI.
So far I have written the database-backed logic, which runs on its own: it processes text files, stores data about them in the database, and exposes a REST API to query that data. I'm now going to build the UI to navigate that data.
Would it be reasonable having a structure like this:
- (a) main project folder
- (b) backend application (a Restify server responds to REST calls querying the database)
- (c) UI application (an HTTP server serves React static files)
If that makes sense, I would guess that:
(b) has a package.json with server- and rest- related dependencies (i.e. Restify, MongoDB, ...)
(c) has another package.json with dependencies for ui (i.e. React, Webpack, etc, but not Restify or MongoDB)
(a) has a third package.json which takes care of installing each sub-project (I'd say by running npm install through hand-written npm-scripts).
Otherwise, how do you usually handle such Node projects? Do you keep each application completely separate from the rest?
For those who know that tool, this mimics a Maven multi-module project; though that level of automation is not needed, I'd just like to come up with a self-contained package.
These project structures are called monorepos: a single Node project repository that contains multiple packages. There are tools like Lerna to manage them. If you are using Yarn as your package manager, it comes with an experimental workspaces feature.
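With Yarn workspaces, for example, the root package.json (a) could declare the sub-projects, so that a single yarn install at the root installs and links all of them. The folder names here mirror the hypothetical layout from the question; "private": true is required for workspaces:

```json
{
  "name": "main-project",
  "private": true,
  "workspaces": [
    "backend",
    "ui"
  ]
}
```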
I'm used to working with Dart, where sharing types between server and client is as simple as importing the relevant packages into your project.
Can something similar be accomplished with Yesod/Haskell? Should I use GHCJS for the client? Maybe Elm? The goal is not having to worry about the data getting mangled in transit between server and client - and also not having to write a single line of JS. :o)
I haven't been able to find any good, beginner friendly docs on how to best tackle this challenge using Haskell. I suspect I just haven't looked in the right places. Any and all help is more than welcome.
To achieve this with GHCJS you can just build your project out of three core packages in this fashion:
frontend - something based on ghcjs-dom, I like Reflex-dom
backend - use your favorite framework, I like Snap, Yesod should work just the same
shared - code shared between frontend and backend
Where frontend and backend both depend on shared, of course. The frontend is compiled with GHCJS, the backend with GHC.
If you would like to see a complete example, I would highly recommend studying hsnippet. Take a look at WsApi.hs, where a set of upstream and downstream messages is defined. All the JSON instances are derived in one place and imported in both frontend and backend.
Hsnippet uses websockets. This is not a requirement of course. You could use regular XHR in your own app. The principle stays the same. You define your API and serialization instances (usually JSON) in the shared package and import the relevant modules in both frontend and backend.
Personally, I also share validation code, database entity definitions generated with persistent, etc. Once you have it set up, sharing additional code is mostly a matter of copying it into one of the shared modules and importing it wherever it is needed.
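As a self-contained sketch of the shared-package idea: the message types below are illustrative stand-ins for the ones in hsnippet's WsApi.hs, and Show/Read stand in for the JSON instances (e.g. aeson's ToJSON/FromJSON) you would derive in practice. Because both sides import the same types, a round trip cannot silently disagree about the wire format.

```haskell
-- Hypothetical module in the shared package, built by both
-- GHC (backend) and GHCJS (frontend).
module Main where

-- Messages flowing from client to server and back.
data UpMsg   = Ping | Submit String deriving (Show, Read, Eq)
data DownMsg = Pong | Accepted Int  deriving (Show, Read, Eq)

-- Show/Read used here as a stand-in for real JSON serialization.
encode :: Show a => a -> String
encode = show

decode :: Read a => String -> a
decode = read

main :: IO ()
main = do
  let sent     = Submit "hello"
      received = decode (encode sent) :: UpMsg
  -- A round trip through the shared encoding preserves the message.
  print (received == sent)
```

Swapping Show/Read for derived aeson instances changes only the encode/decode bodies; the shared-types structure stays the same.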
I'm new to web dev and have the following questions.
I have a Web Site project with one data context class in the App_Code folder, which contains methods for working with the database (the .dbml schema is also present there) as well as methods that do not directly touch the db. I want to test both kinds of methods using NUnit.
As NUnit works with classes in a .dll or .exe, I understand that I will need to either convert my entire project to a Web Application, or move all of the code I would like to test (i.e. the entire contents of App_Code) into a class library project and reference that class library from the web site project.
If I choose to move the methods to a separate dll, the question is how to test the methods that work with the database:
- Will I have to create a connection to the db in a "setup" method before running each of these tests? Is it correct that there is no need to run the web application in this case?
- Or do I need to run such tests while the web site is running and the connection is established? In that case, how do I set up the project and NUnit?
- Or some other way...
Second, if a method depends on some setup in my .config file, for instance network credentials or SMTP settings, what is the approach to testing such methods?
I will greatly appreciate any help!
The more concrete, the better.
Thanks.
Generally, you should be mocking your database rather than really connecting to it for your unit tests. This means that you provide fake data access class instances that return canned results. Generally you would use a mocking framework such as Moq or Rhino to do this kind of thing for you, but lots of people also just write their own throwaway classes to serve the same purpose. Your tests shouldn't be dependent on the configuration settings of the production website.
There are many reasons for doing this, but mainly it's to separate your tests from your actual database implementation. What you're describing will produce very brittle tests that require a lot of upkeep.
Remember, unit testing is about making sure small pieces of your code work. If you need to test that a complex operation works from the top down (i.e. everything works between the steps of a user clicking something, getting data from a database, and returning it and updating a UI), then this is called integration testing. If you need to do full integration testing, it is usually recommended that you have a duplicate of your production environment - and I mean exact duplicate, same hardware, software, everything - that you run your integration tests against.