I am injecting repositories into a class, and once they're injected I assign my context to each repository so that they share a unit of work.
What I'm trying to figure out is whether there's a way to automatically assign my unit of work to each repository as it is injected, so that a developer doesn't have to think about this when setting up their code. I already have my unit of work configured in the base class that developers will inherit from.
Can I do something like this:
Bind<I>().To<S>().WhenInjectedInto<IBaseClass>((i, b) => { i.UnitOfWork = b.UnitOfWork; });
But not have to repeat that pattern every time?
[UPDATE]
I'm looking for ways to detect whether Ninject is injecting, and what it is injecting into, for example via
https://github.com/ninject/ninject.extensions.interception
I'm trying to look through the tests to see if this is far off base. Any recommendations?
I think the better way would be to inject the context into the repositories using constructor injection. In a web project you can use InRequestScope for the context binding. For a WPF/WinForms/Console application, have a look at Ninject.Extensions.NamedScope: you can define that a single context is used for all dependencies of your IBaseClass.
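As a rough illustration of that suggestion (MyContext, IProductRepository and the scope name are placeholders I've assumed, not types from the question), the bindings could look roughly like this:

using Ninject;
using Ninject.Extensions.NamedScope; // DefinesNamedScope / InNamedScope (Ninject 3.x extension)
using Ninject.Web.Common;            // InRequestScope (Ninject 3.x web extension)

// Placeholder types standing in for the real context and repositories.
public class MyContext { }
public interface IProductRepository { }

public class ProductRepository : IProductRepository
{
    private readonly MyContext _context;

    // The shared context (the unit of work) arrives via the constructor,
    // so the base class never has to assign it by hand.
    public ProductRepository(MyContext context)
    {
        _context = context;
    }
}

public static class Bindings
{
    public static void Configure(IKernel kernel)
    {
        // Web project: one context per HTTP request.
        kernel.Bind<MyContext>().ToSelf().InRequestScope();

        // WPF/WinForms/Console alternative (Ninject.Extensions.NamedScope):
        // kernel.Bind<IBaseClass>().To<SomeBaseClass>().DefinesNamedScope("UnitOfWork");
        // kernel.Bind<MyContext>().ToSelf().InNamedScope("UnitOfWork");

        kernel.Bind<IProductRepository>().To<ProductRepository>();
    }
}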
From what I understand, from SAP Commerce Cloud 2005 onward the way to customize the REST endpoints within SAP Commerce Cloud for Spartacus is to use commercewebservices (the non-template version) and then add your own OCC extensions containing your REST endpoints.
That works fine for new endpoints, but what if I want to customize an existing controller inside commercewebservices? Since I am no longer using the template, commercewebservices itself can no longer be modified. I don't see how I could, for example, customize de.hybris.platform.commercewebservices.core.v2.controller.CartsController.
Swapping out commercewebservices for my own extension generated from the template does not work either, since multiple OOTB extensions (e.g. cmsocc) depend on commercewebservices; it will therefore always be loaded and clash with my own extension derived from it.
Customizing commercewebservices with an AddOn also does not solve the problem since, as I understand it, it is not possible to add your own controller and bind it to a URL pattern already used by a controller within commercewebservices.
If you want to override an existing API endpoint (CartsController in our case), you can do so with the @RequestMappingOverride annotation.
Using this annotation, you can "shadow" the existing request mapping of the out-of-the-box controller with your custom controller in your own OCC extension.
You can find more details and an example here:
Overriding the REST API [help.sap.com]
EDIT
And let's not forget:
All of the action happens in the facades anyway, and you can also extend the API responses without overriding the controller by using the WsDTO concept plus additional converters (see Extending Data Objects [help.sap.com] for more details).
Thanks for the response.
The annotation @RequestMappingOverride works fine. There is one problem with this approach; let's assume I do the following:
Introduce a new controller called MyController extending CartsController
Override a single method and annotate it with @RequestMappingOverride
On startup I now get ambiguous-mapping errors for all the mappings of CartsController that I did not override.
The reason is that I now have two controllers registered with the same mappings: CartsController, and MyController, which inherits all the methods from CartsController that are not overridden. The only solution I found is to override every single method of CartsController, annotate every method with @RequestMappingOverride and then just make a super call. That is a bit clumsy and leads to a lot of boilerplate code. I wish the @RequestMappingOverride annotation worked at class level rather than only at method level.
I need to get access to a service inside my model class. That service has a bunch of injections of its own, so it would be great if I could get an instance of it inside the model class.
Angular, for example, has a reflective injector that lets you do that. I remember seeing an example somewhere of instantiating a service with the help of the NestJS reflector, but I couldn't find it again; it was quite a while ago.
I don't know whether I need to add any code to the question. I just have a model class and need to use a service there, injected by some means other than the class constructor.
How do I do that?
EDIT:
It seems Nest has a Reflector, but I would also have to inject it... my goodness... it doesn't have a reflective injector akin to Angular's.
In Angular you don't have to inject the reflector itself... apparently it's not always suitable to do that...
The Problem
I'm aware of the basic way to create a route/endpoint in ServiceStack using methods named "Get", "Post", "Any", etc. inside a service, but in the particular case I'm working with I have an existing service (which I can make an IService via inheritance) that cannot be retrofitted with ServiceStack attributes and that already uses DTOs for its requests and responses.
This service contains many functions that I do not want to wrap manually (it is a pass-through layer) but that otherwise already conform to ServiceStack's requirements. What I'm wondering is whether there's a way to create these routes manually so that they would work like I've mocked up here. My existing functions and DTOs already contain the information I would need to define the routes, so if this approach is possible it would only require me to enumerate them at initialization time, as opposed to generating the services layer by hand.
I noticed there is an extension method on Routes.Add that takes an Expression, but I was not able to get it working because I believe the underlying code makes assumptions about the kind of Expression generated (LambdaExpression vs. MemberExpression, or something like that). I may also be barking up the wrong tree if that's not the intended purpose of that overload, but I cannot find documentation anywhere on how that variant is supposed to work.
Why?
I'm not sure this is necessary, but to shed some light on why I want to do this as opposed to retrofitting my existing layers: the current code is also used outside of a web service context and is consumed by other code internally. Retrofitting ServiceStack into this layer would make every place that consumes it require ServiceStack's assemblies and be aware of the web service, which is a concern I want kept separate from the lower layers. We were previously using MVC/WCF to accomplish this goal, but we want some of the features available in ServiceStack.
The current architecture looks like this:
data -> DAL -> discrete business logic -> composition -> web service
Hopefully that makes enough sense and I'm not being obtuse. If you would like any more details about what I want to do or why, I'll try to update this post as soon as possible.
Thanks!
You might use the fallback route in order to provide your own routing mechanism.
You can then read the request.Path property and route using your own path-to-function mapping, which can be stored in a simple dictionary.
That said, if you go down this path I don't see much benefit in using ServiceStack; it sounds like you just need an HTTP handler that routes requests to your existing services.
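A minimal sketch of that fallback-route idea, assuming ServiceStack's FallbackRoute attribute and v4+ namespaces; the Passthrough DTO and the Handlers dictionary are illustrative names, not part of the question:

using System;
using System.Collections.Generic;
using ServiceStack;      // FallbackRoute, Service, HttpError (v4+ namespaces assumed)
using ServiceStack.Web;  // IRequest

// Catch-all request DTO: any path not matched by another route lands here.
[FallbackRoute("/{Path*}")]
public class Passthrough
{
    public string Path { get; set; }
}

public class PassthroughService : Service
{
    // Path -> function map, populated once at startup by enumerating the
    // existing service layer's functions and DTOs.
    public static readonly Dictionary<string, Func<IRequest, object>> Handlers =
        new Dictionary<string, Func<IRequest, object>>(StringComparer.OrdinalIgnoreCase);

    public object Any(Passthrough request)
    {
        if (Handlers.TryGetValue("/" + request.Path, out var handler))
            return handler(base.Request); // delegate to the existing function

        throw HttpError.NotFound("No handler registered for /" + request.Path);
    }
}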
We have a working website using ServiceStack as the back end that amounts to a complex data-entry form.
My users have requested an "offline editor" for the forms. To use the offline program, the user will have to connect to the ServiceStack service, create empty instances of the forms, and then I will save the POCOs from the service to disk using ServiceStack's JSON serializer. From there the user can log off the service and edit the POCOs. When they're done, they reconnect to the service, and post/put the edited POCO object.
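For reference, the round trip described above boils down to something like this (a sketch only; SomeForm, the URLs and the file name are placeholders, and ServiceStack v4+ namespaces are assumed):

using System.IO;
using ServiceStack; // JsonServiceClient, ToJson/FromJson extension methods

// Illustrative only: SomeForm stands in for the real form POCO.
public class SomeForm { public int Id { get; set; } }

public static class OfflineRoundTrip
{
    public static void Run()
    {
        var client = new JsonServiceClient("https://example.org/api");

        // Online: fetch an empty form and cache it on disk.
        var form = client.Get<SomeForm>("/forms/new");
        File.WriteAllText("form.json", form.ToJson());

        // Offline: load the cached POCO so the user can edit it.
        var edited = File.ReadAllText("form.json").FromJson<SomeForm>();

        // Back online: submit the edited POCO.
        client.Put<SomeForm>("/forms/" + edited.Id, edited);
    }
}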
This all works great. My question involves validation. The validation logic is built into my Service.Interface library, which isn't available offline. The WinForms program references only the POCO library and the ServiceStack "common" libraries, which do not appear to include the ServiceStack.Validation namespace.
Is there a way I can rearrange my project so that both the service and the Winforms client can run Validation against the POCOs, so that they can have data validation while offline?
UPDATE:
Getting closer, I think: I moved all of the validation classes into their own project. From my WinForms project, I can now manually set up a validator for a POCO class like this:
ServiceStack.FluentValidation.IValidator<SomePOCO> validator =
    new Tonto.Svc.Validation.SomePOCOValidator();

ServiceStack.FluentValidation.Results.ValidationResult vr =
    validator.Validate(_rpt);
I can see the validator constructor being called and the rules being initialized, but the .Validate method doesn't seem to do anything (the object comes back as valid, and breakpoints inside the custom validator code are never hit).
UPDATE #2
I discovered my validator code wasn't running from WinForms because my validators all restrict their rules to a ServiceStack ApplyTo of Put/Post (see the sample code below). When I remove the RuleSet clause entirely, though, validation also happens in my service on GETs, which I never want.
Can anyone think of a way to configure the validator rules to run for POST/PUT only when called from ServiceStack, but to always run when NOT in ServiceStack? So close!
public class SomePOCOValidator : AbstractValidator<SomePOCO>
{
    public SomePOCOValidator()
    {
        RuleSet(ApplyTo.Put | ApplyTo.Post, () =>
        {
            // ... rules ...
        });
    }
}
If your validation is doing anything interesting, then it probably HAS to be done "online".
Maybe just allow your client to save the POCOs locally until they go back online, at which point you send them up to your server. Any transactions that are okay get processed normally, and any that fail get returned for the user to edit (so your client will need some smarts to maintain a working set of POCOs for editing)...
If you don't want ANY extra stuff on the client, just have the transactions that fail validation stuffed into a "needs_corrections" table on the server, and then code up a supervisor-type screen to manage that table.
The validation framework that ServiceStack uses is named FluentValidation. There is no WinForms support in it. Jeremy Skinner, the creator of FluentValidation, answered a question about this back in 2010 on his forum here.
Personally I don't use FV with WinForms - the vast majority of my projects are web-based with the occasional WPF project.
However, if I was going to do this then I probably wouldn't validate the controls directly, but instead use a ViewModel which is bound to the controls. I'd use a fairly strict convention where the names of the controls would match the names of the properties that they're bound to. Then, after validation completes I'd walk the control hierarchy to find the control with the name that matches the property that failed validation (I'm not sure how you'd do this in WinForms, but in WPF I'd use LogicalTreeHelper.FindLogicalNode) and then use the ErrorProvider to set the appropriate error.
Jeremy
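A rough WinForms sketch of the convention Jeremy describes, matching failed property names to control names and surfacing the messages through an ErrorProvider; the helper class and its names are my own assumption, not from this thread:

using System.Linq;
using System.Windows.Forms;
using ServiceStack.FluentValidation;
using ServiceStack.FluentValidation.Results;

public static class FormValidationHelper
{
    // Runs a validator against a (view) model and pushes each failure onto
    // the control whose Name matches the failing property's name.
    public static bool ValidateAndShow<T>(
        IValidator<T> validator, T model, Control root, ErrorProvider errors)
    {
        errors.Clear();
        ValidationResult result = validator.Validate(model);

        foreach (ValidationFailure failure in result.Errors)
        {
            // Convention: control name == property name (e.g. "FirstName").
            Control match = root.Controls
                .Find(failure.PropertyName, searchAllChildren: true)
                .FirstOrDefault();

            if (match != null)
                errors.SetError(match, failure.ErrorMessage);
        }

        return result.IsValid;
    }
}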
I was able to work out a solution that allowed me to use ServiceStack validation libraries on both a ServiceStack client and an offline client. Here are the details.
Move all AbstractValidators to their own project: Proj.Svc.Validation.
Get rid of all RuleSets in your AbstractValidators.
Reference Proj.Svc.Validation from Proj.Svc.Interface and Proj.OfflineWinformsClient projects.
Turn OFF the ValidationFeature() plugin in your service. All validation will have to be done manually. This means no IoC-injected validators in your service classes.
When it's time to validate, either from your service or the offline client, manually declare the validator and use it like this.
ServiceStack.FluentValidation.IValidator<SomePOCO> validator =
    new Tonto.Svc.Validation.SomePOCOValidator();

ServiceStack.FluentValidation.Results.ValidationResult vr =
    validator.Validate(poco);

if (!vr.IsValid)
{
    // throw an exception or notify the user somehow
}
I've been trying to use RedBean ORM (http://redbeanphp.com) to implement UserInterface and UserProviderInterface of the Silex Security Provider Package.
Because of the way the RedBean ORM handles functions for its objects, I've needed to wrap the bean object in another class.
This works great for authentication, but fails tests for Remember Me functionality.
I noticed that somewhere along the chain the Security Package serializes the object.
I thought maybe this was the reason for the error, so I created properties for "id" and "password" in my wrapper class and used the __sleep and __wakeup methods to ignore the bean during sleep and reload it on wakeup. Despite everything seeming to load properly during __wakeup, the test for "Remember Me" functionality is still failing.
I have created a github repository of my code. If anyone has any ideas, I'd much appreciate it!
For some reason RedBean, Silex and PHPUnit aren't allowing themselves to be included in the repository. A simple composer update should pull them down for you. If anyone has any ideas why, I'd appreciate an answer to that as well.
The github repository can be found at:
https://github.com/christianmagill/silex-redbean-security
The applicable files are:
To create the test user in the database:
/setup.php
To run the test:
/index.php
My implementation of UserInterface:
/src/App/Model/UserSecurityWrapper.php
My implementation of UserProviderInterface:
/src/App/Model/UserProvider.php
My modified test:
/src/App/Test/RememberMeRedBeanServiceProviderTest.php
The original test:
/vendor/silex/silex/tests/Silex/Tests/Provider/RememberMeServiceProviderTest.php
The problem was with my custom UserProvider's supportsClass method. I was not taking namespacing into account. It seems like this function is not called for basic authentication, but is needed for the remember me provider.