How do you connect a software component to an RTE-generated function in DaVinci Developer? (AUTOSAR)

I have a generated RTE which contains functions such as did_read, did_checkconditions and did_write. I want to create a software component and connect it to the generated RTE (RTE_UNCONNECTED). I know that I can manually change the code in the RTE and make it work, but then I have to redo those changes every time I regenerate the RTE. How do I connect to/access RTE-generated functions from a software component in DaVinci Developer?
I have tried looking through Vector's documentation.
I have defined a software component in the Developer and instantiated it by creating a component prototype. I have connected the service ports to the created prototype in the Configurator. I created a dummy_init runnable and added it to the OS tasks so that I can generate code. I need other runnables to respond to requests such as did_read and did_write.
I need to connect the RTE and the software component in DaVinci Developer/Configurator so that I am able to send and receive data.

I assume from your did_read, did_write and did_checkconditions that these come out of the Dcm (as service ports) and therefore from the DiagExtract.
For this, you should check which of your SWCs actually provide the DIDs. Then take a look at the generic AUTOSAR Dcm SWS to see how service ports for the relevant elements are defined, and create matching ports in your SWCD so that the functions are generated.
8.8.3.2 DataServices_{Data}
Using the concepts of the SW-C template, the interface is defined as follows if a ClientServer interface is used (DcmDspDataUsePort set to USE_DATA_SYNCH_CLIENT_SERVER or USE_DATA_ASYNCH_CLIENT_SERVER or USE_DATA_ASYNCH_CLIENT_SERVER_ERROR).
{Data} here is instantiated once per DID!
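For illustration, the client/server operations behind such a port typically map onto C prototypes along the following lines. This is only a sketch for a hypothetical DID 0x1234; the exact names and parameter lists depend on your AUTOSAR release and on which synchronous/asynchronous variant you configure, so verify against the Dcm SWS and your generated RTE headers.

/* Sketch: DataServices_{Data} operations for a hypothetical DID 0x1234.
 * Types (Std_ReturnType, Dcm_OpStatusType, ...) come from the RTE/Dcm headers. */
Std_ReturnType DataServices_Data_1234_ConditionCheckRead(
    Dcm_OpStatusType OpStatus,
    Dcm_NegativeResponseCodeType *ErrorCode);

Std_ReturnType DataServices_Data_1234_ReadData(
    Dcm_OpStatusType OpStatus,
    uint8 *Data);

Std_ReturnType DataServices_Data_1234_WriteData(
    const uint8 *Data,
    Dcm_OpStatusType OpStatus,
    Dcm_NegativeResponseCodeType *ErrorCode);

Once ports with these operations exist in your SWCD and are mapped to runnables, the RTE generator emits the glue itself and the manual RTE edits become unnecessary.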


Can I mock a system with custom properties?

We're using the Destinations service to configure connections to different kinds of systems. As part of this, we are using the "Additional Properties" section to add non-standard properties, such as my.custom.property=123.
We have successfully used the SAP Cloud SDK's MockUtil to write Spring integration tests that use the files systems.yml and credentials.yml as source for test systems.
However, we couldn't find a way to create an entry there that would provide a test system with a custom property like my.custom.property=123.
The erp section accepts only the properties known for ERP systems, such as sapClient. The general systems section accepts only the absolute basic properties name, type, uri, and proxy. Adding an unknown property in either section results in a runtime error because the mock utils are unable to parse the unknown property into the data classes with fixed structure.
Is there another way to mock a Destination that would allow us to include non-standard properties?
For example, the DestinationAccessorMocker looks promising, as it seems to enable setting up custom implementations of the Destination interface, but we couldn't figure out how to employ it.
Found an option that works.
// Imports needed (from the SAP Cloud SDK test utilities):
// import com.sap.cloud.sdk.testutil.MockUtil;
// import com.sap.cloud.sdk.testutil.MockDestination;
// import java.net.URI;
MockUtil mockUtil = new MockUtil();
MockDestination destination = MockDestination
    .builder("my-service", URI.create("http://localhost:1234/"))
    .property("my.custom.property", "123")
    .build();
mockUtil.mockDestination(destination);
Can somebody confirm that this is the intended way to do this?
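For what it's worth, here is a hedged sketch of reading the property back in a test. It assumes the SDK v3 DestinationAccessor API, where get(...) returns a Vavr Option; if your SDK version differs, the accessor calls may too:

// import com.sap.cloud.sdk.cloudplatform.connectivity.Destination;
// import com.sap.cloud.sdk.cloudplatform.connectivity.DestinationAccessor;
// import static org.junit.Assert.assertEquals;

// Resolve the mocked destination by name and check the custom property.
Destination destination = DestinationAccessor.getDestination("my-service");
assertEquals("123", destination.get("my.custom.property").get());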

Fetching Initial Data from CloudKit

Here is a common scenario: an app is installed for the first time and needs some initial data. You could bundle the data in the app and load it from a plist or a CSV file, or you could fetch it from a remote store.
I want to get it from CloudKit. Yes, I know that CloudKit is not to be treated as a remote database but rather as a hub. I am fine with that. Frankly, I think this use case is one of the only holes in that strategy.
Imagine I have an object graph I need to fetch that has one class at the base and then 3 or 4 related classes. I want the new user to install the app and then get the latest version of this graph. If I use CloudKit, I have to load each entity with a separate fetch and assemble the whole thing myself. It's ugly and not generic. Once I've done that, I will go into change-tracking mode, listening for updates and syncing my local copy.
In some ways this is similar to the challenge you have using services on Android: suppose I have a service for the weather forecast. When I subscribe to it, I will not get the weather until tomorrow, when it creates its next forecast. To handle this deficiency, the Android SDK allows me to make 'sticky' services, where I can get the last message that service produced upon subscribing.
I am thinking of doing something similar in a generic way: making it possible to hold a snapshot of some object graph, probably in JSON, with a version token, and then for initial loads, just being able to fetch those and turn them into CoreData object graphs locally.
The question is: does this strategy make sense, or should I hold my nose and write pyramid-of-doom code with nested queries? (Don't suggest using Core Data syncing, as that has been deprecated.)
Your question is a bit old, so you probably already moved on from this, but I figured I'd suggest an option.
You could create a record type called Data in the Public database in your CloudKit container. Within Data, you could have a field named structure that is a String (or a CKAsset if you wanted to attach a JSON file).
Then on every app load, you query the public database, pull down the structure string that holds your class definitions, and use it however you like. Since it's in the public database, all your users have access to it. Good luck!
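A minimal Swift sketch of that fetch, assuming a record type named Data with a String field named structure (error handling elided; perform(_:inZoneWith:) is the older convenience API):

import CloudKit

let publicDB = CKContainer.default().publicCloudDatabase
let query = CKQuery(recordType: "Data", predicate: NSPredicate(value: true))

// Pull down the snapshot record and read its JSON payload.
publicDB.perform(query, inZoneWith: nil) { records, error in
    guard error == nil, let record = records?.first else { return }
    if let structure = record["structure"] as? String {
        // Parse the JSON and build the local Core Data graph from it.
        print(structure)
    }
}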

Changing operation flow in Node.js

I am trying to create an ecommerce website in Node.js. I want it to be modular so that we can add extensions later without editing the main codebase. For example, suppose I have an extension which checks whether a user is a requester or an approver: if he is an approver he can check out; otherwise an approval request is sent to the corresponding approver. Suppose I emit an event when a checkout is made; that extension can then catch and process it. But at the same time I want the normal flow to be changed. How can I do that? Should I create a checkout module extending the original checkout module, override its functions, and make sure the extension's module is loaded? If I do that, there will be a problem when two different extensions add features to the same core module. What is the best way to do this?
Generally speaking, there are two widely used ways to extend a web app:
Webhooks
Api
Both have their pros and cons.
What you are trying to do is possible in the hook style, because the code will execute on the server itself and you can extend objects and modify their behavior as you want.
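A minimal sketch of the hook style in plain Node.js (all names here are hypothetical): the core runs a named hook before checkout, every registered extension handler gets a chance to mutate the context or divert the flow, and two extensions can coexist because each one just appends another handler.

// hooks.js - a tiny hook registry (illustrative only)
const hooks = new Map();

function register(name, handler) {
  if (!hooks.has(name)) hooks.set(name, []);
  hooks.get(name).push(handler);
}

async function run(name, context) {
  for (const handler of hooks.get(name) || []) {
    await handler(context);     // handlers may mutate the context
    if (context.stop) break;    // or stop the chain entirely
  }
  return context;
}

// Core checkout consults the hook before running the normal flow.
async function checkout(order, user) {
  const ctx = await run('beforeCheckout', { order, user, stop: false });
  if (ctx.stop) return { status: 'pending-approval' };
  return { status: 'completed' };
}

// An extension diverts non-approvers to an approval step.
register('beforeCheckout', async (ctx) => {
  if (ctx.user.role !== 'approver') {
    ctx.stop = true;            // change the normal flow
    // ...queue an approval request for the corresponding approver here
  }
});

// checkout({ id: 1 }, { role: 'requester' }).then(console.log);
// -> { status: 'pending-approval' }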

Is there a way to link a specific method to a Route in ServiceStack?

The Problem
I'm aware of the basic way to create a route/endpoint in ServiceStack using methods with names like "Get", "Post", "Any", etc. inside a service, but in the particular case I'm working with, I have an existing service (which I can make an IService via inheritance) that cannot be retrofitted with ServiceStack attributes and currently uses DTOs for the requests and responses.
This service contains many functions that I do not want to wrap manually (as this is a pass-through layer) but that otherwise already conform to ServiceStack's requirements. What I'm wondering is whether there's a way to create these routes manually, in the way I've mocked up here. My existing functions and DTOs already contain the information I would need to define the routes, so if this approach is possible it would only require me to enumerate them at initialization time, as opposed to generating the services layer manually.
I noticed there is an extension method on Routes.Add that takes an Expression parameter, but I was not able to get that working because I believe the underlying code makes assumptions about the kind of Expression generated (LambdaExpression vs. MemberExpression, or something like that). I may also be barking up the wrong tree if that's not the intended purpose of that overload, but I cannot find documentation anywhere on how that variant is supposed to work.
Why?
I'm not sure this is necessary, but to shed some light on why I want to do this as opposed to retrofitting my existing layers: the current code is also used outside of a web service context and is consumed by other code internally. Retrofitting ServiceStack into this layer would make every place that consumes it require ServiceStack's assemblies and be aware of the web service, which is a concern I want kept separate from the lower-level code. We were previously using MVC/WCF to accomplish this goal, but we want some of the features available in ServiceStack.
The current architecture looks like this:
data -> DAL -> discrete business logic -> composition -> web service
Hopefully that makes enough sense and I'm not being obtuse. If you would like any more details about what I want to do or why I'll try to update this post as soon as possible.
Thanks!
You might use the fallback route in order to provide your own routing mechanism.
Then you get the request.Path property and route using your own mapping of path to function, which can be stored in a simple dictionary.
Anyway, if you go down this path I don't see much benefit in using ServiceStack. It seems you just need an HTTP handler that routes requests to existing services.
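For reference, here is a rough sketch of that fallback-route idea (the DTO, dictionary, and handler names are all hypothetical; the map would be filled from your existing functions and DTOs at initialization time):

using System;
using System.Collections.Generic;
using ServiceStack;

// Catches every path that no other route claims.
[FallbackRoute("/{Path*}")]
public class Fallback
{
    public string Path { get; set; }
}

public class FallbackService : Service
{
    // path -> existing function, populated once at startup.
    public static readonly Dictionary<string, Func<object>> PathMap =
        new Dictionary<string, Func<object>>();

    public object Any(Fallback request)
    {
        Func<object> handler;
        if (request.Path != null && PathMap.TryGetValue(request.Path, out handler))
            return handler();
        throw HttpError.NotFound("No handler registered for " + request.Path);
    }
}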

re-using ServiceStack validation in Winforms offline client

We have a working website using ServiceStack as the back end that amounts to a complex data-entry form.
My users have requested an "offline editor" for the forms. To use the offline program, the user will have to connect to the ServiceStack service, create empty instances of the forms, and then I will save the POCOs from the service to disk using ServiceStack's JSON serializer. From there the user can log off the service and edit the POCOs. When they're done, they reconnect to the service, and post/put the edited POCO object.
This all works great. My question involves validation. The validation logic is built into my Service.Interface library, which isn't available offline. The WinForms program references only the POCO library and the ServiceStack "common" libraries, which do not appear to include the ServiceStack.Validation namespace.
Is there a way I can rearrange my project so that both the service and the Winforms client can run Validation against the POCOs, so that they can have data validation while offline?
UPDATE:
Getting closer, I think. I moved all of the validation classes into their own project. From my WinForms project, I can now manually set up a validator for a POCO class like this:
ServiceStack.FluentValidation.IValidator<SomePOCO> validator =
    new Tonto.Svc.Validation.SomePOCOValidator();
ServiceStack.FluentValidation.Results.ValidationResult vr =
    validator.Validate(_rpt);
I can see the validator constructor being called and the rules being initialized, but the .Validate method doesn't seem to do anything (the object comes back as valid, and breakpoints in the custom validator code are never hit).
UPDATE #2
I discovered my validator code wasn't running from WinForms because my validators all apply their rules to ServiceStack's Put/Post only (see sample code below). When I remove the entire RuleSet clause, though, validation also happens in my service on GETs, something I never want.
Can anyone think of a way to configure the validator rules to run for POST/PUT only when called from ServiceStack, but to always run when NOT in ServiceStack? So close!
public class SomePOCOValidator : AbstractValidator<SomePOCO>
{
    public SomePOCOValidator()
    {
        RuleSet(ApplyTo.Put | ApplyTo.Post, () =>
        {
            // (rules)
        });
    }
}
If your validation is doing anything interesting, then it probably HAS to be done "online".
Maybe just allow your client to save the POCOs locally until they go back online, at which point you send them up to your server. Any transactions that are okay get processed normally, and any that fail get returned for the user to edit (so your client will need some smarts to maintain a working set of POCOs for editing)...
If you don't want ANY extra stuff on the client, just have the transactions that fail validation get stuffed into a "needs_corrections" table on the server, and then code up a supervisor-type screen to manage that table.
The validation framework that ServiceStack uses is called FluentValidation. There is no WinForms support in it. Jeremy Skinner, the creator of FluentValidation, answered a question about this back in 2010 on his forum here.
Personally I don't use FV with WinForms - the vast majority of my projects are web-based with the occasional WPF project.
However, if I was going to do this, then I probably wouldn't validate the controls directly, but would instead use a ViewModel that is bound to the controls. I'd use a fairly strict convention where the names of the controls match the names of the properties they're bound to. Then, after validation completes, I'd walk the control hierarchy to find the control whose name matches the property that failed validation (I'm not sure how you'd do this in WinForms, but in WPF I'd use LogicalTreeHelper.FindLogicalNode) and then use the ErrorProvider to set the appropriate error.
Jeremy
I was able to work out a solution that allowed me to use ServiceStack validation libraries on both a ServiceStack client and an offline client. Here are the details.
Move all AbstractValidators to their own project: Proj.Svc.Validation.
Get rid of all RuleSets in your AbstractValidators.
Reference Proj.Svc.Validation from the Proj.Svc.Interface and Proj.OfflineWinformsClient projects.
Turn OFF the ValidationFeature() plugin in your service. All validation will have to be done manually. This means no IoC-injected validators in your service classes.
When it's time to validate, either from your service or from the offline client, manually declare the validator and use it like this:
IValidator<SomePOCO> validator =
    new Tonto.Svc.Validation.SomePOCOValidator();
ServiceStack.FluentValidation.Results.ValidationResult vr =
    validator.Validate(poco);
if (!vr.IsValid)
{
    // throw an exception or notify the user somehow
}
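As a hedged illustration of that last step on the service side: with the ValidationFeature plugin off, only the write methods ever invoke the validator, which restores the POST/PUT-only behavior the RuleSets used to provide (the exception handling below is a placeholder, not ServiceStack's own error mapping):

public object Post(SomePOCO request)
{
    var validator = new Tonto.Svc.Validation.SomePOCOValidator();
    var vr = validator.Validate(request);
    if (!vr.IsValid)
    {
        // Placeholder: surface vr.Errors however your service reports
        // validation failures (throw, or return an error response).
        throw new Exception(vr.Errors[0].ErrorMessage);
    }
    // ...normal POST processing. Get() methods simply never call the
    // validator, so GETs are never validated.
    return request;
}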
