How to make ServiceStack serialize an implicit string overload the way I want it to? - servicestack

I have a small class which I am using to make sure the strings sent and received by a service remain URL safe without additional encoding (see below).
Ideally I would like to just apply this type to my DTOs and have ServiceStack be smart enough to use the implicit operators.
public class MyDto {
Base64UrlString myString;
}
var dto = new MyDto { myString = "hello i am url safe" };
On the client this is received as myString: {}
Is there a more elegant way to do this? I had hoped that applying a type this way would "just work".
// used only for Base64UrlEncoder
using Microsoft.IdentityModel.Tokens;
namespace MyDto.ServiceModel.Types
{
public class Base64UrlString
{
private readonly string _base64UrlString;
public Base64UrlString(string str)
{
_base64UrlString = Base64UrlEncoder.Encode(str);
}
public static implicit operator string(Base64UrlString base64UrlString) => base64UrlString.ToString();
public static implicit operator Base64UrlString(string str) => new(str);
public override string ToString() => Base64UrlEncoder.Decode(_base64UrlString);
}
}

You'll need to change your class to a struct to make use of the custom struct behavior you're trying to use in your example.
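As a minimal sketch, the struct version might look like the following. It assumes ServiceStack.Text's convention (as I understand it) of serializing custom structs via their ToString() method and deserializing them via a static Parse(string) method:
// used only for Base64UrlEncoder
using Microsoft.IdentityModel.Tokens;
public struct Base64UrlString
{
    private readonly string _base64UrlString;
    public Base64UrlString(string str)
    {
        _base64UrlString = Base64UrlEncoder.Encode(str);
    }
    public static implicit operator string(Base64UrlString value) => value.ToString();
    public static implicit operator Base64UrlString(string str) => new(str);
    // ServiceStack.Text uses ToString() when serializing a custom struct...
    public override string ToString() => Base64UrlEncoder.Decode(_base64UrlString);
    // ...and looks for a static Parse(string) method when deserializing it.
    public static Base64UrlString Parse(string str) => new(str);
}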
Also, the ServiceStack.Text serializers only serialize public properties by default, so your DTO should use public properties:
public class MyDto {
public Base64UrlString MyString { get; set; }
}
Alternatively you can configure it to serialize public fields with:
JsConfig.Init(new Config {
IncludePublicFields = true
});
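A rough usage check, assuming ServiceStack.Text's ToJson()/FromJson<T>() extension methods (the exact config shape may differ between ServiceStack versions):
using ServiceStack.Text;

// Only needed if you keep public fields instead of public properties.
JsConfig.Init(new Config { IncludePublicFields = true });

var dto = new MyDto { MyString = "hello i am url safe" };
var json = dto.ToJson();                   // MyString is emitted as a plain JSON string
var roundTripped = json.FromJson<MyDto>();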

Related

Haxe: Native Interface properties implementable?

I get these compile-time errors when I make a class implement an interface with properties that are already defined in a native base class such as openfl.display.Sprite. It occurs when I'm targeting Flash, not JS.
Field get_someValue needed by SomeInterface is missing
Field set_someValue needed by SomeInterface is missing
Field someValue has different property access than in SomeInterface (var should be (get,set))
In contrast, there's no problem with interface definitions of 'native' methods or 'non-native' properties. Those work.
Do I have to avoid this (not so typical) use of interfaces with Haxe and rewrite my code? Or is there a way to work around this problem?
Thanks in advance.
Example:
class NativePropertyInterfaceImplTest
{
public function new()
{
var spr:FooSprite = new FooSprite();
spr.visible = !spr.visible;
}
}
class FooSprite extends Sprite implements IFoo
{
public function new()
{
super();
}
}
interface IFoo
{
public var visible (get, set):Bool; // Cannot use this ):
}
TL;DR
You need to use a slightly different signature on the Flash target:
interface IFoo
{
#if flash
public var visible:Bool;
#else
public var visible (get, set):Bool;
#end
}
Additional Information
Haxe get and set imply that get_property():T and set_property(value:T):T both exist. OpenFL uses this syntax for many properties, including displayObject.visible.
Core ActionScript VM classes (such as Sprite) don't use Haxe get/set; their properties are native. This is why they look different.
Overriding Core Properties
If you ever need to override core properties like this, here is an example of how you would do so for both Flash and other targets on OpenFL:
class CustomSprite extends Sprite {
private var _visible:Bool = true;
public function new () {
super ();
}
#if flash
@:getter(visible) private function get_visible ():Bool { return _visible; }
@:setter(visible) private function set_visible (value:Bool):Void { _visible = value; }
#else
private override function get_visible ():Bool { return _visible; }
private override function set_visible (value:Bool):Bool { return _visible = value; }
#end
}
Overriding Custom Properties
This is not needed for custom properties, which are the same on all platforms:
class BaseClass {
public var name (default, set):String;
public function new () {
}
private function set_name (value:String) {
return this.name = value;
}
}
class SuperClass extends BaseClass {
public function new () {
super ();
}
private override function set_name (value:String):String {
return this.name = value + " Q. Public";
}
}
You need to provide the method signatures in the interface; currently it's just a property declaration.
The error message says it all.
Field get_someValue needed by SomeInterface is missing
Field set_someValue needed by SomeInterface is missing
Hopefully that helps.

Update a mixed type activity in GetStream.IO using the java stream client loses the type attribute

I have taken the MixedType example code that ships with the Java stream client (https://github.com/GetStream/stream-java) and added an update step using updateActivities. After the update, the activity stored in Stream loses the 'type' attribute, which Jackson relies on to deserialize the activities when you fetch them again.
So I get:
Exception in thread "main" Disconnected from the target VM, address: '127.0.0.1:60016', transport: 'socket'
com.fasterxml.jackson.databind.JsonMappingException: Could not resolve type id 'null' into a subtype of [simple type, class io.getstream.client.apache.example.mixtype.MixedType$Match]
at [Source: org.apache.http.client.entity.LazyDecompressingInputStream#29ad44e3; line: 1, column: 619] (through reference chain: io.getstream.client.model.beans.StreamResponse["results"]->java.util.ArrayList[1])
at com.fasterxml.jackson.databind.JsonMappingException.from(JsonMappingException.java:148)
at com.fasterxml.jackson.databind.DeserializationContext.unknownTypeException(DeserializationContext.java:849)
See here where I have updated the example:
https://github.com/puntaa/stream-java/blob/master/stream-repo-apache/src/test/java/io/getstream/client/apache/example/mixtype/MixedType.java
Any idea what is going on here?
The issue here originates with Jackson, which cannot determine the actual instance type of an object inside the collection because of Java type erasure. If you want to know more about it, please read this issue: https://github.com/FasterXML/jackson-databind/issues/336 (which also describes some possible workarounds).
The easiest way to solve it is to manually force the value of the type property from within each subclass, as shown in the example below:
@JsonTypeInfo(use = JsonTypeInfo.Id.NAME, include = JsonTypeInfo.As.PROPERTY, property = "type", visible = true)
@JsonSubTypes({
@JsonSubTypes.Type(value = VolleyballMatch.class, name = "volley"),
@JsonSubTypes.Type(value = FootballMatch.class, name = "football")
})
static abstract class Match extends BaseActivity {
protected String type; // protected so subclasses can assign it via super.type
public String getType() {
return type;
}
}
static class VolleyballMatch extends Match {
private int nrOfServed;
private int nrOfBlocked;
public VolleyballMatch() {
super.type = "volley";
}
public int getNrOfServed() {
return nrOfServed;
}
public void setNrOfServed(int nrOfServed) {
this.nrOfServed = nrOfServed;
}
public void setNrOfBlocked(int nrOfBlocked) {
this.nrOfBlocked = nrOfBlocked;
}
public int getNrOfBlocked() {
return nrOfBlocked;
}
}
static class FootballMatch extends Match {
private int nrOfPenalty;
private int nrOfScore;
public FootballMatch() {
super.type = "football";
}
public int getNrOfPenalty() {
return nrOfPenalty;
}
public void setNrOfPenalty(int nrOfPenalty) {
this.nrOfPenalty = nrOfPenalty;
}
public int getNrOfScore() {
return nrOfScore;
}
public void setNrOfScore(int nrOfScore) {
this.nrOfScore = nrOfScore;
}
}

Invalid signature for SetUp or TearDown method - What am I doing wrong?

I am trying to do some dependency injection for my tests using NUnit. I'm new to TDD and NUnit, so it's possible I am missing something simple. Basically, I've created a SetUp method for my interfaces. I originally used a constructor, but I read that doing so is bad practice in TDD, so I'm now using a method.
When I run my test, I construct an object, assign it to the interface, and then call a method through that interface. I want to test whether it can parse a decimal from a string.
When I run the test, it fails with the message: Invalid signature for SetUp or TearDown method
See below for the actual code:
public class DonorTests
{
private IDonor _Donor;
private IValidateInput _ValidInput;
//DonorTests(IDonor donor, IValidateInput validInput)
//{
// _Donor = donor;
// _ValidInput = validInput;
//}
[SetUp]
void Setup(IDonor donor, IValidateInput validInput)
{
_Donor = donor;
_ValidInput = validInput;
}
[Test]
public void HandleStringNotDecimal()
{
_ValidInput = new ValidateInput();
Assert.IsTrue(_ValidInput.IsDecimal("3445.3450"));
}
}
The class that implements this interface:
public class ValidateInput : IValidateInput
{
public decimal RoundTwoDecimalPlaces(decimal amount)
{
return Math.Round(amount);
}
public bool IsDecimal(string amount)
{
decimal ParsedDecimal;
return Decimal.TryParse(amount, out ParsedDecimal);
}
public decimal ConvertToString(string value)
{
decimal ParsedDecimal;
Decimal.TryParse(value, out ParsedDecimal);
return ParsedDecimal;
}
}
You were previously injecting dependencies using constructor injection, right? You won't be able to perform dependency injection through a method decorated with SetUpAttribute, because such a method has to be parameterless. The Setup method also has to be public; see this SO thread.
Here is how we typically deal with similar situations in our company:
[TestFixture]
public class DonorTests
{
private IDonor _Donor;
private IValidateInput _ValidInput;
[SetUp]
public void Setup()
{
_Donor = new Donor();
_ValidInput = new ValidateInput();
}
[Test]
public void HandleStringNotDecimal()
{
Assert.IsTrue(_ValidInput.IsDecimal("3445.3450"));
}
}
Alternatively, if construction of ValidInput and Donor is cheap, we simply create a new instance for each test, with a dedicated method for that purpose, so that when we decide to test another implementation of IValidateInput it is enough to change it in one place:
[TestFixture]
public class DonorTests
{
[Test]
public void HandleStringNotDecimal()
{
var validInput = CreateValidateInput();
Assert.IsTrue(validInput.IsDecimal("3445.3450"));
}
private static IValidateInput CreateValidateInput()
{
return new ValidateInput();
}
}
Besides the cause mentioned in the accepted answer, I have hit the same error when leaving the method non-public (private or protected).
NUnit relies on reflection and does not pick up non-public methods, so special methods (i.e. those decorated with NUnit-specific attributes) must be public.

How does one extend MEF to create objects based on a factory type provided as an attribute?

Consider the following existing classes which uses MEF to compose Consumer.
public interface IProducer
{
void Produce();
}
[Export(typeof(IProducer))]
public class Producer : IProducer
{
public Producer()
{
// perform some initialization
}
public void Produce()
{
// produce something
}
}
public class Consumer
{
[Import]
public IProducer Producer
{
get;
set;
}
[ImportingConstructor]
public Consumer(IProducer producer)
{
Producer = producer;
}
public void DoSomething()
{
// do something
Producer.Produce();
}
}
However, the creation of Producer has become complex enough that it can no longer be done within the constructor and the default behavior no longer suffices.
I'd like to introduce a factory and register it using a custom FactoryAttribute on the producer itself. This is what I have in mind:
[Export(typeof(IProducer))]
[Factory(typeof(ProducerFactory))]
public class Producer : IProducer
{
public Producer()
{
// perform some initialization
}
public void Produce()
{
// produce something
}
}
[Export]
public class ProducerFactory
{
public Producer Create()
{
// Perform complex initialization
return new Producer();
}
}
public class FactoryAttribute : Attribute
{
public Type ObjectType
{
get;
private set;
}
public FactoryAttribute(Type objectType)
{
ObjectType = objectType;
}
}
If I had to write the "new" code myself, it might look as follows. It would use the factory attribute, if present, to create the part, or fall back to MEF's default creation otherwise.
public object Create(Type partType, CompositionContainer container)
{
var attribute = (FactoryAttribute)partType.GetCustomAttributes(typeof (FactoryAttribute), true).FirstOrDefault();
if (attribute == null)
{
var result = container.GetExports(partType, null, null).First();
return result.Value;
}
else
{
var factoryExport = container.GetExports(attribute.ObjectType, null, null).First();
var factory = factoryExport.Value;
var method = factory.GetType().GetMethod("Create");
var result = method.Invoke(factory, new object[0]);
container.ComposeParts(result);
return result;
}
}
There are a number of articles on how to implement an ExportProvider, including:
MEF + Object Factories using Export Provider
Dynamic Instantiation
However, the examples are not ideal when
The application has no dependencies or knowledge of Producer, only IProducer. It would not be able to register the factory when the CompositionContainer is created.
Producer is reused by several applications and a developer may mistakenly forget to register the factory when the CompositionContainer is created.
There are a large number of types that require custom factories and it may pose a maintenance nightmare to remember to register factories when the CompositionContainer is created.
I started to create an ExportProvider (assuming this would provide the means to implement construction via a factory).
public class FactoryExportProvider : ExportProvider
{
protected override IEnumerable<Export> GetExportsCore(ImportDefinition definition,
AtomicComposition atomicComposition)
{
// What to do here?
}
}
However, I'm having trouble understanding how to tell MEF to use the factory objects defined in the FactoryAttribute, and use the default creation mechanism if no such attribute exists.
What is the correct manner to implement this? I'm using MEF 2 Preview 5 and .NET 4.
You can make use of a property export:
public class ProducerExporter
{
[Export]
public IProducer MyProducer
{
get
{
var producer = new Producer();
// complex initialization here
return producer;
}
}
}
Note that the term factory isn't really appropriate for your example; I would reserve that term for the case where the importer wants to create instances at will, possibly by providing one or more parameters. That could be done with a method export:
public class ProducerFactory
{
[Export(typeof(Func<Type1,Type2,IProducer>))]
public IProducer CreateProducer(Type1 arg1, Type2 arg2)
{
return new Producer(arg1, arg2);
}
}
On the import side, you would then import a Func<Type1,Type2,IProducer> that you can invoke at will to create new instances.
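For illustration, the import side might look roughly like this (a minimal sketch; Type1 and Type2 are the placeholder parameter types from the export above, and ProducerConsumer is a hypothetical consumer):
public class ProducerConsumer
{
    private readonly Func<Type1, Type2, IProducer> _createProducer;

    // MEF satisfies this import from the Func<Type1,Type2,IProducer> method export above.
    [ImportingConstructor]
    public ProducerConsumer(Func<Type1, Type2, IProducer> createProducer)
    {
        _createProducer = createProducer;
    }

    public void DoSomething(Type1 arg1, Type2 arg2)
    {
        // Create a new producer on demand with caller-supplied arguments.
        var producer = _createProducer(arg1, arg2);
        producer.Produce();
    }
}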

Can/Should a domain object be responsible for converting itself to another type?

We have a class Event (it's actually named differently, but I'm abstracting it here):
public class Event
{
public string Name { get; set; }
public string Description { get; set; }
public EventType EventType { get; set; }
}
We need to build an instance of a Message class with this object, but depending on the EventType, we use a different builder:
switch (event.EventType)
{
case EventType.First:
message = FirstMessageBuilder.Build(event);
break;
case EventType.Second:
message = SecondMessageBuilder.Build(event);
break;
}
Do you think this is acceptable, or should we take the following approach:
Make an abstract class:
public abstract class Event
{
public string Name { get; set; }
public string Description { get; set; }
public abstract Message BuildMessage();
}
Then derive two classes: class FirstMessage and class SecondMessage and make the domain objects responsible for building the message.
I hope it isn't too abstract. The bottom line is we need to transform one class to another. A simple mapper won't do, because there are properties with XML content and such (due to a legacy application making the events). Just accept what we're trying to do here.
The real question is: can a domain object be responsible for such a transformation, or would you not recommend it? I would avoid the ugly switch statement, but add complexity somewhere else.
Whilst I agree with Thomas, you might want to look at the following design patterns to see if they help you:
Visitor Pattern
Double-Dispatch Pattern
Builder Pattern
Strictly speaking, a domain object shouldn't be responsible for anything other than representing the domain. "Changing type" is clearly a technical issue and should be done by some kind of service class, to maintain a clear separation of concerns...
In order to gain the readability of
var message = eventInstance.AsMessage();
as well as following the single responsibility principle, you could define AsMessage() as an extension method on the event type.
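A minimal sketch of such an extension method, reusing the FirstMessageBuilder/SecondMessageBuilder types from the question (names are illustrative):
using System;

public static class EventExtensions
{
    // Keeps the conversion out of the Event domain class while still
    // reading naturally at the call site: eventInstance.AsMessage()
    public static Message AsMessage(this Event eventInstance)
    {
        switch (eventInstance.EventType)
        {
            case EventType.First:
                return FirstMessageBuilder.Build(eventInstance);
            case EventType.Second:
                return SecondMessageBuilder.Build(eventInstance);
            default:
                throw new ArgumentOutOfRangeException(nameof(eventInstance.EventType));
        }
    }
}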
There are a few possible solutions. One is to use an abstract factory:
public interface IMessageFactory
{
Message Create();
}
public class FirstMessageFactory : IMessageFactory
{
public Message Create()
{
//...
}
}
public class SomeService
{
private readonly IMessageFactory _factory;
public SomeService(IMessageFactory factory)
{
_factory = factory;
}
public void DoSomething()
{
var message = _factory.Create();
//...
}
}
Now you can wire up the IoC container to supply the right factory for the requested service.
Another is to use an assembler that performs the transformation:
public interface IAssembler<TSource, TDestination>
{
TDestination Transform(TSource source);
}
This is quite similar to the factory pattern, but if you depend on EventType, it's possible to do it like this:
public interface IAssembler<TEventType>
{
object Transform(object source);
}
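As a sketch, one assembler per event type could simply delegate to the builders the question already has, with the IoC container choosing which assembler to use (types reused from the question; the class name is illustrative):
public class FirstEventAssembler : IAssembler<Event, Message>
{
    // Delegates to the existing builder for EventType.First events;
    // a SecondEventAssembler would do the same with SecondMessageBuilder.
    public Message Transform(Event source)
    {
        return FirstMessageBuilder.Build(source);
    }
}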
I would encapsulate the logic into a separate Factory/Builder class, and use an extension method on Event to call the builder.
This would give you the best of both worlds.
