When programming in Swift, what are the advantages and disadvantages for using a struct to hold all your Constants?
Normally I would go for a class, but Swift classes don't support stored type properties yet. I'd like to write my code in the form of Sample A rather than Sample B, purely for readability, fewer lines of code, and because the single statement does what it reads as (a readable constant that can't be overwritten).
// Sample A.swift
struct Constant {
    static let PI = 3.14
}

// Sample B.swift
private let globalPI = 3.14

class Constant {
    class var PI: Double {
        return globalPI
    }
}
If I am just referencing Constant.PI from the struct (for use in computations, printing, conditionals or whatever), am I being wasteful with the copies that are made? Are copies made at all if I am just referencing the constant as Constant.PI in other parts of my code?
You can declare a class constant in Swift:
class Foo {
    class var PI: Double { return 3.1416 }
}
The key is to define the custom getter and no setter.
In one package / module I have the following code
// Pieces.hx
package;

@:structInit class Piece {
    public var type:PieceType;
    public var value:Int;

    public function new(type:PieceType, value:Int) {
        // do some stuff
    }
}

var Pawn:Piece = {type: PieceType.Pawn, value: 1};

enum PieceType {
    Pawn;
}
Then in another file I have code trying to use the value property:
// Game.hx
import Pieces.Pawn;
import Pieces.PieceType;

class Main {
    override function init() {
        var value = Pawn.value;
    }
}
The error I'm getting when I try to compile is
src/Main.hx:46: characters 29-33 : PieceType has no field tile
What would be the correct way to structure this so that the Pawn that is imported is the class not the PieceType enum?
Another question: do I even need the enum at all? If I wanted to write a switch statement, can I just check whether the passed argument is a Pawn?
For example (illustrative pseudocode; switch is a reserved word, so the function is named differently here):
function checkPiece(p:Piece) {
    switch (p) {
        case Pawn: // is true?
    }
}
The convention for Haxe is that variable names are camelCase, which means that they begin with a lowercase character. In this case, this would solve your problem as the variable name would be pawn and the enum name would be Pawn so they would not conflict.
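For illustration, here is a sketch of the question's module with the variable renamed to follow that convention (assuming the same @:structInit Piece class, with the constructor elided so the fields are assigned directly):

```haxe
// Pieces.hx — sketch of the question's module with a lowercase variable name
package;

@:structInit class Piece {
    public var type:PieceType;
    public var value:Int;
}

// `pawn` (the variable) no longer collides with `Pawn` (the enum constructor)
var pawn:Piece = {type: PieceType.Pawn, value: 1};

enum PieceType {
    Pawn;
}
```

In Game.hx, `import Pieces.pawn;` then unambiguously refers to the Piece instance, while `PieceType.Pawn` still refers to the enum constructor.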
Alternatively, assuming the value field is the same for every pawn and doesn't change, you could have a map to retrieve this instead of adding it to an instance:
// Pieces.hx
enum PieceType {
    Pawn;
    Rook;
    ...
}

private final valueByPieceType:Map<PieceType, Int> = [
    Pawn => 1,
    Rook => ...,
    ...
];

function getValue(pieceType:PieceType):Int {
    return valueByPieceType[pieceType];
}
// Game.hx
import Pieces.PieceType;
import Pieces.getValue;

class Main {
    override function init() {
        var value = getValue(Pawn);
    }
}
This avoids requiring the extra instance/class in this example. If all pawns are meant to have this value then this is better as it ties the value to the fact it is a Pawn and prevents you from accidentally creating a pawn with a value other than 1.
However, if for whatever reason neither of those options is helpful, I would suggest renaming the enum constructor to something like TPawn. Otherwise, only if absolutely necessary, you could create an alias within Game.hx; this is not a good solution, though, because the Pieces module then becomes unusable without repeating this extra bit everywhere:
// Game.hx
import Pieces.Pawn as PawnPiece;
import Pieces.PieceType;

class Main {
    override function init() {
        var value = PawnPiece.value;
    }
}
Keeping the enum and the switch statement is also useful: the compiler ensures that you handle every constructor, so you can't miss a piece, and a switch is more concise than a manual check for every single piece type.
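As a sketch of that point (the enum constructors and values here are illustrative), an exhaustive switch over the piece type fails to compile if any constructor is left unhandled:

```haxe
enum PieceType {
    Pawn;
    Rook;
}

class PieceValues {
    // removing either case makes the compiler reject the switch as non-exhaustive
    public static function value(pieceType:PieceType):Int {
        return switch (pieceType) {
            case Pawn: 1;
            case Rook: 5;
        };
    }
}
```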
I can't seem to get this working, but I'd be surprised if it wasn't possible in Haxe.
I'm trying to pass a couple of Enum values defined in my game to a function, so that it can then concatenate them as String types and pass that to other functions.
Example:
// In a general Entity class:
public override function kill():Void {
    messages.dispatchCombined(entityType, ListMessages.KILLED);
    super.kill();
}
And in my Messages.hx class:
package common;

import msignal.Signal.Signal1;

/**
 * A Message / Event class using Signals bound to String names.
 * @author Pierre Chamberlain
 */
class Messages {
    var _messages:MessagesDef;

    public function new() {
        _messages = new MessagesDef();
    }

    public function add(pType:String, pCallback:FuncDef) {
        if (_messages[pType] == null) {
            _messages[pType] = new Signal1<Dynamic>();
        }
        var signals = _messages[pType];
        signals.add(pCallback);
    }

    public function dispatch(pType:String, pArg:Dynamic):Bool {
        var signals = _messages[pType];
        if (signals == null) return false;
        signals.dispatch(pArg);
        return true;
    }

    // Compiler doesn't like passing enums :(
    public inline function addCombined(pSource:Enum, pEvent:Enum, pCallback:FuncDef) {
        add(combine(pSource, pEvent), pCallback);
    }

    public inline function dispatchCombined(pSource:Enum, pEvent:Enum, pArg:Dynamic):Bool {
        return dispatch(combine(pSource, pEvent), pArg);
    }

    // How can I just pass the enum "names" as strings?
    static inline function combine(a:Enum, b:Enum):String {
        return String(a) + ":" + String(b);
    }
}

typedef MessagesDef = Map<String, Signal1<Dynamic>>;
typedef FuncDef = Dynamic->Void;
Note how addCombined, dispatchCombined and combine expect an "Enum" type, but I'm not sure whether Haxe expects the entire Enum "class" to be passed (i.e. ListMessages rather than ListMessages.KILLED) or whether a value should work. Either way, the compiler doesn't like it, so I'm assuming another special type has to be used.
Is there another way to go about passing enums and resolving them to strings?
I think you need EnumValue as the parameter type (if it is only for enum values), and Std.string to convert to String values:
static inline function combine(a:EnumValue, b:EnumValue):String {
    return Std.string(a) + ":" + Std.string(b);
}
Of course that can be written more concisely using string interpolation:
static inline function combine(a:EnumValue, b:EnumValue):String {
    return '$a:$b';
}
Of course that can be 'more dynamic' using type parameters:
static inline function combine<A, B>(a:A, b:B):String {
    return '$a:$b';
}
There is no need to use Dynamic as suggested elsewhere; if you use Dynamic, you basically turn off the type system.
live example:
http://try.haxe.org/#a8844
Use Dynamic instead of Enum, or pass them as Strings right away, since you can always convert a String back to an enum later if you need it. Either way, pass the enum as a Dynamic parameter and then call Std.string() on it.
EDIT: Using EnumValue is definitely a better approach than Dynamic. I use Dynamic in these functions because I send more than just enums through them, and I am not worried about type safety in that case.
Given class Base, and class Ext extends Base.
Given class B<T> with a typed method foo<T>(value:T).
Why does B<Base>.foo not accept an instance of B<Ext> (implicit variance of the type parameter?) by default?
Here is an example
http://try.haxe.org/#d443f
class Test {
    static function main() {
        var bExt = new B(new Ext());
        var bBase = new B(new Base());
        bBase.foo(bExt);
        // ofc this works:
        // bBase.foo(cast bExt);
    }
}

class B<T> {
    public function new(v:T) {}

    public function foo(v:B<T>) {
        //
    }
}

class Base {
    public function new() {}
}

class Ext extends Base {
    public function new() {
        super();
    }
}
Is there any way to trigger implicit cast of the type parameter for B.foo?
There are three ways to interpret and answer your question:
1. foo(v:B<T>):
This is your example, and it doesn't compile because T isn't allowed to be variant. The problem is the very existence of foo: allowing bBase.foo(bExt), that is, unifying bExt with bBase, would then also allow bBaseOfbExt.foo(bBase).
It is the fact that foo exists and can potentially modify the type that makes unifying bExt with bBase unsafe; you can see a similar (but maybe clearer) explanation in the manual, using arrays: type system – variance.
2. foo(v:T):
This is closer to what's in the body of your question (but not in the example), and it works fine.
3. foo<A>(v:B<A>):
Finally, if you have a type parameterized method, it also works, but you'd probably face other variance issues elsewhere.
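As a minimal sketch of option 3 against the question's classes: making foo's type parameter per-call lets each call site infer its own A, so the call from the question compiles:

```haxe
class Base {
    public function new() {}
}

class Ext extends Base {
    public function new() {
        super();
    }
}

class B<T> {
    public function new(v:T) {}

    // A is inferred per call, independently of this instance's T
    public function foo<A>(v:B<A>):Void {}
}

class Test {
    static function main() {
        var bExt = new B(new Ext());
        var bBase = new B(new Base());
        bBase.foo(bExt); // accepted: A unifies with Ext
    }
}
```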
I'm trying to create a haxe.ds.HashMap where the keys are an object I don't control. Thus, they don't implement the hashCode method, and I can't change them to add it.
I would really like to use an abstract to accomplish this, but I'm getting some compile time errors.
Here is the code I'm playing with:
import haxe.ds.HashMap;

abstract IntArrayKey(Array<Int>) from Array<Int> {
    inline public function new(i:Array<Int>) {
        this = i;
    }

    public function hashCode():Int {
        // General warning: Don't copy the following line. Seriously don't.
        return this.length;
    }
}

class Test {
    static function main() {
        var hash = new HashMap<IntArrayKey, Bool>();
    }
}
The compile errors are:
Test.hx:15: characters 19-51 : Constraint check failure for haxe.ds.HashMap.K
Test.hx:15: characters 19-51 : IntArrayKey should be { hashCode : Void -> Int }
But the moment I change my abstract over to a class, it compiles fine:
import haxe.ds.HashMap;

class IntArrayKey {
    private var _i:Array<Int>;

    inline public function new(i:Array<Int>) {
        this._i = i;
    }

    public function hashCode():Int {
        // General warning: Don't copy the following line. Seriously don't.
        return this._i.length;
    }
}

class Test {
    static function main() {
        var hash = new HashMap<IntArrayKey, Bool>();
    }
}
It's the exact same hashCode implementation, just a different context. Is there some way to accomplish this? Or is it a language limitation?
As far as I know, abstracts currently can't satisfy a structural type requirement like this. Quoting from the HashMap code:
abstract HashMap<K:{ function hashCode():Int; }, V>(HashMapData<K, V>) {
So I doubt you could do that in a meaningful way.
An important point: while abstracts can sometimes provide overhead-free abstractions, which is quite useful for optimizations, the cost of instantiating a holder for your array that has the required method isn't that high compared to usual runtime overheads (and it can be mostly hidden from sight with abstract Name(Holder) to Holder plus @:from Array<Int> and @:to Array<Int> conversions). Unless this is really frequent code, that should be your first way to go.
However, the HashMap code itself is quite short and simple: here.
You could just copy it and adapt it to your example. Maybe you could even forge a better, generic version using interfaces (though I'm not sure whether abstracts can actually implement them).
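For illustration, here is a sketch of the holder approach described above (the class name and hash function are illustrative only): a plain class provides hashCode, which satisfies HashMap's structural constraint.

```haxe
import haxe.ds.HashMap;

// A plain class satisfies HashMap's { function hashCode():Int; } constraint
class IntArrayHolder {
    public var array:Array<Int>;

    public function new(array:Array<Int>) {
        this.array = array;
    }

    public function hashCode():Int {
        // toy hash for illustration; pick something better for real use
        var h = 0;
        for (i in array) h = h * 31 + i;
        return h;
    }
}

class Test {
    static function main() {
        var hash = new HashMap<IntArrayHolder, Bool>();
        hash.set(new IntArrayHolder([1, 2, 3]), true);
        // note: HashMap keys with equal hashCode collide, so this lookup succeeds
        trace(hash.exists(new IntArrayHolder([1, 2, 3])));
    }
}
```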
Here are two classes that I need to map, on the left side:
class HumanSrc {
    public int IQ;
    public AnimalSrc Animal;
}

class AnimalSrc {
    public int Weight;
}
on the right side are the same objects, but composed using inheritance:
class HumanDst : AnimalDst {
    public int IQ;
}

class AnimalDst {
    public int Weight;
}
so the mapping I need is:
humanSrc.IQ -> humanDst.IQ
humanSrc.Animal.Weight -> humanDst.Weight;
I can easily do this mapping explicitly, but I have several classes that all derive from Animal, and Animal class is large, so I would prefer to map Animal once, and then have that included in every derived class mapping.
I looked at .Include<> method, but I do not think it supports this scenario.
Here is the essence of what I am looking for (pseudo-code):
// define animal mapping
var animalMap = Mapper.CreateMap<AnimalSrc, AnimalDst>()
    .ForMember(dst => dst.Weight, opt => opt.MapFrom(src => src.Weight));

// define human mapping
var humanMap = Mapper.CreateMap<HumanSrc, HumanDst>();
humanMap.ForMember(dst => dst.IQ, opt => opt.MapFrom(src => src.IQ));

// this is what I want. Basically I want to say:
// "in addition to that, map this child property onto the dst object as well"
humanMap.ForMember(dst => dst, opt => opt.MapFrom(src => src.Animal));
As a workaround you can add a BeforeMap that maps the base class. It is probably not the best solution, but at least it requires less mapping configuration:
humanMap.BeforeMap((src, dst) =>
{
    Mapper.Map(src.Animal, (AnimalDst)dst);
});