Suggestion to build a multiplayer Texas Hold'em Poker game using Node.js and Flutter

I have been building basic and complex mobile apps in Android and Flutter, and I also have some knowledge of Node.js. I have already built a basic Node.js multiplayer server for Texas Hold'em Poker; multiple rooms and the table logic are still remaining.
I want to develop the client mobile app in Flutter, as I have deep knowledge of Flutter.
I have been exploring the packages and tools I would use in Flutter, but I am still clueless.
Game development in Flutter is a new challenge for me, so I would like to ask whether this technology stack is a good fit for such a game.
Should I consider switching from Flutter to something else?

I'd say go for it. I created a card game myself while learning Flutter - just needed a fun project to see it end to end.
Like you, I used Node.js on the back end, which let me play the game with my friends.
I ended up going with GraphQL for my back-end services (using the AWS AppSync service). Since GraphQL subscriptions give you push notifications over WebSockets, it was ideal for sending data between the different clients playing the game. Node.js deployed on AWS Lambda with DynamoDB for persistence worked without a problem.
I also ended up writing an AI opponent (using Monte Carlo Tree Search) so I could play the game against the AI. And as the final bit, I ported my back-end Node.js interface to Dart, so I could play against the AI fully offline.
I think I spent the most time figuring out how to do the card animations, especially since I wanted the card moves to happen sequentially. The AI would sometimes play its moves too fast, so there were multiple cards flying around at the same time. Once this was done, the rest was easy.
If you need more details, let me know. I'm happy to share bits of the code with you.
Edit: here are some code details. I'll try to edit out the parts of the code that you won't need, so some of it will probably fail to compile at first... And keep in mind that some things may be overcomplicated in my code: as soon as I got things up and running I would move on, so there is a lot to improve here...
First, the back-end. As I mentioned, I used AWS: DynamoDB to store the game data; a single Lambda function to do all the work; and an AppSync API, simply because it offers message push to the clients - you don't need to keep polling the back-end to see if there are any changes. And finally, I used a Cognito user pool to authenticate users. In my app you can create your account, validate it through email, etc.
I used AWS Amplify to set up the back-end: this is an AWS framework that enables you to easily deploy your back-end even if you don't know that much about AWS. It is actually meant for app developers who don't need or want to learn about AWS in detail. Even if you know your way around AWS, it is still very useful, since it automates a lot of the security wiring for you and helps you automate the deployment.
Currently there is an official amplify-flutter package; at the time I did this I used the third-party package https://pub.dev/packages/amazon_cognito_identity_dart_2.
Now I'm assuming that you have your back-end set up: you deployed your Node.js code in Lambda, and now you need your AppSync schema. Here's mine:
type Match @aws_iam @aws_cognito_user_pools {
  matchId: Int
  matchData: AWSJSON!
}
type PlayEvent @aws_iam @aws_cognito_user_pools {
  matchId: Int!
  success: Boolean!
  playerId: Int!
  eventType: String!
  inputData: AWSJSON
  eventData: AWSJSON
  matchData: AWSJSON
}
input PlayEventInput {
  matchId: Int!
  eventType: String!
  eventData: AWSJSON
  playerId: Int
}
type Query {
  getMatch(matchId: Int): Match @function(name: "tresetaFv2-${env}") @aws_iam @aws_cognito_user_pools
}
type Mutation {
  playEvent(input: PlayEventInput!): PlayEvent @function(name: "tresetaFv2-${env}") @aws_iam @aws_cognito_user_pools
}
type Subscription {
  onPlayEvent(matchId: Int, success: Boolean): PlayEvent @aws_subscribe(mutations: ["playEvent"])
}
As you can see, I'm using the AWSJSON data type a lot - instead of coding my entire game schema in GraphQL, I'm simply passing JSON back and forth.
A few things to understand:
The MatchData type holds your game state: what cards each player holds, what cards are in the deck and on the table, whose turn it is to play, etc. (see the sketch just after this list for one hypothetical shape).
The getMatch query is fired by the client app after selecting a match from the active matches list. When you join the game, this is how you fetch the game state.
The playEvent mutation is how you play your move: you pass the PlayEventInput type, specifying the matchId, playerId and event type (in your case: play card, fold, call...). The playEvent mutation returns a PlayEvent, telling you whether the move was successful (was it legal at the time? was it your turn to play?), and it also returns matchData - the new game state after the move was played.
And finally, the onPlayEvent subscription. After a client joins a match (by match ID), it subscribes to this subscription. As you can see from its definition, it is wired to the playEvent mutation: each time a player plays a move, all the other clients are notified about it and receive the result of the mutation, so everyone gets the full game state to refresh their UI. A nice trick here: you subscribe only to mutations that have success=true, so no message is pushed for failed moves.
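Purely as an illustration of the AWSJSON approach (this is not my actual format, and your Hold'em state will differ from my Treseta one), the matchData blob could be typed on the Node.js side along these lines:

// Hypothetical shape for a Hold'em matchData payload - an assumption for
// illustration only, not the schema my game actually uses.
interface PokerMatchData {
  matchId: number;
  players: {
    playerId: number;
    name: string;
    chips: number;
    holeCards: string[];    // e.g. ["Ah", "Kd"]; stripped out server-side for other players
    hasFolded: boolean;
  }[];
  communityCards: string[]; // flop / turn / river as they are dealt
  pot: number;
  currentTurn: number;      // playerId whose turn it is
  phase: "preflop" | "flop" | "turn" | "river" | "showdown";
}

const example: PokerMatchData = {
  matchId: 1,
  players: [
    { playerId: 0, name: "alice", chips: 1500, holeCards: ["Ah", "Kd"], hasFolded: false },
    { playerId: 1, name: "bob", chips: 1480, holeCards: [], hasFolded: false },
  ],
  communityCards: ["2c", "7d", "Js"],
  pot: 120,
  currentTurn: 1,
  phase: "flop",
};

// The AWSJSON fields simply carry this object as a serialized string:
const matchDataJson = JSON.stringify(example);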
You will also see some annotations - this is how you tell Amplify to wire things up for you:
@function(name: "tresetaFv2-${env}"): you tell it to call the Lambda function called "tresetaFv2-${env}" to actually do the work.
@aws_iam @aws_cognito_user_pools - this is how you tell it that the API is secured by Cognito user pools.
So now we have the AWS back-end set up. How do you actually run the query and subscribe to the events coming from the back-end?
First, I use this UserService in my code: https://github.com/furaiev/amazon-cognito-identity-dart-2/blob/main/example/lib/user_service.dart
This is my AppSync Service - this is just a generic way of calling AppSync:
import 'dart:convert';
import 'package:amazon_cognito_identity_dart_2/cognito.dart';
import 'package:flutter/foundation.dart';
import 'package:http/http.dart' as http;
import 'package:treseta_app/auth/auth_services.dart';
import 'package:web_socket_channel/web_socket_channel.dart';
import 'package:web_socket_channel/io.dart';
class AppSyncImpl {
final String _endPoint;
String get _wsEndPoint => _endPoint.replaceAll('https', 'wss').replaceAll('appsync-api', 'appsync-realtime-api');
String get _host => _endPoint.replaceAll('https://', '').replaceAll('/graphql', '');
UserService _userService;
set userService(UserService userService) => this._userService = userService;
AppSyncImpl(this._endPoint, this._userService);
Future<http.Response> getHttpResponse(Map<String, String> query) async {
CognitoUserSession session = await _userService.getSession();
return http.post(
_endPoint,
headers: {
'Authorization': session.getAccessToken().getJwtToken(),
'Content-Type': 'application/json',
},
body: json.encode(query),
);
}
Future<Map<String, dynamic>> glQueryMutation(Map<String, String> query) async {
http.Response response;
try {
response = await getHttpResponse(query);
} catch (e) {
print(e);
}
return json.decode(response.body);
}
int wsTimeoutInterval;
void disconnectWebSocketChannel(WebSocketChannel wsChannel, String uniqueKey) {
wsChannel.sink.add(json.encode({'type': 'stop', 'id': uniqueKey}));
}
Future<WebSocketChannel> wsSubscription(
{@required Map<String, String> query,
@required String uniqueKey,
Function(Map<String, dynamic>) listener}) async {
var jwtToken = (await _userService.getSession()).getIdToken().getJwtToken();
var header = {'host': _host, 'Authorization': jwtToken};
var encodedHeader = Base64Codec().encode(Utf8Codec().encode(json.encode(header)));
// Note 'e30=' is '{}' in base64
var wssUrl = '$_wsEndPoint?header=$encodedHeader&payload=e30=';
var channel = IOWebSocketChannel.connect(wssUrl, protocols: ['graphql-ws']);
channel.sink.add(json.encode({"type": "connection_init"}));
channel.stream.listen((event) {
var e = json.decode(event);
switch (e['type']) {
case 'connection_ack':
wsTimeoutInterval = e['payload']['connectionTimeoutMs'];
var register = {
'id': uniqueKey,
'payload': {
'data': json.encode(query),
'extensions': {'authorization': header}
},
'type': 'start'
};
var payload = json.encode(register);
channel.sink.add(payload);
break;
case 'data':
listener(e);
break;
case 'ka':
print('Reminder: keep alive is not yet implemented!!!');
break;
case 'start_ack':
print('Ws Channel: Subscription started');
break;
default:
print('Unknown event $event');
}
}, onError: (error, StackTrace stackTrace) {
print('Ws Channel: $error');
}, onDone: () {
channel.sink.close();
print('Ws Channel: Done!');
});
return channel;
}
}
And this is the actual back-end service; notice how there is a function that corresponds to each query, mutation and subscription in the GraphQL schema (getMatchData, playEvent, subscribeWs):
import 'dart:async';
import 'dart:convert';
import 'package:flutter/foundation.dart';
import 'package:flutter/widgets.dart';
import 'package:treseta_app/appsync/app_sync.dart';
import 'package:treseta_app/auth/auth_services.dart';
import 'package:treseta_app/models/api_models.dart' as api;
import 'package:meta/meta.dart';
import 'package:treseta_app/models/game_model.dart';
import 'package:web_socket_channel/web_socket_channel.dart';
class AwsBackendService extends ChangeNotifier {
final subscriptionStreamCtrl = StreamController<api.PlayEvent>.broadcast();
Stream get subscriptionStream => subscriptionStreamCtrl.stream;
UserService userService;
String endPoint;
AppSyncImpl _appSyncImpl;
BuildContext context;
AwsBackendService({@required this.context, this.userService, this.endPoint})
: _appSyncImpl = AppSyncImpl(endPoint, userService);
Future<api.Match> getMatchData(int matchId) async {
final query = {
'operationName': 'GetMatch',
'query': '''query GetMatch { getMatch(matchId:$matchId){matchData, playerId}}'''
};
User user = await userService.getCurrentUser();
String username = user.name;
var matchData = await _appSyncImpl.glQueryMutation(query);
var match = api.Match.fromJson(matchData['data']['getMatch']);
match.matchData.username = username;
return match;
}
Future<api.PlayEvent> playEvent(int matchId, int playerId, String eventType, api.PlayEventInputData eventData) async {
var encoded = json.encode(eventData.toJson()).replaceAll("\"", "\\\"");
final query = {
'operationName': 'PlayEvent',
'query':
'''mutation PlayEvent { playEvent(input:{matchId:$matchId, eventType:"$eventType", eventData:"$encoded"}){matchId, eventData, success, playerId, eventType, inputData, matchData}}''',
};
api.PlayEvent result =
await _appSyncImpl.glQueryMutation(query).then((value) => api.PlayEvent.fromJson(value['data']['playEvent']));
User user = await userService.getCurrentUser();
String username = user.name;
result.matchData.username = username;
return result;
}
WebSocketChannel _wsClient;
String _wsUniqeKey;
Future<void> subscribeWs(int matchId, int playerId, MatchData model) async {
final query = {
'operationName': 'OnPlayEvent',
'query':
'subscription OnPlayEvent { onPlayEvent(matchId:$matchId, success:true) {matchId, success,playerId,eventType,inputData,eventData,matchData}}',
};
_wsUniqeKey = UniqueKey().toString();
_wsClient = await _appSyncImpl.wsSubscription(
query: query,
uniqueKey: _wsUniqeKey,
listener: (Map<String, dynamic> message) {
var playEvent = api.PlayEvent.fromJson(message['payload']['data']['onPlayEvent']);
subscriptionStreamCtrl.sink.add(playEvent);
});
return;
}
void disconnect() {
try {
if (_wsClient != null) {
_appSyncImpl.disconnectWebSocketChannel(_wsClient, _wsUniqeKey);
}
} catch (e) {
print(e);
}
}
@override
void dispose() {
super.dispose();
subscriptionStreamCtrl.close();
disconnect();
}
}
You will see that each event we receive from AppSync subscription is just pumped into a Stream object.
And finally, my main ChangeNotifier provider, which holds the match data and notifies the UI, looks something like this - you can see how the incoming subscription events are processed and how the UI is notified that there's a new event to be animated - a card being thrown, a new hand dealt, etc.
import 'dart:async';
import 'package:flutter/widgets.dart';
import 'package:treseta_app/backend/backend_service.dart';
import 'package:treseta_app/models/api_models.dart';
import 'package:treseta_app/models/game_model.dart';
class Treseta extends ChangeNotifier {
MatchData model;
int playerId;
AwsBackendService _backendService;
StreamSubscription backendServiceSubscription;
AwsBackendService get backendService => _backendService;
set backendService(BackendService value) {
if (backendServiceSubscription != null) backendServiceSubscription.cancel();
_backendService = value;
// this is where we process the incoming subscription messages:
backendServiceSubscription = _backendService.subscriptionStream.listen((inEvent) {
PlayEvent playEvent = inEvent as PlayEvent;
playCard(
inData: playEvent,
playerId: playEvent.playerId,
card: playEvent.inputData.card);
});
}
int _matchId;
int get matchId => _matchId;
set matchId(int value) {
_matchId = value;
subscribeAndGetMatchData();
}
Future<void> disconnect() async => backendService.disconnect();
Future<MatchData> getMatchData() async {
var match = await backendService.getMatchData(matchId);
playerId = match.playerId;
model = match.matchData;
return model;
}
Future<void> subscribeAndGetMatchData() async {
if (matchId != null) {
disconnect();
await getMatchData();
await subscribe();
}
return;
}
Future<void> subscribe() async {
backendService.subscribeWs(matchId, playerId, this.model);
}
void playCard(
{GameCard card,
GlobalKey sourceKey,
GlobalKey targetKey,
int playerId = 0,
PlayEvent inData}) async {
// Animation notifier logic goes here...
}
@override
void dispose() {
disconnect();
backendServiceSubscription.cancel();
super.dispose();
}
}
There you go - I hope this helps.
There is another bit on how to animate the cards - the big challenge is to run the animations sequentially, even when players play their cards in quick succession. If you want, I can post some details around that as well.
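The core idea is language-agnostic: queue the animations and only start one when the previous has finished. A minimal sketch of the pattern (shown in TypeScript since your server is Node.js; in Flutter you would chain Dart Futures the same way - this is not my actual animation code):

// A tiny sequential task queue: each enqueued animation starts only after the
// previous one completes, so two cards are never flying around at the same time.
class AnimationQueue {
  private tail: Promise<void> = Promise.resolve();

  enqueue(animation: () => Promise<void>): Promise<void> {
    // Chain onto whatever is currently last in the queue.
    this.tail = this.tail.then(() => animation());
    return this.tail;
  }
}

// Usage: even if two play events arrive almost simultaneously,
// the second card animation waits for the first one to finish.
const queue = new AnimationQueue();
const animateCard = (card: string, ms: number) => () =>
  new Promise<void>(resolve => {
    console.log(`animating ${card}`);
    setTimeout(resolve, ms);
  });

queue.enqueue(animateCard("Ah", 300));
queue.enqueue(animateCard("Kd", 300));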
*** One more edit ****
I created a repo with a simple animation example. If I find time, I'll write some comments in Readme file...
https://github.com/andrija78/cards_demo

Related

amqplib sendToQueue() is not sending messages to RabbitMQ

I am building a Typescript package around amqplib's Promise API to make it simpler for me to send messages between two RabbitMQ queues.
This is a method inside a class of said package that is responsible for sending a message to RabbitMQ. It's not actually sending messages, because I can't see them in CloudAMQP.
class PostOffice {
connection: any;
constructor(connection: any) {
this.connection = connection;
}
//...
// TMQMessage is an object that can be trivially serialized to JSON
// The Writer class holds the channel because I'm having trouble
// making Typescript class member updates stick after a method returns.
async send(writer: Writer, message: TMQMessage) {
//...RabbitMQ code begins here
let channel = writer.stageConsume[0]; // This channel is created elsewhere
// Queue is created here and I can see it in CloudAMQP.
await channel.assertQueue(message.to + "/stage/1", {
durable: true,
}); // `message.to` is a string indicating destination, e.g. 'test/hello'
// According to docs, the content should be converted to Buffer
await channel.sendToQueue(message.to + "/stage/1", Buffer.from(JSON.stringify(message)));
channel.close()
}
}
This is a class I made that creates connections and channels, which is being used here:
import { Writer } from "./Writer";
export class ConnectionFactory {
url: string;
amqp: any;
constructor(url: string) {
this.url = url;
this.amqp = require('amqplib');
}
async connect(timeout: number = 2000) {
let connection = await this.amqp.connect(this.url, {
// timeout for a message acknowledge.
timeout, // Make the timeout 2s per now
})
return connection;
}
async createChannels(connection: any, writer: Writer) {
//... relevant code starts here
let stageChannels = [];
stageChannels.push(await connection.createChannel())
writer.stageConsume = stageChannels;
return writer;
}
}
My connection factory seems to be working properly, because I can see the connections from CloudAMQP's dashboard. Also I can see the Queues that have been created (asserted in my code) from the dashboard too. However, I am unable to get amqplib to send a message out to CloudAMQP.
Here's the (async) code I'm using to call my package:
let Cf = new ConnectionFactory(url)
let connection = await Cf.connect()
let writer = new Writer();
let message: TMQMessage = {
//... message content in JSON
}
writer = await Cf.createChannels(connection, writer)
let po = new PostOffice(connection)
await po.send(writer, message)
What seems to be wrong?
Probably want to await the close method too. But in general I would recommend amqp-client.js (which also has TypeScript definitions): https://github.com/cloudamqp/amqp-client.js/
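For illustration, here is roughly what that change looks like inside the asker's send() method (a sketch only - Writer and TMQMessage are the asker's own types, typed as any here so the snippet stands alone):

class PostOffice {
  constructor(private connection: any) {}

  async send(writer: any, message: any) {
    const channel = writer.stageConsume[0];
    const queue = message.to + "/stage/1";
    await channel.assertQueue(queue, { durable: true });
    // In amqplib's promise API, sendToQueue returns a boolean (a backpressure flag),
    // not a promise, so there is nothing useful to await on the publish itself.
    channel.sendToQueue(queue, Buffer.from(JSON.stringify(message)));
    // Awaiting close() lets the channel shut down cleanly instead of racing the publish.
    await channel.close();
  }
}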

How to monitor the number of RxJS subscriptions?

I'm using an Observable to provide an event subscription interface for clients to a global resource, and I need to manage that resource according to the number of active subscriptions:
Allocate global resource when the number of subscriptions becomes greater than 0
Release global resource when the number of subscriptions becomes 0
Adjust the resource usage strategy based on the number of subscriptions
What is the proper way in RXJS to monitor the number of active subscriptions?
How would I implement the following in RxJS syntax?
const myEvent: Observable<any> = new Observable();
myEvent.onSubscription((newCount: number, prevCount: number) => {
if(newCount === 0) {
// release global resource
} else {
// allocate global resource, if not yet allocated
}
// for a scalable resource usage / load,
// re-configure it, based on newCount
});
I wouldn't expect a guaranteed notification on each change, hence newCount + prevCount params.
UPDATE-1
This is not a duplicate of this, because I need to be notified when the number of subscriptions changes, and not just to get the counter at some point.
UPDATE-2
Without any answer so far, I quickly came up with a very ugly and limited work-around, through complete encapsulation, and specifically for the Subject type. I am still very much hoping to find a proper solution.
UPDATE-3
After a few answers, I'm still not sure how to implement what I'm trying to do, which is the following:
class CustomType {
}
class CountedObservable<T> extends Observable<T> {
private message: string; // random property
public onCount; // magical Observable that needs to be implemented
constructor(message: string) {
// super(); ???
this.message = message;
}
// random method
public getMessage() {
return this.message;
}
}
const a = new CountedObservable<CustomType>('hello'); // can create directly
const msg = a.getMessage(); // can call methods
a.subscribe((data: CustomType) => {
// handle subscriptions here;
});
// need that magic onCount implemented, so I can do this:
a.onCount.subscribe((newCount: number, prevCont: number) => {
// manage some external resources
});
How do I implement the CountedObservable class above, so that I can subscribe to it directly, as well as to its onCount property, to monitor the number of its clients/subscriptions?
UPDATE-4
All suggested solutions seemed overly complex, and even though I accepted one of the answers, I ended up with a completely custom solution of my own.
You could achieve it using defer to track subscriptions and finalize to track completions, e.g. as an operator:
// a custom operator that will count number of subscribers
function customOperator(onCountUpdate = noop) {
return function refCountOperatorFunction(source$) {
let counter = 0;
return defer(()=>{
counter++;
onCountUpdate(counter);
return source$;
})
.pipe(
finalize(()=>{
counter--;
onCountUpdate(counter);
})
);
};
}
// just a stub for `onCountUpdate`
function noop(){}
And then use it like:
const source$ = new Subject();
const result$ = source$.pipe(
customOperator( n => console.log('Count updated: ', n) )
);
Here's a code snippet illustrating this:
const { Subject, of, timer, pipe, defer } = rxjs;
const { finalize, takeUntil } = rxjs.operators;
const source$ = new Subject();
const result$ = source$.pipe(
customOperator( n => console.log('Count updated: ', n) )
);
// emit events
setTimeout(()=>{
source$.next('one');
}, 250);
setTimeout(()=>{
source$.next('two');
}, 1000);
setTimeout(()=>{
source$.next('three');
}, 1250);
setTimeout(()=>{
source$.next('four');
}, 1750);
// subscribe and unsubscribe
const subscriptionA = result$
.subscribe(value => console.log('A', value));
setTimeout(()=>{
result$.subscribe(value => console.log('B', value));
}, 500);
setTimeout(()=>{
result$.subscribe(value => console.log('C', value));
}, 1000);
setTimeout(()=>{
subscriptionA.unsubscribe();
}, 1500);
// complete source
setTimeout(()=>{
source$.complete();
}, 2000);
function customOperator(onCountUpdate = noop) {
return function refCountOperatorFunction(source$) {
let counter = 0;
return defer(()=>{
counter++;
onCountUpdate(counter);
return source$;
})
.pipe(
finalize(()=>{
counter--;
onCountUpdate(counter);
})
);
};
}
function noop(){}
<script src="https://unpkg.com/rxjs@6.4.0/bundles/rxjs.umd.min.js"></script>
* NOTE: if your source$ is cold — you might need to share it.
Hope it helps
You are really asking three separate questions here, and I question whether you really need the full capability you mention. Since most of the resource management you are asking for is already provided by the library, writing custom tracking code seems redundant. The first two questions:
Allocate global resource when the number of subscriptions becomes greater than 0
Release global resource when the number of subscriptions becomes 0
Can be done with the using + share operators:
class ExpensiveResource {
constructor () {
// Do construction
}
unsubscribe () {
// Do Tear down
}
}
// Creates a resource and ties its lifecycle with that of the created `Observable`
// generated by the second factory function
// Using will accept anything that is "Subscription-like" meaning it has a unsubscribe function.
const sharedStream$ = using(
// Creates an expensive resource
() => new ExpensiveResource(),
// Passes that expensive resource to an Observable factory function
er => timer(1000)
)
// Share the underlying source so that global creation and deletion are only
// processed when the subscriber count changes between 0 and 1 (or visa versa)
.pipe(share())
After that sharedStream$ can be passed around as a base stream which will manage the underlying resource (assuming you implemented your unsubscribe correctly) so that the resource will be created and torn down as the number of subscribers transitions between 0 and 1.
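A quick usage sketch to see that lifecycle (assuming the ExpensiveResource and sharedStream$ defined above):

// Two overlapping subscribers: the resource is created once when the count goes
// from 0 to 1, and ExpensiveResource.unsubscribe() runs once it drops back to 0.
const subA = sharedStream$.subscribe(v => console.log('A got', v));
const subB = sharedStream$.subscribe(v => console.log('B got', v));

// Later: unsubscribing both takes the shared ref count back to 0,
// which tears down the underlying resource.
subA.unsubscribe();
subB.unsubscribe();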
Adjust the resource usage strategy based on the number of subscriptions
I am most dubious about the third question, but I'll answer it for completeness, assuming you know your application better than I do (since I can't think of a reason why you would need specific handling at different usage levels other than going between 0 and 1).
Basically I would use a similar approach to the one above, but I would encapsulate the transition logic slightly differently.
// Same as above
class ExpensiveResource {
unsubscribe() { console.log('Tear down this resource!')}
}
const usingReferenceTracking =
(onUp, onDown) => (resourceFactory, streamFactory) => {
let instance, refCount = 0
// Again manage the global resource state with using
const r$ = using(
// Unfortunately the using pattern doesn't let the resource escape the closure
// so we need to cache it for ourselves to use later
() => instance || (instance = resourceFactory()),
// Forward stream creation as normal
streamFactory
).pipe(
// Don't forget to clean up the stream after all is said and done
// Because it's behind a share this should only happen when all subscribers unsubscribe
finalize(() => instance = null),
share()
)
// Use defer to trigger "onSubscribe" side-effects
// Note as well that these side-effects could be merged with the above for improved performance
// But I prefer them separate for easier maintenance.
return defer(() => onUp(instance, refCount += 1) || r$)
// Use finalize to handle the "onFinish" side-effects
.pipe(finalize(() => onDown(instance, refCount -= 1)))
}
const referenceTracked$ = usingReferenceTracking(
(ref, count) => console.log('Ref count increased to ' + count),
(ref, count) => console.log('Ref count decreased to ' + count)
)(
() => new ExpensiveResource(),
ref => timer(1000)
)
referenceTracked$.pipe(take(1)).subscribe(x => console.log('Sub1 ' + x))
referenceTracked$.pipe(take(1)).subscribe(x => console.log('Sub2 ' + x))
// Ref count increased to 1
// Ref count increased to 2
// Sub1 0
// Ref count decreased to 1
// Sub2 0
// Ref count decreased to 0
// Tear down this resource!
Warning: One side effect of this is that by definition the stream will be warm once it leaves the usingReferenceTracking function, and it will go hot on first subscription. Make sure you take this into account during the subscription phase.
What a fun problem! If I am understanding what you are asking, here is my solution to this: create a wrapper class around Observable that tracks the subscriptions by intercepting both subscribe() and unsubscribe(). Here is the wrapper class:
import { Observable, BehaviorSubject, Subscription, PartialObserver } from 'rxjs';
export class CountSubsObservable<T> extends Observable<T>{
private _subCount = 0;
private _subCount$: BehaviorSubject<number> = new BehaviorSubject(0);
public subCount$ = this._subCount$.asObservable();
constructor(public source: Observable<T>) {
super();
}
subscribe(
observerOrNext?: PartialObserver<T> | ((value: T) => void),
error?: (error: any) => void,
complete?: () => void
): Subscription {
this._subCount++;
this._subCount$.next(this._subCount);
let subscription = super.subscribe(observerOrNext as any, error, complete);
const newUnsub: () => void = () => {
if (this._subCount > 0) {
this._subCount--;
this._subCount$.next(this._subCount);
subscription.unsubscribe();
}
}
subscription.unsubscribe = newUnsub;
return subscription;
}
}
This wrapper creates a secondary observable .subCount$ that can be subscribed to which will emit every time the number of subscriptions to the source observable changes. It will emit a number corresponding to the current number of subscribers.
To use it you would create a source observable and then call new with this class to create the wrapper. For example:
const source$ = interval(1000).pipe(take(10));
const myEvent$: CountSubsObservable<number> = new CountSubsObservable(source$);
myEvent$.subCount$.subscribe(numSubs => {
console.log('subCount$ notification! Number of subscriptions is now', numSubs);
if(numSubs === 0) {
// release global resource
} else {
// allocate global resource, if not yet allocated
}
// for a scalable resource usage / load,
// re-configure it, based on numSubs
});
myEvent$.subscribe(result => console.log('result is ', result));
To see it in use, check out this Stackblitz.
UPDATE:
Ok, as mentioned in the comments, I'm struggling a little to understand where the stream of data is coming from. Looking back through your question, I see you are providing an "event subscription interface". If the stream of data is a stream of CustomType as you detail in your third update above, then you may want to use fromEvent() from rxjs to create the source observable with which you would call the wrapper class I provided.
To show this I created a new Stackblitz. From that Stackblitz here is the stream of CustomTypes and how I would use the CountedObservable class to achieve what you are looking for.
class CustomType {
a: string;
}
const dataArray = [
{ a: 'January' },
{ a: 'February' },
{ a: 'March' },
{ a: 'April' },
{ a: 'May' },
{ a: 'June' },
{ a: 'July' },
{ a: 'August' },
{ a: 'September' },
{ a: 'October' },
{ a: 'November' },
{ a: 'December' }
] as CustomType[];
// Set up an arbitrary source that sends a stream of `CustomTypes`, one
// every two seconds by using `interval` and mapping the numbers into
// the associated dataArray.
const source$ = interval(2000).pipe(
map(i => dataArray[i]), // transform the Observable stream into CustomTypes
take(dataArray.length), // limit the Observable to only emit # array elements
share() // turn into a hot Observable.
);
const myEvent$: CountedObservable<CustomType> = new CountedObservable(source$);
myEvent$.onCount.subscribe(newCount => {
console.log('newCount notification! Number of subscriptions is now', newCount);
});
I hope this helps.
First of all, I very much appreciate how much time and effort people have put into trying to answer my question! And I am sure those answers will prove to be a useful guideline for other developers solving similar scenarios with RxJS.
However, specifically for what I was trying to get out of RXJS, I found in the end that I am better off not using it at all. I specifically wanted the following:
A generic, easy-to-use interface for subscribing to notifications plus monitoring subscriptions - all in one. With RxJS, the best I could end up with was workarounds that appear needlessly convoluted, or even cryptic to developers who are not experts in RxJS. That is not what I would consider a friendly interface; it feels more like over-engineering.
I ended up with a custom, much simpler interface that can do everything I was looking for:
export class Subscription {
private unsub: () => void;
constructor(unsub: () => void) {
this.unsub = unsub;
}
public unsubscribe(): void {
if (this.unsub) {
this.unsub();
this.unsub = null; // to prevent repeated calls
}
}
}
export class Observable<T = any> {
protected subs: ((data: T) => void)[] = [];
public subscribe(cb: (data: T) => void): Subscription {
this.subs.push(cb);
return new Subscription(this.createUnsub(cb));
}
public next(data: T): void {
// we iterate through a safe clone, in case an un-subscribe occurs;
// and since Node.js is the target, we are using process.nextTick:
[...this.subs].forEach(cb => process.nextTick(() => cb(data)));
}
protected createUnsub(cb) {
return () => {
this.subs.splice(this.subs.indexOf(cb), 1);
};
}
}
export interface ISubCounts {
newCount: number;
prevCount: number;
}
export class CountedObservable<T = any> extends Observable<T> {
readonly onCount: Observable<ISubCounts> = new Observable();
protected createUnsub(cb) {
const s = this.subs;
this.onCount.next({newCount: s.length, prevCount: s.length - 1});
return () => {
s.splice(s.indexOf(cb), 1);
this.onCount.next({newCount: s.length, prevCount: s.length + 1});
};
}
}
It is both small and elegant, and lets me do everything I needed to begin with, in a safe and friendly manner. I can do the same subscribe and onCount.subscribe, and get all the same notifications:
const a = new CountedObservable<string>();
const countSub = a.onCount.subscribe(({newCount, prevCount}) => {
console.log('COUNTS:', newCount, prevCount);
});
const sub1 = a.subscribe(data => {
console.log('SUB-1:', data);
});
const sub2 = a.subscribe(data => {
console.log('SUB-2:', data);
});
a.next('hello');
sub1.unsubscribe();
sub2.unsubscribe();
countSub.unsubscribe();
I hope this will help somebody else also.
P.S. I further improved it as an independent module.

What is the best way or pattern to create a wrapper class to abstract the use of RabbitMQ in Node.js?

I'm using RabbitMQ for some projects, and I noticed that I keep duplicating the same code, which is why I decided to make a wrapper class or interface with a few functions for using RabbitMQ directly without repeating the code every time. I began doing this yesterday and already ran into some problems, since I wanted to use OOP, and JavaScript can be complicated when using OOP (at least I think so).
I began by creating a class IRabbitMQ with an init function to initialize a connection and create a channel. I knew that I can't use nested classes, so instead I wanted to use factory functions. I tried to make the connection and channel part of the IRabbitMQ class properties, but I don't know why that gives me undefined when I create an instance of it.
class IRabbitMQ {
constructor() {
this.init(rabbitMQServer); // rabbitMQServer for example 'localhost//5672'
}
// establish a Connection to RAbbitMQ Server
async init(host) {
try {
let connection = await amqplib.connect(host);
let channel = await connection.createChannel();
channel.prefetch(1);
console.log(' [x] Awaiting RPC requests');
this.connection = connection;
this.channel = channel;
}
catch(err) {
console.error(err);
}
}
// Close the Connection with RabbitMQ
closeConnection() {
this.connection.close();
}
log() {
console.log(this.connection);
}
EventPublisher() {
function init(IRabbit, publisherName) {
if(!IRabbit.connection) {
throw new Error('Create an Instance of IRabbitMQ to establish a Connection');
}
let ch = IRabbit.channel;
console.log(ch);
}
return {
init : init
}
}
}
var r = new IRabbitMQ();
r.log();
When I run the code, the output is undefined. I don't know why, since I'm initializing the connection and channel properties in the init function and calling that function in the constructor, so they should be initialized when I create an object of the wrapper class. I also wanted to ask for your advice: is it good to use classes here, or is there a better way to create a wrapper class or interface for RabbitMQ that makes it easy to use and avoids duplicating code?
Not really an answer, but I was able to successfully log the connection with this example code. I trimmed out the other code to focus on the .log() part that was logging undefined. The underlying issue is that the constructor kicks off the async init() without awaiting it (a constructor can't be async), so log() runs before this.connection has been assigned; the code below awaits init() outside the constructor instead.
The code is far from perfect, but at least it works.
const amqplib = require('amqplib');
class IRabbitMQ {
constructor() { }
async init(host) {
try {
const connection = await amqplib.connect(host);
const channel = await connection.createChannel();
channel.prefetch(1);
console.log(' [x] Awaiting RPC requests');
this.connection = connection;
this.channel = channel;
}catch(err) {
console.error(err);
}
}
log() {
console.log(this.connection);
}
}
async function createInstance(){
const instance = new IRabbitMQ();
try {
await instance.init('amqp://localhost');
}catch (e) {
throw new Error('OOPS!');
}
return instance;
}
async function runLogic() {
const r = await createInstance();
r.log();
}
runLogic().catch(console.log);
Just comment if you'd want me to give additional advice/tips, but this seems to work for me.
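One small variation on the same idea, if you prefer to keep the construction logic on the class itself, is a static async factory method (a sketch with the same caveats - a constructor simply cannot await the connection setup):

const amqplib = require('amqplib');

class IRabbitMQ {
  // Use IRabbitMQ.create() instead of calling `new` directly, because the
  // constructor cannot await the asynchronous connection/channel setup.
  static async create(host) {
    const instance = new IRabbitMQ();
    instance.connection = await amqplib.connect(host);
    instance.channel = await instance.connection.createChannel();
    instance.channel.prefetch(1);
    return instance;
  }

  log() {
    console.log(this.connection);
  }
}

// Usage (inside an async function):
// const r = await IRabbitMQ.create('amqp://localhost');
// r.log();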

Use Sinon.fakeServer with promises and mocha

My problem is the following: I want to test a method that uploads a bunch of data into an AWS S3 bucket. The problem is: I don't want to really upload data every time I run the tests, and I don't want to care about credentials sitting in the env. So I want to set up Sinon's fake-server module to simulate the upload and return the same results S3 would. Sadly, it seems to be difficult to find a working example with code using async/await.
My test looks like this:
import {skip, test, suite} from "mocha-typescript";
import Chai from "chai";
import {S3Uploader} from "./s3-uploader.class";
import Sinon from "sinon";
@suite
class S3UploaderTest {
public server : Sinon.SinonFakeServer | undefined;
before() {
this.server = Sinon.fakeServer.create();
}
after() {
if (this.server != null) this.server.restore();
}
@test
async "should upload a file to s3 correctly"(){
let spy = Sinon.spy();
const uploader : S3Uploader = new S3Uploader();
const upload = await uploader.send("HalloWelt").toBucket("onetimeupload.test").toFolder("test/hw.txt").upload();
Chai.expect(upload).to.be.a("object");
}
}
Inside the uploader.upload() method, I resolve a promise from a callback. So how can I simulate the upload process?
Edit: Here is the code of the s3-uploader:
import AWS from "aws-sdk";
export class S3Uploader {
private s3 = new AWS.S3({ accessKeyId : process.env.ACCESS_KEY_ID, secretAccessKey : process.env.SECRET_ACCESS_KEY });
private params = {
Body: null || Object,
Bucket: "",
Key: ""
};
public send(stream : any) {
this.params.Body = stream;
return this;
}
public toBucket(bucket : string) {
this.params.Bucket = bucket;
return this;
}
public toFolder(path : string) {
this.params.Key = path;
return this;
}
public upload() {
return new Promise((resolve, reject) => {
if (process.env.ACCESS_KEY_ID == null || process.env.SECRET_ACCESS_KEY == null) {
return reject("ERR_NO_AWS_CREDENTIALS");
}
this.s3.upload(this.params, (error : any, data : any) => {
return error ? reject(error) : resolve(data);
});
});
}
}
Sinon fake servers are something you would use to test a client that itself makes HTTP requests; they aren't meant for a wrapper around an existing client like AWS.S3, which is what you have. In this case, you're better off just stubbing the behavior of AWS.S3 instead of testing the actual requests it makes. That way you avoid testing the implementation details of AWS.S3.
Since you're using TypeScript and you've made your s3 client private, you're going to need to make some changes to expose it to your tests. Otherwise, you won't be able to stub its methods without the TS compiler complaining about it. You also won't be able to write assertions using the params object, for similar reasons.
Since I don't use TS regularly, I'm not too familiar with its common dependency injection techniques, but one thing you could do is add optional constructor arguments to your S3Uploader class that can overwrite the default s3 and params properties, like so:
constructor(s3?: AWS.S3, params?: any) {
if (s3) this.s3 = s3;
if (params) this.params = params;
}
After which, you can create a stub instance and pass it to your test instance like this:
const s3 = sinon.createStubInstance(AWS.S3);
const params = { foo: 'bar' };
const uploader = new S3Uploader(s3, params);
Once you have the stub instance in place, you can write assertions to make sure the upload method was called the way you want it to be:
sinon.assert.calledOnce(s3.upload);
sinon.assert.calledWith(s3.upload, sinon.match.same(params), sinon.match.func);
You can also affect the behavior of the upload method using the Sinon stub API. For example, to make it fail:
s3.upload.callsArgWith(1, new Error('Test Error'));
Or make it succeed like so:
const data = { whatever: 'data', you: 'want' };
s3.upload.callsArgWith(1, null, data);
You'll probably want a completely separate test for each of these cases, using an instance before hook to avoid duplicating the common setup stuff. Testing for success will involve simply awaiting the promise and checking that its result is the data. Testing for failure will involve a try/catch that ensures the promise was rejected with the proper error.
Also, since you seem to be doing actual unit tests here, I'll recommend testing each S3Uploader method separately instead of calling them all in once big test. This drastically reduces the number of possible cases you need to cover, making your tests a lot more straightforward. Something like this:
@suite
class S3UploaderTest {
params: any; // Not sure the best way to type this.
s3: any; // Same. Sorry, not too experienced with TS.
uploader: S3Uploader | undefined;
before() {
this.params = {};
this.s3 = sinon.createStubInstance(AWS.S3);
this.uploader = new S3Uploader(this.s3, this.params);
}
@test
"send should set Body param and return instance"() {
const stream = "HalloWelt";
const result = this.uploader.send(stream);
Chai.expect(this.params.Body).to.equal(stream);
Chai.expect(result).to.equal(this.uploader);
}
@test
"toBucket should set Bucket param and return instance"() {
const bucket = "onetimeupload.test"
const result = this.uploader.toBucket(bucket);
Chai.expect(this.params.Bucket).to.equal(bucket);
Chai.expect(result).to.equal(this.uploader);
}
@test
"toFolder should set Key param and return instance"() {
const path = "onetimeupload.test"
const result = this.uploader.toFolder(path);
Chai.expect(this.params.Key).to.equal(path);
Chai.expect(result).to.equal(this.uploader);
}
@test
"upload should attempt upload to s3"() {
this.uploader.upload();
sinon.assert.calledOnce(this.s3.upload);
sinon.assert.calledWith(
this.s3.upload,
sinon.match.same(this.params),
sinon.match.func
);
}
@test
async "upload should resolve with response if successful"() {
const data = { foo: 'bar' };
this.s3.upload.callsArgWith(1, null, data);
const result = await this.uploader.upload();
Chai.expect(result).to.equal(data);
}
@test
async "upload should reject with error if not"() {
const error = new Error('Test Error');
this.s3.upload.callsArgWith(1, error, null);
try {
await this.uploader.upload();
throw new Error('Promise should have rejected.');
} catch(err) {
Chai.expect(err).to.equal(error);
}
}
}
If I were doing this with mocha proper, I'd group each method's tests into a nested describe block. I'm not sure if that's encouraged or even possible with mocha-typescript, but if so you might consider it.
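For reference, with plain mocha that grouping would look roughly like this (a sketch only; the assertions are the same ones used above):

describe('S3Uploader', () => {
  let s3: any;
  let params: any;
  let uploader: S3Uploader;

  beforeEach(() => {
    params = {};
    s3 = sinon.createStubInstance(AWS.S3);
    uploader = new S3Uploader(s3, params);
  });

  describe('send', () => {
    it('sets the Body param and returns the instance', () => {
      const result = uploader.send('HalloWelt');
      Chai.expect(params.Body).to.equal('HalloWelt');
      Chai.expect(result).to.equal(uploader);
    });
  });

  describe('upload', () => {
    it('resolves with the response on success', async () => {
      const data = { foo: 'bar' };
      s3.upload.callsArgWith(1, null, data);
      const result = await uploader.upload();
      Chai.expect(result).to.equal(data);
    });
  });
});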

Trying to make a simple HTTP GET request in Angular Dart

I am learning front-end development in general with Angular Dart for a personal project (I've just learned back-end development with Django). I get easily confused by the HTTP tutorial because, from my perspective as a beginner web developer, it tries to do a lot of things at the same time in different files and places, all for one purpose (it might be efficient to code that way, but I find it hard to learn from). I created an API for the project, and I want to know how to make a simple HTTP GET request to build from there.
This is the JSON object that I'm trying to display:
{
"kanji": "老",
"onyomi": "ロウ",
"kunyomi": "お.いる、ふ.ける",
"nanori": "えび, おい, び",
"rtk_keyword": "old man",
"english": "old man, old age, grow old",
"jlpt": 3,
"jouyou_grade": "4",
"frequency": 803,
"radicals": [
2555,
2613
],
"pk": 1267
}
And this is my failed attempt at displaying that data:
import "dart:async";
import 'dart:convert';
import 'package:angular2/angular2.dart';
import 'package:http/http.dart' as http;
import 'package:http/http.dart';
@Component(
selector: "test",
template: """
<button (click)="change_info()">{{info}}</button>
""",
)
class Test {
String info = "default info";
String _url = 'localhost:8000/kanji_detail/老';
String get_and_decode(String url) {
String data_to_return;
http.get(url).then((response) => data_to_return = JSON.decode(response.body));
return data_to_return;
}
String get_and_decode_long(String url) {
Response request_response;
Future request_future;
String data_to_return;
request_future = get(url);
request_future.then((response) => request_response = response);
data_to_return = JSON.decode(request_response.body);
return data_to_return;
}
change_info() {
info = get_and_decode_long(_url);
}
}
get_and_decode_long reflects my understanding that a Future and a Response are involved in this process; it wasn't very obvious.
Future<String> get_and_decode(String url) async {
var response = await http.get(url);
return JSON.decode(response.body);
}
There is no way to get back from async to sync execution, but async/await makes it quite easy to work with async code.
In your example, return data_to_return; is executed before the response arrives. Note that change_info() then also needs to become async and await the Future returned by get_and_decode before assigning the result to info.
See also
https://www.dartlang.org/tutorials/language/futures
https://www.dartlang.org/articles/language/await-async
