Trying to make a simple HTTP GET request in AngularDart

I am learning front-end development with AngularDart for a personal project (I've just learned back-end development with Django). The HTTP tutorial easily confuses me because, from my perspective as a web development beginner, it does a lot of things at the same time across different files and places, all for a single purpose (it may be efficient to write code that way, but I find it hard to learn from). I created an API for the project, and I want to know how to make a simple HTTP GET request so I can build from there.
This is the JSON object that I'm trying to display:
"kanji": "老",
"onyomi": "ロウ",
"kunyomi": "お.いる、ふ.ける",
"nanori": "えび, おい, び",
"rtk_keyword": "old man",
"english": "old man, old age, grow old",
"jlpt": 3,
"jouyou_grade": "4",
"frequency": 803,
"radicals": [
2555,
2613
],
"pk": 1267
And this is my failed attempt at displaying that data:
import "dart:async";
import 'dart:convert';
import 'package:angular2/angular2.dart';
import 'package:http/http.dart' as http;
import 'package:http/http.dart';
@Component(
  selector: "test",
  template: """
    <button (click)="change_info()">{{info}}</button>
  """,
)
class Test {
  String info = "default info";
  String _url = 'localhost:8000/kanji_detail/老';

  String get_and_decode(String url) {
    String data_to_return;
    http.get(url).then((response) => data_to_return = JSON.decode(response.body));
    return data_to_return;
  }

  String get_and_decode_long(String url) {
    Response request_response;
    Future request_future;
    String data_to_return;
    request_future = get(url);
    request_future.then((response) => request_response = response);
    data_to_return = JSON.decode(request_response.body);
    return data_to_return;
  }

  change_info() {
    info = get_and_decode_long(_url);
  }
}
get_and_decode_long is my attempt at understanding that a Future and a Response are involved in this process; it wasn't very obvious to me.

Future<String> get_and_decode(String url) async {
  var response = await http.get(url);
  return JSON.decode(response.body);
}
There is no way to get back from async to sync execution. async / await makes it quite easy to work with async code, though.
In your example, return data_to_return; is executed before the response arrives.
See also
https://www.dartlang.org/tutorials/language/futures
https://www.dartlang.org/articles/language/await-async


kdbxweb usage for creating dbs, storing and retrieving passwords for use in scripts/jobs

I'm trying, and failing, to learn to use the kdbxweb library. The documentation for it confuses me, probably because I lack some prerequisite knowledge that it assumes, so it isn't really written for someone at my level yet.
Below is the code where I'm trying to learn to use it. All I really want this for is a place to store passwords, rather than keeping them in plain text, in a way that lets me send a script to a team member so they can set up a similar credentials database (either within the script or outside it) and have it pull in their various ODBC database passwords.
Eventually, the idea is to name each entry after a given ODBC connection, so that when a connection is requested the UID and PWD are retrieved and added to the connection string. I'm trying to get away from MS Access/VBA for this sort of thing and learn to use NodeJS/TypeScript for it instead.
import * as fs from 'fs';
import * as kdbx from 'kdbxweb';
(async () => {
  try {
    const database = kdbx.Kdbx.create(new kdbx.Credentials(kdbx.ProtectedValue.fromString('test')), 'credentials');
    //const group = database.createGroup(database.getDefaultGroup(), 'subgroup');
    //const entry = database.createEntry(group);
    //entry.fields.set('Password', kdbx.ProtectedValue.fromString('test'));
    //entry.pushHistory();
    //entry.times.update();
    await database.save();
    //fs.writeFileSync('credentials/credentials.kdbx', data);
  } catch (e: any) {
    throw e;
  }
})();
The error I'm getting when trying to do this is "argon2 not implemented", and while argon2 is mentioned at the top of the documentation, I don't understand what it's talking about in the least. It sounds like it has to do with an additional cryptography API that I don't think I should even need. I tried to take the code of the example implementation, but I had no idea how to actually make use of it.
I also tried reading the code of the web app written with this library, but the way it's integrated into the application makes it completely impossible for me to parse at this point. I can't tell what types of objects are being passed around, etc., so I can't trace the flow of information.
Old solution below
I found a better solution. It was painful to learn how to do this, but I did eventually get it working: I'm using the Node C++ implementation of argon2 instead, and it no longer echoes the minified script into the console.
This is set up as argon/argon2node.ts and requires the argon2 node library. Now that I have this working, I think I could probably switch to the Rust version or something like that if I wanted to. It's mostly about figuring out exactly where the parameters need to go, since sometimes the names are a little different and you have to convert various parameters around.
import { Argon2Type, Argon2Version } from "kdbxweb/dist/types/crypto/crypto-engine";
import argon from 'argon2';
export default async function argon2(
  password: ArrayBuffer,
  salt: ArrayBuffer,
  memory: number,
  iterations: number,
  length: number,
  parallelism: number,
  type: Argon2Type,
  version: Argon2Version
): Promise<ArrayBuffer> {
  try {
    // https://github.com/keeweb/kdbxweb/blob/master/test/test-support/argon2.ts - reviewed this and
    // eventually figured out how to switch to the C++ implementation below, after much pain.
    const hashArr = new Uint8Array(await argon.hash(
      Buffer.from(new Uint8Array(password)), {
        timeCost: iterations,
        memoryCost: memory,
        parallelism: parallelism,
        version: version,
        type: type,
        salt: Buffer.from(new Uint8Array(salt)),
        raw: true
      }
    ));
    return Promise.resolve(hashArr);
  } catch (e) {
    return Promise.reject(e);
  }
}
And below is my ODBC credentials lookup based on it:
import * as fs from 'fs';
import * as kdbx from 'kdbxweb';
import argon2 from './argon/argon2node';
import * as byteUtils from 'kdbxweb/lib/utils/byte-utils';
export default async (title: string) => {
  try {
    kdbx.CryptoEngine.setArgon2Impl(argon2);
    const readBuffer = byteUtils.arrayToBuffer(fs.readFileSync('./SQL/credentials/credentials.kdbx'));
    const database = await kdbx.Kdbx.load(
      readBuffer,
      new kdbx.Credentials(kdbx.ProtectedValue.fromString('CredentialsStorage1!'))
    );
    let result;
    database.getDefaultGroup().entries.forEach((e) => {
      if (e.fields.get('Title') === title) {
        const password = (<kdbx.ProtectedValue>e.fields.get('Password')).getText();
        const user = <string>e.fields.get('UserName');
        result = `UID=${user};PWD=${password}`;
        return;
      }
    });
    return result;
  } catch (e: any) {
    throw e;
  }
}
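For what it's worth, this is roughly how I'd expect the lookup above to be consumed when opening a connection. Treat it as an illustration only: the node odbc package, the import path, and the DSN name are placeholders I've made up here, not part of the module above.
import odbc from 'odbc';
// Placeholder path - import the default-exported lookup function from wherever the module above lives.
import getOdbcCredentials from './SQL/credentials/credentialsLookup';

(async () => {
  // 'odbc' matches the entry Title written in the test script further below.
  const uidPwd = await getOdbcCredentials('odbc'); // e.g. "UID=...;PWD=..."

  // 'MyDsn' is a made-up DSN name; any ODBC client that takes a connection string would work the same way.
  const connection = await odbc.connect(`DSN=MyDsn;${uidPwd}`);
  const result = await connection.query('SELECT 1 AS ok');
  console.log(result);
  await connection.close();
})();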
To resolve this, I had to do a bit of reading of the documentation to understand buffers and ArrayBuffers and such, which wasn't easy, but I eventually figured it out and created the test below for reading and writing entries. I still have a bit to learn, but this is close enough that I thought it worth sharing for anyone else who may try to use this.
I also had to get a copy of argon2-asm.min.js and argon2.ts, which I pulled from the GitHub repo for KeeWeb, which is built with this library.
import * as fs from 'fs';
import * as kdbx from 'kdbxweb';
import { argon2 } from './argon/argon2';
function toArrayBuffer(buffer: Buffer) {
  return buffer.buffer.slice(buffer.byteOffset, buffer.byteOffset + buffer.byteLength);
}

function toBuffer(byteArray: ArrayBuffer) {
  return Buffer.from(byteArray);
}

(async () => {
  try {
    kdbx.CryptoEngine.setArgon2Impl(argon2);
    fs.unlinkSync('./SQL/credentials/credentials.kdbx');
    const database = kdbx.Kdbx.create(new kdbx.Credentials(kdbx.ProtectedValue.fromString('test')), 'credentials');
    const entry = database.createEntry(database.getDefaultGroup());
    entry.fields.set('Title', 'odbc');
    entry.fields.set('Password', kdbx.ProtectedValue.fromString('test'));
    const data = await database.save();
    fs.writeFileSync('./SQL/credentials/credentials.kdbx', new DataView(data));
    const readData = toArrayBuffer(fs.readFileSync('./SQL/credentials/credentials.kdbx'));
    console.log('hithere');
    const read = await kdbx.Kdbx.load(
      readData,
      new kdbx.Credentials(kdbx.ProtectedValue.fromString('test'))
    );
    console.log('bye');
    console.log(read.getDefaultGroup().entries[0].fields.get('Title'));
    const protectedPass = <kdbx.ProtectedValue>read.getDefaultGroup().entries[0].fields.get('Password');
    console.log(
      new kdbx.ProtectedValue(
        protectedPass.value,
        protectedPass.salt
      ).getText()
    );
  } catch (e: any) {
    console.error(e);
    throw e;
  }
})();
Things I don't grasp and would like to understand better include why the argon2 implementation isn't built in. The author says "Due to complex calculations, you have to implement it manually", but this just seems odd. Perhaps not appropriate for this forum, but it would be nice to know about alternatives if this approach is slow or something.
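As far as I can tell, the answer is that kdbxweb deliberately keeps the KDF pluggable: it only defines the signature of the argon2 function it needs and leaves the actual computation to whatever you register, which is exactly what the argon2node wrapper above plugs in. Restating that contract as a type (the import paths are the same ones used above; this is just a sketch of my understanding):
import * as kdbx from 'kdbxweb';
import { Argon2Type, Argon2Version } from 'kdbxweb/dist/types/crypto/crypto-engine';
import argon2 from './argon/argon2node'; // the wrapper defined earlier

// Any function with this shape can be registered; kdbxweb itself ships no argon2 implementation.
type Argon2Impl = (
  password: ArrayBuffer,
  salt: ArrayBuffer,
  memory: number,
  iterations: number,
  length: number,
  parallelism: number,
  type: Argon2Type,
  version: Argon2Version
) => Promise<ArrayBuffer>;

const impl: Argon2Impl = argon2;
kdbx.CryptoEngine.setArgon2Impl(impl);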

Console.log Refuses to Print Statements in Mutation File

I am using a PostgreSQL database with a GraphQL / NodeJS server. One of the mutations is giving me extensive problems due to its intrinsically complex nature. I am using console.log statements throughout so I can track the data, but not a SINGLE statement prints. Before you all jump on me and say the mutation probably isn't getting hit: that's not the case. I'm getting return values and the mutation is occurring (I checked the Network section of the browser to confirm, and I have error handlers that never trigger), but nothing gets printed.
The code for one of the two mutations called simultaneously is as follows...
import db from "../../../../utils/generatePrisma.js";
import checkOwnerAuth from "../../../../utils/checkAuthorization/check-owner-auth.js";
import checkManagerAuth from "../../../../utils/checkAuthorization/check-manager-auth.js";
export default {
Mutation: {
scorecardToolCreateWeeklyReports: async (_, {
token,
dspId,
role,
transporterId,
date,
feedbackStatus,
feedbackMessage,
feedbackMessageSent,
rank,
tier,
delivered,
keyFocusArea,
fico,
seatbeltOffRate,
speedingEventRate,
distractionsRate,
followingDistanceRate,
signalViolationsRate,
deliveryCompletionRate,
deliveredAndRecieved,
photoOnDelivery,
attendedDeliveryAccuracy,
dnr,
podOpps,
ccOpps
}, context) => {
let owner;
let manager;
if (role === 'OWNER') {
owner = await checkOwnerAuth(token)
}
if (role === 'MANAGER') {
manager = await checkManagerAuth(token)
}
const foundDriver = await db.driver.findFirst({
where: {
transporterId: transporterId,
dspId: dspId
}
})
if (!foundDriver) {
throw new Error('Driver does not exist')
}
console.log("\n-----------------------\n Found Driver in scoreCardToolCreateWeeklyReport")
console.log(foundDriver)
try {
return await db.weeklyReport.create({
data: {
driver: {
connect: {
id: foundDriver.id
}
},
date: date,
feedbackStatus: feedbackStatus,
feedbackMessage: feedbackMessage,
feedbackMessageSent: feedbackMessageSent,
rank: rank,
tier: tier,
delivered: delivered,
keyFocusArea: keyFocusArea,
fico: fico,
seatbeltOffRate: seatbeltOffRate,
speedingEventRate: speedingEventRate,
distractionsRate: distractionsRate,
followingDistanceRate: followingDistanceRate,
signalViolationsRate: signalViolationsRate,
deliveryCompletionRate: deliveryCompletionRate,
deliveredAndRecieved: deliveredAndRecieved,
photoOnDelivery: photoOnDelivery,
attendedDeliveryAccuracy: attendedDeliveryAccuracy,
dnr: dnr,
podOpps: podOpps,
ccOpps: ccOpps
}
}).then( (resolved) => {
console.log(resolved)
})
} catch (error) {
console.log("\n---------------\n Error in WeeklyReportCreation")
console.log(error)
throw new Error(error)
}
}
}
}
Interestingly enough, the return I receive is exactly what I would want and expect; however, it does not persist. On any refresh, rerender, or movement between pages there's no model - it's like the mutation never happened. When the mutation is called from the frontend it runs somewhat as expected. Hitting Inspect in the browser and looking at the response section of the Network tab, I get the following...
{"data":
{"scorecardToolCreateWeeklyReports":
{"id":"91c7dd10-0af8-4fc7-9906-20643700c97f",
"createdAt":"2022-04-12T18:09:09.787Z",
"date":"11-24-22",
"hadAccident":false,
"feedbackMessage":"null",
"feedbackMessageSent":false,
"feedbackStatus":"Fantastic",
"acknowledged":false,
"acknowledgedAt":null,
"rank":1,
"tier":"Fantastic",
"delivered":116,
"keyFocusArea":"null",
"fico":"850",
"seatbeltOffRate":"Coming Soon",
"speedingEventRate":"Coming Soon",
"distractionsRate":"Coming Soon",
"followingDistanceRate":"Coming Soon",
"signalViolationsRate":"Coming Soon",
"deliveryCompletionRate":"100",
"deliveredAndRecieved":"100",
"photoOnDelivery":"100",
"attendedDeliveryAccuracy":0,
"dnr":0,
"podOpps":54,
"ccOpps":0,
"__typename":"WeeklyReport"}}}
The mutation shown before is then placed into a minor resolver...
import GraphQLJSON from "graphql-type-json";
import scorecardToolCreateDriverAccounts from "./mutations/scorecardToolCreateDriverAccounts.js";
import scorecardToolCreateWeeklyReports from "./mutations/scorecardToolCreateWeeklyReports.js";
export default {
Query: {
},
Mutation: {
...scorecardToolCreateDriverAccounts.Mutation,
...scorecardToolCreateWeeklyReports.Mutation
},
JSON: GraphQLJSON
}
This minor resolver is then imported into the main resolver:
import GraphQLJSON from 'graphql-type-json';
// NEW RESOLVERS
import ownerReslovers from './owner/ownerResolvers.js';
import managerResolvers from './manager/managerResolvers.js';
import driverResolvers from './driver/driverResolvers.js';
import dspResolvers from './dsp/dspResolvers.js';
import weeklyReportResolvers from './weeklyReport/weeklyReportResolvers.js';
import scorecardResolvers from './scorecardTool/scorecardResolvers.js';
import chatroomResolvers from './chatrooms/chatroomResolvers.js';
import shiftPlannerResolvers from './shiftPlanner/shiftPlannerResolvers.js';
import messagesResolvers from './messages/messagesResolvers.js';
import accidentResolvers from './accidents/accidentResolvers.js';
import shiftPlannerDatesResolvers from './shiftPlannerDates/shiftPlannerDatesResolvers.js';
import shiftResolvers from './shift/shiftReolvers.js';
// ADDITIONAL RESOLVERS
import additionalResolvers from './additional/additionalResolvers.js';
export default {
Query: {
...ownerReslovers.Query,
...managerResolvers.Query,
...driverResolvers.Query,
...dspResolvers.Query,
...weeklyReportResolvers.Query,
...shiftPlannerResolvers.Query,
...chatroomResolvers.Query,
...messagesResolvers.Query,
...accidentResolvers.Query,
...shiftPlannerDatesResolvers.Query,
...shiftResolvers.Query,
...scorecardResolvers.Query,
...additionalResolvers.Query
},
Mutation: {
...ownerReslovers.Mutation,
...managerResolvers.Mutation,
...driverResolvers.Mutation,
...dspResolvers.Mutation,
...weeklyReportResolvers.Mutation,
...shiftPlannerResolvers.Mutation,
...chatroomResolvers.Mutation,
...messagesResolvers.Mutation,
...accidentResolvers.Mutation,
...shiftPlannerDatesResolvers.Mutation,
...shiftResolvers.Mutation,
...scorecardResolvers.Mutation,
...additionalResolvers.Mutation
},
JSON: GraphQLJSON,
}
It turns out my final thought was correct. The mutation is 100% being run, and the log statements are being hit.
The issue is due to the AWS EC2 CodePipeline deployment. The backend server is perpetually running on AWS's end, so the console where the logs appear is no longer my VS Code terminal but the instance's console on AWS.
The same goes for the front end: if I run the code locally, none of the new logs show up. I can only see ALL of the frontend logs if I load the site from its IP address in the browser, not from localhost.
That all being said, I currently have no idea how to view that console in AWS, so if anyone is privy to that information and cares to share, it would be greatly appreciated.

Suggestion to Build a multiplayer Texas Holdem Poker game using NodeJS, Flutter

I have been building basic and complex mobile apps in Android and Flutter, and I also have knowledge of NodeJS. I have already built a basic NodeJS multiplayer server for Texas Hold'em Poker. Multiple rooms and the table logic are still remaining.
I want to develop the client mobile app in Flutter, as I have deep knowledge of Flutter.
I have been exploring the packages and convenient tools I would use in Flutter but am still clueless.
Game development in Flutter is a new challenge for me, so I would like to ask whether the technology stack for such a game is good or not.
Should I consider switching from Flutter to something else?
I'd say go for it. I created a card game myself while learning Flutter - just needed a fun project to see it end to end.
Similarly to you, I used node.js in the backend, enabling me to play the game with my friends.
I ended up going with GraphQL for my back-end services (using the AWS AppSync service). Since GraphQL gives you push notifications over websockets, it was ideal for sending data between the different clients playing the game. Node.js deployed on AWS Lambda with DynamoDB for persistence worked without a problem.
I also ended up writing an AI (using Monte Carlo Tree Search) that enabled me to play the game against the computer. And as the final bit, I ported my back-end node.js interface to Dart, so I could play the game against the AI fully offline.
I think I spent the most time figuring out how to do the card animation, especially since I wanted the card moves to happen sequentially. The AI would sometimes play its moves too fast, so there were multiple cards flying around at the same time. Once this was done, the rest of it was easy.
If you need more details, let me know. I'm happy to share bits of the code with you.
Edit: here are some code details. I'll try to edit out the parts of the code that you won't need, so some of it will probably fail to compile at first... And keep in mind that some things may be overcomplicated in my code - once I got things up and running I would move on, so there is a lot to improve here...
First, the back-end. As I mentioned, I used AWS: DynamoDB to store the game data, a single Lambda function to do all the work, and the AppSync API - simply because it offers message push to the clients, so you don't need to keep polling the back-end to see if there are any changes. And finally, I used a Cognito user pool to authenticate users. In my app you can create your account, validate it through email, etc.
I used AWS Amplify to set up the back-end: this is an AWS framework that enables you to deploy your back-end easily even if you don't know that much about AWS. It is actually meant for app developers who don't need or want to learn about AWS in detail. Even if you know your way around AWS, it is still very useful, since it automates a lot of the security wiring for you and helps you automate the deployment.
Currently there is an official amplify-flutter package; at the time I did this I used the third-party package https://pub.dev/packages/amazon_cognito_identity_dart_2.
Now I'm assuming that you have your back-end set up: you deployed your node.js code in Lambda, and now you need your AppSync schema. Here's mine:
type Match @aws_iam @aws_cognito_user_pools {
  matchId: Int
  matchData: AWSJSON!
}
type PlayEvent @aws_iam @aws_cognito_user_pools {
  matchId: Int!
  success: Boolean!
  playerId: Int!
  eventType: String!
  inputData: AWSJSON
  eventData: AWSJSON
  matchData: AWSJSON
}
input PlayEventInput {
  matchId: Int!
  eventType: String!
  eventData: AWSJSON
  playerId: Int
}
type Query {
  getMatch(matchId: Int): Match @function(name: "tresetaFv2-${env}") @aws_iam @aws_cognito_user_pools
}
type Mutation {
  playEvent(input: PlayEventInput!): PlayEvent @function(name: "tresetaFv2-${env}") @aws_iam @aws_cognito_user_pools
}
type Subscription {
  onPlayEvent(matchId: Int, success: Boolean): PlayEvent @aws_subscribe(mutations: ["playEvent"])
}
As you can see, I'm using the AWSJSON data type a lot - instead of coding my entire game schema in GraphQL, I'm simply passing JSON back and forth.
A few things to understand:
The Match type holds your game state in matchData: what cards each player holds, what the cards in the deck and on the table are, whose turn it is to play, etc.
The getMatch query is fired by the client app after selecting a match from the active matches list. When you join the game, this is how you fetch the game state.
The playEvent mutation is how you play your move: you pass a PlayEventInput, specifying matchId, playerId, and the event type (in your case: play card, fold, call...). The playEvent mutation returns a PlayEvent, telling you whether the move was successful (was it legal at the time? was it your turn to play?), and it returns matchData - the new game state after the move was played.
And finally, the onPlayEvent subscription. After each client joins a match (by match ID), it subscribes to this subscription. As you can see from its definition, it subscribes to the playEvent mutation: each time a player plays a move, all the other players are notified about it and receive the result of the mutation, so everyone gets the full game state to refresh their UI. A nice trick here: you subscribe only to mutations that have success=true, so failed moves don't push any message.
You will also see some annotations - this is how you tell Amplify to wire things up for you:
@function(name: "tresetaFv2-${env}"): you tell it to call the Lambda function named "tresetaFv2-${env}" to actually do the work.
@aws_iam @aws_cognito_user_pools: this is how you tell it that the API is secured by Cognito user pools.
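For context on what sits behind those @function annotations: the Lambda itself is just a normal node.js handler. The sketch below is only a rough illustration of the contract, not my actual code - the event fields (typeName, fieldName, arguments) are what I recall Amplify's @function directive passing through, and loadMatch / applyMove are hypothetical stand-ins for your own DynamoDB access and poker rules:
// Rough TypeScript sketch of the Lambda behind the @function annotations above (illustrative only).
interface PlayEventInput {
  matchId: number;
  eventType: string;
  eventData?: string; // AWSJSON arrives as a JSON string
  playerId?: number;
}

// Hypothetical stand-ins for the real DynamoDB access and game rules:
async function loadMatch(matchId: number): Promise<unknown> {
  return { matchId, deck: [], players: [] }; // placeholder state
}
async function applyMove(input: PlayEventInput): Promise<{ success: boolean; matchData: unknown }> {
  return { success: true, matchData: { matchId: input.matchId } }; // placeholder
}

export const handler = async (event: {
  typeName: 'Query' | 'Mutation';
  fieldName: string;
  arguments: { matchId?: number; input?: PlayEventInput };
}) => {
  if (event.typeName === 'Query' && event.fieldName === 'getMatch') {
    const matchData = await loadMatch(event.arguments.matchId!);
    // Matches the Match type: AWSJSON fields are returned as JSON strings.
    return { matchId: event.arguments.matchId, matchData: JSON.stringify(matchData) };
  }

  if (event.typeName === 'Mutation' && event.fieldName === 'playEvent') {
    const input = event.arguments.input!;
    const { success, matchData } = await applyMove(input);
    // Matches the PlayEvent type, which the onPlayEvent subscription fans out to the other clients.
    return {
      matchId: input.matchId,
      playerId: input.playerId ?? 0,
      eventType: input.eventType,
      inputData: input.eventData ?? null,
      eventData: null,
      success,
      matchData: JSON.stringify(matchData),
    };
  }

  throw new Error(`Unhandled field: ${event.typeName}.${event.fieldName}`);
};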
So now we have the AWS back-end set up. How do you actually run the query and subscribe to the events coming from the back-end?
First, I use this UserService in my code: https://github.com/furaiev/amazon-cognito-identity-dart-2/blob/main/example/lib/user_service.dart
This is my AppSync Service - this is just a generic way of calling AppSync:
import 'dart:convert';
import 'package:amazon_cognito_identity_dart_2/cognito.dart';
import 'package:flutter/foundation.dart';
import 'package:http/http.dart' as http;
import 'package:treseta_app/auth/auth_services.dart';
import 'package:web_socket_channel/web_socket_channel.dart';
import 'package:web_socket_channel/io.dart';
import 'package:web_socket_channel/web_socket_channel.dart';
class AppSyncImpl {
final String _endPoint;
String get _wsEndPoint => _endPoint.replaceAll('https', 'wss').replaceAll('appsync-api', 'appsync-realtime-api');
String get _host => _endPoint.replaceAll('https://', '').replaceAll('/graphql', '');
UserService _userService;
set userService(UserService userService) => this._userService = userService;
AppSyncImpl(this._endPoint, this._userService);
Future<http.Response> getHttpResponse(Map<String, String> query) async {
CognitoUserSession session = await _userService.getSession();
return http.post(
_endPoint,
headers: {
'Authorization': session.getAccessToken().getJwtToken(),
'Content-Type': 'application/json',
},
body: json.encode(query),
);
}
Future<Map<String, dynamic>> glQueryMutation(Map<String, String> query) async {
http.Response response;
try {
response = await getHttpResponse(query);
} catch (e) {
print(e);
}
return json.decode(response.body);
}
int wsTimeoutInterval;
void disconnectWebSocketChannel(WebSocketChannel wsChannel, String uniqueKey) {
wsChannel.sink.add(json.encode({'type': 'stop', 'id': uniqueKey}));
}
Future<WebSocketChannel> wsSubscription(
{@required Map<String, String> query,
@required String uniqueKey,
Function(Map<String, dynamic>) listener}) async {
var jwtToken = (await _userService.getSession()).getIdToken().getJwtToken();
var header = {'host': _host, 'Authorization': jwtToken};
var encodedHeader = Base64Codec().encode(Utf8Codec().encode(json.encode(header)));
// Note 'e30=' is '{}' in base64
var wssUrl = '$_wsEndPoint?header=$encodedHeader&payload=e30=';
var channel = IOWebSocketChannel.connect(wssUrl, protocols: ['graphql-ws']);
channel.sink.add(json.encode({"type": "connection_init"}));
channel.stream.listen((event) {
var e = json.decode(event);
switch (e['type']) {
case 'connection_ack':
wsTimeoutInterval = e['payload']['connectionTimeoutMs'];
var register = {
'id': uniqueKey,
'payload': {
'data': json.encode(query),
'extensions': {'authorization': header}
},
'type': 'start'
};
var payload = json.encode(register);
channel.sink.add(payload);
break;
case 'data':
listener(e);
break;
case 'ka':
print('Reminder: keep alive is not yet implemented!!!');
break;
case 'start_ack':
print('Ws Channel: Subscription started');
break;
default:
print('Unknown event $event');
}
}, onError: (error, StackTrace stackTrace) {
print('Ws Channel: $error');
}, onDone: () {
channel.sink.close();
print('Ws Channel: Done!');
});
return channel;
}
}
And this is the actual back-end service; notice how there is a function that corresponds to each query, mutation and subscription in GraphQL Schema (getMatchData, playEvent, subscribeWS):
import 'dart:async';
import 'dart:convert';
import 'package:flutter/foundation.dart';
import 'package:flutter/widgets.dart';
import 'package:treseta_app/appsync/app_sync.dart';
import 'package:treseta_app/auth/auth_services.dart';
import 'package:treseta_app/models/api_models.dart' as api;
import 'package:meta/meta.dart';
import 'package:treseta_app/models/game_model.dart';
import 'package:web_socket_channel/web_socket_channel.dart';
class AwsBackendService extends ChangeNotifier {
final subscriptionStreamCtrl = StreamController<api.PlayEvent>.broadcast();
Stream get subscriptionStream => subscriptionStreamCtrl.stream;
UserService userService;
String endPoint;
AppSyncImpl _appSyncImpl;
BuildContext context;
AwsBackendService({@required this.context, this.userService, this.endPoint})
: _appSyncImpl = AppSyncImpl(endPoint, userService);
@override
Future<api.Match> getMatchData(int matchId) async {
final query = {
'operationName': 'GetMatch',
'query': '''query GetMatch { getMatch(matchId:$matchId){matchData, playerId}}'''
};
User user = await userService.getCurrentUser();
String username = user.name;
var matchData = await _appSyncImpl.glQueryMutation(query);
var match = api.Match.fromJson(matchData['data']['getMatch']);
match.matchData.username = username;
return match;
}
Future<api.PlayEvent> playEvent(int matchId, int playerId, String eventType, api.PlayEventInputData eventData) async {
var encoded = json.encode(eventData.toJson()).replaceAll("\"", "\\\"");
final query = {
'operationName': 'PlayEvent',
'query':
'''mutation PlayEvent { playEvent(input:{matchId:$matchId, eventType:"$eventType", eventData:"$encoded"}){matchId, eventData, success, playerId, eventType, inputData, matchData}}''',
};
api.PlayEvent result =
await _appSyncImpl.glQueryMutation(query).then((value) => api.PlayEvent.fromJson(value['data']['playEvent']));
User user = await userService.getCurrentUser();
String username = user.name;
result.matchData.username = username;
return result;
}
WebSocketChannel _wsClient;
String _wsUniqeKey;
Future<void> subscribeWs(int matchId, int playerId, MatchData model) async {
final query = {
'operationName': 'OnPlayEvent',
'query':
'subscription OnPlayEvent { onPlayEvent(matchId:$matchId, success:true) {matchId, success,playerId,eventType,inputData,eventData,matchData}}',
};
_wsUniqeKey = UniqueKey().toString();
_wsClient = await _appSyncImpl.wsSubscription(
query: query,
uniqueKey: _wsUniqeKey,
listener: (Map<String, dynamic> message) {
var playEvent = api.PlayEvent.fromJson(message['payload']['data']['onPlayEvent']);
subscriptionStreamCtrl.sink.add(playEvent);
});
return;
}
void disconnect() {
try {
if (_wsClient != null) {
_appSyncImpl.disconnectWebSocketChannel(_wsClient, _wsUniqeKey);
}
} catch (e) {
print(e);
}
}
@override
void dispose() {
super.dispose();
subscriptionStreamCtrl.close();
disconnect();
}
}
You will see that each event we receive from the AppSync subscription is just pumped into a Stream object.
And finally, my main ChangeNotifier provider, which holds the match data and notifies the UI, is something like this - you will see how the incoming subscription events are processed and how the UI is notified that there's a new event to be animated: a card being thrown, a new hand dealt, etc.
import 'dart:async';
import 'package:flutter/widgets.dart';
import 'package:treseta_app/backend/backend_service.dart';
import 'package:treseta_app/models/api_models.dart';
import 'package:treseta_app/models/game_model.dart';
class Treseta extends ChangeNotifier {
MatchData model;
int playerId;
AwsBackendService _backendService;
StreamSubscription backendServiceSubscription;
AwsBackendService get backendService => _backendService;
set backendService(BackendService value) {
if (backendServiceSubscription != null) backendServiceSubscription.cancel();
_backendService = value;
// this is where we process the incoming subscription messages:
backendServiceSubscription = _backendService.subscriptionStream.listen((inEvent) {
PlayEvent playEvent = inEvent as PlayEvent;
playCard(
inData: playEvent,
playerId: playEvent.playerId,
card: playEvent.inputData.card);
});
}
int _matchId;
int get matchId => _matchId;
set matchId(int value) {
_matchId = value;
subscribeAndGetMatchData();
}
Future<void> disconnect() async => backendService.disconnect();
Future<MatchData> getMatchData() async {
var match = await backendService.getMatchData(matchId);
playerId = match.playerId;
model = match.matchData;
return model;
}
Future<void> subscribeAndGetMatchData() async {
if (matchId != null) {
disconnect();
await getMatchData();
await subscribe();
}
return;
}
Future<void> subscribe() async {
backendService.subscribeWs(matchId, playerId, this.model);
}
void playCard(
{GameCard card,
GlobalKey sourceKey,
GlobalKey targetKey,
int playerId = 0,
PlayEvent inData}) async {
// Animation notifier logic goes here...
}
@override
void dispose() {
disconnect();
backendServiceSubscription.cancel();
super.dispose();
}
}
There you go - I hope this helps.
There is another bit on how to animate the cards - the big challenge is to run the animations sequentially even when the players play their cards very quickly. If you want, I can post some details around that as well.
*** One more edit ***
I created a repo with a simple animation example. If I find time, I'll write some comments in the Readme file...
https://github.com/andrija78/cards_demo

Trying to retrieve data from MongoDB using TypeScript

Context: I'm not too experienced with TypeScript, as we don't use it at work; I'm just attempting to build a little portfolio piece for personal exposure.
To start with, this is my code:
import { request, Request, Response } from 'express';
import { Neighborhood as NeighborhoodType } from '../interfaces/neighborhood.interface';
import Neighborhood from '../models/neighborhood';

const fetchNeighborhoods = async (request: Request, response: Response): Promise<void> => {
  try {
    const neighborhoods: NeighborhoodType[] = await Neighborhood.paginate();
    response.status(200).send(neighborhoods);
  } catch (error) {
    throw error;
  }
};
I'm attempting to fetch the neighborhoods from the DB and am receiving the error Type 'PaginateResult<Neighborhood>' is missing the following properties from type 'Neighborhood[]': length, pop, push, concat, and 26 more. on the line const neighborhoods: NeighborhoodType[] = await Neighborhood.paginate();
If I remove the NeighborhoodType[], the method works fine. The Neighborhood interface is literally an object with a string:
export interface Neighborhood extends Document {
  name: string,
}
Is it an issue with MY code or is it an issue with one of the dependencies?
For anyone who encounters this issue:
The problem stems from trying to set the return type. Since Mongoose will always return one document, an array of documents, or an empty array (unless using orFail()), the return type can be inferred, so there is no need to add NeighborhoodType[].
The PaginateResult type is not itself an array, if I'm not mistaken - the Neighborhood[] annotation expects a value with all of the array methods, which PaginateResult does not have.
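If you still want an explicitly typed array, here is a minimal sketch of the handler, assuming the plugin is mongoose-paginate-v2 or the classic mongoose-paginate, both of which expose the page of documents on a docs property:
import { Request, Response } from 'express';
import { Neighborhood as NeighborhoodType } from '../interfaces/neighborhood.interface';
import Neighborhood from '../models/neighborhood';

const fetchNeighborhoods = async (request: Request, response: Response): Promise<void> => {
  // Let the PaginateResult type be inferred from the plugin's typings...
  const page = await Neighborhood.paginate();

  // ...and pull the documents off of it when a plain array is needed.
  const neighborhoods: NeighborhoodType[] = page.docs;

  response.status(200).send(neighborhoods);
};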

Aurelia Fetch usage broken after NPM update

I just recently did an npm update on my Aurelia CLI project using TypeScript in Visual Studio 2015. I'm using aurelia-fetch-client for making calls to my Web API (.NET Core) backend.
This is an example of the code that was previously compiling and running fine:
import { autoinject } from "aurelia-framework";
import { HttpClient, json } from "aurelia-fetch-client";
import { SupportedCultureInfo } from "../../model/Resources/SupportedCultureInfo";

@autoinject()
export class ResourceService {
  constructor(private http: HttpClient) {
  }

  getSupportedCultures(): Promise<SupportedCultureInfo[]> {
    return this.http.fetch("resource/cultures").then(response => response.json());
  }
}
Neither Visual Studio nor ReSharper gives any indication in the code editor that this will not compile; however, after the recent update my build is now broken with this error:
TS2322: Type 'Promise<Response>' is not assignable to type 'Promise<SupportedCultureInfo[]>'
The only workaround I've found so far is to return Promise<any> instead. What I really want to do here though is return classes mapped from the JSON result and have the return type of the method be a strongly-typed Promise as well.
Does anyone know what has changed recently that could cause this? It's very frustrating.
UPDATE:
This is the code that I was able to get working:
import { autoinject } from "aurelia-framework";
import { HttpClient, json } from "aurelia-fetch-client";
import { SupportedCultureInfo } from "../../model/resources/SupportedCultureInfo";

@autoinject()
export class ResourceService {
  constructor(private http: HttpClient) {
  }

  getSupportedCultures(): Promise<SupportedCultureInfo[]> {
    return this.http.fetch("resource/cultures")
      .then(response => response.json())
      .then<SupportedCultureInfo[]>((data: any) => {
        const result = new Array<SupportedCultureInfo>();
        for (let i of data) {
          const info = new SupportedCultureInfo();
          info.cultureCode = i.cultureCode;
          info.name = i.name;
          info.isCurrentCulture = i.isCurrentCulture;
          result.push(info);
        }
        return result;
      });
  }
}
I had to do two things that weren't immediately obvious:
Use the generic overload of then() to specify the return type I want to use.
Explicitly declare the any type for the JSON parameter in the second callback; otherwise the gulp transpiler thinks it's still typed as Response.
UPDATE:
I accidentally hit Enter, causing the comment to be posted unfinished. I'm unable to edit the comment, so I'm updating my answer as per the updated question.
Chaining Callbacks:
If the API resource/cultures returns the response as SupportedCultureInfo[] and getSupportedCultures() just needs to return the response as is, then there's no need for that second callback. The answer I posted previously would be sufficient.
I'm guessing a second callback is most likely required in one of these two cases (or for some other reason):
The API returns a different type that has to be mapped to SupportedCultureInfo[]
The API response requires any further processing before sending it back from getSupportedCultures()
If you require a second callback to process the response further, then you should call the generic then<TResult> method where you read the response as JSON, instead of at a later point in the callback chain.
Reason for the reported error:
In the updated code, the reason the gulp transpiler treats data as type Response is that the non-generic then(response => response.json()) is being used, which returns Promise<Response>.
Instead, use then<SupportedCultureInfo[]>(response => response.json()), which returns Promise<SupportedCultureInfo[]>, or then<any>(response => response.json()), which returns Promise<any>.
Using either of these gives you the data in the second callback as SupportedCultureInfo[] or any, respectively.
getSupportedCultures(): Promise<SupportedCultureInfo[]> {
  return this.http.fetch("resource/cultures")
    .then<SupportedCultureInfo[]>(response => response.json())
    .then(data => {
      // data will be of type SupportedCultureInfo[]
      return data;
    });
}
As the method signature for then<TResult> is intact, it should give you a strongly typed data variable.
Solution
The change to TypeScript means that we specify the return type as a type parameter to the then<TResult> callback, as shown below.
getSupportedCultures(): Promise<SupportedCultureInfo[]> {
  return this.http.fetch("resource/cultures").then<SupportedCultureInfo[]>(response => response.json());
}
Details:
I ran into the same situation when I did an npm update. Although my initial thought was to blame aurelia-fetch-client as well, I did some digging into the source code to contribute a fix for this issue. In my quest I found that TypeScript is the real culprit here.
The Promise<T> interface had some changes in the way the then callback handles return types. Now, the desired return type needs to be passed as the type parameter TResult to the then<TResult> callback.
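To make that concrete, this is roughly the relevant overload from a recent lib.es5.d.ts (simplified; the exact wording varies between TypeScript versions). Because TResult1 is a type parameter with a default, passing it explicitly - then<SupportedCultureInfo[]>(...) - is what pins down the type of the rest of the chain:
// Simplified from TypeScript's lib.es5.d.ts; the exact signature differs slightly by version.
interface Promise<T> {
  then<TResult1 = T, TResult2 = never>(
    onfulfilled?: ((value: T) => TResult1 | PromiseLike<TResult1>) | undefined | null,
    onrejected?: ((reason: any) => TResult2 | PromiseLike<TResult2>) | undefined | null
  ): Promise<TResult1 | TResult2>;
}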
