Related
I have a function with multiple forEach loops:
async insertKpbDocument(jsonFile) {
jsonFile.doc.annotations.forEach((annotation) => {
annotation.entities.forEach(async (entity) => {
await this.addVertex(entity);
});
annotation.relations.forEach(async (relation) => {
await this.addRelation(relation);
});
});
return jsonFile;
}
I need to make sure that the async code in the forEach loop calling the this.addVertex function is really done before executing the next one.
But when I log variables, it seems that the this.addRelation function is called before the first loop is really over.
So I tried adding await before every loop, like so:
await jsonFile.doc.annotations.forEach(async (annotation) => {
await annotation.entities.forEach(async (entity) => {
await this.addVertex(entity);
});
await annotation.relations.forEach(async (relation) => {
await this.addRelation(relation);
});
});
But same behavior.
Maybe it is the log function that has some latency? Any ideas?
As we've discussed, await does not pause a .forEach() loop and does not make the 2nd item of the iteration wait for the first item to be processed. So, if you're really trying to do asynchronous sequencing of items, you can't really accomplish it with a .forEach() loop.
For this type of problem, async/await works really well with a plain for loop because they do pause the execution of the actual for statement to give you sequencing of asynchronous operations which it appears is what you want. Plus, it even works with nested for loops because they are all in the same function scope:
To show you how much simpler this can be using for/of and await, it could be done like this:
async insertKpbDocument(jsonFile) {
for (let annotation of jsonFile.doc.annotations) {
for (let entity of annotation.entities) {
await this.addVertex(entity);
}
for (let relation of annotation.relations) {
await this.addRelation(relation);
}
}
return jsonFile;
}
You get to write synchronous-like code that is actually sequencing asynchronous operations.
If you are really avoiding any for loop, and your real requirement is only that all calls to addVertex() come before any calls to addRelation(), then you can use .map() instead of .forEach() to collect an array of promises, and then use Promise.all() to wait on that whole array of promises:
insertKpbDocument(jsonFile) {
return Promise.all(jsonFile.doc.annotations.map(async annotation => {
await Promise.all(annotation.entities.map(entity => this.addVertex(entity)));
await Promise.all(annotation.relations.map(relation => this.addRelation(relation)));
})).then(() => jsonFile);
}
To fully understand how this works: it runs all the addVertex() calls in parallel for one annotation, waits for them all to finish, then runs all the addRelation() calls in parallel for that annotation, and waits for them all to finish. It runs all the annotations themselves in parallel. So, there isn't very much actual sequencing except within an annotation, but you accepted an answer that has this same sequencing and said it works, so I show a slightly simpler version of this for completeness.
If you really need to sequence each individual addVertex() call so you don't call the next one until the previous one is done and you're still not going to use a for loop, then you can use the .reduce() promise pattern put into a helper function to manually sequence asynchronous access to an array:
// helper function to sequence asynchronous iteration of an array
// fn returns a promise and is passed an array item as an argument
function sequence(array, fn) {
return array.reduce((p, item) => {
return p.then(() => {
return fn(item);
});
}, Promise.resolve());
}
insertKpbDocument(jsonFile) {
return sequence(jsonFile.doc.annotations, async (annotation) => {
await sequence(annotation.entities, entity => this.addVertex(entity));
await sequence(annotation.relations, relation => this.addRelation(relation));
}).then(() => jsonFile);
}
This will completely sequence everything. For each annotation, it will produce this type of order:
addVertex(entity1)
...
addVertex(entityN)
addRelation(relation1)
...
addRelation(relationN)
where it waits for each operation to finish before going onto the next one.
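To illustrate the helper on its own, here's a small sketch (the delay helper is just stand-in async work):
const delay = (ms, v) => new Promise(resolve => setTimeout(resolve, ms, v));

// sequence() processes the items strictly one after the other
sequence([1, 2, 3], n => delay(100, n).then(v => console.log("done", v)))
  .then(() => console.log("all finished"));
// logs: done 1, done 2, done 3, all finished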
.forEach() returns void, so awaiting it will not do much. You can use .map() to return all the promises you currently create in the forEach, and use Promise.all to await them all:
async insertKpbDocument(jsonFile: { doc: { annotations: Array<{ entities: Array<{}>, relations: Array<{}> }> } }) {
await Promise.all(jsonFile.doc.annotations.map(async(annotation) => {
await Promise.all(annotation.entities.map(async (entity) => {
await this.addVertex(entity);
}));
await Promise.all(annotation.relations.map(async (relation) => {
await this.addRelation(relation);
}));
}));
return jsonFile;
}
I understand you can run all the addVertex calls concurrently. Combining reduce with map, split into two different sets of promises, you can do it. My idea:
const first = jsonFile.doc.annotations.reduce((acc, annotation) => {
acc = acc.concat(annotation.entities.map(this.addVertex));
return acc;
}, []);
await Promise.all(first);
const second = jsonFile.doc.annotations.reduce((acc, annotation) => {
acc = acc.concat(annotation.relations.map(this.addRelation));
return acc;
}, []);
await Promise.all(second);
You have more loops, but it does what you need, I think.
forEach executes the callback against each element in the array and does not wait for anything. Using await is basically sugar for writing promise.then() and nesting everything that follows in the then() callback. But forEach doesn't return a promise, so await arr.forEach() is meaningless. The only reason it isn't a compile error is because the async/await spec says you can await anything, and if it isn't a promise you just get its value... forEach just gives you void.
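To see this concretely, here is a small standalone sketch:
async function demo() {
  // forEach returns undefined, so there is nothing to await
  const result = await [1, 2, 3].forEach(x => x);
  console.log(result); // undefined
  // awaiting a non-promise just yields the value itself
  const n = await 42;
  console.log(n); // 42
}
demo();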
If you want something to happen in sequence you can await in a for loop:
for (let i = 0; i < jsonFile.doc.annotations.length; i++) {
  const annotation = jsonFile.doc.annotations[i];
  for (let j = 0; j < annotation.entities.length; j++) {
    const entity = annotation.entities[j];
    await this.addVertex(entity);
  }
}
// code here executes after all vertices have been added in order
Edit: While typing this, a couple of other answers and comments happened... if you don't want to use a for loop, you can use Promise.all, but there's still maybe some confusion, so I'll leave the above explanation in case it helps.
async/await does not work within forEach.
A simple solution: Replace .forEach() with for(.. of ..) instead.
Details in this similar question.
If the no-iterator linting rule is enabled, you will get a linting warning/error for using for(.. of ..). There is a lot of discussion/opinion on this topic.
IMHO, this is a scenario where we can suppress the warning with eslint-disable-next-line or for the method/class.
Example:
const insertKpbDocument = async (jsonFile) => {
  for (let annotation of jsonFile.doc.annotations) {
    // eslint-disable-next-line no-iterator
    for (let entity of annotation.entities) {
      await this.addVertex(entity)
    }
    // eslint-disable-next-line no-iterator
    for (let relation of annotation.relations) {
      await this.addRelation(relation)
    }
  }
  return jsonFile
}
The code is very readable and works as expected. To get similar functionality with .forEach(), we need some promises/observables acrobatics that I think are a waste of effort.
Summary
Is functional programming in node.js general enough? Can it be used to solve a real-world problem of handling small bulks of db records without loading all records into memory using toArray (and thus going out of memory)? You can read this criticism for background. We want to demonstrate Mux and DeMux and fork/tee/join capabilities of such node.js libraries with async generators.
Context
I'm questioning the validity and generality of functional programming in node.js using any functional programming tool (like ramda, lodash, and imlazy) or even custom ones.
Given
Millions of records from a MongoDB cursor that can be iterated using await cursor.next()
You might want to read more about async generators and for-await-of.
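As a sketch of how the real thing might look (assuming a driver cursor that exposes hasNext()/next(), as the MongoDB Node.js driver does), the cursor can be adapted into an async generator:
// wrap a cursor with hasNext()/next() in an async generator
async function* cursorToAsyncGen(cursor) {
  while (await cursor.hasNext()) {
    yield await cursor.next();
  }
}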
For fake data, one can use (on Node 10):
function sleep(ms) {
return new Promise((resolve) => setTimeout(resolve, ms));
}
async function* getDocs(n) {
for(let i=0;i<n;++i) {
await sleep(1);
yield {i: i, t: Date.now()};
}
}
let docs=getDocs(1000000);
Wanted
We need
first document
last document
number of documents
split into batches/bulks of n documents and emit a socket.io event for that bulk
Make sure that first and last documents are included in the batches and not consumed.
Constraints
The millions of records should not be loaded into RAM; one should iterate over them and hold at most only a batch of them.
The requirement can be met using usual node.js code, but can it be done using something like R.applySpec, as in here?
R.applySpec({
first: R.head,
last: R.last,
_:
R.pipe(
R.splitEvery(n),
R.map( (i)=> {return "emit "+JSON.stringify(i);})
)
})(input)
To show how this could be modeled with vanilla JS, we can introduce the idea of folding over an async generator that produces things that can be combined together.
const foldAsyncGen = (of, concat, empty) => (step, fin) => async asyncGen => {
let acc = empty
for await (const x of asyncGen) {
acc = await step(concat(acc, of(x)))
}
return await fin(acc)
}
Here the arguments are broken up into three parts:
(of, concat, empty) expects a function that produces a "combinable" thing, a function that combines two "combinable" things, and an empty/initial instance of a "combinable" thing
(step, fin) expects a function that takes a "combinable" thing at each step and produces a Promise of a "combinable" thing to be used for the next step, and a function that takes the final "combinable" thing after the generator has been exhausted and produces a Promise of the final result
async asyncGen is the async generator to process
In FP, the idea of a "combinable" thing is known as a Monoid, which defines some laws that detail the expected behaviour of combining two of them together.
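For example, plain arrays form a Monoid under concatenation, and the laws can be spot-checked directly (an illustrative aside, not part of the solution):
const a = [1], b = [2], c = [3];
// associativity: concat(concat(a, b), c) equals concat(a, concat(b, c))
console.log(JSON.stringify(a.concat(b).concat(c)) === JSON.stringify(a.concat(b.concat(c)))); // true
// identity: [] is the empty element on both sides
console.log(JSON.stringify([].concat(a)) === JSON.stringify(a.concat([]))); // true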
We can then create a Monoid that will be used to carry through the first, last and batch of values when stepping through the generator.
const Accum = (first, last, batch) => ({
first,
last,
batch,
})
Accum.empty = Accum(null, null, []) // an initial instance of `Accum`
Accum.of = x => Accum(x, x, [x]) // an `Accum` instance of a single value
Accum.concat = (a, b) => // how to combine two `Accum` instances together
Accum(a.first == null ? b.first : a.first, b.last, a.batch.concat(b.batch))
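As a quick sanity check of how two instances combine (illustrative usage only):
// keeps the first `first`, takes the latest `last`, concatenates the batches
console.log(Accum.concat(Accum.of(1), Accum.of(2)));
// => { first: 1, last: 2, batch: [ 1, 2 ] }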
To capture the idea of flushing the accumulating batches, we can create another function that takes an onFlush function (which performs some action, in a returned Promise, with the values being flushed) and a size n at which to flush the batch.
// flush the batch to `onFlush` once it reaches `n` items
// (n = 0 is used for the final flush of whatever remains)
Accum.flush = onFlush => n => acc =>
  acc.batch.length < n || acc.batch.length === 0
    ? Promise.resolve(acc)
    : onFlush(acc.batch.slice(0, n || acc.batch.length))
        .then(_ => Accum(acc.first, acc.last, acc.batch.slice(n || acc.batch.length)))
We can also now define how we can fold over the Accum instances.
Accum.foldAsyncGen = foldAsyncGen(Accum.of, Accum.concat, Accum.empty)
With the above utilities defined, we can now use them to model your specific problem.
const emit = batch => // This is an analog of where you would emit your batches
new Promise((resolve) => resolve(console.log(batch)))
const flushEmit = Accum.flush(emit)
// flush and emit every 10 items, and also the remaining batch when finished
const fold = Accum.foldAsyncGen(flushEmit(10), flushEmit(0))
And finally run with your example.
fold(getDocs(100))
.then(({ first, last })=> console.log('done', first, last))
I'm not sure it's fair to imply that functional programming was going to offer any advantages over imperative programming in terms of performance when dealing with huge amounts of data.
I think you need to add another tool in your toolkit and that may be RxJS.
RxJS is a library for composing asynchronous and event-based programs by using observable sequences.
If you're not familiar with RxJS or reactive programming in general, my examples will definitely look weird, but I think it would be a good investment to get familiar with these concepts.
In your case, the observable sequence is your MongoDB instance that emits records over time.
I'm gonna fake your db:
const { range, merge } = require('rxjs');
const { first, last, bufferCount } = require('rxjs/operators');

var db = range(1, 5);
The range function is a RxJS thing that will emit a value in the provided range.
db.subscribe(n => {
console.log(`record ${n}`);
});
//=> record 1
//=> record 2
//=> record 3
//=> record 4
//=> record 5
Now I'm only interested in the first and last record.
I can create an observable that will only emit the first record, and create another one that will emit only the last one:
var db = range(1, 5);
var firstRecord = db.pipe(first());
var lastRecord = db.pipe(last());
merge(firstRecord, lastRecord).subscribe(n => {
console.log(`record ${n}`);
});
//=> record 1
//=> record 5
However I also need to process all records in batches: (in this example I'm gonna create batches of 10 records each)
var db = range(1, 100);
var batches = db.pipe(bufferCount(10))
var firstRecord = db.pipe(first());
var lastRecord = db.pipe(last());
merge(firstRecord, batches, lastRecord).subscribe(n => {
console.log(`record ${n}`);
});
//=> record 1
//=> record 1,2,3,4,5,6,7,8,9,10
//=> record 11,12,13,14,15,16,17,18,19,20
//=> record 21,22,23,24,25,26,27,28,29,30
//=> record 31,32,33,34,35,36,37,38,39,40
//=> record 41,42,43,44,45,46,47,48,49,50
//=> record 51,52,53,54,55,56,57,58,59,60
//=> record 61,62,63,64,65,66,67,68,69,70
//=> record 71,72,73,74,75,76,77,78,79,80
//=> record 81,82,83,84,85,86,87,88,89,90
//=> record 91,92,93,94,95,96,97,98,99,100
//=> record 100
As you can see in the output, it has emitted:
The first record
Ten batches of 10 records each
The last record
I'm not gonna try to solve your exercise for you, and I'm not familiar enough with RxJS to expand too much on this.
I just wanted to show you another way and let you know that it is possible to combine this with functional programming.
Hope it helps
I think I may have developed an answer for you some time ago and it's called scramjet. It's lightweight (no thousands of dependencies in node_modules), it's easy to use and it does make your code very easy to understand and read.
Let's start with your case:
DataStream
.from(getDocs(10000))
.use(stream => {
let counter = 0;
const items = new DataStream();
const out = new DataStream();
stream
.peek(1, async ([first]) => out.whenWrote(first))
.batch(100)
.reduce(async (acc, result) => {
await items.whenWrote(result);
return result[result.length - 1];
}, null)
.then((last) => out.whenWrote(last))
.then(() => items.end());
items
.setOptions({ maxParallel: 1 })
.do(arr => counter += arr.length)
.each(batch => writeDataToSocketIo(batch))
.run()
.then(() => (out.end(counter)))
;
return out;
})
.toArray()
.then(([first, last, count]) => ({ first, count, last }))
.then(console.log)
;
So I don't really agree that javascript FRP is an antipattern, and I don't think I have the only answer to that, but while developing the first commits I found that ES6 arrow syntax and async/await written in a chained fashion make the code easy to understand.
Here's another example of scramjet code from OpenAQ, specifically this line in their fetch process:
return DataStream.fromArray(Object.values(sources))
// flatten the sources
.flatten()
// set parallel limits
.setOptions({maxParallel: maxParallelAdapters})
// filter sources - if env is set then choose only matching source,
// otherwise filter out inactive sources.
// * inactive sources will be run if called by name in env.
.use(chooseSourcesBasedOnEnv, env, runningSources)
// mark sources as started
.do(markSourceAs('started', runningSources))
// get measurements object from given source
// all error handling should happen inside this call
.use(fetchCorrectedMeasurementsFromSourceStream, env)
// perform streamed save to DB and S3 on each source.
.use(streamMeasurementsToDBAndStorage, env)
// mark sources as finished
.do(markSourceAs('finished', runningSources))
// convert to measurement report format for storage
.use(prepareCompleteResultsMessage, fetchReport, env)
// aggregate to Array
.toArray()
// save fetch log to DB and send a webhook if necessary.
.then(
reportAndRecordFetch(fetchReport, sources, env, apiURL, webhookKey)
);
It describes everything that happens with every source of data. So here's my proposal up for questioning. :)
Here are two solutions, using RxJS and scramjet.
Here is an RxJS solution.
The trick was to use share() so that first() and last() won't consume from the iterator; forkJoin was used to combine them to emit the done event with those values.
function ObservableFromAsyncGen(asyncGen) {
return Rx.Observable.create(async function (observer) {
for await (let i of asyncGen) {
observer.next(i);
}
observer.complete();
});
}
// small helper to label logged values
const log = prefix => x => console.log(prefix, x);

async function main() {
  let o = ObservableFromAsyncGen(getDocs(100));
  let s = o.pipe(share());
  let f = s.pipe(first());
  let e = s.pipe(last());
  let b = s.pipe(bufferCount(13));
  let c = s.pipe(count());
  b.subscribe(log("batch: "));
  Rx.forkJoin(c, f, e, b).subscribe(function(a){console.log(
    "emit done with count", a[0], "first", a[1], "last", a[2]);})
}
Here is a scramjet version, but it is not pure (the functions have side effects):
async function main() {
let docs = getDocs(100);
let first, last, counter;
let s0=Sj.DataStream
.from(docs)
.setOptions({ maxParallel: 1 })
.peek(1, (item)=>first=item[0])
.tee((s)=>{
s.reduce((acc, item)=>acc+1, 0)
.then((item)=>counter=item);
})
.tee((s)=>{
s.reduce((acc, item)=>item)
.then((item)=>last=item);
})
.batch(13)
.map((batch)=>console.log("emit batch"+JSON.stringify(batch)));
await s0.run();
console.log("emit done "+JSON.stringify({first: first, last:last, counter:counter}));
}
I'll work with @michał-kapracki to develop a pure version of it.
For exactly this kind of problem I made this library: ramda-generators
Hopefully it's what you are looking for: lazy evaluation of streams in functional JavaScript.
The only problem is that I have no idea how to take the last element and the number of elements from a stream without re-running the generators.
A possible implementation that computes the result without loading the whole DB into memory could be this:
Try it on repl.it
const RG = require("ramda-generators");
const R = require("ramda");
const sleep = ms => new Promise(resolve => setTimeout(resolve, ms));
const getDocs = amount => RG.generateAsync(async (i) => {
await sleep(1);
return { i, t: Date.now() };
}, amount);
const amount = 1000000000;
(async (chunkSize) => {
const first = await RG.headAsync(getDocs(amount).start());
const last = await RG.lastAsync(getDocs(amount).start()); // Without this line the print of the results would start immediately
const DbIterator = R.pipe(
getDocs(amount).start,
RG.splitEveryAsync(chunkSize),
RG.mapAsync(i => "emit " + JSON.stringify(i)),
RG.mapAsync(res => ({ first, last, res })),
);
for await (const el of DbIterator())
console.log(el);
})(100);
I got confused about how to accomplish this task. Let me show the code, then the issue.
case 1
let arr1=[1,3]
let arr2=[1,2,3]
Compare these two arrays; if arr2 is bigger, delete 2 from the database.
case 2
let arr1=[1,2,3]
let arr2=[1,2]
Compare these two arrays; if arr1 is bigger, insert 3 into the database,
and return a promise (reject or resolve). Can anyone tell me the best way to achieve this?
Solution to your problem:
Step 1: find the difference between the two arrays, let's say arr1 & arr2
Step 2: if arr2.length is greater, delete the difference from the db
Step 3: if arr1.length is greater, insert the difference into the db
For Step 1, implement the "difference" function below:
Array.prototype.difference = function(arr) {
  return this.filter(function(i) { return arr.indexOf(i) === -1; });
};
[1,2,3,4,5,6].difference([3,4,5]); // returns [1,2,6]
// here you capture the differences between the arrays (direction matters)
let toDelete = arr2.difference(arr1); // elements in arr2 that are missing from arr1
let toInsert = arr1.difference(arr2); // elements in arr1 that are missing from arr2
for step 2 & step 3:
if (arr2.length > arr1.length) {
  toDelete.forEach((element) => {
    // your db deletion code comes here, something like db.delete(element)
    return new Promise((resolve, reject) => {
      // db deletion code
      // resolve(result) if successfully deleted
      // reject(err) if some error occurs
    })
    .then(result => result)
    .catch(err => err);
  });
}
// similarly
if (arr1.length > arr2.length) {
  toInsert.forEach((element) => {
    // your db insertion code comes here
    return new Promise((resolve, reject) => {
      // db insertion code
      // resolve(result) if successfully inserted
      // reject(err) if some error occurs
    })
    .then(result => result)
    .catch(err => err);
  });
}
Happy Coding :)
I've got two event streams. One is from an inductance loop, the other is an IP camera. Cars will drive over the loop and then hit the camera. I want to combine them if the events are within N milliseconds of each other (car will always hit the loop first), but I also want the unmatched events from each stream (either hardware can fail) all merged into a single stream. Something like this:
---> (only unmatched a's, None)
/ \
stream_a (loop) \
\ \
--> (a, b) ---------------------------> (Maybe a, Maybe b)
/ /
stream_b (camera) /
\ /
--> (None, only unmatched b's)
Now certainly I can hack my way around by doing the good ole Subject anti-pattern:
unmatched_a = Subject()

def noop():
    pass

pending_as = [[]]

def handle_unmatched(a):
    if a in pending_as[0]:
        pending_as[0].remove(a)
        print("unmatched a!")
        unmatched_a.on_next((a, None))

def handle_a(a):
    pending_as[0].append(a)
    # pass the event to the timeout handler, otherwise it is called without arguments
    t = threading.Timer(some_timeout, handle_unmatched, args=[a])
    t.start()
    return a

def handle_b(b):
    if len(pending_as[0]):
        a = pending_as[0].pop(0)
        return (a, b)
    else:
        print("unmatched b!")
        return (None, b)

stream_a.map(handle_a).subscribe(noop)
stream_b.map(handle_b).merge(unmatched_a).subscribe(print)
Not only is this rather hacky, but although I've not observed it I'm pretty sure there's a race condition when I check the pending queue using threading.Timer. Given the plethora of rx operators, I'm pretty sure some combination of them will let you do this without using Subject, but I can't figure it out. How does one accomplish this?
Edit
Although for organizational and operational reasons I'd prefer to stick to Python, I'll take a JavaScript rxjs answer and either port it or even possibly rewrite the entire script in node.
You should be able to solve the problem using auditTime and buffer. Like this:
function matchWithinTime(a$, b$, N) {
const merged$ = Rx.Observable.merge(a$, b$);
// Use auditTime to compose a closing notifier for the buffer.
const audited$ = merged$.auditTime(N);
// Buffer emissions within an audit and filter out empty buffers.
return merged$
.buffer(audited$)
.filter(x => x.length > 0);
}
const a$ = new Rx.Subject();
const b$ = new Rx.Subject();
matchWithinTime(a$, b$, 50).subscribe(x => console.log(JSON.stringify(x)));
setTimeout(() => a$.next("a"), 0);
setTimeout(() => b$.next("b"), 0);
setTimeout(() => a$.next("a"), 100);
setTimeout(() => b$.next("b"), 125);
setTimeout(() => a$.next("a"), 200);
setTimeout(() => b$.next("b"), 275);
setTimeout(() => a$.next("a"), 400);
setTimeout(() => b$.next("b"), 425);
setTimeout(() => a$.next("a"), 500);
setTimeout(() => b$.next("b"), 575);
setTimeout(() => b$.next("b"), 700);
setTimeout(() => b$.next("a"), 800);
.as-console-wrapper { max-height: 100% !important; top: 0; }
<script src="https://unpkg.com/rxjs@5/bundles/Rx.min.js"></script>
If it's possible for b values to be closely followed by a values and you do not want them to be matched, you could use a more specific audit, like this:
const audited$ = merged$.audit(x => x === "a" ?
// If an `a` was received, audit upcoming values for `N` milliseconds.
Rx.Observable.timer(N) :
// If a `b` was received, don't audit the upcoming values.
Rx.Observable.of(0, Rx.Scheduler.asap)
);
I have developed a different strategy than cartant's, and clearly a much less elegant one, which may give you a somewhat different result. I apologize if I have not understood the question and my answer turns out to be useless.
My strategy is based on using switchMap on a$ and then bufferTime on b$.
This code emits at every timeInterval: an object which contains the last a received and an array of the bs received during that time interval.
a$.pipe(
switchMap(a => {
return b$.pipe(
bufferTime(timeInterval),
mergeMap(arrayOfB => of({a, arrayOfB})),
)
})
)
If arrayOfB is empty, then it means that the last a is unmatched.
If arrayOfB has just one element, then it means that the last a has been matched by the b of the array.
If arrayOfB has more than one element, then it means that the last a has been matched by the first b of the array while all the other bs are unmatched.
Now it is a matter of avoiding the emission of the same a more than once, and this is where the code gets a bit messy.
In summary, the code could look like the following
const { Subject, of, from } = require('rxjs');
const { switchMap, bufferTime, mergeMap, map, filter } = require('rxjs/operators');

const a$ = new Subject();
const b$ = new Subject();
setTimeout(() => a$.next("a1"), 0);
setTimeout(() => b$.next("b1"), 0);
setTimeout(() => a$.next("a2"), 100);
setTimeout(() => b$.next("b2"), 125);
setTimeout(() => a$.next("a3"), 200);
setTimeout(() => b$.next("b3"), 275);
setTimeout(() => a$.next("a4"), 400);
setTimeout(() => b$.next("b4"), 425);
setTimeout(() => b$.next("b4.1"), 435);
setTimeout(() => a$.next("a5"), 500);
setTimeout(() => b$.next("b5"), 575);
setTimeout(() => b$.next("b6"), 700);
setTimeout(() => b$.next("b6.1"), 701);
setTimeout(() => b$.next("b6.2"), 702);
setTimeout(() => a$.next("a6"), 800);
setTimeout(() => a$.complete(), 1000);
setTimeout(() => b$.complete(), 1000);
let currentA;
a$.pipe(
switchMap(a => {
currentA = a;
return b$.pipe(
bufferTime(50),
mergeMap(arrayOfB => {
let aVal = currentA ? currentA : null;
if (arrayOfB.length === 0) {
const ret = of({a: aVal, b: null})
currentA = null;
return ret;
}
if (arrayOfB.length === 1) {
const ret = of({a: aVal, b: arrayOfB[0]})
currentA = null;
return ret;
}
const ret = from(arrayOfB)
.pipe(
map((b, _indexB) => {
aVal = _indexB > 0 ? null : aVal;
return {a: aVal, b}
})
)
currentA = null;
return ret;
}),
filter(data => data.a !== null || data.b !== null)
)
})
)
.subscribe(console.log);
I am following the first half of this excellent article, but there is a place I am stuck. https://jrsinclair.com/articles/2016/marvellously-mysterious-javascript-maybe-monad/
I have implemented a very similar Maybe monad, but one of my functions I need to pass to map is asynchronous. Ideally, I would be able to do this in a combination of .then() and map(). I want to do something like this...
const getToken = async (p) => {
let result = utils.Maybe.of(await makeAICCCall(p.aiccsid, p.aiccurl))
.map(parseAuthenticatedUser)
.thenMap(syncUserWithCore) <-- I can't figure this out
.map(managejwt.maketoken)
.value
return result;
}
I have tried everything I can think of, but I have not been able to figure this out.
natural transformations
Nesting of data containers can get messy, but there's a well-known technique for keeping them flat – eitherToPromise below is considered a natural transformation – it converts an Either to a Promise which allows it to be flattened in the .then chain, but also simultaneously prevents your value/error wires from getting crossed
Note: You probably want to use makeAICCall to return an Either (Left, Right) instead of Maybe because you'll be able to return an error message (instead of Nothing, which is less informative)
import { Left, Right, eitherToPromise } from './Either'
const makeAICCall = (...) =>
someCondition
? Left (Error ('error happened'))
: Right (someResult)
const getToken = p =>
makeAICCall (p.aiccsic, p.aiccurl) // => Promise<Either<x>>
.then (eitherToPromise) // => Promise<Promise<x>>
// => auto-flattened to Promise<x>
.then (syncUserWithCore) // => Promise<x>
.then (managejwt.maketoken) // => Promise<x>
Supply your favourite implementation of Either
// Either.js
export const Left = x =>
({
fold: (f,_) => f (x),
// map: f => Left (x),
// chain: ...,
// ...
})
export const Right = x =>
({
fold: (_,f) => f (x),
// map: f => Right (f (x)),
// chain: ...,
// ...
})
export const eitherToPromise = m =>
m.fold (x => Promise.reject (x), x => Promise.resolve (x))
runnable demo
const someAsyncCall = x =>
new Promise (r => setTimeout (r, 1000, x))
const authenticate = ({user, password}) =>
password !== 'password1'
? Left (Error ('invalid password'))
: Right ({user, id: 123})
const someSyncCall = token =>
Object.assign (token, { now: Date.now () })
const getToken = x =>
someAsyncCall (x)
.then (authenticate)
.then (eitherToPromise)
.then (someSyncCall)
// minimal dependencies
const Left = x =>
({ fold: (f,_) => f (x) })
const Right = x =>
({ fold: (_,f) => f (x) })
const eitherToPromise = m =>
m.fold (x => Promise.reject (x), x => Promise.resolve (x))
// test it
getToken ({user: 'alice', password: 'password1'})
.then (console.log, console.error)
// 1 second later ...
// { user: 'alice', id: 123, now: 1509034652179 }
getToken ({user: 'bob', password: 'password2'})
.then (console.log, console.error)
// 1 second later ...
// Error: invalid password ...
hey, lookit that
Our solution above results in a sequence of .then calls – an answer to your previous question demonstrates how such a program can be expressed in a different way
nullables
You should try your best to write functions that have a well-defined domain and codomain – you should be able to say, for example,
My function takes a string (domain) and returns a number (codomain) – anonymous wiseman
And avoid writing functions that have descriptions like,
It can take a number or a string and it returns an array, but could also return undefined. Oh and sometimes it can throw an error. But that's it, I'm pretty sure. – anonymous ignoramus
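To make the contrast concrete (an illustrative sketch):
// well-defined: takes a string, returns a number
const wordCount = s => s.split(' ').length

// poorly-defined: takes a number or a string, returns an array or undefined, may throw
const confusing = x => {
  if (typeof x === 'number') return [x]
  if (x === 'nope') throw Error ('why?')
  if (typeof x === 'string') return x.split('')
  // implicitly returns undefined for anything else
}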
But of course we'll be dealing with null and undefined sometimes. How can we deal with it in a "functional way" – that's what you're wondering, right?
If you find yourself in an encounter with a function's nullable codomain (ie, can return a nullable), we can create a little helper to coerce it into a type we want. We'll demonstrate again with Either, just to tie it into the original code later
const Left = x =>
({ fold: (f,_) => f (x) })
const Right = x =>
({ fold: (_,f) => f (x) })
const eitherFromNullable = (x, otherwise = x) =>
x === null ||
x === undefined
? Left (otherwise)
: Right (x)
// !! nullable codomain !!
const find = (f, xs) =>
xs.find (x => f (x))
// example data
const data =
[1,2,3,4,5]
// perform safe lookups by wrapping unsafe find in eitherFromNullable
eitherFromNullable (find (x => x > 3, data))
.fold (console.error, console.log)
// <console.log> 4
eitherFromNullable (find (x => x > 5, data))
.fold (console.error, console.log)
// <console.error> undefined
eitherFromNullable (find (x => x > 5, data), Error (`couldn't find a big number !`))
.fold (console.error, console.log)
// <console.error> Error: couldn't find a big number !
nullables and natural transformations
Remember, we do our best to avoid nullables, but sometimes we can't help it. To show how this might tie in with the original code, let's pretend that instead of returning an Either, makeAICCall will instead return some x or some null
We just screen it with eitherFromNullable – new code in bold
const getToken = p =>
makeAICCall (p.aiccsic, p.aiccurl) // => Promise<x?> could be null !!
.then (x => // => Promise<Either<x>>
eitherFromNullable (x, Error ('bad aic call')))
.then (eitherToPromise) // => Promise<Promise<x>>
// => auto-flattened to Promise<x>
.then (syncUserWithCore) // => Promise<x>
.then (managejwt.maketoken) // => Promise<x>
stop hating lambdas
You probably wanna ditch that lambda, right ? Ok fine, just don't make it into a fetish.
const eitherFromNullable = otherwise => x =>
x == null ? Left (otherwise) : Right (x)
// ooooh, yeah, you like that
makeAICCall (p.aiccsic, p.aiccurl)
.then (eitherFromNullable (Error ('bad aic call')))
.then (eitherToPromise)
.then ...
don't get stuck
Nullable doesn't mean anything – you decide what it means in the context of your program.
const eitherFromNullable = (x, otherwise = x) =>
// we consider null and undefined nullable,
// everything else is non-null
x === null ||
x === undefined
? Left (otherwise)
: Right (x)
You could decide that false and 0 and empty string '' are also "nullable" – Or, you could just as easily decide to have very specific adapters eitherFromNull, eitherFromUndefined, eitherFromBoolean, etc – it's your program; it's up to you!
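For instance, one of those more specific adapters might look like this (a hypothetical helper, sketched in the same style as eitherFromNullable):
// treat `false` as the "missing" case, everything else as present
const eitherFromBoolean = (x, otherwise = x) =>
  x === false ? Left (otherwise) : Right (x)

eitherFromBoolean (5 > 3, Error ('nope')) .fold (console.error, console.log)
// <console.log> true
eitherFromBoolean (5 < 3, Error ('nope')) .fold (console.error, console.log)
// <console.error> Error: nope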
I feel like I'm starting to repeat myself ^_^'
make it routine
So you're saying you have lots of areas in your program where nulls are just unavoidable; maybe it's some dependency that you cannot get rid of. We'll imagine your code base with the following
// no one wants to do this for every endpoint!
const getUser = id =>
new Promise ((resolve, reject) =>
request ({url: '/users', id}, (err, res) =>
err
? reject (err)
: res.status === 403
? reject (Error ('unauthorized'))
: res.body == null
? reject (Error ('not found'))
: resolve (User (JSON.parse (res.body)))))
It's using request which has an older Node-style callback interface that we're tired of wrapping in a promise
The API endpoints will respond with a 403 status if the requestor is unauthorized to view the requested resource
The API endpoints we talk to sometimes respond with null with status 200 (instead of 404) for missing resources; for example /users/999, where 999 is an unknown user id, will not trigger an error, but will fetch an empty response body
The API will respond with a valid JSON document for all other requests
We wish we could use something other than request, but our supervisor says No. We wish the API endpoints had different behavior, but that's out of our control. Still, it's within our power to write a good program
// functional programming is about functions
const safeRequest = (type, ...args) =>
new Promise ((resolve, reject) =>
request[type] (...args, (err, res) =>
err
? reject (err)
: res.status === 403
? reject (Error ('unauthorized'))
: res.body == null
? reject (Error ('not found'))
: resolve (JSON.parse (res.body))))
const getUser = id =>
safeRequest ('get', {url: '/users', id})
const createUser = fields =>
safeRequest ('post', {url: '/users', fields})
const updateUser = (id, fields) =>
safeRequest ('put', {url: '/users', id, fields})
Can it be improved more? Sure, but even if that's as far as you went, there's nothing wrong with that; all of the necessary checks happen for each endpoint because they were defined using safeRequest
Ok, so you wanna take it further? No problem. It's your program, do whatever you want!
const promisify = f => (...args) =>
new Promise ((resolve, reject) =>
f (...args, (err, x) =>
err ? reject (err) : resolve (x)))
const promiseFromResponseStatus = res =>
res.status === 403 // or handle other status codes here too !
? Promise.reject (Error ('unauthorized'))
: Promise.resolve (res)
const promiseFromNullableResponse = res =>
res.body == null // or res.body == '', etc
? Promise.reject (Error ('not found'))
: Promise.resolve (res.body)
const safeRequest = (type, ...args) =>
promisify (request [type]) (...args)
.then (promiseFromResponseStatus)
.then (promiseFromNullableResponse)
.then (JSON.parse)
const getUser = id =>
safeRequest ('get', {url: '/users', id})
const createUser ...
....