JS Journey into outer space - Day 3

This article was written as part of a knowledge sharing program I created for front-end developers. It builds up to creating reusable abstractions by studying async transducers.

Day 3

Perhaps it's good to look at what was created so far and see if there's anything to improve. The code in the repo was written in a test-driven way from a data perspective, but operators and helpers remain untested at this point. Also, not all code is typed. This might become a concern if this were an open source library with actual users, but it isn't, so just consider adding more tests and types a good exercise in generics 😀

I don't like to use alien terms but 👽

It isn't easy for me either, but we have to talk about polymorphism. We've created implementations for transducing arrays and eithers, but we're not anywhere close to dealing with observables, which is one of the aims of this project.

🦉 Polymorphism means there's a single interface that can be used for different types. In functional programming, functions that accept different types have a generic type signature. For instance, the identity function above has the TypeScript type <A>(input: A) => A, where A is any type. Also note that operators in JavaScript are often polymorphic: the + operator can be used on both numbers and strings. This is known as operator overloading.
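The sidebar's two points, sketched in runnable form (the identity function as mentioned, plus the overloaded + operator):

```typescript
// The polymorphic identity function: one implementation, any type A.
const identity = <A>(input: A): A => input;

console.log(identity(42)); // 42
console.log(identity('space')); // space

// Operator overloading in JavaScript: + works on numbers and strings.
console.log(1 + 2); // 3
console.log('1' + '2'); // 12
```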

Generics in TypeScript can become hard to read at some point, and the language also has some limitations when annotating the kind of functions we'll be using. This is resolved by using the fp-ts library, but coming from JavaScript means we'll need to build a small spaceship to be able to travel to that remote planet...

So, before we go down the rabbit hole of async or fly to planet Abstract we'll need to get a better grasp of polymorphism. Let's first look at another sync type. Below is an example to transduce a string. While usually considered a primitive type, we can of course treat a string as a "container" of characters 🤨

const isW = (a: string) => a === 'w';
const hello = compose(filter(isW), map(prependHello));
function* stringGenerator(s: string) {
  for(const c of s) yield c;
}
const stringConcat = (a: string, c: string) => a + c;
const xform = hello(stringConcat);
const result = transduce('world', xform, stringGenerator, '');
console.log(result); // hello w

Seeing double? 😵

Wasn't that easy? All you need to do is translate the generator and concat function into their string equivalents. But, wait a minute:

function* arrayGenerator<T>(arr: T[]) {
  for(const x of arr) yield x;
}
function* stringGenerator(s: string) {
  for(const c of s) yield c;
}

Yes, the generator functions are identical. That's because in JavaScript the for...of statement operates on iterable objects. So, for any object that implements the iterable protocol we can instead retrieve the iterator:

function iterator<T>(iterable: Iterable<T>) {
  return iterable[Symbol.iterator]();
}
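A quick check that this single helper replaces both generators (repeated here so the snippet runs on its own):

```typescript
// One helper for every iterable: retrieve the built-in iterator.
function iterator<T>(iterable: Iterable<T>) {
  return iterable[Symbol.iterator]();
}

console.log(iterator([1, 2, 3]).next().value); // 1
console.log(iterator('world').next().value); // w
```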

🦉 In some functional languages, such as Haskell, polymorphism is expressed through type classes. Type classes are like generic interfaces that define behaviour which is shared between types. In JavaScript the built-in Iterable can also be considered a type class: it defines the iterator behaviour for iterable objects. See the TypeScript definition below.

interface Iterable<T> {
  [Symbol.iterator](): Iterator<T>;
}

We can pass a generic function to transduce, getting one step closer to making a function that can handle a pretty wide range of types: any type that implements the iterable protocol can make use of a single function to loop over its inner values. This isn't limited to the built-in types: we can implement the protocol on any class we create in JavaScript.

transduce(input, someConcat, iterator, someInit);

At this point we could make the iterator part of the transduce internals and thus have it only accept types that implement the iterable interface. However, we would still need to pass the concat and init arguments. As we will see later there is a way to generalise this, but there's a bit more ground to cover. Instead we first move on to async.
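To make this concrete, below is a minimal, self-contained sketch — transduceIterable is a hypothetical name, and the repo's real transduce may differ — showing one iterator helper driving an array, a string, and even a Set:

```typescript
function iterator<T>(iterable: Iterable<T>) {
  return iterable[Symbol.iterator]();
}

// A guessed, simplified transduce that only accepts iterables:
// drive any iterable with the generic iterator helper.
function transduceIterable<T, A>(
  input: Iterable<T>,
  fn: (acc: A, cur: T) => A,
  init: A
): A {
  const it = iterator(input);
  for (let r = it.next(); !r.done; r = it.next()) {
    init = fn(init, r.value);
  }
  return init;
}

console.log(transduceIterable([1, 2, 3], (a, c) => a + c, 0)); // 6
console.log(transduceIterable('world', (a, c) => a + c, '')); // world
console.log(transduceIterable(new Set(['w']), (a, c) => a + c, '')); // w
```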

Worried about the future? 🙀

How to get from the present to the asynchronous? When dealing with promises you could be tempted to reach for async generators, which JavaScript supports these days. However, if we allow async generators in the transduce function we would need to await every iteration. This would seriously impact the performance of the synchronous use case, which is not desirable.

What can we do instead of awaiting when dealing with promises? Since the arrival of async/await in 2017 we call it a lot less directly, but a promise is still an object with a then method, which receives a callback. Callbacks were the main way of dealing with async for a long time, and we'll need to get reacquainted with them here.

To transduce a promise without a generator we'll have to adapt the original function:

function transduce(input, fn, onNext, init) {
  onNext(input, (cur) => {
    init = fn(init, cur);
  });
  return init;
}

Instead of working with a generator we just call a function that will kick off processing the value(s) which are gathered from the input. In the case of a promise that is the resolved value (we'll deal with rejection later). The function to pass in becomes:

function promiseOnNext<T>(p: Promise<T>, callback: (val: T) => void) {
  return p.then(callback);
}

When the callback fires, the function call updates init. In the sync case this happens before transduce returns, so the caller sees the updated value; when the callback fires asynchronously, the reassignment of init happens after transduce has already returned, so it is simply ignored (and garbage collected). We have a working solution for getting the resolved value, but how do we "update" the initial value (like concat does), when that value is itself a promise? We would need to be able to resolve a promise "from the outside". Luckily there exists a pattern that allows for just that: Deferred.

🦉 A deferred promise exposes a resolve (and reject) method in addition to then (and catch). It was a popular pattern before promises were standardised in JavaScript, but is now considered an anti-pattern.

interface Deferred<T> extends Promise<T> {
  resolve: (v: T) => void;
  reject: (err: any) => void;
}

function createDeferred<T>() {
  let resolve!: (v: T) => void;
  let reject!: (err: any) => void;
  const deferred = new Promise<T>((rs, rj) => {
    resolve = rs;
    reject = rj;
  }) as Deferred<T>;
  deferred.resolve = resolve;
  deferred.reject = reject;
  return deferred;
}
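A standalone check that resolving "from the outside" really works (interface and factory repeated so the snippet runs on its own):

```typescript
interface Deferred<T> extends Promise<T> {
  resolve: (v: T) => void;
  reject: (err: any) => void;
}

function createDeferred<T>(): Deferred<T> {
  let resolve!: (v: T) => void;
  let reject!: (err: any) => void;
  const deferred = new Promise<T>((rs, rj) => {
    resolve = rs;
    reject = rj;
  }) as Deferred<T>;
  deferred.resolve = resolve;
  deferred.reject = reject;
  return deferred;
}

// No resolver function in scope at creation time, yet we can settle it later.
const d = createDeferred<string>();
d.then((v) => console.log(v)); // world
d.resolve('world');
```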

We pass the deferred as initial value to transduce and update it by simply calling its resolve method with the received value:

function promiseConcat<T>(a: Deferred<T>, c: T) {
  a.resolve(c);
  return a;
}
const xform = hello(promiseConcat);
const result = transduce(Promise.resolve('world'), xform, promiseOnNext, createDeferred());
console.log(await result); // hello world

What if we pass a rejection? Then the deferred should also be rejected. We need another callback for handling the error case, but we can combine it into the same function. We just need another function to dispatch on the initial value. Let's call it onError.

function transduce(input, fn, onNext, onError, init) {
  onNext(input, (cur) => {
    init = fn(init, cur);
  }, (error) => {
    onError(init, error);
  });
  return init;
}

Now to transduce a rejected promise:

function promiseOnNext<T>(p: Promise<T>, nextCallback: (val: T) => void, errorCallback: (err: any) => void) {
  return p.then(nextCallback).catch(errorCallback);
}
function promiseOnError<T>(a: Deferred<T>, error: any) {
  a.reject(error);
}
const xform = hello(promiseConcat);
const result = transduce(Promise.reject('boom!'), xform, promiseOnNext, promiseOnError, createDeferred());
result.catch(console.log); // boom!

Promises also have a finally method that always gets called once the promise is either resolved or rejected. On the deferred, however, there is nothing left to call at that point, so for this dispatch we just pass a function that does nothing (noop).

function transduce(input, fn, onNext, onError, onComplete, init) {
  onNext(input, (cur) => {
    init = fn(init, cur);
  }, (error) => {
    onError(init, error);
  }, () => {
    onComplete(init);
  });
  return init;
}

function promiseOnNext<T>(
  p: Promise<T>,
  nextCallback: (val: T) => void,
  errorCallback: (err: any) => void,
  completeCallback: () => void
) {
  return p.then(nextCallback).reject(errorCallback).finally(completeCallback);
}
function noop() {}
const xform = hello(promiseConcat);
const result = transduce(
  Promise.resolve('world'),
  xform,
  promiseOnNext,
  promiseOnError,
  noop,
  createDeferred()
);

result.finally(() => {
  console.log('finally!');
}); // finally!

Say, have we met before?

Does the onNext pattern resemble anything you've seen before? Of course! Observables have identical methods and callbacks. Where promises have resolve, reject and finally, observables have next, error and complete. It's the same concept, with of course the difference that observables "resolve" to multiple values. However, as soon as you are able to transduce async input, you get observable handling for free 👻
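To illustrate the overlap, here's a sketch with a hand-rolled observable — the names are assumptions, not the repo's code — plugging an RxJS-style subscribe into the exact same three-callback shape we built for promises:

```typescript
// Minimal observable shapes, modelled after the RxJS observer contract.
type Observer<T> = {
  next: (v: T) => void;
  error: (e: any) => void;
  complete: () => void;
};
type Observable<T> = { subscribe: (o: Observer<T>) => void };

// Same signature as promiseOnNext, but for multiple values.
function observableOnNext<T>(
  o: Observable<T>,
  nextCallback: (val: T) => void,
  errorCallback: (err: any) => void,
  completeCallback: () => void
) {
  o.subscribe({ next: nextCallback, error: errorCallback, complete: completeCallback });
}

// A tiny observable that emits two values and completes:
const source: Observable<string> = {
  subscribe(observer) {
    observer.next('w');
    observer.next('o');
    observer.complete();
  },
};

observableOnNext(source, console.log, console.error, () => console.log('done'));
// w, o, done
```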

Next up: observables!
