Chapter 8: Functions Revisited

Composition

Imagine we are processing some text entered in a form, and we need to normalize the input. In particular, we must strip out the spaces and turn the text lowercase. A collaborator has already written these two helper functions to perform these operations separately:

JavaScript
const removeSpaces = text => text.replaceAll(' ', '');
const lowerize = text => text.toLowerCase();

We want to apply both of these functions to the input, so we chain them together:

JavaScript
removeSpaces(lowerize(input))

Now imagine there are actually 7 steps required to normalize the input:

JavaScript
censorProfanity(redact(clampAtLength100(removeApostrophes(stripBritishU(removeSpaces(lowerize(input)))))))

The code has become gross. It'd be nice if we could wire all the function calls together into a single function that represents the whole normalization pipeline. Then we could very succinctly say something like this:

JavaScript
normalize(text)

In fact, we can collapse functions together. Wiring two functions together to produce a new function is called composition. This term is overloaded; it is also used to describe the has-a relationship of objects. In a function composition, the actual parameters are passed off to the first function, whose return value is passed off to the second, whose return value is passed off to the third, and so on.

Here two of the functions are composed using a helper function compose that welds them together:

JavaScript
const normalize = compose(removeSpaces, lowerize);

This is JavaScript, but JavaScript doesn't have a builtin compose function. Not to worry, as we can write it ourselves. This compose function accepts two functions as parameters and returns a lambda that wires them together in the proper order:

JavaScript
function compose(g, f) {
  return x => g(f(x));
}

Note that compose only forms a lambda and returns it. None of f, g, or the lambda are actually called anywhere in this code. Some other code will later call normalize like this:

JavaScript
const tame = normalize(wild);

When normalize is called, the wild argument arrives at the start of the pipeline as x. The lambda then calls f and then g, both of which have been preserved in its closure.

Function-loving Haskell does have a builtin composition operator. It is named ., perhaps to look like the mathematical composition operator seen in \(g \circ f\). The Haskell expression g (f x) is composed as (g . f) x. The type signature of the . operator shows how the two functions piece together:

Haskell
(.) :: (b -> c) -> (a -> b) -> (a -> c)

The second parameter is a function that takes in an a and yields a b. The first parameter is a function that takes in a b and yields a c. The overall value returned by the operator is a function that takes in an a and produces a c.
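
If we wanted to, we could define such an operator ourselves, mirroring the JavaScript compose from earlier. The name myCompose below is hypothetical; Haskell's builtin . already does this job, so the definition is only a sketch for illustration:

Haskell
-- A hand-rolled stand-in for the builtin . operator, shown only for illustration.
myCompose :: (b -> c) -> (a -> b) -> (a -> c)
myCompose g f = \x -> g (f x)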

An equivalent normalize function might be defined in Haskell as follows:

Haskell
import Data.Char (toLower)

removeSpaces = filter (/= ' ')
lowerize = map toLower
normalize = removeSpaces . lowerize
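
As a quick check, the composed pipeline behaves like calling lowerize first and then removeSpaces. The sample string here is just an illustration:

Haskell
example = normalize "Dear Computer"  -- yields "dearcomputer"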

To sort a list in reverse, we sort it and then reverse it. Or we wire the reverse and sort functions together to produce an rsort function:

Haskell
import Data.List (sort)

rsort :: Ord a => [a] -> [a]
rsort = reverse . sort
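
For instance, with an illustrative sample list:

Haskell
example = rsort [3, 1, 2]  -- yields [3, 2, 1]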

To get the second element of a list, the neck, we take the head of the tail. Or we wire these two functions together to produce a neck function:

Haskell
neck :: [a] -> a
neck = head . tail
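
For instance, applied to an illustrative sample list, neck skips the first element and returns the second:

Haskell
example = neck ["head", "neck", "torso"]  -- yields "neck"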

To count how many positive numbers are in a list, we filter the list and then ask its length. Or we wire these two functions together to produce a positiveCount function:

Haskell
positiveCount :: (Num a, Ord a) => [a] -> Int
positiveCount = length . filter (> 0)
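
For instance, with an illustrative sample list in which three of the five numbers are positive:

Haskell
example = positiveCount [3, -1, 4, -1, 5]  -- yields 3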

To find the length of the longest word in a list of words, we find all the lengths and then select out the maximum. Or we wire these two functions together to produce a longth function:

Haskell
longth :: [String] -> Int
longth = maximum . map length

example = longth ["a", "an", "the"]  -- yields 3

To find the sum of the y-coordinates of the pairs whose x-coordinates are at least 0 in a list of coordinate pairs, we filter the pairs, extract the y-coordinates, and add them all together. Or we wire these three functions together to produce a sumRights function:

Haskell
sumRights :: [(Int, Int)] -> Int
sumRights = sum . map snd . filter (\(x, _) -> x >= 0)

example = sumRights [(1, 3), (-5, 6), (2, 6)]  -- yields 9

We can write sophisticated computational pipelines without a lot of noise using the composition operator.
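
For instance, the whole seven-step normalization imagined earlier collapses into a single chain of compositions. The name normalizeFully and the five extra helpers below are hypothetical, carried over from the earlier JavaScript example:

Haskell
-- normalizeFully and the five extra helpers are hypothetical, as in the JavaScript example.
normalizeFully = censorProfanity . redact . clampAtLength100 . removeApostrophes . stripBritishU . removeSpaces . lowerize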

Composition with Application

Both . and $ are infix operators that are used to pass some computation from their right operand into their left operand. For both operators, the left operand is a function awaiting a parameter. They differ in their right operand and in the type of value they produce. Some expressions can be written with either operator. Let's examine how.

Suppose we want a function sumPositives that sums up all the positive numbers of a list. In our first draft, we write it with neither of the operators but end up with something that doesn't compile:

Haskell
sumPositives xs = sum filter (> 0) xs  -- doesn't compile

This fails because function calling is left associative and has the highest precedence of all operations in Haskell. The sum function is leftmost, so it is called first and is passed the filter function itself, with no arguments applied, as if it were written like this:

Haskell
sumPositives xs = (sum filter) (> 0) xs  -- doesn't compile

We want filter to get called first, so we add some extra parentheses to elevate its precedence:

Haskell
sumPositives xs = sum (filter (> 0) xs)

We could have used the $ operator:

Haskell
sumPositives xs = sum $ filter (> 0) xs

The $ operator has the lowest precedence of all Haskell operators, so any operators in its left and right operands bind more tightly than it does. The two sides are evaluated on their own first. Then the right value is passed as a parameter to the function on the left. Parentheses achieve a similar effect, but they work by raising the precedence of their nested expression over the expressions on the outside.

Compare these two operators to two different ways we humans might raise our self-esteem. We could hang around an incompetent lowlife like $ and feel like a genius. Or we could elevate our own prestige by decorating ourselves with ().

$ is sometimes called the application operator because it applies the function on its left to the parameter on its right. In contrast, the composition operator . expects two functions as its operands. It glues them together so that the return value of the second becomes the parameter of the first. It doesn't perform any application but yields a new function waiting to be applied. The type signatures indicate this difference:

Haskell
(.) :: (b -> c) -> (a -> b) -> (a -> c)
($) :: (a -> b) -> a -> b

We can use . to glue sum and the partially-applied filter together before performing the final application to xs:

Haskell
sumPositives xs = (sum . filter (> 0)) xs

Instead of the parentheses, we could use the $ operator to elevate the precedences of its two operands over itself:

Haskell
sumPositives xs = sum . filter (> 0) $ xs

This is a nice solution, but we can go a step further. If we think of our job as assembling a function out of other functions rather than defining one from scratch, we can write it in point-free style:

Haskell
sumPositives = sum . filter (> 0)
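
Every compiling version of sumPositives computes the same value. For instance, with an illustrative sample list:

Haskell
example = sumPositives [3, -1, 4, -1, 5]  -- yields 12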

In Haskell, combining terms to yield functions is much like combining terms to produce numbers or strings.
