From: Christiaan Baaij
Date: Mon, 22 Feb 2010 08:32:33 +0000 (+0100)
Subject: Merge branch 'master' of http://git.stderr.nl/matthijs/projects/cλash-paper

Merge branch 'master' of git.stderr.nl/matthijs/projects/cλash-paper

---

\subsection{Polymorphic functions}
A powerful construct in most functional languages is polymorphism.
This means that the arguments of a function (and, consequently, values
within the function as well) do not need to have a fixed type.
Haskell supports \emph{parametric polymorphism}, meaning a function's
type can be parameterized with another type.

As an example of a polymorphic function, consider the type of the
following \hs{append} function:

TODO: Use vectors instead of lists?

\begin{code}
append :: [a] -> a -> [a]
\end{code}

This type is parameterized by \hs{a}, which can be any type at all.
This means that \hs{append} can append an element to a list,
regardless of the type of the elements in the list (but the element
added must match the elements already in the list, since there is
only one \hs{a}).

This kind of polymorphism is extremely useful in hardware designs: it
allows operations to work on a vector without knowing exactly what
elements it contains, signals to be routed without knowing exactly
what kind of signals they are, and vectors to be processed without
knowing exactly how long they are. Polymorphism also plays an
important role in most higher order functions, as we will see in the
next section.

The previous example showed unconstrained polymorphism (TODO: How is
this really called?): \hs{a} can have \emph{any} type. In addition,
Haskell supports limiting the types of a type parameter to a specific
class of types. An example of such a type class is the \hs{Num}
class, which contains all of Haskell's numerical types.

Now, take the addition operator, which has the following type:

\begin{code}
(+) :: Num a => a -> a -> a
\end{code}

This type is again parameterized by \hs{a}, but here \hs{a} can only
be a type that is an \emph{instance} of the \emph{type class}
\hs{Num}. Our numerical built-in types are also instances of the
\hs{Num} class, so we can use the addition operator on \hs{SizedWords}
as well as on \hs{SizedInts}.

In \CLaSH, unconstrained polymorphism is fully supported. Any
function defined can have any number of unconstrained type
parameters; the \CLaSH compiler will infer the type of every such
argument from the way the function is applied. There is one exception
to this: the top level function that is translated cannot have any
polymorphic arguments, since it is never applied and there is thus no
way to find out the actual types for its type parameters.

\CLaSH does not support user-defined type classes, but does use some
of the built-in ones for its built-in functions (like \hs{Num} and
\hs{Eq}).
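To make the reuse that such constrained polymorphism buys a little
more concrete, the following small sketch (ours, not taken from the
paper's sources; the names \hs{sumPair} and \hs{useAtTwoTypes} are
made up, and the standard \hs{Int} and \hs{Integer} types stand in
for the sized types mentioned above) shows a single
\hs{Num}-constrained definition being used at two different types:

\begin{code}
-- Sketch only: a single definition, constrained to the Num class,
-- that adds the two components of a pair.
sumPair :: Num a => (a, a) -> a
sumPair (x, y) = x + y

-- The same definition is reused at two different numeric types;
-- Int and Integer are stand-ins for the sized hardware types.
useAtTwoTypes :: (Int, Integer)
useAtTwoTypes = (sumPair (1, 2), sumPair (3, 4))
\end{code}

Because \hs{sumPair} only assumes the \hs{Num} interface, the same
source text describes an adder for either representation.

\subsection{Higher order}
Another powerful abstraction mechanism in functional languages is the
concept of \emph{higher order functions}, or \emph{functions as
first-class values}.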
This allows a function to be treated as a value and be passed around,
even as the argument of another function. Let's clarify that with an
example:

\begin{code}
notList xs = map not xs
\end{code}

This defines a function \hs{notList}, with a single list of booleans
\hs{xs} as its argument, which simply negates all of the booleans in
the list. To do this, it uses the function \hs{map}, which takes
\emph{another function} as its first argument and applies that
function to each element of the list, returning the list of results.

As you can see, the \hs{map} function is a higher order function,
since it takes another function as an argument. Also note that
\hs{map} is again a polymorphic function: it does not pose any
constraints on the type of the elements in the list passed, other
than that it must be the same as the argument type of the function
passed. The type of the elements in the resulting list is of course
equal to the return type of the function passed (which need not be
the same as the type of the elements in the input list). Both of
these properties can be readily seen from the type of \hs{map}:

\begin{code}
map :: (a -> b) -> [a] -> [b]
\end{code}

As an example from a common hardware design, let's look at the
equation of a FIR filter.

\begin{equation}
y_t = \sum\nolimits_{i = 0}^{n - 1} {x_{t - i} \cdot h_i }
\end{equation}

A FIR filter multiplies fixed constants ($h$) with the current and
a few previous input samples ($x$). These products are then summed to
produce the result at time $t$.

This is easily and directly implemented using higher order functions.
Assume that the vector \hs{hs} contains the FIR coefficients and the
vector \hs{xs} contains the current input sample in front and older
samples behind. How \hs{xs} gets its value will be shown in the next
section, about state.

\begin{code}
fir ... = foldl1 (+) (zipWith (*) xs hs)
\end{code}

Here, the \hs{zipWith} function is very similar to the \hs{map}
function: it takes a function and two lists, and applies the function
to the elements of the two lists pairwise (\emph{e.g.},
\hs{zipWith (+) [1, 2] [3, 4]} becomes \hs{[1 + 3, 2 + 4]}).

The \hs{foldl1} function takes a function and a single list, and
applies the function to the first two elements of the list. It then
applies the function to the result of the first application and the
next element from the list. This continues until the end of the list
is reached; the result of \hs{foldl1} is the result of the last
application.

As you can see, the \hs{zipWith (*)} function is just pairwise
multiplication and the \hs{foldl1 (+)} function is just summation.

To make the correspondence between the code and the equation even
more obvious, we turn the list of input samples in the equation
around. So, instead of having the input sample received at time $t$
in $x_t$, $x_0$ now always stores the current sample, and $x_i$
stores the $i$th previous sample. This changes the equation to the
following (note that this is completely equivalent to the original
equation, just with a different definition of $x$ that better suits
the \hs{xs} from the code):

\begin{equation}
y_t = \sum\nolimits_{i = 0}^{n - 1} {x_i \cdot h_i }
\end{equation}
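To show how these pieces fit together, here is a small, self-contained
sketch of the same computation (ours, not the paper's actual \CLaSH
definition: it uses ordinary Haskell lists and plain \hs{Int} values
instead of the fixed-length vectors and sized types discussed earlier,
and the name \hs{firList} is made up):

\begin{code}
-- Sketch only: the FIR computation on plain Haskell lists, with the
-- coefficients hs and samples xs passed in explicitly.
firList :: [Int] -> [Int] -> Int
firList hs xs = foldl1 (+) (zipWith (*) xs hs)

-- For example, with coefficients [1, 2, 3] and samples [4, 5, 6]
-- this computes 1 * 4 + 2 * 5 + 3 * 6 = 32.
firExample :: Int
firExample = firList [1, 2, 3] [4, 5, 6]
\end{code}

The body is identical to the definition above; only the concrete list
types and the explicit arguments have been added to make the sketch
runnable on its own.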
So far, only functions have been used as higher order values. In
Haskell, there are two more ways to obtain a function-typed value:
partial application and lambda abstraction. Partial application means
that a function that takes multiple arguments can be applied to a
single argument, and the result is again a function (that takes one
argument less). As an example, consider the following expression,
which adds one to every element of a list:

\begin{code}
map ((+) 1) xs
\end{code}

Here, the expression \hs{(+) 1} is the partial application of the
plus operator to the value \hs{1}, which is again a function that
adds one to its argument.

A lambda expression allows one to introduce an anonymous function in
any expression. Consider the following expression, which again adds
one to every element of a list:

\begin{code}
map (\x -> x + 1) xs
\end{code}

Finally, higher order arguments are not limited to the built-in
functions: any function defined in \CLaSH can have function
arguments. This gives the hardware designer a powerful abstraction
mechanism to use in his designs and allows a great deal of code
reuse.

TODO: Describe ALU example (no code)

\subsection{State}
A very important concept in hardware is the concept of state. In a
stateful design, the outputs depend on the history of the inputs, or the