Listening to: Haskell Cast, Episode 3: Simon Peyton Jones on GHC
Commentary follows below.
It becomes very difficult to change a language once it is embedded in many products. Early in the lifetime of Haskell, changes to the standard library were made more liberally. As Haskell becomes more successful, it will become a little less nimble.
"It's not language changes that's most difficult to manage, it's library changes. It's one reason we recently established the core libraries committee."
GHC has supported type-level functions, e.g. f Int = Char, which gives us associated types. These were introduced as a result of a paper comparing type-level programmability features across languages; Haskell was missing associated types.
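A minimal sketch of what an associated type looks like (the Collection class and its members are illustrative, not from the episode):

```haskell
{-# LANGUAGE TypeFamilies #-}

-- An associated type lets each instance choose a related type:
-- here, every collection declares its own element type.
class Collection c where
  type Elem c
  empty  :: c
  insert :: Elem c -> c -> c

instance Collection [a] where
  type Elem [a] = a
  empty  = []
  insert = (:)
```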
With a closed type family, you can give an order for the equations. As a result, you need fewer equations.
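For instance, the ordering lets a catch-all equation overlap the specific ones (the Flatten family here is illustrative):

```haskell
{-# LANGUAGE TypeFamilies #-}

-- A closed type family: equations are matched top to bottom, so an
-- overlapping catch-all is fine and fewer equations suffice.
type family Flatten a where
  Flatten [a] = Flatten a
  Flatten a   = a

-- Flatten [[Int]] reduces to Int, so this typechecks:
deepValue :: Flatten [[Int]]
deepValue = 42
```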
Rationale: solve Cabal hell by introducing a level of abstraction.
199x: Started with an ML-like module system as a notion, but it turned out to be too much trouble early in the language's evolution, so they went with the simplest thing that could possibly work.
Fast forward to 2005: something was needed to indicate a minimum unit of distribution, the Cabal package. Cabal packages act as a kind of metamodule. The idea behind Backpack is to let Haskell packages depend on APIs rather than on concrete implementations. It's not about the Haskell language per se, but rather a way to tackle Cabal hell. It would be a big change, because it would change how Haskell packages are distributed.
Status: it's not yet implemented.
"Cabal hell is a real place, a very real place".
"How can more people get involved?"
GHC is everyone's compiler; it's time to work on it together. 14 new committers have joined since 2013.
New pieces that are growing: backend code generation, data parallel programming, template Haskell, parallel I/O manager, dynamic linking. There's a lot of room for more ideas.
On BDFLs - there has to be a way to make decisions where universal consensus isn't possible. On the compiler, SPJ will serve as a BDFL of sorts. The core libraries committee will serve as BDFL where library decisions are concerned.
Array fusion as a way to abstract away the low-level details of vector operations, as well as vector instructions.
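A hedged illustration of the idea with plain lists (the vector library's stream fusion is the real machinery; this only shows the shape): the pipeline is written as separate passes, and GHC's rewrite rules aim to compile it into one tight loop with no intermediate structure.

```haskell
-- Two logical passes (zipWith, then sum); fusion rules aim to
-- collapse them into a single traversal with no intermediate list.
dotProduct :: [Double] -> [Double] -> Double
dotProduct xs ys = sum (zipWith (*) xs ys)
```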
If you want to program a parallel machine, start with a functional language. To make things actually run fast, it takes a lot of iteration.
There's also large-scale concurrency in Haskell, like Cloud Haskell.
There's also small-scale concurrency in Haskell, like forkIO and MVars; this is process-level communication. There's an approach using LVars (lattice-based variables) that might introduce determinism into these kinds of computations.
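A minimal sketch of the forkIO/MVar style (the sumInWorker function is my own illustration): fork a worker thread and hand its result back through an MVar, which doubles as the synchronization point.

```haskell
import Control.Concurrent (forkIO)
import Control.Concurrent.MVar (newEmptyMVar, putMVar, takeMVar)

-- Fork a worker to compute the sum; takeMVar blocks until the
-- worker has written its result into the MVar.
sumInWorker :: [Int] -> IO Int
sumInWorker xs = do
  box <- newEmptyMVar
  _ <- forkIO (putMVar box (sum xs))
  takeMVar box
```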
Purely functional parallelism: Repa, Data Parallel Haskell, Accelerate, etc.
GHC out of the box already supports parallelism.
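For example, the par and pseq primitives ship in base (exported from GHC.Conc); compile with -threaded and run with +RTS -N to actually use multiple cores. A minimal sketch:

```haskell
import GHC.Conc (par, pseq)

-- `par` sparks its first argument for possible parallel evaluation;
-- `pseq` forces evaluation order, so b is computed on this thread
-- while the spark for a may run on another core.
parSum :: [Int] -> [Int] -> Int
parSum xs ys = a `par` (b `pseq` (a + b))
  where
    a = sum xs
    b = sum ys
```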
Part of the parallelism problem is that someone can take (with a lot of effort) a program in a low-level language and access all the parallelism in a given machine. The challenge in a functional, high-level language is to find a way to do this cleanly and within a few constant factors of the speed of the low-level implementation.
You can't screw around with side effects, because the ordering of effects is not obvious in a lazy language.
Being able to prove that pattern-match failures can't occur would be great. There's some research being done on this. We can be reasonably certain that a Haskell program won't segfault, but we're still at a point where a program might fail at runtime because of a failed pattern match.
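The classic example is head: head [] compiles fine and crashes at runtime. The usual workaround is to make the partiality explicit in the type:

```haskell
-- A total alternative to the partial `head`: the incompleteness is
-- pushed into the Maybe, so no pattern match can fail at runtime.
safeHead :: [a] -> Maybe a
safeHead []      = Nothing
safeHead (x : _) = Just x
```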
Getting a handle on newtype and coercions. It's possible to convert from an Age to an Int at runtime for free. It's not possible to convert from an [Age] to an [Int] for free at runtime; this would require a map.
This is where roles come in. They require a new form of equality.
GHC 7.8 exposes some of these ideas in an experimental form.
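Concretely, GHC 7.8's Data.Coerce exposes coerce for exactly these zero-cost conversions (the Age newtype here is my own illustration):

```haskell
import Data.Coerce (coerce)

newtype Age = Age Int deriving (Eq, Show)

-- coerce is free at runtime: it unwraps a single newtype and,
-- thanks to roles, also works under containers like lists,
-- with no map ever executed.
ages :: [Age]
ages = map Age [30, 40]

ints :: [Int]
ints = coerce ages
```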
Over time, the hope is to build more type-level reasoning into GHC. Type-level naturals are being added, allowing one to express ideas such as a vector of length 3 and to do basic operations on it. Future versions of GHC will support more such operations.
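A sketch of the length-in-the-type idea using promoted Peano naturals via DataKinds (GHC's literal Nats from GHC.TypeLits serve the same purpose; the Vec type here is illustrative):

```haskell
{-# LANGUAGE DataKinds, GADTs, KindSignatures #-}

-- Peano naturals; DataKinds promotes Z and S to the type level.
data Nat = Z | S Nat

-- A vector whose length is part of its type.
data Vec (n :: Nat) a where
  VNil  :: Vec 'Z a
  VCons :: a -> Vec n a -> Vec ('S n) a

-- Head of a known non-empty vector: the VNil case is ruled out
-- at compile time, so this pattern match is total.
vhead :: Vec ('S n) a -> a
vhead (VCons x _) = x
```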
Going beyond word processors and database management in schooling, and trying to reform the ICT curriculum toward computer science as a "foundational discipline".
Functional programming is still a foreign concept to many teachers. A separate exercise from the Computing at School project would be to establish functional programming as a medium for conveying the concepts of computer science. However, less work is being done on this front because the first goal is to get computer science into curricula at all.
Haskell is growing. I'm pretty excited about the developments in the realm of concurrency, myself. Given that I develop APIs as part of my day-to-day, having easy-to-use, parallelizable concurrency primitives in Haskell makes it look very tempting. Furthermore, having strong guarantees available at compile time makes me feel more comfortable that things will work when refactoring time comes around...
...and there will always be refactoring!