This post is going to draw an angry Haskell mob, but I just have to say it out loud: I have never seen a definition of the so-called category Hask and I do not actually believe there is one until someone does some serious work.

Let us look at the matter a bit closer. The Haskell wiki page on Hask says:

> The objects of Hask are Haskell types, and the morphisms from objects `A` to `B` are Haskell functions of type `A -> B`. The identity morphism for object `A` is `id :: A -> A`, and the composition of morphisms `f` and `g` is `f . g = \x -> f (g x)`.

Presumably “function” here means “closed expression”. It is then immediately noticed that there is a problem because the supposed identity morphisms do not actually work correctly: `seq undefined () = undefined` and `seq (undefined . id) () = ()`, therefore we do not have `undefined . id = undefined`.
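This counterexample can be checked directly in GHC. The following sketch (assuming a standard GHC setup; the exception machinery is only there so both evaluations fit in one run) catches the error that forcing `undefined` raises:

```haskell
import Control.Exception (SomeException, evaluate, try)

main :: IO ()
main = do
  -- (undefined . id) is a lambda, hence already in weak head normal
  -- form, so seq does not force undefined and the result is ().
  print (seq (undefined . id) ())
  -- seq undefined () forces undefined, which raises an exception,
  -- i.e. denotationally the result is bottom.
  r <- try (evaluate (seq undefined ())) :: IO (Either SomeException ())
  putStrLn (either (const "bottom") (const "()") r)
```

Since `seq` distinguishes `undefined` from `undefined . id`, composing with `id` is observable, and `id` fails to be an identity morphism.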

The proposed solution is to equate `f :: A -> B` and `g :: A -> B` when `f x = g x` for all `x :: A`. Again, we may presume that here `x` ranges over all closed expressions of type `A`. But this begs the question: *what does `f x = g x` mean?* Obviously, it cannot mean “syntactically equal expressions”. If we had a notion of observational or contextual equivalence then we could use that, but there is no such thing until somebody provides an operational semantics of Haskell. Written down, in detail, in standard form.

The wiki page gives two references. One is about the denotational semantics of Haskell, which is just a certain category of continuous posets. That is all fine, but such a category is not the syntactic category we are looking for. The other paper is a fine piece of work that uses denotational semantics to prove cool things, but does not speak of any syntactic category for Haskell.

There are several ways in which we could resolve the problem:

- If we define a notion of observational or contextual equivalence for Haskell, then we will know what it means for two expressions to be indistinguishable. We can then use this notion to equate indistinguishable morphisms.
- We could try to define the equality relation more carefully. The wiki page does a first step by specifying that at a function type equality is the extensional equality. Similarly, we could define that two pairs are equal if their components are equal, etc. But there are a lot of type constructors (including recursive types) and you’d have to go through them, and define a notion of equality on all of them. And after that, you need to show that this notion of equality actually gives a category. All the while, there will be a nagging doubt as to what it all means, since there is no operational semantics of Haskell.
- We could import a category-theoretic structure from a denotational semantics. It seems that denotational semantics of Haskell actually exists and is some sort of a category of domains. However, this would just mean we’re restricting attention to a subcategory of the semantic category on the definable objects and morphisms. There is little to no advantage of doing so, and it’s better to just stick with the semantic category.

Until someone actually does some work, **there is no Hask**! I’d be delighted to be wrong, but I have not seen a complete construction of such a category yet.

Perhaps you think it is OK to pretend that something is a category when it is not. In that case, you would also pretend that the Haskell monads are actual category-theoretic monads. I recall a story from one of my math professors: when she was still a doctoral student she participated as “math support” in the construction of a small experimental nuclear reactor in Slovenia. One of the physicists asked her to estimate the value of the harmonic series $1 + 1/2 + 1/3 + \cdots$ to four decimals. When she tried to explain the series diverged, he said “that’s ok, let’s just pretend it converges”.

**Supplemental:** Of the three solutions mentioned above I like best the one where we give Haskell an operational semantics. It is more or less clear how we would do this; after all, Haskell is more or less a glorified PCF. However, the thing that worries me is `seq`. Because of it, `undefined` and `undefined . id` are *not* observationally equivalent, which means that we cannot use observational equivalence for equality of morphisms. We could try the wiki definition: `f :: A -> B` and `g :: A -> B` represent the same morphism if `f x` and `g x` are observationally equivalent for all closed expressions `x :: A`. But then we need to prove something after that to know that we really have a category. For instance, I no longer find it obvious that programs which involve `seq` behave nicely. And what happens with higher-order functions, where observational equivalence and extensional equality get mixed up: is everything still holding water? There are just too many questions to be answered before we have a category.

**Supplemental II:** Now that the mob is here, I can see certain patterns in the comments, so I will allow myself to reply to them en masse by supplementing the post. I hope you all will notice this. Let me be clear that I am not arguing against the usefulness of category-theoretic thinking in programming. In fact, I support programming that is informed by abstraction, as it often leads to new insights and helps get things done correctly. (And anyone who knows my work should find this completely obvious.)

Nor am I objecting to “fast & loose” mode of thinking while investigating a new idea in Haskell, that is obviously quite useful as well. I am objecting to:

- The fact that the Haskell wiki claims there is such a thing as “the category Hask” and pretends that everything is ok.
- The fact that some people find it acceptable to defend broken mathematics on the grounds that it is useful. Non-broken mathematics is also useful, as well as correct. Good engineers do not rationalize broken math by saying “life is tough”.

Anyhow, we do not need the Hask category. There already are other categories into which we can map Haskell, and they explain things quite well. It is ok to say “you can think of Haskell as a sort of category, but beware, there are technical details which break this idea, so you need to be a bit careful”. It is not ok to write on the Haskell wiki “Hask is a category”. Which is why I put up this blog post, so when people Google for Hask they’ll hopefully find the truth behind it.

**Supplemental III**: On Twitter people have suggested some references that provide an operational semantics of Haskell:

- John Launchbury: A natural semantics for lazy evaluation
- Alan Jeffrey: A fully abstract semantics for concurrent graph reduction

Can we use these to define a suitable notion of equality of morphisms? (And let’s forget about `seq` for the time being.)

I thought it had been resolved years ago that Hask was not a real category?

It was a great disappointment to me when, after years of study, I realized that CT was broken for Turing complete systems: since it is undecidable in general whether certain arrows exist, you end up having to restrict yourself to either strongly-normalizing systems or “magic” systems, like Set, that pretend to allow you to see to the end of infinity.

Your comment makes no sense. Turing completeness has nothing to do with categories. It is perfectly ok to have categories in which it is undecidable whether some particular arrow exists, or in fact whether the thing even is a category. You are confusing decidability and well-definedness.

Note: I am cutting off replies on this subdiscussion because it has nothing to do with the blog post. There are enough information channels on the internet for people to discuss this to death.

To keep my comment simple: I think that this article is just pointing out the obvious.

This points much more towards a very poorly evolved pseudo-mathematical culture surrounding modern day programming language usage and design. I am assured that the biggest advocates of “haskell being a well-designed language because well … category theory and stuff” are entirely disjoint from the set of people who actually have spent far too much of their lives studying category theory.

Anyways, a higher degree of rigor would be nice in PL discussions. Or as you say getting “someone who actually does some work” to make claims formally (via a theorem and proof at least?) instead of a slide presentation. I avoid this side of “computer science” and “mathematics” quite extensively because of little things like this.

@Anonymous Coward: you’d be glad to hear that there are plenty of people in the PL research community who hold a very high standard of mathematics, many of them even higher than mathematicians themselves because they routinely formalize their theorems with proof assistants. The “PL community” that one meets on reddit is more like the community you find in a beer bar after a long PL conference day.

“This points much more towards a very poorly evolved pseudo-mathematical culture surrounding modern day programming language usage and design.”

No: “… surrounding Haskell.” Period.

It seems to me that the real problem is that Haskellers take for granted that they can work in a terminating/productive/whatever subset of the language, rather than the whole of it. Of course, the assumption is false, but, if it were true, Hask would be a bona fide category.

If wishes were horses …

Marc — Not sure what you mean by CT being “broken” for Turing Complete systems. Decidability of halting doesn’t seem important in this context…

Andrej is observing that “Hask” (at least currently) fails to form a category in the sense that naïve Haskellers wished it would… This is not because of Turing completeness, but because of laziness.

More generally, because of not being anything at all. Haskell has no definition.

As I understand it, you don’t even need to work in a terminating subset to have a category (albeit without all the niceties like straightforward categorical products that one might like). You just need to be in “the subset that doesn’t include seq”. And that’s what we all pretend, since we all know that seq absent a typeclass constraint is bad and wrong and should never have been added, since it generally breaks eta and messes up everything.

It is there for a reason, as is unsafePerformIO.

Your constructivism is showing. :-)

People just like to believe there is a category hiding somewhere in there. Or, more precisely, just use the language of CT to speak about Haskell functions. And, unlike the convergence of harmonic series, the existence of Hask has not been proven to lead to a contradiction as far as I understand. :-)

It’s not my constructivism, it’s my being a mathematician. I do not accept the kind of sloppiness that passes for mathematics on the Haskell wiki page. If they want to do that sort of thing they should call it pseudo-mathematics.

It is usual for constructivists to call their constructivism “just being a mathematician”. :-) It is a simple fact that people, yes, even mathematicians, use the language by analogy, without formalizing every aspect of it. Sometimes it’s justified, sometimes it’s not, but it’s human and it’s pervasive. See https://paulhankin.github.io/Complexity/ for a cute example.

May I refer you to “Am I a constructive mathematician?“, since you seem to think I am one.

I think labels are not so important. It’s just that your paragraph

> Until someone actually does some work, there is no Hask! I’d delighted to be wrong, but I have not seen a complete construction of such a category yet.

seems to suggest strongly that you _are_ a man who believes the object doesn’t exist until formally constructed.

Of course _there is Hask_, in the sense the people you quote are talking about. We don’t actually know whether it can be modelled in some precise mathematics, but so far we haven’t seen strong evidence to the contrary. And even if it can, it is perfectly possible that it can be done in many individually interesting but mutually incompatible ways – similarly to the reals in your linked post.

I think it should be clear from the context that I am talking about mathematics here, and not Internet-level hand waving. And furthermore it should be clear that you are misrepresenting my position about formal construction: I am talking about the fact that I have not seen any precise definition of Hask. This is a statement about humans, namely that nobody has done a piece of math. It is not a statement of my beliefs about the existence of mathematical objects. So don’t put words in my mouth.

And regarding “evidence to the contrary”, that’s not how mathematics works. If someone says “Hask is a category” then it’s their duty to prove it. Until then, they’ve got nothing.

“People just like to believe [complete nonsense about Haskell].”. Indeed.

Well, except if you use Ramanujan summation, where you then end up with the Euler–Mascheroni constant.

The `Fast and Loose Reasoning is Morally Correct’ paper seems relevant here.

I already commented on the Fast & Loose paper in my blog post, but perhaps I was not explicit enough. Haskell people offer that paper as the universal solution to any mathematical objection. The paper does not define a syntactic category out of Haskell. It uses PER models on domains as a semantic model. It does not address the question I am asking here.

Andrej, thanks for the post.

I wrote a follow-up/response/excuse here: https://ro-che.info/articles/2016-08-07-hask-category

`seq` and the liftedness of products breaks the nice abstraction, yes.

Dan Piponi wrote up this already-by-then folklore result in 2009.

http://blog.sigfpe.com/2009/10/what-category-do-haskell-types-and.html

That said, in practice, getting by with “fast and loose reasoning” and pretending that these extra issues don’t exist is sufficient for the identities we can borrow from category theory to be useful.

https://www.cs.ox.ac.uk/jeremy.gibbons/publications/fast+loose.pdf

If your program terminates under both semantics, then working in the simplified seq-less world is vastly easier.

So while Hask isn’t a category once you include `seq` in the language, and (,) isn’t a categorical product, because `_|_ /= (_|_ , _|_ )`, the view is “morally correct” to borrow the terminology from that latter paper.
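The lifted-pair point can be observed directly in GHC. In this sketch (names and exception plumbing are just for the demonstration), `seq` tells `_|_` and `(_|_, _|_)` apart because the pair constructor is already in weak head normal form:

```haskell
import Control.Exception (SomeException, evaluate, try)

main :: IO ()
main = do
  -- The (,) constructor is in weak head normal form, so seq does not
  -- look inside: a pair of bottoms is not bottom.
  print (seq ((undefined, undefined) :: ((), ())) "pair of bottoms is defined")
  -- Bottom at the pair type itself is genuinely undefined.
  r <- try (evaluate (seq (undefined :: ((), ())) ()))
         :: IO (Either SomeException ())
  putStrLn (either (const "bottom") (const "()") r)
```

This extra element below `(_|_, _|_)` is exactly what stops `(,)` from satisfying the universal property of a categorical product.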

Category theory is a powerful enough substrate that even doing it wrongly adds a lot of utility.

This is precisely what I am arguing against. Hand-waving and referring to the “fast & loose” paper as a solution to every mathematical hole. It is dangerous, it is dishonest, and it is doing a huge disservice to the community. People walk away from Haskell thinking they know some category theory where in fact they have not even seen a category yet.

There is a perfectly good categorical treatment of Haskell in terms of denotational semantics. It uses (a suitable version of) $\omega$-CPOs. It explains exactly what’s going on. It clears up misunderstanding about products and monads, and all the other stuff, by showing how we need to argue with inequalities instead of equalities, etc. But this is not Hask. There is no Hask and people should not be told there is one. It is not that hard to learn a bit of domain theory.

By the way, I am quite happy to ignore `seq`, but I am not willing to ignore general recursion because you cannot write any program in Haskell without it.

Personally I’m happy for a massive “assuming either no seq or totality” statement to be added, at least when talking H98/2010 or after instance dispatch is resolved. Making the disclaimer would certainly be well worth doing, because seq really does mess things up otherwise!

We need better cultural education about the Fast And Loose assumption and/or seq, definitely. But given that much, the rest mostly drops out. I haven’t seen the categorical logic take on System F (with or without additional fix) specifically for example, but it’s a lot clearer that’s going to exist. And a lot of good work gets done miles away from anywhere that seq would have any utility.

Semantics of System F has been studied, of course. For the latest advances (and a significant cleanup) see: Neil Ghani, Fredrik Nordvall Forsberg, Alex Simpson: Comprehensive Parametric Polymorphism: Categorical Models and Type Theory. FoSSaCS 2016: 3–19.

Andrej, I wonder if you have read Dan Piponi’s “What Category do Haskell Types and Functions Live in?” (http://blog.sigfpe.com/2009/10/what-category-do-haskell-types-and.html)? In no way does it invalidate your points (it supports them), but he argues (and I agree) that asking whether Haskell types and functions form a purported category “Hask” is not really the right question. Rather the better question to ask is what *class* of categories can be modelled by the fragment of the language in which a given program is written. Quoting Piponi:

Haskell (and various subsets and extensions) is essentially a way to give semantics to internal languages for various classes of category. However complicated and messy those semantics might get on a real world computer, the language itself is a thing of beauty and more general than might appear at first.

Sure, but as long as we do not actually know how the said fragment of the programming language actually computes, we cannot formulate a mathematical model of such a fragment, and so we know nothing. All I am asking for here is an honest mathematical standard.

Hmm, re-reading my comment above, I would like to make one qualification that laziness is not the entire source of the problem; it interacts with it though. The presence of ‘seq’ (which is not even properly accounted for in Haskell’s supposed denotational semantics) is what really gums up the works, I think.

@Gershom: Yes, but you need to pretend Haskell is a terminating language to treat Hask as a moral analogue of the category of sets. And the entire library ecosystem shows that Haskellers do think of Hask as a moral analogue of the category of sets.

@Jon @Andrej The idea that undecidability is irrelevant (or can be worked around) in a logical or mathematical system is itself the problem I see. I guess I subscribe to a stricter brand (or at least different brand) of constructivism. Good to know.

I may have met you online before, I think. I would not call your ideas a “stricter brand of constructivism” because it goes a bit too far for a constructivist. For instance, you do not even believe in first-order logic and natural numbers, because those automatically create non-decidable formulas. So, before you do anything else, you have to argue that your position is actually sustainable and that you can do basic mathematics. But I do not want an off-topic discussion about this here.

What brand of constructivism is this? Regardless of whether it is possible to do actual mathematics in it (which Andrej brings up), what you are describing is completely foreign to the three main branches of constructive mathematics: Brouwer’s school, Markov’s school and Bishop’s school.

Feel free to follow up with me on this topic, but not on poor Andrej’s blog comments.

I know I’m really late with this reply, but do you know about middle Wittgenstein’s mathematics (https://plato.stanford.edu/entries/wittgenstein-mathematics/)? Based on Andrej’s description, it sounds very similar to your ideas. In short, Wittgenstein sees mathematics as a machine we can ask questions of (in the form of propositions), and whenever we prove something, we add it to the machine. To me, it seems a trick definition of “proposition”, but nonetheless, it is interesting to read about.

To all: I updated the blog post with “Supplemental II” to steer the discussion into more useful (mathematical) waters. I am not interested in “fast & loose is useful” and “in practice it does not matter”; I know that already. I am interested in (1) making sure people are not lied to, mathematically speaking, and (2) finding a good solution (which I think already exists and is called denotational semantics).

This is so good. Thank you.

There are a couple more superstitions in Hask (I believe):

- existence of a free monad for any functor
- every monad being strong (in the Cartesian sense)

Something should be done with all this alchemy.

Or there being monads at all in Haskell. Haskell added nothing that was not already present in Algol 60, apart from bald pretension.

@Andrej: “It is not that hard to learn a bit of domain theory.”

I’ve bought Stoy’s book on denotational semantics, and, while it’s certainly enlightening, it’s not an easy read (for a non-mathematician like me). It takes Stoy some 30 pages (!) to set up the necessary order/lattice theory, and explain the intuitions that guide his definitions, until he can finally provide a definition of “domain”.

Ah yes, a book on domain theory. I always thought that Stoltenberg-Hansen’s book “Mathematical Theory of Domains” was quite good. And Gordon Plotkin’s Pisa notes are a classic.

With my applied-category-theorist’s hat on, I am motivated to ask: if Hask isn’t a category, then what is it? Category theory isn’t just about categories. If identities are the problem, is Hask a semicategory? If there is more than one way to say what we mean by two morphisms being equal, is it some kind of enriched category? Can we describe axiomatically what a “category with a seq operator” looks like?

PS. Andrej, as far as I can see there is no threading of comments on your blog, so without an @address it is very hard for me to guess who you are responding to in each comment. Is there some way to turn threading on?

Haskell has no definition; it is not anything at all. The discussion does not hinge on some tiny technicality.

Hask is not Set.

The Haskell community isn’t a unified whole. Some of us would like to have better control over `seq`, to permit more accurate categorical reasoning, but that got away from us long before I joined the community, and I don’t see us ever managing to put that genie back in the bottle.

Ignoring fast and loose, since you seem willing to take `seq` off the table, I very strongly agree with you that whatever ‘Hask’ is, it is better thought of in $\omega$-CPO terms than as Set!

Hask very obviously isn’t Set — it contains messy, difficult things like hyperfunctions, it has no pushouts, we can’t play axiom of choice games, etc.

We can go further when talking about the sins of our community. Parametricity isn’t naturality. People tend to conflate `(forall a. f a -> g a)` with a natural transformation from f to g, when it is something much stronger. A natural transformation is a set of morphisms that satisfy a diagram. This is one function that can be given all those types. If you replace naturality with parametricity and try to build up the rest of category theory, then limits are too weak and you can’t build products from them.

http://dl.acm.org/citation.cfm?id=2599154
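To illustrate the distinction being drawn (a sketch; `headMay` is just an example name), a polymorphic term such as the one below is *one* function inhabiting all the component types at once, while a natural transformation would be a whole family of morphisms indexed by objects. Parametricity’s free theorem does imply the naturality square `fmap h . headMay = headMay . fmap h`, which we can spot-check:

```haskell
{-# LANGUAGE RankNTypes #-}

-- One polymorphic function standing in for a whole family of morphisms.
headMay :: forall a. [a] -> Maybe a
headMay []      = Nothing
headMay (x : _) = Just x

main :: IO ()
main = do
  let h  = (* 2) :: Int -> Int
      xs = [1, 2, 3] :: [Int]
  -- The naturality square holds at this instance, but parametricity
  -- promises strictly more than naturality does.
  print (fmap h (headMay xs) == headMay (fmap h xs))  -- True
```

The point of the comment above is that the converse direction fails: being a family of morphisms satisfying the naturality diagram is weaker than being a single parametric term.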

Even so, you can do a non-trivial amount of Hask-enriched category theory, so long as everything is done in a bicategory style, with explicit associators to handle newtypes and the inability to get things ‘on the nose’.

When doing category theory in Hask, I’m always cognizant of the shakiness of the foundations, the tediousness of the bicategory-like encoding, and the limitations of the model.

That doesn’t mean I’m in a hurry to throw it all out and start over from a blank sheet of paper when large portions of category theory do apply, and when you can read almost anything from the last 70 years, simply transcribe it and get meaningful results.

As a mathematician I’m more interested in exploiting the structure that Hask does have, and using category theory as a useful tool for talking about structure from outside of that model that occasionally has pale reflections inside of it that I can turn into code.

I think we’re in agreement then.

Can I please ask everyone to refrain from politics and real life (because it’s my blog). I’d like to steer the discussion towards getting an actual syntactic category out of Haskell (without `seq` if need be), just to see if it has any use that $\omega$CPO does not. Alan Jeffrey’s and John Launchbury’s papers seem handy. Launchbury proves adequacy, so that should give a good relationship between the alleged syntactic category and the $\omega$CPOs.

But then your “official semantics” would disagree with how Haskellers actually think about their code in practice. No worse sin can a programming language’s semantics commit. I don’t think anyone would realistically want to write programs thinking of all types as $\omega$-CPOs or whatever. That’s fine for types inhabited by computations, like functions or objects. But the beauty of algebraic data types is that their values are precisely those given by the cases, without the pesky addendum “or bottom”.

When I program in Haskell, what I actually do in practice is treat bottom as undefined behavior in the C sense. Since Haskell implementors are more reasonable people than C implementors, I trust them not to do things like eat all my data when the meaning of my program is undefined. Just like I trust Java implementors not to eat all my data when I use a null reference. But that doesn’t detract from the fact that, for my actual purposes, the meaning of the program remains undefined.

“undefined” in Haskell is not at all like undefined behavior in C. The meaning of a program that uses bottom is not undefined—its meaning is precisely defined by Haskell’s denotational semantics (assuming you are using just the fragment of Haskell which actually does have a denotational semantics).

In Haskell, by the way, the so-called “algebraic data types” do not behave in the way you suggest. Some explain this by saying that Haskell doesn’t actually have algebraic data types; others say that it does have them, but then just conveniently redefine the term. To me, it’s six of one, half a dozen of the other.

My opinion—if you’re going to code in a partial (and lazy!) language, you had better begin thinking about cpos! (Alternatively, someone could write down an operational semantics for Haskell, but to my knowledge, this has not actually been done systematically.)

Regarding whether the “official semantics” correspond to how Haskell developers think about their code in practice, investigating this may lead to a number of common errors which Haskell developers are routinely committing…

Or I could switch to Standard ML, which I’ve already done. General recursion makes computations somewhat ill-behaved. As far as I can tell, the best thing one can do is “cordon off” this misbehavior so that it doesn’t infect values, which is exactly what call-by-value languages do. :-)

I rest my case. We have here (presumably) a real-world Haskeller who is mistaken about `undefined` and the recursive types in Haskell. I will blame this on the pseudo-math that forms a cloud of confusion around Haskell. One can get only so far by learning a mathy language such as Haskell by osmosis, and after that there’s some math awaiting. (And that’s a good thing. Engineers in other areas also need to learn some Real Math.)

Eduardo: Cheers for switching to SML! I did this years ago as well, once I got completely fed up with the Haskell language and the pseudo-mathematical shroud which its followers have cloaked round it. Life is much more fun on that side of things…

(But I think you are mistaken about how bottoms work in Haskell, and how Haskell’s semantics would work assuming it had any.)

I don’t see what the harm is in assuming less behavior is defined than it actually is. A semantics that tells me “bottom is a value” is only telling me that Haskell can’t express basic things like the set of all integers, or the set of all lists whose element type is some other given set. But I *really* need those things in my day-to-day programming!

I think of treating defined behavior as undefined, as not too different from removing axioms from a theory: I don’t risk proving any “wrong” theorems (not provable in the old theory) if I do so. It’s a-whole-nother matter if I start looking at models (possible implementations), of course…

@Mike Shulman: I turned on threading but perhaps it won’t kick in on an existing post. I can’t see threading. I hope this discussion will die out soon, anyhow.

Regarding the excellent question “what is Hask if not a category?”, I think the supposed equations for a category are actually inequalities in some partial order (induced by a notion of “observation”). For instance, it seems like $f \leq f \circ \mathrm{id}$, meaning roughly that “$f \circ \mathrm{id}$ is at least as defined as $f$” (I hope I got the direction right).

`(.)` can be patched into correctness:

```haskell
(#) :: (b -> c) -> (a -> b) -> a -> c
(#) f g = f `seq` g `seq` \x -> f (g x)
```

but, sadly, GHC doesn’t do well at optimizing it:

https://ghc.haskell.org/trac/ghc/ticket/7542

In `lens`, I’m often just putting on or taking off a newtype, so I use special cases for composing in a coercible argument to avoid the spurious costs of eta-expansion wrappers without triggering bad behavior.

As an example to show that the concerns you raise here aren’t entirely academic, we’ve even seen accumulation of asymptotically significant numbers of eta-expansion wrappers in the past:

https://ghc.haskell.org/trac/ghc/ticket/7436

Q1: Can you explain why you want a syntactic category for Haskell, or why you think Hask is one? Hask is used as a layman’s version of proper CPO semantics, not as a syntactic category. And Haskellers aren’t taught to reason with operational semantics.

I know the Wiki says “Haskell function”, but it does not even distinguish syntax from semantics, and it talks about “unique function” in a way that makes no sense unless it’s using denotational semantics.

While the notation looks sloppy for a PLer, it is comparable to writing 1 + 1 = 2 instead of [[1 + 1]] = [[2]].

Q2: the bigger issue, IMHO, is when people treat the category of CPOs as bicartesian closed, and publish papers doing that without any caveat that I can find—though I still hope I missed some. I even mentioned one example in a question: http://cstheory.stackexchange.com/q/33812/989

I have been told more than once, and the Haskell wiki says this explicitly, that “the objects of Hask are Haskell types”. It does not say that the objects are $\omega$CPOs, and the various redditors and other Haskellers certainly speak of it in terms of syntax. So, by my reading, this means it is syntax. I already stated explicitly that I am all in favor of $\omega$CPOs. If that’s what people think they are using, then they should call the category “$\omega$CPO” and not “Hask”. But I very much doubt that the average Haskell programmer knows anything about $\omega$CPOs.

You ask why I want a syntactic category. I do not really care about Haskell, but syntactic models are generally useful because they capture operational aspects in a setting which is directly comparable to the denotational semantics. That is, if I have a syntactic category $S$ and I want to consider denotational semantics in category $C$ I may hope that $S$ is the universal model in $C$. This has beautiful consequences, and it’s just one possible use of syntactic categories.

Hask really is not a category, it is something else. In practice people use it to get ideas from category theory and implement them in Haskell in cool ways. Edward Kmett explained to me once (at about three times the speed I could follow) how lenses are just a bunch of category theory. That’s the sort of thing that this broken Hask non-category is good for.

I had missed this comment, sorry for the messages on Twitter.

The Wiki appears to talk about syntax *because* Haskellers aren’t taught to distinguish it from semantics. They write `fmap id = id` exactly like people write `1 + 2 = 3`, and do the same when they use equational reasoning, which uses denotational equality and ignores syntactic equality altogether.

Indeed, there’s no quotienting of programs in sight on the Wiki, yet they talk about unicity of functions. And Haskell blog posts do use bottom and equational reasoning, and collapse syntax and semantics.

Of course I agree about the sloppiness, and probably Haskellers should understand the distinction between syntax and semantics. And probably understand they’re not doing real proofs.* But that is a different point. And in fact, I suspect ignoring the distinction between syntax and semantics is not so bad for the math Haskellers do (unlike the other, well-known problems with both Hask and $\omega$CPO); they are not themselves performing structural induction on syntax or derivations.

On the situation, I find Sec. 3.4 significant (and worrisome):

http://research.microsoft.com/en-us/um/people/simonpj/papers/history-of-haskell/

> Perhaps more importantly, the dynamic semantics of Haskell is captured very elegantly for the average programmer through “equational reasoning”—much simpler to apply than a formal denotational or operational semantics, thanks to Haskell’s purity. […] Equational reasoning in Haskell is part of the culture, and part of the training that every good Haskell programmer receives. As a result, there may be more proofs of correctness properties and program transformations in Haskell than any other language, despite its lack of a formally specified semantics! Such proofs usually ignore the fact that some of the basic steps used—such as η-reduction in Haskell—would not actually preserve a fully formal semantics even if there was one, yet amazingly enough, (under the right conditions) the conclusions drawn are valid even so (Danielsson et al., 2006)!

There seems to be a supposition here that a denotational semantics, which is not extant but never mind, constitutes a language definition. It does not. One needs to show that the semantics is adequate, for which one must have an operational semantics, also not in evidence. Furthermore, equational reasoning is in no way sufficient for understanding programs. The complexity matters as much as putative extensional correctness, and this is inherently an operational matter. It is of course famous that Haskell programs have absurd space usage, for example, and very difficult to predict performance characteristics. This is not a feature, but a bug.
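To make the operational point concrete, here is a standard example (my illustration, not from the comment) of two extensionally equal programs with very different space behaviour:

```haskell
import Data.List (foldl')

-- foldl and foldl' agree on every total input, so equational
-- reasoning cannot distinguish them; operationally, foldl builds
-- a linear chain of thunks while foldl' forces the accumulator.
main :: IO ()
main = do
  print (foldl  (+) 0 [1 .. 100000 :: Int])  -- O(n) space in thunks
  print (foldl' (+) 0 [1 .. 100000 :: Int])  -- O(1) space
-- both print 5000050000
```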

Hi, I wrote most of that wiki page, mostly to make the point that Hask is not a Cartesian closed category. This is awkward, because people often have a CCC in mind, or at least the idea of a CCC in mind, when they write Haskell. They form intuitions about how their programs will behave based on the theory of CCCs. These intuitions sometimes turn out correct and sometimes do not, because their mental model of Hask as a CCC is not correct. Eventually they end up with an intuitive mental sense of Hask as “approximately” a CCC and work with that. Note that switching languages from Haskell to ML is no solution to this problem.

For most Haskell programmers, whether Hask is a mere category or not seems rather less important: it doesn’t matter so much if one can come up with a category Hask, since by and large it won’t be any category that Haskell programmers are thinking of with their “Functor” and “Monad” classes, if they are even thinking of a specific category at all.

Possibly there really is some CCC that programmers can sometimes usefully think with, together with a mapping from morphisms to pieces of text that the Haskell compiler will accept; I don’t know. That at least seems to me a more promising avenue for making the bothersome hand-waving go away.

Well, first of all, thanks for writing the page! If I could suggest an improvement, it would be that you should be clear as to whether Hask is supposed to be syntactic (made of types) or semantic (made of domains). You probably do not want to get into explanations of what “domains” and “$\omega$CPOs” are, so perhaps you can just be a bit vague, but warn the reader that there are technicalities which need to be resolved for a water-tight definition.

What I dislike is that on the wiki page it says explicitly “Hask is a category” but then in the comments here I read “for most Haskell programmers, whether Hask is a mere category or not seems rather less important: it doesn’t matter so much if one can come up with a category Hask”. Well then, since you do not have a category and it does not matter whether you have a category, say out loud that you are *pretending* to have one.

I invite more knowledgeable folks (including yourself, indeed) to further edit the page, since this is beyond my expertise. It’s still not clear to me whether there is or is not a sensible category Hask somewhere in the ballpark. I was vaguely under the impression that one had been defined somewhere.

A definition for `x = y` is easy. It just means that for every `g :: T -> ()`, `g x` and `g y` either both yield `()` or both diverge. I leave it as an exercise to the reader to show that this is an equivalence relation.

Also, technically only GHC’s implementation of `seq` breaks Hask. `seq (undefined . id) 7` should be `undefined`, but GHC can’t prove `undefined . id = undefined`.
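A concrete context of the kind the proposed definition quantifies over, showing why `seq` under GHC is the troublemaker (a sketch; `g` is my name for the distinguishing context):

```haskell
import Control.Exception (SomeException, evaluate, try)

-- A context g :: (Int -> Int) -> () built from seq. Under GHC,
-- (.) yields a lambda, which is already in weak head normal form,
-- so g (undefined . id) returns () while g undefined diverges:
-- the context distinguishes two supposedly equal morphisms.
g :: (Int -> Int) -> ()
g f = f `seq` ()

main :: IO ()
main = do
  print (g (undefined . id))          -- prints ()
  r <- try (evaluate (g undefined)) :: IO (Either SomeException ())
  putStrLn (either (const "diverged") (const "yielded ()") r)
  -- prints "diverged"
```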

Your definition is incomplete. When you say “`g x` and `g y` either both yield `()` or both diverge” you have to define what it means to “yield `()`” and what it means to “diverge”. That is, you need to provide an operational account of the language.

As for “only `seq` breaks Hask”: sure, if you can give an operational account of Haskell without `seq` that makes things work, that will already be something (btw, see the references in “Supplemental III” to get an idea of what it takes to provide an operational semantics of Haskell).