A Letter from Dijkstra on APL (1982)

(jsoftware.com)

36 points | by tosh 6 hours ago

8 comments

  • adregan 2 hours ago
    APL is the only language I've ever dreamt about writing (as in: I could see the characters); I'd dreamt about programming in the past, but those dreams were usually what I would categorize as nightmares—desperately trying to fix a bug I couldn't figure out.

    Due to my affinity for the language, and my wish to have worked in its heyday (would love to have an APL gig someday), I have been exposed to various writings and recordings of Ken Iverson. I've also been exposed to a few of Dijkstra's thoughts on APL.

    I have to say that Iverson generally comes across as a very generous and curious individual while Dijkstra seems to have been a miserable ass. Maybe, given the lens, I've not given Dijkstra a proper chance to demonstrate a more positive attitude, so I'm open to any suggestions of writings where he doesn't seem like such a grump.

    • mlajtos 2 hours ago
      > Maybe, given the lens, I've not given Dijkstra a proper chance to demonstrate a more positive attitude, so I'm open to any suggestions of writings where he doesn't seem like such a grump.

      Kinda hard to find where Dijkstra praised something (except Algol 60).

      One funny example: he called FORTRAN "an infantile disorder", though he said this about the team behind it: "At that time this was a project of great temerity and the people responsible for it deserve our great admiration."

      On LISP: "LISP has jokingly been described as 'the most intelligent way to misuse a computer'. I think that description is a great compliment because it transmits the full flavor of liberation: it has assisted a number of our most gifted fellow humans in thinking previously impossible thoughts."

      Alan Kay on Dijkstra: "Arrogance in computer science is measured in nano-dijkstras."

      • tristramb 55 minutes ago
        "Kinda hard to find where Dijkstra praised something (except Algol 60)."

        Hamilton Richards, who was one of Dijkstra's colleagues at the University of Texas, told me in an email that Dijkstra was impressed by the work of Richard Bird on functional programming.

  • garyrob 1 hour ago
    I wrote a lot of APL for my undergraduate Senior Project in 1978/1979.

    I really enjoyed it because it was fun. You could do an incredible amount of work in a single line of code.

    The only problem was, that line would then be almost impossible to read and understand! It could easily be used as a "write-only" language even without a separate obfuscation step.
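    Since APL glyphs won't render here, a Python sketch of the contrast being described (a hypothetical example, not from the thread): one "write-only" expression that groups the words of a sentence by length, followed by a readable version producing the same result.

```python
from collections import defaultdict

text = "to be or not to be that is the question"

# Dense, one-expression style: everything crammed into a single line.
dense = sorted({len(w): [x for x in text.split() if len(x) == len(w)]
                for w in text.split()}.items())

# Readable style: the same grouping, spelled out step by step.
groups = defaultdict(list)
for word in text.split():
    groups[len(word)].append(word)
readable = sorted(groups.items())

assert dense == readable
```

    Both produce `[(2, [...]), (3, [...]), (4, [...]), (8, [...])]`; the difference is only in how much the reader has to unpack.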

    When I became a professional programmer right after college, I never used it again, and learned to write code that was readable above all else.

    • WillAdams 41 minutes ago
      Is this an instance of the maxim that one has to be twice as smart to debug code as to write it?

      Are you aware of any APL programs written using Literate Programming?

      Apparently there was at least one attempt:

      Lee J. Dickey. Literate programming in APL and APLWEB. APL Quote Quad, 23(4):11–??, June 1, 1993. CODEN APLQD9. ISSN 0163-6006.

      Perhaps that additional layer of documentation would help? (APL is a language I've always been fascinated by, but never had occasion to more than superficially examine.)

  • shrubble 2 hours ago
    I wonder if EWD would have had the same opinion if he were alive today, with every Unicode font having the APL characters immediately available on the screen.

    Did he feel the language design was bad, or would having TTF fonts able to show "rho", "iota", and "grade up" have removed one or more of his objections?

    • mlajtos 1 hour ago
      Oh, he would definitely hate it even more. It was too high-level for his taste.

      What I would like to know is how he would have bent Algol 60 if he had had a tablet with a pencil that could evaluate it in real time.

  • Almondsetat 3 hours ago
    One can appreciate striving for simplicity (a programming language that can be taught and explained with pen and paper), but one must also consider that computers are meta-devices.

    Before computers, we could write things only on paper, either with our hands or a typewriter. So, naturally, when computers came about, the way of thinking about programming was very text-driven, with an emphasis on what a typewriter could represent.

    But then, code could be written directly with computers, opening up more typesetting possibilities thanks to keyboards not being bound anymore by the mechanical limitations of typewriters. You could add keys and combinations to your heart's desire, and they would be natively digital and unlimited.

    Now, with graphics, both 2D and 3D, and a myriad of other HIDs, shouldn't we try to make another cognitive jump?

    • mlochbaum 2 hours ago
      It's very strange to see handwriting lumped in with typewriting, to be described as limited relative to screens! Iverson notation was a 2D format (both in handwriting and typeset publications) making use of superscripts, subscripts, and vertical stacking like mathematics. It was linearized to allow for computer execution, but the designers described this as making the language more general rather than less:

      > The practical objective of linearizing the typography also led to increased uniformity and generality. It led to the present bracketed form of indexing, which removes the rank limitation on arrays imposed by use of superscripts and subscripts.

      (https://www.jsoftware.com/papers/APLDesign.htm)

      I think this is more true than they realized at that time. The paper describes the outer product, which in Iverson notation was written as a function with a superscript ∘ and in APL became ∘. followed by the function. In both cases only primitive functions were allowed, that is, single glyphs. However, APL's notation easily extends to any function used in an outer product, no matter how long. But Iverson notation would have you write it in the lower half of the line, which would quickly start to look bad.
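      A sketch (in Python, as an illustration only) of the generalization described above: APL's outer product ∘.f builds a table by applying a dyadic function between every pair of elements from two arrays, and the linearized notation lets that function be arbitrarily long rather than a single glyph.

```python
def outer(f, xs, ys):
    """APL-style outer product: the table of f(x, y) for every x in xs, y in ys."""
    return [[f(x, y) for y in ys] for x in xs]

# The classic ∘.× example: a multiplication table.
times_table = outer(lambda a, b: a * b, [1, 2, 3], [1, 2, 3])
# → [[1, 2, 3], [2, 4, 6], [3, 6, 9]]

# The same mechanism with a longer, user-defined function — exactly what
# a superscript ∘ in the handwritten notation would struggle to express.
def gcd_pair(a, b):
    while b:
        a, b = b, a % b
    return a

gcd_table = outer(gcd_pair, [4, 6], [2, 3, 8])
# → [[2, 1, 4], [2, 3, 2]]
```

      The point of the linearization is visible here: `outer` doesn't care whether its function argument is a one-symbol primitive or a multi-line definition.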

    • WillAdams 34 minutes ago
      I've long been fascinated by this question, probably spurred on by having read Hermann Hesse's _The Glass Bead Game_ (originally published as _Magister Ludi_) when I was impressionably young.

      The problem of course is: "What does an algorithm look like?"

      Depicting one usually leads into flowchart territory, and, interestingly, efforts at that style of programming often strive for simplicity, e.g., the straight-down preference of Raptor or Drakon; systems that don't enforce such constraints often devolve into a visual metaphor for "spaghetti code":

      https://blueprintsfromhell.tumblr.com/

      https://scriptsofanotherdimension.tumblr.com/

      As a person who uses https://www.blockscad3d.com/editor/ and https://github.com/derkork/openscad-graph-editor a fair bit, and who needs to get Ryven up and running again (or to fix the OpenSCAD layer in his current project, or to try https://www.nodebox.net/ again), this is something I'd really like to see someone succeed at. The most successful exemplar so far is Scratch, which I've never seen described as innovatively expressive. I'd love to see such a tool that could make a traditional graphical application.

    • le-mark 2 hours ago
      All those things can be specified in text. Fortress was a language that had the facility to use mathematical notation. Turned out to be not so compelling iirc.

      https://en.wikipedia.org/wiki/Fortress_(programming_language...

    • VorpalWay 3 hours ago
      We do have syntax highlighting these days. And our editors work like hypertext, where I can go to definitions, find usages, get inheritance hierarchies, etc. Quite a ways from your suggestion, but also a few steps removed from a typewriter.

      I think any such leap would have to be a really big one to catch on though, due to inertia. Colorforth is not exactly popular, and I can't think of any other examples.

      • jmalicki 2 hours ago
        With LLMs you can write your code by hand drawing a diagram on a touch screen.
        • ModernMech 2 hours ago
          This has been possible since Sketchpad in 1963.
          • WillAdams 32 minutes ago
            Yes, but there don't seem to be any current implementations which are more than academic exercises (I'd love to be wrong about that and be pointed to something which I could try).
            • ModernMech 27 minutes ago
              The reason for this is that we've been trying to draw code by hand since 1963 and it doesn't really work out well except in limited domains. Maybe it'll work better with LLMs tho, I guess we'll see.
    • segmondy 3 hours ago
      We already did, it's natural language. Talk to your computer and get code, aka vibe coding.
      • Almondsetat 2 hours ago
        So humans just have a mouth and that's it? Is language the be-all and end-all of how humans can interact with the world and express themselves?
  • Hendrikto 3 hours ago
    Ironically, I think the examples given in the post validate Dijkstra’s points, instead of disproving them, as the author intended.
    • bear8642 3 hours ago
      How so?

      I'm struggling to see how Roger's manipulation of the expressions without executing each line validates Dijkstra's point...

      • empath75 2 hours ago
        "This is easy to understand, see:"

        5 lines of completely inscrutable symbols follow.

        If you are expecting someone to learn a completely new notational language before you can communicate a basic algorithm, you have gone wrong somewhere.

        You could also similarly write down merge sort in pure lambda calculus, which is interesting as an exercise, but not especially useful as working code, or as a way to explain how merge sort works.
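        For contrast, the readable counterpoint: merge sort in a conventional language (a Python sketch), which most programmers can follow without learning any new notation.

```python
def merge_sort(xs):
    """Recursively split the list, sort the halves, and merge them in order."""
    if len(xs) <= 1:
        return xs
    mid = len(xs) // 2
    left, right = merge_sort(xs[:mid]), merge_sort(xs[mid:])

    # Merge the two sorted halves by repeatedly taking the smaller head.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]

# merge_sort([5, 2, 8, 1]) → [1, 2, 5, 8]
```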

  • mlajtos 2 hours ago
    Dijkstra's go-to language (pun intended) was Algol 60 (& Pascal) – everything else was shit in his view. Some of his comments:

    FORTRAN — "an infantile disorder"

    COBOL — "the use of COBOL cripples the mind"

    BASIC — students exposed to it are "mentally mutilated beyond hope of regeneration"

    PL/I — "the fatal disease"

    APL — "a mistake, carried through to perfection"

    He liked his languages and programs to be easily traceable with pen and paper. He always wrote programs on paper (and proved their correctness) and only then typed them into the computer. REPL-driven development (which APL pioneered) was a foreign concept to him. He would have been appalled by LLM code generation.

    • tristramb 25 minutes ago
      He liked to be able to reason about programs without running them. He preferred simpler languages because they contained less irrelevant noise to get in the way of that.
  • wood_spirit 3 hours ago
    The opening paragraphs, about how people enamoured of a shiny gadget will overlook a terrible interface, immediately bring to mind modern-day LLMs.
    • asdfasgasdgasdg 3 hours ago
      I don't find this observation of Dijkstra's to be one of his best. If there is a gadget that does a thing that no other gadget does, what does it even mean for the interface to be "terrible?" How can you even know if the interface is terrible, given that a better one has yet to be invented? Maybe the interface is as good as it can be for the tool in question.

      I also don't love your mapping of this observation onto modern LLMs. The interface of an LLM is natural language text, along with some files written in plain text or markdown. Can it be improved? Undoubtedly! But as a baseline, it doesn't seem half bad to me. If it is so terrible, it should not be hard to propose an interface that will be significantly more productive. Can you?

      • empath75 2 hours ago
        > If there is a gadget that does a thing that no other gadget does, what does it even mean for the interface to be "terrible?" How can you even know if the interface is terrible, given that a better one has yet to be invented? Maybe the interface is as good as it can be for the tool in question.

        That's just a taste judgement; you can decide the interface sucks on a one-of-a-kind item quite easily, and people often do.
