I had given myself a project of writing my interview phone screen coding question in as many languages as I could, more as a lark, but also in 'prep' for a candidate being 'funny' when I say 'use whatever language you are comfortable in'.
I did APL, which was fun and easy. However, in researching the language I came across some IBM accounting source code written in APL. It really clued me into how the language got such a following in HR/Finance applications. You got an array of everyone's hours per day. Weekly total hours, pay, tax, Medicare, etc., were all broken out in little calculative statements based on the array of hours. So rather than long strings of APL characters, there were just massive amounts of English and these little digraphs or trigraphs of APL for the calculative values.
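To give a rough idea of the shape (a hypothetical Dyalog-style sketch, not the actual IBM code; all names and numbers are made up):

hours ← 3 7⍴8 8 8 8 8 0 0  9 7 8 8 8 0 0  8 8 8 8 4 0 0   ⍝ hypothetical: 3 employees × 7 days
rate ← 20 22 25                                            ⍝ hypothetical hourly rates per employee
weeklyHours ← +/hours         ⍝ total hours per employee → 40 40 36
grossPay ← weeklyHours×rate   ⍝ → 800 880 900
withholding ← 0.2×grossPay    ⍝ e.g. a flat 20% rate → 160 176 180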
> I had given myself a project of writing my interview phone screen coding question in as many languages as I could, more as a lark, but also in 'prep' for a candidate being 'funny' when I say 'use whatever language you are comfortable in'.
I would honestly love a candidate to use some esoteric/legacy/wacky/niche language to spice it up.
If they can walk me through it that would be a Strong Hire in my book (assuming it actually works when I test it later).
> It really clued me into how the language got such a following in HR/Finance applications. You got an array of everyone's hours per day. Weekly total hours, pay, tax, Medicare, etc., were all broken out in little calculative statements based on the array of hours.
Yeah I guess APL is kind of 'spreadsheets as a Real Programming Language', isn't it?
I can still remember attending an IBM planning meeting in 1984 when APL was being touted as 'the future'. Cue lots of eye-rolling from the sales teams, who were not entirely convinced.
Many years later I came across it again in a financial modelling application, where it worked well. The only problem was that there were (I think) only three people in the company who understood it, so it didn't last long; the risks/costs were just too great. Which I guess would be a typical scenario for many APL applications.
Author here. I wrote this because I thought APL deserved an up-to-date introductory text. It's basically the notes I kept when learning it myself. Stoked to see it on HN.
To me it looks like one of the steepest learning curves I know: uncommon syntax, special characters, custom keyboard layout required, different dialects, ...
And in the end, I wonder if it is more readable than regex, which is sometimes considered 'write-only' because reading it can be so time-consuming. I know you wrote about the readability topic, but I don't mean 'can I read it', but rather 'how much effort does it take to read it'.
As someone who spent a few weekends learning APL: no, the learning curve is not that steep.
The syntax is very simple, there's nothing scary about special characters if you can get over your fear of them, custom keyboard layout is absolutely not required (I've never used a custom layout for APL or Japanese or Cyrillic or Greek or any other non-latin characters that I type on a keyboard). Dialects are a thing but Dyalog is such an obvious choice if you want to learn APL that I wouldn't even worry about the rest. (K is worth looking into for its own sake but I don't consider it a dialect of APL; J might be a dialect worth exploring but so far it didn't impress me much and I found it much harder to read than APL).
One more thing: I find that reading APL is easier than writing APL.
I'd say there are two learning curves involved. The first is knowing what every symbol does. This sounds like it will be quite hard, but honestly probably takes less than a day. APL really doesn't have a lot of stuff to remember. It just looks a bit weird.
The second learning curve however is the much harder part, learning how to write APL 'idiomatically'. This is mostly true in any language though, especially one in a different paradigm.
arendtio says>"To me it looks like one of the steepest learning curves I know: uncommon syntax, special characters, custom keyboard layout required, different dialects, ..."<
So, while a considerably smaller language, it's like learning Chinese (for someone who uses a Western alphabet)?
Thanks for the writeup. I'm still not a fan of APL due to its non-ASCII characters, but I like the link to https://beyondloom.com/blog/denial.html that you provided.
I note that on my iOS iPad some of the symbols render as empty boxes, e.g. in the Fizzbuzz example the left arrows point to an empty box. I have no idea if that is in fact the correct symbol—it could well be! (Also, I have no idea if this could be corrected.)
Thanks for this. I don't know when I will make the time to learn new languages, but I think I've decided that when I do, one of the array programming languages is where I'll focus. This page seems like as good a place to start as any.
The xkcd "light saber" thing made lisp seem appealing to me a few years ago, but after a brief intro to the array languages, now I think they're much more deserving of that comparison.
Don't flame me, but I think APL is the perfect language to introduce coding to kids. I've had great success with it personally. The symbols are surprisingly intuitive to them, and APL's strengths match the domain of what kids typically do with computers at first.
I don’t think you’re wrong at all. I was not a kid at the time, but APL was the first language I learned. There is one drawback: it kind of spoils you. If you continue to do any programming you’ll probably have to use other (inferior) languages, and you’ll always miss APL.
I think in some way anyone's first language will spoil them for whatever that language is best suited for. My "first" language was Z80 assembly, because I wanted to make games for my calculator (I ended up writing just one[0], also "first" because technically my first was TI-BASIC of course). I've missed feeling like I was speaking directly "to the metal" ever since. Don't get me wrong, I'm super-happy with the high-level languages I've used since, but whenever I have to wrangle efficient code out of JavaScript there's some (probably misplaced) nostalgia for when I didn't have to worry about whether or not the JIT will kick in.
Hm, I'll have to disagree. My first language, decades before I became a programmer, was Commodore BASIC. When I started learning programming in earnest, my first languages were Java, C and Python. The first language that spoiled me was ReasonML, compiling and reloading a JavaScript frontend in milliseconds. The first real job using TypeScript was like watching paint dry. Around 30 seconds of waiting for the compiler, linter and the rest of the tooling on each push? My colleagues disabled the linter to get work done.
APL was originally created by Iverson as a notation used to teach mathematics and only became a programming language able to be executed by a computer around the time that (or shortly after) he published his book _A Programming Language_ [0] in 1962.
Even throwing away the “executable by a computer” part of APL and only considering it as a notation, APL can be powerful. Iverson gave a lecture, which was published as _Notation as a Tool of Thought_ [1] that contains a good discussion of what exactly makes a given notation “good”.
I'm ironically going to be that pedantic guy and say "well ackchyhually, the point is that it's Turing complete so it lacks no expressive power, you're thinking of verbosity".
APL, Forth and Prolog are routinely rediscovered over and over again. They are like a cat standing in the path of a door: refusing the mainstream and refusing obscurity.
Maybe lisp was in that club too but clojure made it mainstream again, by my horribly inaccurate, unscientific and probably wrong reckoning.
That's something I like about array languages too, and Lisp is similar from what I know of it. Essentially defining a DSL for the problem and then working with it.
It happens like once a year, it's not that sudden. That said, Learning APL looks like an outstanding resource when other introductions to Dyalog are outdated, incomplete, or just low-quality, so I think it deserves to be highlighted even if everyone's otherwise tired of hearing about APL.
I've been reading Hacker News for around two years and it's one of the recurring topics, I think. It comes in waves so we may be at a peak of sorts, but I think it'll always come and go.
Here's what I don't get about APL. Why is it so important that everything be a single symbol? Why not give the symbols more descriptive names and let people use those names, instead of being forced to work with only the symbols? Heck, it wouldn't be much work to make a text editor that lets you switch back and forth.
APL seems to be designed for people who place an extremely, extremely high weight on code golf-level terseness. IMHO, terseness isn't the be-all, end-all of comprehensibility. I'd rather just use the words "max" and "min" than learn redundant special symbols for the same that save me a couple characters.
Having been interested in APL, but never taking the plunge and learning it, I've seen this article "Notation as a tool of thought" (this is linked in the submitted article as well) thrown around: https://www.jsoftware.com/papers/tot.htm
The paper kicks off with a quote by George Boole: "That language is an instrument of human reason, and not merely a medium for the expression of thought, is a truth generally admitted."
Having a terse language can lend itself to thinking about problems and playing around with them in your head more effectively. I know when I'm thinking about an algorithm to be implemented in something like Java there isn't room for all the boilerplate in my head. Stuff like "public static void main..." gets fuzzed out when thinking about a solution, to be replaced by the idea of "place where main method is".
I like the idea of boiling down useful abstractions to the simplest symbols we can manipulate mentally, so as to fit as many high-level abstractions in our mind at the same time as possible. This should make it possible to play more with constructing solutions, never feeling particularly locked down because you have to manipulate a lot of syntax and curly braces to implement or think about an idea; you only have to switch around a few symbols for a change in semantic meaning.
Having never learned something like APL myself, I can't speak personally about the effectiveness of this idea.
https://code.kx.com/q/ uses words for a lot of things.
Instead of the APL ⌈/ you write 'max'. Instead of the APL ⌈\ you write 'maxs'. And so on.
Personally, I much prefer the APL for a few reasons.
If I tell you that +/ is sum, you can reasonably guess what product is. (×/).
However, in q, if I tell you 'sum' is sum, can you guess what product is? Well, maybe. But it's prd. Would you have guessed that first try?
This seems like it might not be that important, and obviously to the q developers it isn't. But it is a consideration that logically leads to terse symbols.
If you had to write reduce(plus, arr) every time, of course you'd rather have a word for 'sum'; it's only because of the symbols that you can decompose larger problems into the smaller primitives without ending up with more verbose code than other languages.
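To make that concrete, a quick illustrative Dyalog sketch (expected results shown in the comments):

+/ 1 2 3 4        ⍝ sum reduce → 10
×/ 1 2 3 4        ⍝ product reduce → 24
⌈/ 3 1 4 1 5 9    ⍝ max reduce → 9
⌊/ 3 1 4 1 5 9    ⍝ min reduce → 1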
With modern day IDEs, auto suggestions would likely lead most developers to easily discovering the exact syntax for "product".
Additionally, words that describe the inherent function make reading the source A THOUSAND TIMES easier.
If a new developer sees the word max/min/sum, they can almost assuredly guess the meaning versus a random pictograph. And even after learning that ‰ means reduce (for example), it's just one more thing that I have to remember.
> With modern day IDEs, auto suggestions would likely lead most developers to easily discovering the exact syntax for "product".
Sure. I don't doubt this. However, if your language is good enough, it's unnecessary. I don't want to rely on an IDE for something as straightforward as taking the product of a list of numbers.
> words that describe the inherent function make reading the source A THOUSAND TIMES easier
If your functionality is complicated, this is probably true! No one's saying not to use descriptive function names in APL. You absolutely can (and probably should) for things used multiple times more complicated than a couple of symbols. However, in general I don't think this is true. Have you tried?
With regards to guessing the meaning, you do have a good point. If you show someone some APL, and they've never seen it before, it will most likely be completely unreadable, unless they've defined some sort of custom DSL. It doesn't mean they can't pick it up though.
Comparing q and APL again, if I see the word 'raze' in q, without knowing what that is already (looking at docs etc) it's basically impossible to guess. In APL, that would be ,/ instantly readable as 'reduce by concatenation'.
> it's just one more thing that I have to remember
Yes, you'll have to remember the symbol for reduce. But you won't have to remember the word for sum, product, any, all, raze, max, min...
One thing to note is that, while it's easy to remember the word for sum/product/any/all/raze/max/min, you have to know that there's a word for it in the first place.
In contrast, in APL, once you know about reduce, there's just one way any of those could be written, and it's obvious. And after you learn about scan (cumulative reduction), you also know the cumulative version for all of those!
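For example (a small Dyalog sketch, expected results in the comments):

+\ 1 2 3 4      ⍝ cumulative sum → 1 3 6 10
⌈\ 3 1 4 1 5    ⍝ cumulative max → 3 3 4 4 5
∧\ 1 1 0 1      ⍝ cumulative 'all' → 1 1 0 0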
In APL you have ~80 of such "pictographs" (and many you'd already know from regular maths). Sure, it's something you have to remember, but not much, really. Words can definitely help new English-speaking devs understand code, but, after getting used to the symbols, they just provide so much flexibility.
I like the idea that APL could provide this advantage but I just haven't seen any compelling examples. And that's my experience with a lot of APL stuff - beautiful theory that just doesn't quite seem to deliver on the promises.
It feels like the Zen of APL is "symbols are a freaking great idea, let's do more of those!" when intellectual communities that are heavy on symbolic manipulation are generally quite conservative in the introduction of new symbols... let alone the dozens that APL introduces.
To each his own and all, but APL might have to content itself with a small group of devoted fans.
I absolutely agree with you about the introduction of new symbols. Most glyphs have been there for decades now, and there is a high barrier to introduce new ones. There are Dyalog developers who experiment with new primitives that use new glyphs, and it takes years until they are actually introduced into the language, if ever.
Very clever people have been working in array languages for more than 60 years, and there is a set of around 20-30 primitives that all of them agree are worth having. Depending on the array language of choice, there are another 10-40 primitives. Symbols have double use (monadic and dyadic), so that makes a total of 20-40 symbols to learn, and you already know many of those (+-×÷=<>, and probably a few more).
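As a quick illustration of that double use (Dyalog, expected results in the comments):

÷4      ⍝ monadic: reciprocal → 0.25
8÷4     ⍝ dyadic: division → 2
⌈3.2    ⍝ monadic: ceiling → 4
3⌈5     ⍝ dyadic: maximum → 5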
To each his own indeed. But if you haven't, I would suggest you give it a try. It makes much more sense once you start writing code.
But it is the switch from infix operators to prefix operators that makes the second example less readable (and the implicit precedence of the operators). Writing it as
(a times (b plus c)) equals ((a times b) plus (a times c))
would be a fair comparison, which is much more readable than the fully prefix version.
If you're used to it, I'd guess you can read [EDIT: was five; thought more] two to four symbols in the time you can read a short name. And you can use the higher-level pattern-recognition skills that would otherwise be applied to reading names to understand combinations of symbols. At least for me it's much easier to visually manipulate symbols in my head.
You've written as though using symbols instead of names comes at an extraordinary cost. Having used both I don't think this is the case (and it's the conventional languages that force you to use words; you're free to define reverse ← ⌽ in APL but usually not the other way around). If you want to read more about this perspective, you could check out my page "What is a primitive?" discussed on HN yesterday: https://news.ycombinator.com/item?id=28060423
I doubt that it can be read much faster. Think about newspaper: you don't read the symbols that make up words - you literally read the words themselves. Taht's why it is pettry esay to fllw txet wtih miexd up caharcetrs. As long as first and last latter stay the same.
One advantage is that long expressions might fit in one line, but that's about it.
I mainly use Scala, which allows you to use symbols like ← ⌽ as well, and quite a few libraries do that - but I found that unless it's a very commonly used symbol, it's not helpful for productivity.
These are the "higher-level pattern-recognition skills" that I was talking about. You use them to read words; I can also use them for something like (in BQN) -⟜» (subtract previous from current) or +`⊸⊔ (split using start markers). There's an opportunity cost.
To be clear, you do think there's an advantage to using symbols for very common operations? APL doesn't have all that many primitives (~60 symbols), and most would fit this requirement. Here's Alan Perlis[0]: "The large number of primitive functions, at first mind-numbing in their capabilities, quickly turn out to be easily mastered, soon almost all are used naturally in every program — the primitive functions form a harmonious and useful set."
If APL is so ideally productive and the barriers to its adoption so substantially slight, why has it not won?
The programming community has been extremely open-minded to revolutionary new approaches. So very many have gained at least a foothold and a small group of devotees, from LISP to Haskell to Forth to APL. But none of these languages have built communities of a size and productivity to match Python or C or the many other highly successful Algol-family languages. This despite the fact that some of these off-the-beaten-path languages, especially LISP, have a long history of being used as teaching languages.
So why hasn't APL taken off already? Surely there must be a more thoughtful answer than "people are Philistines". Perhaps both APLers and LISPers greatly underestimate the challenges of adopting new symbolic notations that are very different from what they're used to in written language and mathematics. Maybe for the average technical person, the cognitive costs are much greater and the benefits much less than they seem to the average APL or LISP fan.
If you mean this to be a direct reply to me then you're putting quite a lot of words in my mouth (leephillips is much closer to the position you're arguing against). I don't think the APL language or environment is ideally productive, and I think there are large barriers to adoption and probably wouldn't choose an array language for a commercial project. I made the narrow point that symbols have significant advantages over words—and if you go to the link I posted you'll find that I also argued that symbols have significant disadvantages as well and shouldn't be used for everything.
If I was to consider anything ideally productive it would be my own array language, BQN. Obviously if APL was so good there would be no need to make such a language. Nonetheless I think BQN is far from a perfect array language[0], and that the array paradigm is not a good fit for everything. The feature I miss most is algebraic datatypes or even tagged unions, so there are many contexts where I'd choose a statically-typed language. BQN also (intentionally) makes dictionaries and mutable data harder to work with than other languages, which isn't always a good thing.
It's hard to even figure out where to begin with barriers to APL's adoption. Proprietary implementations, storing source code in binary format, lack of publicly accessible documentation? Do note that APL grew rapidly in the late 1970s and by the 80s was close to mainstream. Personally I don't think it has what it takes to become mainstream in the future, but I'm glad that Julia is doing well. There are a few barriers that have largely gone away, which include the total embarrassment of only beginning to switch from Goto to control structures in 1996[1], text encoding problems (there are still some font issues), and difficulty in trying or obtaining Dyalog APL (GNU is not a good substitute: it, uh, doesn't have control structures). In case it's not clear I still think the APL situation is quite a mess, which is why I'm working on BQN.
As you've noted, APL is a niche language. Why the need to smack it down? If you're not interested there's no need for you to try it. I advise you to stay away. If you want me to make you interested, I can't. If you're worried APL will take over and force you to use it… well, it won't.
I think it's naive to assume a language would "win" on its own merit. Most of our software today is what it is due to path dependence.
Was JavaScript a terrific language? No, it was a little doodle for bringing a little interactivity to websites, which were simple at the time. The web took off; thus JavaScript took off. If the web had never taken off, JS wouldn't be a thing.
Same for C. What was so great about C, or its predecessor B? Nothing really. B was a minimal language designed to fit the constraints of a minicomputer. Unix happened, C was a natural evolution of B, Unix took off as a simple & cheap (even free) operating system for cheap computers. C got popular with Unix, and not due to anything revolutionary in its design. C++ rode the OOP hype wave and C's popularity. C#'s popularity is due to Microsoft's massive influence.
Java similarly rode the OOP hype train and I guess Sun's marketing towards their many Enterprise customers (who also bought the "write once run anywhere" lie).
Remember when PHP was popular? Was that because it was such an awesome language, or was it because it seemed easy and integrated well with this other popular thing that took off (the web)?
(On a similar note, it's not clear at all that Linux would have become as popular as it is today if it weren't for the lawsuits looming over the BSDs in the 90s)
Of course I'm oversimplifying and glossing over a lot of history but in the decades between Unix and C#, there have been countless programming languages that could have easily challenged each language that I named on their own merit. But that merit does not matter, because that's not what makes a language take off.
If anything, the programming community seems rather reluctant to consider different approaches, and when they do it, they would rather do it in the comfort of their existing languages and ecosystem of tools & libraries rather than jump to something new entirely. Lisp and Haskell never got mainstream despite their use in teaching, but I think most of the programming community today sees the value in functional thinking and immutability; thus various constructs that have been historically exclusive to functional programming languages have made it into mainstream languages. You couldn't take lambdas and map & reduce for granted in 2000 and the FP aficionados had a fight ahead of them to convince the mainstream that they aren't just trying to be clever and look smart.
And on the topic of teaching, I'll have to point out that LISP has absolutely not been a teaching language across universities around the globe. MIT is not a representative example. Most unis and colleges just stick to what seems to be mainstream (20 years ago: C, C++, Java, ...; today C#, Java, JS..) or just easy to start with (Python) and languages like C and Haskell are now reserved for one-off/in-depth courses about specific topics.
If anything, I find it surprising that LISP has as big of a following as it does today, but I guess its (limited) use in teaching plus writings from certain influential people (including the author of this very site) have had their effect.
APL by contrast has faced challenges; it was considered too easy to be a teaching language. Other challenges include the symbols, but not because they look scary and alien! We might take the ability to type fancy glyphs for granted today but go back a few decades when computers were using 8-bit character sets, often wired to glass terminals or printers or framebuffers with a hardcoded character set.. you'll start to see the problem. You literally could not enter or view the APL symbols on a lot of computers available at the time. Interoperability with other character sets absolutely was a real challenge and probably alone sufficient to push APL mostly into obscurity for a long time. And indeed this issue was important enough that Iverson moved on from APL to create J.
Even this site shows how reluctant people are to accept anything new and alien-looking. Almost all postings about APL feature the usual knee-jerk reactions to the weird-looking syntax, terseness, and symbols. APL and its kin go directly against what's been preached as best practices in programming for so long now that it's not surprising that people are uncomfortable with it. Doubters are not going to judge APL on its merit, because they won't learn it.
I think there's a lot to criticize about APL and its ecosystem (let's be clear that due to its relative obscurity, it has missed a lot of the potential evolution and progress that a vibrant community would bring with it), but I think few here are willing to look beneath the surface. Indeed most developers just need a language with mature tooling and libraries and big community, everything else is secondary.
>I'll have to point out that LISP has absolutely not been a teaching language across universities around the globe
In France Lisp was an essential part of any computer science course at university, and I am pretty sure it is the same today. Same for oCaml.
Actually I do not see how you could seriously teach computer science without studying Lisp.
Schools that do not teach Lisp but more mainstream languages are not universities; their only objective is to make sure their students find a job at the end of their studies, not to teach them the fundamentals of computer science.
I think it's something you just have to learn to believe it. Otherwise you'll doubt it forever and ever. APLers have learned, and they want to keep their symbols.
I can only speak for myself, and I know I'd be confused seeing code that says (enclose bind grade-up index right), but when I see it spelled in APL as (⊂∘⍋⌷⊢), I will immediately recognize it as sort.
It really does feel similar to the difference between "es you es aitch I" and "sushi".
I don't need to spell out every symbol, that just makes it harder to pattern-match and see the meaning.
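For what it's worth, here's that sort train in action (Dyalog, default ⎕IO←1; expected result in the comment):

(⊂∘⍋⌷⊢) 3 1 4 1 5    ⍝ ascending sort → 1 1 3 4 5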
Oh, this interesting little thing I noticed just now: jodrellblank also mentioned "enclose bind grade_up index self_right" hours before me in https://news.ycombinator.com/item?id=28095934
And I had read that comment! It just didn't register at all when I read that comment; even though I read it word by word, I did not realize that he also wrote the sort train (until I read it again now and converted to symbols in my head). Yet I would have immediately recognized it had I seen it written as APL symbols.
Visual pattern matching can be strong. (It's what allows you to recognize and read words that you do not know how to spell correctly.)
The history of APL is helpful to consider. It was initially a notation for describing computer systems and programs, and its use as a programming language came later (though not greatly later).
If you want something APL-like but without the symbol heavy nature, the nearest is going to be languages like Haskell that similarly promote tacit programming but without quite as heavy a reliance on symbolic notations, or descendent languages of APL like q.
Something I didn't think of earlier is function trains, which annoyingly won't make much sense without knowing some APL - they're a pattern which takes more effort to explain than they take to start using. But still, simple APL expressions run right to left so the example below pushes 3 into ⊢ which echoes it unchanged, that goes into + which is a no-op here on a positive integer, and then the 3 goes into ⍳ which makes the first 3 numbers:
⍳+⊢ 3 ⍝ the first 3 numbers
1 2 3
it behaves differently from the same thing in parens shown below, which triggers a function train pattern and slightly breaks the right-to-left execution by making the data able to feed into two functions, not just one. In the below code the 3 is pushed into ⊢ on the right which echoes it unchanged but it also jumps over and is pushed into ⍳ on the left to make the first three numbers 1 2 3, and these two results become (1 2 3 + 3) and the whole thing makes 4 5 6:
(⍳+⊢) 3 ⍝ 3 plus the first 3 numbers
4 5 6
It's not the parens which make the difference, assigning it to a variable to make a named function also does it:
f ← ⍳+⊢
f 3
4 5 6
Point being, there's something semantically meaningful happening but it has no symbol to turn into words. It's not the proximity of the functions being bunched together even, it's roughly an odd number of functions separated from their argument(s) but with some edge cases. The text editor would have to be parsing APL to detect it, and I don't know how it would describe what's happening in useful words. "iota plus self_right" might still let you recognise three things together, but does "enclose bind grade_up index self_right" look like a function train? It's easier to spot the pattern with symbols, and it's easier to have the pattern exist at all in a symbol-based language.
When the Dyalog IDE can show you:
+,-,×,÷
┌─┼───┐
+ , ┌─┼───┐
    - , ┌─┼─┐
        × , ÷
to explain how the train is being interpreted, what does the text version of that say?
It wouldn't take much work to make a text editor that lets you switch this:
if (username == "alice") {
    results[5] = true;
}
into this:
if username valueequals STRINGaliceSTRING begin
    let results index 5 be true stop
endif
because symbols are incomprehensible, words are much clearer, and you don't care about saving "a couple of characters" and you don't like codegolf. Would you use it? If not, why not? How do you know the amount of symbols you use is the perfect amount and not merely the amount you are habituated to?
> "Why not give the symbols more descriptive names and let people use those names
For the same reason almost nobody wants {} to be BEGIN/END; the symbols are so ingrained, so well understood, so automatic and habitual that there's no benefit to trying to turn them back into words. Why not move towards doing the same for other common operations? Once you've used ⌊ as floor and ⌈ as ceiling, Math.Floor() feels like a drag. Once you're familiar with 4↑list to get the first 4 items, list[0..4] is a drag and it has more symbols! list.take(4) has more symbols and is no clearer. How do you stop a forever-expanding proliferation of symbols? I don't know, but APL seems to have done a surprisingly good job of general purpose computing in under ~80 symbols which has hardly grown in 70 years.
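A quick sketch of those in Dyalog (expected results in the comments):

⌊2.7               ⍝ floor → 2
⌈2.3               ⍝ ceiling → 3
4↑10 20 30 40 50   ⍝ take the first 4 items → 10 20 30 40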
> "APL seems to be designed for people who place an extremely, extremely high weight on code golf-level terseness."
Would you be surprised that this is valid Dyalog APL for a function to find the highest value in an array of positive integers?
∇ result←findMax data
  max←0
  :For i :In data
      :If i>max
          max←i
      :EndIf
  :EndFor
  result←max
∇
then
findMax 5 1 2 3 5 6 3 1
6
Dyalog has keywords, classes, namespaces, methods, libraries, and they get laughed at because who wants :EndFor. It's a bit of a PR issue - if you head to APL it's largely because you like golf, because if that's not what you want you may as well use Python/etc. But once you get there, even if APL remains too opaque to use for everything, it becomes annoying to know a short way to express what you want that you are comfortable with, and then have to laboriously boilerplate it out in another language with many lines, and have those lines contain more symbols into the bargain. And we say that one of the hard problems is "naming things"; symbols and tacit programming can help avoid putting a name to variables that only hold some intermediate state you don't actually care about but need for the next couple of lines.
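And for contrast with the loop above, the terse version is just max-reduce, which you can still give a name if you want one (findMax2 is a made-up name):

findMax2 ← ⌈/
findMax2 5 1 2 3 5 6 3 1    ⍝ → 6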
In a lot of these APL discussions I see an obsession with terse implementations of simple standard library-type functionality. Who on Earth has to literally write a loop in the middle of their source code to find the max of a bunch of numbers? I would just call the appropriate standard library function, especially in languages like Python that have excellent general-purpose data structures and standard libraries for them.
Nah, they could easily write a function for max if they really wanted or needed it. Maybe terseness just isn't that tremendously valuable to a lot of coders.
But that function is precisely what they're writing? Add a "return max;" at the end and you have your function. It will still be longer than |/.
> Maybe terseness just isn't that tremendously valuable to a lot of coders.
That's the thing. The big difference between the APL community and the rest is the value they accord to terseness. I'm not saying they are right or wrong, I'm saying that it is what defines them.
Another point that goes with the symbols is that the symbols may not have the baggage that words have. Words can mean a lot of different things.
I wonder how many other programming languages there are that are still mainly proprietary. It seems like most languages these days have their primary implementation as open source.
APL's ASCII-based relation, J, is open source (GPL, I think). BQN (https://mlochbaum.github.io/BQN/) is a newer rationalized entry which is completely FOSS.
K is another array language which has a proprietary implementation. Tcl is a relatively popular proprietary language.
I asked myself the same -- should I start with APL, J or K? I concluded that APL is the "original" from which the others draw inspiration, and so it felt most natural to start there. Note that wherever you decide to start, the experience you gain carries over to the others.