I wonder: is there any reason beyond sheer curiosity* to learn Fortran in 2025? Not being snarky, genuinely curious about what Fortran brings to the table.
The article did not discuss this, but to me, one of the bigger differences between Fortran and more modern languages is the difference between functions and subroutines. Yes, they are not synonyms in Fortran and serve different purposes. I think this would trip up more people initially than the clunky syntax.
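Roughly: a FUNCTION returns a value and is used inside expressions, while a SUBROUTINE returns nothing and must be invoked with CALL, passing results back through its arguments. A minimal sketch in free-form Fortran (the names are made up for illustration):

    module demo_mod
      implicit none
    contains
      ! A function: returns a value and can appear inside expressions.
      pure function square(x) result(y)
        real, intent(in) :: x
        real :: y
        y = x * x
      end function square

      ! A subroutine: no return value; results come back via arguments.
      subroutine square_sub(x, y)
        real, intent(in)  :: x
        real, intent(out) :: y
        y = x * x
      end subroutine square_sub
    end module demo_mod

    program main
      use demo_mod
      implicit none
      real :: r
      print *, 1.0 + square(3.0)   ! fine inside an expression
      call square_sub(3.0, r)      ! needs CALL; result lands in r
      print *, r
    end program main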
It is also a bit funny that the author complains about older Fortran programs requiring SCREAMING_CASE, when, if anything, that is an improvement over both earlier and current practice. Too many Fortran codes have overly terse variable names, often just single characters or impenetrable abbreviations for obscure terms. I have had to create cheat sheets for each program just to figure out what each variable was.
Sun Microsystems had a great quote about this back in the day [1]:
> Consistently separating words by spaces became a general custom about the tenth century A.D., and lasted until about 1957, when FORTRAN 77 abandoned the practice.
What practical difference ever existed, beyond the fact that a subroutine does not return a value? AFAIK variable scope was handled identically. Recursion was likewise identical (forbidden originally).
Yes, sorry for the confusion. To be clear, the quote is directly about spaces not being significant in the source code in general, but I was commenting more about how this mindset affects variable names in practice. At least in my experience, many codes would benefit from variable names that use underscores.
I actually had a fantastic experience with Fortran lately. I ported a compute kernel from Python/NumPy to Fortran 2018, partly because of the GIL and partly so I could use Intel's compiler. The performance improvement was tremendous: several times faster per core, multiplied further because I could take advantage of threading. In all, the three-day project increased actual throughput 450x.
(I considered JAX, but the code in question was not amenable to a compute graph. Another option was to thread by fork, and use IPC.)
I liked the language itself more than expected. You have something like "generics" with tensors. Suppose you pass a parameter, N, and you also would like to pass a tensor, and you would like to specify the tensor's shape (N, N). You can do this; the parameter type constraints can reference other parameters.
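Concretely, the declared shape of a dummy array can be an expression in another dummy argument (an explicit-shape array). A small sketch with made-up names:

    subroutine scale_matrix(n, a, factor)
      implicit none
      integer, intent(in) :: n
      real, intent(inout) :: a(n, n)   ! shape tied to the other argument
      real, intent(in)    :: factor
      a = a * factor                   ! whole-array operation
    end subroutine scale_matrix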
Tensors and the operations on them are first-class, so the compiler can optimise them easily for the system you're building on. In my case, I got an 80% improvement from ifx over gfortran.
Invocation from Python was basically the same as a C library. Both Python and Fortran have facilities for C interop, and Numpy can be asked to lay out tensors in a Fortran compatible way.
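The usual route is ISO_C_BINDING on the Fortran side plus ctypes/cffi (or f2py) on the Python side. A hedged sketch, with an illustrative routine name rather than anything from the actual project:

    subroutine kernel_step(n, a) bind(c, name="kernel_step")
      use iso_c_binding, only: c_int, c_double
      implicit none
      integer(c_int), value :: n               ! passed by value, C-style
      real(c_double), intent(inout) :: a(n, n)
      a = a + transpose(a)                     ! stand-in for the real kernel
    end subroutine kernel_step

On the Python side, allocating with order='F' (or calling np.asfortranarray) keeps the memory column-major, so the array can be handed over without a copy.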
Part of what eased the port was that Numpy seems to be a kind of "Fortran wrapper". The ergonomics of tensor addressing, slicing, and views are identical.
I did something similar many years ago. I was amazed that Fortran was not more often discussed as an option for writing performant code within a Python/NumPy codebase.
At the time everyone seemed to default to using C instead. But Fortran is so much easier! It even has slicing notation for arrays, and the code looked so much like NumPy, as you say.
I've never found anything to back this up, but my impression was that both the Python / Numpy and Fortran 90 slicing operations were directly inspired by MATLAB (although most of the ideas go back to at least Algol 68).
It also helps that Fortran compatibility is a must for pretty much anything that expects to use BLAS.
Yea, Fortran is nice to use for so-called "scientific" computing. It has high performance as well as some handy intrinsic functions like DOT_PRODUCT and TRANSPOSE, but the best features to me are the colon array-slicing syntax, like Python/NumPy's, and arrays being indexed from 1, which makes converting math equations into code more natural without constantly worrying about off-by-one errors.
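For anyone who hasn't seen it, a small sketch of what that looks like:

    program slicing_demo
      implicit none
      integer :: i
      real :: a(4,4), v(4)
      a = reshape([(real(i), i = 1, 16)], [4, 4])
      v = a(2, :)                        ! second row, like a[1, :] in NumPy
      print *, dot_product(v, v)
      print *, a(1:3, 2)                 ! part of column 2; 1-based, inclusive
      print *, transpose(a(1:2, 1:2))
    end program slicing_demo

Note that ranges are 1-based and inclusive at both ends, unlike NumPy's 0-based, half-open slices.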
I wouldn't call multi-dimensional arrays tensors though. That's a bit of a bastardization of the term that seemed to be introduced by ML guys.
It wasn't until I started using Fortran that I realized how similar it is to BASIC which must have been a poor-man's Fortran.
If you're interested in safer code when working with older versions of Fortran, add 'implicit none' at the top. This disables implicit typing (undeclared names starting with I-N default to INTEGER, everything else to REAL), so every variable has to be declared explicitly, which makes variable definition much tighter. Back when I mucked about with Fortran (previous century, really), we tried to do the memory allocation and I/O in C and the compute in Fortran. That played well with both superscalar and vector machines.
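A minimal example of what that buys you:

    program safer
      implicit none          ! every variable must now be declared
      real :: total
      total = 0.0
      ! Under implicit typing, a typo such as "totl = total + 1.0" would
      ! silently create a new REAL variable; with implicit none the
      ! compiler rejects it instead.
      print *, total
    end program safer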
Just looking at Fortran code fills the back of my mouth with the sensation I got from the printed IBM manuals in the basement computer lab at UIC: dust and powdered printer paper.
This makes me feel old. I learned Fortran on the Dartmouth Time-Sharing System back in 1980. While I think of Fortran now and again and think about re-learning, I haven't thought of DTSS in years.
One of the oldest HIGH-LEVEL programming languages. "Programming" originally meant physically flipping switches or plugging cables into boards. Eventually that moved to binary (1s and 0s), and later to assembly.
First language I learned while in college in the 1990s. While I enjoyed the class, I have a funny picture of me kicking the FORTRAN book across my dorm room after I turned in my final project.
Fortran, the language, is also older than FLOW-MATIC.
FLOW-MATIC's claim to fame was beating Fortran at releasing a working implementation (and having syntax that looked like English, but that's not something to be proud of). Plankalkül, however, has not yet been implemented so if we're only counting releases of working software, it isn't a contender.
> a quick introduction to a few modern Fortran features: declaring variables, printing and reading to and from the terminal, if and select case, and stop
Pretty much sums up this one. Can't say that I agree if/select/stop are "modern" features.
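For reference, those features amount to something like this in free-form Fortran (a small sketch):

    program greet
      implicit none
      integer :: hour
      print *, 'Hour of day (0-23)?'
      read *, hour
      if (hour < 0 .or. hour > 23) stop 'invalid hour'
      select case (hour)
      case (6:11)
        print *, 'Good morning'
      case (12:17)
        print *, 'Good afternoon'
      case default
        print *, 'Good evening'
      end select
    end program greet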
* Nothing wrong with that as a reason, of course
[1] https://docs.oracle.com/cd/E19957-01/802-2998/802-2998.pdf
Waitaminit, is that why we have "sub" in Visual Basic?
https://en.wikipedia.org/wiki/Jacquard_machine
> Next time, we’ll talk more about...
Alas, there was no next time.
Edit: and just three paragraphs in, the author admits they didn't even bother using the oldest version of FORTRAN itself.
So, "oldest" means "rather early".
https://en.wikipedia.org/wiki/Plankalk%C3%BCl