117 comments

  • silentvoice 12 hours ago ago

    There are two sides to numerical linear algebra. The first is the "linear algebra" part, which is mathematically sophisticated; the language you choose to represent these concepts in matters far less than your understanding. Pencil and paper is the ideal place to prove out that understanding.

    The "numerical" part is a minefield because it will take all your math and demolish it. Just about every theoretical result you proved out above will not hold true and will require extra-special hand-holding in code to retain _some_ utility.

    As such I think a language which enables you to go as fast as possible from an idea to seeing if it crosses the numerical minefield unscathed is the one to use, and these days that is python. It is just so fast to test a concept out, get immediate feedback in the form of plotting or just plain dumb logging if you like, and you can nearly instantly share this with someone even if you're on ARM +linux & they are Intel+windows

    The most problematic issue with python&numpy, as it relates to learning the _numerical_ side of linear algebra, is making sure you haven't unintentionally promoted a floating point precision somewhere (for example: if your key claim is that an algorithm works entirely in a working precision of 32 bits but python silently promoted your key accumulator to 64 bits, you might get a misleading idea of how effective the algorithm was). But these promotions don't happen in a vacuum, and if you understand how the language works they won't happen.
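
    In NumPy terms, the trap can be as small as a default dtype; a minimal sketch (variable names are illustrative):

```python
import numpy as np

x = np.ones(1000, dtype=np.float32)

# This accumulator looks innocent, but np.zeros defaults to float64,
# so every addition is silently carried out in double precision.
acc = np.zeros(1000)
acc += x
print(acc.dtype)    # float64, not the float32 working precision you intended

# Keeping the dtype explicit keeps the whole computation in working precision.
acc32 = np.zeros(1000, dtype=np.float32)
acc32 += x
print(acc32.dtype)  # float32
```

    Checking `.dtype` on the key arrays at each stage is the cheapest way to confirm an experiment really ran in the precision you claim.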

    edit: & I have worked professionally with fortran for a long time, having known some of the committee members and the BLAS working group, so I have no particular bias against the language

  • noobermin a day ago ago

    So, the OP is an actual educator whereas I've only really advised grad students or undergrads. I'm surprised that being exposed to any new language doesn't come with its "whys" for students. Like: why should we care about type safety anyway? Or why not loop over all indices; why use (:) for some of them? Maybe I'm just not convinced that the whys from students in a python class are worse than the whys in fortran. Honestly, if there is a compiler option for turning on implicit none by default, I'd just do that too, to get people in the door, as that feels like more confusion than it's worth keeping, although they do need to learn what it means before they leave.

    Also, the downside is that fortran does not have nice plotting capabilities without an external tool. At least I know of no nice libraries like matplotlib, which again is a point in favor of just teaching a more general-purpose language from the get-go, so students can plot and code in the same language... or perhaps matlab/octave et al., as others suggested. I feel like the niceness of fortran (namely, well-defined type-safe function/subroutine interfaces and an easy path to writing performant code) that isn't offered by python is only useful after the first threshold of putting algorithm to paper. The second (and arguably, for some fields, even more important) task of actually plotting results doesn't have the same convenience as the intrinsic procedures in fortran, whereas if they had learned julia or python, these tools would at the very least be at the same convenience level as the array facilities, essentially behind a rather easy[^] to use library. In fact, in julia you're already mostly there, although it's not my cup of tea. Perhaps the answer is julia after all.

    Do OP's courses just use an external program (like gnuplot) or a black box to plot?

    [^] easy to use once you know how to program a little, of course.

    • shakna a day ago ago

      Fortran has a few nice plotting libraries. [0] Including matplotlib.

      Personally, I've only used ogpf, which is a single-file library, making it easier to run for beginners.

      [0] https://fortran-lang.org/packages/graphics/

      • noobermin a day ago ago

        I did a cursory scan and some of these seem not my cup of tea, but honestly ogpf looks rather pleasant for quick plots. Thanks! I might use this.

        That said, the point still sort of stands that these are external libraries and thus a bit less convenient: external libraries need to be linked, which exposes more CS-tier stuff (installing libraries, makefiles, etc.) that distracts from just learning to write code. Which again motivates using a tool that abstracts some of that behind a managed package and library system.

        I'm assuming you could use things like lfortran in jupyter which I imagine might allow these things to be bundled, although I haven't followed that effort as of late.

        • pklausler a day ago ago

          What language does have built-in plotting capabilities that don't depend on external libraries?

          • dgacmu a day ago ago

            Language vs system with a language?

            Mathematica, matlab, maple, octave, etc.

          • int_19h 21 hours ago ago

            It depends on how you define "built-in" exactly. I would argue that in this context, if the end user isn't aware that an external library is used under the hood, it still qualifies as built-in. In which case, R has plot() in its stdlib.

            • noobermin 14 hours ago ago

              For the aid of the unfamiliar and those unwilling to just google (I often am one of these in other discussions): a number of functions in fortran that operate on array types are built into the language and do not require linking; these are called "intrinsic procedures" in fortran's parlance. On modern compilers and modern OSes they are generally implemented as libraries linked in at runtime, just like the C stdlib in C (gfortran's are a C library last I checked, although I'm not super knowledgeable about the details there). From the user's perspective, they need not be linked during compilation. External libraries (modules in modern fortran), on the other hand, require you to link them and to have their modules in the compiler's module path, similar to includes but not quite the same, as modules are binary files.

              Anyway, this discussion is all for the sake of teaching students. A student learning python, I assume, does not start from zero but is instead told to use anaconda or some build script that gives them a standard jupyter install. From then on, in the code they use, matplotlib and numpy appear on equal footing: a set of function calls that just have different prefixes, in their eyes. This surface-level similarity is what I mean by them appearing to have the same convenience level. The fact that jupyter installs are pretty standard and have loads of documentation helps ameliorate potential issues during installation.

              In fortran on the other hand, you do not need an external module for the array facilities (things like shape, dot_product, things for initialising arrays, etc) given the built-ins and the first class nature of arrays. However, you will need to `use` a module for plotting (fairly easy, essentially one line for importing) and link it and add it to the module path (potentially fraught). This is what I mean about them appearing on different footing, from a naive student's perspective.

              While there are attempts to provide a nice package system for fortran, it's generally the wild west out there, just like it is for c++ and c. So unless the instructor has students work only on lab machines they control, using external libraries seems to me to be a huge source of headaches once students go off to install things themselves.

            • TimorousBestie 16 hours ago ago

              By this definition, python doesn’t have plotting in its stdlib.

  • patagurbon a day ago ago

    The post dismisses Julia quite quickly, which is surprising since it is a language essentially purpose-built for teaching numerical linear algebra. Numerical methods is taught in Julia in at least a dozen universities I'm aware of, including MIT.

    Unicode support and a few other syntax niceties make translation from the blackboard to the editor nice and clean. Fortran is great but legibility and easy tooling like (reproducible) package managers are paramount in teaching

    • loiseaujc 11 hours ago ago

      OP here. I actually love Julia and acknowledge its ecosystem can be fantastic for learning purposes. Just see the Computational Thinking course at MIT.

      I'm not dismissing Julia. Actually, the first lines of my conclusions are

      > In the end, when it comes to teaching the basics of numerical linear algebra, Python and Fortran are not that different. And in that regard, neither is Julia which I really like as well.

      I feel like people got the impression that I'm saying they *should* use Fortran for teaching. That ain't the point, but maybe I did not convey it as clearly as I would have liked. The point is: a programming language with strong typing, clear begin/end constructs, a guarantee that inputs to a function cannot be accidentally modified (otherwise it has to be a subroutine), etc. actually makes it easier for students to effectively *learn* computational thinking, rather than battling syntax errors and the strange intricacies of a general-purpose language. Fortran is just an example which happens to be historically tied to number crunching.

      Unicode support and greek letters sure can be useful when presenting code snippets in your slides, but it is essentially syntactic sugar. And, unfortunately, many students have no idea how to spell greek letters (e.g. \to for \tau, \fi for \phi, etc.) and just end up losing time on aesthetic details rather than focusing on the learning objective.

      Finally, you may not know it, but Fortran does have a package manager. Check out fpm (https://github.com/fortran-lang/fpm). It is basically just like Pkg for Julia: it also uses a toml manifest, can be installed via conda, makes sure things are reproducible across compilers and platforms, etc.

    • noobermin a day ago ago

      Saying fortran is not legible is not an argument that holds water against fortran 90. I don't want to be uncharitable, but I don't know how anyone can hold this opinion unless they just don't have much familiarity with it.

      • patagurbon a day ago ago

        I didn’t say it was illegible. I said legibility is paramount, and I don’t think Fortran makes the right trade-offs in that regard to be a great teaching language.

        It’s far more legible for numerics than a lot of languages, except maybe Julia and Chapel. Julia was driven in large part by teaching mathematics at MIT, and I think that shows.

        • mastermage 18 hours ago ago

          I would go so far as to say Fortran is pretty legible, because it only has a handful of built-in keywords and none of the fancy stuff a lot of other languages have.

          Julia has the fancy stuff as well as being very legible, and it also has a nice REPL for instant feedback, which is useful for people learning (plus multiple notebook implementations: Pluto.jl or Jupyter). Chapel has even more of the fancy stuff, like multiple loop types for different kinds of parallelism, which is just wildly cool.

      • jcranmer a day ago ago

        A large share of the illegibility of Fortran code is actually just the aversion of numerics code to having meaningful variable names.

        • pjmlp 14 hours ago ago

          Something that also happens in languages like C and Go. C has the same excuse as Fortran, the limitations of their early compilers, but Go has no reason for single-letter receivers.

        • atrettel a day ago ago

          I second this. When I worked on some older Fortran codes, I had to keep a cheat sheet for the variable names and what they meant or controlled. It definitely made the code hard to read.

          • noobermin 15 hours ago ago

              Yeah, this isn't the case for any code written since maybe the '00s or even the late '90s, although I admit I haven't worked on a code that old recently.

      • kjs3 a day ago ago

          Regrettably, any discussion of Fortran will be quickly filled with people who once had to write a couple of F77 programs in college and never got over it, never used a really nice Fortran compiler, and of the very few who actually know the language has evolved in the last 50 years, the vast majority of that minority couldn't name a single significant thing that changed from F90/F95 through Fortran 2018.

        But they all have Opinions, which they are compelled to share.

        • adastra22 18 hours ago ago

          C++ has evolved immensely in that same time. I still would NEVER use C++ for anything new, now that Rust exists.

          Why should I use Fortran, for anything that isn't maintaining legacy code?

          • pjmlp 14 hours ago ago

            Rust might exist, but it isn't shipping in console devkits, or industry reference game engines, just to quote two examples where it has yet to achieve parity with C++.

          • noobermin 15 hours ago ago

            Fortran is nowadays a DSL for computational codes. If you don't know why you would or could use it, you're not the target audience for it.

              Calling it a DSL is a bit rich given the history of computing, but at least that way CS and developer types will know how to regard it.

          • kjs3 10 hours ago ago

              Oh, there's the one I forgot: the "I don't know how to use that technology, therefore there is no use for it and no one should use it" guy. Least surprising thing ever that it's from the rust crowd.

            • adastra22 9 hours ago ago

              I’ve maintained Fortran codes before in my career.

          • atemerev 15 hours ago ago

            I like modern C++ much better than Rust. I am 3 times more productive in it and don't have to fight the compiler for hours. And RAII is a simple idiom that prevents 95% of actual problems.

            • adastra22 15 hours ago ago

              Your C++ is full of bugs that rustc was telling you about.

              • pjmlp 14 hours ago ago

                Including on LLVM backend used by Rust, I guess.

                When is that fully bootstraped rustc coming?

              • atemerev 15 hours ago ago

                  Well yes, the code I didn't write (because I was 3x slower) surely has no bugs.

      • pjmlp 14 hours ago ago

        Or Fortran 2023, the current standard.

    • zevets 21 hours ago ago

      Julia's choice to encourage people to name their variables greek letters is bad, though. There's a whole group of students who struggle with the symbols but understand the concepts (a residual). Julia, when used to its full capabilities, gains an enormous amount of its power from a huge number of clever abstractions. But in the context of a first course in numerical methods, this can be more off-putting than the "why np?" stuff this article mentions.

      For teaching linear algebra, MATLAB is unironically the best choice - as the language was originally designed for that exact purpose. The problem is that outside of a numerical methods class, MATLAB is a profound step backwards.

      • int_19h 21 hours ago ago

        If students struggle with the use of Greek letters as symbols, it will be difficult for them to deal with lots of math and physics where this is the standard notation. Intuitively, it feels like the best thing the language can do here is enable notation that is as close as possible to the underlying math.

        • hulitu 17 hours ago ago

          > If students struggle with the use of Greek letters as symbols

          After using Character Map or other "user friendly" methods to enter Greek letters as symbols on a computer, I would say yes, people struggle with the use of Greek letters. Unless, of course, one has a Greek keyboard.

      • pavon 20 hours ago ago

        Whatever language you choose, you are only going to be teaching a subset of it anyway, so just ignore the Unicode identifier support. I code in Julia professionally, and never use it.

      • bouchard 14 hours ago ago

        Using greek symbols as public-facing function arguments, etc. is definitely not recommended, and not that common (at least in my experience).

        It's best used for internal calculations where the symbols better match the actual math, and makes it easier to compare with the original reference.

  • criddell a day ago ago

    Fortran is not a better choice unless you are only thinking about the immediate needs of the course. In the wider world, Python is going to be a lot more useful to the students.

    • cultofmetatron a day ago ago

      if you're smart enough to learn fortran and to implement and understand numerical methods in it, I would argue that learning python will be an afterthought. You can learn python along with numpy in a week tops if you already understand the theory. I believe a lot of numpy libs are written in fortran anyway, though I could certainly be wrong there.

      • esafak a day ago ago

        Why learn Fortran? If you want nice things stop propping up dinosaurs. Let it die already.

        Teach them numerical algorithms and have the students contribute to a better language's BLAS, LAPACK, or numerical library like numpy, jax, scipy, etc. Be part of the solution.

        • cultofmetatron a day ago ago

          fortran is still very much the GOAT when it comes to numerical methods, to the point where all the modern fancy numerical-methods libraries still squeeze out performance by calling out to libraries implemented in fortran. It's far from dying; it's very good at its niche.

          • adgjlsfhk1 21 hours ago ago

            This isn't true. OpenBLAS and MKL are both C/C++ with hand-coded assembly microkernels. SciPy is in the process of removing the last of its Fortran because no one wants to maintain it, and newer methods in other languages are faster. Fortran hasn't been at the core of everything for decades.

            • zevets 21 hours ago ago

              For better or worse, Fortran is still a popular language to write clever PDE schemes in, as it maximizes "time to first, fast-enough-running code".

              But for anything with a userbase of more than ~15 people, C/C++ are widely preferred.

              • adgjlsfhk1 20 hours ago ago

                Julia is starting to pick up steam here. It's a lot easier to write mixed-precision algorithms in, since the type system is pretty much designed for efficiently writing generic algorithms (and it doesn't hurt that Julia's ODE solvers are SOTA).

                • wolvesechoes 18 hours ago ago

                  > Julia is starting to pick up steam here

                  First time I saw this claim was over 9 years ago.

        • dismalaf 21 hours ago ago

          Fortran has modern versions and is much nicer for writing numerical code than C, C++, Rust, etc...

    • tgv 17 hours ago ago

      That's not the goal of learning the basics of numerical programming. Plus, anyone doing such a course should already know how to write code.

    • jhbadger 20 hours ago ago

      Today, sure. But you could have made the same argument with Pascal being more practical in the 1980s. And yet here we are 40 years later with Pascal having become obscure and Fortran still with us. While ultimately any programming learning should be language independent, there's an argument to be made to stick to languages like Fortran which have a track record of being around a long time over whatever is currently popular.

      • CJefferson 20 hours ago ago

        Fortran is with us in a technical sense, but its use is tiny. I’ve done about six Fortran to python rewrite projects during my academic career. No-one has seriously suggested going the other way.

        I will bet you any amount of money you like Python will be more popular than Fortran in 5,10,20 and 30 years.

        • spragl 17 hours ago ago

          You are definitely not risk-averse. Try looking back 30 years and seeing how much popularity has changed since then.

  • stared 11 hours ago ago

    Sure, it is fine to teach numerical linear algebra in Fortran - 15 years ago, I was learning it with Fortran 77. Yes, there are many benefits to using a different language, even for its own sake. Or to using a language without distractions, just to focus on the course, rather than one that can also be used for building a website or so. Or one with less "magic" than Python.

    At the same time, I find most of the arguments bizarre, and others harmful.

    From the bizarre region: students asking "What is 'import numpy as np'?". A legit question. If you want to avoid all technicalities, pen and paper is the right approach. All in all, they need to run code somehow, and there will be questions like "What is 'implicit none'?"

    If they are not asking such questions, it is not because they are focusing so much on linear algebra. It's because they are lost, or at the very least have lost interest.

    From the (in my opinion) actively harmful ones:

    > It’s about teaching the basics of scientific computing to engineering students with a limited programming experience.

    This is a big red flag. If they have little background in programming, teach good standards of Python. Otherwise they will pick up the worst patterns and think that GOTO is the preferred way of coding.

    If you wonder why a lot of academic code looks like a mess, one of the reasons is using archaic tools and techniques as a standard.

    Also: if your focus is linear algebra and avoiding magic, you CAN do it without any numpy, just Python lists.

    • loiseaujc 10 hours ago ago

      OP here.

      > This is a big red flag. If they have little background in programming, teach good standards of Python.

      That is a big issue, I'll admit it. And actually, in my Uni, I am a very strong proponent of making an intro to programming class mandatory before even being able to enroll in any other engineering classes. Unfortunately, I don't see this happening anytime soon, mostly because the higher-ups have a distorted view of what programming actually is. And I believe it is the same in many French universities. Maybe it's different elsewhere, I don't know, but in the meantime I have to make do with what I have.

      > Otherwise they will get worst patterns, and think that GOTO is the preferred way of coding.

      People keep referring to `goto`. But that is a construct that has been considered bad practice since the Fortran 90 standard, 35 years ago. Fair enough, there are plenty of legacy codes using it. Just like there is plenty of C code written 35 years ago that is just terrible by today's standards. Modern Fortran (and by modern I mean anything following the 2003 standard and more recent) does not use `goto`. And academics writing Fortran code today do not use `goto`, nor do they teach it.

      > If you think why a lot of academic code looks like mess, one of the reason is using archaic tools and techniques as a standard.

      Again, I think this is a very distorted view. When you need to run your code on 1000+ CPUs, which is what I and my colleagues do, you had better make sure your code follows the most recent good practices and uses industry-standard tooling, so as not to waste CPU hours on a stupid bug or whatnot.

  • ziotom78 14 hours ago ago

    I’m a teaching assistant in a C++ course for second-year physics students. The goal is to teach numerical and data analysis, but the course is structured around a very object-oriented style. For example, every function is defined through inheritance:

      class BaseFunction {
      public:
        BaseFunction();
        virtual double Eval(double x) const = 0;
      };
    
    If students want to integrate or find the roots of a function, they must first subclass it:

      class SinClass : public BaseFunction {
      private:
        double _omega, _phase;
      public:
        SinClass(double omega, double phase);
        double get_omega() const;
        double get_phase() const;
        void set_omega(double new_omega);
        void set_phase(double new_phase);
        double Eval(double x) const override {
          return sin(_omega * x + _phase);
        }
      };
    
    Everything in the course (numerical integration, PDEs, Monte Carlo sampling, …) follows this pattern.

    The professor is an excellent teacher and students love him. But other faculty worry that our students leave without any real exposure to Python, and that they keep reproducing this heavy OOP style even in situations where a simple lambda or a few lines of NumPy would be far more natural. (Lambdas are shown at the very end of the course, but mostly as a curiosity.)

    That’s why I found the blog post so interesting: it shows how natural code can look in Python or Fortran. By contrast, our students’ code is weighed down by boilerplate. It makes me think that sometimes the real difficulty in teaching numerical analysis isn’t the language itself, or whether arrays start at 0 or 1, but the teaching approach that frames every problem through layers of abstraction.
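
    For contrast, here is roughly what the same task looks like with a lambda and a few lines of NumPy (a sketch under my own assumptions, not the course material): the whole SinClass hierarchy and its Eval method collapse into one expression, and a hand-written trapezoidal rule stands in for the integration class.

```python
import numpy as np

# The entire SinClass, in one line: a function is just a value.
omega, phase = 2.0, 0.5
f = lambda x: np.sin(omega * x + phase)

# Trapezoidal rule on [0, 1], written out so no integration library is assumed.
x = np.linspace(0.0, 1.0, 100_001)
y = f(x)
dx = x[1] - x[0]
integral = dx * (y.sum() - 0.5 * (y[0] + y[-1]))

# Analytic value of the integral, for comparison.
exact = (np.cos(phase) - np.cos(omega + phase)) / omega
print(abs(integral - exact) < 1e-6)   # True
```

    Whether this is pedagogically better is exactly the faculty debate; it does make the numerical content, rather than the class hierarchy, the visible part of the program.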

    • dekken_ 12 hours ago ago

      > other faculty worry that our students leave without any real exposure to Python

      have you seen this? https://cppyy.readthedocs.io/en/latest/

    • jampekka 13 hours ago ago

      Is the course about teaching these methods generally, or teaching to implement them in C++ in particular (maybe for e.g. ROOT)?

      Using C++ as a pedagogical tool for general teaching of such methods would be quite a choice.

      • ziotom78 10 hours ago ago

        The course is meant to teach students how to write numerical code (integration, PDEs, Monte Carlo, …) rather than to teach OO design itself. C++ is the chosen vehicle, and the professor has a strong OOP background, so the course material ends up structured that way.

        That’s why some of us wonder whether a lighter approach (lambdas, NumPy in Python, etc.) might let students focus more directly on the numerical methods without so much boilerplate.

  • tomrod a day ago ago

    Fortran, Octave, or Julia are excellent for learning linear algebra.

    This was the path I took, before going to Python, Go, and Rust.

  • veqq a day ago ago

    APL is better, obviously. There are even dozens of textbooks for teaching math with its notation.

  • nivter 16 hours ago ago

    If it's about teaching and not about efficiency, why not just use plain Python? One could argue it is actually better since students don't have to worry about typing and syntax, and it allows a gentler introduction to commonly used tools like jax and numpy while getting comfortable with the language.

    • blahedo 16 hours ago ago

      > since students don't have to worry about typing and syntax

      As someone who regularly teaches intro programming using Python, I assure you that students learning Python need to worry both about types and about syntax, and the fact that both are invisible does them less favours than you might think. Type errors happen all the time in Python, but they aren't caught until runtime and only when given the right test cases, and the error message points somewhere in the program that may be quite distant from the place where the problem actually is. Syntax errors are less common for experienced programmers, but newcomers struggle just as much with syntax in Python as they do in languages like C++ and Java (both of which I've also taught intro programmers using).
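
      A minimal illustration of that distance between cause and error (the `mean` helper is hypothetical):

```python
def mean(xs):
    return sum(xs) / len(xs)

# The mistake happens here: values read from a file arrive as strings...
grades = ["90", "85", "78"]

# ...but the TypeError is only raised later, inside sum(), and its message
# points at mean(), not at the line where the data actually went wrong.
try:
    mean(grades)
except TypeError as e:
    print(e)   # unsupported operand type(s) for +: 'int' and 'str'
```

      A statically typed language would reject this at compile time, at the assignment; Python reports it at runtime, wherever the bad value finally gets used.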

    • TimorousBestie 16 hours ago ago

      Numerical linear algebra is intrinsically strongly typed. The same algorithms that work in double or extended double precision may not work at single or half precision.

      Pure python has a tendency to silently widen every floating point type to double. Numpy overlays a C ABI on top of python’s oversimplified type system, which complicates matters further.

      I wouldn’t teach numerical linear algebra in any weakly typed language.
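
      One line of NumPy shows how precision-dependent the arithmetic is: a step that double precision resolves exactly is silently lost in single precision.

```python
import numpy as np

# float32 carries a 24-bit significand, so 2**24 + 1 is not representable:
# the update rounds away and an accumulator simply stops growing.
big32 = np.float32(2**24)
print(big32 + np.float32(1) == big32)   # True: the +1 vanished

# The same step in double precision is still exact.
big64 = np.float64(2**24)
print(big64 + np.float64(1) == big64)   # False
```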

  • abdullahkhalids a day ago ago

    My scientific computing journey was

    - Matlab in the first few science lab courses + first CS course.

    - C++ in second CS course

    - Fortran for the scientific computing course

    I found Fortran worse than matlab. The error messages were difficult to parse, and it was much more difficult to do step-through debugging like in matlab.

    But later I learned Python, and now use it professionally to do scientific computing, and I would give anything to go back to Fortran. Or use Rust or Julia. Or Wolfram/Mathematica, if that were possible. Anything but Python.

    The fundamental problem with Python is that all the math is hacked into it, unlike Julia/Matlab/Mathematica where the math takes first priority and other values are secondary.

    • noobermin a day ago ago

      Maybe you learned all of these extremely recently, because for decades I would definitely say C++ error messages were far worse than anything a fortran compiler has ever barked at me. The bad days are definitely over, but I still think C++ template errors can be the thing of horrors even today. I know you compared matlab to fortran, but you said you took C++ just prior to this, and I'm amazed that didn't harden you for anything gfortran/ifort would throw at you.

    • bluedino a day ago ago

      What are the obstacles to using Fortran (or Rust or Julia) in place of Python?

      • abdullahkhalids a day ago ago

        Many other researchers I work with have almost no programming experience outside of Python or other high-level languages. Switching to Fortran or Rust would significantly slow down our work for at least a year or two while people catch up.

        Julia would be easier to switch, but it's still months of work to port over existing libraries.

    • naijaboiler a day ago ago

      correct. Python is a general purpose language pretending to speak math.

  • stathibus a day ago ago

    If you are unwilling to teach through python's warts you should use Matlab, not fortran.

    • dubya a day ago ago

      I’d suggest Octave over Matlab, because current Matlab has tons of distracting AI and autocomplete front and center. Probably really helpful for getting a plot just right or implementing an algorithm from a paper, but not so good for learning the basics.

    • goerz a day ago ago

      Even better: Julia (although Fortran is pretty good!)

      • adgjlsfhk1 a day ago ago

        I translated the jacobi example to julia, and it does seem to address every one of his gripes with Python.

        • cbolton 17 hours ago ago

          I think his main point is about strict typing in Fortran. You can add type annotations in Julia, but it's almost an anti-pattern if you don't need them, e.g. for dispatch. In any case, the type annotations in these examples would be quite unnecessary, unlike in Fortran (where, as I understand it, you can at best fall back on implicit typing, but then must use variable names with specific patterns).

          • adgjlsfhk1 8 hours ago ago

            I would argue that the strict typing in Fortran is actually a significant hindrance. There's nothing about any of these algorithms that requires double precision (or even contiguous storage), so why should the algorithm arbitrarily restrict it? Annotations of rank (e.g. AbstractMatrix in julia) help document what the code does, but the strict annotations of Fortran are restrictions without value.
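
            A dtype-generic sketch of that idea in NumPy (a hypothetical Jacobi solver, not the article's code): the algorithm never names a precision, so it runs in whatever precision the inputs carry.

```python
import numpy as np

def jacobi(A, b, tol=1e-5, max_iter=500):
    """Jacobi iteration for Ax = b, in whatever precision A and b carry."""
    x = np.zeros_like(b)
    D = np.diag(A)              # diagonal of A
    R = A - np.diagflat(D)      # off-diagonal remainder
    for _ in range(max_iter):
        x_new = (b - R @ x) / D
        if np.max(np.abs(x_new - x)) < tol:
            return x_new
        x = x_new
    return x

# Same code, single precision in, single precision out.
A = np.array([[4.0, 1.0], [1.0, 3.0]], dtype=np.float32)
b = np.array([1.0, 2.0], dtype=np.float32)
x = jacobi(A, b)
print(x.dtype)   # float32
```

            Swap the arrays to float64 (or float16) and the same function runs unchanged, which is the genericity a Fortran `real(real64)` signature forbids.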

          • TimorousBestie 15 hours ago ago

            Numerical linear algebra is the exemplar killer app for multiple dispatch; not exploiting it would be a waste.

      • QuadmasterXLII a day ago ago

        I love julia, but the default workflow is

        Step 1) Write the function using high level abstractions

        Step 2) Glance over the generated assembly and make sure that it vectorized the way you wanted.

        • mofeing a day ago ago

          > Glance over the generated assembly and make sure that it vectorized the way you wanted.

          Isn't that something you would also need to do in Fortran? IMO Julia makes this easy with its `@code_*` macros, and that is one of the main reasons why I use it.

          • wtcactus a day ago ago

            In my experience, Fortran compilers are heavily optimized. They compete head to head with C.

            Julia's compiler, on the other hand, often puts out very unoptimized code.

            Mind you, the last time I looked at Julia was 2-3 years ago; maybe things have changed.

            • patagurbon a day ago ago

              If you write Julia similarly to Fortran, with explicit argument types and for loops, and avoiding allocations, it shouldn't be too far off. IIRC Fortran has a few semantics that can make it more optimal in some cases, such as its stricter aliasing assumptions.

              But indeed, there are almost certainly fewer performance surprises in Fortran.
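The "avoid allocations" advice carries over to NumPy as well. A hypothetical sketch (not from the thread) contrasting an allocating expression with a preallocated output buffer:

```python
import numpy as np

n = 1_000
rng = np.random.default_rng(0)
u = rng.random(n)
v = rng.random(n)

# Allocating version: `u * 2.0 + v` creates a fresh temporary array
# on every evaluation.
w_alloc = u * 2.0 + v

# Preallocated version: reuse one output buffer across calls, the way
# you would in Fortran or in allocation-free Julia.
out = np.empty(n)
np.multiply(u, 2.0, out=out)
np.add(out, v, out=out)
```

Inside a hot loop, the second form avoids one temporary per iteration, which is often where naive NumPy loses to compiled code.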

        • TimorousBestie 15 hours ago ago

          This is the default workflow in every high-level language. Even if I’m writing explicit SIMD intrinsics in C targeting a specific processor, I still have to benchmark and maybe look at the assembly to make sure it did what I intended (or something better).

    • a-dub a day ago ago

      this is the way. octave or matlab.

      people like to complain about matlab as a programming language but if you're using it that way you're doing it wrong.

      matlab (the core language) is awesome for expressing matrices and vectors and their operations as well as visualizing the results. matrix expressions in matlab look almost identical to how they look in mathematical notation (or how one might write them in an email). you shouldn't be using programming language flow control (or any of the other programming language features), you should be learning how to write for loops as vector and matrix operations and learning from the excellent toolboxes.
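The same loop-to-vector rewriting applies in NumPy terms (a hypothetical sketch; the comment is about MATLAB, but the principle is identical):

```python
import numpy as np

A = np.arange(12.0).reshape(3, 4)
x = np.ones(4)

# Loop version: y_i = sum_j A_ij x_j, written out index by index
y_loop = np.zeros(3)
for i in range(3):
    for j in range(4):
        y_loop[i] += A[i, j] * x[j]

# Vectorized version: one expression, close to the math notation y = A x
y_vec = A @ x
```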

      • OkayPhysicist a day ago ago

        IMO, the issue is that "Scientific computing" covers several disparate use cases.

        When you care about the math, Mathematica. It's a replacement for several pages of hand-written math, or a chalkboard.

        When you care about the result, MatLab. It's a replacement for your calculator, and maybe Excel.

        When you care about the resulting software? Python/Julia/Fortran.

    • kergonath a day ago ago

      Fortran is much more approachable and more regular than Matlab. Really, there’s no contest.

  • GhosT078 a day ago ago

    Ada is also excellent for linear algebra and other numerical programming.

  • dkga a day ago ago

    I personally think R and Julia are much better at this.

    • scheme271 15 hours ago ago

      I think R isn't great. It has some odd quirks (e.g. <- for assignment), and even using the tidyverse, it's a bit tough to gel with its mostly-functional-but-not-always nature.

  • misja111 16 hours ago ago

    Stating that Fortran is better for teaching numerical linear algebra is like stating that Latin is better for teaching medicine. Sure, it might be a better fit, but unless your students have all the time in the world, it doesn't seem very efficient to put your students through the extra burden of learning an unfamiliar syntax, development environment etc.

    • atemerev 15 hours ago ago

      Modern Fortran is quite easy. It is learned overnight by most physics students. At this point, it is more a fast compiled DSL for numerical computations (like MATLAB but _much_ faster), than a universal programming language.

  • drnick1 20 hours ago ago

    I don't see what's wrong with Python for a first course. In fact, you can go very far with Python alone (in terms of performance) if you learn to use Numba well. If that isn't enough, I would go straight to C (opportunistically augmented by C++98 when needed), and learn how to use the GNU Scientific Library and Eigen (the linear algebra library).

    • adastra22 18 hours ago ago

      Nobody should be using C/C++ for anything new, much less using it as a teaching vehicle. That's just irresponsible at this point.

      • drnick1 17 hours ago ago

        Not true. C++ is great for scientific computing. There is a huge number of mature libraries like Eigen. You can also painlessly use Fortran/C libraries like LAPACK, SLICOT and many others. Performance is top notch. It's multiparadigm so you can use OOP where it makes sense. It's a very complex language, but for scientific computing the C++98 subset is enough and IMO it's far cleaner than modern C++ or Rust.

        C++ also has by far the best support for graphics (OpenGL/Vulkan) and GUI toolkits like Qt.

        • adastra22 15 hours ago ago

          You can also call out to these libraries from other languages that have far better safety features.

          • drnick1 7 hours ago ago

            "Safety" is largely irrelevant for scientific computing. Manipulation of memory is handled by libraries like Eigen and you don't have to use raw pointers at all.

          • wolvesechoes 11 hours ago ago

            How do you expose C++ templates through a C API so that they can be used from other, safe languages?

      • pjmlp 14 hours ago ago

        First create the alternatives, including industry standards; then, yes, we can consider stopping using them, even if they have plenty of issues.

  • a day ago ago
    [deleted]
  • QuadmasterXLII a day ago ago

    Am I crazy or is the Jacobi iteration flipping the sign of u every iteration?

    Also, the swapping of u and tmp doesn't work like that in Python. It might in Fortran.
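For reference, a minimal Python/NumPy version of the two-buffer scheme (a sketch under my own naming, not the article's code): the swap is a tuple assignment that exchanges references, so no data is copied and there is no aliasing problem as long as each sweep writes into `tmp` before the swap.

```python
import numpy as np

def jacobi(A, b, n_iter=100):
    # Two-buffer Jacobi iteration: x_new = D^{-1} (b - R x)
    D = np.diag(A)          # diagonal entries of A
    R = A - np.diag(D)      # off-diagonal remainder
    u = np.zeros_like(b)
    tmp = np.empty_like(b)
    for _ in range(n_iter):
        tmp[:] = (b - R @ u) / D   # write the new iterate into tmp
        u, tmp = tmp, u            # swap references, no element copy
    return u
```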

  • randomNumber7 a day ago ago

    I think Fortran would be cool if students wrote their programs by hand on punch cards and the teacher then executed these to check the programs. Like in the very early days of programming, where you had to submit these punch cards and wait until they got executed on the mainframe by an operator.

    • GJim 11 hours ago ago

      I know you are joking....

      ...however, discussion of Fortran is inevitably dominated by those who once saw some F77 code years back and associate Fortran with punch cards.

      Fortran 90 was a big update of the language, and modern Fortran versions and compilers are very good indeed. Not to mention they can be used with the old tried and tested libraries which are bombproof.

      There is a reason Fortran is still used. It's simple, and you can *really* trust it.

    • jmclnx a day ago ago

      Fun times, my first job, but I messed it up.

      I should get back to Fortran but it has changed a lot over the years.

  • Bostonian a day ago ago

    A follow-up post is https://loiseaujc.github.io/posts/blog-title/jacobi_experime... "Jacobi method: From a naïve implementation to a modern Fortran multithreaded one".

  • fortranfiend 11 hours ago ago

    Yes Fortran is better, I say this with no preference at all. XD

  • constantcrying a day ago ago

    It seems pretty clear to me that the best language to use for numerical linear algebra is the language you know, and a good curriculum would be structured so that your first encounter with a language is not a course where the choice of language is nearly irrelevant.

    • wombatpm a day ago ago

      You would think. I was a TA for numerical methods for undergrad engineers. I saw lots of bright students who understood the material but couldn't debug their code. Several grad students started a 1-credit seminar course on Fortran fundamentals. We worked those students like dogs, but when they took the numerical methods course they came back and thanked us, because they knew how to get their code to compile and how to use the admin graphics package.

    • scheme271 15 hours ago ago

      Except that different students will know different languages, and when you teach the class you need to pick one. Also, changing languages each time the course is taught isn't feasible, so either you pick one or use pseudocode. Choosing one has the advantage that students can actually run it.

      • constantcrying 14 hours ago ago

        >Except that different students will know different languages, and when you teach the class you need to pick one.

        Why? In which uni do you not have an introductory programming class that is a prerequisite for every other software-based class?

        Just use the language of your introductory class; everyone will know it, and everyone can think about the algorithms.

  • hulitu 17 hours ago ago

    > While you have to use np.linalg.norm in Python to compute the norm of a vector, Fortran natively has the norm2 function for that. No

    That's what sucks about Python as a newbie. I wanted to play with random, which the Python docs say is a builtin. Obscure error message. Searched the internet, no fix in sight. After more searching I found out that I actually have to "import random" (builtin??) and use a method? Wtf.

    So Python is definitely not for newbies. Fortran code is much easier to understand.
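For anyone hitting the same wall, a minimal working example: `random` ships with every Python install (that's the sense in which the docs call it "built in"), but unlike true built-ins such as `len()` it lives in a module that must be imported first.

```python
import random  # part of the standard library, but not in scope by default

random.seed(42)              # seed the generator for reproducibility
roll = random.randint(1, 6)  # random integer, inclusive on both ends
```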

    • talideon 17 hours ago ago

      The Python docs say it's part of the built-in standard library, not that it's literally there and in scope by default.

      We learned the value of modularity decades ago. This is a sign of Fortran's age, not a good thing. Very few modern languages that aren't heavily domain-specific will have more than a bare minimum of functions in scope by default without needing to import them or a module containing them. The only relative counterexample I can think of is PHP, and even it has grown namespaces over the years.

  • _giorgio_ 19 hours ago ago

    This kind of professor must disappear.

    They always want to teach the more elegant and divine method that, in reality, nobody uses.

    I studied Pascal and Fortran when they could have taught me more widely used languages. Shame on them.

    If Fortran is so useful and clear, just offer some lessons on it. Surely students will be enlightened and captivated by it.

  • bjourne 13 hours ago ago

    1. Students are allergic to useless knowledge. The prof could ask them whether they prefer a "better" programming language they will never use, or one that 99% of them will use every day at work.

    2. Good luck finding TAs and support staff for Fortran-centric courses.

    Practicality beats purity.

  • toolslive a day ago ago

    > No off-by-one error – By default, Fortran uses a 1-based indexing. No off-by-one errors, period.

    I'm with Dijkstra on this one. https://www.cs.utexas.edu/~EWD/transcriptions/EWD08xx/EWD831...

    • loiseaujc 12 hours ago ago

      OP here. 0-based vs 1-based indexing is one of these hot debates I don't want to get into, as I've mentioned in the post. Yet the point is: if you take pretty much any math textbook, indexing starts at 1, whether we like it or not. And for many students, understanding the mechanics of a given algorithm is already enough of an effort; they do not need, on top of that, to translate every index in the book's pseudo-code from 1-based to 0-based. That's all I'm saying. No other value judgement.
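As a concrete illustration (a hypothetical snippet, not from the post), the translation burden looks like this:

```python
import numpy as np

# Textbook pseudo-code (1-based):  for i = 1, ..., n:  x_i = b_i / a_i
# Python translation (0-based): range(n) runs over 0 ... n-1, so every
# index from the book shifts down by one.
a = np.array([2.0, 4.0, 5.0])
b = np.array([2.0, 8.0, 15.0])
n = len(b)
x = np.empty(n)
for i in range(n):   # i = 0 .. n-1 stands in for the book's i = 1 .. n
    x[i] = b[i] / a[i]
```

Trivial here, but in algorithms with shifted indices (i-1, i+1, sub-ranges) each occurrence is one more chance for an off-by-one slip.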

    • Skeime 17 hours ago ago

      Dijkstra is right, of course. However, most math texts still use 1-based indexing. If you want to translate them into code, it's easier when the conventions match.

      (Now, if you had a proposal for switching math over to 0-based indexing ...)

      • toolslive 10 hours ago ago

        He (Dijkstra) even mentions this in the article:

        > The above has been triggered by a recent incident, when, in an emotional outburst, one of my mathematical colleagues at the University —not a computing scientist— accused a number of younger computing scientists of "pedantry" because —as they do by habit— they started numbering at zero.

      • scheme271 15 hours ago ago

        Good luck getting a community with literally hundreds of years of literature using 1 based indexing to change.

    • potbelly83 6 hours ago ago

      Doesn't Fortran also support the ability to define arrays with arbitrary bounds, e.g. dimension(-4:5)? That is quite difficult to do in other languages.

      • pklausler 5 hours ago ago

        Yes, but pitfalls were added to the feature in Fortran '90 and it should now be generally avoided.