Time-Traveling to 1979: Advice for Designing 'C with Classes'

(coderschmoder.com)

16 points | by birdculture 8 days ago

20 comments

  • procaryote 10 hours ago

    The best advice is probably "don't", as it usually is for most people setting out to design a programming language, and even more so for people setting out to build a mostly backwards-compatible extension of a language that isn't suited to what you want it to do.

    The second best advice is probably: just do C with classes. Allow defining your own allocator to make objects of those classes. It's fine if objects built with one allocator can only refer to objects built by the same one.

    Don't do templates; just do the minimum needed for a container type to know what type it contains, for compile-time type checking. If you want to build a function that works on all numbers regardless of whether they are floats or complex or whatever, don't, or make it work on those classes and interfaces you just invented. A Float is a Number, as is an Integer. Put all the cleverness you'd waste on templates into making the compiler somewhat OK at turning that into machine types.

    Very specifically, don't make the most prominent use of operator overloading a hack that repurposes the binary left-shift operator to mean `write to stream`. People will see that and do the worst things imaginable, and feel good about themselves for being so clever.

  • kelseyfrog 8 days ago

    I have no doubt that had this happened nothing would have changed. C++'s legacy is that every positive incremental change is implemented at the last possible moment and in the most frustrating and caveat-laden manner.

    The individual merits of language features hold relatively little value compared to the sausage making machine that is the C++ language evolution process.

  • zvrba 12 hours ago

    What I'd tell Bjarne:

    - In the future, you'll carry in your pocket a computer more powerful than the sum of all computers currently present at the university

    - The unchecked flat memory model of C will cause numerous security issues with sometimes grave consequences in the "real world"

    - Follow the design of Standard ML (SML) and adapt it to systems programming (yeah, it appeared in 1983, but surely papers have been published before that)

    - Do not even think about using unsigned types for sizes, and get rid of implicit numeric conversions: `if (v.size() - 1 < 0)` fails on an empty vector in today's C++

    - Deterministic resource management is still important and is _the_ feature that C++ gets praised for.

    - Lack of standard ABI will cause a lot of headaches and lost time.

    - I would tell him about LLVM IR, .NET assemblies, metadata and encourage him to first standardize an intermediate format which the compiler could read and write. That'd ensure seamless interoperability between compilers and even other languages.

    - Related to the above point: the header/source split will become a burden.

    • anthk 10 hours ago

      C++ created maintenance disasters. The best thing that could happen to C++ is to be killed off once and for all, with Go as a systems language and maybe Rust for the rest.

      • pjmlp 4 hours ago

        Go could have learnt a few lessons from the languages that predated it; instead, as the authors originally did with C, it was more fun to create their own thing and leave a few warts that will never be fixed, since Go will never move beyond 1.<increment counter>.

        At least it is much safer than C will ever be.

        Rust still needs to get rid of its C++ dependency on LLVM, and eventually GCC.

        • anthk 3 hours ago

          Go learnt everything from Plan 9 C and Limbo, which are pretty much the refined versions of Unix and C, with Limbo being the core of Inferno.

          It's pretty much good enough; maybe not for operating systems, but ideal for network-waiting daemons. It uses far more RAM than a core written in C, but for something built for the future, in 2030 it won't seem that bloated. Especially when we are seeing Electron-bound aberrations and even JS-ridden crap under Gnome and Windows 11 for trivial tasks.

          • pjmlp 3 hours ago

            Yes, Go is an improvement over C, with Alef and Limbo learnings and some Oberon-2 in the mix; pity that they forgot about everyone else in the programming language community.

            Proper enumerations instead of iota/const, generators as a language feature instead of callback gimmicks, generics that still have a few sharp corners, the new Do approach, half-done plugins, verbose error handling, ...

  • benchloftbrunch 4 hours ago

    I would add to that, replace #include with a proper module system that fixes the encapsulation and redundant parsing problems once and for all.

    It's 2025 and C++ modules still aren't suitable for real-world use, despite being standardized 5 years ago.

    Additionally standardize the ABI up front so that different compilers can interoperate. Make namespaces native to the object file format.

    Also, explicitly standardize a compiler optimization mode that does not try to exploit UB in eldritch ways that break basic assumptions about how the machine works for a 1% performance gain. I get that's an undecidable problem, so it's OK if some extra annotations (call them "attributes" and write them [[like this]]) are needed here for explicit optimizer hints.

  • throwaway17_17 12 hours ago

    I really am curious why the article goes with just implementing templates early. If the question is going back from today (or even 2013, the year Bjarne gave the question to his class), why would someone recommend templates when typed polymorphic datatype constructors are a sounder method for implementing 'generics' (and also make it easier to produce sensible error messages)?

    Also, why go with constexpr (which is not as expressive, unless I have badly misunderstood how it works) as a replacement for pre-processor macros? There have been type-safe and sound implementations of macros, along with explicitly staged computation, since the early 2000s; why would that not be preferable?

    I think the article is a fun thought exercise, but it sticks too closely to what C++ has become in our timeline and ignores better alternatives that, if explained and implemented at the outset, would result in a language that retained the performance and abstraction characteristics of C++ as it is today but placed it on a sound foundation for further evolution as the language adapts to changes in the industry at large.

  • mpweiher 12 hours ago

    In 1979 the “standard practice in C of passing a large struct to a function” wasn’t just not standard practice, it didn’t exist!

    All you could pass as a parameter to a function were pointers to structs. In fact, with one exception, all parameters to functions were basically a machine word: either a pointer or a full-size int. The exception was doubles (and all floating-point args were passed as doubles).

    Hmm..maybe two exceptions? Not sure about long.

    The treatment of structs as full values that could be assigned and passed to or returned from functions was only introduced in ANSI C, 1989.

    And of course the correct recommendation to Bjarne would be: just look at what Brad is doing and copy that.

    • dfawcus 3 hours ago

      > In 1979 the “standard practice in C of passing a large struct to a function” wasn’t just not standard practice, it didn’t exist!

      Yes it did exist. It just wasn't mentioned in the original K&R book.

      See this page of a memo from November '78; passing and returning structs was supported. When I learnt C on a Unix system, there was a copy of this memo in the printed papers section.

      https://www.nokia.com/bell-labs/about/dennis-m-ritchie/cchan...

    • jibal 10 hours ago

      According to https://www.nokia.com/bell-labs/about/dennis-m-ritchie/chist..., which is authoritative:

      > During 1973-1980, the language grew a bit: the type structure gained unsigned, long, union, and enumeration types, and structures became nearly first-class objects (lacking only a notation for literals).

      And

      > By 1982 it was clear that C needed formal standardization. The best approximation to a standard, the first edition of K&R, no longer described the language in actual use; in particular, it mentioned neither the void nor enum types. While it foreshadowed the newer approach to structures, only after it was published did the language support assigning them, passing them to and from functions, and associating the names of members firmly with the structure or union containing them. Although compilers distributed by AT&T incorporated these changes, and most of the purveyors of compilers not based on pcc quickly picked them up, there remained no complete, authoritative description of the language.

      So passing structs entered the language before C89, and possibly was available in some compilers by 1979. I was very active in C during this period and was a member of X3J11 (I happen to be the first person ever to vote to standardize C, due to alphabetical order), but unfortunately I'm not able to pin down the timing from my own memory.

      P.S. Page 121 of K&R C, first edition, says "The essential rules are that the only operations that you can perform on a structure are take its address with &, and access one of its members. This implies that structures may not be assigned to or copied as a unit, and that they cannot be passed to or returned from functions. (These restrictions will be removed in forthcoming versions.)"

      So they were already envisioning passing structs to functions in 1978.

      • mpweiher 4 hours ago

        Yeah, I don't think I claimed that these things weren't "envisioned" in 1979. My claim was that they most certainly weren't "common practice" in 1979.

        In my mind, they didn't exist yet, and certainly the 1978 definition of C that I read, and that you also cite, confirms this: "they cannot be passed to or returned from functions". Not much time passed between 1978 and 1979, so while that's possible it doesn't seem particularly likely.

        My first C compiler, Manx Aztec C for the Amiga (obviously from the mid 1980s) didn't support structures as function arguments, and only got them with a later upgrade that supported ANSI C.

        The 2nd edition of "The C Programming Language" from 1988 also describes ANSI C (at least that's what it says on the cover), so I don't see any documentation that points to C with structures as function arguments in the 1979 timeframe.

        So I think even my less important claim, that structure passing came about with ANSI C, is pretty solid, even if there may have been isolated compilers that supported structure passing before that.

        And never mind the "common practice".

  • smallstepforman 14 hours ago

    In 1979, the industry needed 'C with Classes'. It did not need whatever is required today. Hence the only viable path is the one we're on. Counterpoint: who is using Pony (the programming language) today? No one.

    • jkhdigital 13 hours ago

      7 years ago, my graduate distributed systems professor required everyone to complete his projects in Elixir because it was trending on HN. It was my first functional language and after getting over the initial hump I fell in love with it.

      Now, I’m teaching undergraduate courses of my own and, while I do not have the flexibility to change the languages used in my current offerings, if I ever start teaching a systems programming course I will absolutely require the students to use Pony.

    • pjmlp 4 hours ago

      In 1979 outside places like Bell Labs, we were mostly coding in Assembly, PL/M, BASIC interpreters and compilers, Forth,...

      No one cared about C beyond a few universities.

      • anthk 3 hours ago

        Then there's Object Pascal with classes, where it's ideal for RAD software (even from Lazarus) plus SQLite3 bindings. It would have been a good Java/C# alternative, but, you know, we are still being dragged back because of Java/C# and C++.

        All because Unix folks tried to create C with classes, or a crappy and bloated pseudo-Smalltalk OOP language to run everywhere (C++ under Unix/Motif, and Java from Sun).

        Meanwhile, TCL/Tk was good enough for tons of cases. If Sun had supported it instead of Java, we could have been using something as portable as Java but with a far lighter VM and requirements. SQLite3 would have been a given.

        Just look at the shitload of applications created with VB5-6 under Windows. TCL/Tk could've got aficionados from both sides and created something cross-platform and playable. Minecraft with GL/DX bindings automatically compiled from C would weigh far less...

        RAD applications? Tons of them for the office. No need for OLE for MS Office; it would have proper DB backends. Multimedia? I'm pretty sure SDL and FFmpeg bindings for TCL would have been born on the spot. Speed? All the improvements we saw for the JDK over a decade would have gone into TCL, and it would be on par with the experiments made for V8 in JS.

    • throwaway17_17 12 hours ago

      Curious as to why you chose Pony as your example of an unused language. Any specific reason, or was it just the first one you thought of that fit the sentence?

      I’m not sure if Pony is still being used, but the language was making some headway, at least on the PLT side of things. I know the inclusion of some of their reference-capabilities work (and practical implementation of prior research in the area) would be a benefit to greenfield programming-language design. I think they missed out by not going the process-calculus route, instead choosing actors, but overall I liked the direction.

  • throwaway38294 11 hours ago

    It's super weird to say that we need rvalue references. Rvalues, with their odd semantics, are only needed so as not to break compatibility with the existing reference/temporary rules. Instead, passing objects by move should be built into the language: each class should be moveable by default, with proper support in the language.

  • VerifiedReports 10 hours ago

    I can't see this title without recommending the movie Time After Time, starring Malcolm McDowell and Mary Steenburgen.

    The plot is based on the premise that H.G. Wells actually invents a time machine, and it's used by Jack the Ripper to travel to 1979 San Francisco.