Is this becoming a thing now (or maybe it was always a thing), where a language, like Rust, has become popular enough that instead of everyone talking about learning it, they now want to talk about how “simple” and “beautiful” C is for no other reason than signaling how different you are from the zeitgeist?
Rust exists for a reason, and it solves specific problems. That’s the “magic”, just like any abstraction in any language. So what’s the argument, that abstractions are bad? Clearly not:
> I also haven’t really experienced the problems Rust claims to be solving
This is like hearing someone say 20 years ago that “I’ve heard a lot of good things about PHP but I don’t see the point of it, because I’ve never had to write a web application that interfaces with a database” — well, no shit?
I'm a C programmer by day and I disagree with them: C is only simple if you're trying to do simple tasks with it.
The very common thing I want to use in C is some sort of variable size string object. But no, I have to dynamically allocate a buffer that I know will be at least the right size for any text I ever put into it, or do I create a buffer that's the correct size for that string but re-alloc if I ever change it to a longer string. But then how do I store the buffer size? Do I want to create a struct that contains a pointer to the buffer and the length, or use sizeof() to calculate the string length? But then I can't use sizeof() if I pass that buffer into a function via a pointer. If I pass that string to a function, is it being copied straight away, or is it just storing the pointer so I can't change the string at a later date? I can't enforce copy semantics.
And god forbid you ever forget to include space for the NULL
I just want a string I can dump some text in, I don't want to go searching for libraries, I don't want to have to consider allocation and copying and all that crap.
If I wrote half of my boilerplate C code in python it would look just as simple and beautiful (if not more)
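To spell out the sizeof() trap from above, a minimal sketch (not anyone's real code):

    #include <stdio.h>

    static void takes_pointer(char *buf) {
        /* buf is just a char* here; the array size is gone */
        printf("inside:  %zu\n", sizeof buf);   /* size of a pointer, e.g. 8 */
    }

    int main(void) {
        char buf[128] = "hello";
        printf("outside: %zu\n", sizeof buf);   /* 128: the whole array */
        takes_pointer(buf);                     /* array decays to a pointer */
        return 0;
    }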
>I'm a C programmer by day and I disagree with them: C is only simple if you're trying to do simple tasks with it.
C gives you enough rope to shoot yourself in the foot. And rightfully so. It came out in a time when everyone was coding assembly. It's meant not to hold you back from doing voodoo with low-level stuff, therefore it won't hold your hand.
Not very practical in the world of today when we've been spoiled by 'better' languages and you need to quickly ship stuff that mostly works without worrying about the little things, but at the time it was revolutionary.
This was before my time, but I think it's a common misconception (only true for operating system development). When C was created, there was already Lisp, Cobol, Fortran, Algol, Simula, BASIC... and Smalltalk and Prolog were just around the corner - and most of those are much higher level than C.
Sure, but none of those were for systems programming, which is squarely the domain that C was aimed at. Case in point: the first thing that C was used to write was UNIX (before that it was assembly plus B, a cut-down BCPL derivative, and this was iirc before C even had structs which made that a very tricky job; once structs were in place it got a lot easier). Probably Don Hopkins has more knowledge about this.
> before C even had structs which made that a very tricky job...
...interesting that you mention that, I think that functions and structs are the essential 'core abstraction tools' that get you to at least 80% of any higher level abstractions that were invented since then, and this is exactly the reason why C is still quite popular. Its feature set is just enough to be considered a high level language which enables abstractions, but not more (especially no fads and fashions that came and disappeared again).
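To make that concrete, a minimal sketch (the names here, like logger, are made up for illustration) of how far a struct of function pointers plus a context pointer gets you towards "interfaces" and dynamic dispatch in plain C:

    #include <stdio.h>

    struct logger {
        void *ctx;                               /* per-instance state */
        void (*log)(void *ctx, const char *msg); /* the "virtual" function */
    };

    static void log_to_stream(void *ctx, const char *msg) {
        fprintf((FILE *)ctx, "%s\n", msg);
    }

    int main(void) {
        struct logger log = { stderr, log_to_stream };
        log.log(log.ctx, "hello from a hand-rolled interface");
        return 0;
    }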
I think parent knows. A more accurate description would be that everyone in the intended audience was writing assembly. Yes, there are other languages of higher levels, but C was not invented to help their users. And since they are also not really what Rust targets either, IMO it's reasonable to drop the qualifier in this context.
Undefined Behaviour was a very late 'addition' to C; it only became necessary when C was standardised around 1990. And only after two more decades passed, UB became an actual problem when compiler vendors decided that it's fine to exploit it for optimization tricks.
“Decided that it’s fine to exploit it for optimisation tricks” is a poor characterisation. The reality is, if you define particular behaviours you will harm performance in some cases. If you define how something in particular should happen, then all architectures will need to implement that, regardless of their underlying semantics.
E.g. C leaves signed integer overflow undefined. In most cases it has a predictable effect on modern, mostly similar architectures, but that is by no means guaranteed, and forcing an architecture to calculate overflow a particular way seems like a negative.
That being said, everyone has a pet example of a compiler doing some really odd and deep optimisations - I suspect that’s mostly due to successive layers and optimisers adding up to have unexpected effects, rather than a deliberate effort by compiler writers - but I’m no expert on the matter.
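A minimal sketch of that trade-off (assuming a typical GCC or Clang at -O2; exact behaviour is compiler-dependent):

    /* Because signed overflow is undefined, an optimiser may fold the
       comparison below to "always true" instead of emitting an add and a
       compare; the unsigned version, by contrast, must wrap. */
    int always_greater(int x) {
        return x + 1 > x;   /* typically compiled to "return 1" */
    }

    unsigned wraps_around(unsigned x) {
        return x + 1;       /* defined: wraps to 0 at UINT_MAX */
    }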
> UB became an actual problem when compiler vendors decided that it's fine to exploit it for optimization tricks.
Section 4. Conformance says "A strictly conforming program shall only use those features of the language and library specified in this International Standard. It shall not produce output dependent on any unspecified, undefined, or implementation-defined behavior, and shall not exceed any implementation limit."
Compilers are not allowed to produce output dependent on UB for strictly conforming ISO C programs; they must optimize those statements out. Treating UB as impossible is required for ISO C. It's NOT required for GNU C, or Clang C, or Microsoft Visual C, but they usually do so anyway (even though they're not compiling strictly conforming ISO C programs).
Did you miss their point? They merely used UB, correctly, as a term we all do recognize today.
I will utterly kill all humor by explaining it:
They made a funny observation about a thing that happens. The thing that happens is (today) called UB. The funny observation is that that comment kind of exhibited the outward appearance of what the effects of UB could look like.
It began reciting one metaphor, "enough rope to hang yourself", but mid-way unexpectedly switched to a different metaphor, "shoot yourself in the foot", producing a combined, invalid, nonsensical output. As though a program suffered some UB in the routine for looking up and printing metaphors.
The comment author might have done it on purpose. Maybe they intended to make exactly that joke.
The history of the term UB has no more bearing than the history of any of the other words used.
>The very common thing I want to use in C is some sort of variable size string object. But no, I have to dynamically allocate a buffer that I know will be at least the right size for any text I ever put into it, or do I create a buffer that's the correct size for that string but re-alloc if I ever change it to a longer string. But then how do I store the buffer size?
As a C programmer shouldn't you have a library abstracting all this by now? Either your own or one of the dozens available, including Pascal-style strings?
Okay, so you write your own string library. Now you'd like to do the same thing for resizable arrays, so you write a resizable array li… oops, you can't, because C doesn't have parametric types.
He didn't say "write your own library". He said "use" one (your own if you prefer). Or are you going to suggest there are no good string handling libraries for C?
It doesn't matter. My point is that it's not possible, not for you and not for anyone else, to implement a type-safe generic resizable array library in C.
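For context, the workaround people usually reach for is a macro that stamps out one monomorphic type per element type (DEFINE_VEC below is a hypothetical name, sketched for illustration); whether that counts as a "type-safe generic library" is exactly the point of contention:

    #include <stdlib.h>

    #define DEFINE_VEC(T, Name)                                              \
        typedef struct { T *data; size_t len, cap; } Name;                   \
        static int Name##_push(Name *v, T item) {                            \
            if (v->len == v->cap) {                                          \
                size_t ncap = v->cap ? v->cap * 2 : 8;                       \
                T *p = realloc(v->data, ncap * sizeof(T));                   \
                if (!p) return -1;    /* keep the old buffer on failure */   \
                v->data = p;                                                 \
                v->cap = ncap;                                               \
            }                                                                \
            v->data[v->len++] = item;                                        \
            return 0;                                                        \
        }

    DEFINE_VEC(int, IntVec)   /* usage: IntVec v = {0}; IntVec_push(&v, 42); */

Every instantiation is its own copy-pasted type, and mistakes surface as preprocessor noise rather than type errors, which is rather the point being made here.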
My C knowledge is obviously old. I wonder if the author is lamenting the lack of 'modern' string manipulation in the standard library, beyond just working on char buffers?
It's funny how both C and LISP programmers seem to suffer from NIH to the point that they'll roll their own just for the heck of it rather than to first see if there is a library that they can use.
The long-term cost of those decisions, as well as the number of really bad bugs (and security issues) that can be traced back to one-off code, is likely much larger than the same figure for well-used libraries. But it all sort of evens out whenever a bug in such a library is found, because then it is so widespread that lots of systems will suffer.
>It's funny how both C and LISP programmers seem to suffer from NIH to the point that they'll roll their own just for the heck of it rather than to first see if there is a library that they can use.
Well, not sure about the LISP programmers, but C programmers have a good reason: they work under different environments (from embedded to Windows, legacy UNIX, the latest Ubuntu, ...) and also have different needs regarding allocation, string management, etc. So a one-size-fits-all lib might not cut it for everybody. It can also be as simple as having an inherited codebase which uses something else.
Still, there are popular string libraries, and C programmers do use them when they can.
I don't even know where to start with C++ stdlib string problems, but being mutable and doing a unique heap allocation (above a certain length, a threshold which isn't even standardized) are definitely at the top of the list.
std::string_view would have been a good thing if it hadn't added another memory corruption foot gun.
A universal string type is one of those things where you can either have convenience or performance, but never both.
If all you do is write code on Linux/Windows/MacOS, C++ strings might be fine. Things are different in the wider world. Many places I use C don't even have a C++ compiler (embedded, in particular).
They literally cannot. For one, neither C++ nor Rust conforms to standard C, meaning they already deviate from what C does on a basic level.
Their industry uses are highly different as well. C is almost a requirement for embedded systems, and while C++/Rust can be used there, they’re simply too complex and in Rust’s case additionally too young to be adopted. C++, and possibly Rust (if it can get its act together), are more used in high level programming, as that’s what they’re built for.
If C++ could do what C can, then why is the Linux kernel 98.5% C code? Wouldn’t it be better to use a more varied and powerful language?
Or, maybe different languages have different use cases and cannot be directly slotted in to replace one another.
What does language complexity have to do with anything? It will get compiled down to machine code, and both can be and are used for embedded. They occupy the exact same low-level niche as C; hell, they may be even more low-level, as they can also do things like SIMD.
The Linux kernel is C because Linus doesn’t like C++; it’s that easy. And usually no, why would you use multiple languages in a project if you don’t have a good reason?
It is quite difficult to write idiomatic C++ without memory allocation, for one, which is often a requirement for embedded code or any high-availability code that is not allowed to have memory fragmentation.
Sure, it's possible (you can write C in C++, for the most part, and to its credit, C++ has placement new). But many of C++'s niceties require memory allocation, making much of its value-add over C questionable.
C++ compilers shipped their own collection libraries; for Borland compilers it was BIDS.
The first version used preprocessor macro tricks for generating code, basically what the Go folks re-discovered with //go:generate; we already had it in 1992.
BIDS 2.0 already used templates.
It was exactly because even C++ for MS-DOS provided a safer library than C that I was never that into C. C wasn't better than TP (Turbo Pascal) in features and safety, only in portability, and that I could get from C++ anyway.
That's fair! Perhaps I should amend to say that "current" idiomatic (almost got autocorrected to idiotic on my phone) C++ code is allocation-heavy. Probably there exist suitable libraries I don't know about that basically implement the STL using predefined arenas for storage.
Linux isn't written in standard C. Not only does it not accept C's aliasing rules (hence the kernel is compiled with them disabled, which the standard doesn't offer), it doesn't even accept the memory model, because it had its own memory model first and it likes that one better.
As a result to some extent GCC and Clang are also compilers for some sort of "Linux C" which is strongly reminiscent of the ISO standard language but distinct.
And there's no reason you would choose C++ for a project where you'd otherwise use C on account of the (exaggerated) relationship between the two languages, the reason you'd do it would be that you want C++ features, and Linus doesn't want C++ features. Projects to "just" compile the existing Linux code but with C++ compilers failed AFAIK.
Union type punning is actually entirely valid in C; it's only UB in C++.
(I agree though that the C standard isn't actually relevant in the real world, most C compilers treat it more or less as a 'suggestion', unless enforced with options like '-pedantic')
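For the record, a minimal sketch of the union punning being discussed (widely relied on in C99 and later, UB in C++):

    #include <stdint.h>
    #include <stdio.h>

    static uint32_t float_bits(float f) {
        union { float f; uint32_t u; } pun;
        pun.f = f;
        return pun.u;   /* reading the "other" member is allowed in C */
    }

    int main(void) {
        printf("0x%08x\n", (unsigned)float_bits(1.0f));   /* 0x3f800000 */
        return 0;
    }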
struct strbuf { size_t cap, len; char *str; };
void sb_setf(struct allocator *a, struct strbuf *sb, const char *fmt, ...);
void sb_appendf(struct allocator *a, struct strbuf *sb, const char *fmt, ...);
// have other convenience functions for formatting fixed point values like "prefix AAA.BBB suffix" ("voltage: 7.23 V")
// special helpers for dates, times, etc.
Just keep building that library up and you'll have a growable buffer, strings, lists, a hashmap (uintptr -> uintptr is all you need in 99% of cases I've found, maybe with some helper functions for string key -> void * built on top of uintptr -> uintptr), plus replace/rewrite the standard library to operate on these types instead, and you're good to go.
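For illustration, a rough sketch of what an append function along those lines might look like, with the custom allocator parameter dropped in favour of plain realloc and an int return value added for error reporting:

    #include <stdarg.h>
    #include <stdio.h>
    #include <stdlib.h>

    struct strbuf { size_t cap, len; char *str; };

    int sb_appendf(struct strbuf *sb, const char *fmt, ...) {
        va_list ap;
        va_start(ap, fmt);
        int need = vsnprintf(NULL, 0, fmt, ap);   /* measure the formatted length */
        va_end(ap);
        if (need < 0) return -1;

        if (sb->len + (size_t)need + 1 > sb->cap) {
            size_t ncap = sb->cap ? sb->cap : 64;
            while (sb->len + (size_t)need + 1 > ncap) ncap *= 2;
            char *p = realloc(sb->str, ncap);     /* grow geometrically */
            if (!p) return -1;
            sb->str = p;
            sb->cap = ncap;
        }

        va_start(ap, fmt);
        vsnprintf(sb->str + sb->len, sb->cap - sb->len, fmt, ap);
        va_end(ap);
        sb->len += (size_t)need;
        return 0;
    }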
That works until you want to use code from somebody else who also has something like that that serves them well, but is slightly different.
If you’re extremely lucky, things will compile and work.
If you’re just lucky things won’t compile, and you’ll have to write conversion functions (or macros).
If you’re unlucky, there will be subtle differences, likely poorly documented, between the libraries, and code will compile but have subtle bugs.
Of course, other languages have that problem, too, but at the higher level of json parsers or graphics libraries, not at the basic level of strings, lists, or maps.
But then you're back in "having to pass the length as a separate parameter" territory, I guess. (but at least it's the length of the array here, not just the zero-terminated component, which is what you wanted).
I've been a programmer since the early 90's, but I have only ever seen this problem with Rust so far (where people are eager to learn something new, pick Rust, and then quickly get jaded - case in point, that's also me - I learned enough Rust to write a home computer emulator but in the process realised that it is not the right language for me, even though it should be from its feature set).
It feels a bit like a speedrun of C++; at least that took around two decades for people to get fed up and turn away ;)
(and interestingly, I also switched back to 'mostly C' for my hobby stuff, and I'm quite happy with it)
I've felt that way too, and it's been enough to push me away even as I've tried to build things in Rust in earnest. Along the same lines, I've found it hard to say exactly why I like C and super dislike C++. I guess I have to say it's simplicity--like I won't argue C is by itself simple (integer promotion by itself is not simple) but it's definitely simpler than C++, and the simplicity and power of its core conceit (everything is a number) is just enrapturing. I think it's just an ethos thing: Rust doesn't strive to be simple and I find that makes it impossible for it to delight me.
I also think that's broadly why newer languages have failed to capture the je ne sais quoi of C: you really can't get away with "everything is a number" these days.
I don't think C consistently lives up to this principle:
- In the memory model, even simple integers can hold "poison" values.
- Pointers usually behave like integer addresses, but in the memory model they have "provenance" (edit: spelling), and they also have to follow "strict aliasing" rules.
- Signed integer overflow is UB. We could ignore integer promotion rules most of the time, if not for this restriction.
- Even simple integer assignment isn't simple when an integer is shared between threads. Atomic orderings are hilariously complicated.
I worry that a lot of people who find delight in C just...aren't aware of these rules? Or maybe aren't consistently aware? Or maybe are aware but think that some violations are benign?
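As a concrete illustration of the aliasing/provenance item above, a small sketch (typical compilers emit the same code for both functions, but only one is defined behaviour):

    #include <stdint.h>
    #include <string.h>

    uint32_t bits_via_cast(float *f) {
        return *(uint32_t *)f;     /* UB: reads a float through a uint32_t lvalue */
    }

    uint32_t bits_via_memcpy(float f) {
        uint32_t u;
        memcpy(&u, &f, sizeof u);  /* the well-defined way to inspect the bits */
        return u;
    }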
Oh that's what I was nodding at with my "I wouldn't call C a simple language" comment. You are 100% correct. I assume--maybe wrongly but I don't think so--that the natural progression of the C programmer is:
- whoa cool everything is a number! Make that light blink, wipe that SDRAM chip, whiz bang!
- What the fuck is a torn read (insert any C gotcha in here)?! Everything is garbage!
- I know, I'll encapsulate "The right way to do things" in a library/new language.
- Never mind, I've decided to build websites (insert popular tech job here) for a living, but be super grouchy about it
The corollary to "I think Rust's complexity makes it impossible to delight me" is "I think C's brittleness makes it impossible to delight me." It has notes of innocence lost, nostalgia, a "simpler time", etc. Are those days gone forever, as the Dan say? Dunno.
We're very aware and try to shield ourselves somewhat with compiler options (e.g. max warning level already goes a long way), sanitizers and analyzers (thankfully availability of such tools has improved dramatically with clang's ASAN, UBSAN, TSAN and the clang static analyzer).
(And actually: yes, some rule violations are benign if the major compilers agree on the same non-standard behaviour; so far I have never seen unions used for type punning break in C++, for instance - it's good that C++ now offers a 'proper' alternative though.)
It's pretty much an illusion that any non-trivial C or C++ program can be entirely standard compliant, it always depends on the specific compilers it has been tested with - which is still a better situation than Rust, which only has a single implementation (so far).
Yeah it is bonkers to me that people don't turn all the stuff on. There's a lot of help out there that is basically free. It's worth taking an hour or so to go through all the checks--if nothing else it'll probably make you wise to some new footguns.
While I agree, I also have to say that Rust does a lot more than just solve some specific C and C++ issues. It doesn't solve all of them (e.g. logic errors, so if that's 90% of your issues you won't benefit too much), it repeats some design issues of C++ (massive complexity from the start, many ways to do the same thing), and it implements a C-like unsafe{} language anyways.
If Rust was just C but with strong typing, a borrow checker and what would basically be super strong static analysis of pairing malloc() and free(), people would likely switch immediately.
But it isn't - it's a whole different beast which is far from perfect and repeats many issues it didn't need to repeat (e.g. terrible async, like Python).
It's just so good at having a compiler that tells you what's wrong that most of the other things are not that bad, and switching to Rust is likely good for 80%+ of C projects.
> If Rust was just C but with strong typing, a borrow checker and what would basically be super strong static analysis of pairing malloc() and free(), people would likely switch immediately.
This is in a nutshell why I haven't switched to Rust. All I actually want is a small language like C, Go or Zig, but with compile time memory safety guarantees. Even if it means that a lot of 'dangerous' flexibility is removed or in an unsafe{} block (essentially a Rust--).
IMHO one problem with Rust is that it is moving too quickly into too many different directions, and as a result becoming a 'kitchen-sink language' in the tradition of C++.
> just C but with strong typing, a borrow checker and what would basically be super strong static analysis of pairing malloc() and free()
Most of Rust's features interact in ways that aren't obvious. For example to get the basic memory safety guarantees that the borrow checker provides, you also need:
- the "no mutable aliasing" rule
- destructive move semantics and the Copy trait
- generic containers like Mutex and (probably?) generic enums like Option and Result
- thread safety traits like Send and Sync
- closures, and closure traits like FnOnce
Of course yes, you can have "safe C" without async, and Rust 1.0 shipped without async. But I think it's notable that The Book doesn't teach async. Most of the things The Book teaches are actually necessary for memory safety to work.
It doesn't matter what The Book teaches - for example, a majority of Rust's web-facing ecosystem uses some async, which means
1) tokio, which uses unsafe{}, and/or
2) async functions requiring all calling functions to also be async
So, really, you can't avoid it. The ecosystem is built on the idea of NIH, which would be fine if it weren't for so many Rust features you can abuse so heavily (e.g. macros to make your own language that I then have to learn).
There are a lot of issues with the complexity Rust brings.
IMHO a 'rusty C' wouldn't need to cover the entire memory safety feature set that Rust provides, it just needs to be 'close enough' and otherwise warn in areas where the compiler isn't entirely sure. It can also do some checks at runtime at the cost of a slight performance hit (but again, in old C tradition this should be controllable by compile options).
> But it isn't - it's a whole different beast which is far from perfect and repeats many issues it didn't need to repeat (e.g. terrible async, like Python).
Not gonna lie, I don't see too much value in async, but that said, I can see where they (the Rust devs) are coming from. It was an oft requested feature, and it was in the pipeline for ages, leading to Rust dev burnout.
That said, the Rust team is working on making it fully usable, albeit the space of languages with efficient, zero-cost closures that work with lifetimes is a set of one: Rust.
Only when a certain language community tries to promote its language as a “better” C, insists that all C code should be converted, and then some people actually try it and don't agree.
Just write programs with your favorite language; if it is actually a “better” one it will win, and you won't annoy yourself and others by over-promoting it. C did not start by building an evangelical strike force, but with a bunch of programmers who actually wrote software that people have to use.
>they now want to talk about how “simple” and “beautiful” C is for no other reason than signaling how different you are from the zeitgeist?
Given that this has been an argument for 3 decades or so, e.g. compared to C++ especially, I don't think so.
Even more so since C is not just some trendy language you pick up quickly, but needs quite a lot of time and effort to be proficient in, to the point of appreciating its simplicity and portability/stability/etc. benefits.
>Rust exists for a reason, and it solves specific problems.
Rust exists because its creators had a reason such a language was needed in mind. Doesn't mean others necessarily share it, or if they do, that they see Rust as the solution to the problem behind that reason.
I think the main mistake that the Rust people made in their initial promotion of the language was to position it as 'a replacement for C'. No programming language has ever replaced another completely; it always ended up as just another language with its own trade-offs and quirks. This even holds within a single language ecosystem if it isn't managed carefully; Python comes to mind.
> they now want to talk about how “simple” and “beautiful” C is for no other reason than signaling how different you are from the zeitgeist?
It sounds to me like they keep learning languages with the same illusions and failing to take any lessons between their language exploration escapades.
All programming languages suck. It's just about finding the one that sucks the least for you (or your project/business).
They made a comparison to RPG characters and mentioned endgame frustrations. In my experience, C is the epitome of endgame frustrations (C++ maybe being worse, depending on the codebases you work on); they just have yet to discover that.
> All programming languages suck. It's just about finding the one that sucks the least for you (or your project/business).
Very true, but C is the standard language on *nixes, so it's often the one that sucks less. It also strikes a good balance between mental burden and expressiveness for writing simple programs that aren't so trivial that they're just a shell script or one-liner. Rust is just too ugly and verbose to make it what I reach for to quickly hack out a throwaway utility with.
Disagree on C++. Even if you forgo the significant improvements of the last decade, RAII and templates alone make it worth not using C anymore. Having to use shoddy macros for a resizable array is just unreasonable.
I've written C++ for a long time, and I do agree that the language brings certain improvements over C. I think most people will agree with that. However, C++ can be incredibly frustrating to work with depending on the codebase.
Overall, C++ can be a great choice, but it often ends up not being one, due to a lack of rigorous discipline by everyone involved in the project.
It's a bit philosophical whether that's a problem of the tool or the craftsman, but I think it's usually a bit of both.
Agreed that it's a monstrosity, but the reality is that much of that monstrosity is hiding subtle background issues that exist in C that people don't talk about. Look at the implementation of std::vector, compare it against most of the home-rolled macros that (dangerously) wrap calls to realloc, and tell me which one is the monstrosity! Personally, I'm glad someone else wrote vector and that I don't need to handle that.
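The home-rolled realloc pitfall being alluded to, as a sketch (hypothetical helper names; both assume *cap starts greater than zero):

    #include <stdlib.h>

    int grow_bad(int **items, size_t *cap) {
        /* if realloc fails, this overwrites the only pointer to the old
           buffer with NULL: the data is leaked and lost */
        *items = realloc(*items, *cap * 2 * sizeof **items);
        if (!*items) return -1;
        *cap *= 2;
        return 0;
    }

    int grow_ok(int **items, size_t *cap) {
        int *p = realloc(*items, *cap * 2 * sizeof **items);
        if (!p) return -1;         /* old buffer is still valid and reachable */
        *items = p;
        *cap *= 2;
        return 0;
    }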
I agree. C is absolutely terrible to use for anything that's not trivial. It may have a simple spec, but you pay dearly for it in code complexity. Its abstraction capabilities are only marginally better than writing straight assembly.
As someone who should like the proposition of C, after having read a few books on it I decided to avoid the language if I can. Read: I wanted to like the language, but the deeper I went into it the more I realized that the absolute knowledge of everything needed to wield it is not something that is worth acquiring.
I understand most of the historical reasons why certain things in the language are the way they are, but some of those result in really bad ergonomics and counterintuitive behavior.
So simple and beautiful is not how I would describe C even if I would love it to be that way.
If there's a "hipster" notion here it's not that of choosing C as a rebellion against Rust, but that choosing C constitutes a rebellion against Rust and only Rust, as if it's the only systems programming language worth its salt.
E.g., explain to me like I'm 10 why you think Rust is better than, say, D.
It seems to me like D is like if you took the most complicated language in the world, C++, and did even more stuff to it, like adding in an optional garbage collector. One reason it's so complicated is that "low level" programming plus OOP seem to be a terrible mix. D is trying to improve a messy room by re-organizing some things and introducing a Roomba.
Rust blazes a whole new path, with the slightly different objective of safety. It's simple and lean, almost like C. It bakes in lessons learned over the years like inheritance being overly complicated, OOP generally being overly complicated, and how dangerous systems programming can be by having modules instead of classes, composition instead of inheritance, and memory safety as a default. It also broke the false dichotomy of "fast but dangerously leaky" or "slow but safely garbage collected" by introducing borrow checking and move semantics, allowing speed and safety with no garbage collection.
D feels like a continuation down an evolutionary dead end, like teleputer cartridges in Infinite Jest.
Rust feels like an innovative fresh start, like transferring digital DRM-free files.
(If you think that makes D sound charming, I agree. But realistically I would stick with Rust. :p)
I started a side project in C earlier this year, and now I'm a little upset at the prospect that doing that is going to be seen as some hipster niche. I felt the same way when everyone started growing beards.
These people (with rare exceptions) quite funnily never seem to have shipped some actual product or service based on those "simpler" languages. Or if they have it's shipping their parallel implementations of basic services that every language does better than C like strings, slices, data structures, etc. And no, if your code depends on cpp macro magic, it is not helping your point.
I've been learning Rust lately. Working on a little game. First few thousand lines of code in I was having some doubt. After being a little frustrated with borrows and lifetimes and with my strong tendency to prematurely optimize all the things I was soooo tempted to just switch to C and have full unfettered access to my sweet sweet pointers.
I'm glad I haven't switched though. Performance (which I'm measuring a ton of) is great. The situations where I wanted to bow out and take the 'easier' path were mostly bad design choices on my part and I think partly just laziness.
Rust forces me to think a little bit more, and I've really enjoyed learning about how/why the various language features were designed. Now I'm one of those damned Rust evangelists that have been annoying me for years =)
My reaction to that has always been that it forces us to think about the things we would have had to think about anyway if we wanted to create reasonably reliable and secure software. So in the end it saves work, even if it doesn't appear like that at first.
The other side of the medal (especially for game development), is that in the higher level parts of game code you need quick turnaround times for experimentation and tweaking, and most of that code won't even make it into the final product (so all the upfront time you spent thinking about proper architecture and ownership details is wasted up there).
The traditional solution is to have different languages for different code layers (e.g. a compiled language for the low level parts, and an interpreted scripting language for the high level parts), but this comes with its own set of problems.
Don't games work by having an engine which doesn't have a lot of rapidly changing user-facing features (that could be Rust), while the engine provides a DSL that is the high-level, user-facing, rapid-change thing?
To counter myself, I could also easily imagine that game engines are full of dirty hacks to get things that the front end people want done, now, by hook or by crook. I.e., if a nice sane interface doesn't exist, and would take more than 11 minutes to add the right way, then just butcher in any kind of tight coupling and to hell with worrying about it breaking for the next game.
Oh no, I think you made a good point. I was definitely thinking about game code per se, rather than engines. The extra rigour around an engine (any library code, really) is definitely a good thing
> The traditional solution is to have different languages for different code layers
Isn’t the rust solution to wrap Q&D/experimental code in “unsafe {}”? I think that’s superior for the parts where you don’t need the fast iterations of an interpreter.
Not sure, I haven't tried Rust for gamedev stuff. But I think unsafe{} is still very picky. It just removes some borrow checker restrictions, but still requires a lot of 'correctness', and I guess it also doesn't magically improve compilation time.
One other 'modern' solution is to use a compiled language for the whole 'stack', but with a hot code reload solution, so you can change the code on the running game - which also tremendously reduces turnaround times.
I really like Rust, but it feels a bit rigid for game programming. I'd probably still use it, because Bevy, but I'd definitely push a lot of logic into a scripting language.
I'm not personally familiar with it, but I think you want to take a look at Elixir. I've heard it often in the same vein as Rust but for web. Not exactly the same, but in terms of a high-quality language.
Don’t expect to see immediate benefit from Rust by writing a small Pong application. Rust gives you confidence at scale: the ability to depend on many 3rd party blocks without compromising stability or performance, the ability to grow and maintain the code, collaborate on it, etc. Nothing of this is easily seen on a small self-contained program you can write in C.
I still cannot figure out how to easily write small applications with Rust on low-resource computers, like the one I'm typing this from. (Limited storage, CPU, memory.) It seems even the smallest programs require a massive toolchain. The default reliance on a network connection for compilation is off-putting. It's vastly easier for me to write small programs, offline, with C.
The rationale for using Rust over C that I see published the most is "memory safety". But I can write small C programs for text-processing using flex that do not manually manipulate memory. What benefit would there be to write them in Rust.
Why would you write small apps in rust? Just use go. It's perfect for < 10,000 line programs.
I've written more in C than in any other language (and I've written in dozens of languages professionally over the past 25 years). But the one thing that gets hammered home with every line of C I write is this: You're playing with a loaded gun.
The more code there is, the more likely it is that you have subtle heisenbugs that break in mysterious ways. My current job requires C (for a number of good reasons), and I really notice the jump in crazy, hard-to-track bugs despite my DECADES of experience as a careful, expert C programmer.
C is "simple" because all of the complexity gets pushed to the compiler behaviour and runtime environment. You don't even discover how many assumptions you've been making until you have years of experience under your belt.
That's a point I was trying to make multiple times - that C (apart from O/S code and drivers) is used for small command line utilities and language runtimes of higher-level languages. Nobody (except maybe game devs) wants to develop fat application server binaries or other long running server apps in C++ or other non-GC'd environments (due to memory fragmentation issues alone if not other things). And for a language runtime, you can't use Rust's memory safety guarantees either. This leaves Rust with what it was originally invented for - browser engines and fat desktop apps.
Lots of people write complex, latency dependent code.
Previously they would write in something like C/C++ or use the JVM but try to avoid all garbage collection events through very carefully avoiding allocations, etc.
There's a ton of big productivity desktop applications written in C++ (which almost always also includes a significant amount of C code in 3rd-party libraries), pretty much all 3D modelling/animation tools, Photoshop, browsers...
Also: memory fragmentation is only an issue if you don't have a proper memory management strategy (which you absolutely need in any non-trivial code base, even in GC'ed languages).
It’s pretty reasonable to argue that were these things written again, less than 100% of it would be in C[++]. It’s also pretty reasonable to argue that that percentage will be even smaller 5 years from now.
Sure, but that's a theoretical scenario, because these types of applications aren't being written from scratch anymore. Anything that's relevant in those areas was started in the early to mid 90's (even the 'newcomer' Blender dates back to 1995).
...which unfounded assertion would that be? As far as I'm aware there are no contenders which would even attempt to throw tools like Maya, 3DSMax, Blender, ... from their throne. And all those tools hail from the 90's.
The only exception in recent time which gained some traction might be Figma for 2D design, but even though it runs in browsers the important parts are also written in C++.
Er the modern C stack has basically the same components as the modern Rust stack: a language frontend (say Clang or rustc) interfacing with LLVM or GCC. You could use an alternate C compiler from these two, but, really, why would you?
What the hell does it have to do with “low-resource computers”? Are you using a feature phone or what? Otherwise I honestly can't see how your hardware would be the bottleneck (I assume your problem is compile times).
Rust’s compiler is slightly slower than many other languages' (as it simply does more), but not to a degree where you would need some monster machine.
I used to love C for its simplicity. There were just no surprises, and the limited feature set enforced a certain programming style that also happens to run very well on modern CPUs.
But the C standard library is just awful. It's so inconsistent and full of quirks you just have to know. Like how some string functions allow you to specify a size, while others don't. And how strtok keeps track of an internal state and behaves differently on subsequent calls.
I wish there was a language that as simple and limited as C, but with modern (and portable) functions for things like strings, networking, graphics and so on.
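To illustrate the strtok quirk mentioned above, a minimal sketch:

    #include <stdio.h>
    #include <string.h>

    int main(void) {
        char line[] = "a,b,c";   /* must be writable: strtok overwrites delimiters with '\0' */
        for (char *tok = strtok(line, ","); tok; tok = strtok(NULL, ",")) {
            puts(tok);           /* prints a, b, c on separate lines */
        }
        /* nesting a second strtok loop in there would silently clobber the
           hidden internal state; strtok_r (POSIX) exists for that reason */
        return 0;
    }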
This is very painfully true, but one 'killer feature' of C is that it is useful without ever using stdlib functions (except basics like memset, memcpy, ... which can be considered compiler builtins anyway).
In more recent languages (even C++) there is no such clear distinction between the language and stdlib any more, which IMHO is a real problem (e.g. most of C++'s problems are actually stdlib problems, not language problems).
Because it was designed with the UNIX API surface in mind, which is also the reason why, outside embedded and Windows, everyone with a C compiler does POSIX, even crufty mainframes that are still being sold.
It is as much the case that modern CPUs have been designed to facilitate C semantics. C was designed for the CPU initially, but a lot of generations have happened since then, there have been attempts to try other things, and by now the CPU is designed for C to run well on it.
This is how I feel about C++. It's enjoyable to use!
I'm happy to use Python or JavaScript or something else when appropriate, but coming back to C++ is like sitting on your porch, enjoying a cool breeze and relaxing after a hard day of juggling magic.
I do have an occasional cozy feeling about the time when I was writing VBA for a French bank lol.
I'm thankful that time is gone, but! VBA is simple, and may be enjoyable to use!
> There’s just so much “magic” happening in each that it’s difficult to figure out what’s going on. When I’m writing code in these languages, I realize my brain is effectively writing pseudo-C code and then transpiling to whatever language I’m working with.
I feel the same way about these other languages, but for kind of the opposite reason! I write some code in the new language and then transpile it to C in my head just to see what I expect the CPU to be doing. I'm going through SICP right now and I find myself fighting the urge to imagine `cdr` as dereferencing the `next` pointer of a linked list. I spend a lot of time worrying about this stuff when writing other languages, probably to the detriment of my productivity.
Other than assembly language, all languages embody abstractions.
I find understanding certain abstractions hard going because they don't mesh well with my view of the problem domain and my design for the solution's implementation.
Actually, x86 assembly also embodies plenty of abstractions. Not sure on ARM or RISC-V, but x86 assembly instructions are not executed the way you'd think by any modern processor.
Essentially the processor reads a whole bunch of instructions at once, splits each into raw micro-instructions, arranges them in a graph of dependencies, and then solves an optimization problem to find the best way to schedule nodes from that graph onto its internal execution pipelines. In real-world execution, you can't even tell which assembly instruction(s) are the ones being executed at a particular time; disparate parts of various instructions are in flight at once. Some instructions, like "mov ax, 0", don't even execute: they just serve to mark which of the tens of real registers is now free to use for another symbolic register like bx.
Scheduling and implementation are irrelevant to the complaint being made about high level languages.
No matter what the CPU does internally to arrive at producing the requested output from the supplied instructions, it DOES produce exactly the requested and expected output from the given instructions.
What it does not do is, for example, make a + operator mean something different after some unknowable prior step changed the definition of +, or flatly refuse to provide a means to manipulate some data in a way that a language author thought was crazy and that no one could ever have a valid reason to do, like, idk, executing a string or something. Sure, there are now optional settings and features you could consciously use, for example to enforce that data/exec separation, but it doesn't just do it by its own magic according to someone else's rules instead of your own code.
This is silly. Assembly is an abstraction too. Do you think the CPU actually runs the code you give it? Of course not. There's microcode, register aliasing, speculative execution, etc.
Even assembly itself has symbols and labels, which are themselves abstractions.
Did you really not understand the point the GP was trying to make or is this an attempt to be clever?
Of course assembly is an abstraction, but it is the lowest level programming language that ordinary mortals can still write code in, which was the point the GP was making.
Exactly nobody writes applications in microcode. Under normal circumstances, register aliasing and speculative execution have no effect other than on performance; if the CPU just did as it was told by the assembly code (or actually, the machine code, the binary representation of the possibly optimized assembly), everything would still work exactly as advertised. You can also switch those off if they're features of the assembler, and if the CPU does them, you're going to have to live with it.
If you really wanted to make the point that Assembly is an abstraction then macros would have probably been a better thing to mention.
If you’re trying to understand the behaviour of the computer then all of the layers of abstraction matter, right down to the logic gates. Spectre and Meltdown proved that software developers can’t just “trust the CPU to do the right thing.”
Besides the issue of security, performance is also a thing. If you’re trying to squeeze every last cycle out of your program then you need to understand CPU cache hierarchies at the very least. That’s at a level far below assembly language, digging down into the physical layout of the machine.
I'm well aware of all of those levels. But they are not germane to the discussion about programming in general. They are important if you are doing some really specialist work (low latency, extreme performance, operating system design). But that wasn't the context at all.
Child, assembler mnemonic representations of opcodes are not hiding anything.
You could make the same, useless, argument about the raw binary. If you looked at an executable you would still actually only be looking at a transcoded representation in ASCII in an editor. The CPU doesn't actually know what 0A is; those are glyphs for numbers and letters in a human language.
Assembler mnemonics are not materially different, and even macros don't change this, because the macros are macros, built out of other visible assembler, not hidden magic.
You are not conducting useful argument or communication with this silliness.
> assembler mnemonic representations of opcodes are not hiding anything
They do hide things. There are often multiple ways to encode a line of assembly into machine code. For example on x86, JMP can take an 8-, 16-, or 32-bit displacement, and the assembler will usually select the shortest encodable variant. Some instructions have a shorter encoding for certain registers, like ADD $1, %eax. You can even add redundant REX prefixes and still get the same instruction.
The difference is that a CPU does way less magic under the hood than an optimizing compiler for a high level language, and there's no such thing as UB in assembly (outside some exotic cases like 'illegal' instructions on the 6502 which behave unpredictably)
The cpu magic doesn't do things you didn't ask for, or can't figure out how to ask for through a bunch of indirection.
All the cpu magic means is that you don't know how it did exactly what you expected. It still produced exactly and only the expected output from the given input.
High level language magic means it does things you didn't expect, and that you can have a hard time figuring out how to get it to do something you want if that doesn't happen to be one of the things the language designers predicted and decided for you that you should ever need to do.
You think C is "low level", but it was once considered "high level", and the compiler does many things you don't want, and the standard is ambiguously worded, and implementers have their own interpretations of the ambiguity, and also bugs in their compilers, and you eventually arrive at:
"If you really want those instructions to happen in this function, without fear of magic, write them in assembly."
C's position in the "low- vs high-level" hierarchy arguably hasn't changed much since it was created. There were already higher-level languages in the 60's (e.g. languages which abstracted the underlying hardware much more than C, but those weren't useful for writing an operating system in).
From what I have seen, there are two “contending” definitions for what is high-low level languages: one considers anything above assembly languages high-level, the other is less concrete, and would put for example managed languages into the high level category/towards that end of the spectrum, while C, Rust, C++ would be on the lower end, assembly even lower.
I prefer the latter definition, as the former is, while objective, quite useless. The second definition could be expanded by a partial order between languages by “feature X can be emulated in it” with some caveats[1], and then we might even get Rust/C++ to beat C for low-levelness, since C doesn’t have any way to force vectorization (compiler-specific intrinsics don’t count!).
[1] since most languages employ FFI/linking, not even this definition is too specific — would probably have to write it as the “idiomatic language can emulate feature”
> After looking up compatibility for SDL, I noticed it is able to run on iOS 6 and greater, meaning it supports iOS devices all the way back to iPhone 3GS.
Whoah, I was convinced there is no way to write and run apps on these older iPhones - I wonder how difficult the whole process is.
I don't understand using SDL, it's not that hard to write something in Objective-C that runs on earlier versions of iOS as well. You can even cross-compile from Linux, or even run clang directly on iOS if you jailbreak. My own experience is with iOS 4 - 6.
For some time I've been thinking about using Rust, but in a cut-down manner. Ignore Cargo and most of its stdlib. Not use traits. Allow myself to use unsafe once in a while. Basically a subset of Rust that would be just safer-C. I wonder if that would be effective for me. Maybe even compile times would be acceptable. I guess I must just try.
I am one of those that tried Rust but went back to C for my own projects. It's usually that I like to write smaller utilities, and many quirks of C are not as painful. I like to write things that do not use dynamically allocated memory, for example. Also, the plethora of available alternative C compilers (for example, something using QBE) gives me a nice warm feeling. Yes, it is not all technical for me.
I’ve been enjoying writing libraries in C and being able to use them both from native applications and in web apps via WASM. I personally avoid emscripten and just implement the minimal C/JS glue and utilities myself.
For snprintf I use stb sprintf [1]. For the reasonable functions from string.h (memset, memcpy, memmove), you can just use compiler builtins (__builtin_memset, etc.) as long as you enable the bulk memory extension (-mbulk-memory). I haven’t needed much of math.h for the stuff I’ve made, when I have I just called stuff on the `Math` object in javascript.
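For anyone curious, a minimal sketch of what such a no-emscripten setup can look like; the exported function and the exact flags below are illustrative assumptions, not the parent's actual build:

    /* add.c: a freestanding function exported to WASM, built roughly like
       clang --target=wasm32 -nostdlib -mbulk-memory -O2 \
             -Wl,--no-entry -Wl,--export=add -o add.wasm add.c
       and then instantiated from hand-written JS glue via WebAssembly.instantiate. */
    __attribute__((export_name("add")))
    int add(int a, int b) {
        return a + b;
    }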
More power to you. C is beautiful. Properly written C is more of an Art than Science. People who say C doesn’t work for large projects often forget that their cars run on C.
Using a memory unsafe language in a situation where it's not strictly necessary is in my opinion not justifiable at all. It's the leading cause for security issues by some measures[1], incredibly hard to reason about and hard to debug. Honestly unless you have a really, really, good reason not to, use a managed language. If that isn't good enough and you want to be fancy use Rust and only if you've exhausted everything else start writing C.
The opposite is also true though: using a memory safe language where it's not strictly necessary is not justifiable (e.g. Rust is essential for implementing a sandbox - for instance a WASM VM - but not for code running inside that sandbox, because the whole point of a sandbox is that it can run untrusted, unsafe code safely).
The author appears to be quite green behind the ears, which is fine! They’re certainly doing themselves a disservice by being so sure of themselves. There’s a lot here that feels right in the pocket of Dunning-Kruger ignorance.
Being able to analyse the benefits of new tech at a distance (which is what the author is doing with their sterile toy projects) is not something that you can “fake until you make it”. It requires a lot of deep experience with different technologies, enough that you can pick up the common patterns of costs and benefits, which for the most part never change. Analysing at a distance without that experience is exactly what OP is doing. Their big list of languages they’ve worked with is doing the opposite of what the author intends. All it says to me is that their bar is way too low, and that they do not understand the level of technological understanding required for a language’s inclusion in that list to mean absolutely anything in the context of this blog post.
In all I’m not really sure what the point of this post is. By the author’s own admission they haven’t worked with much C. If someone doesn’t understand the value in a language that addresses the memory safety footguns of C, I assume that they’re at best inexperienced, or at worst part of the quite sizeable contingent of C developers that are in complete denial about the language / standard library’s shortcomings, especially with regard to memory safety, because it’d require them to admit that they themselves are imperfect developers.
> The first language I properly learned was C in first year of my engineering degree. [...] My experience in the class solidified my belief that programming was what I wanted to do and forged a bond with the C language that I didn’t realize until now.
I am highly skeptical that the author understands the full ramifications of undefined behavior, dangling pointers, platform-dependent integer sizes, and the myriad sharp edges of the C programming language.
I don't disagree that, for example, Rust has a heavy syntax. But those exist for a reason. Those are the result of hard-earned lessons from phenomena like double-free (see ownership) and duck-typed templates in C++ (see traits).