From a DoS risk perspective there is no practical difference between an infinite loop and a finite but arbitrarily large loop, which has always been possible.
For example, this doesn't work:
#define DOUBLE(x) DOUBLE(x) DOUBLE(x)
DOUBLE(x)
That would only expand once and then stop because of the rule against repeated expansion. But nothing prevents you from unrolling the first few recursive expansions, e.g.:
#define DOUBLE1(x) x x
#define DOUBLE2(x) DOUBLE1(x) DOUBLE1(x)
#define DOUBLE3(x) DOUBLE2(x) DOUBLE2(x)
#define DOUBLE4(x) DOUBLE3(x) DOUBLE3(x)
DOUBLE4(x)
This will generate 2^4 = 16 copies of x. Add 60 more lines to generate 2^64 copies of x. While 2^64 is technically a finite number, for all practical purposes it might as well be infinite.
Without a specific constraint in the implementation it certainly can happen, although I'm not totally sure it's something to be concerned about as a DoS so much as a nuisance when compiling code with a bug in it. If you're including malicious code, there are probably much worse things it could do by building properly than by spinning indefinitely.
Rust's macros are intentionally recursive, and the compiler enforces a recursion limit (IIRC it defaults to 64), at which point it errors out and tells you to raise it with an attribute in the code if you need it to be higher. This isn't just for macros, though: I've seen it triggered while the compiler resolves deeply nested generics. So it seems plausible to me that C compilers might already have some sort of internal check for this. At the very least, C++ templates certainly can get pretty deeply nested, and given that the major C compilers are closely related to their C++ counterparts, maybe such a check already exists in the shared part of the compiler logic.
C++ also has constexpr functions, which can be recursive.
All code can have bugs, error out and die.
There are lots of good reasons to run code at compile time, most commonly to generate code, especially tedious and error-prone code. If the language doesn't have good built-in facilities to do that, then people will write separate programs as part of the build, which adds system complexity, which is, in my experience, worse for C than for most other languages.
If a language can remove that build complexity, and its semantics are clear enough to the average programmer, that's a real win. (For example, Nim's macro system originally was highly appealing -- and easy -- to me as a compiler guy, until I saw how other people found even simple examples completely opaque: worse than C macros.)
1. compile-time evaluation of functions - meaning you can write ordinary D code and execute it at compile time, including code that manipulates strings
2. a "mixin" statement that has a string as an argument, and the string is compiled as if it were D source code, and that code replaces the mixin statement, and is compiled as usual
No. Other modern languages have strong compile-time execution capabilities, including Zig, Rust and C++. And my understanding is that C is looking to move in that direction, though as with C++, macros will not go away.
Nowadays, do pass 1 (abstract, intro, headings, conclusions), upload the paper to your LLM, then ask questions interactively. LLMs can make step 3 a lot more efficient.
I get the impression that somehow an attacker is able to inject this prompt (maybe in front of the actual coder's prompt) in such a way as to produce actual production code. I'm waiting to hear how this can happen - cross-site attacks on the developer's browser?
"Documentation, tickets, MCP server" in pictures...
With internal documentation and tickets, I think you would have bigger issues. And external documentation? Well, maybe there should be tooling to check that. I'm no expert on MCP, but vetting applies there too.
I rode for several years. Such a study needs to take into account the age, accident/ticket rate of the rider, and type of bike. While I was cautious, riding instead of driving a car for economic reasons, there were plenty of testosterone-laden youths who drove up my insurance rates.
It turns out that actuaries are pretty decent at the math side of these things. It's just that insuring a motorcycle rider, regardless of how safe you personally are, is dramatically riskier than covering a cage driver. They know that because they have access to all the data behind the claims and payouts that normies like us don't have access to. Literally everyone believes that they are the exception to things like these. That's the difference between an anecdote and data, however. Insurance companies rely on data, not your word that you're the exception to the picture all their data paints.
Location and culture matter too, I think. In a lot of equatorial nations motorcycles and motorbikes are the default. Car drivers there aren't better drivers on the whole, but they are very aware of motorcycles at all times, whereas in other places, I feel, car drivers disregard all non-car transportation: motorcycles, bicycles, pedestrians.
I biked in Vietnam. It looks kind of mad, but the speeds are actually quite low, which is how they mostly survive. The locals don't go much above 20 mph on their scooters, although the scooters are capable of 60-70.
The problem is that what is a minor accident in a car is a serious accident on a motorcycle. I'm a very safe driver, and I have been t-boned through no fault of my own. On a motorcycle that would have been death.
https://youtu.be/ug4c2mqlE_0?si=qtqu7tOC7Xpw67aN