
I don't want to add more to your fears, but also remember that LLMs have been trained on decades' worth of Python code that assumes the presence of the GIL.


This could, indeed, be quite catastrophic.

I wonder if companies will start adding this to their system prompts.


Suppose they do. How is the LLM supposed to build a model of what will or won't break without a GIL purely from a textual analysis?

Especially when they've already been force-fed ungodly amounts of buggy threaded code that has been mistakenly advertised as bug-free simply because nobody has caught the problem with a fuzzer yet (and which is more likely to expose its faults in a no-GIL environment, even though it's still fundamentally broken with a GIL)?
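
To make that concrete, here is a minimal sketch of the kind of latent bug being described (the names and iteration counts are illustrative, not from any real codebase): an unsynchronized shared counter. It is a race even with the GIL, because `counter += 1` is not atomic, but the GIL's coarse thread switching makes lost updates rare enough to slip past most test suites, while a free-threaded build interleaves far more aggressively and tends to surface the shortfall almost every run:

    import threading

    counter = 0

    def worker(iterations: int) -> None:
        # `counter += 1` is a read-modify-write: the interpreter loads the
        # current value, adds one, and stores the result as separate steps.
        # Another thread can run between those steps, so two threads may
        # both read N and both write N + 1, losing an increment. The GIL
        # serializes individual bytecodes, not whole statements, so this is
        # broken even with the GIL -- it just fails rarely enough to pass.
        global counter
        for _ in range(iterations):
            counter += 1

    threads = [threading.Thread(target=worker, args=(100_000,))
               for _ in range(8)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

    # Expected 800000. On a free-threaded (no-GIL) build the count usually
    # comes up short; with the GIL it may take many runs to observe.
    print(counter)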



