Tcl is still the king of this kind of distribution with Starkits/Starpacks. Since you can virtualize the filesystem within Tcl, you can pack in all your resources as regular files, and scripts generally work without knowing that this has happened (and without hackily extracting the archive to disk before or while running).
Python can do something similar, albeit far simpler, by packaging your code as an executable zipfile. Just structure everything you need in a directory, put a __main__.py at the root, and zip it up. All resources are then available as in-memory binary streams, including the individual .py source files. Of course, if you want to bundle something like a C library, you have to unpack it to disk before you can load it with ctypes.
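For concreteness, here's a minimal sketch of that workflow using the standard-library zipapp module; the myapp/ layout and file names are just illustrative:

    import zipapp

    # Assumes a directory laid out like:
    #   myapp/
    #       __main__.py   <- entry point, must sit at the archive root
    #       helpers.py    <- any other modules/resources you need
    #
    # Bundle the whole directory into one executable archive.
    zipapp.create_archive(
        "myapp",                             # source directory
        target="myapp.pyz",                  # resulting zip-based executable
        interpreter="/usr/bin/env python3",  # shebang so ./myapp.pyz runs directly
    )

The same thing is available from the shell as "python -m zipapp myapp -o myapp.pyz".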
Are you referring to the zipapp library? I had no idea this even existed until now, but that seems pretty clever. Huh. Why can't it also dynamically unzip a shared object library?
CPython automatically detects that it's a zip archive, prepends it to sys.path, and imports the __main__.py at the root of the archive; any modules nested inside the zip are imported the same way via zipimport, decompressed in memory rather than extracted to disk.
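You can poke at the same machinery by hand; this assumes a myapp.pyz built as in the sketch above, with a helpers.py inside it:

    import sys

    # Putting a zip archive on sys.path is all it takes -- zipimport
    # takes care of reading modules straight out of the archive.
    sys.path.insert(0, "myapp.pyz")

    import helpers                 # imported from inside the zip, never extracted
    print(helpers.__file__)        # something like "myapp.pyz/helpers.py"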
The reason you have to handle shared libraries (and any other non-.py data) yourself is that Python doesn't know what to do with them. You can read them as binary data via pkgutil.get_data(), but loading shared libraries is system-dependent and Python doesn't do the loading itself. Since the dynamic linker can't find a shared library inside a zip file, the only option is to extract it to a real file first (sketched below). This is documented at https://docs.python.org/3/library/zipimport.html:
> Any files may be present in the ZIP archive, but only files .py and .pyc are available for import. ZIP import of dynamic modules (.pyd, .so) is disallowed.
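For completeness, here's a rough sketch of that extract-then-load dance for a bundled shared library; "mypkg" and "libfoo.so" are made-up names for illustration:

    import ctypes
    import os
    import pkgutil
    import tempfile

    # Read the bundled library out of the archive as raw bytes.
    data = pkgutil.get_data("mypkg", "libfoo.so")

    # The dynamic linker only deals in real files, so the bytes have to
    # land on disk before ctypes can load them.
    fd, path = tempfile.mkstemp(suffix=".so")
    with os.fdopen(fd, "wb") as f:
        f.write(data)

    lib = ctypes.CDLL(path)   # from here it behaves like any other shared library
    # ...call into lib as usual; clean up `path` when finished
    # (on Linux you can unlink it right away, on Windows only after unloading).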