Thanks for the link. It's ironic that in the name of security, that solution is probably one of the best available. SSH is so protected against footguns that legitimate use cases are forced to use demonstrably worse security practices, just because some people might shoot themselves in the foot. I'm stuck with either that option, expect, or a total misuse of ssh-agent.
Depending on your use case it might be better to just store the key unencrypted. There’s not really much point encrypting it if you’re storing the passphrase on disk alongside the key anyway.
Right (what's the threat model)? The possibilities of restricted passphrase-less keys are under-appreciated for non-interactive use, or even interactive use. I'd rather mint an ephemeral key for an endpoint I control than type credentials or, worse, forward the agent, if I have to call out of an untrusted system (like an HPC login node).
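To make that concrete, here's a sketch of minting a restricted, passphrase-less key (the key path, comment, and forced command are made up for illustration):

```shell
# Mint an ephemeral, passphrase-less ed25519 key (-N "" = empty passphrase)
keydir=$(mktemp -d)
ssh-keygen -q -t ed25519 -N "" -f "$keydir/ephemeral_key" -C "ephemeral hpc key"

# On the endpoint you control, prefix the public key in ~/.ssh/authorized_keys
# with restrictions so it can do exactly one thing and nothing else, e.g.:
#   restrict,command="/usr/local/bin/receive-results" ssh-ed25519 AAAA... ephemeral hpc key
# ("restrict" disables port/agent/X11 forwarding and PTY allocation;
#  "command=" forces one command regardless of what the client asks for)
```

With those options set server-side, even a leaked key only buys an attacker one pre-chosen command.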
I mean, the use case is I want my GUI wrapper to interactively prompt the user for the decryption password. It’s not getting saved to disk; I just want ssh capabilities (including password protected ssh keys) inside an interactive desktop app.
I can't tell what that involves but, for instance, the two GUI things I typically use with SSH are Emacs (openssh) and x2go (libssh), and they don't do that. Surely you want the agent anyway.
You could probably use a self-signed cert, then configure socat either to trust that certificate (with the cafile option) or to disable verification (with the verify option).
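A sketch of both approaches (file names and the port are made up; double-check the option names against your socat version's man page):

```shell
# Create a self-signed certificate for the server side
certdir=$(mktemp -d)
openssl req -x509 -newkey rsa:2048 -nodes -days 365 \
    -subj "/CN=localhost" -keyout "$certdir/server.key" -out "$certdir/server.crt"
cat "$certdir/server.key" "$certdir/server.crt" > "$certdir/server.pem"

# Server: accept TLS using the self-signed cert
#   socat openssl-listen:4433,reuseaddr,cert=server.pem,verify=0,fork TCP:localhost:80

# Client option 1: explicitly trust that one certificate
#   socat STDIO openssl-connect:localhost:4433,cafile=server.crt

# Client option 2: skip verification entirely (weaker)
#   socat STDIO openssl-connect:localhost:4433,verify=0
```

The cafile route pins the exact cert, so it's the stronger of the two if you can ship the .crt to the client.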
https://github.com/jamespwilliams/strictbash, I wrote this little wrapper a while back that you can use as a shebang for scripts. It runs shellcheck for you before the script executes, so it’s not possible to run the script at all if there are failures. It also sets all the bash “strict mode” [0] flags.
I’m almost completely deaf in one ear and had problems with tinnitus in the beginning (~7 years ago), especially when trying to sleep. Nowadays, I really only notice it when I focus on it, and it has no impact on my quality of life whatsoever. I think a lot of it is psychological.
One funny bit about the talk, which isn’t mentioned in the slides, is that he actually gave the talk at Linaro (the very same company who were called out in the slides as a case study of submitting a particularly bad set of patches)
I agree that NixOS (and Nix) is difficult to learn and the documentation is generally poor, which I realise is what you're getting at in your comment.
But I do think we should keep in mind what an incredible project NixOS is. NixOS isn’t just some other Linux distro, repeating pretty much what every distro has done for 40 years. It’s a complete rethink of how a Linux distribution and package manager should work, from first principles, in a way that I don’t think has ever been done before. That’s an incredibly bold thing to do. That alone, as a research project, would be an impressive achievement. But it hasn’t just ended up as some researcher’s PhD thesis. It works in the real world, and it’s built a large community despite its learning curve.
So despite the terrible docs and UX, I have trouble agreeing that it “sucks”!
Oh, I agree. It's why I put up with NixOS. I love what I can accomplish with it. It's a tool for making your own distro. It basically makes the impossible possible. What I'd like to see now is to make the difficult things easy.
The new installer is pretty great. My son’s desktop got hosed recently from ignoring updates and getting too far behind, so I used the visual Nix installer to put a Gnome desktop on there for him. He installs everything from Flathub anyway, so he hasn’t even noticed. Every once in a while I’ll hop on and update it. Silverblue would work about as well, I’d imagine.
The dystopian future of OS dev: Linux, Linux, Linux
I get that backwards compatibility is really important, but still. Gotta be POSIX compatible at least or else next to no one is going to use it as a daily driver.
For example:
> Coming up with a custom shell language was fun, but ultimately does not solve any real problems for us.
> Having a POSIX compliant shell would allow us to, well, run POSIX shell scripts. (This currently requires installing a 3rd party shell.)
Where's some variety? I hope to get into the field myself eventually, so maybe I should take it upon myself to create interesting things that might be useful.
> I can be confident that a Perl script I write today will run unaltered 10 years from now, modulo external collaborators.
Yeah, because the ecosystem is dead. The flipside is that the bulk of Perl modules available are stale and don't receive any security updates or bug fixes. If I had a pound for every time I've seen a 10+ year old bug with no comments in the CPAN issue tracker...
> Perl has a small set of core syntax and is very extensible and flexible in adopting new paradigms.
Depends what you call "core syntax". The syntax available to you with a stock Perl install is certainly not small compared to other languages.
> With a great amount of discipline, Perl scripts can be successfully scaled up into large, complex systems.
If you're disciplined, you can write complex systems pretty much as well as you could in Python or any other dynamically-typed language.
However, it requires a lot of discipline and experience, and with other languages you at least have good formatters and LSPs to help with that. Perl sorely lacks this kind of infrastructure.
I also think the lack of static types makes writing complex systems massively harder, but that's not a criticism of Perl in particular.
Some of us would call that "mature" or "stable". You don't have to change the basics of the language every two weeks and have people rewrite everything every three weeks (when the three-week-old version gets removed from Ubuntu).
Look at LaTeX, for example... every goddamn academic uses LaTeX, but it hasn't really changed in many years now... is it dead? Of course not.
C is a mature language. If they put out a new standard with a few changes once a decade that seems fine. But they do put out new versions, this year will see C23.
LaTeX is just packages (which do change, all the damn time; it's a curse) on top of Knuth's TeX. TeX, it's true, is feature-frozen: Knuth has asked that when he dies the version number be bumped from a close approximation of pi to pi itself, and no further bugs be fixed. At that point TeX will be frozen solid. I'd be surprised if it lives another twenty to thirty years; its purpose is digital typesetting and, er, you may not have noticed, but we ain't printing so many books these days. Once we're not setting type, TeX's purpose expires.
Is TeX that widely used for traditional publishing? The areas that TeX is good at (e.g. academic STEM) don't seem to have reduced in output, and there hasn't been a reasonable replacement in the 45(!) years since it came out, so absent a significant change in academia, I'd expect it to continue being around for the foreseeable future.
The LaTeX system makes sense under the assumption that you want to produce typeset documents; as it gradually becomes apparent that you don't, in fact, produce typeset documents, the extra bother seems dubious.
Knuth writes books. That's what TeX is for. If you write journal papers, eh, TeX is more likely to ensure you see in print what you intended, but increasingly maths is done online in forums; nobody wants to wait six months to read a camera-ready, spell-checked document. Meanwhile, Unicode means squiggles are easier to write correctly on a computer, and programming means you can express what you meant in (terrible) Python instead of squiggles.
The maths professors at the university I work for do still write a lot of LaTeX (whereas in CS it's not as common as it was), but I wouldn't be surprised if an ordinary maths undergrad can't write decent LaTeX today, so that's a smaller and ageing constituency.
I'm curious what input method you've found for Unicode maths; most of it seems to be input via various TeX-like DSLs (e.g. MathJax), or people just take photos (or screenshots of the TeX and paste them into PowerPoint...)?
Also, (La)TeX gives you a good-enough environment (especially with something like Overleaf for sharing) for maths+crossrefs+citations that I'm not sure you can go simpler (I've tried with Markdown and reST, but both end up being more of a pain, and the more maths you have, the harder it is to go without macros). Modern Word remains an issue, as I found out today, and most other word processors have even less support for throwing something vaguely academic together (i.e. with LaTeX I can just start dumping in content, and refs and maths just work).
I don't think there's any mainstream language where the basics of the language change every two weeks or people have to rewrite things every three weeks.
I realize you're exaggerating and that you don't literally mean 2 or 3 weeks, but I have Python programs I haven't touched in about 10 years that I still regularly use and that still work fine. Some of my oldest Go programs from over 7 years ago still work fine, and I'm pretty sure that if I were to try running some of my older Ruby or PHP programs (some of which are >10 years old) today, they would still work either out of the box or with a few small changes (IIRC some of the defaults in PHP changed wrt. errors, so that might need a little bit of fiddling).
Even JavaScript – the poster child of "rewrite every 2 weeks" – is pretty compatible. You don't need to switch to $new_js_framework and many people use older stuff just fine. The language itself is pretty compatible (almost to a fault, arguably).
I suspect the issue rarely is the language itself changing, but the tooling around it (in the JS case: bundlers, minifiers, polyfills etc.). I think JS/TS the language is good enough, it's really taking the code that's been written and making it usable on the frontend that breaks.
It's certainly the case in JavaScript that the ecosystem has been changing at a very fast pace (too fast, IMO), but you don't need to buy in to all of that.
I dunno; even for maintained ones I've seen far less breakage, whether now or a decade ago. It always seemed like the default choice for Perl was getting a warning that this way of calling something is deprecated, while in Ruby/Python it was "well, we changed our mind on that API, fuck you and your code".
The way the Py2to3 migration was handled was also something of an abomination. Meanwhile, if I need my Perl code to work like old Perl, I just write use v5.10 in the header...
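For anyone unfamiliar with the version pragma, it's a one-liner that pins the language semantics a script expects (assuming a stock perl is on the PATH):

```shell
# "use v5.10" requests (at least) Perl 5.10 semantics and enables the
# feature bundle up to that version, e.g. say():
perl -e 'use v5.10; say "pinned to 5.10 semantics"'

# Without the pragma, say() isn't enabled, so the same code fails to compile:
perl -e 'say "nope"' 2>/dev/null || echo "needs the pragma"
```

Old scripts with no pragma keep their old semantics; new features only show up when you ask for them.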
Had a moment a few days ago with a certain distro not packaging latexmk. Turns out it's a one-file Perl script. It... just works. No dependencies, packaging, deploy process, automation, package hubs, no container with an entire OS, no private runtime, no binaries. Just half a meg of Perl. And it just works, exactly like it did in the 90s. No fuss, no bloat. We are completely lost these days when it comes to software.
> The way the Py2to3 migration was handled was also something of an abomination. Meanwhile, if I need my Perl code to work like old Perl, I just write use v5.10 in the header...
To be fair, the way the Perl 5 to Perl 6 migration was handled was worse. It got so bad that they ended up renaming Perl 6 to something else, after it had already been released under the Perl 6 name.
Sure there was; everyone was talking about the Perl 6 migration back in the 2000s. The "unfortunately named language" appeared once it became clear the migration had failed.
> So the upward migration path from Perl 5 to Perl 6 will probably be to run your code through a translator.
> Larry promised not to abandon Perl 5. The 5.6 maintenance track will continue as planned, and the 5.7 development track will eventually yield Perl 5.8, as planned. 5.8 will be the final release of Perl 5, but it will continue to be maintained and stable.
perl is at 5.37 now... so much for "5.8 will be the final release" :) Also, this sounds awfully like the Python 2->3 migration.
>>If you're disciplined, you can write complex systems pretty much as well as you could in Python or any other dynamically-typed language.
Python is today's Perl. It ditched the philosophical sermons of simplicity, minimality and all that for practicality almost 15 years back. Since then, Python 3 has not shown any reluctance to add new syntactic features and sugar. The ecosystem is large. The language is flexible enough today to give Perl-like feels from the 1990s and 2000s. Python is no longer big in web dev, thanks to React, and it's back to being a backend glue language (Java being the language for serious backend enterprise work) with Perl-like philosophical leanings.
As of today, think of Python as a Perl with mandatory indentation-based formatting.
> However, it requires a lot of discipline and experience
That's true for any platform, Java or PHP. Perl has less support in modern IDEs. The Perl 5 to 6 transition failed many times, so the ecosystem lost a lot of talent.
P.S. I have written and supported large Perl codebases; it's no rocket science to manage, if you do it right.
Perl was lucky that the Perl 6 diversion happened. If it had stayed the course and kept its popularity, it would no longer be Perl, just as Python is no longer Python (such bad version dependency hell that there's actually a layer of package-manager hell on top). Now, well after the Perl 6 diversion, Perl 5 activity has come back and there are new features. But most Perl devs, and the culture, still have the habit of writing for portability and stability instead of using incompatible features a month after the interpreter adds them.
Yep, you supplied evidence that it's effectively dead. A few rows per day for all modules in the module registry? Compare that to any other PL in the top 50 and you'll see pages per day. For an active PL in the top 10, it would be several dozen pages per day.
(For JavaScript, it would be several hundred pages per day... but let's not talk about JS.)
Perl is an old geezer hacking out its dying last breaths. Calling it "stable" is fine. Death is a very stable state.
I'm not going to necessarily disagree with you. Perl was my professional go-to language for about 20 years, and I moved on to Python for the past eight or so.
Back 15+ years ago, you'd see perhaps 2-3x more updates on CPAN recents per day than what's there today.
I'll always love Perl, but it's definitely on a long slow popular decline.
> I also think the lack of static types makes writing complex systems massively harder, but that's not a criticism of Perl in particular.
I get what you're saying, but it's also kind of BS at the same time. If you think Python type annotations are a great feature, you're entitled to your opinion. Others might say static typing of a dynamic language is pointless, and yet type analysis does help find bugs in code, to which I would agree. But the cost of adding type annotations, which are not very static, increases cognitive load and adds line noise in a very ironic way, given how Python was originally touted as a less obscure scripting language.
To that last point, let's not forget about sigils in Perl, and how they are a sort of type indicator. There is not much ambiguity when the script puts a symbol in front of each kind of variable reference. How to tokenize variables, constants, lists, arrays, hashes and so on may sound like a purely academic topic, but it's not. Look at Go, for example: it made a big mistake by exporting identifiers based on their first character, where uppercase means exported. True, you don't need to find the declaration to look for an "exported" annotation, as you would in Python, but the language now relies on Latin-alphabet notions of upper and lower case, which excludes many widely spoken and written languages. Think of Asian or Middle Eastern scripts. Python got that right: one could, in theory, write Python in one's own native language, and Perl can be written in Klingon if you want, because the sigils are separate from the identifier characters. In this way, and many others, the pathological pragmatism of Perl is outstanding, and the lack of weird opinions is both a strength and a weakness.

One last bit about typing, performance, and correctness: Perl is very fast. I'm always reading how, in Python, the for loop is preferable to the while loop because one is implemented in Python itself while the other is a very optimal C implementation. Perl is optimised at every corner of the language, except modules, though modules can choose to be; Python does that too for modules, like any maths or data-science stuff.
On that last point, I want to throw a bone and 100% agree that most of the Perl libraries for modern practices are outdated, non-existent, or security problems. A programming language is only as useful as it could be for a given college student in a first-year software-engineering program, so if the ecosystem is dead, the language is pretty much dead. That said, I know plenty of neckbeards who still use IRC and bang out Perl for quick prototyping... before moving to Go, C, or whatever...