You don't have to prevent root access. You just have to inform users of the risks, void warranties if you want, but let users do whatever they want with the hardware that they own.
Please don't push the Overton Window any further. Installing my own software on my own PC should never void the hardware vendor's warranty. That delegitimizes the core concept of a PC.
(A horrific possible dystopia just flashed through my mind: "I'd love to throw out Chrome and install Firefox so that I could block ads, but the laptop is expensive, and I can't afford to void the warranty". I bet Google would *love* that world. Or, a UK version: "I'd love to use a VPN, but regulation banned them from the approved software markets, and anything else would permanently set the WARRANTY_VIOLATED flag in the TPM").
It depends on what your software does; if it removes hardware protections, then your warranty should be voided. Of course, those protections are usually either implemented in hardware or otherwise impossible to remove, like emergency cooling or lowering the voltage when something overheats.
Warnings aren't always enough, sometimes we have to lock people down and physically prevent them from harming themselves.
It's not always people being stupid. I recall reading an article by someone who got scammed who seemed generally quite knowledgeable about the type of scam he fell for. As he put it, he was tired, distracted, and caught at the right time.
Outside of that, a lot of the general public have a base assumption of "if the device lets me do it, it's not wrong," and just ignore the warnings. We get so many stupid pop-ups, seemingly silly warning signs (peanuts "may contain nuts") that it's easy to dismiss this as just one example of the nanny state gone mad.
The idiotic statement is yours. If the "sometimes" is important to you, you can have it - you're not the first person on the internet to play word games.
But unless you can come up with a very detailed list of when it's acceptable "to lock people down and physically prevent them from harming themselves" and when it's not acceptable (it never is; it's a crazy statement) - and I don't think you have such a list - your "sometimes" just means "whenever I, the person writing the software, judge it so", rendering it completely meaningless.
I'll take a real-world example: I watched someone start to climb over the side of a bridge. Luckily my words stopped him, but I did consider whether I should pull him back and pin him to the ground for his own good.
Is your position that it would be better for his freedom for me to let him jump if I couldn’t dissuade him?
> sometimes we have to lock people down and physically prevent them from harming themselves
So where does my statement suggest we should make locking people up for their own good the norm?
I can come up with even more mundane examples of where we physically prevent people from harming themselves. High barriers to stop people getting into the tiger enclosure. If a member of staff saw someone dumb enough to try and climb in, rest assured they'd be physically dragged out for their own safety.
Or do you suggest we allow the general public to wander into the tiger exhibit to pet the animals? Personal freedom and all that.
Even if it's illegal? (like transmitting on forbidden frequencies)
It's not always the user who's installing software. Lots of people depend on other people to manage their devices. Manufacturers want the hardware they delivered to be trusted, so users can trust it regardless of who has handled it.
I always hear this as the excuse, but it is ridiculous. If the user wants to transmit on "illegal" frequencies, all he has to do is change the country setting in their Wi-Fi router, et voilà, illegal transmissions.
The entire Android OS has about as much access to radios as your average PC, if not less. In fact, even on recent Android devices, wireless modems still tend to show up to the OS as serial devices speaking AT (Hayes) commands (even if the underlying transport isn't serial, or even if the baseband is on the same chip). Getting them to transmit on illegal frequencies is as easy or hard as getting a 4G USB adapter to do it.
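To make that concrete, here's a minimal sketch of what "speaking AT over a serial device" looks like in practice, assuming pyserial and a modem exposed at something like /dev/ttyUSB2 (the path and baud rate are assumptions; they vary by device):

  import serial  # pyserial

  # Assumed device path; on many phones/dongles the modem appears as
  # /dev/ttyUSB2, /dev/ttyACM0, or a vendor-specific node.
  port = serial.Serial("/dev/ttyUSB2", baudrate=115200, timeout=1)

  def at(cmd):
      """Send one AT command and return the response lines."""
      port.write((cmd + "\r\n").encode())
      return [line.decode(errors="replace").strip() for line in port.readlines()]

  print(at("AT"))      # basic liveness check, expect OK
  print(at("AT+CSQ"))  # standard 3GPP signal-quality query

The point being: it's the same generic AT command set a 4G USB dongle on a PC exposes, not some privileged radio access unique to Android.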
At least in the EU, transmitting is illegal; having the hardware to transmit is not.
That's why people can buy TX/RX SDRs and Yaesu transceivers without a license.
AFAIK, in the radio amateur world, serious violations of frequency plans are rare and are usually quickly handled by regulators. OTOH, everyone is slightly illegal, e.g. transmitting encrypted texts or overpowering their rigs, but that's part of the fun.
And in some locations, quickly handled by the local amateur community, with foxhunts and community outreach to frequency violators - only getting regulators involved when just talking to the offenders fails.
> Even if it's illegal? (like transmitting on forbidden frequencies)
That's not relevant here. If frequencies are illegal, it should be impossible to program it in such a way. But even otherwise, it's the responsibility of the user to follow local laws. If I have a PTT phone, it's not legal for me to use forbidden frequencies just because it's possible. Why do these manufacturers care about what doesn't concern them when they violate even bigger laws all the time?
> It's not always the user who's installing software. Lots of people depend on other people to manage their devices.
That should be up to the user. Here we are talking about users who want to decide for themselves what their device does. You're talking as if giving the user that choice is the injustice. Nope. Taking away the choice is.
> Manufacturers want the hardware they delivered to be trusted, so users can trust it regardless of who has handled it.
I see what you did here. But here is the thing. Securing a device is not antithetical to the user's freedom. That's what the secure boot chain was originally supposed to accomplish, until Microsoft managed to corrupt it into a tool for usurping control from the user.
Manufacturer trust is a farce. They should be delegating that trust to the user upon the sale of the device, through well-proven concepts as explained above. They chose to distrust the user instead. Why? Greed!
> If frequencies are illegal, it should be impossible to program it in such a way.
You know there's a very fine line between hardware and software in this case, so you're actually advocating for DRM-like control here.
> They should be delegating that trust to the user upon the sale of the device, through well-proven concepts as explained above.
That same user who forgets passwords and recovery keys all the time and loses all access to documents when a device breaks? And you're presuming giving that kind of person who doesn't understand sh*t about backups, device security etc full access to their devices will not result in a lot of compromised devices?
I'm not sure manufacturers are the best party to trust but they have an interest in a secure reputation, which the majority of dumb users or eavesdropping governments do not have.
> They chose to distrust the user instead. Why? Greed!
There are more reasons to distrust the user. I don't buy that greed is the only relevant one.
> so you're actually advocating for DRM-like control here.
Absolutely not. I'm saying that the hardware shouldn't have that capability at all in the first place. But whatever. Don't restrict it. Those functionalities are usually under the control of the kernel. If the user is smart enough to tinker with the subsystems at that level, they're also smart enough to deal with the consequences of its misuse. That isn't a good justification to just lock down devices like this. The harm that comes out of that is much worse than what anyone can do with an RF baseband chip.
> That same user who forgets passwords and recovery keys all the time and loses all access to documents when a device breaks? And you're presuming giving that kind of person who doesn't understand sh*t about backups, device security etc full access to their devices will not result in a lot of compromised devices?
Yeah, so? It's not like such a person is ever going to unlock a complex safety lock. Examples for that exist already. Who can sideload an app onto a fresh Android device without enabling developer mode and then installing the APK through ADB? Dumb users won't ever persist long enough to get there. To take it further, the user can be given the root key to the secure boot chain on a piece of paper with the explicit instruction not to share it with anyone, or even use it, if they don't know how to. Ordinary users can then go about their day as if the device were fully locked down. It's unfair to deny control of the device to the smart user when such security is possible. The existence of a dumb user is not an excuse to lock out smart users.
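For a sense of how many hoops that sideloading path already involves, here's a rough sketch of the ADB flow (the APK filename is made up, and it assumes developer mode and USB debugging were already enabled by hand in Settings and the phone's RSA prompt was accepted):

  import subprocess

  APK = "SomeApp.apk"  # hypothetical package to sideload

  def adb(*args):
      """Run one adb command and return its output."""
      result = subprocess.run(["adb", *args], capture_output=True, text=True, check=True)
      return result.stdout

  print(adb("devices"))       # the phone must show up and be authorized first
  print(adb("install", APK))  # push and install the APK over USB

Every one of those steps is friction a casual user simply won't push through, which is exactly the point: the friction itself is the safety lock.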
> but they have an interest in a secure reputation, which the majority of dumb users or eavesdropping governments do not have.
I guess you haven't seen the spyware that OEMs ship with their Android devices. Even Samsung is notorious for it - especially on their smart TVs. I'm not going to talk at all about the Chinese OEMs. For that matter, it's very hard for a normal user to even uninstall Facebook - an app that's known to collect information from the device that it doesn't need. Manufacturers caring about their security reputation was a thing some 20 years ago. Only Apple does it these days, and only because it's their highlight feature. But even they once tried to ship images off the phone to iCloud without the users' permission to 'check them for CSAM'. The rest treat it like a portable spying device on steroids.
> There are more reasons to distrust the user. I don't buy greed is the only relevant one.
Trusting the user isn't the manufacturer's prerogative. It's supposed to be the user's property once they pay for it. You are insisting on the manufacturer retaining control even afterwards - something I and many others vehemently oppose as unfair and scummy. Now, if you are worried about the security reputation, proven methods exist that allow smart users to take full control of the device while preventing regular users from shooting themselves in the foot. But OEMs and their apologists pretend that the problem is entirely on the user's side and that the only solution is to lock the device down in a block of glue. And there is one good reason for this ignorance, oversight and denial - greed. Retaining control over the end device forever allows them to squeeze users for their every last penny. I would need another epic post just to enumerate the ways in which control over end devices allows them to do so. But I'm not going to do that, because HN has entire stories and discussions on each of those topics.
Especially if it's illegal (like speaking against the government, in some countries).
Maybe this is a bit of a hot take, but I think any government that has the ability to absolutely prevent people from breaking the law is a government with far too much power. I'm all in favor of law enforcement, but at some point it starts to cross over the line from enforcement to violation of people's free will.
Yes, very clear warnings; I could live with a small permanent icon in the status bar (via the GPU firmware) etc. But absolutely should not void warranties (overclocking might but never just root).
Easy enough to have an efuse blow if you overvolt; then it's a difficult conversation on a warranty claim. Whilst ideologically this is ceding some control, I can accept it.
I don't think users understand the risks. I'm broadly accepting of protecting end users through technical mechanisms. People's entire lives are managed through these small devices. We need much better sandboxing, almost a separate 'VM' for critical apps such as banking and messaging.
The people who shouldn't disable these security features tend to be the first to do so. And then they complain the loudest when they enter the "find out" phase.
This sounds so weird. Is there a legal requirement for this?
Does this offer any type of real protection? Or is there a code of conduct saying that intelligence agencies never hire people with foreign nationalities?
It sounds like a natural expansion of AWS GovCloud offerings to me. Servicing the US government and its contractors has been very lucrative for AWS. Taking that successful model into new markets makes sense.
The article does not explicitly say it, but it's clearly a defense against the CLOUD Act (https://en.wikipedia.org/wiki/CLOUD_Act); it all makes sense once you add that missing puzzle piece.
The CLOUD Act conflicts with EU laws like the GDPR (AFAIK, this has been confirmed more than once in EU courts already), which means that EU organizations (which have to follow the GDPR) might not be allowed to use USA-owned cloud services, even when the data is completely hosted within the EU, because the cloud service sysadmins might be forced through the CLOUD Act to break the GDPR. Requiring that all employees with a high level of access have EU citizenship and residency makes it much harder for a USA court to pressure them into breaking these EU laws.
If I remember correctly from the Hot Money podcast https://www.ft.com/content/762e4648-06d7-4abd-8d1e-ccefb74b3... part of the problem for the credit card companies is figuring out where the boundaries of legality are. Countries have very different laws. Things like depictions of homosexuality or the age of consent vary a lot, and the credit card companies feel that makes it a risky business.
The great thing about those keyboards was that a lot of people could just turn off text prediction and type text with a single hand without looking at the screen.
The great thing about T9 (certainly on the Nokia 3210 and 8210) back in the day was that you could type messages fast, with few keystrokes, without looking. As long as you had enough experience to know the word combos.
I've never found a T9 system as good as the Nokia implementation. In some respects it's better than QWERTY for short messages. And don't get me started on Apple's fundamentally broken autocorrect system. People don't know any better these days. There are actual adults walking around today who have never typed on a real keyboard.
Meh, Imma team Moto and I don't remember it well anymore, 20 years give or take.
Yes, the later variants both had a custom/user dictionary and could learn new words from the input. The latter could also handle uncertainty in the input.
You make the point very well! You can still remember, what, 20 years later?
On the Nokia you'd press the button and it would cycle through the options. Once you knew all the words you could type really very fast indeed, and blind.
Someone asked about custom words below. You could definitely add custom words. I think you had to switch out of T9, key the word in the old way and then switch back and 'add' it to the dictionary, but once added it would stay there. I'm sure the amount of memory for custom words was quoted in marketing material at some point.
This is something that was clearly done just for the love of the craft - congratulations - but I also feel it could have huge potential in real-world applications.
The first crazy idea that came to mind was a multi-user desktop environment for an intranet, where everyone has their own desktop but can also request access to other desktops, entering and leaving them as they work together through the day.
It really makes you think about those crazy internet folks from back in the day who thought copyright law was too strict and that restricting humanity to knowledge in such a way was holding us all back for the benefit of a tiny few.
I'm all for chopping up copyright law. But until we do so, companies like Meta need to be treated just like everyone else.
That means lawsuits, prison sentences, and millions in fines. And that's just the piracy part, there's also the lying/fraud part.
Interestingly, a Dutch LLM project was sent a cease and desist after the local copyright lobby caught wind of it being trained on a bunch of pirated eBooks. The case unfortunately wasn't fought out in court, because I would be very interested to see if this could make that copyright lobby take down ChatGPT and the other AI companies for doing the same.
The more concerning thing is that the best thing these overpaid people could come up with was... download the torrent, like everyone else. Here you are, with billions in resources, and no one is willing to spend a part of it to at least digitize some new data? Like even Google did?
I think they are morally required to improve the current state.
- Seed the torrent and publicly promote piracy, pushing lawmakers to act.
- Contribute with digitisation and open access like Google did in the past.
- Make the part of their dataset that was pirated publicly accessible.
- Fight stupid copyright laws. I can't believe that copyright lasts more than 20 years. No field moves that slowly, and there should be tighter limits on faster moving fields.
Copyright and patents aren't the same thing. "Fast moving field" doesn't make sense in terms of copyright. There's no reason the copyright should last some minimum duration after the life of the creator.
If I write a really popular book, I don't want Hollywood to make it into a movie without compensating me just because they waited a few years
Fast moving field does make sense in terms of copyright because the knowledge is recorded in documents which are then copyrighted. E.g. research papers.
> If I write a really popular book, I don't want Hollywood to make it into a movie without compensating me just because they waited a few years
I genuinely don't understand this. Even at a decade copyright, pretty much anybody who was going to buy the book and read it has already done so. It costs you virtually nothing in sales, and society benefits from the resulting movie.
Your goal is to deprive everyone of having a movie, because someone who isn't you is going to make some money that was never going to you anyway? Your goals for copyright appear to be a net negative to the system that enforces copyright, which raises the question: why should the system offer protection at all?
> Even at a decade copyright, pretty much anybody who was going to buy the book and read it has already done so. It costs you virtually nothing in sales, and society benefits from the resulting movie.
If the movie can be made then the book can be printed and sold by any publisher, under the current system. It creates a race to the bottom on the price of the book as soon as the copyright duration is done. Perhaps extending "fair use" stuff could allow one and not the other.
That race to the bottom is a feature, not a bug. It allows poor people to engage with culture. That's the tradeoff here. At some point copyright is protecting a tiny amount of profits for the author in exchange for locking people out of access.
Copyright is supposed to be a societal benefit, or there's little reason for society to spend money on enforcing it. That's where we currently are, and I think why there's such a strong reaction to copyright currently. We pay to protect the works and then we pay again to buy them. They become free when they're so culturally irrelevant that nobody wants them even for free. The costs of enforcement are socialized and the benefits are privatized.
At some point, copyright is going to have to provide more back to society or society will get tired of paying to enforce it.
Copyright expiring in 20 years doesn't mean access is democratized. Publishers would likely keep the price the same, but instead of the author getting a cut, they would just take everything.
Besides, the public isn't owed the fruits of my labor for free.
I honestly suspect fairly little would change. The US operated with a 20-year copyright for nearly 200 years; these long copyright terms are actually far newer.
Also, you are not owed a monopoly on an arrangement of words enforced by the public. There are plenty of other places to spend tax dollars.
Who do you think pays for prosecutors, lawyers, jails, and investigators going after pirating sites? DMCA claims are worth less than the paper they're printed on if not for the threat of all that.
This hasn't been my observation. Instead, I see a society where people regularly help and serve one another, frequently for free. Consider parents, social workers, most academics, food banks, charity in general, most workers in most businesses, et cetera. I wonder: who do you know and work with? A minority of people profit wonderfully off this. Incidentally, they also seem to preach principles that can only lead to the end of their gravy train.
You can counter by insisting that these "altruistic" behaviors are simply less directly but still in the altruist's interest. I would entirely agree.
I don't disagree with your point that, in life, not everybody is in it for themselves. But the examples you chose to demonstrate altruism are a bit ridiculous:
- parents: they wanted a child and now they have to take care of it, it's not a selfless act at all
- social workers: are paid to pretend to care. Often they genuinely do care, but this isn't altruism, it's a job
- most academics: I see you haven't met many academics. Altruistic (and selfless) are not terms I would use to describe them. The majority is very much in it for themselves...
- food banks, charity in general: very true, some charities do thrive on unpaid volunteers, that is altruism
- most workers in most businesses: okay now you're getting ridiculous...
Many children are unwanted. Consider adoption and neglect. Parents know not to admit these things broadly.
Social work is a very low-paid existence, and most of the social workers I know could easily have earned more elsewhere. They are pained to know this, but they persist regardless because they care more about living in a world with less total suffering, even at the cost of their own.
I earned my MSc from the University of Edinburgh and interacted thoroughly with academics there and in the process of getting there. I know many people with PhDs and have had personal friendships with professors, postdocs, and other researchers. I would agree that academic incentive structures have become deeply dysfunctional and that delusion abounds. Also that defection is common. I have known some of those evil actors (e.g. Sharon Oviatt), so I don't deny their existence.
The very premise of business is that it takes a profit from the excess efforts of labor. I'm not the ridiculous sort that fails to recognize that workers' productivity is often both made possible and enhanced by the accumulated coordination and structure of firms, and that owners should capture some of that value. However, research is increasingly showing that the advantages of our society are being captured by firms. Meanwhile, too many owners are failing to responsibly reinvest in the population and have made religions out of not fostering true growth.
My claim is that multiple cultural norms live side by side, and I'm trying to help you and others realize that different options are plausible and more advantageous. The cooperators learn self-preservation and hiding, yet they are harvested both while doing so and beyond. My speculation is that the spread of perspectives like yours shrinks that population, which becomes a downward spiral of inefficiency and impoverishment. I expect the bottom will fall out viciously if it gets to that.
My spending time on this conversation is altruism; what is it for you?
Yep, you even see it on HN with artists and devs complaining about AI, especially when things like ChatGPT and Stable Diffusion were first announced. People who were pretty lax about copyright when it didn't affect them personally suddenly became copyright maximalists, talking about "stealing, theft, etc." Since then, people have calmed down and realized that AI is simply a tool like any other.