
OK, this just feels like an excuse for you to trot out tired tropes about bi people while scare-quoting the word to delegitimize the orientation.

What you're doing is called bi erasure, and it comes from straight and gay people alike. Bi doesn't necessarily mean attraction or experience is exactly equally split. For some people it is, for some it isn't. Some people may change how they self-describe. None of that means people aren't legitimately bi.


I am not trying to de-legitimize anything. Nothing I said changes (or can change) your truth.

My post was based on what I've been told by both gay and bi people (on the latter, several bi people I knew complained about the first item, regarding the young women). So if straight and gay people can participate in bi erasure, so too can bi people, I suppose.


In the US, returning US citizens have the right to re-enter without a photo or a fingerprint. It's a right I'd like to preserve, and it's getting increasingly hard to exercise as everything in the airport nudges passengers toward automated systems with no obvious way to opt out.

Obviously other countries have their own rules.


Have you ever tried refusing to show a passport? As far as I know, US citizens have the absolute right to enter its borders (https://sgp.fas.org/crs/misc/home.pdf), but whenever I read about that, I wonder how they go about that when a random stranger arrives at a border and claims to be a citizen.

The best example I can find is https://en.wikipedia.org/wiki/William_Worthy#Right_to_travel...:

“He was able to return to the U.S. in October 1961, showing his birth certificate and vaccination record at Miami Airport”

That’s decades ago, though, and he still showed something (but not something that realistically identified him) to get in.


All you need to do is convince the agent that you are an American Citizen.

You can show them any identification documents you do have, like a driver's license, to establish your identity. For establishing citizenship, one option is that if you previously had a passport (expired, or just not with you), they can look up that record, and if it is recent enough to pull up a photo they can compare to you, that also helps.

Other ways of verifying citizenship depend on what databases they have access to. For example, if they have access to the Social Security Numident database, they could look up a number to see the associated name and citizenship flags. I'm not sure if they do, though.

From what I have heard anecdotally, it is usually not too hard to convince them if you really are a citizen (except in edge cases, like being born abroad to American parents who never registered your birth with the US).

In any case, they will formally warn you that it is illegal for Americans to enter without a passport (it is, but the offense no longer carries any punishment). This is to deter you from doing it again in the future, since having a passport makes things much faster and easier for everyone.


I'm a subscriber and I think you're missing the point. It's about the aggressive tactics used to get people to subscribe. And even worse, they don't stop once you've subscribed. You pay them $15.99/month or whatever and they still show full-screen ads for additional products in the lifecycle (music listening) or for enrolling family members. It's better on the web, but the app experience is terrible. I'm sure this is a side effect of the typical engagement/"experiment"-driven product management mindset, where what is not measured (consumer dissatisfaction) does not get managed.


Yea, my rule of thumb is: the more aggressively a company tries to get me to subscribe, the less likely I am to subscribe. On one end of the spectrum are sites that operate via donation and don't loudly ask me to donate; I'm more likely to subscribe to them. On the other end is YouTube Premium.


I don't think this is true, do you have a source?

They store registered users' phone numbers and allow discovery by making a request with hashed versions of the phone numbers on your contact list. They add an extra layer, using Intel's SGX secure enclave, to allow attestation of the software doing the lookup. They give many examples of responding to warrants with only whether a number has been registered and the timestamp of registration, which they explain is the only information they hold.

Private Contact Discovery: https://signal.org/blog/private-contact-discovery/
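
For intuition, the naive version of that lookup is something like this (a minimal Python sketch; the function names are mine, not Signal's). The blog post's whole point is that this alone is weak, because phone numbers have so little entropy that the hashes can be brute-forced, which is why the enclave/attestation layer exists:

    import hashlib

    def hash_number(phone: str) -> str:
        # Truncated hash of the normalized number. Phone numbers have
        # so little entropy that this is brute-forceable on its own,
        # hence the SGX enclave layer on top.
        return hashlib.sha256(phone.encode()).hexdigest()[:20]

    def discover(my_contacts: list[str], registered: set[str]) -> list[str]:
        # Client sends hashes; the service intersects them with the
        # hashes of registered users and returns the matches.
        return [n for n in my_contacts if hash_number(n) in registered]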



There's a horrible conflation of concepts here. A pretty big one.

When people talk about cloud services, they generally mean a part of an application that runs in the cloud and participates as a trusted actor in the application's trust model.

What people in the linked thread are realizing is that "Signal has a server", and they are confused because they thought Signal didn't have a server, or something.

So, what's important about Signal's servers is that, outside of the initial key exchange which is verified by the two parties out of band, they are never a trusted entity. When you send a message, it goes through Signal's servers. When you sync your profile picture with other devices, same thing: the data transits Signal's servers. This is made possible by cryptography. By encrypting the data in a way that is indecipherable by third parties (Signal's servers included), your data is isomorphic to random noise. So the only thing Signal needs to do is route the random noise to the right place. If it doesn't do that, it's a denial of service, which is about the only attack you're vulnerable to if you use Signal. Otherwise, the receiver gets the exact random noise that you sent, but only they can make sense of it, thanks to the miracle of cryptography.
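
To make the "random noise" point concrete, here's a minimal sketch (Python with the PyNaCl library; Signal's actual protocol is far more elaborate, with ratcheting and so on, so treat this as an illustration only):

    from nacl.public import PrivateKey, Box

    alice, bob = PrivateKey.generate(), PrivateKey.generate()

    # Alice encrypts for Bob; public keys were verified out of band.
    ciphertext = Box(alice, bob.public_key).encrypt(b"hello")

    # The server only ever sees `ciphertext`, which without Bob's
    # private key is indistinguishable from random noise. All it
    # can usefully do is route it (or drop it: denial of service).
    assert Box(bob, alice.public_key).decrypt(ciphertext) == b"hello"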

If you're really going to throw a fit because Signal syncs a profile picture between your devices using the same level of crypto as is used for messaging, then you're honestly crazy.

No. Signal did not "not have a cloud" and now they "have a cloud". Not by any reasonable interpretation of the events.


Signal has a "cloud": a server where they collect and store your name, your phone number, your photo, and a list of every person you've contacted using Signal. That data isn't some ephemeral encrypted string that is only present when you "sync your profile picture" or when you send a message. It is collected and stored on their server, where it will sit for at least as long as you have an account.

The justification for it was so that you could get a new device and have Signal download all of your info from Signal's servers down to your device. The data collection first takes place as soon as you set a pin or opt out of setting one (at which point a pin is assigned for you automatically).

The data is encrypted, but that does not make it impossible for Signal or for third parties to access it. See: https://community.signalusers.org/t/proper-secure-value-secu...

If you're a whistleblower or an activist, a list of every person you've been contacting using Signal is highly sensitive data. No matter how you want to spin it, Signal is hosting that highly sensitive user data on their servers, where Signal and third parties alike could possibly gain access to it.


You should assume every bit of information sent on the internet is archived in a massive warehouse somewhere, because it is.

Thus, we have to trust the cryptography itself. Sending an encrypted message to a peer is no different from sending an encrypted message to yourself (other than the use of symmetric vs asymmetric crypto). The fact that you send a message to yourself which is stored persistently on signal's server doesn't change anything (and it's even opt in AFAIU). Sure, there are concerns about the implementation, but until someone can decrypt the blobs in storage (the crypto is broken) I don't see reason for outrage.

Pretty simply, if you don't trust the crypto then you have a very different threat model to pretty much everyone else. If you don't trust crypto you can't use the internet because you can't use TLS. You're relegated to networks where you trust every single node (where you don't need crypto) and other such stuff. Most of us trust the crypto because it's really the only practical option. I don't see the problem.


> You should assume every bit of information sent on the internet is archived in a massive warehouse somewhere, because it is.

Leaving aside the whataboutism here, you shouldn't assume that when you're using a secure messaging app that claims to be designed to never collect or store user data. Signal makes that claim at the start of their privacy policy, and it is a lie. It started out true, but they began collecting data and they refuse to update their policy.

> Thus, we have to trust the cryptography itself.

No one is suggesting we can't trust cryptography. The fact is that it doesn't matter how strong your algorithm is when you're encrypting that data with a 4-digit number. You can 100% "trust the cryptography" and still acknowledge that it won't take very long for someone to brute-force your pin and recover your data in plaintext.

> Sending an encrypted message to a peer is no different from sending an encrypted message to yourself... (and it's even opt in AFAIU).

This has nothing to do with "sending data to yourself" and everything to do with Signal collecting data from you and storing it for itself. There is a massive difference between encrypting something yourself and sending that data to yourself, and someone else copying data from you, encrypting it, and saving it for themselves.

This data collection is also not opt-in. At all. You can opt out of setting a pin, but if you do, one will be automatically generated for you and your data still gets silently uploaded to Signal's servers to be stored. The community spent months begging Signal to add a way to opt out of this data collection, but they were ignored.

See:

https://community.signalusers.org/t/dont-want-pin-dont-want-...

https://community.signalusers.org/t/mandatory-pin-without-cl...

> Pretty simply, if you don't trust the crypto then you have a very different threat model

"The crypto" isn't the problem here. The problem is Signal collecting sensitive user data and permanently storing it on their servers in a manner that could allow it to be accessed by third parties and then not clearly disclosing that to their users and refusing to update their privacy policy to reflect the change.


Signal can't possibly read the data. How is that for itself? Only you can decrypt it! Signal doesn't have your data. They have garbage bits of effectively random noise.

You can prove it to yourself. Go take one of Signal's servers and try to find someone else's data there. You won't.

Why would Signal update their privacy policy to reflect the desire of misguided fear mongers? I certainly wouldn't do that if I were them.


> Signal can't possibly read the data.

They literally can. If you can brute-force a 4-digit pin, you can access any of the data protected by a 4-digit pin. Some pins are longer, but it's notable that even after a lot of backlash they continue to push for "pins" and not "passwords", knowing that many will continue to use a simple four-digit number.
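
To put numbers on it (a hypothetical sketch, assuming an attacker has already gotten past the enclave's rate limiting and holds the encrypted material, so the attack is fully offline): a 4-digit pin is only 10,000 candidates, and no KDF is slow enough to save a keyspace that small:

    import hashlib

    def derive_key(pin: str, salt: bytes = b"demo") -> bytes:
        # Stand-in KDF; Signal actually uses Argon2, but the keyspace
        # problem is the same no matter how slow the hash is.
        return hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 100_000)

    victim_key = derive_key("4821")  # derived from the victim's pin

    # Offline brute force: at most 10,000 derivations.
    for i in range(10_000):
        pin = f"{i:04d}"
        if derive_key(pin) == victim_key:
            print("pin recovered:", pin)
            break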

> You can prove it to yourself. Go take one of Signal's servers and try to find someone else's data there. You won't.

um... what?

> Why would Signal update their privacy policy

To accurately reflect the data they collect and how it is used? So that they don't lie to their users by making claims that are demonstrably false? To notify whistleblowers and activists that their information, and the information of those they are in contact with, could be discovered by state actors who can force Signal to give them access? There are three good reasons right there.

I'm sorry you're so upset by this. I know the reality is uncomfortable, but that doesn't make it "fear mongering". I honestly wish it weren't true. I wish they weren't collecting user data, I wish they were doing more to secure what they do collect, and most of all I wish they were honest and forthcoming about what they are doing, but wishes can't change what is. I hope that regardless of whether you use Signal or not, you'll try to accept facts even when they aren't easy to accept.


Let me make this clear: if the data is stored in a way that Signal's service cannot decipher it, then it's not collected by any reasonable definition of "collected". In order for Signal to collect it they would have to obtain it, which they don't, and can't, do.

This term isn't just some loose word to be thrown around and abused on message boards. If we take your definition of collected where handling encrypted data is collecting it, then "the internet" collects all data. Uh oh.

What Signal does is route encrypted messages between principals in a system. That's all they do. They don't collect personal information. Read their subpoena responses; they publish all of them.


> Let me make this clear: if the data is stored in a way that Signal's service cannot decipher it, then it's not collected by any reasonable definition of "collected".

I think this is misguided, and confuses the truth. Data collected and stored remotely is being "collected and stored remotely" regardless of how well protected it is.

I will however concede that it is possible to design a system where data is encrypted on a device and then uploaded to the cloud in such a way that simply having that encrypted data on a remote server doesn't put that data at risk. Signal did not design their system in that way.

> If we take your definition of collected where handling encrypted data is collecting it, then "the internet" collects all data. Uh oh.

Again, this isn't about handling encrypted data - it's about the long-term storage of highly sensitive but encrypted data - and as I said above, even that is fine if it's done correctly. Signal has done a poor job of designing their system, which leaves users' data at risk.

> What signal does is route encrypted messages between principals in a system. That's all they do.

That used to be "all they do". Then, about two years ago, they decided they wanted everyone to have profiles which would be kept in the cloud. As soon as you install the software, before you try to send any message to anyone, you're asked to provide a pin to secure your data. Once you set one (or opt out of setting it yourself), it collects a bunch of data from your device (not needed for routing anything - remember, you've just installed the app and are not trying to send or receive any message at this time), encrypts that data on your device using the pin, then uploads it to their cloud. That data can be recovered by you (or anyone else, for that matter) by providing the pin that you set. The data they just collected and stored is not used to transmit, route, or deliver messages. This data collection takes place in addition to any information needed temporarily to transmit, route, or deliver messages.

> Read their subpoena responses, they publish all of them.

That's incorrect. They publish the ones they are allowed to publish under the law (look up "national security letters" for more info) and their refusal to provide one agency with data says nothing about the requests they are forced to comply with. Their favorite examples involve cases where Signal was unable to hand over the data because they didn't collect it in the first place. Today, because of changes in their data collection practices, they now collect exactly the kinds of data they were not collecting before and were therefore unable to provide.

It's unlikely that Signal would be compelled by a standard subpoena to brute force their users pins to access the encrypted data. It is far more likely that the data is already being collected by an agency on-site, and that the data collection is continuous and ongoing (look up "Room 641A" for an example of on-site data collection by the state).

The fact that it is unlikely that Signal would be compelled by a standard subpoena to brute force their users pins does not mean:

- Signal employees can't do it themselves any time they feel like it.

- State actors can't do it whenever they feel like it

- A hacker couldn't gain access to a server and do it

Because of the sensitive nature of the messages sent over the platform, and because they have explicitly marketed themselves to vulnerable groups like whistleblowers and activists it is critical that Signal be honest about the risks of using their software. They insist they don't collect any data, while in practice they do. They say they secure the data they have, in practice that data is exposed by way of multiple vulnerabilities that could very well endanger the freedom or even the lives of the people using Signal.


Can you link to the implementation? I'll agree that a 4-digit pin is rather egregious and trivially crackable. I don't know a single serious cryptographer who would allow such nonsense, which is why your comment sounds so unbelievable. I thought they were blending the pin with some device-local entropy to make a reasonably strong key. I'd like to verify your claim.


Basically, they planned to get around much of the problem by depending on a very insecure secure enclave to make up for a lack of basic sound security practices.

The scheme they came up with to store user data in the cloud was described here: https://signal.org/blog/secure-value-recovery/

The code is here: https://github.com/signalapp/SecureValueRecovery

This site does a pretty good job of explaining why this isn't a good design: https://palant.info/2020/06/16/does-signals-secure-value-rec...

I'm sure I've linked to it already, but please review the discussion here as well: https://community.signalusers.org/t/sgx-cacheout-sgaxe-attac...

Even more details here: https://community.signalusers.org/t/wiki-faq-signal-pin-svr-...


They definitely do not encrypt your data with just a 4-digit pin. They use Argon2 (a slow hash, not that it matters much here, since the security depends largely on the entropy) to derive a 32-byte key. Then they derive subkeys: an auth key, and part of a final encryption key. The other part of the encryption key is 32 bytes of entropy. You store your entropy in an SGX enclave with a limited number of attempts allowed, to combat the possibility of a weak pin.
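
Roughly, in Python (a sketch assuming the argon2-cffi package; the HMAC labels and Argon2 parameters here are illustrative stand-ins, not Signal's actual constants - see the SVR post for the real construction):

    import hmac, hashlib, os
    from argon2.low_level import hash_secret_raw, Type

    def derive_keys(pin: bytes, salt: bytes, enclave_entropy: bytes):
        # 32-byte stretched key from the pin via Argon2.
        stretched = hash_secret_raw(pin, salt, time_cost=32,
                                    memory_cost=16 * 1024, parallelism=1,
                                    hash_len=32, type=Type.ID)
        # Subkeys: one to authenticate to the enclave, one as half
        # of the final encryption key.
        auth_key = hmac.new(stretched, b"auth", hashlib.sha256).digest()
        key_part = hmac.new(stretched, b"enc", hashlib.sha256).digest()
        # The other half (`enclave_entropy`) is the 32 random bytes
        # held in SGX, released only after auth and rate limiting.
        master_key = hmac.new(key_part, enclave_entropy, hashlib.sha256).digest()
        return auth_key, master_key

    auth, master = derive_keys(b"4821", os.urandom(16), os.urandom(32))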

Few things:

1. The vulnerabilities in question for SGX have been patched, only one of which affected Signal at all.

2. Signal preemptively combats any future speculative execution vulns by adding "don't speculate about this next branch" instructions before every single branch.

3. nit: SVR is a scheme to store the 256 bits of entropy in the cloud, not the actual user data. It's unclear from those links whether Signal has actually deployed the "store encrypted contacts" portion.

4. It is concerning that the security of this entropy is tied to Intel's SGX implementation.

5. If you use a strong password, which security nuts would, none of this matters.

6. If you turn off your pin, none of this happens at all (so it's at least opt out but IIRC setting a pin was optional).

7. I don't find your interpretation particularly charitable to the truth of what's actually happened. It's incredibly reactionary.

I will give you:

1. The trust model for Signal has changed to include a dependence on a piece of Signal cloud to enforce a rate limit on (really access to) escrowed entropy IFF you use a weak pin.

2. There does seem to be unnecessary confusion surrounding this whole thing.

What bothers me reading through this is that it was never made clear to users that the security model would change if you enabled a weak pin - in other words, that the strength of your pin/password now matters if you don't/can't/won't trust Signal+Intel. If that had been made clear, there would be no issue at all: concerned citizens would simply disable their pin and deal with the not-improved UX, or choose a strong pin such that the entropy-escrow SVR thing is entirely moot.

I don't think they need to update their privacy policy or user agreement to reflect these technical implementation details, though, as I've previously stated.

Moxie blames the poor reception on not having analytics. I'd say they should have known: it's pretty obvious you can't pretend you don't need a password, and try to hide it from users, if you want to add stuff that needs a password, like usernames. But I also know from first-hand experience how difficult it is to just sit there and say "whelp, we can't build this thing that will make many users happy and make the product better, because it isn't perfect".

What's sad is actually that this is all in service of enabling username messaging and dropping the phone number requirement which is exactly what everyone is yelling about. So it's like, they listen to feedback from people who want to use Signal without a phone number requirement. Then they build the thing that lets them take a crack at the nut. And then they get reamed by HN for having the audacity to try and build a secure solution to a problem that largely only exists on HN and only for Signal (nobody gives a shit that every other app under the sun just stores your contacts in plaintext). Must really suck to get that kind of response.

I'll probably go turn off my pin. I have no interest in signal managing my contacts.


I did oversimplify their encryption scheme, but the issue is that in the end you still only need a pin to get the unencrypted data. I agree that if they'd been honest about passwords and the need for a strong one, this wouldn't be as big an issue. It's because they were not honest that I don't think it's fair to expect their users (even the security nuts) to pick strong ones. Their target demographic will include whistleblowers and journalists who aren't necessarily all that tech-savvy.

The strengths and weaknesses of SGX are debatable, I may lean on the pessimistic side, but as you say it impacts the security model of Signal users and to me that means they (and new users) should be clearly informed. The first line of their privacy policy says "Signal is designed to never collect or store any sensitive information." which is demonstrably false.

As for opting out, unless something has changed they still store your data on the cloud, it's just handled differently:

https://old.reddit.com/r/signal/comments/htmzrr/psa_disablin...

I don't know what options someone has after they've already created a pin, or whether there's a way to remove your data from the cloud. I stopped using Signal before they forced the pin (back when you could still just ignore the notice), and getting real answers to these kinds of basic questions is way more difficult than it should be. This is, again, a service targeting very vulnerable people whose lives and freedom may be on the line.

I was one of those Signal users who wanted them to move away from requiring a phone number too. That said, what I was looking for was something more like Jami. They managed to create a system with usernames and passwords but without phone numbers or accounts keeping your data in the cloud.

I'm not shitting on Signal's efforts overall. A lot of great work went into Signal and I'm pissed I still haven't found a good replacement for it, but the changes they made hurt the security and safety of the people who depend on Signal. They are a massive intelligence target and I can't blame them for anything they were forced to do, and if their goal was to subtly drive people away by raising a bunch of red flags I thank them, but if this is their best effort at communication and building trust how charitable can they expect us to be when two years later so many of their users don't have a clear idea of what's being collected and stored or what that means for their safety?


I just wanted to thank you for the information and the ensuing thread. Very interesting.


hi maybe do all that but also don't reject vaccines :)

before the vaccines plenty of young healthy people died.


> hi maybe do all that but also don't reject vaccines :)

> before the vaccines plenty of young healthy people died.

I think we need to put the words "plenty of young healthy" into context or that statement is at risk of failing a fact check.

Take a glance at the EuroMOMO mortality graphs by age cohort[0] and you'll see this pandemic is definitely skewed towards the elderly, and underlying medical conditions (cardiovascular disease, diabetes, chronic respiratory disease, cancer) are known to increase the likelihood of serious illness and death from COVID-19[1][2]

[0] https://www.euromomo.eu/graphs-and-maps

[1] https://pubmed.ncbi.nlm.nih.gov/34929892/

[2] https://www.who.int/health-topics/coronavirus#tab=tab_1


I just learned that an acquaintance of mine flew on a plane knowing they had tested positive for covid. And my partner's family just had a get-together (she didn't go) while one of her family members had covid.

In the former case, policy changes could reduce the likelihood of people flying with covid (temp checks, affidavits, allowing removal of obviously ill passengers), increase ventilation and filtration on planes, and perhaps even bring back masks, or at the very least encourage or incentivize them in times of high transmission.

I don't know that we can do much about the latter case other than better public health education and perhaps PSAs.


We had to cancel a lot of summer vacation plans as our kid wasn't eligible to get vaccinated (until very recently) and they abruptly cancelled the in flight mask mandate. The scenario you describe is exactly why we wear N95 in the airport and on planes even today. Case counts are going to need to come way down before the masks go away.


People should stay home when they are ill and isolate, with any transmissible illness.

But in 99.999% of cases, non-airtight masks do not work at all. Not even a little bit.


The number of significant digits you provided suggests you're extremely confident of this. Can you provide a source? For instance, there's this study[1] that suggests surgical masks are as effective as n95 masks.

[1] https://www.thelancet.com/journals/lancet/article/PIIS0140-6...


The Lancet also said that HCQ was dangerous and caused extra fatalities in COVID patients. They then retracted that because the data they were basing it on was completely made up. https://www.thelancet.com/journals/lancet/article/PIIS0140-6...

The journals are unreliable on this and many other topics. Objective data suggests that mandatory masking has no effect on transmission.


>The journals are unreliable on this and many other topics. Objective data suggests that mandatory masking has no effect on transmission.

1. It's ironic how much you denounce the value of journals and extol the value of "objective data", yet you have not attached any "objective data" (in any meaningful sense) with your original comment, and have dodged follow up requests for sources.

2. You're moving the goalposts from masks "do not work at all" to "mandatory masking has no effect".


The Lancet knowingly lied about HCQ. Therefore I don't trust them.

Anything less than an airtight full aerosol filtration respirator will not stop aerosols infected with SARS-CoV-2 from leaving and being inhaled by people. Fact. The latest CDC data suggests that "medical masks are better than cloth masks" but never attempts to identify a difference between a cloth mask and nothing.

This article outlines a paywalled paper suggesting masks don't work at all. https://www.cidrap.umn.edu/news-perspective/2020/04/data-do-...

The incidence of infection between areas that are fully mask-mandated can't be proven to differ significantly from areas that had no mask mandate due to many other factors https://www.cebm.net/covid-19/masking-lack-of-evidence-with-....

I don't trust the establishment because they were wrong about everything. Fauci lied about masks, then lied about lying about masks, then lied about HCQ, then about IVM and other treatments. The NIH has provided 0 guidance on treating COVID other than vaccines. They did eventually do a study on HCQ to "disprove" its effectiveness but started the dose at 1200mg/day! Nearly lethal. This of course started on patients that were too far advanced in disease and also had co-morbidities. It was a sham and borderline homicidal. All to protect the emergency use authorization for vaccines and Remdesivir.

A course of HCQ: $10. A course of Remdesivir: $3000. Which is a more fiscally responsible opportunity for a for-profit industry that controls the NIH and FDA?

The NIH staffers including Fauci (and of course the expert who is not an expert: Bill Gates) stand to make a lot of money on vaccines from Moderna and Pfizer.

Fauci also lied, under oath to Congress, about the NIH funding gain-of-function research in the Wuhan lab. https://www.outkick.com/nih-admits-fauci-lied-about-gain-of-...

Historically the entire medical industrial complex and APA lied about the "chemical imbalance" theory for depression for decades while making billions of dollars selling SSRIs and misleading the population for profit: https://www.nature.com/articles/s41380-022-01661-0

It is my opinion that these establishment organizations have 0 credibility anymore and I would never trust anything put forth by them at face value.

2. If masks don't prevent transmission, and mandates are intended to prevent transmission, then mandates don't work. Aristotelian logic.


>Anything less than an airtight full aerosol filtration respirator will not stop aerosols infected with SARS-CoV-2 from leaving and being inhaled by people. Fact.

Even if we suppose this is true, it does not support the claim that "in 99.999% of cases, non-airtight masks do not work at all. Not even a little bit". At best it supports the claim that "non-airtight masks" are not 100% effective, which is an entirely different claim.

>The latest CDC data suggests that "medical masks are better than cloth masks" but never attempts to identify a difference between a cloth mask and nothing.

Okay, but this feels like you're moving the goalposts again. In your original comment you were talking about "non-airtight masks" in general, not cloth masks in particular. Also, even if we grant that cloth masks are the same as wearing nothing, the fact that medical masks are better than cloth masks/nothing still contradicts your original claim that "non-airtight masks" (ie. including surgical masks) "do not work at all" in "99.999% of cases".

>This article outlines a paywalled paper suggesting masks don't work at all. https://www.cidrap.umn.edu/news-perspective/2020/04/data-do-...

The paper actually isn't paywalled. You have to log in to download pdf, but you can view the paper using the web viewer without any login: https://nap.nationalacademies.org/read/25776/chapter/1

I skimmed the paper and I disagree with the characterization that it "suggest[s] masks don't work at all". The conclusion seems to be "there's no evidence that masks work in practice (i.e., because no such studies were conducted, not because studies were conducted and turned up negative), but evidence does seem to suggest that they work at capturing virus particles". The relevant parts from the conclusion:

"There are no studies of individuals wearing homemade fabric masks in the course of their typical activities. Therefore, we have only limited, indirect evidence regarding the effectiveness of such masks for protecting others, when made and worn by the general public on a regular basis. That evidence comes primarily from laboratory studies testing the effectiveness of different materials at capturing particles of different sizes.

The evidence from these laboratory filtration studies suggests that such fabric masks may reduce the transmission of larger respiratory droplets. There is little evidence regarding the transmission of small aerosolized particulates of the size potentially exhaled by asymptomatic or presymptomatic individuals with COVID-19. The extent of any protection will depend on how the masks are made and used. It will also depend on how mask use affects users’ other precautionary behaviors, including their use of better masks, when those become widely available. Those behavioral effects may undermine or enhance homemade fabric masks’ overall effect on public health. The current level of benefit, if any, is not possible to assess."

>The incidence of infection between areas that are fully mask-mandated can't be proven to differ significantly from areas that had no mask mandate due to many other factors https://www.cebm.net/covid-19/masking-lack-of-evidence-with-....

Again, moving the goalposts. This is talking about the effectiveness of mask mandates, your original claim was "in 99.999% of cases, non-airtight masks do not work at all. Not even a little bit.".

>2. If masks don't prevent transmission, and mandates are intended to prevent transmission, then mandates don't work. Aristotelian logic.

Right, but "do masks work" and "do mask mandates work" are two separate questions. I suspect your intention might be to argue the latter, but by overplaying your hand (ie. making the bold and unfounded claim that non-N95 masks don't work at all) you ended up getting dismissed/downvoted.

> The Lancet knowingly lied about HCQ. Therefore I don't trust them.

>I don't trust the establishment because they were wrong about everything. Fauci lied about masks, then lied about lying about masks, then lied about HCQ, then about IVM and other treatments. The NIH has provided 0 guidance on treating COVID other than vaccines. They did eventually do a study on HCQ to "disprove" its effectiveness but started the dose at 1200mg/day! Nearly lethal. This of course started on patients that were too far advanced in disease and also had co-morbidities. It was a sham and borderline homicidal. All to protect the emergency use authorization for vaccines and Remdesivir.

>A course of HCQ: $10. A course of Remdesivir: $3000. Which is a more fiscally responsible opportunity for a for-profit industry that controls the NIH and FDA?

>The NIH staffers including Fauci (and of course the expert who is not an expert: Bill Gates) stand to make a lot of money on vaccines from Moderna and Pfizer.

This is getting derailed from the original discussion of masks, so I'm explicitly going to not respond to it.


"Lose weight, get in shape" is good advice that would lower all cause mortality if it reliably resulted in such a change. The advice that we ought be perfect to avoid mortality doesn't take into account reality.

I agree with your statement in the small (yes, get fit, try to motivate friends and family to do the same), but it's not useful to moralize about what-if on a population level in my opinion, and it is tantamount to blaming the victims of the pandemic for bad outcomes. Especially before the vaccines, many young and healthy people died or experienced very bad outcomes, and it still happens, though the vaccines have helped a ton in this regard.


Well, I obviously don’t mean to blame the victims here. Sorry if you or anyone took it that way. Obviously it’s tragic when anyone has a bad time or dies from this. To be more clear, my ire is directed at public health officials who would do a lot of good by simply encouraging more physical activity and for other political leaders who don’t use their power to advocate for infrastructure changes in the US that get people to be more physically mobile (more mass transit, and so on).


A small bit of nuance that could explain a range of $3.50 to $22 on a given day: they could be referring to different token types. The native token of Ethereum (Ether) costs a fixed, smaller number of units of computation to send, whereas custom tokens (ERC-20) have to do some additional bootstrapping work during transactions and cost more to send.

So the fee is the units of computation used x the "gas price".
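
Back-of-the-envelope, that one multiplication already explains the spread (a Python sketch with illustrative numbers, not current ones):

    GWEI = 1e-9  # ETH per gwei

    def fee_usd(gas_units: int, gas_price_gwei: float, eth_usd: float) -> float:
        # fee = computation used x price per unit of computation
        return gas_units * gas_price_gwei * GWEI * eth_usd

    # A plain Ether transfer is a fixed 21,000 gas; an ERC-20 transfer
    # runs contract code and typically costs roughly 50,000-65,000 gas.
    print(fee_usd(21_000, 60, 2_800))   # ~$3.5 on a quiet day
    print(fee_usd(65_000, 120, 2_800))  # ~$22 for a token when it's busy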

But yes, the "gas price", which serves to price congestion on the network, is wildly variable given the low throughput. And that limited throughput is currently a real practical problem, as you've pointed out: the fees are too high.

The answer will likely be in L2s, as other people are pointing out. I think one interesting side effect of this ecosystem is that it is actually driving new research on, and applications of, cryptography (including substantial funding): for example, practical uses of zero-knowledge proofs.


Exactly this... it's not an apples-to-apples comparison. Transferring an NFT is a larger transaction, so the cost is higher (hence the $22 price they are referring to). The cost of a transaction is also based on how busy the blockchain is - a way to discourage transactions when it's really busy. It's as if the price of gas at a gas station (or actually, more like a fee to use the gas pump) fluctuated based on how long the lines were.


This isn't really true. You submit a transaction with a maximum gas price. All wallets that I know of will show a maximum transaction fee based on this before you submit the transaction, and you can manually lock in any arbitrary gas limit you would like. If gas prices suddenly go up right as you submit your transaction (read: other network users outbid you for use of the network), your transaction will stay pending and either eventually process at the gas price specified, or fail and you will pay either nothing or a partial fee based on your maximum gas price * computation steps used.

Gas is basically a means to ensure that all computation on Ethereum halts.
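
In other words, the worst case is bounded before you sign (again, a sketch with made-up numbers):

    gas_limit = 65_000        # the most the transaction may consume
    max_price_gwei = 150      # the most you'll pay per unit of gas

    # What the wallet shows up front is an upper bound, not a quote;
    # the actual fee is gas actually used x the price actually paid.
    max_fee_eth = gas_limit * max_price_gwei * 1e-9
    print(max_fee_eth)  # 0.00975 ETH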


I'm not a fan of Yuga or "land" sales on blockchains, but this argument comes out every time there is something even tangentially related to proof-of-work blockchains. There is no significant marginal expenditure of electricity per NFT minted or transferred. The same order of magnitude of energy is expended to ensure the security of the blockchain whether a large NFT mint happens or not so long as the blockchain exists.

What really happened here was that the company that launched this did it in a way that congested the Ethereum network, and an auction essentially occurred in "gas" to use the network for this purpose for a couple of hours. Instead, the company should have pre-allocated guaranteed minting on a lottery basis so it didn't devolve into a "gas" auction, or increased supply to match demand, which they knew in advance (like you said, the limits here are artificial).


> The same order of magnitude of energy is expended to ensure the security of the blockchain whether a large NFT mint happens or not so long as the blockchain exists.

That's not completely true. The amount of resources wasted per hour is proportional to the value of the coin. Lacking any genuine uses for cryptocurrencies, people determine coin value by things like transaction volume, market capitalization, influencer endorsements, and active wallets instead. Thus every additional user, transaction, and especially the common million-dollar wash trades designed to pump their respective projects contribute significantly to the waste that is generated.


That's only true in the long term. It's not true as applied to this particular sale.

https://ycharts.com/indicators/ethereum_network_hash_rate


A better way to look at the gas price is that it is the cost of decentralization.

One single entity being the source of truth will always be cheaper than a blockchain verifying a transaction. The caveat is if that cost is worth it.

For buying tangible goods, the gas price might be worth it since Visa/MC control what you can sell online and payment processors can be circumvented using crypto. That is a pain point.

For intangible goods, the gas price is simply not worth it. What pain point exists for owning digital goods that would require decentralization? In my opinion, none.


Something like BAYC couldn't have been built in a traditional centralized manner. Four founders had the idea for this club and were able to build it up on the back of Ethereum, without needing any permission or their own infrastructure. It might not be worth it, but that's something unique that wasn't possible before.


Serious question: what did they actually build?

They create images and sell them on someone else's network. I could list a JPG on eBay without having to run any infrastructure either.


Listing JPGs on eBay wouldn't get you a cut every time one of those JPGs was resold. You couldn't easily airdrop items to holders of your apes, etc.


They built a weird business that made them rich.

If it wasn't built on crypto hype, it wouldn't have been popular.

