Hacker News | knorker's comments

For clients connecting to Google, it's only half the internet.

Half. The. Internet.

What a failure. /s


This is mainly due to mobile devices being issued only IPv6 addresses by the telcos' 4G networks. They are the only ones using IPv6 at the millions-of-clients scale.

Comcast/Xfinity implemented v6 on their residential cable network 14 years ago (https://corporate.comcast.com/comcast-voices/ipv6-deployment...)

Most other large eyeball networks have as well.


My current home ISP and my last one both support IPv6 just fine. It is not a mobile-only thing.

Everything supports both. We are talking about being issued only IPv6 addresses where you actually use it to connect to stuff.

Most mobile devices are issued only an IPv6 address, so when the masses do Google searches they use IPv6, which makes it look like there is huge adoption.


Unsurprisingly Google actually does also have IPv4 addresses. What they're measuring isn't "How did you reach our servers?" but instead "Could you have reached our IPv6 servers?"

So that measures everybody who has working IPv6. https://www.google.com/intl/en/ipv6/statistics.html


> We are talking about being issued only IPv6 addresses where you actually use it to connect to stuff.

You seem to be asserting that dual-stack machines use IPv4 by default, but that's not really true. If your machine has both IPv4 and IPv6 connectivity, browsers will in fact use IPv6 to connect to sites that support it, like Google. They prefer IPv6 by default and fall back to IPv4 if IPv6 is slower (Happy Eyeballs algorithm).

Of course, random software can mostly use whichever it wants, so I'm not claiming every process on such a machine will use IPv6, but most common stuff does.
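The dual-stack preference described above can be sketched as a race with a staggered start, roughly in the spirit of the Happy Eyeballs algorithm (RFC 8305). The connect callables, delay value, and return strings here are illustrative stand-ins, not a real network stack:

```python
import time
from concurrent.futures import ThreadPoolExecutor, wait, FIRST_COMPLETED

ATTEMPT_DELAY = 0.25  # RFC 8305 recommends ~250 ms before starting the fallback attempt

def happy_eyeballs(connect_v6, connect_v4):
    """Prefer IPv6; start IPv4 only after a short delay, take whichever wins."""
    with ThreadPoolExecutor(max_workers=2) as pool:
        v6 = pool.submit(connect_v6)
        done, _ = wait([v6], timeout=ATTEMPT_DELAY)
        if done and not v6.exception():
            return v6.result()          # IPv6 succeeded quickly: use it
        v4 = pool.submit(connect_v4)    # race IPv4 against the still-pending IPv6
        pending = {v6, v4}
        while pending:
            done, pending = wait(pending, return_when=FIRST_COMPLETED)
            for f in done:
                if not f.exception():
                    return f.result()
        raise ConnectionError("both address families failed")

# Simulated connects: the IPv6 path is slow, IPv4 is fast.
slow_v6 = lambda: (time.sleep(0.6), "conn-v6")[1]
fast_v4 = lambda: (time.sleep(0.05), "conn-v4")[1]
print(happy_eyeballs(slow_v6, fast_v4))  # IPv4 wins the race here
```

A real implementation also sorts and interleaves resolved addresses across families; this only shows the staggered race, which is why a healthy dual-stack client ends up counted as IPv6 in Google's statistics.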


By making the exposition unintelligible, I guess he's illustrating how complicated time travel makes a story.

Except Primer did that without this crutch.


To me it's not about art. It's about this setting making the production quality of a billion dollar movie look like a cardboard SNL set.

When walking past a high-end TV I've honestly mistaken a billion-dollar movie for a teen weekend project because of this. It's only when I think "hang on, how is Famous Actor in this?" that I realise it's a Marvel movie.

To me it's as if people who don't see it are saying "oh, I didn't even realise I'd set the TV to black and white".

This is not high art. It's... well... the soap opera effect.


If films were shot at a decent enough frame rate, people wouldn't feel the need to try to fix it. And snobs could have a setting that skips every other frame.

The same goes for sound and (to a much lesser extent) contrast.

Viewers need to be able to see and hear in comfort.
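The two operations above, smoothing for comfort and a skip-every-other-frame setting for purists, can be sketched on frames represented as flat lists of pixel brightness values. The naive blending here is an illustrative stand-in for real motion-compensated interpolation, which is far fancier:

```python
def interpolate(frames):
    """Double the frame rate by inserting the average of each adjacent pair."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        out.append([(x + y) / 2 for x, y in zip(a, b)])
    out.append(frames[-1])
    return out

def decimate(frames):
    """Keep every other frame, e.g. showing 48 fps material at 24 fps."""
    return frames[::2]

clip = [[0, 0], [10, 20], [20, 40]]  # 3 frames, 2 "pixels" each
doubled = interpolate(clip)          # 5 frames
assert decimate(doubled) == clip     # decimation recovers the original frames
print(doubled)
```

Note the asymmetry: dropping frames from high-rate material loses nothing that was not captured anyway, while synthesizing frames has to invent data, which is where interpolation artifacts come from.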


If you think this is about snobbery, then I'm afraid you've completely misunderstood the problem.

This is more comparable to color being turned off. Sure, if you're completely colorblind, then it's not an issue. But non-colorblind people are not "snobs".

Or if dialog is completely unintelligible. That's not a problem for people who don't speak the language anyway, and would need subtitles either way. But people who speak English are not "snobs" for wanting to be able to understand dialog spoken in English.

I've not seen a movie filmed and played back at a high frame rate. It may be perfectly fine (for me). In that case it's not about the frame rate, but about the botched interpolation.

Like I said in my previous comment, it's not about "art".


There is no such thing as the soap opera effect. Good quality sets and makeup and cameras look good at 24 or 48 or 120 fps.

People like you insisting on 24 fps forces people like me to choose, unnecessarily, between not seeing films, seeing them with headaches, or seeing them with some interpolation.

I will generally choose the last until everything is at a decent frame rate.


> There is no such thing as the soap opera effect.

What has been asserted without evidence can be dismissed without evidence.

I'll take the Pepsi challenge on this any day. It looks horrible.

> Good quality sets and makeup and cameras look good at 24 or 48 or 120 fps.

Can you give an example of ANY movie that survives TV motion interpolation settings? By this definition, billion-dollar movies don't have good quality sets and makeup.

E.g. MCU movies are unwatchable in this mode.

> People like you insisting on 24 fps

I don't. Maybe it'll look good if filmed at 120fps. But I have seen no TV that does this interpolation where it doesn't look like complete shit. No movie on no TV.

Edit: I feel like you're being dishonest by claiming that I insist on 24 fps. My previous comment said exactly that I don't, already, and yet you misrepresent me in your very reply.

> causes people like me to unnecessarily [… or …] seeing them with some interpolation

So you DO agree that the interpolation looks absolutely awful? Exactly this is the soap opera effect.

I know that some people can't see it. Lucky you. I don't know what's wrong with your perception, but you cannot simply claim that "there's no such thing" when it's a well known phenomenon that is easily reproducible.

I've come to friends' houses and as soon as the TV comes on I go "eeew! Why have you not turned off motion interpolation?". I have not once been wrong.

"There's no such thing"… really… who am I going to believe? You, or my own eyes? I feel like a color blind person just told me "there's no such thing as green".


I agree with you that the interpolation isn’t ideal, I’m not praising it. It’s merely a necessity for me to not get headaches. It’s also much less noticeable on its lowest settings, which serve just to take the edge off panning shots.

The “soap opera effect” is what people call video at higher than 24 fps in general; it has nothing to do with interpolation. The term was in use for decades before interpolation even existed. You seem to be confused on that point.

Source video at 120 looks no worse than at 24, that’s all I’m saying.


Yeah, but the soap opera effect isn't only about frame rate, either.

Earlier video cameras exposed the pixels differently, sampling the image field in the same linear fashion that it was scanned on a CRT during broadcast. In the US this was also an interlaced scanning format. This changes the way motion is reproduced. The film will tend to have a global motion blur for everything moving rapidly in the frame, where the video could have sharper borders on moving objects, but other distortions depending on the direction of motion, as different parts of the object were sampled at different times.

Modern digital sensors are somewhere in between, with enough dynamic range to allow more film-like or video-like response via post-processing. Some are still rolling shutters that are a bit like traditional video scanning, while others are full-field sensors and use a global shutter more like film.
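The rolling-vs-global shutter difference described above can be sketched on a toy "frame": a vertical bar moving right at constant speed. With a global shutter every row samples the bar at the same instant; with a rolling shutter each row is read out slightly later than the one above it, so the bar comes out slanted. The speeds and readout times are made-up numbers, not a real sensor model:

```python
ROWS = 8
SPEED = 2.0     # bar position advances 2 px per row-time (arbitrary units)
ROW_TIME = 1.0  # time to read out one row

def bar_x(t):
    """Horizontal position of the moving bar at time t."""
    return SPEED * t

def global_shutter():
    t = 0.0  # every row exposed at the same instant
    return [round(bar_x(t)) for _ in range(ROWS)]

def rolling_shutter():
    # row y is sampled y * ROW_TIME later than row 0
    return [round(bar_x(y * ROW_TIME)) for y in range(ROWS)]

print(global_shutter())   # straight bar: same x in every row
print(rolling_shutter())  # skewed bar: x grows with row index
```

The same row-by-row timing is why fast pans and propellers produce the characteristic skew and wobble on rolling-shutter sensors, while film and global-shutter sensors render them as uniform motion blur.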

As I understand it, modern digital sensors also allow more freedom to play with aperture and exposure compared to film. You can get surprising combinations of lighting, motion blur, and depth of field that were just never feasible with film due to the limited sensitivity and dynamic range.

There are also culturally associated production differences. E.g. different script, set, costume, makeup, and lighting standards for the typical high-throughput TV productions versus the more elaborate movie production. Whether using video or film, a production could exhibit more "cinematic" vs "sitcom" vs "soapy" values.

For some, the 24 fps rate of cinema provides a kind of dreamy abstraction. I think of it almost like a vague transition area between real motion and a visual storyboard. The mind is able to interpolate a richer world in the imagination. But the mature techniques also rely on this. I wonder whether future artists will figure out how to get the same range of expression out of high frame rate video or whether it really depends on the viewer getting this decimated input to their eyes...


You have never seen a movie at 120fps. Gemini Man exists at 60fps and that is as close as you are going to get. That Blu-ray is controversial due to its frame rate. I thought it was neat, but it 100% looks and feels different from other movies.

Thanks. I'll give it a try.

> You seem to be confused on that point

Please stop repeatedly misrepresenting what I said. This is not reddit.

I have repeatedly said that this is about the interpolation, and that I'm NOT judging things actually filmed at higher framerates, as I don't have experience with that.

> Source video at 120 looks no worse than at 24, that’s all I’m saying.

Again, give me an example. An example that is not video games, because that is not "filmed".

You are asserting that there's no such thing as something that's trivially and consistently repeatable, so forgive me for not taking you at your word that a 120fps filmed movie is free of soap opera effect. Especially with your other lying.

So actually, please take your misrepresentations and ad hominems to reddit.

Edit: one thing that looks much better with motion interpolation is panning shots. But it's still not worth it.


There is plenty of 50/60 and even 120 footage out there, some is even popular. I’m sure you can find it yourself.

I don’t see what I lied about or what Reddit has to do with anything. I will definitely stop replying to someone so needlessly aggressive.


So true, everybody else is wrong and you're right.

Getting headaches from low frame rate is rare, I guess. I only know a few others with this problem.

But preferring high frame rate is common, as evidenced by games and the many people who use TV interpolation features.


There is no evidence that people prefer high-frame-rate movies. Motion interpolation on TVs is on by default, not a conscious choice the end user is making.

> Duffer’s advice highlights a conflict between technological advances and creators' goals

I wouldn't call it a "technological advance" to make even the biggest blockbuster look like it was filmed with a 90s camcorder with cardboard sets.

TruMotion and friends are indeed garbage, and I don't understand how people can leave it on.


Not never. Woz championed some of that in the 1970s. It's before my time, but the Apple II was pretty open as I understand it.

Apple has not been nice and open since the 1970s. The only open and nice person in any important role is Wozniak.

They could also write the comment in French, and by the same argument people should need to go out of their way to copy-paste that into google translate.

Thousands of people are going to read this thing. The writer could spare thousands of people tens of seconds each (totaling days of human life) by simply spending a second spelling out the obscure term.


Boy are you going to be surprised when you find out that there is an entire French literary tradition that doesn't concern itself with who does and does not speak the language.

Is this some snarky reddit comment?

Yeah, there are literally billions of people in the world who don't speak English. And yet HN is de facto English.

Do you see many people commenting on HN in French? How's that working out for them? Are they succeeding or failing to communicate?

It seems that people are, for the most part, succeeding in tailoring their message for their audience.


Who's going all the way to Google translate to copy and paste? You just select the text and right click/long press and select translate.

I'm not sure what you are attempting to add by being pedantic while not affecting the conclusion in any way whatsoever.

I've worked in software engineering on Internet things for decades and I have not once heard or seen this abbreviated before.

Uh, no. Scott Adams is not a one-mistake person. This is a years-and-years thing.

You're really rewriting history, here.

I have no problems forgiving people for mistakes, but no this is absolutely not one of those cases.


The fact that you can't return from there makes for a huge difference, though.
