I've upgraded hundreds of projects. The breaking-change docs are organized by version. For smaller solutions you can create a new solution, pull in the code, and fix everything that breaks. There is also an upgrade tool that can work for smaller projects. But the fastest way for normal-sized solutions is to update version by version up to the closest LTS, then go LTS to LTS (for example, .NET Core 3.1 → 5 → 6, then 6 → 8). And .NET doesn't only break in code; it breaks in behavior, so you really want to test each of those changes in isolation.
Countries outside the US exist, some with extremely low average incomes that nevertheless have segments of the population technically competent enough not only to understand what Docker is, but to use it regularly.
The NYT is a US paper, so framing the question this way isn't surprising, and the comparison of someone who can't afford the NYT but knows what Docker is, is interesting even without your addition.
There are other things we could mention, like the many people who can afford the NYT but still don't want to pay for it, but that's not what we were talking about. That said, thanks for the reminder that other countries exist... I'm sure everyone on HN forgot about globes.
I myself don't wish for or want free content; I can create my own entertainment without a large media corporation or a keyboard warrior spoon-feeding me.
I don't think comparing journalism to Netflix or ESPN is relevant, since those provide quality entertainment (at least in the minds of their fans), versus journalists who stretch two bits of searchable information into a 10-page ad puzzle full of psychological poking.
Yes, most journalism is less valuable than the critically acclaimed fantasy horror Stranger Things. That doesn't mean journalism is less important, or that good journalism doesn't exist. Honestly, it's crazy to me that journalism doesn't draw more criticism: most of it just sensationalizes, fearmongers, and points fingers.
It's not unusual for Americans to live insular lives where the rest of the world doesn't exist in their worldview. The globe snark is unnecessary and frankly not worthy of an HN comment.
And assuming that no one outside of the US could possibly be interested in US-oriented articles in the NYT - not to mention their world news - is just another example of the insular attitude I'm referring to.
Not exactly - they apply to whichever direction dictates your capacity, be it ingress or egress. Overwhelmingly, capacity for hyperscalers is dictated by egress peaks. Since most non-residential network links are delivered symmetrically, the capacity for the other direction is already there as a by-product: provision a symmetric 100 Gbps port for an 80 Gbps egress peak, and the inbound capacity comes along for free.
Also, don't underestimate the benefit of simplification - why bill for two things separately when one of them is the primary driver of the cost, the comparative cost of supplying the other is negligible, and billing for it is probably more effort than it's worth?
I'm not dismissing the vendor lock-in aspect, but I don't think it's the only factor at play.
I live in Australia and have noted the same phenomenon on Netflix. There is an absolute dearth of new, quality content coming through. I've seen no indication that formerly exclusive content from other platforms is making its way to Netflix in my country.
I don't believe the person you're replying to is overstating at all, and certainly not 'considerably'.
Yeah, a friend of mine had this, and he needed surgery to avoid losing his left leg to permanent numbness from the disc pressing on the nerves. I know one other person with the exact same issue and solution, and I'm also under the impression that without surgery, only bad things happen.
It's a true statement; LD_PRELOAD cannot be used with statically linked binaries. You can "fiddle" in other ways, but not by using the LD_PRELOAD attack vector (though personally I wouldn't call it an "attack vector", except in cases where you can upload a malicious file and control the environment of another program somehow, or something along those lines).
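For anyone unfamiliar with the mechanism, here's a minimal sketch of what an LD_PRELOAD hook looks like (the file name and the choice of getenv as the target are just for illustration):

    /* hook.c -- a minimal LD_PRELOAD interposer (illustrative sketch).
     * Build: gcc -shared -fPIC -o hook.so hook.c -ldl
     * Run:   LD_PRELOAD=./hook.so ./some_dynamically_linked_program
     */
    #define _GNU_SOURCE
    #include <dlfcn.h>
    #include <stdio.h>

    /* This getenv() shadows libc's, because the dynamic linker
     * searches the preloaded object first. */
    char *getenv(const char *name)
    {
        /* Look up the "real" getenv further down the search order. */
        char *(*real_getenv)(const char *) =
            (char *(*)(const char *))dlsym(RTLD_NEXT, "getenv");

        fprintf(stderr, "[hook] getenv(\"%s\")\n", name);
        return real_getenv(name);
    }

Compile the target with -static and the hook never fires: its calls were resolved at link time, so the dynamic loader, and therefore LD_PRELOAD, never enters the picture.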
> Which is unambiguously not the case, they merely slow it down by a small margin.
When you are using shared libraries, it is fairly trivial to hook into the library calls and replace them with whatever you want. When you are using static libraries, the linker and optimizer can, for example, inline the machine code directly into the application code. What tools do you have to do similar tricks with statically compiled binaries?
Even spouses of Googlers suffer from this malady occasionally. I have a close friend whose wife works for Google as a tech writer, and every time the subject of Google's decline in search quality comes up he passionately defends them as light-years ahead of the competition, untouchable, always on the edge of innovation, etc.
I really don't get where it comes from; it's just a company, not a religion!
Funny, I see the problem in the opposite direction.
Google's results seem perfectly fine to me, and I know they have metrics tracking how well search performs. And it's their golden goose, so it makes no business sense for them not to maintain its quality.
The "religion" seems, to me, to be people who keep insisting its results are getting worse. And I don't get it, because I just don't see it at all in my own experience. As far as I can tell, it's just anti-Google bias or something.
This thread has someone else asking for actual examples of where Google gives bad results and other search engines are great. And once again, nobody seems able to provide any. If Google really were declining, you'd think people would have put together objective evidence of it, because it'd be a heck of a news story.
There are a lot of queries where it insists that the results come from common websites like Reddit or major media outlets. Smaller websites don't show up even if they have more relevant content.
There are others where the results are all SEO content-marketing spam.
The thing it rarely seems capable of doing now is showing legitimate non-spam content from smaller websites.
It also used to be possible, when you were looking for a specific web page you'd already seen, to type in a bunch of terms you knew were on that page, and it would come right up because it was the one page with all of those terms together. That doesn't seem to work anymore.
Lots of things that don't make a great deal of business sense nevertheless occur in large corporations. Frequently. I'm not sure that's a useful metric by which to judge a situation.
I personally didn't reply to that other poster, simply because their tone suggested (to me) that they were looking for an argument more than a civil debate. Yours is very different, and thank you for that. This is something I'd genuinely like to get to the bottom of, because from my perspective there's been a noticeable, ongoing decline in the quality of Google's search results.
So, challenge accepted. Here's an example for you, and it's not even that subjective: run a Google search that includes a minus term and tell me how well the results obey that instruction. That was a feature I used on a daily basis until Google made it completely non-functional.
If you want an example that doesn't rely on specific features, try searching for "<thing you might want to buy> review". Enjoy the avalanche of low-quality SEO and link spam you're about to receive.
Also consider the fact that many searches you perform on Google are already offered with a "reddit" suffix, because so many people can only find the information they're looking for by appending that term to their search. Google by itself just cannot find what you want any more.
If that's not enough for you and you want specific examples, that's certainly possible. I was searching for a solution to a programming-related problem on Google just a few days ago, and came up empty-handed after 20+ frustrating minutes of it mixing in "related" search terms that weren't relevant. Reconstructing that search wouldn't be terribly difficult, if there were an indication the effort was worthwhile. In the end I went to ChatGPT and got my answer, including a fully functional example, in less than 30 seconds. An experience that is becoming more and more the norm as time goes on.
First of all, thanks for writing all that. I appreciate it, and so I'd like to respond.
> Run a Google search that includes a minus term and tell me how well the results obey that instruction.
Sure. I search for the movie I watched last night -- "How the West Was Won" (without quotes) -- and nearly all of the results are for the 1962 movie, as it should be. I search for "How the West Was Won -1962" and all the results are for the 1977 TV series, plus a Led Zeppelin album. Works great -- I use the minus operator all the time.
> try searching for "<thing you might want to buy> review". Enjoy the avalanche of low-quality SEO and link spam you're about to receive.
I just typed in "dehumidifier review" and the first three results are from Wirecutter, Consumer Reports, and Good Housekeeping. Further down the page are more trustworthy sources like Tom's Guide and The Spruce. This is exactly what I want the results to be. I trust those a lot more than some random blogger or YouTuber, for instance.
> Also consider the fact that many searches you perform on Google are already offered with a "reddit" suffix, because so many people can only find the information they're looking for by appending that term to their search
Which is fantastic for me. I'll often find really valuable opinions in a single Reddit thread that provide a different perspective from Wirecutter, for instance. This isn't a failure of Google -- it's a testament to Reddit's success. (Indeed, Wirecutter and Reddit are often the first two things I want to read when researching a product -- but Reddit is more of a second pass.)
> I was searching for a solution to a programming-related problem on Google just a few days ago
I definitely agree that finding incredibly specific solutions to technical problems can sometimes be hit-or-miss, but that's not a problem with Google at all; it's a problem with the entire concept of keyword search. To my eyes, Google hasn't gotten worse over a couple of decades; it's gotten better. And I think it's noteworthy that your solution was ChatGPT, as opposed to a different search engine. That's exactly the kind of thing I turn to ChatGPT for as well. But again, I don't interpret that as Google having gotten worse, just as a new tool that's even better for certain types of tasks.
As another programmer, and one with decades of experience building software running critical infrastructure, I have a very different view.
You seem to be expecting perfection, or a reasonable facsimile of it, but that's not the bar that's being set by the existing solution (people).
Take your example of even a pedestrian task like buying airline tickets: humans generally have a worse error rate than a computer doing the same job.
Source: I live in the Philippines, where manual handling is still de rigueur for everything from buying ferry tickets to immigration paperwork to car registration and driving licenses. The error rate is far, far higher than with the automated systems that handle these things in other countries.
Similarly, humans (in aggregate) are really not very good at driving cars safely. Software, on the other hand, is only getting better at it as time goes on. It's perhaps debatable whether computers are currently better (again, in aggregate) than humans, but given the current state of the art, my view is that they are.
I'd welcome these things with open arms in the Philippines compared to the average driver on the road. Or equally so in my home country of Australia, or the US, or anywhere where humans kill each other every day in fast-moving steel cages.