
Hats off to you and your company. I wish more companies could put up a notice like you did, much less show up as a CEO on frickin' HN and be willing to take responsibility as well as desire to do better. I honestly am a little confused how a person like you exists. The FAA should put someone like you in charge of Boeing.


> I would love to read a study on why people so readily believe and trust in AI chatbots.

We associate expert authority with a) quick and b) broad answers. It's like when we're listening to a radio show and they patch in "Dr So N. So," an expert in Whatever from Academia Forever U. They seem to know their stuff because a) they don't say "I don't know, let me get back to you after I've looked into that" and b) they can share a breadth of associated validations.

LLMs simulate this experience by giving broad-ish, confident answers very quickly. We have been trained by life's many experiences to trust these types of answers.


I largely agree with sibling responses.

BUT...

How do you have code review be an educational experience for onboarding/teaching if any bad submission is cut down with due prejudice?

I am happy to work with a junior engineer who is trying, even if we have to loop on some silly mistakes, and to pick and choose which battles to fight so as to balance building confidence with developing good skills.

But I am not happy to have a junior engineer throw LLM stuff at me, inspired by the confidence the sycophantic AI engendered in them, and then have to churn on that. And if you're not in the same office, how do you even hope to sift through which bad parts are which kind?


To mentor requires a mentee. If a junior is not willing to learn (reasoning, coming up with a hypothesis, implementing the concept, and verifying it), then why should a senior bother to teach? As a philosopher once said, a teacher is not meant to give you the solution, but to help you come up with your own.


Code review as an educational device is done. We're going to stop caring about the code before people who are bad programmers right now have time to get good.

We need to focus on architectural/system patterns and let go of code ownership in the traditional sense.


Aren't you effectively saying that no one will understand the code they're actually deploying? That's always true to an extent, but at least today you mostly understand the code in your sub area. If we're saying the future is AI + careful review, how am I going to have enough context to even do that review?


I expect that in most cases you'll review "hot spots" that AI itself identifies while trusting AI review for the majority of code. When you need to go deeper, I expect you'll have to essentially learn the code to fix it, in roughly the same way people will occasionally need to look at the compiler output to hunt down bugs.


Human trust has to be earned; why should AI trust be any different? If I'm supposed to yolo-approve any random code a machine spits out, it had better prove to me it's nearly flawless; otherwise I'm applying the same review regimen I apply to any other code. To do otherwise is to shame the word "engineering" and the field thereof.


Engineering is a game of tradeoffs. Time is one of the things you have to trade off; given your strong opinions, I expect you've been in the industry long enough to understand that intuitively.

Regarding proof, if you have contracts for your software write them up. Gherkin specs, api contracts, unit tests, etc. If you care about performance, add stress tests with SLOs. If you care about code organization create custom lint rules. There are so many ways to take yourself out of the loop rigorously so you can spend your time more efficiently.
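
As a minimal sketch of that idea (the discount() function and its rules below are hypothetical, purely to show the shape of a contract-as-test):

  # A "contract" encoded as plain tests. discount() stands in for whatever the
  # generated code is supposed to provide; the name and the rules are made up.
  def discount(price: float, percent: float) -> float:
      return max(0.0, price * (1 - min(percent, 100) / 100))

  def test_discount_never_goes_negative():
      assert discount(10.0, 150) == 0.0

  def test_discount_never_exceeds_the_original_price():
      assert 0.0 <= discount(10.0, 30) <= 10.0

  if __name__ == "__main__":
      test_discount_never_goes_negative()
      test_discount_never_exceeds_the_original_price()
      print("contract holds")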


> Regarding proof, if you have contracts for your software write them up. Gherkin specs, api contracts, unit tests, etc.

We really need widespread adoption of stuff like design-by-contract in mainstream PLs before we can seriously talk about AI coding.


Interesting. That was not my perception of Sun at all. “The network is the computer” was a marketing campaign. Java was a language developed for IoT/toasters, and then hard pivoted to a write-once-run-anywhere weblet language (ultimately to be replaced by a guy who threw together an integerless programming language that sounded like a skin condition, renamed to ride the crest of the energy that Sun's marketing money threw at things).

Sure, Solaris was rock solid, but it was also pretty conservative in its march forward as a Unix, being ultimately trumped by Linux.

Sun had an amazing team of people who worked on the Self project, led by David Ungar and others (including Lars Bak, who later helped give us V8). They let the whole team go, who then went off and did some cool things with dynamic optimization, which Sun ultimately ended up hiring/buying back to create the HotSpot VM.

Any NIH and other dysfunctionality went far beyond the engineers at Sun.


> “The network is the computer” was a marketing campaign.

No, not at all. It became a marketing campaign in the very late-90s dot-com boom, but the concept that defined Sun goes back to the beginning, 1984. Back then, that was a radical vision, and Sun truly lived it internally for a long time.

https://en.wikipedia.org/wiki/The_Network_is_the_Computer


It's a small side point, but the skin-disease name came later:

Mocha -> LiveScript -> JavaScript -> EczemaScript or whatever

[0] https://en.wikipedia.org/wiki/ECMAScript


> an integerless programming language

Technically true-ish, but deserves an important qualifier. The Javascript number format has a huge "safe space" of integers between

  Number.MIN_SAFE_INTEGER => -9007199254740991 (-(2^53 - 1))
  Number.MAX_SAFE_INTEGER => 9007199254740991 (2^53 – 1)
Also, the number format is a standard, not only used by JS, and given that it was supposed to be a minimal scripting language, it is hard to argue against the initial design choice of going with one all-encompassing standard rather than burdening the language with a complete set of numeric types. Since the criticism was about the initial design:

> ultimately to be replaced by a guy who threw together an integerless programming language

I would like to refute it by pointing out that the criticism ignores the initial use case, as well as the actual existence of integers within that larger number-format standard. Later, when enough people (and companies) demanded it, a big integer type was added after all.

Internally runtimes use different paths depending on what kind of number it is.

For many use cases of integers, especially internal ones like array indexing and counting, those integers are just that, and an extra integer type for extra purity is not much of a problem. For other uses of integers, e.g. finance (using cents instead of dollars), it sucks that you have to pay a lot of attention to what calculations you perform, so not having had a real integer type (until BigInt) as an aid indeed made integer arithmetic less pleasant.


> The media have comprehensively failed us.

Good. The author didn’t make the mistake of calling it the “news”.

I have for a long time felt that there is nuance about our “press” that doesn’t have good words in the public dialog. I struggle to articulate it myself.

Our modern “free press” is only free in that government is mostly not censoring it. But the press of today is a for profit endeavour. So it is not free to waste time “speaking truth” or something like that. It is incentivized to be whatever it takes to grab and keep eyeballs.

While there are people/institutions who publish things purely for information they feel is important, this is largely drowned out by the “trying to make money” crowd.

So our supposedly “free press”, while possibly free of despotic controls, is still a slave to the feedback loop of economics. Very much unfree. A sort of irony.


"But the press of today is a for profit endeavour."

It is worth paying attention to the significant rise in prominence of non-profit newsrooms, particularly in the USA.

Some notable examples:

The Baltimore Banner https://en.wikipedia.org/wiki/The_Baltimore_Banner Founded: 2022

ProPublica https://en.wikipedia.org/wiki/ProPublica Founded: 2007

The Texas Tribune https://en.wikipedia.org/wiki/The_Texas_Tribune Founded: 2009

The Marshall Project https://en.wikipedia.org/wiki/The_Marshall_Project Founded: 2014

I'm particularly excited about the Baltimore Banner, who are only a few years old but are earning sizable subscription revenue now (it's healthy for them not to be too dependent on donors).


I would like to see more information like this, thanks for sharing. Though at least one of those examples has a red flag for me: The Baltimore Banner gets a non-trivial amount of revenue from advertising. For me personally, I feel like advertising is directly at odds with quality journalism.

I would also be interested to hear how older small and alternative news sources compare to these newer ones. To use an example I'm familiar with, Willamette Week in Portland has a reputation of being halfway decent. Though to be fair, it also has advertising, and it hasn't even had subscriptions since 1984.


"For me personally, I feel like advertising is directly at odds with quality journalism."

Advertising is how journalism has worked since journalism first started. Running a newspaper used to be a fantastic business, because you effectively had a local monopoly on advertising to a geographic area. If someone wanted to promote things in your city, you would be top of their list.

Facebook, Google, Craigslist etc completely decimated that business model over the past 20 years and the news industry is still trying to figure out how to fund itself via alternative means.

Historically, news organizations have had very strong mechanisms for avoiding advertisers influencing their coverage - the "editorial–advertising firewall". Reputable news orgs like the Baltimore Banner should have policies like that in place today.


Getting grants is an alternative way and it's how freelancers are able to do reporting for cash-strapped newsrooms. Grants definitely have their own can of worms, though. Things like restrictive reporting requirements, do-not-do requirements, and the dynamics that just come from people giving other people relatively large sums of money.


Yeah, the problem with grants is that even with no strings attached there's still a subtle influence, where a publication may not want to harm the interests of the source of that grant, since they might not provide more funds in the future.


I don't know if it's possible to ever be completely free of outside influence. If anything, I think standards for publishing have become so low that any incentive model that helps keep a majority of facts straight should be the goal. The loss of traditional publishing gatekeepers has just generated a lot of noise and in that noise non-mainstream viewpoints have thrived.


> Advertising is how journalism has worked since journalism first started

That is a fair point. Maybe where it went off the rails is when we (collectively) were able to tie attention directly to the stories, and optimize for that. An old school newspaper has a much looser connection between subscriber behavior and advertising choices.

> editorial–advertising firewall

This is a mechanism I am not familiar with, thanks for mentioning it. Now I need to go learn something new!


> For me personally, I feel like advertising is directly at odds with quality journalism.

I think we've seen so many useless ads that this is effectively true but it really doesn't need to be.

Think about, say, Golf magazine. Is the average reader going to ask why there are advertisements for ball-finding glasses in there? They'll probably be annoyed when every copy has one, but seeing various gadgets that could be helpful in your hobby is nice. Especially because they explain why you might want them and often how they work.

Then think about a TV advertisement. Some guy has a grill and stuff starts flying on screen and eventually they sip from a can of Bud Light. If I drink Bud Light, is the entire neighborhood going to show up in my backyard? There's really no information gained here except that a liquid product called Bud Light exists and that I should "drink responsibly".

The concept of advertising is useful and should be desirable; however, the current way it's done is often neither. There are a million things out there, and the only way to find out about them is by being shown them.


I didn't realize Pro Publica is that "new". I've been following them for almost as long. They are fantastic.


Definitely a template. I can't think of a single major issue I've had with anything they've put out. I'm sure something exists, and I haven't read literally everything they've published, but I have been very impressed with everything I have seen.


Blaming society for the poor state of journalism is tempting, but it ignores that the root of the problem lies within. Financial institutions and other journalists demand information-dense journalism to do their jobs and have no problem paying for it, so that is what they receive. Most regular people view news as a form of entertainment and have no problem sacrificing their attention, and that is what they receive.


The free press has always been for profit!

What's changed is that the profit used to come from advertising. Since everyone read the news, they could charge a lot for ads.

Those days are over, and news now bubbles up from social media. That kinda works, but it's far from ideal.

To me the 2019 "Covington kids" incident showed how broken the media had become. All the prestigious media, from NY Times down, reprinted a viral Twitter thread as front page news without any fact check.

The reported "facts" were completely wrong, and even if they had been right, some random kids being rude in a park should never be national news.

But that's the news world we live in now.


Speaking of Twitter, I couldn't help but notice the lack of a Twitter/X icon on the author's blog page. Lots of other social media links are present.


Author here.

Elon directly screwed over some of my friends. He turned Twitter from an imperfect mess into a shit filled pit of despair.

I don't want to encourage anyone to use his services.

Hope that clarifies things!


Worth noting this is far from a modern problem. Google "yellow journalism".


The press has always been for profit, it was never a charity. What I see today is a mix of trying to maximize profits (which is different from merely making a living from it), and it being more difficult nowadays to make money from diligent journalism, mostly due to how the internet works.


> to grab and keep eyeballs

yes, but also to manufacture consent for the priorities of the rich and powerful


People conveniently leave this out a lot. Outlets like The Guardian have lost massive amounts of money every year for decades. They are supported by wealthy people who want to see their agendas be influential.

So the quest is for eyeballs, but not for cash. They're totally willing to throw away the pennies* that they could get from that if the alternative is not to get the ideas they want to push into circulation, which often boosts their other business interests.

It's not even possible to make money from journalism. Every outlet is a money sink for someone; you should just wonder whether that person has a moral reason for throwing away the cash or some other goal.

[*] is there any news outlet that beats alpha other than the NYT? Maybe the WSJ?


Unlike opaquely financed and privately owned media companies, the Guardian is actually relatively clear and open in how it is financed and set up in a way to try to make them as independent as possible (see for example the Scott Trust's annual report https://uploads.guim.co.uk/2025/09/11/The_Scott_Trust_Limite...).

That's not to say that they don't run their fair share of gossip/clickbait... but show me an online medium that does not.


Its first and foremost purpose


> But the press of today is a for profit endeavour.

For me, the press today is a for-influence endeavour. Most journalists have a POV on the majority of topics they write about, and they express that POV through how they discuss the topic. For example, which people they quote: generally only ones that agree with their POV. If they present an opposing view, they always couch it and phrase things to push the reader to discount that view. If they present a supporting view, they phrase it in a way that makes it sound trustworthy and authoritative.

To put it more simply, most journalists are trying to change the world to see things their way.


> To put it more simply, most journalists are trying to change the world to see things their way.

This is a good thing.

Aside from the bit where it's always been like this anyways, we, as modern humans, don't have the time to evaluate everything from first sources.

You can't read every scientific study or the 500 pages of tax documents that were studied to produce a report on someone committing tax fraud.

I don't need more "facts", I need useful information I can take action on.


And if that info is wrong because inconvenient counter examples are deprecated and ignored?


You’re using multiple definitions of “free” here. One is freedom in the Lockean sense, the other is freedom from the properties and consequences of an emergent system. It’s a bit like saying you are free to choose your own mate and have kids without government involvement but you’re still a slave to natural selection.

The concept of the free press does not guarantee that the truth will proliferate, it merely attempts to avoid the problem of the state defining what truth is. It’s an attempt to select the least worst option because no one knows of a perfect solution or even if one exists.


People do forget that there are only three known models for funding news/press, and they are all susceptible to bias and error.

1. State

2. Profit driven

3. Charity (includes volunteers, billionaire patrons, crowdfunding)


For-profit media is definitely a problem, but Jeff Bezos didn't buy the Washington Post and Elon Musk didn't buy Twitter because they thought they were more profitable than any other investments they could have made.

I believe they did it because they wanted the power that owning a media outlet can provide in order to help protect their actually profitable businesses.

It certainly helps that they have their own revenue streams so that they're not just money down the drain. If the Post loses $100M per year, but Amazon keeps making Bezos $50B per year, that's fine, probably costs him less than the depreciation on his yachts or jets.


Elon bought Twitter because his mouth got ahead of him.

Recall his brazen offer

Their initial refusal

Him suing to buy

Them relenting

Him trying to back out

Them suing to force the purchase


Elon was quite clear that he bought Twitter to make it a free speech forum where you could openly discuss things, even from non-establishment standpoints.

And that's what it's become.


Aww, that’s sweet.

Only time I’ve been kicked off Twitter (permanently, no comebacks) was under Elon’s rule.


Do you want to share for what?


Are you going to judge whether or not he was using the right kind of speech?


Just curious. I'm even open to change my mind a bit about 2025 Twitter.


My display name looked a bit like his. Hadn’t logged in for a few days, bam. What a sad, sore rich boy he is.


OK, that's super annoying!

As someone overly prone to provocative wordplay, that could have been me...

Quite separate from the political censorship issues though.


And, of course, there was the blocking of federally protected ADS-B data, and the expulsion of journalists who had written critical stories.


Here's an incident from September 2024 where links were blocked to a newsletter containing a hacked document with potentially damaging information about JD Vance - and the journalist who published that newsletter then had their account suspended: https://www.theverge.com/2024/9/26/24255298/elon-musk-x-bloc...


> And that's what it's become.

Hilariously incorrect.


Has it? I’ve never actually engaged with twitter. I always thought it was an echo chamber. From the outside looking in though it seems like twitter is the same pre-Elon as post-Elon. The difference is just which views are the blessed ones.


I see people advocating all known political views. Pre Elon, any non progressive opinions were heavily weeded out by moderators.

Of course, most progressives have left now that they encounter opposing views (AKA "fascism"), so you could think of it as an echo chamber. But it's not forced to be one by the site.


> Pre Elon, any non progressive opinions were heavily weeded out by moderators.

This is 100% a lie. Open Nazis advocating violence were suppressed, but back then there were people who claimed to be conservatives or right wing who were not Nazis.

Funny enough, open communists advocating violence were suppressed too back then. In fact, the left was policed more strictly than the right.


No, he never intended that, nor has he done that. He wanted to make it more biased toward the right wing than it already was (and it was already biased toward conservatives in its moderation). Twitter did not become more open to all points of view. It became exclusively more far-right friendly.

Elon Musk has censored and suppressed whatever speech he does not like his whole life.


Except if you post "cisgender" or if you track Elon Musk's jet.

Edit: oh, and if you want to block his account from your feed, you can't


"Non-establishment" is a funny way to spell "Nazi".


Web Programming has stolen satisfaction from programming. At least for me.

I've coded in win32, XWindows, GTK, UIKit, Logo, Smalltalk, Qt, and others since '95. I had various (and sometimes serious) issues with any of these as I worked in them. No other mechanism of helping humans interact with computation has been more frustrating and disappointing than the web. Pointing out how silly it all is (really, I have to use 3 separate languages with different computation models, plus countless frameworks, and that's just on the client side???) never makes me popular with people who have invested huge amounts of time and energy into mastering ethereal library idioms or modern "best practices" which will be different next month. And the documentation? Find someone who did a quick blog on it, trying to get their name out there. Good luck.

The fact that an AI is an efficient but lossy compression of the big pile, there to help me churn through it faster, is actually kind of refreshing for me. Any confidence that I was doing the Right Thing in this domain always made me wonder how "imagined" it was. The fact that I have a stochastic parrot with sycophantic confidence to help me hallucinate through it all? That just takes it to 11.

I thought when James Mickens wrote "To Wash It All Away" (https://scholar.harvard.edu/files/mickens/files/towashitalla...) that maybe someday things would get better. Ten years later, the furniture has moved and changed color some, but it's still the same shitty experience.


This is pretty myopic, or something. It shows, at least, a real ignorance about the possibilities available (or lack thereof), at scale, to the common worker for "saving for a later day".

But I'm all ears. Now that you've diagnosed how 401K-investing fools get what they deserve, care to offer any alternative solution as to how the entire workforce should have been saving towards retirement?


(kinda unrelated but) I personally hate these forced savings schemes from the government. At least in my country the rates are low so it almost feels like I'm donating the entire difference in rates between the pension scheme and the S&P500 + taxes straight to the government.

At least give those of us brave (or stupid) enough to do something different with the money the option to.


401k is voluntary


I guess. My company (and I believe most) highly incentivizes it by offering matching funds. The company says, "Want to save $1000 of your paycheck? We'll throw in $500." The general wisdom is, you'd be an idiot to leave money like that unclaimed.


Where I am there's actually a %-age of your salary/wage that's "returned" to you when you retire. You don't get a choice.

You can choose to contribute to an employer-match scheme (I do) or even fund your own pension account with monthly payments to a pension provider. But these are in addition to the above forced pension.


I work on the micros that aren't plugged into a grid. So solar and batteries and the like. In that world, power consumption is everything. Interrupts and aggressive sleeping of your processor are your biggest tools.

Does anyone have any experience with the current draw of typical pieces of "firmware" using this? I see that it's on the larger side of what feels like micro, BUT tomorrow's micros have been growing heaps over yesterday's micros for a long time, so I can ignore that.


Compared to other microcontrollers: ESP32 is very power hungry. Shiny displays are very power hungry, Wi-Fi is power hungry. So expect to draw about 5 watts/hour continuously while in operation with all bells and whistles.

With this said (I'm also using them for off-grid), for most scenarios you will need to put them to sleep and only use the display when absolutely needed. I've recently started using devices with an e-paper display, which at least solves the nuisance of display power draw: https://www.waveshare.com/wiki/ESP32-S3-ePaper-1.54

The last thing to keep in mind is heating. They will warm up quite a bit, and you should consider a way to either keep them cooled or make them sleep enough to cool down; otherwise they will reboot or stop working until they are cooled again.


Depends... do you need WiFi, the screen, and other peripherals always on? Can you wake some of them on a timer? On user interaction? On interrupts?

https://lastminuteengineers.com/esp32-sleep-modes-power-cons...

You can use those sleep modes in MicroPython as well

https://randomnerdtutorials.com/micropython-esp32-deep-sleep...
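
For example, a minimal MicroPython duty-cycle sketch along those lines (the wake pin and the 60 s interval are placeholders; adjust for your board and wiring):

  # Wake on a timer or a button press, do a bit of work, go back to deep sleep.
  # Assumes a generic ESP32 board; GPIO 0 and 60 s are placeholder values.
  import machine
  import esp32

  wake_button = machine.Pin(0, machine.Pin.IN, machine.Pin.PULL_UP)
  esp32.wake_on_ext0(pin=wake_button, level=esp32.WAKEUP_ALL_LOW)

  if machine.reset_cause() == machine.DEEPSLEEP_RESET:
      pass  # woke from deep sleep: read sensors, update the display, etc.

  machine.deepsleep(60_000)  # sleep 60 s; RAM is lost and main.py re-runs on wake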


> 5 watts/hour

Typo I'm guessing, but I found this unit of "energy acceleration" amusing.


"Gotta go fast" :-)

In my language we say it colloquially that way; it turned out wrong in English. Should have been 5 Wh.


Rather you would say it draws 5 watts. If someone is interested in draw over a period, e.g. over one hour, you'd say it used 5Wh in that period.


> If someone is interested in draw over a period, e.g. over one hour, you'd say it used 5Wh in that period.

Wh per hr? Let's just cut through the confusion and say it draws (J/s)Hr / Hr. :P

More seriously, if you are interested in energy, the "correct" SI unit is J, although in electrical applications [k/Mega/Giga]Wh is common. If you are interested in energy draw over a period, aka power, the "correct" and common unit is W. While "5 Wh per hour" might seem simpler, it is equivalent to saying this thing draws as much energy per hour as a device that draws 5 W would draw over one hour - needlessly redundant.


In the off-grid world we look constantly at batteries, and their capacities are often expressed in Wh. So it is a habit to measure anything else that way to avoid confusion.


I haven't used MicropythonOS per se, but Micropython is pretty efficient, and can utilise interrupts and sleep modes


I have a charger "controller" that I developed in MicroPython for an SAMD51 board. It can do sleep just fine, as long as you set up interrupts properly.

But I just need to do a bunch of ADC readings and some simple if/else conditions, so it doesn't require any real non-trivial computations.
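
Roughly this shape, if it helps (the pin names and thresholds below are placeholders, not my actual wiring):

  # A few ADC reads and some if/else around a charger enable line, nothing fancy.
  import time
  from machine import ADC, Pin

  batt_adc = ADC(Pin("A0"))       # battery voltage divider
  solar_adc = ADC(Pin("A1"))      # panel voltage divider
  charge_en = Pin("D5", Pin.OUT)  # enable line to the charger

  while True:
      batt = batt_adc.read_u16()
      solar = solar_adc.read_u16()
      # crude hysteresis: charge only when the panel is up and the battery isn't full
      if solar > 30000 and batt < 52000:
          charge_en.on()
      elif batt >= 55000:
          charge_en.off()
      time.sleep_ms(5000)  # swap in machine.lightsleep() plus a pin interrupt for real savings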


> This is one of those rather surreal situations where everyone senior in this ecosystem knows that the math doesn’t work, but they don’t know that everyone else also knows this. They thought that they were the foolish ones, who simply didn’t get it.

I don't know if it's that surreal or unexpected. There's a reason "The Emperor's New Clothes" is such a classic, enduring fable. It's happened before. It'll happen again.

Not shading the article. All good points, just was surprised the author threw this bit in.

Buy more tulips.


> Buy more tulips

Railroads and fibre are better examples. Tulips are actually fucking useless as a productive asset. Railroads, fibre-optic cables, power production and datacentres are not.


Ah Java. The language I never got to love. I came of coding age during the “camps” era of object oriented stuff: Eiffel, Smalltalk, CLOS, C++, etc. Java, from 95ish to oh 98ish, was like a giant backdraft. Completely sucked the air out of the room for everything else.

Does anyone remember the full-page ads in the WSJ for a programming language that no one quite knew what it really was yet? So my formative impressions of Java were emotional/irrational, reinforced by comments like:

“Of Course Java will Work, there’s not a damn new thing in it” — James Gosling, but I’ve always suspected this might be urban legend

“Java, all the elegance of C++ syntax with all the speed of Smalltalk” - Kent Beck or Jan Steinman

“20 years from now, we will still be talking about Java. Not because of its contributions to computer programming, but rather as a demonstration of how to market a language” — ??

I can code some in Java today (because, hey, GPT and friends!! :) ), but have elected to use Kotlin and have been moderately happy with that.

One thing that would be interesting about this list is to break down the changes that changed/evolved the actual computation model a programmer uses, vs. syntactic sugar and library refinements. "Languages" with heavy footprints like this are often just as much about their runtime libraries and frameworks as they are about the actual methodology of how you compute results.


In the early days javascript wasn't far enough along that you could make a web app without a lot of compromises, and the cross-platform nature of java was a big, big plus. I worked on an internal java client application that was developed on Linux for end users on PC. Ten years after we released the first version, and while it was still under development, a group of powerful managers at our company demanded they be issued Macs instead of the PC corporate standard.

When IT asked us if our application worked on Mac, we shrugged and said "We don't have a Mac to test it. We've never run it on a Mac. We won't support it officially, so if there are Mac specific bugs you're on your own. But... it should work. Try it."

And it did work. All the Mac users had to do was click on our Webstart link just like the PC users. It installed itself properly and ran properly. Never had a single issue related to the OS. Before Java was introduced that was an unobtainable dream in a full-featured windowed application.


> I can code some in Java today (because, hey, GPT and friends!! :) ),

I love GPT. Such a marvellous tool. Before ChatGPT came along, I had no medical experience. Thanks to GPT and friends, I am now a doctor. I've opened a clinic of my own.


For whatever reason, gpt-5 writes java code like it is 1995. I think it was trained on decompiled code.

