
Well, Google's office is on ground that was once home to foxes, so they're in the wrong – it's karma. Google is the infestation, not the foxes.


> Coding agents still give you control (at least for now), but are like having really good autocomplete.

And I think that's the problem. I think autocomplete itself is a bad thing. If one has autocomplete, one is more likely to type things that are less worth typing.


Even when it comes to a job, sacrificing enjoyment for efficiency can often make life less fun.


I agree. There is also the ethical component: not just because of the way they were trained, but because the big tech companies that leverage them most efficiently are primarily trying to gain an unfair share of resources for themselves, so using them means participating in a losing game.


That is very interesting... I would not have guessed that.


I am a part-time coder, in that I get paid for coding and some of my code is actually used in production. I don't use LLMs or any AI in my coding, whatsoever. I've never tried LLM or AI coding, and I never will, guaranteed. I hate AI.

I agree with you, 100%. I like typing out code by hand. I like referring to the Python docs, and I like the feeling of slowly putting code together and figuring out the building blocks, one by one. In my mind, AI is about efficiency for the sake of efficiency, not for the sake of enjoyment, and I enjoy programming.

Furthermore, I think AI embodies a model of the human being as a narrowly-scoped tool: the creator gets converted into a replaceable component whose only job is to provide conceptual input into design. It sounds good at first ("computers do the boring stuff, humans do the creative stuff"), but, and it's a big but: as an artist too, I think the creative stuff can't be separated from the "boring" stuff, and when looked at properly, the "boring" stuff can actually become serene.

I know there's always the counterpoint: what about other automations? Well, I think there is a limit past which automation gives diminishing returns and becomes counterproductive, so we need to be wary of all automation. But AI is the first sort of automation that is categorically always past the point of diminishing returns, because it targets exactly the cognitive work we should be doing ourselves.

Most people here disagree with me, and frequently downvote me on the topic of AI. But I'll say this: in a world where efficiency and productivity have become doctrine, most people have been converted into thinking only about the advancement of the machine, and have lost the soul to enjoy that which is beyond mere mental performance.

Sadly, people in the technical domain often find emotional satisfaction in new tools, and that is why anything beyond the technical is often derided by those in tech, much to their disadvantage.


I've been publishing videos for a few years and made nearly 300. Best advice I can give you is:

1) Get a decent microphone

2) Check out freesound and the YT music library

3) Just start publishing. Make a video. Upload it.

Don't do too much research. Just talk about what you know. As you go on, do a little research here and there, but the #1 rule is just publish. Upload, upload, upload.


Shame it's not worldwide. Dogs are such a nuisance.


I think the question of whether LLMs are useful for software engineering is not the right question at all.

The better question should be whether long-term LLM use in software will make the overall software landscape better or worse. For example, LLM use could theoretically allow "better" software engineering by reducing bugs, making coding complex interfaces easier --- but in the long run, that could also increase complexity, making the overall user experience worse because everything is going to be rebuilt on more complex software/hardware infrastructures.

And LLM use by the top 10% of coders could make their software better while making the bottom 90%'s worse through shoddy coding. Is that an acceptable trade-off?

The problem is, if we only look at one variable, such as "software engineering efficiency" measured in some operational way, we ignore the grander effects on the ecosystem, which I think will be primarily negative due to the bottom-90% effect (what people actually use will be nightmarish, even if a few large programs can be improved).


If we assume that LLMs will make the software ecosystem worse rather than better, I think we have two options:

1. Attempt to prevent LLMs from being used to write software. I can't begin to imagine how that would work at this point.

2. Figure out things we can do to try and ensure that the software ecosystem gets better rather than worse given the existence of these new tools.

I'm ready to invest my efforts in 2, personally.


I would rather not play the prisoner's dilemma at all, and focus on 1 if possible. I don't code much, but when I do code or create stuff, I do it without LLMs, from scratch, and at least some of my code is used in production :)


I don't think better search is exactly what we want. It would also be great to have less quantity and more quality. Optimizing search alone (including with AI) only furthers the quantity aspect of content, not quality, so making search better is the wrong goal IMO.


Arguably, the opposite is true. Ars Technica and others have written about this extensively [0].

Having summarized results appear immediately with links to the sources is preferable to opening multiple tabs and sifting through low-quality content and clickbait.

Many real-world problems aren't as simple as "type some keywords" and get relevant results. AI excels as a "rubber duck", i.e., a tool to explore possible solutions, troubleshoot issues, discover new approaches, etc.

Yes, LLMs are useful for junior developers. But for experienced developers, they're more valuable.

It's a tool, just like search engines.

Airplanes are also a tool. Would you limit your travel to destinations within walking distance? Or avoid checking the weather because forecasts use Bayesian probability (and some mix of machine learning)? Or avoid power tools because they deny the freedom of doing things the hard way?

One can imagine that when early humans began wearing clothing to keep warm, there were naysayers who preferred to stay cold.

The most creative people I know are using AI to further their creativity. Example: storytelling, world building, voice models, game development, artwork, assistants that mimic their personality, helping loved ones enjoy a better quality of life as they age, smart home automations to help their grandmother, text-to-speech for the visually impaired or those who have trouble reading, custom voice commands, and so on.

Should I tell my mom to turn off Siri and avoid highlighting text and tapping "Speak" because it uses AI under the hood? I think not.

They embrace it, just like creative people have always done.

[0] https://arstechnica.com/gadgets/2024/05/google-is-reimaginin...


Socrates had a skeptical view of written language, preferring oral communication and philosophical inquiry. This perspective is primarily presented through the writings of his student, Plato, particularly in the dialogue Phaedrus.

I confirmed that from my own memory via a Google AI summary, quoted verbatim above. Of course, I would never have learned it in the first place had somebody not written it down.


> Socrates had a skeptical view of written language, preferring oral communication and philosophical inquiry. This perspective is primarily presented through the writings of his student, Plato, particularly in the dialogue Phaedrus.

He did not. You should read the dialogue.

> I confirmed that from my own memory via a Google AI summary, quoted verbatim above.

This is the biggest problem with LLMs in my view. They are great at confirmation bias.

In Phaedrus 257c–279c, Plato portrays Socrates discussing rhetoric and the merits of writing speeches, not writing in general.

"Socrates: Then that is clear to all, that writing speeches is not in itself a disgrace.

Phaedrus: How can it be?

Socrates: But the disgrace, I fancy, consists in speaking or writing not well, but disgracefully and badly.

Phaedrus: Evidently."

I mean, writing had existed for 3 millennia by the point this dialogue was written.


It is both exciting how far we got and depressing how far we didn't.


Better search implies separating the wheat from the chaff. Unfortunately SEO spam took over and poisoned the whole space.


Better search implies more sophisticated search which means more opportunities to game the search.


A more sophisticated search could also empower the users if good search was the goal.

