Hacker News | past | comments | ask | show | jobs | submit | echelon's comments

They don't need to.

They want marketshare to enhance their other market positions and give them optionality for future strategy.

They'd love the whole market, but they don't need it and they won't employ too many resources chasing that.

They're a powerful giant with hands in so many places, each reinforcing the others.

This encourages people to stay in the Apple hardware ecosystem, for instance. It dogfoods their silicon. It keeps people thinking of Apple as the creative brand and operating system. More creatives buying Apple -> more being produced and consumed for and on Apple.

Also the strategy of getting kids young has always been genius. They started that in the eighties, I think.


I legitimately don't think the people posting on HN will be employed in this field in ten years.

This is the end of human programming.

I'd be overjoyed at how far we've come if it wasn't for big companies owning everything.


ofc this shit happens when its my turn to be an adult. what’s like even the point anymore?

Fight for rigorous antitrust enforcement.

Adopt open source models and platforms.

We have a chance, but it's threading the needle and I'm not sure we'll make it.


git rebase squash as a single commit on a single main branch is the one true way.

I know a lot of people want to maintain the history of each PR, but you won't need it in your VCS.

You should always be able to roll back main to a real state. Having incremental commits between two working stages creates more confusion during incidents.

If you need to consult the work history of transient commits, that can live in your code review software with all the other metadata (such as review comments and diagrams/figures) that never make it into source control.
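The workflow above can be sketched in a throwaway repo (the branch name and commit messages here are invented for illustration):

```shell
set -e
# Throwaway repo; everything here is illustrative, not a real project.
tmp=$(mktemp -d); cd "$tmp"
git init -q -b main
git config user.email demo@example.com
git config user.name demo
echo base > file.txt; git add .; git commit -qm "initial"

# A feature branch with messy incremental commits
git checkout -qb feature/login
echo wip1 >> file.txt; git commit -qam "wip: first attempt"
echo wip2 >> file.txt; git commit -qam "fix typo"

# Land it as ONE commit on main; history stays rollback-safe
git checkout -q main
git merge -q --squash feature/login   # stages the branch's combined diff
git commit -qm "Add login flow"
git rev-list --count main             # -> 2: initial + the squashed feature
```

Rolling back during an incident is then a single `git revert HEAD`, with the branch's commit-by-commit history left to the review tool.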


Merging merge requests as merge commits (rather than fast-forwarding them) gives the same granularity in the main branch, while preserving the option to have bisect dive inside the original MR to actually find the change that made the interesting change in behavior.
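For comparison, the merge-commit flow can be sketched the same way (throwaway repo, invented names): `--first-parent` gives the one-entry-per-MR view on main, while the inner commits remain reachable for bisect.

```shell
set -e
# Throwaway repo for illustration only.
tmp=$(mktemp -d); cd "$tmp"
git init -q -b main
git config user.email demo@example.com
git config user.name demo
echo base > f; git add f; git commit -qm "initial"

git checkout -qb feature
echo a >> f; git commit -qam "step 1"
echo b >> f; git commit -qam "step 2"

git checkout -q main
git merge -q --no-ff --no-edit feature     # one merge commit on main

git rev-list --count --first-parent main   # -> 2: reads like squash history
git rev-list --count main                  # -> 4: inner commits preserved
```

Recent git can also keep bisect at MR granularity with `git bisect start --first-parent`, then drop inside the offending merge afterwards.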

I wish github created automation for this flow like they have for other variants.

But they have, with pull requests. When you merge a pull request it is done via the "subtree" merge strategy, which preserves partial commits and also does not flatten them.

This is one of the few hills I will die on. After working on a team that used Phabricator for a few years and going back to GitHub when I joined a new company, it really does make life so much nicer to just rebase -> squash -> commit a single PR to `main`

What was stopping you from squash -> merge -> push two new changesets to `main`? Isn't your objection actually to the specifics of the workflow that was mandated by your employer as opposed to anything inherent to merge itself?

> You should always be able to roll back main to a real state.

Well there's your problem. Why are you assuming there are non-working commits in the history with a merge based workflow? If you really need to make an incremental commit at a point where the build is broken you can always squash prior to merge. There's no reason to conflate "non-working commits" and "merge based workflow".

Why go out of the way to obfuscate the pathway the development process took? Depending on the complexity of the task the merge operation itself can introduce its own bugs as incompatible changes to the source get reconciled. It's useful to be able to examine each finished feature in isolation and then again after the merge.

> with all the other metadata (such as review comments and diagrams/figures) that never make it into source control.

I hate that all of that is omitted. It can be invaluable when debugging. More generally I personally think the tools we have are still extremely subpar compared to what they could be.


> I know a lot of people want to maintain the history of each PR, but you won't need it in your VCS.

Having worked on a maintenance team for years, this is just wrong. You don't know what someone will or won't need in the future. Those individual commits often have extra context that has been a massive help for me all sorts of times.

I'm fine with manually squashing individual "fix typo"-style commits, but just squashing the entire branch removes too much.
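That middle ground can be automated with fixup commits and `--autosquash`, sketched here in a throwaway repo (commit messages invented): the typo fix folds into its target while the meaningful commits survive.

```shell
set -e
# Throwaway repo for illustration only.
tmp=$(mktemp -d); cd "$tmp"
git init -q -b main
git config user.email demo@example.com
git config user.name demo

echo a > f; git add f; git commit -qm "Add parser"
echo b >> f; git add f; git commit -q --fixup HEAD   # "fix typo"-style commit
echo c > g; git add g; git commit -qm "Add evaluator"

# Non-interactively apply the autosquash plan (":" accepts the todo as-is)
GIT_SEQUENCE_EDITOR=: git rebase -q -i --autosquash --root

git rev-list --count HEAD   # -> 2: fixup folded in, real commits kept
```

This keeps each substantive commit as its own rollback point without the "fix typo" noise landing on main.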


Disagree!

If those commits were ready for production, they would have been merged. ;)

Don't put a commit on main unless I can roll back to it.


I completely agree. It also forces better commit messages, because "maintaining the history of each PR" is forced into prose written by the person responsible for the code instead of hand-waving it away into "just check the commits" -- no thanks.

The flair system was used in conjunction with AutoMod to silence much of the community.

More: https://old.reddit.com/r/Atlanta/comments/1qbabii/ratlanta_h...


> I would say this is more likely driven by part of a bigger package deal with Google Search Placement and Google Cloud Services.

Can the DOJ and FTC look into this?

Google shouldn't be able to charge a fee on accessing every registered trademark in the world. They use Apple to get the last 30% of "URL bars", I mean Google Search middlemen.

Searching Anthropic gets me a bidding war, which I'm sure is bleeding Google's competition dry.

We need a "no bare trademark (plus edit distance) ads or auto suggest" law. It's made Google an unkillable OP monster. Any search monopoly or marketplace monopoly should be subject to not allowing ads to be sold against a registered trademark database.


>Can the DOJ and FTC look into this?

I guess this ventures into politics more than anything else. And I am opinionated on the subject.

But other than that, the point worth centering on is that Apple no longer cares as much about being the best. They care much more about extracting the best business deals and money out of their current position. Which is very different from the Steve Jobs era; no amount of money could put crap on his plate.


> To be fair, the new opt-in "use strict" here is "switch to Temporal".

This. Don't break old code, just provide new best practices.

Update linters (or ideally first class language rules, like in Rust's "edition"s), to gradually kill off old behavior. Without having to do a decade long Python 2 -> 3 migration.

Temporal is nice. It learned from the many failures and dead bodies that came before it. And it had lots of good implementations to look at: Joda Time, Chrono, etc.


PHP suffers from this too. Through overly strict backwards compatibility, PHP has become a real mess of a language. IIRC there is still the ad-hoc parameter order convention and a lack of any namespacing for builtins. Everything is global.

With JS I kind of get it, as you can't control the environment. But PHP does not have this limitation, and they still can't fix the mess that is PHP.


There are going to be lots of languages competing with Rust and Zig. It's a popular, underserved market. They'll all have their unique angle.

It has been served for several decades; however, since the late '90s many decided reducing to only C and C++ was the way forward, and now the world is rediscovering it doesn't have to be like that.

There are certainly going to be lots of languages, because with LLMs it's easier (trivial?) to make one plus a library (case in point: just within the last month ~20 new langs with codebases of 20k~100k LOC have been posted here), but I don't really see them competing. Rust and Zig brought actual improvements and are basically replacing use cases that C++/C had, limiting the space available to others.

Uhm, no? There is barely enough space for Rust, which happens to have a unique feature/value proposition that raises it above the vast majority of its competitors. If you're fine with UB or memory unsafe code, then you go with C simply because it's deeply entrenched.

In that sense Zen-C changed too many things at once for no good reason. If it was just C with defer, there would have been an opportunity to include defer in the next release of the C standard.


Two years ago, were we installing 1/10th of what China is installing today?

Where are we at today? Can we catch up under this administration?

Where do we compare on a nuclear basis? I know my state installed nuclear reactors recently, but I'm not aware of any other build outs.

In a war game scenario, China is probably more concerned about losing access to oil and natural gas than we are. Not that we shouldn't be building this stuff quickly either.


> Can we catch up under this administration?

No. The future is Chinese, if the Chinese can maintain good governance.

A big "if"


I'm a filmmaker, and this is ArtCraft:

https://github.com/storytold/artcraft

AI tools are becoming incredibly useful for our industry, but "prompting" without visual control sucks. In the fullness of time, we're going to have WYSIWYG touch controls for every aspect of an image or scene. The ability to mold people and locations like clay, rotate and morph them in 3D, and create literally anything we can imagine.

Here are a bunch of short films we've made with the tool:

- https://www.youtube.com/watch?v=tAAiiKteM-U (Robot Chicken inspired Superman parody)

- https://www.youtube.com/watch?v=oqoCWdOwr2U (JoJo inspired Grinch parody)

- https://www.youtube.com/watch?v=Tii9uF0nAx4 (live action rotoscoped short)

- https://www.youtube.com/watch?v=tj-dJvGVb-w (lots of roto/comp VFX work)

- https://www.youtube.com/watch?v=v_2We_QQfPg (EbSynth sketch about The Predator)

- https://www.youtube.com/watch?v=_FkKf7sECk4 (a lot of rotoscoping, the tools are better now)


cool stuff!

Thank you!

If you give it a try, I'd love to get your feedback. I'd also like to see what you're making!


I love Antirez.

> However, this technology is far too important to be in the hands of a few companies.

This is the most important assessment and we should all heed this warning with great care. If we think hyperscalers are bad, imagine what happens if they control and dictate the entire future.

Our cellphones are prisons. We have no fundamental control, and we can't freely distribute software amongst ourselves. Everything flows through funnels of control and monitoring. The entire internet and all of technology could soon become the same.

We need to bust this open now or face a future where we are truly serfs.

I'm excited by AI and I love what it can do, but we are in a mortally precarious position.


This is what will occur - the bad scenario, that is. Labor and its knowledge distributes (hard to contain knowledge); capital centralises and compounds. It has always been that way. With AI there will be a tension between the two, of course.

The root question is: Will AI decentralise quicker than the disruption to this profession? I don't think so.

I've noticed us techies don't really understand economics and game theory all that well - we just see an awesome toy, want to play with it, and want others to enjoy it too. We have worked to democratize computing for years (e.g. OSS), now to our detriment. In the long term, no one in society respects people who do this in a capitalist system; they find them naive. I can now understand why other professions find us a little immature, like kids playing with tech toys.

I love solving problems with technology and love the field, but as I've gotten older I look back on a less technological life with nostalgia. Technology for all its benefit has disrupted the one thing humans do need and had for millions of years in our evolution - relative stability within their lifetimes. The mental health benefits to stability are massive and usually unmeasured. Technology, as evidenced by this thread, creates more and more anxiety about our future and our place within the community (e.g. social media, AI, and others). "Adaptability" isn't just a psychological trait; a wealthy person and secure person by definition is more adaptable too.

