
His book, "Understanding Software Dynamics", is one of the best technical books I've ever read. Top 3 for me.


If you don't mind me asking, what are the other two?


- Managing Gigabytes

- Hacker's Delight


Much appreciated. I bought all 3.


Thanks a bunch! Will check them out.


Yeah, this reads like cope. So, all software engineering? Then you must have a pretty narrow view of the problem space. Quite a lot of problems have a pretty well defined set of requirements that can fit on one page. It's the technical excellence that delivers on those requirements.


> Then you must have a pretty narrow view of the problem space. Quite a lot of problems have a pretty well defined set of requirements that can fit on one page.

If this is true, then these problems are already solved. And, given the nature of software, they do not need to be solved again. The reality is that every single piece of software ends up being unique at the edges. It's what makes software great, but also so challenging. It's also these edges that take your 1 page of requirements and turn them into an ambiguous, conflicting set of 50 pages.


> If this is true, then these problems are already solved

There's a lot of induced demand in software development. We have spent the last 70 years on incredible productivity improvements through better tooling, language design, frameworks, etc. But all that has done is create more demand, both for functionality and usability. Now that animations are cheap to add to UIs, people want to have them. Automating tasks that can be done quite cheaply by humans is becoming cost effective in ever more fields. Software of complexity that would be unimaginable in the 50s is used for purposes as mundane as many-to-many short messages.

Lots of software is unique, but at the same time there's also lots of software or software features that just recently became cheap enough to implement to be worth doing. And those can sometimes be quite simple in their requirements.


Can you give an example of a non-trivial problem with a well defined set of requirements that you can fit on one page?


To support our next-gen machine learning system, we need a 10 exabyte storage array. It should host a system accessible over TCP/IP or Infiniband that can stream random-access 1MB blocks of data to 65,536 different computers, at continuous loads of 64 GB/second each computer, using a protocol of your choice or design. Correct for all data corruption and do not lose a single bit during the next 1,000 years of operation.
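
For scale: 65,536 machines at 64 GB/s each is roughly 4.2 PB/s of aggregate read bandwidth, enough to stream the entire 10 EB array end to end about every 40 minutes, indefinitely, without ever returning a corrupted block.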

Yes, we can add more specs but these alone should be pretty daunting.


What does this have to do with building software? This is something you'd submit on a capex form for a hardware purchase.


If you think you’re gonna organize that many bits and not lose them for centuries at a time, you’re seriously underestimating the need for data replication and error correction algorithms, first of all.
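
Even the most naive reading of "don't lose a single bit" already implies machinery that ordinary application code never has: a checksum on every block, multiple replicas, and a repair path. A toy Python sketch of just that slice (file layout and names are invented here; real systems at this scale use erasure coding, background scrubbing, and end-to-end checksums rather than whole-copy replicas):

    import hashlib, os

    def read_block(replica_dirs, block_id, expected_sha256):
        """Return a block's bytes, rewriting any corrupt replica copies found."""
        good, corrupt = None, []
        for d in replica_dirs:
            path = os.path.join(d, block_id)
            with open(path, "rb") as f:
                data = f.read()
            if hashlib.sha256(data).hexdigest() == expected_sha256:
                good = data
            else:
                corrupt.append(path)
        if good is None:
            # Every copy is bad: without parity/erasure data, the bits are simply gone.
            raise IOError(f"block {block_id}: no intact replica")
        for path in corrupt:  # repair by re-replicating from a good copy
            with open(path, "wb") as f:
                f.write(good)
        return good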


Again, that's a hardware storage solution's problem. Adjust the parity level in the filesystem. Unless you're writing the requirements for building the software for a storage platform.


… yes, that is in fact the point


There are money, operations, and delivery-time requirements, just off the top of my head, not expressed in that spec, that vastly change the solution.


And the client is all over you/the company.


As a usual bore, I'd like to point out that trivial/non-trivial is highly subjective.

Is making an app to connect over Bluetooth to an infrared camera trivial?

Is making a 55 GiB/s FizzBuzz non-trivial?


The specs for speedy FizzBuzz are relatively clear, even if the implementation is challenging.

For the camera, there are tons of unspecified behaviors and baked-in assumptions. Is the user configuring the camera, or are we discovering any/all cameras of a certain type? Authentication? What should happen when the connection fails? What if the bandwidth isn’t sufficient (e.g., due to distance or congestion) to deliver the full take from the camera in realtime? How will camera malfunctions be detected, handled, and reported?

A hobby project or prototype can ignore most of this, and insist that you turn it off-and-on if anything looks odd; a fancy turnkey security system should carefully consider all of this and more!


No plan survives contact with reality.


Ambiguous requirements are more symptomatic of a problem space that is poorly defined, e.g. a useless SaaS.

If you're building to solve real problems(tm) you don't need a PM to pad out tickets with filler.


It is due to a problem space that’s poorly defined, but that’s not always indicative of ticket padding or a useless product.

I’ve dealt many times with real people, who present me with a real problem, but no single concrete solution.

If the PM doesn't take the proper time to understand the problem (and try to find the real, underlying problem that often exists), or rushes into a solution without proper evaluation, or just generally wants to quickly churn user asks into tickets, then the result is the same.

I generally agree that, in my experience, a good senior software engineer is well suited to teasing out “the real problem” and getting to a properly detailed solution even without a PM.

However, that doesn't mean I don't think there's potential value in a PM that can set a broader vision of the product and help prioritize tasks. But the good ones do so by working with, and often deferring to, their software dev team, and NOT by working "top down".


Can you share one problem's well-defined set of requirements that fits on a single page?


You know, reading these comments is absolutely hilarious. It's because of this and that, hundreds of excuses that dance around the simple truth: you're an incompetent programmer. That's it. No, you're not capable of making something twice as fast if only you cared or had the time. If you haven't done it as a constant exercise, you can't.


> It's because of this and that, hundreds of excuses that dance around the simple truth: you're an incompetent programmer.

Classic arrogance and naivety from a Casey follower. Name call all you want, you can't hand wave the reality that the business determines the requirements, and in my industry they don't care about performance until it's a noticeable problem. Oh and the requirements they gave you are solving problem X when they really want to solve problem Y, so your optimal solution to problem X needs to be deleted.

I write correct, readable code as performant as it can be in the time I'm allotted. Call me incompetent, but it's what I'm hired to do.

All this bickering and harsh feelings are stemming from the author's inability to understand that different industries have different priorities.


> in my industry they don't care about performance until it's a noticeable problem

Yeah, I've done several dozen 10x or more performance improvements on our codebase. It's not always trivial, but most of the time it's not super-hard either.

In fact just today I did a 10x speedup of a query. After a couple of hours analyzing the issue, the fix was relatively simple: populate some temp tables before running the main query. A bit more complex than just running the query, but not terribly so.
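
To make the pattern concrete (a made-up sqlite3 miniature, not our actual schema or query), it looks roughly like this:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE customers (id INTEGER PRIMARY KEY, region TEXT);
        CREATE TABLE orders (customer_id INTEGER, amount REAL);
        INSERT INTO customers VALUES (1, 'EU'), (2, 'US'), (3, 'EU');
        INSERT INTO orders VALUES (1, 10.0), (1, 5.0), (2, 7.0), (3, 2.5);
    """)

    # Materialise the driving set once, instead of re-deriving it per row
    # inside one big query.
    conn.execute("CREATE TEMP TABLE active AS SELECT id FROM customers WHERE region = 'EU'")

    # The main query now joins against the small, pre-built temp table.
    rows = conn.execute("""
        SELECT o.customer_id, SUM(o.amount)
        FROM orders o JOIN active a ON a.id = o.customer_id
        GROUP BY o.customer_id
    """).fetchall()
    print(rows)  # e.g. [(1, 15.0), (3, 2.5)]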

Why hadn't we done that before? Because a customer suddenly got 1000x the volume of the previously largest user of that module, and so performance was suddenly not acceptable. It's 5 years since we introduced the module...


I don't think the structure of the "business requirements" argument is correct, and I will try to explain why.

To a first approximation, the reason modern software is slow isn't due to failure to optimize this algorithm or that code path, but rather the entire pancake stack of slow defaults — frameworks, runtimes, architectures, design patterns, etc. — where, before you even sit down and write a line of "business logic" code, or code that does something, you're already living inside a slow framework written in a slow language on top of a slow runtime inside a container with a server-client architecture where every RPC call is a JSON blob POSTed over HTTP or something. This is considered industry standard.

The "business requirements" guy is basically saying, I have to ship this thing by friday, I'm just going to pick the industry standard tools that let me write a few lines of code to do the thing I need to do. Ok, but that's the tradeoff he's making. He's deciding to pick up extremely slow tools for the sake of meeting his immediate deadline. That decision is producing unacceptable results.

It's not enough just to say people have different priorities. Selecting an appropriate point on a multivariate system of tradeoffs is part of the skill of being a programmer. And if there's no point on the curve that delivers acceptable results in all categories — if, given a certain set of tools, it's not possible to ship quickly and deliver acceptable performance — then it should be an impetus for the programmer, the craftsman, to find better tools, improve his skills, push the "production curve" outward, until he can meet all the requirements.

For instance, a large percentage of modern programmers don't really know how to program from first principles, and tell the computer to do precisely and only the thing it needs to do. Essentially they only know how to glue tools together. Then in their head they're like, well gee, given that skillset, I could either (1) spend a bunch of time optimizing "hot spots," writing crazy algorithms, heroically trying to fight through all that slowness... or I could just (2) deliver the business logic and call it a day. Then they call this "prioritizing business requirements." No, there's a third, alternative, better option, which is to use better tools, which might initially be harder and more time consuming and less ergonomic to use, and then learning to get good with those tools, putting in the practice, recognizing patterns, thinking faster over time, coding faster... all of this is part of what mastering the trade of programming is about.

At the end of the day, there is just an ethic of self improvement and craftsmanship that is totally missing from programming today, and it surfaces whenever this debate comes up.


> you're an incompetent programmer

The problem with this line of reasoning, e.g. "if you write slow code you're incompetent", is that it applies to everything that programmers do: if you write slow code you're incompetent, if you write buggy code you're incompetent, if you write undocumented code you're incompetent, if you write untested code you're incompetent, if you write code slowly you're incompetent, if you write code that doesn't fulfil all the requirements perfectly you're incompetent, and so on for each and every measure someone dreams up to measure how good code is.

You'll very quickly find there are no competent developers.


The flaw in this point is that there are really only a handful of measures that actually matter: writing performant code, shipping quickly, delivering business requirements (really an official-sounding way of saying "doing the actual thing the program needs to do"), and eliminating bugs.

The other things are just proxies for the real measures, that people made up, and in fact are often harmful to the main goal. Like "documenting code" and "writing tests" a lot of the time are just cargo culting to make people feel like they're being responsible and following "best practices" without actually improving the measures that matter. I think that the other unlisted metrics in your "every measure someone dreams up..." are likely to fall under this category.

There isn't an infinite number of possible measures like you're suggesting; there's a finite number, and a rather small one at that. You can definitely be really good or really bad at quickly shipping performant bug-free code that does its job. The problem in this debate is that one side is completely ignoring one of these measures, and trying to claim that it's because they have to prioritize the other ones, and that this is just an inevitable tradeoff, rather than that we lack the skill as an industry to do all of these things at an acceptable level. Being a good programmer may involve more axes than being a good chess player, but the claim that there are so many axes that programming competence doesn't exist collapses into absurdity pretty quickly.


100%. Just claim that documentation and tests are cargo cults, take up precious CPU and memory like it's 1985, and pretend that your shite, undocumented, unmaintainable, untestable, shift-right code is perfect.

Working with your type sucks. Confidently incorrect all the fucking time.


I mean, sure, this comment sounds nice on a web forum, but some of us have 0 hesitance bucketing others on a scale of competence. Looking at the state of software I interact with on a daily basis, I really don't care about being nice anymore.


> some of us have 0 hesitance bucketing others on a scale of competence

Sure, and what I'm saying is that there are many scales, and all of us are at the incompetent end of some of them.


> some of us have 0 hesitance bucketing others on a scale of competence

Yeah, people like that exist... That has a strong inverse correlation with competence.


This makes absolutely 0 sense.


The sentiment here and in most of industry is: “and that doesn’t matter.”


The general problem though is that "and that doesn't matter" isn't backed up by anything but gut feeling.

When FB dove into the numbers, they found they could save 50% in hardware costs. That _does_ matter.

It may be the case that your company has a $20k server rack to support $20M of sales and so it doesn't matter. But unless you've actually looked up the cost, you shouldn't be making the claim, because it's unbacked; it's just a bad-faith argument.


Also depends mostly on your size.

The cost to rewrite your system to be twice as fast is about the same whether you have $20k of hardware costs or $20M, but one saves $10k and the other $10M. Only one offsets the cost of the programmers.


The argument is not that performance optimization does not matter at all, but rather that low-level performance optimization, like rewriting compilers, writing your own custom storage system, or rewriting it in unreadable speed-optimized C++, is rarely worth it.

You don't need to do a deep analysis to tell if optimization is needed or not; it is enough to assume best-case improvements and compare them to your FTE + overhead cost.

Do many of your servers run CPU-bound C++ apps you wrote? If not, don't bother eliminating class hierarchies; optimize your core logic instead.

Do your webapps spend most of their time waiting on databases and microservices? See if you can eliminate or cache those; the wins are going to be much bigger than rewriting it in Rust (there's a toy sketch of the caching idea at the end of this comment).

Is your GraphQL compiler too slow? Unless you are FAANG with hundreds of thousands of developers and hundreds of people to spare, you will get much more bang for the buck with some smart caching or just getting a bigger machine for CI jobs.

According to levels.fyi, the average senior SW engineer in San Francisco is $312k/year. With overhead, the actual cost is likely $500k/year or so. There are _a lot_ of servers you can buy for that money before you can justify maintaining your own custom version of an existing software solution.
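
To put a toy example behind the caching point above (hypothetical names and latencies; the usual caveat applies that you now own the invalidation problem):

    import functools, time

    @functools.lru_cache(maxsize=1024)
    def get_exchange_rate(currency):
        # Stand-in for a database or microservice round trip.
        time.sleep(0.2)
        return {"EUR": 1.0, "USD": 1.08}.get(currency, 1.0)

    t0 = time.perf_counter()
    for _ in range(1_000):
        get_exchange_rate("USD")  # only the first call pays the 200 ms
    print(f"1,000 lookups in {time.perf_counter() - t0:.2f}s")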


> The argument is not that performance optimization does not matter at all, but rather that low-level performance optimization, like rewriting compilers, writing your own custom storage system, or rewriting it in unreadable speed-optimized C++, is rarely worth it.

People use these excuses for all kinds of performance arguments besides low-level/etc.

> You don't need to do a deep analysis to tell if optimization is needed or not; it is enough to assume best-case improvements and compare them to your FTE + overhead cost.

Sure, my point is people haven't even done rough math of FTE + overhead cost or even best-case improvements while still making those claims.

---

Less w.r.t. the article and more w.r.t. li4ick's comment: I've found numerous 10x gains in performance just by swapping an O(N^2) algorithm for an O(N) one (typically converting code using a List to using a Set instead). That doesn't cost $312k, and if the original author had been more concerned with performance they wouldn't've done things that way.
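
For anyone wondering how one change like that buys 10x or more, here's a self-contained timing sketch in Python (sizes arbitrary; the list version is deliberately left slow):

    import random, time

    a = [random.randrange(1_000_000) for _ in range(20_000)]
    b = [random.randrange(1_000_000) for _ in range(20_000)]

    t0 = time.perf_counter()
    slow = [x for x in a if x in b]      # list membership: O(len(a) * len(b))
    t1 = time.perf_counter()
    b_set = set(b)                       # one-time O(len(b)) conversion
    fast = [x for x in a if x in b_set]  # set membership: O(len(a)) overall
    t2 = time.perf_counter()

    print(f"list: {t1 - t0:.1f}s  set: {t2 - t1:.3f}s  same result: {slow == fast}")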


In some sectors software quality truly does not matter beyond a certain point (and the bar is really low). The customer very rarely has the resources or the time to try all the alternatives, especially if switching afterwards is expensive. So if your marketing is good and your product works and has the needed features, you get sales; if not, you don't.

Is it terrible that the app takes minutes to add a few thousand numbers? Of course it is. But it does not matter: the customer is used to software being terrible, so he won't waste the time and money switching to another product that is probably also terrible.


Hot tip: if you come into a conversation for the sole purpose of condescendingly insulting everyone, maybe put the phone down and take a breather.

On a related note: https://xkcd.com/359/


I have no idea how you would take what I wrote as an insult. It's insulting to me that people are paid 400k a year and still think that what I wrote is somehow NOT the baseline. It's not even a debate. I'm gonna take my own advice from the past and block HN for another 3 years and get back to work. Good luck.


I'm also more on the negative side because I'm really not convinced by the whole "AI will just automate/remove the boring aspects of life". Every single prototype capability of current AI points in the direction of a worst-case scenario of overall misery.


> Every single prototype capability of current AI points in the direction of a worst-case scenario of overall misery.

Software is often aimed at solving problems and making processes more cost-effective. Founders are often most interested in solving their own problems, problems of their friends, or the most valuable problems of their customers. For B2B companies, the customers are other businesses. The greatest cost for most businesses is employee cost. Solving the employee-cost problem by eliminating employees is a great target for business owners.


Absolutely. I feel like I'm just watching the car crash in slow motion. Hopefully I'm wrong, but I really don't see this AI revolution working out well for humanity (in the short/medium term). I'm sure we will eventually get through it, but I think life for most is going to get real hard for the next 20-50 years.


Didn't they partner with SourceGraph to make Cody? Here's them talking a bit about it: https://www.youtube.com/watch?v=LYuh-BdcOfw. Maybe that's why?


No offense to the Atlassian guys, but how do they justify their horrid overall performance with BitBucket?


They seem to have pivoted BitBucket a few years ago. Rather than being a GitHub competitor, they moved to being the default option for companies that buy into the Atlassian ecosystem. JIRA was always their big product, and they've expanded into a bunch of other JIRA-adjacent products (sometimes via acquisitions) such as Trello and OpsGenie. For a company their size it makes sense to have BitBucket as a box they can tick in sales pitches, but I think they realised a long time ago that they weren't going to win anyone based purely on that.


I think every time I go to use Bitbucket I get an error 500 at least once.


The same way they would justify it for Jira


Just click more buttons.


Yes, I've seen this behaviour on quite a few sites recently, like they're running a bitcoin miner or something. Here's another example: https://zed.dev/ I have no idea what they're doing to slow my machine down to a halt.


Ok, I'll be the bad guy and say that the linked demo on Loom is just not impressive at all. It answers, 10x slower, what a simple regex query would have. Again, I'll be the bad guy in this comment section, but ML-based search methods for code are just not that useful. They do sound nice for non-coders, that's about it. I also feel like I'm qualified to say this because I have first-hand experience building code search tooling at scale.


> nice for non-coders, that's about it.

I'm not a coder but I code; not sure if that makes sense. I sort of know a bit of regex, but find it utterly painful and not worth the time.

I'm an amateur, effectively, with little time. I like AI tools for coding, because I can input a request, and get sample code. I know enough to be able to read most of the code that I get back, but not enough to be able to easily write such code on my own.

These types of tools, in my opinion, have the potential to make people like me, and even people who are much less knowledgeable than me, productive programmers. This could be transformative.

Sure, we won't be as good or productive as real programmers, but that's beside the point.

In other words, what you're referring to with "that's about it" could still transform quite a few lives.


> I'm an amateur, effectively, with little time. I like AI tools for coding, because I can input a request, and get sample code.

This sounds kinda like contemporary Chinese text entry. Nobody remembers exactly which strokes in what order, but can feed some parameters into an app and get back candidate characters.

Of course the characters are a simple enumeration, while what you get from a coding-helper app is still a form of electronic hairball and requires a second look, and a third.


ChatGPT is an absolute game changer. And I was very conservative with systems like GPT and DALLE-2. For example, I've been very lazy with automating some things on my work laptop, using Powershell. Now, I just had to ask ChatGPT for "write a powershell script that toggles "use setup script" in the windows proxy settings" and I was done in 5 minutes. Amazing. I foresee a paradigm shift in how we use Google in the next 1-2 years.


You inspired me to try using ChatGPT for a similar need I had, but alas, "Write a bash script to toggle grayscale display mode in MacOS" just comes up with a plausible but non-working script that toggles a hallucinated "AppleIntensityEnabled" setting.

To be honest, I feel a bit of relief every time AI fails to do something. Like, okay, we've got a few more years...


If the code is not working you can just try telling it what went wrong. It doesn't have a Dev environment where it can test the code so you have to be its debugger.


I told it it was wrong about some math problem and it asked me what the correct answer was. I told it and it remembered. But more interestingly, it's very good at keeping context. It combined like 5 back-and-forths into 1 coherent sentence.


Tell it what is wrong and why, it is surprisingly good at fixing itself with a bit of help. I was able to guide ChatGPT well enough to make it write an old school racing game using pygame. Start general, see what it gets correct and what should be changed, and give it better indications.


ChatGPT isn't a person though; it didn't fix anything. You gave it a prompt, it gave a result that was close. You add a line to the prompt and it gives you something closer to what you were looking for.

It looks more like refining a search pattern (like one might do with an LDAP query) on the part of the operator than it does an algorithm "fixing", "changing", and "being guided". It's interesting how we anthropomorphize the output of this algorithm compared to other APIs, even though the algorithm is closer to other APIs than it is to a human, as far as we understand.


I don’t think anyone here is confusing ChatGPT for an actual person.

What is really interesting with ChatGPT compared to other interactive software is that you can give instructions the way you would do it with a human. You can literally copy paste a compilation error, with no more context, and it will fix the previous program it generated. Even just pointing vaguely to something like “that does not look correct, you forgot some edge cases” will result in an improved version.


Yes, I had to press "Try Again" a few times but, in general, the amount of stuff you can generate is staggering. It's also quite fun to ask it to "invent a programming language in the style of Rust and APL" for example. There's so much potential here...


> And I was very conservative with systems like GPT and DALLE-2

Yeah those are impressive but basically toys. Although at the moment it's clearly still a research prototype. For some things it works really well and beats Google by saving dozens of clicks and repeated searches, for others it's just plain wrong.


Can you show that pwsh script?


Their business is whatever their metrics say it is. In this case, TikTok is eating their lunch, so their business is to replicate TikTok.


But you can replicate TikTok whilst not shafting your existing users.

