
Positional encoding is standard for transformers of all stripes. They introduce a seemingly novel, redundant positional encoding scheme. It's more difficult to train, but seems to enable producing multiple tokens at once (i.e. you could get an answer that is N tokens long in N/x steps instead of N steps).
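
For reference, the "standard" encoding most transformers use is the sinusoidal scheme from the original transformer paper; a minimal sketch for context (illustrative only, not the paper's new scheme):

    /// Standard sinusoidal positional encoding (Vaswani et al., 2017):
    ///   pe[pos][2i]   = sin(pos / 10000^(2i / d_model))
    ///   pe[pos][2i+1] = cos(pos / 10000^(2i / d_model))
    fn sinusoidal_encoding(seq_len: usize, d_model: usize) -> Vec<Vec<f32>> {
        (0..seq_len)
            .map(|pos| {
                (0..d_model)
                    .map(|i| {
                        // Paired dimensions (2i, 2i+1) share one frequency.
                        let exponent = (2 * (i / 2)) as f32 / d_model as f32;
                        let angle = pos as f32 / 10000f32.powf(exponent);
                        if i % 2 == 0 { angle.sin() } else { angle.cos() }
                    })
                    .collect::<Vec<f32>>()
            })
            .collect()
    }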


Interesting! For a business, 25 business days is close to 41 calendar days, too. I wonder if there's some sort of common, human constant involved: 25 days of engaging with something to decide whether it's worth keeping around. Maybe the median relationship length is 25 days, too...


Is that a joke, or are you just misreading the stats on that page? Reddit definitely isn't down 30% of the time.


Joke, but not far off.


Usually, you pay what they ask and, if you don't like the price, you don't use their work. It's very simple, but OpenAI skipping that step opens them up to courts deciding that price, if a price is actually owed. That price would probably be more than 0, which is what the authors have gotten so far.


Doesn't that kind of demonstrate the value being actively stolen from the creators, more than anything? Copyright law killed Napster, too. That doesn't mean applying copyright law was wrong.


And now the guy who started Napster is on the board of Spotify, which just decided they weren’t going to pay small-time artists for their music anymore. Go figure.


Spotify and the rights holders come together and agree on the right price. Unfortunately, since long before Spotify ever existed, it's usually the record labels who own the rights to the music, and the actual creators still get shafted.


Except in this case, tens of thousands of rights holders are just getting nothing as of the start of 2024, and I can tell you Spotify certainly did not “come together” with any of us.


Didn’t they basically say they’re no longer going to bother paying for songs earning under $3 USD per year?

It seems like the only people that will be impacted are the abusive auto-generated spammer accounts with thousands of garbage tracks uploaded, garnering 1200 streams a year from people accidentally playing them via Google Home misinterpretations, etc.


1000 streams per year for each song. That’s not just that auto-generated junk. That’s a majority of ALL music on Spotify.

So yes, an individual song might be $3 per year, but that just shows how poor their royalties are to begin with. And it obscures the fact that artists don’t just release one song, ever.

There are thousands of artists who may even have been somewhat successful at some point in their career but have a lot of songs in their back catalog that don’t get that many streams annually. Suddenly they’ve gone from not making enough per stream from Spotify to getting paid nothing at all.


Of course it was wrong. Abolish all copyright.


Well, we didn't mind the mold and mildew as much, for starters.


The features vs versions graph is more than a little nonsensical, and its conclusions even more so.

> They should be stable and move slowly so that most of the time of your developers is not spent fighting with their tools.

I can't speak to Go's tooling base, but Rust's is head and shoulders above any other ecosystem that I've had to work with. This is definitely more a point in Rust's favor than against.


Go's tooling is much better in one key aspect: compile times. Rust's compile times and feedback loop are terrible, and it's not a community priority (i.e. they want to make it faster, but only incrementally, not the orders-of-magnitude improvement it needs). Go's could be better, but at least it is a priority.


Rust's compile times are worse than Go's, but they're still way better than people make them out to be.

Here are some examples:

---

Rust Analyzer:

Clean build: 31.02s

Incremental build: 3.07s (Added a `println!` in a level-2 dependency in the dependency graph)

SLOC: 306,467

SLOC of dependencies: ~879,304 (after removing `win*` packages, which make up another 1.3M)

---

Zola:

Clean build: 24.58s

Incremental build: 1.34s (Added a `println!` in a level-2 dependency in the dependency graph)

SLOC: 17,233

SLOC of dependencies: ~2,087,781 (after removing `win*` packages, which make up another 1.3M)

Obviously, not all source code gets compiled due to conditional compilation, but it makes for a good approximation if you only take into account 2/3 of it.

---

For comparison with Zig by building ZLS:

Clean build: 20.13s

Incremental build: 10.27s (I believe this builds the ZLS package again from scratch so not really incremental)

SLOC: 45,806

SLOC of dependencies: I don't know how to get this.

---

This was on a Ryzen 5600X.


At the hobby level, I am constantly in awe of how good the Rust tooling is.


Go’s is ok. Not as good though.


Can you provide an example of where Rust has better tooling than Go?


Offhand, I like rustfmt a lot more than gofmt because it handles long lines rather than only changing things within a given line; in my opinion, any time I need to manually format something myself, that's a failure of the formatting tool.

I also find rustdoc to be much more flexible than Go's documentation functionality (which I forget the name of, but I assume it's something like `go doc`). At least the last time I used it, Go required any doc comment for a function, type, etc. to start with the name of the item being documented; not only is that redundant, but it also means that any typo will cause the comment to be omitted completely, whereas Rust only requires a third slash for doc comments (i.e. `///` rather than just `//`). It's a minor thing, but small quality-of-life things add up across an entire ecosystem.
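
To make that last point concrete, here's roughly what the Rust side looks like (a hypothetical function, just to show that the third slash is all rustdoc needs):

    // A plain `//` comment: rustdoc ignores this.

    /// Parses a duration string like "5m" or "2h" into seconds.
    /// The extra slash is the only thing marking this as documentation;
    /// no naming convention is required for rustdoc to pick it up.
    pub fn parse_duration(input: &str) -> Option<u64> {
        let (num, mult) = if let Some(n) = input.strip_suffix('h') {
            (n, 3600)
        } else if let Some(n) = input.strip_suffix('m') {
            (n, 60)
        } else {
            (input.strip_suffix('s')?, 1)
        };
        Some(num.parse::<u64>().ok()? * mult)
    }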


Prefixing doc comments with the ident name is a convention that isn't enforced in any way. You must have been using a 3rd party linter.


So any comment that happens to be right before a public type, function, etc. is exported into documentation without any special syntax needed? Honestly, in my opinion that's even worse, given that part of the whole point of tools like this is to generate public-facing documentation for websites; I want doc comments to be visibly different than regular comments, but it sounds like the only thing that would determine whether a comment is exported is the surrounding context.


You're just making problems up now ...


MUDs are a great breeding ground for bespoke programs. I've made my own system of triggers and aliases in Rust, which interfaces with Mudlet (a very popular MUD client) through JSON over stdio. Being written in Rust, it has enabled a publicly usable web tool (http://seurimas.github.io/topper/explainer/?/topper/explaine...), but the majority of the code is just for me.
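
The plumbing is roughly this shape (a minimal sketch using serde_json, not the actual topper code; the "line"/"send" field names are invented for illustration):

    use std::io::{self, BufRead, Write};

    // Read JSON messages from the MUD client on stdin, write commands back on stdout.
    fn main() -> io::Result<()> {
        let stdin = io::stdin();
        let stdout = io::stdout();
        let mut out = stdout.lock();
        for line in stdin.lock().lines() {
            let line = line?;
            // Skip anything that isn't valid JSON.
            let msg: serde_json::Value = match serde_json::from_str(&line) {
                Ok(v) => v,
                Err(_) => continue,
            };
            // Example trigger: react to game text containing "You are bleeding".
            if let Some(text) = msg.get("line").and_then(|v| v.as_str()) {
                if text.contains("You are bleeding") {
                    let reply = serde_json::json!({ "send": "clot" });
                    writeln!(out, "{}", reply)?;
                }
            }
        }
        Ok(())
    }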


The site you linked isn't loading for me. It's just a black page with a favicon.


I had the wrong path for the demo file. It ought to work now.


In point of fact, those were the rules. The DIF protected them past $250k. This was not an exceptional measure. Everything worked as intended.


The FDIC itself called this a "systemic risk exception", but SVB and Signature were not considered systemic before this, and thus had not behaved the same as the systemically important banks.

I am not sure we can say anything worked as intended.


I think the greatest idea for Rust scripting is just supporting WASM scripts. I've had some success toying with wasmer in Bevy.
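
Roughly what the wasmer side looks like (a sketch against the wasmer 3/4-style API; the exported `on_tick` function and its signature are hypothetical, and the Bevy wiring is omitted):

    use wasmer::{imports, Instance, Module, Store, Value};

    // Load a WASM script and call a hypothetical exported entry point.
    fn run_script(wasm_bytes: &[u8]) -> Result<(), Box<dyn std::error::Error>> {
        let mut store = Store::default();
        let module = Module::new(&store, wasm_bytes)?;
        let instance = Instance::new(&mut store, &module, &imports! {})?;

        // `on_tick(i32) -> i32` is an invented export name for the example.
        let on_tick = instance.exports.get_function("on_tick")?;
        let result = on_tick.call(&mut store, &[Value::I32(42)])?;
        println!("script returned: {:?}", result);
        Ok(())
    }

The nice part is the sandboxing: scripts can't touch anything you don't explicitly pass in through the imports.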

