Six years ago, when I was working with a Phoenix API, we were measuring responses in microseconds on local dev machines, and under 5 ms in production with zero optimization. In comparison, the identical Django app had a 50 ms floor.
AGPL and Apache are both open source licenses. So I’m not getting what the confusion would be for an end user, who won’t be modifying the software or packaging it for sale.
Yeah, CORS is not a safety mechanism. It’s a way of loosening the default safety mechanism of not sharing any response data from a cross-site request with client-side JavaScript.
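To sketch that in code (a hypothetical Python example; the allowlist and helper name are invented): the server opts in by sending the `Access-Control-Allow-Origin` header. Without it, the browser still sends the request and receives the response; it just withholds the body from cross-site JavaScript.

```python
# Hypothetical sketch of the server-side opt-in; real servers usually
# set this header in middleware rather than a standalone helper.
ALLOWED_ORIGINS = {"https://app.example.com"}  # invented allowlist

def cors_headers(origin):
    """Extra headers to send for a request whose Origin header is `origin`.

    Returning no CORS header doesn't block the request or the response;
    the browser just refuses to expose the response body to cross-site
    JavaScript, which is the default safety behavior described above.
    """
    if origin in ALLOWED_ORIGINS:
        return {"Access-Control-Allow-Origin": origin}
    return {}
```

So the header is a per-origin loosening of the default, not a gate that stops requests from arriving.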
I recently discovered Claude, and it does much better than Codex or Gemini for Python code.
Gemini seems to lean toward making everything a script, disconnected from the larger vision. Sure, it uses our existing libraries, but the files it writes and the functions it makes can’t be integrated back in.
Codex is fast. Very fast. That makes it great for a conversational UI, and for answering questions about the codebase or proposing alternatives, but when it writes code it’s too clever. The code is valid but not Pythonic. Like inventing one-line functions just to optimize a situation that could have been parameterized in three places.
Claude, on the other hand, writes code that is simple to understand and has enough architecture that you can lift it out and use it as is without too much rewriting.
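To illustrate the "too clever" pattern described above for Codex (a hypothetical Python example; the formatting task and names are invented): a freshly minted one-liner for each near-identical call site, versus one plain function parameterized at the places it varies.

```python
# "Clever": a new one-line function invented per call site.
fmt_usd = lambda v: f"${v:,.2f}"
fmt_eur = lambda v: f"€{v:,.2f}"
fmt_gbp = lambda v: f"£{v:,.2f}"

# Plainer: one named function, parameterized in the three places it's used.
def format_amount(value, symbol="$"):
    """Format a monetary amount with the given currency symbol."""
    return f"{symbol}{value:,.2f}"
```

Both are valid; the second is the kind of code you can lift out and reuse.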
You can often keep services available for VIPs while being down for the public.
Unless there's a misconfiguration, apps are usually visible internally to staff, so there's an existing mechanism to follow to make them visible to VIPs.
But sometimes none of that is necessary. At a $1B market cap company, I've seen a failure case where the solution was manual execution by customer success reps while the computers were down. It was slower, but not many people complained that their reports took 10 minutes to arrive after being parsed by Eye Ball Mk 1s, instead of the 1 minute of wait time they were used to.
If you're a 'startup', you'll never need any of that work until you make it big. 99% of startups don't even make it to medium size.
If you're a small business, you almost never need replicas or pooling. Postgres is insanely capable on modern hardware, and is probably the fastest part of your application if the rest of it is written in a slower dynamic language like Python.
I once worked with a company that scaled up to $30M in annual revenue and never once needed more than a single dedicated server for Postgres.
Why would your distro dictate the upgrade routine? Unless the distro stops supporting an older version of Postgres, you can continue using it. Most companies I know of wouldn't dare upgrade an existing production database for at least 5 years, and when it does happen... downtime is acceptable.
So, I work for a company that runs RTA adult websites. AI bots absolutely do scrape our pages regardless of what raunchy material they'll find. Maybe they discard it after ingest, but I can’t tell. There are thousands of AI bots on the web now, from companies big and small, so a solution like this will only divert a few scrapers.