Yeah, I only read the abstract and looked at the plots, but this is what I hate about public health papers:
They say the prevalence of the virus is down. They don't say that the cancer rate is down (granted, too early to tell), nor do they talk about any adverse events or all-cause mortality differences (again, probably too early to tell).
The only thing they can conclude is that the treatment given to stop the virus stops the virus. But they don't mention any tradeoffs.
Not trying to be an anti-vaxxer conspiracy theorist, but good science needs to talk about the whole picture.
Research papers are not literature reviews. This paper reports on the results of this study. And that study only investigated what it investigated.
In the case of public health, there are a bunch of organizations that keep on top of the research and maintain a more comprehensive view of the current consensus.
For day-to-day guidance, individuals should be referring either to those sources or to healthcare professionals.
If people are looking at individual studies like this to make decisions, something has gone very wrong.
You can’t talk about the whole picture unless you have all the parts. There’s no reason all of those parts have to come from the same study.
The first thing on your list of complaints is something that by your own admission cannot yet be determined. If you're trying not to be an anti-vaxxer, you're doing a bad job of it.
25 years ago when my wife and I were poor grad students we had to do this. I tracked everything religiously and she cut coupons for the grocery store. We were generally positive about $100/month at best. Tracking it allowed us to not go negative.
As soon as we got real jobs with a real income, we didn't waste time with that. Our philosophy now is to just make sure that we spend well under our means and not track. We don't penny-pinch, but we still keep some of the grad school "do I really need this?" mentality.
Our normal spending is somewhere under 1/2 of our take-home (including mortgage), so we just don't worry about it and keep saving. It helps that we don't have fancy tastes. It's a nice stress free way of saving and we don't have to get neurotic about tracking every penny either.
It's probably worth, at some level, not totally losing track of various subscriptions or routine daily purchases. It won't buy you a house, but it can be a few thousand a year.
One of the things that the tracking taught me was to be allergic to subscriptions. I only keep the few that are significantly more convenient because I know I'll use them. Outside of our phones (and the kids don't get phones), we have one music service for the family, I'll allow two video streaming services and if the kids want to add one, they have to pick one to cancel, and I have a coffee subscription because the owner lives down the street from me and it's fresher than I'd get in the grocery store.
It's a good point about the routine daily purchases, I never thought of that. But I live semi-rural so I'm not out every day wandering around the city and picking up a snack or anything like that. I imagine that could add up.
Funny you bring this up, I was just talking about this to someone else in a different context. I'm a pretty old dude for programming, I've been hacking in the field since the early 80's, and professionally developing for the last 25 years. Most people would be pretty unimpressed with my skill set, it's pretty much just "linear algebra", and I've basically solved the same 5 or 10 problems over and over again.
The thing that, I think, has given me a competitive advantage is that I put a significant amount of effort into learning the domain I'm working in. I've gone from health care system to theoretical physics to image processing to logistics to financial plumbing to electricity markets to obscure stuff for the War Department, and so on.
The value that you really provide to a customer is deeply understanding their domain and the problem they have in that domain and then translating that for a computer. If you're just taking tickets off of Jira and writing code without context you're no better than an LLM (just kidding...maybe).
So yes, I suggest that whatever field you're working in, you put the effort into learning the domain as well as practitioners in that domain. That's how you become valuable. It's not easy, but after a few iterations you start to see patterns and it becomes easier.
Maybe some of my bias is that when you have a hammer everything looks like a nail - or in my case: when you have a matrix everything looks like an eigenvalue. YMMV.
We did "free" lunch for all here a couple of years ago. The idea is great; the execution is terrible. You can't get anything a la carte free, only the full "USDA approved" lunch is free. So if you forget a drink, or just want to add a snack to your own packed lunch, you go get the whole thing and throw everything else away.
The elementary school tried adding a "share table" where you could put anything you didn't want so that someone else could pick it up, but that was shut down because they couldn't assure the feds that everyone was getting a "balanced" lunch.
My high schooler tells me about all the kids going through the line multiple times to get pizza on pizza day and then throwing the rest away because they don't want it.
Of course we had a second tax that was approved this year because the free lunches were more expensive than they had planned. Wonder why.
If you wouldn’t mind sharing, what school district was this?
I’m curious to research and learn more! What accounts for the budget overrun? Are there stats on how many free meals were taken per student (especially if this was broken down on a per-day basis, this could back up the “pizza” explanation)?
I mean, this is the nanny state at its best. Getting in the way of progress because you refuse to meet people, in this case kids, where they actually are. The challenge should be minimizing the amount of waste: cook literally anything where the kids will clean their plates, then try to nudge toward healthier options while keeping your waste percentage low. Let them take any subset of the lunch as they please, and prune dishes kids either don't take or leave behind until you have a menu.
Mind boggling how getting the kids actually fed is lower on the priority list than making sure they eat the "right" things.
Not exactly easy. The US military (hell, just about every army on the planet) puts a lot of money and effort into developing field rations that are palatable enough for infantry sections on the move to eat in their entirety. I can't imagine developing them for far more numerous school children is going to be any easier.
If you want a successful lunch program (and rations if you have a to-go bag) look no further than the US Navy's sub program.
Given the environment and danger (and having a bunch of humans in close proximity, deep under the ocean, with nowhere to go, hangry, is not going to inspire unit cohesion) they get really, really good food. Which is probably not a bad thing to give people tooling around with enough firepower to take out a few dozen cities.
Whenever I watch a video about American military nutrition, the only takeaway I have is "are these people incompetent?"
Sailors in the US Navy get fat after their first deployment; common knowledge. Why? Because half the time their food is frozen chicken nuggets, frozen tater tots, etc., chucked into the oven and served in bulk at mess.
The best-funded military of 2025, and that's the best they came up with? Why not just freeze non-deep-fried chicken breast? Why not use lentils for carbs? Why not flash-freeze vegetables?
In any case I don't see the relevance for schools. Hire a chief lunch lady who has the same job a head chef does - find the local produce and dairy and fish and meat, plan meals and portions, organize supply, and direct meals.
>Hire a chief lunch lady who has the same job a head chef does - find the local produce and dairy and fish and meat, plan meals and portions, organize supply, and direct meals.
Who's going to pay for all of that? Not the American taxpayer, who would consider it theft and waste, and not the poor kids who actually need school lunches, and probably not their parents.
You'll wind up with a Macdonald's kiosk in every school cafeteria, and vending machines full of Monster energy drinks.
I found a twitter thread years ago that talked about how the author had gone to school with a lot of (US) mafia children, and the school had unsurprisingly provided lunch via a local vendor with mob connections. Presumably some of the money wound up going to the mob.
But, the thread pointed out, since high-level mafia officials sent their children to that school, they had no interest in skimping on the lunches. And the lunches were excellent. After a big FBI bust, the mob-affiliated vendor was replaced with a major interstate school lunch vendor, and the quality of the food was rock-bottom.
I've tried to find the thread again, but I can't. If anyone else wants to dedicate an unreasonable amount of time to it, I'm pretty sure I originally found it through a links post on Marginal Revolution.
> The US military (hell, just about every army on the planet) puts a lot of money and effort into developing field rations that are palatable enough for infantry sections on the move to eat in their entirety.
Why? That's not even a real concept. If you want everyone to like everything they have, you can't do that without letting them trade away the stuff they hate.
>The CMNR reviewed many of these studies when they were initially completed and noticed that underconsumption of the ration appeared to be a consistent problem. Typically, soldiers did not consume sufficient calories to meet energy expenditure and consequently lost body weight. The energy deficit has been in the range of 700 to 1,000 kcal/d and thus raises concern about the influence of such a deficit on physical and cognitive performance, particularly over a period of extended use. Anecdotal reports from Operation Desert Storm, for example, indicated that some units may have used MREs as their sole source of food for 50 to 60 days—far longer than the original intent when the MRE was initially field tested.
>
>There have been successive modifications of the MRE since 1981. These modifications in type of food items, diversity of meals, packaging, and food quality have produced small improvements in total consumption but have not significantly reduced the energy deficit that occurs when MREs are consumed. This problem continues in spite of positive hedonic ratings of the MRE ration items in laboratory and field tests. The suboptimal intake of operational rations thus remains a major issue that needs to be evaluated.
Or to summarize: soldiers weren't eating the full MREs in Desert Storm, and it was a widespread problem. Soldiers that weren't meeting their caloric intake requirements were suffering cognitive issues while in combat operations. Bit of an issue when you've got two groups of people trying to kill each other and not their own side.
So they figured the best option to get the soldiers to eat their rations was to keep improving and updating until soldiers were more inclined to eat the whole damn thing. I don't know if they've succeeded per se, but they have been updating the menus pretty consistently since the 90's. I think only the beef stew and a few other meal items have stayed consistent over the last 30 years of MREs.
Agreed, though the term makes for a funny metaphor in this case— a good nanny would likely take the same approach you describe here: meeting the kids where they're at and trying to encourage them to eat better along the way instead of making food just for it to be thrown away.
> literally anything where the kids will clean their plates then
Feeding kids sugar and then nudging them to eat slightly less sugar while still providing inherently unhealthy meals seems suboptimal. Them cleaning their plates is not inherently a good thing. Rather the opposite.
> making sure they eat the "right" things.
Certainly better than feeding them the wrong things, though?
It's not like starvation or malnourishment is the main issue when a significant proportion of children are overweight. Them eating crap is...
It's always a treat when the exact problem I'm describing shows up in the replies. Yes, feed them sugar. Children have a significantly heightened sweet tooth until adolescence, when it slowly declines and they develop more complex tastes and a tolerance for "adult" flavors. When I bake for kids I have to make it cloyingly sweet to an adult palate and it gets snarfed down. It's also why Funfetti cake doesn't hit like it did as a kid: your tastes have changed. Trying to impose adult standards on kids is naive at best and futile in aggregate; you can only serve it, you can't make them eat it, and they won't.
You understand how moronic it sounds to prepare and serve food that kids won't eat in the hopes that they eat less, right? Plus free lunch programs are to deal with malnourishment and to make sure kids get at least one full meal a day.
My elementary school, which was a private school and so wasn't beholden to any government meddling, followed this formula and it worked out great. Every meal was carbs, protein, and sugar, and everything was sweet. It wasn't an apple, it was fruit cocktail in syrup; the pizza had sweetened bread and sauce; vegetables were sweet peas, carrots, and corn. Every student was put on a rotation to clean trays, so I got to see first-hand what the waste situation was. It wasn't zero, but you didn't see a tray full of food minus pizza coming back.
> serve food that kids won't eat in the hopes that they eat less, right?
Not hope as such. Ideally they eat it eventually. If they are not allowed to eat unhealthy foods they won't have much of an option. Even the most obstinate ones will change their mind after spending a couple of days being hungry.
This really does keep getting worse: first you were just wasting money for your ideals, now you're suggesting we purposely let kids go hungry until they behave in the way you want. We're beyond "they just don't happen to like what's being served, but you're trying" territory and into "they're going to eat it and like it, or they don't get lunch." Please don't ever run for your local school board.
And no, we didn't all turn out overweight. It's been a minute, but I think in my grade there were three "fat kids," two girls and one boy. I really don't understand why you take being overweight as the natural consequence of this. Kids crave sweets because it's calories and they're growing. In my early teens the size of my meals was on the order of two Chipotle burritos or the entire taco twelve-pack, and I was a perfectly normal weight. I mean, I was a girl in high school so I didn't exactly think that back then, but I was fine. It wasn't until I was post-college and had depression that I put on any kind of significant weight.
I find this attitude super weird. Adults are responsible for what kids eat, and the problem of kids taking multiple lunches can be solved by allowing them to go through only once.
What is weird is that American kids seem to be taught to refuse "healthy" food. Somehow the problem of kids refusing fruits and real food is something that happens only once in a while with a few kids elsewhere, but is apparently epidemic in America.
Yes, we are responsible for what kids eat, which is why it's all the more maddening that we have adults who came up with a menu of how they wished kids ate, made it policy, and took literally no responsibility for the (I think very predictable) outcome.
As an American, if I paid the same taxes but the half that's spent on building B-2 bombers (fine, substitute "devices used to kill people I'll never meet in countries I'll never see") instead went to giving kids so much food they threw half of it away, I would be ecstatic with this change in the distribution of my taxes.
Actually, I think it should be the inverse of that. A CS student should come into CS101 after hacking for years as a teenager, knowing how to use something like Linux from a practical standpoint, and then the college courses should be all about the theory and ideas.
I remember when I was 10 or 12 or so, hacking on my IBM 8086 in BASIC, I accidentally "invented" the bubble sort. In fact, mine was extra slow and inefficient because both my outer and my inner loop went from 1 to N and there was no early exit if no swaps were made. A true O(N^2) algorithm. I didn't know what O(N^2) meant, but I had some understanding that things quickly got slower.
Then later in CS101 I learned about big-O and all the theories around sorting and it immediately clicked because I had a deep understanding of something that I experienced and then could tie it to real theory. The other way around - learning the theory before the experience - wouldn't have worked as well.
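For anyone who wants to see the difference concretely, here's a minimal Java sketch (class and method names mine) of the "accidentally invented" version next to the textbook refinement it was missing:

```java
import java.util.Arrays;

public class NaiveBubbleSort {
    // The "accidentally invented" version: both loops run the full
    // range and there is no early exit, so every input costs a full
    // Theta(n^2) worth of passes, sorted or not.
    static void naiveSort(int[] a) {
        int n = a.length;
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < n - 1; j++) {
                if (a[j] > a[j + 1]) {
                    int tmp = a[j];
                    a[j] = a[j + 1];
                    a[j + 1] = tmp;
                }
            }
        }
    }

    // Textbook refinement: shrink the inner range each pass and stop
    // as soon as a pass makes no swaps, so already-sorted input is O(n).
    static void bubbleSort(int[] a) {
        boolean swapped = true;
        for (int pass = 0; swapped; pass++) {
            swapped = false;
            for (int j = 0; j < a.length - 1 - pass; j++) {
                if (a[j] > a[j + 1]) {
                    int tmp = a[j];
                    a[j] = a[j + 1];
                    a[j + 1] = tmp;
                    swapped = true;
                }
            }
        }
    }

    public static void main(String[] args) {
        int[] a = {5, 3, 1, 4, 2};
        naiveSort(a);
        System.out.println(Arrays.toString(a)); // [1, 2, 3, 4, 5]
    }
}
```

Both produce the same sorted output; the only difference the theory explains is how much work they do getting there.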
To tie it to your comment, you should have a deep experience with your OS of choice and then when you go to school, you learn why things are the way they were.
When I say this I often get accused of gatekeeping, but I don't view it that way. I look at it like other majors that have existed longer than CS. I often make an analogy to music majors. I can't enroll as a freshman and say I'm going to be a music major without ever having played an instrument. People get accepted to a music department after they demonstrate ability (usually through the equivalent of hacking while they were kids), and in their music classes they learn theory and how to play different instruments (just like learning different OSes or languages).
I kind of feel that CS should be the same way, you should show up to CS101 knowing how to do things from deep experience. You may not know any of the whys or theory, that's fine, but you should have experience in doing.
To tie it back to the parent: you should come to CS knowing how to run Linux, maybe because you copied configurations or scripts from the dark corners of the internet. And then the CS classes should be around why it's all that way. E.g., you know that to schedule something you use cron; and CS would be a discussion around how generic OSes need a way to schedule tasks.
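To make the cron example concrete: the kind of line a student might have copied from the dark corners of the internet without understanding (path and schedule invented here) looks like:

```
# crontab fields: minute hour day-of-month month day-of-week  command
30 2 * * * /home/user/backup.sh    # run backup.sh every day at 02:30
```

The student already knows this works; the CS course is where they'd learn why an OS needs a daemon that wakes periodically, compares these fields against the clock, and spawns the job, and what tradeoffs that design makes versus other scheduling approaches.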
I think the reason music works that way is that the industry is already hyper-competitive as is, so the “selection” process for talent shifts earlier. Perhaps CS will go that way eventually.
Anyway, when I made the comment, I was thinking it should be an elective and intended for people who either aren’t that familiar with Linux or want to become even more comfortable with it. There are certainly plenty of such students in my experience, myself included when I was in college.
Also, just to be clear, this shouldn’t be just “being able to run Linux at home” level of material, but things like writing non-trivial applications using Linux subsystems and being able to troubleshoot them.
But I think he's the only one to have done it right. I've never seen velocity tracking correct for measured inaccuracy in each developer's estimates. I've tried so many times to implement his EBS approach, but no one wants to do it.
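For anyone unfamiliar with it, the core of Evidence Based Scheduling is simulating ship dates from each developer's historical estimate-to-actual ratios instead of a single velocity number. A rough Java sketch of the Monte Carlo step (all names and numbers are mine, not FogBugz's actual implementation):

```java
import java.util.Arrays;
import java.util.Random;

public class EbsMonteCarlo {
    // velocityHistory holds a developer's past actual/estimate ratios.
    // EBS-style simulation resamples from that history rather than
    // averaging it, so erratic estimators produce a wide spread of
    // possible totals instead of a falsely precise single date.
    static double[] simulateTotals(double[] estimates, double[] velocityHistory,
                                   int trials, long seed) {
        Random rng = new Random(seed);
        double[] totals = new double[trials];
        for (int t = 0; t < trials; t++) {
            double total = 0;
            for (double est : estimates) {
                // Scale each remaining estimate by a randomly drawn ratio.
                total += est * velocityHistory[rng.nextInt(velocityHistory.length)];
            }
            totals[t] = total;
        }
        Arrays.sort(totals);
        return totals; // sorted, so totals[(int)(0.95 * trials)] ~ 95th percentile
    }

    public static void main(String[] args) {
        double[] estimates = {4, 8, 2, 16};        // hours for remaining tasks
        double[] history = {1.0, 1.5, 2.0, 1.25};  // past actual/estimate ratios
        double[] totals = simulateTotals(estimates, history, 10_000, 42);
        System.out.printf("median %.1fh, 95th pct %.1fh%n",
                totals[totals.length / 2], totals[(int) (0.95 * totals.length)]);
    }
}
```

The output is a distribution ("50% chance by X, 95% chance by Y") rather than a date, which is exactly the part teams tend to resist adopting.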
I worked for a company which adopted FogBugz. The multiplier it calculated to apply to developer time estimates quickly diverged towards positive infinity. It's probably fair to share some of the blame for that between us and it. Nonetheless, we managed to hit our quarterly release deadline well in advance of the predicted five to six years :-P.
Sure. If I remember the details correctly, it was some sort of individual approximation of 'velocity story whatever points' too, to make it less arbitrary and obviously stupid to KPI-PIP-stack-rank people with.
Yes. I do this a lot when writing linear algebra code. All the math texts write things in 1-based notation for matrices, and the closer I can make the code match the paper I'm implementing, the easier life gets. Of course there's a big comment at the beginning of the function when I modify the pointer to explain why I'm doing it.
One trick that I always fall back on is to make a dependency graph. In meetings I used to pull up yuml.com but now I use mermaid. You can just start typing text and arrows and it renders in real time what depends on what. It's great in a live meeting to help focus people on where the problem really is, or in documentation to show why a change here will affect something there.
Both yuml and mermaid don't give you control over layout. I think that's a feature. If the layout engine can make a pretty picture, that means your dependencies aren't too complex; but if the graph looks terrible and complicated, that means your system is also probably terrible and complicated.
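For anyone who hasn't tried it, "text and arrows" is really all there is to it. A mermaid dependency graph (module names invented) is just:

```mermaid
graph TD
    auth --> db
    billing --> auth
    billing --> db
    reports --> billing
    reports --> db
```

Paste that into any mermaid-enabled renderer and it lays the graph out for you; in a meeting you can add an arrow per sentence as people talk.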
Totally agree on the lack of layout control being a hidden strength. When the graph looks like spaghetti, that's not a tool issue - it's a systems issue
I wish we could use that, but mermaid doesn't have a way to assure that no data is stored on mermaid's servers, so I can't use it for anything proprietary, or even work-related at all. LucidChart has a way to tie into corporate, though.
Does corporate IT need to whitelist every VSCode extension that's being used? I can see the logic (it's running arbitrary code on your system, as your user, on their network) but damn! How does that even work? A self-hosted VSCode marketplace or something?
Basically. VSCode supports airgapped or offline installs of plugins. Store them in Artifactory in an arbitrary location like vs-code-plugins and then ask an admin to install them on your VM.
Ouch. We are headed that direction. The problem is, if a vulnerability is found in a plugin, then you have to get everyone to manually upgrade. Doing it this way means everyone’s software is always out of date, which has its own drawbacks too.
mermaid layout doesn't scale for me so I keep using yEd/yFiles and their tgf (trivial graph format -- tab-delimited relations) input with orthogonal layout. It's a bit of a hassle in a meeting but updates take about 15 seconds to refresh if you have everything set up. Automating it fully would require an expensive license.
> I don’t think Java makes people into bad programmers, but I do think it selection-biases for intellectually unambitious engineers. They learn exactly enough Java in college to pass their courses, and then get a job at a BigCo that doesn’t strictly require ever learning anything more than what they were taught in their “intro to data structures” course.
I think that's a fair comment, but there's also this perspective: I first touched Java 1.1 in 1997 in college, and only for a semester. Then for the next 22 years I never looked at a line of Java, working mostly in C++ and Python plus dabbling in FORTRAN for high-performance stuff that needed to be written there. I generally consider myself not intellectually unambitious.
Then I moved to a Java shop that specifically needed high-performance math (well, at least as high performance as you can get in Java, which is actually pretty good now). But we stick to Java 8 compatibility because we have some BIG customers who require that because of LTS on Java 8. There are some nice constructs that would help make the code more readable and modern, but when you need to support people and actually make money, you do what you need to.
Sure, I am not claiming that you have to use every new feature every day 100% of the time. Obviously there are cases where you can’t upgrade for legal or compliance or “customer is just being difficult” reasons.
A lot of Java jobs aren’t that though, especially internal applications. A lot of places are running Java 17 or Java 21 on all their servers, literally have no plans to ever support anything lower, but the engineers are still writing Java like it’s 2003. That is what’s maddening to me.
I think it has a lot to do with work culture. Many tend to mimic what others are doing in order to not stick out.
At my previous job some were able to change that by consistently using "modern" features of Java. It inspired others to change and eventually we ended up with a good code base.
Be the one to start the change by implementing new features using good code. This will give others "permission" to do the same. Also try to give soft suggestions in code reviews or pair programming of simpler ways to do it (don't push too hard)
At my current job all of us were eager to try the latest features from the start, so we never had to convince new hires.
I know I came off as a bit negative, but in fairness to them, they did more or less continue working on what I was doing using the newer Java 21 features, and after I got a few pretty interesting changes merged in some of the more junior engineers started using them too; particularly I was able to successfully evangelize against the use of `synchronized` in most cases [2] and got at least some people using queues to synchronize between threads.
It honestly has gotten a fair bit easier for me since I've been doing this for a while; at my last job I was the most experienced person on my direct team (including my manager) and one of the more experienced software people at the company, so I was able to throw my weight around a bit more and do stuff how I wanted. I tried not to be a complete jerk about it; there were plenty of times people would push back on what I was doing and I would think about it and agree that they were probably right, but I outwardly rejected arguments that seemed to be based on "I didn't learn this in university so it's wrong".
I have had other jobs (at much bigger companies) where they were not amenable to this. I would try and use new features and my PRs would be rejected as a result, usually with some vague wording of "this way is faster", which I later found out was (as far as I can tell) always a lie.
[1] It is not hard to find my job history but I politely ask you do not post it here.
[2] I'm sure someone here can give me a contrived example of where `synchronized` makes sense but if you need mutexes I think you're almost always better off with a ReadWriteLock or ReentrantLock.
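For concreteness, here's the shape of that swap; a minimal sketch (class and names mine) of a counter guarded by a ReentrantLock, including the timed tryLock that `synchronized` simply can't express:

```java
import java.util.concurrent.TimeUnit;
import java.util.concurrent.locks.ReentrantLock;

public class Counter {
    private final ReentrantLock lock = new ReentrantLock();
    private long count = 0;

    // The synchronized equivalent would be:
    //   public synchronized void increment() { count++; }
    public void increment() {
        lock.lock();
        try {
            count++;
        } finally {
            lock.unlock(); // unlike synchronized, you must unlock in finally
        }
    }

    // Something synchronized cannot do: give up instead of blocking forever.
    public boolean tryIncrement(long timeoutMillis) throws InterruptedException {
        if (!lock.tryLock(timeoutMillis, TimeUnit.MILLISECONDS)) {
            return false; // lock was contended; caller can back off or retry
        }
        try {
            count++;
            return true;
        } finally {
            lock.unlock();
        }
    }

    public long get() {
        lock.lock();
        try {
            return count;
        } finally {
            lock.unlock();
        }
    }
}
```

The price is the boilerplate try/finally; the payoff is timeouts, interruptible acquisition, and fairness options, which is why I reach for the explicit locks once contention actually matters.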