I've been working on a map that shows which neighborhoods in a city are nice/not nice with a short description.
Whenever I visit a new city, just looking at Google Maps is pretty meaningless - it's just a bunch of gray land and streets. I end up looking up Reddit posts for where to go, searching for crime maps, trying to find annotated maps, etc. to get a better idea of where to visit in a city (or even live, like when I moved to Austin). AI-generated scoring and descriptions, while imperfect, have already helped me when visiting SF recently. Early stage, so please submit corrections if you'd like!
I just checked a few cities I have been to and this seems surprisingly accurate. May I ask how you are collecting this data? Is it through Google reviews, or some other source?
Very cool! I really like the idea. The way I'd develop this further is by having live crime reporting on it so that you know which streets to avoid, similar to what Waze does, where people report incidents.
I like the idea! In my opinion (looking at SF) it’s still too low resolution. SF in particular can vary greatly in safety, walkability, etc. even a few blocks over within a neighborhood.
It's not an AI issue. It's that in 4 years of a CS degree, CS students never touch a single kubectl command and barely build one functional web application in a single software-engineering-related course. It's the failure of CS programs that is causing job market issues for college graduates.
It's just that when money was easy, companies could pay to train interns and recent grads in things they should have learned in school. Now money is tight, so we see these job market problems. I am hopeful that colleges will adapt their curricula based on the changing job market.
> It's the failure of CS programs that is causing job market issues for college graduates.
Definitely not. These failures have existed for decades. But it ultimately doesn't matter. You have the capacity to learn or you don't. CS programs are a waste of time, sure, but they have nothing to do with the job market.
I'd go so far as to say your comment is a perfect example of the problem. "kubectl" doesn't mean jack shit to being a good engineer. It's a technology we will use for a few years and then move on to something else.
Absolutely nobody is cutting junior positions because they don't know "kubectl"
The problem is developers are now DevOps, QA, PMs, Customer Support, and everything else.
CS programs teach fundamentals, though not necessarily the most popular tools in the industry. Those skills help regardless of whatever specific tools you end up using. I don’t think it’s a waste of time.
> The problem is developers are now DevOps, QA, PMs, Customer Support, and everything else.
As long as the hour expectations are reasonable, I actually like this. This teaches you so many more skills than being pigeonholed into software engineering. Granted, I like the idea of one day founding a business, so I may need to be able to do all of these roles some day.
I think you may be taking the parent comment too literally. Kubectl itself doesn't matter - replace it with anything you want - it represents graduates' inability to do anything useful in the real world.
It doesn't make much sense to spend four years preparing for a job (market) and be completely unprepared at the end of it.
Well, being a good engineer doesn't mean "jack shit" if you're homeless and hungry because no one will hire you to exchange money for labor to support your addictions to food and shelter.
If I have to explain everything in detail, check behind your work as a junior dev, and wait for you to get up to speed, I might as well just use ChatGPT.
If I do need a person, why would I hire a junior engineer when I can just recruit an underpaid mid-level developer who's already proven himself and gotten up to speed, especially if I can hire someone cheaply living in MiddleOfNowhere Nebraska who is willing to work for less than I made over a decade ago?
Implicit in this is an assumption that the purpose of a higher education is to give graduates the skills demanded by the businesses. What if it (the purpose of a higher education) was something else? What if it had always been something else, and only in the last few decades did a certain segment of the population try to convince us that the only reason higher education existed was to churn out workers that could slot straight into entry-level jobs without businesses having to invest in any training first?
College is a massive time and monetary commitment. The reason the vast majority of people go through it is because employment opportunities are gated away behind a degree. Without one, finding the job you want can be extremely difficult, and at times even impossible.
It always bothered me how colleges hold someone's future career hostage like this, forcing them to go through them, and then they pretend it's not the case and that college is simply intellectual enrichment.
The gating that colleges do also makes it much more difficult to change careers than it should be. The whole system does the task it's given extremely poorly, and pretends that it's not even responsible for the task. Great, fine. Let's work on alternative ways to handle job credentialing then.
>It always bothered me how colleges hold someone's future career hostage like this, forcing them to go through them, and then they pretend it's not the case and that college is simply intellectual enrichment.
>The gating that colleges do
It feels weird to me to blame colleges for job description requirements that they didn't write. Colleges aren't gatekeeping you, people requiring a degree and refusing to train employees are.
It's not a "certain segment" that changed their mind. It's the reality of what's happening to our country (countries). We can no longer, at large, afford the luxury of spending so much money ONLY to "learn to think". We also need practical education more than before. That's the undeniable nature of what has happened to the economy, especially in software.
I can guarantee you that the only people who went to college to be “better citizens of the world” were those who were privileged enough to have parents who could afford to subsidize their living.
That's completely untrue. I paid my way through school off student loans & the wages from my internships. I got a degree in computer science, but I took a wide variety of electives outside my program because I wanted to get a broad education.
Taking a course in macroeconomics is good for you; it makes you smarter. Sociology makes you smarter, pure math makes you smarter, English lit makes you smarter; microbiology, philosophy, and history all make you smarter. There's a lot of value in a liberal education. I certainly wouldn't have traded mine away for a primer on Kubernetes or whatever.
As soon as I needed k8s on the job, I skimmed through the O'Reilly book and that was all I needed. The least valuable thing school can give you is an explanation for how to use a tool which is already well-documented.
All colleges have electives. I also took business classes with my computer science degree and while I am an MBA drop out, learning about business helped me talk my way into a lot of opportunities over the years. Of course learning how to write and communicate is actually part of the leveling guidelines to get to be a senior at every tech company.
And math classes are essential to understanding machine learning algorithms and how to apply them, and that math is what's behind the recent AI craze.
But I bet you would think differently if you were a junior in today’s market…
> All colleges have electives. I also took business classes [which] helped me talk my way into a lot of opportunities over the years. [...] And math classes are essential to understand machine learning algorithms [and get a job related to] the recent AI craze.
These are exactly the sort of career-focused courses I wasn't talking about.
> But I bet you would think different if you were a junior in today’s market…
No. A k8s crash course from my alma mater wouldn't be any more use to me now than when I was entering the job market.
Machine learning is not a “new craze”; using computers to predict outcomes was a thing when I was in grad school in 2001.
You explicitly mentioned pure math and macroeconomics. Those are definitely helpful in a career. I was a math double major and I just recently started studying the latest in ML (not just gen AI) and my math background definitely helped.
And do history and philosophy make you “smarter”? Maybe at cocktail parties, but it doesn’t make me better at exchanging labor for money to support my addiction to food and shelter, and I definitely didn’t need to pay thousands of dollars for it.
Your anecdote does not make it "completely untrue". Only "not absolutely true", which isn't compelling in the face of the overwhelming other experiences.
> Only "not absolutely true", which isn't compelling in the face of the overwhelming other experiences.
Which is to say, yes, the person I was replying to was incorrect. But you liked the vibe of what they were saying, so now you're going to play with language to suggest that their statement might be true in some "non-absolute" sense where it doesn't matter whether the thing they said is correct or not.
Most people pursue higher education for multiple reasons. A lot of people just want the "campus experience." A lot of people enjoy learning. Many people just feel like it's expected of them. Any degree is associated with a boost in wages, so most people expect their loans to pay for themselves in time, especially at a cheaper institution. So the idea that middle-class people don't go to university to become better-read is patently silly.
If you are a “middle class” person, that means if you don’t get a degree or don’t get a job, you probably have parents to fall back on, or at least a home to go back to. That means your parents can “subsidize your living”, just like I said, even if it’s just letting you stay rent-free.
Much of my generation (hitting their twenties in the 1980s) went to university if they had the interest and Tertiary admissions scores over the threshold.
University was free, rent was cheap, particularly in shared houses, and part time work abounded (I worked three months of the year in mining or agriculture).
Many of those at the time were idealistic to a degree, almost all wanted to better themselves in some way or another.
Well, my still living parents grew up in the segregated South in the 40s-50s. There was no “idealism” about why my mom and her 3 sisters went to college and her brother went to trade school. They knew that college was the only way out, their parents were already struggling and they had no choice but to go to a “Black” college (now HBCU) because they were not allowed to go anywhere else.
On the other hand, I grew up as an only child with my mom a teacher and my dad a factory worker. While I knew I wasn’t going to be homeless or hungry or put undue burden on my parents, college was solely a means (dual degree in computer science and mathematics) to be employable even though by the time I went to college I had already been programming in 65C02 assembly and some BASIC for six years and was learning 68K assembly on my Mac my freshman year.
But knowing C and how to bit twiddle definitely helped me get a job straight out of college - a week after I graduated.
It would seem that some benefit flowed from the pragmatic idealism of the likes of Alexander Crummell and others who worked and fought hard to establish HBCUs.
I went to university while a number of kids I played football with didn't, the kind of event that prompted many to study law and parallel that with art.
And if you didn’t get a job right out of school would you have been homeless and hungry or could you just have moved back in with your parents? If you hadn’t gotten a job 5 years out of school that would have allowed you to support yourself would you be homeless or hungry?
I told both of my sons that I wouldn’t pay for a degree that would be less likely to lead to a decent paying job. When my youngest graduated from high school, I was working for BigTech, but I made it abundantly clear that a college's sole purpose was to get you gainfully employed.
> I made it abundantly clear that a college's sole purpose was to get you gainfully employed.
That's not true at all, though. An education has loads to offer beyond a bump in your wages. Do you really think knowledge is worthless unless you can make money with it?
You'll earn the money back with the increased wages a university degree affords you.
I never said knowledge was the only purpose of a degree—that'd be as ridiculous as saying "gainful employment" was the only purpose. It has multiple uses. This should be obvious.
And thank goodness we don't teach kubectl and never will.
Knowledge at the level of kubectl is worthless to the vast majority of grads. It's also extremely time-limited. In 5 years everything will have moved on.
Web apps? What fraction of our grads build web apps? How much will the technology for doing so change in 5 years? 10 years?
Universities are not vocational schools. We teach people to think. Then they learn what the details are on the job. This has always been the role of universities.
If you want something else, go to a coding bootcamp.
This isn't viable. This isn't a normal vocational job - it's too abstract/overwhelming for everybody to learn all of it on the job to the needed degree. We need to stop tricking kids into this path. Either CS needs to teach practical skills (I understand things evolve fast, but you're exaggerating, and regardless, an education in the current thing is absolutely invaluable, and also prepares you for learning the next thing on your own), or we need to make it much more clear that another education path is better. Also, boot camps are worthless for this.
Universities are research oriented; Kubernetes is a practical skill you'd expect to learn at a trade school - no different from learning fluid dynamics vs. plumbing.
I have never met a single human being in person who did not believe that a crucial part of acquiring practical skills in software (not "trades") for young people is to go to university. Maybe people shouldn't theoretically expect this, but there is theory, and then there is reality. We need to make them match, in whichever direction.
Yes, I'm sure people say that, but that's because they don't think before they speak.
You wouldn't expect the same from a doctor, lawyer, or engineer. The problem is that everyday people aren't aware that there is a difference between software development and computer science...
If you're teaching fundamentals, then force students to utilize those fundamentals in applied or industry use cases.
For example - leveraging the K8s thread - force students in their OS class to understand HOW to apply data structures to manage scalability problems, AND force students to understand Linux kernel internals like cgroups.
Most universities don't offer courses that connect these fundamentals together - not even my Ivy League alma mater, based on my own survey of curricula.
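To make the "connect fundamentals to systems" point concrete, here's a toy sketch of my own (not from any particular curriculum): an LRU cache built from a hash map with ordering, the same eviction idea an OS course covers in page replacement, and one a bridge course could tie directly to real caching layers.

```python
from collections import OrderedDict


class LRUCache:
    """Toy least-recently-used cache: an OrderedDict gives us both
    O(1) key lookup and recency ordering in one structure."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()

    def get(self, key):
        if key not in self.data:
            return None
        self.data.move_to_end(key)  # mark as most recently used
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)  # evict least recently used


cache = LRUCache(2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")    # "a" becomes most recently used
cache.put("c", 3) # capacity exceeded: "b" is evicted
```

The point isn't this exact exercise; it's that the eviction policy here is the same reasoning students need when discussing kernel page caches or container resource limits.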
Everyone is incentivized to understand the bare minimum, overindex on Leetcode, and skim over the harder fundamental courses. On top of that, ime at my alma mater, most CS faculty were essentially applied math nerds who could see beauty in a well-formatted inductive proof but would glaze over when pushed on implementing Paxos using best practices around kernel or system performance. And vice versa for the systems nerds.
In a world where AI/ML can increasingly automate away boilerplate work and even conduct limited reasoning, understanding how fundamentals and (shudder) first principles connect to become a product or solution is what matters.
And thus, this becomes a critical thinking problem, which just cannot be learnt without experimentation and getting into the weeds.
What he’s advocating for is that people don’t want to go tens of thousands of dollars into debt without expecting to get a job, because they need money to survive and to pay off said debt.
This has not been the case in the last 10 years in most of the top 50 programs. Capstone projects and senior seminar classes often touch on more "serious" topics. I've seen classes focusing on projects related to distributed/cloud computing, complex web apps, or the hot-topic-of-the-year technologies (game AI, computer vision, machine learning).
Aside from a few exceptionally awful periods like 2009, when did fresh CS graduates struggle to find tech jobs? We all went through similar curriculums and picked up industry knowledge perfectly fine.
Careful about your labels! I went to Brown in the late 90s and considered a CS degree before moving to math.
Even then, among academic programs, Brown was considered a bit lowbrow in that it spent a fair amount of time on software and systems engineering: this was all part of the CS department, but it wasn’t science per se.
Today most CS programs are in fact software engineering programs; well and good, but they give themselves the same name as the original discipline, which was, while engaged in engineering, more theoretical.
I can think of so many counterexamples to what I just wrote that I’m sort of excited for the comments, but tl;dr: in the late 90s, if you wanted engineering skills you hired MIT or Caltech, not Harvard. (Stanford alums didn’t move east, and even then were not generally hirable for cash.)
I’ll note my current perspective is probably aligned with your complaint - we need more engineers, and quality engineers, than we do theoretical CS undergrads by at least two orders of magnitude. The best schools taught their engineers theory, and I think that goes with the history of the discipline - most of the greats were tinkerers at least, more usually engaged in real engineering work while working on theory. The other way - getting theorists to become great coders - seems less common.
And in fact my era at Brown produced a number of influential engineering folks like Bryan Cantrill, so despite its non-exalted status, it did the world some good!
And yet if you flip the paradigm entirely, you pretty much get coding bootcamps, which certainly don't have a great track record either. The answer is probably some more ideal balance between theory and practice, like Waterloo's CS program.
Related: During solo travelling whenever a thought crosses my mind to do something and my instinctual internal response is discomfort, I try to make myself do it - even if I feel awkward inserting myself or going back.
I've had so many awesome conversations with random interesting people every day during my trips thanks to this. I've gone places I'd otherwise not experience, all for the sake of exciting adventure and pushing my own bounds. The confidence that comes from this is significant.
Also, as a former remote software engineer of 3 years, it has been so energizing to socialize with people again. Best upper that there is.
There's a LOT here. I feel this applies to a lot of decisions.
For instance, if you want to make a product that requires a database and you like building database stuff, do the database stuff last. Do what is difficult first - fail fast.
The easy or default route will always be well known to someone.
Solo travelling was how I formed one of my most salient memories of the "moat of low status", to wit: going to Japan in 2011. Japan is an advanced G7 country, but unlike most of the rest, very few people there speak or understand English. So I was put in the position of having to get by with my shitty Japanese, or attempt to communicate even more futilely with the locals in English and seem like an even bigger, more clueless asshole. I think I gained more levels of Japanese in those two weeks than I did in two years of university education.
My lifetime best command of Italian was when I lost the keys to my apartment and had to ask around if anyone had seen them.
At that point I had already been living part-time in Italy for over two years, but since I was working remotely for a company in my home country, I hardly had an opportunity to learn the language.
Fortunately Italians appreciate people attempting to speak their language.
Man, HN is sleeping on this right now. This is huge. 20% of the web is behind Cloudflare. What if this was extended to all customers, even the millions of free ones? Would be really amazing to get paid to use Cloudflare as a blog owner, for example
The cynic in me says we'll be seeing articles about blog owners getting fractions of a tenth of a penny while Cloudflare pockets most of the revenue.
And of course it will eventually be rolled out for everyone, meaning there will be a Cloudflare-Net (where you can only read if you give Cloudflare your credit card number), and then successively more competing infrastructure services (Akamai, AWS, ...), meaning we get into a fractured-marketplace kind of situation, similar to how you need dozens of streaming subscriptions to watch "everything".
For AI, it will make crawling more expensive for the large players and lead to higher costs for AI users - which means all of us - while at the same time making it harder for smaller companies to start something new and innovative. And it will make information less available to AI models.
Finally, there’s a parallel here to the net neutrality debate: once access becomes conditional on payment or corporate gatekeeping, the original openness of the web erodes.
This is not the good news for netizens it sounds like.
I worked at Cloudflare for 3 years until very recently, and it's simply not the culture to behave in the way that you are describing.
There exists a strong sense of doing the thing that is healthiest for the Internet over what is the most profit-extractive, even when the cost may be high to do so or incentives great to choose otherwise. This is true for work I've been involved with as well as seeing the decisions made by other teams.
That's the impression I get from Cloudflare - it seems like a group of highly skilled people attempting to solve real problems for the benefit of the web as a whole. As both a paid business user and a free user for home projects, I deeply appreciate what they've accomplished and how generously they allow unpaid users to benefit from their work.
I worry about what happens someday when leadership changes and the priority becomes value extraction rather than creation, if that makes sense. We've seen it so many times with so many other tech companies, it's difficult to believe it won't happen to Cloudflare at some point.
You are probably right that this is not the case right now. 25 years ago you could have said the same about Google employees. Incentives change with time, and once infrastructure is in place, it's nearly impossible to get rid of it again.
So we'd better make sure it doesn't have the potential to introduce further gatekeepers, who will later realize that, in order to survive, they need to put profit above everything else, and then everything goes out the window.
And then 20 years later Cloudflare hits hard times and gets bought by someone you don't like. The problem is that much power concentrated in any one place.
I lived in Munich for a month a few summers ago. While I enjoyed it, and it was definitely clean, I couldn't help but describe Munich as a "city where people go to work". Pleasant, but not exciting. Very walkable, though!
I'm on the SSL/TLS team @ Cloudflare. We have great managed certificate products that folks should consider using as certificate validity periods continue to shorten.
Simply having a domain managed by Cloudflare makes it magically https; yes, the traffic between the origin server and Cloudflare isn't encrypted, so it's not completely "secure", but for most uses it's good enough. It's also zero-maintenance and free.
Is this a joke (as in, that you don't actually work there) to make CF look bad for posting product advertisements in comment threads, or is this legit?
It's one of my first times posting on HN, thought this could be relevant helpful info for someone. Thanks for pointing out that it sounds salesy, rereading my comment I see it too now.