It's not much, just a portfolio, but I've loved HN for years now, so I want to put mine on this thread for posterity: elginbeloy.com (https://elginbeloy.com)
Is there a place I can read about taking unique creative approaches to ordinary topics/games/concepts like this? Something like "Thinking Different with Basics". I like this so much because it gets at an essence of creativity applied to the obvious that I don't know how to learn or search for :(
I like his perspective that creativity = using an existing pattern in a new context.
You can be more creative by first consuming lots of different patterns in all sorts of contexts (e.g. playing lots of games, and also reading and experiencing lots of topics unrelated to games).
Then you try all the different permutations of patterns in your mental toolbox. Kinda like how the sibling comment rattled off different what-ifs.
Sadly, I don't know if this can be taught per se, as it wobbles along the "creativity" line.
I'd say you'd need genuine curiosity along with a "what if" mindset that is hard to teach. The path to these ideas is often a train of what-ifs: what if Snake was 3D? Then what if it was 3D on a planet? What about a cube?
You can apply the same thought to other games. What if Pong was 3D, or played on a sphere? What if Pong supported 100 people playing together? How would that work?
Often the what-ifs will be dead ends or uninteresting. It's like sales: a volume game. But you've got to like the process or you won't get far.
Definitely. This is a pretty common approach: take an existing game, break it down into its constituent mechanics, then swap one of them out for a mechanic from an unrelated game. Rinse and repeat.
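The swap-a-mechanic loop can even be sketched mechanically. A toy Python sketch, where the games and their mechanic lists are just made-up placeholders, not any real taxonomy:

```python
import itertools

# Hypothetical breakdowns of a few games into constituent mechanics.
games = {
    "snake": ["grid movement", "growing tail", "self-collision death"],
    "pong": ["paddle deflection", "two players", "ball physics"],
    "tetris": ["falling pieces", "line clearing", "rotation"],
}

def mashups(games):
    """Yield (base game, mechanic to drop, mechanic to borrow) triples:
    take a base game, remove one of its mechanics, and swap in a
    mechanic from some other game."""
    for base, donor in itertools.permutations(games, 2):
        for old in games[base]:
            for new in games[donor]:
                yield (base, old, new)

ideas = list(mashups(games))
# 3 games -> 6 ordered (base, donor) pairs, 3x3 swaps each = 54 what-ifs
print(len(ideas))
for base, old, new in ideas[:3]:
    print(f"what if {base} had '{new}' instead of '{old}'?")
```

Of course the volume-game point applies: most of the 54 outputs are nonsense, and the human part is spotting the one or two worth prototyping.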
Case in point:
I built a twin-stick version of Snake, called Twins of Caduceus, that requires you to control two snakes simultaneously. I even have a custom arcade box with two four-way joysticks so you can control one snake with each hand, though you can also play it with a regular keyboard. It's a lot of fun, but you practically need the kind of hands that come built-in with localized neural ganglia to get a high score.
Snake was my very first OpenGL program (well, past a cube). You learn quite a bit about the basics and why one more dimension is not always better.
Fun times, this takes me back quite a bit. Definitely from the "what if" mindset, I was seeking something complex enough for learning and simple enough to actually finish. I must have been 15 or 16 at the time.
Yes! Curiosity is the way to open these doors. The first step is to keep a log of your thoughts: anything that pops up, write it in your ideas book. Having ideas isn't all or nothing; it's a practice. Get into the practice of writing down your small ideas and you will develop the ideas muscle.
Anybody who has meaningfully engaged with short-form, dynamically adapted video content and also read a book can EASILY tell the difference. It's morphine vs. fentanyl.
You say in your comment "Anybody with a computer science education ... should be able to tackle this one", which directly contradicts what they advertise: "You don't need a computer science background to participate".
100% agree, but not just military. Self-driving vehicles will become the norm, along with robots to mow the lawn and clean the house, and eventually humanoids that can converse like LLMs and help out around the house.
Curating a dataset is vastly different from introducing a new architectural approach. ImageNet is a database. It's not like inventing convolutions for CNNs, or the LSTM, or the Transformer.
It's true that these are very different activities, but I think most ML researchers would agree that it was actually the creation of ImageNet that sparked the deep learning revolution. CNNs were not a novel method in 2012; the novelty was having a dataset big and sophisticated enough that it was actually possible to learn a good vision model from it without hand-engineering all the parts. Fei-Fei saw this years in advance and invested a lot of time and career capital setting up the conditions for the bitter lesson to kick in. Building the dataset was 'easy' in a technical sense, but knowing that a big dataset was what the field needed, and staking her career on it when no one else was doing or valuing this kind of work, was her unique contribution, and took quite a bit of both insight and courage.
Exactly right. Neatly said by the author in the linked article.
> I spent years building ImageNet, the first large-scale visual learning and benchmarking dataset and one of three key elements enabling the birth of modern AI, along with neural network algorithms and modern compute like graphics processing units (GPUs).
Datasets + NNs + GPUs. Three "vastly different" advances that came together. ImageNet was THE dataset.
"CNNs and Transformers are both really simple and intuitive" and labeling a bunch of images you downloaded is not simple and intuitive? It was a team effort and I would hardly call a single dataset what drove modern ML. Most of currently deployed modern ML wasn't trained on that dataset and didn't come from models trained on it.