In 1996/97 I had the chance to use Tcl/Tk to build one of the first stock tickers on the web, DigitalTrader [1]. Later, in 2005, we used it to build some of the first vector embeddings for early biological language models at Lawrence Berkeley National Lab [2,3], for space biosciences. Still a fan.
How can that be, when we have yet to fully define what human intellect is or how it works? Not to mention consciousness. Machine intelligence will always be different from human intelligence.
And at the heart of AlphaFold2 is a language model, the tip of the spear in AI today. 'Language' can come in many forms, e.g., a protein or amino-acid sequence.
AlphaFold 2 wasn't Q-learning based. It was trained with supervised SGD, and the "Evoformer" they introduced is very close to a transformer. So it's not exactly an LLM, but it's a pretty close equivalent for protein data.
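To make the "close to a transformer" point concrete: the core operation shared by transformers and the Evoformer is scaled dot-product self-attention. Below is a minimal NumPy sketch of a single attention head, not AlphaFold's actual architecture, with made-up dimensions just for illustration.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention.

    X: (seq_len, d) input token representations.
    Wq, Wk, Wv: (d, d) learned projection matrices.
    Returns (output, attention_weights).
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    # Similarity of every position to every other, scaled by sqrt(d).
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    # Row-wise softmax: each position's weights over the sequence sum to 1.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

rng = np.random.default_rng(0)
L, d = 5, 8  # hypothetical sequence length and model dimension
X = rng.normal(size=(L, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
out, attn = self_attention(X, Wq, Wk, Wv)
print(out.shape)   # (5, 8)
```

In a real model the W matrices are learned by supervised SGD against a loss; here they are random placeholders.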
This person is forgetting the entire operation is based on space biosciences, not just space. Vector Space Biosciences presents at DeSci London March 2024 - Min: 4:27:33
https://youtu.be/fbnFEvfKRO8?t=16052
They are embedded into a particular semantic vector space that is learned by a model. Alternatively, a feature vector could be hand-rolled through feature engineering, e.g., tf-idf over n-grams. Embedding is typically distinct from manual feature engineering.
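To illustrate the hand-rolled side of that distinction: a tf-idf feature vector is fully determined by counting, with no learned model involved. A minimal sketch over a made-up two-document corpus (stdlib only, word-level terms rather than n-grams for brevity):

```python
import math
from collections import Counter

def tfidf(docs):
    """Hand-rolled tf-idf: one vector per document over a shared vocabulary."""
    N = len(docs)
    tokenized = [doc.lower().split() for doc in docs]
    # Document frequency: in how many docs each term appears.
    df = Counter()
    for tokens in tokenized:
        df.update(set(tokens))
    vocab = sorted(df)
    vectors = []
    for tokens in tokenized:
        tf = Counter(tokens)
        # tf (normalized count) times idf (log of inverse document frequency).
        vectors.append([tf[w] / len(tokens) * math.log(N / df[w]) for w in vocab])
    return vocab, vectors

docs = ["the cat sat on the mat", "the dog ran"]
vocab, vecs = tfidf(docs)
# A term present in every document ("the") gets idf = log(1) = 0,
# so it contributes nothing; rarer terms get positive weight.
```

A learned embedding, by contrast, would come out of a trained model's weights rather than from counts like these.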
[1] https://www.orafaq.com/usenet/comp.databases.oracle.misc/199...
[2] https://newscenter.lbl.gov/2005/03/31/a-search-engine-that-t...
[3] https://patents.google.com/patent/US7987191B2/en