
It's already possible to get some of this effect with Codex. The trick is to keep appending the interaction history to the prompt (to maintain a memory of sorts).
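In rough terms, the loop looks something like the sketch below (a minimal illustration assuming the pre-1.0 openai Python package and a Codex-style completion model; the ask helper and the exact model name are just placeholders, not the actual memprompt code):

    # Minimal sketch: chat-style interaction with a completions model by
    # carrying the running transcript ("memory") in every prompt.
    # Assumes the pre-1.0 `openai` package; adjust client/model as needed.
    import openai

    memory = ""  # running transcript of the conversation so far

    def ask(user_message: str) -> str:
        global memory
        # Send the accumulated transcript plus the new turn as the prompt,
        # so earlier turns act as a memory of sorts.
        prompt = memory + f"User: {user_message}\nAssistant:"
        resp = openai.Completion.create(
            model="code-davinci-002",   # assumed Codex model name
            prompt=prompt,
            max_tokens=256,
            temperature=0,
            stop=["User:"],             # stop before the model writes the next user turn
        )
        answer = resp["choices"][0]["text"].strip()
        # Store the completed turn so the next call sees the full history.
        memory = prompt + " " + answer + "\n"
        return answer

    print(ask("Write a Python function that reverses a string."))
    print(ask("Now add type hints to it."))  # resolved via the stored memory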

For example, you can replicate all the prompts here: https://twitter.com/yoavgo/status/1599200756631887872 with prompt + memory.

The notebook at https://github.com/madaan/memprompt/blob/main/YoavsPythonPro... shows a demo of this.

Some of these ideas were discussed earlier in our work on memory-assisted prompting [1].

[1] https://arxiv.org/pdf/2201.06009.pdf



After going through some of the chatter on Twitter, I think the UI plays a critical role in creating the wow effect. We have had models like Codex that can do this with assistance from memory for some time now (quick example at https://www.memprompt.com/), but the ChatGPT interface made it so much easier to use and to see nicely formatted results. Huge lesson IMO for folks working on AI + product.

Of course, the whole family of models from OpenAI is amazing; it's just that one of the key takeaways here is the importance of making models accessible and fun.



