LLMs augmented with RAG (retrieval-augmented generation) have great potential for docs as well.
Have a problem? Ask the LLM and it will reference the docs, so you don't have to read through 40 pages just to find an answer.
Some products already use this for their docs; more will follow.
The advantage is that the LLM can pull from up-to-date docs, hopefully pinpoint the relevant pages accurately, and summarize an answer for the user.
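The retrieval half of that pipeline can be sketched in a few lines. This is a minimal illustration, assuming a toy keyword-overlap scorer in place of a real embedding model; the function names and the sample doc pages are made up for the example.

```python
# Toy doc retrieval for a RAG setup: rank doc pages by how many query
# terms they contain, then hand the top page(s) to the LLM as context.

def score(query: str, page: str) -> int:
    """Count query terms that appear in the page (case-insensitive)."""
    terms = set(query.lower().split())
    words = set(page.lower().split())
    return len(terms & words)

def retrieve(query: str, pages: dict[str, str], k: int = 1) -> list[str]:
    """Return the titles of the k pages most relevant to the query."""
    ranked = sorted(pages, key=lambda t: score(query, pages[t]), reverse=True)
    return ranked[:k]

# Hypothetical doc pages standing in for a real product's documentation.
docs = {
    "Install": "run pip install to set up the package",
    "Auth": "configure an api key before calling the client",
    "Errors": "timeouts raise a retryable error after three attempts",
}

print(retrieve("how do I configure my api key", docs))  # -> ['Auth']
```

A production version would swap the keyword overlap for vector similarity over embeddings, but the shape is the same: score, rank, and pass the winners to the model as context.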
I also think startups will emerge that focus on providing this kind of service; several probably exist already. This is similar to how, before LLMs, some companies focused purely on better access to the docs of open-source products.