> Task-tracking and note-taking are practical and useful, but ultimately I want to treat my own thoughts as if they have value. I want to be a little more intentional and deliberate in my own thinking, and to have a space to engage in dialog with my own ideas. I want to be able to draw from my own knowledge instead of relying on AI assistants for everything. Maybe such an approach can even be complementary to using AI tools; with the right plugins Obsidian can serve as an MCP server, which would allow tools like Claude to discover and read items in your vault. Perhaps this could offer the best of both worlds. But the key thing is that the AI is the assistant, and my thoughts and ideas remain my own.
Maybe I'm missing the author's point, as it's early here, but I don't see how your own thoughts could possibly lack value because of AI. LLMs can only summarize the documents they were trained on, so they have no way to tell you what you're thinking (or why something you believe is wrong). The value of AI here is using RAG or semantic search to make your notes more useful to you. What the author's suggesting is outside the capabilities of current LLMs. By design, AI can only be used as an assistant.
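To make the "semantic search over your notes" idea concrete, here is a minimal, self-contained sketch using only the standard library. It substitutes bag-of-words vectors and cosine similarity for real embeddings (which a production setup would get from an embedding model); the vault contents and note names are made up for illustration.

```python
# Toy retrieval over a notes vault: bag-of-words vectors + cosine similarity
# stand in for real embeddings. Vault contents below are hypothetical.
import math
from collections import Counter

def vectorize(text):
    """Lowercase bag-of-words term counts for a piece of text."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def search(vault, query, k=2):
    """Return the names of the k notes most similar to the query."""
    qv = vectorize(query)
    ranked = sorted(vault, key=lambda name: cosine(vectorize(vault[name]), qv),
                    reverse=True)
    return ranked[:k]

vault = {
    "mcp-notes.md": "obsidian can act as an mcp server so claude can read the vault",
    "gardening.md": "tomato seedlings need hardening off before transplanting",
    "rag.md": "retrieval augmented generation grounds an llm in your own notes",
}
print(search(vault, "let an llm read my obsidian vault"))
```

The point is that retrieval of this kind only surfaces what you already wrote; the notes themselves remain the source of the ideas, which is exactly the "AI as assistant" framing above.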