nostr relay proxy

event page

nostr:npub1vwymuey3u7mf860ndrkw3r7dz30s0srg6tqmhtjzg7umtm6rn5eq2qzugd nostr:npub1h8nk2346qezka5cpm8jjh3yl5j88pf4ly2ptu7s6uu55wcfqy0wq36rpev I think you would both like this. Basically, it's a way to work with your vault notes and feed them to an LLM as context so it can generate new notes or summarize your existing ones. Continue.dev can run against a hosted provider or against a local LLM if you have one running in an Ollama Docker container. I haven't experimented with letting it autocomplete or create templates for me yet, but the potential productivity gains seem massive.
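
To make the idea concrete, here is a minimal sketch (not Continue.dev itself, just the underlying idea) of feeding vault notes to a local model. It assumes Ollama is reachable on its default port 11434 (e.g. from the official Docker image), that a model such as llama3 has already been pulled, and that the vault is just a folder of markdown files; the paths and model name are placeholders.

```python
import json
from pathlib import Path

import requests  # assumes `pip install requests`

# Assumptions (not from the note above): Ollama is running locally on its
# default port 11434, a model has been pulled (e.g. `ollama pull llama3`),
# and the vault is a folder of markdown files.
VAULT_DIR = Path.home() / "vault"                   # hypothetical vault location
OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's generate endpoint
MODEL = "llama3"                                    # any model you have pulled


def load_notes(vault: Path, limit: int = 20) -> str:
    """Concatenate a handful of markdown notes to use as context."""
    chunks = []
    for note in sorted(vault.glob("**/*.md"))[:limit]:
        chunks.append(f"## {note.stem}\n{note.read_text(encoding='utf-8')}")
    return "\n\n".join(chunks)


def summarize(notes: str) -> str:
    """Ask the local model to summarize the notes and suggest new ones."""
    prompt = (
        "You are a note-taking assistant. Summarize the key themes in the "
        "following notes and suggest three new notes worth writing:\n\n" + notes
    )
    resp = requests.post(
        OLLAMA_URL,
        json={"model": MODEL, "prompt": prompt, "stream": False},
        timeout=300,
    )
    resp.raise_for_status()
    return resp.json()["response"]


if __name__ == "__main__":
    print(summarize(load_notes(VAULT_DIR)))
```

Continue.dev essentially wires this same kind of local endpoint into your editor, so instead of a one-off script you get the model as chat and autocomplete over your notes.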
