I am running the lightest-weight version of most of these models, so you might see a big downgrade from something like Claude. Doing some testing right now between llama3.1:8b and phi3.5:3B; RAM usage is at the bottom. I also have deepseek-coder:1.3B running at the same time. Phi is a little snappier on the M1 and leaves me a little more RAM to work with. https://image.nostr.build/541c524409726e906aa1b5b7c20f1fdeb99822947cc587e79743f853bfa99a51.jpg
