SeekStorm - sub-millisecond full-text search library & multi-tenancy server in Rust · zhenbo_endle@lemmy.ca · 1 year ago · 12 points · 0 comments
Rust and Neovim - A Thorough Guide and Walkthrough (rsdlt.github.io) · zhenbo_endle@lemmy.ca · 1 year ago · 16 points · 0 comments
Mozilla-Ocho/llamafile: Distribute and run LLMs with a single file. (github.com) · zhenbo_endle@lemmy.ca · 1 year ago · 15 points · 0 comments
Local LLaMA Server Setup Documentation (github.com) · zhenbo_endle@lemmy.ca · 1 year ago · 15 points · 0 comments
Running Local LLMs, CPU vs. GPU - a Quick Speed Test (dev.to) · zhenbo_endle@lemmy.ca · 1 year ago · 17 points · 1 comment