#core-infrastructure #meta #software #selfhosting #networks #LLMs #my-world #outside #house #wood #fire #organisation #occupation

All 5 posts tagged LLMs:

Open-webui

open-webui, formerly ollama-webui, is a Svelte application that serves as a ChatGPT-like interface for ollama, which serves other local and remot...


twinny

Twinny is a vscode plugin for copilot-like functionality with local LLMs, such as code completion, chat and test generation. In contrast to most plugins like this (cody, codegpt etc.) they don't see...


ik_llama

ik_llama.cpp is a fork of llama.cpp, and llama.cpp was originally what ollama was built on. (Ollama has since rewritten the bits they depended on.)

...


Ollama

Ollama is a tool for running local LLMs. It exposes an interface to download and run different published LLMs from its library (which seems largely huggingface-backed).

How do I use it?

Curren...


In 2023, I spent a lot of time playing with LLMs. Over the course of the year I got more confident in the idea that anything worth doing can be done locally. LLMs are so cool and fun to play around wi...