Local LLama LLM AI Chat Query Tool
Query a local model from your browser.
Summary
Elevate your browsing experience with this Chrome extension, designed to interact with local language models hosted on your own server. It lets you query local models quickly and precisely, all from within your browser.
The extension is compatible with llama.cpp and .gguf models, giving you a versatile solution for running models locally. To get started, grab the latest version, which includes a sample llama.cpp Flask server for your convenience. You can find the server on our GitHub repository:
GitHub Repository - Local Llama Chrome Extension: https://github.com/mrdiamonddirt/local-llama-chrome-extension
To set up the server, install its pip package with the following command:
pip install local-llama
Then, just run:
local-llama
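Once the server is running, the extension sends your prompt to it and displays the model's reply. The sketch below shows the general shape of such a request from Python; the endpoint path, port, and payload fields are assumptions for illustration, not the actual local-llama API, so check the GitHub repository for the real interface.

```python
# Hypothetical sketch of querying a local llama.cpp Flask server.
# The URL, port, and JSON field names are assumed, not taken from local-llama.
import json
import urllib.request

def build_query(prompt, max_tokens=128):
    """Assemble a JSON payload for an assumed local completion endpoint."""
    return {"prompt": prompt, "max_tokens": max_tokens}

def query_local_model(prompt, url="http://127.0.0.1:5000/completion"):
    """POST the prompt to the assumed local server and return its JSON reply."""
    data = json.dumps(build_query(prompt)).encode("utf-8")
    req = urllib.request.Request(
        url, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

# Example (requires the local server to be running):
# print(query_local_model("Summarize this page in one sentence."))
```

The browser extension performs the same kind of HTTP round trip from its popup, so if a request like this works from your machine, the extension should be able to reach the server too.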
Safety
Risk impact
Local LLama LLM AI Chat Query Tool is safe to use. It does not request any sensitive permissions.
Risk likelihood
Local LLama LLM AI Chat Query Tool is probably trustworthy. Prefer extensions from established publishers when available, and exercise caution when installing this extension.