Local LLama LLM AI Chat Query Tool
Extension summary
Elevate your browsing experience with our Chrome extension, designed to interact seamlessly with local models hosted on your own server. It lets you query local models quickly and precisely, directly from within your browser.
Our extension is compatible with both Llama CPP and .gguf models, giving you a versatile solution for running local models. To get started, grab the latest version, which includes a sample Llama CPP Flask server for your convenience. You can find this server in our GitHub repository:
GitHub Repository - Local Llama Chrome Extension: https://github.com/mrdiamonddirt/local-llama-chrome-extension
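For orientation, here is a minimal sketch of what a llama.cpp-backed Flask server of this kind might look like, using the llama-cpp-python bindings. The /prompt route, port 5000, and the model path are illustrative assumptions, not necessarily how the packaged server is laid out; consult the repository for the actual implementation.

# Minimal sketch of a llama.cpp Flask server (assumed layout, not the packaged one)
from flask import Flask, request, jsonify
from llama_cpp import Llama

app = Flask(__name__)
# The .gguf model path is an assumption; point it at a model file you have locally.
llm = Llama(model_path="models/llama-2-7b-chat.Q4_K_M.gguf")

@app.route("/prompt", methods=["POST"])
def prompt():
    data = request.get_json(force=True)
    # Generate a completion for the submitted prompt text.
    result = llm(data.get("prompt", ""), max_tokens=256)
    return jsonify({"text": result["choices"][0]["text"]})

if __name__ == "__main__":
    app.run(host="127.0.0.1", port=5000)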
To set up the server, install the server's pip package with the following command:
pip install local-llama
Then, just run:
local-llama
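Once the server is running, you can sanity-check it before using the extension. The snippet below assumes the server listens on port 5000 and exposes a /prompt endpoint (both assumptions; check the repository's README for the actual host, port, and route):

# Quick check that the local server responds (endpoint and port are assumptions)
import requests

resp = requests.post(
    "http://127.0.0.1:5000/prompt",
    json={"prompt": "Say hello in one sentence."},
    timeout=120,
)
resp.raise_for_status()
print(resp.json())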
Extension safety
Risk impact
Local LLama LLM AI Chat Query Tool does not require any sensitive permissions.
Risk likelihood
Local LLama LLM AI Chat Query Tool has earned a fairly good reputation and likely can be trusted.
Similar extensions
Here are some Chrome extensions that are similar to Local LLama LLM AI Chat Query Tool: