ollama-ui

This extension hosts an ollama-ui web server on localhost

Total ratings

4.46 average (28 ratings)

Review summary

These summaries are automatically generated weekly using AI based on recent user reviews. Chrome Web Store does not verify user reviews, so some user reviews may be inaccurate, spammy, or outdated.
Pros
  • Easy to get started with local large language models
  • Useful tool for interacting with Ollama server
  • Good user interface
  • Wonderful extension that simplifies the setup process
Cons
  • Compatibility issues with certain hardware (e.g., the Quadro K6000)
  • Problems with using the Enter key to submit messages
  • Issues with fetching responses when running from a client PC
  • Requires a local server setup, which can be complex for some users
Most mentioned
  • Request for additional features (e.g., saving chats, uploading files)
  • Problems related to accessing the Ollama server from different devices
  • Interest in having a more user-friendly GUI and accessibility from various devices

User reviews

Recent rating average: 4.50
All time rating average: 4.46

Rating filters

5 star: 70% (19)
4 star: 19% (5)
3 star: 4% (1)
2 star: 0% (0)
1 star: 7% (2)
Reviews (date, author, language, comment):
2024-10-31, Sultan Papağanı (en): Please update it and add more features; it's awesome. (Suggestions: a setting for sending with Enter, image upload/download if it doesn't exist, export chat to .txt, and renaming saved chats. The save dialog asks for our name; it should ask for a chat name or something.)
2024-10-27, Manuel Herrera Hipnotista y Biomagnetismo (en): Simple solutions, as all effective things are. Thanks.
2024-08-13, Farheinheigt TooNight: (no comment)
2024-08-08, Bill Gates Lin (en): How do I set the prompt?
2024-08-08, Damien PEREZ (Dadamtp) (en): Yep, it's true, it only works with Ollama on localhost. But my Ollama runs on another server, exposed by openweb-ui, so I made a reverse proxy: http://api.ai.lan -> 10.XX.XX.XX:11435. The extension can't access it. I also tested the direct IP, http://10.1.33.231:11435, but you force the default port: failed to fetch -> http://10.1.33.231:11435:11434/api/tags. Finally, I made an SSH tunnel: ssh -L 11434:localhost:11435 [email protected]. It works, but it's not pretty.
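The tunnel workaround described in the review above can be sketched as follows. The hostnames and ports mirror the reviewer's setup but are placeholders, not tested values; `OLLAMA_HOST` and `OLLAMA_ORIGINS` are environment variables honored by the Ollama server, though whether this extension can be pointed at a non-default endpoint is not guaranteed:

```shell
# Forward local port 11434 (the extension's hard-coded default) to a
# remote machine's Ollama instance listening on 11435.
# "user@remote-host" is a placeholder for the reviewer's redacted address:
ssh -L 11434:localhost:11435 user@remote-host

# Alternative sketch: have Ollama itself listen on all interfaces and
# allow browser-origin requests (CORS), avoiding the tunnel entirely.
# The wildcard origin is illustrative only, not a security recommendation:
OLLAMA_HOST=0.0.0.0:11434 OLLAMA_ORIGINS='*' ollama serve
```

With the tunnel in place, the extension's default http://localhost:11434 endpoint transparently reaches the remote server, which sidesteps both the forced-port bug and browser CORS restrictions.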
2024-08-06, Fabricio cincunegui (en): I wished for a low-end-friendly GUI for Ollama; you made it, thanks.
2024-06-10, Денис Гасило: (no comment)
2024-05-24, Shafiq Alibhai: (no comment)
2024-05-19, Frédéric Demers (en): Wonderful extension, easy to get started with local large language models without needing a web server, etc. Would you consider including a MathJax library in the extension so that equations are rendered correctly? Something like <script type="text/javascript" src="https://cdnjs.cloudflare.com/ajax/libs/mathjax/3.2.2/es5/latest.min.js?config=TeX-AMS-MML_HTMLorMML"></script>, or packaging MathJax as a local resource, perhaps.
2024-05-09, Hadi Shakiba (en): It's great to have access to such a useful tool. Having 'copy' and 'save to txt' buttons would be a fantastic addition!