ollama-ui

This extension hosts an ollama-ui web server on localhost

Total ratings

4.46 (Rating count: 28)

Review summary

These summaries are automatically generated weekly using AI based on recent user reviews. Chrome Web Store does not verify user reviews, so some user reviews may be inaccurate, spammy, or outdated.
Pros
  • Easy to get started with local large language models
  • User-friendly and simple interface
  • Great tool for chatting with local models
Cons
  • Limited compatibility with certain hardware (e.g., Quadro K6000 card)
  • Issues with using 'Enter' key to submit messages
  • Problems accessing the Ollama server from client PCs (see the configuration sketch below)
Most mentioned
  • Request for more features (e.g., export chat, upload/download images)
  • Desire for improved compatibility and functionality
  • Feedback on setup complexities and limitations
See reviews for ollama-ui on Chrome Web Store
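
A recurring theme in the cons and in the 2024-08-08 review below is that the extension can only reach an Ollama server that accepts requests from the browser. As a minimal sketch, and assuming your Ollama version honours the documented OLLAMA_HOST and OLLAMA_ORIGINS environment variables and that the ollama binary is on PATH, a server started like the following should be reachable both from a browser-extension origin and from other PCs on the LAN (the exact origin pattern may need adjusting for your setup):

```python
# Sketch only (not the extension's own code): start `ollama serve` with a
# bind address and CORS origin list that let a browser extension or
# another PC on the LAN talk to it. OLLAMA_HOST and OLLAMA_ORIGINS are
# the environment variables Ollama documents for this.
import os
import subprocess

env = dict(os.environ)
env["OLLAMA_HOST"] = "0.0.0.0:11434"            # listen on all interfaces, not only 127.0.0.1
env["OLLAMA_ORIGINS"] = "chrome-extension://*"  # allow requests sent from browser extensions

# Runs until interrupted (Ctrl+C); assumes the `ollama` binary is on PATH.
subprocess.run(["ollama", "serve"], env=env, check=True)
```

Note that binding to 0.0.0.0 exposes the server to every machine on the network, so this is only appropriate on a trusted LAN.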

User reviews

Recent rating average: 4.50
All-time rating average: 4.46

Rating breakdown

5 star: 70% (19)
4 star: 19% (5)
3 star: 4% (1)
2 star: 0% (0)
1 star: 7% (2)
2024-10-31 · Sultan Papağanı (en): Please update it and add more features. It's awesome. (A send-on-Enter setting; image upload/download if it doesn't exist yet; export chat to .txt; renaming the saved chat title: it asks for our name, but it should say "Chat name" or something.)
2024-10-27 · Manuel Herrera Hipnotista y Biomagnetismo (en): Simple solutions, as all effective things are. Thanks.
2024-08-13 · Farheinheigt TooNight: (no written review)
2024-08-08 · Bill Gates Lin (en): How do I set the prompt?
2024-08-08 · Damien PEREZ (Dadamtp) (en): Yep, it's true, it only works with Ollama on localhost. But my Ollama runs on another server exposed by openweb-ui, so I made a reverse proxy, http://api.ai.lan -> 10.XX.XX.XX:11435, but the extension can't access it. I also tested with the direct IP, http://10.1.33.231:11435, but you force the default port: failed to fetch -> http://10.1.33.231:11435:11434/api/tags. Finally, I made an SSH tunnel, ssh -L 11434:localhost:11435 [email protected]. It works, but it's not sexy. (See the connectivity sketch after the review list.)
2024-08-06 · Fabricio cincunegui (en): I wished for a low-end-friendly GUI for Ollama. You made it, thanks.
2024-06-10 · Денис Гасило: (no written review)
2024-05-24 · Shafiq Alibhai: (no written review)
2024-05-19 · Frédéric Demers (en): Wonderful extension, easy to get started with local large language models without needing a web server, etc. Would you consider inserting a MathJax library in the extension so that equations are rendered correctly? Something like <script type="text/javascript" src="https://cdnjs.cloudflare.com/ajax/libs/mathjax/3.2.2/es5/latest.min.js?config=TeX-AMS-MML_HTMLorMML"></script>, or perhaps package MathJax as a local resource.
2024-05-09 · Hadi Shakiba (en): It's great to have access to such a useful tool. Having 'copy' and 'save to txt' buttons would be a fantastic addition!
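
The 2024-08-08 review above reaches a remote Ollama instance by forwarding it onto the local port the extension expects (ssh -L 11434:localhost:11435 ...). A quick way to confirm that such a tunnel (or any local server) is answering, assuming it is already listening on localhost:11434, is to hit the same /api/tags endpoint that appears in the review's error message and list the models it reports:

```python
# Minimal check, assuming something (a local server, or an SSH tunnel to a
# remote one) is already listening on the address ollama-ui expects.
# /api/tags is the Ollama endpoint that returns the installed model list.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # the address the extension expects

with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags", timeout=5) as resp:
    tags = json.load(resp)

for model in tags.get("models", []):
    print(model["name"])
```

If this prints the expected model names, the extension pointed at http://localhost:11434 should see the same list.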