Page Assist - A Web UI for Local AI Models

Use your locally running AI models to assist you in your web browsing.

Total ratings

4.92 (Rating count: 71)

Review summary

These summaries are automatically generated weekly using AI based on recent user reviews. Chrome Web Store does not verify user reviews, so some user reviews may be inaccurate, spammy, or outdated.
Pros
  • Intuitive user experience and modern design
  • Fully-featured with support for local models and RAG functionality
  • Easy to use within the browser instead of a command prompt
  • Great for connecting to Ollama with a clean UI
  • Supports multiple languages and formats
Cons
  • Sidebar functionality does not work in certain browsers (e.g., Arc on macOS)
  • Lack of support for certain platforms like LM Studio
  • Desire for the ability to connect to multiple URLs or ports in Ollama
Most mentioned
  • Great extension overall
  • Support for local models
  • Need for broader compatibility with different platforms
  • Request for additional features like file format support

Recent reviews

Recent rating average: 4.90
All time rating average: 4.92

Rating filters

5 star: 94% (68)
4 star: 4% (3)
3 star: 0%
2 star: 1% (1)
1 star: 0%
2024-11-19, Amjad Mazen (ar): Very good, but unfortunately it does not support the Arabic language.
2024-11-16, Yiannis Ravanis (en): Great extension, but the sidebar functionality doesn't seem to work when using the Arc browser on macOS. I left-click somewhere on the page, choose "Open Copilot To Chat", and nothing happens! :(
2024-11-12, Jaroslav
2024-11-11, Hyungkuk Jang
2024-11-08, Nicolae Cucuta (en): Thank you, Nazeem, you made a great extension, and it is very useful for a lot of people.
2024-10-27, Manuel Herrera Hipnotista y Biomagnetismo (en): Wow, really nice extension. It feels great to work with a local LLM and Ollama in a useful and practical way. Thanks.
2024-10-23, Ritch Cuvier (en): I like this application. I wish it could use two different URLs or two different ports in the Ollama URLs; that would allow me to utilize multiple PCs.
2024-10-05, Poyraz Akkaya (en): Great, but I cannot use it because I use LM Studio. Please add support for LM Studio, because I have an Intel Arc GPU and Ollama doesn't support it 😊 NOTE: Thank you, developer, for adding it 😊
2024-10-14, Leonardo Grando (en): Congrats, and thanks for bringing us this extension. Fantastic.
2024-10-10, Tristan Nguyen (en): Love it.