Reviews for Page Assist - A Web UI for Local AI Models
Page Assist - A Web UI for Local AI Models, created by Muhammed Nazeem
Total reviews: 73
Developer reply
Posted 2 days ago: Hi, I'm uploading the latest version to the store, but the add-on has not been approved yet. It is still under review. It has been two months, and there is no timeline for when Mozilla Firefox will approve it. I'm sorry.
- Rated 5 out of 5, review by Firefox user 14457244 (7 days ago)
- Very good extension, but the defaults are a little conservative. For example, the text of the page you are viewing is truncated to only 7 kilobytes (around 1,500 tokens) before being sent to the LLM, so for bigger pages the LLM doesn't even know what's on the page. You can change this default under "Settings -> Pipeline settings -> Maximum Content Size for Full Context Mode"; the value is in bytes.
- Rated 5 out of 5, review by Firefox user 12490047 (1 month ago)
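The byte limit the reviewer describes can be related to a token budget with a rough bytes-per-token heuristic. A minimal sketch, assuming roughly 4 bytes per token for English prose; the constant and helper function here are illustrative, not part of Page Assist:

```python
# Rough sketch: estimate the byte value to enter in
# "Maximum Content Size for Full Context Mode" for a desired token budget.
# The ~4 bytes-per-token ratio is a common rule of thumb for English text
# and varies by tokenizer, so treat the result as an estimate only.

BYTES_PER_TOKEN = 4  # heuristic, not a Page Assist constant

def tokens_to_bytes(token_budget: int) -> int:
    """Estimate the byte limit needed to fit `token_budget` tokens."""
    return token_budget * BYTES_PER_TOKEN

# By this heuristic the 7 KB default (7168 bytes) holds roughly 1,800 tokens,
# in the same ballpark as the ~1,500 the reviewer mentions.
print(tokens_to_bytes(8000))  # byte limit sized for an ~8k-token context
```

For example, a model with an 8k-token context window could take a limit of about 32,000 bytes by this estimate, leaving some headroom for the prompt itself.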
- Rated 5 out of 5, review by Robin Filer (4 months ago)
- Rated 5 out of 5, review by Michal Mikoláš (5 months ago): Compatible not only with ollama, but with OpenRouter as well. Very customizable and it just works!
- Rated 5 out of 5, review by Firefox user 19622186 (5 months ago)
- Rated 4 out of 5, review by Firefox user 14478686 (5 months ago): I'm a novice, but I'm finding Page Assist to be a useful tool for finding my way with Ollama; if I had to work with the command line only, I'd have given up. This is great.
- Rated 5 out of 5, review by HumanistAtypik (6 months ago)
- This easily replaces what Firefox's built-in options provide for use with cloud-based LLMs. I can finally use the shortcuts to summarize and rephrase things for work that I can't have going to a cloud. I already had Ollama running locally; the extension detected it automatically and was ready to use out of the box. It seems perfect!
- Absolutely perfect. The only thing I think could improve is the extension icon, which doesn't really match modern browser UIs. Also, I did not understand whether the extension provides tools for Ollama-hosted LLMs to perform web searches, or whether it only uses the Ollama API.
- Rated 5 out of 5, review by Firefox user 19258258 (9 months ago)
- Rated 5 out of 5, review by Firefox user 19244952 (9 months ago)
- This add-on deserves 180K positive reviews (not just 18)! After struggling in vain with Open WebUI, frontends, backends, and bash scripts to make my local LLM work on Linux Mint, I downloaded this and it just worked in 10 seconds. That is it. It required nothing else! Thank you, Mr. Nazeem. Awesome work! Now I need to find out how to add other models from the UI, or of course I can add them from the terminal.
- Rated 5 out of 5, review by Firefox user 19104715 (10 months ago)
- Rated 5 out of 5, review by Firefox user 18939203 (1 year ago)