Page Assist - A Web UI for Local AI Models: reviews
Page Assist - A Web UI for Local AI Models, developed by Muhammed Nazeem
Review by Vaz-Dev
Absolutely perfect; the only thing I think could improve is the extension icon, which doesn't match modern browser UIs well.
Also, I did not understand whether the extension provides tools for Ollama-hosted LLMs to perform web searches, or whether it only uses the Ollama API.
73 reviews
- Rated 5 out of 5 by 飞舞的冰龙, 3 days ago: Why can't this great addon be updated? It is already at version 1.5.65 on GitHub.
Developer's response
Posted 3 days ago: Hi, I'm uploading the latest version to the store, but the add-on has not been approved yet. It is still under review. It has been two months, and there is no timeline for when Mozilla Firefox will approve it. I'm sorry.
- Rated 5 out of 5 by Firefox user 14457244, 8 days ago
- Rated 5 out of 5 by MarsLife, a month ago
- Rated 5 out of 5 by TA, a month ago: Very good extension, but the defaults are a little conservative.
For example, the text of the page you are viewing is by default truncated to only 7 kilobytes (around 1500 tokens) when sent to the LLM.
So for bigger pages the LLM doesn't even know what's on the page.
You can change this default in "Settings -> Pipeline Settings -> Maximum Content Size for Full Context Mode"; the value is in bytes.
- Rated 5 out of 5 by Firefox user 12490047, a month ago
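The 7 KB cap mentioned in TA's review maps to a token budget only approximately. A minimal Python sketch of that relationship; this is a heuristic illustration, not Page Assist's actual code, and the ~4.7 bytes-per-token ratio is an assumed average for English text:

```python
# Illustrates why a 7 KB content cap corresponds to roughly 1500 tokens
# and silently truncates larger pages. Heuristic only; real token counts
# depend on the model's tokenizer.

MAX_CONTENT_BYTES = 7 * 1024  # the default limit mentioned in the review
BYTES_PER_TOKEN = 4.7         # assumed average for English text (~4-5 bytes/token)

def truncate_for_context(page_text: str, max_bytes: int = MAX_CONTENT_BYTES) -> str:
    """Clip page text to a byte budget, keeping a decodable UTF-8 prefix."""
    data = page_text.encode("utf-8")
    if len(data) <= max_bytes:
        return page_text
    return data[:max_bytes].decode("utf-8", errors="ignore")

def estimated_tokens(text: str) -> int:
    """Rough token estimate from the UTF-8 byte length."""
    return int(len(text.encode("utf-8")) / BYTES_PER_TOKEN)

page = "word " * 5000                      # ~25 KB page, well over the cap
clipped = truncate_for_context(page)
print(len(clipped.encode("utf-8")))        # 7168 bytes (7 KB)
print(estimated_tokens(clipped))           # ~1500 tokens
```

Raising "Maximum Content Size for Full Context Mode" simply raises `max_bytes` here; the trade-off is that more of the page reaches the model at the cost of a larger prompt.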
- Rated 5 out of 5 by Robin Filer, 4 months ago
- Rated 5 out of 5 by Michal Mikoláš, 5 months ago: Compatible not only with Ollama but with OpenRouter as well. Very customizable, and it just works!
- Rated 5 out of 5 by Firefox user 19622186, 5 months ago
- Rated 4 out of 5 by Firefox user 14478686, 5 months ago: I'm a novice, but I'm finding Page Assist a useful tool for finding my way with Ollama; if I had to work with the command line only, I'd have given up. This is great.
- Rated 5 out of 5 by HumanistAtypik, 7 months ago: This easily replaces what Firefox's own options provide for use with cloud-based LLMs. I can finally use the shortcuts to summarize and rephrase things for work that I can't send to a cloud. I already had Ollama running locally, and the extension detected it automatically. It was ready to use out of the box. It seems perfect!
- Rated 5 out of 5 by Firefox user 19258258, 9 months ago
- Rated 5 out of 5 by Firefox user 19244952, 9 months ago
- Rated 5 out of 5 by Vick, 10 months ago: This addon deserves 180K positive reviews (not just 18)! After struggling in vain with Open WebUI, frontends, backends, and bash scripts to make my local LLM work on Linux Mint, I downloaded this and it just worked in 10 seconds. That's it. It required nothing else! Thank you, Mr Nazeem. Awesome work! Now I need to find out how to add other models from the UI, or of course I can add them from the terminal.
- Rated 5 out of 5 by Firefox user 19104715, 10 months ago
- Rated 5 out of 5 by Henrique, a year ago
- Rated 5 out of 5 by FFFire, a year ago
- Rated 5 out of 5 by Firefox user 18939203, a year ago
- Rated 5 out of 5 by sun-jiao, a year ago