This app does not work offline. I have switched to Apollo for running offline LLMs; that one actually works. To reproduce:
1. Download a model and let the download finish.
2. Exit the app.
3. Turn on airplane mode, and turn off cell data and WiFi.
4. Open the app and type a chat message.
5. The chat returns a timeout error, long after the point at which a response would have started had the app been connected.
My M2 iPad can handle bigger models, but Fullmoon’s in-app list is constrained to models they have packaged for mobile.
Does not work on iPad Air 4th Gen, even though this listing says it does. That pisses me off.
It is very convenient to use, and the installation is quite simple. It would be even better if it supported the recently released GPT OSS 20B.
3 stars minimum. Love the concept, but I'm unable to install on the Mac. Upon selecting an LLM package, there's no way to continue with the installation.
If you install a model, you can't uninstall it... The app is now like 4 GB, bruh.
Love what it offers. Only request: allow me to delete downloaded models I don’t need. Thanks.
Hope you can add support for local models.
I travel very often, and having access to an LLM locally is a blast. The app is very good. I could ask for integration with online LLMs through my API keys, but I really like that the app is focused on running LLMs locally rather than trying to do everything at once. It's fast, simple, and reliable. Please also create a bsky profile so we can have a fullmoon community there!
PLEASE, if you can add support to save whole chats to the device and copy whole conversations, this would be the perfect simple, powerful app. Love how clean the app is without feeling janky at ALL.