Releases: jasonacox/TinyLLM
v0.15.20 - Chatbot Model
What's Changed
- Enhance model selection persistence in chatbot by @jasonacox in #20
- Model selection behavior improved: a session keeps its selected model even after a refresh. The last model selected is remembered and auto-selected for new sessions, but does not affect existing sessions.
Full Changelog: v0.15.19...v0.15.20
v0.15.19 - Chatbot Docs
What's Changed
- v0.15.19 Chatbot Docs by @jasonacox in #19
- Update URL reader to display reading status to the user while a document is processed.
Full Changelog: v0.15.18...v0.15.19
v0.15.18 - Chatbot Updates
What's Changed
- Chatbot update - model and image handling by @jasonacox in #18
- Model selection will now be stored as a cookie to allow it to persist between sessions.
- Image handling has been updated to recover when switching between vision models and language models. A new `MAX_IMAGES` setting allows more than one image to persist in the same conversation context (the model must support multiple images, or the chatbot will prune them).
- Model selection option: `/model list` will display the list of available models in the chat window.
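As a configuration sketch, the new setting might be exported before launching the chatbot (the value here is illustrative, not a documented default; in Docker you would pass it with `-e`):

```shell
# Illustrative only: keep up to 3 images in the conversation context.
# The model must support multiple images, or the chatbot prunes them.
export MAX_IMAGES=3
echo "MAX_IMAGES=$MAX_IMAGES"
```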
Full Changelog: v0.15.17...v0.15.18
v0.15.17 - Model Selector
What's Changed
- Add model selection UI feature by @jasonacox in #16
- Chatbot - The `/model` command will now open a UI popup with a dropdown, allowing the user to select a model from the list of available models. Alternatively, the user can specify the model in the command itself (e.g. `/model mixtral`) to select it immediately without the popup.

Full Changelog: v0.15.16...v0.15.17
v0.15.16 - Think Tags
What's Changed
- Chatbot - Add `/think filter` command and `THINK_FILTER` environment setting to have the chatbot filter out (not display) the `<think></think>` content from models with built-in CoT reasoning, such as DeepSeek R1.
- Add /think filter command to chatbot by @jasonacox in #15
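A minimal sketch of the filter setting, assuming it takes a true/false value like the other chatbot environment settings (the value shown is an assumption, not from these notes):

```shell
# Assumed usage: hide <think>...</think> reasoning blocks from the chat display.
export THINK_FILTER=true
echo "THINK_FILTER=$THINK_FILTER"
```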
Full Changelog: v0.15.15...v0.15.16
v0.15.15 - Multi-model Support
What's Changed
- Chatbot: Add multi-model support with LiteLLM proxy instructions by @jasonacox in #13
- Docker Compose Quickstart by @jasonacox in #14
0.15.15 - Docker Compose
- Quick start using Docker Compose for the Chatbot.
- Chatbot - Bug Fix: Remove token limit on response. The `MAXTOKENS` setting is used to prune content sent to the LLM; if not set, no pruning occurs.
- Chatbot - Added additional LiteLLM support via the environment settings `LITELLM_PROXY` and `LITELLM_KEY`. If set, these override the OpenAI API settings to use LiteLLM and remove `EXTRA_BODY` defaults that conflict with LiteLLM.
- LiteLLM - Added docker compose to start LiteLLM, PostgreSQL, and Chatbot.
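A minimal sketch of the new LiteLLM settings, assuming a LiteLLM proxy is already listening locally (the URL and key are placeholders for illustration, not values from these notes):

```shell
# Point the chatbot at a LiteLLM proxy instead of the OpenAI API directly.
# Both values below are placeholders; use your own proxy URL and key.
export LITELLM_PROXY="http://localhost:4000/v1"
export LITELLM_KEY="sk-example-key"
echo "Using proxy: $LITELLM_PROXY"
```

When both variables are set, the release notes indicate they take precedence over the OpenAI API settings.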
0.15.14 - Multi-model Support
- Chatbot - Add `/model` command to list available models and dynamically set the model during the session.
- LiteLLM - Added instructions for using the LiteLLM proxy to combine local LLMs, AWS Bedrock, OpenAI, and other LLM options.
Full Changelog: v0.15.13...v0.15.15
v0.15.13 - Chatbot Fix
0.15.13 - Resource Fix
- Chatbot - Add LLM connection closures for non-streaming ad-hoc calls (e.g. CoT calls). This has removed the resource warning as identified in Issue #12. Improved debug messages.
- Chatbot Documentation - Updated CoT prompts and added reasoning.md for additional prompt options.
Full Changelog: v0.15.12...v0.15.13
v0.15.12 - CoT Updates
- Chatbot - Update Chain of Thought (CoT) to check each request before routing it through the CoT process. Using `/think always` will force CoT for all requests. CoT prompts were also updated for better responses. Bug fixes and other minor improvements, including documentation.
- Chatbot updated docker image: jasonacox/chatbot:0.15.12
Full Changelog: v0.15.11...v0.15.12
0.15.11 - Chain of Thought
- Chatbot - Add a Chain of Thought (CoT) thinking option with `/think on` and `/think off` toggles in the UI. When activated, queries are passed through an out-of-band CoT loop that lets the LLM thoughtfully explore the answer and then provide a concluding summary to the user. Set the environment variable `THINKING` to `true` to default all conversations to CoT mode.
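As a sketch, defaulting every conversation to CoT mode could look like this before starting the chatbot (the export form is illustrative; in Docker you would pass the variable with `-e`):

```shell
# Default all conversations to Chain of Thought mode.
export THINKING=true
echo "THINKING=$THINKING"
```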
Full Changelog: v0.15.10...v0.15.11
v0.15.10 - Bug Fix
- Chatbot - Fix error-handling bug in the auto-detection of the LLM's maximum content length. Updated user input UI rendering to better handle indentation.
- News Bot Script - Added logic to verify news summary from LLM to help prevent hallucinations.
Full Changelog: v0.15.9...v0.15.10