# Settings

Complete reference for all BlazeTracker settings.

## Connection

| Setting | Type | Default | Description |
| --- | --- | --- | --- |
| Connection Profile | select | | API connection for extraction calls. See Setup. |
| Auto Extract | boolean | `true` | Automatically extract state from new messages. |

## Display

| Setting | Type | Default | Description |
| --- | --- | --- | --- |
| State Display Position | `'above' \| 'below'` | `'below'` | Show state block above or below message content. |
| Temperature Unit | `'fahrenheit' \| 'celsius'` | `'fahrenheit'` | Display unit for temperatures. |
| Time Format | `'12h' \| '24h'` | `'12h'` | Time display format. |
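As an illustration of the Temperature Unit setting, here is a minimal display-conversion sketch. It assumes temperatures are stored internally in Fahrenheit, which is an assumption for this example, not documented BlazeTracker behavior:

```typescript
// Illustrative only: convert a stored temperature to the user's chosen
// display unit. Storage-in-°F is an assumption made for this sketch.
function displayTemperature(fahrenheit: number, unit: 'fahrenheit' | 'celsius'): string {
  if (unit === 'celsius') {
    // Standard °F → °C conversion, rounded for display
    return `${Math.round((fahrenheit - 32) * 5 / 9)}°C`;
  }
  return `${Math.round(fahrenheit)}°F`;
}

// displayTemperature(72, 'fahrenheit') → '72°F'
// displayTemperature(212, 'celsius')   → '100°C'
```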

## Tracking

Enable or disable extraction modules. Disabling modules reduces LLM calls per message. See Track Dependencies for dependency rules.

| Module | Default | What It Tracks |
| --- | --- | --- |
| Time | `true` | Narrative date and time |
| Location | `true` | Area, place, position |
| Props | `true` | Nearby objects and items |
| Climate | `true` | Weather and temperature (procedural) |
| Characters | `true` | Positions, activities, moods, outfits |
| Relationships | `true` | Feelings, secrets, wants, status |
| Scene | `true` | Topic, tone, tension |
| Narrative | `true` | Events, milestones, chapters |
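The note about reducing LLM calls can be made concrete with a sketch. It assumes one extractor call per enabled module, with Climate excluded because the table marks it procedural; both are illustrative assumptions, not BlazeTracker's confirmed internals:

```typescript
// Hypothetical module-toggle shape mirroring the table above.
interface TrackingModules {
  time: boolean; location: boolean; props: boolean; climate: boolean;
  characters: boolean; relationships: boolean; scene: boolean; narrative: boolean;
}

// Sketch: count LLM extractor calls per message. Climate is skipped here
// because it is listed as procedural (no LLM call) — an assumption.
function llmCallsPerMessage(modules: TrackingModules): number {
  const llmModules: (keyof TrackingModules)[] =
    ['time', 'location', 'props', 'characters', 'relationships', 'scene', 'narrative'];
  return llmModules.filter((m) => modules[m]).length;
}
```

Under these assumptions, all defaults enabled means seven extractor calls per message; disabling Props drops that to six.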

## Message Limits

| Setting | Type | Default | Description |
| --- | --- | --- | --- |
| Max Messages to Send | number | `10` | Maximum recent messages included in extractor prompts. |
| Max Chapter Messages | number | `24` | Maximum messages sent to the chapter description extractor. |

## Injection

| Setting | Type | Default | Description |
| --- | --- | --- | --- |
| Auto Inject State | boolean | `true` | Automatically inject scene state (time, location, characters, etc.) into prompts. Disable for macro-only workflows. |
| Auto Inject Narrative | boolean | `true` | Automatically inject chapter summaries and events into prompts. Disable for macro-only workflows. |
| Injection Depth | number | `0` | Prompt injection depth (0 = near most recent messages). |
| Max Recent Chapters | number | `5` | Maximum past chapters in "Story So Far" injection. |
| Max Recent Events | number | `15` | Maximum out-of-context events from current chapter. |
| Token Budget | number | `0` | Token budget for injection (0 = use ST's context size). |
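The Token Budget fallback can be sketched in a couple of lines (the function name is hypothetical, not part of BlazeTracker's code):

```typescript
// Sketch of the fallback described above: a Token Budget of 0 means
// "use SillyTavern's context size" rather than a literal zero budget.
function effectiveTokenBudget(tokenBudget: number, stContextSize: number): number {
  return tokenBudget > 0 ? tokenBudget : stContextSize;
}

// effectiveTokenBudget(0, 8192)    → 8192 (falls back to context size)
// effectiveTokenBudget(2000, 8192) → 2000 (explicit budget wins)
```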

## Scene Shakeups

| Setting | Type | Default | Description |
| --- | --- | --- | --- |
| Enable Scene Shakeups | boolean | `false` | Inject random context-appropriate events into prompts. See Scene Shakeups guide. |
| Max Messages Between Shakeups | number | `20` | Messages at which trigger probability reaches 100%. Range: 5–100. |
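The table says trigger probability reaches 100% at the configured message count but does not specify the ramp shape. A linear ramp is one plausible reading, sketched below as an assumption rather than BlazeTracker's confirmed behavior:

```typescript
// Assumed linear ramp: probability grows with each message since the
// last shakeup and saturates at 100% at maxMessages (default 20).
function shakeupProbability(messagesSinceLast: number, maxMessages: number = 20): number {
  if (maxMessages <= 0) return 0;
  return Math.min(1, messagesSinceLast / maxMessages);
}

// shakeupProbability(10, 20) → 0.5 (halfway through the window)
// shakeupProbability(25, 20) → 1   (past the cap, guaranteed)
```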

## Advanced

### LLM Settings

| Setting | Type | Default | Description |
| --- | --- | --- | --- |
| Max Tokens | number | `500` | Maximum tokens for LLM responses. |
| Max Requests/Min | number | `0` | Rate limit for LLM requests (0 = no limit). |
| Include World Info | boolean | `false` | Include lorebook entries in extractor prompts. |
| Prompt Prefix | string | `''` | Prepended to user template of all prompts (e.g., `/nothink`). |
| Prompt Suffix | string | `''` | Appended to user template of all prompts. |
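To illustrate how Prompt Prefix and Prompt Suffix wrap a user template, here is a sketch; `buildUserPrompt` and its joining behavior are illustrative, not BlazeTracker's actual code:

```typescript
// Hypothetical affix shape matching the two settings above.
interface PromptAffixes { promptPrefix: string; promptSuffix: string; }

// Sketch: wrap the user template with the configured prefix/suffix,
// skipping empty affixes so no stray blank lines are introduced.
function buildUserPrompt(template: string, affixes: PromptAffixes): string {
  return [affixes.promptPrefix, template, affixes.promptSuffix]
    .filter((part) => part.length > 0)
    .join('\n');
}

// buildUserPrompt('Extract the current time.', { promptPrefix: '/nothink', promptSuffix: '' })
// → '/nothink\nExtract the current time.'
```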

### Category Temperatures

Default LLM temperatures per extraction category. Individual prompts can override these.

| Category | Default | Notes |
| --- | --- | --- |
| Time | 0.3 | Low for deterministic time parsing |
| Location | 0.5 | Moderate |
| Props | 0.5 | Moderate |
| Climate | 0.3 | Low for consistent weather classification |
| Characters | 0.5 | Moderate |
| Relationships | 0.6 | Slightly higher for nuanced feelings |
| Scene | 0.5 | Moderate |
| Narrative | 0.6 | Slightly higher for creative summaries |
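The override order (per-prompt temperature beats category default) can be sketched as follows; `resolveTemperature` is a hypothetical helper, not the extension's actual code:

```typescript
// Category defaults copied from the table above.
const categoryTemperatures: Record<string, number> = {
  time: 0.3, location: 0.5, props: 0.5, climate: 0.3,
  characters: 0.5, relationships: 0.6, scene: 0.5, narrative: 0.6,
};

// A prompt-level temperature, when set, overrides its category default.
// The 0.5 terminal fallback is an assumption for unknown categories.
function resolveTemperature(category: string, promptOverride?: number): number {
  return promptOverride ?? categoryTemperatures[category] ?? 0.5;
}

// resolveTemperature('time')      → 0.3 (category default)
// resolveTemperature('time', 0.8) → 0.8 (prompt override wins)
```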

### Debug

| Setting | Type | Default | Description |
| --- | --- | --- | --- |
| Debug Logging | boolean | `false` | Log debug information to browser console. |

## Custom Prompts

Each extraction prompt can be overridden individually. For each prompt:

| Setting | Description |
| --- | --- |
| Temperature | LLM temperature for this specific prompt (overrides category default) |
| System Prompt | Static instructions (cacheable) |
| User Template | Dynamic content with placeholders |
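For illustration, one possible shape for a per-prompt override matching the three fields above; the interface, field names, and the `{{messages}}` placeholder are all hypothetical, not BlazeTracker's actual schema:

```typescript
// Hypothetical per-prompt override shape (names are illustrative).
interface PromptOverride {
  temperature?: number;   // overrides the category default when set
  systemPrompt: string;   // static instructions (cacheable)
  userTemplate: string;   // dynamic content with placeholders
}

const timePromptOverride: PromptOverride = {
  temperature: 0.2,
  systemPrompt: 'You extract the narrative date and time from roleplay messages.',
  // '{{messages}}' is a made-up placeholder name for this sketch.
  userTemplate: 'Messages:\n{{messages}}\n\nReturn the current in-story time.',
};
```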

See Custom Prompts guide for details on customization.