株式会社オブライト
AI · 2026-05-17

Temperature

Also known as: Temperature / temperature parameter / sampling temperature

A hyperparameter that controls randomness in LLM text generation. Values near 0 produce deterministic, consistent outputs; higher values yield more diverse and creative responses.


Overview

Temperature scales the logits fed into the softmax function, shaping the next-token probability distribution. At 0, the highest-probability token is always chosen (greedy decoding). At 1, sampling follows the trained distribution. Above 1, low-probability tokens become more likely, yielding more varied and creative outputs.
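The scaling described above can be sketched in a few lines: divide the logits by the temperature before the softmax, and treat a temperature of 0 as greedy decoding (argmax), since dividing by zero is undefined. This is an illustrative stand-alone implementation, not the code of any particular inference library.

```python
import math

def softmax_with_temperature(logits, temperature):
    """Scale logits by 1/temperature, then apply softmax.

    temperature == 0 is handled as greedy decoding: all probability
    mass goes to the highest-logit token.
    """
    if temperature == 0:
        probs = [0.0] * len(logits)
        probs[logits.index(max(logits))] = 1.0
        return probs
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.1]
print(softmax_with_temperature(logits, 0.5))  # sharper: top token dominates
print(softmax_with_temperature(logits, 2.0))  # flatter: closer to uniform
```

Lowering the temperature sharpens the distribution (the top token's probability grows), raising it flattens the distribution toward uniform, which is exactly the deterministic-vs-creative trade-off described above.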

Practical guidelines

Use 0-0.2 for accuracy-critical tasks (SQL generation, data transformation), 0.5-0.7 for natural conversation, and 0.8-1.2 for creative writing and brainstorming. Temperature is typically combined with Top-p or Top-k for fine-grained control.
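As a rough sketch of how temperature and Top-p combine, the following applies temperature scaling first and then nucleus (Top-p) filtering: keep the smallest set of tokens whose cumulative probability reaches `top_p`, renormalize over that set, and sample. The function name and default values are illustrative assumptions, not the API of any specific library.

```python
import math
import random

def sample_top_p(logits, temperature=0.7, top_p=0.9):
    """Temperature-scaled softmax followed by nucleus (Top-p) sampling.

    Returns the index of the sampled token. Illustrative sketch only;
    production inference stacks do the same steps on tensors.
    """
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Sort tokens by probability; keep the smallest set whose mass >= top_p.
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    kept, mass = [], 0.0
    for i in order:
        kept.append(i)
        mass += probs[i]
        if mass >= top_p:
            break
    # Sample from the kept set, renormalized by its total mass.
    kept_mass = sum(probs[i] for i in kept)
    r = random.random() * kept_mass
    for i in kept:
        r -= probs[i]
        if r <= 0:
            return i
    return kept[-1]
```

A very small `top_p` collapses the kept set to the single most likely token, while `top_p = 1.0` leaves the temperature-scaled distribution untouched; tuning the two together gives the fine-grained control mentioned above.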
