Units of temperature
Definition and Fundamental Basis
Temperature is the quantitative expression of how hot or cold a system is; in kinetic theory it corresponds to the average translational kinetic energy of the particles that constitute the system. Because energy flows spontaneously from regions of higher temperature to regions of lower temperature, temperature also dictates the direction of heat transfer described by the second law of thermodynamics. In modern metrology the SI base unit is the kelvin (K), established by fixing the Boltzmann constant kB = 1.380 649 × 10⁻²³ J K⁻¹, a choice that ties temperature directly to energy rather than to any particular material reference such as the former triple-point-of-water definition.
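The energy link can be made concrete with kinetic theory's result that a particle's mean translational kinetic energy is E = (3/2)·kB·T. A minimal sketch, using the exact SI value of the Boltzmann constant (the function name is ours):

```python
K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def mean_translational_energy(temp_kelvin: float) -> float:
    """Mean translational kinetic energy, in joules, at temperature T (kelvin)."""
    return 1.5 * K_B * temp_kelvin

# At roughly room temperature (300 K) each particle carries ~6.2e-21 J.
print(f"{mean_translational_energy(300.0):.3e} J")
```

Doubling the absolute temperature doubles this energy, which is exactly why absolute (kelvin) temperatures, not Celsius readings, appear in such formulas.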
Practical Scales and Conversions
For day-to-day use, linear offset scales derived from the kelvin are more familiar. The Celsius scale pins 0 °C to the freezing point of water and 100 °C to its boiling point at 1 atm; the Fahrenheit scale places those landmarks at 32 °F and 212 °F respectively. Rankine (°R) is an absolute version of Fahrenheit useful in certain U.S. engineering calculations, while Planck and Réaumur appear primarily in historical or very specialized literature. Because all of these scales differ only by constant factors or offsets, they share identical temperature intervals; a change of 1 K equals a change of 1 °C and 1.8 °F.
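Because the scales differ only by offsets and a constant factor of 1.8, the conversions reduce to one-line helpers. An illustrative sketch, not a library API:

```python
# Temperature-scale conversions. All are linear: an offset (Celsius,
# Fahrenheit) and/or the constant interval ratio 1 K = 1.8 degrees F.

def celsius_to_kelvin(c: float) -> float:
    return c + 273.15

def celsius_to_fahrenheit(c: float) -> float:
    return c * 9.0 / 5.0 + 32.0

def kelvin_to_rankine(k: float) -> float:
    return k * 1.8  # Rankine is the absolute version of Fahrenheit

print(celsius_to_kelvin(0.0))                 # 273.15 (water freezes)
print(celsius_to_fahrenheit(100.0))           # 212.0 (water boils)
print(round(kelvin_to_rankine(273.15), 2))    # 491.67
```

Note that an interval converts without the offset: a 10 °C drop is a 10 K drop, but an 18 °F drop.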
Thermodynamic Role
Temperature connects macroscopic observables through the ideal-gas equation of state PV = nRT. It also appears in the entropy differential dS = dQ_rev/T, a relationship that sets the theoretical Carnot efficiency ceiling for any heat engine. As a result, controlling temperature lies at the heart of power-plant design, cryogenic liquefaction, and chemical reaction engineering. Quantities such as viscosity, semiconductor carrier mobility, and chemical reaction rate typically follow Arrhenius or similar exponential temperature dependencies, a fact that modern process control exploits to achieve consistent product quality.
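That Carnot ceiling follows directly from the entropy relation: η = 1 − T_cold/T_hot, with both temperatures absolute. A minimal sketch (function name ours):

```python
def carnot_efficiency(t_hot_k: float, t_cold_k: float) -> float:
    """Maximum fraction of input heat convertible to work; temperatures in kelvin."""
    if not 0.0 < t_cold_k < t_hot_k:
        raise ValueError("require 0 < T_cold < T_hot, both in kelvin")
    return 1.0 - t_cold_k / t_hot_k

# An engine running between 800 K and 300 K can never exceed
# 62.5% efficiency, however clever the engineering.
print(carnot_efficiency(800.0, 300.0))  # 0.625
```

The formula also shows why Celsius values must not be plugged in directly: the ratio T_cold/T_hot only makes sense on an absolute scale.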
Instrumentation from Millikelvin to Thousands of Kelvin
Engineers choose temperature sensors according to range, accuracy, and robustness.
• Thermocouples generate a Seebeck voltage and read reliably from −200 °C to roughly 2300 °C.
• Platinum resistance thermometers (RTDs) dominate precision industrial metrology to ±0.01 K.
• NTC thermistors provide high sensitivity for medical probes and battery packs.
• CMOS band-gap circuits form the core of on-chip temperature monitors in microprocessors.
• Infrared pyrometers deliver non-contact readings of molten metals, flame fronts, or satellites, extending toward 3000 K.
At the opposite extreme, dilution refrigerators using helium-3/helium-4 mixtures cool nanoelectronic devices and quantum computers below 10 mK, a prerequisite for observing the fractional quantum Hall effect and maintaining superconducting qubit coherence.
Planetary, Cosmic, and Human Contexts
Beyond laboratories, temperature informs disciplines from astrophysics to architecture. Astronomers quote the effective surface temperature of the Sun as about 5778 K and map stellar life cycles through Hertzsprung–Russell diagrams keyed to temperature. Cosmologists have measured the cosmic microwave background at 2.725 K, evidence of the Universe’s hot Big Bang origin. Closer to home, climatologists track global mean surface temperature anomalies relative to an 1850–1900 baseline to quantify anthropogenic warming. Building engineers synthesize air temperature, humidity, and radiant exchange into operative temperature indices that guide HVAC sizing for human comfort. In the kitchen, the same science underpins sous-vide cookery, where food is held within ±0.1 K to ensure pasteurization without overcooking.
Everyday Significance
Whether enabling the safe launch of a cryogenic rocket stage, calibrating a medical incubator, predicting road-surface icing, or fine-tuning an espresso machine’s flavor extraction, accurate temperature measurement remains one of the most universally applied tasks in science and technology.

kelvin
From Celsius to Kelvin: A Short History
In 1848, the prodigiously curious Scottish-Irish physicist William Thomson—later known as Lord Kelvin—set out to create a temperature scale that would not merely describe how hot or cold something felt, but would instead map directly onto the very motion of atoms. He imagined a baseline at which molecular jiggles cease altogether, what we now call absolute zero. Kelvin’s insight gave science its first truly universal yardstick for warmth, unshackled from the quirks of water freezing at sea level or mercury expanding inside a glass tube.
The Modern Definition
Jump ahead to 2019. Rather than anchoring the kelvin to a particular substance’s freezing or boiling point, metrologists locked it to a constant of nature: the Boltzmann constant. By declaring the constant exactly 1.380 649 × 10⁻²³ joules per kelvin, the kelvin itself became immutably tied to the average kinetic energy of particles. In other words, temperature is now defined by energy, not by a thermometer. This switch means that even if we travel to Mars or build a lab on a drifting asteroid, the kelvin remains exactly the same.
Why Scientists Prefer Kelvin
Ask a chemist why a reaction rate doubles, or an astrophysicist why a nebula glows, and the answer nearly always rests on temperature measured in kelvin. The gas laws, PV = nRT among them, only work with the tabulated gas constant when T is absolute, that is, in kelvin. Plasma physicists plotting the ten-million-kelvin interior of the Sun, and cryogenic engineers coaxing superconductors to a whisper above 1 K, both rely on the same scale. No negative numbers clutter the math, and absolute zero, 0 K or −273.15 °C, provides a natural lower limit grounded in physics, not human convenience.
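The kelvin requirement in PV = nRT is easy to demonstrate: with the CODATA molar gas constant, one mole of ideal gas at 0 °C and 1 atm comes out near the familiar 22.4 litres, but only if T is entered as 273.15 K rather than 0. An illustrative sketch (names are ours):

```python
R = 8.314462618  # molar gas constant, J/(mol·K), CODATA value

def molar_volume_m3(temp_k: float, pressure_pa: float) -> float:
    """Volume of one mole of ideal gas, from V = RT/P (T in kelvin, P in pascals)."""
    return R * temp_k / pressure_pa

# One mole at 0 degrees C (273.15 K) and 1 atm (101325 Pa):
litres = molar_volume_m3(273.15, 101325.0) * 1000.0
print(f"{litres:.2f} L")  # 22.41 L
```

Entering 0 for T would predict zero volume, a nonsense answer that the absolute scale avoids by construction.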
Everyday Touchstones
If 0 K feels abstract, pin it to your senses. A brisk winter day at –10 °C is 263 K. Room temperature, a comfortable 20 °C, is 293 K. The surface of a pizza oven glows red-orange around 1000 K, and molten lava rolls out near 1400 K. Slide farther up the dial and you reach the photosphere of our Sun at roughly 5800 K—a figure that astrophysicists deduced long before anyone ever thought of flying to space.
Trivia to Share at Your Next Dinner Party
• The coldest spot in the known universe isn’t deep space; it’s inside terrestrial labs where physicists coax atoms into Bose-Einstein condensates mere billionths of a kelvin above absolute zero.
• Lord Kelvin himself estimated the Sun’s age at a few million years—spectacularly wrong, yet his name lives on attached to the thermodynamic scale that helped reveal the Sun’s true, nuclear-powered vigor.
• Because kelvin is an SI base unit, you drop the degree sign. It’s “kelvin,” not “degrees Kelvin”—a tiny linguistic tweak that flags you as someone in the scientific know.
From chemistry benches to cosmic observatories, the kelvin quietly underpins our understanding of heat, energy, and the restless dance of atoms—proof that sometimes the most profound revolutions begin with a single, elegantly defined unit.
degree Celsius
From Swedish Laboratories to Global Thermometers
When Anders Celsius sketched his novel temperature scale in 1742 in Uppsala, Sweden, he was after simplicity. At the time, the Fahrenheit scale and a handful of local “degrees” floated around Europe, each inconveniently anchored to quirky reference points like “the temperature of a healthy man’s armpit.” Celsius chose water instead—nature’s most familiar liquid—and assigned two crisp benchmarks: 0 °C for the temperature at which pure water freezes and 100 °C for the point at which it boils, both at standard atmospheric pressure. (Curiously, his original drawing placed 0 at boiling and 100 at freezing; the scale was inverted shortly after his death, a reversal often credited to his colleague Carl Linnaeus. The rest, as they say, is climate-controlled history.)
A Thermometer’s Universal Dial
Scientifically, the Celsius degree is tied directly to the kelvin, the official SI unit of temperature. The conversion is refreshingly simple: just add or subtract 273.15. This linear relationship means that a change of 1 °C is exactly the same “amount of warmth” as a change of 1 K, a fact chemists exploit when they shift back and forth between everyday lab notes and formal thermodynamic calculations. Meteorologists praise the scale, too: when a cold front drags the afternoon high from 18 °C down to 8 °C, the math is intuitive—no need to decode a three-digit number, as you would on the kelvin scale.
Everyday Touchpoints
• Refrigerators are set around 4 °C, the sweet spot that slows bacterial growth while keeping lettuce crisp.
• A balmy beach day in the Mediterranean? Roughly 28 °C.
• Human body temperature hovers near 37 °C; if your forehead thermometer reads 38 °C, grab the aspirin.
• Bakers know yeast wakes up close to 30 °C, turns hyperactive at 38 °C, and throws a tantrum (dies) near 60 °C.
Industry, Research, and the Cosmos
Celsius reigns in medicine—for dosing IV fluids or storing vaccines at precisely 2–8 °C—and in engineering, where steel’s mechanical properties are charted against temperature in degrees Celsius. Climate scientists publish global warming targets in Celsius, because “limiting warming to 1.5 °C” is instantly relatable: it’s the difference between a light jacket and rolled-up sleeves.
Nuggets of Trivia
• The Guinness record for Earth’s coldest natural temperature, −89.2 °C in Antarctica’s Vostok Station, is only about 184 kelvins above absolute zero.
• In Canada and the U.K., ovens still display temperatures in Celsius, but many recipe books nostalgically list Fahrenheit—culinary bilingualism at 180 °C (350 °F).
• Mars rovers report afternoon highs around −20 °C; pack a scarf.
The Takeaway
Whether you’re calibrating a vaccine freezer, debating climate policy, or simply deciding if you need mittens, the degree Celsius offers a clear, water-anchored yardstick for the invisible energy of heat. It translates the abstract world of thermodynamics into numbers our skin—and our planet—can feel.