The Ca II H and K lines are among the few features available to infer the metallicity of extremely metal-poor stars from medium-resolution spectroscopy. Unfortunately, these lines can overlap with absorption produced in the intervening interstellar medium, introducing systematic errors in the derived metallicities. The strength of the Ca II infrared triplet lines can also be measured at extremely low metallicities, and it is not affected by interstellar absorption, but it suffers significant departures from local thermodynamic equilibrium (LTE). We investigate the feasibility of adopting the Ca II infrared triplet as a metallicity indicator in extremely metal-poor stars using state-of-the-art non-LTE models that include the most recent atomic data. We find that the triplet lines exhibit non-LTE abundance corrections that can exceed 0.5 dex. When interstellar absorption affecting the Ca II resonance lines is accounted for using high-resolution observations, the non-LTE abundances from the triplet agree excellently with those from the resonance lines, which themselves show only minor departures from LTE. Non-LTE effects strengthen the Ca II IR triplet lines relative to LTE estimates, facilitating measurements at very low metallicities, down to [Fe/H] = -6.0. This result has important implications for the discovery of primitive stars in our Galaxy and others, since instruments are most sensitive at red/near-infrared wavelengths, and tens of millions of spectra covering the Ca II IR triplet will soon become available from the Gaia mission and the DESI, WEAVE, and PFS surveys.
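
Because abundances such as [Fe/H] are logarithmic, a non-LTE correction quoted in dex translates into a multiplicative factor on the inferred number density ratio. A minimal sketch of this arithmetic (the function names are illustrative, not from the paper):

```python
# Sketch: interpreting an abundance correction given in dex.
# Abundances are logarithmic (base 10), so adding 0.5 dex to an LTE
# abundance multiplies the implied linear abundance by 10**0.5 ~ 3.16.

def apply_correction(a_lte: float, delta_nlte: float) -> float:
    """Return the corrected abundance: A_NLTE = A_LTE + Delta (both in dex)."""
    return a_lte + delta_nlte

def linear_factor(delta_dex: float) -> float:
    """Multiplicative factor on the linear abundance for a dex correction."""
    return 10.0 ** delta_dex

# A 0.5 dex correction (the size found for the triplet lines) is a
# factor of ~3.2 in linear abundance.
a_nlte = apply_correction(a_lte=-5.5, delta_nlte=0.5)   # -5.0 dex
factor = linear_factor(0.5)                             # ~3.16
```

This illustrates why corrections of this size matter: half a dex is not a small perturbation but roughly a factor of three in the derived calcium abundance.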