They do have degree-Celsius (U+2103) and degree-Fahrenheit (U+2109) symbols. So maybe the Kelvin sign is included for consistency, and it just happens to look identical to Latin K?
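A quick check with Python's unicodedata module (just an illustration, not anything from the thread) shows all three as distinct codepoints:

```python
import unicodedata

# The three temperature-related codepoints in question:
for ch in ["\u2103", "\u2109", "\u212A"]:
    print(f"U+{ord(ch):04X} {unicodedata.name(ch)} -> {ch!r}")
# U+2103 DEGREE CELSIUS -> '℃'
# U+2109 DEGREE FAHRENHEIT -> '℉'
# U+212A KELVIN SIGN -> 'K'  (visually identical to Latin K)
```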
IMO the confusing bit is giving it a lowercase mapping. It is a symbol that happens to look like an upper-case letter, not an actual letter…
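You can see the oddity in any language that implements Unicode case mappings; a minimal Python sketch:

```python
# KELVIN SIGN (U+212A) carries a lowercase mapping to ordinary Latin k,
# even though it is nominally a unit symbol rather than a letter.
kelvin = "\u212A"
print(kelvin.lower())            # 'k'  (plain U+006B LATIN SMALL LETTER K)
print(kelvin == "K")             # False -- a distinct codepoint from Latin K
print(kelvin.casefold() == "k")  # True -- case folding also collapses it
```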
Unicode wants to preserve round-trip re-encoding with this other standard, which has separate letter-K and degree-K characters. Making these small sacrifices for compatibility is how Unicode became the de facto world standard.
IMO this is a case where, if we were really doing this, we might as well go all the way and double-check the relevant standards. The filesystem accepts some character set for use as names, and if we're generating a name inside our program we should pick a character set that fits inside what the filesystem expects and still captures what we want to express. My gut says an upper-case Latin K would be the most portable way to represent Kelvin in a filename on a reasonably modern consumer filesystem; a sketch of that idea follows.
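Here's a rough sketch of that idea, assuming Python and treating NFKC-then-ASCII-allowlist as one conservative choice (the function name and allowlist are mine, not from any standard):

```python
import unicodedata

def portable_name(s: str) -> str:
    """Map a label to something safe for a filename.

    NFKC normalization collapses compatibility characters such as
    KELVIN SIGN (U+212A) into their plain Latin equivalents ('K');
    we then keep only an ASCII allowlist -- a deliberately
    conservative choice for maximum filesystem portability.
    """
    normalized = unicodedata.normalize("NFKC", s)
    allowed = set("abcdefghijklmnopqrstuvwxyz"
                  "ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789._-")
    return "".join(c if c in allowed else "_" for c in normalized)

print(portable_name("temp_300\u212A.csv"))  # temp_300K.csv
```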