The point of HDR is to capture and display a scene whose brightness range exceeds the available dynamic range (of the sensor, the display, or both).
If you have a sensor that can record more than sixteen bits per channel per pixel and more than five stops of brightness range, you can record "HDR" images without any software. But you'll still need the usual exposure blending to display detail in the extreme highlight and shadow areas simultaneously, unless your display can reproduce the same number of shades.
In practice, 8 bits per channel is generally considered "enough" for display, because your eye can't reliably distinguish more than about 256 brightness steps between complete black and complete saturation of a given channel. The real limitation is the range of about five stops (the brightest tone receiving 32x the light of the darkest). Negative film (especially the last generation from the 1990s and early 2000s) could record detail over a range of seven to eight stops (128x to 256x), though fitting that into the ~five stops of dynamic range of printing papers generally required darkroom manipulation.
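The arithmetic behind those figures is just powers of two: each stop doubles the light, and each bit doubles the number of recordable levels. A quick sketch (the function names are my own, purely for illustration):

```python
# Each stop of dynamic range doubles the light; each bit per channel
# doubles the number of distinct brightness levels.

def stops_to_contrast_ratio(stops):
    """A range of N stops means the brightest tone gets 2**N times
    the light of the darkest."""
    return 2 ** stops

def bits_to_levels(bits):
    """N bits per channel can encode 2**N distinct brightness levels."""
    return 2 ** bits

print(stops_to_contrast_ratio(5))   # 5 stops  -> 32 (32:1 contrast)
print(stops_to_contrast_ratio(8))   # 8 stops  -> 256 (film's upper end)
print(bits_to_levels(8))            # 8 bits   -> 256 levels per channel
```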
So, in order to record HDR (more than five stops of range) in a single shot, you'd need a sensor with a wider range than the common ones -- and that's going to be expensive. Astronomical sensors may have this capability (I'm not sure; I have no firsthand experience with those). Or you can stick with the current method of exposure blending: either automatically in software inside the camera (which takes away your control), or by bracketing exposures manually or automatically and blending them to preserve detail, ideally without the cartoonish look of some of the earlier HDR images.
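To make the blending step concrete, here is a minimal sketch of one common approach: weight each bracketed frame's contribution per pixel by how close that pixel is to mid-gray, so well-exposed detail from each frame dominates. The function and weighting scheme are my own illustration, not any particular camera's or tool's algorithm, and values are grayscale floats in 0..1 for simplicity:

```python
# Exposure blending sketch (illustrative, not a specific tool's method):
# each bracketed exposure contributes per pixel according to how close
# that pixel is to mid-gray, so clipped shadows/highlights get low weight.

def blend_exposures(exposures):
    """exposures: list of grayscale images (lists of rows of floats, 0..1).
    Returns one image; each pixel is a weighted average favouring
    mid-tone (well-exposed) values from each frame."""
    def weight(v):
        # Highest weight at mid-gray (0.5), near zero at the clipped ends.
        return max(1e-6, 1.0 - abs(v - 0.5) * 2.0)

    rows, cols = len(exposures[0]), len(exposures[0][0])
    out = []
    for r in range(rows):
        row = []
        for c in range(cols):
            total_w = sum(weight(img[r][c]) for img in exposures)
            row.append(sum(weight(img[r][c]) * img[r][c]
                           for img in exposures) / total_w)
        out.append(row)
    return out

# Two bracketed "frames" of a tiny 1x2 image: one exposed for the
# shadows, one for the highlights.
dark  = [[0.02, 0.45]]   # underexposed frame: keeps highlight detail
light = [[0.40, 0.98]]   # overexposed frame: keeps shadow detail
blended = blend_exposures([dark, light])
```

Each output pixel lands near the better-exposed frame's value: the shadow pixel is pulled toward 0.40 (from the brighter frame) and the highlight pixel toward 0.45 (from the darker frame), which is exactly the "preserve detail at both ends" goal described above.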