For example, a DSLR sensor is not all that different from most other camera sensors. The main difference is what is done on the sensor itself versus what is broken out for external access.
I’m certainly no expert here, but I tried building an astrophotography setup old-school style with some old webcams. None of the sensors I had available broke out the features I needed. I could have done some external image stacking, but the compressed output from the module introduced a lot of errors. I basically learned that I need to pick a sensor based on the features exposed by its Linux kernel driver to do what I wanted, and that randomly chosen cheap webcams don't offer much low-level access.
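For what it's worth, here is a rough sketch of the kind of external stacking I was attempting, assuming a UVC webcam that OpenCV can open (the device index and frame count are just placeholders, not a recipe):

```python
import cv2
import numpy as np

# Open the webcam (device index 0 is an assumption; adjust for your setup).
cap = cv2.VideoCapture(0)
if not cap.isOpened():
    raise RuntimeError("could not open the camera")

num_frames = 100          # how many frames to stack
accumulator = None

for _ in range(num_frames):
    ok, frame = cap.read()
    if not ok:
        break
    # Accumulate in float to avoid 8-bit clipping while summing.
    frame = frame.astype(np.float64)
    accumulator = frame if accumulator is None else accumulator + frame

cap.release()

# Average the stack: with N frames, random noise drops roughly by sqrt(N),
# but any compression artifacts baked into each frame get stacked right in too,
# which is exactly the problem I ran into with these modules.
stacked = (accumulator / num_frames).clip(0, 255).astype(np.uint8)
cv2.imwrite("stacked.png", stacked)
```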
From the hardware side, the sensor produces a ton of data that can be challenging to move and process quickly enough. The frequencies involved are quite high, which makes the circuit design challenging too. It is easier to drop data from the stream early and output a much smaller final product, like a finished image. At least, that was my experience as a maker who was mostly playing in a space that was over my head with a project like this.
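To put a number on "a ton of data", here is a back-of-the-envelope calculation (the 12 MP / 12-bit / 30 fps figures are an illustrative assumption, not any particular sensor):

```python
# Rough raw-data-rate estimate for a hypothetical sensor.
width, height = 4000, 3000      # ~12 MP, assumed
bits_per_pixel = 12             # raw Bayer bit depth, assumed
frames_per_second = 30

bytes_per_frame = width * height * bits_per_pixel / 8
raw_rate = bytes_per_frame * frames_per_second

print(f"{bytes_per_frame / 1e6:.0f} MB per raw frame")
print(f"{raw_rate / 1e6:.0f} MB/s of raw sensor data")
# Roughly 18 MB per frame and ~540 MB/s, far beyond what USB 2.0 can carry
# (~35 MB/s in practice), which is why webcam modules debayer, scale, and
# compress on-board and only hand you the much smaller finished image.
```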