Determining exposure time for a star with a given magnitude.

Affiliation
American Association of Variable Star Observers (AAVSO)
Fri, 05/07/2021 - 18:52

I was wondering if anyone out there has some sort of algorithm available that will calculate the exposure time for a star, given its magnitude and camera/telescope data, that will result in an unsaturated image. Basically I would like to set up an automated process that would take pictures of variable stars. This process allows for the entry of the exposure time. It would be nice if I could calculate this value and plug it into the process automatically.

I'd love to hear any ideas.

 

Thanks

 Nor

Affiliation
American Association of Variable Star Observers (AAVSO)
One has to be a bit careful…

One has to be a bit careful not to forget factors other than exposure time that play a role here: sky background brightness (moon!) but mostly the focus (spreading the star image over more pixels helps push saturation to brighter magnitudes). Needless to say, the camera gain and offset also play a role.

Having said that, if all of these factors are kept constant, then for a given telescope and camera with fixed gain and offset you can calculate that time for any magnitude using a simple scaling law:

If you find on a test image that (under fixed conditions as described above), at a given exposure time t, stars are almost saturated at magnitude M, then for a star of magnitude M' you can expect saturation at an exposure time of

t' = t * 2.51^(M'-M)

(well, actually 2.51 here is an approximation of the fifth root of 100, i.e. 100^(1/5) ≈ 2.512, but we are dealing with a rule of thumb here anyway, so this is good enough)
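
In case it helps with automating this, here is a minimal Python sketch of that scaling law. The function name saturation_exposure, its parameters, and the example numbers are just illustrative (not from any particular AAVSO tool); you would supply the reference exposure time and magnitude from your own test image:

def saturation_exposure(t_ref, m_ref, m_target):
    """Scale a reference exposure time to a target magnitude.

    t_ref    -- exposure time (seconds) at which a star of magnitude m_ref
                is just below saturation on a test image
    m_ref    -- magnitude of that reference star
    m_target -- magnitude of the star you want to expose

    Uses the rule-of-thumb scaling t' = t * 100**((M' - M) / 5),
    i.e. a factor of about 2.512 per magnitude.
    """
    return t_ref * 100 ** ((m_target - m_ref) / 5)

# Example: if a magnitude 9.0 star nearly saturates in 10 s, a magnitude
# 12.0 star should tolerate roughly 10 * 2.512**3, or about 158 s.
print(saturation_exposure(10.0, 9.0, 12.0))

In practice you would probably cap the result at some maximum exposure and redo the reference measurement whenever the focus, gain/offset, or sky conditions change.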

Cheers

HB