We've already described the absorption/scattering effects of Earth's
atmosphere: light passing through airmass X is dimmed, so the observed
magnitude of a star is

    m_obs = m_0 + k X

where m_0 is the magnitude the star would have above the atmosphere and k is the extinction coefficient. The extinction coefficient can be determined by making multiple observations of a star at different airmasses; you can then solve for m_0 and k using least squares. Note that you need to sample a good range of airmasses to get good leverage on the fit, and you must bracket the airmasses of all of your program objects.
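The least-squares fit for m_0 and k is just a straight-line fit of observed magnitude against airmass. A minimal sketch with made-up numbers (the airmasses and magnitudes below are illustrative, not real data):

```python
import numpy as np

# Hypothetical observations of one star at several airmasses.
# The airmasses should span (and bracket) those of your program objects.
X = np.array([1.0, 1.3, 1.7, 2.2])               # airmasses
m_obs = np.array([12.31, 12.37, 12.45, 12.55])   # observed magnitudes

# Fit m_obs = m_0 + k * X by least squares.
# polyfit returns coefficients highest-degree first: [slope, intercept].
k, m0 = np.polyfit(X, m_obs, 1)
print(f"extinction coefficient k = {k:.3f} mag/airmass")
print(f"above-atmosphere magnitude m_0 = {m0:.3f}")
```

The slope of the fit is the extinction coefficient k (in magnitudes per airmass) and the intercept is the above-atmosphere magnitude m_0, i.e. the brightness the star would have at airmass zero.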
When doing broad-band work, however, there's an additional subtlety because of the wavelength dependence of extinction. Two stars can have different effective extinction coefficients in the same filter if they have very different colors, because each star's flux is weighted differently across a broad bandpass. This problem can be handled with second-order extinction coefficients, where the extinction is a function not only of the airmass but also of the stellar color C:

    m_obs = m_0 + (k' + k'' C) X

where k' is the first-order (color-independent) coefficient and k'' is the second-order (color-dependent) coefficient.
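With observations of stars of different colors at several airmasses, the first- and second-order coefficients (written k1 and k2 in the code) and the per-star magnitudes can all be solved for in one linear least-squares problem. A minimal sketch with invented data for two stars:

```python
import numpy as np

# Hypothetical data: two stars of different colors, each observed at
# three airmasses. All numbers are illustrative.
star = np.array([0, 0, 0, 1, 1, 1])              # which star
X    = np.array([1.0, 1.5, 2.0, 1.1, 1.6, 2.1])  # airmasses
C    = np.array([0.3, 0.3, 0.3, 1.2, 1.2, 1.2])  # colors (e.g. B-V)
m    = np.array([11.238, 11.357, 11.476,
                 13.7222, 13.8232, 13.9242])     # observed magnitudes

# Model: m = m_0[star] + (k1 + k2*C) * X
# Unknowns: m0_A, m0_B, k1 (first-order), k2 (second-order).
A = np.column_stack([
    (star == 0).astype(float),  # coefficient of m0_A
    (star == 1).astype(float),  # coefficient of m0_B
    X,                          # coefficient of k1
    C * X,                      # coefficient of k2
])
params, *_ = np.linalg.lstsq(A, m, rcond=None)
m0A, m0B, k1, k2 = params
print(f"m0_A = {m0A:.3f}, m0_B = {m0B:.3f}")
print(f"k' = {k1:.3f}, k'' = {k2:.3f}")
```

Each additional star adds one more unknown (its magnitude) but typically several more observations, so the extinction coefficients become better constrained as more stars are included, provided the stars span a range of colors.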
Using the above formalism, you have to solve for the extinction coefficients plus a magnitude for every star you observe. This requires a fair amount of observing to constrain all of the parameters. Clearly, if you can observe stars of known brightness, you will have better constraints on the extinction coefficients. This leads us into the discussion of standard stars.