What is calibration of a microscope?
Please explain it in detail and in an easy-to-understand way. How do we count the divisions, and how do we find the size of an object?
— Anonymous, 9 years ago (Favourite answer)
The magnification of a microscope is nominally the magnification of the ocular lens (usually 10X) multiplied by the magnification of the objective lens (4X, 10X, 43X, 100X). In practice, you will find that each microscope you use, and even each lens on the same microscope, varies slightly from the stated magnification. Before accurate measurements can be made, you must calibrate your microscope using both an eyepiece (ocular) micrometer and a stage micrometer. Because the ocular micrometer sits inside the ocular lens, its scale does not change apparent size when the objectives are changed, but the image of the specimen does. Therefore, each objective lens must be calibrated separately.

The stage micrometer has calibration marks at known intervals (0.01 mm for those used in this class). With the lowest-power objective in place, focus on the stage micrometer. Rotate the ocular micrometer until it is exactly superimposed on the stage micrometer, with the leading edges of both scales precisely aligned. Then look for another point, as far from the start as possible, where the two scales line up again. Count the spaces on each scale from the start to that second point of coincidence. This gives you X ocular units equal to a known distance in mm. Simply divide the stage distance (in mm) by the number of ocular units to determine the number of mm in each ocular unit. Most measurements of microscopic structures are given in micrometers, or microns (µm). Since there are 1000 µm in each mm, you can multiply your mm figure by 1000 to determine the number of µm in each ocular unit.
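The arithmetic above can be sketched in a few lines of Python. The division counts below are hypothetical example values, not readings from a real microscope; only the 0.01 mm stage interval comes from the answer itself.

```python
# Calibrate one objective: find where the two scales coincide twice,
# count the spaces on each scale between those points, then compute
# the real width of one ocular division.

STAGE_INTERVAL_MM = 0.01   # each stage-micrometer division (from the answer)

# Example counts from the first point of coincidence to the second
# (hypothetical values for illustration):
ocular_divisions = 25      # spaces counted on the eyepiece scale
stage_divisions = 10       # spaces counted on the stage scale

stage_distance_mm = stage_divisions * STAGE_INTERVAL_MM        # 0.1 mm
mm_per_ocular_unit = stage_distance_mm / ocular_divisions      # 0.004 mm
um_per_ocular_unit = mm_per_ocular_unit * 1000                 # 4 µm

# To size an object, count how many ocular divisions it spans
# and multiply by the calibration factor for the objective in use.
object_divisions = 12
object_size_um = object_divisions * um_per_ocular_unit         # 48 µm

print(f"1 ocular unit = {um_per_ocular_unit:.1f} um")
print(f"object size   = {object_size_um:.1f} um")
```

Remember that `um_per_ocular_unit` is valid only for the objective it was measured with; repeat the counting step for each objective and keep a separate factor per lens.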
— Anonymous, 9 years ago
Calibration is done to get the maximum utility out of the scope; calculating the diameter of the field of view helps you judge the size of what you see.

Source(s): books