A multimeter, also known as a volt-ohm meter, is a handheld tester used to measure electrical voltage, current (amperage), resistance, and other values. Multimeters come in analog and digital versions and are useful for everything from simple tests, such as measuring battery voltage, to fault detection and complex diagnostics. They are among the tools electricians prefer for troubleshooting electrical problems on motors, appliances, circuits, power supplies, and wiring systems. DIYers can also learn to use multimeters for basic measurements around the house.

  • Analog Multimeters

An analog multimeter is based on a microammeter (a device that measures amperage, or current) and has a needle that moves over a graduated scale. Analog multimeters are less expensive than their digital counterparts but can be difficult for some users to read accurately. Also, they must be handled carefully and can be damaged if they are dropped.
Analog multimeters typically are not as accurate as digital meters when used as voltmeters. However, analog multimeters are great for detecting slow voltage changes, because you can watch the needle move across the scale. Analog testers excel when used as ammeters, due to their low resistance and high sensitivity, with scales down to 50 µA (50 microamperes).
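That 50 µA figure is what gives an analog movement its sensitivity rating. As a rough illustrative sketch (the standard formula, not a spec for any particular meter), sensitivity in ohms per volt is simply the reciprocal of the full-scale deflection current:

```python
# Sensitivity of an analog meter movement, in ohms per volt,
# is the reciprocal of its full-scale deflection current.
# A 50 microampere movement works out to roughly 20,000 ohms/volt.

def sensitivity_ohms_per_volt(full_scale_amps: float) -> float:
    """Return the ohms-per-volt rating for a given full-scale current."""
    return 1.0 / full_scale_amps

print(round(sensitivity_ohms_per_volt(50e-6)))  # ~20000 ohms per volt
```

The higher this number, the less current the meter draws from the circuit it is measuring, which is why a sensitive movement makes a good ammeter.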

  • Digital Multimeters

Digital multimeters are the most commonly available type and include simple versions as well as advanced designs for electronics engineers. In place of the moving needle and scale found on analog meters, digital meters provide readings on an LCD screen. They tend to cost more than analog multimeters, but the price difference is minimal among basic versions. Advanced testers are much more expensive.

Digital multimeters typically outperform analog meters in the voltmeter function because of their much higher input resistance, which draws less current from the circuit being tested. But for most users, the primary advantage of digital testers is the easy-to-read and highly accurate digital readout.
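Why that higher input resistance matters can be sketched with a hypothetical example (the component values here are assumptions for illustration, not from the article): take an analog meter rated at 20 kΩ/V, giving 200 kΩ of input resistance on its 10 V range, versus a digital meter with a typical fixed 10 MΩ input, both measuring the midpoint of a voltage divider built from two 100 kΩ resistors across a 10 V supply. The true midpoint voltage is 5 V.

```python
# Sketch: how a voltmeter's input resistance loads the circuit under test.
# The meter resistance appears in parallel with the bottom divider resistor,
# pulling the measured voltage below its true value.

def divider_reading(r_top, r_bottom, r_meter, v_supply=10.0):
    """Voltage at the divider midpoint with the meter connected across r_bottom."""
    r_parallel = (r_bottom * r_meter) / (r_bottom + r_meter)
    return v_supply * r_parallel / (r_top + r_parallel)

# True (unloaded) midpoint is 5.0 V.
analog = divider_reading(100e3, 100e3, 200e3)   # analog: ~4.0 V, a large error
digital = divider_reading(100e3, 100e3, 10e6)   # digital: ~4.98 V, close to true
print(analog, digital)
```

The analog meter's loading pulls the reading a full volt low, while the digital meter's 10 MΩ input barely disturbs the circuit, which is the practical meaning of "higher resistance" here.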