I am working on a very low power LED blinker, and the need to measure currents in the microamp (uA) range has opened up new challenges for me in the world of lab instruments.
Once you can measure millivolts, you just let the uA current flow through a large-ish resistor (100k, 1M), measure the voltage across it, and apply Ohm's law. Then you remember that your instrument has a finite input impedance and that it might influence your readings. So, how do you estimate a DMM/DVM input impedance? Or confirm what is written in the accompanying leaflet?
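To get a feel for how much the meter loads such a high-value sense resistor, here is a minimal sketch (Python, with made-up example values: the 1 Mohm sense resistor, the 10 Mohm meter impedance, and the 400 nA current are assumptions, not measurements from this experiment). It simply compares the ideal reading with the reading you would get once the DMM's input impedance sits in parallel with the sense resistor.

```python
# Hypothetical sanity check: how much does a DMM load a high-value sense resistor?
# All values below are example assumptions.

R_SENSE = 1e6        # 1 Mohm sense resistor
R_DVM   = 10e6       # assumed DMM input impedance (10 Mohm)
I_TRUE  = 0.4e-6     # 400 nA of current we would like to measure

# Ideal reading: all of the current develops a voltage across the sense resistor.
v_ideal = I_TRUE * R_SENSE

# Loaded reading: the DMM input impedance appears in parallel with the sense
# resistor, so the effective resistance (and hence the voltage) is lower.
r_parallel = (R_SENSE * R_DVM) / (R_SENSE + R_DVM)
v_loaded = I_TRUE * r_parallel

print(f"Ideal reading : {v_ideal * 1e3:.3f} mV")
print(f"Loaded reading: {v_loaded * 1e3:.3f} mV "
      f"({100 * (1 - v_loaded / v_ideal):.1f}% low)")
```

With a 1 Mohm sense resistor and a 10 Mohm meter, the reading comes out roughly 9% low, which is exactly why knowing the meter's input impedance matters.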
In this experiment I have been using an ANENG AN8002 DMM, because it is small and uses 2xAAA cells instead of a PP3 9V battery. It should have 10 Mohm input impedance in DC Volts range.
To confirm or calculate the desired value, build a resistive voltage divider, say with a 10 Mohm resistor, like this:
+V ------- 10 Mohm -------- DVM -------- GND
Let +V be a known voltage (measure it in advance using the same DVM) and read what the DVM display says. The DVM's own input impedance forms the lower leg of the divider, so inverting the divider formula gives:
Rdvm = Vdvm*10 / (V - Vdvm) [Mohm]
Using my values I get about 11 Mohm of input impedance in the >2 V range. I have measured the 10 Mohm resistor with several instruments, and they all agree on its value.
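For reference, a small Python helper that applies the inverse formula above; the sample readings in the example are hypothetical, just to show the arithmetic.

```python
def dvm_input_impedance(v_source, v_reading, r_series=10.0):
    """Input impedance (Mohm) of a DVM placed below a series resistor.

    v_source  : known supply voltage +V (measured beforehand), in volts
    v_reading : voltage shown by the DVM, in volts
    r_series  : series resistor in Mohm (10 Mohm in this experiment)
    """
    return r_series * v_reading / (v_source - v_reading)

# Hypothetical readings, only to illustrate the math:
# with +V = 4.00 V and a display reading of 2.10 V, the divider gives ~11 Mohm.
print(f"{dvm_input_impedance(4.00, 2.10):.2f} Mohm")
```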
Since the AN8002 is an autoranging instrument, the input impedance may change with the auto-selected range. It does indeed, but it always stays around 10 Mohm.
Do I need a microAmp meter? Maybe not yet.
Oh, by the way, the blinker draws 0.4 uA (that's 400 nA) when OFF. I have repeated the math several times and taken several measurements in different ways, and they all agree. It runs 24 hours on a 1 F 5 V capacitor ... as long as you don't mess with it with a DMM :)
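To put that number in perspective, here is a rough back-of-the-envelope sketch using t = C*dV/I. The 2.5 V cut-off voltage and the average currents are assumptions of mine, not measured values; the ~29 uA line is simply the average current implied by a 24 h run time under those assumptions.

```python
def runtime_hours(capacitance_f, v_start, v_stop, i_avg_a):
    """Rough run time of a capacitor-powered load, assuming a constant average current."""
    return capacitance_f * (v_start - v_stop) / i_avg_a / 3600.0

# Assumed: 1 F capacitor discharging from 5.0 V down to a 2.5 V cut-off.
# At the 400 nA OFF current alone, the charge would last far longer than a day;
# ~29 uA is the implied average drain if the 24 h figure and the assumed cut-off hold.
print(f"{runtime_hours(1.0, 5.0, 2.5, 0.4e-6):.0f} h at 400 nA (OFF current only)")
print(f"{runtime_hours(1.0, 5.0, 2.5, 29e-6):.0f} h at ~29 uA average")
```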