Nessus Posted September 18

This is the only hardware forum I belong to, so my apologies if this is inappropriate. I am trying to help the Better Half learn some electronics. "Better" is going through Charles Platt's Make: Electronics series. One of the experiments involves measuring the DC gain of a transistor from 5 microamps of base current up to a few milliamps. Lacking a meter that would go down to microamps, we bought a BK Model 393.

Part of the experiment is measuring the collector current. At one point we measured the collector current as 1.44 mA on the milliamp scale, but when we set the meter to microamps, the current read 1220 microamps rather than 1440. I used to build computers back in the 1980s-90s, and every meter I've used before got the scaling "right" and would have read 1440. Is this expected behavior, or should we return the meter? BK's tech support has not answered our request for information.

K-R.
apersson850 Posted September 18

To be expected. When you switch over to microamps, the internal resistance of the instrument is higher, since it is not sensitive enough to keep that resistance as low as on the higher range. Hence it influences the circuit it is part of more, and the current is reduced.
Nessus Posted September 19 (Author)

6 hours ago, apersson850 said:

    To be expected. When you switch over to microamps, the internal resistance of the instrument is higher ... the current is reduced.

Thanks for the response, but I am not entirely sure I understand. It makes sense that inserting a meter into the circuit will have a slight effect, I "get" that. It also makes sense that the input resistance may vary somewhat between the ranges, but I would think that the readings should agree to better than 15 percent. Based on the values of the components, we calculated the current as 1.4 mA. Can I trust this meter?

Apologies if I sound "thick"; I haven't played much with analog stuff since I was in school, back when the Earth was cooling.

K-R.
apersson850 Posted September 20 (edited)

The meter adds some impedance to the circuit when you measure current. The instrument actually measures the voltage across its own internal shunt and calculates the current from that. How much this changes things depends on the impedance of the rest of the circuit relative to the instrument's.

Let's say, for the sake of argument, that the instrument drops 1 V to indicate 1 A. If the current flowing in a circuit driven by 10 V is 1 A, then inserting the instrument means only about 9 V remains to drive the original circuit, since the instrument consumes roughly 1 V. Thus the current is 1 A when we aren't measuring it, but as soon as we measure it the instrument shows about 0.9 A. Is the instrument faulty? No: once the instrument is added to the circuit, the current in this new circuit really is about 0.9 A.

But if the impedance of the circuit is ten times higher, the original current is only 0.1 A. Now the voltage drop across the instrument is a tenth of what we had before, 0.1 V, leaving 9.9 V to drive the rest of the circuit, so the new current is 99 mA instead of 100 mA. The difference is just 1 mA, or 1%, whereas in the first circuit, whose impedance was close to the instrument's own, the current fell by 10%.

The specifications I found on the internet for your instrument claim a maximum burden voltage of 2 V on the 6000 µA range. On the 60 mA range the burden voltage is reduced to 500 mV, for some reason. The specification suggests they took some shortcuts in the design (or the data sheet I found is wrong). In any case, a voltage drop of about 0.4 V can certainly influence your circuit: the 2 V drop applies at full range, 6 mA, so at a bit over 1 mA it will be about 0.4 V. That can make a big difference in the kind of circuit you are testing, and a difference of exactly the kind you see.
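The worked example above can be checked numerically. This is a minimal sketch, modelling the meter as a pure series resistance of 1 Ω (i.e. the assumed 1 V of burden at 1 A); solving the series circuit exactly gives 909 mA rather than the rounded 0.9 A in the explanation, but the point is the same:

```python
def measured_current(v_supply, r_circuit, r_meter):
    """Current that actually flows once the meter's shunt is in series."""
    return v_supply / (r_circuit + r_meter)

R_METER = 1.0  # ohms: assumed 1 V burden at 1 A, as in the example above

# Circuit impedance comparable to the meter's: large error.
low_z = measured_current(10.0, 10.0, R_METER)
# Circuit impedance ten times higher: the error shrinks to about 1%.
high_z = measured_current(10.0, 100.0, R_METER)

print(f"low-Z circuit:  {low_z * 1000:.0f} mA instead of 1000 mA")
print(f"high-Z circuit: {high_z * 1000:.0f} mA instead of 100 mA")
```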
A tip is to measure the voltage across some resistor that's already in the circuit and then calculate the current from Ohm's law. DMM-type instruments typically have an input impedance of 10 megohms on the voltage ranges (yours is no exception), so the influence of the measurement across resistors of more normal values is minimal.

I was probably at school at about the same time as you, but I've spent most of my professional life doing things like this, so I haven't forgotten it yet.

Edited September 20 by apersson850
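The resistor trick above is just Ohm's law plus a sanity check on the voltmeter's loading. The 680 Ω resistor value and the voltage reading below are hypothetical, chosen only so the numbers land near the 1.44 mA from the original experiment:

```python
R_SENSE = 680.0     # ohms: a resistor already in the circuit (hypothetical value)
V_MEASURED = 0.979  # volts read across it (hypothetical reading)

# Ohm's law: I = V / R
i_amps = V_MEASURED / R_SENSE
print(f"current = {i_amps * 1000:.2f} mA")

# Loading check: the meter's 10 Mohm input impedance sits in parallel
# with R_SENSE, so the effective resistance barely changes.
R_METER_V = 10e6
r_effective = (R_SENSE * R_METER_V) / (R_SENSE + R_METER_V)
print(f"resistance with meter attached: {r_effective:.2f} ohms (vs {R_SENSE})")
```

Because 680 Ω is tiny next to 10 MΩ, the meter's presence changes the node by only a few hundredths of an ohm, which is why this method barely disturbs the circuit.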
Nessus Posted September 21 (Author)

Thank you for the explanation, and I really appreciate your looking up the info. This means that measuring 1.4 mA on the 60 mA scale, with its 500 mV full-scale burden voltage, would cause only about a 12 mV drop in the circuit? I was able to convey your explanation to the Better Half, and will try the experiment using the resistor trick.

K-R.
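That arithmetic, assuming the burden voltage scales linearly with current (i.e. the range behaves as a purely resistive shunt), looks like:

```python
V_BURDEN_FULL = 0.500  # volts at full scale on the 60 mA range (per the spec above)
I_FULL_SCALE = 0.060   # amps
r_shunt = V_BURDEN_FULL / I_FULL_SCALE  # about 8.3 ohms, assuming a resistive shunt

i_reading = 0.0014     # the 1.4 mA collector current
drop = i_reading * r_shunt
print(f"burden drop at 1.4 mA: {drop * 1000:.0f} mV")
```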
apersson850 Posted September 21

Yes, you've understood the math. Glad I could be of help. Measuring currents in circuits has always been a hassle. Current clamps don't influence the circuit much, but they tend to be costly for low currents, especially DC.