
Monday, March 29, 2021

Reverse engineering and "unlocking extras" in a Hantek 1832C LCR meter


For some time I've felt the need to have a piece of equipment allowing me to measure inductance, impedance and a broader range of capacitance values. For the rest of the relevant types of measurements I've already had a digital multimeter and an oscilloscope, which have proven sufficient so far.

The device in question is called an LCR meter after the symbols representing the quantities it is capable of measuring: L stands for inductance (the letter conventionally used in equations involving this quantity), C for capacitance and R for resistance.

In more advanced LCR meters, and in this model in particular, other quantities such as the quality factor (Q), Equivalent Series Resistance (ESR), impedance (Z) and reactance (X) can also be measured.

One aspect that characterizes the versatility of an LCR meter is the range of frequencies it can synthesize in order to produce measurements. An LCR meter with a broader frequency range can measure the inductance of a component under conditions closer to those in which it will actually operate.

In this particular LCR meter, the user can select test frequencies between 100 Hz and 40 kHz. There is a "twin" version of this meter, the 1833C, which can additionally output 50, 75 and 100 kHz sine waves, and which offers two selectable output levels: 300 mV rms or 600 mV rms. The 1832C only outputs a fixed 600 mV.

While I have no confirmation that the hardware of the 1833C is the same, some users found that after installing the firmware from the vendor's page on their 1832C devices (there is only one firmware image for all 183x models), to their surprise the device started reporting itself as a 1833C, and they could then select frequencies up to 100 kHz.

While this is a relatively simple way of "converting" the device, I found that my (recently purchased) meter had a firmware version that appeared more recent than the one on the website, judging by the date code/version number of each. Mine reports version 20201120PM, whereas the one on the Hantek site is 2020101001AM, according to the filename and a photo provided by one user in the forum thread where this topic first appeared: https://www.eevblog.com/forum/testgear/hantel-lcr-1832c-unlock/

As I didn't want to give up this possibly newer firmware version, I decided to find out whether there was an alternative way of achieving the same effect, i.e. unlocking the extra frequencies and the 300 mV level.

By reading the manual I found that this instrument supports SCPI (Standard Commands for Programmable Instruments) commands. SCPI is basically a specification that describes a standard way for a host to communicate with a given piece of test equipment. The physical interface in this case is USB, but the specification is agnostic to the physical layer. Another popular physical interface, dating from the days when the standard first appeared, is the GPIB bus (IEEE-488.1), but it is less common in modern devices.

So my focus was on trying to figure out whether there would be some undocumented SCPI command to change factory data. I knew that many devices support changing low-level calibration data and even model data via SCPI (as is the case of the Rigol DS1052E oscilloscope), so chances were Hantek would use the same approach to calibrate and set the model in the factory, while keeping the rest of the manufacturing process identical for the two models.

By playing a bit with the SCPI commands, I found that I could set the frequency to 100 kHz using the FREQuency <freq> command:


I could also set the level with the FUNCtion:LEVel <level> command. These settings stayed active as long as I did not select a different value on the keypad. Verifying that the device could take measurements and that the output frequency was very accurate in these "out of spec" ranges gave me confidence that the 1833C is very plausibly the same physical device.
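For scripting these writes, the commands can be composed as plain strings and sent over any USBTMC/VISA transport. A minimal sketch in Python; only the command names come from the meter, while the transport shown in the comment (pyvisa) and the exact argument format for the level are assumptions:

```python
# Helpers that compose the SCPI write commands discussed above.
# Only the command names are from the meter; everything else here
# (transport, level argument format) is an assumption.

def freq_cmd(hz: int) -> str:
    """Build a FREQuency write command, e.g. freq_cmd(100000)."""
    return f"FREQuency {hz}"

def level_cmd(volts_rms: float) -> str:
    """Build a FUNCtion:LEVel write command (argument format assumed)."""
    return f"FUNCtion:LEVel {volts_rms}"

# With pyvisa, these would be sent roughly as:
#   import pyvisa
#   meter = pyvisa.ResourceManager().open_resource("USB0::...::INSTR")
#   meter.write(freq_cmd(100000))   # 100 kHz, normally locked on the 1832C
#   meter.write(level_cmd(0.3))     # 300 mV rms, also normally locked
print(freq_cmd(100000))
```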

In order to find other commands, my first approach was to analyse the firmware binary image and look for relevant strings. This revealed the existence of a "fact:model ?" command that, when called, returns the current model string:

-> fact:model ?

<- model = Hantek1832C

Upon trying to use it for writing, i.e.:

-> fact:model Hantek1833C

<- model = Hantek1832C

it would just behave as the query command. It seemed too easy, but I had to try anyway :)

So I decided to go deeper into the analysis, using a tool called Ghidra.

Ghidra is an open-source tool originally created by the NSA (not too surprising that this organization would need such a tool :) ), capable of disassembling and decompiling code for multiple architectures. In this case, the LCR meter is based on an STM32 microcontroller:

So I needed to decompile 32-bit little-endian Arm Cortex code.


While the default view after opening the binary is an assembly listing (which I have a hard time reading), the best part is that the tool can analyse the code and generate somewhat more readable C code.


The rest of the work is a bit like solving a puzzle. Along the way, as I figure out what the functions do, I can give them meaningful names, making the work progressively easier as the analysis advances:


By default the tool can only assign generic names to functions, labels and variables, such as FUN_xxxx or uVarXX. As soon as one of these elements is renamed, all its occurrences are renamed as well.

The tool is quite comprehensive and can even generate flow graphs such as this one:


Not very helpful but cool.

Not too long into the analysis, I realized there was a global variable pointing to an address in RAM that held a byte used as a boolean. When this byte was set to 0x01, certain commands were allowed to run.

In the STM32, any address in the 0x20000000-0x2000FFFF range corresponds to SRAM:

So by looking up this address (0x200031E6) elsewhere in the code, I found where it was being read, and also where it was asserted. I was able to conclude that this variable indicates whether or not the device is in debug mode. Among the various commands that depended on this variable being set to 0x01 was the fact:model write command.

So I only had to figure out how to enter debug mode. I found another function that grouped various calibration-related strings, among them "hantek_enter_debug_cmd" and "hantek_exit_debug_cmd". And precisely in this section of the code the debug global variable was being set. Digging a little further, I could see that these two commands belong to the CALIB subsystem, and that this function is essentially responsible for handling all the commands of that subsystem.
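To make the recovered logic concrete, here is an illustration (not the actual firmware code, and all names are mine): a flag byte in SRAM gates the factory-write commands, and the fact:model write silently degrades to the query when the flag is not set, matching the behaviour observed earlier:

```python
# Model of the command gating recovered from the decompilation.
# This is a reconstruction for illustration only; return strings and
# the fact:save gating are assumptions.

model = "Hantek1832C"
debug_flag = 0x00   # stands in for the byte at SRAM address 0x200031E6

def handle_command(cmd: str) -> str:
    global model, debug_flag
    if cmd == "calib:hantek_enter_debug_cmd":
        debug_flag = 0x01
        return ""
    if cmd == "calib:hantek_exit_debug_cmd":
        debug_flag = 0x00
        return ""
    if cmd == "fact:save":
        # assumed to also be gated by the debug flag
        return "" if debug_flag == 0x01 else "error"
    if cmd.startswith("fact:model"):
        arg = cmd[len("fact:model"):].strip()
        if arg and arg != "?" and debug_flag == 0x01:
            model = arg.strip('"')   # write accepted only in debug mode
            return ""                # write commands produce no reply
        return f"model = {model}"    # otherwise behaves as the query
    return "unknown"
```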

So with this information I decided to experiment at runtime, and indeed the device appeared to enter debug mode:


Intuitively, my next step was to try fact:model as a write command. I entered:

fact:model Hantek1833C

and verified that no response was produced, which is normal for write commands. So I exited debug mode:


and tried the same command as a query, i.e.

fact:model ?

As expected, the command returned Hantek1833C.

But when I executed another SCPI command, *IDN? (a command that most devices support, which provides information about the software and hardware of the device), I obtained the same model number:

Hantek Handheld LCR Meter,Hantek1832C,CN************,20201120PM
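The *IDN? reply is a standard comma-separated string, so the four fields (manufacturer, model, serial number, firmware version) can be pulled apart trivially. A quick illustration using the reply above:

```python
# Split the *IDN? reply into its four standard fields.
idn = "Hantek Handheld LCR Meter,Hantek1832C,CN************,20201120PM"
manufacturer, model, serial, firmware = idn.split(",")
print(model, firmware)   # -> Hantek1832C 20201120PM
```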

Also, looking at the SYSTEM INF menu, I was still getting the same model:



So it seemed evident that this command was doing something, but not actually writing the change to flash.

Looking a bit further, I found that the same subsystem has a fact:save command, which seemed to do just that:


Executing it after changing the model string, this time the change was indeed persisted:

I was then able to confirm that everything worked as expected: I could now select any of the new frequencies, as well as the 300 mV level:


Looking at the signal on the oscilloscope, I could verify that the frequency and shape of the sine wave were spot on, and that the 300 mV setting was close to the actual rms value:


So, in a nutshell, the commands that need to be executed are:
  1. Enter debug mode:

    calib:hantek_enter_debug_cmd

  2. Change the model string (this command will not return data, so use the "Send Command" button in this case):

    fact:model "Hantek1833C"

  3. Save the change:

    fact:save

  4. Exit debug mode:

    calib:hantek_exit_debug_cmd

  5. Confirm that the change was successful:

    *IDN?
In Keysight Interactive IO, for all commands except the fact:model write, you should use the "Send & Read" button to send the command and wait for the response string:
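The five steps above can also be scripted instead of typed one by one. A sketch assuming pyvisa on the host side (the VISA resource address is device-specific; as noted, the fact:model write returns nothing, so it is sent without reading back):

```python
# The unlock sequence from steps 1-5 above, as (command, expects_reply)
# pairs. All commands except the fact:model write return a response.
UNLOCK_SEQUENCE = [
    ("calib:hantek_enter_debug_cmd", True),
    ('fact:model "Hantek1833C"', False),   # write only, no response
    ("fact:save", True),
    ("calib:hantek_exit_debug_cmd", True),
    ("*IDN?", True),                       # should now report Hantek1833C
]

def run_unlock(resource):
    # resource is the meter's VISA address, e.g. "USB0::...::INSTR"
    import pyvisa   # imported here so the sequence above is usable standalone
    meter = pyvisa.ResourceManager().open_resource(resource)
    for cmd, expects_reply in UNLOCK_SEQUENCE:
        if expects_reply:
            print(meter.query(cmd))   # send and read the response
        else:
            meter.write(cmd)          # send without waiting for a reply
```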


After the change was committed, I compared the contents of the flash before and after, and confirmed that, for example, no calibration data is lost. As can be seen, only the model string and what appears to be a flag have changed.


There is another region of the flash where one 32-bit word changes, but I verified that this value changes even without modifying any settings. It possibly holds some kind of quick self-calibration or zero-setting data:



Although it seems fairly evident that the two models share the same hardware, it is worth noting that in the 1832C there is likely no valid or accurate calibration data for the frequencies that are not normally accessible. Since calibration is a time-consuming process, the assembly line can probably output more 1832C units by skipping the calibration of these additional modes - test equipment almost unavoidably needs to be individually calibrated in order to be fit for purpose.

Doing a few tests, I found some indication that the calibration for these new frequencies may indeed not be accurate: I ran the open- and short-circuit user calibrations, and then tested an inductor marked V121, which indicates 120 uH with a tolerance of 25%.


Taking measurements at the different frequencies, I obtained the following results:

Frequency (Hz)    Measurement (uH)
100               182
120               180
400               184
1000              185
4000              185.6
10000             185.1
40000             183.44
50000             174.9
75000             172.78
100000            170.67

While the measured value is consistently outside the indicated tolerance (the inductor may be designed for a substantially higher frequency), looking at the corresponding log-scale graph we see a more pronounced drop going from 40 kHz to 50 kHz than in any other transition in the vicinity:
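That drop can be quantified directly from the table: computing the relative change between each pair of adjacent test frequencies shows the 40 kHz to 50 kHz step is several times larger than any neighbouring one.

```python
# Relative drop between adjacent test frequencies, using the values
# from the measurement table above.
measurements = {  # frequency (Hz) -> measured inductance (uH)
    100: 182, 120: 180, 400: 184, 1000: 185, 4000: 185.6,
    10000: 185.1, 40000: 183.44, 50000: 174.9, 75000: 172.78,
    100000: 170.67,
}
freqs = sorted(measurements)
drops = {
    (f1, f2): (measurements[f1] - measurements[f2]) / measurements[f1]
    for f1, f2 in zip(freqs, freqs[1:])
}
worst = max(drops, key=drops.get)
print(worst, f"{100 * drops[worst]:.1f}%")   # -> (40000, 50000) 4.7%
```

The next-largest drops (50 to 75 kHz and 75 to 100 kHz) are around 1.2%, so the discontinuity sits exactly at the boundary between the factory-calibrated and unlocked ranges.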



I observed a similar pattern with other inductors, which suggests that some calibration offset is not correct for the new frequencies.

Looking further at the code, I could see that there are other commands in the "calib" SCPI subsystem, likely the ones required for performing calibration against reference standards (something that should ideally be done in a certified lab). These are:

calib:paraav
calib:300mvopen
calib:600mvopen
calib:300mvshort
calib:600mvshort
calib:300mvstd
calib:300mvstdsave
calib:600mvstd
calib:600mvstdsave
calib:300mvopensave
calib:600mvopensave
calib:300mvshortsave
calib:600mvshortsave
calib:600mvread
calib:300mvread
calib:short

But it is not yet clear how to use these. Some appear to provide a readout of the calibration data, but the exact syntax remains to be figured out.

Regarding the hardware, the device doesn't have many discrete components. The STM32 is clearly the heart of the meter, and everything indicates that its internal 12-bit ADCs are used for signal acquisition, and that at least one of the 12-bit DACs is used for producing the sine wave.


There is one component that I could not readily identify because its markings were ground off, but during his teardown Voltlog was able to identify it as an AD8052ARZ voltage-feedback opamp (https://www.analog.com/media/en/technical-documentation/data-sheets/AD8051_8052_8054.pdf).


I'm not sure why this particular component deserved to be obfuscated, even though it appears to be responsible for the signal conditioning before the ADC (its two outputs are connected to the PA1 and PA2 pins of the microcontroller).

7 comments:

Unknown said...

Hey!
I unlocked my 1832 and found what to do to make the device read correctly at high frequencies. It is enough to desolder the 4 capacitors located next to the calibration resistors, which form an RC filter with a 45 kHz corner frequency. After that, perform the usual calibration.
Good luck!

Creation Factory said...

Hi,

Thanks for the hint! Can you detail more the procedure you describe? I.e. which are these 4 capacitors? And do you have any idea if these capacitors are absent in the 1833C?

Cheers

Unknown said...

Good evening!
Capacitors C66, C67, C68 and C69 are located near the reference resistors and form an RC filter with them, with a 45 kHz corner frequency. When calibrating at a frequency above this, the capacitors shunt the reference resistors and the calibration comes out wrong. I realized this when measuring a precision 10.000 kΩ resistor: at 100 kHz it read 30 kΩ. After removing the 4 capacitors and recalibrating, it read 10.000 kΩ at all frequencies. To store the calibration results, press the SET button 3 times; after the device is turned back on, the calibration settings are saved (but I'm not completely sure about this). Whether these capacitors are present in the 1833 I don't know; I think not, otherwise that device would be calibrated incorrectly.
Happy experiments!

Creation Factory said...

This sounds good. It makes sense that they might have done something extra to differentiate the models. And if that is the case, it is certainly a cheap way of achieving such segregation at the hardware level.

I confirm the same in my device, while testing parallel resistance on a 10 K resistor. Up to 40 kHz it is steady at 10 K, but as I select higher frequencies the value progressively increases, up to 30 K at 100 kHz.

Indeed, on a resistive load the frequency shouldn't matter, even assuming incorrect calibration offsets.

Ultimately, what the 1833 might have instead are different-valued capacitors, so that the cutoff frequency is above 100 kHz.

Nice one! Thanks

Unknown said...

If these capacitors are present in the 1833, their capacitance should be about 3 times smaller. In parallel with the 100 kΩ calibration resistor there was a 22 pF capacitor, and with the 10 kΩ one a 245 pF capacitor. The device works great without these capacitors at all. It may be necessary to compensate at high frequencies by installing capacitors, but the capacitance must then be calculated for a 100 kHz filter.
Good luck!

Unknown said...

It is necessary to wash the board with isopropanol, after which the device works more stably.

Unknown said...

I never found a 10Ω calibration resistor on the board.