What is the difference between the readability and accuracy of a balance?

There are many options out there when it comes to laboratory balances. Other than maximum capacity, one of the main differences you may have seen between balances is their readability. The term readability is often used interchangeably with terms like repeatability (precision) and accuracy. However, they each have distinct definitions that are important when trying to understand the difference between the readability and accuracy of a balance, and when deciding which balance is best for your application.

So, what is readability and how does it differ from accuracy?


Readability:

This is the smallest difference between values that can be read from the display. The images below show the displays of balances with differing readabilities. Readability should only really be considered a specification of the balance, as opposed to a complete indication of how correct the actual reading is. This is explained in a little more detail below.

[caption id="attachment_15751" align="aligncenter" width="722"] (A) Ohaus Analytical Balance with readability of 0.0001g. (B) Ohaus Semi-Micro Balance with readability of 0.00001g. They will display readings at these intervals.[/caption]

Accuracy:

Accuracy is more representative when it comes to determining how “good” a balance is. It can be defined as how close any measured value is to the true value of the weight applied. It is not a standalone value given to a balance, but is determined by a number of factors including precision, trueness and linearity of measurement. A high-accuracy balance is essential if trueness of measurement across the whole measuring range is your goal. Of course, this has to be combined with an adequate level of readability. Calibration can demonstrate the accuracy of a balance: if your calibration weights are correctly maintained and you know their weights to be true, you can perform repeatability and linearity tests. Balance manuals will often detail how to perform such tests.

Other terms used when it comes to describing balances are:
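As a minimal sketch of what readability means in practice, the snippet below models the display as rounding the true mass to the nearest display increment. The masses and the helper function are made up for illustration.

```python
# Hypothetical illustration: readability limits what a balance can display.
# The display rounds the true mass to the nearest increment (the readability).

def displayed(mass_g: float, readability_g: float) -> float:
    """Round a true mass to the nearest display increment."""
    increments = round(mass_g / readability_g)
    return increments * readability_g

true_mass = 1.23456789  # grams
print(displayed(true_mass, 0.0001))   # analytical balance, 0.0001 g readability
print(displayed(true_mass, 0.00001))  # semi-micro balance, 0.00001 g readability
```

Note that the displayed figure can look very precise while still differing from the true mass; that difference is a question of accuracy, not readability.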


Repeatability (precision):

This is the amount of variation you will see during repeated measurements of the same mass, independent of the true value, under the same conditions (for example, the same user and the same environmental conditions, so bias is reduced). It is often expressed as a standard deviation across a number of measurements (usually 5 or 10).
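The standard-deviation calculation described above can be sketched as follows; the readings are made-up example values for a nominal 10 g mass.

```python
# Hypothetical illustration: repeatability expressed as the sample standard
# deviation of repeated readings of the same mass (readings in grams).
import statistics

readings_g = [10.0001, 10.0003, 9.9999, 10.0002, 10.0000]  # 5 repeat weighings

repeatability = statistics.stdev(readings_g)  # sample standard deviation
print(f"Repeatability (std dev): {repeatability:.5f} g")
```

A smaller standard deviation indicates better repeatability; a common rule of thumb is to use at least 5 or 10 repeat weighings, as noted above.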


Linearity:

This is the ability of the balance to follow a linear relationship between the mass applied and the displayed value. A perfectly linear relationship would follow a straight line, as shown in blue on the graph below. However, balances will have some permissible variation from a linear relationship (the beige shaded area), which is written as a ± value (the black error bars).

[caption id="attachment_15734" align="aligncenter" width="300"]Graph of displayed value against applied weight, showing the ideal straight line and the permissible variation.[/caption]
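A simple linearity test can be sketched as below: compare readings taken with a series of calibration masses against the ideal straight line (displayed value equal to applied mass) and report the worst-case deviation as a ± value. The masses and readings here are made up for illustration.

```python
# Hypothetical illustration: linearity error as the worst-case deviation of
# each reading from the ideal straight line (displayed value == applied mass).

applied_g   = [0.0, 50.0, 100.0, 150.0, 200.0]              # calibration masses
displayed_g = [0.0, 50.0002, 100.0005, 150.0003, 199.9998]  # balance readings

deviations = [d - a for a, d in zip(applied_g, displayed_g)]
linearity_error = max(abs(e) for e in deviations)  # worst-case deviation
print(f"Max linearity error: ±{linearity_error:.4f} g")
```

This worst-case figure corresponds to the ± value described above; a balance within specification should keep it inside the permissible band across the whole range.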

Corner load (and corner load error):

This is used to describe differences in the measured value when a weight is applied to the corner of the weighing pan as opposed to the centre. Corner load error means that readings differ because of this, and this type of error also contributes towards how accurate a balance is.

We supply a large range of balances from leading manufacturers including Ohaus, Adam Equipment, Precisa and A&D, which can be found here. If you need any assistance in selecting the most suitable balance for your application, you can contact us using the contact form below or simply call 01954 233 120.
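The corner-load check described earlier can be sketched as follows: take one reading at the centre of the pan and one at each corner with the same weight, and report the largest difference. The readings are made-up example values.

```python
# Hypothetical illustration: corner load error as the largest difference
# between a reading at each corner of the pan and the centre reading.

centre_g = 100.0000
corner_readings_g = [100.0003, 99.9998, 100.0002, 99.9999]  # four corners

corner_load_error = max(abs(r - centre_g) for r in corner_readings_g)
print(f"Corner load error: {corner_load_error:.4f} g")
```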