
Choosing pressure sensor to meet accuracy requirements for Blood Pressure Measurement

Question asked by Gustavo Roitman on Jan 13, 2016
Latest reply on Jan 19, 2016 by Jose Alberto Reyes Morales


I'm working on the design of a Non-Invasive Blood Pressure monitor. I'm choosing a pressure sensor that meets the specifications needed to comply with standard IEC 60601-2-30, which applies to devices of this kind. I see that there are several application notes where NXP suggests different pressure sensors as adequate for this application. For example, Application Note AN4328 uses the pressure sensor MP3V5050.


The above-mentioned international standard requires that the device have a manometer mode to measure the air pressure in the patient's cuff, and that the accuracy of the device meet the following requirement:


" Limits of the error of the manometer from environmental conditions

Over the temperature range of 10°C to 40°C and the relative humidity range of 15% to 85% (non-condensing), the maximum error for the measurement of the CUFF pressure at any point of the nominal measurement range shall be less than or equal to +/-3mmHg (+/-0.4kPa) or 2% of the reading, whichever is greater."


The standard also requires that the manometer system measure up to at least 230mmHg. This sets the limit of the error to +/-3mmHg when the measured pressure is in the range 0mmHg to 150mmHg, and 2% of the reading in the range 150mmHg to 230mmHg.
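To make the crossover between the two limits explicit, here is a quick sketch of the allowed error as a function of the reading (my own helper function, not something defined in the standard):

```python
def allowed_error_mmhg(reading_mmhg):
    """IEC 60601-2-30 manometer limit: +/-3 mmHg or 2% of the reading,
    whichever is greater."""
    return max(3.0, reading_mmhg * 2.0 / 100.0)

# The 2%-of-reading term overtakes the fixed 3 mmHg term at 150 mmHg,
# since 2% of 150 mmHg = 3 mmHg.
print(allowed_error_mmhg(100))  # 3.0
print(allowed_error_mmhg(150))  # 3.0
print(allowed_error_mmhg(230))  # 4.6
```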


All pressure sensors I've seen specify accuracy as a percentage of full-scale span (FSS), not as a percentage of the reading. For example, the MP3V5050GP specifies an accuracy of +/-2.5% FSS over the compensated temperature range 0°C to 85°C (with no reference to relative humidity). FSS is 50kPa = 375mmHg, so the accuracy is 9.375mmHg, which exceeds the limit over the entire measurement range.
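The arithmetic behind that 9.375mmHg figure, spelled out (the kPa-to-mmHg conversion factor below is the usual approximate value, not taken from the datasheet):

```python
FSS_KPA = 50.0          # MP3V5050GP full-scale span
KPA_TO_MMHG = 7.50062   # approximate conversion factor
ACCURACY_PCT_FSS = 2.5  # datasheet accuracy, percent of FSS

fss_mmhg = FSS_KPA * KPA_TO_MMHG        # ~375 mmHg
accuracy_mmhg = (ACCURACY_PCT_FSS / 100.0) * fss_mmhg
print(accuracy_mmhg)  # ~9.38 mmHg, well above the +/-3 mmHg limit
```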


1) Am I correct in the meaning of the sensor specification?

2) Is there a way to choose MP3V5050GP and comply with the limits in the error required in the standard?

3) We can make an offset and gain adjustment during production at room temperature (25°C), and we could also perform an Auto-Zero procedure before each measurement. Could the +/-2.5% accuracy be reduced by either of these adjustments? If so, how can we calculate the new accuracy after the adjustments?


I've already read Application Note AN1636, where Auto-Zero is explained for another sensor. That application note assumes the only error sources are Offset Calibration and Temperature Coefficient of Offset, and it ends up with error bounds that account only for ADC resolution; it doesn't take other error sources into account (pressure and thermal hysteresis, non-linearity, thermal effect on span).
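To illustrate what I mean, here is a minimal error-budget sketch separating what auto-zero can cancel from what it cannot. The individual error figures are placeholders I made up for illustration, not MP3V5050 datasheet values, and root-sum-square is just one common way of combining independent contributions (the standard's limit is on the worst-case error):

```python
# Hypothetical error contributions in %FSS (placeholders, NOT datasheet values)
errors_pct_fss = {
    "offset_calibration": 1.0,   # removed by an auto-zero at 0 applied pressure
    "tc_offset": 0.5,            # removed if auto-zero is done near measurement temp
    "linearity": 0.5,            # NOT removed by auto-zero
    "pressure_hysteresis": 0.3,  # NOT removed by auto-zero
    "tc_span": 0.5,              # NOT removed by auto-zero
}
removed_by_autozero = {"offset_calibration", "tc_offset"}

def residual_error_pct(errors, removed):
    """Root-sum-square of the contributions auto-zero cannot cancel."""
    return sum(v ** 2 for k, v in errors.items() if k not in removed) ** 0.5

print(residual_error_pct(errors_pct_fss, removed_by_autozero))  # ~0.77 %FSS
```

My question is essentially how to fill in this kind of budget with real datasheet numbers for the MP3V5050GP.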