Linearity vs Sensitivity | Difference between Linearity and Sensitivity

This page compares Linearity vs Sensitivity and describes the difference between Linearity and Sensitivity.

Sensitivity

Figure: Sensitivity curve (output qo vs input qi)

Definition:
Sensitivity is the ratio of the magnitude of the output signal to the magnitude of the input signal applied to the instrument.
➨Sensitivity = Output/Input
➨A good instrument should have a high degree of sensitivity.
➨Sensitivity is the reciprocal of the deflection factor, i.e. Sensitivity = 1/Deflection factor

As shown in the sensitivity curve, sensitivity is also the ratio of an infinitesimal change in the output to the corresponding infinitesimal change in the input, i.e. the slope of the calibration curve.
➨Sensitivity = Δqo/Δqi
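
As a quick numerical illustration, the sketch below (plain Python with made-up calibration readings; variable names and units are illustrative assumptions, not from any standard) computes the sensitivity Δqo/Δqi from two calibration points and the corresponding deflection factor:

```python
# Minimal sketch: estimating static sensitivity from calibration data.
# The input/output values below are made-up illustrative readings
# (e.g. applied pressure in kPa vs pointer deflection in mm).

inputs  = [0.0, 1.0, 2.0, 3.0, 4.0]    # input qi (kPa)
outputs = [0.0, 2.5, 5.0, 7.5, 10.0]   # output qo (mm)

# Sensitivity over one calibration step: Δqo / Δqi
delta_qo = outputs[2] - outputs[1]
delta_qi = inputs[2] - inputs[1]
sensitivity = delta_qo / delta_qi       # 2.5 mm/kPa

# Deflection factor is the reciprocal of sensitivity:
# the input needed to produce one unit of output.
deflection_factor = 1.0 / sensitivity   # 0.4 kPa/mm

print(f"Sensitivity       = {sensitivity} mm/kPa")
print(f"Deflection factor = {deflection_factor} kPa/mm")
```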

Linearity

Figure: Linearity curve showing the actual calibration curve and the idealized straight line

Definition:
• It is defined as the ability of an instrument to reproduce the input characteristics symmetrically and linearly. The linearity curve shows the actual calibration curve along with the idealized straight line.
• The output is linearly proportional to the input.
• For a linear instrument, the sensitivity is constant over the entire range of the instrument.

➨Linearity is a more important parameter than all the other parameters, including sensitivity, as illustrated in the sketch below.
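
To see how linearity can be assessed in practice, here is a small sketch (with assumed, slightly non-ideal readings; the least-squares fit stands in for the idealized straight line of the figure) that compares the actual calibration curve against the fitted line and reports the maximum deviation as a percentage of full scale:

```python
# Sketch: checking linearity of a calibration curve against an ideal
# straight line. Data values are illustrative assumptions.

inputs  = [0.0, 1.0, 2.0, 3.0, 4.0]
outputs = [0.0, 2.4, 5.1, 7.4, 10.1]   # slightly non-ideal readings

n = len(inputs)
mean_x = sum(inputs) / n
mean_y = sum(outputs) / n

# Least-squares slope and intercept of the idealized straight line
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(inputs, outputs)) \
        / sum((x - mean_x) ** 2 for x in inputs)
intercept = mean_y - slope * mean_x

# Non-linearity: maximum deviation of the actual curve from the
# fitted line, expressed as a percentage of full-scale output
full_scale = max(outputs) - min(outputs)
max_dev = max(abs(y - (slope * x + intercept))
              for x, y in zip(inputs, outputs))
nonlinearity_pct = 100.0 * max_dev / full_scale

print(f"Fitted sensitivity (slope) = {slope:.3f}")
print(f"Non-linearity = {nonlinearity_pct:.2f} % of full scale")
```

With these assumed readings, the fitted slope (the constant sensitivity of the idealized linear instrument) comes out to about 2.52, and the non-linearity is roughly 1.2 % of full scale.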


