What does the term 'sensitivity' refer to in the context of an RCD?


In the context of a Residual Current Device (RCD), 'sensitivity' refers to the minimum current imbalance required to trip the device. This imbalance is the difference between the current flowing in the live conductor and the current returning in the neutral conductor; any difference indicates a leakage current to earth, which poses a risk of electric shock or fire.

An RCD is designed to detect even small leakage currents and disconnect the supply quickly. The sensitivity rating, typically expressed in milliamperes (mA), is the leakage current at which the RCD will trip. Common sensitivity ratings include 30 mA for personal protection and 100 mA for fire protection.
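To make the trip logic concrete, here is a minimal, purely illustrative sketch in Python. It assumes the residual current is simply the magnitude of the difference between the live and neutral currents, and the function name, current values, and default rating are hypothetical examples rather than part of any RCD standard.

```python
def rcd_trips(live_current_a: float, neutral_current_a: float,
              sensitivity_ma: float = 30.0) -> bool:
    """Illustrative check: does the current imbalance exceed the RCD's sensitivity?

    live_current_a / neutral_current_a: currents in amperes.
    sensitivity_ma: rated residual operating current in milliamperes (e.g. 30 mA).
    """
    residual_ma = abs(live_current_a - neutral_current_a) * 1000  # convert A to mA
    return residual_ma >= sensitivity_ma

# Example: a 35 mA leakage to earth trips a 30 mA RCD but not a 100 mA one.
print(rcd_trips(10.035, 10.0, sensitivity_ma=30.0))   # True
print(rcd_trips(10.035, 10.0, sensitivity_ma=100.0))  # False
```

The same comparison explains why a 30 mA device is specified where protection against electric shock is needed, while a higher-rated device may be acceptable where the goal is only fire protection.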

The other options, while relevant to how RCDs operate, do not define 'sensitivity.' The voltage level required to trip concerns applied voltage rather than current imbalance. The time delay before tripping describes the speed of response, not the magnitude of leakage the device detects. Maximum load capacity refers to how much load the circuit can carry, which is likewise separate from the concept of sensitivity.
