Ohm's Law

In an electrical circuit, the cell is the source of electrical energy: it maintains the desired potential difference between the ends of a conductor, and it is this potential difference that drives the current in the circuit. It is therefore natural to ask how the potential difference across the ends of a conductor is related to the magnitude of the current that flows through it. This very important relation was discovered by Georg Simon Ohm (Germany) in the year 1827. On the basis of his experimental observations he formulated a law, known in his honour as Ohm’s Law, which states that the current flowing through a conductor is directly proportional to the potential difference across its ends, provided the physical conditions of the conductor (such as its temperature) remain the same.
Potential difference ∝ current
or V ∝ I
or V = IR
where R is the constant of proportionality, called the resistance of the given conductor. Its unit is the ohm, represented by the symbol Ω.
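As a quick numerical sketch (the values and the function name below are hypothetical, chosen only for illustration), Ohm’s law lets us compute the resistance from a measured voltage and current:

    # Ohm's law: V = IR, so R = V / I
    def resistance(voltage_v, current_a):
        """Return resistance in ohms, given V in volts and I in amperes."""
        return voltage_v / current_a

    print(resistance(12.0, 0.5))  # a 12 V source driving 0.5 A -> 24.0 ohm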
Definition of one Ohm
The resistance of a conductor is said to be 1 ohm if a potential difference of 1 volt across its ends drives a current of 1 ampere through it.
If the potential difference (V) is 1 volt and the current (I) is 1 ampere, the resistance (R) is 1 ohm. From Ohm’s law we have R = V/I, so 1 ohm = 1 volt / 1 ampere. Rearranging gives I = V/R.
Thus we can conclude that, for a given potential difference, the current is inversely proportional to the resistance: if the resistance of the wire is doubled, the current is halved. A variable resistor can therefore be used to increase or decrease the current without changing the voltage source.
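A minimal sketch of this inverse relationship, assuming a fixed (hypothetical) 6 V source:

    V = 6.0            # fixed potential difference in volts (hypothetical value)
    R1, R2 = 3.0, 6.0  # the second resistance is double the first, in ohms
    print(V / R1)      # 2.0 A
    print(V / R2)      # 1.0 A -- doubling the resistance halves the current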
The V-I graph for a nichrome wire is a straight line: as the current through the wire increases, the potential difference across it increases linearly. This confirms Ohm’s law.
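As a sketch of how such a graph is used, the resistance can be estimated as the slope of the V-I line; the readings below are hypothetical:

    import numpy as np

    # Hypothetical V-I readings for a nichrome wire
    I = np.array([0.1, 0.2, 0.3, 0.4, 0.5])       # current in amperes
    V = np.array([0.52, 1.01, 1.49, 2.03, 2.48])  # potential difference in volts

    slope, intercept = np.polyfit(I, V, 1)        # fit V = slope * I + intercept
    print(f"R is approximately {slope:.2f} ohm")  # the slope estimates R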

