Definition of Transformer:
A transformer is a static machine used for transferring electrical power from one circuit to another without changing the frequency. This is a very basic definition of a transformer.
History of Transformer:
The history of the transformer began in the year 1880. In 1950, the 400 kV electrical power transformer was introduced into high-voltage electrical power systems. In the early 1970s, unit ratings as large as 1100 MVA were being produced, and 800 kV and even higher voltage class transformers were manufactured in the 1980s.
Losses in Transformer:
As the electrical transformer is a static device, mechanical loss normally does not come into the picture; we generally consider only the electrical losses in a transformer. Loss in any machine is broadly defined as the difference between input power and output power.
When input power is supplied to the primary of a transformer, some portion of that power is used to compensate the core losses, i.e. the hysteresis loss and the eddy current loss in the transformer core, and some portion of the input power is lost as I²R loss and dissipated as heat in the primary and secondary windings, because these windings have some internal resistance. The first is called core loss or iron loss, and the latter is known as ohmic loss or copper loss.
Another loss occurring in a transformer, known as stray loss, is due to stray flux linking with the mechanical structure and the winding conductors.
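As a rough illustration of this power balance, the sketch below (Python, with made-up winding resistances, currents, and power figures, not data from this article) splits the total loss into the copper component and the remaining core-plus-stray component:

    # A minimal sketch of the transformer power balance described above.
    # All numeric values are illustrative assumptions.

    def transformer_losses(p_in_w, p_out_w, i_primary_a, r_primary_ohm,
                           i_secondary_a, r_secondary_ohm):
        """Split total loss (input minus output) into copper loss and the rest.

        Copper (ohmic) loss is I^2 * R in each winding; whatever remains of
        the total loss is attributed to core loss plus stray loss.
        """
        total_loss = p_in_w - p_out_w
        copper_loss = (i_primary_a ** 2 * r_primary_ohm
                       + i_secondary_a ** 2 * r_secondary_ohm)
        core_and_stray_loss = total_loss - copper_loss
        return total_loss, copper_loss, core_and_stray_loss

    # Example with assumed values for a small distribution transformer:
    total, cu, core = transformer_losses(
        p_in_w=10_500, p_out_w=10_000,                 # 10.5 kW in, 10 kW out
        i_primary_a=45.5, r_primary_ohm=0.10,          # primary winding
        i_secondary_a=434.8, r_secondary_ohm=0.001,    # secondary winding
    )
    print(f"total: {total:.0f} W, copper: {cu:.0f} W, core+stray: {core:.0f} W")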
There are two types of losses in a transformer:
1. Copper Losses
2. Iron Losses or Core Losses or Insulation Losses
Copper losses (I²R) are variable losses that depend on the current passing through the transformer windings, while iron losses (core or insulation losses) depend on the voltage.
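The sketch below illustrates this fixed-versus-variable behaviour, using assumed rated-loss figures: the iron loss stays constant at rated voltage regardless of load, while the copper loss scales with the square of the load current:

    # A small sketch of the fixed-versus-variable loss behaviour above.
    # The rated-loss figures are assumptions for illustration.

    IRON_LOSS_W = 120.0          # constant at rated voltage, load-independent
    COPPER_LOSS_RATED_W = 400.0  # I^2 R loss at rated (full-load) current

    def losses_at_load(load_fraction):
        """Copper loss scales with the square of load current; iron loss does not."""
        copper = COPPER_LOSS_RATED_W * load_fraction ** 2
        return IRON_LOSS_W, copper

    for frac in (0.25, 0.5, 0.75, 1.0):
        iron, copper = losses_at_load(frac)
        print(f"{frac:>4.0%} load -> iron {iron:.0f} W, copper {copper:.0f} W")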
So the transformer is designed for rated voltage (which fixes the iron loss) and rated current (which fixes the copper loss). We cannot predict the power factor while designing the machine, because the power factor depends on the load, which varies from time to time.
When a manufacturer builds a transformer, UPS, etc., they have no idea of the type of load that will be connected, and consequently they can only rate the device according to the maximum current its conductors can safely carry (at unity power factor) and the insulation rating of the conductors (voltage and temperature).
That is why the transformer rating is expressed in kVA, not in kW.
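A small sketch of that rating argument, with an assumed 100 kVA rating: the apparent power the transformer can supply is fixed by its current and voltage limits, but the real power in kW it delivers depends on the load's power factor:

    # Sketch of why ratings are in kVA: the real power (kW) delivered depends
    # on the load's power factor, which the manufacturer cannot know in
    # advance. The 100 kVA rating here is an illustrative assumption.

    RATING_KVA = 100.0  # apparent-power rating, set by conductor/insulation limits

    for power_factor in (1.0, 0.9, 0.8, 0.6):
        real_power_kw = RATING_KVA * power_factor  # kW actually delivered
        print(f"pf={power_factor:.1f}: {RATING_KVA:.0f} kVA supplies "
              f"{real_power_kw:.0f} kW of real power")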
Source: www.studyelectrical.com