Explaining Gate Capacitance vs Gate Charge?
Inducktion, Fri Sept 18 2020, 04:51AM
I've been spending a lot of time researching the differences between various power MOSFETs, and I'm a bit confused.
There's gate capacitance, and there's gate charge. I understand that gate charge increases as you increase the gate drive voltage. But what exactly *is* gate charge?
How does this affect gate driving? Gate capacitance seems to be the main spec most people look at in a MOSFET, and yet gate charge can vary WIDELY between FETs of similar current and voltage ratings, whereas gate capacitance varies by a much smaller margin.
For example, take these two MOSFETs:
The IRF200P222 has a gate capacitance of 9820 pF and a total gate charge of 135 nC at 10 VGS. It's rated at 200 V, 182 A.
Meanwhile, the APT20M18 has a gate capacitance of 9880 pF but a total gate charge of 330 nC at 10 VGS! Yet it's rated at 200 V, 100 A!
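Here's the back-of-the-envelope arithmetic I've been doing with those datasheet numbers (a quick Python sketch; dividing total gate charge by the drive voltage gives an "effective" capacitance, which for both parts comes out several times larger than the small-signal Ciss figure):

```python
# Sanity-check arithmetic on the datasheet numbers quoted above.
# "Effective" gate capacitance = total gate charge / gate drive voltage;
# unlike the small-signal Ciss spec, this folds in the Miller
# (gate-drain) charge delivered during switching.

VGS = 10.0  # V, gate drive voltage used for the Qg spec

parts = {  # Ciss in pF and total gate charge Qg in nC, both at VGS = 10 V
    "IRF200P222": {"ciss_pF": 9820, "qg_nC": 135},
    "APT20M18":   {"ciss_pF": 9880, "qg_nC": 330},
}

eff_pF = {}
for name, p in parts.items():
    # Qg/Vgs, converted from nC/V to pF
    eff_pF[name] = p["qg_nC"] * 1e-9 / VGS * 1e12
    print(f"{name}: Ciss = {p['ciss_pF']} pF, Qg/Vgs = {eff_pF[name]:.0f} pF")
```

That works out to about 13500 pF for the IRF200P222 and 33000 pF for the APT20M18, so the gate charge spec implies a much bigger load on the driver than Ciss alone suggests, even though the two Ciss figures are nearly identical.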
How does this work?
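For scale, here's how I've been estimating what those charges demand from a gate driver (another sketch; the 100 kHz switching frequency is just a number I picked for illustration, not from either datasheet):

```python
# Rough gate-drive load estimate: average drive current = Qg * fsw,
# drive power = Qg * Vgs * fsw. The 100 kHz switching frequency is an
# assumed, illustrative value.

FSW = 100e3  # Hz, assumed switching frequency
VGS = 10.0   # V, gate drive voltage from the Qg spec

for name, qg_nC in [("IRF200P222", 135), ("APT20M18", 330)]:
    qg = qg_nC * 1e-9            # nC -> C
    i_avg = qg * FSW             # average gate drive current, A
    p_drive = qg * VGS * FSW     # power dissipated driving the gate, W
    print(f"{name}: avg drive current = {i_avg * 1e3:.1f} mA, "
          f"drive power = {p_drive:.3f} W")
```

If I've got that right, the APT20M18 needs roughly 2.4x the drive current and drive power of the IRF200P222 at the same frequency, despite the near-identical gate capacitance.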
And on a different but related note: why do people still use the standard IRFP4xx - IRFP2xx FETs so much when seemingly much better FETs are available nowadays? Is it because they're easy to drive, or just more or less "tradition" in some sense?