The Designer's Guide Community Forum
3.3V vs 14V (Read 2455 times)
grosser
Community Member
Posts: 57
3.3V vs 14V
Nov 20th, 2012, 2:05pm
 
Hello,

Will it be safe to use 3.3V transistors with a 4.2V supply? The gate-oxide breakdown and diode junction breakdown are at 7V, and the punch-through voltage is 6.6V. It is a 0.35um process.
Or should I use the HV transistors rated for 14V?

Regards,
grosser
boe
Community Fellow
Posts: 615
Re: 3.3V vs 14V
Reply #1 - Nov 20th, 2012, 2:55pm
 
Grosser,

this is determined by the reliability of the transistor with respect to hot-electron injection; it does not depend on the breakdown or punch-through voltage.

You need reliability data from your fab to answer that; the issue will be most critical for minimum-length transistors.

- B O E
analog_wiz
Junior Member
Posts: 31
Universe
Re: 3.3V vs 14V
Reply #2 - Nov 20th, 2012, 7:05pm
 
Hi Grosser, it would depend on the topology you are using. What you can do is run a transient simulation and measure the voltages across the transistor junctions (Eldo has something called reliability simulations...). You could also use UltraSim; it has reliability checks for the voltages across the various junctions and can flag violations, if any, when a transient simulation is run.
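
If your flow does not have such checks built in, a quick post-processing script does a similar job. Below is a minimal Python sketch, assuming the transient node waveforms have been exported to a CSV with columns t, vg, vd, vs; the file name, column names, and the 3.3 V limits are assumptions for illustration, not Eldo or UltraSim features.

import csv

# Assumed limits for a 3.3 V device; the real numbers come from the fab's
# reliability documentation (SOA / hot-carrier data), not from breakdown alone.
VGS_VGD_LIMIT = 3.3
VDS_LIMIT = 3.3

violations = []
with open("tran_waveforms.csv") as f:   # assumed export: columns t, vg, vd, vs (volts)
    for row in csv.DictReader(f):
        t = float(row["t"])
        vg, vd, vs = float(row["vg"]), float(row["vd"]), float(row["vs"])
        pairs = {"VGS": vg - vs, "VGD": vg - vd, "VDS": vd - vs}
        for name, v in pairs.items():
            limit = VDS_LIMIT if name == "VDS" else VGS_VGD_LIMIT
            if abs(v) > limit:
                violations.append((t, name, v))

for t, name, v in violations:
    print(f"t = {t:.3e} s: {name} = {v:.3f} V exceeds the {name} limit")

The built-in Eldo/UltraSim checks mentioned above are the more complete route; a script like this is just a quick sanity check on the exported waveforms.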
carlgrace
Senior Member
Posts: 231
Berkeley, CA
Re: 3.3V vs 14V
Reply #3 - Nov 21st, 2012, 9:21am
 
The voltage requirements are based on differences between terminals (except for things like well breakdown).

If you don't ever have the 4.2 V across a gate-source or drain-source junction, you're golden. Be sure to carefully check the startup sequence and the ESD structures on your chip, though... you may need a custom pad (it depends on your pad library, of course).
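
To make the "differences between terminals" point concrete, here is a rough Python sketch of a start-up check; it is an illustration only, with assumed ramp times and limits: the gate node follows the 3.3 V domain, the drain node follows the 4.2 V domain, and the source is grounded.

def ramp(t, t_ramp, v_final):
    # Piecewise-linear supply ramp: 0 V at t = 0, v_final after t_ramp seconds.
    return v_final if t >= t_ramp else v_final * t / t_ramp

LIMIT = 3.3          # assumed maximum terminal-to-terminal voltage for the 3.3 V device
T_RAMP_33 = 2.0e-3   # assumed: 3.3 V domain ramps up in 2 ms
T_RAMP_42 = 0.5e-3   # assumed: 4.2 V domain ramps up in 0.5 ms

worst = {"VGS": 0.0, "VGD": 0.0, "VDS": 0.0}
steps = 1000
for i in range(steps + 1):
    t = 2.5e-3 * i / steps           # sweep the start-up window 0..2.5 ms
    vg = ramp(t, T_RAMP_33, 3.3)     # gate node tracks the 3.3 V domain
    vd = ramp(t, T_RAMP_42, 4.2)     # drain node tracks the 4.2 V domain
    vs = 0.0                         # source grounded
    for name, v in (("VGS", vg - vs), ("VGD", vg - vd), ("VDS", vd - vs)):
        worst[name] = max(worst[name], abs(v))

for name, v in worst.items():
    print(f"worst |{name}| = {v:.2f} V  ({'VIOLATION' if v > LIMIT else 'ok'})")

With these assumed ramp rates, the absolute node voltages never exceed 4.2 V relative to ground, yet VGD and VDS both overshoot the 3.3 V rating while the supplies are coming up; only the pairwise differences matter, which is why the start-up sequence needs its own check.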
Lex
Senior Member
Posts: 201
Eindhoven, Holland
Re: 3.3V vs 14V
Reply #4 - Nov 22nd, 2012, 7:57am
 
carlgrace wrote on Nov 21st, 2012, 9:21am:
The voltage requirements are based on differences between terminals (except for things like well breakdown).

If you don't ever have the 4.2 V across a gate-source or drain-source junction you're golden.  ...


I would think that VGS and VGD should not exceed 3.3V, and I would expect that VDS can handle higher voltages.