Electric Automation Forum
Topics: Digital Control Delay on Power Supply
#1
Started by
Ray Ridley
09-23-2013 06:59 AM

Digital Control Delay

Here is a question for those of you who are pushing hard in the digital design domain.

When you have done all the tricks of oversampling, acceleration, etc., what is the smallest delay that you have achieved in implementing the compensation algorithm? (Both in absolute time and as a fraction of a switching period.)

I'm not talking about the final comparator in the inner loop; let's assume that has reverted to an analog comparator to avoid delays there.
Top #2
Ray Ridley
09-23-2013 09:12 AM
Please also add which controller you are using to achieve this.
Top #3
Andrew Ferencz
09-23-2013 12:08 PM
With the dsPIC you can get to about 30 instruction cycles at 25 ns per instruction. They are introducing a new part that will reduce this further, as it has additional sets of working registers to reduce context saving. Of course, it depends on the control law; a simple PI takes very few instructions to execute. At a minimum I scale the A/D value, run the PI, check for under/overflow of the control output, set the output, and set a flag that a new A/D result is available. The dsPIC has versions that run faster than a 40 MHz instruction clock, but I have not used them yet.
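
[A minimal sketch of the sequence Andrew describes, in generic C rather than dsPIC assembly. PWM_DUTY, the Q15 gains, and the count limits are placeholder assumptions for illustration, not real dsPIC registers or tuned values.]

    #include <stdint.h>

    /* Stand-in for the PWM duty register; in real firmware this
       would be the controller's SFR. */
    volatile uint16_t PWM_DUTY;

    #define DUTY_MIN 0
    #define DUTY_MAX 1000          /* assumed PWM period in timer counts */

    static int32_t integrator;     /* integral accumulator, Q15 */
    static volatile uint8_t new_sample; /* flag: fresh duty value written */

    /* Called from the ADC-done interrupt with the raw conversion result. */
    void control_isr(uint16_t adc_raw)
    {
        const int32_t KP   = 12000; /* Q15 proportional gain (illustrative) */
        const int32_t KI   = 300;   /* Q15 integral gain (illustrative) */
        const int32_t VREF = 2048;  /* setpoint in ADC counts (illustrative) */

        /* 1. Scale the A/D value and form the error. */
        int32_t error = VREF - (int32_t)adc_raw;

        /* 2. PI: integrate, then combine with the proportional term
              (arithmetic right shift on signed values assumed). */
        integrator += KI * error;
        int32_t u = (KP * error + integrator) >> 15;

        /* 3. Check for under/overflow of the control output; back out
              the last integration step as simple anti-windup. */
        if (u > DUTY_MAX)      { u = DUTY_MAX; integrator -= KI * error; }
        else if (u < DUTY_MIN) { u = DUTY_MIN; integrator -= KI * error; }

        /* 4. Set the output. */
        PWM_DUTY = (uint16_t)u;

        /* 5. Flag that a new A/D result has been processed. */
        new_sample = 1;
    }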
Top #4
Ray Ridley
09-23-2013 02:20 PM
Andrew: thanks for the response. Let's say it's a PID.

How about the final number that you get for the time delay? Not theoretical, but actually achieved so far.
Top #5
Siyu He
09-23-2013 04:24 PM
Hi,
Would you help define the 'delay'? The textbooks and application reports I have read so far use two different 'delays'. One is from the sampling point to the duty-cycle update point, usually one or more switching periods. The other is from the sampling point to the switching point. Also, the ZOH formed by the ADC and DPWM will introduce phase delay.
Are you just talking about the calculation delay?
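
[For reference on the ZOH point: the usual small-signal approximation, with T_s the sampling period, models the ZOH as a half-sample delay:

    G_{\mathrm{ZOH}}(s) = \frac{1 - e^{-sT_s}}{sT_s} \approx e^{-sT_s/2},
    \qquad
    \phi_{\mathrm{ZOH}}(f) \approx -360^{\circ} \cdot f \cdot \frac{T_s}{2}

so the ZOH alone costs about half a sample period of effective delay.]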
Top #6
Ray Ridley
09-23-2013 07:02 PM
I am talking about the delay from the sampling point to the duty cycle update point.

So it is the ADC time, plus calculation time, plus the hold time before it gets used.

It used to be one or two switching periods, you are correct. However, that's not fast enough to effectively control a switching power supply with good bandwidth, so the time has come down drastically in recent years.
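
[To put illustrative numbers on this (my arithmetic, not figures quoted in the thread): a pure delay T_d costs phase at the crossover frequency f_c of

    \phi_{\mathrm{loss}} = 360^{\circ} \cdot f_c \cdot T_d

For example, one full switching period of delay at f_sw = 500 kHz (T_d = 2 us) with crossover at f_c = 50 kHz gives 360 x 50 kHz x 2 us = 36 degrees of phase loss, while a 200 ns delay at the same crossover costs only 3.6 degrees.]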
Top #7
Siyu He
09-23-2013 09:28 PM
If the delay is defined like this, I think that as long as all the instructions are done within one switching period, the total delay is one switching period. Reducing the delay should probably be combined with increasing the switching frequency.
Top #8
ron vinsant
09-23-2013 11:46 PM
The XRP7724, which is a DC/DC converter, has a group delay of only 200 ns. This is the delay from the sample of Vout to the output of the driver from the DPWM.
Top #9
Ray Ridley
09-24-2013 02:02 AM
That's impressive. For that example, what frequency are you running the converter at?

And of the 200 ns, how much is for the ADC, and how much is computation time?
Top #10
Haifeng Lu
09-24-2013 04:04 AM
According to your definition of delay, I think the effect of the PWM period is dominant. In my application, it is 1.5 Tpwm.
Top #11
John Billingsley
09-24-2013 06:54 AM
Early Zilker Labs parts oversampled at 16x Fsw, averaged, then downsampled the average to Fsw, with a skew of Tsw*3/16.
Fsw could be from 200-1400 kHz, so that's roughly 133 ns from the last 16x sample.
Their latest part (Intersil ZL8800?) can generate a duty cycle update 4 times per switching period to extend or shorten the current duty cycle, or cause the next one to initiate early or late, in response to a transient load. I still think the skew is about the same, though.
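
[Checking that number (my arithmetic, assuming the 133 ns figure corresponds to the top of the frequency range): at f_sw = 1400 kHz,

    T_{sw} = \frac{1}{1.4\ \mathrm{MHz}} \approx 714\ \mathrm{ns},
    \qquad
    \frac{3}{16} \cdot T_{sw} \approx 134\ \mathrm{ns}

which agrees with the roughly 133 ns John quotes.]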