Full Disclosure mailing list archives

Re: Xfree86 video buffering?


From: James Tucker <jftucker () gmail com>
Date: Fri, 25 Feb 2005 13:08:20 +0000

On Thu, 24 Feb 2005 23:26:36 -0500, Valdis.Kletnieks () vt edu
<Valdis.Kletnieks () vt edu> wrote:
> I don't think this is at all easily solvable - when the X server starts up,
> the card is probably in console mode using the VGA emulation, which is pretty
> brain-dead and doesn't touch much of the card memory (when you have 32M or
> 64M on-card, that 640x480 gets lonely sitting in the corner).  The X server
> first has to pop it into the native NVidia/ATI/whatever graphics mode
> (remember, it has to do that *before* it can access the video memory - you
> can't get there while still in VGA emulation).

Is it not possible to switch processing modes without switching video modes?
Could a fast(er) operation, such as loading a blank intermediate palette, be
used to 'wash out' all coloured pixels on the screen?
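
Something along these lines, say (a rough userspace sketch, assuming Linux,
root privileges for ioperm(), and that the card still honours the standard
VGA DAC registers at that point - and note it only helps in palette-indexed
modes, since direct-colour modes bypass the DAC lookup entirely):

    /* Rough sketch: write an all-black palette so any stale pixels
     * render black.  Assumes Linux and root (for ioperm()). */
    #include <sys/io.h>

    static void blank_vga_palette(void)
    {
        int i;

        outb(0, 0x3C8);               /* DAC write index: start at entry 0 */
        for (i = 0; i < 256; i++) {   /* 256 entries of R, G, B */
            outb(0, 0x3C9);           /* red   */
            outb(0, 0x3C9);           /* green */
            outb(0, 0x3C9);           /* blue  */
        }
    }

    int main(void)
    {
        if (ioperm(0x3C8, 2, 1) != 0) /* access to ports 0x3C8 and 0x3C9 */
            return 1;
        blank_vga_palette();
        return 0;
    }

A few hundred port writes like that would be far cheaper than clearing 32M
or 64M of video RAM, which is the attraction.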

> Then it can proceed to clear out the on-card memory.  Unfortunately, if the
> X server pauses in between setting the mode and clearing the memory, you get
> to see the uninitialized (and therefore left-over) buffers.

Damn busy waiting...

> About the best you can do here is fix the server to try not to do any
> time-intensive operations between the mode set and the clear.

Isn't this a problem when it comes to POSIX compliance?
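
Either way, the fix presumably amounts to a pure ordering change in the
driver's screen init - something like this minimal sketch, with hypothetical
hook names rather than the actual Xfree86 driver API:

    #include <stddef.h>
    #include <string.h>

    /* Hypothetical driver state and hooks -- illustrative names only. */
    struct card {
        void  *fb_base;  /* mapped framebuffer aperture */
        size_t fb_size;  /* all on-card memory, not just the screen */
    };

    void set_native_mode(struct card *card);  /* leave VGA emulation */
    void load_fonts(struct card *card);       /* example of slow setup */

    void screen_init(struct card *card)
    {
        set_native_mode(card);

        /* Clear immediately: any work inserted between these two calls
         * is exactly the window in which last session's buffers show. */
        memset(card->fb_base, 0, card->fb_size);

        /* Expensive, non-urgent initialisation only after the clear. */
        load_fonts(card);
    }

Clearing the whole aperture rather than just the visible screen would also
cover off-screen areas such as the pixmap cache, which is presumably where
the more interesting left-overs live.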
 
> There are multiple reasons why it can even survive a power cycle.  In my
> case, I've only been powering off for a few seconds (stupid laptop doesn't
> have an NMI Reset button, which would be quite helpful when doing
> kernel-level hacking), and the voltage levels in the RAM hadn't decayed all
> the way to bit lossage.

It's like my father always said: turn it off, count to thirty, now you
can go again... ;-)
I knew the electronics engineers were keeping something from us... now
what about that third 'bit'?
_______________________________________________
Full-Disclosure - We believe in it.
Charter: http://lists.netsys.com/full-disclosure-charter.html

