[CentOS] Re: dragging windows leaves traces -- running older GeForce drivers on newer cards

Bryan J. Smith <b.j.smith@ieee.org>

thebs413 at earthlink.net
Mon Jun 6 16:15:03 UTC 2005


From: Robin Mordasiewicz <robin at bullseye.tv>
> I have an excellent nvidia graphics card, I get excellent fps, all agp 
> features are installed correctly etc.

In Windows or Linux?  And by "all agp features are installed correctly,"
what exactly do you mean, and how are you verifying that?

> When I drag windows around with the opaque settings on it leaves traces 
> for a couple milliseconds as it redraws the window, but it just looks so 
> crummy to see.

Hmmm, sounds like a window manager/environment issue, unless you're
having a video card issue (see below).  What is your desktop environment?

> I am having a hard time convincing people that using linux is a good idea.

Umm, what Windows version offers translucent settings comparable to Linux
desktops?  None that I know of.  And by "opaque," do you mean translucent/
"see through" is turned on, but set to 0%?  If so, that's still translucent.

No shipping Windows or Linux distribution I know of uses the video card's
framebuffer to off-load such complex rendering.  It's done 100% in software,
which is why Microsoft doesn't bother to offer it, and why Linux desktops
are slow at it.

The only major OS I know of that ships using OpenGL to manage window
framebuffers directly in video memory is MacOS X, via "Quartz Extreme."

Microsoft was adding this capability to NT6.0 "Longhorn" in its new Avalon
desktop, but that development has been pushed back and out of the NT6.0
"Longhorn" client.  To put a "good face" on this, Microsoft is still releasing
the NT6.0 "Longhorn" client with its next-generation Windows Graphics
Foundation (WGF), but only version 1.0, which uses existing DirectX 9 code.
In a nutshell, DirectX 9 still doesn't have the full geometry setup of OpenGL,
which makes it far, far less capable than what Quartz Extreme does with
age-old OpenGL 1.2+.  WGF 2.0, which incorporates many new features to
address these deficiencies (in a nutshell, what was DirectX 10), will ship at
a later date and supposedly address WGF 1.0's shortcomings (I'll believe it
when I see it -- there are still a lot of legacy GDI issues to overcome).

For more on this, see my PC_Support post here:  
http://lists.leap-cf.org/pipermail/pc_support/2005-May/000333.html  

Now in the Linux world, the FreeDesktop.org "Cairo" project (not to be
confused with the original NT4.0 release codename) adds many things,
including direct window-to-video framebuffer rendering via OpenGL over
X11 (GLX).  It is being adopted by a number of desktop suites, including
Enlightenment R17 as well as GNOME 3.  A few window managers already
implement it, or their own solution.  Sun's LookingGlass desktop actually
uses a non-X11 system (and quite a number of other, innovative APIs --
including 3D _input_ as _standard_), but offers X11/GLX for compatibility
(and, thus, should support "Cairo" interfaces as well).

But until these features ship as standard in either Windows or Linux
distros (think Fedora Core 5 / RHEL 5 / CentOS 5 time-frame ;-), don't
expect good performance with opaque desktops where the software is
doing really slow and ugly memory maps to/from the video card, over
the system interconnect, etc...  I.e., don't blame Linux for attempting
to implement a feature in software that Microsoft doesn't even care to
attempt, because it has the same issues.  ;->

> People make judgements on this type of thing and they think that
> linux is slow compared to Windows.
> When I had windows installed on this same machine dragging
> windows around did not produce any traces.

When you say opaque, do you mean translucent but set to 0% in Linux?
Sometimes people enable this feature without realizing they're adding a
massive amount of overhead to the software framebuffer of the window
manager.

> Using linux on the desktop looks like a throwback to windows 95 on a 486.
> Does anyone have any advice on how to make X redraw windows better.

Turn off any translucent features, period.  You don't want them enabled.
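
For example, if the desktop in question is GNOME with Metacity, something
along these lines should kill the expensive opaque-move behavior (a rough
sketch -- I'm assuming a GNOME 2.x release that actually has this gconf
key; older releases may not):

  # assumes GNOME 2.x / Metacity with gconftool-2 installed; the
  # reduced_resources key makes moves/resizes wireframe instead of opaque
  gconftool-2 --type bool --set /apps/metacity/general/reduced_resources true

KDE 3 users would instead look in Control Center under Desktop -> Window
Behavior and turn off the "display content in moving/resizing windows"
options.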


From: Sukru TIKVES <sukru at cs.hacettepe.edu.tr>
> Are you using the x.org driver (nv) or the nvidia one?

I'm going to guess the former, or possibly an earlier version of the latter --
unless he has some translucent settings enabled.  It would be nice to know
exactly what is enabled.


From: Robin Mordasiewicz <robin at bullseye.tv>
> I am using the nvidia driver, with XFree86 centos3.4
> Moving windows is mostly smooth, not all applications leave traces as X 
> redraws, but I have a dual 3gig machine with a top nvidia card.
> Also when resizing windows it seems to redraw much slower than windows.

What exact video card on what exact nVidia driver release?

Although nVidia has maintained a level of upward compatibility since the
original NV0x series (Riva TNT/TNT2) up to the latest NV4x (GeForce 6000
series), if you load an older driver that was never designed for the latest
series, you will get similar symptoms.
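
If you're not sure of either, both are easy to check from a shell (a quick
sketch -- paths assume a stock CentOS 3 / XFree86 install):

  # identify the exact card
  /sbin/lspci | grep -i vga

  # if the closed nVidia kernel module is loaded, this reports its version
  cat /proc/driver/nvidia/version

  # otherwise, see which driver and version the X server actually loaded
  grep -i driver /var/log/XFree86.0.log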

Example 1:  NV2x/NV30-era XFree86 (FC1) with NV40 card

Last summer I had upgraded another system, a dual Athlon MP2400+,
from a GeForce4 Ti4200 (NV25) to a GeForce 6800GT (NV40).  Running
Fedora Core 1 at the time, I was using a similar XFree86 release
as RHEL/CentOS 3.  That version includes only NV2x (GeForce3/4Ti) and
early NV30 (GeForce FX5800) era support.  I got the same results until
I upgraded to the nVidia driver.

Example 2:  NV1x-era Windows XP with NV40 card

Just this past April, I loaded _Windows_XP on a brand new Athlon64
3200+ system with a GeForce 6800GT (NV40) card.  XP detected a GeForce
compatible card and loaded the driver from when the NV1x series was
current (GeForce2 era).  I experienced the _exact_same_ issues in XP until
I grabbed the latest WHQL driver for XP.

Example 3:  NV4x-era Xorg (FC3) with NV40 card

Now on the _very_same_system_, I loaded Fedora Core 3 AMD64, which
includes a newer Xorg release with support for not only the NV40
(GeForce 6800), but even some NV41/43 (GeForce 6200/6600) and the latest
NV45 (GeForce 6800, PCIe native).  So I didn't have an issue even using the
stock Xorg driver.

So, are you sure you are using the nVidia driver _downloaded_ from
nVidia?  Or are you simply using the nVidia (nv) driver CentOS 3 is
detecting when installed?  I want to say, given your symptoms,
you're using the latter, which (even with update 4) is still an NV2x/NV30
driver, and not good for later NV31/34 (GeForce FX5200/5500) let
alone NV4x (GeForce 6000 series) cards.
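
A quick way to tell is to look at the Device section of your X config (a
sketch -- assuming the stock /etc/X11/XF86Config location on CentOS 3;
some installs use XF86Config-4 instead):

  # Driver "nv"     = the open 2D driver bundled with XFree86
  # Driver "nvidia" = the unified driver downloaded from nVidia
  grep -A 3 'Section "Device"' /etc/X11/XF86Config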

And if you are using the nVidia driver, please note that you want to be
running with the _latest_ 1.0-6xxx series driver (avoid the new 1.0-7xxx
series driver until it matures).  The 1.0-6xxx series driver is required
for NV40 (GeForce 6800 series), and later versions for NV41/43/NV45.


From: Sukru TIKVES <sukru at cs.hacettepe.edu.tr>
> As far as I know there are significant speed improvements between xorg 
> and XFree86 servers, but I may be wrong.

The Xorg and XFree86 releases are pretty much in tandem.  What most Red
Hat users may not realize is that the last XFree86 in any Red Hat release
(FC1/RHEL3) was from the NV2x/NV30 era, and Xorg was adopted around the
time the nv driver added NV3x/NV40 support (FC2+).  XFree86 has matched
those updates upstream, but those newer XFree86 releases just aren't in
any Red Hat distro.

> I guess CentOS is not suited for desktop machines, you may try your 
> chance with the upcoming FC4 release. (No, i am not talking about 
> workstations).

Nope, that shouldn't be a factor.

> [ Btw, my machine was Athlon XP 2800+ with GeForce FX 5200. Linux 
> install was smooth as in Windows 95 smooth on a P4. ]

The GeForce FX5200 is the NV34 (or NV31?  I think that was the FX5900
or FX5700 -- the higher designation is not necessarily faster, just newer,
and can actually be an economy design).  Anything Fedora Core 2+
(including CentOS 4) should support it, or possibly late Fedora Core 1
(and CentOS 3) with the final XFree86 updates.

<TANGENT>

BTW, I know people get into the big "proprietary" debate over nVidia.
Just some facts to be aware of:  

1)  nVidia released the source code back in the XFree86 3.3.x days

2)  Intel, Microsoft and other 3rd parties who owned IP in that release
were not amused, and even though nVidia obfuscated the identifiers
and structure of the C code, people started "cleaning it up" and exposing
what Intel, Microsoft and others did not want exposed.

3)  XFree86 4.0, which is MIT licensed, introduced a modular design that
supports closed source drivers, which nVidia and Matrox quickly took
advantage of.

4)  It was at that time that nVidia decided to "unify" its entire NV0x-onward
driver model (Riva TNT and GeForce onward, but not the earlier, pre-NV0x
Riva 128).  This has driven all their other decisions.

5)  As part of the unified driver and its AGPgart and memory functions,
nVidia was contractually forbidden to release it under the GPL -- by Intel
itself.  Intel considered AGP a "trade secret" implementation of PCI (AGP
is _not_ a PCI Standards Group standard), which is why nVidia's AGP
implementation and specifications were only in their closed video
driver (not even in the nForce chipset driver -- it only worked for
nForce+GeForce).  The Intel AGPgart in the kernel is largely a clean-room
design (and rather limited).

NOTE:  #5 is going away.  With the release of PCI-Express by the
PCI SG (an actual, open standard), Intel no longer considers AGP
a "trade secret" and has freed up some of nVidia's obligations.
There is already AGPgart code for nVidia chipsets in newer kernels,
so you don't have to have a GeForce on an nForce chipset.
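
If you're curious which AGPgart backend your kernel is actually using, a
quick check (a sketch -- module names vary from kernel to kernel, and the
nVidia chipset backend only exists in newer kernels):

  # list whatever agpgart backends are currently loaded
  /sbin/lsmod | grep -i agp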

6)  nVidia puts more developers on the MIT-licensed XFree86/Xorg
drivers than anyone else.  nVidia is completely open with its 2D
drivers and specifications, including video-in/video-out and other
scan converters.  In a nutshell, the 2D MIT (nv) driver is very, very
good -- as long as it is recent enough to support your card -- because
nVidia supports it.  Of course, older releases with newer cards will
always be a performance issue, and system-specific implementations
(such as the Go mobile series in notebooks) will definitely lack
compatibility.

The nVidia-licensed "nvidia" driver is a unified 2D/3D driver with complete
OpenGL standards support and ARB extensions, as well as a few nVidia-only
extensions (the latter are far, far less common in the OpenGL world).  The
OpenGL on X11 (GLX) implementation is unmatched, and without it, Linux
would not have been so widely adopted as the primary visual workstation
in the CAM and EDA world over the last 5-7 years (it would have remained
only a back-end solution).  If you just need 2D, you don't need the
"nvidia" driver, not even for TV scan conversion.  But the 3D version at
least gives you an "open standard" platform, meaning you write applications
to an "open standard" that does not require nVidia hardware.

nVidia drivers continue to be trusted, based on open standard GLX
performance and reliability.  I see a few people complaining about
the new 1.0-7xxx series of drivers, but that's not unexpected because
it's a brand new driver release with OpenGL 2.0 and some major
changes in how it handles VESA defaults (it's a bit more conservative,
which is breaking things that used to be autodetected).

7)  ATI tried to "go clean-room 3D" with its indirect support of UtahGLX
and the Direct Rendering Infrastructure (DRI) and the release of the R100
series (Radeon 7500-8000) and R200 series (Radeon 8500-9000).
These 3D implementations were add-ons to the stock MIT drivers.
Unfortunately, the lag time after each release and the lack of support for
the endless variants (e.g., RV280/Radeon 9200, etc...) often caused major
headaches and issues -- and performance suffered as well.  Worse yet
was the fact that nVidia had established itself as _the_ video card/
driver combo for production GLX.

So as of the R300 (Radeon 9500+) series, ATI has _withheld_ 3D
specifications.  In reality, ATI has always withheld not only some
3D, but even some 2D interfaces.  This is largely because ATI has some
really advanced 2D image encoding and other logic in their GPUs
that is partially driven by software/algorithms licensed from a
3rd party (whereas nVidia largely keeps such capabilities off-chip,
or uses commodity approaches/algorithms that it has no issue
documenting).  Although some of the UtahGLX/DRI code still works
for R300+ series cards, it is now severely lagging.

Which means ATI now offers a "closed source" unified driver as well.

8)  My final comments

The reality is that video card products are _obsoleted_ within 12
months.  GPUs double in performance 2-3 times faster than CPUs,
which means driver development goes hand-in-hand with engineering
development _pre-release_.  In other words, we can dream of a
day where there is no cutting-edge intellectual property in GPUs so
vendors can release drivers as GPL, and not have to hack kernel
interfaces because of 3rd party NDA obligations.  But if video card
drivers had to be community developed, they wouldn't be released until
the GPU was already 1-2 generations behind.

This is the reality of the product.

Now, luckily, OpenGL/GLX are "open standards," and most vendor
extensions are submitted to and formalized by an architectural
review board.  This is very different from the DirectX world, where
"features" are often vendor-only extensions, and not always
standardized in the API -- which is largely a wrapper around required,
physical and often eccentric GPU functions until, 3-4+ generations
later, they become commodity and all video cards have them.  This is
why many applications in Windows require specific video cards, and
why, even 2-3 generations later, those titles may still not run (if ever).

So by supporting the OpenGL/GLX "open standards," even if you are
running a proprietary, closed source driver to run your OpenGL/GLX
application, you will be able to run it in the future on another video
card.  Even if the ARB extensions utilized are only supported by 1-2
"cutting edge" video card vendors (namely ATI and nVidia) today, they
are typically supported by other cards within 1-2 generations, by the
very nature of how the OpenGL ARB works.

</TANGENT>


--
Bryan J. Smith   mailto:b.j.smith at ieee.org



