tags.
__glXInitializeVisualConfigFromTags doesn't skip the payload of
unrecognized tags. Instead, it treats the value as if it were the
next tag. That situation can arise when the server's GLX extension is
not Mesa's: for example, NVIDIA's server sends a
GLX_FLOAT_COMPONENTS_NV = 0 pair, causing
__glXInitializeVisualConfigFromTags to bail out early.
Signed-off-by: Aaron Plattner <[email protected]>
Signed-off-by: Ian Romanick <[email protected]>
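For illustration, a minimal sketch of parsing that consumes the value
of every (tag, value) pair, even for unrecognized tags; the struct and
names are stand-ins, not Mesa's actual code:

    #include <stdint.h>

    struct visual_config {
        int rgba;
        /* ... other attributes ... */
    };

    static void
    init_config_from_tags(struct visual_config *cfg,
                          const uint32_t *tags, int count)
    {
        for (int i = 0; i + 1 < count; i += 2) {
            uint32_t tag = tags[i];
            uint32_t value = tags[i + 1];

            switch (tag) {
            case 4: /* GLX_RGBA */
                cfg->rgba = (int) value;
                break;
            default:
                /* Unknown tag, e.g. GLX_FLOAT_COMPONENTS_NV from a
                 * non-Mesa server: skip its value instead of
                 * treating it as the next tag. */
                (void) value;
                break;
            }
        }
    }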
Re-add support for the vblank_mode environment variable and
configuration option. Useful for benchmarking and application control.
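A hedged sketch of how such a default might be read from the
environment (stand-in code, not Mesa's driconf lookup):

    #include <stdlib.h>

    /* 0 = never sync, 1 = application preference,
     * 2 = sync by default, 3 = always sync */
    static int
    get_vblank_mode(int config_default)
    {
        const char *env = getenv("vblank_mode");
        return env ? atoi(env) : config_default;
    }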
Add a new DRI2 configuration query extension. This allows DRI2 client
code to query common DRI2 configuration options.
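A hypothetical shape for such a query interface; the real extension
lives in Mesa's dri_interface.h, and these names are stand-ins:

    /* Query vtable: returns 0 on success, -1 if unknown option. */
    typedef struct dri2_config_query {
        int (*query_i)(void *screen, const char *name, int *value);
    } dri2_config_query;

    static int
    default_swap_interval(const dri2_config_query *ext, void *screen)
    {
        int interval = 1;   /* fall back to sync-to-vblank */
        if (ext && ext->query_i)
            ext->query_i(screen, "vblank_mode", &interval);
        return interval;
    }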
This should have been part of the last change...
In the direct rendered case, we need to tell the server our initial swap
interval. If we don't, the local and server values will be out of sync,
since the server and client defaults may be different (as they were
before this patch).
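In sketch form, with hypothetical helper names, the bind path pushes
the client default to the server:

    #include <X11/Xlib.h>

    /* Hypothetical wrapper around the DRI2 swap-interval request. */
    void dri2_set_swap_interval(Display *dpy, XID drawable, int interval);

    static void
    bind_drawable(Display *dpy, XID drawable)
    {
        /* Client default is 1 (sync to vblank); the server may
         * default to 0, so push our value explicitly. */
        dri2_set_swap_interval(dpy, drawable, 1);
    }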
https://bugs.freedesktop.org/show_bug.cgi?id=27628
https://bugs.freedesktop.org/show_bug.cgi?id=27628
In the direct rendered case, we need to convert DRI2 swap complete
events to GLX events for the client to consume. This path had what
looks like a stray "& 0x75" from some earlier debugging that prevented
clients from seeing the right event code.
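A stand-in sketch of the conversion after the fix:

    #include <stdint.h>

    struct glx_event { uint32_t evtype; };

    static void
    convert_swap_event(struct glx_event *out, uint32_t dri2_type)
    {
        /* Previously: out->evtype = dri2_type & 0x75; the stray
         * debug mask hid the real event code from clients. */
        out->evtype = dri2_type;
    }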
When matching attributes using the 'mask' matching criteria, the spec
says that
"Only GLXFBConfigs for which the set bits of attribute include all
the bits that are set in the requested value are
considered. (Additional bits might be set in the attribute)."
The current test returns true if the two bit masks have bits in
common; specifically, it matches even if the requested value has bits
set that are not set in the fbconfig attribute. For example, an
application asking for
GLX_DRAWABLE_TYPE, GLX_PIXMAP_BIT | GLX_PBUFFER_BIT,
as glxpbdemo does, will match fbconfigs that don't support pbuffer
rendering, as long as they support pixmap rendering.
Reviewed-by: Ian Romanick <[email protected]>
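A minimal sketch of the corrected test (not Mesa's exact code): the
fbconfig attribute must be a superset of the requested bits.

    #include <stdbool.h>

    static bool
    match_mask(unsigned requested, unsigned attrib)
    {
        /* Buggy version: (requested & attrib) != 0 matched on any
         * overlap. The spec requires a superset test: */
        return (requested & attrib) == requested;
    }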
We've supported indirect rendering pbuffers for a while, but not direct
rendering pbuffers. We do this by creating a hidden pixmap and wrapping
it in a GLX pbuffer. This only works when we have DRI2 on the server,
but if the server doesn't have DRI2, it won't expose configs with the
pbuffer bits enabled.
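Conceptually, and with illustrative names only, the backing store is
just an unmapped pixmap:

    #include <X11/Xlib.h>

    static Pixmap
    create_pbuffer_backing(Display *dpy, int screen,
                           unsigned width, unsigned height,
                           unsigned depth)
    {
        /* The pixmap is never mapped; it only provides server-side
         * storage for DRI2 to bind buffers to. */
        return XCreatePixmap(dpy, RootWindow(dpy, screen),
                             width, height, depth);
    }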
When matching attributes using the 'mask' matching criteria, the spec
says that
"Only GLXFBConfigs for which the set bits of attribute include all
the bits that are set in the requested value are
considered. (Additional bits might be set in the attribute)."
The current test returns true if the two bit masks have bits in
common; specifically, it matches even if the requested value has bits
set that are not set in the fbconfig attribute. For example, an
application asking for
GLX_DRAWABLE_TYPE, GLX_PIXMAP_BIT | GLX_PBUFFER_BIT,
as glxpbdemo does, will match fbconfigs that don't support pbuffer
rendering, as long as they support pixmap rendering.
Reviewed-by: Ian Romanick <[email protected]>
Fixes bug #27454.
Add ifdef guards around variables of types defined only for
GLX_DIRECT_RENDERING.
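The guard pattern, sketched:

    static void
    example(void)
    {
    #if defined(GLX_DIRECT_RENDERING)
        /* This type only exists in direct-rendering builds, so the
         * declaration needs the same guard as the code using it. */
        struct __GLXDRIdrawable *pdraw = 0;
        (void) pdraw;
    #endif
    }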
The IDs will be the same in the case where an X window is used directly
as a GLX drawable, but the lookup will fail if a new GLX drawable is
created explicitly, as with glxgears_fbconfig.
Fixes fdo bug #27190.
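Sketched with hypothetical helpers, the lookup keys on the GLX
drawable XID rather than the window XID:

    #include <X11/Xlib.h>

    struct glx_drawable;
    struct glx_drawable *lookup_by_xid(Display *dpy, XID glx_id);

    static struct glx_drawable *
    find_drawable(Display *dpy, XID glx_drawable_id)
    {
        /* Keying on the window XID only works when the window is
         * used directly; it fails for drawables created with
         * glXCreateWindow/glXCreatePixmap and friends. */
        return lookup_by_xid(dpy, glx_drawable_id);
    }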
This reverts commit 9aadc793f3db64cefa0b08f18abad424a659dacc.
This reverts commit 69ea4e7718efb60b6b0d795a355cebd6712ceac1.
This reverts commit dbe8b013936d977ec63d6607bfd2fc6772d29787.
This reverts commit 23215ef4d60a86d9f3b3fdc08e3fdadc59e98890.
This reverts commit 9495e3703062d1ddaf3161f4efc23f0b51284d9b.
This reverts commit 0594cf70883b64692ba617d85f4f9b4e636e5c2b.
This reverts commit 86a7978d37393ee34f876569ac06ffdb8d7289ae.
This reverts commit 437902ce978cde9a0e1aa260f12dc232a8501c42.
Signed-off-by: Jeremy Huddleston <[email protected]>
Signed-off-by: Jeremy Huddleston <[email protected]>
Signed-off-by: Jeremy Huddleston <[email protected]>
Signed-off-by: Jeremy Huddleston <[email protected]>
Signed-off-by: Jeremy Huddleston <[email protected]>
The driContext field of the __GLXcontextRec struct is only defined when
GLX_DIRECT_RENDERING is set.
Conflicts:
Makefile
src/mesa/main/version.h
Resolved by keeping version strings from master (also in the intel driver).
Apparently the higher compiler optimization level in non-debug builds was
eliminating the unused functions referencing the unresolved DRI2 symbols...
This can happen when an X window is destroyed behind our back. We use
DRI2CopyRegion behind the scenes in many places (like flushing the fake
front to the real front) so we have to ignore X errors triggered in that
case.
The glean test cases trigger this consistently as they don't destroy the
GLX drawable nicely, they just destroy the X window.
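A minimal sketch of the error-swallowing pattern, assuming a
synchronous flush around the copy; helper names are illustrative:

    #include <X11/Xlib.h>

    static int
    ignore_x_error(Display *dpy, XErrorEvent *err)
    {
        (void) dpy;
        (void) err;
        return 0;   /* swallow the error */
    }

    static void
    copy_region_ignoring_errors(Display *dpy)
    {
        XErrorHandler old = XSetErrorHandler(ignore_x_error);
        /* ... issue the DRI2CopyRegion request here ... */
        XSync(dpy, False);   /* force any error to arrive now */
        XSetErrorHandler(old);
    }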
Move the initialization of ext_list_first_time from all of the DRI loader's
CreateScreen routines, to where the storage for the screen config is
allocated.
It needs to be set in the screen config even if DRI is forced off using
LIBGL_ALWAYS_INDIRECT, so that psc->direct_support is initialized
correctly; otherwise __glXExtensionBitIsEnabled() always returns FALSE.
Specifically, this causes a problem with an X server which advertises
GLX <= 1.2 and the GLX_SGIX_fbconfig extension.
glXGetFBConfigFromVisualSGIX() uses __glXExtensionBitIsEnabled() to
check whether the GLX_SGIX_fbconfig extension is available, but that
function won't return correct information because the extension data
was never initialized: ext_list_first_time was never set.
Signed-off-by: Jon TURNEY <[email protected]>
Signed-off-by: Brian Paul <[email protected]>
(cherry picked from commit 96ab4d2b84178209ee59017458d9964b32b7e183)
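A simplified sketch of the move, using a hypothetical struct; the
point is that the flag is set where the storage is allocated,
unconditionally:

    #include <stdlib.h>

    struct screen_config {
        int ext_list_first_time;
        /* ... direct_support, extension bits, ... */
    };

    static struct screen_config *
    alloc_screen_config(void)
    {
        struct screen_config *psc = calloc(1, sizeof(*psc));
        if (psc)
            psc->ext_list_first_time = 1;   /* set here, even if DRI
                                             * is later forced off */
        return psc;
    }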
This needs a patch for xserver/glx as well. An environment variable
will be added at some point; it could be for swrastg only or for all
gallium drivers.
This can happen when an X window is destroyed behind our back. We use
DRI2CopyRegion behind the scenes in many places (like flushing the fake
front to the real front) so we have to ignore X errors triggered in that
case.
The glean test cases trigger this consistently as they don't destroy the
GLX drawable nicely, they just destroy the X window.
This change passes a remainder of 1 to the server with the
DRI2SwapBuffers request, causing it to honor the OML semantics for the
swap rather than falling through to glXSwapBuffers behavior. The
remainder actually ends up ignored since the divisor is 0, but we need
to differentiate the OML and standard behavior somehow.
Reported-by: Mario Kleiner <[email protected]>
Signed-off-by: Jesse Barnes <[email protected]>
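In sketch form (field names are illustrative, not the actual wire
structs):

    #include <stdint.h>

    struct swap_request {
        uint64_t target_msc, divisor, remainder;
    };

    static void
    fill_swap_request(struct swap_request *req, int oml_semantics)
    {
        req->target_msc = 0;
        req->divisor = 0;
        /* With divisor == 0 the remainder is ignored for timing,
         * but remainder == 1 signals the server to use OML swap
         * semantics instead of plain glXSwapBuffers behavior. */
        req->remainder = oml_semantics ? 1 : 0;
    }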
Signed-off-by: Alan Coopersmith <[email protected]>
Move the initialization of ext_list_first_time from all of the DRI loader's
CreateScreen routines, to where the storage for the screen config is
allocated.
It needs to be set in the screen config even if DRI is forced off using
LIBGL_ALWAYS_INDIRECT, so that psc->direct_support is initialized
correctly; otherwise __glXExtensionBitIsEnabled() always returns FALSE.
Specifically, this causes a problem with an X server which advertises
GLX <= 1.2 and the GLX_SGIX_fbconfig extension.
glXGetFBConfigFromVisualSGIX() uses __glXExtensionBitIsEnabled() to
check whether the GLX_SGIX_fbconfig extension is available, but that
function won't return correct information because the extension data
was never initialized: ext_list_first_time was never set.
Signed-off-by: Jon TURNEY <[email protected]>
Signed-off-by: Brian Paul <[email protected]>
If the server supports the OML-related protocol, enable support for the
extension.
Leftover from earlier commit.
I wasn't careful enough when removing support for GCC versions earlier
than 3.3.0. I could have sworn that I compile tested before pushing,
but apparently not. FAIL.
Signed-off-by: Ian Romanick <[email protected]>
Signed-off-by: Ian Romanick <[email protected]>
Signed-off-by: Ian Romanick <[email protected]>
See fd.o bug 26832.