Signed-off-by: Ian Romanick <[email protected]>
Reviewed-by: Kenneth Graunke <[email protected]>
Make sure that init_fbconfig_for_chooser sets the correct value of
drawableType for visual configs and fbconfigs.
Signed-off-by: Tomasz Lis <[email protected]>
Reviewed-by: Ian Romanick <[email protected]>
Correctly handle the value of renderType in the GLX context. If the
value is incorrect, context creation fails.
v2 (idr): indirect_create_context is just a memory allocator, so don't
validate the GLX_RENDER_TYPE there. Fixes regressions in several
GLX_ARB_create_context piglit tests.
Signed-off-by: Tomasz Lis <[email protected]>
Signed-off-by: Ian Romanick <[email protected]>
Reviewed-by: Ian Romanick <[email protected]>
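
As a rough illustration of the kind of check involved (a sketch only, with a
made-up helper name, not the actual Mesa code), rejecting renderType values
that GLX does not define before creating a context could look like this:

    #include <GL/glx.h>
    #include <stdbool.h>

    /* Hypothetical validation helper; GLX defines exactly these two
     * context render types. */
    static bool
    render_type_is_valid(int renderType)
    {
        switch (renderType) {
        case GLX_RGBA_TYPE:
        case GLX_COLOR_INDEX_TYPE:
            return true;
        default:
            return false;   /* context creation should fail here */
        }
    }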
v2 (idr): Open-code the check for GLX_RENDER_TYPE.
dri2_convert_glx_attribs can't be called from here because that function
only exists in direct-rendering builds. Also add a stub version of
indirect_create_context_attribs to tests/fake_glx_screen.cpp to prevent
'make check' regressions.
Signed-off-by: Tomasz Lis <[email protected]>
Signed-off-by: Ian Romanick <[email protected]>
Reviewed-by: Ian Romanick <[email protected]>
Set the correct values of renderType in glXCreateContext and
init_fbconfig_for_chooser.
Signed-off-by: Tomasz Lis <[email protected]>
Reviewed-by: Ian Romanick <[email protected]>
Make sure that the renderType property value is stored in the GLX context
while it is being created. Further patches will make the value
correspond to the fbconfig's renderType.
v2 (idr): Move a hunk from the next patch to this patch to prevent a
build break.
Signed-off-by: Tomasz Lis <[email protected]>
Signed-off-by: Ian Romanick <[email protected]>
Reviewed-by: Ian Romanick <[email protected]>
Just to be consistent with the functions' Bool return type.
Reviewed-by: Jose Fonseca <[email protected]>
Reviewed-by: Jose Fonseca <[email protected]>
NOTE: This is a candidate for stable branches.
Reviewed-by: Ian Romanick <[email protected]>
Bugzilla: https://bugs.freedesktop.org/show_bug.cgi?id=47478
Bugzilla: https://bugs.freedesktop.org/show_bug.cgi?id=62999
Bugzilla: http://bugs.winehq.org/show_bug.cgi?id=26763
It's been required for building glx since
b518dfb513742984f27577d25566f93afd86d4fc in January.
Reviewed-by: Chad Versace <[email protected]>
This patch has been generated by the following Coccinelle semantic
patch:
// Don't cast the return value of malloc/realloc.
//
// Casting the return value of malloc/realloc only stands to hide
// errors.
@@
type T;
expression E1, E2;
@@
- (T)
(
  _mesa_align_calloc(E1, E2)
|
  _mesa_align_malloc(E1, E2)
|
  calloc(E1, E2)
|
  malloc(E1)
|
  realloc(E1, E2)
)
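
For illustration only (this fragment is not part of the generated patch), the
transformation removes casts like the one shown in the comment below:

    #include <stdlib.h>
    #include <GL/gl.h>

    static GLfloat *
    alloc_example(int n)
    {
        /* Before the patch this would have been written as
         *     GLfloat *v = (GLfloat *) malloc(n * sizeof(GLfloat));
         * The cast only hides errors, such as a missing <stdlib.h>
         * prototype, so it is dropped. */
        GLfloat *v = malloc(n * sizeof(GLfloat));
        return v;
    }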
These calls allowed Xlib to use a custom memory allocator, but Xlib has
used the standard C library functions since at least its initial import
into git in 2003. It seems unlikely that it will grow a custom memory
allocator. The functions now just add extra overhead. Replacing them
will make future Coccinelle patches simpler.
This patch has been generated by the following Coccinelle semantic
patch:
// Remove Xcalloc/Xmalloc/Xfree calls
@@ expression E1, E2; @@
- Xcalloc (E1, E2)
+ calloc (E1, E2)
@@ expression E; @@
- Xmalloc (E)
+ malloc (E)
@@ expression E; @@
- Xfree (E)
+ free (E)
@@ expression E; @@
- XFree (E)
+ free (E)
Reviewed-by: Brian Paul <[email protected]>
This reverts commit adefee50d954151f76150af80207081ae3c247d9.
Shared glapi was never tested with --enable-xlib-glx and turns out
to cause a lot of problems.
Conflicts:
    configure.ac
libglapi.so, libGL.so, libGLESv2.so, libGLESv1_CM.so must all
come from the same version of Mesa or bad things may happen.
Acked-by: Kenneth Graunke <[email protected]>
Signed-off-by: Matt Turner <[email protected]>
Signed-off-by: Ian Romanick <[email protected]>
Signed-off-by: Ian Romanick <[email protected]>
If the server returned BadContext, the error would just get dropped on
the floor.
Fixes the piglit test glx-import-context-single-process.
NOTE: This is a candidate for the 7.11 branch, but it also requires
the previous patch.
Signed-off-by: Ian Romanick <[email protected]>
Previously the share_xid was only set in the glXImportContextEXT path,
and it was left set to None in all of the other create-context paths.
Fixes the piglit test glx-query-context-info-ext.
NOTE: This is a candidate for the 7.11 branch.
Signed-off-by: Ian Romanick <[email protected]>
Reviewed-by: Adam Jackson <[email protected]>
Reviewed-by: Eric Anholt <[email protected]>
Send the DestroyContext protocol immediately when glXDestroyContext is
called, and never call it when glXFreeContextEXT is called. In both
cases, either destroy the client-side structures or, if the context is
current, set xid to None so that the client-side structures will be
destroyed later.
I believe this restores the behavior of the original SGI code. See
src/glx/x11 around commit 5df82c8. The spec doesn't say anything
about glXDestroyContext not really destroying imported contexts (it
acts like glXFreeContextEXT instead), but that's what the original
code did. Note that glXFreeContextEXT on a non-imported context does
not destroy it either.
Fixes the piglit test glx-free-context.
NOTE: This is a candidate for the 7.11 branch.
Signed-off-by: Ian Romanick <[email protected]>
Reviewed-by: Adam Jackson <[email protected]>
Reviewed-by: Eric Anholt <[email protected]>
Fixes the piglit test glx-get-context-id.
NOTE: This is a candidate for the 7.11 branch.
Signed-off-by: Ian Romanick <[email protected]>
Reviewed-by: Adam Jackson <[email protected]>
Reviewed-by: Eric Anholt <[email protected]>
glXImportContextEXT
The primary problem was that the number of reply bytes read is clamped
to sizeof(propList), but the loop that processes the properties tries
to examine all of the properties sent by the server. If the server
sends 47,000 properties, we only read 3 but process all 47,000.
NOTE: This is a candidate for the 7.11 branch.
Signed-off-by: Ian Romanick <[email protected]>
Reviewed-by: Adam Jackson <[email protected]>
Reviewed-by: Eric Anholt <[email protected]>
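
A simplified, hypothetical version of the hazard described above: the loop
bound has to be clamped to the number of properties that were actually read
into the local buffer, not to the count the server advertises.

    #include <stdint.h>

    #define LOCAL_PROP_SLOTS 3           /* illustrative buffer size */

    struct prop { uint32_t name; uint32_t value; };

    static uint32_t
    process_props(const struct prop *propList, uint32_t numPropsFromServer)
    {
        uint32_t n = numPropsFromServer;
        uint32_t handled = 0;

        if (n > LOCAL_PROP_SLOTS)
            n = LOCAL_PROP_SLOTS;        /* clamp the loop, not just the read */

        for (uint32_t i = 0; i < n; i++) {
            /* examine only the properties that were actually received */
            if (propList[i].name != 0)
                handled++;
        }
        return handled;
    }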
NOTE: This is a candidate for the 7.11 branch.
Signed-off-by: Ian Romanick <[email protected]>
Reviewed-by: Adam Jackson <[email protected]>
Reviewed-by: Eric Anholt <[email protected]>
Create a new GLX drawable struct to track client-related info, add a
wrap counter to it, and update the counter as we receive events. This
allows us to support the full 64 bits of the event structure we pass to
the client even though the server only gives us a 32-bit count.
Reviewed-by: Michel Dänzer <[email protected]>
Reviewed-by: Jeremy Huddleston <[email protected]>
Signed-off-by: Jesse Barnes <[email protected]>
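
A minimal sketch of the wrap-counter idea, using made-up structure and field
names rather than Mesa's actual ones: remember the last 32-bit count reported
by the server and bump a 64-bit wrap count whenever the new value goes
backwards.

    #include <stdint.h>

    struct drawable_counts {
        uint32_t last_sbc_lo;   /* last 32-bit swap count from the server */
        uint64_t sbc_wraps;     /* how many times that count has wrapped */
    };

    /* Fold a fresh 32-bit server count into a full 64-bit value. */
    static uint64_t
    update_sbc(struct drawable_counts *d, uint32_t server_sbc)
    {
        if (server_sbc < d->last_sbc_lo)
            d->sbc_wraps++;             /* the 32-bit counter wrapped */
        d->last_sbc_lo = server_sbc;
        return (d->sbc_wraps << 32) | server_sbc;
    }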
... and clean up if it didn't.
Signed-off-by: Adam Jackson <[email protected]>
Signed-off-by: Adam Jackson <[email protected]>
In applegl, GLX advertises the same extensions provided by OpenGL.framework
even if such extensions are not provided by glapi. This allows a client
to gain access to that API.
Signed-off-by: Jeremy Huddleston <[email protected]>
gc->vtable->destroy is always set and is used unconditionally
in other places, so don't bother checking for it first.
Signed-off-by: Jeremy Huddleston <[email protected]>
Now that we're using glapi, we don't need to special case this.
Signed-off-by: Jeremy Huddleston <[email protected]>
This reverts portions of 6849916170c0275c13510251a7b217c20f2b993e that caused
the darwin config to fail to build due to missing implementations in that
commit.
See https://bugs.freedesktop.org/show_bug.cgi?id=29162
Signed-off-by: Jeremy Huddleston <[email protected]>
Fixes regression introduced by: c356f5867f2c1fad7155df538b9affa8dbdcf869
Signed-off-by: Jeremy Huddleston <[email protected]>
Fixes regression introduced by: 6ddf66e9230ee862ac341c4767cf6b3b2dd2552b
Signed-off-by: Jeremy Huddleston <[email protected]>
Reviewed-by: Brian Paul <[email protected]>
Signed-off-by: Adam Jackson <[email protected]>
We want to check for Success; otherwise it will fail even with the right visual.
NOTE: This is a candidate for the 7.10 branch.
Signed-off-by: Antoine Labour <[email protected]>
Signed-off-by: Stéphane Marchesin <[email protected]>
Signed-off-by: Brian Paul <[email protected]>
Signed-off-by: Adam Jackson <[email protected]>
The GLX Spec says you only implicitly glFlush if the drawable being
swapped is the current context's drawable.
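
A simplified sketch of that rule (not the actual Mesa SwapBuffers path): flush
implicitly only when the drawable being swapped is the one bound to the
current context.

    #include <GL/glx.h>

    static void
    swap_example(Display *dpy, GLXDrawable drawable)
    {
        (void) dpy;

        if (glXGetCurrentContext() != NULL &&
            glXGetCurrentDrawable() == drawable)
            glFlush();   /* implicit flush required by the GLX spec */

        /* ... then issue the actual SwapBuffers request ... */
    }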
'dpy' was being checked for null *after* it was already used once.
Also add a null check for psc, and drop gc's redundant initialization.
When --enable-shared-glapi is specified, libGL will share libglapi with
OpenGL ES instead of defining its own copy of glapi. This makes sure an
app will get only one copy of glapi in its address space.
The new option is disabled by default. When enabled, libGL and libglapi
must be built from the same source tree and distributed together. This
requirement comes from the fact that the dispatch offsets used by these
libraries are re-assigned whenever GLAPI XMLs are changed.
For GLX, indirect rendering for has_different_protocol() functions is
tricky. A has_different_protocol() function is assigned only one
dispatch offset, yet each entry point needs a different protocol opcode.
It cannot be supported by the shared glapi. The fix to this is to make
glXGetProcAddress handle such functions specially before calling
_glapi_get_proc_address.
Note that these files are automatically generated/re-generated:
    src/glx/indirect.c
    src/glx/indirect.h
    src/mapi/glapi/glapi_mapi_tmp.h
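
A hypothetical sketch of the special-casing described above; glx_lookup_special
stands in for whatever GLX-private table resolves the protocol-specific entry
points and is not a real Mesa function.

    typedef void (*generic_func)(void);

    /* Assumed declarations for this sketch only. */
    extern generic_func glx_lookup_special(const char *name);
    extern generic_func _glapi_get_proc_address(const char *name);

    static generic_func
    get_proc_address_sketch(const char *name)
    {
        /* Entry points that need their own indirect protocol opcode cannot
         * share a single glapi dispatch slot, so resolve them privately
         * before falling back to the shared glapi lookup. */
        generic_func f = glx_lookup_special(name);
        if (f == NULL)
            f = _glapi_get_proc_address(name);
        return f;
    }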
Doesn't work for pixmaps, was looking up the GLX XID, and was never thread
safe. Instead, just destroy the client-side structures when the
drawable is no longer current for a context.
This reverts 6a6e6d7b0a84e20f9754af02a575ae34081d310c and initializes
dummyContext with an all NULL vtable. The context vtable pointer is
supposed to always be non-NULL, but the vtable entries can be NULL.
I was hitting this with gliv.
The GLX spec explicitly mentions that glXWaitX, glXWaitGL and glXUseXFont calls
are ignored when there's no current context. Not sure what, if anything, the
GLX_EXT_texture_from_pixmap spec says about this, but I think ignoring the
calls makes more sense than crashing there as well. :)
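
In outline (illustrative only), the fix amounts to returning quietly when
there is no current context instead of dereferencing it:

    #include <GL/glx.h>

    static void
    wait_x_sketch(void)
    {
        if (glXGetCurrentContext() == NULL)
            return;   /* the GLX spec says the call is simply ignored */

        /* ... otherwise do the real work of glXWaitX ... */
    }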
Code in glx/glxcmds.c which uses the XF86VIDMODE extension is already guarded.
Also use that guard to control inclusion of the xf86vmode.h header, and only
enable that guard if the XF86VIDMODE extension is found by pkg-config.
This changes the behaviour on platforms on which XF86VIDMODE exists: XF86VIDMODE
used to be mandatory, but is now optional.
Presumably other build systems are already arranging for -DXF86VIDMODE to be
supplied to the compiler when glxcmds.c is compiled, so they are not affected
by this change.
Signed-off-by: Jon TURNEY <[email protected]>
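
On the C side, the guard pattern being relied on looks roughly like this
(a sketch; it assumes the build system defines XF86VIDMODE only when
pkg-config finds the extension):

    /* Only pull in the header and the vidmode code paths when the build
     * system detected the extension and defined XF86VIDMODE. */
    #ifdef XF86VIDMODE
    #include <X11/extensions/xf86vmode.h>
    #endif

    static int
    vidmode_compiled_in(void)
    {
    #ifdef XF86VIDMODE
        return 1;   /* XF86VIDMODE code paths are available */
    #else
        return 0;   /* extension not found at build time */
    #endif
    }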
This fixes some of the build issues with GLX_INDIRECT_RENDERING but !GLX_DIRECT_RENDERING due to recent changes.
Signed-off-by: Jon TURNEY <[email protected]>
Signed-off-by: Kristian Høgsberg <[email protected]>
https://bugs.freedesktop.org/show_bug.cgi?id=29304
|