Commit message
Reviewed-by: Eric Anholt <[email protected]>
Again, there was already a call to _mesa_source_buffer_exists() earlier in
the function.
Reviewed-by: Eric Anholt <[email protected]>
There was already a call to _mesa_source_buffer_exists() earlier in
the function.
Reviewed-by: Eric Anholt <[email protected]>
INLINE is still seen in some files (some generated files, etc.), but this
is a good start.
Acked-by: Kenneth Graunke <[email protected]>
All drivers remaining in Mesa support this extension. This extension
provides functionality that is either required or optional in desktop
OpenGL, OpenGL ES 1.x, and OpenGL ES 2.x.
EXT_texture_format_BGRA8888 is mostly a subset of EXT_bgra. The only
difference seems to be that EXT_texture_format_BGRA8888 allows GL_BGRA
as an internal format to glTexImage2D and friends.
Reviewed-by: Brian Paul <[email protected]>
Reviewed-by: Kenneth Graunke <[email protected]>
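For illustration (not code from this commit), the difference reads like this
at the API level; GL_BGRA_EXT and glTexImage2D are standard, the surrounding
function is just a sketch:

    #include <GL/gl.h>
    #include <GL/glext.h>

    static void upload_bgra(const void *pixels, GLsizei w, GLsizei h)
    {
       /* EXT_bgra: GL_BGRA is only a client-side pixel format; the
        * internal format is still an RGBA variant. */
       glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0,
                    GL_BGRA_EXT, GL_UNSIGNED_BYTE, pixels);

       /* EXT_texture_format_BGRA8888 (GLES): GL_BGRA is additionally
        * accepted as the internalformat, which EXT_bgra does not allow. */
       glTexImage2D(GL_TEXTURE_2D, 0, GL_BGRA_EXT, w, h, 0,
                    GL_BGRA_EXT, GL_UNSIGNED_BYTE, pixels);
    }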
AFAIK, there are few users of this extension, and I can see a couple of
reasons why it is probably broken in Mesa anyway.
Reviewed-by: Ian Romanick <[email protected]>
These fields were only used for swrast so move them into
swrast_texture_image.
Reviewed-by: Ian Romanick <[email protected]>
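For context, the subclassing pattern being referred to looks roughly like
this (a sketch; the field names are illustrative, not the exact Mesa
definitions):

    #include "main/mtypes.h"   /* struct gl_texture_image */

    struct swrast_texture_image
    {
       struct gl_texture_image Base;   /* must stay first so casts work */

       /* swrast-only fields that used to live in gl_texture_image
        * (names here are illustrative): */
       GLfloat WidthScale, HeightScale, DepthScale;
    };

    static inline struct swrast_texture_image *
    swrast_texture_image(struct gl_texture_image *img)
    {
       return (struct swrast_texture_image *) img;
    }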
It's only used by swrast.
Reviewed-by: Ian Romanick <[email protected]>
This code was really broken before. A lot of the error checks were
done much too late, and some of them would fail.
The underlying problem is that Mesa doesn't ever keep compressed paletted
textures in their original format. The textures are immediately
converted to some RGB or RGBA format.
Bugzilla: https://bugs.freedesktop.org/show_bug.cgi?id=39991
Signed-off-by: Ian Romanick <[email protected]>
Reviewed-by: Brian Paul <[email protected]>
Tested-by: Jin Yang <[email protected]>
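For reference, the client-side usage being validated (a hedged sketch; the
enum and call come from OES_compressed_paletted_texture, the sizes are made
up):

    #include <GLES/gl.h>

    static void upload_paletted(const void *blob, GLsizei blob_size)
    {
       /* The blob holds the palette followed by the indexed texels for
        * every included mip level; a non-positive 'level' says how many
        * levels the blob covers (here levels 0..2).  Mesa expands this to
        * an RGB/RGBA format at upload time and never keeps the paletted
        * data, so all error checks must happen up front. */
       glCompressedTexImage2D(GL_TEXTURE_2D,
                              -2,                    /* levels 0, 1 and 2 */
                              GL_PALETTE4_RGBA8_OES,
                              64, 64, 0,
                              blob_size, blob);
    }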
Signed-off-by: Ian Romanick <[email protected]>
Reviewed-by: Brian Paul <[email protected]>
Tested-by: Jin Yang <[email protected]>
This also involves passing swrast_texture_image instead of gl_texture_image
into all the fetch functions.
Matches the NewTextureImage() hook. With new subclasses of
gl_texture_image coming, we need a new hook to properly delete objects of
those subclasses.
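Roughly, the hook pair and a driver's use of it look like this (a sketch;
the driver names and the extra field are hypothetical, the real declarations
live in dd.h):

    #include <stdlib.h>
    #include "main/mtypes.h"

    /* Hypothetical driver subclass of gl_texture_image. */
    struct mydrv_texture_image {
       struct gl_texture_image Base;
       void *tiled_storage;            /* illustrative driver data */
    };

    static struct gl_texture_image *
    mydrv_new_texture_image(struct gl_context *ctx)
    {
       struct mydrv_texture_image *img = calloc(1, sizeof(*img));
       return img ? &img->Base : NULL;
    }

    static void
    mydrv_delete_texture_image(struct gl_context *ctx,
                               struct gl_texture_image *texImage)
    {
       struct mydrv_texture_image *img = (struct mydrv_texture_image *) texImage;
       free(img->tiled_storage);        /* subclass-specific storage first */
       free(img);
    }

    /* Installed next to the existing constructor hook:
     *    driver->NewTextureImage    = mydrv_new_texture_image;
     *    driver->DeleteTextureImage = mydrv_delete_texture_image;
     */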
Do it during swrast state validation since the FetchTexel() functions
are only called from swrast now and not core Mesa.
Remove assertions in mipmap.c since they're no longer appropriate.
EXT_shared_texture_palette
This was also discussed at XDS 2010. However, actually making the
change was delayed because several drivers (e.g., tdfx) still benefited
significantly from exposing these extensions. Now that those
drivers have been removed, this code can be removed as well.
v2: A lot of bits that were missed in the previous patch have been removed.
Reviewed-by: Brian Paul <[email protected]>
Reviewed-by: Kenneth Graunke <[email protected]>
All driver implementations of FreeTextureImageBuffer already check
that Data != NULL and free it. Calling the hook unconditionally also
means that driver storage gets freed even when it isn't in the form of
a Data pointer.
This was produced by the following semantic patch:
@@
expression C;
expression T;
@@
-   if (T->Data) {
-      C->Driver.FreeTextureImageBuffer(C, T);
+   C->Driver.FreeTextureImageBuffer(C, T);
-   }
Reviewed-by: Brian Paul <[email protected]>
Reviewed-by: Ian Romanick <[email protected]>
This was produced by sed, except for one hunk in driverfuncs.c where
trailing whitespace was dropped.
Reviewed-by: Brian Paul <[email protected]>
Reviewed-by: Ian Romanick <[email protected]>
Several drivers have these fields in their subclasses of gl_texture_image.
They'll be useful for core Mesa too...
Reviewed-by: Ian Romanick <[email protected]>
The GL_EXT_texture_array spec allows this (Section 3.8.1).
Fixes failing piglit fbo-depth-array test.
NOTE: This is a candidate for the 7.10 branch.
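The kind of setup the piglit test exercises looks roughly like this (a
sketch, assuming the change concerns depth-format array textures; none of
this is code from the commit):

    #include <GL/gl.h>
    #include <GL/glext.h>

    static void setup_depth_array_fbo(void)
    {
       GLuint tex, fbo;

       /* A depth-component 2D array texture... */
       glGenTextures(1, &tex);
       glBindTexture(GL_TEXTURE_2D_ARRAY_EXT, tex);
       glTexImage3D(GL_TEXTURE_2D_ARRAY_EXT, 0, GL_DEPTH_COMPONENT24,
                    64, 64, 4,              /* 4 layers */
                    0, GL_DEPTH_COMPONENT, GL_FLOAT, NULL);

       /* ...with one layer attached as an FBO depth buffer. */
       glGenFramebuffersEXT(1, &fbo);
       glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
       glFramebufferTextureLayerEXT(GL_FRAMEBUFFER_EXT, GL_DEPTH_ATTACHMENT_EXT,
                                    tex, 0 /* level */, 2 /* layer */);
    }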
Also rename to _mesa_logbase2 and move to imports.h to keep the ugly
ifdef GNUC stuff outside other files (also to allow reuse).
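The helper boils down to something like this (a sketch of the pattern; the
exact Mesa implementation in imports.h may differ):

    /* Integer log2 for n >= 1; the GCC builtin is why the "ugly ifdef
     * GNUC stuff" is better kept in one header. */
    static inline unsigned
    logbase2(unsigned n)
    {
    #ifdef __GNUC__
       return 31 - __builtin_clz(n | 1);
    #else
       unsigned log2 = 0;
       while (n >>= 1)
          log2++;
       return log2;
    #endif
    }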
With minor clean-ups by Brian Paul.
Signed-off-by: Brian Paul <[email protected]>
Fixes http://bugs.freedesktop.org/show_bug.cgi?id=37648
Signed-off-by: Brian Paul <[email protected]>
This was mistakenly inside the #if FEATURE_ES block.
No GLSL or driver support yet.
The component ordering of some formats has been reversed to match
Gallium types.
LUMINANCE_ALPHA_LATC2 = LUMINANCE_ALPHA_3DC, so this is easy.
Note that there is no specification for 3DC, just a few white papers
from ATI.
The encoding/decoding algorithms are shared with RGTC.
Thanks to some magic with the base format, the RGTC texstore functions work
for LATC too.
swrast passes the related piglit tests except for two issues:
- The alpha channel is wrong (it's always 1), however the incorrect alpha
channel makes some other tests fail too, so I guess it's unrelated to LATC.
- Signed LATC fetches aren't correct yet (signed values are clamped to [0,1]),
however RGTC has the same problem.
Further testing (with some of my other patches) shows that hardware drivers
and softpipe work.
BTW, ETQW uses this extension.
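The sharing works because an LATC block is bit-identical to the
corresponding RGTC (BC4/BC5) block; only the base format changes how the
channels are replicated when sampling. A hedged sketch of the idea (every
identifier below is a stand-in, not the real Mesa code):

    #include <GL/gl.h>

    typedef void (texstore_func)(void *dst, const void *src, int w, int h);

    enum { FORMAT_RED_RGTC1, FORMAT_L_LATC1 };              /* stand-ins */

    static void store_rgtc1(void *dst, const void *src, int w, int h)
    {
       /* shared single-channel block encoder (body omitted) */
    }

    static const struct {
       int            format;
       GLenum         base_format;   /* GL_RED vs. GL_LUMINANCE */
       texstore_func *store;
    } red_lum_formats[] = {
       { FORMAT_RED_RGTC1, GL_RED,       store_rgtc1 },
       { FORMAT_L_LATC1,   GL_LUMINANCE, store_rgtc1 },     /* same encoder */
    };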
when doing glCopyTex[Sub]Image() and checking the source buffer's
completeness.
We only need to determine FBO completeness when the status is indeterminate.
Spotted by Bernd Buschinski.
In case the driver enables GL_MESA_texture_array but not the EXT version.
From reading EXT_texture_sRGB and EXT_framebuffer_sRGB and their
interactions with FBOs, I've found that swrast converts the sRGB values
to linear for blending when an sRGB texture is bound as an FBO
attachment. According to the spec, and as further explained in the
framebuffer_sRGB spec, this behaviour is not required unless
GL_FRAMEBUFFER_SRGB is enabled and the visual/config exposes
GL_FRAMEBUFFER_SRGB_CAPABLE_EXT.
This patch fixes swrast to use a separate Fetch call for FBOs bound to
sRGB textures, avoiding the conversion.
v2: export _mesa_get_texture_dimensions as per Brian's comments.
Signed-off-by: Dave Airlie <[email protected]>
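At the API level the distinction corresponds to this (a sketch using the
EXT_framebuffer_sRGB enums; not code from the patch):

    #include <GL/gl.h>
    #include <GL/glext.h>

    static void enable_srgb_encode_if_possible(void)
    {
       GLboolean capable = GL_FALSE;

       glGetBooleanv(GL_FRAMEBUFFER_SRGB_CAPABLE_EXT, &capable);
       if (capable)
          glEnable(GL_FRAMEBUFFER_SRGB_EXT);

       /* Without this enable, reading back the sRGB FBO attachment for
        * blending must not go through an sRGB-to-linear decode -- which
        * is what the patch makes swrast do. */
    }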
It's just LUMINANCE, not LUMINANCE_ALPHA. Fixes
fbo-generatemipmap-formats GL_EXT_texture_sRGB-s3tc assertion failure
when it tries to pack the L8 channels into LUMINANCE_ALPHA and wonders
why it's trying to do that.
Simplify some code, remove unneeded checks, etc.
Something similar could be done for glCopyTex[Sub]Image() and the
compressed texture image functions as well.
This allows 16K x 16K 2D textures, for example, but we don't want to
allow that for 3D textures. The new gl_constants::MaxTextureMBytes
field is used to prevent allocating too large a texture image.
This allows a 16K x 32 x 32 3D texture, for example, but prevents 16K^3.
Drivers can override this limit. The default is currently 1GB.
Apps should use the proxy texture mechanism to determine the actual
max texture size.
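With the hard limit expressed as a memory cap, the proxy query is the
portable way to probe a particular size (a sketch; the 512^3 example needs
512^3 * 4 bytes = 512 MB, well under the 1 GB default, whereas 16K^3 would
be rejected):

    #include <GL/gl.h>

    static GLboolean texture_3d_fits(GLsizei size)
    {
       GLint probed_width = 0;

       /* Ask the proxy target whether an RGBA8 size^3 texture would be
        * accepted; on failure the proxy's state is zeroed out. */
       glTexImage3D(GL_PROXY_TEXTURE_3D, 0, GL_RGBA8,
                    size, size, size, 0,
                    GL_RGBA, GL_UNSIGNED_BYTE, NULL);
       glGetTexLevelParameteriv(GL_PROXY_TEXTURE_3D, 0,
                                GL_TEXTURE_WIDTH, &probed_width);
       return probed_width != 0;
    }

    /* e.g. texture_3d_fits(512) */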
Fixes http://bugs.freedesktop.org/show_bug.cgi?id=31779