Signed-off-by: Brian Paul <[email protected]>

Anything that matched IDENTIFIER was strdup'ed and returned to the
parser. However, almost every case of IDENTIFIER in the parser just
dropped the returned string on the floor. Every swizzle string, every
option string, every use of a variable, etc. leaked memory.
Create a temporary buffer in the parser state (string_dumpster and
dumpster_size). Return strings from the lexer to the parser in the
buffer. Grow the buffer as needed. When the parser needs to keep a
string (i.e., declaring a new variable), let it make a copy then.
The only leak that valgrind now detects is that /occasionally/ the copy
of the program string in gl_program::String is leaked. I'm not seeing
how. :(
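A hedged sketch of the scratch-buffer scheme described above; the string_dumpster/dumpster_size names come from the commit, but the surrounding struct and function are illustrative, not Mesa's actual lexer/parser code:

```c
#include <stdlib.h>
#include <string.h>

/* Hypothetical parser state holding the reusable scratch buffer. */
struct parser_state {
   char *string_dumpster;   /* reused for every IDENTIFIER token */
   size_t dumpster_size;    /* current allocation size */
};

/* Copy the lexed identifier into the scratch buffer, growing it as
 * needed.  The returned pointer is only valid until the next call,
 * so the parser must strdup() it if the string has to be kept
 * (e.g., when declaring a new variable).
 */
static const char *
return_string(struct parser_state *state, const char *str, size_t len)
{
   if (len + 1 > state->dumpster_size) {
      size_t new_size = (state->dumpster_size == 0) ? 64 : state->dumpster_size;
      while (new_size < len + 1)
         new_size *= 2;

      char *buf = realloc(state->string_dumpster, new_size);
      if (!buf)
         return NULL;            /* caller handles the error */
      state->string_dumpster = buf;
      state->dumpster_size = new_size;
   }

   memcpy(state->string_dumpster, str, len);
   state->string_dumpster[len] = '\0';
   return state->string_dumpster;
}
```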
The program string is kept in the program object. On the second call
into glProgramStringARB the previously kept string would be leaked.
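A hedged sketch of the fix pattern (free the previously kept copy before storing the new one); the struct and function names are hypothetical, not Mesa's actual glProgramStringARB code:

```c
#include <stdlib.h>
#include <string.h>

/* Hypothetical program object holding a copy of the source string. */
struct program {
   char *String;
};

/* Replace the kept program string, freeing any previous copy first
 * so that repeated calls do not leak the old allocation.
 */
static int
set_program_string(struct program *prog, const char *source)
{
   char *copy = strdup(source);
   if (!copy)
      return -1;

   free(prog->String);   /* no-op on the first call (String is NULL) */
   prog->String = copy;
   return 0;
}
```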
Bug #24435
(cherry picked from commit d56125a298106d81e10674f1c4b3b43b51a5139d)

This fixes the second part of bug 23552.

Tests glBlitFramebuffer() between two texture/renderbuffer surfaces.
In particular, blit from level[1] of a cube map face to a 2D texture.
Used to find/fix a bug in intel do_copy_texsubimage().
See commit aef1ab1073f3e30d699b99dae17518ed48b57c72

Use src->draw_offset instead of zero. Zero usually worked, except when
the src renderbuffer is actually a texture mipmap level higher than zero.
Fixes the progs/test/blitfb.c test.

A slightly modified version of a patch from Vinson Lee.

Bug #24734.

This reverts commit 8810b8f67135185d1044746bb861fe2ff997626c.
It turns out the i965 driver uses the intel->Fallback field as a boolean,
not as a bitmask. The intelFallback() function is a no-op in the i965
driver. It would have been nice if there were some comments about this.
I'll fix that next...

This would only be hit if we got an invalid index_size.

Need to push texture state and polygon state too.
Fixes rendering glitches seen in progs/demos/engine when changing
the rendering mode (wireframe, texture modes).
This makes bitmap rendering a little slower, unfortunately.
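The commit changes Mesa's internal meta-ops path, but the idea can be illustrated with the public attribute-stack API; this is an analogy, not the code the commit touches:

```c
#include <GL/gl.h>

/* Save texture and polygon state before drawing the bitmap as a
 * textured quad, then restore it so the application's wireframe or
 * texture settings are not clobbered.
 */
static void
draw_bitmap_with_saved_state(void)
{
   glPushAttrib(GL_TEXTURE_BIT | GL_POLYGON_BIT);

   glPolygonMode(GL_FRONT_AND_BACK, GL_FILL);  /* the quad must be filled */
   /* ... bind the bitmap texture and draw the textured quad ... */

   glPopAttrib();  /* wireframe/texture modes restored */
}
```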
The texture format should not be checked until validation time since
the format might be changed by a subsequent glTexImage() call.

By just using offsets, we confused the hardware's tiling calculations,
resulting in failures in miptree validation and blit clears.
Fixes piglit fbo-clearmipmap.
Bug #23552. (automatic mipmap generation)

Need to return the actual compressed format when the user originally
requested a generic compressed format.
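For context, this is the query behaviour being fixed; a small client-side example using only standard GL calls, not Mesa internals:

```c
#include <GL/gl.h>
#include <stdio.h>

/* Ask for a generic compressed format and read back the specific
 * format the driver actually picked (e.g. an S3TC enum).
 * Assumes a GL context is current and a 2D texture object is bound.
 */
static void
query_actual_compressed_format(GLsizei width, GLsizei height,
                               const void *pixels)
{
   GLint actual = 0;

   glTexImage2D(GL_TEXTURE_2D, 0, GL_COMPRESSED_RGBA,
                width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, pixels);

   glGetTexLevelParameteriv(GL_TEXTURE_2D, 0,
                            GL_TEXTURE_INTERNAL_FORMAT, &actual);

   printf("driver chose internal format 0x%x\n", actual);
}
```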
Maps a compressed MESA_FORMAT_x to the corresponding GLenum. Needed for
querying a texture's actual format when a generic format was originally
requested.
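A sketch of what such a mapping helper could look like; the function name and the trimmed-down format enum are assumptions for illustration (the S3TC GLenums themselves are the standard ones):

```c
#include <GL/gl.h>
#include <GL/glext.h>

/* Abbreviated stand-in for the compressed Mesa formats; the real table
 * covers every compressed MESA_FORMAT_x.
 */
enum mesa_format_subset {
   MESA_FORMAT_RGB_DXT1,
   MESA_FORMAT_RGBA_DXT1,
   MESA_FORMAT_RGBA_DXT3,
   MESA_FORMAT_RGBA_DXT5,
};

/* Translate a driver-chosen compressed format into the GLenum that a
 * GL_TEXTURE_INTERNAL_FORMAT query should report.
 */
static GLenum
compressed_format_to_glenum(enum mesa_format_subset fmt)
{
   switch (fmt) {
   case MESA_FORMAT_RGB_DXT1:  return GL_COMPRESSED_RGB_S3TC_DXT1_EXT;
   case MESA_FORMAT_RGBA_DXT1: return GL_COMPRESSED_RGBA_S3TC_DXT1_EXT;
   case MESA_FORMAT_RGBA_DXT3: return GL_COMPRESSED_RGBA_S3TC_DXT3_EXT;
   case MESA_FORMAT_RGBA_DXT5: return GL_COMPRESSED_RGBA_S3TC_DXT5_EXT;
   default:                    return 0; /* not a known compressed format */
   }
}
```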
Array indexes are invalid when >= the maximum, but array sizes are
only invalid when > the maximum. This prevented programs from
declaring an array of the maximum size.
See the piglit vp-max-array test.
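A minimal sketch of the boundary distinction, with hypothetical names (max_params stands in for whatever limit the parser enforces):

```c
#include <stdbool.h>

/* Indexes address an existing element, so index == max_params is out
 * of range; sizes count elements, so size == max_params is still fine.
 */
static bool
index_is_valid(unsigned index, unsigned max_params)
{
   return index < max_params;      /* invalid when >= the maximum */
}

static bool
size_is_valid(unsigned size, unsigned max_params)
{
   return size <= max_params;      /* only invalid when > the maximum */
}
```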
Per the GLX spec, when changing rendering contexts, the old context
should first be flushed.

According to the glXDestroyContext() man page, the context should not
immediately be destroyed if it's bound to some thread. Wait until it's
unbound to really delete it. The code for doing the latter part is
already present in MakeContextCurrent() so no change was needed there.
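A hedged sketch of the deferred-destroy pattern being described; the struct and field names are illustrative, not libGL's actual ones:

```c
#include <stdbool.h>
#include <stdlib.h>

/* Illustrative client-side context record. */
struct glx_context {
   bool is_current;        /* bound to some thread? */
   bool delete_pending;    /* glXDestroyContext() already requested */
};

/* glXDestroyContext(): only free immediately if no thread has the
 * context bound; otherwise just mark it for later.
 */
static void
destroy_context(struct glx_context *ctx)
{
   if (ctx->is_current) {
      ctx->delete_pending = true;   /* real deletion happens on unbind */
      return;
   }
   free(ctx);
}

/* Called from MakeContextCurrent() when a context is unbound. */
static void
unbind_context(struct glx_context *ctx)
{
   ctx->is_current = false;
   if (ctx->delete_pending)
      free(ctx);
}
```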
Setting intel->Fallback = 1 clobbered any fallback state that was already
set. Not sure where this hack originated (the git history is a little
convoluted). Define and use a new BRW_FALLBACK_DRAW bit instead. This
shouldn't break anything and could potentially fix some bugs (but no
specific ones are known).
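A sketch of the boolean-vs-bitmask difference; BRW_FALLBACK_DRAW is the name from the commit, everything else (bit values, struct layout) is illustrative:

```c
#include <stdint.h>

/* Fallback reasons tracked as individual bits (values illustrative). */
#define INTEL_FALLBACK_TEXTURE     0x1
#define INTEL_FALLBACK_DRAW_BUFFER 0x2
#define BRW_FALLBACK_DRAW          0x4   /* new bit added by the commit */

struct intel_context {
   uint32_t Fallback;   /* bitmask of active fallback reasons */
};

static void
enter_draw_fallback(struct intel_context *intel)
{
   /* Wrong: "intel->Fallback = 1;" would clobber any bits already set.
    * Set only our own bit instead:
    */
   intel->Fallback |= BRW_FALLBACK_DRAW;
}

static void
leave_draw_fallback(struct intel_context *intel)
{
   intel->Fallback &= ~BRW_FALLBACK_DRAW;   /* other reasons stay set */
}
```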
The value was probably wrong too.
It was the same as INTEL_FALLBACK_DRAW_BUFFER.

otherwise

Also avoids an empty shader for "END" - this seems to be a somewhat
valid fp. Maybe this can be done differently in the future (fake
FRAG_RESULT_COLOR already in Map_Fragment_Program(), or is there a way
to program the chip not to hang in case of no exports?).

s/LERP/LRP/

strtod_l needs the xlocale.h header on Mac OS. It's possible other
non-Linux OSes would need this header too.
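A hedged sketch of the kind of conditional include this implies; the exact guard Mesa uses is not shown here, so treat the macro test as an assumption:

```c
/* strtod_l() is declared in <xlocale.h> on Mac OS, but in <stdlib.h>
 * (with _GNU_SOURCE) on glibc systems.  Which macro to test is an
 * assumption; the point is only that non-Linux platforms may need the
 * extra header.
 */
#include <stdlib.h>
#include <locale.h>
#if defined(__APPLE__)
#include <xlocale.h>
#endif
```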
The in-kernel texture check fails because both
bit 11 flags are set on 16x16 textures. It turns out
that these bits are still set and not cleared in the
pp_txpitch field of the texture. The attached patch
at least helps for this case on my machine. It clears
bit 11 from the pitch field if the texture is smaller
and masks out the high bits in the conventional width
and height fields.
Fixes bug 24584
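A rough, heavily hedged sketch of the masking being described; the bit positions, masks, and size threshold below are placeholders, not the real r300 register layout:

```c
#include <stdint.h>

/* Placeholder layout -- NOT the real r300 register definitions. */
#define TX_PITCH_BIT11    (1u << 11)  /* the "bit 11" flag in pp_txpitch */
#define TX_SIZE_MASK      0x7ffu      /* conventional width/height bits */
#define TX_HEIGHT_SHIFT   16

/* Clear the bit-11 flag from the pitch field when the texture is small
 * enough that the flag must not be set (threshold is an assumption).
 */
static uint32_t
sanitize_txpitch(uint32_t pp_txpitch, unsigned width, unsigned height)
{
   if (width < 2048 && height < 2048)
      pp_txpitch &= ~TX_PITCH_BIT11;
   return pp_txpitch;
}

/* Keep only the in-range bits of the width/height fields. */
static uint32_t
pack_txsize(unsigned width, unsigned height)
{
   return (width & TX_SIZE_MASK) | ((height & TX_SIZE_MASK) << TX_HEIGHT_SHIFT);
}
```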
The size was being calculated based on 3 bytes per pixel with 24-bit
depth instead of 4 bytes. This caused corruption in the bottom 25% of
objects. This finishes fixing the menu/text corruption in compiz/kde4.
Signed-off-by: Robert Noland <[email protected]>
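A small illustration of the arithmetic (function name hypothetical): a 24-bit depth buffer is stored in 32-bit units, so the per-pixel size is 4 bytes; multiplying by 3 allocates only 75% of the buffer, which matches corruption confined to the bottom quarter:

```c
#include <stddef.h>

/* Bytes needed for a depth buffer.  24-bit depth is padded to 32 bits
 * per pixel, so the multiplier is 4, not 3.
 */
static size_t
depth_buffer_size(unsigned width, unsigned height, unsigned depth_bits)
{
   unsigned cpp = (depth_bits == 16) ? 2 : 4;   /* 24-bit uses 4 bytes */
   return (size_t)width * height * cpp;
}

/* With the old "* 3" math, a 1024x768 buffer got 2359296 bytes instead
 * of the 3145728 it needs -- three quarters of the required size.
 */
```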
_mesa_strtod() is used for shader/program parsing where the decimal
point character is always '.'. Use strtod_l() with a "C" locale to
ensure correct string->double conversion when the actual locale uses
another character such as ',' for the decimal point.
Fixes bug 24531.
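A minimal sketch of the approach, glibc-flavoured and with a hypothetical wrapper name; Mesa's actual _mesa_strtod() may differ in details such as locale caching and the xlocale.h include mentioned above:

```c
#define _GNU_SOURCE        /* newlocale()/strtod_l() on glibc */
#include <locale.h>
#include <stdlib.h>

/* Parse a double with the "C" locale so '.' is always the decimal
 * point, no matter what LC_NUMERIC the application has set.
 * The cached locale is created once; a real implementation would also
 * worry about thread safety and freeing it.
 */
double
parse_float_c_locale(const char *s, char **end)
{
   static locale_t c_locale = (locale_t) 0;

   if (c_locale == (locale_t) 0)
      c_locale = newlocale(LC_NUMERIC_MASK, "C", (locale_t) 0);

   return strtod_l(s, end, c_locale);
}
```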