Reported by Karl Schultz.
This brings swrast's support up to the state of gallium, and fixes the
default pixel-center behavior of fragment.position.xy in piglit
fp-arb-fragment-coord-conventions-none.
The extension is not currently enabled because the GLSL part of the
extension isn't supported, so piglit
glsl-arb-fragment-coord-conventions-define fails, as would any serious
test of the GLSL part.
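As a point of reference, here is a minimal sketch (not Mesa code; the names are made up) of what the pixel-center convention means for fragment.position.xy: with the conventional lower-left, half-integer-center origin a fragment covering pixel (px, py) reports (px + 0.5, py + 0.5), and the extension's pixel_center_integer option drops the 0.5 offset.

    #include <stdbool.h>

    /* Illustrative helper only; swrast's real logic lives in its fragment
     * setup code, not in a function like this. */
    static void
    frag_coord_xy(int px, int py, bool pixel_center_integer,
                  float *x, float *y)
    {
       /* default convention: pixel centers at half-integer coordinates */
       const float center = pixel_center_integer ? 0.0f : 0.5f;
       *x = (float) px + center;
       *y = (float) py + center;
    }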
Conflicts:
src/gallium/auxiliary/draw/draw_context.c
src/gallium/auxiliary/draw/draw_pt_fetch_shade_pipeline.c
src/gallium/auxiliary/pipebuffer/Makefile
src/gallium/auxiliary/pipebuffer/SConscript
src/gallium/auxiliary/pipebuffer/pb_buffer_fenced.c
src/gallium/auxiliary/tgsi/tgsi_scan.c
src/gallium/drivers/i915/i915_surface.c
src/gallium/drivers/i915/i915_texture.c
src/gallium/drivers/llvmpipe/lp_setup.c
src/gallium/drivers/llvmpipe/lp_tex_sample_c.c
src/gallium/drivers/llvmpipe/lp_texture.c
src/gallium/drivers/softpipe/sp_prim_vbuf.c
src/gallium/state_trackers/xorg/xorg_dri2.c
src/gallium/winsys/drm/intel/gem/intel_drm_api.c
src/gallium/winsys/drm/nouveau/drm/nouveau_drm_api.c
src/gallium/winsys/drm/radeon/core/radeon_drm.c
src/gallium/winsys/drm/vmware/core/vmw_screen_dri.c
src/mesa/state_tracker/st_cb_clear.c
We were calling this from the CI span function, but not the RGBA
span function.
I don't know of a test program for the GL_EXT_depth_bounds_test
extension...
The integer Z clamping range depends on the number of bits
in the Z buffer because that's the scale factor used when we
transform NDC coords by the viewport/depth range.
Fixes fd.o bug #25972 but only for Z buffers up to a depth
of 30 bits. Beyond that we get into messy integer overflow
issues and things fall apart.
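A rough sketch of the relationship described above, with illustrative names rather than the actual Mesa viewport code: the scale applied to NDC z is derived from the Z buffer depth, so the integer clamp limits have to use the same value.

    /* Illustrative: depth-range scale/translate for a zBits-deep buffer
     * (valid for zBits up to 31; larger depths hit the overflow problems
     * mentioned above). */
    static void
    depth_range_scale(unsigned zBits, double nearVal, double farVal,
                      double *scale, double *translate)
    {
       /* e.g. 0xffffff for a 24-bit Z buffer */
       const double zMax = (double) ((1u << zBits) - 1u);

       *scale     = (farVal - nearVal) * zMax / 2.0;
       *translate = (farVal + nearVal) * zMax / 2.0;
    }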
Conflicts:
src/mesa/main/dd.h
Signed-off-by: Chia-I Wu <[email protected]>
Conflicts:
src/gallium/auxiliary/pipebuffer/pb_buffer_fenced.c
src/gallium/auxiliary/util/Makefile
src/gallium/drivers/r300/r300_state_derived.c
Should fix fdo bug 25837.
When we have integer-valued texture formats, the texture border color
must also be able to store signed and unsigned integer values.
With GL 3.0, the new glTexParameterIiv() and glTexParameterIuiv() functions
can set the border color to signed (int) or unsigned (uint) values.
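For reference, a client-side usage sketch of the entry points named above (not Mesa internals; assumes a GL 3.0 context with prototypes available and error handling done elsewhere):

    #define GL_GLEXT_PROTOTYPES 1
    #include <GL/gl.h>
    #include <GL/glext.h>

    /* Set a signed, then an unsigned, integer border color on the currently
     * bound 2D texture. */
    static void
    set_integer_border_color(void)
    {
       const GLint  iborder[4]  = { 0, 0, 0, 127 };
       const GLuint uiborder[4] = { 0u, 0u, 0u, 255u };

       glTexParameterIiv(GL_TEXTURE_2D, GL_TEXTURE_BORDER_COLOR, iborder);
       glTexParameterIuiv(GL_TEXTURE_2D, GL_TEXTURE_BORDER_COLOR, uiborder);
    }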
Conflicts:
configs/darwin
src/gallium/auxiliary/util/u_clear.h
src/gallium/state_trackers/xorg/xorg_exa_tgsi.c
src/mesa/drivers/dri/i965/brw_draw_upload.c
When using multiple color drawbuffers with blending/logicop/masking we
were overwriting color values which we still needed.
This is part of the GL_EXT_draw_buffers2 extension and part of GL 3.0.
The ctx->Color.ColorMask field is now a 2-D array. Until drivers are
modified to support per-buffer color masking, they can just look at
the 0th color mask.
The new _mesa_ColorMaskIndexed() function will be called by
glColorMaskIndexedEXT() or glColorMaski().
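A usage sketch from the application side, assuming a context that exposes GL_EXT_draw_buffers2 or GL 3.0 (as noted above, both entry points route to the same Mesa function):

    #define GL_GLEXT_PROTOTYPES 1
    #include <GL/gl.h>
    #include <GL/glext.h>

    /* Keep the full mask on every draw buffer, then turn off blue/alpha
     * writes on draw buffer 1 only. */
    static void
    mask_second_draw_buffer(void)
    {
       glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
       glColorMaskIndexedEXT(1, GL_TRUE, GL_TRUE, GL_FALSE, GL_FALSE);
    }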
ctx->Color.BlendEnabled is now a GLbitfield instead of a GLboolean to
indicate blend on/off status for each color/draw buffer.
This is infrastructure for GL_EXT_draw_buffers2 and OpenGL 3.x.
New functions include _mesa_EnableIndexed(), _mesa_DisableIndexed(), and
_mesa_IsEnabledIndexed(). The enable function corresponds to
glEnableIndexedEXT() for GL_EXT_draw_buffers2 or glEnablei() for GL3.
Note that there are quite a few tests for ctx->Color.BlendEnabled != 0 in
drivers, etc. Those tests can remain as-is since the mask will be 0 or ~0
unless GL_EXT_draw_buffers2 is enabled.
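A usage sketch of the indexed enables from the application side (assumes GL_EXT_draw_buffers2; glEnablei()/glDisablei() are the GL 3.x names):

    #define GL_GLEXT_PROTOTYPES 1
    #include <GL/gl.h>
    #include <GL/glext.h>

    /* Enable blending on draw buffer 0 and disable it on draw buffer 1. */
    static void
    set_per_buffer_blend(void)
    {
       glEnableIndexedEXT(GL_BLEND, 0);
       glDisableIndexedEXT(GL_BLEND, 1);
    }

Until an application uses these, the BlendEnabled bitfield is either 0 or ~0, which is why the existing driver checks keep working unchanged.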
Conflicts:
src/gallium/drivers/softpipe/sp_quad_blend.c
Add a GLbitfield64 type and several macros to operate on 64-bit
fields. The OutputsWritten field of gl_program is changed to use that
type. This results in a fair amount of fallout in drivers that use
programs.
No changes are strictly necessary at this point as all bits used are
below the 32-bit boundary. Fairly soon several bits will be added for
clip distances written by a vertex shader. This will cause several
bits used for varyings to be pushed above the 32-bit boundary. This
will affect any drivers that support GLSL.
At this point, only the i965 driver has been modified to support this
eventuality.
I did this as a "squash" merge. There were several places throughout the
outputswritten64 branch where things were broken, and I foresee this
causing difficulties for bisecting later. The history is still
available in the branch.
Conflicts:
src/mesa/drivers/dri/i965/brw_wm.h
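Returning to the GLbitfield64 change itself, here is a minimal sketch of the kind of 64-bit type and helper it introduces; the macro and function names are illustrative, not necessarily the ones added to Mesa.

    #include <stdbool.h>
    #include <stdint.h>

    typedef uint64_t GLbitfield64;

    /* The cast matters: it keeps the shift from being evaluated in 32-bit
     * int, so bits at position 32 and above work as expected. */
    #define BITFIELD64_BIT(b) ((GLbitfield64) 1 << (b))

    /* e.g. testing whether a program writes output slot 40, which the old
     * 32-bit OutputsWritten field could not represent. */
    static bool
    writes_output(GLbitfield64 outputs_written, unsigned slot)
    {
       return (outputs_written & BITFIELD64_BIT(slot)) != 0;
    }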
I'd written a testcase for the hard part of the extension enablement, so
naturally the easy stuff was completely broken. There are still issues,
as I'm seeing FLOAT_TO_UINT(max_f) == 0x0 when max_f == 1.0, but it gets
piglit depth-clamp-range closer to success.
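A speculative illustration of the FLOAT_TO_UINT symptom (a guess at the failure mode, not an analysis of Mesa's actual macro): 0xffffffff is not exactly representable in single precision, so scaling 1.0 by it rounds up to 2^32, and converting that back to a 32-bit unsigned integer wraps or is undefined, which is one way a max of 1.0 can come out as 0.

    #include <inttypes.h>
    #include <stdint.h>
    #include <stdio.h>

    int
    main(void)
    {
       /* The literal 4294967295.0f already rounds to 4294967296.0f. */
       float scaled = 1.0f * 4294967295.0f;
       printf("%.1f\n", (double) scaled);            /* prints 4294967296.0 */

       /* (uint32_t) scaled would be out of range (undefined behaviour);
        * clamping first gives a well-defined result. */
       uint32_t safe = (scaled >= 4294967295.0f) ? 0xffffffffu
                                                 : (uint32_t) scaled;
       printf("0x%08" PRIx32 "\n", safe);            /* prints 0xffffffff */
       return 0;
    }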
Conflicts:
src/mesa/drivers/dri/radeon/radeon_fbo.c
src/mesa/drivers/dri/s3v/s3v_tex.c
src/mesa/drivers/dri/s3v/s3v_xmesa.c
src/mesa/drivers/dri/trident/trident_context.c
src/mesa/main/debug.c
src/mesa/main/mipmap.c
src/mesa/main/texformat.c
src/mesa/main/texgetimage.c
Fix backward component ordering for RGB textures.
Only optimize the RGBA texture case when running on a little-endian host.
This restriction could be lifted with a little work.
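A sketch of why the fast path is little-endian only (illustrative, not the actual texstore code): reading four R, G, B, A bytes as one 32-bit word only yields the packed value the code expects when the host byte order cooperates.

    #include <stdint.h>
    #include <string.h>

    /* On a little-endian host the first byte in memory ends up in the least
     * significant bits of a 32-bit load, so an R,G,B,A byte array maps
     * directly onto a packed texel; on a big-endian host the components come
     * out reversed, hence the restriction. */
    static uint32_t
    load_rgba_bytes_as_word(const uint8_t rgba[4])
    {
       uint32_t texel;
       memcpy(&texel, rgba, sizeof texel);  /* host byte order */
       return texel;                        /* little-endian: R in bits 0..7 */
    }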
_ActualFormat is replaced by Format (MESA_FORMAT_x).
ColorEncoding, ComponentType, RedBits, GreenBits, BlueBits, etc. are
all replaced by MESA_FORMAT_x queries.
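In other words, per-format attributes now come from queries keyed on the MESA_FORMAT_x enum instead of per-renderbuffer fields. A hedged sketch of that direction, using made-up names rather than Mesa's actual query functions:

    /* Hypothetical format-info table keyed by a format enum; illustrative
     * of the approach, not the real Mesa tables. */
    enum example_format { EXAMPLE_FORMAT_RGB565, EXAMPLE_FORMAT_ARGB8888 };

    struct example_format_info {
       unsigned red_bits, green_bits, blue_bits, alpha_bits;
    };

    static const struct example_format_info example_format_table[] = {
       [EXAMPLE_FORMAT_RGB565]   = { 5, 6, 5, 0 },
       [EXAMPLE_FORMAT_ARGB8888] = { 8, 8, 8, 8 },
    };

    /* Replaces field accesses like rb->RedBits. */
    static unsigned
    example_format_red_bits(enum example_format f)
    {
       return example_format_table[f].red_bits;
    }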
Need to be careful with component ordering for MESA_FORMAT_RGB888
and MESA_FORMAT_RGBA8888.
Removed: MESA_FORMAT_RGBA, RGB, ALPHA, LUMINANCE, LUMINANCE_ALPHA, INTENSITY.
Now gl_texture_image::TexFormat is a simple MESA_FORMAT_x enum.
ctx->Driver.ChooseTextureFormat() also returns a MESA_FORMAT_x.
gl_texture_format will go away next.
This ensures the driver won't map the wrong set of textures.