author      Francisco Jerez <[email protected]>    2012-03-18 23:59:33 +0100
committer   Francisco Jerez <[email protected]>    2012-05-11 12:39:42 +0200
commit      57c048f291ffcb97d7df3177d92f9634e510dcc0 (patch)
tree        16f1a407d28ea263fb396fb03a5db84d341ecdd2 /src/gallium/docs
parent      2644952bd4dfa3b75112dee8dfd287a12d770705 (diff)
gallium/compute: Drop TGSI dependency.
Add a shader cap for specifying the preferred shader representation.
Right now the only supported value is TGSI; other enum values will be
added as they are needed.
This is mainly to accommodate AMD's LLVM compiler back-end by letting
it bypass the TGSI representation for compute programs. Other drivers
will keep using the common TGSI instruction set.
Reviewed-by: Tom Stellard <[email protected]>
Diffstat (limited to 'src/gallium/docs')
-rw-r--r--   src/gallium/docs/source/screen.rst   2
1 file changed, 2 insertions, 0 deletions
diff --git a/src/gallium/docs/source/screen.rst b/src/gallium/docs/source/screen.rst
index 8e4584023df..d912dc6d81d 100644
--- a/src/gallium/docs/source/screen.rst
+++ b/src/gallium/docs/source/screen.rst
@@ -185,6 +185,8 @@ to be 0.
   If unsupported, only float opcodes are supported.
 * ``PIPE_SHADER_CAP_MAX_TEXTURE_SAMPLERS``: THe maximum number of texture
   samplers.
+* ``PIPE_SHADER_CAP_PREFERRED_IR``: Preferred representation of the
+  program.  It should be one of the ``pipe_shader_ir`` enum values.
 
 .. _pipe_compute_cap:
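
For illustration only, here is a minimal sketch (not part of this commit) of how a driver might answer the new cap from its pipe_screen::get_shader_param hook. The function name example_get_shader_param is hypothetical, and PIPE_SHADER_IR_TGSI is assumed to be the TGSI member of the ``pipe_shader_ir`` enum referenced in the diff above.

/* Sketch: reporting the preferred shader IR from a Gallium driver.
 * example_get_shader_param is a made-up name; PIPE_SHADER_IR_TGSI is
 * assumed to be the TGSI enumerator of pipe_shader_ir. */
#include "pipe/p_defines.h"
#include "pipe/p_screen.h"

static int
example_get_shader_param(struct pipe_screen *screen, unsigned shader,
                         enum pipe_shader_cap param)
{
   switch (param) {
   case PIPE_SHADER_CAP_PREFERRED_IR:
      /* Drivers using the common code paths keep preferring TGSI; a
       * back-end such as AMD's LLVM compiler could return a different
       * pipe_shader_ir value here once one is added. */
      return PIPE_SHADER_IR_TGSI;
   default:
      return 0; /* other shader caps elided for brevity */
   }
}

A state tracker would then query this cap like any other shader cap, e.g. screen->get_shader_param(screen, shader, PIPE_SHADER_CAP_PREFERRED_IR), and hand the compute program to the driver in the representation it reports.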