author | Ian Romanick <[email protected]> | 2016-11-08 11:06:05 -0800
---|---|---
committer | Ian Romanick <[email protected]> | 2016-11-10 10:57:59 -0800
commit | e85a747e294762785df2ce8a299c153254c6fca2 (patch) |
tree | 336aa8c0062cde342d3b152955e5c3e8e97806f4 /src/gallium |
parent | cbba5e13acc2052349f7e2d304e6dddf31fb199a (diff) |
glcpp: Handle '#version 0' and other invalid values
The #version directive can only handle decimal constants. Enforce that
the value is a decimal constant.
Section 3.3 (Preprocessor) of the GLSL 4.50 spec says:
The language version a shader is written to is specified by
#version number profile_opt
where number must be a version of the language, following the same
convention as __VERSION__ above.
The same section also says:
__VERSION__ will substitute a decimal integer reflecting the version
number of the OpenGL shading language.
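A minimal sketch of what enforcing the decimal-constant rule could look like, assuming a helper that validates the #version argument before it is converted to an integer. The function name and the exact rules below are illustrative, not the actual glcpp code:

```c
/* Illustrative sketch only; glcpp's real check lives in its parser and may
 * differ.  Treat a decimal integer constant as a non-empty run of digits
 * whose first digit is non-zero, so "0", "012", and "1e2" are all rejected. */
#include <ctype.h>
#include <stdbool.h>

static bool
version_token_is_decimal_constant(const char *str)
{
   if (!isdigit((unsigned char) str[0]) || str[0] == '0')
      return false;

   for (const char *p = str + 1; *p != '\0'; p++) {
      if (!isdigit((unsigned char) *p))
         return false;
   }

   return true;
}
```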
Use a separate flag to track whether or not the #version line has been
encountered. Any possible sentinel value (0 is currently used) could itself
appear in a #version directive. That would lead to an attempt to
(internally) redefine __VERSION__, and since there is no parser location
for that redefinition, NULL is passed. This eventually results in a NULL
pointer dereference and a segfault.
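A hedged sketch of the "separate flag" idea; the struct and field names here are hypothetical, not glcpp's actual parser state:

```c
/* Illustrative only: a dedicated flag cannot be spoofed by any value the
 * shader author writes, unlike a sentinel such as 0 or -1 stored in the
 * version field itself. */
#include <stdbool.h>

struct version_state {
   unsigned version;       /* value from the #version directive, if any */
   bool version_resolved;  /* true once a #version line has been handled */
};

static void
handle_version_directive(struct version_state *state, unsigned version)
{
   /* Only the first #version directive takes effect. */
   if (state->version_resolved)
      return;

   state->version = version;
   state->version_resolved = true;
}
```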
Attempts to use -1 as the sentinel would also fail if '#version
4294967295' or '#version 18446744073709551615' were used. We should
have piglit tests for both of these.
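To see why a -1 sentinel is fragile, here is a small standalone demonstration (not part of the commit, and not glcpp code): on common ILP32/LP64 platforms both of the values named above wrap to -1 when stored in a signed integer of the matching width, so a shader could forge the "no #version seen" state.

```c
/* Standalone demonstration: both directive values above can alias a -1
 * sentinel once stored in a signed integer of the matching width.  The
 * signed conversions are implementation-defined in C, but wrap on typical
 * two's-complement platforms. */
#include <stdio.h>
#include <stdlib.h>

int
main(void)
{
   /* 4294967295 is UINT32_MAX; converting it to a 32-bit int typically
    * yields -1. */
   int v32 = (int) strtoul("4294967295", NULL, 10);

   /* 18446744073709551615 is UINT64_MAX; converting it to a 64-bit long
    * typically yields -1 as well. */
   long v64 = (long) strtoull("18446744073709551615", NULL, 10);

   printf("%d %ld\n", v32, v64);
   return 0;
}
```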
Signed-off-by: Ian Romanick <[email protected]>
Bugzilla: https://bugs.freedesktop.org/show_bug.cgi?id=97420
Reviewed-by: Nicolai Hähnle <[email protected]>
Cc: [email protected]
Cc: Juan A. Suarez Romero <[email protected]>
Cc: Karol Herbst <[email protected]>
Diffstat (limited to 'src/gallium')
0 files changed, 0 insertions, 0 deletions