| author | 2013-11-04 16:43:03 -0800 |
|---|---|
| committer | 2013-11-04 16:43:03 -0800 |
| commit | 497ba0e08503806571b52ebe27cc7eee4c0e71a7 (patch) |
| tree | 0edeb7b6cce3fa669fb45be3ef3a1dd6febde936 /libs/ui/FramebufferNativeWindow.cpp |
| parent | 40da5283ebc6b5cf1e3820740dc274c47cc55f6d (diff) |
Don't use implementation-defined format with CPU consumers
If the virtual display surface is being consumed by the CPU, it can't
be allowed to use HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED, since there
is no way for the CPU consumer to find out what format gralloc
actually chose. So for CPU-consumer surfaces, just use the
BufferQueue's default format, which can be set by the consumer.
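
To illustrate the decision the message describes, here is a minimal,
self-contained C++ sketch. The constant values mirror the real
`hardware/gralloc.h` and `system/graphics.h` headers, but
`pickOutputFormat` is a hypothetical helper, not the actual patch (which
lands in SurfaceFlinger's VirtualDisplaySurface, outside the file this
view is filtered to):

```cpp
#include <cstdint>
#include <cstdio>

// Values mirror Android's hardware/gralloc.h and system/graphics.h.
static const uint32_t GRALLOC_USAGE_SW_READ_MASK  = 0x0000000F;
static const uint32_t GRALLOC_USAGE_SW_WRITE_MASK = 0x000000F0;
static const int HAL_PIXEL_FORMAT_RGBA_8888              = 1;
static const int HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED = 0x22;

// Hypothetical helper: if the consumer reads or writes buffers with
// the CPU, fall back to the BufferQueue's default format so the
// consumer knows the pixel layout; otherwise let gralloc pick an
// opaque, implementation-defined format.
static int pickOutputFormat(uint32_t consumerUsage, int defaultFormat) {
    if (consumerUsage & (GRALLOC_USAGE_SW_READ_MASK |
                         GRALLOC_USAGE_SW_WRITE_MASK)) {
        return defaultFormat;
    }
    return HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED;
}

int main() {
    // GPU-only sink: gralloc may choose whatever format it likes.
    printf("GPU sink -> 0x%x\n",
           pickOutputFormat(0, HAL_PIXEL_FORMAT_RGBA_8888));
    // CPU sink (e.g. a CpuConsumer doing SW reads): use the default.
    printf("CPU sink -> 0x%x\n",
           pickOutputFormat(GRALLOC_USAGE_SW_READ_MASK,
                            HAL_PIXEL_FORMAT_RGBA_8888));
    return 0;
}
```

On the consumer side, the default format the sketch falls back to is the
one the consumer can set on its BufferQueue (e.g. via
`CpuConsumer::setDefaultBufferFormat`).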
A better but more invasive change would be to let the consumer require
a certain format (or set of formats?), and disallow the producer from
requesting a different format.
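
As a rough illustration of that alternative (entirely hypothetical; no
such API existed in libgui at the time of this commit), a
consumer-required format would amount to a producer-side check at
dequeue time, sketched here with a mock class:

```cpp
#include <stdexcept>

// Hypothetical, simplified stand-in for a BufferQueue; not real
// libgui API.
class MockBufferQueue {
public:
    // Consumer side: pin the format producers must use
    // (0 = no requirement).
    void setConsumerRequiredFormat(int format) { mRequiredFormat = format; }

    // Producer side: reject dequeue requests for any other format.
    void dequeueBuffer(int requestedFormat) {
        if (mRequiredFormat != 0 && requestedFormat != mRequiredFormat) {
            throw std::invalid_argument(
                "producer format does not match consumer requirement");
        }
        // ... allocate or recycle a buffer of requestedFormat ...
    }

private:
    int mRequiredFormat = 0;
};

int main() {
    MockBufferQueue bq;
    bq.setConsumerRequiredFormat(1);  // e.g. HAL_PIXEL_FORMAT_RGBA_8888
    bq.dequeueBuffer(1);       // OK: matches the consumer's requirement
    // bq.dequeueBuffer(0x22); // would throw: IMPLEMENTATION_DEFINED rejected
    return 0;
}
```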
Bug: 11479817
Change-Id: I5b20ee6ac1146550e8799b806e14661d279670c0
Diffstat (limited to 'libs/ui/FramebufferNativeWindow.cpp')
0 files changed, 0 insertions, 0 deletions