gpu: nvgpu: fix error_type_index decoding

Priv errors are reported in the form 0xBADF-TYPE[3:0]-TYPEINDEX[3:0]-YY.
To get the error TYPEINDEX, shift right by 8 instead of 16; shifting by
16 always results in 0 because the masked field sits in bits [11:8].
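
As a minimal stand-alone sketch (hypothetical helper name and example value,
not the driver code), the decode and the effect of the wrong shift:

#include <stdint.h>
#include <stdio.h>

/* TYPEINDEX occupies bits [11:8] of the error code
 * (0xBADF | TYPE[3:0] | TYPEINDEX[3:0] | YY), so it is masked with
 * 0x00000f00U and shifted right by 8. Shifting the masked value right
 * by 16 moves every surviving bit out of the word, giving 0.
 */
static uint32_t decode_error_type_index(uint32_t error_code)
{
	return (error_code & 0x00000f00U) >> 8U;
}

int main(void)
{
	uint32_t error_code = 0xBADF1300U; /* TYPE = 1, TYPEINDEX = 3, YY = 0 */

	printf("shift by 16: %u\n", (error_code & 0x00000f00U) >> 16U); /* prints 0 */
	printf("shift by 8:  %u\n", decode_error_type_index(error_code)); /* prints 3 */
	return 0;
}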

Fix Coverity ID 9748860

Change-Id: I2eee5c0cdd87d318a2ef670b6329de984158ed1e
Signed-off-by: Seema Khowala <seemaj@nvidia.com>
Reviewed-on: https://git-master.nvidia.com/r/1995783
Reviewed-by: Terje Bergstrom <tbergstrom@nvidia.com>
Reviewed-by: svc-mobile-coverity <svc-mobile-coverity@nvidia.com>
Reviewed-by: svc-mobile-misra <svc-mobile-misra@nvidia.com>
Reviewed-by: svc-misra-checker <svc-misra-checker@nvidia.com>
GVS: Gerrit_Virtual_Submit
Reviewed-by: Alex Waterman <alexw@nvidia.com>
Reviewed-by: mobile promotions <svcmobile_promotions@nvidia.com>
Tested-by: mobile promotions <svcmobile_promotions@nvidia.com>

@@ -69,7 +69,7 @@ void gp10b_priv_ring_decode_error_code(struct gk20a *g,
 {
 	u32 error_type_index;
 
-	error_type_index = (error_code & 0x00000f00U) >> 16U;
+	error_type_index = (error_code & 0x00000f00U) >> 8U;
 	error_code = error_code & 0xBADFf000U;
 	if (error_code == 0xBADF1000U) {
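
For context, a hedged sketch of how the two decoded pieces can be used
together: the masked error_code selects the error class, and
error_type_index indexes that class's description table. The names and
strings below are placeholders, not the nvgpu source.

#include <stdint.h>
#include <stdio.h>

static const char *const badf1_errors[] = {
	"class-1 error 0 (placeholder)",
	"class-1 error 1 (placeholder)",
	"class-1 error 2 (placeholder)",
};

static void report_priv_error(uint32_t error_code)
{
	uint32_t error_type_index = (error_code & 0x00000f00U) >> 8U;

	error_code &= 0xBADFf000U;
	if (error_code == 0xBADF1000U &&
	    error_type_index < sizeof(badf1_errors) / sizeof(badf1_errors[0])) {
		printf("priv error: %s\n", badf1_errors[error_type_index]);
	} else {
		printf("priv error: unrecognized code 0x%x\n",
		       (unsigned int)error_code);
	}
}

int main(void)
{
	report_priv_error(0xBADF1100U); /* class 1, index 1 */
	return 0;
}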