Yes, and to be honest, you should aim to get one anyway. As we just said, 8-bit is very 1980s. In an age of 4K HDR, you really want a 10-bit display to get the full benefit of modern graphics and content.
Games for contemporary PCs and modern consoles all render in 10-bit at a minimum, and HDR is becoming universal. Of course, they’ll work just fine with a low-cost 8-bit panel, but you’ll miss out. Even more expensive 8-bit monitors and TVs with HDR support have limitations. For example, on Xbox One X a dithered 8-bit display (which simulates 10-bit as best it can) can only work with basic HDR10. Proper 10-bit displays open up Dolby Vision and HDR10+ options.
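To make the dithering idea concrete, here’s a rough Python sketch of frame rate control (FRC), the trick an 8-bit-plus-dithering panel uses to approximate 10-bit: it alternates between two neighbouring 8-bit shades so the average over a few frames lands near the intended 10-bit value. The function name, frame count, and rounding here are purely illustrative assumptions, not how any specific panel or console actually implements it.

```python
# Minimal sketch of temporal dithering (FRC): approximate a 10-bit shade
# by cycling between two adjacent 8-bit shades across successive frames.

def frc_frames(level_10bit: int, num_frames: int = 4) -> list[int]:
    """Return the 8-bit values shown over num_frames for one 10-bit level."""
    base, remainder = divmod(level_10bit, 4)   # four 10-bit steps per 8-bit step
    base = min(base, 255)                      # clamp to the 8-bit range
    # Show the next-brighter 8-bit shade on `remainder` of the frames.
    return [min(base + 1, 255) if f < remainder else base for f in range(num_frames)]


if __name__ == "__main__":
    target = 513                               # a 10-bit shade with no exact 8-bit match
    frames = frc_frames(target)
    print(frames)                              # [129, 128, 128, 128]
    print(sum(frames) / len(frames) * 4)       # time average in 10-bit terms: 513.0
```

The eye averages the flicker into something close to the missing shade, which is why dithered 8-bit is a decent stand-in for basic HDR10, but it’s still an approximation rather than a true 10-bit signal path.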
In that regard gaming’s no different from serious movie watching, streaming, photography, or video editing. For all of them, source content keeps increasing in detail and quality. Obviously, the display you use should keep up with the content, not stay stuck in the past. That means 10-bit or more, since 8-bit, while reliable and proven, simply doesn’t show you the whole picture.
With 10-bit you get a more detailed image, and as resolutions increase there’s more detail to display. Force a game onto an 8-bit panel and you’ll get less detailed darks, washed-out or banded brights, and approximated textures instead of the ones the artists intended. The difference may not be shocking, but it’s becoming increasingly important.
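To put a number on the banding point: 8-bit gives 256 shades per colour channel while 10-bit gives 1,024, and the shortfall shows up most in subtle gradients such as dark skies or shadow detail. The short Python sketch below is only a back-of-the-envelope illustration (the gradient range and sample count are arbitrary assumptions); it counts how many distinct steps a narrow dark gradient keeps at each bit depth.

```python
# Rough illustration of banding: quantize a narrow dark gradient at two
# bit depths and count how many distinct shades survive.

def count_steps(bit_depth: int, start: float = 0.0, end: float = 0.1,
                samples: int = 2000) -> int:
    """Count the distinct levels a narrow gradient keeps at a given bit depth."""
    levels = (1 << bit_depth) - 1              # 255 for 8-bit, 1023 for 10-bit
    quantized = {round((start + (end - start) * i / (samples - 1)) * levels)
                 for i in range(samples)}
    return len(quantized)


if __name__ == "__main__":
    # A gradient covering the darkest 10% of the range (think shadows in a game).
    print("8-bit steps: ", count_steps(8))     # about 27 distinct shades -> visible bands
    print("10-bit steps:", count_steps(10))    # about 103 distinct shades -> far smoother
```

Roughly four times as many steps over the same range is exactly why 10-bit gradients stay smooth where 8-bit ones break into bands.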