The Vanishing Pixel (Hardware Upgrade)
I bought an Acer XB280HK last week. It's a horrible display, but the first one I could afford that had 4K resolution, a 60 Hz refresh rate, and G-Sync. For G-Sync to work I had to upgrade my graphics card too, so I ended up picking up a GTX 960. Together they manage to compensate for quite a lot of the trouble I've had with the display and its early software support.
A lot of software just freaks out at the large resolution, and VLC is unable to play videos that match the display's native resolution. On top of that, my Ubuntu desktop isn't taking it well: due to the higher pixel density, some programs get their font rendering wrong. Every desktop application has its own idea of the system font's dimensions, and resizing the fonts also resizes their metrics. Still, whenever fonts do render correctly, they're sharper than before.
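The mismatch comes down to pixel density: most desktop toolkits lay fonts out as if the screen were 96 DPI, while this panel is much denser. A minimal sketch of the arithmetic (the XB280HK is a 28-inch 3840×2160 panel; the 96 DPI baseline is the common toolkit assumption, not something any particular program is guaranteed to use):

```python
import math

# Physical pixel density of the panel (28" diagonal, 3840x2160 pixels).
width_px, height_px, diagonal_in = 3840, 2160, 28.0
ppi = math.hypot(width_px, height_px) / diagonal_in

# Many toolkits assume text is laid out for 96 DPI; the ratio suggests
# how much the system font (and its metrics) would need to be scaled
# to appear at the intended physical size.
baseline_dpi = 96.0
scale = ppi / baseline_dpi

print(f"panel density: {ppi:.0f} PPI, suggested scale: {scale:.2f}x")
```

Because every application applies (or ignores) a scale like this independently, font sizes and metrics end up inconsistent across the desktop.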
The stand lets every motion on the table shake the monitor, and the panel itself has barely noticeable backlight bleed on the bottom and left edges. The display cable forced me to lift my computer onto the table. Compared to hardware I've owned before, it's not a good monitor.
What makes this noteworthy for my blog are the high-frequency patterns in photos: there's simply more to look at. The display isn't dramatically denser, but it's dense enough to make a difference. I've been looking at photos and graphics on my desktop and benchmarking my new system, and I have a hunch that increasing display densities will bring a revolution in graphics.
The display density isn't enough to eliminate pixel aliasing, but the aliasing patterns look strikingly different from the coarser patterns on my old display. Antialiased lines and edges at various angles look much nicer than they did, which makes existing graphics appear noticeably different.
Bad pixel graphics look even worse than before, while good ones actually improve if I run them through an hqx filter. Perhaps good pixel art is simply less affected by aliasing artifacts in how it looks?
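For contrast with hqx, which is a family of edge-aware magnification filters (hq2x/hq3x/hq4x) that interpolate along detected edges, here is what pixel art gets without any such filter: plain nearest-neighbor scaling, where each source pixel is simply repeated. This toy version works on a 2-D grid of values rather than a real image:

```python
def nearest_neighbor(pixels, factor):
    """Upscale a 2-D grid of pixel values by an integer factor,
    repeating each pixel horizontally and each row vertically."""
    return [
        [row[x // factor] for x in range(len(row) * factor)]
        for row in pixels
        for _ in range(factor)
    ]

# A tiny 2x2 checkerboard, scaled 2x: every "pixel" becomes a 2x2 block,
# so stair-step aliasing is preserved exactly as drawn.
art = [[0, 1],
       [1, 0]]
for row in nearest_neighbor(art, 2):
    print(row)
```

Nearest-neighbor keeps blocky edges blocky, which is exactly what a denser display makes more visible; hqx instead smooths the diagonals while keeping flat areas crisp.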
Vector graphics seem noticeably better for symbols on a dense display, but they lack the high frequency detail I enjoy in the photographs and some of the pixel art.
G-Sync is NVIDIA's brand for variable refresh rate. It makes the display a lot more forgiving of latency spikes in games. I managed to drag the framerate down in Half-Life 2: Lost Coast with the water effects. A sudden drop to 24 FPS is still noticeable, but it slows down smoothly. During play I didn't notice any jitter, and every frame was whole. Oddly, while I was playing, the game itself somehow flipped its camera for one frame and showed something entirely different for a brief moment.
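Why the slowdown feels smooth: with a fixed 60 Hz refresh and vsync, a frame that misses the 16.7 ms deadline has to wait for the next whole refresh, so the displayed rate jumps straight from 60 to 30 FPS; a variable-refresh panel instead scans out as soon as the frame is ready. A rough sketch of that pacing difference (ignoring the panel's supported refresh range and driver details):

```python
import math

def present_time(render_ms, vsync_hz=60, vrr=False):
    """Milliseconds between displayed frames for a given render time."""
    if vrr:
        # Variable refresh: the panel refreshes when the frame is done
        # (within its supported range, which this sketch ignores).
        return render_ms
    # Fixed vsync: a late frame waits for the next whole refresh interval.
    interval = 1000.0 / vsync_hz
    return math.ceil(render_ms / interval) * interval

# A frame that takes 20 ms to render (50 FPS worth of work):
print(present_time(20))            # fixed vsync: about 33.3 ms, i.e. 30 FPS
print(present_time(20, vrr=True))  # variable refresh: 20 ms, i.e. 50 FPS
```

So under fixed vsync the framerate falls in hard steps (60, 30, 20 FPS...), while with variable refresh it degrades continuously, which matches how the drop to low framerates felt gradual rather than abrupt.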
With G-Sync on there are occasionally some other bugs. If I make the Chrome web browser fullscreen, G-Sync kicks in and sometimes the display flashes a bit. I don't know whether the problem is the monitor, G-Sync, or Chrome.
I can barely wait for 8K panels to show up.