DragWx wrote: ↑Fri May 26, 2023 6:45 pm
I think, apart from a handful of cranky people, we're all excited and looking forward to the CX16 (look at the introductions forum for example), even if we're not saying it with every post in every thread.
I do apologize if I came across as cranky. I promise, that was not my intention.
DragWx wrote: ↑Fri May 26, 2023 6:45 pm
a developer can either leave up to the VERA to resolve (the ugly (IMO) non-integer screen resize), or can handle natively in their program (modify screen layouts to account for a margin); something we as devs will need to talk about at some point.
It's still my opinion that the simplest, and probably best, thing to do here is to re-adopt the generally accepted practice from "back when": respect a safety margin when designing your UI elements and when considering your field of gameplay.
Granted, "back when" is as recent as-- wait, no, my employer still does this today with our in-game UIs, because overscan is still a thing with modern TVs taking natively HDMI input from their blu-ray players.
But as far as I'm aware, it dates back to time immemorial, in the literal sense that nobody I know has a living memory of the practice starting, nor can point to a recorded moment when it did. It just always was, because it was always required when developing for TV displays.
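For concreteness: the traditional analog-TV convention kept all gameplay inside roughly the central 90% of the frame ("action safe") and all text inside roughly the central 80% ("title safe"). Here's a minimal sketch of that calculation, using VERA's 320x240 scaled mode as an example resolution (the function name and the exact percentages are illustrative conventions, not anything the CX16 itself defines):

```python
def safe_rect(width, height, safe_fraction):
    """Return (x, y, w, h) of a centered rectangle covering safe_fraction of each axis."""
    w = int(width * safe_fraction)
    h = int(height * safe_fraction)
    x = (width - w) // 2
    y = (height - h) // 2
    return x, y, w, h

# VERA's 320x240 tile/bitmap mode as an example resolution:
action_safe = safe_rect(320, 240, 0.90)  # keep gameplay inside this
title_safe  = safe_rect(320, 240, 0.80)  # keep text/score/UI inside this
print(action_safe)  # (16, 12, 288, 216)
print(title_safe)   # (32, 24, 256, 192)
```

Since the margins are fixed for a given resolution, this is the kind of thing you'd compute once up front, or just bake in as constants, rather than at runtime on a 65C02.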
The rest of this, I'm sure, is going to be boring details for most people, but I can't help myself because I love talking about gamedev.
Recently, there's been a trend with some video games to present the player with a size and alignment calibration screen on their first play, in addition to brightness/black-level calibration (the latter practice starting in earnest sometime in the late 2000s or early 2010s, as far as I can remember). Even some PC games are starting to do this size/alignment calibration -- mostly ones developed alongside their console versions.
So even with "safe frame" practices, my industry is starting to concede defeat when it comes to finding good default rendering parameters across a variety of display hardware and capabilities. UI designers want to make use of the full display, from extreme corner to extreme corner. Thankfully, modern games can do this relatively easily, because the rendering techniques just plug in white/black/etc values into the appropriate deferred-rendering shaders, and scale/offset are similarly trivial parameters to the final render call.
I don't think games using the VERA will be able to replicate those modern trends so easily. I mean, imagine trying to scale the brightness of your palette with only 12-bit palette values: there's a huge difference between $F and $E in a color channel.

But even for the size and location of the UI, I'll put it out there that pinning things to the extreme corners or edges of the display is questionable practice to begin with. You might be able to do some basic math once and cache the corrected offsets for your UI elements, but you still need to leave unused space in your layout to absorb the potential movement anyway. Why not plan with a safety margin in mind that covers most cases? If the negative space in your UI offends you, nothing stops you from adding some non-functional greebling to make it more interesting on displays that show the area.
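To make the palette point concrete, here's a rough sketch of what a global brightness scale looks like against VERA's 4-bit-per-channel colors (the helper is hypothetical; the takeaway is how coarse the steps are: a full-bright channel can't dim by anything less than 1/15 of full scale, about 6.7%, at a time):

```python
def scale_rgb444(color, brightness):
    """Scale a 12-bit $0RGB color (4 bits per channel) by brightness in [0.0, 1.0]."""
    r = (color >> 8) & 0xF
    g = (color >> 4) & 0xF
    b = color & 0xF
    # Each channel only has 16 levels, so the result quantizes hard.
    r = min(15, round(r * brightness))
    g = min(15, round(g * brightness))
    b = min(15, round(b * brightness))
    return (r << 8) | (g << 4) | b

# A 3% dim does nothing at all to full-bright white...
print(hex(scale_rgb444(0xFFF, 0.97)))  # prints 0xfff
# ...while a mid-gray steps down in chunky increments.
print(hex(scale_rgb444(0x888, 0.75)))  # prints 0x666
```

Compare that to a modern renderer, where brightness is a float multiplied in per pixel at effectively continuous precision. On the VERA you'd be rewriting the palette RAM with these quantized values, and a smooth fade ends up visibly stair-stepped.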