Without question, we live in a high definition world. Just go outside and look around. See that leaf? It looks REAL. So does what is left of the grass at this point in the year. OK, enough of this; it is cold and the outdoors can be a little frightening. Time to go back to my favorite, well-worn spot on the sofa and that shiny HD television set. But even the most newfangled “ultra high-definition” (UHD, often incorrectly called 4K – see tangent #1 below) TVs cannot touch the brilliance of reality – unless you live in Japan, where the outrageous 8K systems probably come close. And while “real life” might be incredibly detailed (and perhaps more exciting, and healthful, as well), most video games produced in the 1990s and early 2000s are not – regardless of how big or expensive your high-def television might be (sorry, Black Friday shoppers).
First, some background: High definition, commonly abbreviated as “HD”, has designated the high resolution digital broadcast standard for television in the United States since the (might I say, infamous?) transition to over-the-air digital TV (DTV) signals about a decade ago. What is that? You want to delve into technical details? I aim to please! To be officially designated as HD, a signal needs to offer at least 1280×720 resolution, commonly referred to as “720p”. The higher resolution HD standard, also known as “full” HD (or FHD, if you are pretentious), is 1920×1080, and is generally referred to as “1080p”. (See tangent #2, below, for fascinating insight into the use of the “p” with regard to broadcast resolution!) I will point out here that even the latest video games sometimes fail to hit the full 1920×1080 resolution given the limits of console hardware, with previous-generation (Sony PS3, Microsoft Xbox 360) games internally rendered at 1600×900 or even 1280×720 – though that was still plenty of resolution to provide a “high definition” experience.
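If the jump from 720p to 1080p sounds abstract, a quick back-of-the-envelope pixel count makes it concrete. The Python sketch below just multiplies the dimensions mentioned above; the labels are mine, not part of any broadcast standard.

```python
# Rough pixel-count comparison of the resolutions mentioned above.
resolutions = {
    "720p (HD)": (1280, 720),
    "900p": (1600, 900),
    "1080p (Full HD)": (1920, 1080),
}

full_hd_pixels = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name:>16}: {pixels:>9,} pixels "
          f"({pixels / full_hd_pixels:.0%} of Full HD)")
```

Run it and you will see that 720p carries well under half the pixels of a full 1080p frame, which is why the difference is so visible on a big screen.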
Though any modern TV can display HD content with appropriate detail and sharpness, the previous standard, employed by every video game system prior to the PlayStation 3 and Xbox 360 in the mid-2000s, generally looks terrible on current displays. In my last editorial I looked at the relationship between resolution and design, and discussed pixel art vs. 3D rendering and static image assets (some have called it an altogether brilliant – if brief – look at the subject, one that must be read over and over to be truly appreciated), and it is in part the combination of very low resolution and poor up-scaling to HD dimensions that is to blame for the lousy appearance of standard definition (SD) console games on HDTVs. As a result, systems such as the Nintendo GameCube and Sony PlayStation 2, each a massive graphical upgrade over its predecessor on the SD TVs of the era, look really poor on a modern HDTV. But the story does not end so quickly for these consoles. There is another, hidden side to some of the games on this particular pair of classic gaming machines, and in some cases the high definition experience was lurking inside the game disc all along, just waiting to be exploited using a modern PC!
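Part of the problem is simple arithmetic: an SD frame does not divide evenly into an HD panel. Here is a minimal sketch, assuming a representative 640×480 SD frame (actual console output modes vary), of the awkward scale factors involved.

```python
# A minimal sketch of why SD console output scales awkwardly onto an HD panel.
# 640x480 is assumed here as a representative SD frame; real consoles used a
# variety of output modes (512x448, 640x448, and so on).
sd_w, sd_h = 640, 480
hd_w, hd_h = 1920, 1080

# Preserving the 4:3 aspect ratio on a 16:9 panel means fitting to the
# screen height and pillarboxing the sides.
scale = hd_h / sd_h                      # 2.25 -- not an integer
scaled_w, scaled_h = sd_w * scale, sd_h * scale

print(f"Scale factor: {scale}x -> {scaled_w:.0f}x{scaled_h:.0f} "
      f"(with black bars filling the remaining {hd_w - scaled_w:.0f} columns)")
# A non-integer scale means every SD pixel maps onto 2.25 screen pixels,
# so the TV has to blur or unevenly duplicate detail -- a big part of why
# SD games look so soft on an HDTV.
```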
While I am partial to the aesthetics of a game incorporating pre-rendered backgrounds (such as Final Fantasy VII), games such as the twelfth (that is a strange-looking word, I must say) installment of the not-exactly “Final” Fantasy have the advantage of being fully 3D rendered, and here is a prime example of a game that can look fantastic with very little effort: just change one crucial setting in an emulator. Yes, PlayStation 2 games can be emulated if you have a fast enough computer, and all you need (aside from the fast PC) is a copy of the original game, a DVD drive, and a free piece of software called PCSX2. Leaving all other settings at their defaults, I popped in my copy of FFXII and changed only the all-important internal resolution of the game from the default to a full 1080p. Here is a gallery I found to illustrate my point, and as you can see (if you follow the link and view the images, that is) it looks fantastic – and could easily have been released as a PlayStation 3 game without any more than a simple resolution up-scale. I could do this with any game (with very mixed results), but the key to a game like FFXII is the graphical assets on the game disc, which were ill-served by the standard definition console hardware of old.
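To give a sense of what that one setting does, here is a hedged sketch of the arithmetic behind an emulator's internal resolution multiplier. The 640×448 native frame size is an assumption (a common PS2 render target, though individual games vary), and the values are purely illustrative – PCSX2 presents this as a scaling option in its graphics settings rather than as code.

```python
# Illustrative only: what an "internal resolution" multiplier works out to,
# assuming a 640x448 native PS2 render target (individual games vary).
native_w, native_h = 640, 448

for multiplier in (1, 2, 3, 4):
    w, h = native_w * multiplier, native_h * multiplier
    note = "  <- at or beyond Full HD width" if w >= 1920 else ""
    print(f"{multiplier}x: renders internally at {w}x{h}{note}")
# Because the game's 3D scenes are re-rendered at the higher resolution
# (rather than stretched after the fact), the extra pixels carry real detail.
```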
I could go on ranting about screen resolution, digital vs. cinema color standards, and above all the secret HD life of video games such as the original GameCube version of The Legend of Zelda: The Wind Waker (which looks amazing at 1080p using the Dolphin emulator!), but I will end my writing on this subject for now. While there is a certain charm to the original, slightly blurry versions of some of the best-loved games in our recent past, it takes an old-school CRT television to make them look the way we remember them. As we enter the “4K” realm with televisions (just try to buy a new 1080p TV next year), high resolution will matter more than ever before, and while we can only hope for a proper HD remake of some of the most beloved games (I will not even mention the one I am thinking of, but Lusi knows), it is nice to know that a little effort – and an investment in a fast computer – can coax some spectacular improvements out of the game discs you already own. What are you waiting for (aside from the financial barrier to entry and the overall inconvenience of PC-based emulation compared to living room console gaming)?
Tangent #1: 4K has been in use for several years in the motion picture realm (as a standard of the Digital Cinema Initiatives, or DCI), and it describes a horizontal resolution of 4096 pixels and a maximum vertical resolution of 2160 pixels (actual vertical resolution varies based on the lens system and final aspect ratio of the film, with a Panavision movie requiring fewer vertical lines, for example). UHD, on the other hand, which is the standard used for “4K” streaming and UHD Blu-ray, is actually a lower 3840×2160 resolution. UHD is, however, a convenient quadrupling of the previous 1920×1080 HD standard, with both horizontal and vertical pixel counts doubled. We will not get into the differences between the DCI 4K and consumer UHD color standards here, which actually account for a far more significant difference than the pixels lost in horizontal resolution.
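The arithmetic behind those claims is easy to check; the quick sketch below just multiplies the figures from the paragraph above.

```python
# Pixel counts for the cinema and consumer "4K" standards discussed above.
dci_4k = 4096 * 2160   # DCI 4K (cinema), at its full 2160-line height
uhd = 3840 * 2160      # consumer UHD ("4K" TVs, streaming, UHD Blu-ray)
fhd = 1920 * 1080      # Full HD

print(f"DCI 4K: {dci_4k:,} pixels")
print(f"UHD:    {uhd:,} pixels ({uhd / fhd:.0f}x Full HD)")
print(f"UHD gives up {dci_4k - uhd:,} pixels of horizontal resolution vs. DCI 4K")
```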
Tangent #2: The reason that it is not always correct to call 1280×720 or 1920×1080 programming “720p” and “1080p” is the use of the “p”, which actually stands for progressive scan, a process in which each frame of the program is displayed in its entirety, rather than with the old method of interlacing two incomplete frames. Interlacing (abbreviated simply as “i”) is an inferior option that often results in jagged edges and lost detail, problems that were far less visible on old standard definition CRT televisions, which almost exclusively displayed interlaced content. True “1080p” programming is essentially limited to Blu-ray video, newer video games, and high quality digital streaming, as TV broadcast signals at 1920×1080 are interlaced (1080i) to conserve bandwidth.
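For the curious, here is a minimal sketch of what interlacing actually does to a frame; the numbers match the 1080-line example above, and the field names are just my own labels.

```python
# A minimal sketch of interlacing: one full progressive frame is split into
# two "fields" -- the even-numbered lines and the odd-numbered lines -- which
# are transmitted and displayed alternately, halving the bandwidth needed.
HEIGHT, WIDTH = 1080, 1920

# Stand-in for a real frame: a list of scan lines (rows of pixels).
frame = [[0] * WIDTH for _ in range(HEIGHT)]

top_field = frame[0::2]      # even-numbered lines (540 of them)
bottom_field = frame[1::2]   # odd-numbered lines (the other 540)

print(f"Progressive: all {len(frame)} lines drawn for every frame")
print(f"Interlaced:  {len(top_field)} lines, then {len(bottom_field)} lines,")
print("so only half of the picture is refreshed at any one moment.")
```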