Wednesday, March 5, 2025

Superauthenticity: Pixels, signals, and shaders

I could have been playing like this!

This is a follow-up to my original topic on preservation and superauthenticity. The short of it is that most aspects of preserving commercial entertainment products concern either the materials or the presentation. Authenticity preserves the original presentation to the extent that is possible. Superauthenticity enhances the presentation by correcting flaws in the original, but in a way that isn't destructive to the materials.

I play most of my games through unofficial emulation, because it is the next-closest thing to an authentic presentation after original hardware, and it provides plenty of superauthentic options that wouldn't be available otherwise. One of these parameters, which you wouldn't get much control over otherwise, is in a way the atomic unit of video game presentation itself - the pixel.

Today, most gamers, including myself, use fixed-pixel displays exclusively. Many find this perfectly adequate. Some insist on vintage televisions and monitors for period-appropriate games. And for others, there are now many ways to mimic some of the pixel rasterization effects on modern gear.

 

What is a pixel? 

We're going to get a little technical here, and maybe a little bit philosophical too, but hopefully not overly in-depth. People have written lengthy research papers about this very topic - I intend to keep it briefer than that.

Nowadays, despite Alvy Ray Smith's insistence otherwise, it's easy to think of a pixel as a little square. A rectangle of solid color that exists within a uniform, fixed-dimension grid of others like it, differing only by color information. This is reductive, but most of the time it works just fine. Just as real-world architects needn't account for the curvature of the earth when designing most projects, computer artists, engineers, and consumers can often get along just fine thinking of a pixel as a square of color. The digital interface between your modern GPU and display uses this model. Most raster image file formats use this model. Even in the early 1980s, sprite artists often worked with this model, using graph paper to plan out their animation frames, which could be easily encoded to ROM as tile data - typically in 8x8 or 16x16 chunks.
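For the curious, here's a little sketch of that graph-paper-to-ROM workflow - a generic one-bit-per-pixel example of my own, not any particular system's actual format (the NES, for instance, stores two such bitplanes per tile):

```python
# A minimal sketch of packing an 8x8 one-bit tile, as drawn on graph
# paper, into eight ROM bytes - one byte per row. Real systems layered
# multiple bitplanes like this to get more colors per tile.
TILE = [
    "..####..",
    ".#....#.",
    "#.#..#.#",
    "#......#",
    "#.#..#.#",
    "#..##..#",
    ".#....#.",
    "..####..",
]

def encode_tile(rows):
    """'#' = lit pixel, '.' = background; each row becomes one byte."""
    return bytes(
        sum(1 << (7 - x) for x, c in enumerate(row) if c == "#")
        for row in rows
    )

rom_data = encode_tile(TILE)
print(rom_data.hex())  # eight bytes, ready to drop into tile ROM
```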

And yet, just as the ground you walk on isn't flat, and the earth itself is not a sphere, a pixel isn't a little square, and situations exist where this matters. Even today, we have the issue of sub-pixel rendering, which is important for rendering legible text on a computer monitor. ClearType, though designed in 1998, still works very well with the majority of modern LCD monitors, but can cause problems on screens with subpixels that aren't arranged in the way that it expects. This includes certain OLED screens, consumer televisions, smartphones, and even rotatable monitors in vertical orientation. This is, of course, a specific use case that most consumers and even most software engineers needn't think about too much, but it matters to some, and the concept of a pixel today is astoundingly precise compared to what it once was.
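As a toy model of the problem - my own illustration, not how ClearType actually works, since the real thing also applies color filtering to hide fringes - consider what happens when the same physical stripe of the screen maps to different color channels on different panels:

```python
# Toy model of subpixel addressing. Treating each pixel's three stripes
# as independently addressable triples horizontal resolution, but the
# position-to-channel mapping depends entirely on the panel's layout.
LAYOUTS = {"RGB": (0, 1, 2), "BGR": (2, 1, 0)}  # stripe slot -> channel

def light_subpixel(row, x_thirds, layout="RGB"):
    """Light one subpixel at position x_thirds (in thirds of a pixel).
    row is a list of [r, g, b] values."""
    px, slot = divmod(x_thirds, 3)
    row[px][LAYOUTS[layout][slot]] = 255

row_rgb = [[0, 0, 0] for _ in range(4)]
row_bgr = [[0, 0, 0] for _ in range(4)]
light_subpixel(row_rgb, 3, "RGB")  # leftmost stripe of pixel 1 -> red
light_subpixel(row_bgr, 3, "BGR")  # same physical spot -> blue channel
print(row_rgb[1], row_bgr[1])      # [255, 0, 0] vs [0, 0, 255]
```

A filter tuned for one layout lights the wrong channels on the other, which is exactly the color-fringing problem on panels ClearType doesn't expect.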

 

The majority of the games I have covered so far were intended to be played on 15 kHz color raster monitors. Other display types have included point displays, vector monitors, text terminals, handheld LCDs, and even teletypes, but for the purpose of discussing pixel-based graphics, we only need to worry about 15 kHz color raster monitors, a category that includes consumer televisions, computer monitors, and arcade monitors. If you were to look at one very closely, it might look something like this:

Source: r/mildlyinteresting

These dots are not pixels! These are phosphor points whose color and intensity make up the elements of the final image, but there is no way for a computer to directly address any point, nor a triad, nor even a particular cluster.

In fact, as far as the screen is concerned, there's no such thing as a pixel. The smallest unit of discrete information in the video signal is a scanline, which is a continuous waveform. When the video source comes from a video game with big, chunky pixels, the waveforms will peak, trough, and plateau in a manner that corresponds to the pixel sequence, but the monitor doesn't know that. The monitor is continually changing the color and brightness of a dot of light in accordance with the waveform amplitude as it races back and forth across the screen. Any given "pixel" from the video source may wind up illuminating any number of phosphor points, though there are numerous factors outside the developers' and artists' control that can increase or decrease precision.

Source: filthypants.blogspot.com

The NTSC standard dictates 15,750 lines per second and 525 lines per complete frame, with vertical synchronization occurring twice per frame to produce line interlacing. In practice, only a maximum of 480 of these lines will wind up producing part of a meaningful image. CRT televisions proved to be flexible with regard to standards, though, and computer engineers working with limited memory and processing power quickly discovered it was possible to hack this standard by using 262 lines per frame, with one v-sync per cycle and no interlacing. This cut the effective vertical resolution in half and produced gaps between the scanlines, but early consoles were never going to be able to use full NTSC resolution anyway, the gaps wouldn't be as pronounced on a low-precision 15 kHz screen, and in exchange for these negligible drawbacks, 60fps motion would appear smoother and more stable.
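Since the arithmetic does all the work here, a quick back-of-the-envelope check of those figures:

```python
# Back-of-the-envelope check on the NTSC figures quoted above.
LINE_RATE = 15_750        # scanlines per second
FRAME_LINES = 525         # lines per full interlaced frame (two fields)

print(LINE_RATE / FRAME_LINES)   # 30.0 interlaced frames per second
print(LINE_RATE / 262.5)         # 60.0 fields per second (standard NTSC)

# The hack: one v-sync per 262 whole lines means every pass scans the
# very same lines - no interlacing, and slightly faster than 60 Hz.
print(LINE_RATE / 262)           # ~60.11 progressive frames per second
```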

Nowadays we call this resolution 240p, but this is anachronistic - 240 lines is the hypothetical maximum resolution, and early console manufacturers didn't expect that many lines to wind up doing anything for the visible image. The Famicom only used 224 lines; the other 16 were inactive, and this carried forward to the majority of SNES games. Atari was even less optimistic; the programmer's manual for the Atari 2600 recommends a maximum of 192 active lines!

Diagram by Steve Wright, archived by Bill Watson

So, to answer the question - what is a pixel? In short, a pixel is a signal for the monitor's electron beam to change to a particular color at a certain time. Each scanline's spectrum of colors is determined by a sequence of these signals, and the more precise your signal and monitor, the more each line will look like a row of colored rectangles, and the more your screen will look like a grid of them.
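Here's a toy model of that idea - my own sketch, with the signal path crudely reduced to a moving average - showing how the same row of source pixels can come out as crisp plateaus or smeared-together blobs depending on bandwidth:

```python
# A toy model, not any real video standard's filtering: the row of
# source pixels becomes a "waveform" via sample-and-hold, then passes
# through signal paths of different bandwidth (crudely modeled as a
# moving average). Wider window = blurrier signal = less square pixels.
def scanline(pixels, samples_per_pixel=8, blur=1):
    wave = [v for v in pixels for _ in range(samples_per_pixel)]
    out = []
    for i in range(len(wave)):
        lo, hi = max(0, i - blur), min(len(wave), i + blur + 1)
        out.append(sum(wave[lo:hi]) / (hi - lo))
    return out

row = [0, 0, 255, 255, 0, 128, 0, 0]      # one row of chunky pixels
crisp = scanline(row, blur=1)             # near-rectangular plateaus
smeary = scanline(row, blur=12)           # neighboring "pixels" bleed
```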

When you run a video game through an emulator on a fixed-pixel display, precision is perfect. Your screen, your snapshots, and your videos will look like grids of perfect little squares. Or, with scaling, perhaps grids of perfect big squares. For the sorts of games I've covered so far, this is definitely not authentic! But is it so bad?


What are pixels supposed to look like?

Posted to Reddit ad nauseam, original source unknown

Here's the neat part - there's no right answer! CRT enthusiasts will passionately argue that old console games only look "right" on an old CRT. The games were designed for CRTs, tested on CRTs, played on CRTs, and to an extent, they are right! Despite the many options for simulating CRT effects on modern displays, it's impossible to completely replicate the look and feel of one, especially if you have a large display or sit very close to it. For an authentic CRT experience, you must have a CRT, and you must view it from an appropriate distance. I've been to a barcade that offers a room with genuine Atari and Genesis consoles hooked up to CRT televisions, and I concur that the look and feel just hits different.

However, I find these arguments oversimplified. Consumer televisions varied widely in quality and differed greatly in terms of precision, color accuracy, geometry, etc. They could differ further still based on the type of video cable used, the TV's calibration, even the age of the set. The last CRT television I owned, and the one I spent the majority of my time playing console games on, was a 19" Sony of average price and quality. It lasted me 16 years, and I played everything from the NES through the original Xbox on it, all of them using RF coaxial connections routed through a VHS player, which is just about the worst connection setup possible. My friend "B" had a comparatively luxurious 27" Zenith and likely used an S-Video connection. The difference in quality was more significant than the difference between my wife's Walmart 1080p bargain screen and my own super-premium 4K/HDR OLED, and developers had to ensure that their games would look good on his setup, my setup, worse setups than my own, and everything in between. Is a CRT setup actually authentic to the developer's vision if it does not precisely match the imprecisions and distortions that the developer witnessed?

Another thing I find baffling is that so many CRT enthusiasts will go to great lengths to maximize picture fidelity, even at the expense of authenticity. No clearer-cut example exists than the debate over composite vs. RGB connectors; the majority of U.S. consumers used composite connectors or worse, and there are games where "upgrading" to RGB is demonstrably destructive to the intended presentation. The picture becomes clearer, freer of noise and distortion, but dithering effects, where the phosphor clusters were meant to bloom, overlap, and blend to create new colors and transparencies, instead stay within their grid lines and only produce a coarse checkerboard pattern. I have to conclude that for these users, authenticity isn't the point; they simply like the way it looks, and loss of authenticity is an acceptable trade for a cleaner picture that still attains a CRT look.
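Here's a toy model of the tradeoff - a deliberate oversimplification, since real composite blur is an analog bandwidth limit rather than a moving average - showing how a checkerboard dither collapses into the blended color the artist was counting on:

```python
# Oversimplified, but it shows the principle: blur a checkerboard
# dither row and the two colors collapse toward their blend; keep it
# sharp and you get the coarse checkerboard instead.
def blur_row(row, radius):
    out = []
    for i in range(len(row)):
        lo, hi = max(0, i - radius), min(len(row), i + radius + 1)
        out.append(round(sum(row[lo:hi]) / (hi - lo)))
    return out

red, blue = 200, 60            # single-channel stand-ins for two colors
dither = [red, blue] * 8       # one scanline of checkerboard dithering

print(blur_row(dither, 0))     # RGB-sharp: the checkerboard survives
print(blur_row(dither, 2))     # composite-blurry: pulls toward the ~130 blend
```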

What I conclude is that there's no definite answer as to what pixels are "supposed" to look like. Developers had to ensure their picture could tolerate video distortion, but could not necessarily rely on it! Arcade and computer game developers could expect more precision than console game developers, but still could not rely on pixels being rendered with an exacting degree of fuzziness.

As for attaining superauthentic pixels, we have a number of options. From the earliest days of emulation, we had raw scaled pixels, fake scanlines, and countless filtering options. Today, we have a slew of pixel shader options to mimic many CRT-like properties, which in many cases can reproduce both the intentional and incidental effects of the technology with limitless configuration. The best option, I'd say, is entirely in the eye of the beholder. I personally enjoy playing arcade games with simple pixel shaders, but I would argue that the vast majority of CRT-based games I've covered so far don't really need them. I would also argue that pixel shaders are inappropriate for documentation purposes, my blog included; you can always apply pixel shaders to a raw-pixel screenshot, but you can't take them out. To be clear, there are games where the artists clearly expected pixels to blend into one another; pixel superauthenticity requires that this be mimicked, and there are already shaders and filters that can accomplish it more reliably than authentic hardware ever could. But as of 1985, we're a long way from the period where this is common.

For the rest of this article, I thought I'd share some system-specific insights. Superauthenticity might be subjective, but there's no reason it can't be an informed subjectivity.

In most cases, you will need to click on the images below and view them fullscreen for the proper effect!


Arcade games

Counterclockwise - unfiltered, bilinear filtering, crt-royale, MAME-hlsl

Well, so much for being system-specific. Each arcade game is essentially a system unto itself, but I'm not going to delve into each and every one of them, obviously.

Disregarding vector games, monochrome games, and the murky world of pre-CPU transistor logic games like Pong, we can make some general observations that apply to the majority of arcade machines released between 1975 and 1985. Almost all use 15 kHz color monitors, but unlike consumer televisions, these monitors are designed with progressive scan in mind, and the video signal typically comes from RGB cables, which keep a cleaner color signal than composite or RF. For some games, the monitors are oriented vertically rather than horizontally, which means the scanlines (and the color blending) are also oriented vertically.

MAME has ways of outputting to a 15 kHz monitor for a pretty arcade-accurate look, but you'd need such a monitor and a video card that supports 15 kHz output modes, which are uncommon and no longer supported by current hardware standards. For the rest of us, there's also built-in support for some basic pixel shaders, referred to as HLSL. And honestly? I like it. It does darken the picture with default settings, but this is probably more accurate to the original colors than the other options shown in the above screenshots. Real arcade games always look darker than emulated ones, in my experience. I've played Indiana Jones and the Temple of Doom at ACAM, and can confirm that, at least on the monitor there, the earthen background texture nearly disappears into the darkness. Most games of the era aren't dark enough for this kind of black crush.

RetroArch provides an intimidating number of shader options, many of them far fancier than MAME's HLSL mode. I haven't even scratched the surface of what's available, but crt-royale is pretty popular, and it looks pretty good too. I don't think it is necessarily more authentic than any other option, but given the low resolution of the materials we are working with, I wouldn't call it destructive to the materials, so it isn't inauthentic either. Pseudo-authentic, if you like. Just about any option will add texture to the pixels and soften their rectangular look, so by all means use what looks good to you. But I will say that I think shaders that simulate screen curvature are silly and unnecessary.

There is, however, one arcade game I've covered so far where you definitely need pixel shaders in order to attain the game's intended look, and it's not even that good or very impressive looking. Bilinear filtering won't cut it.


 

 

Atari 2600

Counterclockwise - unfiltered, bilinear filtering, crt-royale, MAME-hlsl

Ok... does this really matter? Have your choice of silk purse from a sow's ear. Personally I'll take my maximally-chunky pixels raw.

 

Apple II

Counterclockwise - composite idealized, composite monitor, color TV, color TV with scanlines

The Apple II family of computers had a very peculiar way of encoding color. The most common graphics mode for games was known as high-res mode - effectively a 280x192 monochrome bitmap, but thanks to some NTSC timing trickery, pixels in certain positions would display as purple or green, or, with the flip of a "high bit" governing each block of seven pixels, blue or orange.

The final output is, of course, an analog waveform containing 192 scanlines, and isn't easily represented by a low-res image file. With some more NTSC jiggery-pokery one could intentionally output more colors than the six standard ones, but for most intents and purposes, the intended image is indeed a 280x192 bitmap with only six colors that have to obey certain rules. AppleWin's "composite idealized" output mode presumes that this is what the software is trying to do and normalizes it in a clear and stable manner. It isn't truly authentic, but in nearly all cases, I don't see much reason to use anything else, or much need to mess around with pixel shaders.
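Here's my sketch of that idealized model - a simplification that ignores fringing at byte boundaries, and note that the even/odd color assignment below is a convention that some references swap:

```python
# A sketch of the "composite idealized" model as I understand it - an
# assumption-laden simplification that ignores NTSC fringing at byte
# boundaries. Which parity maps to which color is a convention here.
PALETTES = {
    0: ("purple", "green"),   # high bit clear
    1: ("blue", "orange"),    # high bit set
}

def decode_hires_row(row_bytes):
    """Decode one 40-byte hi-res row into 280 color names."""
    bits, high = [], []
    for byte in row_bytes:
        hi = (byte >> 7) & 1
        for b in range(7):            # bits 0-6 shift out left to right
            bits.append((byte >> b) & 1)
            high.append(hi)
    colors = []
    for x, bit in enumerate(bits):
        if not bit:
            colors.append("black")
        elif (x > 0 and bits[x - 1]) or (x < 279 and bits[x + 1]):
            colors.append("white")    # adjacent lit pixels blend to white
        else:
            colors.append(PALETTES[high[x]][x & 1])  # lone pixel: by parity
    return colors

# Alternating bits with the high bit clear: lone pixels at odd columns,
# which this model renders as solid green.
print(decode_hires_row([0b00101010] * 40)[:7])
```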

 

Atari 400/800

Counterclockwise - unfiltered, CTIA artifacts, crt-easymode, crt-royale, crt-hyllian, MAME-hlsl

Counterclockwise - blargg, mame-ntsc, ntsc-320px-composite, ntsc-simple

The Atari 8-bit computers are an interesting situation, as they came with a 5-pin monitor output that delivered a video signal comparable to (and somewhat compatible with) S-Video, but as far as I can tell, almost nobody used it. Most just connected it to their televisions using the RF video cable.

Both Altirra and RetroArch provide some options for simulating television artifact colors. For the most part, you don't really need this - the Atari already had impressive color depth for its time, well surpassing even the Commodore 64. You also have the usual options for scanlines, bilinear filtering, sharpen filters, etc.

RetroArch offers pixel shaders, many of which provide an anachronistic RGB look, but I can't see that it hurts the materials or the presentation, and as with arcade games, they can be fun and enjoyable. Other shaders, which I've shown in the second set of shots, are more focused on simulating television blurriness, which I don't really think enhances these games, but the option is there.

A few games, though, do need television artifact colors.


IBM PC

Counterclockwise - RGBI, composite, TGA

When I think of classic PC graphics, I think of 320x200 resolution on high-precision 31 kHz VGA/SVGA monitors, which gave a sharper image than most arcade monitors. Pixels were chunky, but color depth and resolution were high enough that developers could make things look pretty much however they wanted to - photorealistic, cartoonish, painterly, whatever. Computing power and the talents of the artists were the limiting factors, not the video standard. But we are years away from this.

In 1985, we have that monitor/television duality again. IBM's CGA standard supports a digital RGBI interface, and an analog composite interface to televisions. Internally, graphics are stored in a 4-color framebuffer, and will display perfectly on an RGBI monitor, but a bit smeary on a television. Some games, though, use television artifacting to simulate more colors, and these will only look right with composite. And will also be a bit smeary.
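For the artifacting trick specifically, here's the gist as a sketch - assuming the common 640x200 1bpp technique, and waving away the actual subcarrier phase math:

```python
# The gist of CGA composite artifacting, as a sketch (assuming the
# common 640x200 1bpp trick; the actual hue depends on subcarrier
# phase math that I'm glossing over here).
def artifact_pattern(p0, p1, p2, p3):
    """Four consecutive 1-bit pixels span one NTSC color-clock cycle.
    On an RGBI monitor this group is just a black-and-white stipple;
    on a composite TV, the pattern reads as one of 16 solid colors."""
    return (p0 << 3) | (p1 << 2) | (p2 << 1) | p3

# 0b0000 -> black, 0b1111 -> white; the other fourteen patterns land
# on different hues/intensities depending on their shape and phase.
print(artifact_pattern(1, 0, 1, 0), artifact_pattern(0, 1, 0, 1))  # 10, 5
```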

We also have a digital 16-color mode available to PCjr and Tandy computers, which is just as crisp as 4-color RGBI, though with a lower maximum resolution and a fixed palette that depends on the computer.

DOSBox does not handle composite mode well, unfortunately. We need to use a fork or PCem to utilize this.

I haven't messed around with pixel shaders or filters when it comes to PC games of this era, and I don't really see the need.

 

NEC PC-88

Counterclockwise - Scanlines, unfiltered, soft interpolation, hard interpolation

The PC-88 ecosystem is a blind spot for me and something I still don't feel confident emulating. Nevertheless, I can't see what purpose scanlines serve, even though they are enabled by default in every emulator I've seen. Interpolation just looks nicer.


ZX Spectrum


I got nothing. I've never seen a ZX Spectrum, probably never will, and have no clue what it's actually supposed to look like. I just emulate them with raw pixels.

The target output was PAL televisions, driven with a 224-line progressive-scan mode, but the Spectrum's color abilities are limited. Artifact colors are probably not possible.

 

Commodore 64

Counterclockwise - Unfiltered, scanlines, CRT filter, CRT filter + external palette

Finally, we have a system that supports 16 colors without television artifacting or other tricks. Whatever color you want can go more or less wherever you want - within the VIC-II's attribute limits, at least. Even the Atari computers don't quite allow that! Unfortunately, those 16 colors are the entire palette, and the colors were selected by video engineers, not artists! The colors available also depend on the model and system region. VICE, which I am the most comfortable using, defaults to PAL colors, but in the above shots I am using NTSC colors.

The C64 again had two video outputs: an S-Video-like DIN connector meant for monitors, and RF output meant for televisions. VICE's options seem to be more centered on simulating the former look.

I don't have any direct insight on what Commodore 64 games are meant to look like, but subjectively, I think I like the built-in CRT filter with the external palette option. There are a number of included palettes to choose from; the one in the upper right shot is called Pepto-NTSC.

VICE doesn't provide any options for pixel shaders, and I'm not as comfortable using RetroArch for C64 emulation, so I haven't experimented much.

 

NES


Counterclockwise - Unfiltered, Blargg NTSC filter, mame-ntsc, ntsc-simple


To end this article, we have the first video game system I ever owned, the Nintendo Entertainment System!

Back in the day I had this connected to a TV with RF, which was actually the original Famicom model's only option. And I can remember the picture was always a bit unsteady and distorted - more so than the Blargg filter above demonstrates. Personally, I have no nostalgia for that whatsoever, but still, a perfectly crisp image isn't an authentic one.

I think I prefer the ntsc-simple shaders out of all the options that I've tried, but I don't think any option harms the presentation. Stochastic RF distortion is something the image tolerates, not something it depends on.

 

Well, this post was a lot longer than I anticipated, but I had fun writing it! I may revisit this topic in the future, as I begin to cover more sophisticated systems - that's when we start to really be able to take advantage of pixel shader effects. For now, though, we'll be going back to the Eastern Front next, as promised.

3 comments:

  1. My first machine was a ZX Spectrum in the late '80s, connected to a black-&-white TV - I imagine up 'til around then that wasn't a super unusual scenario, the kids getting an old b&w set for their games. So maybe I should play emulated games in monochrome!

    Replies
    1. My wife brought me a completely refurbed and recapped ZX... the cat recently knocked it off the shelf and I'm not sure if it survived though... Durn cat!

  2. I think the extra colours derived from artifacting are a feature of NTSC units, so if you are using a PAL unit, this feature is not possible. Therefore, UK-designed computers such as the ZX Spectrum would never be able to use this feature, as they are designed for PAL.


