fleabitdev 13 minutes ago

Emulators also struggle to faithfully reproduce artwork for the Game Boy Color and the original Game Boy Advance. Those consoles used non-backlit LCD displays with a low contrast ratio, a small colour gamut, and some ghosting between frames. Many emulators just scale the console's RGB555 colour values to the full range of the user's monitor, which produces far too much saturation and contrast.
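
For illustration, here's a minimal sketch of the kind of correction some emulators offer instead; the gamma and channel-mixing constants are loosely based on the widely circulated higan/byuu GBA formula, and should be treated as illustrative assumptions rather than any emulator's exact values:

```python
# Illustrative sketch: mapping a GBA RGB555 colour toward how it might look
# on the console's dim, desaturating LCD. Constants are assumptions loosely
# based on the circulated higan/byuu correction, not authoritative values.
def gba_lcd_color(r5: int, g5: int, b5: int) -> tuple[int, int, int]:
    lcd_gamma, out_gamma = 4.0, 2.2

    # Linearise through the LCD's steep response curve.
    lr, lg, lb = ((c / 31.0) ** lcd_gamma for c in (r5, g5, b5))

    # Mix channels to model the panel's cross-talk and narrow gamut.
    r = (  0 * lb +  50 * lg + 255 * lr) / 255
    g = ( 30 * lb + 230 * lg +  10 * lr) / 255
    b = (220 * lb +  10 * lg +  50 * lr) / 255

    def encode(v: float) -> int:
        # Re-encode for an sRGB-ish monitor at 8 bits per channel.
        return round(min(max(v, 0.0), 1.0) ** (1 / out_gamma) * 255)

    return encode(r), encode(g), encode(b)

# Pure red comes out dimmer and desaturated, rather than the full-blast
# sRGB red that a naive "scale RGB555 to RGB888" conversion would produce.
print(gba_lcd_color(31, 0, 0))
```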

It's a shame, because I really like the original, muted colour palette. Some artists produced great results within the limitations of the LCD screen. Like the CRT spritework in this article, it feels like a lost art.

However, there's an interesting complication: quite a lot of Game Boy Color and Game Boy Advance developers seem to have created their game art on sRGB monitors, and then shipped those graphics without properly considering how they would appear on the LCD. Those games appear muddy and colourless on a real console - they actually look much better when emulated "incorrectly", because the two errors exactly cancel out!

seanhunter 2 hours ago

It genuinely baffles me that people are nostalgic about CRTs. CRTs were universally god-awful. I paid top dollar for the best money could buy, since I worked from home, and it was still terrible. Modern monitors are better in every possible way.

  • Aloha an hour ago

    CRTs were better for a long time; LCDs did eventually catch up.

    I held on to my 21" Trinitron long into the LCD era because it had better contrast, resolution, and sharpness. Eventually, affordable LCDs did catch up.

  • brandonmenc an hour ago

    People are nostalgic for the pixel art made specifically to look good on CRTs.

    It’s like sometimes preferring 24 fps cinema or oil paints over photography.

    It depends on what’s being displayed.

    • rafabulsing an hour ago

      Yeah, I don't think anyone is nostalgic for doing spreadsheets/word processing/etc on CRTs.

      The utilitarian POV will always look for the best (least noisy / most accurate / most reproducible) medium possible. But when it comes to art, many other aspects factor in, and a little bit of noise can very well add something to a piece. Not always, but often enough.

      • kees99 a few seconds ago

        > don't think anyone is nostalgic for doing spreadsheets on CRTs.

        Early LCDs were exceptionally awful in two respects: they were so slow that moving objects would disappear altogether (the "mouse pointer trails" option was a workaround for this exact problem), and they had very narrow viewing angles, beyond which colors would go weird, especially on half-tones.

        Both of those defects are less noticeable with mostly static high-contrast text (spreadsheets/text editing). And way, way more annoying with fast-moving colour graphics (games).

  • hbn an hour ago

    Your answer is right in the article; it shouldn't be that baffling.

    No one is campaigning to get rid of beautiful modern 4K OLED displays and return to CRTs for everything. But low-resolution content (retro games and computers) looks better on a CRT.

    There's pretty good modern emulation of the look, but at the end of the day it's a different technology and they do look different.

    Not to mention the feel of playing through a near-zero latency analogue signal is noticeable if you've played enough of certain games. There's a reason speedrunners of retro games all play on console with a CRT.

  • cubefox 42 minutes ago

    No, CRTs are still much better than sample-and-hold screens (OLED or LCD) with regard to motion clarity.

    Short version: our eyes constantly track objects on screen ("smooth pursuit"), which leads to visible smearing on OLED and LCD screens, because they hold each frame for the entire frame time rather than flashing it for a short fraction of that time. Fast-paced, side-scrolling games like Sonic especially look much better on CRT screens. (But CRTs have much lower brightness, and hence contrast, than modern screens.)
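
    Some rough, illustrative arithmetic makes the effect concrete (the scroll speed is an assumed figure; the rule is blur width ~ tracking speed x persistence):

    ```python
    # Rough, illustrative numbers: blur width ~ tracking speed x persistence.
    scroll = 240             # assumed horizontal scroll, pixels per second
    refresh = 60             # Hz

    hold = 1 / refresh       # sample-and-hold lights each frame for ~16.7 ms
    print(scroll * hold)     # ~4 px of smear while the eye tracks the motion

    flash = 0.0015           # a CRT phosphor flash lasts roughly 1-2 ms
    print(scroll * flash)    # ~0.4 px: nearly motion-perfect
    ```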

    Full Explanation here: https://news.ycombinator.com/item?id=42604613

luciferin 6 hours ago

This is fun to see right now. I've been playing around with CRT shaders in RetroArch for the last few days. My main goal is to use the [CRT-Beam-simulator](https://github.com/blurbusters/crt-beam-simulator) at 120 Hz and get some sort of CRT slot or shadow mask at the same time. I've landed on some settings I enjoy for N64 games, and it really has improved the experience for me.
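
Conceptually, the beam simulator spends the extra refreshes sweeping a bright band down the frame while the rest of the image decays. Here's a toy sketch of that rolling-scan idea; it is not Blur Busters' actual algorithm (which also deals with gamma and LCD crosstalk), just the core principle:

```python
import numpy as np

# Toy rolling-scan: at 120 Hz there are two output refreshes per 60 Hz
# source frame, so sweep a bright band down the image and dim the rest,
# roughly imitating a scanning beam plus phosphor decay.
def rolling_scan(frame: np.ndarray, subframes: int = 2, decay: float = 0.15):
    band = frame.shape[0] // subframes
    for k in range(subframes):
        out = frame * decay                      # faded "phosphor" remnant
        out[k * band:(k + 1) * band] = frame[k * band:(k + 1) * band]
        yield out                                # one display refresh

frame = np.random.rand(240, 320, 3)              # stand-in source frame
for sub in rolling_scan(frame):
    pass                                         # present each subframe at 120 Hz
```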

On the post's notes about the Sonic waterfall effect: the [Blargg NTSC Video Filter](https://github.com/CyberLabSystems/CyberLab-Custom-Blargg-NT...) is intended to recreate that signal artifact, and similar processing is included in a lot of the available CRT shaders. I found that the RGB setting had a visual artifact in motion that made the waterfall flicker, but composite didn't, so I played on that setting. Running it alongside the beam simulator is probably causing some of that.
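
For anyone unfamiliar with the trick itself: the waterfall is drawn as alternating columns of water and background, and composite video's limited bandwidth low-pass filters them into an average colour, i.e. fake translucency. A minimal one-dimensional sketch, using a simple box filter as a crude stand-in for the real bandwidth limits that Blargg's filter models:

```python
import numpy as np

# One scanline of waterfall dithering: water and background alternate
# every pixel. Intensities are illustrative single-channel values.
water, background = 0.9, 0.3
scanline = np.tile([water, background], 8)

print("RGB:      ", scanline[:6])        # every pixel stays distinct

# Composite's bandwidth limit acts like a low-pass filter; a 3-tap box
# filter is a crude stand-in for what Blargg's filter actually models.
blur = np.convolve(scanline, np.ones(3) / 3, mode="same")
print("composite:", blur[:6].round(2))   # pulled toward the 0.6 average
```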

  • dmonitor 5 hours ago

    Are you using an OLED display? I've found that's the only type of display that can even come close to reproducing the CRT look.

    • willis936 3 hours ago

      The blacks are there, but the brightness is not. I just played some Smash 64 on a CRT last weekend, and I use an OLED for my desktop.

mapontosevenths 7 days ago

I have a Retrotink 4K. I mostly use it for VHS transfer these days, but its original purpose is upscaling retro game images and applying various masks and filters to make games look like they're running on a CRT, on a modern display.

It works beautifully, and you no longer need a clunky, heavy, dying CRT. I'm sure the purists will say it's not the same, but I've done side-by-side comparisons IRL, and it's good enough for me even when pixel-peeping. I prefer the emulated CRT look on a modern OLED to the real thing these days.

  • hbn an hour ago

    I started playing Policenauts recently, and when I first booted the game I was straining my eyes trying to read the blocky pixelated text. I only recently started using RetroArch, but I did some digging and figured out how to enable a CRT filter and immediately it was 1000% easier on my eyes.

    The anime art and FMV sequences looked way better too.

  • amlib 4 hours ago

    I do something like this for my old game consoles, except that I pipe them through an old analog video capture card that supports 240p60 and use the video processing module in RetroArch to do the capture with minimal lag. After adding some fancy CRT shaders and other image adjustments carefully tailored for this, the image comes out looking great! I sometimes toggle the shaders off and wince at the "raw" digital capture. I actually bought this capture card back in 2008 for this purpose but detested using it until around 2018, when I started using it in conjunction with RetroArch and CRT shaders.

    For that period it even shaped my perception that analog video, and especially N64 graphics, was always bad; but those shaders vindicated all of it. They really do make a big difference, and they gave me a new appreciation for N64 graphics in particular.

    There is a common misconception online that the inherently "blurry" output of an N64 is bad (and sure, some games are just ugly from an artistic standpoint), but it's actually the smoothest image any analog console will ever produce when hooked up to a proper CRT or CRT shader, and it's consistent across all games because of the "forced" high-quality anti-aliasing in every title. Even the next generation of consoles seldom used AA.

  • CharlesW 5 hours ago

    Apologies for the off-topic question, but I'm so curious: How is this useful for VHS transfer?

    • kowbell 5 hours ago

      Not OP, but I assume "VHS transfer" meant "transfer to a digital format", i.e. digitize. The Retrotink is a fancy composite/component/VGA-to-HDMI box, so you can do: VCR playing a VHS -> Retrotink -> HDMI capture card -> computer saving that to a file.
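
      For the last hop, any generic capture tool works; here's an illustrative example, assuming a Linux machine where the capture card shows up as a UVC device at /dev/video0 (the device path, size, and frame rate are all assumptions):

      ```python
      import subprocess

      # Illustrative final step of the chain: HDMI capture card -> file.
      # Assumes the card is exposed as a V4L2/UVC device at /dev/video0.
      subprocess.run([
          "ffmpeg",
          "-f", "v4l2", "-framerate", "60", "-video_size", "1920x1080",
          "-i", "/dev/video0",
          "-c:v", "libx264", "-crf", "18",   # visually near-lossless H.264
          "capture.mkv",
      ])
      ```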

  • trenchpilgrim 6 hours ago

    An OLED with a great filter is good enough for most gamers other than archivists and hardcore collectors, yeah.

zzo38computer 42 minutes ago

In my opinion: CRTs have problems, including geometric distortion; however, an LCD can look bad when the picture's resolution doesn't match the display's, a mismatch that a CRT handles much more gracefully. An LCD is good when the source of the picture is made for the LCD at its native resolution.
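
A quick sketch of why the mismatch hurts on a fixed-pixel panel: at a non-integer scale, nearest-neighbour scaling gives the source pixels uneven widths, while an integer scale keeps them uniform (the 320-pixel-wide source is a hypothetical example):

```python
# How many panel pixels each source pixel occupies after nearest-neighbour
# scaling from `src` columns to `dst` columns.
def pixel_widths(src: int, dst: int) -> list[int]:
    owners = [i * src // dst for i in range(dst)]   # source pixel shown at panel column i
    return [owners.count(s) for s in range(src)]

print(set(pixel_widths(320, 1024)))   # 3.2x scale -> {3, 4}: uneven pixels
print(set(pixel_widths(320, 960)))    # 3.0x scale -> {3}: clean integer scaling
```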

briandw 20 minutes ago

Perfect example of the expert fallacy. Also how safetyism can cause harm.

rendaw 7 days ago

The peach on the author's CRT looks pretty awful, as does the photo. I'm curious what sort of CRT produced the meme image. Maybe it can't be done by a real CRT, but the author's CRT doesn't look anything like the example from the CRT database that they include below.

They also said the impression is different since it's so close up - what does it look like at the size you'd really see it in game?

  • CrossVR 3 hours ago

    > I'm curious what sort of CRT produced the meme image.

    The article mentions later that it's a PVM-20L2MD [1]. This is a professional CRT monitor for medical devices. It uses the same signals as a consumer TV, but comes with a higher quality tube that has a sharper picture.

    [1] https://crtdatabase.com/crts/sony/sony-pvm-20l2md

  • dmonitor 5 hours ago

    He seemed to test it on a bunch of computer monitors, not a standard 480i consumer television set? Different shadow masks and phosphor patterns change how things look.

    • drougge 4 hours ago

      The C= 1084S he uses is more a (very good) PAL TV than a computer monitor, even if it was sold as a monitor. So "576i" in your terminology. (It was also sometimes sold with a TV tuner; or at least the earlier 1084, which had the same picture tube AFAIK, was.)

Lerc 3 hours ago

I wonder if anyone has employed the services of a hyperrealistic artist to depict the image they see on a CRT.

Given their ability to produce a painting that appears identical to a photo, could they depict how the image appears to them, eliminating any loss from mechanical capture?

gwbas1c 5 hours ago

When I played NES and SNES as a kid, the resolution was so low that I only saw pixels. (Edit: I saw whole pixels when using the RF switch.) To this day, when I go back and play those games on modern consoles I just can't use CRT emulation.

Maybe I just didn't play games that used tricks to get around the pixels?

---

That being said, I remember that "New Super Mario Bros." on the Wii appeared to use a CRT trick to try to make the background flash in a boss room. I always played my Wii in 480p, so it just looked like there were vertical lines in the boss room.

  • RiverCrochet 5 hours ago

    I grew up playing Atari, NES, SNES, and PS1 games on old TVs most of the time, sometimes not the best quality. I also remember that in 80's arcades it was guaranteed for at least one or two machines to have CRT issues: misaligned colors, skew at the top/bottom, burn-in (common), a screen that was too bright, etc. All part of the experience, and quite nostalgic for me.

    The NES had a particular quirk with its NTSC output that I always thought was very characteristic of the system. I found this article a few years ago and was fascinated that work was done to really figure it out - https://www.nesdev.org/wiki/NTSC_video - and it's awesome that at least some emulators (FCEUX) seem to use this info to generate an experience quite similar to what I remember the NES being when I grew up. But I don't think any NES game's graphics really depended on this for visual output. All NES games had jagged vertical lines, for example.

    • gwbas1c 2 hours ago

      > The video timing in the NES is non-standard - it both generates 341 pixels, making 227 1/3 subcarrier cycles per scanline, and always generates 262 scanlines. This causes the TV to draw the fields on top of each other, resulting in a non-standard low-definition "progressive" or "double struck" video mode sometimes called 240p

      Ahh: I always wondered why I never saw interlacing artifacts on the NES! (I'm going to assume the same thing for the SNES too.)
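
      (The quoted numbers check out against the NES's well-documented clocks; the pixel clock is exactly 1.5x the colour subcarrier:)

      ```python
      # Checking the quoted timing against the NES's documented clocks.
      master = 21.477272e6       # NTSC NES master clock, Hz
      pixel_clock = master / 4   # the PPU emits one pixel per 4 master cycles
      subcarrier = 3.579545e6    # NTSC colour subcarrier, Hz

      print(341 / pixel_clock * subcarrier)   # 227.333... = 227 1/3, as quoted

      # Standard NTSC has 262.5 lines per field, so successive fields
      # interleave; the NES's even 262 lands every field on the same
      # scanlines, hence "240p" and no interlacing artifacts.
      ```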

gwbas1c 2 hours ago

> Sometimes, all we want to do is shout "CRTs were magic, bro, just trust me!"

I certainly feel that way when watching interlaced video. There's far too much bad deinterlacing out there. One of the biggest problems I encounter is that deinterlacers tend to halve the frame rate (i.e., 60i -> 30p instead of 60p, or 50i -> 25p instead of 50p).
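
For what it's worth, the halving is often just a default. ffmpeg's yadif deinterlacer, for example, will emit one frame per field if asked; an illustrative invocation (the filenames are placeholders):

```python
import subprocess

# Deinterlace 60i to 60p rather than 30p: yadif's send_field mode emits
# one progressive frame per *field*, keeping the full temporal resolution.
subprocess.run([
    "ffmpeg", "-i", "interlaced_60i.ts",
    "-vf", "yadif=mode=send_field",   # the default, send_frame, halves to 30p
    "progressive_60p.mp4",
])
```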

(That being said, when I would watch interlaced live TV on a 30" or more TV, I'd see all kinds of interlacing artifacts.)

  • ssl-3 2 hours ago

    With enough lines, enough brightness, and a high enough refresh rate, it may become possible to have a display that can artificially emulate the features of a CRT, including phosphor persistence, blooming, focus issues, power supply sag, and everything else, with interlacing. AFAICT, we aren't there yet.

    If/when this happens, we may be able to again view things as close as they were in broadcast, but with a modern display instead of an old CRT.

    If we can find any content to play that way, anyhow. A lot of it is cheerfully being ruined.

    Aside from the dozens of us who are interested in that, most folks seem convinced that television looked just as terrible as a blurry SLP VHS tape does after being played through a $12 composite-to-USB frame grabber, using a 3.5mm "aux cable" jammed in between the RCA jacks of the VCR and the converter, ultimately delivered by an awful 360p30 codec on YouTube, and scaled in the blurriest way possible... and they draw from this the conclusion that there are no details with any value worth preserving.

    Even though television was never actually like that. It had limits, but things could be quite a lot better than that awful mess I just described.

    (For those here who don't know: towards the end of the run, the quality of a good broadcast on a good receiver would often be in the same ballpark as the composite output of a DVD player today (but with zero data-compression artifacts instead of >0), including the presentation of 50 or 60 independent display updates per second.)

    • gwbas1c an hour ago

      > With enough lines, enough brightness, and a high enough refresh rate, it may become possible to have a display that can artificially emulate the features of a CRT, including phosphor persistence, blooming, focus issues, power supply sag, and everything else, with interlacing. AFAICT, we aren't there yet.

      To truly do that, you need to display over 20 million frames a second.

      Why?

      True analog video didn't capture frames; instead, each pixel was transmitted/recorded as it was captured. This becomes clear when watching shows like Mr. Rogers on an LCD: when the camera pans, the walls look all slanted. (This never happened when viewing on a CRT.) It's because the top part of the image was captured before the bottom part. I wouldn't even expect a 60i -> 60p deinterlacer to correct it.
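
      Some rough, illustrative arithmetic for that slant (the pan speed is an assumed figure):

      ```python
      # Within one 60 Hz field, the bottom scanline is captured ~16.7 ms
      # after the top one, so a pan shears the image.
      pan = 300                 # assumed horizontal pan, pixels per second
      field_time = 1 / 60       # top-to-bottom capture delay, seconds
      print(pan * field_time)   # ~5 px: vertical edges lean by this much
      ```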

      That being said, I don't want to emulate a CRT:

      - I want a deinterlacer that can figure out how to make the (cough) best image possible so deinterlacing artifacts aren't noticeable. (Unless I slow down the video / look at stills.)

      - I want some kind of machine-learning algorithm that can handle the fact that the top of the picture was captured slightly before the bottom of the picture; then generate a 120p or a 240p video.

      CRTs had a look that wasn't completely natural; it was pleasant, like old tube amplifiers and tube-based mixers, but it isn't something that I care to reproduce.

LennyHenrysNuts 7 days ago

I still have three working CRTs: a monochrome monitor for the Atari ST, a Sony Multiscan VGA, and some random Philips thing I saved from the skip.

I still play Diablo I on the Sony to this day. Wonderful monitor. I will cry when it finally dies.

karmakaze 7 days ago

This reminds me that my favorite way of watching movies at home was on a 1365x768 plasma TV at 24 fps. I really didn't like the 1080p, 120 Hz, and 4K sets that came after it. Great for sports and news, not so much for fiction.

  • tuna74 2 hours ago

    Maybe you should turn off the motion smoothing and show the movies in their proper 24 fps?

      Playing something like The Dark Knight Rises from a UHD Blu-ray on a good OLED looks incredible!

    • astrange 8 minutes ago

      You want either motion smoothing (of the 60 -> 120 fps kind) or black frame insertion on an OLED for good motion; otherwise the lack of decay between frames will make it look unnaturally juddery.