Artifacting - isn't it weird?

I was reading flashjazzcat's thread about his GUI project and saw his screenshots, and it got me thinking about artifacting (again).  Artifacting, for those who aren't familiar with the term, is an effect on NTSC systems with a composite monitor where the LUM signal interferes with the COLOR signal, causing half-color-clock (hi-res) pixels to display with a colored "fringe".  If you draw a whole row of even-numbered or odd-numbered pixels, you get one of two "fake" colors.  The effect is most pronounced with the background color set to black and the foreground to white (or vice versa).
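
To make this concrete, here is a minimal sketch (my own Python illustration, not anything GTIA-specific) of why even and odd pixel columns give two complementary fake colors: a row of alternating hi-res pixels is a square wave at the subcarrier frequency, and shifting it by one pixel (half a color clock, about 140 ns) flips its chroma phase by 180 degrees.

import numpy as np

# Toy model of a composite monitor's chroma demodulator looking at a
# 1-on/1-off hi-res pixel pattern.  All values are illustrative.
FSC = 3_579_545.0                            # NTSC color subcarrier, Hz
HALF_CLOCK_NS = 1e9 / (2 * FSC)              # one hi-res pixel, ~139.7 ns
t = np.arange(0, 20 / FSC, 1 / (FSC * 64))   # 20 color clocks of samples

def artifact_hue(delay_ns):
    # Luma square wave at the subcarrier rate, delayed by delay_ns.
    luma = 0.5 * (1 + np.sign(np.sin(2 * np.pi * FSC * (t - delay_ns * 1e-9))))
    # Correlate against the color reference to recover the apparent I/Q.
    i = np.mean(luma * np.cos(2 * np.pi * FSC * t))
    q = np.mean(luma * np.sin(2 * np.pi * FSC * t))
    return np.degrees(np.arctan2(q, i)) % 360

even = artifact_hue(0.0)             # pixels on even half-clocks
odd = artifact_hue(HALF_CLOCK_NS)    # same pattern shifted by one pixel
print(f"even columns ~ {even:.0f} deg, odd columns ~ {odd:.0f} deg (180 apart)")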

Anyway, what got me thinking about it is that FJC showed (I believe) an 800XL with purple/green artifacting and a 65XE with blue/orange.  My XEGS does purple/green, and my old (long-dead) 800XL did blue/orange.  I've also heard of yellow/blue, though I've never seen it.

So I'm wondering: what actually determines the color of the artifact?  I doubt it's (solely) the IC, because swapping GTIAs doesn't seem to affect it (with my limited sample set, anyway).  To answer this mystery I dug into the GTIA (and CTIA) data sheets to figure out exactly how this color timing works.

Hope this comes out right:


                                                                                                 _____________
                                                                                                /
LUM  ------------------------------------------------------------------------------------------X  LUM 0-F
         |<----------------------------------------------------------------------------------->|\_____________
         |                              T(lum1) = Luma output delay (max 450ns)
     ___ |                            ________________________                              __________________
        \|                           /                        \                            /
OSC      \     OSC low 140ns        /     OSC high 140ns       \                          /
         |\________________________/                            \________________________/
         |
          |    T(inv) = Color output delay (max 190ns)
         |<-------------------------------->|
      ______________________________________|_________________________________________________________________
                                            |/
COL                                         /   Color Reg = 0 (No color output)
      _____________________________________/


      _________________________________________________________________________________________________
                                           \   \   \   \   \   \   \   \   \   \   \   \   \   \   \   \
COL                                         \ 1 \ 2 \ 3 \ 4 \ 5 \ 6 \ 7 \ 8 \ 9 \ A \ B \ C \ D \ E \ F \
                                             \___\___\___\___\___\___\___\___\___\___\___\___\___\___\___\____
                                                                     |<->|
                                                                        Δt = 16-25 ns
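
As a sanity check on that 16-25 ns hue step (my own arithmetic, not from the datasheet): one full NTSC subcarrier period spread evenly across the 16 hue values lands right in that range.

FSC = 3_579_545.0          # NTSC colorburst frequency, Hz
period_ns = 1e9 / FSC      # ~279.4 ns, i.e. 360 degrees of hue
print(period_ns / 16)      # ~17.5 ns per hue step, within the 16-25 ns spec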


You can see from the diagram the same thing I realized: the luma output for a given color clock comes much later than the color output!  This seeming discrepancy is possibly explained by the delay introduced by capacitance in the color output circuitry, which can act as an AC delay (technically, a perfect capacitor turns the signal into the derivative of its input, for any calculus people out there).
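
To put a number on that "AC delay" idea, here is the equivalent delay a simple single-pole RC low-pass adds to a 3.58 MHz signal (a sketch with assumed, illustrative component values, not taken from any actual Atari schematic):

import math

FSC = 3_579_545.0   # NTSC subcarrier, Hz

def rc_delay_ns(r_ohms, c_farads):
    # Phase lag of an RC low-pass at the subcarrier, expressed as time.
    phase = math.atan(2 * math.pi * FSC * r_ohms * c_farads)
    return phase / (2 * math.pi * FSC) * 1e9

print(rc_delay_ns(1_000, 100e-12))   # ~51 ns, i.e. roughly three hue steps

A modest RC already buys about 50 ns, so it's plausible that the external circuit realigns (or misaligns) luma and chroma by whole hue steps.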

Now one thing you may be thinking is "What about the color adjustment pot?", which is a valid question.  The CADJ signal is a voltage input to C/GTIA which adjusts the delta (the phase step) between successive hues.  Because it affects the delta, there is only one setting that is "right": too high or too low and your spectrum spans more or less than 360°.  So given that only one setting is "right", it follows that you cannot change artifacting using this input without also making "normal" colors incorrect.
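
A toy way to see the "only one right setting" argument (a sketch assuming a nominal step of 22.5°, i.e. 360°/16; these angles are my illustration, not measured GTIA output):

def hue_angles(delta_deg):
    # Phase angle of each colored hue (1-F) for a given CADJ step size.
    return [(n * delta_deg) % 360 for n in range(1, 16)]

print(hue_angles(22.5))   # nominal: hues 1-F spread evenly around the circle
print(hue_angles(26.0))   # too high: hues E-F wrap past 360 onto the low hues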

As you can see, every hue value corresponds to a 16-25 ns delay; thus a difference in delay of as little as tens of nanoseconds in either the luma or color output can, without impacting normal color output, dramatically change the artifacting colors observed.  (Normal hues depend only on the chroma phase relative to the colorburst, while artifact hues come from the luma edges, so shifting the luma and chroma paths relative to each other moves only the artifacts.)  So take this data and combine it with the fact that just about every Atari model (and, thanks to widespread modification, often multiple systems within the same model) has a different color output circuit, and the conclusion I come to is that the video output circuit used is a primary (if not THE primary) influence on the artifacting output.
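
Putting the numbers together (again just my arithmetic): with the ~17.5 ns-per-hue step from above, a few tens of nanoseconds of extra delay in one path moves the artifacts by one or more whole hues.

HUE_STEP_NS = (1e9 / 3_579_545.0) / 16    # ~17.5 ns per hue step

for shift_ns in (10, 20, 35, 70):
    print(f"{shift_ns:>2} ns shift ~ {shift_ns / HUE_STEP_NS:.1f} hue steps")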

Interestingly, when I added color output to my 600XL I used the simple XE color circuit, and as a test I created a makeshift "composite" output by bridging the chroma to the luma via a capacitor.  The output was purple/green, just like my XEGS with the identical color circuit.

The only problem with my theory is FJC's demo output.  He showed his 65XE with blue/orange and his XL with purple/green - the opposite of my machines!  That seems to poke a big hole in my theory that I can't explain - so FJC, did you happen to mix up the labels on those two screen grabs?
