The term TV artifacts refers to a spot or "pixel" on the screen that displays
a different color than the one assigned to it. --De Re Atari, page D-1
There are two different types of artifacting associated with the Atari.
The first type is considerably more intuitive. Color cathode ray tube (CRT)
televisions and computer displays generate color by exciting red, green, and
blue phosphors arranged in either an aperture grille pattern (vertical wires)
or a shadow mask pattern (triads of dots).
The density of the phosphors defines the "dot pitch" of the display device.
If a video signal source defines a spot or pixel that is smaller than the dot
pitch of the display device, then accurate color cannot be reproduced by that
display device in that precise spot on the screen. This type of artifacting
is relatively minor with the Atari because of the relatively low resolution of
Atari graphics modes in comparison to the dot pitch of CRT display devices.
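This first condition amounts to a simple size comparison between the spot
being drawn and the display's dot pitch. A minimal sketch; the numeric
figures are illustrative assumptions, not measurements of any particular
monitor:

```python
def color_is_accurate(spot_mm, dot_pitch_mm):
    """A CRT can only reproduce accurate color for a spot at least as
    large as its dot pitch (the spacing of its phosphor triads)."""
    return spot_mm >= dot_pitch_mm

# Illustrative numbers only: a roughly millimeter-scale Atari pixel on
# a consumer TV with a coarser-than-computer-monitor dot pitch stays
# above the limit, which is why this first kind of artifacting is
# relatively minor on the Atari.
print(color_is_accurate(0.9, 0.6))   # -> True
print(color_is_accurate(0.3, 0.6))   # -> False
```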
NTSC Atari computers exhibit a considerably more profound type of artifacting
than the above. The following is from De Re Atari, Appendix D:
"Television Artifacts": http://www.atariarchives.org/dere/chaptD.php
Appendix D is credited to Atari's Lane Winner with assistance from Jim Cox.
This section discusses how to get multiple colors out of a single color
graphics mode through the use of television artifacts.
The ANTIC modes with which this can be accomplished are 2, 3, and 15. ANTIC
mode 2 corresponds to BASIC mode 0, ANTIC mode 15 is BASIC mode 8, and ANTIC
mode 3 has no corresponding BASIC mode. Each of these modes has a pixel
resolution of one half color clock by one scan line. They are generally
considered to have one color and two luminances. With the use of artifacts,
pixels of four different colors can be displayed on the screen in each of
these modes.
A simple example of artifacts on the Atari is produced by plotting two
points in GRAPHICS 8, one at an even and one at an odd horizontal position,
on a black background. Although both points are drawn with the same COLOR
value, each pixel will display a different color.
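The effect depends only on the horizontal parity of each plotted point: an
isolated lit pixel on an even half-color-clock position produces one false
color, and on an odd position the other. A sketch of that parity rule; the
letters "A" and "B" follow the labels in the luminance diagram later in this
section, and the actual hues vary by machine and tint setting:

```python
# Sketch: the artifact color of an isolated lit GRAPHICS 8 pixel depends
# only on whether it lands on an even or odd half-color-clock position.
# "A" and "B" are color classes, not absolute hues.

def artifact_color(x):
    """Artifact color class of an isolated lit pixel at horizontal
    position x (in GRAPHICS 8 half-color-clock pixels)."""
    return "B" if x % 2 == 0 else "A"

# Two plotted points of opposite parity show two different false colors
# even though both were drawn with the same COLOR value.
print(artifact_color(60), artifact_color(61))  # -> B A
```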
To understand the cause of these differing colors one must first understand
that all the display information for the television display is contained in a
modulated television signal.
The two major components of this signal are the luminance, or brightness, and
the color, or tint. The luminance information is the primary signal,
containing not only the brightness data but also the horizontal and vertical
syncs and blanks. The color signal contains the color information and is
combined or modulated into the luminance waveform.
The luminance of a pixel on the screen is directly dependent on the amplitude
of the luminance signal at that point. The higher the amplitude of the signal,
the brighter the pixel.
The color information, however, is a phase-shifted signal. A phase-shifted
signal is a constantly oscillating waveform that has been delayed by some
amount of time relative to a reference signal, and this time delay is
translated into the color.
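The delay-to-color mapping can be made concrete: a time delay of t seconds
against a subcarrier of frequency f corresponds to a phase angle of
360 * f * t degrees, and the decoder interprets that angle as a hue. A
numeric sketch (the function name and the idea of reading the angle directly
as hue are simplifications for illustration):

```python
NTSC_SUBCARRIER_HZ = 3.579545e6  # NTSC color subcarrier frequency

def delay_to_phase_deg(delay_s):
    """Phase angle (degrees, 0-360) that a time delay of delay_s seconds
    produces against the NTSC color subcarrier."""
    return (delay_s * NTSC_SUBCARRIER_HZ * 360.0) % 360.0

# One full subcarrier period (~279 ns) is a full 360-degree rotation,
# i.e. the same hue again; half a period flips the hue to roughly its
# complement.
period = 1.0 / NTSC_SUBCARRIER_HZ
print(round(delay_to_phase_deg(period / 2)))  # -> 180
```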
The color signal oscillates at a constant rate of about 3.579 MHz, thus
defining the highest horizontal color resolution of a television set. This
appears on the screen in the form of 160 visible color cycles across one scan
line. (There are actually 228 color cycles including the horizontal blank and
sync, and any overscan.)
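These counts follow directly from NTSC timing: the subcarrier runs at about
3.579545 MHz and the horizontal scan rate at about 15,734 lines per second,
so roughly 227.5 subcarrier cycles fit in one scan line, which the Atari
hardware treats as 228 color clocks. A quick arithmetic check:

```python
SUBCARRIER_HZ = 3.579545e6   # NTSC color subcarrier
LINE_RATE_HZ = 15734.27      # NTSC horizontal scan rate

cycles_per_line = SUBCARRIER_HZ / LINE_RATE_HZ
print(round(cycles_per_line, 1))  # -> 227.5

# The Atari counts 228 color clocks per line; 160 of them fall in the
# visible playfield, the rest in horizontal blank, sync, and overscan.
```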
The term "color clock" refers to one color cycle and is the term generally
used throughout the Atari documentation to describe units of measurement
across the screen. Graphics mode 7 is an example of one-color-clock
resolution, where each color-clock-wide pixel can be a different color
(subject to microprocessor limitations).
Atari also offers a "high resolution" mode (GRAPHICS 8) that displays 320
pixels across one line. This is generated by varying the amplitude of the
luminance signal at about 7.16 MHz, which is twice the color frequency.
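The doubling relationship is exact: the high-resolution luminance rate is
twice the color clock, so the 160 visible color clocks become 320 pixels,
each half a color clock wide. A sketch of the arithmetic:

```python
COLOR_CLOCK_HZ = 3.579545e6           # one color cycle = one color clock
HIRES_PIXEL_HZ = 2 * COLOR_CLOCK_HZ   # GRAPHICS 8 luminance rate

VISIBLE_COLOR_CLOCKS = 160
hires_pixels_per_line = VISIBLE_COLOR_CLOCKS * 2

print(hires_pixels_per_line)           # -> 320
print(round(HIRES_PIXEL_HZ / 1e6, 2))  # -> 7.16
```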
Since the two signals are theoretically independent, one should be able to
assign a "background" color to be displayed and then merely vary the luminance
on a pixel-by-pixel basis. This in fact is the way mode 8 works, the
"background" color coming from playfield register 2, and the luminances coming
from both playfield registers 1 and 2.
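That register arrangement can be sketched as a small function: every mode 8
pixel takes its hue from playfield register 2, while its luminance comes
from playfield register 1 when the pixel bit is set and from playfield
register 2 when it is not. The (hue, luminance) pairs below are a simplified
model, not the packed hardware register format:

```python
def mode8_pixel(bit, pf1_lum, pf2_hue, pf2_lum):
    """Return the (hue, luminance) a GRAPHICS 8 pixel displays.
    Hue always comes from playfield register 2; luminance comes from
    register 1 for a lit pixel, register 2 for an unlit one."""
    return (pf2_hue, pf1_lum if bit else pf2_lum)

# Lit and unlit pixels share a hue and differ only in luminance --
# which is why mode 8 is described as one color, two luminances.
print(mode8_pixel(1, pf1_lum=14, pf2_hue=0, pf2_lum=0))  # -> (0, 14)
print(mode8_pixel(0, pf1_lum=14, pf2_hue=0, pf2_lum=0))  # -> (0, 0)
```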
The problem is that in practice the color and luminance signals are not
independent. They are part of a modulated signal that must be demodulated to
be used. Since the luminance is the primary signal, whenever it changes, it
also forces a change in the color phase shift. For one or more color clocks
of constant luminance this is no problem, since the color phase shift will be
unchanged in this area. However, if the luminance changes on a half color
clock boundary, it will force a false color shift at that point. Moreover,
that color cannot be altered from the transmitting end of the signal (the
Atari computer).
Since the luminance can change on half color clock boundaries, this implies
that two false color, or artifact pixel types can be generated. This is
basically true. However, these two pixels can be combined to form two types
of full color clock pixels. This is illustrated below:
TV Scan     |<---- 1 color clock ---->|<---- 1 color clock ---->|
Line        |<-1 pixel->|             |
            |           |             |
Luminance       0     1     0     0       1/2 cc pixel, color A
(0=off,         1     0     0     0       1/2 cc pixel, color B
 1=on)          1     1     0     0       1 cc pixel,   color C
                0     1     1     0       1 cc pixel,   color D
Note that each of these pixels requires one color clock of distance and
therefore has a horizontal resolution of 160.
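The table above can be read as a small lookup: given the luminance bits for
four consecutive half color clocks, the pattern determines which artifact
pixel type appears. A sketch, assuming the four patterns exactly as in the
diagram:

```python
# Map the four luminance patterns from the diagram to the artifact
# pixel type they produce. Keys are four consecutive half-color-clock
# luminance bits; A-D are color classes, not absolute hues.
ARTIFACT_TYPES = {
    (0, 1, 0, 0): "A (1/2 cc pixel)",
    (1, 0, 0, 0): "B (1/2 cc pixel)",
    (1, 1, 0, 0): "C (1 cc pixel)",
    (0, 1, 1, 0): "D (1 cc pixel)",
}

def classify(bits):
    """Return the artifact pixel type for a 4-half-clock luminance
    pattern, or None for patterns not shown in the diagram."""
    return ARTIFACT_TYPES.get(tuple(bits))

print(classify([0, 1, 0, 0]))  # -> A (1/2 cc pixel)
print(classify([1, 1, 0, 0]))  # -> C (1 cc pixel)
```

Note that every entry spans one full color clock of distance, which is why
the artifact resolution is 160 across the line even in a 320-pixel mode.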
The colors A through D are different for each television set, usually because
the tint knob settings vary. Thus they cannot be described as absolute colors,
for example, red; but they are definitely distinct from each other, and
programs have been written that utilize these colors.
The actual colors seen depend upon the tint setting of the NTSC display
device, and also upon the version of the NTSC Atari computer used, as pointed
out by Bryan on Oct 7, 2008:
It's well known that different models produce different artifact colors.
The 800 produces Blue/Green, the 1200XL produces Green/Purple, and the other
XLs produce Blue/Red. The reason for this doesn't lie with GTIA, but
rather with the delays inherent in the different video buffer circuits.
When you start modifying the video circuits, you slightly alter the time
alignment between chroma and luma and the artifact colors change. The TV's
decoder will be synched to the colorburst supplied by the chroma signal, but
artifact colors are produced by changing the luma level at the 3.579 MHz
color frequency, which the NTSC Atari models are inherently set up to do.
A classic example of a game that utilizes color artifacting on the NTSC Atari
is the Broderbund game Choplifter!. A second example is Drol, also from
Broderbund.
More information about artifacting on the Atari 8-bit computers:
"Atari Artifacting" by Judson Pewther, Compute! #38, July 1983, p. 221,
also in Compute!'s Second Book of Atari Graphics.
"GRAPHICS 8 In Four Colors Using Artifacts" by David Diamond, Compute!'s
First Book of Atari Graphics.
A posting on AtariAge by phaeron, Posted Fri Jan 28, 2011: