VOGONS


40 Column Text Mode Issues


Reply 40 of 457, by NewRisingSun

Rank: Oldbie

Implemented wide-angle color demodulation. Instead of decoding the chroma signal exactly as the NTSC specification calls for, the R-Y axis is placed more than the proper 90 degrees from the B-Y axis (hence "wide-angle"), yielding a picture that may be more pleasing to the eye, but less accurate with regard to the NTSC specification.
It's often used in consumer TV sets to fool customers into thinking the picture is better than it actually is (if you have ever tried to properly calibrate a home entertainment system with products like Video Essentials, you know what I mean). See pictures:

http://www.geocities.com/belzorash/composite/wideangle.html
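
In code terms, the difference is just the angle at which R-Y is demodulated. Here's a minimal sketch of my own (not DOSBox source; the 105-degree value mentioned in the comment is an arbitrary example of a wide angle, not a figure from any particular TV set):

    // Sketch: chroma demodulation with an adjustable R-Y axis angle.
    // Per the NTSC spec, R-Y is demodulated exactly 90 degrees from B-Y;
    // a "wide-angle" set uses more than 90 (e.g. 105, an example value).
    #include <cmath>

    struct RGB { double r, g, b; };

    // y = luma, chromaAmp/chromaPhase = chroma vector (phase in radians,
    // relative to the burst), ryAngleDeg = 90.0 per spec, >90 = wide-angle.
    RGB demodulate(double y, double chromaAmp, double chromaPhase, double ryAngleDeg)
    {
        const double rad = 3.14159265358979 / 180.0;
        double by = chromaAmp * std::cos(chromaPhase);                     // B-Y axis
        double ry = chromaAmp * std::cos(chromaPhase - ryAngleDeg * rad);  // R-Y axis
        // G-Y follows from Y = 0.299R + 0.587G + 0.114B:
        return { y + ry, y - 0.509 * ry - 0.194 * by, y + by };
    }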

Last edited by NewRisingSun on 2005-10-08, 14:50. Edited 1 time in total.

Reply 42 of 457, by Great Hierophant

Rank: l33t

(they match the output of captured screenshots perfectly, and the way they're generated is fool-proof in its straightforwardness),

I'm not sure I would rely too much on MobyGames' screenshots, especially as some seem to be taken using a card other than a true IBM Color Graphics Adapter.

If this gets implemented in DOSBox, it should definitely be user-selectable, as some people will probably prefer one way and some the other.

In the 50-year history of the NTSC standard, you would have to be viewing TVs of this vintage to get the pure effect:
http://home.att.net/~pldexnis/restoretv.html

Reply 43 of 457, by NewRisingSun

Rank: Oldbie

I'm not sure I would rely too much on MobyGames' screenshots,

I *have* seen the composite color mode on my TV, and I've found that the screenshots are reasonably accurate. I could also say that my colors are right because that's the way my TV looks, but I don't think that I would be making a strong case that way. 😀

especially as some seem to be taken using a card other than a true IBM Color Graphics Adapter.

But that wouldn't change the colors. First, I think most 'clones' still used the 6845 video controller; second, the composite color mode, at least as far as the graphics card is concerned, is just a monochrome mode with the color burst signal turned on, so there's not much latitude in what a card can do differently.

In the 50-year history of the NTSC standard, you would have to be viewing TVs of this vintage to get the pure effect:

Well, I don't think any programmer of 80s software used a CT-100. As for the 'pure' effect, studio-grade equipment basically fulfills the NTSC spec completely, except for the phosphor primaries, which are the brighter (but less colorful) SMPTE-C primaries instead. Those in turn are almost exactly the same as sRGB, the standard for properly calibrated computer monitors (which doesn't mean that all computer monitors actually look that way 😉).

Reply 44 of 457, by Great Hierophant

Rank: l33t

I *have* seen the composite color mode on my TV, and I've found that the screenshots are reasonably accurate. I could also say that my colors are right because that's the way my TV looks, but I don't think that I would be making a strong case that way.

I'm sure that the few software companies that bothered to use the CGA color composite mode probably hooked the card up to the same composite monitors they used to test their Apple II games on. NTSC certainly lives up to its nickname "Never the Same Color Twice"; I guess you could make as strong a case as anybody.

But that wouldn't change the colors. First, I think most 'clones' still used the 6845 video controller; second, the composite color mode, at least as far as the graphics card is concerned, is just a monochrome mode with the color burst signal turned on, so there's not much latitude in what a card can do differently.

But it would be up to the video hardware to shift the phase of the color burst for each nybble or thereabouts. IBM had a 13" long ISA card stacked to the brim with TTL logic chips (as well as 16KB of RAM, the 6845 CRT controller and the 8KB character generator ROM). Most clones are half that length. I wouldn't be surprised if many cut corners on a feature not widely used. We have already seen that shifting the text characters one pixel column will switch the text artifacts.

Well, I don't think any programmer of 80s software used a CT-100. As for the 'pure' effect, studio-grade equipment basically fulfills the NTSC spec completely, except for the phosphor primaries, which are the brighter (but less colorful) SMPTE-C primaries instead. Those in turn are almost exactly the same as sRGB, the standard for properly calibrated computer monitors (which doesn't mean that all computer monitors actually look that way).

So how close will your color composite algorithm get to studio-grade calibration?

Reply 45 of 457, by HunterZ

Rank: l33t++
Great Hierophant wrote:

NTSC certainly lives up to its nickname "Never the Same Color Twice"; I guess you could make as strong a case as anybody.

I've always thought that People Are Lavender was more humorous 😜

BTW, I'm eating this stuff up. Don't let me interrupt 😀

Reply 46 of 457, by NewRisingSun

Rank: Oldbie

NTSC certainly lives up to its nickname "Never the Same Color Twice"

Undeservedly so. The reason for most inaccuracies lies in bad engineering on the part of TV set manufacturers, and in the fact that most people think the "hue" and "saturation" controls are to be adjusted to a setting you "like", as opposed to being used to correct transmission errors.

But it would be up to the video hardware to shift the phase of the color burst for each nybble or thereabouts.

No. The video hardware doesn't "shift the phase of the color burst" for each nibble. You're confused as to the workings of the composite color mode. The color burst is just a continuous 3.579545 MHz signal. Any color phase information is derived from the monochrome signal by the TV set's chroma decoder. In the composite color mode, as opposed to any other mode, the graphics card doesn't generate ANY color. It just generates a monochrome signal with a continuous 3.579545 MHz signal, the color burst, on top of it.
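
A sketch of the idea (my own illustration, not actual CGA circuitry): modeled in code, the card's entire contribution to a scanline is burst plus raw pixel luma, with nothing modulated onto a carrier:

    // Sketch of a composite-color-mode scanline as described above: a burst
    // of the 3.579545 MHz subcarrier on the back porch, then raw monochrome
    // pixel levels. The card encodes no color; the TV's chroma decoder
    // reads "color" out of how the pixel pattern beats against the burst.
    #include <vector>
    #include <cmath>

    const double PI  = 3.14159265358979;
    const double FSC = 3.579545e6;   // color subcarrier, Hz
    const double FS  = 4.0 * FSC;    // 14.318 MHz sample/dot clock

    std::vector<double> scanline(const std::vector<double>& pixelLuma)
    {
        std::vector<double> out;
        for (int i = 0; i < 9 * 4; i++)  // ~9 cycles of burst, 4 samples/cycle
            out.push_back(0.3 + 0.15 * std::sin(2.0 * PI * FSC * i / FS));
        for (double y : pixelLuma)       // then the pixels: luma only
            out.push_back(y);
        return out;
    }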

We have already seen that shifting the text characters one pixel column will switch the text artifacts.

That's hardly a cost-cutting feature. And fewer chips doesn't necessarily reduce functionality if all you're doing is combining discrete logic components into an integrated circuit.

So how close will your color composite algorithm get to studio-grade calibration?

As far as the colors are concerned, it should be 100% accurate when the "wide angle color demodulation" is turned off.

Reply 47 of 457, by Great Hierophant

Rank: l33t

No. The video hardware doesn't "shift the phase of the color burst" for each nibble. You're confused as to the workings of the composite color mode. The color burst is just a continuous 3.579545 MHz signal. Any color phase information is derived from the monochrome signal by the TV set's chroma decoder. In the composite color mode, as opposed to any other mode, the graphics card doesn't generate ANY color. It just generates a monochrome signal with a continuous 3.579545 MHz signal, the color burst, on top of it.

It seems a very simple principle when you put it that way. This site http://www.atariarchives.org/dere/chaptD.php talks about the principles behind the color composite mode of the Atari 8-bit machines, and comparing what you said to what it says, the principles sound very similar. But a CGA card has 640 horizontal pixels while the Ataris only had 320. Since the CGA doesn't have a clock crystal on it, it must be using the PC's full 14.318 MHz clock generator for the high resolution mode. (Compare the MDA, which uses a 16.257 MHz clock crystal and has a horizontal resolution of 720 pixels.)
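
The arithmetic backs that up: 14.31818 MHz is exactly four times the 3.579545 MHz subcarrier, so four hi-res pixels (or two 320-mode pixels) span one chroma cycle. A quick check:

    // Quick check: pixels per subcarrier cycle at the CGA's clock rates.
    #include <cstdio>

    int main()
    {
        const double fsc = 3.579545e6;    // NTSC color subcarrier
        const double clk = 14.31818e6;    // PC master clock / 640-mode dot clock
        std::printf("640 mode: %.2f px per chroma cycle\n", clk / fsc);       // 4.00
        std::printf("320 mode: %.2f px per chroma cycle\n", (clk / 2) / fsc); // 2.00
        return 0;
    }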

Reply 48 of 457, by HunterZ

Rank: l33t++

The video hardware doesn't "shift the phase of the color burst" for each nibble. You're confused as to the workings of the composite color mode. The color burst is just a continuous 3.579545 MHz signal. Any color phase information is derived from the monochrome signal by the TV set's chroma decoder. In the composite color mode, as opposed to any other mode, the graphics card doesn't generate ANY color. It just generates a monochrome signal with a continuous 3.579545 MHz signal, the color burst, on top of it.

I have a quick, somewhat-related question: Excluding convergence errors and poor-quality RGB phosphor patterns, is the red/blue color artifacting of white pixels on the NTSC displays driven by many computers and video game consoles solely a side-effect of this color burst method of generating low-res color output from a hi-res monochrome signal?

Reply 49 of 457, by NewRisingSun

Rank: Oldbie

color burst method of generating low-res color output from a hi-res monochrome signal?

I don't know what you mean by "color burst method". The color burst, first, tells the TV that it should decode color information; second, it serves as a phase reference for the color difference signal, which is modulated in phase and amplitude. That's not unique to the CGA; it's how the RCA color system works.

As for other systems: there are many possible reasons; I don't know every device out there. For example, the Nintendo Entertainment System generates a combined luma/chroma signal at the same time, thus reducing the luma resolution to the chroma resolution. It also creates square waves instead of sine waves, thus yielding many interesting artifacts.
(By the way, figuring out the NES' colors is a complete nightmare, because most Japanese game developers tested their games on Japanese 80s consumer TV sets, which took a LOT of liberty with the NTSC specifications, to put it mildly. That's why you don't see one accurate NES palette out there --- even Nintendo with their Gamecube rereleases can't get it right.)

Reply 50 of 457, by Great Hierophant

Rank: l33t

The NES's PPU has a 64 color palette in a 16x4 color matrix. Four of the 16 columns send the color burst and rely solely on the luminance signal, which is 5.37 MHz. The other twelve columns each have a differently phase-shifted color burst signal (I'm not sure whether the amplitude varies.) In this sense it is different from the color composite modes of the CGA, Apple II and Atari 8-bits, which only use the half/quarter pixel luminance signals combined with the reference color burst signal to select the color.
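
A rough sketch of that scheme (the 12 half-steps per subcarrier cycle are the conventional way of describing the PPU output; the level values are placeholders, not measured voltages):

    // Sketch of NES-style color generation as described above: the hue
    // column picks the phase of a square wave at the subcarrier frequency,
    // the row picks the two luma levels it swings between. Gray columns
    // output a constant level instead. Levels are invented placeholders.
    double ppuSample(int hue, double lumaLow, double lumaHigh,
                     int t /* time in 12ths of a subcarrier cycle */)
    {
        if (hue < 1 || hue > 12)              // gray columns: luma only
            return (lumaLow + lumaHigh) / 2.0;
        bool high = ((t + hue) % 12) < 6;     // phase-shifted square wave
        return high ? lumaHigh : lumaLow;
    }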

Unfortunately, most NES games were designed for Japanese TVs, which apparently tweak the colors to give more natural-looking Asian fleshtones. Fortunately, the computers using composite color modes are all American-based and don't have that problem. Even more fortunate is that the NES is the only major Japanese console that relied on the NTSC color model; all the others relied on RGB, even the Sega Mark III and the NEC PC Engine.

Reply 51 of 457, by NewRisingSun

Rank: Oldbie

luminance signal, which is 5.37 MHz.

How do you figure that?

(I'm not sure whether the amplitude varies.)

No, the amplitude, which is saturation, is the same throughout all 12 colors. In terms of voltage levels, at least: the 3x colors will appear less saturated due to white clip, and on some TVs the 0x colors will be clipped as well (and if I look at Japanese screenshots, they are supposed to be clipped).
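
One way to picture the white clip (my illustration, with an arbitrary clip level): once luma plus the positive half of the chroma wave exceeds the clip level, the wave is flattened, and a flattened wave demodulates to a smaller amplitude, i.e. less saturation:

    // White clip: the bright 3x colors ride so close to white level that
    // the positive half of the chroma wave is sheared off, reducing the
    // demodulated amplitude (= saturation). Clip level is illustrative.
    double clipToWhite(double luma, double chroma, double whiteLevel = 1.0)
    {
        double v = luma + chroma;   // instantaneous composite level
        return v > whiteLevel ? whiteLevel : v;
    }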

But I think we ought to stick to discussing PC stuff here. 😀

Reply 52 of 457, by Great Hierophant

Rank: l33t

How do you figure that?

Actually it is the pixel clock, but I have remarked on the similarities between the luminance signals and the pixel clocks of other computers. For example, the CGA has a pixel clock of 14.318 MHz in the 640 pixel mode and 7.16 MHz in the 320 pixel mode, and the ratios match with the NES's 5.37 MHz pixel clock in its 256 pixel mode.
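
For the record, the ratios in question (my arithmetic, using the usual published clock figures):

    // Dot clocks as multiples of the 3.579545 MHz color subcarrier:
    //   CGA 640 mode: 14.31818 MHz / 3.579545 MHz = 4.0
    //   CGA 320 mode:  7.15909 MHz / 3.579545 MHz = 2.0
    //   NES 256 mode:  5.36931 MHz / 3.579545 MHz = 1.5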

But I think we ought to stick to discussing PC stuff here. 😀

I think by your good efforts we can fully implement the CGA color composite mode and can turn our attention to other aspects of the CGA, if any, that deserve our attention.

Reply 53 of 457, by NewRisingSun

Rank: Oldbie

For example, the CGA has a pixel clock of 14.318 MHz in the 640 pixel mode and 7.16 MHz in the 320 pixel mode, and the ratios match with the NES's 5.37 MHz pixel clock in its 256 pixel mode.

No, no. That Atari page you linked had me fooled for a while, but some of the information on that site is complete BS. For example, it says "The color signal oscillates at a constant rate of about 3.579 MHz, thus defining the highest horizontal color resolution of a television set." This is completely wrong.
The color signal does not oscillate AT a rate of 3.579545 MHz; it oscillates AROUND the color burst reference carrier (and that one oscillates AT 3.579545 MHz). The maximum bandwidth for the in-phase signal is 1.5 MHz (120 lines) and 0.5 MHz (40 lines) for the quadrature signal.
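
In decoder terms, that asymmetry looks something like this (a sketch of my own; one-pole filters stand in for real filter design, purely to show the two bandwidths):

    // After demodulating the chroma against the 3.579545 MHz reference,
    // a decoder low-passes the two components with different bandwidths:
    // ~1.5 MHz for in-phase (I), ~0.5 MHz for quadrature (Q).
    #include <cmath>

    struct OnePole {
        double a, y = 0.0;
        OnePole(double cutoffHz, double sampleHz)
            : a(1.0 - std::exp(-2.0 * 3.14159265358979 * cutoffHz / sampleHz)) {}
        double step(double x) { y += a * (x - y); return y; }
    };

    // e.g. at a 14.318 MHz sample rate:
    //   OnePole iFilter(1.5e6, 14.318e6);
    //   OnePole qFilter(0.5e6, 14.318e6);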

I'm not sure how the term "pixel clock" relates to this, but it's definitely not something inherent in the NTSC specification; it's internal to the Atari hardware, and consequently it does not apply to the NES' PPU.

Last edited by NewRisingSun on 2005-10-08, 14:53. Edited 1 time in total.