VOGONS


CRT experience


Reply 60 of 120, by analog_programmer

Rank: Oldbie
doublebuffer wrote on 2023-08-02, 08:13:

What do you think of CRTs?

Briefly: No, thanks! I have eyes to keep.

from СМ630 to Ryzen gen. 3
engineer's five pennies: this world goes south since everything's run by financiers and economists
this isn't voice chat, yet some people, overusing online communications, "talk" and "hear voices"

Reply 61 of 120, by Jo22

Rank: l33t++
Deunan wrote on 2023-08-11, 11:18:

Usually only early CRT monitors (mono, CGA) could be damaged by invalid H/V sync signals. And even some of these early ones have some form of protection against it (will not sync to signals too far out of usual range).

That reminds me of a story I once read online.
Apparently, both IBM and Hercules pointed fingers at each other for broken MDA/MGA monitors back then.

If I remember correctly, IBM said it was due to Hercules cards being dangerous/not complying with specs, while Hercules blamed IBM for not including a filter inside the IBM MDA monitors.
A filter, so the story goes, would have limited the range of possible frequencies to begin with.

(Annotation: So the line output (flyback) transformer wouldn't be run out of specification/out of resonance and
its fine windings of wire wouldn't heat up and burn in the process. That's because it's high-impedance when in resonance (good), but low-impedance when out of resonance (bad, draws excessive current, like a short).)
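For intuition, the resonance that annotation refers to follows the standard LC tank formula, f = 1/(2π√(LC)). A quick sketch in Python; the component values below are purely illustrative, since real flyback inductance and stray capacitance vary widely by monitor:

```python
import math

def resonant_freq_hz(inductance_h: float, capacitance_f: float) -> float:
    """Resonant frequency of an ideal LC tank: f = 1 / (2*pi*sqrt(L*C))."""
    return 1.0 / (2.0 * math.pi * math.sqrt(inductance_h * capacitance_f))

# Illustrative values only: ~1 mH of primary inductance resonating with
# ~1 nF of stray/retrace capacitance lands in the hundred-kHz region.
f = resonant_freq_hz(1e-3, 1e-9)
print(f"tank resonance: {f / 1000:.1f} kHz")
```

Drive the stage far enough away from that point and the impedance the driver sees collapses, which is the overheating mechanism described above.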

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 62 of 120, by Joseph_Joestar

Rank: l33t
Skyscraper wrote on 2023-08-11, 12:01:

CRTs are also useful for newer games! 😁

https://www.youtube.com/watch?v=99B-h8sNrdc

Digital Foundry has a couple of videos on that topic as well:

https://www.youtube.com/watch?v=V8BVTHxc4LM
https://www.youtube.com/watch?v=3PdMtwQQUmo

I doubt many people have access to such a high-end CRT monitor, but it sure looks good.

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Athlon64 3400+ / Asus K8V-MX / 5900XT / Audigy2
PC#4: i5-3570K / MSI Z77A-G43 / GTX 970 / X-Fi

Reply 63 of 120, by tauro

Rank: Member

With the SyncMaster 955DF I just found that the "Highlight" function brings up the colors a lot; everything looks more vibrant... But each time you change the display mode (which happens a lot when using DOS; even some games do it when changing levels), that setting is lost and you have to manually reapply it (5 button presses). That's a minus! It also lets you change the color balance (each individual color) and the image sharpness.

analog_programmer wrote on 2023-08-11, 13:03:
doublebuffer wrote on 2023-08-02, 08:13:

What do you think of CRTs?

Briefly: No, thanks! I have eyes to keep.

Do all CRTs harm your eyes? Isn't it related to refresh rate?

Deunan wrote on 2023-08-11, 11:18:

6500K is what most monitors should be using, unless there is a specific requirement to have it set otherwise. The problem is CRTs are not so great at producing strong red, but can do strong blue (especially the more modern CRTs), so they were usually sold with (and defaulted to) 9300K, or close, to show a brighter picture in a well-lit room. But this does skew the white balance towards blue, as the color temperature suggests. Curiously enough, sometimes the blue phosphor (and/or gun) goes bad first (especially if the 9300K setting was used for a long time), so the screen turns sort-of reddish or yellow. In that case stay with 9300K, because going to 6500K would only make it worse - and in any case, the tube is all but done by that point.

6500k it is 👍

You guys are sharing very interesting info and links.
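The blue-heavy white point Deunan describes is easy to visualize with a color-temperature-to-RGB conversion. The sketch below uses Tanner Helland's well-known curve fit, which is approximate by design; the exact constants are his, not anything CRT-specific:

```python
import math

def kelvin_to_rgb(kelvin: float) -> tuple[int, int, int]:
    """Approximate sRGB value of a blackbody white point at a given color
    temperature, via Tanner Helland's curve fit (roughly 1000K-40000K)."""
    t = kelvin / 100.0
    r = 255.0 if t <= 66 else 329.698727446 * (t - 60) ** -0.1332047592
    if t <= 66:
        g = 99.4708025861 * math.log(t) - 161.1195681661
    else:
        g = 288.1221695283 * (t - 60) ** -0.0755148492
    if t >= 66:
        b = 255.0
    elif t <= 19:
        b = 0.0
    else:
        b = 138.5177312231 * math.log(t - 10) - 305.0447927307

    def clamp(x: float) -> int:
        return int(max(0.0, min(255.0, round(x))))

    return clamp(r), clamp(g), clamp(b)

print(kelvin_to_rgb(6500))  # near-neutral white
print(kelvin_to_rgb(9300))  # visibly blue-heavy white
```

At 9300K the blue channel saturates while red falls off, which is exactly the bluish cast the factory default traded for apparent brightness.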

Reply 64 of 120, by Joseph_Joestar

Rank: l33t
tauro wrote on 2023-08-11, 18:08:

Do all CRTs harm your eyes? Isn't it related to refresh rate?

Cheaper CRTs from the early to mid '90s could only handle low refresh rates and were sometimes prone to flickering.

Newer CRTs from the late 90s and mid 2000s (especially high-end models from Sony, NEC, ViewSonic and Iiyama) were capable of higher refresh rates (85 Hz and up) which are generally more pleasant to the eyes. This can be a bit subjective too, as some people aren't bothered by looking at a CRT running at 60 Hz for hours on end. Others (myself included) like having 100 Hz or more and can get headaches or eye fatigue from lower refresh rates.
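Whether a monitor can run a given resolution at a high refresh rate comes down to its maximum horizontal scan rate: it has to draw every line of every frame, plus blanking. A back-of-the-envelope check; the 5% vertical blanking factor is a rough assumption, as real VESA timings vary per mode:

```python
def horiz_scan_khz(refresh_hz: float, active_lines: int,
                   blanking_factor: float = 1.05) -> float:
    """Rough horizontal scan rate a mode needs: refresh rate times total
    lines per frame (active lines plus an assumed ~5% vertical blanking)."""
    return refresh_hz * active_lines * blanking_factor / 1000.0

# 1024x768 at 85 Hz needs a monitor good for roughly 69 kHz;
# the same resolution at 100 Hz pushes that to roughly 81 kHz.
print(f"{horiz_scan_khz(85, 768):.1f} kHz")
print(f"{horiz_scan_khz(100, 768):.1f} kHz")
```

This is why the 100 Hz+ comfort zone mentioned above effectively required the late, high-end tubes with 95-130 kHz scan rates.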

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Athlon64 3400+ / Asus K8V-MX / 5900XT / Audigy2
PC#4: i5-3570K / MSI Z77A-G43 / GTX 970 / X-Fi

Reply 65 of 120, by tauro

Rank: Member
tauro wrote on 2023-08-11, 18:08:

With the SyncMaster 955DF I just found that the "Highlight" function brings up the colors a lot; everything looks more vibrant... But each time you change the display mode (which happens a lot when using DOS; even some games do it when changing levels), that setting is lost and you have to manually reapply it (5 button presses). That's a minus! It also lets you change the color balance (each individual color) and the image sharpness.

Apparently this function can damage the CDT (color display tube, Samsung terminology) if used for extended periods of time. I think it may be related to the intensity of the colors. It's roughly twice as intense.

From the manual:

Tips for Highlight Zone
1. To protect CDT against the screen brightness, the Highlight Zone function persists for three
hours and then automatically stops. So please reset it to continue.

Could it damage it though? If it is truly harmful then, why is it an option?
I wish I could hack it and use it permanently like this...

Joseph_Joestar wrote on 2023-08-11, 18:24:

Cheaper CRTs from the early to mid '90s could only handle low refresh rates and were sometimes prone to flickering.

Newer CRTs from the late 90s and mid 2000s (especially high-end models from Sony, NEC, ViewSonic and Iiyama) were capable of higher refresh rates (85 Hz and up) which are generally more pleasant to the eyes. This can be a bit subjective too, as some people aren't bothered by looking at a CRT running at 60 Hz for hours on end. Others (myself included) like having 100 Hz or more and can get headaches or eye fatigue from lower refresh rates.

DOS forces 70Hz, how do you deal with that?

Reply 66 of 120, by Joseph_Joestar

Rank: l33t
tauro wrote on 2023-08-11, 19:42:

DOS forces 70Hz, how do you deal with that?

I can only speak for myself, but somehow, 320x200 @ 70Hz doesn't seem to bother me that much. It might be due to the lack of fine detail at that resolution or something like that. In contrast, viewing higher resolutions on a CRT monitor at 60 or 70 Hz does strain my eyes quite a bit.

Thankfully, tools like VBEHz can be used to force DOS SVGA games (e.g. WarCraft 2) to run at higher refresh rates. But I'm not sure if something similar can be done for 320x200.
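The 70 Hz figure isn't arbitrary: standard VGA drives a fixed horizontal scan of about 31.47 kHz, and the 200/350/400-line modes use 449 total lines per frame, which pins the vertical refresh. The arithmetic:

```python
VGA_HSYNC_HZ = 31_468.75   # standard VGA horizontal scan rate (~31.47 kHz)
TOTAL_LINES_400 = 449      # total lines/frame in 350/400-line modes (incl. 320x200)
TOTAL_LINES_480 = 525      # total lines/frame in 480-line modes

refresh_400 = VGA_HSYNC_HZ / TOTAL_LINES_400  # ~70.09 Hz
refresh_480 = VGA_HSYNC_HZ / TOTAL_LINES_480  # ~59.94 Hz
print(f"{refresh_400:.2f} Hz, {refresh_480:.2f} Hz")
```

So to get 320x200 above 70 Hz, a utility would have to reprogram the CRTC for a nonstandard horizontal rate, which is exactly the sort of thing early fixed-frequency monitors disliked.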

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Athlon64 3400+ / Asus K8V-MX / 5900XT / Audigy2
PC#4: i5-3570K / MSI Z77A-G43 / GTX 970 / X-Fi

Reply 67 of 120, by analog_programmer

Rank: Oldbie
tauro wrote on 2023-08-11, 18:08:

Do all CRTs harm your eyes? Isn't it related to refresh rate?

Yep, my eyes have always been overly sensitive to the flickering of CRT monitors and of cheap LCD displays with PWM backlight regulation. I definitely don't think I'll ever again need even a quality CRT "crate" (like some of those 100+ Hz Trinitrons) weighing over 25 kilos.

from СМ630 to Ryzen gen. 3
engineer's five pennies: this world goes south since everything's run by financiers and economists
this isn't voice chat, yet some people, overusing online communications, "talk" and "hear voices"

Reply 68 of 120, by shamino

Rank: l33t
tauro wrote on 2023-08-11, 19:42:

Could it damage it though? If it is truly harmful then, why is it an option?
I wish I could hack it and use it permanently like this...

My last CRT was a Sony Trinitron 19", not sure of the model. CPD-G400 comes to mind? Not sure if that's right.
It was absolutely beautiful, great contrast and glossy colors. But then the picture started going bad in 1-2 years. Turns out there was a known soldering defect in those monitors that was probably the cause and could have been fixed, but I knew nothing about that at the time.

The monitor had a menu option called "Image Restoration" or "Enhancement" or whatever. It was a one-time option. I used it. The picture was suddenly good again, for a while. Then it got really bad. By the time I got rid of that monitor, the picture had a constant gray smoke color and there were curved red lines all across the screen. It was a total piece of trash at 3 years of age.

To my understanding, that option overdrives the picture tube and ruins it. And as I recall there's no way to turn it off - you do it once and it's permanently in that mode for the rest of the monitor's life.
I think Sony's intent was to reduce warranty claims by adding a destructive option that "forces" the monitor across the warranty finish line. If the tube was ruined after that, it wasn't their problem anymore.

I don't know anything about the option on your monitor or whether it will shorten its life. But don't put too much faith in the manufacturer not to include "risky" options if they thought the risk was outweighed by improved sales.

Reply 69 of 120, by Jo22

Rank: l33t++
analog_programmer wrote on 2023-08-12, 07:22:
tauro wrote on 2023-08-11, 18:08:

Do all CRTs harm your eyes? Isn't it related to refresh rate?

Yep, my eyes have always been overly sensitive to the flickering of CRT monitors and of cheap LCD displays with PWM backlight regulation. I definitely don't think I'll ever again need even a quality CRT "crate" (like some of those 100+ Hz Trinitrons) weighing over 25 kilos.

Let's not forget about noise, too.

I had trouble with noise, rather.
Tube TVs (classic, non-100Hz) had this horrible 15 kHz whine that made me feel dizzy after a while. VGA monitors (31.5 kHz) weren't that bad, though. That's also why I still have a soft spot for vintage VGA monitors.

Then there was coil whine on motherboards, and noisy power supplies and noisy fans. There was a period in my life when I really suffered because of noise.
It forced me to build myself a passively cooled, low-performance PC, including a passively cooled PSU.
Headphones also helped me.
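The pitch of that whine is just the horizontal deflection frequency, which falls straight out of lines-per-frame times frame rate. A quick check (the VGA line count of 449 is the standard 400-line-mode total):

```python
# Horizontal scan frequency = total lines per frame x frame rate.
# This is the fundamental pitch of the deflection/transformer whine.
pal_hz = 625 * 25                # 15,625 Hz (PAL TV): squarely audible
ntsc_hz = 525 * 30_000 / 1_001   # ~15,734 Hz (NTSC TV at 29.97 fps)
vga_hz = 449 * 70.09             # ~31.5 kHz (VGA 400-line modes)
print(pal_hz, round(ntsc_hz), round(vga_hz))
```

15.6-15.7 kHz sits inside most people's hearing range, while ~31.5 kHz is well above it, which matches the experience of TV whine being grating and VGA monitors being tolerable.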

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 70 of 120, by tauro

Rank: Member
shamino wrote on 2023-08-12, 08:01:

My last CRT was a Sony Trinitron 19", not sure of the model. CPD-G400 comes to mind? Not sure if that's right.
It was absolutely beautiful, great contrast and glossy colors. But then the picture started going bad in 1-2 years. Turns out there was a known soldering defect in those monitors that was probably the cause and could have been fixed, but I knew nothing about that at the time.

2 years is very little time for such a big thing...

shamino wrote on 2023-08-12, 08:01:

The monitor had a menu option called "Image Restoration" or "Enhancement" or whatever. It was a one-time option. I used it. The picture was suddenly good again, for a while. Then it got really bad. By the time I got rid of that monitor, the picture had a constant gray smoke color and there were curved red lines all across the screen. It was a total piece of trash at 3 years of age.

The picture was completely gray? Or was it more like a permanent haze?

shamino wrote on 2023-08-12, 08:01:

To my understanding, that option overdrives the picture tube and ruins it. And as I recall there's no way to turn it off - you do it once and it's permanently in that mode for the rest of the monitor's life.
I think Sony's intent was to reduce warranty claims by adding a destructive option that "forces" the monitor across the warranty finish line. If the tube was ruined after that, it wasn't their problem anymore.

Wow... in this case it's not a one time thing. But with all you have told us, it makes me wonder if it is a destructive thing... The function is even advertised on the monitor, with a sticker in the front.

I noticed that the brightness option is not very useful, it adds a haze and it's not too desirable. The contrast is the important option, it means 'intensity' as far as I can see... I have it set to 100.

Maybe I'm whining because I'm used to an LED IPS (a very bright screen), but even with contrast set to 100 it is not intense. I wish I had a good one in perfect condition just to compare. This one doesn't seem to have been used a lot judging by the looks of it. It's perfectly white/beige as opposed to the classic yellowed beige. Is there a way to find out? Some test?

shamino wrote on 2023-08-12, 08:01:

To my understanding, that option overdrives the picture tube and ruins it.

I suppose the "Highlight Zone" function just increases the power/voltage, and that's what makes the image look more intense. I managed to reduce the number of button presses needed to enable it to just one. And as long as you don't change the resolution/mode, it keeps the "Highlight Zone" function on.

When you enable it, you get extra settings that only apply to a small section of the screen, but you can expand that section to fill all the screen and that's how you end up with a very intense picture. Then you can configure individual colors, sharpness and contrast.

These pictures show the difference, but it's exaggerated. Same ISO, aperture, exposure, etc. The monitor's default setting is not as dark as it looks in the photo. You'll notice a big difference. When you taste the "Highlight zone" stronger colors, you want them...

Attachments

Reply 71 of 120, by tauro

Rank: Member
Jo22 wrote on 2023-08-12, 08:48:

Let's not forget about noise, too.

I had trouble with noise, rather.
Tube TVs (classic, non-100Hz) had this horrible 15 kHz whine that made me feel dizzy after a while. VGA monitors (31.5 kHz) weren't that bad, though. That's also why I still have a soft spot for vintage VGA monitors.

I remember the noise some CRTs made... and I'm also a fan of silent systems. A big cooler, Noctua fans. There are also some cheaper ones that stay below 20 dB. Plus the case, plus placing it a little far, and you get a silent system.

Jo22 wrote on 2023-08-12, 08:48:

Then there was coil whine on motherboards, and noisy power supplies and noisy fans. There was a period in my life when I really suffered because of noise.
It forced me to build myself a passively cooled, low-performance PC, including a passively cooled PSU.
Headphones also helped me.

There are some PSUs that don't enable the fan until they reach a certain temperature/load. As a matter of fact, I'm using a RM750x that does exactly that.

There are some remedies for coil whine, such as using epoxy, cyanoacrylate, or even hot glue, depending on the kind of coil. I haven't tried them yet but I'm keeping them in mind.

Reply 72 of 120, by Hanamichi

Rank: Member
shamino wrote on 2023-08-12, 08:01:
tauro wrote on 2023-08-11, 19:42:

Could it damage it though? If it is truly harmful then, why is it an option?
I wish I could hack it and use it permanently like this...

My last CRT was a Sony Trinitron 19", not sure of the model. CPD-G400 comes to mind? Not sure if that's right.
It was absolutely beautiful, great contrast and glossy colors. But then the picture started going bad in 1-2 years. Turns out there was a known soldering defect in those monitors that was probably the cause and could have been fixed, but I knew nothing about that at the time.

The monitor had a menu option called "Image Restoration" or "Enhancement" or whatever. It was a one-time option. I used it. The picture was suddenly good again, for a while. Then it got really bad. By the time I got rid of that monitor, the picture had a constant gray smoke color and there were curved red lines all across the screen. It was a total piece of trash at 3 years of age.

To my understanding, that option overdrives the picture tube and ruins it. And as I recall there's no way to turn it off - you do it once and it's permanently in that mode for the rest of the monitor's life.
I think Sony's intent was to reduce warranty claims by adding a destructive option that "forces" the monitor across the warranty finish line. If the tube was ruined after that, it wasn't their problem anymore.

I don't know anything about the option on your monitor or whether it will shorten its life. But don't put too much faith in the manufacturer not to include "risky" options if they thought the risk was outweighed by improved sales.

Nah

The Sony GDM and CPD monitors from a certain era do have the issue you mention, but I don't believe it's intentional.
Calibration drift happens on all CRTs, and I reckon it's the cause of many being thrown away that could be fixed.

You have to remember a CRT is a hot box with precision resistors and capacitors degrading over time; a small deviation from their desired values causes problems. Add to that solder joints going bad from repeated heating and cooling.

Sony was early to embrace digital CRT chassis and continued to rely heavily on them for calibration and cost saving.
What happens is that the analogue components related to the G2 voltage go out of spec, the factory digital calibration is then incorrect, and it drives the monitor too bright.

"Image Restoration" is a built-in feature to help re-calibrate somewhat; it was designed so that image-critical applications/users could rely on the monitor for several years beyond the intended lifespan.
Ideally, using the factory calibration tool WinDAS is a better fix, and better still in conjunction with replacing the deteriorating resistors, capacitors, etc.

Background on types of CRT 'Chassis' which need to be calibrated in different ways:

Analogue chassis - No digital scan, limited to <40 kHz, only limited geometry adjustments, often compatible with 15 kHz sources; adjustment and calibration are manual tuning of potentiometers. E.g. early NEC MultiSyncs, Sharp, Sanyo, Epson, Mitsubishi PC98, X68000 and CGA monitors.

Analogue chassis with digital controls - Late 80s, early 90s. Same as above, but you get digital adjustment buttons (which are actually controlling RDACs) and limited memory recall. E.g. Mitsubishi Diamond Scan, Iiyama MF, some EGA monitors, <~1992.

Earlyish digital chassis - Digital controls, digital scan, digital geometry adjustment with more geometry adjustments available; can scan up to 96 kHz but cannot sync below 30 kHz; memory recall; calibration is done by software, though there may still be some potentiometers to adjust. E.g. most brands, ~1992-1999.

Without going into details, it is the digital scan component that enables much higher scan capability and geometry adjustments (with some tradeoffs).

Hybrid chassis - These combine an analogue part to maintain <30 kHz horizontal scan rate compatibility and a digital part to go above 40 kHz. You see limited adjustments for 15 kHz sources but the normal modern adjustments for, say, a 50 kHz source. NEC XM29+, some JDM NEC and Epson monitors, etc.

Late digital chassis - Same as the earlier digital chassis but with a higher level of integration, pushing toward 130 kHz scan rates; very limited adjustment outside of software calibration. Pushing resolution higher and cost lower.

There are of course other things that go wrong, such as flyback deterioration, cathode wear and phosphor burn... but on the whole I have more than 30 working CRTs in my collection and all work pretty well. The late Sonys are a bit troublesome to fix because they rely so much on that digital calibration.

Reply 73 of 120, by DerBaum

Rank: Oldbie

Some years ago I really wanted the experience back of playing my old PlayStations on real CRTs.
So I got myself 3 Sony TVs (a 4:3 and a 16:9 in the picture, and another one of the same 16:9 with a hi-fi rack that's at a different location).
I think I played 4 games and put all the TVs right back into storage. The same with all the monitors I have for my computers.
I don't know... LCDs look so much nicer and don't weigh 50 kilos each...
It was cool to see how OK CRTs still look, but compared to an LCD it just hurts my eyes.

Attachment: 2023-08-12 15.12.58.jpg (1.8 MiB, license: CC-BY-4.0)

FCKGW-RHQQ2

Reply 74 of 120, by lti

Rank: Member
Skyscraper wrote on 2023-08-11, 12:01:

CRTs are also useful for newer games! 😁

https://www.youtube.com/watch?v=99B-h8sNrdc

I was going to post the follow-up to that video. It shows the black crush problem I was talking about, which seems common on Trinitrons. I don't think it's normal, considering how many dark games were around back when CRTs were basically the only option.
https://www.youtube.com/watch?v=55WFuvivPhE
Here's another video from several years ago where someone tried to calibrate the black crush out of two Trinitrons (a third model) and failed.
https://www.youtube.com/watch?v=xH4fKpKLkBM

Reply 75 of 120, by Tiido

Rank: l33t

Black crush is a symptom of a worn tube, caused by incorrect cutoff voltages and acceleration (aka "Screen" or G2) voltage, both of which drift with age; but there's a point where these adjustments no longer give a satisfactory result.

T-04YBSC, a new YMF71x based sound card & Official VOGONS thread about it
Newly made 4MB 60ns 30pin SIMMs ~
mida sa loed ? nagunii aru ei saa 😜

Reply 76 of 120, by Hanamichi

Rank: Member

I'd say it's incorrect RGB cutoff and bias values due to drifting components.

Good writeup here:
https://shmups.system11.org/viewtopic.php?t=67477

Diamondtrons seem to universally need these adjusted, and I've brought one from death's door back to fantastic through the service menu.

I disagree that the tube is worn; they are pretty robust. Many owners put their CRTs aside in the mid 2000s, and a worn tube is subtly less sharp or shows actual burn.

You can see Sony D24 BVMs go on for 100k hours and still look great, while many Sony GDM-FW900s are in all sorts of sorry states.

Both monitors use the same tube (one has a 16:9 mask, the other 16:10), but the BVM has maintainable parts, and some folks have the experience to keep them going.

Reply 77 of 120, by BitWrangler

Rank: l33t++

A worn tube is like: close the blinds in the daytime to see a DOS prompt.

Unicorn herding operations are proceeding, but all the totes of hens teeth and barrels of rocking horse poop give them plenty of hiding spots.

Reply 78 of 120, by amadeus777999

Rank: Oldbie

Still have a few CRTs and always will - an old system without a CRT is a bore.
I use cheaper ones for common affairs and spare the better ones (20"+) for rare sessions.
Currently in use is a 19" Targa as a workhorse screen. It's a nice CRT but nothing like the big Iiyamas I have used.

Reply 79 of 120, by andre_6

Rank: Member
Hanamichi wrote on 2023-08-12, 12:10:

You have to remember a CRT is a hot box with precision resistors and capacitors degrading over time; a small deviation from their desired values causes problems. Add to that solder joints going bad from repeated heating and cooling.

Sony was early to embrace digital CRT chassis and continued to heavily rely on it for calibration and cost saving.
What happens is the analogue components related to G2 voltages go out of spec, the factory digital calibration is now incorrect and drives the monitor too bright

Could you please elaborate on the G2 voltage's analogue components, namely what these are? Are they located near the Screen and Focus knobs, or are they on the main board? I have two small CRTs that I've corrected, but I'd like to know specifically which components are getting out of spec so I can service them someday, before I'm eventually forced to fully recap both TVs anyway. Even though I corrected their respective issues, I know it was just a re-calibration and not a more definitive fix like replacing the failing components.

I have a small late 90's Sony Trinitron that has "breathing" issues (not blooming) and I saw this advice online at the time:

"When the cathode ray current is too high, too many electrons build up in the ultor anode, causing the positive charge of the anode to drop, that is, the voltage drops. When this happens, the electrons in the cathode ray move slower, causing the cathode ray to lose its “stiffness”, and it bends too quickly as the line of video is drawn, causing it to expand further to the side than it is supposed to.

The fix for this is simple! You need to decrease the cathode ray current AND increase the stiffness of the cathode ray. First, lower contrast as much as possible until going any lower makes peak white start to look too grey. This decreases cathode ray current, and therefore decreases the amount of excess electrons that can build up at the ultor anode.

You can further decrease the cathode ray current by making the cathode ray thinner. The easiest way to do this is to lower the brightness as low as it can go, so that the picture is almost totally black. Then, to make the picture have the correct brightness, don’t use the brightness setting, but instead increase the G2 “screen” dial on the flyback transformer until the picture has the correct brightness. This makes the cathode ray sharper and thinner, which decreases cathode ray current, AND it does one more thing: it makes the cathode ray more stiff because the G2 setting determines the initial acceleration applied to the electrons".

Trinitron aside, I have another small CRT from NEC from the early 90s that had a brightness issue, but it was related to brightness and black levels only; the geometry was never "breathing" or warped in any way. I did follow the quoted advice and it alleviated the Trinitron's "breathing" problems; as for the NEC, a simple knob adjustment corrected the brightness to a good black level without making it look too dark.

Thanks!