elianda wrote on 2022-07-12, 20:21:
We had this discussion here as well, about e.g. distinguishing between 640x400 and 720x400. Also that basically every TFT shows 640x400 as 720x400 double sampling every ninth pixel.
At the moment I'd say the best approach is, if you know which one it actually is, to allow only that one for the 400-scanline mode. There is no way for a capture card to determine the correct number of horizontal pixels to sample automatically.
One theoretical way of doing it in this specific context (VGA) would be to take advantage of the fact that 640x400 is line-doubled 320x200, whereas 720x400 isn't.
Consequently, if a capture card is set up for 640x400 and the feed it actually receives is 640x400 (line-doubled 320x200), then each frame digitized at 640x400 can be decimated by a factor of 2 (down to 320x200) and then line-doubled back to 640x400 essentially losslessly (assuming no noise or phase errors). Doing the same with a 720x400 feed digitized as 640x400 will not be anywhere near lossless.
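To make the round trip concrete, here's a rough sketch of the check (hypothetical code, not from any real capture application; it assumes frames arrive as 8-bit grayscale NumPy arrays of shape 400x640, and the function name and tolerance value are my own inventions):

```python
import numpy as np

def looks_line_doubled(frame_640x400, tolerance=2.0):
    """Decimate a 400x640 frame to 200x320, expand it back by pixel
    doubling, and report whether the round trip is (near-)lossless.

    For a true line-doubled 320x200 feed the reconstruction should match
    the original almost exactly; a genuine 720x400 feed sampled at
    640x400 should show a much larger reconstruction error.
    """
    decimated = frame_640x400[0::2, 0::2]  # keep every other row/column
    rebuilt = np.repeat(np.repeat(decimated, 2, axis=0), 2, axis=1)
    mean_abs_diff = np.abs(frame_640x400.astype(np.float64)
                           - rebuilt.astype(np.float64)).mean()
    return mean_abs_diff <= tolerance
```

The tolerance exists precisely because of the noise and phase caveat above: a real analog capture will never round-trip bit-exactly, so the threshold would have to be tuned against actual captures.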
Assuming what I'm suggesting above is correct, a capture card application's detection algorithm could:
a) assume 640x400 when ambiguous timings are detected
b) run a comparison between a captured frame (once every second, for example) and a decimated and re-line-doubled version of the frame
c) if the comparison in b) results in a match, do nothing; if there is no match, switch to 720x400
Then, once a switch to 720x400 capture has occurred, the same validation logic could be applied in reverse (if the comparison matches, switch to 640x400; otherwise do nothing) to return to 640x400 when appropriate.
The analysis would need to be limited to the y axis, since the signal is always sampled at 400 active lines (449 total) in both cases. Having proper phase values pre-configured in advance for both modes on a given VGA card would be a prerequisite, of course.
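Putting steps a)-c) and the reverse check together, and honoring the y-axis-only restriction, the once-per-second decision might look something like this (again a sketch with illustrative names; `frame` is assumed to be a 400-row NumPy array at whatever width is currently being sampled):

```python
import numpy as np

def rows_look_doubled(frame, tolerance=2.0):
    """Vertical-only test: in line-doubled content, rows 2k and 2k+1 are
    (near-)identical. Comparing row pairs works regardless of whether
    640 or 720 pixels per line were sampled."""
    even = frame[0::2].astype(np.float64)
    odd = frame[1::2].astype(np.float64)
    return np.abs(even - odd).mean() <= tolerance

def next_capture_mode(current_mode, frame):
    """Periodic mode decision: line-doubled rows are taken as evidence
    of 640x400 (doubled 320x200); anything else as 720x400."""
    doubled = rows_look_doubled(frame)
    if current_mode == "640x400" and not doubled:
        return "720x400"
    if current_mode == "720x400" and doubled:
        return "640x400"
    return current_mode
```

Note the built-in limitation this inherits from assumption a): non-doubled 320x200-style content would never trigger a switch back, so the scheme only discriminates reliably when the 640x400 source really is line-doubled.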
Does that make sense, theoretically, and is it feasible to implement?