VOGONS


3dfx voodoo chip emulation

Reply 240 of 386, by wd

User metadata
Rank DOSBox Author

Is "freeing video memory" of any relevance? The glDelete functions only mark the
specified indices as free'd so if you're doing a lot of glGen (or paired calls) you should
do that to not run out of them, but it's not related in any way to the actual memory allocated
on the graphics card as far as i know.
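
For anyone unfamiliar with the pairing being described, a minimal illustrative sketch (not taken from the wrapper code; texture size and format are arbitrary):

// Illustrative only: the glGenTextures/glDeleteTextures pairing referred to above.
GLuint tex;
glGenTextures(1, &tex);                     // reserve a texture name
glBindTexture(GL_TEXTURE_2D, tex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 256, 256, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);  // allocate storage, no pixel data yet
// ... draw with it ...
glDeleteTextures(1, &tex);                  // give the name back so it can be reused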

Reply 241 of 386, by kekko

User metadata
Rank Oldbie

The texture coordinates and depth values are already processed when transferred to the OpenGL drawing routines. Or at least that is what I meant to do, by recycling code from the software renderer (see the "get_st" function and the get_depth macro).
In the attachment you can find the glide2x.dll code plus a couple of compiled test programs (see the "test21" folder in particular).
When I talked of freeing video memory, I meant my video card's memory, not the emulated Voodoo card's memory (just in case it was not clear).
I thought of "releasing" a texture from the actual video card when that texture's memory space is overwritten in emulated memory: when a texture_write is issued and a dword is written into the emulated texture memory, the texture linked to that memory space should also be released from my video memory. That's what the ogl_texture_clear function is supposed to do.

Attachments

  • Filename
    glide2x_246_linux.zip
    File size
    1.86 MiB
    Downloads
    343 downloads
    File license
    Fair use/fair dealing exception

Reply 244 of 386, by bored

User metadata
Rank Newbie

kekko, I'm having a blast playing EF2000 in its 3dFX glory.....the major gaming news of 2010 from my point of view. Thank you so much.

But as usual, we always want more.....and I was wondering how it was going with the OpenGL version.
I compiled and ran EF2000 using the code from 'dosbox_ogl_exp9.zip', and while there are issues, the textures that were visible looked orgasmic and were rendered with breathtaking speed.

This may be a naive question, but is it possible to work backwards and recreate Glide from the raw 3dFX commands, thus allowing the use of existing Glide wrappers?

....and a word of appreciation to the DOSBox developers. I've never seen an open source project compile with so few warnings, or have such good documentation. Using the VS2008 instructions, I was up and running in under 30 minutes. Good work!

Reply 245 of 386, by kekko

User metadata
Rank Oldbie

Thank you! I'm actually quite proud of this result, which I couldn't have achieved without the help of great people like Hal, wd and gulikoza.

OpenGL is quite promising indeed, and I guess there's actually not much work left to do, but there are a few problems that made me stop, and I need some help and/or some time to spend on this (which I don't have 🙁 )

The interfacing code needed to make it work through a wrapper is much more complex than actually writing the few OpenGL calls needed.
The framework is mostly there; it's just a matter of completing the missing rendering features (and fixing some bugs).

Reply 247 of 386, by TouchMonkey

User metadata
Rank Newbie
kekko wrote:

About the perspective correction issue, I've read somewhere that I need to use glFrustum or gluPerspective, instead of glOrtho, in order to fix textures. Does anyone have a clue?
Thanks 😀

You definitely don't want to use glFrustum or gluPerspective; glOrtho is the way to go, you just need to modify the texture parameters based on the depth. If you use the frustum/perspective functions to set up your view/projection matrices, then your vertex X/Y coordinates will get messed up as they get shifted towards or away from the center based on their Z (depth). glOrtho does not modify the X/Y coordinates, so you want to stick with that.
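
For reference, a typical screen-space glOrtho setup for pre-transformed vertices looks roughly like this (a sketch; width and height stand for the emulated resolution):

// Sketch of a screen-space projection for pre-transformed vertices.
glViewport(0, 0, width, height);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0.0, width, height, 0.0, -1.0, 1.0);  // y flipped so (0,0) is the top-left pixel;
                                              // the z range just has to bracket the
                                              // depth values actually submitted
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();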

I've worked on OpenGL graphics code for a Dreamcast emulator, and the PowerVR chip in that system works similarly to the Voodoo: no transform or lighting hardware, so all of the coordinates arrive already in screen space.

The way I made perspective-correct texturing work for the PowerVR was to multiply the texture U/V coordinates by the vertex Z (depth) in the vertex shader, then use the texture2DProj lookup in the pixel/fragment shader.

The exact lines I used are below.

Vertex shader:
gl_TexCoord[0] = gl_Vertex.z * gl_MultiTexCoord0;

Fragment shader:
vec4 texel = texture2DProj(texture_sampler, gl_TexCoord[0]);
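
For context, a complete minimal GLSL 1.x pair built around those two lines might look like this (a sketch; texture_sampler is the only name assumed beyond the standard built-ins, the rest is boilerplate):

// Vertex shader (sketch)
void main()
{
    gl_FrontColor  = gl_Color;
    gl_TexCoord[0] = gl_Vertex.z * gl_MultiTexCoord0;          // pre-multiply by depth
    gl_Position    = gl_ModelViewProjectionMatrix * gl_Vertex; // usual transform
}

// Fragment shader (sketch)
uniform sampler2D texture_sampler;
void main()
{
    vec4 texel   = texture2DProj(texture_sampler, gl_TexCoord[0]); // divides by the q component
    gl_FragColor = texel * gl_Color;
}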

Give that a try and see if it works for you. I will say that I eventually moved the rendering code over to DX10 and had a MUCH easier time dealing with the pre-transformed coordinates and depth values. OGL can be a bit restrictive since even with shaders it is very fixed-function oriented. DX10 lets you write almost the entire graphics pipeline in the shaders, and lets you define whatever you want a "vertex" to be, so you can abuse the hell out of it 😀 Of course that limits you to Windows, but there are pluses and minuses to everything.

Reply 248 of 386, by leileilol

User metadata
Rank l33t++
TouchMonkey wrote:

I've worked on OpenGL graphics code for a Dreamcast emulator, and the PowerVR chip in that system works similarly to the Voodoo: no transform or lighting hardware, so all of the coordinates arrive already in screen space.

Great! Finally a PVR hacker arrived.

long live PCem

Reply 249 of 386, by kode54

User metadata
Rank Member
kekko wrote:

About the perspective correction issue, I've read somewhere that I need to use glFrustum or gluPerspective, instead of glOrtho, in order to fix textures. Does anyone have a clue?
Thanks 😀

Hi, I see you can't read. Maybe you'll listen to somebody else instead?

Reply 250 of 386, by kekko

User metadata
Rank Oldbie

@TouchMonkey
Thank you! You cleared things up quite a bit about those functions.
So, you're saying that perspective correction can't work, because triangles are actually in 2D space, right? And the solution you implemented is to fix texturing pixel by pixel with a simple shader? That's quite interesting, indeed.
Unfortunately I'm not sure I'd be able to easily implement some shader code, but I was thinking of going the other way around; please check if it makes sense to you.
What about transforming 2D vertex X,Y coordinates back to 3D using an inverse projection formula, then using glFrustum (or gluPerspective) to represent the triangles in 3D space? Might that work?

@leilei
The PVR emulator by TouchMonkey might fulfil your dream of getting the chip emulated in DOSBox 😉

@kode54
I must confess I'm not an expert on the matter, but I don't see the need to be rude.

Reply 251 of 386, by TouchMonkey

User metadata
Rank Newbie
kekko wrote:

@TouchMonkey
So, you're saying that perspective correction can't work, because triangles are actually in 2D space, right? And the solution you implemented is to fix texturing pixel by pixel with a simple shader?

That's pretty much correct. You have a vertex with X/Y values that specify a specific pixel, and a depth value that can be whatever the designer wanted and doesn't even have to be related to the other vertices in any way. A shader isn't the ONLY solution, however it is the one I have code in front of me for.

When I originally wrote the PVR code I wasn't using a shader and I had managed to get textures working correctly. I just can't remember how I did it 😀 I needed to go to a shader so I could correctly model some blending modes that aren't available in OpenGL. Unfortunately I don't have a copy of the original source any more, as I've changed revision control systems and my old OpenGL code never made it into the new system. I only have a single copy of my last OpenGL version before I moved everything over to DirectX 10. I'm not sure you'll run into that with the Voodoo, but the main reason I changed was that OpenGL is extremely restrictive about the depth buffer, and finding a well-supported method of having a floating-point depth buffer that wasn't restricted to 0-1 was not easy (or possible, really). DirectX is MUCH more flexible. But that's a different story.

But let's see if we can hack around this anyway.

The logic that I did in the vertex shader can just be done directly in your code. If you have a vertex V with X, Y, and Z parameters and a texture coordinate T with U and V parameters, you could use the following code:

glTexCoord2f(T.U * V.Z, T.V * V.Z);

That does everything the one line of vertex shader did; it's just a multiplication of the UV coordinates by the depth.
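
In immediate-mode terms that sits right before the matching glVertex call, something like this (a sketch; V and T stand for whatever per-vertex structures the wrapper already keeps):

glBegin(GL_TRIANGLES);
for (int i = 0; i < 3; i++) {
    glTexCoord2f(T[i].U * V[i].Z, T[i].V * V[i].Z);  // depth-scaled UV, as above
    glVertex3f(V[i].X, V[i].Y, V[i].Z);
}
glEnd();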

Now for the hard part: trying to replace the fragment shader with fixed-function commands. I remember it taking a couple of tries to get working correctly before, but it isn't impossible.

The difficulty comes in that there are a number of different texture lookup functions, and I don't think the default implementation uses texture2DProj(); I think it just calls texture2D(). There's also texture2DRect(), which is used when you specifically don't want any correction done.

As I said, I can't remember what my final solution was, but here are a couple of things to try.

Attempt 1: Try using the 4 parameter version of glTexCoord. The 3rd parameter is used for 3D textures, which you're not using, but the 4th parameter is used as a modifier for the other parameters. This may seem counter-intuitive but try modifying the glTexCoord command again as follows:

glTexCoord4f(T.U * V.Z, T.V * V.Z, 0.0f, 1.0f / V.Z);

Basically we're multiplying UV by the depth, then supplying the 4th parameter (I think it's T, as in UVST) as the inverse of the depth. What's funny is that U, V and S get multiplied by T, which will undo everything, but I think it may work.

If that doesn't work, try switching the multiplies and the divide:

glTexCoord4f(T.U / V.Z, T.V / V.Z, 0.0f, V.Z);

While you're at it, you might as well try these variations:

glTexCoord4f(T.U, T.V, 0.0f, 1.0f / V.Z);
glTexCoord4f(T.U, T.V, 0.0f, V.Z);

I think the first one is the most likely to work.

If that doesn't work there's something else we can try.

Attempt 2: Change glTexCoord back to using the 2 coordinate version (with the multiplication) but change your glVertex commands to use the 4 parameter version. You would end up with:

glTexCoord2f(T.U * V.Z, T.V * V.Z);
glVertex4f(V.X, V.Y, V.Z, 1.0f / V.Z);

If the results aren't quite right, try dividing T.UV instead of multiplying, or just using the normal UV.

I'm not very confident about this one; Attempt 1 seems much more likely, but you can try it anyway.

kekko wrote:

@TouchMonkey
What about transforming 2D vertex X,Y coordinates back to 3D using an inverse projection formula, then using glFrustum (or gluPerspective) to represent the triangles in 3D space? Might that work?

Technically that could work, but I have a feeling it wouldn't be worth the effort. To summarize: you're missing a lot of information, and an entire vertex parameter, that the original calculations used when positioning the vertex in 3D space. Trying to project all of the coordinates into your own 3D space, then have them moved back successfully into their original X/Y positions by your own view/projection math, is extremely difficult and will break in a lot of cases where you have odd Z (depth) values. Not to mention you already have limited depth-buffer precision, so you want to avoid as much math as possible or pixels might start showing up in the incorrect order when polygons are close together in space.

Like I said though, that doesn't mean it can't be done; it will just be hard, and there are easier ways to go about fixing texture perspective issues.

I hope that all makes sense or at least provides some insight. This stuff is pretty complicated. I wish I still had my original code; let that be a lesson that you should always use a good revision control system and never throw away the history when moving things around.

Reply 252 of 386, by TouchMonkey

User metadata
Rank Newbie

Looking at my old code I'm pretty sure Attempt 1 is the correct method. I found the following macro still in use:

#define handle_texture_coordinates(u, v, z) \
glTexCoord2f(u, v);

Originally all of my texture coordinates used the Z (depth). When I converted everything to shaders, I was too lazy to modify all of my calls, so I kept the macro and just ignored the 3rd parameter.
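
So a call site looks the same whether or not the shader path is active; the depth argument is simply dropped by the macro now (illustrative call, not from my emulator):

handle_texture_coordinates(t.u, t.v, v.z);  // expands to glTexCoord2f(t.u, t.v); the z is ignored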

Reply 253 of 386, by Dominus

User metadata
Rank DOSBox Moderator

Not directly code-related, but DirectX is not really an option if that code is supposed to ever get accepted into DOSBox...

Windows 3.1x guide for DOSBox
60 seconds guide to DOSBox
DOSBox SVN snapshot for macOS (10.4-11.x ppc/intel 32/64bit) notarized for gatekeeper

Reply 254 of 386, by leileilol

User metadata
Rank l33t++
TouchMonkey wrote:

I just can't remember how I did it 😀 I needed to go to a shader so I could correctly model some blending modes that aren't available in OpenGL.

That's no problem: all PCX1-2 can do is alpha blend.

I can't see why that or even Glide would need DX10 to roll properly. I second the OpenGL motion

Though shaders would be cool for faking the characteristic 16-bit DAC dithering, that's something GLSL post-processing could do instead.

long live PCem

Reply 255 of 386, by TouchMonkey

User metadata
Rank Newbie
leileilol wrote:
TouchMonkey wrote:

I just can't remember how I did it 😀 I needed to go to a shader so I could correctly model some blending modes that aren't available in OpenGL.

That's no problem: all PCX1-2 can do is alpha blend.

I can't see why that or even Glide would need DX10 to roll properly. I second the OpenGL motion

I actually misspoke slightly there. I meant to say I moved to a shader because I needed to implement some blend methods that were not exactly identical to the options provided by the OpenGL fixed-function pipeline. Moving the blend logic to an OpenGL shader (GLSL) allowed me to exactly match the blending of the PowerVR.

The reason I moved to DirectX 10 was the framebuffer issue. Now this is COMPLETELY off topic, but the PowerVR used floating-point depth values WAY before any of the other chips got around to implementing it. OpenGL, unless you're using one of the newer extensions, only supports an integer-based depth buffer. So the float values you pass in have to get converted to a 16- or 24-bit integer, which can cause clipping issues when polygons get close together. Not only that, OpenGL requires that all depth values coming out of the view/projection matrices be mapped to between 0.0 and 1.0, inclusive, so they can be cleanly mapped to the integer values. Values outside that range get clamped. The PowerVR has no such restrictions: some developers use whole numbers or very tiny numbers or negative numbers; it really doesn't matter to the PowerVR as long as it is a valid float. This led to some severe problems trying to get all of the depth values mapped into the correct range.
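
To make that concrete, the 0.0-1.0 restriction forces something like the following per-frame normalization hack (a sketch; min_w and max_w are assumed to be gathered by a first pass over the frame's vertices):

/* Sketch: squeeze arbitrary PowerVR depth values into OpenGL's mandatory [0,1] window. */
static float normalize_depth(float w, float min_w, float max_w)
{
    float d;
    if (max_w <= min_w)          /* degenerate frame: everything at one depth */
        return 0.5f;
    d = (w - min_w) / (max_w - min_w);
    if (d < 0.0f) d = 0.0f;      /* GL would clamp these anyway */
    if (d > 1.0f) d = 1.0f;
    return d;
}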

Originally I tried using the floating-point depth buffer OpenGL extension to work around this, but ran into two major problems. The first is that, at the time, not all video card manufacturers supported the extension, so it wouldn't work everywhere. The second problem was the deal breaker: even though the buffer is floating-point, OpenGL STILL clamps all values to 0.0-1.0. This is stupid, in my opinion.

Now, DirectX 9 wouldn't help either, because a floating-point depth buffer wasn't guaranteed by the spec, and even then it was still a mostly fixed-function API. However, DX10 was a complete rewrite that guarantees an FP depth buffer as part of the implementation requirements. And even better, while the DX spec mentions that you should restrict values to 0.0-1.0, if you read into the shader documentation the card is required to use whatever depth value you pass out, no matter what the value. So by switching to DX10 I was able to implement the PowerVR's depth sorting with much more accuracy and with fewer hacks to constrain the depth values.

Don't get me wrong, I am NOT saying you should use DX for this project. I originally chose OpenGL because it was portable across multiple systems. I just want you to be aware of some of the pitfalls I ran into while trying to emulate a similar hardware device.

Reply 256 of 386, by kekko

User metadata
Rank Oldbie

Hi TouchMonkey, I'm thinking again about the inverse projection method; I guess I understood why you said it would work just in theory.
Some games use strange tricks or don't even use the z-buffer (just triangle z-sorting, I guess).
Moreover, you can't correctly convert clipped polys on the edges of the viewing frustum, for obvious reasons.
I'm not able to implement that shader technique; if you could give me some hints on how to do it, well, that would be just great.
Thanks.

Reply 257 of 386, by TouchMonkey

User metadata
Rank Newbie
kekko wrote:

Hi TouchMonkey, I'm thinking again about the inverse projection method; I guess I understood why you said it would work just in theory.
Some games use strange tricks or don't even use the z-buffer (just triangle z-sorting, I guess).
Moreover, you can't correctly convert clipped polys on the edges of the viewing frustum, for obvious reasons.
I'm not able to implement that shader technique; if you could give me some hints on how to do it, well, that would be just great.
Thanks.

Alright, I spent the morning writing up a little test app (first time I've touched OpenGL in years) and managed to get perspective texturing working without shaders or anything too fancy. I'm attaching the project so you can try it out for yourself. The program draws 2 rectangles. The one on the left is using normal texture coordinates and does not draw correctly. The one on the right uses modified texture coordinates and displays correctly. See the attached screenshot.

It turns out the correct method was to multiply U and V by Z, and specify Z as the 4th parameter. Here's my drawing code; the first function is "bad", the second draws correctly.

static void DisplayVertex(Vertex* v, float xShift)
{
    glColor4fv(v->color);
    glTexCoord2f(v->u, v->v);                            /* plain UV: no perspective correction */
    glVertex3f(v->x + xShift, v->y, v->z);
}

static void DisplayVertexCorrected(Vertex* v, float xShift)
{
    glColor4fv(v->color);
    glTexCoord4f(v->u * v->z, v->v * v->z, 0.0f, v->z);  /* UV scaled by depth, depth as the 4th component */
    glVertex3f(v->x + xShift, v->y, v->z);
}

To clarify one point, I don't specifically call glOrtho (or gluOrtho2D) or any of the other projection functions in this sample; I just did everything using the default (identity) projection and modelview matrices. The identity matrices don't do any modifications based on depth, so technically it is just an orthographic projection with X and Y ranges of -1 to 1, and 0 to 1 in the Z direction. Calling glOrtho just modifies the various matrices so that the X and Y ranges are scaled accordingly.
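
For anyone following along, the helpers above are driven by ordinary immediate-mode code; a sketch of how the corrected one is used (the quad array and texture setup are assumed to exist elsewhere in the test app):

/* Sketch: drawing one textured rectangle with the corrected helper. */
glEnable(GL_TEXTURE_2D);
glBegin(GL_TRIANGLE_STRIP);
for (int i = 0; i < 4; i++)
    DisplayVertexCorrected(&quad[i], 0.5f);  /* xShift pushes it to the right half */
glEnd();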

The project was written in Visual Studio 2010, but if you're not on Windows you should be able to copy the code into a different project. If you are on Windows and don't have VS, the Express versions are pretty nice and free.

Let me know if you have any trouble getting this running or have any questions about it.

Attachments

  • Filename
    PerspectiveCorrectTextures.zip
    File size
    3.72 KiB
    Downloads
    297 downloads
    File comment
    VS2010 Project Files
    File license
    Fair use/fair dealing exception
  • Filename
    Screenshot.png
    File size
    34.59 KiB
    Views
    3562 views
    File comment
    Screenshot
    File license
    Fair use/fair dealing exception

Reply 258 of 386, by kekko

User metadata
Rank Oldbie

Thank you TouchMonkey. Unfortunately it seems that nothing changed; the textures are still distorted.
Does it depend on glOrtho? If I try to comment out glOrtho and/or the glMatrix* calls, nothing is rendered anymore. Any idea?
Also, I noticed that if I change the Z of the upper vertices in your test app, the right texture gets distorted too.
Thanks.

Reply 259 of 386, by gulikoza

User metadata
Rank Oldbie

I might be totally off here, since I haven't seen what you tried.
You actually do not want to divide by the vertex->z, since that will usually be 1.0 or -1.0. I found something which might be helpful:

"The programmer provides initial texture values s/w, t/w, and 1/w for each vertex and Glide computes the gradients. The hardware performs the proper iteration and perspective correction for true-perspective texture mapping. During each iteration of row/column walking, a division is performed by 1/w to correct for perspective distortion."

What you need to put in as a fourth coordinate is 1/w. glOrtho should stay as it is.
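
In Glide terms that means the wrapper can mostly pass the per-vertex values straight through, roughly like this (a sketch; the field names follow the GrVertex/GrTmuVertex structs from the Glide 2.x headers, and the choice of depth value is left open):

/* Sketch: Glide already supplies s/w, t/w and 1/w per vertex, so feed them to GL as-is. */
static void submit_vertex(const GrVertex *v)
{
    glTexCoord4f(v->tmuvtx[0].sow,   /* s/w */
                 v->tmuvtx[0].tow,   /* t/w */
                 0.0f,
                 v->oow);            /* 1/w, so GL performs the per-pixel divide */
    glVertex3f(v->x, v->y, v->ooz);  /* ooz would still need scaling into GL's depth range */
}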

Again, sorry if I'm totally off 😜

http://www.si-gamer.net/gulikoza