
Cart #50080 | 2018-03-08 | License: CC4-BY-NC-SA

Material Capture, or "matcap" for short, is a really neat technique, and now that I've learned about it, I'm pretty surprised that it doesn't come up more often in gamedev conversations. It's great! Here's a PICO-8 implementation of the concept: it draws approximated realtime reflections on 3D meshes.

There are three settings that you can change while it's running:

Right/Left: Switch materials
Up/Down: Fast-mode / Slow-mode
O/X: Switch models

Fast-mode is kind of like a vertex shader in modern 3D engines (the material texture is only sampled along the edges of triangles). Slow-mode is kind of like a pixel shader (except there's no GPU to give it a speed boost).

P#50081 2018-03-07 22:54 ( Edited 2018-03-09 18:15)

:: Felice

Is this different from spherical environment mapping?

P#50095 2018-03-08 07:43 ( Edited 2018-03-08 12:43)

It's very similar - the downside here is that we only have a hemisphere of lighting info, so we have to lock it to the camera orientation (if you rotate the camera, the lights and reflections have to rotate with it)

Really simple overview of the method for lighting a single point on a surface:

  1. Get the surface normal and convert to view-space
  2. Convert the normal from (-1,1) space to (0,1) space
  3. Sample the matcap texture with that position as the UV coordinate

Then just make sure your texture is a render of a sphere, like the examples here
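The three steps above can be sketched in a few lines. This is a hypothetical plain-Python illustration (the cart itself is PICO-8 Lua); the function names and the row-major view matrix are my own assumptions, not code from the cart.

```python
# Sketch of the matcap lookup: view-space normal -> (0,1) UV -> texture sample.

def normalize(v):
    # Scale a 3-vector to unit length.
    l = (v[0]**2 + v[1]**2 + v[2]**2) ** 0.5
    return (v[0] / l, v[1] / l, v[2] / l)

def matcap_uv(normal, view_matrix):
    # 1. Rotate the world-space normal into view space (3x3 row-major matrix).
    n = tuple(sum(view_matrix[r][c] * normal[c] for c in range(3))
              for r in range(3))
    n = normalize(n)
    # 2. Remap x and y from (-1,1) to (0,1).
    u = n[0] * 0.5 + 0.5
    v = n[1] * 0.5 + 0.5
    # 3. (u, v) now indexes the matcap texture, which is a render of a sphere.
    return (u, v)

# A normal pointing straight at the camera lands in the center of the texture.
identity = ((1, 0, 0), (0, 1, 0), (0, 0, 1))
print(matcap_uv((0, 0, 1), identity))  # (0.5, 0.5)
```

Because the lookup only uses the view-space x and y of the normal, the texture is effectively locked to the camera, which is why the lights rotate with it.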

P#50115 2018-03-08 17:13 ( Edited 2018-03-08 22:13)
:: Felice

Yeah, that's pretty much camera-space spherical envmapping.

With spherical (vs. paraboloid) you basically have a render of a sphere that really only works well from at or near the same axis as the camera that rendered the environment image. You can warp it, or add a second image as seen from the opposite direction, to see the whole universe instead of half, but you get severe resolution issues around the circumference.

One thing I meant to try when I worked on PS2, but never got around to, was to have six renders, one per +/- axis, which only extended out to +/- 0.7071 in each direction, and then choose one of them to sample from based on which was the major axis of the normal. Basically a cube map, except still a spherical projection onto the textures, to save having to project the normal onto the face of the cube. I think it could work well, but I'm not sure sprite ram's sufficient to make it work on pico-8... maybe if you repeated or flip-flopped the four equatorial images and just had special ones for the north and south poles.
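The face-selection step of that six-render idea could look something like this. A hedged Python sketch, not tried on PS2 or PICO-8; the face names and UV layout are invented for illustration, and the +/-0.7071 bound comes from the description above.

```python
# Pick which of six spherical-projection images to sample, based on the
# major axis of the normal, then remap the two minor components
# (each within +/-0.7071 on the chosen face) to a (0,1) UV.

def pick_face(n):
    ax, ay, az = abs(n[0]), abs(n[1]), abs(n[2])
    if ax >= ay and ax >= az:
        face = "+x" if n[0] > 0 else "-x"
        minor = (n[1], n[2])
    elif ay >= az:
        face = "+y" if n[1] > 0 else "-y"
        minor = (n[0], n[2])
    else:
        face = "+z" if n[2] > 0 else "-z"
        minor = (n[0], n[1])
    # Still a spherical projection, so no divide-by-major-axis as in a
    # true cube map -- just remap (-0.7071, 0.7071) to (0,1).
    u = minor[0] / 0.7071 * 0.5 + 0.5
    v = minor[1] / 0.7071 * 0.5 + 0.5
    return face, (u, v)

print(pick_face((0.0, 0.0, 1.0)))  # ('+z', (0.5, 0.5))
```

Skipping the cube-map projection divide is the whole point: choosing a face is just three comparisons, which is cheap even without a GPU.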

P#50119 2018-03-08 18:54 ( Edited 2018-03-08 23:59)

Oh well shit, I guess I didn't actually know what spherical envmapping meant, then - thanks a bunch for the info!

Your spherical-cube idea is very interesting... it seems like you might want to generate those textures from a 3D render instead of drawing them by hand, to avoid seams. Seems like you could also author a cubemap and then project it into your sphere-cube format at load time?

P#50148 2018-03-09 12:57 ( Edited 2018-03-09 17:57)
:: Felice

Yeah, that's what I always imagined I'd do. Render the faces of a cubemap, then render each face of the cubemap as a texture onto an appropriately-warped grid of quads to produce the final image.

There are also some ways to do dynamic dual-paraboloid environment maps that skip the middle image-warp step by warping the rendered geometry instead, but you have to have well-tessellated geometry for that to work without obvious artifacts. The upside is you render two scenes instead of six. Each scene has more in it, due to the field of view being wider, but you at least skip all the culling and ordering of the other four.
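For reference, the dual-paraboloid lookup itself is simple. This is the standard formulation, sketched in Python as an assumption about what's meant above, not code from this thread: each hemisphere of directions maps onto one of two discs via a paraboloid projection.

```python
# Dual-paraboloid lookup: a unit direction selects the front or back map
# (by the sign of z), and the paraboloid divisor flattens that hemisphere
# onto a disc in (0,1) UV space.

def dual_paraboloid_uv(n):
    if n[2] >= 0:
        which = "front"
        d = 1.0 + n[2]
    else:
        which = "back"
        d = 1.0 - n[2]
    # Remap from (-1,1) to (0,1), as with the matcap lookup.
    u = n[0] / d * 0.5 + 0.5
    v = n[1] / d * 0.5 + 0.5
    return which, (u, v)

print(dual_paraboloid_uv((0.0, 0.0, 1.0)))  # ('front', (0.5, 0.5))
```

Unlike the single-sphere matcap, this covers the full sphere of directions with only mild distortion near each disc's rim, which is why two renders suffice.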

P#50150 2018-03-09 13:15 ( Edited 2018-03-09 18:42)

