Double Fine Action Forums

BecauseMyCatSaidSo

DFA Backers
  • Content Count: 6
  • Joined
  • Last visited

About BecauseMyCatSaidSo

  • Rank: Newbie
  1. That's one of the extensions that function seems to use (try disassembling it and the functions it calls). I was curious whether it was something that prevented sub-HD 3000 level hardware from being supported (a crash right after trying to map the buffer would be a telling sign). (There's a minimal sketch of the mapping call after these posts.)
  2. @RagingMind and @Jamey, does your card/GL driver support the GL_ARB_map_buffer_range extension? (A quick way to check is sketched after these posts.)
  3. So I'm trying to see if I can't correct some misconceptions about mipmaps that are unfortunately common. While spending the computation and a meager ~30% extra memory may be distasteful to some, mipmaps are actually a fantastically clever and easy solution to a difficult sampling/filtering problem in graphics (both realtime and not): they make images look way nicer while providing a really big speedup and reduction in power consumption. I see your statement in the abstract, but you need to consider the reality of doing rendering... when rendering good-looking, filtered and stable images, prefiltered data [mipmaps = prefiltered texture data] is key! Computing it online or offline are both fine. Many current/next-generation realtime and offline rendering techniques use mipmap-like multiresolution pre-filtered data (for texture+model data, sparse voxel octrees and virtual texturing are easy examples; for surface shading check out LEAN/CLEAN mapping). I hope people reading these posts who are interested in computer graphics aren't thrown off by someone disliking the idea of mipmaps and realize how cool they are. ps: If you don't like generating them, nvdxt/crunch/squish/pvrtextool/dxtextool/compressonator/any other tool out there will do a good job for you. Generating them on the fly for a dynamic texture or uncompressed texel formats is easy and fast: glGenerateMipmap. (A small generation sketch and a memory-overhead calculation follow after these posts.)
  4. The core "concept" of mipmaps is image processing and filtering theory... the basis of digital image synthesis. Lumping mipmaps into the same class of "static" data as LODs, pre-baked lighting, occlusion, pathfinding, etc. seems misguided. There's no plausible realtime alternative, nor would you really want one. Perhaps higher-quality filtering on your mipped images (both offline and online, like doing a real approximation of EWA filtering instead of bilinear), but none at all? Welcome to the PS1! Folks even take realtime-computed quantities (like procedurally defined textures/displacement maps) and bake them down just so they can perform mipmap-like filtering on them to reduce aliasing (for example, take a look at some of the uses of ptex inside Disney, or brickmaps provided by PRMan). It's hard to imagine not liking the idea of making your rendered images look better for trivial cost, with a huge performance increase as a bonus!
  5. While the analogy of mipmaps = LODs is very easy to make, it's fundamentally flawed. Mipmaps are critically important, and that analogy misses two important advantages they give you (which Oliver touches on). The first is that mipmaps are crucial to prevent aliasing. A good example is this image: mipmapping_example.png -- and it would be even worse in motion: the image on the left would be horribly flickery and shimmery as it moves, while the one on the right, while blurrier, would not flicker and shimmer. Things like anisotropic filtering of textures also help with this, but without mipmaps they would not be sufficient to get rid of aliasing. The second is simply bandwidth usage. Omitting mipmaps (and compression) can cause hugely more texture bandwidth to be used, which can be disastrous for performance, especially on these mobile devices where they are already bandwidth constrained. For quality, you'd think "high quality" rendering like film (RenderMan/etc.) would not use mipmaps, because it is a "quality reduction"? Nope: http://renderman.pixar.com/resources/current/rps/txmake.1.html -- it's equally important for them (or perhaps even more so, when you have hundreds of gigabytes of textures used in a single frame... without mipmaps you'd need to read all of that in). All mipmaps really do is pre-filter what you would otherwise need to do at runtime to prevent aliasing. One analogy: if you rendered a super-high-resolution image with no mips and then scaled it down, you'd approximate what you get by rendering at a regular resolution with mipmaps. (A filtering-setup sketch follows after these posts.)
  6. I think the poppy head-turns pointed out by the original poster may be a side effect of the way the characters are created (the creation process has been described in previous DF posts). Basically the characters have a set number of "angles" built into their geometry/textures (i.e. front/left side/right side), and the transition between them is basically hiding some parts and revealing other parts. So while they will probably be able to smooth things out a little bit with care, attention and time, it'll be hard to completely hide the fact that the model really is popping from one state to another. One of those tech tradeoffs that ends up turning into a style of its own. It's hard to get these things to look like a traditional 2d drawing being animated, since it's really a series of 3d planes being animated using traditional (3d!) bone+joint skinning. Can't blame them for taking that approach either, since they have an entire studio familiar with 3d techniques and can't re-staff or re-train everyone to be a traditional animator. Also I think it looks different and nice!
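
Regarding post 1: a minimal sketch of the mapping call that GL_ARB_map_buffer_range provides, assuming GLEW (or any similar loader) resolves the entry point; the helper name and flag choice are illustrative, not taken from the game's code.

    #include <string.h>
    #include <GL/glew.h>  /* assumption: GLEW (or another loader) resolves glMapBufferRange */

    /* Illustrative helper: overwrite a VBO through a mapped range.
       On drivers without GL_ARB_map_buffer_range (or GL 3.0), the entry point
       is NULL and calling it crashes right at the map, the symptom described above. */
    static void upload_vertices(GLuint vbo, const void *data, size_t bytes)
    {
        glBindBuffer(GL_ARRAY_BUFFER, vbo);
        void *dst = glMapBufferRange(GL_ARRAY_BUFFER, 0, bytes,
                                     GL_MAP_WRITE_BIT | GL_MAP_INVALIDATE_BUFFER_BIT);
        if (dst) {
            memcpy(dst, data, bytes);
            glUnmapBuffer(GL_ARRAY_BUFFER);
        }
    }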
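
Regarding post 2: one minimal way to check for the extension yourself, assuming an older (pre-core-profile) context where glGetString(GL_EXTENSIONS) is still valid.

    #include <string.h>
    #include <GL/gl.h>

    /* Returns non-zero if the driver advertises GL_ARB_map_buffer_range.
       Must be called with a current GL context. */
    static int has_map_buffer_range(void)
    {
        const char *ext = (const char *)glGetString(GL_EXTENSIONS);
        return ext && strstr(ext, "GL_ARB_map_buffer_range") != NULL;
    }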
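
Regarding post 3: a minimal sketch of generating mips on the fly for a dynamic texture, assuming a GL 3.0+ context where glGenerateMipmap is available; the helper name and RGBA8 format are assumptions for illustration.

    #include <GL/glew.h>

    /* Re-upload level 0 of a dynamic RGBA8 texture, then rebuild its mip chain. */
    static void update_dynamic_texture(GLuint tex, int w, int h, const void *pixels)
    {
        glBindTexture(GL_TEXTURE_2D, tex);
        glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, w, h,
                        GL_RGBA, GL_UNSIGNED_BYTE, pixels);
        glGenerateMipmap(GL_TEXTURE_2D);                       /* levels 1..N from level 0 */
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,  /* actually sample the mips */
                        GL_LINEAR_MIPMAP_LINEAR);
    }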
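
Also regarding post 3's "meager ~30% extra memory": a full mip chain adds 1/4 + 1/16 + 1/64 + ... = 1/3 of the base level, i.e. roughly 33% extra. A tiny standalone program that checks the arithmetic (the 2048x2048, 4-bytes-per-texel texture is just an assumed example):

    #include <stdio.h>

    int main(void)
    {
        const int base = 2048;                    /* assumed 2048x2048, 4 bytes/texel */
        double total = 0.0;
        for (int w = base; w >= 1; w /= 2)
            total += (double)w * (double)w * 4.0; /* bytes in this mip level */
        double base_bytes = (double)base * (double)base * 4.0;
        printf("mip chain overhead: %.1f%%\n",
               100.0 * (total - base_bytes) / base_bytes);
        return 0;
    }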
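
Regarding post 5: a sketch of the filtering setup being discussed (trilinear sampling of the mip chain plus anisotropic filtering on top of it), assuming the GL_EXT_texture_filter_anisotropic extension is present; the 8x cap is an arbitrary example value.

    #include <GL/glew.h>

    /* Enable trilinear filtering (requires a complete mip chain) and, if the
       extension is present, up to 8x anisotropic filtering on the bound texture. */
    static void set_quality_filtering(GLuint tex)
    {
        glBindTexture(GL_TEXTURE_2D, tex);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        if (GLEW_EXT_texture_filter_anisotropic) {
            GLfloat max_aniso = 1.0f;
            glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &max_aniso);
            if (max_aniso > 8.0f) max_aniso = 8.0f;
            glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, max_aniso);
        }
    }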