Double Fine Action Forums
Jamey Sharp

crash at startup on Linux


When I try to run The Cave, I see the cursor change shape at startup, then the swirly "Loading..." animation in the corner, and then the game segfaults.

Comparing with my wife's laptop, where the game runs, I notice that I should be seeing the Sega logo before the "Loading..." animation; and based on some error output when I replay an apitrace of the crashing session, I'm guessing that I don't have S3TC support enabled correctly. (Tips appreciated, since I do have libtxc-dxtn-s2tc0:i386 installed.) But this crash doesn't appear to be in a GL call or related to textures, so I thought I should report it separately.
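For what it's worth, here's roughly how I've been checking for S3TC: a quick Python sketch that just greps glxinfo output. It assumes mesa-utils' glxinfo is on the PATH, and it only reflects whichever Mesa build glxinfo itself links against, which may not be the 32-bit stack the game actually uses.

import subprocess

# Assumes the mesa-utils glxinfo tool is installed and on PATH.
output = subprocess.run(["glxinfo"], capture_output=True, text=True).stdout

ext = "GL_EXT_texture_compression_s3tc"
print(ext, "is advertised" if ext in output else "is NOT advertised")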

Here's the stacktrace of the crashing thread; I can provide stacktraces from the other threads if that would help, as well as an apitrace of the GL calls that completed before the crash.


#0  0x08960795 in VertexBufferInternal::ReplicateVertices(unsigned int, unsigned int, unsigned int) ()
#1  0x08974eb4 in RenderContext::RenderInstanced(VertexDeclaration const&, Array const&, IndexBuffer const&, VertexBuffer const&, Range const&) ()
#2  0x08ae13c9 in ParticleSnapshot::_RenderParticles(RenderContext*, TaskDispatcher*, mat4 const&) ()
#3  0x08ae1953 in ParticleSnapshot::Render(RenderContext*, TaskDispatcher*, mat4 const&) ()
#4  0x08a7de01 in SceneFrame::_RenderShadedSnapshots(RenderContext*, RenderMessagePump*, RenderSnapshot**, unsigned int, GeomFilter, char const*, bool) ()
#5  0x08a8712e in SceneFrame::_RenderShadedPass(RenderContext*, RenderMessagePump*) ()
#6  0x08ab2228 in SceneGraph::_RenderThreadMainLoop(Thread*) ()
#7  0x08508916 in Thread::Run() ()
#8  0x08508ef2 in ThreadImpl::_StartFunc(void*) ()
#9  0xf7b1ec39 in start_thread () from /lib/i386-linux-gnu/i686/cmov/libpthread.so.0
#10 0xf795b78e in clone () from /lib/i386-linux-gnu/i686/cmov/libc.so.6

I don't know how to find a version number for this game, but my copy of Cave.bin.x86 has a SHA-1 hash of b63c271a3949fcc72c3e2000a263b8e70990aeb2.
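In case anyone wants to compare their copy, here's roughly how I computed that hash (a minimal Python sketch that assumes it's run from the game's install directory, next to Cave.bin.x86):

import hashlib

# Assumes the script is run from the directory containing Cave.bin.x86.
sha1 = hashlib.sha1()
with open("Cave.bin.x86", "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):  # read in 1 MiB chunks
        sha1.update(chunk)
print(sha1.hexdigest())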

I haven't been able to find any hints about this crash on this forum or in general Google searches. (I guess most people don't run Steam games under gdb when they crash...)

Is this a known issue, or is there something I can do to troubleshoot it further?


Intel Core 2 Duo; i965; Debian testing; 4GB system RAM.

I won't be shocked if graphics performance is too slow on this hardware, but the game crashes before I can find that out...


It's the graphics chip for sure. The lowest officially supported GPU is Intel HD 3000 integrated graphics. Sorry, but unless someone at DF has something to say, you're probably out of luck.


So what you're telling me is I'm only four years and five hardware generations behind the minimum supported hardware? I don't see the problem. :-/

I'd still appreciate an official response on the segfault.


I have an Intel Core i5-3450 (HD 2500 graphics) running Debian Wheezy (testing), and I get the exact same behavior and crash when I try to run the game.


Have you tried throwing a message at support (at) doublefine (dot) com?

Hate to sound like a broken record, but the minimum supported GPU is the HD 3000. I imagine DF will help you more than most big publishers would, but don't expect miracles.

@RagingMind and @Jamey, does your card/gl driver support the GL_ARB_map_buffer_range extension?

glxinfo lists GL_ARB_map_buffer_range in the "OpenGL extensions" list, so yes, I think it's supported by the "Mesa DRI Intel® 965GM" driver version 8.0.5 that I have installed, although I'm not sure off-hand how to verify that my installed 32-bit build of Mesa has the same support as the 64-bit build. Why do you ask?
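The closest I've come to checking the 32-bit side is comparing package versions; here's a rough Python sketch (the Debian multiarch package names are assumptions based on my system, and a matching version is only a proxy for matching extension support):

import subprocess

# Debian multiarch package names are assumptions; adjust for your distro.
for pkg in ("libgl1-mesa-dri:i386", "libgl1-mesa-dri:amd64"):
    result = subprocess.run(
        ["dpkg-query", "-W", "-f", "${Package}:${Architecture} ${Version}\n", pkg],
        capture_output=True, text=True)
    print(result.stdout.strip() or pkg + ": not installed")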


That's one of the extensions that function seems to use (try disassembling it and the functions it calls). I was curious whether that's something that prevents sub-HD 3000 hardware from being supported; a crash right after trying to map the buffer would be a sign of it.
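If you'd rather not fire up a disassembler, a cruder check is to search the binary itself for the entry-point name. Here's a rough Python sketch (the path is a guess, and finding the string only shows the game can ask for the function, not that the crash happens inside it):

# The path is a guess; point it at your copy of the game binary.
binary = "Cave.bin.x86"

# Games that resolve GL functions at runtime (e.g. via SDL_GL_GetProcAddress)
# still embed the entry-point name as a string, so a raw search is a rough hint.
with open(binary, "rb") as f:
    data = f.read()

for name in (b"glMapBufferRange", b"GL_ARB_map_buffer_range"):
    print(name.decode(), "found" if name in data else "not found")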


My understanding is that the difference between the HD 2500 and the HD 3000 is the number of execution units, so I wouldn't expect the game to crash on my GPU, just run slower (possibly too slow to play, but I can't even get that far).

