Double Fine Action Forums


DFA Backers
About jensend

  • Rank
    Double Action Newbie
  1. I wasn't in a position to back it before Christmas, then I was sick and dealing with new responsibilities after Christmas, and I forgot how soon it was ending! Are there really no other opportunities to support it before buying a retail copy in the distant future?
  2. I played all of Act 1 offline and had no such problems.
  3. Absolutely 100% agree that you should not be able to carry more than one egg at a time - Vella should protest about falling. This puzzle took me a while. When I collected all three eggs, thinking the weight would help me return to the ground, and found that they didn't do anything, I was at a bit of a loss. It took me a while to think of the platform, because if the eggs aren't heavy enough to make me sink, they sure aren't going to make that platform sink. When taking things out of my pockets suddenly had seismic consequences, it broke my suspension of disbelief. The unrealism of inventories can only be taken this far if it's lampshaded as a joke, and I don't think that would work well here. Worries about a one-egg rule causing terrible tedium are misplaced: you are never more than a couple of screens away from the platform where the eggs are deposited, and you can double-click for fast screen transitions.
  4. After deleting the entire save game folder, the game launches normally. Unfortunately, the random hang/reboot problem persists, and it's showing up in a couple of other games as well; at this stage it may even be a hardware problem. If there's any way to recover the save, I'd appreciate it, since it would spare me replaying 95% of the game just to play the last 5%. A compressed copy of the save folder is on Dropbox.
  5. No dice. Brütal Legend still just sits at a black screen with 100% utilization of one CPU core right after the Bink etc. logos disappear. On the plus side, maybe that means my save file is not the problem and it could be recovered and played elsewhere?
  6. I have an AMD A6-5400K APU. (It has integrated HD 7560D graphics, which uses the same VLIW4 technology as AMD's 69xx discrete cards, not the GCN technology used by the 77xx and higher discrete cards; since DF has already indicated there may be some problems with 6xxx cards I thought that was important to note.) Brütal Legend worked just fine for me until yesterday, when I upgraded my drivers from Catalyst 12.8 to Catalyst 13.9 (at the request of AMD Support, who I had been talking to about another game which wasn't working-- and still isn't working). After the upgrade, my system started hanging or rebooting randomly in Brütal Legend- it'd give me anywhere from 2 minutes to 30 minutes of normal gameplay and then either hang or reboot. For a while I tried changing various settings and trying again- after all, I was close to the end of the single-player game, at the shores of the Sea of Black Tears after having completed almost all secondary missions, freeing 110 of the 120 serpents, etc, so I was hoping to just manage a bit longer. Then it rebooted when I think it was in the middle of saving my game, and now the game hangs on launch (after the DF and tech logos but before the Jack Black record store video). Any way my save might be recoverable? Any ideas on the driver front (I may just try to see if rolling things back does the trick)?
  7. Ah, hadn't found that thread before posting. (It veered off topic pretty fast, though.) DF Justin did say then that they didn't plan to fix it in the next patch, but that patch was released in February. This really is a very visually prominent problem. Every graphics stack has good bilinear interpolation/filtering/scaling routines, and since they don't need hundreds of images resized every second, even a very simplistic hand-rolled routine, which you could code up in just a handful of lines, would be more than fast enough.
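For the skeptical, here's roughly what such a hand-rolled routine looks like - a minimal grayscale bilinear-resize sketch in Python (all names here are my own illustration, nothing from DF's code):

```python
def bilinear_resize(src, src_w, src_h, dst_w, dst_h):
    """src is a row-major list of grayscale values; returns the resized list."""
    dst = []
    for y in range(dst_h):
        # Map the destination pixel back into source coordinates.
        fy = y * (src_h - 1) / (dst_h - 1) if dst_h > 1 else 0.0
        y0 = int(fy)
        y1 = min(y0 + 1, src_h - 1)
        wy = fy - y0
        for x in range(dst_w):
            fx = x * (src_w - 1) / (dst_w - 1) if dst_w > 1 else 0.0
            x0 = int(fx)
            x1 = min(x0 + 1, src_w - 1)
            wx = fx - x0
            # Blend the four nearest source pixels by their fractional distances.
            top = src[y0 * src_w + x0] * (1 - wx) + src[y0 * src_w + x1] * wx
            bot = src[y1 * src_w + x0] * (1 - wx) + src[y1 * src_w + x1] * wx
            dst.append(top * (1 - wy) + bot * wy)
    return dst
```

A real implementation would handle RGBA channels and use fixed-point math, but even this naive version is far better-looking than nearest-neighbor and trivially fast for a handful of images.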
  8. I bought The Cave recently; it's got a lot of brilliant ideas but seems unpolished, and some things drive me crazy. Here's one example that seems like it ought to be easy to fix. The cave paintings are good artwork, but (at least at non-default resolutions) they're ruined by awful artifacts caused by nearest-neighbor scaling. DF should know better. Naive linear scaling improves the situation tremendously. Linear scaling is not great itself, but for these cartoony images the difference between it and actually-decent-but-more-expensive algorithms like Lanczos is fairly minimal. The intro/loading screens also have scaling issues - probably the same nearest-neighbor scaling - but I haven't investigated those. Attachments: the cave paintings, which are in Packs\RgS_stuff, are natively 1024x512 (cp_tt_01.png). When you play the game at 1080p they're resized to 1612x806 and the nearest-neighbor interpolation is horrid (screenshot.png). Linear interpolation (linear.png) is good enough to no longer be distracting or painful to the viewer.
  9. The new system requirements are up, and they should serve as a much more accurate guideline than the previous ones. Looks like we're done here. Thanks!
  10. I hadn't actually tried it with 128MB, just wondered about it given the fairly simple look and the way it performed with 256MB. After reading your reply, I checked with GPU-Z; max GPU memory usage while playing through several encounters etc. at 640x480 was 197MB, about the same for 800x600, while at 1024x768 it was 226MB. So 256MB is the correct minimum, and extrapolating based on pixel count, 1080p should work just fine with 384MB, so I'd say use that for your "recommended" level rather than 1GB. The GeForce 7600s were midrange parts, and the Radeon X1300 was matched up against the much weaker GeForce 7300. Wikipedia's lists (links: AMD, NV, Intel) give you a good overview of product lineups and specifications. Several sites have fairly comprehensive benchmark databases available; a prominent example is Notebookcheck (remove the "still available" restriction and allow showing desktop GPUs too to get the full picture from the linked page). If you want more detailed game benchmarks etc., some hardware review sites have easily browseable archives (here's Tech Report's archive for the relevant period). Obviously none of these will tell you exactly how a card will perform with Costume Quest, but any of them should give you a fairly reasonable picture of the relative performance of different cards. Yes. I'm pretty confident about that; the one part I'm least sure of is whether the very lowest-end Intel HD Graphics - the one that came in Arrandale CPUs - is up to the task. This forum post from December seems to indicate it works on a normally-clocked Arrandale, but the poster didn't say how smoothly it ran. If you want to be conservative you might say "HD Graphics 2000."
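To show the arithmetic behind that 384MB suggestion, here's a back-of-the-envelope linear fit through the two GPU-Z measurements above (the helper is my own illustration, not anything from the game):

```python
def estimate_vram_mb(target_px, px_a, mb_a, px_b, mb_b):
    """Linear fit through two measured (pixel count, MB used) points."""
    per_px = (mb_b - mb_a) / (px_b - px_a)
    return mb_a + (target_px - px_a) * per_px

# Measurements above: 197MB at 640x480, 226MB at 1024x768.
est = estimate_vram_mb(1920 * 1080, 640 * 480, 197, 1024 * 768, 226)
print(round(est))  # ~304MB at 1080p
```

The texture data is roughly constant and only the resolution-dependent buffers grow with pixel count, so the fit lands around 304MB; 384MB is simply the next common card memory size above that, with headroom.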
  11. Some months ago I mentioned in the support thread that Costume Quest is completely unplayable on my usual machine, a (usually docked) laptop with only an Intel GMA 4500, even at absolute minimum settings and resolution. DF David F. thought the game might be CPU limited even though that machine had a fairly quick 1.9 GHz Core 2 Duo, but neither core ever hit 60% utilization during gameplay. The CPU wasn't the problem, the GPU was. Yesterday I finished playing through Costume Quest on an AMD E-350 system (HD 6310 integrated GPU) with intermediate settings at 1024x768. This dual-core CPU is much weaker than the Core 2 Duo (it's probably about 2/3 as fast), and at 1.6 GHz it definitely doesn't meet your stated minimum system requirement. But the game was quite smooth and enjoyable. Please fix the system requirements. The game requires less CPU power and much less GPU memory than you state, and it needs much, much more GPU power on the Intel side of things. Inaccurate system requirements lead to frustrated and angry customers who can't really play a game they paid for and other customers who could have enjoyed the game but didn't purchase it because they thought they couldn't run it. As I said, the two 1.6GHz AMD Bobcat cores in the E-350 did just fine; the next lower model is only 1.3GHz and is probably too slow. Any Core 2 Duo at 1.2 GHz and up (I think that's all of them) should be fine, as should any Athlon/Turion X2 at 1.5GHz and up (all but one model AFAIK); these processors are faster per-clock than Bobcat. I tried testing the game on a single-core Pentium 4 (2.4GHz, with hyperthreading) and found it could be kinda playable at minimum settings but it was definitely CPU limited. The very highest-end Intel Atom, the D2800, might be OK, but I don't think any other Atoms would work (plus they're normally paired with GPUs that are at least 10x too slow to run CQ). 
That's a complicated situation to sum up in the system requirement, but since you're already using C2D as a reference to specify the recommended system, you might say the game requires a 1.2GHz Core 2 Duo or equivalent. Maybe 1.3 or 1.4GHz if you want to be rather conservative, but certainly not 2.0. Costume Quest certainly doesn't require more than 256MB of GPU memory, and I wonder whether it really requires any more than 128MB or can even make any use of more than 512MB. Many of the graphics cards close to your stated minimum requirements were never available with more than 256MB of graphics memory, and cards with more memory than the norm for a given chip are actually quite often slower than normal. (For cards that are lower clocked, the manufacturer often tosses more memory on as a marketing point to sell these slower cards to unwary/uninformed buyers.) The G45 Intel chips are far, far away from being up to the task for Costume Quest. The Intel HD Graphics (built into Arrandale CPUs) might be playable on the lowest settings; I'm not sure. The HD Graphics 2000 (the lower-end Sandy Bridge CPUs) should be OK at normal settings. Just say "Intel HD Graphics or better." It's harder to know what the minimum ATI/AMD and nV cards should be, but it's easy to see that right now they're out of whack; for instance, the Nvidia 7600 GS has the same DirectX support as the X1300 and ~3x the performance, but you say it doesn't meet the requirements and the X1300 does. When I tested with the Pentium 4 and found it almost playable at min settings, that system had an nV 7600GS, and the video card was not the bottleneck; with a faster CPU the 7600GS would probably be playable. I doubt the Radeon X1300 is playable; the Radeon X1600 (along with the X1300XT, which was basically just a relabeled X1600) might be fine. 
You say any GeForce 8000+ would work, but some of the lower-end GeForce 8 series (8100-8400) might be questionable, since they performed significantly worse than the 7600GS in many games. I guess I'd suggest "Radeon X1600 or better" and "GeForce 7600GS or better."
  12. DFJustin: Thanks. I can be quite patient about this; I've played the game before and though I want to play it again and get the achievements it doesn't have to be immediate. Coryking: have you looked at play-through videos online to see how what you're seeing differs from the norm? One can't normally see Raz or just about anything else in there, but you should be able to see footprints.
  13. Note that the only official Steam statement about offline achievements is Source-engine specific (i.e. only for Half-Life 2 and associated games); it mentions that playing in HL2's Commentary Mode, Offline Mode, or cheat mode won't trigger any achievements. While there are some other games where some achievements don't trigger while offline, earning achievements while offline works just fine for most achievements on most Steam games. The most common exception is when a game has an event counter which is supposed to be totaled over all your times playing the game; since these have to reach across different save files, they are often completely outside the scope of any "saved state" which the game keeps track of. For instance, since you mention Dredmor, I've earned quite a number of Dredmor achievements while offline, but the "Kill x number of diggles" achievements won't trigger; the game doesn't keep track of that number itself at all, it just tries to relay an event to Steam each time a Diggle is killed. So it makes some sense that this can't be earned offline, though they could have chosen to program it in a way that would work with offline mode (saving the total number of diggles killed in some kind of global application data outside of any particular save game). None of the Psychonauts achievements are things which reach across saves, and almost all of them are things the game quite definitely keeps track of in the save file - whether different areas were completed, whether all of such-and-such items were collected, etc. As I said earlier, the Psychonauts level achievements (and several other achievements) can be earned offline and then persist after reconnecting. This suggests to me that much of the question is what gets stored in the save game files and how reconnecting and synchronizing with the Steam Cloud interacts with those saves. Apparently player level is read from the saves when synchronizing, but most other achievement-relevant facts aren't.
If it were just that the achievements I mentioned weren't triggering when in Offline Mode, that'd be regrettable (and fixable!) but not too uncommon for a Steam game. But there are two further oddities with Psychonauts achievements: First, the achievements actually do trigger while offline, but then they're overwritten when you reconnect and synchronize with the Steam Cloud. Second, once you've triggered the achievement while offline *you can no longer earn that achievement using that save game* . For instance, if I'm offline and I read a few post-its of camp gossip on the notice boards, the "Camp Gossip" achievement is triggered; when I reconnect, the achievement is erased *and I can't re-earn the achievement by looking at camp gossip while online.* I have to start a new save game.
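To make the "global application data" idea from my Dredmor example concrete, here's a hypothetical sketch (the file name and layout are my invention; this is not how Dredmor or Steam actually store stats):

```python
import json
import os

# Hypothetical app-data file living outside any particular save game.
GLOBAL_STATS = "global_stats.json"

def record_diggle_kill(path=GLOBAL_STATS):
    """Bump a kill tally that persists across save games and offline sessions."""
    stats = {"diggles_killed": 0}
    if os.path.exists(path):
        with open(path) as f:
            stats = json.load(f)
    stats["diggles_killed"] += 1
    with open(path, "w") as f:
        json.dump(stats, f)
    # On reconnect, the game could report this stored total to Steam
    # instead of relying on a live per-kill event that's lost offline.
    return stats["diggles_killed"]
```

With something like this, the counter keeps accumulating while offline and can be synchronized later, rather than evaporating whenever the per-kill event can't reach Steam.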
  14. I can understand if you don't have much of a response for my problem with achievements, but it'd be nice if you could at least acknowledge the fact that I said something about it...
  15. Still seems fishy to me, especially since the 4500 draw calls occur when looking at just the initial jack-o'-lanterns, and once it really gets started with the neighborhood sweep that drops to ~3000. Couldn't batching reduce the number of draw calls drastically? I'm no expert here, but it really seems to me like a _lot_ of draw calls for the amount of geometry being shown. After reading a couple of articles, I gather that's a common issue with porting from a console: the per-call overhead is a *lot* higher on PC, so PC games are designed around instancing the geometry, batching the draw calls, etc. to get everything drawn without crazy-high numbers of draw calls, while on consoles, where there's essentially no API overhead, doing 10K separate draw calls is not really a big deal, and so everything ends up with its own draw call. I read that the situation is slightly improved with more recent versions of the DirectX API. Similar slowdowns happen with a lot of the other cutscenes too, though I've not looked at them in as much detail. I really don't think the game is ever CPU bound on my machine. Neither core ever went above 60% utilization during my test run, which went up through the "rebuild the robot costume" bit. Also, I've had the laptop plugged in throughout all this and don't think the power plan is affecting this. Again, I know the entire GMA series was far from great; many of the developers who do decide to support Intel end up using Intel HD Graphics as the baseline. If it can't be made to run well on GMA, I can run it on a desktop with better graphics hardware at the end of this month. I just saw what the system requirement said and thought I could go for it now. If this can't be fixed and the Intel graphics requirement needs to be bumped to HD Graphics, I wonder whether the other system requirements may be inaccurate as well.
In particular, looking at how the Intel stuff stacks up, I bet nothing from nV or ATI slower than the GeForce 8400M GT and the Mobility Radeon X1600/Radeon HD 2400 can handle it, and those cards may be iffy. I also wonder whether you really meant to ask for 512MB video memory as a minimum requirement; seeing as the Xbox 360's 512MB is shared across both the GPU and CPU, one would hope 256MB of video memory might suffice, and many of the cards which otherwise meet your stated minimum specs were never sold with more than 128MB.
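For anyone curious what "batching" means concretely, here's a toy sketch (Python for clarity; real engines do this inside the renderer, and none of these names come from Costume Quest):

```python
from collections import defaultdict

def batch_draws(objects):
    """objects: list of (mesh, material, transform) tuples.

    Instead of issuing one draw call per object, group objects that share
    a mesh and material and issue one instanced draw per group, paying
    the per-call API overhead once instead of once per object.
    """
    groups = defaultdict(list)
    for mesh, material, transform in objects:
        groups[(mesh, material)].append(transform)
    # One instanced call per (mesh, material) group, carrying all transforms.
    return [(mesh, material, transforms)
            for (mesh, material), transforms in groups.items()]
```

On a console, thousands of separate jack-o'-lantern draws are cheap; on PC, collapsing them into a handful of instanced calls is how ports avoid paying the driver overhead thousands of times per frame.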