I understand that close to 100 people have contributed to Frostbite 3 from all over EA, and it has been confirmed that Battlefield 4 will run at 60 fps on next-gen consoles for maximum fluidity in its controls and visuals. Destiny runs at half that. What gives? I'd like to understand the technical challenges behind the 1080p60 target and why Bungie hasn't ever made an effort to hit this mark.
-
Edited by o0MrCheesy0o: 6/20/2013 3:45:30 PM
Why won't Destiny run at 60 frames per second, or will it? I don't know. However, I'm an advocate for high frame rates in 3D video games. When I've discussed this issue with other gamers, the general assumption is that a game running at 60 frames per second does so at the cost of decreased graphical fidelity. That's true, but it's equally true of a game running at 30 frames per second: a simulation running at 1 frame an hour would have more time to calculate each frame, and so could produce greater graphical fidelity, than a simulation running at 30 or 60 frames per second. A balance has to be struck where the needs of aesthetics are met, and so too are the needs of usability.

30 frames per second being the industry standard, though, is, I think, appalling. It's not enjoyable, and in the first-person-shooter genre it heavily restricts the content. As does the gamepad, but that's a discussion for another topic! Just imagine, however, a [u]fast[/u]-paced first-person shooter like Quake (a series known for its players playing at high frame rates) being played with a gamepad at 30 frames per second. It's hard, and it certainly wouldn't be enjoyable, at least not to those of us who've experienced better. It's why fast-paced first-person shooters aren't popular with the majority of first-person-shooter players: the way in which they play doesn't support such games. It's a niche that can't exist on a current console.

I know there are many console gamers who've only ever played first-person shooters at 30 frames per second using gamepads, and when the idea of a game running at 60 frames per second comes up they instantly reject it, because they think a higher frame rate equates to less pleasing graphics. They also think they can't tell the difference, and some think the human eye can only see an arbitrary number of frames per second.
The eye doesn't see in frames per second, you can tell the difference between 30 and 60, and a game running at a higher frame rate doesn't have to look worse than its predecessor. Just remember, though: you're always trading away some graphical fidelity no matter what frame rate you choose to support. I'm sure a simulation running at 1 frame a year could produce mind-blowing visuals!

Games aren't movies. The largest difference I can think of is that one medium is passive and the other is interactive. To trick the mind into believing it's seeing motion, movies use blur, and they use blur because they run at a low frame rate, and they probably run at a low frame rate to save money on film (though that's just me guessing). If you can't clearly see something in a film due to blur, it doesn't matter; the film will continue regardless of whether you can see or understand what's happening on screen. Indeed, you could be asleep and the film would still progress! Video games are interactive, however, not passive like film. We interact with what we see and hear, so it's in the operator's best interest to get a clear picture. A high frame rate offers that.

Let a gamer play a 3D video game running at 160 frames per second on a 160 Hz display, then switch to 30 frames per second mid-session, and witness the reaction. I guarantee all would prefer the higher frame rate, besides those egotistical enough to lie. I'd like to see what it looks like to play a first-person shooter at 500 or 1000 frames per second, but I've only got a monitor that can output 160 unique frames per second :-(
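The fidelity trade-off described above comes down to the per-frame time budget: doubling the frame rate halves the time the engine has to simulate and render each frame before the next refresh. A minimal sketch of the arithmetic (plain Python, purely illustrative; real engines also contend with v-sync, buffering, and variable per-frame cost):

```python
def frame_budget_ms(fps: float) -> float:
    """Milliseconds available per frame at a given frame rate."""
    return 1000.0 / fps

# Compare the budgets mentioned in this thread: console-standard 30,
# the requested 60, and the poster's 160 Hz monitor.
for fps in (30, 60, 160):
    print(f"{fps:>3} fps -> {frame_budget_ms(fps):.2f} ms per frame")
```

So a 60 fps game has roughly 16.7 ms to do everything a 30 fps game does in 33.3 ms, which is why the extra smoothness is usually paid for somewhere else in the renderer.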
-
I don't know if they've confirmed it either way. But CoD is 60, Battlefield will be 60, and now Halo is 60, so I don't see why Destiny won't be, especially considering it isn't a cutting-edge game in the graphics department.
-
I don't like it, because 30 fps only means an average fps, meaning it could dip below that. Lag due to connection issues is one thing, but framerate lag is unacceptable. That's one of Halo 4's glaring problems.
-
The biggest problem I see with consoles today is that they keep trying to get better GRAPHICS and make things look more REAL. FUNCTIONALITY > graphics. I don't give a flying -blam!- if Destiny is at 30, and neither does Bungie. Graphics are what most kids care about today... so stupid. I have games from the original Xbox that I'll still play today over 360 games, because they're just great games, even with their mediocre graphics. There's so much more to games than that.
-
Edited by Atomic Bacon: 7/3/2013 12:54:06 PM
If it's not, it probably has something to do with the world being so massive.
-
I'm more interested in whether the game is gonna run at full 1080p. Does anyone know this? I would LOVE Destiny to run full HD on my big TV. Man, I would never leave the house :)
-
Honestly, I just don't care about the 30 vs 60 fps comparison. It doesn't do much to change my experience in a game.
-
60 fps bugs me. I didn't like The Hobbit when it ran at that rate, though for movies it's a little worse. Also, they're probably not doing 60 fps because they want to run the game on 360 and PS3.
-
Because Destiny is in essence an MMO and will have thousands more people playing in its game at one time in comparison to BF4.
-
What frame rate does Skyrim run at?
-
Okay, where does it say that Destiny runs at 30 fps? I don't think Bungie has told anyone yet.
-
Where does it say that Destiny will run at xxx frames per second? In all honesty, I couldn't care less whether the frame rate is 30 or 60 on console. When it's on a TV and you're sitting a few metres away, to me there is no difference. Sure, if you're on a PC, 30 cm from the monitor, you might notice it, but then the PC should be able to go up to 60 fps if you care enough to pay for good enough specs.
-
Edited by SquattingTurtle: 6/20/2013 6:52:51 AM
Probably because they can't do it on the 360, so it had to be dumbed down a bit. Is this true in the first place?
-
Because I don't give a flying fudge about fps. 30 is fine where it's at. 60 is cool, but I don't care.
-
The bar for next gen video games has not been set yet. Right now each developer is doing what they hope will work best. Give it 2 years and the games will all be bigger and better.
-
Why does it matter?
-
For me, this is what matters in a game, in order: 1. Gameplay 2. Graphics 3. Story 4. FPS
-
KotOR and Mount and Blade: just two amazing games that have shit graphics but AMAZING story and gameplay.
-
Simply put, it's a trade-off. To sustain massive, complex worlds with seamless transitional gameplay between campaign and multiplayer while also sustaining AI capability, you need to sacrifice some things. BF4 will have beautifully rendered maps that, while huge, are still limited, and all that processing power only needs to focus on that particular map and its elements. Since Destiny is a more persistent world, with multiple people playing in different areas at a time, it cannot sustain that framerate (the same reason many MMOs aren't hyper-res, despite running on powerful PCs). P.S. I know it's a shared-world shooter.
-
Why should Destiny?
-
Not that big of a deal. I couldn't really care less about the 30 vs 60 fps debate. As long as the game plays nice and it's fun, I don't give a crap about fps.
-
Because they don't care about impressing the CoD kids.
-
Because Destiny's engine needs to work fluidly across all four consoles.
-
I don't see why it matters.
-
It does have massive worlds and gorgeous lighting, so I'm not that surprised, really.
-
Maybe because IT IS ALSO BEING MADE FOR OUTDATED CURRENT-GEN CONSOLES.