Out of curiosity, is there anyone that [i]prefers[/i] 30fps, and why?
-
Me.
1. Animation at 30 fps has more natural-looking motion blur.
2. In games where IMMERSION is important (for example RPGs), the unnatural clarity of 60 fps can be immersion-breaking.
3. Trying to run a game at 60 fps on a console burns up limited computing resources that could be spent on things that bring more value to most game types.
4. Sixty fps only improves game performance in games with competitive, online multiplayer (shooters and fighting games), and games with fast-moving objects that need to be seen with a great deal of clarity (racing games).
5. Artificially trying to hit 60 fps on machines that can't really handle it is often worse than 30 fps. Frame rates get unstable, which leads to stuttering. The compromises necessary to reduce CPU load can lead to annoying texture pop-in and other artifacts that make animation seem cartoonish and difficult to look at (Halo 5).
-
[quote]Me.
1. Animation at 30 fps has more natural appearing motion blur. (Motion blur is terrible)
2. In games where IMMERSION is important (for example RPGs), the unnatural clarity of 60 fps can be immersion-breaking. (How?)
3. Trying to run a game at 60 fps on a console burns up limited computing resources that can be used for things that bring more value to most game types. (If the engine is optimised well enough any game could hit 60.)
4. Sixty fps only improves game performance in games where there is competitive, on-line multiplayer (shooters and fighting games), and games where there are fast moving objects that need to be seen with a great deal of clarity (racing games). (That's not true)
5. Artificially trying to hit 60 fps on machines that can't really handle them is often worse than 30 fps. Frame rates get unstable which leads to stuttering. The compromises necessary to reduce CPU can lead to annoying texture popping and other artifacts that make animation seem cartoonish, and make it difficult to look at (Halo 5.) (Never seen that happen)[/quote]
-
Motion blur is a natural part of how your eyes work, and people who don't spend all day playing video games perceive its [i]absence[/i] as unnatural. That's why high frame rate video is often described by non-gamers as "cartoonish," and why games like RPGs that run at high frame rates on PCs have to add it back artificially to avoid breaking the sense of immersion.
-
Motion blur is added artificially at 30fps too... Do you think this is because films have motion blur and run at 30 (24, actually) fps? A slideshow of actual photographs and a slideshow of an artificial, computer-rendered environment are completely different things.
-
Edited by TheArtist: 6/24/2017 7:39:30 PM
Different only in how they are generated. Especially since almost all photography these days is digital.
-
Edited by Himself: 6/24/2017 7:42:28 PM
Exactly. Cameras have "natural" motion blur (at 30 or 60FPS), while a simulation running at 30 does not. If you turn off the motion blur setting in a game running at 30fps, there will be no motion blur.
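To make this concrete: the blur a game's motion blur setting produces is synthesized, often by blending several sub-frame samples across the frame interval. A toy Python sketch (invented numbers and a made-up 1D "scene", not any particular engine's implementation):

```python
# Toy sketch of accumulation-style motion blur: instead of sampling the
# scene once per frame (a crisp image), average several sub-frame samples
# across the frame interval (a smeared image). Purely illustrative.

def render(position, width=10):
    """Render a 1-pixel-wide dot on a row of `width` pixels."""
    row = [0.0] * width
    row[int(position) % width] = 1.0
    return row

def frame_no_blur(start_pos, velocity, dt):
    # One sample at the start of the frame: no blur at ANY frame rate.
    return render(start_pos)

def frame_with_blur(start_pos, velocity, dt, samples=4):
    # Average `samples` renders spread across the frame interval, so the
    # dot's energy is smeared along its path, like camera shutter blur.
    acc = [0.0] * 10
    for i in range(samples):
        t = dt * i / samples
        for x, v in enumerate(render(start_pos + velocity * t)):
            acc[x] += v / samples
    return acc

dt = 1 / 30                       # one 30 fps frame interval
crisp = frame_no_blur(0, 120, dt)    # dot moving 120 px/s, sampled once
blurred = frame_with_blur(0, 120, dt)
print(crisp)    # single bright pixel: no blur, even at 30 fps
print(blurred)  # the dot's energy spread over the 4 px it crossed
```

Without the accumulation step, a 30fps render is just 30 crisp stills per second, which is the point being made above.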
-
That's true of FILM photography, not digital. Motion blur is an intrinsic part of how your eye works. High frame rates just overcome it by brute force.
-
Your eyes don't have "natural" motion blur on a game because nothing is actually moving; it's just a bunch of pixels changing color. Because you're a console gamer, I guess the only games you've experienced with motion blur were at 30fps, so you don't know any better. ¯\_(ツ)_/¯
-
Edited by TheArtist: 6/25/2017 7:06:08 PM
You're trying to argue me down about what I do for a living. The person who doesn't know any better is you. I have been a PC gamer for over 20 years. I simply don't like playing action games on mouse and keyboard. Never have.
-
I won't argue about what you do, but I will argue about your justification if it's based on misinformation.
-
Edited by TheArtist: 6/25/2017 7:07:43 PM
Read a neurophysiology textbook, and then we'll finish this conversation. Until then you're making up shit as you go along.
-
Anyone can claim they have expertise in a given area on an Internet forum. I don't see how neurophysiology would give you much insight into how videogames work, though. I am interested to see what your counterpoints are, however, and what exactly I'm "making up."
-
You don't experience the world. You experience the construct your brain creates.
-
Well, I can see our conversation is being quite productive.
-
1. Not in my experience. Crysis 2/3 at 60FPS has the best motion blur I've seen in a game.
2. If anything, bad performance breaks immersion worse than 60FPS does imo. I'd rather have everything look smooth than choppy. The only exception is games that were meant to run at 30 but are running at 60, as that often causes weird choppy animations (Halo CE and Morrowind come to mind).
3. Like what? I don't see how having a game perform well can make gameplay worse or take gameplay out. The only case I can think of is split screen, but the Xbox One has a total of 37 split screen games out of 1235 games total. Considering that a lot of Xbox One games run at 30, I don't think 60FPS is the main reason split screen is no longer prevalent.
4. 60FPS [i]is[/i] improved game performance. I'm not sure what you mean here. Are you saying it's only a benefit under these conditions? Because it certainly provides a smoother experience.
5. You're right for the wrong reasons here. Stuttering is bad, but it has nothing to do with animation or "reducing CPU" (what does that even mean?).
-
Edited by TheArtist: 6/21/2017 8:47:58 PM
Any animation at 60 fps has to have motion blur artificially added back to the images...which can be done. Thirty fps has it naturally. Thirty fps animation only looks "choppy" if it's done poorly...or your brain is used to looking at animation at higher frame rates. Sixty fps looks unnaturally clear...even cartoonish...if you aren't used to very high frame rates.

Sixty fps doesn't improve the performance of all games. Running a chess simulation or a turn-based strategy game at 60 fps isn't going to improve the game. High frame rates improve the performance of fast-paced competitive games where input and refresh rates are tied to frame rate (thus reducing input lag), or games where you need to see very fast-moving objects clearly (the absence of motion blur). That is why 60 fps is so popular with racing games...and with competitive FPSs.

Stuttering is the result of a significant drop in frame rate causing the animation to visibly change its rate of motion. It's usually the result of overloading a CPU by trying to run at a higher frame rate than the processor can handle: either because of poor optimization (if it's occurring on PC) or (on console) because the machine's CPU has simply been overwhelmed by the workload placed on it...and the throughput gets bottlenecked.

Bottom line: you're not going to get gamers to agree on 60 fps. Especially if you ask players across a wide variety of age ranges...and a wide variety of game types. If you ask lots of PC gamers (who have to deal with the jankiness of keyboard movement), shooter gamers, racing gamers, and fighting gamers, you will get a consensus of people who say frame rate trumps resolution.
But if you ask simulation gamers, strategy gamers, RPG gamers, MMO gamers, console gamers (with the smooth curving movement that joysticks produce), as well as older gamers...you're going to find that a lot of these people simply do NOT care about frame rates (other than stability). You'll find that most of them have NO interest in sacrificing resolution or other aspects of gameplay quality just to hit a 60 fps benchmark.

In my own case, if I'm playing a racing game, I'm going to want 60 fps. Any other game? Don't care...and if I have any preference it's toward 30 fps. Because as a console gamer, I'm unwilling to sacrifice other aspects of play. I don't have to deal with the jankiness of mouse-and-keyboard...and 30 fps animation looks more natural to eyes that have nearly 50 years of experience looking at animation at that frame rate. ...and no amount of argument is going to overrule what my own eyes and vision tell me on a daily basis.
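The stutter mechanism being described is easy to see with a toy model of double-buffered vsync on a 60 Hz display: a frame that misses the 16.7 ms refresh deadline is held until the next refresh tick. The render times below are invented for illustration, not measured from any real game:

```python
import math

# Toy model of double-buffered vsync on a 60 Hz display: each frame starts
# rendering when the previous one is shown, and is held until the next
# refresh tick before it can appear on screen.

REFRESH = 1 / 60  # ~16.7 ms between display refreshes

def frame_intervals(render_times):
    """Milliseconds between successively presented frames."""
    start, presents = 0.0, []
    for rt in render_times:
        finish = start + rt
        # Wait for the next vsync boundary after rendering finishes.
        present = math.ceil(finish / REFRESH) * REFRESH
        presents.append(present)
        start = present  # the next frame begins at the buffer flip
    return [round((b - a) * 1000, 1)
            for a, b in zip([0.0] + presents, presents)]

print(frame_intervals([0.015] * 4))        # under budget: even 16.7 ms pacing
print(frame_intervals([0.020] * 4))        # always over: locked 33.3 ms (30 fps)
print(frame_intervals([0.015, 0.020] * 2)) # borderline: alternating intervals = stutter
```

The borderline case is the "unstable frame rate" scenario: the average is close to 60 fps, but delivery alternates between 16.7 ms and 33.3 ms, which reads to the eye as stutter.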
-
Lol what? 30fps doesn't magically mean it has "natural motion blur." I'm convinced you have no idea what you're talking about. 30fps is 30fps. What do you mean, "if it's done poorly"? You seem to be throwing around vague concepts without backing them up. Of course the games will play differently, but that doesn't mean Halo 3's 30fps is different from Skyrim's 30fps. Is there some sort of non-competitive FPS where 30fps is preferable? Because I've never heard of one.

You realize console games can be poorly optimized too, right? And stuttering isn't caused by a processor trying to run at a higher framerate than it's "capable of." Eventually your processor will bottleneck your GPU, yes, because they [i]perform different tasks,[/i] but it has nothing to do with "too many frames." An underpowered CPU is often why RTS games like Ashes of the Singularity can end up so laggy even with a good GPU. Input lag is caused by the response time of your display; my monitor, for example, is 5 milliseconds. The only way refresh rate is tied to fps is that you can't see more than 60FPS if your refresh rate is 60.

I [i]am[/i] a PC gamer. Hi, nice to meet you. And the keyboard and mouse is not "janky." Considering you said you're a console gamer, I doubt you've played with a KB&M for any extended amount of time. By the way, MMOs and strategy games are primarily on PC (and I don't know of a single person who prefers to play them on a controller), so I don't know why you grouped them the way you did. And, if you didn't know, you can play with controllers on PC-- you'd just be putting yourself at a disadvantage.

You've yet to give me an example where 60FPS hurts gameplay quality. Developers don't take features and mechanics out for 60FPS.
And given the option of superior graphics over framerate, I'm sure plenty of people would choose graphics-- but if they were given the option to play at 60FPS with those same visuals, I doubt many would say they wanted to play at 30fps instead.
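On the input-lag point, a back-of-the-envelope model helps: panel response time is one term, but frame time contributes too, since an input can wait up to a frame to be sampled and then takes a frame to render and present. This is grossly simplified (real pipelines add OS, driver, and buffering stages), and the 5 ms panel figure is just the number mentioned above:

```python
# Back-of-the-envelope input-to-photon latency, worst case.
# Illustrative only: real pipelines have more stages than this.

def worst_case_latency_ms(fps, display_response_ms=5):
    frame_ms = 1000 / fps
    # Up to one frame waiting to be sampled, one frame to render
    # and present, plus the panel's pixel response time.
    return 2 * frame_ms + display_response_ms

print(round(worst_case_latency_ms(30), 1))  # roughly 71.7 ms at 30 fps
print(round(worst_case_latency_ms(60), 1))  # roughly 38.3 ms at 60 fps
```

Under this model both posters have a piece of the picture: display response matters, but halving frame time also cuts tens of milliseconds off the worst case.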
-
In single player games, yes. Story-based games tend to have a soap opera effect at 60fps. I hate that feel and look.
-
I know what you mean, but imo 60FPS still feels better, especially for smooth combat.
-
Edited by crazeeavery: 6/21/2017 5:06:51 PM
Yeah, if you're playing an FPS, 60fps is definitely better and smoother for PvP.
-
Yeah. What single player games do you think look better in 30?
-
The Last of Us Remastered is definitely my number one. The soap opera effect is very strong at 60fps.
-
Never played that one. ¯\_(ツ)_/¯ But RPGs like Fallout, Skyrim, Morrowind, etc. look better at 60FPS to me.
-
Never seen them run stably past 30 fps myself, sadly.
-
You could always look up some PC gameplay on YouTube, but from personal experience it feels a lot different from actually playing it.