I’m just thinking outside the box here, anybody know what frames per second our human eyes produce? 😂 😂😂😂😂
I’m just a weird thinking person. But seriously, does anybody know?
[spoiler]Moderator edit: This thread has been moved to #Offtopic, a more appropriate forum for this offtopic discussion.
Feel free to private message the moderator who moved your post, link to topic, for further clarification about why this topic was moved.[/spoiler]
-
Scientists put the human eye at roughly 30-60 fps in a processing sense. When it comes to gaming, that doesn’t necessarily mean a 60 Hz screen is all you need: in reality the eye blurs everything in the peripheral area while tracking the focused object, and games don’t work the same way, so devs try to simulate the effect by adding motion blur or depth of field. Up to 120 Hz is likely the limit at which the human eye can truly tell a difference in gaming, keeping the experience as smooth as possible to simulate reality. The 120 vs 240 Hz argument is moot.
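To see why the 120 vs 240 Hz debate has diminishing returns, here's a minimal sketch (my own arithmetic, not from the post) of per-frame time at common refresh rates:

```python
# Per-frame display time at common refresh rates: each doubling of the
# refresh rate shaves off a smaller and smaller slice of frame time.

def frame_time_ms(hz: float) -> float:
    """Time each frame stays on screen, in milliseconds."""
    return 1000.0 / hz

for hz in (24, 30, 60, 120, 240):
    print(f"{hz:>3} Hz -> {frame_time_ms(hz):5.2f} ms per frame")

# Going 60 -> 120 Hz saves ~8.33 ms per frame;
# going 120 -> 240 Hz saves only ~4.17 ms more.
print(f"60->120 gain:  {frame_time_ms(60) - frame_time_ms(120):.2f} ms")
print(f"120->240 gain: {frame_time_ms(120) - frame_time_ms(240):.2f} ms")
```

The absolute gain halves with every doubling, which is the arithmetic behind "moot".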
-
Posted by: TheArtist 10/24/2022 2:42:05 PM
The eyes don’t see. The brain sees. The eyes just transform light into information that the brain can interpret. The brain will see any animation >24 fps as continuous motion. Higher frame rates will improve the fine points of some experiences, like very fast-moving action. So the mythology that has come to surround high frame rates in PC gaming just doesn’t stand up to the science. What is likely going on is that hardcore PC gamers had their brains habituate to high frame rates, as more powerful machines and GPUs got better at producing images quickly and reducing artifacts in the images and animation. So seeing slower frame rates on weaker machines with poorer image quality became a problem. In my own case, I was used to cinematic frame rates (24 fps) for decades before becoming a PC gamer, so 30 fps games were never an issue for me. In fact, the shortcuts that early last-gen consoles took to try to artificially get to 60 fps would often give me headaches. But once the machines were powerful enough to create a smooth and stable 60 fps, I made the transition easily… and I can still go back and play 30 fps games without a problem. Though my eye will pick up an unstable frame rate or an artifact (like screen tearing) faster than before.
-
3 Replies
Our eyes don't "produce" anything. They take in a constant stream of light, and it's mostly about how well our brains can process that information.
-
4 Replies
Idk if this is related, but I think it might have something to do with the standard frame rate of animation being 24 fps. I’m probably wrong tho.
-
Posted by: Cultmeister 10/24/2022 10:58:36 AM
We don’t see in frames per second. A standard TV flashes still images at a certain speed to produce the [url=https://youtu.be/yNZTOtEGh8c]illusion of motion[/url]. A still image is referred to as a ‘frame’, so if the TV flashes 60 images a second, that’s 60 FPS. But that’s not what happens with the human eye. A constant stream of light enters the eye to produce an image, and this stream changes as movement occurs; there are no ‘still’ images and therefore no FPS. If you run past a wrought iron fence quickly enough, you may not be able to notice the bars and instead see a continuous scene behind it. But the bars are still there. That’s like TV. Seeing with your eyes is like removing the fence altogether.
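The fence analogy above is really about discrete sampling versus a continuous stream. A hypothetical sketch (my own illustration, not from the post) of what a display loses between frames:

```python
# A display only "shows" a moving object at discrete frame times; any
# motion between samples is simply never drawn. This is the gap that
# techniques like motion blur try to paper over.

def sampled_positions(speed_units_per_s: float, fps: int, duration_s: float):
    """Positions of a moving object as rendered at discrete frame times."""
    n_frames = int(fps * duration_s)
    return [speed_units_per_s * i / fps for i in range(n_frames + 1)]

# At 60 fps, an object moving 600 units/s jumps 10 units between frames.
positions = sampled_positions(600, 60, 0.05)
gaps = [b - a for a, b in zip(positions, positions[1:])]
print(positions)   # [0.0, 10.0, 20.0, 30.0]
print(max(gaps))   # 10.0 units skipped between every pair of frames
```

Continuous vision has no such fixed sampling step, which is the poster's point about "removing the fence".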
-
I'd say our frames per second would be the number of times light registers in our eyes each second. I had actually thought about this a week ago and was going to try to calculate it, but I decided to sleep instead.
-
1 Reply
-
6 Replies
-
1 Reply
Posted by: BetweenMyself 10/22/2022 7:14:02 PM
The 30-60 FPS answer given as a standard seems to be a generally accepted “factoid” (something that sounds true but isn’t). There is a bit of disagreement as to how this misunderstanding arose, but one prominent theory is that it is an attempt to explain away the inherent deficiencies, due to practical concerns, in the refresh rates of everything from the standard film reel up through today’s modern monitors. While frame rates as low as 20 fps can cause the brain to recognize motion in a series of sequential images, it is generally accepted that most people stop being able to perceive [i]flicker[/i] (the apparent jump from one image to the next) at somewhere between 60 and 75 fps. However, the human eye is able to detect, the nervous system is able to transmit, and the brain can interpret the subtle differences in perceived motion at much higher frame rates. The following link will take you to a brief discussion of the issue: https://www.quora.com/What-is-the-highest-frame-rate-fps-that-can-be-recognized-by-human-perception-At-what-rate-do-we-essentially-stop-noticing-the-difference Here are a few selected quotes from the discussion:
[quote]Myelinated nerves can fire between 300 and 1000 times per second in the human body and transmit information at 200 miles per hour. What matters here is how frequently these nerves can fire (or "send messages"). The nerves in your eye are not exempt from this limit. Your eyes can physiologically transmit data that quickly, and your eyes and brain working together can interpret up to 1000 frames per second. However, we know from experimenting (as well as simple anecdotal experience) that there is a diminishing return in what frames per second people are able to identify. Although the human eye and brain can interpret up to 1000 frames per second, someone sitting in a chair and actively guessing at how high a framerate is can, on average, interpret up to about 150 frames per second.[/quote]
[quote]The USAF, in testing their pilots for visual response time, used a simple test to see if the pilots could distinguish small changes in light. In their experiment a picture of an aircraft was flashed on a screen in a dark room for 1/220th of a second. Pilots were consistently able to "see" the afterimage as well as identify the aircraft. This simple and specific situation not only proves the ability to perceive 1 image within 1/220th of a second, but the ability to interpret higher FPS.[/quote]
[quote]70 fps starts to get to the point where you really could not see any frame change or flicker at all, even out of the corner of your eye (the periphery is more flicker-sensitive than the center of the eye). The retina, however, is analog -- the brain does not process vision as "frames". So it is possible that even higher frame rates could change visual perception in certain circumstances.[/quote]
[quote]It turns out that the actual subjective threshold (where a human subject decides an image is no longer flickering) depends on many factors, including contrast, brightness, spatial factors, and to a certain (but important) extent, image content.[/quote]
[quote]You can see relative timing of visual events down to the millisecond level. But you also have some persistence of vision, so short visual stimuli merge together.[/quote]
(΄◉◞౪◟◉`)
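A quick back-of-envelope check on the numbers quoted above (the figures come from the post; the script is just my arithmetic):

```python
# The USAF flash test showed an image for 1/220th of a second, which is
# the frame duration of a hypothetical 220 fps display. Nerve firing
# rates of 300-1000 Hz bracket the quoted ~1000 fps interpretation ceiling.

flash_duration_s = 1 / 220                 # exposure used in the pilot test
equivalent_fps = 1 / flash_duration_s
print(f"flash lasts {flash_duration_s * 1000:.2f} ms "
      f"(frame time of a ~{equivalent_fps:.0f} fps display)")

for rate_hz in (300, 1000):
    print(f"{rate_hz} Hz nerve firing -> one signal every {1000 / rate_hz:.2f} ms")
```

So perceiving a 1/220 s flash is consistent with the claim that useful perception extends well past the 60-75 fps flicker-fusion range.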
-
14 Replies