Perceived sensitivity



9 hours ago, Drimzi said:

I didn't say monitor distance depends or scales based on zoom.

You literally said this.

On 1/26/2018 at 8:12 PM, Drimzi said:

Monitor distance (1:1 Match): Camera zooms in and out; physical mouse movement depends only on the zoom (IMO)

?

9 hours ago, Drimzi said:

With zoom ratio as the method for scaling the sensitivity, if I zoom in, not only do I need to move my hand further to reach a target due to the zoom, but the sensitivity has also been reduced a lot due to the formula.

No? This reads as if there were some extra multiplier reducing sensitivity even further, on top of the scaling. (There isn't.)

9 hours ago, Drimzi said:

With 1:1 / vertical match this isn't an issue, I just move my hand more or less depending on the zoom.

*citation needed*

9 hours ago, Drimzi said:

And I say 1:1 because if you say just vertical or horizontal, then horizontal can mean anything as there are so many different aspect ratios. So I just make it a habit to say 1:1.

Where did I say to call it horizontal? It's vertical. In every case of "1:1" it's vertical. Calling it 1:1 implies it perfectly matches sensitivity 1 to 1.

9 hours ago, CaptaPraelium said:

Man I would LOVE to see the monitor size and distance from the monitor for those guys.

https://prosettings.net/overwatch-pro-settings-gear-list/ has their monitors (and other meaningless shit like gaming chairs lmao). Distance to screen probably doesn't vary that much, especially in the tournament where they all have the same desk.

9 hours ago, KandiVan said:

Is zoom ratio the same as 0% monitor match on the calculator? I've been using it for about two weeks. I don't know, I still feel like I'm whiffing shots I shouldn't be, and it feels too sensitive while ADS'd.

Yes it is. Also that's interesting since usually people who think it's off think it's not sensitive enough.

8 hours ago, Skwuruhl said:

Yes it is. Also that's interesting since usually people who think it's off think it's not sensitive enough.

See, I feel like my issue is that I'm taking incredibly long-range fights at a relatively high FOV while ADS'd (61.84 vFOV). On top of that, movement is very sporadic in H1Z1 and I'm playing in third person, so I think there are FOV discrepancies relative to the config FOV. Further, hitboxes are relatively small and headshots are HEAVILY favored. Pinpoint precision is an absolute requirement. As a result, most people just utilize incredibly low ADS sensitivity (50-60 inches on average) so that their margin of error is so massive you don't miss those crucial two taps. Likewise, they just utilize player movement for crosshair placement more than actual mouse movement. I may try to lower my config FOV, which will bring down my ADS FOV as well, and see if it gives me a little more leeway.

Edited by KandiVan
2 hours ago, KandiVan said:

See, I feel like my issue is that I'm taking incredibly long-range fights at a relatively high FOV while ADS'd (61.84 vFOV). On top of that, movement is very sporadic in H1Z1 and I'm playing in third person, so I think there are FOV discrepancies relative to the config FOV. Further, hitboxes are relatively small and headshots are HEAVILY favored. Pinpoint precision is an absolute requirement. As a result, most people just utilize incredibly low ADS sensitivity (50-60 inches on average) so that their margin of error is so massive you don't miss those crucial two taps. Likewise, they just utilize player movement for crosshair placement more than actual mouse movement. I may try to lower my config FOV, which will bring down my ADS FOV as well, and see if it gives me a little more leeway.

When you ADS it still goes to first person like ARMA, yeah? If so, I don't know how well any matching method works when swapping between first and third person like that, since the camera physically moves in addition to the FOV change. I haven't personally tested any method in third-person games (with AHK scripts etc.).

Edited by Skwuruhl
Just now, Skwuruhl said:

When you ADS it still goes to first person like ARMA, yeah? If so, I don't know how well any matching method works when swapping between first and third person like that, since the camera physically moves in addition to the FOV change. I haven't personally tested any method in third-person games.

No, going ADS just zooms you in while keeping the third-person over-the-shoulder view. The camera sits back 1.67 m in hipfire and 1.1 m in ADS. Is there a way we could scale the first-person config FOV, kind of like how a microscope works? I.e., going third person increases the FOV by a scalar dependent on the distance back from first person?

5 hours ago, KandiVan said:

No, going ADS just zooms you in while keeping the third-person over-the-shoulder view. The camera sits back 1.67 m in hipfire and 1.1 m in ADS. Is there a way we could scale the first-person config FOV, kind of like how a microscope works? I.e., going third person increases the FOV by a scalar dependent on the distance back from first person?

The camera moving makes it really tricky; I don't know how you'd account for it.


I have switched to 0% monitor match in all my games, given @Skwuruhl's images above. I switched about a month ago and I am still not as comfortable with it as I was at 75% or Viewspeed V2 (pretty negligible difference between those two). From my perspective, strictly non-mathematical and just going by how quickly my brain and muscle memory adjust to something, it takes a long time (many hours) to adjust to a new sensitivity.

 

This perception also doesn't really seem to rear its head in games where you don't need to be perfectly precise (single-player games). In games I am intimately familiar with and have over 1k hours in, I do notice the difference between 0% and 75%, and I am noticeably less precise with 0%. Is this because 75% is "better"? I personally don't think so. Do I think 0% is a superior enough method for someone used to 75% to switch over? I would say no, and you only need the evidence from the pros above to see this. For someone who is not a serious competitive player in CS or otherwise, I think the change is worth it, as I think 0% gives you a little more margin for error around the crosshair compared to 75%. But I wouldn't recommend the change to a CSGO veteran, personally.


I'm sorry I've been sparse with the updates to this thread. This is largely because it's very time-consuming to make illustrations which are really needed to explain the progress I'm making. I'm old.... I've been doing it with pen and paper and such ;) I might just get a camera and put pics of some of that here, rather than just nothing.

 

3 hours ago, Rashy said:

So overall, what's better theoretically: 0% MM or 56.25%? Thx

Theoretically, 0% is perfect. For the image, on the screen. But what we see is not what is on the screen; what we see is on the inside of our eye. This is why 0% feels 'slow'. It does not account for the last step: the game world being projected from the monitor to our eye. We do not have any formula which does so, and accordingly, 0% is the most correct theory we have as of right now.



As per the science nerdery posted above, we know that we do not measure the distance between two points, in the real world or the game world, directly, as we would with, say, a ruler or pixels on screen. We measure it by deriving the distance from the angle between the two points.

This is a terrible thing to attempt to explain without pictures, but I'll try, because it offers us two interesting insights. Firstly, it offers some validity to 'monitor matching'; secondly, it offers some hint as to why we seem to prefer to monitor match at the kinds of percentages we do. If none of this makes any sense, I'll do some cruddy mspaint to explain it ;)

Firstly, let's picture our monitor from above or from the side (it doesn't really matter which, but I do it from the side because games use VFOV), so we have a straight line. Now we need to measure our monitor and our seating position (assuming that your eyes are directly in line with the centre of the screen, which for the purposes of FPS games they should be). We can use the following formula to find our actual FOV of the monitor. I sit 32.5 cm from a 1440p 27" monitor (I can hear my mother telling me that's unhealthy), so mine looks like this:

widthpx = 2560

heightpx = 1440

diagcm = 27*2.54

viewdistance = 32.5  <-- centimetres, because science. Also I'm Aussie ;) You can use inches, just don't *2.54 in the line above.

heightcm = (diagcm/sqrt(widthpx^2+heightpx^2))*heightpx

actualfov = 2(arctan((heightcm/2)/viewdistance))

= 54.70173510519102597649
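
(If you'd rather not punch that into a calculator by hand, here's the same calculation as a small Python sketch. The monitor numbers are just my example values above; swap in your own.)

from math import atan, degrees, hypot

width_px, height_px = 2560, 1440     # native resolution
diag_cm = 27 * 2.54                  # 27" diagonal converted to cm
view_distance = 32.5                 # eye-to-screen distance in cm

# physical height of the visible image
height_cm = diag_cm / hypot(width_px, height_px) * height_px

# the vertical angle the screen subtends at the eye
actual_fov = degrees(2 * atan((height_cm / 2) / view_distance))
print(round(actual_fov, 2))          # ~54.70 for these numbers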

Unsurprisingly, Valve know their stuff (see links above), and I have adjusted my workspace to bring my FOV close to the natural 55-60 degree FOV where our eyes and brain treat the image as important (beyond this is our peripheral vision, where we do not see much detail, mostly movement; again, see links above).

So, now we can envision that there is a triangle formed between our eyes (well, our eye; we don't need to worry about stereo for this, so we just use the dominant one) and the edges of the screen, and the angle at the eye is as calculated above. Cool. But let's imagine that angle is increased to, say, 80 degrees (my hipfire FOV). In order for the triangle to meet the edges of the screen, our eye would have to be much closer.... and if it is (i.e., we move our head closer to the monitor), we see NO distortion. The distortion of the image is NOT caused by the projection. It is caused by the fact that our head doesn't move to match the focal point of the projection.

Here we start to uncover the real reason WHY we feel the need to change mouse sensitivity when zooming at all. It's about the amount of angle our eyes need to move to cover the same amount of angle in the game world. This is distinct from the distance our eyes move to cover the distance between two points. Our brain doesn't work that way. It thinks of all distances as angles, which makes sense really, since it's all a matter of feedback from our eyes telling our brain how much they rotated.

Now, if we take a few FOVs (in my testing I've been using actual, hipfire, 4x and 8x zoom) and measure out the distances to the focal points, we will have one very close to the monitor (hipfire), one where we sit (actual), one some distance behind where we sit (4x), and one very far behind us (8x). Guess what the ratios between those distances are? Zoom ratios. Great :D And we already know that Zoom Ratio/0% gives us perfect movement in the centre of the screen.

So, why does it fail? Let's say that we see a target which is half-way to the edge of our monitor. Let us not make the mistakes of the past and think of this as pixels or cm or inches; it is an angle. Our brains all agree on this ;) In my case (using the same formula above and dividing the screen by half again), that's angle = 2(arctan((heightcm/2/2)/viewdistance)) ≈ 29.00 degrees from the centre of the screen.

So, now let's put this into effect using our hipfire, 4x and 8x zoom. Our eyes move 29 degrees; how far do we need to rotate in game to aim at this target? (Yes, it can be simplified mathematically, but for the purpose of conversation...) We can calculate the focal distance from our screen, for a given FOV, using the following formula:
opticallycorrectdistance=(heightcm/2)/(tan(fov/2))
So, I'll do that for my 3 example FOVs:

 

hipdistance=(heightcm/2)/(tan(80/2))

= 20.03463865597708287603

 

fourdistance=(heightcm/2)/(tan(14.8/2))

= 129.4379759752501060469

 

eightdistance=(heightcm/2)/(tan(7.45/2))

= 258.21347922131382533488



And now we can just use the same formula above, with these distances, to calculate how far that ~29 degrees of eye movement amounts to, in the game world:

 

actualfov = 2(arctan((heightcm/2/2)/hipdistance))

= 45.52095254923326167504

 

actualfov = 2(arctan((heightcm/2/2)/fourdistance))

= 7.43098865714869079575

 

actualfov = 2(arctan((heightcm/2/2)/eightdistance))

= 3.72894033006548981691
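
For anyone following along at home, here's the whole worked example as one Python sketch (the 80 / 14.8 / 7.45 figures are my vertical hipfire, 4x and 8x FOVs; swap in your own):

from math import atan, tan, radians, degrees, hypot

width_px, height_px, diag_cm, view_dist = 2560, 1440, 27 * 2.54, 32.5
height_cm = diag_cm / hypot(width_px, height_px) * height_px
half = height_cm / 2

def focal_distance(vfov):
    # distance from the screen at which this FOV's projection is optically correct
    return half / tan(radians(vfov) / 2)

def subtended_angle(fraction, distance):
    # angle covered by a point 'fraction' of the way to the screen edge,
    # mirrored about the centre (same form as the formulas above)
    return degrees(2 * atan(half * fraction / distance))

print(round(subtended_angle(0.5, view_dist), 2))   # ~29.00 deg of eye movement

for vfov in (80, 14.8, 7.45):                      # hipfire, 4x, 8x
    d = focal_distance(vfov)
    print(vfov, round(d, 2), round(subtended_angle(0.5, d), 2))
# 80   ->  ~20.03 cm, ~45.52 deg
# 14.8 -> ~129.44 cm,  ~7.43 deg
# 7.45 -> ~258.21 cm,  ~3.73 deg

Notice that the focal distances also come out in the same ratios as the zoom ratio, exactly as claimed above.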



OK, that's all well and good, but why is it important? This quick example, when we compare the results to those of 0% MM/zoom ratio, demonstrates that as our FOV decreases, the effect of the distortion on angular displacement decreases. So what? Well, this tells us that the most important adjustment to our mouse sensitivity is the one made between the widest FOV - which is going to be hipfire - and our actual FOV of the screen from our eyes. As the FOV becomes smaller (higher zoom in game), the distortion becomes lower and lower, and less and less meaningful.

So, since we can NEVER make a perfect adjustment of sensitivity for all parts of the screen, because the distortion is not constant across the screen, but we CAN make an adjustment which is perfect for one part of the screen (this is why there is a percentage in monitor matching, a coefficient in BF, a zoom sensitivity in OW, etc.)... which part of the screen is most important? If we say the centre, then we use zoom ratio. But almost all agree that 0% feels 'slow', and we know that is because of the angles-versus-distances thing. If we run CSGO or BF1 defaults, we use 4/3, aka 75%, because muh feels. If we're the average OW pro, we use 18%. Why does everyone disagree? Well, if you take a player's hipfire FOV and his actual FOV and work out your ratio from there.... suddenly it all begins to line up with what 'muh feels' has been telling us all along.
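
To put numbers on those percentages: matching at a fraction n of the screen means scaling sensitivity so that a target n of the way to the screen edge takes the same mouse movement before and after the zoom. Here's a minimal Python sketch of that idea, treating n as a fraction of the vertical half-screen for simplicity; the calculator's exact internals may differ.

from math import atan, tan, radians

def match_multiplier(n, vfov_from, vfov_to):
    # sensitivity multiplier (zoomed/unzoomed) when matching at screen fraction n;
    # as n -> 0 this converges to the zoom ratio tan(to/2)/tan(from/2)
    a, b = radians(vfov_from) / 2, radians(vfov_to) / 2
    if n == 0:
        return tan(b) / tan(a)
    return atan(n * tan(b)) / atan(n * tan(a))

# e.g. my 80 degree hipfire down to the 8x scope at 7.45 degrees:
for n in (0.0, 0.18, 0.5625, 0.75, 1.0):
    print(n, round(match_multiplier(n, 80, 7.45), 5))

The multipliers climb as the match fraction rises, which is why zoom ratio/0% gives the lowest zoomed sensitivity of the lot.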

Sure, ANY variation from the optically correct distance from the screen, for a given FOV, will introduce distortion, and that distortion will ensure that our mouse sensitivity will never be correct for every point on the screen..... but the lower our FOV gets, the more zoomed in we are, the less of a difference it makes. The big difference is the one between our wide hipfire FOV and our actual FOV of the screen.

18 minutes ago, CaptaPraelium said:

Theoretically, 0% is perfect. For the image, on the screen. But what we see is not what is on the screen; what we see is on the inside of our eye. This is why 0% feels 'slow'. It does not account for the last step: the game world being projected from the monitor to our eye. We do not have any formula which does so, and accordingly, 0% is the most correct theory we have as of right now.

Yeah, I just tend to think from experience that any perception (good or bad) felt when changing your sensitivity can't be reliably counted on if you don't spend an appropriate amount of time testing it: enough time that you have completely broken your old muscle memory. How long that takes depends entirely on how ingrained a sensitivity is for you and, to a lesser extent, the game you are playing.

CSGO pros use 1 because they've been using 1 for over a decade; Overwatch pros use 40 because it felt best for them when they first picked up the game. Are either of these perceptions correct or wrong? Is Ana's scope fundamentally different from the AWP? (It's not.) 0% is literally more accurate around your crosshair, we know this; does that mean KennyS would be better if he used it? His experience and familiarity at 75% trumps the small advantage 0% gives, any day.

 

And if you were regularly flicking to the edge of the screen, 75% would be better, and a higher match percentage may be superior in games where quick aim is valued over center-crosshair precision, but you can only have one. To be consistent, to improve, you need to pick one method and stick with it long-term; I think we can all agree on that.

Edited by Bryjoe

Well, that's kinda the point of this thread. 'Muh feels' doesn't really have a place here, so muscle memory and such don't factor in.... Well, that's not entirely true: 'muh feels' has a place here in the question "why does 'muh feels' disagree with 0% MM/zoom ratio?". We pretty much have that answered now; all that's left is to find the mathematically optimal formula which accounts for perception. And when I say perception, I mean optical perception, as in how our brains process the image which reaches our eye; this is in contrast with perception as in the above plus a bunch of subjective experiences, like operating with suboptimal sensitivities or whatever.

You're certainly right, though, that pretty much any sensitivity will work just fine with sufficient experience. Our brains will, and do, make a 'formula' that works. It would be nice, though, to put a formula such as that into a computer, so we can use it across new games and FOVs.

13 hours ago, CaptaPraelium said:

Well, that's kinda the point of this thread. 'Muh feels' doesn't really have a place here, so muscle memory and such don't factor in.... Well, that's not entirely true: 'muh feels' has a place here in the question "why does 'muh feels' disagree with 0% MM/zoom ratio?". We pretty much have that answered now; all that's left is to find the mathematically optimal formula which accounts for perception. And when I say perception, I mean optical perception, as in how our brains process the image which reaches our eye; this is in contrast with perception as in the above plus a bunch of subjective experiences, like operating with suboptimal sensitivities or whatever.

Sorry, what is the answer to why our brains disagree with 0% match? I am not sure if 0% feels off because I am so used to 75%, or if 75% is just more "natural". There is no doubt that 75% is more versatile if you use it to convert sensitivities across different FOVs: slower on high FOVs and faster on low FOVs. In that way, it's usually closer to your "base" sensitivity. The question is: does it feel better because it legitimately is easier to adjust to, or because I have been using 75% for 15 years?

7 hours ago, Bryjoe said:

The question is: does it feel better because it legitimately is easier to adjust to, or because I have been using 75% for 15 years?

Short answer to your question: mostly because you are used to it. Like you said above, even if we gave KennyS a perfect formula (which doesn't exist yet), he'd be better off with 75%, because that's what he has practised with.
 

7 hours ago, Bryjoe said:

Sorry, what is the answer to why our brains disagree with 0% match?

I think it's time to do some pictures..... Sorry, I'm short on time right now, so I know they suck, but I hope these will do the job OK.

If you visit this link, you can see a demonstration of how the usual Zoom Ratio aka 0% MM formula (=tan(fova/2)/tan(fovb/2)) compares to a simple division of the focal distances of those FOVs. You can grab the FOVA and FOVB points and slide them around to see the numbers change. But our head does not move when we zoom in and out, so we are not viewing the image from the focal point: although we have perfectly accounted for the difference in the image on screen, we have not accounted for the image which reaches our eyes.

https://www.geogebra.org/m/ByAfGqzc
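
If you can't open the demo, the equivalence is easy to check numerically. A tiny Python sketch, using my screen height from earlier as the example:

from math import tan, radians

def zoom_ratio(fov_a, fov_b):
    # the usual zoom ratio / 0% MM formula
    return tan(radians(fov_a) / 2) / tan(radians(fov_b) / 2)

def focal_point(fov, screen_height=33.62):
    # distance at which the projection for this FOV is optically correct
    return (screen_height / 2) / tan(radians(fov) / 2)

a, b = 80, 14.8                           # hipfire and 4x, as before
print(zoom_ratio(a, b))                   # ~6.46
print(focal_point(b) / focal_point(a))    # the same number: dividing the
                                          # focal distances gives the zoom ratio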

Edited by CaptaPraelium
link updated
  • 2 weeks later...

Been busy and haven't had a lot of time to put into this.... Anyway, here's the current situation I'm working with: we can see from the graphics above that as our target deviates from the centre of the screen, the change in angle deviates from zoom ratio.... So it seems a matter of finding a formula which produces the minimal error. But here's a spanner in the works:

WHY DO WE ASSUME THE TARGET IS ON SCREEN

Yeh, there's a can of worms. You can probably hear the gears turning in my head from where you are.

  • 2 weeks later...

One thing I've realized after taking a couple of months off shooters is that it's REALLY obvious to me now how the methods differ. I think the explanation is that I have lost my muscle memory (I never really had much to start with because I'm always changing sensitivity), so it's a lot more obvious what feels different from my desktop sensitivity, which I have been using this whole time.

0% feels amazing at the center, and it's like it seamlessly switches between FOVs - most noticeable when you change from game to desktop. But when it comes to flicking to targets further from the center, I'm always missing. And if you are aware of the rest of the screen, it moves too fast, so your perceived sensitivity just feels off.

100% just feels way too slow at the center. But turning around and flicking to targets closer to the edge is effortless.

I haven't tried Drimzi's VFOV 100% MM, but I'll see how it feels. I'm assuming it will land halfway between the two, unless there's some magic happening. But I doubt it... I really think there is no perfect method.

Edited by potato psoas

@CaptaPraelium

I like what you're trying to do with this whole effort of figuring out how we perceive sensitivity, but I would've thought it had been solved by now. I think I have a good way to explain what exactly is happening...

So when we move our mouse a certain distance, we expect to move a relative distance on the monitor from the cursor/crosshair (Edit: our eyes actually perceive things as angles, but I guess this helps us determine distance, so sensitivity can still be considered "distance" over time). This is what defines muscle memory, and it is the very basis behind monitor matching. However, I think what we need to define is this idea of "sensitivity", or speed of mouse movement. Since speed is just distance over time, we can assume that "sensitivity" can be defined in terms of distance: over time we are constantly evaluating how far we expected to move against how far we have actually moved. If you know anything about the way our eyes and brain work, you would know that they are constantly perceiving new information (like a refresh rate) and making instantaneous judgments about that information. There is nothing magical about "sensitivity".

But it is not just the crosshair that you are constantly referencing - when you move your mouse you are looking at the entire screen and constantly referencing the "sensitivity" of many points on the monitor at the same time. Your eye is not just focused on one single point. There is a useful field of view that you always use. Even peripheral vision can play a part in perceived sensitivity.

Lastly, because of distortion, the perceived "sensitivity" of every point on the monitor is going to feel different. This is why, no matter what method you use, something will always feel off. E.g.:

  • This is why 0% MM feels amazing at the crosshair but it feels too fast at points closer to the edge of the monitor.
  • This is why 100% MM feels too slow at the crosshair but feels better at the edge.
  • And this is why every other method in between 0% and 100% will feel imperfect.
    (Edit: some of this part isn't exactly correct and I have refined my explanation in a future post)

There is no way to work around the distortion. From what I've explained, it's clear that what we perceive, what we expect, our muscle memory, can all be explained in terms of distance. It's best that you share these assumptions; they should help you pick the optimal method.

If you want to test what I have said, a good way to tell how the perceived sensitivity of different points on the monitor changes is to use the McOsu first-person mod and compare it to a 3D game. McOsu pans the camera around the 2D plane, but it essentially uses the same principle as 3D, given that 2D is just 0 FOV. But anyway, as an example, you can compare how the speed at the edge of the monitor is supposed to feel against a game with sensitivity converted from 2D using 0% MM. (meh)

Edited by potato psoas
replace distance with angle, and what I've said about perceived sensitivity is partly true, but I refine in a later post
On 2/21/2018 at 09:02, CaptaPraelium said:

Theoretically, 0% is perfect. For the image, on the screen. But what we see is not what is on the screen; what we see is on the inside of our eye. This is why 0% feels 'slow'. It does not account for the last step: the game world being projected from the monitor to our eye. We do not have any formula which does so, and accordingly, 0% is the most correct theory we have as of right now.



As per the science nerdery posted above, we know that we do not measure the distance between two points, in the real world or the game world, directly, as we would with, say, a ruler or pixels on screen. We measure it by deriving the distance from the angle between the two points.

I simply don't believe this is true. I think the distance from the monitor to the eye has more to do with the 2D realm than the 3D realm. I covered a lot about how 2D works in this post:

It explains a lot of reasons why I've always hated moving between McOsu first person mod and normal osu!...

Since there is a crosshair in first-person "3D" mode, you are developing your muscle memory from one point on the monitor to every other point on the monitor. But with normal 2D cursor movement, because points on the monitor are at different distances from the eye, the perceived sensitivity changes slightly depending on where the cursor is. You could probably minimize this with a curved monitor, so that all points on the monitor are the same distance from your eye... not that that is something you can just go and do. It's the other reason why you should make sure you sit in exactly the same position every time you play.

Edited by potato psoas

I think I've been too hard on 0% MM...

I know I've been testing things at higher FOVs for the sake of comparing methods, but that only exacerbates the flaws and minimizes the strengths of 0% MM. I found that at 90 FOV it is perfectly fine for what I need it to do. It only feels off right near the edge, which hardly affects my consistency, and the rest of the screen is extremely usable, especially the center. In this situation I would rather have a consistent center and accept the little area at the edge feeling off.

On top of this, I believe the approach would be to simply cap my FOV range at about 103, the highest FOV I normally use. It's a compromise I'm willing to accept, because I know every other method is flawed anyway.

I think I'm convinced now; I agree with many of the things Skwuruhl has said... I was just being too picky.

23 hours ago, potato psoas said:

 

  • This is why 0% MM feels amazing at the crosshair but it feels too fast at points closer to the edge of the monitor.
  • This is why 100% MM feels too slow at the crosshair but feels better at the edge.

 

Isn't the opposite true? 0% MM would result in a lower sensitivity compared to 100% MM.

On 3/22/2018 at 20:51, tttt1010 said:

Isn't the opposite true? 0% MM would result in a lower sensitivity compared to 100% MM.

Lower sensitivity as in higher cm/360, that's for sure. But no, 0% is matched at the crosshair but becomes faster as you go towards the edge of the monitor, whereas 100% starts off slow at the crosshair and gets closer to a match at the edge of the monitor.

If you want to prove which has the lower cm/360 you can always check in the calculator.
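
If you want to sanity-check it outside the calculator too, here's a rough Python sketch. It computes how much mouse movement the zoomed view needs, relative to hipfire, to bring a target at a given screen fraction to the crosshair, under 0% and vertical 100% matching. The FOVs are just example values, so treat it as illustrative only.

from math import atan, tan, radians

hip, zoom = 80, 7.45                       # example vertical FOVs (hipfire, 8x)
t_hip, t_zoom = tan(radians(hip) / 2), tan(radians(zoom) / 2)

multipliers = {
    "0% MM":   t_zoom / t_hip,             # matched at the crosshair
    "100% MM": zoom / hip,                 # matched at the screen edge (vertical)
}

for name, mult in multipliers.items():
    for n in (0.01, 0.25, 0.5, 0.75, 1.0): # fraction of the way to the edge
        rot_hip = atan(n * t_hip)          # rotation needed in hipfire
        rot_zoom = atan(n * t_zoom)        # rotation needed when zoomed
        # mouse movement is rotation / sensitivity; print the zoomed/hipfire ratio
        print(name, n, round((rot_zoom / mult) / rot_hip, 4))

Wherever the printed ratio is 1, the movement is genuinely matched; everywhere else it drifts, which is the whole argument of this thread.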

Edited by potato psoas
seems there's "more to this than meets the eye"
