Perceived sensitivity


Recommended Posts

On 21/03/2018 at 8:56 PM, potato psoas said:

One thing I've realized after taking a couple months off of shooters is that I've found it REALLY obvious to perceive how the methods differ. I think the explanation for this is that I have lost my muscle memory (I never really had much to start with because I'm always changing sensitivity) so it's a lot more obvious to me what feels different from my desktop sensitivity, which I have been using this whole time.

...

I haven't tried Drimzi's VFOV 100% MM but I'll see how it feels, though, I'm assuming it will be halfway between the two, unless there's some magic happening. But I doubt it... I really think there is no perfect method.

Being an ex-console peasant, I have the same gift, and I've been very careful to maintain it. I don't have a 'muscle memory' of any significance, since my muscle memory is for sticks not a mouse, and I'm constantly changing my sensitivity in my search for a formula.

You're right, there can be no perfect method, at least, not based on monitor distance matching. Because the distortion of the image is introduced by our stationary position relative to the monitor, while the focal point of the image shifts back and forth, the image on the monitor is not an accurate point of reference for rotation. This is well evidenced by the failure of 0% MM which is the perfect representation of the image on the monitor.
 

On 21/03/2018 at 10:27 PM, potato psoas said:

@CaptaPraelium

I like what you're trying to do with this whole figuring out how we perceive sensitivity, but I would've thought this had been solved by now.

Me too. The thing is that nobody has really tried before me so I am inventing the wheel here. I have a rare disease and am not able to work on this often, and I'm knowingly biting off more than I can chew, so I stand on the shoulders of giants, in medical/psychological fields as well as gamers here, but there is some leftover which I must do alone (unless others pitch in here as I've invited :))

One of the really problematic parts I'm finding is that I can come up with concepts and do the working on paper, but sharing my progress takes twice as long as making the progress, since I have to do fancy images and stuff. I'm a lot further along than the thread shows, but it's hard to share. I'll try to give some updates on what I've learned today, in my replies to you.


 

On 21/03/2018 at 10:27 PM, potato psoas said:

So when we move our mouse a certain distance, we expect to move a relative distance on the monitor from the cursor/crosshair.

No. This is a common misconception and one which has led us (and I say "us" as in, me too!) astray. Our brain expects us to move a certain ANGLE from the crosshair, NOT a certain distance. Our brain uses the angle to determine the distance between two objects. The headshrinks refer to this as https://en.wikipedia.org/wiki/Visual_angle . Note in the image on that page, we need V and D to determine S. We cannot expect a movement in distance, because we do not know the distance. We can know the angle thanks to feedback from our eyes. Using 0% MM accounts for the shift in D (refer to the geogebra links above); however, it does not account for the distortion introduced by 'faking' the shorter distance (read: zooming the image, rather than actually rendering what it would look like from a shorter distance, or moving your head to the focal point every time you ADS, haha). And that's where this comes in:

On 21/03/2018 at 10:27 PM, potato psoas said:

But it is not just the crosshair that you are constantly referencing - when you move your mouse you are looking at the entire screen and constantly referencing the "sensitivity" of many points on the monitor at the same time. Your eye is not just focused on one single point. There is a useful field of view that you always use. Even peripheral vision can play a part in perceived sensitivity.
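For anyone who wants to poke at that visual-angle relation numerically, here is a minimal sketch (the function names are mine, purely illustrative; it's just the right-triangle relation from the Wikipedia diagram, tan(V/2) = (S/2)/D):

```python
import math

def visual_angle(size, distance):
    """Visual angle V (radians) subtended by an object of size S at distance D:
    tan(V/2) = (S/2) / D."""
    return 2 * math.atan((size / 2) / distance)

def size_from_angle(angle, distance):
    """Recover size S from a known angle V and distance D: S = 2*D*tan(V/2)."""
    return 2 * distance * math.tan(angle / 2)

# A 10 cm target viewed from 60 cm subtends roughly 9.5 degrees. Note that the
# angle alone can't give you S without D - which is the whole point above.
v = visual_angle(10, 60)
print(math.degrees(v))
```

You need both V and D to get S back out; the angle is the only thing the eye reports directly.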

 

On 21/03/2018 at 10:27 PM, potato psoas said:
  • This is why 0% MM feels amazing at the crosshair but it feels too fast at points closer to the edge of the monitor.


The reason any sensitivity feels faster at the edges of the monitor is a result of the 'stretching' introduced by the distortion of the image. As we diverge from the centre, the image becomes more and more stretched, so the same angular movement causes a greater distance to be travelled on-screen.
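That 'stretching' is easy to put numbers on. A small sketch (my own illustration; the 74° vertical FOV and 33.6 cm monitor height are assumed example values, roughly a 27" 16:9 panel) of how many centimetres of screen one degree of rotation covers at various angles from the centre:

```python
import math

def cm_per_degree(phi_deg, vfov_deg, height_cm):
    """Centimetres of screen crossed per degree of rotation, for a point
    phi_deg away from the view axis under rectilinear projection.
    This is the derivative of (H/2)*tan(phi)/tan(vfov/2) with respect to phi."""
    half = math.radians(vfov_deg) / 2
    phi = math.radians(phi_deg)
    return (height_cm / 2) / math.tan(half) / math.cos(phi) ** 2 * math.radians(1)

# The same one-degree turn covers more screen the farther from the centre it lands:
for phi in (0, 15, 30):
    print(phi, round(cm_per_degree(phi, 74, 33.6), 3))
```

The sec² term is what makes the edges 'fast': equal angles map to ever-larger screen distances as you diverge from the centre.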
 

18 hours ago, potato psoas said:

Lower sensitivity as in lower cm/360, that's for sure. But no, 0% is matched at the crosshair but becomes faster as you go towards the edge of the monitor. Whereas 100% starts off slow at the crosshair and gets closer to match at the edge of the monitor.

Everything starts off slow at the crosshair :)
 

On 21/03/2018 at 10:27 PM, potato psoas said:
  • This is why 100% MM feels too slow at the crosshair but feels better at the edge.

The reason it feels best at the edge is because the angle of rotation is perfectly matched to that monitor distance, but it will always feel 'off' anywhere else.
 

On 21/03/2018 at 10:27 PM, potato psoas said:
  • And this is why every other method in between 0% and 100% will feel imperfect.

There is no way to work around the distortion. From what I've explained, it's clear that what we perceive, what we expect, our muscle memory, is explained in terms of angles rather than distances. It's best to keep these assumptions in mind when picking the optimal method.

As discussed above, by accounting for distance, we get 0% MM, which is great except for the distortion; and as you're covering here, the distortion can never be fully accounted for. What we're after here is the 'most correct'. And now we begin to touch on what I've been working on recently... Now, before I get into this, I have to say: some of this is untested and unconfirmed. I'll mark those with a ***. Normally I like to be REALLY fricken sure before I post something here, but I realise that my lack of communication is detrimental, probably more so than posting a possible mistake... So, I'll post this possible mistake - but I'm sure you'll see where I'm going with this.


As I'm sure is obvious by now (so I won't go in-depth explaining why) the distortion we see on-screen is a tangent function. If we consider the sensitivity "monitor matched" at 180 degrees (directly behind us, hence the quotes on "monitor match" because that's NOT on the monitor obviously....), we're looking at something which approaches having no divider for mouse sensitivity whatsoever, as in, the same cm/360 for every zoom level (*** pretty sure of this, haven't finished the math, but I'm very confident). If we consider the sensitivity at 0 degrees (the crosshair), we are looking at 0%MM. You can see this in the geogebra image in my posts above. I have to re-do that geogebra magic to allow for a target that is off-screen, but you can see where it's going.

Using that image, I started to suspect that perhaps the formula was as simple as dividing the two FOVs. You can see that at vertical 100% (slide the 'Target' X in that image to the top of the 'screen'), that's exactly what we have. This seems to make a lot of sense: since we've learned that we're not dealing with distance but with angles, dividing the angles of the FOVs seems sensible. And then it clicked: hey, this won't work in HFOV. Dividing the two HFOVs does NOT provide the same result as dividing the two VFOVs. A quick spot of experimentation showed me that if we scale between 0 and 180 FOV, we get the monitor's aspect ratio as the ratio between the divided results. No big surprise there, but as I alluded to in my previous post, this got me thinking: where is the best place to match? 100% VFOV seemed nice, but 100% HFOV breaks that wide open. And then I got to thinking: 100% HFOV is the same thing as 177.77~% VFOV (on 16:9 monitors at least). Knowing the importance of aspect-ratio-independent sensitivity (as per my previous discussion and the example of being in a plane or spaceship or whatever, and rolling it over), this got me to thinking: WHY do we assume that the target is on screen? It's entirely common for me to be in-game, hear footsteps behind me, and have to spin and hit whoever's there. Sure, it's not as common as a target roughly in front of me and within FOV, but that has no bearing on what's correct for the formula; it's related to my own movement. A target at the right edge of a 16:9 monitor is the same thing as one entirely off-screen above me. And oh look, there goes a plane flying above me. Think about things like rocket jumping, where you spin 180, fire, then spin back... As soon as I started thinking about this, examples flooded my head.
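The HFOV-versus-VFOV division mismatch can be seen numerically. A rough sketch (my own code, assuming a 16:9 monitor and the standard rectilinear FOV conversion h = 2·atan(aspect·tan(v/2))):

```python
import math

ASPECT = 16 / 9

def hfov_from_vfov(vfov_deg, aspect=ASPECT):
    """Horizontal FOV for a given vertical FOV under rectilinear projection."""
    return math.degrees(2 * math.atan(aspect * math.tan(math.radians(vfov_deg) / 2)))

def ratio_of_ratios(v_hi, v_lo):
    """(hFOV division) / (vFOV division) for a pair of vertical FOVs."""
    h_ratio = hfov_from_vfov(v_lo) / hfov_from_vfov(v_hi)
    v_ratio = v_lo / v_hi
    return h_ratio / v_ratio

# Dividing hFOVs and dividing vFOVs disagree, and the disagreement tends toward
# the aspect ratio (~1.778) as the pair stretches out toward 180 and 0:
for pair in ((100, 50), (150, 10), (178, 2)):
    print(pair, round(ratio_of_ratios(*pair), 3))
```

So the 'divide the FOVs' idea gives aspect-ratio-dependent answers depending on which axis you divide, which is exactly the problem described above.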

There has long been discussion about which monitor match percentage is correct, and I am now asking myself, who says it has to be on the monitor at all?!

So, we have to find a place that scales correctly, given a minimum sensitivity, as provided by 0% MM, and a maximum, being identical cm/360 (***). Which point has the least amount of error involved? Well, this is easy if we think of it as a tangent function. At the asymptotes we have a vertical line, representing maximum sensitivity (same cm/360 ***); that's 90 degrees (tan(90°)). At 0 we have a horizontal line, representing minimum sensitivity, the 0% MM. Annnnd... hello, 45 degrees. Keep in mind that I am referring to an angle from the centre of the screen, so that's a 90 FOV. If you consider the average kind of hipfire FOVs people use, this is going to be somewhere just a little greater than 100% VFOV. I'm reminded of the conversation about Battlefield's USA, where they started with what we call 0% MM and everyone said it was too slow; then they tried 100% VFOV matching and it was almost there; then 100% HFOV, but some people found it too fast; and then they came to the conclusion that some people feel it differently, and settled on CSGO's 133% as a 'middle-ground'.
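The claim about the two extremes can be sketched numerically. This is my own illustration and inherits the *** caveat above: it's just the usual monitor-distance-match formula with the match fraction allowed past the screen edge, showing that 0% is the minimum (the full tan-ratio divider) and that matching farther and farther off-screen approaches no divider at all (identical cm/360):

```python
import math

def cm360_divider(hip_fov_deg, zoom_fov_deg, match):
    """cm/360 divider for monitor-distance matching at fraction 'match' of the
    half-screen (0 = crosshair, 1 = edge, >1 = off-screen). Returns how much
    slower (in cm/360 terms) the zoomed FOV is than hipfire."""
    t_hip = math.tan(math.radians(hip_fov_deg) / 2)
    t_zoom = math.tan(math.radians(zoom_fov_deg) / 2)
    if match == 0:
        return t_hip / t_zoom  # the 0% MM limit
    return math.atan(match * t_hip) / math.atan(match * t_zoom)

# 103 -> 51 FOV: the divider shrinks toward 1 (same cm/360) as the match
# point moves off-screen, i.e. as the matched angle approaches 90 degrees:
for m in (0, 1, 4, 20, 1000):
    print(m, round(cm360_divider(103, 51, m), 4))
```

The divider is largest at the crosshair and decays monotonically toward 1, which is the 'range of sensitivities' bounded by 0% MM below and identical cm/360 above.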


 

On 21/03/2018 at 10:31 PM, potato psoas said:

I simply don't believe this is true.

Believe it, it's true :) You can see it for yourself, it's been proven, and not only by myself. One way to see it is with the geogebra image I posted above. If you move the 'eye' to the focal point for a given FOV, there is no distortion of the angle from the eye to the target on-screen and beyond. Another more tangible way to do it is by using the high-FOV images from the first page of this thread. Use the formula:
opticallycorrectdistance=(heightcm/2)/(tan(hipfov/2)) and move your eye to that correct distance (you'll have your face right up in the monitor!) Look at the centre of the image and note the complete lack of distortion toward the edges.
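That formula is easy to sanity-check in code. A trivial sketch (the 33.6 cm height is just a 27-inch 16:9 panel, and 74 degrees is an arbitrary example VFOV; my numbers, purely illustrative):

```python
import math

def optically_correct_distance(height_cm, vfov_deg):
    """Viewing distance at which a rectilinear render shows no perceived
    distortion: distance = (height/2) / tan(vfov/2)."""
    return (height_cm / 2) / math.tan(math.radians(vfov_deg) / 2)

# A 33.6 cm tall monitor rendered at 74 degrees vertical FOV would need the
# eye only about 22 cm from the screen - face right up in the monitor.
print(optically_correct_distance(33.6, 74))
```

The wider the FOV, the closer the focal point, which is why high-FOV screenshots only look undistorted with your face almost against the glass.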
 

On 21/03/2018 at 11:48 PM, potato psoas said:

I think I've been too hard on 0% MM...

It's funny because I think I've been too easy on it. At first glance (and second and nth, lol) it appears to be perfect. I said early on here, that I expected this thread to validate a previous formula, and not so much to make something new, and to be honest, what I silently expected was to prove that 0%MM was correct. The more I look into it, the more I realise why it feels too slow for everything beyond 0% from the crosshair - because it is the MINIMUM sensitivity across a range of sensitivities which contain the most correct one. As our FOV decreases (we zoom in), so does our range and so does any error in our sensitivity, so it's quite hard to see any error in it. The greatest exposure of any error, is going to be in the way we treat our hipfire FOV.
 

 

18 hours ago, CaptaPraelium said:

Believe it, it's true :) You can see it for yourself, it's been proven, and not only by myself. One way to see it is with the geogebra image I posted above. If you move the 'eye' to the focal point for a given FOV, there is no distortion of the angle from the eye to the target on-screen and beyond. Another more tangible way to do it is by using the high-FOV images from the first page of this thread. Use the formula:
opticallycorrectdistance=(heightcm/2)/(tan(hipfov/2)) and move your eye to that correct distance (you'll have your face right up in the monitor!) Look at the centre of the image and note the complete lack of distortion toward the edges.

Okay I understand this... I actually mentioned the eye distance shift idea in my post (below). Though I didn't realize that it would cancel out the distortion if you lined up your eye FOV with the in-game FOV. I've actually done this before... It's why many players of racing games like to line up their FOV for more immersion: http://www.projectimmersion.com/fov/

I guess I only just realized when I made my other post that different points on the monitor have different "sensitivities" - but only if we were looking at it from a 0 FOV perspective... you really have to take into account the design of the eye. Oh that reminds me - keep in mind WE HAVE TWO EYES.

 

Shit, I understand what I'm missing now... this thing is amazing: https://www.geogebra.org/m/DyZprxpA

Edited by potato psoas
3 hours ago, potato psoas said:

Okay I understand this... I actually mentioned the eye distance shift idea in my post (below). Though I didn't realize that it would cancel out the distortion if you lined up your eye FOV with the in-game FOV. I've actually done this before... It's why many players of racing games like to line up their FOV for more immersion: http://www.projectimmersion.com/fov/

Right, I know what you mean. It didn't quite 'click' in my head until I was reading Valve's document and they mentioned it and I was like "hangon" *punch numbers into calculator* "Holy cow it works" :o

 

3 hours ago, potato psoas said:

I guess I only just realized when I made my other post that different points on the monitor have different "sensitivities" - but only if we were looking at it from the perspective of the monitor... you really have to take into account the design of the eye. Oh, that reminds me - keep in mind WE HAVE TWO EYES.

Yeh, honestly I've been completely avoiding this, largely for reasons of simplification. I did gloss over it, and I'm fairly confident that since we are viewing a flat, 2D image (even of a virtual 3D world), the binocular cues don't really apply. Which is good, because it would make everything really, really difficult. https://en.wikipedia.org/wiki/Depth_perception#Binocular_cues

Speaking of reminders, you've reminded me of an important thing I've neglected to mention. It's unlikely that what we will get from the above concepts is a single 'monitor distance' at which we will match all FOVs, as we have with the 'monitor match' method. Instead, what we will probably get is an 'angle match', where each FOV will match the other FOVs only at a given angle (probably 90 degrees, as per the tan curve thing I mentioned above, but that will depend on the stuff marked with ***'s actually being calculated). To put this in terms of monitor distance, in hipfire it's likely to be around 100+% (vertical), but at very high zoom it is likely to be entirely off-screen. Don't quote me on this ;) This is the kind of thing which I usually keep to myself until I've formulated and tested and verified it, but since I'm trying to be more communicative, there it is. By the way, 0% monitor match and 0° angle match are the same thing... 0% MM is still pretty much the basis for all of this.

 

 

Edited by CaptaPraelium
2 hours ago, espe89 said:

Sorry, but for ignorant people like me: which method is the "best" today for converting WINDOWS to CSGO?

 

Sorry mate I'm honestly not too sure. I don't dabble in 2D to 3D conversion much yet. This thread is more about conversions between different FOVs in 3D. Once I'm really satisfied with that working, I'll branch out into 2D.


Okay, I'm just putting my thoughts into words here. I just want to figure things out for myself and you can read along to see if I'm making sense...

One of the things I mentioned before was that if your monitor were able to shift to match your eye's FOV with the in-game FOV then you wouldn't need to change your cm/360 (I think - or was it the gear ratio...) - it would feel exactly the same (and across all points of the monitor). But because that ain't going to happen, we have to figure out how the distortion changes so that we can compensate for it.

As a reminder, the eye-to-monitor distortion and the rectilinear projection distortion cancel each other out when the game FOV matches the eye's FOV, thus causing the sensitivity to be the same at all points on the monitor (as it is perceived by the eyes).
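That cancellation can be checked numerically. A minimal sketch (my own code; eye assumed on the centre axis, distances in cm, angles in radians):

```python
import math

def screen_pos(phi, vfov, height):
    """On-screen height (cm from centre) of a point phi radians off the view
    axis, under rectilinear projection."""
    return (height / 2) * math.tan(phi) / math.tan(vfov / 2)

def perceived_angle(phi, vfov, height, view_dist):
    """Angle at the eye to that on-screen point, eye on the centre axis."""
    return math.atan(screen_pos(phi, vfov, height) / view_dist)

# With the eye at the optically correct (focal) distance, the perceived angle
# equals the in-game angle everywhere - zero distortion:
height, vfov = 33.6, math.radians(74)
focal = (height / 2) / math.tan(vfov / 2)
for phi in (0.1, 0.3, 0.6):
    print(phi, perceived_angle(phi, vfov, height, focal))

# Sitting closer than the focal distance exaggerates off-centre angles:
print(perceived_angle(0.6, vfov, height, focal / 2))
```

Algebraically the two tan terms cancel exactly at the focal distance, so the identity holds at every screen position, not just the samples printed here.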

Previously, I thought that each method had only one behavior, i.e. 0% MM only became faster at the edges and 100% MM became slower at the center, but adding the function of the eye into the equation instead gives us two behaviors as you move away from your eye's FOV - one for when you approach 180 FOV and one for when you approach 0 FOV.

So I thought about it and I found this:
0%MM: The center is matched for every FOV, but as you approach 180 FOV the sensitivity gets faster at the edges and as you approach 0 FOV the sensitivity gets slower at the edges.
100%MM: The edge is matched for every FOV, but as you approach 180 FOV the sensitivity gets slower at the center and as you approach 0 FOV the sensitivity gets faster at the center.

Keep in mind, monitor matching (MM) in this sense refers to angle matching, not distance matching, but the concept is still the same - we are just identifying how the distortion changes so we can figure out how to compensate for it.

As you can see, there is still distortion, but the behaviors are different, and when the distortion occurs depends a lot on sitting distance. One of the important things to remember is that there is a limit to the distortion with 100% MM, as it is bounded by the chord length relative to the arc length, whereas 0% MM approaches 0 cm/360 at 180 FOV. But as I said before, this does not mean 0% MM is the inferior method.
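The two behaviors above can be put into numbers. This is my own derivation, not something verified against the thread's calculators: with the eye on the centre axis, the perceived angle of a point phi off-axis is atan(k·tan(phi)), where k is the ratio of the eye's and the game's half-FOV tangents; the edge-to-centre ratio of its derivative tells you whether the edges feel fast or slow once the centre is matched (0% MM style):

```python
import math

def edge_vs_center(game_vfov_deg, eye_vfov_deg):
    """Ratio of perceived (at-the-eye) angular speed at the screen edge to the
    speed at the crosshair, with the crosshair speed matched. Derived from
    d/dphi atan(k*tan(phi)) evaluated at phi = vfov/2 versus phi = 0."""
    t = math.tan(math.radians(game_vfov_deg) / 2)
    k = math.tan(math.radians(eye_vfov_deg) / 2) / t
    return (1 + t**2) / (1 + (k * t) ** 2)

# Eye FOV ~50 degrees (roughly a 27" monitor at 36 cm):
print(edge_vs_center(120, 50))  # game FOV above eye FOV: edges feel fast
print(edge_vs_center(50, 50))   # game FOV equals eye FOV: uniform
print(edge_vs_center(20, 50))   # game FOV below eye FOV: edges feel slow
```

The crossover at game FOV = eye FOV falls straight out of k = 1, matching the 'two behaviors' description.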

So in the end, there is still going to be distortion, but the approach you take determines the sensitivity. I think a lot can be said about your eye-to-monitor distance, as this can also exacerbate the flaws of each method.

If you intend to use 0% MM, make sure your monitor is further away from your eyes, whereas if you intend to use 100% MM, make sure your monitor is closer to your eyes. I don't know the specifics but I'm sure there's a formula to help you find the sweet spot... but even then, there will still be distortion.

Edited by potato psoas
On 3/23/2018 at 18:34, CaptaPraelium said:

The headshrinks refer to this as https://en.wikipedia.org/wiki/Visual_angle . Note in the image on that page, we need V and D to determine S. We cannot expect a movement in distance, because we do not know the distance. We can know the angle thanks to feedback from our eyes. Using 0% MM accounts for the shift in D (refer to the geogebra links above); however, it does not account for the distortion introduced by 'faking' the shorter distance (read: zooming the image, rather than actually rendering what it would look like from a shorter distance, or moving your head to the focal point every time you ADS, haha)

Does that mean 100% MM accounts for the shift in the radius/hypotenuse of the visual angle?

On 3/23/2018 at 18:34, CaptaPraelium said:

Dividing the two HFOVs does NOT provide the same result as dividing the two VFOVs. A quick spot of experimentation showed me that if we scale between 0 and 180 FOV, we get the monitor's aspect ratio as the ratio between the divided results. No big surprise there, but as I alluded to in my previous post, this got me thinking: where is the best place to match? 100% VFOV seemed nice, but 100% HFOV breaks that wide open. And then I got to thinking: 100% HFOV is the same thing as 177.77~% VFOV (on 16:9 monitors at least). Knowing the importance of aspect-ratio-independent sensitivity (as per my previous discussion and the example of being in a plane or spaceship or whatever, and rolling it over), this got me to thinking: WHY do we assume that the target is on screen? It's entirely common for me to be in-game, hear footsteps behind me, and have to spin and hit whoever's there. Sure, it's not as common as a target roughly in front of me and within FOV, but that has no bearing on what's correct for the formula; it's related to my own movement. A target at the right edge of a 16:9 monitor is the same thing as one entirely off-screen above me. And oh look, there goes a plane flying above me. Think about things like rocket jumping, where you spin 180, fire, then spin back... As soon as I started thinking about this, examples flooded my head.

There has long been discussion about which monitor match percentage is correct, and I am now asking myself, who says it has to be on the monitor at all?!

I swear even if the "monitor match" was off the monitor then there would still be distortion... but if you are going to do that kind of thing, 100% MM works best at all points outside the boundaries of the monitor because 100% MM is the gear ratio method, the method that synchronizes sensitivity like gears connected by a pulley. Keep in mind, theoretically, the pulley can be at a tangent to the gear or wrap around the gear completely - it will still pull the gear in sync with all the other gears.

20 hours ago, CaptaPraelium said:

Yeh, honestly I've been completely avoiding this, largely for reasons of simplification. I did gloss over it, and I'm fairly confident that since we are viewing a flat, 2D image (even of a virtual 3D world), the binocular cues don't really apply. Which is good, because it would make everything really, really difficult.

I was thinking more along the lines that the eyes aren't exactly centered at the crosshair - they are to the left and right of it - so unless we play with an eyepatch and sit to one side of the monitor, none of our math is going to be right... and I tried this and my left eye is going blind :/ but it really did feel like I was looking at a 3D model and not a 2D image.

24 minutes ago, potato psoas said:

Does that mean 100% MM accounts for the shift in the radius/hypotenuse of the visual angle?

It does, but here is the tricky part: it does if you use 100% VFOV, but for 100% HFOV it is still distorted - distorted by a factor of the aspect ratio (i.e. 1.77777~ for 16:9 monitors) at the greatest extent, i.e. when we are converting 180 FOV to 0 FOV. This is exactly the problem I'm trying to solve right now. I know that it's something to do with the tan function distortion and finding the 'middle ground', but I'm not quite sure how to handle the math behind it. I'm searching for a function that scales between zoom ratio and zoom ratio*aspect ratio, with a tan curve, where 45 degrees along that curve will be the minimum error. I'm no @Skwuruhl ;)

1 minute ago, potato psoas said:

I was thinking more along the lines that the eyes aren't exactly centered at the crosshair - they are to the left and right of it - so unless we play with an eyepatch and sit to one side of the monitor, none of our math is going to be right... and I tried this and my left eye is going blind :/ but it really did feel like I was looking at a 3D model and not a 2D image.

I treat this as if we have one eye only, and that is our dominant eye. https://en.wikipedia.org/wiki/Ocular_dominance

 


 

I thought I'd make a diagram explaining what I was talking about in my post before. It is showing the distortion for 0 FOV (or approaching 0 FOV), the eye FOV and 180 FOV (or approaching 180 FOV), demonstrating how the required mouse movement is affected at the edge of the monitor and at the center of the monitor, smartly shown by dividing the angle into two equal parts:

[attached diagram: eyefov.png]

Then consider how 0% MM and 100% MM work and you can figure out how each method is affected by distortion.

6 hours ago, potato psoas said:

0%MM: The center is matched for every FOV, but as you approach 180 FOV the sensitivity gets faster at the edges and as you approach 0 FOV the sensitivity gets slower at the edges.
100%MM: The edge is matched for every FOV, but as you approach 180 FOV the sensitivity gets slower at the center and as you approach 0 FOV the sensitivity gets faster at the center.

Simple explanation:

0%MM slower approaching 0 FOV, faster approaching 180 FOV

100%MM faster approaching 0 FOV, slower approaching 180 FOV

The direction changes at the crossover point when the eye FOV and in-game FOV are the same.

Edited by potato psoas
22 hours ago, potato psoas said:

I swear even if the "monitor match" was off the monitor then there would still be distortion... but if you are going to do that kind of thing, 100% MM works best at all points outside the boundaries of the monitor because 100% MM is the gear ratio method, the method that synchronizes sensitivity like gears connected by a pulley. Keep in mind, theoretically, the pulley can be at a tangent to the gear or wrap around the gear completely - it will still pull the gear in sync with all the other gears.

Yes, you're quite right - the distortion is unavoidable. We can never fully avoid the effects of it, and so I am attempting to minimise those effects.

The issue with monitor matching is that it does not function in the same manner when considering the distortion on both the horizontal and the vertical planes. As skwuruhl has mentioned in the past, the monitor match percentages which we have chosen, be they 100% vertical or horizontal or diagonal or whatever, are somewhat arbitrary. What I am attempting to find is a non-arbitrary position at which to match sensitivities, by matching at the position where the errors in sensitivity (too slow/too fast) are minimal across all angles of rotation, regardless of whether that angle falls on-screen at the given FOV.

On 24.03.2018 at 16:00, potato psoas said:

 

I thought I'd make a diagram explaining what I was talking about in my post before. It is showing the distortion for 0 FOV (or approaching 0 FOV), the eye FOV and 180 FOV (or approaching 180 FOV), demonstrating how the required mouse movement is affected at the edge of the monitor and at the center of the monitor, smartly shown by dividing the angle into two equal parts:

[attached diagram: eyefov.png]

Then consider how 0% MM and 100% MM work and you can figure out how each method is affected by distortion.

Simple explanation:

0%MM slower approaching 0 FOV, faster approaching 180 FOV

100%MM faster approaching 0 FOV, slower approaching 180 FOV

The direction changes at the crossover point when the eye FOV and in-game FOV are the same.

So, as I am stupid: is there any eye FOV at which 0% MM is 1:1?

And if there is, how do I need to match sens? Help me please

Edited by vrnvorona
1 hour ago, vrnvorona said:

So, as I am stupid: is there any eye FOV at which 0% MM is 1:1?

And if there is, how do I need to match sens? Help me please

You're not stupid :) It's confusing AF.

Short answer is no. Your eye FOV would have to change to match the current FOV (read: you would have to move your seat back and forth when you right-click and un-right-click)

10 minutes ago, CaptaPraelium said:

You're not stupid :) It's confusing AF.

Short answer is no. Your eye FOV would have to change to match the current FOV (read: you would have to move your seat back and forth when you right-click and un-right-click)

Lmao. And which eye FOVs do I need for 103 and 51 game FOVs respectively? And how do I calculate the eye FOV to the monitor?


//My monitor resolution
widthpx = 2560

heightpx = 1440

//27 inch diagonal, converted to centimeters because Australia. If you prefer inches, just don't do the *2.54.
diagcm = 27*2.54

//Distance from monitor to eye, also centimeters in this case, measured with a piece of string...
//actually used my headphone cable and a borrowed Hannah Montana ruler XD Super high tech here
viewdistance = 36

//Calculating monitor height from this information so it's more exact than just measuring it. We need monitor height because FOV is usually measured vertically.
heightcm = (diagcm/sqrt(widthpx^2+heightpx^2))*heightpx

//Calculating the actual FOV of the eye to the monitor based on the above
actualfov = 2*arctan((heightcm/2)/viewdistance)

//Calculating the focal point (ie, the 'optically correct distance' as Valve says it) for a given VFOV
opticallycorrectdistance = (heightcm/2)/(tan(vfov/2))
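The calculator-style block above runs directly as Python with only minor changes (same numbers, same formulas; the 74-degree VFOV at the end is just a sample value I picked for the focal-point line):

```python
import math

width_px, height_px = 2560, 1440
diag_cm = 27 * 2.54      # 27-inch diagonal in centimetres
view_distance = 36       # eye-to-monitor distance in centimetres

# Monitor height from the diagonal and the pixel aspect ratio
height_cm = diag_cm / math.hypot(width_px, height_px) * height_px

# Vertical FOV the monitor actually subtends at the eye
actual_fov = math.degrees(2 * math.atan((height_cm / 2) / view_distance))

# Focal point ('optically correct distance') for a sample 74-degree VFOV
focal = (height_cm / 2) / math.tan(math.radians(74) / 2)

print(round(height_cm, 2), round(actual_fov, 1), round(focal, 1))
```

With these numbers the monitor subtends an eye FOV of roughly 50 degrees, which is why the 51-FOV comparisons later in the thread land so close to 1:1.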

Edited by CaptaPraelium
for the lulz
3 minutes ago, CaptaPraelium said:

//My monitor resolution
widthpx = 2560

heightpx = 1440

//27 inch diagonal, converted to centimeters because Australia
diagcm = 27*2.54

//Distance from monitor to eye
viewdistance = 36

//Calculating monitor height from this information so it's more exact than just measuring it. We need monitor height because FOV is usually measured vertically.
heightcm = (diagcm/sqrt(widthpx^2+heightpx^2))*heightpx

//Calculating the actual FOV of the eye to the monitor based on the above
actualfov = 2*arctan((heightcm/2)/viewdistance)

//Calculating the focal point (ie, the 'optically correct distance' as Valve says it) for a given VFOV
opticallycorrectdistance=(heightcm/2)/(tan(vfov/2))

Cms and Aussies are best, mate

And as I get it, I need 103 and 51 eye FOV for the same FOVs in-game, right? Theoretically

1 minute ago, CaptaPraelium said:

Theoretically yes. In practice of course this is never possible...well never practical to move your seat around, and that's why we're nerding out in here trying to find the 'next best thing'

Ok, sounds interesting. Thanks for the math, mate )

46 minutes ago, CaptaPraelium said:

Theoretically yes. In practice of course this is never possible...well never practical to move your seat around, and that's why we're nerding out in here trying to find the 'next best thing'

Though I kinda can't really understand how to do this. I made some screenshots, and the picture is ~2.65x the size in scope while the FOV is 1/2. If I just divide 1/2.65 I get 37.75... the popular 0% MM value. If I divide 51/103 I get the 100% MM value, 50. I can't get why they differ, and why just having 38 OR 50 can be 1:1.

 

Maybe you can't explain this, but eh - does a solution even exist?
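Both of those numbers drop straight out of the tan relation. A sketch (my own check, assuming both FOVs are measured on the same axis): the screenshot magnification is the ratio of the tangents of the half-FOVs, whose reciprocal is the 0% MM multiplier, while the plain 51/103 division ignores the tan entirely, which is why 37.75 and 50 differ:

```python
import math

hip_fov, ads_fov = 103, 51  # degrees, both measured on the same axis

t_hip = math.tan(math.radians(hip_fov) / 2)
t_ads = math.tan(math.radians(ads_fov) / 2)

zoom = t_hip / t_ads             # magnification visible in screenshots (~2.64)
mm0 = t_ads / t_hip              # 0% monitor match multiplier (~0.379)
angle_ratio = ads_fov / hip_fov  # plain FOV division (~0.495)

print(round(zoom, 2), round(mm0 * 100, 1), round(angle_ratio * 100, 1))
```

Neither value is wrong; they answer different questions (screen-distance match at the crosshair versus plain angle division), which is the whole debate in this thread.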

8 hours ago, vrnvorona said:

So, as I am stupid: is there any eye FOV at which 0% MM is 1:1?

And if there is, how do I need to match sens? Help me please

You would need to be sitting infinitely far away to match eye FOV to 0 FOV, which isn't possible. The only way to have your desktop matched is to have a curved monitor.

