Same game, monitor size change, different aim - can you help?


Recommended Posts

2 minutes ago, Drimzi said:

Nope, just leave the sens alone, although leaving it alone is the same as 0% scaling.

So you're changing your cm/360 on the same FOV, but it still works? I've lost it now but I'll take your word for it, I'm never going to use this anyway :o


Let's go from a 10" monitor to a 20" monitor. Both have 1920x1080 pixels. The first assumption would be to match the mouse distance it takes to move the cursor from one side to the other, i.e. to move 1920 pixels. This is what the calculator does. Let's extrapolate this idea to a 1" screen and a 1000" screen. If the same mouse movement results in the cursor moving 1" in one case and 1000" in the other, will it feel the same? Personally, it wouldn't to me.

Instead of matching the quantity of pixels displaced, you want to match the distance displaced. Figure out how big a pixel is.

(rounded for simplicity)

10" monitor:  Every pixel is 0.0045"

20" monitor:  Every pixel is 0.0090"

If it takes 1" of mouse movement to move 1920 pixels on the 20" monitor, then 1" of mouse movement correlates to 17.4" of cursor movement. This is a ratio of 1 : 17.4, or a gain of 17.4. It doesn't matter what units you measure in, whether mm, cm, inches, or kilometers: 1 unit of mouse movement correlates to x units of cursor movement. This is what you convert, not the pixels displaced. It would become inconvenient to move the cursor from the start menu to the system tray if the screen were 1000" wide, but that has nothing to do with the sensitivity of the mouse. For convenience, you can change the sensitivity to reduce that distance, but that is just preference.
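As a rough sketch of that arithmetic in code (assuming both panels are 16:9 at 1920x1080, and that 1" of mouse movement sweeps the full 1920 pixels, as in the example; the helper name is mine):

```python
import math

def panel_width_in(diagonal_in, aspect_w=16, aspect_h=9):
    """Physical width of a panel, from its diagonal size and aspect ratio."""
    return diagonal_in * aspect_w / math.hypot(aspect_w, aspect_h)

for diagonal in (10, 20):
    width_in = panel_width_in(diagonal)   # physical width in inches
    pixel_in = width_in / 1920            # physical size of one pixel
    gain = 1920 * pixel_in                # cursor travel per 1" of mouse movement
    print(f'{diagonal}in monitor: pixel ~ {pixel_in:.4f} in, gain ~ {gain:.1f}')

# Prints roughly 0.0045 in / 8.7 and 0.0091 in / 17.4
# (the post rounds the second pixel size to 0.0090 in for simplicity).
```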

 

Let's take it to 3D. In the above explanation you can see that pixels don't matter. The monitor is acting like a window. You are keeping real-world distances. The same applies to 3D.

Imagine you are playing a game like CS:GO. You crop the aspect ratio from 16:9 to 4:3. Does the perceived sensitivity change? Is it like some kind of optical illusion, where replacing some of the rendered game world with blackness changes the perception of sensitivity? It isn't to me.

What if, instead of blackness, you just change to a 4:3 monitor? You effectively reduced the horizontal angle of view from 106.26 degrees to 90 degrees, but the perceived sensitivity did not change. If it feels the same, why would you scale the sensitivity by the change in angles, i.e. 90/106.26, which is what 'monitor distance match' does? So at the moment, the angle of view (or FOV) is changing, but the cm/360° is not, and it feels good. Maybe it is because the vertical angle of view is still 73.74°, and that's why the sensitivity didn't change? Wrong. Since CS:GO enforces a specific angle of view, let's just place black paper on the monitor to 'crop' it and reduce the effective vertical angle of view to a low number. Will the sensitivity appear to change? If you answer no, then you can see that the angle of view has no bearing on the sensitivity. The focal length remained constant while the angle of view changed, so the answer is the focal length.
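For reference, the 106.26° → 90° figure is just the geometry of cropping 16:9 down to 4:3 at a fixed focal length. A quick check (the helper name is mine):

```python
import math

def cropped_hfov_deg(hfov_deg, old_aspect=16/9, new_aspect=4/3):
    """Horizontal FOV left after cropping the width at a constant focal length."""
    half_width = math.tan(math.radians(hfov_deg) / 2) * (new_aspect / old_aspect)
    return math.degrees(2 * math.atan(half_width))

print(cropped_hfov_deg(106.26))  # ~90.0; the vertical 73.74 deg is untouched by the crop
```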

Here is 90° (4:3) at 20" and at 10", overlaid:


You can see that they are different. They have an identical angle of view, but the focal length is completely different.

 

If you convert from the 20" to the 10", then the 10" needs to reduce the angle of view to maintain the same focal length. 20" has 90° (4:3), 10" has 53.13° (4:3).


 

Here is 10" converted to 20". 10" has 90° (4:3). 20" has 126.87° (4:3).
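A small sketch of how those converted angles fall out if you hold the physical focal length fixed while the pixel size changes (function names are mine; 1440 px is the width of the 4:3 region on a 1080p panel, and the pixel sizes are the rounded values from above):

```python
import math

PIXELS_43 = 1440  # pixel width of the 4:3 region on a 1920x1080 panel

def focal_length_in(hfov_deg, pixel_size_in, width_px=PIXELS_43):
    """Physical focal length implied by a horizontal FOV on a given panel."""
    focal_px = (width_px / 2) / math.tan(math.radians(hfov_deg) / 2)
    return focal_px * pixel_size_in

def matched_hfov_deg(focal_in, pixel_size_in, width_px=PIXELS_43):
    """Horizontal FOV that reproduces a given physical focal length on another panel."""
    focal_px = focal_in / pixel_size_in
    return math.degrees(2 * math.atan((width_px / 2) / focal_px))

px10, px20 = 0.0045, 0.0090  # rounded pixel sizes from earlier

# 20" at 90 deg (4:3) converted to the 10" at the same focal length:
print(matched_hfov_deg(focal_length_in(90, px20), px10))   # ~53.13

# 10" at 90 deg (4:3) converted to the 20":
print(matched_hfov_deg(focal_length_in(90, px10), px20))   # ~126.87
```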


 

In the first scenario, where the game enforces 90° (4:3), the game sensitivity value doesn't change. However, the CPI did change, by a factor of 2, if you keep the same physical 2D sensitivity. This is because the 20" is twice the size of the 10". The focal length has also changed by the same factor. So the change in CPI results in the change in cm/360°, even though the FOV is the same. If you didn't want your cm/360° to change, you would have to use a lower angle of view on the smaller monitor (the second picture).

You can quickly test this for yourself by creating a custom half resolution, like 960x540, and then playing a game in windowed or fullscreen (with no scaling and override enabled). Compare no change in sensitivity, and double game sensitivity (or whatever sensitivity value is equivalent to half the cm/360°). See which one feels better.

Scaling the game sensitivity by the change in focal length is exactly what 0% monitor match does. The change in monitor size is also the same factor as the change in focal length. This suggests that 0% is the only way to convert game sensitivity/cpi, as it results in no change in 'sensitivity'. The relationship between the device and the display element remains constant. Any deviation from this will be personal preference.
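A minimal sketch of that conversion, assuming the reading of 0% above (cm/360° scales with the physical focal length; the function name and example numbers are mine):

```python
def convert_cm360_0pct(old_cm360, old_focal_in, new_focal_in):
    """Scale cm/360 by the change in physical focal length (0% match as described above)."""
    return old_cm360 * new_focal_in / old_focal_in

# Example: 20" at 90 deg (focal ~6.48") converted to 10" at 90 deg (focal ~3.24").
# The focal length halves, so the cm/360 halves too, i.e. the sensitivity/CPI doubles.
print(convert_cm360_0pct(34.6, 6.48, 3.24))   # 17.3 (34.6 cm/360 is an arbitrary example)
```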

Many will have a personal preference that overrides 0%. For example, if the screen were incredibly wide, such as 32:9, making the distance from one end to the other double that of a 16:9, then they may prefer to scale their sensitivity by a factor of 2 in order to keep the hand distance the same when going from the start menu to the taskbar, closing browser tabs, etc. They distance matched, which requires a change of device/mouse sensitivity (this also applies to 3D). Many may also prefer to amplify their sensitivity for aim down sights/scopes, so they don't have to scale their input proportionately with the change in zoom and image curvature. 0% usually feels too slow in this case because you are directly comparing two different focal lengths without the distance between you and the reference point changing to cancel out the perceived zoom, whilst simultaneously using the same hand movement before and after the zoom. The target size, the distance between the target and the crosshair, the target movement speed, etc., all scale proportionately with the focal length, which means you also need to scale your input with the focal length to land the flick or track the movement (and then there's the difference in eccentricity/curvature that results in that snappy/sluggish feeling and different diagonal trajectories).

Due to these preferences to scale sensitivity instead of input, other methods like 'monitor distance match' can be useful. I think Viewspeed v2 is also useful; the feeling is kind of comparable to zooming in while having the camera dolly in the opposite direction of the zoom. If you zoom in from one end of the spectrum to the other whilst simultaneously moving the mouse in circles, it looks like the sensitivity is matched perfectly. But when it comes to actually using Viewspeed v2, it results in worse aim performance (personally) in exchange for feeling more consistent in that exact moment, because there is no sudden feeling of slowdown or speed-up while using identical mouse movement before and after the zoom.

 

As for focal length, I believe it is found by doing the following:

(SquarePixels/2) / tan(SquareDegrees * pi/360) = focal length in pixels?

It's the same as this graphical fov calculator: https://teacher.desmos.com/activitybuilder/custom/5a61dd34fafbd40a25416e02#preview/d123ef39-8694-4760-af7d-c18c936ce79d

Scale pixels by the physical dimension of a pixel, and you will see that the above cases have identical focal lengths despite very different angles of view.

10" 90 (4:3) has 720 pixels focal length, 720 * 0.0045 = 3.24"

20" 126.87 (4:3) has 360 pixels focal length, 360 * 0.0090 = 3.24".

Edited by Drimzi

Maybe I missed something, but everyone says changing aspect ratio affects sensitivity.

 

So if I go from a 17-inch 4:3 monitor to a 24.5-inch 16:9 monitor with 1600 DPI, how does the math work?

Is it still 17/24.5 × 1600? Or do I have to account for the aspect ratio change?

  • Wizard
31 minutes ago, KMHTech said:

So I had a DPI of 800 (Overwatch); would going from the 25" AW2518HF to the 27" VG279QM require a DPI change for 1:1?

For it to be the same sensitivity in terms of 360 distance, no.

For it to physically match the old sensitivity, yes. This means that if you directly overlay your new 27" on your 25", then moving the crosshair to, for instance, the edge of the 25" takes the same mouse movement on the 27" to reach the point where the 25" screen ends on the 27".

However, going from 25" to 27" is about the same angular size difference as moving 2" back. Personally I like to preserve the 360 distance, but the calculator lets you enter different monitor sizes and will convert between them (unless you use 360 distance as matching method).
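For the numbers in the quote, a rough sketch of the physical-match case, assuming both panels are 16:9 and taking the nominal 25" and 27" sizes at face value (per the focal-length reasoning earlier in the thread, cm/360° scales with monitor size, so DPI scales the other way at a fixed in-game sensitivity):

```python
old_dpi, old_size_in, new_size_in = 800, 25, 27
new_dpi = old_dpi * old_size_in / new_size_in   # DPI drops as the screen (focal length) grows
print(round(new_dpi, 1))   # ~740.7 DPI on the 27" to physically match 800 DPI on the 25"
```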

