
Perceived sensitivity


Recommended Posts

Just now, vrnvorona said:

It can't be the solution because it is inconsistent.

I understand this. However, if we're trying to find what feels natural to an average person when transitioning between FOVs (hipfire -> ADS, or whatever the FOV change is), my hunch is that mouse acceleration 'feels' the most natural even though it's the most inconsistent for aiming, so our solution may not simply be one value out of a function.

19 hours ago, Drimzi said:

1:1 monitor match (For 16:9 - 56.25% horizontal, 100% vertical) felt most accurate, but only out of the sample size I chose, which was 0%, 1:1, 4:3, 1:1 Diagonal, Viewspeed v2. I didn't test arbitrary scaling between these. Something can feel good until you truly feel something better, so hopefully CaptaPraelium's current ideas work out and he creates something that everyone can agree upon.

For now, you can calculate sensitivity values for your chosen fov range and test it out with the script and see what you like best. 1:1 felt most balanced for me out of the sample size, and I chose it as the formula to use as it is incredibly easy to verify if a sensitivity is correct, because all fovs require the same distance to rotate to the top of the screen. So even if a game is not added to this calculator, or a games ADS is not supported, I can change the sensitivity or DPI myself until it rotates to the top of the screen.

I hate to be naive here, but could you explain the cases tested, I don't understand what they're referring to from the calculator:

0% I assume is 0% monitor distance,

1:1 no clue,

4:3 no clue,

1:1 diagonal no clue,

Viewspeed v2 from your formula, was this intended for 3d -> 3d conversion? I thought it was for taking a 3d -> 2d???


There's a reason basically every operating system has it on. Push mouse faster, makes mouse go faster. It's very intuitive. Acceleration works entirely programmatically so theoretically speaking you could be accurate with it. If you can control not only the distance of your hand movement, but also the speed, and acceleration, it could work.
Thing is, you want to remove barriers to accuracy. Have you ever compared PC shooter gameplay, to console gameplay of the same game, and been like wow, mouse aim makes this completely different? I mean we all know controllers are less accurate but, think of why they're less accurate - because not only do you have to control the distance of the stick throw, but the speed and acceleration. That extra effort is why they're so much less efficient, and it's the same extra effort required to use acceleration.

If I had a trackpad though, where I could completely control the acceleration algorithm....Don't get me started.

TL;DR Acceleration is like using a controller.

1 hour ago, CaptaPraelium said:

There's a reason basically every operating system has it on. Push mouse faster, makes mouse go faster. [...] TL;DR Acceleration is like using a controller.

Yeah, a controller though is not that accurate because it has a much longer stopping point. You control the speed of the turn, not the distance itself.

On 03/04/2018 at 3:33 AM, Skidushe said:

I hate to be naive here, but could you explain the cases tested, I don't understand what they're referring to from the calculator: [...]

Since this calculator uses a percentage of the user's horizontal resolution, for 16:9:

1:1 = 56.25%

4:3 = 75%

16:9 = 100%

Viewspeed V2 handles 3d to 3d as well, as it doesn't return the same 360 distance for all fovs.

 

edit: To find percentage values:

1:1 = 1080 / 1920 * 100 = 56.25%

4:3 = (1080 * 4/3) / 1920 * 100 = 75%

1:1 Diagonal = (1080 * sqrt(2)) / 1920 * 100 = 79.55%

16:9 = (1080 * 16/9) / 1920 * 100 = 100%

21:9 = (1080 * 64/27) / 1920 * 100 = 133.33%
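For anyone who wants to derive these percentages for another resolution or ratio, the arithmetic above can be sketched in a few lines of Python (my own snippet, not from the post):

```python
import math

# assumes a 1920x1080 screen, as in the examples above
WIDTH, HEIGHT = 1920, 1080

def match_percent(distance_px):
    # convert a match distance in pixels to a percentage of the horizontal resolution
    return distance_px / WIDTH * 100

print(match_percent(HEIGHT))                 # 1:1          -> 56.25
print(match_percent(HEIGHT * 4 / 3))         # 4:3          -> 75.0
print(match_percent(HEIGHT * math.sqrt(2)))  # 1:1 diagonal -> ~79.55
print(match_percent(HEIGHT * 16 / 9))        # 16:9         -> 100.0
print(match_percent(HEIGHT * 64 / 27))       # 21:9         -> ~133.33
```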

Edited by Drimzi
12 hours ago, Skidushe said:

I hate to be naive here, but could you explain the cases tested, I don't understand what they're referring to from the calculator: [...]

They're aspect ratios. So like "16:9 match" would be 100% for most people. 1:1 doesn't mean "1:1 mouse movements" but "1:1 aspect ratio". So it's just what 100% would be with a square screen. "1:1 diagonal" being distance from crosshair to the corner of a square screen. 


I have created a Python script to create the files for the CS FOV testing that @Drimzi made with his Lua script here: https://github.com/Skidushe/sens-fov-scalar

It should allow us to test the different monitor distance ratios efficiently rather than copy-pasting.

You should be able to put the Python file inside the cfg folder and run it from there if you want to do it that way too.

Edited by Skidushe

I've attached a CUE profile with the macro for executing the scripts. The toggle is the END key. Place the files in the cfg folder, exec the fov<min>+<max>.cfg, press END with this CUE profile, and you should see it start to work.

FOVTestingCorsair.cueprofile

You would have to change the number of times the 1 key is pressed on either side of the 2, as we can't do any pre-processing, but that's just a copy-pasting exercise. It's set up by default for the FOV range 15-100, which from my testing seems adequate. Note that there is a 'remove all delays' button if you're having trouble with the delay blocks.

Edited by Skidushe

With my script, I did 10% intervals from 0% -> 60% as 75% felt too fast:

0%: Feels really good on close targets, but far targets you need to hit are too slow

10%: Still feels really good on close targets, but far targets still sluggish to hit

20%: Far targets are still sluggish, but close range stopped feeling so good and felt uncanny 

30%: Nothing really felt that good; far targets were better, but close range was also compromised

40%: Everything stopped feeling too far away to hit when it was on screen or had just left FOV, and close range felt good

50%: Close seemed slightly compromised but far away felt good

60%: Far felt good but close range felt really disconnected

This leads me to believe that my best monitor distance ratio lies between 40% and 55%.


For your python script, you don't necessarily need a mouse-sensitivity.com subscription. You could do the math yourself to generate all the sensitivity values.

This should be enough to get you started. It should convert a CSGO sensitivity at the default FOV to a sensitivity value for any other FOV using the monitor match method.

(90 / output_fov) * inputs.sensitivity * conversionMethod

Here are various values for conversionMethod. They're in Lua.

Monitor match, using a percentage of the horizontal

math.atan(((inputs.percentage / 100) * (inputs.resolutionWidth / inputs.resolutionHeight)) * math.tan((math.pi * (360 * math.atan(3/4 * math.tan((math.pi * output_fov)/360)))/math.pi) / 360)) / math.atan(((inputs.percentage / 100) * (inputs.resolutionWidth / inputs.resolutionHeight)) * math.tan((math.pi * (360 * math.atan(3/4 * math.tan((math.pi * 90)/360)))/math.pi) / 360))

Monitor match, using a coefficient of the vertical like Battlefield Uniform Soldier Aiming

math.atan(inputs.coefficient * math.tan((math.pi * (360 * math.atan(3/4 * math.tan((math.pi * output_fov)/360)))/math.pi) / 360)) / math.atan(inputs.coefficient * math.tan((math.pi * (360 * math.atan(3/4 * math.tan((math.pi * 90)/360)))/math.pi) / 360))

Zoom ratio. Same as 0% monitor match.

math.tan((math.pi * (360 * math.atan(3/4 * math.tan((math.pi * output_fov)/360)))/math.pi) / 360) / math.tan((math.pi * (360 * math.atan(3/4 * math.tan((math.pi * 90)/360)))/math.pi) / 360)

Using 0 coefficient or percentage will return an error, so you need an if/else statement to use the zoom ratio code instead.

 

Probably best to avoid the reliance on mouse-sensitivity.com's calculator as any new proposed formula that needs testing will not be on the calculator.

 

Here is an example of the sensitivity conversion in Wolfram Alpha, using 3 sensitivity, 56.25% monitor match, 1920x1080, and converting to 40 4:3 HFOV.

(90/40) * 3 * (ArcTan[((56.25/100) (1920/1080)) Tan[(Pi ((360 ArcTan[(3/4) Tan[(Pi 40)/360]])/Pi))/360]] / ArcTan[((56.25/100) (1920/1080)) Tan[(Pi ((360 ArcTan[(3/4) Tan[(Pi 90)/360]])/Pi))/360]])
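For reference, here is a rough Python port of those Lua snippets, including the if/else fallback for 0% described above. This is my own sketch, so treat the function and variable names as made up; I've only checked it against the Wolfram Alpha example above.

```python
import math

def vfov_deg(hfov43_deg):
    # CSGO stores FOV as a 4:3 horizontal FOV; derive the vertical FOV from it
    return 2 * math.degrees(math.atan(0.75 * math.tan(math.radians(hfov43_deg) / 2)))

def converted_sens(sens, out_fov, percentage=56.25, width=1920, height=1080, base_fov=90):
    # the monitor match point, as a fraction of the half-height scaled by aspect ratio
    k = (percentage / 100) * (width / height)
    half_out = math.radians(vfov_deg(out_fov)) / 2
    half_base = math.radians(vfov_deg(base_fov)) / 2
    if k == 0:
        # 0% monitor match: atan(0)/atan(0) is undefined, so use the zoom ratio instead
        ratio = math.tan(half_out) / math.tan(half_base)
    else:
        ratio = math.atan(k * math.tan(half_out)) / math.atan(k * math.tan(half_base))
    return (base_fov / out_fov) * sens * ratio

# the Wolfram Alpha example: 3 sens, 56.25% match, 1920x1080, converting to 40 FOV
print(round(converted_sens(3, 40), 3))  # -> ~2.795
```

Conveniently, 56.25% on a 16:9 screen makes k exactly 1, which is the 1:1 match described earlier.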


I'm a foreigner, and reading mathematical theory is very hard for me.

Can anybody just give a conclusion: what percentage of monitor distance matching is best for FOV changes? Thanks a lot.

In addition: with the same sensitivity (for example 40), 80 FOV TPP and 80 FOV FPP both take about 40 cm/360, but TPP feels faster than FPP. Why?

Is there a method for FPP-to-TPP sensitivity translation?

7 hours ago, Drimzi said:

For your python script, you don't necessarily need a mouse-sensitivity.com subscription. You could do the math yourself to generate all the sensitivity values. [...]

How does this work on games reliant on FOV like PUBG? Also, I'll put that in the script tonight, it should make it a hell of a lot faster too, thanks :)

1 hour ago, Skidushe said:

How does this work on games reliant on FOV like PUBG? Also, I'll put that in the script tonight, it should make it a hell of a lot faster too, thanks :)

That code is just for CSGO since that's what the script is for. The (90/40) part is the multiplier for FOV-dependent games; usually they just do default FOV / user FOV. The FOV type determines the monitor match %. Since 90 and 40 are the 4:3 HFOV values, it results in a 4:3 monitor match. So if you did want a 4:3 monitor match in CSGO, you would just change the FOV and not touch the sensitivity at all, since the game already does it.
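In other words, for an FOV-dependent game the built-in multiplier is just the ratio of the two FOVs. As a trivial sketch (hypothetical helper name, not game code):

```python
def fov_dependent_multiplier(default_fov, user_fov):
    # FOV-dependent games usually scale sensitivity by default FOV / user FOV;
    # when both are 4:3 HFOV values (like 90 and 40) this gives a 4:3 monitor match
    return default_fov / user_fov

print(fov_dependent_multiplier(90, 40))  # -> 2.25
```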

Edited by Drimzi
3 hours ago, Skidushe said:

I've changed the script to now use the formula provided, I thought they'd be some guarded secret, but here we are: https://github.com/Skidushe/sens-fov-scalar

It's now basically instant at creating the files, and you don't need to install any requirements now, just run the file

Cheers, you just made it very easy to test different formulas.

 

Tested with 25-165 FOV, 140 increments, 5-15 second zoom durations. Viewspeed v2 actually felt the best tbh. The zoom transition feels a little weird but I think that is just because it is scaling by incrementing the fov value rather than the actual zoom factor, so the zoom transition speed quickly slows down, which can throw off the perceived sensitivity scaling.

 

I attached an edit of the script that I used to quickly test different formulas. It will generate from the desktop sensitivity.

Install Python 3.6.5, download this script, right-click the script and click 'Edit with IDLE', then add/remove the comment marks '#' before the newSens variables to change which formula the script will use.

creator.py


I don't know how complex you can get with your Lua scripts, but you can normalise the zooming by making the size of the intervals between zooms get larger as you zoom in.

If you halve the FOV you get a 2x zoom; if you quarter the FOV you get a 4x zoom, so you can quickly see that zoom is directly proportional to k/newFOV, where k is the base FOV (in our case 106.26). If you differentiate to find the rate of change of zoom, you can invert this to get the gap size between intervals.
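One way to read that suggestion in code (my own sketch; 106.26 is the 16:9 horizontal FOV quoted above for CSGO's default fov 90) is to step the zoom factor geometrically, so every step changes the zoom by the same ratio, and derive each FOV from it:

```python
BASE_FOV = 106.26  # 16:9 horizontal FOV at CSGO's default fov 90
MIN_FOV = 15.0
STEPS = 20

# zoom factor relative to the base FOV: zoom = BASE_FOV / fov
max_zoom = BASE_FOV / MIN_FOV

# geometric spacing: each step multiplies the zoom by a constant ratio,
# so the zoom-factor intervals grow as you zoom in
zooms = [max_zoom ** (i / (STEPS - 1)) for i in range(STEPS)]
fovs = [BASE_FOV / z for z in zooms]

print(round(fovs[0], 2), round(fovs[-1], 2))  # -> 106.26 15.0
```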

2 hours ago, Drimzi said:

Cheers, you just made it very easy to test different formulas. [...]

I'd have to agree with you on this. After some quick testing, Viewspeed V2 1:1 feels to me like it keeps the required movement most balanced relative to the perceived viewspeed, but I still need to test some more to be sure. Honestly though, I don't think there's enough difference between Viewspeed V2 and any match % between ~45-75 that it would severely impact my performance in-game. However, with any formula other than Viewspeed V2 I still feel like I'm fighting against the zoom, so to speak. At lower match % it feels almost like there's some resistance on the mouse when zoomed in, and the other way around when it zooms out; higher % aren't as bad to me, but they still seem to make my aiming less "stable" the more the FOV zooms in and out.

The only one that is straight up unusable for me is zoom ratio; the difference in perceived speed between FOVs is so large that I imagine even something like tracking a player with hipfire in PUBG, then zooming in and taking a shot with a 4x/8x, and finally switching to a holographic for follow-up shots would be quite a challenge without practicing with it for weeks. Probably best to leave PUBG out of the comparisons though; I have a feeling that not realizing there was a vertical sensitivity reduction was one of the main reasons I started doubting Viewspeed V2 in the first place ^^

What are your current go-to maps for this kind of testing btw? I've just been trying to find which formula feels most natural when tracking moving bots/targets while standing still, or while moving around/being moved by a platform, as well as just shooting bots on maps like Aim Botz - Training or Training Center 1.5c, and that's probably not the most ideal way to do it.


Is the “viewspeed v2” option in the calculator the same thing as “viewspeed v2 1:1” in the python script?

Anyways, after a few rounds of testing from 30-130 FOV, I found that monitor match 1:1 and viewspeed v2 1:1 felt the best in terms of how easy it was to flick from target to target without the feeling that mouse movements were too fast or too slow depending on the current FOV, but I couldn't quite tell the difference between the two.

I've been using monitor match 0%/zoom ratio to convert my desktop sensitivity to other games for the past few months, and having also tested that method in the python script, I found that it felt absolutely horrid to aim with. Flicking from target to target felt really inconsistent, unlike monitor match 1:1 and viewspeed v2.

The map I used to test these methods is "training_aim_csgo2", using the intensive fast aiming option with a 0.5 second respawn time on the targets.

 

Edited by 1TimePurchase
17 minutes ago, 1TimePurchase said:

Is the "viewspeed v2" option in the calculator the same thing as "viewspeed v2 1:1" in the python script? [...]

Viewspeed V2 is close to 75% MM, which in turn is what CSGO's default zoom sens is, and is probably what most PC gamers are used to. Coming from hipfire, 0% would make movement feel slower than 75%, which may be why it felt "horrid" to you. 0% may not feel better when matching at hipfire, but it is verifiably more accurate at the center of your crosshair, so in any game where you are aiming down a scope, like Call of Duty, it is objectively superior for hitting targets in the center of your screen. Can you be better at 75% than 0%? You sure can, and many people are; that doesn't mean it's better.

I think 75% is a better "all-around" sensitivity if you are matching a bunch of different hipfire FOVs, as there is usually less difference in cm/360 between them compared to 0%. The question is: why would you not just match hipfire FOV between games and eliminate this problem in the first place? Even so, 0% is always going to be more accurate at the center of the screen than 75%.

For consistency purposes, it really is best to just use the same FOV for every game if you can. Sticking to a match percentage is helpful if you can't, but then your movement is completely different. I honestly do get the appeal of just running a game's default FOV and using one match for every game, but it's probably not the best for your aim.

Take Far Cry 5 at its default FOV:

[attached image: calculator result for Far Cry 5 at its default FOV]

 

 

62 cm is going to feel absurdly slow compared to the CSGO sens. This is just one example of many where not trying to match hipfire FOV is a bad idea.

Edited by Bryjoe
3 hours ago, iBerggman said:

What are your current go-to maps for this kind of testing btw? I've just been trying to find which formula feels most natural when tracking moving bots/targets while standing still, or while moving around/being moved by a platform, as well as just shooting bots on maps like Aim Botz - Training or Training Center 1.5c, and that's probably not the most ideal way to do it.

Yeah, just use intensive fast aiming and play around with the settings until it's comfortable. Fast aiming involves both flicking to various points on the monitor and micro-adjustments around the crosshair, without the hassle of missing your target.

Edited by potato psoas
8 hours ago, Drimzi said:

Cheers, you just made it very easy to test different formulas. [...]

I've added your script onto the same github repo


I don't think you need a script to change the direction of the bindings. You just need to program creator.py so that the second-to-last execs change the direction. Then you can just have a toggle set to constantly repeat keypress 1 (or any key, actually; that would need to be edited in the Python).

...And I did do that, and I have the FOVs scaling now, but I'm finding that the sensitivity goes erratic every so often. Does this happen to anyone else, or is it smooth for you?

Edited by potato psoas
