
Difference between Windows pointer speed and in-game sensitivity



Hi, I don't understand the difference between Windows pointer speed and in-game mouse sensitivity (if there is any).

Note: I'm NOT asking about the well-known high DPI and low sensitivity discussion, I understand the difference between hardware-dependent DPI and software-dependent sensitivity. What I don't understand is the difference between the speed set in Windows and the one set in game settings.

By the way, I'm using PMW3366.

The questions:

1) I understand that any setting higher than 6 in Windows introduces pixel skipping every so many units you move the mouse. But why is setting it to a 0.5x multiplier (4, I think) bad? It just takes moving the mouse twice as far to cover the same area, right? It shouldn't skip anything.

2) If lowering the Windows speed below 6 is bad, why is increasing native DPI and lowering in-game sensitivity good? I understand that higher DPI detects more detail, but if lowering sensitivity is just as bad as lowering the speed in Windows (in other words, skipping every x counts, depending on the setting), how can it be better?

3) In Windows, 1:1 tracking is at 6/11. What setting is it in games? Some games use numbers between 0.001 and 20, but others use 1 to 100. If a game uses settings between 1 and 100, the number cannot represent the multiplier, can it? Otherwise it would be multipliers between 1x and 100x, which seems ridiculous. Do games have a 1:1 setting, or do they work differently?

In this reddit thread (https://www.reddit.com/r/GlobalOffensive/comments/1x2a3l/here_is_how_to_get_the_most_out_of_your_mouse/) the poster claims (under "In-game sensitivity") that there IS a difference between Windows and game sensitivity because Windows does some number rounding which doesn't happen with the in-game sensitivity multiplier. Is this true? If it is, then:

1) Why doesn't Windows do it the same way games do (if it's just plain better)?

2) It says:

"For example, say you are using 3/11. Your windows multiplier is 0.25. If you move the mouse by one pixel constantly for 3 counts, 3 * 0.25 = 0.75, and the pointer is not moved at all. You then move another 2 counts, and 2 * 0.25 = 0.5 + the 0.75 from last time = 1.25. Now the pointer moves by 1. This happens because when Windows applies the scaling factor, it can only pass through to the game a whole (integer) number of mouse movement and it holds back or delays a remainder which is added into the next movement."

I don't see any other way to do it. If you set it to 6 and change the DPI accordingly to get the same effective speed, how would it be any different? If you don't move the mouse far enough to move the cursor by one pixel, it won't move, right? It can't move less than one pixel. Am I missing something here? I'd really like to understand this better.
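
For what it's worth, here is a minimal sketch of the remainder-carry behaviour the quote describes (the 0.25 and 0.5 multipliers and the one-count-per-poll movement are just example numbers, not Windows' actual code). The cursor still only moves in whole-pixel steps; below 6/11 it simply moves on fewer of the counts:

def scale_counts(counts, multiplier):
    # Scale raw mouse counts to whole cursor pixels, carrying the fractional remainder.
    carry = 0.0
    moves = []
    for c in counts:
        scaled = c * multiplier + carry
        whole = int(scaled)        # the cursor can only move by whole pixels
        carry = scaled - whole     # the leftover fraction is added to the next count
        moves.append(whole)
    return moves

# Five polls of 1 count each (positive movement only, for simplicity):
print(scale_counts([1, 1, 1, 1, 1], 0.25))  # 3/11 -> [0, 0, 0, 1, 0], 1 pixel total
print(scale_counts([1, 1, 1, 1, 1], 0.5))   # 4/11 -> [0, 1, 0, 1, 0], 2 pixels total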

Any help is appreciated, thanks!

 

Wizard
40 minutes ago, James said:

1) I understand that any setting higher than 6 in Windows introduces pixel skipping every so many units you move the mouse. But why is setting it to a 0.5x multiplier (4, I think) bad? It just takes moving the mouse twice as far to cover the same area, right? It shouldn't skip anything.

It's always better to do this in the game, if the game supports the sensitivity you want.

 

40 minutes ago, James said:

2) If lowering the Windows speed below 6 is bad, why is increasing native DPI and lowering in-game sensitivity good? I understand that higher DPI detects more detail, but if lowering sensitivity is just as bad as lowering the speed in Windows (in other words, skipping every x counts, depending on the setting), how can it be better?

Lowering sensitivity in games does not behave like lowering Windows Pointer Speed. It simply makes the "grid" your mouse snaps to finer and finer.
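
As a rough illustration only: a 3D game turns each count into a small rotation, so lowering the in-game sensitivity shrinks the step size instead of dropping counts. The 0.022 degrees per count below is the Source-engine convention and is only an assumed example figure:

YAW_PER_COUNT_DEG = 0.022  # assumed degrees of yaw per count at sensitivity 1.0

def yaw_step(sensitivity):
    # Rotation produced by a single mouse count; every count contributes.
    return sensitivity * YAW_PER_COUNT_DEG

for sens in (2.0, 1.0, 0.5, 0.1):
    print(f"sensitivity {sens}: {yaw_step(sens):.4f} degrees per count")
# sensitivity 0.5 gives 0.0110 degrees per count: a finer grid, nothing skipped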

 

40 minutes ago, James said:

3) In Windows, 1:1 tracking is at 6/11. What setting is it in games? Some games use numbers between 0.001 and 20, but others use 1 to 100. If a game uses settings between 1 and 100, the number cannot represent the multiplier, can it? Otherwise it would be multipliers between 1x and 100x, which seems ridiculous. Do games have a 1:1 setting, or do they work differently?

 

They all work differently, but making the calculator figure out the 1:1 setting is something I have planned to do.
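
As a hedged illustration of why the raw number can't be read as a multiplier: it only has meaning through the game's own counts-to-rotation formula. The sketch below assumes a Source-style game (yaw = counts x sensitivity x 0.022 degrees); a game with a 1 to 100 slider would use a completely different internal scale, which is why the numbers aren't comparable across games:

def cm_per_360(dpi, sensitivity, yaw_per_count_deg=0.022):
    # Physical mouse distance for a full 360 degree turn (assumed Source-style formula).
    counts_per_360 = 360.0 / (sensitivity * yaw_per_count_deg)
    inches = counts_per_360 / dpi
    return inches * 2.54

print(round(cm_per_360(dpi=800, sensitivity=2.0), 1))  # roughly 26.0 cm per full turn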

 

40 minutes ago, James said:

In this reddit thread (https://www.reddit.com/r/GlobalOffensive/comments/1x2a3l/here_is_how_to_get_the_most_out_of_your_mouse/) the poster claims (under "In-game sensitivity") that there IS a difference between Windows and game sensitivity because Windows does some number rounding which doesn't happen with the in-game sensitivity multiplier. Is this true? If it is, then:

1) Why doesn't Windows do it the same way games do (if it's just plain better)?

2) It says:

"For example, say you are using 3/11. Your windows multiplier is 0.25. If you move the mouse by one pixel constantly for 3 counts, 3 * 0.25 = 0.75, and the pointer is not moved at all. You then move another 2 counts, and 2 * 0.25 = 0.5 + the 0.75 from last time = 1.25. Now the pointer moves by 1. This happens because when Windows applies the scaling factor, it can only pass through to the game a whole (integer) number of mouse movement and it holds back or delays a remainder which is added into the next movement."

I don't see any other way to do it. If you set it to 6 and change the DPI accordingly to get the same effective speed, how would it be any different? If you don't move the mouse far enough to move the cursor by one pixel, it won't move, right? It can't move less than one pixel. Am I missing something here? I'd really like to understand this better.

For the rest: as long as the game supports raw input, use that and forget about the WPS setting. :)


Hi, thanks for responding! :)

About raw input: I'm not aware of any games I play actually supporting it. I do just set WPS to 6 and configure the rest with DPI and in-game sensitivity, but I'm wondering how they work.

I guess the answer I was looking for is in the sentence you wrote: "Lowering sensitivity in games does not behave like lowering Windows Pointer Speed." I want to understand how they differ from each other.

In this post you made years ago (https://www.mouse-sensitivity.com/forum/topic/5-how-sensitivity-works/) you say that 0.5 pixels per count feels smoother than 1 pixel per count. Why is that so, if the pointer cannot move half a pixel? It may be slower, but the steps it makes are identical: one pixel per step.

Also, does this have anything to do with games using mouse input to calculate degrees of rotation rather than number of pixels (like moving pointer in Windows, for example)?

Thanks!

 

17 minutes ago, Drimzi said:

I don't think there is anything wrong with WPS 4 and below. It's not EXACTLY the same as 6/11, but that doesn't make it bad or even matter.

If you double the DPI exactly and drop to 4/11, the same number of counts is still accepted by the PC, and thus your cursor moves the same number of pixels.

The difference is that the mouse is more sensitive. The sensor uses a finer matrix when the DPI is increased, so it picks up finer movement as actual movement and sends a count. By the time you have moved the mouse the distance that would have produced one count at the old DPI, two counts have been sent, and Windows has accepted one of them and dropped the other.

With the mouse counts all being queued and then sent off every polling interval, in the end there will be pretty much no difference.

There is also the issue of the set DPI versus the actual DPI, which greatly depends on the mouse model. Setting a mouse to 800 may not be exactly double 400, and so on.

In the end, none of this really matters anyway. You get used to the mouse, and you don't need to compare it to 6/11.
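
A quick numeric sketch of the point about doubling DPI while halving the Windows multiplier (an ideal sensor and the same carry-style scaling as the earlier sketch are assumed, purely for illustration):

def cursor_pixels(counts, multiplier):
    # Total whole-pixel cursor movement, carrying the fractional remainder.
    carry, total = 0.0, 0
    for c in counts:
        scaled = c * multiplier + carry
        whole = int(scaled)
        carry = scaled - whole
        total += whole
    return total

# The same ~1 cm hand movement (1 cm is about 0.394 inches):
counts_400dpi = [1] * 157   # ~157 counts at 400 DPI
counts_800dpi = [1] * 315   # ~315 counts at 800 DPI

print(cursor_pixels(counts_400dpi, 1.0))  # 157 pixels at 400 DPI, WPS 6/11
print(cursor_pixels(counts_800dpi, 0.5))  # 157 pixels at 800 DPI, WPS 4/11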

What I want to know is whether there is a difference between how WPS and in-game sensitivity work. Do they both skip counts, or is there a difference?

1 minute ago, Drimzi said:

Are you talking about games like StarCraft that use a cursor? The in-game sensitivity works the same as WPS; it just uses all 20 WPS steps.

As for 3D games, they don't use pixels. You rotate in 3D space, and a frame gets rendered with the new position, so there are no issues with moving less than a pixel; the two are unrelated. Low game sensitivity is optimal.

I'm talking purely about 3D rotation games. So, high in-game sensitivity cannot cause pixel-skipping, correct?

1 minute ago, Drimzi said:

If you rotate more than what a pixel represents, then yes, you can 'skip' a pixel. If you raise the sensitivity so high that you can rotate to your screen edge in one count, then you skip half your screen's pixels.

Makes sense, thanks!
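
To put rough numbers on that last point: a count starts to 'skip' pixels once the rotation it produces exceeds the angle one pixel covers at the centre of the screen. The 1920-pixel width, 90 degree horizontal FOV and 0.022 degrees per count below are all just assumed example figures:

import math

def deg_per_pixel_at_centre(width_px, hfov_deg):
    # Angle covered by one pixel at the screen centre in a standard perspective projection.
    return math.degrees(2 * math.tan(math.radians(hfov_deg) / 2) / width_px)

def deg_per_count(sensitivity, yaw_per_count_deg=0.022):
    return sensitivity * yaw_per_count_deg

pixel_deg = deg_per_pixel_at_centre(1920, 90)   # ~0.060 degrees per pixel
for sens in (1.0, 3.0, 10.0):
    count_deg = deg_per_count(sens)
    verdict = "skips pixels" if count_deg > pixel_deg else "no skipping"
    print(f"sensitivity {sens}: {count_deg:.3f} degrees per count ({verdict})")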
