
Does a higher DPI reduce mouse sensor DPI variation, or is it all relative?


Solved by fortunate reee


29 minutes ago, lolngoway007 said:

it reduces latency and tightens the highs/lows (more consistent input lag/feel) when moving slow/fast.

Dependent on sensors tho, some have smoothing etc.

Won't make you a god, but appears optimal.

Okay, thanks. I think I wasn't too clear: I was talking in terms of sensor inaccuracy. The Razer Viper Mini has a DPI variation of 10%, so at 400 DPI you would have a maximum difference of ±40 DPI. However, if I (hypothetically) increased my DPI to something like 12800 DPI, would that mean I'd have a DPI variance of ±1280 DPI or ±40 DPI?
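For reference, a quick sketch of the arithmetic being asked about, assuming the spec'd variation really is a fixed percentage of whatever DPI you set (the premise of the question, not a verified fact about the Viper Mini):

# DPI variance as a percentage of the set DPI (assumed premise).
variation = 0.10                       # spec'd +/-10% variation
for set_dpi in (400, 12800):
    worst_case = set_dpi * variation
    print(f"{set_dpi} DPI -> up to +/-{worst_case:.0f} DPI")
# 400 DPI -> up to +/-40 DPI; 12800 DPI -> up to +/-1280 DPI.
# The absolute error grows with DPI, but the relative error stays 10%.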

  • Solution
39 minutes ago, lolngoway007 said:

it reduces latency and tightens the highs/lows (more consistent input lag/feel) when moving slow/fast.

Dependent on sensors tho, some have smoothing etc.

Won't make you a god, but appears optimal.
[screenshot: grafik.png]

Can't be bothered to type, so here is a screenshot that explains it very well.

tl;dr: don't get insecure because of that useless video.

--------------------

Plus, the CPI variation can depend on multiple factors, ranging from the manufacturer to the programming to your specific model.

You could/should check for accuracy at multiple different CPI levels. With most mice you could just as well adapt the CPI number in its software,

or simply adjust your in-game settings to match, making worrying about that redundant.
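A minimal sketch of that last compensation step (all numbers hypothetical; the idea is just to scale in-game sensitivity by nominal/measured CPI so the effective cm/360 is unchanged):

# If a CPI tester measures 440 when the mouse is set to 400 (+10%),
# scaling in-game sensitivity by nominal/measured cancels the error.
nominal_cpi = 400      # what the mouse is set to
measured_cpi = 440     # what the test actually measured (hypothetical)
ingame_sens = 2.0      # current in-game sensitivity (hypothetical)

corrected_sens = ingame_sens * nominal_cpi / measured_cpi
print(f"use ~{corrected_sens:.3f} in-game instead of {ingame_sens}")   # ~1.818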

32 minutes ago, fortunate reee said:

[screenshot: grafik.png]

Can't be bothered to type, so here is a screenshot that explains it very well.

tl;dr: don't get insecure because of that useless video.

--------------------

Plus, the CPI variation can depend on multiple factors, ranging from the manufacturer to the programming to your specific model.

You could/should check for accuracy at multiple different CPI levels. With most mice you could just as well adapt the CPI number in its software,

or simply adjust your in-game settings to match, making worrying about that redundant.

Help me understand, because this topic interests me and more understanding is better for me. So is TheNoobPolice saying:

- Higher DPI's initial movement will occur faster than low DPI at the same hand speed (my understanding).

- Then doesn't that mean the high DPI will start the movement and stop (after travelling the same distance) faster than low DPI, and therefore the "input lag", or whatever the term for it is, is lower/better with high DPI? (I want clarification on this.) Like running, say, CS:GO at 30 fps vs 999 fps: the higher FPS will initiate the movement faster and stop faster than 30 fps, even if it is the same distance. Higher fps = less input lag is my understanding.

All of the above is, of course, at the same cm/360 for high or low DPI.

I didn't understand his Usain Bolt example; I am unable to translate that into movement on the monitor. Any help would be great.

But I do get that what the OP was asking about (variation of DPI) is not what I was providing (higher DPI = lower input lag).
43 minutes ago, lolngoway007 said:

I didn't understand his Usain Bolt example; I am unable to translate that into movement on the monitor. Any help would be great.

You will always move the distance you move, so you can ignore that aspect completely.

Pretty sure he criticised the methods used in the video. This post relates to the smaller increments that higher CPI scans at, since you need "more movement" to start moving at lower CPI, and falsely mixing that up with input lag.

@TheNoobPolice sorry for dragging you into this, but I remembered your post on the accel Discord and had to post this here, and I don't want to give out false info.

6 hours ago, fortunate reee said:

You will always move the distance you move, so you can ignore that aspect completely.

Pretty sure he criticised the methods used in the video. This post relates to the smaller increments that higher CPI scans at, since you need "more movement" to start moving at lower CPI, and falsely mixing that up with input lag.

@TheNoobPolice sorry for dragging you into this, but I remembered your post on the accel Discord and had to post this here, and I don't want to give out false info.

Ahh, that is making sense to me: less movement to start the movement on screen, but that isn't the same as actual lower input lag. I suppose it is more a perceived input-lag reduction, since it feels like it moves quicker (at least on small movements/slower hand speeds).

Thanks for clarifying.

10 hours ago, lolngoway007 said:

Ahh, that is making sense to me: less movement to start the movement on screen, but that isn't the same as actual lower input lag. I suppose it is more a perceived input-lag reduction, since it feels like it moves quicker (at least on small movements/slower hand speeds).

Thanks for clarifying.

There wouldn't be any perceived change at all.

The way to illustrate this is to imagine, say, a "million DPI" mouse with a wait() function rather than a polling rate (so, hypothetically, a mouse that always updates the OS as soon as it has data, rather than having to wait for the next poll frame).

Moving even a tiny amount at a regular hand speed would send the first data to the OS within a microsecond, but in order for the sensitivity not to be insanely, unimaginably high, the minimum degrees turned for the first count's distance of input (i.e. the in-game sensitivity) would have to be so small that the initial movement would not even be visible to the human eye in terms of how far it has rotated the game world. Therefore it would be completely useless that some data had been received within a microsecond, because the point of a mouse in a game is a turning device used to rotate to a target location, and for that to be relevant it needs to be defined over a distance.

As long as one count's distance is less than one pixel's distance on screen (referred to as "pixel ratio" in the calculator on this site), there would never be any difference in the time to turn to any target represented by a different pixel on screen (i.e. one that you could see you needed to turn to) that your crosshair wasn't already over to begin with, no matter the DPI you use.
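A rough sketch of that pixel-ratio check (a back-of-envelope version, not the site's actual calculator; the DPI, cm/360, FOV, and resolution numbers are placeholders):

import math

# How many pixels does one mouse count move at the crosshair?
# Below 1, counts step in sub-pixel increments.
dpi = 1600
cm_per_360 = 34.64          # assumed cm/360
hfov_deg = 106.26           # assumed horizontal FOV
screen_width_px = 1920

counts_per_360 = dpi * (cm_per_360 / 2.54)      # counts in a full turn
deg_per_count = 360.0 / counts_per_360

# Angle subtended by one pixel at the screen centre (projection-aware).
focal_px = (screen_width_px / 2) / math.tan(math.radians(hfov_deg / 2))
deg_per_pixel = math.degrees(math.atan(1.0 / focal_px))

print(f"pixel ratio ~ {deg_per_count / deg_per_pixel:.3f} pixels per count")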

Edited by TheNoobPolice
2 hours ago, TheNoobPolice said:

There wouldn't be any perceived change at all.

The way to illustrate this is to imagine, say, a "million DPI" mouse with a wait() function rather than a polling rate (so, hypothetically, a mouse that always updates the OS as soon as it has data, rather than having to wait for the next poll frame).

Moving even a tiny amount at a regular hand speed would send the first data to the OS within a microsecond, but in order for the sensitivity not to be insanely, unimaginably high, the minimum degrees turned for the first count's distance of input (i.e. the in-game sensitivity) would have to be so small that the initial movement would not even be visible to the human eye in terms of how far it has rotated the game world. Therefore it would be completely useless that some data had been received within a microsecond, because the point of a mouse in a game is a turning device used to rotate to a target location, and for that to be relevant it needs to be defined over a distance.

As long as one count's distance is less than one pixel's distance on screen (referred to as "pixel ratio" in the calculator on this site), there would never be any difference in the time to turn to any target represented by a different pixel on screen (i.e. one that you could see you needed to turn to) that your crosshair wasn't already over to begin with, no matter the DPI you use.

Thanks, that has helped me understand your point more, especially when you talked about the movement being so small it wouldn't be visible to the eye even though it has moved on the screen. Cheers.

  • 3 weeks later...
On 12/18/2021 at 4:30 AM, TheNoobPolice said:

Moving even a tiny amount at a regular hand speed would send the first data to the OS within a microsecond, but in order for the sensitivity not to be insanely, unimaginably high, the minimum degrees turned for the first count's distance of input (i.e. the in-game sensitivity) would have to be so small that the initial movement would not even be visible to the human eye in terms of how far it has rotated the game world. Therefore it would be completely useless that some data had been received within a microsecond, because the point of a mouse in a game is a turning device used to rotate to a target location, and for that to be relevant it needs to be defined over a distance.

As long as one count's distance is less than one pixel's distance on screen (referred to as "pixel ratio" in the calculator on this site), there would never be any difference in the time to turn to any target represented by a different pixel on screen (i.e. one that you could see you needed to turn to) that your crosshair wasn't already over to begin with, no matter the DPI you use.

Looking at their chart, it seems like they're talking about a difference of up to 11.7 ms of latency (17.11 ms vs 10 ms, 400 DPI vs 1600 DPI). At faster speeds, the difference drops to 2.05 ms of latency. At 120 fps, that's about 0.24 to 1.32 frames; at 360 fps, 0.73 to 3.96 frames.

I think your theory makes sense, but has anyone tested it? It might be interesting to see if both DPIs stop at the same time.
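The frame conversion itself is just the latency delta divided by the frame time; a quick sanity check with the deltas quoted above (the exact frame counts depend on which delta from the chart you plug in):

# ms-to-frames conversion for the quoted latency deltas.
for delta_ms in (11.7, 2.05):
    for fps in (120, 360):
        frames = delta_ms * fps / 1000.0
        print(f"{delta_ms} ms at {fps} fps = {frames:.2f} frames")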

 

Just to bring up an additional point about higher DPI: have you seen the GWTEST results? (Look at the circle results.)
At 400 DPI, moving at 50 ips, it looks like the mouse is 75.05% accurate.
At 50 ips, the accuracy goes up to 91.67% @ 800 DPI, 92.65% @ 1600 DPI, and 94.17% @ 2400 DPI.

 

3 hours ago, RedX said:

Just to bring up an additional point about higher DPI: have you seen the GWTEST results? (Look at the circle results.)
At 400 DPI, moving at 50 ips, it looks like the mouse is 75.05% accurate.
At 50 ips, the accuracy goes up to 91.67% @ 800 DPI, 92.65% @ 1600 DPI, and 94.17% @ 2400 DPI.

By the way, I use 400 DPI (Roccat, PMW3389), and when I play a square map in osu!, there is a strong distortion to one side; my hand ends up far away from the familiar area and I bring it back every time. But I think my Shidenkai is already worn out, as in some places the sensor cannot read the surface. I wore out a Raiden the same way, and it missed a lot of movements.

Edited by Vaccaria
8 hours ago, RedX said:

Looking at their chart, it seems like they're talking about a difference of up to 11.7 ms of latency (17.11 ms vs 10 ms, 400 DPI vs 1600 DPI). At faster speeds, the difference drops to 2.05 ms of latency. At 120 fps, that's about 0.24 to 1.32 frames; at 360 fps, 0.73 to 3.96 frames.

I think your theory makes sense, but has anyone tested it? It might be interesting to see if both DPIs stop at the same time.

Just to bring up an additional point about higher DPI: have you seen the GWTEST results? (Look at the circle results.)
At 400 DPI, moving at 50 ips, it looks like the mouse is 75.05% accurate.
At 50 ips, the accuracy goes up to 91.67% @ 800 DPI, 92.65% @ 1600 DPI, and 94.17% @ 2400 DPI.

 

It doesn't need to be tested, because it's just simple math, and the way a mouse works is already completely understood. I added it to a spreadsheet to show the timings of different DPIs until the first data is sent (which is NOT input latency, because a target distance is not defined for the input).

Make a copy of the sheet and play around with the hand speed and DPI values and/or the system latency from other factors (such as GPU/CPU/OS latency, which are also not static and vary between runs, etc.), and you will get the calculated delay until "screen flash" if you tested with the same methodology as the Battle(non)sense or Optimum Tech YouTube channels. The only wiggle room in this is if the DPI reported by the mouse is not accurate.
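For anyone who doesn't want to open the sheet, a rough recreation of the same calculation (the hand speed and rest-of-system latency numbers here are made up, not the spreadsheet's):

# One count is 1/DPI inches of travel, so at a given hand speed the first
# count arrives after (1/DPI)/speed seconds. Added to a fixed rest-of-system
# latency, the totals for higher DPIs converge, as described below.
hand_speed_ips = 2.0       # assumed slow aiming movement, inches per second
other_latency_ms = 25.0    # assumed CPU/GPU/OS/display latency (not static in reality)

for dpi in (400, 800, 1600, 3200, 12800):
    first_count_ms = (1.0 / dpi) / hand_speed_ips * 1000.0
    total_ms = other_latency_ms + first_count_ms
    print(f"{dpi:>5} DPI: first count after {first_count_ms:.3f} ms, total ~{total_ms:.3f} ms")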

The reason higher and higher DPIs converge closer to the maximum value is not because the effect changes, but because the larger total system latency from other factors becomes more and more dominant in the calculation.

But in either case, none of these differences is meaningful when considering the delay to a target location, which is the only important factor for a pointing device.

Of course, there will be sensors that perform differently at different DPI settings with either more or less drift, and a VERY low DPI mouse would obviously have visibly less accurate angle tracking due to fewer data points with which to define the angle of any direction. This also has to do with the firmware implementation of the mouse/sensor, as some may be even worse at higher DPIs. But this is also not really meaningful when it is tested with a 2D cursor, because cursors cannot sub-pixel increment, so any input is necessarily truncated to the nearest pixel position, which doesn't happen in a 3D game. You would always get more visible "drift" with a cursor than what happens in a game.
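A toy model of that cursor-versus-game difference (an illustration of the truncation idea only, not how any particular OS implements cursor movement):

# A 3D game accumulates fractional rotation per count, while a 2D cursor
# snaps to whole pixels each update, so sub-pixel movement is lost.
counts_per_poll = [3] * 8    # hypothetical counts arriving each poll
scale = 0.4                  # pixels (or degrees) contributed per count

game_yaw = 0.0               # 3D game: float accumulation, nothing lost
cursor_x = 0                 # 2D cursor: integer pixels only
for c in counts_per_poll:
    game_yaw += c * scale
    cursor_x += int(c * scale)   # int(1.2) -> 1; 0.2 dropped every update

print(f"{game_yaw:.1f} vs {cursor_x}")   # 9.6 vs 8: the cursor visibly falls short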

Edited by TheNoobPolice
18 hours ago, Vaccaria said:

By the way, I use 400 DPI (Roccat, PMW3389), and when I play a square map in osu!, there is a strong distortion to one side; my hand ends up far away from the familiar area and I bring it back every time. But I think my Shidenkai is already worn out, as in some places the sensor cannot read the surface. I wore out a Raiden the same way, and it missed a lot of movements.

I don't think there is a GWTEST for that specific mouse, but the PMW3389 is also used by the MM710, Hati HT-M, XM1, Pulsefire Pro, and Pulsefire Surge, and they have tested the XM1 with upgraded firmware. Unless the Roccat has some issues, it might have similar results.

GWTEST results for the XM1:

400 DPI: 10 ips @ 91.22%, 30 ips @ 90.92%, 50 ips @ 87.11%.

800 DPI: 10 ips @ 91.06%, 30 ips @ 91.39%, 50 ips @ 90.89%.

So it seems that the PMW3389 performs better at 800 DPI than at 400 DPI.
