
High DPI issues on old Games / Engines



Hello,
 
Here is the problem, and I think the people here in the forum have the necessary knowledge to address it.
 
"CS:GO: FAIL 100%
Its legacy mouse mathematics creates dramatically different mousefeel at 400-vs-800-vs-1600-vs-3200dpi. Almost nobody likes above 800dpi in CS:GO because it feels wonky due to the math limitations of the old Source engine in the new high-Hz era, at least until Valve fixes the legacy math errors. In addition, sensitivity setting is only 2 decimal digits. We cannot recommend more than 800dpi with legacy Source engine games (including aimtrainers that uses the Source engine)." Source: Chief Blur Buster
 
 
Now my question is whether this is the case in all Half-Life engine games, since Source is based on that engine.
I am also not sure if this info has already been shared here, but I think it is very important to know.
Link to comment
  • Wizard
1 hour ago, Quackerjack said:
Its legacy mouse mathematics creates dramatically different mousefeel at 400-vs-800-vs-1600-vs-3200dpi. Almost nobody likes above 800dpi in CS:GO because it feels wonky due to the math limitations of the old Source engine in the new high-Hz era, at least until Valve fixes the legacy math errors. In addition, sensitivity setting is only 2 decimal digits.

This is just plain wrong; you can set the sensitivity with more decimals in the console or in the config file. There are no math limitations to using high DPI, and it does not feel wonky unless there's some kind of hardware issue on your computer with high DPI.

Link to comment

A gish gallop of nonsense.

Blur Busters has a history of making extremely simple premises sound extremely complex, using sophistry and hand-waving so that anyone who actually could debunk it would never have the time or inclination to. I mean, where do you start with this:

"At 60Hz using a 125Hz mouse, fewer mouse math roundoff errors occured(sic). But at 500Hz refresh rate and 8000Hz mouse, especially when multiplied by a sensitivity factor, can generate enough math rounding errors occurs in one second (in floats) to create full-integer pixel misalignments!"

Utter drivel I'm afraid.

Link to comment
  • Wizard
17 hours ago, TheNoobPolice said:

A gish gallop of nonsense.

Yeah, it doesn't even seem like they know what they are writing about:

"While Fortnite properly uses high-precision mouse mathematics -- unfortunately only two digits of sensitivity adjustment is provided, making it hard to transition between DPI settings. This does not scale as well to the 500 Hz refresh rate future with 8000 Hz gaming mice capable of silky-smooth ultra high DPI operation."

Fortnite does lack precision in the sensitivity settings, but this has absolutely nothing to do with the supported polling rates...

Some games do have issues with high polling rates and high DPI, but the polling rate issues have no connection to the sensitivity settings, and the high DPI issues are usually because the sensitivity doesn't go low enough. If decimal precision is the issue, you can (usually) fine-tune the DPI to make it better.

Link to comment
19 hours ago, DPI Wizard said:

This is just plain wrong; you can set the sensitivity with more decimals in the console or in the config file. There are no math limitations to using high DPI, and it does not feel wonky unless there's some kind of hardware issue on your computer with high DPI.

I'd like to crosspost this here, as it provides more insight on some observations.

Quote

For example, Jame uses 25600 dpi, 0.0313 sens (800 edpi), and his team finished #5 in winnings for 2022 and won the Intel Extreme Masters Rio Major 2022. So besides being difficult to configure, what happens if one goes beyond 800 dpi in source-based games?

That's a fantastic anecdote.  

With some really good hard-core configuring -- text editing the configuration file for many settings including raw input, ultra-precise sensitivity numbers, etc -- it's possible to make >800dpi usable in CS:GO.  But it's not really easy.  >800dpi feels simply awful most of the time in CS:GO so I'm pretty impressed he got 25600dpi to work great.

The mouse mathematics inside CS:GO becomes very wonky when you're far away from compensated 800 "edpi", so clearly this gamer found a sweet-spot magic 0.0313 sensitivity that eliminated mouse math errors at >800dpi by maintaining 800 edpi.  So that's my theory -- the mouse mathematics in CS:GO becomes accurate again at 800 edpi.

It's much like how CS:GO goes kinda wonky at >1000fps: CS:GO has both a sweet-spot frame rate range and a sweet-spot "edpi" range, and outside of those it goes wonky.  Mouseturns don't perform consistently across the entire hardware-DPI range at all sensitivity ranges, so the mouse mathematics in the old game aren't as good as in newer games.

But it's not impossible to fight against it and work around it with magic-number sensitivities that take some knowledge to calculate.

Keeping it to 800 edpi is definitely key, as that preserves training memory (also known as "muscle memory").  Syncing sensitivity across all games is not intuitive to most gamers at oddball DPI.

_____

Now, a deadly serious esports question.

Is there currently any software that can automatically sync "edpi" across all games and automatically recompute/rewrite configuration files every time you change mouse DPI?

Play at 800dpi, exit the game, change your mouse DPI higher and restart the software, and there's no change to your fast flicks.  Except your slow flicks improve.  Your mouseturns are still the same speed -- except your slow slews (sniper aiming, slow crosshair tracking, and more jitter-free slowturns with unaffected fast flickturns, etc.) start to feel massively better at >800dpi.

This would help train gamers on why high DPI is a gigantic boon, if configured properly.

There's a need for such edpi-synchronization software in the esports industry, to break out of the 400dpi-800dpi dregs without wrecking training memories (muscle memory).  

Choosing 800 edpi in an edpi system tray app while your mouse is at 3200dpi, and seeing all your games automatically sync (configuration files modified by the Automatic EDPI System Tray App, etc.) would be a giant boon.   No manual calculating, no edpi wizards, no wrecked muscle memories, etc.  Perfect sensitivity numbers in all of them.

An automatic edpi configurator would be a good compromise before my proposed High Definition Mouse Extensions API Proposal (https://forums.blurbusters.com/viewtopic.php?t=7739).

It would need to know how to modify the configuration files of the common games (CS:GO, Fortnite, Valorant, Rocket League, Overwatch, DOTA2, LoL, and other popular esports games) but it would allow 3200dpi+ nirvana in esports on modern mouse sensors without wrecking 800dpi muscle memories (training memory).

While I'm the Hz expert and not as much of an expert on mice, improved recommendations about mice are needed from Blur Busters, and I'd love to get one of you DPI professionals to write an article for Blur Busters.

Edited by Chief Blur Buster
Link to comment

BTW -- I wanted to add a compliment --  you're one of my favourite sites for calculating correct dpi / edpi / related stuff.

While my specialty is Hz and not mice, let me provide some rationale of my past recommendations, to give better context...

I use the DPI Calculator/Analyzer from time to time too, as a quick calculation for some purposes.  Maybe only once a year or every few months, but it's something I'm super glad exists.  Good job.

Remember, that post was written a year ago during my early beta testing of the Razer 8KHz.

My engineer contact at Razer found issues in some engines, which led us to realize that the mouse mathematics are better in some newer games, though that is only one contributory factor (another being, of course, the end user not bothering to configure correctly -- older engines are less forgiving than newer engines, even when editing the configuration file for more sensitivity digits).

We've learned more since, if you've read my newer posts -- and I'm already in 25+ peer-reviewed papers (purple Research button on the website), including being cited by NVIDIA, Samsung, NIST.gov, etc.  So it's pretty clear I know my stuff in a different niche, even if not as much about mice.  But that doesn't mean we can't collaborate to (A) find the causes; (B) avoid the math problems; (C) get more of the esports community to use higher DPI.  Better we work together, after all, picking each other's brains on the portions of our skills where we are smartest.

I would, however, love to see more peer-reviewed papers on mouse edpi.  Maybe a researcher here would like to help co-author?  Clearly, there are fewer good DPI+sensitivity combos in older games than in newer games.  When I was beta testing the Razer 8KHz, they did some tests and privately discovered bad mouse mathematics in some games, at least for some DPI+sensitivity combinations.

If you toe the 800 edpi line with some careful manual mouse mathematics, you can prevent a lot of issues and keep everything better synced between CS:GO and newer games.   Few bother to use the DPI Analyzer, but let's rightfully pan how unintuitive older games are out of the box for DPI adjustments by people who don't use utilities.  Newer games handle oddball DPI and oddball sensitivities better (e.g. 1234 dpi at 0.789647 sensitivity, or whatever random combo) with less jitter/rounding.

In coding, some older code uses 'float' for the mouse mathematics (or worse, integers), while newer games are more likely to use 'double' and clamp to the engine's final precision after all the math is done.  The differences may be minor and only noticed by seasoned esports players.

Now, if one correctly uses the DPI Calculator/Wizard and similar utilities in a right-tool-for-the-right-job way, a lot of this apparently goes away -- judging from more recent anecdotes.  However, that doesn't make my post utter drivel, given that the engineers at Razer had discovered the same thing as well.

While pollrate and DPI are separate, keep in mind that they can interact with each other in unexpected/weird ways.  A high pollrate creates CPU fluctuation surges which can create some mouse jitter (polltime:rendertime goes out of sync because the game or CPU processed the high poll inefficiently), and mouse jitter dynamics can be affected by it.  For example, if you configure a Razer 8KHz at the maximum 8000Hz and really rip the mouse around, a CPU core gets pegged at 100% and the mouseturns start stuttering weirdly, and some people (incorrectly) tried to change DPI to fix it.  It changed the jitter feel, even if it didn't completely solve the problem, so the smoothness feel in "CPU duress situations" can create some unexpected interactions between pollrate and DPI, depending on how the mouse mathematics in an engine are done.  On a perfect system that you keep up to date and that handles your mouse well, you probably won't notice a problem -- but not everybody's system is your system.

I'm a C++ and C# coder, and I am shocked at how bad the mouse mathematics are in some engines.  It does appear, though, that you can tweak your way to combinations that feel good by using utilities such as DPI Wizard.

While I'm a bigger expert in Hz than in mice, not all FPS players are computer coders or have seen what I've seen...  It may not affect aim if you configure edpi correctly, but if you use oddball numbers -- like randomly sliding sensitivity sliders, or using oddball DPI in existing mouse utilities -- then the math errors add up (e.g. mouse sensor interpolation, mouse driver smoothing, game rounding errors, etc.) across the whole chain.  It's all very poorly researched, but problems definitely exist with suboptimal configurations.

It takes less work with newer engines to keep the chain 'math clean', but DPI wizard is a great way to mitigate this and not notice any math problem.  So this is probably just a simple case of "I'm right too, and you're right too", in a non-mutually-exclusive manner -- depending on the settings.  Knowledge can certainly be improved, though!

Thoughts are also welcome on the High Definition Mouse API Proposal, too!

(P.S. Sorry I write big texts.  I wish I could do YouTube and shorter stuff, but I was born deaf, and walls of text are my thing.  But many of my fans love them!  They've been my hallmark for stopping the 30fps-vs-60fps laughing over the years.)

Edited by Chief Blur Buster
Link to comment
  • Wizard
12 hours ago, Chief Blur Buster said:

The mouse mathematics inside CS:GO becomes very wonky when you're far away from compensated 800 "edpi", so clearly this gamer found a sweet-spot magic 0.0313 sensitivity that eliminated mouse math errors at >800dpi by maintaining 800 edpi.  So that's my theory -- the mouse mathematics in CS:GO becomes accurate again at 800 edpi.

It's just as precise at sensitivity 0.01 as it is at 1. The maths for CSGO is also pretty simple and well known.

Say you are using 800 DPI and sensitivity 1. For every inch you move your mouse, 800 counts are sent to the game, which the game interprets as turning:

0.022*1*800=17.6 degrees

Now depending on how fast you move your mouse and your polling rate, the amount of packets and their sizes will vary. It might be 800 packets of 1 count or 80 packets of 10 counts. This doesn't matter though, as the end result is the same.

Now if we instead use 25600 DPI and sensitivity 0.03125, 25600 counts are sent to the game over 1 inch, and we get:

0.022*0.03125*25600=17.6 degrees

Same exact movement, but the packet sizes will be larger. For instance 800 packets of 32 counts or 80 packets of 320 counts.

But enough of the theory, let's test it:

The crosshair ends up at the exact same spot, and the movement frame-by-frame is identical. There is no inaccuracy or limitation in the math here. Note that this is purely from a sensitivity standpoint, when you mix this with possible issues with 1000 FPS and/or 8000 Hz polling rates you can definitely run into issues.
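For reference, a minimal C++ sketch of the arithmetic above (0.022 is CS:GO's default m_yaw value; both configurations produce the same rotation per inch):

#include <cstdio>

// CS:GO rotates the view by m_yaw * sensitivity degrees per mouse count.
// Counts per inch equals the configured mouse DPI.
int main() {
    const double m_yaw = 0.022;  // CS:GO default yaw scale

    double deg_800  = m_yaw * 1.0     * 800.0;    // 800 DPI, sens 1.0
    double deg_25k6 = m_yaw * 0.03125 * 25600.0;  // 25600 DPI, sens 0.03125

    std::printf("800 DPI   @ sens 1.0:     %.4f deg per inch\n", deg_800);
    std::printf("25600 DPI @ sens 0.03125: %.4f deg per inch\n", deg_25k6);
    return 0;
}

Both lines print 17.6000, which is the point: the packet sizes differ, the total rotation does not.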

12 hours ago, Chief Blur Buster said:

Is there currently any software that can automatically sync "edpi" across all games and automatically recompute/rewrite configuration files every time you change mouse DPI?

You can do it manually using the calculator on this site; syncing across all games automatically is probably not possible, since a lot of games only have their sensitivity setting in-game, with no file or registry setting to modify. It could probably be done for quite a lot of games though, but it would need to use the calculator in the background, as simply dividing or multiplying the sensitivity only works for about half of the games.
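A rough C++ sketch of why dividing or multiplying only covers the simple case: it assumes rotation is linear in sensitivity (degrees = yaw * sens * counts), and that linearity assumption is exactly what fails for the other half of games.

#include <cstdio>

// For a game where rotation is linear in sensitivity, keeping the same
// 360 distance when changing DPI means scaling sensitivity by old/new DPI.
// This does NOT hold for games with non-linear sensitivity curves, which is
// why a per-game calculator is needed for roughly half of all games.
double rescale_sensitivity(double old_sens, double old_dpi, double new_dpi) {
    return old_sens * (old_dpi / new_dpi);
}

int main() {
    // e.g. moving from 400 DPI / 2.6 sens to 1600 DPI keeps the same edpi
    std::printf("new sens: %.4f\n", rescale_sensitivity(2.6, 400.0, 1600.0));  // 0.65
    return 0;
}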

12 hours ago, Chief Blur Buster said:

While pollrate and DPI are separate, keep in mind that they can interact with each other in unexpected/weird ways.  A high pollrate creates CPU fluctuation surges which can create some mouse jitter (polltime:rendertime goes out of sync because the game or CPU processed the high poll inefficiently), and mouse jitter dynamics can be affected by it.

Definitely, and I think this is the main issue. A lot of games do not handle polling rates over 1 kHz very well, and at 2-8 kHz you might also run into a lot of issues if your hardware isn't up to par.

Heck, even brand new Unreal Engine 5 games don't handle over 250 Hz correctly if the devs haven't disabled smoothing.

Link to comment
  • Wizard
7 minutes ago, Quackerjack said:

Btw, can you tell me why this happens? I remember playing in the old days on 100-120 Hz monitors and never had stutter at a 125 Hz polling rate.

This isn't related to the monitor refresh rate, but with a low polling rate from the mouse each report will be larger.

So with 125 Hz this can result in, for instance, 125 packets of 8 counts instead of 1000 packets of 1 count at 1000 Hz, so for each report the crosshair will jump 8 times further at 125 Hz, resulting in a much choppier experience.
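A quick C++ illustration of that packet-size difference, assuming a constant (hypothetical) hand speed; real movement varies, but the ratio is what matters:

#include <cstdio>
#include <initializer_list>

// At a constant hand speed, each report carries roughly
// (DPI * inches per second) / polling rate counts, so lower polling
// rates mean larger per-report jumps of the crosshair.
int main() {
    const double dpi = 800.0;
    const double speed_ips = 10.0;  // assumed hand speed, inches per second
    const double counts_per_sec = dpi * speed_ips;

    for (double poll_hz : {125.0, 500.0, 1000.0, 8000.0}) {
        std::printf("%6.0f Hz polling: ~%.0f counts per report\n",
                    poll_hz, counts_per_sec / poll_hz);
    }
    return 0;
}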

Link to comment
2 hours ago, Quackerjack said:

Btw, can you tell me why this happens?

 https://blurbusters.com/gsync/preview2/ :)

15 hours ago, Chief Blur Buster said:

"I'm right too, and you're right too"

Maybe we will need something like GSync in mouse tracking -> MSync :) 

Another thing is that "mousefeel" is mostly local feel. It doesn't matter if you have a 10000 Hz monitor or mouse: all your mouse moves (delta moves) are distributed at, let's say, 100 packets per second over the internet, and when they arrive at your computer from your opponent, the game engine will smooth out the delta difference. So it will look clean, but in reality it's "choppy". In CS 1.6, the command ex_interp "0.1" took care of this. If you set it to 0 and your net settings were low (cl_cmdrate "30"; cl_updaterate "20"), it was unplayable; if you set ex_interp "0.1", it started to feel clean, but input lag would be higher.

So even when it feels and looks absolutely perfect on the monitor, it can be just engine interpolation. In the background, it can be much less precise than how smooth it looks on the monitor.

Link to comment
43 minutes ago, MacSquirrel_Jedi said:

 https://blurbusters.com/gsync/preview2/ :)

Maybe we will need something like GSync in mouse tracking -> MSync :) 

Another thing is that "mousefeel" is mostly local feel. It doesn't matter if you have a 10000 Hz monitor or mouse: all your mouse moves (delta moves) are distributed at, let's say, 100 packets per second over the internet, and when they arrive at your computer from your opponent, the game engine will smooth out the delta difference. So it will look clean, but in reality it's "choppy". In CS 1.6, the command ex_interp "0.1" took care of this. If you set it to 0 and your net settings were low (cl_cmdrate "30"; cl_updaterate "20"), it was unplayable; if you set ex_interp "0.1", it started to feel clean, but input lag would be higher.

So even when it feels and looks absolutely perfect on the monitor, it can be just engine interpolation. In the background, it can be much less precise than how smooth it looks on the monitor.

So you're saying I just feel the 8 ms of input lag with a 125 Hz polling rate? I still don't get it. Back in the day all mice worked at 125 Hz and this situation never existed.

Link to comment
  • Wizard
41 minutes ago, Quackerjack said:

So you're saying I just feel the 8 ms of input lag with a 125 Hz polling rate? I still don't get it. Back in the day all mice worked at 125 Hz and this situation never existed.

Just like you can definitely feel and see the difference between a 120 Hz monitor and a 360 Hz monitor, you can also do so with polling rates. There are a lot of variables though, such as what sensitivity you run, DPI, FPS, and how much you move your mouse. Some are comfortable with a 2 inch 360 distance, others prefer 50 inches. This makes a huge impact on how fast you move your mouse.

Back in the days of 125 Hz we had no comparison, monitors were usually 60 Hz and games rarely ran over 60 FPS (I'm exaggerating here, but you get my point).

No one missed 4K when 1080p came, and no one thought their Corvette was slow before Tesla came along. For the most part we don't know what we're missing until something better shows up :)

Link to comment

BTW, although not a mouse expert, I did tip some dominoes in the mouse world behind the scenes.

A Twitter debate that I started with some academics who doubted the benefit of high poll rates...

The Twitter debate triggered a research paper by the same individuals, which conceded that humans could see imperfections at 1KHz that disappeared at 8KHz.  They didn't believe 8KHz had any human-visible benefit, while I strove to convince them otherwise.

[Then about six months later]

The research paper now has a registered DOI:
"Do We Need a Faster Mouse? Empirical Evaluation of Asynchronicity-Induced Jitter"
https://dl.acm.org/doi/10.1145/3472749.3474783
And kudos that they published it open access with no paywall (link to PDF).

Some cherrypicked images from the paper:

[Images from the paper are not reproduced here.]

NOTE: Yes, yes, the paper can still be slightly improved with more data. There are certainly more tests needed (VSYNC ON versus VSYNC OFF, fixed gaze at crosshairs versus tracking gaze at moving objects, a la The Stroboscopic Effect of Finite Frame Rates) -- which can produce different human visibilities of 1KHz vs 8KHz...

Now, that being said -- these tests were done without strobing.  If you enable strobing such as DyAc (less motion blur makes mouse jitter more visible), the green squares seem to move maybe one or two squares to the right.  As long as the mouse jitter is significantly worse (in milliseconds of error) than the motion blur MPRT (in milliseconds; the MPRT number, not the GtG number), there are odds of it becoming humanly visible.

Although I'm not cited in the paper (unlike other papers; see the Blur Busters research portal at www.blurbusters.com/area51), the researchers did acknowledge that the Twitter debate triggered their decision to research the matter.  Personally, I just use the mouse profile software to automatically set 1KHz, 2KHz or higher depending on what executable file is running -- but when just at the Windows desktop I usually keep it at 8KHz, because it's noticeable on my mouse cursor:

(The stroboscopic stepping effect is something I write about in one of my articles.  Some people aren't bothered by it.  But some are.  Like some people are bothered by tearing, others are not, others really love high Hz, others are fine with 60Hz, and so on, and depends on esports vs non-esports contexts).

I could clearly see 1Khz vs 8Khz even at Windows Desktop just by circling the mouse pointer, as a jitter effect between pollrate and VSYNC, since for many (non-esports) use cases, you want VSYNC such as for smooth scrolling & smooth mouse pointer operations (e.g. painting, etc).

In practice, the game, the port, the drivers, the software -- the whole chain -- needs to be efficient enough to handle it without bogging down the game.  I've found 2KHz to be an excellent compromise most of the time.  8KHz plus a system that can't keep up can be worse than 1KHz, so tweaking and optimization are needed, as well as a game that can handle that mouse input rate (without pegging a CPU core at 100%).  But all things being equal, I absolutely love 8KHz even just during plain web surfing, Adobe Photoshop, Visual Studio, etc.  It's very visible to me.

Just wanted to add some useful background information from my beta-testing of 8KHz.  

Edited by Chief Blur Buster
Link to comment

  

On 1/19/2023 at 2:04 PM, MacSquirrel_Jedi said:

 https://blurbusters.com/gsync/preview2/ :)

Maybe we will need something like GSync in mouse tracking -> MSync :) 

Another thing is that "mousefeel" is mostly local feel. It doesn't matter if you have a 10000 Hz monitor or mouse: all your mouse moves (delta moves) are distributed at, let's say, 100 packets per second over the internet, and when they arrive at your computer from your opponent, the game engine will smooth out the delta difference. So it will look clean, but in reality it's "choppy". In CS 1.6, the command ex_interp "0.1" took care of this. If you set it to 0 and your net settings were low (cl_cmdrate "30"; cl_updaterate "20"), it was unplayable; if you set ex_interp "0.1", it started to feel clean, but input lag would be higher.

So even when it feels and looks absolutely perfect on the monitor, it can be just engine interpolation. In the background, it can be much less precise than how smooth it looks on the monitor.

I've been slowly working behind the scenes to convince some people at mouse manufacturers / Microsoft that we need a better way of relaying mouse data precisely without killing CPUs.

8KHz can really slam many CPUs, especially on crappy motherboards, crappy drivers, and crappy OSes.  Imagine getting 24KHz quality (full sensor rate) through only a manageable 2000 or 4000 mouse communications per second, fully system-jitter-compensated, without pegging a CPU core at 100%.  The end user could configure it to meet their system's needs and game compatibility.

BTW, I must say I am pissed that Windows 11 does not keep up as well with my 8KHz mouse for some reason, possibly due to immature mouse drivers.  Incidentally, I was the world's first person to run 1000Hz directly from Microsoft Windows, on an internal laboratory prototype display.  I contacted Microsoft and they added a hidden Windows Insider registry key to shatter the 500Hz limitation, and sent me the information privately (under NDA).  But now it's in public Windows builds.  From what I last heard, Windows 11 now finally supports >500Hz, given the new 540Hz monitor being released.  The bad news is that Windows 11 does not seem as 8KHz-mouse-friendly as Windows 10.  Ouch.  I do wonder if Microsoft has backported the >500Hz support to Windows 10, given the upcoming flood of >500Hz monitors later this decade.  (1000Hz OLED 2030 FTW...)

(Aside: At this point in the diminishing curve, always upgrade refresh rate by geometric increments such as 2x if possible, e.g. 60Hz -> 144Hz -> 360Hz -> 1000Hz.  But if GtG is near 0, like an OLED, then upgrading refresh rate by 1.5x is okay.  IPS LCD 240Hz-vs-360Hz is often only a 1.2x-1.3x difference due to GtG (diminished further to 1.1x by high-frequency jitter adding extra blur, cue YouTuber 240Hz-vs-360Hz "upgrade worthlessness sensationalism").  But OLED 240Hz-vs-360Hz is a 1.5x motion blur difference when perfectly framepaced with an excellent mouse, and 240Hz-vs-1000Hz OLED is a 4x blur difference. Strobeless blur reduction FTW!  I have a 240Hz OLED on my desk now as part of my work with one of the monitor manufacturers -- and I can confirm that 240Hz OLED has clearer motion than non-strobed 360Hz, since there's no GtG blur, only MPRT blur (GtG and MPRT are both concurrently important, but 0ms GtG does not fix motion blur without fixing MPRT too).  Strobed is still clearer, but some people don't like strobing.  That being said, 240Hz is still very far from the final frontier, as explained in my 2018 article, Blur Busters Law: The Amazing Journey To Future 1000Hz Displays.  The retina refresh rate can be quintuple digits under the most extreme variables, but that deep in the diminishing curve of returns requires extremely steep geometric Hz upgrades to still be noticed, before the vanishing point of the curve.)

This is another argument in favour of the HD Mouse Extensions API proposal. We need higher-Hz displays, higher-Hz mice, and improved precision in the whole workflow.  Presently, I'm the expert in many communities on the Present()-to-photons black box, by the way -- but jitter that occurs before the hardware is a giant problem in the current refresh rate race requiring geometric upgrades, since high-frequency stutter/jitter also adds display motion blur that obscures display refresh rate differences.  70 stutters per second at 360Hz from any source (mouse, software, whatever) is just a blur, like a fast-vibrating music string, as seen in the stutter-to-blur continuum animation.

(Yes, there's a GPU solution to doing 1000fps UE5-quality.  Reprojection tests, ported from VR to PC, and utilizing 10:1 ratios instead of 2:1 ratio, have shown 4K 1000fps 1000Hz UE5-quality is possible on an RTX 4000 series.  See LinusTechTips video for an example). 

While LinusTechTips extols reprojection benefits for 30fps->240fps, I think 10 years ahead: 100fps->1000fps.  I did some tests to show that the demo was indeed capable of a 10:1 ratio.  Far fewer artifacts (far fewer than DLSS 3.0!) and perceptually lagless.  But we need to see this ported to a UE5 engine instead.  I predicted frame rate amplification at 10:1 ratios way back in 2018 and, lo and behold, I've downloaded a demo that converts 1440p 30fps to 280fps using only roughly 10% of a Razer Blade 15 GPU as overhead.  While the graphics in that demo are simple, the remaining 90% of the GPU can do anything else, such as render UE5-quality visuals, as reprojection is detail-level-independent.

While the downloadable demo's reprojection is not as good and artifact-free as Oculus ASW 2.0, I made a new discovery: starting with a minimum of 100fps and reprojecting to >1000fps produced far fewer reprojection artifacts, because the artifacts vibrated far beyond the flicker fusion frequency.

(TL;DR: 4K 1000fps 1000Hz 1ms-lag GPU with UE5-detail-level is thus, apparently, already here today with reprojection help.  It just hasn't been milked properly yet.)

I think that some game developers have noticed (I'm hoping Epic did -- UE5 needs optional reprojection support, stat).  By 2025-2030, reprojection will probably be an optional feature (e.g. DLSS 4.0+, FSR 4.0+, XeSS 4.0+), though it requires the game engine to communicate movements to the frame generation API for perceptually lossless reprojection (which works from roughly a ~100fps base framerate -- reprojecting from a lower frame rate has too many artifacts).  The new gist of thinking for the upcoming 1000fps 1000Hz ecosystem amongst my circle of researchers is reprojecting ~100fps -> ~1000fps for strobeless blur reduction (reducing display motion blur by 90% without strobing and without black frame insertion).

This system mandatorily requires communicating movement deltas (which means mouse improvements apply here!) to the reprojection engine between original triangle-rendered frames.  So a modification is made to the game engine to continuously communicate movement deltas to the frame generation API, for zero-latency frame generation.  (Retroactive reprojection allows even 10ms frame rendertimes to have only 1ms mouse-to-GPU-framebuffer latency, by the way!  Research papers are coming about this.)

(Reprojection isn't necessarily evil in an era where we've accepted Netflix's use of mandatory interpolation features in the H.264 and HEVC video codecs to perceptually losslessly produce 24fps from 1fps non-interpolated frames.  This is better than grandpa's television interpolation, because the original interpolator had perfect ground-truth knowledge of the original frames.)

And herein lies the argument for why a system like the High Definition Mouse Extensions API is necessary for tomorrow's 1000Hz ecosystem.  It will take me years to convince vendors to do things in a totally new way, to get away from these problems.  But you can help educate me on my gaps in knowledge, despite me being the expert in the Present()-to-photons black box chain (see my "VSYNC OFF tearlines are just rasters" beam racing raster demo).

That being said, this is not important for many old games and today's games, but it will be very important for the upcoming 1000fps 1000Hz world, to prevent improvements from vanishing below the human-visibility floor of the diminishing curve of returns.

For example, my proposed High Definition Mouse Extensions API mandates microsecond timestamping of mouse polls at the mouse sensor level, preserved (through the jittery USB and jittery software) all the way to the video game, for better mousepolltime:photontime synchronization that is immunized against the software's own jitter.

On 1/19/2023 at 2:50 PM, Quackerjack said:

So you're saying I just feel the 8 ms of input lag with a 125 Hz polling rate? I still don't get it. Back in the day all mice worked at 125 Hz and this situation never existed.

There's more than one way to "feel" 125 vs 1000 vs 8000: lag feel, jitter feel, visual feel, etc.  Some of it depends on what you're doing.

I can most definitely "see" it, at the least, even if I can't feel the lag. 

Many years ago, I published a simplified image of the temporal aliasing effect between frame rate and refresh rate in refresh-synchronized situations (e.g. the Windows desktop and VSYNC ON situations).

[Image: 2000hz-mice-needed]

Then it became much worse once we were dealing with 240Hz, 280Hz, 360Hz and 390Hz, with those odd refresh rate divisors amplifying 1000Hz jitter far more than in this montage of old photographs from way back in the 120Hz/144Hz days.  This is actually a modified version of an old 2014 montage (sans 2000Hz) I shared with my blog readers at the time, back when Blur Busters was merely an unpaid hobby of mine.

See... I've been pleading for 2000Hz+ since 2016.  I screamed hard at Razer for years.

This can also translate to frame rate in a roundabout way (in games with no mouse smoothing/interpolation, which is frowned upon anyway), where a low pollrate translates to more motion blur (stutter-to-blur continuum animation demo -- www.testufo.com/eyetracking#speed=-1): low frame rates vibrate noticeably like a slow music string, and high frame rates vibrate beyond the flicker fusion threshold like a fast music string.

A 125Hz poll rate means mouseturns will essentially update at 125fps even if your monitor is 360Hz, and you will have no less motion blur than 1/125sec of persistence, since the 125Hz pollrate effectively throttles the mouse-driven frame updates.

You don't need to feel a millisecond of lag to win by a millisecond -- like two Olympic sprinters crossing the finish line at nearly the same time: they see the scoreboard and find out they were only a few milliseconds apart.  Likewise, a see-and-shoot-simultaneously situation in an FPS game is a metaphorical latency race to the finish line; even if you can't "feel" the latency, you may still win by the latency advantage.

Likewise, a low DPI of 400 can be problematic for frame rate -- moving the mouse only a quarter of an inch over a period of 1 second gives only 100 reports.  That limits your (say, mousepan or mouseturn) update rate to only 100 per second. Most experienced people in this forum already know this better than I do, but I just wanted to add it, in case you were unaware.
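A small C++ sketch of that arithmetic (the numbers mirror the example above; only non-empty reports are counted, which is an assumption for illustration):

#include <algorithm>
#include <cstdio>

// Slow movement at low DPI doesn't generate enough counts to fill every
// poll, so the effective update rate is capped by counts per second
// rather than by the polling rate.
int main() {
    const double dpi = 400.0;
    const double distance_inches = 0.25;  // slow aim adjustment
    const double duration_sec = 1.0;
    const double poll_hz = 1000.0;

    double counts_per_sec = dpi * distance_inches / duration_sec;  // 100
    double effective_rate = std::min(counts_per_sec, poll_hz);     // 100
    std::printf("non-empty mouse updates per second: ~%.0f\n", effective_rate);
    return 0;
}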

I will wholeheartedly admit maybe I do not know as much about mice as some of you -- but you cannot deny I tipped a few dominoes -- and you cannot deny my industry connections -- and cannot deny some necessities that will be needed for the future 1000fps 1000Hz ecosystem (2030) -- and high Hz doesn't just benefit esports, given the slow mainstreaming of high Hz (phones, tablets, consoles). 

In fact, I even met Apple engineers at the DisplayWeek convention who were gawking at BOE's 500Hz panel in May 2022.  Being conservative about costs and battery consumption, I bet consumer tablets will still only be 240Hz OLED by 2030, while esports will probably have widespread 1000Hz "near 0ms GtG" displays by ~2030-ish.  This will necessitate better mouse handling to keep the benefits humanly visible -- full stop.  Especially now, since our recent discovery that mouse jitter adds extra display motion blur (research articles yet to be released), where high-frequency mouse jitter (sensortime:gametime sync error margin, from many causes such as USB jitter and CPU fluctuations in the mouse driver, games, OS, etc.) is just extra persistence blur, degrading motion clarity by approximately 10-20%.  This can turn a 240Hz-vs-360Hz 1.3x difference (throttled down from 1.5x to 1.3x due to GtG) into a mere 1.1x difference (mouse jitter adding more blur).  We're the discoverers of the error margins of the future, applicable to optimizing future games whose software development has not yet started.

Blur Busters was a hobby that involved tons of forum posts, so longtimers know my legacy.  This continues today, with my walls of text even over a decade later.  Now look at me, I'm cited in over 25 peer reviewed papers (Google Scholar), thanks to some of my display motion blur related work.

Obviously, my posts are targeted at researcher/engineer/prosumer communities, so they get a bit complex, as I try to "Popular Science" or "Coles Notes" a science paper, which is still confusing for the average esports gamer.  Now, that being said, while I know what happens to pixels after frame Present()-ation all the way to them becoming photons, I am certainly sometimes deficient in some parts of the chain before the Present().  That doesn't mean I know squat, though.

There are many paths forward, and it is here where I'm interested in hearing from other computer programmers / game programmers / researchers / etc.  If you want to help influence tomorrow's 1000Hz ecosystem -- and help course-correct my mistakes -- then most certainly get in touch with me.

Edited by Chief Blur Buster
Link to comment
On 1/19/2023 at 8:03 AM, DPI Wizard said:

It's just as precise at sensitivity 0.01 as it is at 1. The maths for CSGO is also pretty simple and well known.

Say you are using 800 DPI and sensitivity 1. For every inch you move your mouse, 800 counts are sent to the game, which the game interprets as turning:

0.022*1*800=17.6 degrees

Now depending on how fast you move your mouse and your polling rate, the amount of packets and their sizes will vary. It might be 800 packets of 1 count or 80 packets of 10 counts. This doesn't matter though, as the end result is the same.

I'm not talking about the math that a user can do.

I'm talking about math rounding errors in game engine programming and graphics drivers.  A lot of graphics rendering code uses "float" instead of "double", and sometimes this creates things like Z-buffer fighting artifacts or other effects.  What happens is that, inside the software, many math operations can occur at lower precision than they should before being downconverted to the final rendering precision.  The newer programming best practice is to keep higher precision until the final stage (rendering the exact position of scenery).

0.022*0.03125*25600=17.6 degrees

This appears to be comfortable for the Source engine.  It results in steps that are math-comfortable, since a "float" data type in computer programming only carries 6 to 7 significant digits (Source engine rendering uses floats).  Even an intermediate step (e.g. 0.022 * 0.03125) is at risk of producing many digits (0.0006875), but this still fits within the float data type.

Now, if you use a different sensitivity number than "0.03125" and a different DPI than "25600", you may produce more digits than can fit inside a "float".  Woe is the game engine if the final angle is a non-terminating decimal.  At 500 frames per second you get 500 math rounding errors (from float being 6-7 significant digits INSIDE the game engine), and you get potential wonkiness.

This is also why 1000fps in CS:GO has more problems than 300fps, do you realize?

(Do you realize that the Source engine only keeps 6 to 7 significant digits?)

Many newbies don't understand this.  No wonder some experienced people like me, can be thrown off too -- because both you and I are correct simultaneously, depending on what you enter in the math variables for the game engine...

Just like you don't want to use too few bits when doing repeated photo filters or repeated GPU effects (color banding from rounding effects), you don't want to use too few digits when a game engine does repeated math on mouse deltas in single-precision rendering engines.  Some engines keep high precision (doubles) when calculating 6DOF positionals before finally rendering at single precision, and make sure that math errors do not accumulate over multiple inputs.
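As a generic C++ illustration of the kind of accumulation being described (this is not Source engine code, and the per-poll numbers are arbitrary assumptions), summing many tiny yaw deltas in float versus double:

#include <cstdio>

// Summing many small per-poll yaw deltas in float vs double. Single
// precision (~7 significant digits) visibly drifts sooner than double;
// whether this matters in a real engine depends on where and how the
// engine keeps its intermediate math.
int main() {
    const double yaw = 0.022, sens = 0.137;   // arbitrary "oddball" sensitivity
    const double counts_per_poll = 3.0;
    const int polls = 8000 * 60;              // one minute at 8 kHz

    float  acc_float  = 0.0f;
    double acc_double = 0.0;
    for (int i = 0; i < polls; ++i) {
        acc_float  += static_cast<float>(yaw * sens * counts_per_poll);
        acc_double += yaw * sens * counts_per_poll;
    }
    std::printf("float accumulator:  %.6f degrees\n", acc_float);
    std::printf("double accumulator: %.6f degrees\n", acc_double);
    std::printf("difference:         %.6f degrees\n",
                acc_double - static_cast<double>(acc_float));
    return 0;
}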

So the renewed moral of the story: Use a DPI calculator like yours, and stick to numbers CS:GO likes (e.g. 800-clean numbers).  Apparently, if you feed it cleaner numbers that are more easily mathed by the game engine CS:GO seems to merrily keep up okay.

There may be other causes of the problems witnessed (e.g. system limitations), but that doesn't mean math rounding errors can't occur INSIDE a game engine, given that many engines keep as few as 6 to 7 significant digits during their positional math...

What many non-programmers don't know is that there are math rounding errors INSIDE the game software and INSIDE the graphics drivers.

I do, however, wholeheartedly admit my mistake in blanket-failing CS:GO.  Correction: CS:GO works beautifully at high DPI provided you carefully calculate sensitivity numbers together with DPI numbers to prevent too many rounding errors inside the single-precision Source engine.  Even many new engines are still single-precision, but some codebases keep intermediate positional calculations in doubles (to prevent accumulation of rounding errors) before the render.

I shall begin referring more users to your mouse calculator if they want to experiment with numbers far beyond the mouse technology available at the time the Source engine was developed.

On 1/19/2023 at 8:03 AM, DPI Wizard said:

Now if we instead use 25600 DPI and sensitivity 0.03125, 25600 counts are sent to the game over 1 inch, and we get:

0.022*0.03125*25600=17.6 degrees

Same exact movement, but the packet sizes will be larger. For instance 800 packets of 32 counts or 80 packets of 320 counts.

But enough of the theory, let's test it:

The crosshair ends up at the exact same spot, and the movement frame-by-frame is identical. There is no inaccuracy or limitation in the math here. Note that this is purely from a sensitivity standpoint, when you mix this with possible issues with 1000 FPS and/or 8000 Hz polling rates you can definitely run into issues.

Unlike before, I now know this is clean.

What I did not know at the time I wrote the "FAIL" is that the game engine's mouse mathematics are fairly accurate again with some of this tweaking (e.g. editing the configuration file instead of adjusting the coarse slider).  If you go to oddball numbers, like 1734 instead of 800, CS:GO doesn't behave as well as newer engines do with oddball numbers.  So a newer engine such as Valorant may mouseturn perfectly at very oddball numbers, while CS:GO doesn't necessarily.

So the magic number is to keep it at 800-equivalence, and you're fine with using higher DPI in CS:GO.  That, I didn't know earlier, and you did correct me.  

So some of my old recommendations are now out of date.  Instead of a blanket "bad at anything above 800", you can do it with properly calculated numbers (like yours) -- except there is less flexibility to "aberrate" away from the calculated numbers without side effects than there would be with a newer engine that has fewer math rounding errors inside it.  The new rule of thumb is to try to stay with clean numbers (and to tell people to go to your site when tweaking a retro engine like CS:GO to >800dpi).

Obviously, most people in esports aren't always configuration-file-savvy.

On 1/19/2023 at 8:03 AM, DPI Wizard said:

You can do it manually using the calculator on this site; syncing across all games automatically is probably not possible, since a lot of games only have their sensitivity setting in-game, with no file or registry setting to modify. It could probably be done for quite a lot of games though, but it would need to use the calculator in the background, as simply dividing or multiplying the sensitivity only works for about half of the games.

Definitely, and I think this is the main issue. A lot of games do not handle polling rates over 1 kHz very well, and at 2-8 kHz you might also run into a lot of issues if your hardware isn't up to par.

Heck, even brand new Unreal Engine 5 games don't handle over 250 Hz correctly if the devs haven't disabled smoothing.

That's why I am pushing hard for High Definition Mouse Extensions API, a proposal that I have written.

Please feel free to flame, roast, or critique its flaws, so it becomes an even better API (if vendors listen).

Remember, it's for future software, not for past software (old games), though popular engines such as Fortnite could theoretically add a menu setting retroactively to enable this, upon popular demand.  The idea is easily getting your preferred feel (e.g. an 800-feel) in all supported games at a 24000-equivalent poll (full sensor rate with microsecond timestamps relayed to software), at whatever delivery rate your system can keep up with (1000 arrays per second, 2000 arrays per second, or 12345 arrays per second).  So at 2000/sec, that's 12 rawinput mouse deltas embedded within each packet from the mouse.  Way overkill, but you get the picture -- it gives a path to 8000 mouse deltas/sec at only 1000 polls/sec that doesn't kill a CPU core in midrange systems.  So you can keep your 1000 pollrate but still get 8000 timestamped rawinput mouse deltas per second, communicated as 1000 packets per second (configurable) if your system cannot handle an 8000 rate.  Likewise if your system can handle 8000 but can't handle 24000, and so on.  The numbers here are just arbitrary examples -- it simply uncaps mouse innovation so it isn't bottlenecked by software performance or jitter limitations -- especially on a USB-plugs-overloaded system that adds lots of jitter to untimestamped mouse deltas.
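Purely as a C++ sketch of the "several timestamped deltas per packet" idea in that proposal -- every name and field below is hypothetical and invented for illustration; no such API exists today:

#include <cstdint>
#include <cstdio>
#include <vector>

// Hypothetical sketch only: one host-visible packet bundling several
// sensor-side mouse deltas, each stamped in microseconds at the sensor,
// so an "8 kHz" sensor could be delivered as, say, 1000 packets/sec.
struct TimestampedDelta {
    uint64_t sensor_time_us;  // microsecond timestamp taken at the sensor
    int32_t  dx, dy;          // raw counts since the previous delta
};

struct MousePacket {
    std::vector<TimestampedDelta> deltas;  // e.g. 8 deltas per 1 kHz packet
};

int main() {
    MousePacket pkt{{{1000, 3, -1}, {1125, 2, 0}, {1250, 4, -2}}};
    int32_t dx = 0, dy = 0;
    for (const auto& d : pkt.deltas) { dx += d.dx; dy += d.dy; }
    std::printf("movement bundled in this packet: dx=%d dy=%d\n", dx, dy);
    return 0;
}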

Note: Think about this random napkin exercise: in tomorrow's 1000Hz+ world (and far beyond), we eventually need to accurately achieve frame rates higher than a specific computer's realistically achievable mouse packet rate, which is partially why I added the microsecond timestamping requirement as well as the multiple-deltas-per-packet feature.  Imagine feeding a future 100fps->4000fps reprojection engine (e.g. DLSS 5.0+, FSR 4.0+, XeSS 4.0+) with only 1000 or 2000 mouse packets per second.  With the HD Mouse API, the full sensor rate (e.g. a "24KHz pollrate" or whatever you configure your mouse to) can be relayed to the game engine (accurately microsecond-timestamped) at a configurable bunch rate per second that doesn't overload the computer.  And because reprojection needs many mouse deltas between original triangle-rendered frames, the two fit together like a glove without succumbing to a specific computer's mouse event rate limitation.  This can be done with multiple mouse deltas per packet, along with their sensor timestamps.

Before replying, please watch this YouTube video on the primary enabling technology of the 4K >1000fps >1000Hz future of the 2030s+, to understand an early-bird Wright Brothers hint of one of the reasons why I proposed the HD Mouse Extensions API years ahead of the mouse manufacturers, the OS vendors, and the GPU manufacturers.  Someone told me I am the temporal version (Hz, GtG, MPRT, VRR, lag) of a Japanese 1980s MUSE high-definition television researcher.  They're not too far off.  It sounds like a big nothing in 2020, like talking about 4K in the 1980s, but it's essential early researcher talk applicable to 2030s esports.  Again, researchers found that the retina refresh rate is quintuple digits (though people needed to upgrade frame rate and refresh rate by gigantic amounts, 2x-4x, WHILE GtG=0 simultaneously, to notice a difference when that close to the vanishing point of the diminishing curve of returns).

Edited by Chief Blur Buster
Link to comment
  • Wizard
9 hours ago, Chief Blur Buster said:

Now, if you use a different sensitivity number than "0.03125" and a different DPI than "25600", you may produce more digits than can fit inside a "float".

 

9 hours ago, Chief Blur Buster said:

So the renewed moral of the story: Use a DPI calculator like yours, and stick to numbers CS:GO likes (e.g. 800-clean numbers).  Apparently, if you feed it cleaner numbers that are more easily mathed by the game engine CS:GO seems to merrily keep up okay.

 

9 hours ago, Chief Blur Buster said:

If you go to oddball numbers, like 1734 instead of 800, CS:GO doesn't behave as well as newer engines do with oddball numbers.  So a newer engine such as Valorant may mouseturn perfectly at very oddball numbers, while CS:GO doesn't necessarily.

You really have to explain these statements better to me, because:

  1. The game has zero knowledge of your DPI (how could it know?). All the game knows is to rotate a set amount of degrees for every count it receives from the mouse. There are a couple of games and aim trainers that let you enter your DPI, but it doesn't affect how they handle sensitivity; it's just there to give you the option to view and configure your 360 distance.
  2. You keep referring to clean numbers and oddball numbers, but this makes no sense. If you have used the DPI Analyzer you should know that the configured DPI on the sensor is not the true DPI. It's a completely arbitrary number; 800 can in reality be 851.34 or 764.29 (or anything else) depending on the sensor. So you will never get a clean number, and 1734 is just as clean as any other DPI. Not that I understand why a "clean" number would make any difference.
  3. The only thing that matters is whether the sensitivity is accurate or not at the value you need to set it to. Some games do have issues with very low sensitivity values, as they will start to not register single counts etc. But again, this has nothing to do with DPI.
Link to comment

Hand-waving in full effect I see....

The reason 8 kHz doesn't work a lot of the time has nothing to do with "mouse mathematics" or float-vs-double precision of sensitivity variables. It is because of how games acquire input themselves.

A game like Valorant only works with 8 kHz because it calls GetRawInputBuffer() to hold the inputs until an arbitrary buffer size is filled and the engine is ready for the input. If an app just gets WM_INPUT messages "as they come", as per a standard read of Raw Input, then unless its pipeline is so trivial that it is only doing something like updating a counter, it will most likely fall over with inputs spamming in haphazardly every ~125 us. The symptoms are dropped packets, negative accel and/or just maxed CPU usage and stuttering. None of this has anything to do with sensitivity calculations being superior or inferior. Windows is also not an RTOS and is objectively bad once timing gets tight; it becomes extremely expensive to try to do anything accurately at these kinds of timings. This is not going to change, as it's fundamental to the Windows environment.

The only reason low DPI can work with 8 kHz in some games where high DPI doesn't is because the DPI is nowhere near saturating the polling rate and you're getting a much lower input cadence. Set your mouse to 8 kHz and 400 DPI and move at 1 inch per second, and your update rate is 400 Hz (obviously) and is therefore no longer "8 kHz" as far as the game is concerned. This has nothing to do with the DPI setting itself, which the game has no knowledge of or interaction with, as DPI Wizard already said.

Most simulation loops for in-game physics and enemy positions -- things like whether a crosshair is over an enemy -- will run at 60 Hz; maybe a really competitive FPS would run higher, and maybe they poll mouse input faster at 2 or 3 times the frame rate of the game, with the textures/graphics rendering running at the actual frame rate, obviously. Usually, though, you register event callbacks from input devices which are passed to a handler/accumulator that is then called once at the start of each frame. In other words, it does not matter if you are sending 8000 updates a second to the OS, because a game will just buffer them all and sum the total mouse distance at the start of each render frame anyway -- it makes no practical difference to your crosshair position whether you do this work in the firmware of the mouse at 1 kHz, or whether the game does it instead at 8 kHz. The only important factor is that the polling rate is greater than or equal to the highest frame rate of the game, at a minimum. If you think using 8 kHz is giving 8000 discrete rotational updates of your crosshair a second, and that for each of those positions an enemy location is being calculated to determine whether said input would be over a target or not (i.e. something meaningful), then you are mad.
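A minimal, generic C++ sketch of the buffer-and-sum-per-frame pattern described above (not any specific engine's code):

#include <atomic>
#include <cstdio>

// Generic pattern: the input callback accumulates raw deltas as reports
// arrive (at whatever polling rate), and the game consumes the summed
// delta once per frame, so the crosshair updates once per frame
// regardless of whether 1000 or 8000 reports came in.
struct MouseAccumulator {
    std::atomic<int> dx{0}, dy{0};

    void on_raw_input(int raw_dx, int raw_dy) {   // called per mouse report
        dx.fetch_add(raw_dx, std::memory_order_relaxed);
        dy.fetch_add(raw_dy, std::memory_order_relaxed);
    }
    void consume(int& out_dx, int& out_dy) {      // called once per frame
        out_dx = dx.exchange(0, std::memory_order_relaxed);
        out_dy = dy.exchange(0, std::memory_order_relaxed);
    }
};

int main() {
    MouseAccumulator acc;
    for (int i = 0; i < 8; ++i) acc.on_raw_input(2, -1);  // eight reports between frames
    int dx = 0, dy = 0;
    acc.consume(dx, dy);
    std::printf("per-frame delta: dx=%d dy=%d\n", dx, dy);  // 16, -8
    return 0;
}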

Once we get into a future where performance improves, it is also not inevitable that this will change -- rather the opposite. We have, for example, the precedent of "Super Audio CD" and "DVD-Audio" in the audio realm, which were both large increases in resolution vs CD quality on a factual basis, yet both failed as standards precisely because that level of resolution is not required for the user experience -- instead, users actually gravitated towards lower resolutions (compressed audio formats) and smaller file sizes for easier distribution. Point being, if such technological improvements were available to game engine developers to do more complex computations within a smaller timeframe, they would not be using that resource to update the crosshair position more frequently. There are many things such as higher animation fidelity, better online synchronisation systems, more complex physics, rendering improvements etc. which would all be much higher priority and more obvious quality gains.

Either yourselves or Razer's 8k marketing team moving a cursor on a desktop and posting pictures of "micro stutter" is not going to make any user see a problem. This is unlike the actual micro-stutter that occurs in 3D rendering due to frame-cadence issues, which is readily apparent to even the uninitiated user. There is no one using a 1000 Hz mouse on a 144 Hz display going "geez, I really hate these micro-stutters on my desktop cursor, I can't wait for that to be improved!". In short, you are inventing a problem and then trying to sell a solution (or the idea of a solution), and your argument effectively boils down to: anyone who disagrees just doesn't have the eyes to see the problem, when the truth is that no problem exists in the first place, and your solution would not even solve it if it did.

Mouse report rate does not need to be synchronised to monitor refresh rate or game frame rate whatsoever, and insisting it would improve anything fundamentally misunderstands how the quantities of each are handled and how they interact with one another. Games will always render frames at arbitrary intervals, because each frame has infinitely variable parameters which all require an arbitrary amount of resources, and mouse input polling on Windows will always be haphazard time-wise and always has been, due to the fundamental design of the operating system. Moreover, once the timings get down to a millisecond or so there is no value to anyone in any case. No one is going to care about turning G-Sync on if the monitor can run at 1000 Hz (this is effectively the "Super Audio CD effect"), and any "improved" mouse API that could presumably send decimal distance values to the OS, instead of the standard (x, y) packet of integers with remainders carried to the next poll, would also achieve nothing of value to anyone over existing systems that are stable, extremely well understood and established.

Edited by TheNoobPolice
Link to comment
On 1/23/2023 at 6:40 AM, Chief Blur Buster said:

 

So the renewed moral of the story: Use a DPI calculator like yours, and stick to numbers CS:GO likes (e.g. 800-clean numbers).  Apparently, if you feed it cleaner numbers that are more easily mathed by the game engine CS:GO seems to merrily keep up okay.

Ok, so please answer my simple example question. I use 400 DPI, 2.6 sens, CS:GO of course, raw input 1 and all that needed stuff. I wanted to switch to 1600 DPI (and 0.65 sens, obviously) multiple times, but I just felt that my aim was off and I wasn't confident about my movements. It wasn't caused by DPI deviations (because the real-life 360 distances matched) and wasn't caused by a shitty mouse (a modded FK2 with G305 internals). My question is: is it really just me having a hard time getting used to the different way my screen moves at 1600, or... is it what you are trying to describe? Because 0.65 x 0.022 doesn't look like a lot of decimals, so I should be fine. Also, why do you suggest using the mouse-sensitivity website for that, when we can do the simple calculation ourselves? (400 -> 1600 DPI gives 2.6/4.)

Link to comment
  • Wizard
4 minutes ago, justtypingokok said:

Also, why do you suggest using the mouse-sensitivity website for that, when we can do the simple calculation ourselves? (400 -> 1600 DPI gives 2.6/4.)

For some games this works, but try doing it for a game like BF2042 with the sensitivity set to 0 and you'll see why a calculator is needed.

Link to comment
1 minute ago, DPI Wizard said:

Try doing this for a game like BF2042 with the sensitivity set to 0, and you'll see why a calculator is needed.

No no, of course I'm not saying the website is useless or anything. I just want to know if he meant there was something special to pay attention to other than the calculations. Other than that, I'm just curious whether there really is a problem with high DPI in CS:GO or not :D

Link to comment
On 1/23/2023 at 6:31 AM, DPI Wizard said:

You really have to explain these statements better to me, because:

  1. The game has zero knowledge of your DPI (how could it know?). All the game knows is to rotate a set amount of degrees for every count it receives from the mouse. There are a couple of games and aim trainers that let you enter your DPI, but it doesn't affect how they handle sensitivity; it's just there to give you the option to view and configure your 360 distance.
  2. You keep referring to clean numbers and oddball numbers, but this makes no sense. If you have used the DPI Analyzer you should know that the configured DPI on the sensor is not the true DPI. It's a completely arbitrary number; 800 can in reality be 851.34 or 764.29 (or anything else) depending on the sensor. So you will never get a clean number, and 1734 is just as clean as any other DPI. Not that I understand why a "clean" number would make any difference.
  3. The only thing that matters is whether the sensitivity is accurate or not at the value you need to set it to. Some games do have issues with very low sensitivity values, as they will start to not register single counts etc. But again, this has nothing to do with DPI.

The short story is that there are a lot of theories floating around, many possibly incorrect.

I bet we'd agree that WAY more research is needed. 

It is quite possible that the creator of the engine (e.g. Valve) as well as the mouse manufacturers (e.g. Logitech and whoever else) created a situation where some things don't work well.  It's a long pipeline from mouse sensor to mouse firmware to USB poll to USB drivers to the game engine and the math workflow inside that engine.

So the actual truth may be completely different from what everybody wrote (myself and Razer included; I can sometimes parrot incorrect information from, say, a mouse engineer who came to the wrong conclusion).  What really stands is that a lot of systems have aimfeel problems when deviating away from 400 or 800.  More research and study is needed.

Many of us, people here included, have noticed that 400 and 800 are no longer the final frontier for today's increased resolutions and refresh rates.  Several have personally noticed that aim feels more wonky at the same calculated DPI settings in CS:GO than in certain newer engines; the reasons I described may very well be wrong, but that doesn't negate the odd behaviors that sometimes happen with an old game engine, at least for some users on some displays.

Much more conservative mouse settings were often fine at 1024x768 60Hz with no visible issues, but can produce more visible issues at 2560x1440 360Hz, as an example.  Higher resolutions, higher refresh rates and higher frame rates, as well as new sync technologies (other than VSYNC OFF) can amplify issues that weren't seen before. 

Old games weren't originally tested with those variables.  This stuff doesn't matter as much to end users today, but it matters for the future of the refresh rate race.

Link to comment
