
Chief Blur Buster

Everything posted by Chief Blur Buster

  1. I've been lurking for the last while, but chiming in again: Ideally, m_rawinput 1 makes it all moot. For the most part, this recent discussion is probably a way to figure out how to make non-rawinput behave as well as rawinput. However, make sure it's not caused solely by the Pointer Precision setting. That's one of the biggest interferers with slow-vs-fast flicks during non-rawinput mode, especially on high-pollrate mice. It's now a best practice to turn that goddamn setting off, even for Windows desktop use -- lest my desktop muscle memory start unconsciously interfering with my game-flick muscle memory. It really means the opposite -- a "Worsen Pointer Precision" setting -- on high-Hz displays using high-pollrate mice. So even for desktop use, DON'T USE THAT SETTING if you're a high-Hz user or a high-poll user -- full stop. The algorithm seems fine on your everyday DELL/HP 60Hz office monitor using the included 125Hz USB mouse on an Optiplex tower or boilerplate Lenovo laptop... but throw in your gaming Hz or gaming mouse, and it all becomes a big mess -- it's a great demo of really bad mouse mathematics. I've even seen 2x mouse pointer speed differences with the Pointer Precision setting.
You can easily notice it on the desktop by using an 8000Hz mouse running at 240Hz+ refresh rates. Use your mouse software to switch pollrate between 500Hz vs 1000Hz vs 2000Hz vs 4000Hz vs 8000Hz, and you will see that screen flick distance changes for the same hand-flick inch, due to how the mathematics in Microsoft's Pointer Precision behaves -- the difference becomes gigantic when comparing 1000Hz vs 8000Hz. It worsens further with higher poll rates and higher Hz (360Hz+). Ugly. Try it yourself: Turn that setting off, and turn off pointer acceleration in all mouse settings. Now flick the mouse 2 inches and observe how far the pointer goes at 1000Hz vs 8000Hz. Same flick distance. Now turn on the setting. Flick 2 inches and observe how far the pointer goes at 1000Hz vs 8000Hz. Massive difference (over 2x) in onscreen distance. Turn the setting off, and 125Hz-vs-8000Hz is the same flick distance, without weird effects. So don't even use the setting for desktop mouse pointer use if you're a high-Hz + gaming-mouse user.
So for rawinput=OFF use cases, I deem "Pointer Precision" the biggest interferer with mouse muscle memory amid the escalating refresh rates and poll rates. The Microsoft Pointer Precision algorithm is really wonky and produces very heavily non-linear behaviors. It doesn't let you preserve muscle memory across per-application poll rate profiles (e.g. using 2000Hz-8000Hz in games that don't get overloaded by it, and 1000Hz in other games). I simply mention this because changing multiple settings at once can cause incorrect blame to be assigned. Also, re-verifying that rawinput is enabled is important, as configuration settings don't always "take" on the first try -- e.g. forgetting the game is already running in the background when you're doing hundreds of passes of benchmark duty, or when a tester intern hired by some youtuber is doing hours of boring repetitive tests of different settings -- hey, it happens... So -- to doublecheck everything -- is it possible that the pointer errors are completely attributable to the "Pointer Precision" algorithm? Or do you still clearly see issues that can be definitively traced to the CS:GO side?
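For readers who want to see what bypassing the Windows pointer ballistics actually looks like in code, here is a minimal Win32 Raw Input sketch (roughly the mechanism an m_rawinput-style path uses to get unaccelerated deltas). It assumes an existing window handle and message loop, and trims error handling:

    // Minimal Win32 Raw Input sketch: reading unaccelerated mouse deltas,
    // bypassing the "Enhance pointer precision" ballistics curve.
    #include <windows.h>

    void RegisterMouseRawInput(HWND hwnd)
    {
        RAWINPUTDEVICE rid = {};
        rid.usUsagePage = 0x01;            // HID generic desktop controls
        rid.usUsage     = 0x02;            // HID mouse
        rid.dwFlags     = RIDEV_INPUTSINK; // receive input even when the window is unfocused
        rid.hwndTarget  = hwnd;
        RegisterRawInputDevices(&rid, 1, sizeof(rid));
    }

    // Call this from the window procedure when WM_INPUT arrives:
    void OnRawInput(LPARAM lParam, long& accumX, long& accumY)
    {
        RAWINPUT raw = {};
        UINT size = sizeof(raw);
        if (GetRawInputData((HRAWINPUT)lParam, RID_INPUT, &raw, &size,
                            sizeof(RAWINPUTHEADER)) == (UINT)-1)
            return;

        if (raw.header.dwType == RIM_TYPEMOUSE &&
            !(raw.data.mouse.usFlags & MOUSE_MOVE_ABSOLUTE))
        {
            // Relative counts straight from the device: no Pointer Precision curve applied.
            accumX += raw.data.mouse.lLastX;
            accumY += raw.data.mouse.lLastY;
        }
    }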
  2. Correct. As I said in my earlier reply, my original wall of text is indeed out of date (missing new info). See my other replies in this thread for additional context.
  3. And now I rewrite my accidentally-lost reply, as I promised. Although I will shorten it (summarized/abbreviated) because the continued discussion has already covered some ground, and the gist summarized it well. In other words, I will avoid repeating things I already said in the "gist" above, so read the gist above first.
Correct, that's a bigger weak link. Now some angles to ponder: It's a long chain from sensor to photons, and "because of how games acquire input themselves" is exactly it -- including how well or badly the engine does its mathematics. For example, if you're producing 500 rounding errors per second at 500 frames per second on a 500 Hz display, it can build up enough to create "false acceleration / false deceleration" behavior in some engines even though mouse acceleration is turned off. The more frames that spew at you, the more math rounding errors per second can occur, if the developer wasn't forward-looking enough to compensate for that possibility. The best approach is to correctly accumulate the mouse deltas and base your new mouseturn angle off that accumulation, rather than modify the previous frame's rounded-off mouselook position (see the sketch at the end of this post). 500 rounding errors per second definitely build up very fast -- to more than 1% mouselook offpoint error per second in some engines. This isn't audiophile below-the-noise-floor stuff.
Sometimes a shotgun approach is taken to fix all the weak links at once, both the human-visible (seen by 90%) and the less likely (seen by 1%) -- like the concept of preserving sensor timestamps all the way to the engine: regardless of how the engine acquires input, it increases the likelihood that the engine can avoid a lot of the pitfalls befalling us today.
Yup. Exactly.
Yup. Exactly. But it's not just about dropped packets; there are millions of causes that can jitter everything in the noise between sensor and photons.
Correct, Windows is not an RTOS.
Exactly -- that's why I am a big advocate of being able to expose higher DPI more accurately to games with fewer side effects. Although DPI and pollrate are different, higher DPI can more quickly lead to event storms in an engine, which can jitter those timestamps around a bit more often (e.g. moving 1 inch per second is likely to produce more mouse events per second, or more mouse deltas per second, at 1600dpi than at 400dpi). Even if the game engine batch-processes an event storm to prevent single-event processing overheads, this does not solve everything. Other weak links remain too, including the lack of accurate sensor timestamps (i.e. less accurate timestamps created by software long after the horses have left the barn door of the sensor). The timestamps added to mouse deltas by software can jitter many poll cycles off the actual sensor timestamps, depending on how things played out (USB jitter, software performance, etc). This is a problem even if you're reading 10 polls all at once from a mouse-delta array, if they all had timestamps added by some software middleman with no correlation to what happened at the sensor level. My stance is: "all weak links are valid -- fix all of them at once if possible". Thus, timestamp at the sensor, and preserve those timestamps all the way to the software so it can process them accurately relative to its gametime timestamps. Then who cares about some of the academic discussion in between (some visible, some not visible)...
That isn't currently the biggest benefit of an 8KHz poll rate (in an 8KHz-supported game). Many games don't handle 8KHz well enough to make the 8KHz shine, but in the best cases, you can really notice the smoothness differences in the best software on the most optimized systems. The problem is that such cases are few and far between, due to systemic limitations that push the gains below the human-benefit noise floor. Most of the time I leave my mouse at 2KHz for most software for that reason, as it delivers practically ~90% of the benefits of the 1KHz->8KHz journey. But under Windows 11, some games seemed to have difficulty even with 1KHz (a Microsoft optimization issue); then a Windows update fixed that -- and it worked fine at 1-2KHz, but not at 4-8KHz. Current poll rates push system efficiency / OS efficiency / driver efficiency to the point where 8KHz does not produce widespread, universal human benefits, so that's part of the reason why the proposed HD Mouse API is a good idea.
Now, poll rate should be slightly oversampled (for jitter). That's seen in the jitter of poll Hz beating against another frequency (a framerate, or a refresh rate), like the weird mousefeel you see on some mice and systems if you're testing 249Hz or 251Hz, which is 1Hz off a multiple of common poll rates. The jitter feel is also different depending on sync technology (VSYNC ON, VSYNC OFF or G-SYNC, etc), but a modicum of oversampling can have beneficial effects in avoiding the jittering between two frequencies. This is not important most of the time, but it's one of the many error margins that need to be addressed en masse (while one error margin may be below the human noise floor, dozens of error margins add up to visibility). Better to punch through all the noise and just preserve sensor timestamps to the game engine, rather than waste time playing whack-a-mole along the chain, anyway. While it's not my purview to determine whether a person is mad or not, I like refinements that satisfy both the mad and the non-mad. That's the shotgun journey of blasting multiple error margins out of existence at the same time (non-human-visible and human-visible ones).
This isn't "below the noise floor" stuff. It's all about geometrics: e.g. 240Hz vs 1000Hz is visible to well over 90% of a random-from-the-public average mainstream population if you design your blind test around forced eye-tracking of a forced-moving object (e.g. moving-text readability tests where successfully reading the text is the objective). Perfect 240fps and perfect 1000fps are very hard though, without all the noise margins (including jitter) that can reduce the differential. Instead of a DVD-vs-720p situation, you've got a VHS-vs-8K situation, except in the temporal dimension. A 4x motion blur difference in display persistence at unstrobed framerate=Hz is much easier to see than LCD GtG-throttled refresh rate incrementalism. Scientifically, assuming perfect framepacing, perfect sensor/step pacing, perfect GtG=0ms, and zero GPU motion blur, all motion blur is purely frametime on sample-and-hold:
- The most perfectly zeroed-out motion blur is motionblur=frametime on ANY strobeless/impulseless/flickerless/BFIless display.
- Perfect 240fps 240Hz has exactly the same motion blur as a 1/240sec film camera photograph.
- Perfect 1000fps 1000Hz has exactly the same motion blur as a 1/1000sec film camera photograph.
It will typically be worse than that (240Hz vs 360Hz feels like a 1.1x difference, not 1.5x) due to the supertankerfuls of zillions of weak links from sensor to photons, especially software, but also hardware. Common ones accrue. GtG weak link? Ditto. USB jitter weak link? Ditto. Mouse sensor flaws? Ditto. Pollrate processing issue? Ditto. GPU blur weak link? Ditto. Math rounding errors that build up faster than before due to more frames per second? Ditto. Those are just a few of the massive number of weak links, some significant, some insignificant.
It is already well known that major differences in camera shutter speeds are easier to tell apart. If you're a sports photographer, it's hard to tell apart a 1/240sec and a 1/360sec shutter photograph, but way easier to tell apart 1/240sec versus 1/1000sec for fast motion. So, perceptually, 1000fps 1000Hz will yield perfect motion (zero stroboscopics, zero blur) for motionspeeds up to 1000 pixels/sec. Now if you go to faster motion with tinier pixels, e.g. 4000 pixels/sec or 8000 pixels/sec, it can require more Hz to retina all that out. But on a 1080p display, those motionspeeds go offscreen in a fraction of a second. Not so if you're wearing a 16K 180-degree-FOV VR headset. Loosely speaking, the retina refresh rate generally occurs at some Hz above a human eye's ability to track a specific motionspeed in pixels/sec (as long as the angular resolution is resolvable enough that panning images have a different clarity than static images). This is to avoid the stroboscopic stepping effect (e.g. during fixed gaze) and to avoid the motion blur (e.g. during tracking gaze), whereupon you can't tell the display apart from real life in a temporal sense.
...Side commentary and side napkin exercise: VR ideally has to look exactly like real life. VR is forced to flicker only because we can't achieve things stroblessly. Oculus Quest 2 is 0.3ms MPRT, so it would require 3333fps 3333Hz to create the same motion quality without its current flicker-based (BFI / strobing) motion blur reduction. That underlines how far we are from eliminating motion blur strobelessly for all possible uses of a display. Until then, an A/B test between VR and real life can easily fail due to a single weak link (display motion blur forced on you above and beyond real life, or stroboscopic effect above and beyond real life, or jitter forced on you above and beyond real life, etc). And we're backporting some VR innovations back to FPS displays, given how much better framepaced / jitter-compensated things often are in the VR world than in the non-VR world currently... It's shocking, really, if you knew the weak links that we just handwave off in the non-VR world in the era of ever-bigger, ever-higher-resolution displays. Given a good system, VR achieves a near-miraculous efficiency in sensortime:photontime that is two to three orders of magnitude more accurate than an average esports system, thanks to superlative display hardware optimization and software optimization -- assuming you don't underspec for the VR content, like trying to run Half-Life: Alyx on a GTX 780. When you're head-turning at 8000 pixels/sec, anything that looks off from real life shows up far more, like a 4-pixel microstutter, which is a 0.5ms framepace error during an 8000 pixels/sec headturn. As a rule of thumb, a stutter error bigger than the MPRT motion blur is generally easily human-visible in a motion test. Even 240Hz unstrobed is still (4.167ms MPRT + "X" ms GtG) of display motion blur, which easily hides these tiny jitters...
Part of the inspiration for the HD Mouse API: headtracker data is often microsecond-timestamped and relayed to the system in a preserved way, so that tracktime:gametime:photontime is far more accurate in VR hardware engineering and software engineering than in current non-VR. And yes, GPU frame rate is a limiting factor for tomorrow's 1000fps+ 1000Hz+ world. Yes, GPU horsepower does need to keep up, but there are long-term lagless paths there (reprojection). Speaking of that, it's a VR innovation eventually to be backported to non-VR contexts in the coming years or decade, as some game developers have noticed.
That being said, this isn't only-audiophiles-can-hear-it stuff. It's more majority-of-mainstream-can-see-it-if-blindtested stuff. Even for things like browser scrolling, too (that's why Apple and Samsung came out with 120Hz). But more of the mainstream notices 4x+ differences, like 60Hz-vs-240Hz or 240Hz-vs-1000Hz, rather than merely 60Hz-vs-120Hz -- there are some games and some blind tests where the geometric difference matters more than the absolute number of Hz. Even mere browser scrolling looks noticeably clearer to most of the population at 1000fps 1000Hz versus 240fps 240Hz -- as if you've turned on strobing (except you didn't). Apple would do it already if it used zero battery power. At the end of the day, killing enough of the weak links to create maximum refresh rate perfection amplifies the differences between refresh rates. Sometimes one line item is unimportant, but we need to obliterate a hell of a lot of line items, and the mouse is undoubtedly one of the many weak links. Just as 4K was $10,000 in year 2001 (IBM T221), 4K is now a $299 Walmart special. Tomorrow's refresh rates are easily pennies extra on top of 60Hz, but we keep buying small upgrades for so long (e.g. 240Hz then 360Hz is a mere 1.5x, throttled further to ~1.1x due to weak links). And Linus said it's cheap to laglessly frame-generate extra frames, but game engines and GPU vendors haven't picked it up -- they will be forced to by 2030 to support future kilohertz-class displays.
You cited some significant issues (that I agree with). I cited significant issues too. Whether you and I agree on weak-link line items is academic next to the point that weak links are sabotaging the refresh rate race, where we don't get our money's worth out of the Hz that we ideally should. Sometimes weak links are fixed line-item by line-item. The 240Hz OLED prototypes sitting here will help solve a major GtG weak link (thanks to that zeroed-out weak link, 240Hz OLED is roughly as motion-clear as the best 360Hz, an XL2566K E-TN, when at framerate=Hz, although the XL2566K lag is still slightly more -- but that's a separate flaw to be refined, apart from display motion blur). Other weak links should be fixed en masse (e.g. sensor-level timestamps). Either way, many weak links remain. Also, given that the lab has already confirmed mainstream random-from-population visibility way above the noise floor of all that silly audiophile testing... this definitely isn't audiophile-only stuff. We've just been spoonfed refresh rate incrementalism for way too long, with too little humankind benefit, without en-masse fixing all the systemic weak links at once, including everything you just said (which indirectly agrees with several of my points, even if we may line-item disagree).
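To illustrate the delta-accumulation point made earlier in this post, here is a small hypothetical C++ sketch (not any specific engine's code; the constant reuses example numbers from this thread) contrasting per-frame single-precision accumulation against deriving the angle from an exact running total of counts:

    #include <cstdio>

    // Degrees of yaw per mouse count: 0.022 yaw * 0.03125 sensitivity (example numbers).
    const double kDegreesPerCount = 0.022 * 0.03125;

    int main()
    {
        float yawIncremental = 0.0f;   // (A) nudge last frame's rounded-off float angle every frame
        long long totalCounts = 0;     // (B) keep an exact integer running total of counts
        double yawAccumulated = 0.0;   //     and derive the angle from that total

        for (int frame = 0; frame < 500 * 600; ++frame)     // 10 minutes of 500fps frames
        {
            int frameDeltaCounts = 37;                      // pretend constant mouse motion per frame
            yawIncremental += (float)(frameDeltaCounts * kDegreesPerCount); // rounds every frame
            totalCounts    += frameDeltaCounts;                             // exact integer math
            yawAccumulated  = totalCounts * kDegreesPerCount;               // one rounding, not 500/sec
        }
        std::printf("incremental float: %.4f deg   accumulated: %.4f deg   drift: %.4f deg\n",
                    yawIncremental, yawAccumulated, yawIncremental - yawAccumulated);
    }

Run it and the incremental-float path visibly drifts away from the accumulated path, which is the flavor of buildup described above (real engines wrap yaw and vary deltas, so treat this only as an illustration of the mechanism).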
  4. Most importantly... My old "100% fail" quote is fully outdated given the information I have received over time from multiple sources. It is already retracted, in light of the solutions. Just wanted to remind everyone about that. I am now catching up on what I have not yet replied to. Stay tuned -- I'm editing my reply in Notepad++ (for autosave).
  5. How many mice have you tried at 1600? 1600 may very well have been fine in the same game engine in the 1024x768 60Hz era, but remember that the resolution race, running concurrently with the refresh rate race, amplifies the differences between 800 and 1600 by making flaws more human-visible on today's esports-standard 1080p 240Hz. Things are even more visible in the 1440p 360Hz LCD or 1440p 240Hz OLED era. I currently have 240Hz OLED prototypes here, too. This leads to a very apparent, mysterious, observed real-world-user migration to a sweet-spot DPI for older game engines (to our aghast disappointment). It's easier to feel differences now, with mouse report rates fluctuating from 0 to 1000 and back down to 0 as you move the mouse at varying speeds through the dpi steps of your current dpi setting (see the small arithmetic sketch after this post). The greater number of refresh-cycle opportunities and frames-per-second opportunities, COMBINED with the higher resolution (= more onscreen pixels per inch of mouseturn movement = easier to see varying step distances, depending on fixed/moving objects or fixed/moving eye gaze), makes these flaws easier to notice. And remember, even if you do 100fps at 240Hz, those 100fps visually round off more accurately to the correct refresh cycle, in relative-time mousetime:photontime, than they did at 100fps 60Hz, where you've forced a 16.7ms granularity AND 16.7ms of motion blur that throws a fog screen over mouse problems, mouse setting problems, and other relevant refresh-rate-race variables. Today, thanks to higher resolutions and higher refresh rates, there is less of that "fog" on displays and on current GPUs than there was in 2006.
For you, 400dpi may very well have been less than a 1-pixel step per mouseturn movement at yesteryear sensitivity settings at yesteryear computer resolutions, but when we're staring tomorrow's 4K esports in its monstrous eyes, those step distances become bigger shark bites. The same degree of mouseturn is way more pixels at the same framerate too. Yes, with today's powerful GPUs you can spew more framerate to compensate for those coarse mouseturn or pan step distances, but any problems there can still manifest in multiple ways, such as a more rapid buildup of math rounding errors (in specific games), especially in engines that were never originally tested at the framerates of a future decade. And also, high-frequency jitter creates extra display motion blur that throttles the differences between Hz (e.g. 240Hz-vs-360Hz). There are multiple concurrent and simultaneous problems occurring at the same time, and it's hard to whack-a-mole all of them out. (That's also part of the reason for the microsecond timestamps mandated at the sensor level in the proposed HD Mouse API: to punch through all the noise inside the chain, whether it be cable packet jitter or performance fluctuations in the system, etc). We just know that laboratory prototype displays need it, even if we don't need it for current esports displays. Today's lab refresh rates are tomorrow's esports refresh rates, after all. All of this really amplifies the problems of the refresh rate race and the resolution race, so the claim of "a solution looking for a problem" is just "humans can't see 30fps-vs-60fps" garbage spiel at this stage -- a dead horse in current researcher circles on my end, while other communities still fiercely continue to debate (Debatey McDebateface), as we look on from a distance.
But we are, indeed, willing to learn from your other amazing skills, because researchers like me aren't esports athletes ourselves; we play a nuts-and-bolts role in the display engineering industry. But I'll unceremoniously do line-item callouts, walking away after dropping my microphone. Many sensors are quite terrible at high DPI, but the newest sensors have become really good at 1600, and so the blame from some of us has begun shifting further up the chain (i.e. the game engine). It takes only one math weak link, and I suspect the people I talked to are only speculating about where it is, but right now more esports-industry researchers have begun blaming the engine for weaknesses that only show up in the current refresh rate race. I wish we could all get on the same songsheet, mind you. I haven't personally looked at the leaked CS:GO source code to Sherlock-Holmes the bleep out of it, but I bet someone else in the world is itching to.
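As a back-of-napkin illustration of that fluctuating-report-rate point, the count rate a sensor generates is roughly DPI multiplied by hand speed, and slow movements at low DPI deliver far fewer useful reports per second than the poll rate allows. A tiny C++ sketch with illustrative numbers (not measurements):

    #include <algorithm>
    #include <cstdio>

    int main()
    {
        const double pollRateHz = 1000.0;
        const double speeds[] = {0.25, 1.0, 4.0};   // mouse hand speed, inches/sec
        const double dpis[]   = {400.0, 1600.0};

        for (double dpi : dpis)
            for (double ips : speeds)
            {
                double countsPerSec  = dpi * ips;                           // sensor counts generated
                double updatesPerSec = std::min(pollRateHz, countsPerSec);  // polls carrying new motion
                std::printf("%6.0f dpi @ %.2f in/s -> %6.0f counts/s, ~%4.0f useful reports/s\n",
                            dpi, ips, countsPerSec, updatesPerSec);
            }
        // e.g. 400 dpi at 0.25 in/s -> only ~100 motion updates/sec even on a 1000Hz mouse,
        // while 1600 dpi at the same slow speed still delivers ~400/sec.
    }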
  6. I was typing a reply to you, and halfway in -- accidentally closed the browser tab with a misclick (ah, oops). I'll retry my reply soon. But in short: great points, with some extra info added. (Brief gist: Your points were 80% good, 20% slightly behind the times -- due to increased Hz and resolutions making things visible that weren't before. The same mouse flaw shows up more clearly at 4K 240Hz than at 800x600 60Hz. I also have a 1000Hz test display sitting here too, which ...ahem... has some interesting effects on jitter visibility. (BTW, Viewpixx sells 1440Hz laboratory test projectors, though that requires some custom driver programming). The nutshell is that higher resolutions, higher refresh rates, and higher frame rates all concurrently amplify smaller flaws. I was also talking about how displays behave differently in four situations -- 1. "stationary eyes, moving object", 2. "stationary eyes, stationary object", 3. "moving eyes, moving object", 4. "moving eyes, stationary object" -- motion demonstration www.testufo.com/eyetracking -- and not all games behave the same way here. Some of esports is slowly moving to 1440p already, after camping at the 1080p status quo for a while, and this can continue to go up as long as tech allows. Also, a fixed gaze on the crosshairs while mouseturning can miss a lot of mouse jitter that is very visible with certain DOTA2 gaming techniques where you eye-track things during a pan, or in arena-style FPS where eye-tracking away from the crosshairs happens more often. Jitter and display motion blur (including the extra motion blur added by fast-vibrating jitter) can be over 10x more visible in one of the four situations than another. Thus, it is simply a strawman argument to claim this is a non-issue in the refresh rate race. We just respect that everybody sees differently. Just look at the different eye-movement patterns, the different eyeglasses prescriptions, the 12% of the population that is colorblind, the less motion-sensitive, etc. It's easy to silo ourselves into just a CS:GO community and not realize the Rainbow Six esports champion used strobing, or other surprising things some CS:GO veterans were unaware of. Etc.) (Will rewrite my fuller attempted reply later)
  7. The short story is that a lot of theories abound, many possibly incorrect. I bet we'd agree that WAY more research is needed. It is quite possible that the creator of the engine (e.g. Valve) as well as the mouse manufacturers (e.g. Logitech and whomever) created a situation where some things don't work well. It's a long pipeline from mouse sensor to mouse firmware to USB poll to USB drivers to game engine and the math workflow inside a game engine. So the actual truth may be completely different from what everybody wrote (myself and Razer included; I can sometimes parrot incorrect information from, say, a mouse engineer who came to the wrong conclusion). What really stands is that a lot of systems have aimfeel problems when deviating away from 400 or 800. More research and study is needed. Many of us, people here included, have noticed that 400 and 800 are no longer the final frontier for today's increased resolutions at increased refresh rates. Several have personally noticed that aimfeel is more wonky at the same calculated DPI settings in CS:GO than in certain newer engines; the reasons I described may very well be wrong, but that doesn't negate the odd behaviors that sometimes happen with an old game engine, at least for some users on some displays. Much more conservative mouse settings were often fine at 1024x768 60Hz with no visible issues, but can produce more visible issues at 2560x1440 360Hz, as an example. Higher resolutions, higher refresh rates, and higher frame rates, as well as new sync technologies (other than VSYNC OFF), can amplify issues that weren't seen before. Old games weren't originally tested with those testing variables. This stuff doesn't matter as much to end users today, but it matters for the future of the refresh rate race.
  8. I'm not talking about the math that a user can do. I'm talking about math rounding errors in game engine programming and graphics drivers. Much graphics rendering uses "float" instead of "double", and sometimes this creates things like Z-fighting artifacts or other effects. What happens is that inside software, many math operations can occur at lower precision than they should before being converted down to the final rendering precision. The newer programming best practice is to keep higher precision until the final stage (rendering the exact position of scenery).
0.022 * 0.03125 * 25600 = 17.6 degrees
This appears to produce good comfort for the Source Engine. It results in steps that are math-comfortable, since a "float" data type in computer programming holds only 6 to 7 significant digits (Source Engine rendering uses floats). Even the intermediate step (e.g. 0.022 * 0.03125) is at risk of producing many digits (0.0006875), but that still fits within the float datatype. Now, if you use a different sensitivity number than "0.03125" and a different dpi than "25600", you may produce more digits than can fit inside a "float". Woe to the game engine if the final degree value is an infinitely repeating decimal. Now, at 500 frames per second you get 500 math rounding errors per second (from float being only 6-7 significant digits INSIDE the game engine), and you get potential wonkiness. This is also why 1000fps in CS:GO has more problems than 300fps, do you realize? (Do you realize that the Source Engine only keeps 6 to 7 significant digits?) Many newbies don't understand this. No wonder some experienced people like me can be thrown off too -- because both you and I are correct simultaneously, depending on what you enter in the math variables for the game engine... Just as you don't want to use too few bits when doing repeated photo filters or repeated GPU effects (color banding from rounding effects), you don't want to use too few digits when a game engine does repeated math on mouse deltas in single-precision rendering engines. Some engines keep high precision (doubles) when calculating 6dof positionals before finally rendering at single precision, and make sure that math errors do not accumulate over multiple inputs. A tiny demonstration of this single-precision drift follows at the end of this post.
So the renewed moral of the story: Use a DPI calculator like yours, and stick to numbers CS:GO likes (e.g. 800-clean numbers). Apparently, if you feed it cleaner numbers that are more easily mathed by the game engine, CS:GO seems to merrily keep up okay. There may be other causes of the problems witnessed (e.g. system limitations), but that doesn't mean math rounding errors can't also occur INSIDE a game engine, given that many engines keep as few as 6 significant digits during their positional math... What many non-programmers don't know is that there are math rounding errors INSIDE the game software and INSIDE the graphics drivers. I do, however, wholeheartedly admit my mistake in blanket-failing CS:GO. Correction: CS:GO works beautifully at high DPI provided you carefully calculate sensitivity numbers against DPI numbers to prevent too many rounding errors inside the single-precision Source Engine. Even many new engines are still single-precision, but some codebases keep intermediate positional calculations in doubles (to prevent accumulation of rounding errors) before the render. I shall begin referring more users to your mouse calculator if they want to experiment with numbers far beyond the mouse technology available at the time the Source Engine was developed.
Unlike before, I now know this is clean. What I did not know at the time I wrote the "FAIL" is that the game engine's mouse mathematics are fairly accurate again with some of this tweaking (e.g. editing the configuration file instead of adjusting the coarse slider). If you go to oddball numbers, like 1734 instead of 800, CS:GO doesn't behave as well as newer engines do with oddball numbers. So a newer engine such as Valorant may mouseturn perfectly at very oddball numbers, while CS:GO doesn't necessarily. So the magic number is to keep it at 800-equivalence, and you're fine with using higher DPI in CS:GO. That, I didn't know earlier, and you did correct me. So some of my old recommendations are now out of date. Instead of a blanket "bad at everything above 800", you can do it with properly calculated numbers (like yours) -- except there is less flexibility to "aberrate" away from the calculated numbers without side effects than you would have with a newer engine that has fewer math rounding errors inside it. The new rule of thumb is to try to stay with clean numbers (tell people to go to your site if tweaking a retro engine like CS:GO to >800dpi). Obviously, most people in esports aren't always configuration-file-savvy.
That's why I am pushing hard for the High Definition Mouse Extensions API, a proposal that I have written. Please feel free to flame / roast / critique its flaws, so it becomes an even better API (if vendors listen). Remember, it's for future software, not for past software (old games), though popular engines such as Fortnite could theoretically retroactively add a menu setting to enable it, upon popular demand. Easily getting a proper preferred feel (e.g. an 800 feel) in all supported games at 24000-equivalent poll (full sensor rate with microsecond timestamps relayed to software), at your own adjustable latency that the system can keep up with (1000 arrays per second, 2000 arrays per second, or 12345 arrays per second). So at 2000/sec, that's 12 rawinput mouse deltas embedded within each packet from the mouse. Way overkill, but you get the picture -- it gives a path to 8000 mouse deltas/sec at only 1000 polls/sec that doesn't kill a CPU core in midrange systems. So you can keep your 1000 pollrate, but still get 8000 timestamped rawinput mouse deltas per second, communicated as 1000 packets per second (configurable), if your system cannot handle an 8000 rate. Likewise if your system can handle 8000 but can't handle 24000, and so on. The numbers within are just arbitrary examples -- it simply uncaps mouse innovation so it isn't bottlenecked by software performance or jitter limitations -- especially on a USB-plugs-overloaded system that adds lots of jitter to untimestamped mouse deltas.
Note: Think about this random napkin exercise: In tomorrow's 1000Hz+ world (and far beyond), we eventually need to accurately achieve frame rates higher than a specific computer's realistically achievable mouse packet rate, which is partially why I added the microsecond timestamping requirement as well as the multiple-mouse-deltas-per-packet feature. Imagine feeding a future 100fps->4000fps reprojection engine (e.g. DLSS 5.0+, FSR 4.0+, XeSS 4.0+) with only 1000 or 2000 mouse packets per second. With the HD Mouse API, the full sensor rate (e.g. "24KHz pollrate" or whatever you configure your mouse to) can be relayed to the game engine (accurately microsecond-timestamped) at a configurable bunched number of packets per second that is not too overloading for the computer. And because reprojection requires many mouse positions between the original triangle-rendered frames, the two fit together like a glove, without succumbing to a specific computer's mouse event rate limitation. This can be done with multiple mouse deltas per packet, along with their sensor timestamps.
Before replying, please watch this YouTube video about the primary enabling technology of the 4K >1000fps >1000Hz future of the 2030s+, to understand an early-bird Wright-Brothers hint of one of the reasons why I proposed the HD Mouse Extensions API years ahead of the mouse manufacturers, the OS vendors, and the GPU manufacturers. Someone told me I am the temporal version (Hz, GtG, MPRT, VRR, lag) of a Japanese 1980s MUSE high-definition-television researcher. They're not too far off. It's "big whoop" in 2020, like talking about 4K in the 1980s, but it's essential early researcher talk applicable to 2030s esports. Again, researchers found that the retina refresh rate is quintuple-digits (though people needed to upgrade frame rate and refresh rate by gigantic amounts, 2x-4x, WHILE GtG=0 simultaneously, to notice a difference, when that close to the vanishing point of the diminishing curve of returns).
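Here is the promised tiny demonstration of single-precision drift: the same per-count yaw step accumulated in float versus double, using the "clean" numbers from this post plus a hypothetical oddball combo. It is illustrative only; the Source Engine's actual code paths are not reproduced here.

    #include <cstdio>

    int main()
    {
        // "Clean" combo from the post: 0.022 * 0.03125 = 0.0006875 deg per count (at 25600 dpi).
        const double stepClean   = 0.022 * 0.03125;
        // Hypothetical "oddball" combo producing many digits per count.
        const double stepOddball = 0.022 * 0.0289751;

        const double steps[] = {stepClean, stepOddball};
        for (double step : steps)
        {
            float  f = 0.0f;
            double d = 0.0;
            for (int i = 0; i < 500000; ++i)   // ~500k sensor counts (about 20 inches of travel at 25600 dpi)
            {
                f += (float)step;              // float keeps only ~7 significant digits
                d += step;
            }
            std::printf("step=%.9f  float=%.6f  double=%.6f  drift=%.6f deg\n",
                        step, f, d, f - d);
        }
    }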
  9. I've been slowly working to convince some people at mouse manufacturers / Microsoft behind the scenes that we need a better mode of relaying mouse data precisely without killing CPUs. 8KHz can really slam many CPUs, especially on crappy motherboards, crappy drivers, and crappy OSes. Imagine getting 24KHz quality (full sensor rate) through only a manageable 2000 or 4000 mouse communications per second, fully system-jitter-compensated, without pegging a CPU core to 100% (a hypothetical sketch of such a packet layout follows at the end of this post). The end user could configure it to meet their system needs and game compatibility. BTW, I must say I am pissed that Windows 11 does not keep up as well with my 8KHz mouse for some reason, possibly immature mouse drivers.
BTW, I was the world's first person to run 1000Hz directly from Microsoft Windows, on an internal laboratory prototype display. I contacted Microsoft and they added a hidden Windows Insider registry key to shatter the 500Hz limitation, and sent me the information privately (NDA). But now it's in public Windows builds. From what I last heard, Windows 11 now finally supports >500Hz, given the new 540Hz monitor being released. But there's a bad problem: Windows 11 does not seem as 8KHz-mouse-friendly as Windows 10. Ouch. I do wonder if Microsoft has backported the >500Hz support to Windows 10, given the upcoming flood of >500Hz monitors later this decade. (1000Hz OLED 2030 FTW...)
(Aside: At this point in the diminishing curve, always upgrade refresh rate by geometric increments such as 2x if possible, e.g. 60Hz -> 144Hz -> 360Hz -> 1000Hz. But if GtG is near 0, like an OLED, then upgrading refresh rate by 1.5x is okay. IPS LCD 240Hz-vs-360Hz is often only 1.2x-1.3x due to GtG (and diminished further to 1.1x due to high-frequency jitter adding extra blur; cue YouTuber 240Hz-vs-360Hz "upgrade worthlessness sensationalism"). But OLED 240Hz-vs-360Hz is a 1.5x motion blur difference when perfectly framepaced with an excellent mouse, and 240Hz-vs-1000Hz OLED is a 4x blur difference. Strobeless blur reduction FTW! I have a 240Hz OLED on my desk now as part of my work with one of the monitor manufacturers -- and I can confirm that 240Hz OLED has clearer motion than non-strobed 360Hz, since there's no GtG blur, only MPRT blur (GtG and MPRT are both concurrently important, but 0ms GtG does not fix motion blur without fixing MPRT too). Strobed is still clearer, but some people don't like strobed. That being said, 240Hz is still very far from the final frontier, as explained in my 2018 article, Blur Busters Law: The Amazing Journey To Future 1000Hz Displays. The retina refresh rate can be quintuple digits under the most extreme variables, but that deep into the diminishing curve of returns requires extremely steep geometric Hz upgrades to still be noticed, before the vanishing point of the diminishing curve of returns.)
This is another argument in favour of the HD Mouse Extensions API proposal. We need higher-Hz displays, higher-Hz mice, and improved precision in the whole workflow. Presently, I'm the expert in many communities on the Present()-to-photons blackbox, by the way -- but jitter that occurs before the hardware is a giant problem in the current refresh rate race requiring geometric upgrades, since high-frequency stutter/jitter also adds display motion blur that obscures display refresh rate differences. 70 stutters per second at 360Hz from any source (mouse, software, whatever) is just a blur, like a fast-vibrating music string, as seen in the stutter-to-blur continuum animation.
(Yes, there's a GPU solution to doing 1000fps at UE5 quality. Reprojection tests, ported from VR to PC and utilizing 10:1 ratios instead of a 2:1 ratio, have shown 4K 1000fps 1000Hz UE5-quality is possible on an RTX 4000 series. See the LinusTechTips video for an example). While LinusTechTips extols reprojection benefits for 30fps->240fps, I think 10 years ahead: 100fps->1000fps. I did some tests to show that the demo was indeed, of course, capable of a 10:1 ratio. Far fewer artifacts (far fewer than DLSS 3.0!) and perceptually lagless. But we need to see this ported to a UE5 engine instead. I predicted frame rate amplification at 10:1 ratios way back in 2018 and, lo and behold, I've downloaded a demo that converts 1440p 30fps to 280fps using only roughly 10% overhead of a Razer Blade 15 GPU. While the graphics in that demo are simple, the remaining 90% of the GPU can do anything else, such as render UE5-quality visuals, since reprojection is detail-level-independent. While the downloadable demo's reprojection is not as good and artifact-free as Oculus ASW 2.0, I made a new discovery that starting with a minimum of 100fps to reproject to >1000fps produced far fewer reprojection artifacts, because the artifacts vibrated far beyond the flicker fusion frequency. (TL;DR: A 4K 1000fps 1000Hz 1ms-lag GPU with UE5 detail level is thus, apparently, already here today with reprojection help. It just hasn't been milked properly yet.) I think that some game developers have noticed (I'm hoping Epic did -- UE5 needs optional reprojection support, stat). By 2025-2030, reprojection will probably be an optional feature (e.g. DLSS 4.0+, FSR 4.0+, XeSS 4.0+), though it requires the game engine to communicate movements to the frame generation API for perceptually lossless reprojection (which occurs at roughly a ~100fps base framerate -- reprojecting from a lower frame rate has too many artifacts). The new gist of thinking for the upcoming 1000fps 1000Hz ecosystem amongst my circle of researchers is reprojecting ~100fps -> ~1000fps for strobeless blur reduction (reducing display motion blur by 90% without strobing and without black frame insertion). This system mandatorily requires communicating movement deltas (which means mouse improvements apply here!) to the reprojection engine between the original triangle-rendered frames. So a modification is made to the game engine to continuously communicate movement deltas to the frame generation API, for zero-latency frame generation. (Retroactive reprojection allows even 10ms frame rendertimes to have only 1ms mouse-to-GPU-framebuffer latency, by the way! Research papers are coming about this.) (Reprojection isn't necessarily evil in an era where we've accepted Netflix's use of the mandatory interpolation features in the H.264 and HEVC video codecs to perceptually losslessly produce 24fps from 1fps of non-interpolated frames. This is better than grandpa's television interpolation, because the original encoder had perfect ground-truth knowledge of the original frames.) And herein lies the argument why a system like the High Definition Mouse Extensions API is necessary for tomorrow's 1000Hz ecosystem. It will take me years to convince vendors to do things in a totally new way, to get away from the problems. But you can help educate me on my gaps of knowledge, despite me being the expert on the Present()-to-photons black box chain (my "VSYNC OFF tearlines are just rasters" beam racing raster demo).
That being said, this is not important for many old games and today's games, but it will be very important for the upcoming 1000fps 1000Hz world, to prevent things from vanishing below the human-visibility floor of the diminishing curve of returns. For example, my proposed High Definition Mouse Extensions API mandates microsecond-timestamping of mouse polls at the mouse sensor level, preserved (through the jittery USB and jittery software) all the way to the videogame, for better mousepolltime:photontime synchronization that is immunized against the software's own jitter. There's more than one way to "feel" 125 vs 1000 vs 8000: lag feel, jitter feel, visual feel, etc. Which one matters depends on what you're doing. I can most definitely "see" it, at the least, even if I can't feel the lag. Many years ago, I published a simplified image of the temporal aliasing effect between frame rate and refresh rate in refresh-synchronized situations (e.g. Windows Desktop and VSYNC ON situations). Then it became much worse when we started dealing with 240Hz, 280Hz, 360Hz and 390Hz, with those odd refresh rate divisors amplifying 1000Hz jitter much more than in this montage of old photographs from way back in the 120Hz/144Hz days. This is actually a modified version of an old 2014 montage (sans 2000Hz) I shared with my blog readers at the time, back when Blur Busters was merely an unpaid hobby of mine. See... I've been pleading for 2000Hz+ since 2016. I screamed hard at Razer for years. This also translates to frame rate in a roundabout way (in games with no mouse smoothing / interpolation, which is frowned upon anyway), where a low pollrate translates to more motion blur (stutter-to-blur continuum animation demo -- www.testufo.com/eyetracking#speed=-1 ...): low frame rates vibrate noticeably like a slow music string, and high frame rates vibrate beyond the flicker fusion threshold like a fast music string. A 125Hz poll rate means things will essentially run at 125fps even if your monitor is 360Hz, and you will have no less motion blur than 1/125sec of persistence, since the 125Hz pollrate essentially throttles the game frame rate. You don't need to feel the lag of a millisecond to win by the millisecond -- like two Olympic sprinters crossing the finish line at the same time: they see the scoreboard and find out they were only a few milliseconds apart. Likewise, see-and-shoot-simultaneously situations in FPS games are also a metaphorical latency race to the finish line; even if you can't "feel" the latency, you may still win by the latency advantage. Likewise, a low DPI of 400 can be problematic for frame rate -- moving the mouse only 1/4 inch over a period of 1 second gives only 100 reports. That limits your (say, mousepan or mouseturn) frame rate to only 100fps. Most experienced people in this forum already know this better than I do, but I just wanted to add that, in case you were unaware. I will wholeheartedly admit maybe I do not know as much about mice as some of you -- but you cannot deny I tipped a few dominoes -- and you cannot deny my industry connections -- and you cannot deny some necessities that will be needed for the future 1000fps 1000Hz ecosystem (2030) -- and high Hz doesn't just benefit esports, given the slow mainstreaming of high Hz (phones, tablets, consoles). In fact, I even met Apple engineers at the DisplayWeek convention who were gawking at BOE's 500Hz panel in May 2022.
Being conservative about costs and battery consumption, I bet consumer tablets will still only be 240Hz OLED by 2030, while esports will probably have widespread 1000Hz "near 0ms GtG" displays by ~2030-ish. This will necessitate better mouse handling for the humankind benefits to remain visible -- full stop. Especially now, since our recent discovery that mouse jitter adds extra display motion blur (research articles still yet to be released), where high-frequency mouse jitter (sensortime:gametime sync error margin, from many causes such as USB jitter and CPU fluctuations in the mouse driver, games, OS, etc) is just extra persistence blur, degrading motion blur by approximately 10-20%. This can turn a 240Hz-vs-360Hz 1.3x difference (throttled down from 1.5x to 1.3x due to GtG) into a mere 1.1x difference (mouse jitter adding more blur). We're the discoverers of the error margins of the future, applicable to optimizing future games whose software development has not yet started. Blur Busters was a hobby that involved tons of forum posts, so longtimers know my legacy. This continues today, with my walls of text, even over a decade later. Now look at me: I'm cited in over 25 peer-reviewed papers (Google Scholar), thanks to some of my display-motion-blur-related work. Obviously, my posts are targeted at researcher/engineer/prosumer/etc communities, so my posts get a bit complex, as I try to "Popular Science" or "Coles Notes" a science paper -- still confusing for the average esports gamer. Now, that being said, while I know what happens to pixels after frame Present()-ation all the way to them becoming photons, I am certainly sometimes deficient in some parts of the chain before the Present(). That doesn't mean I know squat. There are many paths forward, and it is here where I'm interested in hearing from other computer programmers / game programmers / researchers / etc. If you want to help influence tomorrow's 1000Hz ecosystem -- and help course-correct my mistakes -- then most certainly get in touch with me.
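To make the multiple-deltas-per-packet idea concrete, here is a purely hypothetical C++ data-layout sketch in the spirit of the proposed HD Mouse Extensions API. The names and fields are illustrative only -- this is not an existing or finalized API:

    #include <cstdint>
    #include <vector>

    struct HdMouseDelta
    {
        uint64_t sensorTimestampUs;  // microsecond timestamp applied at the sensor/firmware
        int32_t  dx;                 // raw counts since the previous delta
        int32_t  dy;
    };

    struct HdMousePacket
    {
        uint32_t deviceId;
        uint16_t dpi;                       // current hardware resolution
        std::vector<HdMouseDelta> deltas;   // e.g. 8 deltas per packet = 8000 deltas/sec
                                            // delivered as only 1000 packets/sec
    };

    // Consumer side: a game (or reprojection stage) replays the deltas against its own
    // gametime, using the preserved sensor timestamps instead of packet arrival times.
    void ConsumePacket(const HdMousePacket& pkt,
                       void (*applyDelta)(uint64_t sensorUs, int dx, int dy))
    {
        for (const HdMouseDelta& d : pkt.deltas)
            applyDelta(d.sensorTimestampUs, d.dx, d.dy);
    }

The point of the sketch is the separation of delivery rate (packets per second, tuned to what the system can handle) from sensor rate (deltas per second, each carrying its own timestamp).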
  10. BTW, although not a mouse expert, I did tip some dominoes in the mouse world behind the scenes. A Twitter debate that I started with some academics who doubted the benefit of high poll rates... The Twitter debate triggered a research paper by the same individuals that conceded that humans could see imperfections at 1KHz that disappeared at 8KHz. They didn't believe 8KHz had any humankind benefit, while I strived to convince them otherwise. [Then, about six months later] The research paper now has a registered DOI: "Do We Need a Faster Mouse? Empirical Evaluation of Asynchronicity-Induced Jitter" https://dl.acm.org/doi/10.1145/3472749.3474783 And kudos that they published it open access with no paywall (link to PDF). Some cherrypicked images from the paper: NOTE: Yes, yes, the paper can still be slightly improved with more data. There are certainly more tests needed (VSYNC ON versus VSYNC OFF, fixed gaze at the crosshairs versus tracking gaze at moving objects, a la The Stroboscopic Effect of Finite Frame Rates) -- which can produce different human visibilities of 1KHz vs 8KHz... Now, that being said -- these tests were done without strobing. If you enable strobing such as DyAc (less motion blur makes mouse jitter more visible), the green squares seem to move maybe one or two squares to the right. As long as the mouse jitter (in milliseconds of error) is significantly worse than the motion blur MPRT (in milliseconds -- the MPRT number, not the GtG number), there are odds of it becoming human-visible. Although I'm not cited in the paper (unlike other papers, from the Blur Busters research portal at www.blurbusters.com/area51 ...), the researchers did acknowledge that the Twitter debate triggered their decision to research the matter... I just use the mouse profile software to automatically set 1KHz, 2KHz or higher depending on what executable file is running -- but when just at the Windows desktop I usually keep it at 8KHz, because it's noticeable on my mouse cursor: (The stroboscopic stepping effect is something I write about in one of my articles. Some people aren't bothered by it. But some are. Like some people are bothered by tearing, others are not; others really love high Hz, others are fine with 60Hz, and so on -- and it depends on esports vs non-esports contexts). I could clearly see 1KHz vs 8KHz even at the Windows desktop just by circling the mouse pointer, as a jitter effect between pollrate and VSYNC, since for many (non-esports) use cases you want VSYNC, such as for smooth scrolling and smooth mouse pointer operations (e.g. painting, etc). In practicality, the game, the port, the drivers, the software -- the whole chain -- needs to be efficient enough to handle it without bogging down the game. I've found 2KHz to be an excellent compromise most of the time. 8KHz plus a system that can't keep up can be worse than 1KHz, so tweaking and optimization are needed, as well as a game that can handle that mouse input rate (without pegging a CPU core to 100%). But all things equal, I absolutely love 8KHz even just during plain web surfing, Adobe Photoshop, Visual Studio, etc. It's very visible to me. Just wanted to add some useful background information from my beta-testing of 8KHz.
  11. BTW -- I wanted to add a compliment -- yours is one of my favourite sites for calculating correct dpi / edpi / related stuff. While my specialty is Hz and not mice, let me provide some rationale for my past recommendations, to give better context... I use the DPI Calculator/Analyzer from time to time too, as a quick calculation for some purposes. Maybe only once a year or every few months, but it's something I'm super glad exists. Good job.
Remember, that post was written a year ago during my early beta testing of the Razer 8KHz, which I was a beta tester for. My engineer contact at Razer found issues in some engines and realized that the mouse mathematics are better in some newer games, though that is only one contributory factor (another being, of course, the end user not bothering to configure correctly -- older engines are less forgiving than newer engines, even when editing the configuration file for more sensitivity digits). We've learned more since, if you've read my newer posts -- and I'm already in 25+ peer-reviewed papers (purple Research button on the website), including citations by NVIDIA, Samsung, NIST.gov, etc. So it's pretty clear I know my stuff in a different niche, even if not as well for mice. But that doesn't mean we can't collaborate to (A) find the causes; (B) avoid the math problems; (C) get more of the esports community to use higher DPI. Better we work together, after all, picking each other's brains on the portions of our skills we are smartest at. I would, however, love to see more peer-reviewed papers on mouse edpi. Maybe a researcher here would love to help co-author?
Clearly, there are fewer good DPI+sensitivity combos in older games than in newer games. When I was beta testing the Razer (8KHz), they did some tests and privately discovered bad mouse mathematics in some games, at least for some DPI+sensitivity combinations. If you toe the 800 edpi line with some good manual mouse mathematics, you can prevent a lot of issues and keep everything better synced between CS:GO and newer games. Few bother to use the DPI analyzer, but let's rightfully pan how unintuitive older games are out of the box for DPI adjustments by people who don't bother to use utilities. Newer games handle oddball DPI and oddball sensitivities better (e.g. 1234dpi at 0.789647 sensitivity or whatever random combo) with less jitter/rounding. In code, some engines use 'float' for the mouse mathematics (or worse, integers), while newer games are more likely to use 'double' and clamp to the final engine's precision after finishing mathing it all out. It may only be minor differences that are only noticed by seasoned esports players. Now, if one correctly uses the DPI Calculator/Wizard and similar utilities in a right-tool-for-the-right-job way, a lot of this definitely goes away, it appears -- from more recent anecdotes. However, that doesn't make my post utter drivel, given that the engineers at Razer had discovered the same thing as well. While pollrate and dpi are separate, keep in mind that pollrate and dpi can interact with each other in unexpected/weird ways. High pollrate creates CPU fluctuation surges, which can create some mouse jitter (polltime:rendertime goes out of sync because the game or CPU processed the high poll rate inefficiently), and mouse jitter dynamics can be affected by it.
Like when you configure a Razer 8KHz to the max 8000Hz and really rip the mouse around: a CPU core gets pegged at 100%, the mouseturns start weirdly stuttering, and some people (incorrectly) tried changing DPI to fix it. It changed the jitter-feel, even if it didn't completely solve the problem, so the smoothness-feel in "CPU duress situations" can create some unexpected interactions between pollrate and dpi, depending on how the mouse mathematics in an engine are done. If you have a perfect system, keep it up to date, and it handles your mouse well, then you probably won't notice a problem -- but not everybody's system is your system. I'm a C++ and C# coder, and I am shocked how bad the mouse mathematics are in some engines. It does appear, though, that you can tweak your way to some good combinations that feel good, by using utilities such as DPI Wizard. While I'm a bigger expert in Hz than mice, not all FPS players are computer coders and have seen what I've seen... It may not affect aim if you configure edpi correctly, but if you use oddball numbers -- like randomly sliding sensitivity sliders, or using oddball DPI in existing mouse utilities -- then the math errors add up (e.g. mouse sensor interpolation, mouse driver smoothing, game rounding errors, etc) -- the whole chain. It's all very poorly researched, but problems definitely exist with suboptimal configurations. It takes less work with newer engines to keep the chain 'math clean', but DPI Wizard is a great way to mitigate this and not notice any math problem. So this is probably just a simple case of "I'm right too, and you're right too", in a non-mutually-exclusive manner -- depending on the settings. Knowledge can certainly be improved, though! Thoughts are also welcome on the High Definition Mouse API Proposal, too!
(P.S. Sorry I write big texts. I wish I could do YouTube and shorter stuff, but I was born deaf, and walls of text are my thing. But many of my fans love them! It's been my hallmark for stopping the 30fps-vs-60fps laughing over the years.)
  12. I'd like to crosspost this here, as it provides more insight on some observations. That's a fantastic anecdote. With some really good hard-core configuring -- text-editing the configuration file for many settings, including raw input, ultra-precise sensitivity numbers, etc -- it's possible to make >800dpi usable in CS:GO. But it's not really easy. >800dpi simply feels awful most of the time in CS:GO, so I'm pretty impressed he got 25600dpi to work great. The mouse mathematics inside CS:GO become very wonky when you're far away from a compensated 800 "edpi", so clearly this gamer found a magic sweet-spot 0.0313 sensitivity that eliminated mouse math errors at >800dpi by maintaining 800 edpi. So that's my theory -- the mouse mathematics in CS:GO become accurate again at 800 edpi. It's much like how CS:GO goes kinda wonky at >1000fps; CS:GO has both a sweet-spot frame rate range and a sweet-spot "edpi" range, and outside of them it goes wonky. Mouseturns don't perform consistently across the entire hardware-DPI range at all sensitivity ranges, so the mouse mathematics in the old game aren't as good as in newer games. But it's not impossible to fight against it and work around it with magic-number sensitivities that take some knowledge to calculate. Keeping it to 800 edpi (emphasis on the "e") is definitely key; that preserves training memory (also known as "muscle memory"). Syncing sensitivity across all games is not intuitive to most gamers at oddball DPI.
_____
Now a very deadly serious esports question. Is there currently any software that can automatically sync "edpi" across all games, and automatically recompute/recalculate/rewrite configuration files, every time you change mouse DPI? Play at 800dpi, exit the game, change your mouse DPI higher and restart the software -- no change to your fast flicks, except your slow flicks improve. Your mouseturns are still the same speed -- except your slow slews (sniper, slow crosshair tracking, and more jitter-free slowturns with unaffected fast flickturns, etc) start to feel massively better at >800dpi. This would help train gamers on why high DPI is a gigantic boon, if configured properly. There's a need for such edpi-synchronization software in the esports industry, to break out of the 400dpi-800dpi dregs without wrecking training memories (muscle memory). Choosing 800 edpi in an edpi system tray app while your mouse is at 3200dpi, and seeing all your games automatically sync (configuration files modified by the Automatic EDPI System Tray App, etc) would be a giant boon. No manual calculating, no edpi wizards, no wrecked muscle memories, etc. Perfect sensitivity numbers in all of them. An automatic edpi configurator would be a good compromise before my proposed High Definition Mouse Extensions API Proposal (https://forums.blurbusters.com/viewtopic.php?t=7739). It would need to know how to modify the configuration files of the common games (CS:GO, Fortnite, Valorant, Rocket League, Overwatch, DOTA2, LoL, and other popular esports games), but it would allow 3200dpi+ nirvana in esports on modern mouse sensors without wrecking 800dpi muscle memories (training memory). While I'm the Hz expert and not as expert on mice, improved recommendations about mice are needed from Blur Busters, and I'd love to get one of you DPI professionals to write an article for Blur Busters.
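For the record, the core math such a hypothetical edpi-sync tool would perform is small -- a minimal C++ sketch, assuming a Source-style yaw constant of 0.022 degrees per count (function and variable names are my own illustration, not an existing utility):

    #include <cstdio>

    const double kYawDegPerCount = 0.022;

    // Turn circumference in centimeters per full 360-degree rotation.
    double CmPer360(double dpi, double sensitivity)
    {
        double degPerInch = kYawDegPerCount * sensitivity * dpi;
        return (360.0 / degPerInch) * 2.54;
    }

    // Preserve edpi (dpi * sensitivity) exactly across a hardware DPI change.
    double ResyncSensitivity(double oldDpi, double oldSens, double newDpi)
    {
        return oldSens * (oldDpi / newDpi);
    }

    int main()
    {
        double oldDpi = 800, oldSens = 1.0;      // 800 edpi baseline
        double newDpi = 25600;
        double newSens = ResyncSensitivity(oldDpi, oldSens, newDpi);   // 0.03125
        std::printf("old: %.2f cm/360   new: %.2f cm/360 at sens %.5f\n",
                    CmPer360(oldDpi, oldSens), CmPer360(newDpi, newSens), newSens);
        // The hypothetical tool would then rewrite each game's config file with newSens.
    }

With the thread's example numbers, an 800dpi / 1.0-sensitivity baseline maps to 0.03125 at 25600dpi while keeping the same cm/360, which matches the 0.022 * 0.03125 * 25600 = 17.6 degrees-per-inch figure discussed earlier.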