Mouse Sensitivity Community


Chief Blur Buster

Reputation Activity

  1. Like
    Thank you, this tweak with -useforcedmparms brings back the old engine behaviour (GoldSrc) if you leave the m_mouseaccel1/2 and m_mousespeed values at default.
        if ( CommandLine()->FindParm( "-useforcedmparms" ) )
        {
        #ifdef WIN32
            m_fMouseParmsValid = SystemParametersInfo( SPI_GETMOUSE, 0, m_rgOrigMouseParms, 0 ) ? true : false;
        #else
            m_fMouseParmsValid = false;

    This makes sure the engine uses the SPI_GETMOUSE function from Windows, which by default is coded like this in the GoldSrc engine:

        originalmouseparms[3], newmouseparms[3] = {0, 0, 1}

    See: https://www.drdobbs.com/windows/some-things-ive-learned-about-win32-game/184410376#l4
     
    The first two 0s stand for mousethreshold1/2 and the 1 for mousespeed. With this, CS:GO turns Windows mouse acceleration on, but it gets bypassed by m_rawinput 1, so there is no acceleration in game, only in Windows and in the game menu (see CS 1.6, which behaves exactly like that).
    Vanilla CS:GO is therefore coded as m_fMouseParmsValid = false, and the false refers to 0.0 for the mouse thresholds. I assume m_mousespeed is also set to 0.0; at least the console accepts this as the lowest value.
        static ConVar m_mousespeed( "m_mousespeed", "1", FCVAR_ARCHIVE, "Windows mouse acceleration (0 to disable, 1 to enable [Windows 2000: enable initial threshold], 2 to enable secondary threshold [Windows 2000 only]).", true, 0, true, 2 );
        static ConVar m_mouseaccel1( "m_mouseaccel1", "0", FCVAR_ARCHIVE, "Windows mouse acceleration initial threshold (2x movement).", true, 0, false, 0.0f );
        static ConVar m_mouseaccel2( "m_mouseaccel2", "0", FCVAR_ARCHIVE, "Windows mouse acceleration secondary threshold (4x movement).", true, 0, false, 0.0f );

    So the new code should look like this:

        originalmouseparms[3], newmouseparms[3] = {0.0, 0.0, 0.0}

    The following shows how Windows handles those mouse params: when all three params are set to {0, 0, 0}, params[2] is 0, so Windows uses the linear mouse scale:
     
        void WIN_UpdateMouseSystemScale()
        {
            int mouse_speed;
            int params[3] = { 0, 0, 0 };

            if (SystemParametersInfo(SPI_GETMOUSESPEED, 0, &mouse_speed, 0) &&
                SystemParametersInfo(SPI_GETMOUSE, 0, params, 0)) {
                if (params[2]) {
                    WIN_SetEnhancedMouseScale(mouse_speed);
                } else {
                    WIN_SetLinearMouseScale(mouse_speed);
                }
            }
        }

    That's why I suggested setting -useforcedmparms and m_mousespeed 0. With this you change how the game handles the mouse input, and in theory it should change the mouse input.
    I agree that with m_rawinput 1 all of that shouldn't matter, right? It should bypass the Windows settings.
    Since you said you don't feel any difference, I need to read more about raw input, because for me the change in small mouse movements is noticeable; before, the mouse felt heavy and delayed.
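    For context, here is a minimal sketch of how a GoldSrc-style engine applies forced mouse params through the same Windows API discussed above (my own illustration, not the actual engine source; the function names and the zeroed values follow the suggestion above):

        #include <windows.h>

        // SPI_GETMOUSE / SPI_SETMOUSE operate on an int[3]: {threshold1, threshold2, acceleration}.
        static int originalmouseparms[3];            // saved so they can be restored on exit
        static int newmouseparms[3] = { 0, 0, 0 };   // thresholds 0/0, acceleration off (the suggested values)

        void ApplyForcedMouseParms(void)
        {
            // Read the user's current params first, as the -useforcedmparms path does...
            if (SystemParametersInfo(SPI_GETMOUSE, 0, originalmouseparms, 0))
            {
                // ...then push the forced triplet for the duration of the game session.
                SystemParametersInfo(SPI_SETMOUSE, 0, newmouseparms, 0);
            }
        }

        void RestoreMouseParms(void)
        {
            SystemParametersInfo(SPI_SETMOUSE, 0, originalmouseparms, 0);
        }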
     
     
  2. Like
    I remembered one bug in the Counter-Strike 1.6 engine that we old-school AWPers know about. This bug is present with any combination of settings you can think of, regardless of whether raw input is on or off. It nicely illustrates that this topic is meaningful from both perspectives (benefit/criticism), because as you can see in the video, a slow movement of the mouse to the right is registered by the game when scoped, but not to the left.
    This is also why I prefer manual verification over software: I want to be 100% sure how the system/game reflects the real world (the long chain from sensor to photons), not just the software world.
  3. Like
    Chief Blur Buster reacted to mlem in High Dpi issues on old Games / Engines   
    So is the takeaway that:
    1. Some ppl like to overcomplicate things and ignore simple solutions.
    2. Blurbuster's wall of text is outdated, he doesn't actually know what he is talking about, and he invents problems to create solutions to sell, even though they aren't actually 'problems'.
    3. Bugs are bugs.
  4. Like
    Pretty much!
  5. Like
    Chief Blur Buster got a reaction from MacSquirrel_Jedi in High Dpi issues on old Games / Engines   
    And now I rewrite my accidentally-lost reply, as I promised. 🙂
    Although I will shorten it (summarized/abbreviated), because the continued discussion has already covered some ground and the gist summarized it well.
    So in other words, I will avoid repeating things I already said in the "gist" above, so read the gist above first.
    Correct, that's a bigger weak link. 
    Now some angles to ponder:
    It's a long chain from sensor to photons, and "because of how games acquire input themselves" is exactly it, including how well or badly the engine does its mathematics. For example, if you're making 500 rounding errors per second at 500 frames per second on a 500 Hz display, that can build up to enough to create "false acceleration / false deceleration" behavior in some engines even though mouse acceleration is turned off. The more frames that spew at you, the more math rounding errors per second can occur, if the developer wasn't forward-looking enough to compensate for that possibility.
    The best way is to correctly accumulate the mouse deltas and base your new mouseturn angle on that accumulation, not on the rounded-off mouselook position from the previous frame cycle (i.e. modifying the previous rounded-off position). 500 rounding errors per second builds up very fast -- to more than 1% mouselook off-point error per second in some engines. This isn't audiophile below-the-noise-floor stuff.
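    A minimal sketch of the two approaches (not from any particular engine; the variable names and the degrees-per-count constant are purely illustrative):

        #include <cstdint>

        // Approach A (error accumulates): add a rounded per-frame turn onto last frame's yaw.
        // Approach B (error bounded): keep a running total of raw counts, derive yaw from the total.
        float   yaw_incremental = 0.0f;   // approach A state: stores the already-rounded angle
        int64_t total_counts    = 0;      // approach B state: exact sum of raw mouse deltas

        const double degrees_per_count = 0.022 * 2.0;  // sensitivity * yaw scale, illustrative values

        float OnFrame(int frame_delta_counts)
        {
            // A: every frame bakes a small float rounding error into the stored angle forever;
            //    at 500 fps that's 500 chances per second for the error to build up.
            yaw_incremental += (float)(frame_delta_counts * degrees_per_count);

            // B: the stored quantity is exact integer counts; rounding happens once, at readout,
            //    so the error never grows over time.
            total_counts += frame_delta_counts;
            float yaw_accumulated = (float)(total_counts * degrees_per_count);
            return yaw_accumulated;
        }

    The difference only shows up over many frames: approach A slowly drifts relative to approach B as rounding errors pile up, which is the kind of off-point drift described above.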
    Sometimes a shotgun approach is used to fix all the weak links at once, including the human-visible ones (seen by 90%) and the less likely ones (seen by 1%). For example, preserving sensor timestamps all the way to the engine, regardless of how the engine acquires input, increases the likelihood that the engine can avoid a lot of the pitfalls befalling us today.
    Yup. Exactly.
    Yup. Exactly.  
    But it's not just about dropped packets -- there are millions of causes that can jitter everything in the noise between sensor and photons.
    Correct, Windows is not an RTOS.
    Exactly -- that's why I am a big advocate of being able to expose higher DPI more accurately to games, with fewer side effects. Although DPI and poll rate are different things, higher DPI leads to event storms in an engine more quickly, which can jitter those timestamps around more often (e.g. moving 1 inch per second produces more mouse events, or more mouse deltas, per second at 1600 dpi than at 400 dpi).
    Even if the game engine batch-processes an event storm to avoid single-event processing overhead, that is not a cure-all. Other weak links remain too, including the lack of accurate sensor timestamps (e.g. less accurate timestamps created by software long after the horses have left the barn door of the sensor).
    The timestamps that software adds to mouse deltas can jitter many poll cycles away from the actual sensor timestamps, depending on how things played out (USB jitter, software performance, etc.). This is a problem even if you're reading 10 polls all at once from a mouse-deltas array, if they all had timestamps added by some software middleman with no correlation to what happened at the sensor level.
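    To make the "software middleman timestamp" point concrete, a tiny illustration (my own, not from any driver or engine): even if the receiving code timestamps each delta as early as it can, the stamp records when the message was processed, not when the sensor measured the motion, so USB batching, scheduling and software performance jitter are already baked in.

        #include <windows.h>

        // Timestamp taken at WM_INPUT processing time -- a "software middleman" timestamp.
        // It can sit many poll cycles away from the instant the sensor actually measured the motion.
        double HostTimestampMs()
        {
            LARGE_INTEGER freq, now;
            QueryPerformanceFrequency(&freq);
            QueryPerformanceCounter(&now);
            return 1000.0 * (double)now.QuadPart / (double)freq.QuadPart;
        }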
    My stance is, "all weak links are valid -- fix all of them at once if possible". Thus, timestamp at the sensor, and preserve those timestamps all the way to the software so it can process them accurately relative to its gametime timestamps. Then who cares about some of the academic discussion in between (some of it visible, some not)...
    That isn't currently the biggest benefit of an 8KHz poll rate (in an 8KHz-supported game). Many games don't handle 8KHz well enough to make it shine, but in the best cases you can really notice the smoothness difference in the best software on the most optimized systems. The problem is that those cases are few and far between, due to systemic limitations that push the benefit below the human-noticeable noise floor. Most of the time I leave my mouse at 2KHz for most software for that reason, as it delivers practically ~90% of the benefit of the 1KHz->8KHz journey. Under Windows 11, some games seemed to have difficulty even with 1KHz (a Microsoft optimization issue) until a Windows update fixed that -- it worked fine at 1-2KHz, but not at 4-8KHz. Currently, high poll rates push system efficiency / OS efficiency / driver efficiency to the point where 8KHz does not produce widespread, universal human benefits, which is part of the reason why the proposed HD Mouse API is a good idea.
    Now, poll rate should be oversampled slightly (because of jitter). That's seen when the poll Hz beats against another frequency (a framerate or a refresh rate), like the weird mousefeel you get on some mice and systems if you're testing 249Hz or 251Hz, which is 1Hz off a multiple of common poll rates. The jitter feel is also different depending on sync technology (VSYNC ON, VSYNC OFF, G-SYNC, etc.), but a modicum of oversampling can help avoid the jittering between two frequencies.
    This is not important most of the time, but it's one of the many error margins that need to be addressed en masse (while one error margin may be below the human noise floor, dozens of error margins add up to visibility). Better to punch through all the noise and just preserve sensor timestamps to the game engine, rather than waste time playing whack-a-mole along the chain.
    While it's not my purview to determine whether a person is mad or not, I like refinements that satisfy both the mad and the non-mad people.
    That's part of the shotgun journey of blasting multiple error margins out of existence at the same time (both the human-visible ones and the non-human-visible ones).
    This isn't "below the noise floor" stuff.
    It's all about geometric differences, e.g. 240Hz vs 1000Hz is visible to well over 90% of a random-from-the-public mainstream population if you design your blind test around forced eye-tracking of a forced-moving object (e.g. moving-text readability tests where successfully reading the text is the objective). It's very hard, though, to get perfect 240fps and perfect 1000fps without all the noise margins (including jitter) that can shrink the differential.
    Instead of a DVD-vs-720p situation, you've got a VHS-vs-8K situation, except in the temporal dimension.
    A 4x difference in display-persistence motion blur (unstrobed, framerate=Hz) is much easier to see than LCD GtG-throttled refresh rate incrementalism.

    Scientifically, assuming perfect framepacing, perfect sensor/step pacing, perfect GtG=0ms and zero GPU motion blur, all motion blur on a sample-and-hold display is purely frametime:
    - The lowest possible motion blur is motionblur = frametime on ANY strobeless/impulseless/flickerless/BFI-less display.
    - Perfect 240fps 240Hz has exactly the same motion blur as a 1/240sec film camera photograph
    - Perfect 1000fps 1000Hz has exactly the same motion blur as a 1/1000sec film camera photograph
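    Spelled out as napkin math (sample-and-hold persistence at framerate = Hz, the same numbers as the shutter analogy below):

        \text{MPRT} = \frac{1000\,\text{ms}}{\text{framerate}} \quad\Rightarrow\quad \frac{1000}{240} \approx 4.17\,\text{ms}, \qquad \frac{1000}{1000} = 1\,\text{ms}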
    In practice it will typically be worse than that (240Hz vs 360Hz feels like a 1.1x difference, not 1.5x) due to the supertanker-fuls of weak links from sensor to photons -- especially software, but also hardware.
    Common ones accrue. GtG weak link? Ditto. USB jitter weak link? Ditto. Mouse sensor flaws? Ditto. Poll rate processing issue? Ditto. GPU blur weak link? Ditto. Math rounding errors that build up faster than before due to more frames per second? Ditto. That's just a few of the massive number of weak links, some significant, some insignificant.
    It is already well known that major differences in camera shutter speeds are easier to tell apart. If you're a sports photographer, it's hard to tell apart a 1/240sec and a 1/360sec shutter photograph, but way easier to tell apart 1/240sec versus 1/1000sec for fast motion.

    So, perceptually, 1000fps 1000Hz will yield perfect motion (zero stroboscopics, zero blur) for motion speeds up to 1000 pixels/sec. If you go to faster motion with tinier pixels, e.g. 4000 or 8000 pixels/sec, it can require more Hz to "retina" all of that out. But on a 1080p display, those motion speeds go offscreen in a fraction of a second; not so if you're wearing a 16K 180-degree-FOV VR headset.
    Loosely speaking, the retina refresh rate generally occurs at some Hz above the human eye's ability to track a specific motion speed in pixels/sec (as long as the angular resolution is resolvable enough that panning images have a different clarity than static images). This is to avoid the stroboscopic stepping effect (e.g. during fixed gaze) and to avoid motion blur (e.g. during tracking gaze), whereupon you can't tell the display apart from real life in a temporal sense.
    ...Side commentary and side napkin exercise: VR ideally has to look exactly like real life. VR is forced to flicker only because we can't achieve these things stroblessly. The Oculus Quest 2 is 0.3ms MPRT, so it would require 3333fps 3333Hz to create the same motion quality without its current flicker-based (BFI / strobing) motion blur reduction. That underlines how far we are from eliminating motion blur strobelessly for all use cases, for all possible uses of a display in humankind. Until then, an A/B test between VR and real life can easily fail due to a single weak link (display motion blur forced on you above and beyond real life, or stroboscopic effect above and beyond real life, or jitter forced on you above and beyond real life, etc.). And we're backporting some VR innovations back to FPS displays, given how things are often much better framepaced / jitter-compensated in the VR world than in the non-VR world currently... It's shocking, really, if you knew the weak links that we just handwave off in the non-VR world in the era of ever-bigger, ever-higher-resolution displays. Given a good system, VR has achieved a near-miraculous efficiency in sensortime:photontime that is two to three orders of magnitude more accurate than an average esports system, thanks to superlative display hardware optimization and software optimization, assuming you don't underspec for the VR content (like trying to run Half-Life: Alyx on a GTX 780). When you're turning your head at 8000 pixels/sec, anything that looks off from real life shows up far more, like a 4-pixel microstutter, which is a 0.5ms framepace error during an 8000 pixels/sec headturn. As a rule of thumb, stutter error bigger than the MPRT motion blur is generally easily human-visible in a motion test. Even 240Hz unstrobed is still (4.167ms MPRT + "X" ms GtG) of display motion blur, which easily hides these tiny jitters...
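    Writing out the two napkin numbers from that aside:

        \frac{1000\,\text{ms}}{0.3\,\text{ms MPRT}} \approx 3333\,\text{Hz (strobeless equivalent)}, \qquad 8000\,\tfrac{\text{px}}{\text{s}} \times 0.5\,\text{ms} = 4\,\text{px of microstutter}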
    Part of the inspiration for the HD Mouse APIs is that headtracker data is often microsecond-timestamped and relayed to the system in a preserved way, so that tracktime:gametime:photontime is far more accurate in VR hardware and software engineering than in current non-VR.
    And yes, GPU frame rate is a limiting factor to tomorrow's 1000fps+ 1000Hz+ world.  
    Yes, GPU horsepower does need to keep up, but there are long-term lagless paths there (reprojection). Speaking of that, it's a VR innovation that will eventually be backported to non-VR contexts in the coming years or decade, as some game developers have noticed.
    That being said, this isn't only-audiophile-can-hear-it stuff.
    It's more majority-of-mainstream-can-see-it-if-blind-tested stuff. Even for things like browser scrolling, too (that's why Apple and Samsung came out with 120Hz). But more of the mainstream notices 4x+ differences, like 60Hz-vs-240Hz or 240Hz-vs-1000Hz, rather than merely 60Hz-vs-120Hz -- there are some games and some blind tests where the geometric difference matters more than the absolute number of Hz.
    Even mere browser scrolling looks noticeably clearer to most of the population at 1000fps 1000Hz versus 240fps 240Hz; it's as if you've turned on strobing (except you didn't). Apple would do it already if it used zero battery power.
    At the end of the day, killing enough of the weak links to approach maximum refresh rate perfection amplifies the differences between refresh rates. Sometimes one line item is unimportant, but we need to obliterate a hell of a lot of line items, and the mouse is undoubtedly one of the many weak links.
    Just as 4K was $10,000 in 2001 (IBM T221) and is now a $299 Walmart special, tomorrow's refresh rates will easily be pennies extra on top of 60Hz, but we have kept buying small upgrades for a long time (e.g. 240Hz to 360Hz is a near-worthless 1.5x, throttled further to 1.1x by weak links). And Linus said it's cheap to laglessly frame-generate extra frames, but game engines and GPU vendors haven't picked it up -- though they will be forced to by 2030 to support future kilohertz-class displays.
    You raised some significant issues (which I agree with). I raised significant issues too. Whether you and I agree on individual weak-link line items is academic next to the point that weak links are sabotaging the refresh rate race, where we don't get our money's worth out of the Hz that we ideally should.
    Sometimes weak links are fixed line item by line item. The 240Hz OLED prototypes sitting here will help solve a major GtG weak link (thanks to that zeroed-out weak link, a 240Hz OLED is roughly as clear in motion as the best 360Hz panel, the XL2566K E-TN, when at framerate=Hz, although the XL2566K lag is still slightly more -- but that's a separate flaw to refine, distinct from display motion blur). Other weak links should be fixed en masse (e.g. sensor-level timestamps). Either way, many weak links remain.
    Also, given that the lab has already confirmed mainstream random-from-population visibility way above the noise floor of all that silly audiophile testing... this definitely isn't audiophile-only stuff.
    We've just been spoon-fed refresh rate incrementalism for way too long, with too little benefit for humankind, without fixing all the systemic weak links en masse all at once, including everything you just said (which indirectly agrees with several of my points, even if we may disagree line item by line item).
  6. Like
    Chief Blur Buster got a reaction from MacSquirrel_Jedi in High Dpi issues on old Games / Engines   
    The short story is that a lot of theories abound, many possibly incorrect.
    I bet we'd agree that WAY more research is needed. 
    It is quite possible that the creator of the engine (e.g. Valve) as well as the mouse manufacturers (e.g. Logitech and whoever) created a situation where some things don't work well. It's a long pipeline from mouse sensor to mouse firmware to USB poll to USB drivers to game engine, and then the math workflow inside the game engine.
    So the actual truth may be completely different from what everybody wrote (myself and Razer included; I can sometimes parrot incorrect information from, say, a mouse engineer who came to the wrong conclusion). What really stands is that a lot of systems have aimfeel problems when deviating away from 400 or 800. More research and study are needed.
    Many of us, people here included, have noticed that 400 and 800 are no longer the final frontier for today's increased resolutions at increased refresh rates. Several have personally noticed that aim feels wonkier at the same calculated DPI settings in CS:GO than in certain newer engines; the reasons I described may very well be wrong, but that doesn't negate the odd behaviors that some users sometimes see from an old game engine on some displays.
    More conservative mouse settings were often fine at 1024x768 60Hz with no visible issues, but can produce visible issues at 2560x1440 360Hz, as an example. Higher resolutions, higher refresh rates and higher frame rates, as well as new sync technologies (other than VSYNC OFF), can amplify issues that weren't seen before.
    Old games weren't originally tested with those variables. This stuff doesn't matter as much to end users today, but it does for the future of the refresh rate race.
  7. Like
    Hand-waving in full effect I see....

    The reason 8khz doesn't work a lot of the time has nothing to do with "mouse mathematics" or float vs double precision of sensitivity variables. It is because of how games acquire input themselves.

    A game like Valorant only works with 8khz when it calls GetRawInputBuffer() to hold the inputs until an arbitrary buffer size is filled and the engine is ready for the input. If an app just gets WM_INPUT messages "as they come", as per a standard read of Raw Input, then unless its pipeline is so trivial that it is only doing something like updating a counter, it will most likely fall over with inputs spamming in haphazardly every ~125us. The symptoms are dropped packets, negative accel and/or just maxed CPU usage and stuttering. None of this has anything to do with sensitivity calculations being superior/inferior. Windows is also not an RTOS and is objectively bad once timing gets tight: it becomes extremely expensive to try to do anything accurately at these kinds of timings. This is not going to change, as it's fundamental to the Windows environment.
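    For illustration, a minimal sketch of that buffered-read pattern (not Valorant's actual code; the 16x buffer sizing and the once-per-frame call site are assumptions, and it presumes the window has already registered for raw mouse input via RegisterRawInputDevices):

        #include <windows.h>
        #include <vector>

        // Drain all pending raw mouse input in batched calls, instead of handling
        // each ~125us WM_INPUT message individually.
        void DrainRawMouseInput(long &dx, long &dy)
        {
            UINT cbSize = 0;
            // First call with NULL returns the minimum buffer size for the next block.
            if (GetRawInputBuffer(nullptr, &cbSize, sizeof(RAWINPUTHEADER)) != 0 || cbSize == 0)
                return;

            cbSize *= 16;  // headroom for a burst of events (sizing is an assumption)
            std::vector<BYTE> buffer(cbSize);

            for (;;) {
                UINT cb = cbSize;
                UINT count = GetRawInputBuffer(reinterpret_cast<PRAWINPUT>(buffer.data()),
                                               &cb, sizeof(RAWINPUTHEADER));
                if (count == 0 || count == static_cast<UINT>(-1))
                    break;  // no more data, or an error

                PRAWINPUT raw = reinterpret_cast<PRAWINPUT>(buffer.data());
                for (UINT i = 0; i < count; ++i) {
                    if (raw->header.dwType == RIM_TYPEMOUSE) {
                        dx += raw->data.mouse.lLastX;  // sum the relative deltas
                        dy += raw->data.mouse.lLastY;
                    }
                    raw = NEXTRAWINPUTBLOCK(raw);
                }
            }
        }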

    The only reason low DPI can work with 8khz in some games where high DPI doesn't is that the DPI is nowhere near saturating the polling rate, so you're getting a much lower input cadence. Set your mouse to 8khz and 400 dpi, move it at 1 inch per second, and your update rate is 400hz (obviously), so it is no longer "8khz" as far as the game is concerned. This has nothing to do with the DPI setting itself, which the game has no knowledge of or interaction with, as DPI Wizard already said.
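    In other words, as a back-of-envelope formula (my shorthand, not from any spec):

        \text{effective update rate} \approx \min(\text{poll rate},\ \text{DPI} \times \text{speed in inches/s}) = \min(8000,\ 400 \times 1) = 400\,\text{Hz}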

    Most simulation loops for in-game physics and enemy positions - things like whether a crosshair is over an enemy, etc. - will run at 60hz; maybe a really competitive FPS would run higher, and maybe they poll mouse input faster, at 2 or 3 times the frame rate of the game, with the textures/graphics rendering running at the actual frame rate, obviously. Usually though, you would register event callbacks from input devices, which are then passed to a handler / accumulator that is called once at the start of each frame. In other words, it does not matter if you are sending 8000 updates a second to the OS, because a game will just buffer them all and sum the total mouse distance to be consumed at the start of each render frame anyway - it makes no practical difference to your crosshair position whether you do this work in the firmware of the mouse at 1khz, or whether the game does it instead at 8khz. The only important factor is that the polling rate is greater than or equal to the highest frame rate of the game, at a minimum. If you think using 8khz is giving you 8000 discrete rotational updates of your crosshair a second, and that for each of those positions an enemy location is being calculated to check whether said input would be over a target or not (i.e. something meaningful), then you are mad.
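    A short sketch of that accumulate-then-consume pattern (names are illustrative, not from any particular engine):

        #include <atomic>

        // Input callbacks add deltas at whatever cadence the OS delivers them
        // (1khz or 8khz -- the simulation never sees individual events).
        std::atomic<long> g_accum_dx{0}, g_accum_dy{0};

        void OnMouseDelta(long dx, long dy)              // per WM_INPUT / per buffered event
        {
            g_accum_dx += dx;
            g_accum_dy += dy;
        }

        void BeginFrame(long &frame_dx, long &frame_dy)  // once at the start of each render frame
        {
            frame_dx = g_accum_dx.exchange(0);           // consume the summed distance for this frame
            frame_dy = g_accum_dy.exchange(0);
        }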

    Once we get into the future where performance improves, it is also not inevitable that this will change - rather the opposite. We have, for example, the precedent of "Super Audio CD" and "DVD-Audio" in the audio realm, which were both large increases in resolution vs CD quality on a factual basis, yet both failed as standards precisely because that level of resolution is not required for the user experience - instead, users actually gravitated towards lower resolutions (compressed audio formats) and smaller file sizes for easier distribution. Point being: if such technological improvements were available to game engine developers to do more complex computations within a smaller timeframe, they would not be using that resource to update the crosshair position more frequently. There are many things, such as higher animation fidelity, better online synchronisation systems, more complex physics, rendering improvements, etc., which would all be much higher priority and more obvious quality gains.

    Either yourselves or Razer's 8k marketing team moving a cursor on a desktop and posting pictures of "micro stutter" is not going to make any user see a problem. This is unlike the actual micro-stutter that occurs in 3D rendering due to frame-cadence issues, which is readily apparent even to the uninitiated user. There is no one using a 1000hz mouse on a 144hz display and going "geez, I really hate these micro-stutters on my desktop cursor, I can't wait for that to be improved!". In short, you are inventing a problem and then trying to sell a solution / the idea of a solution, and your argument effectively boils down to: anyone who disagrees just doesn't have the eyes to see the problem, when the truth is no problem exists in the first place, and your solution would not even solve it if it did.

    Mouse report rate does not need to be synchronised to monitor refresh rate or game frame rate whatsoever, and insisting it would improve anything fundamentally misunderstands how the quantities of each are handled and how they interact with one another. Games will always render frames at arbitrary intervals, because each frame has infinitely variable parameters all of which require an arbitrary amount of resources, and mouse input polling on Windows will always be haphazard time-wise and always has been, due to the fundamental design of the operating system. Moreover, once the timings get down to a millisecond or so, there is no value to anyone in any case. No one is going to care about turning G-Sync on if the monitor can run 1000hz (this is effectively the "Super Audio CD effect"), and any "improved" mouse API that presumably could send decimal distance values to the OS, instead of the standard (x,y) packet of integers with remainders carried to the next poll, would also achieve nothing of value to anyone over existing systems that are stable, extremely well understood and established.
