
High DPI issues on old games / engines



On 1/23/2023 at 8:48 AM, TheNoobPolice said:

[big post]

I was typing a reply to you and, halfway in, accidentally closed the browser tab with a misclick (ah, oops). I'll retry my reply soon.

But in short, great points, with some extra info added.

(Brief gist: Your points were 80% good, 20% slightly behind the times -- because increased Hz and resolutions make things visible that weren't before. The same mouse flaw shows more clearly at 4K 240Hz than at 800x600 60Hz. I also have a 1000Hz test display sitting here, which ...ahem... has some interesting effects on jitter visibility. (BTW, Viewpixx sells 1440Hz laboratory test projectors, though those require some custom driver programming.) The nutshell is that higher resolutions, higher refresh rates and higher frame rates all concurrently amplify smaller flaws. I was also talking about how displays behave differently in four situations -- 1. "stationary eyes, moving object", 2. "stationary eyes, stationary object", 3. "moving eyes, moving object", 4. "moving eyes, stationary object" -- motion demonstration at www.testufo.com/eyetracking -- and not all games behave the same way here. Some esports are slowly moving to 1440p already, after camping at the 1080p status quo for a while, and this can continue to go up as long as tech allows. Also, a fixed gaze on the crosshairs while mouseturning can miss a lot of mouse jitter that is very visible with a certain DOTA 2 technique where you eyetrack things during a pan, or in arena-style FPS where eye-tracking away from the crosshairs is more common. Jitter and display motion blur (including the extra motion blur added by fast-vibrating jitter) can be over 10x more visible in one of the four situations than in another. Thus, claiming this is a non-issue in the refresh rate race is simply a strawman; we just have to respect that everybody sees differently. Just look at the different eye-movement patterns, the different eyeglasses prescriptions, the portion of the population that is colorblind, the less motion-sensitive, etc. It's easy to silo ourselves into just a CS:GO community and not realize that the Rainbow Six esports champion used strobing, or other surprising things some CS:GO veterans were unaware of. Etc.)

(Will rewrite my fuller attempted reply later)

Edited by Chief Blur Buster
13 hours ago, justtypingokok said:

Ok, so please answer my simple, exemplary question. I use 400 dpi, 2.6 sens, CS:GO of course, raw input 1 and all that needed stuff. I wanted to switch to 1600 dpi (and 0.65 sens obv) multiple times, but I just felt that my aim was off and I wasn't confident in my movements. It wasn't caused by DPI deviations (because the real-world 360 distances matched) and wasn't caused by a shitty mouse (modded FK2 with G305 internals). My question is: is it really me having a hard time getting used to the different way my screen moves with 1600 etc., or... is it what you are trying to describe? Because 0.65 x 0.022 doesn't look like a lot of decimals, so I should be fine. Also, why do you suggest using the mouse-sensitivity website for that, when we can do the simple calculation ourselves? (400 -> 1600 dpi is 4x, so 2.6/4)

I also tried 1600 many times. Yesterday I went back to 800 after a month because I'm still not able to use 1600 (1600 dpi 0.52 sens to 800 dpi 1.04 sens, CS:GO). And yes, I feel I can't control 1600 even though it should be the same as 800 with the calculated sens. I don't know why. Funny, in 2006 I only used 1600 dpi with my Diamondback Plasma and never had issues, guess I'm old 😅

Edited by Quackerjack

Here is a theory about the mouse sensitivity in CS:GO. The Source engine is based on the Half-Life engine, which has fixed mouse parameters like MouseSpeed 1. I think CS:GO still enables these mouse parameters, although m_rawinput 1 should bypass them. A Half-Life modder has already confirmed this:

SamVanheer commented on Jan 16, 2019
The engine handles the raw input cvar, but the game is responsible for those command line parameters. Even if raw input is turned on it will still apply those settings, which affect how the OS processes mouse input before the engine handles it.

MarkC:

They make -noforcemspd and -noforcemaccel the DEFAULT options; By default Source games do not set MouseSpeed to anything. For compatibility reasons, they add a new option -useforcedmparms which reverts to the older HalfLife engine behaviour.

Easy test: in regedit, set SmoothMouseXCurve and SmoothMouseYCurve to all zeros and use the launch option -useforcedmparms; now you can't move your mouse in CS:GO, even with raw input. But wait, why not set m_mousespeed 0 so CS:GO doesn't use the Windows accel curve? ...That doesn't work either: even with m_mousespeed 0, which should tell Windows not to use the smooth-mouse-curve settings, you can't move your mouse.

The same happens in CS 1.6. Without the launch options, you can't move your mouse when SmoothMouseXCurve/SmoothMouseYCurve are set to all zeros. So m_rawinput doesn't bypass Windows.

In short: set all mouse curves to 0 in regedit, restart your PC, and enjoy CS:GO with correct mouse input for the first time. You will feel right from the welcome screen that the mouse is super responsive now.
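For anyone who wants to reproduce the test, a minimal .reg sketch of what "all mouse curves to 0" means (standard value names under this key; the 40-byte all-zero blobs match the stock value lengths -- back up your existing values first):

Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\Control Panel\Mouse]
"SmoothMouseXCurve"=hex:00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,\
  00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00
"SmoothMouseYCurve"=hex:00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,\
  00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00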

 

 

@DPI Wizard

@TheNoobPolice

@Chief Blur Buster

@MacSquirrel_Jedi

 

 

Sources:

https://www.hltv.org/forums/threads/391206/m-mousespeed-1-or-0

https://forums.blurbusters.com/viewtopic.php?t=9769

https://github.com/ValveSoftware/halflife/issues/2087

 

 

Edited by Quackerjack
On 1/25/2023 at 1:54 AM, Quackerjack said:

[quoted above]

How many mice have you tried at 1600?  

1600 may very well have been fine in the same game engine in the 1024x768 60Hz era, but remember that the resolution race, concurrent with the refresh rate race, amplifies the differences between 800 and 1600 by making flaws more human-visible on today's esports-standard 1080p 240Hz. Things are even more visible in the 1440p 360Hz LCD or 1440p 240Hz OLED era. I currently have 240Hz OLED prototypes here, too.

This leads to a very apparent (if mysterious) observed real-world user migration to a sweet-spot DPI for older game engines, to our aghast disappointment. It's easier to feel differences now, with mouse report rates fluctuating from 0 to 1000 and back down to 0 as you move the mouse at varying speeds across the dots of your current DPI setting. The greater number of refresh-cycle opportunities and frames-per-second opportunities, COMBINED with the higher resolution (= more onscreen pixels per inch of mouseturn movement = easier to see varying step distances, depending on fixed/moving objects or fixed/moving eye gaze), all add up.

And remember, even if you do 100fps at 240Hz, those 100fps visually round off more accurately to the correct refresh cycle -- in relative mousetime:photontime -- than they did at 100fps 60Hz, given that 60Hz forced a 16.7ms granularity AND 16.7ms of motion blur, which throws a fog screen over mouse problems, mouse setting problems and other relevant refresh-rate-race variables.

Today, thanks to higher resolutions and higher refresh rates, there is less of that "fog" on displays and current GPUs than there was in 2006. For you, 400dpi may well have been less than a 1-pixel step per mouseturn movement at yesteryear's sensitivity settings and yesteryear's resolutions, but when we're facing tomorrow's 4K esports in its monstrous eyes, those step distances become bigger shark bites. The same degree of mouseturn is far more pixels at the same framerate, too.

Yes, with today's powerful GPUs you can spew more framerate to compensate for those coarse mouseturn or pan step distances, but any problems there can still manifest in multiple ways, such as a more rapid buildup of math rounding errors (in specific games), especially in engines that were never originally tested at the framerates of a future decade. On top of that, high-frequency jitter creates extra display motion blur that throttles the differences between refresh rates (e.g. 240Hz vs 360Hz). There are multiple concurrent problems occurring at the same time, and it's hard to whac-a-mole all of them out.

(That's also part of the reason for the microsecond timestamps mandated at the sensor level in the proposed HD Mouse API: to punch through all the noise inside the chain, whether cable packet jitter or performance fluctuations in the system, etc.) We just know that laboratory prototype displays need it, even if current esports displays don't. Today's lab refresh rates are tomorrow's esports refresh rates, after all.

It really amplifies the problems of the refresh rate race and the resolution race, so the claim of "a solution looking for a problem" is just "humans can't see 30fps-vs-60fps" garbage spiel at this stage -- a dead horse in the researcher circles on my end, while other communities continue to fiercely debate as we watch from a distance. But we are indeed willing to learn from your other amazing skills, because researchers like me aren't esports athletes ourselves; ours is a nuts-and-bolts role in the display engineering industry. But I'll unceremoniously do line-item callouts, walking away after dropping my microphone.

Many mice are quite terrible at high DPI, but the newest sensors have become really good at 1600, and so the blame from some of us has begun shifting further up the chain (aka the game engine). It takes only one weak math link, and I suspect the people I've talked to are only speculating about where it is, but more esports-industry researchers have now begun blaming the engine for weaknesses that only show up in the current refresh rate race. I wish we could all sing from the same songsheet, mind you.

Mind you, I haven't personally looked at the leaked CS:GO source code to Sherlock-Holmes the bleep out of it, but I bet someone else in the world is itching to.

Edited by Chief Blur Buster
3 hours ago, Chief Blur Buster said:

[quoted the post above in full]

I'm not lying, I tried more than 20 from 2014 to 2021, then I stopped (including more than 10 different mousepads: soft, hard and hybrid).

Atm I'm on the Pulsar Xlite V2. I only played CS:GO and some Apex Legends during this time. For two months now I've been playing KovaaK's and noticed the mouse feeling is so sharp and direct. So I think there is something wrong in CS:GO.
 

But let me tell you, after disabling all mouse curves in regedit like I mentioned in my post before, the wonky feeling went away for me in CS:GO. I'm 100 percent sure the SmoothMouseCurves affect the mouse in CS:GO. This would also explain why people notice differences in mouse behavior between different Windows versions.

9 hours ago, Quackerjack said:

I'm 100 percent sure the SmoothMouseCurves affect the mouse in CS:GO. This would also explain why people notice differences in mouse behavior between different Windows versions.

There are basically three ways games can get mouse input events: Raw Input, DirectInput (deprecated) and Windows mouse messages.

If you call GetRawInputData() or GetRawInputBuffer() (two API variants of the same fundamental thing), it is physically impossible for those registry settings you mention to have any effect, as the (x,y) packet is obtained below the pointer-ballistics layer, before those settings are ever applied. In other words, those settings could only affect a game where Raw Input was "enabled" if the "Raw Input" object in the GUI / convar was just flat broken and wired to nothing, so that it was really calling Windows mouse. This is not the case in CSGO, and it is easily demonstrated that it functions correctly. It would also be completely incompetent, and I don't think I've ever seen that.
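For readers, a minimal sketch of what "calling Raw Input" means here (standard Win32 boilerplate, nothing CS:GO-specific; variable names are illustrative):

#include <windows.h>

// Register once at startup: ask Windows to deliver WM_INPUT for mice.
void RegisterMouseRawInput(HWND hwnd) {
    RAWINPUTDEVICE rid = {};
    rid.usUsagePage = 0x01;   // generic desktop controls
    rid.usUsage     = 0x02;   // mouse
    rid.hwndTarget  = hwnd;
    RegisterRawInputDevices(&rid, 1, sizeof(rid));
}

// In the window procedure: the deltas below are raw counts straight from the
// HID stack -- EPP / SmoothMouseCurve pointer ballistics are never applied.
LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam) {
    if (msg == WM_INPUT) {
        RAWINPUT raw;
        UINT size = sizeof(raw);
        if (GetRawInputData((HRAWINPUT)lParam, RID_INPUT, &raw, &size,
                            sizeof(RAWINPUTHEADER)) != (UINT)-1 &&
            raw.header.dwType == RIM_TYPEMOUSE) {
            LONG dx = raw.data.mouse.lLastX;  // unaccelerated counts
            LONG dy = raw.data.mouse.lLastY;  // (feed these to the game's accumulator)
        }
        return 0;
    }
    return DefWindowProc(hwnd, msg, wParam, lParam);
}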

The reason a new DPI can feel different even when normalised to the same degree/count is only because of:

1) The mouse sensor is not perfect and has a different % DPI deviation at different settings. Sometimes this can be quite significant and is very common.

2) Assuming you compensate by lowering the sensitivity after increasing DPI, you get a lower minimum angle increment, which can make very slow movements feel smoother / different (see the worked numbers after this list).

3) If the game actually does have errors in its sensitivity formula at certain values. There is no voodoo with this and it is easily tested. CSGO does not have any errors, as per DPI Wizard's earlier post.

4) Placebo.
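To make point 2 concrete with the numbers from earlier in this thread (CS:GO turns m_yaw = 0.022 degrees per count; 400x2.6 and 1600x0.65 give the same cm/360):

#include <cstdio>

int main() {
    const double m_yaw = 0.022;                     // degrees per count in CS:GO
    std::printf("%.4f deg/count\n", m_yaw * 2.60);  // 400 dpi @ sens 2.6  -> 0.0572
    std::printf("%.4f deg/count\n", m_yaw * 0.65);  // 1600 dpi @ sens 0.65 -> 0.0143
    // Same overall turn rate (400*2.6 == 1600*0.65 == 1040), but each count
    // moves the camera a quarter of the angle: a 4x finer minimum increment.
    return 0;
}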
 

On 25/01/2023 at 05:34, Chief Blur Buster said:

stuff


None of that has anything to do with mouse sensitivity: monitor refresh rates, eye tracking of moving vs stationary objects, screen resolution or anything else. Nothing I have said is out of date or behind the times -- there are no developers making their FPS mouse sensitivity calculations incorporate factors relating to eye tracking of moving vs stationary objects in the current year lol. That is all just la la land. The sensitivity functions in a game will point your crosshair / camera to the exact same point in the game world even if you turn the monitor off or don't even have one connected, so there is obviously no link. You are almost talking as if you think the turn increment in the game has to be a division of a pixel size / distance or something? Suffice to say it doesn't. There is also no "jitter" in the game math for calculating sensitivity; it's just numbers, and math is math. Any timing jitter occurs on the mouse firmware side and in its interface to the operating system, due to the realities of Windows timing I mentioned before -- and that is not a function of screen resolution or monitor refresh rate anyway.

One last point before I am done with this silly thread about things that are already completely understood lol. Any digital function has a finite resolution, and if you so wish, you can zoom in close enough and find errors. If you blow those errors up into a big enough image on your screen it can look like something significant, and we can create a story about it for x, y, z reasons, but it's effectively a non-existent problem. I can view an audio wave at 96kHz that was originally sampled at 192kHz, and when zooming I can see artefacts of the resolution that are not there in the 192kHz source. Does this mean 96kHz is awful and all our audio needs to be minimum 192kHz? Of course not; even 48kHz is totally fine. Apart from those "audiophile" tosspots who insist on paying a grand per metre for cable that is soldered using the tears of a swan and insulated with albatross feathers, it's all just bollocks and completely unnecessary. I get the feeling that if we already had 100,000Hz mice, 800 PPI screens, 10,000Hz refresh rates and 10,000fps gaming, you'd still just be zooming in and taking screenshots on your website saying "these microseconds are not enough, my new mouse API removes cursor micro-stutter to within the nanosecond! Look at the jitter your cursor has!"

The masses are asses though, and no doubt there's always some plonker to sell "magic beans" to, so I wish you good luck convincing them with that :)

28 minutes ago, TheNoobPolice said:

[quoted the post above in full]

I disagree with your first sentence. If you set the SmoothMouseCurves to 0 (restart the PC to take effect) and open CS 1.6, you are not able to move your mouse even though m_rawinput 1 is set. The same happens when you use -useforcedmparms in CS:GO.

1) agree

2) agree

3) You didn't test it, I guess.

Edited by Quackerjack

CS 1.6 != CSGO. Why are you talking about a different game now? 

If you enter some convar that changes the way the mouse is handled, then you are possibly overwriting how it gets mouse input. I'm not familiar with that game, and the only code I have perused is this https://github.com/ValveSoftware/source-sdk-2013 that CSGO is built on. I guess 1.6 is not, since it was released ten years earlier.

Like I said, if the game is actually calling Raw Input (which CSGO does) then those settings don't have any effect.

On 1/26/2023 at 12:49 PM, Quackerjack said:

In short: set all mouse curves to 0 in regedit, restart your PC, and enjoy CS:GO with correct mouse input for the first time. You will feel right from the welcome screen that the mouse is super responsive now.

I haven't analysed or tested anything here, but just at first glance -- yeah, you might feel some differences starting from the welcome screen, because you do realize that the MENUS in CS:GO are not affected by raw input? The same applies to the in-game buy menu. So if you change the Windows multiplier, for example, you will feel the difference in menus, but not in in-game aiming. That might change some of your theories; just adding it here for now.

Edited by justtypingokok

The thing is, CS:GO (Source engine, from 2004) is based on the GoldSrc engine from 1998, which CS 1.6 runs on. Extra info: the GoldSrc engine is based on the Quake engine, which always had the bug with the hardcoded MouseSpeed 1, and that's why the CPL fix, the MarkC fix, etc. came about. (This hardcoded number was never fixed; to this day, Source hardcodes MouseSpeed 1.)

And CS:GO uses the mouse parameters -noforcemspd and -noforcemaccel by default. These tell the game not to set the Windows MouseSpeed (Enhance Pointer Precision) and not to set MouseThreshold1 and 2. You can see this in CS:GO's in-game commands m_mousespeed, m_mouseaccel1 and m_mouseaccel2.

Valve says m_mousespeed 0 is accel off, m_mousespeed 1 is accel on, and m_mousespeed 2 is accel on with the use of the thresholds. Questionable why m_mousespeed 1 is the default here.

If you set all SmoothMouseCurves to 0 and use the launch option -useforcedmparms, you aren't able to move the mouse. Even if you set m_mousespeed to 0 in CS:GO, which should mean noforcemspd (no forced MouseSpeed), you can't move the mouse -- which means m_mousespeed can't actually be set to 0. To make your mouse work again, you additionally have to set -noforcemspd in the launch options.

So why is there a command m_mousespeed in the game which defaults to 1 and can't be disabled? And we also know the engine still responds to mouse parms even when raw input is set. That means m_mousespeed 1 invokes the Windows SmoothMouseCurves in a strange way, and can only be neutralized when all smooth curves are set to 0.

Edited by Quackerjack
1 hour ago, justtypingokok said:

[quoted above]

You are absolutely right on that point. I just mentioned it so people see that things begin to change there and that everything is set up correctly. Still questionable is that you can't move your mouse in the menu when the SmoothMouseCurves are set to 0, -useforcedmparms is in use and m_mousespeed is set to 0, which should not activate the engine's hardcoded MouseSpeed 1. It is at least clear why you cannot move the mouse with this setup: the hardcoded MouseSpeed 1 is forced, and with the SmoothMouseCurves at zero you get 1 * 0 = 0 mouse speed.


1 hour ago, Quackerjack said:

If you set all SmoothMouseCurves to 0 and use the launch option -useforcedmparms, you aren't able to move the mouse.

Why are you doing any of this?

That launch arg just calls the WinAPI SystemParametersInfo() Get/SetMouse functions, which are how applications control the Windows mouse / EPP values if they want to; it stores the existing values, which it first checks on init. So in other words, you are manually editing registry values to something you think is going to do something for you, and then asking the game to read that and store it in a variable to use on init. I am not surprised if it breaks something with the cursor.
https://github.com/ValveSoftware/source-sdk-2013/blob/master/mp/src/game/client/in_mouse.cpp#L262
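Stripped of the engine wrapping, that path boils down to roughly this (a sketch, not the actual Source code; the three-int array layout is the documented SPI_GETMOUSE one):

#include <windows.h>

int main() {
    int orig[3];                                       // {MouseThreshold1, MouseThreshold2, MouseSpeed}
    SystemParametersInfo(SPI_GETMOUSE, 0, orig, 0);    // store the user's existing values on init
    int forced[3] = { 0, 0, 1 };                       // Quake-era "fix": MouseSpeed=1, thresholds 0
    SystemParametersInfo(SPI_SETMOUSE, 0, forced, 0);  // on XP and later this turns EPP *on*
    // ... game runs ...
    SystemParametersInfo(SPI_SETMOUSE, 0, orig, 0);    // restore on exit
    return 0;
}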

The default Windows setting is to have EPP enabled, which is maybe why the convar m_mousespeed was chosen to default to 1? But this doesn't affect Raw Input in any case. None of those relate to the function for the raw input mouse event accumulator, which overwrites the *mx / *my variables with raw input data if m_rawinput = 1:
https://github.com/ValveSoftware/source-sdk-2013/blob/master/mp/src/game/client/in_mouse.cpp#L331

The order in which the functions are checked / applied is then here -- you can see that for the GetMouseDelta() function, the first two args are the passed-in raw input deltas:
https://github.com/ValveSoftware/source-sdk-2013/blob/master/mp/src/game/client/in_mouse.cpp#L663

It checks for its own filtering convar at that point -- which defaults to 0 at the top of the file, in case you were also worried about that -- before passing out the output:
https://github.com/ValveSoftware/source-sdk-2013/blob/master/mp/src/game/client/in_mouse.cpp#L368

 
All you need to do in CSGO is set m_rawinput 1, and m_customaccel 0 if you don't want either the in-game accel or the Windows EPP / WPS settings to apply. There's no reason to enter any other convars or worry about anything else. It matters not one jot whether Enhance Pointer Precision is enabled (m_mousespeed = 1) when the game is calling Raw Input.
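As a concrete sketch, that amounts to nothing more than this in an autoexec.cfg (convar names as discussed in this thread):

// autoexec.cfg -- minimal unaccelerated setup
m_rawinput "1"      // read the mouse via Raw Input; WPS / EPP can't touch it
m_customaccel "0"   // in-game acceleration off
m_filter "0"        // no two-frame input averaging (this is the default)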

Edited by TheNoobPolice

https://github.com/ValveSoftware/halflife/issues/1793 -- yes, this is 1.6 again, but still today people claim they have acceleration even with m_rawinput 1.

And still there is something wrong in CS:GO.

I just downloaded CS: Source, to compare against CS:GO, which is the same engine.

With all SmoothMouseCurves set to 0 and -useforcedmparms set, in CS:GO you can't move the mouse.

With all SmoothMouseCurves set to 0 and -useforcedmparms set, in CS: Source you can move the mouse.

And I don't change these values arbitrarily. SmoothMouseCurves set to 0 means no acceleration should be applied.

https://forums.blurbusters.com/viewtopic.php?f=10&t=8482&start=10

[attached screenshot]


Quoted from MarkC:

-noforcemspd (or no launch options) means: Do not force EPP ON. If EPP is on on the desktop, it will still be on in-game. (Do not force EPP ON does not mean the same as Force EPP OFF.)

-useforcedmparms means: Force EPP to the m_mousespeed ConVar. Since m_mousespeed is always = 1 (a bug), EPP is then forced ON. No existing option forces EPP OFF.

This is half true: if -useforcedmparms is set and m_mousespeed is 1 (with the default Windows registry values), no EPP is applied with m_rawinput 1. If you set m_rawinput 0, EPP gets set! If, on this setup, m_mousespeed 0 is then used, EPP is off no matter what m_rawinput is set to. So you can set it to 0.

And here comes the bug:

When -useforcedmparms forces EPP to the m_mousespeed ConVar, you should be able to move the mouse in CS:GO with the SmoothMouseCurves set to 0 and m_mousespeed 0. And you should not be able to move the mouse in CS: Source with -useforcedmparms and m_mousespeed 1.

 

 

Back when Windows 95 and 98 were as good as it got, Windows Control Panel allowed you to set the "mouse speed" using a 7-position slider. There was also a system function, SystemParametersInfo(SPI_SETMOUSE, ...), so that a program could set the same. The Windows 95/98 "mouse speed" slider was not like we have now: it actually set acceleration. Accel was calculated using two thresholds. At one mouse-speed threshold the pointer speed would suddenly double, and at the next threshold the speed would suddenly quadruple. At the left-most 'Slow' setting, there was no accel and pointer response was 1-to-1. As you moved the slider to the right, accel was turned on and the thresholds were changed so they cut in earlier. There is a graph showing the sudden changes here: Windows pointer speeds (translated from Japanese).

Internally, the accel was controlled by a parameter called 'MouseSpeed', and the thresholds were called 'MouseThreshold1' (the doubling one) and 'MouseThreshold2' (the quadrupling one). When MouseSpeed=0, no accel was applied. When MouseSpeed=1, only the doubling threshold was used. When MouseSpeed=2, both thresholds were used. See here: MS TechNet > MouseSpeed. Needless to say, the sudden jumps when the accel kicked in and the pointer speed suddenly doubled or quadrupled were not good for game play.

The developers of Quake had a fix for the accel problem: they called the SystemParametersInfo(SPI_SETMOUSE, ...) function to try and turn accel off. On Windows 95/98/2000 their fix worked! But it had a hidden problem. They set the accel thresholds to double at a threshold of zero (MouseSpeed=1, MouseThreshold1=0). They should have set the accel to not double at all (MouseSpeed=0). On Windows 95/98/2000, with Quake setting MouseSpeed=1, MouseThreshold1=0, there was no accel felt, because ALL mouse input was doubled, so response was linear. Why did Quake choose that setting? Maybe they read this MS article: How to disable mouse acceleration. That article says to set MouseSpeed to 1 or 2. They say 1 or 2 because the user can already use the Control Panel to set MouseSpeed=0, but for some users that makes the mouse too slow, so MS suggest 1 or 2 in their article.

In Windows 2000, mouse pointer speed was changed. The existing accel was moved from the 95/98 slider to a new 'Acceleration' section with 4 option buttons: None, Low, Medium, High. The new replacement pointer-speed slider was an 11-position slider (internally known as 'MouseSensitivity'). (Things get dangerous here, because now the Windows MouseSpeed setting controls acceleration, and the MouseSensitivity setting controls speed!)

Quake using MouseSpeed=1 to 'disable' accel was a problem waiting to happen when Windows XP changed the way it handled accel. Windows XP does not use the thresholds, but instead has a checkbox, 'Enhance pointer precision' (EPP), to turn accel on. They mapped the new EPP checkbox to the existing Windows 95/98/2000 MouseSpeed accel setting. Now when Quake tries to disable Windows accel by setting MouseSpeed=1, MouseThreshold1=0, it instead turns accel ON! At this point people freak out, and CPL comes to the rescue with the CPL Mouse Fix, and then later Cheese with a much improved mouse fix, and Aion/Anir with his accelfix and wcafix...

Valve use the Quake engine for their new Half-Life / Counter-Strike 1.6 games and inherit the problem. What do they do to fix it? For compatibility reasons, they don't fix it; they just add an option to work around it. They add a -noforcemspd launch option to NOT set MouseSpeed=1. They add a -noforcemaccel launch option to NOT set the MouseThresholdN values. They DO NOT add any option to disable accel, or fix the Quake code that tried to disable accel; the user must disable EPP themselves.

Valve create the Source engine, which inherits the problem. What do they do to fix it? They make -noforcemspd and -noforcemaccel the DEFAULT options; by default, Source games do not set MouseSpeed to anything. For compatibility reasons, they add a new option, -useforcedmparms, which reverts to the older Half-Life engine behaviour. - MarkC, creator of the MarkC Mouse Fix
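As a toy model of the threshold scheme MarkC describes above (purely illustrative, not real Windows code; per-packet counts stand in for "mouse speed"):

// Win95-era threshold accel, per the description in the quote above.
int ApplyLegacyAccel(int delta, int mouseSpeed, int threshold1, int threshold2) {
    int mag = delta < 0 ? -delta : delta;
    int mult = 1;
    if (mouseSpeed >= 1 && mag > threshold1) mult = 2;  // MouseSpeed=1: doubling kicks in
    if (mouseSpeed >= 2 && mag > threshold2) mult = 4;  // MouseSpeed=2: quadrupling too
    // Quake's "fix" (mouseSpeed=1, threshold1=0) doubles *everything*, which
    // feels linear -- until XP remapped MouseSpeed onto the EPP curve.
    return delta * mult;
}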

 

 

And again, a mistake: if Source sets -noforcemspd and -noforcemaccel by default, why can't you move the mouse in CS:GO with the SmoothMouseCurves set to 0? Yet in fact you can move the mouse in CS: Source! Because CS:GO doesn't have these launch options by default, the mouse parameters set in the engine affect the sensitivity.

If you follow the steps I mentioned, you can clearly see this is true -- no placebo, testable and provable.

 

In short, if you want 1:1 unsmoothed sensitivity in CS:GO (the same goes for Apex Legends), you have to set -useforcedmparms (allows the use of noforcemspd and noforcemaccel), -noforcemspd and -noforcemaccel in the launch options. Use m_rawinput 1 and m_mousespeed 1.

And to be 100 percent sure, set all SmoothMouseCurves to 0 (I don't know if this is still necessary then; I changed the SmoothMouseCurves for testing and left them at 0).

Edited by Quackerjack
20 hours ago, Quackerjack said:

[quoted above]

This is incorrect, and just misinformation. All you need to do is enable Raw Input and ensure the in-game accel is off (m_customaccel = 0, which isn't related to the Windows accel whatsoever).

My motivation for pointing this out is that a lot of fairly naïve newcomers to gaming lurk on forums like these to get info, and this sort of thing is totally confusing and just completely wastes their time.

I know for a fact you do not need to enter those launch args to get unsmoothed / unaccelerated / unfiltered mouse input -- you can see exactly what those params do in the code I posted, and exactly how Raw Input is being called. You can also see in this very thread a video where DPI Wizard sends different count-distance values at different input cadences, and there is no smoothing on the output, as the movement frame-by-frame is identical and it ends up at exactly the same spot (so no accel either). I know you are not experiencing any smoothing at all with m_rawinput = 1, because I know it does not exist (unless you deliberately go out of your way to enable m_filter = 1).

So I am genuinely curious what your motivation is to pretend otherwise. Is it just an "I want to be right" thing, or have you genuinely gaslit yourself into thinking that unless you mess about in the registry there is some mystical effect on your mouse input?

The MarkC registry stuff is 13 years old and has never been relevant to Raw Input games. All it ever did was make the cursor output with the "Enhance Pointer Precision" checkbox enabled in Windows mouse settings the same as with it disabled. You can just untick the box in Windows mouse settings for the same effect, or enable Raw Input, where the checkbox doesn't affect input anyway. There are probably about 3 titles in existence where it was necessary to avoid forced Windows accel, none of which anyone is playing now, and/or they have since been updated to use Raw Input anyway.

47 minutes ago, TheNoobPolice said:

[quoted the post above in full]

It's not, if you'd test it yourself. I never talked about m_customaccel; it's clear that is a command to activate the game's own acceleration.

Also, if you say m_rawinput 1 ignores all that, then it doesn't hurt if anyone tries it.

You can't deny that CS: Source uses the launch options by default and CS:GO doesn't, per my testing above.

I never said to use the MarkC mouse fix? Why are you talking about that? Do you even read and try things?

And the insinuation that I do this to gaslight or to be right is a bad joke. I do this because a lot of people feel that the sensitivity in CS:GO is off, and I want to share the way to fix it.

3 hours ago, Quackerjack said:

I never said to use the MarkC mouse fix? Why are you talking about that? Do you even read and try things?

You literally just pasted a huge quote from him where he is talking about that exact thing, and you have mentioned repeatedly changing the "SmoothMouseCurves to 0", which IS "the MarkC fix". No one needs to do that anymore, at all. Like, ever. It's a decade out of date. People changing SmoothMouseXCurve and SmoothMouseYCurve in the registry (aka the MarkC fix) as a routine habit on Windows installs are simply copy/pasting decades-old internet advice. Doing so may even break things if people mess with stuff they don't understand. Raw Input is there for a reason.

3 hours ago, Quackerjack said:

You can't deny that CS: Source uses the launch options by default and CS:GO doesn't, per my testing above.

Off-topic. I never said anything about CS: Source or 1.6. It's also completely irrelevant to anyone playing CSGO, which is the entire point of the discussion, starting from the statement "CSGO has 100% FAIL sensitivity". False. It doesn't. If you wanted to talk about CS: Source or 1.6 and their specifics, you could have started a thread doing so.

3 hours ago, Quackerjack said:

 

And the insinuation that I do this to gaslight or to be right is a bad joke. I do this because a lot of people feel that the sensitivity in CS:GO is off, and I want to share the way to fix it.

Fix what? There's a reason the entry on this website says only this for CSGO:

[attached screenshot of the site's CS:GO entry]

And it doesn't say to ensure you have the SmoothMouseCurves set to 0, or any of those launch arguments you mention. Because none of it is required.

But up to you, man. If you want to ignore the relevant game engine code shown directly to you, ignore a video in this thread from the owner of the website you are subscribed to showing the input is not smoothed or accelerated, and instead go by "a lot of people feel"... then good luck. I've tried to help you see the game is fine, but if you insist on inventing problems that you then have to work around, there's nothing anyone can do about that.

35 minutes ago, TheNoobPolice said:

[quoted the post above in full]

The MarkC fix doesn't change the SmoothMouseCurves to 0; if it did, you wouldn't be able to move your mouse when a game turns EPP on.

I literally described all the steps showing how you can see that Windows influences the sensitivity. I also linked sources showing that the engine is responsible for these hardcoded values, and that there are still hardcoded values.

As I mentioned, CS:GO is based on Source, which is based on GoldSrc (the Half-Life engine), which is based on Quake. And all have the inherited bug that sets MouseSpeed 1. That's why I mentioned CS:S and CS 1.6 here. I also never said this game has mouse acceleration. I said Windows affects the sensitivity, because this hardcoded MouseSpeed 1 is set in the engine.

Also, CS:S uses the launch options by default and CS:GO doesn't. Both games have the raw input option, btw. And yes, the reason why CS:S has the launch options defaulted is that it wasn't released with m_rawinput, which was added later.

So it's not false. If you tested it (I think you don't even play CS:GO), you would see I'm right. I mean, it doesn't hurt to try it out and set the launch options. And again, it's not about right or false. It's for people to get a responsive mouse in this bugged game.

And in my post I said I used the SmoothMouseCurves for testing. I don't think they are necessary if -useforcedmparms, -noforcemspd and -noforcemaccel are set; I just left them at 0. But setting the SmoothMouseCurves to 0 was needed to show that CS:GO doesn't use the launch options by default.

Edited by Quackerjack

Most importantly...

My old "100% fail" quote is fully outdated given the information I have received over time from multiple sources. It has already been retracted, in light of the solutions. Just wanted to remind everyone about that. I am now catching up on what I have not yet replied to. Stay tuned -- I'm editing my reply in Notepad++ (for autosave).

Edited by Chief Blur Buster
On 1/24/2023 at 11:34 PM, Chief Blur Buster said:

[quoted above]

And now I rewrite my accidentally-lost reply, as I promised. 🙂

Although I will shorten it (summarized/abbreviated), because the continued discussion has already covered some ground, and the gist summarized it well.

So in other words, I will avoid repeating things I already said in the "gist" above -- read the gist first.

On 1/23/2023 at 8:48 AM, TheNoobPolice said:

The reason 8khz doesn't work a lot of times is nothing to do with "mouse mathematics" or floats vs double precision of sensitivity variables. It is because of how games acquire input themselves.

Correct, that's a bigger weak link. 

Now some angles to ponder:

It's a long chain from sensor to photons, and "because of how games acquire input themselves" is exactly it -- including how well or badly they do the mathematics. For example, if you're doing 500 rounding errors per second at 500 frames per second on a 500 Hz display, it can build up enough to create "false acceleration / false deceleration" behavior in some engines, even though mouse acceleration is turned off. The more frames that spew at you, the more math rounding errors per second can occur, if the developer wasn't forward-looking enough to compensate for that possibility.

The best way is to correctly accumulate the mouse deltas and base your new mouseturn angle off that accumulation, not to modify the rounded-off mouselook position from the previous frame cycle. 500 rounding errors per second definitely builds up very fast -- to more than 1% mouselook offpoint error per second in some engines. This isn't audiophile below-the-noise-floor stuff.
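A minimal sketch of the difference (hypothetical names, not from any particular engine): keep the raw count total in exact integer math and derive the angle from it, so per-frame rounding never feeds back into the stored state.

// Accumulate raw counts; convert to an angle from the running total.
struct MouseAccumulator {
    long long totalCounts = 0;                    // exact lifetime total, integer math
    void OnRawDelta(long dx) { totalCounts += dx; }
    // Rounding of the result affects only this frame's output; it is never
    // re-ingested into the state, so errors cannot compound at 500 updates/sec.
    double YawDegrees(double sens, double yawPerCount /* e.g. 0.022 */) const {
        return totalCounts * sens * yawPerCount;
    }
};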

Sometimes a shotgun approach is used to fix all the weak links at once, both the human-visible (seen by 90%) and the less likely (seen by 1%). The concept of preserving sensor timestamps all the way to the engine, regardless of how the engine acquires input, increases the likelihood that the engine can avoid a lot of the pitfalls befalling us today.

On 1/23/2023 at 8:48 AM, TheNoobPolice said:

A game like Valorant only works with 8khz when it calls GetRawInputBuffer() to hold the inputs until an arbitrary buffer size is filled when the engine is ready for the input. If any app just gets WM_INPUT messages "as they come" as per a standard read of Raw Input, then unless its pipeline is so trivial that it is only doing something like updating a counter, it will most likely fall over with inputs spamming in haphazardly every ~125us.

Yup. Exactly.
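(For readers following along, the buffered pattern being described looks roughly like this -- the standard GetRawInputBuffer drain loop; names are illustrative:)

#include <windows.h>
#include <vector>

// Drain every pending raw mouse event in one call per frame and sum the
// deltas, instead of waking up for each ~125us WM_INPUT message.
void DrainRawInput(long& dx, long& dy) {
    UINT cbSize = 0;
    if (GetRawInputBuffer(nullptr, &cbSize, sizeof(RAWINPUTHEADER)) != 0) return;
    cbSize *= 16;                                   // room for a batch of events
    std::vector<BYTE> buf(cbSize);
    for (;;) {
        UINT size = cbSize;
        UINT count = GetRawInputBuffer(reinterpret_cast<PRAWINPUT>(buf.data()),
                                       &size, sizeof(RAWINPUTHEADER));
        if (count == 0 || count == static_cast<UINT>(-1)) break;
        PRAWINPUT ri = reinterpret_cast<PRAWINPUT>(buf.data());
        for (UINT i = 0; i < count; ++i) {
            if (ri->header.dwType == RIM_TYPEMOUSE) {
                dx += ri->data.mouse.lLastX;        // accumulate raw counts
                dy += ri->data.mouse.lLastY;
            }
            ri = NEXTRAWINPUTBLOCK(ri);             // advance to next aligned block
        }
    }
}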

On 1/23/2023 at 8:48 AM, TheNoobPolice said:

The symptom is dropped packets, negative accel and / or just maxed CPU usage and stuttering.

Yup. Exactly.  
But it's not just about dropped packets; there are myriad causes that can jitter everything in the noise between sensor and photons.

On 1/23/2023 at 8:48 AM, TheNoobPolice said:

None of this is anything to do with sensitivity calculations being superior / inferior. Windows is also not a RTOS and is objectively bad once timing gets tight, it becomes extremely expensive to try and do anything accurately at these kind of timings. This is not going to change as it's a fundamental of the Windows environment.

Correct, Windows is not an RTOS.

On 1/23/2023 at 8:48 AM, TheNoobPolice said:

The only reason low DPI can work with 8khz in some games where high DPI doesn't, is because the DPI is nowhere near saturating the polling rate and you're getting much lower input cadence. Set your mouse to 8khz and 400 dpi and move at 1 inch per second and your update rate is 400hz (obviously) and is therefore no longer "8khz" as far as the game is concerned. This is nothing to do with the DPI setting itself, which the game has no knowledge of or interaction with as DPI Wizard already said.

Exactly -- that's why I am a big advocate of being able to expose higher DPI more accurately to games with fewer side effects. Although DPI and poll rate are different, higher DPI leads to event storms in an engine more quickly, which can jitter those timestamps around more often (e.g. moving at 1 inch per second is more likely to produce more mouse events per second, or more mouse deltas per second, at 1600dpi than at 400dpi).
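A back-of-envelope version of the point quoted above (pure arithmetic, no API involved):

#include <algorithm>
#include <cstdio>

int main() {
    const double pollHz = 8000.0;                   // configured polling rate
    const double ips = 1.0;                         // hand speed, inches per second
    for (double dpi : {400.0, 1600.0, 3200.0}) {
        double countsPerSec = dpi * ips;            // counts the sensor generates
        double reportsPerSec = std::min(countsPerSec, pollHz);  // empty polls send nothing
        std::printf("%5.0f dpi -> %5.0f counts/s, ~%5.0f non-empty reports/s\n",
                    dpi, countsPerSec, reportsPerSec);
    }
    // 400 dpi at 1 in/s is an effective 400 Hz stream, even on an "8 kHz" mouse.
    return 0;
}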

Even if the game engine batch-processes an event storm to prevent single-event processing overheads, this does not solve everything. Other weak links remain, too, including the lack of accurate sensor timestamps (e.g. timestamps created by software long after the horses have left the barn door of the sensor).

The timestamps added to mouse deltas by software can jitter many poll cycles off the actual sensor timing, depending on how things played out (USB jitter, software performance, etc.). This is a problem even if you're reading 10 polls all at once from a mouse-delta array, because they all had timestamps added by some software middleman with no correlation to what happened at the sensor level.

My stance is: all weak links are valid -- fix all of them at once if possible. Thus, timestamp at the sensor, and preserve those timestamps all the way to the software, for it to process accurately relative to its gametime timestamps. Then who cares about some of the academic discussion in between (some visible, some not visible)...
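Purely illustrative of that idea (the HD Mouse API is only a proposal, so this struct is hypothetical, not a published spec):

#include <cstdint>

// One report as the proposal envisions it: the timestamp is stamped at the
// sensor, so the host can reconstruct true motion timing regardless of USB
// packet jitter or OS scheduling noise further along the chain.
struct HdMouseReport {
    uint64_t sensorTimestampUs;  // microseconds, sensor clock domain
    int32_t  dx;                 // raw counts since the previous report
    int32_t  dy;
};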

On 1/23/2023 at 8:48 AM, TheNoobPolice said:

Most simulation loops of in-game physics, enemy positions - things like whether a crosshair is over an enemy etc will run at 60hz, maybe a really competitive FPS would run higher, and maybe they poll mouse input faster at 2 or 3 times the frame rate of the game, with the textures / graphics rendering running at the actual frame rate obviously. Usually though, you would register event callbacks from input devices which are then passed to a handler / accumulator that is then called once at the start of each frame. In other words, it does not matter if you are sending 8000 updates a second to the OS, because a game will just buffer them all and sum the total mouse distance to be called at the start of each render frame anyway - it makes no practical difference to your crosshair position whether you do this work in the firmware of the mouse at 1khz, or whether a game does it instead at 8khz. The only important factor is that the polling rate is greater than or equal to the highest frame rate of the game at a minimum. If you think using 8khz is giving 8000 discrete rotational updates of your crosshair a second, and for each of those positions an enemy location is being calculated for whether said input would be over a target or not (i.e something meaningful)

That isn't currently the biggest benefit of an 8 kHz poll rate (in a game that supports 8 kHz). Many games don't handle 8 kHz well enough for it to shine, but in the best cases -- the most optimized software on the most optimized systems -- you really can notice the smoothness difference. Those cases are few and far between, because systemic limitations push the benefit below the human-visibility noise floor. Most of the time I leave my mouse at 2 kHz for that reason, as it delivers roughly 90% of the benefit of the 1 kHz -> 8 kHz journey. Under Windows 11, some games even seemed to have difficulty with 1 kHz (a Microsoft optimization issue, until a Windows update fixed it), and some worked fine at 1-2 kHz but not at 4-8 kHz. Today's top poll rates push system / OS / driver efficiency hard enough that 8 kHz does not yet produce widespread, universal human-visible benefits, which is part of the reason the proposed HD Mouse API is a good idea.
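For reference, here's the accumulate-then-consume pattern from the quote above as a bare-bones Python sketch (class and method names are mine, purely illustrative; a real engine would hang this off the OS raw-input callback):

```python
import threading

class MouseAccumulator:
    """Input thread sums deltas; the game thread drains them once per frame."""

    def __init__(self):
        self._lock = threading.Lock()
        self._dx = self._dy = 0

    def on_raw_input(self, dx, dy):
        # Called once per poll packet (1 kHz, 8 kHz, whatever the mouse sends).
        with self._lock:
            self._dx += dx
            self._dy += dy

    def consume(self):
        # Called once at the start of each render/sim frame.
        with self._lock:
            dx, dy = self._dx, self._dy
            self._dx = self._dy = 0
        return dx, dy
```

Whether the summing happens at 1 kHz in mouse firmware or at 8 kHz here, consume() returns the same per-frame total -- which is exactly the point being made above. What gets lost is the per-event timing inside the frame.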

Now, poll rate should be somewhat oversampled, because of jitter. You see this in the beating of the poll Hz against another frequency (a framerate, or a refresh rate) -- like the weird mousefeel on some mice and systems when testing at 249 Hz or 251 Hz, which is 1 Hz off a multiple of common poll rates (a toy calculation below). The jitter feel also differs by sync technology (VSYNC ON, VSYNC OFF, G-SYNC, etc.), but a modicum of oversampling helps avoid the beating between the two frequencies.
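Napkin math on why 1 Hz off a clean multiple feels weird -- a toy beat-frequency calculation, assuming ideal, perfectly-paced clocks (illustrative only):

```python
# Beat (aliasing) frequency between a poll clock and a frame clock. Being a
# few Hz off a clean multiple shows up as a slow cadence flutter in how many
# polls land inside each frame.

def beat_hz(poll_hz, frame_hz):
    nearest_multiple = round(poll_hz / frame_hz) * frame_hz
    return abs(poll_hz - nearest_multiple)

print(beat_hz(1000, 250))  # 0 -> polls divide evenly into frames
print(beat_hz(1000, 249))  # 4 -> a ~4 Hz flutter in polls-per-frame
print(beat_hz(1000, 251))  # 4 -> same flutter on the other side
```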

This beat-frequency jitter is not important most of the time, but it's one of many error margins that need to be addressed en masse (while one error margin may be below the human noise floor, dozens of error margins add up to visibility). Better to punch through all the noise and just preserve sensor timestamps to the game engine, rather than play whack-a-mole along the chain.

On 1/23/2023 at 8:48 AM, TheNoobPolice said:

then you are mad.

While it's not my purview to determine whether a person is mad, I like refinements that satisfy both the mad and the non-mad.

That's the shotgun approach: blasting multiple error margins out of existence at the same time (both the human-visible and the non-human-visible ones).

On 1/23/2023 at 8:48 AM, TheNoobPolice said:

Once we get into the future where performance improves, it is also not inevitable this will change - rather the opposite. We have, for example, the precedent of "Super Audio CD" and "DVD Audio" in the audio realm, which were both large increases in resolution vs CD quality on a factual basis, yet both failed as standards precisely because that level of resolution is not required for the user experience - instead, users actually gravitated towards lower resolutions (compressed audio formats) and smaller file sizes for easier distribution. Point being - if such technological improvements were available to game engine developers to do more complex computations within a smaller timeframe, they would not be using such resources to update the crosshair position more frequently. There are many things such as higher animation fidelity, better online synchronisation systems, more complex physics, rendering improvements etc. which would all be much higher priority and more obvious quality gains.

This isn't "below the noise floor" stuff.

It's all about geometric steps: e.g. 240Hz vs 1000Hz is visible to well over 90% of a random-from-public mainstream population if you design your blind test around forced eye-tracking of a forced-moving object (e.g. moving-text readability tests, where successfully reading the text is the objective). Perfect 240fps and perfect 1000fps are very hard to achieve, though; all the noise margins (including jitter) shrink the differential.

Instead of a DVD-vs-720p situation, you've got a VHS-vs-8K situation, except in the temporal dimension.

A 4x difference in display-persistence motion blur (unstrobed, framerate=Hz) is much easier to see than LCD GtG-throttled refresh-rate incrementalism.

[Image: display persistence blur equivalence to camera shutter]

Scientifically -- assuming perfect framepacing, perfect sensor/step pacing, perfect GtG = 0 ms, zero GPU motion blur, and sample-and-hold -- motion blur is purely a function of frametime (napkin math follows the list below):

- The theoretical minimum motion blur is motionblur = frametime on ANY strobeless / impulseless / flickerless / BFI-less display.
- Perfect 240fps 240Hz has exactly the same motion blur as a 1/240sec film-camera photograph.
- Perfect 1000fps 1000Hz has exactly the same motion blur as a 1/1000sec film-camera photograph.
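Here's that napkin math in Python (numbers illustrative):

```python
# Sample-and-hold persistence blur: the eye tracks, the pixel holds still for
# one frametime, so blur width = tracking speed x frame visibility time.

def blur_px(speed_px_per_sec, frametime_sec):
    return speed_px_per_sec * frametime_sec

for hz in (60, 240, 1000):
    print(f"{hz:4d} fps@Hz at 1000 px/sec tracking -> {blur_px(1000, 1/hz):.2f} px of blur")
# 60   -> 16.67 px
# 240  ->  4.17 px (same blur as a 1/240sec shutter)
# 1000 ->  1.00 px (same blur as a 1/1000sec shutter)
```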

In practice it's worse than that (240Hz vs 360Hz feels like a 1.1x difference, not 1.5x) due to the zillions of weak links from sensor to photons -- especially software, but hardware too.

Common ones accrue. GtG weak link? Ditto. USB jitter weak link? Ditto. Mouse sensor flaws? Ditto. Poll-rate processing issue? Ditto. GPU blur weak link? Ditto. Math rounding errors that build up faster than before due to more frames per second? Ditto. That's just a few of the massive number of weak links, some significant, some insignificant.

It is already well known that bigger differences in camera shutter speeds are easier to tell apart. If you're a sports photographer, it's hard to tell a 1/240sec photograph from a 1/360sec one, but way easier to tell 1/240sec from 1/1000sec for fast motion.

[Image: motion blur from persistence on sample-and-hold displays]

So, perceptually, 1000fps 1000Hz will yield perfect motion (zero stroboscopics, zero blur) for motionspeeds up to 1000 pixels/sec. If you go to faster motion with tinier pixels, e.g. 4000 or 8000 pixels/sec, it requires more Hz to retina all that out. But on a 1080p display, those motionspeeds go offscreen in a fraction of a second; not so if you're wearing a 16K 180-degree-FOV VR headset.

Loosely speaking, the retina refresh rate for a given motionspeed (in pixels/sec) sits at some Hz above the human eye's ability to track that motion (as long as the angular resolution is fine enough that panning images show different clarity than static images). The goal is to avoid both the stroboscopic stepping effect (e.g. during fixed gaze) and the motion blur (e.g. during tracking gaze), whereupon you can't tell the display apart from real life in a temporal sense.
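As a toy formula for that rule of thumb (my own simplification, not a lab-grade model): on a perfect sample-and-hold display, both the blur width (tracking gaze) and the stroboscopic step (fixed gaze) are speed/Hz pixels, so:

```python
def retina_hz(speed_px_per_sec, max_artifact_px=1.0):
    """Hz needed to keep blur width (tracking gaze) and stroboscopic step
    (fixed gaze) at or below max_artifact_px on an ideal sample-and-hold
    display, where artifact size is speed/Hz pixels."""
    return speed_px_per_sec / max_artifact_px

print(retina_hz(1000))  # 1000 Hz retinas-out 1000 px/sec motion
print(retina_hz(8000))  # 8000 px/sec (headturn-speed) needs ~8000 Hz
```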

...Side commentary and side napkin exercise: VR ideally has to look exactly like real life. VR is forced to flicker only because we can't achieve those persistence levels strobelessly. The Oculus Quest 2 is 0.3 ms MPRT, so it would require 3333fps 3333Hz to create the same motion quality without its current flicker-based (BFI / strobing) motion blur reduction. That underlines how far we are from eliminating motion blur strobelessly for every possible use of a display. Until then, an A/B test between VR and real life can easily fail due to a single weak link (display motion blur forced on you above and beyond real life, or stroboscopic effect above and beyond real life, or jitter forced on you above and beyond real life, etc.). And we're backporting some VR innovations to FPS displays, given how much better framepaced / jitter-compensated the VR world currently is than the non-VR world... It's shocking, really, once you know the weak links we just handwave off in the non-VR world in this era of ever-bigger, ever-higher-resolution displays. Given a good system, VR achieves a near-miraculous sensortime:photontime accuracy, two to three orders of magnitude better than an average esports system, thanks to superlative display hardware and software optimization -- assuming you don't underspec for the VR content, like trying to run Half Life Alyx on a GTX 780. When you're head-turning at 8000 pixels/sec, anything that deviates from real life shows up far more, like a 4-pixel microstutter, which is just a 0.5 ms framepace error during an 8000 pixels/sec headturn. As a rule of thumb, stutter error bigger than the MPRT motion blur is easily human-visible in a motion test. Even 240Hz unstrobed is still (4.167 ms MPRT + "X" ms GtG) of display motion blur, which easily hides these tiny jitters...
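The 3333 figure above is just the reciprocal of persistence; as a sanity check:

```python
# framerate=Hz needed to match a given persistence (MPRT) without strobing
def strobeless_equiv_hz(mprt_ms):
    return 1000.0 / mprt_ms

print(strobeless_equiv_hz(0.3))    # ~3333 -> Quest 2's 0.3 ms MPRT
print(strobeless_equiv_hz(4.167))  # ~240  -> unstrobed 240 Hz sample-and-hold
```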

That's part of the inspiration for HD Mouse APIs: headtracker data is often microsecond-timestamped and relayed to the system in a preserved way, so tracktime:gametime:photontime is far more accurate in VR hardware and software engineering than in current non-VR.

And yes, GPU frame rate is a limiting factor to tomorrow's 1000fps+ 1000Hz+ world.  

Yes, GPU horsepower does need to keep up, but there's a long-term lagless path there (reprojection). Speaking of which, that's a VR innovation that will eventually be backported to non-VR contexts in the coming years or decade, as some game developers have noticed.
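A deliberately crude sketch of the reprojection idea (real implementations use depth-aware warps and asynchronous timewarp; this merely shifts an already-rendered frame by the newest yaw delta to synthesize in-between frames -- everything here is illustrative):

```python
import numpy as np

def reproject_yaw(frame, yaw_delta_px):
    """Crude 2D reprojection: shift the frame horizontally by how far the
    camera yawed (in screen pixels) since it was rendered. Real engines fill
    the revealed edge; np.roll just wraps it around for brevity."""
    return np.roll(frame, shift=-yaw_delta_px, axis=1)

# Example: render at 250 fps but scan out at 1000 Hz by synthesizing three
# reprojected frames from the freshest mouse input between real renders.
rendered = np.zeros((1080, 1920, 3), dtype=np.uint8)  # stand-in for a real frame
for mouse_dx_px in (2, 2, 3):  # per-scanout yaw deltas from fresh polls
    displayed = reproject_yaw(rendered, mouse_dx_px)
```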

That being said, this isn't only-audiophiles-can-hear-it stuff.

It's more majority-of-mainstream-can-see-it-if-blind-tested stuff, even for things like browser scrolling (that's why Apple and Samsung came out with 120Hz). The mainstream notices more readily at 4x+ differences, like 60Hz vs 240Hz, or 240Hz vs 1000Hz, rather than merely 60Hz vs 120Hz -- in some games and blind tests the geometric difference matters more than the absolute number of Hz.

Even mere browser scrolling looks noticeably clearer to most of the population at 1000fps 1000Hz versus 240fps 240Hz; it's as if you've turned on strobing (except you didn't). Apple would do it already if it cost zero battery power.

At the end of the day, killing enough of the weak links to approach refresh-rate perfection amplifies the differences between refresh rates. Sometimes one line item is unimportant by itself, but we need to obliterate a hell of a lot of line items, and the mouse is undoubtedly one of them.

Just as 4K was $10,000 in year 2001 (IBM T221), 4K is now a $299 Walmart special. Tomorrow's refresh rates will cost pennies extra on top of 60Hz, yet we've kept buying small upgrades for too long (e.g. 240Hz to 360Hz is a worthless 1.5x, throttled further to ~1.1x by weak links). And Linus said it's cheap to laglessly frame-generate extra frames; game engines and GPU vendors haven't picked it up yet, but they will be forced to by 2030 to support future kilohertz-class displays.

You raised some significant issues (which I agree with); I raised significant issues too. Whether you and I agree on individual weak-link line items is academic next to the point that weak links are sabotaging the refresh rate race, so we don't get our money's worth out of the Hz that we ideally should.

Sometimes weak links get fixed line item by line item. The 240Hz OLED prototypes sitting here will help solve a major GtG weak link (thanks to that zeroed-out weak link, 240Hz OLED has roughly as clear motion as the best 360Hz E-TN, the XL2566K, at framerate=Hz, although the XL2566K's lag is still slightly more -- a separate flaw to refine, distinct from display motion blur). Other weak links should be fixed en masse (e.g. sensor-level timestamps). Either way, many weak links remain.

Also, given that lab testing has already confirmed visibility to random members of the mainstream population way above the noise floor of all that silly audiophile-style testing... this definitely isn't audiophile-only stuff.

We've just been spoonfed refresh-rate incrementalism for way too long, with too little benefit to humankind, without en-masse fixing of all the systemic weak links at once, including everything you just said (which indirectly agrees with several of my points, even if we disagree line item by line item).

Edited by Chief Blur Buster
Link to comment
  • 2 weeks later...

I remembered a bug in the Counter-Strike 1.6 engine that we oldschool AWPers know about. This bug is present with any combination of settings you can think of :) regardless of whether raw input is on or off. It nicely illustrates that this topic has meaning from both points of view (benefit/criticism). As you can see in the video, a slow movement of the mouse to the right is registered by the game when scoped, but not to the left.

This is also why I prefer manual verification over software: I want to be 100% sure how the system/game reflects the real world (the long chain from sensor to photons), not just the software world.

Link to comment
