CS:GO - m_rawinput vs rinput



  • Wizard

Which input method provides the most accurate sensitivity in CS:GO: raw mouse input on, raw mouse input off, or the RInput program? Find out here!

 

 

RInput is a third-party program that injects itself into the game to remove mouse acceleration.

 

Test system

CPU: Intel Core i7-3770K

Memory: 16 GB

GPU: Geforce GTX Titan | Driver: 355.82

OS: Windows 10 Professional x64

Game: Counter Strike: Global Offensive (Steam version), Exe version 1.34.9.9 (csgo)

Map: Dust II

 

Test method

This test is done by analyzing the number of packets lost or gained between the mouse driver and the game, using Logitech's scripting capabilities in their Gaming Software. Each test is performed by sending 100k packets to the game.

Packet discrepancy is not the same as acceleration. A discrepancy is a whole report sent from the mouse that does not make it to the game (or, in some rare cases, gets doubled), while acceleration is a report that is received by the game but has movement added to or subtracted from it. For example, a 10-count report that never reaches the game is a discrepancy; a 10-count report that arrives but registers as 12 counts has been accelerated.

 

What's not tested

Since this analysis is done directly from the mouse driver, it does not account for anything that happens between the sensor in the mouse and the mouse driver, such as USB latency and sensor inaccuracy.

CS:GO's built-in raw mouse input supposedly suffers from buffering tied to the frame rate, but I have not tested this, nor experienced it. Other sources indicate it might be a matter of 1-2 ms of lag, which is insignificant compared to the findings here.

The tests are also done offline, so most variables caused by online play are removed. If anything, these results represent a best-case scenario.

 

Testing procedure

At first I was set on doing a test where 1000 packets were sent to the game, then counting the discrepancy. The results turned out to vary a lot, so I needed to increase the sample size.

I did a lot of testing with 10k packets, doing each test 10 times. The results were a lot more consistent, but very time-consuming.

So I tested with 100k packets one time for each test, and compared the results to the average of the 10k packets tests. As they were pretty much identical, I used this method for the final analysis.

 

Test layout

There are four input methods tested here:

  • m_rawinput 1 - CS: GO's built-in raw mouse input capability
  • m_rawinput 0 - By turning off raw mouse input, the game gets the mouse data from Windows rather than directly from the mouse driver
  • rinput.exe 1.2 - This program injects itself between the mouse driver and the game, and this is the non-sequential version
  • rinput.exe 1.31 - This is the sequential version of rinput

Each input method is tested with packet sizes of 1, 10 and 100. The packet size determines how many counts the game should move the cursor/crosshair in a single report.

 

In addition, each of these combinations is tested with 1, 2 and 4 ms delays, corresponding to polling rates of 1000 Hz, 500 Hz and 250 Hz.

 

And lastly, all these tests were done both with and without vertical sync (vsync). All vsync tests were done with triple-buffered vsync @ 120 Hz, but I have tested different refresh rates, and 60, 120 or 144 Hz does not make much of a difference.
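For reference, here is a minimal sketch of how the packet size and delay combinations map onto a Logitech Lua loop. The parameter names are illustrative only; the actual script is posted further down in the thread, and this would run inside the same OnEvent handler:

local packetSize   = 10      -- counts per report: 1, 10 or 100
local delayMs      = 1       -- 1, 2 or 4 ms, nominally 1000, 500 or 250 Hz
local totalPackets = 100000  -- 100k packets per test
for i = 1, totalPackets do
	MoveMouseRelative(packetSize, 0)
	Sleep(delayMs)
end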

 

Results

These graphs show the average discrepancy for each polling rate. The numbers are averaged from the 1, 10 and 100 packet size tests. The raw data is shown under the graphs.

As these are averages, the reality is that you might experience anything from close to no packet loss, to over twice the average. For methods other than "m_rawinput 1" there is a lot of random fluctuation.

 

Vsync OFF:

[Graphs: average discrepancy per polling rate, vsync off]

 

Vsync ON:

[Graphs: average discrepancy per polling rate, vsync on]

 

Video

This video visualizes the discrepancy:

 

Conclusion

"m_rawinput 1" provides by far the most accurate and consistent interpretation of the signals from the mouse driver. All tests point to the fact that this feature does exactly what it is supposed to do; get the raw mouse data without anything interfering in between.

 

"m_rawinput 0" is better than rinput.exe in all tests, sometimes rinput.exe has triple the loss of "m_rawinput 0".

 

Rinput 1.2 seems to yield a slightly better result than 1.31, especially at 1000 Hz.

 

Vsync has a huge impact on everything except "m_rawinput 1".

 

It's important to note that 1% loss does not necessarily mean that you move 1% shorter. When you play, the packets sent from your mouse will vary greatly in size, ranging from 1 to over 1000 counts depending on your settings. If you lose a 1-count packet, it won't make much of a difference, but if you lose a 1000-count packet, it will have a huge impact on how far you move.
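To put rough numbers on that (the counts below are made up for illustration, not measured data):

-- A fast flick is typically one huge report surrounded by small ones:
local reports = {1000, 3, 5, 2, 4, 1, 6, 2, 3, 4}
local total = 0
for _, c in ipairs(reports) do total = total + c end
print(1000 / total * 100)  -- losing the 1000-count report drops ~97% of the movement
print(1 / total * 100)     -- losing a 1-count report drops under 0.1%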

 

If you want accuracy, there is no doubt: use "m_rawinput 1". Around 1-2 percent packet loss is quite significant, especially since the actual loss fluctuates, making the movement inconsistent.

 

If you want some other tests done, please let me know!



Link to comment

Very interesting data, thanks for your hard work.

 

I still can't help but feel that the packet loss is something that can vary greatly depending on one's specific hardware/firmware/software/OS/game configuration. One of the biggest things I see initially is that VSync makes such a big difference, which makes me question how much other factors are affecting the data.

 

I am eager to perform your experiment on my own computer. Is it possible you could provide the scripting file and instructions for your exact methodology using the Logitech software?

 

Also, I'd like to say that 1.5 ms of input lag added between the mouse data read and the frame render is not as insignificant to some as you stated. Some people seem particularly sensitive to this and would still find slightly less accurate mouse movement from packet loss a worthwhile tradeoff for mouse movement that feels more responsive and 'connected'.

 

And that's interesting about RInput 1.2 versus 1.31. I want to look into this more myself. Thanks again for your contributions.

Edited by Vols and Jezuz
Link to comment
  • Wizard

I welcome everyone to try for themselves :)

 

This is the script; it is triggered by key G1 while M1 is active (make sure you unbind any keys/macros bound to that key):

if (event == "G_PRESSED" and arg == 1) and GetMKeyState() == 1 then
	for i = 0, 999 do
        MoveMouseRelative(10,0)
	Sleep(1)
	end
end

Set the sensitivity in-game to 1.63636364, and executing the script should spin you around exactly 360 degrees. FOV needs to be set to the default 90.
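As a quick sanity check of those numbers (assuming CS:GO's default m_yaw of 0.022 degrees per count, which is not stated above):

-- 1000 packets x 10 counts, at sensitivity 1.63636364 and m_yaw 0.022:
print(1000 * 10 * 1.63636364 * 0.022)  -- ~360.0 degrees, i.e. exactly one full turn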

 

Note that with only 1000 packets you will get varying results.

Link to comment

I wrote a script that performs the exact same core mouse movement inputs as the Logitech script, but I suspect it is much more efficient/accurate at mimicking 1000 Hz USB polling, for several reasons I will go into when I do a full post of my data similar to the OP. This inefficiency/inaccuracy would be exacerbated for m_rawinput 0/RInput because of the effect on DLL calls that must be very precise, and those two require 1 and 2 extra DLL calls respectively that m_rawinput 1 does not need. More details on this, as well as my script, to follow in the full findings post.

 

But as a teaser, my initial finding from 100,000-packet runs is that RInput has ~0.42% packet loss (VSync off, 1000 Hz/1 ms update rate) compared to ~0.002% for m_rawinput 1. This data would certainly make the m_rawinput 0/RInput tradeoff (inaccuracy from packet loss in exchange for ~1.5 ms less input lag relative to m_rawinput 1) much closer to a debatable choice with no clear-cut winner.

Edited by Vols and Jezuz
Link to comment
  • Wizard

Quoting Vols and Jezuz: "I just got a Teensy, which if I'm understanding correctly, I'll be able to do essentially what our scripts are doing, but sending it through USB to get the full polling effect and not having to rely on scripts which mimic USB polling and CPU prioritization with timers and such."

I'm planning to get one too, many cool things that can be done with it :)

Link to comment

if anyone can compile https://github.com/abort/Rinput-Library

try this

http://www.overclock.net/t/1545382/csgo-m-rawinput-0-drops-samples/190#post_24385831

 

"1.5ms input lag"

 

damn it man idk how many times i have to say this

i only got 1.5ms difference by underclocking my 4770k to 800mhz and gpu to whatever the minimum is in afterburner. this gave an in-game fps of ~90 in a server with no bots. the difference at my usual settings was 100-200us. dont remember precisely, but there was for sure a consistent difference.

Link to comment


I'm not just basing that on your finding; I've also seen a high-speed camera, full-chain measurement of 1-2 ms consistent input lag for m_rawinput 1 versus m_rawinput 0. I had it bookmarked along with a wealth of other good source materials off which to base my future experiments and measurements, but unfortunately they were rip'ed along with several other things when my Windows install ate itself a few months ago and I had to revert to a full system image from April. I'm trying to find it, but I'm fairly certain it was from one of those obscure Japanese sites.

 

Regardless, I test out and swap between m_rawinput 0, RInput, and m_rawinput 1 on a fairly regular basis, and I'm just about as positive as I can be that (on my system at least) there is some kind of input lag or smoothing or buffering or whatever you want to call it going on with m_rawinput 1. It just feels distinctly different to me in my vast personal experience with it. For reasons I could probably expound upon, but which are probably not worth mentioning here, I almost prefer it for AWPing, while in general I prefer RInput. Obviously an A/B/X test would be exceedingly difficult if not impossible for RInput vs. m_rawinput 1 (vs. m_rawinput 0?), though an A/B test would be possible for me to implement in my program, but I feel confident I would have a similar ability to differentiate between them as I did with m_rawinput 1 versus m_rawinput 0 in your CS:GO A/B/X procedure.

 

Acquiring my Teensy was a step in the right direction for providing some better raw data on the matter. But eventually I hope to also obtain an oscilloscope and a high-speed camera capable of full-chain input lag measurement. I think only then will we be able to ferret out exactly what is going on here.

Edited by Vols and Jezuz
Link to comment
  • Wizard


BTW, one annoying limitation of the Logitech script is that it takes 0.5 ms for the script to actually send the packet. So when telling the script to sleep for 1 ms, the total execution time is in reality 1.5 ms per cycle, making it closer to 666 Hz, not 1000 Hz. So my stating that it is 1000 Hz is not precise, but showing how the timing affects the accuracy was the main point.

So I hope the Teensy can emulate real 1000 Hz!

Link to comment


That's the exact type of thing I was worrying about with the Logitech script. My script actually does 1000 Hz (I was able to measure the average update interval as something like 1.0005 ms), but it still seems that something is interfering in the low-level functioning, which I want to troubleshoot. My script is an automated loop of runs of 1000 mouse data packets to turn 360° in one second, collecting the discrepancy from getpos via condump. The vast majority of these loop iterations yield a packet loss for RInput/m_rawinput 0 on the order of 0.1% or less, but there are rare instances where the packet loss for a loop iteration will be 10% or higher, which is literally 10% of the mouse input data being lost over an entire second. Now, I know this doesn't actually occur with a real mouse, because that kind of catastrophic degradation of input data would be so distinct that it would almost be like a severe lag spike or a sensor malfunction.

 

I have updated my script, but not run it yet, to also collect statistics like the max packet loss per loop iteration and the sample standard deviation, to help tell how often these catastrophic events are occurring and to help analyze how well the script is emulating 1000 Hz USB polling. And I may even spend some time doing some detective work on what exactly is crapping on m_rawinput 0/RInput when we are running our scripts. Worst comes to worst, I will just post some elementary data to at least show my findings with what I feel is an improvement to the Logitech script, but unless I am able to find an explanation or workaround for whatever is causing that unrealistic data for m_rawinput 0/RInput, I don't think I will be satisfied with anything less than the Teensy experiment.
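For illustration, the per-iteration statistics described above could be computed along these lines (illustrative Lua, not my actual script, which isn't shown here):

-- losses = table of packet-loss percentages, one entry per 1000-packet iteration
local function lossStats(losses)
	local n, sum, max = #losses, 0, 0
	for _, v in ipairs(losses) do
		sum = sum + v
		if v > max then max = v end
	end
	local mean, varSum = sum / n, 0
	for _, v in ipairs(losses) do varSum = varSum + (v - mean) ^ 2 end
	return mean, max, math.sqrt(varSum / (n - 1))  -- mean, worst case, sample std dev
end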

 

Unfortunately, I am going out of town for the weekend so it is unlikely I will make any progress or be able to collect/post data until next week.

 

teensy2.0 does 1000hz fine with their sample usb mouse code

 

@volz: well i've never seen anyone but myself do the camera tests properly... they usually just look at a small region of the screen, so each measurement has +-(0.5/refreshrate) error.

 

I had two other such camera tests people had done (one of which I was talking about in my previous post) bookmarked that I felt like were reasonably close to being as experimentally sound as yours. When I have more time, I will do some deep googling and reading back of relevant threads to try and remember how exactly I stumbled upon or was directed to them.

Edited by Vols and Jezuz
Link to comment

well the highspeed camera tests i've seen before, including my own before i started the photodiode+mcu setup, are intrinsically limited to ~1ms accuracy and precision because of 1000fps and the timing of physical events (whether it's motion or clicking a button).

 

actually i'll do the teensy test as well. i'm not convinced that there's anything "wrong" with m_rawinput 1 that causes the 0.0x% deviations in the op. probably related to floating-point stuff

Link to comment
  • Wizard

Not related to the input lag, but I think I'll perform this analysis on one or two more games that use different engines and support both raw and Windows inputs. I have a couple of games in mind that have proven to be extremely precise when I analyzed their sensitivity. Should be interesting to see how the results compare.

Link to comment

Intro

Hope you don't mind me stealing your formatting ;) but my goal was to simulate 500/1000 Hz mouse data with actual 500/1000 Hz USB polling using a Teensy 2.0.
 
Test system
CPU: Intel Core i7-3770K
Memory: 16 GB
GPU: Geforce GTX 680 | Driver: 347.09 Beta
OS: Windows 7 Ultimate SP1 x64
Game: Counter Strike: Global Offensive (Steam version), Exe version 1.35.0.2 (csgo)
Map: Dust II
 
Testing procedure

The example USB Mouse code from the official Teensy site was modified to send 1 s of continuous input data (500 packets of 20 x-counts for 500 Hz, or 1000 packets of 10 x-counts for 1000 Hz), send a mouse button click, collect the data, calculate the packet discrepancy, then reset and restart the loop. At sensitivity 1.63636364, these 1 s intervals of input data will give 360° rotation in-game. Thus, by using "setang 0 0 0" at the start of each iteration, then "clear; getpos; condump" as triggered by the mouse button click after each 1 s interval of input data, the discrepancy in the view angle as collected from the console dump text file can be calculated, and the results averaged over all intervals in the loop.
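As an illustration of that discrepancy calculation (the measured yaw below is a made-up example, not actual data):

-- setang 0 0 0, send 10000 counts (nominally 360 deg), then read the yaw back via getpos/condump
local expected = 360.0
local measured = 359.28                        -- example yaw parsed from the condump
print((expected - measured) / expected * 100)  -- 0.2% of the counts were lost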

 

Results

The presented data is for 100 x 1 s data collection intervals, which is 50,000 mouse data packets for 500 Hz and 100,000 mouse data packets for 1000 Hz.

 

iQ3UGoE.png

 

kj8omCE.png

 

Conclusion

As I suspected, the mouse data packet loss as previously reported was greatly exaggerated by the script's shortcomings in accurately modeling USB polling. I believe the Teensy 2.0 is a far superior method for measuring mouse data packet loss: unlike the script, it not only sends input data with about the same precision as a 1000 Hz USB mouse (1 ms update interval), but it also actually sends the data via USB polling, so the CPU prioritization is accurately reflected.

 

While m_rawinput 1 undoubtedly still provides the most accurate translation of mouse data, with practically no packet loss, RInput and m_rawinput 0 only have ~0.2-0.3% packet loss. This is about 10x less than was measured using the script simulation. While RInput may have slight packet loss, some users perceive a very noticeable improvement in input lag and/or the difficult-to-describe overall 'mouse feel'. I suspect that perceived difference in RInput versus m_rawinput 1 is very dependent on the individual's unique peripherals/hardware/firmware/drivers/software/OS/game configuration. If packet loss were ~1-3% as previously described, it would be hard to justify this tradeoff. But considering the findings of this testing that it is more on the scale of 0.2-0.3%, I believe there can be enough improvement for some users in perceived input lag and/or mouse feel to be worth the small amount of packet loss associated with RInput.

 

The actual mouse packet discrepancy for RInput, as measured by actual 500/1000 Hz USB polling, is about an order of magnitude less than was previously presented. There is indeed still a difference from m_rawinput 1, which provides no packet loss, but it's generally around 0.2-0.3%, which for some users may be worth the tradeoff for less input lag and/or better mouse feel as compared to m_rawinput 1.

Edited by Vols and Jezuz
Link to comment
  • Wizard

Very cool, thanks! BTW, did you do this with vsync on or off?

 

Just one little note, and I don't know if this will make any difference. But you should try to do the test sending 100k packets in one go, with sensitivity 0.0163636364, and see how many counts you are off after turning 360 degrees. I got quite varied results doing only 1000 packets, but the average might be around the same.
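For what it's worth, that suggested sensitivity works out the same way (assuming the same 10 counts per packet as in the 1000 Hz Teensy runs, and the default m_yaw of 0.022):

print(100000 * 10 * 0.0163636364 * 0.022)  -- ~360 degrees from 100k packets in one go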

Link to comment

Vsync off. I don't think very many, if any, serious CS:GO players will use Vsync, so I didn't collect any data for Vsync on. The added input lag is simply not worth whatever improvement there may be in packet discrepancy.

 

My intuition is that 100k packets at sensitivity 0.0163636364 would give the same results with the Teensy, but I'll have to try that another time as I am all data'ed out. I've read that the Source engine doesn't deal well with sensitivities < 1, so perhaps that's why your results varied, but I'm not sure I believe that. It could also just be because of the inadequacies of attempting to emulate USB polling with a script. I tried giving the script I previously mentioned every CPU priority from Low to Realtime, and it still gave much larger packet loss percentages than when using actual USB polling with the Teensy.

Edited by Vols and Jezuz
Link to comment
  • Wizard


I'm waiting for my Teensy, so I will do some re-testing when I get it. I've done a few tests similar to my original ones here on Battlefield 4, and the loss seems to be significantly lower. So the Source engine probably plays a big part indeed.

Link to comment

I had some interesting findings with how the average absolute** packet discrepancy of m_rawinput 0 and RInput varies with the fps_max value. I'm not sure exactly what causes the variance with m_rawinput 0, but with RInput it involves DLL calls to Windows functions, so if the input data read occurs in the tiny gap of time between GetCursorPos and SetCursorPos, then that packet will be lost. So the more frames that are being rendered, the more often this occurs. So I'm now using fps_max 375 with RInput via sourceGL, because the fps will almost always stay over 300, under which I can detect a difference in aim feel/appearance, while keeping the average packet discrepancy lower than higher fps caps would. qsxcv has mentioned that it would be possible to update the RInput code to fix this.

 

**Previously I had been erroneously calculating all discrepancies as loss, because my script was taking the absolute value of the discrepancy in angle (because I had noticed that quite often it would be something like -0.000014, due to minor rounding errors and such). After looking through the data I noticed that m_rawinput 0 would occasionally have a positive packet discrepancy, almost like it was experiencing random positive acceleration. So I've shown histograms of the packet loss % occurrence. You can see that these positive discrepancies do not occur with RInput.

 

Note that m_rawinput 1 invariably has 0.000% loss in all my tests, so I have not included its data since it would always be the same. Also, these tests are for 1,000,000 data packets, or 1000 separate 1000 ms collections at a 1 ms packet interval (1000 Hz). Final note: with fps_max 999 I get ~600-800 fps.

 

E2eGDlI.png

 

m_rawinput 0 histograms:

fps_max 60 http://i.imgur.com/sP1bdus.png

fps_max 120 http://i.imgur.com/ucsQqCG.png

fps_max 300 http://i.imgur.com/g5OaihW.png

fps_max 375 http://i.imgur.com/fEXW70F.png

fps_max 500 http://i.imgur.com/90theYv.png

fps_max 999 http://i.imgur.com/zuSZ4cS.png

 

RInput 1.31 histograms:

fps_max 60 http://i.imgur.com/1xRq7Wm.png

fps_max 120 http://i.imgur.com/yO7ed60.png

fps_max 300 http://i.imgur.com/76EAgsP.png

fps_max 375 http://i.imgur.com/gdd9hZS.png

fps_max 500 http://i.imgur.com/yUbSlBB.png

fps_max 999 http://i.imgur.com/VCMH0cD.png

Edited by Vols and Jezuz
Link to comment

what's the framerate here?

fullscreen or fullscreen windowed?

 

how do you get both cursors to show up?

 

 

ideally, if the game runs at infinite fps and there's nothing stupid going on (e.g. vsync), 

1. the windows cursor and the game cursor should match perfectly at the top of the screen

2. the windows cursor should lag behind the game cursor by one refresh period (well actually refresh period minus vblank time)

 

this is because the position of the windows cursor is determined at the start of the refresh cycle

Edited by qsxcv
Link to comment


Two cursors: SHIFT + F2 = demo playback window (in fullscreen mode).

The delay feels much higher than with CSS.

fps in the menu: 300.

VSync off.

Edited by Rud
Link to comment
