
CS:GO - m_rawinput vs rinput


DPI Wizard

Which input method provides the most accurate sensitivity in CS:GO: built-in raw mouse input on, raw input off, or the rinput program? Find out here!

 

 

Rinput is a third-party program that injects itself into the game to remove mouse acceleration.

 

Test system

CPU: Intel Core i7-3770K

Memory: 16 GB

GPU: Geforce GTX Titan | Driver: 355.82

OS: Windows 10 Professional x64

Game: Counter-Strike: Global Offensive (Steam version), exe version 1.34.9.9 (csgo)

Map: Dust II

 

Test method

This test is done by analyzing the amount of packets lost or gained between the mouse driver and the game, using Logitech's scripting capabilities in their Gaming Software. Each test is performed by sending 100k packets to the game.

Packet discrepancy is not the same as acceleration. Discrepancies are whole reports sent from the mouse that do not make it to the game (or, in some rare cases, get doubled), while acceleration is reports received by the game where movement is added to or subtracted from the packet.
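The distinction can be made concrete with a small sketch (the packet values here are invented for illustration, not taken from the tests):

```python
# Distinguish packet discrepancy from acceleration (illustrative model).
# Each packet is a count value the mouse driver sends to the game.

def discrepancy(sent_packets, received_packets):
    """Whole packets lost (or gained) between driver and game."""
    return len(sent_packets) - len(received_packets)

def accelerated(sent_packets, received_packets):
    """Movement added/subtracted inside packets that did arrive."""
    return sum(received_packets) - sum(sent_packets[:len(received_packets)])

sent = [10, 10, 10, 10]          # four 10-count packets from the driver
lossy = [10, 10, 10]             # one packet never reached the game
accel = [12, 12, 12, 12]         # all packets arrived, but counts were scaled

print(discrepancy(sent, lossy))  # 1 -> one packet lost, no acceleration
print(accelerated(sent, accel))  # 8 -> extra counts added, no packets lost
```

A discrepancy changes how many reports arrive; acceleration changes what is inside the reports that do arrive.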

 

What's not tested

Since this analysis is done directly from the mouse driver, it does not account for anything that happens between the sensor in the mouse and the mouse driver, such as USB latency and sensor inaccuracy.

CS:GO's built-in raw mouse input supposedly suffers from buffering tied to the FPS, but I have not tested this, nor experienced it. Other sources indicate it might be a matter of 1-2 ms of lag, which is insignificant compared to the findings here.

The tests are also done offline, so most variables caused by online play are removed. If anything, these results represent a best-case scenario.

 

Testing procedure

At first I was set on doing a test where 1000 packets were sent to the game, then counting the discrepancy. The results turned out to vary a lot, so I needed to increase the sample size.

I did a lot of testing with 10k packets, doing each test 10 times. The results were a lot more consistent, but very time-consuming.

So I tested with 100k packets one time for each test, and compared the results to the average of the 10k packets tests. As they were pretty much identical, I used this method for the final analysis.
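The effect of sample size on measurement stability can be illustrated with a small simulation (the 1.5% loss rate below is an invented figure for illustration, not a measured one):

```python
import random

# Simulate measuring a hypothetical 1.5% packet-loss rate with different
# sample sizes. Small samples fluctuate a lot; large samples stabilize,
# which is why the final tests used 100k packets per run.

def measure_loss(n_packets, true_loss=0.015, rng=random.Random(42)):
    """Fraction of packets lost in one simulated test run."""
    lost = sum(1 for _ in range(n_packets) if rng.random() < true_loss)
    return lost / n_packets

for n in (1_000, 10_000, 100_000):
    runs = [measure_loss(n) * 100 for _ in range(5)]
    print(f"{n:>7} packets: " + ", ".join(f"{r:.2f}%" for r in runs))
```

The 1k-packet estimates scatter visibly around 1.50% while the 100k-packet estimates barely move, matching the experience described above.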

 

Test layout

There are four input methods tested here:

  • m_rawinput 1 - CS:GO's built-in raw mouse input capability
  • m_rawinput 0 - By turning off raw mouse input, the game gets the mouse data from Windows rather than directly from the mouse driver
  • rinput.exe 1.2 - This program injects itself between the mouse driver and the game, and this is the non-sequential version
  • rinput.exe 1.31 - This is the sequential version of rinput

Each input method is tested with packet sizes of 1, 10 and 100. The packet size determines how many counts the game should move the cursor/crosshair in one instance.

 

In addition, each of these combinations is tested with 1, 2 and 4 ms delay, equaling polling rates of 1000 Hz, 500 Hz and 250 Hz.

 

And lastly, all these tests were done both with and without vertical sync (vsync). All vsync tests were done with triple-buffered vsync @ 120 Hz, but I have tested different refresh rates, and 60, 120 or 144 Hz does not make much of a difference.
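Taken together, the test matrix above can be enumerated in a short sketch (the method and parameter names are taken from this article):

```python
from itertools import product

# Enumerate the full test matrix described above: 4 input methods x
# 3 packet sizes x 3 polling delays x vsync on/off = 72 combinations.
methods = ["m_rawinput 1", "m_rawinput 0", "rinput 1.2", "rinput 1.31"]
packet_sizes = [1, 10, 100]          # counts moved per injected packet
delays_ms = [1, 2, 4]                # 1000 Hz, 500 Hz and 250 Hz polling
vsync = [False, True]

tests = list(product(methods, packet_sizes, delays_ms, vsync))
print(len(tests))                    # 72

for method, size, delay, vs in tests[:2]:
    rate = 1000 // delay             # delay in ms -> polling rate in Hz
    print(f"{method}, size {size}, {rate} Hz, vsync {'on' if vs else 'off'}")
```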

 

Results

These graphs show the average discrepancy for each polling rate. The numbers are averaged from the 1, 10 and 100 packet size tests. The raw data is shown under the graphs.

As these are averages, the reality is that you might experience anything from close to no packet loss, to over twice the average. For methods other than "m_rawinput 1" there is a lot of random fluctuation.

 

Vsync OFF:

[Graphs: average packet discrepancy per polling rate, vsync off]

 

Vsync ON:

[Graphs: average packet discrepancy per polling rate, vsync on]

 

Video

This video visualizes the discrepancy:

 

Conclusion

"m_rawinput 1" provides by far the most accurate and consistent interpretation of the signals from the mouse driver. All tests point to the fact that this feature does exactly what it is supposed to do: get the raw mouse data without anything interfering in between.

 

"m_rawinput 0" is better than rinput.exe in all tests; in some cases rinput.exe has triple the loss of "m_rawinput 0".

 

Rinput 1.2 seems to yield a slightly better result than 1.31, especially at 1000 Hz.

 

Vsync has a huge impact on everything except "m_rawinput 1".

 

It's important to note that 1% loss does not necessarily mean that you move 1% shorter. When you play, the packets sent from your mouse will vary greatly in size, ranging from 1 to over 1000 counts depending on your settings. If you lose a 1-count packet, it won't make much of a difference, but if you lose a 1000-count packet, it will have a huge impact on how far you move.
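The point can be worked through with numbers (invented here for illustration): the cost of a lost packet is its count value, so the same 1% packet loss translates to very different movement loss depending on which packet is dropped.

```python
# Why 1% packet loss is not the same as moving 1% shorter: the movement
# lost depends on the count value of the dropped packet.

slow_aim = [1] * 100                      # 100 small packets while micro-aiming
mixed = [1] * 99 + [1000]                 # mostly small packets, one fast flick

def movement_lost(packets, lost_indices):
    """Percent of total movement lost when the given packets are dropped."""
    lost = sum(packets[i] for i in lost_indices)
    return 100 * lost / sum(packets)

# Losing 1 of 100 packets is always "1% packet loss", but:
print(movement_lost(slow_aim, [0]))       # 1.0   -> barely noticeable
print(movement_lost(mixed, [98]))         # ~0.09 -> a dropped 1-count packet
print(movement_lost(mixed, [99]))         # ~91.0 -> the 1000-count flick packet
```

Dropping the single large packet wipes out almost all of the intended movement, even though the packet-loss percentage is identical in every case.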

 

If you want accuracy, there is no doubt: use "m_rawinput 1". Around 1-2 percent packet loss is quite significant, especially since the actual loss fluctuates, making the movement inconsistent.

 

If you want some other tests done, please let me know!


User Feedback

Recommended Comments



Hello,

 

I've recorded two videos with the following settings: m_rawinput 1, fps_max 0 and V-Sync 0. Just look how the mouse delay increases when the full 3D scene is rendered. And in this situation the game's cursor should represent the crosshair behavior, no? I write about this because I can definitely feel the difference in mouse response between 1.6 and CS:GO.

 

https://youtu.be/zEwDSlyD85Y

 

Edited by nicolovbg
Link to comment
kkkkkkkkkkkkkkkkkkkkkkk

 

volz and i fixed the issues with m_rawinput 0 using a modified rinput

 

here's a dll

https://drive.google.com/file/d/0B5_qzxdnJV0PNDZsejlsTGFQUmM/view?usp=sharing < this was a debug build which doesn't work unless you have msvcr debug libraries and stuff


 

source:


or


and replace rawinput.cpp with this:


 

i guess volz will commit the changes to his fork when he sees this

Edited by qsxcv
Link to comment

kkkkkkkkkkkkkkkkkkkkkkk

 

volz and i fixed the issues with m_rawinput 0 using a modified rinput

 

here's a dll

https://drive.google.com/file/d/0B5_qzxdnJV0PNDZsejlsTGFQUmM/view?usp=sharing

 

source:

https://github.com/abort/Rinput-Library

or

https://github.com/VolsandJezuz/Rinput-Library

and replace rawinput.cpp with this:

http://pastebin.com/Kjb7vWhj

 

i guess volz will commit the changes to his fork when he sees this

Better uninstall sourceGL then and see if people get VAC banned before using it again?

Link to comment

I replaced the Rinput.dll with yours, started CS:GO, typed csgo.exe and got: failed to load Dynamic Link Library. Is everything alright with your file?

Link to comment

Teensy tests show the new RInput has <0.001% packet loss across all FPS.

 

sourceGL has been updated with the new RInput: http://sourcegl.sourceforge.net/

 

Here is the new RInput dll on github: https://github.com/VolsandJezuz/Rinput-Library/releases/tag/v1.35

 

 

This is unnecessary imo

I'm curious how SourceGL works/prioritizes itself when someone is video encoding (IE. livestreaming). Would it result in more packet loss (Also, <0.001% in a new update is great)? Would it be beneficial to have SourceGL run at Above Normal or High priority?

Link to comment

sourceGL is basically idling once a game is launched, so it's probably best to have it at Normal or Below Normal priority. Using sourceGL for RInput versus just using RInput.exe will have no difference in packet loss, sourceGL simply automates the process.

Link to comment

sourceGL is basically idling once a game is launched, so it's probably best to have it at Normal or Below Normal priority. Using sourceGL for RInput versus just using RInput.exe will have no difference in packet loss, sourceGL simply automates the process.

I meant using SourceGL+RInput together while livestreaming. As livestreaming is a very intensive process and I'm unsure if the video encoder thread might take priority over SourceGL+RInput a few times and result in some packet loss.

Link to comment

Everything works fine. Guys, your work is invaluable. I can finally play with 99.99% accuracy and no delay at all; I am so sensitive to those milliseconds. :/ Thank you a lot! (:

Link to comment
  • Wizard

 

kkkkkkkkkkkkkkkkkkkkkkk
 
volz and i fixed the issues with m_rawinput 0 using a modified rinput
 
here's a dll
https://drive.google.com/file/d/0B5_qzxdnJV0PNDZsejlsTGFQUmM/view?usp=sharing < this was a debug build which doesn't work unless you have msvcr debug libraries and stuff
 
source:
or
and replace rawinput.cpp with this:
 
i guess volz will commit the changes to his fork when he sees this

 

This is awesome! Will test it a bit later ;)

Link to comment

I meant using SourceGL+RInput together while livestreaming. As livestreaming is a very intensive process and I'm unsure if the video encoder thread might take priority over SourceGL+RInput a few times and result in some packet loss.

 

sourceGL injects the RInput dll into the game exe just like RInput.exe. Therefore sourceGL having a higher priority can only hurt things, if it were to take CPU time away from the game, but this shouldn't really happen because like I said, sourceGL is basically not doing anything after game startup. And with the way m_rawinput 1 and RInput work, mouse packets shouldn't be lost if the video encoder is taking some CPU time away from the game. Instead, it would just cause fewer FPS, so the mouse packets could have slightly more input delay just from less FPS.

Link to comment

sourceGL injects the RInput dll into the game exe just like RInput.exe. Therefore sourceGL having a higher priority can only hurt things, if it were to take CPU time away from the game, but this shouldn't really happen because like I said, sourceGL is basically not doing anything after game startup. And with the way m_rawinput 1 and RInput work, mouse packets shouldn't be lost if the video encoder is taking some CPU time away from the game. Instead, it would just cause fewer FPS, so the mouse packets could have slightly more input delay just from less FPS.

Alright, good to know. I'm just trying to figure out why using rInput from SourceGL in L4D2 makes it feel like the mouse is harder to control, as opposed to m_rawinput 1.

 

EDIT: Yeah, I can say without a doubt that rInput (using SourceGL) while livestreaming in L4D2 feels really hard to control. Without livestreaming I can control my mouse just fine using rInput, but while streaming it feels easier to use m_rawinput.

Edited by ball2hi
Link to comment

Alright, good to know. I'm just trying to figure out why using rInput from SourceGL in L4D2 makes it feel like the mouse is harder to control, as opposed to m_rawinput 1.

 

EDIT: Yeah, I can say without a doubt that rInput (using SourceGL) while livestreaming in L4D2 feels really hard to control. Without livestreaming I can control my mouse just fine using rInput, but while streaming it feels easier to use m_rawinput.

 

RInput involves extra calls to the Win32 API functions GetCursorPos and SetCursorPos to feed the game the raw input data that m_rawinput 1 does not. The 'old' v1.31 RInput suffered packet loss because it would call GetCursorPos and read the raw mouse data, but it would wait until SetCursorPos was called to reset the raw x and y accumulators, meaning any raw data collected during polls in between these two calls was ignored. The way the Source engine is coded with m_rawinput 0 to translate cursor position changes to in-game aim, the two cursor functions would ideally occur sequentially without interruption. The phenomenon of dropped mouse data packets was roughly proportional to fps_max, because the more frames that were rendered, the more work the CPU was having to do, and thus the higher the occurrence of something requiring CPU wall time in between Get/SetCursorPos.

 

The new RInput fixed the packet drops by instead resetting the raw x and y accumulators during the GetCursorPos call (along with another more subtle and difficult to explain change in how it accumulates the raw input data). Still, RInput requires the two extra Win32 API calls that are vulnerable to high CPU utilization and being delayed momentarily by higher-prioritized processes. Now I really am not knowledgeable enough about the Windows API, Source engine frame rendering, how the streaming program(s) work, or CPU prioritization to know exactly what could be going on if what you're saying is true, but I imagine it's one of these possibilities:

 

1) streaming is causing frequent ~100% utilization of the CPU core that is also handling Get/SetCursorPos, so those functions are subject to having to wait for CPU wall time, which could either delay the frame rendering they're being called for, or cause that cursor change to be delayed to the next frame (thus inducing 1 frame of input lag)

2) your streaming implementation isn't causing ~100% core utilization but happens to be particularly taxing on the Windows API, causing some kind of collision with the Get/SetCursorPos timing

3) the streaming processes have higher priority than the game process, so calls such as Get/SetCursorPos are ceding time to the streaming processes during high CPU utilization (this possibility could be easily fixed by setting the streaming processes lower priority than the game with Prio)
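The accumulator timing described above can be sketched in a toy model (simplified Python, not the real Win32/RInput code — the frame and count values are invented): counts that the mouse delivers between GetCursorPos and SetCursorPos are wiped by v1.31's late reset, while a read-and-reset in GetCursorPos preserves them.

```python
# Toy model of the v1.31 RInput bug: mouse polls add counts to an
# accumulator; the game reads it at GetCursorPos, but v1.31 only reset
# it later, at SetCursorPos, dropping counts that arrived in between.

def run_frames(frames, reset_on_get):
    acc = 0        # raw x/y accumulator (one axis, for simplicity)
    delivered = 0  # counts the game actually received
    for before_get, between_get_and_set in frames:
        acc += before_get                 # polls before the game reads
        delivered += acc                  # GetCursorPos: game reads counts
        if reset_on_get:
            acc = 0                       # new RInput: read and reset together
        acc += between_get_and_set        # polls in the Get/Set gap
        if not reset_on_get:
            acc = 0                       # old RInput: late reset drops them
    return delivered

# Each frame: (counts polled before GetCursorPos, counts polled in the gap).
frames = [(10, 2), (10, 3), (10, 0)]      # 35 counts sent in total

print(run_frames(frames, reset_on_get=False))  # 30 -> 5 counts dropped
print(run_frames(frames, reset_on_get=True))   # 35 -> nothing lost
```

With the late reset, the drop grows with how often work lands in the Get/Set gap, which matches the observation that losses scaled roughly with fps_max.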

 

Whatever the case, you could invest ~$20 in a Teensy and easily program it to perform exact patterns of mouse input to demonstrate with empirical evidence what you say you are experiencing.

Link to comment

actually even with m_rawinput 1,

setcursorpos is still being called (every frame i think)

you can see this if you have a second monitor open; you can see the cursor bouncing on the second screen when you swipe with a high dpi and fps_max 60

 

i think this is so that you don't click on the second window too easily. but why don't they just call clipcursor?

 

Alright, good to know. I'm just trying to figure out why using rInput from SourceGL in L4D2 makes it feel like the mouse is harder to control, as opposed to m_rawinput 1.

 

EDIT: Yeah, I can say without a doubt that rInput (using SourceGL) while livestreaming in L4D2 feels really hard to control. Without livestreaming I can control my mouse just fine using rInput, but while streaming it feels easier to use m_rawinput.

interesting; what are you using, obs?

 

this is something i should definitely try when i resume my input lag measurements

 

 

Hello,

I've recorded two videos with the following settings: m_rawinput 1, fps_max 0 and V-Sync 0. Just look how the mouse delay increased when full 3D scene has been rendered. And in this situation the game's cursor should represent the crosshair behavior, no? I write about this because I can definitely feel the difference between mouse responce in 1.6 and CS GO.

https://youtu.be/zEwDSlyD85Y

 

the csgo cursor gets the cursor position differently from the csgo screen movements though. for example if you use rinput, and set m_rawinput 1, in-game your crosshair is stuck and doesn't move, but the csgo cursor is still fine

Edited by qsxcv
Link to comment

interesting; what are you using, obs?

 

this is something i should definitely try when i resume my input lag measurements

 

 

Yes, OBS. However, I'm using Open Broadcaster Software - Multiplatform (OBS-MP) with two encoders: one for my livestream at 648p@60fps, and an HDD recording at 720p@60fps.

 

I'm noticing in general it's hard to control SourceGL rInput even without streaming. I'm not sure if it's related to my computer or Left4Dead 2, but it's definitely noticeable. 

Link to comment

Yes, OBS. However, I'm using Open Broadcaster Software - Multiplatform (OBS-MP) and doing two encoders. One is my livestream, 648p@60fps, and HDD recording 720p@60fps.

 

I'm noticing in general it's hard to control SourceGL rInput even without streaming. I'm not sure if it's related to my computer or Left4Dead 2, but it's definitely noticeable. 

 

I'm assuming this is with the newly released RInput v1.35/sourceGL v2.06? Like I said, using RInput with or without sourceGL would be the exact same in-game, so if there is some problem in L4D2, it's related to RInput and not sourceGL per se.

 

Have you used RInput with CS:GO or any other game? If so, is it just L4D2 that doesn't feel right with it?

Edited by Vols and Jezuz
Link to comment

actually even with m_rawinput 1,

setcursorpos is still being called (every frame i think)

you can see this if you have a second monitor open; you can see the cursor bouncing on the second screen when you swipe with a high dpi and fps_max 60

 

i think this is so that you don't click on the second window too easily. but why don't they just call clipcursor?

 

I've had the exact same thought before. Calling SetCursorPos with raw input seems like such a bizarre work-around as opposed to simply calling ClipCursor. Maybe ClipCursor has higher overhead or some other strange side effect...?

 

Random thought, but we could make an injectable DLL that peeks and discards SetCursorPos calls by the game and triggers ClipCursor being called for m_rawinput 1. Or even more complicated, we could make RInput detect the m_rawinput setting and work normally if it's 0, and not register itself for raw input or detour/trampoline Get/SetCursorPos and instead use the ClipCursor change I just mentioned if it's 1. Not sure if either of these ideas are entirely possible or worth pursuing though.

Edited by Vols and Jezuz
Link to comment

I'm assuming this is with the newly released RInput v1.35/sourceGL v2.06? Like I said, using RInput with or without sourceGL would be the exact same in-game, so if there is some problem in L4D2, it's related to RInput and not sourceGL per se.

 

Have you used RInput with CS:GO or any other game? If so, is it just L4D2 that doesn't feel right with it?

It was also with the older SourceGL, which is why I stopped using it until this new update, which I'm now using again haha. It just feels very jittery, almost like I'm playing with a 1000 Hz polling rate (i.e. it's reading every motion, so the slightest sneeze can move me off target). It does its job of less input lag (noticeable), but for precision/control it just doesn't feel there.

 

I'm wondering if it might be because of my Logitech Gaming Software installed. LGS has been the center of a lot of problems, I just wish I could have it not detect my G502 and just detect my keyboard so it doesn't do any kind of interfering with my mouse. Going to test and see if this solves the problem (uninstalling LGS).

 

EDIT: Uninstalling LGS (and restarting) seems to have helped a bit with the jitter, but it still feels harder to control, specifically when I'm moving the mouse diagonally (i.e. up-left, down-right) and while moving. Horizontally, it feels fine. I'll do more testing with this, but maybe it's just because I'm not used to the more responsive feel?

Edited by ball2hi
Link to comment

No offense but your experience with the old Rinput/sourceGL isn't really relevant anymore because the new ones are considerably better in terms of accurately handling raw mouse data with ~0 packet loss.

 

Now if the new RInput/sourceGL controlled noticeably worse than m_rawinput 1 in L4D2, then that would be something that might concern me more.

Link to comment

No offense but your experience with the old Rinput/sourceGL isn't really relevant anymore because the new ones are considerably better in terms of accurately handling raw mouse data with ~0 packet loss.

 

Now if the new RInput/sourceGL controlled noticeably worse than m_rawinput 1 in L4D2, then that would be something that might concern me more.

I said also. I tested both. Both were hard to control compared to m_rawinput 1 in L4D2. I'm going to ask someone else who plays L4D2 to test it out and give me their opinion later, using the new sourceGL.

Link to comment

Sorry for the double-post but figured I'd update what I've been testing.

 

So, for the longest time I've been using 400dpi / 5 sens. I think the multiplier of 5 sensitivity is what's causing the issues as when I tried 3600dpi and 0.3 sens, whatever problems I was getting from SourceGL were gone in Left4Dead 2. I think the proper term to use is jitter, as my friend who tried out SourceGL said he felt it at the beginning.

Link to comment

Hi! I play CS:GO with m_rawinput 0 and I feel the delay coming from CS:S. Can you add an option in Rinput to choose between OS acceleration ON or OFF?

At the moment, players who play with OS acceleration have no program that reduces the delay in CS:GO while keeping OS acceleration.

Link to comment

Hello!

 

I decided to try your program in Quake Live. I play with in_mouse -1 (=m_rawinput 0) instead of in_mouse 2 (m_rawinput 1) because I've discovered (of course by feeling; but I am such a sensitive person to this stuff) that RawInput in QuakeLive has relative delay as well.

 

So, what's the problem with your program? Well, I start the game, play a little bit, open Rinput v1.37, type quakelive_steam.exe, the *.dll file successfully injects, I go back to the game, play a little bit (everything's fine), then press ESC. After that, buttons in the main menu cannot be pressed, like MOUSE1 doesn't work at all. Then I press ESC again and go back to the game.

 

And here the real problem starts. Eventually, I can shoot, but I cannot aim with my mouse; the crosshair doesn't move. Could you please explain the problem to me, or try to fix it?

 

Thank you

Link to comment


