About CaptaPraelium

  1. Perceived sensitivity

    This is what I've been getting at. We choose to match based on some percentage of the monitor space... but why, exactly? What's so special about the boundaries of the display? Not only are they determined by the monitor's dimensions, they also change with the game context (e.g., is my spaceship rolled sideways?). Matching to 100% horizontal is no different from matching to some space above your monitor where you can't see. Which random space? Well, that depends on which monitor you bought. I don't have much time for horizontal matching at this point. Vertical matching at 100% does have one redeeming feature over other monitor match percentages: it matches at the point where the angle to the target is the same angle as the FOV, and that FOV is actually an input in calculating the image displayed, so it has some mathematical relevance. Setting the target in the geogebra demo to the top edge of the screen shows us that we get the same zoom ratio as the direct division of the FOVs - and at this point I can't exclude the possibility that this simple solution is actually the correct formula.
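    That last observation is easy to check numerically. A minimal sketch (the function name and structure are mine, not anything established in this thread): matching at screen fraction p means equating the rotation needed to put the crosshair on a target at that fraction of the half-screen, and at p = 1 (the top edge, i.e. 100% vertical) the arctan and tan cancel, leaving a direct division of the FOVs.

```python
import math

def match_ratio_at_fraction(vfov1_deg, vfov2_deg, p):
    """Sensitivity (cm/360) ratio when matching at screen fraction p.

    p = 1.0 is the top edge of the screen, i.e. 100% vertical monitor match.
    Valid for FOVs below 180 degrees.
    """
    a1 = math.atan(p * math.tan(math.radians(vfov1_deg) / 2))
    a2 = math.atan(p * math.tan(math.radians(vfov2_deg) / 2))
    return a1 / a2

# At p = 1, atan(tan(fov/2)) collapses back to fov/2,
# so the ratio is just fov1/fov2 - the direct division of FOVs.
r = match_ratio_at_fraction(90, 45, 1.0)
print(r)  # 2.0, i.e. exactly 90/45
```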
  2. Perceived sensitivity

    There's a reason basically every operating system has it on: push the mouse faster, the cursor moves faster. It's very intuitive. Acceleration works entirely programmatically, so theoretically speaking you could be accurate with it - if you can control not only the distance of your hand movement, but also the speed and acceleration, it could work. Thing is, you want to remove barriers to accuracy. Have you ever compared PC shooter gameplay to console gameplay of the same game, and been like, wow, mouse aim makes this completely different? I mean, we all know controllers are less accurate, but think of why they're less accurate: because you have to control not only the distance of the stick throw, but also the speed and acceleration. That extra effort is why they're so much less efficient, and it's the same extra effort required to use acceleration. If I had a trackpad, though, where I could completely control the acceleration algorithm... don't get me started. TL;DR: acceleration is like using a controller.
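    To see what "works entirely programmatically" means, here is a minimal sketch of a speed-based gain curve (the constants and function are purely illustrative, not any OS's actual ballistics): the output is a deterministic function of the input counts, so in principle you could learn it - but the same physical distance now lands in different places depending on how fast you moved, which is the extra control burden described above.

```python
def accelerated_step(counts, base_gain=1.0, accel=0.02, cap=3.0):
    """Scale one polling interval's mouse counts by a speed-dependent gain.

    More counts per interval (faster physical movement) -> higher gain,
    with the gain capped so it cannot grow without bound.
    """
    speed = abs(counts)
    gain = min(base_gain + accel * speed, cap)  # capped linear ramp
    return counts * gain

# The same 100 counts of physical travel, moved at two different speeds,
# produce different cursor displacement:
slow = sum(accelerated_step(c) for c in [10] * 10)  # 100 counts, gently
fast = accelerated_step(100)                        # 100 counts, one burst
print(slow, fast)
```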
  3. Perceived sensitivity

    This is what I'm trying (badly, I know) to explain with the crappy picture above. You're actually both right. If everything we were doing were a 2D image - which in fact it is, because it's on a screen - 0%MM would be correct. Heck, that's exactly how 0%MM works: it treats the game world as only its 2D projection and tells us how much we zoomed it in on our flat screen. But because of the distortion we experience by viewing it as a 3D scene (while not changing our focal point when we change FOV, i.e., sitting in the same place all the time), the image becomes stretched, so the cursor feels like it is moving through the 3D space at a different speed. The nature of that 3D space is determined by the distortion, so the distortion is controlling the 'too fast/too slow' feeling we get. (And it's not necessarily just the distortion of our seating position - here I am referring to the difference in distortion between two FOVs.) There's good reason why gear ratio/1:1/100%VFOV/4:3 feel closer and seem to move the image at a speed that feels right.

    0%MM is always exactly right for anything directly under the crosshair. But for anything away from it, 0% is too slow, and the further a target is from the crosshair, the more too slow 0% is. As we monitor match further and further from the crosshair, we get higher and higher sensitivity. If we 'monitor match' at 180 degrees - even though that's not on the monitor - what we get is a sensitivity divider of 1: the SAME sensitivity, the same cm/360, for any FOV. Which is obviously too fast. So what we have here are two known sensitivities. At the centre of the crosshair we have 0%MM; this is as slow as it gets. And at the very edge of the game world/our turning circle we have =cm/360; this is as fast as it gets. And this makes sense: there's no reason we would ever want to go below or above these speeds respectively. So let's graph out a line between those.

    In my crappy mspaint, it's shaped like a tan curve (inverted at the Y axis because it's more intuitive to think of 'left X' than 'negative X'). Somewhere along that line is a place where it's a little too fast, but not too close to the =cm/360 extent, and a little too slow, but not too close to the 0%MM extent. That's the sweet spot... and those places where drimzi's muh feels and clever LGS script make it seem correct - they seem that way because they're close to that sweet spot. The geogebra demo I did shows hints of all of this, but it's not always immediately apparent. We can demonstrate it, though. I'm still working on being able to show my working and give an exact formula that gives exact sensitivities to use.
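    The two extremes described above can be checked numerically. A minimal sketch under my own framing (the function and the 103/50 FOV pair are illustrative choices, not figures from this thread): the sensitivity divider for a match pushed out by factor p slides from the 0%MM zoom ratio near the crosshair toward 1 (identical cm/360) as the match point is pushed out toward 180 degrees.

```python
import math

def sens_divider(vfov1_deg, vfov2_deg, p):
    """Ratio of rotation (hence mouse distance) needed to reach a target
    whose projected offset is p * tan(vfov/2); p = 1 is the screen edge."""
    a1 = math.atan(p * math.tan(math.radians(vfov1_deg) / 2))
    a2 = math.atan(p * math.tan(math.radians(vfov2_deg) / 2))
    return a1 / a2

hi_fov, lo_fov = 103, 50
zoom_ratio = (math.tan(math.radians(hi_fov) / 2)
              / math.tan(math.radians(lo_fov) / 2))

near_zero = sens_divider(hi_fov, lo_fov, 1e-6)  # approaches zoom_ratio (0%MM)
far_out   = sens_divider(hi_fov, lo_fov, 1e6)   # approaches 1 (same cm/360)
print(near_zero, far_out)
```

    In between, the divider rises monotonically with p, which is exactly the "higher and higher sensitivity as we match further from the crosshair" behaviour described above.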
  4. Perceived sensitivity

    Yep. The reason 0% works like this is that it sits at one extent of the logical sensitivities - the lowest extent. Everything is too slow at the centre and moves toward being too fast as it moves outward. Because of the way the distortion curves, the errors in sensitivity curve the same way, so most of the error is away from the centre. Note that the 'sweet spot' is closer to 0% than it is to the opposite extent. This is why, if you must choose one extent over the other, 0% wins easily. The trouble is that we don't continually re-assess our position. We make an estimation, move, and then re-assess. Think of a 'flick shot': you don't slowly move the crosshair over the target, you make one singular 'flick' of the wrist and click a few buttons, all in one instant action, like it's packaged up with a bow. If the system were that we constantly check for the presence of the target under the crosshair, literally any sensitivity would do.
  5. Perceived sensitivity

    I don't know where you got this idea. Aim is critical, especially in tanks. You should use the same formula that you use for infantry. As for movement, you do that with the keyboard... the movement speed is limited by the vehicle and not affected by mouse sensitivity. Experience will trump even a 'perfect' formula... but both together will trump experience alone.

    I did cover this earlier in the thread, but it's long AF, I know. TL;DR: our brain measures distance by the angle between points, so when your FOV changes, the angles change, and that messes with your perception of distance. Distance and speed are linked, so it changes your perception of speed too. If you sit far enough away, pixels will be indistinguishable, but the effect remains.

    This thread is about doing things as scientifically soundly as possible. If we don't propose and test hypotheses, we're doing it wrong! Don't be afraid to speak your mind and make suggestions: if you're wrong, it's just a step toward being right, and if you're right, you'll be helping a LOT. I don't know if I'd call it a controversy, but it's definitely the 64,000 dollar question XD
  6. Perceived sensitivity

    Oh trust me, I'll be using a computer to solve them. I'm no masochist. But I need to MAKE them first
  7. Perceived sensitivity

    Updated version of the geogebra demo. https://www.geogebra.org/m/adAFrUq3 I need to re-make this from scratch with three linked targets whose positions are relative to each of the three focal points. It's a lot of work, so it's gonna take a while. Meantime, this version allows us to reposition the target off-screen, to demonstrate the way the angles scale beyond the visible parts of the projection. It's *far* from perfect but it gives some idea. Funny story: you know how people learn calculus in school and they're like "I'm never gonna need this after school, this is stupid"? Well, not me. I was like, 'I need this for my career'. Decades later, different career, and I'm now having to relearn differentials just for this nerdery. The irony. lol.
  8. Perceived sensitivity

    This has been covered in other threads, I think we're pretty far off topic here sorry.
  9. Perceived sensitivity

    Theoretically, yes. In practice, of course, this is never possible... well, never practical, to move your seat around - and that's why we're nerding out in here trying to find the 'next best thing'
  10. Perceived sensitivity

    //My monitor resolution
    widthpx = 2560
    heightpx = 1440
    //27 inch diagonal, converted to centimetres because Australia. If you prefer inches, just don't do *2.54.
    diagcm = 27*2.54
    //Distance from monitor to eye, also centimetres in this case, measured with a piece of string...
    //actually used my headphone cable and a borrowed Hannah Montana ruler XD Super high tech here
    viewdistance = 36
    //Calculating monitor height from this information so it's more exact than just measuring it. We need monitor height because FOV is usually measured vertically.
    heightcm = (diagcm/sqrt(widthpx^2+heightpx^2))*heightpx
    //Calculating the actual FOV of the eye to the monitor based on the above
    actualfov = 2*arctan((heightcm/2)/viewdistance)
    //Calculating the focal point (ie, the 'optically correct distance' as Valve says it) for a given vfov
    opticallycorrectdistance = (heightcm/2)/tan(vfov/2)
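    For anyone who wants to run that calculation directly, here is the same thing as Python (the monitor numbers are the ones from the post; the 90-degree vfov in the example call is my own illustrative input, not a recommendation):

```python
import math

widthpx, heightpx = 2560, 1440
diagcm = 27 * 2.54   # 27" diagonal in centimetres
viewdistance = 36    # eye-to-screen distance, centimetres

# Monitor height from the pixel counts, more exact than measuring it
heightcm = diagcm / math.hypot(widthpx, heightpx) * heightpx

# Actual FOV subtended by the monitor at the eye (radians)
actualfov = 2 * math.atan((heightcm / 2) / viewdistance)

def optically_correct_distance(vfov_deg):
    """Focal point (the 'optically correct distance') for a given vertical FOV."""
    return (heightcm / 2) / math.tan(math.radians(vfov_deg) / 2)

print(round(heightcm, 1))                        # ~33.6 cm
print(round(math.degrees(actualfov), 1))         # ~50.1 degrees
print(round(optically_correct_distance(90), 1))  # ~16.8 cm
```

    So at this seating position, the monitor subtends roughly a 50-degree vertical FOV - which is why a 90-degree in-game vfov viewed from 36 cm away is distorted.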
  11. Perceived sensitivity

    You're not stupid. It's confusing AF. Short answer is no. Your eye's FOV would have to change to match the current in-game FOV (read: you would have to move your seat back and forth when you right-click and un-right-click)
  12. Perceived sensitivity

    Yes, you're quite right - the distortion is unavoidable. We can never fully avoid its effects, and so I am attempting to minimise them. The issue with monitor matching is that it does not function in the same manner when considering the distortion on both the horizontal and the vertical planes. As skwuruhl has mentioned in the past, the monitor match percentages we have chosen - be they 100% vertical or horizontal or diagonal or whatever - are somewhat arbitrary. What I am attempting to find is a non-arbitrary position at which to match sensitivities, by matching at the position where the errors in sensitivity (too slow/too fast) are minimal across all angles of rotation, regardless of whether that angle falls on-screen at the given FOV.
  13. Optimal Sensitivity and Mouse-sensitivity users

    The answer is: the distance your mouse travels when you rotate your arm through a 90 degree arc. More than this is bad for your body (it over-rotates the shoulder); less than this is bad for your accuracy (lower sensitivity makes for higher accuracy) and your body (it over-stresses the wrist).
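    As a rough worked example of that rule (the 30 cm pivot radius is a made-up figure - measure from your own pivot point, elbow or shoulder, to the mouse sensor): the travel available from a 90-degree sweep is a quarter of the circle's circumference.

```python
import math

def quarter_arc_cm(pivot_radius_cm):
    """Mouse travel available from sweeping the arm through a 90-degree arc."""
    return (math.pi / 2) * pivot_radius_cm

travel = quarter_arc_cm(30)  # hypothetical 30 cm pivot-to-sensor radius
print(round(travel, 1))      # ~47.1 cm of usable travel
```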
  14. Perceived sensitivity

    I treat this as if we have one eye only, and that is our dominant eye. https://en.wikipedia.org/wiki/Ocular_dominance
  15. Perceived sensitivity

    It does, but here is the tricky part: it does if you use 100% VFOV, but for 100% HFOV it is still distorted - distorted by a factor of the aspect ratio (i.e. 1.77777~ for 16:9 monitors) at the greatest extent, i.e. when we are converting 180 FOV to 0 FOV. This is exactly the problem I'm trying to solve right now. I know that it's something to do with the tan function distortion and finding the 'middle ground', but I'm not quite sure how to handle the math behind it. I'm searching for a function that scales between zoom ratio and zoom ratio*aspect ratio along a tan curve, where 45 degrees along that curve will be the minimum error. I'm no @Skwuruhl
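    The aspect-ratio factor can be seen directly from the standard hor+ relation tan(hfov/2) = aspect * tan(vfov/2). A minimal numerical sketch (my own illustration of the limits described above, not the sought-after formula): at very narrow FOVs the horizontal angle is simply aspect times the vertical one, and the factor decays toward 1 as the FOV opens out toward 180.

```python
import math

ASPECT = 16 / 9

def hfov_from_vfov(vfov_deg, aspect=ASPECT):
    """Horizontal FOV for a given vertical FOV under hor+ scaling."""
    half_v = math.radians(vfov_deg) / 2
    return math.degrees(2 * math.atan(aspect * math.tan(half_v)))

narrow = hfov_from_vfov(1.0) / 1.0      # ~1.778: the full aspect-ratio factor
wide   = hfov_from_vfov(170.0) / 170.0  # ~1.03: the factor decays toward 1
print(narrow, wide)
```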