What is DPI & eDPI?

DPI stands for Dots Per Inch, a measurement of how many counts your mouse registers for every inch of physical movement. eDPI stands for ‘effective Dots Per Inch,’ a formula used to calculate the effective sensitivity of any given player, regardless of hardware or software settings. We explain both in depth below.

DPI


Note: in PC gaming, CPI (Counts Per Inch) is used interchangeably with DPI.

DPI stands for Dots Per Inch. If your mouse is set to 1000 DPI, it will register (‘measure’) 1000 counts of movement for every inch you move it. The higher the DPI, the further your cursor moves for the same physical movement, so the higher the DPI, the more ‘sensitive’ the mouse is from a hardware point of view. A mouse set to 800 DPI, for example, will travel 800 counts (‘dots’) on screen per inch of movement on your desk.
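As a quick sketch of that relationship (the function name and example values here are just for illustration):

```python
def counts_moved(dpi: int, inches_moved: float) -> float:
    """Counts ('dots') the mouse reports for a given physical movement."""
    return dpi * inches_moved

# A mouse at 800 DPI moved one inch reports 800 counts;
# the same movement at 1600 DPI reports twice as many.
print(counts_moved(800, 1.0))   # 800.0
print(counts_moved(1600, 1.0))  # 1600.0
```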

DPI is always set on the mouse itself (or via its software) and as such determines the sensitivity of the mouse in Windows, internet browsers, and so on. In general, then, the DPI of your mouse determines how sensitive it is throughout your entire system.

It is a common marketing gimmick to advertise gaming mice with absurdly high DPI counts, but most professional gamers set their DPI somewhere between 400 and 1600, making those extreme values virtually useless. Aside from that, some sensors introduce smoothing at higher DPI levels, so it’s always safer to stay at 1600 or lower. If you want a higher overall sensitivity you can always raise your ingame sensitivity.

Most mice have a DPI button located below the scroll wheel, allowing you to cycle between different DPI settings.

Sensitivity

Sensitivity (or ‘sens’) is the ingame sensitivity setting. Unlike DPI, it only applies in the game you set it in, so it’s perfectly possible to have different sensitivities across a variety of games installed on the same machine whilst using the same mouse. Different games also measure sensitivity differently, so a sensitivity of ‘1’ in game A won’t necessarily feel the same as ‘1’ in game B.

Comparing DPI or ingame sensitivity on its own is usually a bad idea. Player A can have his mouse set to 1600 DPI with an ingame sensitivity of 2, whilst player B can have her mouse set to 400 DPI with an ingame sensitivity of 8. These look like wildly different configurations, but in reality both players’ mice will be equally sensitive to movement.

eDPI

eDPI = DPI * Sensitivity

Since gamers like to compare settings, gear, and so forth, and comparing raw sensitivity and DPI can get confusing, we use eDPI when comparing ‘true sensitivities.’ eDPI stands for effective Dots Per Inch, and it’s calculated by multiplying the mouse DPI by the ingame sensitivity. This gives gamers a way of comparing the true sensitivity of different players, regardless of their hardware or software settings.

Player A has a DPI of 1600, and an ingame sens of 2.
Player B has a DPI of 400, and an ingame sens of 8.

Player A's eDPI = 3200 (1600*2)
Player B's eDPI = 3200 (400*8)

Both players have the same true sensitivity (eDPI), though their DPI and ingame sens are very different.
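As a minimal sketch of that calculation (the function name is just illustrative):

```python
def edpi(dpi: int, ingame_sens: float) -> float:
    """Effective DPI: mouse DPI multiplied by ingame sensitivity."""
    return dpi * ingame_sens

# Player A: 1600 DPI at sens 2 -> 3200 eDPI
# Player B:  400 DPI at sens 8 -> 3200 eDPI
print(edpi(1600, 2))  # 3200
print(edpi(400, 8))   # 3200
```

Keep in mind that eDPI comparisons only make sense between players of the same game, since different games measure sensitivity on different scales.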


8 Comments on “What is DPI & eDPI?”

    1. Try googling “How to convert desktop sensitivity to ingame sensitivity” and you will find a couple of threads that will help you with your question. 🙂

  1. One interesting observation is that if you decide to use motion blur reduction by strobing (e.g. ULMB, ELMB, DyAc), mouse microstutters become much more visible without the motion blur to hide them.

    Blur-reduced modes look much better if you use an accurate ultra-high DPI (1600dpi+) and compensate by lowering in-game sensitivity.

    Your Windows desktop will move too fast, but you can configure mouse profiles to automatically raise DPI when your game launches (or steam.exe launches) and lower DPI when your game (or Steam.exe) exits.

    You want your mouse turns left/right to be as smooth (stutterless) as a keyboard strafe left/right. That can be very hard.

    But carefully raising DPI makes ULMB look so much better at all mouse movement speeds (slow movements & flicks), especially if your framerate is matched with your refresh rate. Ultra-high DPI with a recent mouse sensor, on a very good mouse (1000 Hz or true 2000 Hz like the Cougar Minos X5), with very clean mouse feet, on a very good mousepad, makes a huge difference in making ULMB look better; it actually looks like my mouse turns are TestUFO-smooth.

    Thanks,
    Mark Rejhon
    Founder, Blur Busters

  2. I don’t understand my eDPI. I play CSGO and I use 400 DPI and my sensitivity in-game is 1.5, so what’s my eDPI? Can you guys answer me?

  3. eDPI is a strange unit to use; it’s not actually an effective DPI, since in a game we are talking about measurements of angle, not inches on a mousepad. The math is superficially easy, but it’s not intuitive: if someone says they use “960” eDPI, I can’t quickly see in my head what that means. I much prefer a term like “400 equivalent” or cm/revolution.

    If I use 0.6 and a 1600dpi mouse, I can take 1600/400 = 4.0, then do 4.0*0.6 and I get 2.4@400. Note that “edpi” is the special case of “1 equivalent” -> 960@1. The reason I like this is that most people have tried 400dpi (basically every non-gaming mouse is 400dpi), so we can quickly figure out in our minds what it means in real terms, and people not familiar with “pro settings” don’t have to do any heavy lifting; they just drop their mouse down to 400dpi and they can feel the sens.

    Even better is cm/rev because it’s game-agnostic: I can just do `2.54 × 360 / (sens × yaw × dpi)`, where yaw is 0.022 (for Counter-Strike), and I get 43.3 cm/rev. Plus, this standard has been used since the early Quake pro days.
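For readers who want to verify the commenter’s numbers, here is a minimal sketch of that cm-per-revolution formula (the yaw default of 0.022 degrees per count is the Counter-Strike value quoted above; other games use different values):

```python
def cm_per_rev(dpi: int, ingame_sens: float, yaw: float = 0.022) -> float:
    """Centimetres of mouse travel needed for one full 360-degree turn.

    yaw is the game's rotation per mouse count, in degrees
    (0.022 for Counter-Strike, as quoted in the comment above).
    """
    return 2.54 * 360 / (ingame_sens * yaw * dpi)

# The example from the comment: sens 0.6 on a 1600 DPI mouse.
print(round(cm_per_rev(1600, 0.6), 1))  # 43.3
```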
