What is DPI & eDPI?

DPI stands for Dots Per Inch, which is a measurement of how many 'counts' of movement your mouse registers per inch of physical travel. eDPI stands for 'effective Dots Per Inch,' which is a formula used to calculate the effective sensitivity of any given player, regardless of hardware or software settings. We explain this in depth below.


Note: in PC gaming, CPI (Counts Per Inch) is used interchangeably with DPI.

DPI stands for Dots Per Inch. If you've got your mouse set at 1000 DPI it means that the mouse will 'measure' 1000 points of movement per inch that you move the mouse. The higher your DPI, the more your cursor moves for the same physical motion, so the higher the DPI, the more 'sensitive' the mouse, from a hardware point of view. If your mouse is set to 800 DPI the cursor will travel 800 pixels ('counts/dots') on screen per inch of movement on your desk, assuming no scaling or acceleration.
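The relationship can be sketched in a few lines of Python (the function name and figures are just for illustration, and it assumes a 1:1 count-to-pixel mapping with no acceleration):

```python
def cursor_travel(dpi: int, inches_moved: float) -> float:
    """On-screen travel in counts/pixels: DPI times inches of physical movement."""
    return dpi * inches_moved

print(cursor_travel(800, 1.0))   # 800.0 counts for one inch at 800 DPI
print(cursor_travel(1600, 0.5))  # 800.0 counts: double the DPI, half the movement
```

Note that halving the physical movement at double the DPI produces the same on-screen travel, which is exactly why DPI alone doesn't tell you how 'sensitive' a setup feels.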

DPI is always set on the mouse itself (or via software) and as such translates to the sensitivity of the mouse in Windows, internet browsers, and so on. So in general the DPI of your mouse will determine how sensitive it is throughout your entire system.

It is a common marketing gimmick to advertise gaming mice with absurdly high DPI counts, but most professional gamers set their DPI anywhere between 400 and 1600, making those extreme values virtually useless. Aside from that, some sensors introduce smoothing at higher DPI levels, so it's always safer to stay at 1600 or lower. If you want a higher overall sensitivity you can always raise your ingame sensitivity.

Most mice have a DPI button located below the scroll wheel, allowing you to cycle between different DPI settings.


Sensitivity (or 'sens') is the ingame sensitivity setting. Unlike DPI, this only applies in the game where you set it, so it's perfectly possible to have multiple different sensitivities across a variety of games installed on the same machine whilst using the same mouse. Different games also measure sensitivity differently, so a sensitivity of '1' in game A won't necessarily mean the same in game B.

Looking at DPI or ingame sensitivity on its own to compare is usually a bad idea. Player A can have his mouse set to 1600 DPI with an ingame sensitivity of 2, whilst player B can have her mouse set to 400 DPI with an ingame sensitivity of 8. These look like wildly different configurations, but in reality both players’ mice will be equally sensitive to movement.



eDPI = DPI * Sensitivity

Since gamers like to compare settings, gear, and so forth, and comparing raw sensitivity and DPI can get confusing, we use eDPI when comparing 'true sensitivities.' eDPI stands for effective Dots Per Inch, and it's calculated by multiplying the mouse DPI by the ingame sensitivity. This gives gamers a way of comparing the true sensitivity of different players, regardless of their hardware or software settings. Note that eDPI is game-specific: different games calculate sensitivity differently, so eDPI is only useful for comparing the actual sensitivities of different players within the same game.

Player A has a DPI of 1600, and an ingame sens of 2.

Player B has a DPI of 400, and an ingame sens of 8.

Player A's eDPI = 3200 (1600 * 2)

Player B's eDPI = 3200 (400 * 8)

Both players have the same true sensitivity (eDPI), though their DPI and ingame sens are very different.
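The calculation above can be written as a small Python helper (the function and player names are illustrative):

```python
def edpi(dpi: int, sens: float) -> float:
    """Effective DPI: mouse DPI multiplied by ingame sensitivity."""
    return dpi * sens

player_a = edpi(1600, 2)  # 3200.0
player_b = edpi(400, 8)   # 3200.0
print(player_a == player_b)  # True: identical true sensitivity
```

Despite the wildly different DPI and sens values, the product is identical, so both players' crosshairs move the same distance for the same physical mouse movement.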
