eDPI stands for ‘effective Dots Per Inch,’ a value used to express the effective sensitivity of any given player, regardless of hardware or software settings. The formula is DPI * ingame sensitivity.
In gaming, users can use many different hardware/software settings combinations. One user might have his mouse set to be extremely sensitive at the hardware level (meaning a high DPI; Dots Per Inch) with a very low ingame sensitivity, while another user might go for the reverse. eDPI is a handy tool that takes all of these settings into account, making it possible to compare the actual sensitivity of different setups.
- DPI = Dots Per Inch; your mouse’s sensitivity on a hardware level
- eDPI = DPI multiplied by ingame sensitivity
- eDPI is used to easily compare ‘true sensitivities’ across different setups in the same game
- eDPI cannot be used to compare across games, since different games handle sensitivity differently
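To make the formula concrete, here is a minimal Python sketch; the `edpi` helper and the example values are purely illustrative and not part of any game or tool.

```python
def edpi(dpi: float, ingame_sens: float) -> float:
    """Effective DPI: mouse DPI multiplied by the ingame sensitivity."""
    return dpi * ingame_sens

# Two very different hardware/software combinations, same effective sensitivity
print(edpi(400, 2))     # 800
print(edpi(1600, 0.5))  # 800.0
```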
What is DPI?
DPI stands for Dots Per Inch. If you’ve got your mouse set at 1000 DPI it means that the mouse will ‘measure’ 1000 points of movement per inch that you move the mouse. The higher your DPI is, the more your cursor moves when you move the mouse. The higher the DPI, the more ‘sensitive’ the mouse, from a hardware point of view.
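As a rough illustration, the number of movement counts a mouse reports scales directly with its DPI. The snippet below is a simplified model only; it ignores OS pointer settings and ingame sensitivity, which also affect how far the cursor actually travels.

```python
def counts_reported(inches_moved: float, dpi: int) -> float:
    """Counts a mouse reports for a physical movement, ignoring OS/ingame scaling."""
    return inches_moved * dpi

print(counts_reported(1.0, 400))   # 400.0 counts for one inch at 400 DPI
print(counts_reported(1.0, 1600))  # 1600.0 counts for the same inch at 1600 DPI
```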
DPI is always set on the mouse itself or via software (see How to find DPI and change it). In general, the DPI of your mouse will determine how sensitive it is throughout your entire system. Almost all pro gamers use a DPI of 1600 or lower, since higher DPI settings can cause issues such as smoothing (What is mouse smoothing?).
Most mice have a DPI button located below the scroll wheel or on the underside of the mouse, allowing you to cycle between different DPI settings on the fly.
Note: In PC gaming, CPI (Counts Per Inch) is used interchangeably with DPI. CPI is the same as DPI in this context.
Sensitivity (or ‘sens’) is the ingame sensitivity setting. Unlike DPI, this only applies in the game you set it in, so you can use different sensitivities in different games at the same time. Different games also measure sensitivity differently, so a sensitivity of ‘1’ in game A won’t necessarily mean the same as ‘1’ in game B.
Comparing DPI or ingame sensitivity on its own is usually a bad idea. Player A can have his mouse set to 1600 DPI with an ingame sensitivity of 2, whilst player B can have her mouse set to 400 DPI with an ingame sensitivity of 8. These look like wildly different configurations, but both work out to an eDPI of 3200 (1600 * 2 = 400 * 8 = 3200), so in reality both players’ mice will be equally sensitive to movement. This is why we use eDPI to compare sensitivities within the same game.
eDPI = DPI * Sensitivity
Gamers like to compare settings, gear, and so forth, and since comparing raw sensitivity and DPI values can get confusing, we use eDPI when comparing ‘true sensitivities.’
eDPI stands for effective Dots Per Inch, and it’s calculated by multiplying the mouse DPI by the ingame sensitivity. This gives gamers a way of comparing the true sensitivity of different players, regardless of their hardware or software settings. Note that eDPI is game-specific: different games calculate sensitivity differently, so eDPI is only useful for comparing the actual sensitivities of different players within the same game.
As an example, we’ve listed three different (fictional) players using the same eDPI, with drastically different settings.
| Player | DPI | Ingame sensitivity | eDPI |
|---|---|---|---|
| Player A | 400 | 2 | 400*2 = 800 |
| Player B | 800 | 1 | 800*1 = 800 |
| Player C | 1600 | 0.5 | 1600*0.5 = 800 |
As you can see, these three players use very different setups, but if you were to test all three setups, you wouldn’t notice a difference in the actual, real-life sensitivity of their mouse setups. For this reason, eDPI gets used to compare effective sensitivities.
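If you want to verify the table yourself, a short Python loop over the same fictional players reproduces the numbers; the data below simply mirrors the example above.

```python
# Recompute the table: different DPI/sensitivity pairs, identical eDPI
players = {
    "Player A": (400, 2),
    "Player B": (800, 1),
    "Player C": (1600, 0.5),
}

for name, (dpi, sens) in players.items():
    print(f"{name}: {dpi} DPI x {sens} sens = {dpi * sens} eDPI")

# Player A: 400 DPI x 2 sens = 800 eDPI
# Player B: 800 DPI x 1 sens = 800 eDPI
# Player C: 1600 DPI x 0.5 sens = 800.0 eDPI
```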