My LG G Stylo (H634) has a 720x1280 resolution, a 5.7-inch screen, and a real PPI of 258. With this information, the phone should report:
DP width: 240
DP height: 240
Density: 1.5
Density PPI: 240
However, when I run my test on the real device, it gives me:
DP width: 257
DP height: 258
Density: 2.0
Density PPI: 320
Now, when I run an emulated version of my phone with the same specs, I get the first set of metrics (density: 1.5, PPI: 240, etc.), which are the proper values. I'm not sure why this is happening, but can anybody explain why an emulated version is more accurate than the real device?
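For reference, a minimal sketch of the kind of test that reads these values (the activity name and log tag are placeholders, not the actual test code):

```java
import android.app.Activity;
import android.os.Bundle;
import android.util.DisplayMetrics;
import android.util.Log;

public class MetricsActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        DisplayMetrics metrics = getResources().getDisplayMetrics();

        // Logical density values
        Log.d("Metrics", "density: " + metrics.density);       // 2.0 on the device, 1.5 on the emulator
        Log.d("Metrics", "densityDpi: " + metrics.densityDpi); // 320 on the device, 240 on the emulator

        // Physical density values, in pixels per inch
        Log.d("Metrics", "xdpi: " + metrics.xdpi);              // ~257 on this screen
        Log.d("Metrics", "ydpi: " + metrics.ydpi);              // ~258 on this screen

        // Screen size in dp, derived from the pixel size and the logical density
        Log.d("Metrics", "width (dp): " + metrics.widthPixels / metrics.density);
        Log.d("Metrics", "height (dp): " + metrics.heightPixels / metrics.density);
    }
}
```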
"but can anybody explain why an emulated version is more accurate than the real device?"
It's not. The device is what the device is.
On DisplayMetrics, xdpi and ydpi are the actual physical density values. For example, the documentation for xdpi has:
The exact physical pixels per inch of the screen in the X dimension.
In your question, you state that the device has "a real PPI of 258". That fits the values that you are getting from DisplayMetrics, bearing in mind that pixels are rarely square, so the xdpi and ydpi values are rarely exactly equal.
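For example, you can feed xdpi and ydpi back into a size calculation and see that they line up with the advertised 5.7-inch diagonal (a rough sketch; on some devices widthPixels and heightPixels exclude system bars, so treat the result as approximate):

```java
import android.util.DisplayMetrics;

public final class ScreenSize {
    private ScreenSize() {}

    // Estimate the physical diagonal in inches from the physical density values.
    public static double diagonalInches(DisplayMetrics metrics) {
        double widthInches = metrics.widthPixels / metrics.xdpi;    // 720 / ~257  ≈ 2.8"
        double heightInches = metrics.heightPixels / metrics.ydpi;  // 1280 / ~258 ≈ 5.0"

        // Pythagoras: roughly 5.7" for this screen, matching the spec sheet
        return Math.sqrt(widthInches * widthInches + heightInches * heightInches);
    }
}
```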
The value for density is based on a manufacturer setting (ro.sf.lcd_density in /system/build.prop, I think). Why LG decided to go with xhdpi instead of hdpi, I cannot say. If I had to guess, they felt that existing apps looked better on the device with that logical density. The emulator uses its own algorithm. Another manufacturer with a similar screen might choose hdpi (what the emulator chose).
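Those generalized buckets correspond to constants on DisplayMetrics, which makes it easy to see which one a given device landed in (an illustrative helper covering only the buckets mentioned here):

```java
import android.util.DisplayMetrics;

public final class DensityBucket {
    private DensityBucket() {}

    // Map densityDpi to the generalized bucket names discussed above.
    public static String name(int densityDpi) {
        switch (densityDpi) {
            case DisplayMetrics.DENSITY_MEDIUM: return "mdpi (160)";
            case DisplayMetrics.DENSITY_HIGH:   return "hdpi (240)";  // what the emulator chose
            case DisplayMetrics.DENSITY_XHIGH:  return "xhdpi (320)"; // what LG chose for this device
            default:                            return "other (" + densityDpi + ")";
        }
    }
}
```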
The value for densityDpi is driven directly from density.
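Concretely, density is the scale factor relative to mdpi (DisplayMetrics.DENSITY_DEFAULT, or 160 dpi), so on typical devices the two values track each other: 2.0 × 160 = 320 and 1.5 × 160 = 240. A small sketch of that relationship:

```java
import android.util.DisplayMetrics;

public final class DensityCheck {
    private DensityCheck() {}

    // density is the scale factor relative to mdpi (160 dpi), so the two
    // values should agree: 2.0 * 160 = 320, 1.5 * 160 = 240.
    public static boolean isConsistent(DisplayMetrics metrics) {
        int derivedDpi = Math.round(metrics.density * DisplayMetrics.DENSITY_DEFAULT);
        return derivedDpi == metrics.densityDpi;
    }
}
```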