Hey everyone,
I'm building an AR experience on Android that projects stars into the sky, which users can see when they point their phones upwards. The stars should stay in the same spot from the user's point of view (for simplicity's sake, I'm assuming the users stay in the same spot).
Problem
For the most part it seems to work fine when I hold the phone upright. However, when I point upwards at the stars and rotate the phone onto its side (portrait to landscape), the stars rotate away instead of staying in place.
This sounds like a common problem, but I'm having trouble figuring out how to fix it with my limited knowledge of quaternions and 3D space - hoping someone here can help 🙏
How am I implementing it?
I'm using Android's rotation vector sensor and retrieving a quaternion that represents the phone's current orientation in Android's own coordinate definition (a sketch of how the listener gets registered follows the axis definitions below):
X is defined as the vector product Y x Z. It is tangential to the ground at the device's current location and points approximately East.
Y is tangential to the ground at the device's current location and points toward the geomagnetic North Pole.
Z points toward the sky and is perpendicular to the ground plane.
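For context, here's roughly how I register for those readings (simplified sketch; `sensorEventListener` stands in for the actual listener whose onSensorChanged is shown further down):
// Simplified sketch of the sensor setup; `sensorEventListener` is the
// SensorEventListener implementing the onSensorChanged shown below.
val sensorManager = getSystemService(Context.SENSOR_SERVICE) as SensorManager
val rotationSensor = sensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR)
rotationSensor?.let {
    sensorManager.registerListener(sensorEventListener, it, SensorManager.SENSOR_DELAY_GAME)
}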
To achieve the AR effect, I'm following these steps:
1. Get the sensor quaternion value
2. Convert the quaternion to Euler angles based on this formula
3. Convert the Euler angles to a rotation matrix
4. Remap the coordinate system to fit Filament (from the Z axis pointing up to the Y axis pointing up)
5. Apply the matrix to the camera with setModelMatrix (the call shape is sketched right after this list)
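As a quick sanity check on step 5, this is the call shape I'm using (with the identity matrix, Filament's camera just sits at the origin looking down -Z with +Y up):
// Sketch: applying an identity model matrix (16 floats, column-major) to the camera.
val identity = FloatArray(16) { i -> if (i % 5 == 0) 1f else 0f }
camera.setModelMatrix(identity)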
Here's the code:
Getting sensor values and applying them
override fun onSensorChanged(event: SensorEvent) {
    if (lastAccuracy == SensorManager.SENSOR_STATUS_UNRELIABLE) {
        return
    }
    if (event.sensor.type == Sensor.TYPE_ROTATION_VECTOR) {
        // getQuaternionFromVector stores the quaternion as [w, x, y, z].
        val tempQuaternion = FloatArray(4)
        SensorManager.getQuaternionFromVector(tempQuaternion, event.values)
        val sensorQuaternion = Quaternion(
            x = tempQuaternion[1],
            y = tempQuaternion[2],
            z = tempQuaternion[3],
            w = -tempQuaternion[0]
        )
        val eulerAngles = FloatArray(3)
        qtToEulerAngles(sensorQuaternion.toFloatArray(), eulerAngles)
        val cameraTransform = eulerToRotationMatrix(
            eulerX = eulerAngles[0].toDouble(),
            eulerY = eulerAngles[1].toDouble(),
            eulerZ = eulerAngles[2].toDouble()
        )
        // Rotate the sensor reading we got to match Filament's coordinate system.
        // (.floatArray is a helper that presumably flattens the 3x3 matrix into
        // the FloatArray that remapCoordinateSystem expects.)
        val rotated = FloatArray(16)
        SensorManager.remapCoordinateSystem(
            cameraTransform.floatArray,
            SensorManager.AXIS_Z,
            SensorManager.AXIS_X,
            rotated
        )
        camera.setModelMatrix(rotated)
    }
}
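Side note: I'm aware SensorManager can also build a rotation matrix directly from the raw rotation vector, skipping the quaternion/Euler round trip. A rough sketch of what that would look like, using the same remap axes as above (I'm not sure whether the camera's model matrix would additionally need a transpose here):
// Sketch only (not what the code above does): rotation matrix straight from the
// rotation vector, remapped with the same axis pair, then applied to the camera.
val rotationMatrix = FloatArray(16)
SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values)
val remapped = FloatArray(16)
SensorManager.remapCoordinateSystem(
    rotationMatrix,
    SensorManager.AXIS_Z,
    SensorManager.AXIS_X,
    remapped
)
camera.setModelMatrix(remapped)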
Converting quaternion to Euler Angles
fun qtToEulerAngles(rotationQt: FloatArray, angles: FloatArray?): FloatArray? {
    val x = rotationQt[0]
    val y = rotationQt[1]
    val z = rotationQt[2]
    val w = rotationQt[3]
    var angles = angles
    if (angles == null) {
        angles = FloatArray(3)
    } else require(angles.size == 3) { "Angles array must have three elements" }
    val sqw = w * w
    val sqx = x * x
    val sqy = y * y
    val sqz = z * z
    // If the quaternion is normalized this is one; otherwise it acts as a correction factor.
    val unit = sqx + sqy + sqz + sqw
    val test = x * y + z * w
    if (test > 0.499 * unit) { // singularity at north pole
        angles[1] = (2 * Math.atan2(x.toDouble(), w.toDouble())).toFloat()
        angles[2] = (Math.PI / 2).toFloat()
        angles[0] = 0f
    } else if (test < -0.499 * unit) { // singularity at south pole
        angles[1] = (-2 * Math.atan2(x.toDouble(), w.toDouble())).toFloat()
        angles[2] = (-Math.PI / 2).toFloat()
        angles[0] = 0f
    } else {
        angles[1] = Math.atan2((2 * y * w - 2 * x * z).toDouble(),
            (sqx - sqy - sqz + sqw).toDouble()).toFloat()              // roll or heading
        angles[2] = Math.asin((2 * test / unit).toDouble()).toFloat()  // pitch or attitude
        angles[0] = Math.atan2((2 * x * w - 2 * y * z).toDouble(),
            (-sqx + sqy - sqz + sqw).toDouble()).toFloat()             // yaw or bank
    }
    return angles
}
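A couple of quick sanity checks for this conversion (quaternion layout is [x, y, z, w]); the expected values fall straight out of the formulas above:
// Identity quaternion -> all three angles should be zero.
val fromIdentity = qtToEulerAngles(floatArrayOf(0f, 0f, 0f, 1f), null)
// fromIdentity == [0.0, 0.0, 0.0]

// 90 degree rotation about X -> angles[0] should be ~pi/2, the others ~0.
val s = Math.sqrt(0.5).toFloat()
val fromAboutX = qtToEulerAngles(floatArrayOf(s, 0f, 0f, s), null)
// fromAboutX approx [1.5708, 0.0, 0.0]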
Converting Euler Angles to Matrix
fun eulerToRotationMatrix(eulerX: Double, eulerY: Double, eulerZ: Double): Array<Array<Double>> {
    val rx = arrayOf(
        arrayOf(1.0, 0.0, 0.0),
        arrayOf(0.0, cos(eulerX), -sin(eulerX)),
        arrayOf(0.0, sin(eulerX), cos(eulerX))
    )
    val ry = arrayOf(
        arrayOf(cos(eulerY), 0.0, sin(eulerY)),
        arrayOf(0.0, 1.0, 0.0),
        arrayOf(-sin(eulerY), 0.0, cos(eulerY))
    )
    val rz = arrayOf(
        arrayOf(cos(eulerZ), -sin(eulerZ), 0.0),
        arrayOf(sin(eulerZ), cos(eulerZ), 0.0),
        arrayOf(0.0, 0.0, 1.0)
    )
    return multiplyMatrices(rz, multiplyMatrices(ry, rx))
}

private fun multiplyMatrices(a: Array<Array<Double>>, b: Array<Array<Double>>): Array<Array<Double>> {
    val result = Array(3) { Array(3) { 0.0 } }
    for (i in 0 until 3) {
        for (j in 0 until 3) {
            for (k in 0 until 3) {
                result[i][j] += a[i][k] * b[k][j]
            }
        }
    }
    return result
}
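For reference, I've also seen the standard closed-form expression that builds the rotation matrix directly from a unit quaternion, without going through Euler angles at all. A sketch in the same Array<Array<Double>> shape as above (not wired into my code yet):
// Sketch: direct quaternion ([x, y, z, w], assumed normalized) to 3x3 rotation matrix.
fun quaternionToRotationMatrix(q: FloatArray): Array<Array<Double>> {
    val x = q[0].toDouble()
    val y = q[1].toDouble()
    val z = q[2].toDouble()
    val w = q[3].toDouble()
    return arrayOf(
        arrayOf(1.0 - 2.0 * (y * y + z * z), 2.0 * (x * y - z * w), 2.0 * (x * z + y * w)),
        arrayOf(2.0 * (x * y + z * w), 1.0 - 2.0 * (x * x + z * z), 2.0 * (y * z - x * w)),
        arrayOf(2.0 * (x * z - y * w), 2.0 * (y * z + x * w), 1.0 - 2.0 * (x * x + y * y))
    )
}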