I have an app written in Aurelia with a Three.js integration, where we display 3D objects using WebXR. I want to show each 3D object separately, based on location. For example, if object A has an attribute with GPS coordinates (lat, long), is there a function or way to show this object in AR with WebXR?
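As far as I know there is no built-in WebXR function that anchors an object to a GPS coordinate. A common approach is to take the device's own GPS fix as a reference point, convert the object's lat/long into a local east/north offset in metres, and set that as the object's Three.js position. A minimal sketch using an equirectangular approximation (function and variable names here are illustrative, not from any library):

```javascript
// Convert a target (lat, lon) into a local x/z offset in metres relative
// to a reference (lat, lon). The equirectangular approximation is fine
// for distances up to a few kilometres.
const EARTH_RADIUS = 6371000; // mean Earth radius in metres

function gpsToLocal(refLat, refLon, lat, lon) {
  const toRad = (deg) => (deg * Math.PI) / 180;
  const dLat = toRad(lat - refLat);
  const dLon = toRad(lon - refLon);
  const north = dLat * EARTH_RADIUS;                          // metres north
  const east = dLon * EARTH_RADIUS * Math.cos(toRad(refLat)); // metres east
  // Three.js convention: +x is east, and the camera looks down -z,
  // so "north" maps to -z.
  return { x: east, z: -north };
}
```

You would call this each time a `navigator.geolocation.watchPosition` fix arrives and apply the result with something like `object.position.set(local.x, 0, local.z)`. Note that consumer GPS accuracy is on the order of metres, so objects will visibly drift; that is a limitation of the data, not of WebXR.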
Do I understand correctly that you show the same image for both eyes? It should be possible to achieve the same effect with a single plane and a custom shader. Alternatively, you could transform it in onBeforeRender for each camera; that way you might not even have to care whether you're in VR mode or not.
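To sketch the per-camera idea: in Three.js, an object's `onBeforeRender` callback fires once per eye camera when rendering in VR, so you can adjust the material there, e.g. to show a different half of a top/bottom stereo texture to each eye. The helper below is illustrative (names are mine, not from the original code); the real wiring is shown in comments because it needs a live renderer:

```javascript
// Pick the vertical texture offset for a top/bottom stereo image:
// top half for the left eye, bottom half for the right eye.
function stereoOffsetY(isRightEye) {
  return isRightEye ? 0.0 : 0.5;
}

// Illustrative wiring (not runnable standalone):
// mesh.onBeforeRender = (renderer, scene, camera) => {
//   // Three.js enables layer 2 on the right-eye camera in VR.
//   const right = camera.layers.isEnabled(2);
//   mesh.material.map.offset.y = stereoOffsetY(right);
// };
```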
Coincidentally, for an experiment I also needed to calculate the IPD in A-Frame. I did this by extracting the positions from the left and right eye cameras. Here's the approach (slightly adjusted to also trigger on the enter-vr event, as in your sample):
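A sketch of that calculation (the A-Frame/Three.js wiring is shown in comments since it needs a live XR session; the essential part is just the distance between the two eye cameras' world positions):

```javascript
// Core of the IPD measurement: the distance between the left and right
// eye cameras' world positions, given as [x, y, z] arrays in metres.
function computeIPD(leftPos, rightPos) {
  const dx = rightPos[0] - leftPos[0];
  const dy = rightPos[1] - leftPos[1];
  const dz = rightPos[2] - leftPos[2];
  return Math.sqrt(dx * dx + dy * dy + dz * dz);
}

// Illustrative wiring in Three.js / A-Frame (not runnable standalone):
// while presenting, renderer.xr.getCamera() returns an ArrayCamera whose
// `cameras` array holds the left and right eye cameras.
// const xrCamera = sceneEl.renderer.xr.getCamera();
// const left  = new THREE.Vector3().setFromMatrixPosition(xrCamera.cameras[0].matrixWorld);
// const right = new THREE.Vector3().setFromMatrixPosition(xrCamera.cameras[1].matrixWorld);
// console.log('IPD (m):', left.distanceTo(right));
```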
Obviously, if you use the scene's onBeforeRender for something else, the above won't work. But simply setting a flag there and performing this logic on the component's next render() achieves effectively the same result.