The simple fact that Tobii decided it wasn’t ready to show, and that Pimax listened, is enough to allay my concerns. If they weren’t happy with Pimax even showing it in a less-than-perfect state, they aren’t going to be happy launching it in that state either. (Joshua himself stated it was Tobii’s decision not to show it during his live stream with MRTV.)
- It's new ground for Tobii:
Unless I’m misunderstanding you, it’s not new ground for them: they are already in the HP Reverb G2 Omnicept, the Vive Pro Eye, the Pico Neo 3 Pro Eye, and the Neo 2 Eye.
- Tobii aren’t controlling the IPD adjustments, they are simply measuring IPD.
Not sure where you’ve seen this discussed? If so, I’d appreciate a link. Regardless, accurately measuring IPD is definitely the hardest part imo. Accurately driving a lens module to a predefined position using a small motor and an encoder is something I could whip up in a couple of hours in the man cave, and it’s very much a solved problem with a myriad of ways to achieve it.
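To illustrate how solved this part is, here’s a minimal sketch of closed-loop lens positioning: a proportional controller steps a (simulated) motor until the encoder reading matches the measured IPD. Everything here is hypothetical and illustrative; real hardware would talk to a vendor motor-driver API, and the gain/tolerance values are made up.

```python
def drive_lens_to(target_mm, position_mm=60.0, gain=0.5, tol_mm=0.01, max_steps=200):
    """Step a simulated lens carriage toward target_mm; return the final position.

    position_mm stands in for an encoder reading; gain and tol_mm are
    illustrative controller parameters, not real Pimax/Tobii values.
    """
    for _ in range(max_steps):
        error = target_mm - position_mm   # setpoint minus encoder feedback
        if abs(error) <= tol_mm:          # within mechanical tolerance: done
            break
        position_mm += gain * error       # motor moves proportionally to error
    return position_mm
```

The point being: once you trust the IPD number coming from the eye tracker, the mechanical side is textbook feedback control.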
- Tobii isn't providing the foveated rendering, they are only providing eye tracking data:
Again, not sure where you’ve seen this, but I’d appreciate a pointer in the right direction if you have any info.
It’s pretty clear they aren’t just providing raw eye tracking data, however. Foveated transport, for example, is a Tobii technology that Pimax are definitely using, as it has been mentioned, and I don’t see any reason to assume they wouldn’t also be using Tobii Spotlight, since the two work hand in hand and I believe Spotlight is a prerequisite. One of the main reasons to go with Tobii is their full software foveation integration stack:
https://www.tobii.com/products/integration/xr-headsets/foveation-technology
As for the foveated rendering itself, I’m pretty sure Pimax are at least partly planning to use Nvidia’s VRSS 2 framework (hence, one assumes, at least one reason why it’s Nvidia only).
https://developer.nvidia.com/blog/delivering-dynamic-foveated-rendering-with-nvidia-vrss-2/ which, if you have a read, you’ll see was developed in conjunction with Tobii Spotlight and works at the driver level, which matches what Pimax have been saying (and is further evidence that this is not new ground for Tobii, and that they are doing more than simply providing raw data).
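For anyone unfamiliar with the concept, here’s a hedged sketch of the core idea behind gaze-driven (dynamic) foveated rendering: shade a small region around the gaze point at full rate and the periphery at a reduced rate. VRSS 2 actually does this inside the driver rather than in application code, and the region size and rates below are invented illustrative numbers, not Tobii or Nvidia parameters.

```python
def shading_rate(px, py, gaze_x, gaze_y, foveal_radius=0.2):
    """Return 1.0 (full shading rate) inside the foveal circle, 0.25 outside.

    All coordinates are normalized [0, 1] across the eye buffer.
    foveal_radius and the 0.25 peripheral rate are made-up values
    chosen purely to illustrate the technique.
    """
    dx, dy = px - gaze_x, py - gaze_y
    distance = (dx * dx + dy * dy) ** 0.5   # distance from gaze point
    return 1.0 if distance <= foveal_radius else 0.25
```

The eye tracker’s job is just to keep `gaze_x`/`gaze_y` fresh every frame; the savings come from the GPU shading the large peripheral area at the reduced rate.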
On top of that, OpenXR Toolkit can already work with the G2 Omnicept (which is also a Tobii eye tracking product) and is pretty much universal when using OpenComposite to switch OpenVR games to OpenXR. I’m currently using fixed foveated rendering in OpenXR Toolkit and it generally works pretty well, albeit perhaps with slightly more modest performance gains than could be achieved earlier in the pipeline.
Anyway, time will tell, but honestly, given all the above, it’s the thing I’m least worried about in the whole headset.
I’ve also seen some encouraging stuff from Joshua on Reddit regarding the lens focal distance/diopter issue that was reported:
Yes we already identified the issue and have started the process of correcting it.
Yes we are actively fixing the issue. On this road-show I was able to identify the issue and feed it back to our engineers. They have confirmed my theory in testing and we are currently debating different solutions. We are debating 3 options, one of which would possibly cause a minor delay in delivery time.
So at the very least they seem to be taking feedback seriously, which is encouraging to see.
We’ll find out if they have actually changed their spots when it launches next month… by which I mean March… I mean June.