Channel: Answers by "mikewarren"

Answer by mikewarren

I'm pretty sure the default is the toe-in approach. However, there are new API calls (SetStereoProjectionMatrices and SetStereoViewMatrices) that allow you to set stereo view and projection matrices per...
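As a rough illustration (not the original poster's code), overriding the per-eye matrices on a Camera might look like the sketch below. In current Unity versions the calls are the singular SetStereoViewMatrix / SetStereoProjectionMatrix; the four matrices are placeholders you would compute for your own display geometry.

    using UnityEngine;

    // Sketch: push custom per-eye view/projection matrices every frame.
    // The four matrices are placeholders (e.g., off-axis frustums for a fixed screen).
    [RequireComponent(typeof(Camera))]
    public class CustomStereoMatrices : MonoBehaviour
    {
        public Matrix4x4 leftView, rightView, leftProj, rightProj;

        void LateUpdate()
        {
            Camera cam = GetComponent<Camera>();
            cam.SetStereoViewMatrix(Camera.StereoscopicEye.Left, leftView);
            cam.SetStereoViewMatrix(Camera.StereoscopicEye.Right, rightView);
            cam.SetStereoProjectionMatrix(Camera.StereoscopicEye.Left, leftProj);
            cam.SetStereoProjectionMatrix(Camera.StereoscopicEye.Right, rightProj);
        }
    }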




Answer by mikewarren

It's a bug. After quite a bit of searching, I found this... https://issuetracker.unity3d.com/issues/vr-non-fullscreen-viewport-rect-bounds-not-honored-in-vr-mode So far, it seems to be fixed in 5.5


Answer by mikewarren

[Robert Oats gave a talk at Unite 2013][1] that discusses some of the issues with large terrain databases. Creating a system to stream in / out terrain in tiles (perhaps even multiple resolutions), and...


Answer by mikewarren

You may be interested in [this post][1]. Some possible suggestions (a sketch of option 2 follows below)...
1. Create individual GameObject instances
2. Use a GPU instancing shader
3. Use a particle system and programmatically create objects...
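As a sketch of option 2 (not taken from the linked post), Graphics.DrawMeshInstanced can draw many copies of one mesh in a single call; the mesh and the material (which needs "Enable GPU Instancing" checked) are placeholders.

    using UnityEngine;

    // Sketch: draw many copies of a mesh with GPU instancing (max 1023 per call).
    public class InstancedDrawer : MonoBehaviour
    {
        public Mesh mesh;          // mesh to instance
        public Material material;  // material with "Enable GPU Instancing" checked
        public int count = 500;

        Matrix4x4[] matrices;

        void Start()
        {
            matrices = new Matrix4x4[count];
            for (int i = 0; i < count; i++)
                matrices[i] = Matrix4x4.TRS(Random.insideUnitSphere * 50f,
                                            Random.rotation, Vector3.one);
        }

        void Update()
        {
            Graphics.DrawMeshInstanced(mesh, 0, material, matrices, count);
        }
    }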


Answer by mikewarren

I'd bet that the pose data you get from that API is a copy of whatever data is being held internally to create poses. Any modifications you make would only be relevant to the methods that use that...



Answer by mikewarren

It's not a great solution, but you could counteract the yaw in the camera's parent by getting the tracking data and inverting it. https://docs.unity3d.com/ScriptReference/VR.InputTracking.html
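A minimal sketch of that idea, assuming the post-2017 UnityEngine.XR names (older versions use UnityEngine.VR and VRNode): read the head rotation each frame and apply the opposite yaw to the camera's parent.

    using UnityEngine;
    using UnityEngine.XR;   // UnityEngine.VR / VRNode in older Unity versions

    // Attach to the camera's parent. Cancels the tracked yaw by rotating the
    // parent in the opposite direction each frame.
    public class CancelTrackedYaw : MonoBehaviour
    {
        void LateUpdate()
        {
            float yaw = InputTracking.GetLocalRotation(XRNode.Head).eulerAngles.y;
            transform.localRotation = Quaternion.Euler(0f, -yaw, 0f);
        }
    }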


Answer by mikewarren

Here's a code snippet from a larger module I have that loads scene bundles from files. This one loads the first scene in the bundle into the scene manager. AssetBundle bundle =...
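The snippet itself is cut off above; a minimal stand-in that does the same job (load a scene-only bundle from a file and load its first scene) might look like this, where bundlePath is a placeholder:

    using System.IO;
    using UnityEngine;
    using UnityEngine.SceneManagement;

    public static class SceneBundleLoader
    {
        // Sketch: load a streamed-scene AssetBundle from disk and load its first scene.
        public static void LoadFirstScene(string bundlePath)
        {
            AssetBundle bundle = AssetBundle.LoadFromFile(bundlePath);
            if (bundle == null || !bundle.isStreamedSceneAssetBundle)
            {
                Debug.LogError("Not a scene bundle: " + bundlePath);
                return;
            }

            string sceneName = Path.GetFileNameWithoutExtension(bundle.GetAllScenePaths()[0]);
            SceneManager.LoadScene(sceneName, LoadSceneMode.Additive);
        }
    }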


Answer by mikewarren

Did you ever figure out what was going on? I use the UnityAssemblies folder to build external plugins for my project. It looks like the folder is synced right before the script editor (Visual Studio)...



Answer by mikewarren

First, if you're just going to condense your "mood" calculation to a single variable, a float range or, at the very least, a single enumerated type might be a better choice. I thought your question was...
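For illustration only (names are hypothetical), the two alternatives might look like this:

    using UnityEngine;

    // Hypothetical example: a single clamped float vs. a small enumerated state.
    public class NpcMood : MonoBehaviour
    {
        // Continuous mood in a fixed range: -1 = hostile, +1 = friendly.
        [Range(-1f, 1f)]
        public float mood = 0f;

        // Or a handful of discrete states.
        public enum MoodState { Hostile, Neutral, Friendly }
        public MoodState state = MoodState.Neutral;
    }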



Answer by mikewarren

@applemaniac OK, if I didn't know better, I'd think I had written this post, as I have nearly the exact same setup: CAVE (no cluster though - single host), ART tracking, Windows 7 - OpenGL, Quadro...


Answer by mikewarren

@dinah93 The interfaces to set the stereo projection and view matrices in the Camera component are still there. While there doesn't appear to be a stereo mode in the XR settings in the editor any more,...


Answer by mikewarren

@red__carrot I know it's an old thread but FWIW, Unity supports VRPN natively through the Cluster Input Manager. You used to be able to access the manager programmatically to add VRPN support through...
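A sketch of what that programmatic route looked like, assuming the ClusterInput API; the VRPN device name ("Tracker0"), server address, and channel index are placeholders for your own setup.

    using UnityEngine;

    // Sketch: register a VRPN tracker with the Cluster Input Manager at runtime
    // and drive this transform from it.
    public class VrpnHead : MonoBehaviour
    {
        void Start()
        {
            // Equivalent of VRPN "Tracker0@localhost", channel 0, exposed as "head".
            ClusterInput.AddInput("head", "Tracker0", "localhost", 0, ClusterInputType.Tracker);
        }

        void Update()
        {
            transform.localPosition = ClusterInput.GetTrackerPosition("head");
            transform.localRotation = ClusterInput.GetTrackerRotation("head");
        }
    }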


Answer by mikewarren

I'm not sure I understand your question, but I think you're confusing the scene world space with tracking space in whatever XR input system you're using. There is no world tracking coordinate because the...



Answer by mikewarren

No problem. My suggestion may not be what you're looking for. Absolutely, you can do the calculations manually, but it's a lot harder. When you create a parent / child relationship in Unity between...


Answer by mikewarren

Your problem is that angles wrap at 360 degrees. I haven't used it, but Mathf.LerpAngle looks like what you might need. [Mathf.LerpAngle][1] [1]:...
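A quick sketch of the difference: Mathf.LerpAngle takes the short way around the 0/360 wrap, where a plain Mathf.Lerp would swing the long way.

    using UnityEngine;

    public class SmoothYaw : MonoBehaviour
    {
        public float targetYaw = 10f;  // e.g. current yaw 350, target 10
        public float speed = 5f;

        void Update()
        {
            float yaw = transform.eulerAngles.y;
            // LerpAngle moves ~20 degrees through 0 instead of ~340 degrees the other way.
            float newYaw = Mathf.LerpAngle(yaw, targetYaw, speed * Time.deltaTime);
            transform.rotation = Quaternion.Euler(0f, newYaw, 0f);
        }
    }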



Answer by mikewarren

Try just "stereo". I agree, it's not well documented. FYI. There is an interface to get the valid list of tokens. using UnityEngine; using UnityEditor; public static class StereoSetup {...


Answer by mikewarren

It's hard to say for sure that something **doesn't** exist, but I've never seen a Kinect work directly with Android. Even if you get the two talking, I suspect all you'll get is an RGB and depth...



Answer by mikewarren

It would appear the token is currently "split" not "Split". (I know it's capitalized in the documentation sample you used.) I added VR support for Mock HMD (split) and None to a player and was able to...


Answer by mikewarren

While I've never used model targets in Vuforia, the [documentation][1] specifically states that model targets cannot be moving. I would expect that to change as the software evolves, but it may not...


Answer by mikewarren

I'm not familiar with the Steam VR prefab, as I made my own rig. FWIW, I added a capsule collider to the head/camera node. The XR transform that drives the head/camera position is an offset from the...


Answer by mikewarren

I think this is a very thoughtful question. If you're not familiar with them, I would look up the Events and Delegates design pattern (EventHandler) for .NET. Your cars have a state associated with them,...
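A minimal sketch of that pattern applied here (class, enum, and event names are all hypothetical): the car raises an event whenever its state changes, and interested objects subscribe instead of polling every frame.

    using System;

    public enum CarState { Parked, Driving, Crashed }

    public class Car
    {
        CarState state = CarState.Parked;

        // Raised whenever the state changes; subscribers react instead of polling.
        public event EventHandler<CarState> StateChanged;

        public CarState State
        {
            get { return state; }
            set
            {
                if (state == value) return;
                state = value;
                StateChanged?.Invoke(this, state);
            }
        }
    }

    public class TrafficMonitor
    {
        public void Watch(Car car)
        {
            car.StateChanged += (sender, newState) =>
                UnityEngine.Debug.Log("Car is now " + newState);
        }
    }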



Answer by mikewarren

I'm not sure what you're asking here, but my guess is that you don't understand why your gun is facing up and not forward when you pick it up. Assuming you're just attaching the gun to the controller...
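If that is the issue, a common fix is to set an explicit local pose after parenting, so the model's authored orientation doesn't leave it pointing up; a sketch, with the rotation offset as a placeholder you tune for your model:

    using UnityEngine;

    public static class GunAttach
    {
        // Parent the gun to the controller and give it an explicit local pose.
        public static void Attach(Transform gun, Transform controller)
        {
            gun.SetParent(controller, worldPositionStays: false);
            gun.localPosition = Vector3.zero;
            gun.localRotation = Quaternion.Euler(90f, 0f, 0f); // placeholder offset
        }
    }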



Answer by mikewarren

Do the following...
- Project Settings -> Player
- XR Settings
- Check "Virtual Reality Supported"
- Add "Mock HMD" to Virtual Reality SDKs
- Move to top of SDK list or remove all other SDKs

Used to...


Answer by mikewarren

You just need to create a ratio that maps the min/max distance range to the min/max haptic pulse range. Something like below. (You'll need to adjust units and range values.) float minDist = 0; //...
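The snippet above is cut off; a stand-in sketch of the same mapping, using Mathf.InverseLerp to turn distance into a 0-1 value and scaling that into a pulse strength. The range values are placeholders, and the final haptics call depends on your SDK (the old SteamVR_Controller API, for instance, took a pulse duration in microseconds).

    using UnityEngine;

    public static class HapticRamp
    {
        // Map a distance in [minDist, maxDist] to a pulse strength in [minPulse, maxPulse].
        // Closer = stronger. Units and ranges are placeholders; adjust for your hardware.
        public static ushort PulseForDistance(float distance,
                                              float minDist = 0f, float maxDist = 2f,
                                              ushort minPulse = 0, ushort maxPulse = 3000)
        {
            float t = Mathf.InverseLerp(maxDist, minDist, distance); // 0 at maxDist, 1 at minDist
            return (ushort)Mathf.Lerp(minPulse, maxPulse, t);
        }
    }

Each frame, compute the distance to the target, call PulseForDistance, and pass the result to whatever haptic-pulse call your controller SDK exposes.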


Answer by mikewarren

I'm not going to be able to give you a definitive answer, but I think these are good questions to be asking. This is kind of a classic tiling problem. I'd suggest looking up Octrees as an example of...



Answer by mikewarren

It must be a Unity bug. My project works in 2017.4.9f1 and 2018.1.0f2. It fails in 2018.2.2f1 and 2018.2.3f1. (sigh)


Answer by mikewarren

@yasmn Unity has support to build for multiple XR devices, and it allows you to prioritize which device to use if it finds more than one. "None" is an acceptable XR device and can be added to the list so...
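For reference, a sketch of switching devices at runtime with the built-in XR settings of that era; "None" is a legal device name and effectively turns VR off (LoadDeviceByName takes effect on the next frame, hence the coroutine).

    using System.Collections;
    using UnityEngine;
    using UnityEngine.XR;

    public class VrToggle : MonoBehaviour
    {
        // Usage: StartCoroutine(SwitchDevice("None")) or StartCoroutine(SwitchDevice("OpenVR")).
        public IEnumerator SwitchDevice(string deviceName)
        {
            XRSettings.LoadDeviceByName(deviceName);
            yield return null;                      // device loads at the end of the frame
            XRSettings.enabled = deviceName != "None";
        }
    }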


Answer by mikewarren

You might want to look here. It was quite a while ago, but I was well into trying to create prefabs with scripts from Asset bundles at the time....


Answer by mikewarren

@unity_2qEYo_LYOfShGQ I've not specifically integrated a Vicon tracking system with Unity, but I have integrated a couple others. My guess is that the Vicon coordinate system and the Unity coordinate...
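To illustrate the kind of fix-up that is usually needed, here is a hedged sketch that converts a right-handed, Z-up pose (a common optical-tracker convention; check what your Vicon setup actually streams) into Unity's left-handed, Y-up frame.

    using UnityEngine;

    public static class TrackerConvert
    {
        // Assumed mapping: tracker X -> Unity X, tracker Z (up) -> Unity Y, tracker Y -> Unity Z.
        public static Vector3 Position(Vector3 p)
        {
            return new Vector3(p.x, p.z, p.y);
        }

        // Same axis swap for rotations; the handedness flip negates the swapped
        // imaginary components.
        public static Quaternion Rotation(Quaternion q)
        {
            return new Quaternion(-q.x, -q.z, -q.y, q.w);
        }
    }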




Answer by mikewarren

FWIW, I tried to use the LWRP with stereo non head-mounted last year and it didn't work. It's my understanding that LWRP evolved into URP, so there may still be an issue there.
