There’s no reason to drag this out any longer. Let’s see what happens when we float the dual Canon 3D rig around on a DIY Steadicam. The scene is wrapped about 160 horizontal degrees around the sphere, based on the amount of stereoscopic depth (to avoid hurting anyone’s eyes).
The noise you hear in the background is the air-conditioner blasting away during this rare southern California heat wave.
I have added some pretty wallpaper in the background for people who prefer to have 360 degrees of imagery in their 360° virtual reality videos.
This video is extremely important, since it shows how much camera movement you can get away with when you are out runnin’ and gunnin’, shooting a 3D video that will be wrapped this far around the sphere. So let’s take a really close look at it with our stereoscopic VR viewers, like Google Cardboard and Gear VR.
Remember, there’s always real 360° 3D CGI as an option, which is also a lot of fun!
This scene plays for 3 minutes to allow enough time to look around.
Created in a Spherical Stereo Blender project, composited and rendered with Sony Movie Studio Platinum 13, and converted to h264 mp4 with HandBrake.
“Dragster Engine” Blender model by ChrisKuhn: http://www.blendswap.com/blends/view/84608
Let’s take a look at camera movement in 360° 3D virtual reality scenes, which becomes more problematic the further we wrap them around the sphere. This one, for reference, has no movement. It is wrapped approximately 160 horizontal degrees, so the 1.3% NetD (net deviation, i.e., total stereoscopic depth) won’t hurt anyone’s eyes. It is “mounted to infinity” to avoid making anyone diverge their eyes.
I’ve added a background of my current Windows 10 desktop on my ultrawide LG monitor, in case someone with a broken gyro in their phone starts the video facing the wrong direction. Besides, some people prefer to see imagery all the way around the 360-degree scene.
The 3D video was recorded with my dual Canon 3D rig, and I monitored the sync with a DIY BASIC Stamp sync monitor.
The 360° 3D VR scene was set up in a Blender Spherical Stereo project, and the final compositing was done with Sony Movie Studio Platinum 13.
I’ve reached this point with 3D projects before, only to fail, so I’m doing a double-check to see if my new 360° 3D virtual reality workflow is *really* working.
What’s different about it, you ask? Thanks for asking! I’ve modified the rendering process, which reduces the rendering time and file size, which in turn reduces the upload time to YouTube!
This is my first experiment with a 3D photo as the near point and a 3D video “picture-in-picture”, and now I’m going to have more fun than one person deserves!
Also, there are no rules that say we have to show 360 degrees of imagery in a 360° virtual reality scene—this one only has a 195 degree hFOV.
As I explain in this 360° 3D virtual reality video, I was hoping to be out runnin’ and gunnin’ this morning, checking for camera movement anomalies, but I’m still having rendering issues.
Hopefully I have found a decent workflow compromise by rendering an XAVC video out of Sony Movie Studio or Vegas, then converting that to an h264 video with HandBrake.
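For reference, that conversion step can also be run from the command line with HandBrake’s CLI. A minimal sketch — the filenames are hypothetical, and the quality value is my own choice, not anything from the workflow above:

```shell
# Convert the XAVC render out of Movie Studio / Vegas to an h264 mp4.
# -e x264 selects the h264 encoder; -q sets constant quality (lower = better).
HandBrakeCLI -i render_xavc.mp4 -o final_h264.mp4 -e x264 -q 18
```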
The foreground scene, a photo shot with my cell phone, is “mounted to the near point” in a Spherical Stereo Blender project.
The “picture-in-picture” 3D video in this 360° scene, “mounted to infinity”, as the old-school stereographers like to call it, was recorded with a Sony TD30. It has close to a 65 degree hFOV, which should almost fill the entire screen in a Google Cardboard or Gear VR type stereoscopic viewer.
I was going to get out early this morning and do some runnin’ and gunnin’ for a 360° 3D virtual reality video, but I hadn’t played with my 5 inch image-splitter, *The Beast*, in so long that I almost forgot how to use it! By the time I got it set up, it was so bright and sunny outside that I needed to put a polarizer filter on the lens, but I couldn’t remember whether it worked with *The Beast*.
I used a Spherical Stereo Blender template to determine the degree of spherical wrap I needed for the 1.5% of net deviation in the video, to ensure the depth won’t hurt anyone’s eyes; it ended up being 130 degrees.
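The numbers here are consistent with a simple rule of thumb: total angular disparity ≈ NetD × horizontal wrap (1.5% × 130° ≈ 2°, and the earlier video’s 1.3% × 160° ≈ 2° as well). A minimal sketch of that arithmetic — the ~2° comfort budget is my assumption, not necessarily what the Blender template uses:

```python
# Sketch: relate net deviation (NetD), horizontal wrap, and a comfort budget.
# ASSUMPTION: a total angular-disparity budget of about 2 degrees is treated
# as comfortable; the Spherical Stereo template may use a different figure.

def angular_disparity_deg(netd_fraction, wrap_deg):
    """Total on-sphere disparity when video with the given net deviation
    (as a fraction of frame width) is wrapped wrap_deg horizontal degrees."""
    return netd_fraction * wrap_deg

def max_wrap_deg(netd_fraction, budget_deg=2.0):
    """Widest horizontal wrap that keeps disparity inside the budget."""
    return budget_deg / netd_fraction

print(round(max_wrap_deg(0.015)))          # 1.5% NetD -> roughly 130 degrees
print(angular_disparity_deg(0.013, 160))   # the no-movement reference video
```

With a tighter budget the wrap shrinks proportionally, which is why deeper stereo footage has to cover a narrower slice of the sphere.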
Next week I should be totally ready to get out there into the real world and finally do some serious 360° runnin’ and gunnin’!
Rotate this view to see three 3D photos of Bonelli Park.
This is a final test of my 360° Reverse Engineering theory before I move out of the shadows in the parks into the heat of the public streets. Since I will no longer have to make stereoscopic calculations and measure distances before shooting, it’s time to do some seriously quick and dirty runnin’ and gunnin’, baby!