The idea behind the Evo’s design is a simple but clever one. 3D cameras and 360° cameras have one thing in common: the presence of at least two lenses. Where those lenses face in relation to each other, and the type of lens used, determine the kind of media you capture. Lenses that sit side by side (roughly 2.5 inches apart, close to the distance between human eyes) capture two different channels corresponding to the left and right eye, creating a sense of depth and therefore a 3-dimensional video or image. Lenses (usually at least 180° fisheyes) that face in opposite directions capture an entire scene in 360 degrees, letting you create videos or images you can look around in and immerse yourself in. The Insta360 Evo simply adds a mechanism that lets these two lenses fold to face either the same direction or opposite ones, allowing the camera to alternate between shooting in 3D and in 360°.
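For the curious, the depth cue you get from that roughly 2.5-inch baseline can be sketched with the standard pinhole-stereo relation. The numbers below are purely illustrative assumptions, not Evo specs:

```python
# Depth from stereo disparity: Z = f * B / d (pinhole camera model).
# Illustrative numbers only, not actual Insta360 Evo specifications.
BASELINE_MM = 63.5   # ~2.5 in, close to the average human interpupillary distance
FOCAL_PX = 1200.0    # assumed focal length, expressed in pixels

def depth_mm(disparity_px: float) -> float:
    """Estimate depth of a point from its left/right image disparity."""
    return FOCAL_PX * BASELINE_MM / disparity_px

# A nearby object shifts more between the left and right images than a
# distant one; that difference is exactly what creates the sense of depth.
near = depth_mm(80.0)   # large disparity -> object is close
far = depth_mm(8.0)     # small disparity -> object is far
```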
The Evo can record 3D 5K/30fps video (or capture 18-megapixel stills) with a 180-degree field of view, viewable using the VR headset that comes in the box, or an innovative HoloFrame case that sits on your phone, turning its screen into a 3D display. Fold the lenses to face away from each other and the camera captures 360° videos and stills that you can view in your VR headset, looking around to see things behind, beside, above, or below you.
What’s even more remarkable about the Evo is its ability to not just record video, but also stabilize it. Thanks to its 6-axis gyroscopic FlowState stabilization system, the Evo’s videos are immersive, crisp, and jitter-free, and its time-lapses are incredibly smooth. Whether you’re walking on a footpath or a bumpy trail, the Evo can capture steady video without an external gimbal or stabilizer (a gimbal would end up in the frame of a 360° video anyway). A simple flip/fold mechanism lets you transition between shooting in 3D and shooting in 360°, letting you create fully immersive video content, and the Evo even comes with headsets and cases that let you and your audience properly view the content you’ve created!
The Hololens, when it was released four long years ago, made some great promises and lived up to a fair number of them. After four years of being used, developed for, and pitted against the competition, however, the Hololens 2 has a set of well-defined milestones to achieve. Microsoft spent a significant amount of time figuring out where the Hololens could enhance professions and lives, and even more on the Hololens’ internal technology, which they claim is so state-of-the-art that the Hololens 2 will have no competition for the next 2-3 years when it comes to fidelity.
This is the Hololens 2. It’s slightly lighter, slightly better looking, but downright remarkable on the inside. It has a much larger projection area and can display images in 2K. Rather than relying on reflection technology, the Hololens 2 uses lasers, a waveguide, and a proprietary rotating mirror that turns the laser’s beam into a wide fan of light that reaches your eye. That fan of light is what you end up seeing, and Microsoft has managed to make the system so good that the images feel real. The Hololens 2 also packs cameras that scan your pupils to identify you and to correct the images to match the distance between your pupils, calibrating things at an incredibly minute level so that the line between the digital and the real is successfully blurred. The Hololens 2 is a mixed-reality headset too, allowing you to interact with the holograms around you: an Azure Kinect sensor on the front of the headset identifies objects as well as gestures, making your interaction with digital elements much more natural.
The Hololens, according to Alex Kipman, is a device meant for the majority of professions. Created to transcend its current uses (designing cars in 3D and such), the Hololens 2 sees itself being used in more technical settings, as a means to look past objects: an engineer analyzing or fixing a jet engine, a doctor studying a patient, or a production designer visualizing an entire space with interactive elements. In short, instead of making the computer experience portable, it’s designed to bring the digital experience to people who don’t sit in front of computers at all. The Hololens 2 even packs a flip-up hinge that lets you lift the lens like a visor or welding mask, lowering it onto your face only when you need it. Microsoft plans to roll out the Hololens 2 purely to corporations and factories rather than give consumers immediate access. Judging by how much value its design intends to bring to professions that need to get their hands dirty, that makes sense… but then again, that’s also been Microsoft’s mantra for the longest time. After all, the company is in the business of facilitating businesses, isn’t it?
The future generation won’t look at VR headsets and controllers the way we do. For us, the generation that grew up with pens, pencils, notebooks, and blackboards, VR headsets and future-tech aren’t as easily accepted and integrated into our lives as they are by younger generations. The Soar ensures that this tech is introduced to children at a young age, and in the right manner. Rather than their first interaction being play on an iPad or a VR headset, the Soar makes it a positive learning experience.
The design of the Soar kit is as perfect as it gets. A dock, a VR headset, and two controllers (with stylus heads) all come together into a singular form, nesting neatly within each other. The dock charges the devices, while the headset can be used for interactive experiences. The controllers can work either with the headset, or with a tablet that’s used separately. On the other end of each controller is a thick stylus that works as a drawing/coloring pen with the aforementioned tablet.
The Soar’s nesting design makes sure the schoolchild has all their tools at their disposal. Its small form factor also makes it easy to carry to school and back home, where, rather than relying on their parents’ iPads, Nintendo Switches, or Oculus Rifts, children will be immersed in tech that’s built just for them, inducting them appropriately into futuristic technology.
My strongest visual memory of VR headsets is the image of Mark Zuckerberg walking past an entire audience wearing VR headsets. There’s a lot to discuss in that picture, especially its element of dystopia: masses with large cuboids strapped to their faces, completely absorbed in their virtual world, oblivious to their surroundings. Much of that feeling comes from the VR headsets themselves, which are enormous and, truth be told, ugly.
VR headsets have always had a bad rap thanks to their hulking design… but it doesn’t have to be that way. LUCI’s Immers VR headset does everything a top-notch VR headset does, but in half the size, and with a brushed-metal front plate that gives it a major style upgrade. On the inside, two UHD displays deliver 4K output to your eyes through a series of proprietary ‘pancake’ lenses that practically dissolve the pixels, so you don’t see a single one, just a crisp, clear, high-contrast image. The Immers also packs a 3D audio system to go with its display tech.
The Immers does all this while being lightweight, compact, and most importantly, stylish. It can be carried around in a bag or even hung from your collar, and no one would know the difference between it and a stylish pair of shades. Who knows, it may even help VR headsets become an everyday carry one day!
I’ll give the guys at Feelreal a pass, because the VR market is still growing, so we may have to wait a decent amount of time before multisensory VR headsets stop looking like someone glued a toaster to your face. Its bulky design aside, the Feelreal is an attachment that adds an olfactory element to your VR experience. Compatible with the Oculus Rift, Oculus Go, HTC Vive, Samsung Gear VR, and PlayStation VR, the Feelreal snaps to the bottom of the headset, sitting right in front of your nose. It then releases smells that complement the media you’re watching, designed to simulate, stimulate, or relax.
Compatible with a variety of games, movies, and multimedia content on platforms like YouTube VR, the Feelreal can generate a wide variety of aromas, making you feel like you’re in the moment, whether that’s a forest, a racetrack, or a battlefield. It comes with a set of 9 aroma vials that combine to create as many as 255 distinct smells, from flowers and petrichor, to food, to grease or gunpowder. The Feelreal goes the extra mile by providing a tactile experience too: its built-in water spray, microheater (to simulate heat), and fan (to simulate wind) add further layers to your audiovisual VR experience.
Feelreal’s addition to VR opens up a world of opportunities. Designed not just for games and movies, it can also be used for therapy and meditation, or even in the culinary industry. The idea is definitely a novel one (and makes VR a truly multisensory experience, as it should be), although I personally have issues with wearing a headset that practically feels like a brick tied to the front of my face… but that’s just because the concept and its backing technology are so nascent.
For probably the first time, spectacle wearers may be at a strategic advantage, now that North has acquired all of Intel’s Vaunt AR patents. The company aims to build and launch, in the near future, smart AR glasses that are incredibly light and indistinguishable from regular glasses. They’re stylish (and look nothing like the abomination that was Google Glass), with a module built into the stem of the glasses that projects an image onto your spectacle lens, using the glass of your eyewear to reflect that image directly onto your retina. The image is seen by you, the wearer, alone, and more importantly, can be focus-calibrated to your prescription.
Intel had been struggling to get the technology off the bench, but North’s acquisition of its research and patents may just mean that we spectacle wearers will have AR-enabled HUD eyewear very soon, making taking the smartphone out of our pockets an entirely unnecessary activity. The laser-projection module can tell you the time, share notifications, and even let you browse maps or see who’s calling. North’s experiments with its own Focals series of AR glasses even included a ring for browsing and controlling your content, plus Alexa support, so you could give voice commands to your smart glasses. And while all this is an incredible breakthrough, the fact that North’s glasses still look like normal, off-the-shelf prescription eyewear is just as noteworthy: it shows a realization that wearable technology must conform to the standards of fashion if it is to see the daylight of mass acceptance. Me, I’m glad that eyewear (something I cried about and hated as a kid) is stepping into an incredible, powerful future, and the next time someone calls me ‘four eyes’, I might just be able to google a nasty comeback without them noticing!
A future in which we’re perpetually strapped into AR or VR headsets is a distant, dystopic one. At most, we’d use a VR headset for 2-3 hours of work or gaming, in conjunction with our desktop PC. That temporary use is what triggered the design of the Pega VR4. It sits in its dock, right beside your desktop, and straps to your head only when you need it.
Short, fragmented VR experiences are much more likely in the present and near future, says designer YuHsun Chung. People often turn to VR to visualize products better, often in context, and to interact with them in ways that aren’t possible on a desktop screen. That’s where the Pega VR4 comes in. Designed as an accessory to your desktop, like a mouse or keyboard, the Pega VR4 can be pulled out whenever you want to visualize 3D or 360° content and interact with it. On the front of the headset sits a Leap Motion controller that tracks your hands, letting you manipulate your content: zoom in and out of scenarios, rotate or pan 3D models, or test out VR interfaces. Once done, just pop the Pega VR4 off and dock it back on its platform. The platform doesn’t just charge the headset; it also connects directly to your system via LAN, USB, and HDMI, so the headset itself remains completely wireless! Ah, perfect!
My experience with VR has been limited to slipping phones into VR headsets, so I can say without a doubt that the minute your eyes see those magnified pixels, they know they’re being fooled. Besides, your phone’s 16:9 screen can’t match your roughly 180° of peripheral vision, so all you end up seeing is a highly pixelated box of visual content in front of you. Pimax (pronounced pie-max) changes that with its incredible 8K VR headset, which puts a 4K screen in front of each of your eyes. At such a high resolution, pixels are barely discernible to the naked eye, and the two 4K screens combine to give you an immersive 200° field of view.
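A back-of-the-envelope way to see why the pixels stop being visible is to compare angular resolution, i.e. pixels per degree of field of view. The field-of-view figures below are rough assumptions for illustration, not official specs:

```python
# Angular resolution: how many pixels cover each degree of your view.
# FOV numbers are rough assumptions for illustration, not official specs.

def pixels_per_degree(horizontal_px: int, fov_degrees: float) -> float:
    return horizontal_px / fov_degrees

# A 1080p phone split between two eyes, stretched across ~100 degrees:
phone_in_headset = pixels_per_degree(1920 // 2, 100.0)

# A dedicated 4K panel per eye over a similar per-eye field of view:
pimax_per_eye = pixels_per_degree(3840, 100.0)
```

With these assumed numbers, the phone-in-a-headset setup lands around 10 pixels per degree while the 4K-per-eye setup lands around 38, which is why one looks like a screen door and the other doesn’t.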
Partnering with Nvidia, AMD, Disney, Valve, Leap Motion, and even BMW, Pimax will soon see its headsets used in a variety of training and entertainment applications. Pimax’s hi-def VR headsets even come with optional eye tracking, plus two motion controllers for your hands! And at $899 for the headset alone (that’s $100 more than the HTC Vive), the Pimax 8K is quite a steal; for starters, it’s a fraction of the cost of an 8K television!
Take a second to appreciate what’s happening in the video above. It uses augmented reality in a way that intuitively plugs the gap between physical and digital workspaces. It requires no tutorials and is easy to understand, simply because our physical experience of organizing a workspace is brought into the digital world!
Meet the Magic UX, an experiment by the guys at Special Projects. The idea behind the Magic UX is to take the way we multitask in the physical world and bring it to the digital one. Multitasking or switching between tasks on your phone can involve a lot of unnecessary swiping and pressing of icons and buttons, wasting time. Multitasking at your desk isn’t that complicated: if you want to write, you pick up your pen and write; if you want to type, you put the pen down and move your hands to the keyboard. The guys at Special Projects believe that augmented reality in phones can bring that physical ease into the digital world.
The Magic UX is the result of that belief. It uses the phone’s spatial awareness to ‘pin’ applications to certain points in space, much like placing your notebook in one corner of your desk, your calendar in another, and your post-its in a third. Magic UX lets you pin your apps in dedicated spaces, and the minute you move your phone away, the app fades into the background. You can create a literal landscape of apps that you switch between simply by moving your phone. What’s even better is that you can drag and drop items with incredible intuitiveness, by quite literally dragging and dropping them through virtual space! I can’t wait for the major mobile operating systems to notice the potential AR has to revolutionize the way we work!
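The pinning-and-fading behavior described above can be sketched in a few lines. This is a hypothetical toy model, not Special Projects’ actual implementation; the app names, anchor positions, and fade radius are all made up for illustration:

```python
import math

# Hypothetical sketch of Magic-UX-style spatial pinning: each app is anchored
# to a point in the room, the app nearest the phone becomes active, and its
# opacity fades out as the phone moves away from the anchor.

PINNED_APPS = {            # app name -> (x, y) anchor position in metres
    "notes":    (0.0, 0.0),
    "calendar": (0.4, 0.0),
    "maps":     (0.0, 0.4),
}

def active_app(phone_xy, fade_radius=0.2):
    """Return (app, opacity): the nearest pinned app and how visible it is."""
    def dist(anchor):
        return math.hypot(phone_xy[0] - anchor[0], phone_xy[1] - anchor[1])
    app, anchor = min(PINNED_APPS.items(), key=lambda kv: dist(kv[1]))
    opacity = max(0.0, 1.0 - dist(anchor) / fade_radius)
    return app, opacity
```

Hold the phone over the "notes" corner and that app is fully visible; slide it toward the "calendar" corner and notes fades out as calendar takes over.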
The specifications I’m about to read out to you are so insanely over the top, they’re hard to believe. This super-expensive, football-like camera, capable of recording audio and video in VR, comes with 16 (yes, SIXTEEN) RED Helium 8K sensors paired with 180-degree Schneider lenses, capturing a single scene at 60fps with depth information (so you’ve got 3D-ready data too) and displaying 360-degree footage at a quality that has the brand value of RED attached to it.
The camera captures depth information, discerning between foreground and background, and renders out parallax too (check out the video below), allowing for 6 degrees of freedom while viewing the content in a VR headset. In layman’s terms, you can move your head forwards and backwards, side to side, and up or down, and objects in the video will reorient themselves with respect to your head’s position. This makes for an incredibly realistic experience: as your head moves even slightly, whether while breathing or through involuntary movements, the VR environment responds by repositioning objects, making the experience significantly more believable and immersive.
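Why tiny head movements matter can be shown with a little geometry: the same sideways sway shifts a nearby object across your view far more than a distant one, and 6DoF playback has to reproduce that difference. The numbers here are illustrative, not Manifold specs:

```python
import math

# Parallax under head translation: an object at depth Z appears to shift by
# roughly atan(dx / Z) when your head moves sideways by dx.
# Illustrative numbers only, not Manifold specifications.

def apparent_shift_deg(head_translation_m: float, object_depth_m: float) -> float:
    """Angular shift of an object when the head translates sideways."""
    return math.degrees(math.atan2(head_translation_m, object_depth_m))

near_shift = apparent_shift_deg(0.01, 0.5)   # 1 cm sway, object 0.5 m away
far_shift = apparent_shift_deg(0.01, 10.0)   # same sway, object 10 m away
```

A 1 cm sway moves the half-metre-away object by over a degree while the distant one barely budges, which is why a 3DoF (rotation-only) video feels subtly flat and 6DoF playback feels real.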
The Manifold is built so that its controls and video storage can sit as far as 328 feet (100 meters) away from the camera (obviously, in a 360° recording, you don’t want any cast, crew, or equipment in the camera’s field of view). It uses Facebook’s depth-estimation technology to build its volumetric 3D view, and integrates with post-processing tools from Adobe, Foundry, and OTOY so filmmakers can render out their projects.
While the two companies aren’t being particularly generous with launch details, it’s safe to say that the Manifold is strictly for professional use (and will most likely be extremely expensive too). However, if and when the Manifold does launch and finds itself being used to capture videos (whether for movies or games), it’ll be pretty groundbreaking to capture content as advanced as what it promises. I hope I live long enough to see it become an industry standard (like IMAX) too!