Intel RealSense and Uraniom Put Your Avatar in Fallout 4

While Intel’s RealSense 3D scanning technology isn’t particularly new, the application exhibited at CES 2016 by Intel and Uraniom is revolutionary, as it enables gamers to become the stars of Fallout 4 and several other games.

Games where the character’s features can be altered have been around for quite a while now, and people who paid attention to such details spent a lot of time trying to get the character to resemble them, even remotely. They had to pick a hair color, a face shape, a type of facial hair, and so on, and at the end of this lengthy process all they ended up with was a generic character they shared no more than five features with. Enter Intel’s RealSense 3D scanning technology and Uraniom’s platform for turning raw 3D scans into playable video game avatars, and you end up with a character created in your image roaming around the open-world environment.

Intel claimed in September 2014 that it would bring 3D scanning to smartphones and tablets in 2015. While a couple of devices equipped with RealSense technology have appeared (most notably the Dell Venue 8 7840 and the HP Spectre X2), the trend hasn’t picked up steam as fast as I would’ve hoped. Maybe building depth-sensing cameras into mobile devices is too expensive for manufacturers who don’t want to price their products out of reach. Personally, I think 3D scanning goes hand in hand with 3D printing, and since that industry has gained a lot of momentum, Intel RealSense should pick up some of it, too.

Erica Griffin, the technology nerd who likes to film stuff, demonstrated how all of this works in a video shot at CES 2016 (which you can watch below). She had her head scanned using an HP Spectre X2 tablet equipped with an Intel RealSense R200 depth-sensing camera. The process is a bit awkward, as someone needs to hold the camera and walk around the subject, who in turn isn’t allowed to move.

According to Intel, the RealSense R200 camera provides reliable depth information. To facilitate the scanning process and make sure the resulting avatar looks right, Uraniom recommends keeping a neutral expression, exposing the neck, tying back the hair (if any), and using homogeneous lighting. The low-res scan is then uploaded to the cloud, where a high-res version is rendered; that version can be adjusted in Uraniom’s avatar editor. Some facial parameters need to be aligned before exporting the avatar and importing it into the game. As Erica pointed out in her video, seeing yourself in a game is equal parts amazing and disturbing.

Even though Intel has only demonstrated the inclusion of avatars in Fallout 4, the technology can be used with several other games, including FIFA 15, Arma 3, GTA V and The Elder Scrolls V: Skyrim. Hopefully, more developers of games where character customization makes sense will join the trend.

[Source and image credit: Erica Griffin]

Quantum Human Maya Plugin Animates Static Human 3D Models: Living Statue

Last month, we learned about Live2D, an application that combines multiple illustrations into one 3D model. Here’s another upcoming time-saving application for animators. Currently in development by Quantum Matrix, Quantum Human is an Autodesk Maya plugin that can turn a static 3D model of a body into a character that you can animate, render or use for motion capture.

Quantum Human recognizes the different parts of the 3D model, segments them properly, then adds a skeleton, a control rig, a muscular system and much more. It can even adapt assets such as clothing to fit the resulting character. Since it’s (at least initially) meant for professional use, the plugin has a ton of options and parameters, but it will also let you save batches of settings. In fact, it offers a one-click conversion process out of the box.
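
Quantum Matrix hasn’t published its API, so purely as a frame of reference, here is a tiny maya.cmds sketch of what “adding a skeleton” and binding it to a mesh look like when scripted by hand in Maya. The joint names, positions and the body_mesh object are hypothetical placeholders and have nothing to do with Quantum Human’s internals.

```python
# Illustrative only: plain maya.cmds, not Quantum Human's API. The joint names,
# positions and the "body_mesh" object below are placeholders.
from maya import cmds

SPINE = [("hips", (0, 100, 0)), ("spine", (0, 115, 0)),
         ("chest", (0, 130, 0)), ("neck", (0, 150, 0)), ("head", (0, 160, 0))]

def build_spine_chain():
    """Create a simple joint chain, the kind an auto-rigger places for you."""
    cmds.select(clear=True)  # start a fresh chain instead of parenting to a selection
    return [cmds.joint(name=name, position=pos) for name, pos in SPINE]

def bind_to_mesh(joints, mesh="body_mesh"):
    """Skin the static mesh to the skeleton so it deforms with the joints."""
    return cmds.skinCluster(joints, mesh, toSelectedBones=True)

# Run inside Maya's script editor:
#   joints = build_spine_chain()
#   bind_to_mesh(joints)
```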

The software’s creator and Quantum Matrix founder Kwai Bun made this short demo film to show off Quantum Human’s capabilities. Kwai made the entire demo within six man-weeks, and aside from the hair and clothes, the main character’s movement abilities were built using Quantum Human.

Quantum Human will be released later this year under various licensing options. Quantum Matrix promises that future versions will not require Maya. Animate a browser and head to Quantum Human’s website for more information, images and demo videos of the application.

[via Prosthetic Knowledge]

Shoes that Look Like Meteorites: Debrislliant Idea

Instead of a sewing machine, Studio Swine used a CNC mill to make its eye-catching Meteorite Shoes. The studio 3D-scanned meteorite samples at the Natural History Museum, used the resulting files to design the shoes’ upper, and finally had a CNC milling machine carve the irregular shape and texture out of aluminum foam.

Though it seems like the shoes would be awkward or painful to wear, Studio Swine says they have a soft leather lining. They’re also lightweight, because aluminum foam is mostly made of air.

Crash into the studio’s website to see more photos of the shoes. Microsoft commissioned the shoes and Studio Swine made the footwear with the help of the Surface Pro 3, so we’ll probably see someone walking around in them in Microsoft ads.

[via Inspirationist; image by Petr Krejci]

This Guy Made a Decent 3D Scanner Using a Cheap Webcam and a Laser Level

If you want to 3D scan things, you’re going to have to shell out some serious cash. The MakerBot Digitizer that inspired this project, for example, is a whopping $800. That’s definitely worth it for a person who has money to burn or is using it for professional purposes, but most of us just want to play with one, and $800 is pretty steep for a toy.

Will Forfang was unwilling to part with that much money, and also unwilling to go without a 3D scanner. He is, however, quite clever, so he wrote a program that compiles a ton of images from a cheap Logitech webcam of an object illuminated by a laser level and translates that data into a 3D model. No, it’s not as good as the Digitizer, but it honestly did a very good job of scanning that skull.
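
Forfang’s own code isn’t reproduced here, but the core trick (pick out the brightest laser pixels in each frame, then intersect each pixel’s viewing ray with the known plane of the laser) can be sketched in Python with OpenCV and NumPy. The camera intrinsics and plane numbers below are placeholders standing in for a calibration step, not figures from his build.

```python
# A rough sketch (not Forfang's actual code) of webcam + line-laser scanning:
# find the laser line in each frame, then intersect each pixel's viewing ray
# with the known laser plane. The intrinsics and plane values are placeholders
# that would normally come from a calibration step.
import cv2
import numpy as np

FX, FY, CX, CY = 800.0, 800.0, 320.0, 240.0   # assumed webcam intrinsics (pixels)
PLANE_N = np.array([0.87, 0.0, -0.5])         # assumed laser-plane normal
PLANE_D = 0.25                                # plane equation: n . X + d = 0 (meters)

def frame_to_points(frame_bgr):
    """Return rough 3D points (N x 3) for the laser line visible in one frame."""
    # The laser line shows up as the brightest red pixels in the image.
    red = frame_bgr[:, :, 2].astype(np.float32)
    vs, us = np.nonzero(red > 0.9 * red.max())

    points = []
    for u, v in zip(us, vs):
        # Viewing ray through this pixel, in camera coordinates.
        ray = np.array([(u - CX) / FX, (v - CY) / FY, 1.0])
        denom = PLANE_N @ ray
        if abs(denom) > 1e-6:                 # ray not parallel to the laser plane
            points.append((-PLANE_D / denom) * ray)
    return np.array(points).reshape(-1, 3)

cap = cv2.VideoCapture(0)                     # the cheap Logitech webcam
profiles = []
for _ in range(300):                          # grab a few hundred frames
    ok, frame = cap.read()
    if not ok:
        break
    profiles.append(frame_to_points(frame))
cap.release()
# In a real rig the object rotates (or the laser sweeps) between frames, so each
# profile would be transformed into a common frame before stacking with np.vstack.
```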

If you’re brave enough to try to replicate his setup, you can follow Forfang’s instructions here.

[via 3D Printing Industry]

Cadillac Elmiraj Concept Car Made Possible by 3D Scanning Tech

I’ve always wondered how modern concept cars are actually designed. In years past, artists went to work with clay and molded the shape of a car by hand, and I assumed that most of this process was now done with computers. Apparently, for modern concept cars like the Cadillac Elmiraj, the reality is a mix of the two.

GM has announced that 3D scanning was a big part of the design process for the car. The scanning rig projected light patterns onto the vehicle while an advanced camera captured the resulting 3D shapes and turned them into mathematical data, which was then used to create a 3D rendering.
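
GM hasn’t detailed its pipeline, but turning captured shapes into “mathematical data” generally boils down to back-projecting each depth measurement into an (x, y, z) coordinate using the scanning camera’s optics. Here is a minimal, hypothetical NumPy sketch of that step; the camera intrinsics and the synthetic depth map are invented for illustration and aren’t GM’s figures.

```python
# Illustrative only: converting a structured-light depth map into 3D points,
# the kind of "mathematical data" a digital model or milling path starts from.
# The camera parameters and depth map below are placeholders, not GM's data.
import numpy as np

FX = FY = 1200.0          # assumed focal lengths of the scanning camera (pixels)
CX, CY = 480.0, 360.0     # assumed principal point (image center)

def depth_map_to_points(depth_mm):
    """Back-project an (H, W) depth image into an (H*W, 3) array of XYZ points."""
    h, w = depth_mm.shape
    us, vs = np.meshgrid(np.arange(w), np.arange(h))   # pixel coordinates
    z = depth_mm.astype(np.float64)
    x = (us - CX) * z / FX                             # pinhole back-projection
    y = (vs - CY) * z / FY
    return np.column_stack([x.ravel(), y.ravel(), z.ravel()])

# Synthetic example: a gently curved surface roughly one meter from the camera.
depth = 1000.0 + 50.0 * np.sin(np.linspace(0, np.pi, 960))[None, :] * np.ones((720, 1))
points = depth_map_to_points(depth)
print(points.shape)       # (691200, 3) -> ready for meshing or milling software
```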

Once those math-based models were in hand, hand modeling in clay and computer milling took over. Changes made to the mathematical model for the concept were applied to the clay model using computer-controlled milling, a process that lets GM move from a scale model to a full-size model in about a week.

“With the Elmiraj, we were able to use 3-D scanning as the bridge between traditional hand-sculpting teams who work in clay and digital modeling design teams who work in math,” said Frank Saucedo, director of General Motors’ North Hollywood Advanced Design Studio. “Our ability to scan the clay model with speed and precision and go from the digital tools to the hands of a craftsman and vice versa was extremely valuable.”

3D Systems Sense review: a 3D scanner for the masses (almost)

If we've crossed paths in the past week, there's a pretty good chance I've scanned you. This extends well beyond the human race, into the realms of animal, vegetable, plush toy and fruit bowl. Some subjects were too small to be scanned, some too fidgety and, in the case of my attempted 3D selfie, not nearly flexible enough. Such issues were mere roadblocks in my strange one-man journey to 3D-scan the world. I may have a problem. I admit it. For starters, I'm not completely sure what I plan on doing with all these scans, but while such questions are entirely logical, they've yet to curb my enthusiasm for the device. Sense is one of those propositions that seems too good to be true: a user-friendly, (relatively) portable 3D scanner capable of capturing objects up to 10 feet by 10 feet, and at a fraction of the price of the competition.

If the product is indeed what 3D Systems claims, it could fill a major hole in the consumer 3D-printing market. In recent years, 3D-printing companies have largely focused on the printers themselves, which have gotten cheaper and easier to use. At the same time, the race to dominate the category has often caused companies to ignore the question of how those without extensive CAD experience can create 3D files in the first place. MakerBot unveiled its solution back at SXSW: the $1,400 Digitizer, a rotating, desktop scanning bed capable of capturing objects up to eight inches by eight inches. 3D Systems' Sense takes a wholly different approach: This is a $400 handheld scanner that can digitize an entire human being.

Stantt Uses 3D Body Scans to Make Shirts That Fit Perfectly

Shirts that aren’t the right size for your body don’t just look bad; they’re also uncomfortable. You can study size charts all you want, but unless you’re getting a shirt that’s made to order, chances are you won’t find one that fits you perfectly.

Cue Stantt and its approach of fitting you in one of more than fifty available sizes, all created using 3D body scan data as a reference. Matt Hornbuckle and Kirk Keel, the duo behind Stantt, hope to change how clothes are tailored, and for now they’re setting their sights on the casual button-up shirt.

The duo wants to do away with the usual S/M/L sizing most brands offer so you can find a size that actually fits your body. Each shirt is crafted by hand in the USA, and your size is determined using DataFit Technology: all you have to do is provide your chest, waist, and sleeve length measurements, and Stantt checks them against its database to find the shirt that will fit your body best.
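
Stantt hasn’t said how DataFit Technology works under the hood, but conceptually, matching three measurements against dozens of pre-defined sizes is a weighted nearest-neighbor lookup. The Python sketch below is purely illustrative; the size chart, weights and size codes are invented and are not Stantt’s data.

```python
# Hypothetical sketch of measurement-to-size matching; the size chart and
# weights are invented for illustration and are not Stantt's DataFit data.
from math import inf

# (size_code, chest, waist, sleeve length) in inches -- a tiny made-up size chart.
SIZES = [
    ("38-32-33", 38.0, 32.0, 33.0),
    ("40-34-34", 40.0, 34.0, 34.0),
    ("42-36-34", 42.0, 36.0, 34.0),
    ("44-38-35", 44.0, 38.0, 35.0),
]

def best_size(chest, waist, sleeve):
    """Return the size whose measurements sit closest to the customer's."""
    best_code, best_score = None, inf
    for code, c, w, s in SIZES:
        # Weighted squared distance; chest fit is weighted most for a button-up.
        score = 2.0 * (chest - c) ** 2 + (waist - w) ** 2 + 1.5 * (sleeve - s) ** 2
        if score < best_score:
            best_code, best_score = code, score
    return best_code

print(best_size(40.5, 34.5, 34.0))   # -> "40-34-34"
```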

Stantt shirts are currently up for funding on Kickstarter, where a minimum pledge of $78 will get you one of your very own.

VIA [ C|NET ]

Bre Pettis on the MakerBot Digitizer: we’re building an ecosystem (video)

"We get to set the standard in desktop 3D scanning," Bre Pettis says, beaming. "When we looked out at the world and saw what 3D scanners could do, we wanted to make something that could make really high quality models that you could create on your MakerBot." The CEO can't stop smiling at the close of the Digitizer's official press launch. It's the smile of a man who has just shown off a major piece of the puzzle -- an object that helps answer the question of just how, precisely, average consumers can create products to 3D print.

"We're really building out an ecosystem," he says of the scanner, which joins the Replicator 2, MakerWare software and the Thingiverse online database in the MakerBot portfolio. "The game is on, we're building a nice suite of products that work really well together." It's a pricey piece, of course, coming in at $1,400, but Pettis insists that it'll give users a much fuller experience than hacked Kinect-type solutions, thanks in large part to the Digitizer's software solution. "There are DIY options out there, but we've spent the time and energy on the software to make this a really seamless experience."

And as for a potential Replicator / Digitizer bundle deal, well, Pettis is only saying, "stay tuned."

Fuel3D brings point-and-shoot 3D scanning prototype to Kickstarter

As a seemingly endless stream of companies works to bring the world its first truly mainstream desktop 3D printer, a number of folks are attempting to bridge a fairly fundamental disconnect: how to best help the average consumer get their hands on 3D models in the first place. Databases are a decent solution -- Thingiverse has a devoted community of makers working around the clock to create cool things for us to print out. Simplified software can work, too, but that still requires some artistic talent on the part of the creator. 3D scanners seem to be the most popular solution these days, from Microsoft's Kinect to MakerBot's lazy Susan-esque Digitizer.

Fuel3D is the latest company to take its entry to Kickstarter. The handheld 3D scanner is based on a technology developed at Oxford University for medical imaging purposes. Now the company is looking to bring it to market at under $1,000, offering full-color, high-res 3D scans through simple point-and-shoot execution. Once captured, that information can be exported for things like the aforementioned 3D printing and computer modeling. The first three folks who pledge $750 will get their hands on a pre-production model, and those who pony up $990 will receive the triangular final version. The company expects to ship in May of next year -- assuming it hits that $75,000 goal, of course. After all, Fuel3D can't exactly print money -- yet.

Source: Kickstarter

3D scanning with the Smithsonian’s laser cowboys (video)

"We're not scanning every object in the collection," Adam Metallo tells me, offering up the information almost as soon as we set foot in the Smithsonian's Digitization office. It's an important piece of information he wants to make sure I have, right off the bat. It seems that, when the story of the department's 3D-scanning plans first hit the wire, a number of organizations blew the scope of the project out of proportion a bit. And while the team's project is certainly ambitious, it's not, you know, crazy. It's the work of a three-person team, still in its nascent stages, attempting to prove the value of new technologies to a 167-year-old museum affectionately known as "the nation's attic."

In the fall of 2011, Metallo and fellow Smithsonian 3D scanner Vince Rossi (a duo the institute has lovingly deemed its "laser cowboys") unpacked their equipment in Chile's Atacama Desert. "They were widening the Pan-American Highway, and in doing so, they uncovered about 40 complete whale specimens," Rossi explains. "But it might take decades for them to remove the fossils from the rock, so we were able to capture this snapshot of what that looked like in 3D." The tool of choice for the expedition was a laser arm scanner, which utilizes a process the duo compares to painting an object, moving back and forth across its surface as the device records the relative position of its axes.

Source: Facebook, Twitter