European Space Agency plans to photograph a comet from the dawn of the solar system

The European Space Agency (ESA) hopes to photograph a yet-to-be-discovered comet as it approaches Earth's orbit for the first time. To do so, it's developing Comet Interceptor, a composite spacecraft made up of three individual craft, which will separate to...

DIY 3D Printed Apple III Raspberry Pi Case: Palm-sized Flop

YouTuber Charles Mangin is a big fan of Apple’s classic computers, even the apocalypse in a box known as the Apple III. Last year, Charles designed a Raspberry Pi case based on the disastrous PC. It may not look like much, but it’s actually a physical representation of Charles’ love for Apple (and making).

Charles wanted to have the case 3D printed, but he didn’t have an Apple III on hand, so he designed the case from scratch. First he made outlines of the case using patent drawings and photos that he got online. Then he turned those outlines into 3D models, with tons of refining and adjusting in between. It took him about 20 hours of drawing, plotting and refining to make sure that his replica case was accurate. Then he uploaded his 3D models to Shapeways and ordered a print. You can skip to 14:52 in the video below to see the finished case, but you really should at least skim through the video to see Charles hard at work and play.

If only the Apple III team were as obsessive as Charles.

[via Hack A Day]

Microsoft Holoportation Augmented Reality Chat App: Hatsune, Me, You

Microsoft’s Interactive 3D Technologies group recently showed us an example of how HoloLens and 3D scanning technology could revolutionize long distance communication. The group’s Holoportation concept is a supercharged version of video chat, letting you see the other party in life-size 3D in real time.

Holoportation uses several 3D cameras to capture the subject(s), which can be still or moving. The captured 3D models are textured to resemble the subjects, then compressed and transmitted to the other party, who can view the models using the HoloLens.
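
The pipeline boils down to: capture a 3D model, texture it, compress it, ship it, decode it on the HoloLens. Here's a toy sketch of just the serialize-and-compress step in Python; all names are invented for illustration and have nothing to do with Microsoft's actual formats:

```python
import struct
import zlib

def pack_frame(vertices, texture_bytes):
    """Serialize one captured 3D frame: vertex positions plus texture data.

    vertices: list of (x, y, z) float triples; texture_bytes: raw image bytes.
    """
    payload = struct.pack(f"<I{len(vertices) * 3}f", len(vertices),
                          *[c for v in vertices for c in v]) + texture_bytes
    return zlib.compress(payload)  # shrink before sending over the network

def unpack_frame(blob):
    """Inverse of pack_frame, as run on the receiving side."""
    payload = zlib.decompress(blob)
    (count,) = struct.unpack_from("<I", payload)
    floats = struct.unpack_from(f"<{count * 3}f", payload, 4)
    vertices = [tuple(floats[i:i + 3]) for i in range(0, count * 3, 3)]
    texture_bytes = payload[4 + count * 12:]  # 3 floats x 4 bytes per vertex
    return vertices, texture_bytes
```

The real system streams dozens of textured meshes per second, so the interesting engineering is in the compression and latency, not the framing, but the round trip above is the shape of it.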

The proof-of-concept demo below shows that there is noticeable lag, but it’s still incredibly immersive. By having both parties use similar props in the same relative space, such as a chair, you can make it easier to pretend that you’re in the same location. The software also lets you record conversations and adjust the size of the 3D models when playing the recording.

I hope I live to see this become commonplace. Microsoft, you’re our only hope.

[via Gizmodo]

Computational Hydrographics Applies Patterns Precisely: Copy Paste Connoisseur

Hydrographics, or water transfer printing, lets you apply graphics to objects in one pass. It gives a uniform finish and can be used even with large objects. And watching it is a great timewaster. But it's not ideal for items with complex shapes, and it can only apply graphics with repeating patterns, because it's hard to align the film that holds the pattern with the substrate object. Until now.

Students from Columbia University and Zhejiang University presented their computational hydrographic printing system (pdf) at SIGGRAPH 2015. To make it easier to understand this breakthrough, compare the image above with the one below. The patterns on the objects above were printed using the students' setup, while the ones below were applied using traditional hydrographics. Note how the latter only has simple patterns, while the mask and the car above have a variety of details that are right where they should be – the eyes, nose, headlights, wheels, etc. – as if they were painted on.

The solution is a combination of hardware made from off-the-shelf parts and the group's custom software. The students' hydrographics machine has a depth sensor (e.g. a Kinect) that analyzes the object's shape as well as its orientation and location relative to the machine's vat. They combine this data with the group's virtual simulation of how the film holding the pattern will behave. The simulation predicts exactly how the film will stretch and distort when an object of any given shape is dipped into it.

From there, they can print a pattern that fits exactly on the substrate object (assuming the printing machine dips and holds the object in the recorded position). Because the system can record an object’s shape and print patterns with extreme precision, it can even print multiple patterns separately on the same object.
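
In spirit, the software inverts the simulated distortion: the simulation says where each surface point will meet the film, so you ink exactly those film pixels with the colors you want on those points. A toy version in Python, where a made-up `distortion_map` stands in for the group's physics simulation:

```python
def print_film(pattern, distortion_map, film_size):
    """Rasterize a desired surface pattern into film space.

    pattern: dict mapping a surface point id -> color.
    distortion_map: dict mapping surface point id -> (x, y) film pixel,
        i.e. the output of the dip simulation (fabricated here).
    film_size: (width, height) of the film image.
    """
    w, h = film_size
    film = [[None] * w for _ in range(h)]  # start with blank film
    for point, color in pattern.items():
        x, y = distortion_map[point]       # where this point meets the film
        if 0 <= x < w and 0 <= y < h:
            film[y][x] = color             # ink exactly that film pixel
    return film
```

The hard part the paper actually solves is computing that distortion map accurately, but once you have it, the printing step really is this kind of lookup.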

As noted in the video, this has a lot of applications, but one that's particularly interesting is its potential to serve as an alternative to color 3D printers. If you only want color on the surface of a printed object, this approach could be more economical and precise. Download the students' paper from Columbia University (pdf) for more on their project.

[via Wired]

Interactive 3D Installation Lets You Paint on the Artists: You Are the Canvas

New York University Interactive Telecommunications Program (ITP) students Rosalie Yu and Alon Chitayat came up with Skin Deep, an art installation that’s best described as a virtual form of body painting. They made 3D models of themselves then created a program that allowed people to draw on those models in real time.

Aside from their 3D models, Rosalie and Alon also printed pieces of paper with textures corresponding to parts of their bodies. These pieces of paper were then laid out in front of onlookers along with pens and other drawing tools, inviting them to “complete” the artists’ 3D portraits.

The two made a JavaScript program that would apply the onlookers’ drawings – captured by a webcam – onto their 3D models in real time.
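
Conceptually, each sheet of paper maps to a known region of the model's UV texture atlas, so applying a drawing is a copy into that region. A minimal Python sketch of the idea (their actual program is JavaScript, and every name here is invented):

```python
def paste_region(atlas, drawing, dest):
    """Copy a captured 2D drawing into a region of a UV texture atlas.

    atlas: 2D list of pixels (the model's texture, updated in place).
    drawing: 2D list of pixels captured from the webcam.
    dest: (row, col) top-left corner where this body part lives in the atlas.
    """
    r0, c0 = dest
    for r, row in enumerate(drawing):
        for c, pixel in enumerate(row):
            atlas[r0 + r][c0 + c] = pixel  # overwrite the mapped texels
    return atlas
```

Run that on every webcam frame and the model appears to be painted live, which is exactly the effect in the video.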

Here’s Skin Deep in action:

I think their intuitive process of turning 2D input into 3D output has applications outside of art as well. Head to Rosalie and Alon’s website for more about their project.

[via Prosthetic Knowledge]

LEGO X Converts Creations into 3D Models in Real Time: Body and Soul

Last year, LEGO launched Fusion, a line of LEGO sets that you build and then virtually import into simple games on your mobile device. LEGO X is the adult version of Fusion. Created by Gravity Research Club – the same group behind the 3D sketching project – LEGO X creates a 3D model of whatever you’re building while you’re building it.

According to Architect Magazine, “The LEGO bricks’ movements are reciprocated in real time on a screen using location-mapping and gyroscopic sensors” embedded in each LEGO X piece. You can then use the LEGO X app to refine, store or share the 3D model.

While the prototypes in the video were made with Duplo blocks, Gravity told Dezeen that the next version of LEGO X will use normal-size LEGO. Obviously this is more tedious than CAD or even pen-and-paper, but there might be times when a tangible and playful approach could benefit designers. Besides, I think that the entertainment value of LEGO X is enormous. Imagine using it to play Minecraft or Kerbal Space Program, not to mention future LEGO video games.

[via Fast Co. Design]

Quantum Human Maya Plugin Animates Static Human 3D Models: Living Statue

Last month, we learned about Live2D, an application that combines multiple illustrations into one 3D model. Here’s another upcoming time-saving application for animators. Currently in development by Quantum Matrix, Quantum Human is an Autodesk Maya plugin that can turn a static 3D model of a body into a character that you can animate, render or use for motion capture.

Quantum Human recognizes the different parts of the 3D model, segments them properly, then adds a skeleton, a control rig, a muscular system and much more. It can even adapt assets such as clothing to fit the resulting character. Since it’s (at least initially) meant for professional use, the plugin has a ton of options and parameters, but it will also let you save batches of settings. In fact, it has a one-click conversion process out of the box:
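
Auto-rigging in miniature: once the mesh is segmented into named parts, each part gets a joint, and joints are parented into a hierarchy so a rig can rotate them. This Python sketch is my own simplification, not anything like the plugin's real internals:

```python
def build_skeleton(segments, parents):
    """Attach one joint per segmented body part and link the hierarchy.

    segments: dict of part name -> centroid (x, y, z) from segmentation.
    parents: dict of child part -> parent part; roots are absent.
    Each joint records its position, parent, and offset from the parent
    joint (the "bone" a control rig would rotate around).
    """
    joints = {}
    for name, pos in segments.items():
        parent = parents.get(name)
        if parent is None:
            offset = (0.0, 0.0, 0.0)  # root joint, no bone
        else:
            px, py, pz = segments[parent]
            offset = (pos[0] - px, pos[1] - py, pos[2] - pz)
        joints[name] = {"pos": pos, "parent": parent, "offset": offset}
    return joints
```

The hard, valuable part Quantum Human automates is the segmentation and the muscle/skin weighting; the skeleton assembly above is the easy last step.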

The software’s creator and Quantum Matrix founder Kwai Bun made this demo short film to demonstrate Quantum Human’s capabilities. Kwai made the entire demo within six man-weeks, and other than the hair and clothes, the main character’s movement abilities were built using Quantum Human.

Quantum Human will be released later this year under various licensing options. Quantum Matrix promises that future versions will not require Maya. Animate a browser and head to Quantum Human’s website for more information, images and demo videos of the application.

[via Prosthetic Knowledge]

Live2D Euclid Combines Illustrations into a 3D Model: Sketchune Miku

3D animations based on illustrations sacrifice a significant amount of quality in their transition from 2D. You can see it in games like Ni No Kuni, Valkyria Chronicles and the Naruto Ultimate Ninja Storm series. Their models look great from afar, but up close some areas look flat. A company called Live2D aims to eliminate that sacrifice with its Euclid technology.

Live2D’s Euclid will allow animators to combine 2D illustrations drawn from different angles into a single 3D model, losing none of the details in the process. Each illustration can be deformed independently, which should result in more subtle and varied animations. Animators will also be able to combine a Euclid 3D model with a 3D model made with other methods to create a single 3D model.
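
At its crudest, a view-dependent system like this has to decide which source illustration drives the model for the current camera angle. This toy selector is my own illustration, not Live2D's algorithm; the only subtlety is handling the 360° wraparound when comparing angles:

```python
def pick_illustration(view_angle, illustrations):
    """Pick the illustration drawn from the angle nearest the camera.

    view_angle: current camera angle in degrees.
    illustrations: dict of angle (degrees) -> artwork id.
    """
    def angular_distance(angle):
        # signed difference wrapped into [-180, 180), then magnitude
        return abs((angle - view_angle + 180) % 360 - 180)
    return min(illustrations.items(), key=lambda kv: angular_distance(kv[0]))[1]
```

Euclid presumably blends and deforms between neighboring illustrations rather than snapping like this, which is what keeps the hand-drawn detail intact from every angle.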

The girl’s exploded view creeped me out. Head to Live2D’s website for more on Euclid.

[via DigInfo via Siliconera]

Exo 3D Printed Titanium Leg Concept: The Six Hundred Dollar Man

Prosthetic legs typically go for several thousand dollars and stick out because of their robotic appearance. Industrial design student William Root wanted to address both of those issues with his Exo concept. It’s a prosthetic leg that takes advantage of modern 3D modeling and printing technology.

In theory, production of the Exo would start out with a 3D scan of both the wearer’s remaining leg and the stump on the damaged leg to ensure a perfect fit. For the latter, William points to MIT’s FitSocket technology, which analyzes the “properties of soft tissues in human limbs” to create a personalized prosthetic attachment model in “minutes instead of months.”

This is in contrast to current methods of making molds out of the wearer’s stump, and then casting, forming, fitting and making adjustments, a process that sometimes has to be done multiple times to get the fit right.

William thinks that Exo should be produced out of titanium powder using laser sintering. He chose a wireframe design not only to reduce material and weight, but also because it looks beautiful.

Head to William’s Behance page to learn more about his concept. I wouldn’t be surprised if the Exo turns out to have biomechanical or medical flaws, but the idea of using 3D modeling and printing for prosthetics has already been proven at a small scale with prosthetic hands, and could be worth exploring further.

[via Boing Boing]

Metal Gear Solid Virtual Diorama: Solid Snake, REX, Y & Z

While most 3D models on Sketchfab are of characters or objects, member Glenatron proves that it can be a great platform to showcase dioramas as well. He recently shared his cute interactive homage to Metal Gear Solid, which depicts scenes from the PlayStation game.

Glenatron’s Chibi Gear Solid features multiple tiny Solid Snakes on a tiny Shadow Moses Island. One Snake is calling Roy Campbell on the CODEC, one is fighting Metal Gear REX with Gray Fox and so on. Dive into the diorama below:

It’s like MGS crossed with Captain Toad: Treasure Tracker. I’d play the heck out of that. It could be a sequel to VR Missions. Now I’m psyched for a non-existent game.

[via Sketchfab via Kotaku]