Harry Potter fans will have their mouths wide open at the next big set revealed by the LEGO Group. Yes, the Sorting Hat, which assigns each Hogwarts student to one of the four houses (Gryffindor, Slytherin, Ravenclaw, or Hufflepuff), now gets a dedicated LEGO version.
This detailed set, slated for a March 1, 2024 release, consists of 561 pieces in total, each one coming together to create the intricate build. It’s the first time in more than a decade that LEGO has incorporated a sound box in one of its Harry Potter sets. The Talking Sorting Hat will carry a $100 price tag when it finally hits the shelves, and eager Harry Potter fans can pre-order it right away!
The hidden sound box speaks randomized English phrases that we are all familiar with: the magical verses that assign the person holding the tip (or wearing the hat) to one of the four Hogwarts houses. Once the 9.5-inch-tall, 7.5-inch-diameter LEGO set is fully assembled, it can be displayed on a stand bearing the Gryffindor, Slytherin, Hufflepuff, and Ravenclaw house symbols all around. The tip of the hat and the eyebrows are both movable – in fact, pressing the hat’s tip opens the mouth, sways the eyebrows, and plays a random phrase from the 31 available options.
As a generous bonus, the Talking Sorting Hat comes with an exclusive Harry Potter minifigure wearing a miniature version of the Sorting Hat and looking ever so cute. The set uses printed bricks for the house symbols, while the other patches are rendered with stickers.
While I’m basically a digital person, I turn analog when it comes to my journaling habit. This means I have a lot of tools like notebooks, stickers, washi tapes, and other ephemera to help me journal. But probably my most important “weapons” are my pens. As someone who likes colorful things, I collect pens in different colors (both the ink and the pen’s actual body) that I can use when I write in my various journals. So whenever I see a new kind of pen, whether it’s notable for its design or its features, I pay attention.
Designer: Seung-Wan Nam
This concept for a pen called Bloomstick is based on the idea that writing down your dreams is an important part of making them come true. So the pen can metaphorically help your dreams “bloom like flowers” when you write them down on paper. The product’s tagline is “click to bloom your dream”. It is basically a pen with a silicone-covered button; when you press it, the top opens into a flower-like shape, turning the pen into a blooming instrument.
The product renders show the pens in different colors like green, blue, and pink. The flower part of the pen is white, while the “bud” part seems to be a different color that matches the main silicone body of the pen. When closed, it looks like just any ordinary pen, and you’ll still be able to use it, of course, just without its blooming design. There doesn’t seem to be any function beyond writing and looking pretty.
As someone who collects pens and likes flowery, pretty things, this is something I’d probably buy if I saw it in a stationery store. And if it could actually make my handwriting look nicer or make my dreams come true, I’d order it as soon as it hits the market.
Denis Villeneuve’s upcoming movie Dune: Part Two is creating quite a buzz before its March 1 release in the US. Hamilton Watch, which has created watches for more than 500 movies since 1930, wasn’t going to let this opportunity go. The result is a collaboration with Legendary Entertainment and Warner Bros to create a duo of exclusive Ventura timepieces inspired by the epic space opera’s prop watch. One of them, the Desert Watch, carries a minimalist charm, while the Edge Dune Edition has a sci-fi character to it.
We took a fancy to the latter, so we’ll be talking about this sci-fi timepiece here. Some things are common to both watches, though, including the commemorative ellipsoid packaging that represents the psychedelic spice Melange, with water depicted by the blue section. At Denis Villeneuve’s special request, the timepiece was created in close association with the film’s prop master, Doug Harlocker, and it turned out to be as unique as it gets. It defies traditional watch norms, resulting in a hardwearing wrist gadget well-suited for the Fremen.
Designer: Hamilton Watch
The Hamilton Ventura Edge Dune Limited Edition has a black PVD-coated case measuring 51mm x 47.2mm. The 100-meter water-resistant watch mimics the 3D relief elements of the original timepiece depicted in the movie. I particularly like the intricate circuit board effect and the amazing texture complementing it. On the face, it displays the time in cool blue digits, with a quartz movement on the inside.
The time is read in a vertical formation, as faint blue text that lights up at the push of a button. The blue ring on the watch also illuminates to complete the look. When the lights go out, they peak slightly in brightness and then fade into the dark, just like the original prop timepiece. The Hamilton Ventura Edge Dune watch will be limited to just 2,000 units and carries a price tag of $2,500.
If you’ve browsed through the internet long enough, you’ve seen videos of Taylor Swift or Coldplay concerts, with the entire audience lighting up thanks to LED wristbands that respond to the concert’s lighting and music setup. Well, Dynavisual wants to take that technology a few steps further. The company is working on a set of flexible high-visibility LED ‘billboards’ that can be worn on caps, clothes, bags, etc. The LEDs can be customized to display messages, stats, and brand logos (pretty standard stuff), but what really makes them exciting is their ability to be paired together by the thousands, potentially turning an entire crowd into a massive display. Dubbed ‘Swarm Technology’, Dynavisual paints an incredibly exciting future where massive arenas come to life at concerts or sports games, displaying images, logos, or even massive graphics – just imagine the entire stadium displaying the word ‘GOAL’ when Messi scores. The best part? These Dynavisual displays could then go back to being individual units once the game is over, with each person carrying their display back home and using it as they normally would.
Flexible displays have been around for over a decade now, but calling the Dynavisual Pad a flexible display is a bit of a stretch. In theory, it passes the bar, but practically, you’re also looking at a pad that has just 512 pixels, arranged in a 16×32 array. Standard OLED displays like the ones in your phone have millions of pixels per display, but the folks at Dynavisual don’t want you to make that comparison. If looked at independently, the Dynavisual Pad is an entirely different product. It’s designed with a robust construction that can be worn across your body or on your head, has multi-directional flexibility across both X and Y axes, and those individual pixels may give the display an incredibly low resolution, but they’re exceedingly bright. For comparison, the iPhone has a peak outdoor brightness of 2000 nits – the Dynavisual Pad outputs a comfortable 6000 nits, making it up to 3x brighter than traditional phone displays. The reason is simple – the low resolution and high brightness aid visibility over distances of multiple feet, whereas a smartphone display is practically unusable beyond 6-7 feet.
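To put the numbers above in perspective, here is a minimal sketch of how a 16×32 LED matrix like the Pad’s might be addressed, along with the brightness arithmetic behind the “3x brighter” claim. The row-major addressing scheme and all function names are illustrative assumptions, not Dynavisual’s actual firmware:

```python
# Hypothetical sketch: addressing a 16x32 LED matrix of the kind
# described for the Dynavisual Pad. The dimensions and nit figures
# come from the article; the row-major layout is an assumption.

ROWS, COLS = 16, 32  # 512 pixels total

def led_index(row, col):
    """Map a (row, col) coordinate to a flat LED index, row-major."""
    if not (0 <= row < ROWS and 0 <= col < COLS):
        raise ValueError("coordinate out of range")
    return row * COLS + col

# Brightness comparison from the article: a 6000-nit Pad vs a
# 2000-nit peak-brightness phone display.
PAD_NITS, PHONE_NITS = 6000, 2000
brightness_ratio = PAD_NITS / PHONE_NITS  # 3.0, i.e. "up to 3x brighter"

print(ROWS * COLS)        # 512
print(led_index(15, 31))  # 511, the last LED in the array
print(brightness_ratio)   # 3.0
```

The trade-off the article describes falls out of exactly this math: 512 very bright pixels are legible across a stadium, while millions of dim ones are not.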
At MWC 2024, the Dynavisual team showcased just how capable its Pad is. Each Pad is roughly the size of a large phone, has visibility in bright daylight, and boasts a flexible design thanks to ridges in the back that allow it to easily bend in multiple directions, making it perfect for wearing on your person. The device is made to be lightweight, allowing you to wear it on a cap, in a hoodie/jacket, or even on a bag, and it has its own built-in battery. Details on the battery life were unclear, but given that the device has just over 500 LEDs to power, it’s much more power-efficient than your smartphone.
The applications for the Dynavisual Pad are perhaps the most exciting bit as far as the product goes. There’s an obvious use-case in branding/marketing, with the pad accepting logos, messages, and branding elements, but Dynavisual sees the Pad as being a great communication element beyond the narrow marketing approach. It could be used by safety personnel to help deliver messages/guidance, or it could even be used in a personal capacity, perhaps by a cyclist looking to let drivers behind know whether they’re turning left or right (or braking). To that end, the Dynavisual Pad is a pretty smart device. It packs a whole slew of sensors and radios, including GPS, Bluetooth, WiFi, infrared, and even voice input.
The 6000-nit LEDs are bright enough to shine through thin fabrics too, allowing you to easily conceal them within clothing.
However, things get truly exciting when multiple Dynavisual Pads come together at the same venue. The company has developed a unique protocol that allows multiple Dynavisual Pads to sync together, becoming a swarm or hive-mind. The technology is best displayed within stadiums and arenas, as organizers can command multiple Dynavisual Pads together, turning them into a massive intelligent display that relies on hundreds if not thousands of pads to work as individual pixels. Imagine an entire stadium audience erupting into colors and displaying the score every time there’s a goal (or a touchdown), or lighting up with brand messages during an ad break. Current LED bands used in concerts rely on a combination of WiFi and radio frequencies, but Dynavisual’s Swarm Technology operates differently. For starters, with existing solutions, LED bands are owned by organizers, distributed to audiences at venues, and collected once the event is over. That isn’t a concern with the Dynavisual Pad, as users can bring their own Pad devices into a venue and have them automatically sync up with the event’s light and sound system. Moreover, while current LED bands can only display swathes of colors and vague shapes, Dynavisual’s team has managed to figure out how to display images – a feat that’s incredibly tricky because organizers need to know where every single Pad device is located, and send specific signals to each one.
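The “crowd as one display” idea described above can be sketched in a few lines: if the venue knows roughly where each pad sits, it can scale a target image onto the seating grid and send each pad the single color it should show. Everything here, the function name, the seat-grid mapping, the toy image, is an illustrative assumption, not Dynavisual’s actual protocol:

```python
# Hypothetical sketch of swarm-style image display: each pad acts as
# one pixel of a stadium-sized picture. The venue maps a pad's seat
# position onto a target image and sends it one color to display.

def color_for_pad(seat_row, seat_col, stadium_rows, stadium_cols, image):
    """Pick the image pixel a pad at (seat_row, seat_col) should show.

    `image` is a 2D list of (r, g, b) tuples; the seating grid is
    scaled onto the image so the whole crowd reproduces the picture.
    """
    img_rows, img_cols = len(image), len(image[0])
    r = seat_row * img_rows // stadium_rows
    c = seat_col * img_cols // stadium_cols
    return image[r][c]

# Toy example: a 2x2 "image" spread across a 100x200-seat stadium.
image = [[(255, 0, 0), (0, 255, 0)],
         [(0, 0, 255), (255, 255, 0)]]

# A pad in the top-left quadrant gets the top-left image pixel.
print(color_for_pad(10, 20, 100, 200, image))   # (255, 0, 0)
# A pad in the bottom-right quadrant gets the bottom-right pixel.
print(color_for_pad(90, 180, 100, 200, image))  # (255, 255, 0)
```

This is also why the article calls image display the tricky part: the scheme only works if the organizer has a position for every single pad, which audience-owned devices make much harder than handed-out wristbands.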
The application of that widespread swarm technology, however, doesn’t translate well to a hands-on demo with just one or two units on display. It’s also a challenge to explain the product to people attending the Mobile World Congress, because there’s a knee-jerk reaction to compare it to a mobile phone – which the Dynavisual Pad is NOT. That being said, the swarm technology looks promising, and the visuals quite literally paint a vivid picture. With massive concerts, tournaments, and large-scale sporting events like the Olympics, the Dynavisual Pad has an audience practically ripe for the picking!
Showcasing its tech at the Mobile World Congress, Everysight is riding the new AR (or should we call it Spatial?) wave with the Maverick: small, sleek, and stylish glasses that challenge today’s bulky mixed reality glasses and headsets with something so close to regular glasses, you wouldn’t be able to tell the difference. Designed with a projected display that overlays data and metrics on the existing world, the Maverick uses an entire slew of sensors to track position, orientation, and head tilt, ensuring that digital elements remain in your line of sight (LOS) and correctly oriented. In fact, the Maverick even bagged multiple awards last year, including the iF Design Award and the Red Dot Award.
Unlike most AR headsets that can make you look a little dystopian when walking down the streets or sitting in a subway (we’re looking at you, Vision Pro wearers), the Maverick stands at the intersection of great tech and fashion. Weighing in at under 47 grams, these glasses boast a sporty, ergonomic frame that promises comfort for all-day wear, a crucial consideration for devices intended to be part of our daily lives. This comfort does not come at the expense of durability or style, making them a versatile accessory suitable for any occasion.
AR glasses are only as good as their displays (something that most Vision Pro users will swiftly point out), which is why the Maverick impresses with its crisp, high-contrast visuals despite its tiny package. Utilizing a Sony color microOLED display, it delivers stunning visuals characterized by vibrant colors and sharp details. The high-brightness display guarantees an optimal viewing experience in both indoor and outdoor settings, a testament to the glasses’ adaptability and user-centric design.
Ease of use is at the forefront of the Maverick design, with an intuitive interface that allows users to navigate and control features through simple gestures. This user-friendly approach is further enhanced by advanced sensors like a 3D accelerometer, gyro, and magnetometer, which provide accurate line-of-sight tracking for an immersive augmented reality experience.
Battery life is a perennial concern for wearable technologies, and here, Maverick impresses with over 8 hours of continuous operation. This endurance is complemented by efficient power management, ensuring that the glasses support a day’s worth of activities without needing a recharge. Such longevity is essential for users who demand reliability from their smart devices.
The Maverick glasses are also designed with inclusivity in mind. They offer an RX solution with personalized lenses tailored to individual prescriptions, ensuring that users with varying visual needs can enjoy the benefits of smart eyewear without compromise.
Connectivity is robust, with Bluetooth 5.2 ensuring seamless pairing with a wide range of devices, including iOS and Android smartphones, as well as Apple Watch and Android Wear. This connectivity underpins the Maverick’s versatility, making it a central hub for notifications and digital interactions on the go.
In the box, users will find everything needed to start their journey with Maverick: tinted removable visors, a charging cable, a carrying case, a pouch, a cleaning cloth, and interchangeable nose pieces. Everysight is selling a developer edition of the Maverick for $399, although it’s unclear when the public rollout will begin, and what the price will be for regular consumers.
Everysight’s Maverick glasses represent a significant advancement in smart eyewear, proving that it’s possible to stay stylish while benefiting from the latest in wearable technology. They set a new benchmark for combining practicality with elegance, ensuring users can stay connected in a visually compelling, productive, and convenient manner. If these glasses could brew coffee, we might never find a reason to take them off.
Our Internet needs are becoming more complicated even at home. Multiple devices ranging from smartphones to smart appliances compete for bandwidth, while different services like gaming and streaming demand more data than, say, a smart thermostat. The simplistic routers of yesteryear are no longer sufficient to face the challenges of modern lifestyles, but as these boxes become more sophisticated, their presence also becomes more obnoxious. The latest and greatest routers seem to want to be seen as powerful monstrosities rather than helpful tools that make our lives easier. Completely bucking the trend, D-LINK launched its AQUILA PRO AI smart mesh routers, which finally look more at home in your home, masquerading as a piece of sculptural art that hides powerful technology inside its graceful curves.
Granted, those antennas on your router aren’t just for show, but that doesn’t always mean they need to be visible, especially with today’s technologies. It might simply be a matter of pride that some of these literally black boxes show off the number of spikes they have as if those indicate how much power they actually possess. The result is a design that isn’t just space-inefficient but also unaesthetic to most people.
In contrast, you won’t find sharp points or even sharp edges on the D-LINK AQUILA PRO AI (models M30 and M60). What you will find instead is an elegant object that belies its superior technology, looking more like a piece of decoration rather than a router. Its name and unique shape, whose ends curl up and inward, are inspired by the Aquila constellation and the eagle, a majestic bird that exudes both power and grace. That association goes even beyond the general shape of the device, with feather-like patterns on the router’s ventilation.
The D-LINK AQUILA PRO AI isn’t all looks, of course, as it also boasts the latest connectivity technologies, especially Wi-Fi 6. And since it’s a mesh router, you can combine multiple units and spread them around your house to get rid of dead zones and ensure fast, stable, and uninterrupted connections. It also comes with the latest privacy and security protections, plus the conveniences offered by smart home platforms and mobile app control.
The D-LINK AQUILA PRO AI’s ground-breaking design doesn’t stop there, either. It also tries to give back to the planet by using post-consumer recycled (PCR) materials for its housing, reducing its negative impact on the environment. This smart mesh router is stunning, beautiful proof that power doesn’t have to look harsh and cold. After all, there is both power and elegance in the form of an eagle taking flight.
Smart glasses, in contrast to AR headsets and visors, aim for a design that ideally should be indistinguishable from regular glasses. With today’s technologies and knowledge, however, that’s not easily possible, especially when you need to add powerful computing hardware to sophisticated optics. That’s especially the case when you need to offer some kind of smart assistant functionality, especially voice and speech recognition. In the past, you had to settle for rough translations and sometimes misinterpretations; comical but frustrating nonetheless. That definitely sounds like a job for AI, and that’s exactly what OPPO is bringing to the table, or rather to your eyes, with the newest iteration of its lightweight and discreet “assisted Reality” glasses that take a focused approach to wearables.
AI is still the hot thing in tech today, despite the bad publicity that misuse of the technology brings. Today’s AI models happen to be great at processing human language, both written and spoken, and they can now run on the device itself with very little power, making them perfect for very small devices, including smart glasses. In its third iteration, the OPPO Air Glass 3 prototype harnesses the power of AI, specifically OPPO’s self-trained language model AndesGPT, to deliver a more natural way to talk to your glasses and get your job done.
AI might be the technical highlight of the new OPPO Air Glass 3, but its winning feature is going to be its design. OPPO is laying claim to the title of the world’s lightest binocular full-color glasses, and at only 50g, the claim does have merit. It looks just like a pair of regular spectacles with very thick frames, nothing like those complicated and heavy mixed reality glasses. Despite that lightweight design, the Air Glass 3 still boasts a bright 1,000-nit display delivered by a tiny Spark micro projector, ensuring you can clearly see virtual information even in bright environments. And with an ultra-thin waveguide, you don’t get the rainbow-like patterns often seen on optical see-through displays like these.
The OPPO Air Glass 3 manages to offer this more comfortable design thanks to its more focused functions. Rather than trying to cast its net wide with augmented reality, OPPO is instead focusing on “assisted reality” that emphasizes productivity over entertainment. You’ll still be able to see images if you want and control music playback, but the information that’s displayed in front of your eyes is limited to things like navigation, timers, translations, or even a teleprompter. In other words, it’s a sleek way to have all the important information you need right in front of you instead of having to fish out your phone from your pocket and get distracted in the process.
Of course, that means it will need to connect to an external device, particularly an OPPO smartphone. The Air Glass mobile app provides that connection, tapping OPPO’s AndesGPT to ensure you’re getting the best performance possible without weighing your head down. OPPO is also laying the groundwork for more AI-enhanced features and experiences by investing heavily in its own AI center, in the hopes of empowering all its products, especially its smartphones, with these features.
Yet another notable game studio is laying off a significant chunk of its workforce. Supermassive Games, the developer behind interactive horror titles Until Dawn and The Quarry, is cutting around 90 jobs, according to Bloomberg. That's nearly a third of the studio's more than 300 employees.
Supermassive confirmed in a statement that the studio will reorganize. "As a result, we are entering into a period of consultation, which we anticipate will result in the loss of some of our colleagues," it said. "This is not a decision that's been taken lightly, with many efforts made to avoid this outcome."
Supermassive notes that it's not safe from the "significant challenges" facing the games industry. More than 6,000 workers in the industry have lost their jobs since the beginning of the year and we're not even into March yet.
Meanwhile, indie studio Die Gute Fabrik has paused production amid funding difficulties. The developer of Saltsea Chronicles and Sportsfriends will use its remaining funds to give staff a month of paid time "to catch their breaths" while they look for new jobs. The studio is still seeking backers to help it resume production and hopes to bring back current team members in the future. However, it notes that "the publishing and investment scene is so tough for companies and projects of our scale right now it's made it extremely difficult to secure funding for our next project without a gap in income."
We’re sad to share that we're halting production at @gutefabrik due to the challenging funding & investment scene in games right now. We downed tools earlier this month & have been doing our best to support the team who'll be looking for work from mid-March.
Did you think you’ve seen the last of Doom running on random stuff? Think again. Landscaping technology company Husqvarna just announced that the game will run on some of its robot lawn mowers. So you can mow down hellspawn just ahead of mowing down errant blades of grass.
Here’s the deal. It’ll only be available on the company’s Automower Nera robotic lawn mower models, beginning this April. Once downloaded, you play the game via the lawn mower’s onboard display. Rotating the control knob turns Doomguy left and right and pressing the knob makes you shoot. Holding down the start button initiates forward movement. It’s Doom. You know the drill.
There are some caveats here. First of all, you have to sign up to download the software by September 9. It won’t be available for US residents, despite Husqvarna making a concerted effort to sell more robot lawn mowers in the United States. Finally, this is just the game running on the onboard display. It’s not as if the mower turns your yard into an actual level, with unwanted greenery representing demonic enemies. Still, it’s always nice to see Doom continue to do its thing.
Just ahead of Mobile World Congress, NVIDIA unveiled its latest laptop GPUs and, what a surprise, they’re designed largely to assist with AI processing. The RTX 500 and 1000 Ada Generation graphics cards are primarily for thin and light laptops. While they won’t offer as much TOPS AI performance as current higher-end mobile GPUs, they could be a handy option for on-the-go AI processing for the likes of researchers, content creators and video editors. It's worth noting they're workstation GPUs rather than ones designed for gaming.
NVIDIA says the GPUs, which are based on the Ada Lovelace architecture, offer up to twice the ray-tracing performance of previous-gen GPUs (they employ third-gen ray-tracing cores). Fourth-gen Tensor Cores, meanwhile, deliver up to twice the throughput of previous GPUs, according to NVIDIA. The company says this helps with “accelerating deep learning training, inferencing and AI-based creative workloads.”
The RTX 500 has 4GB of dedicated memory, while the RTX 1000 has 6GB. NVIDIA says they deliver up to 154 and 193 TOPS of AI performance, respectively. Compared with a CPU-only AI configuration, the RTX 500 is slated to provide up to three times faster AI-powered photo editing, as much as 10 times the graphics performance for 3D rendering and up to 14 times the generative AI performance for various models.
The GPUs also support DLSS 3, the company’s upscaling tech. In addition, an eighth-gen encoder includes AV1 support. NVIDIA says this video codec is “up to 40 percent more efficient than H.264, enabling new possibilities for broadcasting, streaming and video calling.”
If you’re interested in picking up a laptop with an RTX 500 or 1000 GPU, you won’t have to wait long. They’ll debut this spring in laptops from the likes of Dell, HP, Lenovo and MSI.
This article originally appeared on Engadget at https://www.engadget.com/nvidias-rtx-500-and-1000-ada-gpus-bring-more-ai-smarts-to-thin-and-light-workstations-161517977.html?src=rss