
Apple Researchers Built A Pixar-Style Robot Lamp That Moves And Emotes Like A Living Creature

Apple's latest experiment in robotics feels like a love letter to Pixar's Luxo Jr. The tiny, energetic desk lamp that hops onto the screen before every Pixar film has always been more than just a mascot; it's a symbol of character-driven storytelling. Now, Apple's researchers have taken that same playful, emotive energy and brought it into a real-world robotic lamp, designed not just to function but to interact, express, and even entertain. Researchers at Apple presented a paper titled 'ELEGNT: Expressive and Functional Movement Design for Non-Anthropomorphic Robot', along with a comprehensive video of the lamp in action.

There’s a poetic connection here. Steve Jobs, the man who shaped Apple’s design philosophy, was also the visionary who helped Pixar become an animation powerhouse. The DNA of both companies has always been about creating technology that feels approachable—whether through the friendly curves of an iPhone or the lifelike expressions of an animated toy. Apple’s robotic lamp embodies that same philosophy, proving that robots don’t need to be powerful to be meaningful. They just need to be relatable.

Designers: Apple Machine Learning Research Division

Developed by Apple’s Machine Learning Research division, this robotic lamp is more than an automated light source. It gestures, reacts, and even sulks when it’s left behind. A demonstration video shows it performing tasks in two modes: “Functional,” where it simply executes commands, and “Expressive,” where it adds personality to its movements. The difference is striking. Instead of cold efficiency, the expressive mode makes interactions feel natural—like the lamp is part of the room’s social fabric, not just an object within it.
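
To make the distinction concrete, here is a toy sketch of the general idea rather than Apple's implementation: a purely functional planner interpolates the lamp's head straight to its target, while an expressive layer adds small secondary motions, like a glance or a moment of hesitation, on top of the same trajectory. The Pose fields, gains, and curve shapes below are all invented for illustration.

```python
import math
from dataclasses import dataclass

@dataclass
class Pose:
    pan: float   # head rotation in degrees (illustrative)
    tilt: float  # head elevation in degrees (illustrative)

def functional_path(start: Pose, goal: Pose, steps: int = 50) -> list[Pose]:
    """Shortest, most efficient interpolation from start to goal."""
    return [
        Pose(
            pan=start.pan + (goal.pan - start.pan) * t / steps,
            tilt=start.tilt + (goal.tilt - start.tilt) * t / steps,
        )
        for t in range(steps + 1)
    ]

def expressive_path(start: Pose, goal: Pose, steps: int = 50) -> list[Pose]:
    """Same goal pose, but with an anticipatory 'glance' layered on top."""
    path = functional_path(start, goal, steps)
    for i, pose in enumerate(path):
        phase = i / steps
        # A small pan oscillation early in the motion reads as curiosity
        # and fades out as the lamp commits to its target.
        pose.pan += 8.0 * math.sin(2 * math.pi * phase) * (1 - phase)
        # A slight upward tilt at the start mimics "looking before moving".
        pose.tilt += 5.0 * (1 - phase) ** 2
    return path
```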

In one scene, the lamp hears music and starts swaying, an irresistible display of curiosity. In another, it glances outside before describing the weather, as if pausing to check for itself. When it reminds a user to drink water, it nudges the glass forward—not as a command, but as a gentle encouragement. These small but thoughtful gestures tap into something deeply human: the way we naturally ascribe personality to objects that behave in familiar ways.

This is why anthropomorphism in robotics matters. People don't just want machines that work; they want machines they can connect with. A robot that can convey joy, hesitation, or even mild disappointment is far more engaging than one that simply executes tasks. It's a lesson we've seen play out in animated films for decades, and one that robotics engineers are beginning to embrace. In a way, it also helps shed the impression that robots are scary (Skynet, Terminator, Transformers, Ultron) by leaning into gentler, more humane characteristics instead.

Apple’s research aligns with earlier reports from Mark Gurman suggesting the company is developing a home robot with an articulating arm and an iPad-like interface. Speculated to launch by 2026 or 2027, it could integrate with smart home systems and even act as a companion device. If Apple is serious about bringing robotics into consumer spaces, this expressive lamp could be a glimpse of what’s to come.

For now, this experiment serves as a reminder that technology doesn’t have to be rigid or clinical to be useful. The best machines aren’t just the ones that perform tasks efficiently—they’re the ones that make us feel something. And if a desk lamp can make you smile just by hanging its head in disappointment, Apple might be onto something special. You can read the entire research paper on Apple’s website here.

The post Apple Researchers Built A Pixar-Style Robot Lamp That Moves And Emotes Like A Living Creature first appeared on Yanko Design.

If Framework Designed A Drone: Meet the Modular Drone Concept with Fully Upgradable Components

Here's my hot take for 2025: technology that cannot be upgraded is genuinely consumer-unfriendly. Framework proved it was possible by designing a sleek laptop with fully upgradable components, and most gaming PCs are entirely upgradable too, so why not phones? Why not tablets? And why not drones?

Drones are devices you probably buy once or twice in your lifetime. Nobody buys a new drone every two years; they use the one they have for as long as possible before upgrading (and that's only if they need to upgrade), which really means you're stuck with aging tech for a fairly long time. To combat this, Ethan White designed the ARK, a modular drone with an architecture built around removable and upgradable components. Need a new battery? Swap it out. Want a better camera lens? Swap the older one for a newer module.

Designer: Ethan White

“Traditional drones require complete hardware changes or airframe redesigns to perform different roles. The ARK, however, offers an integrated solution with the simple act of swapping module pack,” says Ethan. Although the name references Noah's Ark, the drone is really a Ship of Theseus in practice: the thought experiment about a vessel that stays the same ship even as its parts are gradually replaced over time.

The way the ARK is designed balances purposeful bulkiness with aerodynamics. Laptops can afford to be sleek for cosmetic reasons, but drones need to shed every single ounce they can for efficiency; a heavy or bulky drone can't fly as well as a lithe, aerodynamic one. That makes a drone that's easy to disassemble, modular, and upgradable a unique design challenge.

Components can’t be interwoven with each other inside a single outer body. The battery needs to exist independent of the PCB. The motors, sensors, cameras, every element has to be positioned very thoughtfully, so that they can be individually removed and replaced.

To that end, the ARK has a remarkable design, featuring components that interlock when in use and separate when you need to perform a swap, all while still making sure you've got a drone that's portable, foldable, and aerodynamic. The modularity also means you can purpose-build your drone based on your needs. Want something for entertainment? Choose a basic package. Want a multimedia beast? Upgrade your camera. Need to record at night? Swap the daytime camera for a module that supports night vision. Want better range? Add better antennas on top. You can build your drone with precise intent, just like you would your PC.

The drone features upgradable PCBs, cameras, propellers/motors, battery packs, and even other components like anti-collision sensors. Although the ARK is still conceptual, Ethan is working on a proof of concept and says he's aiming for IP43 water and dust resistance, along with a 30-minute flight time. That might sound dull on paper, but I'd choose 30 minutes of flight with an absolutely incredible camera lens and sensor over 50-60 minutes with a fairly basic lens array. Plus, things will only get better with time, and as a consumer, you directly benefit from it.

The post If Framework Designed A Drone: Meet the Modular Drone Concept with Fully Upgradable Components first appeared on Yanko Design.

Modified Roomba becomes a smart robot for dispensing doggie treats

Robot vacuum cleaners are so advanced these days that some of them barely need human intervention. Of course, that wasn't always the case in the earliest days of this market segment, with the first generations relying on what are, by today's standards, very basic technologies and software. Although its name has become synonymous with robot vacs, the Roomba itself has all but faded into the background.

Some, however, have found rather creative uses for older models with a bit of hacking and ingenuity. By giving a very old rolling robot a better brain and, quite literally, space-grade software, this mod turns a cleaning machine into a mess-making one, though the mess is only a side effect. Instead of picking up dirt or even clothes, the "Space Vacuum" drops pieces of food for your canine companion to munch on whenever it's near.

Designer: Joaquim Silveira

Most of us probably have a pile of unused and abandoned electronics that are still somewhat functional except for one broken or missing part. We don't have much practical use for them, so they just end up gathering dust and taking up space, sometimes forgotten until the next spring cleaning. A few people, however, have the fortunate skill to bring these machines back to life, though sometimes with a function quite different from the original.

This Roomba, for example, once cleaned up dirty floors in a previous lifetime. But with some clever use of off-the-shelf hardware like an Arduino and some skilled DIY electronics, it has been given a new lease on life as a dog food dispenser. Ironically, it now does the opposite of cleaning the floor and instead makes a mess, presuming the canine doesn’t gobble up the treats first.

What the Space Vacuum basically does is detect the presence of a dog, differentiate it from other living critters in the house, and drop dog food from a paper cup that has its bottom partially cut out. Curiously enough, the software used for this project is the F Prime (F´) flight software from NASA's Jet Propulsion Laboratory (JPL), used for space missions and for controlling drones, which sounds like overkill for a food-dispensing robot. That, however, gives it the right to call itself a "Space Vacuum," though the vacuum part is now questionable.
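
The control logic is simple in principle: watch the camera, and tip the cup only when the detected animal is a dog. The sketch below is a hypothetical reconstruction of that loop in Python, not the project's actual F Prime code; the callables, confidence threshold, and cooldown are all placeholders.

```python
import time
from typing import Callable, Tuple

TREAT_COOLDOWN_S = 60  # don't flood the floor with kibble (illustrative value)

def dispense_loop(
    detect: Callable[[], Tuple[str, float]],   # returns (label, confidence)
    drop_treat: Callable[[], None],            # tips the cut-out paper cup
    wander: Callable[[], None],                # nudges the Roomba along
) -> None:
    """Drop food only when a dog (not a cat, child, or human) is detected."""
    last_drop = 0.0
    while True:
        label, confidence = detect()
        if label == "dog" and confidence > 0.8:
            if time.time() - last_drop > TREAT_COOLDOWN_S:
                drop_treat()
                last_drop = time.time()
        else:
            wander()
        time.sleep(0.2)
```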

It’s definitely not going to win any awards for aesthetics or practicality, but the project does spark curiosity and maybe inspire a few more experiments using these house robots. The rather odd clothes-picking Roborock vacuum at CES 2025 demonstrated what these machines are actually capable of, especially with their advanced sensors and AI-powered brains. All they need is a robot arm or a proper food dispenser, and they have the makings of a general-purpose robot that won’t take up too much floor space.

The post Modified Roomba becomes a smart robot for dispensing doggie treats first appeared on Yanko Design.

NASA unveils first look at SUV-sized Mars Chopper concept

NASA's Ingenuity Mars Helicopter was a groundbreaking piece of equipment, becoming the first human-made aircraft to achieve powered, controlled flight on another planet back in 2021. And while rotor damage ended its mission in January of this year, it still managed 72 flights in under three years. Now that it's no longer flying, NASA is looking at building the next machine to give us a peek at another planet.

Designer: NASA

NASA has unveiled early design renderings of the Mars Chopper, its proposed follow-up to the Ingenuity Mars Helicopter. It's a huge leap from the original, as this one is the size of an SUV and uses six rotors to fly across the planet. It would be able to carry up to 11 pounds (about 5 kg) of science payloads across 1.9 miles (about 3 km) per Mars day.

The initial renders show the three-legged drone gliding over a simulated Martian landscape. Since Ingenuity was much smaller, the Chopper will hopefully surpass its achievements and give us an even better view and understanding of Mars. It should help scientists study Martian terrain at a much faster rate.

While it's still in its "early conceptual and design stages," there is already anticipation over how the Chopper could give us a glimpse into previously inaccessible areas of the planet. It's not clear yet, though, whether it will actually be sent to Mars.

The post NASA unveils first look at SUV-sized Mars Chopper concept first appeared on Yanko Design.

Pangolin-inspired robot can dig and “poop” out seeds to plant trees

Not all robots have to look, well, robotic. There is a growing number of robots inspired by real-life creatures (sometimes even humans, but that's a whole other discussion), the so-called bio-inspired bots. The latest winner of the Natural Robotics Contest is inspired by a pretty unlikely animal: the insect-eating mammal called the pangolin.

Designer: Dorothy and Dr. Robert Siddall

A high school student from California named Dorothy designed a robot whose main goal is to dig and plant seeds. Since pangolins are natural diggers, why not use the animal as the basis for a robot that can help populate areas with more trees? The winning concept was turned into an actual prototype called the Plantolin by the contest's partner research institute. More than just looking like a pangolin, it borrows features from the mammal and incorporates them into the robot's functions.

The Plantolin roves around on two wheels and, just like the pangolin, balances on its long, movable tail. Each wheel is driven by an electric quadcopter drone motor. The digging is done by its two front legs, with the tail tilting down to provide leverage once digging starts. Once the hole is ready, the robot drives over it and "poops" out a yew tree seed bomb nugget containing both seeds and soil.
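
As described, the planting routine is essentially a fixed sequence of motions. Here is a hedged sketch of that sequence as a simple routine; every function name, angle, and duration below is a made-up placeholder, not anything from the actual prototype's firmware.

```python
from typing import Callable

def plant_one_seed(
    tilt_tail: Callable[[float], None],          # degrees; negative tilts down for leverage
    dig_with_forelegs: Callable[[float], None],  # seconds of digging
    drive_forward: Callable[[float], None],      # centimetres to roll ahead
    release_seed_bomb: Callable[[], None],       # opens the hopper once
) -> None:
    """One dig-and-plant cycle, mirroring the sequence described above."""
    tilt_tail(-30.0)          # brace the tail against the ground before digging
    dig_with_forelegs(8.0)    # scratch out a shallow hole with the front legs
    tilt_tail(0.0)            # stand back up on the tail
    drive_forward(12.0)       # position the body over the fresh hole
    release_seed_bomb()       # drop the yew seed-and-soil nugget
```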

It’s a pretty interesting way to re-populate a space with more trees. It will probably be faster and will need minimal human intervention when it’s programmed right, so no need to train actual pangolins to do the job.

The post Pangolin-inspired robot can dig and “poop” out seeds to plant trees first appeared on Yanko Design.

Rectangular robot vacuum concept proposes a more efficient design for smaller spaces

Robot vacuum cleaners and mops are common sights these days, but despite all the advancements they've made in terms of technology, their basic shape has remained unchanged since the first-ever Roomba. They're almost all circular, though there are some that have taken on rounded square forms, a shape that was dictated by the limitations of old technologies that no longer seem relevant today. This standard design, however, still carries limitations of its own, like difficulty squeezing into tight spaces or cleaning corners. Perhaps it's time to rethink that old and outdated design, which is what this concept tries to do in order to cater to homes with smaller, cramped spaces and messy floors.

Designer: Subin Kim

The initial design of robot vacuums was made primarily for the robot's benefit, not the human's. The circular shape made it easier for the machine to turn and correct its direction, something it had to do all too often given the very basic technologies of decades past. Today, however, most robot cleaners have no problem navigating the most cluttered floors, so there's now an opportunity to rethink that basic shape.

mini is a concept design that stretches the robot vacuum into a more rectangular form, technically more pill-shaped with its rounded sides. The idea is that this robot can better squeeze itself into narrow spaces, like those between walls and furniture, or hug edges to properly brush and vacuum areas that even the most sophisticated circular robot can't reach. In small apartments and tiny homes, those narrow spaces are more the rule than the exception, so such a design could be more useful than the majority of round or square robots.

The design can actually be even more efficient than standard robot vacuums because it can change its orientation depending on the area of the floor to be cleaned. In its vertical mode, it can easily clean out narrow gaps, but then it can rotate and switch to horizontal mode if there’s a wider space available for it to move. Such a feat would require AI and advanced sensors, both of which are readily available on most robot cleaners today.
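
In pseudocode, that orientation decision reduces to comparing the width of the space ahead against the robot's two footprint dimensions. The sketch below is purely illustrative; the body dimensions are invented, since the concept doesn't specify them.

```python
ROBOT_WIDTH_CM = 20.0   # short side of the pill-shaped body (invented number)
ROBOT_LENGTH_CM = 35.0  # long side of the body (invented number)

def choose_orientation(gap_width_cm: float) -> str:
    """Pick how the robot should enter the space it is about to clean."""
    if gap_width_cm < ROBOT_WIDTH_CM:
        return "skip"        # even the narrow side won't fit
    if gap_width_cm < ROBOT_LENGTH_CM:
        return "vertical"    # lead with the narrow side to squeeze in
    return "horizontal"      # open floor: cover more area per pass
```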

mini’s design does mean it won’t be able to turn quickly, but that can be handled by better obstacle detection and smarter navigation. Although it might not work perfectly in practice, the concept does challenge the status quo and encourages a design that really puts the user at the center, rather than simply turning such robot helpers into technological showcases.

The post Rectangular robot vacuum concept proposes a more efficient design for smaller spaces first appeared on Yanko Design.

Bamboo drone explores a more sustainable way to fly and deliver things

Some see them as annoyances, and others consider them privacy and security risks. That said, flying drones, just like their quadruped terrestrial counterparts, will inevitably be a part of our near future. That means there will be even more mass-produced drones than we already have today, and the materials used to make them aren't always accessible or sustainable. But just as plastic is being replaced in other design industries, there's also an opportunity to test other materials that are just as suitable for these flying robots. One experiment does exactly that, and it chooses a rather unexpected option that's much loved in the design industry: wood.

Designer: Deepak Dadheech

Wood is not something you'd immediately associate with electronics, let alone robotics, but it is finding its way into more appliances and gadgets. In those cases, the material is prized for its sustainability and aesthetics, the latter of which isn't exactly a priority for unmanned aerial vehicles (UAVs) like drones. That said, not all wood is created equal, and one particular type could very well be suited to the demands of a drone.

Bamboo, in particular, is known for being lightweight yet durable, especially when it comes to tensile strength. Unlike hardwood, which can splinter and break on impact, bamboo can absorb more of a shock. It's also abundant and quickly renewable, unlike trees that take far longer to mature. Because of these properties, it could make a good substitute for both plastic and carbon fiber, as this bamboo hexacopter drone demonstrates.

Of course, the whole drone isn't made of bamboo. In addition to the circuitry and brushless motors, the propellers are still made from plastic. Only the main frame, legs, and arms use bamboo, which is where most of the plastic or carbon fiber in a drone goes anyway. For only around $12 worth of bamboo, you can have a drone that weighs just 350g, half that of typical plastic builds.

The question, however, is whether such a strategy will actually be effective or whether it involves too many compromises for the sake of sustainability. The bamboo drone does indeed fly, and it can, in theory, carry light payloads like tools, emergency supplies, or scientific instruments. How it will fare against strong winds and light rain has yet to be tested, and that will really determine how suitable bamboo is for a fleet of drones.

The post Bamboo drone explores a more sustainable way to fly and deliver things first appeared on Yanko Design.

Narwal Unveils Freo Z Ultra at IFA 2024, Showcasing Advanced Robotic Cleaning Technology

Narwal introduced its latest robotic vacuum and mop, the Freo Z Ultra, at IFA 2024. The new device brings advanced cleaning technology to homes, offering intelligent features for thorough and efficient cleaning. It uses the TwinAI Dodge Obstacle Avoidance system, allowing it to navigate around objects precisely and recognize over 120 household items in real time. Whether it's furniture, cables, or pet waste, the robot efficiently avoids obstacles while cleaning all areas.

Designer: Narwal

The design is clean and minimal, with curves that allow it to blend into various home environments. The key feature of the Freo Z Ultra is its dual RGB camera system, which captures up to 1.5 million data points per second. This wide-angle system enhances the robot’s ability to recognize objects and move through spaces without making contact. The high-definition cameras provide detailed visuals, ensuring they can adjust to different surroundings and avoid even small objects.

A significant part of the Freo Z Ultra’s functionality lies in its AI DirtSense 2.0 technology. This system automatically identifies the type of mess it encounters and adapts its cleaning method accordingly. For example, the vacuum first handles dry messes like dust or crumbs with its 12,000 Pa suction. Afterward, it switches to mopping mode to clean up liquid spills, ensuring no dirt is left behind. Separating dry and wet cleaning avoids contamination and maintains a high standard of cleanliness.

The mop system also uses AI technology to optimize its performance. The AI-Adaptive Hot Water Mop Washing system adjusts the water temperature between 113°F and 167°F (45°C and 75°C) based on the type of dirt detected. Regular dirt is cleaned with warm water, while more stubborn grime is addressed with hotter water. After each cleaning session, the system automatically washes and dries the mop, preventing bacteria growth and maintaining mop efficiency.
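
In pseudocode, that temperature selection boils down to a mapping from a detected soiling level to a target within the 45-75°C window. The thresholds and the 0-1 grime score below are invented for illustration; Narwal hasn't published how its system actually scores dirt.

```python
def mop_water_temp_c(grime_level: float) -> float:
    """Map a 0..1 dirt-severity score to a target water temperature in °C.

    45 °C (113 °F) for everyday dust, up to 75 °C (167 °F) for stubborn,
    greasy grime. The 0.3/0.7 thresholds are illustrative, not Narwal's.
    """
    if grime_level < 0.3:
        return 45.0
    if grime_level < 0.7:
        return 60.0
    return 75.0
```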

The Freo Z Ultra quickly maps its environment, creating a detailed 3D model in six minutes. It uses ultrasonic sensors and RGB cameras to navigate various surfaces, including corners and edges. The Smart EdgeSwing technology allows the robot to easily clean along walls and baseboards, ensuring no area is missed during the cleaning process, even in tight spaces.

The robot integrates with popular smart home systems like Alexa, Google Home, and Siri. Users can control the vacuum through voice commands or manage its settings via the Narwal app. The app allows users to customize cleaning schedules, set no-go zones, and remotely monitor the vacuum’s progress. This makes the Freo Z Ultra a versatile tool for households that value convenience and ease of use.

Privacy is another key aspect of the Freo Z Ultra. The robot operates with TÜV Rheinland Privacy Certification, ensuring that all user data remains secure. The robot’s storage and computation functions are kept offline, and any camera use requires user permission. This provides peace of mind for users concerned about data security in their homes.

Pet owners will appreciate the Freo Z Ultra’s pet-friendly features. The vacuum can detect and avoid pets as they move around the house, preventing accidental disturbances. It can also delay cleaning in areas where pets are resting, resuming once the space is free of animals. This ensures pet fur and messes are cleaned effectively without stressing pets or requiring human intervention.

The robot’s Zero Tangling Floating Brush 2.0 is designed to handle hair without tangling. This brush system sweeps 4,400 times per minute, capturing pet hair and debris without causing blockages. Additionally, the robot operates at a noise level of 71 dB, significantly reducing disturbances during its cleaning cycles. This low-noise feature makes it suitable for homes with pets or small children.

The Freo Z Ultra’s self-emptying station simplifies maintenance. The base station can hold up to 120 days’ worth of debris, reducing the need for frequent emptying. It uses hot air at 113°F to dry the collected debris, preventing bacteria buildup and maintaining a hygienic environment. The station also self-cleans and dries the mop after every use, keeping the entire system ready for the next cleaning session without user intervention.

The Freo Z Ultra adapts its cleaning approach based on the surface type. On hardwood floors, it applies less downward pressure to avoid damage; on ceramic tiles, it increases the pressure to ensure a deep clean. The mop's moisture level is also adjusted to suit the surface, providing optimal performance on both wood and tile. Carpets are handled with care, as the robot can automatically lift the mop when it detects carpeted areas, allowing it to vacuum the surface without interference.
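
That per-surface behaviour is effectively a lookup table like the sketch below. The pressure values, water levels, and fallback choice are all invented for illustration; Narwal hasn't published its actual parameters.

```python
# Illustrative per-surface settings: mop pressure, water level, and whether
# to lift the mop entirely. None of these numbers come from Narwal.
SURFACE_PROFILES = {
    "hardwood": {"pressure_n": 6.0,  "water": "low",  "lift_mop": False},
    "tile":     {"pressure_n": 12.0, "water": "high", "lift_mop": False},
    "carpet":   {"pressure_n": 0.0,  "water": "none", "lift_mop": True},
}

def settings_for(surface: str) -> dict:
    """Fall back to the gentlest profile if the surface is unrecognised."""
    return SURFACE_PROFILES.get(surface, SURFACE_PROFILES["hardwood"])
```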

The device’s powerful 12,000 Pa suction ensures that even deep-set dirt is removed from carpets and other surfaces. Users can enable Power Boost mode through the Narwal app to enhance suction when needed. This allows the Freo Z Ultra to adapt to different cleaning needs, whether tackling high-traffic areas or maintaining delicate surfaces.

Narwal’s Freo Z Ultra offers a comprehensive cleaning solution for homes of all sizes, bringing together advanced AI, privacy protections, and smart home integration. Its ability to adapt to various floor types, combined with features designed for pet owners, makes it a highly versatile tool for maintaining cleanliness with minimal effort. The robot’s low-maintenance design and intelligent cleaning systems ensure a hassle-free experience for users looking to automate their floor care routine.

The post Narwal Unveils Freo Z Ultra at IFA 2024, Showcasing Advanced Robotic Cleaning Technology first appeared on Yanko Design.

RoboGrocery is the first step towards robots packing our groceries

When I first encountered a self-checkout system at IKEA a few years ago, I sort of panicked because I didn't know what to do. But after trying it and eventually figuring things out, I realized it was such a convenient way to shop, especially if you want to keep social interactions to a minimum. Now if only there were also a self-packing system, since bagging the groceries is the most tedious part.

Designer: MIT CSAIL

Eventually, that could of course come true, and one step towards such a system is RoboGrocery. Developed by MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL), it uses a soft robotic gripper together with computer vision to help bag groceries and other small items. It's still in its early stages, of course, but how it works at this point already looks pretty promising.

They tested it out by placing 10 objects on a grocery conveyor belt, ranging from soft items like grapes, crackers, muffins, and bread to more solid ones like cans, meal boxes, and ice cream containers. The vision system measures the size of each item to determine the order in which things go into the bag. The gripper, with pressure sensors in its fingers, then determines whether an item is delicate and should not be placed at the bottom.
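
Conceptually, the bagging order is just a sort: rigid items first, delicate items last, and the bulkiest first within each group. Here is a hedged sketch of that heuristic; the Item fields and the example items are made up for illustration and are not from MIT's code.

```python
from dataclasses import dataclass

@dataclass
class Item:
    name: str
    volume_cm3: float   # estimated by the vision system (illustrative field)
    is_delicate: bool   # inferred from the gripper's pressure sensors (illustrative)

def bagging_order(items: list[Item]) -> list[Item]:
    """Rigid items go in first, delicate ones last, largest first within each group."""
    return sorted(items, key=lambda i: (i.is_delicate, -i.volume_cm3))

# Example: cans and boxes end up at the bottom, grapes on top.
order = bagging_order([
    Item("grapes", 500, True),
    Item("soup can", 400, False),
    Item("bread", 1500, True),
    Item("cereal box", 3000, False),
])
print([i.name for i in order])  # ['cereal box', 'soup can', 'bread', 'grapes']
```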

 

While we're still a few steps away from actually having a robot bag our groceries, it's an interesting start. Eventually, once the technology is ready for commercial use, it could also be developed for industrial spaces like recycling plants and factories.

The post RoboGrocery is the first step towards robots packing our groceries first appeared on Yanko Design.

OpenCat – The open-source framework for robotic pets

Today I'd like to tell you about OpenCat, an open-source framework that helps you build your own robotic pets: hyper-realistic quadrupeds that are, surprise, perfectly affordable. To do this, OpenCat drives high-performance servo motors used as joints, an optimized body structure, and low-cost controllers like the Arduino, ESP32, or Raspberry Pi.

You can also add plenty of cool modules, such as a smart camera, IoT sensors, or voice control. With the highly efficient open-source control code, your robots will quite literally come to life!

OpenCat isn't just some bearded geek's pet project either, since it has already been deployed on commercial robots like Petoi's Nybble cat and Bittle dog. These little marvels of technology can run, walk, and even self-balance like real animals.
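
For a taste of what driving one of these robots looks like, here is a minimal sketch that sends a skill command to a board running OpenCat over a serial link, using Python and pyserial. The port name, baud rate, and command tokens are assumptions for illustration; check the OpenCat or Petoi documentation for the actual protocol your board expects.

```python
import time
import serial  # pyserial

# Port and baud rate are assumptions; adjust them for your own board.
with serial.Serial("/dev/ttyUSB0", 115200, timeout=1) as link:
    time.sleep(2)          # give the microcontroller a moment to boot
    link.write(b"ksit\n")  # illustrative skill token asking the robot to sit
    time.sleep(3)
    link.write(b"d\n")     # illustrative token returning the robot to rest
```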

The creators of OpenCat share all of this as open source for a good reason: they want to foster collaboration in the development of robotics, IoT, and AI on affordable quadruped robots. They also want to spread educational robotics resources to as many people as possible and inspire students, teenagers, and even kids. Which is rather beautiful, really!

In short, if you too dream of building your own robotic pets, head over to OpenCat! It's time to get your hands dirty, learn while having fun, and join the adventure.

AI artist will “train” robot dogs to do a live painting session

Spot has been a pretty busy dog, previously appearing with supergroup BTS a few years ago and, just last week, getting its own costume and dancing its heart out to celebrate International Dance Day. Lest you think it's an actual dog, though, it's a robotic one that can do much more than jump and roll over. Now it's branching out into the art world with a new exhibit featuring the power of AI.

Designer: Agnieszka Pilat

There have been a lot of heated discussions about AI and art, and not all of them are negative. While many are critical, there are those who want to explore how autonomous technology and AI-generated art can aid in the democratization of art. One of those people is Polish artist Agnieszka Pilat. She has partnered with Boston Dynamics, or rather with its Spot robot dogs, for the Heterobota exhibition at the Museum of Fine Arts in Boston.

Two of the robot dogs, nicknamed Basia and Omuzana, will do a live painting demonstration in the museum on a 156 x 160-inch canvas on May 10. Pilat will be "training" the dogs to doodle and paint from 8PM to 12AM, with a little rest in between, just like an actual artist would take. Visitors can watch them live, and the final work will not be displayed afterwards, so your only chance to see the robot dogs in action is during the live painting session itself.

Pilat says the expected outcome is more like a "little kid's finger-painting," since the technology is young and new, even though she has collaborated with Spot before. But it's an interesting experiment in how humans can use AI and robots to generate art. Of course, there's still a lot of discussion that rightly needs to be had, but projects like this can surface viewpoints and opinions that will hopefully enrich the conversation.

The post AI artist will “train” robot dogs to do a live painting session first appeared on Yanko Design.
