New ways to integrate data with physical objects | MIT News

Mustafa Doğa Doğan said that to understand the idea behind StructCode, think of Superman. Not the “faster than a speeding bullet” and “more powerful than a locomotive” version, but a Superman or Supergirl who sees the world differently than the average mortal: a person who can look around a room and glean all kinds of information about ordinary objects, information that is not apparent to people with less perceptive powers.

In a nutshell, that’s “the high-level idea behind StructCode,” explains Doğan, a doctoral student in electrical engineering and computer science at MIT and an affiliate of the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL). “Our goal is to change the way we interact with objects” — making those interactions more meaningful and rewarding — “by embedding information into objects in an easily accessible way.”

StructCode grew out of a project called InfraredTags, which Doğan and colleagues introduced in 2022. Both that work and the current project were carried out in the laboratory of Stefanie Mueller, an MIT associate professor who serves as Doğan’s advisor. In last year’s approach, “invisible” tags (visible only to cameras capable of detecting infrared light) were used to reveal information about physical objects. The drawback is that many cameras cannot sense infrared light. Moreover, fabricating these objects and embedding tags in their surfaces relies on 3D printers, which tend to be very slow and can often produce only small objects.

StructCode, at least in its original version, instead relies on objects produced by laser cutting, which can be fabricated in minutes rather than the hours a 3D printer might take. Moreover, the information can be extracted from these objects using the RGB cameras commonly found in smartphones; the ability to sense the infrared range of the spectrum is not required.

In an initial demonstration of the idea, the MIT-led team decided to build its objects out of wood, creating items such as furniture, picture frames, flower pots, and toys that lend themselves to laser-cut fabrication. A key question had to be addressed: How can information be stored in a way that is unobtrusive and durable, unlike externally attached barcodes and QR codes, and without compromising the structural integrity of the object?

The team’s current solution relies on joints, which are ubiquitous in wooden objects made from multiple components. Perhaps the most familiar is the finger joint, which has a zigzag pattern whereby two pieces of wood meet at right angles: each protruding “finger” along the seam of the first piece fits into a corresponding “gap” along the seam of the second piece, and, likewise, every gap along the seam of the first piece is filled by a finger from the second.

“Joints have these repetitive features, like repeating bits,” Doğan said. To create a code, the researchers slightly varied the lengths of the fingers and of the gaps between them. The standard length is assigned the digit 1; a slightly shorter length is assigned 0, and a slightly longer length is assigned 2. The encoding scheme is based on these sequences of digits that can be read off along a joint. Each four-digit string allows 81 (3⁴) possible variations.
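The scheme described above amounts to reading the joint as a base-3 number. The following sketch illustrates the arithmetic only; the function names are illustrative and not from the paper.

```python
# Illustrative sketch of the base-3 (ternary) encoding described above:
# each finger or gap length maps to a digit (0 = shorter, 1 = standard, 2 = longer).

def digits_to_value(digits):
    """Interpret a sequence of ternary digits as a base-3 integer."""
    value = 0
    for d in digits:
        if d not in (0, 1, 2):
            raise ValueError("each digit must be 0, 1, or 2")
        value = value * 3 + d
    return value

def value_to_digits(value, length):
    """Inverse: express an integer as a fixed-length base-3 digit sequence."""
    digits = []
    for _ in range(length):
        digits.append(value % 3)
        value //= 3
    return digits[::-1]

# A four-digit string can encode 3**4 = 81 distinct values (0 through 80).
assert digits_to_value([2, 2, 2, 2]) == 80
assert value_to_digits(80, 4) == [2, 2, 2, 2]
```

Longer joints, or multiple joints on one object, simply extend the digit sequence and thus the number of encodable values.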

The team also demonstrated a way to encode information in a “living hinge,” a joint made by taking a flat, rigid material and rendering it bendable by cutting a series of parallel vertical lines into it. As with the finger joints, the spacing between these lines can be varied: 1 is the standard spacing, 0 is slightly shorter, and 2 is slightly longer. In this way, codes can be embedded in objects that contain living hinges.

The idea is described in a paper called “StructCode: Leveraging Fabrication Artifacts to Store Data in Laser-Cut Objects,” which was presented this month at the 2023 ACM Symposium on Computational Fabrication in New York City. Doğan, the paper’s first author, is joined by Mueller and four co-authors — recent MIT alumna Grace Tang ’23, MEng ’23; MIT undergraduate Richard Qi; UC Berkeley graduate student Vivian Hsinyueh Chan; and Thijs Roumen, an assistant professor at Cornell University.

“In the field of materials and design, there is a tendency to associate novelty and innovation with completely new materials or fabrication technologies,” points out Elvin Karana, a professor of materials innovation and design at TU Delft. What impressed Karana most about StructCode is that it provides a novel way to store data by “applying common techniques like laser cutting and ubiquitous materials like wood.”

Ellen Yi-Luen Do, a computer scientist at the University of Colorado, adds that the idea behind StructCode is “simple, elegant, and totally makes sense. It’s like having a Rosetta Stone to help decipher Egyptian hieroglyphs.”

Patrick Baudisch, a computer scientist at Germany’s Hasso Plattner Institute, sees StructCode as “a major advance in personal manufacturing. It takes a key feature currently available only for mass-produced goods and applies it to custom objects.”

In a nutshell, here’s how it works: First, a laser cutter, guided by a model created with StructCode, fabricates an object with the encoded information embedded in it. After downloading the StructCode app, users can decode the hidden message by pointing their phone’s camera at the object; with the help of the StructCode software, the camera detects the subtle variations in length found in the object’s finger joints or living hinges.
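One plausible way the decoding step could turn measured lengths back into digits is simple thresholding around the standard length. This is a sketch under stated assumptions, not the app’s actual implementation; the function name, units, and tolerance value are all hypothetical.

```python
def classify_lengths(measured, standard, delta):
    """Map measured finger/gap lengths (e.g., in millimeters, estimated
    from the camera image) to ternary digits: noticeably below the
    standard length -> 0, near it -> 1, noticeably above it -> 2.
    `delta` is the tolerance separating "standard" from short/long."""
    digits = []
    for m in measured:
        if m < standard - delta:
            digits.append(0)
        elif m > standard + delta:
            digits.append(2)
        else:
            digits.append(1)
    return digits

# Example: standard length 10 mm, tolerance 0.5 mm.
print(classify_lengths([8.1, 10.0, 12.2, 9.9], standard=10.0, delta=0.5))
# -> [0, 1, 2, 1]
```

In practice the tolerance would have to be chosen large enough to absorb camera and fabrication noise but small enough that the three length classes never overlap.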

Doğan said the process would be even easier if the user were equipped with augmented reality glasses: “In that case, you don’t need to point the camera. The information will appear automatically.” That could give people more of the “superpowers” that StructCode’s designers want to confer.

“The object does not need to contain a lot of information,” Doğan added. “A URL, for example, is enough to direct people to where they can find the information they need.”

The user might be sent to a website to obtain information about the object: how to care for it, and perhaps eventually how to disassemble it and recycle (or safely dispose of) its materials. A flower pot made with living hinges could tell users when the plant inside was last watered and when it needs to be watered again, based on records maintained online. A child could examine a toy alligator with StructCode and learn scientific details about various parts of the animal’s anatomy. A picture frame made with StructCode-modified finger joints could help people learn about the painting inside the frame and the person (or people) who created the artwork — perhaps linking to a video of the artist speaking directly about the piece.

“This technology could pave the way for new applications, such as interactive museum exhibits,” says computer scientist Raf Ramakers of Hasselt University in Belgium. “It has the potential to broaden the scope of how we perceive and interact with everyday objects” – which is precisely the goal that inspires the work of Doğan and his colleagues.

But for Doğan and his collaborators, StructCode is not the end of the story. The same general approach can be applied to fabrication technologies beyond laser cutting, and the stored information does not have to be confined to the joints of wooden objects. Data could be represented in the texture of leather, in the pattern of a woven or knitted piece, or concealed within an image. Doğan is excited by the breadth of the options available, noting that the team’s “exploration into this new realm of possibilities is just beginning, aiming to make objects and our world more interactive.”
