Engineer Notes: VR Weddings, 3D Printed Milk, and Learning How Your Dog Feels

Here's what we thought was pretty cool this week in the worlds of design, engineering, and science.

Worth Keeping An Eye On

All pet owners have at one time or another wished they could tell what their pet was feeling. Researchers at the University of Cambridge are getting closer to making that wish a reality. They have developed a system called the Sheep Pain Facial Expression Scale (SPFES) that assesses how much pain a sheep is in based on its facial expression. Using 500 photographs of sheep, they trained the system to recognize five distinct facial expressions associated with pain, and SPFES can now estimate a sheep's pain level with 80 percent accuracy. Although this is just the beginning, the researchers have found that the facial muscle movements of sheep and humans in pain bear a resemblance. That brings hope that, with further development, similar systems could recognize human emotions, and perhaps even give you insight into how your dog is feeling.
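The article doesn't describe the Cambridge team's actual pipeline, but the setup (500 labelled photos, five pain expressions, held-out accuracy) is a standard supervised classification problem. Here's a minimal sketch using scikit-learn with synthetic feature vectors standing in for facial-landmark features; the dataset, feature count, and classifier choice are all assumptions for illustration:

```python
# Sketch of an SPFES-style pain-expression classifier on synthetic data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical dataset: 500 photos -> 10 landmark-derived features each,
# labelled with one of five pain expressions (0-4).
X = rng.normal(size=(500, 10))
y = rng.integers(0, 5, size=500)
X[:, 0] += y  # make one feature weakly informative so training is meaningful

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```

With real photographs you would replace the random features with facial landmarks or learned image embeddings; the train/test split and accuracy reporting would stay the same.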

Researchers at the University of Cambridge are working on software that could let you better understand what your dog is feeling.

It was a lovely May afternoon as Elisa Evans changed into her wedding dress and Martin Shervington finished tying the bowtie on his suit. They walked together to the venue of their not-so-average wedding, where they found no guests, just two headsets. As they placed the headsets over their eyes, they saw their family and friends throwing heart emojis and smiling faces in their direction. The couple stood on a platform built over a churning red lake of lava and got ready to start the legally binding ceremony as a disco ball spun overhead. As you can probably tell, this was not a typical wedding; it was a virtual reality wedding, and one of the first of its kind. Contrary to what you may think, the groom remarked that this virtual wedding took a lot more planning and attention to detail than a traditional one. Designing their avatars, picking a virtual reality location, choosing the guests who would be able to “attend” with avatars, and deciding who else would have to watch from a live YouTube stream were just some of the challenges they faced. Along with the happy couple not physically being able to kiss, since there was a wall between them, a problem many guests faced was the placement of their avatars. One guest said there was even a point where her avatar accidentally ended up standing directly between the lovely couple. Although it was weird and awkward for some, Elisa and Martin may have started a new trend in weddings. Or maybe not.

Our Robot Friends

There's been quite a lot of talk recently about robots taking over human jobs. Here at GrabCAD we even shared a link to a site that estimates just how likely you are to lose your job to a robot based on the industry you work in. Thankfully, researchers have come up with a few ways to ensure that you don't fall victim to this robot takeover. The first tip: become an AI trainer or explainer. This may seem impractical if you don't know much about AI or its capabilities, but all it takes is a little training of your own, and you'll have knowledge and expertise that no robot can take from you. The second, most practical tip: develop specialized interpersonal skills. Soft skills will take robots a long time to master, but they come naturally to most humans; even if they're not your strong suit, sit down with another human and start a conversation! One last tip researchers say will help ensure job security: be creative. Robots will be trained to do many analytic tasks that require very little creativity, so standing out with creative skills will set you apart and make you an asset no robot can replace.

MIT is helping to pave new ground for people who are visually impaired. Researchers from MIT’s Computer Science and Artificial Intelligence Laboratory have created a system consisting of a 3D camera, a belt with vibrating motors, and an electronically reconfigurable Braille interface, all of which work together to help visually impaired users explore their environment. Researchers have been working for decades on a system to replace the traditional metal-tipped cane, and with the realization that the abdomen is an open canvas for sensory perception, they began developing alternatives. Using algorithms that produce different vibrations based on where the user is headed and what they are about to encounter, the system can even tell the user whether the chair they are approaching is occupied. Varying frequencies and intensities of the belt's vibrations, combined with a symbolic Braille interface (c = chair, t = table), work in tandem to indicate the exact type of object a user is about to face and how close or far it is.
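The core idea, vibration that scales with distance plus a one-letter Braille code for the object class, can be sketched in a few lines. MIT's exact algorithms, sensing range, and code table (beyond c = chair, t = table) aren't given here, so everything below those two codes is an assumption:

```python
# Sketch of the belt's sensing-to-feedback mapping: closer obstacles
# produce stronger vibration, and a Braille symbol names the object.

BRAILLE_CODES = {"chair": "c", "table": "t"}  # codes given in the article

def belt_feedback(distance_m, obj_class, occupied=False, max_range_m=4.0):
    """Map an obstacle to (vibration intensity 0-1, Braille symbol)."""
    # Closer obstacles -> stronger vibration, clamped to [0, 1].
    intensity = max(0.0, min(1.0, 1.0 - distance_m / max_range_m))
    symbol = BRAILLE_CODES.get(obj_class, "?")
    # Hypothetical convention: uppercase marks an occupied chair.
    if obj_class == "chair" and occupied:
        symbol = symbol.upper()
    return intensity, symbol

print(belt_feedback(1.0, "chair", occupied=True))  # (0.75, 'C')
print(belt_feedback(3.0, "table"))                 # (0.25, 't')
```

In the real device this intensity would drive motor PWM duty cycles and the symbol would be rendered on the refreshable Braille cell.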

What if you could shave almost 30 seconds off your mile time without any additional training? Researchers at Harvard’s Wyss Institute and the Harvard John A. Paulson School of Engineering and Applied Sciences have created an exosuit that does just that. Running is a costly form of exercise that puts a heavy strain on the body. The exosuit, which is worn like a pair of running shorts, reduces that strain using thin, flexible wires that apply force to the hip joint while the user runs. By acting as a second pair of hip extensor muscles, the exosuit not only takes strain off your lower body, it also reduces the metabolic cost of running, which the researchers measured by monitoring the user's oxygen consumption and carbon dioxide production during exercise. This exosuit is just the beginning of what wearable technology can do for the human body.
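The article doesn't say how the team converted oxygen consumption and carbon dioxide production into metabolic cost, but a standard way to do so in exercise studies is the Brockway (1987) equation, so treat this as an illustrative sketch rather than Harvard's actual method:

```python
# Estimate metabolic power from gas exchange via the Brockway equation.

def brockway_power_w(vo2_ml_per_s, vco2_ml_per_s):
    """Metabolic power in watts from O2 uptake and CO2 output (ml/s)."""
    # 16.58 J per ml of O2 consumed + 4.51 J per ml of CO2 produced
    # (the urinary-nitrogen term is omitted, as is common in exercise work).
    return 16.58 * vo2_ml_per_s + 4.51 * vco2_ml_per_s

# Hypothetical runner: 50 ml/s O2 uptake, 45 ml/s CO2 output.
print(f"{brockway_power_w(50, 45):.0f} W")
```

A suit's benefit would then be reported as the percentage drop in this power between assisted and unassisted running at the same pace.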

Wearable tech like this exoskeleton pair of shorts is increasingly becoming reality. Photo courtesy of Wyss Institute at Harvard University.

The Wonders of 3D Printing

In the previous edition of Engineer Notes, I mentioned the use of 3D printers to print food for people who suffer from dysphagia. If you thought that was pushing the limits of 3D printing, you’ve seen nothing yet. A Silicon Valley startup has begun to 3D print milk. Yes, you read that right: milk! They are not actually printing the liquid itself, which for now is impossible. Instead, they combine a strain of yeast with 3D bioprinted proteins that coax the yeast into producing casein, the protein found in milk. Combining this mixture with a few other ingredients and letting it ferment produces a milk-like substance. The product poses quite a threat to dairy farmers: with vegetarianism and other animal-free diets on the rise, this could be a valuable way to produce animal-free milk and other dairy products. The question is, will people actually drink it?

Like what you've just read? Sign up to receive GrabCAD's free weekly Digital Thread newsletter.