
Getting A RISE From Lab Work

Published: December 18, 2013

Humans and machines find common ground in CSULB’s Robotics and Interactive Systems Engineering (RISE) Laboratory under the direction of Mechanical and Aerospace Engineering’s Panadda Marayong.

The RISE laboratory, with its staff of eight students, focuses on the design and development of Human-Machine Collaborative Systems (HMCS) and the study of human-machine haptics (touch-based interaction, also called force or tactile feedback).

“We design and develop robotic systems that amplify and/or assist human physical capabilities when performing tasks that require learned skills, judgment and precision,” said Marayong. “The systems seek to combine human judgment and experience with physical and sensory augmentation using advanced display and robotic devices to improve the overall performance.”

By human-machine interaction, Marayong doesn’t mean the Transformers. “I mean people, robots and machines working together to perform a task,” she said. “Our creation doesn’t have to be an actual robotic device. It could be enhanced feedback delivered through a computer through a virtual environment or an overlay of information that helps to perform tasks better.”

The vibrotactile device for the prosthetic rehabilitation project. (Photo courtesy of CSULB and VA Long Beach)

Marayong serves as the faculty advisor of the Society of Women Engineers at CSULB. She received her B.S. in mechanical engineering from the Florida Institute of Technology in 2001, and an M.S. and Ph.D. in mechanical engineering with a focus on robotics from Johns Hopkins University, the latter in 2007.

Marayong pointed to three lab projects that seek to improve the modern cockpit, improve rehabilitation outcomes for amputees and promote greater safety for the crane operators who serve the Port of Long Beach.

RISE is developing the next generation of cockpit displays, working with NASA and psychology’s Thomas Strybel and Kim-Phuong Vu in CSULB’s Center for Human Factors in Advanced Aeronautics Technologies (CHAAT).

This project involves a user input device to be used with the 3D Volumetric Cockpit Situation Display (CSD) developed by NASA as part of the Cockpit Display of Traffic Information (CDTI) program. The advanced features of the CSD, such as 2D-to-3D manipulation and real-time path planning, limit the usefulness of traditional input devices such as a mouse and keyboard.

“Basically, we want to give pilots more flexibility in their flight control on the plane,” she said. But where other users might access their computers through a mouse, “that is not something that would be useful in real life,” she added. “First, we are trying to come up with a different input device for pilots to use CSD in a real cockpit. Second, we are looking at the software side to help pilots to manipulate and navigate more efficiently.”

Panadda (Nim) Marayong. (Photo by David J. Nelson)

One way to improve performance is through “force feedback.” “When pilots use this, they can feel the tactile impact of their environment,” she said. “It offers more realistic interaction with the environment rather than feeling nothing at all. We provide force feedback as an additional cue to make pilots aware of the decisions they are making. Force feedback can prevent pilots from doing the wrong thing. It augments their information.”

RISE also works with the Gait and Motion Analysis Lab at the Long Beach Veterans Affairs Medical Center in hopes of improving the rehabilitation of lower-limb amputees. Collaborators include kinesiology’s Will Wu and electrical engineering’s I-Hung Khoo, with funding from the CSU Program for Education and Research in Biotechnology Joint Venture Grant.

The lab focuses on the development of a vibrotactile feedback device. Individuals using prostheses have limited feeling in the amputated limb and must rely on friction and pressure sensations felt at the skin-socket interface to perceive the state of their prosthesis, she explained. As a training aid, the vibrotactile device would generate vibrations on the prosthesis that the wearer can feel.

“Sometimes it is difficult for recent amputees to interpret information from their prostheses and they don’t immediately understand what it means,” she said. “The idea behind the project is rehabilitation and fall prevention as well as motor learning skills. Our idea is for an inexpensive universal design that can be used with standard prosthetic components for training.”

The lab’s third project, previously funded by METRANS, seeks to improve the ergonomics of crane operators working high above Long Beach Harbor. Electrical engineering’s Henry Yeh collaborates on the project.

This project involves a feasibility study on the integration of a multimodal (force and visual feedback) user interface to assist the crane operators during container handling, she explained. The lab focuses on the evaluation of sensor placement location, types of feedback and the design and construction of a scaled test bed now in the lab. The guidance will be provided as an overlay of sensory information and commands to improve efficiency and safety during container loading and unloading.

“We are interested in improving ergonomics for crane operators,” she said. “Sometimes they run into problems when they don’t get enough visual detail from way up there. We are thinking about computer aid to get more information to the operators. We are exploring an embedded system that could run most of everything on a single board and can be easily integrated to the existing crane console.”