Originally posted by Brian Mosley on the CRA Policy Blog
On April 26th, the Coalition for National Science Funding (CNSF), an alliance of over 140 professional organizations, universities, and businesses, held its 22nd Annual Capitol Hill Exhibition. CNSF supports the goal of increasing the federal investment in the National Science Foundation’s research and education programs, and the exhibition itself is a great way to show members of Congress and their staff what research the American people have funded.
This year the Computing Research Association, a member of CNSF, sponsored the research group led by Vijaykrishnan Narayanan at Penn State University, which demonstrated multiple pieces of technology under the title “Visual Shopping Assistance for Person with Visual Impairment.” Dr. Narayanan was assisted in exhibiting his group’s research by several of his students and colleagues: Nandhini Chandramoorthy and Peter Zientra, PhD students at Penn State; Ikenna Okafor and Gus Smith, both undergraduate researchers at Penn State; Kevin Irick, formerly of Penn State and now founder and CEO of SiliconScapes; and Laurent Itti, professor of computer science, psychology, and neuroscience at the University of Southern California. The group’s research is conducted under the “Visual Cortex on Silicon” project, which is funded by NSF’s Expeditions in Computing program.
The group demonstrated two pieces of technology at the exhibition: a smart glove with tactile feedback and a visual assistance eyepiece. The smart glove was demonstrated by Ikenna Okafor and Gus Smith; the photo above, with Mr. Smith and France Córdova, shows the device. You can view a demonstration of the glove on the research group’s website (first video). The device has a camera in the palm and interprets the visual data to detect a desired object and direct the user to it using tactile feedback in the glove. The eyepiece, demonstrated by Kevin Irick and Peter Zientra, interprets visual data to read the labels of products in a grocery store aisle and directs the user, via audio directions, to a specific item. You can see Kevin demonstrating the glasses, along with the mock grocery aisle, in the photo above. Both devices have clear applications for people with visual impairments, as well as broader computer vision uses.
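To make the glove’s guidance loop concrete, here is a minimal sketch of one way such a device could turn a detected object’s position in the palm camera’s frame into a tactile cue. None of these function or parameter names come from the project; they are purely illustrative assumptions.

```python
# Hypothetical sketch (not the project's actual code): convert the
# position of a detected object in the palm camera's frame into a
# direction cue that could drive the glove's tactile feedback.

def tactile_cue(bbox, frame_size, center_tolerance=0.15):
    """Map a detected object's bounding box to a haptic direction cue.

    bbox: (x, y, w, h) of the detected object in pixels.
    frame_size: (width, height) of the camera frame in pixels.
    Returns "left", "right", "up", "down", or "forward"
    ("forward" meaning the object is roughly centered in view).
    """
    x, y, w, h = bbox
    fw, fh = frame_size
    # Offset of the object's center from the frame center, normalized to [-0.5, 0.5].
    dx = (x + w / 2) / fw - 0.5
    dy = (y + h / 2) / fh - 0.5
    if abs(dx) <= center_tolerance and abs(dy) <= center_tolerance:
        return "forward"
    # Correct the larger deviation first.
    if abs(dx) >= abs(dy):
        return "left" if dx < 0 else "right"
    return "up" if dy < 0 else "down"


if __name__ == "__main__":
    # Object in the right half of a 640x480 frame -> steer the hand right.
    print(tactile_cue((500, 200, 60, 60), (640, 480)))  # right
    # Object roughly centered -> move straight ahead.
    print(tactile_cue((300, 220, 40, 40), (640, 480)))  # forward
```

In a real device, each returned direction would presumably pulse a different vibration motor in the glove; the actual control logic in the group’s prototype is not described in this post.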
The final demonstration was performed by Laurent Itti of the University of Southern California, a member of the Visual Cortex on Silicon project. Dr. Itti demoed his visual attention, face detection, and object recognition algorithm, which is able to detect both faces and movement of people on the fly. With a miniaturized camera pointed at the passing crowd at the exhibition, the system detected the faces of anyone facing the camera and highlighted them with different colored boxes on the computer screen.
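For readers curious how a system can flag movement between camera frames on the fly, here is a toy sketch using simple frame differencing. This is emphatically not the Visual Cortex on Silicon algorithm, which is far more sophisticated; it only illustrates the basic idea of locating a changed region between two frames.

```python
# Toy motion-detection sketch by frame differencing (an illustrative
# stand-in, NOT the project's actual algorithm). Pixels whose intensity
# changes beyond a threshold are collected into one bounding box, which
# a display loop could then draw as a colored rectangle.
import numpy as np

def moving_region(prev_frame, frame, threshold=30):
    """Return the bounding box (x, y, w, h) of pixels that changed
    between two grayscale frames, or None if nothing moved."""
    # Cast to a signed type so the subtraction cannot wrap around.
    diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
    ys, xs = np.nonzero(diff > threshold)  # row (y) and column (x) indices
    if xs.size == 0:
        return None
    x, y = xs.min(), ys.min()
    return (int(x), int(y), int(xs.max() - x + 1), int(ys.max() - y + 1))


if __name__ == "__main__":
    # Two synthetic 100x100 frames: a bright 10x10 patch appears at (40, 20).
    prev = np.zeros((100, 100), dtype=np.uint8)
    curr = prev.copy()
    curr[20:30, 40:50] = 255
    print(moving_region(prev, curr))  # (40, 20, 10, 10)
```

A real demo like Dr. Itti’s would run this kind of analysis many times per second on live video and track multiple people independently, which is where the harder research problems lie.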
All of this work is supported by NSF’s Directorate for Computer and Information Science and Engineering (CISE). All three demonstrations were well received by the attendees of the exhibition; in fact, the students fielded questions from Congressional staffers; NSF Program Officers; the Assistant Director of CISE, Jim Kurose; and even the NSF Director, France Córdova.
A number of other organizations had displays and were demonstrating NSF-funded research at the event. From the Massachusetts Institute of Technology’s “Listening to Einstein’s Universe with LIGO,” to the American Economic Association’s “The Future of Data for Social Science Research,” to the American Mathematical Society’s “On the Movement of Cells, Birds, Fish and Other Agents: Mathematical Modeling in Biology and Ecology,” the exhibition was a great display of the different types of research being supported by NSF. Look here to see a list of some of the participating organizations and what a few of the exhibitors were presenting.