Texas State professors look to advance artificial intelligence

Photo credit: Star File Photo

Ziek Sanchez, News Reporter

Researchers are developing a more advanced, eco-friendly, power-efficient and compact form of artificial intelligence (AI) technology in a new project.

Computer science professors Ziliang Zong and Yan Yan are developing the technology. Systems that can learn from their mistakes, recognize speech, understand language and identify unique faces are only some of the types of AI becoming prominent in the world of technology.

AI, in theory, can eliminate the need for humans to perform dangerous or tedious tasks like driving and fighting in wars. It could even eliminate the risk of human error when calculating important measurements.

Although AI has a promising future, it can be difficult to widen its reach into everyone’s lives. Limitations such as the large amounts of data to process, high and eco-unfriendly power consumption and bulky hardware all hinder the implementation of AI in devices such as smartwatches, smartphones and other popular mobile devices. Zong and Yan hope to tackle these problems and make way for improved AI technology.

Zong said his team is trying to move AI into a more advanced and efficient state, surpassing barriers that have hindered past efforts.

“We are trying to solve the problems with current AI research,” Zong said. “For example, AI deep learning models are very big. They cannot be loaded into mobile devices like cameras and drones. To load this information into edge devices we would have to make the AI model very small and non-resource intensive and non-power hungry because mobile devices don’t run on powerful enough batteries. We’re looking to improve traditional AI and make it more efficient for edge devices.”

Zong and Yan’s research project, Interpretable Multi-Modal Neural Network Pruning for Edge Devices, will be funded by a $500,000 grant from the Division of Computer and Network Systems of the National Science Foundation, an independent agency promoting the progress of science related to improving health, prosperity and welfare.

The project’s goal is to develop compact, accurate AI that can fit inside and be powered by “edge devices” such as modern-day mobile technology. The AI will be designed to think like a human by using text, video and audio all at once.

The team consists of Zong, Yan and three doctoral students, who will work to design the AI’s neural network to use the minimum amount of information needed to operate.

Yan said AI that can combine multiple kinds of information can increase its own efficiency and usefulness.

“We are trying to provide multi-modal learning for edge devices—an AI that can use and combine different kinds of information,” Yan said. “For example, we have things like video, audio and task information. We’re trying to find the answer on how do we combine all of this information and utilize it with AI.”

Cody Blakeney, computer science doctoral student, said AI that lives on smart devices could be very useful if it were easier to access.

“AI models can be very difficult for computers to run,” Blakeney said. “What I do on the project will help reduce the size that the model needs to operate with, increase the speed at which the AI can operate and make it so older and less powerful hardware will be able to run it.”
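Neural network pruning, the technique named in the project’s title, shrinks a model by removing weights that contribute little to its output. The sketch below illustrates one common, simple variant—magnitude pruning—using NumPy; it is an illustration only, not the team’s interpretable multi-modal method, and the function name and setup are assumptions.

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude fraction (`sparsity`) of weights.

    Illustrative sketch: real pruning methods also retrain the network
    afterward and often prune entire channels for hardware speedups.
    """
    flat = np.abs(weights).flatten()
    k = int(len(flat) * sparsity)
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value becomes the cutoff threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    pruned = weights.copy()
    pruned[np.abs(pruned) <= threshold] = 0.0
    return pruned

# Toy weight matrix standing in for one layer of a network
rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4))
pruned = magnitude_prune(w, 0.5)  # half the entries become zero
```

Zeroed weights can then be stored in sparse formats and skipped at inference time, which is what lets a trimmed model fit on weaker edge hardware.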

As the project advances, Zong and Yan hope to recruit undergraduate students to assist with its progress. The professors are set to present a paper detailing the project at the 2020 Winter Conference on Applications of Computer Vision in March of this year.

For more information on the Interpretable Multi-Modal Neural Network Pruning for Edge Devices project, visit the National Science Foundation’s page detailing the project.
