Research

Our group develops and applies machine learning algorithms and tools for simulation, data analytics, and visualization on both parallel and distributed computing systems.

Smart Communities, Smart Responders (SCSR) – An AI for IoT Prize Competition

In February 2022, Texas A&M University was awarded nearly $1,200,000 through the Public Safety Innovation Accelerator Program: Artificial Intelligence for IoT Information (PSIAP-AI3) Prize Competition. The rapid deployment of 5G infrastructure, Internet-of-Things (IoT) devices, smart buildings, connected transportation, and public safety data streams benefits communities across the country. However, the volume and variety of data these technologies produce make it difficult for public safety leaders and individual first responders to put that data to use. Through this project, our team seeks to accelerate real-time data visualization and rapid integration of IoT sensors for first responders, giving these stakeholders access to diverse streams of IoT data in usable formats that help them solve complex challenges and improve America's public safety capability.
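As a rough illustration of the sensor-integration step, the sketch below normalizes heterogeneous IoT payloads into a single record format that a responder-facing dashboard could consume. The vendor names, payload layouts, and field names are hypothetical examples, not drawn from the project.

```python
# Sketch: map heterogeneous IoT sensor payloads onto one common record schema.
# The vendor payload layouts and field names below are hypothetical examples.
from datetime import datetime, timezone

def normalize_reading(vendor: str, payload: dict) -> dict:
    """Return a common record: sensor id, kind, value, unit, UTC timestamp."""
    if vendor == "vendor_a":  # e.g. {"dev": "TMP-12", "temp_f": 101.3, "ts": 1706000000}
        return {
            "sensor_id": payload["dev"],
            "kind": "temperature",
            "value": (payload["temp_f"] - 32) * 5 / 9,
            "unit": "C",
            "time": datetime.fromtimestamp(payload["ts"], tz=timezone.utc).isoformat(),
        }
    if vendor == "vendor_b":  # e.g. {"id": "CO-7", "ppm": 12, "time": "2024-01-23T10:15:00Z"}
        return {
            "sensor_id": payload["id"],
            "kind": "carbon_monoxide",
            "value": float(payload["ppm"]),
            "unit": "ppm",
            "time": payload["time"],
        }
    raise ValueError(f"unknown vendor: {vendor}")

print(normalize_reading("vendor_a", {"dev": "TMP-12", "temp_f": 101.3, "ts": 1706000000}))
```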

PIs: Walt Magnussen, Jian Tao

Members Involved: Nicole Hatch

PSIAP Artificial Intelligence for IoT Information (AI3)

Creating a Shared Public Safety Radio Data Set for Research and Analysis

In September 2021, Texas A&M University was awarded more than $900,000 through the Public Safety Innovation Accelerator Program: Public Safety Radio Data Grant. As public safety transitions from Land Mobile Radio (LMR) systems to packet-based broadband systems such as LTE (Long-Term Evolution) and 5G, shared public safety radio datasets are becoming increasingly important. To support this transition, Texas A&M is gathering radio data from several public safety organizations to develop traffic models and determine overall system requirements.
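As a simple illustration of the kind of traffic modeling such a shared dataset enables, the sketch below estimates offered load from a call arrival rate and mean hold time and evaluates Erlang-B blocking for a range of channel counts. The numeric inputs are placeholders for illustration, not measurements from the dataset.

```python
# Sketch: basic radio traffic sizing from call statistics (illustrative numbers only).

def offered_load_erlangs(calls_per_hour: float, mean_hold_seconds: float) -> float:
    """Offered traffic A = arrival rate x mean holding time, in Erlangs."""
    return calls_per_hour * mean_hold_seconds / 3600.0

def erlang_b_blocking(erlangs: float, channels: int) -> float:
    """Erlang-B blocking probability via the standard recurrence B_m = A*B_{m-1} / (m + A*B_{m-1})."""
    b = 1.0
    for m in range(1, channels + 1):
        b = (erlangs * b) / (m + erlangs * b)
    return b

# Placeholder figures: 120 calls/hour with a 45-second mean hold time -> 1.5 Erlangs.
a = offered_load_erlangs(calls_per_hour=120, mean_hold_seconds=45)
for n in range(1, 8):
    print(f"{n} channels -> blocking {erlang_b_blocking(a, n):.3%}")
```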

PIs: Walt Magnussen, Nick Duffield, Jian Tao, Michael E. Fox, Joan Quintana

Members Involved: David Santos

NIST-PSIAP-PSCR 2021

Food for Thought Competition: Using NLP and Machine Learning to Link Food and Nutrition Databases

The US Department of Agriculture (USDA)’s domestic food and nutrition assistance programs affect the daily lives of millions of people, from almost half the infants born in the US to older adults, and they provide a nutrition safety net to ensure that no eligible American goes hungry. About one in four people participates in at least one program at some point during a given year. Even more people rely on USDA guidance about what to eat for a nutritious diet, from individuals to organizations such as hospitals, assisted living facilities, and childcare centers. This competition challenges data scientists to use machine learning and natural language processing to link food and nutrition databases at scale, helping millions of Americans make better food and nutrition choices while saving US taxpayers money.
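A minimal sketch of the kind of database linkage the competition asks for, assuming two small lists of food descriptions stand in for the databases: items are matched by character n-gram TF-IDF cosine similarity. This is only one simple approach, not the competition's reference solution.

```python
# Sketch: link food descriptions across two databases with character n-gram TF-IDF.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Toy stand-ins for entries from two food/nutrition databases (assumed examples).
source_items = ["cheddar cheese, shredded", "whole wheat bread", "orange juice, fresh"]
target_items = ["Cheese, cheddar", "Bread, whole-wheat", "Juice, orange, raw", "Milk, whole"]

# Character n-grams are robust to differing word order and abbreviations.
vectorizer = TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4))
vectors = vectorizer.fit_transform(source_items + target_items)
src_vecs, tgt_vecs = vectors[: len(source_items)], vectors[len(source_items):]

similarity = cosine_similarity(src_vecs, tgt_vecs)
for i, item in enumerate(source_items):
    j = similarity[i].argmax()
    print(f"{item!r} -> {target_items[j]!r} (score {similarity[i, j]:.2f})")
```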

PIs: Jian Tao

Members Involved: Sree Kiran Prasa

USDA-Food for Thought 2022

Collaborative project to develop OSS that visualizes and analyzes research data from VIVO systems

In this project we will create RDash, a web-based, data-driven research intelligence platform for analyzing the Texas A&M research enterprise. We envision this platform as a core component of an integrated and comprehensive Decision Support System (DSS) for research enterprises. Within the scope of this project, our team aims to: (1) create a dashboard for interactive analysis of the Texas A&M research enterprise, (2) identify existing Texas A&M research capacity for strategic assessment, (3) identify Texas A&M subject matter experts and research clusters, (4) advance opportunities for interdisciplinary research, and (5) strategically map research capacity to state and national priorities and funding opportunities.
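As a rough sketch of how a dashboard like RDash could pull source data out of a VIVO system, the example below posts a SPARQL query for per-person publication counts. The endpoint URL, credentials, and exact ontology properties are assumptions that vary by VIVO version and configuration; this is not the project's implementation.

```python
# Sketch: query a VIVO instance's SPARQL API for publication counts per researcher.
# Endpoint URL, credentials, and ontology property paths below are assumptions.
import requests

VIVO_SPARQL = "https://vivo.example.edu/api/sparqlQuery"   # hypothetical endpoint

QUERY = """
PREFIX vivo: <http://vivoweb.org/ontology/core#>
PREFIX bibo: <http://purl.org/ontology/bibo/>
SELECT ?person (COUNT(DISTINCT ?pub) AS ?pubs)
WHERE {
  ?person vivo:relatedBy ?authorship .
  ?authorship a vivo:Authorship ;
              vivo:relates ?pub .
  ?pub a bibo:Document .
}
GROUP BY ?person
ORDER BY DESC(?pubs)
LIMIT 25
"""

resp = requests.post(
    VIVO_SPARQL,
    data={"email": "api_user@example.edu", "password": "changeme", "query": QUERY},
    headers={"Accept": "application/sparql-results+json"},
    timeout=30,
)
resp.raise_for_status()
for row in resp.json()["results"]["bindings"]:
    print(row["person"]["value"], row["pubs"]["value"])
```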

PIs: Bruce Herbert

Members Involved: Sree Kiran Prasa

2021 Catalyst Fund Projects

Auditory Perception in CARLA

The project began with developing a virtual RELLIS campus so that researchers could practice controlling autonomous vehicles before physical deployment. As the map became more refined, interest grew in adding auditory perception to the simulator. The primary focus is emergency-vehicle recognition: the autonomous vehicle should pull aside once a siren is detected. We plan to use binary classification on spectrograms to decide whether a sound is a siren, and to modify the Unreal Engine assets used by the simulator so that they produce the sounds the model needs to detect.
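A minimal sketch of the spectrogram-based binary classifier described above, assuming short labeled audio clips are available on disk. The file path, clip length, and network size are illustrative placeholders rather than the project's actual pipeline.

```python
# Sketch: siren / not-siren classification from a log-mel spectrogram with a tiny CNN.
import librosa
import numpy as np
import torch
import torch.nn as nn

def clip_to_logmel(path: str, sr: int = 22050, n_mels: int = 64) -> torch.Tensor:
    """Load a short audio clip and convert it to a fixed-size log-mel spectrogram."""
    audio, _ = librosa.load(path, sr=sr, duration=2.0)
    audio = librosa.util.fix_length(audio, size=sr * 2)           # pad or trim to 2 s
    mel = librosa.feature.melspectrogram(y=audio, sr=sr, n_mels=n_mels)
    logmel = librosa.power_to_db(mel, ref=np.max)
    return torch.from_numpy(logmel).float().unsqueeze(0)          # (1, n_mels, frames)

class SirenNet(nn.Module):
    """Small CNN that outputs a single logit: siren vs. not siren."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(1))

model = SirenNet()                                # untrained; for illustration only
x = clip_to_logmel("siren_example.wav")           # hypothetical labeled clip
prob_siren = torch.sigmoid(model(x.unsqueeze(0))).item()
print(f"P(siren) = {prob_siren:.2f}")
```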

PIs: Jian Tao

Members Involved: Nina Kang, Alyssa Cassity

Virtual RELLIS Site

Combining high-resolution climate simulations with ocean biogeochemistry, fisheries and decision-making models to improve sustainable fisheries

Climate change is posing new threats to West Coast communities that depend on fisheries. A new National Science Foundation Convergence Accelerator-funded research project led by Texas A&M University scientists is tackling those challenges using cutting-edge modeling and decision-making technologies. The project is a large multi-institutional endeavor, led by Piers Chapman, research professor in the Department of Oceanography, and brings together scientists from academia, federal agencies, and industry.

PIs: Piers Chapman

Members Involved: Revanth Reddy, Alyssa Cassity

West Coast Fisheries Management