NMSU College of Arts and Sciences professors awarded prestigious NSF grants
Writer: Isabel A. Rodriguez, (575) 646-7066, firstname.lastname@example.org
New Mexico State University will soon see the arrival of two new research devices in the College of Arts and Sciences: a multi-collector inductively coupled plasma mass spectrometer with a laser sampling system, through the efforts of Frank Ramos, associate professor of geological sciences, and an instrument for research in irregular parallel big data computation, thanks to Jonathon Cook, associate professor of computer science.
Ramos and Cook were recognized for their work at an NMSU Research Rally Friday, Oct. 11. Both projects will be funded through the National Science Foundation’s Major Research Instrumentation program.
Ramos is working to establish a laboratory at Gardiner Hall, where users can conduct their research using innovative techniques involving isotopes. The mass spectrometer is designed to measure isotope ratios of different elements.
“In the earth sciences, these isotopes can be used as tracers to evaluate different reservoirs, or to date rocks and minerals,” he explained. “In addition, isotopes can be used to track what happens to magmas as they move, as well as the time it takes for them to ascend to the surface. They can also be used to track migration paths of fish, animals and humans, and many other uses.”
The new mass spectrometer will analyze samples more efficiently than the one currently available in NMSU’s geological sciences department. Unlike the present equipment, the new device will be able to measure isotopes in many materials without first chemically purifying mineral and melt components.
This type of information is useful in addressing problems in ore, volcanic and biological systems.
Professors from biology, anthropology, conservation ecology and other departments will be able to use the mass spectrometer. Students will also have the opportunity to use the equipment.
“The different uses of the equipment are only limited by the creativity of the scientists that are applying isotopes to address their research questions,” Ramos said. “It has multi-disciplinary potential.”
Unlike mass spectrometers at other universities, the new device will also include a laser sampling system.
Ramos and his colleagues will evaluate two different mass spectrometers before purchasing and installing one at NMSU.
In addition to the $500,000 NSF grant, Ramos (who is the Michael L. Johnson endowed chair) will receive $215,000 from the College of Arts and Sciences Johnson Endowment and $300,000 from Michael and Judy Johnson.
Cook’s $224,000 NSF grant will help acquire a computational instrument designed to support data-driven graph computations.
“It’s for big data research,” he said. “My area of interest is software engineering: what it’s doing, how it’s behaving and how it can improve. I generate lots of data. The instrument has computation nodes with large amounts of memory at each node to handle that data and to allow the computation fast access to the data.”
The type of analytical work Cook and his colleagues do is complementary to traditional computer science studies. The device will also be used as a learning tool for students.
“There is a tremendous need for students who understand data analysis, and they need access to computational resources and instruments like this,” he said. “We are already introducing a big data analysis graduate course this year. The instrument will provide a platform to deploy what we’ve learned into a bigger environment. This is really a research machine; it’s not a machine that’s big enough to, for example, process a large corporation’s amount of data.”
Cook’s research could potentially improve the performance of anything from scientific application software to video games.
“Even improving the software performance of video games could have an economic impact,” he said. “We get sample applications that represent some type of software we want to analyze. Then we bring those in-house and collect data from them.
“I think the most fascinating part is the amount of data we could actually process. It’s hard to relate, because we throw out numbers like gigabyte or megabyte, and to really understand that number is really hard. The example I’ve used is that we generate 2.5 exabytes of data per day, worldwide. If you had one exa-millimeter, that would get you to Jupiter from the sun and back 500 times. It’s incredible to visualize that number.”