Johns Hopkins professor Alex Szalay is leading a new effort to manage the ‘deluge’ of scientific data

Software Development

Jun. 11, 2018 10:25 am

The Open Storage Network is looking to provide a more efficient way for researchers to manage data. The National Science Foundation provided $1.8 million for the pilot.
Johns Hopkins Professor Alex Szalay inside the “Data-Scope.”

(Courtesy photo by Joey Pulone)

A Johns Hopkins astrophysicist is spearheading the creation of a new tool to manage what he calls the “data deluge” in academic research.

Alex Szalay is heading up a national team to create the Open Storage Network, which will provide a place to store and share scientific data. The National Science Foundation recently awarded the project a $1.8 million grant, which follows a $1 million seed grant awarded last year by Schmidt Futures, the philanthropic initiative of Alphabet’s Eric Schmidt.

Scientific research is producing massive amounts of data, and that growth has led to new efforts to pool resources. According to the JHU Hub, Szalay encountered the supernova of data firsthand through his past work on galaxies, and set out to build more efficient infrastructure to handle it all as director of the university’s Institute for Data Intensive Engineering and Science.

“The OSN team and their supporting collaborators will build a community to multiply the impact of previous and current NSF investments and anchor comprehensive data infrastructure that will be vital to the future of our nation’s scientific and engineering enterprise,” Erwin Gianchandani, acting assistant director of NSF Computer and Information Science and Engineering Directorate, said in a statement.

Data storage partners include the National Data Service, as well as NSF’s Big Data Regional Innovation Hubs at the University of California San Diego, University of Illinois and the University of North Carolina, along with computing centers in Pittsburgh and Holyoke, Mass.

Initial work involves creating data transfer systems that can move large volumes of data at the full speed of a 100-gigabit network connection using a small number of nodes, according to info provided by NSF. Over the next two years, researchers at the partner institutions will pilot the network to test performance, preservation capabilities, security and access. Software and service features will be added as the network is developed.

