(Photo by Stephen Babcock)
If the next wave of tech is coming from young folks taking time to tinker, then the Johns Hopkins Homewood Campus shoulda had a tsunami warning last weekend.
HopHacks gathered college students from around the region for the spring edition of the twice-annual hackathon that’s a stop on the Major League Hacking circuit.
Here’s a look at projects that stood out during the final pitches on Sunday:
- The overall $1,024 winner combined Google Street View and virtual reality in a somewhat unexpected way. The University of Maryland’s Kevin Chen, Johann Miller, Ricky Han and Jason Zou wanted to create a way to help memorize speeches. Users input a speech into the WebVR app, and a Google Street View location provides clues. Get a key part of the speech right, and advance to the next location. Get it wrong, and it provides a hint.
- Power and politics were on the minds of hackathon teams in the first post-election HopHacks. Taking Mr. Robot as a jumping-off point, this group was looking for hidden connections in something that’s there every day: the news. The team of Darren Geng, Nikhil Kulkarni, Jason Bak and Hugh Han looked to tap the New York Times API to find connections between people, places and governments or companies (a rough sketch of that approach follows this list). Using those connections, the second-place winners saw a path to create data visualizations.
- You’ve probably heard talk of cybersecurity and the blockchain recently. This project brought the pair together. The Johns Hopkins team of Andrew Fan, Eric Rothman and Ben Leibowitz zeroed in on the blockchain’s distributed data storage capabilities as a means to spread information about potentially malicious IP addresses and other threats. When an entity discovers a threat, it can use ThreatSync to share info about it. The project won third prize overall, as well as recognition from John Snow Labs and Amazon Web Services.
- A crowd favorite, this translator of foreign comics got an award from Google. Ryan Newell and Dylan Lewis wanted to translate manga from Japanese, so they tapped machine translation and computer vision to scan the text of comics and translate it into a reader’s language.
- One of two projects among the 10 finalists looking to develop technology for people with visual impairments, this app aims to tell users what is around them. It’s designed for the mundane tasks of navigating the world. That’s pretty broad, so the team identified two use cases: determining whether a walk sign was present, and whether the people nearby appeared happy. The team of Jin Yong Shin, Alex Ahn, William Yao and Gavi Rawson earned a nod from Johns Hopkins Medicine’s Technology Innovation Center.
- This project got recognition from Google. Jennifer Strong and Praveen Ravi created an Android app that uses audio to help the visually impaired navigate their surroundings.
- While Seasonal Affective Disorder wasn’t an issue this February weekend, the team of Emily Gong, Raghav Chetal, Michael Appel and Acheev Baghat was working on a hardware remedy in case winter returns. They created an RGB LED strip that activates when an alarm clock goes off. That way, the sun will always come up (see the second sketch below).
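For the curious, here’s a rough idea of how the news-connections project could work. This is a minimal sketch, not the team’s actual code: it assumes a placeholder API key and uses the public New York Times Article Search endpoint to count which people, organizations and places show up in the same stories.

```python
# Minimal sketch: pull articles from the NYT Article Search API and count
# which named subjects (people, organizations, places) appear together.
# NYT_API_KEY is a placeholder; the team's actual pipeline isn't documented here.
from collections import Counter
from itertools import combinations

import requests

NYT_API_KEY = "YOUR_KEY_HERE"  # placeholder
SEARCH_URL = "https://api.nytimes.com/svc/search/v2/articlesearch.json"

def fetch_articles(query, page=0):
    """Return one page of article metadata for a search term."""
    resp = requests.get(
        SEARCH_URL,
        params={"q": query, "page": page, "api-key": NYT_API_KEY},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["response"]["docs"]

def co_mentions(docs):
    """Count how often pairs of keywords share an article."""
    pairs = Counter()
    for doc in docs:
        names = sorted({kw["value"] for kw in doc.get("keywords", [])
                        if kw.get("name") in ("persons", "organizations", "glocations")})
        pairs.update(combinations(names, 2))
    return pairs

if __name__ == "__main__":
    for (a, b), count in co_mentions(fetch_articles("election")).most_common(10):
        print(f"{a} <-> {b}: {count}")
```

Pairs that co-occur most often would become the nodes and edges of the data visualizations the team had in mind.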
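And here’s the flavor of the sunrise-alarm idea. Again, a minimal sketch rather than the team’s build: it assumes a NeoPixel-style strip driven from a Raspberry Pi with Adafruit’s CircuitPython libraries, and a hard-coded wake-up time standing in for real alarm-clock integration.

```python
# Minimal sketch of a sunrise alarm: ramp an RGB LED strip from dark to a warm
# hue at a set wake-up time. Assumes a NeoPixel strip on a Raspberry Pi; the
# team's exact hardware and alarm integration aren't documented here.
import time
from datetime import datetime

import board
import neopixel

NUM_PIXELS = 60
WAKE_TIME = "07:00"             # placeholder wake-up time
RAMP_SECONDS = 15 * 60          # brighten over 15 minutes
SUNRISE_COLOR = (255, 160, 60)  # warm, sun-like hue

pixels = neopixel.NeoPixel(board.D18, NUM_PIXELS, auto_write=False)

def set_brightness(level):
    """Scale the sunrise color by a 0.0-1.0 brightness level."""
    pixels.fill(tuple(int(c * level) for c in SUNRISE_COLOR))
    pixels.show()

while True:
    if datetime.now().strftime("%H:%M") == WAKE_TIME:
        start = time.monotonic()
        while (elapsed := time.monotonic() - start) < RAMP_SECONDS:
            set_brightness(elapsed / RAMP_SECONDS)
            time.sleep(1)
        set_brightness(1.0)  # hold full brightness once "sunrise" completes
    time.sleep(20)
```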