Since 1987 - Covering the Fastest Computers in the World and the People Who Run Them

Dec. 12, 2022 - In 2002, as massively parallel supercomputers were offering more computing power to programmers, a significant limitation to overall effectiveness was the time it took for processors to communicate with each other. “Architectures were evolving quickly, and it was becoming clear that the dominant high-performance computing layer for communications, the Message Passing Interface (MPI), was not well suited at the time to take advantage of the Remote Memory Access (RMA) capabilities that were becoming available in network hardware,” said Dan Bonachea, Lawrence Berkeley National Laboratory ( Berkeley Lab) Computer Systems Engineer.

Recognizing that RMA would become an important feature in future HPC systems, Bonachea, then a UC Berkeley graduate student, dedicated his CS258 Spring 2002 semester project to developing a solution to this problem. Building on the Active Messages (AM) paradigm developed at UC Berkeley a decade before, he designed GASNet (short for Global-Address Space Networking), a network-independent and language-independent high-performance communication interface for implementing the runtime system of global address space languages such as UPC and Titanium. He published the first GASNet specification technical report in October of that year. He continued to build and improve on it as a graduate student researcher in Berkeley Lab’s Future Technologies Group and later as an engineer in the Lab’s Computer Languages and Systems Software Group, working in collaboration with Lab engineers and scientists including Paul Hargrove and Katherine Yelick.

Twenty years later, GASNet is thriving, with dozens of clients in academia, national laboratories, and industry. It was recently upgraded to support exascale scientific applications via the Department of Energy’s Pagoda project. This new version, called GASNet-EX, supports scientific applications in drug discovery ( NWChemEx), metagenomics research ( ExaBiome), COVID-19 infection simulation ( SIMCoV), and much more. Major companies also write software that depends on GASNet-EX, such as Hewlett Packard Enterprise’s (HPE’s) Chapel.

*Shallow Water Tsunami Simulation solving the shallow-water Navier-Stokes equations, written using an Actor library communicating via UPC++ and GASNet-EX.*

With GASNet, the idea was to isolate compiler writers from low-level hardware details. “It can take 20 years or more to develop a high-quality parallel-language compiler, which is a huge effort, and machines change much faster than that. Many application developers don’t even know that GASNet exists. We aimed to provide them with a virtual interface to the communication layer so they could target something stable that is hardware- and network-independent, and let GASNet help it work efficiently on various HPC systems,” said Bonachea. “They might know that they’re programming in Chapel, but they don’t know that this cool embedded library handles the communications services.”

“Most people who write software take pride in knowing that it’s being used, and we’re no exception. Looking back at what we’ve accomplished with GASNet, I’m proud that we’ve helped so many application developers reach their goals by allowing them to skip designing and implementing a network runtime,” added Hargrove.
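The Active Messages idea GASNet builds on can be sketched in a few lines: a message carries the index of a pre-registered handler plus a few small arguments, and the receiving side runs that handler when it polls the network. The toy Python below is purely illustrative and is not the GASNet API; the names `Node`, `am_request`, and `put_handler` are invented for this example, which shows how a PGAS runtime might build a “remote put” out of an active message.

```python
from collections import deque

class Node:
    """A toy 'rank': a handler table, an inbox, and a slice of address space."""
    def __init__(self, rank):
        self.rank = rank
        self.handlers = {}   # handler index -> callable
        self.inbox = deque() # stands in for the network receive queue
        self.memory = {}     # stands in for this node's address space

    def register_handler(self, index, fn):
        self.handlers[index] = fn

    def poll(self):
        """Drain the inbox, dispatching each message to its named handler."""
        while self.inbox:
            handler_index, args = self.inbox.popleft()
            self.handlers[handler_index](self, *args)

def am_request(dest, handler_index, *args):
    """Deliver an active message; in a real runtime this crosses the network."""
    dest.inbox.append((handler_index, args))

# Build a 'remote put' from an active message.
PUT_HANDLER = 1
def put_handler(node, key, value):
    node.memory[key] = value

nodes = [Node(0), Node(1)]
for n in nodes:
    n.register_handler(PUT_HANDLER, put_handler)

# Node 0 writes into node 1's address space; node 1 never posts a
# matching receive -- it just polls and dispatches.
am_request(nodes[1], PUT_HANDLER, "x", 42)
nodes[1].poll()
print(nodes[1].memory["x"])  # -> 42
```

The key contrast with two-sided message passing is visible here: the receiver's application code never names the sender or the message; the handler table alone decides what happens on arrival, which is what makes the interface a natural compilation target for one-sided RMA operations.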