Supercomputers in Latin America
December 13, 2017
Why is it important to have a high-performance computer?
Supercomputers can be found in Mexico, Brazil, Argentina, Chile, Peru and Ecuador.

Supercomputer, National University of San Luis Gonzaga de Ica

Think of an ordinary car: any sedan, for example. Now think of a Formula 1 car. That is the relationship between a desktop computer, like the one you use at home or at the office every day, and a supercomputer (or high-performance computer, HPC). Although both are cars (or computers), that is about all they have in common. Like the Formula 1 cars that Hamilton or Vettel drive, supercomputers cost a great deal and require expert personnel to operate them, and they are suited to highly specialized problems: you would not use a race car to go to the market, nor would you use a supercomputer to read your mail or check your Facebook wall.
Companies like Cray and IBM are the traditional firms behind these monsters, which work on some of the biggest problems in science and engineering: the origins of the universe or new medicines against cancer. They are special machines because of the technology inside them: a single one can have 10,000 processors. For reasons like this, they are expensive: the top 100 cost more than 20 million dollars per unit. In Latin America, several countries have their own supercomputers. What they are for, where they are and what their potential is are questions we must answer to get the most out of them.
What is a supercomputer?
According to the IOSR Journal of Computer Engineering, "high performance computing (HPC) refers to the practice of aggregating computing power in a way that delivers much higher performance than a typical desktop computer in order to solve large problems in science, engineering or business." A useful way to understand what high-performance computers are, suggested by InsideHPC, is to think about what is inside them. Inside your everyday computer you can find processors, memory, disks and an operating system. A supercomputer has all of that, only much more of each. HPCs are really groups of computers (something like an octopus with many brains). Each individual computer in a small cluster has between one and four processors, and current processors typically have two to four cores. People who specialize in HPC often refer to the individual computers within a cluster as nodes. In business terms, for example, four nodes, or 16 cores, could be enough for a small company. A common cluster size in many businesses is between 16 and 64 nodes, or between 64 and 256 cores. The idea of putting individual nodes to work together is to solve a big problem that no single ordinary computer can solve on its own. Like people, the nodes need to communicate with each other for the group's work to make sense; for this, of course, they are connected through networks.
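To make the idea of cooperating nodes concrete, here is a minimal sketch in Python using the mpi4py library (an illustrative assumption; the article names no particular software). Each process plays the role of a node: it computes a partial sum of a range of numbers, and the partial results are then combined over the network.

```python
# A minimal sketch of nodes cooperating on one problem, assuming MPI and
# the mpi4py package are installed (illustrative choices, not from the
# article). Run with, e.g.: mpirun -np 4 python sum_cluster.py
from mpi4py import MPI

comm = MPI.COMM_WORLD      # the group of communicating processes ("nodes")
rank = comm.Get_rank()     # this process's id within the group
size = comm.Get_size()     # how many processes are cooperating

N = 10_000_000
# Split the work: each rank sums every size-th number starting at rank.
partial = sum(range(rank, N, size))

# Combine the partial sums over the network; rank 0 receives the result.
total = comm.reduce(partial, op=MPI.SUM, root=0)

if rank == 0:
    print(f"{size} processes computed total = {total}")
```

The same pattern, scaled up to thousands of nodes, is how a cluster attacks a problem that no single machine could handle.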
What is it for?
The usefulness of HPCs (supercomputers) is wide-ranging, applicable in many areas. In their early versions, from the 1970s to date, they have been useful for weather forecasting and aerodynamic research (Cray-1), and in areas such as probabilistic analysis and radiological protection (CDC Cyber). They have also served in brute-force decryption (EFF DES Cracker). Others ran illustrative 3D simulations of nuclear tests under the Nuclear Non-Proliferation Treaty (ASCI Q). More recently, they have made simulations of molecular dynamics possible (Tianhe-1A). One of the representative firms of this technology is IBM, whose supercomputer Blue Gene/P managed to simulate a number of artificial neurons equivalent to approximately 1% of the cerebral cortex: 1.6 billion neurons with approximately 9 trillion connections.
The same research also simulated a number of artificial neurons equivalent to a rat's brain. The National Oceanic and Atmospheric Administration (NOAA) uses supercomputers to process hundreds of millions of data points (numerical and graphical) to make more accurate forecasts. The US Advanced Simulation and Computing Program also uses supercomputers to maintain the United States' nuclear arsenal and simulate everything that could be done with it.
The (cold) war of the petaflops
In the supercomputer war, China has taken things very seriously and already surpasses the United States (once the leader), having topped the list of the world's fastest computers for more than a handful of years. One example is how the country has used one of these monsters to create the largest digitally generated version of the Universe. The Sunway TaihuLight, the most powerful supercomputer of the moment, simulated "the birth and expansion of the Universe" using 10 billion digital particles, something Chinese experts have described as "a warm-up exercise". Today, the country is working on building a supercomputer capable of performing a quintillion (10^18) calculations per second and expects the first prototype to be completed this year. With this kind of technology, it hopes to be the leading country in artificial intelligence by 2030.
There are also supercomputers in Europe (Belgium, Germany, France, Spain and others have them), in research centers and high-performance computing services serving the entire scientific community. Currently the Chinese Sunway leads them all, with its 93 petaflops (1 petaflops, with the s in both singular and plural, being the capacity of a computer to perform 10^15 operations and processing calculations in one second).
Supercomputers in the region

In Mexico, supercomputers have been in use for years, mainly at educational and research centers. Clearly they are not at the level of China or the United States, but Mexico has its own. Abacus, for example, acquired by the Center for Research and Advanced Studies (Cinvestav) in 2014, was one of the 150 fastest supercomputers that year, with 8,904 cores, 1.2 petabytes of storage and 40 TB of RAM, reaching up to 400 teraflops. The same center also has Xiuhcoatl, the fire serpent, launched in 2012 with around 50 teraflops of processing power. Yoltla, the hotbed of knowledge, was installed in 2014 on the campus of the Autonomous Metropolitan University, with peaks of up to 45 teraflops. The National Autonomous University of Mexico also played its part with Atócatl (2011) and Miztli (2013). The supercomputing laboratory of the Benemérita Autonomous University of Puebla (BUAP) also stands out: it has one supercomputer with a maximum capacity of 200 teraflops and a second machine with 135 teraflops. All this technology is used in varied and complex studies, from astrophysics research to studies of the damage that radiation causes to DNA molecules at very low energies, below one electron-volt.

Brazil is the other Latin American supercomputing power, although with some problems. Its fastest computer is Santos Dumont, which has three modules (each in the worldwide top 500). Together they deliver 1.1 petaflops of performance, which in simple terms could be a million times faster than a common notebook. It has researched protein chains in search of a cure for Alzheimer's disease and worked on a genetic mapping of the Zika virus, but it had to be shut down because it consumed too much energy and that problem could not be solved. Brazil also has Cimatec Yemoja, the second in the country, with 400 teraflops; Grifo04, with 251.5 teraflops; and Tupã, with 214 teraflops. They are used in the oil industry, geophysics, pharmaceuticals and chemistry, among other fields.

In Colombia, the Industrial University of Santander (UIS) owns Guane-1 and EAFIT University has the Apolo supercomputer, while the Center for Bioinformatics and Computational Biology (Bios) has its supercomputer dedicated to biotechnology, capable of analyzing the genome of a bacterium in just 15 minutes, a task that used to take about 20 days.
Argentina has Tupac, designed to simulate hydraulic fracturing processes for the oil industry, meteorology and car manufacturing. Also noteworthy is Mendieta, a supercomputer installed at the National University of Córdoba, whose 14.7 teraflops serve research in astronomy, engineering, medicine, Big Data and censuses, among other areas.
Ecuador, for its part, has a supercomputer of up to 350 teraflops at Yachay (a planned university city). Chile has had Leftraru since 2015, a supercomputer with a processing capacity of 50 teraflops belonging to the Center for Mathematical Modeling (CMM) of the Faculty of Physical and Mathematical Sciences of the University of Chile.
Peru's moment has also arrived: with the acquisition of a supercomputer from a Chinese multinational by the National University of San Luis Gonzaga de Ica (UNICA), the country aims to take areas such as meteorology, astrophysics, pharmaceuticals and agriculture to a new level, as well as fields such as data mining. Work is already under way to integrate the supercomputer with the PerúSAT-1 satellite, operated by the space agency Conida, in order to develop more accurate weather forecasts, as authorities of the university told N+1. Construction of the data center where the equipment will be installed on campus is in progress.
Keys

How are supercomputers measured?
The most commonly used criterion is calculation capacity, which is measured in flops (floating-point operations per second). Each addition, subtraction, multiplication or division counts as one floating-point operation; one such operation per second equals one flops.
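As a rough illustration of the idea (only a sketch: formal rankings such as the Top500 list mentioned below use the LINPACK benchmark), you can estimate a single machine's floating-point throughput by timing a large matrix multiplication, which takes about 2n^3 operations for n x n matrices:

```python
# A rough, illustrative estimate of floating-point throughput.
# This is not a formal benchmark; Top500 rankings use LINPACK.
import time
import numpy as np

n = 2000
a = np.random.rand(n, n)
b = np.random.rand(n, n)

start = time.perf_counter()
c = a @ b                            # about 2 * n**3 floating-point operations
elapsed = time.perf_counter() - start

flops = 2 * n**3 / elapsed           # operations per second
print(f"~{flops / 1e9:.1f} gigaflops on this machine")
```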
What is a teraflops and a petaflops?
Flops is a measure of calculation performance, where tera = 10^12 and peta = 10^15. If a supercomputer delivers one petaflops, for example, it means that it can perform 1,000,000,000,000,000 basic arithmetic operations per second.
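As a worked example, this is the arithmetic behind the comparison with a common notebook made earlier in the article (the 100-gigaflops laptop figure is an assumption for illustration):

```python
# Worked conversion: the Sunway TaihuLight's 93 petaflops versus a
# hypothetical laptop rated at 100 gigaflops (an assumed figure).
GIGA = 10**9
PETA = 10**15

supercomputer = 93 * PETA            # 93 petaflops, cited above
laptop = 100 * GIGA                  # assumed laptop throughput

print(f"{supercomputer:.0e} operations per second")        # 9e+16
print(f"~{supercomputer / laptop:,.0f} times the laptop")  # ~930,000
```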
In what areas can they be useful?
The list is long: predicting the weather, looking for patterns in climate change, studying the universe, simulating the effects of a heart operation, a nuclear test or an earthquake, simulating brain functions, genomic analysis, predicting the effect of new drugs on proteins, searching for minerals or oil more accurately, and studying big data, such as analyzing customer behavior in business.

What software do they use?

When it comes to software, an old acquaintance appears that, at the mobile and desktop level, we had perhaps already forgotten amid the predominance of Android and Windows: Linux. In supercomputing, free software sweeps the board: in the latest Top500 list it was installed on 498 systems.