I would even go as far as to say that the Argonne Mira is a better supercomputer than Tianhe-2.
By that logic, Japan dominates the supercomputing industry.
Lists
But there are a lot of metrics out there for measuring the performance of a supercomputer. Energy efficiency is of course one of them. There is also data-intensive modeling performance for big data analytics, and there is matrix calculation performance for simulation needs. Tianhe-2A is currently the fastest supercomputer for simulation purposes (LINPACK benchmark, source:
November 2015 | TOP500 Supercomputer Sites) and is ranked 6th in data-intensive computing performance. Japan's K Computer is the fastest supercomputer for data-intensive computing (a rough sketch of the two kinds of workload follows the Graph 500 link below):
November 2015 | Graph 500
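To illustrate why the two rankings can disagree, here is a rough, hypothetical sketch in C (not the actual benchmark codes): LINPACK-style simulation work is dominated by dense floating-point arithmetic, while Graph 500-style data-intensive work is dominated by irregular, cache-unfriendly memory accesses. A machine can be strong at one and only middling at the other.

```c
#include <stddef.h>

/* LINPACK-style work: dense matrix-vector product, dominated by
 * floating-point throughput (simulation-type workload). */
void dense_matvec(size_t n, const double *A, const double *x, double *y)
{
    for (size_t i = 0; i < n; i++) {
        double sum = 0.0;
        for (size_t j = 0; j < n; j++)
            sum += A[i * n + j] * x[j];
        y[i] = sum;
    }
}

/* Graph 500-style work: one level of breadth-first search over a CSR
 * graph, dominated by irregular memory accesses (data-intensive
 * workload). row[]/col[] hold the adjacency structure. */
size_t bfs_level(const size_t *row, const size_t *col,
                 const size_t *frontier, size_t fsize,
                 int *visited, size_t *next)
{
    size_t count = 0;
    for (size_t f = 0; f < fsize; f++) {
        size_t v = frontier[f];
        for (size_t e = row[v]; e < row[v + 1]; e++) {
            size_t u = col[e];
            if (!visited[u]) {          /* unpredictable, cache-unfriendly */
                visited[u] = 1;
                next[count++] = u;
            }
        }
    }
    return count;
}
```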
Fast as it is, it's inefficient. Coding for Tianhe-2 is a challenge.
This has no substance. Parallel processing is a hard field. Users, not the builders of the machine, are responsible for using it efficiently. Users should optimize their code as best they can in order to run it fast. Deciding how many threads to use, how many processes to launch, and rethinking sequential algorithms to come up with parallel implementations is the user's responsibility at every center. A minimal sketch of that kind of tuning is below.
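Here is a minimal sketch, assuming OpenMP and a toy dot product (the problem size and thread count are made up for illustration): the same loop can run on one thread or many, and choosing that number, and making the iterations independent in the first place, is the user's job.

```c
#include <omp.h>
#include <stdio.h>

#define N 10000000

int main(void)
{
    static double a[N], b[N];
    double sum = 0.0;

    /* User-chosen thread count: tuned per machine and per problem,
     * not something the machine builder can decide for you. */
    omp_set_num_threads(8);

    /* Iterations are independent, so they can be split across threads;
     * the reduction combines the per-thread partial sums correctly. */
    #pragma omp parallel for reduction(+:sum)
    for (long i = 0; i < N; i++) {
        a[i] = 0.5 * i;
        b[i] = 2.0 * i;
        sum += a[i] * b[i];
    }

    printf("dot = %f using up to %d threads\n", sum, omp_get_max_threads());
    return 0;
}
```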
Lazy and uneducated programmers will complain. That's a fact. But hey, what the heck is a mediocre programmer doing near those machines? A simple deadlock could create a lot of problems on those machines, so the code should be tested intensively before it is run on the supercomputer, because supercomputer time is limited and valuable. The sketch below shows how easy such a deadlock is to write.
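A hedged sketch, assuming MPI (the buffer size and ranks here are invented for illustration), of how an innocent-looking message exchange can deadlock and burn allocated machine time, and of the safe pattern that avoids it:

```c
#include <mpi.h>
#include <stdio.h>

#define N (1 << 20)

int main(int argc, char **argv)
{
    static double out[N], in[N];
    int rank, size, peer;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    peer = (rank == 0) ? 1 : 0;

    if (size >= 2 && rank < 2) {
        /* Deadlock-prone version (do NOT do this):
         *   MPI_Send(out, N, MPI_DOUBLE, peer, 0, MPI_COMM_WORLD);
         *   MPI_Recv(in,  N, MPI_DOUBLE, peer, 0, MPI_COMM_WORLD,
         *            MPI_STATUS_IGNORE);
         * If both ranks sit in MPI_Send waiting for the other to post a
         * receive, and the messages are too large to be buffered, the
         * job simply hangs. */

        /* Safe version: combined send/receive, no ordering trap. */
        MPI_Sendrecv(out, N, MPI_DOUBLE, peer, 0,
                     in,  N, MPI_DOUBLE, peer, 0,
                     MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        printf("rank %d exchanged %d doubles with rank %d\n", rank, N, peer);
    }

    MPI_Finalize();
    return 0;
}
```

This is exactly the kind of pattern worth testing on a small cluster before it ever touches the big machine.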
Those are challenges you experience on every system. I'm sorry to say it, but there is no other challenge specific to Tianhe-2A.
“It seems a lot of these massive machines, usually made with large government investment, lie idle after they are made, or are even abandoned midway, due to fundamental defects in China’s traditional bureaucratic management system,” remarks He.
This doesn't make a supercomputer overrated. It only shows that China has more supercomputing capacity than it actually uses. However, as time goes by and the technology becomes familiar to a larger group of scientists, the utilization rate will go up. Tianhe-1 also had a very low utilization rate when it first came out; today it's very hard to book time on it.
Chinese researchers are having a hard time programming code to unlock all that power, and the cost just to use it per day would set you back $60,000 to $100,000.
The U.S., Western, and Japanese have better software.
This is not about Tianhe-2A. This is about the parallel computing ability of Chinese researchers. Researchers not writing efficient software doesn't mean the machine is overrated. It means Chinese researchers lack experience using a supercomputer. With time, they will gain that experience.
Which is why I brought up: why didn't China just build three 10 to 15 Pflop supercomputers?
One could have done aerospace research, another nuclear energy/nuclear weapons research, and the third whatever else.
This is also nonsense. A supercomputer can be divided into pieces, and multiple users can use it simultaneously. Merging everything into one center actually makes it more efficient than building three different centers. A minimal sketch of the idea is below.
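A minimal sketch, assuming MPI (in practice the partitioning is done at a higher level by the job scheduler, and the group labels below are purely hypothetical): the processes of one large machine are split into independent groups that never see each other, which is what sharing one big center among several projects looks like.

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int world_rank, world_size, group, group_rank, group_size;
    MPI_Comm group_comm;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &world_rank);
    MPI_Comm_size(MPI_COMM_WORLD, &world_size);

    /* Split all processes into 3 independent groups -- think of them as
     * "aerospace", "nuclear", "other" (illustrative labels only). */
    group = world_rank % 3;
    MPI_Comm_split(MPI_COMM_WORLD, group, world_rank, &group_comm);

    MPI_Comm_rank(group_comm, &group_rank);
    MPI_Comm_size(group_comm, &group_size);
    printf("world rank %d/%d is rank %d/%d inside group %d\n",
           world_rank, world_size, group_rank, group_size, group);

    MPI_Comm_free(&group_comm);
    MPI_Finalize();
    return 0;
}
```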
The only thing that can differentiate the two is the programs/code written for them.
The code written for the Chinese Tianhe-2 isn't good, while the one in the U.S. has good code.
You know the theoretical peak for Tianhe-2 is 54.9 Pflops. That can only be achieved with 100% perfect software.
This is not about the machine, but about the experience of the researchers in the field of parallel processing. If they write inefficient code, they will get inefficient results. That doesn't mean the machine is overrated.
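For perspective on the gap between peak and practice (figures from the November 2015 TOP500 list): even the hand-tuned LINPACK run comes nowhere near the theoretical peak, and that is normal for any machine of this class.

efficiency = Rmax / Rpeak = 33.86 Pflops / 54.9 Pflops ≈ 0.62 (about 62%)

So an application code reaching a noticeably smaller fraction of peak says something about the code and the algorithm, not about the machine being overrated.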