Unlimited Computing is currently drafting standards for how supercomputing hardware and software should be designed, and for how the accompanying services should be designed, to deliver optimal performance and compatibility with future systems.
The standards could also apply, to some degree, to ordinary computers, because we are already approaching a ceiling on how much data a typical desktop computer can transfer and store per day.
The process started as a result of a design that requires more than 64-bit computing power; quantum computers, by comparison, already reach as many as 512 qubits.
Every day, larger storage capacity, more computing power, more memory, and higher network speeds are needed to handle the exponential growth in the data that must be processed and accessed.
The need for standards arises when software must be written for different OS platforms and different word lengths, such as 32-bit, 64-bit, and beyond. A standard is needed in order to compile software for hardware running at more than 64 bits. This would ease the work of software companies around the world as they develop operating systems and applications for different hardware.
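One existing convention already points in this direction: the fixed-width integer types of C's <stdint.h>. The sketch below is only an illustration of the portability problem described above, not part of any Unlimited Computing draft; the struct and function names are hypothetical.

```c
#include <stdint.h>

/* Hypothetical record layout: by using the fixed-width types from <stdint.h>
   instead of plain int/long, the same source describes exactly the same bit
   layout whether it is compiled for a 32-bit, 64-bit, or wider target. */
typedef struct {
    uint32_t record_id;    /* always exactly 32 bits */
    uint64_t byte_offset;  /* always exactly 64 bits */
    int16_t  flags;        /* always exactly 16 bits */
} portable_index_entry;

/* Unsigned arithmetic on uint64_t wraps modulo 2^64 on every conforming
   compiler, so results do not drift between targets of different native
   word size. */
uint64_t wrap_add(uint64_t a, uint64_t b) {
    return a + b;  /* well-defined wraparound, independent of hardware width */
}
```

A standard for beyond-64-bit hardware would presumably extend this idea, so that today's source code keeps its exact semantics when recompiled for wider machines.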
The need for standards also arises from data-integrity requirements, both for storage solutions and for network connections.
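One common building block for such integrity checks is an application-level checksum computed independently of the transport. As a minimal sketch (real systems would use a stronger code such as CRC-32 or SHA-256), here is the classic Fletcher-16 checksum; it is shown only to illustrate end-to-end verification, and is not taken from any draft standard.

```c
#include <stdint.h>
#include <stddef.h>

/* Fletcher-16: two running sums modulo 255. Unlike a plain additive sum,
   the second sum makes the result sensitive to byte order, so swapped or
   shifted bytes are also detected, not just changed ones. */
uint16_t fletcher16(const uint8_t *data, size_t len) {
    uint16_t sum1 = 0, sum2 = 0;
    for (size_t i = 0; i < len; i++) {
        sum1 = (uint16_t)((sum1 + data[i]) % 255);
        sum2 = (uint16_t)((sum2 + sum1) % 255);
    }
    return (uint16_t)((sum2 << 8) | sum1);
}
```

The sender and receiver each compute the checksum over the stored or transferred block and compare the results, catching corruption that the transport layer missed.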
One reason is that solar storms can affect both storage solutions and network connectivity. As an example, in a test transfer of more than 800 gigabytes over a gigabit CAT 6E network, a file comparison revealed (statistically) one bit error per 2 gigabytes of data. The TCP/IP checksum should catch such errors, and the chance of one slipping through is very small, but it is like playing the lottery while handing in a trillion tickets: you are bound to win more than once. Such errors can cause serious corruption of data sets, software bugs, and database or application crashes.
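The lottery argument can be made concrete with back-of-envelope arithmetic. The transfer size and error rate below are taken from the test described above; the 1460-byte segment size is an assumed typical TCP MSS, not a measured value.

```c
/* Bit errors found by file comparison, i.e. errors that survived the
   transport-layer checksum: transfer size divided by GB per error. */
double observed_bit_errors(double transferred_gb, double gb_per_error) {
    return transferred_gb / gb_per_error;
}

/* Rough number of TCP segments needed to move the whole transfer,
   assuming a typical maximum segment size in bytes. */
double tcp_segments_sent(double transferred_gb, double mss_bytes) {
    return transferred_gb * 1e9 / mss_bytes;
}
```

With these inputs, an 800 GB transfer breaks into roughly 5.5 * 10^8 segments, and about 400 corrupted bits made it through end to end. Even a vanishingly small per-segment miss rate, multiplied by that many "tickets", yields multiple wins.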
Most important of all is the need for standards that lower the cost of servicing hardware and software, as well as the initial cost of hardware and software solutions, for the large-scale computing industry.