The Machine: HP’s Big Dreams for Big Data Generation
HP Labs is developing a revolutionary new system it calls The Machine. Its computing architecture will use electrons for computation, ions for storage, and photons for communication.
The purpose of The Machine is to store huge amounts of data and retrieve it faster than any technology currently available. The project draws on fiber optics, silicon photonics, a single pool of memory (serving as both RAM and storage), and a Non-Volatile Memory (NVM)-aware operating system. Once completed, HP claims, The Machine will be the size of a refrigerator yet able to do the work of an entire data center.
Computers today use small pools of memory to store data temporarily and retrieve it when required. The process is fast as long as the amount of data is small. Large volumes of data, however, must be stored on hard disks, and retrieving them takes the system much longer. HP aims to change this approach with The Machine.
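The gap described above can be made concrete with a back-of-the-envelope calculation. The latency figures below are illustrative textbook orders of magnitude, not measurements of any real system:

```python
# Rough, order-of-magnitude latencies (nanoseconds) for a single access.
# Illustrative figures only: ~100 ns for a main-memory reference versus
# ~10 ms for a spinning-disk seek.
DRAM_ACCESS_NS = 100
DISK_SEEK_NS = 10_000_000

def total_access_time_ms(n_records: int, latency_ns: float) -> float:
    """Time to fetch n_records one at a time, in milliseconds."""
    return n_records * latency_ns / 1e6

n = 10_000
print(total_access_time_ms(n, DRAM_ACCESS_NS))  # 1.0 ms from RAM
print(total_access_time_ms(n, DISK_SEEK_NS))    # 100000.0 ms (100 s) from disk
```

The five-orders-of-magnitude difference between the two results is exactly the wall that large datasets hit once they spill out of RAM onto disk.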
Technology for The Machine
The Machine will use a new type of memory chip that is incredibly fast, can hold large amounts of data, and retains information even when power is lost. It can act as both the system's storage and its main memory, and it is manufactured using memristor technology. The Machine will also use silicon photonics, which relies on lasers instead of traditional copper wires to move data around inside a computer.
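The idea of one byte-addressable pool that is both memory and storage can be approximated today with a memory-mapped file. The sketch below is only an analogy, not HP's API; the file name and sizes are made up, and an ordinary file on disk stands in for the non-volatile memristor pool:

```python
import mmap
import os

POOL_PATH = "universal_memory.bin"  # hypothetical name, stands in for NVM

def open_pool(size: int) -> mmap.mmap:
    """Map a file into the address space so it can be used like memory."""
    fd = os.open(POOL_PATH, os.O_CREAT | os.O_RDWR)
    os.ftruncate(fd, size)
    return mmap.mmap(fd, size)

pool = open_pool(4096)
pool[0:5] = b"hello"   # write it like memory...
pool.flush()           # ...persist it like storage
pool.close()

pool2 = open_pool(4096)
print(pool2[0:5])      # the data survives reopening, as in NVM
```

With true universal memory the `flush()` step and the RAM/disk distinction would disappear entirely, which is what makes an NVM-aware operating system necessary.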
The operating system will be a custom Linux-based OS. It will have to run on each SoC (system on a chip) and work with the universal memory.
The need for a system like The Machine
To understand why a system like The Machine is needed, you have to understand Big Data. Big Data is a term for collections of data (datasets) so large and complex that today's data-processing devices and applications are inadequate to store and process them. If you are wondering whether it is even possible to generate such huge amounts of data, consider the following statistics from Data Never Sleeps 2.0 on what happens online every minute.
- Seventy-two hours of video content uploaded to YouTube.
- Fifty thousand apps downloaded by Apple users.
- Two hundred million email messages sent.
- Two hundred thousand photos posted to Instagram.
- Three hundred thousand tweets.
- Two and a half million Facebook shares.
- 23,300 hours of Skype connections.
- 347,222 photos shared on WhatsApp.
According to IBM, two and a half billion gigabytes of data were generated every day in 2012. It is almost impossible to calculate how much data there is now. The cause for concern is that about 75 percent of that data is unstructured, and unstructured data can be hard to access. Google, which helps find online data and content, processes twenty petabytes of data per day. For perspective, the combined written works created by mankind since the beginning of recorded history amount to roughly fifty petabytes.
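A quick sanity check puts the figures above on a common scale (using decimal units, where 1 GB = 10^9 bytes and 1 PB = 10^15 bytes):

```python
# Back-of-the-envelope conversion of the scale figures quoted above.
GB = 10**9   # gigabyte, decimal units
PB = 10**15  # petabyte
EB = 10**18  # exabyte

daily_2012 = 2.5e9 * GB            # IBM: 2.5 billion GB generated per day
print(daily_2012 / EB)             # 2.5 exabytes per day

google_daily = 20 * PB             # Google: 20 PB processed per day
all_written_works = 50 * PB        # estimate cited in the text
print(all_written_works / google_daily)  # 2.5 days to process the equivalent
```

In other words, Google alone processes the equivalent of all of recorded human writing roughly every two and a half days.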
Advantages of The Machine
The Machine will be able to resolve complex everyday problems, such as assigning an airport gate to every plane that lands, regardless of when it lands, helping passengers get off the airplane quickly and efficiently. Current computer systems cannot recognize a vacant gate when a plane lands earlier than scheduled. If you are a frequent flier, you will be able to relate to this situation.
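The gate-assignment problem mentioned above is a classic interval-partitioning task, and a greedy algorithm sketches the core idea. The flight names and times here are invented for illustration; a real airport system would of course handle far more constraints:

```python
import heapq

def assign_gates(flights):
    """flights: list of (name, arrival, departure); returns {name: gate_id}.

    Greedy interval partitioning: process planes in arrival order and
    reuse whichever gate frees up earliest, opening a new gate only
    when none is vacant.
    """
    gates_free_at = []   # min-heap of (time the gate becomes free, gate id)
    next_gate = 0
    assignment = {}
    for name, arrival, departure in sorted(flights, key=lambda f: f[1]):
        if gates_free_at and gates_free_at[0][0] <= arrival:
            _, gate = heapq.heappop(gates_free_at)  # reuse a vacant gate
        else:
            gate = next_gate                        # open a new gate
            next_gate += 1
        assignment[name] = gate
        heapq.heappush(gates_free_at, (departure, gate))
    return assignment

flights = [("HP101", 9, 11), ("HP202", 10, 12), ("HP303", 11, 13)]
print(assign_gates(flights))  # HP303 reuses HP101's gate: gates 0, 1, 0
```

The hard part in practice is not this small loop but re-running it continuously over live data for every early or late arrival, which is where the memory bandwidth promised by The Machine would matter.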
The Machine will allow processors to access large volumes of data and retain them even when power is lost. It will also help companies run their operations smoothly and efficiently by improving computing power, and it will help them improve their security as well.
The Machine is still a project in development, and for now an ambition: the various technologies it needs, including the operating system, are still being built. Critics question the feasibility of such a system. However, HP Labs CTO and director Martin Fink assures that The Machine is real and on its way.
He announced during the HP Discover conference that a prototype would be ready by next year for partners to develop software against. It will have 2,500 CPU cores and 320 TB of main memory. However, the prototype will use only DRAM memory chips, since the advanced memristor technology is still under development.
If HP is able to realize The Machine, it will definitely change computing in a big way. If not, companies will have to find another way to tackle Big Data.