October 31, 2019
The explosive growth of the tech industry in recent years owes much to a principle that governed its inception: Moore’s Law. The dramatic increase in power and decrease in size of the microchip was predicted by Gordon E. Moore in 1965, when he observed that the number of transistors on a chip was doubling roughly every year; in 1975 he revised the pace to a doubling every two years. As computing devices became more affordable, microchip efficiency grew exponentially, reaching previously unseen levels of data storage and processing power.
Today, enormous amounts of data are produced every millisecond, reaching unprecedented levels that demand larger units of measurement, such as the petabyte and even the exabyte, along with more advanced storage methods. Just as the microchip’s efficiency has increased, so has the accumulation of Big Data, posing a challenge to Moore’s Law; experts predict the law will become obsolete as we enter a new decade.
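For a rough sense of the scale these units represent, here is a quick sketch (illustrative arithmetic only, using decimal SI prefixes; the figures are not from the article):

```python
# Rough scale of the storage units mentioned above (decimal SI prefixes).
GB = 10**9   # gigabyte, in bytes
PB = 10**15  # petabyte
EB = 10**18  # exabyte

print(PB // GB)  # a petabyte holds one million gigabytes -> 1000000
print(EB // PB)  # an exabyte holds one thousand petabytes -> 1000
```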
Over ten years ago, Moore himself said of his prediction, “(It) can’t continue forever. It is the nature of exponential functions...they eventually hit a wall.”
The miniaturization of the microchip will undoubtedly reach a breaking point, making storage options such as the cloud all the more essential to the ever-expanding, data-driven society of the immediate future. As much as 90% of the world’s data has been generated in the last two years alone, signifying a huge surge in both data production and collection. Most companies, however, analyze only about 12% of the data they store. With such a vast amount of data being produced and managed, making Big Data operationally useful for organizations requires fine-tuned, innovative data analytics, and cloud orchestration is already taking on a critical role in data management.
“The cloud is a significant advantage when done correctly; it requires a large amount of expertise that most companies don't have,” Fluency Security CEO Chris Jordan said. “Even Amazon doesn't fully comprehend cloud security.”
Fully realizing cloud capabilities will take time and will be shaped by privacy regulations. Older data management systems, such as SIEMs, do not retain data for the periods required by new consumer privacy laws. Efficiently storing copious amounts of data is therefore not simply a matter of convenience but a legal requirement that governments across the globe are now enforcing. EU organizations recently passed the one-year mark since the GDPR took effect, with most EU-based companies having difficulty meeting compliance standards. Closer to home, the United States is following suit as California prepares to roll out the CCPA in 2020.
Jordan says traditional log management systems like SIEMs, which were initially built for on-prem servers, can be adapted to the cloud, but doing so is essentially working backwards. Starting with the cloud’s structure and building a near-real-time tracking system within it creates a more efficient pathway for high-capacity network flow management, best described as a “data river.” As an example, Fluency’s largest customer processes eight to twelve billion messages a day, with spikes of half a million events per second. By efficiently managing the flow of this data river, all information can be easily accessed and leveraged despite its enormous volume.
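The throughput figures above work out roughly as follows; the per-second averages are back-of-the-envelope numbers derived here, not stated in the article:

```python
# Back-of-the-envelope rates for the data-river example above.
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400

low, high = 8e9, 12e9                # 8-12 billion messages/day (from the article)
avg_low = low / SECONDS_PER_DAY      # sustained average at the low end
avg_high = high / SECONDS_PER_DAY    # sustained average at the high end
spike = 500_000                      # peak events/sec cited in the article

print(round(avg_low), round(avg_high))              # 92593 138889
print(f"spike is ~{spike / avg_high:.1f}x average")  # spike is ~3.6x average
```

Even at the high-end daily average of roughly 139,000 events per second, the cited spikes are several times larger, which is why sustained capacity alone is not enough for this kind of workload.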
“The conversation around database advantages -- like storability and ease of use -- still has to overcome the learning curve,” Jordan said. “A businessperson has to balance the cost advantages of the cloud vs. the cost advantages of on-prem. It's an evolution, not a revolution.”