I recently heard the phrase, “One second to a human is fine – to a machine, it’s an eternity.” It made me reflect on the profound importance of data speed, not just from a philosophical standpoint but from a practical one. Users don’t much care how far data has to travel, just that it gets there fast. In event processing, data must be ingested, processed and analyzed at a rate that is almost imperceptible. Data speed also affects data quality.
Data comes from everywhere. We’re already living in a new age of data decentralization, powered by next-gen devices and technologies – 5G, computer vision, IoT and AI/ML – not to mention current geopolitical trends around data privacy. The amount of data generated is enormous, roughly 90% of it noise, but all of it still has to be analyzed. The data matters, it’s geo-distributed, and we must make sense of it.
For businesses to gain valuable insights into their data, they must move beyond the cloud-native approach and embrace an edge-native one. In this article, I’ll discuss the limitations of the centralized cloud and three reasons it is failing data-driven businesses.
The downside of centralized cloud
In the context of enterprises, data has to meet three criteria: it must be fast, actionable and available. For more and more enterprises operating at a global scale, the centralized cloud cannot meet these demands cost-effectively – bringing us to our first reason.
It’s too damn expensive
The cloud was designed to collect all the data in one place so that we …