How Will Data Centers Handle All That Data From Autonomous Cars?
The rise of the autonomous car will be one of the most transformative and disruptive events in my lifetime. The world’s auto manufacturers are charging full steam ahead, with investment in autonomous technology and manufacturing growing each year. Just last week, Volvo forecast that by 2025, more than a third of the company’s sales will be autonomous cars.
This is a huge force in the IT landscape. You see, each autonomous car is essentially its own highly connected supercomputer, requiring previously unimaginable amounts of data, network connectivity and IT orchestration. For data center and technology companies the opportunity is unprecedented, as are the challenges.
Imagine this. Each car has over 3,000 data points. Each data point has to report “what’s going on.” Some of those “reports” require a real-time response – “Hey, am I getting close to the delivery truck in front of me?” – and others are just informational updates – “I need my oil changed soon.” It’s just a matter of time before our cars are making doctor’s appointments for us and reminding us to pick up milk at the grocery store!
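The split between safety-critical and informational reports can be sketched in a few lines of code. This is a purely hypothetical illustration – the class names, sensors, and the 100 ms threshold are my assumptions, not an actual automotive API:

```python
# Hypothetical sketch: classifying vehicle telemetry by required response time.
# Names and the 100 ms threshold are illustrative assumptions, not a real spec.
from dataclasses import dataclass
from enum import Enum


class Priority(Enum):
    REAL_TIME = "real-time"          # e.g. proximity to the truck ahead
    INFORMATIONAL = "informational"  # e.g. maintenance reminders


@dataclass
class Report:
    sensor: str
    value: float
    deadline_ms: float  # how quickly the system must react to this report

    @property
    def priority(self) -> Priority:
        # Assumption: anything that must be handled within ~100 ms is
        # safety-critical and processed on the car itself; everything else
        # can be batched and shipped to the data center.
        if self.deadline_ms <= 100:
            return Priority.REAL_TIME
        return Priority.INFORMATIONAL


print(Report("forward_radar_distance_m", 12.5, 20).priority)      # real-time
print(Report("oil_life_pct", 18.0, 86_400_000).priority)          # informational
```

The design point is simply that the two classes of reports take very different paths: one never leaves the car, the other is exactly the flood the data center has to absorb.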
So exactly how much data is in these reports? Try 4TB PER DAY for a typical driver. Yep, I said a DAY! Add to that, drivers will now be browsing the web, streaming Netflix, shopping, planning vacations and so much more now that the car has the driving part covered! In America alone there are over 265 million drivers. If even 10 percent of them adopt autonomous vehicles, we’re looking at zettabytes of data. How big is a zettabyte (ZB)? It’s this big: 1,000,000,000,000,000,000,000 bytes, or 1 trillion GB.
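A quick back-of-the-envelope check of those numbers, using the decimal units above (1 ZB = 10²¹ bytes):

```python
# Back-of-the-envelope check of the figures in the text (decimal SI units).
TB = 10**12   # bytes in a terabyte
ZB = 10**21   # bytes in a zettabyte

drivers = 265_000_000      # U.S. drivers, per the text
adoption = 0.10            # assume 10 percent adopt autonomous vehicles
per_car_per_day = 4 * TB   # 4 TB generated per car per day

daily_bytes = drivers * adoption * per_car_per_day
print(f"{daily_bytes / ZB:.3f} ZB per day")         # ~0.106 ZB per day
print(f"{daily_bytes * 365 / ZB:.1f} ZB per year")  # ~38.7 ZB per year
```

So at 10 percent adoption, U.S. autonomous cars alone would cross the zettabyte mark in under two weeks.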
I’ll ask the obvious question – how on earth are the data center and IT industries going to handle this? The answer: software-defined data centers (SDDCs). These facilities are completely digitized, creating new network and connectivity strategies. A fundamental aspect of an SDDC is IT orchestration, or the ability to create a data lake with billions of raw data points and then transform that data into organized, aggregated and meaningful data.
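The "raw data lake → organized, aggregated, meaningful data" step can be sketched in miniature. This is a minimal illustration with made-up records – real SDDC orchestration platforms operate at vastly larger scale with far richer pipelines:

```python
# Minimal sketch of turning raw telemetry points into aggregated summaries.
# The records and field names are made up for illustration.
from collections import defaultdict
from statistics import mean

raw_points = [
    {"car": "A", "sensor": "speed_kph", "value": 61.0},
    {"car": "A", "sensor": "speed_kph", "value": 63.0},
    {"car": "B", "sensor": "speed_kph", "value": 48.0},
]


def aggregate(points):
    """Group raw telemetry by (car, sensor) and summarize each group."""
    groups = defaultdict(list)
    for p in points:
        groups[(p["car"], p["sensor"])].append(p["value"])
    return {
        key: {"count": len(vals), "mean": mean(vals)}
        for key, vals in groups.items()
    }


print(aggregate(raw_points))
# e.g. {('A', 'speed_kph'): {'count': 2, 'mean': 62.0}, ...}
```

The point of orchestration is that this grouping and summarizing happens continuously, across billions of points, close to where the data lands.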
At QTS we have been busy building a home for all those ZBs. We improved our processes to build data centers bigger, faster and in key locations to handle the anticipated growth. We reimagined, re-architected and recreated an entire connectivity ecosystem that delivers ultra-low-latency performance in a cloud- and carrier-rich environment. We’ve designed the industry’s preeminent IT orchestration platform, the QTS Service Delivery Platform.
It’s going to be big, almost unimaginable – certainly unprecedented. Data, data, data. If you hear nothing else, hear this: there will be more data moving, transforming and living in our world than ever before. While the world is reimagining the role of data in our day-to-day lives, QTS is redefining the data center to offer next-generation infrastructure and solutions. Contact us to learn more about our software-defined data centers and Service Delivery Platform.