As a data center solutions guy, I attend many industry conferences and forums, so I’m always curious about new ideas and how customers are using technology to solve new business problems.
The other day, I stumbled across a new Gartner Research report entitled “Deliver Data Center Modernization Using Three Cloud-Complementary Approaches”.
I had been wondering how all of this migration to the cloud was evolving. Now that most data center folks have, in one way or another, ported some of their applications to the cloud, I wanted to find out how these “hybrid” on-premises/cloud provider data center environments were working out.
Part of the report introduced me to a new concept that Gartner calls “colocation network hubs” (what many of us refer to as “regional edge facilities”).
Regardless of the name, the problems are the same: latency, bandwidth, and security. Until now, I had always pictured colocation as a large warehouse for IT equipment, not a platform that would support smaller-scale implementations.
Most analysts agree that a number of technology drivers (exponential increases in digital traffic, a heavy demand for high bandwidth content, the growing Internet of Things trend) are converging to change the nature of how data centers are operated.
According to the Gartner report, regional edge facilities are one approach designed to handle the access and security issues surrounding the large volume of data being generated at the network edge.
Regional edge facilities host the hardware and software needed to process local, high-bandwidth applications. A typical regional edge facility offers a secure, dedicated environment where end users house local data processing, data storage, and networking equipment.
These facilities are designed to maintain high availability for multiple customers while offering maximum network speed and minimal latency.
Regional edge facilities can offer these services because they reside within major internet aggregation centers, which experience low (less than 0.001%) internet service outage rates.
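To put that outage figure in perspective, here is a quick back-of-the-envelope calculation (my own, not from the Gartner report) that converts a 0.001% outage rate into expected downtime per year:

```python
# Convert an internet service outage rate (as a percentage) into
# expected downtime per year. The 0.001% figure comes from the post;
# the conversion itself is standard availability arithmetic.

MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes in a non-leap year

def downtime_minutes_per_year(outage_percent: float) -> float:
    """Expected downtime, in minutes per year, for a given outage percentage."""
    return MINUTES_PER_YEAR * (outage_percent / 100.0)

if __name__ == "__main__":
    downtime = downtime_minutes_per_year(0.001)
    print(f"A 0.001% outage rate works out to about {downtime:.2f} minutes of downtime per year")
```

In other words, an outage rate below 0.001% translates to roughly five minutes of unavailability per year, which is why residing inside a major internet aggregation center matters so much for these facilities.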
Now that edge computing has become a necessity for supporting heavy bandwidth use and assuring low latency, regional edge facilities meet the challenge by relieving network bottlenecks and supplementing bandwidth for data centers that may be geographically distant from cloud providers.
The Gartner report goes on to provide several specific examples of where a regional edge facility approach may make good business sense.
In essence, from what I could ascertain, these facilities handle authentication requirements well and minimize latency. They appear to be a good solution for stabilizing the variable performance of some cloud connections.
What to do if you want to learn more
If regional edge facilities are something you decide to look into, I recommend also researching how you would power and cool this type of distributed environment. You’ll want to make sure your networking and server gear is UPS-protected.
Depending on how much equipment you plan to house in the facility, you might look into a small prefabricated unit that packages all the physical infrastructure equipment you need to make it work.
Throw in some remote monitoring and you should be good to go.
In my opinion, looking into regional edge facilities might serve as a good insurance policy if your edge computing applications are growing.
Article by Steven Carlini Schneider Electric Data Center Blog Network