Building Big Infrastructure in an age of Big Data

September 1, 2016  |  Written by Elisa Wong, Regional Sales Director, Asia Pacific

Is your data centre ready to help you disrupt with big data and analytics?

Many people say data is the new oil – a valuable commodity that flows through organisations, powering the engines of new insights and actions. It’s a fair comparison – after all, when properly analysed, big data can deliver mission-critical business insights, open new markets and create competitive advantages.

Even better, as the big data and analytics market matures, we’re seeing the rise of frameworks like Hadoop for large-scale batch processing and Spark for streaming analytics in near real time. This means that operating as an ‘as-it-happens’ business – one that can sense and respond immediately to a changing environment – is fast becoming a reality for some enterprises.
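
To make the ‘as-it-happens’ idea concrete, here is a minimal sketch of the kind of streaming job Spark supports. It uses PySpark’s Structured Streaming; the socket feed on localhost:9999 and the one-minute window are illustrative assumptions, not details from any particular deployment:

```python
# Minimal PySpark Structured Streaming sketch: count events per minute as
# they arrive, illustrating 'as-it-happens' analytics. Assumes Spark 2.x+
# and a line-delimited event feed on localhost:9999 (both illustrative).
from pyspark.sql import SparkSession
from pyspark.sql.functions import current_timestamp, window

spark = SparkSession.builder.appName("as-it-happens-demo").getOrCreate()

# Read an unbounded stream of text lines from a TCP socket.
events = (spark.readStream
          .format("socket")
          .option("host", "localhost")
          .option("port", 9999)
          .load())

# Stamp each event on arrival and count events per one-minute window.
counts = (events
          .withColumn("arrived", current_timestamp())
          .groupBy(window("arrived", "1 minute"))
          .count())

# Continuously print the updated counts to the console as data flows in.
query = (counts.writeStream
         .outputMode("complete")
         .format("console")
         .start())
query.awaitTermination()
```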

The problem is that for big data to deliver the kind of intelligence that dramatically changes your organisation’s decision-making, you need the right tools to refine it and put it to use. A core element here is networking: neither the data nor the insights it yields are of much use if they can’t reach their destinations quickly and securely – 100 percent of the time.

So if your company’s looking to drive performance more effectively with digital insights, you need to start by asking some tough questions about your data centre infrastructure. Just how effectively can it support your big data needs?

A focus on software-defined networks

The first question to ask is whether your data centre network has the right processing software to provide users with the agile, scalable, end-to-end framework to efficiently mine big data. You need infrastructure that can be provisioned through automation, centrally managed through programmability and orchestrated through platforms that reach across the entire system in the service of large workloads.
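
As an illustration of what ‘provisioned through automation’ can look like, here is a hypothetical sketch of a script asking an SDN controller’s REST API to stand up a network segment spanning two sites. The controller URL, endpoint, payload fields and site names are invented placeholders, not any particular vendor’s API:

```python
# Hypothetical sketch of programmable provisioning: request an isolated
# network segment between two data centres from an SDN controller.
# Every name below (URL, endpoint, fields, site IDs) is a placeholder.
import requests

CONTROLLER = "https://sdn-controller.example.com/api/v1"  # placeholder URL
TOKEN = "REPLACE_ME"  # placeholder credential

payload = {
    "name": "bigdata-analytics-segment",
    "sites": ["dc-east-1", "dc-west-2"],  # illustrative site IDs
    "bandwidth_mbps": 10000,
    "isolation": "vxlan",
}

# Submit the provisioning request and fail loudly on any HTTP error.
resp = requests.post(
    f"{CONTROLLER}/segments",
    json=payload,
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=30,
)
resp.raise_for_status()
print("Provisioned segment:", resp.json().get("id"))
```

The point is not the specific API but the model: network capacity becomes something your tooling requests on demand, rather than something an engineer configures by hand.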

The data centres that do this best are software-defined, and they orchestrate networks through cloud platforms. Software-defined networking technology makes it possible to treat multiple data centres and the network between them as a single system, so applications can be accessed and run efficiently – including those that draw on geographically distributed databases and servers through public and private clouds.

With such a system in place, a company’s IT team can migrate, deploy and control big data-processing applications in the cloud, within the desired context – and evolve them there, using the cloud as an innovation platform.

This approach also eliminates traditional IT silos and forces businesses to rethink server, network and storage administration in a more holistic way.

A focus on connectivity

Harnessing the power of big data also requires partnership. It requires the ability to swiftly and efficiently connect an ecosystem of service providers and peers inside a secure global platform of networked data centres.

So the second key question you need to ask your data centre provider is whether it can integrate multiple applications, data types and data sources through specialised high-speed circuits, quickly and securely.

You need the kind of connectivity that is as flexible and agile as the dynamic information assets you’re aiming to organise and analyse. A data centre that can offer a hybrid solution – involving a number of cloud and service providers within a neutral, well-connected location – is likely to be critical for big data mining success.

A focus on a fast and seamless experience

Every day, the number of computers, tablets, sensors, wearable devices and other smart endpoints that make up the Internet of Things grows at an extraordinary rate. Of course, managing the sheer quantity and complexity of sources, locations and data sets associated with the proliferation of these devices creates many data centre challenges. For example, as the number of endpoints increases, so do device management costs. And where applications integrate multiple databases and data sources, network latency can become a massive problem, slowing information flow to a frustrating near-crawl.
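
A rough back-of-the-envelope illustration of why this matters: when an application consults several distributed sources in sequence, their round-trip times add up, while querying them in parallel is bounded by the slowest hop. The sources and latencies below are made-up figures for illustration only:

```python
# Illustration of how network latency compounds across distributed sources.
# Source names and per-hop round-trip times are invented for this example.
hops_ms = {
    "orders-db (same campus)": 2,
    "customer-db (regional)": 35,
    "clickstream-store (overseas)": 180,
}

sequential_total = sum(hops_ms.values())  # one query after another
parallel_bound = max(hops_ms.values())    # all queries issued at once

for source, ms in hops_ms.items():
    print(f"{source}: {ms} ms round trip")

print(f"Sequential queries: ~{sequential_total} ms per request")
print(f"Parallel queries:   ~{parallel_bound} ms (bounded by the slowest hop)")
```

Shrinking those per-hop numbers – by co-locating applications and the data sources they depend on – does far more for responsiveness than parallelism alone.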

The third key question to ask your data centre provider, then, is how it would deal with such challenges. Leading data centre providers take what might be termed a data-centre-centric approach. Here, rather than connecting a plethora of external networks and services to your organisation – with your IT team building links to each service provider and peer – you can take advantage of a ‘connected campus’ model. With this model, your data centre brings your enterprise and applications to its network, rather than the other way around.

This kind of data-centre-centric infrastructure means you gain seamless access to the applications that will help you analyse your data. It also enhances your ability to scale storage fluidly and handle your analytics operations more efficiently.

Powering digital transformation requires innovation and a new approach to operating data centres – from the hardware and software to the supporting infrastructure and services. A hyper-connected model relying on networking that’s agile, efficient, flexible and responsive will not only help to provide a transformative customer experience, but will also ease your entry into the new digital economy.

Digital Realty’s premier data centres deliver just this. Interconnected via dark fibre that leverages the existing Digital Realty ecosystem, they combine established carrier-neutral connectivity with a wide range of telecommunications providers, service providers and business partners in an open cross-connect model. Contact me to find out more.

Elisa Wong recently joined Digital Realty as Regional Director, Colocation and Connectivity, Asia Pacific. She has more than 19 years of experience in consultancy-led sales engagements relating to high-value, complex deals across the technology domain. Before joining Digital Realty, she worked at Cable & Wireless, Verizon and Orange.