The future of data management

Ian Shearer, APAC Managing Director of Park Place Technologies, discusses how governance is moving up the agenda and the role data maintenance plays in driving competitive advantage for enterprises

In this exclusive interview, Ian Shearer shares with us his thoughts on the future of data management. The Managing Director of Operations in the APAC region for Park Place Technologies brings with him a wealth of Third Party Maintenance (TPM) experience, having joined Park Place following the acquisition of Computer Computer (Com-Com) Limited in September 2016, which he owned and ran for 24 years.

How do you see the future of data management? How has the industry evolved?

Ian: Data centre planners are looking hard at artificial intelligence and predictive maintenance. They’re taking a page from the “Industry 4.0” or industrial IoT (IIoT) playbook with techniques shown to work in factories, along oil pipelines and in other industrial settings. Why not in tech facilities too?

The trend is a departure from reactive maintenance, which responds only when problems arise. It’s also a step above scheduled maintenance, which tries to prevent downtime with regular interventions timed based on historical lifespan data.

AI and predictive maintenance use real-time data to anticipate issues before they happen. In fact, a computerised maintenance management system (CMMS) can identify potential issues and launch trouble tickets with no user intervention.
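
As a rough illustration of how that kind of automation can work, here is a minimal sketch; the thresholds, telemetry fields and ticket output below are hypothetical and not drawn from ParkView or any specific CMMS.

```python
# Minimal sketch of predictive fault detection that raises tickets automatically.
# All field names and thresholds are hypothetical, for illustration only.

DISK_REALLOCATED_SECTORS_LIMIT = 50   # example disk-health threshold
FAN_SPEED_FLOOR_RPM = 1200            # example cooling threshold

def evaluate_device(telemetry: dict) -> list:
    """Return predicted-fault descriptions for one device's telemetry."""
    issues = []
    if telemetry.get("reallocated_sectors", 0) > DISK_REALLOCATED_SECTORS_LIMIT:
        issues.append("Disk showing early signs of failure")
    if telemetry.get("fan_rpm", FAN_SPEED_FLOOR_RPM + 1) < FAN_SPEED_FLOOR_RPM:
        issues.append("Fan speed below safe operating range")
    return issues

def run_checks(devices: list) -> None:
    for device in devices:
        for issue in evaluate_device(device["telemetry"]):
            # A real CMMS would call its ticketing API here; this sketch just prints.
            print(f"TICKET [{device['id']}]: {issue}")

run_checks([{"id": "srv-01", "telemetry": {"reallocated_sectors": 73, "fan_rpm": 900}}])
```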

Machine learning is moving on from pattern recognition and traditional algorithms into more elusive “deep learning.” A key contribution of machine learning is its ability to “discover” structure within data using an iterative approach, without needing humans to begin with any theories or assumptions to test.
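
As a simple illustration of that kind of unsupervised structure discovery, the sketch below uses scikit-learn’s k-means on a handful of invented equipment readings; the data and the choice of two clusters are assumptions, not a production pipeline.

```python
# Sketch: letting an algorithm "discover" groups in unlabelled data.
# Columns: temperature (deg C), CPU utilisation (0-1); values are invented.
import numpy as np
from sklearn.cluster import KMeans

readings = np.array([
    [35.1, 0.20], [34.8, 0.22], [36.0, 0.25],   # cool, lightly loaded machines
    [58.3, 0.91], [59.1, 0.88], [60.2, 0.95],   # hot, heavily loaded machines
])

model = KMeans(n_clusters=2, n_init=10, random_state=0).fit(readings)
print(model.labels_)   # e.g. [0 0 0 1 1 1]: two groups found with no labels supplied
```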

This is a big part of why deep learning has become the latest buzzword: it takes machine learning’s greatest asset to the extreme. Relying on neural networks, deep learning identifies more complicated patterns in larger amounts of data than previously possible. It holds great promise for language translation, medical diagnosis and much more.

With the expansion of computing power, not to mention the exponential increase in available data and affordable, scalable storage, it’s increasingly possible to automatically produce models that analyse ever larger quantities of information and deliver increasingly accurate predictive results in faster and faster cycles. This will be a welcome development for businesses, because with more precise modelling comes the ability to identify profitable opportunities and risks before they materialise.

Park Place recently debuted ParkView, a remote triage service platform that enables predictive detection and identification of hardware faults that occur within a data centre.

ParkView, powered by BMC technology, revolutionises visibility into data centre infrastructure and operations by identifying and reporting hardware faults, as well as potential faults, enabling faster response and problem resolution. ParkView predicts data centre issues, then triages the fault and identifies the proper fix, allowing quick repairs to be made through Park Place’s seamless integration with hardware maintenance service plans.

Where do you see the place for the adoption of AI in all of this?

Ian: Kjell Carlsson, Ph.D., Senior Analyst at Forrester, a market research company that advises on the existing and potential impacts of technology, recently hosted a web clinic with Paul Mercina, Director of Product Management for Park Place Technologies.

Carlsson, citing Forrester Data Global Business Technographics Data and Analytics Surveys from 2016-18, said interest in implementing AI “is real”.

Survey respondents across nearly all industry verticals were most interested in operational efficiencies such as implementing greater levels of automation of internal processes, improving online and physical security and attaining greater levels of automation of customer-facing services. Expected core business outcomes include increased revenue growth, improved ability to anticipate and respond to the competitive market and better customer experience.

AI can help data centres become more energy efficient.

According to the NRDC, U.S. data centres consumed an estimated 91 billion kilowatt-hours of electricity in 2013. Moving equipment and increases in computing traffic are just two examples of changes that can affect data centre heating loads.

Over the past few years, AI has become an important tool in reducing the energy wasted by data centres. These applications help to reduce power draw, report cooling inefficiencies and analyse the health of mission-critical systems, improving efficiency while saving energy.
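
One widely used yardstick such tools monitor is Power Usage Effectiveness (PUE), the ratio of total facility power to the power delivered to IT equipment. The sketch below uses invented figures and an illustrative alert threshold.

```python
# Sketch: computing PUE and flagging a possible cooling inefficiency.
# The readings and the 1.5 alert threshold are illustrative assumptions.
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    return total_facility_kw / it_equipment_kw

reading = pue(total_facility_kw=1800.0, it_equipment_kw=1000.0)
print(f"PUE = {reading:.2f}")   # 1.80: 0.8 kW of overhead for every kW of IT load
if reading > 1.5:
    print("Cooling and overhead consumption look high; investigate airflow and setpoints")
```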

It is estimated that a stunning 30 percent of data centre capacity is idle. These systems remain plugged in and powered on, gobbling electricity. The reasons for this are many.

IT often isn’t paying for the electricity (the facilities management department is), so IT isn’t set up to monitor usage. It also frequently happens that servers were allocated long ago, staff has since turned over, and now no one really knows what certain boxes are doing anymore. It seems safest to let them hum along as they are.

Finally, IT pros are also understandably inclined to build in spare capacity to meet those “just in case” situations. But with cloud, virtualisation and other options available today, there may be less power-hungry ways to reach the same goal. It takes time to tackle each of these problems in turn, but the cost savings can add up.
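
A first pass at finding those idle-but-powered machines can be as simple as scanning utilisation history, as in the sketch below; the sample data and the 5 per cent threshold are illustrative assumptions.

```python
# Sketch: flagging servers whose recent utilisation suggests they are idle.
utilisation_history = {
    "srv-web-01": [0.42, 0.51, 0.47],
    "srv-old-07": [0.01, 0.02, 0.01],   # a candidate "zombie" server
    "srv-db-02":  [0.63, 0.70, 0.58],
}

IDLE_THRESHOLD = 0.05   # average CPU utilisation below 5% over the sample window

idle = [name for name, samples in utilisation_history.items()
        if sum(samples) / len(samples) < IDLE_THRESHOLD]
print(idle)   # ['srv-old-07']: worth investigating before decommissioning or consolidating
```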

The landscape of cybersecurity is ever changing and, just like other aspects of the business, data centres have to be prepared for cyberattacks and threats. With an abundance of information passing through data centres every day, monitoring and managing cybersecurity issues takes a significant amount of time. Some data centres try to combat these threats by blocking access and creating impassable walls. However, with the constant flux of users, blocking access isn’t enough to ensure complete security.

Now, AI applications are enabling data centres to adapt more quickly to ever-changing security requirements while also providing a more secure environment for their users without forcing strict rules. AI solutions can also assist with detecting malware and spam, analysing normal and abnormal activity patterns, identifying weak spots and strengthening protection against potential threats.
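
As an illustration of the normal-versus-abnormal idea, the sketch below applies scikit-learn’s IsolationForest to a few invented activity records; the features and figures are assumptions, not a production security stack.

```python
# Sketch: unsupervised anomaly detection over simple activity features.
# Each row is [hour of day, megabytes transferred]; values are invented.
import numpy as np
from sklearn.ensemble import IsolationForest

normal_activity = np.array([[9, 120], [10, 80], [11, 150], [14, 95], [15, 110], [16, 130]])
detector = IsolationForest(contamination=0.1, random_state=0).fit(normal_activity)

suspect = np.array([[3, 4200]])    # a 3 a.m. transfer of 4.2 GB
print(detector.predict(suspect))   # -1 marks an event as anomalous, 1 as normal
```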

Governance is increasingly moving front and centre on the agenda. As enterprises grapple with the wealth of information at their fingertips, what do you foresee as the main issues in balancing compliance with the ability to use data effectively for increased business value?

Ian: Increasingly, nation-states want oversight of their citizens’ data, and an emerging trend is to require that data on citizens be stored within their own country. Commonly referred to as data residency or data localisation regulations, these rules are becoming a major challenge to IT operations for a variety of companies doing business across borders.

Under the concept of data sovereignty, digital data is subject to the laws and legal jurisdiction of the country where it is stored. In the post-Snowden era, many governments are especially interested in guaranteeing that any snooping that goes on is done by their own agencies, not by allies or adversaries.

It may come as no surprise that Russia is one of the countries exerting the strictest data controls. In fact, the country’s data localisation rules were central to LinkedIn’s recent decision to abandon the Russian market. Dozens of countries on six continents have their own requirements. Some affect all data while others cover only specific types, such as health or genetic information.

Overall, the trend is toward more, not less, government “guidance” and rule-making. IT pros are now envisioning a future in which companies must segment out and store, process and transfer data differently based on the national origin of the individual. This may soon create expensive and difficult logistical demands and could lead to changes in some enterprises’ business models.

With data localisation laws cropping up all the time, enterprises must find ways to adapt. More and more companies are looking to vendors to stay up-to-date on the rapidly changing compliance landscape for physical storage and data transmission outside national borders.

It’s important to note that companies are not exempt from financial penalty or prosecution simply because they use cloud providers claiming to be compliant. Every company must therefore exert due diligence to understand how its cloud partners deal with data localisation rules. Businesses should develop processes to screen potential partners and ensure country-specific requirements are reflected in contractual SLAs.
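
Part of that due diligence can be automated with a simple rules check. The sketch below is hypothetical: the country-to-region mapping and the provider data are invented, and real residency rules are considerably more nuanced.

```python
# Sketch: screening a provider's storage regions against residency rules.
residency_rules = {
    "RU": {"ru-central"},             # e.g. Russian citizens' data kept in-country
    "DE": {"eu-central", "eu-west"},
}

def violations(provider_regions: dict) -> list:
    """provider_regions maps citizen country code to the region where data is stored."""
    return [country for country, region in provider_regions.items()
            if country in residency_rules and region not in residency_rules[country]]

print(violations({"RU": "us-east", "DE": "eu-central"}))   # ['RU'] needs remediation
```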

Data agility is a clear competitive advantage to enterprises – what role does data maintenance play in this and how can effective data maintenance support this well?

Ian: Data centre innovation is often a slow and measured process. In many cases, efforts to take on new data centre strategies are so complex and multifaceted that IT leaders are sure to run into a few unexpected problems along the way.

One area that should not go neglected is data centre hardware maintenance. Effective support strategies can create more fiscal flexibility, enable IT managers to depend more on legacy systems and serve as a key foundational element of any innovation plan.

Two clear examples come from Google and Facebook, as both organisations have adopted similar policies to drive better data centre operations. One change made by both organisations was to raise temperatures in the data centre, effectively reducing cooling needs and, as such, keeping power costs under control. Facebook has also found considerable success reducing expenses by using low-cost hardware, according to InformationWeek.

Both of these innovative strategies are made possible largely by hardware commoditisation.

Commoditisation has come to the data centre hardware sector as technologies have become so advanced that differentiation between systems becomes a secondary issue. The end result is an environment in which hardware is so inexpensive that it is often less expensive to replace it than it is to repair it.

However, that calculation assumes traditional support models, which are built around expensive OEM extended warranties. Third-party hardware maintenance plans can reduce support costs by 40 to 60 percent. These savings, alongside commoditisation, enable IT managers not only to buy less expensive hardware but to use it for longer.
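
To put that range in concrete terms, here is a small worked example with a hypothetical support bill; the figure is illustrative, not a quote.

```python
# Sketch: what a 40-60% reduction means for an illustrative OEM support bill.
oem_annual_support = 500_000   # hypothetical annual OEM extended-warranty spend (USD)
for saving in (0.40, 0.60):
    print(f"{saving:.0%} saving -> ${oem_annual_support * (1 - saving):,.0f} per year")
# 40% saving -> $300,000 per year; 60% saving -> $200,000 per year
```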

Hardware may be less important than ever when it comes to aligning business and IT operations, but it still represents a huge cost and barrier to innovation in the data centre. Effective hardware maintenance strategies reduce these expenses, help IT managers take on innovative strategies built around low-cost hardware and create value by eliminating the unnecessary capital expenses of an early refresh.

Ian Shearer is APAC Managing Director of Park Place Technologies. Based in Singapore, Ian has overall responsibility for the growth and development of the business across this area of the world.

A keen road cyclist, Ian is a business graduate of the University of Sunderland.