Gartner reveals factors influencing data center strategy

26 November 2013


Posted by Satvir Bhullar

Gartner has outlined eight factors it believes will influence data center strategy over the next five years.

It argues these changes will affect technical, financial and service-delivery decisions, and that the nature and function of data centers will change within this time frame.

First, the firm suggests companies adopt more efficient processor, power and storage technology. It believes the near future will see cheaper processors and flash memory drive greater use of in-memory computing, while low-energy processors will help reduce energy costs.

Some of its other predictions include changing architectures to match growing cloud and server hosting trends, alongside bigger investments in operational processes and enterprise data centers.

Building on this, Gartner expects business continuity and data recovery to become key features of continuous data operations. This matters for IT professionals because, when vital information is moved to the cloud, recovery planning becomes more important as the physical hardware itself is out of their reach.

Vice president Rakesh Kumar said: "Over the next five to ten years most organizations will need to change their approach to previous data center strategies used in the last five to seven years, as most of the world comes out of recession and the Nexus of Forces (social, mobile, cloud and information) affects technology use."

One of the smaller, more obvious predictions is the need to change and improve operating systems (OS), with the firm expecting a greater move to the Linux platform, the OS that underpins Android. It warns these migrations could cause significant disruption to existing architectures.

Indeed, recent news suggests Intel is planning to introduce a 64-bit version of Android, adding 'multi-window' options and extra functionality to help it compete with the likes of iOS and Windows in the more immediate future.

Gartner also encourages treating consolidation and rationalization as a continuous program rather than a series of one-off exercises, to make better use of the new hardware it expects. If more computing is to happen in the cloud, a more active program to consolidate data will prove useful.
