10-Point forecast: Big data set to find its users in 2016

Oracle predicts in a paper that big data will leave the development laboratories behind this year and be adopted by companies across the board. Ten trends are shaping developments in 2016.

Oracle’s Big Data Blog features a link to a PDF entitled Enterprise Big Data Predictions 2016, which forecasts changes in the use of big data. The authors present ten trends that fall into three major groups: firstly, big data is getting simpler and the number of users is growing rapidly; secondly, there is significant technical progress; and thirdly, this progress will have a greater impact on society, politics and business processes.

Big data is becoming common knowledge

Access to big data is no longer limited to specialists. New, simplified tools are allowing ordinary analysts to access the databases of Hadoop clusters and use them for their own analyses. Simple self-service is becoming the standard mode of access, which in turn leads to better predictions.

The demand for professional data analysts is rising along with the number of hypotheses to be verified by means of big data. The financial sector, for example, is trying to develop more effective algorithms for the assessment of risks.

Technical progress

The first big data users still had to build up their own big data clusters and environments, a process that could take six months or longer. Thanks to new cloud services and preconfigured appliances, the required time has been significantly reduced.

The many specialised algorithms, analyses and applications used today are being replaced by standardised access to virtualised data via SQL, REST APIs and scripting languages.
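This shift can be illustrated with a small sketch: an analyst queries the virtualised data with plain SQL instead of writing a hand-coded processing job. Here sqlite3 merely stands in for a SQL-on-Hadoop engine such as Hive or Impala; the table and column names are illustrative assumptions.

```python
import sqlite3

# Hypothetical sketch: standardised SQL access to virtualised data.
# sqlite3 is a stand-in for a SQL-on-Hadoop engine; schema is assumed.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE clickstream (user_id TEXT, page TEXT, ms INTEGER)")
conn.executemany(
    "INSERT INTO clickstream VALUES (?, ?, ?)",
    [("u1", "/home", 120), ("u1", "/cart", 340), ("u2", "/home", 95)],
)

# Plain SQL replaces a manually programmed aggregation job.
rows = conn.execute(
    "SELECT page, COUNT(*) AS visits FROM clickstream "
    "GROUP BY page ORDER BY visits DESC"
).fetchall()
print(rows)  # [('/home', 2), ('/cart', 1)]
```

The same SQL text would run unchanged against any engine that speaks the standard, which is precisely the point of the standardised interface.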

Manually programmed first-generation data processing is being rendered redundant by new and improved management tools and data-stream-oriented programming. The latter processes data in parallel, facilitates the reuse of functional operators and provides functions for statistical analysis and machine learning.
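The idea of reusable functional operators over a data stream can be sketched in a few lines. This is a minimal, hypothetical example: the operator names and the sample data are assumptions, and Python generators stand in for a real streaming engine.

```python
from statistics import mean

# Hypothetical sketch of data-stream-oriented programming: small, reusable
# functional operators composed into a lazy pipeline over a stream of values.
def valid(stream):
    # Reusable operator: drop missing readings.
    return (x for x in stream if x is not None)

def scale(stream, factor):
    # Reusable operator: rescale every value in the stream.
    return (x * factor for x in stream)

readings = [1.0, None, 2.0, 3.0, None, 4.0]

# Operators are composed rather than re-implemented per job; a statistical
# function consumes the stream at the end.
avg = mean(scale(valid(readings), 10))
print(avg)  # 25.0
```

In a real streaming framework the same composition would be distributed across workers; the programming model, not the runtime, is what this sketch shows.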

Artificial intelligence is increasingly being used for data processing, especially in the form of machine learning, automatic text recognition and graph-oriented databases.

Data lineage, i.e. the determination of the origin and development of data sets, is becoming an obligation for companies. This is the only way they can assess the results of analyses based on multiple data sets of varying quality.
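A minimal sketch of what such lineage tracking might look like: each derived data set records the transformation applied and the full set of source data sets it descends from. The record structure and data set names are illustrative assumptions, not a specific product's API.

```python
# Hypothetical sketch: recording data lineage so that a derived data set
# carries the origin and transformation history of its inputs.
def source(name):
    # A raw data set is its own origin.
    return {"name": name, "lineage": [name]}

def derive(name, transform, *parents):
    # A derived data set inherits the lineage of all of its parents.
    lineage = sorted({n for p in parents for n in p["lineage"]} | {name})
    return {"name": name, "transform": transform, "lineage": lineage}

crm = source("crm_customers")
web = source("web_events")
joined = derive("customer_events", "join on customer_id", crm, web)
print(joined["lineage"])  # ['crm_customers', 'customer_events', 'web_events']
```

With such records, an analyst can check which sources, of which quality, fed into any result before trusting it.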

IoT cloud services for big data are driving the further spread of the Internet of Things. These services simplify the collection and analysis of sensor data and lead to the development of new products that perform actions automatically on the basis of data analysis.
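The "act automatically on analysed sensor data" pattern can be sketched as a simple rule evaluated over a batch of readings. Sensor names, the threshold and the triggered action are all illustrative assumptions here.

```python
# Hypothetical sketch: an IoT service evaluates sensor readings against a
# rule and triggers an action automatically when the rule fires.
THRESHOLD_C = 75.0  # assumed temperature limit

def check(readings, act):
    triggered = []
    for sensor, temp in readings:
        if temp > THRESHOLD_C:
            act(sensor)          # e.g. open a valve or page the on-call team
            triggered.append(sensor)
    return triggered

alerts = []
triggered = check(
    [("pump-1", 71.2), ("pump-2", 82.5), ("pump-3", 68.9)],
    alerts.append,  # stand-in for the real automated action
)
print(triggered)  # ['pump-2']
```

A cloud IoT service wraps exactly this loop: ingest readings at scale, evaluate rules, invoke actions, with no operator in between.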

Impacts on politics and society

If governments can easily determine from which country specific data originates, it is easier for them to enforce national usage regulations. International companies are therefore increasingly being forced to adopt a hybrid cloud strategy and to build up regional data centres and offerings alongside their globally available services, in order to meet the respective compliance rules.

Privacy and data protection are coming into focus: rising awareness of how data is collected, shared, stored and stolen is leading to increased demands for regulation of access to personal information. Companies are developing classification systems with predefined policies for the access, transfer and protection of data sets. Increasingly sophisticated hacker techniques are also forcing companies to tighten security measures and to continuously monitor access to and use of their data.
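Such a classification system with predefined access policies might look like the following sketch. The classification labels, roles and policy table are invented for illustration; the access log models the continuous monitoring the article mentions.

```python
# Hypothetical sketch: data-set classifications mapped to predefined
# access policies, with every decision logged for monitoring.
POLICIES = {
    "public":   {"analyst", "engineer", "auditor"},
    "internal": {"analyst", "engineer"},
    "personal": {"auditor"},  # personal data: restricted and audited
}

ACCESS_LOG = []

def may_access(role, classification):
    allowed = role in POLICIES.get(classification, set())
    ACCESS_LOG.append((role, classification, allowed))  # continuous monitoring
    return allowed

print(may_access("analyst", "internal"))  # True
print(may_access("analyst", "personal"))  # False
```

The point is that the policy is declared once per classification and enforced uniformly, rather than decided ad hoc per data set.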
