Business intelligence and data play an increasingly important role in our lives, and organizations feel this first-hand. The smart use of data separates the winners from the losers. Our trends for 2020 are therefore still dominated by (the use of) data, but above all by the responsible use of data.
Data-based working becomes the norm
Working critically with data brings passion and pleasure back to the workplace. Connect data with continuous improvement and PDCA (Plan-Do-Check-Act), and your organization will flourish again. A true data culture is the ultimate goal every organization should strive for. In this scenario, all decisions are supported by relevant data that is collected, stored, and analyzed. People are freed from the hassle, the indecision, and the tyranny of managers and employees who loudly proclaim their opinions and steer by gut feeling.
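The PDCA cycle mentioned above can be sketched as a simple data-driven loop: plan a target, carry out a change, check the measured result against the target, and act on what you learn. A minimal illustration, in which the metric, target, and improvement step are all hypothetical:

```python
# Minimal sketch of a data-driven PDCA (Plan-Do-Check-Act) loop.
# The metric, target, and improvement step are hypothetical.

def pdca(metric, target, apply_change, max_cycles=10):
    """Repeat Plan-Do-Check-Act until the measured metric
    reaches the planned target, or give up after max_cycles."""
    for cycle in range(1, max_cycles + 1):
        metric = apply_change(metric)  # Do: carry out the planned change
        if metric >= target:           # Check: compare the data to the plan
            return cycle               # Act: standardize and stop
        # Act: otherwise keep the lesson and run the next cycle
    return None

# Example: each cycle raises a delivery-reliability score by 5 points.
cycles_needed = pdca(metric=70, target=90, apply_change=lambda m: m + 5)
```

The point of the sketch is the Check step: the decision to continue or stop is made against measured data, not against someone's gut feeling.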
All employees understand their role in the organization's processes and strategy. A data culture is the next step in the evolution of organizations after agile working. However, many companies still have a long way to go before they can embrace a data culture. Structure, processes, and behavior will all have to change, and it starts with the people.
Data literacy is an essential core competence
Before data-based working can become the norm, most people will first have to learn a lot about data. Gartner speaks of an “extreme backlog” and predicts that by 2020, 80% of organizations will have set up programs to promote data literacy in their ranks. This backlog hinders the digital progress of many organizations, according to the researchers: a quarter of managers are annoyed by a “lack of digital skills”, and a third find that the quality of work suffers from that lack.
However, there is of course always the danger that people overestimate themselves and underestimate others when you ask them: the famous “Dunning-Kruger effect”. For example, only 8% of managers think that they themselves are insufficiently digitally skilled. The fact remains that, on average, there is still a lot of work to be done in the field of data literacy. A large number of managers also say they have little insight into the data literacy of their colleagues. But here too progress is being made: data literacy can now be measured more accurately, and can therefore also be used as a KPI. Growing data literacy will be a challenge for every organization.
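Because literacy can be measured, it can be tracked like any other KPI. A minimal sketch, assuming a hypothetical assessment that scores each employee from 0 to 100 and an equally hypothetical pass mark:

```python
# Minimal sketch of tracking data literacy as a KPI.
# The scores and the pass mark below are hypothetical.
from statistics import mean

PASS_MARK = 60  # assumed minimum score to count as "data literate"

def literacy_kpi(scores, pass_mark=PASS_MARK):
    """Return the average assessment score and the share of
    employees who meet the pass mark."""
    passed = sum(1 for s in scores if s >= pass_mark)
    return mean(scores), passed / len(scores)

team_scores = [45, 70, 80, 55, 90, 65]
avg_score, literate_share = literacy_kpi(team_scores)
```

Tracked over time, both numbers show whether a literacy program is actually moving the organization forward.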
Predictive analytics becomes mainstream
The predictive analytics market is expected to be worth $12.41 billion by 2022, roughly 2.7 times its 2017 value, a compound annual growth rate of 22.1%. The rise of AI and machine learning is responsible for much of this growth. The majority of it comes from the Asia-Pacific region, where rapid economic growth is expected to continue.
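As a quick sanity check, the two growth figures quoted above are mutually consistent: compounding 22.1% per year over the five years from 2017 to 2022 multiplies the market by roughly 2.7. The dollar figure comes from the text; the calculation below only verifies the arithmetic:

```python
# Back-of-the-envelope check of the quoted market figures.
cagr = 0.221        # compound annual growth rate from the text
years = 5           # 2017 -> 2022
value_2022 = 12.41  # market size in 2022, billions of dollars

growth_factor = (1 + cagr) ** years   # roughly 2.7x over five years
implied_2017 = value_2022 / growth_factor
```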
Organizations develop predictive maintenance models by equipping machines with sensors. These sensors can detect potential problems early, so that action can be taken before more expensive maintenance is needed.
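The idea can be sketched with a very simple early-warning rule: average the most recent sensor readings and flag the machine when that average crosses a threshold. Everything below (the readings, the vibration limit, the window size) is a hypothetical illustration; real predictive maintenance models are usually trained on historical failure data:

```python
# Minimal sketch of sensor-based early-warning logic for predictive
# maintenance. Readings, threshold, and window size are hypothetical.
from statistics import mean

VIBRATION_LIMIT = 7.0  # assumed alert threshold (mm/s RMS)
WINDOW = 3             # number of recent readings to average

def needs_inspection(readings, limit=VIBRATION_LIMIT, window=WINDOW):
    """Flag a machine when the rolling average of its most recent
    sensor readings exceeds the alert threshold."""
    if len(readings) < window:
        return False
    return mean(readings[-window:]) > limit

# A slowly rising vibration level trips the alert before failure.
healthy_machine = [2.1, 2.3, 2.0, 2.4]
wearing_machine = [2.1, 3.5, 6.8, 7.4, 8.1]
```

Averaging over a window rather than reacting to single readings is what makes the rule robust against one-off sensor spikes.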
AI makes analytics more human
The emergence of algorithms, machine learning, and AI makes many people (not entirely without reason) somewhat nervous. Will we lose our jobs to the machines? Will we soon be ruled by cold algorithms? Nothing could be further from the truth. AI is still programmed by people. And according to Gartner, AI will deliver net job growth in 2020: 2.3 million new jobs will be created, against 1.8 million jobs that become redundant.
There is still a huge gap between the amount of data that is generated and the capacity of people to process that data and act on it. And that is precisely what people are good at: analyzing complex problems in context, employing intuition and empathy.
“Data for good” is on the rise
BI can be used for more than just profit and process improvement. Companies are investing more and more time in Corporate Social Responsibility programs, or projects to improve the world. Ikea, for example, has built a solar farm for 20,000 Syrian refugees in Jordan.
These types of projects do not always stem from pure altruism. Research shows that millennials prefer to work for companies that do their part to improve the world, and would even gladly accept a lower wage in exchange for a place at a company that takes its social responsibility seriously. There are also so-called data commonwealths: platforms for sharing resources between different organizations, for example to contribute to cancer research. Data must be at the service of people, and large companies must be held accountable.
Smart algorithms sharpen the privacy discussion
The emergence of algorithm-driven technologies such as big data, the Internet of Things, and artificial intelligence poses new constitutional challenges, according to research into algorithms and their influence on our fundamental rights, carried out by researchers at the University of Toronto. These technologies affect, for example, the choices we make, and therefore our autonomy. Built-in prejudices in algorithms (‘biases’) can also lead to unequal treatment. Smart algorithms have an impact on our freedom rights, equality rights, procedural rights, and privacy rights.
For example, ‘dataveillance’ (data + surveillance) can violate the right to privacy. These infringements can occur both in the private environment and in public spaces, such as smart cities. The use of artificial intelligence, for example in the form of robots, can also affect the right to relational privacy, such as in healthcare. The average Dutch person appears in hundreds of databases. These databases contain not only the digital footprint of these people (the information they leave behind about themselves) but also their ‘data shadow’: all the information generated about them by others. This makes privacy protection a major issue.
New technology is crying out for new laws and regulations
2018 was the year of the General Data Protection Regulation (GDPR, known in Dutch as the AVG). 2020 will be the year of, among other things, the European ePrivacy Regulation. This regulation is intended to better protect the confidentiality of digital communication and includes rules for the use of e-mail, telemarketing, cookies, and other forms of electronic communication, such as Skype and WhatsApp.
Politicians and regulators have also woken up in the progressive state of California. The ‘usual suspects’ of worldwide privacy violations (think of Facebook and Google) now face the California Consumer Privacy Act, analogous to the European GDPR. In addition to ‘hard’ laws and regulations, the (international) standardization committees are not sitting still either. They argue for a new ISO standard for artificial intelligence (AI) and big data. The worldwide ISO organization, for example, recently started a new standardization process to align the implementation, testing, and demonstration of, among other things, the reliability, robustness, ethics, and legal issues surrounding AI applications and big data.
Standardization will eventually become an effective means of ensuring that AI applications are safe and desirable for people, society, and business.
Applied ethics of Big Data
In addition to the broad mainstream of (general) ethics, there is also applied ethics, which operates in a more limited domain, such as medical practice, business conduct, or the application of IT. It is precisely this last direction, ethics within IT and in particular in the use of Big Data, that we investigate further in this blog. We briefly discuss aspects of personal data protection and privacy in the application of data analytics. This blog covers different ways in which you can address the problem, and discusses the advantages and disadvantages of those solutions. Finally, it reflects on the difficult relationship between ethics and IT applications, and on ways to improve that relationship and prevent abuse of computers (and information).
Big Data is tempting
Big Data systems offer unprecedented, seemingly unlimited possibilities, because the data and the combinations that can be made from it are growing at an incredibly fast pace. In particular, the combination of machine learning, artificial intelligence, and Big Data offers unprecedented opportunities to make predictions about someone’s life. Like everything in this life, Big Data has both a seductive and a frightening side.
We want to know everything about the customer
The temptation is to collect as much information as possible about people, machines, and things. A company prefers to know everything about its customers: the better it maps and understands customers’ thinking and behavior, the better it can tailor its products to them. Companies that operate in this structured manner are, logically, more successful in obtaining and maintaining a preferential position with the customer. It is often the only way to beat the competition.
What does the customer think?
At the other end of the spectrum, customers do not like companies simply gaining access to their personal information. It is not surprising that in recent years the call for privacy protection has only grown louder and more frequent. At the same time, customers want to benefit from personal offers and discounts on frequently used and purchased products. So there is a field of tension. PSD2, for example, is a new European payment law. These regulations should give you, as a customer, more freedom to manage your finances. For example, you can choose to share your payment details with other companies in order to pay faster in a webshop. If you want this, of course.
The infiltration of Big Data into our lives
Big Data is slowly infiltrating our entire lives, whether we want it to or not. The use of Big Data means that (part of) our data, feelings, and ideas become accessible to many others, including strangers. Those who do not want this place themselves outside society and the modern world. This means that the discussion about the possibilities of Big Data, and their morally correct use, must be conducted anyway.
Big data is here to stay
Big Data is here to stay and will only expand. We will have to accept that everyone, and no one in particular, can obtain information about us. People we do not know and to whom we would probably never tell anything about ourselves, non-natural persons, anyone can root around in our private lives. The discussion should therefore no longer be about whether we want to use Big Data. Rather, we must think about how we want to deal with the excesses that Big Data potentially entails. In other words: what is the ethics of Big Data?