How cognitive computing will control everything.

Maxime Leleannec
4 min read · Mar 24, 2021

“By combining state-of-the-art neuroscience, big data and nanotechnology, cognitive computing is creating algorithms that simulate human behaviour to solve problems, opening up unprecedented horizons.”

Big Brother is watching you… 👀

In George Orwell’s 1984, Big Brother observes private life through surveillance networks; the 2021 version could be quite different.

Let’s imagine that it is an all-powerful computer that controls everything, and that can even impersonate and simulate human behaviour… Just imagine…

What is Cognitive Computing?

While computers have been able to perform calculations and process information faster than humans for several decades, they struggle with tasks that are simple for humans, such as understanding natural language, recognising objects within an image, or showing empathy by recognising or displaying emotions.

Cognitive computing is the result of the crossover between different sciences and areas of expertise: Cognitive Computing = Super Computing + Neuroscience + Big Data + Nanotechnology

How does it work?

Cognitive computing allows computers to mimic the functioning of the human brain. It uses self-learning algorithms based on data mining and pattern recognition to generate solutions to a wide range of problems.

Cognitive computer systems can synthesize data from a variety of information sources, while considering context and conflicting evidence to suggest the best possible answers.

To do so, cognitive systems include self-learning technologies that use enhanced data mining, pattern matching and natural language processing (NLP) to simulate the operation of the human brain.

Using computer systems to solve the types of problems that humans typically face requires large amounts of both structured and unstructured data, which is fed to machine learning algorithms. With time and experience, cognitive systems sharpen the way they identify patterns and process data, allowing them to anticipate new problems and model possible outcomes.
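To make this idea of “sharpening with experience” concrete, here is a deliberately minimal sketch (not a real cognitive system, and not any specific product’s algorithm): a word-frequency pattern matcher that gets better at labelling new text as it accumulates labelled examples, much as the paragraph above describes.

```python
from collections import Counter

# Toy "self-learning" pattern recogniser: it counts which words appear
# under which label, and predicts by comparing word-frequency patterns.
class PatternLearner:
    def __init__(self):
        self.counts = {}         # label -> Counter of word frequencies
        self.totals = Counter()  # label -> total number of words seen

    def learn(self, text, label):
        # Each new labelled example refines the stored pattern for that label.
        words = text.lower().split()
        self.counts.setdefault(label, Counter()).update(words)
        self.totals[label] += len(words)

    def predict(self, text):
        # Score each label by how often its past examples used these words
        # (with +1 smoothing so unseen words don't zero out a label).
        words = text.lower().split()
        def score(label):
            c, n = self.counts[label], self.totals[label]
            return sum((c[w] + 1) / (n + 1) for w in words)
        return max(self.counts, key=score)

learner = PatternLearner()
learner.learn("service was great and friendly", "positive")
learner.learn("terrible slow rude service", "negative")
print(learner.predict("friendly and great"))  # -> positive
```

With only two examples the predictions are crude; the point is simply that every additional example reshapes the frequency patterns, so the system’s answers improve with data rather than with new hand-written rules.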

What is it used for, and for what purpose?

The aim of cognitive computing is to create automated computer systems capable of solving problems without the need for human assistance. Cognitive computing is used by many artificial intelligence applications, including expert systems, natural language processing, neural networks, robotics and virtual reality.

Note: CC and AI are often confused and used interchangeably. The nuance between the two is indeed fine: both rely on data to make or support a decision. With AI, data is fed into algorithms over a long period of time so that the system can learn on its own and produce outcomes.

The term CC is typically used to describe an AI system that aims to simulate human behaviour. Where AI relies on algorithms to solve a problem or to identify patterns hidden in data, the goal of cognitive computing systems is to create algorithms that copy the reasoning process of the human brain, so they can keep solving problems as the data and the problems themselves evolve.

Who is using Cognitive Computing ?

Cognitive computing relies on very large data sets, so it is logical to see the big digital companies pioneering this technology. The American GAFAM (Google, Apple, Facebook, Amazon, Microsoft) and the Chinese BATX (Baidu, Alibaba, Tencent, Xiaomi) hold astronomical amounts of data, which strengthens their edge in this technology race, and they continue to invest huge sums in R&D.

However, the pioneer of this technology is IBM, which has invested billions of dollars in big data analysis and is now conducting research on cognitive computing, devoting 1/3 of its R&D budget to this development. For example, IBM has developed a program called Watson, which answers questions formulated in natural language.

It is particularly used in healthcare: by analysing the questions patients ask and the notes doctors record, it is able to propose hypotheses and possible curative treatments.
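The core idea of generating and ranking hypotheses against evidence can be pictured with a tiny sketch. To be clear, this is not how Watson actually works; it is only an illustrative toy that scores candidate diagnoses by how many of a question’s keywords appear in the evidence associated with each one (the symptom lists below are invented for the example).

```python
# Illustrative hypothesis ranking: score each candidate by keyword
# overlap between the question and that candidate's evidence text,
# then return candidates ordered from most to least supported.
def rank_hypotheses(question, candidates):
    q_words = set(question.lower().split())
    scored = []
    for hypothesis, evidence in candidates.items():
        overlap = q_words & set(evidence.lower().split())
        scored.append((len(overlap), hypothesis))
    # Highest-scoring hypotheses first, like a confidence-ordered list.
    return [h for _, h in sorted(scored, reverse=True)]

candidates = {
    "seasonal flu": "fever cough fatigue muscle aches",
    "common cold": "runny nose sneezing mild cough",
    "migraine": "headache nausea light sensitivity",
}
print(rank_hypotheses("patient reports fever and cough", candidates))
# -> ['seasonal flu', 'common cold', 'migraine']
```

A real system would weigh conflicting evidence, context and source reliability rather than counting shared words, but the output shape is the same: ranked hypotheses with a confidence ordering, not a single hard answer.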

If you want to know more about Watson, it’s here ➡️ https://www.youtube.com/watch?v=P18EdAKuC1U

To go further

Today, most of us have used or still use Siri or Alexa for very simple tasks, such as sending a message, changing the music, or scheduling an appointment.

But none of this is as well developed as JARVIS, the famous assistant of the superhero Iron Man.

But it is possible to go a little further than just using Siri and Alexa… and the economic benefits and growth potential of cognitive computing are significant. Artificial intelligence can increase human performance and expertise by putting the right data in the right hands.

For example: you’re driving to work, and your phone, or indeed your car, notices an impending health problem, detected by various sensors built into the vehicle. The car can redirect you, while driving autonomously, to the nearest health centre, while searching the latest research for a list of potential illnesses you may be suffering from based on the data collected.

Therefore, as soon as you arrive at the hospital, which will have already been alerted to your arrival by an emergency call function provided by the car, you can immediately receive the best equipment, a pre-established medical diagnosis and adequate treatment, without having to go to the emergency room.

At the same time, your family has already been notified of your condition, your employer is aware of what is happening to you, and your insurer will have been notified.

And all of this, autonomously… It’s a bit scary, but it’s something that is technically feasible.

If you liked this article, find my other posts:

Maxime LE LEANNEC — 雷王力 🛸📶
