Artificial intelligence. Machine learning. Deep learning. Though these terms are becoming increasingly mainstream, to many people they still feel like the subject of a science fiction film. Let's simplify things and try a one-line definition of each term: artificial intelligence is the broad discipline of building machines that can perform tasks associated with human intelligence; machine learning is a subset of AI in which algorithms learn patterns from data rather than following only explicit rules; and deep learning is a subset of machine learning that uses layered, brain-inspired networks to handle less tangible tasks.
The very idea of artificial intelligence dates back to the 1950s with the advent of computational techniques and abilities in machines. The goal was simple: to go beyond using a computer as a means of calculation and actually drive decision making.
This meant that computers needed to go beyond calculating decisions based on existing data; they needed to weigh multiple options and reason deductively. How to accomplish this in practice, however, required decades of research and innovation. A simple form of artificial intelligence is the rule-based, or expert, system. But the growth in computing power that began in the 1980s meant that machine learning would change the possibilities of AI.
Rule-based decisions worked for simpler situations with clear variables. Even computer chess is based on a series of rule-based decisions that incorporate variables such as which pieces are on the board, what positions they occupy, and whose turn it is. The problem is that these situations all required a certain level of control. At some point, making decisions purely from predefined variables and if/then rules stopped working.
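To make the rule-based idea concrete, here is a minimal sketch of one chess rule expressed as explicit if/then logic. The board representation (row/column tuples and a set of occupied squares) is a simplified illustration, not a full chess engine:

```python
def rook_move_is_legal(src, dst, occupied):
    """Rule-based check: a rook may move along one rank or one file,
    and every square between src and dst must be empty."""
    (r1, c1), (r2, c2) = src, dst
    # Rule 1: the move must stay on a single rank or file, and must actually move.
    if (r1, c1) == (r2, c2) or (r1 != r2 and c1 != c2):
        return False
    # Step direction toward the destination (one of -1, 0, +1 per axis).
    step_r = (r2 > r1) - (r2 < r1)
    step_c = (c2 > c1) - (c2 < c1)
    # Rule 2: every intermediate square must be unoccupied.
    r, c = r1 + step_r, c1 + step_c
    while (r, c) != (r2, c2):
        if (r, c) in occupied:
            return False
        r, c = r + step_r, c + step_c
    return True

print(rook_move_is_legal((0, 0), (0, 5), set()))        # clear path along a rank
print(rook_move_is_legal((0, 0), (0, 5), {(0, 3)}))     # blocked by a piece
print(rook_move_is_legal((0, 0), (3, 4), set()))        # diagonal: not a rook move
```

Every outcome here is fully determined by hand-written rules; nothing is learned from data, which is exactly the limitation the next section describes.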
The trick, then, was to mimic HOW humans learned.
Machine learning was introduced in the 1980s with the idea that an algorithm could process large volumes of data, then begin to draw conclusions from the results it was getting. For example, if a machine-learning algorithm were fed a large volume of credit card transactions along with if/then rules for flagging fraud, it could then start to identify secondary factors that formed a pattern, such as an account making purchases at unusual hours or at stores in an unfamiliar geographic location.
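A rough sketch of that idea: instead of hard-coding which hours are suspicious, the program estimates a fraud rate per hour from labeled historical transactions and flags the hours it finds risky. The data, threshold, and function names below are hypothetical illustrations, far simpler than a real fraud model:

```python
from collections import defaultdict

# Hypothetical labeled training data: (hour_of_day, was_fraud)
training = [
    (2, True), (3, True), (2, True), (3, False),
    (10, False), (11, False), (14, False), (15, False),
    (10, False), (14, False), (2, True), (15, False),
]

def learn_risky_hours(transactions, threshold=0.5):
    """Estimate the fraud rate for each hour and keep hours at or above a threshold."""
    counts = defaultdict(lambda: [0, 0])          # hour -> [fraud count, total count]
    for hour, was_fraud in transactions:
        counts[hour][0] += int(was_fraud)
        counts[hour][1] += 1
    return {h for h, (f, n) in counts.items() if f / n >= threshold}

risky = learn_risky_hours(training)
print(risky)  # the hours the program learned to associate with fraud
```

The key difference from the rule-based approach is that no one told the program which hours matter; the pattern emerged from the data, and it would shift automatically if the data changed.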
Such a process required large data sets before patterns began to emerge. And while data sets with clear alphanumeric characters, consistent formats, and well-defined syntax suited these algorithms, less tangible tasks, such as identifying faces in a picture, created problems.
In the 2000s, technology took another step forward: the solution was a learning methodology that mimicked the human brain.
Deep learning works by breaking information down into interconnected relationships, essentially making deductions based on a series of observations. By layering the patterns that machine learning deduces from the data, deep learning builds a set of references to draw on for decision making. As with standard machine learning, the larger the data set used for learning, the more refined the deep learning results are.
A simple way to explain deep learning is that it allows unexpected context clues to enter the decision-making process. Consider how a young child learns to read. If they see a sentence that says "Cars go fast," they may recognize the words "cars" and "go" but not "fast." With some thought, however, they can deduce the whole sentence from context clues: "fast" is a word they have likely heard in relation to cars before, the illustration may show lines indicating speed, and they may know how the letters F and A sound together. Each of these is an individual judgment, such as "Do I recognize that letter and know how it sounds?" But when they are put together, the child's brain can decide how the word works and read the sentence. In turn, this reinforces how to say the word "fast" the next time they see it.
This is how deep learning works—breaking down various elements to make machine-learning decisions about them, then looking at how they are interconnected to deduce a final result.
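As a toy sketch of that layered, interconnected structure, the code below trains a tiny neural network with one hidden layer on the XOR problem, a task no single if/then rule on the raw inputs can solve. The architecture, learning rate, and epoch count are illustrative choices, and real deep learning frameworks work at a vastly larger scale:

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# XOR: output 1 only when exactly one input is 1.
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]

H = 4  # hidden units: the intermediate "observations" the network learns
w1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-1, 1) for _ in range(H)]
b2 = 0.0

def forward(x):
    """Combine inputs into hidden features, then combine those into a decision."""
    h = [sigmoid(sum(w1[j][i] * x[i] for i in range(2)) + b1[j]) for j in range(H)]
    y = sigmoid(sum(w2[j] * h[j] for j in range(H)) + b2)
    return h, y

def total_error():
    return sum((forward(x)[1] - t) ** 2 for x, t in data)

def train(epochs=5000, lr=1.0):
    """Stochastic gradient descent with backpropagation."""
    global b2
    for _ in range(epochs):
        for x, t in data:
            h, y = forward(x)
            d_out = (y - t) * y * (1 - y)              # error signal at the output
            for j in range(H):
                d_hid = d_out * w2[j] * h[j] * (1 - h[j])  # error pushed back a layer
                w2[j] -= lr * d_out * h[j]
                for i in range(2):
                    w1[j][i] -= lr * d_hid * x[i]
                b1[j] -= lr * d_hid
            b2 -= lr * d_out

before = total_error()
train()
after = total_error()
print(f"error before: {before:.3f}, after: {after:.3f}")
for x, t in data:
    print(x, "->", round(forward(x)[1], 3), "(target", t, ")")
```

Nothing in the code names the XOR pattern; the hidden layer learns intermediate features on its own, and the output layer learns how those features interconnect, which mirrors the layered deduction described above.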
Artificial intelligence software can use decision-making and automation powered by machine learning and deep learning to increase an organization’s efficiency. From predictive modeling to report generation to process automation, artificial intelligence can transform how an organization operates, creating improvements in efficiency and accuracy. Oracle Cloud Infrastructure (OCI) provides the foundation for cloud-based data management powered by AI and ML.