With the rapid evolution of the Internet era, data and information can now be easily stored in network environments such as the cloud, where Internet users can obtain the information they need at any time with just a few clicks. On the other hand, organizations can fetch data from many different sources, for example social media platforms, industrial equipment, business transactions and smart devices (IoT), and run predictive analytics on it for insights that lead to better decisions and more tactical business moves. This data is what we often call "big data".
Before the era of "big data" and "artificial intelligence", data analysis was usually done with tools like Excel. However, Excel is not an ideal tool for large-scale data analytics, as it was never designed to scale to the large datasets we deal with in the real world today.
Moreover, most datasets are actually a mixture of structured and unstructured data, which is one of Excel's biggest pain points when processing them. In addition, Excel also lacks key functionality found in programming languages and machine learning libraries that we can use to build more complex predictive and analytical models.
Machine learning is a subset of AI. Engineers realized that rather than teaching computers and machines how to do everything, it would be far more efficient to code them to think like human beings, and then plug them into the Internet to give them access to all of the information in the world.
Machine learning is the study of algorithms.
An algorithm is essentially a set of mathematical formulas fed into the computer, responsible for instructing the computer to make decisions.
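To make this concrete, here is a toy sketch (my own illustration, not from the article) of how a single mathematical formula can act as a decision-making "algorithm": a linear score is computed and compared against a threshold to produce a yes/no decision.

```python
# A decision "algorithm" can be as simple as one formula: compute a
# linear score w*x + b, then decide by comparing it to a threshold.
# The weights and threshold here are arbitrary toy values.

def decide(x, w=0.8, b=-2.0, threshold=0.0):
    """Return True when the linear score w*x + b exceeds the threshold."""
    score = w * x + b
    return score > threshold

print(decide(1.0))  # score = 0.8*1.0 - 2.0 = -1.2 -> False
print(decide(5.0))  # score = 0.8*5.0 - 2.0 =  2.0 -> True
```

In machine learning, the interesting part is that the numbers `w` and `b` are not hand-picked like this; they are learned from data.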
In machine learning, the learning algorithm is trained by feeding it a large-scale dataset, much like feeding it lots of experience, until it can give the computer precise instructions for making decisions.
The performance of the algorithm after each round of training is measured based on the accuracy and precision of the decisions the computer makes. If the performance is not as expected, the data and the algorithm are further modified and run through another round of training.
In short, machine learning repeats these few steps until an algorithm that gives very accurate instructions is formed.
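The train, evaluate and adjust loop described above can be sketched in a few lines. This is a deliberately tiny, hypothetical example: the "model" is just a single threshold that separates small numbers from large ones, and "training" nudges it until accuracy stops improving.

```python
# Toy data (assumption for illustration): numbers labeled 1 when x >= 5.
data = [(x, 1 if x >= 5 else 0) for x in range(10)]

def accuracy(threshold):
    """Fraction of examples the threshold model classifies correctly."""
    correct = sum(1 for x, y in data if (1 if x >= threshold else 0) == y)
    return correct / len(data)

# The training loop: measure performance, and if it is not as expected,
# modify the model and run another round.
threshold = 0.0
for round_ in range(20):
    if accuracy(threshold) == 1.0:   # performance is as expected -> stop
        break
    threshold += 0.5                 # otherwise adjust and train again

print(round_, threshold, accuracy(threshold))
```

Real learning algorithms adjust many parameters at once and use smarter update rules, but the repeat-until-accurate structure is the same.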
The machine learning process is an analogy of the human brain.
In layman's terms, machine learning is a process carried out by the computer to mimic the way humans interpret and solve problems.
The idea is like high school students sitting for their final exam: they usually revise and practice using past-year questions, and the more past-year questions they practice, the more accurately they answer the questions in the final exam. The past-year questions are essentially the experience gained by the students; the more experience the students have, the better they perform.
The same goes for machine learning, where the historical data (the data used in model training) serves as the model's experience. Therefore, we say that machine learning is a learning process that makes the machine or computer mimic human behavior.
Three major scenarios where we can use machine learning
Scenario 1: Rules are complex or cannot be defined.
Facial and voice recognition require many algorithms that must be arranged in the correct sequence. These algorithms and sequences, which we call the "rules", are always complex and hard to define.
Scenario 2: Task rules change over time.
Besides complex rules, machine learning is also required when the rules are not as complex but change dynamically from time to time.
For example, consider a part-of-speech tagging application with two sentences:
- ‘She saw a bear.’
- ‘Your effort will bear fruits.’
The word 'bear' in the first sentence and 'bear' in the second have totally different meanings: one is a noun, the other a verb. So how would the computer know when 'bear' is a noun and when it is a verb?
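The following toy sketch (a hypothetical hand-written tagger, not a real NLP library) shows why a fixed rule fails here: a simple word-to-tag lookup must assign 'bear' the same tag everywhere, so it cannot be right in both sentences.

```python
# A naive rule: one fixed tag per word. This cannot handle ambiguity.
lookup = {"she": "PRON", "saw": "VERB", "a": "DET", "bear": "NOUN",
          "your": "PRON", "effort": "NOUN", "will": "AUX", "fruits": "NOUN"}

def naive_tag(sentence):
    """Tag each word by dictionary lookup, ignoring context."""
    words = sentence.lower().rstrip(".").split()
    return [(w, lookup.get(w, "UNK")) for w in words]

print(naive_tag("She saw a bear."))
print(naive_tag("Your effort will bear fruits."))
# Both outputs tag "bear" as NOUN, but in the second sentence it is a verb.
```

A learned tagger instead uses the surrounding words as context (for example, a word right after "will" is usually a verb), which is exactly the kind of soft, shifting rule that is easier to learn from data than to write by hand.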
Scenario 3: Data distribution changes over time, requiring constant re-adaptation of programs.
In other cases, machine learning is needed to perform tasks that require us to make predictions about something with no regular pattern or trend.
For example, stock market prediction. The trend of the stock market is always affected by many different factors and uncertainties, which can hardly be predicted by simple algorithms.
We can also decide which type of algorithm or solution to use based on both the complexity of the problem's rules and the scale of the problem.
For example, when we face a large-scale problem whose rules are very complex, we will need machine learning algorithms.
Beyond this combination, you can use rule-based algorithms when the rule complexity is low, use manual rules when the scale of the problem is small even though the rule complexity is high, or just use a simple solution for a small-scale, simple problem.
| Rule Complexity | Scale of Problem | Suitable Solution |
|---|---|---|
| High | Large | Machine learning algorithms |
| Low | Large | Rule-based algorithms |
| High | Small | Manual rules |
| Low | Small | Simple problem solution |
The emergence and evolution of machine learning algorithms has greatly reduced human workload, increased the efficiency of work and saved a lot of time, thanks to machine learning's powerful computing and analytical capabilities. It makes fast processing of enormous volumes of data possible and uncovers hidden patterns in the data, making it an attractive undertaking for organizations looking to increase their business value. To date, not only do large tech companies like Google, Facebook, Amazon and Microsoft dominate AI investment; many non-tech companies are also exploring the potential of AI. So, has your organization started down the path of machine learning too?