Building a Machine Learning model may be getting easier by the day, but one rule of thumb still holds: garbage input equals garbage predictions. Outliers are observations that significantly differ from other observations. Having outliers in your data will...
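As a rough illustration of what "significantly differ" can mean in practice, here is a minimal sketch of IQR-based outlier flagging. The NumPy approach and the 1.5 × IQR threshold are assumptions for the example, not a prescription from the post:

```python
import numpy as np

def iqr_outliers(values, k=1.5):
    """Flag points lying more than k * IQR outside the quartiles (assumed rule of thumb)."""
    values = np.asarray(values)
    q1, q3 = np.percentile(values, [25, 75])
    iqr = q3 - q1
    lower, upper = q1 - k * iqr, q3 + k * iqr
    return values[(values < lower) | (values > upper)]

data = [10, 12, 11, 13, 12, 95, 11, 10]   # 95 clearly differs from the rest
print(iqr_outliers(data))                  # -> [95]
```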
K-Nearest Neighbors is a double-edged algorithm that can be used for both Classification and Regression problems. Its intuition is quite simple and it can yield impressive results, but there are a few drawbacks, hence double-edged.
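A minimal sketch of both uses, assuming scikit-learn and the built-in iris dataset (neither is specified in the post); the same "look at the k closest points" idea votes for a class in classification and averages the neighbors' targets in regression:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier, KNeighborsRegressor

# Classification: predict the iris species from its measurements.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = KNeighborsClassifier(n_neighbors=5).fit(X_train, y_train)
print("classification accuracy:", clf.score(X_test, y_test))

# Regression: average the targets of the k nearest neighbors (toy data).
reg = KNeighborsRegressor(n_neighbors=3).fit([[0], [1], [2], [3]], [0.0, 1.1, 1.9, 3.2])
print("prediction for x=1.5:", reg.predict([[1.5]]))
```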
Logistic Regression is one of the most basic ML algorithms. Although it is basic, it isn't always included in ML courses because of its weaknesses.
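For a sense of how basic it is to apply, here is a minimal sketch, assuming scikit-learn and its breast cancer dataset (both are assumptions for the example). The model turns a linear combination of the features into a probability via the sigmoid, which is then thresholded into a class label:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# max_iter raised because the unscaled features make the default solver slow to converge.
model = LogisticRegression(max_iter=5000).fit(X_train, y_train)
print("test accuracy:", model.score(X_test, y_test))
print("class probabilities for one sample:", model.predict_proba(X_test[:1]))
```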
Linear Regression is probably the first ML algorithm any Data Scientist will encounter in their journey. As a matter of fact, it is the easiest ML algorithm to learn conceptually. Let’s take a look.
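The conceptual simplicity is just "fit a straight line". A minimal sketch, assuming NumPy and scikit-learn and a synthetic dataset generated around y ≈ 2x + 1 (all choices are illustrative, not from the post):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Tiny synthetic dataset following y ≈ 2x + 1 plus noise, to show the "fit a line" idea.
rng = np.random.default_rng(42)
X = rng.uniform(0, 10, size=(50, 1))
y = 2 * X.ravel() + 1 + rng.normal(scale=0.5, size=50)

model = LinearRegression().fit(X, y)
print("slope:", model.coef_[0], "intercept:", model.intercept_)  # close to 2 and 1
```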