Important Machine Learning Concepts Part – 2
A/B Testing
Statistical way of comparing two or more techniques, often by training multiple models with different parameters to solve the same problem, to determine which technique performs better and whether the difference is statistically significant.
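One common way to judge statistical significance in such a comparison is a two-proportion z-test. A minimal stdlib-only sketch (the conversion counts and sample sizes below are made-up illustration values, not from the text):

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Test whether technique B's success rate differs significantly
    from technique A's, given success counts and sample sizes."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)            # pooled proportion
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via math.erf)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical experiment: 120/1000 successes for A vs 160/1000 for B
z, p = two_proportion_z_test(conv_a=120, n_a=1000, conv_b=160, n_b=1000)
```

A p-value below the chosen threshold (commonly 0.05) indicates the observed difference is unlikely to be due to chance alone.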
Baseline
Simple model or heuristic used as a reference point for comparing how well a model is performing.
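A common example of such a heuristic is the majority-class baseline: always predict the most frequent label seen in training. A minimal sketch (the spam/ham labels are illustrative):

```python
from collections import Counter

def majority_class_baseline(train_labels, test_labels):
    """Predict the most frequent training label for every test example;
    any real model should beat this accuracy to justify its complexity."""
    majority = Counter(train_labels).most_common(1)[0][0]
    correct = sum(1 for y in test_labels if y == majority)
    return correct / len(test_labels)

acc = majority_class_baseline(["spam", "ham", "ham", "ham"],
                              ["ham", "spam", "ham"])
```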
Bias
Prejudice or favouritism towards some things, people, or groups over others that can affect the sampling and interpretation of data, the design of a system, and how users interact with that system.
Dynamic (Online) Model
Model that is trained online, in a continuously updating fashion.
Static (Offline) Model
Model that is trained offline, once, on a fixed dataset.
Normalization
Process of converting an actual range of values into a standard range of values, typically -1 to +1 or 0 to 1.
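A minimal sketch of this conversion as linear (min-max) rescaling into the -1 to +1 range:

```python
def normalize(values, low=-1.0, high=1.0):
    """Linearly rescale values from their actual range into [low, high]."""
    lo, hi = min(values), max(values)
    span = hi - lo  # assumes the values are not all identical
    return [low + (high - low) * (v - lo) / span for v in values]

scaled = normalize([10, 20, 30, 40])  # smallest maps to -1, largest to +1
```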
Independently and Identically Distributed
Data drawn from a distribution that doesn't change, and where each value drawn doesn't depend on previously drawn values; the ideal case, but one rarely found in real life.
Hyperparameter
In machine learning, a hyperparameter is a parameter whose value is used to control the learning process. By contrast, the values of other parameters are derived via training.
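The distinction can be illustrated with a toy gradient-descent fit: the learning rate and step count are hyperparameters we choose to control learning, while the weight w is derived from the data by training. A minimal sketch:

```python
def fit_mean(data, learning_rate, steps):
    """Fit a single parameter w to minimise mean squared error to `data`.
    `learning_rate` and `steps` are hyperparameters set before training;
    w is the (ordinary) parameter learned from the data."""
    w = 0.0
    for _ in range(steps):
        grad = sum(2 * (w - x) for x in data) / len(data)  # d(MSE)/dw
        w -= learning_rate * grad
    return w

# With MSE loss, w converges to the mean of the data
w = fit_mean([1.0, 2.0, 3.0], learning_rate=0.1, steps=100)
```

Choosing a different learning rate changes how (and whether) training converges, but never appears in the learned model itself.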
Generalization
Refers to a model's ability to make correct predictions on new, previously unseen data, as opposed to the data used to train the model.
Cross-Entropy
Cross-entropy is commonly used in machine learning as a loss function. Cross-entropy is a measure from the field of information theory, building upon entropy and generally calculating the difference between two probability distributions.
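For discrete distributions, cross-entropy is H(p, q) = -Σᵢ pᵢ log(qᵢ). A minimal sketch of its use as a classification loss, where the true label is a one-hot distribution and the model outputs predicted probabilities (the example values are illustrative):

```python
import math

def cross_entropy(p, q, eps=1e-12):
    """H(p, q) = -sum_i p_i * log(q_i); eps guards against log(0)."""
    return -sum(pi * math.log(qi + eps) for pi, qi in zip(p, q))

truth = [1.0, 0.0, 0.0]   # one-hot encoding of the true class
pred  = [0.7, 0.2, 0.1]   # model's predicted probabilities
loss = cross_entropy(truth, pred)  # equals -log(0.7), about 0.357
```

The loss shrinks toward zero as the predicted probability of the true class approaches 1, which is why minimising cross-entropy pushes the model toward confident correct predictions.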