3 Concepts Every Data Scientist Must Know Part – 2

1. Bagging and Boosting Bagging and Boosting are two different ways of combining base estimators for ensemble learning (like a random forest combining decision trees). Bagging means aggregating the predictions of several weak learners. We can think of it as combining weak learners in parallel. The average of the predictions of several weak learners…
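The averaging idea can be sketched in plain Python. This is a toy illustration only: the "weak learners" below are hypothetical mean-predictors fit on bootstrap samples, not anything from the article.

```python
import random

random.seed(0)
ys = [2.0, 4.0, 6.0, 8.0, 10.0]  # toy target values

# Each "weak learner" is trained on a bootstrap sample (drawn with
# replacement) and simply predicts the mean of the sample it saw.
def fit_weak_learner(sample):
    mean = sum(sample) / len(sample)
    return lambda _x: mean

# Fit many independent learners -- conceptually, this can happen in parallel.
learners = [fit_weak_learner(random.choices(ys, k=len(ys))) for _ in range(50)]

# Bagging: average the predictions of the weak learners
bagged = sum(f(None) for f in learners) / len(learners)
print(round(bagged, 2))  # close to the overall mean of 6.0
```

Boosting, by contrast, would fit each learner sequentially on the errors left by the previous ones.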

Read More

What is the Normal Distribution?

A probability distribution is a function that shows the probabilities of the outcomes of an event or experiment. Consider a feature (i.e., a column) in a dataframe. This feature is a variable, and its probability distribution function shows the likelihood of the values it can take. Probability distribution functions are quite useful in predictive analytics or machine…
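As a concrete example of a probability distribution function, here is the density of the normal distribution written out in plain Python (the formula is standard; the function name is my own):

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    # Probability density of the normal (Gaussian) distribution:
    # (1 / (sigma * sqrt(2*pi))) * exp(-(x - mu)^2 / (2 * sigma^2))
    coeff = 1.0 / (sigma * math.sqrt(2 * math.pi))
    return coeff * math.exp(-((x - mu) ** 2) / (2 * sigma ** 2))

print(round(normal_pdf(0.0), 4))  # 0.3989, the peak for mu=0, sigma=1
```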

Read More

Important Deep Learning Concepts Explained Part – 2

Converge An algorithm that converges will eventually reach an optimal answer, even if very slowly. An algorithm that doesn’t converge may never reach an optimal answer. Learning Rate The rate at which optimizers change weights and biases. A high learning rate generally trains faster but risks not converging, whereas a lower rate trains slower. Numerical instability Issues with…
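The convergence/learning-rate trade-off can be made concrete with a minimal gradient-descent sketch on the toy function f(x) = x² (my own illustration, not from the article):

```python
# Minimize f(x) = x**2 with gradient descent; the gradient is 2*x.
def gradient_descent(lr, steps=50, x=5.0):
    for _ in range(steps):
        x -= lr * 2 * x  # step against the gradient, scaled by the learning rate
    return x

print(abs(gradient_descent(lr=0.1)))  # small: converges toward the minimum at 0
print(abs(gradient_descent(lr=1.1)))  # huge: overshoots each step and diverges
```

With lr=0.1 each step multiplies x by 0.8, so x shrinks toward 0; with lr=1.1 each step multiplies x by -1.2, so |x| grows without bound, i.e. the algorithm fails to converge.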

Read More

Important Deep Learning Concepts Explained Part – 1

Neuron A node in a NN, typically taking in multiple input values and generating one output value by applying an activation function (a nonlinear transformation) to a weighted sum of the input values. Weights Edges in a NN; the goal of training is to determine the optimal weight for each feature; if a weight is 0, the corresponding feature does not…
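A single neuron's forward pass can be sketched in a few lines; the sigmoid activation here is one possible choice, assumed for illustration:

```python
import math

def neuron(inputs, weights, bias):
    # Weighted sum of the inputs plus a bias, passed through a
    # nonlinear activation function (sigmoid in this sketch).
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-z))

out = neuron([1.0, 2.0], [0.5, -0.25], bias=0.0)
print(out)  # 0.5, since the weighted sum is exactly 0 here
```

Note that if a weight is 0, the corresponding input drops out of the weighted sum entirely, which is why a zero weight means the feature does not contribute.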

Read More

Russia-Ukraine War Data Analysis Project using Python

In this article, I will take you through the task of analyzing the Russia-Ukraine war dataset using Python. The dataset I am using for this analysis of the Ukraine and Russia war is downloaded from Kaggle. You can download the russia-ukraine equipment dataset from here and the russia-ukraine personnel losses dataset from here. Now let’s import…

Read More

3 Concepts Every Data Scientist Must Know Part – 1

Central Limit Theorem We first need to introduce the normal (Gaussian) distribution for the central limit theorem to make sense. The normal distribution is a probability distribution that looks like a bell. The x-axis represents the values and the y-axis represents the probability of observing these values. The sigma values represent the standard deviation. The normal distribution is used to represent…
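The central limit theorem can be previewed with a toy simulation (the uniform source distribution is my own choice for illustration): means of many samples drawn from a non-normal distribution still cluster around the population mean in a bell shape.

```python
import random

random.seed(0)

# Draw 2000 samples of size 30 from a uniform distribution (not normal)
# and record each sample's mean.
sample_means = [
    sum(random.random() for _ in range(30)) / 30
    for _ in range(2000)
]

grand_mean = sum(sample_means) / len(sample_means)
print(round(grand_mean, 2))  # close to the population mean of 0.5
```

Plotting a histogram of `sample_means` would show the familiar bell shape, even though the underlying values are uniform.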

Read More

Most Common Feature Scaling Methods in Machine Learning

Definition Feature scaling is the process of normalizing the range of features in a dataset. Real-world datasets often contain features that vary in magnitude, range, and units. Therefore, in order for machine learning models to interpret these features on the same scale, we need to perform scaling. Feature scaling makes the model…
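Two of the most common scaling methods can be sketched in plain Python (the function names are my own; libraries such as scikit-learn provide equivalents):

```python
def min_max_scale(values):
    # Min-max normalization: map the feature's range onto [0, 1]
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

def standardize(values):
    # Standardization (z-score): zero mean, unit standard deviation
    mean = sum(values) / len(values)
    std = (sum((v - mean) ** 2 for v in values) / len(values)) ** 0.5
    return [(v - mean) / std for v in values]

print(min_max_scale([10, 20, 30]))  # [0.0, 0.5, 1.0]
```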

Read More

Top 8 Deep Learning Algorithms

Convolutional Neural Networks CNNs, popularly known as ConvNets, consist of several layers and are specifically used for image processing and object detection. The architecture was developed in 1998 by Yann LeCun. CNNs are widely used in identifying satellite images, medical image processing, series forecasting, and anomaly detection. CNNs process the data…
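The core operation of a CNN is the convolution itself; here is a minimal sketch of a valid 2-D convolution in plain Python (the toy edge-detecting kernel is my own example):

```python
def conv2d(image, kernel):
    # "Valid" 2-D cross-correlation: slide the kernel over the image
    # and take the elementwise product-sum at each position.
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    return [
        [
            sum(
                image[i + di][j + dj] * kernel[di][dj]
                for di in range(kh)
                for dj in range(kw)
            )
            for j in range(out_w)
        ]
        for i in range(out_h)
    ]

edge = [[1, -1]]          # toy horizontal edge-detecting kernel
img = [[0, 0, 1, 1]]      # a 1x4 "image" with one edge in the middle
print(conv2d(img, edge))  # [[0, -1, 0]] -- the edge shows up as -1
```

A real CNN learns many such kernels per layer and stacks convolution, nonlinearity, and pooling layers.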

Read More

Python Interview Questions – Part 1

1. What is the difference between indexing and slicing? Indexing is extracting or looking up a single value in a data structure, whereas slicing retrieves a sequence of elements. 2. What is a lambda function? A lambda function is an anonymous or nameless function. These functions are called anonymous because they are not declared in…
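Both answers can be shown in a couple of lines:

```python
nums = [10, 20, 30, 40]

# Indexing: look up a single element by position
first = nums[0]      # 10

# Slicing: retrieve a sequence of elements (start inclusive, stop exclusive)
middle = nums[1:3]   # [20, 30]

# Lambda: an anonymous (nameless) function defined inline
square = lambda x: x * x
print(first, middle, square(5))  # 10 [20, 30] 25
```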

Read More

What are Pickling and Unpickling?

Pickling Converting a Python object hierarchy to a byte stream is called pickling. Pickling is also referred to as serialization. Unpickling Converting a byte stream to a Python object hierarchy is called unpickling. Unpickling is also referred to as deserialization.
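Both directions use Python's standard `pickle` module; the sample dictionary below is just an illustration:

```python
import pickle

data = {"name": "Ada", "scores": [1, 2, 3]}

# Pickling (serialization): Python object hierarchy -> byte stream
blob = pickle.dumps(data)

# Unpickling (deserialization): byte stream -> Python object hierarchy
restored = pickle.loads(blob)
print(restored == data)  # True -- an equal but separate object
```

Only unpickle data you trust: `pickle.loads` can execute arbitrary code from a malicious byte stream.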

Read More