Sign Language Prediction with MobileNet

In this exercise we again apply transfer learning, this time to recognize numeric sign language. We use the MobileNet model, modify it, and then fine-tune it to suit our requirements. Click here to view the code.

  • Tools

TensorFlow, Transfer Learning, CNN, Google Colab Notebook
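
The full code is in the linked notebook; as a rough illustration of the approach, here is a minimal Keras sketch. The input size, number of classes, head layers, and learning rates below are assumptions for illustration, not values taken from the project.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 10  # assumed: digits 0-9 in the numeric sign language data set

# Load MobileNet pre-trained on ImageNet, without its classification head
base = tf.keras.applications.MobileNet(
    input_shape=(224, 224, 3), include_top=False, weights="imagenet")
base.trainable = False  # freeze the convolutional base for the first training phase

# Modified model: MobileNet base plus a small custom classification head
model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(128, activation="relu"),
    layers.Dropout(0.3),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# (train the head on the sign language images here)

# Fine-tuning phase: unfreeze the base and continue training with a small learning rate
base.trainable = True
model.compile(optimizer=tf.keras.optimizers.Adam(1e-5),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```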

MNIST Digit Recognizer using CNN

MNIST (“Modified National Institute of Standards and Technology”) is the de facto “hello world” data set of computer vision. Since its release in 1999, this classic data set of handwritten images has served as the basis for benchmarking classification algorithms. As new machine learning techniques emerge, MNIST remains a reliable resource for researchers and learners alike. In this exercise, our goal is to correctly identify digits from a data set of tens of thousands of handwritten images. Click here to view the code.

  • Tools

TensorFlow, CNN, Google Colab Notebooks
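
The exact architecture and hyperparameters are in the linked notebook; as a rough outline of the kind of network used, here is a minimal CNN for MNIST (layer sizes and epoch count are illustrative assumptions):

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Load the MNIST data set of 28x28 greyscale handwritten digits
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train = x_train[..., None] / 255.0   # add channel dimension and scale to [0, 1]
x_test = x_test[..., None] / 255.0

model = models.Sequential([
    layers.Conv2D(32, 3, activation="relu", input_shape=(28, 28, 1)),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=5, validation_data=(x_test, y_test))
```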

Exploratory Data Analysis - Titanic Data Set (Kaggle)

The objective of this exercise was to clean the data and then apply feature engineering to generate meaningful insights from the Titanic data set available on Kaggle. A training data set with 891 rows was used for this exercise. The data set has interesting features such as age, gender, and fare of the passengers, which are used to predict whether they survived the Titanic disaster. Click here to view the code.

  • Tools

Missing Value Treatment, Feature Engineering and Exploratory Data Analysis, Google Colab Notebooks
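
The notebook has the full analysis; as a minimal illustration of the missing-value treatment and feature engineering steps, here is a pandas sketch. Column names follow the standard Kaggle Titanic schema; the specific imputations and engineered features are assumptions.

```python
import pandas as pd

# Standard Kaggle Titanic training file (891 rows)
df = pd.read_csv("train.csv")

# Missing value treatment: impute Age with the median, Embarked with the mode
df["Age"] = df["Age"].fillna(df["Age"].median())
df["Embarked"] = df["Embarked"].fillna(df["Embarked"].mode()[0])

# Feature engineering: family size and a title extracted from the passenger name
df["FamilySize"] = df["SibSp"] + df["Parch"] + 1
df["Title"] = df["Name"].str.extract(r",\s*([^\.]+)\.", expand=False)

# Quick exploratory summary: survival rate by sex and passenger class
print(df.groupby(["Sex", "Pclass"])["Survived"].mean())
```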

Creation of an India Credit Risk Default Model Using Logistic Regression

The project involved developing a credit risk default model on a given data set, which had to be checked for outliers, missing values, multicollinearity, etc. Univariate and bivariate analysis were conducted, and the model was built using logistic regression on the most important variables. Model performance measures included evaluating the accuracy of the model on hold-out data sets. Click here to view the code.

  • Tools

Logistic Regression, Univariate & Bivariate Analysis, Outlier Treatment, Model Performance Measures
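
As a hedged sketch of the modelling and validation step, here is a minimal Python version using statsmodels. The file name, predictor columns, and cut-off are placeholders, not the variables selected in the project.

```python
import pandas as pd
import statsmodels.api as sm
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, confusion_matrix

# Placeholder file and column names; the real variables come from the project data
df = pd.read_csv("credit_data.csv")
X = df[["net_worth", "total_income", "total_liabilities"]]  # assumed key predictors
y = df["default"]                                            # 1 = defaulted, 0 = did not

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42, stratify=y)

# Fit a logistic regression with an intercept and inspect coefficients / p-values
model = sm.Logit(y_train, sm.add_constant(X_train)).fit()
print(model.summary())

# Model performance on the hold-out set using a 0.5 cut-off
pred = (model.predict(sm.add_constant(X_test)) > 0.5).astype(int)
print(confusion_matrix(y_test, pred))
print("Accuracy:", accuracy_score(y_test, pred))
```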

Visualizing Car Insurance Claims using Tableau

This project explored the art of problem-solving with the aid of visual analytics. Tableau’s data visualization tools were used to create interactive dashboards that provide high-level insights to an insurance company and help drive its car insurance schemes.

  • Tools

Data Visualization, Tableau, Business Intelligence

Build a forecasting model to predict monthly gas production

The project involved developing an ARIMA model to forecast monthly Australian gas production for the next 12 months. Click here to view the code.

  • Tools

ARIMA, Time Series Forecasting, ADF Test
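
A minimal Python sketch of the forecasting flow is shown below. The file name is a placeholder and the ARIMA order is an assumption for illustration; in the project the order would be chosen from the ADF test and ACF/PACF diagnostics.

```python
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.stattools import adfuller

# Placeholder: monthly Australian gas production as a pandas Series indexed by month
gas = pd.read_csv("gas.csv", index_col="Month", parse_dates=True)["Production"]

# ADF test for stationarity: a small p-value suggests the differenced series is stationary
adf_stat, p_value, *_ = adfuller(gas.diff().dropna())
print("ADF p-value:", p_value)

# Fit an ARIMA model; the (p, d, q)(P, D, Q, s) order here is illustrative only
model = ARIMA(gas, order=(2, 1, 1), seasonal_order=(0, 1, 1, 12)).fit()

# Forecast the next 12 months of production
print(model.forecast(steps=12))
```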

Predicting the mode of transport preferred by employees

The project involves predicting the mode of transport that employees prefer when commuting to the office. Multiple models, such as KNN, Naive Bayes, and Logistic Regression, were created and compared on their model performance metrics. Bagging and boosting procedures were also applied to build the models. Click here to view the code.

  • Tools

Bagging and Boosting, KNN, Naive Bayes, Logistic Regression
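
As a condensed sketch of how several classifiers can be compared on the same split, here is an illustrative scikit-learn version. The file name, target column, and boosting choice are assumptions, and categorical variables are assumed to be already encoded as numbers.

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import BaggingClassifier, GradientBoostingClassifier
from sklearn.metrics import classification_report

# Placeholder file and target column; features assumed numeric / pre-encoded
df = pd.read_csv("transport.csv")
X, y = df.drop(columns="Transport"), df["Transport"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)

models = {
    "KNN": KNeighborsClassifier(n_neighbors=5),
    "Naive Bayes": GaussianNB(),
    "Logistic Regression": LogisticRegression(max_iter=1000),
    "Bagging": BaggingClassifier(n_estimators=100, random_state=42),
    "Boosting": GradientBoostingClassifier(random_state=42),
}
# Fit each model and compare performance metrics on the same hold-out set
for name, clf in models.items():
    clf.fit(X_train, y_train)
    print(name)
    print(classification_report(y_test, clf.predict(X_test)))
```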

Cellphone Churn Prediction using Logistic Regression

The primary objective was to investigate the parameters contributing to customer churn (attrition) in the telecom industry. A Logistic Regression model was developed and validated on test data to predict customer churn.

  • Tools

Logistic Regression, Model Comparison, Predictive Analytics
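
A short sketch of the validation step is below. The file and column names are placeholders; the real model would follow variable selection and multicollinearity checks on the telecom data.

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, confusion_matrix

# Placeholder file and column names for the telecom churn data
df = pd.read_csv("cellphone.csv")
X, y = df.drop(columns="Churn"), df["Churn"]  # Churn assumed to be coded 1/0

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42, stratify=y)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Validate on the held-out test data
probs = clf.predict_proba(X_test)[:, 1]
print("AUC:", roc_auc_score(y_test, probs))
print(confusion_matrix(y_test, clf.predict(X_test)))
```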

Building a Supervised Model to Cross-sell Personal Loans

The objective of this exercise was to build a model using a supervised learning technique to identify profitable segments to target for cross-selling personal loans. Pilot campaign data of 20,000 customers was used, which included several demographic and behavioral variables. The model was further validated and a deployment strategy was recommended.

  • Tools

Random Forest, Data Mining, Pruning, Model Performance Measures
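
As a hedged sketch of the supervised step, here is an illustrative random forest in scikit-learn. The file name, target column, and depth limit (used here as a simple stand-in for pruning) are assumptions, not the project's actual settings.

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report

# Placeholder file / target: 1 = responded to the personal-loan offer, 0 = did not
df = pd.read_csv("pilot_campaign.csv")
X, y = df.drop(columns="PersonalLoan"), df["PersonalLoan"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42, stratify=y)

# Random forest with a cap on tree depth to keep individual trees from overfitting
rf = RandomForestClassifier(n_estimators=300, max_depth=6, random_state=42)
rf.fit(X_train, y_train)

# Validate on the hold-out set, then rank customers by predicted propensity
print(classification_report(y_test, rf.predict(X_test)))
propensity = pd.Series(rf.predict_proba(X_test)[:, 1], index=X_test.index)
print(propensity.sort_values(ascending=False).head())
```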