Diving In Too Deep

Diving into Deep Learning and Testing the Waters of Solidity

The problem Diving In Too Deep solves

As part of SSoC'22, I was able to contribute to two main repositories: 1) Deep Learning - Simplified and 2) Solidity Pathshala.
My contributions were as follows:

DL - Simplified Repository:

  • Fruits Classification (PR=>#92): In this project, my main objective was to build a Deep Learning model able to classify 131 different types of fruits and vegetables. A dataset from Kaggle was provided. I built 4 different models (CNN, VGG16, Inception-ResNet, MobileNet), of which the CNN model gave the highest accuracy, 95.99%.

  • Quality Prediction in Mining Process (PR=>#139): As the name suggests, in this project I had to create a model that predicts the purity of an ore based on lab results performed on the crusher data. The dataset contained information on ores extracted using the reverse flotation process. I created 5 different models, of which the highest accuracy achieved was 99.4%.
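The actual training notebooks live in the PRs above. Purely as a minimal, illustrative sketch (not the submitted code), here is what the core of such a CNN classifier does under the hood, written in plain NumPy: a convolution layer, ReLU, max-pooling, and a dense softmax head over 131 classes. All shapes, kernel counts, and weights here are toy assumptions; the real models were trained on the Kaggle image dataset.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d(x, kernels):
    """Valid 2-D convolution of a single-channel image with a bank of kernels."""
    kh, kw = kernels.shape[1:]
    h, w = x.shape
    out = np.empty((kernels.shape[0], h - kh + 1, w - kw + 1))
    for k, kern in enumerate(kernels):
        for i in range(h - kh + 1):
            for j in range(w - kw + 1):
                out[k, i, j] = np.sum(x[i:i + kh, j:j + kw] * kern)
    return out

def max_pool(x, size=2):
    """Downsample each feature map by taking the max over size x size windows."""
    c, h, w = x.shape
    x = x[:, :h - h % size, :w - w % size]
    return x.reshape(c, h // size, size, w // size, size).max(axis=(2, 4))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def forward(image, kernels, W, b):
    feat = np.maximum(conv2d(image, kernels), 0)  # conv + ReLU
    feat = max_pool(feat)                         # spatial downsampling
    return softmax(W @ feat.ravel() + b)          # dense + softmax over classes

# Toy 16x16 grayscale "fruit image" and random (untrained) weights.
image = rng.random((16, 16))
kernels = rng.standard_normal((4, 3, 3)) * 0.1
n_classes = 131
feat_dim = 4 * 7 * 7  # 4 maps of (16 - 3 + 1) // 2 = 7
W = rng.standard_normal((n_classes, feat_dim)) * 0.01
b = np.zeros(n_classes)

probs = forward(image, kernels, W, b)
print(probs.shape)  # (131,)
```

In the real projects this forward pass is handled by a framework such as Keras, and the transfer-learning models (VGG16, Inception-ResNet, MobileNet) reuse pretrained convolutional weights instead of random ones.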

Solidity Pathshala:

  • I have recently started learning the Solidity language. In this repo, I added a few programs that showcase the different datatypes in Solidity, how to declare and call a function, and so on.

  • The programs I submitted were (PR=>#33):

  1. Reverse_Digits

  2. Check_Palindrome

  3. Find_Average

  4. Sum_of_Digits
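The submitted sources are Solidity contracts in the PR; the underlying logic of these four programs is sketched below in Python for illustration (function names mirror the program names above, but the signatures are my own assumptions, not the submitted code).

```python
def reverse_digits(n: int) -> int:
    """Reverse the decimal digits of a non-negative integer."""
    rev = 0
    while n > 0:
        rev = rev * 10 + n % 10  # append the last digit of n to rev
        n //= 10                 # drop the last digit of n
    return rev

def check_palindrome(n: int) -> bool:
    """A number is a palindrome if it equals its digit-reversal."""
    return n == reverse_digits(n)

def find_average(values: list) -> float:
    """Arithmetic mean of a list of numbers."""
    return sum(values) / len(values)

def sum_of_digits(n: int) -> int:
    """Sum the decimal digits of a non-negative integer."""
    total = 0
    while n > 0:
        total += n % 10
        n //= 10
    return total

print(reverse_digits(1234))    # 4321
print(check_palindrome(1221))  # True
print(find_average([2, 4, 6])) # 4.0
print(sum_of_digits(1234))     # 10
```

In Solidity the same digit loops work with `uint` arithmetic (`%` and `/`), since integer division there truncates just like Python's `//`.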

Challenges I ran into

Both Deep Learning and Solidity were very new to me, so I referred to a lot of resources to learn about them. I wasn't sure which model architecture would work well or how to design the architecture for the CNN model. Transfer learning was also pretty overwhelming at first. Books like "Deep Learning with Python" by François Chollet helped me understand the concepts better.
For Solidity, I watched a few YouTube videos to get started.