
Jatin Unecha

Based on a strong performance in the Higher Secondary School Board Examination, I received admission to the Bachelor of Engineering in Computer Engineering program at the prestigious Pune Institute of Computer Technology (PICT), the top-ranked school in the state for Computer Science research and education. Excited by the opportunity to interact with many renowned professors and brilliant students, I began expanding my knowledge in important areas such as Advanced Data Structures and Algorithms, Theory of Computation, and Large Scale Data Processing using SQL and NoSQL databases. I also took significant steps towards building interpersonal, teamwork, and industry skills. As a member of the IEEE student branch at my college, I actively organized and participated in co-curricular activities at my university, such as inter-college programming contests, debates, and quiz competitions.

During the third year of the Bachelor of Engineering program, I started studying advanced courses in Distributed Computing, Parallel Algorithms, Large Scale Data Processing, and Machine Learning. After learning the foundational concepts of these topics, I became highly interested in the power of Distributed Computing to provide high-performance architectures for large scale data processing and complex parallel computations that cannot be handled easily by traditional architectures. I deepened my knowledge of the subject by reading reference books and research papers. In the same year, I presented a research project on “Personalized Recommendations using Distributed Computing” at a research seminar at my university. I surveyed the latest research in Distributed Computing and Machine Learning algorithms and implemented a book recommendation system using a Collaborative Filtering algorithm on the Spark framework. The project and seminar presentation were highly appreciated by students and faculty members, and gave me an excellent opportunity to perform methodical research and present my findings to an audience of 100+ students and faculty members.
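To give a concrete flavor of that project, the following is a minimal sketch of collaborative filtering on Spark using MLlib's Alternating Least Squares (ALS) implementation in Java. The column names, hyperparameters, and the ratings.csv input are illustrative assumptions, not the exact setup used in the seminar project.

```java
import org.apache.spark.ml.recommendation.ALS;
import org.apache.spark.ml.recommendation.ALSModel;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class BookRecommender {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("BookRecommendations")
                .getOrCreate();

        // Hypothetical ratings file with columns: userId, bookId, rating
        Dataset<Row> ratings = spark.read()
                .option("header", "true")
                .option("inferSchema", "true")
                .csv("ratings.csv");

        // ALS learns latent factors for users and books from the rating matrix
        ALS als = new ALS()
                .setUserCol("userId")
                .setItemCol("bookId")
                .setRatingCol("rating")
                .setRank(10)
                .setMaxIter(10)
                .setRegParam(0.1)
                .setColdStartStrategy("drop");

        ALSModel model = als.fit(ratings);

        // Produce the top 5 book recommendations for every user
        Dataset<Row> recommendations = model.recommendForAllUsers(5);
        recommendations.show(false);

        spark.stop();
    }
}
```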

In July 2020, after a rigorous selection process, I was selected for an internship at BMC Software Inc. to create an Automatic Anomaly Detection System for a distributed environment. I developed data processing components using Java and Kubernetes to normalize incoming data, apply Cosine Similarity algorithms, and detect patterns indicating anomalies in the system. In this internship, I worked with a team of 2 other undergraduate students and learned important technical, interpersonal, and teamwork skills while working in an innovative corporate environment at BMC Software Inc. In the final year of the Bachelor of Engineering program, in October 2020, I was selected to work on an industry-sponsored project to create a new Distributed Financial Transaction Processing System at Sarvatra Technologies Inc. In this project, I worked with a team of 3 other undergraduate students to design and develop a fault-tolerant and scalable distributed platform for processing millions of financial transactions using Java, Kafka, Spark and Cassandra.
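At the heart of the anomaly detection work is a simple idea: compare a live metric vector against a baseline profile and flag large divergences. The sketch below illustrates that step in plain Java; the metric names, baseline values, and 0.95 threshold are hypothetical examples, not the production configuration.

```java
public class CosineAnomalyDetector {

    // Cosine similarity between two equal-length metric vectors
    static double cosineSimilarity(double[] a, double[] b) {
        double dot = 0.0, normA = 0.0, normB = 0.0;
        for (int i = 0; i < a.length; i++) {
            dot += a[i] * b[i];
            normA += a[i] * a[i];
            normB += b[i] * b[i];
        }
        return dot / (Math.sqrt(normA) * Math.sqrt(normB));
    }

    // A sample is anomalous when it diverges too far from the baseline profile
    static boolean isAnomaly(double[] baseline, double[] sample, double threshold) {
        return cosineSimilarity(baseline, sample) < threshold;
    }

    public static void main(String[] args) {
        // Hypothetical normalized metrics: CPU, memory, request latency, error rate
        double[] baseline = {0.45, 0.50, 0.30, 0.02};
        double[] healthy  = {0.48, 0.52, 0.28, 0.03};
        double[] degraded = {0.90, 0.20, 0.95, 0.40};

        System.out.println("healthy anomalous?  " + isAnomaly(baseline, healthy, 0.95));  // false
        System.out.println("degraded anomalous? " + isAnomaly(baseline, degraded, 0.95)); // true
    }
}
```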

After completing the undergraduate program in August 2021, I started working as an Associate Software Engineer at SE2 LLC, a leader in the US life and annuities insurance industry, headquartered in Topeka, Kansas. At SE2, I am part of the Policy Management team, responsible for feature enhancements to Policy Management use cases and for applying data analytics to drive business decisions. After joining the team, I was excited to work on large scale applications of the technologies I had used in my undergraduate projects. I learned the existing implementation and started creating large scale Web Service APIs for the Policy Administration product using technologies such as Java, Spring Boot, REST Web Services, Oracle, and AWS. After gaining a strong foundation in the existing project implementation, I took the initiative to expand my knowledge of Spark, Kafka, and AWS through self-study courses on Udemy and Coursera. I then collaborated with 2 other team members to create a data processing framework that processes large volumes of insurance policy data, aggregates it, and identifies insights to drive business decisions. Working on these projects in real-world applications has reinforced my learning and expanded my understanding of Distributed Computing and Large Scale Data Processing.
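While the Policy Administration APIs themselves are proprietary, a minimal Spring Boot sketch of the kind of Web Service API described above might look like the following. The Policy record, PolicyRepository interface, and endpoint paths are hypothetical placeholders; in practice such services would be backed by Oracle and deployed on AWS.

```java
import java.util.List;

import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

// Hypothetical domain model and data access layer, for illustration only
record Policy(String policyId, String holderName, String status) {}

interface PolicyRepository {
    Policy findById(String policyId);
    List<Policy> findByStatus(String status);
}

@RestController
@RequestMapping("/api/policies")
public class PolicyController {

    private final PolicyRepository repository;

    public PolicyController(PolicyRepository repository) {
        this.repository = repository;
    }

    // GET /api/policies/{policyId} - fetch a single policy by its identifier
    @GetMapping("/{policyId}")
    public Policy getPolicy(@PathVariable String policyId) {
        return repository.findById(policyId);
    }

    // GET /api/policies/status/{status} - list all policies in a given state
    @GetMapping("/status/{status}")
    public List<Policy> getPoliciesByStatus(@PathVariable String status) {
        return repository.findByStatus(status);
    }
}
```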

Projects

Streamlive

A decentralized livestreaming platform for video and payments. Built with IPFS, Fleek, Livepeer, Superfluid, Filecoin, Alchemy, and Spheron.

Skills

Python
Java
Angular
RESTful API Design
Oracle SQL

Experience

  • SE2 - Associate Software Engineer
    August 2021 - Present

Designed and implemented large scale Web Service APIs for the Policy Administration product using Java, Spring Boot, REST Web Services, Oracle, and AWS. Designed and implemented a data processing framework using Java, Kafka, Spark, and Scala to process large volumes of insurance policy data, aggregate it, and identify insights that drive business decisions. Worked with cross-functional teams from engineering, product, and finance to understand business requirements and implement new features and enhancements in existing products.

  • BMC Software - Project Intern
    July 2020 - November 2020

During the final year of the Bachelor of Engineering program, worked as an undergraduate intern on a project to create an Automatic Anomaly Detection System for a distributed environment. Developed architecture components using Java and Kubernetes to read data from distributed data sources, normalize the data, apply Cosine Similarity algorithms, and detect patterns indicating anomalies in the system.