Who am I?
I am an engineering undergraduate, building a home for data teams around the world at Atlan.
What am I good at?
Visualizing and understanding problems as closely as possible, then finding innovative yet simple solutions. I always follow a problem-first, solution-second strategy.
Technically, I am very strong in the following.
What drives me?
Hackathons are the most primitive way of solving real-world problems, and solving problems with technology drives, defines, and describes me; it makes me feel like a wizard. I am a skilled, observant technocrat who watches the environment around him with a hawk's eye, so I am good at spotting deformities around me and thinking of innovative ways to solve them, a skill I sharpened over the many hackathons I have attended.
I have participated in dozens of hackathons, won a few, and been a top participant in all of them, including ETHIndia, Asia's biggest blockchain hackathon.
I built solutions as a Backend Engineering Intern at redcarpetup.com, a YCombinator 2015-funded fin-tech startup that helps students and people from rural backgrounds get small loans easily. The following were my key roles and responsibilities:
1. Building new APIs for the app and optimizing the performance of existing ones.
2. Recognizing patterns in user complaints and reporting anomalies.
3. Managing Airflow for daily reports.
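The anomaly reporting on user complaints could work along these lines: flag any day whose complaint count is far above the recent baseline. This is a minimal illustrative sketch with made-up data and a hypothetical `flag_anomalies` helper, not the actual production logic:

```python
from statistics import mean, stdev

def flag_anomalies(daily_counts, threshold_sigmas=2.0):
    """Return indices of days whose count exceeds mean + k * stdev.

    A simple z-score style check: anything more than `threshold_sigmas`
    standard deviations above the mean is reported as an anomaly.
    """
    mu = mean(daily_counts)
    sigma = stdev(daily_counts)
    return [i for i, count in enumerate(daily_counts)
            if count > mu + threshold_sigmas * sigma]

# Hypothetical daily complaint counts for one complaint category
counts = [12, 14, 11, 13, 95, 12, 10]
print(flag_anomalies(counts))  # -> [4]: the spike on day 4 stands out
```

A daily Airflow task could run a check like this over each complaint category and push flagged days into a report.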
Atlan is a data democratization company that helps data teams collaborate frictionlessly on data projects. We are creating a home for data teams—allowing them to truly democratize both internal and external data while automating repetitive tasks.
My contributions to the organization during the internship period were:
1. Currently owning metadata-recommendation bots, built on top of the Argo orchestration layer.
2. End-to-end maintenance of the orchestration and OLAP/OLTP engines behind Atlan's product offering.
3. Built a load-aware scaling interface for resource-intensive Python workloads using AWS EKS. This helped customers run data-science use cases within the existing ETL pipelining architecture in a cost-effective manner.
4. Built CLI-based internal tooling, using Click, that helped avoid cascading failures, fix issues with data, and restore business as usual in case of failures.
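The load-aware scaling in point 3 can be pictured as a replica-count decision of the kind Kubernetes' Horizontal Pod Autoscaler makes: scale the number of workers proportionally to observed load. This is a minimal sketch of that formula only; the function name, load metric, and bounds are illustrative assumptions, not Atlan's actual implementation:

```python
import math

def desired_replicas(current_replicas, current_load, target_load,
                     min_replicas=1, max_replicas=10):
    """HPA-style rule: desired = ceil(current * observed / target),
    clamped to [min_replicas, max_replicas]."""
    raw = math.ceil(current_replicas * current_load / target_load)
    return max(min_replicas, min(max_replicas, raw))

# Observed load (0.9) is nearly double the target (0.5),
# so 3 replicas scale up to 6.
print(desired_replicas(3, current_load=0.9, target_load=0.5))  # -> 6
```

Driving a rule like this from queue depth or CPU metrics lets resource-intensive workloads burst when needed and scale back down when idle, which is where the cost savings come from.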