medX

Through our question-based, AI-powered telehealth platform, users can ask questions about their diagnosed conditions, making it easier for them to find the product and treatment information they need.

The problem medX solves

Feeling tired, frustrated, and worse than you did before? Most patients experience this when consulting telehealth services for the best treatments, because providers today see patients as a collection of billing codes rather than as people with simple yet hard-to-answer questions about their health. medX addresses those questions with its question-based (rather than product-based) UI, which connects the right combination of symptoms with the conditions that best reflect a patient's health. These questions, along with their corresponding responses, are produced by our bespoke AI-powered LangChain Large Language Model (LLM) pipeline built around a GPT-2-Medium model. We also compute embeddings for the LLM-produced sentences using the SentenceTransformers library, which handles the semantic analysis. Together, these pieces prepare our model to serve users who are looking for answers, and for a sense of trust, from our application.
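
As a rough illustration of the matching step above, the sketch below embeds a user's question with SentenceTransformers, picks the most semantically similar condition, and drafts a response with a GPT-2-Medium checkpoint loaded through Hugging Face Transformers (standing in for the full LangChain pipeline). The model names, condition list, and helper functions are illustrative assumptions, not our production code.

```python
# Sketch only: map a user question to the closest condition and draft an answer.
from sentence_transformers import SentenceTransformer, util
from transformers import pipeline

# Assumed embedding model for semantic analysis of user questions.
embedder = SentenceTransformer("all-MiniLM-L6-v2")

# GPT-2-Medium text generator used to draft the response text.
generator = pipeline("text-generation", model="gpt2-medium")

# Hypothetical condition descriptions the question is matched against.
conditions = [
    "Type 2 diabetes: elevated blood sugar, fatigue, increased thirst",
    "Hypertension: high blood pressure, headaches, dizziness",
    "Asthma: wheezing, shortness of breath, chest tightness",
]
condition_embeddings = embedder.encode(conditions, convert_to_tensor=True)

def match_condition(user_question: str) -> str:
    """Return the condition description most semantically similar to the question."""
    question_embedding = embedder.encode(user_question, convert_to_tensor=True)
    scores = util.cos_sim(question_embedding, condition_embeddings)[0]
    return conditions[int(scores.argmax())]

def draft_answer(user_question: str) -> str:
    """Draft a response grounded in the matched condition."""
    condition = match_condition(user_question)
    prompt = f"Patient question: {user_question}\nRelevant condition: {condition}\nAnswer:"
    return generator(prompt, max_new_tokens=80)[0]["generated_text"]

print(draft_answer("Why do I feel so thirsty and tired all the time?"))
```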

Challenges we ran into

Our biggest challenge wasn't the difficulty of producing an innovative product at a hackathon, but refining the viability of our service. Drawing a clear line between diagnosing patients, and thereby making assumptions about their care, and treating them as well-intentioned people who simply lack knowledge about their medical treatment gave us confidence that our project was both tangible and applicable to India's current telehealth market.

That is not to say the learning curves that come with innovation didn't send us on countless detours over the past 48 hours: we switched across multiple back-end and LLM providers, which pushed us to focus on the application's dependability. None of these challenges diminishes the potential of our tool, however. With additional fine-tuning of the LLM and more resources allocated to front-end development, subsequent versions of the application could include a chatbot to which users can express their concerns in their own words, allowing our predictive model to expand its semantic and sentiment analysis in addition to producing accurate medical treatment plans.

Tracks Applied (4)

Auth0 Track

Storing our user details in a straightforward, secure manner, we built our user management system using Auth0, prioritis…

Auth0
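
As a hedged sketch of the Auth0 piece, the snippet below shows one common way to verify an Auth0-issued access token on a Python back end using python-jose and the tenant's JWKS endpoint. The tenant domain and API audience are hypothetical placeholders, not our actual configuration.

```python
# Sketch only: validate an Auth0 access token before serving user data.
import json
import urllib.request

from jose import jwt

AUTH0_DOMAIN = "example-tenant.us.auth0.com"   # hypothetical tenant domain
API_AUDIENCE = "https://medx.example.com/api"  # hypothetical API identifier

def verify_token(token: str) -> dict:
    """Verify the token's signature and claims, returning its decoded payload."""
    # Fetch the tenant's public signing keys (JWKS).
    with urllib.request.urlopen(f"https://{AUTH0_DOMAIN}/.well-known/jwks.json") as resp:
        jwks = json.load(resp)

    # Select the key whose id matches the token header.
    kid = jwt.get_unverified_header(token)["kid"]
    signing_key = next(k for k in jwks["keys"] if k["kid"] == kid)

    # Decode and validate the signature, audience, and issuer.
    return jwt.decode(
        token,
        signing_key,
        algorithms=["RS256"],
        audience=API_AUDIENCE,
        issuer=f"https://{AUTH0_DOMAIN}/",
    )
```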

Best Use of APIs

Creating our API from scratch, with the help of Postman, minimized our projected costs for running this application, max…

Postman
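
For context on what building the API from scratch might look like, here is a minimal sketch of one endpoint, assuming a FastAPI back end; the route, request shape, and answer_question helper are hypothetical and only stand in for the real pipeline.

```python
# Sketch only: a single question-answering endpoint for the medX API.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="medX API (sketch)")

class QuestionRequest(BaseModel):
    user_id: str
    question: str

class AnswerResponse(BaseModel):
    matched_condition: str
    answer: str

def answer_question(question: str) -> tuple[str, str]:
    """Placeholder for the embedding + LLM pipeline sketched earlier."""
    return "Type 2 diabetes", "Increased thirst and fatigue are common early symptoms."

@app.post("/questions", response_model=AnswerResponse)
def ask_question(req: QuestionRequest) -> AnswerResponse:
    # Match the question to a condition and draft an answer with the LLM pipeline.
    condition, answer = answer_question(req.question)
    return AnswerResponse(matched_condition=condition, answer=answer)
```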

Best Postman Public Workspace

True to Postman's mission to connect APIs across several platforms seamlessly, we used Postman to test, develop, and dep…

Postman

Medical Track

Connecting patients and doctors through a telehealth platform that is sensitive to patients' healthcare plans is an inno…
