Outfit Aura addresses the complexities of online fashion shopping by making it easier for users to find outfits that align with their unique preferences, cultural backgrounds, and current trends. Traditional online shopping can feel impersonal and overwhelming, with users often unsure of how clothes will look on them or whether an outfit will suit their style. Outfit Aura solves this by delivering a tailored, interactive, and visually immersive shopping experience, giving users confidence in their purchases.
- Personalized Outfit Discovery: Users receive outfit recommendations based on personal style, cultural influences, and current trends, simplifying the selection process.
- Enhanced Shopping Experience with AI Interaction: An interactive Multilingual Conversational Fashion Outfit Generator provides real-time feedback and customization, making style exploration fun and engaging.
- Confidence through Virtual Try-On: The Virtual Try-On feature allows users to see how outfits look on them before buying, reducing uncertainty and return rates.
- Always Up-to-Date: Our project adapts to trends from social media and seasonal events, ensuring users stay fashion-forward and relevant.
- Streamlined Shopping with Amazon Integration: Recommended outfits are linked directly to Amazon, enabling easy and convenient purchases within the app.
- Culturally Resonant Recommendations: By reflecting cultural styles and festive highlights, our project creates a more personalized shopping experience that resonates with diverse backgrounds.
One of the biggest challenges we faced while building Outfit Aura was integrating multiple modules across a diverse tech stack, including ReactJS, Node.js, Flask, MongoDB, SQL, Gemini, OpenAI, and Stable Diffusion. Each module had its own requirements and constraints, which made it difficult to ensure seamless communication and data flow between them.
For example, integrating the conversational model using OpenAI with the front-end in React required careful API management to handle requests and real-time responses smoothly. Additionally, linking the Stable Diffusion-powered Virtual Try-On feature to React while managing heavy image-processing tasks on the backend posed significant performance challenges.
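The React-to-backend bridge for the conversational model can be sketched as a small Flask endpoint. This is a minimal illustration, not our exact code: `generate_reply` is a stand-in for the real OpenAI call, and the route name is hypothetical.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

def generate_reply(message: str, language: str) -> str:
    """Stand-in for the real OpenAI chat-completion call.
    In production this would forward `message` to the model and
    return its response in the user's requested language."""
    return f"[{language}] styled reply to: {message}"

@app.route("/api/chat", methods=["POST"])
def chat():
    # The React frontend POSTs JSON like {"message": "...", "language": "en"}.
    payload = request.get_json(silent=True) or {}
    message = payload.get("message", "").strip()
    if not message:
        return jsonify({"error": "message is required"}), 400
    reply = generate_reply(message, payload.get("language", "en"))
    return jsonify({"reply": reply})
```

Keeping the model call behind a single JSON endpoint like this lets the React side treat it as an ordinary fetch, while the backend manages API keys, retries, and rate limits in one place.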
- Separate Servers for Flask and Node.js: To keep the processing load balanced and ensure smooth data handling, we hosted our Flask and Node.js backends on separate servers. This separation allowed us to better handle requests specific to each service and avoid overloading any single server.
- Threading in Flask: For the content filtering algorithm implemented in Flask, we introduced threading to manage multiple requests concurrently. This boosted our server’s performance, especially during high-demand tasks like filtering and processing generated images.
- Modularizing the Codebase: By breaking the project into smaller, modular services (e.g., separate Flask endpoints for Stable Diffusion and OpenAI interactions), we made each component more manageable and easier to troubleshoot.
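The threading approach above can be sketched with Python's standard `concurrent.futures`. The `passes_filter` predicate here is a placeholder for our actual content filter, shown only to illustrate the concurrency pattern.

```python
from concurrent.futures import ThreadPoolExecutor

def passes_filter(image_id: str) -> bool:
    """Placeholder for the real content filter, which would inspect a
    generated image and reject unsafe or low-quality results."""
    return not image_id.startswith("bad_")

def filter_images(image_ids):
    """Run the filter over many generated images concurrently.
    Threads suit this workload because the real filter spends most of
    its time waiting on I/O (fetching images, calling services)."""
    with ThreadPoolExecutor(max_workers=8) as pool:
        keep = pool.map(passes_filter, image_ids)
    return [img for img, ok in zip(image_ids, keep) if ok]
```

For example, `filter_images(["img_1", "bad_2", "img_3"])` returns `["img_1", "img_3"]`, with the three checks running in parallel worker threads.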
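The modular split can be expressed with Flask blueprints, one per service. The route names and stubbed handlers below are illustrative assumptions, not our exact endpoints.

```python
from flask import Flask, Blueprint, jsonify

# Each AI integration lives in its own blueprint so it can be
# developed, tested, and debugged in isolation.
diffusion_bp = Blueprint("diffusion", __name__, url_prefix="/diffusion")
chat_bp = Blueprint("chat", __name__, url_prefix="/chat")

@diffusion_bp.route("/tryon", methods=["POST"])
def virtual_tryon():
    # Stub: the real handler would invoke Stable Diffusion here.
    return jsonify({"status": "queued"})

@chat_bp.route("/message", methods=["POST"])
def message():
    # Stub: the real handler would call the OpenAI conversational model.
    return jsonify({"reply": "placeholder"})

def create_app():
    app = Flask(__name__)
    app.register_blueprint(diffusion_bp)
    app.register_blueprint(chat_bp)
    return app
```

Because each blueprint owns its own URL prefix, a bug in the try-on pipeline can be traced and fixed without touching the conversational endpoints.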
Tracks Applied (3)
- Major League Hacking
- GitHub Education
- Google For Developers