Scaling an AI Interview Chatbot for Enterprise Training
I enhanced the conversational UI of a Claude Opus-powered chatbot through improved interface accessibility, new chat transcript downloads, and optimized interaction flows.
Role:
Product Designer
Team:
1 designer, 1 product manager, 6 engineers
Tools:
Figma, Claude 3 Opus, Claude 3.5 Haiku
Skills:
UX Design, UI Design, Conversation Design, Accessibility
Timeline:
June - August 2025
Challenge
A new AI-powered chatbot designed to help interviewers practice their skills was about to face its biggest test. The tool allowed users to ask questions while the AI responded as a realistic interviewee, but it had just completed beta testing and needed significant improvements before a high-stakes deployment.
The HQ training team was preparing to travel nationwide, conducting four-day training sessions where the chatbot would be used intensively—two hours daily across six months, training over 3,000 learners total. The product was new and showed promise, but critical usability and accessibility issues threatened its success at scale.
The Audit
I conducted a comprehensive usability audit that uncovered four critical issues that would severely impact the user experience during high-volume training sessions:
Broken Loading Feedback
The chatbot relied on a browser loading bar at the top of the screen to indicate processing, but this was far removed from the input area where users' attention was focused. This violated a core conversational UI principle: users need immediate acknowledgment of their input within the conversation thread to maintain the natural turn-taking flow they expect from chat interfaces. With increasing user loads potentially causing longer response times, users had no clear indication their question was being processed.
Inaccessible Submit States
Users could submit questions by pressing Enter or by clicking a submit button. Both methods were temporarily disabled during processing, but the submit button never visually indicated this state change, failing WCAG accessibility standards. This created an inaccessible, frustrating experience for users, who could click repeatedly with no feedback, a critical flaw in conversational interfaces where clear action confirmation is essential.
Unclear Conversation Boundaries
After users marked their interview complete, the conversation ended and no further questions could be submitted. However, the submit button remained visually active and clickable, leading users to attempt interactions that couldn't work. This violated the conversational design principle that system capabilities should always be clearly communicated to users.
Missing Export Functionality
The chatbot lacked a crucial feature for the training context—transcript export. This missing conversation archiving capability meant trainers had no way to capture or evaluate the interview sessions, breaking the connection between the conversational experience and the broader training workflow.
Solution
I applied conversational UI best practices to redesign the interface with four key improvements:
Enhanced Loading States
I designed contextual loading indicators positioned directly within the conversation thread, following the principle that users need immediate acknowledgment of their input to maintain conversational flow. The new loading states included subtle animations and clear messaging that set appropriate expectations for response times—critical for preventing the "dead air" problem that breaks user engagement in chat interfaces.
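A minimal sketch, in plain TypeScript, of how an in-thread indicator like this can be wired up. The endpoint, class names, and helper names here are illustrative assumptions rather than the production implementation:

```typescript
// Stand-in for the real model call; the endpoint is an assumption for this sketch.
async function fetchInterviewReply(question: string): Promise<string> {
  const res = await fetch("/api/interview", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ question }),
  });
  return (await res.json()).reply;
}

// Render a single chat bubble.
function renderMessage(author: "interviewer" | "interviewee", text: string): HTMLElement {
  const el = document.createElement("div");
  el.className = `message message--${author}`;
  el.textContent = text;
  return el;
}

// Show a typing indicator inside the thread, so the user's input is
// acknowledged right where their attention already is.
function showTypingIndicator(thread: HTMLElement): HTMLElement {
  const indicator = document.createElement("div");
  indicator.className = "message message--pending";
  indicator.setAttribute("role", "status"); // politely announced by screen readers
  indicator.textContent = "Interviewee is typing…";
  thread.appendChild(indicator);
  indicator.scrollIntoView({ block: "end" });
  return indicator;
}

async function askQuestion(thread: HTMLElement, question: string): Promise<void> {
  thread.appendChild(renderMessage("interviewer", question)); // echo the user's turn immediately
  const indicator = showTypingIndicator(thread);
  const reply = await fetchInterviewReply(question);
  indicator.replaceWith(renderMessage("interviewee", reply)); // swap the indicator for the answer
}
```

Keeping the indicator inside the thread means the acknowledgment scales naturally with slower responses: the pending bubble simply stays visible until the reply replaces it.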
Accessible Submit Button Design
I redesigned the submit button with proper accessibility states, including visual disabled states and appropriate ARIA labels. The solution implemented a clear three-state pattern (idle, loading, disabled) that follows WCAG guidelines while providing crucial feedback to all users. This addresses the fundamental requirement in conversational interfaces that every user action must receive clear confirmation.
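The pattern looks roughly like the sketch below; the labels, copy, and state names mirror the (idle, loading, disabled) model described above, but the details are illustrative rather than the shipped component:

```typescript
type SubmitState = "idle" | "loading" | "disabled";

// Keep the visual state, the DOM disabled property, and the accessible name
// in sync, so keyboard, mouse, and screen-reader users all get the same feedback.
function setSubmitState(button: HTMLButtonElement, state: SubmitState): void {
  switch (state) {
    case "idle":
      button.disabled = false;
      button.setAttribute("aria-label", "Send question");
      button.textContent = "Send";
      break;
    case "loading":
      button.disabled = true; // repeated clicks and Enter presses are ignored, visibly
      button.setAttribute("aria-label", "Sending question, please wait");
      button.textContent = "Sending…";
      break;
    case "disabled":
      button.disabled = true; // the interview has been marked complete
      button.setAttribute("aria-label", "Interview complete, questions can no longer be sent");
      button.textContent = "Interview complete";
      break;
  }
}
```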
Clear Conversation Boundaries
I created a distinct end-state design that clearly communicates when the conversation has concluded. The interface now transitions to a read-only mode that maintains conversation history while preventing new inputs—following the conversational design principle that system capabilities should always be transparent to users.
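A sketch of that transition, reusing the setSubmitState helper from the previous example; the copy and class names are illustrative assumptions:

```typescript
// Transition the interface to a read-only end state while keeping the
// conversation history intact and scrollable.
function endConversation(
  thread: HTMLElement,
  input: HTMLTextAreaElement,
  submit: HTMLButtonElement,
): void {
  input.disabled = true;
  input.placeholder = "This interview has ended.";

  setSubmitState(submit, "disabled"); // the three-state pattern from the previous sketch

  // Mark the boundary inside the thread itself so it is both visible and announced.
  const notice = document.createElement("div");
  notice.className = "message message--system";
  notice.setAttribute("role", "status");
  notice.textContent = "Interview complete. You can review or download the transcript below.";
  thread.appendChild(notice);
}
```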
Export Functionality
I designed an intuitive export feature that integrates seamlessly into the conversation flow, following patterns users recognize from messaging platforms. The export includes conversation metadata and structured formatting to support trainer evaluation workflows while maintaining the conversational context.
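The sketch below shows one way the export can work client-side: serialize the conversation with basic metadata and trigger a download. The field names and plain-text format are assumptions for illustration, not the final export spec:

```typescript
interface TranscriptMessage {
  author: "interviewer" | "interviewee";
  text: string;
  sentAt: string; // ISO 8601 timestamp
}

interface Transcript {
  learnerName: string;
  startedAt: string;
  endedAt: string;
  messages: TranscriptMessage[];
}

// Serialize the finished conversation and trigger a client-side download.
function downloadTranscript(transcript: Transcript): void {
  const lines = [
    "Interview practice transcript",
    `Learner: ${transcript.learnerName}`,
    `Started: ${transcript.startedAt}`,
    `Ended: ${transcript.endedAt}`,
    "",
    ...transcript.messages.map(
      (m) => `${m.author === "interviewer" ? "Q" : "A"} [${m.sentAt}]: ${m.text}`,
    ),
  ];

  const blob = new Blob([lines.join("\n")], { type: "text/plain" });
  const url = URL.createObjectURL(blob);
  const link = document.createElement("a");
  link.href = url;
  link.download = `interview-transcript-${transcript.startedAt.slice(0, 10)}.txt`; // date-only filename
  link.click();
  URL.revokeObjectURL(url);
}
```

A structured format like this keeps the questions, the AI interviewee's answers, and timing together, which is what trainers need when evaluating a session after the fact.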
Additional Improvements:
- Added strategic white space after conversations to improve scrolling flexibility
- Enhanced overall readability through improved typography and spacing
- Created clear visual hierarchy to guide users through the interview process
Impact
The redesigned chatbot successfully supported the nationwide training initiative, providing a smooth and accessible experience for more than 3,000 learners. The improvements addressed critical scalability concerns while establishing a strong foundation for conversational interface best practices.
Key achievements:
01.
Eliminated accessibility barriers that would have prevented some users from effectively participating in training
02.
Designed loading states that gracefully handled variable response times as user volume scaled
03.
Created export functionality that supported trainers' evaluation workflows
04.
Successfully prepared a new product for high-stakes, high-volume deployment
Key Learnings
This project reinforced the importance of applying conversational design principles when creating AI-powered interfaces for enterprise use. Working with a chatbot in a training context required understanding how users expect turn-taking behavior and immediate feedback—core elements that become critical when serving thousands of users simultaneously.
The experience highlighted how conversational UI design must balance familiar chat patterns with accessibility requirements and business needs. Success required implementing proper loading acknowledgment, clear system status communication, and conversation archiving—fundamental principles that transform a functional tool into an engaging, accessible experience.