Conclusion




Users are the driving force behind every type of software development, and AI assistants are no exception. Conversational AI teams have a unique window into how users interact with an assistant, and successful teams channel conversation data into development decisions and model training. 

But as we’ve seen, conversations are only part of the equation—you can’t have CDD without development! And to that end, engineering best practices make up the other half of a comprehensive approach to building AI assistants. Teams practicing CDD work in short, iterative development cycles, using automated testing and CI/CD to ensure that updates are reliable and predictable.

Incorporating these practices into your team’s workflow is a journey, and most teams are already using many of them. The key is to view developing AI assistants as a partnership between your users and your development team, one that starts early and continues throughout the development process. With that mindset, your team can make the culture shift toward conversation-driven development and build a framework for creating AI assistants that truly help users.



CDD Checklist

Share

  • Conduct a user test with internal testers
  • Conduct a user test with a focus group of real users

Review

  • Read conversations
  • In Rasa X, use filters to surface important conversations
  • Identify issues that need to be addressed
  • Identify successful conversations that can be turned into training stories and tests

Annotate

  • Label user messages and add them to training data
  • Convert successful conversations into training stories
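
In Rasa, annotated messages and stories live in YAML training-data files. As a rough sketch of the two annotation steps above (the intent, action, and story names here are illustrative, not from the playbook):

```yaml
# nlu.yml — user messages labeled with an intent and added to training data
nlu:
- intent: check_order_status
  examples: |
    - where is my order?
    - has my package shipped yet?

# stories.yml — a successful conversation converted into a training story
stories:
- story: order status happy path
  steps:
  - intent: greet
  - action: utter_greet
  - intent: check_order_status
  - action: action_check_order
```

Keeping annotations in version-controlled files like these means every labeling decision goes through the same review process as code changes.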

Fix

  • Connect your assistant to version control
  • Make updates to address issues uncovered during review 

Test

  • Establish a CI/CD pipeline
  • Make automated tests part of your CI/CD process
  • Institute a code review process
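
One way to wire these three items together is a workflow that runs Rasa's built-in validation and test commands on every pull request. A minimal sketch, assuming GitHub Actions and a standard Rasa project layout (file paths and version pins are illustrative):

```yaml
# .github/workflows/ci.yml — runs on every pull request, so updates
# are validated and tested before they reach users
name: Rasa CI
on: pull_request
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
    - uses: actions/checkout@v4
    - uses: actions/setup-python@v5
      with:
        python-version: "3.10"
    - run: pip install rasa
    # check the domain, NLU data, and stories for conflicts
    - run: rasa data validate
    # train a model so the tests run against the latest changes
    - run: rasa train
    # run end-to-end test stories
    - run: rasa test
```

Pairing this pipeline with required code review on the same pull requests covers the last item on the list: nothing merges until both the automated tests and a human reviewer sign off.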

Track

  • Identify proxy metrics as well as top-level statistics to measure success
  • Use tags to label when events occur in conversations


Additional Resources
