Webinar
47:55

How to Use Integrated Version Control in Rasa X

In this recorded webinar, Ty Dunn, Product Manager at Rasa, presents a 45-minute demo of Integrated Version Control, a feature that allows developers to connect Rasa X with a remote Git repository.

Integrated Version Control allows teams to track updates to training data and incorporate Rasa X into automated deployment workflows. During the webinar, we walk through the process of setting up Integrated Version Control and share best practices for building AI assistants, with an emphasis on continuous iteration and automated testing.

In this webinar we covered:

  • How Rasa X helps product teams turn conversations into training data
  • Why you should incorporate testing, version control, and CI/CD into your development process—and how to do it with Rasa X
  • How to set up Integrated Version Control and push changes to production

Transcript:

Welcome everyone. Thank you for joining us on the call today. I'm Karen White with the marketing team at Rasa and I'm joined by Ty Dunn, our product manager. So today we're going to be dedicating 45 minutes to talking about a new feature in Rasa X: integrated version control. So this feature syncs your Rasa X instance to a remote Git repository so you can version control your training data, and it also allows you to hook into downstream workflows like CI/CD, branching, and automated testing. So we have a lot of exciting content to cover today, but before we begin, let's quickly go over the agenda and how to participate during today's webinar.

If we could go to the next slide please. Thank you. So for those of you who might be new to Rasa or Rasa X, we're going to start with a quick overview of our mission, and then we'll narrow our focus to Rasa X. Since we launched Rasa X in May, we've seen wider and wider adoption, especially from product teams who need a tool set that's purpose built for maintaining and improving AI assistants that are running in production.

So we'll talk about how Rasa X helps you go from a basic bot to an advanced assistant that's capable of handling business critical tasks. And then our latest addition to that is integrated version control, so we'll give you all the details about what's new. But one of the things that we're actually most excited about with integrated version control doesn't happen within Rasa X at all. When you connect Rasa X with Git based platforms like GitHub, that lets you hook into all kinds of downstream workflows for managing your AI assistant according to software development best practices. So we're talking about CI/CD, building features on branches, and incorporating automated testing into your deployments.

So next on our agenda, we're going to do a live demo. Ty's going to walk us through loading an assistant into Rasa X using integrated version control, and take us through some of the workflows to help you get the most out of the tool. And then last, we'll answer questions from the audience. We'll start with questions that were pre-submitted by email and then we'll move to live Q&A. You'll notice that all of you in the audience are muted at the moment. Please keep yourself muted, but at any time during the webinar you can ask a question in the chat. You don't have to wait until the end. You can open the chat from the meeting control bar, and when we see your questions, our moderators will add them to the list and we will answer as many as we have time for. So with that I'll hand it over to Ty.

Thanks Karen. Promise I'm not going to be quite as eloquent as that, but I'll do my best. Before we jump into Rasa X, I always like to set things up by bringing you back to the mission that Rasa is on, which is to empower all makers to create a system that works for everyone. And I think a really cool example of that is that Rasa X has been downloaded in over 135 countries. I pulled the stats on this earlier this week because I was curious about how many people around the world had been using it. And it really showed me how global the community of Rasa is and just the number of people that are using Rasa X to build level three assistants.

And the really cool part about this is that it really ties to that mission of building AI systems that work for everyone. Because I think oftentimes the problems that we, our friends, and our families face in our communities are best solved by us. So it's cool to see that around the world, people in the community are solving problems in their own communities.

And so Rasa X, for those 135 different countries, is empowering people to build mission-critical assistants. And what we mean by mission-critical assistants are ones that are used in production. Obviously they're helping people, but not just helping people: they're really important to the enterprise that's running them or the startup that's building a conversational interface. An important assistant, not just a toy project. And so we enable people to do that using Rasa X.

And so what does Rasa X allow you to do? In the past, most assistants built with other platforms and other ways of building chatbots were limited to level one notification assistants, which are one-way conversations, or level two assistants, FAQ assistants as we often call them, chatbots that can only handle one-turn conversations. Where we've moved with Rasa specifically is being able to handle multi-turn conversations that bring in context. So you're not just replying to what the user just said; the assistant is making a decision on what to do next based on earlier conversation turns: one, two, or three turns ago. And every day at Rasa we're bringing more and more context into the assistants you can build.

However, as we started building Rasa Open Source, we recognized that just having a framework that is fully customizable and free to download anywhere wasn't enough to reach level three and push the limits of level three. And so that's why we built Rasa X. If you saw the recent blog post from Rasa's great marketing team on building open source chatbots, Better Together with Rasa Open Source and Rasa X, you'll have seen a graph similar to this, where we talk about how, when you first get started with Rasa Open Source, you build locally on your machine, your laptop, and you build a minimum viable assistant, where the most important happy paths are covered by that assistant.

Just the things you expect the user to say, the business logic that you know, users who follow the flows exactly as you intend. You define those locally using Rasa Open Source, and we have plenty of material showing how to get to that stage. And once you are at that stage is when you deploy to a server—you actually serve your assistant to the world and you improve it using Rasa X.

And so the idea is that after you've built this minimum viable assistant, you start talking to it yourself. And by talking to it yourself in Rasa X, you're actually starting to collect training data which you can use to improve the quality of your assistant. Once it gets good enough, you gain the confidence to share it with test users. Rasa X has this really cool feature, which we'll show, where you can quickly get a link accessible by anyone on the internet you want to share it with, who can start talking to your assistant. And your assistant gets better and better until you're ready to hook it up to whatever channel, whether that's Slack or Facebook Messenger, and serve it to real users, maybe after some subset of beta testing at first and then expanding over time.

The idea behind Rasa X is that you're going to be continually improving your assistant. The only way that this graph occurs is if you're following this loop over and over again. The first step of this loop is to collect conversations between your users and your assistant; we'll look at what that looks like in Rasa X. Once you've done that, you review your conversations and improve your system based on what you learn. Sometimes that's taking messages that come in, adding them to your NLU training data, and using them to improve the performance of your NLU intent classification. And sometimes it's looking through patterns of things going wrong in conversations and improving your stories so your assistant can handle more and more unhappy paths.

In order to do that in a scalable and repeatable fashion, we've introduced, with integrated version control, the idea of using continuous integration and continuous deployment as you would in other software engineering, applied to the world of conversational AI. Specifically, making it super easy to complete this loop in a scalable, repeatable way: make an update quickly, maybe multiple updates per day, share your assistant, and have it improve in this fashion.

And so if we look at the original purpose behind Rasa X, it was what we talked about earlier: going from that level two to a level three, an advanced level three assistant, right? To build something that can be that mission-critical assistant we talked about. As we reflected on what it's going to take for teams to get there, we identified three really important things. The first one is conversational data sets specific to the assistant's task. If you're building a customer support chatbot, it's not good enough to just have human-to-human conversations; you actually need human-to-assistant conversations with the assistant you've been building, to look at and to use as training data. And not only are you using them as training data, but that training data is also, hopefully, representative of the actual way people talk to your assistant. We've often seen problems in the past where people will sit in a room for three months and hypothetically come up with ways that people will talk to their assistant. And then when they go to launch their system, it completely fails, because users talk to it in ways they didn't expect, or attempt to do things with it that they didn't expect.

And so that's why we really want people to go after conversational datasets that are representative of the problem they're trying to tackle. And it makes sense, because if you think of what a model is, right? It's trying to model reality, and so the closer your data set is to reality, the better it will be able to model it. In addition to that, we think you have to have powerful tools for reviewing past conversations. Once you have that dataset, you can't just dump it in and have the assistant get better. You actually have to do some manual human work, at least as of right now. In the future, maybe we can get to a self-learning assistant, but today it takes a lot of human work: annotating data, reviewing conversations.

And so both of those pieces, number one and number two, have been in the version of Rasa X that launched in May, but the one we added recently, at the end of 2019, is this integrated version control feature, which we're talking about today. It completes the three things we really believe you need, the third being integration with existing development workflows. A lot of the most advanced teams that are building advanced AI assistants are using continuous integration and continuous deployment pipelines. They're versioning their assistant, they're testing, they're doing test-driven development. By allowing Rasa X to easily and smoothly work with those workflows, we're able to do the three things that we think are very important for teams building AI assistants, and specifically those level three, mission-critical assistants.

And so the new feature, what is it? What is this integrated version control that Karen mentioned and I keep talking about? Well, it's been an experimental feature in Rasa X as of 0.23.0. At this point I would probably say it's more like an early access feature; the name in Rasa X is still experimental. Pretty soon we're going to be moving it out of that hidden tab, which we'll look at later in this presentation, and making it always available whenever you deploy a Rasa X server. What it does is allow you to create a two-way sync between a remote Git repository and Rasa X.

So if you make changes in Rasa X, you can easily get them to that remote Git repository, and if you make changes to that remote repository, they're automatically loaded into Rasa X. What this allows you to do is version your training data, push changes in Rasa X to a branch, and do branch-based development using the Git workflows that you probably know from building, say, a mobile app or a website. The extension of that is that because it's built on top of Git, those downstream workflows of automated testing and automated deployments are now possible. An extra thing that comes along with that is the ability to set up these scalable software engineering practices.

We'll take a quick look at how to deploy Rasa X, generate an SSH key, add it to GitHub, and connect it with Rasa X in order to set up integrated version control. However, we won't go too deep into it. There's a great resource, I think it's episode nine of the Rasa Masterclass from Juste, that covers this much better than I would ever be able to. But at a high level, this is how you set up integrated version control. You have a Rasa X server, and on your local machine you generate a private key. You take the public key and you put it into your deploy keys on GitHub or whichever code hosting platform you use; Bitbucket and GitLab also work and are supported. Once you do that, you send a POST request to the Rasa X server letting it know which GitHub repo you want to use, and that's how you connect it. We'll do a quick overview of that.
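In shell terms, that setup is roughly the sketch below. The host, token, and repository names are placeholders, and the exact API path can differ between Rasa X versions, so check the Integrated Version Control docs for your release:

```bash
# 1. Generate an SSH key pair on your local machine (no passphrase):
ssh-keygen -t rsa -b 4096 -f git-deploy-key -N ""

# 2. Add the contents of git-deploy-key.pub as a deploy key with write
#    access on your GitHub/GitLab/Bitbucket repository settings page.

# 3. Tell the Rasa X server which repository to sync with. repository.json
#    holds the repo URL and the *private* key, e.g.:
#    {"repository_url": "git@github.com:<user>/<repo>.git", "ssh_key": "<private key>"}
curl --request POST \
  --url "http://<rasa-x-host>/api/projects/default/git_repositories?api_token=<api-token>" \
  --header "Content-Type: application/json" \
  --data @repository.json
```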

And so once you do that, you've enabled testing, version control, and continuous integration/continuous deployment. The high-level thesis that we've been thinking about, as some people in the community have also been thinking about, is how important software engineering best practices are even when an application uses machine learning, right? So Arthur, superhero of Rasa, talks about how once you actually have that complex, mission-critical assistant, it's really, really hard to keep it managed. In order to manage it, you need more rigorous engineering practices that are scalable and repeatable. And the Botium team, which is building the Selenium of bot testing, loved to hear that this feature was included in Rasa X and created this awesome blog post about how to integrate the two.

And then lastly, Alan, our co-founder and CTO, had a recent tweet storm, I don't know if he'd call it a tweet storm, but he recently tweeted about this new feature and his thesis around how writing tests, using branches, and using CI/CD are things we should be doing but haven't been doing in the world of conversational AI. It got quite a bit of traction on Twitter. So if you think about what this end-to-end workflow that we keep talking about is, it's that you're making changes on the Rasa X server, annotating data, while in parallel, maybe on your laptop or any other computer outside the Rasa X server, you're also editing your assistant.

And because you're doing it in two places and have the ability to do it in two places, it's great to be able to sync the changes you're making. One of the best ways to do that is to use these code hosting platforms, Git management platforms like GitHub, GitLab, or Bitbucket, to keep them synced. And because you're using Git, you can check out a branch on your local machine, write some action code, version it in Git, and push it to GitHub, all while also annotating data in Rasa X, without the two leading to conflicts. In the previous workflow that many people followed, as you used Rasa X you had to constantly be moving data in and out, uploading and downloading, which just slows you down.

And then once you have it in GitHub, the bonus downstream workflow this enables is CI/CD. So this is really cool. I'm excited to see the different ways people are approaching setting up these pipelines. Often people will build the application, train a model, sometimes run linting, sometimes validate the training data. Then they'll run some end-to-end tests they've set up, which we'll talk about a bit.

Oftentimes that will produce some artifact, like a five-fold cross-validation report or other cool things we'll look at, which requires a manual review from, say, a data scientist to say, "Hey, this is not introducing model regressions, all the end-to-end tests pass," and then you can deploy your model. Once you do that, people have different strategies around having multiple deployments, production, staging, and development environments, and their own workflows for those. But this is entirely customizable, in the same way you would do it for a mobile app or a website. You can take advantage of the same technologies, like Travis CI, GitHub Actions, or CircleCI.
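As a minimal sketch, assuming a standard Rasa 1.x project layout, the stages of such a pipeline might reduce to a few CLI calls like these:

```bash
# Illustrative CI stages for a Rasa assistant, run on every pull request:
rasa data validate     # check domain, NLU data, and stories for mistakes
rasa train             # build a model from the branch under review
rasa test              # run the configured NLU and story tests
# ...a human then reviews the generated reports before the model is deployed.
```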

And so to go a little bit deeper into what we mean by the continuous integration tasks you could run, one example is the end-to-end tests that you run with rasa test and its --stories flag, right? If you haven't seen these, I highly recommend that you use them. It's basically taking the stories you have in your training data, plus other stories you want to ensure your model can handle, adding the actual messages that should result in those intents, and checking that the intent and action sequences in those stories occur correctly.
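For illustration, here is what a minimal end-to-end test story and test run could look like in the Rasa 1.x Markdown format; the intent and action names are hypothetical:

```bash
# Write a minimal end-to-end test story (hypothetical intents/actions):
mkdir -p tests
cat > tests/e2e_stories.md <<'EOF'
## happy path greeting
* greet: hey there
  - utter_greet
* inform: I'm from Detroit
  - utter_acknowledge_city
EOF

# Run the test conversations against the trained model:
rasa test core --stories tests/e2e_stories.md --e2e
```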

You can keep adding end-to-end tests, and even though you're making lots of changes to your model and improving your assistant, you can be sure that all of those test stories still work. In addition to that, you can do NLU model evaluation to ensure that you're generalizing well; some people split out a test set, others don't, right? And you can use that to produce some artifacts that you can review.
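A sketch of the evaluation commands behind those artifacts, assuming Rasa 1.x; flags and output file names can vary by version, so verify against your release:

```bash
# NLU evaluation: per-intent precision/recall/F1 via cross-validation:
rasa test nlu --cross-validation --folds 5

# Core evaluation: replays the test stories and writes reports, including
# a story confusion matrix plot, into the results directory:
rasa test core --stories tests/e2e_stories.md --out results/
```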

And then there's also that core evaluation: you can generate a confusion matrix and look at it. The cool thing is that using continuous integration and, say, GitHub, you can have those run and be automatically added to your pull request, or your merge request in GitLab. As the GitHub Actions example we'll look at later shows, it's a really easy workflow for another engineer or data scientist on the team to come and review. And so with that, we'll get out of the slides and get into the demo.

And so the first thing I want to do is jump into this Rasa X instance. As you can tell from this nice message, it's your first time here. I just deployed this following that Rasa Masterclass episode I was talking about earlier, so this is a brand new, fresh deployment of Rasa X. The way that you use integrated version control is you go to the experimental part, you click the checkbox, and you hit okay. If you're watching this video later than this live stream and webinar, it might just always be there, right? But for now it just says not connected.

And so, like we talked about earlier, the way that you connect it is: you have the assistant that you've built, a minimum viable assistant that can handle the happy path, you version it in Git, and you push it to your code hosting platform, in this case GitHub. We have my very basic assistant that I often use to test Rasa X. And what you do there is add a deploy key. So I go to settings; it's different depending on whether you're using GitLab or Bitbucket, but you add a deploy key, clicking in the right area. Where do you get that deploy key? In your command line, you follow the SSH key generation steps that GitHub has outlined: create a public and private key, and add your public key here. Then, once you've generated the SSH key and added the deploy key to your repository, there are a couple of ways to finish. The one I'm showing here is sending a cURL request to your, I'm sorry, not to your assistant, to your Rasa X server, saying, "This is the repository I want to use." In that JSON is my SSH key, and it also points to the repository I want to use, right? I did this already, so I don't want to waste time during the webinar doing it, but that gives you an idea of how you set it up. So we close out of this one and go to the one where I've already done it. Look, now it's entered the green state, as we call it.

At some point maybe we'll give these states proper names, but right now we call it the green state. And in that green state, what I know is that my Git server is up to date with the latest changes. So whatever you see here, whatever training data I see here, is going to be the same thing I'd see if I went and looked at my data on GitHub; they're equal to each other.

And so before getting to integrated version control: if I go into, say, Conversations, there's this cool feature in Rasa X, if you haven't used it, called Share your bot, where you can share your assistant with people, friends and colleagues perhaps, before deploying it on a channel, and they can talk to it. So even though it's just me talking to it, I'm showing you how cool it is, right? So, talking to the assistant: "I'm from Detroit." All right, that's probably enough. And the really cool thing about Rasa X, and why these workflows are even better: if you'd previously been talking to your assistant in Rasa shell on the command line, you would have talked to it and it would have worked, but all of that training data you just generated wouldn't have actually ended up anywhere.

Well, here, it automatically ends up in what we call the NLU inbox. And apparently it recognized Tyler as a name, but it got the wrong intent. So here I can very quickly, even though it predicted the wrong label, change it to name, click Mark as correct, and delete this test I was doing earlier. And you'll notice in the bottom left that the green state we saw earlier is now the orange state. What that indicates is that you have changes on your Rasa X instance that aren't reflected in the Git repo.

And so even though it is a two-way sync, it isn't always an automatic two way sync. If you are up to date in that green state with your Git repo, then it will automatically continue to pull any updates you make there.

However, that only happens if you don't have any changes on Rasa X. If you have changes on Rasa X, like in this case where I'm ahead of the changes in the repo, then I need to make sure I add them. And you can see how nice that is. In a minute, we'll go a little further and see that full workflow. Before I jump into that, I'll discard my changes, and you'll see: oh, okay, we return to the green state.

And so previously, if I made changes, added training data here, and then trained a new model, it would eventually pop up here in Rasa X, and I'd activate it. What you'd find is that I made changes and activated a model, but that doesn't necessarily result in improvements, right? I have no idea. Can it still pass all of those end-to-end tests, which I probably didn't set up if I'm not using integrated version control? But even if I did, I still have no idea.

And so to just be making changes and be iterating towards no objective or to not even be able to measure whether you're doing well and improving, that's incredibly problematic. In addition to that, all that training data I just added is not versioned. So if I want to roll back because this new assistant I built is not very good, I can't do that unfortunately. And then of course, because it's not built on top of Git, there's no chance to do a CI/CD pipeline.

And so if we go here again and talk to the assistant again: generate the link, see the conversation from earlier. I guess my model is not very good; this is not a very good test assistant, but let's see if it gets it this time. Cool. All right, now it followed a path that I expected. And so if I go to NLU training data, it will show up. There's a bit of lag in my current deployment at the moment, but you'll see that once again the data shows up here. Mark as correct, and now it's added the changes, or sorry, it's gone to the orange state, and I can add the changes to Git.

When I add changes to Git, I'm given two options. I can commit them right to master; that's not going to make a pull request, that's just going to add them to master, and I have no idea if those changes are good. So we often recommend going the new-branch route. And so I add the changes. You'll notice in the bottom left a little spinning-arrows icon, which means, "Hey, it's pushing," and then it goes back to green, right? Because if you think about it, now we're back at the state where the Git repo is equivalent to Rasa X.

Well, technically it actually isn't. So here's a slight nuance that you should understand: because I didn't push to master... If I had pushed to master, it would be equivalent, right? But because I didn't, it's now on another branch. And so what happens in Rasa X, which is the slight nuance, is that Rasa X reverts to the same state. So it's green, meaning it's equivalent to the data that's in the Git repo, but all those changes I just annotated are now on that commit I made. So if I click here, compare and create a pull request, we can see the changes I made.

So I annotated Tyler as a name here, and I can review the changes. Because I'm the only one on this repo... oh, I can't approve my own pull request. That's not ideal, but in theory, if I could approve my pull request and merge it in, it would go here, it would be green, and all those changes I just made would be in. So what are the advantages? Well, I think it's pretty obvious: I just versioned the changes I made, which means at any point I can go back to earlier commits, see what's in each version, and maybe check out and use a version if it's better.

I can also do branch-based development, which is cool. So if I go into the command line, where I have my local clone of the repository, what I can do is check out another branch and work on something like an action. I can't actually write actions in Rasa X, because that's a very code-intensive exercise, and it's better done in your favorite IDE or text editor, in my case Vim. And I can add something, maybe a comment, "hello". Save it, and you can see I'm versioning it locally here: I add those changes, I commit them, and then I push them. Cool. And the changes I made here are now on a new branch that I created less than a minute ago, local edit, right? And you can see the changes that I made locally.
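In shell terms, the local side of that demo is just the ordinary Git flow; the branch and file names here are illustrative:

```bash
# Clone the assistant's repo and work on a feature branch:
git clone git@github.com:<user>/<repo>.git && cd <repo>
git checkout -b local-edit          # new branch for local changes

vim actions.py                      # edit custom action code (add a comment, etc.)

git add actions.py                  # stage the change
git commit -m "Add comment to custom action"
git push -u origin local-edit       # publish the branch for review
```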

And so now, despite maybe someone else annotating data here, I can be on my laptop actually making changes, which is cool because you could be fixing bugs while building features. Something we really believe in is that the best assistants are built by teams, so being able to work as a team, with multiple people working on the assistant at the same time without colliding, is pretty cool.

So, yeah, we have rasa test in the command line interface: testing of end-to-end stories, model evaluations, all that. I hope you're using it, and using it to do test-driven development, even for assistants. And that's pretty cool, but it's not very scalable, right? If I have to download my assistant and run the tests on my local machine every time, that's not optimal. And so the cool thing about using this workflow is that you can set up CI/CD pipelines. I don't have one on this test instance, but we'll look at two examples that are cool.

This first one is from the CTO of Rasa, Alan. He has a GitHub Action that runs the end-to-end stories we talked about, but also does this five-fold intent cross-validation and puts the results in the PR. So whoever reviews the PR looks at that and checks, "Oh, the precision, recall, and F1 score have improved a bit with all the new training data I added," and then merges it in. This is available, I will share the link afterwards, but you can see it actually goes through and installs dependencies, builds the model, does that cross-validation, and then puts it into the PR. But you don't have to use GitHub Actions; that's just an example.

Another cool one: one of the members of our team, Brian, is building a bot for a rum distillery, called Distill-bot. He does a similar thing with the five-fold cross-validation, but he has this extra "confused with" column, where he's like, "Oh look, I made changes to inform, perhaps, and now it's potentially being confused with greet and affirm. So I should change that. I'm going to reject this pull request, because the changes we've made are not good."

But that addition to the PR only came after he ran a number of tests. So what he did here is run code formatting checks, Dockerfile checks on the PR, Rasa data validation, and his end-to-end stories; after all of those passed, he then produced this artifact on the pull request.

And so I guess the last piece you should know about integrated version control: we've talked about the two states, and you'll see them in the docs, the green and orange states; the docs explain them better than I'll be able to. But if I make changes, say, let's go to Talk to your bot. Usually if you're going to talk to the bot yourself, you can just go to Talk to your bot. And we'll say, "Hey, Romeo." If I go to NLU training... oh, perfect, it's here. It's not greet. I don't know why it keeps saying greet. Maybe this is the wrong test bot... maybe my test bot is just really bad.

Anyway, if I make changes here, we get to the orange state, which we've talked about. But there's a slight edge case that we didn't talk about, which is: if I go back here, and I'm on this branch, and I go to my data and add something like "For sure dude," and then I add it and commit it, we're going to see the red state, right? Which is, not only are you ahead of the... Well, actually I should check out master. Sorry, I forgot I was on another branch.

Okay, now I'm on the right branch. I'll do it on the master branch just because it'll be a little more straightforward. I add my "for sure dude" again, right. Sorry, live coding; I became a product manager so I don't have to do this. Git commit, and that gets us toward the red state. Okay, now I just push, right? Because I'm on the master branch, I've actually updated master.

And so the master that's on... where is the right tab? Okay, yeah, the master that's right here, on Rasa X. This remote master branch is now ahead of what I have on my Rasa X server, but I've also made changes on my Rasa X server, which is when you enter the red state. And what the red state says is: you know how last time we could commit to master? Now you can't commit to master, because you have potential conflicts, which we'd probably see here in that one training data file, right? We don't want to have to handle those merge conflicts for you.

And so rather than handling the merge conflicts for you, we'll only allow you to add a new branch. That means when you go back here and add your changes, you're adding them to a different branch, and then it's up to you to take all the commits you have, make pull requests for them, and merge them all back together. But I guess with that, I think I will move to the Q&A.

So, if I push my changes from local files to my GitHub repo, do they automatically get pulled into Rasa X? Like we talked about earlier, there's the green state, the orange state, and the red state. The answer is: it depends on when you push your local files, right? Like I was showing in the last command-line demo, if I make changes to the local files on the master branch and push them, and I've made no changes in Rasa X, then we're in the green state, and the changes made to the GitHub repo are pulled automatically into Rasa X.

However, if I've made changes on the Rasa X server, those changes might not be pulled in. What I need to do is actually get those changes off of the Rasa X server, and once I do, it will pull the latest updates, and I'll have that pull request, or merge request if you're using GitLab, to merge in.

Can different Rasa X users push changes to their own branches? This depends on whether you're using Rasa X or Rasa X as part of the Rasa enterprise edition. If you use Rasa X, you'll recognize that there's only one password; there isn't a concept of different users in the community edition. In order to have different users in Rasa X, you have to use the enterprise edition, which gives everybody their own user with their own password. You can integrate it with single sign-on, which, if you're in an enterprise, is super helpful. And then you can do role-based access control on those users.

If a user's role allows them to push changes, and multiple people have that role, then they could push changes, but on their own branches. It depends on whether they're doing those changes in parallel or separately. The way Rasa X currently works is that everybody is collaboratively working together, so you wouldn't be able to make changes as one enterprise Rasa X user and push those changes to your own branch while someone else was doing the same.

However, if you did it and then pushed your changes, and someone else came along and did it and pushed theirs, different users could push to different branches. A complicated answer; I think the thing you should understand is, in summary, no. And if you want to go more toward having many team members as different users in Rasa X, then I highly recommend reaching out to the sales team and talking about the enterprise edition.

And then the last pre-submitted question: when I launch Rasa X, I don't see an option to enable experimental features. What could cause this? A number of things, but probably the most likely, if maybe not the most obvious, is that you've deployed it in local mode. So if you find yourself in the command line typing rasa x and hitting enter, and it launches Rasa X on your local machine, you're probably not using Rasa X in server mode, right? In order to use server mode, you need to deploy with Docker or Kubernetes. The really cool thing about those is that they give you production-grade deployments that scale with your user base, plus the functionality of a server being a computer that's always on, which is really important because you don't want your assistant to stop serving users when you close your laptop lid. So if you don't see the experimental features, I'd check that first.
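For reference, a quick sketch of the distinction; the deployment specifics are covered in the Rasa X installation docs and Masterclass episode nine:

```bash
# Local mode: this launches Rasa X against the project in your current
# directory, tied to your laptop; Integrated Version Control is not
# available here.
rasa x

# Server mode: deploy Rasa X on an always-on machine instead, e.g. via the
# docker-compose or Kubernetes/Helm installation described in the docs.
```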

Otherwise, maybe you're watching this video later, not on the live stream right now, and it's no longer an experimental feature, so it would just always be in your Rasa X server interface. And so I think we have some Q&A that was submitted. The first one is from Micah. Micah is asking: using Rasa X, is it possible to commit changes to NLU training data or stories across multiple files? Having two huge files after some time seems hard to maintain. Yeah, that's a really good point. I guess as people start to build complex systems, that's one strategy people take. And at this stage, I believe the answer is no, but the cool thing about this early access, experimental feature stage is that we're taking all this feedback, people bringing up these points to us, and doing an iteration on it.

And so in the coming weeks or months, I hope it's weeks, we're going to have integrated version control v2, which will be generally available and lose its experimental label. So we're taking feedback like that; right after this meeting, I'm going to make sure the engineers are thinking about it. But as of right now, I believe the answer is no.

Do you have instructions for how to integrate Rasa with Azure DevOps instead of GitHub? At the moment I don't think we have instructions for that specifically. We use GitHub at Rasa, which is where the Rasa Open Source code and the Rasa SDK live, so we use a lot of GitHub, and we just showed an example with it. I haven't personally used Azure DevOps, but I imagine it's very similar to GitHub, so you should be able to take the general principles and ideas and apply the same things. In our docs, I think at the moment it tells you explicitly how to add a deploy key for GitLab, Bitbucket, and GitHub. Maybe we should add one for Azure DevOps; another good feedback point that we can act on.

Another question: do you have any resources on writing automated tests for NLU and stories? Yes. I think you might've seen it when I clicked one slide ahead; there are resources, like that GitHub Gist looking at Sara. If you've ever gone on the docs and chatted with the assistant there, that assistant is called Sara, and it's actually open source. The customer success engineering team has added some automated tests to it. And that Distill-bot I was showing, Brian's rum distillery Google Assistant bot, is open source too; actually, I don't know the license, but it's currently public at least on his GitHub repo.

And so those are some examples, but we're thinking about how we can collect more and more examples, and how we can showcase what the community is doing to give other people inspiration and ideas. So something I'm thinking about: maybe we make an open source repo on writing tests for NLU and stories. And the docs should have some examples too, I should mention that as well; if you go read about rasa test, there should be some examples in the docs.

Another question: is there a flag we can add to a channel to not track conversations, to exclude conversations from training data? For example, we only want to use conversation data from Slack but not from Facebook Messenger. So if you look in Rasa X for a second and go to the conversation screen, the conversations that I'm having here are going to show up in the conversation screen of Rasa X. Let me jump to Carbon Bot, which is another open source example.

Carbon Bot shows you that in the conversation screen you can see where each conversation came from: this one's from Tester, this one's from Facebook. So from those conversations, if you're annotating there, you'll be able to choose which ones, say, only the ones from Facebook Messenger, or from Slack, or from the test channel, or whatever. But within this NLU inbox, at least as of right now, there is no label for which channel a message came from. I'd actually be very curious, my email is at the end of this presentation, to know why you want that. I'm currently thinking about how we can drastically improve this NLU training experience, so maybe that's something we add if you're interested. And if you are, send me an email; like I said, it's at the end of the slides.

What's another question? Is the multi-project importer available in Rasa X? No, the multi-project importer is not available in Rasa X. I don't believe it's available past Rasa 1.0, and if you want to use Rasa X, you have to be using at least Rasa 1.0, so the answer is no. Unless I'm thinking of the wrong multi-project importer; the problem is we have four features that are all named the same thing. So ask a follow-up question if you know that specific name better than me and I'm not talking about the right thing. It is available in Rasa 1.0, but I guess either way the answer is no; it's another experimental feature. It's hard to keep up with what Rasa is doing, right?

There is a lot of support for training data and management of training resources from within Rasa X. Are there any plans to integrate a similarly powerful UI suite for local testing, even prior to commit? That's a good question. If I can allow myself to pontificate for a little bit, I would say my hope is yes. I want to make tests a first-class citizen of Rasa projects. I really believe it's going to be hard to build a scalable, complex, mission-critical level three assistant if you're not testing. And I'd love to get it into the consciousness that if you're building an assistant with Rasa, you should be using test-driven development, or at the very least test cases. I think it does make sense to have the ability to manage those tests in Rasa X; I'd have to think through those workflows more.

But a cool feature sneak peek that will be coming out shortly: you're going to be able to copy end-to-end stories out of Rasa X. Currently, if you look in the conversation screen, you can see the story from a real conversation; you'll be able to just hit another tab, which will show you that story in end-to-end format, hit copy, and then add it to your CI pipeline as an end-to-end story example. So that probably isn't quite where you want us to go, but it's a step in that direction.

And I think the last question, unless someone adds one before I finish answering this: Rasa Core testing using end-to-end evaluation with end-to-end stories in Rasa 1.6 throws an error; I was not able to find an answer for this on the forum. All right, another great feedback question. I wonder if it's 1.6.0 or after that. I believe Rasa 1.7 was released yesterday, so maybe it's fixed in there. If you're still hitting this error and it's not fixed in the latest version, I highly recommend going to our GitHub, going into the issues, and submitting an issue, because maybe the team's not aware of it. My GitHub handle is just @Tydunn, T-Y-D-U-N-N, so if for some reason you don't get an answer on it, just tag me in it, and I commit to making sure that it doesn't throw an error, because it shouldn't.

Cool. I guess with that, that's all the Q&A that has been added, as far as I'm aware. The last thing we wanted to mention before I let Karen take over again is that we have these awesome resources in the slides that will be shared with you. Episodes nine and ten of the Masterclass help with installing Rasa X, setting up integrated version control, and common workflows with Rasa Open Source and Rasa X. I think they're super helpful; if the things I was saying were too high level when I compressed them into one sentence, Juste does a great job of going into detail.

And then there are plenty of links for further reading. I've been told that the slides will be available to you, so you can click on these links. But you might've noticed earlier that if you go to GitHub.com/rasaHQ/test-bot, the repo that I was using earlier, I actually put some of these links in that pull request, including the example GitHub Actions. So if the slides aren't shared fast enough for you, you can go and look at those right away.

And with that, as always, if you have questions, ask on the forum and get in touch. My email is there. I'm the product manager at Rasa, constantly thinking about how to build better conversational interfaces and constantly looking for people to talk to. So if you're willing to chat and give feedback, especially about integrated version control, since we're making updates to it, send me an email and I'd be happy to hop on a Zoom call with you. Or if you'd rather just write it down, you can, and I'll make sure the engineers are aware of all the feedback you have on integrated version control and that we take it into consideration as we build this second iteration of it.

Awesome. Thanks Ty. And also thanks to everyone who joined us on the call today, and thank you to everyone in the community who has already enabled integrated version control and given us their feedback. So far the community's input has been incredibly valuable, and we want to continue iterating to support all of the use cases and workflows that matter to you. As you've seen, you can reach Ty by email; reach out with any product feedback or questions. Otherwise, we'll see you in the forum. Thanks everyone, have a great rest of your day.

Speakers
Ty Dunn

Product Manager

Rasa

Karen White

Developer Marketing Manager

Rasa
