MedBot Installation Guide
MedBot can be installed from its source code or from Docker images.
Install MedBot with Docker
- Install docker-compose
- Create your token:

echo TOKEN=$(tr -dc A-Za-z0-9 </dev/urandom | head -c 20) > .env
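To confirm the token was written, you can print the .env file; the value will be a random 20-character string rather than the placeholder shown here:

cat .env
TOKEN=SOME_RANDOM_STRING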
- Create the docker-compose file:

$ cat docker-compose.yml
services:
  rasa:
    image: medbot/medbot_server:latest
    container_name: chatbot-server
    restart: always
    ports:
      - 5005:5005
    volumes:
      - ./logs:/rasa-server/rasa/logs
    networks:
      - rasa-network
    env_file:
      - .env
    command: [$TOKEN]
  app:
    image: medbot/action_server:latest
    container_name: action-server
    restart: always
    volumes:
      - ./logs:/action-server/logs
    networks:
      - rasa-network
    expose:
      - 5055
networks:
  rasa-network:
    driver: bridge
- Enjoy your chatbot:

docker-compose -f docker-compose.yml up -d
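As a quick smoke test (a minimal sketch; it assumes the image ships with the default REST channel enabled), you can check that the server answers on port 5005 and send it a first message:

# The root endpoint replies with a short greeting and the Rasa version
curl http://localhost:5005/

# Send a message through the REST webhook
curl -X POST http://localhost:5005/webhooks/rest/webhook \
--header 'Content-Type: application/json' \
--data-raw '{ "message": "Hi", "sender": "default" }'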
Install from source code
- Clone the Rasa branch:

git clone https://github.com/arezae/chatbot --branch rasa --single-branch --depth 1
- Directory tree

To run the Rasa chatbot, we need to run the action server and the chatbot server separately, so we have separated the actions, datasets, and the chatbot from each other. The action-server directory contains the actions and datasets, and the rasa-server directory contains the chatbot model and its autocorrect component. In the rest of this section, we will install the requirements and then run the chatbot and action servers.

production/
├── action-server
│   ├── actions
│   ├── datasets
│   └── docker
└── rasa-server
    ├── autocorrect
    │   └── data
    ├── docker
    └── rasa
        ├── data
        ├── models
        └── tests
- Install requirements

- Create a Python environment:

pip install --user --upgrade pip
pip install --user virtualenv
python -m venv rasa_env
- Activate your Python environment:

source rasa_env/bin/activate
- Install the Rasa chatbot requirements:

pip install --no-cache-dir -r production/rasa-server/requirements.txt
- Or install both the Rasa chatbot server and action server requirements in one step:

pip install --no-cache-dir -r production/rasa-server/requirements.txt -r production/action-server/requirements.txt
- Download the spaCy weights:

python -m spacy download en_core_web_md
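To confirm the weights were downloaded correctly (an optional check, not part of the original steps), you can try loading the model:

python -c "import spacy; spacy.load('en_core_web_md'); print('en_core_web_md loaded')"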
- Download the autocorrect module dictionaries. This dictionary consists of English and medical words:

cd production/rasa-server
python -c "import autocorrect; autocorrect.Speller('en_med')"
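As an optional sanity check (a sketch; the exact correction depends on the bundled en_med dictionary), you can ask the speller to correct a misspelled medical term:

python -c "from autocorrect import Speller; spell = Speller('en_med'); print(spell('asprin'))"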
- Train the Rasa model:

cd production/rasa-server/rasa
rasa train
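Optionally, you can evaluate the trained model against the test stories in the tests directory (a suggestion beyond the original steps; it assumes those tests follow the standard Rasa test layout):

rasa test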
- Run the Rasa server

We are almost done. Now we can run the Rasa server and the action server.
- Run the Rasa server. If you would like the Rasa server to use a token to authenticate requests, you can add --auth-token YOUR_TOKEN at the end of the following command:

cd production/rasa-server/rasa
mkdir logs
rasa run --log-file logs/rasa-server.log --enable-api
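For example, with token authentication enabled the last command becomes (YOUR_TOKEN is a placeholder for the token you generated):

rasa run --log-file logs/rasa-server.log --enable-api --auth-token YOUR_TOKEN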
- Run the action server:

cd production/action-server/actions
mkdir logs
rasa run --log-file logs/action-server.log actions --actions actions
- Enjoy chatting with MedBot:

curl --location --request POST 'http://localhost:5005/webhooks/rest/webhook' \
--header 'Content-Type: application/json' \
--data-raw '{
    "message" : "Can you give me dosage information of Abilify?",
    "sender" : "default"
}'
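If everything is up, the REST webhook replies with a JSON array of bot messages, roughly of the following shape (the reply text depends on the trained model):

[
    {
        "recipient_id": "default",
        "text": "<bot reply>"
    }
]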
Build a Docker image
- Clone the Rasa branch:

git clone https://github.com/arezae/chatbot --branch rasa --single-branch --depth 1
- Directory tree

To run the Rasa chatbot, we need to run the action server and the chatbot server separately, so we have separated the actions, datasets, and the chatbot from each other. The action-server directory contains the actions and datasets, and the rasa-server directory contains the chatbot model and its autocorrect component.

production/
├── action-server
│   ├── actions
│   ├── datasets
│   └── docker
└── rasa-server
    ├── autocorrect
    ├── docker
    └── rasa
        ├── data
        └── tests
In the rest of this section, we will build separate images for the action and Rasa servers and then run them together.
- Generate token
- Generate a token for request authentication. You can do this manually or use the urandom device file:

echo TOKEN=$(tr -dc A-Za-z0-9 </dev/urandom | head -c 20) > production/.env
- Check your token:

$ cat production/.env
TOKEN=SOME_RANDOM_STRING
- Choose your Dockerfile

The default Dockerfile is Slim-based, but you can choose another Dockerfile from the docker directory. For example, for the Rasa-based images:

cp production/rasa-server/docker/Dockerfile-Rasa production/rasa-server/Dockerfile
cp production/action-server/docker/Dockerfile-Rasa production/action-server/Dockerfile
- Review the docker-compose file

If you are using an old version of docker-compose or docker-engine, you might need to add the version key at the top of the docker-compose.yml file. For example, for version 1.25.0 of docker-compose you need to add version: "3.7" or version: "3" at the top of your docker-compose file, like this:

version: "3.7"
services:
  rasa:
    build:
      context: ./rasa-server
      dockerfile: Dockerfile
      args:
        - VERSION=v0.1.0
    .
    .
    .
The version of the model weights you want to use can also be set with the VERSION argument in the docker-compose file. This downloads the VERSION.tar.gz file from Dropbox while building the Docker images. Please note that if you want to use a specific version of the bot, you should check out that version of the repository in addition to setting it as the VERSION argument in the docker-compose file. You can find the available versions here.
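For example, to build against a different weights version you can either edit the VERSION build argument shown above or override it on the command line; the sketch below assumes your docker-compose version supports --build-arg and that v0.2.0 is one of the published versions:

docker-compose -f production/docker-compose.yml build --build-arg VERSION=v0.2.0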
- Build the images

Build the Rasa chatbot and action server images:

docker-compose -f production/docker-compose.yml build
- Make containers and run images

Run docker-compose to start the chatbot and its actions together in an isolated environment:

docker-compose -f production/docker-compose.yml up -d
- Test your chatbot

- Talk with your chatbot:

curl --location --request POST 'http://localhost:5005/webhooks/rest/webhook' \
--header 'Content-Type: application/json' \
--data-raw '{
    "message" : "Can you give me dosage information of Abilify?",
    "sender" : "default"
}'
- Debug your model. (Use your token instead of SOME_RANDOM_STRING.)

curl --location --request POST 'http://localhost:5005/model/parse?token=SOME_RANDOM_STRING' \
--header 'Content-Type: application/json' \
--data-raw '{
    "text" : "Can you give me dosage information of Abilify?"
}'
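The parse endpoint returns the interpreted message with the recognized intent and entities. The response has roughly the following shape; the intent and entity names and the confidence value depend on the trained model and are only illustrative here:

{
    "text": "Can you give me dosage information of Abilify?",
    "intent": { "name": "<intent name>", "confidence": 0.97 },
    "entities": [ { "entity": "<entity name>", "value": "Abilify" } ]
}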
- Or you can inspect the conversation tracker with the following command. (Again, use your token instead of SOME_RANDOM_STRING.)

curl --location --request GET 'localhost:5005/conversations/default/tracker?token=SOME_RANDOM_STRING' \
--header 'Content-Type: application/json'
- Monitor the chatbot

- Check for errors and warnings in the logs:

docker-compose -f production/docker-compose.yml logs -f -t
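If you only want the logs of one of the two services, you can pass its service name from the compose file (for example rasa) to limit the output:

docker-compose -f production/docker-compose.yml logs -f -t rasa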
- Monitor a live stream of container resource usage statistics:

docker stats
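For a single snapshot limited to the MedBot containers, you can pass --no-stream and the container names (chatbot-server and action-server are the names used in the Docker install's compose file; adjust them if yours differ):

docker stats --no-stream chatbot-server action-server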
- Removing the chatbot

Stop and remove the chatbot containers:

docker-compose -f production/docker-compose.yml down
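If you also want to remove the built images and any anonymous volumes (a more thorough cleanup than the step above), docker-compose can do it in the same command:

docker-compose -f production/docker-compose.yml down --rmi all --volumes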