Allow 2 llama bots to talk to each other on 2 separate nodes
converse-llama.js - Add: TTS=1 espeak with queue system to read back conversation (2023-04-30)
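Below is a minimal sketch of how a script like converse-llama.js could work, based only on what the listing states: two llama bots on two separate nodes take turns replying to each other, and with TTS=1 each reply is pushed into a queue and read back with espeak one line at a time. The node URLs (NODE1_URL/NODE2_URL), the /completion endpoint, and the request/response shape are assumptions for illustration, not taken from the original file.

```js
// Sketch only: two llama bots on separate nodes talk in turns; TTS=1 reads
// each reply aloud via an espeak queue. Endpoint and payload are assumed.
const { spawn } = require("child_process");

const NODES = [
  { name: "bot-1", url: process.env.NODE1_URL || "http://node1:8080" }, // assumed URL
  { name: "bot-2", url: process.env.NODE2_URL || "http://node2:8080" }, // assumed URL
];

// --- TTS queue: speak one line at a time so espeak processes don't overlap ---
const ttsQueue = [];
let speaking = false;

function speak(text) {
  if (process.env.TTS !== "1") return; // TTS=1 enables read-back
  ttsQueue.push(text);
  drainQueue();
}

function drainQueue() {
  if (speaking || ttsQueue.length === 0) return;
  speaking = true;
  const child = spawn("espeak", [ttsQueue.shift()]); // read the next queued line
  child.on("close", () => {
    speaking = false;
    drainQueue(); // continue with the rest of the queue
  });
}

// --- ask one node for a completion (endpoint and payload shape are assumed) ---
async function ask(node, prompt) {
  const res = await fetch(`${node.url}/completion`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ prompt, n_predict: 128 }),
  });
  const data = await res.json();
  return (data.content || "").trim();
}

// --- alternate turns: each bot's reply becomes the other's prompt ---
async function converse(turns = 10) {
  let message = "Hello! What shall we talk about today?";
  for (let i = 0; i < turns; i++) {
    const node = NODES[i % 2];
    const reply = await ask(node, message);
    console.log(`${node.name}: ${reply}`);
    speak(`${node.name} says: ${reply}`);
    message = reply;
  }
}

converse().catch(console.error);
```

A queue is used for the read-back so that long replies from one bot finish speaking before the next reply starts, rather than spawning overlapping espeak processes.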