Mycroft -- open source AI


(stephen krywenko) #1

Not sure how many people would be interested in integrating AI into their energy monitor, but I started using this open source AI, Mycroft, and it works reasonably well.

I installed it on a quad-core Orange Pi running Armbian. The hardest part was getting the audio to work, as you have to use PulseAudio,
and there is very little info on what software is required on top of a small base install, so a lot of guesswork was involved.

First off you need to enable your mic, which is relatively simple: in alsamixer, tab over to capture on Mic1 and hit the space bar to enable it.
Next you need to install mpv, which pulls in pretty much all the audio prerequisite packages. You can also install VLC (I browsed through their code and saw it mentioned for playback, so I installed that as well for good measure).
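If you would rather script this than poke at alsamixer interactively, amixer can flip the same capture switch. The control name below is a guess for this board, so list yours first:

```shell
# Control names vary by board; 'Mic1' is a guess.
# List the real ones with: amixer scontrols
amixer sset 'Mic1' cap      # enable capture on the mic channel
alsactl store               # persist the mixer state across reboots (run as root)
```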

 apt-get install mpv 

Download a small test wav file to the device for testing, e.g.

 mpv test.wav

When playing the file it will tell you it is playing with the alsa engine.
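To pick that line out of mpv's output rather than reading the whole log, you can grep for it; mpv names the audio driver on its "AO:" line:

```shell
# mpv prints an "AO: [driver]" line when it opens the audio output.
# Before PulseAudio is set up you should see [alsa] here.
mpv --no-video test.wav 2>&1 | grep -m1 'AO:'
```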

Installing PulseAudio... well, I am not sure if I installed it properly, as there is not a lot of consistent info on how to install it.

   apt-get install pulseaudio pulseaudio-utils

Next you need to give your user permission to use it, e.g. for root:

 usermod -a -G pulse root
 usermod -a -G pulse-access  root

Log out or reboot.

Once that is done you can run pulseaudio --system or pulseaudio -D.

If everything is set up right, when you play an audio file with mpv it will now say it is using the pulse audio engine.
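A quick way to confirm the daemon itself is up before blaming mpv (pactl comes with the pulseaudio-utils package installed above):

```shell
# If the daemon is reachable this prints server details,
# including a "Server Name: pulseaudio" line.
pactl info | grep 'Server Name'

# Force the pulse backend explicitly rather than letting mpv pick one.
mpv --no-video --ao=pulse test.wav
```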

Now you can proceed with the Mycroft install. It will take a couple of hours.

  • cd ~/
  • git clone
  • cd mycroft-core
  • bash
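The forum stripped the links out of the list above; for reference, this is my best guess at the full commands, assuming the standard mycroft-core GitHub repo and its dev_setup.sh script (both are my assumption of what the stripped links pointed to):

```shell
cd ~/
# Repo URL and script name are assumptions, not taken from the original post.
git clone https://github.com/MycroftAI/mycroft-core.git
cd mycroft-core
bash dev_setup.sh
```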

Once done, run

./ unittest

Hopefully it will be error free. It might display errors because the audio files did not finish downloading, as that can take a while after the core finishes installing.

Next, test your audio with Mycroft:

./ audiotest

If all worked out well, Mycroft will now be able to hear you and speak to you. :slight_smile:

Next, just run

 ./ all

to get it up and running and have a fully functional AI.

It will give you a pairing key and away you go: pair it on their website.

Currently I have it running pretty well: streaming music and a lot of other functions. I'm now trying to get Domoticz to work, but it will probably take me a couple of days to understand how that code works. If you are an openHAB user, though, it comes with a plugin you can connect directly to control your devices, right off the top.

Oh well, good luck and have fun.

(stephen krywenko) #2

Okay, for those wanting to add Domoticz functionality to Mycroft: since it is not well documented, here's a basic how-to.
Use this version of the skill, as the author is actively working on it.

You can either install it via the skills installer found on the skills page (using the GitHub git link), or clone it manually:


cd /opt/mycroft/skills
git clone

Then wait a few minutes and refresh your skills page,

and this should appear

Then just add in your Domoticz settings.

Your light switch commands have to be phrased a certain way, with a “where” and a “what”,

i.e. turn off office (where) light (what).

It will only recognize whatever words are in the skill's “where” voc file. If you want to use a different word or phrase, just add it to the file, e.g. zigbee, which is what I tend to use.

The “what” part uses its own voc file.

So for me, I tend to use zigbee as the where, and kettle or dehumidifier as the what.

The end result after adding the new words to the voc files:

the command “Hey Mycroft” – “turn on zigbee kettle” or “turn off zigbee dehumidifier” will result in a Domoticz action.
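Adding your own words is just a matter of appending them to the skill's .voc files. The paths below are hypothetical (the skill's actual directory and vocab layout may differ), so locate the real files first:

```shell
# Paths are illustrative only; find the skill's real vocab files with:
#   find /opt/mycroft/skills -name '*.voc'
echo "zigbee" >> /opt/mycroft/skills/domoticz-skill/vocab/en-us/where.voc
echo "kettle" >> /opt/mycroft/skills/domoticz-skill/vocab/en-us/what.voc
```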

I found it best to run start-mycroft in debug mode when setting up Domoticz:

 ./ debug

as you can type commands into the CLI, and you can also see if there is a problem with word recognition; then you can change or add words that are easier for Mycroft to understand in Domoticz, etc.

(stephen krywenko) #3

A little further update to the AI energy monitor project. The current problem with Pi devices is the mic quality, and the audio output is unamplified: there is not even enough power to drive headphones.

Solving the mic problem was pretty easy: I used a common mic found in most Arduino kits, the CZN-15E. I did not even have to remove the existing mic; I just soldered it to the mic pins on the back of the Orange Pi. Now you can speak normally, even at a slight whisper, and it works well; a normal voice works within a 10-15 foot radius. I am currently testing a slight cone interface on the mic, which boosted its operation to about 30 feet. The only thing is you have to pronounce the trigger phrase (Hey Mycroft) distinctly at that range, but you can speak normally after that.

The sound is a bit of a bother since it is unamplified. I am going to add in a TDA1308; hopefully that will bring it up to the correct output without having to overdrive a set of powered speakers. Currently I have it working by pushing the sound to a pair of powered speakers acting as a preamp at 3/4 volume, then using their headphone output to connect to a Dolby surround system or other speakers, which works fine.

Today I wish I had a 3D printer, as I just need to box the AI and I am pretty much done. It works pretty well; it communicates with Domoticz nicely and I can get all the info I want off the device. All I have to say is

“Hey Mycroft” – “what is the current usage”
and it will reply:
“The current usage is …”

(Trystan Lea) #4

Thanks for sharing @stephen that’s interesting!

(stephen krywenko) #5

For those who might want to add AI to their Open Energy Monitor project:
here's a sketch for driving a TFT_eSPI-compatible screen. It is also compatible with RA8875 screens (just point it away from the TFT_eSPI library and at the RA8875 library instead).
It is based on my other TFT graphing sketches, so you can incorporate it into my other MQTT remote monitors and touch screen interfaces if you wish.

This is the web music stream interface; it displays the station, the artist and the song title.
You could easily add in a weather display or an energy display.
The blue ring informs you that Mycroft is ready.

The green ring informs you that Mycroft is listening for your command.

Simply copy the files to the AI home directory.
You may have to modify them slightly, as I set them up to operate in the home dir of the ai user (/home/ai).

here are the sketch and the processing files for mycroft (9.2 MB)

It is a little big because it includes the start.wav that my son created for it.

It communicates via a serial connection on /dev/ttyS2 to an ESP (Wemos D1 Mini). The ESP and screen are powered off the two 5 volt pins on the Pi.
Once it is hooked up, verify that you can talk to the screen from the command line:

echo "cmd_9(xxx)" | sudo tee /dev/ttyS2

(Note: `sudo echo ... > /dev/ttyS2` would not work, as the redirect runs as your normal user; tee does the write with root privileges.)

This tells the screen to clear itself and display the blue ring. Use cmd_3 or cmd_4 instead to switch the ring between blue and green.
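If you send a lot of these, a tiny shell helper keeps the quoting straight. The /dev/ttyS2 device and the cmd_N(...) format follow the post; SCREEN_DEV can be overridden to point at a plain file for a dry run:

```shell
# Small helper for the cmd_N(...) screen protocol described above.
# SCREEN_DEV defaults to the serial port; override it to test without hardware.
SCREEN_DEV=${SCREEN_DEV:-/dev/ttyS2}

screen_cmd () {
    # e.g. screen_cmd 9 start  writes "cmd_9(start)" to the port
    printf 'cmd_%s(%s)\n' "$1" "$2" > "$SCREEN_DEV"
}
```

Usage: `screen_cmd 9 start` to clear the screen and show the blue ring, then `screen_cmd 3 ''` or `screen_cmd 4 ''` to switch the ring colour. Run it as root (or a user with write access to the port).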

Then just add these lines to /etc/rc.local:

sudo -H -u ai /home/ai/./startup
/home/ai/./ChNam >/dev/null 2>&1 & echo "started station"
/home/ai/./META >/dev/null 2>&1 & echo "started Meta data"
/home/ai/./onlight >/dev/null 2>&1 & echo "started light on"
/home/ai/./offlight >/dev/null 2>&1 & echo "started light off"
/home/ai/./ont >/dev/null 2>&1 & echo "started ont"
echo "cmd_9(start)" > /dev/ttyS2
/home/ai/./DandTime >/dev/null 2>&1 & echo "sent time"
/home/ai/./starttime >/dev/null 2>&1 & echo "started timer"

(I changed the original `>nul` redirects to `>/dev/null`, the Linux equivalent; rc.local already runs as root, so the plain echo to /dev/ttyS2 works there.)

If you are using an Orange Pi on the Armbian OS, as I do, you need to enable the UARTs. Just cd to /boot and edit the env file:

nano armbianEnv.txt

and add

overlays=uart1 uart2

Now reboot and your UARTs will be enabled.
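The edit can also be scripted. This sketch takes the file path as an argument so you can try it on a copy of armbianEnv.txt before touching /boot; it extends an existing overlays= line rather than adding a duplicate:

```shell
# Add the uart overlays to Armbian's boot environment file.
# Takes the file path so it can be tested on a scratch copy first.
enable_uarts () {
    env_file=$1
    if grep -q '^overlays=' "$env_file"; then
        # extend the existing overlays= line in place
        sed -i 's/^overlays=.*/& uart1 uart2/' "$env_file"
    else
        # no overlays line yet: append one
        echo 'overlays=uart1 uart2' >> "$env_file"
    fi
}
# usage (as root):  enable_uarts /boot/armbianEnv.txt   then reboot
```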

Okay, good luck and have fun!