LSTM Poetry with Text-to-Speech

For the final week’s project using ml5.js, I put together an LSTM text generator with a poetry model and text-to-speech output.

GitHub: https://github.com/vulture-boy/lstmPoetry
(There are a few extra models on the GitHub that you can access by modifying the code slightly; just change the model folder to load.)

You can check it out here: https://vulture-boy.github.io/lstmPoetry/
[The text-to-speech seems to be giving the webhost some issues and only works some of the time. I would recommend downloading it from the GitHub.]

To accomplish this, I scraped poetry from a website and followed the tutorial listed on ml5: Training a LSTM

Scraping

[Image: the Web Scraper extension in Chrome]

I used Web Scraper in Chrome to get the text I needed to train the machine learning process. I needed a text file containing the information I wanted the algorithm to learn from, but I didn’t want to go through the laborious process of manually collecting it from individual web pages or Google searches; a web scraper automates the task. The only thing required is a ‘sitemap’ that you can put together using Web Scraper’s interface: you pick out the HTML elements that contain the text, links, and data of interest, which tells the scraper how to navigate the page and what to collect. A rough example of a sitemap is sketched below.
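A sitemap is just a JSON description of what to visit and what to grab. It looks roughly like this (the selector names and CSS classes here are hypothetical; Web Scraper builds the real thing for you through its point-and-click interface):

```json
{
  "_id": "poetry-scrape",
  "startUrl": ["https://example.com/poems"],
  "selectors": [
    {
      "id": "poem-link",
      "type": "SelectorLink",
      "parentSelectors": ["_root"],
      "selector": "a.poem-title",
      "multiple": true
    },
    {
      "id": "poem-text",
      "type": "SelectorText",
      "parentSelectors": ["poem-link"],
      "selector": "div.poem-body",
      "multiple": false
    }
  ]
}
```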

[Image: scraped poem data in Web Scraper]

After the scrape completes (or if you decide to interrupt it), you can export a .csv of the collected data and copy the column(s) containing the desired text into a .txt file for the training process to use.

Training the Process

In order to prepare my computer for training, I had to install a few packages for my Windows 10 PowerShell, namely Chocolatey, Python 3, and a few Python packages (pip, TensorFlow, and a virtual environment). It’s worth noting that in order to install these I needed to enable remote scripts: by default, Windows 10’s execution policy prevents you from running scripts inside PowerShell for safety purposes. The setup commands are sketched after the links below.

Installing Python3 (inc. Powershell setup)
Installing Tensorflow
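For reference, the setup amounted to something like the following in an elevated PowerShell window (commands from memory; the guides linked above are the authoritative source):

```powershell
# allow locally-written scripts to run (blocked by default)
Set-ExecutionPolicy RemoteSigned

# install Python 3 via Chocolatey, then the Python packages
choco install python
pip install virtualenv
pip install tensorflow
```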


Once I had the packages installed, I ran the train.py file included in the training repository on a .txt file collating all the text data I had collected via web scraping. Each epoch denotes one full pass of the training data through the network, and the time/batch figure reports how many seconds each training batch took. The train_loss value indicates how closely the network’s predictions matched the input data: the lower the value, the better. There are also several hyperparameters that can be adjusted to trade off result quality against processing time (Google has a description of this here). A sample invocation is sketched after the list below. I used the default settings for my first batch on the poetry:

  • With 15 minutes’ worth of scraped data (3,500 iterations, poem paragraphs), training took about 15 minutes.
  • For a second batch, I collected about 30 minutes’ worth of data from a fanfiction website (227,650 iterations, sentence- and paragraph-sized samples); I believe training took a little over 3 hours.
  • For another data set containing an entire novel (55,000 iterations, 360 chapters), I adjusted the hyperparameters as recommended in the ml5 training instructions for 8 MB of data and ran the process on my laptop rather than my desktop. The average time/batch was ~7.5 seconds, far larger than my desktop’s average of ~0.25 with default settings, and the run was going to take approximately five days, so I aborted it. I tried again with default settings on my laptop: the iteration count increased from 55,000 to 178,200, but the batch time averaged a respectable 0.115 seconds.
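For reference, a training run is launched from the repository something like this (flag names as I remember them from the ml5 training repo; its README lists the recommended hyperparameter values for each data size):

```bash
# default run
python train.py --data_dir=./data/poetry

# larger-data run with adjusted hyperparameters (illustrative values only)
python train.py --data_dir=./data/novel \
  --rnn_size=512 --num_layers=2 --seq_length=128 \
  --batch_size=64 --num_epochs=50
```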


On completion, the training script creates a model folder, which can be swapped in for any other LSTM model.
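Loading the folder in a sketch is then straightforward. A minimal example using ml5’s charRNN API (the current name for its LSTM generator; the version I used called it LSTMGenerator):

```javascript
let rnn;

function setup() {
  noCanvas();
  // point at the model folder produced by train.py
  rnn = ml5.charRNN("models/poetry/", modelReady);
}

function modelReady() {
  // seed text, output length, and sampling temperature are all tunable
  rnn.generate({ seed: "The sky", length: 200, temperature: 0.5 }, gotText);
}

function gotText(err, result) {
  if (err) {
    console.error(err);
    return;
  }
  console.log(result.sample); // the generated text
}
```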

Text-to-Speech

One of the contributed libraries for p5.js is the p5.speech library. It integrates easily into existing p5.js projects and has comprehensive documentation on its website. For my LSTM generator, I created a voice object, a few extra sliders to control the voice’s pitch and playback speed, and a playback button that reads the output text aloud; the setup is sketched below. Now I can listen to beautiful machine-rendered poetry!
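The wiring of voice, sliders, and button looked roughly like this (slider ranges are my own choices; setPitch, setRate, and speak come from the p5.speech documentation):

```javascript
let voice;        // p5.Speech object
let pitchSlider;  // pitch: 0–2, default 1
let rateSlider;   // playback speed: 0.1–2, default 1
let output = "the generated poem goes here"; // filled in by the LSTM

function setup() {
  noCanvas();
  voice = new p5.Speech();

  pitchSlider = createSlider(0, 2, 1, 0.01);
  rateSlider = createSlider(0.1, 2, 1, 0.01);

  const playButton = createButton("read poem");
  playButton.mousePressed(() => {
    voice.setPitch(pitchSlider.value());
    voice.setRate(rateSlider.value());
    voice.speak(output); // read the generated text aloud
  });
}
```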

Here’s a sample output:

The sky was blue and horry The light a bold with a more in the garden, Who heard on the moon and song the down the rasson’t good the mind her beast be oft to smell on the doss of the must the place But the see though cold to the pain With sleep the got of the brown be brain. I was the men in the like the turned and so was the chinder from the soul the Beated and seen, Some in the dome they love me fall, to year that the more the mountent to smocties, A pet the seam me and dream of the sease ends of the bry sings.

Blow, Wind, Blow: A Windy Tweet Machine

Tyson Moll

[Image: a tweet from the account]

What?
I hooked up a Twitter account with p5 and the Weather Underground. It tweets whenever the wind picks up speed, or whenever I have something spicy to say on the account. Whoop de doo!

Follow it at @TorontoWindy on Twitter!

How?
Thanks to a handy-dandy example provided by Nick Puckett, our presiding Ubiq prof, we were able to connect p5 / JavaScript to io.adafruit.com, which in turn can be accessed by a website called IFTTT (If-This-Then-That) to perform actions with whatever value you provide it through the p5 environment. IFTTT works in terms of ‘applets’, which perform the “do this whenever that happens” behaviour you dictate to them. It supports a variety of services; I used the Twitter, WeatherUnderground, and Adafruit functionality.

io.adafruit.com accomplishes this by capturing data sent to its API, then sharing the information with any integrated services watching the feed the data was sent to. The p5 environment simply has to collect information, then pass it on to io.adafruit.com, roughly as sketched below.
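A minimal sketch of that hand-off, assuming a hypothetical feed name and placeholder credentials (the endpoint and X-AIO-Key header are from Adafruit IO’s REST API docs):

```javascript
const AIO_USER = "your-username"; // placeholder
const AIO_KEY = "your-aio-key";   // placeholder
const FEED = "wind-facts";        // hypothetical feed name

// send one value to an Adafruit IO feed, which IFTTT then watches
function sendToAdafruit(value) {
  fetch(`https://io.adafruit.com/api/v2/${AIO_USER}/feeds/${FEED}/data`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "X-AIO-Key": AIO_KEY, // Adafruit IO API key header
    },
    body: JSON.stringify({ value: value }),
  })
    .then((res) => res.json())
    .then((data) => console.log("posted:", data))
    .catch((err) => console.error(err));
}
```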

[Image: Adafruit IO feeds and IFTTT applets]

The first applet allows you to post a Wolfram Alpha response to Twitter using a p5 interface. I crudely merged last week’s example code on retrieving Wolfram Alpha API messages through PubNub with this week’s example code on sending information to io.adafruit.com in order to have the applet post some fun facts about wind on the Twitter account. Essentially, whenever the Wolfram API sends back information to the p5 interface, it triggers the io.adafruit.com function and sends the string to be forwarded to the Twitter account.


The second applet just takes wind data from WeatherUnderground whenever it passes certain thresholds (in KPH) and posts a tweet about it on Twitter. By following the IFTTT site’s step-by-step instructions I quickly got the applet running.

Why?
I never really paid any attention to the wind reports from weather providers, but it seemed cool in concept to be alerted whenever it gets real windy in Toronto. So why not? Having it available as a separate service makes it more noticeable to me, instead of the information being buried among the weather details everyone else wants to know about (e.g. temperature).

It’s likely that the IFTTT service could be circumvented altogether with a solid understanding of each of its services’ APIs, as I presume many of them are publicly available, but setting this connection up through IFTTT was remarkably quick and simple.

Having this kind of access to APIs hosted on the web also interests me in terms of public engagement with data and its distribution and uses. Maybe the p5 context could be a corporate sharing site for employee photos; maybe it’s a handy way to centralize all your social media sharing activity.

Capturing the Weather

Tyson Moll

Pastebin Copy of C# Script for Unity Weather API

This week we were tasked with utilizing an API in order to ‘do something’. So I built a very simple weather app in Unity.

It took a little scrounging around to find practical examples of what I wanted to accomplish. I came across AtlasVR, a beautiful example created by Sebastian Scholl and Nate Daubert, as well as a script written out of boredom by a Unity forum user known as ‘johnnydj’. Both seemed to have been written for slightly older versions of Unity, based on their usage of the deprecated ‘WWW’ class, so with reference to the Unity Documentation I modified the scripts to use the modern ‘UnityWebRequest’ class. Like both sources, I used OpenWeatherMap.org as the provider of the JSON-formatted weather data in my Unity project: after registering on their website, anyone can make requests to their system at a rate of 60 calls per hour… not much, but enough to get my application working. With testing, I was able to parse the retrieved JSON and use the data in Unity! A trimmed sketch of the approach is below.
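A trimmed sketch (field names follow OpenWeatherMap’s response format; the API key, city, and error handling are placeholders, not the script linked above):

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

public class WeatherFetcher : MonoBehaviour
{
    // placeholder key and city; register at openweathermap.org for a real key
    const string url =
        "https://api.openweathermap.org/data/2.5/weather?q=Toronto&appid=YOUR_KEY";

    [System.Serializable]
    class MainData { public float temp; public float humidity; }

    [System.Serializable]
    class WeatherResponse { public MainData main; }

    IEnumerator Start()
    {
        UnityWebRequest req = UnityWebRequest.Get(url);
        yield return req.SendWebRequest();

        if (req.isNetworkError || req.isHttpError) // pre-2020 Unity API
        {
            Debug.LogError(req.error);
            yield break;
        }

        // JsonUtility maps matching field names onto the classes above
        WeatherResponse w =
            JsonUtility.FromJson<WeatherResponse>(req.downloadHandler.text);
        Debug.Log($"temp: {w.main.temp} K, humidity: {w.main.humidity}%");
    }
}
```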

Being able to access these numbers, and to import other elements of OpenWeatherMap’s variety of JSON properties with the script, opened up the opportunity to use them within Unity’s game environment. I was able to adjust the color value of the directional light with the temperature values and increase the density of a particle fog using the humidity percentage; the mapping looked something like the snippet below. The capacity to use this information to model real-world environments seems powerful, and I’m glad I now have this script for future reference when developing JSON-based API accessors / decoders. From my understanding, this method of storing information can also be used for saving games or sharing information across networks in an easily readable format. Perhaps it’s not as efficient as binary, but it’s incredibly legible.
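Something along these lines, reusing the WeatherResponse class from the sketch above (the temperature range and colors are arbitrary choices for illustration):

```csharp
// 'sun' is the scene's directional light, assigned in the Inspector
public Light sun;

void ApplyWeather(WeatherResponse w)
{
    // OpenWeatherMap returns temperature in Kelvin by default;
    // map roughly -10..30 °C onto a cool-to-warm light color
    float t = Mathf.InverseLerp(263f, 303f, w.main.temp);
    sun.color = Color.Lerp(new Color(0.6f, 0.7f, 1f),
                           new Color(1f, 0.9f, 0.7f), t);
}
```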


Resources:

johnnydj. “Current Weather Script.” Unity Forums. Forum post. Retrieved from <https://forum.unity.com/threads/current-weather-script.242009/>

Scholl, Sebastian, and Nate Daubert. “AtlasVR: Connecting to Web API’s in Unity for Weather Data.” hackster.io. Project journal. Retrieved from <https://www.hackster.io/team-dream/atlasvr-connecting-to-web-api-s-in-unity-for-weather-data-38a099>

Unity Documentation. “JSON Serialization” and “UnityWebRequest.Get.” Retrieved from <https://docs.unity3d.com/Manual/JSONSerialization.html> and <https://docs.unity3d.com/ScriptReference/Networking.UnityWebRequest.Get.html>

Process Journal #1 – Radio Rabbit (Tyson)

Radio Rabbit

Video: pic.twitter.com/mjzqA4hqOe

Day 1:

We were introduced to the XBee radio transceivers. My radio kit came with a LilyPad Arduino, which conveniently could be used to configure the radios. I tested the device out with another student and we sent several strings back and forth between each other.

It reminds me of the command terminal on computers and past Arduino serial communication experiments. The XBees were relatively easy to set up with the aid of the lecture slides provided by our professors and the necessary firmware. Transmissions have been clean so far, but I’ll have to watch out for dropped packets when testing in the future.

Since the XBees don’t natively fit on breadboards, I soldered the breakout board provided to us after class.

The important things to note:
– 3.3V power supply
– RX / TX pairing between the Arduino Micro and XBee on setup
– Serial can be used to communicate with the XBee for testing purposes, but ‘Serial1’ must be used in the Arduino code to denote communication over the hardware RX/TX pins (0 and 1); see the passthrough sketch below.
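As a reference for that last point, here’s a minimal passthrough sketch (assuming 9600 baud and the XBee wired to the Micro’s pins 0 and 1):

```cpp
// XBee <-> USB serial passthrough for an Arduino Micro.
// Serial  = USB connection to the computer (Serial Monitor)
// Serial1 = hardware UART on pins 0 (RX) and 1 (TX), wired to the XBee
void setup() {
  Serial.begin(9600);
  Serial1.begin(9600);
}

void loop() {
  if (Serial1.available()) Serial.write(Serial1.read()); // XBee -> monitor
  if (Serial.available())  Serial1.write(Serial.read()); // monitor -> XBee
}
```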

Day 2:

Our task was to create a device that interacts with a metronome signal, and I was inspired to use several servos left over from last semester to create some sort of rabbit. I wanted the project to be relatively simple but demonstrate motion and have an entertaining appearance. I first tested out the servos with both a 9V power supply and a USB power supply. Supplying the four small devices that would become the legs worked as expected with some demo code provided through Arduino’s IDE (Servo Sweep and Physical Pixel), although I discovered the next day that the charge remaining in my 9V battery seemed to have diminished overnight. The weight of the device on the breadboard began to concern me as I thought about how it would be supported if it were to become an enclosed object.

When it came time to design lasercut parts for the device, I created two types of legs to test (forelegs and back legs), with multiple copies cut to allow them to be laminated for thickness. The servos were mounted beneath the device’s breadboard with Weldbond and a second panel and oriented as appropriate for the four-legged mammal. The legs were mounted to the servos with the kit attachments, some glue and screws; without the screws, the legs were unable to support the weight of the device.


Day 3:

The code is designed to operate in two different states: a wide stance for a “High” signal and a narrow stance for a “Low” signal, resembling the posture of a sprinting rabbit. The orientation of the servos meant that, for example, the left back leg had to be set to a different angle than the right back leg in order for the pair to appear symmetrical; in the code, the angles for the respective servos sit at opposite ends of the 0–180 range. Naturally, having the legs span the servos’ whole range looked peculiar and caused some technical hiccups with the leg behavior, so I reduced it to an appropriate span. The stance logic boiled down to something like the sketch below.
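In the sketch below, pin numbers and the exact offsets are placeholders; the mirrored angles are the point:

```cpp
#include <Servo.h>

Servo backLeft, backRight;
const int CENTER = 90; // neutral leg position
const int WIDE   = 35; // offset for the "High" (wide) stance
const int NARROW = 10; // offset for the "Low" (narrow) stance

void setup() {
  backLeft.attach(5);  // placeholder pins
  backRight.attach(6);
}

// Mirrored servos: the left leg swings up the 0-180 range while the
// right swings down, so the stance stays symmetrical.
void setStance(bool high) {
  int offset = high ? WIDE : NARROW;
  backLeft.write(CENTER + offset);
  backRight.write(CENTER - offset);
}

void loop() {
  // e.g. toggle the stance whenever the metronome signal flips
}
```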

The back legs seem to perform more effectively than the forelegs, which often get caught at awkward angles, causing disruptive behavior. This may be due to my selection of servos for the forelegs, which are of a different, cheaper make than the hind ones supplied by last year’s Creation and Computation class kits.

[Image: Fritzing circuit diagram]

Day 4: 

Along with several other classmates, I tested my device’s functionality with the radio signals of other XBees. It took a bit of troubleshooting, but it turns out the device works fine, if occasionally finicky about receiving data packets. This could be due to the current draw of the servos competing with the XBee; given some of my past experiences with servos, I’m fairly surprised how well the circuit has worked up to this point.

I did end up getting a replacement battery, which worked just once with the remote signal but afterwards lacked the capacity to support the circuit. Perhaps I will look into using 12V instead of 9V in the future.