Vision Bot

This is a Twitter bot that posts randomly generated Vision quotes and interacts with other people saying 'what is x if not y persevering'.

The bot is available on Twitter @BotPersevering.

This is a silly bot intended to parody a line of dialogue that attracted a great deal of attention, and it is just a fun piece of code.

Bot Structure

The bot has two functions:

  1. A posting function that posts a quote in the format of 'What is x, if not y persevering?' at the top of each hour, and
  2. An interaction function that likes and retweets any tweets it finds matching that format every five minutes.

Each function is a self-contained programme that authenticates to the Twitter API independently, using the same API tokens. The two functions run in separate Docker containers that are spun up with Docker Compose. Both Docker images are based on the official Python image.
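For illustration, a docker-compose.yml along these lines would spin up the two containers. The service names and build contexts below are assumptions for the sketch, not necessarily the repository's actual file:

```yaml
# Hypothetical sketch of the two-container set-up; the repository's
# actual docker-compose.yml may differ in detail.
version: "3"
services:
  posting:
    build: ./posting          # posting function, based on the Python image
    env_file: ./posting/.env
    restart: unless-stopped
  interaction:
    build: ./interaction      # interaction function, based on the Python image
    env_file: ./interaction/.env
    restart: unless-stopped
```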

Posting Function

The posting function is fairly straightforward. The bot has set lists of phrases categorised by number of syllables. These lists were built using various natural language libraries in Python, and may not be the most reliable dataset.
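As a rough illustration of how such lists can be built, the sketch below buckets phrases by syllable count using NLTK's CMU Pronouncing Dictionary. This is an assumed approach for illustration, not necessarily the exact method used to generate the bot's lists:

```python
from collections import defaultdict

import nltk
from nltk.corpus import cmudict

nltk.download("cmudict", quiet=True)
pronunciations = cmudict.dict()

def syllable_count(phrase):
    """Count syllables by counting vowel phonemes across the phrase's words."""
    total = 0
    for word in phrase.lower().split():
        if word not in pronunciations:
            return None  # skip phrases containing unknown words
        # In CMU notation, vowel phonemes end in a stress digit (0, 1 or 2).
        total += sum(1 for phoneme in pronunciations[word][0] if phoneme[-1].isdigit())
    return total

# Bucket phrases by syllable count, mirroring the bot's categorised lists.
buckets = defaultdict(list)
for phrase in ["grief", "love", "a currency unit", "perseverance"]:
    count = syllable_count(phrase)
    if count is not None:
        buckets[count].append(phrase)
```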

At the top of its posting cycle, the bot randomly determines how many syllables its phrases will have; it uses phrases with the same number of syllables on either side so that the quote scans well. It then randomly selects the first phrase from the corresponding word list.

After it selects the first phrase, it selects the second phrase using the following algorithm (a code sketch follows the list):

  1. It selects a random phrase,
  2. If the phrase is the same as the first, it discards it and selects a new one,
  3. It checks the phrase's similarity score to the first phrase, using some rudimentary natural language processing,
  4. If the similarity is below 66% (roughly the similarity score between 'grief' and 'love', the pairing in the original quote), it discards the phrase and selects a new one, trying up to five times,
  5. If, after five attempts, it is still unable to find a phrase of adequate similarity, it disregards the similarity scores,
  6. Once it has a phrase that meets the similarity requirement, or it has tried five times and failed, it posts the two phrases in the format 'What is <phrase 1> if not <phrase 2> persevering?', and
  7. After posting the tweet, it logs the post and waits until the next hour.
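A minimal sketch of that selection loop follows. It assumes spaCy's word-vector similarity stands in for the 'rudimentary natural language processing'; the bot's actual similarity measure, threshold handling, and phrase lists may differ:

```python
import random

import spacy

# Assumed similarity backend for illustration; requires the medium English
# model (`python -m spacy download en_core_web_md`).
nlp = spacy.load("en_core_web_md")

SIMILARITY_THRESHOLD = 0.66  # roughly the 'grief'/'love' similarity score
MAX_ATTEMPTS = 5

def pick_second_phrase(phrases, first):
    """Assumes `phrases` contains at least two distinct entries."""
    attempts = 0
    while True:
        candidate = random.choice(phrases)
        if candidate == first:
            continue  # step 2: never repeat the first phrase
        attempts += 1
        similar = nlp(first).similarity(nlp(candidate)) >= SIMILARITY_THRESHOLD
        if similar or attempts >= MAX_ATTEMPTS:
            return candidate  # step 5: after five tries, take it regardless

phrases = ["grief", "love", "hope", "doubt"]  # placeholder word list
first = random.choice(phrases)
second = pick_second_phrase(phrases, first)
print(f"What is {first}, if not {second} persevering?")
```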

Interaction Function

The interaction function is much simpler. Every five minutes, it searches the Twitter timeline for tweets that fit the format 'What is <phrase 1> if not <phrase 2> persevering?'. It compiles a list of all of these tweets, and likes and retweets every one posted after the last tweet it interacted with. It then logs the most recent tweet as the new starting point for the next pass.
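As a sketch of a single interaction pass, assuming Tweepy 4's wrappers for the v1.1 endpoints and a hypothetical search query (the bot's real query and the way it persists the last tweet ID are not shown):

```python
import os

import tweepy

# Credentials come from the .env variables described below.
auth = tweepy.OAuthHandler(os.environ["API_TOKEN"], os.environ["API_KEY_SECRET"])
auth.set_access_token(os.environ["ACCESS_TOKEN"], os.environ["ACCESS_TOKEN_SECRET"])
api = tweepy.API(auth)

def interact_once(since_id):
    """Like and retweet matching tweets newer than since_id; return the new since_id."""
    kwargs = {"q": '"what is" "if not" "persevering" -filter:retweets', "count": 100}
    if since_id is not None:
        kwargs["since_id"] = since_id
    tweets = api.search_tweets(**kwargs)
    for tweet in tweets:
        try:
            api.create_favorite(tweet.id)  # like
            api.retweet(tweet.id)
        except tweepy.TweepyException:
            pass  # e.g. already liked or retweeted
    # Remember the newest tweet as the starting point for the next pass.
    return max((tweet.id for tweet in tweets), default=since_id)
```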

development Directory

This directory contains the code and word lists I used to generate the bot's word lists. It is all fairly rudimentary code that spits out text files listing words categorised by syllable count. These lists formed the basis for the word lists the bot uses. I deleted some of the less funny entries, like a lot of phrases ending in '... currency unit', amongst others. None of this has any bearing on the actual functioning of the bot; I have left it here on the off chance anyone needs to re-generate the word lists.

To re-generate the word lists, simply run the Python scripts in this directory. They have no connection to the Docker containers or any of the other set-up for the bot.

Setup

Running another instance of the bot is very straightforward. As the entire bot is Dockerised and uses Docker Compose, all you need to do is:

  1. Clone the repository,
  2. Create a .env file in the main directory, as well as in the /interaction and /posting directories, containing the various API keys (using the template below),
  3. Install and run Docker and Docker Compose, and
  4. Navigate to the folder containing the docker-compose.yml file and use the command sudo docker-compose up -d.
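Concretely, the whole set-up looks something like this, where <repository-url> stands in for this repository's clone URL and each .env copy still needs its keys filled in:

```sh
# Assumes Docker and Docker Compose are already installed and running.
git clone <repository-url> vision-bot
cd vision-bot
cp .env.example .env               # fill in your API keys in each copy
cp .env.example interaction/.env
cp .env.example posting/.env
sudo docker-compose up -d
```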

.env File

I have kept all of my API keys private in .env files that Git ignores when pushing to the repository. You will need to create your own .env files to store these values: one in the same directory as the docker-compose.yml file, and one in each of the /interaction and /posting directories.

N.B.: Make sure you use the same variable names as below.

.env file(s):

```
BOT_VERSION=3.1
BEARER_TOKEN=
ACCESS_TOKEN=
ACCESS_TOKEN_SECRET=
API_TOKEN=
API_KEY_SECRET=
```

Version History

This is the third version of the bot. I made the first two before I learnt how to use Git or Docker, so my version control for those is messy and all over the shop.

The third version added the natural language processing, massively expanded the word lists, and generally did everything I realised the bot could do once I discovered Python's natural language processing libraries.


My thanks to Alan Jackson for his valuable suggestions for this project.