Cronjob Redux

After days of trying to figure this out, I finally got the video to upload via a cronjob.

There were 2 issues.

Issue the first

Finally found the issue. The original script from the YouTube developers guide had this:

CLIENT_SECRETS_FILE = "client_secrets.json"

And then, a couple of lines later, this (the bit that fills the resolved path into the warning message shown below):

% os.path.abspath(os.path.join(os.path.dirname(__file__), CLIENT_SECRETS_FILE))

When crontab ran the script, it ran from a path that wasn’t where the CLIENT_SECRETS_FILE lived, and so this message would be displayed:

WARNING: Please configure OAuth 2.0
To make this sample run you will need to populate the client_secrets.json file
found at:

  %s

with information from the Developers Console
https://console.developers.google.com/

For more information about the client_secrets.json file format, please visit:
https://developers.google.com/api-client-library/python/guide/aaa_client_secrets

What I needed to do was update CLIENT_SECRETS_FILE to be the full path so that the script could always find the file.

A simple change:

CLIENT_SECRETS_FILE  = os.path.abspath(os.path.join(os.path.dirname(__file__), CLIENT_SECRETS_FILE))

Issue the second

When the create_mp4.sh script ran, it read all of the h264 files from the directory where they lived, BUT it attempted to output the mp4 file to /, which it didn’t have permission to write to.

This was failing silently (I’m still not sure how I could have caught the error). Since there was no mp4 file to upload, the upload script was failing too (though it was true that the location of the CLIENT_SECRETS_FILE was also an issue).
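
In hindsight, one way to catch something like this might have been to check the exit status of the generated script in run_script.sh and send its output to the log instead of letting it disappear. A rough sketch (untested, paths matching the rest of the post):

# run the generated script and capture its output in the log
if ! /home/pi/Documents/python_projects/create_mp4.sh >> /home/pi/Documents/python_projects/log.txt 2>&1; then
    echo "MP4Box failed: $(date)" >> /home/pi/Documents/python_projects/log.txt
    exit 1
fi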

What I needed to do was change create_script.sh so that the MP4Box command it writes into create_mp4.sh would output the mp4 file to the proper directory. The script went from this:

(echo '#!/bin/sh'; echo -n "MP4Box"; array=($(ls ~/Documents/python_projects/*.h264)); for index in ${!array[@]}; do if [ "$index" -eq 0 ]; then echo -n " -add ${array[index]}"; else echo -n " -cat ${array[index]}"; fi; done; echo -n " hummingbird.mp4") > create_mp4.sh

To this:

(echo '#!/bin/sh'; echo -n "MP4Box"; array=($(ls ~/Documents/python_projects/*.h264)); for index in ${!array[@]}; do if [ "$index" -eq 0 ]; then echo -n " -add ${array[index]}"; else echo -n " -cat ${array[index]}"; fi; done; echo -n " /home/pi/Documents/python_projects/hummingbird.mp4") > /home/pi/Documents/python_projects/create_mp4.sh

The last bit, writing to /home/pi/Documents/python_projects/create_mp4.sh instead of just create_mp4.sh, may not be necessary, but I’m not taking any chances.

The video posted tonight is the first one that was completely automatic!

Now … if I could just figure out how to automatically fill up my hummingbird feeder.

Cronjob … Finally

I’ve mentioned before that I have been working on getting the hummingbird video upload automated.

Each time I thought I had it, and each time I was wrong.

For some reason I could run it from the command line without issue, but when the cronjob would try and run it … nothing.

Turns out, it was running, it just wasn’t doing anything. And that was my fault.

The file I had set up in the cronjob was called run_script.sh.

At first I was confused because the script was supposed to be writing all of its activity out to a log file. But it didn’t appear to be.

Then I noticed that the log.txt file it was writing was in the home directory. That should have been my first clue.

I kept trying to get the script to run, but then suddenly, in a blaze of glory, I realized that it was running; it just wasn’t doing anything.

And it wasn’t doing anything for the same reason that the log file was being written to the home directory.

All of the paths were relative instead of absolute, so when the script ran the command ./create_mp4.sh it looked for that script in the home directory, didn’t find it, and moved on.
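
If you’re curious where a cron job actually starts from, a quick way to check is a throwaway crontab entry that just logs the working directory every minute (purely for illustration; remove it once you’ve seen the output):

# temporary entry: log where cron jobs start from
* * * * * pwd >> /tmp/cron_pwd.txt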

The fix was simple enough, just add absolute paths and we’re golden.

That means my run_script.sh goes from this:

# Create the script that will be run
./create_script.sh
echo "Create Shell Script: $(date)" >> log.txt

# make the script that was just created executable
chmod +x /home/pi/Documents/python_projects/create_mp4.sh

# Run the script that creates the mp4 file
/home/pi/Documents/python_projects/create_mp4.sh
echo "Create MP4 Shell Script: $(date)" >> /home/pi/Documents/python_projects/log.txt

# upload video to YouTube.com
/home/pi/Documents/python_projects/upload.sh
echo "Uploaded Video to YouTube.com: $(date)" >> /home/pi/Documents/python_projects/log.txt

# Next we remove the video files locally
rm /home/pi/Documents/python_projects/*.h264
echo "removed h264 files: $(date)" >> /home/pi/Documents/python_projects/log.txt

rm /home/pi/Documents/python_projects/*.mp4
echo "removed mp4 file: $(date)" >> /home/pi/Documents/python_projects/log.txt

To this:

# change to the directory with all of the files
cd /home/pi/Documents/python_projects/

# Create the script that will be run
/home/pi/Documents/python_projects/create_script.sh
echo "Create Shell Script: $(date)" >> /home/pi/Documents/python_projects/log.txt

# make the script that was just created executable
chmod +x /home/pi/Documents/python_projects/create_mp4.sh

# Run the script that creates the mp4 file
/home/pi/Documents/python_projects/create_mp4.sh
echo "Create MP4 Shell Script: $(date)" >> /home/pi/Documents/python_projects/log.txt

# upload video to YouTube.com
/home/pi/Documents/python_projects/upload.sh
echo "Uploaded Video to YouTube.com: $(date)" >> /home/pi/Documents/python_projects/log.txt

# Next we remove the video files locally
rm /home/pi/Documents/python_projects/*.h264
echo "removed h264 files: $(date)" >> /home/pi/Documents/python_projects/log.txt

rm /home/pi/Documents/python_projects/*.mp4
echo "removed mp4 file: $(date)" >> /home/pi/Documents/python_projects/log.txt

I made this change and then started getting an error about not being able to access a json file necessary for the upload to YouTube. Sigh.

Then, while searching for what directory the cronjob was running from, I found this very simple idea. The response was: why not just change to the directory you want? 🤦‍♂️

I added the cd to the top of the file:

# change to the directory with all of the files
cd /home/pi/Documents/python_projects/

Anyway, now it works. Finally!

Tomorrow will be the first time (unless of course something else goes wrong) that the entire process will be automated. Super pumped!

SSL … Finally!

I’ve been futzing around with SSL on this site since last December. I’ve had about 4 attempts and it just never seemed to work.

Earlier this evening I was thinking about getting a second Linode just to get a fresh start. I was this close to getting it when I thought, what the hell, let me try to work it out one more time.

And this time it actually worked.

I’m not really sure what I did differently, but using this site seemed to make all of the difference.

The only other thing I had to do was make a change in the WordPress settings (from http to https) and enable the Really Simple SSL plugin, and it finally worked.

I even got an ‘A’ from SSL Labs!

Again, not really sure why this seemed so hard and took so long.

I guess sometimes you just have to try over and over and over again.

Hummingbird Video Capture

I previously wrote about how I placed my Raspberry Pi above my hummingbird feeder and added a camera to it to capture video.

Well, the day has finally come where I’ve been able to put my video of it up on YouTube! It’s totally silly, but it was satisfying getting it out there for everyone to watch and see.

Hummingbird Video Capture: Addendum

The code used to generate the mp4 file hasn’t changed (really). I did do a couple of things to make it a little easier though.

I have 2 scripts that generate the file, copy it from the Pi to my MacBook Pro, and then clean up:

Script 1 is called create_script.sh and looks like this:

(echo '#!/bin/sh'; echo -n "MP4Box"; array=($(ls *.h264)); for index in ${!array[@]}; do if [ "$index" -eq 0 ]; then echo -n " -add ${array[index]}"; else echo -n " -cat ${array[index]}"; fi; done; echo -n " hummingbird.mp4") > create_mp4.sh && chmod +x create_mp4.sh

This creates a script called create_mp4.sh and makes it executable.
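
With a couple of h264 files in the directory (the file names here are just placeholders), the generated create_mp4.sh ends up looking something like this:

#!/bin/sh
MP4Box -add video_1.h264 -cat video_2.h264 hummingbird.mp4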

That script is called by another script, run_script.sh, which looks like this:

./create_script.sh
./create_mp4.sh

scp hummingbird.mp4 ryan@192.168.1.209:/Users/ryan/Desktop/

# Next we remove the video files locally

rm *.h264
rm *.mp4

It runs create_script.sh, which creates create_mp4.sh, and then runs that.

Then I use the scp command to copy the mp4 file that was just created over to my MacBook Pro.

As a last bit of housekeeping I clean up the video files.

I’ve added this run_script.sh to a cron job that is scheduled to run every night at midnight.
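
For reference, a crontab entry for a midnight run would look something like this (assuming run_script.sh lives in the project directory):

# run the whole video pipeline every night at midnight
0 0 * * * /home/pi/Documents/python_projects/run_script.sh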

We’ll see how well it runs tomorrow night!

How to pick a team to root for (when the Dodgers aren’t playing)

I’ve been thinking a bit about how to decide which team to root for. Mostly I just want to stay logically consistent with the way I choose to root for a team (when the Dodgers aren’t playing obviously).

After much thought (and sketches on my iPad) I’ve come up with this table to help me determine who to root for:

Opp 1 / Opp 2 | NL West | NL Central | NL East | AL West | AL Central | AL East
NL West | Root for team that helps the Dodgers | NL Central Team | NL East Team | NL West Team, unless it hurts the Dodgers | NL West Team, unless it hurts the Dodgers | NL West Team, unless it hurts the Dodgers
NL Central | NL Central Team | Root for underdog | NL Central Team | NL Central Team | NL Central Team | NL Central Team
NL East | NL East Team | NL Central Team | Root for underdog | NL East Team | NL East Team | NL East Team
AL West | NL West Team, unless it hurts the Dodgers | NL Central Team | NL East Team | The Angels over the A’s over the Mariners over the Rangers over the Astros | AL West Team | AL West Team
AL Central | NL West Team, unless it hurts the Dodgers | NL Central Team | NL East Team | AL West Team | Root for underdog | AL Central Team
AL East | NL West Team, unless it hurts the Dodgers | NL Central Team | NL East Team | AL West Team | AL Central Team | Root for underdog (unless it’s the Yankees)

The basic rule is: root for the team that helps the Dodgers’ playoff chances, then National League over American League, and finally West over Central over East (from a division perspective).

There were a couple of cool sketches I made, on real paper and my iPad. Turns out, sometimes you really need to think about a thing before you write it down and commit to it.

Of course, this is all subject to change depending on the impact any game would have on the Dodgers.

ITFDB Demo

Last Wednesday, if you had asked what I had planned for Easter, I would have said something like, “Going to hide some eggs for my daughter even though she knows the Easter bunny isn’t real.”

Then suddenly my wife and I were planning on entertaining 11 family members. My, how things change!

Since I was going to have family over, some of whom are Giants fans, I wanted to show them the ITFDB program I have set up with my Pi.

The only problem was that they would be over at 10am and leave by 2pm, while the game didn’t start until 5:37pm (thanks, ESPN).

To help demonstrate the script I wrote a demo script to display a message on the Pi and play the Vin Scully mp3.

The code was simple enough:

from sense_hat import SenseHat
import os


def main():
    sense = SenseHat()
    message = '#ITFDB!!! The Dodgers will be playing San Francisco at 5:37pm tonight!'
    sense.show_message(message, scroll_speed=0.05)
    os.system("omxplayer -b /home/pi/Documents/python_projects/itfdb/dodger_baseball.mp3")


if __name__ == '__main__':
    main()

But then the question becomes, how can I easily launch the script without futzing with my laptop?

I knew that I could run a shell script from the Workflow app on my iPhone with a single action, so I wrote a simple shell script:

python3 ~/Documents/python_projects/itfdb/demo.py

Which was called itfdb_demo.sh

And made it executable

chmod u+x itfdb_demo.sh

Finally, I created a Workflow which has only one action, Run Script over SSH, and added it to my home screen so that with a simple tap I could demo the results.
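
Under the hood that action just runs the script on the Pi over SSH, so the equivalent from a terminal would be something like this (the Pi’s address and the script’s location are placeholders):

# roughly what the Workflow action does: run the demo script on the Pi over SSH
ssh pi@<pi-address> '~/itfdb_demo.sh'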

The Workflow looks like this:

Nothing too fancy, but I was able to reliably and easily demonstrate what I had done. And it was pretty freaking cool!