Scripting on Raspberry Pis and automating tasks on my servers is an amusing hobby of mine. It makes my life easier, adds robustness, and gives me regular status updates from my servers so I can keep up with all my projects. Many of my scripts could be replaced by off-the-shelf services, but I enjoy building my own solutions that have no reliance on the outside world and run efficiently on my hardware.
I made a simple Bash script that uses curl to check whether all my servers are online and dumps the results into a formatted HTML page. I host the page on my internal network so I can quickly check the current status of my many nodes and services. If any server goes down, it also sends me an email.
#HOST NAME, ADDRESS, SEARCH TERMINOLOGY, SEND EMAIL
Lekatsas.us,https://www.lekatsas.us,Lekatsas,1
Cloud.Lekatsas.us,https://cloud.lekatsas.us,Lekatsas,1
Photos.Lekatsas.us,https://photos.lekatsas.us,PhotoPrism,0
Rick.Lekatsas.us,https://rick.lekatsas.us,homepage.mp4,1
,,,
Smoky,http://192.168.1.4,Temperatures,1
Stentor,http://192.168.20.17,Stentor,1
Chronos,http://192.168.20.12,Chronos,1
Hermes,http://hermes.lekatsas.us,Hermes,1
#!/bin/bash
#config file location
hosts="/etc/cron.scripts/host_list.ini"
#log file for the html file displaying the server status
log_file=/var/www/nginx/uptime.html
#declare the variables
name=()
address=()
search=()
email=()
email_body="" #email body
result=1 #start with a failing value; each curl/grep check below overwrites it
#parse the ini file to collect host names to check for
while IFS="," read -r rec_column1 rec_column2 rec_column3 rec_column4
do
name+=("$rec_column1")
address+=("$rec_column2")
search+=("$rec_column3")
email+=("$rec_column4")
done < <(tail -n +2 "$hosts") #tail drops the first line of the ini, which describes the columns
#Purge log file
: > $log_file
#write html auto refresh code
echo "<head>" >> $log_file
echo "<meta http-equiv=\"refresh\" content=\"60\">" >> $log_file
echo "</head>" >> $log_file
#write the date and time in the first line using the 'date' command
current_date=$(date +%D)
current_time=$(date +%T)
echo "<h3>$current_time $current_date</h3>" >> $log_file
echo "<table>" >> $log_file
# NOW START CHECKING ALL THE SERVERS
for ((i=0; i<${#name[@]}; i++))
do
#if the variable is empty, print a line break in HTML
if [ -z "${name[i]}" ];then
#add an extra break for formatting
echo "<tr><td><br></td><td><br></td></tr>" >> $log_file
#if the variable has content, curl the host and save the output to HTML/log file
else
echo "Checking ${name[i]}"
#if the variable 'search' is empty just blindly curl and look for any response
if [ -z "${search[i]}" ];then
curl -sfkL --connect-timeout 0.5 --max-time 1 --retry 5 --retry-delay 0 --retry-max-time 5 --ipv4 "${address[i]}" > /dev/null
# $? = Exit status of last task
result=$?
echo -e "curl return code: $result \n"
#otherwise, curl and look for a unique search term inside the HTML code to confirm it didn't curl a 404
else
echo "Searching for ${search[i]}"
curl -sfkL --connect-timeout 0.5 --max-time 1 --retry 5 --retry-delay 0 --retry-max-time 5 --ipv4 "${address[i]}" | grep -q "${search[i]}"
result=$?
echo -e "grep return code $result \n"
fi
#a non-zero result means the check failed
if [ "$result" -ne 0 ];then
echo "<tr><td>${name[i]}</td><td>🚨<mark>**DOWN**</mark>🚨</td></tr>" >> $log_file
#Send the alert email only near the top of the hour AND only if the host is flagged for notification
#i.e. if the current minute is less than 5 and this host's email flag is 1
if [ "$(date +%M)" -lt 5 ] && [ "${email[i]}" -eq 1 ] ; then
email_body+="$(date +%F) $(date +%T) - ${name[i]} is 🚨⚠️🚨 DOWN 🚨⚠️🚨\n"
fi
else
echo "<tr><td><a href=\"${address[i]}\">${name[i]}</a></td><td>🆗👌</td></tr>" >> $log_file
fi
fi
done
echo "</table>" >> $log_file
#look in the html file, if any of the servers are DOWN add a gif to grab attention
if grep -q DOWN $log_file; then
echo "<img src=\"web_images/server_down_yipyip.gif\" width='250' height='188'>" >> $log_file
fi
#If any server was down, email_body is non-empty, so send the email
if [ -n "$email_body" ]; then
echo -e "Subject: Host DOWN - Uptime Checking Script\n\n$email_body" | msmtp -v --account=lekatsas.us email@lekatsas.us
fi
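The config path under /etc/cron.scripts/ suggests the script runs from cron. A crontab entry along these lines would do it; the script filename and the five-minute schedule here are assumptions, not taken from the original:

```shell
# Hypothetical crontab entry: run the uptime check every 5 minutes.
# With this cadence, the "minute < 5" test in the script matches exactly
# one run per hour, so a down host emails at most once an hour.
*/5 * * * * /etc/cron.scripts/uptime_check.sh
```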
The DS18B20 is a cheap, popular, high-precision digital temperature sensor from Maxim (formerly Dallas Semiconductor). It uses the 1-Wire data interface, and the best part is that you can hang many sensors off the same bus! Because I'm stubborn, I wanted to get it working in Bash, so below is my silly code that grabs the raw data from the device, cleans it up with several stacked commands, then pushes it over MQTT to my Home Assistant server for display.
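For context, here is roughly what a raw read of the sensor file looks like; the sensor ID matches the script, but the register values are illustrative:

```shell
# Example raw read of a DS18B20 via the kernel's w1-therm sysfs interface:
#   $ cat /sys/bus/w1/devices/28-0000075ff73a/w1_slave
#   4f 01 4b 46 7f ff 0c 10 2d : crc=4f YES
#   4f 01 4b 46 7f ff 0c 10 2d t=23187
# The "t=23187" on the second line is the temperature in thousandths of a
# degree C (23.187 °C), which is what the pipeline extracts and reformats.
```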
#!/bin/bash
#MQTT Variables
# The values are sent to Home Assistant via MQTT
SERVER='192.168.200.5'
USERNAME='stentor'
PASSWORD='PASSWORD'
TOPIC_AMBIENT='stentor/ambient_temp'
# Main loop
while true; do
#first sed collects the second line of the sensor output
#second sed keeps everything after the = symbol
#printf zero-pads the number to at least 4 total digits (digits, not characters; the minus sign is ignored)
#third sed inserts a decimal point three places in from the right
ambient_temp=$(sed -n '2p' /sys/bus/w1/devices/28-0000075ff73a/w1_slave | \
sed 's/^.*=//' | \
xargs printf '%.4d' | \
sed 's/...$/.&/')
echo "Ambient Temperature is $ambient_temp"
#sometimes a read fails (possibly on the 1-Wire interface) and returns exactly zero, pushing bad data to HASS
#if the temperature reads zero degrees centigrade, skip the push to HASS
if [ "$ambient_temp" != "0.000" ]; then
mosquitto_pub -h "$SERVER" -u "$USERNAME" -P "$PASSWORD" -t "$TOPIC_AMBIENT" -m "{\"value\":$ambient_temp}" -r
fi
# Sleep for 30 seconds before checking again
sleep 30
done
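Since several DS18B20s can hang off one bus, the hard-coded sensor path above generalizes naturally: every sensor appears under /sys/bus/w1/devices/ with a 28- family prefix. Here is a sketch, assuming the standard w1-gpio/w1-therm sysfs layout, that applies the same cleanup pipeline to each sensor found:

```shell
#!/bin/bash
# Sketch: read every DS18B20 on the bus with the same cleanup pipeline.
# The base path and 28-* family prefix are the standard sysfs conventions.
W1_BASE="${W1_BASE:-/sys/bus/w1/devices}"

read_ds18b20() {
    sed -n '2p' "$1" |     #second line holds "... t=23187"
    sed 's/^.*=//' |       #keep only the number after the = sign
    xargs printf '%.4d' |  #zero-pad to at least 4 digits
    sed 's/...$/.&/'       #insert a decimal three places from the right
}

for dev in "$W1_BASE"/28-*/w1_slave; do
    [ -e "$dev" ] || continue  #no sensors present
    echo "$(dirname "$dev"): $(read_ds18b20 "$dev")"
done
```

Each reading could then be published to its own MQTT topic derived from the sensor ID.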
I keep each of my Docker services in its own folder with a compose file inside, and all those folders live in one clean directory for easy backup, so I made a simple script that starts all the services and updates them in a single execution.
#!/bin/bash
logger "Starting Docker compose services"
services=(/home/services/docker/*) # This creates an array of the full paths to all service subdirs
for service in "${services[@]}"
do
docker compose -f "$service/docker-compose.yml" pull && docker compose -f "$service/docker-compose.yml" up -d --always-recreate-deps --remove-orphans &
done
#wait for all the background commands to finish
wait
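For reference, the layout the script assumes is one folder per service, each holding its own compose file. The service names below are invented, and the @reboot crontab line is just one possible way to launch the script at startup (the script path is an assumption):

```shell
# Hypothetical layout under /home/services/docker/ (service names invented):
#   /home/services/docker/nextcloud/docker-compose.yml
#   /home/services/docker/uptime-kuma/docker-compose.yml
# One way to bring everything up at boot, via cron (script path assumed):
#   @reboot /usr/local/bin/start_docker_services.sh
```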