Merge branch 'development' into beta

Michael Stanclift 2020-07-16 22:22:06 -05:00
commit d773315d5d
5 changed files with 271 additions and 58 deletions

View File

@ -14,8 +14,8 @@ Download the latest release from [GitHub](https://github.com/vmstan/gravity-sync
```bash
cd ~
wget https://github.com/vmstan/gravity-sync/archive/v2.1.7.zip
unzip v2.1.7.zip -d gravity-sync
cd gravity-sync
```
@ -169,6 +169,11 @@ The `./gravity-sync.sh config` function will attempt to ping the remote host to
Default setting in Gravity Sync is 0, change to 1 to skip this network test.
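For reference, a minimal sketch of the override in `gravity-sync.conf` (assuming the default `~/gravity-sync` install path):

```bash
# ~/gravity-sync/gravity-sync.conf
PING_AVOID='1' # skip the PING reachability test during config and sync
```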
#### `ROOT_CHECK_AVOID=''`
At execution, Gravity Sync will check that it is running under its own dedicated user (not as root), but this check is unnecessary for container deployments.
Default setting in Gravity Sync is 0, change to 1 to skip this root user test.
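As with the other advanced options, a sketch of the corresponding override in `gravity-sync.conf`:

```bash
ROOT_CHECK_AVOID='1' # skip the root user check (useful for container deployments)
```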
#### `BACKUP_RETAIN=''`
The `./gravity-sync.sh backup` function will retain a defined number of days' worth of `gravity.db` and `custom.list` backups.
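For example, to keep two weeks of backups instead of the default seven days, a sketch of the override would be:

```bash
BACKUP_RETAIN='14' # retain 14 days of gravity.db and custom.list backups
```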
@ -216,9 +221,23 @@ If you prefer to still use cron but modify your settings by hand, using the entr
```bash
crontab -e
*/15 * * * * /bin/bash /home/USER/gravity-sync/gravity-sync.sh > /home/USER/gravity-sync/gravity-sync.cron
0 23 * * * /bin/bash /home/USER/gravity-sync/gravity-sync.sh backup >/dev/null 2>&1
```
### Automating Automation
To deploy the automation non-interactively, you can call the `automate` function with two parameters:
- first, the sync interval in minutes [0-30],
- second, the hour of day to run the backup [0-24].
A value of 0 disables the corresponding cron entry.
For example:
`./gravity-sync.sh automate 15 23`
will configure the sync function to run every 15 minutes and a backup at 23:00; the resulting cron entries are sketched below.
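Running that command should leave crontab entries roughly equivalent to the manual example above (paths assume Gravity Sync lives in the executing user's home directory):

```bash
*/15 * * * * /bin/bash /home/USER/gravity-sync/gravity-sync.sh > /home/USER/gravity-sync/gravity-sync.cron
0 23 * * * /bin/bash /home/USER/gravity-sync/gravity-sync.sh backup >/dev/null 2>&1
```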
## Reference Architectures
The designation of primary and secondary is purely at your discretion. It doesn't matter if you're using an HA process like keepalived to present a single DNS IP address to clients, or handing out two DNS resolvers via DHCP. Generally it is expected that the two (or more) Pi-hole(s) will be at the same physical location, or at least on the same internal networks. It should also be possible to replicate to a secondary Pi-hole across networks, either over a VPN or the open Internet, with the appropriate firewall/NAT configuration.

View File

@ -28,6 +28,22 @@ The `./gravity-sync.sh update` and `version` functions will look for the `dbclie
#### 2.1.5
Skipping a few digits because what does it really matter?
- Implements a new beta branch, and with it a new `./gravity-sync.sh beta` function to enable it. This will hopefully allow new features and such to be added for test users who can adopt them and provide feedback before rolling out to the main update branch.
- Uses new SQLITE3 backup methodology introduced in 2.1, for all push/pull sync operations.
- `./gravity-sync.sh restore` lets you select a different `gravity.db` and `custom.list` for restoration.
- One new Star Trek reference.
- `./gravity-sync.sh restore` now shows recent complete Backup executions.
#### 2.1.6
- Adds prompts during `./gravity-sync.sh configure` to allow custom SSH port and enable PING avoidance.
- Adds `ROOT_CHECK_AVOID` variable to advanced configuration options, to help facilitate running Gravity Sync with container installations of Pi-hole. (PR [#64](https://github.com/vmstan/gravity-sync/pull/64))
- Adds the ability to automate automation. :mind_blown_emoji: Please see the [ADVANCED.md](https://github.com/vmstan/gravity-sync/blob/master/ADVANCED.md) document for more information. (PR [#64](https://github.com/vmstan/gravity-sync/pull/64))
(Thanks to [@fbourqui](https://github.com/fbourqui) for their contributions to this release.)
#### 2.1.7
- Git Rebase
## 2.0
### The Smart Release

View File

@ -1 +1 @@
2.1.7

View File

@ -33,5 +33,6 @@ REMOTE_PASS=''
# SKIP_CUSTOM=''
# DATE_OUTPUT=''
# PING_AVOID=''
# ROOT_CHECK_AVOID=''
# BACKUP_RETAIN=''

View File

@ -3,7 +3,7 @@ SCRIPT_START=$SECONDS
# GRAVITY SYNC BY VMSTAN #####################
PROGRAM='Gravity Sync'
VERSION='2.1.7'
# Execute from the home folder of the user who owns it (ex: 'cd ~/gravity-sync')
# For documentation or downloading updates visit https://github.com/vmstan/gravity-sync
@ -31,6 +31,7 @@ VERIFY_PASS='0' # replace in gravity-sync.conf to overwrite
SKIP_CUSTOM='0' # replace in gravity-sync.conf to overwrite
DATE_OUTPUT='0' # replace in gravity-sync.conf to overwrite
PING_AVOID='0' # replace in gravity-sync.conf to overwrite
ROOT_CHECK_AVOID='0' # replace in gravity-sync.conf to overwrite
# Backup Customization
BACKUP_RETAIN='7' # replace in gravity-sync.conf to overwrite
@ -108,6 +109,9 @@ function update_gs {
if [ -f "$HOME/${LOCAL_FOLDR}/dev" ] if [ -f "$HOME/${LOCAL_FOLDR}/dev" ]
then then
BRANCH='development' BRANCH='development'
elif [ -f "$HOME/${LOCAL_FOLDR}/beta" ]
then
BRANCH='beta'
else
BRANCH='master'
fi
@ -299,12 +303,14 @@ function pull_gs_reload {
## Pull Function
function pull_gs {
previous_md5
md5_compare
backup_settime
pull_gs_grav
pull_gs_cust
pull_gs_reload
md5_recheck
logs_export
exit_withchange
@ -401,7 +407,9 @@ function push_gs_reload {
## Push Function
function push_gs {
previous_md5
md5_compare
backup_settime
intent_validate
@ -409,15 +417,12 @@ function push_gs {
push_gs_cust
push_gs_reload
md5_recheck
logs_export
exit_withchange
}
function previous_md5 {
if [ -f "${LOG_PATH}/${HISTORY_MD5}" ]
then
last_primaryDBMD5=$(sed "1q;d" ${LOG_PATH}/${HISTORY_MD5})
@ -430,6 +435,13 @@ function smart_gs {
last_primaryCLMD5="0"
last_secondCLMD5="0"
fi
}
## Smart Sync Function
function smart_gs {
previous_md5
md5_compare
backup_settime
PRIDBCHANGE="0"
SECDBCHANGE="0"
@ -550,24 +562,57 @@ function restore_gs {
MESSAGE="This will restore your settings on $HOSTNAME with a previous version!" MESSAGE="This will restore your settings on $HOSTNAME with a previous version!"
echo_warn echo_warn
MESSAGE="PREVIOUS BACKUPS" MESSAGE="PREVIOUS BACKUPS AVAILABLE FOR RESTORATION"
echo_info echo_info
ls $HOME/${LOCAL_FOLDR}/${BACKUP_FOLD} | grep $(date +%Y) | grep ${GRAVITY_FI} | colrm 18 ls $HOME/${LOCAL_FOLDR}/${BACKUP_FOLD} | grep $(date +%Y) | grep ${GRAVITY_FI} | colrm 18
MESSAGE="Enter the date you want to restore from" MESSAGE="Select backup date to restore ${GRAVITY_FI} from"
echo_need echo_need
read INPUT_BACKUP_DATE read INPUT_BACKUP_DATE
if [ -f $HOME/${LOCAL_FOLDR}/${BACKUP_FOLD}/${INPUT_BACKUP_DATE}-${GRAVITY_FI}.backup ] if [ -f $HOME/${LOCAL_FOLDR}/${BACKUP_FOLD}/${INPUT_BACKUP_DATE}-${GRAVITY_FI}.backup ]
then then
MESSAGE="Backup File Located" MESSAGE="Backup File Selected"
echo_info
else else
MESSAGE="Invalid Requested" MESSAGE="Invalid Request"
echo_info
exit_nochange
fi fi
if [ "$SKIP_CUSTOM" != '1' ]
then
if [ -f ${PIHOLE_DIR}/${CUSTOM_DNS} ]
then
ls $HOME/${LOCAL_FOLDR}/${BACKUP_FOLD} | grep $(date +%Y) | grep ${CUSTOM_DNS} | colrm 18
MESSAGE="Select backup date to restore ${CUSTOM_DNS} from"
echo_need
read INPUT_DNSBACKUP_DATE
if [ -f $HOME/${LOCAL_FOLDR}/${BACKUP_FOLD}/${INPUT_DNSBACKUP_DATE}-${CUSTOM_DNS}.backup ]
then
MESSAGE="Backup File Selected"
else
MESSAGE="Invalid Request"
echo_info
exit_nochange
fi
fi
fi
MESSAGE="${GRAVITY_FI} from ${INPUT_BACKUP_DATE} Selected"
echo_info
MESSAGE="${CUSTOM_DNS} from ${INPUT_DNSBACKUP_DATE} Selected"
echo_info
intent_validate
MESSAGE="Making Time Warp Calculations"
echo_info
MESSAGE="Stopping Pi-hole Services" MESSAGE="Stopping Pi-hole Services"
echo_stat echo_stat
@ -619,11 +664,11 @@ function restore_gs {
if [ "$SKIP_CUSTOM" != '1' ] if [ "$SKIP_CUSTOM" != '1' ]
then then
if [ -f $HOME/${LOCAL_FOLDR}/${BACKUP_FOLD}/${INPUT_BACKUP_DATE}-${CUSTOM_DNS}.backup ] if [ -f $HOME/${LOCAL_FOLDR}/${BACKUP_FOLD}/${INPUT_DNSBACKUP_DATE}-${CUSTOM_DNS}.backup ]
then then
MESSAGE="Restoring ${CUSTOM_DNS} on $HOSTNAME" MESSAGE="Restoring ${CUSTOM_DNS} on $HOSTNAME"
echo_stat echo_stat
sudo cp $HOME/${LOCAL_FOLDR}/${BACKUP_FOLD}/${INPUT_BACKUP_DATE}-${CUSTOM_DNS}.backup ${PIHOLE_DIR}/${CUSTOM_DNS} >/dev/null 2>&1 sudo cp $HOME/${LOCAL_FOLDR}/${BACKUP_FOLD}/${INPUT_DNSBACKUP_DATE}-${CUSTOM_DNS}.backup ${PIHOLE_DIR}/${CUSTOM_DNS} >/dev/null 2>&1
error_validate error_validate
MESSAGE="Validating Ownership on ${CUSTOM_DNS}" MESSAGE="Validating Ownership on ${CUSTOM_DNS}"
@ -705,6 +750,9 @@ function restore_gs {
## Core Logging
### Write Logs Out
function logs_export {
if [ "${TASKTYPE}" != "BACKUP" ]
then
MESSAGE="Saving File Hashes" MESSAGE="Saving File Hashes"
echo_stat echo_stat
rm -f ${LOG_PATH}/${HISTORY_MD5} rm -f ${LOG_PATH}/${HISTORY_MD5}
@ -713,6 +761,7 @@ function logs_export {
echo -e ${primaryCLMD5} >> ${LOG_PATH}/${HISTORY_MD5}
echo -e ${secondCLMD5} >> ${LOG_PATH}/${HISTORY_MD5}
error_validate
fi
MESSAGE="Logging Successful ${TASKTYPE}" MESSAGE="Logging Successful ${TASKTYPE}"
echo_stat echo_stat
@ -734,6 +783,8 @@ function logs_gs {
tail -n 7 "${LOG_PATH}/${SYNCING_LOG}" | grep PULL
echo -e "Recent Complete ${YELLOW}PUSH${NC} Executions"
tail -n 7 "${LOG_PATH}/${SYNCING_LOG}" | grep PUSH
echo -e "Recent Complete ${YELLOW}BACKUP${NC} Executions"
tail -n 7 "${LOG_PATH}/${SYNCING_LOG}" | grep BACKUP
echo -e "Recent Complete ${YELLOW}RESTORE${NC} Executions" echo -e "Recent Complete ${YELLOW}RESTORE${NC} Executions"
tail -n 7 "${LOG_PATH}/${SYNCING_LOG}" | grep RESTORE tail -n 7 "${LOG_PATH}/${SYNCING_LOG}" | grep RESTORE
echo -e "========================================================" echo -e "========================================================"
@ -1115,6 +1166,11 @@ function error_validate {
## Validate Sync Required
function md5_compare {
# last_primaryDBMD5="0"
# last_secondDBMD5="0"
# last_primaryCLMD5="0"
# last_secondCLMD5="0"
HASHMARK='0'
MESSAGE="Analyzing ${GRAVITY_FI} on ${REMOTE_HOST}"
@ -1127,7 +1183,7 @@ function md5_compare {
secondDBMD5=$(md5sum ${PIHOLE_DIR}/${GRAVITY_FI} | sed 's/\s.*$//')
error_validate
if [ "$primaryDBMD5" == "$last_primaryDBMD5" ] && [ "$secondDBMD5" == "$last_secondDBMD5" ]
then
HASHMARK=$((HASHMARK+0))
else
@ -1154,7 +1210,7 @@ function md5_compare {
secondCLMD5=$(md5sum ${PIHOLE_DIR}/${CUSTOM_DNS} | sed 's/\s.*$//')
error_validate
if [ "$primaryCLMD5" == "$last_primaryCLMD5" ] && [ "$secondCLMD5" == "$last_secondCLMD5" ]
then
# MESSAGE="${CUSTOM_DNS} Identical"
# echo_info
@ -1209,14 +1265,14 @@ function md5_recheck {
secondDBMD5=$(md5sum ${PIHOLE_DIR}/${GRAVITY_FI} | sed 's/\s.*$//')
error_validate
# if [ "$primaryDBMD5" == "$secondDBMD5" ]
# then
# HASHMARK=$((HASHMARK+0))
# else
# MESSAGE="Differenced ${GRAVITY_FI} Detected"
# echo_warn
# HASHMARK=$((HASHMARK+1))
# fi
if [ "$SKIP_CUSTOM" != '1' ] if [ "$SKIP_CUSTOM" != '1' ]
then then
@ -1236,16 +1292,16 @@ function md5_recheck {
secondCLMD5=$(md5sum ${PIHOLE_DIR}/${CUSTOM_DNS} | sed 's/\s.*$//')
error_validate
# if [ "$primaryCLMD5" == "$secondCLMD5" ]
# then
# MESSAGE="${CUSTOM_DNS} Identical"
# echo_info
# HASHMARK=$((HASHMARK+0))
# else
# MESSAGE="Differenced ${CUSTOM_DNS} Detected"
# echo_warn
# HASHMARK=$((HASHMARK+1))
# fi
else
MESSAGE="No ${CUSTOM_DNS} Detected on ${REMOTE_HOST}"
echo_info
@ -1255,7 +1311,7 @@ function md5_recheck {
then
REMOTE_CUSTOM_DNS="1"
MESSAGE="${REMOTE_HOST} has ${CUSTOM_DNS}"
# HASHMARK=$((HASHMARK+1))
echo_info
fi
MESSAGE="No ${CUSTOM_DNS} Detected on $HOSTNAME"
@ -1263,14 +1319,14 @@ function md5_recheck {
fi
fi
# if [ "$HASHMARK" != "0" ]
# then
# MESSAGE="Replication Checks Failed"
# echo_warn
# else
# MESSAGE="Replication Was Successful"
# echo_info
# fi
}
## Validate Intent
@ -1318,13 +1374,48 @@ function config_generate {
echo_stat
cp $HOME/${LOCAL_FOLDR}/${CONFIG_FILE}.example $HOME/${LOCAL_FOLDR}/${CONFIG_FILE}
error_validate
MESSAGE="Environment Customization"
echo_info
MESSAGE="Enter a custom SSH port if required (Leave blank for default '22')"
echo_need
read INPUT_SSH_PORT
INPUT_SSH_PORT="${INPUT_SSH_PORT:-22}"
if [ "${INPUT_SSH_PORT}" != "22" ]
then
MESSAGE="Saving Custom SSH Port to ${CONFIG_FILE}"
echo_stat
sed -i "/# SSH_PORT=''/c\SSH_PORT='${INPUT_SSH_PORT}'" $HOME/${LOCAL_FOLDR}/${CONFIG_FILE}
error_validate
fi
MESSAGE="Perform PING tests between Pi-holes? (Leave blank for default 'Yes')"
echo_need
read INPUT_PING_AVOID
INPUT_PING_AVOID="${INPUT_PING_AVOID:-Y}"
if [ "${INPUT_PING_AVOID}" != "Y" ]
then
MESSAGE="Saving Ping Avoidance to ${CONFIG_FILE}"
echo_stat
sed -i "/# PING_AVOID=''/c\PING_AVOID='1'" $HOME/${LOCAL_FOLDR}/${CONFIG_FILE}
error_validate
PING_AVOID=1
fi
MESSAGE="Standard Settings"
echo_info
MESSAGE="IP or DNS of Primary Pi-hole" MESSAGE="IP or DNS of Primary Pi-hole"
echo_need echo_need
read INPUT_REMOTE_HOST read INPUT_REMOTE_HOST
if [ "${PING_AVOID}" != "1" ] if [ "${PING_AVOID}" != "1" ]
then then
MESSAGE="Testing Network Connection (PING)" MESSAGE="Testing Network Connection (PING)"
echo_stat echo_stat
ping -c 3 ${INPUT_REMOTE_HOST} >/dev/null 2>&1 ping -c 3 ${INPUT_REMOTE_HOST} >/dev/null 2>&1
@ -1479,6 +1570,9 @@ function show_version {
if [ -f $HOME/${LOCAL_FOLDR}/dev ]
then
DEVVERSION="dev"
elif [ -f $HOME/${LOCAL_FOLDR}/beta ]
then
DEVVERSION="beta"
else
DEVVERSION=""
fi
@ -1500,8 +1594,6 @@ function show_version {
fi
echo_info
echo -e "========================================================"
dbclient_warning
}
function dbclient_warning {
@ -1533,9 +1625,14 @@ function task_automate {
MESSAGE="Configuring Hourly Smart Sync" MESSAGE="Configuring Hourly Smart Sync"
echo_info echo_info
MESSAGE="Sync Frequency in Minutes (1-30) or 0 to Disable" if [[ $1 =~ ^[0-9][0-9]?$ ]]
echo_need then
read INPUT_AUTO_FREQ INPUT_AUTO_FREQ=$1
else
MESSAGE="Sync Frequency in Minutes (1-30) or 0 to Disable"
echo_need
read INPUT_AUTO_FREQ
fi
if [ $INPUT_AUTO_FREQ -gt 30 ]
then
@ -1569,9 +1666,14 @@ function task_automate {
MESSAGE="Configuring Daily Backup Frequency" MESSAGE="Configuring Daily Backup Frequency"
echo_info echo_info
MESSAGE="Hour of Day to Backup (1-24) or 0 to Disable" if [[ $2 =~ ^[0-9][0-9]?$ ]]
echo_need then
read INPUT_AUTO_BACKUP INPUT_AUTO_BACKUP=$2
else
MESSAGE="Hour of Day to Backup (1-24) or 0 to Disable"
echo_need
read INPUT_AUTO_BACKUP
fi
if [ $INPUT_AUTO_BACKUP -gt 24 ]
then
@ -1634,6 +1736,17 @@ function task_devmode {
echo_stat
rm -f $HOME/${LOCAL_FOLDR}/dev
error_validate
elif [ -f $HOME/${LOCAL_FOLDR}/beta ]
then
MESSAGE="Disabling BETA"
echo_stat
rm -f $HOME/${LOCAL_FOLDR}/beta
error_validate
MESSAGE="Enabling ${TASKTYPE}"
echo_stat
touch $HOME/${LOCAL_FOLDR}/dev
error_validate
else
MESSAGE="Enabling ${TASKTYPE}"
echo_stat
@ -1647,6 +1760,42 @@ function task_devmode {
exit_withchange
}
## Devmode Task
function task_betamode {
TASKTYPE='BETA'
MESSAGE="${MESSAGE}: ${TASKTYPE} Requested"
echo_good
if [ -f $HOME/${LOCAL_FOLDR}/beta ]
then
MESSAGE="Disabling ${TASKTYPE}"
echo_stat
rm -f $HOME/${LOCAL_FOLDR}/beta
error_validate
elif [ -f $HOME/${LOCAL_FOLDR}/dev ]
then
MESSAGE="Disabling DEV"
echo_stat
rm -f $HOME/${LOCAL_FOLDR}/dev
error_validate
MESSAGE="Enabling ${TASKTYPE}"
echo_stat
touch $HOME/${LOCAL_FOLDR}/beta
error_validate
else
MESSAGE="Enabling ${TASKTYPE}"
echo_stat
touch $HOME/${LOCAL_FOLDR}/beta
error_validate
fi
MESSAGE="Run UPDATE to apply changes"
echo_info
exit_withchange
}
## Update Task
function task_update {
TASKTYPE='UPDATE'
@ -1682,12 +1831,14 @@ function task_compare {
TASKTYPE='COMPARE'
MESSAGE="${MESSAGE}: ${TASKTYPE} Requested"
echo_good
import_gs
validate_gs_folders
validate_ph_folders
validate_os_sshpass
previous_md5
md5_compare
}
@ -1716,6 +1867,7 @@ function task_backup {
backup_local_custom
backup_cleanup
logs_export
exit_withchange
}
@ -1763,7 +1915,7 @@ function backup_remote_custom {
echo_stat
CMD_TIMEOUT='15'
CMD_REQUESTED="sudo cp ${PIHOLE_DIR}/${CUSTOM_DNS} ${PIHOLE_DIR}/${CUSTOM_DNS}.backup"
create_sshcmd
fi
fi
@ -1831,7 +1983,10 @@ function root_check {
MESSAGE="Evaluating Arguments" MESSAGE="Evaluating Arguments"
echo_stat echo_stat
root_check if [ "${ROOT_CHECK_AVOID}" != "1" ]
then
root_check
fi
case $# in
@ -1883,7 +2038,7 @@ case $# in
TASKTYPE='PULL'
MESSAGE="${MESSAGE}: ${TASKTYPE} Requested"
echo_good
import_gs
validate_gs_folders
validate_ph_folders
@ -1897,7 +2052,7 @@ case $# in
TASKTYPE='PUSH'
MESSAGE="${MESSAGE}: ${TASKTYPE} Requested"
echo_good
import_gs
validate_gs_folders
validate_ph_folders
@ -1936,6 +2091,10 @@ case $# in
task_devmode
;;
beta)
task_betamode
;;
devmode)
task_devmode
;;
@ -1981,6 +2140,24 @@ case $# in
;;
esac
;;
2)
case $1 in
automate)
task_automate
;;
esac
;;
3)
case $1 in
automate)
task_automate $2 $3
;;
esac
;;
*)
task_invalid