wip

parent 7bb8a8c839
commit e2c0d1de87

Makefile (18 lines changed)
@@ -1,4 +1,4 @@
-# Autogenerated at 22.11.2024 09:40 using ./gen-makefile
+# Autogenerated at 29.11.2024 16:47 using ./gen-makefile
.DEFAULT_GOAL := help

#===============================================
@@ -129,7 +129,7 @@ papirus:
pgsql:
	@./install/pgsql

-##php: Install php v8.1 + ppa
+##php: Install php v8.3 + ppa
php:
	@./install/php

@@ -173,7 +173,7 @@ rustdesk:
snap:
	@./install/snap

-##sublimetext: Install Sublime Text (build 4169)
+##sublimetext: Install Sublime Text
sublimetext:
	@./install/sublimetext

@@ -262,6 +262,18 @@ phpstack: php phptools
/apache2:
	@./uninstall/apache2

+##/canon-mg2500: Uninstall Canon Pixma MG2500 + ppa
+/canon-mg2500:
+	@./uninstall/canon-mg2500
+
+##/chrome: Uninstall google chrome
+/chrome:
+	@./uninstall/chrome
+
+##/composer: Uninstall composer
+/composer:
+	@./uninstall/composer
+
##/docker: Uninstall docker + ppa
/docker:
	@./uninstall/docker
README.md (67 lines changed)
@@ -1,46 +1,28 @@
-# My Ubuntu environment
+# My shell environment

-`make`-ready bunch of scripts for easily installation of different software.
+`make`-ready bunch of scripts for easy (de)installation of different software and a bunch of handy functions for custom scripting.

## Requirements

* Ubuntu >= 20.04 (not tested with version < 20)
* `bash`, `zsh` or other `sh`-compatible shell
* `make` (optional but recommended)
-* `wget` (necessary for some scripts)
-* `git` (necessary for some scripts)
+* `wget` (required for some scripts)
+* `git` (required for some scripts)

-If some dependecies are missed for some of these scripts it is enougth to run `./install/apt` in most cases.
+If some dependencies are missing for some of these scripts, it is enough to run `./install/apt` in most cases; otherwise the script will suggest (or even install) them.

## Usage

-### Clone this repo (recommended)
-
```shell
-# if git is installed
-git clone git@git.axenov.dev:anthony/my-env.git --depth=1
+# with git
+git clone git@git.axenov.dev:anthony/my-env.git --depth=1 --single-branch

-# if git is not installed
+# without git
wget -qO - https://git.axenov.dev/anthony/my-env/archive/master.tar.gz | tar -zxf -

-# switch to repo dir
-cd my-env
-
-# generate fresh ./Makefile and get full list of `make` goals
-./gen-makefile
-
# get full list of `make` goals
-make
-```
-
-### Selective straightforward installation
-
-```shell
-# from remote file (you can meet interaction bugs this way!)
-wget -qO - https://git.axenov.dev/anthony/my-env/raw/branch/master/install/apt | bash
-
-# from locally cloned repo (except scripts from ./packs)
-./install/apt
+cd my-env && make
```

## How to add my script?
@@ -78,17 +60,32 @@ mypackX: goalA goalB
```

where:
-* `mypack*` is the pack name
+* `mypack*` is the pack name of your choice
* `goal*` are script names in `./install`

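To make the pack/goal relationship concrete, here is a small sketch of how such a pack is used once the Makefile is regenerated; `mypackX`, `goalA` and `goalB` are the placeholders from the snippet above, not real targets:

```shell
./gen-makefile   # regenerate ./Makefile so the new pack is picked up
make mypackX     # runs ./install/goalA and ./install/goalB in order
```
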
-## TODO
+## Useful links and sources used

-* build: [flameshot](https://github.com/flameshot-org/flameshot#compilation)
-* build: [rustdesk](https://github.com/rustdesk/rustdesk#build)
-* [JB mono](https://www.jetbrains.com/ru-ru/lp/mono/#how-to-install) ([2](https://fonts.google.com/specimen/JetBrains+Mono))
-* update scripts (when possible)
-* uninstall scripts (when possible)
+* https://gist.github.com/anthonyaxenov/d53c4385b7d1466e0affeb56388b1005
+* https://gist.github.com/anthonyaxenov/89c99e09ddb195985707e2b24a57257d
+* ...and my other [gists](https://gist.github.com/anthonyaxenov/) with the [SHELL] prefix
+* https://github.com/nvie/gitflow/blob/develop/gitflow-common (BSD License)
+* https://github.com/petervanderdoes/gitflow-avh/blob/develop/gitflow-common (FreeBSD License)
+* https://github.com/vaniacer/bash_color/blob/master/color
+* https://misc.flogisoft.com/bash/tip_colors_and_formatting
+* https://www-users.york.ac.uk/~mijp1/teaching/2nd_year_Comp_Lab/guides/grep_awk_sed.pdf
+* https://www.galago-project.org/specs/notification/
+* https://laurvas.ru/bash-trap/
+* https://stackoverflow.com/a/52674277
+* https://rtfm.co.ua/bash-funkciya-getopts-ispolzuem-opcii-v-skriptax/
+* https://gist.github.com/jacknlliu/7c51e0ee8b51881dc8fb2183c481992e
+* https://gist.github.com/anthonyaxenov/d53c4385b7d1466e0affeb56388b1005
+* https://github.com/nvie/gitflow/blob/develop/gitflow-common
+* https://github.com/petervanderdoes/gitflow-avh/blob/develop/gitflow-common
+* https://gitlab.com/kyb/autorsync/-/blob/master/
+* https://lug.fh-swf.de/vim/vim-bash/StyleGuideShell.en.pdf
+* https://www.thegeekstuff.com/2010/06/bash-array-tutorial/
+* https://www.distributednetworks.com/linux-network-admin/module4/ephemeral-reserved-portNumbers.php

## License

-[WTFPLv2](LICENSE)
+[WTFPLv2](LICENSE), but other licenses are also possible.
TODO.md (13 lines changed)
@@ -1,6 +1,11 @@
# Todo list

-* [ ] tdesktop (https://desktop.telegram.org)
-* [ ] spoofdpi (https://git.axenov.dev/mirrors/SpoofDPI/tags)
-* [ ] lazynvim (https://www.lazyvim.org)
-* [ ] ...
+* tdesktop (https://desktop.telegram.org)
+* spoofdpi (https://git.axenov.dev/mirrors/SpoofDPI/tags)
+* lazynvim (https://www.lazyvim.org)
+* build: [flameshot](https://github.com/flameshot-org/flameshot#compilation)
+* build: [rustdesk](https://github.com/rustdesk/rustdesk#build)
+* update scripts (when possible)
+* uninstall scripts (when possible)
+* ...
@@ -6,6 +6,10 @@
        "https://daocloud.io",
        "https://c.163.com/",
        "https://registry.docker-cn.com",
-        "https://huecker.io"
+        "https://huecker.io",
+        "https://public.ecr.aws",
+        "https://quay.io",
+        "https://registry.access.redhat.com",
+        "https://registry.redhat.io"
    ]
}
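The mirror URLs suggest this is the `registry-mirrors` array of Docker's `/etc/docker/daemon.json`; assuming that is the case, the daemon has to be restarted before the added mirrors are used:

```shell
sudo systemctl restart docker   # assumes a systemd-managed Docker engine
```
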
helpers (deleted file, 224 lines)
@@ -1,224 +0,0 @@
#!/bin/bash
|
|
||||||
|
|
||||||
installed() {
|
|
||||||
command -v "$1" >/dev/null 2>&1
|
|
||||||
}
|
|
||||||
|
|
||||||
installed2() {
|
|
||||||
dpkg --list | grep -qw "ii $1"
|
|
||||||
}
|
|
||||||
|
|
||||||
apt_install() {
|
|
||||||
sudo apt install -y --autoremove $*
|
|
||||||
}
|
|
||||||
|
|
||||||
require() {
|
|
||||||
sw=()
|
|
||||||
for package in "$@"; do
|
|
||||||
if ! installed "$package" && ! installed2 "$package"; then
|
|
||||||
sw+=("$package")
|
|
||||||
fi
|
|
||||||
done
|
|
||||||
if [ ${#sw[@]} -gt 0 ]; then
|
|
||||||
info "This packages will be installed in your system:\n${sw[*]}"
|
|
||||||
apt_install ${sw[*]}
|
|
||||||
[ $? -gt 0 ] && die "installation cancelled" 201
|
|
||||||
fi
|
|
||||||
}
|
|
||||||
|
|
||||||
require2() {
|
|
||||||
sw=()
|
|
||||||
for package in "$@"; do
|
|
||||||
if ! installed "$package" && ! installed2 "$package"; then
|
|
||||||
sw+=("$package")
|
|
||||||
fi
|
|
||||||
done
|
|
||||||
if [ ${#sw[@]} -gt 0 ]; then
|
|
||||||
die "This packages must be installed in your system:\n${sw[*]}" 200
|
|
||||||
fi
|
|
||||||
}
|
|
||||||
|
|
||||||
title() {
|
|
||||||
[ "$1" ] && title="$1" || title="$(grep -m 1 -oP "(?<=^##makedesc:\s).*$" ${BASH_SOURCE[1]})"
|
|
||||||
info
|
|
||||||
info "==============================================="
|
|
||||||
info "$title"
|
|
||||||
info "==============================================="
|
|
||||||
info
|
|
||||||
}
|
|
||||||
|
|
||||||
unpak_targz() {
|
|
||||||
require tar
|
|
||||||
tar -xzf "$1" -C "$2"
|
|
||||||
}
|
|
||||||
|
|
||||||
symlink() {
|
|
||||||
ln -sf "$1" "$2"
|
|
||||||
}
|
|
||||||
|
|
||||||
download() {
|
|
||||||
require wget
|
|
||||||
wget "$1" -O "$2"
|
|
||||||
}
|
|
||||||
|
|
||||||
clone() {
|
|
||||||
require git
|
|
||||||
git clone $*
|
|
||||||
}
|
|
||||||
|
|
||||||
clone_quick() {
|
|
||||||
require git
|
|
||||||
git clone $* --depth=1 --single-branch
|
|
||||||
}
|
|
||||||
|
|
||||||
abspath() {
|
|
||||||
echo $(realpath -q "${1/#\~/$HOME}")
|
|
||||||
}
|
|
||||||
|
|
||||||
is_writable() {
|
|
||||||
[ -w "$(abspath $1)" ]
|
|
||||||
}
|
|
||||||
|
|
||||||
is_dir() {
|
|
||||||
[ -d "$(abspath $1)" ]
|
|
||||||
}
|
|
||||||
|
|
||||||
is_file() {
|
|
||||||
[ -f "$(abspath $1)" ]
|
|
||||||
}
|
|
||||||
|
|
||||||
is_function() {
|
|
||||||
declare -F "$1" > /dev/null
|
|
||||||
}
|
|
||||||
|
|
||||||
regex_match() {
|
|
||||||
printf "%s" "$1" | grep -qP "$2"
|
|
||||||
}
|
|
||||||
|
|
||||||
in_array() {
|
|
||||||
local find=$1
|
|
||||||
shift
|
|
||||||
for e in "$@"; do
|
|
||||||
[[ "$e" == "$find" ]] && return 0
|
|
||||||
done
|
|
||||||
return 1
|
|
||||||
}
|
|
||||||
|
|
||||||
implode() {
|
|
||||||
local d=${1-}
|
|
||||||
local f=${2-}
|
|
||||||
if shift 2; then
|
|
||||||
printf %s "$f" "${@/#/$d}"
|
|
||||||
fi
|
|
||||||
}
|
|
||||||
|
|
||||||
IINFO="( i )"
|
|
||||||
INOTE="( * )"
|
|
||||||
IWARN="( # )"
|
|
||||||
IERROR="( ! )"
|
|
||||||
IFATAL="( @ )"
|
|
||||||
ISUCCESS="( ! )"
|
|
||||||
IASK="( ? )"
|
|
||||||
IDEBUG="(DBG)"
|
|
||||||
IVRB="( + )"
|
|
||||||
|
|
||||||
BOLD="\e[1m"
|
|
||||||
DIM="\e[2m"
|
|
||||||
NOTBOLD="\e[22m" # sometimes \e[21m
|
|
||||||
NOTDIM="\e[22m"
|
|
||||||
NORMAL="\e[20m"
|
|
||||||
RESET="\e[0m"
|
|
||||||
|
|
||||||
FRESET="\e[39m"
|
|
||||||
FBLACK="\e[30m"
|
|
||||||
FWHITE="\e[97m"
|
|
||||||
FRED="\e[31m"
|
|
||||||
FGREEN="\e[32m"
|
|
||||||
FYELLOW="\e[33m"
|
|
||||||
FBLUE="\e[34m"
|
|
||||||
FLRED="\e[91m"
|
|
||||||
FLGREEN="\e[92m"
|
|
||||||
FLYELLOW="\e[93m"
|
|
||||||
FLBLUE="\e[94m"
|
|
||||||
|
|
||||||
BRESET="\e[49m"
|
|
||||||
BBLACK="\e[40m"
|
|
||||||
BWHITE="\e[107m"
|
|
||||||
BRED="\e[41m"
|
|
||||||
BGREEN="\e[42m"
|
|
||||||
BYELLOW="\e[43m"
|
|
||||||
BBLUE="\e[44m"
|
|
||||||
BLRED="\e[101m"
|
|
||||||
BLGREEN="\e[102m"
|
|
||||||
BLYELLOW="\e[103m"
|
|
||||||
BLBLUE="\e[104m"
|
|
||||||
|
|
||||||
dt() {
|
|
||||||
echo "[$(date +'%H:%M:%S')] "
|
|
||||||
}
|
|
||||||
|
|
||||||
ask() {
|
|
||||||
IFS= read -rp "$(print ${BOLD}${BBLUE}${FWHITE}${IASK}${BRESET}\ ${BOLD}$1 ): " $2
|
|
||||||
}
|
|
||||||
|
|
||||||
print() {
|
|
||||||
echo -e "$*${RESET}"
|
|
||||||
}
|
|
||||||
|
|
||||||
debug() {
|
|
||||||
if [ "$2" ]; then
|
|
||||||
print "${DIM}${BOLD}${RESET}${DIM}${FUNCNAME[1]:-?}():${BASH_LINENO:-?}\t$1 "
|
|
||||||
else
|
|
||||||
print "${DIM}${BOLD}${RESET}${DIM}$1 "
|
|
||||||
fi
|
|
||||||
}
|
|
||||||
|
|
||||||
verbose() {
|
|
||||||
print "${BOLD}${IVRB}${RESET}${FYELLOW} $1 "
|
|
||||||
}
|
|
||||||
|
|
||||||
info() {
|
|
||||||
print "${BOLD}${FWHITE}${BLBLUE}${IINFO}${RESET}${FWHITE} $1 "
|
|
||||||
}
|
|
||||||
|
|
||||||
note() {
|
|
||||||
print "${BOLD}${DIM}${FWHITE}${INOTE}${RESET} $1 "
|
|
||||||
}
|
|
||||||
|
|
||||||
success() {
|
|
||||||
print "${BOLD}${BGREEN}${FWHITE}${ISUCCESS}${BRESET}$FGREEN $1 "
|
|
||||||
}
|
|
||||||
|
|
||||||
warn() {
|
|
||||||
print "${BOLD}${BYELLOW}${FBLACK}${IWARN}${BRESET}${FYELLOW} Warning:${RESET} $1 "
|
|
||||||
}
|
|
||||||
|
|
||||||
error() {
|
|
||||||
print "${BOLD}${BLRED}${FWHITE}${IERROR} Error: ${BRESET}${FLRED} $1 " >&2
|
|
||||||
}
|
|
||||||
|
|
||||||
fatal() {
|
|
||||||
print "${BOLD}${BRED}${FWHITE}${IFATAL} FATAL: $1 " >&2
|
|
||||||
print_stacktrace
|
|
||||||
}
|
|
||||||
|
|
||||||
die() {
|
|
||||||
error "${1:-halted}"
|
|
||||||
exit ${2:-100}
|
|
||||||
}
|
|
||||||
|
|
||||||
print_stacktrace() {
|
|
||||||
STACK=""
|
|
||||||
local i
|
|
||||||
local stack_size=${#FUNCNAME[@]}
|
|
||||||
debug "Callstack:"
|
|
||||||
# for (( i=$stack_size-1; i>=1; i-- )); do
|
|
||||||
for (( i=1; i<$stack_size; i++ )); do
|
|
||||||
local func="${FUNCNAME[$i]}"
|
|
||||||
[ x$func = x ] && func=MAIN
|
|
||||||
local linen="${BASH_LINENO[$(( i - 1 ))]}"
|
|
||||||
local src="${BASH_SOURCE[$i]}"
|
|
||||||
[ x"$src" = x ] && src=non_file_source
|
|
||||||
debug " at $func $src:$linen"
|
|
||||||
done
|
|
||||||
}
|
|
helpers.sh (new file, 19 lines)
@@ -0,0 +1,19 @@
#!/bin/bash
set -oe pipefail

__root__=$( dirname $(readlink -e -- "${BASH_SOURCE}"))
source $__root__/helpers/io.sh || exit 255
source $__root__/helpers/basic.sh || exit 255
source $__root__/helpers/debug.sh || exit 255
source $__root__/helpers/packages.sh || exit 255

title() {
    [[ $__AAA_NO_TITLE = 1 ]] || {
        [ "$1" ] && title="$1" || title="$(grep -m 1 -oP "(?<=^##makedesc:\s).*$" ${BASH_SOURCE[1]})"
        info
        info "==============================================="
        info "$title"
        info "==============================================="
        info
    }
}
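A minimal sketch of how a goal script is meant to consume this entry point; the pattern mirrors the install/* scripts further below, and `htop` is just an arbitrary example package:

```shell
#!/bin/bash
##makedesc: Install htop (illustration only, not a script shipped in ./install)
source "$( dirname $(readlink -e -- "${BASH_SOURCE}"))/../helpers.sh" || exit 255

title                      # prints the ##makedesc line as a banner via info()
apt_install htop           # helper from helpers/packages.sh
success "htop installed!"  # helper from helpers/io.sh
```
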
helpers/arg-parser/README.md (new file, 29 lines)
@@ -0,0 +1,29 @@
# Argument parser for bash scripts

More info:
* 🇷🇺 [axenov.dev/bash-args](https://axenov.dev/bash-args/)
* 🇺🇸 [axenov.dev/en/bash-processing-arguments-in-a-script-when-called-from-the-shell/](https://axenov.dev/en/bash-processing-arguments-in-a-script-when-called-from-the-shell)

Tested in Ubuntu 20.04.2 LTS in:

```
bash 5.0.17(1)-release (x86_64-pc-linux-gnu)
zsh 5.8 (x86_64-ubuntu-linux-gnu)
```

## Version history

```
v1.0 - initial
v1.1 - arg(): improved skipping uninteresting args
     - arg(): check next arg to be valid value
v1.2 - removed all 'return' statements
     - arg(): error message corrected
     - new examples
v1.3 - argl(): improved flag check
     - some text corrections
v1.4 - new function argn()
     - some text corrections
v1.5 - arg(), grep_match(): fixed searching for -e argument
     - grep_match(): redirect output into /dev/null
```
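A short usage sketch, assuming `arg()`/`argl()` and the `__MAIN_ARGS` variable from `args.sh` below are already loaded; the option names are invented for the example:

```shell
__MAIN_ARGS=$@           # snapshot of the calling script's arguments

arg  v 0 verbose_value   # value following -v        -> $verbose_value
argl name 0 user_name    # value of --name=<value>   -> $user_name
argl force 1 force_flag  # "1" if --force was passed -> $force_flag
```
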
helpers/arg-parser/args.sh (new file, 268 lines)
@@ -0,0 +1,268 @@
|
#!/bin/bash
|
||||||
|
#########################################################################
|
||||||
|
# #
|
||||||
|
# Argument parser for bash scripts #
|
||||||
|
# #
|
||||||
|
# Author: Anthony Axenov (Антон Аксенов) #
|
||||||
|
# Version: 1.5 #
|
||||||
|
# License: MIT #
|
||||||
|
# #
|
||||||
|
#########################################################################
|
||||||
|
# #
|
||||||
|
# With 'getopt' you cannot combine different #
|
||||||
|
# arguments for different nested functions. #
|
||||||
|
# #
|
||||||
|
# 'getopts' does not support long arguments with #
|
||||||
|
# values (like '--foo=bar'). #
|
||||||
|
# #
|
||||||
|
# These functions supports different arguments and #
|
||||||
|
# their combinations: #
|
||||||
|
# -a -b -c #
|
||||||
|
# -a avalue -b bvalue -c cvalue #
|
||||||
|
# -cab bvalue #
|
||||||
|
# --arg #
|
||||||
|
# --arg=value -ab -c cvalue --foo #
|
||||||
|
# #
|
||||||
|
# Tested in Ubuntu 20.04.2 LTS in: #
|
||||||
|
# bash 5.0.17(1)-release (x86_64-pc-linux-gnu) #
|
||||||
|
# zsh 5.8 (x86_64-ubuntu-linux-gnu) #
|
||||||
|
# #
|
||||||
|
#########################################################################
|
||||||
|
|
||||||
|
#purpose Little helper to check if string matches PCRE
|
||||||
|
#argument $1 - some string
|
||||||
|
#argument $2 - regex
|
||||||
|
#exitcode 0 - string valid
|
||||||
|
#exitcode 1 - string is not valid
|
||||||
|
grep_match() {
|
||||||
|
printf "%s" "$1" | grep -qP "$2" >/dev/null
|
||||||
|
}
|
||||||
|
|
||||||
|
#purpose Find short argument or its value
|
||||||
|
#argument $1 - (string) argument (without leading dashes; only first letter will be processed)
|
||||||
|
#argument $2 - (number) is it flag? 1 if is, otherwise 0 or nothing
|
||||||
|
#argument $3 - (string) variable to return value into
|
||||||
|
# (if not specified then it will be echo'ed in stdout)
|
||||||
|
#returns (string) 1 (if $2 == 1), value (if correct and if $2 != 1) or nothing
|
||||||
|
#usage To get value into var: arg v 0 myvar or myvalue=$(arg 'v')
|
||||||
|
#usage To find flag into var: arg f 1 myvar or flag=$(arg 'f')
|
||||||
|
#usage To echo value: arg v
|
||||||
|
#usage To echo 1 if flag exists: arg f
|
||||||
|
arg() {
|
||||||
|
local need=${1:0:1} # argument to find (only first letter)
|
||||||
|
[ $need ] || {
|
||||||
|
echo "Argument is not specified!" >&2
|
||||||
|
exit 1
|
||||||
|
}
|
||||||
|
local isflag=$2 || 0 # should we find the value or just the presence of the $need?
|
||||||
|
local retvar=$3 || 0 # var to return value into (if 0 then value will be echo'ed in stdout)
|
||||||
|
local args=(${__MAIN_ARGS[0]}) # args we need are stored in 1st element of __MAIN_ARGS
|
||||||
|
for ((idx=0; idx<${#args[@]}; ++idx)) do # going through args
|
||||||
|
local arg=${args[$idx]} # current argument
|
||||||
|
# skip $arg if it starts with '--', letter or digit
|
||||||
|
grep_match "$arg" "^(\w{1}|-{2})" && continue
|
||||||
|
# clear $arg from special and duplicate characters
|
||||||
|
# e.g. 'fas-)dfs' will become 'fasd'
|
||||||
|
local chars="$(printf "%s" "${arg}" | tr -s [${arg}] | tr -d "[:punct:][:blank:]")"
|
||||||
|
# now we can check if $need is one of $chars
|
||||||
|
if grep_match "-$need" "^-[$chars]$"; then # if it is
|
||||||
|
if [[ $isflag = 1 ]]; then # and we expect it as a flag
|
||||||
|
# then return '1' back into $3 (if exists) or echo in stdout
|
||||||
|
[ $retvar ] && eval "$retvar='1'" || echo "1"
|
||||||
|
else # but if $arg is not a flag
|
||||||
|
# then get next argument as value of current one
|
||||||
|
local value="${args[$idx+1]}"
|
||||||
|
# check if it is valid value
|
||||||
|
if grep_match "$value" "^\w+$"; then
|
||||||
|
# and return it back back into $3 (if exists) or echo in stdout
|
||||||
|
[ $retvar ] && eval "$retvar='$value'" || echo "$value"
|
||||||
|
break
|
||||||
|
else # otherwise throw error message into stderr (just in case)
|
||||||
|
echo "Argument '$arg' must have a correct value!" >&2
|
||||||
|
break
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
done
|
||||||
|
}
|
||||||
|
|
||||||
|
#purpose Find long argument or its value
|
||||||
|
#argument $1 - argument (without leading dashes)
|
||||||
|
#argument $2 - is it flag? 1 if is, otherwise 0 or nothing
|
||||||
|
#argument $3 - variable to return value into
|
||||||
|
# (if not specified then it will be echo'ed in stdout)
|
||||||
|
#returns (string) 1 (if $2 == 1), value (if correct and if $2 != 1) or nothing
|
||||||
|
#usage To get value into var: arg v 0 myvar or myvalue=$(arg 'v')
|
||||||
|
#usage To find flag into var: arg f 1 myvar or flag=$(arg 'f')
|
||||||
|
#usage To echo value: arg v
|
||||||
|
#usage To echo 1 if flag exists: arg f
|
||||||
|
argl() {
|
||||||
|
local need=$1 # argument to find
|
||||||
|
[ $need ] || {
|
||||||
|
echo "Argument is not specified!" >&2
|
||||||
|
exit 1
|
||||||
|
}
|
||||||
|
local isflag=$2 || 0 # should we find the value or just the presence of the $need?
|
||||||
|
local retvar=$3 || 0 # var to return value into (if 0 then value will be echo'ed in stdout)
|
||||||
|
local args=(${__MAIN_ARGS[0]}) # args we need are stored in 1st element of __MAIN_ARGS
|
||||||
|
for ((idx=0; idx<${#args[@]}; ++idx)) do
|
||||||
|
local arg=${args[$idx]} # current argument
|
||||||
|
# if we expect $arg as a flag
|
||||||
|
if [[ $isflag = 1 ]]; then
|
||||||
|
# and if $arg has correct format (like '--flag')
|
||||||
|
if grep_match "$arg" "^--$need"; then
|
||||||
|
# then return '1' back into $3 (if exists) or echo in stdout
|
||||||
|
[ ! $retvar = 0 ] && eval "$retvar=1" || echo "1"
|
||||||
|
break
|
||||||
|
fi
|
||||||
|
else # but if $arg is not a flag
|
||||||
|
# check if $arg has correct format (like '--foo=bar')
|
||||||
|
if grep_match "$arg" "^--$need=.+$"; then # if it is
|
||||||
|
# then return part from '=' to arg's end as value back into $3 (if exists) or echo in stdout
|
||||||
|
[ ! $retvar = 0 ] && eval "$retvar=${arg#*=}" || echo "${arg#*=}"
|
||||||
|
break
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
done
|
||||||
|
}
|
||||||
|
|
||||||
|
#purpose Get argument by its index
|
||||||
|
#argument $1 - (number) arg index
|
||||||
|
#argument $2 - (string) variable to return arg's name into
|
||||||
|
# (if not specified then it will be echo'ed in stdout)
|
||||||
|
#returns (string) arg name or nothing
|
||||||
|
#usage To get arg into var: argn 1 myvar or arg=$(argn 1)
|
||||||
|
#usage To echo in stdout: argn 1
|
||||||
|
argn() {
|
||||||
|
local idx=$1 # argument index
|
||||||
|
[ $idx ] || {
|
||||||
|
error "Argument index is not specified!"
|
||||||
|
exit 1
|
||||||
|
}
|
||||||
|
local retvar=$2 || 0 # var to return value into (if 0 then value will be echo'ed in stdout)
|
||||||
|
local args=(${__MAIN_ARGS[0]}) # args we need are stored in 1st element of __MAIN_ARGS
|
||||||
|
local arg=${args[$idx]} # current argument
|
||||||
|
if [ $arg ]; then
|
||||||
|
[ ! $retvar = 0 ] && eval "$retvar=$arg" || echo "$arg"
|
||||||
|
fi
|
||||||
|
}
|
||||||
|
|
||||||
|
# Keep in mind:
|
||||||
|
# 1. Short arguments can be specified contiguously or separately
|
||||||
|
# and their order does not matter, but before each of them
|
||||||
|
# (or the first of them) one leading dash must be specified.
|
||||||
|
# Valid combinations: '-a -b -c', '-cba', '-b -ac'
|
||||||
|
# 2. Short arguments can have values and if are - value must go
|
||||||
|
# next to argument itself.
|
||||||
|
# Valid combinations: '-ab avalue', '-ba avalue', '-a avalue -b'
|
||||||
|
# 3. Long arguments cannot be combined like short ones and each
|
||||||
|
# of them must be specified separately with two leading dashes.
|
||||||
|
# Valid combinations: '--foo --bar', '--bar --foo'
|
||||||
|
# 4. Long arguments can have a value which must be specified after '='.
|
||||||
|
# Valid combinations: '--foo=value --bar', '--bar --foo=value'
|
||||||
|
# 5. Values cannot contain spaces even in quotes both for short and
|
||||||
|
# long args, otherwise first word will return as value.
|
||||||
|
# 6. You can use arg() or argl() to check presence of any arg, no matter
|
||||||
|
# if it has value or not.
|
||||||
|
|
||||||
|
### USAGE ###
|
||||||
|
# This is simple examples which you can play around with.
|
||||||
|
|
||||||
|
# first we must save the original arguments passed
|
||||||
|
# to the script when it was called:
|
||||||
|
__MAIN_ARGS=$@
|
||||||
|
|
||||||
|
echo -e "\n1. Short args (vars):"
|
||||||
|
arg a 1 a # -a
|
||||||
|
arg v 0 v # -v v_value
|
||||||
|
arg c 1 c # -c
|
||||||
|
arg z 1 z # -z (not exists)
|
||||||
|
echo "1.1 a=$a"
|
||||||
|
echo "1.2 v=$v"
|
||||||
|
echo "1.3 c=$c"
|
||||||
|
echo "1.4 z=$z"
|
||||||
|
|
||||||
|
echo -e "\n2. Short args (echo):"
|
||||||
|
echo "2.1 a=$(arg a 1)"
|
||||||
|
echo "2.2 v=$(arg v 0)"
|
||||||
|
echo "2.3 c=$(arg c 1)"
|
||||||
|
echo "2.4 z=$(arg z 1)"
|
||||||
|
|
||||||
|
echo -e "\n3. Long args (vars):"
|
||||||
|
argl flag 1 flag # --flag
|
||||||
|
argl param1 0 param1 # --param1=test
|
||||||
|
argl param2 0 param2 # --param2=password
|
||||||
|
argl bar 1 bar # --bar (not exists)
|
||||||
|
echo "3.1 flag=$flag"
|
||||||
|
echo "3.2 param1=$param1"
|
||||||
|
echo "3.3 param2=$param2"
|
||||||
|
echo "3.4 bar=$bar"
|
||||||
|
|
||||||
|
echo -e "\n4. Long args (echo):"
|
||||||
|
echo "4.1 flag=$(argl flag 1)"
|
||||||
|
echo "4.2 param1=$(argl param1 0)"
|
||||||
|
echo "4.3 param2=$(argl param2 0)"
|
||||||
|
echo "4.4 bar=$(argl bar 1)"
|
||||||
|
|
||||||
|
echo -e "\n5. Args by index:"
|
||||||
|
argn 1 first
|
||||||
|
echo "5.1 arg[1]=$first"
|
||||||
|
echo "5.2 arg[3]=$(argn 3)"
|
||||||
|
|
||||||
|
# Well, now we will try to get global args inside different functions
|
||||||
|
|
||||||
|
food() {
|
||||||
|
echo -e "\n=== food() ==="
|
||||||
|
arg f 0 food
|
||||||
|
argl 'food' 0 food
|
||||||
|
[ $food ] && echo "Om nom nom! $food is very tasty" || echo "Uh oh" >&2
|
||||||
|
}
|
||||||
|
|
||||||
|
hello() {
|
||||||
|
echo -e "\n=== hello() ==="
|
||||||
|
arg n 0 name
|
||||||
|
argl name 0 name
|
||||||
|
[ $name ] && echo "Hi, $name! How u r doin?" || echo "Hello, stranger..." >&2
|
||||||
|
}
|
||||||
|
|
||||||
|
hello
|
||||||
|
food
|
||||||
|
|
||||||
|
### OUTPUT ###
|
||||||
|
|
||||||
|
# Command to run:
|
||||||
|
# bash args.sh -va asdf --flag --param1=paramvalue1 -c --param2="somevalue2 sdf" --name="John" -f Seafood
|
||||||
|
|
||||||
|
# 1. Short args (vars):
|
||||||
|
# 1.1 a=1
|
||||||
|
# 1.2 v=v_value
|
||||||
|
# 1.3 c=1
|
||||||
|
# 1.4 z=
|
||||||
|
#
|
||||||
|
# 2. Short args (echo):
|
||||||
|
# 2.1 a=1
|
||||||
|
# 2.2 v=v_value
|
||||||
|
# 2.3 c=1
|
||||||
|
# 2.4 z=
|
||||||
|
#
|
||||||
|
# 3. Long args (vars):
|
||||||
|
# 3.1 longflag=1
|
||||||
|
# 3.2 param1=test
|
||||||
|
# 3.3 param2=password
|
||||||
|
# 3.4 barflag=
|
||||||
|
#
|
||||||
|
# 4. Long args (echo):
|
||||||
|
# 4.1 longflag=1
|
||||||
|
# 4.2 param1=test
|
||||||
|
# 4.3 param2=password
|
||||||
|
# 4.4 barflag=
|
||||||
|
#
|
||||||
|
# 5. Args by index:
|
||||||
|
# 5.1 arg[1]=asdf
|
||||||
|
# 5.2 arg[3]=--param1=paramvalue1
|
||||||
|
#
|
||||||
|
# === hello() ===
|
||||||
|
# Hi, John! How u r doin?
|
||||||
|
#
|
||||||
|
# === food() ===
|
||||||
|
# Om nom nom! Seafood is very tasty
|
helpers/basic.sh (new file, 87 lines)
@@ -0,0 +1,87 @@
|
#!/bin/bash
|
||||||
|
source $( dirname $(readlink -e -- "${BASH_SOURCE}"))/io.sh || exit 255
|
||||||
|
|
||||||
|
########################################################
|
||||||
|
# Little handy helpers for scripting
|
||||||
|
########################################################
|
||||||
|
|
||||||
|
# convert relative path $1 to full one
|
||||||
|
abspath() {
|
||||||
|
echo $(realpath -q "${1/#\~/$HOME}")
|
||||||
|
}
|
||||||
|
|
||||||
|
# check if path $1 is writable
|
||||||
|
is_writable() {
|
||||||
|
[ -w "$(abspath $1)" ]
|
||||||
|
}
|
||||||
|
|
||||||
|
# check if path $1 is a directory
|
||||||
|
is_dir() {
|
||||||
|
[ -d "$(abspath $1)" ]
|
||||||
|
}
|
||||||
|
|
||||||
|
# check if path $1 is a file
|
||||||
|
is_file() {
|
||||||
|
[ -f "$(abspath $1)" ]
|
||||||
|
}
|
||||||
|
|
||||||
|
# check if an argument is a shell function
|
||||||
|
is_function() {
|
||||||
|
declare -F "$1" > /dev/null
|
||||||
|
}
|
||||||
|
|
||||||
|
# check if string $1 matches regex $2
|
||||||
|
regex_match() {
|
||||||
|
printf "%s" "$1" | grep -qP "$2"
|
||||||
|
}
|
||||||
|
|
||||||
|
# check if array $2 contains string $1
|
||||||
|
in_array() {
|
||||||
|
local find=$1
|
||||||
|
shift
|
||||||
|
for e in "$@"; do
|
||||||
|
[[ "$e" == "$find" ]] && return 0
|
||||||
|
done
|
||||||
|
return 1
|
||||||
|
}
|
||||||
|
|
||||||
|
# join all elements of array $2 with delimiter $1
|
||||||
|
implode() {
|
||||||
|
local d=${1-}
|
||||||
|
local f=${2-}
|
||||||
|
if shift 2; then
|
||||||
|
printf %s "$f" "${@/#/$d}"
|
||||||
|
fi
|
||||||
|
}
|
||||||
|
|
||||||
|
# open url $1 in system web-browser
|
||||||
|
open_url() {
|
||||||
|
if which xdg-open > /dev/null; then
|
||||||
|
xdg-open "$1" </dev/null >/dev/null 2>&1 & disown
|
||||||
|
elif which gnome-open > /dev/null; then
|
||||||
|
gnome-open "$1" </dev/null >/dev/null 2>&1 & disown
|
||||||
|
fi
|
||||||
|
}
|
||||||
|
|
||||||
|
# unpack .tar.gz file $1 into path $2
|
||||||
|
unpack_targz() {
|
||||||
|
require tar
|
||||||
|
tar -xzf "$1" -C "$2"
|
||||||
|
}
|
||||||
|
|
||||||
|
# make soft symbolic link of path $1 to path $2
|
||||||
|
symlink() {
|
||||||
|
ln -sf "$1" "$2"
|
||||||
|
}
|
||||||
|
|
||||||
|
# download file $1 into path $2 using wget
|
||||||
|
download() {
|
||||||
|
require wget
|
||||||
|
wget "$1" -O "$2"
|
||||||
|
}
|
||||||
|
|
||||||
|
# download file $1 into path $2 using curl
|
||||||
|
cdownload() {
|
||||||
|
require curl
|
||||||
|
curl -fsSL "$1" -o "$2"
|
||||||
|
}
|
helpers/debug.sh (new file, 26 lines)
@@ -0,0 +1,26 @@
#!/bin/bash
source $( dirname $(readlink -e -- "${BASH_SOURCE}"))/io.sh || exit 255

########################################################
# Functions to debug scripts
########################################################

var_dump() {
    debug "$1 = ${!1}"
}

print_stacktrace() {
    STACK=""
    local i
    local stack_size=${#FUNCNAME[@]}
    debug "Callstack:"
    # for (( i=$stack_size-1; i>=1; i-- )); do
    for (( i=1; i<$stack_size; i++ )); do
        local func="${FUNCNAME[$i]}"
        [ x$func = x ] && func=MAIN
        local linen="${BASH_LINENO[$(( i - 1 ))]}"
        local src="${BASH_SOURCE[$i]}"
        [ x"$src" = x ] && src=non_file_source
        debug "   at $func $src:$linen"
    done
}
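A hedged usage example; the variable name is invented, and the output goes through `debug()` from `io.sh`:

```shell
build_dir="/tmp/build"
var_dump build_dir    # prints a debug line like: build_dir = /tmp/build

print_stacktrace      # walks FUNCNAME/BASH_SOURCE/BASH_LINENO; fatal() calls it too
```
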
helpers/git.sh (new file, 178 lines)
@@ -0,0 +1,178 @@
|
#!/bin/bash
|
||||||
|
_dir=$( dirname $(readlink -e -- "${BASH_SOURCE}"))
|
||||||
|
source $_dir/io.sh || exit 255
|
||||||
|
source $_dir/basic.sh || exit 255
|
||||||
|
source $_dir/pkg.sh || exit 255
|
||||||
|
|
||||||
|
########################################################
|
||||||
|
# Shorthands for git
|
||||||
|
########################################################
|
||||||
|
|
||||||
|
git.clone_quick() {
|
||||||
|
require git
|
||||||
|
git clone --depth=1 --single-branch $*
|
||||||
|
}
|
||||||
|
|
||||||
|
git.is_repo() {
|
||||||
|
require git
|
||||||
|
[ "$1" ] || die "Path is not specified" 101
|
||||||
|
require_dir "$1/"
|
||||||
|
check_dir "$1/.git"
|
||||||
|
}
|
||||||
|
|
||||||
|
git.require_repo() {
|
||||||
|
require git
|
||||||
|
git.is_repo "$1" || die "'$1' is not git repository!" 10
|
||||||
|
}
|
||||||
|
|
||||||
|
git.cfg() {
|
||||||
|
require git
|
||||||
|
[ "$1" ] || die "Key is not specified" 101
|
||||||
|
if [[ "$2" ]]; then
|
||||||
|
git config --global --replace-all "$1" "$2"
|
||||||
|
else
|
||||||
|
echo $(git config --global --get-all "$1")
|
||||||
|
fi
|
||||||
|
}
|
||||||
|
|
||||||
|
git.set_user() {
|
||||||
|
require git
|
||||||
|
[ "$1" ] || die "git.set_user: Repo is not specified" 100
|
||||||
|
git.cfg "$1" "user.name" "$2"
|
||||||
|
git.cfg "$1" "user.email" "$3"
|
||||||
|
success "User set to '$name <$email>' in ${FWHITE}$1"
|
||||||
|
}
|
||||||
|
|
||||||
|
git.fetch() {
|
||||||
|
require git
|
||||||
|
if [ "$1" ]; then
|
||||||
|
if git.remote_branch_exists "origin/$1"; then
|
||||||
|
git fetch origin "refs/heads/$1:refs/remotes/origin/$1" --progress --prune --quiet 2>&1 || die "Could not fetch $1 from origin" 12
|
||||||
|
else
|
||||||
|
warn "Tried to fetch branch 'origin/$1' but it does not exist."
|
||||||
|
fi
|
||||||
|
else
|
||||||
|
git fetch origin --progress --prune --quiet 2>&1 || exit 12
|
||||||
|
fi
|
||||||
|
}
|
||||||
|
|
||||||
|
git.reset() {
|
||||||
|
require git
|
||||||
|
git reset --hard HEAD
|
||||||
|
git clean -fd
|
||||||
|
}
|
||||||
|
|
||||||
|
git.clone() {
|
||||||
|
require git
|
||||||
|
git clone $* 2>&1
|
||||||
|
}
|
||||||
|
|
||||||
|
git.co() {
|
||||||
|
require git
|
||||||
|
git checkout $* 2>&1
|
||||||
|
}
|
||||||
|
|
||||||
|
git.is_it_current_branch() {
|
||||||
|
require git
|
||||||
|
[ "$1" ] || die "git.is_it_current_branch: Branch is not specified" 19
|
||||||
|
[[ "$(git.current_branch)" = "$1" ]]
|
||||||
|
}
|
||||||
|
|
||||||
|
git.pull() {
|
||||||
|
require git
|
||||||
|
[ "$1" ] && BRANCH=$1 || BRANCH=$(git.current_branch)
|
||||||
|
# note "Updating branch $BRANCH..."
|
||||||
|
git pull origin "refs/heads/$BRANCH:refs/remotes/origin/$BRANCH" --prune --force --quiet 2>&1 || exit 13
|
||||||
|
git pull origin --tags --force --quiet 2>&1 || exit 13
|
||||||
|
# [ "$1" ] || die "git.pull: Branch is not specified" 19
|
||||||
|
# if [ "$1" ]; then
|
||||||
|
# note "Updating branch $1..."
|
||||||
|
# git pull origin "refs/heads/$1:refs/remotes/origin/$1" --prune --force --quiet 2>&1 || exit 13
|
||||||
|
# else
|
||||||
|
# note "Updating current branch..."
|
||||||
|
# git pull
|
||||||
|
# fi
|
||||||
|
}
|
||||||
|
|
||||||
|
git.current_branch() {
|
||||||
|
require git
|
||||||
|
git branch --show-current || exit 18
|
||||||
|
}
|
||||||
|
|
||||||
|
git.local_branch_exists() {
|
||||||
|
require git
|
||||||
|
[ -n "$(git for-each-ref --format='%(refname:short)' refs/heads/$1)" ]
|
||||||
|
}
|
||||||
|
|
||||||
|
git.update_refs() {
|
||||||
|
require git
|
||||||
|
info "Updating local refs..."
|
||||||
|
git remote update origin --prune 1>/dev/null 2>&1 || exit 18
|
||||||
|
}
|
||||||
|
|
||||||
|
git.delete_remote_branch() {
|
||||||
|
require git
|
||||||
|
[ "$1" ] || die "git.remote_branch_exists: Branch is not specified" 19
|
||||||
|
if git.remote_branch_exists "origin/$1"; then
|
||||||
|
git push origin :"$1" # || die "Could not delete the remote $1 in $ORIGIN"
|
||||||
|
return 0
|
||||||
|
else
|
||||||
|
warn "Trying to delete the remote branch $1, but it does not exists in origin"
|
||||||
|
return 1
|
||||||
|
fi
|
||||||
|
}
|
||||||
|
|
||||||
|
git.is_clean_worktree() {
|
||||||
|
require git
|
||||||
|
git rev-parse --verify HEAD >/dev/null || exit 18
|
||||||
|
git update-index -q --ignore-submodules --refresh
|
||||||
|
git diff-files --quiet --ignore-submodules || return 1
|
||||||
|
git diff-index --quiet --ignore-submodules --cached HEAD -- || return 2
|
||||||
|
return 0
|
||||||
|
}
|
||||||
|
|
||||||
|
git.is_branch_merged_into() {
|
||||||
|
require git
|
||||||
|
[ "$1" ] || die "git.remote_branch_exists: Branch1 is not specified" 19
|
||||||
|
[ "$2" ] || die "git.remote_branch_exists: Branch2 is not specified" 19
|
||||||
|
git.update_refs
|
||||||
|
local merge_hash=$(git merge-base "$1"^{} "$2"^{})
|
||||||
|
local base_hash=$(git rev-parse "$1"^{})
|
||||||
|
[ "$merge_hash" = "$base_hash" ]
|
||||||
|
}
|
||||||
|
|
||||||
|
git.remote_branch_exists() {
|
||||||
|
require git
|
||||||
|
[ "$1" ] || die "git.remote_branch_exists: Branch is not specified" 19
|
||||||
|
git.update_refs
|
||||||
|
[ -n "$(git for-each-ref --format='%(refname:short)' refs/remotes/$1)" ]
|
||||||
|
}
|
||||||
|
|
||||||
|
git.new_branch() {
|
||||||
|
require git
|
||||||
|
[ "$1" ] || die "git.new_branch: Branch is not specified" 19
|
||||||
|
if [ "$2" ] && ! git.local_branch_exists "$2" && git.remote_branch_exists "origin/$2"; then
|
||||||
|
git.co -b "$1" origin/"$2"
|
||||||
|
else
|
||||||
|
git.co -b "$1" "$2"
|
||||||
|
fi
|
||||||
|
}
|
||||||
|
|
||||||
|
git.require_clean_worktree() {
|
||||||
|
require git
|
||||||
|
if ! git.is_clean_worktree; then
|
||||||
|
warn "Your working tree is dirty! Look at this:"
|
||||||
|
git status -bs
|
||||||
|
_T="What should you do now?\n"
|
||||||
|
_T="${_T}\t${BOLD}${FWHITE}0.${RESET} try to continue as is\t- errors may occur!\n"
|
||||||
|
_T="${_T}\t${BOLD}${FWHITE}1.${RESET} hard reset\t\t\t- clear current changes and new files\n"
|
||||||
|
_T="${_T}\t${BOLD}${FWHITE}2.${RESET} stash changes (default)\t- save all changes in safe to apply them later via 'git stash pop'\n"
|
||||||
|
_T="${_T}\t${BOLD}${FWHITE}3.${RESET} cancel\n"
|
||||||
|
ask "${_T}${BOLD}${FWHITE}Your choice [0-3]" reset_answer
|
||||||
|
case $reset_answer in
|
||||||
|
1 ) warn "Clearing your work..." && git.reset ;;
|
||||||
|
3 ) exit ;;
|
||||||
|
* ) git stash -a -u -m "WIP before switch to $branch_task" ;;
|
||||||
|
esac
|
||||||
|
fi
|
||||||
|
}
|
helpers/io.sh (new file, 112 lines)
@@ -0,0 +1,112 @@
|
#!/bin/bash
|
||||||
|
|
||||||
|
########################################################
|
||||||
|
# Simple and fancy input & output
|
||||||
|
########################################################
|
||||||
|
|
||||||
|
IINFO="( i )"
|
||||||
|
INOTE="( * )"
|
||||||
|
IWARN="( # )"
|
||||||
|
IERROR="( ! )"
|
||||||
|
IFATAL="( @ )"
|
||||||
|
ISUCCESS="( ! )"
|
||||||
|
IASK="( ? )"
|
||||||
|
IDEBUG="(DBG)"
|
||||||
|
IVRB="( + )"
|
||||||
|
|
||||||
|
BOLD="\e[1m"
|
||||||
|
DIM="\e[2m"
|
||||||
|
NOTBOLD="\e[22m" # sometimes \e[21m
|
||||||
|
NOTDIM="\e[22m"
|
||||||
|
NORMAL="\e[20m"
|
||||||
|
RESET="\e[0m"
|
||||||
|
|
||||||
|
FRESET="\e[39m"
|
||||||
|
FBLACK="\e[30m"
|
||||||
|
FWHITE="\e[97m"
|
||||||
|
FRED="\e[31m"
|
||||||
|
FGREEN="\e[32m"
|
||||||
|
FYELLOW="\e[33m"
|
||||||
|
FBLUE="\e[34m"
|
||||||
|
FLRED="\e[91m"
|
||||||
|
FLGREEN="\e[92m"
|
||||||
|
FLYELLOW="\e[93m"
|
||||||
|
FLBLUE="\e[94m"
|
||||||
|
|
||||||
|
BRESET="\e[49m"
|
||||||
|
BBLACK="\e[40m"
|
||||||
|
BWHITE="\e[107m"
|
||||||
|
BRED="\e[41m"
|
||||||
|
BGREEN="\e[42m"
|
||||||
|
BYELLOW="\e[43m"
|
||||||
|
BBLUE="\e[44m"
|
||||||
|
BLRED="\e[101m"
|
||||||
|
BLGREEN="\e[102m"
|
||||||
|
BLYELLOW="\e[103m"
|
||||||
|
BLBLUE="\e[104m"
|
||||||
|
|
||||||
|
dt() {
|
||||||
|
echo "[$(date +'%H:%M:%S')] "
|
||||||
|
}
|
||||||
|
|
||||||
|
ask() {
|
||||||
|
IFS= read -rp "$(print ${BOLD}${BBLUE}${FWHITE}${IASK}${BRESET}\ ${BOLD}$1 ): " $2
|
||||||
|
}
|
||||||
|
|
||||||
|
print() {
|
||||||
|
echo -e "$*${RESET}"
|
||||||
|
}
|
||||||
|
|
||||||
|
debug() {
|
||||||
|
if [ "$2" ]; then
|
||||||
|
print "${DIM}${BOLD}${RESET}${DIM}$(dt)${IDEBUG} ${FUNCNAME[1]:-?}():${BASH_LINENO:-?}\t$1 " >&2
|
||||||
|
else
|
||||||
|
print "${DIM}${BOLD}${RESET}${DIM}$(dt)${IDEBUG} $1 " >&2
|
||||||
|
fi
|
||||||
|
}
|
||||||
|
|
||||||
|
verbose() {
|
||||||
|
print "${BOLD}$(dt)${IVRB}${RESET}${FYELLOW} $1 "
|
||||||
|
}
|
||||||
|
|
||||||
|
info() {
|
||||||
|
print "${BOLD}$(dt)${FWHITE}${BLBLUE}${IINFO}${RESET}${FWHITE} $1 "
|
||||||
|
}
|
||||||
|
|
||||||
|
note() {
|
||||||
|
print "${BOLD}$(dt)${DIM}${FWHITE}${INOTE}${RESET} $1 "
|
||||||
|
}
|
||||||
|
|
||||||
|
success() {
|
||||||
|
print "${BOLD}$(dt)${BGREEN}${FWHITE}${ISUCCESS}${BRESET}$FGREEN $1 "
|
||||||
|
}
|
||||||
|
|
||||||
|
warn() {
|
||||||
|
print "${BOLD}$(dt)${BYELLOW}${FBLACK}${IWARN}${BRESET}${FYELLOW} Warning:${RESET} $1 "
|
||||||
|
}
|
||||||
|
|
||||||
|
error() {
|
||||||
|
print "${BOLD}$(dt)${BLRED}${FWHITE}${IERROR} Error: ${BRESET}${FLRED} $1 " >&2
|
||||||
|
}
|
||||||
|
|
||||||
|
fatal() {
|
||||||
|
print "${BOLD}$(dt)${BRED}${FWHITE}${IFATAL} FATAL: $1 " >&2
|
||||||
|
print_stacktrace
|
||||||
|
}
|
||||||
|
|
||||||
|
die() {
|
||||||
|
error "${1:-halted}"
|
||||||
|
exit ${2:-255}
|
||||||
|
}
|
||||||
|
|
||||||
|
# var='test var_dump'
|
||||||
|
# var_dump var
|
||||||
|
# debug 'test debug'
|
||||||
|
# verbose 'test verbose'
|
||||||
|
# info 'test info'
|
||||||
|
# note 'test note'
|
||||||
|
# success 'test success'
|
||||||
|
# warn 'test warn'
|
||||||
|
# error 'test error'
|
||||||
|
# fatal 'test fatal'
|
||||||
|
# die 'test die'
|
helpers/log.sh (new file, 13 lines)
@@ -0,0 +1,13 @@
#!/bin/bash

########################################################
# Logging functions
########################################################

# write some message $1 in log file and stdout with timestamp
log_path="/home/$USER/logs"
log() {
    [ ! -d "$log_path" ] && log_path="./log"
    [ ! -d "$log_path" ] && mkdir -p "$log_path"
    echo -e "[$(date '+%H:%M:%S')] $*" | tee -a "$log_path/$(date '+%Y%m%d').log"
}
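A small usage sketch; as the function above shows, messages land both on stdout and in a per-day file under `~/logs` (or `./log` as a fallback):

```shell
source helpers/log.sh    # path assumed relative to the repo root

log "backup started"     # -> "[HH:MM:SS] backup started" on stdout and in YYYYMMDD.log
log "backup finished"
```
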
helpers/misc.sh (new file, 66 lines)
@@ -0,0 +1,66 @@
|
#!/bin/bash
|
||||||
|
|
||||||
|
########################################################
|
||||||
|
# Misc
|
||||||
|
########################################################
|
||||||
|
|
||||||
|
# https://askubuntu.com/a/30414
|
||||||
|
is_full_screen() {
|
||||||
|
local WINDOW=$(echo $(xwininfo -id $(xdotool getactivewindow) -stats | \
|
||||||
|
egrep '(Width|Height):' | \
|
||||||
|
awk '{print $NF}') | \
|
||||||
|
sed -e 's/ /x/')
|
||||||
|
local SCREEN=$(xdpyinfo | grep -m1 dimensions | awk '{print $2}')
|
||||||
|
if [ "$WINDOW" = "$SCREEN" ]; then
|
||||||
|
return 0
|
||||||
|
fi
|
||||||
|
return 1
|
||||||
|
}
|
||||||
|
|
||||||
|
curltime() {
|
||||||
|
curl -w @- -o /dev/null -s "$@" <<'EOF'
|
||||||
|
time_namelookup: %{time_namelookup} sec\n
|
||||||
|
time_connect: %{time_connect} sec\n
|
||||||
|
time_appconnect: %{time_appconnect} sec\n
|
||||||
|
time_pretransfer: %{time_pretransfer} sec\n
|
||||||
|
time_redirect: %{time_redirect} sec\n
|
||||||
|
time_starttransfer: %{time_starttransfer} sec\n
|
||||||
|
---------------\n
|
||||||
|
time_total: %{time_total} sec\n
|
||||||
|
EOF
|
||||||
|
}
|
||||||
|
|
||||||
|
ytm() {
|
||||||
|
youtube-dl \
|
||||||
|
--extract-audio \
|
||||||
|
--audio-format flac \
|
||||||
|
--audio-quality 0 \
|
||||||
|
--format bestaudio \
|
||||||
|
--write-info-json \
|
||||||
|
--output "$HOME/Downloads/ytm/%(playlist_title)s/%(channel)s - %(title)s.%(ext)s" \
|
||||||
|
$*
|
||||||
|
}
|
||||||
|
|
||||||
|
docker.ip() { # not finished
|
||||||
|
if [ "$1" ]; then
|
||||||
|
if [ "$1" = "-a" ]; then
|
||||||
|
docker ps -aq \
|
||||||
|
| xargs -n 1 docker inspect --format '{{.Name}}{{range .NetworkSettings.Networks}} {{.IPAddress}}{{end}}' \
|
||||||
|
| sed -e 's#^/##' \
|
||||||
|
| column -t
|
||||||
|
elif [ "$1" = "-c" ]; then
|
||||||
|
docker-compose ps -q \
|
||||||
|
| xargs -n 1 docker inspect --format '{{.Name}}{{range .NetworkSettings.Networks}} {{.IPAddress}}{{end}}' \
|
||||||
|
| sed -e 's#^/##' \
|
||||||
|
| column -t
|
||||||
|
else
|
||||||
|
docker inspect --format '{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' "$1"
|
||||||
|
docker port "$1"
|
||||||
|
fi
|
||||||
|
else
|
||||||
|
docker ps -q \
|
||||||
|
| xargs -n 1 docker inspect --format '{{.Name}}{{range .NetworkSettings.Networks}} {{.IPAddress}}{{end}}' \
|
||||||
|
| sed -e 's#^/##' \
|
||||||
|
| column -t
|
||||||
|
fi
|
||||||
|
}
|
helpers/notifications.sh (new file, 32 lines)
@@ -0,0 +1,32 @@
#!/bin/bash
source $( dirname $(readlink -e -- "${BASH_SOURCE}"))/packages.sh || exit 255

########################################################
# Desktop notifications
########################################################

notify () {
    require "notify-send"
    [ -n "$1" ] && local title="$1" || local title="My notification"
    local text="$2"
    local level="$3"
    local icon="$4"
    case $level in
        "critical") local timeout=0 ;;
        "low") local timeout=5000 ;;
        *) local timeout=10000 ;;
    esac
    notify-send "$title" "$text" -a "MyScript" -u "$level" -i "$icon" -t $timeout
}

notify_error() {
    notify "Error" "$1" "critical" "dialog-error"
}

notify_warning() {
    notify "Warning" "$1" "normal" "dialog-warning"
}

notify_info() {
    notify "" "$1" "low" "dialog-information"
}
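A usage sketch; urgency levels and icon names follow the notify-send conventions already used above, and the message texts are invented:

```shell
notify "Backups" "Nightly backup finished" "low" "dialog-information"
notify_error "Disk is almost full"      # critical urgency, never auto-expires
notify_warning "Battery below 15%"      # normal urgency, 10s timeout
```
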
helpers/packages.sh (new file, 72 lines)
@@ -0,0 +1,72 @@
|
#!/bin/bash
|
||||||
|
source $( dirname $(readlink -e -- "${BASH_SOURCE}"))/io.sh || exit 255
|
||||||
|
|
||||||
|
########################################################
|
||||||
|
# Functions to control system packages
|
||||||
|
########################################################
|
||||||
|
|
||||||
|
installed() {
|
||||||
|
command -v "$1" >/dev/null 2>&1
|
||||||
|
}
|
||||||
|
|
||||||
|
installed_pkg() {
|
||||||
|
dpkg --list | grep -qw "ii $1"
|
||||||
|
}
|
||||||
|
|
||||||
|
apt_ppa_add() {
|
||||||
|
sudo add-apt-repository -y $*
|
||||||
|
}
|
||||||
|
|
||||||
|
apt_ppa_remove() {
|
||||||
|
sudo add-apt-repository -ry $*
|
||||||
|
}
|
||||||
|
|
||||||
|
apt_update() {
|
||||||
|
sudo apt update $*
|
||||||
|
}
|
||||||
|
|
||||||
|
apt_install() {
|
||||||
|
sudo apt install -y $*
|
||||||
|
}
|
||||||
|
|
||||||
|
apt_remove() {
|
||||||
|
sudo apt purge -y $*
|
||||||
|
}
|
||||||
|
|
||||||
|
dpkg_install() {
|
||||||
|
sudo dpkg -i $*
|
||||||
|
}
|
||||||
|
|
||||||
|
dpkg_remove() {
|
||||||
|
sudo dpkg -r $*
|
||||||
|
}
|
||||||
|
|
||||||
|
dpkg_arch() {
|
||||||
|
dpkg --print-architecture
|
||||||
|
}
|
||||||
|
|
||||||
|
require() {
|
||||||
|
sw=()
|
||||||
|
for package in "$@"; do
|
||||||
|
if ! installed "$package" && ! installed_pkg "$package"; then
|
||||||
|
sw+=("$package")
|
||||||
|
fi
|
||||||
|
done
|
||||||
|
if [ ${#sw[@]} -gt 0 ]; then
|
||||||
|
info "These packages will be installed in your system:\n${sw[*]}"
|
||||||
|
apt_install ${sw[*]}
|
||||||
|
[ $? -gt 0 ] && die "installation cancelled" 201
|
||||||
|
fi
|
||||||
|
}
|
||||||
|
|
||||||
|
require_pkg() {
|
||||||
|
sw=()
|
||||||
|
for package in "$@"; do
|
||||||
|
if ! installed "$package" && ! installed_pkg "$package"; then
|
||||||
|
sw+=("$package")
|
||||||
|
fi
|
||||||
|
done
|
||||||
|
if [ ${#sw[@]} -gt 0 ]; then
|
||||||
|
die "These packages must be installed in your system:\n${sw[*]}" 200
|
||||||
|
fi
|
||||||
|
}
|
helpers/testing.sh (new file, 89 lines)
@@ -0,0 +1,89 @@
|
#!/bin/bash
|
||||||
|
source $( dirname $(readlink -e -- "${BASH_SOURCE}"))/io.sh || exit 255
|
||||||
|
|
||||||
|
########################################################
|
||||||
|
# Testing functions
|
||||||
|
########################################################
|
||||||
|
|
||||||
|
# $1 - command to exec
|
||||||
|
assert_exec() {
|
||||||
|
[ "$1" ] || exit 1
|
||||||
|
local prefix="$(dt)${BOLD}${FWHITE}[TEST EXEC]"
|
||||||
|
if $($1 1>/dev/null 2>&1); then
|
||||||
|
local text="${BGREEN} PASSED"
|
||||||
|
else
|
||||||
|
local text="${BLRED} FAILED"
|
||||||
|
fi
|
||||||
|
print "${prefix} ${text} ${BRESET} ($?):${RESET} $1"
|
||||||
|
}
|
||||||
|
# usage:
|
||||||
|
|
||||||
|
# func1() {
|
||||||
|
# return 0
|
||||||
|
# }
|
||||||
|
# func2() {
|
||||||
|
# return 1
|
||||||
|
# }
|
||||||
|
# assert_exec "func1" # PASSED
|
||||||
|
# assert_exec "func2" # FAILED
|
||||||
|
# assert_exec "whoami" # PASSED
|
||||||
|
|
||||||
|
|
||||||
|
# $1 - command to exec
|
||||||
|
# $2 - expected output
|
||||||
|
assert_output() {
|
||||||
|
[ "$1" ] || exit 1
|
||||||
|
[ "$2" ] && local expected="$2" || local expected=''
|
||||||
|
local prefix="$(dt)${BOLD}${FWHITE}[TEST OUTP]"
|
||||||
|
local output=$($1 2>&1)
|
||||||
|
local code=$?
|
||||||
|
if [[ "$output" == *"$expected"* ]]; then
|
||||||
|
local text="${BGREEN} PASSED"
|
||||||
|
else
|
||||||
|
local text="${BLRED} FAILED"
|
||||||
|
fi
|
||||||
|
print "${prefix} ${text} ${BRESET} (${code}|${expected}):${RESET} $1"
|
||||||
|
# print "\tOutput > $output"
|
||||||
|
}
|
||||||
|
# usage:
|
||||||
|
|
||||||
|
# func1() {
|
||||||
|
# echo "some string"
|
||||||
|
# }
|
||||||
|
# func2() {
|
||||||
|
# echo "another string"
|
||||||
|
# }
|
||||||
|
# expect_output "func1" "string" # PASSED
|
||||||
|
# expect_output "func2" "some" # FAILED
|
||||||
|
# expect_output "func2" "string" # PASSED
|
||||||
|
|
||||||
|
|
||||||
|
# $1 - command to exec
|
||||||
|
# $2 - expected exit-code
|
||||||
|
assert_code() {
|
||||||
|
[ "$1" ] || exit 1
|
||||||
|
[ "$2" ] && local expected=$2 || local expected=0
|
||||||
|
local prefix="$(dt)${BOLD}${FWHITE}[TEST CODE]"
|
||||||
|
$($1 1>/dev/null 2>&1)
|
||||||
|
local code=$?
|
||||||
|
if [[ $code -eq $expected ]]; then
|
||||||
|
local text="${BGREEN} PASSED"
|
||||||
|
else
|
||||||
|
local text="${BLRED} FAILED"
|
||||||
|
fi
|
||||||
|
print "${prefix} ${text} ${BRESET} (${code}|${expected}):${RESET} $1"
|
||||||
|
}
|
||||||
|
# usage:
|
||||||
|
|
||||||
|
# func1() {
|
||||||
|
# # exit 0
|
||||||
|
# return 0
|
||||||
|
# }
|
||||||
|
# func2() {
|
||||||
|
# # exit 1
|
||||||
|
# return 1
|
||||||
|
# }
|
||||||
|
# expect_code "func1" 0 # PASSED
|
||||||
|
# expect_code "func1" 1 # FAILED
|
||||||
|
# expect_code "func2" 0 # FAILED
|
||||||
|
# expect_code "func2" 1 # PASSED
|
@ -1,15 +1,11 @@
|
|||||||
#!/bin/bash
|
#!/bin/bash
|
||||||
##makedesc: Install apache2 (latest)
|
##makedesc: Install apache2 (latest)
|
||||||
source `dirname $0`/../helpers || exit 255
|
source "$( dirname $(readlink -e -- "${BASH_SOURCE}"))/../helpers.sh" || exit 255
|
||||||
|
|
||||||
title
|
title
|
||||||
|
|
||||||
sudo apt install -y --autoremove apache2 && \
|
apt_install apache2
|
||||||
sudo systemctl restart apache2
|
sudo systemctl restart apache2
|
||||||
|
|
||||||
[ $? = 0 ] && {
|
success "apache2 installed!"
|
||||||
echo
|
apache2 -v
|
||||||
success "apache2 installed!"
|
|
||||||
apache2 -v
|
|
||||||
echo
|
|
||||||
}
|
|
||||||
|
13
install/apt
13
install/apt
@ -1,8 +1,5 @@
|
|||||||
#!/bin/bash
|
#!/bin/bash
|
||||||
##makedesc: Install software from apt
|
##makedesc: Install software from apt
|
||||||
source `dirname $0`/../helpers || exit 255
|
|
||||||
|
|
||||||
title
|
|
||||||
|
|
||||||
sudo apt update && \
|
sudo apt update && \
|
||||||
sudo apt upgrade -y --autoremove && \
|
sudo apt upgrade -y --autoremove && \
|
||||||
@ -27,7 +24,6 @@ sudo apt update && \
|
|||||||
libghc-zlib-dev \
|
libghc-zlib-dev \
|
||||||
libssl-dev \
|
libssl-dev \
|
||||||
lsb-release \
|
lsb-release \
|
||||||
lsp-plugins \
|
|
||||||
make \
|
make \
|
||||||
mc \
|
mc \
|
||||||
meld \
|
meld \
|
||||||
@ -36,7 +32,7 @@ sudo apt update && \
|
|||||||
net-tools \
|
net-tools \
|
||||||
nmap \
|
nmap \
|
||||||
p7zip-full \
|
p7zip-full \
|
||||||
pulseeffects \
|
easyeffects \
|
||||||
software-properties-common \
|
software-properties-common \
|
||||||
terminator \
|
terminator \
|
||||||
ubuntu-restricted-extras \
|
ubuntu-restricted-extras \
|
||||||
@ -51,9 +47,4 @@ sudo apt update && \
|
|||||||
tree \
|
tree \
|
||||||
earlyoom
|
earlyoom
|
||||||
# sqlitebrowser
|
# sqlitebrowser
|
||||||
# etckeeper \
|
# etckeeper
|
||||||
# geoclue-2.0 \
|
|
||||||
# gnome-software \
|
|
||||||
# minder \
|
|
||||||
# redshift \
|
|
||||||
# redshift-gtk
|
|
||||||
|
@ -1,17 +1,13 @@
|
|||||||
#!/bin/bash
|
#!/bin/bash
|
||||||
##makedesc: Install Canon Pixma MG2500 + ppa
|
##makedesc: Install Canon Pixma MG2500 + ppa
|
||||||
source `dirname $0`/../helpers || exit 255
|
source "$( dirname $(readlink -e -- "${BASH_SOURCE}"))/../helpers.sh" || exit 255
|
||||||
|
|
||||||
title
|
title
|
||||||
|
|
||||||
sudo add-apt-repository -y ppa:thierry-f/fork-michael-gruz && \
|
apt_ppa_add ppa:thierry-f/fork-michael-gruz
|
||||||
sudo apt install -y \
|
|
||||||
cnijfilter-mg2500series \
|
|
||||||
scangearmp-mg2500series
|
|
||||||
|
|
||||||
[ $? = 0 ] && {
|
apt_install cnijfilter-mg2500series
|
||||||
echo
|
apt_install scangearmp-mg2500series
|
||||||
success "Canon Pixma MG2500 installed!"
|
|
||||||
info "Now you must add a new printer in your system."
|
success "Drivers for Canon Pixma MG2500 installed!"
|
||||||
echo
|
info "Now you must reboot PC and connect your printer."
|
||||||
}
|
|
||||||
|
@ -1,20 +1,17 @@
|
|||||||
#!/bin/bash
|
#!/bin/bash
|
||||||
##makedesc: Install google chrome (latest)
|
##makedesc: Install google chrome (latest)
|
||||||
source `dirname $0`/../helpers || exit 255
|
source "$( dirname $(readlink -e -- "${BASH_SOURCE}"))/../helpers.sh" || exit 255
|
||||||
|
|
||||||
# https://t.me/axenov_blog/251
|
# https://t.me/axenov_blog/251
|
||||||
|
|
||||||
title
|
title
|
||||||
|
|
||||||
mkdir -p "$HOME/install"
|
mkdir -p "$HOME/install/deb"
|
||||||
|
|
||||||
download "https://dl.google.com/linux/direct/google-chrome-stable_current_amd64.deb" \
|
download "https://dl.google.com/linux/direct/google-chrome-stable_current_amd64.deb" \
|
||||||
"$HOME/install/google-chrome.deb" && \
|
"$HOME/install/deb/google-chrome.deb"
|
||||||
sudo dpkg -i "$HOME/install/google-chrome.deb"
|
|
||||||
|
|
||||||
[ $? = 0 ] && {
|
dpkg_install "$HOME/install/deb/google-chrome.deb"
|
||||||
echo
|
|
||||||
success "Google Chrome installed!"
|
success "Google Chrome installed!"
|
||||||
google-chrome --version
|
google-chrome --version
|
||||||
echo
|
|
||||||
}
|
|
||||||
|
@ -1,6 +1,7 @@
|
|||||||
#!/bin/bash
|
#!/bin/bash
|
||||||
##makedesc: Install composer (latest)
|
##makedesc: Install composer (latest)
|
||||||
source `dirname $0`/../helpers || exit 255
|
here=$( dirname $(readlink -e -- "${BASH_SOURCE}"))
|
||||||
|
source "$here/../helpers.sh" || exit 255
|
||||||
|
|
||||||
# https://getcomposer.org/doc/faqs/how-to-install-composer-programmatically.md
|
# https://getcomposer.org/doc/faqs/how-to-install-composer-programmatically.md
|
||||||
|
|
||||||
@ -8,34 +9,30 @@ title
|
|||||||
|
|
||||||
require php
|
require php
|
||||||
|
|
||||||
if installed "composer"; then
|
if installed composer; then
|
||||||
warn "WARNING: Removing current composer to install its actual version"
|
warn "WARNING: Removing current composer to install latest one"
|
||||||
sudo apt remove -y --autoremove composer
|
composer --version
|
||||||
sudo rm -f \
|
__AAA_NO_TITLE=1 source $here/../uninstall/composer
|
||||||
"$HOME/.local/bin/composer" \
|
|
||||||
/bin/composer \
|
|
||||||
/usr/bin/composer \
|
|
||||||
/usr/local/bin/composer \
|
|
||||||
/usr/src/composer \
|
|
||||||
"$HOME/.local/bin/composer"
|
|
||||||
fi
|
fi
|
||||||
|
|
||||||
mkdir -p "$HOME/install" "$HOME/.local/bin"
|
mkdir -p "$HOME/install/other" "$HOME/.local/bin"
|
||||||
download "https://getcomposer.org/installer" "$HOME/install/composer-setup.php" && \
|
|
||||||
php "$HOME/install/composer-setup.php" --install-dir="$HOME/.local/bin/composer"
|
|
||||||
|
|
||||||
[ $? = 0 ] && {
|
download "https://getcomposer.org/installer" \
|
||||||
COMPOSER_GLOBAL_HOME="$($HOME/.local/bin/composer config -g home)"
|
"$HOME/install/other/composer-setup.php"
|
||||||
NEWPATH="export PATH=\"$COMPOSER_GLOBAL_HOME/vendor/bin:\${PATH}\""
|
|
||||||
cat "$HOME/.profile" | grep -qoh "$NEWPATH" || {
|
|
||||||
$NEWPATH
|
|
||||||
echo "$NEWPATH" >> "$HOME/.profile"
|
|
||||||
}
|
|
||||||
|
|
||||||
echo
|
php "$HOME/install/other/composer-setup.php" \
|
||||||
success "composer installed!"
|
--install-dir="$HOME/.local/bin/" \
|
||||||
composer --version
|
--filename="composer"
|
||||||
echo
|
|
||||||
|
COMPOSER_GLOBAL_HOME="$($HOME/.local/bin/composer config -g home)"
|
||||||
|
NEWPATH="export PATH=\"$COMPOSER_GLOBAL_HOME/vendor/bin:\${PATH}\""
|
||||||
|
cat "$HOME/.profile" | grep -qoh "$NEWPATH" || {
|
||||||
|
$NEWPATH
|
||||||
|
echo "$NEWPATH" >> "$HOME/.profile"
|
||||||
}
|
}
|
||||||
|
|
||||||
source "$HOME/.profile"
|
source "$HOME/.profile"
|
||||||
|
|
||||||
|
success "composer installed!"
|
||||||
|
composer --version
|
||||||
|
|
||||||
|
@ -1,35 +1,39 @@
|
|||||||
#!/bin/bash
|
#!/bin/bash
|
||||||
##makedesc: Install docker (latest) + docker-compose (latest) + ppa
|
##makedesc: Install docker (latest) + ppa
|
||||||
source `dirname $0`/../helpers || exit 255
|
source "$( dirname $(readlink -e -- "${BASH_SOURCE}"))/../helpers.sh" || exit 255
|
||||||
|
|
||||||
# https://docs.docker.com/engine/install/ubuntu/
|
# https://docs.docker.com/engine/install/ubuntu/
|
||||||
|
# https://docs.docker.com/engine/install/linux-postinstall/#manage-docker-as-a-non-root-user
|
||||||
|
|
||||||
title
|
title
|
||||||
|
|
||||||
require ca-certificates
|
for pkg in docker.io docker-doc docker-compose docker-compose-v2 podman-docker containerd runc; do
|
||||||
require curl
|
apt_remove $pkg
|
||||||
require gnupg
|
done
|
||||||
require lsb-release
|
|
||||||
|
|
||||||
sudo mkdir -p /etc/apt/keyrings
|
require ca-certificates lsb-release
|
||||||
curl -fsSL https://download.docker.com/linux/ubuntu/gpg \
|
source /etc/os-release
|
||||||
| sudo gpg --dearmor -o /etc/apt/keyrings/docker.gpg && \
|
|
||||||
sudo chmod a+r /etc/apt/keyrings/docker.gpg && \
|
|
||||||
echo "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.gpg] https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable" \
|
|
||||||
| sudo tee /etc/apt/sources.list.d/docker.list > /dev/null && \
|
|
||||||
sudo apt update && \
|
|
||||||
sudo apt install -y --autoremove \
|
|
||||||
docker-ce \
|
|
||||||
docker-ce-cli \
|
|
||||||
containerd.io \
|
|
||||||
docker-compose-plugin && \
|
|
||||||
sudo usermod -aG docker $(whoami)
|
|
||||||
|
|
||||||
[ $? = 0 ] && {
|
key="/etc/apt/keyrings/docker.asc"
|
||||||
echo
|
|
||||||
success "Docker installed!"
|
sudo install -m 0755 -d /etc/apt/keyrings
|
||||||
info "Probably, you need to relogin to apply 'docker' group."
|
sudo cdownload https://download.docker.com/linux/ubuntu/gpg $key
|
||||||
info "Your ones currently are: $(groups)"
|
sudo chmod a+r $key
|
||||||
docker --version
|
|
||||||
echo
|
echo "deb [arch=$(dpkg_arch) signed-by=$key] https://download.docker.com/linux/ubuntu $VERSION_CODENAME stable" \
|
||||||
}
|
| sudo tee /etc/apt/sources.list.d/docker.list > /dev/null
|
||||||
|
|
||||||
|
apt_update
|
||||||
|
apt_install \
|
||||||
|
docker-ce \
|
||||||
|
docker-ce-cli \
|
||||||
|
containerd.io \
|
||||||
|
docker-buildx-plugin \
|
||||||
|
docker-compose-plugin
|
||||||
|
|
||||||
|
sudo usermod -aG docker $(whoami)
|
||||||
|
newgrp docker
|
||||||
|
|
||||||
|
success "Docker installed!"
|
||||||
|
docker --version
|
||||||
|
info "Probably, you need to relogin to apply 'docker' group permanently."
|
||||||
|
@ -1,6 +1,6 @@
|
|||||||
#!/bin/bash
|
#!/bin/bash
|
||||||
##makedesc: Install dotfiles
|
##makedesc: Install dotfiles
|
||||||
source `dirname $0`/../helpers || exit 255
|
source "$( dirname $(readlink -e -- "${BASH_SOURCE}"))/../helpers.sh" || exit 255
|
||||||
|
|
||||||
title
|
title
|
||||||
|
|
||||||
|
@ -1,22 +1,21 @@
|
|||||||
#!/bin/bash
|
#!/bin/bash
|
||||||
##makedesc: Install droidcam v1.9.0
|
##makedesc: Install droidcam v1.9.0
|
||||||
source `dirname $0`/../helpers || exit 255
|
source "$( dirname $(readlink -e -- "${BASH_SOURCE}"))/../helpers.sh" || exit 255
|
||||||
|
|
||||||
# https://www.dev47apps.com/droidcam/linux/
|
#TODO
|
||||||
|
exit
|
||||||
|
|
||||||
|
# https://droidcam.app/linux/
|
||||||
|
|
||||||
title
|
title
|
||||||
|
|
||||||
mkdir -p "$HOME/install/droidcam"
|
mkdir -p "$HOME/install/deb/"
|
||||||
|
|
||||||
download "https://files.dev47apps.net/linux/droidcam_1.9.0.zip" "$HOME/install/droidcam.zip" && \
|
download "https://beta.droidcam.app/go/droidCam.client.setup.deb" "$HOME/install/deb/droidcam_client_amd64.deb"
|
||||||
unzip -oq "$HOME/install/droidcam.zip" -d "$HOME/install/droidcam" && \
|
|
||||||
cd "$HOME/install/droidcam" && \
|
|
||||||
sudo ./install-client
|
|
||||||
|
|
||||||
[ $? = 0 ] && {
|
dpkg_install "$HOME/install/deb/droidcam_client_amd64.deb"
|
||||||
echo
|
apt_install v4l2loopback-dkms
|
||||||
success "droidcam installed!"
|
|
||||||
info "Don't forget to install the android app:"
|
success "droidcam installed!"
|
||||||
info "https://play.google.com/store/apps/developer?id=Dev47Apps"
|
info "Don't forget to install the android app:"
|
||||||
echo
|
info "https://play.google.com/store/apps/developer?id=Dev47Apps"
|
||||||
}
|
|
||||||
|
@ -1,11 +1,22 @@
|
|||||||
#!/bin/bash
|
#!/bin/bash
|
||||||
##makedesc: Install droidcam-obs plugin v1.5.1
|
##makedesc: Install droidcam-obs plugin v1.5.1
|
||||||
|
|
||||||
|
#TODO
|
||||||
|
# ffmpeg -version | head -n 1 | awk '{print $3}'
|
||||||
|
# https://github.com/dev47apps/droidcam-obs-plugin/releases
|
||||||
|
|
||||||
|
exit
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
# https://www.dev47apps.com/droidcam/linux/
|
# https://www.dev47apps.com/droidcam/linux/
|
||||||
# https://www.dev47apps.com/obs/
|
# https://www.dev47apps.com/obs/
|
||||||
# https://www.dev47apps.com/obs/usage.html
|
# https://www.dev47apps.com/obs/usage.html
|
||||||
# https://obsproject.com/forum/threads/how-to-start-virtual-camera-without-sudo-privileges.139783/
|
# https://obsproject.com/forum/threads/how-to-start-virtual-camera-without-sudo-privileges.139783/
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
echo
|
echo
|
||||||
echo "==============================================="
|
echo "==============================================="
|
||||||
echo "Installing droidcam-obs..."
|
echo "Installing droidcam-obs..."
|
||||||
|
@ -1,6 +1,6 @@
|
|||||||
#!/bin/bash
|
#!/bin/bash
|
||||||
##makedesc: Install flameshot (latest)
|
##makedesc: Install flameshot (latest)
|
||||||
source `dirname $0`/../helpers || exit 255
|
source "$( dirname $(readlink -e -- "${BASH_SOURCE}"))/../helpers.sh" || exit 255
|
||||||
|
|
||||||
title
|
title
|
||||||
|
|
||||||
|
@ -1,6 +1,6 @@
|
|||||||
#!/bin/bash
|
#!/bin/bash
|
||||||
##makedesc: Install Wireguard + FRKN
|
##makedesc: Install Wireguard + FRKN
|
||||||
source `dirname $0`/../helpers || exit 255
|
source "$( dirname $(readlink -e -- "${BASH_SOURCE}"))/../helpers.sh" || exit 255
|
||||||
|
|
||||||
# https://frkn.org/ru/installation
|
# https://frkn.org/ru/installation
|
||||||
|
|
||||||
|
@ -1,6 +1,6 @@
|
|||||||
#!/bin/bash
|
#!/bin/bash
|
||||||
##makedesc: Install git (latest)
|
##makedesc: Install git (latest)
|
||||||
source `dirname $0`/../helpers || exit 255
|
source "$( dirname $(readlink -e -- "${BASH_SOURCE}"))/../helpers.sh" || exit 255
|
||||||
|
|
||||||
title
|
title
|
||||||
|
|
||||||
|
@ -1,6 +1,6 @@
|
|||||||
#!/bin/bash
|
#!/bin/bash
|
||||||
##makedesc: Install golang v1.21.0
|
##makedesc: Install golang v1.21.0
|
||||||
source `dirname $0`/../helpers || exit 255
|
source "$( dirname $(readlink -e -- "${BASH_SOURCE}"))/../helpers.sh" || exit 255
|
||||||
|
|
||||||
# https://go.dev/dl/
|
# https://go.dev/dl/
|
||||||
# https://golang.org/doc/install
|
# https://golang.org/doc/install
|
||||||
|
@ -1,6 +1,6 @@
|
|||||||
#!/bin/bash
|
#!/bin/bash
|
||||||
##makedesc: Install grub-customizer (latest + ppa)
|
##makedesc: Install grub-customizer (latest + ppa)
|
||||||
source `dirname $0`/../helpers || exit 255
|
source "$( dirname $(readlink -e -- "${BASH_SOURCE}"))/../helpers.sh" || exit 255
|
||||||
|
|
||||||
title
|
title
|
||||||
|
|
||||||
|
@ -1,6 +1,6 @@
|
|||||||
#!/bin/bash
|
#!/bin/bash
|
||||||
##makedesc: Install JetBrains Mono fonts
|
##makedesc: Install JetBrains Mono fonts
|
||||||
source `dirname $0`/../helpers || exit 255
|
source "$( dirname $(readlink -e -- "${BASH_SOURCE}"))/../helpers.sh" || exit 255
|
||||||
|
|
||||||
# https://www.jetbrains.com/lp/mono/#how-to-install
|
# https://www.jetbrains.com/lp/mono/#how-to-install
|
||||||
|
|
||||||
|
@ -1,6 +1,6 @@
|
|||||||
#!/bin/bash
|
#!/bin/bash
|
||||||
##makedesc: Install KDE Backports
|
##makedesc: Install KDE Backports
|
||||||
source `dirname $0`/../helpers || exit 255
|
source "$( dirname $(readlink -e -- "${BASH_SOURCE}"))/../helpers.sh" || exit 255
|
||||||
|
|
||||||
title
|
title
|
||||||
|
|
||||||
|
@ -1,6 +1,15 @@
|
|||||||
#!/bin/bash
|
#!/bin/bash
|
||||||
##makedesc: Install omz fancy (powerline10k + MesloLGS font)
|
##makedesc: Install omz fancy (powerline10k + MesloLGS font)
|
||||||
|
|
||||||
|
# https://gist.github.com/anthonyaxenov/b8460935d06b9f0da72def03d0f26515
|
||||||
|
|
||||||
|
# Based on:
|
||||||
|
# https://github.com/Powerlevel9k/powerlevel9k/wiki/Install-Instructions
|
||||||
|
# https://github.com/ohmyzsh/ohmyzsh
|
||||||
|
# https://powerline.readthedocs.io/en/latest/installation/linux.html#fonts-installation
|
||||||
|
# https://gist.github.com/dogrocker/1efb8fd9427779c827058f873b94df95
|
||||||
|
# https://linuxhint.com/install_zsh_shell_ubuntu_1804/
|
||||||
|
|
||||||
echo
|
echo
|
||||||
echo "==============================================="
|
echo "==============================================="
|
||||||
echo "Installing omz fancy: powerline10k + MesloLGS font..."
|
echo "Installing omz fancy: powerline10k + MesloLGS font..."
|
||||||
|
@ -1,6 +1,6 @@
|
|||||||
#!/bin/bash
|
#!/bin/bash
|
||||||
##makedesc: Install openvpn v2.6.3 (src)
|
##makedesc: Install openvpn v2.6.3 (src)
|
||||||
source `dirname $0`/../helpers || exit 255
|
source "$( dirname $(readlink -e -- "${BASH_SOURCE}"))/../helpers.sh" || exit 255
|
||||||
|
|
||||||
# https://openvpn.net/community-downloads/
|
# https://openvpn.net/community-downloads/
|
||||||
# https://openvpn.net/community-resources/installing-openvpn/
|
# https://openvpn.net/community-resources/installing-openvpn/
|
||||||
@ -15,7 +15,7 @@ require libssl-dev \
|
|||||||
|
|
||||||
mkdir -p "$HOME/install/"
|
mkdir -p "$HOME/install/"
|
||||||
download "https://swupdate.openvpn.org/community/releases/openvpn-${OVPNVER}.tar.gz" "$HOME/install/openvpn-${OVPNVER}.tar.gz" && \
|
download "https://swupdate.openvpn.org/community/releases/openvpn-${OVPNVER}.tar.gz" "$HOME/install/openvpn-${OVPNVER}.tar.gz" && \
|
||||||
unpak_targz "$HOME/install/openvpn-${OVPNVER}.tar.gz" "$HOME/install/" && \
|
unpack_targz "$HOME/install/openvpn-${OVPNVER}.tar.gz" "$HOME/install/" && \
|
||||||
cd "$HOME/install/openvpn-${OVPNVER}" && \
|
cd "$HOME/install/openvpn-${OVPNVER}" && \
|
||||||
sudo ./configure && \
|
sudo ./configure && \
|
||||||
sudo make && \
|
sudo make && \
|
||||||
|
@ -1,6 +1,6 @@
|
|||||||
#!/bin/bash
|
#!/bin/bash
|
||||||
##makedesc: Install papirus-icon-theme (latest)
|
##makedesc: Install papirus-icon-theme (latest)
|
||||||
source `dirname $0`/../helpers || exit 255
|
source "$( dirname $(readlink -e -- "${BASH_SOURCE}"))/../helpers.sh" || exit 255
|
||||||
|
|
||||||
title
|
title
|
||||||
|
|
||||||
|
@ -1,6 +1,6 @@
|
|||||||
#!/bin/bash
|
#!/bin/bash
|
||||||
##makedesc: Install postgresql (latest) and php-pgsql (if php is installed)
|
##makedesc: Install postgresql (latest) and php-pgsql (if php is installed)
|
||||||
source `dirname $0`/../helpers || exit 255
|
source "$( dirname $(readlink -e -- "${BASH_SOURCE}"))/../helpers.sh" || exit 255
|
||||||
|
|
||||||
title
|
title
|
||||||
|
|
||||||
|
@ -1,7 +1,7 @@
|
|||||||
#!/bin/bash
|
#!/bin/bash
|
||||||
##makedesc: Install php v8.1 + ppa
|
##makedesc: Install php v8.3 + ppa
|
||||||
|
|
||||||
[ $1 ] && PHPVER="$1" || PHPVER="8.1"
|
[ $1 ] && PHPVER="$1" || PHPVER="8.3"
|
||||||
echo
|
echo
|
||||||
echo "==============================================="
|
echo "==============================================="
|
||||||
echo "Installing php${PHPVER}..."
|
echo "Installing php${PHPVER}..."
|
||||||
|
@ -1,6 +1,6 @@
|
|||||||
#!/bin/bash
|
#!/bin/bash
|
||||||
##makedesc: Install postman (latest)
|
##makedesc: Install postman (latest)
|
||||||
source `dirname $0`/../helpers || exit 255
|
source "$( dirname $(readlink -e -- "${BASH_SOURCE}"))/../helpers.sh" || exit 255
|
||||||
|
|
||||||
# https://learning.postman.com/docs/getting-started/installation-and-updates/#installing-postman-on-linux
|
# https://learning.postman.com/docs/getting-started/installation-and-updates/#installing-postman-on-linux
|
||||||
|
|
||||||
@ -13,7 +13,7 @@ mkdir -p "$HOME/install" && \
|
|||||||
"$HOME/.local/share/applications"
|
"$HOME/.local/share/applications"
|
||||||
|
|
||||||
download "https://dl.pstmn.io/download/latest/linux64" "$HOME/install/postman.tar.gz" && \
|
download "https://dl.pstmn.io/download/latest/linux64" "$HOME/install/postman.tar.gz" && \
|
||||||
unpak_targz "$HOME/install/postman.tar.gz" "$HOME/install" && \
|
unpack_targz "$HOME/install/postman.tar.gz" "$HOME/install" && \
|
||||||
symlink "$HOME/install/Postman/Postman" "$HOME/.local/bin/postman" && \
|
symlink "$HOME/install/Postman/Postman" "$HOME/.local/bin/postman" && \
|
||||||
cat << EOF > "$HOME/.local/share/applications/Postman.desktop" && sudo update-desktop-database
|
cat << EOF > "$HOME/.local/share/applications/Postman.desktop" && sudo update-desktop-database
|
||||||
[Desktop Entry]
|
[Desktop Entry]
|
||||||
|
@ -1,6 +1,6 @@
|
|||||||
#!/bin/bash
|
#!/bin/bash
|
||||||
##makedesc: Install Sublime Text
|
##makedesc: Install Sublime Text
|
||||||
source `dirname $0`/../helpers || exit 255
|
source "$( dirname $(readlink -e -- "${BASH_SOURCE}"))/../helpers.sh" || exit 255
|
||||||
|
|
||||||
# https://www.sublimetext.com/download_thanks?target=x64-deb
|
# https://www.sublimetext.com/download_thanks?target=x64-deb
|
||||||
# https://gist.github.com/skoqaq/3f3e8f28e23c881143cef9cf49d821ff
|
# https://gist.github.com/skoqaq/3f3e8f28e23c881143cef9cf49d821ff
|
||||||
|
@ -1,6 +1,6 @@
|
|||||||
#!/bin/bash
|
#!/bin/bash
|
||||||
##makedesc: VSCode deb-package
|
##makedesc: VSCode deb-package
|
||||||
source `dirname $0`/../helpers || exit 255
|
source "$( dirname $(readlink -e -- "${BASH_SOURCE}"))/../helpers.sh" || exit 255
|
||||||
|
|
||||||
title
|
title
|
||||||
|
|
||||||
|
19
tools/README.md
Normal file
19
tools/README.md
Normal file
@ -0,0 +1,19 @@
|
|||||||
|
# Shell scripts
|
||||||
|
|
||||||
|
## Russian
|
||||||
|
|
||||||
|
Эти скрипты я писал в разное время для решения разных задач.
|
||||||
|
Чтобы они не растерялись по репозиториям и носителям, я решил собрать их здесь в одну кучу.
|
||||||
|
|
||||||
|
Я всегда использую Ubuntu в качестве своих настольных и серверных ОС, поэтому все эти скрипты писались и использовались в этих средах с версий 18.*.
|
||||||
|
|
||||||
|
Многие скрипты зависимы от [io.sh](/io.sh).
|
||||||
|
|
||||||
|
## English
|
||||||
|
|
||||||
|
These scripts were written at different times to solve different problems of my own.
|
||||||
|
I decided to collect them all here so that they do not get lost across repositories and storage media.
|
||||||
|
|
||||||
|
I always use Ubuntu as my desktop and server OS, so all these scripts have been written and used in these environments since version 18.*.
|
||||||
|
|
||||||
|
Many scripts depend on [io.sh](/io.sh).
|
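As a minimal sketch of that dependency (the exact helpers live in [io.sh](/io.sh) itself, and the relative path assumes a script sitting one level below the repo root), a tool script is expected to source it before doing anything else:

```shell
#!/bin/bash
# Sketch only: resolve this script's real location and pull io.sh in from the repo root.
# Adjust the relative path if the script lives elsewhere.
source "$(dirname "$(readlink -e -- "${BASH_SOURCE[0]}")")/../io.sh" || exit 255
```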
32
tools/basic-ubuntu-lemp.sh
Normal file
32
tools/basic-ubuntu-lemp.sh
Normal file
@ -0,0 +1,32 @@
|
|||||||
|
#!/bin/bash
|
||||||
|
|
||||||
|
apt update && apt upgrade -y --autoremove
|
||||||
|
apt install -y \
|
||||||
|
apt-transport-https \
|
||||||
|
build-essential \
|
||||||
|
ca-certificates \
|
||||||
|
cmake \
|
||||||
|
curl \
|
||||||
|
dialog \
|
||||||
|
gettext \
|
||||||
|
gnupg \
|
||||||
|
htop \
|
||||||
|
libaio1 \
|
||||||
|
libcurl4-gnutls-dev \
|
||||||
|
libexpat1-dev \
|
||||||
|
libghc-zlib-dev \
|
||||||
|
libssl-dev \
|
||||||
|
make \
|
||||||
|
mc \
|
||||||
|
nano \
|
||||||
|
net-tools \
|
||||||
|
nmap \
|
||||||
|
p7zip-full \
|
||||||
|
software-properties-common \
|
||||||
|
unzip \
|
||||||
|
inotify-tools \
|
||||||
|
git \
|
||||||
|
mariadb-server \
|
||||||
|
mariadb-client \
|
||||||
|
nginx \
|
||||||
|
certbot
|
26
tools/dc
Normal file
26
tools/dc
Normal file
@ -0,0 +1,26 @@
|
|||||||
|
#!/bin/bash
|
||||||
|
CONTAINER="my-container" # the name of the container in which to 'exec' something
|
||||||
|
CONFIG="$(dirname $([ -L $0 ] && readlink -f $0 || echo $0))/docker-compose.yml" # path to compose yml file
|
||||||
|
CMD="docker-compose -f $CONFIG" # docker-compose command
|
||||||
|
APP_URL='http://localhost:8000/'
|
||||||
|
|
||||||
|
open_browser() {
|
||||||
|
if which xdg-open > /dev/null; then
|
||||||
|
xdg-open "$1" </dev/null >/dev/null 2>&1 & disown
|
||||||
|
elif which gnome-open > /dev/null; then
|
||||||
|
gnome-open "$1" </dev/null >/dev/null 2>&1 & disown
|
||||||
|
fi
|
||||||
|
}
|
||||||
|
|
||||||
|
case "$1" in
|
||||||
|
'' | 'help' ) echo -e "Provide one of operations: \t start, stop, up, down, restart, rebuild, open";
|
||||||
|
echo "Otherwise all args will be passed to 'docker exec -ti $CONTAINER ...'" ;;
|
||||||
|
'open' ) open_browser $APP_URL ;;
|
||||||
|
'up' ) $CMD up -d --build ;; # build and start containers
|
||||||
|
'down' ) $CMD down --remove-orphans ;; # stop and remove containers
|
||||||
|
'start' ) $CMD start ;; # start containers
|
||||||
|
'stop' ) $CMD stop ;; # stop containers
|
||||||
|
'restart' ) $CMD stop && $CMD start ;; # restart containers
|
||||||
|
'rebuild' ) $CMD down --remove-orphans && $CMD up -d --build ;; # rebuild containers
|
||||||
|
* ) docker exec -ti $CONTAINER $@ # exec anything in container
|
||||||
|
esac
|
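# Usage sketch (the container name and compose file above are placeholders to adjust first):
#   ./dc up          # build and start the stack from docker-compose.yml
#   ./dc open        # open $APP_URL in the default browser
#   ./dc php -v      # any other args are passed to 'docker exec -ti my-container ...'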
@ -1,54 +1,55 @@
|
|||||||
#!/bin/bash
|
#!/bin/bash
|
||||||
|
# https://gist.github.com/anthonyaxenov/c16e1181d4b8a8644c57ec8a1f6cf21c
|
||||||
#########################################################################
|
#########################################################################
|
||||||
# #
|
# #
|
||||||
# Set display resolution #
|
# Set output resolution #
|
||||||
# #
|
# #
|
||||||
# Author: Anthony Axenov (Антон Аксенов) #
|
# Author: Anthony Axenov (Антон Аксенов) #
|
||||||
# Version: 1.0 #
|
# Version: 1.0 #
|
||||||
# License: WTFPL #
|
# License: WTFPLv2 #
|
||||||
# #
|
# #
|
||||||
#########################################################################
|
#########################################################################
|
||||||
# #
|
# #
|
||||||
# Using this script you can change your display resolution #
|
# Using this script you can change your output resolution #
|
||||||
# to any one you need. Just adjust some vars below and run script #
|
# to any one you need. Just adjust some vars below and run script #
|
||||||
# (chmod +x needed). #
|
# (chmod +x needed). #
|
||||||
# #
|
# #
|
||||||
#########################################################################
|
#########################################################################
|
||||||
|
|
||||||
# https://gist.github.com/anthonyaxenov/c16e1181d4b8a8644c57ec8a1f6cf21c
|
# Set output name to work with. You can get it via 'xrandr --listactivemonitors'
|
||||||
|
output="HDMI-3"
|
||||||
# Set display name to work with. You can get it via 'xrandr --listactivemonitors'
|
# Set width of this output in px
|
||||||
display="HDMI-2"
|
width=1920
|
||||||
# Set width of this display in px
|
# Set height of this output in px
|
||||||
width=1600
|
height=1080
|
||||||
# Set height of this display in px
|
# Set refresh rate of this output in Hz
|
||||||
height=900
|
refresh=120
|
||||||
|
|
||||||
# Sometimes cvt and gtf generates different modelines.
|
# Sometimes cvt and gtf generates different modelines.
|
||||||
# You can play around and look which of them gives best result:
|
# You can play around and look which of them gives best result:
|
||||||
modeline=$(cvt ${width} ${height} | grep "Modeline")
|
modeline=$(cvt ${width} ${height} ${refresh} | grep "Modeline")
|
||||||
# modeline=$(gtf ${width} ${height} 60 | grep "Modeline")
|
# modeline=$(gtf ${width} ${height} ${refresh} | grep "Modeline")
|
||||||
|
|
||||||
# Some important data needed to xrandr:
|
# Some important data needed to xrandr:
|
||||||
modename="${width}x${height}_my"
|
modename="${width}x${height}@${refresh}_my"
|
||||||
params=$(echo "$modeline" | sed "s|^\s*Modeline\s*\"[0-9x_.]*\"\s*||")
|
params=$(echo "$modeline" | sed "s|^\s*Modeline\s*\"[0-9x_.]*\"\s*||")
|
||||||
|
|
||||||
echo "Set resolution ${width}x${height} on display $display:"
|
echo "Set resolution ${width}x${height}@${refresh} on output $output:"
|
||||||
echo "$modename $params"
|
echo "$modename $params"
|
||||||
|
|
||||||
# Simple logic:
|
# Simple logic:
|
||||||
# 1. Switch display to safe mode which always exists (I believe) to avoid errors
|
# 1. Switch output to safe mode which always exists (I believe) to avoid errors
|
||||||
xrandr --output $display --mode 640x480
|
xrandr --output $output --mode 640x480 --verbose
|
||||||
# 2. If display aready have our mode -- we must delete it to avoid errors
|
# 2. If output already has our mode -- we must delete it to avoid errors
|
||||||
if $(xrandr | grep -q "$modename"); then
|
if $(xrandr | grep -q "$modename"); then
|
||||||
# 2.1. Detach mode from display
|
# 2.1. Detach mode from output
|
||||||
xrandr --delmode $display $modename
|
xrandr --delmode $output $modename
|
||||||
# 2.2. Remove mode itself
|
# 2.2. Remove mode itself
|
||||||
xrandr --rmmode $modename
|
xrandr --rmmode $modename
|
||||||
fi
|
fi
|
||||||
# 3. Create new mode with freshly generated parameters
|
# 3. Create new mode with freshly generated parameters
|
||||||
xrandr --newmode $modename $params
|
xrandr --newmode $modename $params --verbose
|
||||||
# 4. Attach mode to our display
|
# 4. Attach mode to our output
|
||||||
xrandr --addmode $display $modename
|
xrandr --addmode $output $modename --verbose
|
||||||
# 5. Switch display to this mode immidiately
|
# 5. Switch output to this mode immediately
|
||||||
xrandr --output $display --mode $modename
|
xrandr --output $output --mode $modename --refresh $refresh --verbose
|
0
tools/free-space.sh
Normal file → Executable file
0
tools/free-space.sh
Normal file → Executable file
121
tools/init-home-mediasrv.sh
Normal file
121
tools/init-home-mediasrv.sh
Normal file
@ -0,0 +1,121 @@
|
|||||||
|
#!/bin/bash
|
||||||
|
|
||||||
|
sudo apt update && sudo apt upgrade -y --autoremove
|
||||||
|
sudo apt install -y \
|
||||||
|
alien \
|
||||||
|
apt-transport-https \
|
||||||
|
build-essential \
|
||||||
|
ca-certificates \
|
||||||
|
cmake \
|
||||||
|
curl \
|
||||||
|
dconf-editor \
|
||||||
|
default-jdk \
|
||||||
|
dialog \
|
||||||
|
gettext \
|
||||||
|
gnupg \
|
||||||
|
gparted \
|
||||||
|
hardinfo \
|
||||||
|
htop \
|
||||||
|
libaio1 \
|
||||||
|
libcurl4-gnutls-dev \
|
||||||
|
libexpat1-dev \
|
||||||
|
libghc-zlib-dev \
|
||||||
|
libssl-dev \
|
||||||
|
lsb-release \
|
||||||
|
lsp-plugins \
|
||||||
|
make \
|
||||||
|
mc \
|
||||||
|
nano \
|
||||||
|
neofetch \
|
||||||
|
net-tools \
|
||||||
|
nmap \
|
||||||
|
p7zip-full \
|
||||||
|
easyeffects \
|
||||||
|
software-properties-common \
|
||||||
|
ubuntu-restricted-extras \
|
||||||
|
unzip \
|
||||||
|
vlc \
|
||||||
|
ffmpeg \
|
||||||
|
xclip \
|
||||||
|
inotify-tools \
|
||||||
|
notify-osd \
|
||||||
|
fonts-open-sans \
|
||||||
|
libnotify-bin \
|
||||||
|
gnome-software \
|
||||||
|
gnome-software-plugin-flatpak \
|
||||||
|
gnome-software-plugin-snap \
|
||||||
|
terminator \
|
||||||
|
geoclue-2.0 \
|
||||||
|
redshift \
|
||||||
|
redshift-gtk \
|
||||||
|
samba \
|
||||||
|
dkms
|
||||||
|
|
||||||
|
|
||||||
|
# https://selectel.ru/blog/tutorials/how-to-install-and-configure-samba-on-ubuntu-20-04/
|
||||||
|
# https://linuxconfig.org/how-to-configure-samba-server-share-on-ubuntu-22-04-jammy-jellyfish-linux
|
||||||
|
# https://phoenixnap.com/kb/ubuntu-samba
|
||||||
|
# https://computingforgeeks.com/install-and-configure-samba-server-share-on-ubuntu/
|
||||||
|
# https://linux.how2shout.com/how-to-install-samba-on-ubuntu-22-04-lts-jammy-linux/
|
||||||
|
sudo cp /etc/samba/smb.conf /etc/samba/smb.conf.bak
|
||||||
|
sudo bash -c 'grep -v -E "^#|^;" /etc/samba/smb.conf.bak | grep . > /etc/samba/smb.conf'
|
||||||
|
sudo systemctl enable --now smbd
|
||||||
|
sudo usermod -aG sambashare $USER
|
||||||
|
sudo smbpasswd -a $USER
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
|
sudo add-apt-repository -y ppa:agornostal/ulauncher && \
|
||||||
|
sudo apt install -y --autoremove ulauncher
|
||||||
|
|
||||||
|
curl -L https://yt-dl.org/downloads/latest/youtube-dl -o "${HOME}/.local/bin/youtube-dl" && \
|
||||||
|
sudo chmod +rx "${HOME}/.local/bin/youtube-dl"
|
||||||
|
|
||||||
|
wget "https://dl.google.com/linux/direct/google-chrome-stable_current_amd64.deb" && \
|
||||||
|
sudo dpkg -i google-chrome-stable_current_amd64.deb
|
||||||
|
|
||||||
|
git clone https://github.com/aircrack-ng/rtl8812au.git && \
|
||||||
|
cd rtl8812au && \
|
||||||
|
sudo make dkms_install
|
||||||
|
|
||||||
|
sudo curl -s -o /usr/share/keyrings/syncthing-archive-keyring.gpg https://syncthing.net/release-key.gpg && \
|
||||||
|
echo "deb [signed-by=/usr/share/keyrings/syncthing-archive-keyring.gpg] https://apt.syncthing.net/ syncthing stable" | sudo tee /etc/apt/sources.list.d/syncthing.list && \
|
||||||
|
echo "deb [signed-by=/usr/share/keyrings/syncthing-archive-keyring.gpg] https://apt.syncthing.net/ syncthing candidate" | sudo tee /etc/apt/sources.list.d/syncthing.list && \
|
||||||
|
sudo apt update && sudo apt install -y --autoremove syncthing && \
|
||||||
|
wget "https://raw.githubusercontent.com/syncthing/syncthing/main/etc/linux-desktop/syncthing-start.desktop" -O $HOME/.local/share/applications/syncthing-start.desktop && \
|
||||||
|
wget "https://raw.githubusercontent.com/syncthing/syncthing/main/etc/linux-desktop/syncthing-ui.desktop" -O $HOME/.local/share/applications/syncthing-ui.desktop && \
|
||||||
|
ln -sf $HOME/.local/share/applications/syncthing-start.desktop $HOME/.config/autostart/syncthing-start.desktop
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
|
#####################################################################
|
||||||
|
|
||||||
|
sudo apt install -y kodi kodi-pvr-iptvsimple
|
||||||
|
|
||||||
5
tools/inotifywait-cp/README.md
Normal file
5
tools/inotifywait-cp/README.md
Normal file
@ -0,0 +1,5 @@
|
|||||||
|
# Backing up photos from Syncthing
|
||||||
|
|
||||||
|
More info:
|
||||||
|
* 🇷🇺 [axenov.dev/резервное-копирование-фотографий-со](https://axenov.dev/резервное-копирование-фотографий-со/)
|
||||||
|
* 🇺🇸 (planned to translate)
|
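A quick-start sketch based on the comments in the unit file next to this README (file names and destinations are illustrative; `User=`, `WorkingDirectory=` and `ExecStart=` in the unit must be adjusted to your own paths first):

```shell
# place the script and the systemd unit where the unit expects them
cp inotifywait-cp.sh ~/.local/bin/
sudo cp inotifywait-cp.service /etc/systemd/system/inotifywait-cp.service

# enable, start and check the watcher
sudo systemctl enable --now inotifywait-cp
sudo systemctl status inotifywait-cp
```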
19
tools/inotifywait-cp/inotifywait-cp.service
Normal file
19
tools/inotifywait-cp/inotifywait-cp.service
Normal file
@ -0,0 +1,19 @@
|
|||||||
|
# Daemon file
|
||||||
|
# Place or symlink it to /etc/systemd/system/inotifywait-cp.service
|
||||||
|
# Enable and start: sudo systemctl enable --now inotifywait-cp
|
||||||
|
# Check it: sudo systemctl status inotifywait-cp
|
||||||
|
|
||||||
|
[Unit]
|
||||||
|
Description=Photosync from android
|
||||||
|
|
||||||
|
[Service]
|
||||||
|
Type=simple
|
||||||
|
Restart=always
|
||||||
|
# correct these parameters as needed:
|
||||||
|
User=user
|
||||||
|
WorkingDirectory=/home/user
|
||||||
|
ExecStart=bash /home/user/.local/bin/photosync-a53.sh
|
||||||
|
|
||||||
|
|
||||||
|
[Install]
|
||||||
|
WantedBy=network.target
|
59
tools/inotifywait-cp/inotifywait-cp.sh
Normal file
59
tools/inotifywait-cp/inotifywait-cp.sh
Normal file
@ -0,0 +1,59 @@
|
|||||||
|
#!/bin/bash
|
||||||
|
# My use case:
|
||||||
|
# syncthing synchronizes ALL changes in DCIM directory on my android to PC.
|
||||||
|
# I wanted files to be copied somewhere else on my PC to stay forever, so I
|
||||||
|
# could sort them later and safely free some space on mobile without loss.
|
||||||
|
# Also I wish to have some stupid log with history of such events.
|
||||||
|
|
||||||
|
# inotify-tools package must be installed!
|
||||||
|
|
||||||
|
# CHANGE THIS PARAMETERS to ones you needed
|
||||||
|
dir_src="$HOME/Syncthing/Mobile/Camera"
|
||||||
|
dir_dest="$HOME/some/safe/place"
|
||||||
|
dir_logs="$HOME/inotifywait-cp-logs"
|
||||||
|
regexp="[0-9]{8}_[0-9]{6}.*\.(jpg|mp4|gif)"
|
||||||
|
|
||||||
|
print() {
|
||||||
|
echo -e "[`date '+%H:%M:%S'`] $*" \
|
||||||
|
| tee -a "$dir_logs/`date '+%Y%m%d'`.log"
|
||||||
|
}
|
||||||
|
|
||||||
|
copy () {
|
||||||
|
mkdir -p "$dir_src" "$dir_dest" "$dir_logs"
|
||||||
|
if [ -f "$dir_dest/$1" ]; then
|
||||||
|
print "SKIPPED:\t$dir_dest/$1"
|
||||||
|
else
|
||||||
|
cp "$dir_src/$1" "$dir_dest/$1"
|
||||||
|
print "COPIED:\t$dir_src/$1 => $dir_dest/$1"
|
||||||
|
fi
|
||||||
|
}
|
||||||
|
|
||||||
|
mkdir -p "$dir_src" "$dir_dest" "$dir_logs"
|
||||||
|
|
||||||
|
print "START\t========================="
|
||||||
|
|
||||||
|
# First, try to backup files synced since last exec of this script
|
||||||
|
ls -1 "$dir_src" \
|
||||||
|
| grep -E "^$regexp$" \
|
||||||
|
| while read filename; do copy "$filename"; done
|
||||||
|
|
||||||
|
# Next, run inotifywait against source directory with args:
|
||||||
|
# --quiet -- print less (only print events)
|
||||||
|
# --monitor -- don't stop after first event (like infinite loop)
|
||||||
|
# --event -- first syncthing creates hidden file to write data into
|
||||||
|
# then renames it according to source file name, so here
|
||||||
|
# we listen to MOVED_TO event to catch final filename
|
||||||
|
# --format %f -- print only filename
|
||||||
|
# --include -- filename regexp to catch event from, ensure your $regexp
|
||||||
|
# is correct or remove line 56 to catch ALL synced files
|
||||||
|
|
||||||
|
inotifywait \
|
||||||
|
--quiet \
|
||||||
|
--monitor \
|
||||||
|
--event moved_to \
|
||||||
|
--format %f \
|
||||||
|
--include "$regexp" \
|
||||||
|
"$dir_src" \
|
||||||
|
| while read filename; do copy "$filename"; done
|
||||||
|
|
||||||
|
print "FINISH\t========================="
|
157
tools/netbeans-php-wrapper/php
Normal file
157
tools/netbeans-php-wrapper/php
Normal file
@ -0,0 +1,157 @@
|
|||||||
|
#!/bin/bash
|
||||||
|
|
||||||
|
# Welcome to amusement park!
|
||||||
|
|
||||||
|
[[ "$1" = '--help' ]] || [[ "$1" = '-h' ]] && cat <<EOF && exit
|
||||||
|
NetBeans docker wrapper for php
|
||||||
|
===============================
|
||||||
|
Anthony Axenov (c) 2023, The MIT License
|
||||||
|
https://axenov.dev
|
||||||
|
https://opensource.org/license/mit
|
||||||
|
Replaces the host php interpreter with a dockerized one to run & debug cli php scripts.
|
||||||
|
Usage:
|
||||||
|
./$(basename $0) --container=<NAME> [--map=<PATH1>:<PATH2>] [PHP_ARGS] <SCRIPT> [SCRIPT_ARGS]
|
||||||
|
Arguments:
|
||||||
|
--container : docker container where your SCRIPT is located. Required.
|
||||||
|
--map : sources path mapped from the host to container. Not required.
|
||||||
|
PATH1 is an absolute path to php sources directory on the host.
|
||||||
|
PATH2 is an absolute path of the same directory inside of container.
|
||||||
|
Delimiter ':' is required. If PATH1, PATH2 or the delimiter is missing
|
||||||
|
or a value is empty, an error will be thrown.
|
||||||
|
PHP_ARGS : arguments you can pass to real php interpreter according to its --help.
|
||||||
|
Not required.
|
||||||
|
SCRIPT : a path to script file (.php) to be executed in container. Required.
|
||||||
|
Note that this file must exist inside or be available from that container.
|
||||||
|
SCRIPT_ARGS : arguments to call your script with. They will be passed to script as is.
|
||||||
|
Not required.
|
||||||
|
Read this article to know how to set this helper as interpreter for NetBeans:
|
||||||
|
ru: https://axenov.dev/netbeans-php-docker-xdebug-cli
|
||||||
|
en: https://axenov.dev/en/netbeans-php-docker-xdebug-cli-en
|
||||||
|
EOF
|
||||||
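# Illustrative call (container name, mapping and script path below are made-up examples):
#   ./php --container=app-php --map=/home/user/project:/var/www/html \
#         -d xdebug.mode=debug /home/user/project/bin/example.php --dry-run
# With that mapping the script path becomes /var/www/html/bin/example.php inside the
# container and the final command is roughly:
#   docker exec app-php php -d xdebug.mode=debug /var/www/html/bin/example.php --dry-run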
|
|
||||||
|
pwd=$(pwd) # current working directory
|
||||||
|
cmdline=($@) # copy currently called command line to array
|
||||||
|
collect_php_args=1 # should we collect php args or script ones?
|
||||||
|
quiet=0 # should we print some useful data before executing?
|
||||||
|
|
||||||
|
# find a path where this wrapper is located
|
||||||
|
wrapper_dir="$(dirname $0)"
|
||||||
|
|
||||||
|
# find a path where project is probably located
|
||||||
|
project_dir="$(dirname $wrapper_dir)"
|
||||||
|
|
||||||
|
# here we check if this wrapper is global or local
|
||||||
|
# but if it is set as global from nbproject dir of
|
||||||
|
# current project then it is not detected as global
|
||||||
|
# anyway behavior will be correct
|
||||||
|
nbproject="$(basename $wrapper_dir)"
|
||||||
|
[ "$nbproject" = 'nbproject' ] && is_global=0 || is_global=1
|
||||||
|
|
||||||
|
# prepare new array to collect php args
|
||||||
|
declare -a php_cmd=("docker" "exec")
|
||||||
|
|
||||||
|
# and another one for script args
|
||||||
|
declare -a script_args=()
|
||||||
|
|
||||||
|
# and one more for directory mapping
|
||||||
|
declare -a map_arr=()
|
||||||
|
|
||||||
|
# iterate over arguments we received from netbeans
|
||||||
|
for arg in "${cmdline[@]}"; do
|
||||||
|
|
||||||
|
# if this is a container name
|
||||||
|
if [ "${arg::11}" = '--container' ]; then
|
||||||
|
container="${arg:12}" # save it
|
||||||
|
php_cmd+=("$container" 'php') # add php itself
|
||||||
|
continue # jump to next iteration
|
||||||
|
fi
|
||||||
|
|
||||||
|
# if this is a path map
|
||||||
|
if [ "${arg::5}" = '--map' ]; then
|
||||||
|
map="${arg:6}" # save it
|
||||||
|
map_arr=(${map//:/ }) # split it and check if it is correct
|
||||||
|
if [ -z "${map_arr[0]}" ] || [ -z "${map_arr[1]}" ]; then
|
||||||
|
echo "ERROR: directory map is incorrect!"
|
||||||
|
echo "Use $0 --help to get info about how to use this wrapper."
|
||||||
|
echo "Exit code 3."
|
||||||
|
exit 3
|
||||||
|
fi
|
||||||
|
continue # jump to next iteration
|
||||||
|
fi
|
||||||
|
|
||||||
|
# if this is the --quiet flag
|
||||||
|
if [ "${arg::7}" = '--quiet' ]; then
|
||||||
|
quiet=1
|
||||||
|
continue # jump to next iteration
|
||||||
|
fi
|
||||||
|
|
||||||
|
# if this is an absolute path to a script file
|
||||||
|
if [ -f "$arg" ]; then
|
||||||
|
# make its path correct for container
|
||||||
|
if [ "$map" ]; then # when paths are mapped
|
||||||
|
# remove first part of map from an absolute filepath and append result to second map part
|
||||||
|
filepath="${map_arr[1]}${arg##${map_arr[0]}}"
|
||||||
|
else # when paths are NOT mapped
|
||||||
|
# remove project path from absolute filepath
|
||||||
|
filepath="${arg##$project_dir/}"
|
||||||
|
fi
|
||||||
|
php_cmd+=("$filepath") # append php args with filepath
|
||||||
|
collect_php_args=0 # now we need to collect script args
|
||||||
|
continue # jump to next iteration
|
||||||
|
fi
|
||||||
|
|
||||||
|
if [ "$collect_php_args" = 1 ]; then # if we collect php args
|
||||||
|
php_cmd+=("$arg") # add current arg to php args as is
|
||||||
|
continue # jump to next iteration
|
||||||
|
fi
|
||||||
|
|
||||||
|
script_args+=("$arg") # otherwise add current arg to script args as is
|
||||||
|
done
|
||||||
|
|
||||||
|
# docker container name is required so we must halt here if there is no one
|
||||||
|
if [ -z "$container" ]; then
|
||||||
|
echo "ERROR: no docker container is specified!" >&2
|
||||||
|
echo "Use $0 --help to get info about how to use this wrapper." >&2
|
||||||
|
echo "Exit code 1." >&2
|
||||||
|
exit 1
|
||||||
|
fi
|
||||||
|
|
||||||
|
# path to php script is also required so we must halt here too if there is no one
|
||||||
|
if [ -z "$filepath" ]; then
|
||||||
|
echo "ERROR: no script filepath is specified!" >&2
|
||||||
|
echo "Use $0 --help to get info about how to use this wrapper." >&2
|
||||||
|
echo "Exit code 2." >&2
|
||||||
|
exit 2
|
||||||
|
fi
|
||||||
|
|
||||||
|
cmdline="${php_cmd[*]} ${script_args[*]}" # make a command to execute
|
||||||
|
|
||||||
|
# print some important data collected above
|
||||||
|
if [ "$quiet" = 0 ]; then
|
||||||
|
echo "NetBeans docker wrapper for php"
|
||||||
|
echo "==============================="
|
||||||
|
echo -e "Container name: $container"
|
||||||
|
echo -e "Script path: $filepath"
|
||||||
|
echo -e "Directory mapping: ${map:-(none)}"
|
||||||
|
echo -e "Command line:\n$cmdline\n"
|
||||||
|
fi
|
||||||
|
|
||||||
|
# some debug output
|
||||||
|
# echo "=== some debug output ========="
|
||||||
|
# cat <<EOF | column -t
|
||||||
|
# is_global $is_global
|
||||||
|
# container $container
|
||||||
|
# pwd $pwd
|
||||||
|
# wrapper_dir $wrapper_dir
|
||||||
|
# nbproject $nbproject
|
||||||
|
# project_dir $project_dir
|
||||||
|
# map $map
|
||||||
|
# map_arr[0] ${map_arr[0]}
|
||||||
|
# map_arr[1] ${map_arr[1]}
|
||||||
|
# filepath $filepath
|
||||||
|
# EOF
|
||||||
|
# echo "==============================="
|
||||||
|
|
||||||
|
$cmdline # execute
|
||||||
|
|
||||||
|
# that's all, folks!
|
@ -1,16 +1,15 @@
|
|||||||
#!/bin/bash
|
#!/bin/bash
|
||||||
|
# https://gist.github.com/anthonyaxenov/b8336a2bc9e6a742b5a050fa2588d71e
|
||||||
#####################################################################
|
#####################################################################
|
||||||
# #
|
# #
|
||||||
# Stupidly simple backup script for own projects #
|
# Stupidly simple backup script for own projects #
|
||||||
# #
|
# #
|
||||||
# Author: Anthony Axenov (Антон Аксенов) #
|
# Author: Anthony Axenov (Антон Аксенов) #
|
||||||
# Version: 1.0 #
|
# Version: 1.0 #
|
||||||
# License: WTFPLv2 More info (RU): https://axenov.dev/?p=1234 #
|
# License: WTFPLv2 More info: https://axenov.dev/?p=1423 #
|
||||||
# #
|
# #
|
||||||
#####################################################################
|
#####################################################################
|
||||||
|
|
||||||
# https://gist.github.com/anthonyaxenov/b8336a2bc9e6a742b5a050fa2588d71e
|
|
||||||
|
|
||||||
# database credentials ==============================================
|
# database credentials ==============================================
|
||||||
|
|
||||||
DBUSER=
|
DBUSER=
|
||||||
|
72
tools/rsync-backup.sh
Executable file
72
tools/rsync-backup.sh
Executable file
@ -0,0 +1,72 @@
|
|||||||
|
#!/bin/bash
|
||||||
|
|
||||||
|
RS_SRC_DEV=/dev/sdb1
|
||||||
|
RS_DST_DEV=/dev/sdc1
|
||||||
|
LOG_DIR="/home/$USER/rsync-logs"
|
||||||
|
USE_NTFY=0
|
||||||
|
NTFY_TITLE="Backup: $RS_SRC_DEV => $RS_DST_DEV"
|
||||||
|
NTFY_CHANNEL=""
|
||||||
|
|
||||||
|
log() {
|
||||||
|
[ ! -d "$LOG_DIR" ] && mkdir -p "$LOG_DIR"
|
||||||
|
echo -e "[`date '+%H:%M:%S'`] $*" | tee -a "$LOG_DIR/`date '+%Y%m%d'`.log"
|
||||||
|
}
|
||||||
|
|
||||||
|
# sends a simple notification
|
||||||
|
ntfy_info() {
|
||||||
|
[ $USE_NTFY == 1 ] && ntfy send \
|
||||||
|
--title "$NTFY_TITLE" \
|
||||||
|
--message "$1" \
|
||||||
|
--priority 1 \
|
||||||
|
"$NTFY_CHANNEL"
|
||||||
|
}
|
||||||
|
|
||||||
|
# sends a notification with a warning
|
||||||
|
ntfy_warn() {
|
||||||
|
[ $USE_NTFY == 1 ] && ntfy send \
|
||||||
|
--title "$NTFY_TITLE" \
|
||||||
|
--tags "warning" \
|
||||||
|
--message "$1" \
|
||||||
|
--priority 5 \
|
||||||
|
"$NTFY_CHANNEL"
|
||||||
|
}
|
||||||
|
|
||||||
|
log "START\t========================="
|
||||||
|
|
||||||
|
mnt_check=$(findmnt -nf "$RS_SRC_DEV")
|
||||||
|
if [ $? -gt 0 ]; then
|
||||||
|
log "Source partition '$RS_SRC_DEV' is not mounted. Exit 1."
|
||||||
|
exit 1
|
||||||
|
fi
|
||||||
|
|
||||||
|
RS_SRC_PATH=$(echo $mnt_check | awk '{ print $1 }')
|
||||||
|
log "Source partition '$RS_SRC_DEV' is mounted at '$RS_SRC_PATH'"
|
||||||
|
|
||||||
|
mnt_check=$(findmnt -nf "$RS_DST_DEV")
|
||||||
|
if [ $? -gt 0 ]; then
|
||||||
|
log "Destination partition '$RS_DST_DEV' is not mounted. Exit 1."
|
||||||
|
exit 1
|
||||||
|
fi
|
||||||
|
|
||||||
|
RS_DST_PATH=$(echo $mnt_check | awk '{ print $1 }')
|
||||||
|
log "Destination partition '$RS_DST_DEV' is mounted at '$RS_DST_PATH'"
|
||||||
|
|
||||||
|
log "Executing rsync:"
|
||||||
|
|
||||||
|
rsync -huva \
|
||||||
|
--progress \
|
||||||
|
--delete \
|
||||||
|
--exclude='lost+found' \
|
||||||
|
--exclude='.Trash' \
|
||||||
|
"$RS_SRC_PATH/" \
|
||||||
|
"$RS_DST_PATH/" \
|
||||||
|
| while read line; do log "$line"; done
|
||||||
|
|
||||||
|
if [ $? -gt 0 ]; then
|
||||||
|
log "Something went wrong. Exit 3."
|
||||||
|
ntfy_warn "Something went wrong, check log"
|
||||||
|
exit 3
|
||||||
|
fi
|
||||||
|
ntfy_info "Success!"
|
||||||
|
|
||||||
|
log "FINISH\t========================="
|
18
tools/s3-backup-old.sh
Executable file
18
tools/s3-backup-old.sh
Executable file
@ -0,0 +1,18 @@
|
|||||||
|
#!/bin/bash
|
||||||
|
|
||||||
|
TTL_DAYS=1
|
||||||
|
S3="s3://......"
|
||||||
|
OLDER_THAN=$(date -d "$TTL_DAYS days ago" "+%s")
|
||||||
|
echo $OLDER_THAN
|
||||||
|
s3cmd ls -r $S3 | while read -r line; do
|
||||||
|
FILETIME=$(echo "$line" | awk {'print $1" "$2'})
|
||||||
|
FILETIME=$(date -d "$FILETIME" "+%s")
|
||||||
|
echo $FILETIME - $OLDER_THAN
|
||||||
|
if [[ $FILETIME -le $OLDER_THAN ]]; then
|
||||||
|
FILEPATH=$(echo "$line" | awk {'print $4'})
|
||||||
|
if [ $FILEPATH != "" ]; then
|
||||||
|
printf 'Must delete: %s\n' $FILEPATH
|
||||||
|
echo "s3cmd del $FILEPATH"
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
done
|
342
tools/s3-backup.sh
Normal file
342
tools/s3-backup.sh
Normal file
@ -0,0 +1,342 @@
|
|||||||
|
#!/bin/bash
|
||||||
|
#####################################################################
|
||||||
|
# #
|
||||||
|
# Stupidly simple backup script for own projects #
|
||||||
|
# #
|
||||||
|
# Author: Anthony Axenov (Антон Аксенов) #
|
||||||
|
# Version: 1.2 #
|
||||||
|
# License: WTFPLv2 #
|
||||||
|
# More info (RU): https://axenov.dev/?p=1272 #
|
||||||
|
# #
|
||||||
|
#####################################################################
|
||||||
|
|
||||||
|
# use remote storages ===============================================
|
||||||
|
|
||||||
|
USE_SSH=1
|
||||||
|
USE_S3=1
|
||||||
|
|
||||||
|
# database credentials ==============================================
|
||||||
|
|
||||||
|
DBUSER=
|
||||||
|
DBPASS=
|
||||||
|
DBNAME=
|
||||||
|
DBCHARSET="utf8"
|
||||||
|
|
||||||
|
# dates for file structure ==========================================
|
||||||
|
|
||||||
|
TODAY_DIR="$(date +%Y.%m.%d)"
|
||||||
|
TODAY_FILE="$(date +%H.%M)"
|
||||||
|
|
||||||
|
# local storage =====================================================
|
||||||
|
|
||||||
|
LOCAL_BAK_DIR="/backup"
|
||||||
|
LOCAL_BAK_PATH="$LOCAL_BAK_DIR/$TODAY_DIR"
|
||||||
|
|
||||||
|
# database backup file
|
||||||
|
LOCAL_SQL_FILE="$TODAY_FILE-db.sql.gz"
|
||||||
|
LOCAL_SQL_PATH="$LOCAL_BAK_PATH/$LOCAL_SQL_FILE"
|
||||||
|
|
||||||
|
# project path and backup file
|
||||||
|
LOCAL_SRC_DIR="/var/www/html"
|
||||||
|
LOCAL_SRC_FILE="$TODAY_FILE-src.tar.gz"
|
||||||
|
LOCAL_SRC_PATH="$LOCAL_BAK_PATH/$LOCAL_SRC_FILE"
|
||||||
|
|
||||||
|
# log file
|
||||||
|
LOG_FILE="$TODAY_FILE.log"
|
||||||
|
LOG_PATH="$LOCAL_BAK_PATH/$LOG_FILE"
|
||||||
|
|
||||||
|
# remote storages ===================================================
|
||||||
|
|
||||||
|
SSH_HOST="user@example.com"
|
||||||
|
SSH_BAK_DIR="/backup"
|
||||||
|
SSH_BAK_PATH="$SSH_BAK_DIR/$TODAY_DIR"
|
||||||
|
SSH_SQL_FILE="$SSH_BAK_PATH/$LOCAL_SQL_FILE"
|
||||||
|
SSH_SRC_FILE="$SSH_BAK_PATH/$LOCAL_SRC_FILE"
|
||||||
|
SSH_LOG_FILE="$SSH_BAK_PATH/$LOG_FILE"
|
||||||
|
|
||||||
|
S3_BUCKET="s3://my.bucket"
|
||||||
|
S3_DIR="$S3_BUCKET/$TODAY_DIR"
|
||||||
|
S3_SQL_FILE="$S3_DIR/$LOCAL_SQL_FILE"
|
||||||
|
S3_SRC_FILE="$S3_DIR/$LOCAL_SRC_FILE"
|
||||||
|
S3_LOG_FILE="$S3_DIR/$LOG_FILE"
|
||||||
|
|
||||||
|
# autoremove ========================================================
|
||||||
|
|
||||||
|
# time to live on different storages
|
||||||
|
TTL_LOCAL=3
|
||||||
|
TTL_SSH=7
|
||||||
|
TTL_S3=60
|
||||||
|
|
||||||
|
# autoremove flags
|
||||||
|
CLEAR_SSH=1
|
||||||
|
CLEAR_S3=1
|
||||||
|
|
||||||
|
# notifications =====================================================
|
||||||
|
|
||||||
|
USE_NTFY=1
|
||||||
|
NTFY_TITLE="Backup script"
|
||||||
|
NTFY_CHANNEL=
|
||||||
|
|
||||||
|
#====================================================================
|
||||||
|
#
|
||||||
|
# Functions used for the whole backup flow
|
||||||
|
#
|
||||||
|
#====================================================================
|
||||||
|
|
||||||
|
# prints arguments to stdout and into log file
|
||||||
|
log() {
|
||||||
|
echo -e "[$(date +%H:%M:%S)] $*" | tee -a "$LOG_PATH"
|
||||||
|
}
|
||||||
|
|
||||||
|
# sends notification with information
|
||||||
|
ntfy_info() {
|
||||||
|
[ $USE_NTFY == 1 ] && ntfy send \
|
||||||
|
--title "$NTFY_TITLE" \
|
||||||
|
--message "$1" \
|
||||||
|
--priority 1 \
|
||||||
|
"$NTFY_CHANNEL"
|
||||||
|
}
|
||||||
|
|
||||||
|
# sends notification with warning
|
||||||
|
ntfy_warn() {
|
||||||
|
[ $USE_NTFY == 1 ] && ntfy send \
|
||||||
|
--title "$NTFY_TITLE" \
|
||||||
|
--tags "warning" \
|
||||||
|
--message "$1" \
|
||||||
|
--priority 5 \
|
||||||
|
"$NTFY_CHANNEL"
|
||||||
|
}
|
||||||
|
|
||||||
|
# prints initialized parameters
|
||||||
|
show_params() {
|
||||||
|
log "Initialized parameters:"
|
||||||
|
|
||||||
|
log "├ [ Remotes ]"
|
||||||
|
log "│\t├ USE_SSH = $USE_SSH"
|
||||||
|
[ $USE_SSH == 1 ] && log "│\t├ SSH_HOST = $SSH_HOST"
|
||||||
|
log "│\t├ USE_S3 = $USE_S3"
|
||||||
|
[ $USE_S3 == 1 ] && log "│\t├ S3_BUCKET = $S3_BUCKET"
|
||||||
|
|
||||||
|
log "├ [ Database ]"
|
||||||
|
log "│\t├ DBUSER = $DBUSER"
|
||||||
|
log "│\t├ DBNAME = $DBNAME"
|
||||||
|
log "│\t├ DBCHARSET = $DBCHARSET"
|
||||||
|
log "│\t├ LOCAL_SQL_PATH = $LOCAL_SQL_PATH"
|
||||||
|
[ $USE_SSH == 1 ] && log "│\t├ SSH_SQL_FILE = $SSH_SQL_FILE"
|
||||||
|
[ $USE_S3 == 1 ] && log "│\t├ S3_SQL_FILE = $S3_SQL_FILE"
|
||||||
|
|
||||||
|
log "├ [ Sources ]"
|
||||||
|
log "│\t├ LOCAL_SRC_DIR = $LOCAL_SRC_DIR"
|
||||||
|
log "│\t├ LOCAL_SRC_PATH = $LOCAL_SRC_PATH"
|
||||||
|
[ $USE_SSH == 1 ] && log "│\t├ SSH_SRC_FILE = $SSH_SRC_FILE"
|
||||||
|
[ $USE_S3 == 1 ] && log "│\t├ S3_SRC_FILE = $S3_SRC_FILE"
|
||||||
|
|
||||||
|
log "├ [ Log ]"
|
||||||
|
log "│\t├ LOG_PATH = $LOG_PATH"
|
||||||
|
[ $USE_SSH == 1 ] && log "│\t├ SSH_LOG_FILE = $SSH_LOG_FILE"
|
||||||
|
[ $USE_S3 == 1 ] && log "│\t├ S3_LOG_FILE = $S3_LOG_FILE"
|
||||||
|
|
||||||
|
log "├ [ Autoclear ]"
|
||||||
|
log "│\t├ TTL_LOCAL = $TTL_LOCAL"
|
||||||
|
[ $USE_SSH == 1 ] && {
|
||||||
|
log "│\t├ CLEAR_SSH = $CLEAR_SSH"
|
||||||
|
log "│\t├ TTL_SSH = $TTL_SSH"
|
||||||
|
}
|
||||||
|
[ $USE_S3 == 1 ] && {
|
||||||
|
log "│\t├ CLEAR_S3 = $CLEAR_S3"
|
||||||
|
log "│\t├ TTL_S3 = $TTL_S3"
|
||||||
|
}
|
||||||
|
|
||||||
|
log "└ [ ntfy ]"
|
||||||
|
log "\t├ USE_NTFY = $USE_NTFY"
|
||||||
|
[ $USE_NTFY == 1 ] && log "\t├ NTFY_TITLE = $NTFY_TITLE"
|
||||||
|
[ $USE_NTFY == 1 ] && log "\t└ NTFY_CHANNEL = $NTFY_CHANNEL"
|
||||||
|
}
|
||||||
|
|
||||||
|
# initializes directories for backup
|
||||||
|
init_dirs() {
|
||||||
|
if [ ! -d "$LOCAL_BAK_PATH" ]; then
|
||||||
|
mkdir -p $LOCAL_BAK_PATH
|
||||||
|
fi
|
||||||
|
[ $USE_SSH == 1 ] && ssh $SSH_HOST "mkdir -p $SSH_BAK_PATH"
|
||||||
|
}
|
||||||
|
|
||||||
|
# clears old local backups
|
||||||
|
clear_local_backups() {
|
||||||
|
log "\tLocal:"
|
||||||
|
log $(find "$LOCAL_BAK_DIR" -type d -mtime +"$TTL_LOCAL" | sort)
|
||||||
|
find "$LOCAL_BAK_DIR" -type d -mtime +"$TTL_LOCAL" | xargs rm -rf
|
||||||
|
}
|
||||||
|
|
||||||
|
# clears old backups on remote ssh storage
|
||||||
|
clear_ssh_backups() {
|
||||||
|
if [ $USE_SSH == 1 ] && [ $CLEAR_SSH == 1 ]; then
|
||||||
|
log "\tSSH:"
|
||||||
|
log $(ssh "$SSH_HOST" "find $SSH_BAK_DIR -type d -mtime +$TTL_SSH" | sort)
|
||||||
|
ssh "$SSH_HOST" "find $SSH_BAK_DIR -type d -mtime +$TTL_SSH | xargs rm -rf"
|
||||||
|
else
|
||||||
|
log "\tSSH: disabled (\$USE_SSH, \$CLEAR_SSH)"
|
||||||
|
fi
|
||||||
|
}
|
||||||
|
|
||||||
|
# clears backups on remote s3 storage
|
||||||
|
clear_s3_backups() {
|
||||||
|
# https://gist.github.com/JProffitt71/9044744?permalink_comment_id=3539681#gistcomment-3539681
|
||||||
|
if [ $USE_S3 == 1 ] && [ $CLEAR_S3 == 1 ]; then
|
||||||
|
log "\tS3:"
|
||||||
|
OLDER_THAN=$(date -d "$TTL_S3 days ago" "+%s")
|
||||||
|
s3cmd ls -r $S3_DIR | while read -r line; do
|
||||||
|
FILETIME=$(echo "$line" | awk {'print $1" "$2'})
|
||||||
|
FILETIME=$(date -d "$FILETIME" "+%s")
|
||||||
|
if [[ $FILETIME -le $OLDER_THAN ]]; then
|
||||||
|
FILEPATH=$(echo "$line" | awk {'print $4'})
|
||||||
|
if [ $FILEPATH != "" ]; then
|
||||||
|
log "$line"
|
||||||
|
s3cmd del $FILEPATH
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
done
|
||||||
|
else
|
||||||
|
log "\tS3: disabled (\$USE_S3 + \$CLEAR_S3)"
|
||||||
|
fi
|
||||||
|
}
|
||||||
|
|
||||||
|
# clears old backups
|
||||||
|
clear_backups() {
|
||||||
|
echo
|
||||||
|
log "1/7 Removing old backups..."
|
||||||
|
clear_local_backups
|
||||||
|
clear_ssh_backups
|
||||||
|
clear_s3_backups
|
||||||
|
}
|
||||||
|
|
||||||
|
# makes archive with database dump
|
||||||
|
backup_db() {
|
||||||
|
echo
|
||||||
|
log "2/7 Dumping DB: $DBNAME..."
|
||||||
|
mysqldump \
|
||||||
|
--user=$DBUSER \
|
||||||
|
--password=$DBPASS \
|
||||||
|
--opt \
|
||||||
|
--default-character-set=$DBCHARSET \
|
||||||
|
--quick \
|
||||||
|
$DBNAME | gzip > $LOCAL_SQL_PATH
|
||||||
|
if [ $? == 0 ]; then
|
||||||
|
log "\t- OK"
|
||||||
|
send_db_ssh
|
||||||
|
send_db_s3
|
||||||
|
else
|
||||||
|
log "\t- ERROR: failed to create dump. Exit-code: $?"
|
||||||
|
ntfy_warn "ERROR: failed to create dump"
|
||||||
|
log "3/7 Sending database backup to $SSH_HOST... skipped"
|
||||||
|
log "4/7 Sending database backup to $S3_DIR... skipped"
|
||||||
|
fi
|
||||||
|
}
|
||||||
|
|
||||||
|
# sends database archive into ssh remote storage
|
||||||
|
send_db_ssh() {
|
||||||
|
echo
|
||||||
|
log "3/7 Sending database backup to $SSH_HOST..."
|
||||||
|
if [ $USE_SSH == 1 ]; then
|
||||||
|
rsync --progress "$LOCAL_SQL_PATH" "$SSH_HOST:$SSH_SQL_FILE"
|
||||||
|
if [ $? == 0 ]; then
|
||||||
|
log "\t- OK"
|
||||||
|
else
|
||||||
|
log "\t- ERROR: failed to send DB backup to $SSH_HOST. Exit-code: $?"
|
||||||
|
ntfy_warn "ERROR: failed to send DB backup to $SSH_HOST"
|
||||||
|
fi
|
||||||
|
else
|
||||||
|
log "\t- disabled (\$USE_SSH)"
|
||||||
|
fi
|
||||||
|
}
|
||||||
|
|
||||||
|
# sends database archive into s3 remote storage
|
||||||
|
send_db_s3() {
|
||||||
|
echo
|
||||||
|
log "4/7 Sending database backup to $S3_DIR..."
|
||||||
|
if [ $USE_S3 == 1 ]; then
|
||||||
|
s3cmd put "$LOCAL_SQL_PATH" "$S3_SQL_FILE"
|
||||||
|
if [ $? == 0 ]; then
|
||||||
|
log "\t- OK"
|
||||||
|
else
|
||||||
|
log "\t- ERROR: failed to send DB backup to $S3_DIR. Exit-code: $?"
|
||||||
|
ntfy_warn "ERROR: failed to send DB backup to $S3_DIR"
|
||||||
|
fi
|
||||||
|
else
|
||||||
|
log "\t- disabled (\$USE_SSH)"
|
||||||
|
fi
|
||||||
|
}
|
||||||
|
|
||||||
|
# makes archive with project sources
|
||||||
|
backup_src() {
|
||||||
|
echo
|
||||||
|
log "5/7 Compressing project dir: $LOCAL_SRC_DIR..."
|
||||||
|
tar -zcf "$LOCAL_SRC_PATH" "$LOCAL_SRC_DIR"
|
||||||
|
if [ $? == 0 ]; then
|
||||||
|
log "\t- OK"
|
||||||
|
send_src_ssh
|
||||||
|
send_src_s3
|
||||||
|
else
|
||||||
|
log "\t- ERROR: failed to compress project. Exit-code: $?"
|
||||||
|
ntfy_warn "ERROR: failed to compress project"
|
||||||
|
log "6/7 Sending project backup to $SSH_HOST... skipped"
|
||||||
|
log "7/7 Sending project backup to $S3_DIR... skipped"
|
||||||
|
fi
|
||||||
|
}
|
||||||
|
|
||||||
|
# sends sources archive into ssh remote storage
|
||||||
|
send_src_ssh() {
|
||||||
|
echo
|
||||||
|
log "6/7 Sending project backup to $SSH_HOST..."
|
||||||
|
if [ $USE_SSH == 1 ]; then
|
||||||
|
rsync --progress "$LOCAL_SRC_PATH" "$SSH_HOST:$SSH_SRC_FILE"
|
||||||
|
if [ $? == 0 ]; then
|
||||||
|
log "\t- OK"
|
||||||
|
else
|
||||||
|
log "\t- ERROR: failed to send project backup to $SSH_HOST. Exit-code: $?"
|
||||||
|
ntfy_warn "ERROR: failed to send project backup to $SSH_HOST"
|
||||||
|
fi
|
||||||
|
else
|
||||||
|
log "\t- disabled"
|
||||||
|
fi
|
||||||
|
}
|
||||||
|
|
||||||
|
# sends sources archive into s3 remote storage
|
||||||
|
send_src_s3() {
|
||||||
|
echo
|
||||||
|
log "7/7 Sending project backup to $S3_DIR..."
|
||||||
|
s3cmd put "$LOCAL_SRC_PATH" "$S3_SRC_FILE"
|
||||||
|
if [ $? == 0 ]; then
|
||||||
|
log "\t- OK"
|
||||||
|
else
|
||||||
|
log "\t- ERROR: failed to send database backup to $S3_DIR. Exit-code: $?"
|
||||||
|
ntfy_warn "ERROR: failed to send project backup to $S3_DIR"
|
||||||
|
fi
|
||||||
|
}
|
||||||
|
|
||||||
|
# prints used/free space on local storage
|
||||||
|
show_finish() {
|
||||||
|
echo
|
||||||
|
log "Finish!"
|
||||||
|
log "Used space: $(du -h "$LOCAL_BAK_PATH" | tail -n1)" # вывод размера папки с бэкапами за текущий день
|
||||||
|
log "Free space: $(df -h "$LOCAL_BAK_PATH" | tail -n1 | awk '{print $4}')" # вывод свободного места на локальном диске
|
||||||
|
echo
|
||||||
|
}
|
||||||
|
|
||||||
|
# sends log file into both remote storage
|
||||||
|
send_log() {
|
||||||
|
[ $USE_SSH == 1 ] && rsync --progress "$LOG_PATH" "$SSH_HOST:$SSH_LOG_FILE"
|
||||||
|
[ $USE_S3 == 1 ] && s3cmd put "$LOG_PATH" "$S3_LOG_FILE"
|
||||||
|
}
|
||||||
|
|
||||||
|
# main flow =========================================================
|
||||||
|
|
||||||
|
log "Start ----------------------------------------------------------"
|
||||||
|
show_params
|
||||||
|
init_dirs
|
||||||
|
clear_backups
|
||||||
|
backup_db
|
||||||
|
backup_src
|
||||||
|
show_finish
|
||||||
|
send_log
|
||||||
|
ntfy_info "Finish!"
|
40
tools/setup-wakeonlan.sh
Executable file
40
tools/setup-wakeonlan.sh
Executable file
@ -0,0 +1,40 @@
|
|||||||
|
#!/bin/bash
|
||||||
|
|
||||||
|
print() {
|
||||||
|
echo -e "$*"
|
||||||
|
}
|
||||||
|
|
||||||
|
state() {
|
||||||
|
sudo ethtool "$iface" | grep -E '^\s+Wake-on:\s\w+' | awk '{print $2}'
|
||||||
|
}
|
||||||
|
|
||||||
|
[ "$1" ] && iface="$1" || iface=enp3s0
|
||||||
|
|
||||||
|
[ -f "/sys/class/net/$iface/address" ] && mac=$(cat "/sys/class/net/$iface/address") || mac=''
|
||||||
|
[ -z "$mac" ] && {
|
||||||
|
print "Wrong interface! $iface" >&2
|
||||||
|
exit 1
|
||||||
|
}
|
||||||
|
|
||||||
|
state=$(state)
|
||||||
|
|
||||||
|
print "Interface\t: $iface"
|
||||||
|
print "MAC-address\t: $mac"
|
||||||
|
print "WoL state\t: $state"
|
||||||
|
|
||||||
|
if [ $state == 'd' ]; then
|
||||||
|
sudo ethtool -s "$iface" wol gu || true
|
||||||
|
sudo mkdir -p /etc/networkd-dispatcher/configuring.d
|
||||||
|
sudo tee /etc/networkd-dispatcher/configuring.d/wol <<EOF >/dev/null
|
||||||
|
#!/usr/bin/env bash
|
||||||
|
|
||||||
|
ethtool -s $iface wol gu || true
|
||||||
|
EOF
|
||||||
|
sudo chmod 755 /etc/networkd-dispatcher/configuring.d/wol
|
||||||
|
print "* New WOL state\t: $(state)"
|
||||||
|
fi
|
||||||
|
|
||||||
|
print "\nTo wake up this device run this command from another one:\n"
|
||||||
|
print "\twakeonlan -p 8 $mac\n"
|
||||||
|
print "\twol $mac\n"
|
||||||
|
|
49
tools/upgrade-ubuntu.sh
Executable file
49
tools/upgrade-ubuntu.sh
Executable file
@ -0,0 +1,49 @@
|
|||||||
|
#!/bin/bash
|
||||||
|
# https://dev.to/chefgs/upgrading-an-end-of-life-eol-ubuntu-os-to-lts-version-3a36
|
||||||
|
# https://changelogs.ubuntu.com/meta-release
|
||||||
|
|
||||||
|
installed() {
|
||||||
|
command -v "$1" >/dev/null 2>&1
|
||||||
|
}
|
||||||
|
|
||||||
|
# sudo software-properties-qt (switch from LTS-only releases to normal ones)
|
||||||
|
# sudo aptitude install update-manager-core update-manager
|
||||||
|
# sudo apt upgrade --autoremove -y
|
||||||
|
# installed pkcon && sudo pkcon update --autoremove -y
|
||||||
|
# sudo apt dist-upgrade
|
||||||
|
# sudo apt install update-manager-core
|
||||||
|
# sudo do-release-upgrade -p
|
||||||
|
|
||||||
|
source /etc/os-release
|
||||||
|
|
||||||
|
echo "Loading..."
|
||||||
|
|
||||||
|
IFS=$'\n' codenames=($(curl -s https://changelogs.ubuntu.com/meta-release | grep -xP "^Dist:\s[\w]+$" | sed "s/Dist: //" ))
|
||||||
|
thisCodename="$VERSION_CODENAME"
|
||||||
|
for idx in "${!codenames[@]}"; do
|
||||||
|
if [ "${codenames[idx]}" = "$thisCodename" ]; then
|
||||||
|
nextCodename=${codenames[((idx+1))]}
|
||||||
|
fi
|
||||||
|
done
|
||||||
|
|
||||||
|
targetDownloadPath="`pwd`/upgrade-$nextCodename"
|
||||||
|
targetToolPath="$targetDownloadPath/unpacked"
|
||||||
|
targetToolFile="$targetDownloadPath/$nextCodename.tar.gz"
|
||||||
|
|
||||||
|
echo "Current dist: $thisCodename"
|
||||||
|
echo "Next dist: $nextCodename"
|
||||||
|
echo "Target path: $targetToolFile"
|
||||||
|
|
||||||
|
rm -rf "$targetToolPath"
|
||||||
|
mkdir -p "$targetToolPath"
|
||||||
|
|
||||||
|
echo "Downloading..."
|
||||||
|
cd "$targetDownloadPath"
|
||||||
|
wget "http://archive.ubuntu.com/ubuntu/dists/${nextCodename}-updates/main/dist-upgrader-all/current/${nextCodename}.tar.gz"
|
||||||
|
|
||||||
|
echo "Unpacking..."
|
||||||
|
tar -xaf "$targetToolFile" -C "$targetToolPath"
|
||||||
|
|
||||||
|
echo "Starting..."
|
||||||
|
cd unpacked
|
||||||
|
sudo ./$nextCodename
|
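To preview what the script will resolve before running it, the meta-release index can be inspected directly; this sketch only assumes curl is available:

# list release codenames in publication order; the entry right after the
# current $VERSION_CODENAME from /etc/os-release is the upgrade target
curl -s https://changelogs.ubuntu.com/meta-release | grep '^Dist:'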
82
tools/vscode-ext.sh
Normal file
@ -0,0 +1,82 @@
#!/bin/bash
exts=(
    'af4jm.vscode-m3u'
    'ahmadalli.vscode-nginx-conf'
    'akamud.vscode-theme-onedark'
    'anweber.statusbar-commands'
    'baincd.mini-command-palettes'
    'bmewburn.vscode-intelephense-client'
    'codezombiech.gitignore'
    'cweijan.vscode-redis-client'
    'darkriszty.markdown-table-prettify'
    'davidmarek.jsonpath-extract'
    'deitry.apt-source-list-syntax'
    'devsense.composer-php-vscode'
    'devsense.intelli-php-vscode'
    'devsense.phptools-vscode'
    'devsense.profiler-php-vscode'
    'dotjoshjohnson.xml'
    'dunstontc.vscode-go-syntax'
    'dustypomerleau.rust-syntax'
    'eamodio.gitlens'
    'editorconfig.editorconfig'
    'esbenp.prettier-vscode'
    'furkanozalp.go-syntax'
    'gigacode.gigacode-vscode'
    'golang.go'
    'grapecity.gc-excelviewer'
    'humao.rest-client'
    'irongeek.vscode-env'
    'jebbs.plantuml'
    'jeff-hykin.better-go-syntax'
    'jeppeandersen.vscode-kafka'
    'jflbr.jwt-decoder'
    'jinsihou.diff-tool'
    'jtr.vscode-position'
    'kenhowardpdx.vscode-gist'
    'leavesster.jsonpath'
    'mads-hartmann.bash-ide-vscode'
    'mamoru.vscode-fish-text'
    'mechatroner.rainbow-csv'
    'mehedidracula.php-namespace-resolver'
    'mhutchie.git-graph'
    'mrmlnc.vscode-apache'
    'ms-azuretools.vscode-docker'
    'ms-ceintl.vscode-language-pack-ru'
    'ms-vscode.hexeditor'
    'ms-vscode.makefile-tools'
    'neilbrayfield.php-docblocker'
    'neonxp.gotools'
    'nickdemayo.vscode-json-editor'
    'nico-castell.linux-desktop-file'
    'open-rpc.open-rpc'
    'pejmannikram.vscode-auto-scroll'
    'pkief.material-icon-theme'
    'qcz.text-power-tools'
    'rogalmic.bash-debug'
    'rust-lang.rust-analyzer'
    'ryu1kn.partial-diff'
    'srmeyers.git-prefix'
    'sumneko.lua'
    'syler.ignore'
    'takumii.markdowntable'
    'tamasfe.even-better-toml'
    'tyriar.lorem-ipsum'
    'vitorsalgado.vscode-redis'
    'waderyan.gitblame'
    'wayou.vscode-todo-highlight'
    'weijunyu.vscode-json-path'
    'xdebug.php-debug'
    'yinfei.luahelper'
    'yog.yog-plantuml-highlight'
    'yves.schema-tree'
    'yzane.markdown-pdf'
    'yzhang.markdown-all-in-one'
    'zgm.cuesheet'
    'zh9528.file-size'
    'zobo.php-intellisense'
)

for ext in "${exts[@]}"; do
    code --install-extension "$ext"
done
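If the extension list ever needs to be refreshed from an already configured machine, it can presumably be regenerated with the VS Code CLI (assumes the `code` command is on PATH):

# print installed extensions in the same single-quoted form as the exts array above
code --list-extensions | sed "s/.*/'&'/"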
0
tools/ytdlcue.sh
Executable file → Normal file
uninstall/apache2
@ -1,13 +1,10 @@
#!/bin/bash
#!/bin/bash
##makedesc: Uninstall apache2
##makedesc: Uninstall apache2
source `dirname $0`/../helpers || exit 255
source "$( dirname $(readlink -e -- "${BASH_SOURCE}"))/../helpers.sh" || exit 255

title
title

sudo apt remove -y --autoremove apache2
apt_remove apache2

success "apache2 removed!"

[ $? = 0 ] && {
echo
success "apache2 uninstalled!"
echo
}
15
uninstall/canon-mg2500
Executable file
@ -0,0 +1,15 @@
#!/bin/bash
##makedesc: Uninstall Canon Pixma MG2500 + ppa
source "$( dirname $(readlink -e -- "${BASH_SOURCE}"))/../helpers.sh" || exit 255

title

apt_ppa_remove ppa:thierry-f/fork-michael-gruz

dpkg_remove cnijfilter-mg2500series
dpkg_remove cnijfilter-common
dpkg_remove scangearmp-mg2500series
dpkg_remove scangearmp-common

success "Drivers for Canon Pixma MG2500 removed!"
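A follow-up check, left here as a sketch, to confirm no driver packages survived the dpkg_remove calls above:

dpkg -l | grep -Ei 'cnijfilter|scangearmp' || echo 'no Canon MG2500 packages left'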
9
uninstall/chrome
Executable file
@ -0,0 +1,9 @@
#!/bin/bash
##makedesc: Uninstall google chrome
source "$( dirname $(readlink -e -- "${BASH_SOURCE}"))/../helpers.sh" || exit 255

title

dpkg_remove google-chrome-stable

success "Google Chrome removed!"
17
uninstall/composer
Executable file
@ -0,0 +1,17 @@
#!/bin/bash
##makedesc: Uninstall composer
source "$( dirname $(readlink -e -- "${BASH_SOURCE}"))/../helpers.sh" || exit 255

title

apt_remove composer

sudo rm -f \
    "$HOME/.local/bin/composer" \
    /bin/composer \
    /usr/bin/composer \
    /usr/local/bin/composer \
    /usr/src/composer

success "composer removed!"
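A final sanity check after the composer cleanup, using nothing beyond shell builtins:

command -v composer >/dev/null && echo "composer is still reachable at $(command -v composer)" || echo "composer is gone"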
uninstall/docker
@ -1,16 +1,21 @@
#!/bin/bash
#!/bin/bash
##makedesc: Uninstall docker + ppa
##makedesc: Uninstall docker + ppa
source "$( dirname $(readlink -e -- "${BASH_SOURCE}"))/../helpers.sh" || exit 255

echo
title
echo "==============================================="
echo "Uninstalling docker + ppa..."
echo "==============================================="
echo

sudo apt remove -y docker*
apt_remove -y \
docker-ce \
docker-ce-cli \
containerd.io \
docker-buildx-plugin \
docker-compose-plugin \
docker-ce-rootless-extras

rm -rf \
sudo rm -rf \
/var/lib/docker \
/var/lib/containerd \
/etc/apt/sources.list.d/docker.list \
/etc/apt/sources.list.d/docker.list \
/etc/apt/keyrings/docker.gpg
/etc/apt/keyrings/docker.asc

sudo apt update
apt_update
uninstall/grub-customizer
@ -1,6 +1,6 @@
#!/bin/bash
#!/bin/bash
##makedesc: Uninstall grub-customizer with ppa
##makedesc: Uninstall grub-customizer with ppa
source `dirname $0`/../helpers || exit 255
source "$( dirname $(readlink -e -- "${BASH_SOURCE}"))/../helpers.sh" || exit 255

title
title