My personal clipboard of handy commands that I use frequently and that I tend to forget.
- Start a new session: just
tmux
- Detach: Ctrl+b, then d
- List current sessions:
tmux ls
- Attach to session 0:
tmux attach-session -t 0
SRC="Old Text here"
DST="New Text here"
find . -name '*.php' -type f -print0 | xargs -0 sed -i "/ipsum/s,$SRC,$DST,g"
(/ipsum/ selects lines containing "ipsum"; the command that follows runs only on those lines. Quote the assignments, since the values contain spaces.)
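A throwaway demonstration of that /ipsum/ address (file name and contents here are made up for the sketch):

```shell
# sed applies the s,,,g command only to lines matching /ipsum/.
tmp=$(mktemp)
printf '%s\n' 'lorem ipsum Old' 'plain Old line' > "$tmp"
sed -i '/ipsum/s,Old,New,g' "$tmp"
cat "$tmp"
# line 1 becomes "lorem ipsum New"; line 2 keeps "Old" untouched
rm -f "$tmp"
```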
Interactive disk-usage browser:
ncdu
find . -xdev -type f -size +100M -print0 | xargs -0 ls -lh | sort -k5,5 -h -r | head
Searches only for regular files (-type f) in the current working directory (.) larger than 100MB (-size +100M), without descending into directories on other filesystems (-xdev), printing each file name NUL-terminated (-print0) so names with spaces survive. The output is piped to xargs -0, which runs ls -lh to print a long, human-readable listing; sort orders the lines by the 5th column (-k5,5), comparing the values as human-readable sizes (-h) in reverse order (-r); head prints only the first 10 lines.
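A quick self-contained check of the same pipeline, scaled down to a 1k threshold in a scratch directory and using NUL-safe -print0 | xargs -0 (file names and sizes are invented for the sketch):

```shell
# Same pipeline with a tiny threshold so it runs instantly.
dir=$(mktemp -d)
head -c 2048 /dev/zero > "$dir/big"    # 2 KB -> matched by -size +1k
head -c 100  /dev/zero > "$dir/small"  # rounds up to 1 block -> not matched
find "$dir" -xdev -type f -size +1k -print0 | xargs -0 ls -lh | sort -k5,5 -h -r | head
rm -rf "$dir"
```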
du -a | sort -n -r | head -n 50
Finds the largest files and directories and lists the top 50, sorted by size.
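The du variant can be sanity-checked the same way in a scratch directory (paths and sizes here are invented):

```shell
dir=$(mktemp -d)
mkdir "$dir/sub"
head -c 4096 /dev/zero > "$dir/sub/large"
head -c 16   /dev/zero > "$dir/tiny"
# du -a lists every file and directory; sort -n -r puts the biggest first.
(cd "$dir" && du -a | sort -n -r | head -n 50)
rm -rf "$dir"
```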
find . -type f -size +10M
grep -rnw '/path/to/somewhere/' -e "pattern"
Searches recursively (-r) for whole-word matches (-w) of "pattern", printing the line number (-n) of each hit.
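A small demonstration of what -w buys you (directory and file contents made up for this example):

```shell
dir=$(mktemp -d)
printf 'pattern here\npatterns here\n' > "$dir/a.txt"
# -r recurse, -n show line numbers, -w match whole words only:
grep -rnw "$dir" -e "pattern"
# only line 1 matches; "patterns" on line 2 is not a whole-word hit
rm -rf "$dir"
```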
/usr/sbin/logrotate -v /data/web/hypernode_logrotate.conf --state /data/web/.logrot_state
varnishstat -f SMA.s0.g_bytes -f SMA.s0.g_space
varnishtop -i BereqURL
varnishtop -i ReqURL
varnishlog -g request -q 'ReqMethod eq "PURGE"'
mail -a From:[email protected] -s 'Mail Testing lalala' [email protected] <<< 'This is the message. Over and out.'
Tar a complete directory recursively, including hidden files, while preserving file permissions:
tar -cvpzf FILENAME.tgz .
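A round-trip sketch using temp placeholders for all paths: -p preserves permissions, and the trailing "." is what pulls the dotfiles in:

```shell
src=$(mktemp -d); dst=$(mktemp -d); archive=$(mktemp)
printf 'secret\n' > "$src/.hidden"
(cd "$src" && tar -cpzf "$archive" .)
tar -xpzf "$archive" -C "$dst"
ls -A "$dst"    # .hidden made it into the archive and back out
rm -rf "$src" "$dst" "$archive"
```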
Install globally:
npm install --global wayback-sitemap-archive
Then run:
wsa <SITEMAP_URL>
- Create a directory in your home dir:
mkdir ~/warmertmp
- Then run this command (using "$HOME" rather than ~ after the =, so it still expands when run from cron's sh):
wget -U "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/105.0.0.0 Safari/537.36" --directory-prefix="$HOME/warmertmp" --reject jpg,png --reject-regex "(.*)\?(.*)" --spider --recursive --no-directories https://www.DOMAINNAME.nl
You can also set this up in a cron job if you want.
Explanation:
--recursive
will force wget to crawl the website recursively.
--spider
is for "not downloading anything". However, wget still briefly creates files (and deletes them afterwards), so the following is useful:
--directory-prefix=~/warmertmp
ensures that temporary files will end up in that temp dir.
--no-directories
ensures no directory hierarchy is left behind after the run.
--reject-regex "(.*)\?(.*)"
This fetches all pages BUT rejects any URL containing a ? (a query string) -- good for skipping layered navigation, for example.
Optional: --quiet
is just a good way to silence any output to avoid cron email being sent.
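For the cron setup mentioned above, a crontab entry could look like this (the schedule and the domain are placeholders; this is a config fragment, adjust to taste):

```shell
# m h dom mon dow  command -- warm the cache every night at 03:00
0 3 * * * wget -U "Mozilla/5.0" --directory-prefix="$HOME/warmertmp" --reject jpg,png --reject-regex "(.*)\?(.*)" --spider --recursive --no-directories --quiet https://www.DOMAINNAME.nl
```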