Shell Scripting for DevOps


From terminal commands to reliable automation

Learning Objectives

  • Understand what the shell does when you run commands
  • Write scripts with #!/bin/bash and set -euo pipefail
  • Use variables, input, args, conditionals, loops, and functions safely
  • Explain quoting, expansion, exit codes, and redirection
  • Build and reason about a safer backup script

Why Shell Scripting?

Portable glue for DevOps automation

Where Scripts Are Used

  • Cloud/VM bootstrap scripts
  • Container startup (entrypoint.sh)
  • CI/CD jobs (build.sh, deploy.sh)
  • Ops maintenance (cleanup, checks, alerts)
Goal: turn repeatable manual steps into predictable, testable automation.

Under the Hood: Press Enter Flow

Read → Parse → Expand → Resolve → Fork/Exec → Exit Code

Most common bugs come from expansion and quoting.

Your First Script

Shebang + strict mode + execution

Minimal Safe Script

#!/bin/bash
set -euo pipefail   # fail fast: command errors, undefined vars, pipeline failures

echo "Hello, DevOps World!"   # write a line to stdout
  • #!/bin/bash: choose Bash interpreter explicitly
  • -e: stop on command failure
  • -u: fail on undefined variables
  • -o pipefail: fail if any pipeline command fails
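Each flag's effect can be observed in isolation. A small sketch, running each check in a child bash so the failures do not kill the demo itself:

```shell
#!/bin/bash
# Each check runs in a child bash so its failure does not stop this script.

bash -c 'set -u; echo "$UNDEFINED_VAR"' 2>/dev/null \
  || echo "set -u: aborted on an undefined variable"

bash -c 'set -e; false; echo unreachable' 2>/dev/null \
  || echo "set -e: stopped at the first failing command"

bash -c 'set -o pipefail; false | true' \
  || echo "pipefail: pipeline reported the failure"
```

All three checks fail on purpose, which is exactly what strict mode is for.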

Run It

chmod +x hello.sh   # add execute bit so file can run as a program
./hello.sh          # run script from current directory
  • chmod: change file permissions
  • +x: add execute permission
  • ./: run file from current directory
  • hello.sh: target file receiving execute bit
  • If file is not executable, shell returns Permission denied

Script Anatomy Template

#!/bin/bash
set -euo pipefail

log_file="/tmp/my-script.log"   # store logs in /tmp for safe testing

log_info() {
  echo "[INFO] $*" | tee -a "$log_file"   # print and append to log file
}

if [ "$#" -lt 1 ]; then   # $# is number of script arguments
  echo "Usage: $0 <name>"   # $0 is script name
  exit 1                   # non-zero exit status means failure
fi

name="$1"                  # first positional argument
log_info "Hello, $name"

Structure: defaults → functions → validation → main logic

Variables, Input, Arguments

Variables and Environment

#!/bin/bash
set -euo pipefail

name="Alice"             # regular shell variable
export ROLE="student"    # exported var visible to child processes
echo "Hi $name ($ROLE)"  # expand variables inside double quotes

env | sort               # list environment variables alphabetically
  • export passes variable to child processes
  • env | sort lists env variables alphabetically
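The difference between a plain variable and an exported one shows up as soon as a child process looks for them; a quick sketch:

```shell
#!/bin/bash
name="Alice"             # plain shell variable: not inherited
export ROLE="student"    # exported: copied into every child's environment

child_role=$(bash -c 'echo "${ROLE:-unset}"')   # child sees the exported var
child_name=$(bash -c 'echo "${name:-unset}"')   # child does NOT see the plain var
echo "child ROLE: $child_role"   # student
echo "child name: $child_name"   # unset
```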

Input and Positional Args

read -r -p "Enter your name: " name   # -r keeps backslashes literal, -p prints prompt

src="${1:-}"                            # first CLI argument; empty default keeps set -u happy
dst="${2:-}"                            # second CLI argument
if [ -z "$src" ] || [ -z "$dst" ]; then   # -z checks empty string, || is logical OR
  echo "Usage: $0 <source> <destination>"
  exit 1
fi
cp -r -- "$src" "$dst"                  # -r recursive copy, -- ends option parsing
  • -r with read: keep backslashes literal
  • -p: show prompt
  • ${1:-} and ${2:-}: positional args with an empty default, so set -u does not abort before validation
  • [ -z "$src" ]: true when string is empty
  • ||: logical OR between test conditions
  • cp -r: recursive copy for directories
  • --: end of options (safe path handling)

Arrays and "$@"

files=("$@")                    # save all args as array, preserving spaces
for f in "${files[@]}"; do      # iterate safely over array items
  echo "Processing: $f"
done
  • "$@": each argument preserved separately
  • "$*": joins all args into one word (separated by the first IFS character)
  • For file paths, prefer arrays + quoted expansion
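The difference is easiest to see by counting arguments. A sketch using a small helper and two simulated arguments that contain spaces:

```shell
#!/bin/bash
count_args() { echo "$#"; }            # prints how many arguments it received

set -- "report one.txt" "report two.txt"   # simulate two script arguments

with_at=$(count_args "$@")   # quoted "$@": each argument stays intact
with_star=$(count_args $*)   # unquoted $*: re-split on spaces
echo "\"\$@\" gives $with_at arguments"    # 2
echo "\$* gives $with_star arguments"      # 4
```

Two filenames with spaces become four words under unquoted `$*`, which is how files get "lost" in scripts.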

Quoting and Expansion

Plain-Language Concepts

  • Variable expansion: $HOME becomes its value
  • Command substitution: $(pwd) becomes command output
  • Word splitting: text split at spaces/tabs/newlines
  • Globbing: wildcards like *.log match filenames

Double vs Single vs Unquoted

name="Ali Veli"
echo "Hello $name"      # double quotes: expand vars, keep one argument
echo '$HOME'            # single quotes: literal text, no expansion

file="My Notes.txt"
rm "$file"              # quoted path stays one argument
rm $file                # unquoted path may split on spaces (unsafe)
Safe default: always quote expansions: "$var", "$@".
  • "..." allows $var and $(cmd) expansion
  • '...' is literal text, no expansion
  • Unquoted values may be split into multiple arguments

Expansion Order (Practical View)

  1. Brace expansion
  2. Tilde expansion
  3. Variable / parameter expansion
  4. Command substitution
  5. Arithmetic expansion
  6. Word splitting
  7. Filename globbing

Most bugs happen in the final two steps when values are unquoted.
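You can watch those last two steps happen. A sketch in a throwaway scratch directory:

```shell
#!/bin/bash
demo=$(mktemp -d)                # fresh scratch directory
cd "$demo"
touch a.log b.log

pat="*.log"
unquoted=$(echo $pat)    # variable expands first, THEN the glob matches files
quoted=$(echo "$pat")    # quotes suppress globbing: the literal pattern survives
echo "unquoted: $unquoted"   # unquoted: a.log b.log
echo "quoted:   $quoted"     # quoted:   *.log
```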

Conditionals and Loops

Conditionals

#!/bin/bash
set -euo pipefail

read -r -p "Enter a directory: " dir
if [ -d "$dir" ]; then         # -d: true if path exists and is a directory
  echo "Directory exists."
elif [ -e "$dir" ]; then       # -e: true if path exists (any type)
  echo "Path exists, but is not a directory."
else
  echo "Path not found."
fi

-d directory exists, -e path exists

Other useful tests: -f file exists, -r readable, -w writable.
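These tests are easy to try on a throwaway file; a quick sketch:

```shell
#!/bin/bash
f=$(mktemp)                        # a real, regular temp file

[ -f "$f" ] && echo "-f: regular file"
[ -r "$f" ] && echo "-r: readable"
[ -w "$f" ] && echo "-w: writable"
[ -f "/no/such/path-$$" ] || echo "-f fails for a missing path"
```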

Loops

servers=(web01 web02 db01)      # Bash array
for s in "${servers[@]}"; do     # "${arr[@]}" keeps each element separate
  echo "Processing $s"
done

count=0
while [ "$count" -lt 3 ]; do     # -lt means numeric "less than"
  date                            # print current date/time
  count=$((count + 1))           # arithmetic expansion
  sleep 1                         # pause 1 second
done
  • "${servers[@]}" preserves each item safely
  • $((...)) does integer arithmetic
  • -lt means "less than" in numeric comparisons
  • sleep 1 waits for 1 second between iterations
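Another loop you will meet constantly in ops scripts reads a file line by line. A sketch using a hypothetical inventory file:

```shell
#!/bin/bash
hosts_file=$(mktemp)                  # hypothetical inventory file
printf '%s\n' web01 web02 db01 > "$hosts_file"

processed=0
while IFS= read -r host; do           # IFS= and -r keep each line intact
  echo "Checking $host"
  processed=$((processed + 1))
done < "$hosts_file"                  # the loop reads the file on stdin
echo "Processed $processed hosts"     # Processed 3 hosts
```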

Functions

#!/bin/bash
set -euo pipefail

backup_path() {
  local src="$1"                 # local variable only inside function
  local dst="$2"
  tar -czf "$dst" -- "$src"      # -c create, -z gzip, -f archive name
}

backup_path "$HOME/Documents" "/tmp/docs.tgz"
  • local keeps variables inside the function
  • tar -czf: create gzip archive file
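Bash functions do not return values the way other languages do: they emit results on stdout and signal success or failure through exit codes. A minimal sketch of both patterns:

```shell
#!/bin/bash
add() {                        # "return" a value by printing it
  echo $(( $1 + $2 ))
}

is_even() {                    # "return" a status via the exit code of [ ]
  [ $(( $1 % 2 )) -eq 0 ]
}

sum=$(add 2 3)                 # capture stdout with $(...)
echo "sum=$sum"                # sum=5
if is_even "$sum"; then
  echo "even"
else
  echo "odd"                   # 5 is odd, so this branch runs
fi
```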

Exit Codes and Redirection

Exit Status

mkdir /root/test   # likely fails without privileges
echo $?           # exit code of previous command (0 success, non-zero fail)
  • 0 means success
  • non-zero means failure
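Rather than inspecting $? after the fact, a conditional can test the command itself. A sketch using a scratch directory:

```shell
#!/bin/bash
scratch=$(mktemp -d)                     # throwaway directory

if mkdir -- "$scratch/nested"; then      # if branches on the exit code directly
  echo "created $scratch/nested"
fi

if ! mkdir -- "$scratch/nested" 2>/dev/null; then   # second attempt: already exists
  echo "already exists (mkdir returned non-zero)"
fi
```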

Standard Streams

command > out.log           # redirect stdout, overwrite file
command >> out.log          # redirect stdout, append file
command 2> err.log          # redirect stderr (fd 2) only
command > all.log 2>&1      # send stderr to same target as stdout
command1 | command2         # pipe stdout of command1 into stdin of command2
  • stdout = normal output, stderr = errors
  • 2>&1 sends stderr to the same place as stdout
  • > overwrites file, >> appends to file
  • 2> redirects only error stream
  • | pipes stdout of left command to stdin of right command

Pipelines and pipefail

# without pipefail:
false | true              # left command fails, right succeeds
echo $?                   # returns 0 from last command (misleading)

# with set -o pipefail:
set -o pipefail           # pipeline status = last (rightmost) failing command
false | true
echo $?                   # now returns non-zero
Without pipefail, failed commands can be hidden in pipelines.
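Bash also records every command's status in the PIPESTATUS array, which helps when a single pass/fail is not enough. A sketch (PIPESTATUS is overwritten by the very next command, so capture it immediately):

```shell
#!/bin/bash
set -o pipefail

false | true | true
overall=$?                          # 1: pipefail surfaces the failure
false | true | true
codes=("${PIPESTATUS[@]}")          # per-command statuses, captured immediately
echo "overall=$overall"             # overall=1
echo "per-command: ${codes[*]}"     # per-command: 1 0 0
```

The pipeline runs twice only because $? and PIPESTATUS each get clobbered by whatever command reads the other one first.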

Debugging Scripts

Debug Toolkit

bash -n script.sh   # parse only, do not execute commands
bash -x script.sh   # execute with command trace

set -x              # enable tracing inside script
# ... commands ...
set +x              # disable tracing
Debug fast: check syntax first, then trace execution.
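The trace prefix printed by set -x is the PS4 variable (default "+ "); extending it with the line number makes long traces far easier to follow. A sketch:

```shell
#!/bin/bash
PS4='+ line ${LINENO}: '    # trace prefix now includes the line number
set -x                      # trace output goes to stderr
msg="hello"
echo "$msg"
set +x                      # stop tracing
```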

Defensive Patterns

require_cmd() {
  command -v "$1" >/dev/null 2>&1 || {   # command exists? hide output/errors
    echo "[ERROR] Missing command: $1" >&2   # send error message to stderr
    exit 1
  }
}

require_cmd tar    # verify dependency before main logic
require_cmd gzip
  • Check dependencies up front
  • Fail early with clear error messages
  • command -v checks if command is available in PATH
  • >/dev/null 2>&1 hides both normal output and errors
  • || { ... } runs error block when check fails
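Another defensive staple is a cleanup trap, so temporary files disappear even when the script fails partway. A sketch:

```shell
#!/bin/bash
workdir=$(mktemp -d)          # temporary workspace

cleanup() {
  rm -rf -- "$workdir"        # remove the workspace
}
trap cleanup EXIT             # run cleanup on any script exit, success or failure

echo "working in $workdir"
touch "$workdir/scratch.txt"  # pretend work; cleanup still runs at exit
```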

Hands-On Backup Script

Safer defaults + validation + logging

Full Example

#!/bin/bash
set -euo pipefail

dest="${1:-/tmp/backups}"                # use arg1 or default path
shift || true                            # drop arg1; '|| true' avoids strict-mode exit when no args
sources=("$@")                           # remaining args become source array
if [ "${#sources[@]}" -eq 0 ]; then      # array length check
  sources=("$HOME/Documents" "/etc")     # fallback source paths
fi

mkdir -p -- "$dest"                      # create destination if missing
host="$(hostname -s)"                    # short hostname
stamp="$(date +%Y-%m-%d-%H%M%S)"         # timestamp for unique archive names
archive="$dest/${host}-${stamp}.tgz"

existing=()                              # collect only paths that actually exist
for p in "${sources[@]}"; do
  if [ ! -e "$p" ]; then                 # ! negates test; -e means path exists
    echo "[WARN] Skipping missing path: $p" >&2   # warning to stderr
  else
    existing+=("$p")                     # keep valid paths for tar
  fi
done

if [ "${#existing[@]}" -eq 0 ]; then     # nothing left to back up
  echo "[ERROR] No existing source paths" >&2
  exit 1
fi

tar -czf "$archive" -- "${existing[@]}"  # create gzip tar from existing sources
echo "[OK] Backup complete: $archive"
  • ${1:-/tmp/backups}: default value when arg 1 is missing
  • shift: drops first positional argument
  • ${#sources[@]}: array length
  • ! -e: path does not exist
  • >&2: write warning to stderr
  • -czf in tar: create + gzip + filename
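One sanity check worth adding after any backup: list the archive before trusting it. A sketch with a throwaway source directory:

```shell
#!/bin/bash
src=$(mktemp -d)                 # throwaway source directory
echo "data" > "$src/note.txt"
archive="$src.tgz"

tar -czf "$archive" -C "$src" .  # -C: store paths relative to $src
tar -tzf "$archive"              # -t lists contents without extracting
```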

Try It

chmod +x backup.sh   # make script executable
./backup.sh /tmp/my-backups "$HOME/Documents" /etc   # pass destination and source paths
  • Use /tmp first while learning
  • Then move to real directories carefully

Backup Script Walkthrough

  • dest="${1:-/tmp/backups}": default destination
  • shift || true: consume first arg safely in strict mode
  • sources=("$@"): collect remaining args as paths
  • mkdir -p -- "$dest": ensure destination exists
  • hostname -s + date +...: predictable archive names
  • >&2: send warnings to error stream

Common Scripting Mistakes

  • Forgetting quotes around variables and paths
  • Skipping argument validation
  • Assuming paths always exist
  • Ignoring command exit status
  • Running risky scripts on production paths first
Start safe: test in /tmp, then scale up.

Pre-Run Checklist

  • Run syntax check: bash -n script.sh
  • Run with trace on test data: bash -x script.sh
  • Confirm all paths are quoted
  • Use non-production paths first (/tmp)
  • Verify exit code behavior for expected failures
  • Commit script and review in Git before scheduling/CI

Practice Lab

Suggested exercises

Lab Tasks

  1. Write health-check.sh for disk + memory thresholds
  2. Add argument parsing for threshold values
  3. Log output to file and stderr warnings separately
  4. Fail with non-zero exit code on critical condition
  5. Run with bash -x and fix one intentional bug

Key Takeaways

  • Use #!/bin/bash in course scripts
  • Default to set -euo pipefail in scripts
  • Quote variables: "$var", "$@"
  • Validate inputs before destructive operations
  • Keep scripts modular, readable, and version-controlled

Q&A

Next step: build one small script that replaces a manual task you do every day.