
3.10. External Filters, Programs and Commands

This is a descriptive listing of standard UNIX commands useful in shell scripts.


ls

The basic file "list" command. It is all too easy to underestimate the power of this humble command. For example, using the -R, recursive option, ls provides a tree-like listing of a directory structure.

Example 3-42. Using ls to create a table of contents for burning a CDR disk


#!/bin/bash
# Script to automate burning a CDR.

SPEED=2                  # May use higher speed if your burner supports it.
IMAGEFILE=cdimage.iso
CONTENTSFILE=contents
DEFAULTDIR=/opt          # Directory containing the data to be burned.

# Uses Joerg Schilling's "cdrecord" package.

# If this script invoked as an ordinary user, need to suid cdrecord
# (chmod u+s /usr/bin/cdrecord, as root).

if [ -z "$1" ]
then
  IMAGE_DIRECTORY=$DEFAULTDIR
  # Default directory, if not specified on command line.
else
  IMAGE_DIRECTORY=$1
fi

ls -lRF $IMAGE_DIRECTORY > $IMAGE_DIRECTORY/$CONTENTSFILE
# The "l" option gives a "long" file listing.
# The "R" option makes the listing recursive.
# The "F" option marks the file types (directories suffixed by a /).
echo "Creating table of contents."

mkisofs -r -o $IMAGEFILE $IMAGE_DIRECTORY
echo "Creating ISO9660 file system image ($IMAGEFILE)."

cdrecord -v -isosize speed=$SPEED dev=0,0 $IMAGEFILE
# Change speed parameter to speed of your burner.
echo "Burning the disk."
echo "Please be patient, this will take a while."

exit 0

chmod

Changes the attributes of a file.

chmod +x filename
# Makes "filename" executable for all users.

chmod 644 filename
# Makes "filename" readable/writable to owner, readable to
# others
# (octal mode).

chmod 1777 directory-name
# Gives everyone read, write, and execute permission in directory,
# however also sets the "sticky bit", which means that
# only the directory owner can change files in the directory.


umask

Sets the default file attributes (for a particular user).



find

-exec COMMAND \;

Carries out COMMAND on each file that find scores a hit on. COMMAND is followed by {} \; (the ; is escaped to make certain the shell reads it literally and terminates the command sequence). This causes COMMAND to act on the path name of each file found.

Example 3-43. Badname, eliminate file names in current directory containing bad characters and white space.


#!/bin/bash
# Delete filenames in current directory containing bad characters.

for filename in *
do
  badname=`echo "$filename" | sed -n /[\+\{\;\"\\\=\?~\(\)\<\>\&\*\|\$]/p`
  # Files containing those nasties:   + { ; " \ = ? ~ ( ) < > & * | $
  rm $badname 2>/dev/null
  #           So error messages deep-sixed.
done

# Now, take care of files containing all manner of whitespace.
find . -name "* *" -exec rm -f {} \;
# The "{}" references the paths of all the files that "find" finds.
# The '\' ensures that the ';' is interpreted literally, as end of command.

exit 0

See the man page for find for more detail.


xargs

A filter for feeding arguments to a command, and also a tool for assembling the commands themselves. It breaks a data stream into small enough chunks for filters and commands to process. Consider it a powerful replacement for backquotes. In situations where backquotes fail with a too many arguments error, substituting xargs often works. Normally, xargs reads from stdin or from a pipe, but it can also be fed the output of a file.

ls | xargs -p -l gzip gzips every file in current directory, one at a time, prompting before each operation.

One of the more interesting xargs options is -n XX, which limits the number of arguments passed to XX.

ls | xargs -n 8 echo lists the files in the current directory in 8 columns.

Example 3-44. Log file using xargs to monitor system log


#!/bin/bash
# Generates a log file in current directory
# from the tail end of /var/log/messages.

# Note: /var/log/messages must be readable by ordinary users
#       if invoked by same (as root: chmod 644 /var/log/messages).

( date; uname -a ) >>logfile
# Time and machine name
echo --------------------------------------------------------------------- >>logfile
tail -5 /var/log/messages | xargs |  fmt -s >>logfile
echo >>logfile
echo >>logfile

exit 0

Example 3-45. copydir, copying files in current directory to another, using xargs


#!/bin/bash
# Copy (verbose) all files in current directory
# to directory specified on command line.

if [ -z "$1" ]
# Exit if no argument given.
then
  echo "Usage: `basename $0` directory-to-copy-to"
  exit 1
fi

ls . | xargs -i -t cp ./{} $1
# This is the exact equivalent of
#   cp * $1
# unless any of the filenames has embedded "whitespace" characters.

exit 0
eval arg1, arg2, ...

Translates into commands the arguments in a list (useful for code generation within a script).

Example 3-46. Showing the effect of eval


#!/bin/bash

y=`eval ls -l`
echo $y

y=`eval df`
echo $y
# Note that LF's not preserved

exit 0

Example 3-47. Forcing a log-off


#!/bin/bash

y=`eval ps ax | sed -n '/ppp/p' | awk '{ print $1 }'`
# Finding the process number of 'ppp'

kill -9 $y
# Killing it

# Restore to previous state...

chmod 666 /dev/ttyS3
# Doing a SIGKILL on ppp changes the permissions
# on the serial port. Must be restored.

rm /var/lock/LCK..ttyS3
# Remove the serial port lock file.

exit 0
expr arg1 operation arg2 ...

All-purpose expression evaluator: Concatenates and evaluates the arguments according to the operation given (arguments must be separated by spaces). Operations may be arithmetic, comparison, string, or logical.

expr 3 + 5

returns 8

expr 5 % 3

returns 2

y=`expr $y + 1`

incrementing variable, same as let y=y+1 and y=$(($y+1)), as discussed elsewhere

z=`expr substr $string $position $length`

extracts a substring of $string, starting at $position and $length characters long.

Note that external programs, such as sed and Perl, have far superior string parsing facilities, and it might well be advisable to use them instead of the built-in bash ones.

Example 3-48. Using expr


#!/bin/bash

# Demonstrating some of the uses of 'expr'
# +++++++++++++++++++++++++++++++++++++++


# Arithmetic Operators

echo Arithmetic Operators
a=`expr 5 + 3`
echo 5 + 3 = $a

a=`expr $a + 1`
echo a + 1 = $a
echo \(incrementing a variable\)

a=`expr 5 % 3`
# modulo
echo 5 mod 3 = $a


# Logical Operators

echo Logical Operators

echo a = $a
b=`expr $a \> 10`
echo 'b=`expr $a \> 10`, therefore...'
echo "If a > 10, b = 0 (false)"
echo b = $b

b=`expr $a \< 10`
echo "If a < 10, b = 1 (true)"
echo b = $b


# Comparison Operators

echo Comparison Operators

a=zipper
echo a is $a
if [ `expr $a = snap` ]
# Force re-evaluation of variable 'a'
then
   echo "a is not zipper"
fi


# String Operators

echo String Operators

a=1234zipper43231
echo The string being operated upon is $a.

# index: position of substring
b=`expr index $a 23`
echo Numerical position of first 23 in $a is $b.

# substr: print substring, starting position & length specified
b=`expr substr $a 2 6`
echo Substring of $a, starting at position 2 and 6 chars long is $b.

# length: length of string
b=`expr length $a`
echo Length of $a is $b.

# 'match' operates similarly to 'grep'
b=`expr match $a [0-9]*`
echo Number of digits at the beginning of $a is $b.
b=`expr match $a '\([0-9]*\)'`
echo The digits at the beginning of $a are $b.


exit 0

Note that : can substitute for match. b=`expr $a : [0-9]*` is an exact equivalent of b=`expr match $a [0-9]*` in the above example.


The let command carries out arithmetic operations on variables. In many cases, it functions as a less complex version of expr.

Example 3-49. Letting let do some arithmetic.



#!/bin/bash

let a=11
# Same as 'a=11'
let a=a+5
# Equivalent to let "a = a + 5"
# (double quotes makes it more readable)
echo "a = $a"
let "a <<= 3"
# Equivalent of let "a = a << 3"
echo "a left-shifted 3 places = $a"

let "a /= 4"
# Equivalent to let "a = a / 4"
echo $a
let "a -= 5"
# Equivalent to let "a = a - 5"
echo $a
let "a = a * 10"
echo $a
let "a %= 8"
echo $a

exit 0

The printf, formatted print, command is an enhanced echo. It is a limited variant of the C language printf, and the syntax is somewhat different.

printf format-string... parameter...

See the printf man page for in-depth coverage.

Note: Older versions of bash may not support printf.

Example 3-50. printf in action


#!/bin/bash
# printf demo

# Values below restore the sample data the output lines assume.
PI=3.14159265358979
DecimalConstant=31373
Message1="Greetings,"
Message2="Earthling."

echo

printf "Pi to 2 decimal places = %1.2f" $PI
printf "\n"
printf "Pi to 9 decimal places = %1.9f" $PI
# Note correct round off.

printf "\n"
# Prints a line feed, equivalent to 'echo'.

printf "Constant = \t%d\n" $DecimalConstant
# Insert tab (\t)

printf "%s %s \n" $Message1 $Message2


exit 0

The at job control command executes a given set of commands at a specified time. This is a user version of cron.

at 2pm January 15 prompts for a set of commands to execute at that time. These commands may include executable shell scripts.

Using either the -f option or input redirection (<), at reads a command list from a file. This file can include shell scripts, though they should, of course, be noninteractive.

bash$ at 2:30 am Friday < at-jobs.list
job 2 at 2000-10-27 02:30


ps

Lists currently executing processes by owner and process ID. This is usually invoked with the ax options, and may be piped to grep to search for a specific process.

ps ax | grep sendmail results in:
295 ?        S      0:00 sendmail: accepting connections on port 25


The batch job control command is similar to at, but it runs a command list when the system load drops below .8. Like at, it can read commands from a file with the -f option.


sleep

This is the shell equivalent of a wait loop. It pauses for a specified number of seconds, doing nothing. This can be useful for timing or in processes running in the background, checking for a specific event every so often.
sleep 3
# Pauses 3 seconds.


dd

This is the somewhat obscure and much feared "data duplicator" command. It simply copies a file (or stdin/stdout), but with conversions. Possible conversions are ASCII/EBCDIC, upper/lower case, swapping of byte pairs between input and output, and skipping and/or truncating the head or tail of the input file. A dd --help lists the conversion and other options that this powerful utility takes.

The dd command can copy raw data and disk images to and from devices, such as floppies. It can even be used to create boot floppies.
dd if=kernel-image of=/dev/fd0H1440
One important use for dd is initializing temporary swap files (see Example 3-79).
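A minimal sketch of that technique (the path under /tmp is illustrative; as root, mkswap and swapon would follow):

```shell
# Carve out a zero-filled 1 MiB file: 1024 blocks of 1024 bytes each.
dd if=/dev/zero of=/tmp/swapdemo bs=1024 count=1024 2>/dev/null

wc -c < /tmp/swapdemo     # 1048576 bytes
rm /tmp/swapdemo
```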


sort

File sorter, often used as a filter in a pipe. See the man page for options.


diff, patch

Simple file comparison utility. The files must be sorted (this may, if necessary, be accomplished by filtering the files through sort before passing them to diff). diff file-1 file-2 outputs the lines in the files that differ, with carets showing which file each particular line belongs to. A common use for diff is to generate difference files to be used with patch (see below). The -e option outputs files suitable for ed or ex scripts.

patch -p1 <patch-file
# Takes all the changes listed in 'patch-file' and applies them
# to the files referenced therein.

cd /usr/src
gzip -cd patchXX.gz | patch -p0
# Upgrading kernel source using 'patch'.
# From the Linux kernel docs "README",
# by anonymous author (Alan Cox?).


comm

Versatile file comparison utility. The files must be sorted for this to be useful.

comm -options first-file second-file

comm file-1 file-2 outputs three columns:

  • column 1 = lines unique to file-1

  • column 2 = lines unique to file-2

  • column 3 = lines common to both.

The options allow suppressing output of one or more columns.

  • -1 suppresses column 1

  • -2 suppresses column 2

  • -3 suppresses column 3

  • -12 suppresses both columns 1 and 2, etc.
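A quick sketch (the file names and contents are illustrative):

```shell
# Two sorted word lists.
printf '%s\n' apple banana cherry > /tmp/file-1
printf '%s\n' banana cherry date  > /tmp/file-2

comm /tmp/file-1 /tmp/file-2
# Column 1: apple (only in file-1)
# Column 2: date (only in file-2)
# Column 3: banana, cherry (common to both)

comm -12 /tmp/file-1 /tmp/file-2   # suppress columns 1 and 2
# banana
# cherry

rm /tmp/file-1 /tmp/file-2
```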


uniq

This filter removes duplicate lines from a sorted file. It is often seen in a pipe coupled with sort.
cat list-1 list-2 list-3 | sort | uniq > final.list
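The -c option of uniq prefixes each line with its number of occurrences, which combines with sort into a quick frequency count (file name and contents illustrative):

```shell
printf '%s\n' red green red blue red green > /tmp/colors

sort /tmp/colors | uniq -c | sort -nr
#      3 red
#      2 green
#      1 blue

rm /tmp/colors
```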


expand

A filter that converts tabs to spaces, often seen in a pipe.


cut

A tool for extracting fields from files. It is similar to the print $N command set in awk, but more limited. It may be simpler to use cut in a script than awk. Particularly important are the -d (delimiter) and -f (field specifier) options.

Using cut to obtain a listing of the mounted filesystems:
cat /etc/mtab | cut -d ' ' -f1,2

Using cut to list the OS and kernel version:
uname -a | cut -d" " -f1,3,11,12

cut -d ' ' -f2,3 filename is equivalent to awk '{ print $2, $3 }' filename


colrm

Column removal filter. This removes columns (characters) from a file and writes them, lacking the specified columns, back to stdout. colrm 2 3 <filename removes the second and third characters from each line of the text file filename.


paste

Tool for merging together different files into a single, multi-column file. In combination with cut, useful for creating system log files.
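A short sketch (file names illustrative):

```shell
printf '%s\n' alpha beta gamma > /tmp/names
printf '%s\n' 1 2 3            > /tmp/numbers

paste /tmp/names /tmp/numbers      # columns separated by tabs
# alpha   1
# beta    2
# gamma   3

paste -d: /tmp/names /tmp/numbers  # ':' as the delimiter instead
# alpha:1
# beta:2
# gamma:3

rm /tmp/names /tmp/numbers
```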


join

Consider this a more flexible version of paste. It works on exactly two files, but permits specifying which fields to paste together, and in which order.
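A small sketch (file names illustrative); note that both files must be sorted on the join field, the first field by default:

```shell
printf '%s\n' '1 apple' '2 banana' '3 cherry' > /tmp/fruits
printf '%s\n' '1 red'   '2 yellow'            > /tmp/colors

join /tmp/fruits /tmp/colors
# 1 apple red
# 2 banana yellow
# (line 3 of fruits has no match in colors, so it is dropped)

rm /tmp/fruits /tmp/colors
```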


cpio

This specialized archiving copy command is rarely used any more, having been supplanted by tar/gzip. It still has its uses, such as moving a directory tree.

Example 3-51. Using cpio to move a directory tree


#!/bin/bash
# Copying a directory tree using cpio.

if [ $# -ne 2 ]
then
  echo "Usage: `basename $0` source destination"
  exit 1
fi

source=$1
destination=$2

find "$source" -depth | cpio -admvp "$destination"

exit 0

The familiar cd change directory command finds use in scripts where execution of a command requires being in a specified directory.
(cd /source/directory && tar cf - . ) | (cd /dest/directory && tar xvfp -)
[from the previously cited example by Alan Cox]


Utility for updating access/modification times of a file to current system time or other specified time, but also useful for creating a new file. The command touch zzz will create a new file of zero length, named zzz, assuming that zzz did not previously exist. Time-stamping empty files in this way is useful for storing date information, for example in keeping track of modification times on a project.


split

Utility for splitting a file into smaller chunks. Usually used for splitting up large files in order to back them up on floppies or preparatory to e-mailing or uploading them.
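A quick sketch (file names illustrative): split writes pieces named xaa, xab, ... and concatenating them restores the original.

```shell
cd /tmp
printf '%s\n' one two three four five > bigfile

split -l 2 bigfile        # split into 2-line pieces: xaa, xab, xac
cat xaa xab xac           # reassembles the original file

rm bigfile xaa xab xac
```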


rm

Delete (remove) a file or files. The -f option forces removal of even read-only files. When used with the recursive flag -r, this command removes files all the way down the directory tree (very dangerous!).


rmdir

Remove directory. The directory must be empty of all files, including dotfiles, for this command to succeed.


ln

Creates links to pre-existing files. Most often used with the -s, symbolic or "soft" link flag. This permits referencing the linked file by more than one name and is a superior alternative to aliasing.

ln -s oldfile newfile links the previously existing oldfile to the newly created link, newfile.


Make directory, creates a new directory. mkdir -p project/programs/December creates the named directory. The -p option automatically creates any necessary parent directories.


This is the file copy command. cp file1 file2 copies file1 to file2, overwriting file2 if it already exists.


mv

This is the file move command. It is equivalent to a combination of cp and rm. It may be used to move multiple files to a directory.


rcp

"Remote copy", copies files between two networked machines. Using rcp and similar utilities in a shell script may not be advisable, for security reasons. Consider, instead, using an expect script.


yes

In its default behavior the yes command feeds a continuous string of the character y followed by a line feed to stdout. A control-C terminates the run. A different output string may be specified, as in yes different string, which would continually output different string to stdout. One might well ask the purpose of this. From the command line or in a script, the output of yes can be redirected or piped into a program expecting user input. In effect, this becomes a sort of poor man's version of expect.
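For example, piping yes into head shows the stream without running forever:

```shell
yes | head -3        # take just the first three lines of the endless stream
# y
# y
# y

yes retry | head -2  # a custom output string
# retry
# retry
```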


echo

Prints (to stdout) an expression or variable ($variable).
echo Hello
echo $a

Normally, each echo command prints a terminal newline, but the -n option suppresses this.

cat, tac

cat, an acronym for concatenate, lists a file to stdout. When combined with redirection (> or >>), it is commonly used to concatenate files.
cat filename
cat file.1 file.2 file.3 > file.123
The -n option to cat inserts consecutive numbers before each line of the target file(s).

tac is the inverse of cat, listing a file backwards from its end.
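A one-line sketch of tac (a GNU utility; file name illustrative):

```shell
printf '%s\n' first second third > /tmp/ordered

tac /tmp/ordered
# third
# second
# first

rm /tmp/ordered
```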


head

Lists the first 10 lines of a file to stdout.


tail

Lists the end of a file to stdout (the default is 10 lines, but this can be changed). Commonly used to keep track of changes to a system logfile, using the -f option, which outputs lines appended to the file.

Example 3-44 and Example 3-83 show tail in action.


tee

[UNIX borrows an idea here from the plumbing trade.]

This is a redirection operator, but with a difference. Like the plumber's tee, it permits "siphoning off" the output of a command or commands within a pipe, but without affecting the result. This is useful for printing an ongoing process to a file or paper, perhaps to keep track of it for debugging purposes.

                 |------> to file
  command--->----|-operator-->---> result of command(s)

cat listfile* | sort | tee check.file | uniq > result.file
(The file check.file contains the concatenated sorted "listfiles", before the duplicate lines are removed by uniq.)

sed, awk

Scripting languages especially suited for parsing text files and command output. May be embedded singly or in combination in pipes and shell scripts.


sed

Non-interactive "stream editor", permits using many ex commands in batch mode.


awk

Programmable file extractor and formatter, good for manipulating and/or extracting fields (columns) in text files. Its syntax is similar to C.


wc gives a "word count" on a file or I/O stream:
$ wc /usr/doc/sed-3.02/README
20     127     838 /usr/doc/sed-3.02/README
[20 lines  127 words  838 characters]

wc -w gives only the word count.

wc -l gives only the line count.

wc -c gives only the character count.

wc -L gives only the length of the longest line.

Using wc to count how many .txt files are in current working directory:
$ ls *.txt | wc -l


tr

Character translation filter.

Note: must use quoting and/or brackets, as appropriate.

tr "A-Z" "*" <filename changes all the uppercase letters in filename to asterisks (writes to stdout).

tr -d [0-9] <filename deletes all digits from the file filename.

Example 3-52. toupper: Transforms a file to all uppercase.


#!/bin/bash
# Changes a file to all uppercase.

if [ -z "$1" ]
# Standard check whether command line arg is present.
then
  echo "Usage: `basename $0` filename"
  exit 1
fi

tr '[a-z]' '[A-Z]' < "$1"

exit 0

Example 3-53. lowercase: Changes all filenames in working directory to lowercase.

#! /bin/bash
# Changes every filename in working directory to all lowercase.
# Inspired by a script of john dubois,
# which was translated into bash by Chet Ramey,
# and considerably simplified by Mendel Cooper,
# author of this HOWTO.

for filename in *  # Traverse all files in directory.
do
   fname=`basename $filename`
   n=`echo $fname | tr A-Z a-z`  # Change name to lowercase.
   if [ "$fname" != "$n" ]  # Rename only files not already lowercase.
   then
     mv $fname $n
   fi
done

exit 0

fold

A filter that wraps lines of input to a specified width.
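For example:

```shell
echo "The quick brown fox jumps over the lazy dog" | fold -w 16
# Breaks at exactly 16 characters, even mid-word.

echo "The quick brown fox jumps over the lazy dog" | fold -s -w 16
# The -s option breaks at spaces instead.
```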


fmt

Simple-minded file formatter.


Line numbering filter. nl filename lists filename to stdout, but inserts consecutive numbers at the beginning of each non-blank line. If filename omitted, operates on stdin.

Example 3-54. nl: A self-numbering script.


#!/bin/bash
# This file echoes itself twice to stdout with its lines numbered.

# 'nl' sees this as line 3 since it does not number blank lines.
# 'cat -n' sees the above line as number 5.

nl `basename $0`

echo; echo  # Now, let's try it with 'cat -n'

cat -n `basename $0`
# The difference is that 'cat -n' numbers the blank lines.

exit 0

pr

Print formatting filter. This will paginate a file (or stdout) into sections suitable for hard copy printing. A particularly useful option is -d, forcing double-spacing.

Example 3-55. Formatted file listing.


#!/bin/bash
# Get a file listing...

b=`ls /usr/local/bin`

# ...40 columns wide.
echo $b | fmt -w 40

# Could also have been done by
# echo $b | fold - -s -w 40
exit 0

Simply invoked, date prints the date and time to stdout. Where this command gets interesting is in its formatting and parsing options.

Example 3-56. Using date


#!/bin/bash
# Using the 'date' command

# Needs a leading '+' to invoke formatting.

echo "The number of days since the year's beginning is `date +%j`."
# %j gives day of year.

echo "The number of seconds elapsed since 01/01/1970 is `date +%s`."
# %s yields number of seconds since "UNIX epoch" began,
# but how is this useful?

prefix=temp
suffix=`eval date +%s`
filename=$prefix.$suffix
echo $filename
# It's great for creating "unique" temp filenames,
# even better than using $$.

# Read the 'date' man page for more formatting options.

exit 0

time

Outputs very verbose timing statistics for executing a command.

time ls -l / gives something like this:
0.00user 0.01system 0:00.05elapsed 16%CPU (0avgtext+0avgdata 0maxresident)k
0inputs+0outputs (149major+27minor)pagefaults 0swaps

See also the very similar times command in the previous section.


grep

A multi-purpose file search tool that uses regular expressions. Originally a command/filter in the ancient ed line editor: g/re/p, or global - regular expression - print.

grep pattern [file...]

Searches the target file(s) for occurrences of pattern.

ls -l | grep '.txt' has the same effect as ls -l *.txt.

The -i option to grep causes a case-insensitive search.

Example 3-83 demonstrates how to use grep to search for a keyword in a system log file.


which <command> gives the full path to the command. This is useful for finding out whether a particular command or utility is installed on the system.

bash$ which pgp


script

This utility records (saves to a file) all the user keystrokes at the command line in a console or an xterm window. This, in effect, creates a record of a session.


tar

The standard UNIX archiving utility. Originally a Tape ARchiving program, whence it derived its name, it has developed into a general-purpose package that can handle all manner of archives, with destination devices ranging from tape drives to regular files to even stdout. GNU tar has long since been patched to accept gzip options; see below.
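A minimal round trip using the gzip option (directory names under /tmp are illustrative):

```shell
# Pack a directory into a gzipped tarball, then unpack it.
mkdir -p /tmp/tardemo/project
echo "hello" > /tmp/tardemo/project/hello.txt

cd /tmp/tardemo
tar czf project.tar.gz project   # c = create, z = gzip, f = archive file

rm -r project
tar xzf project.tar.gz           # x = extract
cat project/hello.txt
# hello

cd / && rm -r /tmp/tardemo
```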


gzip

The standard GNU/UNIX compression utility, replacing the inferior and proprietary compress.


Shell archiving utility. The files in a shell archive are concatenated without compression, and the resultant archive is essentially a shell script, complete with #!/bin/sh header, and containing all the necessary unarchiving commands. Shar archives still show up in Internet newsgroups, but otherwise shar has been pretty well replaced by tar/gzip. The unshar command unpacks shar archives.


A utility for identifying file types. The command file file-name will return a file specification for file-name, such as ascii text or data. It references the magic numbers found in /usr/share/magic, /etc/magic, or /usr/lib/magic, depending on the Linux/UNIX distribution.


uuencode

This utility encodes binary files into ASCII characters, making them suitable for transmission in the body of an e-mail message or in a newsgroup posting.


uudecode

This reverses the encoding, decoding uuencoded files back into the original binaries.

Example 3-57. uudecoding encoded files


#!/bin/bash
# uudecode all the uuencoded files in the current working directory.

lines=35   # Allow 35 lines for the header (very generous).

for File in *
# Test all the files in the current working directory...
do
  search1=`head -$lines $File | grep begin | wc -w`
  search2=`tail -$lines $File | grep end | wc -w`
  # Files which are uuencoded have a "begin" near the beginning,
  # and an "end" near the end.
  if [ $search1 -gt 0 ]
  then
    if [ $search2 -gt 0 ]
    then
      echo "uudecoding - $File -"
      uudecode $File
    fi
  fi
done

exit 0
more, less

Pagers that display a text file or text streaming to stdout, one page at a time.

jot, seq

These utilities emit a sequence of integers, with a user-selected increment. This can be used to advantage in a for loop.

Example 3-58. Using seq to generate loop arguments


#!/bin/bash

for a in `seq 80`
# Same as for a in 1 2 3 4 5 ... 80 (saves much typing!).
# May also use 'jot' (if present on system).
do
  echo -n "$a "
done

echo

exit 0

The clear command simply clears the text screen at the console or in an xterm. The prompt and cursor reappear at the upper lefthand corner of the screen or xterm window. This command may be used either at the command line or in a script. See Example 3-32.
