Contents

  1. Changing case in filenames
  2. Useful sed stuff
  3. dd and tar
  4. More useful sed/awk stuff
  5. Add a user to an ACL
  6. Words 4 passwords
  7. Password matching
  8. Add up filesystem usage (on jupiter)
  9. find and stuff
  10. Interesting awk stuff

    a) Strip blank lines out of a text file

    b) Add up file sizes and get total in Gb

  11. Determine directory sizes
  12. Restore GNU tar tape with wildcards
  13. Adding up directory sizes

1) Changing case in filenames

mclaren# foreach file (`ls IL*`)
? set from = $file
? set to = `echo $file | tr IL il` 
? echo $from $to
? mv $from $to
? end

Changing filenames (similar to above)

38 colinbr@williams> foreach file (`ls *.test`)
? set from = $file
? set to = `echo $file | sed -e 's/_toyota//'`
? echo $from $to
? mv $from $to
? end

Filenames with spaces ...

Consider the directory listing

119 colinbr@williams> lsl
total 10
drwxr-xr-x   2 colinbr  it           512 Mar 15 14:28 ./
drwxrwxr-x  10 colinbr  it          1024 Mar 15 14:26 ../
-rw-r--r--   1 colinbr  it            40 Mar 15 14:28 new net space
-rw-r--r--   1 colinbr  it            37 Mar 15 14:27 space one
-rw-r--r--   1 colinbr  it            37 Mar 15 14:27 text gap

The problem is to write a loop that renames every filename containing spaces, replacing each space " " with an underscore "_".

53 colinbr@williams> foreach file (`ls`)
? set from = "$file"
? set to = `echo $file | tr ' ' _`
? echo $from $to
? end

This fails because the word list produced by ls is split at each space, so $file never holds a complete filename; a little wizardry is needed.

122 colinbr@williams> foreach foo (`ls | sed -e 's/ /_/g'`)
? set bar = `echo $foo | tr _ ' '`
? echo "$bar $foo"
? cp "$bar" "$foo"
? end
new net space new_net_space
space one space_one
text gap text_gap
123 colinbr@williams> ls -la
total 16
drwxr-xr-x   2 colinbr  it           512 Mar 15 14:41 .
drwxrwxr-x  10 colinbr  it          1024 Mar 15 14:26 ..
-rw-r--r--   1 colinbr  it            40 Mar 15 14:28 new net space
-rw-r--r--   1 colinbr  it            40 Mar 15 14:41 new_net_space
-rw-r--r--   1 colinbr  it            37 Mar 15 14:27 space one
-rw-r--r--   1 colinbr  it            37 Mar 15 14:41 space_one
-rw-r--r--   1 colinbr  it            37 Mar 15 14:27 text gap
-rw-r--r--   1 colinbr  it            37 Mar 15 14:41 text_gap

The loop does a copy rather than a rename, but it shows the principle. A Bourne-shell version that does the rename (and also strips hyphens and hash characters from the names) is:

$ for foo in `ls | sed -e 's/-//g' -e 's/#//' -e 's/ /_/g'`
> do
> bar=`echo $foo | tr _ ' '`
> echo "$bar $foo"
> mv "$bar" "$foo"
> done

2) Useful sed stuff

ls -1d https* | sed -e 's/https-//' -e 's/80\.//' -e 's/808[56]\.//'
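
For example, with some hypothetical web-server instance directories (the real names will differ), the sed strips the https- prefix and the port components:

$ ls -1d https*
https-80.intranet
https-8085.devbox
https-www.example.com
$ ls -1d https* | sed -e 's/https-//' -e 's/80\.//' -e 's/808[56]\.//'
intranet
devbox
www.example.com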

3) dd and tar

Three archives on a tape

For a tape that was written by piping tar through ssh to dd on a remote tape drive, using a command like:

tar -cvf - . | ssh remote-machine 'dd of=/dev/rmt/0n bs=10k conv=sync,whole'

Extract the three archives with a loop of the form (each pass reads the next file from the no-rewind device):

foreach arch (1 2 3)
dd if=/dev/rmt/2n bs=10k conv=sync | tar xvf -
end
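
To go straight to a particular archive instead, one option is to skip over the intervening filemarks with mt and then read; e.g., a sketch for listing the second archive (adjust the device and count to suit):

mt -f /dev/rmt/2n fsf 1
dd if=/dev/rmt/2n bs=10k | tar tvf -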

4) More useful sed/awk stuff

Which users, in a given group, are in which other groups?

mclaren# ypmatch contract group | awk -F: '{print $4}' | sed -e 's/[,:]/ /g'
michaels heathere melinda paulm robn daniel ehodges michaell nigelba clarem douglas lawrence geoff anthony dani

Then:

mclaren# groups `ypmatch contract group | awk -F: '{print $4}' | sed -e 's/[,:]/ /g'`
michaels : dc contract
heathere : dc contract
melinda : dc contract
paulm : dc contract
robn : dc contract 2pub
daniel : dc contract 2pub
ehodges : dc contract 2pub
michaell : dc contract 2pub
nigelba : dc contract 2pub
clarem : dc contract 2pub
douglas : dc contract 2pub
lawrence : dc contract 2pub
geoff : dc contract 2pub
anthony : dc contract 2pub
dani : dc contract

5) Add a user to an ACL

mclaren# setfacl -m user:cmcivor:r-x reports

See also ../solaris_acls
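
To check the result, getfacl lists the ACL entries on the file:

mclaren# getfacl reports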


6) Words 4 passwords

Can we randomly choose four-letter words from /usr/dict/words and combine them with a digit to get a password?

In ksh

$ echo $RANDOM
8328
$ echo $RANDOM / 31
23558 / 31
$ a=`expr $RANDOM / 31` ; echo $a
1004
$ 
$ a=`expr $RANDOM \* 500` ; echo $a
7801000

$ nl /usr/dict/words | grep $RANDOM
  1666  Australis
 11666  incantation
 16660  parsimony
 16661  parsley
 16662  parsnip
 16663  parson
 16664  parsonage
 16665  Parsons
 16666  part
 16667  partake
 16668  Parthenon
 16669  partial
 21666  stiffen

$ a=`expr $RANDOM` ; b=`expr $RANDOM` ; c=`expr $a / $b`; echo $a $b $c
11371 21354 0

$ s=`nl /usr/dict/words | grep $RANDOM | awk '{print $2}'`
$ for word in $s
> do
>    echo $word | awk 'length <= 4 { print }'
> done

    24  paste *.tmp | awk '{print $2,$4}'
    25  paste *.tmp | awk '{print $2,$4}' | uniq

This information eventually became the pwgen script.
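
A minimal ksh sketch along those lines (not the actual pwgen script, and it assumes /usr/dict/words is present):

#!/bin/ksh
# Pick a random four-letter word and append a random digit.
count=`awk 'length($0) == 4' /usr/dict/words | wc -l`
pick=`expr $RANDOM % $count + 1`
word=`awk 'length($0) == 4' /usr/dict/words | sed -n "${pick}p"`
digit=`expr $RANDOM % 10`
echo "${word}${digit}"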


7) Password matching

Given a file of lusernames, try the following to see if they have passwords set:

9 colinbr@williams> foreach luser (`cat nt_lusers`)
? ypmatch -k $luser passwd | awk -F: '{print $1,$3}'
? end
otilia *LK*
stuartb mHlUCAufjGZqs
melinda *LK*
jasonc 6TtPRachQ9yZ2
lucyco *LK*
janec *LK*
daniel *LK*
piali *LK*
markd fcfWcoPNSVLd2
anned S2rhzf0/dD3Qs
jennie *LK*
hannahd *LK*
clemency *LK*
dani Xcmk8HIKhOpxM
geoff *LK*
michaell *LK*
Can't match key philipl in map passwd.byname.  Reason: no such key in map.
george VT4mwY/BAKar6
joannam 8dl4cXbe5fJzg
seanm *LK*
mattr DREyWZt4b7rac
johnw *LK*
simony *LK*
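
A Bourne-shell variant of the loop that just counts the locked (*LK*) accounts:

$ for luser in `cat nt_lusers`
> do
>   ypmatch $luser passwd
> done | awk -F: '$2 == "*LK*"' | wc -l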

8) Add up filesystem usage (on jupiter)

Given:

35 colinbr@jupiter> df -kF vxfs
Filesystem            kbytes    used   avail capacity  Mounted on
/dev/vx/dsk/mysqldg/mysqldg01v01
                     17641472  882900 16501104     6%    /export/mysql01
/dev/vx/dsk/nfsdg/nfsdg08v01
                     140500608 95783712 44373360    69%    /export/nfs08
/dev/vx/dsk/nfsdg/nfsdg07v01
                     140500608 112782640 27502168    81%    /export/nfs07
/dev/vx/dsk/nfsdg/nfsdg06v01
                     140500608 139354656 1137296   100%    /export/nfs06
/dev/vx/dsk/nfsdg/nfsdg05v01
                     140500608 138131456 2352008    99%    /export/nfs05
/dev/vx/dsk/nfsdg/nfsdg04v01
                     140500608 120200832 20299776    86%    /export/nfs04
/dev/vx/dsk/nfsdg/nfsdg03v01
                     140500608 135679696 4785120    97%    /export/nfs03
/dev/vx/dsk/nfsdg/nfsdg02v01
                     140500608 137334208 3143976    98%    /export/nfs02
/dev/vx/dsk/nfsdg/nfsdg01v01
                     140500608 135558064 4913736    97%    /export/nfs01

Strip off the header line and add up column 4 to get the disk usage over all these file systems. Use:

38 colinbr@jupiter> df -kF vxfs | tail +2 | awk '{ m += $4 } END { print m }'
1015708692
39 colinbr@jupiter> 

Why add up column 4, not column 3? Because the Filesystem name counts as column 1, as the output below (with each file system on a single line) shows:

8 colinbr@saturn> df -kF vxfs | tail +2  
/dev/vx/dsk/nfsdg/nfsdg06v01 140500608 135386088 5114520    97%    /export/nfs06
/dev/vx/dsk/nfsdg/nfsdg08v01 140500608 121158624 19248112    87%    /export/nfs08
/dev/vx/dsk/nfsdg/nfsdg07v01 140500608 125346504 15036336    90%    /export/nfs07
/dev/vx/dsk/nfsdg/nfsdg03v01 140500608 133569776 6883832    96%    /export/nfs03
/dev/vx/dsk/nfsdg/nfsdg02v01 140500608 133622784 6843976    96%    /export/nfs02
/dev/vx/dsk/nfsdg/nfsdg05v01 140500608 132158936 8341672    95%    /export/nfs05
/dev/vx/dsk/nfsdg/nfsdg09v01 140500608 130249444 9610529    94%    /nfs09
/dev/vx/dsk/nfsdg/nfsdg01v01 140500608 131138544 9362064    94%    /export/nfs01
/dev/vx/dsk/nfsdg/nfsdg04v01 140500608 132524608 7976000    95%    /export/nfs04
/dev/vx/dsk/mysqldg/mysqldg01v01 17641472 13203632 4372068    76%    /export/mysql01
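
To report the same total in Gb, do the division in the END block:

df -kF vxfs | tail +2 | awk '{ m += $4 } END { print m/1024/1024 }'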

9) find and stuff

Given a directory full of files, find files of the same name scattered across a user's home directory:

$ for file in *
> do
> echo $file
> find /home/colinbr -name "$file"
> done

If a duplicate is found, can we then get the loop to mv/rm/cp the file from the current working directory?

See the script /home/colinbr/bin/rm_dups
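
A minimal Bourne-shell sketch of that idea (the rm_dups script itself may well do more), printing each duplicate and leaving the destructive step commented out; note that if the current directory lives under /home/colinbr, find will also report the local copy itself:

$ for file in *
> do
>   match=`find /home/colinbr -name "$file" -print`
>   if [ -n "$match" ]
>   then
>     echo "duplicate: $file -> $match"
>     # rm "$file"    # uncomment to remove the copy in the current directory
>   fi
> done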


10) Interesting awk stuff

a) Strip blank lines out of a text file

27 colinbr@williams> awk '{ if (NF != 0) {print} }' < opera.adr

or

28 colinbr@williams> 
       awk '{ if (NF != 0) {print} }' < opera.adr > opera.adr.strip
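
Since an awk pattern with no action prints the matching line, the same thing can be written more briefly:

awk 'NF > 0' opera.adr > opera.adr.strip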

b) Add up file sizes and get total in Gb

For files of the form

applchad@bmw> ls -l 0000023987.ARC
-rw-r-----   1 orachad  dba      10486272 Mar 20 07:46 0000023987.ARC
applchad@bmw> 

Use:

applchad@bmw> du -sk *.ARC | awk '{ n += $1 } END {print n/1024/1024}'
5.49609
applchad@bmw> 

For files of the form

bmw# ls -l chad_redo.tar
-rw-r--r--   1 applchad dba      5214197760 Mar 26 17:34 chad_redo.tar
bmw# 

use

bmw# echo 5214197760 |awk '{ n += $1 } END {print n/1024/1024/1024}'
4.57666
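
Alternatively, the byte sizes can be summed straight from an ls listing (the size is column 5):

ls -l *.ARC | awk '{ n += $5 } END { print n/1024/1024/1024 }'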

11) Determine directory sizes

Consider the directory

mclaren# pwd
/dc/tmp5
mclaren# ls
pcift0  pcift4

Rather than mucking about with commands of the form

mclaren# cd pcift0
mclaren# du -sk `ls -la | grep '^d' | awk '{print $9}'`

Use something like

mclaren# cd /dc/tmp5
mclaren# ls
pcift0  pcift4
mclaren# find pcift0/* -type d -prune -exec du -sk {} \;

The important bit is the -prune, which stops find descending into each directory it finds under pcift0. The output looks something like this:

29776150        pcift0/02-04
24580   pcift0/03-01
1401354 pcift0/1021
40      pcift0/194-016
19      pcift0/194-017
46561   pcift0/194-018
47401   pcift0/194-019
1021438 pcift0/194-042a
19374108        pcift0/NEW
16993697        pcift0/UPLOAD
9129745 pcift0/apexscanning
767443  pcift0/batch72a
714958  pcift0/batch74a
52634676        pcift0/cd_restore
19102506        pcift0/handover0303
1757459 pcift0/m041
12      pcift0/newft
1128397 pcift0/old
882989  pcift0/oldbatches
1825    pcift0/pcift-dlt1
1371085 pcift0/xmltemp
mclaren# 

12) Restore GNU tar tape with wildcards

This was used for EEBO data sent from AA. The tape is in tar format: a single archive containing several directories. It needs to be restored with GNU tar so that wildcards can be used. A listing of the archive shows:

drwxrwxrwx 4050/405      0 Nov  8 18:29 2002 digdis/export/
-rwxrwxrwx 200/40   12579 Nov  8 17:56 2002 digdis/export/exp_eebo.log
-rwxrwxrwx 200/40  1131668480 Nov  8 17:56 2002 digdis/export/exp_eebo.dmp
-rwxrwxrwx 200/40     724 Nov  8 18:03 2002 digdis/export/config.ora
-rwxrwxrwx 200/40    3915 Nov  8 18:03 2002 digdis/export/initeebo.ora
-rwxrwxrwx 200/40   10437 Nov  8 18:29 2002 digdis/export/eebo_datafiles.txt

Restore these with a command of the form:

/usr/local/bin/tar xvf /dev/rmt/0 '*export*'
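
To preview what the pattern would restore without writing anything to disk, list with the same wildcard first:

/usr/local/bin/tar tvf /dev/rmt/0 '*export*'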

13) Adding up directory sizes

mclaren# pwd
/dc/tmp5/pcift0/handover0403/img_sarah
mclaren# ls -aF
./         0072APEX/  6017APEX/  c446APEX/  n231APEX/
../        1090APEX/  c420APEX/  j552APEX/  n247APEX/

a) Get individual directory sizes with

mclaren# du -sk *
903365  0072APEX
587290  1090APEX
448085  6017APEX
135545  c420APEX
439927  c446APEX
510814  j552APEX
296254  n231APEX
346353  n247APEX

b) Get the total directory size with

mclaren# du -sk * | awk '{ k += $1 } END { print k }'
3667633

c) Or, perhaps more simply (the odd kilobyte out is the top-level directory itself),

mclaren# du -sk .
3667634 .
mclaren# 

However, b) is useful for a list of directories that aren't subdirectories of the current working directory, e.g.

mclaren# du -sk ./n247APEX ../img_gs/3188APR | awk '{ k += $1 } END { print k }'
855491
mclaren#