As a university exercise, I have to run a bash script that reverse-looks-up all the DNS entries for a class B network block the university owns.
This is the fastest version I have, but it still takes forever. Any help optimising this code?
#!/bin/bash
network="a.b"
CMD=/usr/bin/dig
for i in $(seq 1 254); do
  for y in $(seq 1 254); do
    answer=$("$CMD" -x "$network.$i.$y" +short)
    echo "$network.$i.$y resolves to $answer" >> hosts_a_b.txt
  done
done
Using GNU xargs to run 64 processes at a time might look like:
#!/usr/bin/env bash
lookupArgs() {
  for arg; do
    # echo the entire line together to ensure atomicity
    echo "$arg resolves to $(dig -x "$arg" +short)"
  done
}
export -f lookupArgs

network="a.b"
for (( x=1; x<=254; x++ )); do
  for (( y=1; y<=254; y++ )); do
    printf '%s.%s.%s\0' "$network" "$x" "$y"
  done
done | xargs -0 -P64 bash -c 'lookupArgs "$@"' _ >hosts_a_b.txt
Note that this doesn't guarantee output order (and it relies on the lookupArgs function doing one write() syscall per result), but each line starts with the IP address, so the output is sortable and you should be able to reorder it afterwards. Alternatively, one could get ordered output (and guaranteed atomicity of results) by switching to GNU parallel -- a large Perl script, versus GNU xargs' small, simple, relatively low-feature implementation.
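Since every line begins with the IP address, restoring numeric order is a one-liner with sort. A minimal sketch (the sample addresses and hostnames below are placeholders; in practice you would feed it the hosts_a_b.txt produced above):

```shell
# Reorder results by the last two octets. With "." as the field separator,
# a.b.x.y splits into fields 1-4, so -k3,3n and -k4,4n sort numerically
# on x and y (numeric comparison stops at the first non-digit).
printf '%s\n' \
  'a.b.2.10 resolves to host-c.example.' \
  'a.b.1.3 resolves to host-b.example.' \
  'a.b.1.2 resolves to host-a.example.' |
sort -t . -k 3,3n -k 4,4n
```

On the real file that would be `sort -t . -k 3,3n -k 4,4n hosts_a_b.txt > hosts_a_b.sorted.txt`. GNU parallel can instead preserve input order as it runs (its `-k` flag keeps output in input order), so no post-sort would be needed, though the exact invocation depends on your parallel version.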