Background: I am running unit tests, and one of them requires calling a PostgreSQL function with a large number of URLs (i.e. 2000+). This is extremely slow, as shown in the following Minimal Working Example (MWE).
MWE:
#!/bin/bash
# Generate random 4096 character alphanumeric
# string (upper and lowercase)
URL="http://www.$(cat /dev/urandom | tr -dc 'a-zA-Z0-9' | fold -w $((4096-15)) | head -n 1).com"
# Create a comma separated list of 2000 URLs
for i in $(seq 2000)
do
URLS="$URLS,$URL"
done
We call it and measure the run time like so:
$ time ./generate_urls.sh
real 1m30.681s
user 1m14.648s
sys 0m16.000s
Question: Is there a faster, more efficient way to achieve this same result?
Instead of concatenating over and over, just print them all and store the result.
URLS=$(
    for i in $(seq 2000); do
        printf '%s,' "$URL"
    done
)
echo "${URLS%,}" # Remove the final comma.
This takes less than 2 seconds on my machine. Even when I move the URL generation inside the loop, it takes only about 8 seconds.
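For reference, a minimal sketch of that variant, reusing the /dev/urandom pipeline from the question's MWE (with the redundant cat dropped) and regenerating the URL on every iteration so each of the 2000 entries is distinct:

URLS=$(
    for i in $(seq 2000); do
        # Same random-string generation as the MWE, moved inside the loop.
        URL="http://www.$(tr -dc 'a-zA-Z0-9' < /dev/urandom | fold -w $((4096-15)) | head -n 1).com"
        printf '%s,' "$URL"
    done
)
echo "${URLS%,}"   # Remove the final comma.

The key point in both versions is the same: the commas are emitted inside a single command substitution instead of rebuilding an ever-growing string 2000 times.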