Purpose:

Remove duplicate URLs from a file, treating URLs that differ by at most N characters as duplicates

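Conceptually, a URL is kept only if it differs from every previously kept URL by more than N characters. The sketch below illustrates that idea in Python using a Levenshtein-style edit distance; the distance function, the script name dry_sketch.py, and the keep/drop rule are assumptions for illustration, and dry's actual comparison may differ.

import sys

def edit_distance(a, b):
    # Classic dynamic-programming Levenshtein distance.
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

def dedupe(urls, n):
    # Keep a URL only if it differs from every kept URL by more than n characters.
    kept = []
    for url in urls:
        if all(edit_distance(url, k) > n for k in kept):
            kept.append(url)
    return kept

if __name__ == "__main__":
    n = int(sys.argv[1])
    urls = [line.strip() for line in sys.stdin if line.strip()]
    print("\n".join(dedupe(urls, n)))

Saved as dry_sketch.py, it mirrors the pipe usage: cat file | python3 dry_sketch.py 50. Note that the pairwise comparison is quadratic in the number of kept URLs, so very large lists will be slow.
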
Usage:

# With a pipe
cat <file> | dry <N>

# With a file
dry <N> <file> [options]

Example:

# Read from a pipe and print to stdout
cat file | dry 50

# Which is the same as
dry 50 file

# Overwrite the file in place
dry 40 $domain/scan/gf/xss.txt -i