</span></h2>
<p>Within our script we will use curl, which allows us to define exactly which output variables are returned. We define these variables in a separate file named <span>curl_format</span>, which the script reads from the current working directory. Within this file add the following:</p>
<pre>%{http_code},%{num_connects},%{num_redirects},%{redirect_url},%{remote_ip},%{remote_port},%{url_effective},%{time_total}</pre>
<h2><span>Input File</span></h2>
<p>Next we create a file containing the websites we want to analyse. In this example we call the file <span>websites.txt</span>.</p>
<pre>https://www.fir3net.com
http://bbc.com
http://fake.com</pre>
<h2><span>Script</span></h2>
<p>Now let's create the script. Create a file called <span>fetch_url_stats.sh</span> and add the following:</p>
<pre>#!/bin/bash

# ASSIGN PATH
PATH=/usr/kerberos/sbin:/usr/kerberos/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin

# CHECK FOR ARGUMENTS
if [ -z "${1}" ]; then
  echo "Usage: $0 [FILENAME]"
  exit 1
fi

# ASSIGN VARIABLE
WEBSITES="${1}"

# PRINT HEADER
echo target_url,http_code,num_connects,num_redirects,redirect_url,remote_ip,remote_port,url_effective,time_total

# FETCH STATS
for target_url in $(cat "${WEBSITES}")
do
  response=$(curl -L -w "@curl_format" -o /dev/null -s "$target_url")
  echo "$target_url,$response"
done</pre>
<h2><span>Example</span></h2>
<p>Below is an example. We provide the file containing the list of websites as an argument; the script runs and the results are output as CSV.</p>
<pre>laptop:~ felix001$ ./fetch_url_stats.sh websites.txt
target_url,http_code,num_connects,num_redirects,redirect_url,remote_ip,remote_port,url_effective,time_total
https://www.fir3net.com,200,2,1,,149.126.74.98,443,https://www.fir3net.com/,3.759
http://bbc.com,200,2,1,,212.58.244.69,80,http://www.bbc.com/,0.933
http://fake.com,200,2,1,,83.138.157.142,80,http://www.fake.com/,0.843</pre>
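<p>Because the script writes plain CSV, the results can be post-processed with standard command-line tools. The sketch below is an illustration, not part of the original script: it assumes the output has been saved to a file named <span>results.csv</span> (a hypothetical filename, populated here with the example rows above), then sorts the sites by <span>time_total</span> (field 9), slowest first, and reports any row whose <span>http_code</span> is not 200.</p>

```shell
#!/bin/sh
# Sample output of fetch_url_stats.sh, saved as results.csv (assumed filename).
cat > results.csv <<'EOF'
target_url,http_code,num_connects,num_redirects,redirect_url,remote_ip,remote_port,url_effective,time_total
https://www.fir3net.com,200,2,1,,149.126.74.98,443,https://www.fir3net.com/,3.759
http://bbc.com,200,2,1,,212.58.244.69,80,http://www.bbc.com/,0.933
http://fake.com,200,2,1,,83.138.157.142,80,http://www.fake.com/,0.843
EOF

# Sort the data rows (skipping the header) by time_total, slowest first.
tail -n +2 results.csv | sort -t, -k9,9 -rn

# Report any site that did not return an HTTP 200.
awk -F, 'NR > 1 && $2 != "200" { print $1 " returned HTTP " $2 }' results.csv
```

<p>Note that the empty <span>redirect_url</span> field still counts as a field, so <span>time_total</span> remains column 9 when splitting on commas.</p>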
<title>Automate/Gather Statistics for Multiple Websites in BASH - Fir3net</title>