I'm writing a tutorial on efficiently checking and fixing broken links with Xenu's Link Sleuth. I would like to fetch and print the title of every link in the link report. To be specific, I would manually remove all the broken URLs from the report, leave the ones that are OK, and then review the fetched titles to spot parked domains and the like.

I would like this script to be dynamic, so it only fetches titles as I scroll down the page (lazy loading). Also, to minimize bandwidth, the PHP script should stop reading the remote page once it reaches the closing title tag.

For now, I've only managed to insert something static after every external link on the page:

Code:
 $(document).ready(function() {
        // Match external links only: the second selector was missing the
        // "a" prefix, so it matched any element with an https href.
        $("a[href^='http://'], a[href^='https://']")
            .not("[href*='" + location.hostname + "']")
            .after('Test');
});
I would then have to .load() via AJAX, after each link, the output of:
fopen_testv3.php?link=
with the correct link appended.
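Here is a rough sketch of how that lazy fetching could work, assuming fopen_testv3.php echoes the title as plain text. buildTitleUrl() and the "title-loaded" class are names I made up for this example:

```javascript
// Sketch of lazy title fetching. Assumes fopen_testv3.php echoes the
// title as plain text; buildTitleUrl() and the "title-loaded" class
// are invented names, not part of any library.

// Build the request URL for a given external link (kept pure so it is
// easy to test in isolation).
function buildTitleUrl(href) {
    return "fopen_testv3.php?link=" + encodeURIComponent(href);
}

// Browser-only wiring: fetch each title the first time its link
// scrolls into view.
if (typeof window !== "undefined" && window.jQuery) {
    jQuery(function ($) {
        var $links = $("a[href^='http://'], a[href^='https://']")
            .not("[href*='" + location.hostname + "']");

        function loadVisibleTitles() {
            var viewBottom = $(window).scrollTop() + $(window).height();
            $links.not(".title-loaded").each(function () {
                var $link = $(this);
                if ($link.offset().top <= viewBottom) {
                    $link.addClass("title-loaded"); // fetch only once
                    $("<span/>").insertAfter($link)
                                .load(buildTitleUrl($link.attr("href")));
                }
            });
        }

        $(window).on("scroll", loadVisibleTitles);
        loadVisibleTitles(); // handle links already visible on page load
    });
}
```

The "title-loaded" class doubles as a seen-marker, so scrolling up and down doesn't re-request the same titles.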

My current PHP code could probably use some improvement too; suggestions about error handling, bandwidth minimization, etc. are welcome:

PHP Code:
$url = $_GET["link"];

function getTitle($url) {
    // Suppress the warning; failure is handled explicitly below.
    $fh = @fopen($url, "r");
    if ($fh === false) {
        return "(could not open URL)";
    }
    // Read at most the first 1024 bytes to save bandwidth.
    $str = stream_get_line($fh, 1024);
    fclose($fh);
    $str2  = strtolower($str);
    $start = strpos($str2, "<title>");
    $end   = strpos($str2, "</title>");
    if ($start === false || $end === false) {
        return "(no title found in the first 1024 bytes)";
    }
    $start += 7; // skip past "<title>"
    return substr($str, $start, $end - $start);
}

echo getTitle($url);
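For the bandwidth side, one possible approach is cURL with a write callback that aborts the transfer as soon as the closing title tag appears, so pages with a title deep past the first 1024 bytes still work without downloading the whole document. This is only a sketch assuming the cURL extension is available; extractTitle() and getTitleCurl() are my own names:

```php
<?php
// Pull the title out of a (possibly partial) HTML string,
// case-insensitively.
function extractTitle($html) {
    if (preg_match('/<title[^>]*>(.*?)<\/title>/is', $html, $m)) {
        return trim($m[1]);
    }
    return "(no title found)";
}

// Download only until </title> appears, then abort the transfer.
function getTitleCurl($url) {
    $buf = "";
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);
    curl_setopt($ch, CURLOPT_WRITEFUNCTION, function ($ch, $chunk) use (&$buf) {
        $buf .= $chunk;
        // Returning a length other than strlen($chunk) makes cURL
        // abort the download.
        return (stripos($buf, "</title>") !== false) ? -1 : strlen($chunk);
    });
    curl_exec($ch);
    curl_close($ch);
    return extractTitle($buf);
}
```

The timeouts also keep one dead host in the report from stalling the whole page.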