Oh, and about my previous method: it works well provided you can use it, but it will slow things down a bit if there are a lot of images in a post (after all, it has to check each of them).
Here's a better method that adds the benefit of caching remote images for as long as you want.
1. Open include/parser.php
2. Find, around line 282 (the line number in my previous post was wrong):
//
// Turns an URL from the [img] tag into an <img> tag or a <a href...> tag
//
function handle_img_tag($url, $is_signature = false)
{
	global $lang_common, $pun_config, $pun_user;
3. Replace with:
//
// Turns an URL from the [img] tag into an <img> tag or a <a href...> tag
//
function handle_img_tag($url, $is_signature = false)
{
	global $lang_common, $pun_config, $pun_user;

	$replace = array('%20', ' '); // We don't want spaces in our filenames
	$file = basename(str_replace($replace, '_', $url)); // Get the remote filename, excluding the path
	$expire = 259200; // How long before we download the image again? Defaults to 3 days (3 * 24 * 3600 seconds).
	$hash = md5($url); // Generate an MD5 hash of the file's URL. Helps prevent multiple copies of the same file.
	$localfile = 'cache/img/'.$file; // This is the temporary filename of the local cached copy.
	if (file_exists('cache/img/'.$hash.'_'.$file) && (time() - filemtime('cache/img/'.$hash.'_'.$file) < $expire)) { // Check if the cached image exists and hasn't expired.
		$url = 'cache/img/'.$hash.'_'.$file; // The local copy is fine and hasn't expired, so we don't need to do anything more right now.
	} else {
		$fh = @fopen($localfile, 'wb'); // Prepare for writing
		$remote = @file_get_contents($url); // Get the contents of the remote file
		@fwrite($fh, $remote); // Write the new file...
		@fclose($fh); // ...and now we close it.
		rename($localfile, 'cache/img/'.$hash.'_'.$file); // The temp file is now downloaded, so let's rename it before we continue
		$secure = @getimagesize('cache/img/'.$hash.'_'.$file); // Check the image dimensions. If we can't find them, it's not an image!
		if ($secure === false) {
			@unlink('cache/img/'.$hash.'_'.$file); // The file was not an image, so we delete it for security reasons.
			$url = 'img/warning.png'; // We also provide a warning image. This will show up for any invalid or missing images.
		} else {
			$url = 'cache/img/'.$hash.'_'.$file; // This is a valid image, so we serve the user the cached copy.
		}
	}
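By the way, if the remote host is down, the code above will just write an empty file (which then gets caught and deleted by the getimagesize() check). If you'd rather not touch the disk at all on a failed download, you could replace everything inside the outer else branch with something like this. Just a sketch, not required for the mod to work:

	$remote = @file_get_contents($url); // Get the contents of the remote file
	if ($remote !== false && $remote != '') { // Only write a cache file if we actually got something back
		$fh = @fopen($localfile, 'wb'); // Prepare for writing
		@fwrite($fh, $remote); // Write the new file...
		@fclose($fh); // ...and now we close it.
		rename($localfile, 'cache/img/'.$hash.'_'.$file); // Rename the temp file to its final cached name
		$secure = @getimagesize('cache/img/'.$hash.'_'.$file); // Same image check as above
		if ($secure === false) {
			@unlink('cache/img/'.$hash.'_'.$file); // Not an image, delete it for security reasons
			$url = 'img/warning.png'; // Show the warning image instead
		} else {
			$url = 'cache/img/'.$hash.'_'.$file; // Valid image, serve the cached copy
		}
	} else {
		$url = 'img/warning.png'; // The download failed, so show the warning image instead
	}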
4. Create the following folder: cache/img & chmod it to 777
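If you can't set permissions through your FTP client, you could create the folder with a small one-off PHP script in the forum root instead (just a sketch; delete the script again afterwards):

	<?php
	// One-off helper (not part of the mod itself): creates the cache folder and
	// opens up its permissions, since mkdir()'s mode gets filtered by the umask.
	if (!is_dir('cache/img'))
		mkdir('cache/img', 0777, true);
	chmod('cache/img', 0777);
	echo 'cache/img is '.(is_writable('cache/img') ? 'writable' : 'NOT writable');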
5. Create an .htaccess file in the above folder with the following content:
<Limit GET POST>
Order Allow,Deny
Allow from All
</Limit>
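That simply makes sure the cached images can be requested directly. If PHP runs as an Apache module on your host, you could also add the line below to the same .htaccess so nothing in the cache folder can ever be executed as a script (leave it out if PHP runs as CGI/FastCGI, since the directive would cause an error there):

php_flag engine off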
6. Create an image with your warning text, name it warning.png and upload it to your img directory (if you'd rather generate it with PHP, see the sketch below the steps).
7. Save & upload.
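For step 6, if you don't feel like firing up an image editor, here's a quick way to generate a basic warning.png with PHP's GD extension; the size, colours and text are just examples. Run it once from the forum root and then delete it:

	<?php
	// One-off helper: generates a simple placeholder image and saves it as img/warning.png
	$im = imagecreate(200, 50);
	$bg = imagecolorallocate($im, 255, 255, 255); // white background (the first allocated colour becomes the background)
	$fg = imagecolorallocate($im, 200, 0, 0);     // dark red text
	imagestring($im, 3, 10, 18, 'Image not available', $fg);
	imagepng($im, 'img/warning.png');
	imagedestroy($im);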
This will be a *lot* faster than my previous "fix", since the files are cached locally and only re-downloaded once the cached copy is older than the expiry time (3 days by default).
Edit: Fixed script.