I gather the original poster is talking about his experience of having his forum content 'scraped' by someone, then finding that his content (posts etc.) was being used to populate another forum. Sadly, the tools for stealing other people's content this way are readily available on the web.
It's a big problem that can be difficult to fight off, especially as a lot of these scrapers do bad things like:
- fail to respect robots.txt,
- attempt to cloak themselves behind bogus user agents, and
- access you from a variety of changing IP addresses.
I would look at your server logs for the day(s) the scraper in question came to visit, and see if there is anything in them that would help you block it in future (by user agent, IP address, etc.), whether in your robots.txt file, .htaccess, mod_security filter settings, your firewall, or whatever other means are at your disposal.
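If you want to automate that log review, here is a minimal sketch in Python. It assumes the common Apache/Nginx "combined" log format; the function name and the threshold of 100 requests are my own inventions, so adjust them to suit your traffic.

```python
import re
from collections import Counter

# Matches the Apache/Nginx "combined" access-log format (an assumption;
# check your own LogFormat directive and adjust the pattern if needed).
LOG_LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[[^\]]+\] "(?P<request>[^"]*)" '
    r'\d{3} \S+ "(?P<referer>[^"]*)" "(?P<agent>[^"]*)"'
)

def suspicious_visitors(lines, threshold=100):
    """Return (ip, user_agent) pairs seen more than `threshold` times,
    busiest first. Unusually busy visitors are scraper candidates."""
    hits = Counter()
    for line in lines:
        m = LOG_LINE.match(line)
        if m:
            hits[(m.group("ip"), m.group("agent"))] += 1
    return [(pair, n) for pair, n in hits.most_common() if n > threshold]
```

You would feed it your access log, e.g. `suspicious_visitors(open("/var/log/apache2/access.log"))` (path will vary by server). Anything it flags is worth a closer look before you block, since a busy IP can also be a legitimate crawler or a shared proxy.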
To get a good lesson in the scope of the problem, and some of the solutions, I heartily recommend reading and keeping abreast of IncrediBILL's blog.
He is about the most vocal enemy that scrapers and bad bots have on the web. He also says he is developing a tool to help content authors defeat them which, if it works, will certainly be worth looking out for.
Some of the services offered by Copyscape - http://www.copyscape.com - may also be useful to you if you are worried about this happening again. They can help alert you to people ripping off your content.