Copyright Issues is a crawler that finds articles online. If you feel that an article of yours should not appear on our site, set up a robots.txt properly and contact us so we can remove your articles and sites from our crawling list.

Blocking our robot with robots.txt

To block our robot, place a robots.txt text file in the root (/) of your website. Note that a Disallow rule is only valid when it follows a User-agent line; the sample below uses "User-agent: *", which blocks all crawlers, including ours. The content of the robots.txt may look like this:
			User-agent: *
			Disallow: /
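If you want to check how a crawler will interpret your robots.txt, here is a minimal sketch using Python's standard-library robots.txt parser. The URL and user-agent token are placeholders for illustration, not our crawler's actual values.

```python
# Sketch: checking a robots.txt rule with Python's standard library.
import urllib.robotparser

parser = urllib.robotparser.RobotFileParser()
# These lines mirror the sample robots.txt above.
parser.parse([
    "User-agent: *",
    "Disallow: /",
])

# With "User-agent: *" and "Disallow: /", every path is off-limits
# to every robot, so can_fetch() returns False for any URL.
print(parser.can_fetch("*", "https://example.com/articles/1"))  # False
```

The same check returns False for any user-agent token, since the "*" group applies to all robots that have no more specific group of their own.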
If you do not want to block our robot, no action is needed.