Search engines dislike finding the same content on two different URLs, largely because crawling, caching, and indexing become cumbersome when duplicates must be filtered out. If they did not filter them, similar-looking pages would crowd the SERPs (search engine result pages) and degrade the searcher experience.
Not all duplicate content is created knowingly. Content management systems, both custom-built and open source, tend to generate duplicate content for a variety of reasons, such as session IDs appended to URLs or the same page being reachable under multiple paths. Webmasters should closely monitor the URL output of their CMS, run reports, and check Google Webmaster Tools to diagnose any content problems.
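As a complement to CMS reports and Google Webmaster Tools, a simple script can flag URLs on your own site that serve identical content. The sketch below is a minimal, hypothetical example: it assumes you already have page bodies in hand (e.g. from a crawl) and groups URLs by a content fingerprint; the function name and sample URLs are illustrative, not part of any real tool.

```python
import hashlib
from collections import defaultdict

def find_duplicates(pages):
    """Group URLs that serve the same content after whitespace normalization.

    `pages` maps URL -> raw HTML/text. Returns a list of URL groups
    that share a content fingerprint (only groups with 2+ URLs).
    """
    groups = defaultdict(list)
    for url, body in pages.items():
        # Normalize whitespace so trivial formatting differences
        # don't hide otherwise identical pages.
        normalized = " ".join(body.split())
        fingerprint = hashlib.sha256(normalized.encode("utf-8")).hexdigest()
        groups[fingerprint].append(url)
    return [urls for urls in groups.values() if len(urls) > 1]

# Example: the same article reachable under two URLs,
# a common symptom of CMS-generated session IDs.
pages = {
    "https://example.com/article": "<p>Hello world</p>",
    "https://example.com/article?sessionid=123": "<p>Hello  world</p>",
    "https://example.com/about": "<p>About us</p>",
}
print(find_duplicates(pages))
```

Running this against a real crawl would surface duplicate groups that are candidates for canonicalization or redirects; exact-hash matching only catches identical pages, so near-duplicates would need a fuzzier comparison.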