Speaking with about ten years of implementation experience, I'd agree that standardising on the same terms is a good thing - but this requires management and publication of the semantic framework, making explicit the vocabularies that metadata uses.
The other big issue relates to the "granularity" of resources. I could create metadata for each feature, or register a WFS that serves a collection of features. We clearly need a flexible approach, so metadata profiles and feature metadata should logically be part of a coherent information architecture.
When it comes to discovery, we find that in a services world it is not the content of metadata descriptions that proves useful; it's the standards they refer to that allow us to access the data (protocol and data standards, and over time access arrangements, data quality etc. may become more relevant). This means that the actual discovery process tends towards querying relationships between pieces of metadata addressing separate concerns.
This is actually quite close to the Semantic Web, in that each such relationship is an "assertion" that forms part of a knowledge base. At the moment we need to make such assertions to support simple discovery and utilisation. It's possible that automated inference will remain a pipe dream - but it certainly will not happen without frameworks that generate meaningful assertions as part of day-to-day operations.
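The relationship-querying style of discovery described above can be sketched as a tiny triple store. This is only an illustration: the service name `roads-wfs`, the predicates, and the standard identifiers below are hypothetical, not drawn from any real catalogue.

```python
# Each metadata relationship is an (subject, predicate, object) assertion,
# as in RDF. All names and predicates here are hypothetical examples.
assertions = [
    ("roads-wfs", "implementsProtocol", "OGC:WFS-1.1"),
    ("roads-wfs", "servesFeatureType", "Road"),
    ("Road", "conformsTo", "gml:AbstractFeatureType"),
    ("roads-wfs", "hasAccessConstraint", "public"),
]

def query(triples, subject=None, predicate=None, obj=None):
    """Return triples matching the given pattern; None acts as a wildcard."""
    return [
        (s, p, o)
        for (s, p, o) in triples
        if (subject is None or s == subject)
        and (predicate is None or p == predicate)
        and (obj is None or o == obj)
    ]

# Discovery by standard rather than by description text:
# which services implement a given protocol standard?
services = [s for (s, p, o) in
            query(assertions, predicate="implementsProtocol", obj="OGC:WFS-1.1")]
print(services)  # → ['roads-wfs']
```

The point is that the query is over relationships to standards (protocol, feature type, access constraints), not over free-text descriptions - exactly the kind of assertion base that day-to-day operations would need to generate for inference to ever become practical.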
The history of Dublin Core is also instructive. We tend to rely on tools like Google that use statistical analysis rather than formal metadata. Metadata does, however, work well within systems with coherent governance - where there is also a driver for content creators to see their content properly classified so that it appears in the right place. I don't think we should assume any more noble behaviour from the geospatial community - so what's the carrot?