Tim Berners-Lee's vision of a web where data is machine-readable and universally linked by meaning.
The Semantic Web is an extension of the World Wide Web envisioned by Tim Berners-Lee in which information is given well-defined meaning, enabling computers and people to work in cooperation. Where the classic web consists of pages designed for human reading, the Semantic Web uses standards like RDF, OWL, and SPARQL to make data machine-readable, interoperable, and linkable across the entire web.
The Semantic Web's foundational standards — RDF, OWL, SPARQL, and Linked Data — have become the backbone of enterprise knowledge graphs and AI interoperability frameworks. While the original vision of a fully semantic public web never materialized, its technologies now power the enterprise AI revolution. The Open Semantic Interchange initiative and the MCP protocol are direct descendants of Semantic Web thinking.
The Semantic Web is built on a layered architecture: URI/IRI (for identifying resources), RDF (for describing resources), RDFS/OWL (for defining vocabularies and ontologies), SPARQL (for querying), and trust/proof layers (for verifying claims). Linked Data principles — using HTTP URIs as names, providing useful information at those URIs, and linking to other URIs — enable the web of data to be navigated like the web of documents.
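The core of this stack can be illustrated with a minimal sketch: an RDF graph is just a set of (subject, predicate, object) triples whose names are HTTP URIs, and a SPARQL basic graph pattern is a match over that set where some positions are variables. The `example.org` URIs below are illustrative placeholders, not real vocabularies.

```python
# Minimal sketch of the RDF data model: a graph is a set of
# (subject, predicate, object) triples, with HTTP URIs as names.
# The example.org URIs are illustrative, not a real vocabulary.
triples = {
    ("http://example.org/Paris", "http://example.org/type", "http://example.org/City"),
    ("http://example.org/Paris", "http://example.org/locatedIn", "http://example.org/France"),
    ("http://example.org/France", "http://example.org/type", "http://example.org/Country"),
}

def match(s=None, p=None, o=None):
    """SPARQL-style basic graph pattern: None plays the role of a variable."""
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# "Which resources are of type City?" — analogous to the SPARQL query
# SELECT ?s WHERE { ?s ex:type ex:City }
cities = [t[0] for t in match(p="http://example.org/type",
                              o="http://example.org/City")]
print(cities)  # ['http://example.org/Paris']
```

Because the object of one triple (`France`) is the subject of another, patterns can be chained into graph traversals — the same property that lets Linked Data be navigated across sites when the URIs actually dereference.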
DBpedia extracts structured data from Wikipedia and publishes it as Linked Data using RDF. This creates a machine-readable knowledge graph of millions of entities — people, places, organizations — that AI systems can query directly. When an AI needs to know the population of Paris or the CEO of Microsoft, it can query DBpedia's SPARQL endpoint rather than scraping Wikipedia.
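As a sketch of that workflow, the snippet below builds a GET request for DBpedia's public SPARQL endpoint asking for the population of Paris. The endpoint URL and the `dbo:`/`dbr:` prefixes follow DBpedia's published conventions, but verify them against the live service before relying on this; no network call is made here.

```python
from urllib.parse import urlencode

# DBpedia's public SPARQL endpoint (per DBpedia's documentation).
ENDPOINT = "https://dbpedia.org/sparql"

# SPARQL query: population of Paris, using DBpedia's ontology (dbo:)
# and resource (dbr:) namespaces.
query = """
PREFIX dbo: <http://dbpedia.org/ontology/>
PREFIX dbr: <http://dbpedia.org/resource/>
SELECT ?population WHERE { dbr:Paris dbo:populationTotal ?population }
"""

# Encode the query as a GET request; an HTTP client such as
# urllib.request would fetch this URL and parse the JSON bindings.
params = urlencode({"query": query,
                    "format": "application/sparql-results+json"})
request_url = f"{ENDPOINT}?{params}"
```

The same pattern works against any SPARQL 1.1 endpoint; only the endpoint URL and the vocabulary prefixes change.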