Users of Ring's semantic video search can quickly find specific moments in their footage, such as "a dog in my backyard" or "a package delivery." The system is built on PostgreSQL with the pgvector extension: visual content is represented as vector embeddings, which allows efficient storage and similarity search. When a user submits a query, the system converts it into a vector and retrieves similar video frames in under two seconds. The service operates at global scale, serving billions of read requests daily across four continents and nine AWS regions, with strict latency requirements for millions of users. Key metrics include 100–200 billion stored embeddings, roughly 2 billion new embeddings generated daily, and a data footprint of 140–150+ TB across three PostgreSQL clusters.
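The retrieval flow described above can be sketched in miniature. This is a hypothetical in-memory illustration, not Ring's implementation: frame embeddings are stored as vectors, a text query is embedded into the same space, and the nearest frames are returned by cosine distance (the metric behind pgvector's `<=>` operator, which would compute this ranking server-side in the real system). The frame ids, vectors, and `top_k` helper are all invented for the example.

```python
import math

# In production this ranking would run inside PostgreSQL, e.g.:
#   SELECT frame_id FROM frames ORDER BY embedding <=> %(query_vec)s LIMIT 5;
# Below is a pure-Python stand-in for that query.

def cosine_distance(a, b):
    """Cosine distance, the metric behind pgvector's <=> operator."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return 1.0 - dot / (norm_a * norm_b)

def top_k(query_vec, frames, k=2):
    """Return the ids of the k frames whose embeddings are closest to the query."""
    ranked = sorted(frames, key=lambda f: cosine_distance(query_vec, f["embedding"]))
    return [f["frame_id"] for f in ranked[:k]]

# Toy 3-dimensional embeddings; real embeddings have hundreds of dimensions.
frames = [
    {"frame_id": "dog_backyard",     "embedding": [0.9, 0.1, 0.0]},
    {"frame_id": "package_delivery", "embedding": [0.0, 0.2, 0.9]},
    {"frame_id": "empty_porch",      "embedding": [0.1, 0.9, 0.1]},
]

# Stand-in for the embedded text query "a dog in my backyard".
query_vec = [1.0, 0.0, 0.1]
print(top_k(query_vec, frames, k=1))  # → ['dog_backyard']
```

The point of the sketch is the division of labor: the application only embeds the query text; distance computation and ranking stay next to the data, which is what makes the sub-two-second latency achievable at this scale.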