Apple Expands Use of Maps Imagery for AI Training
Apple has announced that, beginning in March 2025, it will use imagery collected for the Look Around feature in Apple Maps to train artificial intelligence models. The company recently updated its website to explain how the collected data will be used to improve image recognition, generation, and enhancement capabilities across its products and services.
Look Around, Apple’s answer to Google Maps’ Street View, provides users with street-level imagery for navigation and exploration. The company uses a fleet of vehicles, as well as pedestrian surveyors, equipped with advanced cameras, sensors, iPhones, and iPads to capture high-quality images and 3D data used to improve Apple Maps.
To address privacy concerns, Apple emphasizes that faces and license plates in Look Around imagery are blurred before publication, and only this blurred imagery will serve as the foundation for AI model training. Residents can also request additional blurring of their homes, though this is applied only on request rather than by default.
The move aligns with Apple’s ongoing efforts to leverage AI in its products. The company’s Apple Intelligence already utilizes AI image generation models in features such as Image Playground and the Clean Up tool in the Photos app. Additionally, advanced image recognition technology has significantly improved search capabilities within the Photos application.
While Apple has not specified which particular models will benefit from this new data source, industry experts anticipate that the company will provide more details as the implementation date approaches. This development marks a significant step in Apple’s strategy to enhance its AI capabilities and maintain competitiveness in the rapidly evolving tech landscape.
As March 2025 draws closer, users and industry watchers alike will be keen to see how Apple harnesses this vast collection of visual data to improve its services and potentially introduce new AI-powered features across its ecosystem.