Google engineer explains how AI-powered Immersive View works in Google Maps
Google Maps has been working on simplifying navigation by changing how content is represented visually on the map. This includes new colors for water and green areas, as well as plans to revamp the navigation display, although those changes have not rolled out yet. The most significant update, however, came in February, when Google introduced Immersive View for routes. Now, nearly a year after its launch, we finally get a look at how the feature works.
Immersive View in Google Maps
Maps’ Immersive View is a mix of standard navigation and Street View, with some clever computer processing thrown in. It presents a top-down view, but the scenery you see may be built from multiple computer-generated renderings. Google engineer Daniel Filip explained how it works in an interview with CNET.
Google uses cameras mounted on cars, airplanes, and backpacks to capture images for Street View. It then combines these photos with aerial footage to build 3D models of the locations.
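As a rough illustration of how overlapping photos taken from different positions can be turned into 3D geometry, the sketch below triangulates a single point seen by two cameras using the standard linear (DLT) method. The camera matrices and pixel coordinates are invented for the example and are not taken from Google's pipeline.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Estimate a 3D point from its pixel coordinates in two images.

    P1, P2 are 3x4 camera projection matrices; x1, x2 are (u, v) pixel
    coordinates of the same landmark seen from two viewpoints.
    """
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The 3D point is the right singular vector with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # convert from homogeneous coordinates

# Hypothetical setup: one camera at the origin, a second one offset sideways.
K = np.array([[1000, 0, 640], [0, 1000, 480], [0, 0, 1]], dtype=float)
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-2.0], [0.0], [0.0]])])

point = np.array([1.0, 0.5, 10.0])  # "ground truth" landmark for the demo
proj = lambda P, X: (P @ np.append(X, 1))[:2] / (P @ np.append(X, 1))[2]
print(triangulate(P1, P2, proj(P1, point), proj(P2, point)))  # ~ [1, 0.5, 10]
```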
You may see Google Maps cars taking pictures around town; those photos are used to create Street View. Over time, Google has made its camera system smaller and lighter, so it can now map places that cars can't reach. But airplanes are also necessary for the 3D information in Immersive View.
The cameras mounted under airplanes are different from the ones on cars. They have four lenses, each capturing the scene from a slightly different angle, which creates an effect called parallax. Computers turn these overlapping images into 3D models of the terrain and buildings, complete with details such as signs and sidewalks. The same aircraft also capture imagery for Google Earth.
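To give a sense of how parallax encodes height, here is a simplified sketch using the classic stereo relation depth = focal length × baseline / disparity: nearby surfaces shift more between two exposures than distant ones. The numbers are made up for illustration and say nothing about Google's actual aerial rigs.

```python
def depth_from_parallax(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Classic stereo relation: the farther away a surface is, the less it
    shifts (disparity) between two images taken a known distance apart."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Invented example: two aerial exposures 100 m apart, 8000 px focal length.
ground = depth_from_parallax(focal_px=8000, baseline_m=100, disparity_px=1600)   # 500 m away
rooftop = depth_from_parallax(focal_px=8000, baseline_m=100, disparity_px=1684)  # ~475 m away
print(f"building height ≈ {ground - rooftop:.0f} m")  # ≈ 25 m
```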
The immersive view of a route doesn't just show still images. It also animates what traffic might look like at different times of day, and even shows birds flying or people walking, as if it were happening right then. It also displays weather information so you can better plan your trip.