Apple relies on Sony’s camera sensors to power the iPhone’s world-famous cameras. However, according to Bloomberg’s Mark Gurman in his Power On newsletter, Apple may be considering developing its own camera sensor technology in-house to further improve imaging across its products.
Gurman mentions that imaging and camera sensors are big selling points for iPhones and mixed reality devices. Apple believes that gaining more control, including designing its own imaging hardware, will improve its products. Despite this shift, Gurman notes that Apple will continue to depend on manufacturing partners.
This strategy is not new for Apple; it already designs its own components, such as the Taptic Engine and the silicon in the iPhone, Mac, and other devices. This approach not only supports devices over a longer lifespan but also allows for fine-tuning that delivers a more refined user experience. It also reduces both dependence on third-party suppliers and overall costs.
Apple’s imaging ambitions may extend beyond the iPhone. Speculation suggests it could aim to build better sensors for the long-rumored Apple Car and for future generations of its Vision Pro mixed reality headset.
The timetable for such a move and the current state of research and development are unclear. One thing is certain, though: Apple is committed to creating its own components. Having its own imaging sensor technology work alongside its chipsets and other hardware could open up new opportunities for the Cupertino-based tech giant.