
Glass Rethinks the Smartphone Camera through an Old-School Cinema Lens


Smartphone cameras have improved significantly, but it’s becoming increasingly difficult to enhance them further; we’ve pretty well reached the limit of what can be done in a cubic centimeter. Glass is a firm that aims to revolutionize camera technology by employing a significantly larger sensor and an optical technique borrowed from the world of filmmaking: anamorphic lenses. Because we’ve seen such steady advances in prior generations of phones, it may not be obvious that cameras can’t simply keep getting better. But, in a sense, we’ve used up all the slack in this line.

Improving the image requires a larger sensor, a better lens, or some kind of computational wizardry. Unfortunately, sensors can’t get much bigger, since they would necessitate larger lenses, and even when the camera is “folded,” there isn’t room in the phone body for those. Meanwhile, computational photography is fantastic, but it can only do so much: stacking a few photos to improve dynamic range or extract depth information is impressive, but it quickly reaches a point of diminishing returns.
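
To see why stacking hits diminishing returns, here’s a minimal sketch of exposure averaging (a toy illustration in Python, not Glass’s or any phone maker’s actual pipeline): averaging N frames only cuts random noise by about the square root of N, so each doubling of the frame count buys less improvement than the last.

```python
import numpy as np

def stack_exposures(frames: list[np.ndarray]) -> np.ndarray:
    """Average several noisy exposures of the same scene.

    Random (shot) noise falls off as ~1/sqrt(N): going from 1 to 4 frames
    halves the noise, but going from 4 to 16 only halves it again --
    diminishing returns.
    """
    acc = np.mean([f.astype(np.float32) for f in frames], axis=0)
    return np.clip(acc, 0, 255).astype(np.uint8)
```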


Glass co-founder and CEO Ziv Attar, who has worked in mobile imaging for over a decade, including at Apple, noted, “The restrictions used to be about money, but now it’s about size.” Tom Bishop, the other co-founder, previously worked at Apple, and the two of them collaborated on Portrait Mode, presumably frustrated by the constraints of traditional camera design. “They simply made the lens wider up until about 5 years ago, then they started making the sensor bigger,” Attar explained.

“Then you put algorithms at it to decrease noise, but even that is nearing its limitations; fairly soon [AI-generated images] will be pure delusion.” Night mode pushes exposure stacking to its limits: it copes admirably with the lack of photons, but zoom in close and everything starts to look strange and false. He went on to say, “The phone screen kind of deceives us. An ordinary person won’t see the difference between an iPhone 12 and 13, but everyone can tell when compared to a professional camera. There’s a lot of work to be done if you can notice the difference.”

So what, precisely, is that work? Of all these limitations, Attar has determined that changing the lens is the only approach that makes sense. True, the lens can’t get much bigger – but only if you insist on a symmetrical, conventional lens assembly. Why should we? Cinema gave up on that constraint a century ago. Movies weren’t always widescreen: for practical reasons, they originally took the shape of a 35mm film frame. You could present a widescreen image by matting off the top and bottom, which audiences appreciated – but you were really just cropping into a section of the frame, throwing away detail you had paid for, as the calculation below shows. A technique developed in the 1920s, however, soon remedied the problem.
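
The cost of that matting is easy to put a number on. A rough illustration, assuming a roughly 4:3 negative cropped to 2.35:1 widescreen (the specific ratios here are my assumption, not figures from the article):

```python
# Cropping a ~4:3 (1.33:1) frame down to 2.35:1 widescreen at full width:
# the usable height -- and thus vertical resolution -- shrinks proportionally.
native_ratio = 4 / 3      # width / height of the full film frame
target_ratio = 2.35       # desired widescreen aspect ratio

height_kept = native_ratio / target_ratio   # fraction of frame height used
print(f"{height_kept:.0%} of the negative's height reaches the screen")
# -> ~57%: nearly half the film you paid for is matted away
```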

Anamorphic lenses compress a wide field of view horizontally to fit it into the film frame, and the process is reversed at projection: an anamorphic projector lens stretches the picture back out to the proper aspect ratio. This introduces a few intriguing optical effects, but I’ll refrain from describing them, because you’ll never be able to un-see them in content if I do. The Glass lens system isn’t quite the same, but it employs comparable concepts and similarly oddly shaped lenses. It all started with the basic question of how to fit in a bigger sensor. Making a larger square sensor would require a larger lens, which we don’t have room for – but what if you made the sensor longer, like a rectangle?
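
The de-squeeze step has a simple digital analogue: a non-uniform resize. Here’s a minimal sketch using Pillow, where the 2x squeeze factor is the classic cinema value, assumed for illustration rather than taken from Glass:

```python
from PIL import Image

SQUEEZE = 2.0  # classic anamorphic squeeze factor; illustrative, not Glass's spec

def desqueeze(squeezed: Image.Image, factor: float = SQUEEZE) -> Image.Image:
    """Stretch a horizontally compressed frame back to its true aspect ratio."""
    width, height = squeezed.size
    return squeezed.resize((round(width * factor), height), Image.LANCZOS)

# Usage: a 2000x1500 anamorphic capture becomes a 4000x1500 widescreen frame.
# wide = desqueeze(Image.open("anamorphic_frame.png"))
```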

You’d also want a longer, rectangular lens to match. The anamorphic approach lets you capture a larger but distorted image, then use an image processor to restore the correct aspect ratio. (The method isn’t identical to the film approach, but it follows the same principles.) How much bigger an image can you capture? The primary camera on an iPhone 13 contains a sensor measuring roughly 7 x 5 millimeters, for a total area of 35 square millimeters. The sensor in Glass’s prototype is roughly 24 x 8 mm: about 192 square mm, or 5-6 times bigger, with a corresponding increase in megapixels. For your convenience, here’s a quick reference chart.
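
Those figures are easy to sanity-check with a quick back-of-the-envelope calculation, using the approximate dimensions quoted above:

```python
# Sensor areas from the approximate dimensions quoted above (in mm)
iphone13_area = 7 * 5    # iPhone 13 main camera: 35 mm^2
glass_area = 24 * 8      # Glass prototype: 192 mm^2

print(f"{glass_area / iphone13_area:.1f}x the light-gathering area")  # ~5.5x
```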