Why is the iPhone's fourth camera position given to lidar?


Before this year's new generation of iPhones launched, many people speculated that the iPhone 12 Pro Max would gain a fourth camera. Unexpectedly, the iPhone 12 Pro series placed a lidar (laser radar) sensor in that precious fourth camera position.

Not playing cards according to common sense has always been Apple's common sense.

1. What is lidar?

Viewed head-on, the round black area at the lower right of the iPhone 12 Pro series camera module is the lidar sensor.

Lidar, short for light detection and ranging, is a radar system that detects the position and speed of targets by emitting laser beams. It consists of a laser transmitter, an optical receiver, a turntable, and an information-processing system. It can obtain a target's range, azimuth, height, speed, attitude, and even shape, so as to detect, track, and recognize targets.

Jargon aside, the lidar on the iPhone 12 Pro is a tiny device, about the size of a phone camera, that can detect the distance to objects and record depth information.

In operation, lidar realizes two important functions.

First, it measures the approximate distance between an object and the phone.

It measures distance by firing laser beams that are invisible to the naked eye.

Second, it records the depth information of objects within its projection range.

At present, the lidar range Apple quotes is up to five meters. In other words, iPhone 12 Pro series phones equipped with lidar can, in theory, measure and record distance and depth information for objects within about 5 meters.
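
The arithmetic behind this kind of ranging is simple: light travels at a known speed, so the distance is half the measured round-trip time multiplied by the speed of light. A minimal sketch in Python (the 5 m figure is Apple's stated range; the nanosecond result illustrates why direct time-of-flight sensors need very fast timing electronics):

```python
# Direct time-of-flight (dToF) ranging, the principle lidar uses:
# distance = speed_of_light * round_trip_time / 2.

C = 299_792_458.0  # speed of light in m/s

def dtof_distance(round_trip_time_s: float) -> float:
    """Distance to a target from the measured round-trip time of a light pulse."""
    return C * round_trip_time_s / 2.0

# A target at Apple's stated 5 m limit returns the pulse in about 33 ns,
# which is why dToF hardware needs sub-nanosecond timing resolution.
round_trip = 2 * 5.0 / C
print(f"round trip for 5 m: {round_trip * 1e9:.1f} ns")  # ~33.4 ns
```

Note that millimeter-level accuracy at these speeds requires resolving time differences of only a few picoseconds, which is part of why dToF hardware is harder to build than iToF hardware.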

2. What is the difference between lidar and ToF?

When it comes to estimating object distance and recording depth information, many Android manufacturers have already fitted ToF (time-of-flight) sensors to some models to realize these same two functions.

For example, the P40 Pro has a 3D depth-sensing (ToF) camera that perceives the depth of objects and enables depth-of-field effects over a wide range.

At present, most phone manufacturers use iToF (indirect time-of-flight) sensors, while Apple's lidar is a dToF (direct time-of-flight) sensor.

Although both iToF and lidar are ToF technologies, the differences between them are considerable.

First, the signals are different.

iToF emits a light signal modulated into a sine wave, whose brightness varies periodically. Lidar emits laser pulses.

dToF principle

Second, iToF estimates distance indirectly, by sensing the phase difference between the emitted and reflected sinusoidal light signals. Lidar measures distance directly from the laser's round-trip time.

iToF principle
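
The indirect measurement can be sketched the same way: an iToF sensor recovers distance from the phase shift between the emitted and received modulated signals, and once the shift exceeds a full cycle the reading wraps around, which is one reason iToF suits short ranges. A minimal sketch, assuming a 100 MHz modulation frequency (a typical illustrative value, not a figure from the article):

```python
import math

C = 299_792_458.0  # speed of light in m/s

def itof_distance(phase_shift_rad: float, mod_freq_hz: float) -> float:
    """Distance from the phase shift of a sinusoidally modulated light signal:
    d = c * phase_shift / (4 * pi * f)."""
    return C * phase_shift_rad / (4.0 * math.pi * mod_freq_hz)

def unambiguous_range(mod_freq_hz: float) -> float:
    """Beyond c / (2f) the phase wraps past 2*pi and readings repeat."""
    return C / (2.0 * mod_freq_hz)

f_mod = 100e6  # 100 MHz modulation, an assumed illustrative value
print(f"unambiguous range: {unambiguous_range(f_mod):.2f} m")  # ~1.50 m
```

At 100 MHz the sensor can only distinguish distances within about 1.5 m before the phase wraps; lowering the modulation frequency extends the range but coarsens the resolution, a trade-off dToF does not face.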

Third, they differ in accuracy range, measurement speed, and resistance to interference.

The iToF signal is easily interfered with, its measurement accuracy decreases with distance, and measurement is relatively slow, so it is suited to short-range, centimeter-scale ranging.

Lidar resists interference well, and its measurement accuracy does not degrade with distance, so it is better suited to long-distance ranging.

Finally, iToF is easier to miniaturize, while dToF is harder to integrate and miniaturize, so the two serve visibly different scenarios: iToF mostly appears in phones, while dToF is more common in larger devices such as automobiles and drones.

Automotive lidar

Lidar is faster, more accurate, and longer-ranged than iToF, but it is harder to integrate and costs more. Integrating lidar into the iPhone 12 Pro camera module must have taken Apple considerable effort.

3. Why Apple chose lidar

Maybe the iPhone will one day become a phone with four, five, or six cameras. But at a critical moment, when people are debating whether iPhone photography has fallen behind, choosing to give lidar the precious fourth camera position shows how much importance Apple attaches to its future layout.

What does Apple want?

On the official webpage for the 2020 iPad Pro, Apple gave a straightforward answer: at this stage, lidar mainly serves AR (augmented reality).

At a moment when phone photography has almost reached its ceiling, Apple chose a path different from other manufacturers'. Instead of chasing higher pixel counts and longer optical zoom, Apple added lidar to let users record depth information.

The benefits of being able to record depth information are obvious; I summarize them as two-way fusion.

One direction is virtualizing reality, that is, AR: the camera system and lidar record all kinds of information about real things, which are then displayed on screen. They can be observed from any angle and at close range, breaking the limits of time and space.

Without the iPhone 12's depth information, viewing a virtual iPhone in AR would be impossible

With depth information, a photo is no longer a flat plane

Animoji (dynamic emoji) can be regarded as a simplified version of VR: the virtual character carries the facial depth information of a real person

Both kinds of fusion depend on depth information, which is also the bridge between AR and VR. For example, a teacher uses an AR device that records depth information to broadcast a live class, and students wear VR glasses to attend. Each student sees a virtual image of the teacher, yet that image's expressions and movements match the real teacher's in every detail, so the experience is hard to describe as simply AR or VR.

The film Ready Player One shows this fusion of reality and virtuality; it pairs well with this article.

4. Let's talk about mobile phones

What is the mobile phone to human society?

Looking back at human history since the mobile phone appeared, users' common demand in processing and exchanging information has always been decentralization.

At first, users' common demand was more efficient information exchange, so the phone's core functions were calls and text messages. The line-bound central nodes of information exchange (fixed telephones, mail, fax, and the postal and telecom services) were thoroughly marginalized, and now serve only as supplementary tools in special situations.

Next, users' common demand was to break the information imbalance and decentralize further. Everyone wanted a voice of their own, so the phone's core function became social, and social apps sprang up in large numbers.

It was precisely because Apple's full-touchscreen iPhone accurately and efficiently solved the pain points of this second-stage demand, at exactly the right time, that Apple could beat Nokia and become the overall market leader.

Now, what are the common needs of users?

Users need to decentralize further: not only to have a voice, but to record and transmit their individual civilization as widely as possible, so that it is written into the history of human civilization.

An iPhone ad showing the iPhone's life-recording function

Long ago, ordinary people had no sense of their place in history and no tools with which to record themselves. With mobile phones, the data stored on servers becomes a personal history book. These tiny, ant-like personal histories exert an unparalleled influence, and together they form a great history book of humanity: the shared database of human society. Individuals are no longer nameless figures that history can ignore or strike out with a single stroke of the pen; in mankind's shared great history, they live on, one by one.

Now, lidar gives us a higher-dimensional writing ability, adding depth information to sound, flat images, behavioral habits, and other dimensions, breaking through the limits of the plane.

5. Immortality starts with depth information

For ordinary users, the immediate, tangible help lidar brings is not yet obvious; it falls into the nice-to-have, little-impact category.

Its potential benefits, however, are very obvious. Conditions and camera apps permitting, every scene, every moment, every person you meet, every flower you see, every piece of tableware you touch, and every self-portrait you take can be recorded by lidar. Once the opportunity and technology mature and the amount of data is large enough, the combined sound, image, and behavior data could create, on a server, a virtual world and a virtual self exactly like the real world and the real self, and perhaps even wait for the day when technology is advanced enough to create a self in the real world.

If each robot in Westworld has a corresponding real human, then obtaining real people's depth information is the first step in making such robots

If one day technology is even more advanced, this virtual self could be given a set of human-like algorithms. A hundred years from now, anyone who misses us need only enter the virtual world to see a living us. Would that not be another form of eternal life?

Source: AI fan Er. Author: Zhu Hai. Editor in charge: Mao Xinsi_NBJS11624