Google has explained how the radar technology (Soli) built into its Pixel 4 smartphone detects hand movements. While many details of the hardware have been known for some time, this is the first time the company has described the artificial intelligence models behind Soli, which are trained to detect hand gestures and recognize them with low latency.
Google claims that bringing Soli to the Pixel 4 and Pixel 4 XL, the first consumer devices to feature the technology, could enable new forms of context and gesture awareness on devices such as smartwatches, paving the way for experiences that better serve users with disabilities.
Soli in the Pixel 4, a collaborative effort between Google's Advanced Technology and Projects group (ATAP) and the Pixel and Android product teams, operates on radio waves at 60 GHz; with a built-in 180-degree field of view, it records positional information along with quantities such as range and velocity.
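Google has not published Soli's exact waveform, but a common way short-range radars recover range from a reflection is frequency-modulated continuous-wave (FMCW) processing: the beat frequency between the transmitted chirp and its echo maps directly to distance. The sketch below is a minimal, hypothetical illustration in NumPy; the bandwidth, chirp duration, and target range are all invented for the example.

```python
import numpy as np

# Hypothetical FMCW parameters -- Soli's real waveform is not public.
c = 3e8            # speed of light (m/s)
bandwidth = 7e9    # chirp sweep bandwidth (Hz); the 60 GHz band allows wide sweeps
t_chirp = 100e-6   # chirp duration (s)
fs = 2e6           # ADC sample rate (Hz)
n = int(fs * t_chirp)

# Simulated beat signal for a hand at 0.3 m: the beat frequency is
# f_b = 2 * R * bandwidth / (c * t_chirp).
r_true = 0.3
f_beat = 2 * r_true * bandwidth / (c * t_chirp)
t = np.arange(n) / fs
beat = np.cos(2 * np.pi * f_beat * t) + 0.1 * np.random.randn(n)

# Range FFT: the peak bin maps back to target range.
spectrum = np.abs(np.fft.rfft(beat * np.hanning(n)))
freqs = np.fft.rfftfreq(n, 1 / fs)
r_est = freqs[np.argmax(spectrum)] * c * t_chirp / (2 * bandwidth)
print(f"estimated range: {r_est:.3f} m")  # ~0.3 m
```

Velocity comes out of the same pipeline by comparing the phase of the echo across successive chirps (a second, "Doppler" FFT along slow time).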
The reflected electromagnetic waves carry this information back to the antennas, where custom filters (including one that attenuates the sound-induced vibration caused by music playback) boost the signal-to-noise ratio and distinguish genuine reflections from noise and clutter.
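Google does not describe the filters themselves. One standard radar technique for separating a moving hand from static clutter is a moving-target-indication (MTI) filter: differencing consecutive frames along slow time so stationary reflections cancel. A minimal sketch, with the radar data layout assumed:

```python
import numpy as np

def mti_filter(frames: np.ndarray) -> np.ndarray:
    """Two-pulse canceller: difference consecutive frames along slow time
    so static clutter (walls, the phone body) cancels while reflections
    from moving objects (the hand) remain.

    frames: shape (num_frames, num_range_bins) -- an assumed layout.
    """
    return np.diff(frames, axis=0)

# Toy example: identical static clutter in every frame, plus a target
# moving one range bin per frame.
rng = np.random.default_rng(0)
clutter = np.tile(rng.normal(size=64), (100, 1))
frames = clutter.copy()
for i in range(100):
    frames[i, (10 + i) % 64] += 5.0

filtered = mti_filter(frames)
print(np.abs(filtered).max(axis=1)[:5])  # ~5.0: only the mover survives
```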
The transformed signal is then fed into Soli's machine learning models for gesture classification, sensitive to sub-millimeter motion, which according to Google required overcoming several major challenges (a minimal model sketch follows the list):
- Every user performs even simple motions, such as swipes, in countless different ways.
- Extraneous motion within the sensor's range can sometimes look similar to a gesture.
- From the sensor's point of view, the entire world appears to move whenever the phone itself moves.
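The structure of Soli's actual models is not public. As a rough illustration only, a small TensorFlow model that classifies sequences of radar frames into gestures might look like the following; every shape, layer choice, and gesture label is assumed for the example.

```python
import tensorflow as tf

# Hypothetical input: 30 radar frames per window, each a 32x32
# range-Doppler map. Gesture classes are invented for the example.
NUM_FRAMES, H, W = 30, 32, 32
GESTURES = ["swipe_left", "swipe_right", "reach", "background"]

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(NUM_FRAMES, H, W, 1)),
    # Per-frame spatial features, shared across the time axis.
    tf.keras.layers.TimeDistributed(
        tf.keras.layers.Conv2D(8, 3, activation="relu")),
    tf.keras.layers.TimeDistributed(tf.keras.layers.MaxPool2D()),
    tf.keras.layers.TimeDistributed(tf.keras.layers.Flatten()),
    # Temporal modeling over the frame sequence.
    tf.keras.layers.GRU(32),
    tf.keras.layers.Dense(len(GESTURES), activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```

Note the explicit "background" class: treating non-gesture motion as a label of its own is one common way to address the second challenge above.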
These challenges forced Google to develop algorithms that detect gestures accurately despite the distinct, personal ways individuals perform them, so that one person's style of swiping can be recognized as readily as another's. The company also developed algorithms to ensure that background motion is not falsely detected as a gesture.
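Google does not detail these algorithms. One generic way to make a classifier robust to how differently people perform the same swipe is to augment the recorded gestures with random time and amplitude distortions during training. A hypothetical augmentation helper:

```python
import numpy as np

def augment_gesture(frames: np.ndarray, rng: np.random.Generator) -> np.ndarray:
    """Randomly re-time and re-scale a recorded gesture so the model sees
    many speeds and intensities of the same motion. frames: (time, features),
    an assumed layout."""
    t, f = frames.shape
    # Random time warp: resample the sequence at a random speed.
    speed = rng.uniform(0.7, 1.3)
    src = np.clip(np.arange(t) * speed, 0, t - 1)
    warped = np.stack([np.interp(src, np.arange(t), frames[:, i])
                       for i in range(f)], axis=1)
    # Random amplitude scaling, e.g. gentler or more emphatic swipes.
    return warped * rng.uniform(0.8, 1.2)

rng = np.random.default_rng(42)
gesture = np.random.rand(30, 64)            # stand-in for a recorded swipe
print(augment_gesture(gesture, rng).shape)  # (30, 64)
```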
The teams responsible for Soli trained the system's models on millions of gestures recorded from thousands of Google volunteers, supplemented by hundreds of hours of radar recordings containing general motion from other Google volunteers.
Google also had to account for interference from other components inside the phone, so the company developed new signal-processing techniques to estimate the effect of audio vibration on the radar signal, which paved the way for Motion Sense to be used even while music is playing.
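The techniques Google developed here are not published. One classical approach to removing interference whose source you can observe (the music being played) is an LMS adaptive filter that learns to predict the vibration's contribution from the audio signal and subtracts it. A minimal sketch with made-up signals:

```python
import numpy as np

def lms_cancel(radar: np.ndarray, audio: np.ndarray,
               taps: int = 8, mu: float = 0.01) -> np.ndarray:
    """LMS adaptive canceller: learn FIR weights mapping the known audio
    signal to its vibration footprint in the radar signal, then subtract."""
    w = np.zeros(taps)
    out = np.zeros_like(radar)
    for i in range(taps, len(radar)):
        x = audio[i - taps:i]
        est = w @ x                  # predicted vibration component
        out[i] = radar[i] - est      # cleaned sample
        w += mu * out[i] * x         # LMS weight update
    return out

# Toy data: slow hand motion plus a delayed, scaled copy of the music.
n = 5000
t = np.arange(n)
audio = np.sin(2 * np.pi * 0.05 * t)
hand = 0.3 * np.sin(2 * np.pi * 0.002 * t)
radar = hand + 0.5 * np.roll(audio, 3)   # vibration arrives delayed
cleaned = lms_cancel(radar, audio)
print(np.abs(cleaned[-500:] - hand[-500:]).mean())  # small residual
```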
The artificial intelligence models were trained using Google's TensorFlow machine learning framework and optimized to run directly on the Pixel 4's low-power digital signal processor (DSP), allowing them to track motion at up to 18,000 frames per second, even when the main processor is powered down.
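Google does not say exactly how the models were prepared for the DSP; the standard route for running TensorFlow models on low-power hardware is TensorFlow Lite with post-training quantization. A sketch of that path, using a stand-in model and an invented calibration dataset:

```python
import numpy as np
import tensorflow as tf

# A stand-in Keras model -- the real Soli model is not public.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(30, 32, 32, 1)),
    tf.keras.layers.TimeDistributed(tf.keras.layers.Flatten()),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(4, activation="softmax"),
])

def representative_data():
    # Invented calibration samples matching the model's input shape.
    for _ in range(100):
        yield [np.random.rand(1, 30, 32, 32, 1).astype(np.float32)]

# Post-training quantization shrinks the model and enables integer
# kernels suited to low-power accelerators and DSPs.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_data
tflite_model = converter.convert()

with open("soli_gestures.tflite", "wb") as f:
    f.write(tflite_model)
```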
The remaining challenge for Google is interpreting motion quickly and accurately enough to know what the user is doing, while also accounting for the phone's own movement as the user walks, or its vibration during music playback. The company said: "We are excited to continue research and development on short-range radar (Soli) in order to enable new sensing and perception capabilities."