As the iPad and iPhone demonstrate, much of Apple’s hardware relies heavily on the accurate detection of direct touch input, such as a finger resting on a screen or, in the Mac’s case, on a trackpad. However, as people around the world increasingly depend on augmented reality (AR) for both entertainment and work, they need to interact with digital objects that carry no physical touch sensors. Apple has recently patented a technique for detecting touch using Machine Learning (ML) and depth-mapping cameras.
As described in the patent, Apple’s depth-based touch detection system is fairly straightforward: external cameras work together in real time to build a three-dimensional depth map, measuring the distance between an object, for instance a finger, and a touchable surface, and then determining when the object actually touches that surface. Crucially, the distance measurement is designed to remain usable even when the cameras change position, relying in part on a trained ML model to discern genuine touch inputs.
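To make the idea concrete, here is a minimal sketch of how depth-based touch detection might work in principle. It is not Apple’s implementation: the 5 mm contact threshold, the pinhole camera intrinsics, and the assumption that a fingertip pixel has already been located by a separate detector are all illustrative, and the patent’s ML model would refine or replace the fixed threshold used here.

```python
# Hypothetical sketch of depth-based touch detection (not Apple's actual method).
# Assumes: a depth map (2D array of distances in meters), a known plane for the
# touchable surface, and a fingertip pixel location from a separate detector.
import numpy as np

TOUCH_THRESHOLD_M = 0.005  # assumed 5 mm contact threshold (illustrative only)

def fingertip_to_3d(depth_map, pixel, fx, fy, cx, cy):
    """Back-project a fingertip pixel into camera space using the pinhole model."""
    u, v = pixel
    z = depth_map[v, u]                    # depth in meters at the fingertip pixel
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.array([x, y, z])

def distance_to_surface(point_xyz, plane_normal, plane_point):
    """Perpendicular distance from a 3D point to the touchable surface plane."""
    n = plane_normal / np.linalg.norm(plane_normal)
    return abs(np.dot(point_xyz - plane_point, n))

def is_touch(depth_map, fingertip_pixel, plane_normal, plane_point, intrinsics):
    """Classify a frame as a 'touch' when the fingertip is within the threshold."""
    fx, fy, cx, cy = intrinsics
    tip = fingertip_to_3d(depth_map, fingertip_pixel, fx, fy, cx, cy)
    return distance_to_surface(tip, plane_normal, plane_point) < TOUCH_THRESHOLD_M
```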
Illustrations accompanying the patent show three external cameras working together to determine the relative position of a finger, a configuration that will feel familiar to users of Apple’s triple-camera iPhone 11 Pro models. Similar multi-camera arrays are likely to appear in future Apple devices, such as dedicated AR glasses and iPad Pros, allowing each device to determine finger input by combining the ML model with a depth map of the scene and interpreting the intent behind changes in the finger’s position.
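With several calibrated cameras viewing the same fingertip, its 3D position can be recovered by intersecting the viewing rays. The sketch below shows one standard approach, least-squares triangulation of camera rays; the patent does not specify this particular method, and the camera centers and ray directions are assumed to come from a prior calibration and detection step.

```python
import numpy as np

def triangulate_fingertip(centers, directions):
    """Estimate the 3D point closest (in least squares) to a set of camera rays.

    centers    : list of 3D camera positions
    directions : list of 3D viewing directions toward the fingertip
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for c, d in zip(centers, directions):
        d = np.asarray(d, dtype=float)
        d = d / np.linalg.norm(d)            # use unit-length ray directions
        P = np.eye(3) - np.outer(d, d)       # projector onto the plane normal to the ray
        A += P
        b += P @ np.asarray(c, dtype=float)
    return np.linalg.solve(A, b)             # solve the 3x3 normal equations

# Example: two hypothetical cameras 20 cm apart observing a fingertip at (0, 0, 0.5)
tip = np.array([0.0, 0.0, 0.5])
cams = [np.array([-0.1, 0.0, 0.0]), np.array([0.1, 0.0, 0.0])]
rays = [tip - c for c in cams]
print(triangulate_fingertip(cams, rays))     # ~[0. 0. 0.5]
```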
Equipped with this technology, future AR glasses could eliminate the need for physical trackpads and keyboards by replacing them with digital versions that only the wearer can see and interact with. Such AR glasses could also anchor user interfaces to other surfaces, such as walls; one could conceivably build a secure elevator that can only be operated, or directed to particular floors, through its AR buttons.
Apple has been granted patent US10,572,072 for the AR touch detection technology, invented by Sunnyvale-based Daniel Kurz and Lejing Wang. Apple first filed the patent application at the end of September 2017. Tim Cook, Apple’s CEO, has suggested that AR will be of the utmost importance to the tech giant as it moves forward in today’s highly competitive market. For more, visit: https://www.kashishipr.com/