Wednesday 28 September 2016

Apple is pushing for a world without touch, but what lies beyond that world?

[Image: Minority Report-style gesture interface]
With the launch of the iPhone 7 and much-maligned AirPods, Apple is pushing us towards a world where physically interacting with our phones may be unnecessary. But what lies beyond that world?
It’s not just Apple, of course. Amazon’s Alexa, Google Now and Cortana are all aiming for a world where we spend less time clicking, scrolling or tapping and simply talk to our devices instead. The idea is appealing: not having to constantly pull your phone out of your pocket to check the time or deal with a notification would be a relief, as would placing calls or sending messages by voice.
Isn’t saying, “Hey Siri, call mom” far easier than pulling your phone out of your pocket, unlocking it, tapping the dialer, searching for “mom” and then tapping on call? You can streamline the process a bit with shortcuts and widgets of course, but you get the idea.
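To make that comparison concrete, here’s a minimal sketch of what such a voice shortcut boils down to: map one transcribed phrase to one action. The CONTACTS table and place_call() helper are made up for illustration; they aren’t anything Siri, Google Now or Cortana actually exposes.

# A toy "call mom" handler: one transcribed phrase in, one action out.
# CONTACTS and place_call() are hypothetical stand-ins for an assistant's internals.
CONTACTS = {"mom": "+1-555-0100", "office": "+1-555-0199"}

def place_call(number: str) -> str:
    # Stand-in for handing the number off to the platform's dialer.
    return f"Dialing {number}..."

def handle_command(transcript: str) -> str:
    """Turn an utterance like 'call mom' into an action."""
    words = transcript.lower().replace(",", "").split()
    if "call" in words and words.index("call") + 1 < len(words):
        target = words[words.index("call") + 1]   # the word after "call" names the contact
        number = CONTACTS.get(target)
        return place_call(number) if number else f"No contact named '{target}'"
    return "Sorry, I didn't catch that"

print(handle_command("Hey Siri, call mom"))   # -> Dialing +1-555-0100...

The hard parts, of course, are the speech recognition and natural-language understanding that sit in front of a loop like this.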
What about PCs, though? Where do they fit in? The likes of Google Now and Siri are great on a phone, but I’m not sold on the idea for the PC. I don’t know about you, but I find it far faster to type than to holler for Cortana every time I need something.
Are we going to be stuck on keyboard/mouse/touchpad forever, though? Maybe not.
Gesture recognition has been slowly but steadily picking up pace. With VR and AR around the corner, gestures might just take centre-stage over every other form of input in the future.
Apple seems to be working on gesture-recognition technology of its own. It recently patented a “3D depth-mapping” technology that involves lasers and eye-tracking. Apple has been working on similar technology since 2009, and in 2013 it bought PrimeSense, the company behind the Kinect’s tracking system.
Intel’s RealSense has been in the business for a long time, Microsoft already does a form of gesture tracking with the HoloLens’ stereo cameras, and the likes of LeapMotion and Tobii’s EyeX have been working on bringing gestures to a PC near you.
With the Oculus Rift and HTC Vive, tracking position and gestures is integral to creating a truly immersive experience.
If you’ve not used gestures before, let me just tell you that they are amazing, but only when they work. I’ve tried RealSense and LeapMotion myself and have experimented a bit with head and hand tracking. While finicky and temperamental, the setups proved to be very freeing.
Interactions felt more natural and intuitive. Unlike a regular keyboard and mouse, which we’ve simply gotten used to, swiping away a notification or scrolling through a page with a gesture felt instinctive. In games, that feeling of freedom is vastly enhanced.
You’re not locked to a fixed perspective. You won’t realise how much more natural it is to drive a virtual car and glance at a mirror or track a corner with a simple head movement till you actually try it.
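To give a feel for what “swiping away a notification” involves under the hood, here’s a toy sketch that classifies a left or right swipe from a stream of per-frame hand positions, the kind of data a depth camera provides. It is not RealSense’s or LeapMotion’s actual API, and the window size and travel threshold are made-up values.

from collections import deque
from typing import Optional

class SwipeDetector:
    def __init__(self, window: int = 10, min_travel: float = 0.15):
        self.xs = deque(maxlen=window)   # recent horizontal hand positions, in metres
        self.min_travel = min_travel     # distance the hand must travel to count as a swipe

    def update(self, hand_x: float) -> Optional[str]:
        """Feed one frame's hand x-position; return 'left' or 'right' when a swipe completes."""
        self.xs.append(hand_x)
        if len(self.xs) < self.xs.maxlen:
            return None
        travel = self.xs[-1] - self.xs[0]
        if abs(travel) >= self.min_travel:
            self.xs.clear()              # don't re-trigger on the same motion
            return "right" if travel > 0 else "left"
        return None

# Example: a hand sweeping roughly 20 cm to the right over a dozen frames.
detector = SwipeDetector()
for frame, x in enumerate(0.02 * i for i in range(12)):
    gesture = detector.update(x)
    if gesture:
        print(f"frame {frame}: swipe {gesture} -> dismiss notification")

A real system would also have to reject accidental motions, which is exactly the kind of fine-tuning that makes these setups finicky.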
Gestures don’t stop there, of course. Imagine controlling a TV with gestures: no remote, no hassles. If you’ve used a good air remote, you know what I’m talking about. Could this extend to smart homes? Why not? Alexa and Siri can already control your home, after all.
Gesture controls don’t work very well right now. The right hardware is expensive, fine-tuning is a pain at times and there just isn’t enough software that takes advantage of the technology, but it will get there eventually.
I eagerly await the day that gestures go mainstream. Maybe then I can finally fulfil my Minority Report fantasies.


