Now the blind can "see" with Google Glass


Google introduced its Glass project, along with the developer edition, earlier this year, and there has been much buzz about what the platform can do. While Glass applications mostly involve an augmented-reality visual layer of data meant for sighted users, one big potential application is enabling the visually impaired to "see."

A Google Glass developer is building tools that let the blind "see" using sound. The system is intended to work like sonar (the positioning technology used in submarines and by some marine mammals) to establish the placement of objects around the user. There are existing technologies and apps that use the same concept, although they have limitations. For instance, the vOICe app for Android primarily describes objects around you with speech, attempting to identify items and their proximity using the Android device's camera.
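The vOICe's general mapping is publicly documented: each camera frame is scanned left to right over about a second, with vertical position mapped to pitch (higher in the frame means higher pitch) and brightness mapped to loudness. Here is a minimal sketch of that kind of image-to-sound conversion in Python; the function name, frequency range, and scan timing are illustrative assumptions, not the app's actual code:

```python
import numpy as np

def image_to_sound(image, duration=1.0, sample_rate=8000,
                   f_lo=500.0, f_hi=5000.0):
    """Sonify a grayscale image, vOICe-style: columns are scanned
    left to right over `duration` seconds; each row maps to a pitch
    (top = high, bottom = low) and pixel brightness sets loudness.
    Frequency range and timing are illustrative assumptions."""
    rows, cols = image.shape
    samples_per_col = int(duration * sample_rate / cols)
    # One sine frequency per row, highest pitch at the top of the frame.
    freqs = np.linspace(f_hi, f_lo, rows)
    audio = []
    t0 = 0
    for c in range(cols):
        t = (t0 + np.arange(samples_per_col)) / sample_rate
        # Sum sinusoids weighted by pixel brightness in this column.
        col = image[:, c, None] / 255.0                  # (rows, 1)
        tones = np.sin(2 * np.pi * freqs[:, None] * t)   # (rows, n)
        audio.append((col * tones).sum(axis=0))
        t0 += samples_per_col
    signal = np.concatenate(audio)
    peak = np.abs(signal).max()
    return signal / peak if peak > 0 else signal

# A single bright dot near the upper-left corner becomes a high tone
# early in the one-second scan; a dot lower and to the right would
# sound later and deeper.
img = np.zeros((16, 16), dtype=np.uint8)
img[2, 1] = 255
wave = image_to_sound(img)
```

The point of the sketch is only the mapping itself: horizontal position becomes time, vertical position becomes pitch, brightness becomes volume. A trained listener learns to decode that soundscape back into a rough picture.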

However, the limitation is that the camera actually has to "see" the environment, and you need to wear earphones. That's not exactly a practical solution. First, it's not hands-free, unless you can mount your smartphone on your body or clothing. Second, earphones can be quite cumbersome. Here's where Google Glass comes in.

Glass includes two features that are essential for this application:

First is the head-mounted camera, which moves along with your head.
Second is bone conduction, which transmits audio to your inner ear. This means no more need for cumbersome earbuds or headphones. Just wear the Glass headset and you're good to go.
This assumes, of course, that the headset is paired with an Android device for sight-to-sound translation. A current limitation, too, is that sight-to-sound works best with stereo headsets, which give the brain a better way of positioning a sound relative to three-dimensional space. Otherwise, it can simply end up confusing, because you will hear an object description but not know exactly where the object is.
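The stereo-localization point above can be illustrated with simple amplitude panning: an object's horizontal position sets the left/right balance between the two ears. This is a hypothetical sketch of constant-power panning (the function name and the [-1, 1] position convention are assumptions for illustration), not how vOICe or Glass actually renders audio:

```python
import math

def pan_stereo(mono_sample, position):
    """Constant-power pan: position -1.0 is full left, +1.0 is full right.
    Returns (left, right) amplitudes for a single mono sample, keeping
    total acoustic power roughly constant across positions."""
    theta = (position + 1.0) * math.pi / 4.0  # map [-1, 1] -> [0, pi/2]
    return mono_sample * math.cos(theta), mono_sample * math.sin(theta)

# An object dead center reaches both ears equally...
left_c, right_c = pan_stereo(1.0, 0.0)
# ...while an object to the right is louder in the right ear,
# which is the cue the brain uses to place it in space.
left_r, right_r = pan_stereo(1.0, 0.8)
```

With a mono earpiece, both examples would collapse to the same signal, which is exactly why the description-without-direction problem arises.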

This new development will not exactly restore eyesight, although it's a good substitute until that time in the distant future when we can directly interface our gadgets with our brains. Check out the video for an example of how vOICe provides audible cues (like "pings") to help navigate through a 3D room.

You can also check out the vOICe for Android project page (in the source links) for a better description of how the application works. If you're curious about the capitalization, read OIC as "oh, I see."
