Google Glasses Optometrist

Through The Google Looking Glass

Dr. Kisling | binocular vision, computer vision syndrome, Contact Lenses, eyeglasses, Google Glass, Technology

Google Glass - An Optometrist Views The Good, The Bad, And The Downright Ugly!



Google Glass is coming! Initially I felt it was a gimmick that would have little effect. Now I think differently. It will be the biggest change in communications since the advent of the cell phone, only growing at a much faster rate. In case you missed it, Google Glasses are the new eyeglasses sporting a projected display in front of the right eye. The image is projected through a small prism so you can view the computer screen, images, and probably a virtual keyboard you will be able to type on by looking up a little. The patents cover a lot of territory and indicate the projected image can be adjusted up, down, or into the line of sight. In my mind, the question remains whether Google will own this new area of technology or be left behind by another start-up that takes a better approach. Only time will tell, but Google will certainly be the one that first brings this technology to the mainstream.

Isabelle Olsson is the person responsible for bringing colors to the eyeglass frame: shale, tangerine, charcoal, cotton, and sky. So far it sounds like there will be one frame model, the so-called Explorer. It will probably come with a sunglass insert so people won't look quite so strange when looking at things that aren't really there; we should encourage using it. By the end of this year Google Glass will probably be available for people who need corrective lenses. So far I have not seen anyone sporting Google Glass in Fort Collins, but with the high concentration of the tech industry in Northern Colorado it's only a matter of time before "Through The Google Looking Glass" enters the local lexicon. Reportedly people are wandering around HP in California trying them out, and a lot of those HP people end up in Fort Collins.

Through The Google Looking Glass - Eyewear In Need Of An Adjustment!

Now for the not-so-good news. Have you ever worn a lopsided pair of eyeglasses? Well, you will probably get to do that again. The patent indicates that weight is added behind the user's ear on the same side as the device to balance the weight off of the nose. That's all well and good, but it adds more weight to the already top-heavy side. Do you really feel better if your ear hurts instead of your nose? They need to add more weight behind the other ear so the frame isn't tipped toward one side. Google Glass is headed toward the heavy eyewear we have struggled so hard to leave behind.

The Lawyers Are Coming After Google Glass!

The device on the frame, as shown in its current incarnation, will create a significant loss of peripheral vision in the right superior visual field. That is probably too much to consider driving a car with it on, but people will undoubtedly do so. Of course, that is kind of like driving with a six-pack on the front seat: in addition to being blatantly obvious, there is a digital trail to accompany any black-box data from the car. The same problem goes for riding a bike, walking across a busy intersection, and so on. The lawyers are already looking for future cases. They haven't even gotten to the visual problems; the visual-perceptual distraction issues are more than enough for now. And the visual issues are not limited to field restrictions. How many times have you seen someone turning the corner of a busy intersection while gabbing on a cell phone and looking up and to the right? It is an automatic thing. Add the interactive nature of Google Glass and you have people looking in all kinds of directions except the correct one!

The loss of peripheral vision is not insurmountable; someone will find a way around it. The prism and projector are too big and need to be moved to an area that will not restrict vision. A prism is probably not the best way to move the image into a desired region. Perhaps projecting a holographic image or using a lens matrix that can be electrically activated will work in the future. We currently have Transitions lens technology that alters light transmission through the lens when activated by UV light. Last year we saw the release of the PixelOptics emPower prescription no-line bifocal lens, which allows adjustment of the bifocal strength by touching the side of the frame. The touch changes the electrical charge to a membrane lens component, which alters the multifocal lens power. Take the concepts behind both of these lenses, put them into a microgrid, add different color responses pixel by pixel, and you have solved the problem! OK, maybe it's not so easy, but something will work out. Lumos Optical in Israel has developed a lens that spreads images from the side across the lens using tiny angled wedges; the image is then displayed using semitransparent sections. That sounds kind of like partial mirrors - the one-way mirrors they used to show on police shows years ago.
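To make the microgrid idea a little more concrete, here is a purely illustrative toy model in Python. Nothing in it corresponds to a real lens or product; the LensCell and LensMatrix classes and the linear voltage-to-transmission relationship are invented for the sketch, just to show what "electrically addressing a lens pixel by pixel" could look like in software terms.

# Toy model (not any real lens hardware or API): a lens surface divided
# into a small grid of cells, where each cell's light transmission is set
# electrically, the way the post imagines combining Transitions-style
# darkening with emPower-style electrical switching, pixel by pixel.

class LensCell:
    """One electrically addressable cell of the hypothetical lens microgrid."""
    def __init__(self):
        self.voltage = 0.0          # applied drive voltage (arbitrary units)
        self.transmission = 1.0     # 1.0 = fully clear, 0.0 = opaque

    def apply_voltage(self, volts: float) -> None:
        # Assume transmission falls off linearly with drive voltage,
        # clamped to the physical range [0, 1]. Purely illustrative.
        self.voltage = volts
        self.transmission = max(0.0, min(1.0, 1.0 - 0.2 * volts))


class LensMatrix:
    """A rows x cols microgrid of cells forming a crude in-lens display."""
    def __init__(self, rows: int, cols: int):
        self.cells = [[LensCell() for _ in range(cols)] for _ in range(rows)]

    def draw_pixel(self, row: int, col: int, volts: float) -> None:
        self.cells[row][col].apply_voltage(volts)

    def render(self) -> str:
        # Darker cells print as '#', clear cells as '.', to visualize the image.
        return "\n".join(
            "".join("#" if cell.transmission < 0.5 else "." for cell in row)
            for row in self.cells
        )


if __name__ == "__main__":
    lens = LensMatrix(4, 8)
    for col in range(8):            # "draw" a dark line across the lens
        lens.draw_pixel(1, col, 4.0)
    print(lens.render())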

Not so minor a problem is the fact that Google Glasses provide the projected image in front of one eye only. Most optometrists have fitted patients needing bifocals with monovision or modified monovision. Monovision uses one eye for distance and the other for near. For people who readily suppress, or turn off, part of the image from one eye it works very well. For the rest of us it's a nightmare of eyestrain and ghosting of images. In addition, monovision works only with contact lenses, not with glasses; ask any eye doctor about that one. Google Glass resembles modified monovision in many ways, and I believe a lot of people are going to be looking for a $1,500 refund. Probably not at first, but it will come. If you look at the early days of computer integration into the workplace, there is a precedent. The early CRT screens had horrible resolution and flicker, but very few people complained. The early users were tech-oriented, research types who were willing to put up with a lot. As computers started showing up on the typical office worker's desk, complaints of eyestrain and headaches started to skyrocket. Today we have a name for this type of binocular dysfunction: "Computer Vision Syndrome." I think that history is destined to repeat itself. South by Southwest is not a typical slice of the population; Google should have tried setting up a booth at the corner Walmart if they really wanted to know how people would respond to it.

Google Contact Lenses - Last But Not Least

And then there are "Google Contact Lenses." Well, not yet, but guess who has been the lead on Project Glass? Babak Parviz, a Professor of Innovation and Electrical Engineering at the University of Washington. Is it just a coincidence that he developed the first LED display contact lens? I think not. Google has to be looking at this for the future. They will have to play catch-up with imec and the Centre for Microsystems Technology in Belgium, which have developed a curved LCD display using different polymers that conduct electrical charges. That allows the entire contact lens to become a display.

Unlike Google Glasses, a Google contact lens could not be hooked up to a power source by wires. Professor Parviz has refined a concept from a contact lens designed to measure eye pressure for glaucoma (yes, this one really is on the market). The contact lens has a ring of thin wires in the periphery that allows it to be powered by induction from a battery that is not physically attached. Cool stuff!

The possibilities are amazing, but there are still a lot of issues to deal with. If you are old enough, you can remember the first cell phones. I remember meeting an ophthalmologist for lunch who plopped a huge phone, bigger than a brick and with what must have been a three-foot antenna, onto the table. It probably maintained a charge almost all of the way through lunch. Look at where we are today. Google Glass is a technology waiting to blossom; there will just be a few growing pains along the way.

Google Glass Updates

The XE10 software update for Google Glass came out in October 2013 with improved capabilities in the Google Now functions. When the wearer asks for directions, they will be given in the last format used - so if you had been walking, the directions will be given for walking. Directions for public transit have also been added. The September update, XE9, added more of the predictive search functions. We are all used to typing what we want to know into a search and checking the most relevant results. Google Glass technology is pushing this farther than ever before by trying to predict what information we will need based on where we are and where we appear to be going. Sooner or later we may not even need to know where we are going; Google may predict it for us!
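As a rough illustration of that "last format used" behavior, here is a minimal sketch in Python. The names TravelMode and DirectionsService are invented for the example and are not the actual Glass or Google Now software; the sketch only shows the idea of falling back to whichever travel mode the wearer used last.

# Hypothetical sketch of "directions come back in the last mode you used."
from enum import Enum
from typing import Optional

class TravelMode(Enum):
    WALKING = "walking"
    DRIVING = "driving"
    TRANSIT = "transit"  # public transit directions arrived with XE10

class DirectionsService:
    """Invented directions helper that remembers the last travel mode used."""

    def __init__(self, default_mode: TravelMode = TravelMode.DRIVING):
        self._last_mode = default_mode

    def get_directions(self, destination: str, mode: Optional[TravelMode] = None) -> str:
        # If the wearer does not say how they are traveling, reuse the mode
        # from their previous request, as described above.
        if mode is not None:
            self._last_mode = mode
        return f"Directions to {destination} by {self._last_mode.value}"

if __name__ == "__main__":
    glass = DirectionsService()
    print(glass.get_directions("the office", TravelMode.WALKING))  # explicitly walking
    print(glass.get_directions("the coffee shop"))                 # defaults to walking again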

