While Google Glass may not have taken hold in the consumer market, a new generation of high-tech smart glasses is emerging to turn the eyewear industry on its head.
Earlier this year, researchers at the University of Utah announced that they had developed “glasses with liquid-based lenses that ‘flex’ to refocus on whatever the wearer is viewing,” according to a release from the National Institute of Biomedical Imaging and Bioengineering, which funded the project.
“The glasses incorporate an impressive array of electrical, mechanical, optical, sensor, and computer technologies with the goal of developing a one-size-fits-all approach to vision correction,” said Andrew Weitz, an NIBIB program director whose expertise includes bioelectronic vision technologies, in the release.
Human eyes typically lose the ability to focus at different distances as they age. The glasses compensate with glycerin sandwiched between flexible membranes, mimicking the eye’s natural ability to flex and refocus on whatever the wearer is looking at, near or far, so a single lens can act like many. They do this using an algorithm that relies on two variables: the eyeglass prescription, which the user enters into a mobile app, and the wearer’s gaze distance, gathered by a sensor mounted on the bridge of the glasses that “uses pulses of infrared light to identify where the user is looking and provide the precise distance,” the release notes.
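The release doesn’t publish the algorithm itself, but its two inputs, the stored prescription and the sensed gaze distance, map naturally onto a basic thin-lens calculation: focusing at a distance of d meters requires roughly 1/d diopters of accommodation on top of the wearer’s distance correction. A minimal sketch of that idea (the function name is hypothetical; real firmware would also have to account for lens-to-eye distance and actuator behavior):

```python
def target_lens_power(prescription_d: float, gaze_distance_m: float) -> float:
    """Estimate the optical power (diopters) the adaptive lens should adopt.

    prescription_d: the user's distance prescription, entered via the app.
    gaze_distance_m: distance to the gazed-at object, from the IR sensor.

    A presbyopic eye can no longer supply the extra 1/d diopters of
    accommodation needed for near objects, so the flexible lens adds it
    on top of the distance correction.
    """
    if gaze_distance_m <= 0:
        raise ValueError("gaze distance must be positive")
    accommodation_d = 1.0 / gaze_distance_m  # extra power for a near object
    return prescription_d + accommodation_d

# A -2.0 D myope reading at 50 cm: the lens flexes to -2.0 + 2.0 = 0.0 D
print(target_lens_power(-2.0, 0.5))
```

The sensor would feed this calculation continuously as the wearer’s gaze shifts, with the membranes reshaping to match the new target power.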
Moreover, as a wearer looks around, the lens is able to change shape and focus on a new distance in just 14 milliseconds, 25 times faster than the blink of an eye, according to the NIBIB.
“Theoretically, these would be the only glasses a person would ever have to buy because they can correct the majority of focusing problems,” Carlos Mastrangelo, an electrical and computer engineering professor at the University of Utah, told the source. “Users just have to input their new prescription as their eyesight changes.”
Because of the sensor and a rechargeable battery pack, the current prototype is relatively bulky, but the engineers are continually refining the design and hope to bring the product to the commercial market within the next three years.
Drawing on artificial intelligence, augmented reality and smart glasses such as Google Glass, a California-based startup is working alongside AT&T to help blind and low-vision people navigate their surroundings. The glasses stream what a blind or visually impaired individual might see to a remote agent via an AI dashboard. The agent can then help the individual navigate their surroundings, read signs or shop, MobiHealthNews reports.
The startup, Aira, has raised $15 million thus far to develop tech and services that will remove barriers for the estimated 20 million blind and low-vision people living in the U.S. The company recently moved out of beta and made the platform available for purchase, the site reports.
But Google Glass isn’t the tech giant’s only product that has potential to help the visually impaired. The recently announced Google Lens project, which uses AI to navigate indoor spaces via a smartphone, could also be used to help those with low or no vision move around with greater ease.
“Further out, imagine what this technology could look like for someone with impaired vision, for example," Google Vice President of Virtual Reality Clay Bavor said at the Google I/O conference in May, MobiHealthNews reports. "Our visual positioning system with an audio interface could transform how they make their way through the world.”