Newly released software is taking on the last challenge in audio searching. The tool, called MediaMined, is an artificial intelligence system that can make sense of what it hears, whether the sounds are speech, music, or even a sound effect like an explosion or creaking door.
These days, it’s easy to take for granted the power of online text searches. With Google and Bing able to find just about anything written online, we may not appreciate how tricky it is for computers to search through pictures, video, and sound. Unlike people, computers cannot easily recognise objects in pictures or identify sounds heard in an audio file. One solution to this problem has been to label files with keywords, but the ideal solution would be to create software that understands content without needing such help.
Created by San Francisco-based Imagine Research and supported by funds from the US National Science Foundation, MediaMined is not the first tool designed to ‘understand’ audio. Voice recognition technology enables software to digest speech, and tools like Shazam and SoundHound’s Midomi and Hound have also come on the scene in recent years and can recognise music. MediaMined, however, extends audio recognition to all sounds, giving it applicability beyond speech and music.
MediaMined sets itself apart from other sound-searching tools by applying a machine-learning approach. This flexible strategy lets users find sets of similar sounds based on features beyond what might be in the keywords associated with a file. As Imagine Research’s founder and CEO Jay LeBoeuf explained in a recent press release, MediaMined “allows computers to index, understand and search sound; as a result, we have made millions of media files searchable.” With its general applicability, MediaMined could help movie soundtrack makers work more efficiently or, its creators speculate, even help doctors to assess a patient’s cough or wheeze.
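The press release does not describe MediaMined’s actual algorithm, but the general idea behind content-based audio similarity can be sketched in a few lines: summarise each sound as a vector of spectral features, then compare vectors rather than keywords. The toy example below (in Python with NumPy; the feature scheme and band edges are illustrative assumptions, not MediaMined’s method) shows why two similar hums match each other better than either matches noise.

```python
import numpy as np

def spectral_features(signal, sr=16000, n_bands=16):
    """Summarise a waveform as mean energy in log-spaced frequency bands.

    This is a deliberately simple stand-in for the richer features a real
    system would extract; it captures only the rough spectral shape.
    """
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1 / sr)
    edges = np.logspace(np.log10(50), np.log10(sr / 2), n_bands + 1)
    feats = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(spectrum[mask].mean() if mask.any() else 0.0)
    feats = np.array(feats)
    norm = np.linalg.norm(feats)
    return feats / norm if norm > 0 else feats

def similarity(a, b):
    """Cosine similarity between two unit feature vectors (1.0 = identical)."""
    return float(np.dot(a, b))

# Toy "sound files": two slightly different low hums and white noise.
t = np.linspace(0, 1, 16000, endpoint=False)
hum1 = np.sin(2 * np.pi * 110 * t)
hum2 = np.sin(2 * np.pi * 112 * t)
noise = np.random.default_rng(0).normal(size=16000)

f_hum1, f_hum2, f_noise = map(spectral_features, (hum1, hum2, noise))
# The two hums land in the same frequency bands, so they score far
# higher against each other than either does against broadband noise.
```

A search tool built on this idea would index every file’s feature vector once, then answer a “find sounds like this” query by ranking the index by similarity, with no keywords required.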
Posted on Wednesday, 23rd November, 2011
A team of Japanese scientists has managed to create stem cells that re-enact the early stages of mammalian eye development in a culture dish. The exciting bit is that this includes a particularly important part of eye development: retina formation. The team’s work and findings, published in the journal Nature, could help provide methods for treating blindness.
The retina is a thin piece of tissue at the back of the eye. It is responsible for converting light, which has passed through the lens, into electrical nerve impulses. Thousands of these nerve impulses then travel to the brain, where they are processed to form an image. Critical to this activity are the retina's photoreceptor cells; the loss of these cells is the main cause of untreatable blindness. Without them, light that passes into the eye cannot be converted into the signals required to form an image in the brain.
In 2006, a team of scientists from the UK and USA showed that particular stem cells, when introduced into damaged retinas of mice, could help repair the damage. The stem cells they used were ones that had not quite completed their development into photoreceptor cells. While a drawback to this approach has been the limited availability of these cells, this latest advance in simulating retina formation in a culture dish could overcome this problem. The developing synthetic retinas could provide a ready supply of stem cells at the required stage of development. It is hoped that once appropriate stem cells are introduced into a blind person’s retina, the cells will divide to produce functional photoreceptor cells, restoring the person’s sight.
This article is based on the content presented in a research article in the latest issue of Nature (7th April 2011), titled: ‘Self-organizing optic-cup morphogenesis in three-dimensional culture’. To see the original research article, please visit: http://www.nature.com/nature/journal/v472/n7341/full/nature09941.html
To see some videos of the incredible, and rather beautiful, 3D eye cell cultures, please see the ‘Supplementary information’ section at the same link.
Posted on Tuesday, 1st November, 2011