
Hearing is the ability to perceive sound by detecting vibrations, changes in the pressure of the surrounding medium over time, through an organ such as the ear. The word can also refer to the range within which a voice can be heard, the act of listening attentively, or the receiving of a communication from someone.

How to Listen Effectively when Others Speak


Ear Parts

Auditory System is the sensory system for the sense of hearing. It includes both the sensory organs (the ears) and the auditory parts of the sensory system.

Ear is the organ of hearing and, in mammals, balance. In mammals, the ear is usually described as having three parts—the outer ear, middle ear and the inner ear. The outer ear consists of the pinna and the ear canal. Since the outer ear is the only visible portion of the ear in most animals, the word "ear" often refers to the external part alone. The middle ear includes the tympanic cavity and the three ossicles. The inner ear sits in the bony labyrinth, and contains structures which are key to several senses: the semicircular canals, which enable balance and eye tracking when moving; the utricle and saccule, which enable balance when stationary; and the cochlea, which enables hearing. The ears of vertebrates are placed somewhat symmetrically on either side of the head, an arrangement that aids sound localisation.

Cochlea is the auditory portion of the inner ear. It is a spiral-shaped cavity in the bony labyrinth, in humans making 2.5 turns around its axis, the modiolus. A core component of the cochlea is the Organ of Corti, the sensory organ of hearing, which is distributed along the partition separating fluid chambers in the coiled tapered tube of the cochlea. The name is derived from the Latin word for snail shell, which in turn is from the Greek κοχλίας kokhlias ("snail, screw"), from κόχλος kokhlos ("spiral shell") in reference to its coiled shape; the cochlea is coiled in mammals with the exception of monotremes.

Saccule is a bed of sensory cells situated in the inner ear. The saccule translates head movements into neural impulses which the brain can interpret. The saccule detects linear accelerations and head tilts in the vertical plane. When the head moves vertically, the sensory cells of the saccule are disturbed and the neurons connected to them begin transmitting impulses to the brain. These impulses travel along the vestibular portion of the eighth cranial nerve to the vestibular nuclei in the brainstem. The vestibular system is important in maintaining balance, or equilibrium. The vestibular system includes the saccule, utricle, and the three semicircular canals. The vestibule is the name of the fluid-filled, membranous duct that contains these organs of balance. The vestibule is encased in the temporal bone of the skull.

Dizziness - Fainting - Head Spins

Vertigo is when a person feels as if they or the objects around them are moving when they are not. Often it feels like a spinning or swaying movement. This may be associated with nausea, vomiting, sweating, or difficulties walking. It is typically worsened when the head is moved. Vertigo is the most common type of dizziness.

Vestibular System is the sensory system that provides the leading contribution to the sense of balance and spatial orientation for the purpose of coordinating movement with balance. Together with the cochlea, a part of the auditory system, it constitutes the labyrinth of the inner ear in most mammals, situated in the vestibulum in the inner ear. As movements consist of rotations and translations, the vestibular system comprises two components: the semicircular canal system, which indicates rotational movements; and the otoliths, which indicate linear accelerations. The vestibular system sends signals primarily to the neural structures that control eye movements, and to the muscles that keep an animal upright. The projections to the former provide the anatomical basis of the vestibulo-ocular reflex, which is required for clear vision; and the projections to the muscles that control posture are necessary to keep an animal upright. The brain uses information from the vestibular system in the head and from proprioception throughout the body to understand the body's dynamics and kinematics (including its position and acceleration) from moment to moment.

Audiology is a branch of science that studies hearing, balance, and related disorders. Its practitioners, who treat those with hearing loss and proactively prevent related damage, are audiologists. Employing various testing strategies (e.g. hearing tests, otoacoustic emission measurements, videonystagmography, and electrophysiologic tests), audiology aims to determine whether someone can hear within the normal range, and if not, which portions of hearing (high, middle, or low frequencies) are affected, to what degree, and where the lesion causing the hearing loss is found (outer ear, middle ear, inner ear, auditory nerve and/or central nervous system). If an audiologist determines that a hearing loss or vestibular abnormality is present, he or she will provide recommendations to a patient as to what options (e.g. hearing aids, cochlear implants, appropriate medical referrals) may be of assistance. In addition to testing hearing, audiologists can also work with a wide range of clientele in rehabilitation (individuals with tinnitus, auditory processing disorders, cochlear implant users and/or hearing aid users), from pediatric populations to veterans, and may perform assessment of tinnitus and the vestibular system.
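A hearing test measures the quietest level (in dB HL) a person can hear at each test frequency, and the results are summarized as a degree of hearing loss. The sketch below illustrates that idea using the commonly cited Clark (1981) category boundaries; the exact cutoff values and test frequencies are assumptions, since clinical practice varies.

```python
# Hypothetical sketch: summarizing pure-tone audiometry results.
# Boundary values follow one commonly cited scale and are assumptions,
# not a clinical standard for every audiologist.

def pure_tone_average(thresholds_db_hl):
    """Average the measured thresholds (dB HL), typically taken
    at 500, 1000, and 2000 Hz."""
    return sum(thresholds_db_hl) / len(thresholds_db_hl)

def degree_of_hearing_loss(pta):
    """Map a pure-tone average (dB HL) to a descriptive category."""
    if pta <= 25:
        return "normal"
    if pta <= 40:
        return "mild"
    if pta <= 55:
        return "moderate"
    if pta <= 70:
        return "moderately severe"
    if pta <= 90:
        return "severe"
    return "profound"

pta = pure_tone_average([30, 45, 60])  # thresholds at 500/1000/2000 Hz
print(pta)                             # 45.0
print(degree_of_hearing_loss(pta))     # moderate
```

A real audiogram also distinguishes air conduction from bone conduction, which is how an audiologist locates whether the problem lies in the outer/middle ear or the inner ear.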

Hearing Aids

Hearing Aid is a device designed to improve hearing. Hearing aids are classified as medical devices in most countries and regulated accordingly. Small audio amplifiers such as PSAPs (personal sound amplification products) or other plain sound-reinforcing systems cannot be sold as "hearing aids".

Cochlear Implant is a surgically implanted electronic device that provides a sense of sound to a person who is profoundly deaf or severely hard of hearing in both ears; as of 2014 they had been used experimentally in some people who had acquired deafness in one ear after learning how to speak. Cochlear implants bypass the normal hearing process; they have a microphone and some electronics that reside outside the skin, generally behind the ear, which transmit a signal to an array of electrodes placed in the cochlea, which stimulate the cochlear nerve.

Auditory Brainstem Implant is a surgically implanted electronic device that provides a sense of sound to a person who is profoundly deaf, due to retrocochlear hearing impairment (due to illness or injury damaging the cochlea or auditory nerve, and so precluding the use of a cochlear implant).

ProSounds H2P: Get Up To 6x Normal Hearing Enhancement with Simultaneous Hearing Protection
Muzo - Personal Zone Creator with Noise Blocking Tech
AMPSound Personal Bluetooth Stereo Amplifiers & Earbuds Hearing Aid
Experiencing Music through a Cochlear Implant
Bone Conduction
BLU - World’s Most Versatile Hearing Glasses
Olive: Next-Gen Hearing Aid
Sleeping Aids


Hearing Test to Check Hearing Loss
Hearing Check
Online Hearing Test
National Hearing Test

Dichotic Listening Test is a psychological test commonly used to investigate selective attention within the auditory system and is a subtopic of cognitive psychology and neuroscience. Specifically, it is used as a behavioral test for hemispheric lateralization of speech sound perception. During a standard dichotic listening test, a participant is presented with two different auditory stimuli simultaneously (usually speech), with each stimulus directed into a different ear over headphones. In classic studies, participants were instructed to repeat aloud the words they heard in one ear while a different message was presented to the other ear. Because they were focused on repeating those words, participants noticed little of the message in the other ear, often not even realizing that at some point it changed from English to German. At the same time, participants did notice when the voice in the unattended ear changed from a male's to a female's, suggesting that the selectivity of consciousness can work to tune in some information.

Multisensory Integration is the study of how information from the different sensory modalities, such as sight, sound, touch, smell, self-motion and taste, may be integrated by the nervous system. A coherent representation of objects combining modalities enables us to have meaningful perceptual experiences. Indeed, multisensory integration is central to adaptive behavior because it allows us to perceive a world of coherent perceptual entities. Multisensory integration also deals with how different sensory modalities interact with one another and alter each other's processing.

Hearing Impairment - Deafness

Hearing Impairment or Hearing Loss is a partial or total inability to hear. A deaf person has little to no hearing. Hearing loss may occur in one or both ears. In children, hearing problems can affect the ability to learn spoken language, and in adults they can cause work-related difficulties. In some people, particularly older people, hearing loss can result in loneliness. Hearing loss can be temporary or permanent.

American Society for Deaf Children
Deaf Professional Arts Network

"People with Disabilities are like Gifts from God. We learn just as much about people with disabilities as we do ourselves. Sometimes they help us realize our own disabilities, ones that we never knew we had."

Auditory Processing Disorder is an umbrella term for a variety of disorders that affect the way the brain processes auditory information. Individuals with APD usually have normal structure and function of the outer, middle and inner ear (peripheral hearing). However, they cannot process the information they hear in the same way as others do, which leads to difficulties in recognizing and interpreting sounds, especially the sounds composing speech. It is thought that these difficulties arise from dysfunction in the central nervous system.

Auditory Scene Analysis is a proposed model for the basis of auditory perception. This is understood as the process by which the human auditory system organizes sound into perceptually meaningful elements. The term was coined by psychologist Albert Bregman. The related concept in machine perception is computational auditory scene analysis (CASA), which is closely related to source separation and blind signal separation. The three key aspects of Bregman's ASA model are: segmentation, integration, and segregation.

Spatial Hearing Loss refers to a form of deafness that is an inability to use spatial cues about where a sound originates from in space. This in turn impacts upon the ability to understand speech in the presence of background noise.

Auditory Verbal Agnosia also known as pure word deafness, is the inability to comprehend speech. Individuals with this disorder lose the ability to understand language, repeat words, and write from dictation. However, spontaneous speaking, reading, and writing are preserved. Individuals who exhibit pure word deafness are also still able to recognize non-verbal sounds. Sometimes, this agnosia is preceded by cortical deafness; however, this is not always the case. Researchers have documented that in most patients exhibiting auditory verbal agnosia, the discrimination of consonants is more difficult than that of vowels, but as with most neurological disorders, there is variation among patients.

Hearing Loss and Cognitive Decline in Older Adults

Hyperacusis is a health condition characterized by an increased sensitivity to certain frequency and volume ranges of sound (a collapsed tolerance to usual environmental sound). A person with severe hyperacusis has difficulty tolerating everyday sounds, some of which may seem unpleasantly or painfully loud to that person but not to others.

Usher Syndrome is an extremely rare genetic disorder caused by a mutation in any one of at least 11 genes, resulting in a combination of hearing loss and visual impairment. It is a leading cause of deafblindness and is at present incurable.

Hearing through the Tongue

Sensory Deprivation is the deliberate reduction or removal of stimuli from one or more of the senses. Simple devices such as blindfolds or hoods and earmuffs can cut off sight and hearing, while more complex devices can also cut off the sense of smell, touch, taste, thermoception (heat-sense), and 'gravity'. Sensory deprivation has been used in various alternative medicines and in psychological experiments (e.g. with an isolation tank). Short-term sessions of sensory deprivation are described as relaxing and conducive to meditation; however, extended or forced sensory deprivation can result in extreme anxiety, hallucinations, bizarre thoughts, and depression. A related phenomenon is perceptual deprivation, also called the ganzfeld effect. In this case a constant uniform stimulus is used instead of attempting to remove the stimuli; this leads to effects which have similarities to sensory deprivation. Sensory deprivation techniques were developed by some of the armed forces within NATO, as a means of interrogating prisoners within international treaty obligations. The European Court of Human Rights ruled that the use of the five techniques by British security forces in Northern Ireland amounted to a practice of inhumane and degrading treatment.

Spatial Intelligence

Special Needs

Sonar is a technique that uses sound propagation (usually underwater, as in submarine navigation) to navigate, communicate with or detect objects on or under the surface of the water, such as other vessels. Two types of technology share the name "sonar": passive sonar is essentially listening for the sound made by vessels; active sonar is emitting pulses of sounds and listening for echoes. Sonar may be used as a means of acoustic location and of measurement of the echo characteristics of "targets" in the water. Acoustic location in air was used before the introduction of radar. Sonar may also be used in air for robot navigation, and SODAR (an upward looking in-air sonar) is used for atmospheric investigations. The term sonar is also used for the equipment used to generate and receive the sound. The acoustic frequencies used in sonar systems vary from very low (infrasonic) to extremely high (ultrasonic). The study of underwater sound is known as underwater acoustics or hydroacoustics.
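Active sonar ranging follows from simple arithmetic: the emitted pulse travels to the target and back, so the distance is the speed of sound multiplied by the round-trip time, divided by two. A minimal sketch (the seawater sound speed used here is a typical approximate value, not a universal constant):

```python
# Active-sonar echo ranging: range = (sound speed * round-trip time) / 2.
# 1500 m/s is a commonly used approximation for seawater; the true value
# varies with temperature, salinity, and depth.
SPEED_OF_SOUND_SEAWATER = 1500.0  # meters per second

def echo_range(round_trip_seconds, speed=SPEED_OF_SOUND_SEAWATER):
    """Distance to the target, given the measured echo delay.
    Divide by 2 because the pulse travels out and back."""
    return speed * round_trip_seconds / 2.0

print(echo_range(2.0))              # 1500.0 m: echo heard 2 s after the ping
print(echo_range(1.0, speed=343.0)) # 171.5 m: same idea for in-air sonar
```

The same time-of-flight principle underlies in-air robot navigation sonar and SODAR, only with the much lower speed of sound in air.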

Photoacoustic Imaging is a biomedical imaging modality based on the photoacoustic effect. In photoacoustic imaging, non-ionizing laser pulses are delivered into biological tissues (when radio frequency pulses are used, the technology is referred to as thermoacoustic imaging). Some of the delivered energy will be absorbed and converted into heat, leading to transient thermoelastic expansion and thus wideband (i.e. MHz) ultrasonic emission. The generated ultrasonic waves are detected by ultrasonic transducers and then analyzed to produce images. It is known that optical absorption is closely associated with physiological properties, such as hemoglobin concentration and oxygen saturation. As a result, the magnitude of the ultrasonic emission (i.e. photoacoustic signal), which is proportional to the local energy deposition, reveals physiologically specific optical absorption contrast. 2D or 3D images of the targeted areas can then be formed. Fig. 1 is a schematic illustration showing the basic principles of photoacoustic imaging.

Thermoacoustic Imaging is a strategy for studying the absorption properties of human tissue using virtually any kind of electromagnetic radiation. Alexander Graham Bell first reported the physical principle upon which thermoacoustic imaging is based more than a century ago, observing that audible sound could be created by shining an intermittent beam of sunlight onto a rubber sheet.

Sign Language

Sign Language is a language which chiefly uses manual communication to convey meaning, as opposed to acoustically conveyed sound patterns. This can involve simultaneously combining hand shapes, orientation and movement of the hands, arms or body, and facial expressions to express a speaker's thoughts. Sign languages share many similarities with spoken languages (sometimes called "oral languages"), which depend primarily on sound, and linguists consider both to be types of natural language. Although there are some significant differences between signed and spoken languages, such as how they use space grammatically, sign languages show the same linguistic properties and use the same language faculty as do spoken languages. They should not be confused with body language, which is a kind of non-linguistic communication.

American Sign Language is the predominant sign language of Deaf communities in the United States and most of anglophone Canada. ASL signs have a number of phonemic components, including movement of the face and torso as well as the hands. ASL is not a form of pantomime, but iconicity does play a larger role in ASL than in spoken languages. English loan words are often borrowed through fingerspelling, although ASL grammar is unrelated to that of English. ASL has verbal agreement and aspectual marking and has a productive system of forming agglutinative classifiers. Many linguists believe ASL to be a subject–verb–object (SVO) language, but there are several alternative proposals to account for ASL word order.

American Sign Language (youtube)

ASL users amount to about 1% of the 231 million English speakers in America.

List of Sign Languages. There are perhaps three hundred sign languages in use around the world today.

SignAloud: Gloves Translate Sign Language into Text and Speech (youtube)
Christine Sun Kim: The enchanting music of sign language (video)
Uni, 1st Sign Language to Voice System
Start American Sign Language

Hand Gestures (people smart)
American Hand Gestures in Different Cultures (youtube)
7 Ways to Get Yourself in Trouble Abroad (youtube)
Song in Sign Language, Eminem Lose Yourself (youtube)
Hand Symbols

Tactical Hand Signals (Info-Graph)

vl2 Story Book Apps
Bilingual Interfaces through Visual Narratives

Life Print Sign Language

Lip Reading is a technique of understanding speech by visually interpreting the movements of the lips, face and tongue when normal sound is not available. It also relies on information provided by the context, knowledge of the language, and any residual hearing. Lip-reading is not easy: although primarily used by deaf and hard-of-hearing people, most people with normal hearing process some speech information from sight of the moving mouth.
Viseme is any of several speech sounds that look the same, for example when lip reading.

Al-Sayyid Bedouin Sign Language
Voices from El-Sayed (youtube)

American Speech-Language-Hearing Association
Speech and Language Center
Learning Specialist

Speech Recognition - Voice Activated

Sonitus Medical SoundBite Hearing System

Manually Coded English is a variety of visual communication methods expressed through the hands which attempt to represent the English language. Unlike deaf sign languages which have evolved naturally in deaf communities, the different forms of MCE were artificially created, and generally follow the grammar of English.
Morse Code - Codes

Dragon Notes
Dragon Dictation

Smartphone Technologies

Closed Captioning and subtitling are both processes of displaying text on a television, video screen, or other visual display to provide additional or interpretive information. Both are typically used as a transcription of the audio portion of a program as it occurs (either verbatim or in edited form), sometimes including descriptions of non-speech elements. They are also used to provide a textual translation of a presentation's primary audio language into an alternative language, usually burned-in (or "open") to the video and unselectable.
Closed Caption Software
Closed Captioning Jobs
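Caption text is usually stored as timed cues: a start time, an end time, and the text to display. The sketch below parses the widely used SubRip (.srt) format, where each cue is a numbered block with a "start --> end" timestamp line; it is a minimal illustration, not a full-featured caption parser.

```python
# Minimal sketch of parsing SubRip (.srt) caption text, a common format
# for the timed text that closed captioning and subtitling display.
import re

TS_RE = re.compile(r"(\d+):(\d+):(\d+),(\d+)")  # HH:MM:SS,mmm

def parse_timestamp(ts):
    """Convert 'HH:MM:SS,mmm' into seconds."""
    h, m, s, ms = map(int, TS_RE.match(ts).groups())
    return h * 3600 + m * 60 + s + ms / 1000.0

def parse_srt(text):
    """Return (start_seconds, end_seconds, caption_text) for each cue."""
    cues = []
    for block in text.strip().split("\n\n"):
        lines = block.strip().splitlines()
        if len(lines) < 3:
            continue  # skip malformed blocks
        start, _, end = lines[1].partition(" --> ")
        cues.append((parse_timestamp(start), parse_timestamp(end),
                     "\n".join(lines[2:])))
    return cues

sample = """1
00:00:01,000 --> 00:00:03,500
[door creaks]

2
00:00:04,000 --> 00:00:06,000
Hello, world."""
print(parse_srt(sample)[0])  # (1.0, 3.5, '[door creaks]')
```

Note the first cue, "[door creaks]": closed captions, unlike plain subtitles, include such descriptions of non-speech sounds for viewers who cannot hear them.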

Auditory Training
Sound - Music
Learning Styles

Visual Language is a system of communication using visual elements. Speech as a means of communication cannot strictly be separated from the whole of human communicative activity, which includes the visual, and the term 'language' in relation to vision is an extension of its use to describe the perception, comprehension and production of visible signs.

Visual Language
Visual Learning
Visual Perception (spatial intelligence)
Body Language

Communication Boards
Communication Assistance Board with Images

Communication Board with Images

Boardmaker Software
Alphabet Card (PDF)
Place Mat (PDF)

Augmentative & Alternative Communication devices (AAC)

Facilitated Communication is a discredited technique used by some caregivers and educators in an attempt to assist people with severe educational and communication disabilities. The technique involves providing an alphabet board, or keyboard. The facilitator holds or gently touches the disabled person's arm or hand during this process and attempts to help them move their hand and amplify their gestures. In addition to providing physical support needed for typing or pointing, the facilitator provides verbal prompts and moral support. In addition to human touch assistance, the facilitator's belief in their communication partner's ability to communicate seems to be a key component of the technique.

Ways to Communicate with a Non-Verbal Child
Communication Devices for Children who Can't Talk
Communication Boards
Speech and Communication
Communication Needs

World's First Touch Enabled T-shirt BROADCAST: a programmable LED t-shirt that can display any slogan you want using your smartphone.

Software turns Webcams into Eye-Trackers

Communication Symbols Board

Noise Pollution Map of the U.S.

Noise Pollution Map of America
Sound Pollution Tools

The Thinker Man