Information-Based Support Technologies for Persons with Visual Disabilities

Report on IEEE EMBC 2006 Special Session


* Abstract

Special Session Venue

At IEEE EMBC 2006 (the 28th Annual International Conference of the IEEE Engineering in Medicine and Biology Society), held August 30 to September 3 in New York, U.S.A., we organized a special session to present the research activities of the research project "Fundamentals of Welfare Information," funded by a Grant-in-Aid for Scientific Research (Kakenhi) from the Ministry of Education, Japan.

* About the International Conference

IEEE EMBC: IEEE is the world's largest professional association for electrical, electronic, and information technology. It comprises many societies, and the Engineering in Medicine and Biology Society (EMBS) is one of them. Rehabilitation engineering is one of EMBS's areas of activity, and EMBC is its annual international conference.

* About the Special Session

Session Title: Information-Based Support Technologies for Persons with Visual Disabilities

Purpose of organizing the session: Almost every technology, from low-tech to high-tech, can be used to improve the QOL of persons with visual disabilities, as well as persons with other kinds of disabilities. Among those technologies, this special session focuses on information-based technologies. Effective use of such information technologies can compensate for the inconvenience arising from visual impairments. Blind persons and persons with low vision are relieved physically and mentally by acquiring supplementary or alternative information with the help of various tools that assist or substitute for their visual sense. For example, magnifying characters and/or changing their colors is useful for reading by persons with low vision. Appropriate assistance in this magnification or color adjustment becomes feasible when physiological information derived from the user's visual system is fed back to the device. Persons with visual disabilities can also be helped by acquiring alternative information via the auditory and/or tactile sense instead of the visual one. The most popular form of sensory substitution is the use of hearing by blind persons. Another typical method is the use of Braille characters, which consist of six tactile dots and are read with the fingers. Transformation of printed materials or electronic data into speech or Braille/vibration is already in practical use, but many problems remain to be solved to make the information transmission more efficient. Ingenious transformation techniques are necessary for persons with visual disabilities to understand the original meaning more correctly and smoothly. To develop such methods, the visual, auditory, and tactile cognitive characteristics of these persons must be understood thoroughly. In this organized session, the latest results on information-based technologies for assisting the activities of persons with visual disabilities are presented, along with analyses of the visual, auditory, and tactile cognitive characteristics of those persons.

Organizers: Tetsuya Watanabe (National Institute of Special Education) and Michio Miyakawa (Niigata University)

Venue: New York Marriott Marquis Hotel at Times Square, Marquis C, New York City, NY, USA

Session Time: Friday, September 1, 10:45-12:15 (Eastern Time)

* Program

(1) [Invited paper] J. R. Marston (Univ. of California Santa Barbara, USA):
"Using Remote Infrared Audible Cues to Increase Spatial Knowledge Acquisition for Persons Who are Blind or Visually Impaired in a Multimodal Transportation Environment."
(2) M. Miyakawa, Y. Maeda, Y. Miyazawa, and J. Hori (Niigata Univ., Japan):
"A Smart Video Magnifier Controlled by the Visibility Signal of a Low Vision User."
(3) T. Watanabe, S. Oouchi, T. Yamaguchi (Nat. Inst. of Special Education, Japan), M. Shimojo (Univ. of Electro-Communications, Japan) and S. Shimada (Tokyo Metropolitan Industrial Technology Res. Inst.):
"Development of a Measuring System of Contact Force during Braille Reading using an Optical 6-Axis Force Sensor."
(4) T. Nishimoto, S. Sako, S. Sagayama (Univ. of Tokyo, Japan), K. Ohshima, K. Oda, and T. Watanabe (Tokyo Woman's Christian Univ., Japan):
"Effect of Learning on Listening to Ultra-Fast Synthesized Speech."
(5) M. Miyagi, Y. Horiuchi, M. Nishida, and A. Ichikawa (Chiba Univ., Japan):
"Analysis of Prosody in Finger Braille Using Electromyography."

* Abstracts

(1)
Using Remote Infrared Audible Cues to Increase Spatial Knowledge Acquisition for Persons Who are Blind or Visually Impaired in a Multimodal Transportation Environment

James R. Marston (Univ. of California Santa Barbara, USA)

Abstract (presenter: Dr. Marston)
This paper reports on a field test with blind travelers at a large multi-modal train terminal. Participants explored the area and performed transfer-related tasks using either their regular methods of navigation or a form of auditory signage installed in the environment. Those who used the auditory signage exhibited a better understanding of the spatial relationships among locations. They were able to make more shortcuts, and their answers to questions showed that they understood the spatial layout and had a much better cognitive map than those who used their regular methods of navigation and orientation.

(2)
A Smart Video Magnifier Controlled by the Visibility Signal of a Low Vision User

Michio Miyakawa, Yoshinobu Maeda, Youichi Miyazawa, and Junichi Hori (Niigata Univ., Japan)

Abstract (presenter: Mr. Miyazawa)
A smart video magnifier for people with visual disabilities is being developed to assist their stress-free reading. With a video magnifier, the user watches a monitor screen that displays the book page to be read, and reading requires eye movement. The difficulty of character recognition, which depends on the environmental conditions, is reflected in the eye movements. Accordingly, information on the user's visibility is extracted as physiological signals that accompany the gazing motion, and these signals are used to control the video magnifier. The advantages and usefulness of this adaptive video magnifier are discussed in the paper.
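
The abstract does not describe how the visibility signal is extracted or how it drives the magnifier, so the following is only a minimal sketch of the kind of closed-loop control it outlines; the gaze-derived features (fixation duration, regressions), the thresholds, and the function names are all assumptions made for illustration.

```python
# Minimal sketch (assumed names and thresholds) of a closed-loop video
# magnifier: when a "visibility" score derived from the user's eye
# movements drops, magnification is increased, and vice versa.

def visibility_score(fixation_duration_ms: float, regressions: int) -> float:
    """Toy visibility estimate: long fixations and many backward eye
    movements (regressions) are taken as signs of poor legibility."""
    score = 1.0
    score -= min(fixation_duration_ms / 1000.0, 1.0) * 0.5
    score -= min(regressions / 5.0, 1.0) * 0.5
    return max(score, 0.0)

def update_magnification(current_mag: float, score: float,
                         low: float = 0.4, high: float = 0.8,
                         step: float = 0.25, max_mag: float = 8.0) -> float:
    """Simple hysteresis controller: zoom in when visibility is low,
    zoom out again when it is comfortably high."""
    if score < low:
        return min(current_mag + step, max_mag)
    if score > high:
        return max(current_mag - step, 1.0)
    return current_mag

if __name__ == "__main__":
    mag = 2.0
    # Simulated gaze-derived samples: (fixation duration in ms, regressions)
    for duration, regressions in [(250, 0), (600, 3), (900, 5), (300, 1)]:
        mag = update_magnification(mag, visibility_score(duration, regressions))
        print(f"fixation={duration}ms regressions={regressions} -> magnification x{mag}")
```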

(3)
Development of a Measuring System of Contact Force during Braille Reading using an Optical 6-Axis Force Sensor

Tetsuya Watanabe, Susumu Oouchi, Toshimitsu Yamaguchi (Nat. Inst. of Special Education, Japan), Makoto Shimojo (Univ. of Electro-Communications, Japan) and Shigenobu Shimada (Tokyo Metropolitan Industrial Technology Res. Inst.)

Abstract (presenter: Dr. Watanabe)
A system using an optical 6-axis force sensor was developed to measure contact force during Braille reading. In using this system, we dealt with two problems. One is the variability of the output values depending on the contact point, which was solved by using two transformation techniques. The other is that subjects have to read Braille in an irregular manner; we compared two manners of Braille reading, one-handed vs. two-handed, and found only a small reduction in reading speed. With this system we collected data from two Braille readers and obtained contact force trajectories in more quantitative detail than those reported in earlier studies.
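
The two transformation techniques mentioned in the abstract are not specified, so the sketch below only illustrates one standard way to handle contact-point dependence with a 6-axis force/torque sensor: recovering the contact point (center of pressure) from the measured force and torque via M = r × F, so that force values can be compared independently of where the finger touches the reading surface. The function name and the plane-height parameter are assumptions.

```python
import numpy as np

def contact_point_on_plane(F, M, h=0.0):
    """Estimate the contact point (x, y) on a reading surface lying in
    the plane z = h above the sensor origin, from the measured force
    F = (Fx, Fy, Fz) and torque M = (Mx, My, Mz), using M = r x F."""
    Fx, Fy, Fz = F
    Mx, My, Mz = M
    if abs(Fz) < 1e-6:              # no normal load: contact point undefined
        return None
    x = (h * Fx - My) / Fz          # from My = h*Fx - x*Fz
    y = (Mx + h * Fy) / Fz          # from Mx = y*Fz - h*Fy
    return x, y

if __name__ == "__main__":
    # A 0.5 N downward press 10 mm to the right of the sensor centre,
    # on a surface 5 mm above the origin, produces the torque r x F:
    r = np.array([0.010, 0.0, 0.005])
    F = np.array([0.0, 0.0, -0.5])
    M = np.cross(r, F)
    print(contact_point_on_plane(F, M, h=0.005))   # ~ (0.010, 0.0)
```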

(4)
Effect of Learning on Listening to Ultra-Fast Synthesized Speech

Takuya Nishimoto, Shinji Sako, Shigeki Sagayama (The University of Tokyo), Kazue Ohshima, Koichi Oda, and Takayuki Watanabe (Tokyo Woman's Christian University)

Abstract (presenter: Mr. Nishimoto)
A text-to-speech synthesizer that can produce easily understandable voices at very fast speaking rates is expected to help persons with visual disabilities acquire information effectively with screen reading software. We investigated the intelligibility of Japanese text-to-speech systems at fast speaking rates, using four-digit random numbers as the vocabulary of a recall test. We also studied a fast and intelligible text-to-speech engine, using an HMM-based synthesizer trained on a corpus recorded at a fast speaking rate. As a result, the statistical models trained with the fast-speech corpus were effective. The learning effect was significant in the early stage of the trials, and the effect was sustained for several weeks.
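
As a rough illustration of the recall test mentioned above (not the authors' actual protocol or scoring), the sketch below generates four-digit random-number stimuli and computes a simple whole-number recall accuracy; the function names and the scoring rule are assumptions.

```python
import random

def make_stimuli(n, seed=0):
    """Generate n four-digit random numbers (0000-9999) to be synthesized
    by the TTS engine at the speaking rate under test."""
    rng = random.Random(seed)
    return [f"{rng.randrange(10000):04d}" for _ in range(n)]

def recall_accuracy(presented, answered):
    """Fraction of stimuli recalled exactly (whole-number scoring)."""
    correct = sum(p == a for p, a in zip(presented, answered))
    return correct / len(presented)

if __name__ == "__main__":
    stimuli = make_stimuli(10)
    # In the real experiment each number is played back at an ultra-fast
    # rate and the listener writes down what was heard; here the responses
    # are faked just to exercise the scoring.
    responses = stimuli[:8] + ["0000", "1111"]
    print(stimuli)
    print("recall accuracy:", recall_accuracy(stimuli, responses))
```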

(5)
Analysis of Prosody in Finger Braille Using Electromyography

Manabi Miyagi, Masafumi Nishida, Yasuo Horiuchi, and Akira Ichikawa (Chiba Univ., Japan)

Abstract (presenter: Ms. Miyagi)
Finger Braille is one of the communication methods for the deafblind: an interpreter types Braille codes on the fingers of the deafblind person. Finger Braille appears to be the medium best suited to real-time communication because of the quickness and accuracy with which it transmits characters. We hypothesize that prosodic information exists in the time structure and strength of finger Braille typing. Prosody is the non-linguistic information that conveys sentence structure, prominence, emotions, and other kinds of information in real-time communication. In this study, we measured the surface electromyography (sEMG) of finger movement to analyze the strength of finger Braille typing. We found that the strength of finger movement increases at the beginning of a phrase and in a prominent phrase. The results show the possibility that the prosody carried in the strength of finger Braille typing can be applied to an interpreting system for the deafblind.
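
The abstract does not say how typing strength was derived from the sEMG recordings, so the following is only a generic sketch of a common amplitude-envelope estimate (DC removal, full-wave rectification, low-pass filtering); the sampling rate, cutoff frequency, and the synthetic test signal are assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def semg_envelope(raw, fs=1000.0, cutoff=5.0):
    """Estimate typing strength from a raw sEMG trace: remove the DC
    offset, full-wave rectify, then low-pass filter to obtain a smooth
    amplitude envelope."""
    x = np.asarray(raw, dtype=float)
    x = x - x.mean()                          # remove baseline offset
    rectified = np.abs(x)                     # full-wave rectification
    b, a = butter(4, cutoff, btype="low", fs=fs)
    return filtfilt(b, a, rectified)          # zero-phase smoothing

if __name__ == "__main__":
    fs = 1000.0
    t = np.arange(0, 2.0, 1.0 / fs)
    rng = np.random.default_rng(0)

    def burst(center, amp):
        # Noise modulated by a Gaussian window, a crude stand-in for
        # the sEMG burst of a single keystroke.
        return amp * np.exp(-((t - center) ** 2) / 0.002) * rng.standard_normal(t.size)

    # A stronger (e.g. phrase-initial) keystroke at 0.5 s, a weaker one at 1.5 s.
    raw = burst(0.5, 3.0) + burst(1.5, 1.0) + 0.05 * rng.standard_normal(t.size)
    env = semg_envelope(raw, fs)
    print("envelope peak near 0.5 s:", round(env[400:600].max(), 3))
    print("envelope peak near 1.5 s:", round(env[1400:1600].max(), 3))
```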

* How the session went

EMBC 2006 featured over 1,700 presentations and 15 mini-symposia. Because of this scale, the conference ran for as long as five days.

About 20 people attended our session. Of these, seven were members of the priority-area research project, and five or six were non-Japanese.

At the beginning of the session, Prof. Miyakawa gave an opening address and thanked the conference organizers for providing the session. The chair then passed to Watanabe, who introduced the invited speaker, James Marston, and asked him to begin his talk. Mr. Marston presented experimental results on RIAS, an infrared-based guidance support system for persons with visual disabilities. RIAS is a product developed at the Smith-Kettlewell Eye Research Institute, consisting of information tags installed in a facility and a receiver carried by the user; when the user approaches a tag, the receiver announces by voice what is located there. (We later heard from people involved that the device is actually made in Japan and is the same as the one developed by Mitsubishi Precision under the name Talking Signs.) Using this product, an experiment was conducted in which some twenty visually impaired participants walked through a train station, and the effectiveness of the system was argued from their task performance. A lively discussion followed, with questions on the range of the infrared signal, the behavior of the system when multiple signals are received, the effects of infrared light on the human body, and the advantages over tactile maps.

After that, the research groups presented their results in the following order: the low vision group (Mr. Miyazawa, Niigata Univ.), the Braille group (Watanabe, National Institute of Special Education), the Kiki group (Mr. Nishimoto, Univ. of Tokyo), and the deafblind group (Ms. Miyagi, Chiba Univ.). The Braille group was asked what the measurement results would be used for and whether, based on them, readers should be encouraged to read Braille with a stronger contact force. The Kiki group was asked about the mechanism by which listeners become accustomed even to very fast speech. The deafblind group was asked how widespread finger Braille is around the world and whether it is taught at schools for the blind in Japan.

Since we were able to convey the content of the priority-area research project to a non-Japanese audience and hear their comments and questions, our primary goal of disseminating the research results abroad was achieved. We also fulfilled our goals of gathering information on the same field overseas and exchanging views with researchers from other countries. However, attendance at the session was by no means large, and several points remain to be kept in mind when organizing sessions at international conferences in the future: the balance between Japanese and non-Japanese presenters, the long time it took until the session was finalized, and the consequent delay in publicizing the session.


Left: the audience in the venue. Middle: presenters and co-authors from Japan. Right: sushi at the Marriott Hotel.

* Impression

Presentations in English by Japanese researchers pose no problem as long as the talk itself is well prepared. However, catching questions from the audience and answering them precisely remain significant challenges. It is also rare for Japanese participants to ask questions at international conferences. This is by no means limited to EMBC, but I feel that we need to overcome these issues in order to promote Japan's advanced research and development in the field of assistive technology more strongly to the world.


Last updated: August 27, 2007
Copyright (C) 2006-2007 Tetsuya WATANABE