Updated on 2025/05/03


 
TAKAHASHI Satoshi
 
Organization
Faculty of Interdisciplinary Science and Engineering in Health Systems
Position
Associate Professor

Degree

  • Doctor (Engineering) (Okayama University)

Research Interests

  • Rehabilitation

  • Human error

  • Binocular stereopsis

Research Areas

  • Life Science / Medical systems

  • Life Science / Biomedical engineering

  • Life Science / Medical assistive technology

Research History

  • Faculty of Interdisciplinary Science and Engineering in Health Systems, Okayama University   Associate Professor

    2021.4

  • Graduate School of Interdisciplinary Science and Engineering in Health Systems, Okayama University   Associate Professor

    2018.4

Professional Memberships


Committee Memberships

  • The Society of Instrument and Control Engineers (SICE)   Session chair (3C2-1 to 3C2-6), 25th Symposium on Biological and Physiological Engineering

    2010

    Committee type: Academic society

  • The Society of Instrument and Control Engineers (SICE)   Secretary, Chugoku Chapter

    2009 - 2011

    Committee type: Academic society

  • The Japan Society of Mechanical Engineers (JSME)   Executive committee member, 22nd Bioengineering Conference

    2009

    Committee type: Academic society

 

Papers

  • Effect of Aging on the Human Kinetic Visual Field

    S. Takahashi, Z. Xu, M. Tanida, J. Wu

    Neuroscience and Biomedical Engineering   4(1)   50-56   2016


  • Relative Position of the Fingers Affects Length Perception while Grasping Objects

    S. Takahashi, Y. Ren, H. Wang, N. Kitayama, Z. Wu, J. Wu

    Neuroscience and Biomedical Engineering   4(1)   67-74   2016


  • Effects of Sound Frequency on Audiovisual Integration: An Event-Related Potential Study

    Weiping Yang, Jingjing Yang, Yulin Gao, Xiaoyu Tang, Yanna Ren, Satoshi Takahashi, Jinglong Wu

    PLOS ONE   10(9)   1-15   2015.9

    Language: English   Publishing type: Research paper (scientific journal)   Publisher: Public Library of Science

    A combination of signals across modalities can facilitate sensory perception. The audiovisual facilitative effect strongly depends on the features of the stimulus. Here, we investigated how sound frequency, one of the basic features of an auditory signal, modulates audiovisual integration. In this study, the task of the participant was to respond to a visual target stimulus by pressing a key while ignoring auditory stimuli comprising tones of different frequencies (0.5, 1, 2.5 and 5 kHz). A significant facilitation of reaction times was obtained following audiovisual stimulation, irrespective of whether the task-irrelevant sounds were low or high frequency. Using event-related potentials (ERPs), audiovisual integration was found over the occipital area for 0.5 kHz auditory stimuli from 190-210 ms, for 1 kHz stimuli from 170-200 ms, for 2.5 kHz stimuli from 140-200 ms, and for 5 kHz stimuli from 100-200 ms. These findings suggest that a higher-frequency sound signal paired with visual stimuli might be processed or integrated earlier, despite the auditory stimuli being task-irrelevant information. Furthermore, audiovisual integration in late-latency (300-340 ms) ERPs with a fronto-central topography was found for auditory stimuli of lower frequencies (0.5, 1 and 2.5 kHz). Our results confirmed that audiovisual integration is affected by the frequency of an auditory stimulus. Taken together, the neurophysiological results provide unique insight into how the brain processes a multisensory visual signal and auditory stimuli of different frequencies.

    (The additive-model comparison commonly used in such audiovisual ERP studies is sketched after the paper list.)

    DOI: 10.1371/journal.pone.0138296

  • Effects of Auditory Stimuli in the Horizontal Plane on Audiovisual Integration: An Event-Related Potential Study

    Weiping Yang, Qi Li, Tatsuya Ochi, Jingjing Yang, Yulin Gao, Xiaoyu Tang, Satoshi Takahashi, Jinglong Wu

    PLOS ONE   8(6)   e66402   2013.6

    Language: English   Publishing type: Research paper (scientific journal)   Publisher: Public Library of Science

    This article investigates whether auditory stimuli in the horizontal plane, particularly those originating from behind the participant, affect audiovisual integration, using behavioral and event-related potential (ERP) measurements. In this study, visual stimuli were presented directly in front of the participants; auditory stimuli were presented at one location in an equidistant horizontal plane at the front (0 degrees, the fixation point), right (90 degrees), back (180 degrees), or left (270 degrees) of the participants; and audiovisual stimuli comprising a visual stimulus together with an auditory stimulus from one of the four locations were presented simultaneously. These stimuli were presented randomly with equal probability, and participants were asked to attend to the visual stimulus and respond promptly only to visual target stimuli (a unimodal visual target stimulus or the visual target of the audiovisual stimulus). A significant facilitation of reaction times and hit rates was obtained following audiovisual stimulation, irrespective of whether the auditory stimuli were presented in front of or behind the participant. However, no significant interactions were found between visual stimuli and auditory stimuli from the right or left. Two main ERP components related to audiovisual integration were found: first, auditory stimuli from the front location produced an ERP reaction over the right temporal and right occipital areas at approximately 160-200 milliseconds; second, auditory stimuli from the back produced a reaction over the parietal and occipital areas at approximately 360-400 milliseconds. Our results confirmed that audiovisual integration was elicited even when auditory stimuli were presented behind the participant, but no integration occurred when auditory stimuli were presented in the right or left spaces, suggesting that the human brain might be more sensitive to information received from behind than from either side.

    DOI: 10.1371/journal.pone.0066402

  • Development of a method to present wide-view visual stimuli in MRI for peripheral visual studies

    Jinglong Wu, Bin Wang, JiaJia Yang, Yuu Hikino, Satoshi Takahashi, Tianyi Yan, Seiichiro Ohno, Susumu Kanazawa

    Journal of Neuroscience Methods   214(2)   126-136   2013.4

    Language: English   Publishing type: Research paper (scientific journal)   Publisher: Elsevier Science BV

    We developed a novel wide-view visual presentation system for fMRI studies. Computer-generated images were projected onto a hemispheric, translucent screen inside the MRI bore and were then back-projected onto a 52 mm diameter screen. To achieve a wide field of view, a spherical screen with a curvature radius of 30 mm was placed 30 mm away from the subjects' eyes. The subjects wore contact lenses that enabled them to focus on the screen, and the resulting visual field reached 120 degrees. To evaluate the clarity and quality of the MRI images, a signal-to-noise ratio evaluation experiment was performed. In addition, we successfully applied this visual presentation system to studies of visual retinotopic mapping and of object perception in the peripheral visual field. Our study demonstrated that the system is compatible with the MRI environment. Based on the wide-field mapping results, this system was more effective at mapping checkerboard stimuli in V1-V3 from the central to the peripheral visual field. In higher-level visual areas, we successfully located several classical category-selective areas, including the face-selective area (FFA), occipital face area (OFA), house-selective area (PPA), transverse occipital sulcus (TOS), lateral occipital complex (LOC) and posterior fusiform area (pFs). In these areas, we found that the response amplitudes exhibited different decreasing trends with increasing eccentricity. In conclusion, we developed a simple, effective method for presenting wide-view visual stimuli within the MRI environment that can be applied to many kinds of fMRI studies of peripheral vision.

    DOI: 10.1016/j.jneumeth.2013.01.021
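As a rough check on the geometry reported in the wide-view study above, a flat-screen approximation already shows most of the gain in visual angle obtained by bringing the display close to the eye; the spherical curvature of the screen then wraps the image further toward the reported 120 degrees. Only the 52 mm aperture and the 30 mm viewing distance come from the abstract; the conventional-setup numbers below are assumed for comparison, so this is an illustrative sketch rather than a reproduction of the paper's optics.

```python
import math

def visual_angle_deg(width_mm: float, distance_mm: float) -> float:
    """Full visual angle (degrees) of a flat screen of the given width
    viewed from the given distance: 2 * arctan(width / (2 * distance))."""
    return math.degrees(2 * math.atan(width_mm / (2 * distance_mm)))

# A conventional flat projection screen far down the bore
# (width and distance are assumed values, not from the paper):
print(round(visual_angle_deg(300, 1000), 1))   # about 17.1 degrees

# The 52 mm aperture viewed from 30 mm, still treated as flat
# (these two figures are taken from the abstract):
print(round(visual_angle_deg(52, 30), 1))      # about 81.8 degrees
```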

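The two PLOS ONE studies above report audiovisual integration in specific time windows and scalp regions. A standard way to quantify this, consistent with the contrasts described in the abstracts, is the additive model: the ERP to the audiovisual stimulus is compared with the sum of the unimodal auditory and visual ERPs. The sketch below illustrates only that comparison; the sampling rate, channel count, channel indices, and data arrays are illustrative assumptions, not values or code from the papers.

```python
import numpy as np

# Minimal sketch of the additive-model comparison: audiovisual (AV)
# integration is inferred where ERP(AV) differs from ERP(A) + ERP(V).
# Sampling rate, channel indices and the data arrays below are
# illustrative placeholders, not data from the papers.

fs = 500                                    # assumed sampling rate (Hz)
t = np.arange(-0.1, 0.5, 1 / fs)            # epoch: -100 ms to +500 ms

rng = np.random.default_rng(0)
erp_av = rng.standard_normal((64, t.size))  # trial-averaged AV ERPs
erp_a = rng.standard_normal((64, t.size))   # auditory-only ERPs
erp_v = rng.standard_normal((64, t.size))   # visual-only ERPs

# Difference wave per channel and time point: AV - (A + V)
diff = erp_av - (erp_a + erp_v)

# Mean difference over an (illustrative) occipital channel subset in a
# 140-200 ms window, the kind of region and window reported above.
occipital = [60, 61, 62, 63]
window = (t >= 0.14) & (t <= 0.20)
print(diff[np.ix_(occipital, np.flatnonzero(window))].mean())
```

A full analysis would then test this difference statistically across electrodes and time windows; the sketch stops at the raw difference wave.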

MISC

Presentations

  • Effect of image clarity on binocular stereopsis

    矢野雄大, 高橋智, 江島義道

    The Vision Society of Japan, 2025 Winter Meeting  2025.1.24

    Event date: 2025.1.22 - 2025.1.24

    Language: Japanese   Presentation type: Poster presentation

  • Event-related potential analysis of the pattern dependence of spatial context effects

    江原 光祐, 楊 家家, 高橋 智, 江島 義道, 呉 景龍

    The Vision Society of Japan, 2024 Winter Meeting  2024.1.19

    Event date: 2024.1.17 - 2024.1.19

    Language: Japanese   Presentation type: Oral presentation (general)

  • Effect of prior linguistic information on ERP characteristics of visual recognition: an analysis of language-level dependence

    黄 暁頴, 楊 家家, 髙橋 智, 江島 義道, 呉 景龍

    The Vision Society of Japan, 2024 Winter Meeting  2024.1.19

    Event date: 2024.1.17 - 2024.1.19

    Language: Japanese   Presentation type: Poster presentation

  • Event-related potential analysis of the effect of prior visual information on auditory object recognition

    穗積 政毅, 森田 健次, 楊 家家, 高橋 智, 江島 義道, 呉 景龍

    The Vision Society of Japan, 2024 Winter Meeting  2024.1.19

    Event date: 2024.1.17 - 2024.1.19

    Language: Japanese   Presentation type: Poster presentation

  • Event-related potential analysis of the effect of prior auditory information on visual object recognition

    十川 凌一, 楊 家々, 高橋 智, 江島 義道, 呉 景龍

    The Vision Society of Japan, 2024 Winter Meeting  2024.1.19

    Event date: 2024.1.17 - 2024.1.19

    Language: Japanese   Presentation type: Poster presentation


Research Projects

  • Elucidation of juvenile stereopsis deficits and development of recovery methods through analysis of coordinated binocular vergence and accommodation

    Grant number:22K12921  2022.04 - 2026.03

    Japan Society for the Promotion of Science  Grants-in-Aid for Scientific Research  Grant-in-Aid for Scientific Research (C)

    高橋 智, 呉 景龍, 濱崎 一郎, 江島 義道, 早見 武人

    Grant amount:¥4,160,000 (Direct expense: ¥3,200,000, Indirect expense: ¥960,000)

  • Elucidation of the brain mechanisms of choice intention through explicit and implicit intention estimation, and a proposal for an intention-understanding robot

    Grant number:22K04011  2022.04 - 2026.03

    Japan Society for the Promotion of Science  Grants-in-Aid for Scientific Research  Grant-in-Aid for Scientific Research (C)

    呉 瓊, 高橋 智, 呉 景龍, 江島 義道, 于 英花

    Grant amount:¥4,030,000 (Direct expense: ¥3,100,000, Indirect expense: ¥930,000)

  • Elucidation of the brain functions of attention and situation awareness and their application to support systems for older drivers

    Grant number:20K04381  2020.04 - 2024.03

    Japan Society for the Promotion of Science  Grants-in-Aid for Scientific Research  Grant-in-Aid for Scientific Research (C)

    張 明, 呉 景龍, 江島 義道, 高橋 智

    Grant amount:¥4,290,000 (Direct expense: ¥3,300,000, Indirect expense: ¥990,000)

    This year's research plan, building on the previous year's results, aims to design and conduct experimental tasks on attention and situation awareness that can be applied to older adults. In addition, from the perspective of the three attentional subnetworks (the alerting, orienting, and executive control networks; Posner, 1990), we will conduct behavioral, EEG, and functional magnetic resonance imaging (fMRI) experiments on situation awareness and attention, and will propose a brain functional network of attention and situation awareness through integrated processing of the experimental data using our originally developed multisensory and multimedia analysis and deep learning methods.
    In line with this plan, and drawing on the researchers' previous results, new experimental tasks on attention and situation awareness applicable to older adults were carried out. First, the self-prioritization effect refers to the finding that self-related information is processed faster and more accurately than other-related information. Using fMRI, the researchers observed a neural relationship between the self-prioritization effect and spatial distance, and identified the brain regions underlying the perception of near and far visual space involved in safe driving; this can be applied to designing tasks that assess older adults' ability to discriminate spatial distance, and the results were published in Scientific Reports in fiscal 2022. Regarding the multisensory integration involved in situation awareness, the researchers used fMRI and a multisensory Go/No-go task to investigate the brain regions related to motor responses to simultaneously presented visual and auditory stimuli, and suggested that the left IPL, left PreCG, bilateral STG, and a ventral neural circuit are involved in motor control based on human multisensory integration; this result was published in NeuroImage in fiscal 2022. Furthermore, after conducting the new experimental tasks on attention and situation awareness applicable to older adults, two additional papers were accepted and published, in Acta Psychologica Sinica and Perception. Based on these results, this year's research is progressing according to the previous year's plan.

  • International joint study on human brain model of tactile sensing and application to intelligent robot hand

    Grant number:19KK0099  2019.10 - 2025.03

    Japan Society for the Promotion of Science  Grants-in-Aid for Scientific Research  Fund for the Promotion of Joint International Research (Fostering Joint International Research (B))

    呉 景龍, 田中 高志, 高橋 智, 金澤 右, 呉 瓊

    Grant amount:¥18,460,000 (Direct expense: ¥14,200,000, Indirect expense: ¥4,260,000)

    In the manufacture of medical and welfare devices and the creation of products with a high-quality feel, robot hand technology that can perceive shape and texture by touch alone is strongly demanded, both by the aging society of the 21st century and by the drive for higher product quality. At present, however, hands-on medical and welfare care and the evaluation of tactile quality rely on the experience of doctors and craftsmen and on subjective judgments of touch, and robot hand technology that can perceive shape and texture has not yet been established.
    In this international joint study, the principal investigator and co-investigators extend the results on tactile brain function and intelligent robots obtained through their previous collaborations with the U.S. National Institutes of Health (NIH) and the Beijing Institute of Technology (BIT) in China, constructing a brain model of tactile sensing and exploring its application to an intelligent robot hand that can perceive shape and texture.
    This year, building on the findings and applied results on the brain functions of tactile cognition obtained so far, we developed, jointly with the Beijing Institute of Technology, an experimental apparatus for tactile brain-function mapping that uses pneumatic control and piezoelectric elements and can be used in the high-field fMRI environment. Using this custom apparatus, fMRI experiments on tactile cognition were conducted with a spatiotemporal double-cycling stimulation scheme, and the fMRI data are being analyzed jointly with the U.S. NIH to examine the brain model of tactile sensing. Furthermore, because this international joint study is basic and comprehensive research, young researchers also participate, and we are working to form an international collaborative research network on constructing the brain model of tactile sensing and applying it to intelligent robot hands.
    These results have been published as four English-language journal articles (one each in SAGE Open, Perception, Neuroscience Letters, and Frontiers in Psychology) and presented in seven conference presentations in Japan and abroad.

  • Brain model of touch sensation and application to robot hands with perceptive function of shape and texture

    Grant number:18K18835  2018.06 - 2022.03

    Japan Society for the Promotion of Science  Grants-in-Aid for Scientific Research  Grant-in-Aid for Challenging Research (Exploratory)

    Wu Jinglong

    Grant amount:¥6,240,000 (Direct expense: ¥4,800,000, Indirect expense: ¥1,440,000)

    In the production of medical and welfare equipment and the creation of high-quality products, robot hand technology that can perceive shape and texture by touch alone is strongly required for the aging society of the 21st century and for improving product quality. At present, however, hands-on medical and welfare care and the evaluation of tactile quality rely on the experience of doctors and craftsmen and on subjective judgments of tactile sensation, and robot hand technology that can perceive shape and texture has not yet been established.
    In this research, we propose a method for realizing robot hands that can perceive shape and texture by measuring the perceptual characteristics of length and shape sensed through the fingertips and combining them with measurements of brain activity during touch.
    The results obtained from this research have been published in 27 English-language journal articles and presented at 30 domestic and international academic conferences.


 

Class subject in charge

  • Programming (2024 academic year) 3rd and 4th semester  - Wed. periods 1-2

  • Programming (2024 academic year) 3rd and 4th semester  - Wed. periods 1-2

  • Programming 1 (2024 academic year) Third semester  - Wed. periods 1-2

  • Programming 2 (2024 academic year) Fourth semester  - Wed. periods 1-2

  • Advanced Internship for Interdisciplinary Medical Sciences and Engineering (2024 academic year) Year-round  - Other
