Yutaka Ishii

*Affiliation*

Associate Professor

Department of Systems Engineering,
Faculty of Computer Science and System Engineering,
Okayama Prefectural University

E-mail: ishii[at]cse.oka-pu.ac.jp


*Education*


Mar., 2003.  Graduated from Okayama Prefectural University (Graduate School of Systems Engineering) <D.E.>

Mar., 2000.  Graduated from Okayama Prefectural University (Graduate School of Systems Engineering) <M.E.>

Mar., 1998.  Graduated from Okayama Prefectural University (Department of Systems Engineering) <B.E.>

*Career*


Oct., 2013.  Associate Professor, Okayama Prefectural University

Apr., 2011.  Assistant Professor, Okayama Prefectural University

Oct., 2003.  Assistant Professor, Kobe University

Apr., 2003.  Researcher, CREST, Japan Science and Technology Corporation (JST)




*Fields of Research*

Human Interface, Embodied Communication, Nonverbal Interface, Avatar Interaction, Physiological Information Processing.


*Member of the Following Societies*

Human Interface Society, Information Processing Society of Japan, the Japanese Cognitive Science Society, Japan Ergonomics Society, The Institute of Electronics, Information and Communication Engineers, The Society of Instrument and Control Engineers.


*Awards*

Mar., 2004. Human Interface Society, the Best Paper Award.

Mar., 2003. The Information Processing Society of Japan, Best Paper Award for Young Researcher of IPSJ National Convention.

Mar., 2002. Human Interface Society, the Best Paper Award.

Mar., 2000. Human Interface Society, Best Paper Award for Young Researcher of Human Interface Symposium.

Jan., 2000. IEEE Hiroshima Section Student Subcommittee, the Best Presentation Award.


*Papers*

2018 Ishii, Y., Nishida, M. and Watanabe, T.: Development of a Speech-Driven Embodied Entrainment Character System with a Back-Channel Feedback; Advances in Affective and Pleasurable Design. AHFE 2018. Advances in Intelligent Systems and Computing, vol 774, pp.132-139, Springer, Jun. 27, 2018.

2017 Tanaka, K., Watanabe, T. and Ishii, Y.: Development of an Immersive Presentation Experience System that Audience Characters Nod for Lecturer's Utterance; Proc. of International Conference on Design and Concurrent Engineering 2017 & Manufacturing Systems Conference 2017, No.35, pp.1-5, Sep. 7, 2017.

Ikeda, K., Ishii, Y. and Watanabe, T.: Development of a Handwave Robot Expressing Intentions with Hand-waving; Proc. of International Conference on Design and Concurrent Engineering 2017 & Manufacturing Systems Conference 2017, No.22, pp.1-5, Sep. 7, 2017.

2016 Ishii, Y., Watanabe, T. and Sejima, Y.: Development of an Embodied Avatar System using Avatar-Shadow's Color Expressions with an Interaction-activated Communication Model; Proc. of the 4th International Conference on Human-Agent Interaction (HAI 2016), pp.337-340, Oct. 4, 2016.

2015 Esaki, K., Inoue, S., Watanabe, T. and Ishii, Y.: An Embodied Entrainment Avatar-Shadow System to Support Avatar Mediated Communication; Proc. of the 24th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN2015), pp.419-424, Aug. 31, 2015.

Yamamoto, M., Takabayashi, N., Watanabe, T. and Ishii, Y.: A Nursing Communication Education Support System with the Function of Reflection; Proc. of the 2015 IEEE/SICE International Symposium on System Integration (SII2015), pp.912-917, Dec. 11, 2015.

2014 Yamamoto, M., Takabayashi, N., Ono, K., Watanabe, T. and Ishii, Y.: Development of a Nursing Communication Education Support System Using Nurse-Patient Embodied Avatars with a Smile and Eyeball Movement Model; Proc. of the 2014 IEEE/SICE International Symposium on System Integration (SII2014), pp.175-180, Dec. 13, 2014.

Inoue, S., Esaki, K., Watanabe, T. and Ishii, Y.: Development of an Embodied Entrainment Avatar-Shadow System for Avatar-Mediated Communication Support; Proc. of the 2014 IEEE/SICE International Symposium on System Integration (SII2014), pp.181-185, Dec. 13, 2014.

Kohara, M., Shikata, H., Watanabe, T. and Ishii, Y.: Speech-Driven Embodied Entrainment Character System with Emotional Expressions and Motions by Speech Recognition; Proc. of the 2014 IEEE/SICE International Symposium on System Integration (SII2014), pp.431-435, Dec. 13, 2014.

Ishii, Y. and Watanabe, T.: Evaluation of a Video Communication System with Speech-Driven Embodied Entrainment Audience Characters with Partner's Face; Proc. of the Second International Conference on Human-Agent Interaction (HAI 2014), pp.221-224, Oct. 30, 2014.

2013 Takada, T., Nakayama, S., Ishii, Y. and Watanabe, T.: Superimposed Self-Character Mediated Video Chat System for a Face-to-Face Interaction Based on the Detection of Talker's Face Angles; Proc. of the 8th ACM/IEEE International Conference on Human-Robot Interaction, HRI 2013 Demonstration Session, D14, Mar. 4, 2013.

2012 Ishii, Y. and Watanabe, T.: E-VChat: A Video Communication System in Which a Speech-driven Embodied Entrainment Character Working with Head Motion is Superimposed for a Virtual Face-to-face Scene; Proc. of the 21st IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN2012), pp.191-196, Sep. 10, 2012.

Nomura, Y., Watanabe, T., Ishii, Y. and Sejima, Y.: An Embodied Virtual Communication System with Speech-driven Embodied Entrainment Objects; Proc. of First International Symposium on Socially and Technically Symbiotic Systems (STSS2012), No.32, pp.1-6, Aug. 29, 2012.

Takada, T., Ishii, Y., and Watanabe, T.: Development of an Embodied Video Communication System with a Superimposed Entrainment Character Driven by Voice and Head Motion Inputs; Proc. of First International Symposium on Socially and Technically Symbiotic Systems (STSS2012), No.40, pp.1-4, Aug. 29, 2012.

Takemura, M., Watanabe, T. and Ishii, Y.: Development of a Typing-driven Embodied Entrainment Character Chat System for Three Users; Proc. of First International Symposium on Socially and Technically Symbiotic Systems (STSS2012), No.41, pp.1-4, Aug. 2012.

Ishii, Y., Takada, T., and Watanabe, T.: A Proposal of Video Communication System in Which Talker's Avatar is Superimposed for a Virtual Face-to-Face Scene; Proc. of The Sixth International Conference on Collaboration Technologies (CollabTech 2012), pp.147-148, Aug. 27, 2012.

2011 Ishii, Y. and Watanabe, T.: Embodied Communication Support Using a Presence Sharing System under Teleworking; Proc. of 14th International Conference on Human-Computer Interaction (HCI International 2011), pp.41-45, Jul. 9, 2011.

Sejima, Y., Ishii, Y. and Watanabe, T.: A Virtual Audience System for Enhancing Embodied Interaction Based on Conversational Activity; Proc. of 14th International Conference on Human-Computer Interaction (HCI International 2011), pp.180-189, Jul. 9, 2011.

2010 Ishii, Y., Sejima, Y. and Watanabe, T.: Effects of Delayed Presentation of Self-Embodied Avatar Motion with Network Delay; Proc. of the 4th International Universal Communication Symposium (IUCS2010), pp.261-266, Oct. 19, 2010.

Kamahara, J., Nagamatsu, T., Tada, M., Kaieda, Y. and Ishii, Y.: Instructional Video Content Employing User Behavior Analysis: Time Dependent Annotation with Levels of Detail; Proc. of 18th International Conference on User Modeling, Adaptation, and Personalization (UMAP 2010), LNCS/6075, pp.87-98, Springer-Verlag, Jun. 20, 2010.

Kamahara, J., Nagamatsu, T., Kaieda, Y. and Ishii, Y.: Behavioral Analysis using Cumulative Playback Time for Identifying Task Hardship of Instruction Video; Proc. of the 2010 International Workshop on Multimedia and Semantic Technologies (MUST 2010), pp.1-6, May 23, 2010.

2009 Ishii, Y., Osaki, K., and Watanabe, T.: Ghatcha: GHost Avatar on a Telework CHAir; Proc. of HCI International 2009, pp.216-225, Jul. 24, 2009.

Ishii, Y. and Watanabe, T.: Development of a Virtual Presence Sharing System Using a Telework Chair; Progress in Robotics, Springer, Communications in Computer and Information Science (CCIS) 44, pp.173-178, Aug. 16, 2009 (FIRA RoboWorld Congress 2009).

Kamahara, J., Nagamatsu, T., Fukuhara, Y., Kaieda, Y. and Ishii, Y.: Method for Identifying Task Hardships by Analyzing Operational Logs of Instruction Videos; Proc. of 4th International Conference on Semantic and Digital Media Technologies (SAMT2009), LNCS/5887, pp.161-164, Springer-Verlag, Dec. 3, 2009.

2008 Ishii, Y., Osaki, K., Watanabe, T. and Ban, Y.: Evaluation of Embodied Avatar Manipulation Based on Talker's Hand Motion by Using 3D Trackball; Proc. of the 17th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN2008), pp.653-658, Aug. 3, 2008.

2007 Ishii, Y. and Watanabe, T.: An Embodied Avatar Mediated Communication System with VirtualActor for Human Interaction Analysis; Proc. of the 16th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN2007), pp.37-42, 2007.

2004 Watanabe, T., Ogikubo, M. and Ishii, Y.: Visualization of Respiration in the Embodied Virtual Communication System and Its Evaluation; International Journal of Human-Computer Interaction (IJHCI), Vol.17, No.1, pp.89-102, 2004.

Ishii, Y. and Watanabe, T.: An Embodied Video Communication System in which Own VirtualActor is Superimposed for Virtual Face-to-face Scene; Proc. of the 13th IEEE International Workshop on Robot and Human Interactive Communication (RO-MAN2004), pp.461-466, 2004.

2001 Ishii, Y. and Watanabe, T.: Effects of the Arrangement of VirtualActors on Human Interaction by Using the Embodied Virtual Communication System; The International Symposium on Measurement, Analysis and Modeling of Human Functions (ISHF 2001), pp.269-274, 2001.

2000 Watanabe, T., Okubo, M. and Ishii, Y.: An Embodied Virtual Face-to-Face Communication System with Virtual Actor and Virtual Wave for Human Interaction Sharing; Proc. of World Multi-conference on Systems, Cybernetics and Informatics (SCI 2000), III, pp.146-151, 2000.

Ishii, Y. and Watanabe, T.: Evaluation of an Embodied Virtual Communication System for Human Interaction Analysis by Synthesis; Proc. of the 9th IEEE International Workshop on Robot and Human Interactive Communication (RO-MAN2000), pp.29-34, 2000.