Jules Françoise
2020 – today
- 2024
  - [c24] Behnoosh Mohammadzadeh, Jules Françoise, Michèle Gouiffès, Baptiste Caramiaux: Studying Collaborative Interactive Machine Teaching in Image Classification. IUI 2024: 195-208
- 2022
  - [c23] Jules Françoise, Sarah Fdili Alaoui, Yves Candau: CO/DA: Live-Coding Movement-Sound Interactions for Dance Improvisation. CHI 2022: 482:1-482:13
  - [c22] Jules Françoise, Gabriel Meseguer-Brocal, Frédéric Bevilacqua: Movement Analysis and Decomposition with the Continuous Wavelet Transform. MOCO 2022: 16:1-16:13
  - [c21] Imen Trabelsi, Jules Françoise, Yacine Bellik: Sensor-based Activity Recognition using Deep Learning: A Comparative Study. MOCO 2022: 20:1-20:8
  - [c20] Victor Paredes, Jules Françoise, Frédéric Bevilacqua: Entangling Practice with Artistic and Educational Aims: Interviews on Technology-based Movement-Sound Interactions. NIME 2022
  - [i2] Victor Paredes, Jules Françoise, Frédéric Bevilacqua: Entangling Practice with Artistic and Educational Aims: Interviews on Technology-based Movement Sound Interactions. CoRR abs/2209.13921 (2022)
- 2021
  - [j6] Téo Sanchez, Baptiste Caramiaux, Jules Françoise, Frédéric Bevilacqua, Wendy E. Mackay: How do People Train a Machine?: Strategies and (Mis)Understandings. Proc. ACM Hum. Comput. Interact. 5(CSCW1): 162:1-162:26 (2021)
  - [c19] Jules Françoise, Baptiste Caramiaux, Téo Sanchez: Marcelle: Composing Interactive Machine Learning Workflows and Interfaces. UIST 2021: 39-53
- 2020
  - [j5] Baptiste Caramiaux, Jules Françoise, Wanyu Liu, Téo Sanchez, Frédéric Bevilacqua: Machine Learning Approaches for Motor Learning: A Short Review. Frontiers Comput. Sci. 2: 16 (2020)
  - [i1] Baptiste Caramiaux, Jules Françoise, Abby Wanyu Liu, Téo Sanchez, Frédéric Bevilacqua: Machine Learning Approaches For Motor Learning: A Short Review. CoRR abs/2002.04317 (2020)
2010 – 2019
- 2018
  - [j4] Jules Françoise, Frédéric Bevilacqua: Motion-Sound Mapping through Interaction: An Approach to User-Centered Design of Auditory Feedback Using Machine Learning. ACM Trans. Interact. Intell. Syst. 8(2): 16:1-16:30 (2018)
  - [c18] Mirjana Prpa, Kivanç Tatar, Jules Françoise, Bernhard E. Riecke, Thecla Schiphorst, Philippe Pasquier: Attending to Breath: Exploring How the Cues in a Virtual Environment Guide the Attention to Breath and Shape the Quality of Experience to Support Mindfulness. Conference on Designing Interactive Systems 2018: 71-84
  - [c17] Youssef Guedira, Franck Bimbard, Jules Françoise, René Farcy, Yacine Bellik: Tactile Interface to Steer Power Wheelchairs: A Preliminary Evaluation with Wheelchair Users. ICCHP (1) 2018: 424-431
  - [p1] Yves Candau, Thecla Schiphorst, Jules Françoise: Designing from Embodied Knowing: Practice-Based Research at the Intersection Between Embodied Interaction and Somatics. New Directions in Third Wave Human-Computer Interaction (2) 2018: 203-230
- 2017
  - [c16] Sarah Fdili Alaoui, Jules Françoise, Thecla Schiphorst, Karen Studd, Frédéric Bevilacqua: Seeing, Sensing and Recognizing Laban Movement Qualities. CHI 2017: 4009-4020
  - [c15] Jules Françoise, Yves Candau, Sarah Fdili Alaoui, Thecla Schiphorst: Designing for Kinesthetic Awareness: Revealing User Experiences through Second-Person Inquiry. CHI 2017: 5171-5183
  - [c14] Yves Candau, Jules Françoise, Sarah Fdili Alaoui, Thecla Schiphorst: Cultivating kinaesthetic awareness through interaction: Perspectives from somatic practices and embodied cognition. MOCO 2017: 21:1-21:8
  - [c13] Hugo Scurto, Frédéric Bevilacqua, Jules Françoise: Shaping and exploring interactive motion-sound mappings using online clustering techniques. NIME 2017: 410-415
- 2016
  - [c12] Jules Françoise, Olivier Chapuis, Sylvain Hanneton, Frédéric Bevilacqua: SoundGuides: Adapting Continuous Auditory Feedback to Users. CHI Extended Abstracts 2016: 2829-2836
  - [c11] Jules Françoise, Frédéric Bevilacqua, Thecla Schiphorst: GaussBox: Prototyping Movement Interaction with Interactive Visualizations of Machine Learning. CHI Extended Abstracts 2016: 3667-3670
  - [c10] Frédéric Bevilacqua, Baptiste Caramiaux, Jules Françoise: Perspectives on Real-time Computation of Movement Coarticulation. MOCO 2016: 35:1-35:5
- 2015
  - [b1] Jules Françoise: Motion-sound Mapping By Demonstration (Apprentissage des Relations entre Mouvement et Son par Démonstration). Pierre and Marie Curie University, Paris, France, 2015
  - [j3] Max Rheiner, Thomas Tobler, Fabian Troxler, Seki Inoue, Keisuke Hasegawa, Yasuaki Monnai, Yasutoshi Makino, Hiroyuki Shinoda, Jules Françoise, Norbert Schnell, Riccardo Borghesi, Frédéric Bevilacqua, Tuncay Cakmak, Holger Hager: Demo hour. Interactions 22(2): 6-9 (2015)
  - [j2] Jules Françoise, Norbert Schnell, Riccardo Borghesi, Frédéric Bevilacqua: MaD. Interactions 22(3): 14-15 (2015)
  - [c9] Jules Françoise, Agnès Roby-Brami, Natasha Riboud, Frédéric Bevilacqua: Movement sequence analysis using hidden Markov models: a case study in Tai Chi performance. MOCO 2015: 29-36
  - [e2] Sarah Fdili Alaoui, Philippe Pasquier, Thecla Schiphorst, Jules Françoise, Frédéric Bevilacqua: Proceedings of the 2nd International Workshop on Movement and Computing, MOCO 2015, Vancouver, British Columbia, Canada, August 14-15, 2015. ACM 2015, ISBN 978-1-4503-3457-0
- 2014
  - [j1] Baptiste Caramiaux, Jules Françoise, Norbert Schnell, Frédéric Bevilacqua: Mapping Through Listening. Comput. Music. J. 38(3): 34-48 (2014)
  - [c8] Jules Françoise, Sarah Fdili Alaoui, Thecla Schiphorst, Frédéric Bevilacqua: Vocalizing dance movement for interactive sonification of laban effort factors. Conference on Designing Interactive Systems 2014: 1079-1082
  - [c7] Jules Françoise, Norbert Schnell, Riccardo Borghesi, Frédéric Bevilacqua: Probabilistic Models for Designing Motion and Sound Relationships. NIME 2014: 287-292
  - [c6] Jules Françoise, Norbert Schnell, Frédéric Bevilacqua: MaD: mapping by demonstration for continuous sonification. SIGGRAPH Emerging Technologies 2014: 16:1
  - [c5] Jules Françoise, Norbert Schnell, Frédéric Bevilacqua: MaD: mapping by demonstration for continuous sonification. SIGGRAPH Studio 2014: 38:1
  - [e1] Frédéric Bevilacqua, Sarah Fdili Alaoui, Jules Françoise, Philippe Pasquier, Thecla Schiphorst: International Workshop on Movement and Computing, MOCO '14, Paris, France, June 16-17, 2014. ACM 2014, ISBN 978-1-4503-2814-2
- 2013
  - [c4] Frédéric Bevilacqua, Norbert Schnell, Nicolas H. Rasamimanana, Julien Bloit, Emmanuel Fléty, Baptiste Caramiaux, Jules Françoise, Eric O. Boyer: De-Mo: designing action-sound relationships with the mo interfaces. CHI Extended Abstracts 2013: 2907-2910
  - [c3] Jules Françoise, Norbert Schnell, Frédéric Bevilacqua: Gesture-based control of physical modeling sound synthesis: a mapping-by-demonstration approach. ACM Multimedia 2013: 447-448
  - [c2] Jules Françoise, Norbert Schnell, Frédéric Bevilacqua: A multimodal probabilistic model for gesture-based control of sound synthesis. ACM Multimedia 2013: 705-708
  - [c1] Jules Françoise: Gesture-sound mapping by demonstration in interactive music systems. ACM Multimedia 2013: 1051-1054
last updated on 2025-01-09 13:21 CET by the dblp team
all metadata released as open data under CC0 1.0 license