
Sketch-based interaction and modeling: where do we stand?

Sketching is a natural and intuitive communication tool used for expressing concepts or ideas which are difficult to communicate through text or speech alone. Sketching is therefore used for a variety of purposes, from the expression of ideas on two-dimensional (2D) physical media, to object creation, manipulation, or deformation in three-dimensional (3D) immersive environments. This variety in sketching activities brings about a range of technologies which, while having similar scope, namely that of recording and interpreting the sketch gesture to effect some interaction, adopt different interpretation approaches according to the environment in which the sketch is drawn. In fields such as product design, sketches are drawn at various stages of the design process, and therefore, designers would benefit from sketch interpretation technologies which support these differing interactions. However, research typically focuses on one aspect of sketch interpretation and modeling such that literature on available technologies is fragmented and dispersed. In this paper, we bring together the relevant literature describing technologies which can support the product design industry, namely technologies which support the interpretation of sketches drawn on 2D media, sketch-based search interactions, as well as sketch gestures drawn in 3D media. This paper, therefore, gives a holistic view of the algorithmic support that can be provided in the design process. In so doing, we highlight the research gaps and future research directions required to provide full sketch-based interaction support.

Bibliographic Details
Published in: AI EDAM, 2019-11, Vol. 33 (4), pp. 370-388
Main Authors: Bonnici, Alexandra; Akman, Alican; Calleja, Gabriel; Camilleri, Kenneth P.; Fehling, Patrick; Ferreira, Alfredo; Hermuth, Florian; Israel, Johann Habakuk; Landwehr, Tom; Liu, Juncheng; Padfield, Natasha M. J.; Sezgin, T. Metin; Rosin, Paul L.
Format: Article
Language: English
Publisher: Cambridge University Press
Source: Cambridge Journals Online
ISSN: 0890-0604
EISSN: 1469-1760
DOI: 10.1017/S0890060419000349
Subjects: Algorithms; Augmented reality; Collaboration; Communication; Design; Designers; Industrial design; Interfaces; Modelling; Product design; Researchers; Sketches