Exploration server on haptic devices

Warning: this site is under development!
Warning: this site was generated by automatic means from raw corpora.
The information has therefore not been validated.

Intelligent Control for Haptic Displays

Internal identifier: 003E94 (Istex/Corpus); previous: 003E93; next: 003E95


Authors: Stefan Munch; Martin Stangenberg

Source:

RBID : ISTEX:37A66CEF030A378EDA5A7E717A3FB8F6AB68C851

English descriptors

Abstract

Usually, a mouse is used for input activities only, whereas output from the computer is sent via the monitor and one or two loudspeakers. But why not use the mouse for output, too? For instance, if it would be possible to predict the next interaction object the user wants to click on, a mouse with a mechanical brake could stop the cursor movement at the desired position. This kind of aid is especially attractive for small targets like resize handles of windows or small buttons. In this paper, we present an approach for the integration of haptic feedback in everyday graphical user interfaces. We use a specialized mouse which is able to apply simple haptic information to the user's hand and index finger. A multi‐agent system has been designed which ‘observes’ the user in order to predict the next interaction object and launch haptic feedback, thus supporting positioning actions with the mouse. Although primarily designed in order to provide ‘intelligent’ haptic feedback, the system can be combined with other output modalities as well, due to its modular and flexible architecture.
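
The abstract outlines a pipeline: observe the cursor, predict the next interaction object, then trigger haptic feedback at the right moment. The fragment below is only a minimal sketch of that idea, not the authors' multi-agent system; the names Widget, predict_target, on_mouse_move and the haptic.engage_brake() call are all hypothetical, and the scoring rule (prefer nearby widgets that lie in the direction of movement) is a simple stand-in for the prediction described in the paper.

import math
from dataclasses import dataclass

@dataclass
class Widget:
    name: str
    x: float   # centre of the widget, in screen pixels
    y: float
    w: float   # width and height, so that small targets can be handled
    h: float

def predict_target(pos, velocity, widgets):
    """Guess which widget the cursor is heading for (illustrative only)."""
    vx, vy = velocity
    speed = math.hypot(vx, vy)
    if speed < 1e-6:
        return None                    # cursor at rest: no prediction
    best, best_score = None, -1.0
    for wdg in widgets:
        dx, dy = wdg.x - pos[0], wdg.y - pos[1]
        dist = math.hypot(dx, dy)
        if dist < 1e-6:
            return wdg                 # cursor already on this widget
        alignment = (vx * dx + vy * dy) / (speed * dist)   # cosine of the angle
        score = alignment / (1.0 + dist / 100.0)           # favour near, aligned targets
        if score > best_score:
            best, best_score = wdg, score
    return best

def on_mouse_move(pos, velocity, widgets, haptic):
    """Fire haptic feedback when the cursor enters the predicted target."""
    target = predict_target(pos, velocity, widgets)
    if target is None:
        return
    if (abs(pos[0] - target.x) <= target.w / 2
            and abs(pos[1] - target.y) <= target.h / 2):
        haptic.engage_brake()          # hypothetical driver call

For example, predict_target((100, 100), (1.0, 0.5), [Widget("close", 300, 200, 12, 12)]) returns the small "close" button, since it lies in the current direction of movement.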

Url:
DOI: 10.1111/1467-8659.1530217

Links to Exploration step

ISTEX:37A66CEF030A378EDA5A7E717A3FB8F6AB68C851

The document in XML format

<record>
<TEI wicri:istexFullTextTei="biblStruct">
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">Intelligent Control for Haptic Displays</title>
<author>
<name sortKey="Munch, Stefan" sort="Munch, Stefan" uniqKey="Munch S" first="Stefan" last="Munch">Stefan Munch</name>
<affiliation>
<mods:affiliation>Institute for Real‐Time Computer Systems & Robotics, University of Karlsruhe Kaiserstr. 12, D‐76128 Karlsruhe, Germany</mods:affiliation>
</affiliation>
</author>
<author>
<name sortKey="Stangenberg, Martin" sort="Stangenberg, Martin" uniqKey="Stangenberg M" first="Martin" last="Stangenberg">Martin Stangenberg</name>
<affiliation>
<mods:affiliation>Institute for Real‐Time Computer Systems & Robotics, University of Karlsruhe Kaiserstr. 12, D‐76128 Karlsruhe, Germany</mods:affiliation>
</affiliation>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">ISTEX</idno>
<idno type="RBID">ISTEX:37A66CEF030A378EDA5A7E717A3FB8F6AB68C851</idno>
<date when="1996" year="1996">1996</date>
<idno type="doi">10.1111/1467-8659.1530217</idno>
<idno type="url">https://api.istex.fr/document/37A66CEF030A378EDA5A7E717A3FB8F6AB68C851/fulltext/pdf</idno>
<idno type="wicri:Area/Istex/Corpus">003E94</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title level="a" type="main" xml:lang="en">Intelligent Control for Haptic Displays</title>
<author>
<name sortKey="Munch, Stefan" sort="Munch, Stefan" uniqKey="Munch S" first="Stefan" last="Munch">Stefan Munch</name>
<affiliation>
<mods:affiliation>Institute for Real‐Time Computer Systems & Robotics, University of Karlsruhe Kaiserstr. 12, D‐76128 Karlsruhe, Germany</mods:affiliation>
</affiliation>
</author>
<author>
<name sortKey="Stangenberg, Martin" sort="Stangenberg, Martin" uniqKey="Stangenberg M" first="Martin" last="Stangenberg">Martin Stangenberg</name>
<affiliation>
<mods:affiliation>Institute for Real‐Time Computer Systems & Robotics, University of Karlsruhe Kaiserstr. 12, D‐76128 Karlsruhe, Germany</mods:affiliation>
</affiliation>
</author>
</analytic>
<monogr></monogr>
<series>
<title level="j">Computer Graphics Forum</title>
<idno type="ISSN">0167-7055</idno>
<idno type="eISSN">1467-8659</idno>
<imprint>
<publisher>Blackwell Science Ltd</publisher>
<pubPlace>Edinburgh, UK</pubPlace>
<date type="published" when="1996-08">1996-08</date>
<biblScope unit="volume">15</biblScope>
<biblScope unit="issue">3</biblScope>
<biblScope unit="page" from="217">217</biblScope>
<biblScope unit="page" to="226">226</biblScope>
</imprint>
<idno type="ISSN">0167-7055</idno>
</series>
<idno type="istex">37A66CEF030A378EDA5A7E717A3FB8F6AB68C851</idno>
<idno type="DOI">10.1111/1467-8659.1530217</idno>
<idno type="ArticleID">CGF217</idno>
</biblStruct>
</sourceDesc>
<seriesStmt>
<idno type="ISSN">0167-7055</idno>
</seriesStmt>
</fileDesc>
<profileDesc>
<textClass>
<keywords scheme="KwdEn" xml:lang="en">
<term>Multimodal interaction</term>
<term>haptic output</term>
<term>human‐computer interaction</term>
<term>user modelling</term>
</keywords>
</textClass>
<langUsage>
<language ident="en">en</language>
</langUsage>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">Usually, a mouse is used for input activities only, whereas output from the computer is sent via the monitor and one or two loudspeakers. But why not use the mouse for output, too? For instance, if it would be possible to predict the next interaction object the user wants to click on, a mouse with a mechanical brake could stop the cursor movement at the desired position. This kind of aid is especially attractive for small targets like resize handles of windows or small buttons. In this paper, we present an approach for the integration of haptic feedback in everyday graphical user interfaces. We use a specialized mouse which, is able to apply simple haptic information, to the user's hand and index finger. A multi‐agent system has been designed which ‘observes’ the ‘user in order to predict the next interaction object and launch haptic feedback, thus supporting positioning actions with the mouse. Although primarily designed in order to provide intelligent’ haptic feedback, the system can be combined with other output modalities as well, due to its modular and flexible architecture.</div>
</front>
</TEI>
<istex>
<corpusName>wiley</corpusName>
<author>
<json:item>
<name>Stefan Munch</name>
<affiliations>
<json:string>Institute for Real‐Time Computer Systems & Robotics, University of Karlsruhe Kaiserstr. 12, D‐76128 Karlsruhe, Germany</json:string>
</affiliations>
</json:item>
<json:item>
<name>Martin Stangenberg</name>
<affiliations>
<json:string>Institute for Real‐Time Computer Systems & Robotics, University of Karlsruhe Kaiserstr. 12, D‐76128 Karlsruhe, Germany</json:string>
</affiliations>
</json:item>
</author>
<subject>
<json:item>
<lang>
<json:string>eng</json:string>
</lang>
<value>Multimodal interaction</value>
</json:item>
<json:item>
<lang>
<json:string>eng</json:string>
</lang>
<value>haptic output</value>
</json:item>
<json:item>
<lang>
<json:string>eng</json:string>
</lang>
<value>user modelling</value>
</json:item>
<json:item>
<lang>
<json:string>eng</json:string>
</lang>
<value>human‐computer interaction</value>
</json:item>
</subject>
<articleId>
<json:string>CGF217</json:string>
</articleId>
<language>
<json:string>eng</json:string>
</language>
<abstract>Usually, a mouse is used for input activities only, whereas output from the computer is sent via the monitor and one or two loudspeakers. But why not use the mouse for output, too? For instance, if it would be possible to predict the next interaction object the user wants to click on, a mouse with a mechanical brake could stop the cursor movement at the desired position. This kind of aid is especially attractive for small targets like resize handles of windows or small buttons. In this paper, we present an approach for the integration of haptic feedback in everyday graphical user interfaces. We use a specialized mouse which is able to apply simple haptic information to the user's hand and index finger. A multi‐agent system has been designed which ‘observes’ the user in order to predict the next interaction object and launch haptic feedback, thus supporting positioning actions with the mouse. Although primarily designed in order to provide ‘intelligent’ haptic feedback, the system can be combined with other output modalities as well, due to its modular and flexible architecture.</abstract>
<qualityIndicators>
<score>6.931</score>
<pdfVersion>1.3</pdfVersion>
<pdfPageSize>595.276 x 841.89 pts (A4)</pdfPageSize>
<refBibsNative>false</refBibsNative>
<keywordCount>4</keywordCount>
<abstractCharCount>1096</abstractCharCount>
<pdfWordCount>4771</pdfWordCount>
<pdfCharCount>26397</pdfCharCount>
<pdfPageCount>10</pdfPageCount>
<abstractWordCount>180</abstractWordCount>
</qualityIndicators>
<title>Intelligent Control for Haptic Displays</title>
<genre.original>
<json:string>article</json:string>
</genre.original>
<genre>
<json:string>article</json:string>
</genre>
<host>
<volume>15</volume>
<publisherId>
<json:string>CGF</json:string>
</publisherId>
<pages>
<total>10</total>
<last>226</last>
<first>217</first>
</pages>
<issn>
<json:string>0167-7055</json:string>
</issn>
<issue>3</issue>
<genre>
<json:string>journal</json:string>
</genre>
<language>
<json:string>unknown</json:string>
</language>
<eissn>
<json:string>1467-8659</json:string>
</eissn>
<title>Computer Graphics Forum</title>
<doi>
<json:string>10.1111/(ISSN)1467-8659</json:string>
</doi>
</host>
<publicationDate>1996</publicationDate>
<copyrightDate>1996</copyrightDate>
<doi>
<json:string>10.1111/1467-8659.1530217</json:string>
</doi>
<id>37A66CEF030A378EDA5A7E717A3FB8F6AB68C851</id>
<score>1</score>
<fulltext>
<json:item>
<original>true</original>
<mimetype>application/pdf</mimetype>
<extension>pdf</extension>
<uri>https://api.istex.fr/document/37A66CEF030A378EDA5A7E717A3FB8F6AB68C851/fulltext/pdf</uri>
</json:item>
<json:item>
<original>false</original>
<mimetype>application/zip</mimetype>
<extension>zip</extension>
<uri>https://api.istex.fr/document/37A66CEF030A378EDA5A7E717A3FB8F6AB68C851/fulltext/zip</uri>
</json:item>
<istex:fulltextTEI uri="https://api.istex.fr/document/37A66CEF030A378EDA5A7E717A3FB8F6AB68C851/fulltext/tei">
<teiHeader>
<fileDesc>
<titleStmt>
<title level="a" type="main" xml:lang="en">Intelligent Control for Haptic Displays</title>
<respStmt xml:id="ISTEX-API" resp="Références bibliographiques récupérées via GROBID" name="ISTEX-API (INIST-CNRS)"></respStmt>
</titleStmt>
<publicationStmt>
<authority>ISTEX</authority>
<publisher>Blackwell Science Ltd</publisher>
<pubPlace>Edinburgh, UK</pubPlace>
<availability>
<p>WILEY</p>
</availability>
<date>1996</date>
</publicationStmt>
<sourceDesc>
<biblStruct type="inbook">
<analytic>
<title level="a" type="main" xml:lang="en">Intelligent Control for Haptic Displays</title>
<author>
<persName>
<forename type="first">Stefan</forename>
<surname>Munch</surname>
</persName>
<affiliation>Institute for Real‐Time Computer Systems & Robotics, University of Karlsruhe Kaiserstr. 12, D‐76128 Karlsruhe, Germany</affiliation>
</author>
<author>
<persName>
<forename type="first">Martin</forename>
<surname>Stangenberg</surname>
</persName>
<affiliation>Institute for Real‐Time Computer Systems & Robotics, University of Karlsruhe Kaiserstr. 12, D‐76128 Karlsruhe, Germany</affiliation>
</author>
</analytic>
<monogr>
<title level="j">Computer Graphics Forum</title>
<idno type="pISSN">0167-7055</idno>
<idno type="eISSN">1467-8659</idno>
<idno type="DOI">10.1111/(ISSN)1467-8659</idno>
<imprint>
<publisher>Blackwell Science Ltd</publisher>
<pubPlace>Edinburgh, UK</pubPlace>
<date type="published" when="1996-08"></date>
<biblScope unit="volume">15</biblScope>
<biblScope unit="issue">3</biblScope>
<biblScope unit="page" from="217">217</biblScope>
<biblScope unit="page" to="226">226</biblScope>
</imprint>
</monogr>
<idno type="istex">37A66CEF030A378EDA5A7E717A3FB8F6AB68C851</idno>
<idno type="DOI">10.1111/1467-8659.1530217</idno>
<idno type="ArticleID">CGF217</idno>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<creation>
<date>1996</date>
</creation>
<langUsage>
<language ident="en">en</language>
</langUsage>
<abstract xml:lang="en">
<p>Usually, a mouse is used for input activities only, whereas output from the computer is sent via the monitor and one or two loudspeakers. But why not use the mouse for output, too? For instance, if it would be possible to predict the next interaction object the user wants to click on, a mouse with a mechanical brake could stop the cursor movement at the desired position. This kind of aid is especially attractive for small targets like resize handles of windows or small buttons. In this paper, we present an approach for the integration of haptic feedback in everyday graphical user interfaces. We use a specialized mouse which is able to apply simple haptic information to the user's hand and index finger. A multi‐agent system has been designed which ‘observes’ the user in order to predict the next interaction object and launch haptic feedback, thus supporting positioning actions with the mouse. Although primarily designed in order to provide ‘intelligent’ haptic feedback, the system can be combined with other output modalities as well, due to its modular and flexible architecture.</p>
</abstract>
<textClass xml:lang="en">
<keywords scheme="keyword">
<list>
<head>keywords</head>
<item>
<term>Multimodal interaction</term>
</item>
<item>
<term>haptic output</term>
</item>
<item>
<term>user modelling</term>
</item>
<item>
<term>human‐computer interaction</term>
</item>
</list>
</keywords>
</textClass>
</profileDesc>
<revisionDesc>
<change when="1996-08">Published</change>
<change xml:id="refBibs-istex" who="#ISTEX-API" when="2016-4-24">References added</change>
</revisionDesc>
</teiHeader>
</istex:fulltextTEI>
<json:item>
<original>false</original>
<mimetype>text/plain</mimetype>
<extension>txt</extension>
<uri>https://api.istex.fr/document/37A66CEF030A378EDA5A7E717A3FB8F6AB68C851/fulltext/txt</uri>
</json:item>
</fulltext>
<metadata>
<istex:metadataXml wicri:clean="Wiley component found">
<istex:xmlDeclaration>version="1.0" encoding="UTF-8" standalone="yes"</istex:xmlDeclaration>
<istex:document>
<component version="2.0" type="serialArticle" xml:lang="en">
<header>
<publicationMeta level="product">
<publisherInfo>
<publisherName>Blackwell Science Ltd</publisherName>
<publisherLoc>Edinburgh, UK</publisherLoc>
</publisherInfo>
<doi origin="wiley" registered="yes">10.1111/(ISSN)1467-8659</doi>
<issn type="print">0167-7055</issn>
<issn type="electronic">1467-8659</issn>
<idGroup>
<id type="product" value="CGF"></id>
<id type="publisherDivision" value="ST"></id>
</idGroup>
<titleGroup>
<title type="main" sort="COMPUTER GRAPHICS FORUM">Computer Graphics Forum</title>
</titleGroup>
</publicationMeta>
<publicationMeta level="part" position="08003">
<doi origin="wiley">10.1111/cgf.1996.15.issue-3</doi>
<numberingGroup>
<numbering type="journalVolume" number="15">15</numbering>
<numbering type="journalIssue" number="3">3</numbering>
</numberingGroup>
<coverDate startDate="1996-08">August 1996</coverDate>
</publicationMeta>
<publicationMeta level="unit" type="article" position="0021700" status="forIssue">
<doi origin="wiley">10.1111/1467-8659.1530217</doi>
<idGroup>
<id type="unit" value="CGF217"></id>
</idGroup>
<countGroup>
<count type="pageTotal" number="10"></count>
</countGroup>
<copyright> © 1996 Eurographics Association</copyright>
<eventGroup>
<event type="firstOnline" date="2003-02-13"></event>
<event type="publishedOnlineFinalForm" date="2003-02-13"></event>
<event type="xmlConverted" agent="Converter:BPG_TO_WML3G version:2.3.2 mode:FullText source:Header result:Header" date="2010-03-09"></event>
<event type="xmlConverted" agent="Converter:WILEY_ML3G_TO_WILEY_ML3GV2 version:4.0.1" date="2014-03-15"></event>
<event type="xmlConverted" agent="Converter:WML3G_To_WML3G version:4.1.7 mode:FullText,remove_FC" date="2014-10-16"></event>
</eventGroup>
<numberingGroup>
<numbering type="pageFirst" number="217">217</numbering>
<numbering type="pageLast" number="226">226</numbering>
</numberingGroup>
<linkGroup>
<link type="toTypesetVersion" href="file:CGF.CGF217.pdf"></link>
</linkGroup>
</publicationMeta>
<contentMeta>
<titleGroup>
<title type="main">Intelligent Control for Haptic Displays</title>
</titleGroup>
<creators>
<creator creatorRole="author" xml:id="cr1" affiliationRef="#a1">
<personName>
<givenNames>Stefan</givenNames>
<familyName>Munch</familyName>
</personName>
</creator>
<creator creatorRole="author" xml:id="cr2" affiliationRef="#a1">
<personName>
<givenNames>Martin</givenNames>
<familyName>Stangenberg</familyName>
</personName>
</creator>
</creators>
<affiliationGroup>
<affiliation xml:id="a1" countryCode="DE">
<unparsedAffiliation>Institute for Real‐Time Computer Systems & Robotics, University of Karlsruhe Kaiserstr. 12, D‐76128 Karlsruhe, Germany</unparsedAffiliation>
</affiliation>
</affiliationGroup>
<keywordGroup xml:lang="en">
<keyword xml:id="k1">Multimodal interaction</keyword>
<keyword xml:id="k2">haptic output</keyword>
<keyword xml:id="k3">user modelling</keyword>
<keyword xml:id="k4">human‐computer interaction</keyword>
</keywordGroup>
<abstractGroup>
<abstract type="main" xml:lang="en">
<title type="main">Abstract</title>
<p>Usually, a mouse is used for input activities only, whereas output from the computer is sent via the monitor and one or two loudspeakers. But why not use the mouse for output, too? For instance, if it would be possible to predict the next interaction object the user wants to click on, a mouse with a mechanical brake could stop the cursor movement at the desired position. This kind of aid is especially attractive for small targets like resize handles of windows or small buttons.</p>
<p>In this paper, we present an approach for the integration of haptic feedback in everyday graphical user interfaces. We use a specialized mouse which is able to apply simple haptic information to the user's hand and index finger. A multi‐agent system has been designed which ‘observes’ the user in order to predict the next interaction object and launch haptic feedback, thus supporting positioning actions with the mouse. Although primarily designed in order to provide ‘intelligent’ haptic feedback, the system can be combined with other output modalities as well, due to its modular and flexible architecture.</p>
</abstract>
</abstractGroup>
</contentMeta>
</header>
</component>
</istex:document>
</istex:metadataXml>
<mods version="3.6">
<titleInfo lang="en">
<title>Intelligent Control for Haptic Displays</title>
</titleInfo>
<titleInfo type="alternative" contentType="CDATA" lang="en">
<title>Intelligent Control for Haptic Displays</title>
</titleInfo>
<name type="personal">
<namePart type="given">Stefan</namePart>
<namePart type="family">Munch</namePart>
<affiliation>Institute for Real‐Time Computer Systems & Robotics, University of Karlsruhe Kaiserstr. 12, D‐76128 Karlsruhe, Germany</affiliation>
<role>
<roleTerm type="text">author</roleTerm>
</role>
</name>
<name type="personal">
<namePart type="given">Martin</namePart>
<namePart type="family">Stangenberg</namePart>
<affiliation>Institute for Real‐Time Computer Systems & Robotics, University of Karlsruhe Kaiserstr. 12, D‐76128 Karlsruhe, Germany</affiliation>
<role>
<roleTerm type="text">author</roleTerm>
</role>
</name>
<typeOfResource>text</typeOfResource>
<genre type="article" displayLabel="article"></genre>
<originInfo>
<publisher>Blackwell Science Ltd</publisher>
<place>
<placeTerm type="text">Edinburgh, UK</placeTerm>
</place>
<dateIssued encoding="w3cdtf">1996-08</dateIssued>
<copyrightDate encoding="w3cdtf">1996</copyrightDate>
</originInfo>
<language>
<languageTerm type="code" authority="rfc3066">en</languageTerm>
<languageTerm type="code" authority="iso639-2b">eng</languageTerm>
</language>
<physicalDescription>
<internetMediaType>text/html</internetMediaType>
</physicalDescription>
<abstract lang="en">Usually, a mouse is used for input activities only, whereas output from the computer is sent via the monitor and one or two loudspeakers. But why not use the mouse for output, too? For instance, if it would be possible to predict the next interaction object the user wants to click on, a mouse with a mechanical brake could stop the cursor movement at the desired position. This kind of aid is especially attractive for small targets like resize handles of windows or small buttons. In this paper, we present an approach for the integration of haptic feedback in everyday graphical user interfaces. We use a specialized mouse which, is able to apply simple haptic information, to the user's hand and index finger. A multi‐agent system has been designed which ‘observes’ the ‘user in order to predict the next interaction object and launch haptic feedback, thus supporting positioning actions with the mouse. Although primarily designed in order to provide intelligent’ haptic feedback, the system can be combined with other output modalities as well, due to its modular and flexible architecture.</abstract>
<subject lang="en">
<genre>keywords</genre>
<topic>Multimodal interaction</topic>
<topic>haptic output</topic>
<topic>user modelling</topic>
<topic>human‐computer interaction</topic>
</subject>
<relatedItem type="host">
<titleInfo>
<title>Computer Graphics Forum</title>
</titleInfo>
<genre type="journal">journal</genre>
<identifier type="ISSN">0167-7055</identifier>
<identifier type="eISSN">1467-8659</identifier>
<identifier type="DOI">10.1111/(ISSN)1467-8659</identifier>
<identifier type="PublisherID">CGF</identifier>
<part>
<date>1996</date>
<detail type="volume">
<caption>vol.</caption>
<number>15</number>
</detail>
<detail type="issue">
<caption>no.</caption>
<number>3</number>
</detail>
<extent unit="pages">
<start>217</start>
<end>226</end>
<total>10</total>
</extent>
</part>
</relatedItem>
<identifier type="istex">37A66CEF030A378EDA5A7E717A3FB8F6AB68C851</identifier>
<identifier type="DOI">10.1111/1467-8659.1530217</identifier>
<identifier type="ArticleID">CGF217</identifier>
<accessCondition type="use and reproduction" contentType="copyright">© 1996 Eurographics Association</accessCondition>
<recordInfo>
<recordContentSource>WILEY</recordContentSource>
<recordOrigin>Blackwell Science Ltd</recordOrigin>
</recordInfo>
</mods>
</metadata>
<serie></serie>
</istex>
</record>

To work with this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/Istex/Corpus
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 003E94 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/Istex/Corpus/biblio.hfd -nk 003E94 | SxmlIndent | more
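
As an alternative to the Dilib commands above, the record itself lists a direct ISTEX API URL for the full text. The following is a minimal Python sketch using that URL, assuming network access to api.istex.fr and any credentials the API may require; the requests package is a third-party dependency.

import requests

# ISTEX identifier of this record, as given in the metadata above.
ISTEX_ID = "37A66CEF030A378EDA5A7E717A3FB8F6AB68C851"
url = f"https://api.istex.fr/document/{ISTEX_ID}/fulltext/pdf"

response = requests.get(url, timeout=30)   # may require ISTEX authentication
response.raise_for_status()
with open(f"{ISTEX_ID}.pdf", "wb") as fh:
    fh.write(response.content)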

To add a link to this page in the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    Istex
   |étape=   Corpus
   |type=    RBID
   |clé=     ISTEX:37A66CEF030A378EDA5A7E717A3FB8F6AB68C851
   |texte=   Intelligent Control for Haptic Displays
}}

Wicri

This area was generated with Dilib version V0.6.23.
Data generation: Mon Jun 13 01:09:46 2016. Site generation: Wed Mar 6 09:54:07 2024