Exploration server on haptic devices

Warning: this site is under development!
Warning: this site is generated automatically from raw corpora, so the information has not been validated.

EEG Signature of Object Categorization from Event-related Potentials

Internal identifier: 002024 (Pmc/Curation); previous: 002023; next: 002025

Authors: Mohammad Reza Daliri; Mitra Taghizadeh; Kavous Salehzadeh Niksirat

Source:

RBID: PMC:3785069

Abstract

The human visual system recognizes objects rapidly, and the neural activity of the human brain generates signals that carry information about the object categories seen by a subject. These brain signals can be recorded with different systems, such as the electroencephalogram (EEG), and EEG signals carry significant information about the stimuli that excite the brain. To translate information derived from the EEG into an object-recognition mechanism, twelve different categories were selected as visual stimuli in this study and presented to the subjects in a controlled task, while the signals were recorded through a 19-channel EEG recording system. The signals were analyzed using two different event-related potential (ERP) computations, namely the “target/rest” and “target/non-target” comparisons. Comparing the ERP of the target period with the rest period indicated that the electrodes most involved in our task were F3, F4, C3, C4, Fz, and Cz, among others. The “target/non-target” ERP showed that for target stimuli two positive peaks occurred about 400 ms and 520 ms after stimulus onset, whereas for non-target stimuli only one positive peak appeared, about 400 ms after stimulus onset. Moreover, the subjects’ reaction times were computed: the flower category had the lowest reaction time, whereas the stationery category had the highest. The results provide useful information about which channels and which parts of the signals are affected by different object categories in terms of ERP brain signals. This study can be considered a first step toward human-computer interface applications.


Url:
PubMed: 24083136
PubMed Central: 3785069

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">EEG Signature of Object Categorization from Event-related Potentials</title>
<author>
<name sortKey="Daliri, Mohammad Reza" sort="Daliri, Mohammad Reza" uniqKey="Daliri M" first="Mohammad Reza" last="Daliri">Mohammad Reza Daliri</name>
<affiliation>
<nlm:aff id="aff1"></nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Taghizadeh, Mitra" sort="Taghizadeh, Mitra" uniqKey="Taghizadeh M" first="Mitra" last="Taghizadeh">Mitra Taghizadeh</name>
<affiliation>
<nlm:aff id="aff2">
<italic>Department of Computer Science, Virtual Center, Iran University of Science and Technology, Tehran, Iran</italic>
</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Niksirat, Kavous Salehzadeh" sort="Niksirat, Kavous Salehzadeh" uniqKey="Niksirat K" first="Kavous Salehzadeh" last="Niksirat">Kavous Salehzadeh Niksirat</name>
<affiliation>
<nlm:aff id="aff3">
<italic>Department of Control Engineering, Science and Research Branch, Islamic Azad University, Tehran, Iran</italic>
</nlm:aff>
</affiliation>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">PMC</idno>
<idno type="pmid">24083136</idno>
<idno type="pmc">3785069</idno>
<idno type="url">http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3785069</idno>
<idno type="RBID">PMC:3785069</idno>
<date when="2013">2013</date>
<idno type="wicri:Area/Pmc/Corpus">002024</idno>
<idno type="wicri:Area/Pmc/Curation">002024</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en" level="a" type="main">EEG Signature of Object Categorization from Event-related Potentials</title>
<author>
<name sortKey="Daliri, Mohammad Reza" sort="Daliri, Mohammad Reza" uniqKey="Daliri M" first="Mohammad Reza" last="Daliri">Mohammad Reza Daliri</name>
<affiliation>
<nlm:aff id="aff1"></nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Taghizadeh, Mitra" sort="Taghizadeh, Mitra" uniqKey="Taghizadeh M" first="Mitra" last="Taghizadeh">Mitra Taghizadeh</name>
<affiliation>
<nlm:aff id="aff2">
<italic>Department of Computer Science, Virtual Center, Iran University of Science and Technology, Tehran, Iran</italic>
</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Niksirat, Kavous Salehzadeh" sort="Niksirat, Kavous Salehzadeh" uniqKey="Niksirat K" first="Kavous Salehzadeh" last="Niksirat">Kavous Salehzadeh Niksirat</name>
<affiliation>
<nlm:aff id="aff3">
<italic>Department of Control Engineering, Science and Research Branch, Islamic Azad University, Tehran, Iran</italic>
</nlm:aff>
</affiliation>
</author>
</analytic>
<series>
<title level="j">Journal of Medical Signals and Sensors</title>
<idno type="ISSN">2228-7477</idno>
<idno type="eISSN">2228-7477</idno>
<imprint>
<date when="2013">2013</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass></textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">
<p>The human visual system recognizes objects rapidly, and the neural activity of the human brain generates signals that carry information about the object categories seen by a subject. These brain signals can be recorded with different systems, such as the electroencephalogram (EEG), and EEG signals carry significant information about the stimuli that excite the brain. To translate information derived from the EEG into an object-recognition mechanism, twelve different categories were selected as visual stimuli in this study and presented to the subjects in a controlled task, while the signals were recorded through a 19-channel EEG recording system. The signals were analyzed using two different event-related potential (ERP) computations, namely the “target/rest” and “target/non-target” comparisons. Comparing the ERP of the target period with the rest period indicated that the electrodes most involved in our task were F3, F4, C3, C4, Fz, and Cz, among others. The “target/non-target” ERP showed that for target stimuli two positive peaks occurred about 400 ms and 520 ms after stimulus onset, whereas for non-target stimuli only one positive peak appeared, about 400 ms after stimulus onset. Moreover, the subjects’ reaction times were computed: the flower category had the lowest reaction time, whereas the stationery category had the highest. The results provide useful information about which channels and which parts of the signals are affected by different object categories in terms of ERP brain signals. This study can be considered a first step toward human-computer interface applications.</p>
</div>
</front>
<back>
<div1 type="bibliography">
<listBibl>
<biblStruct>
<analytic>
<author>
<name sortKey="Sanei, S" uniqKey="Sanei S">S Sanei</name>
</author>
<author>
<name sortKey="Chambers, Ja" uniqKey="Chambers J">JA Chambers</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Binder, Jr" uniqKey="Binder J">JR Binder</name>
</author>
<author>
<name sortKey="Desai, Rh" uniqKey="Desai R">RH Desai</name>
</author>
<author>
<name sortKey="Graves, Ww" uniqKey="Graves W">WW Graves</name>
</author>
<author>
<name sortKey="Conant, Ll" uniqKey="Conant L">LL Conant</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Walter, Wg" uniqKey="Walter W">WG Walter</name>
</author>
<author>
<name sortKey="Cooper, R" uniqKey="Cooper R">R Cooper</name>
</author>
<author>
<name sortKey="Aldridge, Vj" uniqKey="Aldridge V">VJ Aldridge</name>
</author>
<author>
<name sortKey="Mccallum, Wc" uniqKey="Mccallum W">WC Mccallum</name>
</author>
<author>
<name sortKey="Winter, Al" uniqKey="Winter A">AL Winter</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Sutton, S" uniqKey="Sutton S">S Sutton</name>
</author>
<author>
<name sortKey="Braren, M" uniqKey="Braren M">M Braren</name>
</author>
<author>
<name sortKey="Zubin, J" uniqKey="Zubin J">J Zubin</name>
</author>
<author>
<name sortKey="John, Er" uniqKey="John E">ER John</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Johnson Jr, R" uniqKey="Johnson Jr R">R Johnson Jr</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Coben, La" uniqKey="Coben L">LA Coben</name>
</author>
<author>
<name sortKey="Danziger, Wl" uniqKey="Danziger W">WL Danziger</name>
</author>
<author>
<name sortKey="Hughes, Cp" uniqKey="Hughes C">CP Hughes</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Visser, Sl" uniqKey="Visser S">SL Visser</name>
</author>
<author>
<name sortKey="Van Tilburg, W" uniqKey="Van Tilburg W">W Van Tilburg</name>
</author>
<author>
<name sortKey="Hooijer, C" uniqKey="Hooijer C">C Hooijer</name>
</author>
<author>
<name sortKey="Jonker, C" uniqKey="Jonker C">C Jonker</name>
</author>
<author>
<name sortKey="De Rijke, W" uniqKey="De Rijke W">W De Rijke</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Cosi, V" uniqKey="Cosi V">V Cosi</name>
</author>
<author>
<name sortKey="Vitelli, E" uniqKey="Vitelli E">E Vitelli</name>
</author>
<author>
<name sortKey="Gozzoli, L" uniqKey="Gozzoli L">L Gozzoli</name>
</author>
<author>
<name sortKey="Corona, A" uniqKey="Corona A">A Corona</name>
</author>
<author>
<name sortKey="Ceroni, M" uniqKey="Ceroni M">M Ceroni</name>
</author>
<author>
<name sortKey="Callieco, R" uniqKey="Callieco R">R Callieco</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Nunez, Pl" uniqKey="Nunez P">PL Nunez</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Picton, Tw" uniqKey="Picton T">TW Picton</name>
</author>
<author>
<name sortKey="Lins, Do" uniqKey="Lins D">DO Lins</name>
</author>
<author>
<name sortKey="Scherg, M" uniqKey="Scherg M">M Scherg</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Perrin, F" uniqKey="Perrin F">F Perrin</name>
</author>
<author>
<name sortKey="Pernier, J" uniqKey="Pernier J">J Pernier</name>
</author>
<author>
<name sortKey="Bertrand, O" uniqKey="Bertrand O">O Bertrand</name>
</author>
<author>
<name sortKey="Echallier, Jf" uniqKey="Echallier J">JF Echallier</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Gerlach, C" uniqKey="Gerlach C">C Gerlach</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Adorni, R" uniqKey="Adorni R">R Adorni</name>
</author>
<author>
<name sortKey="Proverbio, Am" uniqKey="Proverbio A">AM Proverbio</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Lloyd Jones, Tj" uniqKey="Lloyd Jones T">TJ Lloyd-Jones</name>
</author>
<author>
<name sortKey="Humphreys, Gw" uniqKey="Humphreys G">GW Humphreys</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Simanova, I" uniqKey="Simanova I">I Simanova</name>
</author>
<author>
<name sortKey="Van Gerven, M" uniqKey="Van Gerven M">M van Gerven</name>
</author>
<author>
<name sortKey="Oostenveld, R" uniqKey="Oostenveld R">R Oostenveld</name>
</author>
<author>
<name sortKey="Hagoort, P" uniqKey="Hagoort P">P Hagoort</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Johnson, Js" uniqKey="Johnson J">JS Johnson</name>
</author>
<author>
<name sortKey="Olshausen, Ba" uniqKey="Olshausen B">BA Olshausen</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Johnson, Js" uniqKey="Johnson J">JS Johnson</name>
</author>
<author>
<name sortKey="Olshausen, Ba" uniqKey="Olshausen B">BA Olshausen</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Fuggetta, G" uniqKey="Fuggetta G">G Fuggetta</name>
</author>
<author>
<name sortKey="Rizzo, S" uniqKey="Rizzo S">S Rizzo</name>
</author>
<author>
<name sortKey="Pobric, G" uniqKey="Pobric G">G Pobric</name>
</author>
<author>
<name sortKey="Lavidor, M" uniqKey="Lavidor M">M Lavidor</name>
</author>
<author>
<name sortKey="Walsh, V" uniqKey="Walsh V">V Walsh</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Pinto, N" uniqKey="Pinto N">N Pinto</name>
</author>
<author>
<name sortKey="Cox, Dd" uniqKey="Cox D">DD Cox</name>
</author>
<author>
<name sortKey="Dicarlo, Jj" uniqKey="Dicarlo J">JJ DiCarlo</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Proverbio, Am" uniqKey="Proverbio A">AM Proverbio</name>
</author>
<author>
<name sortKey="Del Zotto, M" uniqKey="Del Zotto M">M Del Zotto</name>
</author>
<author>
<name sortKey="Zani, A" uniqKey="Zani A">A Zani</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Muller, Kr" uniqKey="Muller K">KR Müller</name>
</author>
<author>
<name sortKey="Tangermann, M" uniqKey="Tangermann M">M Tangermann</name>
</author>
<author>
<name sortKey="Dornhege, G" uniqKey="Dornhege G">G Dornhege</name>
</author>
<author>
<name sortKey="Krauledat, M" uniqKey="Krauledat M">M Krauledat</name>
</author>
<author>
<name sortKey="Curio, G" uniqKey="Curio G">G Curio</name>
</author>
<author>
<name sortKey="Blankertz, B" uniqKey="Blankertz B">B Blankertz</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Martin, A" uniqKey="Martin A">A Martin</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Martinovic, J" uniqKey="Martinovic J">J Martinovic</name>
</author>
<author>
<name sortKey="Mordal, J" uniqKey="Mordal J">J Mordal</name>
</author>
<author>
<name sortKey="Wuerger, Sm" uniqKey="Wuerger S">SM Wuerger</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Rossion, B" uniqKey="Rossion B">B Rossion</name>
</author>
<author>
<name sortKey="Boremanse, A" uniqKey="Boremanse A">A Boremanse</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Nazari, Ma" uniqKey="Nazari M">MA Nazari</name>
</author>
<author>
<name sortKey="Wallois, F" uniqKey="Wallois F">F Wallois</name>
</author>
<author>
<name sortKey="Aarabi, A" uniqKey="Aarabi A">A Aarabi</name>
</author>
<author>
<name sortKey="Nosratabadi, M" uniqKey="Nosratabadi M">M Nosratabadi</name>
</author>
<author>
<name sortKey="Patrick, B" uniqKey="Patrick B">B Patrick</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Dering, B" uniqKey="Dering B">B Dering</name>
</author>
<author>
<name sortKey="Martin, Cd" uniqKey="Martin C">CD Martin</name>
</author>
<author>
<name sortKey="Moro, S" uniqKey="Moro S">S Moro</name>
</author>
<author>
<name sortKey="Pegna, Aj" uniqKey="Pegna A">AJ Pegna</name>
</author>
<author>
<name sortKey="Thierry, G" uniqKey="Thierry G">G Thierry</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Martinovic, J" uniqKey="Martinovic J">J Martinovic</name>
</author>
<author>
<name sortKey="Gruber, T" uniqKey="Gruber T">T Gruber</name>
</author>
<author>
<name sortKey="Muller, Mm" uniqKey="Muller M">MM Müller</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Van Hoogmoed, Ah" uniqKey="Van Hoogmoed A">AH van Hoogmoed</name>
</author>
<author>
<name sortKey="Van Den Brink, D" uniqKey="Van Den Brink D">D van den Brink</name>
</author>
<author>
<name sortKey="Janzen, G" uniqKey="Janzen G">G Janzen</name>
</author>
</analytic>
</biblStruct>
</listBibl>
</div1>
</back>
</TEI>
<pmc article-type="research-article">
<pmc-dir>properties open_access</pmc-dir>
<front>
<journal-meta>
<journal-id journal-id-type="nlm-ta">J Med Signals Sens</journal-id>
<journal-id journal-id-type="publisher-id">JMSS</journal-id>
<journal-title-group>
<journal-title>Journal of Medical Signals and Sensors</journal-title>
</journal-title-group>
<issn pub-type="ppub">2228-7477</issn>
<issn pub-type="epub">2228-7477</issn>
<publisher>
<publisher-name>Medknow Publications &amp; Media Pvt Ltd</publisher-name>
<publisher-loc>India</publisher-loc>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="pmid">24083136</article-id>
<article-id pub-id-type="pmc">3785069</article-id>
<article-id pub-id-type="publisher-id">JMSS-3-37</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Original Article</subject>
</subj-group>
</article-categories>
<title-group>
<article-title>EEG Signature of Object Categorization from Event-related Potentials</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author">
<name>
<surname>Daliri</surname>
<given-names>Mohammad Reza</given-names>
</name>
<xref ref-type="aff" rid="aff1"></xref>
<xref ref-type="corresp" rid="cor1"></xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Taghizadeh</surname>
<given-names>Mitra</given-names>
</name>
<xref ref-type="aff" rid="aff2">1</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Niksirat</surname>
<given-names>Kavous Salehzadeh</given-names>
</name>
<xref ref-type="aff" rid="aff3">2</xref>
</contrib>
</contrib-group>
<aff id="aff1">
<italic>Department of Biomedical Engineering, Iran University of Science and Technology, Tehran, Iran</italic>
</aff>
<aff id="aff2">
<label>1</label>
<italic>Department of Computer Science, Virtual Center, Iran University of Science and Technology, Tehran, Iran</italic>
</aff>
<aff id="aff3">
<label>2</label>
<italic>Department of Control Engineering, Science and Research Branch, Islamic Azad University, Tehran, Iran</italic>
</aff>
<author-notes>
<corresp id="cor1">
<bold>Address for correspondence:</bold>
Dr. Mohammad Reza Daliri, Department of Biomedical Engineering, Faculty of Electrical Engineering, Iran University of Science and Technology (IUST) 16846-13114 Tehran, Iran. E-mail:
<email xlink:href="daliri@iust.ac.ir">daliri@iust.ac.ir</email>
</corresp>
</author-notes>
<pub-date pub-type="ppub">
<season>Jan-Mar</season>
<year>2013</year>
</pub-date>
<volume>3</volume>
<issue>1</issue>
<fpage>37</fpage>
<lpage>44</lpage>
<history>
<date date-type="received">
<day>11</day>
<month>9</month>
<year>2012</year>
</date>
<date date-type="accepted">
<day>05</day>
<month>1</month>
<year>2013</year>
</date>
</history>
<permissions>
<copyright-statement>Copyright: © Journal of Medical Signals and Sensors</copyright-statement>
<copyright-year>2013</copyright-year>
<license license-type="open-access" xlink:href="http://creativecommons.org/licenses/by-nc-sa/3.0">
<license-p>This is an open-access article distributed under the terms of the Creative Commons Attribution-Noncommercial-Share Alike 3.0 Unported, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.</license-p>
</license>
</permissions>
<abstract>
<p>The human visual system recognizes objects rapidly, and the neural activity of the human brain generates signals that carry information about the object categories seen by a subject. These brain signals can be recorded with different systems, such as the electroencephalogram (EEG), and EEG signals carry significant information about the stimuli that excite the brain. To translate information derived from the EEG into an object-recognition mechanism, twelve different categories were selected as visual stimuli in this study and presented to the subjects in a controlled task, while the signals were recorded through a 19-channel EEG recording system. The signals were analyzed using two different event-related potential (ERP) computations, namely the “target/rest” and “target/non-target” comparisons. Comparing the ERP of the target period with the rest period indicated that the electrodes most involved in our task were F3, F4, C3, C4, Fz, and Cz, among others. The “target/non-target” ERP showed that for target stimuli two positive peaks occurred about 400 ms and 520 ms after stimulus onset, whereas for non-target stimuli only one positive peak appeared, about 400 ms after stimulus onset. Moreover, the subjects’ reaction times were computed: the flower category had the lowest reaction time, whereas the stationery category had the highest. The results provide useful information about which channels and which parts of the signals are affected by different object categories in terms of ERP brain signals. This study can be considered a first step toward human-computer interface applications.</p>
</abstract>
<kwd-group>
<kwd>
<italic>Event-related potential</italic>
</kwd>
<kwd>
<italic>object categorization</italic>
</kwd>
<kwd>
<italic>brain-computer interface applications</italic>
</kwd>
</kwd-group>
</article-meta>
</front>
<body>
<sec id="sec1-1">
<title>INTRODUCTION</title>
<p>The neural activities of the human brain generate some of the most significant biomedical signals, encoding information about a person's surroundings and internal states. Understanding the characteristics of neural data can be used to decode human thoughts and actions, and can also be applied to diagnosing the brain's neuropsychological diseases. For these applications, methods for brain data collection, preprocessing, and analysis have become among the most challenging topics in this field in recent decades. Among the non-invasive methods of brain-signal acquisition, such as the magnetoencephalogram, functional magnetic resonance imaging, and the electroencephalogram (EEG), the EEG is the most appropriate for brain data collection because of the low cost of experiment design and data collection and its higher temporal resolution compared with the other procedures.[
<xref ref-type="bibr" rid="ref1">1</xref>
<xref ref-type="bibr" rid="ref2">2</xref>
] The human EEG was pioneered by Hans Berger in 1929, who was able to measure, amplify, and plot the electrical activity of the human brain by placing electrodes on the scalp.</p>
<p>Human brain signals can be used for the diagnosis of brain disorders, either by visual inspection or by more sophisticated automatic analysis of the EEG. The EEG signal consists of five major frequency rhythms, called the delta (δ), theta (θ), alpha (α), beta (β), and gamma (γ) bands, covering frequencies of 0.5-4 Hz, 4-7.5 Hz, 7.5-15 Hz, 15-30 Hz, and 30-50 Hz, respectively. The EEG signal is easily contaminated by noise and artifacts from physical movements or the environment, so preprocessing of EEG signals consists of artifact detection and correction techniques; the discrete wavelet transform, independent component analysis, and principal component analysis are the most popular and accurate strategies for this purpose. Various methods have been proposed for processing and analyzing EEG signals. The event-related potential (ERP) is one of the best-known signal-analysis methods and was first introduced in 1964.[
<xref ref-type="bibr" rid="ref3">3</xref>
<xref ref-type="bibr" rid="ref4">4</xref>
] The ERP has been a useful tool in both psychiatry and neurology and is widely applied in brain-computer interface (BCI) applications. An ERP is the summed electrical response of the EEG to a cognitive event; such responses are typically generated by external stimulation, such as visual or auditory stimuli, that causes voltage fluctuations in the EEG. ERP waveforms are characterized by three key parameters: amplitude, latency, and distribution over the scalp.[
<xref ref-type="bibr" rid="ref5">5</xref>
] ERP has been used for many clinical applications as well. For example, Coben
<italic>et al</italic>
.,[
<xref ref-type="bibr" rid="ref6">6</xref>
] analyzed and compared ERP components such as the N100, N400, and P300 in a variety of psychiatric and neurological conditions, such as Alzheimer's disease, in comparison with healthy people. Visser
<italic>et al</italic>
.[
<xref ref-type="bibr" rid="ref7">7</xref>
] and Cosi
<italic>et al</italic>
.[
<xref ref-type="bibr" rid="ref8">8</xref>
] carried out similar studies of non-organic behavioural disorders in elderly people. The ERP also provides a way to examine brain activity during a specific task, revealing how EEG potentials change when a stimulus is observed and how long it takes a human to give a correct response to the stimulus. Evaluating ERPs can produce an alphabet of the brain's language, enabling researchers to read the signals and understand the different states of the human brain. Researchers have considered various experimental designs, such as category-based brain activity for living versus man-made objects, and for particular object classes such as furniture, faces, animals, fruits, vegetables, buildings, tools, and parts.[
<xref ref-type="bibr" rid="ref9">9</xref>
<xref ref-type="bibr" rid="ref10">10</xref>
<xref ref-type="bibr" rid="ref11">11</xref>
<xref ref-type="bibr" rid="ref12">12</xref>
] Moreover, ERPs for visual, verbal, and auditory stimulus modalities, and for living versus non-living objects, have been analyzed.[
<xref ref-type="bibr" rid="ref13">13</xref>
<xref ref-type="bibr" rid="ref14">14</xref>
] Simanova
<italic>et al</italic>
.,[
<xref ref-type="bibr" rid="ref15">15</xref>
] investigated the possibility of identifying conceptual representations from event-related EEG based on three different forms of object presentation: the object's spoken name, its visual representation, and its written name. They achieved high classification performance, with an accuracy of 89% for drawings of objects. Johnson and Olshausen (2003)[
<xref ref-type="bibr" rid="ref16">16</xref>
] studied how long the human visual system takes to recognize objects. They designed a series of ERP experiments to measure the time course of the electrophysiological correlates of object recognition and found two distinct components in the ERP recorded during the categorization of natural images: an early, presentation-locked signal arising around 135 ms after stimulus onset, present when there were low-level feature differences between images, and a later, recognition-related component arising between 150 ms and 300 ms after stimulus onset. Johnson and Olshausen (2005)[
<xref ref-type="bibr" rid="ref17">17</xref>
] also designed three sets of ERP experiments to characterize the processes behind the target-minus-non-target difference signals seen in visual cued-target paradigms. They showed that the same difference signals were obtained when the target match was made to word stimuli as well as to object stimuli, and that the signal amplitude was inversely related to task difficulty.</p>
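The ERP computation discussed above, averaging many stimulus-locked EEG epochs so that activity unrelated to the event averages out, can be sketched as follows. This is a minimal illustration on synthetic data; the sampling rate, window lengths, and the injected 400 ms positivity are assumptions for the demo, not the authors' recording parameters:

```python
import numpy as np

def compute_erp(eeg, onsets, fs, pre=0.2, post=0.8):
    """Average stimulus-locked epochs of a single-channel EEG trace.

    eeg      : 1-D array of samples
    onsets   : stimulus-onset times in samples
    fs       : sampling rate in Hz
    pre/post : epoch window in seconds before/after onset
    Returns (times_ms, erp), where erp is the across-trial mean.
    """
    n_pre, n_post = int(pre * fs), int(post * fs)
    epochs = []
    for t in onsets:
        if t - n_pre < 0 or t + n_post > len(eeg):
            continue  # skip epochs that run off the recording
        epoch = eeg[t - n_pre : t + n_post].astype(float)
        epoch -= epoch[:n_pre].mean()  # baseline-correct on the pre-stimulus interval
        epochs.append(epoch)
    erp = np.mean(epochs, axis=0)
    times_ms = (np.arange(-n_pre, n_post) / fs) * 1000.0
    return times_ms, erp

# Synthetic demo: white noise with a small evoked positivity after each onset
rng = np.random.default_rng(0)
fs = 250
eeg = rng.normal(0, 1, fs * 60)
onsets = np.arange(2 * fs, 55 * fs, 2 * fs)
bump = np.exp(-0.5 * ((np.arange(int(0.8 * fs)) / fs - 0.4) / 0.05) ** 2)
for t in onsets:
    eeg[t : t + len(bump)] += 2.0 * bump  # positivity peaking ~400 ms post-onset

times, erp = compute_erp(eeg, onsets, fs)
peak_ms = times[np.argmax(erp)]
```

With real recordings, the same averaging would be applied per electrode and per condition (target, non-target, rest) before the waveforms are compared.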
<p>It has also been shown that objects and their sensory or functional attributes (such as tool-associated actions) activate the same neural regions, suggesting that these regions are implicitly involved in concept representation.[
<xref ref-type="bibr" rid="ref18">18</xref>
<xref ref-type="bibr" rid="ref19">19</xref>
<xref ref-type="bibr" rid="ref20">20</xref>
<xref ref-type="bibr" rid="ref21">21</xref>
] Martin (2007)[
<xref ref-type="bibr" rid="ref22">22</xref>
] showed in his object-recognition study that object features are coded very rapidly and play different functional roles, while color and extra contours and edges delay recognition.</p>
<p>Martinovic
<italic>et al</italic>
.,[
<xref ref-type="bibr" rid="ref23">23</xref>
] studied the detection and identification of objects in visual perception. They compared the roles of luminance and chromatic information between full-color and reduced-color objects. In a subsequent electroencephalographic ERP experiment, full-color stimuli showed advantages in accuracy and high-level discrimination over reduced-color stimuli.</p>
<p>Rossion and Boremanse (2011),[
<xref ref-type="bibr" rid="ref24">24</xref>
] showed how the human brain discriminates complex visual patterns, such as individual faces, by testing sensitivity to individual faces with steady-state visual-evoked potentials (SSVEPs). Using the fast Fourier transform (FFT) of the EEG, they found a large response at the fundamental stimulation frequency (3.5 Hz) over posterior electrode sites. Nazari
<italic>et al</italic>
.[
<xref ref-type="bibr" rid="ref25">25</xref>
] showed that several differences in the P300 component are observed when responses must be executed in a “go/no-go” task. They examined the peak amplitude and latency of the Go-P300 and NoGo-P300 components in healthy children, measuring the P300 at frontal (F3, Fz, F4) and parietal (P3, Pz, P4) regions. The results showed a higher P300 amplitude in the Go condition than in the No-Go condition at the parietal region, and a shorter P300 latency at the frontal region than at the parietal region. Dering
<italic>et al</italic>
.,[
<xref ref-type="bibr" rid="ref26">26</xref>
] measured, in three experiments, the amplitude and latency of the mean P1 and N170 in response to faces, cars, and butterflies, either cropped or morphed. The N170 was sensitive to cropping but did not differentiate frontal views of faces from cars, while the P1 amplitude was larger for faces than for objects. The authors concluded that the P1, not the N170, is a reliable face-sensitive event.</p>
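The SSVEP analysis mentioned above amounts to taking the FFT of the recording and reading off the spectral amplitude at the stimulation frequency. A minimal sketch on synthetic data follows; the 3.5 Hz fundamental comes from the cited study, while the sampling rate, duration, and noise level are illustrative assumptions:

```python
import numpy as np

fs = 250          # sampling rate in Hz (assumed)
f_stim = 3.5      # fundamental stimulation frequency from the cited study
duration = 40     # seconds; chosen so that 3.5 Hz falls exactly on an FFT bin

t = np.arange(int(fs * duration)) / fs
rng = np.random.default_rng(1)
# Synthetic posterior-electrode trace: white noise plus a steady-state
# response locked to the stimulation frequency
eeg = rng.normal(0, 1, t.size) + 1.5 * np.sin(2 * np.pi * f_stim * t)

spectrum = np.abs(np.fft.rfft(eeg)) / t.size   # one-sided amplitude spectrum
freqs = np.fft.rfftfreq(t.size, d=1 / fs)

peak_freq = freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin
```

The steady-state response stands out as a sharp spectral peak at the stimulation frequency, which is what makes the SSVEP approach robust to broadband EEG noise.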
<p>The aim of the current study is to clarify the differences between the target/rest and target/non-target ERP signatures of the signals, such as their time courses and waveform shapes. To this end, many more categories are used than in previous studies; consequently, studying the ERP signals is more complicated, but the scenario is closer to what humans face in everyday life, which makes the results more reliable.</p>
<p>The experimental setup of our study consists of the presentation of visual stimuli from twelve different categories (clothing, animals, foods, flowers, fruits, human body organs, transportation devices, dolls, electronic devices, jewelry, stationery, buildings, and scenes). We used an EEG recording system to measure the ERPs of the human brain. All of the participants in our study were healthy; patients are not addressed here.</p>
</sec>
<sec sec-type="materials|methods" id="sec1-2">
<title>MATERIALS AND METHODS</title>
<sec id="sec2-1">
<title>Participants</title>
<p>Ten human volunteers (2 females and 8 males, mean age 25) participated in the study. All but one were right-handed, and none reported any psychological or neurological abnormalities. All participants received a gift for their participation and were informed about the task paradigm before entering the experiment booth.</p>
</sec>
<sec id="sec2-2">
<title>Stimuli</title>
<p>Twelve different categories (clothing, animals, foods, flowers, fruits, human body organs, transportation devices, dolls, electronic devices, jewelry, stationery, buildings, and scenes) were selected, with each category containing five different images. All images were collected from the Internet: color images with the same resolution of 600 × 800 pixels and similar difficulty, where difficulty means how easy it is to recognize the picture at first glance and assign it to the correct category. Monitor parameters such as luminance and contrast were normalized, and all stimuli were presented at the exact center of the screen against a black background. The monitor was a 17-inch LCD with a resolution of 1280 × 800.
<xref ref-type="fig" rid="F1">Figure 1</xref>
shows some sample images used for the experiment. The procedure was divided into two visual-task parts: six categories were presented in Part 1 and the remaining six in Part 2, with a three-minute gap in between, to let the subjects’ eyes rest and to reduce eye-blink artifacts in the EEG.
<xref ref-type="fig" rid="F2">Figure 2</xref>
indicates the task paradigm.</p>
<fig id="F1" position="float">
<label>Figure 1</label>
<caption>
<p>Three different normalized sample images (not shown at full size)</p>
</caption>
<graphic xlink:href="JMSS-3-37-g001"></graphic>
</fig>
<fig id="F2" position="float">
<label>Figure 2</label>
<caption>
<p>Task paradigm including the first and the second part of the experiment with timing of presentation</p>
</caption>
<graphic xlink:href="JMSS-3-37-g002"></graphic>
</fig>
</sec>
<sec id="sec2-3">
<title>Task of Study</title>
<p>The experiment designed for this study was a “cued-target”, “go/no-go” ERP categorization task. Cued-target means that the category name (target cue) was displayed on the screen before the trial started to inform the participants of the target images. The target cue remained on the screen for 1 s; after an additional 900 ms delay, the images were presented. Each image was presented for 700 ms, the rest time (blank-screen interval) between two images was 800 ms, and there was a 5 s rest between two different categories. The subjects were asked to press the left mouse button with the right index finger to deliver their responses as quickly as possible, and to delay their blinks to the 800 ms resting time. All images were shown centrally on the LCD monitor. Image presentation was controlled by a Pentium 4 PC with 512 MB RAM and a 40 GB HDD running the PsyTask software. PsyTask is a program for visual/auditory stimulus presentation and psychophysiological investigations; it works in conjunction with the WinEEG software. The viewing distance was 75 cm from the screen. In the “go/no-go” task, subjects had to left-click when they saw a target image and withhold any response to non-target images. They were given a maximum of 700 ms to respond; responses after this period were considered incorrect. Subjects observed 360 pictures in the two-section experiment [
<xref ref-type="fig" rid="F2">Figure 2</xref>
].</p>
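The trial timing above can be summarized in a short sketch. This is our own illustration, not the authors' code; all constants are the durations quoted in the text, and the total assumes a single presentation of each of the five images per category, so it is a lower bound for one part of the session:

```python
# Trial timing from the task description (all durations in milliseconds).
# Variable names are our own; this is an illustrative sketch only.
CUE_MS = 1000            # target-cue display
DELAY_MS = 900           # delay between cue offset and first image
IMAGE_MS = 700           # image presentation (also the response window)
BLANK_MS = 800           # blank-screen interval between images
CATEGORY_REST_MS = 5000  # rest between two different categories

IMAGES_PER_CATEGORY = 5
CATEGORIES_PER_PART = 6

def part_duration_ms():
    """Approximate duration of one task part, assuming one pass per image."""
    per_image = IMAGE_MS + BLANK_MS
    per_category = CUE_MS + DELAY_MS + IMAGES_PER_CATEGORY * per_image
    rests = (CATEGORIES_PER_PART - 1) * CATEGORY_REST_MS
    return CATEGORIES_PER_PART * per_category + rests

print(part_duration_ms() / 1000, "s per part")  # 81.4 s per part
```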
</sec>
<sec id="sec2-4">
<title>EEG Recording System</title>
<p>The subjects wore a 19-channel electrode cap consisting of Fp1, Fp2, F3, F4, C3, C4, P3, P4, F7, F8, T3, T4, T5, T6, Fz, Cz, Pz, O1, and O2; additionally, A1 and A2 were connected to the left and right earlobes, respectively, as reference electrodes [
<xref ref-type="fig" rid="F3">Figure 3</xref>
]. Recording electrodes were placed according to the 10-20 system of channel positions. Subjects performed the experiment in an electrically shielded and sound-damped booth. The EEG signals were amplified by MITSAR hardware and then sent to an A/D converter. The PC-controlled electroencephalographic “Mitsar-EEG” system handles the acquisition, storage on the personal computer's HDD, processing, display, and printout of electroencephalographic signals. Recording was done at a 500 Hz sampling frequency.
<xref ref-type="fig" rid="F4">Figure 4</xref>
shows the EEG recording system, which comprises two separate, synchronized PCs: one presented the stimuli via PsyTask, and the other recorded the signals with the WinEEG software, adding labels to the raw signals. Labeling is an essential step for ERP computation, and the WinEEG/PsyTask combination provides this capability. In this study, labels ending in one denote non-target images and labels ending in two denote target images that should be responded to by the subjects. For example, the label 12 means an image of category 1 was presented and required a response, while 11 means an image of category 1 was presented and required no response; thus, the digits before the last digit indicate the image category.</p>
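The label scheme described above can be expressed as a minimal sketch (our own illustration, not the authors' code): the last digit encodes target versus non-target, and the remaining leading digits encode the category.

```python
# Decode an event label of the form <category digits><1|2>:
# a trailing 2 marks a target image (response required),
# a trailing 1 marks a non-target image (no response).
def decode_label(label: int):
    category = label // 10       # digits before the last digit
    is_target = (label % 10) == 2
    return category, is_target

assert decode_label(12) == (1, True)    # category 1, target
assert decode_label(11) == (1, False)   # category 1, non-target
assert decode_label(122) == (12, True)  # category 12, target
```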
<fig id="F3" position="float">
<label>Figure 3</label>
<caption>
<p>Standard layout for electrode placement in a 19-channel electroencephalogram recording system, with the A1 and A2 references linked to the earlobes. Red-marked electrodes are the most involved in our task</p>
</caption>
<graphic xlink:href="JMSS-3-37-g003"></graphic>
</fig>
<fig id="F4" position="float">
<label>Figure 4</label>
<caption>
<p>Electroencephalogram recording system including two different PCs, an amplifier (MITSAR), a mouse, and 19 electrodes</p>
</caption>
<graphic xlink:href="JMSS-3-37-g004"></graphic>
</fig>
</sec>
</sec>
<sec id="sec1-3">
<title>EXPERIMENTAL RESULTS AND DISCUSSION</title>
<p>In the current study, the ERPs of the twelve categories recorded through a 19-channel EEG system were computed, and the results were averaged across participants using the WinEEG software. Two different analyses were performed: “target/rest” and “target/non-target”. To localize the electrodes active during the task paradigm, the ERP of all correctly responded stimuli of the 12 categories from 8 subjects was measured and averaged (the data from subjects 4 and 6 contained a large number of artifacts and were excluded from the analysis) and compared with the rest time; these are the “target/rest” results. In addition, a second ERP analysis compared correctly responded target images with non-target images, in order to identify differences in the brain signals evoked by the same images when presented as targets versus non-targets.
<xref ref-type="fig" rid="F5">Figure 5</xref>
shows the 19-channel “target/rest” ERPs with topographical maps, and
<xref ref-type="fig" rid="F6">Figure 6</xref>
shows the ERP of the Cz electrode alone (chosen because it had the highest voltage among the electrodes) with peak-latency maps.
<xref ref-type="fig" rid="F5">Figure 5</xref>
also relates to the “target/non-target” ERP with full mapping from 100 ms to 800 ms. In the target/rest case, the “rest” ERP follows the “target” ERP in time, whereas in the target/non-target case the two ERPs are directly compared. It can be clearly seen from
<xref ref-type="fig" rid="F5">Figure 5</xref>
that the electrodes most involved during the visual task, compared with the rest time, are F3, F4, C3, C4, Fz, and Cz. The frontal electrodes F3, F4, and Fz are mostly related to judgment and problem solving, and our subjects were in a judgment state while categorizing the objects. C3, C4, and Cz are sensorimotor electrodes measuring motor output, and because the subjects used the left mouse click in the task, these electrodes showed high amplitudes. Three components in the “target/rest” ERP can be clearly seen in
<xref ref-type="fig" rid="F6">Figure 6</xref>
: N248, P390, and P526. N248 is distributed mostly over the O1 and O2 channels, which overlie occipital visual areas, and is negative. P390 reflects the subjects' attention: at about 390 ms, the stimuli were mentally recognized as targets, and the subjects responded at around P526, so this component is related to the reaction, or the intention to respond.
<xref ref-type="fig" rid="F7">Figure 7</xref>
shows the “target/non-target” ERPs and the features of the target ERP not matched by the non-target ERP: as mentioned, the target ERP has two positive peaks, whereas the non-target ERP has only one, since the subjects made no response.</p>
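The ERP computation underlying these comparisons can be illustrated with a short sketch (our own illustration, not the WinEEG implementation): epochs time-locked to stimulus onset are averaged so that activity not phase-locked to the stimulus tends to cancel out.

```python
# Illustrative ERP averaging sketch; array shapes and function name are
# our own assumptions, not part of the original study's software.
import numpy as np

FS = 500  # sampling rate used in the study (Hz)

def compute_erp(eeg, onsets, pre_ms=100, post_ms=700):
    """Average channel-by-time epochs around each stimulus onset.

    eeg:    array of shape (n_channels, n_samples)
    onsets: sample indices of stimulus onsets
    """
    pre = int(pre_ms * FS / 1000)
    post = int(post_ms * FS / 1000)
    epochs = [eeg[:, t - pre:t + post] for t in onsets
              if t - pre >= 0 and t + post <= eeg.shape[1]]
    return np.mean(epochs, axis=0)  # shape (n_channels, pre + post)

# Usage: a target/non-target contrast is simply the difference of two
# such averages, computed over the corresponding onset lists.
```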
<fig id="F5" position="float">
<label>Figure 5</label>
<caption>
<p>(a) Event-related potentials of 19 scalp electrodes from 100 ms to 1400 ms, including target and rest time, averaged over 8 participants; the bar below the graphs indicates time points where the waveforms differ significantly (P < 0.001), (b) topographical mapping of the brain activities from 100 ms to 1400 ms, indicating the electrodes active during visual stimulation (F3, F4, Fz, C3, C4, Cz) versus the inactive ones (the remaining electrodes) with low amplitude during the rest time</p>
</caption>
<graphic xlink:href="JMSS-3-37-g005"></graphic>
</fig>
<fig id="F6" position="float">
<label>Figure 6</label>
<caption>
<p>Three peaks occurring in the target-stimulus event-related potential, N248, P390, and P526, with their corresponding maps</p>
</caption>
<graphic xlink:href="JMSS-3-37-g006"></graphic>
</fig>
<fig id="F7" position="float">
<label>Figure 7</label>
<caption>
<p>(a) “Target/non-target” event-related potentials from 100 ms to 700 ms for the F3, F4, Fz, C3, C4, and Cz channels, (b) the activity of the Cz channel, zoomed. Two peaks appeared for targets, but for non-targets only one peak can be seen, as no response intention is involved</p>
</caption>
<graphic xlink:href="JMSS-3-37-g007"></graphic>
</fig>
<sec id="sec2-5">
<title>Reaction Time</title>
<p>The reaction times of the 8 subjects were computed and listed in
<xref ref-type="table" rid="T1">Table 1</xref>
, from lowest to highest, in order to distinguish among all 12 categories. The results indicate that the flower category had the fastest reaction time, 392.2 ms, with the lowest standard deviation of ±81.2 ms (
<italic>P</italic>
< 0.001). In contrast, the stationery category had the slowest reaction time, 458.1 ms, with the maximum standard deviation of ±156.4 ms (
<italic>P</italic>
< 0.001). All statistical analyses were performed with a one-way analysis of variance (one-way ANOVA).</p>
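The one-way ANOVA used here can be sketched in a few lines. The reaction-time values below are made-up placeholders for illustration, not the study's data, and the function is our own minimal implementation of the F statistic:

```python
# Minimal one-way ANOVA F statistic (between-group mean square divided by
# within-group mean square). Illustrative sketch with fabricated data.
def one_way_anova_f(groups):
    all_vals = [x for g in groups for x in g]
    grand_mean = sum(all_vals) / len(all_vals)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
                     for g in groups)
    ss_within = sum((x - sum(g) / len(g)) ** 2
                    for g in groups for x in g)
    df_between = len(groups) - 1
    df_within = len(all_vals) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

# Hypothetical per-subject reaction times in ms (placeholders only):
rt_flower = [390, 400, 385, 395]
rt_stationery = [450, 470, 455, 460]
rt_animals = [420, 430, 415, 425]

f_stat = one_way_anova_f([rt_flower, rt_stationery, rt_animals])
print(f"F = {f_stat:.2f}")
```

A large F relative to the F distribution with the corresponding degrees of freedom yields the small P values (P < 0.001) reported above.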
<table-wrap id="T1" position="float">
<label>Table 1</label>
<caption>
<p>Reaction times for 12 categories averaged over all subjects</p>
</caption>
<graphic xlink:href="JMSS-3-37-g008"></graphic>
</table-wrap>
</sec>
</sec>
<sec sec-type="discussion" id="sec1-4">
<title>DISCUSSION</title>
<p>In this study, the EEG signals of eight subjects during visual stimulation with various categories were analyzed using the ERP analysis method. Our task combines two elements discussed in the earlier study by Johnson and Olshausen (2003) (cued-target and single-category presentation); however, our task comprises 12 categories, whereas they used two kinds of pictures (animal images and nature images). In this study, we identified three major ERP latencies (N248, P390, and P526), which reflect observation, cognition, and response, respectively. By comparison, Johnson and Olshausen (2003) demonstrated a signal arising around 135 ms and a later, recognition-related component arising between 150 ms and 300 ms. Unlike the early component, the latency of the later component co-varies with the subsequent reaction time. Additionally, Johnson and Olshausen (2005) reported that signal amplitude decreases with increasing picture difficulty, whereas here we selected colored pictures with the same difficulty level. Martinovic, Gruber, and Müller (2008)[
<xref ref-type="bibr" rid="ref27">27</xref>
] showed in their object recognition task how object features are coded, with extra features such as color and edges delaying recognition, while Simanova, van Gerven, Oostenveld, and Hagoort (2010)[
<xref ref-type="bibr" rid="ref15">15</xref>
] reported a highest classification accuracy of 89% for labeled pictures and spoken stimuli. Based on these references, we chose colored images, which are a more realistic scenario, quite similar to real-life images. Van Hoogmoed,
<italic>et al</italic>
.,[
<xref ref-type="bibr" rid="ref28">28</xref>
] in their electrophysiological study compared ERP scalp-distribution maps across five conditions (side change, depth change, disappearance, identity change, and switch) in three latency windows (N2, N3, and P3). Their study showed different activation areas over the scalp; in contrast, we obtained the same activation pattern for all of our colored categories. The most significant neural activity of the brain was seen mostly at the F3, F4, C3, C4, Fz, and Cz electrodes. Comparing the ERP of “target” images with the ERP of “non-target” images yielded two positive peaks for the target ERP versus one positive peak for the non-target ERP. Additionally, the reaction times of the 12 categories were measured and averaged over all subjects, indicating that the flower category had the lowest reaction time while the stationery category had the maximum reaction time. Regarding the nature of the ERP, it is an averaged signal that does not convey exactly what happened when a single picture was presented, so it is difficult to classify different categories using ERP information alone; however, it does capture the changes that occur due to different presentations, and can therefore provide useful information for selecting the proper channels and the proper parts of the signals for categorization.</p>
<p>In further work, the ERPs of the 12 different categories can be computed and compared with each other to understand the differences among the various image groupings in much more processing detail, toward producing a brain language and applications for BCI technologies.</p>
</sec>
<sec id="sec1-5">
<title>BIOGRAPHIES</title>
<p>
<inline-graphic xlink:href="JMSS-3-37-g009.jpg"></inline-graphic>
</p>
<p>
<bold>Mohammad Reza Daliri</bold>
is an assistant professor in the Department of Biomedical Engineering, Faculty of Electrical Engineering, Iran University of Science and Technology (IUST), Tehran, Iran. Before joining IUST, he was a postdoctoral researcher at the Cognitive Neuroscience Laboratory, German Primate Center (DPZ), Goettingen, Germany. He holds a PhD in Cognitive Neuroscience from the International School for Advanced Studies (ISAS/SISSA), Trieste, Italy. His main research interests are computational and cognitive neuroscience, vision in humans and machines, and pattern recognition.</p>
<p>
<bold>E-mail</bold>
:
<email xlink:href="daliri@iust.ac.ir">daliri@iust.ac.ir</email>
</p>
<p>
<inline-graphic xlink:href="JMSS-3-37-g010.jpg"></inline-graphic>
</p>
<p>
<bold>Mitra Taghizadeh</bold>
received the B.S. degree in electronic engineering from the Department of Electronic Engineering, Islamic Azad University, Tabriz, Iran in 2007. She received two M.S. degrees, in ICT from the Department of Computer Science, Virtual Center, Iran University of Science and Technology (IUST), Tehran, Iran, and in Mechatronics Engineering from the Department of Mechanical Engineering, Islamic Azad University, Tabriz, Iran, both in 2012. Her research interests include the analysis of brain signals, EEG-based brain-computer interface development, and pattern recognition.</p>
<p>
<bold>E-mail</bold>
:
<email xlink:href="mitra.tagizadeh@gmail.com">mitra.tagizadeh@gmail.com</email>
</p>
<p>
<inline-graphic xlink:href="JMSS-3-37-g011.jpg"></inline-graphic>
</p>
<p>
<bold>Kavous Salehzadeh Niksirat</bold>
received the B.S. degree in electronic engineering from the Department of Electronic Engineering, Islamic Azad University, Tabriz, Iran in 2007. He received the M.S. degree in Control Engineering from the Department of Control Engineering, Science and Research Branch, Islamic Azad University, Tehran, Iran in 2011. His research interests are medical and rehabilitation robotics, haptics and teleoperation, pattern recognition, and brain-signal analysis.</p>
<p>
<bold>E-mail</bold>
:
<email xlink:href="kavus.salehzadeh@gmail.com">kavus.salehzadeh@gmail.com</email>
</p>
</sec>
</body>
<back>
<ack>
<title>ACKNOWLEDGMENT</title>
<p>Thanks to the PAARAND institute (
<ext-link ext-link-type="uri" xlink:href="www.paarand.org">www.paarand.org</ext-link>
) for providing the EEG system and helping with the data collection.</p>
</ack>
<fn-group>
<fn fn-type="supported-by">
<p>
<bold>Source of Support:</bold>
Nil</p>
</fn>
<fn fn-type="conflict">
<p>
<bold>Conflict of Interest:</bold>
None declared.</p>
</fn>
</fn-group>
<ref-list>
<title>REFERENCES</title>
<ref id="ref1">
<label>1</label>
<element-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Sanei</surname>
<given-names>S</given-names>
</name>
<name>
<surname>Chambers</surname>
<given-names>JA</given-names>
</name>
</person-group>
<article-title>EEG Signal Processing, Centre of Digital Signal Processing</article-title>
<source>Cardiff University</source>
<year>2007</year>
<publisher-loc>UK</publisher-loc>
<publisher-name>John Wiley &amp; Sons</publisher-name>
</element-citation>
</ref>
<ref id="ref2">
<label>2</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Binder</surname>
<given-names>JR</given-names>
</name>
<name>
<surname>Desai</surname>
<given-names>RH</given-names>
</name>
<name>
<surname>Graves</surname>
<given-names>WW</given-names>
</name>
<name>
<surname>Conant</surname>
<given-names>LL</given-names>
</name>
</person-group>
<article-title>Where is the semantic system? A critical review and meta-analysis of 120 functional neuroimaging studies</article-title>
<source>Cereb Cortex</source>
<year>2009</year>
<volume>19</volume>
<fpage>2767</fpage>
<lpage>96</lpage>
<pub-id pub-id-type="pmid">19329570</pub-id>
</element-citation>
</ref>
<ref id="ref3">
<label>3</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Walter</surname>
<given-names>WG</given-names>
</name>
<name>
<surname>Cooper</surname>
<given-names>R</given-names>
</name>
<name>
<surname>Aldridge</surname>
<given-names>VJ</given-names>
</name>
<name>
<surname>Mccallum</surname>
<given-names>WC</given-names>
</name>
<name>
<surname>Winter</surname>
<given-names>AL</given-names>
</name>
</person-group>
<article-title>Contingent negative variation: An electric sign of sensorimotor association and expectancy in the human brain</article-title>
<source>Nature</source>
<year>1964</year>
<volume>203</volume>
<fpage>380</fpage>
<lpage>4</lpage>
<pub-id pub-id-type="pmid">14197376</pub-id>
</element-citation>
</ref>
<ref id="ref4">
<label>4</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Sutton</surname>
<given-names>S</given-names>
</name>
<name>
<surname>Braren</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Zubin</surname>
<given-names>J</given-names>
</name>
<name>
<surname>John</surname>
<given-names>ER</given-names>
</name>
</person-group>
<article-title>Evoked-potential correlates of stimulus uncertainty</article-title>
<source>Science</source>
<year>1965</year>
<volume>150</volume>
<fpage>1187</fpage>
<lpage>8</lpage>
<pub-id pub-id-type="pmid">5852977</pub-id>
</element-citation>
</ref>
<ref id="ref5">
<label>5</label>
<element-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Johnson</surname>
<given-names>R</given-names>
<suffix>Jr</suffix>
</name>
</person-group>
<person-group person-group-type="editor">
<name>
<surname>Litvan</surname>
<given-names>I</given-names>
</name>
<name>
<surname>Agid</surname>
<given-names>Y</given-names>
</name>
</person-group>
<article-title>Event-related brain potentials</article-title>
<source>Progressive Supranuclear Palsy: Clinical and Research Approaches</source>
<year>1992</year>
<publisher-loc>New York</publisher-loc>
<publisher-name>Oxford University Press</publisher-name>
<fpage>122</fpage>
<lpage>54</lpage>
</element-citation>
</ref>
<ref id="ref6">
<label>6</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Coben</surname>
<given-names>LA</given-names>
</name>
<name>
<surname>Danziger</surname>
<given-names>WL</given-names>
</name>
<name>
<surname>Hughes</surname>
<given-names>CP</given-names>
</name>
</person-group>
<article-title>Visual evoked potentials in mild senile dementia of Alzheimer type</article-title>
<source>Electroencephalogr Clin Neurophysiol</source>
<year>1983</year>
<volume>55</volume>
<fpage>121</fpage>
<lpage>30</lpage>
<pub-id pub-id-type="pmid">6185308</pub-id>
</element-citation>
</ref>
<ref id="ref7">
<label>7</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Visser</surname>
<given-names>SL</given-names>
</name>
<name>
<surname>Van Tilburg</surname>
<given-names>W</given-names>
</name>
<name>
<surname>Hooijer</surname>
<given-names>C</given-names>
</name>
<name>
<surname>Jonker</surname>
<given-names>C</given-names>
</name>
<name>
<surname>De Rijke</surname>
<given-names>W</given-names>
</name>
</person-group>
<article-title>Visual evoked potentials (VEPs) in senile dementia (Alzheimer type) and in non-organic behavioural disorders in the elderly; comparison with EEG parameters</article-title>
<source>Electroencephalogr Clin Neurophysiol</source>
<year>1985</year>
<volume>60</volume>
<fpage>115</fpage>
<lpage>21</lpage>
<pub-id pub-id-type="pmid">2578362</pub-id>
</element-citation>
</ref>
<ref id="ref8">
<label>8</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Cosi</surname>
<given-names>V</given-names>
</name>
<name>
<surname>Vitelli</surname>
<given-names>E</given-names>
</name>
<name>
<surname>Gozzoli</surname>
<given-names>L</given-names>
</name>
<name>
<surname>Corona</surname>
<given-names>A</given-names>
</name>
<name>
<surname>Ceroni</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Callieco</surname>
<given-names>R</given-names>
</name>
</person-group>
<article-title>Visual evoked potentials in aging of the brain</article-title>
<source>Adv Neurol</source>
<year>1982</year>
<volume>32</volume>
<fpage>109</fpage>
<lpage>15</lpage>
<pub-id pub-id-type="pmid">7054929</pub-id>
</element-citation>
</ref>
<ref id="ref9">
<label>9</label>
<element-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Nunez</surname>
<given-names>PL</given-names>
</name>
</person-group>
<article-title>Neocortical Dynamics, EEG and Cognition</article-title>
<source>Electric Fields of the Brain</source>
<year>1981</year>
<publisher-loc>New York</publisher-loc>
<publisher-name>Oxford University Press</publisher-name>
<fpage>486</fpage>
</element-citation>
</ref>
<ref id="ref10">
<label>10</label>
<element-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Picton</surname>
<given-names>TW</given-names>
</name>
<name>
<surname>Lins</surname>
<given-names>DO</given-names>
</name>
<name>
<surname>Scherg</surname>
<given-names>M</given-names>
</name>
</person-group>
<article-title>The recording and analysis of event-related potentials</article-title>
<source>Hand Book of Neurophysiology</source>
<year>1995</year>
<publisher-loc>Amsterdam</publisher-loc>
<publisher-name>Elsevier</publisher-name>
<fpage>3</fpage>
<lpage>73</lpage>
</element-citation>
</ref>
<ref id="ref11">
<label>11</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Perrin</surname>
<given-names>F</given-names>
</name>
<name>
<surname>Pernier</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Bertrand</surname>
<given-names>O</given-names>
</name>
<name>
<surname>Echallier</surname>
<given-names>JF</given-names>
</name>
</person-group>
<article-title>Spherical splines for scalp potential and current density mapping</article-title>
<source>Electroencephalogr Clin Neurophysiol</source>
<year>1989</year>
<volume>72</volume>
<fpage>184</fpage>
<lpage>7</lpage>
<pub-id pub-id-type="pmid">2464490</pub-id>
</element-citation>
</ref>
<ref id="ref12">
<label>12</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Gerlach</surname>
<given-names>C</given-names>
</name>
</person-group>
<article-title>A review of functional imaging studies on category specificity</article-title>
<source>J Cogn Neurosci</source>
<year>2007</year>
<volume>19</volume>
<fpage>296</fpage>
<lpage>314</lpage>
<pub-id pub-id-type="pmid">17280518</pub-id>
</element-citation>
</ref>
<ref id="ref13">
<label>13</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Adorni</surname>
<given-names>R</given-names>
</name>
<name>
<surname>Proverbio</surname>
<given-names>AM</given-names>
</name>
</person-group>
<article-title>New insights into name category-related effects: Is the Age of Acquisition a possible factor?</article-title>
<source>Behav Brain Funct</source>
<year>2009</year>
<volume>5</volume>
<fpage>33</fpage>
<pub-id pub-id-type="pmid">19640289</pub-id>
</element-citation>
</ref>
<ref id="ref14">
<label>14</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Lloyd-Jones</surname>
<given-names>TJ</given-names>
</name>
<name>
<surname>Humphreys</surname>
<given-names>GW</given-names>
</name>
</person-group>
<article-title>Perceptual differentiation as a source of category effects in object processing: Evidence from naming and object decision</article-title>
<source>Mem Cognit</source>
<year>1997</year>
<volume>25</volume>
<fpage>18</fpage>
<lpage>35</lpage>
</element-citation>
</ref>
<ref id="ref15">
<label>15</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Simanova</surname>
<given-names>I</given-names>
</name>
<name>
<surname>van Gerven</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Oostenveld</surname>
<given-names>R</given-names>
</name>
<name>
<surname>Hagoort</surname>
<given-names>P</given-names>
</name>
</person-group>
<article-title>Identifying object categories from event-related EEG: Toward decoding of conceptual representations</article-title>
<source>PLoS One</source>
<year>2010</year>
<volume>5</volume>
<fpage>e14465</fpage>
<pub-id pub-id-type="pmid">21209937</pub-id>
</element-citation>
</ref>
<ref id="ref16">
<label>16</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Johnson</surname>
<given-names>JS</given-names>
</name>
<name>
<surname>Olshausen</surname>
<given-names>BA</given-names>
</name>
</person-group>
<article-title>Timecourse of neural signatures of object recognition</article-title>
<source>J Vis</source>
<year>2003</year>
<volume>3</volume>
<fpage>499</fpage>
<lpage>512</lpage>
<pub-id pub-id-type="pmid">14507255</pub-id>
</element-citation>
</ref>
<ref id="ref17">
<label>17</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Johnson</surname>
<given-names>JS</given-names>
</name>
<name>
<surname>Olshausen</surname>
<given-names>BA</given-names>
</name>
</person-group>
<article-title>The earliest EEG signatures of object recognition in a cued-target task are postsensory</article-title>
<source>J Vis</source>
<year>2005</year>
<volume>5</volume>
<fpage>299</fpage>
<lpage>312</lpage>
<pub-id pub-id-type="pmid">15929653</pub-id>
</element-citation>
</ref>
<ref id="ref18">
<label>18</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Fuggetta</surname>
<given-names>G</given-names>
</name>
<name>
<surname>Rizzo</surname>
<given-names>S</given-names>
</name>
<name>
<surname>Pobric</surname>
<given-names>G</given-names>
</name>
<name>
<surname>Lavidor</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Walsh</surname>
<given-names>V</given-names>
</name>
</person-group>
<article-title>Functional representation of living and nonliving domains across the cerebral hemispheres: A combined event-related potential/transcranial magnetic stimulation study</article-title>
<source>J Cogn Neurosci</source>
<year>2009</year>
<volume>21</volume>
<fpage>403</fpage>
<lpage>14</lpage>
<pub-id pub-id-type="pmid">18510439</pub-id>
</element-citation>
</ref>
<ref id="ref19">
<label>19</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Pinto</surname>
<given-names>N</given-names>
</name>
<name>
<surname>Cox</surname>
<given-names>DD</given-names>
</name>
<name>
<surname>DiCarlo</surname>
<given-names>JJ</given-names>
</name>
</person-group>
<article-title>Why is real-world visual object recognition hard?</article-title>
<source>PLoS Comput Biol</source>
<year>2008</year>
<volume>4</volume>
<fpage>e27</fpage>
<pub-id pub-id-type="pmid">18225950</pub-id>
</element-citation>
</ref>
<ref id="ref20">
<label>20</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Proverbio</surname>
<given-names>AM</given-names>
</name>
<name>
<surname>Del Zotto</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Zani</surname>
<given-names>A</given-names>
</name>
</person-group>
<article-title>The emergence of semantic categorization in early visual processing: ERP indices of animal vs. artifact recognition</article-title>
<source>BMC Neurosci</source>
<year>2007</year>
<volume>8</volume>
<fpage>24</fpage>
<pub-id pub-id-type="pmid">17411424</pub-id>
</element-citation>
</ref>
<ref id="ref21">
<label>21</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Müller</surname>
<given-names>KR</given-names>
</name>
<name>
<surname>Tangermann</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Dornhege</surname>
<given-names>G</given-names>
</name>
<name>
<surname>Krauledat</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Curio</surname>
<given-names>G</given-names>
</name>
<name>
<surname>Blankertz</surname>
<given-names>B</given-names>
</name>
</person-group>
<article-title>Machine learning for real-time single-trial EEG-analysis: From brain-computer interfacing to mental state monitoring</article-title>
<source>J Neurosci Methods</source>
<year>2008</year>
<volume>167</volume>
<fpage>82</fpage>
<lpage>90</lpage>
<pub-id pub-id-type="pmid">18031824</pub-id>
</element-citation>
</ref>
<ref id="ref22">
<label>22</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Martin</surname>
<given-names>A</given-names>
</name>
</person-group>
<article-title>The representation of object concepts in the brain</article-title>
<source>Annu Rev Psychol</source>
<year>2007</year>
<volume>58</volume>
<fpage>25</fpage>
<lpage>45</lpage>
<pub-id pub-id-type="pmid">16968210</pub-id>
</element-citation>
</ref>
<ref id="ref23">
<label>23</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Martinovic</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Mordal</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Wuerger</surname>
<given-names>SM</given-names>
</name>
</person-group>
<article-title>Event-related potentials reveal an early advantage for luminance contours in the processing of objects</article-title>
<source>J Vis</source>
<year>2011</year>
<volume>11</volume>
<fpage>10</fpage>
</element-citation>
</ref>
<ref id="ref24">
<label>24</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Rossion</surname>
<given-names>B</given-names>
</name>
<name>
<surname>Boremanse</surname>
<given-names>A</given-names>
</name>
</person-group>
<article-title>Robust sensitivity to facial identity in the right human occipitotemporal cortex as revealed by steady-state visual-evoked potentials</article-title>
<source>J Vis</source>
<year>2011</year>
<volume>11</volume>
<fpage>16</fpage>
<pub-id pub-id-type="pmid">21346000</pub-id>
</element-citation>
</ref>
<ref id="ref25">
<label>25</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Nazari</surname>
<given-names>MA</given-names>
</name>
<name>
<surname>Wallois</surname>
<given-names>F</given-names>
</name>
<name>
<surname>Aarabi</surname>
<given-names>A</given-names>
</name>
<name>
<surname>Nosratabadi</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Patrick</surname>
<given-names>B</given-names>
</name>
</person-group>
<article-title>P300 component modulation during a go/nogo task in healthy children</article-title>
<source>Basic Clin Neurosci</source>
<year>2010</year>
<volume>2</volume>
<fpage>31</fpage>
<lpage>36</lpage>
</element-citation>
</ref>
<ref id="ref26">
<label>26</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Dering</surname>
<given-names>B</given-names>
</name>
<name>
<surname>Martin</surname>
<given-names>CD</given-names>
</name>
<name>
<surname>Moro</surname>
<given-names>S</given-names>
</name>
<name>
<surname>Pegna</surname>
<given-names>AJ</given-names>
</name>
<name>
<surname>Thierry</surname>
<given-names>G</given-names>
</name>
</person-group>
<article-title>Face-sensitive processes one hundred milliseconds after picture onset</article-title>
<source>Front Hum Neurosci</source>
<year>2011</year>
<volume>5</volume>
<fpage>93</fpage>
<pub-id pub-id-type="pmid">21954382</pub-id>
</element-citation>
</ref>
<ref id="ref27">
<label>27</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Martinovic</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Gruber</surname>
<given-names>T</given-names>
</name>
<name>
<surname>Müller</surname>
<given-names>MM</given-names>
</name>
</person-group>
<article-title>Coding of visual object features and feature conjunctions in the human brain</article-title>
<source>PLoS One</source>
<year>2008</year>
<volume>3</volume>
<fpage>e3781</fpage>
<pub-id pub-id-type="pmid">19023428</pub-id>
</element-citation>
</ref>
<ref id="ref28">
<label>28</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>van Hoogmoed</surname>
<given-names>AH</given-names>
</name>
<name>
<surname>van den Brink</surname>
<given-names>D</given-names>
</name>
<name>
<surname>Janzen</surname>
<given-names>G</given-names>
</name>
</person-group>
<article-title>Electrophysiological correlates of object location and object identity processing in spatial scenes</article-title>
<source>PLoS One</source>
<year>2012</year>
<volume>7</volume>
<fpage>e41180</fpage>
<pub-id pub-id-type="pmid">22815960</pub-id>
</element-citation>
</ref>
</ref-list>
</back>
</pmc>
</record>

To manipulate this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/Pmc/Curation
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 002024 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/Pmc/Curation/biblio.hfd -nk 002024 | SxmlIndent | more

To add a link to this page in the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    Pmc
   |étape=   Curation
   |type=    RBID
   |clé=     PMC:3785069
   |texte=   EEG Signature of Object Categorization from Event-related Potentials
}}

To generate wiki pages

HfdIndexSelect -h $EXPLOR_AREA/Data/Pmc/Curation/RBID.i   -Sk "pubmed:24083136" \
       | HfdSelect -Kh $EXPLOR_AREA/Data/Pmc/Curation/biblio.hfd   \
       | NlmPubMed2Wicri -a HapticV1 

Wicri

This area was generated with Dilib version V0.6.23.
Data generation: Mon Jun 13 01:09:46 2016. Site generation: Wed Mar 6 09:54:07 2024