DOI: 10.1126/science.abi7262
Putting touch into action
A. Aldo Faisal
2021-05-21
Journal: Science
Publication year: 2021
Abstract: The human tactile senses maintain contact with the environment and are essential for the ability to manipulate objects dexterously (1). People suffering from upper spinal cord injuries, nerve injuries, and other forms of arm paralysis lose not only the ability to move their hand but also the ability to feel with their fingers. Over the past decades, brain-machine interfacing has achieved the ability to read out action intentions from the human brain and restore movement, enabling paralyzed patients to control robotic arms (2) or to electrically stimulate muscles so that paralyzed arms can move again (3). It is commonly assumed that this needs to be complemented with restored sensory feedback. On page 831 of this issue, Flesher et al. (4) demonstrate a substantial practical benefit for a paralyzed patient whose ability to manipulate objects through a brain-controlled robotic arm was markedly increased through artificial tactile feedback.

There are two main ways by which lost sensory feedback to the brain could be restored. The first is sensory substitution, where unaffected regions of the body or other senses are hijacked to convey tactile information, for example by artificially stimulating a patch of skin where natural sensory feedback is still intact, or by using audio or visual cues as a proxy (5) to signal where and how a robotic hand touched an object. Sensory substitution techniques are low-cost, noninvasive, and easy to set up on a day-to-day basis. However, they require some learning, the feedback they give does not feel natural, and they repurpose sensory capacity, which crowds the patient's intact senses and may reduce their ability to perceive or communicate with the environment. The second option is to restore sensory feedback by directly interfacing with the nervous system. This can be done at the peripheral level, for example on residual nerves in the arm stumps of hand amputees (6), or in the central nervous system, as is necessary in patients with high-level spinal cord injuries.

"Writing" sensory information into the brain is not trivial: It usually requires neurosurgery, implantation of neural interfaces, and then finding a way to communicate with the brain so that the information is perceived meaningfully. Although it has been possible for more than 80 years to evoke tactile sensations by stimulating hundreds of neurons with a small electrical current (microstimulation), generating appropriate and controlled sensory feedback is more challenging. The microstimulation approach first required a better understanding of how tactile information is processed and represented in the nervous system, as well as better neural interfaces. Decades of research in human volunteers and electrophysiological studies in nonhuman primates unlocked the foundations of tactile processing in the nervous system and how to stimulate the somatosensory cortex (7, 8). Critically, appropriate intracortical microstimulation (ICMS) with electrodes implanted in human somatosensory cortex was shown to create tactile sensations that appeared to originate from specific locations on the hand. Moreover, these ICMS experiments could induce tactile sensations that felt like natural sensations of pressure, the sensations could be graded in strength, and the mapping between ICMS and reported sensation could remain stable for months (9).
Flesher et al. now demonstrate the practical implementation of a bidirectional brain-machine interface, one that both "reads" information from and "writes" information to the brain. Their system decoded the brain activity of a tetraplegic patient to translate their intentions into control commands for a robotic arm and hand. At the same time, sensors on the robotic hand recorded the mechanical forces it experienced and transmitted these to the somatosensory cortex through ICMS. Specifically, the robotic arm and hand used torque sensing at the metacarpophalangeal joint at the base of each of the four robotic fingers to measure the forces they experienced when manipulating objects. The system then translated these forces into electrical stimuli that the authors had previously established would trigger tactile sensations linked to each of the four fingers. The sensations were localized to the corresponding finger, and their strength varied with the strength of the forces that the fingers experienced.
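The force-to-stimulation mapping described above can be pictured with a minimal sketch. The Python snippet below is a hypothetical illustration, not the authors' implementation: it assumes a simple linear scaling of each finger's measured force onto a per-channel stimulation-amplitude range, and every name and numerical value (StimChannel, force_to_amplitude, thresholds, force limits) is invented for the example.

```python
# Illustrative sketch (not the authors' implementation): mapping per-finger
# force readings onto ICMS amplitudes. All numbers and names are hypothetical.

from dataclasses import dataclass

@dataclass
class StimChannel:
    """One somatosensory-cortex electrode group linked to a finger."""
    finger: str
    threshold_uA: float   # smallest amplitude assumed to evoke a sensation
    max_uA: float         # assumed upper amplitude limit for safety/comfort

def force_to_amplitude(force_N: float, ch: StimChannel,
                       min_force: float = 0.1, max_force: float = 5.0) -> float:
    """Scale a measured finger force linearly onto the channel's amplitude range.

    Forces below `min_force` produce no stimulation; forces above `max_force`
    saturate at the channel's maximum amplitude.
    """
    if force_N < min_force:
        return 0.0
    frac = min((force_N - min_force) / (max_force - min_force), 1.0)
    return ch.threshold_uA + frac * (ch.max_uA - ch.threshold_uA)

# Example: four channels, one per robotic finger (placeholder values).
channels = [StimChannel(f, 20.0, 80.0) for f in ("index", "middle", "ring", "little")]

# One control cycle: read torque-derived forces, command one amplitude per finger.
forces = {"index": 1.8, "middle": 0.05, "ring": 3.2, "little": 0.6}
commands = {ch.finger: force_to_amplitude(forces[ch.finger], ch) for ch in channels}
print(commands)  # index and ring receive strong stimulation; middle receives none
```

In this sketch, grading the amplitude with force mirrors the reported behavior that the evoked sensation grew stronger as the robotic finger pressed harder, while the per-finger channel assignment mirrors the localization of sensations to individual fingers.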
Although the tactile feedback was limited in how natural it may have felt, it had a strong impact on performance, enabling the patient to perform the task much faster. The time taken to successfully grasp an object was halved, and grasping was less prone to error, suggesting that the feedback gave the individual confidence about when the object was reliably in their hand. The movement time after picking up the object and transferring it to its target also improved, suggesting that the stimulation may have wider effects, potentially including the stimulation of motor circuits. The patient's performance improved rapidly and did not require training, suggesting that ICMS acted differently from sensory substitution in that little learning was involved. A direct comparison of task performance using ICMS-based sensory feedback versus sensory substitution on the same task would answer the important question of whether any modality of sensory feedback would be equally effective.

The results of Flesher et al. open up many avenues of inquiry, including the possibility of advancing robotics and the development of tactile artificial skins (10) into clinical use, as well as transhumanist questions about augmenting human capabilities with nonbiological sensors. This proof-of-principle study raises an important challenge: Although a rich body of work in neuroscience has developed high-level computational theories of how sensory feedback should control movement (11), much of the circuit-level understanding is limited and focused on decoding of the action. Yet brains operate best when the loop between perception and action is closed, as demonstrated so effectively in this study. Moreover, sensors that inform how a robotic hand is configured (hand pose) are at least as important as tactile sensors, yet in neuroscience much of the research has focused on touch rather than the sense of bodily pose (proprioception). Integrating both senses, touch and proprioception, is going to be important for future bidirectional brain-machine interfaces and may help future patients experience full restoration of their motor capabilities.

Although neurotechnology has progressed in great leaps over the past decade, the uptake and long-term use of these devices have not been commensurate. For example, many bionic hand prostheses are abandoned within months by their users, which has been attributed to users perceiving them as inconvenient foreign objects that are not a part of their body (12). Restoration of sensory feedback through nervous system interfacing, as demonstrated by Flesher et al., will hopefully resolve these issues, not only because such technology substantially improves a patient's motor capabilities but also because it may restore sensory ownership over their technologically extended body (13).

References:
1. R. L. Sainburg et al., J. Neurophysiol. 73, 820 (1995).
2. L. R. Hochberg et al., Nature 485, 372 (2012).
3. C. Ethier et al., Nature 485, 368 (2012).
4. S. N. Flesher et al., Science 372, 831 (2021).
5. P. Bach-y-Rita, S. W. Kercel, Trends Cogn. Sci. 7, 541 (2003).
6. S. Raspopovic et al., Sci. Transl. Med. 6, 222ra19 (2014).
7. S. J. Bensmaia, L. E. Miller, Nat. Rev. Neurosci. 15, 313 (2014).
8. T. Callier et al., J. Neural Eng. 12, 056010 (2015).
9. S. N. Flesher et al., Sci. Transl. Med. 8, 361ra141 (2016).
10. R. Dahiya, Proc. IEEE 107, 247 (2019).
11. J. Diedrichsen, R. Shadmehr, R. B. Ivry, Trends Cogn. Sci. 14, 31 (2010).
12. E. A. Biddiss, T. T. Chau, Prosthet. Orthot. Int. 31, 236 (2007).
13. T. Makin, F. de Vignemont, A. A. Faisal, Nat. Biomed. Eng. 1, 0014 (2017).

Acknowledgments: A.A.F. acknowledges his UKRI Turing AI Fellowship (EP/V025449/1).
Field: Climate Change; Resources & Environment
Document type: Journal article
Identifier: http://119.78.100.173/C666/handle/2XK7JSWQ/328817
Collections: Climate Change; Resources and Environment Science
Recommended citation:
GB/T 7714: A. Aldo Faisal. Putting touch into action[J]. Science, 2021.
APA: A. Aldo Faisal. (2021). Putting touch into action. Science.
MLA: A. Aldo Faisal. "Putting touch into action". Science (2021).
Files in this item:
No files associated with this item.