Human-Computer Interaction for Space Situational Awareness (SSA): Towards the SSA Integrated Sensor Viewer (ISV)

Mitchell Kirshner, David C. Gross

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

Systems intending to achieve any level of space situational awareness inevitably require operator interfaces that enable the operation of sensors and the utilization of sensor-provided data. Because sensors are frequently owned and operated by other agencies, significant modification of individual sensor capabilities may be beyond the reach of the space situational awareness system designer. The utilization of the sensor data, however, is limited only by the system designer's ability to innovate. The mission of the Architecture Driven Systems Laboratory (ADSL) in the University of Arizona's Systems and Industrial Engineering department is to explore such innovations in all aspects of system architecture, especially the operator interfaces. The ADSL has developed an initial operating capability encapsulated in the System Architecture Synthesis and Analysis Framework (SASAF). The SASAF's Operations Phase capabilities include separate instantiable tools for modeling the operation of sensors (i.e., the Sensor Tasking Tool) and the utilization of sensor data (i.e., the Integrated Sensor Viewer). The Sensor Tasking Tool provides little opportunity for innovation because it is constrained by the available sensors. The Integrated Sensor Viewer (ISV), however, provides substantial opportunities for innovation to help operators visualize and understand Resident Space Object (RSO) ephemeris data. The ISV implements a dynamic ontology, developed within the Unity software, that correlates RSO data with the data retrieval API of the commercial SSA company LeoLabs, allowing users to access additional information in situ without disrupting their sensory immersion and situational awareness. Product documentation generated in the ADSL allows stakeholders to better understand the final product.
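The abstract's description of correlating RSO data with LeoLabs' data retrieval API suggests a simple fetch-and-index pattern that a visualization layer (such as the Unity-based ISV) could query in situ. The sketch below is illustrative only; the base URL, endpoint path, authentication header, and response field names are assumptions for the sake of the example, not details taken from the paper.

```python
# Hypothetical sketch: retrieve RSO catalog entries from a LeoLabs-style REST
# API and index them by catalog number so a viewer can look up details in situ.
# Endpoint path, auth scheme, and field names are assumptions, not confirmed
# details of the paper's implementation.
import requests

API_BASE = "https://api.leolabs.space/v1"   # assumed base URL
API_TOKEN = "YOUR-LEOLABS-TOKEN"            # placeholder credential

def fetch_rso_catalog():
    """Return catalog objects keyed by catalog number (assumed schema)."""
    resp = requests.get(
        f"{API_BASE}/catalog/objects",                # assumed endpoint
        headers={"Authorization": f"basic {API_TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    objects = resp.json().get("objects", [])          # assumed response field
    return {obj.get("noradCatalogNumber"): obj for obj in objects}

if __name__ == "__main__":
    catalog = fetch_rso_catalog()
    print(f"Retrieved {len(catalog)} RSO catalog entries")
```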

Original language: English (US)
Title of host publication: Virtual, Augmented and Mixed Reality. Applications and Case Studies - 11th International Conference, VAMR 2019, Held as Part of the 21st HCI International Conference, HCII 2019, Proceedings
Editors: Jessie Y.C. Chen, Gino Fragomeni
Publisher: Springer-Verlag
Pages: 504-515
Number of pages: 12
ISBN (Print): 9783030215644
DOIs
State: Published - 2019
Event: 11th International Conference on Virtual, Augmented and Mixed Reality, VAMR 2019, held as part of the 21st International Conference on Human-Computer Interaction, HCI International 2019 - Orlando, United States
Duration: Jul 26, 2019 - Jul 31, 2019

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 11575 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 11th International Conference on Virtual, Augmented and Mixed Reality, VAMR 2019, held as part of the 21st International Conference on Human-Computer Interaction, HCI International 2019
Country/Territory: United States
City: Orlando
Period: 7/26/19 - 7/31/19

Keywords

  • Applications: virtual worlds and social computing
  • Interaction and navigation in VR and MR: immersion
  • Issues in development and use of VR and MR: situational awareness

ASJC Scopus subject areas

  • Theoretical Computer Science
  • General Computer Science
