TY - GEN
T1 - Human-Computer Interaction for Space Situational Awareness (SSA)
T2 - 11th International Conference on Virtual, Augmented and Mixed Reality, VAMR 2019, held as part of the 21st International Conference on Human-Computer Interaction, HCI International 2019
AU - Kirshner, Mitchell
AU - Gross, David C.
N1 - Publisher Copyright:
© 2019, Springer Nature Switzerland AG.
PY - 2019
Y1 - 2019
N2 - Systems intending to achieve any level of space situational awareness inevitably require operator interfaces that enable the operation of sensors and the utilization of sensor-provided data. As sensors are frequently owned and operated by other agencies, significant modification of individual sensor capabilities may be beyond the reach of the space situational awareness system designer. The utilization of the sensor data, however, is limited only by the ability of the space situational awareness system designer to innovate. The mission of the Architecture Driven Systems Laboratory (ADSL) at the University of Arizona’s Systems and Industrial Engineering department is to explore such innovations in all aspects of system architecture, especially the operator interfaces. The ADSL has developed an initial operating capability encapsulated in the System Architecture Synthesis and Analysis Framework (SASAF). The SASAF’s Operations Phase capabilities include separate instantiable tools modeling the operation of sensors (i.e., the Sensor Tasking Tool) and the utilization of sensor data (i.e., the Integrated Sensor Viewer). The Sensor Tasking Tool provides little opportunity for innovation because it is limited by the available sensors. However, the Integrated Sensor Viewer (ISV) provides substantial opportunities for innovation to help operators visualize and understand Resident Space Object (RSO) ephemeris data. The ISV implements a dynamic ontology, developed within the Unity software, that correlates RSO data with the data retrieval API of the commercial SSA company LeoLabs, allowing users to access additional information in situ without disrupting their sensory immersion and situational awareness. Product documentation generated in the ADSL allows stakeholders to better understand the final product.
AB - Systems intending to achieve any level of space situational awareness inevitably require operator interfaces that enable the operation of sensors and the utilization of sensor-provided data. As sensors are frequently owned and operated by other agencies, significant modification of individual sensor capabilities may be beyond the reach of the space situational awareness system designer. The utilization of the sensor data, however, is limited only by the ability of the space situational awareness system designer to innovate. The mission of the Architecture Driven Systems Laboratory (ADSL) at the University of Arizona’s Systems and Industrial Engineering department is to explore such innovations in all aspects of system architecture, especially the operator interfaces. The ADSL has developed an initial operating capability encapsulated in the System Architecture Synthesis and Analysis Framework (SASAF). The SASAF’s Operations Phase capabilities include separate instantiable tools modeling the operation of sensors (i.e., the Sensor Tasking Tool) and the utilization of sensor data (i.e., the Integrated Sensor Viewer). The Sensor Tasking Tool provides little opportunity for innovation because it is limited by the available sensors. However, the Integrated Sensor Viewer (ISV) provides substantial opportunities for innovation to help operators visualize and understand Resident Space Object (RSO) ephemeris data. The ISV implements a dynamic ontology, developed within the Unity software, that correlates RSO data with the data retrieval API of the commercial SSA company LeoLabs, allowing users to access additional information in situ without disrupting their sensory immersion and situational awareness. Product documentation generated in the ADSL allows stakeholders to better understand the final product.
KW - Applications: virtual worlds and social computing
KW - Interaction and navigation in VR and MR: immersion
KW - Issues in development and use of VR and MR: situational awareness
UR - http://www.scopus.com/inward/record.url?scp=85069637697&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85069637697&partnerID=8YFLogxK
U2 - 10.1007/978-3-030-21565-1_34
DO - 10.1007/978-3-030-21565-1_34
M3 - Conference contribution
AN - SCOPUS:85069637697
SN - 9783030215644
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 504
EP - 515
BT - Virtual, Augmented and Mixed Reality. Applications and Case Studies - 11th International Conference, VAMR 2019, Held as Part of the 21st HCI International Conference, HCII 2019, Proceedings
A2 - Chen, Jessie Y.C.
A2 - Fragomeni, Gino
PB - Springer-Verlag
Y2 - 26 July 2019 through 31 July 2019
ER -