Pervasive computing enables computers to interact with the real world in a ubiquitous and natural manner. Quality of service (QoS), relating to transmission delay, bandwidth, or packet loss, has been studied in various building blocks of pervasive computing: different QoS mechanisms have been presented for wireless and wired networks, and the notion of computational QoS is used in parallel processing. Emerging pervasive computing, however, is application-driven and mission-critical, and the existing QoS notions do not really match its needs. Quality of Information (QoI), or Information Quality (IQ), of sensor-originated information relates to the fitness of that information for a sensor-enabled application. Harnessing and optimizing the QoI of information derived from sensor networks will be key to bringing together information acquisition and processing systems that support the on-demand information needs of a broad spectrum of smart, sensor-enabled applications, such as remote real-time habitat monitoring, utility grid monitoring, environmental control, supply-chain management, health care, machinery control, intelligent highways, military intelligence, surveillance, and reconnaissance (ISR), border control, and hazardous material monitoring, to mention just a few.
QoI touches every part of the end-to-end flow of sensor-derived information, from the sensors themselves and the observation data they produce, through the various fusion layers that process these data, and eventually to the applications (and their users) that consume them. For example, sensor-generated information is used as the basis for determining context at varying levels of accuracy and fidelity, in a hierarchical fashion, with lower-layer context effectively serving as a virtual sensor stream for higher-layer context determination. The effectiveness of actions taken by the applications using this information serves as the ultimate assessment of the quality and value added by the entire sensor-enabled application: an action may be highly effective, achieving all its anticipated goals, partially effective, or entirely ineffective. Complementing 'traditional' QoS provisioning with QoI in pervasive computing is challenging due to the resource-constrained, dynamic, and distributed nature of such systems, their vulnerability to security attacks, and the lack of a design approach that takes into account the different types of resources and their inter-dependencies. Novel mechanisms are required in pervasive computing that integrate QoI, network QoS, computational QoS, security, and the user's Quality of Experience (QoE), which is in turn influenced by the application goals and the pervasive environment in which the application is used. Such advances require inter-disciplinary activities at the intersection of pervasive computing, human-computer interfaces, intent modeling, sensor fusion, machine learning, and information theory.
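The hierarchical context determination described above, with lower-layer context serving as a virtual sensor stream for higher-layer context, can be sketched minimally as follows. All function names and the threshold are illustrative assumptions, not part of any specific system:

```python
# Hypothetical sketch of hierarchical context derivation: each layer consumes
# the output of the layer below as if it were a sensor stream.

def accel_magnitude(samples):
    """Raw sensor layer: absolute accelerometer magnitudes."""
    return [abs(x) for x in samples]

def activity(magnitudes, threshold=1.5):
    """Lower-layer context: classify each sample as 'moving' or 'still'."""
    return ["moving" if m > threshold else "still" for m in magnitudes]

def room_occupancy(activities):
    """Higher-layer context, fed by the lower layer's virtual stream."""
    return "occupied" if "moving" in activities else "vacant"

stream = [0.1, 2.3, -0.2, 1.9]
print(room_occupancy(activity(accel_magnitude(stream))))  # occupied
```

Each stage here degrades gracefully in accuracy and fidelity; the QoI of the final occupancy decision depends on the quality of every layer beneath it.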
| Paper submission | December 6, 2016 (extended from November 11, 2016) |
| Acceptance notification | December 23, 2016 |
| Camera-ready due | January 13, 2017 |
| Author registration due | January 13, 2017 |
| Workshop date | March 17, 2017 |
Rich high-frequency multi-modal sensor data streams, continually captured by mobile, embedded and human sensors and processed by machine learning algorithms, are revolutionizing a range of scientific, engineering, and humanities disciplines. Innovative applications in domains such as precision medicine, energy and water management, and smart cities seek to provide new insights and trigger just-in-time interventions. Software tools and cloud services tailored to collection, transport, storage, processing, and visualization of sensory data are available.
Clearly, there has been considerable progress towards the vision of a multi-tenant pervasive substrate providing sensing as a service for applications that need awareness of the state of the natural, engineered, and social world around us. Yet a significant challenge remains: the trust that consumers and producers of sensory data can place in this emerging pervasive sensing substrate. With diverse sensors deployed out in the wild, and sensory information traversing multiple entities along the data-to-decision pathway, decision makers who use sensory data face the problem of uncertain and variable data quality, and a lack of visibility into the contextual information that would help explain the data and its quality. Likewise, potential data contributors with privacy concerns face uncertainty about how their sensor data is managed downstream.
A key to addressing both of these problems is to have metadata accompany the sensor measurement values, so that on the one hand downstream users gain visibility into the quality and provenance of the data, and on the other hand upstream users can exercise control over how the data is handled downstream. The cyberinfrastructure underlying the pervasive sensing substrate must therefore provide run-time support for efficiently capturing, representing, propagating, querying, and reasoning about metadata relating to quality, provenance, and usage constraints associated with the sensor measurements. Furthermore, the sensor processing software must be designed so that it also derives the metadata associated with the output values it produces, taking into account not only the input values but also the input metadata.
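The metadata-propagation requirement above can be sketched as follows. This is an illustrative sketch only, not the mProv or MetroInsight API; the `Reading` record, the `smooth` operator, and the min-quality propagation rule are all assumptions chosen for the example:

```python
# Minimal sketch of a processing stage that derives output metadata
# (quality, provenance) from both input values and input metadata.
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class Reading:
    value: float
    quality: float   # confidence in [0, 1]
    provenance: list = field(default_factory=list)  # chain of processing steps

def smooth(window):
    """Average a window of readings, propagating metadata alongside values."""
    out_value = mean(r.value for r in window)
    # One conservative propagation rule: the output is no more trustworthy
    # than its weakest input (other rules are possible).
    out_quality = min(r.quality for r in window)
    # Provenance records the operation applied plus the inputs' histories.
    out_prov = [("smooth", [r.provenance for r in window])]
    return Reading(out_value, out_quality, out_prov)

raw = [Reading(21.0, 0.9, [("sensor", "s1")]),
       Reading(23.0, 0.6, [("sensor", "s2")]),
       Reading(22.0, 0.8, [("sensor", "s3")])]
avg = smooth(raw)
print(avg.value, avg.quality)  # 22.0 0.6
```

The same pattern generalizes: usage constraints could be propagated the same way (e.g., the output inherits the most restrictive policy among its inputs), giving upstream contributors downstream control.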
The talk will present ideas towards architecting a sensor cyberinfrastructure that incorporates a metadata framework with the aforementioned characteristics. Across two recently funded NSF projects, mProv (http://mprov.md2k.org) and MetroInsight (http://metroinsight.io), we are working towards developing such sensor cyberinfrastructures targeting the mHealth and urban-area sensing application domains, respectively. In these systems, the multimodal high-frequency real-time sensor data streams carry not only sensor measurement values but also metadata relating to quality, provenance, and usage policy, so that knowledge discovery and decision making can be done robustly and responsibly.
Mani Srivastava is on the faculty at UCLA, where he is associated with the EE Department with a joint appointment in the CS Department. His research is broadly in the area of networked human-cyber-physical systems, and spans problems across the entire spectrum of applications, architectures, algorithms, and technologies. His current interests include issues of energy efficiency, privacy and security, data quality, and variability in the context of systems and applications for mHealth and sustainable buildings. He is a Fellow of the IEEE.