Browsing by Author "Azkune, Gorka"
Showing 1 - 2 of 2
Item: Cross-environment activity recognition using word embeddings for sensor and activity representation (Elsevier B.V., 2020-12-22)
Azkune, Gorka; Almeida, Aitor; Agirre, Eneko

Cross-environment activity recognition in smart homes is a very challenging problem, especially for data-driven approaches. Currently, systems developed for one environment degrade substantially when applied to a new environment, where not only the sensors but also the monitored activities may differ. Some systems require manually labeling and mapping the new sensor names and activities using an ontology. Ideally, given a new smart home, we would like to be able to deploy a system trained on other sources with minimal manual effort and acceptable performance. In this paper, we propose the use of neural word embeddings to represent sensor activations and activities, which offers several advantages: (i) it captures the semantic information of sensor and activity names, and (ii) it automatically maps the sensors and activities of different environments into the same semantic space. Based on this novel representation, we propose two data-driven activity recognition systems: the first is a completely unsupervised system based on embedding similarities, while the second adds a supervised learning regressor on top of them. We compare our approaches with several baselines on four public datasets, showing that data-driven cross-environment activity recognition achieves good results even when sensors and activity labels differ significantly.
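The unsupervised variant described in the abstract can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the toy word vectors, sensor names, and activity labels below are invented assumptions standing in for the pretrained neural word embeddings the paper uses.

```python
from math import sqrt

# Toy word vectors (illustrative assumption; the paper uses pretrained
# neural word embeddings over a much larger vocabulary).
VEC = {
    "kitchen": [0.9, 0.1, 0.0],
    "stove":   [0.8, 0.2, 0.1],
    "bed":     [0.0, 0.9, 0.1],
    "cook":    [0.9, 0.2, 0.0],
    "sleep":   [0.1, 0.9, 0.0],
    "bathe":   [0.0, 0.2, 0.9],
}

def embed(name):
    """Average the word vectors of a (possibly multi-word) name."""
    vecs = [VEC[w] for w in name.split()]
    return [sum(c) / len(vecs) for c in zip(*vecs)]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(x * x for x in b)))

def recognize(sensor_activations, activity_names):
    """Unsupervised recognition: pick the activity whose name embedding is
    most similar to the averaged embedding of the observed sensor names.
    Because both live in the same semantic space, no manual sensor-to-
    activity mapping is needed."""
    window = embed(" ".join(sensor_activations))
    return max(activity_names, key=lambda a: cosine(window, embed(a)))

print(recognize(["kitchen", "stove"], ["cook", "sleep", "bathe"]))  # -> cook
```

The supervised variant in the paper adds a learned regressor on top of these similarities; the similarity lookup above is only the unsupervised baseline idea.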
Our results show promise for reducing manual effort and are complementary to other efforts based on ontologies.

Item: Embedding-based real-time change point detection with application to activity segmentation in smart home time series data (Elsevier Ltd, 2021-12-15)
Bermejo Fernández, Unai; Almeida, Aitor; Bilbao Jayo, Aritz; Azkune, Gorka

Human activity recognition systems are essential for many assistive applications. Such systems can be sensor-based or vision-based. When sensor-based systems are deployed in real environments, they must segment sensor data streams on the fly in order to extract features and recognize the ongoing activities. This segmentation can be done with different approaches; one effective approach is to employ change point detection (CPD) algorithms to detect activity transitions (i.e., to determine when activities start and end). In this paper, we present a novel real-time CPD method for activity segmentation in which sensor events are represented by neural embeddings (vectors of continuous numbers). Through empirical evaluation on three publicly available benchmark datasets, we conclude that our method is useful for segmenting sensor data, offering significantly better performance than state-of-the-art algorithms on two of them. In addition, we propose the use of retrofitting, a graph-based technique, to adjust the embeddings and introduce expert knowledge into the activity segmentation task, showing empirically that it can improve the performance of our method using three graphs generated from two sources of information. Finally, we discuss the advantages of our approach over others in terms of computational cost, reduced manual effort (no hand-crafted features are needed), and cross-environment possibilities (transfer learning).
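The core idea of embedding-based activity segmentation, detecting a change point when the sensor-event embeddings before an index diverge from those after it, can be sketched as below. This is a toy illustration under invented assumptions (two-dimensional vectors, a fixed window, a cosine-distance threshold); the paper's actual real-time method and its retrofitting step are not reproduced here.

```python
from math import sqrt

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(x * x for x in b)))

def mean_vec(vectors):
    """Element-wise mean of a list of equal-length vectors."""
    return [sum(c) / len(vectors) for c in zip(*vectors)]

def change_points(embeddings, window=2, threshold=0.5):
    """Flag index i as a change point (activity transition) when the mean
    embedding of the `window` events before i diverges from the mean of
    the `window` events after i, i.e. cosine distance exceeds `threshold`.
    Both `window` and `threshold` are illustrative choices."""
    points = []
    for i in range(window, len(embeddings) - window + 1):
        before = mean_vec(embeddings[i - window:i])
        after = mean_vec(embeddings[i:i + window])
        if 1.0 - cosine(before, after) > threshold:
            points.append(i)
    return points

# Toy sensor-event embeddings: two 'kitchen-like' events, then two
# 'bedroom-like' ones, so the activity transition falls at index 2.
events = [[0.9, 0.1], [0.8, 0.2], [0.1, 0.9], [0.0, 1.0]]
print(change_points(events))  # -> [2]
```

In a streaming deployment, the same comparison would be applied incrementally as each new sensor event arrives, rather than over a complete list as in this sketch.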