Exploring the computational cost of machine learning at the edge for human-centric Internet of Things

dc.contributor.author: Gómez Carmona, Oihane
dc.contributor.author: Casado Mansilla, Diego
dc.contributor.author: Kraemer, Frank Alexander
dc.contributor.author: López de Ipiña González de Artaza, Diego
dc.contributor.author: García-Zubía, Javier
dc.date.accessioned: 2024-11-15T08:24:50Z
dc.date.available: 2024-11-15T08:24:50Z
dc.date.issued: 2020-11
dc.date.updated: 2024-11-15T08:24:50Z
dc.description.abstract[en]: In response to users’ demand for privacy, trust and control over their data, executing machine learning tasks at the edge of the system has the potential to make Internet of Things (IoT) applications and services more human-centric. This implies moving complex computation to a local stage, where edge devices must balance the computational cost of machine learning techniques against the available resources. In this paper, we therefore analyze the factors affecting the classification process and empirically evaluate their impact in terms of performance and cost. We focus on Human Activity Recognition (HAR) systems, which represent a standard type of classification problem in human-centered IoT applications. We present a holistic optimization approach, based on input data reduction and feature engineering, that aims to enhance all the stages of the classification pipeline and to integrate both inference and training at the edge. The results of the evaluation show that there is a highly non-linear trade-off between the computational cost, in terms of processing time, and the achieved classification accuracy. In the presented case study, the computational effort can be reduced by 80% while accepting a decline in classification accuracy of only 3%. The potential impact of the optimization strategy highlights the importance of understanding the initial data and studying the most relevant characteristics of the signal to meet the cost–accuracy requirements. This would contribute to bringing embedded machine learning to the edge and, hence, to creating spaces where human and machine intelligence can collaborate.
dc.description.sponsorship[en]: We gratefully acknowledge the support of the Basque Government's Department of Education, Spain, for the predoctoral funding of one of the authors, and the Deustek Research Group. We also acknowledge the support of the Ministry of Economy, Industry and Competitiveness of Spain for SentientThings under Grant No. TIN2017-90042-R.
dc.identifier.citation: Gómez-Carmona, O., Casado-Mansilla, D., Kraemer, F. A., López-de-Ipiña, D., & García-Zubia, J. (2020). Exploring the computational cost of machine learning at the edge for human-centric Internet of Things. Future Generation Computer Systems, 112, 670-683. https://doi.org/10.1016/J.FUTURE.2020.06.013
dc.identifier.doi: 10.1016/J.FUTURE.2020.06.013
dc.identifier.issn: 0167-739X
dc.identifier.uri: http://hdl.handle.net/20.500.14454/1890
dc.language.iso: eng
dc.publisher: Elsevier B.V.
dc.rights: © 2020 Elsevier B.V.
dc.subject.other: Cost-accuracy
dc.subject.other: Edge computing
dc.subject.other: Edge intelligence
dc.subject.other: Embedded systems
dc.subject.other: Internet of Things
dc.subject.other: Machine learning
dc.title[en]: Exploring the computational cost of machine learning at the edge for human-centric Internet of Things
dc.type: journal article
dcterms.accessRights: metadata only access
oaire.citation.endPage: 683
oaire.citation.startPage: 670
oaire.citation.title: Future Generation Computer Systems
oaire.citation.volume: 112
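
The abstract above describes an optimization approach built on input data reduction and feature engineering, with a non-linear trade-off between processing time and classification accuracy. Since the record is metadata-only, the following is a minimal, illustrative Python sketch of that general idea, not code from the paper: it downsamples synthetic signal windows by increasing factors (input data reduction), extracts a small set of cheap statistical features (feature engineering), and reports feature-extraction time against classification accuracy. All data, feature choices and model parameters here are assumptions made for illustration.

import time
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def make_windows(n=400, length=128):
    # Synthetic two-class "sensor" windows; classes differ by dominant frequency.
    t = np.linspace(0.0, 1.0, length)
    X = np.empty((n, length))
    y = np.arange(n) % 2
    for i in range(n):
        freq = 2.0 if y[i] == 0 else 6.0
        X[i] = np.sin(2.0 * np.pi * freq * t) + 0.3 * rng.standard_normal(length)
    return X, y

def extract_features(windows):
    # Cheap statistical features per window: mean, std, min, max.
    return np.column_stack([windows.mean(axis=1), windows.std(axis=1),
                            windows.min(axis=1), windows.max(axis=1)])

X, y = make_windows()
for factor in (1, 2, 4, 8):  # progressively stronger input data reduction
    Xr = X[:, ::factor]      # keep every `factor`-th sample of each window
    t0 = time.perf_counter()
    F = extract_features(Xr)
    feat_ms = (time.perf_counter() - t0) * 1e3
    Xtr, Xte, ytr, yte = train_test_split(F, y, random_state=0)
    clf = RandomForestClassifier(n_estimators=20, random_state=0).fit(Xtr, ytr)
    acc = accuracy_score(yte, clf.predict(Xte))
    print(f"downsample x{factor}: features {feat_ms:.2f} ms, accuracy {acc:.2f}")

Sweeping the downsampling factor in this fashion exposes the kind of cost-accuracy curve the abstract refers to: processing time drops roughly in proportion to the reduction factor, while accuracy typically degrades much more slowly until the signal's relevant characteristics are lost.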