Exploring the computational cost of machine learning at the edge for human-centric Internet of Things
dc.contributor.author | Gómez Carmona, Oihane | |
dc.contributor.author | Casado Mansilla, Diego | |
dc.contributor.author | Kraemer, Frank Alexander | |
dc.contributor.author | López de Ipiña González de Artaza, Diego | |
dc.contributor.author | García-Zubía, Javier | |
dc.date.accessioned | 2024-11-15T08:24:50Z | |
dc.date.available | 2024-11-15T08:24:50Z | |
dc.date.issued | 2020-11 | |
dc.date.updated | 2024-11-15T08:24:50Z | |
dc.description.abstract | In response to users’ demand for privacy, trust and control over their data, executing machine learning tasks at the edge of the system has the potential to make Internet of Things (IoT) applications and services more human-centric. This implies moving complex computation to a local stage, where edge devices must balance the computational cost of the machine learning techniques against the available resources. Thus, in this paper, we analyze all the factors affecting the classification process and empirically evaluate their impact in terms of performance and cost. We focus on Human Activity Recognition (HAR) systems, which represent a standard type of classification problem in human-centered IoT applications. We present a holistic optimization approach based on input data reduction and feature engineering that aims to enhance all the stages of the classification pipeline and to integrate both inference and training at the edge. The results of the conducted evaluation show that the trade-off between the computational cost, in terms of processing time, and the achieved classification accuracy is highly non-linear. In the presented case study, the computational effort can be reduced by 80% while assuming a decline in classification accuracy of only 3%. The potential impact of the optimization strategy highlights the importance of understanding the initial data and studying the most relevant characteristics of the signal to meet the cost–accuracy requirements. This would contribute to bringing embedded machine learning to the edge and, hence, to creating spaces where human and machine intelligence could collaborate. | en
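The abstract describes the optimization strategy only in prose. As a purely illustrative sketch, not the authors' published code, the Python snippet below shows how such a cost-accuracy trade-off could be probed for a HAR-style classifier: the input windows are downsampled (input data reduction), the hand-crafted feature set is shrunk (feature engineering), and the wall-clock time of the resulting pipeline is used as a rough proxy for processing cost at the edge. All data, function names and parameter values here are hypothetical stand-ins.

```python
# Illustrative sketch only (not the paper's code): probing the cost-accuracy
# trade-off of a HAR-style classifier under input reduction and feature selection.
# Synthetic windows stand in for real accelerometer recordings.
import time
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def make_windows(n_windows=600, window_len=128, n_classes=4):
    """Generate synthetic fixed-length signal windows with class-dependent statistics."""
    y = rng.integers(0, n_classes, size=n_windows)
    X = rng.normal(loc=y[:, None], scale=1.0, size=(n_windows, window_len))
    return X, y

def extract_features(windows, downsample=1, n_features=4):
    """Downsample each window, then compute a small set of time-domain features."""
    w = windows[:, ::downsample]                      # input data reduction
    feats = [w.mean(axis=1), w.std(axis=1),
             w.min(axis=1), w.max(axis=1)]            # cheapest statistics first
    return np.column_stack(feats[:n_features])        # reduced feature set

X_raw, y = make_windows()

# Each configuration trades processing time against classification accuracy.
for downsample, n_features in [(1, 4), (2, 4), (4, 2)]:
    t0 = time.perf_counter()
    X = extract_features(X_raw, downsample, n_features)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_tr, y_tr)
    acc = accuracy_score(y_te, clf.predict(X_te))
    cost = time.perf_counter() - t0                   # proxy for edge processing time
    print(f"downsample={downsample} features={n_features} "
          f"time={cost:.3f}s accuracy={acc:.2f}")
```

Run on a constrained device, sweeping the downsampling factor and feature count in this way would expose the non-linear cost-accuracy curve the abstract refers to; the concrete 80% / 3% figures come from the paper's own case study, not from this sketch.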
dc.description.sponsorship | We gratefully acknowledge the support of the Basque Government's Department of Education, Spain, for the predoctoral funding of one of the authors and the Deustek Research Group. We also acknowledge the support of the Ministry of Economy, Industry and Competitiveness of Spain for SentientThings under Grant No. TIN2017-90042-R. | en
dc.identifier.citation | Gómez-Carmona, O., Casado-Mansilla, D., Kraemer, F. A., López-de-Ipiña, D., & García-Zubia, J. (2020). Exploring the computational cost of machine learning at the edge for human-centric Internet of Things. Future Generation Computer Systems, 112, 670-683. https://doi.org/10.1016/J.FUTURE.2020.06.013 | |
dc.identifier.doi | 10.1016/J.FUTURE.2020.06.013 | |
dc.identifier.issn | 0167-739X | |
dc.identifier.uri | http://hdl.handle.net/20.500.14454/1890 | |
dc.language.iso | eng | |
dc.publisher | Elsevier B.V. | |
dc.rights | © 2020 Elsevier B.V. | |
dc.subject.other | Cost-accuracy | |
dc.subject.other | Edge computing | |
dc.subject.other | Edge intelligence | |
dc.subject.other | Embedded systems | |
dc.subject.other | Internet of Things | |
dc.subject.other | Machine learning | |
dc.title | Exploring the computational cost of machine learning at the edge for human-centric Internet of Things | en |
dc.type | journal article | |
dcterms.accessRights | metadata only access | |
oaire.citation.endPage | 683 | |
oaire.citation.startPage | 670 | |
oaire.citation.title | Future Generation Computer Systems | |
oaire.citation.volume | 112 |