Today, both science and industry rely heavily on machine learning models, predominantly artificial neural networks, which are becoming increasingly complex and demand ever more computing resources to train. In this paper, we look holistically at the efficiency of machine learning models and draw inspiration from the principles of the green, sustainable economy to address their main challenges. Instead of constraining the computations or memory used by the models, we focus on reusing what is already available to them: computations done in previous processing steps, partial information accessible at run-time, or knowledge gained by the model during previous training sessions in continually learned models. This new research path of zero-waste machine learning raises several research questions related to the efficiency of contemporary neural networks: How can machine learning models learn better with less data? How do they select the relevant data samples out of many? Finally, how can they build on top of already trained models to reduce the need for more training samples? Here, we explore all of the above questions and attempt to answer them.
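One of the reuse strategies mentioned above, avoiding recomputation of results from previous processing steps, can be illustrated with a minimal sketch. The `extract_features` function below is a hypothetical stand-in for an expensive computation (e.g., a forward pass through early network layers), not a method from the paper; the point is only that caching lets repeated inputs reuse earlier work instead of wasting it:

```python
from functools import lru_cache

# Counts how many times the expensive step actually runs.
calls = 0

@lru_cache(maxsize=None)
def extract_features(sample: int) -> int:
    """Hypothetical, costly 'feature extraction' step; cached so that
    revisited samples reuse the previously computed result."""
    global calls
    calls += 1
    return sample * sample  # stand-in for a heavy computation

# A processing stream that revisits samples: with the cache, each
# distinct sample is computed only once ("zero-waste" reuse).
stream = [1, 2, 1, 3, 2, 1]
features = [extract_features(s) for s in stream]
```

With the six-element stream above, the expensive step executes only three times, once per distinct sample; the other three lookups are served from the cache.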