Title: Using mobile sensing on smartphones for the management of daily life tasks
Authors: Menon, Dilip; Korkut, Safak; Inglese, Terry; Dornberger, Rolf
Editor: Dornberger, Rolf
Year of publication: 2020
Record date: 2024-03-13
ISBN: 978-3-030-48331-9 (print); 978-3-030-48332-6 (online)
DOI: 10.1007/978-3-030-48332-6_5
URI: https://irf.fhnw.ch/handle/11654/42721
Type: 04A - Beitrag Sammelband (contribution to an edited volume)
Pages: 63-79
Language: en
Subject: 330 - Wirtschaft (Economics)

Abstract: Today's smartphones contain a variety of embedded sensors capable of monitoring and measuring relevant physical qualities and quantities, such as light or noise intensity, rotation and acceleration, magnetic field, humidity, etc. Combining data from these different sensors to derive new, practically useful information, known as sensor fusion or multimodal sensing, enhances the capabilities of the individual sensors. The authors hypothesize that the sensing technology embedded in smartphones may also support the management of daily life tasks. Because one of the biggest challenges in mobile sensing on smartphones is the lack of appropriate unified data analysis models and common software toolkits, the authors have developed a prototype of a mobile sensing architecture called Sensing Things Done (STD). Using this prototype, the authors conducted a feasibility study to investigate the above hypothesis, applying multimodal sensing to gather sensor data while a specific set of tasks was performed. The chapter describes the study, examines to what extent task-related activities can be detected automatically with the sensors of a standard smartphone, and provides recommendations derived from the results.
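
The abstract does not describe the STD prototype at code level; as a rough illustration of the multimodal-sensing idea it refers to, the following minimal Kotlin/Android sketch registers listeners for two built-in sensors (accelerometer and light) and combines their latest readings into a coarse activity label. The class name, thresholds, and fusion rule (MultimodalSensingSketch, MOVEMENT thresholds, currentContext) are illustrative assumptions and not part of the STD architecture.

import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager
import kotlin.math.sqrt

// Illustrative sketch only: fuses accelerometer and light readings into a
// coarse context label; this is not the authors' STD implementation.
class MultimodalSensingSketch(context: Context) : SensorEventListener {

    private val sensorManager =
        context.getSystemService(Context.SENSOR_SERVICE) as SensorManager

    private var accelerationMagnitude = 0f  // latest |a| in m/s^2
    private var ambientLight = 0f           // latest illuminance in lux

    fun start() {
        // Register two modalities; a fuller system would add more sensors.
        sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER)?.let {
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_NORMAL)
        }
        sensorManager.getDefaultSensor(Sensor.TYPE_LIGHT)?.let {
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_NORMAL)
        }
    }

    fun stop() = sensorManager.unregisterListener(this)

    override fun onSensorChanged(event: SensorEvent) {
        when (event.sensor.type) {
            Sensor.TYPE_ACCELEROMETER -> {
                val (x, y, z) = event.values
                accelerationMagnitude = sqrt(x * x + y * y + z * z)
            }
            Sensor.TYPE_LIGHT -> ambientLight = event.values[0]
        }
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) = Unit

    // Naive fusion rule with assumed thresholds, purely for illustration.
    fun currentContext(): String = when {
        accelerationMagnitude > 12f && ambientLight > 50f -> "moving in a lit environment"
        accelerationMagnitude > 12f -> "moving in the dark"
        ambientLight < 5f -> "device idle, pocket or night"
        else -> "device idle, ambient light"
    }
}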