Context-awareness has become central to the pursuit of more convenient and useful applications. In this paper, we introduce a novel Android application that detects and assesses three key aspects of its user's context: current location, movement status, and viewing status. To determine movement status, the app uses data from the device's accelerometer, which measures acceleration forces along each axis; from these signals it identifies whether the user is stationary, walking, running, or engaged in another form of movement. By fusing accelerometer data with orientation values, the app also estimates the user's geographic location and assesses viewing status, that is, whether the user is actively looking at the device or their attention is directed elsewhere. The paper discusses the design and architecture of the system, the methodologies employed, the technology stack used, and the implications of context-aware computing for enhancing the user experience in mobile applications. Through this approach, we aim to provide seamlessly integrated functionalities that adapt to users' real-time contexts and needs.
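As an illustration of the movement-detection idea described above, the following sketch classifies movement status from the variability of accelerometer magnitude samples. This is not the paper's implementation: the class name, method, and thresholds are hypothetical and would need calibration against real sensor data.

```java
import java.util.Arrays;

// Illustrative sketch only: classify movement status from a window of
// accelerometer magnitude samples (sqrt(x^2 + y^2 + z^2), in m/s^2).
// Low variability suggests the device is at rest; higher variability
// suggests walking or running. Thresholds here are hypothetical.
public class MovementClassifier {

    public static String classify(double[] magnitudes) {
        double mean = Arrays.stream(magnitudes).average().orElse(0.0);
        double variance = Arrays.stream(magnitudes)
                .map(m -> (m - mean) * (m - mean))
                .average().orElse(0.0);
        double std = Math.sqrt(variance);

        // Hypothetical decision boundaries (m/s^2):
        if (std < 0.5) return "stationary";
        if (std < 3.0) return "walking";
        return "running";
    }
}
```

In a real Android app, the magnitude window would be filled from `SensorManager` events for `Sensor.TYPE_ACCELEROMETER`, and the thresholds tuned per device and user.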
Keywords: Variable Multi-Sensor Signals; Context Awareness; Context Inference; Data Fusion