Wearable-Based Consumer Activity Tracking for Eating & Retail
Personalised lifestyle analytics has long been of interest in domains such as healthcare, where researchers want to understand, for example, whether diabetes can be prevented through timely intervention. In the retail domain, the interest lies in identifying the factors that drive changes in buying patterns.

Our technologies dramatically reduce the human effort that such data capture and analytics require by relying on automated, smart sensing carried out by personal devices such as smartphones and smartwatches. In particular, we utilise the inertial sensors and the embedded camera of a smartwatch to capture an individual's eating behaviour and diet choices unobtrusively. Similarly, we utilise the inertial and Radio Frequency (RF) sensors on a shopper's smartphone and/or smartwatch to capture their in-store interactions with different products and build a deeper profile of each individual shopper.

Current approaches involve either significantly more manual effort or more extensive infrastructure deployment. For eating analytics, for example, existing approaches require individuals to manually upload pictures of their meals or record their eating activities in digital journals. For retail, alternative approaches rely on in-store cameras and video, which raise privacy concerns and cannot attach an observed shopping profile to a specific customer.
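As a rough illustration of the inertial-sensing idea, eating gestures can be spotted as bursts of wrist motion in the smartwatch accelerometer stream. The sketch below is a deliberately simplified stand-in for such a pipeline, not our actual system: the function name, threshold, minimum-gap parameter, and synthetic trace are all illustrative assumptions.

```python
# Minimal sketch: counting hand-to-mouth gestures as threshold
# crossings in the wrist accelerometer magnitude signal.
# Threshold and gap values are illustrative, not tuned on real data.

def count_eating_gestures(magnitudes, threshold=1.5, min_gap=10):
    """Count upward crossings of `threshold` in a magnitude stream.

    A gesture is registered when the magnitude rises through
    `threshold`, with at least `min_gap` samples between detections
    so a single hand-to-mouth movement is not double-counted.
    """
    gestures = 0
    last_peak = -min_gap  # allow a detection at the very start
    for i in range(1, len(magnitudes)):
        crossed = magnitudes[i - 1] < threshold <= magnitudes[i]
        if crossed and i - last_peak >= min_gap:
            gestures += 1
            last_peak = i
    return gestures

# Synthetic trace: a ~1.0 g baseline with three short bursts above
# the threshold, spaced further apart than `min_gap`.
trace = ([1.0] * 15 + [2.0] + [1.0] * 15 + [2.1]
         + [1.0] * 15 + [1.9] + [1.0] * 10)
print(count_eating_gestures(trace))  # prints 3
```

A real deployment would of course operate on multi-axis sensor data and use a learned classifier rather than a fixed threshold; the sketch only conveys the unobtrusive, fully automated nature of the capture.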