In recent years, status monitoring systems have become necessary and have been studied extensively to address the issues of aging and dying alone. In Japan in particular, which has the largest proportion of elderly people of any country, deaths in isolation have been increasing not only among people over 65 years old but also among those in the prime of life (ages 40 to 65). Those aged 40 to 65 feel that it is necessary to lead a regular life; however, few of them are able to do so because of work. A correlation exists between mastication ability and independent living, and estimation of mastication using wearable sensors has therefore been studied. However, wearing a sensor adds everyday stress for the subject. We therefore designed a method to estimate mastication using non-contact depth sensors. In this study, we propose combining action recognition based on skeleton information with biting estimation based on facial expressions. We also conduct an experiment to evaluate this method. Copyright ISCA.