Sensor analysis and fusion for detecting human states and affect is a challenging open problem in ubiquitous computing research, as well as a key milestone for the automotive industry. Driver assistance systems have become increasingly widespread thanks to advances in Artificial Intelligence, with the aim of improving road safety and reducing the number of accidents caused by human error. Despite their great potential, the deployment of such technologies is still in its infancy, especially with regard to the driver’s affective state, which can strongly influence driving performance. This project aims to address this gap by developing systems that improve the performance of affective state detection during driving, using multimodal biometric sensor data such as electrodermal activity (EDA), electrocardiography (ECG), photoplethysmography (PPG), and respiration.