Using Artificial Intelligence to detect Emotions… Can it help in FM, at what cost?
- dan1462
- Jul 1, 2024
- 6 min read

Sometimes an article in the press makes you ask yourself important questions; other times it makes you think; occasionally it may spark an idea. Wired’s piece about AI-powered cameras being used to detect the emotions of unwitting train passengers sparked significant debate about the ethical implications and potential applications of this technology. You find yourself in the unique position of having some inside knowledge of the story and just end up laughing.
These AI engines work on models that are trained using video footage. The software effectively watches the footage and learns to recognise patterns. Whether it is cups with a company’s logo being thrown into a bin, or identifying happy, sad, or frustrated faces, the accuracy of these models improves as the data sets behind them grow larger and more diverse.
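The training loop described above can be sketched in miniature. The example below is a deliberately simplified, hypothetical illustration: each video frame is assumed to have already been reduced to a small feature vector, and the "model" is just a nearest-centroid classifier learning one average pattern per label. All the data is synthetic; real systems use deep networks on raw footage.

```python
import random
from statistics import mean

# Hypothetical sketch: each "frame" is already reduced to a feature
# vector, and the model learns one centroid (average pattern) per label.
# All data here is synthetic, purely to illustrate supervised training.

random.seed(42)

def synth_frame(label):
    """Generate a synthetic feature vector for a labelled frame."""
    base = {"happy": 1.0, "sad": -1.0, "frustrated": 0.0}[label]
    return [base + random.gauss(0, 0.8) for _ in range(4)]

def train(frames):
    """Learn one centroid per label from (features, label) pairs."""
    by_label = {}
    for feats, label in frames:
        by_label.setdefault(label, []).append(feats)
    return {label: [mean(col) for col in zip(*rows)]
            for label, rows in by_label.items()}

def predict(model, feats):
    """Classify a frame as the label with the nearest centroid."""
    def sq_dist(centroid):
        return sum((a - b) ** 2 for a, b in zip(feats, centroid))
    return min(model, key=lambda label: sq_dist(model[label]))

labels = ["happy", "sad", "frustrated"]
train_set = [(synth_frame(l), l) for l in labels for _ in range(200)]
test_set = [(synth_frame(l), l) for l in labels for _ in range(50)]

model = train(train_set)
accuracy = mean(predict(model, f) == l for f, l in test_set)
print(f"accuracy: {accuracy:.2f}")
```

Growing or diversifying `train_set` is exactly the "larger and more diverse data sets" point: the learned centroids become more representative, and accuracy on unseen frames improves.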
The project involving emotion detection cameras at Network Rail had its origins in identifying terrorist behaviours and assessing suicide risks at stations to keep the public safe. It was accelerated during the COVID-19 pandemic, gaining rapid approval to monitor compliance with face mask mandates at gate lines in major London transport hubs. Once the technology was there, other use cases were tested for feasibility: facilities service providers were alerted if the cameras identified spillages, and detected crowding prompted security and platform staff to open barriers and clear areas, thereby avoiding congestion and potential safety issues.
This technology, which utilises advanced AI, can also be used to analyse human emotions. It opens new possibilities for facilities managers and the built environment: the potential for such technology to improve service delivery, understand user satisfaction first-hand in real time, and maintain or improve security in various environments is immense. However, it also raises important questions about privacy, consent, and the potential for misuse, hence the lively debate on LinkedIn following the somewhat ‘scaremongering’ article in Wired.
While the concept of emotion detection in facilities management is relatively novel, there are early adopters exploring its potential. Retail environments, for instance, have experimented with emotion detection to gauge customer reactions to products and store layouts. Airports have used similar technology to monitor passenger stress levels, improving crowd management and reducing bottlenecks.
Current Use of Artificial Intelligence and Future Prospects
The future of this technology in FM looks promising, with advancements in AI and machine learning making it increasingly accurate and reliable. However, its widespread adoption will depend on overcoming several challenges, particularly concerning privacy and ethical considerations.
Ethical and Privacy Concerns of Emotion Detection
The deployment of emotion detection technology raises significant ethical issues. The primary concern is privacy. The ability to monitor and analyse emotions in real time can be perceived as intrusive, especially if individuals are unaware they are being observed. This could lead to a sense of constant surveillance, eroding trust between occupants and facility managers.
It’s important to address the concerns raised by sensationalist perspectives like the one presented in Wired, which can sometimes take the implications of such technology out of context. It’s worth noting that we are already under constant surveillance through CCTV systems in public spaces. Emotion detection AI doesn’t take pictures; rather, it analyses distances between points on faces, converting these into numerical patterns to identify emotions like happiness or sadness.
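That "distance points, not pictures" idea can be made concrete. In the hypothetical sketch below, a face is only ever represented as a handful of (x, y) landmark coordinates; these are converted into a pattern of normalised pairwise distances and matched against simple emotion templates. The landmark positions and templates are invented for illustration, not taken from any real system.

```python
import math

# Hypothetical sketch: no image is stored, only face geometry.
# A "face" is a few (x, y) landmark points; we turn them into a
# pattern of pairwise distances (scaled so the largest is 1) and
# match that pattern against stored emotion templates.

def distance_pattern(landmarks):
    """Normalised pairwise distances between landmark points."""
    dists = [math.dist(a, b)
             for i, a in enumerate(landmarks)
             for b in landmarks[i + 1:]]
    largest = max(dists)
    return [d / largest for d in dists]

def classify(landmarks, templates):
    """Return the template label whose pattern is closest to the face's."""
    pattern = distance_pattern(landmarks)
    def gap(label):
        return sum((p - t) ** 2 for p, t in zip(pattern, templates[label]))
    return min(templates, key=gap)

# Four invented landmarks: mouth corners, upper lip, lower lip.
smiling = [(0.0, 0.0), (4.0, 0.0), (2.0, 1.0), (2.0, -1.0)]
neutral = [(0.0, 0.0), (3.0, 0.0), (1.5, 0.5), (1.5, -0.5)]

templates = {
    "happy": distance_pattern(smiling),
    "neutral": distance_pattern(neutral),
}

print(classify(smiling, templates))  # prints "happy"
```

Note that nothing identifying survives the conversion: two different people making the same expression produce similar patterns, which is precisely why this is less personal than a CCTV still.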

Consent is a critical issue: for emotion detection to be ethically implemented, it must be transparent, with clear communication about its use and purpose. It also needs to be understood that the businesses or organisations deploying this aren’t gathering data on you as an individual. Arguably, the loyalty card you use when you get to the till, or the digital season ticket you travel with, is doing this much more effectively.
You may feel that users should have the option to opt-in or out and should be informed about how their data will be used, stored, and protected, but how does this differ from your image being captured on CCTV?
Potential Use Cases of Emotion Detection in FM
1. Enhancing Customer Experience in Offices and Public Spaces:
Facilities managers could use emotion detection cameras to monitor and improve the satisfaction levels of occupants in offices, washrooms, and other public areas. By analysing emotions such as frustration, happiness, or stress, FMs can identify and address problem points in their buildings. This has already been done in airports and other locations to improve the passenger experience.
If users frequently display signs of frustration in a particular area of an office, it might indicate problems such as poor lighting, uncomfortable seating, or inadequate climate control, or they may just be stood near a printer that isn’t working, again. Prompt interventions could significantly enhance the overall user experience.
2. Improving Public Safety and Security:
Just as Network Rail aims to enhance safety by detecting behaviours indicative of potential theft or trespassing, similar technology can be employed in other built environments. In malls, airports, or stadiums, emotion detection can help security personnel respond more swiftly to unusual or threatening behaviour, potentially preventing incidents before they escalate.
3. Operational Efficiency and Maintenance:
Emotion detection can also provide insights into how different spaces are being used and perceived, informing maintenance schedules and space utilisation strategies. For example, if a washroom consistently elicits negative emotions, it might be due for a deep clean or renovation. This data-driven approach allows for more efficient allocation of resources and ensures that facilities are maintained to high standards.
4. Targeted Advertising and Marketing:
An evolution of demographic-based AI applications is the use of emotion detection for targeted advertising. Companies are already using AI to identify people based on demographic factors and drive video advertising boards to display relevant content as they pass. For instance, if a camera detects a person pushing a pram, it might trigger adverts for nappies or baby products. Emotion detection takes this a step further by allowing for more nuanced and responsive advertising. If a camera identifies that someone is happy, they might be shown ads for leisure activities, whereas signs of stress might trigger ads for relaxation products. This could significantly enhance the effectiveness of advertising campaigns by aligning them more closely with the current emotional state of potential customers.
5. Automated room controls based on users’ mood:
Instead of changing an advertising board, meeting room or office conditions could be changed, lift music could be tailored to its occupants, and favoured scents could be selected from an employee profile and dispersed ahead of people based on predictive movement analysis.
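Use cases 4 and 5 both reduce to the same pattern: a lookup from a detected emotional state to a set of responses. The rule table below is a minimal hypothetical sketch of that idea; the emotions, adverts, and room settings are invented for illustration, not drawn from any real deployment.

```python
# Hypothetical rule table mapping a detected emotion to both
# advertising content (use case 4) and room conditions (use case 5).
# All values are illustrative, not a real product configuration.

RULES = {
    "happy":    {"advert": "leisure activities",  "lighting": "bright",  "music": "upbeat"},
    "stressed": {"advert": "relaxation products", "lighting": "soft",    "music": "calm"},
    "neutral":  {"advert": "general",             "lighting": "default", "music": "ambient"},
}

def respond_to(emotion):
    """Look up the action set for a detected emotion, defaulting to neutral."""
    return RULES.get(emotion, RULES["neutral"])

print(respond_to("stressed")["advert"])  # prints "relaxation products"
```

The design point is that the AI only supplies a coarse label; everything the building actually does in response is a deliberate, auditable configuration choice.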
Summary
While the initial list of use cases might have been constructed to justify the high costs of this cutting-edge technology, as it improves and becomes more affordable it will likely find a viable price point, offering a return on investment. To illustrate: washroom door sensors, once prohibitively expensive, have become affordable over time, dropping to less than £1 a day in some cases.
Balancing Benefits and Risks
To balance the benefits of emotion detection analysis with the associated risks, facilities managers and stakeholders need to adopt a measured and ethical approach:
1. Transparency and Consent:
Ensure that the deployment of emotion detection cameras is transparent. Communicate clearly with occupants about why the technology is being used, how it benefits them, and what measures are in place to protect their privacy.
2. Data Security:
Implement robust data security protocols to protect sensitive information gathered through emotion detection. Ensure that data is anonymised and stored securely to prevent unauthorised access.
3. Human Oversight:
There is the risk of misuse or over-reliance on AI. Emotion detection systems are not infallible and can misinterpret signals, leading to false positives or negatives. Over-reliance on this technology could result in neglecting traditional, human-centric approaches to facilities management, such as direct feedback and personal interaction. Human oversight is crucial to interpret AI findings accurately and make informed decisions.
4. Ethical Guidelines:
Develop and adhere to ethical guidelines for the use of AI in facilities management. These should be aligned with broader societal values and legal frameworks to ensure responsible use.
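Point 2 above (anonymised, securely stored data) can be sketched briefly. In this hypothetical example, an emotion reading is stored against a salted one-way hash of the camera's internal track ID rather than anything identifying, and the salt is regenerated periodically so records cannot be linked across sessions. The field names and salt policy are assumptions for illustration only.

```python
import hashlib
import secrets

# Hypothetical sketch of anonymised storage: an emotion reading is
# keyed by a salted, one-way hash of the camera's internal track ID.
# Rotating the salt (e.g. daily) prevents linking records over time.

daily_salt = secrets.token_bytes(16)  # regenerated on each rotation

def pseudonymise(track_id: str) -> str:
    """Salted SHA-256 hash of a track ID; not reversible to the ID."""
    return hashlib.sha256(daily_salt + track_id.encode()).hexdigest()

record = {"subject": pseudonymise("track-0042"), "emotion": "frustrated"}
print(record["emotion"], record["subject"][:12])
```

Within one salt period the same track hashes consistently (so trends can be counted), but once the salt rotates the old pseudonyms are orphaned, which is the anonymisation guarantee the guideline asks for.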
Conclusion
The difficulty with a focused article on emotion detection with AI comes when summarising your thoughts. Emotion detection technology offers exciting possibilities for enhancing the built environment and making facilities managers’ lives better. By providing real-time insights into user satisfaction and behaviour, it can help create more responsive, efficient, and enjoyable spaces. The integration of this technology into targeted advertising represents a natural evolution, further personalising and optimising user experiences.
Emotion detection analysis can indeed be a positive development for AI in the built environment, but not in isolation; technology needs to be embraced as a whole and brought together to realise the true benefits and value that are possible. This is just another piece at the bleeding edge of the AI revolution that, when properly harnessed, can be used to improve our lives. The question that remains is: what is the true price?