The emergence of Artificial Intelligence is playing a very important role in automating services, guest interaction and the overall guest experience in the hospitality industry. Some instances where AI assists hotels in identifying and satisfying the likes and dislikes of guests are:
- Smart Room Experiences
AI-powered systems adjust room temperature, lighting and music to guest preferences. Several hotel chains have adopted such systems: (i) YOTEL Hotels provide robotic services to guests; (ii) Wynn Hotels in Las Vegas have integrated Amazon's Alexa into each of their guest rooms; (iii) Hilton has introduced 'Connie', an AI robot that recommends tourist destinations that guests can visit; (iv) the Cosmopolitan Hotel, Las Vegas uses an AI-driven digital concierge named "Rose", an interactive voice-activated tool; (v) Henn na Hotel, Japan uses anthropomorphic robots to assist guests throughout their stay; and (vi) the Edwardian Hotel, UK uses an AI chatbot, "Edward", which assists guests with housekeeping inquiries, etc.
- Enhanced Guest Communication Systems
A digital concierge, powered by an AI chatbot, redirects guests' requests in real time to the relevant hotel departments, thereby reducing staffing costs and improving service.
- Enhanced Revenue Management and Pricing Strategy
Advanced AI-driven tools combine revenue management, data analytics (occupancy rates, demand trends, market fluctuations, etc.) and distribution channels to provide real-time pricing recommendations that help hoteliers increase gross operating profit (GOP) and revenue.
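To make the last point concrete, below is a minimal, purely illustrative Python sketch of how such a tool might turn occupancy, demand and competitor-rate signals into a nightly rate recommendation. The function names, thresholds and weights are assumptions for illustration only and do not reflect any vendor's actual revenue-management model.

```python
from dataclasses import dataclass

@dataclass
class MarketSignals:
    occupancy_rate: float      # fraction of rooms sold tonight, 0.0-1.0
    demand_index: float        # booking pace vs. the same period last year, 1.0 = flat
    competitor_rate: float     # average comparable nightly rate in the market

def recommend_rate(base_rate: float, signals: MarketSignals) -> float:
    """Return a suggested nightly rate from simple occupancy/demand heuristics.

    Illustrative toy model only, not a production revenue-management system.
    """
    rate = base_rate

    # Push the rate up as the hotel fills, down when it is emptying out.
    if signals.occupancy_rate > 0.85:
        rate *= 1.15
    elif signals.occupancy_rate < 0.40:
        rate *= 0.90

    # Nudge the rate by how strongly demand is pacing against last year (capped).
    rate *= min(max(signals.demand_index, 0.8), 1.2)

    # Stay loosely anchored to the market: never drift more than 25% from competitors.
    lower, upper = 0.75 * signals.competitor_rate, 1.25 * signals.competitor_rate
    return round(min(max(rate, lower), upper), 2)

if __name__ == "__main__":
    tonight = MarketSignals(occupancy_rate=0.9, demand_index=1.1, competitor_rate=180.0)
    print(recommend_rate(base_rate=160.0, signals=tonight))  # prints 202.4
```

Real revenue-management engines rely on forecasting and optimisation rather than fixed thresholds; the point of the sketch is simply that the recommendation is a function of data the hotel already collects, which matters when questions of responsibility for the output arise.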
AI-driven assistants are therefore being deployed and are reducing staffing needs. However, this use of AI technology raises a few legal concerns, e.g. vicarious liability, which is addressed in this article.
Normally, the employer is answerable for the work performed by any assistant/staff/employee at the hotel. The important question that arises is whether the hotel will be responsible for the mistakes, deficiencies or negligence of AI tools/chatbots, etc.
Vicarious liability deals with the master-and-servant relationship, under which the hotel management/General Manager is liable for the acts of an assistant/staff/employee. As per this principle, the employer is held liable for the wrongful acts that employees commit while performing their job duties. The essential feature of vicarious liability is that the employer is held accountable even though the employer did not directly engage in the wrongful act.
In Shiv Kumar Jatia v. State of NCT of Delhi1, the Hon'ble Supreme Court of India quashed the criminal proceedings initiated against Mr. Jatia, which rested merely on the ground that he was the Managing Director of the company and its only non-independent executive director. The Supreme Court in that case reaffirmed the view it had taken in Sunil Bharti Mittal v. Central Bureau of Investigation2, where it was inter alia held that, under the vicarious liability provision, directors can be made accused only if there is sufficient material to prove their active role coupled with criminal intent. The facts of the case were that a guest (Rishi) staying at the Hyatt Regency, Delhi (owned by Asian Hotels), went to the sixth-floor terrace to smoke with two other guests and fell from the sixth floor to the fourth floor. It was alleged that the hotel staff did not stop the guests from going to the open terrace and that there was no light on the terrace. It was further alleged that there was a lapse on the part of the hotel management in taking proper safety measures for guests and in allowing guests to smoke in an area which was not at all safe. First information report (FIR) No. 390 of 2013 charged the managing director, the general manager and six other employees of the hotel with offences under Sections 308, 336 and 338 of the IPC; however, the ingredients of these offences were not found to be made out. The accused persons were also sought to be prosecuted under Section 4 of the Cigarettes and Other Tobacco Products (Prohibition of Advertisement and Regulation of Trade and Commerce, Production, Supply and Distribution) Act, 2003 (COTPA).
The charge sheet filed against the Managing Director stated that he was overall responsible for all acts of omission and commission of the hotel's officials and for violation of the lodging licence/health trade licence with regard to the safety of its guests, and that he was therefore liable on account of vicarious liability. However, there was no material to prove an active role coupled with criminal intent on the part of the MD of the company.
The Supreme Court of India, in the recent case of K.S. Mehta v. M/s Morgan Securities and Credits Private Limited3, has held that two conditions need to be fulfilled to hold a director vicariously liable: (1) the person is in charge of the business, and (2) the person is responsible to the company for the conduct of the business. The Court clarified that criminal liability cannot be presumed automatically, and the burden is on the complainant to show the accused's specific role in the transaction. If no express provision providing for vicarious liability of directors exists, then individuals can only be prosecuted if there is direct evidence of their active role along with criminal intent. Thus, liability arises where both actus reus (the act or conduct that is a constituent element of a crime) and mens rea (the mental state) of the accused exist.
Now let us examine whether an employer will be vicariously liable for the acts of an AI tool/assistant/chatbot. In Moffatt v. Air Canada (2024)4, the British Columbia Civil Resolution Tribunal held that Air Canada did not take reasonable care to ensure its chatbot was accurate and that the applicant, in reasonably relying on the chatbot, suffered damages as a result. The facts of the case are that Jake Moffatt (a customer of Air Canada) had sought information via the airline's AI chatbot about bereavement fare discounts offered by Air Canada, to determine whether he could afford to fly to his grandmother's funeral. The chatbot responded that he could purchase tickets at a reduced rate or receive a partial reimbursement of the full cost of a ticket already purchased if he submitted his claim within 90 days of the travel date. The chatbot's reply also included a link to the page on Air Canada's website containing the bereavement policy, which stated: "Please be aware that our Bereavement policy does not allow refunds for travel that has already happened."
Relying on the chatbot's reply about the bereavement fare discount, Moffatt booked his flight tickets, but Air Canada refused to provide him a bereavement-discounted fare. Moffatt thereafter challenged Air Canada's decision and applied to the tribunal for a refund, alleging negligent misrepresentation by Air Canada and stating that he had booked the ticket in reliance on the information provided by the AI chatbot on Air Canada's website. Air Canada admitted that the information provided by the chatbot was misleading, but maintained that Moffatt was not entitled to a refund and that the chatbot was a separate legal entity from Air Canada. The tribunal stated that Moffatt had to show that Air Canada owed him a duty of care, that its representation was untrue, inaccurate or misleading, that Air Canada made the representation negligently, that he reasonably relied on it, and that the reliance resulted in damages. After hearing both parties, it held that Moffatt met all of these requirements and that Air Canada was solely responsible for all the information shown on its website (including information provided by the AI chatbot), and it ultimately ordered Air Canada to pay Moffatt damages.
Thus, the concept of vicarious liability is increasingly being applied to AI tools/chatbots deployed by companies that render services to consumers (in the above case, Air Canada), and such companies will be held responsible if anything goes wrong.
There are certain exceptions to the limits on vicarious liability discussed above. One such exception is "strict liability", which arises in certain circumstances. Coming back to the hotel industry, if services provided by the hotel can cause any foreseeable harm to guests, then the hotel will be liable for the damage, irrespective of whether the hotel was negligent or made efforts to prevent that harm. The strict liability doctrine holds the hotel's management liable for damage or loss without needing proof of its intentions. The Supreme Court of India applied this principle in M.C. Mehta v. Union of India, where it held that "Once the activity carried on is hazardous or inherently dangerous, the person carrying on such activity is liable to make good the loss caused to any other person by his activity irrespective of the fact whether he took reasonable care".
Imagine a situation where the hotel allows guests to use Wi-Fi during their stay and a 12-year-old child plays PUBG and hurts himself in the process. The question that may arise is whether the hotel management can be held responsible under the principle of strict liability, or whether there will be no vicarious liability because the choice of using the internet lay solely with the guest. Another situation is where a 12-year-old child uses the internet to play online real-money games which, as per the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, are only permitted to be played by persons over the age of 13 years.
One may argue that where a person commits a crime using AI, such criminal activity cannot be classified as a traditional crime and, even though it has been committed via an AI tool/robot/chatbot, it should not be attributed to the person who created the software, program or machine. Situations where hotel owners may be exposed to criminal liability arise when guests flout strict legislation such as the Official Secrets Act or the PMLA using the facilities and premises of the hotel. For instance, public Wi-Fi used during a confidential conference of government officials discussing classified information may be hacked because of weak security firewalls put in place by the hotel.
The hotel management needs to be mindful that even if it does not get embroiled in criminal liability for damage caused by AI/software, the hotel could still face civil liability under the tort of negligence where it breaches its "duty of care" and a guest is injured as a result of that breach. There is also a rise in misleading advertisements or commitments coming from AI tools/chatbots, which can be considered an unfair trade practice under the Consumer Protection Act, 2019. Further, those deploying AI tools/chatbots need to be mindful of data breaches and unauthorized access to sensitive information, and must ensure that compliance with privacy laws (the DPDP Act) remains up to date and enforceable.
Currently, there are no specific laws enacted by the Government of India (GOI) to protect citizens from the misuse of AI tools/chatbots, etc. The government is planning to implement a proposed Digital India Act (DIA), which may replace the Information Technology Act, 2000, and under which AI may also be regulated. India has also made commitments at the Global Partnership on Artificial Intelligence ("GPAI") 2024, where 29 member countries unanimously adopted the GPAI New Delhi Declaration ("Declaration") and acknowledged the need to work towards safe, secure and trustworthy AI, including the development of relevant regulations, policies and standards. The Declaration states that there is a crucial need to mitigate the risks associated with misinformation and disinformation, unemployment, lack of transparency and fairness, and the protection of intellectual property and personal data, which may threaten human rights and democratic values. The member countries conveyed their support for India's intention to promote collaborative AI for global partnership.
AI technology is evolving at a brisk pace and will play a significant role in the hotel sector. This also brings various risks, such as safety risks, where the deployment of AI systems can cause potential harm to guests. There is currently a vacuum in terms of laws and compliance requirements dealing with the use of AI, and many countries globally are still developing their own AI laws and regulations to tackle this issue.
In addition, there are instances where the use of AI tools could result in tortious liability: (i) Medical misdiagnosis: AI tools used in healthcare can sometimes lead to incorrect diagnoses resulting in improper treatment and therefore liability for medical malpractice; (ii) Autonomous vehicles: increasing reliance on technology in cars, including AI controls and self-driving features, can lead to tortious liability; (iii) Employment discrimination: increasing use of AI tools in hiring processes can introduce bias based on race, gender or other protected characteristics, giving rise to employment discrimination claims; (iv) Defamation and misinformation: AI tools used in content generation or social media can sometimes spread false information or defamatory content. Similarly, hotels too are increasingly using AI tools, for instance for virtual check-in, voice-activated room controls and personalized guest experiences, where sufficient disclaimers need to be added for protection against the growing number of tort claims.
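As a practical illustration of that last point, the sketch below shows one way a hotel's guest-facing chatbot could attach a standing disclaimer and a pointer to the authoritative policy page to every generated answer, echoing the lesson of the Moffatt decision discussed above. The URL, wording and function names are assumptions for illustration only, and such a disclaimer does not by itself remove the liabilities discussed in this article.

```python
# Illustrative sketch only: the policy URL, wording and function names are
# assumptions, not any hotel's or vendor's actual implementation.

POLICY_URL = "https://example-hotel.test/policies"  # placeholder, not a real page

DISCLAIMER = (
    "This answer was generated by an automated assistant and may be incomplete "
    f"or out of date. The published policy at {POLICY_URL} prevails in case of "
    "any conflict; please confirm with the front desk before relying on it."
)

def wrap_chatbot_reply(generated_answer: str) -> str:
    """Append a standing disclaimer and policy pointer to every chatbot answer.

    The Moffatt dispute turned on a chatbot answer that contradicted the airline's
    own policy page, so here the reply always points the guest back to the
    authoritative source instead of presenting the generated text as final.
    """
    return f"{generated_answer}\n\n{DISCLAIMER}"

if __name__ == "__main__":
    answer = "Late checkout until 2 pm is usually available for loyalty members."
    print(wrap_chatbot_reply(answer))
```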
Footnotes
1. Shiv Kumar Jatia v. State of NCT of Delhi, 2019 SCC OnLine SC 1090
2. Sunil Bharti Mittal v. Central Bureau of Investigation, AIR 2015 SC 923
3. K.S. Mehta v. Morgan Securities and Credits Private Limited, (2025) SLP 4774
4. Moffatt v. Air Canada, 2024 BCCRT 149
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.