The COVID-19 vaccine breakthroughs in late 2020 brought hope that the pandemic's end could be in sight, but a return to normalcy will require widespread inoculation, raising an urgent question: Should employers mandate COVID-19 vaccinations for their workers?
Littler's latest survey of more than 1,800 in-house counsel, HR professionals and C-suite executives finds most employers unlikely to mandate COVID-19 vaccination for a variety of reasons, with their top concerns centering on company culture and employee relations. Instead, most are focused on encouraging vaccination, continuing remote work and maintaining workplace safety protocols, even as vaccines become more widely available.