I recently returned from running our annual Workshop in Sydney. Alongside trying to find the best flat white in the city and dealing with jetlag, I was able to hear more about what is on the minds of our Australia-based clients. At our workshop we discussed, amongst other topics, why companies need to build a narrative on the future of work and how to build a future-proofed culture. Three major takeaways for organisations emerged for me from the workshop.
- Think about your narrative
Despite increasing digital disruption and the rise of AI and analytics, organisations need to ensure they don’t forget the social aspects of change, and the power of stories over straight facts or data. Research has shown that stories impact people’s brains differently to facts, creating more connections in the brain and leading to closer relationships between the storyteller and the listener. People use stories as a way of understanding the world, and this is particularly true when it comes to the future of work. Employees are looking to employers to provide a sense of stability and purpose in a rapidly changing world. Organisations therefore need to reflect on their own narrative on the future, thinking about what it will mean to work in their company and how work will be done in the future. What are your non-negotiables? Where are you going to take a bet, and what will stay the same? In considering questions such as these, companies can provide their workers with a story about where they are going, and how they will be supported along this journey.
- Abandon assumptions around aging
The importance of not relying on stereotypes and assumptions around ageing also came out strongly in the Workshop. Longer working lives mean that organisations cannot make assumptions about the needs and desires of their workforce, particularly older workers. It is no longer always the case, for example, that a worker in their 60s is looking to retire. Organisations need to make sure that their practices and processes are not based on erroneous expectations. They need to rethink the way they approach retirement, and what it means to progress in the organisation, so that people who want to reduce their working hours are not penalised with a loss of status.
- Identify your influencers
Finally, the need to identify the cultural influencers within organisations was another important point. Rather than relying on hierarchical leaders, companies need to uncover the real influencers and work with them to drive cultural change. These influencers can be discovered through network analysis or crowdsourced conversations, but should be brought in early in the process to ensure the behavioural change so crucial to a successful culture shift.
It was great to hear from our members in Sydney, and we look forward to our next trip Down Under!
During my final year at university, students were approached by counsellors about taking lessons in mindfulness to help us cope with the stress of the final year. Initially, I dismissed it as another health fad claiming to be a panacea for all modern ills, but after hearing about the benefits from friends, I decided to do some research. Mindfulness can be described as a way of focusing one’s awareness on the present, so that you are more conscious of what you are doing in that moment. Essentially, it is a way to re-programme one’s mind to think in less stressful ways. Admittedly, as quite an anxious person, this resonated with me, and I now try to incorporate mindfulness into my everyday life. Since starting here at Hot Spots Movement, I have been interested in how mindfulness could transfer into my working life, and how it can help workers to be less stressed and ultimately more productive.
Over the past decade, research into mindfulness has exploded, with thousands of studies conducted into its potential benefits. Neuroscience studies in particular have transformed this practice from an ancient Buddhist concept into an exercise adopted by celebrities, businesses, politicians and the NHS. In 2007, scientists identified two different networks in our brain, two different ways we interact with the world: the default network and the direct experience network. The direct experience network is activated when you are being mindful; not thinking about the past, the future or other people. It is argued that this way of thinking allows you to get closer to the reality of an event, making you more flexible and relaxed in the decisions you make.
Some of the world’s biggest companies, such as Google and Facebook, and, interestingly, our Future of Work Consortium member KPMG, are paying attention to these studies and now offer mindfulness or meditation programmes as a way to make their employees happier and less stressed. For example, Chade-Meng Tan, a Google pioneer, transformed the company’s approach to wellbeing by introducing the ‘Search Inside Yourself’ mindfulness programme to all employees. Perhaps this is one of the reasons why Google is consistently rated as the world’s best employer. Similarly, CEO Mark Bertolini reshaped the culture of Aetna when he joined in 2010, drawing on his experience of mindfulness, which had helped him through a period of intense depression after a life-threatening skiing accident. He introduced free yoga and meditation classes for all employees, with those participating reporting on average a 28% reduction in their stress levels and a 20% improvement in sleep quality. Since Bertolini took over as CEO, Aetna’s stock has increased threefold. The New York Times wrote an interesting article on this case study, finding that Aetna’s employees each gained an average of 62 minutes per week of productivity, which Aetna estimates is worth $3,000 per employee per year.
Another motivation for introducing mindfulness into the workplace has been the immense pressure that workers are under today. According to the City Mental Health Alliance, 50% of long-term absences are accounted for by such stress, amounting to 70 million sick days. More pressingly for employers, stress causes losses of £26 billion a year in the UK alone, so it is no surprise that leading innovative businesses have embraced mindfulness, in the hope that it will be reflected not only in employees’ wellbeing, but also in productivity levels and, ultimately, in profits.
I believe that introducing mindfulness into an organisation is a step in the right direction. Research may not yet be able to say unequivocally that practising mindfulness increases productivity; however, the results of neuroscience studies are impressive, and case studies such as those of Google and Aetna show it is well worth investing in.
If you’d like to find out more about the benefits of mindfulness at work, please don’t hesitate to reach out to me at firstname.lastname@example.org
Applying for jobs can be a nerve-wracking experience, as competition is high and a step towards your career goals hangs in the balance. My assumption was that all candidates shared this same trepidation, but research from 2014 has revealed that men are far less cautious than women in this regard and will tend to apply for a role if they meet around 60% of the job requirements, whereas women will only apply if they meet 100% of them.[i] Why does this disparity exist, and why aren’t more women applying for roles within their reach?
One argument is that the language used within job adverts themselves dissuades certain genders from applying. For example, women are more likely to be deterred by adverts requesting individuals able to ‘manage’ rather than ‘develop’ teams, whereas men tend to prefer jobs requesting ‘competitive’ rather than ‘supportive’ candidates. Words such as these, imbued with gender connotations, are surprisingly prevalent. The technology company Textio carried out research in 2016 to flag gendered language and found that the average job advert contains twice as many ‘masculine’ phrases as ‘feminine’ ones.[ii] A similar study by the recruitment services company Total Jobs discovered that, within the 77,000 job adverts included in their study, 478,175 words carried gender bias: an average of six male-coded or female-coded words per advert.[iii] The use of gendered language can pose a significant problem, as it can signal to potential candidates that they don’t – and won’t – belong.
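For readers curious about how such an audit works in practice, the per-advert counts described above can be approximated with a simple word-count sketch. This is only an illustration: the word lists below are short, hypothetical examples drawn from commonly cited gender-coded terms, not the actual lexicons used by Textio or Total Jobs.

```python
import re

# Illustrative word lists only; real audits use far larger,
# research-derived lexicons of gender-coded language.
MASCULINE_CODED = {"competitive", "dominant", "lead", "analytical", "independent"}
FEMININE_CODED = {"supportive", "collaborative", "nurture", "interpersonal", "together"}

def audit_advert(text: str) -> dict:
    """Count masculine- and feminine-coded words in a job advert."""
    words = re.findall(r"[a-z]+", text.lower())
    masc = sum(1 for w in words if w in MASCULINE_CODED)
    fem = sum(1 for w in words if w in FEMININE_CODED)
    return {"masculine": masc, "feminine": fem}

advert = ("We want a competitive, analytical self-starter to lead "
          "a supportive team.")
print(audit_advert(advert))  # prints {'masculine': 3, 'feminine': 1}
```

Averaging these counts across a corpus of adverts would yield figures comparable to the ‘six coded words per advert’ finding cited above.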
Simple alterations can make a huge difference. Atlassian, an Australian software company, hired 80% more women into technical roles within two years by changing the wording of its job adverts, demonstrating the extensive effect of language.[iv] Paying close attention to the language used will be critical for companies wanting to grow the size of their talent pool, as ZipRecruiter showed when it discovered that gender-neutral adverts receive up to 42% more applications than biased ones.[v]
And yet, there are some points of contention that arise when asking organisations to change their wording. Firstly, in some cases, specific words are necessary. For example, positions in investment banking demand a level of competition and fearlessness, and failing to include these elements in a job description may mean that a new employee is unprepared for the realities of the role. Secondly, changing the language in adverts does not address the underlying social issues concerning why certain characteristics are perceived as either masculine or feminine in the first place. Removing gendered words from job descriptions does not necessarily remove the biases associated with them. However, despite these concerns, crafting gender-neutral job adverts is an expression of a firm’s commitment to inclusion, and this must be seen as a step in the right direction.
Some state that the 60%/100% disparity is not evidence of a language problem but of a “confidence gap” between men and women.[vi] They argue that women are less confident in their own abilities, whereas men are more self-assured and tend to take a more “cavalier” approach to applications.[vii] This may be true of certain individuals, but it seems both unfair and unlikely to assume that all men and women fit this stereotype. In fact, researchers at the Harvard Business Review have dubbed the confidence gap a “myth”, suggesting that women are not deterred from job applications because they lack confidence but because they do not want to waste time and energy applying for a role they are not adequately equipped to perform.[viii] This instead raises the question: why are men applying for jobs that they aren’t qualified for? And do the men who start in these roles find themselves out of their depth? Maybe. Maybe not. Perhaps what this disparity actually shows is that more men have simply seen these job adverts for what they really are: wish lists.
A lack of female applicants signals the need for a wider change in how job adverts are understood.
Lengthy bullet-pointed lists of job requirements can trick applicants into thinking that each point is vital when, in reality, recruiters write lists of ideal attributes rather than strict, unyielding lists of absolute necessities. Limiting the number of words in your job adverts will make it far easier for candidates to realise that they meet the requirements, while also reducing the risk of including gendered language. As more people feel both able and inspired to apply, recruiters may find that individuals with transferable skills can bring something unexpected to the organisation and take the role in a new and exciting direction. Furthermore, recent research on job descriptions has shown that providing people with a rigid list of tasks does not encourage them to push boundaries and innovate. Looser listings create opportunities for creativity and demonstrate that your organisation has space for people to be ambitious and to craft their own work and career path.[ix] Let all of your applicants feel 100% ready to take on a role they can help to shape.
To talk more about inclusion at work, drop me an email at email@example.com.
In his 1994 book ‘The Age of Diminishing Expectations’, Nobel Prize-winning economist Paul Krugman perspicaciously argued that ‘productivity isn’t everything, but in the long run it is almost everything’.[i] When one considers that productivity is perhaps the main driver of an economy’s ability to grow, and therefore also the greatest predictor of the standard of living for a given person or group of people, it is difficult to disagree with Krugman’s contention.
In essence, productivity is defined as output per hour worked. In recent years, however, productivity levels within the developed world have been lagging. The recent ‘Skills and Employment Survey’ highlighted that in the UK, labour productivity had historically grown by around 2% per year since the 1970s, but since the 2008-2009 recession it has stagnated and has failed to climb back to its pre-recession growth rate.[ii] This unprecedented and unexplained slump has become known as the ‘productivity puzzle’ and is an issue that has caused widespread concern amongst economists, business leaders and governments within the developed world.
As productivity levels continue to stagnate, organisations are implementing AI solutions reminiscent of Charlie Brooker’s superb dystopian TV show ‘Black Mirror’ to help boost productivity. Amazon, for example, has recently patented a wristband that tracks the hand movements of warehouse workers and uses vibrations to nudge them into being more productive. Veriato, a software firm, is able to track and log every keystroke employees make on their computers in order to measure how dedicated they are to their role and the company.[iii] In Helsinki, a digital innovation consultancy named ‘Futurice’ has installed sensors that can track an employee’s every move in the office, even in the toilet.[iv] Such technologies fall under the remit of what experts call the internet of things (IoT). Employees report mixed feelings about these new technologies, with a Harvard Business Review study revealing an approximate 50/50 split between those who believe AI technology enhances productivity and those who either disagree or feel its impact is neutral.[v]
The appeal of using advanced AI from the organisation’s perspective is clear and, although surveillance at work is not a new concept (factory workers have long clocked in and out), the scale to which certain AI technologies can now be used to monitor the productivity of the workforce is leading some commentators to suggest they are bordering on Orwellian. This inevitably raises acute philosophical questions about the ethical underpinnings of applied AI in the workplace. Indeed, just how far are organisations willing to go in the pursuit of productivity? Finding the balance between safeguarding basic privacy, workers’ rights and enhanced productivity will raise some moral dilemmas for organisations, and will no doubt become central to AI discourse in the coming years.
Finding this equilibrium will not be an easy task for organisations. A recent RSA report on the ethics of AI suggests there is a public perception that we may be surrendering too much power to AI technology.[vi] One thorny issue is that existing ethical frameworks are often incompatible with the world of technology. Science has attempted to develop ethical frameworks before – from Asimov’s Three Laws of Robotics to Nick Bostrom’s work on machine ethics. Adhering to these frameworks can be problematic, as humans often find it difficult to develop virtues for their own conduct, let alone build relevant virtues into new technologies.[vii] The debate around ethical AI must also consider how certain workers are better equipped than others to prevent employers going too far. For example, those with a specialist, in-demand skill set stand a greater chance of resisting any unethical implementation of AI, whereas those in insecure forms of employment, such as zero-hours contract workers in low-wage industries, have considerably less leverage.
In the current economic climate, solving the productivity puzzle is an alluring prize for organisations. However, if organisations wish to solve it using AI of this kind, its implementation must be conscientiously executed with a strong injection of humanity, to help ensure workers retain a sense of dignity in their work during this period of accelerated and uncertain change.
[i] Krugman, P. (1994) The Age of Diminishing Expectations. Cambridge, MA: MIT Press
[iii] The Economist (2018) AI in the Workplace
[iv] Burke, C (2016) In offices of the future, sensors may track your every move – even in the bathroom (The Guardian)
[vi] Balaram, B (2018) The Ethics of Ceding More Power To Machines (RSA)
[vii] Dalmia, V. Sharma, K. (2018) The Moral Dilemmas of the Fourth Industrial Revolution (World Economic Forum)
According to Harvard psychologist Dan Gilbert, ‘all of us are walking around with an illusion, an illusion that we have just recently become the people that we were always meant to be and will be for the rest of our lives. However, time is a powerful force. It transforms our preferences. It reshapes our values. It alters our identities. We seem to appreciate this fact, but only in retrospect. Only when we look backwards do we realise how much change happens in a decade.’[i] Our research at the Future of Work (FoW) Research Consortium indicates that this notion of transformation is becoming increasingly tangible and pronounced for three reasons: longer working lives, greater reflexivity and new social norms.
Longer working lives: More years have been added to life expectancy in the last century than in all previous millennia of mankind. A longer life means a longer working life, with some predicting that we will be working until we are 80. In this context, a longer working life provides more productive hours, presents more opportunities to be grasped and more identities to be explored. Simply put, longer working lives present an increasing range of possible ways of living.
Greater reflexivity: We are seeing an increasing disintegration of societal traditions, giving us greater freedom to think about and construct who we want to be. According to sociologist Ulrich Beck, we now live in a ‘risk society’ in which tradition has less influence and people have more choice.
New social norms: An increased acceptance of homosexuality is perhaps the best example of new social norms forming. For example, whilst 70% of people believed same-sex relationships were wrong in 1973, this figure had fallen to almost 40% by 2010. In contrast, the percentage of people who thought there was nothing wrong with same-sex relationships increased from just 10% in 1973 to over 40% in 2010.[ii]
Indeed, the rise in individualisation and its resulting impact on social norms explains why people are increasingly comfortable in both expressing and accepting a wider range of identities. What all this means is that each person at a given point in time has a spectrum of many possible selves. These possible selves are future articulations of who they might be and what they might do. They represent an ideal of what they might become, what they would like to become or what they are afraid of becoming.
What are your possible future selves?
[i] Retrieved from https://www.ted.com/talks/dan_gilbert_you_are_always_changing
[ii] Retrieved from https://www.theatlantic.com/business/archive/2013/04/the-rise-of-gay-marriage-and-the-decline-of-straight-marriage-wheres-the-link/274665/
[iii] Ibarra, H. (2004). Working identity: Unconventional strategies for reinventing your career. Harvard Business Press