In Chapter 26, Dorothy Kenny discusses machine translation (MT), tracing its history from rocky beginnings in the aftermath of the Second World War and highlighting the technological and geopolitical factors that reinvigorated MT research on more than one occasion. Of particular significance was the shift in the late twentieth century from rule-based to data-driven systems, in which machines ‘learned’ probabilistic models of translation from an ever-growing supply of human translations available in digital form. Kenny shows how the conceptualisation of meaning and translation changes with shifts in research paradigms: from the symbolism of rule-based systems; to the statistical approach, which frames translation as a problem of Bayesian inference and says little about
meaning; to the connectionism of neural MT. The chapter also considers some of the more troublesome aspects of contemporary MT, including its complicated relationship with human translation. Despite occasional tensions, Kenny argues that MT provides the ideal locus for translation studies to engage with some of the most pressing questions of our time, questions linked to the resurgence of interest in artificial intelligence, and to the future of human labour.