This paper explains the ways in which digital inequalities are becoming more complex in higher education (HE). It shows that while access to devices and connectivity is improving to an extent, the underlying social inequalities of electricity provision and affordability remain severe. The paper shows how the rapid digitalisation of HE catalysed by the Covid-19 pandemic introduced risks pertaining to student and staff data sovereignty. It then elaborates on the role of technology in knowledge representation and visibility; the Matthew Effect in educational technology; the biases of algorithms; and the underside of the “any time anywhere” promise.
In answer to the question “How can HEIs, ICTs and digitalisation address these inequities and contribute to inclusive and accessible HEIs?”, the first answer is that sometimes they cannot, and that technology might be inappropriate or even unethical. The argument is made for a serious commitment to a research agenda on the ways that HE has been changed by dominant technological systems and discourses. There are also opportunities to leverage the gains of designing for equity in practice and in policy. And finally, there is room to use the affordances of the technology itself to build completely transformed systems for equitable ends.
The question of whether and how technology can assist higher education in becoming more inclusive and accessible is not a new one, with decades of efforts, promises, failures and research building a substantial knowledge base. As society at large has made digital integration essential for participation, new forms of exclusion are being brought to bear in and on higher education, abetted by unequal power relations and the compromises to be negotiated within the Higher Education (HE) ecosystem. The intensive digitalisation catalysed by the pandemic and the concomitant “online pivot” mean that HE is in danger of fast becoming a site of surveillance capitalism, with attendant dangers for equity, little transparency and unequal terms of engagement.
It is not possible to review ICT and inequality in higher education in isolation: addressing inequality must be considered within broader social realities. Society is sometimes described as being post-digital because it is impossible not to be impacted by the digital, even, ironically, as digital inequalities grow.
However, digital structures and practices are unevenly distributed and experienced within social structures, which are in turn refracted into universities. In a recursive cycle, universities reproduce these structures and practices, while knowledge production and dissemination in universities also shape and reframe social practices.
The intersection of the digital with dominant economic models has created what Zuboff calls a rogue form of capitalism, namely surveillance capitalism: an economic model which uses human experience as data for the purposes of profit making and behaviour modification (Zuboff, 2019). From an HE perspective, “our mind and psychic life have become the main raw material which digital capitalism aims at capturing and commodifying” (Mbembe, 2019). The value of data in HE was illustrated before Covid-19 by the market valuations of companies which own and provide student data.
The pandemic saw the rapid entry and scaling-up of private companies into the HE sector, with massive educational technology investment in a sector confirmed as a market opportunity. Of course, there had previously been private companies in the HE ecosystem, and rightly so. However, because of the urgency of responding to lockdowns and campus closures in 2020, speedy negotiations with underfunded universities meant that there was insufficient time to hammer out equitable terms of engagement. It also meant that there was a likelihood that short-term decisions and agreements, hastily made for immediate ends, would become entrenched in the long term.
Technology and inclusion in HE involve complex interconnections between several sectors and stakeholders. The links between digital divides and educational socio-economic indicators have been emphasised by researchers across the world (for numerous examples see Stewart, 2021) and unsurprisingly have proved critical during the pandemic. These have played out in inter-dependent and contextual ways, which makes dealing with HE and ICT inequalities one of the most wicked policy problems.
Addressing inclusion in HE means simultaneous engagement with several of the Sustainable Development Goals: quality education (4), decent work (8), infrastructure (9) and reduced inequalities (10). Exclusion also operates at several levels: individually (students and educators), institutionally, and across the sector nationally, regionally and internationally. It also requires disentangling how divides play out and how peripheries manifest, as well as the terms under which forms of capital intersect.
All these issues have been spotlighted since the first lockdowns and university closures early in 2020, when students were sent home to study and educators had to teach from home. There has been widespread agreement that the multiple forms of existing inequalities in university communities were exposed. Now that they have been seen, they cannot be unseen (Czerniewicz et al., 2020).
The questions for those concerned with ICTs, inequality and inclusion in HE must always be: who profits, who loses, which interests are served, which agendas are marginalised, what is the balance of power and what are the terms of engagement?
The digital divide is alive and well; indeed the digital paradox is that even as the basics of the divide are addressed through access, more complex layers of exclusion are added; digital inequalities thus morph into new complicated forms. Nevertheless, fair and equitable technological infrastructure is the foundation of inclusion in HE: electricity, devices, ubiquitous connectivity and cheap data. These are essential but insufficient.
The ability of residential universities to ameliorate differentials in access to technological infrastructure on campus fell away during the pandemic, when students and academics were sent home to learn and teach.
The most basic access requirement is electricity. Yet 790 million people have no access to electricity and 2.6 billion people in developing countries do not have access to constant electricity (World Bank, 2021). Many students, especially in rural areas, had no electricity to study from home.
Basic connectivity is becoming globally ubiquitous: 93% of the world population has access to a mobile-broadband network, yet the figure is only 77% in Africa. Globally, about 72% of households in urban areas had access to the Internet at home in 2019, almost twice the share in rural areas (38%). The urban-rural gap was small in developed countries, but in developing countries urban access to the Internet was 2.3 times as high as rural access (International Telecommunication Union (ITU), 2020b).
The cost of data is a serious barrier. There is a 30,000% difference between the cheapest price for data and the most expensive, with the most expensive data being in three African countries (Malawi, Benin and Chad), while India, Israel and Kyrgyzstan have the least expensive (Ang, 2020). A significant affordability gap remains between developed and developing countries, especially for baskets that include at least 1.5 GB of data. ICT services in the majority of the least developed countries (LDCs) remain prohibitively expensive. In many developing countries a data-only package with the minimum 1.5 GB of data still costs the consumer more than 2% of monthly income. And in several countries the median price can be more than three times the 2% affordability target. The gap between developed and developing countries in terms of value for money is growing (International Telecommunication Union (ITU), 2020a). Of course, in addition to cost, the adequacy, appropriateness and fixability of devices are relevant considerations.
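To make the affordability arithmetic above concrete, the short sketch below computes the cost of a 1.5 GB data-only basket as a share of monthly income and compares it with the 2% target; the prices and incomes used are hypothetical placeholders, not ITU figures.

```python
# Illustrative calculation of data affordability against the 2% target.
# The prices and incomes below are hypothetical, not drawn from ITU data.

def affordability_ratio(basket_price: float, monthly_income: float) -> float:
    """Cost of a 1.5 GB data-only basket as a share of monthly income."""
    return basket_price / monthly_income

TARGET = 0.02  # the 2% affordability target mentioned above

examples = {
    # label: (monthly price of a 1.5 GB basket, monthly income) -- invented values
    "Country A": (1.50, 300.0),
    "Country B": (12.00, 150.0),
}

for country, (price, income) in examples.items():
    ratio = affordability_ratio(price, income)
    status = "within" if ratio <= TARGET else "exceeds"
    print(f"{country}: {ratio:.1%} of monthly income ({status} the 2% target)")
```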
These factors are outside of the education sector but have a direct impact on it. As long as technology infrastructure is not considered and implemented as a public good, those with resources will be advantaged. It is for this reason that, in a networked and global world, national elites in every country were able to access what was needed to study online during the pandemic.
Divides at sectoral level have widened as universities grapple with digitalisation.
Underfunded universities were thrust into the digital age at speed in 2020, unable to escape the digital and related realities of their students’ lives as institutions scrambled to improve access and connectivity. Their varying abilities to do so exposed the stratification of national systems; some universities had deep pockets, large endowments and wealthy students. Others had none of these, or lost their additional forms of income, and some universities have closed (Higher Ed Dive Team, 2022).
There has been growth in the number of public-private relationships being formed, partly in response to some of these challenges. These relationships are being forged and negotiated by over-stretched public universities, many of which are coping with slashed government funding, hungry students and exhausted, overloaded educators. Wealthier universities are in a better position: they have brand power, can afford to develop in-house capacity, are able to develop and implement privacy frameworks, can and do employ privacy officers, and have the capacity to negotiate terms with vendors such as Online Programme Managers (OPMs). In these ways, as they adjust to the requirements of a digital university in a post-digital world, uneven university systems are being further stratified.
The amalgamation of the digital into higher education, through the dominant extractive economy, introduces complex and often invisible power dynamics into public higher education. The terms of engagement are imbalanced, hidden behind dense language and easy promises. There are especially profound implications for those with barriers to participation at individual and institutional levels. This has introduced several new inequities into the student experience and the sector.
As institutional systems, research, teaching and learning have become digitalised, metadata (if not content) in the form of clicks, uploads, downloads, information use and so on can be extracted and used by the company whose system provides the service. This data has financial value and provides opportunities for profit making. For those who use big tech companies’ products as teaching and learning platforms, there are more serious ramifications, as this metadata can be aggregated with that of other products in the company’s basket.
The difficulty of understanding these new, technically convoluted educational technology systems creates further forms of inequity. While the interface is designed for ease of use, the data it generates, what has been called its “shadow text”, is hidden from view and can be decoded only by epistemic elites, who alone have the expertise and the machine learning resources to do so (Perrotta et al., 2021). This makes disputing company assurances and negotiating with them arduous: another form of inequality is introduced into higher education, as only those with sophisticated expertise can engage with the data systems.
For students, privacy and cookie settings are the first point of encounter with data. These are generally obscure and unclear (Amiel et al., 2021), and only a minute minority are likely to engage with these settings at all. In less obvious ways students are caught up in surveillance practices, whereby their experiences are turned into data. Their “consent” means little when they have no effective choice and the ostensible “agreements” are obfuscated. “Free” tools extract a data price, and it is only those with the financial ability to pay for tools and services who really have the option of refusing to use such tools.
Responding to this dense and convoluted terrain requires multifaceted inter-connected digital literacies, critical literacies, information literacies and data literacies (Pangrazio & Sefton-Green, 2020). Those with access to extensive cultural capital are more likely to be positioned to take meaningful control and ownership of their own data. There are thus inequalities within the student population, as well as between students and the tools they use.
Nationally, it is not the purview of one department to put in place practical and legal structures to ensure fair and equitable data sovereignty and to make strides towards resolving digital divides. The tasks are fragmented across several departments or ministries of telecommunications, education, labour, infrastructural planning and so on. It is a national imperative to ensure that such coordination takes place to ensure citizen rights for all, especially those most marginalised by limited access to economic and other capital.
There are numerous forms of exclusion in higher education with and through technology. This piece briefly touches on four points which are especially relevant following the pandemic: the role of technology in knowledge representation and visibility; the Matthew Effect in educational technology; the biases of algorithms; and the underside of the “any time anywhere” promise.
The geopolitics and decolonising of knowledge are currently burning issues, with the focus of research and discourse largely on epistemology, power, voice, legitimacy and representation. Threaded through this mix is technology, which is of course not neutral and enables, echoes or amplifies existing and unequal power relations. However, the debates about decolonising the curriculum and those regarding the role of technology tend to be siloed in different disciplinary fields.
Firstly, there is the simple matter of local research and knowledge being online. For many, if it is not online it does not exist. Unfortunately, the dominant open access models have paradoxically replaced access paywalls with publishing paywalls, effectively excluding knowledge and voices from the peripheries. Despite the affordances of free-to-share technology, the current business model for scholarly communication has not led to fairness or equity (Poynder, 2019). Search engines are active players in knowledge production and representation, given the role they play in surfacing and distributing information. Here too, technology, and specifically algorithms, have been shown to be skewed towards profit making (Headlee, 2020). It is of great concern to universities, as sites of knowledge production, that technological affordances are bolstering knowledge inequalities.
Algorithms (defined most simply as automated decision-making with large data sets) are playing a growing role in students’ university experiences as their journeys through education become more digitalised: from application to university, to programme selection, to using learning technologies for their studies, to examinations. Beyond education, the risks of algorithmic bias have been widely explored, through books largely from the US including Algorithms of Oppression (Noble, 2018), Automating Inequality (Eubanks, 2018) and many more. In an African context, AI-related technologies have been described as masculine, white, heteronormative, able-bodied and Western (Foster et al., 2020). Reviews of research on algorithmic bias in education have found several examples, noting that such research is relatively sparse (Baker & Hawn, 2021). As algorithms and AI percolate through the sector, the lack of research poses a risk to inclusion and equity.
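As an indication of what such research on algorithmic bias typically measures, the minimal sketch below compares a hypothetical classifier’s false positive rates across two student groups; the records, group labels and decision being modelled are invented for illustration and are not drawn from any study cited here.

```python
# Minimal sketch of one common bias check: comparing false positive rates of an
# automated "at risk" flag across student groups. All records are invented.
from collections import defaultdict

# Each record: (group, true_outcome, predicted_outcome); 1 = at risk / flagged.
records = [
    ("group_a", 0, 0), ("group_a", 0, 1), ("group_a", 1, 1), ("group_a", 0, 0),
    ("group_b", 0, 1), ("group_b", 0, 1), ("group_b", 1, 1), ("group_b", 0, 0),
]

false_positives = defaultdict(int)
negatives = defaultdict(int)

for group, truth, prediction in records:
    if truth == 0:                 # students who were not actually at risk
        negatives[group] += 1
        if prediction == 1:        # ...but were flagged anyway
            false_positives[group] += 1

for group in sorted(negatives):
    rate = false_positives[group] / negatives[group]
    print(f"{group}: false positive rate {rate:.0%}")

# A large gap between groups (here 33% vs 67%) is one signal of the kind of
# bias discussed above: one group being wrongly flagged far more often.
```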
The use of learning technologies has in itself been shown to be a risk to equity in the student body. During the pandemic, online tools were adopted at scale and speed. Without sufficient and focused learning design to ensure inclusive participation in many contexts, the indications are that extensive use of learning technologies during this stressful period has had a Matthew Effect on students. The term, which draws on a biblical reference, was popularised by Merton in 1968 to describe accumulated advantage. The Matthew Effect in learning tools has thus meant that such tools have been most beneficial for well-off students with the social and cultural capital to exploit them (Reich, 2020).
Finally, the recent global online education “experiment” following the pandemic has laid bare the inequalities in the “any time anywhere” promise of flexible education. There is no model student, no “roaming autodidact”: a self-motivated, able learner who is simultaneously embedded in technocratic futures and disembedded from place, culture, history and markets (McMillan Cottom, 2016). Instead, there are students living real, enmeshed domestic, familial, working and studying lives, struggling to find the time and the space to study. Designing for a model student means designing for the privileged elite and disadvantaging the majority of the student population.
In light of this very brief overview of ICTs in HE through an equity lens, what can be done?
In a post-digital world, there is enormous pressure on universities not to be “left behind” in order to be part of and prepare students for “the Fourth Industrial Revolution”. Yet sometimes technology is not the answer, sometimes the solution it offers is out of sync with the problems of HE, and sometimes the use of technology is unethical. Recognising these instances can be extremely difficult, and they are certainly contested.
In the first instance, the question is whether technology is needed at all, and whether non-technological practices work as well or even better. Technochauvinism, the belief that technology must be the solution (Broussard, 2018), leads to unnecessary digital applications which might well introduce inequities, as certain groups will not be able to participate. Also falling into this category is the recognition that a complicated technology is unnecessary when a simple one would do.
In the second instance, it might be decided that the potential value of a tool is outweighed by the potential harms or inequalities. There are examples of universities which have made a blanket decision not to use online proctoring tools because of the invasion of student privacy and the exposure of poorer students’ home circumstances, or which have decided to ban facial recognition systems.
Making these decisions is hard, partly because they are political and partly because there may not be reliable evidence to inform the debate.
There has been too little time to pay attention to these ballooning issues. So much is new, especially at scale, and so much has happened so speedily that there simply has been little chance to grasp the complexities, the unanticipated outcomes and the dangers. Universities are already so pressurised that technological solutions are tempting when they are sold as easy, promising simple answers to intractable problems.
There is much that is not yet understood; here universities can make a valuable contribution, since research is part of their core business. In particular, there needs to be research on educational technologies of all kinds in terms of inclusion and equity. The areas needing scholarly attention are numerous. At the micro level: how students with barriers to learning experience technological tools and datafied educational practices; the responses, experiences, literacies and outcomes of different student groupings with varying access to cultural, social and economic capital; in which circumstances which technologies prove useful for students with barriers to learning; and the ability of educators to support inclusivity. At the institutional level: the nature of the new roles required of public universities in what are effectively forms of market making (Komljenovic & Robertson, 2016), while protecting their public university mission; the forms and choices regarding governance structures to both protect privacy and enable open research; and how dominant technological discourses are infused into and resisted in teaching and learning practices. Nationally: the ways that existing divergent policy and regulatory frameworks can be brought together to identify risks of exclusion and be revised for inclusion and equality. Internationally: given how the pandemic has exposed digital inequalities across the entire sector, not only in low-income countries, and given the power of big tech companies to override national and international laws, identifying points of leverage to ensure that the public missions of public universities are not simply lost.
Leveraging what has been learnt about equitable design
Covid and the concomitant online pivot have been a terrifying educational experiment which has had very material effects on students, educators and the sector. It has also confirmed and amplified much of what scholars and professionals already knew, especially that one size does not fit all. It has revealed how painfully difficult the holy grail of “scaling up” is. It has made clear the limitations of adaptive learning and the extent to which it has not fulfilled its promises. What has come into focus is that certain learning technologies are useful for specific purposes in particular contexts. Using them generically for the sake of efficiency is to the disadvantage of some, leaving academics and designers to answer the impossible question of what number of “some” is too many.
This period has also shone a light on the numerous ways programmes and curricula have been and can be designed with diversity and inclusion at the forefront. Such equity-focused design has been explored world-wide, even in the richest countries. Student learning has been enabled in many low-connectivity or no-connectivity contexts and in online classrooms with varying levels of access. There are also examples where students have been involved in decision-making and in the co-creation of resources.
Improved learning design and the increased take-up of universal design for learning (UDL), through the multiple modes necessitated by the pandemic, have offered gains for an increasingly diverse sector, partly a consequence of massification in the system. These are activities and approaches to build on and grow.
Inequalities and unequal power relations can be, and are being, tackled at policy and regulatory level, largely under the banner of FAT: Fair, Accountable and Transparent. Such efforts occur within curricula, institutionally, nationally and internationally. Some of these efforts occur outside the HE sector but impact on HE in immediate ways. Examples are the General Data Protection Regulation (GDPR) at regional level and the Protection of Personal Information Act (POPIA) at national level. These kinds of policies are aimed at individual data sovereignty and control, with implications both for the running of universities and for the way that research can be undertaken and reported.
Within universities, valuable regulatory frameworks regarding student data and learning analytics protect students. Such frameworks highlight the principles of privacy; data ownership and control; transparency and consent; anonymity; non-maleficence and beneficence; data management and security; access; responsibility; and minimising adverse impacts while enabling interventions (Corrin et al., 2019). In addition, there is a need to attend to other ethical considerations, such as the injustice, inequality and power embedded in learning analytics systems (Cerratto-Pargman & McGrath, 2021).
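As one concrete illustration of the anonymity and data-minimisation principles in such frameworks, the sketch below pseudonymises a student identifier before analytics records are shared; the field names, values and salt handling are illustrative assumptions rather than part of any framework cited above.

```python
# Minimal sketch of pseudonymising student identifiers before analysis, one way
# of operationalising the anonymity principle. Names and values are invented.
import hashlib
import hmac

SECRET_SALT = b"replace-with-a-securely-stored-secret"  # stored separately from the data

def pseudonymise(student_id: str) -> str:
    """Return a stable pseudonym so records can be linked without exposing identity."""
    return hmac.new(SECRET_SALT, student_id.encode("utf-8"), hashlib.sha256).hexdigest()[:16]

record = {"student_id": "s1234567", "clicks": 42, "logins": 7}
shared_record = {**record, "student_id": pseudonymise(record["student_id"])}
print(shared_record)  # identifier replaced; usage counts retained for analysis
```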
Perhaps the most demanding area is the formation and development of digital, data and critical literacies, as research has shown that such literacies are much more effective when integrated into curricula. Stand-alone literacy development is essentially a Band-Aid solution. Given how complicated and emergent the terrain is, this is a big ask of overburdened educators who may themselves not have those very capabilities.
There are equity implications in practice as well as in policy. Technical, administrative, procurement and legal services within institutions make decisions about tools, platforms and services which impact on equality. Procurement processes need to ensure that due consideration is given to technologies which may cause or exacerbate barriers to learning. In hybrid environments, teaching and learning models need to account for students being both on and off campus, in diverse circumstances. In addition, it is the responsibility of those in these positions to negotiate terms and conditions with educational technology companies and vendors, keeping an eye on the agreements regarding student data in particular.
For parity of participation, Fraser’s definition of social justice (Fraser, 2005), to be possible, the HE sector would need to be foundationally transformed in terms of the allocation of resources, values, funding models, governance structures and systems. Perhaps ironically given how technologies have been used to date, digital technologies intrinsically have affordances which enable sharing and collaboration at low or no cost. They are ideal for cooperative and commons-based models which are premised on sharing and collaboration.
It is arguable that this fundamental restructuring of universities is not possible given the broader social and economic context in which they are located. Nevertheless, it is important to envisage a higher education system which uses technology for equity and social justice. At this post-pandemic time, the shape and future of universities are under scrutiny. This is a time where pluriversal knowledge structures, open education, knowledge commons and learning commons can be dreamed into being. This is tough but possible through the building of alliances and collegial collaboration. As sites of knowledge production, radical innovation and deep expertise, universities are the ideal location for radical transformation.
Amiel, T., Pezzo, T., Ribeiro, L., da Cruz, L. & Oliveira, A. (2021). Os modos de adesão e a abrangência do capitalismo de vigilância na educação brasileira/ The Modes of Accession and the Scope of Surveillance Capitalism in Education. Perspectiva Revista Do Centro de Ciências Da Educação, 39(3), 1–22.
Ang, C. (2020). What Does 1GB of Mobile Data Cost in Every Country? Visual Capitalist. https://www.visualcapitalist.com/cost-of-mobile-data-worldwide/
Baker, R. S. & Hawn, A. (2021). Algorithmic Bias in Education. International Journal of Artificial Intelligence in Education. https://doi.org/10.1007/s40593-021-00285-9
Broussard, M. (2018). Artificial Unintelligence. MIT Press.
Cerratto-Pargman, T. & McGrath, C. (2021). Mapping the Ethics of Learning Analytics in Higher Education: A Systematic Literature Review of Empirical Research. Journal of Learning Analytics, 82(5), 105–122. https://doi.org/10.18608/jla.2021.1
Corrin, L., Kennedy, G., French, S., Buckingham Shum, S., Kitto, K., Pardo, A., West, D., Mirriahi, N. & Colvin, C. (2019). The Ethics of Learning Analytics in Australian Higher Education. A Discussion Paper. University of Melbourne. https://melbourne-cshe.unimelb.edu.au/research/research-projects/edutech/the-ethical-use-of-learning-analytics
Czerniewicz, L., Agherdien, N., Badenhorst, J., Belluigi, D., Chambers, T., Chili, M., de Villiers, M., Felix, A., Gachago, D., Gokhale, C., Ivala, E., Kramm, N., Madiba, M., Mistri, G., Mgqwashu, E., Pallitt, N., Prinsloo, P., Solomon, K., Strydom, S. & Wissing, G. (2020). A Wake-Up Call: Equity, Inequality and Covid-19 Emergency Remote Teaching and Learning. Postdigital Science and Education, 2(3), 946–967. https://doi.org/10.1007/s42438-020-00187-4
Eubanks, V. (2018). Automating Inequality: How High-Tech Tools Profile, Police and Punish the Poor (Illustrated edition). St Martin’s Press.
Foster, L., Van Wiele, B. & Schönwetter, T. (2020). Narratives of Artificial Intelligence in a Gendered and Racialized World: Emergence on the African Continent. We Robot 2020, Ottawa. https://techlaw.uottawa.ca/werobot/papers
Headlee, C. (2020). How Google Search Sold Out (What Next TBD (Podcast)). https://slate.com/podcasts/what-next-tbd/2020/08/what-went-wrong-with-google-search
Higher Ed Dive Team. (2022). A look at trends in college consolidation since 2016. Higher Ed Dive. https://www.highereddive.com/news/how-many-colleges-and-universities-have-closed-since-2016/539379/
International Telecommunication Union (ITU). (2020a). The affordability of ICT services 2020.
International Telecommunication Union (ITU). (2020b). Measuring digital development: Facts and figures 2020. https://www.itu.int/en/itu-d/statistics/pages/facts/default.aspx
Komljenovic, J. & Robertson, S. (2016). The Dynamics of “Market-Making” in Higher Education. Journal of Education Policy, 31(5), 622–636.
Mbembe, A. (2019). Thoughts on the planetary. New Frame. https://www.newframe.com/thoughts-on-the-planetary-an-interview-with-achille-mbembe/
McMillan Cottom, T. (2016). Black Cyberfeminism: Intersectionality, Institutions and Digital Sociology. In Digital Sociologies. Policy Press. https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2747621
Noble, S. (2018). Algorithms of Oppression: How Search Engines Reinforce Oppression. New York University Press.
Pangrazio, L. & Sefton-Green, J. (2020). The social utility of ‘data literacy’. Learning, Media and Technology, 45(2), 208–220. https://doi.org/10.1080/17439884.2020.1707223
Perrotta, C., Gulson, K. N., Williamson, B. & Witzenberger, K. (2021). Automation, APIs and the distributed labour of platform pedagogies in Google Classroom. Critical Studies in Education, 62(1), 97–113. https://doi.org/10.1080/17508487.2020.1855597
Poynder, R. (2019). Open access: Could defeat be snatched from the jaws of victory? https://richardpoynder.co.uk/Jaws.pdf
Reich, J. (2020). Failure to Disrupt: Why Technology Alone Can’t Transform Education. Harvard University Press.
Stewart, W. H. (2021). A global crash-course in teaching and learning online: A thematic review of empirical Emergency Remote Teaching (ERT) studies in higher education during Year 1 of COVID-19. Open Praxis, 13(1). https://doi.org/10.5944/openpraxis.13.1.1177
World Bank. (2021). Tracking SDG 7: The Energy Progress Report. World Bank Group. https://trackingsdg7.esmap.org/
Zuboff, S. (2019). The Age of Surveillance Capitalism. Profile.