1.4 Why Should We Care about AI in Vocational Education and Training
You probably already have one or more reasons in mind why we should be looking at AI in VET. This unit is especially helpful if you are completely new to the field or want to refresh your overview.
Reflection
You can already see in this short introduction that the developments in the field of AI have various implications for VET. Another dimension that encompasses the effects presented is the ethical discussion around AI. Technologically, some things are feasible, but what, for example, makes sense pedagogically? How can an AI system be gender-responsive and inclusive?
9 thoughts on “1.4 Why Should We Care about AI in Vocational Education and Training”
AI can help to personalize the learning experience for the individual and respond ‘in the moment’ to learner needs. However, the technology is ultimately designed by someone, who may have their own biases or preconceptions. This also doesn’t happen in an isolated context and may be influenced by organisational location, motivations and/or pressures.
Although this will not guarantee the absence of biases and preconceptions, I think education professionals need to be involved in the design and development of AI applications for education. I fear that many at the moment are being led by software developers who have no training in education. I believe that learning is ultimately a social process and thus a socio-technical approach needs to be taken.
AI is transformative. TVET teachers need to be abreast of the changes in the workforce. AI is such a change. At this point, teachers need to identify ways to infuse AI in their programmes and facilitate the differentiated learning processes that are evident in AI.
Outside the educational sector, there are certain apps and platforms that are currently relying on algorithms. I have come across some cases that were reported in the media around Pinterest and Tinder. Pinterest, for instance, is aware of the “miscarriage problem”, i.e. the algorithm might not “know” that a user is no longer pregnant, and it will continue targeting baby-related images at them (which might cause the user a lot of frustration and anguish). Bias in Tinder algorithms against black women and Asian men means that the dating app itself reinforces the existing bias that some people might have.
So introducing any such technology into the educational sector has to be done in a very deliberate way. Educational institutions are sites of struggle, and implementing new solutions has to ensure that equitable input and feedback is sought from all stakeholders, including representatives of minority groups.
Totally agree, Natalie, and that’s why I think that educational professionals (teachers, trainers, researchers) must be involved in the development and signing off of AI-based applications, and by extension need some level of understanding about AI and the accompanying ethical issues.
I guess for AI to be good … the people behind it should also be good and of varied backgrounds, so as to cover the many aspects and complexities of any user…. So a technician producing an AI product is not enough … other people should be in it too … like a pedagogue, a psychologist, an expert on ergonomics, and others….
Totally agree, Ephrem. My argument is that we need educationalists both to understand AI (that is why we need materials like this course) and to be involved in the design and development of AI-based education applications.
The “others” would surely include the development of soft skills … and other positive concomitant attitudes that we should inculcate in the learners… like gender sensitivity … or being environment friendly, etc…
We are confronted with more and more information, also in vocational education and training, and we want to exchange ideas and understand each other across professional boundaries. AI can help us find the information and sources to fill gaps in understanding and, in particular, to assess the impact of our decisions and actions more accurately.