Transition from the traditional method to artificial intelligence: a new vision for academic project management
DOI: https://doi.org/10.56294/mw2025424

Keywords: Artificial intelligence, Academic project management, Higher education, Mixed methods, Educational innovation

Abstract
The study examines the transition from traditional methods to the application of Artificial Intelligence (AI) in academic project management, with the aim of assessing its effects on efficiency, adaptability, and quality in higher education. Framed within a positivist paradigm, the study adopted a mixed-methods approach: quantitative surveys (n=70) and qualitative semi-structured interviews (n=21) were conducted with lecturers and university managers. Quantitative data were processed with IBM SPSS Statistics 27, while qualitative information was analysed using NVivo 14, enabling triangulation of results and strengthening the study’s validity. Findings reveal that AI integration generates significant improvements in management efficiency (p=0.002), enhances institutional adaptability, and raises the quality of project execution. Four key thematic categories were identified: impact on efficiency, improvement of adaptability, perception of quality, and implementation challenges. Additional results highlight AI’s potential for strategic institutional planning, as well as a motivational effect on academic staff performance. It is concluded that AI is a valuable tool for innovation and optimisation in academic management, provided its adoption is accompanied by organisational readiness, ethical oversight, and continuous training programmes that ensure sustainable and responsible use in higher education.
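The quantitative analysis described above was carried out in IBM SPSS Statistics 27. As a rough illustration of the kind of two-group mean comparison that can yield a result such as p=0.002, the following minimal Python/SciPy sketch compares hypothetical Likert-type efficiency ratings between a traditional-management group and an AI-supported group using Welch’s t-test and a Cohen’s d effect size. All data, group labels, and the 35/35 sample split below are invented for illustration and are not taken from the study.

```python
# Minimal sketch of a two-group mean comparison of the sort behind the
# reported p = 0.002. The study used IBM SPSS Statistics 27; this Python
# analogue uses SciPy. All data below are fabricated for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical 5-point Likert efficiency ratings (35 per group, 70 total):
# one group rates traditional management, the other AI-supported management.
traditional = rng.integers(2, 5, size=35).astype(float)   # ratings 2-4
ai_supported = rng.integers(3, 6, size=35).astype(float)  # ratings 3-5

# Welch's t-test: robust to unequal group variances.
t_stat, p_value = stats.ttest_ind(traditional, ai_supported, equal_var=False)

# Cohen's d (pooled-SD version) as a companion effect size.
pooled_sd = np.sqrt((traditional.var(ddof=1) + ai_supported.var(ddof=1)) / 2)
cohens_d = (ai_supported.mean() - traditional.mean()) / pooled_sd

print(f"t = {t_stat:.3f}, p = {p_value:.4f}, d = {cohens_d:.2f}")
```

In a fuller analysis one would first check the reliability of the Likert scales (for example, with Cronbach’s alpha) before comparing group means, and report the effect size alongside the p-value.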
License
Copyright (c) 2025 Miguel Ángel Medina Romero, Víctor Alfonso Erazo Arteaga, Daniel Washington Barzola Jaya, Miguel Ángel Rodríguez Mireles, Jorge Pablo Rivas-Díaz, Alberto Daniel Salinas Montemayor, David Israel Guerrero Vaca (Author)

This work is licensed under a Creative Commons Attribution 4.0 International License. Unless otherwise stated, associated published material is distributed under the same license.