AUTOMATED EDUCATIONAL ASSESSMENT THROUGH QUESTION GENERATION FROM PDF RESOURCES
DOI: https://doi.org/10.53555/cse.v12i1.2476

Keywords: Automated Question Generation, Assessment System, PDF Content Extraction, Intelligent Evaluation, Educational Analytics

Abstract
Assessment creation has long been a slow, labor-intensive process for teachers, who must spend considerable time ensuring that questions reflect course objectives. In rapidly changing academic environments, this not only reduces the time available for teaching but also causes the quality and coverage of assessments to vary. Our research presents an automated exam-generation system that addresses these existing problems and streamlines the entire examination process, starting directly from the course materials. The platform allows a teacher to upload one or more PDF documents and to choose the type of questions to generate, such as multiple-choice questions (MCQs) or fill-in-the-blank items. Because data extraction from PDFs has inherent limitations, the system also lets the teacher specify the topics of interest; questions are then generated only from those topics, so the output matches the teacher's requirements. The system uses the Gemini API as an intelligent content analyser that processes the uploaded material and produces relevant, well-structured questions. Once the questions for an exam are created, the exam can be distributed to students through the system. Students take the test within a time limit and receive their results immediately, so they know how they have performed. Automatic grading, combined with visual analytics such as a donut chart and a performance summary, saves teachers time and effort and helps them quickly identify top performers, students in need of support, and class-wide trends.
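The topic-filtered generation pipeline described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names (`filter_by_topics`, `build_prompt`) and the prompt wording are assumptions, and the actual call to the Gemini API is omitted.

```python
# Hypothetical sketch of the pipeline: extracted PDF paragraphs are filtered
# by teacher-supplied topics, then assembled into a prompt for the model.

def filter_by_topics(paragraphs, topics):
    """Keep only paragraphs that mention at least one teacher-supplied topic."""
    topics_lower = [t.lower() for t in topics]
    return [p for p in paragraphs
            if any(t in p.lower() for t in topics_lower)]

def build_prompt(material, question_type="mcq", count=5):
    """Assemble the instruction text sent to the language model."""
    kind = {
        "mcq": "multiple-choice questions with 4 options and an answer key",
        "blank": "fill-in-the-blank questions with answers",
    }[question_type]
    return (f"From the following course material, generate {count} {kind}.\n\n"
            f"Material:\n{material}")

# Example: paragraphs extracted from an uploaded PDF (illustrative data).
paragraphs = [
    "Normalization reduces redundancy in relational databases.",
    "The French Revolution began in 1789.",
]
relevant = filter_by_topics(paragraphs, ["normalization"])
prompt = build_prompt("\n".join(relevant), question_type="mcq", count=3)
# `prompt` would then be sent to the Gemini API for question generation.
```

In this sketch the topic filter runs before the model call, so only teacher-approved material reaches the generator — matching the paper's workaround for unreliable PDF extraction.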
By automatically generating questions, aligning content to educator-defined topics, and providing immediate assessment results, the system enables faster, more accurate, and more efficient evaluation of all types of students. It helps improve learning outcomes by eliminating manual administrative work for teachers, bringing intelligence, efficiency, and adaptability to today's educational systems.
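The automatic-grading summary behind the analytics view could be computed as in the sketch below. The function name, score bands, and thresholds are illustrative assumptions rather than the system's actual parameters.

```python
# Hypothetical summary of graded scores, grouping students into the bands
# a donut chart / performance summary would display.

def summarize_results(scores, pass_mark=50, top_mark=85):
    """Group student scores into top performers, passing, and needs-support."""
    top = {name for name, v in scores.items() if v >= top_mark}
    needs_support = {name for name, v in scores.items() if v < pass_mark}
    average = sum(scores.values()) / len(scores)
    return {
        "top_performers": top,
        "needs_support": needs_support,
        "class_average": average,
    }

# Example with illustrative scores out of 100.
summary = summarize_results({"Ana": 92, "Bilal": 47, "Chen": 70})
```

Each band maps directly to one segment of the donut chart, so the teacher can read class-wide trends at a glance.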
License
Copyright (c) 2026 International Journal For Research In Advanced Computer Science And Engineering

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
