Remote Learning Survey Questionnaire For Students

How To Evaluate The Effectiveness Of Remote Learning?

Want to know how effective remote learning is but don't know how to find out? We are here to guide you through remote learning surveys. Remote learning survey questions can help you boost student engagement and better understand the limitations of remote learning. For example, learners may choose to tailor their study schedules to their other commitments, or they may wish to work through case studies and scenarios as a group. A survey is an excellent tool for developing an efficient remote learning programme. In this blog, we will learn about remote learning survey software and its importance. Let's get started.

What Is A Remote Learning Survey?

Institutions must focus their research on how students feel about remote education and their experiences. To collect data, they might send out a remote learning survey to pupils. Once the findings are in, the management team will know what students enjoy about the current system and what they would like to alter.

Remote learning, as opposed to traditional lectures or a schedule of tasks to complete, encompasses a number of modalities that lead students through curriculum modules. It enables students who desire to learn more actively to discover additional alternatives. Remote learning provides extra opportunities for instructors and students to collaborate so that young people can achieve their academic goals. In addition, special emphasis is placed on evoking emotion and motivation in them, as well as being deliberate in urging them to apply their information regularly. In the educational environment, the social and emotional aspects of learning are receiving considerably greater attention in the form of socio-emotional learning.

Examples Of A Remote Learning Questionnaire

Below we can see what some remote learning survey questions could look like and what information we can extract from each one.

1. Do you have a device for online learning?

  • Yes, I have my own device.
  • Yes, but it is ineffective.
  • No, I share a device with others.

Students should have continuous access to a device for online study. Find out whether they have any issues with the device's hardware quality, or whether they share it with others in the house and cannot access it when needed.

2. How beneficial has digital learning been for you?

  • Ineffective
  • Slightly effective
  • Moderately effective
  • Very effective
  • Extremely effective

Students may choose to learn in the classroom with their peers or alone at home, depending on their personality. The school provides a more vibrant and interactive setting, whereas the home environment is more peaceful. You may use this question to determine whether or not remote learning is beneficial to pupils.

3. Do you enjoy studying remotely?

  • Yes, without a doubt.
  • Yes, but I'd like to make a few changes.
  • No, there are several difficulties.
  • Not at all.

Determine whether students are learning from home because they want to or because they have to, and discover ways to improve distance education and make it more appealing to them.

Importance Of The Remote Learning Survey

Students continuously astound us with their tenacity in the face of adversity and their desire for learning, even when this isn't always visible. Collecting comments from the Remote Learning Survey can help you understand how much your students love their professors and how much they value having someone there for them every day. It also demonstrates that students readily recognise professors' efforts to make learning experiences as interesting as feasible during remote learning. These reminders are vital for instructors because the benefits of remote learning were not always evident or front of mind owing to the obstacles that it posed.

Remote learning allows you to broaden your teaching toolkit. Therefore, we want to make sure that you can record the components of these remote learning methods that worked effectively, so that you can bring them back to your classroom practice whenever you return to onsite learning. The distance learning survey has proven extremely helpful in developing an understanding of the best remote learning teaching approaches. Without knowing how many more periods of distance learning you will experience, feedback that consistently improves your expertise in remote teaching and learning methodologies has proven essential, particularly as it becomes increasingly challenging to keep students interested in a remote learning setting.

When developing any type of survey, whether in the education or non-education sectors, you need a mechanism to conveniently gather and preserve the data you get. A secure drag-and-drop remote learning survey-building tool is ideal for schools and other community groups who need to construct customized surveys.

How to make a questionnaire regarding the impact of eLearning?


eLearning has become increasingly prevalent in recent years, thanks to the access it provides to education regardless of location or physical accessibility issues. The term refers to using an electronic device and an Internet connection to deliver learning opportunities and educational content.

It is essential to better understand the impact of eLearning on current generations and its role in the future. In this article, we will explore what people think about eLearning, how it affects their lives, and how to make a questionnaire on the impact of eLearning that will stand out.

Continue reading to get answers about gauging the impact of eLearning, including inspiration from our examples of eLearning survey questions.

Benefits of eLearning

There are several reasons to discuss eLearning. First, it has the potential to expand access to education for people who cannot attend traditional in-person classes, like those living in remote or underdeveloped areas or people with disabilities.

eLearning can also be more cost-effective for students and institutions since it reduces the need for physical classrooms and materials. That also means that eLearning has a positive impact on the environment, given that it has the potential to reduce waste.

Additionally, eLearning can potentially improve the quality of education by providing access to a broader range of resources and experts. Borders and distance from a university will no longer bar individuals of talent from accessing materials.

To conclude, eLearning has the potential to increase access to education, improve the quality of education, and make teaching more convenient and cost-effective. It is a topic of ongoing research, development, and discussion as it continues evolving and shaping education.

Difference between eLearning, remote learning, and distance learning

Terms like eLearning, remote learning, and distance learning are often used interchangeably. However, there are nuances. eLearning refers to learning facilitated by Internet technologies.

However, the term “distance learning” is most appropriate when the student and educator are physically separated.

Remote learning is a more general term that encompasses all forms of education not conducted in a traditional classroom setting, including distance learning. In surveys, distance learning, remote learning, and eLearning are often treated as synonyms, despite their subtle differences.

To sum up, eLearning typically emphasizes technology, while distance learning refers to any form of remote education. However, both have become more prevalent in recent years due to the COVID-19 pandemic, which led to widespread school closures and a shift toward online learning.

If you are interested in measuring the impact of eLearning, the first step is to make a questionnaire. For example, you can ask how often respondents use technology to learn new information, whether they prefer traditional or digital education, and whether they think eLearning is destined to become more popular. You can also ask what they like and don't like about eLearning.

Once surveys have been collected, analyzing them will highlight the most common trends in the field.

If you need help analyzing the collected data, we’ve got you! Read our blog to find helpful tips and tricks that will ease this process.
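As a minimal sketch of that analysis step, the collected answers to a closed-ended question can be tallied into a frequency table. The data below is hypothetical, assuming the responses have already been exported as a simple list of answer strings:

```python
from collections import Counter

# Hypothetical responses to "How beneficial has digital learning been
# for you?" taken from a questionnaire export.
responses = [
    "Very effective", "Moderately effective", "Very effective",
    "Slightly effective", "Ineffective", "Very effective",
    "Moderately effective", "Extremely effective",
]

# Tally each answer and report its share of all responses,
# most common first, to surface the dominant trend.
counts = Counter(responses)
total = len(responses)
for answer, count in counts.most_common():
    print(f"{answer}: {count} ({count / total:.0%})")
```

The same tally, broken down per demographic group, is often the first step toward spotting the trends mentioned above.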

Before diving into creating a survey, make sure to think through the following issues:

Define your target audience: Who should be surveyed? This will help narrow down the required research questions and determine whether a topic is relevant. For example, if you're looking into how eLearning affects teenagers, survey teenagers rather than adults or children. Respondents with first-hand experience will give more accurate answers than those with only second-hand impressions (such as parents who haven't actually taken eLearning courses).

Choose an appropriate format: There are many different research methodologies, such as face-to-face interviews, telephone interviews, written questionnaires/surveys, etc. Each has its advantages/disadvantages.

Do your research: No one likes monotonous and bland surveys. Refer to our resources below to create surveys that stand out from the crowd:

  • How to write good survey questions (with examples)
  • How to create an engaging survey
  • How to avoid biased questions

Here are the steps to follow when making an eLearning questionnaire:

  • Begin by defining objectives. What specific information are you trying to gather about the impact of eLearning?
  • Determine the target population for your questionnaire. Who will you be surveying and why?
  • Create questions that will help achieve your objectives. Be sure to include a mix of open-ended and closed-ended questions.
  • Have someone review the questionnaire to ensure the questions are straightforward and understandable.
  • Test the questionnaire with a small group to identify issues or ambiguities.
  • Consider the feedback from the test group and make any necessary changes.
  • Distribute the questionnaire to your target population.
  • Collect and analyze the data.
  • Draw conclusions about the impact of eLearning from the data.
  • Create a report summarizing the research findings.
  • Provide recommendations for future eLearning initiatives.
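The step about mixing open-ended and closed-ended questions can be sketched as a tiny data-structure check. The questions, field names, and helper below are illustrative, not taken from any particular survey tool:

```python
# A minimal sketch of a questionnaire as a list of dicts, assuming a
# simple split between closed-ended (fixed options) and open-ended questions.
questions = [
    {"text": "How satisfied are you with the overall eLearning experience?",
     "type": "closed", "options": ["Very", "Somewhat", "Not at all"]},
    {"text": "In your opinion, how could online lectures be more effective?",
     "type": "open"},
]

def has_question_mix(qs):
    """Return True if the questionnaire includes both open- and
    closed-ended questions, as the steps above recommend."""
    types = {q["type"] for q in qs}
    return {"open", "closed"} <= types

print(has_question_mix(questions))  # prints True
```

A check like this is easy to run before distributing the questionnaire, alongside the review and pilot-test steps.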

The target group for eLearning surveys is usually someone from the online education system—administrators, teachers, professors, or students. So, let’s start with eLearning survey questions for teachers and educators.

  • How satisfied are you with the overall eLearning experience?
  • On a scale of 1 to 10, how successful have your classes been?
  • Do you have the opportunity to use innovative methods for improved learning outcomes using eLearning?
  • What type of learning methods do you prefer?
  • How often have you received an appraisal for your teaching?
  • How difficult was it to hold the attention of students in an online classroom?
  • On a scale of 1 to 10, what was the average attention level of students in the online classroom?
  • What are the biggest challenges for teachers when utilizing eLearning?
  • In your opinion, how could online lectures be more effective?
  • Is there anything else you’d like to share about your eLearning experience?

See more teacher survey examples here.

Student surveys are extremely important for better understanding the educational “product” being offered. Here are some examples of eLearning survey questions to gather feedback from students:

  • How easily could you access the course materials and assignments?
  • How effective was the instructor at communicating and providing feedback through the eLearning platform?
  • How well did the eLearning format support your ability to learn and retain material?
  • Were there any technical issues that hindered your eLearning experience?
  • How much interaction did you have with classmates and instructors during the eLearning course?
  • How would you rate the overall quality of the course materials and resources provided in the eLearning format?
  • How would you rate the eLearning platform (e.g., Moodle, Blackboard) in terms of user-friendliness and functionality?
  • How would you rate the eLearning course compared to a traditional classroom setting?
  • Are there any suggestions you would make to improve the eLearning experience?

See more student survey examples here.

Following the steps above, while using our questions as inspiration, will help you gather relevant data and develop proper conclusions regarding eLearning programs.

For more inspiration and tools, we encourage you to explore our free education survey questions, templates, and academic survey question examples.

The survey-making tools that SurveyPlanet offers can assist you further. We have hundreds of survey templates, pre-written questions, and multiple additional features. Sign up to create an unlimited number of questionnaires, the kinds that will help you gather valuable data and better understand the actual impacts of eLearning.



45 Survey Questions to Understand Student Engagement in Online Learning

Nick Woolf

In our work with K-12 school districts during the COVID-19 pandemic, countless district leaders and school administrators have told us how challenging it's been to build student engagement outside of the traditional classroom.

Not only that, but the challenges associated with online learning may have the largest impact on students from marginalized communities. Research suggests that some groups of students experience more difficulty with academic performance and engagement when course content is delivered online versus face-to-face.

As you look to improve the online learning experience for students, take a moment to understand how students, caregivers, and staff are currently experiencing virtual learning. Where are the areas for improvement? How supported do students feel in their online coursework? Do teachers feel equipped to support students through synchronous and asynchronous facilitation? How confident do families feel in supporting their children at home?

Below, we've compiled a bank of 45 questions to understand student engagement in online learning. Interested in running a student, family, or staff engagement survey? Click here to learn about Panorama's survey analytics platform for K-12 school districts.


45 Questions to Understand Student Engagement in Online Learning

For Students (Grades 3-5 and 6-12):

1. How excited are you about going to your classes?

2. How often do you get so focused on activities in your classes that you lose track of time?

3. In your classes, how eager are you to participate?

4. When you are not in school, how often do you talk about ideas from your classes?

5. Overall, how interested are you in your classes?

6. What are the most engaging activities that happen in this class?

7. Which aspects of class have you found least engaging?

8. If you were teaching class, what is the one thing you would do to make it more engaging for all students?

9. How do you know when you are feeling engaged in class?

10. What projects/assignments/activities do you find most engaging in this class?

11. What does this teacher do to make this class engaging?

12. How much effort are you putting into your classes right now?

13. How difficult or easy is it for you to try hard on your schoolwork right now?

14. How difficult or easy is it for you to stay focused on your schoolwork right now?

15. If you have missed in-person school recently, why did you miss school?

16. If you have missed online classes recently, why did you miss class?

17. How would you like to be learning right now?

18. How happy are you with the amount of time you spend speaking with your teacher?

19. How difficult or easy is it to use the distance learning technology (computer, tablet, video calls, learning applications, etc.)?

20. What do you like about school right now?

21. What do you not like about school right now?

22. When you have online schoolwork, how often do you have the technology (laptop, tablet, computer, etc) you need?

23. How difficult or easy is it for you to connect to the internet to access your schoolwork?

24. What has been the hardest part about completing your schoolwork?

25. How happy are you with how much time you spend in specials or enrichment (art, music, PE, etc.)?

26. Are you getting all the help you need with your schoolwork right now?

27. How sure are you that you can do well in school right now?

28. Are there adults at your school you can go to for help if you need it right now?

29. If you are participating in distance learning, how often do you hear from your teachers individually?

For Families, Parents, and Caregivers:

30. How satisfied are you with the way learning is structured at your child’s school right now?

31. Do you think your child should spend less or more time learning in person at school right now?

32. How difficult or easy is it for your child to use the distance learning tools (video calls, learning applications, etc.)?

33. How confident are you in your ability to support your child's education during distance learning?

34. How confident are you that teachers can motivate students to learn in the current model?

35. What is working well with your child’s education that you would like to see continued?

36. What is challenging with your child’s education that you would like to see improved?

37. Does your child have their own tablet, laptop, or computer available for schoolwork when they need it?

38. What best describes your child's typical internet access?

39. Is there anything else you would like us to know about your family’s needs at this time?

For Teachers and Staff:

40. In the past week, how many of your students regularly participated in your virtual classes?

41. In the past week, how engaged have students been in your virtual classes?

42. In the past week, how engaged have students been in your in-person classes?

43. Is there anything else you would like to share about student engagement at this time?

44. What is working well with the current learning model that you would like to see continued?

45. What is challenging about the current learning model that you would like to see improved?

Elevate Student, Family, and Staff Voices This Year With Panorama

Schools and districts can use Panorama’s leading survey administration and analytics platform to quickly gather and take action on information from students, families, teachers, and staff. The questions are applicable to all types of K-12 school settings and grade levels, as well as to communities serving students from a range of socioeconomic backgrounds.


In the Panorama platform, educators can view and disaggregate results by topic, question, demographic group, grade level, school, and more to inform priority areas and action plans. Districts may use the data to improve teaching and learning models, build stronger academic and social-emotional support systems, improve stakeholder communication, and inform staff professional development.

To learn more about Panorama's survey platform, get in touch with our team.


  • Article of professional interests
  • Published: 05 April 2021

A Survey on the Effectiveness of Online Teaching–Learning Methods for University and College Students

  • Preethi Sheba Hepsiba Darius (ORCID: 0000-0003-0882-6213),
  • Edison Gundabattini (ORCID: 0000-0003-4217-2321) &
  • Darius Gnanaraj Solomon (ORCID: 0000-0001-5321-5775)

Journal of The Institution of Engineers (India): Series B, volume 102, pages 1325–1334 (2021)


Online teaching–learning methods have been followed by world-class universities for more than a decade to cater to the needs of students who stay far away from universities and colleges. But during the COVID-19 pandemic period, online teaching–learning helped almost all universities, colleges, and affiliated students. An attempt is made to find the effectiveness of online teaching–learning methods for university and college students by conducting an online survey. A questionnaire was specially designed and deployed among university and college students, and about 450 students from various universities, engineering colleges, and medical colleges in South India took part and submitted responses. It was found that the following methods promote effective online learning: animations, digital collaborations with peers, video lectures delivered by the faculty handling the subject, online quizzes with multiple-choice questions, availability of student-version software, a conducive environment at home, interactions by the faculty during lectures, and online materials provided by the faculty. Moreover, online classes are more effective because they place the slides in front of every student, lectures are heard by all students at the sound level of their choice, and walking or travel to reach classes is eliminated.

Critical thinking and creativity of students increase with innovative educational methods, according to the World Declaration on Higher Education in the Twenty-First Century [1]. Innovative educational strategies and educational innovations are required to make students learn. There are three vertices in the teaching–learning process: teaching, communication technology through digital tools, and innovative practices in teaching. In the first vertex, the teacher is a facilitator who provides resources and tools to students and helps them develop new knowledge and skills. Project-based learning helps teachers and students promote collaborative learning by discussing specific topics, and cognitive independence develops among students. To promote global learning, teachers are required to innovate permanently, which is possible when university professors and researchers are given space to develop new educational forms in different areas of specialization. Virtual classrooms, unlike traditional classrooms, give unlimited scope for introducing teaching innovation strategies. The second vertex refers to the use of Information and Communication Technology (ICT) tools for promoting innovative education. Learning management systems (LMS) help in teaching, learning, educational administration, testing, and evaluation, and the use of ICT tools promotes technological innovations and advances in learning and knowledge management. The third vertex deals with innovations in teaching and learning to solve problems faced by teachers and students. Creative use of new elements related to the curriculum, production of something new, and transformations emerge in classrooms, resulting in educational innovations. Evaluations are necessary to improve these innovations so that successful methods can be implemented across the whole teaching and learning community in an institution [2].
The pandemic has forced digital learning, and the job portal Naukri.com reports a fourfold growth in demand for teaching professionals in the e-learning medium [3]. Initiatives taken by the government also focus on online mode as an option in a post-COVID world [4]. A notable learning experience design consultant pointed out that educators are entrusted to lead the way as the world changes and are actively involved in the transformation [5]. Weiss notes that an educator needs to make lectures more interesting [6].

This paper presents the online teaching–learning tools, methods, and a survey on the innovative practices in teaching and learning. Advantages and obstacles in online teaching, various components on the effective use of online tools, team-based collaborative learning, simulation, and animation-based learning are discussed in detail. The outcome of a survey on the effectiveness of online teaching and learning is included. The following sections present the online teaching–learning tools, the details of the questionnaire used for the survey, and the outcome of the survey.

Online Teaching and Learning Tools

The four essential parts of online teaching [7] are virtual classrooms, individual activities, assessments in real time, and collaborative group work. Online teaching tools are used to facilitate faculty–student interaction as well as student–student collaboration [8]. The ease of use, the satisfaction level, the usefulness, and the confidence level of the instructor are crucial [9] in motivating the instructor to use online teaching tools. Higher education institutes recognize the need to accommodate widely diverse learners, and Hilliard [10] points out that technical support and awareness for both faculty and students are essential in the age of blended learning. A data analytics tool coupled with the LMS is essential to enhance [11] the quality of teaching and improve the course design. The effective usage of online tools is depicted in Fig. 1, comprising instructor-to-student delivery, collaboration among students, training for the tools, and data analytics for constant improvement of the course and assessment methods.

Figure 1: The various components of effective usage of online tools

Online Teaching Tools

A plethora of online teaching tools is available, and this poses a challenge for decision-makers to choose the tools that best suit the needs of the course. The need for the tools, the cost, usability, and features determine which tools are adopted by various learners and institutions. Many universities have offered online classes for students. These are taken up by students opting for part-time courses, offering them flexibility in timing and eliminating the need to travel to campus. The COVID-19 pandemic has forced many, if not all, institutions to completely shift classes online. LMS tools are packaged as Software as a Service (SaaS), and the pricing generally falls into four categories: (i) per learner, per month; (ii) per learner, per use; (iii) per course; (iv) licensing fee for on-premise installation [12].

Online Learning Tools

Online teaching/learning as part of an ongoing semester is typically part of a classroom management tool. GSuite for Education [13] and Microsoft Teams [14] were both widely adopted by schools and colleges during the COVID-19 pandemic to effectively shift regular classes online. Other popular learning management systems that have been adopted as part of blended learning are Edmodo [15], Blackboard [16], and MoodleCloud [17]. Davis et al. [18] point out advantages of and obstacles to online teaching for both students and instructors, shown in Table 1.

The effectiveness of course delivery depends on using the appropriate tools in the course design. This involves engaging the learners and modifying the course design to cater to various learning styles.

A Survey on Innovative Practices in Teaching and Learning

The questionnaire aims to identify the effectiveness of various online tools and technologies, the preferred learning methods of students, and other factors that might influence the teaching–learning process. The parameters were based on different types of learners and the advantages of and obstacles to online learning [10, 18]. Questions 1–4 are used to comprehend the learning style of the student. Questions 5–7 are posed to find out the effectiveness of the medium used for teaching and evaluation. Questions 8–12 are framed to identify the various barriers to online learning faced by students.

This methodology was adopted as most of the students are attending online courses from home, and polls of this kind go well with students from various universities. Students participated in the survey and answered most of the questionnaire enthusiastically. The only challenge was finding a suitable environment and free time for them to answer the questionnaire, as they are already loaded with lots of online work. Students from various universities pursuing professional courses like engineering and medicine took part in this survey. They are from various branches of science and technology, and from private universities, colleges, and government institutions. Figure 2 shows the institution-wise respondents. Microsoft Teams and Google Meet platforms were used for this survey among university, medical college, and engineering college students. About 450 students responded. 52% of the respondents are from VIT University, Vellore, Tamil Nadu; 23% are from CMR Institute of Technology (CMRIT), Bangalore; 15% are from medical colleges; and 10% are from other engineering colleges. During the pandemic period, VIT students stayed with parents living in different states of India such as Andhra Pradesh, Telangana, Kerala, Karnataka, MP, Haryana, Punjab, Maharashtra, and Andaman, with only a few living in Tamil Nadu. Some students stayed with parents in other countries such as Dubai, Oman, and South Africa. Some students of CMRIT Bangalore live in Bangalore and others in towns and villages of Karnataka. Students of the medical colleges live in different parts of Tamil Nadu, and students of the engineering colleges live in different parts of Andhra Pradesh. Hence, the survey covers a wide geographical region.

Figure 2: Institution-wise respondents

Figure 3 shows the branch-wise respondents: 158 students belong to mechanical/civil engineering, 108 to computer science and engineering, 68 to medicine, 58 to electrical & electronics engineering and electronics & communication engineering, and 58 to other disciplines.

Figure 3: Branch-wise respondents
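As a quick sanity check, the branch-wise counts reported above can be totalled and converted to percentage shares. This is a minimal sketch; the category labels are paraphrased from the text:

```python
# Branch-wise respondent counts as reported for Fig. 3
branch_counts = {
    "Mechanical/Civil": 158,
    "Computer Science & Engineering": 108,
    "Medicine": 68,
    "EEE & ECE": 58,
    "Other disciplines": 58,
}

# The total should match the stated sample of about 450 respondents
total = sum(branch_counts.values())

# Percentage share of each branch, rounded to one decimal place
shares = {branch: round(100 * n / total, 1) for branch, n in branch_counts.items()}

print(total)               # 450
print(shares["Medicine"])  # 15.1
```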

Questionnaire Used

Students were assured of their confidentiality and were promised that their names would not appear in the document. A list of the questions asked as part of the survey is given below.


Sample group: B Tech students from different branches of sciences across various engineering institutions and MBBS medical students.

Which of the following methods engages you personally in digital learning?

Individual assignment

Small group (5 students) work

Large group (10 or more students) work

Project-based learning

Which of the digital collaborations enables you to work on a specific task with ease?

Two by two (2 member team)

Small group (5 students) work

Which of the digital approaches motivates you to learn?

Whiteboard and pen

PowerPoint presentation

Digital pen and slate

Animations

My experience with online learning from home

I am learning at my own pace comfortably

My home situation is not suitable for online learning

I can learn better with uninterrupted network connectivity

I am distracted with various activities at home, viz. TV, chatting, etc.

Which type of recorded video lecture is more effective for learning?

delivered by my faculty

delivered by NPTEL

delivered by reputed Overseas Universities

delivered by unknown experts

Which type of quiz is more effective for testing the understanding?

Traditional—pen and paper—MCQ

Traditional—pen and paper—short answers

Online quiz—MCQ

Online quiz—short answers

Student version software downloaded from the internet is useful for learning

Agree

Unable to decide

Disagree

Online teaching–learning takes place effectively because:

Every student can hear the lecture clearly

PPTs are available right in front of every student

Students can ask doubts without much reservation

Students need not walk long distances before reaching the class

Which of the following statements is true of online learning off-campus?

No one disturbs me during my online learning.

My friend/family member/roommate/neighbor occasionally disturbs me

My friend/family member/roommate/neighbor constantly disturbs me

At home/place of residence, how many responsibilities do you have?

I don’t have many responsibilities.

I have a moderate amount of responsibilities, but I have sufficient time for online learning.

I have many responsibilities; I don’t have any time left for online learning.

What is your most preferred method for clearing doubts in online learning?

Ask the professor during/after an online lecture

Post the query in a discussion forum of your class and get help from your peers

Go through online material providing an additional explanation.

Which of the following devices do you use for your online learning?

A laptop/desktop computer

A smartphone

Other devices

Outcome of the Survey

As seen in Fig. 4, students prefer to work in a group of 5 students to engage personally in digital learning.

Figure 4: Personal engagement in digital learning

As seen in Fig. 5, the digital collaboration that enables students to work at ease on a specific task is letting them work in small groups of 5 students.

Figure 5: Digital collaboration to enable students to work at ease

Animations are found to be the digital approach that motivates the most students to learn, as seen in Fig. 6.

Figure 6: Digital approaches that motivate students to learn

The online learning experience of students is shown in Fig.  7 . The majority of students have said that they can learn at their own pace comfortably through online learning.

Figure 7: The online learning experience of students

The effectiveness of the recorded video lectures is shown in Fig. 8. The majority of students agree that video lectures delivered by the faculty member teaching the subject help them learn effectively.

Figure 8: More effective recorded video lecture

An online quiz with multiple-choice questions (MCQ) is preferred by most students for testing their understanding of the subject, as seen in Fig. 9.

Figure 9: More effective quiz for testing the understanding

The usefulness of the student version of the software downloaded from the internet is shown in Fig.  10 . 45.7% of the students agree that it is useful for learning whereas 45.2% of them are unable to decide. The rest of the students feel that the student version of the software is not useful.

Figure 10: The usefulness of the student version of the software

The reasons for the effectiveness of online teaching–learning are shown in Fig. 11. The majority of students feel that learning is effective because the PPTs are available right in front of every student while following the lecture. In universities where a Fully Flexible Credit System (FFCS) is followed, students need to walk long distances to reach their classrooms, and day scholars in universities as well as engineering colleges must travel a considerable distance before reaching the first-hour class; according to many students, online learning is more effective since walking and traveling are completely eliminated. Furthermore, if the voice of a faculty member is feeble, students sitting in the last few rows of a physical classroom cannot hear the lecture completely, so some students feel that online learning is more effective since the lecture reaches every student irrespective of the number of students in a virtual classroom.

Figure 11: Reasons for the effectiveness of online teaching–learning

50.3% of students agree that they face no disturbance during online learning, which makes it more effective. Many of the others feel that friends or relatives occasionally disturb them during online learning, as shown in Fig. 12.

Figure 12: Disturbances during online learning

Figure  13 shows the environment at home for online learning. 76.9% of the respondents stated that they have a moderate amount of responsibilities at home but they have sufficient time for online learning. 16.1% of them have said that they do not have many responsibilities whereas 7% of them claimed that they have many responsibilities at home and they do not have any time left for online learning.

Figure 13: The environment at home for online learning

Figure 14 shows the methods adopted for clearing doubts in online learning. 43.2% of the respondents ask the professor and get their doubts clarified during online lectures, 25.5% post queries in the discussion forum and get help from peers, and 31.3% go through the online materials providing additional explanation to get their doubts clarified.

Figure 14: Methods adopted for clearing doubts in online learning

Figure  15 shows the devices used by students for online learning. Most of the students use laptop/desktop computers, many of them use smartphones and very few students use tablets.

Figure 15: Devices used for online learning

The association between the responses to Questions 1 and 2 is tested using the chi-square test. The results are presented in Table 2, which shows the observed cell totals, expected cell values, and chi-square statistic for each cell. Associations are seen to exist between several pairs of responses.

The observed cell values indicate that the highest association is found between responses 1b and 2b, since both are related to a small working group of 5 members. The lowest association is found between responses 1c and 2a, which have the lowest observed and expected cell values; the reason is that response 1c refers to work done by a 10-member team whereas response 2a refers to a two-member team. The chi-square statistic is 65.6025, the p value is < 0.00001, and the result is significant at p < 0.05.
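A chi-square test of association like the one above can be reproduced with standard tools. Since the paper's full observed table (Table 2) is not reproduced in this text, the sketch below uses a hypothetical 3×2 contingency table of Question 1 versus Question 2 responses; `scipy.stats.chi2_contingency` returns the statistic, p value, degrees of freedom, and expected cell values:

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical observed counts: rows = responses to Q1 (1a, 1b, 1c),
# columns = responses to Q2 (2a, 2b). Values are illustrative only.
observed = np.array([
    [40, 30],   # 1a: individual assignment
    [20, 90],   # 1b: small group (expected to overlap strongly with 2b)
    [25, 15],   # 1c: large group
])

chi2, p, dof, expected = chi2_contingency(observed)

# Each expected cell = (row total * column total) / grand total,
# the same quantities reported alongside the observed totals in Table 2.
print(round(chi2, 2), dof, p < 0.05)
```

A p value below 0.05 indicates a significant association between the two questions' responses, matching how the paper interprets its own statistic.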

The outcome of a survey on the effectiveness of innovations in online teaching–learning methods for university and college students is presented. About 450 students belonging to VIT Vellore, CMRIT Bangalore, Medical College, Pudukkottai, and other engineering colleges responded to the survey, and the questionnaire designed for the survey is presented. The chi-square statistic is 65.6025, the p value is < 0.00001, and the result is significant at p < 0.05; associations exist between several responses to the questions. The survey provides an estimate of the effectiveness and pitfalls of the online teaching that has taken place during the pandemic, and it paves the way for educators to understand the effectiveness of online teaching. It is important to redesign course delivery in an online mode to keep students engaged, and the outcome of the survey supports these observations.

The outcome of the survey is given below:

A small group of 5 students helps students collaborate digitally and engage personally in digital learning.

Animations are found to be the best digital approach for effective learning.

Online learning helps students to learn at their own pace comfortably.

Students prefer to learn from video lectures delivered by the faculty member handling the subject.

An online quiz with multiple-choice questions (MCQ) is preferred by students.

Student version software is useful for learning.

Online classes are more effective because they provide PPTs in front of every student, lectures are heard by all students at the sound level of their choice, and walking/travel to reach classes is eliminated.

Most students do not have disturbances or distractions, which makes learning more effective.

Except for a few, most students have no or limited responsibilities at home, which provides a good ambiance for effective online learning.

Students can get their doubts clarified during lectures, by posting queries in discussion forums and by referring to online materials provided by the faculty.

References

1. World Declaration on Higher Education for the Twenty-first Century: Vision and Action (1998). https://unesdoc.unesco.org/ark:/48223/pf0000141952. Accessed 10 December 2020

2. S. Cadena-Vela, J.O. Herrera, G. Torres, G. Mejía-Madrid, Innovation in the university, in Proceedings of the Sixth International Conference on Technological Ecosystems for Enhancing Multiculturality-TEEM’18 (2018), pp. 799–805. https://doi.org/10.1145/3284179.3284308

3. Demand for online tutors soars, pay increases 28%. Times of India (2020). https://timesofindia.indiatimes.com/city/chennai/demand-for-online-tutors-soars-pay-increases-28/articleshow/77939414.cms. Accessed 7 December 2020

4. Can 100 top universities expand e-learning opportunities for 3.7 crore students. Times of India (2020). https://timesofindia.indiatimes.com/home/education/news/can-100-top-universities-expand-e-learning-opportunities-for-3-7-crore-students/articleshow/76032068.cms. Accessed 9 December 2020

5. C. Malemed, Retooling instructional design (2019). https://theelearningcoach.com/elearning2-0/retooling-instructional-design/. Accessed 8 December 2020

6. C. Wiess, COVID-19 and its impact on learning (2020). https://elearninfo247.com/2020/03/16/covid-19-and-its-impact-on-learning/. Accessed 10 December 2020

7. E. Alqurashi, Technology tools for teaching and learning in real-time, in Educational Technology and Resources for Synchronous Learning in Higher Education (IGI Global, 2019), pp. 255–278

8. J.M. Mbuva, Examining the effectiveness of online educational technological tools for teaching and learning and the challenges ahead. J. Higher Educ. Theory Pract. 15(2), 113 (2015)

9. S.N.M. Mohamad, M.A.M. Salleh, S. Salam, Factors affecting lecturer’s motivation in using online teaching tools. Procedia Soc. Behav. Sci. 195, 1778–1784 (2015)

10. A.T. Hilliard, Global blended learning practices for teaching and learning, leadership and professional development. J. Int. Educ. Res. 11(3), 179–188 (2015)

11. M. Moussavi, Y. Amannejad, M. Moshirpour, E. Marasco, L. Behjat, Importance of data analytics for improving teaching and learning methods, in Data Management and Analysis (Springer, Cham, 2020), pp. 91–101

12. P. Berking, S. Gallagher, Choosing a learning management system, in Advanced Distributed Learning (ADL) Co-Laboratories (2013), pp. 40–62

13. R.J.M. Ventayen, K.L.A. Estira, M.J. De Guzman, C.M. Cabaluna, N.N. Espinosa, Usability evaluation of Google Classroom: basis for the adaptation of GSuite e-learning platform. Asia Pac. J. Educ. Arts Sci. 5(1), 47–51 (2018)

14. B.N. Ilag, Introduction: Microsoft Teams, in Introducing Microsoft Teams (Apress, Berkeley, CA, 2018), pp. 1–42

15. A.S. Alqahtani, The use of Edmodo: its impact on learning and students’ attitudes towards it. J. Inf. Technol. Educ. 18, 319–330 (2019)

16. J. Uziak, M.T. Oladiran, E. Lorencowicz, K. Becker, Students’ and instructor’s perspectives on the use of Blackboard Platform for delivering an engineering course. Electron. J. E-Learn. 16(1), 1 (2018)

17. T. Makarchuk, V. Trofimov, S. Demchenko, Modeling the life cycle of the e-learning course using Moodle Cloud LMS, in Conferences of the Department Informatics, No. 1 (Publishing House Science and Economics Varna, 2019), pp. 62–71

18. N.L. Davis, M. Gough, L.L. Taylor, Online teaching: advantages, obstacles, and tools for getting it right. J. Teach. Travel Tour. 19(3), 256–263 (2019)


Author information

Authors and Affiliations

Department of Computer Science and Engineering, CMR Institute of Technology, Bangalore, 560037, India

Preethi Sheba Hepsiba Darius

School of Mechanical Engineering, Vellore Institute of Technology, Vellore, 632014, India

Edison Gundabattini & Darius Gnanaraj Solomon


Corresponding author

Correspondence to Darius Gnanaraj Solomon .

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article

Cite this article

Darius, P.S.H., Gundabattini, E. & Solomon, D.G. A Survey on the Effectiveness of Online Teaching–Learning Methods for University and College Students. J. Inst. Eng. India Ser. B 102 , 1325–1334 (2021). https://doi.org/10.1007/s40031-021-00581-x


Received : 10 August 2020

Accepted : 18 March 2021

Published : 05 April 2021

Issue Date : December 2021

DOI : https://doi.org/10.1007/s40031-021-00581-x


  • Learning management
  • Learning environment
  • Teaching and learning
  • Digital learning
  • Collaborative learning
  • Online learning


E-Learning Survey Questionnaire

Alade Ameen

EDUCAUSE is conducting a study of evolving IT requirements to support instructional technology. In general this survey focuses on IT support for instructors in the following areas: 1. Online distance-learning courses where the instructor conducts all class sessions primarily online — NOT via mail or telephone — requiring no face-to-face meetings between students and instructor, either in the classroom or via video during the course. 2. Traditional courses where the instructor teaches all sessions in the classroom but incorporates technology in some or all classes — for example, PowerPoint presentations, Web-based activities, multimedia simulations of key concepts, virtual labs, and/or online testing. 3. Hybrid courses where the instructor combines the elements of online distance-learning courses and traditional courses to replace some classroom sessions with virtual sessions — for example online forums or Web-based activities.

Related Papers

Steve Myran


The Internet and Higher Education

Julian Scher

International Journal of Teaching and Learning in Higher Education

Teresa Foulger , Audrey Amrein-Beardsley

Jenni Hayman

The purpose of this study was to develop a set of essential practices for online instruction at a higher education institution. The literature of online learning indicated that traditional classroom-based instructors needed support and professional development to adapt their teaching methods for effective online course delivery. Many instructors at the participant institution for this study were asked to teach online; however, they received very few guidelines for online instruction and minimal support. Using techniques based on the Delphi Method, a group of expert online instructors at the participant institution was asked to agree on a set of practices they considered essential for online instruction. The initial set of practices was developed through a qualitative analysis of 18 references from the literature of online learning. The final set of practices the participants agreed were essential, 37 items in total, may represent an effective starting point for professional development and support of online instructors at the participant institution.

Online Learning

Curt Bonk , Meina Zhu

As massive open online courses (MOOCs) increase, the large scale and heterogeneity of MOOC participants bring myriad significant design challenges. This exploratory mixed methods study explores 143 MOOC instructors’ considerations and challenges in designing MOOCs; 12 of whom were interviewed and had their courses analyzed. The survey, interview, and course review data revealed a variety of considerations and challenges in MOOC design in terms of pedagogy, resources, and logistics. Pedagogical considerations included learning objectives, assessment methods, course length, course content, flexibility, and collaborative learning support. Resource considerations included the affordance of MOOC platforms, support from the host institution and the platform, and the available intellectual and hardware resources. Logistical considerations included the amount of time instructors spent designing the MOOC. The obstacles included pedagogical challenges (e.g., engaging learners, increasing learner interaction, and limited assessment methods), resource challenges (e.g., limitations associated with the affordances of the platform), and logistical challenges (e.g., time limitations for designing and developing MOOCs). To address these challenges, the instructors often relied on reviewing other MOOCs. They also sought help from colleagues, their universities, and support personnel of the adopted platforms. Keywords: massive open online courses (MOOCs), instructional design, design considerations, design challenges, MOOC instructors

Alexandra M . Pickett , Peter Shea , Eric Fredericksen

This paper examines issues of pedagogy, faculty development, student satisfaction, and reported learning in the State University of New York (SUNY) Learning Network (SLN). Beginning with an overview of the SLN program, we provide a conceptual framework for our current research on higher education, online learning environments. This framework attempts to integrate research on how people learn [1], with best practices in higher education [2] and recent research on learning in asynchronous higher education environments [3].

Eric Fredericksen , William Pelz , Alexandra M . Pickett

Elements of Quality Online …

Alexandra M . Pickett , Eric Fredericksen

Edwige Simon

Annual Meeting of the American …

Steven Lonn



The International Review of Research in Open …

Alexandra M . Pickett

The International Review of Research in Open and Distributed Learning

Alexandra M . Pickett , Peter Shea

Abdulmohsin Altawil


D Natasha Brewley

Nicole Buzzetto-Hollywood

George Houston

Stephen Downes

The Relationship Between Teaching Presence and Online Instructor Satisfaction in an Online Teacher Training Program.

Stafford Lumsden



E-Learning Questionnaire

Thanks for filling in this questionnaire. The contents of this questionnaire are absolutely confidential and used for an academic project only. Please tick the relevant boxes and be as full and comprehensive as possible with your other answers.

Section A: Perception Towards E-learning

Q1: Approximately how many years have you been using the internet?

Q2: Do you prefer taking an online course over a face-to-face course? Yes / No

Q3: Is e-learning only advisable for people with a lot of computer knowledge? Yes / No

Q4: Are textbooks your preferred learning material? Yes / No

Q5: Is accessing copies of offline academic transcripts hard? Yes / No

Q6: Do you have difficulty in finding relevant online material? Yes / No

Q7: How much does e-learning help?

Preference Towards E-learning

Q9: Do you prefer online study?

(Items rated on a scale: Highly Agree / Agree / Neutral / Disagree / Highly Disagree)

15. E-learning provides working

Satisfaction Towards E-learning

Section B: Personal Details

Age: Below 18 years / 18–21 years / 22–25 years / Above 25 years

Gender: Male / Female

Employment Status:

In full time education In part time education Not in education

Main program of study: Engineering / Pharmacy / BBA / B.Com / B.Sc / MBA / M.Tech / MCA / Others (please specify)


Original Research Article

Online Learning Satisfaction During COVID-19 Pandemic Among Chinese University Students: The Serial Mediation Model


  • 1 Faculty of Business and Law, Taylor’s University, Subang Jaya, Malaysia
  • 2 Foundation of Studies, Warwick University, Coventry, United Kingdom
  • 3 Educational Development Center, Mazandaran University of Medical Sciences, Sari, Iran
  • 4 Department of Nursing, Alborz University of Medical Sciences, Karaj, Iran

The aim of this study was to investigate the relationship between interaction and online learning satisfaction, and whether this relationship is mediated by academic self-efficacy and student engagement among Chinese university students during the COVID-19 pandemic. A serial mediation model was developed to examine the proposed relationship. This study employed a cross-sectional, questionnaire-based research design. A sample of 1,504 Chinese university students (M age = 19.89 years, SD age = 1.93) from five provinces in China completed an online survey questionnaire from December 2020 to January 2021 to respond to questions on demographic characteristics and items measuring the variables in the research model. Partial least squares structural equation modeling was used to assess the measurement model and the proposed serial mediation model. Data were analyzed using SmartPLS software version 3.3.2. The results of the measurement model showed good reliability and validity for all constructs. The results of the structural model and hypothesis testing showed that all hypotheses were supported in this study. In particular, there was a significant positive relationship between interaction and online learning satisfaction (Q1), interaction and academic self-efficacy (Q2), academic self-efficacy and student engagement (Q3), and student engagement and online learning satisfaction (Q4). In addition, the results showed that academic self-efficacy and student engagement serially mediated the relationship between interaction and online learning satisfaction (Q5). The serial mediation model explained 34.6% of the variance in online learning satisfaction. The findings shed light on the underlying mechanisms that explain students’ online learning satisfaction during the COVID-19 pandemic. Universities and policymakers need to make better decisions that could ultimately lead to students’ academic outcomes and achievement.


COVID-19, declared a pandemic by the World Health Organization in 2020, has utterly disrupted educational activities, forcing most universities into full closure and thus affecting hundreds of millions of students and educators across the globe (Shahzad et al., 2021). When traditional learning and teaching are no longer an option, online learning (synchronous or asynchronous) acts as an alternative to support the continuation of education in the midst of a pandemic with its flexibility, accessibility, and convenience (Adedoyin and Soykan, 2020; Selvanathan et al., 2020). Most higher institutions shifted from face-to-face learning to emergency remote teaching (Jan, 2020), and the motive behind such implementation was to alleviate the transmission of the coronavirus and maintain the continuation of education during the challenging times of lockdown among students and educators (Bayham and Fenichel, 2020; Wang et al., 2020).

Faculty members of universities have begun to learn and deliver online teaching to their students and are eager to understand how to produce better learning outcomes with online instruction (Shahzad et al., 2021). On the other hand, students have more control over the content and time to learn based on their individual learning needs and autonomy (Coman et al., 2020). However, this unplanned and rapid shift has raised concerns over the quality of learning, students’ academic achievement (Sahu, 2020), and satisfaction (Dziuban et al., 2015), as not much information or guidance is available on the best online teaching practices for instructors (Armstrong-Mensah et al., 2020). On the contrary, prior to the COVID-19 pandemic, online education, marked by high attrition rates, was regarded by students as the second-best option compared to traditional higher education (Muilenburg and Berge, 2005; Hassan et al., 2021).

Preliminary studies emphasize the pivotal role that student satisfaction, as opposed to completion rates, plays in determining the success or failure of online education (Kuo et al., 2014; Rabin et al., 2019; Gopal et al., 2021), as learners’ satisfaction reflects how they perceive their learning experiences (Kuo et al., 2014) and interprets the quality of the course instruction (Hew et al., 2020). Interaction in a fully online learning setting has been regarded as a critical factor that determines the extent to which students are satisfied with their online education (Wu et al., 2010; Cidral et al., 2018). According to Kuo et al. (2014), a high level of interaction with the instructor, other learners, or content leads to high satisfaction and thus reveals high engagement in online learning (Veletsianos, 2010). Similarly, lack of interaction often leads to poor student engagement and lower student satisfaction (Martin et al., 2018; Rahmatpour et al., 2021). It can be concluded that interaction in online learning often translates into students’ engagement in their academic activities before positively affecting students’ satisfaction (Kim and Kim, 2021).

On the other hand, academic self-efficacy has been shown to have a positive effect on students’ engagement within the self-directed nature of distance education, where students with high academic self-efficacy are more engaged in their online studies (Jung and Lee, 2018) and more likely to experience learning satisfaction (Artino, 2008). Academic self-efficacy, understood as students’ belief in their capability to perform well academically on an online platform, has been reported to be the most predictive factor of students’ satisfaction (Shen et al., 2013; Jan, 2015). As aforementioned, prior studies indicate the significant role of interaction (Enkin and Mejías-Bikandi, 2017), academic self-efficacy (Shen et al., 2013), and students’ engagement in online classrooms (Robinson and Hullinger, 2008) and their relationship to online learning satisfaction. However, there is a scarcity of studies investigating the mechanisms of interaction, self-efficacy, and engagement on students’ overall satisfaction. Hence, an extension of the existing research is needed.

This study adopts the theory of transactional distance ( Moore, 1993 ), most often identified with distance learning programs ( Benson and Samarawickrema, 2009 ). It helps identify the mechanism behind the relationship between interaction and satisfaction. Ekwunife-Orakwue and Teng (2014) argue that although the theory of transactional distance has been posited to explain the mechanisms in online learning education, few studies have identified the factors from this theory to predict a causal pathway for the mechanism of occurrence. Nevertheless, the theory recognizes interaction as a bridge to “a psychological and communications gap” in distance learning in promoting students’ overall satisfaction ( Moore, 1993 ; Benson and Samarawickrema, 2009 ). Hence, this study goes one step further and suggests that academic self-efficacy and student engagement may explain the mechanism behind the relationship between interaction and online learning satisfaction among online learners, particularly Chinese online learners.

China was the first country to respond to this transition by instructing a quarter of billion full-time students to resume their studies online ( OECD, 2020 ; World Economic World Economic Forum, 2020 ). Chinese online education advocates “interactivity” in online learning provides some perspectives to access online learning in our study. As students’ satisfaction reflects the effectiveness of e-Learning quality ( Alqurashi, 2019 ), it has become very important to understand how interactions impact the e-Learning quality, especially during the pandemic when the education around the world has moved to online teaching & learning ( Kumar et al., 2021 ). However, the literature is not exhaustive on student satisfaction in an online environment during the pandemic. It is particularly scarce in the context of developing countries, as in the case with China. Thus, the current study offers some new insights on distance learning by investigating the mechanism behind the relationship between online interaction and learners’ satisfaction from students’ perspectives with the lens of the theory of transactional distance in developing countries. Secondly, there is a regrettable paucity of research to address the serial mediation of academic self-efficacy and student engagement in the correlation between students’ satisfaction and interactions. And it is worth noting that student satisfaction is closely tied to their academic performance or achievement and also acts as an indicator to measure the success of online courses ( Alqurashi, 2019 ). Thus, to understand student satisfaction and its relationship to interaction through student engagement, academic self-efficacy will largely assist students in achieving better online learning outcomes. 
On the other hand, although prior research supports the impact of academic self-efficacy on student engagement (Bong and Skaalvik, 2003), it has mostly been measured at a task-specific level and has not yet been widely measured at a general level (Ferla et al., 2009). Thus, this study holds significance in opening up a new perspective for educators and policymakers on how to plan effectively for the implementation of distance learning in any future situation.

Literature Review

Online Learning Satisfaction

Learning satisfaction represents learners’ feelings and attitudes toward the learning process, or the perceived level of fulfilment of one’s desire to learn arising from the learning experience (Topala and Tomozii, 2014). In the online context, satisfaction has been found to be one of the most significant considerations influencing the continuity of online learning (Moore and Kearsley, 2011; Parahoo et al., 2016). Previous research has shown that learners’ satisfaction is a critical indicator of learning achievement and of the success of online learning system implementation (Ke and Kwak, 2013). To meet learners’ real learning needs and create an effective learning environment, a growing body of studies has examined various determinants of learners’ online satisfaction (Shen et al., 2013; Hew et al., 2020; Jiang et al., 2021).

Muilenburg and Berge (2005) identified eight barriers that prevent students from a satisfactory online education: administrative and technical issues, lack of academic and technical skills, interaction, motivation, time and support for studies, and the accessibility and affordability of Internet usage. Similarly, Baber (2020) performed a comparative analysis of the determinants of learning satisfaction among undergraduate students from South Korea and India and found that variables such as classroom interaction, student engagement, course structure, teacher awareness, and facilitation positively influence students’ perceived learning satisfaction. Other factors, such as online support service quality, perceived ease of use and usefulness of the online platform, computer self-efficacy, academic self-efficacy, prior experience, and online learning acceptance, have also been found to significantly impact students’ online learning satisfaction (Lee, 2010; Jan, 2015; Jiang et al., 2021).

Among the various factors that impact learners’ online learning satisfaction and academic outcomes, interaction can be seen as the key component, and its importance and effectiveness have also been emphasized by the theory of transactional distance (Moore, 1993; Benson and Samarawickrema, 2009). Even though previous studies have confirmed the positive impact of interaction on online learning satisfaction, the mechanism behind this relationship has not been well addressed in the literature. Palmer and Holt (2009) stated that the ability and confidence to learn from online courses and to connect and engage with others were the main factors explaining online learners’ satisfaction. In this regard, this study argues that students’ academic self-efficacy and engagement in online classes may explain the relationship between interaction and online learning satisfaction.


According to Moore and Kearsley (1996), interaction should be highlighted and examined in all forms of education, whether face-to-face or online. It is a process that allows learners to seek new information and form connections with instructors, other learners, and content in their learning activities (Moore, 1989). Learning activities, in turn, have been identified as a significant element that critically determines learners’ learning outcomes (Baber, 2021). A cross-country study conducted by Baber (2020) during the COVID-19 pandemic revealed interaction as the most significant factor in examining students’ online learning satisfaction and learning outcomes. Notably, interaction in online learning has often been limited by technological constraints (Downing et al., 2007), and the literature on distance education has largely neglected its significance (Bernard et al., 2009). Bernard et al. (2009) added that interaction has not been explicitly explained or highlighted in the study of distance education, and it is a much-needed component of online learning. Indeed, Bali and Liu (2018) have shown that face-to-face classes exhibit a higher degree of interaction and satisfaction than online courses. Interaction can be categorized into three dimensions: interaction with instructors, interaction with peers, and interaction with content (Moore, 1989). Jung et al. (2002) found that consistent interaction with instructors accounted for 60% of students’ online satisfaction, especially in the early stages of a course.
This is because, in an online learning environment, instructors are expected to offer advice, direction, and assistance to each learner based on individual needs; to administer formal and informal evaluations; to ensure that learners are making progress; to inspire learners; and to assist learners in putting what they have learned into effect (Moore, 1989; Anderson et al., 2001). In addition, Kurucay and Inan (2017) stated that learner-learner interaction, which allows students to socialize, exchange and discuss ideas, and participate in group activities, is also important for both student satisfaction and academic achievement in online learning. Moreover, social interactivity with other students fosters greater student satisfaction with a course (Skinner et al., 2008). In the same vein, interaction with content has been found to be closely related to course content quality, which in turn affects student satisfaction (Kim and Kim, 2021): the better the content quality, the more motivated and satisfied learners are (Knowles et al., 2020). On the contrary, a few studies found that learner-learner or learner-instructor interactions had no effect on learners’ satisfaction in various Massive Open Online Courses in the United States (Kuo et al., 2014; Gameel, 2017). Thus, this study synthesizes these three components to construct interaction. Hence, we hypothesize that:

Question (Q) 1: Is there a positive effect of interaction on online learning satisfaction?

Academic Self-Efficacy

Self-efficacy is a multidimensional concept describing an individual’s confidence in his or her ability to master a task (Bandura, 1982) and is believed to be a vital component in online learning (Shen et al., 2013). Academic self-efficacy, as a dimension of self-efficacy, has been defined as one’s capacity to carry out specific academic roles and attain designated performance in learning situations (Zhang, 2014). Studies have indicated that students with higher academic self-efficacy make greater progress by seeking difficult tasks and adopting effective strategies to solve them (Walker et al., 2006). Specifically, those with high academic self-efficacy tend to be more mastery-oriented and devote more time to completing their assignments (Richardson, 2007). In contrast, students with low academic self-efficacy resulting from prior failed learning experiences tend to give up easily and are less likely to be academically engaged (Mercer et al., 2011). Moreover, an extensive literature has found that academic self-efficacy is closely associated with favorable academic outcomes and is strongly tied to changes in states of learning engagement (Zhen et al., 2017). For instance, Walker et al. (2006) found that academic self-efficacy positively impacts student engagement in the learning process. Similarly, it has been suggested that, among motivational constructs, academic self-efficacy is one of the key players in promoting students’ engagement, including behavioral, cognitive, and emotional engagement.

Interestingly, most literature on academic self-efficacy focuses on its beneficial effects on the learning process and performance; fewer studies have investigated its antecedents (Zhang, 2014). According to Bandura (1977), students’ academic self-efficacy can be affected by a variety of personal, cognitive, and environmental stimuli, including student or teacher behavior, that is, interaction with teachers and peers (Zhang, 2014). In this regard, Santiago and Einarson (1998) reported that graduate students’ expectations of peer/faculty interaction emerge as a significant predictor of academic self-efficacy regardless of gender. Further, Nelson Laird (2005) and Zhang (2014) found that students with positive, quality interactions with their peers or teachers are more likely to possess higher academic self-efficacy. Zhang (2014) suggested that when university students perceive their instructors as interactive and enthusiastic, they tend to be more intrinsically motivated, which consequently fuels their academic self-efficacy. Meanwhile, college students who experience quality positive peer-to-peer interactions are apt to possess more confidence in their academic life and tend to enroll in more diversified courses in the future (Nelson Laird, 2005). Considering the compelling evidence of the positive function of academic self-efficacy and its antecedents, this study proposes the following hypotheses:

Q2: Is there a positive effect of interaction on academic self-efficacy?

Q3: Is there a positive effect of academic self-efficacy on student engagement?

Student Engagement

Student engagement has been referred to as the input of physical and psychological energy that a student dedicates to educationally effective activities (Astin, 1984; Kuh, 2003), which is closely related to learning outcomes, such as learning satisfaction, academic achievement, and completion rates (Baron and Corbin, 2012; Gao et al., 2020) in all modes of education (Fisher et al., 2018). According to prior research (Fredricks et al., 2016; Maroco et al., 2016), student engagement is a multidimensional construct that includes three basic substructures: behavioral, emotional, and cognitive engagement. Specifically, behavioral engagement relates to students’ behaviors, such as attending classes and participating in learning activities following social and institutional rules (Sinval et al., 2021). Emotional engagement refers to students’ positive and negative emotional responses to the learning process and class activities (Manwaring et al., 2017). Furthermore, cognitive engagement is defined as students’ learning efforts, such as learning strategies or approaches and academic self-regulation (Manwaring et al., 2017; Gao et al., 2020). In this vein, Janosz (2012) suggested that all three dimensions of student engagement are interdependent, as students need to engage both physically (behavioral) and psychologically (emotional and cognitive) to acquire new skills and knowledge in the learning process. If students fail to engage in either way in the learning process, they will be inclined to experience a low level of learning satisfaction (Sun and Rueda, 2012; Gao et al., 2020). In contrast, students who are more engaged in learning activities are more likely to spend extra time on the learning process, participate more, and develop mechanisms to assist them in learning and achievement (Klem and Connell, 2004; Sinval et al., 2021), which eventually leads to higher learning satisfaction.
This is consistent with the findings of Kim and Kim (2021) and Cheng and Chau (2016) that student engagement has a significant positive effect on students’ satisfaction. A possible explanation is that most undergraduate students who were satisfied with online learning believed that active engagement was an effective way to boost learning.

Indeed, student engagement is crucial for online pedagogy because well-designed online courses revolve around the learners (McCombs, 2015). Some studies argue that enhancing student engagement in online learning is difficult owing to insufficient mastery of technology and self-discipline (Oliver and Herrington, 2003). Nevertheless, Mount et al. (2009) suggest that student engagement is best achieved through interaction with peers and instructors. Meanwhile, some studies have further discovered that student engagement mediates the impact of student interaction on students’ satisfaction (Jelas et al., 2016). However, contrary to prior findings, Gray and DiLoreto (2016) argued that student engagement mediates only the effect of instructor interaction on students’ satisfaction; this mediation was not found between peer interaction and student satisfaction. This may be because peer-to-peer interaction has often been identified as a poor predictor of students’ satisfaction (Kuo et al., 2014). On the other hand, academic self-efficacy has been used to predict students’ satisfaction in the online learning context (Shen et al., 2013) or as a mediator to explain the relationships between academic achievement and other factors (Hejazi et al., 2009; Shams et al., 2011). Few studies have examined its mediating effect on online learning satisfaction, in contrast to job satisfaction (Peng and Mao, 2015; Yıldız and Şimşek, 2016). Therefore, to conclude the above discussion, and given the relatively limited research investigating the mechanism behind the relationship between interaction and online learning satisfaction, this study predicts that:

Q4: Is there a positive effect of student engagement on online learning satisfaction?

Q5: Do academic self-efficacy and student engagement serially mediate the positive relationship between interaction and online learning satisfaction?

Materials and Methods

Study Design

This study employed a cross-sectional, questionnaire-based research design to investigate the relationship between interaction and online learning satisfaction as well as the serial mediating role of academic self-efficacy and student engagement in the relationship between interaction and online learning satisfaction among Chinese university students.


An online survey was conducted among Chinese university students from December 2020 to January 2021 using Sojump, an online questionnaire platform. The survey link, with a brief description of the objective of the study, was shared through the Chinese social media app WeChat, and participants could respond directly from their smartphone, tablet, or laptop. The inclusion criteria were as follows: (1) Chinese university-level students who had attended online classes during the COVID-19 pandemic and (2) those who willingly participated in this study. An a priori estimation was used to calculate the minimum required sample size; such estimation is employed during the research planning stage to avoid type I and type II errors (Beck, 2013; She et al., 2021). A minimum sample size of 1,454 was required based on 4 latent variables, 35 observed variables, a probability level of less than 0.05, a power level of 0.8, and an effect size of 0.1 (Cohen, 2013). In total, using convenience sampling, 1,504 university students from five provinces in China (Xinjiang, Gansu, Henan, Shandong, and Hebei) fulfilled the inclusion criteria. The sample consisted of 1,058 females (70.3%) and 446 males (29.7%) with a mean age of 19.89 years (SD = 1.93). The majority of the participants were undergraduate students (97.7%), and most reported having at least six online classes per week during the pandemic (61.4%). Regarding years in university, 83.2% were Year 1 and Year 2 students.

Students’ online learning satisfaction was measured by adopting four items developed by Lin (2005) . For the purpose of this study, “course” was replaced with “online learning” in the original scale (e.g., “The online learning activities met my expectations for what I have hoped to learn”). The participants responded on a five-point Likert scale ranging from 1 (strongly disagree) to 5 (strongly agree).

To measure respondents’ interaction in online learning, this study adopted a six-item scale from Chung and Chen (2020), a subscale of the student perceptions of an online course instrument. It integrates interaction between instructors and students (e.g., “ The instructor is supportive when a student had difficulties or questions ”), among students (e.g., “ The course foster student-to-student interaction for supporting productive learning ”), and between content and students (e.g., “ The course content provides mutual interaction to facilitate student learning ”). Respondents answered each statement on a five-point Likert scale ranging from 1 (strongly disagree) to 5 (strongly agree).

A six-item scale adopted from Hu and Schaufeli (2009) was used to assess students’ academic self-efficacy (e.g., “I can effectively solve the problems that arise in my studies.”). This scale is a sub-dimension of the Maslach Burnout Inventory student survey (MBI-SS). Responses were scored on a five-point Likert scale ranging from 1 (strongly disagree) to 5 (strongly agree).

This study used the University Student Engagement Inventory developed by Maroco et al. (2016) to measure university students’ engagement in online learning during the pandemic. The scale consists of 15 items across three sub-dimensions: behavioral engagement (e.g., “I usually do my homework on time”), emotional engagement (e.g., “I feel excited about the school work”), and cognitive engagement (e.g., “when I read a book, I question myself to make sure I understand the subject I’m reading about”). Each item was rated on a five-point Likert scale ranging from 1 (never) to 5 (always). Item 6 (“I do not feel very accomplished at this school”) was reverse-coded.
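Reverse coding of a negatively worded item follows a simple transformation: on a 1-5 scale, the reversed score is 6 minus the raw score. A minimal Python sketch (illustrative only, not the authors' data-processing code; the function name is hypothetical):

```python
# Hypothetical illustration of reverse-coding a negatively worded Likert item.
# On a scale from scale_min to scale_max, the reversed score is
# (scale_min + scale_max) - raw, i.e., 6 - raw on a 1-5 scale.

def reverse_code(score: int, scale_min: int = 1, scale_max: int = 5) -> int:
    """Reverse-code a Likert response so all items point the same direction."""
    if not scale_min <= score <= scale_max:
        raise ValueError("score outside the Likert scale range")
    return scale_min + scale_max - score

# A response of 5 ("always") on the negatively worded item becomes 1, etc.
print([reverse_code(r) for r in [5, 4, 3, 2, 1]])  # [1, 2, 3, 4, 5]
```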

Data Analysis

Partial least squares structural equation modeling (PLS-SEM), implemented in SmartPLS version 3.3.2, was used to assess the measurement and structural models. PLS-SEM does not impose distributional assumptions and maximizes the variance explained by the developed model (Pahlevan Sharif and Nia, 2018). It also allows researchers to assess more complex models with several variables, indicator constructs, and structural paths, including both observed and latent constructs (Pahlevan Sharif et al., 2021). Hair et al. (2017) indicated that if prediction is the focus of the research, PLS-SEM is the better option in a direct comparison with covariance-based SEM. PLS-SEM and the SmartPLS software can also handle structural models and constructs of practically any level of complexity, including higher-order constructs, which usually reduce multicollinearity issues (Ringle et al., 2014). In addition, SmartPLS offers a wide range of algorithmic and modeling options with user-friendly, professional support (Bido et al., 2014). Therefore, this study employed PLS-SEM and SmartPLS. A two-step approach was used to test the structural model owing to the presence of both lower-order (interaction, academic self-efficacy, online learning satisfaction) and higher-order (student engagement) constructs (Becker et al., 2012). Internal consistency and construct reliability were assessed using Cronbach’s alpha and composite reliability (CR); values greater than 0.7 indicate that the items measure the same latent trait on the same scale (Pahlevan Sharif et al., 2019; She et al., 2021). Construct validity was assessed through both convergent and discriminant validity.
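The reliability statistics just described have standard closed-form definitions. A hedged sketch in Python (not the authors' code; SmartPLS computes these internally from the item data and standardized loadings):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a respondents-by-items score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

def composite_reliability(loadings: np.ndarray) -> float:
    """Composite reliability (CR) from standardized factor loadings:
    CR = (sum l)^2 / ((sum l)^2 + sum(1 - l^2))."""
    s = loadings.sum()
    return s ** 2 / (s ** 2 + (1 - loadings ** 2).sum())

# Both statistics should exceed 0.7, as reported in the Results section.
print(composite_reliability(np.array([0.8, 0.8, 0.8, 0.8])))  # ~0.877
```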
Convergent validity is the convergence or correlation between items intended to measure the same variable (Trockel et al., 2018); that is, it assumes the items under a construct relate to the same concept. To establish convergent validity, the average variance extracted (AVE) of each construct should be greater than 0.5 and less than its CR (Pahlevan Sharif et al., 2019). Discriminant validity is the divergence, or lack of correlation, between variables intended to assess different concepts (Trockel et al., 2018), and it refers to the extent to which the constructs differ from one another in the research model (Henseler et al., 2015). To establish discriminant validity, the square root of each construct’s AVE should be greater than its correlations with the other constructs (Fornell and Larcker, 1981). Next, the structural model was assessed. The PLS algorithm was used to compute the path coefficients, and a bootstrapping approach with 2,000 subsamples was used to estimate the standard errors and p values. This study also used the blindfolding procedure to obtain the Q² value, which assesses the predictive accuracy of the model. All tests were two-tailed, and a p value of less than 0.05 was considered statistically significant.
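The AVE threshold and the Fornell-Larcker criterion can likewise be expressed directly. An illustrative sketch under the standard definitions (again, not the study's actual pipeline; function names are hypothetical):

```python
import numpy as np

def ave(loadings: np.ndarray) -> float:
    """Average variance extracted: mean of squared standardized loadings."""
    return float((loadings ** 2).mean())

def fornell_larcker_holds(ave_values, corr) -> bool:
    """Fornell-Larcker criterion: sqrt(AVE) of each construct must exceed
    that construct's correlations with every other construct."""
    sqrt_ave = np.sqrt(np.asarray(ave_values, dtype=float))
    corr = np.asarray(corr, dtype=float)
    n = len(sqrt_ave)
    return all(
        sqrt_ave[i] > abs(corr[i, j])
        for i in range(n) for j in range(n) if i != j
    )

# Two constructs, each with AVE = 0.64 (sqrt = 0.8), inter-construct r = 0.5:
print(fornell_larcker_holds([0.64, 0.64], [[1.0, 0.5], [0.5, 1.0]]))  # True
```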

The results of the measurement model assessment are shown in Table 1 . Two items (items 6 and 8) of emotional engagement were removed owing to weak factor loadings. All the remaining items’ factor loadings were significant and greater than 0.7 for both lower-order and higher-order constructs. Internal consistency and construct reliability were good for all constructs, as evidenced by Cronbach’s alpha (0.867 to 0.950) and CR (0.919 to 0.959) values greater than 0.7. In terms of convergent validity, the AVE for all constructs was greater than 0.5 (0.627 to 0.913), and each construct’s AVE was less than its respective CR, indicating good convergent validity. As shown in Table 2 , discriminant validity was also established, as the square root of each construct’s AVE was greater than its correlations with the other constructs.


Table 1 . Results of the measurement model assessment.


Table 2 . Discriminant validity assessment using the Fornell-Larcker criterion.

Table 3 reports the results of the structural model assessment after controlling for the effects of age, gender, classes per week, and years in university. The total effect model showed a positive relationship between interaction and online learning satisfaction (β = 0.549, t-value = 24.813, p < 0.001), providing support for Q1; this model explained 32.3% of the variance. Moreover, the relationships between interaction and academic self-efficacy (β = 0.792, t-value = 56.672, p < 0.001), academic self-efficacy and student engagement (β = 0.759, t-value = 49.206, p < 0.001), and student engagement and online learning satisfaction (β = 0.198, t-value = 5.718, p < 0.001) were positive and statistically significant, supporting Q2, Q3, and Q4, respectively. In addition, the mediation model showed a serial mediation of academic self-efficacy and student engagement in the relationship between interaction and online learning satisfaction (β = 0.119, t-value = 5.681, p < 0.001), supporting Q5. The still-significant relationship between interaction and online learning satisfaction in the mediation model (β = 0.430, t-value = 12.094, p < 0.001) indicated that the mediation was partial. The mediation model explained 34.6% of the variance in online learning satisfaction, 57.6% in student engagement, and 62.7% in academic self-efficacy (see Figure 1 ). The Q² values for online learning satisfaction (31.1%), student engagement (35.8%), and academic self-efficacy (49.6%) in the mediation model indicated good predictive accuracy.
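The reported serial indirect effect can be reproduced from the path coefficients in Table 3, since a serial indirect effect is the product of its constituent paths (a quick arithmetic check, not the bootstrap significance test itself):

```python
# Path coefficients from Table 3:
a = 0.792  # interaction -> academic self-efficacy
b = 0.759  # academic self-efficacy -> student engagement
c = 0.198  # student engagement -> online learning satisfaction

# The serial indirect effect is the product of the three paths.
serial_indirect = a * b * c
print(round(serial_indirect, 3))  # 0.119, matching the reported beta
```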


Table 3 . Structural model assessment.


Figure 1 . The results of the structural model. *** p <0.001, ** p <0.01; model controls age, gender, class per week, and years in university.

This study attempted to examine the relationship between interaction and online learning satisfaction and investigate the serial mediation role of academic self-efficacy and student engagement in this relationship in a sample of university students in China during the COVID-19 pandemic.

The results showed a positive relationship between interaction and online learning satisfaction (Q1), indicating that Chinese students who interact more often during online learning report higher levels of learning satisfaction. The results are consistent with prior research in face-to-face learning showing that interaction enhances students’ learning involvement and develops their sense of belonging in the learning process (Scagnoli, 2001), which positively affects learning satisfaction (Jung et al., 2002; Baber, 2021) regardless of whether learning is online or face-to-face (Moore and Kearsley, 1996). Additionally, in an online environment, students leverage technology to interact without the limitations of time, place, and space to gain knowledge and skills (Kaymak and Horzum, 2013). Furthermore, various forms of interactivity can be crucial for students to improve their learning satisfaction and outcomes in an online learning environment. Strauß and Rummel (2020) stated that one reason for this positive relationship among university students is that interacting in online learning fosters social presence, which can be seen as students’ perception of being in contact with “real” people (Aragon, 2003). Social presence, in turn, leads to satisfaction in online learning (Kim et al., 2011).

The results also provided evidence for the positive relationship between interaction and academic self-efficacy (Q2), supporting previous studies indicating that students with more experience interacting with their peers, instructors, and content are more likely to have a higher level of academic self-efficacy (Nelson Laird, 2005; Zhang, 2014). One study showed that students foster academic self-efficacy by observing and interacting with others; for example, observing peers’ academic achievement can alter a student’s academic self-efficacy by suggesting that he or she can achieve the same results (Gebauer et al., 2020). Similarly, interaction with peers creates opportunities for students to access various academic activities and resources that enhance their academic self-efficacy (Schunk and Mullen, 2012). Moreover, instructors can also enhance students’ academic self-efficacy by providing guidance and persuasive support, as they usually act as role models who guide and steer students toward successful mastery learning experiences (Miller and Brickman, 2004; McMahon and Wernsman, 2009). It is noted that students tend to develop their cognitive ability and perspectives through interaction with course content (Moore, 1989). The same interaction helps them perform internal didactic communication with themselves as they gain information and knowledge from course materials, enabling them to improve their confidence in, and mastery of, the discipline (Goh et al., 2019).

Besides, the positive association between academic self-efficacy and student engagement was confirmed by this study (Q3), in line with past studies indicating that academic self-efficacy is the key motivational construct in promoting students’ behavioral, cognitive, and emotional engagement (Linnenbrink and Pintrich, 2003; Walker et al., 2006). Students with a higher level of academic self-efficacy are more likely to take on challenges and persist in the face of multiple academic problems (Liu et al., 2018), which urges them to engage more in academic activities. It is also believed that students with strong competency beliefs tend to develop an intrinsic interest in learning (Ryan and Patrick, 2001), which enables them to use effective and complex learning strategies to engage and involve themselves more in learning activities (Putwain et al., 2013). Zhen et al. (2017) stated that academic self-efficacy functions as a motivational force that drives students to use more learning strategies and improve their cognitive competency to deal with learning challenges. Thus, students with a higher level of academic self-efficacy show higher engagement in learning activities to attain specific academic goals in the online learning environment.

In addition, this study established that student engagement positively affects online learning satisfaction (Q4), implying that students who are more engaged with their studies are more likely to be satisfied with online learning. The findings provide further evidence for Kim and Kim (2021) that student engagement is a key factor in enhancing desirable learning outcomes and is positively associated with online learning satisfaction. In accordance with previous studies, students who are engaged in the learning process tend to invest more in their learning, participate more in learning activities, and develop mechanisms to help them achieve their academic goals (Klem and Connell, 2004), leading to greater satisfaction in both face-to-face and online learning contexts (Coetzee and Oosthuizen, 2012; El-Sayad et al., 2021). Finally, this study confirmed the partial mediating role of academic self-efficacy and student engagement in the relationship between interaction and online learning satisfaction (Q5). The findings explain the mechanism behind this relationship and imply that academic self-efficacy and student engagement partially explain why interaction positively affects online learning satisfaction. That is, university students who interact more are likely to foster their academic self-efficacy (Gebauer et al., 2020); with higher academic self-efficacy, they believe they have sufficient ability to perform online tasks and are more engaged with their learning (El-Sayad et al., 2021), which in turn contributes to their satisfaction with online learning (Kim and Kim, 2021).

This study also gives rise to several important implications for better understanding students’ satisfaction in the online learning context during the COVID-19 pandemic. Theoretically, this study is among the first to provide empirical evidence for the serial mediating roles of academic self-efficacy and student engagement in the relationship between interaction and online learning satisfaction. In this vein, the results improve our understanding of the mechanism behind this relationship. Second, the study responds to the call in Ekwunife-Orakwue and Teng’s (2014) study to identify factors from the theory of transactional distance that predict a causal pathway for the mechanism of occurrence. The study also constitutes a novel research basis for future studies aiming to capture a comprehensive picture of online learning satisfaction. As we learned from the study, online learning requires student interaction to boost student engagement and fuel students’ academic self-efficacy, thereby improving their online learning satisfaction.

Practically, educators and practitioners are recommended to place learning interactions at the core of planning, designing, and delivering online learning to create a sense of community and an online environment that emphasizes students’ own contributions to the learning process. Instructors, as facilitators, should recognize that interactions in the learning process not only help students learn and influence their satisfaction but also help students build confidence in their online academic life. Meanwhile, as there is a lack of a recognized system for measuring instructional quality in the online learning context in general (Margaryan et al., 2015), the findings of this study can also provide insights for policymakers and higher education institutions seeking to improve current e-Learning systems across the globe. In particular, the study demonstrates how interactions translate into online learning satisfaction through academic self-efficacy and student engagement. Thus, an e-Learning system should be designed to maximize students’ autonomy and involvement in the learning process and emphasize these as an ultimate goal of learning achievement. In this vein, students gain content knowledge, improve their creativity in completing tasks, develop a sense of responsibility for their learning, and ultimately benefit their future job performance.

This study, however, is not without limitations. First, the sample was drawn from university students in five provinces of China and does not represent the whole population of Chinese university students, which limits the generalizability of the findings; future studies should obtain more representative samples. Second, the use of self-report measures may be subject to exaggeration and social desirability bias. Third, the cross-sectional research design cannot support causal inferences; future research may adopt longitudinal or experimental designs to provide stronger evidence about the observed relationships and their underlying mechanisms. Future studies are also encouraged to test our model in different contexts, such as blended learning environments or other online learning-related domains, and to take into account the role of technology in students’ online satisfaction. Lastly, the present study does not explain why these relationships occur; qualitative research is needed to gain a deeper understanding of the relationship between online interaction and online learning satisfaction from students’ perspectives.

Data Availability Statement

The data that support the findings of this study are available from the corresponding author upon reasonable request.

Ethics Statement

The studies involving human participants were reviewed and approved by Mazandaran University of Medical Sciences Research Ethics Committee. The patients/participants provided their written informed consent to participate in this study.

Author Contributions

PR, HS, LS, LM contributed to the study conception and design. Material preparation, data collection were performed by LS and PR. LS and HS performed the data analysis. The first draft of the manuscript was written by LM, PR, LS, AJ, and HS. All authors commented on previous versions of the manuscript. All authors read and approved the final manuscript.

Conflict of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Publisher’s Note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.


Acknowledgments

We would like to thank all of our participants in this study. The present study was approved by the Ethical Committee of Mazandaran University of Medical Sciences, Sari, Iran (Ethical Code: IR.MAZUMS.REC.1399.7523, available at: https://ethics.research.ac.ir/form/5pm744q8yhqlcy41.pdf ).

References

Adedoyin, O. B., and Soykan, E. (2020). Covid-19 pandemic and online learning: the challenges and opportunities. Interact. Learn. Environ. , 1–13. doi: 10.1080/10494820.2020.1813180

Alqurashi, E. (2019). Predicting student satisfaction and perceived learning within online learning environments. Distance Educ. 40, 133–148. doi: 10.1080/01587919.2018.1553562

Anderson, T., Liam, R., Garrison, D. R., and Archer, W. (2001). Assessing teaching presence in a computer conferencing context. J. Asynchronous Learn. Network 5, 1–17.

Aragon, S. R. (2003). Creating social presence in online environments. New Direct. Adult and Continuing Educ. 2003, 57–68. doi: 10.1002/ace.119

Armstrong-Mensah, E., Ramsey-White, K., Yankey, B., and Self-Brown, S. (2020). COVID-19 and distance learning: effects on Georgia State University School of public health students. Front. Public Health 8:576227. doi: 10.3389/fpubh.2020.576227

Artino, A. R. (2008). Motivational beliefs and perceptions of instructional quality: predicting satisfaction with online training*. J. Comput. Assist. Learn. 24, 260–270. doi: 10.1111/j.1365-2729.2007.00258.x

Astin, A. W. (1984). Student involvement: A developmental theory for higher education. J. Coll. Stud. Pers. 25, 297–308.

Baber, H. (2020). Determinants of students’ perceived learning outcome and satisfaction in online learning during the pandemic of COVID-19. J. Educ. E-Learn. Res. 7, 285–292. doi: 10.20448/journal.509.2020.73.285.292

Baber, H. (2021). Social interaction and effectiveness of the online learning – A moderating role of maintaining social distance during the pandemic COVID-19. Asian Educ. Dev. Stud. [Ahead of print]. doi: 10.1108/AEDS-09-2020-0209

Bali, S., and Liu, M. C. (2018). Students’ perceptions toward online learning and face-to-face learning courses. J. Phys. Conf. Ser. 1108:012094. doi: 10.1088/1742-6596/1108/1/012094

Bandura, A. (1977). Social Learning: Theory. Englewood Cliffs, NJ: Prentice Hall.

Bandura, A. (1982). Self-efficacy mechanism in human agency. Am. Psychol. 37, 122–147. doi: 10.1037/0003-066X.37.2.122

Baron, P., and Corbin, L. (2012). Student engagement: rhetoric and reality. High. Educ. Res. Dev. 31, 759–772. doi: 10.1080/07294360.2012.655711

Bayham, J., and Fenichel, E. P. (2020). Impact of school closures for COVID-19 on the US health-care workforce and net mortality: a modelling study. Lancet Public Health 5, e271–e278. doi: 10.1016/S2468-2667(20)30082-7

Beck, T. W. (2013). The importance of A priori sample size estimation in strength and conditioning research. J. Strength Cond. Res. 27, 2323–2337. doi: 10.1519/JSC.0b013e318278eea0

Becker, J.-M., Klein, K., and Wetzels, M. (2012). Hierarchical latent variable models in PLS-SEM: guidelines for using reflective-formative type models. Long Range Plan. 45, 359–394. doi: 10.1016/j.lrp.2012.10.001

Benson, R., and Samarawickrema, G. (2009). Addressing the context of e-learning: using transactional distance theory to inform design. Distance Educ. 30, 5–21. doi: 10.1080/01587910902845972

Bernard, R. M., Abrami, P. C., Borokhovski, E., Wade, C. A., Tamim, R. M., Surkes, M. A., et al. (2009). A meta-analysis of three types of interaction treatments in distance education. Rev. Educ. Res. 79, 1243–1289. doi: 10.3102/0034654309333844

Bong, M., and Skaalvik, E. M. (2003). Academic self-concept and self-efficacy: how different are they really? Educ. Psychol. Rev. 15, 1–40. doi: 10.1023/A:1021302408382

Cheng, G., and Chau, J. (2016). Exploring the relationships between learning styles, online participation, learning achievement and course satisfaction: An empirical study of a blended learning course. Br. J. Educ. Technol. 47, 257–278. doi: 10.1111/bjet.12243

Chung, J., and Chen, H.-C. (2020). Development and psychometric properties of student perceptions of an online course (SPOC) in an RN-to-BSN program. Nurse Educ. Today 85:104303. doi: 10.1016/j.nedt.2019.104303

Cidral, W. A., Oliveira, T., Di Felice, M., and Aparicio, M. (2018). E-learning success determinants: Brazilian empirical study. Comput. Educ. 122, 273–290. doi: 10.1016/j.compedu.2017.12.001

Coetzee, M., and Oosthuizen, R. M. (2012). Students' sense of coherence, study engagement and self-efficacy in relation to their study and employability satisfaction. J. Psychol. Afr. 22, 315–322. doi: 10.1080/14330237.2012.10820536

Cohen, J. (2013). Statistical Power Analysis for the Behavioral Sciences. New York: Academic press.

Coman, C., Țîru, L. G., Meseșan-Schmitz, L., Stanciu, C., and Bularca, M. C. (2020). Online teaching and learning in higher education during the coronavirus pandemic: students’ perspective. Sustainability 12:10367. doi: 10.3390/su122410367

Downing, K. J., Lam, T., Kwong, T., Downing, W., and Chan, S. (2007). Creating interaction in online learning: a case study. ALT-J 15, 201–215. doi: 10.1080/09687760701673592

Dziuban, C., Moskal, P., Thompson, J., Kramer, L., DeCantis, G., and Hermsdorfer, A. (2015). Student satisfaction with online learning: is it a psychological contract? Online Learn. 19:n2.

Ekwunife-Orakwue, K. C. V., and Teng, T.-L. (2014). The impact of transactional distance dialogic interactions on student learning outcomes in online and blended environments. Comput. Educ. 78, 414–427. doi: 10.1016/j.compedu.2014.06.011

El-Sayad, G., Md Saad, N. H., and Thurasamy, R. (2021). How higher education students in Egypt perceived online learning engagement and satisfaction during the COVID-19 pandemic. J. Computer. Educ. , 1–24. doi: 10.1007/s40692-021-00191-y

Enkin, E., and Mejías-Bikandi, E. (2017). The effectiveness of online teaching in an advanced Spanish language course. Int. J. Appl. Linguist. 27, 176–197. doi: 10.1111/ijal.12112

Ferla, J., Valcke, M., and Cai, Y. (2009). Academic self-efficacy and academic self-concept: reconsidering structural relationships. Learn. Individ. Differ. 19, 499–505. doi: 10.1016/j.lindif.2009.05.004

Fisher, R., Perényi, Á., and Birdthistle, N. (2018). The positive relationship between flipped and blended learning and student engagement, performance and satisfaction. Active Learn. High. Educ. 22, 97–113. doi: 10.1177/1469787418801702

Fornell, C., and Larcker, D. F. (1981). Evaluating structural equation models with unobservable variables and measurement error. J. Mark. Res. 18, 39–50. doi: 10.1177/002224378101800104

Fredricks, J. A., Filsecker, M., and Lawson, M. A. (2016). Student engagement, context, and adjustment: addressing definitional, measurement, and methodological issues. Learn. Instr. 43, 1–4. doi: 10.1016/j.learninstruc.2016.02.002

Gameel, B. G. (2017). Learner satisfaction with massive open online courses. Am. J. Dist. Educ. 31, 98–111. doi: 10.1080/08923647.2017.1300462

Gao, B. W., Jiang, J., and Tang, Y. (2020). The effect of blended learning platform and engagement on students’ satisfaction— the case from the tourism management teaching. J. Hosp. Leis. Sport Tour. Educ. 27:100272. doi: 10.1016/j.jhlste.2020.100272

Gebauer, M. M., McElvany, N., Bos, W., Köller, O., and Schöber, C. (2020). Determinants of academic self-efficacy in different socialization contexts: investigating the relationship between students’ academic self-efficacy and its sources in different contexts. Soc. Psychol. Educ. 23, 339–358. doi: 10.1007/s11218-019-09535-0

Goh, C. F., Tan, O. K., Rasli, A., and Choi, S. L. (2019). Engagement in peer review, learner-content interaction and learning outcomes. Int. J. Info. Learn. Technol. 36, 423–433. doi: 10.1108/IJILT-04-2018-0038

Gopal, R., Singh, V., and Aggarwal, A. (2021). Impact of online classes on the satisfaction and performance of students during the pandemic period of COVID 19. Educ. Inf. Technol. , 1–25. doi: 10.1007/s10639-021-10523-1

Gray, J. A., and DiLoreto, M. (2016). The effects of student engagement, student satisfaction, and perceived learning in online learning environments. Int. J. Educ. Leadership Prepar. 11:n1

Hair, J. F. Jr., Matthews, L. M., Matthews, R. L., and Sarstedt, M. (2017). PLS-SEM or CB-SEM: updated guidelines on which method to use. Int. J. Multivariate Data Analy. 1, 107–123. doi: 10.1504/IJMDA.2017.10008574

Hassan, S. U., Algahtani, F. D., Zrieq, R., Aldhmadi, B. K., Atta, A., Obeidat, R. M., et al. (2021). Academic self-perception and course satisfaction among university students taking virtual classes during the COVID-19 pandemic in the Kingdom of Saudi Arabia (KSA). Educ. Sci. 11:134. doi: 10.3390/educsci11030134

Hejazi, E., Shahraray, M., Farsinejad, M., and Asgary, A. (2009). Identity styles and academic achievement: mediating role of academic self-efficacy. Soc. Psychol. Educ. 12, 123–135. doi: 10.1007/s11218-008-9067-x

Henseler, J., Ringle, C. M., and Sarstedt, M. (2015). A new criterion for assessing discriminant validity in variance-based structural equation modeling. J. Acad. Mark. Sci. 43, 115–135. doi: 10.1007/s11747-014-0403-8

Hew, K. F., Hu, X., Qiao, C., and Tang, Y. (2020). What predicts student satisfaction with MOOCs: A gradient boosting trees supervised machine learning and sentiment analysis approach. Comput. Educ. 145:103724. doi: 10.1016/j.compedu.2019.103724

Hu, Q., and Schaufeli, W. B. (2009). The factorial validity of the Maslach burnout inventory–student survey in China. Psychol. Rep. 105, 394–408. doi: 10.2466/PR0.105.2.394-408

Jan, S. K. (2015). The relationships between academic self-efficacy, computer self-efficacy, prior experience, and satisfaction with online learning. Am. J. Dist. Educ. 29, 30–40. doi: 10.1080/08923647.2015.994366

Jan, A. (2020). A phenomenological study of synchronous teaching during COVID-19: A case of an international school in Malaysia. Soc. Sci. Human. open 2:100084. doi: 10.1016/j.ssaho.2020.100084

Janosz, M. (2012). “Part IV commentary: outcomes of engagement and engagement as an outcome: Some consensus, divergences, and unanswered questions” in Handbook of Research on Student Engagement. eds. S. L. Christenson, A. L. Reschly, and C. Wylie (Boston, MA: Springer US), 695–703.

Jelas, Z. M., Azman, N., Zulnaidi, H., and Ahmad, N. A. (2016). Learning support and academic achievement among Malaysian adolescents: the mediating role of student engagement. Learn. Environ. Res. 19, 221–240. doi: 10.1007/s10984-015-9202-5

Jiang, H., Islam, A. Y. M. A., Gu, X., and Spector, J. M. (2021). Online learning satisfaction in higher education during the COVID-19 pandemic: A regional comparison between eastern and Western Chinese universities. Educ. Inf. Technol. , 1–23. doi: 10.1007/s10639-021-10519-x

Jung, I., Choi, S., Lim, C., and Leem, J. (2002). Effects of different types of interaction on learning achievement, satisfaction and participation in web-based instruction. Innov. Educ. Teach. Int. 39, 153–162. doi: 10.1080/14703290252934603

Jung, Y., and Lee, J. (2018). Learning engagement and persistence in massive open online courses (MOOCS). Comput. Educ. 122, 9–22. doi: 10.1016/j.compedu.2018.02.013

Kaymak, Z. D., and Horzum, M. B. (2013). Relationship between online learning readiness and structure and interaction of online learning students. Educ. Sci. Theory Practice 13, 1792–1797.

Ke, F., and Kwak, D. (2013). Online learning across ethnicity and age: A study on learning interaction participation, perception, and learning satisfaction. Comput. Educ. 61, 43–51. doi: 10.1016/j.compedu.2012.09.003

Kim, S., and Kim, D.-J. (2021). Structural relationship of key factors for student satisfaction and achievement in asynchronous online learning. Sustainability 13:6734. doi: 10.3390/su13126734

Kim, J., Kwon, Y., and Cho, D. (2011). Investigating factors that influence social presence and learning outcomes in distance higher education. Comput. Educ. 57, 1512–1520. doi: 10.1016/j.compedu.2011.02.005

Klem, A. M., and Connell, J. P. (2004). Relationships matter: linking teacher support to student engagement and achievement. J. Sch. Health 74, 262–273. doi: 10.1111/j.1746-1561.2004.tb08283.x

Knowles, M. S., Holton III, E. F., and Swanson, R. A. (2020). The Adult Learner: The Definitive Classic in Adult Education and Human Resource Development. 9th Edn . New York: Routledge.

Kuh, G. D. (2003). What we're learning about student engagement from NSSE: benchmarks for effective educational practices. Change Magazine High. Learn. 35, 24–32. doi: 10.1080/00091380309604090

Kumar, P., Saxena, C., and Baber, H. (2021). Learner-content interaction in e-learning- the moderating role of perceived harm of COVID-19 in assessing the satisfaction of learners. Smart Learn. Environ. 8, 1–15. doi: 10.1186/s40561-021-00149-8

Kuo, Y.-C., Walker, A. E., Schroder, K. E. E., and Belland, B. R. (2014). Interaction, internet self-efficacy, and self-regulated learning as predictors of student satisfaction in online education courses. Internet High. Educ. 20, 35–50. doi: 10.1016/j.iheduc.2013.10.001

Kurucay, M., and Inan, F. A. (2017). Examining the effects of learner-learner interactions on satisfaction and learning in an online undergraduate course. Comput. Educ. 115, 20–37. doi: 10.1016/j.compedu.2017.06.010

Lee, J.-W. (2010). Online support service quality, online learning acceptance, and student satisfaction. Internet High. Educ. 13, 277–283. doi: 10.1016/j.iheduc.2010.08.002

Lin, Y.-M. (2005). Understanding Students' Technology Appropriation and Learning Perceptions in Online Learning Environments. Columbia: University of Missouri. Doctoral dissertation.

Linnenbrink, E. A., and Pintrich, P. R. (2003). The role of self-efficacy beliefs in student engagement and learning in the classroom. Read. Writ. Q. 19, 119–137. doi: 10.1080/10573560308223

Liu, R.-D., Zhen, R., Ding, Y., Liu, Y., Wang, J., Jiang, R., et al. (2018). Teacher support and math engagement: roles of academic self-efficacy and positive emotions. Educ. Psychol. 38, 3–16. doi: 10.1080/01443410.2017.1359238

Manwaring, K. C., Larsen, R., Graham, C. R., Henrie, C. R., and Halverson, L. R. (2017). Investigating student engagement in blended learning settings using experience sampling and structural equation modeling. Internet High. Educ. 35, 21–33. doi: 10.1016/j.iheduc.2017.06.002

Margaryan, A., Bianco, M., and Littlejohn, A. (2015). Instructional quality of massive open online courses (MOOCs). Comput. Educ. 80, 77–83. doi: 10.1016/j.compedu.2014.08.005

Maroco, J., Maroco, A. L., Campos, J. A. D. B., and Fredricks, J. A. (2016). University student’s engagement: development of the university student engagement inventory (USEI). Psicologia: Reflexão e Crítica 29:21. doi: 10.1186/s41155-016-0042-8

Martin, F., Wang, C., and Sadaf, A. (2018). Student perception of helpfulness of facilitation strategies that enhance instructor presence, connectedness, engagement and learning in online courses. Internet High. Educ. 37, 52–65. doi: 10.1016/j.iheduc.2018.01.003

McCombs, B. (2015). Learner-Centered online instruction. New Dir. Teach. Learn. 2015, 57–71. doi: 10.1002/tl.20163

McMahon, S. D., and Wernsman, J. (2009). The relation of classroom environment and school belonging to academic self-efficacy among urban fourth- and fifth-grade students. Elem. Sch. J. 109, 267–281. doi: 10.1086/592307

Mercer, S. H., Nellis, L. M., Martínez, R. S., and Kirk, M. (2011). Supporting the students most in need: academic self-efficacy and perceived teacher support in relation to within-year academic growth. J. Sch. Psychol. 49, 323–338. doi: 10.1016/j.jsp.2011.03.006

Miller, R. B., and Brickman, S. J. (2004). A model of future-oriented motivation and self-regulation. Educ. Psychol. Rev. 16, 9–33. doi: 10.1023/B:EDPR.0000012343.96370.39

Moore, M. G. (1989). Editorial: three types of interaction. Am. J. Dist. Educ. 3, 1–7. doi: 10.1080/08923648909526659

Moore, M. G. (1993). “Theory of transactional distance,” in Theoretical Principles of Distance Education. ed. D. Keegan (New York: Routledge), 84–103.

Moore, M. G., and Kearsley, G. G. (1996). Distance Education: A System View. Wadsworth: Wadsworth Publishing Company.

Moore, M. G., and Kearsley, G. (2011). Distance Education: A Systems View of Online Learning. 3rd Edn . Wadsworth: Cengage Learning.

Mount, N. J., Chambers, C., Weaver, D., and Priestnall, G. (2009). Learner immersion engagement in the 3D virtual world: principles emerging from the DELVE project. Innovat. Teach. Learn. Info. Comput. Sci. 8, 40–55. doi: 10.11120/ital.2009.08030040

Muilenburg, L. Y., and Berge, Z. L. (2005). Student barriers to online learning: A factor analytic study. Distance Educ. 26, 29–48. doi: 10.1080/01587910500081269

Nelson Laird, T. F. (2005). College students’ experiences with diversity and their effects on academic self-confidence, social agency, and disposition toward critical thinking. Res. High. Educ. 46, 365–387. doi: 10.1007/s11162-005-2966-1

OECD. (2020). Strengthening online learning when schools are closed: The role of families and teachers in supporting students during the COVID-19 crisis. Available at: https://www.oecd.org/coronavirus/policy-responses/strengthening-online-learning-when-schools-are-closed-the-role-of-families-and-teachers-in-supporting-students-during-the-covid-19-crisis-c4ecba6c/ (Accessed June 10, 2021).

Oliver, R., and Herrington, J. (2003). “Factors influencing quality online learning experiences,” in Quality Education @ a Distance: IFIP TC3/WG3.6 Working Conference on Quality Education @ a Distance. eds. G. Davies and E. Stacey February 3–6, 2003; Geelong, Australia (Boston, MA: Springer US), 129–136.

Pahlevan Sharif, S., Mostafiz, I., and Guptan, V. (2019). A systematic review of structural equation modelling in nursing research. Nurs. Res. 26, 28–31. doi: 10.7748/nr.2018.e1577

Pahlevan Sharif, S., and Nia, H. S. (2018). Structural Equation Modeling with AMOS. Tehran: Artin Teb.

Pahlevan Sharif, S., Naghavi, N., Ong Fon, S., Sharif Nia, H., and Waheed, H. (2021). Health insurance satisfaction, financial burden, locus of control and quality of life of cancer patients: a moderated mediation model. Int. J. Soc. Econ. 48, 513–530. doi: 10.1108/IJSE-10-2019-0629

Palmer, S. R., and Holt, D. M. (2009). Examining student satisfaction with wholly online learning. J. Comput. Assist. Learn. 25, 101–113. doi: 10.1111/j.1365-2729.2008.00294.x

Parahoo, S. K., Santally, M. I., Rajabalee, Y., and Harvey, H. L. (2016). Designing a predictive model of student satisfaction in online learning. J. Mark. High. Educ. 26, 1–19. doi: 10.1080/08841241.2015.1083511

Peng, Y., and Mao, C. (2015). The impact of person–job fit on job satisfaction: The mediator role of self-efficacy. Soc. Indic. Res. 121, 805–813. doi: 10.1007/s11205-014-0659-x

Putwain, D., Sander, P., and Larkin, D. (2013). Academic self-efficacy in study-related skills and behaviours: relations with learning-related emotions and academic success. Br. J. Educ. Psychol. 83, 633–650. doi: 10.1111/j.2044-8279.2012.02084.x

Rabin, E., Kalman, Y. M., and Kalz, M. (2019). An empirical investigation of the antecedents of learner-centered outcome measures in MOOCs. Int. J. Educ. Technol. High. Educ. 16, 1–20. doi: 10.1186/s41239-019-0144-3

Rahmatpour, P., Peyrovi, H., and Sharif Nia, H. (2021). Development and psychometric evaluation of postgraduate nursing student academic satisfaction scale. Nur. Open 8, 1145–1156. doi: 10.1002/nop2.727

Richardson, J. T. E. (2007). Motives, attitudes and approaches to studying in distance education. High. Educ. 54, 385–416. doi: 10.1007/s10734-006-9003-y

Ringle, C., Da Silva, D., and Bido, D. (2015). Structural equation modeling with the smartpls. Brazilian J. Market. 13, 56–73. doi: 10.5585/remark.v13i2.2717

Robinson, C. C., and Hullinger, H. (2008). New benchmarks in higher education: student engagement in online learning. J. Educ. Bus. 84, 101–109. doi: 10.3200/JOEB.84.2.101-109

Ryan, A. M., and Patrick, H. (2001). The classroom social environment and changes in adolescents’ motivation and engagement during middle school. Am. Educ. Res. J. 38, 437–460. doi: 10.3102/00028312038002437

Sahu, P. (2020). Closure of universities due to coronavirus disease 2019 (COVID-19): impact on education and mental health of students and academic staff. Cureus 12:e7541. doi: 10.7759/cureus.7541

Santiago, A. M., and Einarson, M. K. (1998). Background characteristics as predictors of academic self-confidence and academic self-efficacy among graduate science and engineering students. Res. High. Educ. 39, 163–198. doi: 10.1023/A:1018716731516

Scagnoli, N. I. (2001). Student orientations for online programs. J. Res. Technol. Educ. 34, 19–27. doi: 10.1080/15391523.2001.10782330

Schunk, D. H., and Mullen, C. A. (2012). “Self-efficacy as an engaged learner,” in Handbook of Research on Student Engagement. eds. S. L. Christenson, A. L. Reschly, and C. Wylie (Boston, MA: Springer US), 219–235.

Selvanathan, M., Hussin, N. A. M., and Azazi, N. A. N. (2020). Students learning experiences during COVID-19: work from home period in Malaysian higher learning institutions. Teach. Public Admin. [Ahead of print]. doi: 10.1177/0144739420977900

Shahzad, A., Hassan, R., Aremu, A. Y., Hussain, A., and Lodhi, R. N. (2021). Effects of COVID-19 in E-learning on higher education institution students: the group comparison between male and female. Qual. Quant. 55, 805–826. doi: 10.1007/s11135-020-01028-z

Shams, F., Mooghali, A. R., and Soleimanpour, N. (2011). The mediating role of academic self-efficacy in the relationship between personality traits and mathematics performance. Procedia. Soc. Behav. Sci. 29, 1689–1692. doi: 10.1016/j.sbspro.2011.11.413

She, L., Rasiah, R., Waheed, H., and Pahlevan Sharif, S. (2021). Excessive use of social networking sites and financial well-being among young adults: the mediating role of online compulsive buying. Young Consum. 22, 272–289. doi: 10.1108/YC-11-2020-1252

She, L., Sharif, S. P., and Nia, H. S. (2021). Psychometric evaluation of the Chinese version of the modified online compulsive buying scale among Chinese young consumers. J. Asia Pac. Bus. 22, 121–133. doi: 10.1080/10599231.2021.1905493

Shen, D., Cho, M.-H., Tsai, C.-L., and Marra, R. (2013). Unpacking online learning experiences: online learning self-efficacy and learning satisfaction. Internet High. Educ. 19, 10–17. doi: 10.1016/j.iheduc.2013.04.001

Sinval, J., Casanova, J. R., Marôco, J., and Almeida, L. S. (2021). University student engagement inventory (USEI): psychometric properties. Curr. Psychol. 40, 1608–1620. doi: 10.1007/s12144-018-0082-6

Skinner, E., Furrer, C., Marchand, G., and Kindermann, T. (2008). Engagement and disaffection in the classroom: part of a larger motivational dynamic? J. Educ. Psychol. 100, 765–781. doi: 10.1037/a0012840

Strauß, S., and Rummel, N. (2020). Promoting interaction in online distance education: designing, implementing and supporting collaborative learning. Inf. Learn. Sci. 121, 251–260. doi: 10.1108/ILS-04-2020-0090

Sun, J. C.-Y., and Rueda, R. (2012). Situational interest, computer self-efficacy and self-regulation: their impact on student engagement in distance education. Br. J. Educ. Technol. 43, 191–204. doi: 10.1111/j.1467-8535.2010.01157.x

Topala, I., and Tomozii, S. (2014). Learning satisfaction: validity and reliability testing for students’ learning satisfaction questionnaire (SLSQ). Procedia. Soc. Behav. Sci. 128, 380–386. doi: 10.1016/j.sbspro.2014.03.175

Trockel, M., Bohman, B., Lesure, E., Hamidi, M. S., Welle, D., Roberts, L., et al. (2018). A brief instrument to assess both burnout and professional fulfillment in physicians: reliability and validity, including correlation with self-reported medical errors, in a sample of resident and practicing physicians. Acad. Psychiatry 42, 11–24. doi: 10.1007/s40596-017-0849-3

Veletsianos, G. (2010). Contextually relevant pedagogical agents: visual appearance, stereotypes, and first impressions and their impact on learning. Comput. Educ. 55, 576–585. doi: 10.1016/j.compedu.2010.02.019

Walker, C. O., Greene, B. A., and Mansell, R. A. (2006). Identification with academics, intrinsic/extrinsic motivation, and self-efficacy as predictors of cognitive engagement. Learn. Individ. Differ. 16, 1–12. doi: 10.1016/j.lindif.2005.06.004

Wang, G., Zhang, Y., Zhao, J., Zhang, J., and Jiang, F. (2020). Mitigate the effects of home confinement on children during the COVID-19 outbreak. Lancet 395, 945–947. doi: 10.1016/S0140-6736(20)30547-X

Wu, J.-H., Tennyson, R. D., and Hsia, T.-L. (2010). A study of student satisfaction in a blended e-learning system environment. Comput. Educ. 55, 155–164. doi: 10.1016/j.compedu.2009.12.012

World Economic Forum (2020). The COVID-19 pandemic has changed education forever. This is how. Available at: https://www.weforum.org/agenda/2020/04/coronavirus-education-global-covid19-online-digital-learning/ (Accessed June 10, 2021).

Yıldız, I. G., and Şimşek, Ö. F. (2016). Different pathways from transformational leadership to job satisfaction. Nonprofit Manage. Leader. 27, 59–77. doi: 10.1002/nml.21229

Zhang, Q. (2014). Assessing the effects of instructor enthusiasm on classroom engagement, learning goal orientation, and academic self-efficacy. Commun. Teach. 28, 44–56. doi: 10.1080/17404622.2013.839047

Zhen, R., Liu, R.-D., Ding, Y., Wang, J., Liu, Y., and Xu, L. (2017). The mediating roles of academic self-efficacy and academic emotions in the relation between basic psychological needs satisfaction and learning engagement among Chinese adolescent students. Learn. Individ. Differ. 54, 210–216. doi: 10.1016/j.lindif.2017.01.017

Keywords: online learning, student satisfaction, interaction, academic self-efficacy, student engagement, COVID-19 pandemic

Citation: She L, Ma L, Jan A, Sharif Nia H and Rahmatpour P (2021) Online Learning Satisfaction During COVID-19 Pandemic Among Chinese University Students: The Serial Mediation Model. Front. Psychol . 12:743936. doi: 10.3389/fpsyg.2021.743936

Received: 19 July 2021; Accepted: 07 September 2021; Published: 05 October 2021.

Copyright © 2021 She, Ma, Jan, Sharif Nia and Rahmatpour. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY) . The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Pardis Rahmatpour, [email protected]

Distance learning survey for students: Tips & examples

The COVID-19 pandemic changed learning in many unprecedented ways. Students not only had to move to online learning but also had to keep a social distance from their friends and family. Some found it quite challenging to adjust to the ‘new normal’ and missed the in-person interaction with their teachers; for others, it simply meant spending more time with their parents. A student interest survey helps customize teaching methods and curriculum to make learning more engaging and relevant to students’ lives.

Schools need to know how students feel about distance education and learn more about their experiences. To collect data, they can send out a survey on remote learning for students. Once they have the results, the management team can know what students like in the existing setup and what they would like to change.

A classroom response system lets students instantly answer multiple-choice questions and engage in real-time discussions.

Here are examples of distance learning survey questions for students that you can ask to collect their feedback.

Examples of distance learning survey questions for students

1. How do you feel overall about distance education?

  • Below Average

This question collects responses about students’ overall experience with online education. Schools can use this data to decide whether to continue teaching online or move back to in-person learning.

2. Do you have access to a device for learning online?

  • Yes, but it doesn’t work well
  • No, I share with others

Students should have uninterrupted access to a device for learning online. Find out whether they face any challenges with the device’s hardware quality, or whether they share the device with others in the house and can’t access it when they need it.

3. What device do you use for distance learning?

Know whether students use a laptop, desktop, smartphone, or tablet for distance learning. A laptop or desktop would be an ideal choice for its screen size and quality. You can use a multiple-choice question type in your questionnaire for distance education students.

4. How much time, on average, do you spend each day on distance education?

Know how much time students spend taking an online course. Analyze whether they are spending too much time and find out the reasons behind it. Students must allocate some time to play and exercise while staying at home to take care of their health. Answers to this question can also tell you whether they spend time on other activities.

5. How effective has remote learning been for you?

  • Not at all effective
  • Slightly effective
  • Moderately effective
  • Very effective
  • Extremely effective

Depending on an individual’s personality, students may like to learn in the classroom with fellow students or alone at home. The classroom offers a more lively and interactive environment, whereas it is relatively calm at home. You can use this question to know if remote learning is working for students or not. 

6. How helpful has your [School or University] been in offering you the resources to learn from home?

  • Not at all helpful
  • Slightly helpful
  • Moderately helpful
  • Very helpful
  • Extremely helpful

School management teams need to offer full support to both teachers and students to make distance education comfortable and effective. They should provide support in terms of technological infrastructure and process frameworks. Given the pandemic situation, schools should allow more flexibility and adopt less strict policies.

7. How stressful is distance learning for you during the COVID-19 pandemic?

Studying during a pandemic can be quite stressful, especially if you or someone in the family is unwell. Measure students’ stress levels and identify ways to reduce them. For instance, you can organize an online dance party or a Lego game. The responses to this question can be crucial in deciding the future course of distance learning.

8. How well could you manage time while learning remotely? (Consider 5 being extremely well and 1 being not at all)

  • Academic schedule

Staying at home all the time and balancing multiple things can be stressful for many people. It requires students to have good time-management skills and self-discipline. Students can rate their experience on a scale of 1-5 and share it with the school authorities. Use a multiple-choice matrix question type for such questions in your distance learning questionnaire for students.
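Once the matrix responses are collected, the ratings can be summarized per row. A minimal sketch, using hypothetical response data and row labels (the second row, "Extracurriculars", is illustrative, not from the survey above):

```python
# Summarizing a 1-5 matrix question such as
# "How well could you manage time while learning remotely?".
# Each response maps a matrix row to a rating from 1 (not at all) to 5 (extremely well).
from statistics import mean

responses = [
    {"Academic schedule": 4, "Extracurriculars": 3},
    {"Academic schedule": 5, "Extracurriculars": 2},
    {"Academic schedule": 3, "Extracurriculars": 4},
]

def average_ratings(responses):
    """Return the mean rating for each matrix row, rounded to 2 decimals."""
    rows = {}
    for response in responses:
        for row, rating in response.items():
            rows.setdefault(row, []).append(rating)
    return {row: round(mean(ratings), 2) for row, ratings in rows.items()}

print(average_ratings(responses))  # {'Academic schedule': 4.0, 'Extracurriculars': 3.0}
```

Averages below 3 on a row would flag an area where students need more support with time management.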


9. Do you enjoy learning remotely?

  • Yes, absolutely
  • Yes, but I would like to change a few things
  • No, there are quite a few challenges
  • No, not at all

Get a high-level view of whether students enjoy learning from home or are doing it only because they have to. Gain insights into how you can improve distance education and make it interesting for them.

10. How helpful are your teachers while studying online?

Distance education lacks proximity with teachers and has its own set of unique challenges. Some students may find it difficult to learn a subject and take more time to understand. This question measures the extent to which students find their teachers helpful.

You can also use a ready-made survey template to save time. The sample questionnaire for students can be easily customized as per your requirements.


Other important questions of distance learning survey for students

  • How peaceful is the environment at home while learning?
  • Are you satisfied with the technology and software you are using for online learning?
  • How important is face-to-face communication for you while learning remotely?
  • How often do you talk to your [School/University] classmates?
  • How often do you have a 1-1 discussion with your teachers?

How to create a survey?

The intent behind creating a remote learning questionnaire for students should be to understand how schools and teachers can better support them. Use online survey software like ours to create a survey, or use a template to get started. Distribute the survey through email, a mobile app, your website, or a QR code.

Once you get the survey results, generate reports and share them with your team. You can also download them in formats like .pdf, .doc, and .xls. To analyze data from multiple sources, you can integrate the survey software with third-party apps.
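If you prefer to analyze a downloaded export yourself, tallying answer counts takes only a few lines. A hedged sketch, assuming a CSV export with a column named "effectiveness" holding one answer choice per response (the file layout and column name are illustrative, not a specific product's format):

```python
# Tallying answer counts from a hypothetical survey export in CSV form.
import csv
from collections import Counter
from io import StringIO

# Stand-in for an exported file opened with open("export.csv").
export = StringIO(
    "student_id,effectiveness\n"
    "1,Very effective\n"
    "2,Moderately effective\n"
    "3,Very effective\n"
)

def tally(csv_file, column):
    """Count how many respondents chose each answer in the given column."""
    return Counter(row[column] for row in csv.DictReader(csv_file))

counts = tally(export, "effectiveness")
print(counts.most_common())  # [('Very effective', 2), ('Moderately effective', 1)]
```

The same pattern works for any single-choice question: swap in the column name, and the counts feed directly into a bar chart or report.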

If you need any help with designing a survey, customizing the look and feel, or deriving insights from it, get in touch with us. We’d be happy to help.




