The Impact of Digital Tools on Student Writing and How Writing is Taught in Schools

Table of Contents

  • Part I: Introduction
  • Part II: How Much, and What, do Today’s Middle and High School Students Write?
  • Part III: Teachers See Digital Tools Affecting Student Writing in Myriad Ways
  • Part IV: Teachers Assess Students on Specific Writing Skills
  • Part V: Teaching Writing in the Digital Age

A survey of 2,462 Advanced Placement (AP) and National Writing Project (NWP) teachers finds that digital technologies are shaping student writing in myriad ways and have also become helpful tools for teaching writing to middle and high school students. These teachers see the internet and digital technologies such as social networking sites, cell phones, and texting as generally facilitating teens’ personal expression and creativity, broadening the audience for their written material, and encouraging teens to write more often and in more formats than may have been the case in prior generations. At the same time, they describe the unique challenges of teaching writing in the digital age, including the “creep” of informal style into formal writing assignments and the need to better educate students about issues such as plagiarism and fair use.

The AP and NWP teachers surveyed see today’s digital tools having tangible, beneficial impacts on student writing

Overall, these AP and NWP teachers see digital technologies benefitting student writing in several ways:

  • 96% agree (including 52% who strongly agree) that digital technologies “allow students to share their work with a wider and more varied audience”
  • 79% agree (23% strongly agree) that these tools “encourage greater collaboration among students”
  • 78% agree (26% strongly agree) that digital technologies “encourage student creativity and personal expression”

The combined effect of these impacts, according to this group of AP and NWP teachers, is a greater investment among students in what they write and greater engagement in the writing process.

At the same time, they worry that students’ use of digital tools is having some undesirable effects on their writing, including the “creep” of informal language and style into formal writing

In focus groups, these AP and NWP teachers shared some concerns and challenges they face teaching writing in today’s digital environment.  Among them are:

  • an increasingly ambiguous line between “formal” and “informal” writing and the tendency of some students to use informal language and style in formal writing assignments
  • the increasing need to educate students about writing for different audiences using different “voices” and “registers”
  • the general cultural emphasis on truncated forms of expression, which some feel is hindering students’ willingness and ability to write longer texts and to think critically about complicated topics
  • disparate access to and skill with digital tools among their students
  • challenging the “digital tool as toy” approach many students develop in their introduction to digital tools as young children

Survey results reflect many of these concerns, though teachers are sometimes divided on the role digital tools play in these trends.  Specifically:

  • 68% say that digital tools make students more likely—as opposed to less likely or having no impact—to take shortcuts and not put effort into their writing
  • 46% say these tools make students more likely to “write too fast and be careless”
  • Yet, while 40% say today’s digital technologies make students more likely to “use poor spelling and grammar,” another 38% say they make students less likely to do so

Overall, these AP and NWP teachers give their students’ writing skills modest marks, and see areas that need attention

Asked to assess their students’ performance on nine specific writing skills, AP and NWP teachers tended to rate their students “good” or “fair” rather than “excellent” or “very good.” Students received the best ratings on their ability to “effectively organize and structure writing assignments,” with 24% of teachers describing their students as “excellent” or “very good” in this area. Students received similar ratings on their ability to “understand and consider multiple viewpoints on a particular topic or issue.” But ratings were less positive for synthesizing material into a cohesive piece of work, using appropriate tone and style, and constructing a strong argument.

These AP and NWP teachers gave students the lowest ratings when it comes to “navigating issues of fair use and copyright in composition” and “reading and digesting long or complicated texts.”  On both measures, more than two-thirds of these teachers rated students “fair” or “poor.”


Majorities of these teachers incorporate lessons about fair use, copyright, plagiarism, and citation in their teaching to address students’ deficiencies in these areas

In addition to giving students low ratings on their understanding of fair use and copyright, a majority of AP and NWP teachers also say students are not performing well when it comes to “appropriately citing and/or referencing content” in their work. This is a fairly common concern among the teachers in the study, who note how easy it is for students today to copy and paste others’ work into their own and how difficult it often is to determine the actual source of much of the content they find online. Reflecting how critical these teachers consider these skills:

  • 88% (across all subjects) spend class time “discussing with students the concepts of citation and plagiarism”
  • 75% (across all subjects) spend class time “discussing with students the concepts of fair use and copyright”

A plurality of AP and NWP teachers across all subjects say digital tools make teaching writing easier

Despite some challenges, 50% of these teachers (across all subjects) say the internet and digital tools make it easier for them to teach writing, while just 18% say digital technologies make teaching writing more difficult.  The remaining 31% see no real impact.


Positive perceptions of the potential for digital tools to aid educators in teaching writing are reflected in practice:

  • 52% of AP and NWP teachers say they or their students use interactive whiteboards in their classes
  • 40% have students share their work on wikis, websites or blogs
  • 36% have students edit or revise their own work and 29% have students edit others’ work using collaborative web-based tools such as Google Docs

In focus groups, teachers gave a multitude of examples of the value of these collaborative tools, not only in teaching more technical aspects of writing but also in being able to “see their students thinking” and work alongside students in the writing process.  Moreover, 56% say digital tools make their students more likely to write well because they can revise their work easily.

These middle and high school teachers continue to place tremendous value on “formal writing”

While they see writing forms and styles expanding in the digital world, AP and NWP teachers continue to place tremendous value on “formal writing” and try to use digital tools to impart fundamental writing skills they feel students need. Nine in ten (92%) describe formal writing assignments as an “essential” part of the learning process, and 91% say that “writing effectively” is an “essential” skill students need for future success.

More than half (58%) have students write short essays or responses on a weekly basis, and 77% assigned at least one research paper during the 2011-2012 academic year.  In addition, 41% of AP and NWP teachers have students write weekly journal entries, and 78% had their students create a multimedia or mixed media piece in the academic year prior to the survey.

Almost all AP and NWP teachers surveyed (94%) encourage students to do some of their writing by hand

Alongside the use of digital tools to promote better writing, almost all AP and NWP teachers surveyed say they encourage their students to do at least some writing by hand. Their reasons are varied, but many teachers noted that because students are required to write by hand on standardized tests, it is a critical skill for them to have. This is particularly true for AP teachers, who must prepare students to take AP exams with pencil and paper. Other teachers say they feel students do more active thinking, synthesizing, and editing when writing by hand, and that writing by hand discourages any temptation to copy and paste others’ work.

About this Study

The basics of the survey.

These are among the main findings of an online survey of a non-probability sample of 2,462 middle and high school teachers currently teaching in the U.S., Puerto Rico, and the U.S. Virgin Islands, conducted between March 7 and April 23, 2012. Some 1,750 of the teachers are drawn from a sample of Advanced Placement (AP) high school teachers, while the remaining 712 are from a sample of National Writing Project teachers. Survey findings are complemented by insights from a series of online and in-person focus groups with middle and high school teachers and students in grades 9-12, conducted between November 2011 and February 2012.

This particular sample is quite diverse geographically, by subject matter taught, and by school size and community characteristics.  But it skews towards educators who teach some of the most academically successful students in the country. Thus, the findings reported here reflect the realities of their special place in American education, and are not necessarily representative of all teachers in all schools. At the same time, these findings are especially powerful given that these teachers’ observations and judgments emerge from some of the nation’s most advanced classrooms.

In addition to the survey, Pew Internet conducted a series of online and offline focus groups with middle and high school teachers and some of their students, whose voices are included in this report.

The study was designed to explore teachers’ views of the ways today’s digital environment is shaping the research and writing habits of middle and high school students, as well as teachers’ own technology use and their efforts to incorporate new digital tools into their classrooms.

About the data collection

Data collection was conducted in two phases. In phase one, Pew Internet conducted two online focus groups and one in-person focus group with middle and high school teachers; focus group participants included Advanced Placement (AP) teachers, teachers who had participated in the National Writing Project’s Summer Institute (NWP), and teachers at a College Board school in the Northeast U.S. Two in-person focus groups were also conducted with students in grades 9-12 from the same College Board school. The goal of these discussions was to hear teachers and students talk, in their own words, about the different ways they feel digital technologies such as the internet, search engines, social media, and cell phones are shaping students’ research and writing habits and skills. Teachers were asked to speak in depth about teaching research and writing to middle and high school students today, the challenges they encounter, and how they incorporate digital technologies into their classrooms and assignments.

Focus group discussions were instrumental in developing a 30-minute online survey, which was administered in phase two of the research to a national sample of middle and high school teachers.  The survey results reported here are based on a non-probability sample of 2,462 middle and high school teachers currently teaching in the U.S., Puerto Rico, and the U.S. Virgin Islands.  Of these 2,462 teachers, 2,067 completed the entire survey; all percentages reported are based on those answering each question.  The sample is not a probability sample of all teachers because it was not practical to assemble a sampling frame of this population. Instead, two large lists of teachers were assembled: one included 42,879 AP teachers who had agreed to allow the College Board to contact them (about one-third of all AP teachers), while the other was a list of 5,869 teachers who participated in the National Writing Project’s Summer Institute during 2007-2011 and who were not already part of the AP sample. A stratified random sample of 16,721 AP teachers was drawn from the AP teacher list, based on subject taught, state, and grade level, while all members of the NWP list were included in the final sample.
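For readers who want to see the mechanics, the stratified draw described above can be sketched in a few lines of R. This is an illustration only, not the study’s actual procedure; the data frames ap_list and nwp_list and their column names are hypothetical:

    # Minimal sketch of a stratified random draw like the one described above.
    # 'ap_list' and 'nwp_list' are hypothetical data frames, one row per teacher.
    library(dplyr)

    set.seed(2012)  # make the draw reproducible

    # Overall sampling fraction implied by the report: 16,721 of 42,879 AP teachers
    frac <- 16721 / 42879

    ap_sample <- ap_list %>%
      group_by(subject, state, grade) %>%  # strata: subject taught, state, grade level
      slice_sample(prop = frac) %>%        # proportional random draw within each stratum
      ungroup()

    # All NWP teachers are included, then combined with the AP draw
    final_sample <- bind_rows(ap_sample, nwp_list)

Stratifying before sampling, as the report describes, keeps the AP draw balanced across subject, state, and grade level rather than leaving that balance to chance.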

The online survey was conducted from March 7–April 23, 2012.  More details on how the survey and focus groups were conducted are included in the Methodology section at the end of this report, along with focus group discussion guides and the survey instrument.

There are several important ways in which the teachers who participated in the survey are unique, which should be considered when interpreting the results reported here. First, 95% of the teachers who participated in the survey teach in public schools; thus the findings reported here reflect that environment almost exclusively. In addition, almost one-third of the sample (NWP Summer Institute teachers) has received extensive training in how to effectively teach writing in today’s digital environment. The National Writing Project’s mission is to provide professional development, resources, and support to teachers to improve the teaching of writing in today’s schools. The NWP teachers included here are what the organization terms “teacher-consultants” who have attended the Summer Institute and provide local leadership to other teachers. Research has shown significant gains in the writing performance of students who are taught by these teachers.[1]

Moreover, the majority of teachers participating in the survey (56%) currently teach AP, honors, and/or accelerated courses, thus the population of middle and high school students they work with skews heavily toward the highest achievers.  These teachers and their students may have resources and support available to them—particularly in terms of specialized training and access to digital tools—that are not available in all educational settings.  Thus, the population of teachers participating in this research might best be considered “leading edge teachers” who are actively involved with the College Board and/or the National Writing Project and are therefore beneficiaries of resources and training not common to all teachers.  It is likely that teachers in this study are developing some of the more innovative pedagogical approaches to teaching research and writing in today’s digital environment, and are incorporating classroom technology in ways that are not typical of the entire population of middle and high school teachers in the U.S.  Survey findings represent the attitudes and behaviors of this particular group of teachers only, and are not representative of the entire population of U.S. middle and high school teachers.

Every effort was made to administer the survey to as broad a group of educators as possible from the sample files being used.  As a group, the 2,462 teachers participating in the survey comprise a wide range of subject areas, experience levels, geographic regions, school type and socioeconomic level, and community type (detailed sample characteristics are available in the Methods section of this report).  The sample includes teachers from all 50 states, Puerto Rico, and the U.S. Virgin Islands.  All teachers who participated in the survey teach in physical schools and classrooms, as opposed to teaching online or virtual courses.

English/language arts teachers make up a significant portion of the sample (36%), reflecting the intentional design of the study, but history, social science, math, science, foreign language, art, and music teachers are also represented.  About one in ten teachers participating in the survey are middle school teachers, while 91% currently teach grades 9-12.  There is wide distribution across school size and students’ socioeconomic status, though half of the teachers participating in the survey report teaching in a small city or suburb.  There is also a wide distribution in the age and experience levels of participating teachers.  The survey sample is 71% female.

About the Pew Research Center’s Internet & American Life Project

The Pew Research Center’s Internet & American Life Project is one of seven projects that make up the Pew Research Center, a nonpartisan, nonprofit “fact tank” that provides information on the issues, attitudes and trends shaping America and the world. The Project produces reports exploring the impact of the internet on families, communities, work and home, daily life, education, health care, and civic and political life. The Pew Internet Project takes no positions on policy issues related to the internet or other communications technologies. It does not endorse technologies, industry sectors, companies, nonprofit organizations, or individuals. While we thank our research partners for their helpful guidance, the Pew Internet Project had full control over the design, implementation, analysis and writing of this survey and report.

About the National Writing Project

The National Writing Project (NWP) is a nationwide network of educators working together to improve the teaching of writing in the nation’s schools and in other settings. NWP provides high-quality professional development programs to teachers in a variety of disciplines and at all levels, from early childhood through university. Through its nearly 200 university-based sites serving all 50 states, the District of Columbia, Puerto Rico and the U.S. Virgin Islands, NWP develops the leadership, programs and research needed for teachers to help students become successful writers and learners. For more information, visit www.nwp.org.

[1] More specific information on this population of teachers, the training they receive, and the outcomes of their students is available at the National Writing Project website at www.nwp.org.


Student Writing in the Digital Age

Essays filled with “LOL” and emojis? College student writing today actually is longer and contains no more errors than it did in 1917.


“Kids these days” laments are nothing new, but the substance of the lament changes. Lately, it has become fashionable to worry that “kids these days” will be unable to write complex, lengthy essays. After all, the logic goes, social media and text messaging reward short, abbreviated expression. Student writing will be similarly staccato, rushed, or even—horror of horrors—filled with LOL abbreviations and emojis.


In fact, the opposite seems to be the case. Students in first-year composition classes are, on average, writing longer essays (from an average of 162 words in 1917, to 422 words in 1986, to 1,038 words in 2006), using more complex rhetorical techniques, and making no more errors than those committed by freshmen in 1917. That’s according to a longitudinal study of student writing by Andrea A. Lunsford and Karen J. Lunsford, “Mistakes Are a Fact of Life: A National Comparative Study.”

In 2006, two rhetoric and composition professors, Lunsford and Lunsford, reacting to government studies that worried students’ literacy levels were declining, decided to crunch the numbers and determine whether students were making more errors in the digital age.

They began by replicating previous studies of American college student errors. There were four similar studies over the past century. In 1917, a professor analyzed the errors in 198 college student papers; in 1930, researchers completed similar studies of 170 and 20,000 papers, respectively. In 1986, Robert Connors and Andrea Lunsford (of the 2006 study) decided to see if contemporary students were making more or fewer errors than those earlier studies showed, and analyzed 3,000 student papers from 1984. The 2006 study (published in 2008) follows the process of these earlier studies and was based on 877 papers. (One of the most interesting sections of “Mistakes Are a Fact of Life” discusses how new IRB regulations forced the researchers to work with far fewer papers than earlier studies had.)

Remarkably, the number of errors students made in their papers stayed consistent over the past 100 years. Students in 2006 committed roughly the same number of errors as students did in 1917. The average has stayed at about 2 errors per 100 words.

What has changed are the kinds of errors students make. The four 20th-century studies show that, when it came to making mistakes, spelling tripped up students the most. Spelling was by far the most common error in both 1917 and 1986, “the most frequent student mistake by some 300 percent.” Going down the list of “top 10 errors,” the patterns shifted: Capitalization was the second most frequent error in 1917; in 1986, that spot went to “no comma after introductory element.”

In 2006, spelling lost its prominence, dropping down the list of errors to number five. Spell-check and similar word-processing tools are the undeniable cause. But spell-check creates new errors, too: The number-one error in student writing is now “wrong word.” Spell-check, as most of us know, sometimes corrects spelling to a different word than intended; if the writing is not later proofread, this computer-created error goes unnoticed. The second most common error in 2006 was “incomplete or missing documentation,” a result, the authors theorize, of a shift in college assignments toward research papers and away from personal essays.

Additionally, capitalization errors have increased, perhaps, as Lunsford and Lunsford note, because of neologisms like eBay and iPod. But students have also become much better at punctuation and apostrophes, which were the third and fifth most common errors in 1917. These had dropped off the top 10 list by 2006.

The study found no evidence for claims that kids are increasingly using “text speak” or emojis in their papers. Lunsford and Lunsford did not find a single such instance of this digital-era error. Ironically, they did find such text speak and emoticons in teachers’ comments to students. (Teachers these days?)

The most startling discovery Lunsford and Lunsford made had nothing to do with errors or emojis. They found that college students are writing much more and submitting much longer papers than ever. The average college essay in 2006 was more than double the length of the average 1986 paper, which was itself much longer than the average length of papers written earlier in the century. In 1917, student papers averaged 162 words; in 1930, the average was 231 words. By 1986, the average grew to 422 words. And just 20 years later, in 2006, it jumped to 1,038 words.

Why are 21st-century college students writing so much more? Computers allow students to write faster. (Other advances in writing technology may explain the upticks between 1917, 1930, and 1986. Ballpoint pens and manual and electric typewriters allowed students to write faster than inkwells or fountain pens.) The internet helps, too: Research shows that computers connected to the internet lead K-12 students to “conduct more background research for their writing; they write, revise, and publish more; they get more feedback on their writing; they write in a wider variety of genres and formats; and they produce higher quality writing.”

The digital revolution has been largely text-based. Over the course of an average day, Americans in 2006 wrote more than they did in 1986 (and in 2015 they wrote more than in 2006). New forms of written communication—texting, social media, and email—are often used instead of spoken ones—phone calls, meetings, and face-to-face discussions. With each text and Facebook update, students become more familiar with and adept at written expression. Today’s students have more experience with writing, and they practice it more than any group of college students in history.


In shifting from texting to writing their English papers, college students must become adept at code-switching, using one form of writing for certain purposes (gossiping with friends) and another for others (summarizing plots). As Kristen Hawley Turner writes in “Flipping the Switch: Code-Switching from Text Speak to Standard English,” students do know how to shift from informal to formal discourse, changing their writing as occasions demand. Just as we might speak differently to a supervisor than to a child, so too do students know that they should probably not use “conversely” in a text to a friend or “LOL” in their Shakespeare paper. “As digital natives who have had access to computer technology all of their lives, they often demonstrate in these arenas proficiencies that the adults in their lives lack,” Turner writes. Instructors should “teach them to negotiate the technology-driven discourse within the confines of school language.”

Responses to Lunsford and Lunsford’s study focused on what the results revealed about mistakes in writing: Error is often in the eye of the beholder. Teachers mark some errors and neglect to mention (or find) others. And, as a pioneering scholar of this field wrote in the 1970s, context is key when analyzing error: Students who make mistakes are not “indifferent…or incapable” but “beginners and must, like all beginners, learn by making mistakes.”

College students are making mistakes, of course, and they have much to learn about writing. But they are not making more mistakes than did their parents, grandparents, and great-grandparents. Since they now use writing to communicate with friends and family, they are more comfortable expressing themselves in words. Plus, most have access to technology that allows them to write faster than ever. If Lunsford and Lunsford’s findings about the average length of student papers hold true, today’s college students will graduate with more pages of completed prose to their names than any other generation.

If we want to worry about college student writing, then perhaps what we should attend to is not clipped, abbreviated writing, but overly verbose, rambling writing. It might be that editing skills—deciding what not to say, and what to delete—may be what most ails the kids these days.


Students’ Digital Media Self-Efficacy and Its Importance for Higher Education Institutions: Development and Validation of a Survey Instrument

  • Original research
  • Open access
  • Published: 25 July 2020
  • Volume 26, pages 555–575 (2021)


Marina Pumptow (ORCID: orcid.org/0000-0003-4407-4426) and Taiga Brahm


Although digital media are in general very common, their role in academic settings and their relevance for academic achievement have not been satisfactorily explored. This research gap became particularly apparent during the corona crisis in 2020, when university processes in many countries were suddenly almost completely digitalised. Research suggests a link between students’ diversity, in particular their socio-economic background, academic self-efficacy expectations, study-related attitudes, and academic achievement. However, previous empirical studies on digital media at universities predominantly describe different types of media usage patterns; they reveal little about students’ study-related attitudes and performance. The present study aims at developing a survey instrument to explore the relationship of individual, contextual, and social background factors to academic achievement, with a special focus on academic and digital media self-efficacy expectations (DMSE). For this purpose, a new scale for DMSE was constructed, based on existing psychological research. After pre-testing the instrument in 2017, data was collected at four German universities in summer 2018 (n = 2039). Validity and reliability are shown, and the instrument appears suitable for further research to explore the interplay of student learning and digital media use in higher education, integrating the institutional and social context.


1 Introduction

How to integrate digital media in universities is an important topic, both in research and in practice, since Higher Education (HE) practitioners are struggling with how to integrate digital media into study programs and infrastructure. This integration is assumed to offer innovative potential for teaching and learning at Higher Education Institutions (HEIs). However, the relatively new trend towards ‘digitalisation’ in HE has not yet been sufficiently considered in the evaluation of student performance and academic success. For instance, the comprehensive multidimensional instrument to evaluate university teaching by Lemos, Queirós, Teixeira, and Menezes (2011) includes the dimension of teaching methods but does not yet include digital media.

However, just as digital media are affecting everyday life, a change in academic studies and demands might be assumed, not least because skills concerning computers and digital media in general are more and more required in many occupational fields (see e.g. Ally and Prieto-Blázquez 2014), but also because the corona crisis in 2020 and the associated rapid digitalisation of university teaching show this very clearly. Notwithstanding the amount of existing research concerning media use of lecturers and students in recent years, for instance in Germany (Dolch and Zawacki-Richter 2018; Grosch 2012; Grosch and Gidion 2011; Müßig-Trapp and Willige 2006; Persike and Friedrich 2016; Schulmeister 2009; Vogel and Woisch 2013; Zawacki-Richter 2015; Zawacki-Richter et al. 2015, 2016, 2017) and internationally (Al-Husain and Hammo 2015; Dahlstrom and Bichsel 2014; Dahlstrom et al. 2011, 2013, 2015; Dahlstrom and Walker 2012; Rutherford and Standley 2016; Thompson 2013), both the use of digital media in HE settings and the impact of digital media on studying itself are still insufficiently investigated. With 2339 students in 2012 and 1327 students in 2015, from several HEIs that offered online courses and study programs at the time, Zawacki-Richter et al. conducted surveys addressing digital media usage (Zawacki-Richter et al. 2017). In 2012, only 56% of the students owned a smartphone, 86% a laptop, and 9% a tablet; in 2015, already 91% owned a smartphone, 92% a laptop, and 40% a tablet (Zawacki-Richter et al. 2017). The EDUCAUSE Center for Analysis and Research (ECAR) surveyed undergraduate students and IT between 2004 and 2015, based on 4123 students in 2004 up to 50,274 students in 2015, from HEIs in the USA and up to 15 other countries (Dahlstrom et al. 2015). Those studies have shown a similar increase in the spread of technology and the use of mobile devices in both the private and academic sectors over time (Dahlstrom and Bichsel 2014).

This increase is only one indicator of the growing relevance of digital media at HEIs and thereby of the need for research on the digital media behaviour of university students. In addition to further, mainly descriptive analyses of media use and distribution, Zawacki-Richter et al. (2015) established a media usage typology based on the 2012 survey of 2339 university students. In several subgroup analyses, they found, among other things, significant differences between male and female students (Zawacki-Richter et al. 2015). This corresponds to previous studies, such as that by Huang, Hood, and Yoo (2013), who investigated the use and acceptance of various (web 2.0) applications among 432 college students and also found significant gender differences.

Above all, in the studies mentioned, factors such as underlying motivations, emotions, self-evaluations, or self-efficacy are hardly considered, and students’ social background is not taken into account either. A notable exception is a study by Horvitz et al. (2015), who examine faculty’s self-efficacy regarding online teaching; however, this study does not take the students’ view into account. Other studies examine the interplay in the context of school education (e.g. Li et al. 2019; Sangkawetai et al. 2018). A recent study by Nouri (2018) with about 500 students at a Swedish university investigated multimodal literacy and learning design during self-studies. It finds that technology has indeed changed university students’ self-studies and knowledge building; however, it does not reveal how these multimodal learning practices affect academic success. Therefore, there is a need for a comprehensive evaluation instrument to explore the connection between students’ background factors and their study and media behaviour, including the link with academic performance.

Research regarding students’ academic performance mainly focuses on individual characteristics, often using concepts from educational psychology, but does not specifically address digital media in academic contexts. Concepts often used are, for example, the ‘expectancy-value theory of achievement motivation’ by Wigfield and Eccles (2000) or the framework of social-cognitive theory (SCT) and self-efficacy by Bandura (e.g. Bandura 1977). Most of this research is also located outside of Europe; in consequence, little is known about its transferability to the European context. Also, as this research is often characterized by low case numbers and a predominant focus on psychology students, results are often not generalizable to, or valid for, other disciplines (see for example the review studies by Bartimote-Aufflick et al. 2015; Honicke and Broadbent 2016). In addition to these shortcomings, family background and other social and contextual factors are hardly taken into account in the aforementioned research strand. These factors, in turn, are addressed in social-science research. Due to the increasing heterogeneity of students, not only at German HEIs, such studies focus for example on the identification of groups with certain characteristics that are in some ways disadvantaged in academic studies (e.g. Röwert et al. 2017). In these studies, however, the important mechanisms and variables at the individual level, which would allow for further implications regarding possible interventions, are not considered. An integrative model considering social-cognitive and individual characteristics as well as contextual or familial factors in terms of students’ performance and digital media behaviour at HEIs has been lacking so far.

Research shows that academic achievement varies between different social groups, such as migrants, students with children, or students with low socioeconomic status (SES) (Röwert et al. 2017). Often, this relation leads to lower academic achievement for those students whose parents have a lower educational background. In addition to students’ socio-economic background, students’ self-efficacy expectations and motivation are related to their academic achievement and goal setting (e.g. Komarraju and Dial 2014; Pajares 1996; Putwain et al. 2013; Schunk and Pajares 2002; Zimmerman 2000a, 2000b; Zimmerman et al. 1992). Assuming a link between social background (e.g. parents’ educational background), certain self-efficacy expectations, and behaviour in academic settings in general (Zimmerman et al. 1992; Zimmerman 2000b), the same factors might be relevant for students’ digital media behaviour; this needs to be further explored.

In sum, the present study aims to supplement current research in the field of digital media in Higher Education by developing a survey instrument that addresses the multi-faceted character of academic studies and digital media behaviour. In particular, a new scale for digital media self-efficacy expectations (DMSE) is constructed to allow for a further examination of the determinants of observable media usage patterns and their potential links to students’ social backgrounds. Our instrument is designed to comprehensively capture the relevant individual, contextual, and social factors for academic performance and therefore to lead to a deeper understanding of the mechanisms behind the disadvantage of certain student groups and the relevance of digital media in HE, and to enable further research on possible interventions. Thus, the developed evaluation instrument contributes to extending research on digital media in Higher Education. Furthermore, the focus on digital media in our research instrument complements the study of Brahm and Jenert (2015) on university students and their attitudes towards studying, which is therefore partly replicated and validated once more. However, this paper focuses exclusively on the development and validation of a survey instrument and some first descriptive insights; we therefore do not present the results of the potential analyses mentioned above. Nevertheless, we want to point out the possible applications of this instrument and promising starting points for further research. As a practical contribution, the instrument can also be used by other HEIs to evaluate their own digital media use and to determine in which ways their students benefit from (or are hindered by) using digital media, in particular concerning the disadvantage of certain groups of students.

The pre-test survey was conducted in December 2017, and the full data set was collected in summer 2018. In the following, the theoretical background of the evaluation instrument as well as the evaluation procedure is presented. In a second step, the results of the validation procedures as well as first (descriptive) results regarding university students’ digital media attitudes and behaviour are reported.

1.1 Theoretical Background

Bandura’s social cognitive theory (SCT) (e.g. Bandura 1977, 1986, 2011) offers a theoretical frame for analysing thoughts, motivation, and behaviour and therefore appears well suited to the aim of the study at hand. According to this theory, human behaviour in general is caused by personal, behavioural, and environmental influences. In a reciprocal determinism, individuals interpret the results of their performance attainments in a certain way, which in turn informs and changes their environment and their self-beliefs. This, in turn, informs and changes subsequent behaviour. One central aspect of SCT is self-efficacy, which Bandura (1986, p. 391) defines as ‘people’s judgement of their capabilities to organize and execute courses of action required to attain designated types of performances’. The higher the self-efficacy belief, the higher the effort people will expend on an activity, the longer they will persist when confronting obstacles, and the more resilient they will prove in the face of adverse situations (Pajares 1996, p. 544). In the HE context, academic self-efficacy beliefs are based on students’ perceptions of their abilities to achieve a certain goal, e.g. to complete a course or to pass an exam. This may determine the learning effort that is spent on the activities to reach such goals.

Self-efficacy expectations and behaviour in academic settings may also be linked to students’ successful integration at a higher education institution (HEI). In line with the ‘model of institutional departure’ by Tinto (1993), the failure to become or remain incorporated in the intellectual and social life of the institution is one of three crucial factors for student dropout, in addition to academic difficulties and the inability of individuals to resolve their educational and occupational goals. While incorporation in intellectual life refers to integration into the academic system, incorporation in social life refers to students’ social integration. Both integration aspects depend on the terms determined by the HEI, such as the course of studies, as well as on external factors such as social background. Although Tinto focusses on identifying courses of action for HEIs to reduce student dropout, the model, and especially the aspect of integration, may, in combination with self-efficacy expectations and other non-cognitive factors such as goal orientation, also be appropriate for describing reasons for academic achievement and behaviour in academic settings.

Due to the general behaviour-determining influence of self-efficacy expectations, it can be assumed that students’ media behaviour is influenced by media-related self-efficacy expectations. For example, the willingness to deal with new technologies, to try out new applications or digitally supported learning environments, and to stay on track even when facing difficulties depends on how much a person relies on their skills and problem-solving abilities in dealing with these technologies, in other words: their media-related self-efficacy. However, research concerning media use in HE has so far been limited to either the assessment of media applications in specific contexts (e.g. lectures, seminars) or analyses of media usage patterns for a rather broad student population (see above and the following section). In consequence, it has hardly been investigated whether, in addition to general academic behaviour, media use could also affect academic success. Furthermore, there is hardly any empirical evidence concerning the role of digital media self-efficacy for media behaviour in academic contexts, its relevance for academic performance, or its relationship with socio-economic background. Thus, the dual focus on both academic and digital media self-efficacy may be useful for further examining students’ learning behaviour and digital media use. Also, since self-efficacy expectations depend on environmental aspects that are deemed highly relevant in SCT in general, it is important to take contextual as well as social factors into account, which illustrates once more the relevance of a comprehensive survey instrument for analysing study behaviour in the digital era. This is especially true in times of rapid acceleration of digitalisation processes, such as during the corona crisis in 2020, where traditional models of academic behaviour may reach their limits.

1.2 State of Research

Research concerning the link between students’ self-efficacy expectations, motivation, and academic attainment (e.g. Komarraju and Dial 2014; Pajares 1996; Putwain et al. 2013; Schunk and Pajares 2002; Zimmerman et al. 1992) identifies self-efficacy expectations as an important predictor of academic goal setting and achievement.

For example, Bartimote-Aufflick et al. (2015) and Honicke and Broadbent (2016) found a connection between self-efficacy and study success. In line with the theoretical concept of reciprocal determinism (Bandura 1977), former experiences such as past grades in the academic context may influence subsequent self-efficacy expectations. This has also been shown in empirical studies (Klassen and Usher 2010; Lindsley et al. 1995; Talsma et al. 2018). However, for the context of physiotherapy education, Jones and Sheppard (2012) showed that previous experience was related to self-efficacy in only two distinct fields. Also relevant for the self-efficacy-achievement relation are motivation and goal orientation (Hsieh et al. 2007), because of their relevance for interest and self-regulation (Honicke and Broadbent 2016); emotions like anxiety (Hsieh et al. 2012); perceived control over actions and outcomes (Pekrun 2006); and certain personality traits like conscientiousness, due to its link to self-discipline (Lievens et al. 2009).

Furthermore, academic achievement varies between different social groups, such as migrants, students with children, or students with low socioeconomic status (SES) (Röwert et al. 2017). In this regard, research suggests that students’ SES may affect academic achievement via self-efficacy (Gecas and Schwalbe 1983; Weiser and Riggio 2010). Students from lower socioeconomic backgrounds show higher academic performance when indicating higher self-efficacy; however, such students are usually equipped with lower self-efficacy expectations (Weiser and Riggio 2010).

For investigating academic self-efficacy, two instruments in particular have proven reliable and are often used: the academic self-efficacy scale designed by Jerusalem and Schwarzer (2002) and the scale used in the Motivated Strategies for Learning Questionnaire (Duncan et al. 2015). Since we focus on German university students and developed a survey instrument in German, the academic self-efficacy scale (Jerusalem and Schwarzer 2002) seems most appropriate for our purposes.

Recent research concerning students’ digital media use shows that students who vary in, for example, age, family status, or ambitions display differing patterns of digital media use in academic settings (e.g. Grosch and Gidion 2011; Zawacki-Richter 2015; Zawacki-Richter et al. 2016). Since digital media are a global phenomenon and can have a positive impact on learning outcomes (Cavanaugh et al. 2009; Li and Ma 2010; Tienken and Wilson 2007), some relevance can also be assumed for HEIs. Under the assumption that digital media behaviour is at least partly affected by self-efficacy expectations regarding digital media (applications), a closer look at digital media self-efficacy seems promising for analysing factors of study success.

To our knowledge, there is no up-to-date and suitable scale for assessing media-related self-efficacy. Possible scales are either outdated (Compeau and Higgins 1995) or rather specific, focusing on internet search (Eastin and LaRose 2000), social media (Hocevar 2013), information search (Vishwanath 2007), or communication. The scale for media self-efficacy (Hofstetter et al. 2009), on the contrary, is too broad for our purposes. In consequence, we developed a scale for investigating DMSE that is neither too specific, in order to address a broad range of different digital media, nor too broad, in order to ensure the validity and reliability of the scale.

2 Design and Sample

In order to empirically observe determinants of students’ academic behaviour, media use, and related attitudes, we developed a standardized questionnaire. Multiple instruments are arranged in three thematic blocks to capture self-efficacy expectations as well as emotions, motivation, media-usage behaviour, and socioeconomic factors. To this end, we chose instruments that either are research standards in the subject area or were constructed based on such standards, as briefly described below.

2.1 Instrument Design

Tables 1 and 2 list the scales addressing attitudes, motivation, and behaviour either in the general academic context (Table 1) or in relation to digital media (Table 2), together with an example item for each scale. We consistently used 7-point Likert scales for all psychometric measurements, e.g. ranging from ‘totally disagree’ to ‘totally agree’.

Evaluation concerning studying (including emotions, motivation, and attitudes) is undertaken by partly adapting the scales for the ‘assessment of students’ attitudes towards studying’ (Brahm and Jenert 2015) and CHE-Quest (Leichsenring 2011), which includes a scale for integration (Tinto 1993). The instruments used in Brahm and Jenert (2015) appear well suited to the present research project since they address attitudes towards the university as an institution, and therefore the students’ social and contextual environment, as well as attitudes towards studying (e.g. support from important people, emotions of anxiety and joy), while taking into account self-efficacy expectations and attitudes towards learning (e.g. autonomy in learning processes). Academic self-efficacy expectation (ASE) is measured with the corresponding instrument by Jerusalem and Schwarzer (2002). Additionally, instruments for intrinsic motivation and extrinsic goal orientation as well as perceived academic achievement are included from Brahm and Jenert (2015). To measure the big five personality traits, we also included the BFI-10, a 10-item scale with two items each for the dimensions extraversion, agreeableness, conscientiousness, emotional stability, and openness (Rammstedt et al. 2013).

The aim and frequency of students’ media use and different attitudes regarding those media are assessed using established instruments by Grosch and Gidion (2011) and Zawacki-Richter (2015). Computers, tablets, and smartphones are treated as digital media equipment, while software tools (e.g. for text or spreadsheet processing, picture editing), research tools, search engines, and other online media tools are summarized as digital media itself. The measurement of attitudes towards digital media is divided into several units, dealing with the overall evaluation of, for example, usefulness and concerns (e.g. privacy and data security concerns), as well as the evaluation of usefulness with regard to academic studies.

Based on the general self-efficacy scale by Schwarzer and Jerusalem (2010), a new scale for DMSE is constructed to capture students’ media-related self-efficacy. Self-assessed knowledge and skills regarding digital media applications are included as well.

Constructs and scales for age, educational qualification, nationality, occupational status, and income are based on a study conducted at a German university in 2014 (Lang and Hillmert 2014). In line with this study, parental characteristics (e.g. educational qualification) are measured as well. The data will allow classifying the respondents’ socioeconomic status according to the ‘International Socio-Economic Index of Occupational Status (ISEI)’ (Ganzeboom and Treiman 2003).

In addition, gender, subject of study, and the number of semesters are recorded, which will allow for subgroup analyses regarding gender, subject, and study experience, in addition to the four university contexts.

In summary, the questionnaire includes scales for self-efficacy, goal orientation, emotions associated with studying, media use and related attitudes, and demographic factors. All in all, this comprehensive instrument allows data to be collected for analyses of the complex relationships described above. Since the psychometric quality of the instrument must be guaranteed for these analyses, the instrument’s validity and reliability are the focus of the study at hand.

To ensure the adequacy of the chosen scales and items, a first version of the developed instrument was given to experts in the field of educational research: three professors and four graduate students. After implementing the received feedback, the revised version of the online questionnaire was given to two undergraduate students to finally check wording, comprehensibility, and processing time. Both the pre-test version and the final version of the questionnaire have been approved by the ethics committee for psychological research at university Q.

We decided to recruit participants partly via mass emails inviting them to participate voluntarily. Due to this nonprobability sampling method, the representativeness of the collected sample was not guaranteed. However, our goal was to reach a wide variety of participants from all faculties of the universities, not a representative sample. To attract more participants, we also used flyers and posters, information screens (e.g. in the library), and announcements in lectures. We also raffled vouchers and prizes such as iPads and speakers.

The pre-test survey was conducted using the online questionnaire at three HEIs in Switzerland and Germany at the end of 2017. In total, about 2000 students of different subjects received a link to the questionnaire. A total of 171 responses was gained (response rate approximately 8.6%), of which 63 respondents answered every question of the survey.

The data of the main study was collected in the summer term of 2018 at four German universities, X, Y, Z, and Q. In sum, 135,464 enrolled students were addressed and 3342 participated (response rate approx. 2.5%). The number of participants who completed the whole questionnaire is 1925 (response rate for complete questionnaires: 1.4%). Table 3 shows the absolute number of responses per HEI. The statistics reported below refer to the data of the main study but are roughly consistent with those found for the pre-test data unless otherwise stated.

The proportion of female participants in the sample is 59.6%; 39.6% are male. The students’ mean age is M = 24.03 (SD = 4.01, min = 18, max = 59). As expected, our sample is not representative of the German student population in general, since female students are slightly overrepresented.

We purposefully addressed students enrolled in a multitude of different subjects in order to reach enough variability in our sample to establish the instrument’s psychometric quality. In future studies, it would therefore be possible to analyse sub-populations in order to complement other studies (e.g. Zawacki-Richter et al. 2015), which, for instance, did not include humanities students. Also, in contrast to those studies, we focus on traditional students, since our data was collected in 2018 at universities without a considerable number of distance or online education programs. Thus, the data allows exploring the digital media usage patterns of a more general, heterogeneous student population, while highlighting differences by study subject and gender.

In our analyses (see below), we included only cases with at least 50 completed pages of the 119-page questionnaire, resulting in 2,039 cases. However, some analyses required complete cases, so the number of cases was further reduced by missing values in some instances, as noted below. To establish construct validity and reliability, exploratory and confirmatory factor analyses as well as correlation and internal consistency analyses were applied to the pre-test data and to the full-scale data. The data analyses were conducted in R (R Core Team 2019).
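As a minimal illustration of this inclusion criterion in R, the filtering step might look as follows; the data frame survey and the progress variable pages_completed are hypothetical names, not taken from the authors' actual analysis scripts.

    # Keep only cases with at least 50 of the 119 questionnaire pages completed
    # (variable names are illustrative placeholders)
    analysis_sample <- subset(survey, pages_completed >= 50)
    nrow(analysis_sample)                 # 2,039 cases in the study at hand

    # Analyses requiring complete cases additionally drop rows with missing values
    complete_sample <- na.omit(analysis_sample)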

3.1 Internal Consistency Analyses

Every Cronbach's Alpha value for the main study data (see supplementary material for the pre-test results) is above the common threshold of 0.70 (see Table 4). Any item showing an item-scale correlation lower than 0.50 was either dropped from the main survey or excluded afterward.
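As a sketch of how such a reliability screening can be run with the psych package in R; the item data frame ase_items is a hypothetical placeholder, since the paper does not publish its analysis code.

    library(psych)

    # Cronbach's Alpha and item statistics for one scale,
    # e.g. the academic self-efficacy (ASE) items
    rel <- psych::alpha(ase_items)

    rel$total$raw_alpha                  # scale reliability; should exceed 0.70
    rel$item.stats$raw.r                 # item-scale correlations

    # Flag items whose item-scale correlation falls below 0.50
    rownames(rel$item.stats)[rel$item.stats$raw.r < 0.50]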

Both self-efficacy scales show high values for Cronbach's Alpha (ASE α = 0.92; DMSE α = 0.92) and high item-scale correlations (r > 0.69) for each item in the full-scale survey (Footnote 5). However, to further reduce the number of items, two items that referred to similar aspects of self-efficacy as two remaining items (explicit problem-solving and reaction to surprising situations) were excluded from the revised instrument. An additional item was excluded from the main study data because of its low item-scale correlation (< 0.50). Each scale thus consists of seven instead of ten items. The mean scores of the two self-efficacy dimensions show only a moderate correlation of r = 0.38. Moreover, a joint exploratory factor analysis of the 20 original items of both scales in the pre-test data clearly yields the two dimensions of ASE and DMSE. Therefore, based on both the pre-test data and the full-scale data, the scales appear to provide valid and reliable measurements of two distinct self-efficacy dimensions.

3.2 Validity Analyses

3.2.1 Exploratory Factor Analyses

For the pre-test data, the attitude and motivation scales taken from Brahm and Jenert (2015), the CHE-Quest scales, and the two scales for ASE and DMSE were tested for construct validity by conducting several exploratory factor analyses (EFA) (Footnote 6). Due to the large number of expected factors, a single factor analysis containing all items was not useful. Instead, the analyses were carried out at the construct level. However, theoretically close constructs, e.g. intrinsic motivation and enjoyment in studying, were also checked in a comprehensive factor analysis including items of more than one construct.

First, each of the scales was evaluated in terms of the Kaiser–Meyer–Olkin (KMO) measure of sampling adequacy (KMO > 0.60) and Bartlett's test of sphericity. The factor models themselves were evaluated in terms of scree plots, factor loadings (higher than 0.30), parallel analyses (comparison to the structure of random data), and in accordance with the theoretically expected factor structure and its interpretability. For each scale, items that showed loadings greater than 0.30 on more than one factor were excluded from interpretation and from the revised survey instrument used for the full-scale data collection (see also supplementary material, Table A1). Still, it was ensured that at least three items remained for every uni-dimensional scale.
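The following sketch shows how these checks are commonly run with the psych package in R, assuming a data frame scale_items that holds the items of one construct (a hypothetical name); the extraction and rotation settings follow Footnote 6.

    library(psych)        # oblimin rotation additionally requires GPArotation

    # Sampling adequacy and sphericity checks before the EFA
    psych::KMO(scale_items)              # overall MSA should exceed 0.60
    psych::cortest.bartlett(cor(scale_items, use = "pairwise.complete.obs"),
                            n = nrow(scale_items))

    # Parallel analysis compares observed eigenvalues with those of random data
    psych::fa.parallel(scale_items, fm = "minres", fa = "fa")

    # EFA with MINRES extraction and Oblimin rotation
    efa <- psych::fa(scale_items, nfactors = 2, fm = "minres", rotate = "oblimin")
    print(efa$loadings, cutoff = 0.30)   # suppress loadings below 0.30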

Most of the scales showed the expected one-factor structure. However, contrary to our theoretical expectation, the factor structure of the original social integration scale (Leichsenring 2011) could not be replicated: based on the pre-test data, a two-factor solution was suggested. These factors might roughly be interpreted as social and academic integration, since the items of one factor refer to private activities, partly with fellow students (e.g. private meetings, attendance of parties), while those of the other factor refer to learning activities with classmates (e.g. group learning activities). The scale was left unchanged for the full-scale data collection and the exploratory factor analysis was repeated. For the full-scale data, a three-factor solution seems most appropriate based on the EFA; the three factors distinguish between social integration and academic integration, as suggested above, and additionally free-time activities. As a consequence, only those items that could be uniquely identified as reflecting social integration, both theoretically and empirically, for the pre-test as well as the full-scale data, will be considered in further analyses.

3.2.2 Confirmatory Factor Analyses

The final 11-factor solution was validated for the full-scale data by conducting a confirmatory factor analysis (CFA). The results confirm a good fit of the model, based on the comparative fit index (CFI > 0.95), the Tucker–Lewis index (TLI > 0.95), the root mean square error of approximation (RMSEA < 0.05), and the standardized root mean square residual (SRMR < 0.05) (Brown 2015). The fit indices are reported in Table 5 (see also Output 1, supplementary material).
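A minimal sketch of such a CFA with the lavaan package in R is shown below. Only two of the eleven factors are spelled out, and the item names (ase1 ..., dmse1 ...) are placeholders rather than the instrument's actual variable names.

    library(lavaan)

    # Measurement model (two of the eleven factors shown for brevity)
    model <- '
      ASE  =~ ase1 + ase2 + ase3 + ase4 + ase5 + ase6 + ase7
      DMSE =~ dmse1 + dmse2 + dmse3 + dmse4 + dmse5 + dmse6 + dmse7
    '

    fit <- cfa(model, data = analysis_sample)

    # Fit indices used in the paper: CFI/TLI above 0.95, RMSEA/SRMR below 0.05
    fitMeasures(fit, c("cfi", "tli", "rmsea", "srmr"))

lavaan's default maximum likelihood estimator is assumed here; the paper does not state which estimator was used.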

3.2.3 Correlation Analyses

Table 6 reports the Pearson product-moment correlation coefficients of the test scores for the scales, allowing a check of whether they are in line with the results reported by Brahm and Jenert (2015).

The nomological validity of ASE is shown by the expected correlations with other constructs (see Sect. 1.2), e.g. a negative correlation with anxiety when studying (r = −0.52) and a positive correlation with motivation (r = 0.39). Motivation, in turn, is positively correlated with enjoyment (r = 0.84), active participation in classes (r = 0.31), and identification with the university (r = 0.43). Students who are supported by their personal environment (subjective norm) in studying at their university also show a high identification with this institution (r = 0.45). Altogether, these results confirm the findings reported by Brahm and Jenert (2015).

Additional correlation analyses (not included in Table 6) of DMSE, self-assessed digital media knowledge/skills, and the frequency of usage of certain media applications also appear plausible. Self-assessed skills in e-learning applications or programming are moderately correlated with DMSE (r = 0.35 for e-learning; r = 0.37 for programming), whereas research in library catalogues or general online literature research are less strongly correlated with DMSE (r = 0.14 for library research; r = 0.20 for general literature research), to name only a few examples. Since the former presumably require more elaborate skills in dealing with these media applications, a stronger correlation with a perceived high ability to deal with digital media in general (high DMSE) seems reasonable. In contrast, less elaborate skills are probably needed when dealing with literature databases or library online services, so the ability to successfully face difficulties with digital media applications is less relevant and the link is weaker.

Similar results occur when analysing the relationship between DMSE and the frequency of usage of certain media applications, such as cloud services (r = 0.29) or cooperation tools (r = 0.21), compared to no visible relationship at all between DMSE and the usage of library services (r = 0.02) or the university webpages (r = 0.03). Again, this can be explained by the different demands on the skills required, with higher skills needed for online tools than for browsing simple webpages. In sum, the relationship of DMSE with self-assessed capabilities on the one hand and with the frequency of use of certain media applications on the other seems to vary with the requirement level of these applications, which appears plausible and thereby underlines the validity of the DMSE scale.
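Such pairwise checks reduce to ordinary Pearson correlations of scale means. A sketch in R follows, with a hypothetical data frame scores whose column names are illustrative only; the commented values echo the coefficients reported above.

    # Correlations of DMSE with self-assessed skills and usage frequencies
    cor(scores$dmse, scores$skill_elearning, use = "pairwise.complete.obs")  # ~0.35 reported
    cor(scores$dmse, scores$use_library,     use = "pairwise.complete.obs")  # ~0.02 reported

    # Single pair with a significance test
    cor.test(scores$dmse, scores$skill_programming)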

3.3 Distributions of Self-Efficacy Scales and Descriptive Results

Figure 1a, b shows the distributions of the two self-efficacy dimensions. In both cases, the skewness deviates from a normal distribution, with a slight tendency to the right. Still, the data show a reasonable amount of variance, concentrated in the middle area of the Likert scale. Thus, multivariate analysis procedures such as regression analyses seem applicable.

[Figure 1: Histograms of test scores (average item scores) for (a) Academic Self-Efficacy and (b) Digital Media Self-Efficacy (n = 1955)]
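The distribution check itself is straightforward to reproduce in R; the data frame scores is again a hypothetical placeholder for the computed test scores.

    library(psych)

    # Histograms of the average item scores (cf. Figure 1)
    hist(scores$ase,  main = "Academic Self-Efficacy",      xlab = "Mean item score")
    hist(scores$dmse, main = "Digital Media Self-Efficacy", xlab = "Mean item score")

    # Skewness as a numerical check; values near 0 indicate symmetry
    psych::skew(scores$ase)
    psych::skew(scores$dmse)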

Based on the pre-test and full-scale data, first descriptive results show the aims and frequency of students' media use. Nearly every respondent owns a laptop (94%) and a smartphone (96%) and has internet access (99%), while a tablet PC is owned by 45% of the students. According to the data, these mobile devices are mostly used on campus, for example, to look something up online, for online research (for study purposes), for text messages, or for e-mails to lecturers. The usage varies by device type: smartphones are used predominantly for communication and search purposes, whereas laptops are used for access to university platforms, internet search, and writing tasks (e.g. assignments). In contrast, location-based services, taking pictures, posting content, and communication on learning management systems (LMS) are rarely mentioned by the respondents. In terms of specific media applications, text-processing software, search engines, the university e-mail account, chats, and e-books are used very frequently, while, for example, massive open online courses (MOOCs), blogs, e-portfolios, and Twitter are seldom used, if at all. These many different media applications, and the differences in frequency of use and relatedness to study purposes, allow for a more detailed analysis of media usage types and their link to study performance in subsequent investigations.

4 Discussion

Different instruments are available to assess student teaching and learning; however, instruments assessing the different facets of academic studies in the context of digitalisation are rare and currently more relevant than ever. This study therefore aimed to develop a comprehensive survey instrument that addresses the multi-faceted character of academic studies and digital media behaviour. Existing survey instruments such as the ASAtS (Brahm and Jenert 2015), the CHE-Quest (Leichsenring 2011), and scales regarding students' media use and attitudes (Grosch and Gidion 2011; Zawacki-Richter 2015) were combined to assess all aspects of students' studying. Students' self-efficacy expectations regarding digital media use, however, had not been assessed by any of these established scales; it was consequently necessary to develop a new scale building upon the established self-efficacy scale by Jerusalem and Schwarzer (2002).

With both the pre-test data and the main study data, the instrument proved to yield valid and reliable results. In accordance with Bandura's SCT (Bandura 2011) and in line with other empirical research, the study showed positive correlations of students' self-efficacy with motivation and goal orientation (Hsieh et al. 2007), interest (Honicke and Broadbent 2016), and study success (Bartimote-Aufflick et al. 2015; Honicke and Broadbent 2016). Likewise, the expected negative correlation with anxiety (Hsieh et al. 2012) was confirmed. These correlations were shown both in the study by Brahm and Jenert (2015) and in our data, thus indicating the nomological validity of the survey instruments. However, we could not replicate the social integration scale according to Leichsenring (2011) with our available data; furthermore, the pre-test and main surveys led to different results. For future analyses, therefore, only those three items that can be clearly assigned to a scale for social integration should be considered.

In addition to this replication, a new scale for DMSE was developed and could be separated from the well-established construct of students' academic self-efficacy (Jerusalem and Schwarzer 2002). Correlation analyses of the relationship of DMSE with self-assessed capabilities and with the frequency of usage of certain media applications show a varying strength of association depending on the requirement level of the application in question, which appears plausible. Our results therefore point to a valid scale for DMSE as well.

Another important result concerns the aims and frequency of students' media use, which match those reported by Zawacki-Richter et al. (2017). Compared to the findings of surveys in 2012 and 2015 (Grosch 2012; Zawacki-Richter 2015), we found a further increase in the dissemination of smartphones, laptops, and tablets, with 96% of our student sample owning a smartphone, 94% a laptop, and 45% a tablet. This increase again highlights the growing relevance of digital media at universities and thereby of research on the digital media behaviour of university students.

4.1 Limitations

It is always challenging to obtain a sufficiently large sample; larger samples lead to more reliable results and strengthen tests of instrument validity. This scale development is based on a small pre-test and a larger main study, so the validity and reliability of the newly developed scale should still be examined in further research. The results are already promising; however, the scale has so far only been used at one Swiss and six German HEIs. Further research will be required to replicate and extend these findings in diverse contexts, i.e. different kinds of HEIs such as universities of applied sciences or colleges, in order to establish the broad applicability of the scale. In addition to institutional diversity, the study could also be replicated outside the German-speaking higher education context.

The most notable limitation is the exclusive use of self-reported data in the questionnaire. In the meta-analysis by Kuncel, Credé, and Thomas (2005), the reliability of self-reported grades was found to depend on students' actual school performance: self-reported grades are appropriate measures of actual grades for students with good grades, but not for those with low grades. In consequence, other measures may be needed.

Furthermore, the validity of the instrument must be considered preliminary, since this study focused on evaluating the factor structure, internal consistency, concurrent validity, and divergent validity of the newly developed scale. Future research should include a more comprehensive evaluation, such as test–retest reliability and criterion-related validity. Since convergent and discriminant validity could be established, criterion-related validity can be assumed; however, it still needs to be checked, ideally using longitudinal data.

4.2 Research and Practical Implications

Since the prevalent use of digital media for study purposes is confirmed in this study, it is necessary to extend existing instruments on teaching and learning (e.g. Lemos et al. 2011) by integrating the usage of digital media as well as self-efficacy beliefs concerning digital media. Furthermore, our study confirms the importance of our initial overarching questions. As we assume that students' social backgrounds, e.g. their parents' educational background, are linked to their DMSE and to their behaviour in academic settings in general (Zimmerman et al. 1992; Zimmerman 2000b), it should be investigated whether students with higher DMSE can better accomplish study demands. Accordingly, research is needed on whether students' individual and subject-specific academic and media behaviour depends on other measured constructs such as motivation, attitudes, and socio-economic background. With regard to media behaviour in particular, major subject-related differences can be expected, for example between engineering and the humanities, as can gender differences, especially in students' self-efficacy expectations (Huang 2013; Pintrich and Schunk 2002; Schunk and Pajares 2002; Zawacki-Richter 2015).

Since the instrument has good psychometric properties, it can be recommended for application at other HEIs to examine students' digital media usage and study behaviour. Selected parts of the instrument can also be used on their own, depending on the research question at hand. In the summer semester 2020, for example, the instrument is being used to conduct a longitudinal study on changes in media-related behaviour, attitudes, and self-efficacy resulting from the (at least partially) digitalised teaching during the corona crisis. Furthermore, the use and acceptance of learning platforms and campus information systems at universities could be evaluated to guide quality enhancement processes at HEIs. In this context, a study using the instrument could give insights into the purposes for which widespread digital devices, such as smartphones and laptops, are actually used. If an HEI, for instance, intends to increase the use of an LMS as part of students' learning environment, and if students often access the LMS with their smartphones, it is advisable to make the LMS fully usable on smartphones. Based on the assumption that a high level of media-related and academic self-efficacy motivates students and promotes their goal orientation (van Dinther et al. 2011), interventions for enhancing students' academic and digital media self-efficacy expectations, potentially focusing especially on disadvantaged students, could be developed and evaluated using the presented scales (see Brahm and Pumptow 2020).

In particular, the role of DMSE should be addressed in future research, since digital media are prevalent in teaching and learning. A possible approach would be to model different clusters of media usage types based on the data concerning digital media. This would allow for further analysis of students' media behaviour in line with the concepts and dimensions of media usage by Johnsson-Smaragdi (2005; see also Zawacki-Richter 2015). Such an analysis could provide insights into the determinants of digital media behaviour and its relevance for academic achievement.
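As one possible realisation of this suggestion, and not the authors' own analysis, media usage types could be derived with a simple k-means clustering of the standardised frequency items; all names below are hypothetical.

    # Standardise the media-frequency items and cluster the respondents
    usage <- scale(na.omit(media_frequency_items))

    set.seed(1)                          # reproducible cluster assignment
    types <- kmeans(usage, centers = 4, nstart = 25)

    table(types$cluster)                 # sizes of the resulting usage types
    aggregate(as.data.frame(usage),
              by = list(type = types$cluster), FUN = mean)  # type profiles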

5 Conclusion

Overall, this study contributes to the further differentiation of established scales for evaluating students' behaviour (e.g. Brahm and Jenert 2015; Lemos et al. 2011). Above all, it adds a new perspective to this research stream by developing a new scale to assess DMSE, which appears to be an important construct for further exploring students' use of digital media in HEIs. In this respect, the research also makes an initial contribution to extending Bandura's SCT towards digital media usage in the context of higher education: academic self-efficacy and digital media self-efficacy can be conceptualized as two separate constructs. This distinction could, in turn, be used to differentiate more clearly between various student groups and to better cater to their respective needs in teaching and learning. In summary, the instrument can be recommended for wider application at other HEIs to find out more about students' digital media usage and its implications for other facets of learning.

Availability of Data and Materials

The datasets generated and analysed during the current study are not yet publicly available; however, they are currently being prepared for publication at the Research Data Centre for Higher Education Research and Science Studies (FDZ DZHW). They are available from the corresponding author upon reasonable request.

Footnotes

1. ‘CHE’ stands for ‘Centrum für Hochschulentwicklung’ (Center for Higher Education Development) and ‘Quest’ is used as an abbreviation for ‘questionnaire’.

2. The students reported a processing time of 45–50 min, which is in line with the calculated average processing time of the pre-test online survey. After the pre-test, the questionnaire was shortened to a processing time of about 30–35 min.

3. All respondents were asked to give their consent to the scientific use and publication of their data before filling in the questionnaire.

4. The proportion of female students in Germany in 2017 was 48%, compared to 60% in our sample (Statistisches Bundesamt 2018).

5. Due to the higher number of items on both self-efficacy scales compared to the other scales (seven instead of three), we also calculated Cronbach's Alpha values for fewer items. With three items, we found an Alpha of 0.86 for both the ASE and the DMSE scale.

6. Because of the data limitations and the nature of the psychological constructs, the minimum residual (MINRES) method was used for factor extraction and Oblimin rotation was applied to the resulting factors (see Izquierdo et al. 2014).

References

Al-Husain, D., & Hammo, B. H. (2015). Investigating the readiness of college students for ICT and mobile learning: A case study from King Saud University. International Arab Journal of E-Technology, 4 (1), 48–55.


Ally, M., & Prieto-Blázquez, J. (2014). What is the future of mobile learning in education? International Journal of Educational Technology in Higher Education, 11 (1), 142–151. https://doi.org/10.7238/rusc.v11i1.2033 .


Bandura, A. (1977). Self-efficacy: Toward a unifying theory of behavioral change. Psychological Review, 84 (2), 191–215.

Bandura, A. (1986). Social foundations of thought and action. A social cognitive theory . Englewood Cliffs: Prentice Hall.

Bandura, A. (2011). Social cognitive theory. In P. A. M. Van Lange, A. W. Kruglanski, & E. T. Higgins (Eds.), Handbook of social psychological theories (pp. 349–373). Los Angeles: Sage.

Bartimote-Aufflick, K., Bridgeman, A., Walker, R., Sharma, M., & Smith, L. (2015). The study, evaluation, and improvement of university student self-efficacy. Studies in Higher Education, 41 (11), 1918–1942. https://doi.org/10.1080/03075079.2014.999319 .

Brahm, T., & Jenert, T. (2015). On the assessment of attitudes towards studying—Development and validation of a questionnaire. Learning and Individual Differences , 43 , 233–242. https://doi.org/10.1016/j.lindif.2015.08.019 .

Brahm, T., & Pumptow, M. (2020). Förderung von (medienbezogener) Selbstwirksamkeit an Hochschulen. In S. Hofhues, M. Schiefner-Rohs, S. Aßmann, & T. Brahm (Eds.), Studierende—Medien—Universität: Einblicke in studentische Medienwelten . Waxman: Münster.

Brown, T. A. (2015). Confirmatory factor analysis for applied research . New York: Guilford Publications.

Cavanaugh, C. S., Barbour, M. K., & Clark, T. (2009). Research and practice in K-12 online learning: A review of open access literature. International Review of Research in Open and Distance Learning . https://doi.org/10.19173/irrodl.v10i1.607 .

Compeau, D. R., & Higgins, C. A. (1995). Computer self-efficacy: development of a measure and initial test. MIS Quarterly, 19 (2), 189. https://doi.org/10.2307/249688 .

Dahlstrom, E., & Bichsel, J. (2014). ECAR study of undergraduate students and information technology, 2014. https://library.educause.edu/%7E/media/files/library/2014/10/ers1406-pdf.pdf?la=en . Accessed 23 July 2020.

Dahlstrom, E., Boor, T. de, Grunwald, P., & Vockley, M. (2011). ECAR study of undergraduate students and information technology, 2011. https://library.educause.edu/resources/2011/10/%7E/media/files/library/2011/10/ers1103w-pdf.pdf . Accessed 23 July 2020.

Dahlstrom, E., Brooks, D. C., Grajek, S., & Reeves, J. (2015). ECAR study of undergraduate students and information technology, 2015. https://library.educause.edu/%7E/media/files/library/2015/8/ers1510ss.pdf?la=en . Accessed 23 July 2020.

Dahlstrom, E., & Walker, J. D. (2012). ECAR study of undergraduate students and information technology, 2012. https://library.educause.edu/%7E/media/files/library/2012/9/ers1208.pdf?la=en . Accessed 23 July 2020.

Dahlstrom, E., Walker, J. D., & Dziuban, C. (2013). ECAR study of undergraduate students and information technology, 2013. https://library.educause.edu/%7E/media/files/library/2013/9/ers1302-pdf.pdf?la=en . Accessed 23 July 2020.

Dolch, C., & Zawacki-Richter, O. (2018). Are students getting used to learning technology? Changing media usage patterns of traditional and non-traditional students in higher education. Research in Learning Technology . https://doi.org/10.25304/rlt.v26.2038 .

Duncan, T., Pintrich, P., Smith, D., & McKeachie, W. (2015). Motivated Strategies for Learning Questionnaire (MSLQ) manual. https://www.researchgate.net/publication/280741846_Motivated_Strategies_for_Learning_Questionnaire_MSLQ_Manual. https://doi.org/10.13140/RG.2.1.2547.6968. Accessed 23 July 2020.

Eastin, M. S., & LaRose, R. (2000). Internet self-efficacy and the psychology of the digital divide. Journal of Computer-Mediated Communication . https://doi.org/10.1111/j.1083-6101.2000.tb00110.x .

Ganzeboom, H. B. G., & Treiman, D. J. (2003). Three internationally standardised measures for comparative research on occupational status. In J. H. P. Hoffmeyer-Zlotnik & C. Wolf (Eds.), Advances in cross-national comparison: A European working book for demographic and socio-economic variables (pp. 159–193). Boston: Springer.


Gecas, V., & Schwalbe, M. L. (1983). Beyond the looking-glass self: Social structure and efficacy-based self-esteem. Social Psychology Quarterly, 46 (2), 77. https://doi.org/10.2307/3033844 .

Grosch, M. (2012). Mediennutzung im Studium: Eine empirische Untersuchung am Karlsruher Institut für Technologie . Zugl.: Karlsruhe, Karlsruher Inst. für Technologie, Diss., 2011 u.d.T.: Grosch, Michael: Phänomene und Strukturen der Mediennutzung im Studium. Aachen: Shaker.

Grosch, M., & Gidion, G. (2011). Mediennutzungsgewohnheiten im Wandel: Ergebnisse einer Befragung zur studiumsbezogenen Mediennutzung : KIT Scientific Publishing.

Hocevar, K. P. (2013). What is social about social media users? How social media efficacy impacts information evaluation online . Santa Barbara: University of California.

Hofstetter, C. R., Zuniga, S., & Dozier, D. M. (2009). Media self-efficacy: Validation of a new concept. Mass Communication and Society, 4 (1), 61–76. https://doi.org/10.1207/S15327825MCS0401_05 .

Honicke, T., & Broadbent, J. (2016). The influence of academic self-efficacy on academic performance: A systematic review. Educational Research Review, 17 , 63–84. https://doi.org/10.1016/j.edurev.2015.11.002 .

Horvitz, B. S., Beach, A. L., Anderson, M. L., & Xia, J. (2015). Examination of faculty self-efficacy related to online teaching. Innovative Higher Education, 40 (4), 305–316. https://doi.org/10.1007/s10755-014-9316-1 .

Hsieh, P.-H., Sullivan, J. R., & Guerra, N. S. (2007). A Closer look at college students: self-efficacy and goal orientation. Journal of Advanced Academics, 18 (3), 454–476. https://doi.org/10.4219/jaa-2007-500 .

Hsieh, P.-H., Sullivan, J. R., Sass, D. A., & Guerra, N. S. (2012). Undergraduate engineering students’ beliefs, coping strategies, and academic performance: An evaluation of theoretical models. The Journal of Experimental Education, 80 (2), 196–218. https://doi.org/10.1080/00220973.2011.596853 .

Huang, C. (2013). Gender differences in academic self-efficacy: a meta-analysis. European Journal of Psychology of Education, 28 (1), 1–35. https://doi.org/10.1007/s10212-011-0097-y .

Huang, W.-H. D., Hood, D. W., & Yoo, S. J. (2013). Gender divide and acceptance of collaborative Web 2.0 applications for learning in higher education. The Internet and Higher Education, 16 , 57–65. https://doi.org/10.1016/j.iheduc.2012.02.001 .

Izquierdo, I., Olea, J., & Abad, F. J. (2014). Exploratory factor analysis in validation studies: Uses and recommendations. Psicothema, 26 (3), 395–400.

Jerusalem, M., & Schwarzer, R. (2002). Das Konzept der Selbstwirksamkeit. In M. Jerusalem & D. Hopf (Eds.), Zeitschrift für Pädagogik. Beiheft: Vol. 33. Selbstwirksamkeit und Motivationsprozesse in Bildungsinstitutionen (pp. 28–53). Weinheim: Beltz.

Johnsson-Smaragdi, U. (2005). Models of change and stability in adolescents media use. In K. E. Rosengren (Ed.), Communication and society. Media effects and beyond: Culture, socialization and lifestyles (pp. 127–186). London, New York: Routledge/Taylor & Francis.

Jones, A., & Sheppard, L. (2012). Developing a measurement tool for assessing physiotherapy students’ self-efficacy: a pilot study. Assessment and Evaluation in Higher Education, 37 (3), 369–377. https://doi.org/10.1080/02602938.2010.534765 .

Klassen, R. M., & Usher, E. L. (2010). Self-efficacy in educational settings: Recent research and emerging directions. In T. C. Urdan & S. A. Karabenick (Eds.), The decade ahead: Theoretical perspectives on motivation and achievement (pp. 1–33). Bingley: Emerald Group Publishing Limited.

Komarraju, M., & Dial, C. M. (2014). Academic identity, self-efficacy, and self-esteem predict self-determined motivation and goals. Learning and Individual Differences, 32 , 1–8. https://doi.org/10.1016/j.lindif.2014.02.004 .

Kuncel, N. R., Credé, M., & Thomas, L. L. (2005). The validity of self-reported grade point averages, class ranks, and test scores: A meta-analysis and review of the literature. Review of Educational Research, 75 (1), 63–82. https://doi.org/10.3102/00346543075001063 .

Lang, V., & Hillmert, S. (2014). CampusPanel user handbook V1.1: Documentation for the student panel of the ScienceCampus Tuebingen (wave 'a'). Tübingen: Institut für Soziologie.

Leichsenring, H. (2011). CHE-Quest: Ein Fragebogen zum Adaptionsprozess zwischen Studierenden und Hochschule. Entwicklung und Test des Fragebogens. https://d-nb.info/101390978X/34 . Accessed 23 July 2020.

Lemos, M. S., Queirós, C., Teixeira, P. M., & Menezes, I. (2011). Development and validation of a theoretically based, multidimensional questionnaire of student evaluation of university teaching. Assessment and Evaluation in Higher Education, 36 (7), 843–864. https://doi.org/10.1080/02602938.2010.493969 .

Li, Y., Garza, V., Keicher, A., & Popov, V. (2019). Predicting high school teacher use of technology: Pedagogical beliefs, technological beliefs and attitudes, and teacher training. Technology, Knowledge and Learning, 24 (3), 501–518. https://doi.org/10.1007/s10758-018-9355-2 .

Li, Q., & Ma, X. (2010). A meta-analysis of the effects of computer technology on school students’ mathematics learning. Educational Psychology Review, 22 (3), 215–243.

Lievens, F., Ones, D. S., & Dilchert, S. (2009). Personality scale validities increase throughout medical school. Journal of Applied Psychology, 94 (6), 1514.

Lindsley, D. H., Brass, D. J., & Thomas, J. B. (1995). Efficacy-performing spirals: A multilevel perspective. The Academy of Management Review, 20 (3), 645–678.

Müßig-Trapp, P., & Willige, J. (2006). Lebensziele und Werte Studierender. https://www.hisbus.de/intern/pdf/2006_hisbus14.pdf . Accessed 23 July 2020.

Nouri, J. (2018). Students' multimodal literacy and design of learning during self-studies in higher education. Technology, Knowledge and Learning. https://doi.org/10.1007/s10758-018-9360-5 .

Pajares, F. (1996). Self-efficacy beliefs in academic settings. Review of Educational Research, 66 (4), 543–578. https://doi.org/10.2307/1170653 .

Pekrun, R. (2006). The control-value theory of achievement emotions: Assumptions, corollaries, and implications for educational research and practice. Educational Psychology Review, 18 (4), 315–341.

Persike, M., & Friedrich, J.‑D. (2016). Lernen mit digitalen Medien aus Studierendenperspektive: Sonderauswertung aus dem CHE Hochschulranking für die deutschen Hochschulen. Arbeitspapier des Hochschulforum Digitalisierung. https://hochschulforumdigitalisierung.de/sites/default/files/dateien/HFD_AP_Nr_17_Lernen_mit_digitalen_Medien_aus_Studierendenperspektive.pdf . Accessed 23 July 2020.

Pintrich, P. R., & Schunk, D. H. (2002). Motivation in education . Englewood Cliffs: Merrill Prentice Hall.

Putwain, D., Sander, P., & Larkin, D. (2013). Academic self-efficacy in study-related skills and behaviours: Relations with learning-related emotions and academic success. The British Journal of Educational Psychology, 83 (Pt 4), 633–650. https://doi.org/10.1111/j.2044-8279.2012.02084.x .

Rammstedt, B., Kemper, C. J., Klein, M. C., Beierlein, C., & Kovaleva, A. (2013). Eine kurze Skala zur Messung der fünf Dimensionen der Persönlichkeit: Big-Five-Inventory-10 (BFI-10).

R Core Team (2019). R: A language and environment for statistical computing (Computer software). Vienna, Austria: R Foundation for Statistical Computing. https://www.R-project.org/ . Accessed 23 July 2020.

Röwert, R., Lah, W., Dahms, K., Berthold, C., & Stuckrad, T. von (2017). Diversität und Studienerfolg: Studienrelevante Heterogenitätsmerkmale an Universitäten und Fachhochschulen und ihr Einfluss auf den Studienerfolg—eine quantitative Untersuchung. CHE Centrum Für Hochschulentwicklung—Arbeitspapier . (198). https://www.che.de/downloads/CHE_AP_198_Diversitaet_und_Studienerfolg.pdf . Accessed 23 July 2020.

Rutherford, S. M., & Standley, H. J. (2016). Social space or pedagogic powerhouse: Do digital natives appreciate the potential of Web 2.0 technologies for learning? In M. M. Pinheiro & D. Simoes (Eds.), Handbook of research on engaging digital natives in higher education settings (pp. 72–97). IGI Global: Hershey.

Sangkawetai, C., Neanchaleay, J., Koul, R., & Murphy, E. (2018). Predictors of K-12 teachers’ instructional strategies with ICTs. Technology, Knowledge and Learning . https://doi.org/10.1007/s10758-018-9373-0 .

Schulmeister, R. (2009). Gibt es eine "Net Generation"? Erweiterte Version 3.0. https://epub.sub.uni-hamburg.de/epub/volltexte/2013/19651/pdf/schulmeister_net_generation_v3.pdf . Accessed 23 July 2020.

Schunk, D. H., & Pajares, F. (2002). The development of academic self-efficacy. In A. Wigfield & J. S. Eccles (Eds.), Development of achievement motivation (pp. 15–31). San Diego: Academic Press.

Schwarzer, R., & Jerusalem, M. (2010). The general self-efficacy scale (GSE). Anxiety, Stress, and Coping, 12 , 329–345.

Statistisches Bundesamt (2018). Frauenanteile an Hochschulen in Deutschland nach akademischer Laufbahn im Jahr 2017. https://de.statista.com/statistik/daten/studie/249318/umfrage/frauenanteile-an-hochschulen-in-deutschland/ . Accessed 23 July 2020.

Talsma, K., Schüz, B., Schwarzer, R., & Norris, K. (2018). I believe, therefore I achieve (and vice versa): A meta-analytic cross-lagged panel analysis of self-efficacy and academic performance. Learning and Individual Differences, 61 , 136–150. https://doi.org/10.1016/j.lindif.2017.11.015 .

Thompson, P. (2013). The digital natives as learners: Technology use patterns and approaches to learning. Computers and Education, 65 , 12–33.

Tienken, C. H., & Wilson, M. J. (2007). The impact of computer assisted instruction on seventh-grade students' mathematics achievement. Planning and Changing, 38 (3/4), 181.

Tinto, V. (1993). Leaving college: Rethinking the causes and cures of student attrition (2nd ed.). Chicago: University of Chicago Press.

Van Dinther, M., Dochy, F., & Segers, M. (2011). Factors affecting students’ self-efficacy in higher education. Educational Research Review, 6 (2), 95–108. https://doi.org/10.1016/j.edurev.2010.10.003 .

Vishwanath, A. (2007). Information search efficacy: A new measure and its initial tests. Communication Research Reports, 24 (3), 195–203. https://doi.org/10.1080/08824090701439042 .

Vogel, B., & Woisch, A. (2013). Orte des Selbststudiums.: Eine empirische Studie zur zeitlichen und räumlichen Organisation des Lernens von Studierenden. HIS: Forum Hochschule: Vol. 7 . Hannover: HIS Hochschul-Informations-System GmbH. https://his-he.de/index.php?eID=tx_securedownloads&p=131&u=0&g=0&t=1579869978&hash=6673c5d4fef615ee12272c820fff60fd39d5bac3&file=/fileadmin/user_upload/Publikationen/Forum_Hochschulentwicklung/fh-201307.pdf . Accessed on 23 Jan 2020.

Weiser, D. A., & Riggio, H. R. (2010). Family background and academic achievement: Does self-efficacy mediate outcomes? Social Psychology of Education, 13 (3), 367–383. https://doi.org/10.1007/s11218-010-9115-1 .

Wigfield, A., & Eccles, J. S. (2000). Expectancy-value theory of achievement motivation. Contemporary Educational Psychology, 25 (1), 68–81. https://doi.org/10.1006/ceps.1999.1015 .

Zawacki-Richter, O. (2015). Zur Mediennutzung im Studium—unter besonderer Berücksichtigung heterogener Studierender. Zeitschrift Für Erziehungswissenschaft, 18 (3), 527–549. https://doi.org/10.1007/s11618-015-0618-6 .

Zawacki-Richter, O., Dolch, C., & Müskens, W. (2017). Weniger ist mehr? Studentische Mediennutzung im Wandel. Synergie Fachmagazin Für Digitalisierung in Der Lehre, 3 , 70–73.

Zawacki-Richter, O., Kramer, C., & Müskens, W. (2016). Studiumsbezogene Mediennutzung im Wandel—Querschnittdaten 2012 und 2015 im Vergleich. Schriftenreihe Zum Bildungs—Und Wissenschaftsmanagement , 1 .

Zawacki-Richter, O., Müskens, W., Krause, U., Alturki, U., & Aldraiweesh, A. (2015). Student media usage patterns and non-traditional learning in higher education. The International Review of Research in Open and Distributed Learning, 16 (2), 136–170.

Zimmerman, B. J. (2000a). Attaining self-regulation. In M. Boekaerts, P. R. Pintrich, & M. Zeidner (Eds.), Handbook of self-regulation (pp. 13–39). San Diego: Academic Press.

Zimmerman, B. J. (2000b). Self-efficacy: An essential motive to learn. Contemporary Educational Psychology, 25 (1), 82–91. https://doi.org/10.1006/ceps.1999.1016 .

Zimmerman, B. J., Bandura, A., & Martinez-Pons, M. (1992). Self-motivation for academic attainment: The role of self-efficacy beliefs and personal goal setting. American Educational Research Journal, 29 (3), 663–676.


Acknowledgements

Open Access funding provided by Projekt DEAL. The authors acknowledge the support of Sandra Aßmann, Sandra Hofhues, Mandy Schiefner-Rohs, Sabrina Pensel, Tim Riplinger, Yannic Steffens and Antonia Weber in the design of the survey instrument. Furthermore, the authors would like to thank Molly Hammer for her diligent proofreading of an earlier version of this paper.

The presented results are partial results of a research project funded by the Federal Ministry of Education and Research (Grant Number: 16DHL1018). The funding body did not participate in or influence the design of the study, the collection, analysis, and interpretation of data, or the writing of the manuscript.

Author information

Authors and Affiliations

University of Tuebingen, Melanchthonstr. 30, 72074, Tuebingen, Germany

Marina Pumptow & Taiga Brahm


Corresponding author

Correspondence to Marina Pumptow.

Ethics declarations

Conflict of Interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary file1 (DOCX 36 kb)

Rights and Permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Pumptow, M., Brahm, T. Students’ Digital Media Self-Efficacy and Its Importance for Higher Education Institutions: Development and Validation of a Survey Instrument. Tech Know Learn 26 , 555–575 (2021). https://doi.org/10.1007/s10758-020-09463-5


Published : 25 July 2020

Issue Date : September 2021

DOI : https://doi.org/10.1007/s10758-020-09463-5


Keywords

  • Academic self-efficacy
  • Digital media self-efficacy
  • Psychometric properties
  • Questionnaire development

The opportunities and challenges of digital learning

By Brian A. Jacob, Walter H. Annenberg Professor of Education Policy, Professor of Economics, and Professor of Education, University of Michigan; former Brookings expert

May 5, 2016

Twenty years ago this week, one of my very first writings on education policy appeared in print. [i] It was an opinion piece I wrote while teaching middle school in East Harlem, in which I described my school’s struggle to effectively use classroom computers. Two decades later, as a professor of economics and education policy, I am engaged in several research projects studying the use and impact of digital learning. [ii]

Much has changed since I taught middle school. I am struck by the extent to which recent technological innovations have created many new opportunities to better serve traditionally disadvantaged students.

First, increasing speed and availability of internet access can reduce many of the geographic constraints that disadvantage poor students. Schools serving higher-resourced families are often able to recruit better teachers and administrators—perhaps the most important school resources—even without additional funding.

Unlike teachers, however, technologies have no preferences for the schools in which they work. The resources available on the internet, for example, are equally available to all schools with the same internet access, and that access costs the same for all schools in the same area, regardless of the student population served. Students can now access online videos that provide instruction on a wide variety of topics at various skill levels, and they can participate in real-time video conferences with teachers or tutors located a state (or even a continent) away. [iii]

Second, the evolution of touch-screen technology has enabled very young children to engage in technology-aided instruction. Prior to tablets, it was difficult for pre-school, kindergarten, and even early primary grade students to work with educational software because it required the use of a mouse or keyboard. Now there are hundreds of applications that can effectively expose children to early literacy and numeracy skills.

Third, advances in artificial intelligence technology now allow teachers to differentiate instruction, providing extra support and developmentally-appropriate material to students whose knowledge and skill is far below or above grade level norms. The latest “intelligent” tutoring systems are able to not only assess a student’s current weaknesses, but also diagnose why students are making specific errors. [iv] These technologies could enable teachers to better reach students who are further from the average within their classroom, potentially benefiting students with weaker academic preparation.

And these technologies scale easily so that innovations (or even good curriculum) can reach more students. Much like a well-written textbook, a well-designed educational software application or online lesson can reach students not just in a single classroom or school, but across the state or country.

While technologies such as virtual instruction and intelligent tutoring offer great promise, they are almost sure to fail unless the challenges associated with implementing them are fully understood and addressed. To date, there is little evidence that digital learning can be implemented at scale in a way that improves outcomes for disadvantaged students.

Hundreds of thousands of students attend full-time online schools, [v] but a study released last year found that students of online charter schools had significantly weaker academic performance in math and reading, compared with demographically similar students in conventional public schools. [vi] Computer-aided instruction has been studied extensively over the past twenty-five years and the findings have not been encouraging. Consistently, programs that are implemented widely and evaluated with rigorous methods have yielded little to no benefit for students on average. [vii]

What are the key challenges?

Let’s start with student motivation. If technologies can draw in otherwise disenfranchised students through the personalization of material to a student’s interest or through gaming technology, they could benefit disengaged, poorly performing students. However, these technologies often reduce oversight of students, which could be particularly detrimental for children who are less motivated or who receive less structured educational supports at home. It is also possible that these technologies will be less able to engage reluctant learners in the way a dynamic and charismatic teacher can.

Moreover, approaches that forgo direct interpersonal interaction completely are unlikely to be able to teach certain skills. Learning is an inherently social activity. While an intelligent tutor might be able to help a student master specific math concepts, it may not be able to teach students to critically analyze a work of literature or debate the ethics of new legislation.

The experience of Rocketship, a well-known charter school network, illustrates this concern. Developed in the Bay Area of California in 2006, Rocketship’s instructional model revolves around a blended learning approach in which students spend a considerable amount of each day engaged with computer-aided learning technologies. The network received early praise for its innovative approach to learning and, most importantly, for the high achievement scores posted by its mostly poor, nonwhite student population. In 2012, however, researchers and educators raised concerns about graduates from Rocketship elementary schools, noting that they had good basic skills but were struggling with the critical analysis required in middle school. [viii]

More broadly, it is important to realize that technologies can be either substitutes for or complements to resources already in the school. To the extent that they are substitutes, they are inherently equalizing forces. For example, well-designed and structured online content might provide critical support to a novice teacher who is too overwhelmed to produce the same coherent and engaging materials that some more experienced teachers can create.

However, in many cases it may be more appropriate to think of technologies as complements—e.g., when they require skilled teachers or students with strong prior skills to be implemented well. In these cases, technologies must be accompanied with additional resources in order for them to benefit traditionally underserved populations.

Perhaps most importantly, systems that blend computer-aided and face-to-face instruction are notoriously difficult to implement well. In recent studies of the popular Cognitive Tutor math programs, teachers reported difficulty implementing the program's collaborative instructional practices, making strong connections between computer-based activities and classroom instruction, and maintaining the expected learning pace with the many students who lacked prior math and reading skills. [ix]

Finally, even with the best implementation, digital learning is likely to benefit students differently depending on their personal circumstances and those of their school. For instance, non-native English speakers might benefit from online instruction that allows them to pause and look up unfamiliar words. Likewise, we might expect an online course to be more advantageous for students attending a brick-and-mortar school with very low-quality teachers.

Indeed, some recent research finds exactly this type of heterogeneity. A large IES-funded evaluation of computer-aided instruction (CAI) released in 2007 found that students randomly assigned to teachers using the leading CAI products fared no better than students in control classrooms. Several years later, then-graduate student Eric Taylor reanalyzed the data from the study, focusing on whether the impacts of these technologies varied across classrooms. His analysis suggests that the introduction of computer-aided instruction had a positive impact on students in classrooms with less effective teachers and a negative impact on students in classrooms with more effective teachers. [x]

In recent years, the worlds of online learning and computer-aided instruction have converged to some extent, morphing into what is often referred to as blended- or personalized-learning models. There are a number of interesting projects underway across the country, including pilots supported by the Gates Foundation’s Next Generation Learning Challenge, and the emergence of charter networks with a goal to provide truly personalized learning for every student, such as Summit Public Schools in California and Washington. [xi]

In order for these new endeavors to be successful, they must overcome the challenges described above.

[i] http://www.edweek.org/tm/articles/1996/05/01/08jacob.h07.html

[ii] In a recent publication, the International Association for K-12 Online Learning defined digital learning as “any instructional practice in or out of school that uses digital technology to strengthen a student’s learning experience and improve educational outcomes.”

[iii] This technology has even expanded opportunities for the long-distance professional development of teachers, enabling novice teachers to receive mentorship from master teachers regardless of distance.

[iv] http://www.apa.org/pubs/books/4311503.aspx?tab=2

[v] http://www.inacol.org/wp-content/uploads/2015/11/Keeping-Pace-2015-Report.pdf

[vi] https://credo.stanford.edu/pdfs/Online%20Charter%20Study%20Final.pdf

[vii] http://www.sciencedirect.com/science/article/pii/S1747938X13000031

http://psycnet.apa.org/journals/edu/105/4/970/?_ga=1.79079444.1486538874.1462278305

http://www.apa.org/pubs/journals/features/edu-a0037123.pdf

http://rer.sagepub.com/content/86/1/42.abstract

[viii] http://www.edweek.org/ew/articles/2014/01/21/19el-rotation.h33.html?qs=New+Model+Underscores+Rocketship%E2%80%99s+Growing+Pains

http://educationnext.org/future-schools/

[ix] http://epa.sagepub.com/content/36/2/127.abstract

http://www.tandfonline.com/doi/full/10.1080/19345741003681189

[x] https://scholar.google.com/citations?user=5LXmfylL6JAC

[xi] http://www.rand.org/pubs/research_reports/RR1365.html


Op-Ed: When reading to learn, what works best for students — printed books or digital texts?


As the pandemic drove a sudden, massive and necessary shift to online education last year, students were forced to access much of their school reading assignments digitally. Turning so heavily to screens for school reading was a temporary fix — and should remain that way.

A wealth of research comparing print and digital reading points to the same conclusion — print matters. For most students, print is the most effective way to learn and to retain that knowledge long-term.

When measuring reading comprehension, researchers typically ask people to read passages and then answer questions or write short essays. Regardless of the age of the students, reliably similar patterns occur.

When the text is longer than about 500 words, readers generally perform better on comprehension tests with print passages. The superiority of print especially shines through when experimenters go beyond questions with superficial answers to those requiring inferences, details about the text, or remembering when and where in a story an event took place.

Part of the explanation for discrepancies between print and digital test scores involves the physical properties of paper. We often use the place in the book (at the beginning, halfway through) or location on a page as a memory marker. But equally important is a reader’s mental perspective. People tend to put more effort into reading print than reading digitally.


We can learn a lot about the importance of print by asking students themselves. Overwhelmingly, college students report they concentrate, learn or remember best with paper, according to my research and studies conducted by colleagues.

For instance, students say that when reading hard copy, “everything sinks in more” and can be pictured “more vividly.” When reading digitally, they admit they get distracted by things like online social media or YouTube.

However, not all students relish reading in print. Several of the more than 400 I surveyed commented that digital texts seemed shorter than the print versions (when they’re actually the same length) or declared that digital is more entertaining and print can be boring. They said things like digital screens “keep me awake” or “print can tire you out really fast” no matter how interesting the book.

Such attitudes support research finding that when students are allowed to choose how much time to spend reading a passage, many speed through the digital version more quickly, and they do worse on the comprehension test.

Reading digitally only started becoming a norm about a decade ago, thanks to advancements in technology and consumer products such as e-readers and tablet computers. Meanwhile, another seismic shift was beginning to happen in education. Academic courses, and then whole degree programs, became available online at universities before such technology-driven offerings percolated down through the lower grades.

As academic e-books made their way onto the market, students and faculty alike saw these more affordable digital versions as a way to combat the high cost of print textbooks. Open educational resources (teaching and learning materials available free, almost always online) also became another popular option.

In 2012, the U.S. Department of Education and the Federal Communications Commission unveiled a plan for all K-12 schools to transition from print to digital textbooks by 2017. The rationale? Improving education, but also saving costs. The big three textbook publishers (Pearson, McGraw-Hill Education, and Houghton Mifflin Harcourt) were quick to develop digital initiatives for K-12 materials. The pace accelerated in higher education as well, most recently with inclusive-access models, where publishers provide reduced-price digital texts to all course enrollees.

Regrettably, both the textbook industry and school decision-makers rushed to embrace digital reading platforms without assessing potential educational implications. Yet below the radar, teachers and students have often recognized the educational mismatch.

A recent survey by the research group Bay View Analytics found that 43% of college faculty believe students learn better with print materials — the same message students have been sending, when we bother to ask. Yes, cost issues need to be addressed, and yes, digital has a vital place in contemporary education. But so does print.

There’s a pressing need to rethink the balance between print and digital learning tools. When choosing educational materials, educators — and parents — have to consider many factors, including subject matter, cost, and convenience. However, it’s also important to remember that research findings usually tip the scales toward print as a more effective learning tool.

What can parents and educators do? For starters, explore students’ perceptions about which reading medium helps them concentrate and learn more easily. Conduct a short survey and discuss the results with students in class or at home. Make sure everyone who has a stake in students’ education — teachers, librarians, administrators and parents — thinks about the consequences of their choices.

The pandemic drove society to educational triage, not just by pivoting to digital materials but also by reducing curricular rigor. As schools continue to reopen and rethink their educational goals, research about learning should be used to help find the right balance between screens and print in the digital age.

Naomi S. Baron is professor emerita of linguistics at American University and author of “How We Read Now: Strategic Choices for Print, Screen, and Audio.”


The Hechinger Report


A textbook dilemma: Digital or paper?

By Claudia Wallis

My friend Joanne was packing her youngest child off to college this month and wrestling with a modern dilemma: Is it better to buy textbooks in digital form or old-fashioned print? One of her son’s professors was recommending an online text for a business course: lighter, always accessible and seriously cheaper ($88 vs. $176 for a 164-page book). But Joanne’s instinct was that her son would “learn better” from a printed volume, free of online distractions, and with pages he could dog-ear, peruse in any order, and inscribe with marginal notes. Her son was inclined to agree.


Many of us book lovers cherish the tactile qualities of print, but some of this preference is emotional or nostalgic. Do reading and note-taking on paper offer any measurable advantages for learning? Given the high cost of hard-backed textbooks, is it wiser to save the money and the back strain by going digital?


You might think that, decades into the digital revolution, we would have a clear answer to this question. Wrong. Earlier this year educational psychologist Patricia Alexander, a literacy scholar at the University of Maryland, published a thorough review of recent research on the topic.  She was “shocked,” she says, to find that out of 878 potentially relevant studies published between 1992 and 2017, only 36 directly compared reading in digital and in print and measured learning in a reliable way. (Many of the other studies zoomed in on aspects of e-reading, such as eye movements or the merits of different kinds of screens.)

Aside from pointing up a blatant need for more research, Alexander’s review, co-authored with doctoral student Lauren Singer and appearing in Review of Educational Research, affirmed at least one practical finding: if you are reading something lengthy – more than 500 words or more than a page of the book or screen – your comprehension will likely take a hit if you’re using a digital device. The finding was supported by numerous studies and held true for students in college, high school and grade school.

Research suggests that the explanation is at least partly the greater physical and mental demands of reading on a screen: the nuisance of scrolling, and the tiresome glare and flicker of some devices. There may be differences in the concentration we bring to a digital environment, too, where we are accustomed to browsing and multitasking. And some researchers have observed that working your way through a print volume leaves spatial impressions that stick in your mind (for instance, the lingering memory of where a certain passage or diagram appeared in a book).


Alexander and Singer have done their own studies of the digital versus print question. In a 2016 experiment they asked 90 undergraduates to read short informational texts (about 450 words) on a computer and in print. Due to the length, no scrolling was required, but there still was a difference in how much they absorbed. The students performed equally well in describing the main idea of the passages no matter the medium, but when asked to list additional key points and recall further details, the print readers had the edge.

Curiously, the students themselves were unaware of this advantage. In fact, after answering comprehension questions, 69% said they believed they had performed better after reading on a computer. Researchers call this failure of insight poor “calibration.”

The point of such research, as Alexander herself notes, is not to anoint a winner in a contest between digital and print. We all swim in a sea of electronic information and there’s no turning back the tide.

“The core question,” Alexander said in an interview, is “when is a reader best served by a particular medium. And what kind of readers? What age? What kind of text are we talking about? All of those elements matter a great deal.”

On top of that, we all could do with a lot more self-awareness about how we learn from reading.

For example, a big reason that students in the study thought they learned better from digital text is that they moved more quickly in that medium. Research by Alexander and others has confirmed this faster pace. “They assume that because they were going faster, they understood it better,” Alexander observes. “It’s an illusion.”

If students become aware of this illusion, they can make better choices. Just as they might decide to turn off social media alerts while studying an online textbook, they might want to consciously slow themselves down when reading for deep meaning. On the other hand, when reading for pleasure or surface information, they can let ’er rip.

Digital text makes it easy for students to copy and paste key passages into a document for further study, but there is little research on how this compares with taking notes by hand.


“We study things like highlighting and underlining,” Alexander says, “but those kind of motor responses have never been of highest value in terms of text-processing strategies” – whether done with a cursor or a marker. The studying strategy with “the greatest power,” she adds, involves deeply questioning the text — asking yourself if you agree with the author, and why or why not.

Dutch scholar Joost Kircz points out that these are still early days for digital reading, and new and better formats will continue to emerge. In his view, the linear format of a traditional book is well suited for narratives but not necessarily ideal for academic texts or scientific papers.

“In narrative prose fiction, the author strictly determines the reading path,” he and co-author August Hans Den Boef write in The Unbound Book, a collection of essays about the future of reading. “But in a digital environment we can easily enable a plurality of reading paths in educational and scholarly texts.”

In addition to the hyperlinks, video and audio that currently enhance many digital texts, Kircz would like to see innovations such as multiple types of hyperlinks, perhaps in a rainbow of colors that denote specific purposes (annotation, elaboration, contrary views, media, etc.). He also imagines digital books that could enable a variety of paths through a body of work.  Not all information is linear or even layered, he told me: “There’s a lot of information that’s spherical. You cannot stack it up. The question is to what extent can we mimic human understanding?”

While we await those future digital products, students deciding what school books to buy this fall would do well to ask themselves just what they hope to get from the text. As Alexander notes, “If I’m only trying to learn something that’s going to be covered on a test and the test is shallow in nature, then [digital] is just fine.” If, on the other hand, you hope to dive in deeply and gather imperishable pearls, spring for the book.

This story was produced by The Hechinger Report, the nonprofit, independent news website focused on inequality and innovation in education.


Letters to the Editor


I consider this report timely, indeed, not to mention informative. Such a decision – digital vs. paper – is not just a decision that young(er) students may make, either. It should be given serious consideration by educational agencies that purport to help teachers teach. Three very recent Frameworks (think blueprints for curriculum) have been published by the California Department of Education (CDE): History-Social Science, at 985 pages; English Language Arts/English Language Development (ELA/ELD), at 1,073 pages; and Science, at 1,800+ pages. Because of the size of these guides, the CDE makes them available electronically only. When I priced a hard copy of the ELA/ELD Framework at a local copy place, I was told: $225. Because of my connection with my local university, CSULB, I got it for (only!) $75.

One of the biggest issues I have found for my 6th grader, after years of hard copies, is getting familiar with the format being used. Now an assignment may say to read pages 1 and 2 and answer between 7 and 15 questions. At first I thought my daughter’s complaint that she couldn’t find the answers was just boredom and lax studying. Then I tried to find the answers myself: there are also between 3 and 6 additional links to videos or articles to read, and the question section covers that additional material as well. The layout just seems clunky, and a student with ADHD can get lost reading a side article before finishing the main page, or work through so many extras that she forgets the main page’s lesson. I’ve watched her grades drop from A’s in 5th grade to barely passing the first 9 weeks.



Digital SAT Reading and Writing


Frequently asked questions

What is the digital SAT? Which version of the SAT will I take?

  • Starting in 2023, students taking the test outside the United States and its territories (American Samoa, Guam, the Northern Mariana Islands, Puerto Rico or the U.S. Virgin Islands) will take the digital SAT.
  • Starting in 2024, the digital SAT will be taken by all students.
  • If you’re taking it outside of the U.S. (or its territories), you should prepare for the digital SAT only, by using Khan Academy’s Official Digital SAT Prep.
  • If you’re taking it in the U.S. (or its territories), you should prepare for the paper-and-pencil test using Khan Academy’s Official SAT Practice.

What is the difference between the digital SAT and the paper-and-pencil test?

  • Calculator use: Calculators are now allowed throughout the entire Math section. A graphing calculator is integrated into the digital test experience so that all students have access.
  • Question word count: The average length of Math word problems has been reduced. In-context questions are still a big part of the test, but they’re not quite so wordy.
  • One test for Reading and Writing: While the pencil-and-paper SAT tested reading and writing in separate test sections, the digital SAT combines these topics.
  • Shorter passages (and more of them): Instead of reading long passages and answering multiple questions on each passage, students taking the digital SAT will encounter shorter passages, each with just one follow-up question.
  • New question types: With a greater number and variety of passages, the digital SAT includes new types of questions, with new prompts that require new strategies.

How should I start studying for the digital SAT?

  • Starting with a practice test will help you diagnose the areas where you need the most practice.
  • After taking each practice test, you can review your performance and read answer explanations for the questions you missed.
  • You can take full-length practice tests using College Board’s Bluebook app .
  • We recommend taking full-length practice tests at set intervals throughout your test prep journey. Doing this will help you gauge your progress, refine the focus of your skill practice, and build endurance and experience for test day.
  • The best way to practice individual skills is to explore both the digital SAT Math course and the digital SAT Reading and Writing courses on Khan Academy.
  • In these courses, you can read articles and watch videos that cover each skill, practice those skills, then test yourself in the related exercises. As you become proficient in more skills, you’ll become more prepared for test day.

How many practice tests should I take to be prepared?

What types of practice does Khan Academy’s Official Digital SAT Prep have?

  • The Math course is organized into 37 math skills and features three levels of difficulty in each skill.
  • The Reading and Writing course is organized into 11 skills and covers the full range of questions on the exam.

Will Official Digital SAT Prep show me where to focus when I study?

What is included in the Math section of the digital SAT?

  • Algebra: Analyze, fluently solve, and create linear equations and inequalities, as well as analyze and fluently solve systems of equations.
  • Advanced Math: Demonstrate attainment of skills and knowledge central for successful progression to more advanced math courses, including analyzing, fluently solving, interpreting, and creating a variety of equation types.
  • Problem-Solving and Data Analysis: Apply quantitative reasoning about ratios, rates, and proportional relationships; understand and apply units and rates; and analyze and interpret one- and two-variable data.
  • Geometry and Trigonometry: Solve problems that focus on perimeter, area, and volume; angles, triangles, and trigonometry; and circles.
  • Multiple-choice: Questions offer four possible choices from which students must select the answer.
  • Student-produced response: Questions require students to produce their own answer, which they then enter into the provided field.

How can I use Khan Academy’s digital SAT Math course to study for the digital SAT?

  • Taking the course challenge: By attempting 40 questions from different lessons throughout the course, you can get credit for the skills you’ve already mastered and identify the skills where you could improve. You can also get credit within individual units by attempting unit tests.
  • Working from top to bottom: The course takes all the math skills tested on the SAT and splits them into three difficulty levels: Foundations, Medium, and Advanced. By working through the course from top to bottom, you’ll encounter each skill at each level, keeping your practice balanced and ensuring no skills fall through the cracks.
  • Taking quizzes and unit tests as you go: As you progress through different skills, you can take quizzes and unit tests to prove your mastery of the content. The more units you master in the course, the more prepared you’ll be for test day.

What is included in the Reading and Writing section of the digital SAT?

  • Information and Ideas: Use, locate, interpret, and evaluate information from various texts and infographics.
  • Craft and Structure: Determine the meaning of high-utility academic words and phrases in context, evaluate texts rhetorically, and make supportable connections between multiple related texts.
  • Expression of Ideas: Use revision skills and knowledge to improve the effectiveness of written expression in order to accomplish specified rhetorical goals.
  • Standard English Conventions: Use editing skills and knowledge to make texts conform to core conventions of Standard English sentence structure, usage, and punctuation.

How can I use Khan Academy’s digital SAT Reading and Writing course to study for the digital SAT?

  • Work from top to bottom: The course takes all the reading and writing skills tested on the SAT and organizes them by focus. By working through the course from top to bottom, you’ll encounter each skill in turn, keeping your practice balanced and ensuring no skills fall through the cracks.
  • Try exercises more than once: Because the digital SAT Reading and Writing test is new, there’s not as much content available for practice as there is for the SAT Math test. While we hope to expand this course in the future, we encourage you to squeeze every last bit of practice you can from the presently available materials. So, even if you’ve tried (and passed) an exercise before, you can get extra practice by attempting it again!


Peer Reviewed

Digital literacy is associated with more discerning accuracy judgments but not sharing intentions


It has been widely argued that social media users with low digital literacy — who lack fluency with basic technological concepts related to the internet — are more likely to fall for online misinformation, but surprisingly little research has examined this association empirically. In a large survey experiment involving true and false news posts about politics and COVID-19, we found that digital literacy is indeed an important predictor of the ability to tell truth from falsehood when judging headline accuracy. However, digital literacy is not a robust predictor of users’ intentions to share true versus false headlines. This finding resonates with recent observations of a substantial disconnect between accuracy judgments and sharing intentions. Furthermore, our results suggest that digital literacy may be useful for identifying people who hold inaccurate beliefs, but not for identifying those who are more likely to spread misinformation online.

Sloan School of Management, Massachusetts Institute of Technology, USA

Media Lab, Massachusetts Institute of Technology, USA

Center for Research and Teaching in Economics (CIDE), Aguascalientes, Mexico


Research Questions

  • How does social media users’ level of digital literacy relate to their ability to discern truth from falsehood when assessing the accuracy of news posts, and when deciding what news to share?
  • How does the strength of these associations with digital literacy compare to other constructs that have been previously associated with truth discernment, particularly analytic thinking and general procedural news knowledge?
  • Do these relationships differ based on users’ political partisanship, or the topic (politics versus COVID-19) of the news headlines?

Essay Summary

  • In surveys conducted in late 2020, American social media users (N = 1,341) were presented with a set of true and false news posts about politics or COVID-19 taken from social media. Participants were randomly assigned to either assess the accuracy of each headline or indicate their likelihood of sharing each headline on social media; all participants also completed two measures of digital literacy, along with measures of analytic thinking (the tendency to stop and think versus going with one’s gut), procedural news knowledge, partisanship, and basic demographics.
  • Both digital literacy measures were positively associated with the ability to tell true from false headlines when assessing accuracy (accuracy discernment), regardless of the user’s partisanship or whether the headlines were about politics or COVID-19. General news knowledge was a stronger predictor of accuracy discernment than digital literacy, and analytic thinking was similar in strength to digital literacy.
  • Conversely, neither digital literacy measure was consistently associated with sharing discernment (difference in sharing intentions for true relative to false headlines, or fraction of shared headlines that are true).
  • These results emphasize the previously documented disconnect between accuracy judgments and sharing intentions and suggest that while digital literacy is a useful predictor of people’s ability to tell truth from falsehood, this may not translate to predicting the quality of information people share online.

Implications

In recent years, there has been a great deal of concern about misinformation and “fake news,” with a particular focus on the role played by social media. One popular explanation of why some people fall for online misinformation is lack of digital literacy: If people cannot competently navigate through digital spaces, then they may be more likely to believe and share false content that they encounter online. Thus, people who are less digitally literate may play a particularly large role in the spread of misinformation on social media.

Despite the intuitive appeal of this argument, however, there is little evidence to date in support of it. For example, Jones-Jang et al. (2019) found in a representative sample of US adults that digital literacy—defined as self-reported recognition of internet-related terms, which has been shown to predict the ability to effectively find information online (Guess & Munger, 2020; Hargittai, 2005)—did not, in fact, predict the ability to identify fake news stories. Neither did scales measuring the distinct attributes of news literacy or media literacy, although information literacy—the ability to identify verified and reliable information, search databases, and identify opinion statements—did predict greater ability to identify fake news stories. Guess et al. (2020) found that a brief intervention aimed at teaching people how to spot fake news significantly improved the ability to tell truth from falsehood, but not sharing intentions, in both American and Indian samples, and Epstein et al. (2021) found that an even briefer version of the same intervention improved the quality of COVID-19 news Americans indicated they would share online (although this improvement was no greater than the improvement from simply priming the concept of accuracy). Relatedly, McGrew et al. (2019), Breakstone et al. (2021), and Brodsky et al. (2021) found that much more extensive fact-checking training modules focused on lateral reading improved students’ assessments of source and information credibility. These interventions were focused more on media literacy than on digital skills per se; similarly, Hameleers (2020) found that a media literacy intervention with no digital literacy component increased American and Dutch subjects’ ability to identify misinformation. Conversely, Badrinathan (2021) found no effect of an hour-long media literacy training on Indian subjects’ ability to tell truth from falsehood.

In this paper, we aim to shed further light on the relationship between digital literacy and susceptibility to misinformation using a survey of 1,341 Americans, quota-sampled to match the national distribution on age, gender, ethnicity, and geographic region. We examine the association between two different measures of digital literacy and two outcome measures, belief and sharing, for true versus false news about politics and COVID-19.

We examine belief and sharing separately, as recent work has documented a substantial disconnect between these outcomes: Accuracy judgments tend to be much more discerning than sharing intentions (Epstein et al. 2021; Pennycook et al., 2020; Pennycook et al., 2021). Evidence suggests that this disconnect is largely driven by people failing to attend to accuracy when thinking about what to share (for a review, see Pennycook & Rand, in press). As a result, even if people who are more digitally literate are better able to tell truth from falsehood, this may not translate into improved sharing discernment. If they fail to even consider whether a piece of news is accurate before deciding to share it, their higher ability to identify which news is accurate will be of little assistance.

With respect to digital literacy, the first measure we use follows the tradition of conceptualizing digital literacy as the possession of basic digital skills required to effectively find information online (e.g., Hargittai, 2005). To measure this form of digital literacy, we use a set of self-report questions about familiarity with internet-related terms and attitudes towards technology that Guess & Munger (2020) found to be most predictive of actual performance on online information retrieval tasks. The second digital literacy measure we use, adapted from Newman et al. (2018), focuses specifically on literacy about social media and asks subjects how social media platforms decide which news stories to show them. This measure seems particularly well-suited to the challenges of identifying susceptibility to fake news on social media: If people do not understand that there are no editorial standards for content shared on social media, or that journalists do not get news before it is posted, it seems likely that they would be less skeptical of low-quality social media news content when they encounter it.

We find that both digital literacy measures are independently predictive of the tendency to rate true news as more accurate than false news—that is, of being able to successfully discern the truth of news (we define truth discernment as the average accuracy rating of true news minus the average accuracy ratings of false news). This positive association is similar in size to the association between truth discernment and both the tendency to engage in analytic thinking and performance on a procedural news knowledge quiz—measures which have been previously demonstrated to predict truth discernment (Amazeen & Bucy, 2019; Pennycook & Rand, 2019; Pennycook & Rand, 2021). The positive association between truth discernment and both digital literacy measures is equivalent for news about politics and COVID-19 and does not vary based on subjects’ partisanship. Thus, we find robust evidence that lack of digital literacy is indeed associated with less ability to tell truth from falsehood.
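To make the discernment measure concrete, here is a minimal sketch, in Python, of how per-participant truth discernment could be computed from binary accuracy ratings, following the definition above (average rating of true headlines minus average rating of false headlines). The data frame and column names are hypothetical illustrations, not the study’s actual data or pipeline.

```python
# Minimal sketch of the truth-discernment calculation described above.
# Column names and data are hypothetical illustrations, not the study's files.
import pandas as pd

ratings = pd.DataFrame({
    "participant": [1, 1, 1, 1, 2, 2, 2, 2],
    "headline_true": [True, True, False, False, True, True, False, False],
    "rated_accurate": [1, 1, 0, 1, 1, 0, 0, 0],  # binary yes/no judgment
})

# Truth discernment: mean accuracy rating for true headlines
# minus mean accuracy rating for false headlines, per participant.
by_part = (
    ratings.groupby(["participant", "headline_true"])["rated_accurate"]
    .mean()
    .unstack()
)
discernment = by_part[True] - by_part[False]
print(discernment)
# participant 1: 1.0 - 0.5 = 0.5; participant 2: 0.5 - 0.0 = 0.5
```

The same computation applied to sharing responses instead of accuracy ratings yields the sharing discernment measure used below.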

The pattern is strikingly different, however, when considering sharing intentions. Neither digital literacy measure is consistently associated with sharing discernment—the tendency to share true news more than false news—nor are they significantly associated with the fraction of headlines the subject shared that are true (an alternative metric of information sharing quality). Again, the results do not significantly differ for news about politics versus covid or based on participants’ partisanship. Analytic thinking is also not significantly associated with either sharing quality measure. It is only procedural news knowledge that is positively associated with having a higher fraction of shared content that is true (regardless of news type or partisanship) and positively associated with sharing discernment for Republicans.

These results add to the mixed pattern regarding digital literacy and misinformation on social media. While digital literacy was associated with a better ability to identify true versus false information, this did not appear to translate into sharing better quality information. Conversely, procedural news knowledge was positively predictive of both the ability to identify true versus false information and the tendency to share higher quality information. This is surprising, as one might intuitively have assumed that digital literacy was particularly relevant for social media sharing decisions. Yet, these findings resonate with experiments in which digital media literacy interventions increased belief accuracy but did not increase sharing discernment beyond simply priming the concept of accuracy (Epstein et al. 2021; Guess et al., 2020). And more generally, our digital literacy findings add to the growing body of evidence for a sizable disconnect between accuracy judgments and sharing intentions (e.g., Epstein et al. 2021; Pennycook et al., 2020; Pennycook et al., 2021). More digitally literate subjects’ higher accuracy discernment did not translate into higher sharing discernment, likely (at least in part) due to people’s tendency to fail to consider accuracy when considering what to share (Pennycook & Rand, in press).

An important implication of our findings is that digital literacy may be useful (e.g., for policymakers or social media platforms) when trying to identify users who are vulnerable to believing misinformation, but it does not seem particularly promising for identifying users who are likely to spread misinformation. Our findings also have implications regarding the potential impact of interventions that shift attention to accuracy to reduce the sharing of misinformation (for a review of the accuracy prompt approach, see Pennycook & Rand, in press). Although digital literacy did not significantly predict baseline sharing discernment, it seems likely that higher digital literacy users’ sharing decisions will be more responsive to accuracy prompts because their underlying accuracy judgments are better calibrated: Once their attention is shifted to accuracy, their improved accuracy discernment should then translate into improved sharing discernment. Testing this empirically is a promising direction for future study. Finally, if future work shows that the correlational relationships observed here reflect underlying causal effects, that could suggest that education interventions aimed at improving the quality of content shared online would do better to focus on procedural news knowledge more generally, rather than digital literacy per se.

On all these counts, it is essential for future work to explore cross-cultural generalizability. This is particularly true given that in many parts of the world, widespread access to digital technologies is an extremely recent development, and thus digital literacy levels are likely to be quite low. It is also of great importance to keep in mind that a major limitation of our study is that our sharing intention judgments are hypothetical. Although the hypothetical sharing intentions measure employed here has been widely used in the literature (e.g., Epstein et al. 2021; Guess et al., 2020; Pennycook & Rand, 2019; Pennycook et al., 2020; Pennycook et al., 2021; Roozenbeek et al., 2021; Roozenbeek & van der Linden, 2020; Rosenzweig et al., 2021; Ross et al., 2021), it is of paramount importance for future work to examine the relationship between digital literacy and actual on-platform sharing.

Finally, our findings are also important for the ongoing discussion about what, exactly, digital literacy is (e.g., Guess & Munger, 2020). We examined measures capturing two distinct notions of digital literacy and found that each measure was similarly predictive of truth discernment, even when both were included in the same model, along with procedural news knowledge and general analytic thinking. This emphasizes the multifaceted nature of digital literacy and the importance of future scholarship further elucidating the many dimensions of digital literacy, as well as the relationship between digital literacy, media literacy, and digital media literacy.

Finding 1: More digitally literate social media users show better truth discernment.

We predict participants’ ability to tell true headlines from false headlines (truth discernment, defined as average accuracy ratings for true headlines minus average accuracy ratings for false headlines), including controls for age, gender, race (white versus non-white), education (less than college degree versus college degree or more), political partisanship, and headline content (political versus COVID-19). We find a significant positive relationship for self-reported familiarity/comfort with the internet (β = .224, 95% CI = [0.154, 0.295], p < .001) and for correct understanding of the Facebook newsfeed algorithm (β = .231, 95% CI = [0.164, 0.297], p < .001); see Figure 1. The size of these relationships is similar to what we find for procedural news knowledge (β = .290, 95% CI = [0.221, 0.359], p < .001) and analytic thinking (β = .210, 95% CI = [0.142, 0.278], p < .001), replicating previous findings (e.g., Amazeen & Bucy, 2019; Pennycook & Rand, 2019; Pennycook & Rand, 2021).

[Figure 1: digital literacy measures and truth discernment]
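For readers who want to see the shape of such an analysis, below is a rough sketch of an OLS regression of discernment on standardized predictors plus the listed controls, run with statsmodels on synthetic data. The variable names, encodings, and specification are assumptions for illustration; the authors’ exact model (e.g., standardization choices or error structure) may differ.

```python
# Illustrative OLS of truth discernment on standardized predictors,
# loosely mirroring the controls listed above. All data are synthetic
# and all variable names are invented for illustration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "discernment": rng.normal(0.3, 0.2, n),
    "internet_familiarity": rng.normal(0, 1, n),  # z-scored literacy battery
    "knows_fb_algorithm": rng.integers(0, 2, n),  # correct/incorrect
    "age": rng.integers(18, 80, n),
    "male": rng.integers(0, 2, n),
    "white": rng.integers(0, 2, n),
    "college": rng.integers(0, 2, n),
    "partisanship": rng.integers(1, 7, n),        # 6-point Dem-Rep scale
    "political_items": rng.integers(0, 2, n),     # vs. COVID-19 items
})

model = smf.ols(
    "discernment ~ internet_familiarity + knows_fb_algorithm"
    " + age + male + white + college + partisanship + political_items",
    data=df,
).fit()
print(model.summary())  # coefficients analogous to the betas reported above
```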

Finding 2: More digitally literate social media users are no more discerning in their sharing.

We predict participants’ tendency to share true headlines more than false headlines (sharing discernment, defined as average sharing probability for true headlines minus average sharing probability for false headlines), including controls for age, gender, race, education, political partisanship, and headline content. We find no significant relationship for self-reported familiarity/comfort with the internet (β = .068, 95% CI = [-0.010, 0.146], p = .088) or for correct understanding of the Facebook newsfeed algorithm (β = .027, 95% CI = [-0.048, 0.101], p = .483); see Figure 2. We also find no significant relationship for procedural news knowledge (β = .051, 95% CI = [-0.027, 0.130], p = .199) or analytic thinking (β = -.019, 95% CI = [-0.094, 0.057], p = .628). The lack of association with analytic thinking is interesting, as it stands in contrast to previous work finding that analytic thinking is positively associated with sharing discernment for political news (Ross et al., 2021) and COVID-19 news (Pennycook et al., 2020), and with the quality of news sites shared on Twitter (Mosleh et al., 2021) (although see Osmundsen et al., 2021, who do not find a significant relationship between analytic thinking and the sharing of fake news on Twitter).

[Figure 2: digital literacy measures and sharing discernment]

This measure of sharing discernment, however, can be potentially misleading when comparing groups where the overall sharing rate differs substantially. Thus, as a robustness check, we also consider a different measure of sharing quality: out of all of the news that the user said they would share, what fraction is true? (That is, we divide the number of true headlines selected for sharing by the total number of headlines selected for sharing.) We predict this fraction of the subject’s shared headlines that are true—again including controls for age, gender, race, education, political partisanship, and headline content—and continue to find no significant relationship for self-reported familiarity/comfort with the internet (β = .013, 95% CI = [-0.080, 0.106], p = .784) or for correct understanding of the Facebook newsfeed algorithm (β = .015, 95% CI = [-0.072, 0.102], p = .348); see Figure 3. We also find no significant relationship for analytic thinking (β = .002, 95% CI = [-0.093, 0.097], p = .961), but we do find a highly significant positive relationship for procedural news knowledge (β = 0.153, 95% CI = [0.057, 0.248], p = 0.002).

[Figure 3: digital literacy measures and the fraction of shared headlines that are true]
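The robustness metric is simple arithmetic, but it can diverge from the difference score when overall sharing rates differ. A toy example, with hypothetical counts rather than study data:

```python
# Two sharing-quality metrics for one hypothetical participant who was
# shown 12 true and 12 false headlines. Counts are illustrative only.
true_shown, false_shown = 12, 12
true_shared, false_shared = 5, 3

# Difference score: P(share | true) - P(share | false)
sharing_discernment = true_shared / true_shown - false_shared / false_shown
print(round(sharing_discernment, 4))   # 0.4167 - 0.25 = 0.1667

# Robustness metric: fraction of shared headlines that are true
fraction_true = true_shared / (true_shared + false_shared)
print(fraction_true)                   # 5 / 8 = 0.625
```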

We note that these null results for digital literacy are not easily attributable to floor effects. First, the average sharing probabilities are far from the floor of 0: 36.4% of false headlines and 40.3% of true headlines were shared (yielding an average discernment of 0.403 - 0.364 = 0.039). Second, we find that other factors were significantly related to sharing discernment, indicating that there is enough signal to observe relationships that do exist. Furthermore, our results are broadly consistent when excluding users who, at the outset of the study, indicated that they would never share anything on social media, or who indicated that they did not share political news on social media. The only exception is that self-reported internet familiarity was significantly positively related to the first sharing discernment measure (β = .18, 95% CI = [0.07, 0.30], p = 0.002) when excluding users who stated that they never share political news. Given that this was the only significant result out of the numerous combinations of digital literacy measure, sharing outcome measure, and exclusion criteria, we conclude that there is no robust evidence that digital literacy is associated with the quality of sharing intentions.

Finding 3: Relationships between digital literacy and discernment are similar for Democrats and Republicans, and for news about politics and COVID-19.

Finally, we ask whether there is evidence that the results discussed above differ significantly based on the subject’s partisanship or the focus of the headlines being judged. To do so, we use a model including both digital literacy measures, procedural news knowledge, analytic thinking, and all controls, and interact each of these variables with partisanship (6-point scale of preference for Democrat versus Republican party) and headline content (political versus COVID-19), as well as including the relevant three-way interactions. We find no significant two-way or three-way interactions for either digital literacy measure when predicting truth discernment or either sharing discernment measure ( p > .05 for all). Thus, we do not find evidence that the relationship (or lack thereof) between digital literacy and discernment varies based on user partisanship or political versus COVID-19 news.

Conversely, we do find some significant interactions for analytic thinking and procedural news knowledge (although these interactions should be interpreted with caution given the lack of ex ante predictions and the large number of tests performed). When predicting truth discernment, we find a significant three-way interaction between analytic thinking, partisanship, and news type (β = .095, 95% CI = [0.030, 0.161], p = .005), such that when judging political news analytic thinking positively predicts truth discernment regardless of user partisanship, but when judging COVID-19 news analytic thinking predicts greater truth discernment only for Democrats, not Republicans. This pattern differs from what has been observed previously, where participants higher on analytic thinking were more discerning regardless of partisanship for both political news (e.g., Pennycook & Rand, 2019) and COVID-19 news (e.g., re-analysis of data from Pennycook et al., 2020). When predicting sharing discernment, we find a significant two-way interaction between procedural news knowledge and partisanship (β = .133, 95% CI = [0.043, 0.223], p = .004), such that procedural news knowledge is associated with significantly greater sharing discernment for Republicans but not for Democrats. We also find a significant three-way interaction between analytic thinking, partisanship, and news type (β = .095, 95% CI = [0.030, 0.161], p = .005), but the association between analytic thinking and sharing discernment is not significantly different from zero for any subgroup. Finally, when predicting the fraction of shared headlines that are true, we find a significant two-way interaction between analytic thinking and headline content (β = .111, 95% CI = [0.009, 0.213], p = .033), such that analytic thinking is associated with a significantly lower fraction of true headlines shared for COVID-19 news but not political news.

We conducted two waves of data collection with two similarly structured surveys on Qualtrics. Our final dataset included a total of 1,341 individuals: mean age = 44.7, 59.3% female, 72.4% White. These users were recruited on Lucid, which uses quota-sampling in an effort to match the national distribution of age, gender, ethnicity, and geographic region. Participants were first asked “What type of social media accounts do you use (if any)?” and only participants who had selected Facebook and/or Twitter were allowed to participate. The studies were exempted by MIT COUHES (protocol 1806400195).

In both surveys, participants were shown a series of news items, some true and some false. The first wave had 25 headlines (just text) pertaining to COVID-19, 15 false and 10 true, and participants saw all 25. This study (N = 349) was conducted from July 29, 2020 to August 8, 2020. The second wave (N = 992) had 60 online news cards about politics (including headline, image, and source), half true and half false, and subjects were shown a random subset of 24 of these headlines. This study was conducted from October 3, 2020 to October 11, 2020. Full experimental materials are available at https://osf.io/kyx9z/?view_only=762302214a8248789bc6aeaa1c209029 .

Each subject was randomly assigned to one of two conditions for all headlines: accuracy or sharing. In the accuracy condition, participants were asked for each headline, “To the best of your knowledge, is the claim in the above headline accurate?” In the sharing condition, they were instead asked, “Would you consider sharing this story online (for example, through Facebook or Twitter)?” Both questions had binary yes/no response options. In the political-item wave, participants in the sharing condition were also asked if they would like or comment on each headline, but we do not analyze those responses here because they were not included in the COVID-19 wave. Both experiments also included two additional treatment conditions (in which accuracy and sharing were asked together) that we do not analyze here, as prior work has shown that asking about accuracy and sharing together dramatically changes responses (e.g., Pennycook et al., 2021), and thus these data do not provide clean measures of accuracy discernment and sharing discernment.

Given the survey nature of our experiment, we were limited to measuring people’s intentions to share social media posts, rather than actual sharing on platforms. This is an important limitation, although some reason to expect the relationships observed here to extend to actual sharing comes from the observation that self-reported willingness to share a given article is meaningfully correlated with the number of shares of that article actually received on Twitter (Mosleh et al., 2020; although this study examined item-level, rather than subject-level, correlations), and from the observation that an intervention (accuracy prompts) that was developed using sharing intentions as the outcome was successful when deployed on actual sharing in a field experiment on Twitter (Pennycook et al., 2021); this suggests that the sharing intentions were not just noise but contained meaningful signal as to what people would actually share (and how that sharing would respond to the intervention).

Following the main task, participants completed a question about their knowledge of the Facebook algorithm (Newman et al., 2018) that asked, “How are the decisions about what stories to show people on Facebook made?” The answer choices were At Random; By editors and journalists that work for news outlets; By editors and journalists that work for Facebook; By computer analysis of what stories might interest you; and I don’t know. If the participant selected I don’t know, they were asked to give their best guess out of the remaining choices in a follow-up question. Participants who chose By computer analysis of what stories might interest you in either stage were scored as responding correctly. Next, they completed a short form of the digital literacy battery from Guess & Munger (2020), responding to “How familiar are you with the following computer and Internet-related items?” using a slider scale from 1 (no understanding) to 5 (full understanding) for phishing, hashtags, JPEG, malware, cache, and RSS; and indicated their agreement with four statements about attitudes towards digital technology using a scale of -4 (Strongly Disagree) to 4 (Strongly Agree): I prefer to ask friends how to use any new technological gadget instead of trying to figure it out myself; I feel like information technology is a part of my daily life; Using information technology makes it easier to do my work; and I often have trouble finding things that I’ve saved on my computer. Finally, participants completed a three-item cognitive reflection test that measures analytic thinking using math problems that have intuitively compelling but incorrect answers (Frederick, 2005), a ten-item procedural news knowledge quiz adapted from Amazeen & Bucy (2019), and demographics. Table 1 shows the pairwise correlations between the various individual difference measures.
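As an illustration of how these items could be scored, here is a minimal sketch consistent with the description above. The response encodings, rescaling, and function names are assumptions for illustration, not the authors’ published scoring procedure.

```python
# Scoring sketch for the survey items described above. The response
# encodings, rescaling, and function names are illustrative assumptions,
# not the authors' published scoring code.

CORRECT = "By computer analysis of what stories might interest you"

def score_fb_algorithm(first_answer, best_guess=None):
    """1 if the correct option is chosen at either stage: initially,
    or as a best guess after first answering 'I don't know'."""
    if first_answer == CORRECT:
        return 1
    return 1 if (first_answer == "I don't know" and best_guess == CORRECT) else 0

def score_digital_literacy(term_familiarity, attitudes):
    """Average six 1-5 familiarity ratings (phishing, hashtags, JPEG,
    malware, cache, RSS) with four -4..4 attitude items, each rescaled
    to 0-1. Note: a real scoring script would also reverse-code the
    negatively worded attitude items; that step is omitted here."""
    fam = [(x - 1) / 4 for x in term_familiarity]  # 1..5  -> 0..1
    att = [(x + 4) / 8 for x in attitudes]         # -4..4 -> 0..1
    items = fam + att
    return sum(items) / len(items)

print(score_fb_algorithm("I don't know", CORRECT))               # 1 (correct)
print(score_digital_literacy([3, 5, 4, 2, 1, 3], [2, 4, -1, 0])) # 0.5625
```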


Cite this Essay

Sirlin, N., Epstein, Z., Arechar, A. A., & Rand, D. G. (2021). Digital literacy is associated with more discerning accuracy judgments but not sharing intentions. Harvard Kennedy School (HKS) Misinformation Review . https://doi.org/10.37016/mr-2020-83

Bibliography

Amazeen, M. A., & Bucy, E. P. (2019). Conferring resistance to digital disinformation: The inoculating influence of procedural news knowledge. Journal of Broadcasting & Electronic Media, 63(3), 415–432. https://doi.org/10.1080/08838151.2019.1653101

Badrinathan, S. (2021). Educative interventions to combat misinformation: Evidence from a field experiment in India. American Political Science Review, 115(4), 1325–1341. https://doi.org/10.1017/S0003055421000459

Breakstone, J., Smith, M., Connors, P., Ortega, T., Kerr, D., & Wineburg, S. (2021). Lateral reading: College students learn to critically evaluate internet sources in an online course. Harvard Kennedy School (HKS) Misinformation Review, 2(1). https://doi.org/10.37016/mr-2020-56

Brodsky, J. E., Brooks, P. J., Scimeca, D., Todorova, R., Galati, P., Batson, M., Grosso, R., Matthews, M., Miller, V., & Caulfield, M. (2021). Improving college students’ fact-checking strategies through lateral reading instruction in a general education civics course. Cognitive Research: Principles and Implications, 6(23). https://doi.org/10.1186/s41235-021-00291-4

Epstein, Z., Berinsky, A. J., Cole, R., Gully, A., Pennycook, G., & Rand, D. G. (2021). Developing an accuracy-prompt toolkit to reduce Covid-19 misinformation online. Harvard Kennedy School (HKS) Misinformation Review, 2(3). https://doi.org/10.37016/mr-2020-71

Frederick, S. (2005). Cognitive reflection and decision making. Journal of Economic Perspectives, 19(4), 25–42. https://www.aeaweb.org/articles?id=10.1257/089533005775196732

Guess, A. M., Lerner, M., Lyons, B., Montgomery, J. M., Nyhan, B., Reifler, J., & Sircar, N. (2020). A digital media literacy intervention increases discernment between mainstream and false news in the United States and India. Proceedings of the National Academy of Sciences, 117(27), 15536–15545. https://doi.org/10.1073/pnas.1920498117

Guess, A., & Munger, K. (2020). Digital literacy and online political behavior. OSF Preprints. https://doi.org/10.31219/osf.io/3ncmk

Hameleers, M. (2020). Separating truth from lies: Comparing the effects of news media literacy interventions and fact-checkers in response to political misinformation in the US and Netherlands. Information, Communication & Society, 1–17. https://doi.org/10.1080/1369118X.2020.1764603

Hargittai, E. (2005). Survey measures of web-oriented digital literacy. Social Science Computer Review, 23(3), 371–379. https://doi.org/10.1177/0894439305275911

Jones-Jang, S. M., Mortensen, T., & Liu, J. (2019). Does media literacy help identification of fake news? Information literacy helps, but other literacies don’t. American Behavioral Scientist, 65(2), 371–388. https://doi.org/10.1177/0002764219869406

McGrew, S., Smith, M., Breakstone, J., Ortega, T., & Wineburg, S. (2019). Improving university students’ web savvy: An intervention study. British Journal of Educational Psychology, 89(3), 485–500. https://doi.org/10.1111/bjep.12279

Mosleh, M., Pennycook, G., Arechar, A. A., & Rand, D. G. (2021). Cognitive reflection correlates with behavior on Twitter. Nature Communications, 12(921). https://doi.org/10.1038/s41467-020-20043-0

Mosleh, M., Pennycook, G., & Rand, D. G. (2020). Self-reported willingness to share political news articles in online surveys correlates with actual sharing on Twitter. PLOS One, 15(2). https://doi.org/10.1371/journal.pone.0228882

Newman, N., Fletcher, R., Kalogeropoulos, A., Levy, D. A. L., & Nielsen, R. K. (2018). Reuters Institute digital news report 2018. Reuters Institute for the Study of Journalism. https://s3-eu-west-1.amazonaws.com/media.digitalnewsreport.org/wp-content/uploads/2018/06/digital-news-report-2018.pdf

Osmundsen, M., Bor, A., Vahlstrup, P., Bechmann, A., & Petersen, M. (2021). Partisan polarization is the primary psychological motivation behind political fake news sharing on Twitter. American Political Science Review, 115(3), 999–1015. https://doi.org/10.1017/S0003055421000290

Pennycook, G., Epstein, Z., Mosleh, M., Arechar, A. A., Eckles, D., & Rand, D. G. (2021). Shifting attention to accuracy can reduce misinformation online. Nature, 592, 590–595. https://doi.org/10.1038/s41586-021-03344-2

Pennycook, G., McPhetres, J., Zhang, Y., Jackson, L. G., & Rand, D. G. (2020). Fighting COVID-19 misinformation on social media: Experimental evidence for a scalable accuracy-nudge intervention. Psychological Science, 31(7), 770–780. https://doi.org/10.1177/0956797620939054

Pennycook, G., & Rand, D. G. (2019). Lazy, not biased: Susceptibility to partisan fake news is better explained by lack of reasoning than by motivated reasoning. Cognition, 188, 39–50. https://doi.org/10.1016/j.cognition.2018.06.011

Pennycook, G., & Rand, D. G. (2021). The psychology of fake news. Trends in Cognitive Sciences, 25(5), 388–402. https://doi.org/10.1016/j.tics.2021.02.007

Pennycook, G., & Rand, D. G. (in press). Nudging social media sharing towards accuracy. Annals of the American Academy of Political and Social Science. https://doi.org/10.31234/osf.io/tp6vy

Roozenbeek, J., Freeman, A. L. J., & van der Linden, S. (2021). How accurate are accuracy-nudge interventions? A preregistered direct replication of Pennycook et al. (2020). Psychological Science, 32(7), 1169–1178. https://doi.org/10.1177/09567976211024535

Roozenbeek, J., & van der Linden, S. (2020). Breaking Harmony Square: A game that “inoculates” against political misinformation. Harvard Kennedy School (HKS) Misinformation Review, 1(8). https://doi.org/10.37016/mr-2020-47

Rosenzweig, L. R., Bago, B., Berinsky, A. J., & Rand, D. G. (2021). Happiness and surprise are associated with worse truth discernment of COVID-19 headlines among social media users in Nigeria. Harvard Kennedy School (HKS) Misinformation Review, 2(4). https://doi.org/10.37016/mr-2020-75

Ross, R. M., Rand, D. G., & Pennycook, G. (2021). Beyond “fake news”: Analytic thinking and the detection of false and hyperpartisan news headlines. Judgment and Decision Making, 16(2), 484–504. http://journal.sjdm.org/20/200616b/jdm200616b.pdf

The authors gratefully acknowledge funding from the Ethics and Governance of Artificial Intelligence Initiative of the Miami Foundation, the William and Flora Hewlett Foundation, the Reset Initiative of Luminate (part of the Omidyar Network), the John Templeton Foundation, the TDF Foundation, and Google.

Competing Interests

DR received research support through gifts to MIT from Google and Facebook.

This research was deemed exempt by the MIT Committee on the Use of Humans as Experimental Subjects, # E-2443.

This is an open access article distributed under the terms of the Creative Commons Attribution License , which permits unrestricted use, distribution, and reproduction in any medium, provided that the original author and source are properly credited.

Data Availability

All materials needed to replicate this study are available at https://osf.io/kyx9z/?view_only=762302214a8248789bc6aeaa1c209029 and via the Harvard Dataverse at https://doi.org/10.7910/DVN/N0ITTI (data & code) and at https://doi.org/10.7910/DVN/WG3P0Y (materials).



Leverage Edu

  • School Education /

✍️ Essay on Online Classes: Samples in 100, 150, 200 Words

Updated on Oct 20, 2023

Online classes, also known as virtual classes, have revolutionized education over time. They give students the flexibility to access educational content and interact with teachers from the comfort of their homes. This mode of learning has gained enormous popularity thanks to its accessibility and its ability to cater to diverse learning styles.

In this digital age, online classes have become a fundamental part of education, enabling individuals everywhere to acquire new knowledge and skills. Looking for more information about online classes? You have come to the right place. Below you will find sample essays on online classes in 100, 150, and 200 words.

Table of Contents

  • What are Online Classes?
  • Essay on Online Classes in 100 Words
  • Essay on Online Classes in 150 Words
  • Essay on Online Classes in 200 Words


What are Online Classes?

Online classes are educational courses or learning programs conducted over the internet. They give students the opportunity to study and complete their coursework remotely, from the comfort of their homes. Online classes can form part of formal education at schools and colleges, or they can be offered by various online learning platforms.

Online classes may draw on a variety of digital resources and tools, including quizzes, assignments, video lectures, discussion forums, and communication with classmates and teachers via email, chat, and video calls. This type of learning gives students flexibility in when and where they study and complete coursework. It is especially helpful for those who study part-time, have busy schedules, or prefer remote learning.

With the onset of COVID-19, online classes surged in popularity, which accelerated their evolution. They now offer everything from different levels of formal education to skills training and much more.
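
To make the idea concrete, here is a minimal sketch of how the components described above might be modeled in code. It is purely illustrative: the class names, platform, and URLs are all hypothetical and not taken from any real learning platform.

```python
# An illustrative model (all names hypothetical) of an online class built
# from the components described above: lectures, quizzes, assignments, etc.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Resource:
    kind: str   # e.g. "video lecture", "quiz", "assignment", "forum"
    title: str
    url: str


@dataclass
class OnlineClass:
    name: str
    platform: str                      # a school LMS or an online provider
    resources: List[Resource] = field(default_factory=list)

    def add_resource(self, kind: str, title: str, url: str) -> None:
        self.resources.append(Resource(kind, title, url))


# Usage: assemble a course a student could work through remotely.
course = OnlineClass("Introduction to Essay Writing", platform="ExampleLMS")
course.add_resource("video lecture", "Structuring an Essay", "https://example.com/lec1")
course.add_resource("quiz", "Thesis Statements", "https://example.com/quiz1")
for r in course.resources:
    print(f"{r.kind}: {r.title}")
```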

Essay on Online Classes in 100 Words

Online classes have become a central aspect of modern education. They offer flexibility, accessibility, and convenience, allowing students to learn from the comfort of their homes. The rise of online classes accelerated during the COVID-19 pandemic, which forced a shift from traditional classrooms to virtual learning environments.

However, online classes also have disadvantages: students may struggle with distractions, a lack of in-person interaction, and technical issues. Even so, these classes have opened up new avenues for global collaboration and lifelong learning. In an increasingly digital world, online classes are likely to remain a significant part of education.

Essay on Online Classes in 150 Words

Online classes have become a prevalent mode of education, especially in the past two years. These digital platforms offer several advantages. First, they provide flexibility, allowing students to learn from the comfort of their homes. This is especially beneficial for those with busy schedules or who are studying part-time. 

Second, online classes often offer a wider range of courses, enabling learners to explore diverse subjects. Additionally, these classes promote self-discipline and time management skills as students must regulate their own study routines.

However, there are challenges associated with online learning. Technical issues can disrupt classes, and the lack of face-to-face interaction may hinder social development. It can also be isolating for some students.

In conclusion, online classes offer convenience and a variety of courses, but they also present challenges related to technology and socialization. The future of education likely involves a blend of traditional and online learning methods, catering to diverse learning needs.


Essay on Online Classes in 200 Words

Online classes have become a prevalent mode of education, and this shift has brought both advantages and challenges.

One significant benefit of online classes is accessibility. They allow students from diverse backgrounds and locations to access quality education without geographical constraints. This inclusivity promotes diversity and global learning experiences. Additionally, online classes often offer flexible schedules, enabling students to balance their studies with other responsibilities.

However, online classes present challenges too. Technical issues and a lack of face-to-face interaction can hinder effective learning. Students may also struggle with self-discipline and motivation, leading to a decline in academic performance. Moreover, the absence of physical facilities such as libraries and laboratories can limit hands-on learning opportunities.

In conclusion, online classes have revolutionized education by providing accessibility and flexibility. Yet, they also pose challenges related to technical issues, motivation, and practical experiences. 

FAQs

Can students learn at their own pace in online classes?

Every student has their own pace of study, and this is where distance learning's benefits really shine. In online classes you can go at your own speed, review the material as needed, and complete the work in the way that best suits your learning preferences.

Are online classes effective?

Online courses can be successful provided they are well designed and well delivered, just like any other course or programme. That said, this varies from person to person, as online classes do not suit every student.

What do students need for online classes?

In online education, students study using a computer or laptop and need only a reliable internet connection; a quick way to sanity-check that connection is sketched below.
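
As a purely illustrative aside, the following Python sketch shows one way a student might verify the connection before class. The test URL and timeout are arbitrary placeholders, not recommendations.

```python
# A tiny, illustrative connectivity check (hypothetical URL and threshold),
# using only the standard library, that a student might run before class.
import time
import urllib.request


def connection_ok(url: str = "https://example.com", timeout: float = 5.0) -> bool:
    """Return True if `url` responds within `timeout` seconds."""
    start = time.monotonic()
    try:
        urllib.request.urlopen(url, timeout=timeout)
    except OSError:  # covers URLError, timeouts, and DNS failures
        return False
    return (time.monotonic() - start) < timeout


print("Ready for class!" if connection_ok() else "Check your internet connection.")
```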

For more information on such interesting topics, visit our essay-writing page and follow Leverage Edu!


Malvika Chawla

Malvika is a content writer and news enthusiast with a strong background in journalism who has worked with renowned news websites such as News 9 and The Financial Express. When not writing, she can be found bringing canvases to life with her paintings.



FURTHER READING

  1. (PDF) The Impact of Digital Technology on Learning: A Summary for the ...

    A similar relationship between length of treatment and study outcome has been reported in previous meta-analyses. Kulik et al. (1983), for example, reported an effect size of 0.56 for 4 weeks or less, 0.30 for 5-8 weeks, and 0.20 for more than 8 weeks.

  2. (PDF) Transforming Education in the Digital Age: A Comprehensive Study

    The findings from this study suggest that digital storytelling is a powerful tool to integrate instructional messages with learning activities to create more engaging and exciting learning ...

  3. Digital transformation: a review, synthesis and opportunities for

    A systematic review is a type of literature review that applies an explicit algorithm and a multi-stage review strategy in order to collect and critically appraise a body of research studies (Mulrow 1994; Pittaway et al. 2004; Crossan and Apaydin 2010). This transparent and reproducible process is ideally suited for analyzing and structuring the vast and heterogeneous literature on digital ...

  4. (PDF) Importance of Digital Library in Education

    PDF | On Nov 11, 2021, Rahat Khan published Importance of Digital Library in Education | Find, read and cite all the research you need on ResearchGate

  5. The Impact of Digital Tools on Student Writing and How Writing is

    The study was designed to explore teachers' views of the ways today's digital environment is shaping the research and writing habits of middle and high school students, as well as teachers' own technology use and their efforts to incorporate new digital tools into their classrooms.

  6. Student Writing in the Digital Age

    College student writing today actually is longer and contains no more errors than it did in 1917. A longitudinal study of student writing finds that digital technology has not been the downfall of written expression. "Kids these days" laments are nothing new, but the substance ...

  7. Students' Digital Media Self-Efficacy and Its Importance ...

    Since the prevalent use of digital media for study purposes is confirmed in this study, it is necessary to extend existing instruments on teaching and learning (e.g. Lemos et al. 2011) by integrating the usage of digital media as well as self-efficacy beliefs concerning digital media. Furthermore, our study confirms the importance of our ...

  8. The opportunities and challenges of digital learning

    First, increasing speed and availability of internet access can reduce many of the geographic constraints that disadvantage poor students. Schools serving higher-resourced families are often able ...

  9. Digital Transformation: An Overview of the Current State of the Art of

    Disruptive changes, understood as changes in a company and its operating environment caused by digitalization, possibly leading to the current business becoming obsolete (Parviainen et al., 2017), trigger DT in different environments due to rapid or disruptive innovations in digital technologies. These changes create high levels of uncertainty, and industries and companies try to adapt to these ...

  10. Op-Ed: Do students learn best via printed books or digital texts?

    Overwhelmingly, college students report they concentrate, learn or remember best with paper, according to my research and studies conducted by colleagues. For instance, students say that when ...

  11. A textbook dilemma: Digital or paper?

    You might think that, decades into the digital revolution, we would have a clear answer to this question. Wrong. Earlier this year educational psychologist Patricia Alexander, a literacy scholar at the University of Maryland, published a thorough review of recent research on the topic. She was "shocked," she says, to find that out of 878 potentially relevant studies published between 1992 ...

  12. Impacts of digital technologies on education and factors influencing

    Introduction. Digital technologies have brought changes to the nature and scope of education. Versatile and disruptive technological innovations, such as smart devices, the Internet of Things (IoT), artificial intelligence (AI), augmented reality (AR) and virtual reality (VR), blockchain, and software applications have opened up new opportunities for advancing teaching and learning (Gaol ...

  13. Growing up in a digital world: benefits and risks

    Digital technologies have profoundly changed childhood and adolescence. The internet and the means to access it, such as tablets and smartphones, along with social media platforms and messaging apps, have become integral to the lives of youth around the world. They have transformed their education and learning, the way they make and maintain friendships, how they spend their leisure time, and ...

  14. (PDF) Effectiveness of traditional and online learning: comparative

    The study revealed that digital competence training needs to be implemented effectively, with pre-service teachers collaborating on digital issues regardless of their previous ...

  15. Digital SAT FAQs (article)

    One test for Reading and Writing: While the pencil-and-paper SAT tested reading and writing in separate test sections, the digital SAT combines these topics. Shorter passages (and more of them): Instead of reading long passages and answering multiple questions on each passage, students taking the digital SAT will encounter shorter passages, each with just one follow-up question.

  16. Digital literacy is associated with more discerning accuracy judgments

    It has been widely argued that social media users with low digital literacy—who lack fluency with basic technological concepts related to the internet—are more likely to fall for online misinformation, but surprisingly little research has examined this association empirically. In a large survey experiment involving true and false news posts about politics and COVID-19, we found that digital ...

  17. Essay on Online Classes: Samples in 100, 150, 200 Words

    Essay on Online Classes in 150 Words. Online classes have become a prevalent mode of education, especially in the past two years. These digital platforms offer several advantages. First, they provide flexibility, allowing students to learn from the comfort of their homes. This is especially beneficial for those with busy schedules or who are ...

  18. Positive Effects of Digital Technology Use by Adolescents: A Scoping

    This study examines the research literature published from 2012 to 2022 on the relationship between increases in adolescent consumption of digital technologies and its impact on multiple areas of development, with a focus on how adolescent immersion in an increasingly ubiquitous digital world engenders positive outcomes in terms of brain, cognitive, and social-emotional development.

  19. A Study On Digital Technologies

    A Study On Digital Technologies. The advent of technology has transformed and changed the mode of our lives in all its aspects. People can learn in distant universities without having to travel there. They can access to various sources of information through the World Wide Web. Unlike the traditional schools where students' learning was only ...

  20. Understanding and Teaching the Digital Generation

    The Digital Generation. Screens, speed, and information. These three words define how students in the 21st century are used to interacting with the world. These students, sometimes referred to as ...

  21. (PDF) Digital Entrepreneurship: Research and Practice

    Digital entrepreneurship is broadly defined as creating new ventures and transforming existing businesses by developing novel digital technologies and/or novel usage of such technologies ...

  22. Digital detox: An effective solution in the smartphone era? A

    Both the public and the scientific community use different terms when it comes to non-use of electronic devices. Usually, terms like abstinence, break, disconnection, detox, timeout, or unplugging are used (e.g., Brown & Kuss, 2020; Fioravanti et al., 2019). The important aspect these terms have in common is that they describe a period during which use of digital devices, e.g., tablets, is ...

  23. Free Essay: Digital Citizenship

    Digital citizenship is a concept meant to prepare everyone for appropriate technology use. It includes nine key elements that teach and require the proper use of technology: Digital Etiquette, Digital Communication, Digital Literacy, Digital Access, Digital Commerce, Digital Law, Digital Rights & Responsibilities, Digital ...