NCES Blog

National Center for Education Statistics

Money Matters: Exploring Young Adults’ Financial Literacy and Financial Discussions With Their Parents

Financial literacy is a critical skill that young adults often need as they enter college or the workforce and take on partial or full financial independence and increased financial decision making.

The Program for International Student Assessment (PISA)—which is coordinated by the Organization for Economic Cooperation and Development (OECD)—gives us a unique opportunity to analyze and understand the financial literacy of 15-year-olds in the United States and other education systems around the world. PISA is the only large-scale nationally representative assessment that measures the financial literacy skills of 15-year-olds. The financial literacy domain was administered first in 2012 and then in 2015 and 2018. The 2018 financial literacy cycle assessed approximately 117,000 students, representing about 13.5 million 15-year-olds from 20 education systems. The fourth cycle began in fall 2022 in the United States and is currently being conducted.


How Frequently Do Students Discuss Financial Topics With Their Parents?

In 2018, all education systems that administered the PISA financial literacy assessment also asked students to complete a questionnaire about their experiences with money matters in school and outside of school. In the United States, about 3,500 of the 3,740 students in the total U.S. PISA sample completed the questionnaire.

This blog post explores how frequently students reported talking about the following five topics with their parents (or guardians or relatives):

  1. their spending decisions
  2. their savings decisions
  3. the family budget
  4. money for things they want to buy
  5. news related to economics or finance

Students’ answers were grouped into two categories: frequent (“a few times a month” or “once a week or more”) and infrequent (“never or almost never” or “a few times a year”).
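
As a minimal illustration of this grouping (the function and response labels below are hypothetical, not actual PISA questionnaire codes), the four response options can be collapsed into the two categories like this:

```python
# Collapse the four questionnaire response options into two categories.
# Response strings are illustrative, not the actual PISA variable codes.
FREQUENT = {"A few times a month", "Once a week or more"}
INFREQUENT = {"Never or almost never", "A few times a year"}

def frequency_category(response: str) -> str:
    """Map a raw questionnaire response to 'frequent' or 'infrequent'."""
    if response in FREQUENT:
        return "frequent"
    if response in INFREQUENT:
        return "infrequent"
    raise ValueError(f"Unexpected response: {response!r}")

print(frequency_category("A few times a year"))  # -> infrequent
```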

We first looked at the degree to which students frequently discussed various financial topics with their parents. In 2018, the frequency of student-parent financial discussions varied by financial topic (figure 1):

  • About one-quarter (24 percent) of U.S. 15-year-old students reported frequently discussing with their parents news related to economics or finance.
  • More than half (53 percent) of U.S. 15-year-old students reported frequently discussing with their parents money for things they wanted to buy.

Bar chart showing percentage of 15-year-old students who frequently discuss financial topics with their parents, by topic (spending decisions, savings decisions, family budget, money for things you want to buy, and news related to economics or finance), in 2018


Do male and female students differ in how frequently they discuss financial topics with their parents?

In 2018, higher percentages of female students than of male students frequently discussed with their parents the family budget (35 vs. 32 percent) and money for things they wanted to buy (56 vs. 50 percent). Meanwhile, a lower percentage of female students than of male students frequently discussed with their parents news related to economics or finance (21 vs. 26 percent) (figure 2).


Bar chart showing percentage of 15-year-old students who frequently discuss financial topics with their parents, by topic (spending decisions, savings decisions, family budget, money for things you want to buy, and news related to economics or finance) and gender, in 2018


Are Students’ Financial Literacy Scores Related to How Frequently They Discuss Financial Matters With Their Parents?

The PISA financial literacy assessment, which is scored on a scale of 0 to 1,000, measures students’ financial knowledge in four content areas:

  1. money and transactions
  2. planning and managing finances
  3. risk and reward
  4. the financial landscape

In 2018, the average score of 15-year-old students ranged from 388 points in Indonesia to 547 points in Estonia. The U.S. average (506 points) was higher than the average in 11 education systems, lower than the average in 4 education systems, and not measurably different from the average in 4 education systems. The U.S. average was also not measurably different from the OECD average.

We also examined the relationship between frequent parent–student financial discussions and students’ financial literacy achievement (figure 3). After taking into account students’ gender, race/ethnicity, immigration status, and socioeconomic status—as well as their school’s poverty and location—the results show that students who reported frequently discussing spending decisions with their parents scored 16 points higher on average than did students who reported infrequently discussing this topic. On the other hand, students who reported frequently discussing news related to economics or finance with their parents scored 18 points lower on average than did students who reported infrequently discussing this topic.  
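
The sketch below illustrates this kind of covariate-adjusted comparison using an ordinary least squares regression in Python. The file and variable names are hypothetical, and the example ignores features of the actual PISA analysis (plausible values, replicate weights, and the precise coding of the controls), so it shows the general approach rather than reproducing the published estimates.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical analysis file: one row per student.
# 'fin_lit' is the financial literacy score; 'discuss_spending' is 1 if the
# student reported frequently discussing spending decisions, 0 otherwise.
df = pd.read_csv("pisa_finlit_students.csv")  # hypothetical file name

# Regress the score on the discussion indicator plus student and school controls.
model = smf.ols(
    "fin_lit ~ discuss_spending + C(gender) + C(race_ethnicity) "
    "+ C(immigration_status) + ses + C(school_poverty) + C(school_locale)",
    data=df,
).fit()

# The coefficient on discuss_spending is the adjusted score-point difference
# between frequent and infrequent discussers (about +16 in the published results).
print(model.params["discuss_spending"])
```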


Two-sided horizontal bar chart showing financial literacy score-point differences between students who frequently and infrequently discuss financial topics with their parents, after accounting for student and school characteristics, in 2018


Do Students Think That Young Adults Should Make Their Own Spending Decisions?

We also explored whether students agreed that young people should make their own spending decisions. In 2018, some 63 percent of U.S. 15-year-old students reported they agreed or strongly agreed, while 37 percent reported that they disagreed.

Do male and female students differ in their agreement that young adults should make their own spending decisions?

A lower percentage of female students than of male students agreed or strongly agreed that young people should make their own spending decisions (59 vs. 66 percent). This pattern held even after taking into account students’ race/ethnicity, immigration status, and socioeconomic status, as well as school poverty and location.


Upcoming PISA Data Collections

A deeper understanding of the frequency of parent–student financial conversations, the types of topics discussed, and the relationships between financial topics and financial literacy could help parents and educators foster financial literacy across different student groups in the United States.

PISA began collecting data in 2022 after being postponed 1 year due to the COVID-19 pandemic; 83 education systems are expected to participate. The PISA 2022 Financial Literacy Assessment will include items from earlier years as well as new interactive items. The main PISA results will be released in December 2023, and the PISA financial literacy results will be released in spring/summer 2024.

Be sure to follow NCES on Twitter, Facebook, LinkedIn, and YouTube and subscribe to the NCES News Flash to receive notifications when these new PISA data are released.

 

By Saki Ikoma, Marissa Hall, and Frank Fonseca, AIR

International Computer and Information Literacy Study: 2023 Data Collection

In April, the National Center for Education Statistics (NCES) will kick off the 2023 International Computer and Information Literacy Study (ICILS) of eighth-grade students in the United States. This will be the second time the United States has participated in ICILS.

What is ICILS?

ICILS is a computer-based international assessment of eighth-grade students’ capacity to use information and communications technologies (ICT)[1] productively for a range of different purposes. It is sponsored by the International Association for the Evaluation of Educational Achievement (IEA) and conducted in the United States by NCES.

In addition to assessing students on two components—computer and information literacy (CIL) and computational thinking (CT)—ICILS also collects information from students, teachers, school principals, and ICT coordinators on contextual factors that may be related to students’ development in CIL.

Why is ICILS important?

ICILS measures students’ skills with ICT and provides data on CIL. In the United States, the development of these skills is called for in the Federal STEM Education Strategic Plan. Outside of the United States, ICILS is also recognized as an official EU target by the European Council and EU member states to support strategic priorities toward the European Education Area and Beyond (2021–2030). From a global perspective, ICILS provides information for monitoring progress toward the UNESCO Sustainable Development Goals (SDGs).

The measurement of students’ CIL is highly relevant today—digital tools and online learning became the primary means of delivering and receiving education during the onset of the coronavirus pandemic, and technology continually shapes the way students learn both inside and outside of school.

ICILS provides valuable comparative data on students’ skills and experience across all participating education systems. In 2018, ICILS results showed that U.S. eighth-grade students’ average CIL score (519) was higher than the ICILS 2018 average score (496) (figure 1).


Horizontal bar chart showing average CIL scores of eighth-grade students, by education system, in 2018

* p < .05. Significantly different from the U.S. estimate at the .05 level of statistical significance.
NOTE: CIL = computer and information literacy. The ICILS CIL scale ranges from 100 to 700. The ICILS 2018 average is the average of all participating education systems meeting international technical standards, with each education system weighted equally. Education systems are ordered by their average CIL scores, from largest to smallest. Italics indicate the benchmarking participants.
SOURCE: International Association for the Evaluation of Educational Achievement (IEA), International Computer and Information Literacy Study (ICILS), 2018.


ICILS data can also be used to examine various topics within one education system and shed light on the variations in the use of digital resources in teaching and learning among student and teacher subgroups. For example, in 2018, lower percentages of mathematics teachers than of English language arts (ELA) and science teachers often or always used ICT to support student-led discussions, inquiry learning, and collaboration among students (figure 2).


Stacked horizontal bar chart showing percentage of U.S. eighth-grade teachers who often or always use ICT, by selected teaching practice and subject (English language arts, math, and science), in 2018

NOTE: ICT = information and communications technologies. Teaching practices are ordered by the percentage of English language arts teachers using ICT, from largest to smallest. Science includes general science and/or physics, chemistry, biology, geology, earth sciences, and technical science.
SOURCE: International Association for the Evaluation of Educational Achievement (IEA), International Computer and Information Literacy Study (ICILS), 2018.


What does the ICILS 2023 data collection include?

In November 2022, NCES started the preparation work for the ICILS 2023 main study data collection, which is scheduled for administration from April to June 2023. Eighth-grade students and staff from a nationally representative sample of about 150 schools will participate in the study.

Students will be assessed on CIL (which focuses on understanding computer use, gathering information, producing information, and communicating digitally) and CT (which focuses on conceptualizing problems and operationalizing solutions). In addition to taking the assessment, students will complete a questionnaire about their access to and use of ICT.

Teachers will be surveyed about their use of ICT in teaching practices, ICT skills they emphasize in their teaching, their attitudes toward using ICT, and their ICT-related professional development. In addition, principals and ICT coordinators will be surveyed about ICT resources and support at school, priorities in using ICT, and management of ICT resources.

In 2023, more than 30 education systems will participate in the study and join the international comparisons. When ICILS 2023 results are released in the international and U.S. reports in November 2024, we will be able to learn more about the changes in students’ and teachers’ technology use over the past 5 years by comparing the 2023 and 2018 ICILS results. Such trend comparisons will be meaningful given the increased availability of the Internet and digital tools during the pandemic.

 

Explore the ICILS website to learn more about the study, and be sure to follow NCES on Twitter, Facebook, LinkedIn, and YouTube and subscribe to the NCES News Flash to stay up-to-date on future ICILS reports and resources.

 

By Yan Wang and Yuqi Liao, AIR

 


[1] Refers to technological tools and resources used to store, create, share, or exchange information, including computers, software applications, and the Internet.

NCES Celebrates IES and NCES Anniversaries With Retrospective Report on Federal Education Statistics

This year marks the 20th anniversary of the Institute of Education Sciences (IES) and 155 years since the creation of a federal agency to collect and report education statistics for the United States, a role now fulfilled by the National Center for Education Statistics (NCES). To celebrate both of these anniversaries, NCES has just released a new commemorative report—A Retrospective Look at U.S. Education Statistics—that explores the history and use of federal education statistics.



The 11 statistical profiles in phase I of this report can be found within two tabs: Elementary and Secondary Education and Postsecondary Education. Users can toggle between these two tabs and then select a particular statistical profile in the drop-down menu, such as Number of Elementary and Secondary Schools, High School Coursetaking, Enrollment in Postsecondary Institutions, and Postsecondary Student Costs and Financing.


Image of report website showing tabs for Elementary and Secondary Education and Postsecondary Education and the drop-down menu to select individual statistical profiles


Each of the statistical profiles in this report is broken down into the following sections:

  • what the statistic measures (what the data may indicate about a particular topic)
  • what to know about the statistic (the history of the data collection and how it may have changed over time)
  • what the data reveal (broad historical trends/patterns in the data, accompanied by figures)
  • more information (reference tables and related resources)

Each statistical profile can be downloaded as a PDF, and each figure within a profile can be downloaded or shared via a link or on social media.

For background and context, this report also includes a Historical Event Timeline. In this section, readers can learn about major periods of prolonged economic downturn, periods of military action, and periods when U.S. troops were drafted as a part of military action—as well as major pieces of federal legislation—and how some of these events could have disrupted the nation’s social life and schooling or impacted education across the country.

The report also includes a brief overview of NCES, which can be accessed by expanding the dark blue bar labeled NCES Overview: Past, Present, and Future. This section covers the history of NCES and its mission, the evolution of NCES reports and data collections, and current and future changes to NCES’s reporting methods.


Image of report website showing introductory text and the NCES Overview blue bar


This commemorative guide to federal education statistics is not intended to be a comprehensive report on the subject but rather a resource that provides an in-depth look at a selection of statistics. Stay tuned for the release of phase II next year, which will include additional statistical profiles. Be sure to follow NCES on Twitter, Facebook, LinkedIn, and YouTube and subscribe to the NCES News Flash to stay up-to-date!

 

By Megan Barnett, AIR

NCES Releases Indicators on Rural Education

NCES is excited to announce the release of five Education Across America indicators that focus on education in rural areas. These indicators summarize data patterns and provide analyses of the rural education experience across a range of topics.

For example, Rural Students’ Access to the Internet highlights the percentage of students in rural areas who had no internet access or only dial-up access to the Internet in 2019 (7 percent or 663,000 students). This percentage was higher than the percentages for students in towns (6 percent), cities (5 percent), and suburban areas (3 percent). In addition, compared with students in other locales, it was less common for students in rural areas to have fixed broadband internet access at home and more common for them to have only mobile broadband internet access at home. 


Figure 1. Percentage of 5- to 17-year-old students with no access to the Internet or only dial-up access to the Internet at home, by home locale: 2019


Horizontal bar chart showing the percentage of 5- to 17-year-old students with no access to the Internet or only dial-up access to the Internet at home in 2019, by home locale

NOTE: "No access to the Internet or only dial-up access to the Internet" includes households where no member accesses the Internet at home as well as households where members access the Internet only with a dial-up service. Data are based on sample surveys of the entire population residing within the United States. This figure includes only students living in households, because respondents living in group quarters (e.g., shelters, healthcare facilities, or correctional facilities) were not asked about internet access. Excludes children under age 15 who are not related to the householder by birth, marriage, or adoption (e.g., foster children) because their family and individual income is not known and a poverty status cannot be determined for them. Although rounded numbers are displayed, figures are based on unrounded data.

SOURCE: U.S. Department of Commerce, Census Bureau, American Community Survey (ACS), 2019, Restricted-Use Data File. See Digest of Education Statistics 2020, table 218.70.


These indicators are currently available through the Condition of Education Indicator System. To access them, select Explore by Indicator Topics and then select the Education Across America icon.


Image of the Condition of Education's Explore by Indicator Topics page highlighting the Education Across America section


Stay tuned for the release of additional indicators in early 2023. Then, in spring/summer 2023, check back to explore our highlights reports—which will explore key findings across multiple indicators grouped together by a theme—and our spotlight on distant and remote rural areas and the unique challenges they face.

Explore the Education Across America resource hub—including locale definitions, locale-focused resources, and reference tables with locale-based data—and watch this video to learn more about the hub. Be sure to follow NCES on Twitter, Facebook, LinkedIn, and YouTube and subscribe to the NCES News Flash to stay up-to-date on Education Across America releases and resources.

 

By Xiaolei Wang and Jodi Vallaster, NCES

U.S. Is Unique in Score Gap Widening in Mathematics and Science at Both Grades 4 and 8: Prepandemic Evidence from TIMSS

Tracking differences between the performance of high- and low-performing students is one way of monitoring equity in education. These differences are referred to as achievement gaps or “score gaps,” and they may widen or narrow over time.

To provide the most up-to-date international data on this topic, NCES recently released Changes Between 2011 and 2019 in Achievement Gaps Between High- and Low-Performing Students in Mathematics and Science: International Results From TIMSS. This interactive web-based Stats in Brief uses data from the Trends in International Mathematics and Science Study (TIMSS) to explore changes between 2011 and 2019 in the score gaps between students at the 90th percentile (high performing) and the 10th percentile (low performing). The study—which examines data from 47 countries at grade 4, 36 countries at grade 8, and 29 countries at both grades—provides an important picture of prepandemic trends.
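
As a simplified illustration of the score-gap measure (using simulated scores; the actual TIMSS analysis relies on plausible values and sampling weights), the gap in each year is the difference between the 90th and 10th percentile scores, and the change is the difference between the two years’ gaps:

```python
import numpy as np

def score_gap(scores: np.ndarray) -> float:
    """Gap between high (90th percentile) and low (10th percentile) performers."""
    return np.percentile(scores, 90) - np.percentile(scores, 10)

# Hypothetical score distributions for one subject and grade.
rng = np.random.default_rng(0)
scores_2011 = rng.normal(loc=540, scale=75, size=5_000)
scores_2019 = rng.normal(loc=538, scale=85, size=5_000)  # more spread -> wider gap

gap_change = score_gap(scores_2019) - score_gap(scores_2011)
print(f"Change in score gap: {gap_change:+.1f} points")
```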

This Stats in Brief also provides new analyses of the patterns in score gap changes over the last decade. The focus on patterns sheds light on which part of the achievement distribution may be driving change, which is important for developing appropriate policy responses. 


Did score gaps change in the United States and other countries between 2011 and 2019?

In the United States, score gaps consistently widened between 2011 and 2019 (figure 1). In fact, the United States was the only country (of 29) where the score gap between high- and low-performing students widened in both mathematics and science at both grade 4 and grade 8.


Figure 1. Changes in score gaps between high- and low-performing U.S. students between 2011 and 2019

Horizontal bar chart showing changes in score gaps between high- and low-performing U.S. students between 2011 and 2019

* p < .05. Change in score gap is significant at the .05 level of statistical significance.

SOURCE: Stephens, M., Erberber, E., Tsokodayi, Y., and Fonseca, F. (2022). Changes Between 2011 and 2019 in Achievement Gaps Between High- and Low-Performing Students in Mathematics and Science: International Results From TIMSS (NCES 2022-041). U.S. Department of Education. Washington, DC: National Center for Education Statistics, Institute of Education Sciences. Available at https://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2022041.


For any given grade and subject combination, no more than a quarter of participating countries had a score gap that widened, and no more than a third had a score gap that narrowed—further highlighting the uniqueness of the U.S. results.


Did score gaps change because of high-performing students, low-performing students, or both?

At grade 4, score gaps widened in the United States between 2011 and 2019 due to decreases in low-performing students’ scores, while high-performing students’ scores did not measurably change (figure 2). This was true for both mathematics and science and for most of the countries where score gaps also widened.


Figure 2. Changes in scores of high- and low-performing U.S. students between 2011 and 2019

Horizontal bar chart showing changes in scores of high- and low-performing U.S. students between 2011 and 2019 and changes in the corresponding score gaps

* p < .05. The 2019 score gap is significantly different from the 2011 score gap.

SOURCE: Stephens, M., Erberber, E., Tsokodayi, Y., and Fonseca, F. (2022). Changes Between 2011 and 2019 in Achievement Gaps Between High- and Low-Performing Students in Mathematics and Science: International Results From TIMSS (NCES 2022-041). U.S. Department of Education. Washington, DC: National Center for Education Statistics, Institute of Education Sciences. Available at https://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2022041.


Low-performing U.S. students’ scores also dropped in both subjects at grade 8, but at this grade the drops were accompanied by rises in high-performing students’ scores. This pattern, in which the two ends of the distribution move in opposite directions, led to the relatively large changes in U.S. score gaps. Among the other countries with widening score gaps at grade 8, this pattern of divergence was not common in mathematics but was more common in science.

In contrast, in countries where the score gaps narrowed, low-performing students’ scores generally increased. In some cases, the scores of both low- and high-performing students increased, but the scores of low-performing students increased more.

Countries with narrowing score gaps typically also saw their average scores rise between 2011 and 2019, demonstrating improvements in both equity and achievement. This was almost never the case in countries where the scores of low-performing students dropped, highlighting the global importance of not letting this group of students fall behind.  


What else can we learn from this TIMSS Stats in Brief?

In addition to providing summary results (described above), this interactive Stats in Brief allows users to select a subject and grade to explore each of the study questions further (exhibit 1). Within each selection, users can choose either a more streamlined or a more expanded view of the cross-country figures and walk through the findings step-by-step while key parts of the figures are highlighted.


Exhibit 1. Preview of the Stats in Brief’s Features

Image of the TIMSS Stats in Brief web report


Explore NCES’s new interactive TIMSS Stats in Brief to learn more about how score gaps between high- and low-performing students have changed over time across countries.

Be sure to follow NCES on Twitter, Facebook, LinkedIn, and YouTube and subscribe to the NCES News Flash to stay up-to-date on TIMSS data releases and resources.

 

By Maria Stephens and Ebru Erberber, AIR; and Lydia Malley, NCES