Moler, Mary C. (2008). "The Relationship between the Curriculum, Instruction, and Assessment Provided by Wyoming High School Mathematics Teachers and the Performance of Wyoming 11th Grade Students on the Adequate Yearly Progress of Wyoming Schools." Public Access Theses and Dissertations from the College of Education and Human Sciences, 24. https://digitalcommons.unl.edu/cehsdiss/24
THE RELATIONSHIP BETWEEN THE CURRICULUM, INSTRUCTION, AND
ASSESSMENT PROVIDED BY WYOMING HIGH SCHOOL MATHEMATICS
TEACHERS AND THE PERFORMANCE OF WYOMING 11TH GRADE STUDENTS
ON THE ADEQUATE YEARLY PROGRESS OF WYOMING SCHOOLS
by
Mary C. Moler
A DISSERTATION
Lincoln, Nebraska
August 2008
THE RELATIONSHIP BETWEEN THE CURRICULUM, INSTRUCTION, AND
ASSESSMENT PROVIDED BY WYOMING HIGH SCHOOL MATHEMATICS
TEACHERS AND THE PERFORMANCE OF WYOMING 11TH GRADE STUDENTS
ON THE ADEQUATE YEARLY PROGRESS OF WYOMING SCHOOLS
Mary C. Moler, Ph.D.
University of Nebraska, 2008
Advisor: David Fowler
CHAPTER
1 INTRODUCTION . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
Problem Statement . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
Purpose Statement . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2
Research Questions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6
Curriculum Changes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6
Standards Movement . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
Accountability . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
Instructional Changes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20
Teacher Practices . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23
Professional Development . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 34
Assessment Changes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36
3 METHODOLOGY . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 43
Population . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 43
Sampling Techniques . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 45
Pilot Survey . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 47
Questionnaire . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 50
Research Questions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 55
Data Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 56
4 RESULTS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 58
Response Rate . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 58
Reliability . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 58
Variables Analyzed . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 59
Effect Size . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 69
ANOVA Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 73
t-Test . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 73
5 DISCUSSION . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 74
Questionnaire Limitations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 74
Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 75
REFERENCES . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 86
APPENDIX
FIGURE
CHAPTER ONE: INTRODUCTION
Problem Statement
During the 1980s and 1990s, the content, instruction, and assessment of high school mathematics changed, as did the knowledge and skills, and the results that these high school students had demonstrated. These changes occurred due to the national standards movement, technology advances, and a desire for increased accountability. The culmination of these changes came following the United States government mandated No Child Left Behind (NCLB) Act.
Before the 1990s, a teacher's classroom accountability was based mainly on the number of students passing the mathematics class. That record was part of the school's state accreditation report. Currently, classroom and state assessments are accountability records for the teacher's classroom grades, for student graduation requirements, and for the school's adequate yearly progress (AYP). According to the mandates of NCLB, teachers are also accountable for how much mathematics knowledge and how many mathematics skills students can demonstrate on the statewide assessment.
Presently, accountability is demanded not only at the state level with accreditation, but also on a national level with the National Report Card. A school's student achievement level on a statewide assessment system is the main part of the school's AYP. If any subgroup's proficiency level from a school's PAWS scores does not reach the state determined target score, then the school has not made AYP. The nine subgroups are "all students," "free/reduced lunch," . . . "Individualized Education Plan (IEP)," and "Limited English Proficiency (LEP)" (US Department of Education, 2005, p. 21). If a school does not meet the State's definition for AYP for "two or more consecutive years," the school is then given three years of interventions defined in NCLB. Teachers must be aware of what the students already know and what skills they need to learn in order to be proficient. These factors are embedded within the Wyoming State statutes Title 21, called the Wyoming Education Code of 1969 (Title 21, 2006). Under the NCLB Act, the state determines which schools make AYP based on the statewide assessment results. States have yearly increasing AYP targets that all student subgroups in a school must achieve. By 2014, all students in all subgroups must be 100% proficient (NCLB Annual Report, 2005, p. 1).
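To illustrate what yearly increasing AYP targets toward 100% proficiency imply, the short Python sketch below generates a hypothetical target schedule. It is an illustration only: it assumes a made-up 2001-2002 baseline and a simple linear climb to 100% by 2014, whereas the actual trajectories, including Wyoming's, were set by each state and often rose in steps rather than linearly.

# Hypothetical illustration of rising AYP proficiency targets under NCLB.
# The baseline value and the linear trajectory are assumptions, not Wyoming's plan.
def linear_ayp_targets(baseline_pct, start_year=2002, end_year=2014):
    """Return a year -> target-percent map that climbs evenly to 100% proficient."""
    step = (100.0 - baseline_pct) / (end_year - start_year)
    return {year: round(baseline_pct + step * (year - start_year), 1)
            for year in range(start_year, end_year + 1)}

# Example with an assumed 40% proficient baseline for the 2001-2002 school year.
for year, target in linear_ayp_targets(40.0).items():
    print(year, f"{target}%")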
Purpose Statement
Accountability of teachers has moved from the classroom level to the national level. The accountability measure on the standards for schools has become the AYP, adequate yearly progress. Did the school make AYP or not make AYP? The important question is what teachers' practices in curriculum, instruction, and assessment are related to the schools making AYP. In Wyoming high school mathematics courses, are there practices in curriculum, instruction, and assessment that are related to the schools making AYP? The purpose of this study is to determine and to highlight key findings of these relationships for consideration by Wyoming school districts and teachers while they are striving to improve the teaching of mathematics and student learning of mathematics in the state's high schools. Higher performance by students taking the grade 11 PAWS mathematics assessment would help the school to make AYP. It is, therefore, imperative to know what critical parts in curriculum, instruction, and assessment help students demonstrate proficiency on PAWS and help the schools to make AYP. The school should be on target toward achieving the goal of 100% proficiency by 2014 as outlined in the NCLB Act.
Between 2000 and 2006, two dissertations had surveyed Wyoming elementary teachers in grades two through four. One dissertation looked at instructional issues and achievement on the mathematics portion of the statewide assessments. The other focused on the leadership skills of the principal and the achievement on the mathematics portion of the statewide assessments. No comparable study had surveyed Wyoming high school mathematics teachers before the 2006-2007 school year. A self-reporting survey of high school mathematics teachers in the state of Wyoming was conducted in the 2006-2007 school year.
Research Questions
A survey was given to the high school mathematics teachers in Wyoming. The
survey was designed and created to answer the questions, "Are mathematics teachers' practices in curriculum, instruction, and assessment related to the school's making adequate yearly progress, AYP?" and "Does the mathematics course level affect the teachers' practices?" AYP is the accountability measure defined in the NCLB Act used to determine if the school is achieving proficiency on the state
standards. The course level deals with the mathematical content for a targeted class. The
teachers’ responses were stratified into three levels based on the target class chosen. If
the target class covered curriculum below Algebra 1 topics, then this teacher was
assigned to level one, a lower level. If the curriculum for the target was accelerated from
the normal progression as deemed by the teacher, then the teacher was assigned to level
three, an honors level. The remaining teachers were assigned to level two, a regular
level.
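Purely as an illustration of the stratification rule just described, the following Python sketch assigns a teacher's target class to one of the three levels; the argument names are hypothetical and are not taken from the survey instrument.

# A minimal sketch of the three-level course stratification described above.
# Both flags are hypothetical survey-derived indicators.
def course_level(covers_below_algebra_1, accelerated_per_teacher):
    """Return 1 (lower), 2 (regular), or 3 (honors) for a teacher's target class."""
    if covers_below_algebra_1:
        return 1  # curriculum below Algebra 1 topics: lower level
    if accelerated_per_teacher:
        return 3  # accelerated beyond the normal progression: honors level
    return 2      # all remaining target classes: regular level

# Example: a standard Algebra 2 section would fall into level 2 (regular).
print(course_level(covers_below_algebra_1=False, accelerated_per_teacher=False))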
The purpose of this study was to answer the following research questions:
2. What is the relationship between the type of mathematics instruction given and the school's making AYP?
3. What is the relationship between the type of classroom assessments given and the school's making AYP?
5. How did the responses compare on mathematical instructional questions from the teachers at the three course levels?
6. How did the responses compare on mathematical assessment questions from the teachers at the three course levels?
Introduction
This chapter will review research available on the reforms in education that have
resulted from the standards movement and the effects of the standards movement on the
content, instruction, and assessment that Wyoming students might experience in high
school. The term “standards” usually refers to the content, including knowledge,
processes, and performance level in a subject area that students are to learn in school.
The standards movement is a label for all of the actions taken in the attempt to set
benchmarks in the content areas. The first part of the chapter will highlight the literature
of factors that brought about both the standards movement and an increase in
accountability. The second part will highlight research on reforms in instruction and the
professional development needed to implement these reforms. The third part will
highlight investigations of school and classroom assessments and the adequate yearly
progress (AYP) requirements resulting from the No Child Left Behind (NCLB) Act.
Even with high standards, exemplary textbooks, and powerful assessments, what
really matters for mathematics learning are the interactions that take place in classrooms, [yet the research literature] contains little reliable data about those interactions. (National Research Council,
2001, p. 45)
Curriculum Changes
Background
Changes in mathematics education in the United States accelerated after the 1957 launch of the first earth satellite by the Soviet Union and the widespread concern in the United States that followed. The significant forces in society that contributed to the change in mathematics education were advances in technology, international assessments, and the needs of business and industry. The Standards
movement called for reforms in curriculum, instruction, and assessments. The demand
for increased accountability for public schools was a major factor in the passage of the
NCLB Act.
Technology has become a part of everyday life at home and at work. Advances in
technology have led to smaller, more powerful technological tools that are accessible to
everyone. Beginning with the October 4, 1957, launch of Sputnik, the first space satellite,
technology continued to advance with the 1975 personal computer revolution exemplified
in the release of the MITS Altair 8800 and advancing in 1977 with the Apple II (World
Almanac and Book of Facts, 2002, p. 628). In the classroom, the availability of the computer and the graphing calculator has had a major influence on content and
instruction. “Moses (2001) argues that those who are technologically literate will have
access to jobs and economic enfranchisement, while those without such skills will not”
The first international mathematics study, “carried out in the early 1960s,”
(Robitaille & Travers, 1992, p. 701) involved 12 participating countries. The second
study, which included 20 countries, was done “in the early 1980s” (Robitaille & Travers,
1992, p. 701). The media usually reported the ranking of the countries, but the main
purpose of evaluating these international students was “trying to determine the effects of
particular variables on teaching and learning” (Robitaille & Travers, 1992, p. 688).
Information from both the first and second studies, which was often overlooked by the
media, indicated that “the highest ability students from almost all systems perform at
about the same level on topics which they all have studied" (Robitaille & Travers, 1992).
The Trends in International Mathematics and Science Study (TIMSS) was offered
in 1995, 1999, 2003, and 2007. In a 1998 press release, the United States secondary students were reported to have scored among the lowest of the participating countries in the TIMSS twelfth-grade study (Math and Science Achievement, 1999, p. 205). "TIMSS is based on a model of curriculum that has three components: the [intended, the implemented, and the attained curriculum]" (National Center for Education Statistics, 2006, FAQType=2, para. 1). "TIMSS is [designed to measure] the degree to which students have learned concepts in mathematics and science they have
encountered in school” (Gonzales et al., 2004, p. 1). The students answer the “same
assessment and questionnaire items, albeit in the language of instruction. . . . [making it]
possible to compare the performance of students in the United States on mathematics and
science items to that of their peers around the world” (National Center for Education
Statistics, 2006, FAQType=2, para. 5). The questionnaires gather data regarding the
students, teachers, principals, and school issues relating to learning, teaching, and types
of behavior.
In the early to mid-1950s, businesses and the military attacked the public school
systems “for graduating young adults who lacked basic computational skills” (Kilpatrick,
1992, p. 24). Colleges attacked the school systems for “failing to equip their entrants
with a knowledge of mathematics adequate for college work” (Kilpatrick, 1992, p. 24).
The public attacked the school systems "for having watered down the curriculum" (Kilpatrick, 1992, p. 24). "By the end of the 1950s, widening discontinuities between the mathematics taught in universities and that taught in the lower schools" and the declining enrollments in advanced mathematics led to reform efforts in various countries that collectively became known as "the new math" (Kilpatrick, 1992,
p. 23). Higher academic achievement and more rigorous courses in math and science
were demanded for students. Congress responded by passing the “1958 National Defense
Education Act, which provided fellowships, grants, and loans for students in higher education." One response to these demands was "New Math." Its change in instruction was "to meet the new demands
made by science, industry, and government” (Adler, 1972, p. 217). The new math
emphasized the teaching of different bases other than base 10, set theory, functions, and
diagram drawings. Set theory was to be introduced early in the curriculum. "In the [United States, regional educational] laboratories were established between 1965 and 1967 as a consequence of the Elementary
and Secondary Education Act of 1965” (Kilpatrick, 1992, p. 26). Parents and teachers
complained that this curriculum took time away from the basics. The "New Math" fell out of favor before the end of the decade, though it continued to be taught for years thereafter. Later studies attempted to evaluate "the effects of the new-math curriculum revision efforts" (Kilpatrick, 1992, p. 29). New assessment programs were designed not to evaluate curricula but to describe and contrast levels of performance. These included the National Assessment of Educational Progress (NAEP), a mathematics assessment conducted in the United States every 4 or 5 years since 1972-1973 (Kilpatrick, 1992, p. 29). NAEP has tracked United States public school students' progress in "reading,
mathematics, science, writing, U.S. history, civics, geography, and the arts” (National
Center for Education Statistics, 2006, NAEP Overview, para. 1). NAEP measures both
public school and non-public school achievement in the listed subjects for students in
grades 4, 8, and 12. Two reports given by NAEP relate to national trends and to students’
point when viewing a (sic) state’s academic testing program. Results from this year’s
[2005] NAEP in Wyoming continued to track in roughly the same pattern as past
WyCAS scores” (Wyoming Department of Education, 2005, para. 19). Both TIMSS and
NAEP results indicate the general average scores in mathematics. NAEP “allows the
eighth grade – against which to compare what students actually know at the end of eighth
grade” (Gonzales et al., 2004, p. 101). TIMSS allows comparisons of student
populations with similar numbers of years of schooling. NAEP and TIMSS track the mathematics achievement of students over time.
Achievement levels on national and international tests and the level of student
preparedness for continued academic achievement or for a job were indicators of the need
for a change in the mathematics curriculum. “The reform curriculum, in contrast, calls
for instruction that provides all students with the mathematical background for
quantitative literacy for the workplace and for study at the college level” (Holloway,
2004, p. 85).
involved curriculum and instruction (Jones & Coxford, 1970, p. 2). “Throughout the
decade of the seventies, the mathematics education community seemed to be groping for
a clearer focus and sense of direction" (Hill, 1983, p. 1). The National Council of Teachers of Mathematics (NCTM) recognized the need to prepare students for an informational society. NCTM responded with a position paper, Agenda for Action, which set directions for school mathematics in the 1980s. In 1983, the National Commission on Excellence in Education published A Nation at Risk, a call for major educational reform. This report was written in lay terms and
discussed the mediocre level of student achievement and the lack of student preparedness
for further education (A Nation at Risk, 1983, p. 1; Ravitch, 2000, p. 411). The report
looked at the numbers of students taking remedial and advanced classes. Unfortunately,
Holloway (2004) found the situation had changed little for minorities. In “1997, 33 of
every 1,000 white 12th graders enrolled in this course [Advanced Placement Calculus],
but only 7 of every 1,000 black students and 12 of every 1,000 Hispanic students took on
this challenge” (p. 84). “The opportunity for all students to learn mathematics has been
heralded as the new ‘civil right’” (Boaler, 2006, p. 364). “The evidence indicates that the
traditional curriculum and instructional methods in the United States are not serving our
students well” (Hiebert, 1999, p. 13). The National Research Council adds more
(National Research Council, 2001, p. 407) in school mathematics programs and that all
Standards Movement
The National Council of Teachers of Mathematics established the Commission on Standards for School Mathematics, whose purpose was to "help improve the quality of school mathematics." Standards were adopted to ensure quality, to indicate goals, and to promote change (National Council of Teachers of Mathematics, 1989). The mathematics community responded with two major reports in 1989. The first was Everybody Counts from the Mathematical Science Education Board (MSEB), and the second was the NCTM Curriculum and Evaluation Standards for School Mathematics. Both argued that all students need a quality mathematics curriculum in order to be contributing members of society. The new social goals include "(1) mathematically literate workers, (2) lifelong learning, (3) opportunity for all, and (4) an informed electorate" (National Research Council, 1989). The Curriculum and Evaluation Standards document sets new goals for all students in order to achieve mathematical literacy or
“mathematical power.” In a world “teeming with data,” (Baker & Leak, 2006, para. 10)
the 21st century” (Baker & Leak, 2006, para. 11). “Math will be involved in students’
everyday lives more than ever before, and this means students must become familiar with
it to succeed” (Franklin, 2006b, p. 12). “Many security-related jobs – from data analysis
to cryptography – increasingly require the kinds of advanced math skills that American
students aren’t learning” (Franklin, 2006, p. 11). School mathematics should not be “set
in stone” (National Research Council, 2001, p. 407). Students need to be able “to deal
with mathematics on a higher level than they did just 20 years ago” (National Research
Council, 2001, p. 407). Managers and entrepreneurs “still must understand enough about
math to question the assumptions behind the numbers” (Baker & Leak, 2006, para. 36).
“The country must breed more top-notch mathematicians at home, . . . [and] must
cultivate greater math savvy among the broader population” (Baker & Leak, 2006, para.
32). People have different views of mathematics, and “these diverse views of the nature
of mathematics also have a pronounced impact on the ways in which our society
conceives of mathematics and reacts to its ever-widening influence on our daily lives”
The profession responded through the next 12 years with more reform documents
that focused not just on curriculum, as in the Curriculum and Evaluation Standards for School Mathematics (1989), but also on teaching in the Professional Standards for Teaching Mathematics (1991) and the classroom
assessments in the Assessment Standards for School Mathematics (1995). The final
document, Principles and Standards for School Mathematics (2000), from NCTM,
revisited all these areas, in an effort to improve mathematics education, and placed all
the reforms in one resource guide for the stakeholders. Schoenfeld (2002) states that the
NCTM’s Principles and Standards calls for “equity . . .; coherent curricula . . .; teacher
professionalism, . . . ; and the effective use of assessment and technology in the service of learning." The calls for reform were reinforced by student achievement on TIMSS and NAEP, and the economic demands of a global
economy. The National Research Council (2001) stated that “experiences, discussions,
and review of the literature have convinced us that school mathematics demands
substantial change” (p. 407). “The impact of curriculum as a variable must be recognized
and taken into account” (Robitaille & Travers, 1992, p. 689). Cohen (1995) found that
“systemic reform has had significant effects” (p. 11). “Prevailing patterns of curriculum,
sociology)” (NCTM Research Committee, 2006, p. 76). Thus, it is likely that as these
Accountability
expectations can be set that every student can strive for and achieve, that different
performances can and will meet agreed-on expectations, and that teachers can be fair and
assessments. At the same time, education is viewed "as an industry," with increasing demands for accountability (Robitaille & Travers, 1992, p. 688). The required reforms needed to involve all stakeholders: parents, teachers, administrators, businesses and industries, and public policy people. Teachers "teach in a system that
currently works against improvement. Unless other important players get involved, our
country cannot implement a program that allows teachers to improve teaching” (Stigler &
Hiebert, 1999, p. xii). “Teaching, as a cultural activity, fits within a variety of social,
economic, and political forces in our society” (Stigler & Hiebert, 1997, para. 37).
Written testing for accountability began with an 1845 survey of schools in Boston, Massachusetts, to show that the schools were doing the job the state expected of them
(Kilpatrick, 1992, p. 14). The ranking of schools, of evaluating “how much a student had
learned”, and of judging “the effectiveness of a school’s program” all began with this
survey (Kilpatrick, 1992, p. 14). The “written short-answer tests became the medium of
choice” (Kilpatrick, 1992, p. 14). In 1892 Joseph Mayer Rice investigated various large
city public school systems. Rice examined “factors that might account for the differences
between the schools in arithmetic achievement" (Kilpatrick, 1992, p. 15). His method of examination and inspection of "characteristics of high- and low-scoring schools" did not allow him to
estimate the strength or shape of the relationship between some hypothesized factor and
the level of achievement” (Kilpatrick, 1992, p. 15). Rice summarized that schools
should set “standards” in order to get the desired results (Kilpatrick, 1992, p. 15).
Over the next century, accountability was generally based on the achievement of students as measured by standardized tests (Webb, 1992, p. 664).
“Stakeholders in the educational system have to understand the great variability inherent
teachers, schools, and the nation. For all students and their teachers, assessment provides
comprehensive “evidence and feedback on what students know and are able to do” on the
standards, and it provides information for “decision makers” and about the “effectiveness
of the educational system as a whole” (Webb, 1992, p. 663; see also Lefkowits & Miller,
2006, p. 406).
There was a higher level of accountability after December 18, 2001, when
Congress passed the No Child Left Behind Act. “Three days after taking office in January
2001 as the 43rd President of the United States, George W. Bush announced No Child Left
Behind, his framework for bipartisan education reform” (NCLB Overview, 2006, para.
2). According to the NCLB Act of 2001, under Section 1111(h)(5) of the Elementary
and Secondary Education Act (ESEA), the Secretary of Education is “required to transmit
to the Committee on Education and the Workforce of the House of Representatives and
the Committee on Health, Education, Labor, and Pensions of the Senate a report that
provides State-level data for each State receiving assistance under Title I of ESEA”
(NCLB Annual Report, 2005, p. 1). States, however, must determine if districts and
schools – even those that do not receive Title I funds – make AYP. Each state
establishes its own definition of AYP for each year. This illustrates that while local
education control is still a sensitive issue, the state must establish proficiency goals,
statewide, based on assessment data from the 2001-2002 school year. These goals must
increase progressively to reflect 100 percent proficiency for all students by the 2013-2014 school year. NCLB "significantly raises expectations for States, local educational agencies, and schools . . . by the 2013-2014 school year" (NCLB Annual Report, 2005). Schools must report results for subgroups of students defined by poverty, race, ethnicity, disability, migrant students, gender, and limited English
proficiency. Wyoming is in compliance with NCLB through the state statute Chapter 2,
Article 3, #21-2-304 (b)(xiv), to “establish improvement goals for public schools for
assessment of student progress based upon the national assessment of education progress
testing program and the statewide assessment system established under paragraph (a)(v) of this section" (Title 21, 2006). States must "describe how they will close the achievement gap and make sure all students, including those who are disadvantaged, achieve academic proficiency" (NCLB Overview, 2006).
“Monitoring student progress and the impact of these curricula on the mathematics
performance . . . is consistent with the recommendations offered in . . . No Child Left
Behind (2002)” (Reys, Reys, Tarr, & Chavez, 2006, p. 6). Test-based accountability “is
the only tool our education system has to reassure the public that it is spending resources
wisely and making progress on student achievement” (Lewis, 2006, p. 339). “The real
blame . . . in response to testing lies not with teachers but with state and national policy
makers who create accountability systems centered on ever-higher test scores (AYP) with
little regard for how these scores relate to better learning" (Lederman & Burnstein, 2006). With the increased demands for assessment data, teachers often find themselves stretched to handle both the various data-recording responsibilities that are required by law and their regular duties of preparing lessons and teaching their classes.
State of Wyoming
(3) to plan and develop programs acceptable to the education profession and the
The 9-12 grade recommendations were very concise and appeared on one single
page beginning with the “dual purpose of the 9-12 mathematics program is preparation
for college entrance and job entry skills for students who do not continue formal study." In the years following 1990, state accreditation was the driving force behind the reform and the creation of a standards-based curriculum. "As of 1999, 49 states reported having content standards" (National Research Council, 2001, p. 34). "Virtually every state in the nation . . . . [has] a growing commitment to the idea that clear and shared goals for student learning must [be established]."
The state of Wyoming complied with the requirements set forth with the passage
of NCLB. The Wyoming State Board of Education passed a state curriculum and
clarified the mathematics requirements for high school graduation in the state statutes
called the “Wyoming Education Code of 1969” (Title 21, 2006, p. 1). The school
districts have aligned their curriculum with the State Standards. The five Mathematics
Content Standards for 9-12 grade students in the State of Wyoming are
(1) students use numbers, number sense, and number relationships in a problem-
solving situation,
(2) students apply geometric concepts, properties, and relationships in a problem-solving situation,
(3) students use a variety of tools and techniques of measurement in a problem-solving situation,
(4) students use algebraic methods to investigate, model, and interpret patterns and functions in a problem-solving situation, and
(5) students use data analysis and probability to analyze given situations and the results of experiments.
Curriculum Summary
“Outfitting students with the right quantitative skills is a crucial test facing school
boards and education ministries worldwide" (Baker & Leak, 2006, para. 31; see also National Research Council, 2001, p. 12). Systemic reform "attempts to align all parts of the educational system – curriculum, instruction, assessment, teacher preparation, and state and local policies such as graduation requirements – to promote change in the classroom and, ultimately, improve student performance (Smith and O'Day, 1991)" (Klein, Hamilton, McCaffrey, Stecher, Robyn & Burroughs, 2000).
Instructional Changes
actions. The actions discussed in this section are the expectations of the teacher about
how students learn, the use of the textbook, the instructional strategies, the achievement
teaching” and “on learning” (Koehler & Grouws, 1992, p. 117) is necessary.
her own role in teaching, the students’ role, appropriate classroom activities,
desirable instructional approaches and emphases, legitimate mathematical
procedures, and acceptable outcomes of instruction are all part of the teacher’s
Schools need to refrain from filtering out students from mathematics, because “it not only
filters students out of careers, but frequently out of school itself” (National Research
problems, being skillful in performing these rules, identifying basic concepts, considering
for mathematics education (Dossey, 1992, p. 42). “Correlational techniques were often
used to assess the relationships between teacher knowledge and student performance so
that little is known about the directionality of any existing relationships" (Fennema & Franke, 1992). Studies of classrooms commonly focus on the actions and instructional methods of teachers rather than on the mathematics being taught or the methods by which that mathematics is being learned.
Teachers determine what content is taught in the classroom and how it is taught.
They adjust the content or methods used as determined by the students’ understanding of
the content and their performance of skills (Dossey, 1992, p. 44; Fennema & Franke,
1992, p. 158). Stigler and Hiebert (1999) state that teaching is “a complex cultural
activity that is highly determined by beliefs and habits that work partly outside the realm
of consciousness” (p. 103; see also Nickson, 1992, p. 102). For effective reforms in
curriculum and instruction, “these reforms must ultimately be adopted by teachers and
must take hold in the classroom (Tyack and Cuban, 1995)” (Klein, Hamilton, McCaffrey,
Stecher, Robyn & Burroughs, 2000, p. 3; see also National Council of Teachers of
influences the instructional practices in the classroom (Dossey, 1992, p. 42; Fennema &
Franke, 1992, p. 152; Koehler & Grouws, 1992, p. 124; Nickson, 1992, p. 103;
Thompson, 1992, p. 128). “The vast majority of today’s American mathematics teachers
learned the traditional mathematics curriculum in the traditional way” (Schoenfeld, 2002,
p. 20; see also National Council of Teachers of Mathematics, 1991, p. 2). The way in
which teachers were taught often becomes the method of instruction they use in their classrooms
as they become the next generation of teachers (Hiebert & Carpenter, 1992, p. 90; Reys,
Reys, Tarr, & Chavez, 2006, p. 5; Stigler & Hiebert, 1997, para. 6; Wainwright, Morrell,
teachers’ knowledge is continually developing and changing (Fennema & Franke, 1992,
and procedural knowledge,” which has an impact: “(1) on agendas, . . . richer mental
plans . . . ; (2) on scripts, . . . more representations and richer explanations; and (3) on
The actions of the teacher determine what happens in the classroom, the learning
that occurs, and, ultimately, the achievement level of the students. “Teachers are the key
to closing the gap” (Stigler & Hiebert, 1999, p. xii). “If we do not understand these
processes [that lead to learning in the classroom], we have little chance of improving
them” (Stigler & Hiebert, 1997, para. 5). “Analysis of classroom practice plays several
important roles. . . . Attempts to implement reform without analysis of practice are not
procedures [that] can either be simply stated by the teacher or be developed through
examples, demonstrations, and discussions" (Stigler & Hiebert, 1997, para. 25). The instructional approach "suggested by the materials often influences teachers' pedagogical strategies" (Reys, Reys,
Lapan, Holliday, & Wasman, 2003, p. 75). Most decisions made by teachers occur
“during the preactive or planning phase,” (Fennema & Franke, 1992, p. 156) which
greatly affect instruction. While the majority of teachers report that they are aware of and
are using reform instructional methods, when the teachers were observed, the
observations showed “that many secondary students are not being given the opportunity
to learn through reform-based practices” (Wainwright, Morrell, Flick, & Schepige, 2004,
control over policies and practices” (National Research Council, 2001, p. 32). The
teacher must have knowledge about mathematics and since the United States is becoming
multicultural, “he or she must understand the cultural diversity of the students” (Fennema
& Franke, 1992, p. 147). “I would suggest that teachers need to develop not only such a
Classroom Environment
“Efforts to improve student learning succeed or fail inside the classroom, a fact
that has too often been ignored by would-be reformers” (Stigler & Hiebert, 1997, para.
4). The reform-minded teacher must set up “cognitively demanding tasks, plan the lesson
by elaborating the mathematics that the students are to learn through those tasks, and
allocate sufficient time for the students to engage in and spend time on the tasks”
“Students must be informed active partners in this endeavor” and “need explicit guidance
in how to engage in complex tasks” (Flick & Lederman, 2005, p. 114). The teacher,
using reform methods, actively involves the students in valuable learning activities using
reflective inquiry, spends more time explaining and demonstrating materials, asks
probing questions of the students when they are sharing their results and justifications of
their inquiry methods. The students are also responsible for creating this community of
learners by becoming good listeners and staying on task (Boaler, 2002, pp. 247-253;
Boaler, 2006, pp. 365-367; Fennema & Franke, 1992, p. 151; Hiebert et al., 1996, pp. 16-
17; Hufferd-Ackles, Fuson, & Sherin, 2004, p. 113; Klein, Hamilton, McCaffrey,
Stecher, Robyn & Burroughs, 2000, p. 3; McCaffrey, Hamilton, Stecher, Klein, Bugliari,
& Robyn, 2001, pp. 494-495; National Research Council, 2001, p. 9; Secada, 1992, p.
649). Teachers, encouraging this open communication with students, facilitate students’
interconnections can be made because, without them, there would be a real danger that
questions put in isolation would make the learning process rather piecemeal and
incoherent” (Marshall, 2006, p. 358). “The benefits of reflective inquiry lie . . . in the new
relationships that are uncovered, the new aspects of the situation that are understood more
deeply” (Hiebert et al., 1996, p. 15). Students need “to know what these things mean,
where they come from, and how they fit into the grand scheme of things we call
what it means to abstract and then generalize” (Flick & Lederman, 2005, p. 115).
Successful classrooms are places “where students show[ed] a lot of satisfaction and
Opportunity to Learn
important in the way teachers provide situations and explore problems to help students
learn worthwhile mathematical content in order to reach the new standards (Boaler, 2002,
p. 249; National Research Council, 2001, p. 10; Stigler & Hiebert, 1999, p. 2). This is
Council, 1989)” (Thompson, 1992, p. 128). These reforms of using open-ended problems
rather than direct instruction are “more difficult for a high school teacher than for an
early elementary school teacher” (Schoen, Cebulla, Finn, & Fi, 2003, p. 231). Teachers
need to allow students to “learn by creating mathematics through their own investigations
of problematic situations” (Riordan & Noyce, 2001, p. 369; see also Koehler & Grouws,
1992, p. 119).
learn” (Reys, Reys, Tarr, & Chavez, 2006, p. 5). “The greatest growth [in achievement]
seems to be associated with exposure to new content” (Secada, 1992, p. 645). Minority
content with the teachers “expecting more, not less” student engagement (Davis, 1992, p.
730; Secada, 1992, pp. 646-47). “Students using the NSF mathematics curricula that
were taught by teachers using standards-based instruction were the highest performing
students” (Reys, Reys, Tarr, & Chavez, 2006, p. 4). “Two groups of students from
Standards-based reform efforts within these schools (Mullis et al., 2001)" (Reys, Reys, Tarr, & Chavez, 2006).
Constructivism
understanding based on new experiences that enlarge the intellectual framework in which
ideas can be created" (National Research Council, 1989, p. 6). This implies that each student constructs his or her own mathematical knowledge (National Council of Teachers of Mathematics, 1991, p. 2; see also Dossey, 1992, p. 44). The common phrase used is
for students to become ‘constructivist’ of their learning by internally integrating the new
knowledge with their prior mathematics relationships (Dossey, 1992, p. 45; Hiebert &
Carpenter, 1992, p. 66; Klein, Hamilton, McCaffrey, Stecher, Robyn & Burroughs, 2000,
p. 14; Koehler & Grouws, 1992, p. 119; McCaffrey, Hamilton, Stecher, Klein, Bugliari,
& Robyn, 2001, p. 494; Schifter, 1996; Stigler & Hiebert, 1999, p. 91).
Conceptual knowledge has a connected network "that is rich in relationships (Hiebert
& Lefevre, 1986)” (Hiebert & Carpenter, 1992, p. 78). “When students develop methods
for constructing new procedures they are integrating their conceptual knowledge with
their procedural skill” (Hiebert et al., 1996, p. 17). With conceptual knowledge and
understanding, students are able to “apply them to each new situation they meet”
(Marshall, 2006, p. 358; see also Hiebert & Carpenter, 1992, p. 74). “Research has shown
that things our brain does not understand are more likely to be forgotten” (Marshall,
2006, p. 362).
Proper feedback helps students focus on the task rather than the answer, and
personally guides students to eliminate gaps in their mathematics learning. “In teaching
for understanding,” students need to experience a concept through the use of “real-world
symbolisms (Fennema & Franke, 1992, p. 154; Marshall, 2006, p. 359). “One has to
master skills before using them for applications and problem solving” (Schoenfeld, 2002,
p. 23). When exploring and solving a problem, students have questions, experience
confusion and frustrations before their understanding is reorganized into “more richly
Communicating with students and parents is vital if the reforms are to succeed.
that is appropriate for students to learn – for example that problem solving, reasoning,
and communication are essential goals for the curriculum, and that they need to be
assessed” (Schoenfeld, 2002, p. 23). Parents are open and willing “to accept poor
performance in school mathematics, but they are not so willing to accept poor
Teaching
Teaching is a system, which works “like a machine, with the parts operating
together and reinforcing one another, driving the vehicle forward” (Stigler & Hiebert,
1999, p. 75). Each country has a system that looks “similar from lesson to lesson”
(Stigler & Hiebert, 1999, p. 77). According to Stigler and Hiebert (1999) in the TIMSS
videotapes, the German teachers are “developing advanced procedures;” the Japanese
teachers are developing "structured problem solving," and the United States teachers are "learning terms and practicing procedures."
“Although most U.S. teachers report trying to improve their teaching with current
reform recommendations in mind, the videos show little evidence that change is
occurring” (Stigler & Hiebert, 1999, p. 12). “Over the past decade, numerous studies
have investigated teachers’ attempt to change their mathematics instruction in light of the
to learn about student thinking (Fennema et al., 1996)” (Hufferd-Ackles, Fuson, &
Sherin, 2004, pp. 81-82). "The evidence is beginning to accumulate to support the idea [that teacher knowledge can influence] classroom instruction in a positive way" (Fennema & Franke, 1992, p. 151). The results of a three-year research project by Reys, Reys, Tarr, and Chavez (2006), an investigation of "the
use of mathematics curriculum materials (textbooks) in the middle grades and their
impact on student learning,” showed that “teachers using the NSF supported mathematics
curricula were most likely to use standards-based teaching" compared with those who used other curricula.
Textbooks
Few facts stand undisputed in educational research, but [two such facts are] these: in mathematics, teachers teach only what is in the textbook and students learn only what is in the textbook.
“Robitalle and Travers (1992) argue, ‘Teachers decide what to teach, how to teach it, and
what sorts of exercises to assign to their students largely on the basis of what is contained
in the textbook authorized for their course’“ (Reys, Reys, Tarr, & Chavez, 2006, p. 5).
they enact the intended school mathematics curriculum” (Reys, Reys, Lapan, Holliday, &
achievement. It has been said, “U.S. mathematics textbooks cover more topics, but more
superficially, than their counterparts in other countries do” (National Research Council,
2001, p. 4). The National Science Foundation has funded “13 curriculum projects” to
produce materials for “elementary, middle, or high school that embodied the ideas
expressed in the standards documents" (National Research Council, 2001, p. 34). The
middle school project found that about “half of the teachers” use the order of the textbook
to determine what is taught and when it is taught (Reys, Reys, Tarr, & Chavez, 2006, p.
9). “The other half of the teachers reported that their state or curriculum framework and
the textbook lessons” (Reys, Reys, Tarr, & Chavez, 2006, p. 9).
the steps can also be supported by reform-based curriculum materials (Ball & Cohen,
1996)” (Hufferd-Ackles, Fuson, & Sherin, 2004, p. 82). “Riodan and Noyce (2001). . . .
than did students in schools using traditional textbooks” (Reys, Reys, Lapan, Holliday, &
(Stigler & Hiebert, 1999, para. 41). Teaching has the goal of “steady improvement in the
mathematics learning of students” (Stigler & Hiebert, 1997, p. 6). “Research on teaching
has often been restricted . . . rather than examining continued interactions among the
teacher, the students, and the mathematical content.” (National Research Council, 2001,
p. 9)
Robyn, 2001, p. 495; Reys, Reys, Tarr, & Chavez, 2006, p. 11; Schoen, Cebulla, Finn, &
Fi, 2003, p. 232). “The relationships between student achievement and teachers’ use of
instruction, assessment, teacher preparation, and state and local policies such as
graduation requirement” – the classroom changes are promoted and, ultimately, improve
student performance (Klein, Hamilton, McCaffrey, Stecher, Robyn & Burroughs, 2000,
proficiency on state-mandated tests. “He [Begle (1979)] found that there was little
correlation between the many teacher characteristics and variables identified and the achievement of their students" (Nickson, 1992, p. 106). Higher achievement levels were found in classrooms with
teachers who knew their students, “what their backgrounds are, and what they know”
(National Research Council, 2001, p. 424; see also Fennema & Franke, 1992, p. 156).
When teachers know their students, the teachers can intervene to reduce frustrations,
reduce some of the “hidden social messages” and convey positive student “expectations”
to help students become successful (McLeod, 1992, p. 590; Nickson, 1992, p. 110-111).
Standards-based materials and reform instructional practices that support student learning have been associated with higher student achievement (Reys, Reys, Lapan, Holliday, & Wasman, 2003, p. 87; Riordan & Noyce, 2001, p. 392;
Schoen, Cebulla, Finn, & Fi, 2003, p. 229). Boaler (2006) found that “departmental
student responsibility” were “critical to the success of the students” (p. 369).
Some national studies found increased student achievement “while others do
not” (Jennings & Rentner, 2006, p. 110). Schoenfeld (2002) found these results from
(1) on “tests of basic skills,” there was “no significant performance differences
(2) on conceptual and problem solving tests, “students who learn from reform
predictor of student achievement” (Reys, Reys, Tarr, & Chavez, 2006, p. 11).
increased time on mathematics and the taking of advanced coursework” (Secada, 1992, p.
645). Students who “complete higher-level mathematics courses usually earn bachelor’s
degrees and, as a result, increase their earnings after college” (Franklin, 2006b, p. 12).
“Research has shown, for example, that an extra course in algebra or geometry can
increase a student’s earnings by 6.3 percent (Rose & Betts, 2001)” (Franklin, 2006b, p.
12). The more mathematics a person knows, as when students “take higher level”
mathematics classes, “the greater are his or her opportunities,” and this is an important
countries (Hawkins, Stancavage, & Dossey, 1998, pp. 62-63; see also Robitaille &
critical to the success of the initiatives” (Klein, Hamilton, McCaffrey, Stecher, Robyn &
curricula” (Reys, Reys, Lapan, Holliday, & Wasman, 2003, p. 80). Professional
development includes the support of changes by teachers and principals, time to learn
about the needed changes, and collaboration with colleagues. Collaboration allows
important. In Japan, the collaborative study, observation, and refinement of lessons and curricula that take place in "lesson study" are part of the teacher's ongoing
responsibilities (Stigler & Hiebert, 1999, p. 110; see also Schoenfeld, 2002, p. 20). In the United States, less attention has been given to the process of "improving teaching," which is a "most critical part of the school's
development” (National Council of Teachers of Mathematics, 1991, pp. 2-3; Stigler &
Hiebert, 1999, p. 157). After returning from summer or intensive workshops, teachers
p. 12; Stigler & Hiebert, 1999, p. 162). Stigler and Hiebert (1999) call these sessions
“lesson study,” which is “the ideal context in which teachers develop deeper and broader
Understanding the changes required for reform to succeed “requires a focus on the
practices of teaching and learning” (Boaler, 2002, p. 244). Since change is difficult,
networking, collaborating, and sharing with the mathematics department is “critical to the
teachers’ morale and work” (Boaler, 2006, p. 369; Schifter, 1996, para. 84; Wainwright,
Morrell, Flick, & Schepige, 2004, p. 327). A pivotal part of change in improving
1992, p. 139).
Instructional Summary
“Standards set the course, and assessments provide the benchmarks, but it is
teaching that must be improved to push us along the path to success" (Stigler & Hiebert, 1999).
“No state [policymakers] that we know of regularly collects and uses data” to see
to make wise decisions, we need to know what is going on in typical classrooms" (Stigler & Hiebert, 1999). The literature reviewed above addressed curriculum and instructional issues. The NCLB Act has increased student and school
accountability.
Classroom Assessments
The NCTM standards "provide criteria for judging the quality of mathematics assessments." Assessment is "the process of gathering evidence about a student's knowledge of, ability to use, and disposition toward, mathematics and of making inferences from that evidence for a [variety of purposes related] to both assessment and instruction" (Webb, 1993, p. 2). "Because mathematical thinking is complex and has many aspects, the assessment of this thinking requires the use of [multiple forms of assessment]" (Webb, 1993, p. 2). Over a period of time, teachers must measure the students' range of
knowledge of mathematical content, the connections among the many ideas, and the
aware that there is an “appropriateness of the form of assessment for the intended purpose
for discussing . . . and for reflecting on the form of the assessment” (Webb, 1993, p. 3).
(3) the interpretation of the student’s response by the teacher or student (if a self-
assessment),
(5) the reporting and recording of the results from the assessment (Webb, 1993,
pp. 3-4).
These features are interactive and the “distinctions between them is blurred” (National
most complex and important tasks of teachers (Stiggins, Conklin, & Bridgeford, 1986)”
instruction,” and allow teachers to “optimize both quantity and quality of their
assessment and their instruction and thereby optimize the learning of students”
(Chambers, 1993, p. 25; Lederman & Burnstein, 2006, p. 431; Schoen, Cebulla, Finn, &
Fi, 2003, p. 233; Turley, 2006, p. 441). These assessments occur through the teacher's observations of students' explanations of methods, and solutions to the problem situations (Schoen, Cebulla, Finn, & Fi, 2003, p. 233). Integrating assessment "is not easy and requires teachers to have training to use" it well. A summative assessment, at the end of a course, should evaluate the majority of all the past knowledge and skills learned. A formative assessment evaluates some or all of the knowledge and skills taught to that point (Lederman & Burnstein, 2006, p. 431). The formative assessments could be very brief or could cover the entire unit. Classroom assessment formats include selected-response and performance-based assessment. Selected-response formats are multiple choice, matching, and true-
false questions. This format usually contains a single correct answer and assesses
knowledge and skills rather than critical thinking or real-world problem solving (McTighe &
Ferrara, 1994, p. 14; Porter, 1995, para. 18; Webb, 1992, p. 677). Performance-based
(McTighe & Ferrara, 1994, p. 15) and proficiency, “performance skills, and product
cognitive processes used by the students (McTighe & Ferrara, 1994, p. 13; Stiggins,
etc., are . . . more reflective of new curricular goals and methods of instruction" (Porter, 1995). Stiggins (1988) stated that "teachers may spend as much as 20 to 30% of their professional time" on assessment-related activities (Webb, 1992, p. 676). "Little up-to-date information is available on how U.S. teachers conduct internal assessments" (Webb, 1992, p. 678). What "teachers teach" and prepare students for is greatly influenced by the external exams (Lederman & Burnstein, 2006, p. 430; Toch, 2006, p. 5; Webb, 1992, p. 678).
“Students are taking many more tests as a result of NCLB” (Jennings & Rentner, 2006, p.
reflective questions” with enough wait time for students to respond; giving specific and
“Whites and underrepresented minorities;” help narrow the students’ achievement gap
(Black & Wiliam, 1998 para. 16, 18, 45; Boston, 2002, para. 8, 12; Schoenfeld, 2002, p.
16). Bracey (2006) states “NCLB has not helped the nation and states significantly
narrow the achievement gap. . . .” (p. 153) while Jennings and Rentner (2006) state that
the achievement gaps on the NCLB tests are “generally narrowing or staying the same”
(p. 110). Each state gets to determine its own criteria for the barrier of achievement. By
having the NCLB barriers in place, some students are jumping over the barrier, but “they
don’t tell you how high the successful jumpers jumped. Worst, they can mask a
widening achievement gap” (Bracey, 2006, p. 159). These comments reflect some
“The NCLB mandates for AYP (adequate yearly progress) and public reporting of
raise test scores” (Lederman & Burnstein, 2006, p. 430). “NCLB poses the greatest
challenge for those schools with many subgroups, because failure of a single subgroup to
meet proficiency requirements can cause the entire school to be identified for
improvement” (Sunderman, 2006, p. 121). “An even greater problem is that the states
that are maintaining higher demands on students, such as South Carolina and Wyoming,
have created problems for their schools” in making AYP (Lewis, 2006, p. 339).
Challenges have occurred at the researcher's school, and the pressure to make AYP has increased. School districts and schools that fail to make AYP towards state proficiency goals are subject to improvement, corrective action, and restructuring measures aimed at getting them back on course to meet state standards. Schools that meet or exceed AYP or close achievement gaps will be eligible for State Academic Achievement Awards.
To make AYP, a school must demonstrate that it has met the State's target for [proficiency] and for each of its subgroups of students, ensure that at least 95 percent of all students and each subgroup [of students] participated in the State's . . . assessments, and that the school has met the State's target for an additional academic indicator. At the high school level, this additional academic indicator must be the graduation rate. If a school does not meet the State's definition of AYP for "two or more consecutive years," the school is then given three years of interventions labeled "school improvement" (p. 10).
Schools that did not make AYP tended to have fewer "proficient students, on average, than schools making AYP," because each subgroup must
reach a target score (Sunderman, 2006, p. 120). In schools that have not made AYP for
two years, there is "greater alignment of curriculum and instruction with standards and [assessments, increased use of achievement data, more professional] development for teachers, and the provision of more intensive instruction to low-
achieving students” (Jennings & Rentner, 2006, p. 111). “Programs that focused on
individual student remediation but were not coordinated with the regular classroom
With the adoption of NCLB, several negative trends are that high-stakes testing
has not reduced “achievement gaps among students of different ethnicity;” and the AYP
model used in NCLB “may not identify schools that are doing a good job of helping low
performing students grow” (Cronin, Kingsbury, McCall, & Bowe, 2005, p. 11). Positive
trends have also emerged. State-level tests tended to “improve observed achievement”
and “improved student achievement” (Cronin, Kingsbury, McCall, & Bowe, 2005, p. 60).
Assessment Summary
If the “magnitude” in the achievement gap continues, “it won’t bring schools
close to the requirement of 100% proficiency by 2014” and students in ethnic groups may
Summary
These and other listed concerns need to be addressed in the reauthorization of NCLB for the
goal of 100% proficiency by 2014 to be possible. If the reforms are implemented fully,
then the schools could achieve the goal of NCLB of 100% proficiency.
CHAPTER THREE: METHODOLOGY
Background
Wyoming, with a “population density of 5.2” per square mile, is the least-
populated state in the United States and “ranks 10th in total area of 97,814” square miles
(World Almanac and Book of Facts 2006, 2006, p. 451). Even though the state is rural,
the No Child Left Behind (NCLB) Act of 2001 requires that by 2014, 100% of the students demonstrate proficiency on the statewide assessment system (No Child Left Behind Act 2001, 2002). "To be effective, these
reforms must ultimately be adopted by teachers and must take hold in the classroom
(Tyack and Cuban, 1995)” (Klein, Hamilton, McCaffrey, Stecher, Robyn & Burroughs,
2000, p. 3). Since the Wyoming school districts and teachers reside in this rural state, improving the teaching and learning of mathematics in Wyoming is vital in order for this national goal to be achieved. The purpose of this
research study was to provide key findings to consider for the 48 Wyoming school
districts and for the nearly 300 high school mathematics teachers when trying to improve
the teaching of and student learning of mathematics at the high school level.
Population
Every Wyoming 11th grade public school student is required to take PAWS, the Performance Assessment of Wyoming Students, as required by the Wyoming Education Code. The students' performance on PAWS and the schools' progress on the indicators of
AYP, adequate yearly progress, affect every teacher in grades 9-12 (Title 21, 2006,
article 3, #21-2-304, (a) (iii)). The target population was the group of Wyoming high
school (grades 9-12) mathematics teachers who teach at a school that has an 11th grade
Performance Assessment of Wyoming Students, PAWS, score for 2006 in
mathematics. AYP, adequate yearly progress, accountability data was gleaned from the
districts identified. The teachers’ responses were stratified into two groups, based on the
schools’ AYP mathematics results. One group that made AYP in 2006 was comprised of
57 schools with 250 teachers. The second group that did not make AYP in 2006 had 8 schools (AYP
Results). For ease of tracking, the names of the groups were “making AYP” and “not
making AYP,” respectively. The survey was sent to all high school mathematics teachers
Wyoming’s AYP is based on several indicators. The first indicator includes the
test participation rate and the number of students scoring “proficient” and “above
proficient” for language arts. The second indicator is the test participation rate and
students scoring “proficient” or “above proficient” for mathematics. The third indicator
for Wyoming is the school’s graduation rate. If any one of these indicators is not met,
then the school has not met AYP. For this study, only the indicators for mathematics were used.
Besides the participation rate and proficiency level of students, each school must
disaggregate the data into nine subgroups. If any of the nine subgroups in a school fail to
meet the stated AYP participation rate or the percent needed as proficient and advanced
for that school year, then the school does not make AYP. The nine subgroups include "all students," "free/reduced lunch," racial/ethnic groups, "Individualized Education Plan (IEP)," and "Limited English Proficiency (LEP)" (US Department of Education, 2005, p. 21).
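The following Python sketch illustrates, with made-up thresholds and field names, the general logic of the AYP determination described above: at least 95 percent participation and a proficiency target for every subgroup, plus the graduation-rate indicator at the high school level. It is a simplified model, not the Wyoming Department of Education's actual calculation, which involves additional details such as minimum subgroup sizes.

from dataclasses import dataclass

@dataclass
class SubgroupResult:
    name: str
    participation_rate: float        # fraction of enrolled students who tested
    pct_proficient_or_above: float   # percent scoring proficient or advanced

def subgroup_meets_ayp(subgroup, proficiency_target):
    # A subgroup must test at least 95 percent of its students and reach the target.
    return (subgroup.participation_rate >= 0.95
            and subgroup.pct_proficient_or_above >= proficiency_target)

def school_makes_ayp(subgroups, proficiency_target, graduation_rate, graduation_target):
    # Every reportable subgroup must meet AYP, and the school must also meet the
    # additional academic indicator (graduation rate at the high school level).
    return (all(subgroup_meets_ayp(s, proficiency_target) for s in subgroups)
            and graduation_rate >= graduation_target)

# Example with made-up numbers: the second subgroup misses the proficiency target.
results = [SubgroupResult("all students", 0.97, 52.0),
           SubgroupResult("free/reduced lunch", 0.96, 41.0)]
print(school_makes_ayp(results, proficiency_target=45.0,
                       graduation_rate=0.82, graduation_target=0.80))  # prints False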
The teachers were also stratified into three groups based on the target class
chosen. If the target class covered curriculum below Algebra 1 topics, then this teacher
was assigned to level one, a lower level. If the curriculum for the target was accelerated
from the normal progression as deemed by the teacher, then the teacher was assigned to
level three, an honors level. The remaining teachers were assigned to level two, a regular
level.
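As an illustration only (the function name and boolean inputs are hypothetical, not part of the survey materials), this level assignment can be sketched as a simple rule in Python:

```python
def assign_level(below_algebra_1: bool, accelerated: bool) -> int:
    """Sketch of the course-level assignment rule described above.

    Level 1 (lower): target class content is below Algebra 1 topics.
    Level 3 (honors): the teacher deems the content accelerated beyond the normal progression.
    Level 2 (regular): all remaining target classes.
    """
    if below_algebra_1:
        return 1
    if accelerated:
        return 3
    return 2

# Example: a target class covering content below Algebra 1 topics is assigned to level 1.
print(assign_level(below_algebra_1=True, accelerated=False))  # -> 1
```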
Sampling Techniques
Surveys have been used in education to gather information about the schools since
1817 (Creswell, 2005, p. 354). This self-reporting survey focused on the performance of high school students and their high school classroom experiences as viewed through the teachers' responses. A questionnaire can gather large "amounts of data from many respondents, but its limitations are well known" (Robitaille &
Travers, 1992, p. 708). Biased answers and the number of non-responses are some
limitations that must be taken into account. Another limitation is the validity of the
responses. For example, do the frequencies and types of activities that the teacher reports
on the survey really reflect what is happening in the classroom (Robitaille & Travers,
1992, p. 708)? Not observing the frequency and type of instruction students receive over the course is a limitation of this research project. The teacher's busy schedule and lack of time to respond may contribute to a low response rate (Chval, Reyes, Reys, Tarr, &
Chavez, 2006, p. 161). The timing of the mailing of the sample was critical. If the
survey was sent near a major vacation or at the end of a semester, then the response rate
would have decreased. From the beginning of January to the end of March, no state
testing for 11th graders taking the PAWS during the 2006-2007 school year was given.
During this time frame, most schools were in the early to middle part of the third quarter
of the school year. Each teacher’s workload should have been as ‘normal’ as possible.
A University of Wyoming professor, Dr. Alan Moore, conducted an on-line survey, "Supportive Assessment: The implementation and effects of the new state assessment system in Wyoming," based on 16 randomly selected districts; it had a low response rate from teachers (no percentage was given). Dr. Moore's survey letter was dated February 16, 2006. The superintendents and district curriculum coordinators had the highest response rate, followed by the building administrators, and finally the teachers (Moore, A. D.).
In February 2007, this research study's design had three components: (1) a measure of curriculum, instruction, and assessment practices; (2) the relationship between these teachers' practices and the content level of the target class; and (3) the relationship between these teachers' practices and the schools' making AYP (Klein, Hamilton, McCaffrey, Stecher, Robyn & Burroughs, 2000, p. xi). The questions
on the survey were slightly modified from the teacher questionnaires written by Horizon
Research, Incorporated. Horizon Research, Incorporated was the subcontractor for
several National Science Foundation grants that dealt with curriculum, instructional and
assessment issues, and teacher preparation across the country (Chval, Grouws, Smith,
Weiss, & Ziebarth, 2006, p. 1-2; Klein, Hamilton, McCaffrey, Stecher, Robyn &
Burroughs, 2000, p. xi-xii; Horizon Research, 2000). “Horizon Research, Inc. (HRI),
under a subcontract from RAND, had primary responsibility for designing and validating
this questionnaire” (Klein, Hamilton, McCaffrey, Stecher, Robyn & Burroughs, 2000, p.
xii). Permission to use teacher questionnaire questions from various National Science Foundation contracts was received from Iris Weiss of Horizon Research, Inc.; Brian Stecher of RAND; and Barbara Reys of the University of Missouri on October 11-12, 2006.
Pilot Survey
A pilot survey was given to five mathematics teachers from three public schools
in Casper, Wyoming, during the week of October 11-18, 2006. The first teacher was a
retired high school mathematics teacher. The second was currently working as a junior
high school mathematics instructor. The third and fourth pilot survey participants were former junior high mathematics teachers who had switched from teaching mathematics; one was teaching computer applications and the other held a mathematics-related position at the school. The fifth participant, with a background in mathematics, was currently teaching 6th grade mathematics. These participants suggested a few minor modifications in wording and the elimination of a repeated question. The suggestions were incorporated into the final questionnaire sent to the Institutional Review Board.
After the Institutional Review Board approved the project, permission from the
superintendents was needed. See Appendix A for the approvals. The names of the high
school mathematics teachers were acquired and they were sent the surveys by February 5,
2007.
Forty-six of the 48 superintendents were sent a letter requesting permission for the high school mathematics teachers to be surveyed; two school districts were not contacted. Forty-four superintendents gave permission to conduct a research study with their high school mathematics teachers. Two of the smaller school districts did not give permission to send the surveys. See Appendix B for a copy of the letter, the permission letter, and a list of participating school districts. As soon as a district provided permission, the principals of all the high schools in that district were sent a letter requesting the names and email addresses of
their 9-12 grade mathematics teachers. Telephone calls were made to principals to elicit
this information when not provided by the requested deadlines. See Appendix C for the
principal letter and name request. There were 295 teachers in 75 schools who were contacted by mail. See Appendix D for a copy of the teacher's letter, the teacher's instructions, and the questionnaire. The teacher questionnaire had 32 questions requiring circled, darkened, or short written responses.
The teachers received the questionnaire at their school addresses during the first
full week of February. Teachers had three to four weeks to complete the questionnaire and return it in a self-addressed stamped envelope. The teachers were assigned a tracking number so that follow-up could be done to get completed surveys returned. The tracking number helped the researcher note the questionnaires not returned. A week before the deadline, an email was sent to the school address of each teacher who had not yet returned the questionnaire.
Through a random drawing, participating teachers were given the chance to win
one of three VISA Gift cards as incentive to return the completed surveys by the stated
deadline. The deadline for entry was one month following the surveys’ arrival at each
school. Once the teachers completed and returned their surveys before the deadline, the
bottom portion of the last page with the teachers’ names and telephone numbers was
separated from the surveys. The tracking numbers were eliminated and the separated slips
were placed in a random drawing. This process was followed to ensure response
anonymity of the teachers, schools, and school districts. The drawing took place several
days after the deadline to allow any remaining mail to be received. The first name drawn
received a $100 VISA Gift card; the second received a $75 card; and the third received a
$50 card.
The names of the schools, teachers, and tracking numbers were kept in a secure
data file. The responses were separated into the two groups based on the school's making or
not making AYP. The responses were also separated into the three levels based on the
target class taught. Schools making AYP were designated with a 1 and schools not
making AYP with a 0. The level of the target class taught had a designation of 1 for content below Algebra 1 topics; a 2 for content of Algebra 1 and above; and a 3 for accelerated or honors content. The questionnaire covered the curriculum, instruction, and assessment issues that teachers dealt with throughout the school year. Systemic reform initiatives encompass assessment, teacher preparation, and state and local policies such as graduation requirements that influence student "performance" (Klein, Hamilton, McCaffrey, Stecher, Robyn & Burroughs, 2000, p. 1).
The Center for the Study of Mathematics Curriculum conducted a cross-site study for three school districts and is using the results in the same manner as proposed in this study. Time constraints required that the data be limited in scope for analysis. More schools have aligned the "curriculum and instruction with standards and assessments, [and making] more use of test data to modify instruction" (Jennings & Rentner, 2006, p. 111). The statistic used for analysis on each question was a single
average. All of the subparts for each question, which used a Likert scale, were combined
to create the single value. The more frequent the use of the topic, the higher the scale
value given. If there was no response, then a value was not included in the average for
the question. Analysis was done by combining the separate subparts of each question
51
into a single average. For example, if a teacher's responses for question 1 – state standards – were "Often" for 1a (a scaled value of 4), "Rarely" for 1b (a scaled value of 2), and "Sometimes" for 1c (a scaled value of 3), then the teacher's single average for question 1 was 3.
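A minimal Python sketch of this averaging scheme (the scale mapping, function, and variable names are illustrative only, not taken from the survey materials):

```python
# Sketch of the subpart-averaging scheme described above; the scale mapping and
# example responses are illustrative (blank subparts are simply skipped).
SCALE = {"Never": 1, "Rarely": 2, "Sometimes": 3, "Often": 4}

def question_average(subpart_responses):
    """Average the scaled values of a question's subparts, ignoring missing answers."""
    values = [SCALE[r] for r in subpart_responses if r in SCALE]
    return sum(values) / len(values) if values else None

# Question 1 (state standards) with three subparts, as in the example above.
print(question_average(["Often", "Rarely", "Sometimes"]))  # -> 3.0
print(question_average(["Often", None, "Sometimes"]))      # missing subpart skipped -> 3.5
```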
Each curriculum question for all three levels of courses taught was averaged. Finally, each question on curriculum for the two groups of schools, "making AYP" or "not making AYP," was averaged. If the curriculum did not affect the course the teachers were teaching or did not affect the schools' making AYP, then the levels and groups should have had the same average as well as a small effect size and no significant difference between the levels of courses and groups of schools.
The instruction issues were questions 2, 4, 5 through 7, 14, 22, and 23.
Questions 5 and 6 asked about specific background knowledge of the NCTM content and process standards for grades 9-12 (NCTM, 2000, pp. 287-364). Teachers' practices reflect their "knowledge and beliefs" (Koehler & Grouws, 1992, p. 118; Thompson, 1992, p. 128), and instructional changes have been reported in schools not making AYP (Jennings & Rentner, 2006, p. 111). Question 7 – professional development – asked about teachers' interest in activities such as professional meetings and national board certification. School schedules rarely allow regular collaboration time for teachers to discuss curriculum, instruction, and assessment issues. If the teachers do want to collaborate regularly, the time is scheduled before or after normal school hours. Classroom "practices are critical to the success of the initiatives" (Klein, Hamilton, McCaffrey, Stecher, Robyn & Burroughs, 2000, p. 5). Each question for the teachers' responses on
instruction was averaged together to a single value. Each instruction question for all the
three levels of courses taught was averaged. Finally, each instruction question for the two groups of schools, making AYP or not making AYP, was averaged. If the instruction did not affect the course the teachers were teaching or did not affect the schools' making
AYP, then the levels and groups should have had the same average as well as a small
effect size and no significant difference between the levels of courses and groups of
schools.
Jennings and Rentner (2006) found that "students are taking a lot more tests" (p. 111). The assessment issues were questions 3 and 24. Question 3 was related to instructional time devoted to test preparation, whether for a state test or a classroom test. Each question for all the teachers' responses on assessment was averaged together to a single value. Each assessment question for all three levels of courses taught was averaged. Finally, each assessment question for the two groups of schools, "making AYP" or "not making AYP," was averaged. If the assessment used did not affect the course the teachers were teaching or did not affect the schools' making AYP, then the levels and groups should have had the same average as well as a small effect size and no significant difference between the levels of courses and groups of schools. The remaining questions gathered demographic information about the teacher or the school. The topics covered the number of classes taught and the
target class information relating to classroom time and textbook used. Questions also
focused on the students in the target class and the school configuration. Finally,
information on the years of experience of the teacher at the school and in the school
district and the highest degree achieved by the teacher was gathered. Information about
the target class placed the teacher’s responses in the appropriate course level for analysis.
Information about the school placed the teacher's responses in the appropriate "making AYP" or "not making AYP" group for analysis. The questions used several different Likert scales.
• Question 24 – assessment – all subparts used the same five-point Likert scale. The scaled responses were 1 for "Never," 2 for "Rarely," 3 for "Sometimes," and 4 for "Often."
• Question 3 – time preparation for testing – used the scaled numbers 1 for "Does Not Apply," 2 for "Decreased," 3 for "Remained the Same," and 4 for "Increased."
• Question 6 – teaching NCTM Standards – used the scaled responses of 1 for "Not"
• Question 7 – teacher professional development – used the scaled responses 0 for "Does Not Apply," 1 for "Not Interested," 2 for "Somewhat Interested," and 3 for "Very Interested."
• Question 23 – instructional time – used percentage bands: 1 for "0-15%," 2 for "16-30%," 3 for "31-45%," and so on.
The questionnaire was designed to answer the question, "Does the course level taught by the mathematics teacher affect the practices in curriculum, instruction, and assessment related to their schools' making adequate yearly progress, AYP?" AYP is
the accountability measure in the NCLB Act used to determine if the school is achieving
proficiency on the state standards. The course level deals with the mathematical content
for a targeted class. The teachers’ responses were stratified into three levels based on the
target class chosen. If the target class covered curriculum below Algebra 1 topics, then
this teacher was assigned to level one, a lower level. If the curriculum for the target was
accelerated from the normal progression as deemed by the teacher, then the teacher was
assigned to level three, an honors level. The remaining teachers were assigned to level two, a regular level. The research questions included:
2. What is the relationship between the type of mathematics instruction given and
3. What is the relationship between the type of classroom assessments given and the
4. How did the responses compare on mathematics curriculum questions from the
6. How did the responses compare on mathematical assessment questions from the
Data Analysis
The variables were the questions on the curriculum, instruction, and assessment issues.
The first step was to place all of the teachers' responses together to compute the mean and standard deviation for the entire population of Wyoming high school mathematics teachers.
The second step was to sort the teachers' responses by the level of the target class chosen. The mean and standard deviation were calculated for each of the three levels. An effect size was computed for each of the three possible pairings: between the lower level and the regular level; between the lower level and the honors level;
and the regular level and honors level. An ANOVA test of independent groups was
calculated at the 0.05 significance level in order to determine any relationship between
the curriculum, instruction, and assessment questions among the three levels of
mathematical courses.
The third step was to sort the teachers’ responses by the school’s performance on
the 2006 statewide test by making AYP or not making AYP. The mean and standard
deviations were calculated for each of the two groups. An effect size was computed for the comparison between making AYP and not making AYP. A significance t-test of independent groups was also done at the 0.05 significance level to determine any relationship between the curriculum, instruction, and assessment questions between the two groups of schools.
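A minimal sketch of this three-step workflow in Python, assuming the scaled question averages sit in a pandas DataFrame with hypothetical column names (level coded 1-3 and made_ayp coded 1/0 as described above):

```python
import pandas as pd
from scipy import stats

# Hypothetical data: one row per teacher, the scaled average for one question,
# the course level (1-3), and the AYP code (1 = made AYP, 0 = did not).
df = pd.DataFrame({
    "use_standards": [3.50, 4.00, 3.00, 2.50, 3.75, 3.25, 3.60, 2.80],
    "level":         [1,    2,    2,    3,    1,    3,    2,    1],
    "made_ayp":      [1,    1,    0,    1,    0,    1,    1,    0],
})

# Step 1: mean and standard deviation for the entire group of respondents.
print(df["use_standards"].agg(["mean", "std"]))

# Step 2: the same statistics by course level, followed by a one-way ANOVA at 0.05.
print(df.groupby("level")["use_standards"].agg(["mean", "std"]))
by_level = [g["use_standards"].to_numpy() for _, g in df.groupby("level")]
print(stats.f_oneway(*by_level))

# Step 3: the same statistics by AYP group, followed by an independent-samples t-test.
print(df.groupby("made_ayp")["use_standards"].agg(["mean", "std"]))
made = df.loc[df["made_ayp"] == 1, "use_standards"]
not_made = df.loc[df["made_ayp"] == 0, "use_standards"]
print(stats.ttest_ind(made, not_made))
```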
Response Rate
By the end of the 2006-2007 school year, 164 questionnaires out of the 295 surveys were completed and returned. One of the 295 teachers contacted me stating that she was a music teacher; she was eliminated from the original list of mathematics teachers to be analyzed, leaving 294 eligible teachers and a 55.78% response rate. The three randomly
selected teachers who received a VISA Gift card due to returning their questionnaire by
the deadline were teachers from Horizon High School in Evanston, Jackson Hole High
School in Jackson, and Saratoga High School in Saratoga. Of the 75 schools contacted, 67 schools had a teacher respond. This included seven of the eight Wyoming schools that did not make AYP and 60 Wyoming schools that did make AYP for the 2006-2007 school year.
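The response rate itself is simple arithmetic; a quick check (assuming the 294 figure, that is, 295 mailed minus the one misidentified music teacher):

```python
returned, mailed = 164, 295
eligible = mailed - 1                        # the one music teacher removed from the list
print(round(100 * returned / eligible, 2))   # -> 55.78
```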
Reliability
The reliability of the teachers' responses was measured with Cronbach's Alpha. This alpha value can range from 0 to 1, with higher numbers meaning that the
teachers’ responses are more consistent in measuring the same content (SPSS FAQ, para.
5). The reliability on the 13 variables was 0.896 using SPSS. See Appendix F for the
statistics.
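The reported Cronbach's Alpha was produced with SPSS; as a sketch only, the same statistic can be computed from its standard formula (the item matrix below is hypothetical):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x n_items) matrix of scaled responses."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical scaled responses from four teachers on three survey items.
scores = np.array([
    [4, 3, 4],
    [2, 2, 3],
    [3, 3, 3],
    [4, 4, 4],
])
print(round(cronbach_alpha(scores), 3))
```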
Validity
William M. K. Trochim, a professor at Cornell University, has described two types of "translation validity" and four types of "criterion-related validity" (Trochim, 2006, para. 4). The translation validities "focus on whether the operationalization is a good reflection of the construct" (Trochim, 2006, para. 4). The criterion-related validities make a prediction about how the operationalization will perform based on the criteria they use as the standard for judgment (Trochim, 2006, para. 4).
This research study shows evidence of convergent validity. The study compares how
similar the overall teachers’ means are to the three levels of mathematics courses taught
(1, 2, or 3), and how similar the overall teachers' means are to the two AYP school groups.
Variables Analyzed
Time constraints required that the data be limited in scope for analysis. Analysis
was done by combining the separate subparts of each question into a single average.
From the 32-question survey, 11 questions were analyzed. Since one question’s
subparts dealt with three issues, the responses were split into separate parts. This gave a
total of 13 independent variables. The dependent variable was the combined average of
The other questions related to demographics and information about the targeted
class are not reported in this dissertation. The variables are listed next to the question
number on the questionnaire. A short descriptive name about the question is also
provided.
All Teachers
The descriptive statistics for all the teachers’ responses and three levels of
mathematics courses are given in Table 4.1. The number of respondents (n), the mean (M), and the standard deviation (SD) are reported for each question.
Table 4.1
Descriptive Statistics for Scaled Responses for Level Courses
Question Total Sample For Level 1 For Level 2 For Level 3
Num Name n M SD n M SD n M SD n M SD
1 Use Standards 163 3.54 .94 29 3.54 .93 109 3.50 .97 25 3.69 .82
2 PAWS Results 163 2.34 1.07 29 2.37 1.24 108 2.36 1.04 26 2.21 1.04
3 Time Prep Test 163 2.99 .79 29 2.64 .94 108 3.00 .73 26 3.35 .75
4 Tchr Collab 164 2.93 .93 29 2.80 .84 109 2.89 .95 26 3.22 .90
5 Tchr Readiness 164 3.31 .48 29 3.08 .56 109 3.33 .45 26 3.48 .41
6 Tch NCTM Stds 164 3.32 .54 29 3.35 .49 109 3.26 .57 26 3.51 .45
7 Profess Develop 164 2.34 .38 29 2.27 .40 109 2.34 .38 26 2.40 .33
21 Textbk Driven 161 3.51 .40 27 3.47 .39 109 3.51 .41 25 3.56 .34
22 Instr Strategies 164 3.52 .36 29 3.57 .35 109 3.46 .34 26 3.69 .43
22 Std Ctrd Learn 164 3.62 .54 29 3.51 .52 109 3.61 .54 26 3.78 .53
22 Tch Mangmt 164 3.47 .46 29 3.44 .37 109 3.44 .46 26 3.64 .54
23 Instr Time 164 2.19 .56 29 2.16 .53 109 2.18 .53 26 2.27 .70
24 Assessmt Used 164 2.43 .52 29 2.50 .57 109 2.42 .50 26 2.39 .53
For all the questions, the range of values of the means spanned 3.62 to 2.19. The
value of 3.62 meant the teacher chose “Sometimes” (scale of 3) to “Often” (scale of 4).
The value of 2.19 meant the teacher chose “16-30%” (scale of 2) to “31-45%” (scale of
3). The range of the standard deviations for the 13 questions spanned 1.07 to 0.36. The majority of the standard deviations are smaller than one, the standard deviation of the standard normal distribution, indicating that there is little variation in responses among the teachers. The top five means appear first in Figure 4.1, the ordered bar graph with all the topics.
Figure 4.1. This is an ordered bar graph of the mean of 13 questions for all teachers on the scaled averages for each question from the Wyoming High School Mathematics Teacher Questionnaire given in February 2007. n = 164 (9 questions); n = 163 (3 questions); n = 161 (1 question).
Course Level 1
For the course level 1 questions, the range of values of the means spanned 3.57 to
2.16. The range of the standard deviations for level 1 was from 1.24 to 0.35. The value
of 3.57 for question 22 - instructional strategies meant the teacher chose “Sometimes”
(scale of 3) to “Often” (scale of 4). The value of 2.16 for question 23 - instructional time
meant the teacher chose "16-30%" (scale of 2) to "31-45%" (scale of 3). The top five means appear first in Figure 4.2, the ordered bar graph with all the questions.
Figure 4.2. This is an ordered bar graph of the mean of 13 questions for the teachers teaching a level 1 course on the scaled averages for each question from the Wyoming High School Mathematics Teacher Questionnaire given in February 2007. n = 29 (12 questions); n = 27 (1 question).
Course Level 2
For the course level 2 questions, the range of values of the means spanned 3.61 to
2.18. The range of the standard deviations for level 2 was from 1.04 to 0.34. The value of 3.61 for question 22 – student centered learning meant the teacher chose "Sometimes" (scale of 3) to "Often" (scale of 4). The value of 2.18 for question 23 – instructional time meant the teacher chose "16-30%" (scale of 2) to "31-45%" (scale of 3). See Figure 4.3 for the ordered bar graph with all the questions.
Figure 4.3. This is an ordered bar graph of the mean of 13 questions for the teachers teaching a level 2 course on the scaled averages for each question from the Wyoming High School Mathematics Teacher Questionnaire given in February 2007. n = 109 (11 questions); n = 108 (2 questions).
Course Level 3
For the course level 3 questions, the range of values of the means spanned 3.78 to
2.21. The range of the standard deviations for level 3 was from 1.04 to 0.33. The value of 3.78 for question 22 – student centered learning meant the teacher chose "Sometimes" (scale of 3) to "Often" (scale of 4). The value of 2.21 for question 2 – using PAWS results meant the teacher chose "Decreased" (scale of 2) to "Remained the Same" (scale of 3). See Figure 4.4 for the ordered bar graph with all the questions.
Figure 4.4. Ordered bar graph of the mean of 13 questions for the teachers teaching a level 3 course on the scaled averages for each question from the Wyoming High School Mathematics Teacher Questionnaire given in February 2007.
The means for all the questions and all levels of the target class followed a somewhat consistent pattern, as shown in Figure 4.5, with some variation apparent in a few questions.
Figure 4.5. The means of the 13 questions for all teachers’ responses and the means of
the questions for the three levels of target classes from the Wyoming High School
Mathematics Teacher Questionnaire given in February 2007.
AYP Groups
The complete list of descriptive statistics for the schools' AYP designations is given in Table 4.2.
Table 4.2
Descriptive Statistics for Scaled Responses for AYP Groups
Question Total Sample For Making AYP For Not Making AYP
Num Name n M SD n M SD n M SD
2 Using PAWS Results 163 2.34 1.07 128 2.35 1.09 35 2.28 1.01
3 Time Prep for Testing 163 2.99 .79 128 2.95 .84 35 3.14 .55
4 Teacher Collaboration 164 2.93 .93 128 2.88 .91 36 3.08 .97
5 Teacher Readiness 164 3.31 .48 128 3.36 .46 36 3.13 .49
6 Teach NCTM Standards 164 3.32 .54 128 3.38 .50 36 3.07 .64
7 Professional Development 164 2.34 .38 128 2.35 .38 36 2.29 .38
21 Textbook Driven 161 3.51 .40 127 3.55 .35 34 3.37 .53
22 Instructional Strategies 164 3.52 .36 128 3.50 .36 36 3.57 .37
22 Student Centered Learning 164 3.62 .54 128 3.63 .54 36 3.56 .52
22 Teacher Management 164 3.47 .46 128 3.49 .47 36 3.40 .45
23 Instructional Time 164 2.19 .56 128 2.19 .57 36 2.19 .53
24 Assessment Used 164 2.43 .52 128 2.45 .52 36 2.35 .49
The means for the questions with the schools making AYP and not making AYP also
followed a somewhat consistent pattern. See Figure 4.6. There appeared a small
variation in questions 5, 6, and 21.
Figure 4.6. The means of the 13 questions for all teachers’ responses and the means of
the questions for the two groups of school’s AYP designation from the Wyoming High
School Mathematics Teacher Questionnaire given in February 2007.
For schools making AYP questions, the range of values of the means spanned
3.63 to 2.19. The range of the standard deviations for the making AYP group was from 1.09 to 0.35. The top five means for schools making AYP appear first in Figure 4.7, the ordered bar graph with all the questions. The value of 3.63 for
question 22 - student centered learning meant the teacher chose “Sometimes” (scale
of 3) to “Often” (scale of 4). The value of 2.19 for question 23 - instructional time
meant the teacher chose “16-30%” (scale of 2) to “31-45%” (scale of 3).
Figure 4.7. This is an ordered bar graph of the mean of 13 questions for the teachers
teaching in a school which made AYP on the scaled averages for each question from the
Wyoming High School Mathematics Teacher Questionnaire given in February 2007. n =
128 (11 questions); n = 127 (2 questions).
For the schools not making AYP questions, the range of values of the means
spanned 3.57 to 2.19. The range of the standard deviations for the not making AYP
group was from 1.01 to 0.37. The value of 3.57 for question 22 – instructional strategies meant the teacher chose "Sometimes" (scale of 3) to "Often" (scale of 4). The value of 2.19 for question 23 – instructional time meant the teacher chose "16-30%" (scale of 2) to "31-45%" (scale of 3). The top five means for the schools not making AYP appear first in Figure 4.8, the ordered bar graph with all the questions.
Figure 4.8. This is an ordered bar graph of the mean of 13 questions for the teachers
teaching in a school which did not make AYP on the scaled averages for each question
from the Wyoming High School Mathematics Teacher Questionnaire given in February
2007. n = 36 (10 questions); n = 35 (2 questions); n = 34 (1 question).
Effect Size
Effect size compares the means of two groups independent of their sample sizes, whereas significance tests require the sample sizes in their calculations. If sample sizes are large enough, significance tests can flag results as statistically significant even when the differences between the groups are trivial. Thus, the use of effect sizes has become a more common measure when comparing groups. Cohen "hesitantly defined effect sizes as 'small, d = .2,' 'medium, d = .5,' and 'large, d = .8'" (Becker, 1998, Lecture, para. 13). Using the values in Table 4.1 and the website calculator, effect sizes were calculated for the levels of target courses.
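A minimal sketch of the effect size calculation (the analysis used Becker's web calculator; the root-mean-square pooling of the two standard deviations shown here is an assumption, though it reproduces the value reported below for question 3 between levels 1 and 3):

```python
import math

def cohens_d(mean1: float, sd1: float, mean2: float, sd2: float) -> float:
    """Cohen's d with the two standard deviations pooled by their root mean square."""
    pooled_sd = math.sqrt((sd1 ** 2 + sd2 ** 2) / 2)
    return (mean1 - mean2) / pooled_sd

# Question 3 (time preparation for testing), level 1 versus level 3, means and SDs from Table 4.1.
print(round(cohens_d(2.64, 0.94, 3.35, 0.75), 2))  # -> -0.83
```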
Table 4.3
Of the 39 effect sizes computed between the levels of courses, only 10 had a moderate or strong effect. The majority, 29, of the effects were negative. There were only two questions which had a large effect size according to Cohen's descriptions, and both large effects came from the comparison between level 1 and level 3 teachers.
Since both values are negative, however, the teachers teaching lower level courses (classes with content below Algebra 1) appeared to devote more test preparation time with students than the teachers teaching an honors level class. Also, the teachers teaching lower level classes felt more prepared to teach a wider variety of mathematics concepts.
There were seven variables between the level groupings with a moderate negative effect size, including question 3 – time preparation for testing with levels 1 and 2 (-0.43). All three level pairings had a moderate to strong effect on question 3 – time preparation for testing. This indicates that teachers instructing lower level classes do more test preparation work with students than teachers teaching regular level and honors level classes. Even teachers instructing regular level classes do more test preparation work with students than the honors level teachers.
There were three questions with moderate effect size between the schools that
made AYP and the schools which did not make AYP. See Table 4.4. All three had a positive effect size. The teachers in schools which made AYP appeared to feel more prepared to teach a variety of mathematics concepts, to teach more of the NCTM Standards, and to follow the topics in the textbook than those teachers in schools which did not make AYP.
Table 4.4
To compare the three levels of target courses on the curriculum, instruction, and assessment questions, an ANOVA test was calculated. The significance level was 0.05, so any p-value less than 0.05 was significant. For the three levels of target courses, the ANOVA test had significant results in the areas of instruction and assessment. See Appendix F for the statistical results. Since significant differences were found for the instruction and assessment questions, SPSS calculated a Tukey HSD post hoc test. On the instruction questions, the post hoc test showed significant differences in the means between level 1 and level 3 teachers and between level 2 and level 3 teachers. On the assessment questions, only level 1 teachers and level 3 teachers had a significant difference in their means.
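The post hoc comparison was run in SPSS; a comparable sketch with open-source tools (illustrative data and column names) would be:

```python
import pandas as pd
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Hypothetical per-teacher averages on an instruction question, with course-level codes.
df = pd.DataFrame({
    "instruction": [3.6, 3.5, 3.7, 3.4, 3.5, 3.3, 3.8, 3.9, 3.7],
    "level":       [1,   1,   1,   2,   2,   2,   3,   3,   3],
})

# One-way ANOVA across the three course levels.
by_level = [g["instruction"].to_numpy() for _, g in df.groupby("level")]
f_stat, p_value = stats.f_oneway(*by_level)
print(f_stat, p_value)

# Follow a significant ANOVA with Tukey's HSD pairwise comparisons at alpha = 0.05.
if p_value < 0.05:
    print(pairwise_tukeyhsd(df["instruction"], df["level"], alpha=0.05))
```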
t-Test
When there are only two groups of teachers’ responses to compare like those in
schools making AYP and those in schools not making AYP, the t-test of significance is
calculated instead of ANOVA. None of the t-test values were significant at the 0.05
level. See Appendix F for the statistical results. There was no significant difference
between the teachers' responses in schools making AYP and the teachers' responses in schools not making AYP.
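A comparable sketch of this comparison (group values are hypothetical; equal_var=False corresponds to the "equal variances not assumed" rows reported in Appendix F):

```python
from scipy import stats

# Hypothetical question averages for teachers in the two AYP groups.
made_ayp = [3.6, 3.4, 3.7, 3.5, 3.8]
not_made_ayp = [3.3, 3.6, 3.4, 3.5]

# Independent-samples t-test at the 0.05 level; Welch's version does not assume equal variances.
t_stat, p_value = stats.ttest_ind(made_ayp, not_made_ayp, equal_var=False)
print(t_stat, p_value, p_value < 0.05)
```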
This chapter will summarize the results from the teachers' questionnaire for the Wyoming high school mathematics teachers and school districts. Suggestions will be made, but they offer only a general direction that must be interpreted by individual districts, schools, and teachers in their own contexts.
Questionnaire Limitations
A questionnaire can gather large "amounts of data from many respondents, but its limitations are well known" (Robitaille & Travers, 1992, p. 708). The limitation of biased responses brings caution to interpreting the data (Hawkins, Stancavage, & Dossey, 1998, p. 5; see also Chval, Grouws, Smith, Weiss, & Ziebarth, 2006, p. 47). Even though this instrument had high reliability (0.896) and many questions had been tested with teachers, "respondents [are] working in different" settings (Hawkins, Stancavage, & Dossey, 1998, p. 5). With simple Likert scale words, there is an intrinsic variation among the respondents: the interpretation of the response "Often" by one person is different from the interpretation of the next person, even when the survey participants are from the same school or in the same town. There will be variation in the data; data are not always as clear and as clean as expected. Personality and other individual factors also play a role. A different method of gathering data, rather than a self-reporting questionnaire, may show additional variation between the levels of courses or between the schools that did not show any significant differences in this study.
The timing of the survey and the offer of an incentive promoted a very good response rate (55.78%). A suggestion for improvement would be to get more of the teachers who did not respond to return the questionnaire.
Results
Background
It is important to know the critical factors that must be present in order for
students to be proficient as defined in NCLB. These factors are embedded within the
Wyoming State statutes Title 21, called the Wyoming Education Code of 1969 (Title 21, 2006). Teachers must be aware of what mathematics the Wyoming high school students already know and what skills they need to learn in order to be proficient on the PAWS. Teachers are provided with assessment scores from the students in their classrooms. The results for each school's
latest scores regarding the AYP status and the district’s results in regard to NCLB are
also shared with teachers. Another suggestion for improvement would be an onsite visit. During onsite visits, observers should ask teachers how they specifically use the supplied data. The study asked whether teachers' practices relate to the schools' making adequate yearly progress, AYP, and whether the mathematics course level affects those practices.
There were no significant differences found in the curriculum questions for any
level of teacher’s courses or any school’s AYP designation. The ANOVA results for the
three levels of courses showed no significant differences. The t-test for the schools that
made AYP and schools that did not make AYP showed no significant differences. The
data from this questionnaire add more evidence to support the conclusion that "schools make little difference" in test scores, but "an individual teacher can have a powerful effect" on students' achievement (Marzano, Pickering, & Pollack, 2001, p. 2).
Because some of the curriculum questions had higher mean scaled scores than others, curriculum issues were one large influence on the teachers' responses on the questionnaire. When planning lessons, it appears that teachers do stay focused on the curriculum topics of using standards (question 1) and following the textbook (question 21). These two questions were in the top five means for all teachers, in the top three means for level 2 teachers, and in the top three means for the schools making AYP. The curriculum questions for the schools not making AYP were question 1 – use standards and question 21 – textbook driven; these were in the top five means for the schools not making AYP.
Because national and state tests like NAEP and PAWS follow the curriculum
content of the NCTM Standards, districts have aligned the curriculum to these standards
(NAEP, 2007, para. 4). Teachers are expected to teach toward these standards (Hawkins, Stancavage, & Dossey, 1998, p. 47), and the curriculum should be designed for students to reach proficiency on them.
There were significant differences in the means of the instruction questions for two of the three pairings of course levels. The ANOVA results showing significance were followed by a Tukey HSD comparison test. There were differences in the means (p = 0.004) between teachers teaching the level 1 and level 3 courses and differences in the means (p = 0.015) between teachers teaching the level 2 and level 3 courses. This implies a difference in the instruction responses between teachers teaching courses which contain content below Algebra 1 topics (level 1) and teachers teaching an honors or accelerated course (level 3). There were also differences in the instruction responses between teachers teaching courses with regular content (level 2) and the teachers teaching an honors or accelerated course (level 3).
Do the frequencies and types of activities that the teacher reports on the survey
really reflect what is happening in the classroom (Robitaille & Travers, 1992, p. 708;
Chval, Grouws, Smith, Weiss, & Ziebarth, 2006, p. 47)? While the majority of teachers
report that they are aware of and are using reform instructional methods, when the teachers were observed, the observations showed "that many secondary students are not" experiencing such instruction (Wainwright, Morrell, Flick, & Schepige, 2004, p. 322; see also Stigler & Hiebert, 1999, p. 12). Not observing the frequency and type of instruction students receive over the course is a limitation of this study.
that did not make AYP showed no significant differences between the schools.
80
The instruction questions analyzed on the questionnaire were questions 2, 4, 5, 6, 7, 22 (three parts), and 23. The three parts of question 22 were in the top five means for all the teachers.
In instruction, question 22 – student centered learning had the highest means for:
• level 3 (3.78).
Some instructional choices have even "increased the differences in science achievement between boys and girls" (Marzano, Pickering, & Pollack, 2001, p. 9). Instruction issues were the other large influence on the teachers' responses on the questionnaire. This implies that Wyoming high school mathematics teachers care about the delivery of the instruction they give and focus their instruction on the students. Teachers also care about aligning instruction to the standards.
The three parts of question 22 finished in the top five means for level 1 teachers.
The three parts of question 22 finished in the top five means for level 2 teachers.
The three parts of question 22 finished in the top four means for level 3 teachers.
For the schools making AYP, the three parts of question 22 finished in the top five means.
For the schools not making AYP, the top instruction means were:
• question 22 – instructional strategies (3.57),
• question 22 – student centered learning (3.56), and
• question 22 – teacher management (3.40).
The three parts of question 22 finished in the top three means for schools not making
AYP. The individual teacher still chooses what to teach and when to teach. Even if
teachers are in the same school and are “using the same textbook, they still make
independent decisions about what to teach . . . . and depth of instruction” (Paek, 2008, p.
9).
Assessment
The ANOVA results indicated one significant difference in the means (0.015) of
the assessment questions between the level 1 and level 3 courses. The t-test for the
schools that made AYP and schools that did not make AYP showed no significant
differences in the means between the schools. “Because of the complexity of the context
in which learning takes place, examining a single variable at a time and its sole
relationship to student achievement may not necessarily reveal the true underlying relationship" (Hawkins, Stancavage, & Dossey, 1998, p. 5). "In addition, the reader should remember that statistically significant differences may be differences that are not considered educationally significant" (Hawkins, Stancavage, & Dossey, 1998, p. 5). None of the assessment questions had means in the top five for any of the levels of courses or either AYP group of schools.
Effect Size
What accounts for the effect sizes in the differences of the means between level 1 and level 3 on question 3 – time preparation for testing (-0.83) and question 5 – teacher readiness (-0.82)? Effective teachers can raise student "achievement levels," so if the teacher is effective, the level of class taught should not make a difference (Marzano, Pickering, & Pollack, 2001, p. 3). The data reveal that there is a difference in the means. Question 3 – time preparation for testing had effect sizes which were:
• largest (-0.83) between the lower level (1) and honors (3) teachers;
• moderate (-0.47) between the regular level (2) and the honors (3) teachers; and
• moderate (-0.43) between the lower level (1) and the regular level (2) teachers.
Question 5 – teacher readiness had effect sizes which were:
• large (-0.82) between the lower level (1) and the honors level (3) teachers; and
• moderate (-0.49) between the lower level (1) and regular level (2) teachers.
The only effect sizes of at least moderate size between the schools making AYP and those which did not make AYP were positive. They were for:
• question 5 – teacher readiness;
• question 6 – teaching NCTM standards (0.54); and
• question 21 – textbook driven.
The results showed differences between some topics and some levels of courses.
The differences cannot be explained by the statistical analysis done. Some questions a follow-up study could address include:
• Do honors level teachers incorporate test preparation within the normal lessons
throughout the school year and not count this time as test preparation time?
• How do the teachers teaching different level courses prepare and present a
lesson?
• Do the teachers change the lessons, activities, quizzes, and tests from year to year?
• How do the teachers pace the curriculum material throughout the entire course?
• Why do the teachers in schools which make AYP feel more prepared to teach the
broad five NCTM Standards than the teachers in schools which did not make
AYP?
• This study included 29 lower level teachers and 26 honors level teachers. Is a larger sample of these two groups needed?
If classroom and school visits did occur, the researcher needs to:
• Note any insights about the similarities in what teachers do whether they teach level 1, level 2, or level 3 courses.
• Determine the professional demographics for the teachers teaching level 1, level 2, and level 3 courses.
• Determine the personal and professional demographics for the teachers in schools
that have made AYP and schools that have not made AYP.
Under the NCLB Act, schools have yearly increasing AYP targets that all student
subgroups in a school must achieve. By 2014, all students in all subgroups must be 100%
proficient (NCLB Annual Report, 2005, p. 1). With this goal fast approaching, teachers
do not have much time to ensure all the key factors of curriculum, instruction, and assessment are in place to ensure success for all the students in Wyoming. Further study is needed to observe and document the specific instructional strategies and assessments that occur. Researchers need to observe what effective teachers, at whatever level of course they teach, do to move all students toward proficiency in mathematics.
REFERENCES
http://www.ed.gov/pubs/NatAtRisk/risk.html
Adler, I. (1972). Appendix II: The changes taking place in mathematics. The new
Baker, S. & Leak, B. (2006, January 23). Math {will rock your world}. Business World,
http://web.uccs.edu/lbecker/Psy590/escalc3.htm
Becker, L. A. (1998, 1999). Lecture Notes Retrieved on June 12, 2008 from
http://web.uccs.edu/lbecker/Psy590/es.htm
Black, P. & Wiliam, D. (1998, October). Inside the black box: Raising standards through
classroom assessment. Phi Delta Kappan. Retrieved on June 17, 2002 from
http://www.pdkintl.org/kappan/kbla9810.htm
Boaler, J. (2002, July). Learning from teaching: Exploring the relationship between
33(4), 239-258.
http://pareonline.net/getvn.asp?v=8&n=9
Bracey, G. W. (2006, October). The 16th Bracey report on the condition of public
Mathematics.
Chval, K., Grouws, D, Smith, M., Weiss, I, Ziebarth, S. (2006, June 1). Understanding
the use of curriculum materials: A cross-site research study report. Center of the
http://www.mathcurriculumcenter.org/CrossSiteFull.pdf
Chval, K. B., Reys, R., Reys, B. J., Tarr, J. E., & Chavez, O. (2006, May). Pressure to
improve student performance: A context that both urges and impedes school-
Education, Inc.
Cronin, J., Kingsbury, G. G., McCall, M. S., & Bowe, B. (2005, April). The impact of the
no child left behind act on student achievement and growth: 2005 edition.
http://www.nwea.org/research/nclbstudy.asp
Davis, R. B. (1992). Reflections on where mathematics education now stands and on
Dossey, J. A. (1992). The nature of mathematics: Its role and its influence. In D. A.
Franklin, J. (2006a, July). A new lesson hits home how testing and standards are
changing homework for students and teachers. Education Update, 48(7), 3,6.
Franklin, J. (2006b, September). The n3w3st trend? Education Update, 48(9), 11-12.
Grissom, R. J. & Kim, J. J. (2005). Effect size for research: a broad practical approach.
Gonzales, P., Guzman, J. C., Partelow, L., Pahlke, E., Jocelyn, L., Kastberg, D., et al.
(2004). Highlights from the trends in international mathematics and science study
http://nces.ed.gov/timss/results03.asp
Hawkins, E. F., Stancavage, F. B., & Dossey, J. A. (1998, August). School policies
Hiebert, J. (1999). Relationships between research and the NCTM standards. Journal for
Hiebert, J., Carpenter, T. P., Fennema, E., Fuson, K., Human, P., Murray, H., Olivier, A.,
& Wearne, D. (1996, May). Problem solving as a basis for reform in curriculum
Hill, S. (1983). An agenda for action: Status and impact. In G. Shufelt (Yearbook Ed.) &
mathematics of the 1980s (pp. 1-7). Reston, VA: National Council of Teachers of
Mathematics.
Horizon Research, Inc. (2000). 2000 national survey of science and mathematics
research.com/instruments/math_teacher.pdf
Hufferd-Ackles, K., Fuson, K. C., & Sherin, M. G. (2004, March). Describing levels and
Jones, P. S. & Coxford, A. F. (1970). The goals of history: Issues and forces. In P. S.
the United States and Canada (pp. 1-8). Washington, DC: National Council of
Teachers of Mathematics.
(Ed.), Handbook of research on mathematics and learning (pp. 3-38). New York,
NY: Macmillan.
Klein, S., Hamilton, L., McCaffrey, D., Stecher, B., Robyn, A., & Burroughs, D. (2000).
Lefkowits, L. & Miller, K. (2006, January). Fulfilling the promise of the standards
Marshall, J. (2006, January). Math wars 2: It’s the teaching, stupid! Phi Delta Kappan,
87(5), 356-363.
Curriculum Development.
McCaffrey, D. F., Hamilton, L. S., Stecher, B. M., Klein, S. P., Bugliari, D., & Robyn, A.
McTighe, J. & Ferrara, S. (1994, November). A report from professional standards and
Education Association.
NAEP - what does the NAEP mathematics assessment measure? (2007, September).
http://nces.ed.gov/nationsreportcard/mathematics/whatmeasure.asp
National Center for Educational Statistics. (2006). Frequently asked questions about the
http://nces.ed.gov/timss/FAQ.asp?FAQType=
National Center for Education Statistics. (2006). NAEP Overview the nation’s report
National Council of Teachers of Mathematics. (1980). An agenda for action. Reston, VA:
Mathematics.
Mathematics.
National Research Council. (1989). Everybody counts: A report to the nation on the
National Research Council. (2001). Adding it up: Helping children learn mathematics. J.
Press.
NCLB overview executive summary. (2006). Retrieved on July 23, 2006 from
www.ed.gov/nclb/overview/intro/execsumm.html
NCTM Research Committee: Heid, M. K., Middleton, J. A., Larson, M., Gutstein, E.,
Fey, J. T., King, K., Strutchens, M. E., & Tunis, H. (2006, March). The challenge
37(2), 76-86.
[NCLB annual report] No child left behind act of 2001 annual report to congress. (2005,
www.ed.gov/print/about/reports/annual/nclbrpts.html
Porter, A. (1995). Critical issue: Integrating assessment and instruction in ways that
http://www.ncrel.org/sdrs/areas/issues/methods/assment/as500.htm
Ravitch, Diane. (2000). Left back: a century of failed school reforms. New York, NY:
Reys, R., Reys, B., Lapan, R., Holliday, G., & Wasman, D. (2003, January). Assessing
95.
Reys, R., Reys, B., Tarr, J., & Chavez, O. (2006 March). Assessing the impact of
http://mathcurriculumcenter.org/MS2_report.pdf
Schifter, D. (1996, March). On teaching and learning mathematics. Phi Delta Kappan,
Schoen, H. L., Cebulla, K. J., Finn, K. F., & Fi, C. (2003, May). Teacher variables that
SPSS FAQ. What does Cronbach’s alpha mean? Retrieved on June 16, 2008 from
http://www.ats.ucla.edu/stat/spss/faq/alpha.html
Stiggins, R. J. (2001). Student-involved classroom assessment (3rd ed.). Upper Saddle
Kappan, 79(1). Retrieved on September 13, 2006 from Wilson Omnifile Full
Text Select.
Stigler, J. W. & Hiebert, J. (1999). The teaching gap: Best ideas from the world’s
teachers for improving education in the classroom. New York, NY: The Free
Press.
http://legisweb.state.wy.us/statutes/dlstatutes.htm
Toch, T. (2006). Margins of error: The education testing industry in the no child left
http://www.educationsector.org/usr_doc/Margins_of_Error.pdf
Trochim, W. M. K. (2006). Measurement validity types. Retrieved on July 11, 2008 from
http://www.socialresearchmethods.net/kb/measval.php
Turley, E. D. (2006, February). Textural perceptions of school time and assessment.
Wainwright, C., Morrell, P. D., Flick, L., & Schepige, A. (2004, November). Observation
Mathematics.
World almanac and book of facts 2002. (2002). New York, NY: World Almanac
Education Group.
World almanac and book of facts 2006. (2006). New York, NY: World Almanac Books.
results – schools and districts release. Retrieved on October 12, 2006 from
http://www.k12.wy.us/SAA/ayp/Ayp06.pdf
http://www.k12.wy.us/SAA/standards/math.pdf
Wyoming Department of Education. (2005, October 19). 2005 Wyoming NAEP scores
http://www.k12.wy.us/A/2005_pr/NAEP.pdf
APPENDIX A
APPENDIX B
SUPERINTENDENT REQUESTS
Superintendent’s Name Participating District Name City
Brian Recht Albany County School District #1 Laramie
Kevin Mitchell Big Horn County School District #1 Cowley
Dan Coe Big Horn County School District #2 Lovell
Craig Sorensen Big Horn County School District #3 Greybull
Ray Yoder Big Horn County School District #4 Basin
Richard Strahorn Campbell County School District #1 Gillette
Peggy Sanders Carbon County School District #1 Rawlins
Robert Gates Carbon County School District #2 Saratoga
Dan Espeland Converse County School District #1 Douglas
Lon Streib Crook County School District #1 Sundance
Paige Fenton-Hughes Fremont County School District #1 Lander
Susan Kinneman Fremont County School District #2 Dubois
Diana Clapp Fremont County School District #6 Pavillion
Michelle Hoffman Fremont County School District #14 Ethete
Tammy Cox Fremont County School District #24 Shoshoni
Craig Beck Fremont County School District #25 Riverton
Ray Schulte Goshen County School District #1 Torrington
John Balow Hot Springs County School District #1 Thermopolis
Rod Kessler Johnson County School District #1 Buffalo
Dan Stephan Laramie County School District #1 Cheyenne
Margie Simineo Laramie County School District #2 Pine Bluffs
Gene Carmody Lincoln County School District #1 Diamondville
Jon Abrams Lincoln County School District #2 Afton
Jim Lowham Natrona County School District #1 Casper
Richard Luchsinger Niobrara County School District #1 Lusk
Jerry Maurer Park County School District #1 Powell
Bryan Monteith Park County School District #6 Cody
Stuart Nelson Platte County School District #1 Wheatland
David Barker Platte County School District #2 Guernsey
Sue Belish Sheridan County School District #1 Ranchester
Craig Dougherty Sheridan County School District #2 Sheridan
John Baule Sheridan County School District #3 Clearmont
Doris Woodbury Sublette County School District #1 Pinedale
Weldon Shelley Sublette County School District #9 Big Piney
Paul Grube Sweetwater County School District #1 Rock Springs
Barb Arnold Van Matre Sweetwater County School District #2 Green River
Sean Shockley Teton County School District #1 Jackson
Dennis Wilson Uinta County School District #1 Evanston
Jack Cozort Uinta County School District #4 Mountain View
Randy Hillstead Uinta County School District #6 Lyman
Michael Hejtmanek Washakie County School District #1 Worland
Jerry Erdahl Washakie County School District #2 Ten Sleep
Brad LaCroix Weston County School District #1 Newcastle
Troy Claycomb Weston County School District #7 Upton
APPENDIX C
PRINCIPAL REQUESTS
APPENDIX D
TEACHER REQUESTS
WYOMING HIGH SCHOOL
MATHEMATICS TEACHER QUESTIONNAIRE
UNL IRB #2006-11-119 EP
Most of the questions instruct you to “circle one” or “darken one.” For a few questions, you are
asked to write in your answer on the line provided. Please use a #2 pencil or black or blue pen to
complete this questionnaire. Be sure to erase or white out completely any stray marks.
Target Class
Part of the questionnaire asks you to provide information about a particular “target” class. Please
consider your second period or block mathematics class as the target class. If this is your
planning time, please use your first mathematics class as the target class. If your schedule varies
by day, use today’s schedule or the most recent school day. If you are not teaching math at this
time, please think back to a class you taught last semester or last year.
If you have questions about the study or any items in this questionnaire, please contact me at
Kelly Walsh High School – 307-233-2000 or 307-234-6354 (home) or email at
[email protected] or [email protected] (home).
Drawing Deadline
If you return your completed questionnaire before Monday, March 5, 2007, your name and phone
number will be separated from the questionnaire and entered into a drawing for one of three
VISA Gift cards. The first name drawn will receive a $100 VISA Gift card. A $75 VISA Gift
card will be awarded to the second name drawn. A $50 VISA Gift card will be awarded to the
third name drawn. The VISA Gift cards should be accepted anywhere VISA is accepted and can
be used for a tank of gas, a dinner at your favorite restaurant, or a special gift for you – your
choice. (You may wish to date and copy your completed questionnaire, if there is a question
about returning it prior to March 5th.)
Thank you very much. Your participation is greatly appreciated. Please return the completed
questionnaire in the postage-paid envelope.
Mary Moler
Teacher Survey
1625 Holly Street
Casper, WY 82604-3227
APPENDIX E
FOLLOW-UP REQUEST
Several weeks ago, you were sent a teacher survey in the US mail. The survey asked
questions about standards, PAWS results, your mathematical and pedagogical
background, professional development, a target mathematics class, and the
textbook/program, the instruction used, and the assessments given to the target class. To
the best of my knowledge, your survey has not been returned. (If you have just sent it,
please ignore this reminder.) Remember your answers will be completely confidential.
No information identifying specific teachers, schools, or districts will be released or
published.
The data from the teachers who have already responded is useful, but it does not give a
complete picture without your unique perspective of what you do in your math
classroom.
Remember, I need to have the survey completed and returned prior to March 5, 2007, in
order for you to be entered into the drawing. Your opportunity to be entered to win a
$100, $75, or $50 VISA Gift card is fast approaching.
I hope that you will complete and return the survey. If you have any questions or
comments about this study, I would be happy to talk with you. The telephone number at
Kelly Walsh High School is 233-2000 or 234-6354 at my home. You may e-mail me at
[email protected] or [email protected]. Also, if you have questions or
concerns about research participants’ rights, you may call the University of Nebraska-
Lincoln Institution Review Board at (402) 472-6965.
Thank you for assisting with this important survey. Have a great day!
Sincerely,
Mary Moler☺
Mathematics Teacher
Kelly Walsh High School
307-233-2000
[email protected]
1625 Holly Street
Casper, WY 82604-3227
307-234-6354
[email protected]
APPENDIX F
SPSS RESULTS
Reliability
Warnings
The space saver method is used. That is, the covariance matrix is not calculated or used in the
analysis.
Case Processing Summary: Cases Valid 132 (80.5%); Excluded(a) 32 (19.5%); Total 164 (100.0%).
a. Listwise deletion based on all variables in the procedure.
Reliability Statistics: Cronbach's Alpha = .896, N of Items = 87
Oneway: Descriptives, ANOVA, and Tukey HSD Multiple Comparisons

Group Statistics and t-test for Equality of Means (equal variances not assumed):
t = -1.246, df = 53.355, Sig. (2-tailed) = .218, Mean Difference = -.10698, Std. Error Difference = .08587, 95% CI of the Difference [-.27919, .06524]
t = -1.554, df = 53.549, Sig. (2-tailed) = .126, Mean Difference = -.08729, Std. Error Difference = .05616, 95% CI of the Difference [-.19989, .02532]