LANGUAGE EDUCATION RESEARCH: REVIEWER
------------------------------------------------------------------------------------------------------------------------------------------------
What counts as good language education research?
● It is of good quality
● It makes a meaningful change in society
Research should make an impact not only during our time as university students, but also
in the different undertakings we pursue afterward.
What we are actually trying to promote is useful research: research done not merely because
we are curious about the topic, or because we have not yet developed a technology for it, but because we are sure
that when we do the research, it will make a meaningful change in society or in an
institution.
Ask:
● “What actual difference does LER make?”
● “How do we know what difference it makes?”
Good research reaches the standard we are aiming for and, at the same time, makes a difference.
3 Basic Importance of Research
● Adds to our knowledge
○ Evidence-based
● Improves practice
○ Never-ending quest for improvement
● Informs policy debates
Limitations or Controversies in Research
● Contradictory or vague findings
● Questionable data
○ Unethical, manipulation, misinterpretation
Why are theories theories?
● The studies are still evolving
● The content and nature are evolving
“We cannot erase all the poor research reported in the educational field. We can, however, as responsible
inquirers, seek to reconcile different findings, use sound research procedures to collect and analyze data, and
provide a clear understanding”
Kinds of Research and Their Key Characteristics
● BASIC Research: focuses on generating fundamental knowledge
● APPLIED Research: focuses on real-world questions and applications
● EVALUATION Research: focuses on determining the worth, merit, or quality of intervention programs
● ACTION Research: focuses on solving local problems that practitioners face
● ORIENTATIONAL Research: focuses on reducing inequality and giving voice to the disadvantaged
Educational Action Research
- ACTION RESEARCH is focused on solving specific problems that local practitioners face in
their schools and communities.
- Views the classroom or any local environment as a place to conduct research.
- Action research integrates theory and research with practice.
Action + Research = Action Research
ACTION: implies doing or changing something; solving a problem
RESEARCH: depicts creating knowledge or theory about that action that contributes to Science
What Action Research Is in Brief
Action research is...
- a process that improves things, places, events, and practice
- problem-solving education through change
- collaborative
- cyclical
- practical and relevant
- within the context of teachers
- how we can do things better
- exploring, discovering, and seeking
- finding creative solutions
- a way to improve instructional practice
- observing, revising, and reflecting

Action research is not...
- doing research on or about people
- linear
- conclusive
- generalizing to larger populations
- why we do certain things
- predetermined answers to questions
- a fad
What Action Research is Not (Common Misconceptions About Action Research)
1. Action research is not what usually comes to mind when we hear the word "research."
2. Action research is not a library project in which one learns more about a topic of
interest.
3. It is not problem-solving in the sense of trying to find out what is wrong, but rather a
quest for knowledge about how to improve.
4. Action research is not about doing research on or about people, or finding all
available information on a topic looking for the correct answers. Instead, it involves
people working to improve their skills, techniques, and strategies.
5. Action research is not about learning why one does certain things, but rather how
he/she can do things better. It is about how a teacher can change his/her instruction to impact
students.
You use action research when you have a specific educational problem to solve.
An important characteristic of action research, which sets it apart from other designs, is that it is
usually cyclical in nature, reflecting the fact that people usually work towards solutions to their
problems in cyclical, iterative ways.
Action research is repetitive, continuing, and cyclical.
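The cyclical, iterative process described above can be sketched as a simple loop. This is only an illustration of the process shape; the stage names follow the common plan-act-observe-reflect cycle and are not a prescribed procedure:

```python
# Sketch of the cyclical action-research process: the same four
# stages repeat, cycle after cycle, rather than running once from
# start to finish (cyclical, not linear).

def action_research(problem, max_cycles=3):
    """Log plan-act-observe-reflect stages for each cycle on `problem`."""
    log = []
    for cycle in range(1, max_cycles + 1):
        for stage in ("plan", "act", "observe", "reflect"):
            log.append((cycle, stage))
    return log

history = action_research("low reading comprehension")
# Three cycles of four stages each: twelve logged steps in total.
```

Each pass through the loop corresponds to one iteration of working toward a solution; in practice the practitioner decides after each "reflect" stage whether another cycle is needed.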
Nature of Language Research (Posecion et al., 2011)
1. Language Use
2. Types of Users
3. Acquisition Process
4. Setting
5. Research Methodology
Framework of Language Research (Seliger & Shohamy, 1989)
● Approaches
● Objectives
● Research Design
● Data Collection and Analysis
WRITING THE CHAPTER 1
The Introduction
● Establishes the context of the research
● Mental road map
● What was I studying?
● Why was this topic important to investigate?
● What did we know about this topic before I did this study?
● How will this study advance our knowledge?
Structure and Approach
● What is this?
● Why am I reading it?
● What do you want me to think about/consider doing/react to?
Establish an area to research by:
● Highlighting the importance of the topic, and/or
● Making general statements about the topic, and/or
● Presenting an overview on current research on the subject
Identify a research niche by:
● Opposing an existing assumption, and/or
● Revealing a gap in existing research, and/or
● Formulating a research question or problem, and/or
● Continuing a disciplinary tradition
Place your research within the research niche by:
● Stating the intent of your study,
● Outlining the key characteristics of your study,
● Describing important results, and
● Giving a brief overview of the structure of the paper
Delimitation of the Study
● The key aims and objectives of your study,
● The research questions that you address,
● The variables of interest [i.e., the various factors and features of the phenomenon being
studied],
● The method(s) of investigation, and
● Any relevant alternative theoretical frameworks that could have been adopted
The Narrative Flow
● Your introduction should clearly identify the subject area of interest
● Establish context by providing a brief and balanced review of the pertinent published
literature that is available on the subject
● Clearly state the hypothesis that you investigated
● Why did you choose this kind of research study or design?
Engaging the Reader
● Open with a compelling story
● Include a strong quotation or a vivid, perhaps unexpected anecdote,
● Pose a provocative or thought-provoking question,
● Describe a puzzling scenario or incongruity, or
● Cite a striking example or case study that illustrates why the research problem is important
WRITING THE PROPOSAL
Proposal Writing Made Palatable
By David E. Rawnsley
Proposal writing, while not an easy task, need not be an unpleasant one. The checklist and suggestions
provided here should help to assure that this is the case.
In spite of personal predispositions, it is the rare school administrator who does not sooner or later
find himself or herself involved in the development of a proposal. While this is rarely a pleasant
experience, it does not have to be a traumatic one if a little organization is applied to the
process.
The 20-item checklist on the following pages, based on some 15 years of helping various
agencies apply for funds from a variety of sources, is offered as one way to put this organization
into the process. The checklist has been helpful regardless of the type of funding source (federal,
state, or private) that is being approached, even though there may be wide variations in the
specifics of the application process.
It is always tempting to get someone else to do project planning and proposal writing for you,
and some notes are provided at the end of this article concerning the types of assistance you
might wish to seek. However, you should keep in mind that a proposal should be a workable plan
even if it isn’t funded. A school district frequently finds a way to implement a good plan out of its
own money in spite of a lack of financial support from outside agencies.
Although this checklist is presented as a series of discrete steps, it will be an unusual
circumstance when each of the steps would be followed in the exact sequence offered. Nor is
there any profound reason why they should be. If, however, all the steps are completed in some
reasonable order, the proposal writer should be able to avoid any great surprises on the day before the
deadline, or perhaps more importantly, the day after.
1. Determine the specific area of need you want to work on, or the specific problem you want to
solve.
Avoid the proposal that is a solution looking for a problem. Also avoid:
● Confusing a personal need (“We need a reading consultant”) with a student need
(“students aren’t reaching reading objectives”). Personnel is a solution, not a need.
● Analyzing the need or problem inadequately. For example, stating that your
students have a reading problem doesn’t give you much insight into whether to
plan a program to increase decoding skills or comprehension skills, or something
else.
2. Set the general goal you want the project to attain, match it with school or district priorities, and
get the appropriate approval(s) to proceed with planning.
This doesn’t need to involve developing a set of technically exact goals and objectives. It
does involve defining what it is you are trying to plan in clear enough terms to be able to
communicate it. Do get approvals from the superintendent, the board, advisory groups,
and staff (depending on your district’s regular practices). Surprise is not a good tactic in
developing projects.
3. Investigate proper funding sources appropriate to your need and goal.
Get copies of guidelines from prospective funding sources when possible and mark them
up, making specific note of obscure or unusual requirements (such as requirements for
matching funds, disallowment of certain types of costs, requirements to demonstrate
involvement of special groups in planning, and so on) which can bring your proposal to
grief at the last minute.
4. Write down a plan for planning the project and writing the proposal.
This plan should include:
● Who will be involved, and how
● Who will be responsible for what
● A timeline (which can be built by working back from the deadline)
The table of contents of the application form provides a good outline of what needs to be
done. Remember in scheduling that many boards of education require two meetings to
approve submission of a proposal, and that it will generally take a minimum of one week
to type, proofread, and produce.
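The “timeline built by working back from the deadline” mentioned in this step can be sketched programmatically. The task names, durations, and the deadline below are hypothetical, not taken from the article:

```python
from datetime import date, timedelta

# Build a schedule by working back from the deadline: start at the due
# date and subtract each task's estimated duration, in reverse order.
deadline = date(2024, 3, 15)  # hypothetical submission deadline
tasks = [
    # (task, estimated working days), listed last-task-first
    ("produce and proofread", 7),        # "a minimum of one week..."
    ("board approval (2 meetings)", 14),
    ("draft review", 5),
    ("write draft", 10),
]

schedule = []
due = deadline
for name, days in tasks:
    start = due - timedelta(days=days)
    schedule.append((name, start.isoformat(), due.isoformat()))
    due = start
# `due` now marks the latest date planning must begin to hit the deadline.
```

Listing tasks in reverse order keeps the arithmetic simple: each task's start date becomes the previous task's due date.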
5. Do some research
● Investigate what is known from research and experience about your need and the
students who have it
● Find out what solutions others may have used for the same or similar problems.
Research indicates that the most effective programs are adaptations by local staffs
or solutions developed and tried elsewhere. Have at least one alternative to your
“pet” solution, if for no other reason than to have something with which to compare
your idea.
● Gather information about your own situation, such as past performance of your
students, background or staff, or materials already available. You’ll need it for the
proposal.
All of this research provides the basic “stuff” for your rationale in step number 12.
6. Select the approach you want to use to meet your need in general terms.
Most proposals, regardless of funding source, will include:
● A statement of need
● Goals/objectives
● Rationale
● Major activities
● Evaluation
● Personnel, materials, and equipment
● Budget
7. Rewrite your needs statement and goal in formal terms and specify discrepancies between
needs and goals/objectives if guidelines require them.
This is the first step in actually writing the proposal. However, remember that until you
give the final copy to the typist, anything can be revised. Don’t let anyone get too “ego-
involved” in their first draft. People tend to fall in love with their own rhetoric.
8. Analyze your project into components
Components are sets of major activities which reasonably group together around an
objective or set of objectives (such as instruction, staff development, parent involvement,
and so on). Most educational projects include an instructional component, and this is the
place to start. But remember that all projects require management; making this a
separate component may reveal a number of activities (and costs) you hadn’t thought of
beforehand.
There is no “true” set of components; unless the guidelines require a certain set, use what
seems most reasonable and useful.
9. Specify objectives for each component
Figure out how you will know whether a component is being successfully carried out and
state that as an objective. Unless you are omniscient, your project will need revision in the
second year, and organization by component provides a good way to see what should be
revised.
10. List major activities by component
List them once, and then go back and list them in reasonable order of occurrence. Simple
flow-charting can be of great help here. Be very realistic when estimating how long an
activity will take; it is unlikely that a new curriculum can be developed in two weeks,
regardless of the abilities of your staff. Don’t forget little matters like vacations and
holidays when estimating timelines.
11. Design the formal evaluation plan
Be realistic, get advice, and remember to budget for evaluation. First determine the
purpose(s) of the evaluation. Don’t assume that the evaluation required in the guidelines
will provide the same information that you (or your board) will want to know about the
project. And don’t propose to measure significant gains in student learning within six to
nine months after the project starts. Try as hard as you can to base your evaluation on
data which are normally collected. Be careful about scheduling of standardized tests; too
many projects expect to see gains in scores on tests administered four to five months
apart, and this expectation is very likely to be a disappointment for technical reasons
having nothing to do with your project.
12. Write the rationale
Although some funding programs specify what should be included in this section, the
rationale is basically a statement of why you chose this solution to your problem, and what
makes you think it will work. One of the basic purposes of a rationale is to provide
evidence (direct or indirect) that your school/school district is capable of doing what is
proposed. Don’t confuse the rationale with the proposal. Too many project developers
think that once they’ve written this section, the rest is boilerplate. Remember: A proposal
is a plan of action, not a set of reasons.
13. Determine personnel needs, job descriptions, and qualifications
In many organizations, particularly public ones, it is a complex process to get a completely
new position authorized. Check to see if the project’s personnel needs can fit into already
existing job classifications rather than inventing new ones.
Also be careful not to require more credentials than are
needed and/or required. Excessive credential requirements can increase personnel costs
and severely limit the “pool” of candidates.
14. List everything that you really need to make the project work that will cost something, and build
a budget.
The budget is the financial expression of your plan of action, so the plan and the budget
should match virtually item for item. It is usually easier to build the budget in the form you
are most used to, and then translate it into the form required. Remember that salaries will
almost always be determined by district salary schedules, not by what you think you can
get away with paying someone in order to keep the budget down. Don’t forget that
someone has to pay for administering the project (even if this is no more than
bookkeeping). Include these costs as overhead (if allowed; sometimes it isn’t) or as direct
cost items. Budget realistically for what you need; “under-budgeting” can have as adverse
an effect on proposal review as “over-budgeting,” and it won’t add to your local reputation
as a planner if you are funded.
15. Review a draft of the project
Use the table of contents of the guidelines or application form as an outline for the review.
Let all interested parties see the draft; don’t wait until you have produced the
“final” copies, or someone may “definalize” them by pointing out some
awesome omission.
16. Prepare a summary for preliminary board (and advisory committee) review
First make sure your board will accept such a summary. If not, you’ll have to move your
schedule for the next three steps ahead considerably.
17. Get the proposal typed, proofread, and assembled
This step can be started before Steps 18 and 19 are completed, but don’t staple the
proposal together yet. Be forewarned that typing a proposal is not like typing a letter; be
available to the typist. A draft is rarely so clear and lucid as to be beyond
misunderstanding.
This step inevitably takes longer than you think it should.
18. Get the necessary signatures on resolutions, assurances, and title pages, and insert them into
the proposal.
You can be collecting these pieces of “boilerplate” throughout the writing process, but you
won’t get far without them.
19. Build your table of contents, and put it into the proposal.
Along with being a necessary part of the proposal, building a table of contents is an
excellent way to assure that you haven’t forgotten something. Compare yours with any
which may appear in the guidelines. Now you can staple the proposal together.
20. Submit the proposal
Remember most deadlines specify the date the funding agency must receive the proposal,
not the date it is to be mailed. It is a good practice to send proposals by registered mail.
Always include the number of copies required by the guidelines. Produce and keep
enough copies to pass around if you are funded; after all, the proposal is a plan of action,
not something to be filed while you spend the money in some other fashion.
Resources for Assistance
Planning a project and writing a proposal is not a simple activity, and you may need to call in
some outside assistance. The kind of assistance you do seek will depend upon your specific
needs, but most will fall into one of the following categories (or some combination of them).
● Content-area Consultants, frequently called “experts”
These are people with experience, knowledge, and reputation in the area in which you are
working. They can be particularly helpful during those steps in which you are selecting,
adapting, or inventing a solution and planning a program. They can provide ideas,
research knowledge, references to other programs, knowledge about what it will take to
reach your goal, and (let’s face it) a certain amount of “clout” to your proposal. However,
if they write your plan and proposal, it will be theirs and not yours, and in all likelihood will
not be as successful in implementation.
● Planning Consultants, or specialists in generalities
Planners assist you in getting from a general idea to a specific plan of action. Their specific
skills are in designing, organizing ideas, and critiquing plans for completeness and
continuity. Occasionally, these skills and those of the content-area consultant will be found
in the same person, but one does not necessarily imply the other.
● Proposal or project writers
These people take a well-developed plan of action and translate it into the forms and
guidelines required by a specific funding source. In the process of doing so, they may be
able to point out inadequacies in your plan, but if they are expected to fill in gaps in your
planning (e.g., make up objectives) or to develop the plans themselves, they have been
transposed into “creative liars”. In some cases you may be able to find someone who can
combine this skill with other types of consultation, but this can be overdone to the point at
which no one else is involved in the project.
● Technical assistance
This refers to assistance in meeting the technical requirements of a particular application
process. Included are such things as interpreting guidelines and regulations, reviewing of
proposals to see if they match the requirements of the funding source, giving assistance in
filling out specific forms, and providing information about funding alternatives.
Intermediate agencies are good places to look for this type of assistance. When in doubt,
don’t be bashful about contacting the funding agency itself. Most will be very helpful up to
the point of giving you an advantage over other applicants.
Even the best of proposals are not always funded, but there are important side
benefits to a well-organized, project-development activity. These can include an increase
in the clarity with which those involved look at a problem or need, development of skills
which can be used in day-to-day program planning, and better understanding on the part
of staff of the complexities of planning and budget-building. Good management of the
process, however, is the key.
Testing Effective If Used Properly
Standardized testing has suffered more from the excessive expectations of its advocates than
from the attacks of its critics, according to William W. Turnbull, president of the Educational
Testing Service (ETS).
Turnbull suggests that standardized testing be put into proper perspective: as an objective,
accurate method for teachers to assess how much their students have learned, and as a basis for
comparison with students in other classes, schools, and states.
—
WRITING YOUR CHAPTER 3
Research Design
● Methodology
● Data source(s)
● Procedure (sampling procedure)
● Instrumentation
● Analytical plan, including statistical treatment applicable
Methodology
● Assumptions, postulates, rules, and methods–the blueprint or roadmap–that researchers
employ to render their work open to analysis, critique, replication, repetition, and/or
adaptation and to choose research methods
(Given, 2008, p.516)
● What methods do we propose to use?
● What methodology governs our choice and use of methods?
● What theoretical perspective lies behind the methodology in question?
● What epistemology informs this theoretical perspective?
(Crotty, 1998, p.2)
Data Sources
● What data will be collected?
Participants?
● Who will participate in the study?
The research participants will be 140 randomly selected children from those attending Grades 2 and 6
in three Midwestern schools serving a primarily middle-class neighborhood. There will be an equal
number of male and female children from each grade. Each will be given a free ticket to one of the local
theaters when he or she completes the research study.
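The sampling plan just described (140 randomly selected children, with equal numbers of male and female pupils from each of Grades 2 and 6) could be sketched as a stratified random draw. The rosters and the cell size of 35 are synthetic placeholders, not the actual study data:

```python
import random

# Stratified random sampling sketch: draw 35 children from each
# grade-by-sex cell (2 grades x 2 sexes x 35 = 140 participants).
random.seed(42)  # fixed seed so the illustration is reproducible

# Synthetic pupil rosters, one per grade-by-sex cell.
rosters = {
    (grade, sex): [f"G{grade}-{sex}-{i}" for i in range(200)]
    for grade in (2, 6)
    for sex in ("M", "F")
}

sample = []
for cell, pupils in rosters.items():
    sample.extend(random.sample(pupils, 35))  # 35 per cell, no repeats
```

Drawing separately within each cell is what guarantees the equal male/female split per grade; a single random draw of 140 from the pooled roster would not.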
Procedure
● How will you conduct the study?
Sampling Procedure
● How did you collect your sample?
Instrumentation
● What tools will you use?
The Information and Block Design subtests of the Wechsler Preschool and Primary Scale of
Intelligence-Revised (WPPSI-R; Wechsler, 1989) will be used to estimate the research participants’
general level of intellectual functioning. The information subtest… [briefly explain what it is and what
type of response is required of the child]. The Block Design subtest… [briefly explain what it is and what
type of response is required of the child]. Test-retest reliability of the Information subtest ranges from
.74 to .84 and .79 to .86 for the Block Design subtest. The subtests should be appropriate because the
participants to be used in the study proposed here will be socially and demographically similar to the
individuals in the norming group that was used to obtain the published reliability and validity data.
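Test-retest reliability figures like those quoted above (.74 to .86) are Pearson correlations between scores from two administrations of the same subtest. A minimal sketch with made-up scores, not WPPSI-R data:

```python
import math

# Test-retest reliability = Pearson correlation between the scores a
# group obtains on two administrations of the same instrument.

def pearson_r(x, y):
    """Pearson product-moment correlation between two score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical subtest scores from the same seven children, twice.
first_admin  = [8, 11, 9, 13, 10, 7, 12]
second_admin = [9, 12, 9, 12, 11, 8, 13]
r = pearson_r(first_admin, second_admin)
# r near 1 means scores are stable across administrations.
```

A coefficient in the .74–.86 range, as reported for the subtests, is conventionally taken as adequate stability for group-level research use.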
Analytical Plan
● How will you analyze the data?
Statistical Treatment
● What statistical treatment, if any, will be applied?
RESEARCH DESIGNS
Overview:
● Research Approaches
● Research Methodology
● Research Designs
● Research Methods
Research Approaches
● Researcher’s ontologies and epistemologies
● Research Designs
● Research Methods
● Quantitative
● Qualitative
● Mixed Approach
Research Approaches/Paradigms (Johnson & Christensen, 2012, p. 34)
Scientific Method
● Quantitative: Confirmatory or “top-down”; the researcher tests hypotheses and theory with data
● Mixed: Confirmatory and exploratory
● Qualitative: Exploratory or “bottom-up”; the researcher generates or constructs knowledge, hypotheses, and grounded theory from data collected during fieldwork

Ontology (i.e., nature of reality/truth)
● Quantitative: Objective, material, structural, agreed-upon
● Mixed: Pluralism; appreciation of objective, subjective, and intersubjective reality and their interrelations
● Qualitative: Subjective, mental, personal, and constructed

Epistemology (i.e., theory of knowledge)
● Quantitative: Scientific realism; search for Truth; justification by empirical confirmation of hypotheses; universal scientific standards
● Mixed: Dialectical pragmatic justification (what works for whom in specific contexts); mixture of universal (e.g., always be ethical) and community-specific needs-based standards
● Qualitative: Relativism; individual and group justification; varying standards

View of human thought and behavior
● Quantitative: Regular and predictable
● Mixed: Dynamic, complex, and partially predictable; multiple influences include environment/nurture, biology/nature, free will/agency, and chance/fortuity
● Qualitative: Situational, social, contextual, personal, and unpredictable

Most common research objectives
● Quantitative: Quantitative/numerical description, causal explanation, and prediction
● Mixed: Multiple objectives; provide complex and fuller explanation and understanding; understand multiple perspectives
● Qualitative: Qualitative/subjective description, empathetic understanding, and exploration

Interest
● Quantitative: Identify general scientific laws; inform national policy
● Mixed: Connect theory and practice; understand multiple causation, nomothetic (i.e., general) causation, and idiographic (i.e., particular, individual) causation; connect national and local interests and policy
● Qualitative: Understand and appreciate particular groups and individuals; inform local policy

“Focus”
● Quantitative: Narrow-angle lens, testing specific hypotheses
● Mixed: Multilens focus
● Qualitative: Wide-angle and “deep-angle” lens, examining the breadth and depth of phenomena to learn more about them

Nature of observation
● Quantitative: Study behavior under controlled conditions; isolate the causal effect of single variables
● Mixed: Study multiple contexts, perspectives, or conditions; study multiple factors as they operate together
● Qualitative: Study groups and individuals in natural settings; attempt to understand insiders’ views, meanings, and perspectives

Form of data collected
● Quantitative: Collect quantitative data based on precise measurement using structured and validated data-collection instruments
● Mixed: Collect multiple kinds of data
● Qualitative: Collect qualitative data such as in-depth interviews, participant observation, field notes, and open-ended questions; the researcher is the primary data-collection instrument

Nature of data
● Quantitative: Variables
● Mixed: Mixture of variables, words, categories, and images
● Qualitative: Words, images, categories

Data analysis
● Quantitative: Identify statistical relationships among variables
● Mixed: Quantitative and qualitative analysis used separately and in combination
● Qualitative: Use descriptive data; search for patterns, themes, and holistic features; appreciate difference/variation

Results
● Quantitative: Generalizable findings providing representation of an objective outsider viewpoint of populations
● Mixed: Provision of “subjective insider” and “objective outsider” viewpoints; presentation and integration of multiple dimensions and perspectives
● Qualitative: Particularistic findings; provision of insider viewpoints

Form of final report
● Quantitative: Formal statistical report (e.g., with correlations, comparisons of means, and reporting of statistical significance of findings)
● Mixed: Mixture of numbers and narratives
● Qualitative: Informal narrative report with contextual description and direct quotations from research participants
Quantitative Approach
● Objective
● Numerical
Qualitative Approach
● Exploration
● Non-numerical
Mixed Method Approach
● Combination of data, methods, and theories
Research Methodology
● Assumptions, postulates, rules, and methods–the blueprint or roadmap–that researchers
employ to render their work open to analysis, critique, replication, repetition, and/or
adaptation and to choose research methods
(Given, 2008, p.516)
● What methods do we propose to use?
● What methodology governs our choice and use of methods?
● What theoretical perspective lies behind the methodology in question?
● What epistemology informs this theoretical perspective?
(Crotty, 1998, p.2)
Research Designs
● Strategies of inquiry (Denzin & Lincoln, 2011)
● Collect, analyze, and interpret data
Quantitative Designs
● Experimental
○ True
○ Quasi
● Non-experimental
○ Causal Comparative
○ Correlational
● Experimental
● Correlational
● Survey
Experimental Research Key Characteristics
● Random Assignment
● Control over Extraneous Variables
● Manipulation of the Treatment Conditions
● Outcome Measures
● Group Comparisons
● Threats to Validity
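Random assignment, the first key characteristic above, can be sketched in a few lines: shuffle the participant list, then split it into groups. The participant IDs and group sizes are hypothetical:

```python
import random

# Random assignment sketch: every participant has an equal chance of
# landing in either group, which is what lets the experimenter attribute
# outcome differences to the treatment rather than pre-existing ones.
random.seed(7)  # fixed seed so the illustration is reproducible

participants = [f"P{i:02d}" for i in range(1, 41)]  # 40 participants
random.shuffle(participants)

treatment = participants[:20]  # receive the intervention
control   = participants[20:]  # receive no treatment (or a placebo)
```

Note that random assignment (who gets which condition) is distinct from random selection (who enters the study at all); experiments require the former.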
Experimental Research Types
● Between Group Designs
○ True Experiments
○ Quasi Experiments
○ Factorial Designs
● Within Group or Individual Designs
○ Time Series Experiments
○ Repeated Measures Experiments
○ Single Subject Experiments
Experimental Research Potential Ethical Issues
● Ethics of Procedure
○ Withholding treatments
○ Randomized assignments (Mark & Gamble, 2009)
○ Discontinuance
Experimental Research Steps (Creswell, 2014)
1. Decide if ER is the best choice
2. Form Hypothesis
3. Select Experimental Units
4. Select Experimental Treatments
5. Choose a Design
6. Conduct the Experiment
7. Organize and Analyze Data
8. Develop an ER Report
Experimental Research Quality Criteria (Creswell, 2014, p.327)
1. The experiment has a powerful intervention
2. Participants gain from involvement in the intervention
3. The researcher selects an adequate number of participants per group in some systematic
way
4. The researcher uses valid, reliable, and sensitive measures
5. The researcher controls for extraneous factors that might influence the outcome.
6. The researcher addresses threats to internal and external validity.
Correlational Research Key Characteristics
● Display of Scores
● Association Between Scores
● Multiple Variable Analysis
○ Use of Advanced Correlational Statistical Procedures
■ Factor Analysis
■ Discriminant Function Analysis
■ Path Analysis
■ Structural Equation Modeling
■ Hierarchical Linear Modeling
Correlational Research Types
● Explanatory Design
● Prediction Design
Correlational Research Potential Ethical Issues
● Not measuring appropriate controls
● Editing or making up data
● Failure to analyze and report effect sizes
● Measurement error
● Plagiarism
● Failure to report contradictory findings
● Publication of same evidence many times
● Omission of negative findings
Correlational Research Steps
1. Determine if CR is the best option
2. Identify individuals
3. Identify two or more measures
4. Collect data and monitor potential threats
5. Analyze data and represent the results
6. Interpret the results
Correlational Research Quality Criteria (Creswell, 2014, p. 361)
1. There is an adequate sample size for hypothesis testing.
2. The researcher displays correlational research in a table or graph
3. The researcher selects an appropriate statistical test.
4. There is an interpretation about the direction and magnitude of association among the
variables.
5. An assessment is made about the magnitude of the association based on the
coefficient of determination, p values, effect size, or the size of the coefficient.
6. The researcher identifies the predictor and criterion variables.
7. In a visual model, the expected/predicted direction based on observed data is presented.
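Criterion 5 in the list above mentions the coefficient of determination. It is simply the square of the correlation coefficient and gives the proportion of variance in one variable accounted for by the other. A worked example with a hypothetical r:

```python
# Coefficient of determination: r squared, the proportion of
# shared variance between two correlated variables.

r = 0.60                  # hypothetical correlation coefficient
r_squared = r ** 2        # = 0.36

# An r of .60 means the two variables share 36% of their variance;
# the remaining 64% is left unexplained.
shared_variance_pct = round(r_squared * 100)
```

This is why interpreting magnitude via r² is stricter than eyeballing r: a "moderate" r of .60 still leaves nearly two-thirds of the variance unaccounted for.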
RESEARCH DESIGNS (PART 2)
Survey Research Key Characteristics
● Sampling from a population
● Collecting data through questionnaires or interviews
● Designing instruments for data collection
● Obtaining a high response rate
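The "high response rate" characteristic above is usually checked with a simple ratio: completed returns divided by the number of eligible sampled units. The figures below are hypothetical:

```python
# Response-rate sketch: completed returns over eligible sampled units.
# Ineligible units (e.g., undeliverable addresses) are excluded from
# the denominator before computing the rate.

def response_rate(completed, sampled, ineligible=0):
    """Completed returns as a share of eligible sampled units."""
    eligible = sampled - ineligible
    return completed / eligible

rate = response_rate(completed=312, sampled=450, ineligible=10)
# 312 / 440, i.e., roughly a 71% response rate
```

Low response rates matter because nonrespondents may differ systematically from respondents, biasing the survey's estimates.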
Survey Research Types
● Cross-Sectional Survey Designs
○ Attitudes and practices
○ Group comparisons
○ Community needs
○ National assessment
○ Program Evaluation
● Longitudinal Survey Designs
○ Trend Studies
○ Cohort Studies
○ Panel Studies
Survey Research Potential Ethical Issues
● Exemption from detailed review
● Use of incentives
● Ethical responsibility of interviewer
● Safety of respondents (confidentiality)
(Fowler, 2009)
Survey Research Steps
1. Decide if SR is the best design
2. Identify research questions or hypotheses
3. Identify the population, sampling frame, and the sample
4. Determine survey design and data collection procedures
5. Develop or locate an instrument
6. Administer the instrument
7. Analyze the data
8. Write the report
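Step 3 distinguishes the population, the sampling frame (the list you can actually draw from), and the sample itself. A minimal sketch of drawing a simple random sample, with hypothetical names and sizes:

```python
import random

# Hypothetical sampling frame: the accessible list of population members
frame = [f"student_{i:03d}" for i in range(1, 201)]  # 200 enrolled students

random.seed(42)  # fixed seed so the draw is reproducible
sample = random.sample(frame, k=30)  # 30 respondents, drawn without replacement

print(len(sample), len(set(sample)))
```

`random.sample` draws without replacement, so no member of the frame is selected twice; other designs (stratified, cluster) would partition the frame before drawing.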
Survey Research Quality Criteria (Creswell, 2014, p. 408)
1. The researcher describes the target population.
2. The researcher identifies and uses a systematic approach to identifying the sample.
3. The size of the sample and the means for identifying the size are identified.
4. The researcher identifies the type of survey used in the study.
5. The survey instrument for data collection is mentioned.
6. The survey researcher reports on validity and reliability of past scores on the instrument.
7. The researcher discusses the procedures for administering the instrument.
8. The survey administration procedures include a discussion of follow-up procedures
to ensure a high response rate.
9. The researcher provides a systematic procedure for analyzing the survey data.
Further Reading
● Crotty, M. (1998). The foundations of social research: Meaning and perspective in
the research process. Thousand Oaks, CA: Sage.
● Campbell, D. T., & Riecken, H. W. (1968). Quasi-experimental design. International
encyclopedia of the social sciences, 5(3), 259-263.
● Maxwell, J., Bickman, L., & Rog, D. J. (2009). The SAGE handbook of applied social
research methods. Thousand Oaks, CA: Sage.
● Keppel, G., & Wickens, T. D. (2004). Design and analysis: A researcher's handbook.
Upper Saddle River, NJ: Pearson Prentice-Hall.
QR Design (Narrative Research)
● “Narrative” might be the phenomenon being studied, such as a narrative of illness, or it
might be the method used in a study, such as the procedures of analyzing stories told
(Chase, 2005; Clandinin & Connelly, 2000; Pinnegar & Daynes, 2007).
● It begins with the experiences as expressed in the lived and told stories of individuals
● Clandinin, D. J. (2013). Engaging in narrative inquiry. Walnut Creek, CA: Left Coast Press.
● Riessman, C. K. (2008). Narrative methods for the human sciences. Thousand Oaks, CA:
Sage.
QR Design (Phenomenology)
● describes the common meaning for several individuals of their lived experiences of a
concept or a phenomenon
● The basic purpose of phenomenology is to reduce individual experiences with a
phenomenon to a description of the universal essence (a “grasp of the very nature of the
thing,” van Manen, 1990, p. 177)
● Moustakas, C. (1994). Phenomenological research methods. Thousand Oaks, CA: Sage.
● van Manen, M. (2014). Phenomenology of practice: Meaning-giving methods in
phenomenological research and writing. Walnut Creek, CA: Left Coast Press.
QR Design (Grounded Theory)
● generate or discover a theory, a “unified theoretical explanation” (Corbin & Strauss, 2007,
p. 107)
● theory development does not come “off the shelf” but rather is generated or “grounded”
in data from participants who have experienced the process (Strauss & Corbin, 1998)
● Charmaz, K. (2014). Constructing grounded theory (2nd ed.). Thousand Oaks, CA: Sage.
● Corbin, J., & Strauss, A. (2015). Basics of qualitative research: Techniques and procedures
for developing grounded theory (4th ed.). Thousand Oaks, CA: Sage.
QR Design (Ethnography)
● focuses on an entire culture-sharing group
● qualitative design in which the researcher describes and interprets the shared and learned
patterns of values, behaviors, beliefs, and language of a culture-sharing group (Harris,
1968)
● Fetterman, D. M. (2010). Ethnography: Step-by-step (3rd ed.). Thousand Oaks, CA: Sage.
● Wolcott, H. F. (2008). Ethnography: A way of seeing (2nd ed.). Lanham, MD: AltaMira.
QR Design (Case Study)
● the study of a case (or cases) within a real-life, contemporary context or setting (Yin, 2014)
● a qualitative approach in which the investigator explores a real-life, contemporary
bounded system (a case) or multiple bounded systems (cases) over time, through
detailed, in-depth data collection involving multiple sources of information (e.g.,
observations, interviews, audiovisual material, and documents and reports), and reports a
case description and case themes.
● Stake, R. (1995). The art of case study research. Thousand Oaks, CA: Sage.
● Yin, R. K. (2014). Case study research: Design and method (5th ed.). Thousand Oaks, CA:
Sage.
Mixed Research Design (Basic)
● The convergent design
● The explanatory sequential design
● The exploratory sequential design
Mixed Research Design (Advanced)
● The experimental design
● The social justice design
● The multistage evaluation design
Action Research Design
● Practical AR Design
● Participatory AR Design
Research Methods
1. Tests
2. Questionnaires
3. Interviews
4. Focus Groups
5. Observation
6. Secondary or existing data
Research Methods (Questionnaire)
● Social surveys are a questionnaire-based method of research that can produce both
qualitative and quantitative information depending on how they are structured and
analysed.
Research Methods (Interviews)
● a qualitative method of research often used to obtain the interviewees’ perceptions and
attitudes to the issues.
Research Methods (Discussion Groups)
● A discussion group consists of a number of individuals, typically between 6 and 12
people, whom you invite specifically to discuss their views on a particular topic
Research Methods (Workshops)
● Workshops are a group-based method of research in which there is an emphasis on
activity-based, interactive working
Research Methods (Observation)
A key method of anthropology that can itself consist of a mix of techniques: informal
interviews, direct observation, participation in the life of the group, collective
discussions, analyses of personal documents produced within the group, self-analysis,
and life histories. Notes, diaries, and transcripts are often kept, and the observation
method can generate a lot of written material which the investigator must synthesize.
Research Methods (Visual Techniques)
● Visual methods such as drawing, painting, video, photography and hypermedia offer
increasingly accessible and popular resources for research