Research Spotlight: Corinne Huggins-Manley

Q&A with Corinne Huggins-Manley, Ph.D., Associate Professor of Research and Evaluation Methodology, Associate Director of Human Development and Organizational Studies in Education, University of Florida Research Foundation Professor

What basic questions does your research seek to answer?

All of my methodological research seeks to overcome challenges to the practice and interpretation of quantitative measurements of latent constructs. Outside of academia, it is taken for granted that we can assign numbers to concepts such as “academic achievement” or “self-esteem,” and it is often assumed that those numbers are either accurate representations of such concepts or inaccurate only within some quantifiable margin of error.

There is little appreciation for the statistical challenges to measurement and the process of validating inferences made from such measurements. Inside the world of academia, there is more awareness about the challenges to quantifying latent constructs but still some difficulties in recognizing and meeting those challenges. Research in educational measurement is aimed at improving our abilities to overcome such challenges so that valid information can be gleaned from the measurement of education-related constructs. I aim to advance the field of educational measurement with research on topics such as item response theory, fairness in reporting subgroup test scores within and across schools and teachers, scale development and score use validity, and statistical model building that can help practitioners to overcome issues such as non-response bias.

What makes your work interesting?

My work is interesting because it directly tackles many of the problems that arise in measurement across educational research, policy, and practice. In educational research, the ability to use statistics to analyze data and answer research questions hinges in large part on the measurement quality of the variables being studied. However, measurement courses are often not required for doctoral students in the social sciences, and many researchers have noted that a lack of attention to measurement has become the Achilles’ heel of social science research. In educational policy and practice, many measurement demands have been imposed on educators over the past few decades, often by persons who are not trained in measurement or the validity of test score interpretations. The ramifications of such policies have been widely felt in the educational community, and I believe many of them stem from a lack of understanding about what measurement is and what it can (and cannot) tell us about students, teachers, and learning. I focus my research on topics that can improve these conditions in educational research, policy, and practice.

What are you currently working on?

I have four projects in progress that I am very excited about. One is related to assumptions of item response theory models and how we can best test for violations of them. Increasing the availability of accurate methods for testing measurement model assumptions is critical for ensuring the appropriate use and interpretations of parameter estimates produced from such models. A second is related to the development of two statistical models that allow for the incorporation of simultaneous nominal and ordinal within-item response data. The availability of models such as these would allow practitioners and researchers to more easily and appropriately model non-responses on tests and surveys such as “not applicable” responses on Likert scales. The third is an applied measurement project in which I am co-developing an adaptive, diagnostic assessment of reading skills for students in grades 3 to 5. The fourth is a continuation of my research on subpopulation item parameter drift and its relationship to differential item functioning and equating invariance. These three phenomena are statistical manifestations of measurement bias, which pose problems for achieving standards of fairness in large-scale educational testing.
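
For readers less familiar with the field, a standard example of the kind of model at issue here is the two-parameter logistic (2PL) item response theory model, shown below as general background (not as a description of Dr. Huggins-Manley's specific models):

```latex
% 2PL IRT model: probability that examinee j, with latent ability \theta_j,
% answers item i correctly, given item discrimination a_i and difficulty b_i.
P(X_{ij} = 1 \mid \theta_j)
  = \frac{\exp\!\left[a_i(\theta_j - b_i)\right]}
         {1 + \exp\!\left[a_i(\theta_j - b_i)\right]}
```

Valid interpretation of the estimated discrimination, difficulty, and ability parameters rests on assumptions such as unidimensionality of the latent trait and local independence of item responses, which is why accurate methods for detecting violations of those assumptions matter.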

OER Is Updating Proposal Status in UFIRST

The Office of Educational Research (OER) would like your help in notifying us about unfunded proposals so that we can update proposal status in UFIRST. Please email Brian Lane at blane@coe.ufl.edu with a list of your unfunded proposals from January 1, 2016, to the present, and continue to do so going forward. Please also provide supporting documentation with your message (i.e., the email and/or letter from the agency notifying you of non-funding).

Thank you in advance for your assistance.


UFIRST Awards System Is Live

The UFIRST awards system is now live. All active awards have been converted and are now available for viewing and management. All new awards, as well as requested modifications to awards (including no-cost extensions, supplements, PI changes, incremental funding, etc.), must now be processed in UFIRST.

What Does This Mean To You?

Faculty:

  • Principal Investigators must now complete the Award Compliance Form in UFIRST before any award is released. PIs may delegate this task but must perform the delegation in UFIRST. Every key person must also complete the Financial Conflict of Interest (FCOI) declaration in UFIRST before any award can be released. Please see the UFIRST instruction guides for more information.
  • As has been required on all awards since 2013, all key personnel must complete training courses RSH220 & RSH260 on Effort Fundamentals and Cost Principles. This was a one-time requirement, so if the courses have already been completed, you are not required to take them again.
  • Your award notification emails will now come from UFIRST rather than from eNOA email addresses. You will receive an email for any new award or any change to an award on which you are listed as the project manager (formerly myUFL PI) or the award PI.

Grants administrators:

  • Starting July 6, DSP will send you requests for budget breakouts, IRB approvals, and other award-related requirements through UFIRST.
  • Starting July 6, any request for a change to a 201, 209, or 214 must be submitted through UFIRST. This includes no-cost extensions, incremental funding, PI changes, moving budget between projects, clinical trial deposits, etc.

eNOA emails will now come from UFIRST
The email lists you have historically maintained for your department will NOT be used. Anyone listed as a Grants Administrator (along with the PI and project managers) will receive this email. If anyone in your unit previously received these emails and needs to continue receiving them, you will either need to set up local email forwarding rules or add them as administrators.

Converted awards
(Active prior to June 30 and converted into UFIRST using their existing Award ID, e.g. 00098765)

  • Commitment data will be collected via UFIRST; however, please note that NO commitment data was converted into the system for existing awards. If you need to change commitments on an award that was active as of June 30, 2016 and has no converted commitment data in UFIRST (whether as part of a modification or at any other point in the life of the award), you will need to enter commitments for the extension periods only. Review myInvestigator to determine whether commitments have already been entered for the period in question.
  • Review the data. Some fields will be blank, and some converted data (e.g., project types and project names) may not have been captured in a way that is useful to your business. Update this information in UFIRST.
  • Modifications to dates and financials require you to create a new allocation. If you are not sure how to do this, attend RSH282 or review the Financial and Date Modification – Converted Award Instruction Guide.

Non-federal clinical trial end dates will not automatically be extended as the end date approaches. If your trial is nearing its project end date but will continue, please request a no-cost extension via UFIRST. If you are not sure how to do this, review the No Cost Extension Instruction Guide.

Other departmental staff:
Department administrators who do not need to perform actions in UFIRST may historically have held the Grants Administrator role in order to view proposals. A new Viewer role is available so that staff can see UFIRST Awards without receiving email communications or being able to edit the unit’s Proposals or Awards. To change your previous Grants Administrator role to Viewer, or to receive the Viewer role, please contact your college Grants Workflow Administrator.

Report users/creators:

  • All awards data for FY16 and prior will continue to be available as it has been. In the coming months, this data will be moved to and available only in UF Enterprise Reporting.

All awards data for FY17 and forward will be available in UF Enterprise Reporting; however, it will not be available on July 6. We will communicate the release date through these same channels, aiming for August 1. The data will be structured differently, and far more information will be available than historically. We will hold training classes specifically devoted to the new UFIRST awards reporting tables.

Questions? Email ufirst@research.ufl.edu

Enhancements to UFIRST Proposal Reports Are Now Online

The Office of Research will be adding more reports and data in the months ahead for both Proposals and Awards. The next time you log in to view UFIRST Proposal Reports via Enterprise Reporting, please take note of some minor enhancements.

The navigation has been updated, and any bookmarks will need to be updated accordingly. The new navigation is as follows: Enterprise Reporting > Access Reporting > Sponsored Program Information > UFIRST Proposals.

For the canned reports available, the date selection options have changed:

  1. The Submitted Proposals Activity Report can now be run for the current month; the previous month; or the previous 3, 6, or 12 months, in addition to standard start/end dates. The date used to return records is the date the proposal was submitted.
  2. The All Activity by Personnel Report has the updated date options and can also be generated based on the Proposal Submitted Date, the date the Proposal was created, or the date the record was last modified. This allows you to capture activity on records that are not yet in a post-submission state.
  3. The All Proposals Role Activity Report has the updated date options and can also be generated based on the Proposal Submitted Date, the date the Proposal was created, or the date the record was last modified.

If you have any questions or concerns regarding these reports, or have any future UFIRST reporting suggestions, please contact Lisa Stroud in the Office of Research.

IES Grant Writing Webinar

The Institute of Education Sciences (IES) Grant Writing webinar held on June 17 provided instructions and recommendations specifically for the Education Research Grants Program (84.305A) and the Special Education Research Grants Program (84.324A).

The webinar emphasized that applicants should email IES Program Officers early in the process to discuss the proposed framework and statement of purpose. IES program staff can review draft proposals and provide feedback as time permits. Additionally, the webinar discussed parts of the proposal, key problems to avoid, and writing tips among other topics.

Reminder: The final webinar in this series will be held Thursday, July 14 as follows:

IES Application Process
Thursday, July 14, 11:00 a.m. – 12:30 p.m. EDT
IES staff will provide information regarding the grant submission process. Topics focus on the application instructions, including content and formatting requirements, registration and submission through Grants.gov, and application forms.
To register for the webinar, visit the IES website.

Agenda
Grant Writing Tips
General Requirements
Grant Research Topics
Grant Research Goals
Four Sections of the Project Narrative
Peer Review Process

Visit the Previous IES Research Funding Opportunities Webinars page on the IES website to download the complete PowerPoint presentations and transcripts for FY 2017 webinars.

For more information, visit the IES Resources for Researchers.

RFA Changes of Note

For 84.305A:

  • The Development and Innovation goal is back.
  • The Improving Education Systems: Policy, Organization, Management and Leadership topic has been split into two separate topics: (1) Improving Education Systems and (2) Education Leadership.
  • Three Special Topics (Arts in Education, Career and Technical Education, and Systemic Approaches to Educating Highly Mobile Students) are being competed in FY 2017.
  • You have the option of using SciENcv to create an IES Biosketch for each key person and significant contributor on the project.

For 84.324A:

  • For FY 2017, there is a focus on teachers and other instructional personnel across all topics and goals.
    • No related services personnel, except in Early Intervention
    • For Exploration projects, research can involve the study of pre-service teachers across all topics
  • Maximum amount of funding that can be requested under each research goal is slightly reduced.
  • Language has been added for applicants proposing SMART designs under Efficacy & Replication and Effectiveness goals.
  • You have the option of using SciENcv to create an IES Biosketch for each key person and significant contributor on the project.

Grant Writing Tips

Think of your proposal as a persuasive essay. You know the problem and have the best way to address it. The opening paragraph sets the scene for the readers:

  • It identifies the significance of the work to be done and what actually will be done.
  • Readers use it to organize information in the rest of the application.
  • You can lose your readers right off with an unclear opening.

The statement of purpose should

  • Be short and attention-getting
  • Contain the problem statement and your contribution to solving it
  • Be understandable to your fellow researchers, friends, and family members, who should be able to see its relevance

Theory of Change (i.e., Logic Model or Logical Framework) should be

  • The model underlying your research
  • A roadmap to your project narrative
  • A source for generating research questions
  • Constantly evolving

Theory of Change and the Research Plan
In your research plan, you need to specify exactly what it is you’re exploring, creating, validating, or testing. You also need to specify how you will do these things.

  • Strategies/Activities: What are the pieces that you’ll be exploring, creating, testing, etc.?
  • Outcomes
    • Indicators: What will you measure, and how will you measure it?
    • Populations: Who and where (both in treatment and control/comparison)?
    • Thresholds: What effect (size) should you expect?
    • Timeline: When should you be collecting what data?

Clarity of Writing
Don’t assume reviewers know what you are talking about. Reviewers get frustrated when they can’t understand what you are saying.

  • Readers (e.g., application reviewers) often complain about the following clarity problems:
    • Significance too general
    • Lack of detail regarding intervention, development cycle, or data analysis
    • Use of jargon and assumptions of knowledge
    • Poor writing (e.g., grammar), awkward constructions, etc.

General Requirements

All proposed studies must

  • Measure student education outcomes
  • Be relevant to education in the U.S.
  • Address authentic education settings
  • Specify one research topic
  • Specify one research goal

NCER Ultimate Outcomes of Interest: Student Outcomes

Prekindergarten: School readiness (e.g., pre-reading, language, vocabulary, early math and science knowledge, social and behavioral competencies)

Kindergarten – Grade 12: Learning, achievement, and higher-order thinking in reading, writing, mathematics, and science; progress through the education system (e.g., course and grade completion or retention, high school graduation, and dropout); social skills, attitudes, and behaviors that support learning in school

Postsecondary (Grades 13 – 16): Access to, persistence in, progress through, and completion of postsecondary education; for students in developmental programs, additional outcomes include achievement in reading, writing, English language proficiency, and mathematics

Adult Education (Adult Basic Education, Adult Secondary Education, Adult ESL, and GED preparation): Student achievement in reading, writing, and mathematics; access to, persistence in, progress through, and completion of adult education programs

NCSER Student Outcomes of Interest
(For FY 2017, primary outcomes of teachers/instructional personnel)

Birth – 5: Developmental outcomes and school readiness

Kindergarten – High School: Achievement in core academic content (reading, writing, mathematics, science); behaviors that support learning in academic contexts; functional outcomes that improve educational results; and transitions to employment, independent living, and postsecondary education

Grant Research Topics

Grant Topics

  • All applications to the primary research grant programs must be directed to a specific topic:
    • Note on SF 424 Form, Item 4b (Agency Identifier Number)
    • Note at top of Abstract and Project Narrative
  • Must address student education outcomes
  • Grade range varies by topic
  • Your project might fit in more than one topic

305A: Topics and Their Grade Range

Topic Pre-kindergarten K-12 Sub-Baccalaureate & Baccalaureate
Early Learning Programs and Policies X
Cognition & Student Learning X X
Education Technology X X
Education Leadership X
Effective Teachers & Effective Teaching X
English Learners X
Improving Education Systems X
Mathematics & Science Education X
Reading & Writing X
Social & Behavioral Context for Academic Learning X
Special Topics X
Postsecondary & Adult Education X

324A: Topics and Their Grade Range

Topic Infants, Toddlers, & Preschoolers K-12
Early Intervention & Early Learning in SpEd X
Autism Spectrum Disorders X
Cognition & Student Learning in SpEd X
Families of Children with Disabilities X
Mathematics & Science Education X
PD for Teachers & Related Services Providers X
Reading, Writing & Language Development X
Social & Behavioral Outcomes to Support Learning X
SpEd Policy, Finance, and Systems X
Technology for Special Education X
Transition Outcomes for Secondary Students with Disabilities X

Choosing among Overlapping Topics

  • What literature are you citing?
  • To which topic is your area of expertise best aligned?
  • If your focus is on a specific population of students/teachers, go to that program/topic:
    • Is your focus on a specific type of student/teacher (e.g., English Learners), or are you studying them as a subgroup of your sample?

Grant Research Goals

Research Goals

  • All applications to 84.305A/84.324A must be directed to a specific goal:
    • Note on SF 424 Form, Item 4b
    • Note at top of Abstract and Research Narrative
  • The goal describes the type of research to be done
  • Every application is directed to a specific topic/goal combination

FY 2017 Research Goals

  • Exploration
  • Development & Innovation
  • Efficacy & Replication
  • Effectiveness
  • Measurement

Maximum Award Amounts

Goal (maximum grant duration / maximum grant award):

Exploration – with secondary data: 2 years / $600,000
Exploration – with primary data: 4 years / $1,400,000
Development & Innovation: 4 years / $1,400,000
Efficacy & Replication – efficacy or replication study: 5 years (4 years NCSER) / $3,300,000
Efficacy & Replication – follow-up study: 3 years / $1,100,000
Efficacy & Replication – retrospective study: 3 years / $700,000
Effectiveness – effectiveness study: 5 years / $3,800,000
Effectiveness – follow-up study: 3 years / $1,400,000
Measurement: 4 years / $1,400,000

Goal Requirements

  • Your application must meet all Project Narrative and Award requirements listed for the goal you select in order for your application to be considered responsive and sent forward to review.
  • We strongly encourage you to incorporate the recommendations into your Project Narrative.
  • All applications must include a Dissemination Plan.

Exploration Projects: Key Features

  • Malleable factors must be under the control of the education system
  • Something that can be changed by the system
  • Examples
    • Student characteristics: behavior, skills
    • Teacher characteristics: practices, credentials
    • School characteristics: size, climate, organization
    • Education interventions: practices, curricula, instructional approaches, programs, and policies

Development & Innovation Projects: Key Features

  • Iterative development process
  • Well specified theory of change
  • Data collected on feasibility and usability in authentic education settings
  • Fidelity must be measured
  • Pilot data on student outcomes

Efficacy & Replication Projects: Key Features

  • Testing a causal question
  • Ask what might be needed to implement intervention under routine practice, even if you intend to test under ideal conditions
  • Consider role of developer to avoid conflict of interest for developer-evaluators
  • IES does not require confirmatory mediator analyses (as primary research questions) but recommends exploratory ones

Effectiveness Projects: Key Features

  • IES expects researchers to
    • Implement intervention under routine practice
    • Include evaluators independent of development/distribution
    • Describe strong efficacy evidence for intervention (from at least one previous efficacy study)
  • Does not expect wide generalizability from a single study
    • Expects multiple Effectiveness projects to this end
    • Sample size is not a key distinction from Efficacy
  • Does not require confirmatory mediator analyses but encourages exploratory ones
  • Cost of implementation is limited to 25% of budget

Measurement Projects: Key Features

  • Assessments may also be developed in other goals, but not as the primary focus
  • Primary product of measurement grant is the design, refinement, and/or validation of an assessment
  • Include an assessment framework
  • Must link the assessment to student education outcomes

Expected Products

  • Expected Products for each goal can help you identify the right goal for your project
  • At the end of a funded project, IES expects you to provide… (see RFA)

Four Sections of the Project Narrative

Project Narrative

  • Four Required Sections
    • Significance
    • Research Plan
    • Personnel
    • Resources
  • Each of these sections will be scored individually by the peer reviewers
  • In addition, reviewers provide an overall score of Scientific Merit
  • Requirements vary by Topic & Goal
  • READ THE REQUIREMENTS CAREFULLY
  • 25 pages, single-spaced
  • Project Narrative is supported by Appendices, but all critical content for reviewers should be included within the 25 pages of the Project Narrative

Significance Section

  • Describes the overall project
    • Your research question to be answered, intervention to be developed or evaluated, or measure to be developed and/or validated
  • Provides a compelling rationale for the project
    • Theoretical justification
      • Theory of Change
    • Empirical justification
    • Practical justification
  • Do not assume reviewers know significance of your work
  • Do not quote back RFA on general importance of a topic
    • e.g., RFA paragraph on lack of reading proficiency of 8th and 12th graders based on NAEP data
  • Do quote back RFA if your project is addressing a research gap or consideration identified in the RFA
    • e.g., disproportionality in discipline (Social/Behavioral); impact of early childhood policy initiatives (Early Learning)

Significance: 2 Key Problem Areas

  1. Description of the Malleable Factor/Intervention
    • Unclear what the intervention is, which confuses reviewers
      • When an intervention has many components applied at different times, show how they fit together (a graphic may help)
    • Unclear how the intervention is to be implemented to ensure fidelity
    • Intervention not shown to be strong enough to expect an impact
      • Especially true for information interventions (e.g., providing data on students, short teacher workshops)
    • Overly focused on actions, not content
      • e.g., 20 hours of PD held over 10 weeks, but no detail on what is to be covered in the sessions
  2. Theory of Change
    • Why a malleable factor is expected to be related to a student outcome
    • Why the proposed intervention should improve outcomes versus current practice
    • How an assessment/instrument will measure a specific construct
    • When well laid out, a theory of change makes clear what is expected to happen and in what order
    • Makes it easy for reviewers to understand the research plan (e.g., why certain outcomes are measured)
    • A graphic can be helpful

Theory of Change Should Describe

  • How the intervention addresses the need and why it should work
    • Content: what the student should know or be able to do; why this meets the need
    • Pedagogy: instructional techniques and methods to be used; why appropriate
    • Delivery System: how the intervention will arrange to deliver the instruction
    • Which aspects of the intervention are different from the counterfactual condition
  • Key factors or core ingredients most essential and distinctive to the intervention
  • Do not overwhelm the reader
  • Do not use color as a key because applications are often reviewed in black and white

Research Plan Section

  • Describe the work you intend to do
    • How you will answer your research question, develop your intervention, evaluate the intervention, or develop and/or validate your assessment
  • Make certain Research Plan is aligned to Significance section
    • All research questions should have justification in Significance
  • Step-by-step process
    • A timeline is strongly recommended!

Identify Setting, Population, & Sample

  • Identify the places you will be doing research
  • Identify the population you are addressing
  • Identify the sample
    • Inclusion and exclusion criteria
    • Sample size (issues of power for analysis; see the power sketch after this list)
    • The importance of attrition and how to address it
    • External validity: can you generalize to your population, or only to a subset of it?
  • If using secondary data, discuss these for the datasets you will be using
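
As an illustration of the power considerations above, here is a minimal sketch of a power analysis using the open-source statsmodels package; the effect size, alpha, power, cluster size, and ICC values are hypothetical placeholders, not IES recommendations:

```python
# Minimal power-analysis sketch (hypothetical values throughout).
from statsmodels.stats.power import TTestIndPower

power_analysis = TTestIndPower()
n_per_group = power_analysis.solve_power(
    effect_size=0.25,  # assumed standardized effect (Cohen's d)
    alpha=0.05,        # two-sided significance level
    power=0.80,        # target power
)
print(f"Students needed per group (simple random assignment): {n_per_group:.0f}")

# If students are clustered in classrooms, inflate n by the design effect:
# DEFF = 1 + (m - 1) * ICC, where m is cluster size and ICC is the
# intraclass correlation.
m, icc = 20, 0.15
deff = 1 + (m - 1) * icc
print(f"Approximate n per group after clustering adjustment: {n_per_group * deff:.0f}")
```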

Specify Your Outcome Measures

  • For both proximal and distal outcomes
  • Sensitive (often narrow, aligned with intervention) measures
  • Measures of broad interest to educators
  • Describe reliability, validity, and relevance (a reliability sketch follows this list)
  • Do not include measures not linked to research questions
  • Consider issue of multiple comparisons
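
For the reliability bullet above, internal consistency of a multi-item measure is often summarized with Cronbach's alpha; below is a minimal sketch with simulated data (the item counts and data are made up for illustration):

```python
# Cronbach's alpha from an items-by-respondents matrix (toy data).
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: 2-D array, rows = respondents, columns = items."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return (k / (k - 1)) * (1 - item_vars / total_var)

rng = np.random.default_rng(0)
true_score = rng.normal(size=(200, 1))                      # latent trait, 200 respondents
items = true_score + rng.normal(scale=1.0, size=(200, 6))   # 6 noisy items
print(f"alpha = {cronbach_alpha(items):.2f}")
```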

Specify Features of All Other Measures

  • Measures that feed back into iterative development process
  • Fidelity of Implementation
    • Operating as intended
    • Able to address comparison groups
  • Feasibility

Qualitative Measures

  • Describe
    • Actual items to be used
    • How items link to constructs – the validity of these measures
    • Procedures for collection and coding (address inter-rater reliability; see the sketch after this list)
    • How qualitatively collected measures are used in analysis of quantitative outcomes (e.g., test scores)
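
For the inter-rater reliability point above, chance-corrected agreement between two coders is commonly reported as Cohen's kappa; a minimal sketch using scikit-learn, with entirely hypothetical codes, follows:

```python
# Inter-rater agreement on qualitative codes, summarized with Cohen's kappa.
# Toy example: two raters assign one of three codes to 10 classroom observations.
from sklearn.metrics import cohen_kappa_score

rater_a = ["on-task", "off-task", "on-task", "disruptive", "on-task",
           "off-task", "on-task", "on-task", "disruptive", "off-task"]
rater_b = ["on-task", "off-task", "on-task", "on-task", "on-task",
           "off-task", "disruptive", "on-task", "disruptive", "off-task"]

kappa = cohen_kappa_score(rater_a, rater_b)
print(f"Cohen's kappa = {kappa:.2f}")  # chance-corrected agreement
```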

Measurement Projects

  • Alternate forms – horizontal equating (see the formula after this list)
  • Vertical equating, if measuring growth
  • Test fairness
  • Non-student instruments must be validated against student outcomes
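
As background for the equating bullets, one common way to link alternate forms is linear (mean-sigma) equating, which maps a score x on form X onto the scale of form Y using the means and standard deviations of the two forms:

```latex
% Linear (mean-sigma) equating: place score x from form X on the form Y scale.
y(x) = \frac{\sigma_Y}{\sigma_X}\,(x - \mu_X) + \mu_Y
```

This transformation matches standardized scores across the two forms; vertical equating for measuring growth additionally requires linking forms that differ in difficulty across grade levels.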

Analysis Depends on Design

  • Describe how your analysis answers your research questions
  • Describe analyses of qualitative data
  • Show your model
  • Identify coefficients of interest and their meaning
  • Show different models for different analyses
  • Include equations (an example model follows this list)
  • Address clustering
  • Describe plan for missing data
  • Check for equivalency at start and attrition bias
  • Use sensitivity tests of assumptions
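
As a sketch of what "show your model" and "address clustering" can look like, a two-level model for students nested in schools, with treatment assigned at the school level, might be written as follows (the notation is illustrative, not an IES template):

```latex
% Level 1 (student i in school j), with student covariate X_{ij}:
Y_{ij} = \beta_{0j} + \beta_{1} X_{ij} + e_{ij}, \qquad e_{ij} \sim N(0,\sigma^2)
% Level 2 (school j), with treatment indicator T_j:
\beta_{0j} = \gamma_{00} + \gamma_{01} T_j + u_j, \qquad u_j \sim N(0,\tau^2)
```

Here the coefficient of interest is the treatment effect \(\gamma_{01}\), and the school-level random effect \(u_j\) accounts for the clustering of students within schools.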

Personnel Section

Describe key personnel:

  • Show that every aspect of project has an individual with expertise to do it
    • Appropriate methodological expertise
    • Substantive person for all issues addressed
    • Do not propose to hire an unnamed key person who will supply needed expertise
    • Project management skills
  • Show that every aspect of project has enough time from an expert
  • Orient each CV so it is specific to the project
    • Can use SciENcv to create an IES Biosketch
    • 4 pages plus 1 page for other sources of support

Personnel Strategies for PI

  • Senior Researcher as PI
    • Show adequate time to be PI
    • Make credentials clear (not all reviewers may know)
  • Junior Researcher as PI
    • Show you have adequate expertise not only to do work but to manage project
      • Continuation of graduate/postdoctoral research
      • Management skills as graduate student/postdoc
    • Reviewers more comfortable if you have senior person(s) on project to turn to for advice
      • Co-PI, Co-I, contractors, advisory board
      • Have them on for enough time to be taken seriously

Resources

  • Show institutions involved have capacity to support work
    • Do not use university boilerplate
  • Show that all organizations involved understand and agree to their roles
    • What will each institution, including schools, contribute to the project
    • Show strong commitment of schools and districts and alternatives in case of attrition
  • If you have received a prior grant award for similar work, describe the success of that work

Dissemination Plan

  • Describe your capacity to disseminate information about the findings from your research
  • Identify the audiences that you expect will benefit from your research
  • Discuss the ways in which you intend to reach these audiences
  • Appendix D should back up the Resources section
  • Detailed Letters of Agreement from research institutions, States, districts, schools
  • Do letters show that partners understand their role in the project (e.g., random assignment to condition, time commitments)?
  • Do letters show that you have access to all necessary data to do the proposed work?

Budget and Budget Narrative

  • Provide a clear budget and budget narrative for overall project and each sub-award
  • Provide detail on the assumptions used in the budget (e.g., assumptions for travel)
  • Budget categories are described beginning on p. 110 of the 84.305A RFA and p. 116 of the 84.324A RFA
  • Check RFA for specific budget requirements for Research Goals
  • Ensure alignment among Project Narrative, Budget, and Budget Narrative
  • Level of effort should be expressed in terms of a 12-month calendar year

Appendices

Appendix A: Response to reviewers for resubmitted applications – limit 3 pages
Appendix B: Figures, charts, and tables that supplement the project narrative; examples of measures to be used in the project (optional) – limit 15 pages
Appendix C: Examples of materials to be used in the intervention or assessment that is the focus of your project (optional) – limit 10 pages
Appendix D: Letters of agreement from partners, data sources, and consultants (optional) – no page limit
Appendix E: Data Management Plan (required for Efficacy/Replication and Effectiveness applications) – limit 5 pages

Peer Review Process

Details of the peer review process are covered in the complete webinar slides, which can be downloaded from the IES website.

IES Launches New Software to Support Program Evaluation in Education

The Institute of Education Sciences (IES) has launched a new tool that can make it easier and more cost-effective for states and school districts to evaluate the impact of their programs. RCT-YES™ is free, user-friendly software that assists those with a basic understanding of statistics and research design in analyzing data and reporting results from randomized controlled trials (RCTs) and other types of evaluation designs.

RCT-YES™ was developed by Mathematica Policy Research, Inc. under a contract from the IES National Center for Education Evaluation and Regional Assistance. While the software has a simple interface and requires no knowledge of programming, it does not sacrifice rigor: RCT-YES™ uses state-of-the-art statistical methods to analyze data.
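
RCT-YES itself requires no programming, but for readers curious about the kind of estimate such software produces, here is a minimal, generic sketch of an RCT impact analysis in Python. This is NOT RCT-YES code; the file and column names are hypothetical:

```python
# Generic difference-in-means impact estimate for a student-level RCT,
# fit as a regression with heteroskedasticity-robust standard errors.
# Illustrates the kind of analysis evaluation software automates;
# it is not how RCT-YES is implemented or invoked.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("trial_data.csv")  # hypothetical file: columns 'score', 'treated'
model = smf.ols("score ~ treated", data=df).fit(cov_type="HC2")
print(model.params["treated"])          # estimated impact (treatment - control)
print(model.conf_int().loc["treated"])  # 95% confidence interval
```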

The software can be downloaded from the RCT-YES™ website, along with a quick start guide, interactive how-to videos, and a detailed user’s manual. A report properly generated using RCT-YES™ will include all the statistical information necessary for a review by the What Works Clearinghouse™ (WWC). The WWC, another IES investment, reviews high-quality research on interventions and programs to determine what works in education.

The new tool is part of IES’ ongoing efforts to build the capacity of states and school districts to conduct and use high-quality research. Other IES-led efforts include grants for researcher-practitioner partnerships, technical assistance for conducting and using research in each area of the country, and grants to state education agencies to build and use their longitudinal data systems.

For more information on RCT-YES™, visit www.rct-yes.com.

Reprinted from the IES June 14, 2016 press release.

For further information, read the blog post, RCT-YES: Supporting a Culture of Research Use in Education, by Ruth Curran Neild, delegated director.

See How ERIC Selects New Sources

A new video from the Institute of Education Sciences (IES) explains how ERIC selects new sources, including education-focused journals, grey literature reports, and conference papers.

The video summarizes the types of resources ERIC will and will not index, the source selection process, and how to recommend a new resource.

To watch the video, visit the National Center for Education Evaluation and Regional Assistance blog by Erin Pollard, ERIC project officer.

Words That Should Be Avoided in Grant Proposals: Part 2

One of the most commonly used words in grant proposals is the verb “(to) understand.” Applicants frequently design grant proposals to “understand” something. Thus, it is common to read, “The objective of this proposal is to understand the underlying reasons for…”

Alternatively, some applicants feel it is important to be more conservative. Under such circumstances, they might write something like, “The objective of this proposal is to better understand the underlying reasons for…” While these are certainly laudable goals, their achievement in reality should prompt serious questions among reviewers and even the applicants themselves.

In point of fact, understanding is a goal that is rarely, if ever, fully achieved in practice; what might be called “full understanding” can only be approached asymptotically.

The extent to which an applicant’s promise to “better understand” or “increase understanding” is meaningful depends rather critically on exactly where one stands in the relationship between understanding and knowledge, and this is unlikely to be accurately known by the applicant, the reviewers, or the funding agency.

An additional problem is that to “understand” or “better understand” has a relatively qualitative and indeterminate endpoint. In contrast, every applicant should have an objective whose endpoint can be precisely delineated.

As a consequence, applicants should think seriously before presenting arguments for the need to “understand” or for proposing deliverables in which a “better understanding” would be anticipated. There is, in fact, alternative terminology to “understanding” that would be equally acceptable, for example “define the mechanism,” “elucidate the primary contributing factors,” “explicate the most likely external factors.” In this way, it will be possible to avoid the “nebulous trap” that is usually impossible to escape with the promise of “understanding.”

Excerpted from Grant Writers’ Seminars & Workshops (GWSW) Blog.

To view specific examples of tips and strategies from various versions of the GWSW The Grant Application Writer’s Workbook, visit the Workbooks page and click the link for the specific workbook of interest to you.

Awarded Projects for June 2016

Principal Investigator: Carole Beal (STL)
Co-PI: Walter Leite (SHDOSE), Donald Pemberton (Lastinger Center for Learning), George Michailidis (Informatics Institute)
Funding Agency: US Department of Education/IES
Project Title: Precision Education: The Virtual Learning Lab
Project Period: 7/1/2016 – 6/30/2021
Award Amount: $8,908,288

Principal Investigator: Carole Beal (STL)
Co-PI: Nicholas Gage (SSESPECS)
Funding Agency: US Department of Education/IES
Project Title: An Intervention to Provide Youth with Visual Impairments with Strategies to Access Graphical Information in Math Word Problems
Project Period: 7/1/2016 – 6/30/2019
Award Amount: $1,397,638

Principal Investigator: Ann Daunic (SSESPECS)
Co-PI: Nancy Corbett (SSESPECS), Stephen Smith (SSESPECS), James Algina (SSESPECS)
Funding Agency: US Department of Education/IES
Project Title: Evaluating a Social-Emotional Learning Curriculum for Children at Risk for Emotional or Behavioral Disorders
Project Period: 7/1/2016 – 6/30/2020
Award Amount: $3,499,958

Principal Investigator: Holly Lane (SSESPECS)
Co-PI: Nicholas Gage (SSESPECS)
Funding Agency: US Department of Education/OSEP
Project Title: Project TIER: Teaching, Intervention, and Efficacy Research
Project Period: 1/1/2017 – 12/31/2021
Award Amount: $1,250,000

Principal Investigator: Aki Murata (STL)
Co-PI: N/A
Funding Agency: Florida State University (Subcontract – NSF Flow Through)
Project Title: Identifying an Effective and Scalable Model of Lesson Study
Project Period: 1/1/2016 – 7/31/2017
Award Amount: $15,696

Principal Investigator: Donald Pemberton (Lastinger Center for Learning)
Co-PI: N/A
Funding Agency: Stranahan Foundation
Project Title: Stranahan Early Learning
Project Period: 7/1/2016 – 6/30/2018
Award Amount: $599,987

Principal Investigator: Philip Poekert (Lastinger Center for Learning)
Co-PI: N/A
Funding Agency: Orange County Public Schools
Project Title: Orange County Certified Coaching
Project Period: 7/1/2016 – 6/30/2017
Award Amount: $138,138

Principal Investigator: Philip Poekert (Lastinger Center for Learning)
Co-PI: N/A
Funding Agency: United Way of Miami-Dade, Florida
Project Title: Early Childhood Technical Assistance Certification: Coaching Program
Project Period: 5/1/2016 – 6/30/2017
Award Amount: $44,000

Principal Investigator: Philip Poekert (Lastinger Center for Learning)
Co-PI: N/A
Funding Agency: United Way of Miami-Dade, Florida
Project Title: Supporting Family Engagement in the Early Head Start Child Care Partnership
Project Period: 5/1/2016 – 6/30/2018
Award Amount: $39,600

 

Submitted Projects for June 2016

Principal Investigator: Alice Kaye Emery (SSESPECS)
Co-PI: N/A
Funding Agency: Florida Department of Education
Proposal Title: Working with the Experts 2016 – 2017
Requested Amount: $240,000

Principal Investigator: Christy Gabbard (P.K. Yonge)
Co-PI: N/A
Funding Agency: Florida Department of Education
Proposal Title: Title II, Part A, Teacher & Principal Training & Recruiting Fund 2016 – 2017
Requested Amount: $25,735

Principal Investigator: Jeffrey Pufahl (Center for Arts in Medicine)
Co-PI: Dennis Kramer (SHDOSE), Christopher Loschiavo (Student Affairs)
Funding Agency: Centers for Disease Control and Prevention
Proposal Title: Evaluating the Efficacy of Sexual Assault Education through Applied Theatre
Requested Amount: $192,726

Principal Investigator: Isaac McFarlin (SHDOSE)
Co-PI: N/A
Funding Agency: University of Michigan (Subcontract – IES Flow Through)
Proposal Title: On the Importance of School Facilities Spending to Student Outcomes
Requested Amount: $213,793

Principal Investigator: Jack Judy (Electrical and Computer Engineering)
Co-PI: M. David Miller (SHDOSE), Kevin Otto (Biomedical Engineering)
Funding Agency: National Science Foundation
Proposal Title: NSF Engineering Research Center for Autonomic Neural Engineering
Requested Amount: $199,646

Principal Investigator: Ashley Pennypacker Hill (P.K. Yonge)
Co-PI: N/A
Funding Agency: Florida Department of Education
Proposal Title: IDEA Part B, Entitlement 2016 – 2017
Requested Amount: $207,547

Principal Investigator: Ashley Pennypacker Hill (P.K. Yonge)
Co-PI: N/A
Funding Agency: Florida Department of Education
Proposal Title: Title I, Part A: Improving the Academic Achievement of the Disadvantaged 2016 – 2017
Requested Amount: $123,295

Principal Investigator: Donald Pemberton (Lastinger Center for Learning)
Co-PI: N/A
Funding Agency: Florida’s Office of Early Learning
Proposal Title: Office of Early Learning VPK Instructor Support
Requested Amount: $4,180

Principal Investigator: Philip Poekert (Lastinger Center for Learning)
Co-PI: N/A
Funding Agency: Seminole County Public Schools
Proposal Title: Seminole County Instructional Coaching
Requested Amount: $16,500