Enhancements to UFIRST Proposal Reports Are Now Online

The Office of Research will be adding more reports and data in the months ahead for both Proposals and Awards. The next time you log in to view UFIRST Proposal Reports via Enterprise Reporting, please take note of some minor enhancements.

The navigation has been updated, and any bookmarks will need to be updated accordingly. The new navigation is as follows: Enterprise Reporting > Access Reporting > Sponsored Program Information > UFIRST Proposals.

The date selection options for the canned reports have changed:

  1. Submitted Proposals Activity Report can now be run for the current month, previous month, or previous 3, 6, or 12 months, in addition to standard start/end dates. The date used to return records is the date the proposal was submitted.
  2. All Activity by Personnel Report has the updated date options and can also be generated based on Proposal Submitted Date, the date the proposal was created, or the date the record was last modified. This lets you capture activity on records that are not yet in a Post-Submission state.
  3. All Proposals Role Activity Report has the updated date options, and can also be generated based on Proposal Submitted Date, the date the Proposal was created, or the date the record was last modified.

If you have any questions or concerns regarding these reports, or have any future UFIRST reporting suggestions, please contact Lisa Stroud in the Office of Research.

IES Grant Writing Webinar

The Institute of Education Sciences (IES) Grant Writing webinar held on June 17 provided instructions and recommendations specifically for Education Research Grants Program (84.305A) and Special Education Research Grants Program (84.324A).

The webinar emphasized that applicants should email IES Program Officers early in the process to discuss the proposed framework and statement of purpose. IES program staff can review draft proposals and provide feedback as time permits. Additionally, the webinar discussed parts of the proposal, key problems to avoid, and writing tips among other topics.

Reminder: The final webinar in this series will be held Thursday, July 14 as follows:

IES Application Process
Thursday, July 14, 11:00 a.m. – 12:30 p.m. EDT
IES staff will provide information regarding the grant submission process. Topics will focus on the application instructions, including content and formatting requirements; registration and submission through Grants.gov; and application forms.
Registration for the webinar is available on the IES website.

Agenda
Grant Writing Tips
General Requirements
Grant Research Topics
Grant Research Goals
Four Sections of the Project Narrative
Peer Review Process

Visit the Previous IES Research Funding Opportunities Webinars page on the IES website to download the complete PowerPoint presentations and transcripts for FY 2017 webinars.

For more information, visit the IES Resources for Researchers.

RFA Changes of Note

For 84.305A:

  • The Development and Innovation goal is back.
  • The Improving Education Systems: Policy, Organization, Management and Leadership topic has been split into two separate topics: (1) Improving Education Systems and (2) Education Leadership.
  • Three Special Topics (Arts in Education, Career and Technical Education, and Systemic Approaches to Educating Highly Mobile Students) are being competed in FY 2017.
  • You have the option of using SciENcv to create an IES Biosketch for each key person and significant contributor on the project.

For 84.324A:

  • For FY 2017, there is a focus on teachers and other instructional personnel across all topics and goals.
    • No related services personnel, except in Early Intervention
    • For Exploration projects, research can involve the study of pre-service teachers across all topics
  • Maximum amount of funding that can be requested under each research goal is slightly reduced.
  • Language has been added for applicants proposing SMART designs under Efficacy & Replication and Effectiveness goals.
  • You have the option of using SciENcv to create an IES Biosketch for each key person and significant contributor on the project.

Grant Writing Tips

Think of your proposal as a persuasive essay. You know the problem and have the best way to address it. The opening paragraph sets the scene for the readers:

  • It identifies the significance of the work to be done and what actually will be done.
  • Readers use it to organize information in the rest of the application.
  • You can lose your readers right off with an unclear opening.

The statement of purpose should

  • Be short and attention-getting
  • Contain the problem statement and your contribution to solving it
  • Be clear enough that your fellow researchers, friends, and family members can understand it and see its relevance

Theory of Change (i.e., Logic Model or Logical Framework) should be

  • The model underlying your research
  • A roadmap to your project narrative
  • A source for generating research questions
  • Constantly evolving

Theory of Change and the Research Plan
In your research plan, you need to specify exactly what it is you’re exploring, creating, validating, or testing. You also need to specify how you will do these things.

  • Strategies/Activities: What are the pieces that you’ll be exploring, creating, testing, etc.?
  • Outcomes
    • Indicators: What will you measure, and how will you measure it?
    • Populations: Who and where (both in treatment and control/comparison)?
    • Thresholds: What effect (size) should you expect?
    • Timeline: When should you be collecting what data?

Clarity of Writing
Don’t assume reviewers know what you are talking about. Reviewers get frustrated when they can’t understand what you are saying.

  • Readers (e.g., application reviewers) often complain about lack of clarity as follows:
    • Significance too general
    • Lack of detail regarding intervention, development cycle, or data analysis
    • Use of jargon and assumptions of knowledge
    • Poor writing (e.g., grammar), awkward constructions, etc.

General Requirements

All proposed studies must

  • Measure student education outcomes
  • Be relevant to education in the U.S.
  • Address authentic education settings
  • Specify one research topic
  • Specify one research goal

NCER Ultimate Outcomes of Interest: Student Outcomes

Prekindergarten: School readiness (e.g., pre-reading, language, vocabulary, early math and science knowledge, social and behavioral competencies)
Kindergarten – Grade 12: Learning, achievement, and higher-order thinking in reading, writing, mathematics, and science; progress through the education system (e.g., course and grade completion or retention, high school graduation, and dropout); social skills, attitudes, and behaviors that support learning in school
Postsecondary (Grades 13 – 16): Access to, persistence in, progress through, and completion of postsecondary education; for students in developmental programs, additional outcomes include achievement in reading, writing, English language proficiency, and mathematics
Adult Education (Adult Basic Education, Adult Secondary Education, Adult ESL, and GED preparation): Student achievement in reading, writing, and mathematics; access to, persistence in, progress through, and completion of adult education programs

NCSER Student Outcomes of Interest
(For FY 2017, primary outcomes of teachers/instructional personnel)

Birth – 5: Developmental outcomes and school readiness
Kindergarten – High School: Achievement in core academic content (reading, writing, mathematics, science); behaviors that support learning in academic contexts; functional outcomes that improve educational results; and transitions to employment, independent living, and postsecondary education

Grant Research Topics

Grant Topics

  • All applications to the primary research grant programs must be directed to a specific topic:
    • Note on SF 424 Form, Item 4b (Agency Identifier Number)
    • Note at top of Abstract and Project Narrative
  • Must address student education outcomes
  • Grade range varies by topic
  • Your project might fit in more than one topic

305A: Topics and Their Grade Range

Topic Pre-kindergarten K-12 Sub-Baccalaureate & Baccalaureate
Early Learning Programs and Policies X
Cognition & Student Learning X X
Education Technology X X
Education Leadership X
Effective Teachers & Effective Teaching X
English Learners X
Improving Education Systems X
Mathematics & Science Education X
Reading & Writing X
Social & Behavioral Context for Academic Learning X
Special Topics X
Postsecondary & Adult Education X

324A: Topics and Their Grade Range

Topic Infants, Toddlers, & Preschoolers K-12
Early Intervention & Early Learning in SpEd X
Autism Spectrum Disorders X
Cognition & Student Learning in SpEd X
Families of Children with Disabilities X
Mathematics & Science Education X
PD for Teachers & Related Services Providers X
Reading, Writing & Language Development X
Social & Behavioral Outcomes to Support Learning X
SpEd Policy, Finance, and Systems X
Technology for Special Education X
Transition Outcomes for Secondary Students with Disabilities X

Choosing among Overlapping Topics

  • What literature are you citing?
  • To which topic is your area of expertise best aligned?
  • If your focus is on a specific population of students/teachers, go to that program/topic:
    • Is your focus on a specific type of student/teacher (e.g., English Learners), or are you studying them as a subgroup of your sample?

Grant Research Goals

Research Goals

  • All applications to 84.305A/84.324A must be directed to a specific goal:
    • Note on SF 424 Form, Item 4b
    • Note at top of Abstract and Research Narrative
  • The goal describes the type of research to be done
  • Every application is directed to a specific topic/goal combination

FY 2017 Research Goals

  • Exploration
  • Development & Innovation
  • Efficacy & Replication
  • Effectiveness
  • Measurement

Maximum Award Amounts

Goal – Maximum Duration – Maximum Award
Exploration, with secondary data – 2 years – $600,000
Exploration, with primary data – 4 years – $1,400,000
Development & Innovation – 4 years – $1,400,000
Efficacy & Replication: Efficacy or Replication – 5 years (4 years NCSER) – $3,300,000
Efficacy & Replication: Follow-up study – 3 years – $1,100,000
Efficacy & Replication: Retrospective – 3 years – $700,000
Effectiveness – 5 years – $3,800,000
Effectiveness: Follow-up study – 3 years – $1,400,000
Measurement – 4 years – $1,400,000

Goal Requirements

  • Your application must meet all Project Narrative and Award requirements listed for the goal you select in order for your application to be considered responsive and sent forward to review.
  • We strongly encourage you to incorporate the recommendations into your Project Narrative.
  • All applications must include a Dissemination Plan.

Exploration Projects: Key Features

  • Malleable factors must be under the control of the education system
  • Something that can be changed by the system
  • Examples
    • Student characteristics: behavior, skills
    • Teacher characteristics: practices, credentials
    • School characteristics: size, climate, organization
    • Education interventions: practices, curricula, instructional approaches, programs, and policies

Development & Innovation Projects: Key Features

  • Iterative development process
  • Well specified theory of change
  • Data collected on feasibility and usability in authentic education settings
  • Fidelity must be measured
  • Pilot data on student outcomes

Efficacy & Replication Projects: Key Features

  • Testing a causal question
  • Ask what might be needed to implement intervention under routine practice, even if you intend to test under ideal conditions
  • Consider role of developer to avoid conflict of interest for developer-evaluators
  • Confirmatory mediator analyses are not required (as primary research questions), but exploratory mediator analyses are recommended

Effectiveness Projects: Key Features

  • IES expects researchers to
    • Implement intervention under routine practice
    • Include evaluators independent of development/distribution
    • Describe strong efficacy evidence for intervention (from at least one previous efficacy study)
  • Does not expect wide generalizability from a single study
    • Expects multiple Effectiveness projects to this end
    • Sample size is not a key distinction from Efficacy
  • Does not require confirmatory mediator analyses but encourages exploratory ones
  • Cost of implementation is limited to 25% of budget

Measurement Projects: Key Features

  • Assessments may also be developed in other goals, but not as the primary focus
  • Primary product of measurement grant is the design, refinement, and/or validation of an assessment
  • Include an assessment framework
  • Must link the assessment to student education outcomes

Expected Products

  • Expected Products for each goal can help you identify the right goal for your project
  • At the end of a funded project, IES expects you to provide… (see RFA)

Four Sections of the Project Narrative

Project Narrative

  • Four Required Sections
    • Significance
    • Research Plan
    • Personnel
    • Resources
  • Each of these sections will be scored individually by the peer reviewers
  • In addition, reviewers provide an overall score of Scientific Merit
  • Requirements vary by Topic & Goal
  • READ THE REQUIREMENTS CAREFULLY
  • 25 pages, single spaced
  • Project Narrative is supported by Appendices, but all critical content for reviewers should be included within the 25 pages of the Project Narrative

Significance Section

  • Describes the overall project
    • Your research question to be answered, intervention to be developed or evaluated, or measure to be developed and/or validated
  • Provides a compelling rationale for the project
    • Theoretical justification
      • Theory of Change
    • Empirical justification
    • Practical justification
  • Do not assume reviewers know significance of your work
  • Do not quote back RFA on general importance of a topic
    • e.g., RFA paragraph on lack of reading proficiency of 8th and 12th graders based on NAEP data
  • Do quote back RFA if your project is addressing a research gap or consideration identified in the RFA
    • e.g., disproportionality in discipline (Social/Behavioral); impact of early childhood policy initiatives (Early Learning)

Significance: 2 Key Problem Areas

  1. Description of Malleable Factor/Intervention
    • Unclear what the intervention is: confuses reviewers
      • When an intervention has many components applied at different times, explain how they fit together; a graphic may help
    • Unclear how the intervention will be implemented to ensure fidelity
    • Intervention not shown to be strong enough to expect an impact
      • Especially true for information interventions (e.g., providing data on students, short teacher workshops)
    • Overly focused on actions rather than content
      • e.g., 20 hours of PD held over 10 weeks, with no detail on what will be covered in the sessions
  2. Theory of Change
    • Why a malleable factor is expected to be related to a student outcome
    • Why the proposed intervention should improve outcomes versus current practice
    • How an assessment/instrument will measure a specific construct
    • When well laid out, a theory of change makes clear what is expected to happen and in what order
    • Easy for reviewers to understand research plan – why measure certain outcomes
    • Graphic can be helpful

Theory of Change Should Describe

  • How the intervention addresses the need and why it should work
    • Content: what the student should know or be able to do; why this meets the need
    • Pedagogy: instructional techniques and methods to be used; why appropriate
    • Delivery System: how the intervention will arrange to deliver the instruction
    • Which aspects of the intervention are different from the counterfactual condition
  • Key factors or core ingredients most essential and distinctive to the intervention
  • Do not overwhelm the reader
  • Do not use color as a key because applications are often reviewed in black and white

Research Plan Section

  • Describe the work you intend to do
    • How you will answer your research question; develop your intervention; evaluate the intervention, or develop and/or validate your assessment
  • Make certain Research Plan is aligned to Significance section
    • All research questions should have justification in Significance
  • Step-by-step process
    • A timeline is strongly recommended!

Identify Setting, Population, & Sample

  • Identify the places you will be doing research
  • Identify the population you are addressing
  • Identify the sample
    • Inclusion and exclusion criteria
    • Sample size (issues of power for analysis)
    • The importance of attrition and how to address it
    • External validity: can you generalize to your population or only to a subset of it
  • If using secondary data, discuss these for the datasets you will be using
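Sample size and power deserve concrete numbers in the research plan. As a rough illustration of the kind of calculation involved (not an IES requirement, and deliberately ignoring clustering and covariate adjustment), a normal-approximation formula for a two-group comparison can be sketched in Python; the effect sizes, alpha, and power below are hypothetical choices:

```python
import math
from statistics import NormalDist

def n_per_group(delta, alpha=0.05, power=0.80):
    """Participants per group needed to detect a standardized mean
    difference `delta` in a two-sided two-sample test (normal
    approximation; ignores clustering and covariate adjustment)."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # critical value for the test
    z_beta = z.inv_cdf(power)           # quantile for the desired power
    return math.ceil(2 * ((z_alpha + z_beta) / delta) ** 2)

# Halving the expected effect roughly quadruples the required sample:
print(n_per_group(0.50))  # moderate standardized effect
print(n_per_group(0.25))  # small-to-moderate standardized effect
```

A cluster-randomized design would need a cluster-aware calculation that accounts for the intraclass correlation, which this sketch omits.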

Specify Your Outcome Measures

  • For both proximal and distal outcomes
  • Sensitive (often narrow, aligned with intervention) measures
  • Measures of broad interest to educators
  • Describe reliability, validity, and relevance
  • Do not include measures not linked to research questions
  • Consider issue of multiple comparisons
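The multiple-comparisons issue can be made concrete with a small sketch. Bonferroni adjustment is shown here only as one common option; the outcome names and p-values are invented, and an applicant might justify a different correction:

```python
def bonferroni(p_values):
    """Return Bonferroni-adjusted p-values, capped at 1.0."""
    m = len(p_values)
    return [min(1.0, p * m) for p in p_values]

# Hypothetical raw p-values for three outcome measures:
raw = {"reading": 0.012, "math": 0.040, "attendance": 0.300}
adjusted = dict(zip(raw, bonferroni(list(raw.values()))))
# With three outcomes tested at alpha = 0.05, a raw p-value must fall
# below 0.05 / 3 (about 0.0167) to remain significant after adjustment.
```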

Specify Features of All Other Measures

  • Measures that feed back into iterative development process
  • Fidelity of Implementation
    • Operating as intended
    • Able to address comparison groups
  • Feasibility

Qualitative Measures

  • Describe
    • Actual items to be used
    • How items link to constructs – the validity of these measures
    • Procedures for collection and coding (address inter-rater reliability)
    • How qualitatively collected measures are used in analysis of quantitative outcomes (e.g., test scores)

Measurement Projects

  • Alternate forms – horizontal equating
  • Vertical equating, if measuring growth
  • Test fairness
  • Non-student instruments must be validated against student outcomes

Analysis Depends on Design

  • Describe how your analysis answers your research questions
  • Describe analyses of qualitative data
  • Show your model
  • Identify coefficients of interest and their meaning
  • Show different models for different analyses
  • Include Equations
  • Address clustering
  • Describe plan for missing data
  • Check for equivalency at start and attrition bias
  • Use sensitivity tests of assumptions
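The advice to "show your model" and "address clustering" can be illustrated with a hypothetical two-level impact model for a cluster-randomized design, with students i nested in schools j (this equation is an illustration, not a model prescribed by IES):

```latex
Y_{ij} = \beta_0 + \beta_1 T_j + \mathbf{X}_{ij}'\boldsymbol{\gamma} + u_j + e_{ij},
\qquad u_j \sim N(0, \tau^2), \quad e_{ij} \sim N(0, \sigma^2)
```

Here $\beta_1$ is the coefficient of interest (the impact of assignment to treatment $T_j$), $\mathbf{X}_{ij}$ holds baseline covariates, and the school-level random effect $u_j$ is what addresses clustering.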

Personnel Section

Describe key personnel:

  • Show that every aspect of project has an individual with expertise to do it
    • Appropriate methodological expertise
    • Substantive person for all issues addressed
    • Do not propose to hire a yet-to-be-identified key person to supply needed expertise
    • Project management skills
  • Show that every aspect of project has enough time from an expert
  • Orient CVs so they are specific to the project
    • Can use SciENcv to create an IES Biosketch
    • 4 pages plus 1 page for other sources of support

Personnel Strategies for PI

  • Senior Researcher as PI
    • Show adequate time to be PI
    • Make credentials clear (not all reviewers may know)
  • Junior Researcher as PI
    • Show you have adequate expertise not only to do work but to manage project
      • Continuation of graduate/postdoctoral research
      • Management skills as graduate student/postdoc
    • Reviewers more comfortable if you have senior person(s) on project to turn to for advice
      • Co-PI, Co-I, contractors, advisory board
      • Have them on for enough time to be taken seriously

Resources

  • Show institutions involved have capacity to support work
    • Do not use university boilerplate
  • Show that all organizations involved understand and agree to their roles
    • What will each institution, including schools, contribute to the project
    • Show strong commitment of schools and districts and alternatives in case of attrition
  • If you have received a prior grant award for similar work, describe the success of that work

Dissemination Plan

  • Describe your capacity to disseminate information about the findings from your research
  • Identify the audiences that you expect will benefit from your research
  • Discuss the ways in which you intend to reach these audiences
  • Appendix D should back up the Resources section
  • Detailed Letters of Agreement from research institutions, States, districts, schools
  • Do letters show that partners understand their role in the project (e.g., random assignment to condition, time commitments)?
  • Do letters show that you have access to all necessary data to do the proposed work?

Budget and Budget Narrative

  • Provide a clear budget and budget narrative for overall project and each sub-award
  • Provide detail on the assumptions used in the budget (e.g., assumptions for travel)
  • Budget categories are described beginning on pg. 110 of 84.305A RFA, and pg. 116 of 84.324A RFA
  • Check RFA for specific budget requirements for Research Goals
  • Ensure alignment among Project Narrative, Budget, and Budget Narrative
  • Express level of effort in terms of a 12-month calendar year

Appendices

Appendix – Content to be included
A – Response to reviewers for resubmitted applications (limit 3 pages)
B – Figures, charts, and tables that supplement the project narrative; examples of measures to be used in the project (optional; limit 15 pages)
C – Examples of materials to be used in the intervention or assessment that is the focus of your project (optional; limit 10 pages)
D – Letters of agreement from partners, data sources, and consultants (optional; no page limit)
E – Data Management Plan (required for Efficacy/Replication and Effectiveness applications; limit 5 pages)

Peer Review Process


IES Launches New Software to Support Program Evaluation in Education

The Institute of Education Sciences (IES) has launched a new tool that can make it easier and more cost-effective for states and school districts to evaluate the impact of their programs. RCT-YES™ is free, user-friendly software that assists those with a basic understanding of statistics and research design in analyzing data and reporting results from randomized controlled trials (RCTs) and other types of evaluation designs.

RCT-YES™ was developed by Mathematica Policy Research, Inc. under a contract from the IES National Center for Education Evaluation and Regional Assistance. While the software has a simple interface and requires no knowledge of programming, it does not sacrifice rigor. RCT-YES™ uses state-of-the-art statistical methods to analyze data.

The software can be downloaded from the RCT-YES™ website, along with a quick start guide, interactive how-to videos, and a detailed user’s manual. A report properly generated using RCT-YES™ will include all the statistical information necessary for a review by the What Works Clearinghouse™ (WWC). The WWC, another IES investment, reviews high-quality research on interventions and programs to determine what works in education.
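For intuition only, the basic quantity such software reports (an impact estimate with a standard error) can be sketched in a few lines of Python. This is not RCT-YES code and omits the design-based refinements the tool actually applies; the score data below are invented:

```python
from statistics import mean, stdev

def impact_estimate(treatment, control):
    """Difference in mean outcomes and its (unpooled) standard error."""
    diff = mean(treatment) - mean(control)
    se = (stdev(treatment) ** 2 / len(treatment)
          + stdev(control) ** 2 / len(control)) ** 0.5
    return diff, se

scores_t = [78, 82, 75, 88, 90, 73]  # hypothetical treatment-group scores
scores_c = [70, 75, 72, 80, 77, 69]  # hypothetical control-group scores
effect, se = impact_estimate(scores_t, scores_c)
```

A full RCT analysis would also handle clustering, covariates, and missing data, which is exactly the kind of rigor the software automates.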

The new tool is part of IES’ ongoing efforts to build the capacity of states and school districts to conduct and use high-quality research. Other IES-led efforts include grants for researcher-practitioner partnerships, technical assistance for conducting and using research in each area of the country, and grants to state education agencies to build and use their longitudinal data systems.

For more information on RCT-YES™, visit www.rct-yes.com.

Reprinted from the IES June 14, 2016 press release.

For further information, read the blog post, RCT-YES: Supporting a Culture of Research Use in Education, by Ruth Curran Neild, delegated director.

See How ERIC Selects New Sources

A new video from the Institute of Education Sciences (IES) explains how ERIC selects new sources, including education-focused journals, grey literature reports, and conference papers.

The video summarizes the types of resources ERIC will and will not index, the source selection process, and how to recommend a new resource.

To watch the video, visit the National Center for Education Evaluation and Regional Assistance blog by Erin Pollard, ERIC project officer.

Words That Should Be Avoided in Grant Proposals: Part 2

One of the most commonly used words in grant proposals is the verb “(to) understand.” Applicants frequently plan grant proposals that have been designed to “understand” something. Thus, it is common to read, “The objective of this proposal is to understand the underlying reasons for…”

Alternatively, some applicants feel it is important to be more conservative. Under such circumstances, they might write something like, “The objective of this proposal is to better understand the underlying reasons for…” While these are certainly laudable goals and meritorious of an applicant’s promise, their achievement in reality should prompt serious questions among reviewers and even the applicants themselves.

In point of fact, understanding is a goal which is rarely, if ever, fully achieved in practice. In fact, the acquisition of what might be called “full understanding” can only be approached asymptotically.

The extent to which the promise of an applicant to “better understand” or “increase understanding” is meaningful would depend rather critically upon exactly where one is in the understanding versus knowledge relationship, and this is likely not to be accurately known either by the applicant, by the reviewers, or by the funding agency.

An additional problem that becomes apparent is that to “understand” or “better understand” has a relatively qualitative and indeterminate endpoint. In contrast, every applicant should have an objective where the endpoint can be precisely delineated.

As a consequence, applicants should think seriously before presenting arguments for the need to “understand” or for proposing deliverables in which a “better understanding” would be anticipated. There is, in fact, alternative terminology to “understanding” that would be equally acceptable, for example “define the mechanism,” “elucidate the primary contributing factors,” “explicate the most likely external factors.” In this way, it will be possible to avoid the “nebulous trap” that is usually impossible to escape with the promise of “understanding.”

Excerpted from Grant Writers’ Seminars & Workshops (GWSW) Blog.

To view specific examples of tips and strategies from various versions of the GWSW The Grant Application Writer’s Workbook, visit the Workbooks page and click the link for the specific workbook of interest to you.

Awarded Projects for June 2016

College of Education
Awarded Projects
June 2016
Principal Investigator: Carole Beal (STL)
Co-PI: Walter Leite (SHDOSE), Donald Pemberton (Lastinger Center for Learning), George Michailidis (Informatics Institute)
Funding Agency: US Department of Education/IES
Project Title: Precision Education: The Virtual Learning Lab
Project Period: 7/1/2016 – 6/30/2021
Award Amount: $8,908,288

Principal Investigator: Carole Beal (STL)
Co-PI: Nicholas Gage (SSESPECS)
Funding Agency: US Department of Education/IES
Project Title: An Intervention to Provide Youth with Visual Impairments with Strategies to Access Graphical Information in Math Word Problems
Project Period: 7/1/2016 – 6/30/2019
Award Amount: $1,397,638

Principal Investigator: Ann Daunic (SSESPECS)
Co-PI: Nancy Corbett (SSESPECS), Stephen Smith (SSESPECS), James Algina (SSESPECS)
Funding Agency: US Department of Education/IES
Project Title: Evaluating a Social-Emotional Learning Curriculum for Children at Risk for Emotional or Behavioral Disorders
Project Period: 7/1/2016 – 6/30/2020
Award Amount: $3,499,958

Principal Investigator: Holly Lane (SSESPECS)
Co-PI: Nicholas Gage (SSESPECS)
Funding Agency: US Department of Education/OSEP
Project Title: Project TIER: Teaching, Intervention, and Efficacy Research
Project Period: 1/1/2017 – 12/31/2021
Award Amount: $1,250,000

Principal Investigator: Aki Murata (STL)
Co-PI: N/A
Funding Agency: Florida State University (Subcontract – NSF Flow Through)
Project Title: Identifying an Effective and Scalable Model of Lesson Study
Project Period: 1/1/2016 – 7/31/2017
Award Amount: $15,696

Principal Investigator: Donald Pemberton (Lastinger Center for Learning)
Co-PI: N/A
Funding Agency: Stranahan Foundation
Project Title: Stranahan Early Learning
Project Period: 7/1/2016 – 6/30/2018
Award Amount: $599,987

Principal Investigator: Philip Poekert (Lastinger Center for Learning)
Co-PI: N/A
Funding Agency: Orange County Public Schools
Project Title: Orange County Certified Coaching
Project Period: 7/1/2016 – 6/30/2017
Award Amount: $138,138

Principal Investigator: Philip Poekert (Lastinger Center for Learning)
Co-PI: N/A
Funding Agency: United Way of Miami-Dade, Florida
Project Title: Early Childhood Technical Assistance Certification: Coaching Program
Project Period: 5/1/2016 – 6/30/2017
Award Amount: $44,000

Principal Investigator: Philip Poekert (Lastinger Center for Learning)
Co-PI: N/A
Funding Agency: United Way of Miami-Dade, Florida
Project Title: Supporting Family Engagement in the Early Head Start Child Care Partnership
Project Period: 5/1/2016 – 6/30/2018
Award Amount: $39,600

 

Submitted Projects for June 2016

College of Education
Submitted Projects
June 2016
Principal Investigator: Alice Kaye Emery (SSESPECS)
Co-PI: N/A
Funding Agency: Florida Department of Education
Proposal Title: Working with the Experts 2016 – 2017
Requested Amount: $240,000

Principal Investigator: Christy Gabbard (P.K. Yonge)
Co-PI: N/A
Funding Agency: Florida Department of Education
Proposal Title: Title II, Part A, Teacher & Principal Training & Recruiting Fund 2016 – 2017
Requested Amount: $25,735

Principal Investigator: Jeffrey Pufahl (Center for Arts in Medicine)
Co-PI: Dennis Kramer (SHDOSE), Christopher Loschiavo (Student Affairs)
Funding Agency: Centers for Disease Control and Prevention
Proposal Title: Evaluating the Efficacy of Sexual Assault Education through Applied Theatre
Requested Amount: $192,726

Principal Investigator: Isaac McFarlin (SHDOSE)
Co-PI: N/A
Funding Agency: University of Michigan (Subcontract – IES Flow Through)
Proposal Title: On the Importance of School Facilities Spending to Student Outcomes
Requested Amount: $213,793

Principal Investigator: Jack Judy (Electrical and Computer Engineering)
Co-PI: M. David Miller (SHDOSE), Kevin Otto (Biomedical Engineering)
Funding Agency: National Science Foundation
Proposal Title: NSF Engineering Research Center for Autonomic Neural Engineering
Requested Amount: $199,646

Principal Investigator: Ashley Pennypacker Hill (P.K. Yonge)
Co-PI: N/A
Funding Agency: Florida Department of Education
Proposal Title: IDEA Part B, Entitlement 2016 – 2017
Requested Amount: $207,547

Principal Investigator: Ashley Pennypacker Hill (P.K. Yonge)
Co-PI: N/A
Funding Agency: Florida Department of Education
Proposal Title: Title I, Part A: Improving the Academic Achievement of the Disadvantaged 2016 – 2017
Requested Amount: $123,295

Principal Investigator: Donald Pemberton (Lastinger Center for Learning)
Co-PI: N/A
Funding Agency: Florida’s Office of Early Learning
Proposal Title: Office of Early Learning VPK Instructor Support
Requested Amount: $4,180

Principal Investigator: Philip Poekert (Lastinger Center for Learning)
Co-PI: N/A
Funding Agency: Seminole County Public Schools
Proposal Title: Seminole County Instructional Coaching
Requested Amount: $16,500

Research Spotlight: Angela Kohnen

Q & A with Angela Kohnen, Ph.D., Associate Professor in the School of Teaching and Learning

What basic questions does your research seek to answer?

I am very interested in understanding the teaching of writing and the role of writing in classrooms across the curriculum, K-12. Some of the questions I hope to answer include: how do teachers across the curriculum learn to incorporate writing into their classrooms? What are the most effective ways to prepare teachers to teach writing? How does the teaching of writing impact student and teacher identity?

We are in an interesting place when it comes to writing instruction. The Common Core State Standards have brought more attention to writing instruction than we’ve had for years, but the standardized writing tests have also created a lot of pressure. In some places, we see more formulaic instruction rather than authentic writing, which I find troubling.

What makes your work interesting?

Everything! I love my work. I feel very privileged that I get to ask interesting questions, explore the answers, and write about all of this for a living. Writing and writing instruction can be very powerful. I have worked with teachers in a wide range of classrooms, including elementary, secondary English, science, welding, construction, and culinary arts, and in each context we have been able to find ways that writing enhances the curriculum and helps students develop into the kind of people the teachers were hoping they would become. It isn’t always easy, but that’s also what makes the work interesting. For example, coming to understand how writing can help students become welders—learn to think like welders and enact the processes of welding—that’s fascinating! I’m most engaged when I am working in fields and places where I can learn too.

What are you currently working on?

My colleagues in English Education and I are beginning a long-term study on how teachers think about and enact their role as writing teachers. We hope to work with our secondary English Education students from the time they begin our program into their first years in the field to understand how they make sense of the competing demands they face as teachers of writing. Each day, in each lesson, teachers are influenced by so many different factors: the way they were taught themselves; the curriculum they’ve been given or are creating; their students’ expectations and preparation; standardized testing; and, we hope, what they learn in a teacher preparation program. How that all plays out in their actual instruction is something we want to understand more.

I am also continuing work with colleagues at the University of Missouri-St. Louis on the teaching of nonfiction writing at the elementary level, something emphasized in the Common Core State Standards. We see this attention to nonfiction writing as an opportunity to engage students in authentic questions and information seeking—to really foster student curiosity, something that notoriously diminishes as students move through school.

Reminder: Research Training Utility

The UF Research Training Utility helps faculty identify what mandatory training must be completed in order to conduct research at UF.

To begin, answer the questions at http://research.ufl.edu/rtu.html based on the kind of research activities you expect to conduct. Once you have answered all of the questions, a list of the courses you need to complete will appear. You can then print the list or receive it by email.

The list will include the course name, how often it needs to be completed, where to take it, and how long it typically takes to complete.

To access your training records, log into myTraining at http://mytraining.hr.ufl.edu/. Your training transcript will list your completion status for individual trainings. Trainings hosted by outside entities may not have their results incorporated into myTraining at this time.

UFIRST Awards Implementation Goes Live

On July 5, the final phase of the UFIRST implementation will be complete when the awards module of the system is launched. This final phase of the UFIRST implementation reflects an increased synergy and partnership between the Division of Sponsored Programs and the Contracts and Grants Office, both of which are now housed together under the Office of Research.

While the requirements for departments to provide essential data about new awards and award modifications remain the same, research administrators will now enter this information directly into UFIRST rather than emailing it to DSP Awards.

RSH282, the UFIRST Awards training course, is available this month. Register now to attend one of the many sessions scheduled to begin June 2.

Please note that users may experience some slowness during the weekend hours of July 2-4 as the Office of Research loads in all active awards, but at no time will UFIRST proposals be unavailable.

Some additional items to note:

  • For all new awards set up after July 5, the current Award Compliance Form will be available via UFIRST. Principal investigators will be required to log into UFIRST to complete this form.
  • In addition, the Conflict of Interest portion of this form is moving into UFIRST. All key personnel must certify, in UFIRST, their outside activities related to each award.
  • A new “Viewer” role will be introduced to provide increased transparency and easier access to information. The new common workspace will support more accurate record-keeping and eliminate the need to rekey information into the myUFL system, which is expected to reduce discrepancies.

In an effort to significantly improve the way it administers research, the Office of Research introduced the University of Florida Integrated Research Support Tool—or UFIRST—in the spring of 2015.

If you have questions or concerns, please email ufirst@research.ufl.edu.

Reminder: IES Research Funding Webinars

For those interested in Fiscal Year 2017 funding opportunities, the remaining webinars in the Institute of Education Sciences (IES) series will be held in June and July as follows:

  • IES/NCSER Special Education Research Training for Early Career Development and Mentoring, Tuesday, June 7th, 3:00 p.m. – 4:00 p.m.
  • IES Application Process, Wednesday, June 8th, 1:00 p.m. – 2:30 p.m.
  • Low-Cost, Short-Duration Evaluations of Education and Special Education Programs, Thursday, June 9th, 2:00 p.m. – 3:30 p.m.
  • Research Networks Focused on Critical Problems of Policy and Practice, Thursday, June 16th, 2:30 p.m. – 4:00 p.m.
  • IES Grant Writing Workshop, Friday, June 17th, 11:00 a.m. – 12:30 p.m.
  • IES Application Process (repeated), Thursday, July 14th, 11:00 a.m. – 12:30 p.m.

For more information on the sessions and to register, visit the IES funding webinars web page. (NOTE: All times are Eastern Daylight Time)

Words That Should Be Avoided in Grant Proposals: Part 1

Applicants should usually avoid two words when preparing grant applications: “if” and “whether.” The primary problem with these words is that both open the door to a negative outcome.

While it is certainly possible that either a positive or an alternative outcome might be beneficial, in many cases, the negative outcome leads to a “dead end” in a line of work or progression of activities.

It is critical that applicants summarize the expectations for what will be accomplished (the “deliverables”) at the completion of any series of activities or experiments described in the proposal, as well as the potential importance of those outcomes. In an effort to exercise caution in describing projected outcomes, applicants may be tempted to write something along the lines of “If successful, these studies would provide new opportunities to…”

The problem with such a phrase is that it immediately conjures up the thought among reviewers of “Well, what if not?”, which, of course, has the potential to instill doubt in the reviewer’s mind as to whether the applicants themselves have confidence in the outcomes of the projected studies. In addition, if the studies are not successful, the conclusion would seem to be that such opportunities would not exist.

A straightforward way to avoid this problem is to rephrase the statement in the positive using the subjunctive mood. For example, “Our ability to provide strong evidence of the importance of our expected findings would provide new opportunities to…”

With this strategy, the applicant is not guaranteeing that the proposed activities will be successful, but simply that such success would be likely to have a significant positive impact on the funding agency’s mission.

Excerpted from Grant Writers’ Seminars & Workshops (GWSW) Blog.

To view specific examples of tips and strategies from various versions of the GWSW The Grant Application Writer’s Workbook, visit the Workbooks page and click the link for the specific workbook of interest to you.

New Lynda.com Login Process Provides Free 24/7 Access

The login page for Lynda.com — the company with which UF partners to provide students and employees with access to online software and technology courses — has changed, resulting in a slightly different process for using the free service at UF. This updated process will impact you if you are new to the Lynda.com service or if you are using a new device to access Lynda.com.

Lynda.com provides access to nearly 5,000 video courses on work-related topics including Excel, SAS, and InDesign. In addition, the site offers courses in digital photography, becoming an iBook author, and more.

To log in and access your free Lynda.com subscription, click the “Log in” button on the Lynda.com homepage and type “www.ufl.edu” into the box that reads “Or, enter your organization’s URL to log in through their portal.” Next, enter your GatorLink username. The new login page also enables “Remember Me” by default for future logins.

Please contact the UF Computing Help Desk if you need any assistance accessing Lynda.com.

Awarded Projects for May 2016

College of Education
Awarded Projects
May 2016
Principal Investigator: Mildred Maldonado Molina (Health Outcomes and Policy)
Co-PI: Lisa Langley (Lastinger Center for Learning)
Funding Agency: Florida Department of Education
Project Title: Office of Early Learning Web Portal
Project Period: 11/2/2015 – 6/30/2016
Award Amount: $4,433.00
Principal Investigator: Philip Poekert (Lastinger Center for Learning)
Co-PI: N/A
Funding Agency: Duval County Public Schools
Project Title: Duval County Certified Coaching
Project Period: 6/16/2016 – 6/15/2017
Award Amount: $158,176.00

Submitted Projects for May 2016

College of Education
Submitted Projects
May 2016
Principal Investigator: Michael Bowie (Administration, Dean’s Area)
Co-PI: Nancy Waldron (Administration, Dean’s Area)
Funding Agency: Florida Department of Education
Proposal Title: College Reach-Out Program (CROP)
Requested Amount: $104,917
Principal Investigator: Mary Brownell (SSESPECS)
Co-PI: Amber Benedict (SSESPECS)
Funding Agency: US Department of Education/i3
Proposal Title: Project Coordinate: Supporting Teams of 3rd and 4th Grade Teachers in Designing and Implementing Effective Literacy Instruction Coordinated through a Multi-Tiered System of Support Framework
Requested Amount: $2,999,998
Principal Investigator: Anne Corinne Huggins-Manley (SHDOSE)
Co-PI: Walter Leite (SHDOSE)
Funding Agency: US Army Research Initiative for the Behavioral & Social Sciences
Proposal Title: Personnel Testing and Performance: Statistical Models for Non-response in Measurement
Requested Amount: $344,400
Principal Investigator: Philip Poekert (Lastinger Center for Learning)
Co-PI: Susan Butler (STL), Walter Leite (SHDOSE), Donald Pemberton (Lastinger Center for Learning)
Funding Agency: Rapides Foundation
Proposal Title: Rapides Evaluation of Education Initiative
Requested Amount: $449,501
Principal Investigator: Philip Poekert (Lastinger Center for Learning)
Co-PI: N/A
Funding Agency: United Way of Miami-Dade, Florida
Proposal Title: Early Childhood Technical Assistance Certification: Coaching Program
Requested Amount: $44,000
Principal Investigator: Philip Poekert (Lastinger Center for Learning)
Co-PI: N/A
Funding Agency: United Way of Miami-Dade, Florida
Proposal Title: Supporting Family Engagement in the Early Head Start Child Care Partnership
Requested Amount: $39,600
Principal Investigator: Albert Ritzhaupt (STL)
Co-PI: Nancy Waldron (Administration, Dean’s Area), Christina Gardner-McCune (Computer & Information Science & Engineering)
Funding Agency: National Science Foundation
Proposal Title: Collaborative Research: Creating Alternative Pathways to Computing Careers for Diverse Populations
Requested Amount: $2,479,506