
Research Spotlight: Joni Williams Splett

Q & A with Joni Williams Splett, Ph.D., Assistant Professor in the School of Special Education, School Psychology, and Early Childhood Studies

What basic questions does your research seek to answer?

How do we improve the mental health and wellness of children and youth? Through my research, I seek to identify strategies that help all children, youth, and their families achieve and maintain positive mental health outcomes. On a systems level, my research is focused on meaningfully interconnecting child-serving systems, such as schools and community mental health agencies, so that resources are multiplicatively enhanced and the delivery of a continuum of evidence-based mental health practices is improved. At the student level, my research focuses on preventing and reducing aggressive behaviors through the development and testing of intervention programs for children, families, and schools.

What makes your work interesting?

Children’s mental health is gaining more national and international attention. It is an area most people can agree is important. My research includes systems- and student-level questions with emphasis on the inclusion and integration of families, communities, and schools. In this way, I seek to use resources more effectively to improve access to mental health promotion, prevention and intervention, and associated outcomes. My research questions, thus, include intervention effectiveness, as well as resource allocation, access, and economic impact.

What are you currently working on?

My current systems-level work includes three grant-funded, national projects, while my student-level intervention research program is focused on revising and testing GIRLSS (Growing Interpersonal Relationships through Learning and Systemic Support), a group counseling intervention to reduce relational aggression.

Currently, my largest project is a four-year, multisite randomized controlled trial of the Interconnected Systems Framework (ISF) funded by the National Institute of Justice (NIJ). I am co-principal investigator of the study and PI of the Florida site. The ISF is a structure and process for blending education and mental health systems through a multitiered structure of mental health promotion, prevention, and intervention. It interconnects the multitiered system features of School-Wide Positive Behavioral Interventions and Supports (PBIS) with the evidence-based mental health practices of community mental health agencies in the school setting. Specifically, key components of the ISF include interdisciplinary collaboration and teaming, data-based decision making, and evidence-based mental health promotion, prevention, and intervention practices. We are in the first of two implementation years for the NIJ-funded randomized controlled trial and have one year for follow-up and sustainability tracking.

My second systems-level project is the development and validation of an action planning fidelity measurement tool for the ISF, called the ISF Implementation Inventory (ISF-II; Splett, Perales, Quell, Eber, Barrett, Putnam, & Weist, 2016). During phase one of this project, we piloted the ISF-II with three school districts in three states to examine the tool’s content and social validity. We revised the measure accordingly and are now testing its reliability, construct validity, and social validity in phase two. I am leading the phase two study in collaboration with the National PBIS Technical Assistance Center’s ISF workgroup, and it currently includes more than 10 school districts in seven states. We aim to include more than 100 schools in our phase two psychometric study of the ISF-II.

The third systems-level project that I am advancing examines the adoption considerations and implementation outcomes of universal mental health screening in schools. Mental health screening is a key data-based decision making component of the ISF, as it is hypothesized to improve identification and access to mental health services for children and youth. Currently, I have several papers under review or in preparation in this area. My major project includes examining the intervention receipt outcomes in schools using a mental health screener. Schools have limited intervention resources, and it is unlikely that every student identified as in need by a universal mental health screener will receive services. My research team is using real-life screening data from schools implementing the ISF, combined with service receipt, teacher survey, and extant student records data, to examine the characteristics of students who receive intervention versus those who do not but are identified by the screener as in need. Our findings will inform recommendations to schools and policy makers for improving the implementation strategies of these screening tools.
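
To make that kind of analysis concrete, here is a minimal sketch of a service-receipt comparison in Python. The variable names and the simulated data are hypothetical illustrations, not the ISF study's actual screening, survey, or records data or its analytic model.

```python
# Illustrative only: a logistic regression comparing students identified by a
# universal screener who did vs. did not receive services. Column names and
# data are invented for this sketch.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "screener_score": rng.normal(60, 10, n),   # hypothetical risk score
    "grade": rng.integers(6, 9, n),            # middle school grades 6-8
    "prior_referral": rng.integers(0, 2, n),   # extant-records indicator
})
# Simulate service receipt loosely related to the predictors (toy data only).
logit_p = -8 + 0.1 * df["screener_score"] + 0.8 * df["prior_referral"]
df["received_services"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

model = smf.logit("received_services ~ screener_score + grade + prior_referral", data=df)
print(model.fit(disp=False).summary())
```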

At the student level, I am excited to be revising and testing the referral and intervention protocol of GIRLSS. I developed GIRLSS during a practicum placement in graduate school and tested it for my dissertation, but was unable to advance the work during my internship or postdoctoral positions. At UF, I have developed a partnership with Stephen Smith, Ph.D., in the Special Education program, who has successfully developed and tested other interventions to prevent and reduce aggressive behaviors in the school setting. We lead a team of doctoral students who have revised the group counseling curriculum of GIRLSS and conducted a field trial of it with middle school girls attending a local summer camp. Currently, we are writing grant applications to fund further development and testing of our revised referral and intervention protocol.

Happy Holidays from Your OER Team!


NSF Releases Revised Proposal and Award Policies and Procedures Guide

The National Science Foundation (NSF) has published a revised version of the Proposal and Award Policies and Procedures Guide (PAPPG, NSF 17-1) effective for proposals due on or after January 30, 2017.

For a complete list of revisions, please see the NSF webpage Significant Changes and Clarifications to the PAPPG. For a PDF or HTML version of the entire guide, please see Proposal and Award Policies and Procedures Guide NSF 17-1.

A summary of the significant changes follows:

Overall Document

  • The PAPPG has been modified in its entirety, to remove all references to the Grant Proposal Guide (GPG) and Award & Administration Guide (AAG). The document will now be referred to solely as the NSF Proposal & Award Policies & Procedures Guide (PAPPG). The document will be sequentially numbered from Chapter I-XII and all references throughout have been modified to reflect this change. Part I of the document covers NSF’s proposal preparation and submission guidelines, and Part II covers NSF’s award, administration and monitoring guidelines.
  • Editorial changes have been made to either clarify or enhance the intended meaning of a sentence or section or ensure consistency with data contained in NSF systems or other NSF policy documents.

Significant Changes to the PAPPG Part I

  • Chapter I.D.1, Letters of Intent (LOI), includes additional language regarding the submission of a LOI for collaborative proposals. Proposers that plan to submit a collaborative proposal from multiple organizations should submit a single LOI for the entire project, given that NSF considers a collaborative proposal to be a unified research project.
  • Chapter II.B, Format of the Proposal, has been updated to include two new types of proposals, RAISE and GOALI. These two types of proposals are described in greater detail in Chapter II.E. An additional resource has also been added to this section with information on NSF auto-compliance checks that are conducted during the proposal preparation and submission process.
  • Chapter II.C.1.e, Collaborators & Other Affiliations Information, includes additional instructions for proposers. Each section of the Collaborators & Other Affiliations Information should be listed alphabetically by last name. The text has also been revised to remove the requirement that proposers list postgraduate scholar sponsors in this section of the proposal. Postgraduate scholar sponsor is not a disqualifying relationship for a reviewer; therefore, it was determined that this information is not necessary.
  • Chapter II.C.2, Sections of the Proposal, has been revised to inform proposers that proposal preparation for RAPID, EAGER, RAISE, GOALI, Ideas Lab, FASED, Conference, Equipment, Travel, Center, Research Infrastructure and Fellowship projects may deviate from the content requirements of a full research proposal.
  • Chapter II.C.2.a, Cover Sheet, has been updated to provide instructions that more closely follow the proposal preparation screens in FastLane.
  • Chapter II.C.2.d(iii), Results from Prior NSF Support, includes revised language to clarify NSF’s purpose for collecting this information in the Project Description. The purpose of the Results from Prior NSF Support section is to assist reviewers in assessing the quality of prior work conducted with current or prior NSF support. Additional instructions have also been added regarding the type of information that should be included for projects that have been recently awarded, where no new results exist.
  • Chapter II.C.2.g(vi), Other Direct Costs, has been updated to include information on incentive payments, for example, payments to human subjects or incentives to promote completion of a survey. These costs should be included on line G6 of the NSF Budget and should be proposed in accordance with organizational policies and procedures. Indirect costs should be calculated on incentive payments in accordance with the organization’s approved US Federally negotiated indirect cost rate(s).
  • Chapter II.D, Special Processing Instructions, has been revised to address areas where special proposal processing may be required. Information on RAPID, EAGER, Ideas Lab, FASED, Equipment, Conference, and Travel Proposals has been moved to Chapter II.E.
  • Chapter II.D.5, Proposals Involving Human Subjects, has been updated to reflect the Foundation’s implementation of 45 CFR 690.118, applications and proposals lacking definite plans for involvement of human subjects. A hypertext link is provided to an NSF-approved format that may be used to submit such determinations by proposing institutions. Clarification has also been added regarding the IRB documentation that NSF must have in order to make an award when proposals involve human subjects.
  • Chapter II.E, Types of Proposals, has been added to describe, in one place, the various other types of proposals that can be submitted to NSF, including the two new types, RAISE and GOALI. This section includes proposal preparation instructions for each of the types of proposal that may supplement or deviate from the guidance provided elsewhere in Chapter II.
  • Chapter II.E.9, Travel Proposal, has been updated from “International Travel Proposals” to “Travel Proposal” to reflect that this type of proposal can be used for both domestic and international travel requests. Additional proposal preparation instructions have also been added to inform proposers of the required proposal elements, including the requirement that the Project Description contain Results from Prior NSF Support.

Clarifications and Other Changes to the PAPPG Part I

  • Chapter I.F, When to Submit Proposals, has been revised to include additional instructions on how to submit proposals under the Special Exception to NSF’s Deadline Date Policy. This section includes proposal preparation instructions for organizations impacted by a natural or anthropogenic disaster. Impacted proposers must check the “Special Exception to the Deadline Date Policy” box on the NSF Cover Sheet and upload the requisite Single Copy Document(s).
  • Chapter II.C.2, Sections of the Proposal, has been amended to include k. Single Copy Documents in the list of the required components of a full research proposal.
  • Chapter II.C.2.f(i), Biographical Sketch(es), Senior Personnel, has been revised to reflect that FastLane no longer accepts the Biographical Sketch inserted as text. The Biographical Sketch for each senior personnel must be uploaded as a single PDF file associated with that individual.
  • Chapter II.C.2.g(iv)(b) Domestic Travel, has been revised to inform proposers that travel, meal and hotel expenses of grantee employees who are not on travel status are unallowable. Additional language has also been added stating that costs of employees on travel status are limited to those specifically authorized by 2 CFR § 200.474.
  • Chapter II.C.2.g(viii), Indirect Costs (also known as Facilities and Administrative Costs (F&A) for Colleges and Universities) (Line I on the Proposal Budget), has been updated to clarify that the use of an indirect cost rate lower than the organization’s approved negotiated indirect cost rate is considered a violation of NSF’s cost sharing policy.
  • Chapter II.C.2.g(xii), Voluntary Committed and Uncommitted Cost Sharing, has been amended to include an additional reference to 2 CFR § 200.99, definition of voluntary committed cost sharing. Clarifying language has also been added to emphasize how voluntary committed and voluntary uncommitted cost sharing are treated differently by NSF. In accordance with the Uniform Guidance, in order to be considered voluntary committed cost sharing, the amount must appear on the NSF budget, and be specifically identified in the approved NSF budget. Voluntary uncommitted cost sharing, however, should not be included in the proposal budget or budget justification and these resources are not financially auditable by NSF.
  • Chapter II.C.2.j, Special Information and Supplementary Documentation:
    1. has been updated to clarify where the “Mentoring Plan” and “Data Management Plan” should be uploaded in the Supplementary Documentation section of FastLane.
    2. includes additional language to emphasize the importance of submitting letters of support only when specifically required by a program solicitation.
  • Chapter II.D.3.b, Submission of a collaborative proposal from multiple organizations, has been updated to include the Collaborators & Other Affiliations Information in the list of required sections for a collaborative proposal. The Collaborators & Other Affiliations Information should be separately provided by the lead and non-lead organization(s) in a separately submitted collaborative proposal.
  • Chapter II.D.7, Projects Requiring High-Performance Computing Resources, Large Amounts of Data Storage, or Advanced Visualization Resources, includes additional language that clarifies how submitters can address the locally available high-performance computing resources in their proposal. The description of available computing resources has also been updated.
  • Exhibit II-1, Proposal Preparation Checklist, has been clarified with an additional sentence letting proposers know that FastLane uses different rules for each type of proposal (e.g. Research, RAPID, EAGER, RAISE, GOALI, Ideas Lab, FASED, Conference, Equipment or Travel) to check for compliance prior to submission to NSF. Additional checklist components have also been added to assist proposers in the presubmission administrative review of proposals to NSF.
  • Exhibit II-2, Potentially Disqualifying Conflicts of Interest, has been updated to clarify the types of relationships that would prevent a reviewer from reviewing a proposal unless a waiver has been granted by NSF. Specifically, language relating to serving as a consultant at an organization, and involvement as a former Ph.D. student/ advisor has been added in this exhibit.
  • Exhibit II-6, Nondiscrimination Certification, has been revised to ensure that references to subrecipients, contractors and subawards are consistent with definitions in 2 CFR § 200, Subpart A, Acronyms and Definitions.
  • Exhibit II-7, Definitions of Categories of Personnel, has been updated to clarify that a Faculty Associate can be a faculty member or equivalent at the performing institution.
  • Chapter III.F.2(c)(3), Process to Appeal NSF’s Decision to Decline a Proposal for Financial or Administrative Reasons, Procedures, includes additional language to clarify that proposers may submit documentation to support their statements – even documentation that may not have been presented as part of the original review process – as long as it is not “new” information that would not have been available at the time the decision to decline was made.

Significant Changes to the PAPPG Part II

  • Chapter VI.D.3.c(ii), NSF-Approved Extension, has been updated to clarify that a request for an NSF-approved extension should be submitted at least 45 days prior to the end date of the grant and must be signed and submitted by the AOR via use of NSF’s electronic systems. Information has also been added to make grantees aware of the limited time period of availability of funds due to cancelation of appropriations.
  • Chapter VII.B.2.c, Addition of co-PI/co-PD, has been added to provide instructions to grantees desiring to add a new co-PI/co-PD. This section includes instructions on how an AOR can prepare and submit this request via use of NSF’s electronic systems. This section lists the required components of this type of grantee request.
  • Chapter VIII.C.2, Payment Policies, has been amended to remove the requirement that grantees must certify that all disbursements have been made, or will be made within three days of the receipt of the payment.
  • Chapter VIII.E.6, Award Financial Reporting Requirements and Final Disbursement, has been supplemented with a new subpart, E.6., to make grantees aware of how NSF awards with canceled appropriations will be treated and to include the regulatory citation related to expiration of appropriated funds. In accordance with 31 USC 1552(a), funds will no longer be available for expenditure for any purpose beyond September 30th of the fifth fiscal year after the expiration of a fixed appropriation’s period of availability for incurring new obligations.
  • Chapter XI.B.1, Human Subjects, has been updated to reflect the Foundation’s implementation of 45 CFR 690.118, applications and proposals lacking definite plans for involvement of human subjects. Clarification has also been added regarding the IRB documentation that NSF must have for projects that involve human subjects.

Clarifications and Other Changes to the PAPPG Part II

  • Chapter VII.B.2.e, Substitute (Change) PI/PD or co-PI/co-PD, includes additional instructions on how an AOR can prepare and submit the request via use of NSF’s electronic systems. This section now lists the required components of this type of grantee request.
  • Chapter VIII.E.5, Award Financial Reporting Requirements and Final Disbursement, has been updated to inform grantees of the time limits applicable to the upward or downward adjustments to the Federal share of costs for a financially closed award.
  • Chapter IX.A, Conflict of Interest Policies, has been amended to remove the term “contractors” from this section for greater consistency with 2 CFR § 200, Subpart A, Acronyms and Definitions.
  • Chapter X.A.2.b, Pre-Award (Pre-Start Date) Costs, includes an additional footnote reminding grantees that in the case of a renewal award, costs incurred under the old grant cannot be transferred to the new grant.
  • Chapter X.B.2, Administrative and Clerical Salaries & Wages, has been updated to reference 2 CFR § 200.413, Direct Costs. Language has also been added to clarify that an AOR should initiate the request for NSF approval to direct charge salaries of administrative or clerical staff after an award has been made.

NSF Provides Update on Proposal Submission Modernization

The National Science Foundation (NSF) has undertaken a multiyear Proposal Submission Modernization (PSM) effort to transition grantee electronic capabilities from FastLane to Research.gov. Plans for next year include a pilot launch for a single solicitation in Research.gov.

PSM aims to reduce the administrative burden on the research community and NSF staff associated with the preparation, submission, and management of proposals. In FY 2016, NSF focused on establishing requirements, developing proposal section modernization concepts, and setting up the back-end application infrastructure.

In order to assist grantees, NSF has developed a matrix that lists NSF’s grantee electronic capabilities, and whether they can be found in FastLane, Research.gov, or both. This matrix will be updated as appropriate, independent of the PAPPG revision cycle. To view the matrix, please visit the NSF Electronic Capabilities Modernization Status webpage.

NIH: Final Research Performance Progress Report Effective January 1, 2017

The National Institutes of Health (NIH) has issued a Final Research Performance Progress Report (F-RPPR) replacing the Final Progress Report (FPR) for grants closeout, effective January 1, 2017.  The F-RPPR will be available for use in eRA Commons on January 1, 2017.

What This Means for You

If you have a final progress report due, and you wish to use the old FPR format of an uploaded document, you must submit the FPR before January 1, 2017. NIH will no longer accept any of the old format FPRs on or after January 1, 2017.

The Format

The format of the Final RPPR is very similar to that of the annual RPPR. The notable differences are that the F-RPPR does not have sections D (Participants), F (Changes), and H (Budget), and that it adds a new section, Section I (Outcomes).

Project Outcomes (Section I) will be made publicly available, allowing recipients the opportunity to provide the general public with a concise summary of the public significance of the research.

Deadline Remains Unchanged

The deadlines for submitting a Final RPPR remain the same — no later than 120 days from the project end date.
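
As a quick worked example, the due date can be computed directly from the project end date; the end date below is hypothetical, not tied to any specific award.

```python
# Computing a Final RPPR due date: no later than 120 days after the project end date.
from datetime import date, timedelta

project_end = date(2017, 6, 30)            # hypothetical project end date
frppr_due = project_end + timedelta(days=120)
print(frppr_due)                           # 2017-10-28
```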

The Effect on Delegation to Submit RPPR

NIH will maintain the business rule that allows the Signing Official (SO) to delegate the submission of the Final RPPR or Interim-RPPR to a Program Director/Principal Investigator (PD/PI).

For more information, see Guide Notice NOT-OD-17-022, visit the NIH RPPR web page, and check out November’s eRA Items of Interest article, “Please Call it ‘Final RPPR’.”

UF Office of Research Announces UFIRST Open Forum

The UF Office of Research would like to invite you to an open forum on Monday, December 12, in Florida Gym Room 250, from 10:30 a.m. to 12:00 p.m. The forum will help identify opportunities for UFIRST improvements in the coming year.

In August 2013, the Office of Research kicked off the UFIRST program to redesign the proposal and awards process. UFIRST is also a commitment to ongoing process improvement. The implementation of the awards module is not an end to that commitment, but simply a shift from implementation to evolution. The conversation continues.

Awarded Projects for November 2016

College of Education
Awarded Projects
November 2016
Principal Investigator: Catherine Atria (P.K. Yonge)
Co-PI: N/A
Funding Agency: Florida Department of Education
Project Title: District Instructional Leadership and Faculty Development Grant
Project Period: 7/1/2016 – 6/30/2017
Award Amount: $7,780
Principal Investigator: Michael Bowie (Administration, Dean’s Area)
Co-PI: Nancy Waldron (Student Services)
Funding Agency: Florida Department of Education
Project Title: College Reach-Out Program (CROP)
Project Period: 8/1/2016 – 7/31/2017
Award Amount: $50,360
Principal Investigator: Maria Coady (STL)
Co-PI: Ester de Jong (STL)
Funding Agency: U.S. Department of Education/OELA
Project Title: Project STELLAR: Supporting Teachers of English Language Learners Across Rural Settings
Project Period: 9/1/2016 – 8/31/2021
Award Amount: $2,393,911
Principal Investigator: Ashley Pennypacker Hill (P.K. Yonge)
Co-PI: N/A
Funding Agency: Florida Department of Education
Project Title: IDEA Part B, Entitlement 2016 – 2017
Project Period: 7/1/2016 – 9/30/2017
Award Amount: $207,547

Submitted Projects for November 2016

College of Education
Submitted Projects
November 2016
Principal Investigator: Pasha Antonenko (STL)
Co-PI: N/A
Funding Agency: National Science Foundation
Proposal Title: Collaborative Research: i-Tracker: Enhancing Engineering Design Interpretation Skills in Construction Engineering and Management Education Through Deep Context Immersion
Requested Amount: $61,681
Principal Investigator: Masoud Gheisari (M.E. Rinker, Sr. School of Construction Management)
Co-PI: Pasha Antonenko (STL), Raja Issa (M.E. Rinker, Sr. School of Building Construction), Benjamin Lok (Computer & Information Science & Engineering)
Funding Agency: National Science Foundation
Proposal Title: iVisit: Using Conversational Virtual Humans within Spatiotemporal Contexts of Construction Sites to Improve the Development of Communication Skills of Construction Students
Requested Amount: $20,928
Principal Investigator: Philip Poekert (Lastinger Center for Learning)
Co-PI: N/A
Funding Agency: Lafayette Parish School System
Proposal Title: Lafayette Parish Believe and Prepare
Requested Amount: $16,000
Principal Investigator: Rose Pringle (STL)
Co-PI: N/A
Funding Agency: Georgia State University (Subcontract – NSF Flow Through)
Proposal Title: I AM STEM Innovation, Achievement, and Motivation in Science, Technology, Engineering and Mathematics
Requested Amount: $167,332
Principal Investigator: Patricia Snyder (AZCEECS/SSESPECS)
Co-PI: N/A
Funding Agency: University of North Carolina (Subcontract – IES Flow Through)
Proposal Title: ECTA: DEC Recommended Practices in Early Intervention/Early Childhood Special Education
Requested Amount: $33,150

Research Spotlight: Albert Ritzhaupt

Q & A with Albert Ritzhaupt, Ph.D., Associate Professor in the School of Teaching and Learning

What basic questions does your research seek to answer?

While my research interests are varied, my core research agenda attempts to answer two general research questions: (1) How do you design, develop, and evaluate technology-enhanced learning environments? and (2) What factors influence technology integration into formal educational settings?

These two research questions have led me down a path to study a wide variety of technology-enhanced learning environments, ranging from multimedia learning environments to game-based learning environments. Further, I have studied factors that both facilitate and hinder technology integration in educational environments, such as the digital divide, leadership, and community engagement.

I use a wide variety of research methods to answer my research questions. I employ traditional experimental design research methods for testing many of my instructional designs and innovations in technology-enhanced learning environments. Additionally, I use classical and modern test theory to establish measurement systems to inform my research and the research of others, using procedures such as exploratory factor analysis, confirmatory factor analysis, and more. I have employed literature synthesis and meta-analysis procedures to synthesize across primary studies. I have also used more complex procedures for analyzing larger data sets, including multi-level modeling and structural equation modeling techniques. Although I was never formally trained in qualitative methods, over the years I have added some qualitative techniques to my toolbox, such as the constant comparative method and phenomenology, to answer different types of research questions.

What makes your work interesting?

I cannot answer this question for everyone else (you would have to ask others), but I can tell you what makes me passionate about my work. I have seen information and communication technology (ICT) open doors for students, teachers, instructional designers, trainers, and many others. ICT has given us the potential to do things we would not be able to accomplish otherwise, such as visualization, economy of scale, sharing of resources, and more. In all of my research, I try to provide the readers with a theoretical or conceptual framework to understand the empirical aspects of my work.

What are you currently working on?

Like most of us, I have TOO MANY projects going on right now to write about them all. However, I will note two projects that I am presently excited about. The first project is a meta-analysis of the flipped classroom empirical literature. Many educators are moving to a flipped classroom model where homework is done in class via collaborative problem-solving activities (active learning), and lecture is moved to the online space typically as video capture. We have already presented this research to the Association for Educational Communications and Technology (AECT). We are presently working on the manuscript, which we hope to submit to the Review of Educational Research journal.
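
For readers unfamiliar with how studies are combined in a meta-analysis, here is a minimal random-effects (DerSimonian-Laird) pooling sketch in Python. The effect sizes and variances are invented for illustration; they are not findings from the flipped-classroom project described above.

```python
# A minimal DerSimonian-Laird random-effects pooling of standardized mean
# differences; the inputs below are made up, not real study results.
import numpy as np

y = np.array([0.42, 0.15, 0.60, 0.05, 0.33])   # hypothetical study effect sizes (Hedges' g)
v = np.array([0.02, 0.05, 0.04, 0.03, 0.06])   # hypothetical sampling variances

w = 1 / v                                       # fixed-effect weights
theta_fixed = np.sum(w * y) / np.sum(w)
Q = np.sum(w * (y - theta_fixed) ** 2)          # heterogeneity statistic
C = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (Q - (len(y) - 1)) / C)         # between-study variance estimate

w_star = 1 / (v + tau2)                         # random-effects weights
theta_random = np.sum(w_star * y) / np.sum(w_star)
se = np.sqrt(1 / np.sum(w_star))
print(f"pooled g = {theta_random:.3f}, 95% CI = "
      f"({theta_random - 1.96 * se:.3f}, {theta_random + 1.96 * se:.3f}), tau^2 = {tau2:.3f}")
```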

The second project is an NIH grant-funded project on which I am a co-Principal Investigator with some excellent researchers in medicine at UF and at UC Denver. The purpose of the grant is first to design and develop a short course focused on estimating power and sample size for longitudinal multi-level model designs. Sample size is an important issue in that if you overestimate, you potentially expose more people to risk than necessary. Conversely, if you underestimate, you may not reach your scientific goals and objectives. We will create a workshop to teach researchers about these ideas, and then create a Massive Open Online Course (MOOC) for dissemination on a wider scale.
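
To illustrate the kind of question such a course addresses, below is a toy simulation-based power check for a two-arm longitudinal design analyzed with a random-intercept mixed model. The sample sizes, effect size, and variance components are placeholder assumptions, not values from the funded project.

```python
# A toy Monte Carlo power estimate for detecting a group-by-time slope
# difference in a random-intercept longitudinal design (all parameters assumed).
import warnings
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def simulate_power(n_per_arm=40, n_waves=4, slope_diff=0.3, sd_intercept=1.0,
                   sd_resid=1.0, n_sims=200, alpha=0.05, seed=0):
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_sims):
        n_subj = 2 * n_per_arm
        subj = np.repeat(np.arange(n_subj), n_waves)
        group = np.repeat(np.r_[np.zeros(n_per_arm), np.ones(n_per_arm)], n_waves)
        time = np.tile(np.arange(n_waves), n_subj)
        intercepts = rng.normal(0.0, sd_intercept, n_subj)[subj]
        y = (intercepts + 0.2 * time + slope_diff * group * time
             + rng.normal(0.0, sd_resid, n_subj * n_waves))
        df = pd.DataFrame({"y": y, "time": time, "group": group, "subj": subj})
        with warnings.catch_warnings():
            warnings.simplefilter("ignore")
            fit = smf.mixedlm("y ~ time * group", df, groups=df["subj"]).fit()
        hits += int(fit.pvalues["time:group"] < alpha)   # interaction = slope difference
    return hits / n_sims

print(simulate_power())   # estimated power under the assumed design
```

Raising n_per_arm or n_waves in this sketch and rerunning it shows how power changes with design choices, which is the trade-off the paragraph above describes.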

National Science Foundation Offers Free Webcast

The National Science Foundation (NSF) has announced that the Fall 2016 NSF Grants Conference Plenary (General) Sessions will be webcast live November 14-15, 2016 at no cost to the research community. NSF staff will speak to the state of current funding, review new and current policies and procedures, and provide updates on pertinent administrative issues.

You can expect to learn more about the following topics:

  • New programs and initiatives
  • Proposal preparation
  • NSF’s merit review process
  • Conflict of interest policies
  • Award Administration
  • NSF Faculty Early Career Development (CAREER) Program

The live webcast of the conference will be held on November 14-15, 2016.  We encourage you to view the full webcast agenda for additional information, and to see scheduled times. Please note that the conference is taking place in Pittsburgh, PA, so all times are EST.

Register here. Webcast login information will be emailed to registrants on Thursday, November 10th. These sessions will also be recorded for on-demand viewing once the conference has concluded. Presentations will also be available on the conference website.

If you have any questions regarding live webcasting, please contact us at grants_conference@nsf.gov, or call Allison Kennedy at 703-245-7407.

How to Write Successful Collaborative Grant Applications

Multidisciplinary collaborative research grant applications are becoming the norm for both early-career and seasoned investigators. But navigating these applications is different from writing an application as a single investigator. For one thing, scientists from different fields need to share the writing. While there is literature on general strategies for writing a multidisciplinary grant application, the endeavor is still new enough that we are continuing to figure out the specifics.

This article identifies four key strategies to writing a successful collaborative grant application.

(Excerpted from an article by Barbara St. Pierre Schneider, a research professor of nursing at the University of Nevada, Las Vegas, in the Grant Training Center blog, April 4, 2016)

My recent experience in writing an interdisciplinary collaborative research grant differed from my previous ones in that there were three rather than two of us involved in the process. In addition, my two collaborators were from a discipline far from my health science expertise: electrical and computer engineering.

There were parts of the methods that I just couldn’t write because they were beyond my expertise. I had mixed feelings about this inability to write. While I had confidence in my collaborators, I felt totally dependent upon them. On the other hand, this feeling of dependence meant to me that we had formed a true interdisciplinary collaboration.

Through this experience, I have identified four key strategies that worked for us to complete and submit the grant application and feel a sense of accomplishment during the process.

  1. Be clear about collaborators’ expertise and contributions.

Successful grant writing requires that collaborators are clear on what everyone’s expertise is and how this expertise fits with the project’s specific aims. Prior to this grant submission, we were asked to write a white paper. It wasn’t long into the writing process that I realized our good fortune of having written this paper: our research question and approach were solidified, our contributions and expertise were transparent, and our current effort was pure writing instead of conceptualizing as we were writing. But you and your collaborators don’t need to wait for an opportunity to write a white paper. Consider writing one as the collaboration is forming.

  2. Communicate frequently with your collaborators.

During a four-week period, we were in constant communication about the grant. Initially, the three of us had one in-person meeting to review our scientific approach. Then we communicated almost daily via email. One week before the application was due, I had another in-person meeting with one collaborator. This meeting was helpful because the collaborator had specific questions about where this proposed work fit within the state of the science. Not only did these questions enhance the collaborator’s understanding of the scientific field, but they also helped me identify areas that I needed to strengthen or clarify within the proposal.

It’s common sense that constant communication is critical to the success of writing a collaborative grant application; however, we are not always intentional about our communication plan, and we all have different approaches to checking and responding to email. So it doesn’t hurt to discuss communication approaches at the start of writing the grant. For example, try to schedule at least two in-person meetings in advance—one at the start and one near the deadline. If you don’t need the second or subsequent meetings, then you can always cancel. It’s easier to schedule in advance than later. Also, to prevent any communication breakdowns, ensure all collaborators are included in email traffic. It’s a simple way to keep everyone in the loop and to create a record of reference for you and your collaborators.

  3. Outsource tasks when possible.

Because we only had four weeks to draft the grant, as the lead co-principal investigator (co-PI), I decided to seek assistance from others so I could focus on the science of the proposal. For example, in my department, we have one staff person who can create a budget table and prepare a budget justification and another staff person who provides guidance in creating National Institutes of Health (NIH) biographical sketches. In addition, I enlisted an outside editor to do editing and proofing. This editor helped ascertain the strengths of our case, ensure the grant application read as one voice and met the formatting and content guidelines, and confirm that the writing mechanics were correct.

These support individuals made a huge difference in completing and submitting a polished grant application, so I highly recommend outsourcing these tasks to these experts. At a research university, staff who can advise about grant budgets are usually available. Also, for specific grant requirements (e.g., the NIH biographical sketch), reach out to colleagues who have completed a similar grant proposal or visit the institution’s website for guidance and/or samples.

In terms of editing support, check with your institution’s research or sponsored programs office to learn if editorial assistance is available. Another viable—and valuable—option is to hire a graduate student as an hourly worker to help with these tasks. A graduate student with a particular expertise can be just as effective at creating budgets, drafting biosketches, and editing your proposal as a full-time employee.

  4. Develop a strategic plan for writing the grant.

Finally, as the lead co-PI, I was strategic about the order in which the different application sections were completed. For our specific project, there were five required sections: the project narrative, the one-page summary for the future external grant application, the budget, the curriculum vitae (the NIH biographical sketch), and the description of current and pending support.

I chose to complete the project narrative first because I wanted to ensure that I had the creative energy and time for multiple revisions, especially since this application involved three writers. Plus, I needed to complete the project narrative before starting the one-page summary since this would describe the subsequent project to be submitted to an external funding agency. The budget table and justification were also completed early because the budget affected the project timeline, which was part of the project narrative. I was intentional in waiting until the project narrative was almost done to complete the curriculum vitae and support sections because I knew I would be outsourcing this task and could complete these even after much of my creative energy had been used up.

So when writing this type of grant, develop a strategic plan. This plan needs to account for the high and low points of your creative energy, the order in which multiple writers will need to contribute, the order in which sections need to be written, and the availability of support individuals. Even as a collaborator who is not the lead co-PI, you can develop a strategic plan so that your creative energy is highest when you need to contribute to sections, such as the project narrative. Finally, don’t forget to set deadlines as part of this plan. If you are the lead co-PI, you may propose these. If you are a collaborator, you may offer counter deadlines. When you can’t meet these, give new ones to ensure that the work will get done in a timely manner.

Although writing an interdisciplinary research collaborative grant application can be intimidating, implementing these four strategies will likely reduce this feeling, allowing you to be more confident and composed throughout the entire writing process.

Navigating Collaborative Grant Research

Collaboration pays, so funding agencies are promoting team research. Running a successful collaboration, especially one with several leaders at multiple sites, means thinking like a CEO: vetting partners, delegating responsibilities, and making tough management decisions. Researchers in multisite, multi-investigator projects need to adjust their grant-writing approach and work culture.

(Excerpted from an article by Chris Tachibana, September 13, 2013, in Science, a journal of the American Association for the Advancement of Science)

“Resources are shrinking,” says Alicia Knoedler, associate vice president for research, University of Oklahoma. “So government, industry, and some private funders prefer a collaborative approach.” Getting experts to work together on a problem can be more effective than supporting many separate projects.

Strong Relationships

“My most important advice about collaborations,” says Knoedler, “is to build strong research relationships in advance.” All collaborators must know they are crucial members of the research team or they’ll drop the ball or drop out. And tokenism shows in a proposal, says Knoedler. “It’s obvious to reviewers when people have just been placed on a team and not integrated into it.”

Creating a functional cross-disciplinary research team requires serious vetting. Choose senior investigators because agencies tend to fund people with an established record. Look for team players and select collaborators who are experts in their field and people you can count on to deliver results that integrate into the overall project goals.

Diversity and Unity

Once you have your research dream team, set ground rules, says Barry Bozeman, an Arizona State University professor of public policy who is studying the dos and don’ts of collaborations for the NSF. “People focus on getting the money and don’t worry about other issues until it’s too late,” he says. “Intersector work can be fraught with misunderstanding and managing a heterodox team is hard.” Scientific disciplines differ in their norms about authorship and credit, work culture, and expectations about intellectual property rights and distribution of results. Discuss those issues in advance, says Bozeman. The decision-making process and the budget should be clear as well as the division of labor and timelines, he says. “Let people know what they are expected to do and when.”

Mark Gerstein, a professor of biomedical informatics and computer science at Yale University, says collaborating lets him contribute to research that has a greater impact than single-laboratory studies. Gerstein recognizes that this work doesn’t fit in the tenure-track groove, though. “Most universities expect you to bring in your own money and publish your own papers, so I do that as well.” Gerstein first puts students and postdocs in his lab on large collaborations, then gives them smaller follow-up projects for their own first-author papers. This strategy has placed lab members in faculty positions around the world. Collaboration has pros and cons, says Gerstein. “Sometimes you have to wait to publish until the consortium is ready, and you can’t always do things the way you want to. But the benefit is being on high-impact publications and highly connected work. For these projects,” he says, “the whole is greater than the sum of the parts, and I’ve profited from being part of the whole.”

5 Tips for Federal Grant Management: Multiple Awards

Are you managing multiple awards from multiple federal agencies? If so, then you know it takes planning and diligence to make sure that the federal dollars are used, tracked, and reported to each federal agency following the specific requirements set forth by the individual set of terms and conditions and the agency-specific grant regulations.

The following article provides five tips to help you better manage multiple federal grants.

(Excerpted from an article by Lucy Morgan, CPA, MBA, director of MyFedTrainer.com, August 27, 2015)

Tip #1: Start right from the beginning.

When you first get a Notice of Award (NOA) there are a number of steps you should take to get started on the right foot:

  • Begin by reviewing all materials related to the award.
  • Set up a reporting calendar that identifies the various important dates mandated by the Federal agency.
  • Better yet, have a master calendar that includes all critical dates from all grantors (see the sketch after this list).
  • Assign a team of people or a leader to administer each individual grant.
  • Alert financial staff to the new award and its specific terms and conditions so they can ensure that policies and procedures reducing the risk of misuse and theft of funds are in place and that regular monitoring is taking place.
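
As referenced above, a master calendar can be as simple as one sorted structure of critical dates across awards. The sketch below is illustrative only; the award numbers, agencies, and dates are invented.

```python
# A bare-bones "master calendar" spanning multiple awards (all entries hypothetical).
from datetime import date

deadlines = [
    {"award": "NSF-1234567", "agency": "NSF", "item": "Annual project report", "due": date(2017, 5, 31)},
    {"award": "R01-XX00000", "agency": "NIH", "item": "RPPR", "due": date(2017, 4, 15)},
    {"award": "ED-2016-ABC", "agency": "ED", "item": "Final performance report", "due": date(2017, 9, 28)},
]

# One sorted view of every critical date from every grantor.
for d in sorted(deadlines, key=lambda x: x["due"]):
    print(f'{d["due"]}  {d["agency"]:<4} {d["award"]:<12} {d["item"]}')
```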

Tip #2: Train program staff and other stakeholders on the requirements of federal grants.

Adequate training is one of the keys to administering multiple grants in ways that ensure the effective and efficient completion of the project along with compliance with individual grant requirements.

Here are some specifics:

  • Make sure that employees working on the grant (both directly and indirectly) are sufficiently trained to be effective at grant management.
  • Communicate requirements for time and effort reporting to individual employees.
  • Ensure that individual employees record their time throughout the workday, indicating the time and effort spent on different projects, and mark it clearly on their timesheets.
  • Conduct regular reviews of project budgets, including exercising due diligence to find any errors and omissions where expenses may have been misapplied among multiple awards.

Tip #3: Select a central person as the grant administrator.

When you have multiple grants managed by multiple people, it can be a challenge to keep juggling all the roles and responsibilities without dropping a few balls. That is why I believe best practices for managing multiple awards include appointing a central grant administrator.

This person would be responsible for the following:

  • Oversight of grant administration by all the individual grant managers or grant team leads
  • Coordination and review of the fiscal reporting for each grant
  • Monitoring the individual awards for discrepancies or misreporting of how federal funds have been spent and reported back to the federal agencies
  • Confirming that all required milestones (including cost-share requirements) spelled out in the individual grants are met and that reporting to the federal agency has happened on schedule

Tip #4: Set up a grants management master file.

Let’s face it: grant management involves a lot of paperwork and requirements. To keep track of all the key elements, I recommend using a grants management master file, along with a fully maintained file for each individual award.

The grants management master file should include the following:

  • Original RFP
  • Notice of award (NOA)
  • Final approved budget, scope, and work plan
  • Subsequent budget revisions and prior approvals
  • Monitoring reports
  • Meeting minutes as applicable
  • Financial and narrative reports required by the funding agency
  • Log of required communication and other critical interactions
  • Other documentation related to meeting project milestones and/or performance measurements
  • Any other forms required by the federal agency

Keeping the grants management master file current and complete enables the award recipient to efficiently and effectively manage each award in a way that reduces risk and increases responsiveness to any changes in compliance with the terms and conditions of the individual award.

Tip #5: Build habits of communicating and using tracking tools from the start.

Whether you are managing one grant or one hundred federal awards, strong communication habits and tracking tools (whether manual processes or software solutions) will improve your grant management.

I advise you to encourage regular communication with each funding source’s program officer. It’s better to be proactive and contact the program officer with questions rather than risk the negative consequences of non-compliance with the grant requirements.

Awarded Projects for October 2016

College of Education
Awarded Projects
October 2016
Principal Investigator: Kent Crippen (STL)
Co-PI: Chang-Yu Wu (Engineering School of Sustainable Infrastructure & Environment), Maria Korolev (Chemistry), Philip Brucat (Chemistry)
Funding Agency: National Science Foundation
Project Title: ChANgE Chem Lab: Cognitive Apprenticeship for Engineers in Chem Lab
Project Period: 9/1/2016 – 8/31/2019
Award Amount: $599,333
Principal Investigator: Daniel Connaughton (Department of Recreation, Tourism & Sport Management)
Co-PI: M. David Miller (SHDOSE)
Funding Agency: Florida Department of Transportation
Project Title: Florida Safe Routes to School Non-Infrastructure Program
Project Period: 8/10/2016 – 6/30/2017
Award Amount: $23,515.75
Principal Investigator: M. David Miller (SHDOSE)
Co-PI: N/A
Funding Agency: U. S. Department of Veterans Affairs
Project Title: IPA for Christiana Akande
Project Period: 10/4/2016 – 3/31/2017
Award Amount: $10,861.52
Principal Investigator: Philip Poekert (Lastinger Center for Learning)
Co-PI: N/A
Funding Agency: Alachua County School Board
Project Title: Alachua Turnaround School Support
Project Period: 9/20/2016 – 6/30/2017
Award Amount: $90,200
Principal Investigator: Philip Poekert (Lastinger Center for Learning)
Co-PI: N/A
Funding Agency: Duval County School Board
Project Title: Duval County Faculty Literacy Professional Development
Project Period: 7/1/2016 – 6/30/2017
Award Amount: $74,415
Principal Investigator: Philip Poekert (Lastinger Center for Learning)
Co-PI: N/A
Funding Agency: Children’s Trust
Project Title: TCT Early Learning Coaching
Project Period: 10/1/2016 – 9/30/2017
Award Amount: $24,000

 

Submitted Projects for October 2016

College of Education
Submitted Projects
October 2016
Principal Investigator: Matthew Gurka (Institute for Child Health Policy)
Co-PI: Linda Behar-Horenstein (SHDOSE)
Funding Agency: National Institutes of Health
Proposal Title: OneFlorida Implementation Science in Cardiovascular Disease (OFIS-CVD) K12 Career Development Program
Requested Amount: $62,237
Principal Investigator: Herman Knopf (AZCEECS/SSESPECS)
Co-PI: N/A
Funding Agency: University of South Carolina (Subcontract – Administration for Children and Families Flow-Through)
Proposal Title: Child Care Accessibility Index: Leveraging SC Child Care Administrative Data to Inform State CCDBG Subsidy Policies
Requested Amount: $28,040
Principal Investigator: Ashley MacSuga-Gage (SSESPECS)
Co-PI: N/A
Funding Agency: University of South Florida (Subcontract – Florida Department of Education Flow-Through)
Proposal Title: Florida Positive Behavioral Support: Multi-Tiered System of Supports (FLPBIS-MTSS)
Requested Amount: $58,066
Principal Investigator: Janice Raup-Krieger (Advertising Department)
Co-PI: Anne Corinne Huggins-Manley (SHDOSE)
Funding Agency: National Institutes of Health
Proposal Title: Engaging Patients in Informed Decision-Making about Cancer Research Studies Using Electronic Health Records
Requested Amount: $95,939
Principal Investigator: M. David Miller (SHDOSE)
Co-PI: N/A
Funding Agency: U.S. Department of Veterans Affairs
Proposal Title: IPA for Christiana Akande
Requested Amount: $10,861
Principal Investigator: Donald Pemberton (Lastinger Center for Learning)
Co-PI: N/A
Funding Agency: Berkeley County School District
Proposal Title: Berkeley County Early Learning Courses and COP
Requested Amount: $94,688
Principal Investigator: Philip Poekert (Lastinger Center for Learning)
Co-PI: N/A
Funding Agency: Duval County School Board
Proposal Title: Duval County Whole Faculty Literacy Professional Development
Requested Amount: $74,415
Principal Investigator: Patricia Snyder (AZCEECS/SSESPECS)
Co-PI: N/A
Funding Agency: Florida Department of Health
Proposal Title: Increasing Social-Emotional Outcomes for Florida’s Early Steps Infants/Toddlers: Institutions of Higher Education Supporting the Three Model Demonstration Sites to Implement the Demonstration Site Implementation Plan
Requested Amount: $656,151