
Research Spotlight: Zhihui Fang

Q & A with Zhihui Fang, Ph.D., Professor in the School of Teaching and Learning

What basic questions does your research seek to answer?

My research addresses three interrelated questions that I believe are of both theoretical and practical significance: (a) How are knowledge and value constructed through language across different academic disciplines? (b) What challenges do these ways of using language present to students in subject-area reading and writing? and (c) How can these challenges be addressed through a language-based pedagogy?

What makes your work interesting?

What makes my work interesting is its focus on the role language plays in teaching and learning. Although language is arguably the most powerful and creative resource for making meaning, it is, ironically, also the most taken-for-granted aspect of schooling. I have had a fascination with language since my middle school years, when I started to learn English as a foreign language. This interest grew during my undergraduate and graduate years, when I began my formal studies in linguistics and language in education. As a student, I used to wonder why school curriculum content had to be presented to us in textbooks that we found daunting and alienating. I learned later that each text students read or write has a purpose, and that this purpose is realized through language (and other semiotic) choices that configure in particular ways to achieve particular effects. So, from early on I have been motivated to find out how content experts use language to present knowledge, infuse points of view, and structure texts, as well as how students can be supported in disciplinary learning through a functional focus on language. My work in this area recognizes language as the hidden curriculum of schooling and responds to the challenges of developing advanced literacy, critical literacy, and disciplinary literacies among students who struggle with reading and writing, who are learning English as an additional language, or who have histories of school failure.

What are you currently working on?

I am currently working on three projects. In one project, I try to reconceptualize, from a functional linguistics perspective, three key constructs in the Common Core State Standards — text complexity, close reading, and disciplinary literacy — in an effort to make their classroom implementation more effective and empowering for teachers. In another project, my research team is examining adolescents’ use of academic language in informational writing, hoping to gain a better understanding of how access to and control over academic language impacts students’ reading/writing achievement and disciplinary learning. In the third project, my research team is studying disciplinary experts’ social practice (i.e., the daily workplace routines experts engage in), semiotic practice (i.e., how experts use language and other semiotic resources in disciplinary meaning-making), and cognitive practice (i.e., the mental routines and strategies employed by experts in disciplinary reading and writing), hoping to use the findings from the study to inform subsequent design and delivery of disciplinary literacy instruction in the K-12 setting.


Research Spotlight: John Kranzler

Q & A with John Kranzler, Ph.D., Professor in the School of Special Education, School Psychology, and Early Childhood Studies

What basic questions does your research seek to answer?

My recent empirical research has largely fallen within the evidence-based practice (EBP) movement within the field of school psychology, which aims to “identify, disseminate, and promote the adoption of practices with demonstrated research support” (Kratochwill, 2007, p. 829). The goal of the EBP movement is to improve the quality of professional services (e.g., diagnosis, intervention, and evaluation) delivered to children and youth, families, and schools. Of particular interest to me at the current time is the innovative approach to the identification of specific learning disabilities (SLD) known as the pattern of strengths and weaknesses (PSW) approach. PSW methods define SLD as unexpected underachievement and corresponding weakness in specific cognitive abilities. The PSW approach has already been adopted in 14 states for SLD identification (Maki, Floyd, & Roberson, 2015), despite the fact that substantiating scientific evidence is currently lacking. Thus, I have conducted several investigations of important postulates underlying the PSW approach. Below I describe two recent studies my colleagues and I have conducted to provide some description of my work.

One postulate of the PSW approach concerns the focus of IQ test interpretation. Proponents of the PSW approach contend that the focus of interpretation should not be on the overall score, but on the pattern of intra-individual strengths and weaknesses at the composite score level. For composite scores to warrant interpretation, they must demonstrate incremental validity. Incremental validity addresses the question of whether scores on a test increase the predictive validity of important external criteria over other scores on the same test or scores on other established measures. To examine this question, my colleagues and I used estimated factor scores from a bifactor analysis of the Wechsler Adult Intelligence Scale-Fourth Edition (WAIS-IV) to examine the unique effects of its latent variables on academic achievement. Results of this study only partially replicated the findings of previous research on the incremental validity of scores that can be derived from performance on the WAIS-IV. Although we found that psychometric g is the most important underlying construct measured by the WAIS-IV for the prediction of academic achievement in general, results indicated that only the unique effect of Verbal Comprehension was important, and only for certain academic outcomes. Results of this study, which was published in Psychological Assessment (Kranzler, Floyd, & Benson, 2015), question the utility of composite scores underlying the PSW approach.
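The incremental-validity question described above can be illustrated with a small sketch: does adding a second predictor meaningfully raise the variance explained in an outcome beyond what a general factor already accounts for? The data, factor names, and coefficients below are simulated assumptions for illustration only, not WAIS-IV results.

```python
import numpy as np

def r_squared(X, y):
    """R^2 from an ordinary least-squares fit with an intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1 - resid.var() / y.var()

# Simulated factors (hypothetical, loosely mirroring the constructs named above)
rng = np.random.default_rng(0)
n = 1000
g = rng.normal(size=n)        # general factor ("psychometric g")
verbal = rng.normal(size=n)   # unique verbal-comprehension factor
achievement = 0.7 * g + 0.2 * verbal + rng.normal(scale=0.7, size=n)

# Incremental validity: compare R^2 of nested prediction models
r2_g = r_squared(g.reshape(-1, 1), achievement)
r2_both = r_squared(np.column_stack([g, verbal]), achievement)
print(f"R^2 with g alone:  {r2_g:.3f}")
print(f"R^2 adding verbal: {r2_both:.3f}")
print(f"Delta R^2 (incremental validity): {r2_both - r2_g:.3f}")
```

In this sketch, a composite score earns interpretation only if its Delta R^2 over the general factor is more than trivial, which is the standard the study applies to the WAIS-IV composites.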

Valid identification of SLD using PSW methods requires diagnostic criteria that reliably distinguish children and adolescents who have this disability from those who do not. We examined the diagnostic accuracy of the Cross-Battery Assessment (XBA) PSW approach to identifying SLD. To examine this postulate, we conducted a classification agreement analysis using the Woodcock-Johnson III Tests of Cognitive Abilities and Achievement. We examined the broad cognitive abilities of the Cattell-Horn-Carroll theory held to be meaningfully related to basic reading, reading comprehension, mathematics calculation, and mathematics reasoning across age. Results of analyses of 300 participants in three age groups (6-8, 9-13, and 14-19 years) indicated that the XBA method is very reliable and accurate in detecting true negatives. Results of classification agreement analyses were generally quite low, however, indicating that this method is very poor at detecting true positives. Mean sensitivity and positive predictive value were 21% and 34%, respectively, across all broad cognitive abilities and academic domains. In sum, results of this study do not support the use of the XBA method for identifying SLD. Results of this study, as well as a reply to commentary on our article by PSW proponents, are in press in the International Journal of School & Educational Psychology.
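The classification agreement metrics reported above can be made concrete with a short sketch. The counts in the 2x2 table below are invented for illustration (chosen only to mirror the reported pattern of high specificity with low sensitivity); they are not taken from the study.

```python
def classification_agreement(tp, fp, tn, fn):
    """Sensitivity, specificity, and positive predictive value from a 2x2 table."""
    sensitivity = tp / (tp + fn)   # share of true cases the method flags
    specificity = tn / (tn + fp)   # share of non-cases the method correctly passes over
    ppv = tp / (tp + fp)           # share of flagged cases that are true cases
    return sensitivity, specificity, ppv

# Hypothetical counts: a method can detect true negatives very well (high
# specificity) while still missing most true positives (low sensitivity).
sens, spec, ppv = classification_agreement(tp=6, fp=12, tn=250, fn=22)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}, ppv={ppv:.2f}")
```

With these invented counts, sensitivity is about 0.21 and positive predictive value about 0.33, which shows how a method can look "accurate" overall while failing at the identification task that matters.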

What makes your work interesting?

My primary area of scholarly interest concerns the nature, development, and assessment of intelligence (IQ). Standardized IQ tests have been called psychology’s greatest contribution to society. The overall score on these tests is a better predictor of achievement in school or college, military training programs, and employment in business and industry than any other combination of variables independent of IQ. The interpretation and use of IQ tests has long been surrounded by controversy, however. Indeed, IQ tests have been used to admit, advance, and employ, but also to deny, track, and institutionalize. Much of my work in recent years has concerned investigating the validity of innovative practices involving the interpretation and use of the results of IQ tests.

What are you currently working on?

My future research agenda involves extension of research on the PSW methods, SLD identification, and valid interpretation of IQ tests. I recently received IRB approval for a study on the cognitive ability profiles of children and youth identified as SLD in a response-to-intervention model. In addition to empirical research, I recently co-authored a textbook on intellectual assessment titled, Assessing Intelligence in Children and Adolescents: A Practical Guide. Our aim in writing this book was to address the need for an updated, evidence-based, user-friendly resource to meet the training needs of students and practitioners. I also guest edited a special issue of the International Journal of School & Educational Psychology on current practices and future trends in the intellectual assessment of children and youth around the world, which will be published this fall.

Notice to Investigators: Mandatory Conversion of Paper Studies to myIRB

Since February of 2016, the IRB has allowed investigators to avoid converting certain types of active studies to myIRB. However, the recently revised Common Rule that goes into effect on January 19, 2018 will result in many changes that UF will need to implement.

As a result, all studies currently in the IRB’s legacy system (i.e., in paper) must either be converted to myIRB or be closed by December 31, 2017.  If we have not heard from you or your study staff regarding your intentions for your active study by December 1, 2017, the IRB will proceed to close any remaining “paper” studies.  The IRB will contact the PI before closing a study when there is a possibility of currently active study subjects.

For more information regarding the conversion process, please see the website:

http://irb.ufl.edu/irb01/forms/converting-paper-studies-to-myirb.html

If you have any further questions, call the IRB-01 Office (352) 273-9600.

PI PROCESS FOR CONVERTING EXISTING PAPER STUDIES TO myIRB

  • Investigators will create a new study in myIRB.
  • After creating the new study, on the myIRB Legacy Conversion smartform page, investigators MUST:
    • Indicate the previous IRB #
    • Attach their completed Continuing Review Report for the paper study to include:
      • The appropriate Continuing Review Report: Continuing Review or Continuing Review for TISSUE/DATA banks
        • Please keep in mind that for many regulatory reasons, if the paper study also contains a local tissue or data bank, these must be “uncoupled,” meaning that both the primary study and the tissue and/or data bank must be submitted in myIRB as two separate studies.
      • Copy of the last signed ICF (with the LAR and participant’s name blacked out)
      • Cumulative Adverse Event Table
      • Cumulative Deviation Tracking Table
      • Publications
      • Current ancillary reviews for RAC, HURRC, or COI as applicable
  • Submit the new study for review.
  • Notifications will be sent to the PI (and Coordinator) via email, if changes are needed.
  • Once approval documents are generated, the approval letter will indicate approval for the myIRB study AND indicate that the paper study has been converted. All future submissions MUST be submitted within myIRB.

*NOTE:
REVISIONS TO THE PAPER STUDY WILL NOT BE ACCEPTED DURING THE CONVERSION PROCESS.

FAQs

  • The study will be reviewed as a new study; IRB reviewers will have access to supporting documentation from previous paper IRB submissions.
  • Plan ahead to make sure all study staff are registered in myIRB, have completed the mandatory IRB training, and are up to date on HIPAA for Researchers training.
  • The myIRB approval letter will indicate the new status for the paper study as “Converted.”
  • VA studies: Plan 3 months ahead to begin the VA Paperwork that will be required to be uploaded prior to submitting your new conversion study.
  • If your study included the option for subjects to indicate if they wanted to be contacted for future research (i.e., noted by the IRB to be a “piggy bank”), this will need to be a separate banking study for research (Registry or Data Bank). Please call the IRB office for assistance.
  • Deadlines will still apply for full Board studies.

If you have any further questions, call the IRB-01 Office (352) 273-9600

National Science Foundation Two Months Salary Language

When it is anticipated that any senior personnel will exceed two months of total salary from all National Science Foundation (NSF) funded work, it is a best practice to include the language provided below in all NSF budget justifications.

The proposed salary for [insert name], in combination with other current NSF support, exceeds the two-month limit for senior personnel. The proposed level of commitment for this proposal is appropriate for the scope of work and is required in order to fulfill the objectives of this project within the proposed timeframe.

NIH Offers Seminars on Program Funding and Grants Administration

The National Institutes of Health (NIH) Regional Seminar on Program Funding and Grants Administration provides an unparalleled opportunity for you to gain a better perspective of NIH policies and programs, network with peers, gather helpful NIH contacts, and return to your office with useful information, resources, and tools to assist in obtaining and managing NIH awards.

Registration is now open for the following 2017 NIH Regional Seminar Locations:

May 3-5: New Orleans, LA (general registration rates end Friday, April 7)

October 25-27: Baltimore, MD

Workshops are also filling fast, so don’t delay in registering if this is the seminar for you.

In a nutshell, here is why you should plan on attending and reserve your seat today.

Who attends? Over 600 new investigators, research administrators, grant writers, and others working with NIH grants and contracts will be in attendance from almost every state and numerous countries around the globe. The seminar is designed for those new to working with NIH grants. (This means session questions and discussions from those with roles similar to yours.)

Who presents? Over 65 presenters from 16 different Institutes and Centers are presenting and making themselves available to meet individually with you during our 1:1 Meet the Expert opportunities.  Get your questions answered by NIH officials representing program, grants management, review, and policy.  From HHS, hear experts from the Office of Human Research Protections (OHRP), Office of the Inspector General (OIG), and the Office of Research Integrity (ORI). (See the New Orleans online Faculty Page for details. Baltimore will have additional staff available due to the proximity to NIH offices.)

What are the topics? Over 45 different topics during the two-day seminar with unique tracks specifically for Administrators and New Investigators, as well as an All Interests track with topics of wide appeal. Review the agenda and list of session descriptions today on each location’s website.  But that’s not all!  Check out the Optional Pre-Seminar Workshops on topics such as electronic Research Administration (eRA), human subject research protections, intellectual property & iEdison, and a new administrators “boot camp.”

Find more details, such as agendas, session descriptions, faculty information, hotel discounts, and so much more, on each location’s website: New Orleans, LA and Baltimore, MD.

Check out this snapshot of topics:

  • Budget Basics for Administrators and Investigators
  • Career Development Awards
  • Clinical Trials
  • Compliance (Case Studies)
  • Current Issues at NIH
  • Diversity in the Extramural Research Workplace
  • electronic Research Administration (eRA)
  • Financial Conflict of Interest
  • Fundamentals of the NIH Grants Process
  • Grant Writing for Success
  • Human Research Protections
  • Loan Repayment Program
  • Intellectual Property, Inventions, and Patents
  • Office of Laboratory Animal Welfare (OLAW)
  • Peer Review Process
  • Preventing & Detecting Fraud
  • Public Access
  • SciENcv
  • R&D Contracts
  • Research Integrity
  • Rigor & Reproducibility
  • Training/Fellowships
  • SBIR/STTR Program
  • ….and that’s not all!

Apply for a Two-Week Summer Workshop on Quasi-Experimental Design and Analysis

The National Center for Education Research (NCER), in conjunction with the Institute for Policy Research (IPR) at Northwestern University, announces a new two-week training workshop on quasi-experimental design and analysis.

All applications are due by April 20, 2017 and decisions will be made by May 1, 2017.

Dates: July 31-August 11, 2017
Location: Northwestern University, Evanston, IL

Through this two-week workshop, attendees will learn state-of-the-art quasi-experimental methods for evaluating education interventions. Participants will conduct hands-on analyses of data collected from quasi-experimental designs. Participants in prior quasi-experimental workshops that did not include hands-on training are encouraged to apply. The format will consist of lectures, small group discussions, individual consultations, and small-group practical exercises.

For workshop details, click here.

Application materials are available online at the Institute for Policy Research web site.

For questions about the 2017 Quasi-Experimental Design and Analysis workshop, please contact Rebecca Morris.

For questions about the IES grant supporting this training, please contact Dr. Phill Gagné.

NCSER Announces the 2017 Summer Research Training Institute on SMART Designs

Applications are being accepted for a Summer Research Training Institute on sequential multiple assignment randomized trials (SMARTs) used in the development and evaluation of adaptive interventions.

All applications must be received no later than Monday, April 24, at 11:59 p.m. EST.

What: Summer Research Training Institute on SMART Designs
When: June 14-16, 2017 (full-day on June 14 and 15, half-day on June 16)
Where: Washington, D.C.

Interventions in educational settings often require an individualized approach in which the intervention to be delivered is adapted and re-adapted over time in response to the evolving needs of the individual. Adaptive interventions provide one way to operationalize this approach.

The goal of this Institute is to increase the national capacity of education researchers to conduct novel and methodologically rigorous trials for developing and evaluating adaptive interventions. This training is funded by the National Center for Special Education Research (NCSER) in the Institute of Education Sciences.

For general information about the Training Institute please click here.

For additional information and application procedures for the Training Institute, please email National Capitol Contracting (NCC), the NCSER contractor coordinating this training, at SMARTTraining@nccsite.com.

Save the Date: 2017 NCES STATS-DC Data Conference

The National Center for Education Statistics (NCES) will host the 2017 NCES STATS-DC Data Conference on August 1 – August 3, 2017 in Washington, D.C.

The Data Conference welcomes proposals for presentations about Common Core of Data (CCD), data collection, data linking beyond K-12, data management, data privacy, data quality, data standards, data use (analytical), data use (instructional), fiscal data, and Statewide Longitudinal Data Systems (SLDS). Those interested are encouraged to submit a proposal to present a session once registration opens.

More information, including hotel and registration information, will be coming soon!

IES Releases Education Technology Compendium

Education technology supports teaching and learning for students at all grade levels and across various subjects. Since 2002, the Institute of Education Sciences (IES) has funded over 400 projects focused on education technology.

A new compendium is now available and provides information about current and completed education technology projects funded by IES’s two research centers—the National Center for Education Research (NCER) and the National Center for Special Education Research (NCSER). The compendium is designed to provide information on completed and current projects in an easily accessible and usable format.

With IES funding, researchers have developed or studied technologies for classroom, school, and education research purposes, including more than 270 web-based tools, 85 virtual environments and interactive simulations, 95 intelligent tutor and artificial intelligence software systems, 50 game-based tools, and 105 computer-based assessments.

For additional information, please view A Compendium of Education Technology Research Funded by NCER and NCSER: 2002-2014.

REL Midwest Offers Lessons Learned for Working in Collaborative Research Partnerships

Two new reports from Regional Educational Laboratory (REL) Midwest describe lessons learned from its efforts to work with states, districts, and other practitioners on identifying and addressing educational challenges.

Reflections from a Professional Learning Community for Researchers Working in Research Alliances

Conducting collaborative research is challenging — especially for researchers who have never partnered with practitioners to conduct research. To address these challenges, REL Midwest formed a professional learning community for its researchers. The researchers found that the structure provided a safe space to discuss the strengths and weaknesses of their traditional research training and experiences. It also allowed them to work together to solve challenges related to the collaborative research approach. This report is a reflective piece that summarizes the REL Midwest professional learning community’s lessons learned and describes how its members worked to align available resources to the specific needs of the researchers and developed tools to help one another resolve challenges.

Establishing and Sustaining Networked Improvement Communities: Lessons from Michigan and Minnesota

Networked improvement communities are a relatively new type of collaborative research partnership between researchers and educators. With facilitation from researchers, educators identify problems of practice, the factors that drive the problems, and promising solutions. They then engage in iterative cycles of designing, implementing, testing, and redesigning solutions, while learning from variation across the settings in the networked improvement community. This report shares REL Midwest’s lessons learned from working with educators in Michigan and Minnesota to establish and sustain networked improvement communities, and offers guidance to researchers and educators as they form networked improvement communities in different contexts.

Awarded Projects March 2017

College of Education
Principal Investigator: Herman Knopf (AZCEECS/SSESPECS)
Co-PI: N/A
Funding Agency: University of South Carolina (Subcontract – NIH Flow Through)
Project Title: Child Care Accessibility Index: Leveraging SC Child Care Administrative Data to Inform State CCDBG Subsidy Policies
Project Period: 9/3/2016 – 9/29/2017
Award Amount: $21,265.99

Principal Investigator: Donald Pemberton (Lastinger Center for Learning)
Co-PI: N/A
Funding Agency: Study Edge
Project Title: Support of Algebra Nation Michigan 2016-17
Project Period: 1/1/2017 – 9/30/2017
Award Amount: $10,000.00

 

Submitted Projects March 2017

College of Education
Principal Investigator: Susan Butler (STL)
Co-PI: Philip Poekert (Lastinger Center for Learning), Nancy Ruzycki (Materials Science and Engineering)
Funding Agency: National Science Foundation
Proposal Title: FACTOR: Fostering a Community of Learners to Support Computational Thinking
Requested Amount: $2,491,540

Principal Investigator: Nicholas Gage (SSESPECS)
Co-PI: Ashley MacSuga-Gage (SSESPECS), Joni Splett (SSESPECS), Mary Kristina DePue (SHDOSE)
Funding Agency: US Department of Justice/National Institute of Justice
Proposal Title: Project PASS: A Randomized Controlled Trial of Positive Alternatives to School Suspension
Requested Amount: $2,961,253

Principal Investigator: Dorothy Espelage (Department of Psychology)
Co-PI: Walter Leite (SHDOSE), Philip Poekert (Lastinger Center for Learning)
Funding Agency: US Department of Justice/National Institute of Justice
Proposal Title: Enhancing School Resource Officers’ Effectiveness through Online Professional and Job Embedded Coaching
Requested Amount: $499,431

Principal Investigator: Philip Poekert (Lastinger Center for Learning)
Co-PI: N/A
Funding Agency: Heartland Educational Consortium
Proposal Title: Heartland Certified Coaching
Requested Amount: $183,600

Principal Investigator: Philip Poekert (Lastinger Center for Learning)
Co-PI: N/A
Funding Agency: Miami-Dade County Public Schools
Proposal Title: Miami-Dade Early Learning Coaching Certification Program
Requested Amount: $23,100

Principal Investigator: Philip Poekert (Lastinger Center for Learning)
Co-PI: N/A
Funding Agency: National Career Academy Coalition
Proposal Title: Career Academy Leaders’ Collaborative…an institute for transformational leadership development
Requested Amount: $16,500

Principal Investigator: Philip Poekert (Lastinger Center for Learning)
Co-PI: N/A
Funding Agency: Episcopal Children’s Services
Proposal Title: CoachJAX! Advanced Coaching Academy
Requested Amount: $8,500

Principal Investigator: Stephen Smith (SSESPECS)
Co-PI: Nancy Corbett (SSESPECS)
Funding Agency: U.S. Department of Education/OSEP
Proposal Title: Personnel Development to Improve Services for Children With Disabilities – Interdisciplinary Preparation in Special Education – Serving Children Who Have High-Intensity Needs (EBD Prep)
Requested Amount: $996,764

Principal Investigator: Joni Splett (SSESPECS)
Co-PI: N/A
Funding Agency: Society for the Study of School Psychology
Proposal Title: A Mixed Methods Comparison of Universal Screening and School Referral
Requested Amount: $19,919