Summary

As virtual education and virtual jobs become the "new" reality, most professional development opportunities require technology skills. But how should we evaluate them? Typically, we measure the number of activities or contact hours offered, the number of participants, the increase in knowledge or skills, or participants' satisfaction. Unfortunately, what happens afterwards is not usually addressed. The follow-up question we are interested in answering is: to what extent have these activities succeeded in promoting or enhancing participants' meaningful uses of technology? For example, prompted by the COVID-19 pandemic, the Puerto Rico Department of Education (PRDE) implemented the DE-Innova project to train educators in various technological tools. To this end, a needs assessment was conducted, which was also used to evaluate the project's achievements. The results of this implementation presented an opportunity to reflect on the elements that should be contemplated when evaluating this type of project, considering the evolving nature of technology.

Background

When the COVID-19 pandemic began in 2020, work scenarios around the world changed in unexpected ways. In Puerto Rico, after a four-week shutdown, there was an abrupt shift from a traditional, face-to-face work culture to a remote work scenario. For educators, the challenge was even greater. Not only did they have to adapt their educational practices to virtual environments, but they had to deliver and assess learning with technological tools unfamiliar to them and their students. Faced with this situation, during the summer of 2020, the Puerto Rico Department of Education (PRDE) implemented the DE-Innova project to provide educators and students with technological equipment and adequate professional development opportunities in the use and integration of technology in their educational practice. The project consisted of the following:

Project theory

To design or adapt the assessment tool, we reviewed several tools that had been used previously in system-wide technology implementations. The most relevant were the Technological Pedagogical Content Knowledge (TPACK) framework, the Technology Integration Matrix (TIM), and the International Society for Technology in Education (ISTE) Standards for Educators. Our objective was to find indicators to assess and determine the perceived level of technology use and integration by educators at various levels of the public education system (grades kindergarten through twelve) for use in Puerto Rico. The following figure shows the framework proposed as part of the project.

To this end, Global Education Exchange Opportunities (GEEO) signed a collaboration agreement with the Florida Center for Instructional Technology at the University of South Florida to use its online platform to translate, adapt, and administer its Technology Uses and Perceptions Survey (TUPS), which is based on the Technology Integration Matrix (TIM). The TUPS is an assessment tool focused on educational practice within the classroom.
Educators' responses were analyzed to identify the level of use and integration of technology in the classroom in a way that fosters meaningful learning. Teaching with technology can take place at any of the five TIM levels. Critically, what triggers the progression from one level to the next is the autonomy teachers give students in using and integrating technology: from the teacher selecting the technology and focusing on teaching its use, to encouraging students to choose and use a variety of technologies on their own. Accordingly, the primary goal of the professional development activities was to help educators "move" to a higher level of technology integration in their professional practice.

Technology Uses and Perceptions Survey (TUPS)

Reliability of subscales

The TUPS evaluated thirty-two (32) technology tools. Participants reported their perceived usefulness and their level of skill in using each tool. Both scales included six alternatives: Not at all, Very little, Little, Moderate, High, and Very high. For ease of interpretation, the six alternatives were collapsed into three categories: Low (combining Not at all and Very little), Moderate (combining Little and Moderate), and High (combining High and Very high). The 32 tools were classified into five subscales:
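The collapsing of the six response alternatives into three reporting categories can be sketched as a simple mapping. This is an illustrative sketch only; the category names follow the text, and the function name and sample responses are hypothetical.

```python
# Hypothetical sketch: collapsing the six-point TUPS response scale
# into the three reporting categories described in the text.
COLLAPSE = {
    "Not at all": "Low",
    "Very little": "Low",
    "Little": "Moderate",
    "Moderate": "Moderate",
    "High": "High",
    "Very high": "High",
}

def collapse_responses(responses):
    """Map raw six-point responses to the three reporting categories."""
    return [COLLAPSE[r] for r in responses]

raw = ["Not at all", "Moderate", "Very high", "Little"]
print(collapse_responses(raw))  # ['Low', 'Moderate', 'High', 'Moderate']
```

Collapsing adjacent categories this way trades granularity for readability, which suits summary reporting of survey results.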

The thirty-two technology tools (hardware and software) were grouped according to Norman Webb's Depth of Knowledge (DOK) taxonomy and the TIM. The DOK is a framework that classifies tasks, cues, scenarios, and challenges into four levels of cognitive rigor. For this study, the tools were grouped into five subscales, ordered from those requiring the lowest to the highest teacher/student ability to create, apply, and synthesize.
A reliability analysis using Cronbach's alpha was conducted after grouping the tools into the five subscales (see figure above). The first subscale, Equipment-Hardware, included the tools available to teachers and students (α = .910). The second, Information Management, Analysis and Communication (α = .873), grouped the tools used to organize, calculate, analyze, and generate conclusions or predictions from data. The third, Internet Communication (α = .875), grouped tools that facilitate communication and collaboration. The fourth, Demonstration and Practice (α = .873), covered tools used to repeat or respond to cues in order to learn content or demonstrate mastery of a process. The fifth, Design and Creativity (α = .946), included tools that require the construction or visualization of information, giving students greater autonomy in learning and teachers greater creativity in designing it.

In summary, all five subscales had strong Cronbach's alpha scores above .80. This indicates strong internal consistency within each subscale, supporting conclusions about the perceived use of the tools in the teaching and learning process.

Results

Participation rates

As a result of the DE-Innova project, the percentage of participants who were classified at the Entry level decreased by 20.25%, while the percentage of participants at the Adoption/Adaptation and Infusion/Transformation levels increased by 5% and 15.26%, respectively. The majority of participants remained at the same TIM level after the DE-Innova project (57.07%), while 37.57% moved up at least one TIM level. Less than 6% dropped one or two TIM levels (5.35%).
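The movement between TIM levels reported above can be summarized by comparing each participant's pre- and post-project level. The sketch below is illustrative: the function name and the sample data are hypothetical, while the five level names follow the TIM.

```python
# Hypothetical sketch: classifying pre/post TIM level pairs into
# moved-up, stayed, and moved-down percentages, as reported in the text.
TIM_LEVELS = ["Entry", "Adoption", "Adaptation", "Infusion", "Transformation"]
RANK = {level: i for i, level in enumerate(TIM_LEVELS)}

def transition_summary(pairs):
    """Return the percentage of participants who moved up, stayed, or dropped."""
    up = sum(1 for pre, post in pairs if RANK[post] > RANK[pre])
    same = sum(1 for pre, post in pairs if RANK[post] == RANK[pre])
    down = len(pairs) - up - same
    n = len(pairs)
    return {"up": 100 * up / n, "same": 100 * same / n, "down": 100 * down / n}

# Made-up sample of (pre, post) levels for four participants.
sample = [("Entry", "Adoption"), ("Adoption", "Adoption"),
          ("Infusion", "Entry"), ("Entry", "Infusion")]
print(transition_summary(sample))  # {'up': 50.0, 'same': 25.0, 'down': 25.0}
```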

Participants in the DE-Innova project indicated that as their skills with different technological tools increased, they tended to perceive most of the tools as less useful (an inverse relationship). The exception was the tools in the Internet Communication subscale, which showed an increase in both skill levels and perceived usefulness. This is likely because classes had to be taught online, and teachers and students relied on Internet communication programs to teach during the COVID-19 pandemic.
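The inverse relationship described above would show up as a negative correlation between skill and perceived usefulness. A minimal sketch with invented data (not the study's actual responses):

```python
import numpy as np

# Illustrative only: a negative Pearson correlation between skill and
# perceived usefulness indicates the inverse relationship described above.
skill      = np.array([1, 2, 3, 4, 5, 6])  # reported skill level (1-6)
usefulness = np.array([6, 5, 5, 3, 2, 1])  # perceived usefulness (1-6)

r = np.corrcoef(skill, usefulness)[0, 1]
print(f"Pearson r = {r:.2f}")  # negative r -> inverse relationship
```

For ordinal survey scales like these, a rank-based coefficient (e.g., Spearman's rho) would be an equally reasonable choice.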

Conclusions

The adaptation of the TUPS and the subsequent creation of subscales for the thirty-two technology tools was effective in identifying both the baseline and the impact of implementation on teachers' perceived uses of technology, as well as their perception of its use by the students in the sample. This goes beyond assessing participant satisfaction and knowledge gains following professional development. The subscales provided information on participants' intended uses of technology in their academic work environment, and a way to meaningfully assess how new or improved skills will be used. The results also revealed gaps useful for planning future professional learning experiences for teachers and their students.

Considerations for evaluating technology-related professional development opportunities

The Puerto Rico Department of Education project provided GEEO with the opportunity to implement a robust assessment of technology-related professional development in an educational setting. However, this approach can be adapted to any professional setting where technology-related professional development opportunities are available. Among the variables or aspects that should be included in the evaluation of this type of activity are the following:

References

Florida Center for Instructional Technology (FCIT). (2019). The Technology Integration Matrix (TIM).

Global Education Exchange Opportunities, Inc. (2019). Framework: Technology use and integration in the field of education.

Global Education Exchange Opportunities, Inc. (2021). DE-INNOVA - GEEO | Global Education Exchange Opportunities. https://www.geeopr.com/de-innova/#planindividual