Int J Technol Des Educ (2014) 24:107–120 DOI 10.1007/s10798-013-9241-0
Connecting the STEM dots: measuring the effect of an integrated engineering design intervention Paul R. Hernandez • Ralph Bodin • Jonathan W. Elliott • Badaruddin Ibrahim • Karen E. Rambo-Hernandez • Thomas W. Chen • Michael A. de Miranda
Published online: 5 April 2013. Springer Science+Business Media Dordrecht 2013
Abstract Recent publications have elevated the priority of increasing the integration of Science, Technology, Engineering, and Mathematics (STEM) content in K-12 education. The STEM education community must invest in the development of valid and reliable scales to measure STEM content, knowledge fusion, and perceptions of the nature of STEM. This brief report discusses the development of an instrument to measure student perceptions of the interdependent nature of STEM content knowledge in the context of a complex classroom intervention implemented in five Colorado high schools (N = 275). Specifically, cross-functional teams of high school students drawn from science, technology, engineering, and mathematics classes were formed to complete engineering design problems. Exploratory (pretest) and confirmatory (posttest) factor analyses indicated that a newly adapted scale to measure student perceptions of the interdependent nature of STEM content knowledge possessed adequate model fit. Furthermore, analysis revealed a novel pattern of results for the intervention. Specifically, students with initially high perceptions of the interdependent nature of STEM sustained their high perceptions at posttest; however, students with initially low perceptions exhibited statistically significant positive gains from pretest to posttest. Therefore, this intervention may work best with students who are at risk of losing interest in STEM disciplines. The implications of these research findings are discussed.

Keywords Integrated STEM education · Field study · Intervention · Engineering design problem
P. R. Hernandez (&), R. Bodin, J. W. Elliott, K. E. Rambo-Hernandez: School of Education, Colorado State University, Fort Collins, CO, USA; e-mail: [email protected]
B. Ibrahim: University Tun Hussein Onn Malaysia, Batu Pahat, Malaysia
T. W. Chen, M. A. de Miranda: College of Engineering, Colorado State University, Fort Collins, CO, USA
Introduction

There is growing concern that the United States of America is not preparing a sufficient number of students, teachers, and practitioners in the areas of science, technology, engineering, and mathematics (STEM). STEM is an acronym for the study or professional practice of the broad areas of science, technology, engineering, and mathematics. This educational concern is driven by the large majority of secondary school students who fail to reach proficiency in mathematics and science. For most, STEM education means only science and mathematics, even though the products of technology and engineering have so greatly influenced everyday life. A true STEM education should increase students' cognitive synthesis of the co-dependent nature of science and mathematics that underlies advances in engineering and technology. Furthermore, an integrated approach to STEM education naturally requires authentic experiences to study how things work and how technologies are created. Technology and engineering in STEM education are directly involved in problem solving, innovation, and design: three themes with high priority on every nation's agenda. Given its economic importance to society, students should learn about engineering and technology within a STEM context and exercise the skills and abilities associated with the design process. The significance of engineering and technology in the USA has risen to the point that the National Assessment Governing Board recently approved the evaluation of technology and engineering education through examinations that will be given to US students in 2014. Likewise, the draft Framework for Science Education released by the US National Academies includes technology and engineering among four targeted disciplines.
From a pedagogical perspective, the opportunity students have to learn content and practices is a critical indicator that has been shown in numerous studies to be related to achievement (Schmidt and Maier 2009; Schmidt and McKnight 2012). Content coverage (or opportunity to learn) is defined in three ways: (a) the extent of coverage; (b) the amount of time devoted to content; and (c) the order of coverage. Because it is central to academic achievement, coverage provides an intermediate indicator related to the quality of schooling. In fact, in many countries, content coverage is one of the important judgments usually made by inspectors. Therefore, the current national emphasis on STEM education has prompted researchers and practitioners in these fields to focus on three critical areas: (a) review of our current mathematics, science, and engineering curricula; (b) our pedagogical practices and utilization of existing and emerging technology; and (c) our ability to design, integrate, and assess exemplary learning environments (National Research Council [NRC] 2012). As asserted in a recent report, engineering design projects can serve as a catalyst for integrating learning across STEM disciplines (Katehi et al. 2009). Thus, one of the primary goals of the current study was to use an engineering design task to increase student perceptions of the interrelated nature of STEM content knowledge. However, a critical precursor to testing the intervention effects was the ability to measure perceptions of co-dependent STEM content knowledge reliably. Therefore, this brief report focuses on an initial scale development and the results of an intervention designed to change student perceptions of the co-dependent nature of STEM content. In this interim research report, we first provide an overview of the current framework for integrating STEM curriculum in K-12 education. Further, we describe a conceptual framework grounding STEM education in the cognitive sciences.
Finally, we describe the current intervention as well as the development of a scale for measuring STEM content connections.
Framework for integrating STEM curriculum

It is regrettable that the vision for STEM curricular integration in the USA is not being framed around the Standards for technological literacy: Content and contexts for the study of technology (STL; ITEA 2000). The current age of the STL, and the failure of the national technology education leadership and community to advance an inclusive and transformative agenda for technology education, is evidence that there is little appetite to engage in a revision or updating of the STL. This has left the technology education community looking to strengthen its position within an integrated STEM perspective. This movement has included the adoption of engineering principles and practices within technology education. While from an international perspective this may seem far removed from traditional technology education curriculum, the tenets of technology education remain an important part of the STEM education movement. For example, the recent release of the National Research Council (NRC) recommendations for K-12 science education, titled A framework for K-12 science education: Practices, crosscutting concepts, and core ideas, marks a significant shift in the core concepts that will guide STEM education in the coming years (NRC 2012). The NRC recommendations propose a framework that articulates a broad set of expectations for students in science. Within this framework, grades K-12 science education needs to integrate three new dimensions into science standards, curriculum, instruction, and assessment. These dimensions include: (a) scientific and engineering practices; (b) crosscutting concepts that unify the study of science and engineering through their common application across fields; and (c) core ideas in disciplinary areas such as physical sciences, life sciences, earth and space sciences, engineering, mathematics, technology, and applications of science (NRC 2012).
The setting for the current study captures the complex classroom learning interactions envisioned in this framework, in which students actively engage in scientific and engineering practices to deepen their understanding of crosscutting concepts and disciplinary core ideas. The authors acknowledge that there is no single approach that defines how to integrate the three dimensions into curriculum, instruction, and assessment. However, well-researched approaches to learning and instruction from the cognitive sciences provide a strong foundation for grounding exploratory investigation within this learning context.

Conceptual framework for teaching and learning of STEM in the cognitive sciences

Realizing the vision of integrating the three dimensions will require new conceptualizations of teaching, learning, and instruction grounded in the cognitive sciences and rigorous methodological approaches to measuring student learning. Bruer (1993) argued that "the National Assessment of Educational Progress (NAEP) results indicate that current curricula, teaching methods and instructional materials successfully impart facts and rote skills to most students but fail to impart high-order reasoning and learning skills" (p. 5). This statement continues to resonate two decades later. Some researchers have explored transforming the classroom from "work sites where students perform assigned tasks under management of teachers into communities of learning and interpretation, where students are given significant opportunity to take charge of their own learning… attempting to engineer an innovative educational environment" (Brown 1992, p. 141). One approach to this transformation process can be addressed through the adoption of well-researched approaches to learning and instruction grounded in the cognitive sciences.
Engineering and technology education core concepts and practices advocated in the new K-12 science education framework are remarkably consonant with findings from cognitive science that define good instruction (de Miranda 2004). The foundations of cognitively based models hold three elements of learning and instruction common across the various instructional strategies (de Miranda 2004): (a) students learn to engage actively with the learning process and content; (b) through the instructional design, students learn to reflect on and connect existing structures of knowledge to guide and further their learning; and (c) students learn to interact in classrooms or communities of learning where knowledge and information are shared openly in an environment that values participation and interaction between students, teachers, and sources of knowledge outside the classroom. The application of the above principles results in classroom environments that engender design experiments, collaborative learning experiences, socially distributed expertise across teams of students, and project-based learning (de Miranda 2004). The engineering component of STEM education puts emphasis on the process and design of solutions instead of the solutions themselves. This approach allows students to explore mathematics and science in more authentic contexts while helping them to develop critical thinking skills that can be applied to many facets of their work and academic lives. The American Society for Engineering Education (ASEE) has continually emphasized the notion that engineering design can be a pedagogical strategy to promote learning across disciplines (Lantz 2009). Therefore, the organization of the classroom intervention focused on supporting four key elements of cognitively based instruction: multi-disciplinary student design teams, shared expert knowledge, student-led design and engineering tasks, and authentic scientific and engineering practice (de Miranda 2004).
In addition, when instruction is organized without classroom or school boundaries to promote cross-disciplinary interaction, critical debate, design solutions, and problem focus, the ethos of the learning environment is fundamentally changed from knowledge transmission sites to sites of active discourse, questioning, experimentation, risk-taking, and meta-cognition (de Miranda 2004).

Design of the intervention

The intervention was modeled to reflect the authentic practice of a research program funded by the National Science Foundation. The research is centered on the design of a nanoscale silicon biosensor microchip that will measure specific chemicals in the brain, in an attempt to determine which chemicals are key to our brain development. This work would allow scientists to study which chemicals direct neurons to move during development. The research and design of this biosensor requires the knowledge, skills, and cooperation of professors and graduate student fellows in biology, chemistry, computer science, electrical and computer engineering, and mathematics. To design a complex classroom intervention that emulates how interdisciplinary STEM is practiced, teachers from mathematics, science, and engineering and technology education were recruited in teams of three to participate in this study. The teachers met during a summer workshop to co-plan the STEM content and to familiarize one another with the design problem that students would be assigned. The teachers planned an instructional calendar and shared the discipline-specific content each would cover, aligned directly with the STEM knowledge and practices required to design, build, and test a design solution. Teachers met monthly with the research team to debrief and discuss the management of the design activity, which would extend for 5 months and culminate in a student scientific and engineering research symposium where design solutions and research posters would be presented.
Organization and management of the intervention

Participating teachers each selected a class from the courses they teach to participate in the project. This ensured that mathematics, science, and engineering and technology students would be represented on each engineering design team. First, small student teams (N = 4–6) were formed, each consisting of a "Junior Fellow" (a high school student team leader) and students from science, technology, engineering, and mathematics classes. The junior fellows' responsibility was to guide/lead student teams from each of the content classes and to provide content expertise to their assigned team. The junior fellow and teacher were supported in their content knowledge by a university graduate student assigned to the classroom to serve as a content knowledge expert (hereafter referred to as a graduate student fellow). This organizational model required that the junior fellow collaborate with the content teacher, each of the classes, the graduate student fellow, and their peers from the other content areas. The teachers' responsibilities involved collaboration with the other content teachers, the junior fellows, and graduate student fellows, as well as providing necessary resources and guidance for the students in the STEM content areas. Students were responsible to their team members for providing a level of expertise and inquiry in their designated or assumed roles within the design team. Second, students were situated in expert roles as engineers (mechanical and electrical), scientists, and applied mathematicians within each team, and each individually contributed to the whole knowledge possessed by the group. This intervention design element is consonant with the research on situated cognition in the classroom. For example, the mathematics students provided expert knowledge on data collection and analysis organization, as well as on communicating the results of product and component performance data.
The physics students provided expertise in their roles as electrical engineers and circuit designers. The mechanical engineering students provided expert knowledge on product design, material selection, production processes, and design-for-function analysis. Expert knowledge was shared within and between teams through the student content experts. Third, we sought to engage students in authentic scientific and engineering practice. For example, in the iPod docking station design challenge, student teams were required to integrate four components, which served as separate design challenges: an amplifier circuit, a printed circuit board, an amplifier box/container, and a power supply. Each component was designed, engineered, built, tested, and optimized as part of their course content. This included identifying and understanding the interconnectedness of the various components within the design, development, and testing protocol, as well as connections to the content curriculum for their class. For example, the typical physics units of power, electricity, and magnetism were connected to the theoretical and practical design of engineering a power supply, testing its output and waveform characteristics, and ensuring safety with electronics and electricity. Fourth, students took the lead in developing their design solution and the process for assessing the progress of the project. Students had multiple interrelated design issues to address during the project (e.g., the power system, op-amp circuit design and optimization for desired performance [acoustic] characteristics, printed circuit board design and etching, and the actual speaker box/container design). Team members engaged in the process of debugging, troubleshooting, and innovating as they moved through the engineering design process. Students kept individual engineering notebooks as an artifact to track their progress, identify design problems, and track the evolution of potential solutions.
The engineering notebooks, considered the records of what the team had decided, provided a stopgap measure to ensure teams did not repeat previous iterations and helped to keep them moving forward. Finally, the classroom environment/ethos plays an important role in student learning and in teacher/student collaboration. Informing this importance are research findings suggesting that the personal relations between class members help predict learning outcomes and that there is a relationship between positive environments and positive learning outcomes (Walberg and Anderson 1968; Haertel et al. 1981). Additionally, it has been reported that "teachers can promote deep approaches to learning through the creation of learning environments that students perceive as safe, supportive, and that offer helpful relationships" (Dart et al. 2000, p. 269). These studies lend support to the significance of being able to create an environment in which the participants (teachers, students, and graduate student fellows) are able to collaborate, share, and learn from each other. A particular recommendation from learning and instruction informed by the cognitive sciences is that students engage in a whole and mature task (Glaser 1990). In this study, students engaged in the engineering design team problem for approximately 24 weeks. The culminating intervention required the teams, made up of science, mathematics, and engineering and technology students, to present their findings in a research poster symposium that mirrored scientific and engineering practice, shown in Fig. 1. In presenting their work through this poster session, students "cognitively mastered" what they had learned by communicating their work to other students, teachers, university faculty, and graduate students who attended the poster session.

Problem of measurement

The current study sought to develop an instrument that would be sensitive to changes in students connecting the STEM content "dots" as a result of participation in the intervention.
Fig. 1 Student team made up of members from math, physics, and engineering and technology

Interdisciplinary integration of mathematics curriculum has been a focus of investigation in numerous content areas, such as science and the humanities (Angel and LaLonde 1998; Elliott et al. 2001). Therefore, the authors identified a measurement instrument closely related to the goals of the current study. Math across the curriculum (MATC) is a teaching model developed to integrate mathematics content across humanities courses (Korey 2000, 2002). The MATC program created nine courses linking mathematics content with humanities disciplines. Researchers developed a multi-assessment instrument to measure the impact of the MATC program on student perceptions and beliefs and provided initial validity evidence for the instrument (Korey 2000). In the current study, researchers adapted items tapping perceptions of the interrelated nature of STEM content. The process of adaptation involved defining the target constructs (i.e., connections between mathematics, science, and engineering) and matching items to the facets of the underlying constructs. For this project, graduate students and a faculty member served as the content experts who reviewed the survey instructions and item content for clarity, non-redundancy, and relevance to the underlying constructs (Haynes et al. 1995; Netemeyer et al. 2003). Therefore, the adapted scale items were designed to assess student perceptions of the interrelated nature of content knowledge across the STEM fields. Having adapted an instrument to the needs of the complex classroom intervention, the specific goals of the current study were (a) to assess the initial evidence of the reliability and validity of the STEM connections instrument and (b) to investigate the effect of the intervention. We hypothesized that the measurement model for the instrument would exhibit a simple structure (i.e., a single factor). Furthermore, we hypothesized that students would perceive higher levels of connections between the STEM disciplines after participating in the intervention.
Method

Participants

The participants consisted of students from five area high schools enrolled in science, engineering and technology, or mathematics classes (N = 275). As shown in Table 1, the sample was predominantly male and White, although sample characteristics varied somewhat by high school campus. Further, student academic level ranged from 9th to 12th grade, and most of the students were engaged in this project through their science, mathematics, or engineering and technology classes.

Procedure

During the spring semester (2011) prior to the intervention implementation, the research team solicited teams of teachers (i.e., science, mathematics, engineering and technology education) to participate in the study. Teams were selected based on their willingness to implement a longitudinal engineering design project and their willingness to collaborate across school subjects. During the summer (i.e., prior to implementation of the intervention in the fall), teachers participated in a week-long workshop where the teacher teams were able to plan, brainstorm, and align their curricular units to coincide with the engineering design project. During the summer planning, each teacher team was introduced to their graduate student fellow, who would co-plan new content with them to meet the necessary prerequisite understandings and knowledge required by the complex engineering design problem.
Table 1 Characteristics of the student participants and the participating schools

[Table 1 not fully recoverable from the source text: for each of the five participating high schools (HS 1, N = 126; HS 2, N = 48; HS 3, N = 29; HS 4, N = 32; HS 5, N = 40), the table reports sample counts (n), sample percentages (%), and school population percentages (Pop. %) broken down by gender (Female), grade (9th–12th), ethnicity (White, African American, Hispanic/Latino, Native American, Asian/Pacific Islander, Prefer not to answer), economically disadvantaged status, limited English proficiency, and class (Mathematics, Science, Engineering/Tech.).]

Note: Total percentages are not 100 % for all of the characteristics due to rounding. HS #. = participating high school from which the sample was recruited; N = total sample size within a single school; n = sample size of subgroup within a school; % = percentage of sample of subgroup within a school; Pop. % = percentage for the entire school.
Early in the fall semester (2011) and prior to the intervention implementation, each student enrolled in the classes that formed the teacher/graduate student fellow teams (mathematics, physics, engineering and technology, etc.) was administered a pretest. All involved were required to adhere to the guidelines required under human subjects' approval (Office of Regulatory Compliance) and to complete an informed consent to participate in research form. Following consent to participate, each student was assigned by the teacher a unique research identification number that was unknown to the researchers. The intervention was then implemented; approximately 24 weeks after the intervention began, and after the final research poster symposium had been held, students were administered the posttest.

Measures

Perceptions of interconnected STEM content knowledge

Participants responded to a nine-item scale, adapted from Korey (2000), to measure student perceptions of the interconnections between mathematics, science, and engineering. Students responded to statements, such as "I have applied connections between mathematics, science, and engineering to help me solve problems outside of school", on a five-point Likert scale from 1 (Strongly Disagree) to 5 (Strongly Agree). See all item content in the "Appendix".
Results

Measurement model

Pretest data

First, an initial exploratory factor analysis (EFA) was conducted to determine the number of latent factors to extract from the nine-item pool. Multiple criteria, such as extracted communalities greater than 0.20, the Kaiser criterion (Hayton et al. 2004), Cattell's scree plot analysis (Pett et al. 2003), parallel analysis (Hayton et al. 2004), and the revised minimum average partial criterion (Velicer 1976; Velicer et al. 2000), were used to determine the number of factors (Henson and Roberts 2006). The criteria indicated that one item should be dropped due to extremely low communality (i.e., Q1 communality = 0.01), but all criteria indicated that a single factor should be extracted. Second, a final EFA was conducted with the remaining eight items (i.e., Q2–Q9) with extraction set to a single factor using principal axis factoring and oblique rotation. The model accounted for 39 % of the common variance among the items, and all items exhibited strong factor loadings. Next, a reliability analysis was conducted on the eight-item scale. The reliability analysis revealed that the internal consistency reliability was acceptable (Cronbach's alpha = 0.82). The scale mean was moderately high (N = 267, M = 3.56 out of 5.00), variability was relatively low (SD = 0.67), and the scores were normally distributed (i.e., Kolmogorov–Smirnov z = 0.05, p = 0.20). Thus, we interpret a low score on the STEM connections scale as indicating that a student did not perceive strong connections between math, science, and engineering, whereas a high score indicates the opposite.
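As an illustration of two of the criteria above, a minimal NumPy sketch of Horn's parallel analysis and of Cronbach's alpha follows. This is not the authors' analysis code; the function names and the simulated single-factor data in the usage example are our own, chosen only to mirror the study's dimensions (267 respondents, 8 items).

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the sum score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def parallel_analysis(data, n_sims=1000, seed=0):
    """Horn's parallel analysis: retain factors whose observed
    correlation-matrix eigenvalues exceed the mean eigenvalues of
    uncorrelated random data of the same shape."""
    rng = np.random.default_rng(seed)
    n, k = data.shape
    obs = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
    sims = np.empty((n_sims, k))
    for i in range(n_sims):
        r = rng.standard_normal((n, k))
        sims[i] = np.linalg.eigvalsh(np.corrcoef(r, rowvar=False))[::-1]
    return int(np.sum(obs > sims.mean(axis=0)))

# Usage with simulated single-factor data (267 respondents, 8 items):
rng = np.random.default_rng(1)
factor = rng.standard_normal((267, 1))
scores = 0.7 * factor @ np.ones((1, 8)) + rng.standard_normal((267, 8))
print(parallel_analysis(scores))  # a single factor is retained
print(cronbach_alpha(scores))
```

In practice a dedicated package (e.g., factor_analyzer in Python, or the psych package in R) would also provide the factor loadings and communalities reported above.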
Fig. 2 CFA factor structure of STEM Connections scale. Note: Q9 was set as the marker variable. Factor loading values in the figure are standardized.

Table 2 Cross-tabulation showing patterns of individual changes in STEM connections

                                      Posttest
Pretest                           1     2     3     4     Total
1. Low (<3.13)              n    11     1     6     3      21
                            %    52     5    29    14     100
2. Moderate (>=3.13, <3.50) n     7     6    11     1      25
                            %    28    24    44     4     100
3. High (>=3.50, <4.00)     n     6     8    15     8      37
                            %    16    22    41    22     100
4. Very high (>=4.00)       n     2     1     4    25      32
                            %     6     3    13    78     100

The above cross-tabulation cells contain sample sizes and percentages. Diagonal values (pretest category = posttest category) indicate the number/percent of students who remained stable over time. Off-diagonal values indicate individuals who changed from their original category from pretest to posttest.
Posttest data

First, a single-factor CFA model was fit to the data based on the model identified through EFA at pretest. The CFA was conducted using maximum likelihood estimation in AMOS version 18 (Arbuckle 2009). The analysis of the single-factor model revealed excellent model fit (χ²(df = 20) = 29.11, p = .08; comparative fit index = 0.98; and root mean square error of approximation [RMSEA] = 0.05, 90 % CI [0.00, 0.09]). Further, an inspection of the factor model revealed that all items exhibited high factor loadings (see Fig. 2). A reliability analysis of posttest scores indicated highly similar findings to those found at pretest (i.e., Cronbach's alpha = 0.85, N = 170, M = 3.50, SD = 0.69, KS z = 0.05, p = 0.20).

Intervention effects

Having found evidence of similar factor structures at both pre- and posttest, as well as acceptable levels of reliability, we proceeded to test our second hypothesis. First, we examined the pattern of individual changes in STEM connection scores from pretest to posttest. To assess change in student perceptions, we categorized student scores into quartiles indicating low, moderate, high, and very high scores at pretest (i.e., below the 25th percentile, between the 25th and 50th percentiles, between the 50th and 75th
percentiles, and at or above the 75th percentile). To assess individual-level change in perceptions, we compared students' perceptions at pretest to their perceptions at posttest using cross-tabulation. Students were assigned to each category based on their pretest scores; therefore, the numbers of students in the groups are unequal due to attrition. As shown in Table 2, students scoring at the low and moderate end of the spectrum showed a substantial positive shift in their perceptions of STEM connections from pretest to posttest. For example, 43 % of students in the "low" group at pretest shifted to the "high" or "very high" groups at posttest. Students in the "high" group exhibited shifts both up and down (i.e., to "moderate" and to "very high") at equal rates, but students in the "very high" group exhibited a high degree of stability in their perceptions. These findings hint at differential effects of the intervention for students who perceive low to moderate STEM connections versus students who perceive high and very high STEM connections. To formally test the hypothesis above, two statistical tests were performed. First, a dependent-samples t test was conducted on the entire sample of students to compare pretest to posttest scores. The test revealed that, overall, students did not exhibit a statistically significant change in STEM connection scores (Mdifference = -0.01, SDdifference = 0.59, rpre-post = 0.61, t(df = 114) = -0.26, p = 0.79). However, given the pattern of results shown in the cross-tabulation, we performed a second dependent-samples t test with only those students who had exhibited "low" and "moderate" scores at pretest. This second test revealed statistically significant positive changes in STEM connections from pretest to posttest (Mdifference = 0.27, SDdifference = 0.62, rpre-post = 0.19, t(df = 45) = 3.03, p = .004, d = 0.44).
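The reported effect size is consistent with the t-test quantities: for a dependent-samples design, d here is the mean difference divided by the standard deviation of the differences (0.27/0.62 ≈ 0.44). A minimal sketch of this computation follows; `paired_t` is our own illustrative helper, not the authors' analysis code.

```python
import math
import numpy as np

def paired_t(pre, post):
    """Dependent-samples t test on matched pretest/posttest scores.

    Returns (mean_diff, sd_diff, t, df, d), where d is Cohen's d
    computed as the mean difference over the SD of the differences."""
    diff = np.asarray(post, dtype=float) - np.asarray(pre, dtype=float)
    n = diff.size
    mean_diff = diff.mean()
    sd_diff = diff.std(ddof=1)                # sample SD of the differences
    t = mean_diff / (sd_diff / math.sqrt(n))  # paired t statistic
    return mean_diff, sd_diff, t, n - 1, mean_diff / sd_diff
```

The p value would then come from the t distribution with df degrees of freedom (e.g., via scipy.stats.t.sf for a one-tailed value).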
Thus, the evidence points to significant changes in STEM connection, but this positive shift was only present for those students who came into the project with relatively low perceptions of the connections between STEM disciplines.
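As a numerical check on the measurement-model fit reported above, the RMSEA point estimate can be recovered from the model chi-square, its degrees of freedom, and the sample size via the standard formula RMSEA = sqrt(max(χ² − df, 0) / (df · (N − 1))). The small function below is our own sketch using the values reported for the posttest CFA.

```python
import math

def rmsea(chi_sq, df, n):
    """Point estimate of the root mean square error of approximation."""
    return math.sqrt(max(chi_sq - df, 0.0) / (df * (n - 1)))

# Values reported for the posttest CFA: chi2 = 29.11, df = 20, N = 170
print(round(rmsea(29.11, 20, 170), 2))  # 0.05, matching the reported value
```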
Discussion

There is a national priority to increase the competency of K-12 STEM education. In an effort to guide educational policy, the National Research Council has developed a framework for integrating science and engineering curricula (NRC 2012). In the current study, we report on an engineering design-based intervention that bridges the gap between the new integrated framework and well-researched models of teaching and learning from the cognitive sciences (de Miranda 2004).

One of the primary goals of this project was to transform student understanding of the co-dependent nature of STEM content knowledge; however, the measurement of such knowledge has been elusive. Therefore, the current study reports on efforts to develop a new scale to measure integrated STEM content knowledge. A key finding from this study was that the measurement of student perceptions was consistent with our expectations. Specifically, we hypothesized that our instrument would measure a single factor related to perceptions of the interrelated nature of STEM content knowledge, and our analysis supported that hypothesis: a single factor emerged from the EFA to explain the pattern of student responses. A second psychometric validation was conducted on the posttest data using confirmatory factor analysis. Once again, the results indicated that the single-factor model provided a good fit to the data. Together, these findings provide incremental evidence of the structural validity of the scale, as well as evidence for the stability of the structural model over time.

A second, more important, finding concerned the positive change in student perceptions of the co-dependent nature of STEM content knowledge. We addressed this issue by comparing student perceptions at pretest with their perceptions at posttest. Our analysis indicated that, overall, students did not exhibit significant change in their perceptions; however, probing the data further revealed a surprising pattern of results. Specifically, the intervention was most effective with students who started with low perceptions of the interrelated nature of STEM knowledge. It appears that long-term participation in an authentic engineering design problem cultivated connections for those students who initially saw the fewest connections and benefits of STEM content knowledge. Although far from conclusive, this finding is promising in that interventions such as this may help to spur understanding of the utility of, interest in, and engagement with STEM knowledge among those students who are initially the least engaged. Future studies of similar interventions should closely examine initial STEM connections at different levels of the continuum.

The results of the psychometric validation of the instrument advance our knowledge of much-needed tool design within STEM education. Ultimately, such tools are required to accurately detect and measure the levels of content connection and knowledge fusion that occur when learning-in-doing through engineering design (de Miranda 2004). Furthermore, the results support our optimism about the future potential to drill down to specific learning outcomes across the STEM disciplines. Realizing this level of scale specificity will require a blueprint of specific scientific and engineering practices, crosscutting concepts, and core ideas (content standards and learning benchmarks).

The design of this study (i.e., a cluster non-randomized single-group pretest-posttest design) limits the inferences we can make about the efficacy of the intervention, as well as the generalizability of the findings.
Further, the observed changes in initially low-scoring students' perceptions of the connectedness of STEM disciplines could also be explained by regression to the mean. Therefore, follow-up studies are being conducted to further explore these relationships and broaden the replicability and generalizability of the findings.
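The single-factor retention decision discussed above (an EFA on the pretest data) can be sketched with Horn's parallel analysis, the retention method recommended by Hayton et al. (2004). The data-generating code below is a hypothetical stand-in for the real item responses, not the study data.

```python
import numpy as np

def parallel_analysis(data, n_iter=100, seed=0):
    """Horn's parallel analysis: retain leading eigenvalues of the
    observed correlation matrix that exceed the mean eigenvalues of
    random data with the same shape; stop at the first failure."""
    rng = np.random.default_rng(seed)
    n, k = data.shape
    obs = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]
    rand = np.zeros(k)
    for _ in range(n_iter):
        noise = rng.normal(size=(n, k))
        rand += np.sort(np.linalg.eigvalsh(np.corrcoef(noise, rowvar=False)))[::-1]
    rand /= n_iter
    keep = 0
    for o, r in zip(obs, rand):
        if o <= r:
            break
        keep += 1
    return keep

# Hypothetical one-factor responses: 275 students, 9 items (cf. Table 3).
rng = np.random.default_rng(1)
factor = rng.normal(size=(275, 1))
items = factor @ np.full((1, 9), 0.7) + rng.normal(scale=0.7, size=(275, 9))
print(parallel_analysis(items))
```

With a clean one-factor structure like the simulated matrix above, only the first observed eigenvalue exceeds its random counterpart, so a single factor is retained.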
Conclusion

The purpose of this study was to describe a complex classroom intervention that involved an interdisciplinary, team-based approach to solving an engineering design problem and to report the results of efforts to develop a new scale for measuring the fusion of STEM content knowledge. The scale development process showed that the new scale reliably measures student perceptions of STEM connections. Finally, our results indicate that the intervention was most impactful for students who initially had low perceptions of STEM connections. We tentatively interpret this finding as evidence of the effect of the intervention on growth in student understanding of the co-dependence of STEM content knowledge.

Acknowledgments This program is based upon collaborative work supported by National Science Foundation Grant No. 0841259; Colorado State University; Thomas W. Chen, Principal Investigator; Michael A. de Miranda and Stuart Tobet, Co-Principal Investigators. Any opinions, findings, conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.
Appendix

See Table 3.
Table 3 Summary of items used to assess perceptions of the interconnections between mathematics, science, and engineering

Item number  Item stem
Q1    I think mathematics, science and engineering are essentially an accumulation of facts, rules, and formulas to be memorized and used
Q2*   Other subjects in school interest me more than mathematics, science, or engineering
Q3*   If I work at it, I can do well in math and science classes
Q4*   The subjects of science, mathematics, or engineering have little relation to what I experience in the real world
Q5*   Reasoning skills used to understand science, mathematics, or engineering can be helpful to me in my everyday life
Q6*   Engineering is an important tool to help me learn other subjects
Q7*   I have used mathematics as a tool to help me learn other subjects or solve problems outside of math classes
Q8*   I use science concepts to help me learn other subjects
Q9*   I have applied connections between mathematics, science, and engineering to help me solve problems outside of school

* Item retained after EFA and reliability analysis
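The reliability analysis noted beneath Table 3 is conventionally an internal-consistency check such as Cronbach's alpha over the retained items. A minimal sketch, using a made-up response matrix rather than the study data:

```python
import numpy as np

def cronbach_alpha(responses):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    X = np.asarray(responses, dtype=float)
    k = X.shape[1]
    item_var_sum = X.var(axis=0, ddof=1).sum()
    total_var = X.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

# Made-up 5-point Likert responses: 6 students on 4 retained items.
demo = np.array([
    [4, 5, 4, 5],
    [3, 3, 4, 3],
    [2, 2, 1, 2],
    [5, 4, 5, 5],
    [1, 2, 2, 1],
    [3, 4, 3, 3],
])
print(cronbach_alpha(demo))
```

Values approaching 1.0 indicate that the retained items behave as a single internally consistent scale, which is consistent with the one-factor structure reported for this instrument.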
References

Angel, S., & LaLonde, D. (1998). Science success strategies: An interdisciplinary course for improving science and mathematics education. Journal of Chemical Education, 75, 1437-1441.
Arbuckle, J. L. (2009). AMOS (Version 18). Spring House, PA: Amos Development Corporation. Retrieved from http://amosdevelopment.com.
Brown, A. L. (1992). Design experiments: Theoretical and methodological challenges in creating complex interventions in classroom settings. Journal of the Learning Sciences, 2, 141-178.
Bruer, J. T. (1993). Schools for thought: A science of learning in the classroom. Cambridge, MA: The MIT Press.
Dart, B. C., Burnett, P. C., Purdie, N., Boulton-Lewis, G., Campbell, J., & Smith, D. (2000). Students' conceptions of learning, the classroom environment, and approaches to learning. The Journal of Educational Research, 93, 262-270.
de Miranda, M. A. (2004). The grounding of a discipline: Cognition and instruction in technology education. International Journal of Technology and Design Education, 14, 61-77.
Elliott, B., Oty, K., McArthur, J., & Clark, B. (2001). The effect of an interdisciplinary algebra/science course on students. International Journal of Mathematical Education in Science and Technology, 32, 811-816.
Glaser, R. (1990). The reemergence of learning theory within instructional research. American Psychologist, 45, 29-39.
Haertel, G. D., Walberg, H. J., & Haertel, E. H. (1981). Socio-psychological environments and learning: A quantitative synthesis. British Educational Research Journal, 7, 27-36.
Haynes, S. N., Richard, D. C. S., & Kubany, E. S. (1995). Content validity in psychological assessment: A functional approach to concepts and methods. Psychological Assessment, 7, 238-247.
Hayton, J. C., Allen, D. G., & Scarpello, V. (2004). Factor retention decisions in exploratory factor analysis: A tutorial on parallel analysis. Organizational Research Methods, 7, 191-205.
Henson, R. K., & Roberts, J. K. (2006). Use of exploratory factor analysis in published research: Common errors and some comment on improved practice. Educational and Psychological Measurement, 66, 393-416.
Katehi, L., Pearson, G., & Feder, M. (Eds.). (2009). Engineering in K-12 education: Understanding the status and improving the prospects. Committee on K-12 Engineering Education. Washington, DC: The National Academies Press.
Korey, J. (2000). Dartmouth College mathematics across the curriculum evaluation summary: Mathematics and humanities courses. Retrieved 3 Sep 2012, from http://www.math.dartmouth.edu/~matc/Evaluation/humeval.pdf.
Korey, J. (2002). Successful interdisciplinary teaching: Making one plus one equal one. 2nd International Conference on the Teaching of Mathematics at the Undergraduate Level, Hersonissos, Crete, July 1-6, 2002. Retrieved 3 Sep 2012, from http://www.math.uoc.gr/~ictm2/Proceedings/pap123.pdf.
Lantz, H. B. (2009). Science, technology, engineering, and mathematics (STEM) education: What form? What function? CurrTech Integrations. Retrieved from http://www.currtechintegrations.com/pdf/STEMEducationArticle.pdf.
National Research Council (2012). A framework for K-12 science education: Practices, crosscutting concepts, and core ideas. Committee on a Conceptual Framework for New Science Education Standards, Board on Science Education, Division of Behavioral and Social Sciences and Education. Washington, DC: The National Academies Press.
Netemeyer, R. G., Bearden, W. O., & Sharma, S. (2003). Scaling procedures: Issues and applications. Thousand Oaks, CA: Sage Publications.
Pett, M. A., Lackey, N. R., & Sullivan, J. J. (2003). Making sense of factor analysis: The use of factor analysis for instrument development in health care research. Thousand Oaks, CA: Sage Publications.
Schmidt, W. H., & Maier, A. (2009). Opportunity to learn. In G. Sykes, B. Schneider, & N. Plank (Eds.), Handbook of educational policy research (pp. 541-560). New York and London: Routledge for the American Educational Research Association.
Schmidt, W. H., & McKnight, C. C. (2012). Inequality for all. New York: Teachers College Press.
Velicer, W. F. (1976). Determining the number of components from the matrix of partial correlations. Psychometrika, 41, 321-327.
Velicer, W. F., Eaton, C. A., & Fava, J. L. (2000). Construct explication through factor or component analysis: A review and evaluation of alternative procedures for determining the number of factors or components. In R. D. Goffin & E. Helmes (Eds.), Problems and solutions in human assessment (pp. 41-71). Boston: Kluwer.
Walberg, H. J., & Anderson, G. J. (1968). Classroom climate and individual learning. Journal of Educational Psychology, 59, 414-419.