European Journal of Psychology of Education 2000, Vol. XV, n° 1, 5-18 © 2000, I.S.P.A.
The modelling of 'dissonant' study orchestration in higher education

Jan H.F. Meyer
University of Durham, UK
Attention is drawn in the present study to atypical patterns of contextualised learning engagement that are often difficult to interpret because, at face value, they exhibit varying degrees of conceptual dissonance. Against a summary of the practical and methodological problems associated with researching the phenomenon of 'dissonance', a review is presented of how it may manifest itself in differing contexts, and with what implied or observed associated effects. The empirical question of how 'dissonance' may be interpreted and modelled is addressed and it is argued that, in general, the phenomenon can be naturally accommodated within an observed interference model of student learning by virtue of violations of defined 'conceptual boundaries' within the model at an individual, or subgroup, response level. An example of an observed interference model of student learning in the form of a common factor model is introduced and is then further used to illustrate how such conceptual violations may occur in practice.
Introduction

The term 'study orchestration' was used by Meyer (1991) to describe contextualised patterns of learning engagement at an individual, or group, level of response that are sensitive to variation in students' conceptions of what 'learning' is. In general, a study orchestration might thus be expected to exhibit a considerable degree of conceptual consonance in the manner in which conceptions of 'learning' reflect how the content and context of learning are perceived and purposefully engaged in terms of congruent intentions, motives and processes. Simply put, a study orchestration should be recognisable and interpretable in terms of our emerging understanding of how students vary in their conceptualisation and engagement of learning. The focus here is on study orchestrations that exhibit degrees of conceptual dissonance; that is, the expected theoretically coherent linkages between some or all of the more common sources of explanatory variation in contextualised learning behaviour fail to appear in a readily recognisable and interpretable form. Such 'dissonance' may appear in the individual observed responses of students; for example, in what they declare about their learning behaviour via interviews, written descriptions, or coded inventory responses. It follows that, when aggregated for analytical purposes, similar forms of 'dissonant' responses will also be reflected at a more abstract level in an observed model as in, for example, a factor model, a scaling model, or an unfolding model. That is, the observed model itself, or a single
'dimensional' aspect of it, will exhibit conceptual 'dissonance' in a structural sense. In terms of inventory response data, in particular, attention has thus been drawn in a small number of studies to 'dissonant' aspects of observed models of student learning that exhibit a lack of response discrimination between discrete aspects of learning engagement that are theoretically incongruent with one another. In linear terms, a simple hypothetical example of 'dissonance' might thus be reflected in a group-level observed model (a common-factor model, say) that fails to empirically exhibit, as distinctly separate dimensions of variation, a theoretically expected distinction between 'meaning' and 'pathological' (that is, 'anti-meaning') aspects of learning engagement in process terms. The default conclusion in such a case might understandably be that the theory has failed to reconstitute itself in terms of what has actually been observed for reasons attributable to methodological ineptitude, or some form of measurement error. The study by Biggs and Kirby (1984) provides a good example of how an observed model can fail to empirically exhibit, in this case, the theoretically posited contrasting forms of motive-strategy congruency within the 'learning process complex' of the Biggs student learning model. An analysis of subgroup responses (to an inventory that reflects the Biggs model), additionally associated with 'low reasoning' and 'low memory' measures of ability, manifested a unidimensional (one principal component) and atypical structure. 
An explanation of this phenomenon by Biggs (1985) was essentially confined to the comment that the apparent failure of this subgroup to distinguish between contrasting motives and strategies was an indication that the learning process complex was 'opaque and meaningless' for them - an interpretation that, technically, is not strictly admissible because a principal component does not explain how all the respondents contributing to the analysis conform; it explains how they vary relative to one another. It can nevertheless be conjectured that what was being exhibited in this case within the constraints of the instrument used was an underlying form of 'dissonance' as defined and discussed further on.
Difficulties in researching 'dissonance'

Researching the phenomenon of 'dissonance' in quantitative terms is difficult for a number of reasons. First, the proportion of 'dissonant' responses exhibited in a typical higher education undergraduate student sample within a given course is expected to be relatively small, and usually too small in absolute terms to sustain an independent statistical analysis in relation to the number of observables. For example, based on the experience of numerous studies carried out by the present author on first-year engineering students, the number of expected 'dissonant' responses in a typical first-year undergraduate sample would be approximately ten percent. 'Dissonant' responses may thus easily remain undetected as a distinct source of variation because of swamping or, when detected, they may simply be attributed to measurement error. Again, using the common-factor model as an example, it is an acknowledged fact that there is a degree of intellectual 'art' involved in deciding how much reduction in the multivariate complexity of the observables should be admitted; that is, the decision of how many factors to extract for interpretive purposes cannot be blindly driven by statistical criteria alone. The research literature on student learning accordingly contains many examples where the extraction of factor solutions of a given dimensionality is defended on the grounds that they are the most conceptually 'robust', 'stable', or 'interpretable'. Often, in practical terms, this argument is simply another way of saying that empirically there might be additional sources of variation (factors) that are being ignored, precisely because they do not make sense, or because they are assumed to be attributable to error variation, or because (where applicable) the 'percentage of variation' they explain is relatively negligible, or because they are believed to be nothing more than a spurious statistical artifact.
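One purely statistical criterion for the factor-retention decision is Horn's parallel analysis, which retains factors only while observed eigenvalues exceed those expected from random data of the same shape. A minimal sketch in Python with NumPy (the simulated two-factor data, loadings, and noise level are invented for illustration and are not from the paper):

```python
import numpy as np

def parallel_analysis(data, n_iter=100, seed=0):
    """Count factors whose observed correlation-matrix eigenvalues exceed
    the mean eigenvalues of random data of the same shape (Horn's method)."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    obs = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]
    rand = np.zeros(p)
    for _ in range(n_iter):
        r = rng.standard_normal((n, p))
        rand += np.sort(np.linalg.eigvalsh(np.corrcoef(r, rowvar=False)))[::-1]
    rand /= n_iter
    return int(np.sum(obs > rand))

# Simulated responses with two genuine factors (three observables each).
rng = np.random.default_rng(3)
f = rng.standard_normal((500, 2))
load = np.zeros((2, 6))
load[0, :3] = 0.8
load[1, 3:] = 0.8
x = f @ load + 0.6 * rng.standard_normal((500, 6))
print(parallel_analysis(x))  # -> 2
```

As the text stresses, such criteria inform but cannot replace the conceptual judgement of what the retained factors mean.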
There are, secondly, difficulties in isolating individual 'dissonant' responses even when their presence is suspected. Empirically these difficulties reflect, in particular, the inherent uncertainty of 'locating' an individual 'dissonant' multivariate response within an observed
model or exhibited structural relationship; such isolation is driven by statistical considerations rather than by conceptually imposed criteria as in, for example, the identification of 'outliers' or their analogues. On the other hand, the formulation of a conceptually driven categorisation procedure that is sensitive enough to sift out 'dissonant' responses in observed data requires considerable theoretical justification and is not trivial to implement (Meyer, 1998). Thirdly, it may be observed that, insofar as 'dissonant' responses may be associated more generally with the well established statistical phenomena of 'outliers', there is an inherent scepticism among educational researchers about the validity of such observations and, therefore, their warranted attention and retention for analytical purposes. 'Outliers' by definition represent extreme responses that are problematic because they contribute to violations of the parametric and other assumptions under which a given statistical model can be strictly fitted to the data. And even when such assumptions are ignored for practical reasons, the presence of 'outliers' still compromises the degree of confidence with which a statistical model can be taken to represent what has been observed in terms of the concept of 'goodness of fit'. There is, therefore, a need to respond to the presence of such troublesome 'outlying' observations. Both quantitative and qualitative research methodologies in fact have quaint ways of addressing the problem; they may be 'rectified' in some way according to some rule, or simply discarded, as part of the analytical process. 'Winsorization', for example, is an engaging technical term for a process of data refinement whereby 'outlier' observations (high and low extremes) are respectively replaced with less problematic nearest neighbour observations (Winer, 1971, p. 51). 'Trimming', on the other hand, means that 'outliers' are simply discarded in equal numbers of high and low extremes.
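The two refinement rules just described can be illustrated concretely with SciPy's `winsorize` and `trimboth` functions (the score values below are invented for illustration):

```python
import numpy as np
from scipy.stats import trimboth
from scipy.stats.mstats import winsorize

# Invented inventory scale scores with one low and one high extreme.
scores = np.array([2, 41, 43, 44, 45, 46, 47, 48, 50, 97])

# Winsorization: the most extreme 10% at each end are replaced by
# their nearest remaining neighbour observations.
w = winsorize(scores, limits=[0.1, 0.1])
print(np.asarray(w))  # -> [41 41 43 44 45 46 47 48 50 50]

# Trimming: equal numbers of high and low extremes are discarded outright.
t = trimboth(scores, 0.1)
print(len(t))  # -> 8
```

Either rule, of course, embodies exactly the attitude criticised here: the 'outlying' response is treated as a nuisance rather than as a potentially valid source of variation.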
Phenomenographic analysis, in turn, does not strictly require qualitatively different 'categories of description' to explain or exhaust all of the variation in what has been observed; an atypical experience of the phenomenon being investigated can be ignored as an arguably valid form of 'variation in method'. As Walsh (1994, p. 23) succinctly puts it: 'Refining implies selection of the data on some basis. If the data is found not to conform to a predetermined logical relationship, then it is discarded'. More generally, however, the individual status of all experiences, including extreme or atypical ones, is lost in the 'discovery' or 'construction' of the 'categories of description'. It can thus be argued that phenomenographic analysis is unlikely to constitute 'dissonance' as a category of description given that the learning experiences in question may not exhibit any 'pre-determined logical relationships' (within or between categories). An additional popular imposition that 'categories of description' should be hierarchically related further strengthens the likely exclusion of any 'dissonant' category that might otherwise legitimately exist in isolation. There is, finally, the practical problem of distinguishing 'dissonance', as a conceptually valid source of variation, from sources attributable to error variation. Some modes of observation lend themselves to the validation of observed 'dissonance' more readily than others. A suspected 'dissonant' observation based on an inventory response pattern, for example, can be validated via interview data. There is, however, a more general requirement to first theoretically accommodate such 'dissonant' objects of intellectual enquiry within an interpretive framework. Against this background questions arise as to (a) whether there is any consistency in the manifestation and interpretation of 'dissonance' and (b) how 'dissonance' might be produced and explained in abstract modelling terms.
Manifestations of 'dissonance'

A few quantitative studies of student learning have drawn attention to patterns of study orchestration that manifest empirically as atypical, interpretively problematic, and even internally contradictory, conceptual structures. These observations have arisen most clearly in quantitative studies that have employed a non-linear individual-difference statistical model (unfolding analysis) that seeks to preserve, or reproduce within known limits of precision, the uniqueness of individual observed multivariate responses.
The solution of an unfolding analysis is essentially descriptive; both the underlying empirical structure of the observables, and the location of individual responses within the structure, are simultaneously represented within a common multidimensional Euclidean space called the 'common joint space'. When the dimensionality of this space is between one and three, the coordinates of the observables and the individual responses can be jointly plotted and visually interpreted in a conventional geometric sense. The study by Meyer and Muller (1990) (which contains a non-technical explanation and illustration of the main features of an unfolding analysis) is believed to represent the first individual-difference analysis of the general relationship between students' perceptions of their learning environment and their approaches to studying. Meyer and Muller (1990) explored in some detail the manifestation of atypical forms of study orchestration; they reported that in their study there were students who apparently orchestrated their declared motives and intentions, their perceived learning processes, and their contextual perceptions, in a manner that eludes rational interpretation. In a further elaboration of these atypical units of analysis, essentially associated at this stage only with academically weak students, Meyer (1991) noted that a general pattern of conceptual 'disintegration' (hereafter referred to as 'dissonance' for the sake of conformity) appeared to characterise the corresponding individual 'preference structures' as exhibited in the observed responses of individual students, or as empirically exhibited in a corresponding unfolding solution. In studies employing a linear statistical model (such as common factor analysis) evidence of a similarly interpreted phenomenon of 'dissonance' has also been reported.
Studies by Meyer and Dunne (1991), and Entwistle, Meyer, and Tait (1991), have reported that the common-factor structures associated in their studies with the responses of academically weak students do not exhibit the theoretically expected relationships between contrasting aspects of learning behaviour and associated perceptions of the learning environment. Other studies by Meyer, Parsons, and Dunne (1990), and Meyer and Sass (1993), have reported further on the association between such problematic structures and low academic achievement or academic failure. It has also been concluded that there is an association between 'dissonance' and attributional causes for academic success and failure. In exploring the static and dynamic features of categorically imposed 'dissonance' differences via a locus of control model, Meyer (1996) draws particular attention to a subgroup of students exhibiting a high level of 'dissonance' in their study orchestrations. This subgroup also exhibits (see Table 1) an observed locus of control structure that, unlike those of other less 'dissonant' subgroups, conspicuously blurs the theoretically posited distinctions between internal and external attributions for academic success (and failure).
Table 1
Exploratory factor pattern exhibiting 'dissonant' locus of control model (n=109)

Attribution                          Source      F1    F2
Good luck for success                External    34    64
Bad luck for failure                 External    51   -40
Favourable context for success       External    57    49
Unfavourable context for failure     External    22     -
Ability for success                  Internal    41   -34
Lack of ability for failure          Internal    57     -
Effort for success                   Internal     -     -
Lack of effort for failure           Internal     -    30

Note. Loadings are multiplied by 100 and rounded. Loadings with an absolute value below 20 are printed as a dash. Inter-factor correlation coefficient = +.38. Table contents adapted from Meyer (1996, p. 62).
It can be seen from Table 1 that, in essence, Factor one captures attribution variation for success internally (ability) and externally (good luck and favourable circumstances) and, for failure, also internally (lack of ability) and externally (bad luck, unfavourable circumstances). This is an atypical dimension of variation to interpret in terms of the underlying theoretical locus model, which posits a separation between internal and external attributions for success, and failure, respectively. Factor two (which is positively correlated with Factor one; r=+.38) captures attribution variation for success externally (good luck and favourable circumstances) in the absence of internal attributions (ability and effort), with an additional attribution for failure that is also internal (lack of effort). Factor two is easier to interpret but its positive correlation with Factor one simply does not make sense in terms of the underlying locus model either. In terms of dynamic relationships, Meyer (1996) concluded further that there appears to be a consistent relationship between a shift over time towards increasingly 'dissonant' forms of learning engagement and statistically significantly higher levels (mean scores) of external attributions for academic success in terms of favourable circumstances (context) and chance factors (good luck). There are fewer studies that have explored the phenomenon of 'dissonance' on the basis of qualitative data analysis. The study by Meyer (1991) basically confirmed via interview data that students exhibiting atypical inventory response patterns were able to articulate these quite clearly in relation to concrete experiences. In a similar vein Cliff (1992) has verified, via interviews, the presence and some defining features of 'dissonant' study orchestrations that were previously independently identified via an analysis of inventory data.
Another interview study by Lindblom-Ylanne and Lonka (1999) has also identified 'dissonance' via interview data in a subgroup of senior medical students exhibiting a relatively low measure of academic achievement in relation to that of other peer subgroups. In the interpretation of their interview data Lindblom-Ylanne and Lonka (1999) essentially associated 'dissonance' with the conflict experienced by students whose study orchestrations were not supported by their learning environment. They furthermore concluded that 'dissonant' study orchestrations may develop because of such a conflict. This conclusion is consistent with the findings of a number of longitudinal studies that have basically examined the negative (including 'dissonant') impact of the often difficult first undergraduate year on school- versus university-based study orchestrations (Meyer, 1996; Meyer & Scrivener, 1995; Meyer & Sass, 1993; Meyer, Dunne, & Sass, 1992). The conclusion that the learner-in-conflict with the learning environment can exhibit transient 'dissonance' is also theoretically supported by Vermunt and Verloop (1999), who describe such an inducing conflict, in terms of the interplay between learner- versus teacher-regulated learning activities, as destructive friction. A further conclusion reported by Lindblom-Ylanne and Lonka (this issue) is that there appears to be a proximal relationship between 'dissonant' study orchestrations and external regulation and/or a lack of self-regulation. This finding is plausibly consistent with the conclusion reached by Meyer (1996) that increased levels of 'dissonance' (within a given context over time) are positively associated with increased external causal attributions for academic success, insofar as such external attributions might also reflect a shift to corresponding external regulatory patterns.
In distilling what has thus far been reported in the literature, 'dissonant' study orchestrations are characterised by blurred and/or conceptually problematic associations between incompatible sources of variation in habitual or preferential patterns of contextualised learning engagement; and particularly so in circumstances where there is a conflict between what students prefer to do, wish to do, or are actually capable of doing (in terms of versatility) and what the learning environment supports, demands or can accommodate. The sources of variation in question are typically reflected in the contrasting forms of generic cognitive processes, motives, intentions, regulatory mechanisms, contextual perceptions, conceptions of 'learning', and so on. Students who are unable to adjust 'dissonant' patterns of learning engagement to the unfamiliar or hostile demands of a changing, or changed, learning environment are clearly academically 'at risk' in more than one sense. The same is also true for students who may be induced into a transient form of 'dissonance' by the influence of the learning environment.
In summary, 'dissonance' has thus far been associated with (a) an atypical study orchestration that is idiosyncratic and relatively stable, (b) a conceptually problematic incomplete or inappropriate transient 'readjustment' (of a previous habitual or preferential study orchestration) in response to changed, or rapidly changing, or hostile, contextual demands, or (c) a low level (or absence) of metacognitive regulatory capacity.
An interference model

Thus summarised, and viewed against an empirical research background in which it may easily remain undetected, or be ignored for the wrong reasons, the phenomenon of 'dissonance' can, as Meyer (1998) has argued, logically arise, and be further investigated, within a particular model-form of student learning termed an interference model. In a further refinement of this concept, an interference model is defined as a nonhierarchical (linear or non-linear) observed model that contains either (a) two or more separately distinct and conceptually consonant, but contrasting, dimensions of variation and/or (b) at least one dimension of variation constituted in terms of conceptually 'dissonant' sources of variation. In the special case of a common-factor model with theoretically contra-indicated correlations between factors, feature (b) may be exhibited simultaneously across two or more factors. A simple graphic example of a hypothetical interference factor model is presented in Table 2, while an observed interference model exhibiting a rudimentary type (b) feature is presented in Table 1.
Table 2
Graphic example of hypothetical interference model

F1    → → →
F2    ← ← ←
F3    → ← →

Note. '→' and '←' respectively indicate observables, or sets of observables, that are conceptually incompatible with one another, and that appear within the depicted hypothetical factor structure with same-sign loadings. F1 and F2 thus respectively represent distinct and conceptually consonant, but contrasting, dimensions of variation while F3 is internally dissonant.
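The defining features (a) and (b) can be expressed as a mechanical check on a factor pattern. The sketch below is hypothetical: the 'virtuous'/'pathological' labels and the 0.3 salience cutoff are illustrative assumptions, not part of the model definition:

```python
import numpy as np

def classify_factors(loadings, labels, cutoff=0.3):
    """For each factor, collect observables with salient loadings and report
    whether the factor mixes conceptually incompatible sources with
    same-sign loadings ('dissonant') or draws on one kind ('consonant')."""
    results = []
    for j in range(loadings.shape[1]):
        col = loadings[:, j]
        salient = np.where(np.abs(col) >= cutoff)[0]
        kinds = {(labels[i], float(np.sign(col[i]))) for i in salient}
        signed_mix = any(
            ('virtuous', s) in kinds and ('pathological', s) in kinds
            for s in (1.0, -1.0)
        )
        results.append('dissonant' if signed_mix else 'consonant')
    return results

# Hypothetical pattern mirroring Table 2: two consonant but contrasting
# factors, and a third with same-sign loadings on incompatible items.
L = np.array([
    [0.7, 0.0, 0.5],   # virtuous observable
    [0.6, 0.0, 0.0],   # virtuous observable
    [0.0, 0.7, 0.6],   # pathological observable
    [0.0, 0.6, 0.0],   # pathological observable
])
labels = ['virtuous', 'virtuous', 'pathological', 'pathological']
print(classify_factors(L, labels))  # -> ['consonant', 'consonant', 'dissonant']
```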
Depending on the structure of the model, any one dimension may, in the case of feature (a), typically be unambiguously constituted in virtuously or non-virtuously labelled student learning terms (on the basis of 'high' location parameters) that reflect, respectively, associations of conceptually consonant modelling observables (for example, a composite 'meaning' or a 'pathological' dimension of variation); one conceptually consonant dimension of variation will thus contrast with one of, or more than one of, the other conceptually consonant dimensions of variation in the model (as is the case with F1 and F2 in Table 2). In the case of feature (b), the dimension(s) of variation may singly exhibit an internal composition that is conceptually 'dissonant' (as is the case with F3 in Table 2); that is, contain sources of variation reflecting observables that are conceptually incompatible as a composite entity (for example, discrete aspects of both 'meaning' and 'pathological' sources of variation). Insofar as 'dissonant' individual responses can be located within an observed interference model (directly or by estimation) they may generally exhibit features that violate the conceptual boundaries defined by the interference model. This violation can occur in one of several ways. Within a two-factor interference model of feature type (a) above, for example, a 'dissonant' response may, by estimation, be relatively highly 'scored' on both factors (such as
on F1 and F2 in the hypothetical example in Table 2). Such a 'high-high' scoring implies a simultaneous location within the two composite structures of the model that effectively violates the conceptual boundary between them (if the contrasting factors, as per the definition of the interference model, are orthogonal or negatively correlated). Stated differently, any other 'equally paired' (or nearly so) factor scoring pattern that can occur on the factor score continuum signifies, in perhaps corresponding varying degrees of response intensity, a lack of location discrimination between contrasting aspects of the model (or a potential lack, in the case where the location is estimated rather than computed directly). In the case of a feature type (b) model, any response location within a single composite dimension of variation that is, by definition, conceptually 'dissonant' (such as F3 in the hypothetical example in Table 2) constitutes a self-referencing violation of one or more boundaries at the level of the associated discrete observables.
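The 'high-high' (and, by extension, 'low-low') boundary violation can be sketched as a simple flagging rule on estimated factor-score pairs. The 0.5 threshold (in standardised score units) and the example scores below are illustrative assumptions:

```python
import numpy as np

def flag_boundary_violations(f1, f2, hi=0.5):
    """Flag factor-score pairs located simultaneously 'high' (or 'low') on
    both contrasting factors, i.e. responses that fail to discriminate
    between the two composite structures of the model."""
    f1 = np.asarray(f1, dtype=float)
    f2 = np.asarray(f2, dtype=float)
    return ((f1 >= hi) & (f2 >= hi)) | ((f1 <= -hi) & (f2 <= -hi))

# Four hypothetical respondents: high-high, low-high, high-low, low-low.
flags = flag_boundary_violations([1.2, -0.8, 0.9, -1.1],
                                 [1.0,  0.9, -1.2, -1.3])
print(flags)  # -> [ True False False  True]
```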
An example of an interference model

Meyer and Cleary (1998) have reported on the development of a student learning model of clinical diagnosis, specific to the discipline of Medicine, that is used here for illustrative purposes. This model, still under further refinement, contains eleven internally consistent observables (as captured by inventory subscale response scores) that reflect, between them, conceptually contrasting aspects of the diagnostic process. It is the (interference) structure of the model, rather than the detail of what the structure represents in clinical engagement terms, that is the focus here. The approach taken to the development of the model, and a domain-specific description of what the model represents in conceptual terms that are relevant to students' conceptions and engagement of the diagnostic process, is contained in Meyer and Cleary (1998). The observed two-factor model is presented in Figure 1 and it satisfies the definition of an interference model stated earlier; the structure of the model reflects two contrasting and independent dimensions of variation in student learning that are associated with the process of clinical diagnosis. Both factors are independently (but not jointly) conceptually interpretable in a composite sense.
[Figure 1 plots the Factor 1 and Factor 2 loadings across the eleven subscales.]

Figure 1. An observed two-factor interference model. Extraction via principal factors under varimax rotation. First three eigenvalues are 3.29, 2.24, 0.35. Variation explained = 50.3%.
Note. 'Non-virtuously' labelled subscales are GLIPPATH (learning pathology), DIFFLIKE (difficulty with likelihoods), PREMCLOS (premature closure), FRAGMENT (learning pathology), UNIDIAGS (unidiagnostic approach). 'Virtuously' labelled subscales are MULTIFAC (multifactorial approach), RELATING (relating symptoms), VIRTHIST (proficient history taking), PROBCAUS (probable causes), POSTULAT (postulating conditions), VISPATH (visualising the appearance of diseased organs).
The first 'pathological-unidiagnostic' factor essentially captures variation in terms of a composite dimension represented by a learning pathology (GLIPPATH: a weak knowledge base
of detail and difficulty in formulating a set of general principles applicable to diagnosis), difficulty with likelihoods (DIFFLIKE: difficulty in thinking in terms of likelihoods and in accessing the right information at the right time), premature closure (PREMCLOS: reaching an exclusive diagnosis without considering all the available evidence), another learning pathology, fragmentation (FRAGMENT: a difficulty in applying knowledge to diagnosis, because the knowledge is perceived to be a collection of unrelated facts), and a unidiagnostic approach (UNIDIAGS: a tendency to ignore conflicting data and pursue only one line of diagnostic enquiry). The second 'multifactorial-relational' factor captures variation in terms of a composite dimension represented by a multifactorial approach (MULTIFAC: an ability to follow more than one line of diagnostic inquiry and to generate new questions in the process), relational process (RELATING: connecting diagnostic observations to knowledge or to previous experience), virtuous history taking (VIRTHIST: proficiency and confidence in eliciting the clinical signs), probable causes (PROBCAUS: formulating a list of the most common causes, or of possible causes, of diseases that need to be considered), postulating conditions (POSTULAT: knowing what tests to order, knowing what diseases are prevalent in the community and what diseases commonly cause each set of symptoms in each age group and in each sex), and visualising pathology (VISPATH: visualising the microscopic and macroscopic appearances of diseased organs during the physical examination). Thus, in terms of high subscale scores, five 'non-virtuously' labelled aspects of the diagnostic process are reflected in subscales that capture variation in terms of learning pathologies, and other related conditions such as premature closure, that are collectively associated with a unidiagnostic approach.
In contrast, six 'virtuously' labelled aspects of the diagnostic process collectively reflect a multifactorial approach to diagnosis in subscales that capture variation in relational thinking in terms of probable causes, postulating conditions, and symptom complexes. The alpha values associated with the eleven subscales range from 0.71 to 0.83 (n=480). Of particular interest in Figure 1 is the sign change in the loadings of UNIDIAGS and MULTIFAC across the two factors. This 'dimensional switch' neatly signals the conceptual boundary that distinguishes, within the observed model, the basic overall contrast between (variation in) the two composite, and conceptually incompatible, diagnostic approaches. Insofar as it is possible to estimate the 'location' of individual responses within the common-factor structure (by means of factor scores) it can be speculated that potentially good student diagnosticians will score 'high' on Factor 2 relative to Factor 1, while potentially bad diagnosticians will score 'high' on Factor 1 relative to Factor 2. Any students, for example, who score 'high' (or 'low') on both factors would, in effect, be violating a 'conceptual boundary' within the observed model in a multivariate and linear structural sense (rather than simply in terms of similar magnitudes of discrete observed scores which exhibit little variation and do not necessarily reflect any underlying pattern of structural association). In diagnostic process terms students exhibiting such structural violations would constitute subgroups that are atypical in some sense. Within these subgroups the expectation is that there would be individual similarities that would be, in this case, of interest and concern to clinical teachers.
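The subscale reliabilities quoted above (alpha values between 0.71 and 0.83) are internal-consistency coefficients. For reference, a minimal computation of Cronbach's alpha from a respondents-by-items score matrix might look as follows (the example data are invented):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n respondents x k items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Invented responses of four students to a three-item subscale.
alpha = cronbach_alpha([[2, 3, 3], [4, 4, 5], [3, 3, 4], [5, 4, 5]])
print(round(alpha, 2))  # -> 0.92
```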
Figure 2. Plot of mean factor scores (Factor 1 and Factor 2) for each of the five clusters.
THE MODELLING OF 'DISSONANT' STUDY ORCHESTRATION
13
One way of attempting to represent these postulated location phenomena is to perform a k-means cluster analysis of the individual (pairs of) factor scores. These estimated scores effectively compress the multivariate linear complexity of an individual response within the two composite dimensions of the model to just two corresponding location parameters. However, using factor scores, albeit only for illustrative purposes in this case, strictly implies that the underlying conceptual model actually 'fits' the data in a statistical sense, rather than simply appearing to do so in an exploratory subjective sense. It should therefore be noted in passing that confirmatory factor analysis (via, for example, a Steiger-Lind noncentrality 'goodness of fit index' of 0.09) does provide an acceptable basis for thus proceeding further. But there is, at the same time, an indication here that the conceptual model does not fit the data as well as might be expected. The 'fit' should be better for inferential purposes. So either the specification of the conceptual model is deficient or the group-level data does not adequately represent the constituent individual responses. The classic approach in modelling would be to argue that the data are a given and that the specification of the model should be altered to improve the 'fit'. The conjecture here, however, is that the model 'fit' is being compromised because the aggregated data is insufficiently representative of some constituent subgroup(s) of responses that may need to be separately modelled in their own right. The five-way cluster solution presented in Figure 2 exhibits a clear separation between the subgroups that is plausibly consistent with the preceding argument.
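A k-means clustering of factor-score pairs of this kind can be sketched with SciPy. The four simulated subgroups, their centres and spread, and the choice of k below are illustrative assumptions, not the study's data (which yielded five clusters):

```python
import numpy as np
from scipy.cluster.vq import kmeans2

# Hypothetical factor-score pairs for four well-separated subgroups:
# 'high-high', 'low-low', and the two consonant scoring patterns.
rng = np.random.default_rng(1)
centres = np.array([[1.5, 1.5], [-1.5, -1.5], [1.5, -1.5], [-1.5, 1.5]])
scores = np.vstack([c + 0.2 * rng.standard_normal((50, 2)) for c in centres])

# Each respondent is assigned to the nearest of k=4 cluster centroids.
centroids, cluster_ids = kmeans2(scores, 4, minit='++', seed=2)
```

With well-separated groups each simulated subgroup is recovered as one cluster, and the 'high-high' and 'low-low' clusters can then be flagged for closer inspection.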
There are expected and interpretable differences in the mean scores associated with the clustered estimates of individual response location within the model as follows: Cluster 1 (n=130) containing atypical 'high-high diagnosticians'; Cluster 2 (n=125) containing atypical 'low-low diagnosticians'; Cluster 3 (n=108) containing 'bad diagnosticians'; Cluster 4 (n=5) representing a possibly extreme subgroup of 'bad diagnosticians'; and Cluster 5 (n=112) containing 'good diagnosticians'. It needs to be emphasised that the designation of the various cluster groups as 'atypical' or otherwise does not imply a corresponding strict conformity of individual multivariate response within the respective clusters. In particular, designating the 'high-high' and 'low-low' clusters as 'atypical' does not mean that all the individuals within these two clusters are necessarily exhibiting 'dissonant' study orchestrations, or that the aggregate features of any 'dissonance' that might be evidenced within each of the two clusters are similar across the two clusters. What the designations do imply is that subsets of individual observations within each of the two clusters are respectively likely to exhibit some common pattern of individual similarity. The expectation is that two (perhaps distinct) forms of 'atypical' individual similarities are here respectively concentrated within, rather than dispersed across, the two clusters in question. What is required, then, is some evidence closer to the multivariate observables (rather than a compressed statistical abstraction of them) that the posited violation of a conceptual boundary has occurred within either of these 'atypical' subgroups. Such evidence must, in essence, support the violation of the conceptual distinction between the pathological-unidiagnostic and the multifactorial-relational dimensions of the observed model.
Furthermore, in order to be consistent with the fact that the observed model is a linear one, this evidence should, ideally, be based on correlation considerations. In terms of the interference definition, the correlation matrix on which the observed model is based should, in the ideal conceptual case, reflect positive correlations between all the observables within each of the two factors respectively, and preferably negative (rather than zero) correlations between observables across the two factors. Thus, for example, all of the 'pathological-unidiagnostic' observables should be positively correlated with one another and negatively correlated, in turn, with each of the 'multifactorial-relational' observables. This is, in effect, the conceptual model that was 'fitted' to the data via the confirmatory factor analysis referred to earlier.
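This ideal sign pattern can be checked mechanically. The sketch below is a hypothetical illustration, not the study's procedure: the toy correlation matrix and the labels beyond those reported in the text (RELATE1, RELATE2) are invented. It simply flags any positive cross-factor correlation as a candidate boundary violation.

```python
import numpy as np

path_vars = ["GUPPATH", "DIFFLIKE", "PREMCLOS", "FRAGMENT", "UNIDIAGS"]
multi_vars = ["MULTIFAC", "RELATE1", "RELATE2"]  # last two labels are invented
names = path_vars + multi_vars
n_p = len(path_vars)

# toy correlation matrix consistent with the ideal conceptual case:
# positive within each factor, negative across factors
R = np.full((len(names), len(names)), -0.3)
R[:n_p, :n_p] = 0.4
R[n_p:, n_p:] = 0.4
np.fill_diagonal(R, 1.0)

def boundary_violations(R, n_first):
    """Return variable pairs whose cross-factor correlation is positive."""
    rows, cols = np.where(R[:n_first, n_first:] > 0)
    return [(names[i], names[n_first + j]) for i, j in zip(rows, cols)]

print(boundary_violations(R, n_p))   # -> [] (no violations in the ideal case)

# inject a positive cross-factor correlation and flag it
R[3, 5] = R[5, 3] = 0.197            # FRAGMENT vs MULTIFAC
print(boundary_violations(R, n_p))   # -> [('FRAGMENT', 'MULTIFAC')]
```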
Table 3
A comparison of some correlations between observables

                                 GUPPATH    DIFFLIKE   PREMCLOS   FRAGMENT   UNIDIAGS
Cluster 1 (n=125) 'Atypical'      -.067      -.146      +.008      +.197*     +.077
Cluster 2 (n=130) 'Atypical'      -.089      -.254**    -.058      -.136      -.064
Cluster 3 (n=108) 'Bad'           -.314***   -.369***   -.300**    -.192*     -.051
Cluster 5 (n=112) 'Good'          -.312***   -.266**    -.455***   -.191*     -.389***

Note. Tabled values are correlations between MULTIFAC and each of the observables listed. * signifies p≤.05, ** signifies p≤.01, *** signifies p≤.001.
The correlations between each of the five 'pathological-unidiagnostic' observables and the discrete 'multifactorial' observable for each of the large cluster subgroups (the smallest, Cluster 4, n=5, is excluded here and in the analyses presented further on) are presented in Table 3. As ideally expected, and in contrast to the 'atypical' Clusters 1 and 2, the 'good diagnostician' Cluster 5 exhibits correlations that are all significantly negative. The same is true for the 'bad diagnostician' Cluster 3, with the one exception of the practically zero negative correlation between a multifactorial and a unidiagnostic approach. The significant positive correlation (r=+.197, p=.028) between a multifactorial approach and the learning pathology of fragmentation for Cluster 1 (the atypical 'high-high' subgroup) simply does not make sense conceptually. It is the positive nature of the association, rather than its magnitude, that constitutes a boundary violation of immediate interest. There is an inherent 'dissonance' in blurring the distinction between simultaneously being able to follow multiple paths of diagnostic inquiry, generating new questions in the process (this is what an 'expert' diagnostician does), and difficulty in applying an existing knowledge base to diagnostic reasoning because the knowledge itself is perceived to be a (fragmented) collection of unrelated facts. An inspection of the complete correlation matrix for the 'high-high' Cluster 1 (not presented) reveals a further nine positive cross-factor observable correlations (≥.15) that represent, in similar vein, conceptual violations in varying degrees. Six of these correlations are statistically significant at the five percent level or lower. A similar inspection of the correlation matrix for the 'low-low' Cluster 2 subgroup (not presented) reveals four conceptual violations of comparable (correlation) magnitude, one of which is statistically significant (p<.01).
There are no similar cross-factor observable correlations (≥.15) exhibited in the correlation matrices (not presented) associated with Clusters 3 and 5. There is thus evidence that the atypical subgroups are exhibiting a violation of the conceptual boundary within the observed model, but not to the same degree. Of further interest is that, for the 'low-low' Cluster 2, only one of the fifteen correlations between all the observables within the 'multifactorial-relational' dimension is significant at the five percent level. The corresponding number is two for the 'high-high' Cluster 1, five for the 'bad diagnosticians' Cluster 3, and nine for the 'good diagnosticians' Cluster 5. Both the 'atypical' Clusters 1 and 2 thus exhibit a relative weakening in the internal consistency of the 'multifactorial-relational' dimension and, as already noted, Cluster 1 in particular appears to do so in terms of significant and clearly defined (correlation) conceptual violations. It can finally be noted that the discrete significant and non-significant correlation-based violations that have been noted for the 'atypical' Clusters 1 and 2 also emerge, as expected, in terms of more complex (and direct) linear associations such as those yielded by within-cluster principal component analyses. These component analyses (not presented), based on an eigenvalue > 1 extraction criterion, may be summarised as follows. For the 'high-high' Cluster 1 and the 'low-low' Cluster 2, four of the five, and three of the four, respectively extracted components exhibit internal 'dissonance'. In the case of the 'bad diagnosticians' Cluster 3, there is 'dissonance' in only one of four components. No such 'dissonance' appears in any of the three components associated with the 'good diagnosticians' Cluster 5.
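The eigenvalue-based extraction step mentioned above can be sketched in a few lines. This too is a hypothetical stand-in (synthetic data, six unnamed observables) showing only the mechanics of retaining components of a within-cluster correlation matrix whose eigenvalues exceed 1; inspecting the signs of the retained loadings across the two conceptual sets of observables is then what would reveal internal 'dissonance'.

```python
import numpy as np

rng = np.random.default_rng(2)
# hypothetical cluster: 120 students measured on 6 observables
X = rng.normal(size=(120, 6))

R = np.corrcoef(X, rowvar=False)       # within-cluster correlation matrix
eigvals, eigvecs = np.linalg.eigh(R)
order = np.argsort(eigvals)[::-1]      # sort components by eigenvalue, descending
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

keep = eigvals > 1.0                   # eigenvalue > 1 extraction criterion
loadings = eigvecs[:, keep] * np.sqrt(eigvals[keep])  # component loadings
print(f"{keep.sum()} of {len(eigvals)} components retained")
```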
For the sake of completeness it should again be noted in passing that, notwithstanding the fact that it was established at the outset that the conceptual model 'fits' the undifferentiated data, there is now consistent evidence that contra-indicates the conceptual validity of such a single model for inferential purposes. In fact, additional confirmatory factor analyses (not presented), separately carried out in respect of the cluster groups, support the argument that different models need to be formulated to achieve marked improvements in the corresponding 'degrees of fit' (in some cases, to near-perfect). In agreement with the findings of earlier individual-difference studies, and simply restated here as a pointer to further indicated work, 'dissonance' has to be separately modelled. It is also acknowledged that the use of factor score estimates in the present study may be open to criticism on the grounds that they are non-unique. The fact that there are grouping consistencies and patterns using this approach nonetheless carries the argument presented, in communicating the existence of clusters that may alternatively be explored by more direct initial means, including principal component scores. Such an alternative principal component approach, however, sacrifices the communality between observables for precision of location, in a trade-off that is of questionable value for illustrative purposes. In summary, what the analysis captured in Figure 2 demonstrates is that an interference model has been observed that allows 'dissonance' to be exhibited in an abstract sense, given that the factor scores do not represent 'real people'. 'Dissonance' is nevertheless initially exhibited, in different degrees, in this case via the violation of a single conceptual boundary by both of the two 'atypical' subgroups of students, notwithstanding the inter-individual variation within them.
Precisely what the attributes are, in terms of students' lived experiences, of the 'dissonant' individual responses within the 'atypical' clusters is a focus of further ongoing research and is not relevant to the present argument. The point here is simply that the sought-after evidence of conceptual violation is present in the correlation structure of the observables in a manner consistent with expectations. For an interference model to be of any practical use, however, and given that the 'atypical' subgroups represent a source of 'dissonant' responses rather than an exclusive collection of them, the question arises as to whether any 'dissonant' responses present can be identified using alternative analytical procedures that are more sensitive to the responses of 'real people'. It has in fact been demonstrated that this task can be accomplished with a reasonable degree of known precision by the application of a non-linear individual-difference statistical model such as unfolding analysis (Meyer & Muller, 1990; Meyer, 1991), which avoids the difficulties inherent in a factor analytic approach that have already been referred to. Another and more direct alternative approach, that of categorising 'dissonance' in observed data, will now be briefly introduced in outline form.
Categorising violations of conceptual boundaries

Meyer (1996) has demonstrated that, given a suitable metric (such as rank), violations of conceptual boundaries can be interpreted and categorised as interference conditions. The Meyer categorisation procedure was inspired by the (statistical) concept of stress that is used in unfolding analysis. In reproducing a complete individual multivariate response (rather than a compressed form of it) within an empirical data structure (represented, in the case of unfolding analysis, by the 'common joint space'), the associated stress coefficient basically signifies how 'scrambled up' an individual preference structure (the ranked unit of analysis) needs to be in order to be reproduced within the structure defined by the responses of the group as a whole (see Meyer, 1991, for examples). High stress values typically identify what are being interpreted here as categorisable forms of 'dissonance'. In thus treating observed individual multivariate responses as preference structures (based on ranked, item-averaged scores), the Meyer categorisation procedure assigns numerical values to conceptually discrete stages of 'dissonance'.
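A crude analogue of this 'scrambledness' idea, and emphatically not Meyer's actual stress computation, can be sketched by counting pairwise rank inversions between an individual's preference structure and a reference ordering. The construct names below are invented for illustration.

```python
def inversions(ranking, reference):
    """Count pairs ordered one way in `reference` but the other way in `ranking`."""
    pos = {item: i for i, item in enumerate(ranking)}
    seq = [pos[item] for item in reference]
    return sum(1 for i in range(len(seq)) for j in range(i + 1, len(seq))
               if seq[i] > seq[j])

# hypothetical reference ordering of six observables
reference = ["deep", "relate", "meaning", "surface", "fragment", "memorise"]
consonant = ["deep", "meaning", "relate", "surface", "fragment", "memorise"]
dissonant = ["surface", "deep", "fragment", "relate", "memorise", "meaning"]

print(inversions(consonant, reference))  # -> 1  (low 'stress')
print(inversions(dissonant, reference))  # -> 6  (high 'stress')
```

A structure needing many inversions to match the reference plays the role of a high-stress, and hence potentially 'dissonant', response.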
Table 4
Categorising 'interference' conditions

F1    F2    IC1    IC2    IC3
 +     -     +      +      -
 +     -     +      -      -
 +     -     -      +      +
             +      -      -
             -      +      +
             -      -      +

Note. Within F1, F2 and IC1-IC3, '+' signifies, in terms of high location parameters, positively labelled observables (e.g., meaning processes) and '-' similarly signifies negatively labelled observables (e.g., 'anti-meaning' learning pathologies). Each interference condition is internally ranked in descending order according to the magnitudes of the individual scores on the observables.
Table 4 illustrates the basic logic of this categorisation procedure which is, for the sake of simplicity, based on a hypothetical two-dimensional interference model such as the one represented by just the first two factors of the symbolic model presented in Table 2. There are thus two contrasting sets of three constructs each (six observables in all) that are now symbolically interlaced via a suitable metric (such as distance or rank) in the form of preference structures with posited ordinal properties. Meyer (1991) provides a full discussion of how such structures arise empirically in the application of unfolding analysis. Three such structures, each representing a particular set of interference conditions (IC1-IC3), are depicted in Table 4. To simplify matters further, if all permutations of each respective set of three constructs within a given structure can be regarded as conceptually equivalent in terms of the metric (as in three learning processes, and three learning pathologies, say), then in ordinal terms the IC1 structure is conceptually better than the IC2 structure which, in turn, is conceptually better than the IC3 structure. The ordinal (or near ordinal) character of any given interference condition can thus be established on reasonable grounds and is amenable to categorisation. Such ordinal categorisations, furthermore, represent explanatory sources of variation in their own right and have, in varying degrees of complexity, been consistently and statistically significantly associated with learning outcomes in a variety of higher education student learning contexts. These associations have been most clearly illustrated in studies by, for example, Meyer, Parsons, and Dunne (1990), Meyer and Sass (1993), and Cowie, Shanahan, and Meyer (1997).
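Under the simplifying assumption just stated (permutations within each set of three constructs are conceptually equivalent), an interference condition reduces to a ranked sequence of '+' and '-' labels, and its ordinal character can be scored by counting how many '-' observables are ranked above a '+' observable. The sequences below are invented examples of a near-consonant, an alternating, and a heavily interlaced structure; they are not claimed to reproduce the actual contents of Table 4.

```python
def interference_score(structure):
    """Count '-' labels ranked above a '+' label (0 = fully consonant)."""
    score = minus_seen = 0
    for label in structure:
        if label == "-":
            minus_seen += 1
        else:            # a '+' ranked below every '-' seen so far
            score += minus_seen
    return score

ic_a = "++-+--"   # near-consonant structure
ic_b = "+-+-+-"   # alternating structure
ic_c = "--+-++"   # heavily interlaced structure
print([interference_score(s) for s in (ic_a, ic_b, ic_c)])  # -> [1, 3, 8]
```

Lower scores correspond to conceptually 'better' structures, mirroring the ordering of IC1 over IC2 over IC3 described above.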
Summary and conclusion

'Dissonance' in study orchestration continues to emerge in sufficiently sensitive studies of student learning as a legitimate and important source of explanatory variation. Failure to detect 'dissonance', or excluding its presence, generally arises because of the insensitivity of the diagnostic instruments used, sampling limitations (the proportion of 'dissonant' observations involved in a given sample is usually relatively small), or because individual 'dissonant' observations (the conceptual 'outliers') that do not conform to the parameters of 'model fit' (or outcome space 'logical' connectedness, in phenomenographic terminology) are often mistakenly dismissed as being spurious for reasons attributable to measurement or response error. It has been argued, and empirically demonstrated, that individual manifestations of 'dissonance' can arise naturally within an interference model of student learning. Such a model accommodates 'dissonance' when present in terms of violations of conceptual
boundaries; these boundaries are typically inherent within the structure of the observed model, and they typically isolate (or define) the conceptually contrasting (or incompatible) dimensions of variation that may constitute the model in whole or in part. Accommodation can also occur via containment within a dimension of variation that is, in itself, internally 'dissonant'. Depending on the complexity of the model, 'dissonance' can thus be exhibited either across contrasting composite dimensions of the observed model, or within a single composite dimension of it. In either case one or more conceptual boundaries can be violated in a number of ways that are further (and independently) amenable to ordinal categorisation at an individual level of multivariate response in observed data. The modelling of 'dissonance' is theoretically important because it raises questions about the fundamental assumption of group representation that implicitly underpins the application and interpretation of a given model of student learning. This assumption is a fundamental one in the application of factor models, which are widely employed in studies of student learning. The phenomenon of 'dissonance' also poses challenges for phenomenographic research in particular, because it calls into question an authorised methodology of 'discovery' or 'construction' of 'categories of description' that, by virtue of imposed 'logical or hierarchical connectedness' requirements, are unlikely to accommodate 'dissonance'.
References

Biggs, J.B. (1985). The role of metalearning in study processes. British Journal of Educational Psychology, 55, 185-212.
Biggs, J.B., & Kirby, J.R. (1984). Differentiation of learning processes within ability groups. Educational Psychology, 4, 21-39.
Cliff, A.F. (1992). The 'educationally disadvantaged' student: Factors impacting upon conceptions of learning and perceptions of learning contexts. Unpublished M.Ed. dissertation, University of Cape Town.
Cowie, J., Shanahan, M., & Meyer, J.H.F. (1997). Measuring learning processes in first year economics: Preliminary results. Research and Development in Higher Education, 20, 209-230.
Entwistle, N.J., Meyer, J.H.F., & Tait, H. (1991). Student failure: Disintegrated perceptions of study strategies and perceptions of the learning environment. Higher Education, 21, 249-261.
Lindblom-Ylanne, S., & Lonka, K. (1999). Individual ways of interacting with the learning environment - Are they related to study success? Learning and Instruction, 9, 1-18.
Lindblom-Ylanne, S., & Lonka, K. (this issue). 'Dissonant' study orchestrations of high-achieving university students.
Meyer, J.H.F. (1991). Study orchestration: The manifestation, interpretation and consequences of contextualised approaches to studying. Higher Education, 22, 297-316.
Meyer, J.H.F. (1996). Some aspects of the individual-difference modelling of causal attribution. Higher Education, 31, 51-71.
Meyer, J.H.F. (1998). A medley of individual differences. In B. Dart & G. Boulton-Lewis (Eds.), Teaching and learning in higher education: From theory to practice (pp. 42-71). Camberwell: Australian Council for Educational Research.
Meyer, J.H.F., & Cleary, E.G. (1998). An exploratory student learning model of clinical diagnosis. Medical Education, 32, 547-581.
Meyer, J.H.F., & Dunne, T.T. (1991). The study approaches of nursing students: Effects of an extended clinical context. Medical Education, 25, 497-516.
Meyer, J.H.F., Dunne, T.T., & Sass, A.R. (1992). Impressions of disadvantage. I - School versus university study orchestration and implications for academic support. Higher Education, 24, 291-316.
Meyer, J.H.F., & Muller, M.W. (1990). Evaluating the quality of student learning. I - An unfolding analysis of the association between perceptions of learning context and approaches to studying at an individual level. Studies in Higher Education, 15, 131-154.
Meyer, J.H.F., Parsons, P., & Dunne, T.T. (1990). Individual study orchestrations and their association with learning outcome. Higher Education, 20, 67-89.
Meyer, J.H.F., & Sass, A.R. (1993). The impact of the first year on the learning behaviour of engineering students. International Journal of Engineering Education, 9, 209-217.
Meyer, J.H.F., & Scrivener, K. (1995). A framework for evaluating and improving student learning. In G. Gibbs (Ed.), Improving student learning through assessment and evaluation (pp. 44-54). Oxford: Oxford Brookes University.
Vermunt, J.D., & Verloop, N. (1999). Congruence and friction between teaching and learning. Learning and Instruction, 9, 257-280.
Walsh, E. (1994). Phenomenographic analysis of interview transcripts. In J.A. Bowden & E. Walsh (Eds.), Phenomenographic research: Variations in method (pp. 17-30). Melbourne: Royal Melbourne Institute of Technology.
Winer, B.J. (1971). Statistical principles in experimental design. New York: McGraw-Hill.
In the present study attention is focused on atypical patterns of engagement in contextualised learning that are often difficult to interpret because, at first sight, they exhibit varying degrees of conceptual dissonance. Against a summary of the practical and methodological problems associated with researching the phenomenon of 'dissonance', a review is presented of how it manifests itself in differing contexts, and with what implied or observed associated effects. The empirical question of how 'dissonance' may be interpreted and modelled is addressed. An example of an observed interference model of student learning, in the form of a common factor model, is introduced and is then used to illustrate how such conceptual violations may occur in practice.
Key words: Dissonance, Interference model, Study orchestration.
Received: February 1999
Jan H.F. Meyer, School of Education, University of Durham, Leazes Road, Durham DH1 1TA, U.K.; Tel: +191 374 1729; E-mail: j.h.f.meyer@durham.ac.uk

Current theme of research:
The modelling of individual differences in student learning and the development of student learning models that are discipline-specific.

Most relevant publications in the field of Psychology of Education:
Meyer, J.H.F. (1991). Study orchestration: The manifestation, interpretation and consequences of contextualised approaches to studying. Higher Education, 22, 297-316.
Meyer, J.H.F. (1996). Some aspects of the individual-difference modelling of causal attribution. Higher Education, 31, 51-71.
Meyer, J.H.F. (1998). A medley of individual differences. In B. Dart & G. Boulton-Lewis (Eds.), Teaching and learning in higher education: From theory to practice (pp. 42-71). Camberwell: Australian Council for Educational Research.
Meyer, J.H.F., & Cleary, E.G. (1998). An exploratory student learning model of clinical diagnosis. Medical Education, 32, 574-581.