Cultural Analysis And What Designers Need To Know - A Case of Sometimes Too Much, Sometimes Too Little, And Always Too Late

An invited response to Bader and Nyce (1998) Theory and Practice in the Development Community: Is there room for cultural analysis?

Andrew Dillon

This item is not the definitive copy. Please use the following citation when referencing this material: Dillon, A. (1998) Cultural Analysis And What Designers Need To Know - A Case of Sometimes Too Much, Sometimes Too Little, and Always Too Late. ACM Journal of Computer Documentation, 22,1, 13-17.

Bader and Nyce's article raises intriguing issues that have concerned researchers in HCI and user-centered systems design for much of the last decade: to what extent can a deep social science methodology usefully influence the process of technology design? Their conclusion, that cultural analysis yields knowledge perceived to be of little value by system designers, is, in my view, largely correct. However, while I share their conclusion, I do not accept their rationale. In the present paper I will attempt to demonstrate that the root of the problem lies less with the system designers than with the inappropriate application of the specific social science methods Bader and Nyce invoke, which itself can be traced to their overly narrow view of the design process and their assumption that cultural analysis is the most useful social scientific method. 

It appears to me that Bader and Nyce's analysis rests firmly on one key point: developers make epistemological errors. These errors thus systematically bias designers' output and hence color their designs. The authors describe two such error types. The first is that classic problem of designers assuming that users are somewhat like them, and hence the designers' judgments of what constitutes good design will logically be shared by the intended users. The second type of error reflects the designers' views of social life being describable in terms of rules, albeit complex rules, which enable interaction to be predicted. 

This key point of epistemology needs to be examined critically since much of what follows in the Bader and Nyce article, and my response, flows from it. The first type of error is common, and has been seen and documented repeatedly in the literature on design. From Hansen (1971), who first urged designers to 'know the user', to Dillon and Watson's (1996) review of the individual differences among people that could influence interaction, all writers on user-centered design have urged designers to think beyond themselves. How successful this has been is an open question. My own experience in systems design suggests that the issue is slightly more complex than Bader and Nyce contend. Designers are quite prepared to believe that typical users are not at all like themselves, but this does not prevent them from adopting stereotypical views of the user population, such as "Joe Sixpack who wants to get on the Web" or "a middle-aged woman with no experience of computers who just wants to hit a button and have it work". Nor does it prevent them from invoking common sense as a rationale for design, considering interface features suitable on the basis that "anyone can see the available options" or "there is a help system for anyone who is unsure". Thus, I see designers as fundamentally well-intentioned but largely ignorant of the necessary methodological steps to follow to ensure that user issues are fully addressed. Clearly cultural analysis might have a role here, though not necessarily in the form Bader and Nyce advocate, and I shall return to this point later. 

The second type of error that Bader and Nyce claim designers make is more complex. Invoking contingency as a defining characteristic of social life, the authors state that designers (mistakenly) wish for laws of human behavior that will enable predictions to be made. Well, good for them, I say. Designers are not alone in wanting this, and any social scientist who claims to be happy with descriptive rather than predictive knowledge is, in my view, not doing her job. Rich descriptions just don't cut it if they cannot be used to guide future action, and a theory that fails to support the derivation of any prediction is of little use to me. Contingency renders prediction difficult but it does not preclude prediction at any level, and this is key to my objection. 

When designers reduce ethnography simply to a means of generating frequency data on behavior (as the authors claim), they will miss what Bader and Nyce feel is the richest use of this method: the identification of differences between the social worlds of participants and the gaining of an understanding of how meaning is made. This 'rich use' is never employed by designers because, according to the authors, not only does it violate designers' assumptions of how the world works but, most importantly, designers find the output of this form of analysis less than useful. Herein lies a problem that I feel is too often ignored in discussions of social science in design. Academic studies of human behavior are complex, difficult to perform, and in many cases require some understanding of theory to appreciate their outputs. When practitioners in these fields bemoan the lack of understanding outsiders demonstrate, they miss the point. Social science studies, in and of themselves, do not exist to serve outside purposes, be they design or otherwise, but to increase our understanding of humans, and we should never demand of the social sciences in their pure form that they offer direct guidance to software developers (for an extended discussion of this point see Dillon, 1996a). One may as well ask theoretical physicists to justify their practices in terms of producing faster chips, and blame chip designers for not fully appreciating the work of Einstein. Thus designers, trained in engineering and required to produce working tools, are never likely to appreciate the subtleties of ethnography as long as it fails to demonstrate significant practical advantage for them. It is not just a matter of wanting to reduce the world to laws; it is a case of showing what difference any information makes to practice. To fail to see this requirement of any theoretical position or method is to fail to apply the basic point of ethnography itself: the differences in the participants' social worlds. 

So this leaves us looking for cases where cultural analysis works well in design, and to use such examples as a means of educating designers, perhaps refining our methods, and suggesting future practice. And it is here that I have my most serious problems with the Bader and Nyce article, and, once again, where I find myself in agreement with some of their conclusions, but not their rationale. The failure of hypermedia to produce the educational benefits proposed or claimed is one of the most under-reported phenomena in the field of HCI. The case for use of hypermedia has never been adequately demonstrated in the literature (see Dillon, 1996b for more on this), and the authors' case study provides some interesting pointers to why this may be the case. My reading of their account suggests that the teachers had invested so heavily in the technology that they were blind to its shortcomings and their own failings as teachers. The sample quote from a student complaining that the teacher "does not explain what he wants" is telling here. But in terms of the authors' analysis such data are extremely limited. Could these data have been obtained earlier in the design process? Could a designer take them into account and use them to inform the development or even the implementation of the hypermedia? It is not clear from this example that they could, and this has nothing to do with world view, and everything to do with the means of obtaining these data. The comments came from real use with an embodied artifact in a classroom setting, and while they may well reflect all the clashes of world view that ethnography suggests exist, my point is: how could such issues be identified and incorporated into development work early enough to make any difference? 

The most optimistic answer I can provide here is that such analyses do offer the chance to develop a perspective on technology use that might serve to inform future technologies of this kind. Thus the designers responsible for the educational technology cited may learn from these accounts some issues about use and implementation that they can bear in mind for future designs. But even then, it is not clear precisely what lessons could be learned: that different stakeholders have different values and requirements? No surprises there! That a teacher in love with the technology might be blind to its shortcomings and prove over-zealous in his or her advocacy of technology in the classroom? Is there anything here that is unique to cultural analysis or that would not have been observed with other theoretical lenses or methods? I don't think so, and the authors never demonstrate this, yet it is their key example of the value of cultural analysis. Concluding that developers should "learn how to address the taken-for-granted, common sense, structures of meaning that shape our social worlds" (p. 8) is really not helpful in such cases, and is almost humorous if it is meant to convey a course of action to someone not steeped in the terminology of anthropology. Know your users? 

So where does this leave us? First, ethnography or associated approaches are not geared for use by designers and we should not attempt to hand them over for use anyway, no more than we should expect social scientists to do the work of designers. It is not clear that any two ethnographers reviewing the authors' accounts would be able to deduce any implications for system design, and why should they? Ethnographers are not trained to be system designers and are not usually required to deduce the relevance of their findings to this domain. The leap from data to design implication is complex and nothing in any theory or method currently advocated for design purposes explicates this step. The ability to do so comes from experience in the design community, experiences that most social scientists lack. 

Second, just what does cultural analysis predict? This is not a loaded question designed to irk ethnographers everywhere but one that has a purpose. Invoking the spectre of positivism at the end of their analysis, Bader and Nyce seem to be preparing a defense against this question, but accusing the accusers of belonging to a long-discredited philosophical camp that has had few adherents post-1950 is itself worthy of ethnographic analysis: why do some qualitative folks get so upset by the request for predictive power that they resort to tainting objections with labels? 

I believe that it is the assumed precision of the required prediction that worries most people, and this may be why discussions of predictive power are so often avoided or reduced to name-calling. I have another view. The power of an ethnographic analysis might lie more in the insights it offers those involved at the earliest stages of design who (ideally) perform the user and task analyses. While it might be impossible to demonstrate a simple cause-effect relationship between cultural analysis and resulting interface use, I would think that any process of stakeholder analysis and scenario evaluation (see e.g., Eason 1988) might usefully exploit ethnographic methods, especially if these included that unsavory process of frequency counting. Such uses of methods could demonstrate utility much as interviews do, not as rigid mathematical chains of causality between finding and subsequent user performance (which the authors invoke as the only yardstick designers care about), but as context builders, supporting the user and task analysis work that is at least accepted as necessary in most design processes. 

The image of designers sitting in judgment of social science that Bader and Nyce convey is perhaps a problem too. Design is a complex process that is rarely handled by one person. There is a need for multiple methods at various stages and any presumed technology transfer relationship between social science and engineering need not follow a simple "hand over the method" model. Other relationships are possible, and while the formal modeling approach of Card et al (1983) has had influence out of all proportion with its value to the HCI field, some transfers are not so neatly packaged. I see much more scope for designers to work with social scientists in teams, a far more subtle form of technology transfer but one perhaps better suited to our ends. In this situation, the cultural analysis work could be accurately performed by social scientists and imported into participatory design meetings where new technologies are first conceived. Such information could shape the technology design that emerges far more powerfully than any attempt to get designers to think ethnographically themselves. 

Under such circumstances, there is a way forward for any social science method, including cultural analysis, but only if it proves useful in influencing the design process. If it can, it will not be in the pure form that cultural analysts such as Bader and Nyce appear to advocate (with designers throwing off their world views and suppressing their desires for predictive power), but as an analytic method that sheds light on users and stakeholders. This may not be pure cultural analysis as anthropologists in the field practice it, but so what? I am trained as a psychologist, and I believe psychology has immense relevance for system development projects, but I do not expect to use formal experimental methodologies in all my usability tests, or to make the designers I work with aware of all the details of the theories which drive my inputs, or bemoan their understanding of statistical inference. An applied social science is of necessity a partial re-packaging of the science base, as civil engineering re-packages physics and social work re-packages sociology (though I do not presume that either of these examples offers the idealized form of re-packaging for cultural analysts to follow), and even my own design framework for hypermedia (TIMS) is a simplified, qualitative wrapping of the complex socio-cognitive activities underlying information use. 

As user-centered design methods invoke multiple theoretical positions to justify various techniques, the issue is really one of involvement. Social scientists must be involved in the design process at the outset, and throughout the development process user issues must be addressed in a manner that is suitable to the problem at hand. If cultural analysts can demonstrate the utility of their perspective and the data it yields, then it is likely to be in terms of better understanding the context of use for the technology. However, it is not clear that cultural analysis offers anything unique here. Socio-technical theorists have long advocated stakeholder analysis, and scoping out of scenarios of use for design consideration is now a standard procedure at the earliest stages of any user-centered design process. Perhaps there is more that ethnography can offer, as an analytic rather than descriptive tool (e.g., Anderson, 1994) but that case remains to be made. 

Obviously part of the difficulty, and one that Bader and Nyce allude to (p. 5), is the oversimplification of social science methods by designers. These authors are not alone in complaining: we have produced data from studies of the European software industry (Dillon, Sweeney and Maguire, 1993) that demonstrate how easy it is for design teams to claim adherence to the methods of user-centered design without actually performing them in a manner that most of us would consider appropriate. While Card et al (1983) outlined an ambitious plan for technology transfer from science to design that completely embraced the re-packaging philosophy mentioned earlier, my own personal view is that until we train social scientists about software development so that they can get involved in person, or perhaps better yet, train computer scientists and software engineers in the techniques necessary for true user-centered design, progress will always be slow. 

So while I agree with Bader and Nyce's conclusion that we have a long way to go, I do not blame the intended recipient community for not valuing our knowledge in general or the outputs of cultural analyses in particular. After all, none of us places much value on information that appears to have little relevance to us, and merely asking people to believe in a method without commensurate demonstration of its value in their terms is pointless. Too much description and too little prescription too late in the process is a recipe for rejection. And this is precisely what cultural analysis as it is currently explicated offers. To make converts to such creeds requires one to breed them or convince them with miracles. Since social science traffics in the former rather than the latter, the progressive route is clear. To accelerate matters we might reconsider some of our own assumptions and methods as social scientists before asking other communities to reconsider their world view.


Anderson, R. (1994) Representation and requirements: The value of ethnography in system design. Human-Computer Interaction, 9, 151-182. 

Card, S., Moran, T. and Newell, A. (1983) The Psychology of Human-Computer Interaction. Hillsdale NJ: Lawrence Erlbaum Associates. 

Dillon, A. (1996a) TIMS: A framework for the design of usable electronic text. In: H. van Oostendorp and S. de Mul (eds.) Cognitive Aspects of Electronic Text Processing, Norwood NJ: Ablex. 99-120. 

Dillon, A. (1996b) Myths, misconceptions and an alternative perspective on information usage and the electronic medium. In: J.F. Rouet et al (eds.) Hypertext and Cognition, Mahwah NJ: LEA, 25-42. 

Dillon, A. and Watson, C. (1996) User analysis in HCI: the historical lessons from individual differences research. International Journal of Human-Computer Studies, 45(6), 619-638. 

Dillon, A., Sweeney, M. and Maguire, M. (1993) A survey of usability evaluation practices and requirements in the European IT industry. In: J. Alty, S. Guest and D. Diaper (eds.) HCI'93. People and Computers VII. Cambridge: Cambridge University Press. 

Eason, K. (1988) Information Technology and Organizational Change. Bristol PA: Taylor and Francis. 

Hansen, W. (1971) User engineering principles for interactive systems. American Federation of Information Processing Societies Conference Proceedings, 39, 523-532.