Saturday, November 06, 2010

Digital Intelligence

Last month Thomas, one of my blogging colleagues, discussed in his post The future the possibility of doing a PhD on digital intelligence. Thomas wrote:

Howard Gardner’s multiple intelligences theory has always had me interested and somewhat of a believer. I certainly believe in the essence of the theory as he wrote it, but I feel that the theory has been abused and diluted and manipulated in wrong ways over the past few years (having read research about it for the past 6 years). But, yes, I do subscribe (as many probably do) with the notion of different intelligences.

What I would like to investigate is whether with this massive explosion in technology in people’s (see; children’s) lives, is there a new branch of the ‘intelligences’ that Gardner came upon emerging that we could term ‘digital intelligence’? That is to say, are people now developing new and distinct ways to comprehend technology, the new ways technology is creating and presenting information, and the way the digital world works?

For those who don't know Thomas, he is just completing an honours degree in education at the University of Sydney. I promised to provide a comment.

Much of the focus on the application of computing and communications technology in the classroom or in business has been on ways to better use the technology. I say computing and communications technology rather than just digital because the focus appeared well before the internet.

You can see this focus in the various Australian school curricula. There the acquisition of various types of computing skills, the use of technology to do things, is built into every subject and every stage. You can also see it in teachers' blogs such as that of Maximos62, who is enthusiastic about the possibilities opened up by the digital technologies.

The actual impact of the technologies on the way that students, and people in general, think is less well understood.

We already know that technology shapes thought in often unseen ways. The rise of the motor vehicle is a classic example. It fundamentally reshaped the structure of life in country, town and city. It also changed the way in which we look at the world. It actually altered our mind settings, embedding new perceptions of space and time.

We know that computing and communications technology is having similar effects. Indeed, we talk about it all the time. Yet the actual effects on the way we think are not well understood.

The initial stages of the computing and communications revolution focused on the storage, transmission and dissemination of data. Initial applications were business focused. The rise first of the PC and then of the internet added access and presentation to the original focus; the concept of interactivity emerged; the digital world became personal.

There are considerable tensions between the old and the new.

The use of the new computing and communications technologies in organisations, driven by the desire to achieve uniformity and processing efficiency, led to the emergence of what I call command and control organisations. The bounds of individual authority were reduced, replaced by central decision rules and various types of performance measurement. My last post, Scoping the decline in organisational performance, is concerned in part with what I see as the adverse effects of these changes.

The internet and the associated new tools, including social networking, pose a fundamental challenge to the command and control paradigm because they transfer power from the organisation to the individual. To a degree, organisations, including governments, that have used the technology to improve processing efficiency and to assert central control now struggle with changes that threaten that control. You can see this play out in, for example, the debate over internet filtering.

At the individual level, people are still coming to grips with just what this new world means, not just in terms of the use of the technology itself, but also in terms of its impact on ways of thinking and acting. The debate over Facebook and privacy is an example.

Whether all this translates to a new type of intelligence, digital intelligence, is open to question. My problem here lies in the use of the word "intelligence". Quite clearly, new ways of thinking and acting are emerging. Quite clearly, people's ability to access and use the new technology varies enormously; the digital divide originally foreshadowed in this country by people like Barry Jones is here. However, does this constitute an "intelligence" in the way referred to by Gardner? It may, but it's also a question of definitions. 

To my mind, the more interesting question is the way the technology is actually affecting the way children think. If you look at the debate in this area at present, it seems to be generally problem focused. Cyber bullying is an example. There is, I think, much less focus on changes in structures of thought and of perceptions, on the way this affects learning and behaviour.

This is, of course, a huge topic. It may be that the use of Gardner's concept, a discussion of what constitutes digital intelligence and how we might measure it, is one way in. Whichever way Thomas goes, the topic is an important one. 

2 comments:

Thomas said...

Thanks for this, Jim. I have much to say in reply, and insufficient time. I will get to it though.

Jim Belshaw said...

Take your time, Thomas.