Perspectives: Notes on the final discussion:
The next steps to be taken to prepare the next workshop

What policy makers & practitioners a) want, b) need, c) ought to hear
from the science & technology community about:

  1. mapping and measuring social science
  2. characterising change in the STI system

ISSC Workshop, BBAW, Berlin, 14-15 March 2002


Peter Healey:

In terms of the organisation of this last session, I think it's entirely open and unstructured. As I say, our objective is to look forward to what we might say to policy makers and practitioners in London, at a time still to be decided.

With that in mind, earlier this morning I asked three members of our company if they would contribute, bearing in mind what Sheila told us yesterday about how easily this kind of discussion can slip between the descriptive and the prescriptive. I wanted to establish three levels of potential demand for science and technology indicator studies. So I asked three people to say what policy makers and practitioners:

  1. want to hear; in other words, what they are now actively seeking from the science and technology community about these two issues: 'mapping and measuring social science' and, to put it more loosely than any of the knowledge-society and Mode II discussions we have had during these 24 hours, characterising changes in the science, technology and innovation system. Let's put it in that loose way; those are the two targets for this meeting. What they want to hear, or at the very minimum what they are prepared to listen to without too much persuasion. That is one level.

  2. The second level is what they need to hear from the science and technology community; in other words, what we believe they would have to understand and hear for their job to be done well.

  3. And the third is much more normative on our side; in other words, what we think they ought to be hearing, something a little more strategic in terms of the capacities for mapping and measuring science and technology.

And if I could ask Chris Caswill to start with the first of these: what he thinks policy makers and practitioners want to hear.


Chris Caswill:

The first thing I would like to say is that you need to be careful about assuming there is a generic model of policy maker or practitioner. I think policy makers are as affected by their contexts as other people.

People who are making policy for medical science will be different from people making policy for social science; that is the first thing I wanted to say.

Secondly, just to set things straight before I start on the difficulties: policy makers come at three levels, and this is important to understand.

First, there are macro-level decision makers: the people concerned with grand national policy, the 'Robert Mayers' of this world, people interested in whether, for example, genomics is a national priority for the UK or not. Then there are people concerned with strategic, programme-level decision making, typically within councils. And there are people concerned with micro-level decisions, actually allocating money and making decisions about that. They will all have slightly different needs. But what they all have in common, and this has been referred to several times, is that they don't have any time. That is a really important issue that has to be faced in this world: shortage of time is the biggest constraint within which policy makers have to operate.

The second thing, related to that, is that the world of policy seems to me to operate, as I am told, a bit like warfare. There are long periods of calm, when people can read and think and reflect, as I have been able to do for these two days. But there are also periods, like the two-day board meeting I have next week where we will allocate many millions of pounds, when we will be working under enormous time pressure, everybody around the table will be, and we won't have time to discuss whether we are engaged in post-normal science or anything of the kind.

I think those kinds of constraints are really important to understand.

What do we want to know? Within that kind of framework, I think we want to know what can help us be pro-active. No policy maker or practitioner likes to think of themselves as a purely passive animal; everybody is trying to make a difference. What information can we have that would help us take active steps: developing policies, making decisions, finding new directions, understanding whether there are gaps in capacity, understanding where we can cooperate with other organisations to achieve our goals? So I think information that feeds this pro-active element of policy is important.

We want to know things which help us account for public money, and that concerns accountability: public policy makers are obviously conscious of the fact that they are accountable, and they need information that addresses that. Occasionally policy makers and practitioners in science policy have defensive needs, and I suppose all of us associated with the social sciences particularly understand that. Most of us, I guess, have been in contexts where we had to answer the question: what is the use of social science? Is it any good? And that is obviously an important element.

So I think we also want information that occasionally allows us to justify and promote our special interests.

It seems to me there is a kind of listening mode: policy makers will occasionally be in listening mode, and then they want ideas. The need policy makers have for ideas is, I think, often underestimated. In my opinion, and this is an entirely personal view, I have never really seen policy makers substantially impressed by large quantities of empirical data that say: 'we have done all this stuff and, by the way, here is the answer'. 'It's not my answer', they say, and off they go. That's life. But what people are influenced by is ideas: about how the system works, the normal-science questions, whether the whole nature of science is changing, whether we are working in a new kind of world where we will have new kinds of goals, and so on.

I think those kinds of things have an enormous effect.

I don't think we want more empirical studies with lots of data; but I think increasingly policy makers want to know what the evidence base is, and that is a different question. I think this is a very big change: people addressing the kinds of questions science policy asks are now asked, what is the evidence base? Can you look systematically and say there is an evidence base for any of these assertions that have been made? That is different from commissioning another study. The evidence base is pretty thin, I think, in most science policy areas, and I think there are many jobs to be done to improve it.


Silvio Funtowicz:

I support many of these conclusions, in particular the one about the need for ideas rather than just empirical data; empirical data is just a way of conveying important ideas. In my own experience, which is limited to the European Commission, and I am not speaking across the board, only of my own relations, I have found that administrators and policy makers are more and more aware, which is the same phenomenon we find in universities. People are more aware, which is interesting. There is no memory, but there is awareness. This is very post-modern; it is what the Americans call 'streetwise'. Most policy makers and administrators have come to the conclusion that the magic bullet is not there. They hope somebody comes with the magic bullet, but they know it is not there, because they have had these problems so many times. So more and more an element of realism is coming in, and that is a challenge and an opportunity.

In that sense I think, although as I say I am not a social scientist, there is an opportunity and a challenge for the social sciences in the kind of contribution they can provide. I find personally that the social sciences often don't have an influence, maybe because one doesn't understand what they are talking about. Many things I read in social science journals I don't understand; they talk jargon and all the rest; they are internally closed disciplines. That is the challenge for the social sciences. But there is also a big opportunity, because of this new awareness. In the case of the UK, over the last five years I have been in contact with many senior people called policy makers in science: administrators in the Office of Science and Technology, chief scientists.
The fact is that they were very interested in what I had to say; perhaps BSE has something to do with it, I am sure it has, but I just put it that way. The fact is that there is now an opportunity for new ideas, and the challenge is of course for us to recognise the constraints: the institutional constraints, the investment constraints and all the rest. You cannot ask to close everything down and start all over again. So the strategic way to work, and this is strategic in a positive, not a negative sense, is as follows: on the one hand, try to push the envelope, try to bring new ideas that will totally change the place; on the other hand, show how you can get from here to there.

In that sense my activity has mainly been to bring forward the conceptual ideas through what I call demonstration projects. It is not data as such; I want to show an approach, how it works, and show people: this is the project, done the new way, and this is how it can be done. It is similar to case studies; I call them demonstration projects, you can call them case studies. It is important, because when people see how, you get another level of awareness. The argument is always that participatory research and participatory measures can of course be done in Denmark; I heard at the meeting we had last year at the European Academy that it can be done in Denmark and in the Netherlands, but never in the UK, because 'we are a more complex society', which was very funny. What I am saying is this: with these demonstration projects we should be able to show that this type of thing can be done not only in Norway, the Netherlands or Denmark. That is why I am also doing it in Sicily, in Latin America, in Africa, with those who are excluded from participation; I am doing it with women, with children.

The idea is to show that these new ways of working succeed not only in our ideal societies where participation is normal; you show that there is substance there. In that sense I think we should work more with concrete examples.


DISCUSSION FOLLOWING


Philippe Jeannin:

I want to present to you a demonstration of our project. The first thing to say is that in many countries official science figures do not include the social sciences.

The second thing to say is that my work is part of yours, because the foundation of the knowledge society as a whole is relevant academic research. Evaluation is needed, because in the European academic research market the picture of the social sciences lacks information on the quality of journals; there is no consensus. So I propose this agenda, with the following three targets. The first is, for each member of the Union and for each discipline, to elaborate a list of scientific journals. I think this is the first step for each member, and it is not very difficult, because disciplinary communities are well oriented to this problem: they need visibility, they need money. Scientists are aware of the stakes; they are aware that evaluation is necessary.

The second point is: once you have these national disciplinary lists, you criss-cross them at the European level. I think this is the first step towards a relevant European database.

The third one I want to present is to explore all the implications created by step two: all the implications of building an institution centred on review databases, a kind of European institute for scientific information.

I think you can reflect on this; I think it is urgent and important to reflect on this, because I think there is room for us in the social sciences and humanities; I don't think the same holds for the (...) sciences. This agenda is not very expensive: some work has already been done in some countries, though sometimes the researchers working on the problem don't know of each other. If you want figures: doing this from 1992 up to 2003 cost my country about 100,000 Euros, for eight or nine disciplines.
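Step two of this agenda, criss-crossing national disciplinary lists at the European level, amounts to a merge that flags which journals several countries recognise. The sketch below is a minimal illustration of that idea only; the function name, country codes, disciplines and journal titles are all hypothetical assumptions, not part of the proposal itself.

```python
# Hypothetical sketch of "criss-crossing" national journal lists
# into a European view: discipline -> journal -> listing countries.
from collections import defaultdict


def cross_reference(national_lists):
    """Merge per-country journal lists.

    national_lists maps country -> discipline -> set of journal titles.
    Returns discipline -> journal -> sorted list of countries listing it.
    """
    european = defaultdict(lambda: defaultdict(set))
    for country, disciplines in national_lists.items():
        for discipline, journals in disciplines.items():
            for journal in journals:
                european[discipline][journal].add(country)
    return {d: {j: sorted(cs) for j, cs in js.items()}
            for d, js in european.items()}


# Illustrative input: two national lists for one discipline.
lists = {
    "FR": {"sociology": {"Revue A", "Journal B"}},
    "DE": {"sociology": {"Journal B", "Zeitschrift C"}},
}

db = cross_reference(lists)
# "Journal B" appears on both national lists, a first signal of
# cross-national agreement on journal quality.
print(db["sociology"]["Journal B"])  # ['DE', 'FR']
```

Journals listed by several countries would be the natural starting point for the European database of step two, since they carry some cross-national consensus on quality.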

Peter Healey:

Two immediate questions. One is the advantages and costs of doing it; but the other is: what is the cost of not doing it? If this is something which policy makers and practitioners are supposed to need, what will happen to them if we don't do it?

Philippe Jeannin:

If we don't do it, others will: other companies, because it needs to be done.

Peter Healey:

So they would still get the information; it is a question of the competitive advantage of Europe in science and technology indicators, rather than of the information requirements of European decision makers. That's the summary.


DISCUSSION FOLLOWING


Stefan Kuhlmann:

I will give you three points on what policy makers and practitioners ought to hear.

The first point concerns the focus of this workshop: the emergence of heterogeneous, issue-driven research clusters. That is what I am focusing on, not science policy in general but these new dynamics. As long as we are talking about that, then first, policy makers ought to avoid simplistic output measurement, for instance the mechanical application of publication counts. That is, they should not follow too closely the Anglo-Saxon approach taken up in the 1980s and 1990s, the research assessment exercises. Take the U.S. experience with the Government Performance and Results Act and how it has been applied to research institutions: it has gone too far, and more and more signals are coming from the UK and from the U.S. saying that it has gone too far. That is what I have noticed.

We organised two workshops in the last two years, one in Germany and one in the U.S., with our American colleagues, on new procedures in research evaluation. What I heard from my colleagues, and from my American colleagues, is that they are meanwhile suffering from these rather simplistic kinds of measurement and are striving for more complex, more intelligent, more reflexive approaches, because simplicity works out, in the medium term at least, as a disincentive to creativity in research and science. So the first point is to avoid simplistic output measurement.

Second, of course we need outcome-oriented but multi-dimensional performance criteria for research, in particular for this kind of heterogeneous, issue-driven research cluster.

We need them, but we need them in a different way. My second requirement is to create room for the self-governance of these heterogeneous research clusters, innovation clusters: self-governance giving them the ability and the opportunity to learn from what they are doing. Self-governance does not mean laissez-faire, but it does mean that what they do is up to them. They should have this free rein for a limited period of time, in my view let's say five years, and then they will be measured according to outcome-oriented, multidimensional performance criteria. Multidimensional means there might be science-internal dimensions, there might be relations to the expectations of the public, there might be economic dimensions to what they performed, and there might of course be policy-relevant aspects to it; and all of that can be broken down into qualitative and quantitative criteria for measurement, after some time at least.

Then make decisions on that basis; give others room for manoeuvre, room for new ventures. So: create room for self-governance. That was the second one.

The third and final point is that, in order to be able to do all this, policy makers themselves would have to strive for more intelligent management concepts for running science and research policy. One element of that is that they should change the assessment regime from time to time, because only then can opportunistic behaviour by the actors be avoided. The more you codify your assessment procedures, the easier it is for the actors to adapt to them and to present a facade. Look at what the research assessment exercises and other highly mechanistically organised procedures finally produce: actors try to obey them and produce fakery instead of creativity. That is the problem and the result, so you have to change your regime from time to time, surprisingly. That is very important. The second aspect of this last point, before I turn to the policy measurement idea, is that you need to make use of, and also to facilitate, science policy studies or innovation policy studies, the type of thing we try to perform, as a basis for new ideas about how to measure and assess the outcome of complex research activities funded with public money.

I come back to the idea of strategic intelligence that I tried to introduce in my presentation this morning: to be understood as an interlinking of sources providing information on how we could do things better. That links up with the requirements Chris Caswill mentioned at the beginning: policy makers are interested in listening to new concepts, and that is my experience as well.


