What Foundations for Research Evaluation? - The Case of French Social Scientists

Philippe Jeannin

Université Toulouse III
LERASS (Laboratoire d'études et de recherches
appliquées en sciences sociales) & IUT de Tarbes
1 rue Lautréamont, BP 1624, 65016 Tarbes cedex, France.

Ministère de la Recherche,
Direction de la recherche, CDR-SHS, Paris, France

The ideas expressed in this communication reflect the opinions of the author.

ISSC Workshop, BBAW. Berlin, 14-15 March 2002


This contribution aims at establishing sound foundations for an evaluation of French research published in scientific journals in the field of the Social Sciences (SSc). Because the science of the SSc lacks maturity, a reliable method is needed. This method consists in criss-crossing the various databases which play an authoritative part -those of the ISI (Institute for Scientific Information) and others-, in listing the journal titles they index, and in asking each scientific community for its position. Hence the scientificity of a journal: a journal is "scientific" when it is considered as such by the scientists of its community. The advantages and drawbacks of the method are studied. Results are presented for the following disciplines: law, political science, anthropology-ethnology, education science, information and communication sciences.

Keywords: Anthropology, Bibliometrics, Education Sciences, Ethnology, Evaluation, France, Information and Communication Sciences, Law, Political Science, Research, Scientific Journals, Scientific Periodicals, Scientometrics, Social Sciences

In the field of the Social Sciences (SSc), the evaluation of research published in scientific journals appears to be somewhat difficult (Hicks, 1999). The very foundations of the evaluation of this production are sensitive and often controversial for the scientific communities concerned. This contribution aims at defining and then applying a method that makes it possible to know which journals -whether national or foreign- scientists in France consider scientific. This question is indeed the often forgotten preliminary to any bibliometric or scientometric study.

In this contribution, we will always keep in mind the necessity of proposing a positive mapping of research which allows for international comparisons, which can be applied to other disciplines and language areas, and which contributes, in the end, to the understanding of how to build or update bibliometric databases. This method gains its full meaning in the French context, where the disappearance of the Colbertist state -with its centralisation and its financial interventions (Papon, 1998 ; Mustar & Laredo, 2002)- has forced the French community of scientists, in every discipline, to take on more responsibilities.

Our study takes into account the totality of tenured scientists in the SSc, i.e. 8,300 people (Fréville, 2001, 121), split up into disciplines. Each disciplinary community is surveyed in order to know which journals it considers scientific. One could object that this work bypasses interdisciplinarity. This is not the case. On the one hand, scientists who publish in disciplines other than their own will be more accurately evaluated if a disciplinary list of journals is available. On the other hand, interdisciplinarity is not only a feature of scientists, but also of certain journals, which are considered scientific by several disciplinary communities.

The first point will recall the theoretical framework, which applies to all disciplines of knowledge. This science of science will be applied to the SSc. The second point will present our method, a quality process allowing for a further governance of research. Finally, a last point will underline the advantages and drawbacks of such a method. The appendix gathers some results of surveys already carried out in law, political science, anthropology-ethnology, education science, and information and communication sciences. For the complete method and results, refer to: http://www.u-paris10.fr/bibethno/21p.html and http://www.iut-tarbes.fr/enquete/index.htm

1. The Science of Social Sciences (SSc)

The evaluation of research is at first glance a field which seems quite well structured, with an extensive literature on its objectives, its methods, its standards, its difficulties and its results (OCDE, 1997 ; Giorgi & Tandon, 2000). At the same time, a critical approach came into existence, a genuine science of science, highly present in the so-called "hard" sciences. In the 1940s, Merton saw science as an ideal republic where standards such as universalism, communalism, disinterestedness, organized skepticism, originality and humility (Martin, 2000, 24-32) were shared by scientists. Today, science is seen more as a place of power, a market characterized by competition and conflicts, where one trades one's outputs for notoriety and recognition (Vinck, 1995, 55-81). Notoriety and peer recognition do not exclude various strategies for power and the development of professional networks. The dynamics of research laboratories also needs to be taken into account (Joly, 1997). Such theses are developed by Hagstrom, Bourdieu and other authors (Andersen, 2000, 675).

Various debates sprang from such arguments. First, the debate initiated by Gibbons, Limoges, Nowotny, Schwartzman, Scott & Trow in 1994, around a new mode of knowledge production breaking away from the traditional academic organization of disciplines and universities. Another debate, the triple-helix debate (Leydesdorff & Etzkowitz, 1997), underlines on the contrary the rehabilitation of the role played by universities, through an analysis of the relations between state, industry and universities. Our contribution belongs to a third type of debate -a synthesis of the two former ones- in which the activities of research in universities, although more and more diverse, do not lead to their decline (Godin & Gingras, 2000). The production of disciplinary lists of journals is seen as complementary to studies which are more global and more oriented towards research policy (Wilts, 2000).

So far, the evaluation of research in the SSc has been of little interest to scientists. Publications on the subject are scarce and contradictory. Even at the CNRS (1) level, the only source of information, Labintel, has many deficiencies and does not reflect the real state of published research. This problem is nevertheless crucial, because the visibility, the credibility and even the quality of this research are at stake.

If one admits that the SSc are sciences like the others, then one can apply the same criteria to them, all the more so in terms of evaluation. Scientometrics is the evaluation of scientific activity (Callon, Courtial & Penan, 1993). It allows for an assessment of the activity of research (research fronts, journals, citations...) in the "hard" sciences. The genuine proof of this activity is available in written form: articles, reports, books... Here, we will limit ourselves to articles in journals. To measure science, we could therefore measure printed scientific activity, and use to this end the databases created by Garfield in the United States within the ISI (Institute for Scientific Information: http://www.isinet.com), i.e., in the SSc, the SSCI (Social Science Citation Index) and the CCS&BS (Current Contents Social & Behavioral Sciences) (2). We would then know which scientists publish the most, which are the most cited...

Concerning the fields of the SSc and the humanities, this method cannot be reliable because the bases have many deficiencies (Glänzel, 1996 ; Kieffer & Peyraube, 2001). The major English-language journals are on the whole well represented, even if English-speaking authors themselves admit these bases are not totally reliable (Hicks, 1999 ; Katz, 1999). Developed countries are the best represented (Narvaez-Berthelemot & Russell, 2001 ; Ingwersen, 2000). The treatment of non-English-language journals is very uneven: some major journals are absent, whereas popularizing journals are covered. As a result, in the case of France, the indicators built from these bases do not reflect the real research activity of the country (Glänzel, 1996 ; Katz, 1999).

Retrieving too few titles may also entail a bias. For instance, in the fields of econometrics (Baltagi, 1998) and economics, some authors (Elliott, Greenaway & Sapsford, 1998 ; Kalaitzidakis, Mamuneas & Stengos, 1999 ; Kirman & Dahl, 1994) limit their selection of titles to around ten, all of them in the English language ; this further undermines the objectivity of their analyses.

If the SSc cannot be measured through the ISI bases SSCI and CCS&BS exclusively, they can still be assessed. In fact, the difficulty is not a bibliometric one. To conduct bibliometric studies, what matters is having access to relevant databases (Glänzel & Schoepflin, 1999, 43).

2. Elaboration of a method of evaluation

Our operating method is not limited to a mere compilation of the publications of scientists (De Looze, Coronini, Jeannin, Legentil & Magri, 1996 ; Combes & Linnemer, 2001): it criss-crosses the various bases which play an authoritative part in the field of research, and lists the journal titles they index. We also asked each community -and not just a few experts- for its opinion about this list. Hence the scientificity of a journal: a journal is "scientific" when it is considered as such by the community of scientists. This also implies that peer reviewing has been seriously carried out; scientificity, which is essential here (Harnad, 1998 ; Budapest Open Access Initiative http://www.soros.org/openaccess/), will not be questioned as it is for on-line publications (Fritch & Cromwell, 2001).

This method requires, first of all, a disciplinary division that is coherent both with the French institutions and with the institutions which recruit and promote scientists (Jeannin, 2000). This disciplinary division should be as close as possible to that of our European counterparts. The method is based on seven steps, repeated for each discipline:

  1. Identify which databases and existing lists -French and foreign- play an authoritative role, by associating confirmed scientists and specialized reference libraries. This step is the easiest one, because there is a consensus concerning the databases scientists refer to in their work.
  2. Draw up a first list of journals by criss-crossing the various bases selected in Step 1. This step is sensitive. Titles generally present in several bases will be kept. Some French specificities must not be overlooked, for example the fact that French scientists may neglect certain geographic areas or foreign languages. Another requirement: the selection will be restricted to journals which are 'alive' and available in numerous locations (http://www.sudoc.abes.fr). New journals should not be left out, nor should electronic ones. In the end, a single question will be asked: 'Do you consider journal X as scientific ?' with three possible answers: 'Yes', 'No', 'Do not know'. This assumes that each scientist has an opinion on the scientificity of journals.
  3. Before proceeding to the full-scale survey, test this first list of journals on a small number of scientists and people in charge of journals in information retrieval departments, in order to eliminate potential errors.
  4. Proceed to a large-scale survey. The number of scientists must be large, and they must be representative of their own community, so that the survey reaches a high level of relevance and reflects the views of this community as well as possible. Respondents also have the possibility of suggesting other titles.
  5. Count the answers, and then present the results on the Internet, because evaluation is a public good (Avery, Resnick & Zeckhauser, 1999).
  6. Validate Step 5 by calling a meeting to which all the people who answered the survey are invited. Such a meeting is essential, as it acts as a mirror for the community of scientists.
  7. Analyse. The final list of scientific journals can be used for various studies, e.g. determining research fronts, identifying the most active scientists or research laboratories, counting citations... This step is technically difficult, because some sources may have been wrongly recorded or not recorded at all. But the plasticity of the bibliometric tool allows for spatial or temporal comparisons. This type of work must always be carried out one discipline at a time, and by scientists with some computing knowledge, in order to avoid serious errors. This is a scientific evaluation, because we measure scientific activity, but it is also a strategic evaluation, because research programmes, institutions... may be concerned, and some recommendations can be made (Callon, Laredo & Mustar, 1995). This scientometric work therefore appears both as a basis for an evaluation of research and as a cornerstone of the scientific watch that is necessary in the disciplines of the SSc.
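The criss-crossing of Step 2 and the tallying of Step 5 can be sketched in a few lines of code. All database names, journal titles and vote counts below are hypothetical illustrations; only the one-third retention rule is borrowed from the law survey reported in the Appendix.

```python
import math

# Step 2 (sketch): keep titles present in at least two authoritative bases.
# Base names and journal titles are hypothetical.
bases = {
    "Base A": {"Journal X", "Journal Y", "Journal Z"},
    "Base B": {"Journal X", "Journal Z", "Journal W"},
    "Base C": {"Journal Y", "Journal Z"},
}
all_titles = set().union(*bases.values())
candidates = sorted(
    t for t in all_titles
    if sum(t in titles for titles in bases.values()) >= 2
)

# Step 5 (sketch): retain a journal when at least one third of the
# respondents answered "Yes" (the rule used in the law survey).
def retained(yes_votes, n_respondents, share=1 / 3):
    return yes_votes >= math.ceil(n_respondents * share)

votes = {"Journal X": 60, "Journal Y": 30, "Journal Z": 90}  # hypothetical
n_respondents = 112  # number of answers received in the law survey
final_list = [t for t in candidates if retained(votes.get(t, 0), n_respondents)]
```

With 112 respondents, the threshold is ceil(112/3) = 38 « yes », matching the figure given for the law survey.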

In this method, peer reviewing is not totally open, because scientists are presented with a list of journals, even if this list can be extended. In the end, we assess expertise: if articles are refereed, why should journals be treated differently? This second expertise is in no way redundant. The first reason is that it is not somebody assessing somebody else's work, but a collective identity assessing an object, i.e. a journal, which means less emotion involved. The second reason concerns the vision a scientific community has of itself, and not the vision an international community -should it exist- has of it. The third reason concerns the so often criticized authority of experts (Latour, 2001), which seems to be regaining ground today.

The surveys already carried out in five disciplines (cf. Appendix) tend to show that:

  • In these five disciplines, French research can in no way be evaluated through the ISI bases exclusively,
  • Only a criss-crossing of "non-ISI" bases makes it possible to identify the scientific journals of each discipline,
  • And, except for political science, foreign "non-ISI" bases do not make it possible to know which journals are scientific.

We will now analyse the advantages and drawbacks of this method, in which Step 2 introduces a lot of flexibility.

3. Consequences and limits of this method

Considering the advantages first, this method -transparent, public, testable and updatable- aims at being a quality process:

  • Transparent and public: because it was carried out according to clearly stated rules, and was accessible to scientists from the very start through to the results.
  • Testable and updatable: the feasibility of the study is an important requirement, just as the possibility of repeating it, because journals evolve (Jeannin & Devillard, 1994).

This method appears to enhance the research environment, and helps scientists to publish in journals considered scientific by each community. That means increased legitimacy and credibility (Jeannin, 2002). It also means that the evaluation of articles or laboratories will be carried out more easily, because one uses a list which is not confidential but public and validated by the community. Each discipline will benefit from an increased visibility. It will help database specialists in their selection of journals, reference librarians in their subscription policies (Puech & Tesnière, 2000), and editors and publishers in their strategies of promotion and digitisation. This is all the more important when the general climate of research publication is quite disturbed (Chartron & Salaün, 2000 ; Ollé & Sakoun, 2001 ; Russell, 2001).

For scientists, the evaluation represents an important stake. At the moment, among the 8,300 scientists in the SSc, 7,300 are employed in universities and, like their non-SSc colleagues, are faced with low salaries and unattractive career paths in which excellence is not privileged (Fréville, 2001).

So, with such lists of journals, validated by scientists themselves, one can come up with an accurate representation of a discipline (Jeannin & Mouton, 2001), draw maps of research thanks to the location of the authors of articles... The plasticity of this tool allows for spatial comparisons and does not exclude taking into account other indicators linked, for example, to collaboration between scientists or to the way knowledge disseminates in society (Landry, Amara & Lamari, 2001).

When criss-crossed at a European level, national disciplinary lists could be the first step towards new relevant European databases.

But this method also has limits which need to be mentioned:

  • Because it is based on a disciplinary classification, it does not give enough importance to various creative interdisciplinarities (Jeannin, 2000). Nevertheless, a minimal interdisciplinarity exists through journals considered as scientific by several communities.
  • It is assumed that each community has a strong link with its discipline but this is not always the case.
  • Printed journals may be privileged at a time when some call for the exclusive use of e-journals (Goodman, 2000), and when the work of the scientist has undergone important changes due to the use of information technologies (Lally, 2001 ; Russell, 2001).
  • It requires long-term support, more specifically in terms of finance and organisation, because of the need for continuous updating. To ensure legitimacy, this evaluation must be institutionalised and carried out with the involvement of scientists.
  • It should not become hegemonic. It provides a quantitative criterion among other criteria, whether quantitative or not, and is one of the bases of qualitative peer review. When scientific communities were small in number, had little money and lived in a closed environment, reviewing could be done more easily and quickly. This is no longer the case.
  • It must be completed by an evaluation of other research media (Andersen, 2000, 685), books in particular, and not by titles but by series (cf. Jeannin, 2001 for anthropology-ethnology), or by university presses (cf. Goodson, Dillman & Hira, 1999 for political science).
  • It may freeze schools of thought, research fronts and journals on well-defined and dynamic niches, leaving innovative openings aside. An editorial committee will, for example, tend to evaluate scientists according to the leading paradigms... Yet, the expansion of the market of research leads to specialisation (Stigler, Stigler & Friedland, 1995), each journal trying to have the monopoly of its field. This competition between journals may then allow the variety of schools of thought to be preserved.
  • It may lead to fraud -falsification, fabrication of data, plagiarism (LaFollette, 1992 ; Hubbard & Vetter, 1992)- if the evaluation criteria based on publications in scientific journals are elaborated only with a view to deciding on the allocation of funds and contracts. Nevertheless, in France most scientists are tenured. This reinforces their independence, even if it "weakens their aptitude to answer social demand and does not entice them to orientate their research towards potential economic spillover" (Encinas de Munagorri, 1998).
  • The legitimacy acquired through this method may be questioned if the transfer and the perception of research (by economic agents, the public, foreign scientists...) are biased. Scientists certainly have a part to play if they want their research to be valued as well as possible.

Our contribution aimed at laying the first basis of a quantitative and consensual evaluation of French scientific research in the SSc. This evaluation will be relevant only if it relies on lists of scientific journals which have been validated by each disciplinary community. This work is all the more urgent as the publications on this subject are biased and underestimate French research in the SSc. As long ago as 1993, Frey and Eichenberger noted, concerning the field of economics, that information on journals is much better in North America than in Europe, when in fact this information is strategic: in North America, the university market of research in the SSc is vast, homogeneous, competitive and based on the expertise of journals. In Europe, it is narrow, partitioned and lacks information on the quality of journals. The valorisation of research suffers from this situation.


References

Andersen (H.), 2000, « Influence and Reputation in the Social Sciences. How much do Researchers agree ? », Journal of Documentation, 56, November, 674-692.

Avery (C.), Resnick (P.) & Zeckhauser (R.), 1999, « The Market for Evaluations », The American Economic Review, 89 (3), 564-584.

Baltagi (B.H.), 1998, « Worldwide Institutional Rankings in Econometrics: 1989-1995 », Econometric Theory, 14, 1-43.

Callon (M.), Courtial (J.P.) & Penan (H.), 1993, « La Scientométrie », Paris, PUF, 126 p.

Callon (M.), Laredo (P.) & Mustar (P.), 1995, « La gestion stratégique de la recherche et de la technologie: l'évaluation des programmes », Paris, Economica, 477 p.

Chartron (G.) & Salaün (J.M.), 2000, « La reconstruction de l'économie politique des publications scientifiques », BBF (Bulletin des bibliothèques de France), 45 (2), 32-42.

Combes (P.P.) & Linnemer (L.), 2001, « La publication d'articles de recherche en économie en France », Annales d'économie et de statistique, 62, 5-47.

Elliott (C.), Greenaway (D.) & Sapsford (D.), 1998, « Who's publishing who ? The national composition of contributors to some core US and European journals », European Economic Review, 42, 201-206.

Encinas de Munagorri (R.), 1998, « La communauté scientifique est-elle un ordre juridique ? », Revue Trimestrielle de Droit Civil, (2), avril-juin, 247-283.

Fréville (Y.), 2001, « Rapport d'information sur la politique de recrutement et la gestion des universitaires et des chercheurs », Sénat, France, annexe au procès-verbal de la séance du 6 novembre, n°54, 522 p.

Frey (B.S.) & Eichenberger (R.), 1993, « American and European Economics and Economists », Journal of Economic Perspectives, 7 (4), Fall, 185-193. (Cf. two letters to the Editor and answers in JEP, 9 (1), Winter 1995, 203-207).

Fritch (J.W.) & Cromwell (R.L.), 2001, « Evaluating Internet Resources: Identity, Affiliation, and Cognitive Authority in a Networked World », Journal of the American Society for Information Science and Technology, 52 (6), 499-507.

Gibbons (M.), Limoges (C.), Nowotny (H.), Schwartzman (S), Scott (P.) & Trow (M.), 1994, « The New Production of Knowledge. The dynamics of science and research in contemporary societies », London, Sage.

Giorgi (L.) & Tandon (A.), 2000, « The Theory and Practice of Evaluation », ICCR Working Papers, 407, The Interdisciplinary Centre for Comparative Research in the Social Sciences, Vienna (Austria), 32 p. http://www.iccr-international.org

Glänzel (W.), 1996, « A Bibliometric Approach to Social Sciences. National Research Performances in 6 Selected Social Science Areas, 1990-1992 », Scientometrics, 35 (3), 291-307.

Glänzel (W.) & Schoepflin (U.), 1999, « A Bibliometric Study of Reference Literature in the Sciences and Social Sciences », Information Processing and Management, 35, 31-44.

Godin (B.) & Gingras (Y.), 2000, « Impact de la recherche en collaboration et rôle des universités dans la production des connaissances », Sciences de la Société, 49, 11-26.

Goodman (D.), 2000, « Should scientific journals be printed ? A personal view », Online Information Review, 24 (5), 357-363. http://www.emerald-library.com

Goodson (L.P.), Dillman (B.) & Hira (A.), 1999, « Ranking the Presses: Political Scientists' Evaluations of Publisher Quality », PS: Political Science and Politics, June, 9 p. http://www.apsanet.org/PS/june99/goodson.cfm

Harnad (S.), 1998, « The invisible hand of peer review », Nature, c. 5, Nov. 1998. http://helix.nature.com/webmatters/invisible/invisible.html

Hicks (D.), 1999, « The Difficulty of Achieving Full Coverage of International Social Science Literature and the Bibliometric Consequences », Scientometrics, 44 (2), 193-215.

Hubbard (R.) & Vetter (D.E.), 1992, « The Publication Incidence of Replications and Critical Commentary in Economics », American Economist, 36 (1), Spring, 29-34.

Ingwersen (P.), 2000, « The International Visibility and Citation impact of Scandinavian Research Articles in Selected Fields: The Decay of a Myth », Scientometrics, 49(1), 39-61.

Jeannin (P.), 2000, « Evaluation quantitative de la recherche en Sciences humaines et sociales: le tissu disciplinaire », Premier rapport de mission, Direction de la Recherche, Ministère de l'Education Nationale, de la Recherche et de la Technologie, mars, 32 p.

Jeannin (P.), 2001, « Evaluation quantitative de la recherche en Sciences humaines et sociales: premières listes de périodiques et de collections scientifiques », Rapport intermédiaire de mission, Direction de la Recherche, Ministère de la Recherche, février, 77 p. http://www.u-paris10.fr/bibethno/21p.html & http://www.iut-tarbes.fr/enquete/index.htm

Jeannin (P.), 2002, « Légitimer la recherche française en science économique », Sciences de la Société, 55, 189-204.

Jeannin (P.), Devillard (J.), 1994, « Towards a Demographic Approach to Scientific Journals », Scientometrics, 30 (1), 83-95.

Jeannin (P.), Mouton (M.D.), 2001, « Critères d'évaluation de la recherche en Sciences humaines et sociales: l'exemple de l'Ethnologie-Anthropologie sociale et culturelle », Communication au 6ème Forum Semmering, Lille, 6-8 décembre, 17 p. http://www.iccr-international.org/events/

Joly (P.B.), 1997, « Chercheurs et laboratoires dans la nouvelle économie de la science », Revue d'économie industrielle, 79 (1), 77-94.

Kalaitzidakis (P.), Mamuneas (T.P.) & Stengos (T.), 1999, « European Economics: an Analysis based on Publications in the Core Journals », European Economic Review, 43, 1150-1168.

Katz (J.S.), 1999, « Bibliometric Indicators and the Social Sciences », Paper presented for ESCR, Polaris House, Swindon (UK), 8 December, 11 p.

Kieffer (F.) & Peyraube (A.), 2001, « Problems of the Evaluation of the Scientific Production in the Domain of Humanities », Workshop on the Evaluation of the Scientific Production in Humanities, European Science Foundation (ESF), Budapest, June 7-9.

Kirman (A.) & Dahl (M.), 1994, « Economic Research in Europe », European Economic Review, 38, 505-522.

LaFollette (M.C.), 1992, « Stealing into print », Berkeley, University of California Press, 293 p.

Lally (E.), 2001, « A researcher's perspective on electronic scholarly communication », Online Information Review, 25 (2), 80-87. http://www.emerald-library.com/ft

Landry (R.), Amara (N.) & Lamari (M.), 2001, « Utilization of social science research knowledge in Canada », Research Policy, 30, 333-349.

Latour (B.), 2001, « Nouvelles règles de la méthode scientifique ? », Projet, 268, 91-100.

Leydesdorff (L.) & Etzkowitz (H.), 1997, « Emergence of a triple-helix of university-industry-government relations », Science and Public Policy, 23 (5), 279-286.

De Looze (M.A.), Coronini (R.), Jeannin (P.), Legentil (M.) & Magri (M.H.), 1996, « Determining the Core of Journals of a Research Centre: the example of Researchers from the Department of Rural Economy and Sociology of the Institut National de la Recherche Agronomique, France », Scientometrics, 36 (2), 167-183.

Martin (O.), 2000, « Sociologie des Sciences », Paris, Nathan, 128 p.

Mustar (P.) & Larédo (P.), 2002, « Innovation and research policy in France (1980-2000) or the disappearance of the Colbertist state », Research Policy, 31 (1), January, 55-72.

Narvaez-Berthelemot (N.) & Russell (J.M.), 2001, « World distribution of social science journals: A view from the periphery », Scientometrics, 51 (1), 223-239.

OCDE (Organisation de coopération et de développement économique), 1997, « The Evaluation of Scientific Research: Selected Experiences », Proceedings of an OECD Workshop on the Evaluation of Basic Research, OCDE/GD(97)194, 112 p.

Ollé (J.M.) & Sakoun (J.P.), 2001, « L'édition électronique en France: Fermeture [provisoire] pour inventaire », Diogène, octobre-décembre, 104-110.

Papon (P.), 1998, « Research institutions in France: between the Republic of science and the nation-state in crisis », Research Policy, 27 (8), December, 771-780.

Puech (C.) & Tesnière (V.), 2000, « Expertise scientifique et évaluation des collections. Une méthode appliquée aux fonds de linguistique de la BnF », BBF (Bulletin des bibliothèques de France), 45 (4), 96-104.

Russell (J.M.), 2001, « La communication scientifique à l'aube du XXIe siècle », Revue internationale des sciences sociales, UNESCO, 168, juin, 297-309.

Stigler (G.J.), Stigler (S.M.) & Friedland (C.), 1995, « The Journals of Economics », Journal of Political Economy, 103 (2), 331-359.

Vinck (D.), 1995, « Sociologie des sciences », Paris, Armand Colin, 292 p.

Wilts (A.), 2000, « Forms of Research Organisation and their Responsiveness to External Goal Setting », Research Policy, 29, 767-781.

Appendix: Five lists of journals

(Law, Political Science, Ethnology-Anthropology,
Education Science, Communication and Information Sciences)

For each survey, a first list was drawn up and tested, using the catalogue of the university documentation system http://www.sudoc.abes.fr. The scientists who took part in the survey could name other scientists to be involved in the testing. They could also add other titles of journals to the list they were submitted. The question they were asked was: « If you consider Journal X as scientific, tick the « Yes » column ; if you consider Journal X as not scientific, tick the « No » column. Tick « Do not know » if you do not know whether Journal X is scientific or not ». For each discipline, a meeting attended by the scientists who had taken part in the survey was called to analyse the results.

In sociology, the survey is about to be completed. Others are under way in the SSc and the Humanities: Economics, Art History, Psychology... The details and full results of the surveys already carried out are available on the following website: http://iut-tarbes.fr/enquete/index.htm

Acronyms used:

CCS&BS: Current Contents Social & Behavioral Sciences (ISI)
CNU: Conseil national des universités (3)
CoNRS: Comité national de la recherche scientifique (4)
ISI: Institute for Scientific Information http://www.isinet.com
SSCI: Social Sciences Citation Index (ISI)

1. Law

Number of scientists in the discipline: 2,500 scientists and scholars (sections 1 to 3 of the CNU, and 36 of the CoNRS)

Date the survey was carried out: from late 2000 to early 2001

Elaboration of the list: V. Allagnat and N. Trion, Bibliothèque Nationale de France http://www.bnf.fr and other experts.

Bases selected:

Number of journals selected: 217

Scientists invited to answer the survey: the survey was sent to all the people in France in charge of laboratories in law, so that they could dispatch it to their scientists. On the whole, the survey was sent to 255 people.

Received answers: 112 (40% from Private Law scientists, more than 30% from Public Law scientists, and around 10% from History of Law scientists).

List of selected journals: 110 titles received « yes » from at least one third of respondents (38 « yes » or more).

Commentary: among the 54 journals which received more than 56 « yes », 2 are absent both from the IFLP and Doctrinal, and 28 others are present in only one of these two bases. The coverage of the discipline by these two bases is rather satisfying, but only provided both bases are used! As for the third base (Wilson), only two of these 54 titles are referenced.

Comparison with the bases of the ISI: among these 54 titles, only two are present in these bases (CCS&BS Law, 87 titles, and SSCI Law, 104 titles). And out of the 110 titles of the list, four are present.

No French journal is present in these two bases.

2. Political Science

Number of scientists in the discipline: 400 scientists or scholars (section 4 of the CNU and section 40 of the CoNRS)

Date the survey was carried out: mid 2000

Elaboration of the list: the author of this contribution with the help of N. Dada, FNSP (Fondation nationale de science politique, National Foundation of Political Science) and other experts.

Bases selected:

  • IPSA (International Political Science Abstracts): http://www.ipsa-aisp.org/
  • IBSS (International Bibliography of the Social Sciences): http://www.lse.ac.uk/IBSS/
  • PAIS (Public Affairs Information Service): http://www.silverplatter.com/catalog/pais.htm
  • ABC pol sci (Advance Bibliography of Contents: Political Science & Government)

Number of journals selected: 59

Scientists invited to answer the survey: 29 scientists or scholars, all members of the Association des enseignants et chercheurs en science politique (Association of scholars and scientists in Political Science) (234 members).

Received answers: 14.

List of selected journals: 42 titles received four « yes » or more.

Commentary: the core of journals considered as scientific by 50% or more of the scientists amounts to 31 journals, all available in the IPSA and in at least one other base. Nevertheless, among these 31 titles, 13 are not selected by PAIS.

Criticism: Two journals outside the list were cited by the scientists: Politix and Critique Internationale. During the meeting, scientists suggested the list should be open to more titles.

Comparison with the bases of the ISI: among the journals which received at least 7 « yes » votes, 24 are present in the CCS&BS Political Science & Public Administration (147 titles), and 18 in the SSCI Political Science (78 titles). Among the 43 titles of the list proposed, 29 are in the CCS&BS Political Science & Public Administration, and 21 in the SSCI Political Science.

The first base selects only two French journals: La Pensée and Mouvement social; the second one, only one: La Pensée. Neither of these journals was listed in the survey available to scientists.

3. Anthropology-Ethnology

Number of scientists in the discipline: 400 scientists and scholars (section 20 of the CNU and 38 of the CoNRS)

Date the survey was carried out: 2000

Elaboration of the list: M. D. Mouton, LESC, Maison René-Ginouvès http://web.mae.u-paris10.fr/recherche/beinforma.htm and other experts

Bases and lists selected:

  • AIO: Anthropological Index Online. 750 journals http://lucy.ukc.ac.uk/AIO.html
  • AL: Anthropological Literature. 450 journals http://www-hcl.harvard.edu/tozzer/al.html
  • IBSS: International Bibliography of Social Sciences http://www.bids.ac.uk/, http://www.lse.ac.uk/IBSS/default.shtml
  • SSCI-A: SSCI Anthropology. 52 journals
  • FRA: FRANCIS, ethnology (INIST, Institut national de l'information scientifique et technique / National Institute for Scientific and Technical Information) http://www.inist.fr
  • CNRS: CNRS Périodiques
  • EV: Catalogue de l'association Ent'revues (IMEC) (Catalogue of the association Ent'revues)

Number of journals selected: 294

Scientists invited to answer the survey: 362 (among them, 205 were members of the AFA, Association française des anthropologues/French Association of Anthropologists, and all the 71 members of the APRAS, Association pour la recherche en anthropologie sociale/Association for Research in Social Anthropology).

Received answers: 76

List of selected journals: 84 titles received « yes » votes from at least one third of respondents (that is, 25 « yes » or more)

Commentary: 46 titles were ticked « yes » at least 38 times. Of these 46 titles, 6 are absent from the three bases AIO, AL, and IBSS. Hence the importance of having taken the four other bases into account.

Comparison with the bases of the ISI: among these 46 titles, 13 appear in the CCS&BS Sociology & Anthropology (175 titles) and 12 in the SSCI Anthropology (52 titles). Among the 84 titles of the full list, the numbers are respectively 20 and 16.

Seven French journals are present in the CCS&BS Sociology & Anthropology, and two in the SSCI Anthropology.

4. Education Science

Number of scientists in the discipline: 450 scholars (section 70 of the CNU)

Date the survey was carried out: mid 2001

Elaboration of the list: the author under the advice of experts.

Bases and lists selected:

  • W & E: titles retrieved both by HW Wilson & Eric:
    1. HW Wilson Education (only peer reviewed titles) http://www.hwwilson.com/
    2. Eric http://www.oryxpress.com/cije.htm
  • C & I: titles retrieved by C or I:
    1. C: CNCRE (Comité national de coordination de la recherche en éducation / National Committee of Coordination for Research in Education): 55 titles, sections 4/5 and 5/5, from the 1999 Report by J. Beillerot.
    2. I: INRP (Institut national de la recherche pédagogique/National Institute for Educational Research): 224 journals of category 1 from the catalogue of journals of the institution http://www.inrp.fr
  • IREDU (Institut de recherche sur l'économie de l'éducation/Institute of Research for the Economics of Education, Dijon, France): 43 titles http://www.u-bourgogne.fr/IREDU

Number of journals selected: 163

Scientists invited to answer the survey: 411

Received answers: 108

List of selected journals: 70 titles received « yes » votes from more than a quarter of respondents (that is, 27 « yes » or more)

Commentary: of these 70 titles, only ten are selected by W & E.

Comparison with the bases of the ISI: among these 70 titles, only 14 are present in the SSCI Education & Educational Research (96 titles), and none in the SSCI Education, Special (25 titles).

No French journal is retrieved in either of these two bases.

5. Communication and Information Sciences

Number of scientists in the discipline: 500 people, 400 of whom are scholars (section 71 of the CNU)

Date the survey was carried out: late 2000 to early 2001

Elaboration of the list: the author under the advice of experts.

Bases and lists selected:

  • Bases of the ISI:
    1. ISLS: 56 titles of the SSCI Information Science & Library Science
    2. LIS: 53 titles of the CCS&BS Library & Information Sciences
    3. C: 42 titles of the SSCI Communication.
  • FRANCIS: 150 titles from FRANCIS Information Sciences (INIST) http://www.inist.fr
  • BP: list of 41 titles elaborated by R. Boure and I. Paillart, eds., « Les théories de la communication », CinémAction, no. 63, March 1992.
  • SD: list of 40 titles elaborated by L. Sochacki and J. Devillard, « Des chercheurs en «info-com» et leurs revues », in LERASS, « La communication et l'information entre chercheurs », 1994
  • LC: list of 64 titles elaborated by Y.F. Le Coadic, « La science de l'information », PUF
  • SFSIC: list of 23 titles, elaborated by the SFSIC (Société française des sciences de l'information et de la communication/French Society for Communication and Information Sciences)

Number of journals selected: 261

Scientists invited to answer the survey: 442 (all the members of the SFSIC)

Received answers: 97

List of selected journals: 44 titles received « yes » votes from more than a quarter of respondents (that is, 24 « yes » or more)

Commentary: the number of core journals considered as scientific by 50% or more of the scientists is limited to 12 (three of them are published abroad). None is present in the bases of the ISI or in FRANCIS.

Comparison with the bases of the ISI: among the 44 titles of the list, 3 are present in the SSCI Information Science & Library Science and in the CCS&BS Library & Information Sciences; 8 are retrieved by the SSCI Communication and by the CCS&BS Communication.

No French journal is present in these four bases.


(1) CNRS (Centre national de la recherche scientifique) : institution responsible for basic research in France (11,300 scientists in all areas of science) (Fréville, 2001, 121).

(2) In spite of some imperfections, the ISI databases give much information (author address(es), cited references...). Usually, "non-ISI" databases do not retrieve this information and are unfit for bibliometric work.

(3) CNU: this National Council for the Universities recruits and promotes scholars (75 sections) (Fréville, 2001).

(4) CoNRS: for the CNRS, this National Committee for Scientific Research recruits researchers and evaluates the activities of laboratories and researchers (40 sections) (OCDE, 1997).
