Douglas Smith
Department of Sociology, Western Kentucky University
104 Grise Hall, 1 Big Red Way
Phone: (270) 745-2152
Email: Douglas.Smith@wku.edu
Webpage: www.wku.edu/~Douglas.Smith/
Office Hours: I will be available Wednesday afternoons between 2:30 and 6:00 or by appointment. |
This course will differ from most of the other courses in your graduate career because its focus is on the research methods used in social science rather than on a particular substantive area. We will consider the nature of social science as a science; the role of theory in science; problem formulation and research design; the experimental method; measurement, including the validity and reliability of our measures, whether single items, indexes, or scales; different modes of observation (survey, field, archival, and biographical methods); the problems of sampling; the logic of data analysis; ethics in social science research; and grant writing.
This course provides a chance to put the disparate parts of theory, methodology, and statistics together and into practice. A primary objective of this course is to sharpen your ability to evaluate social research and to plan and conduct your own projects. Every social scientist doing research faces tough decisions about how to collect and analyze data, and the outcomes of those decisions affect the quality of the data and the analysis. Though reasonable people might differ on precisely where to draw the line between worthless and worthwhile research, by the end of the course you should have a better sense of what distinguishes good research from bad.
There are at least two views of methodology. In the first view, statistics and methods are completely intertwined: good methods are, essentially, statistical methods, and any research method that does not involve statistics is dismissed as nonscientific. The second view, which I hold, sees research methods as all the ways of collecting data. Each collection method has its own problems with the validity and reliability of the data collected, and statistics, under this view, are used mainly to shore up claims of validity and reliability. Thus, if you collect your data properly, you will have less need for extravagant and arcane statistics. (Do not misunderstand me: I am not saying that you should neglect your studies in statistics. Take as much statistics as you can stand. You will need the extravagant statistics for those times when you cannot collect the best data. For example, you will probably need to use multilevel logit modeling and discriminant analysis at some point in your career as a social scientist; you just will not use them every day.)
The fundamental requirements are as follows: (1) attend class, (2) read and acquire a basic understanding of all assigned materials for each class, (3) be prepared each week to discuss those materials and explore the ideas contained in them, (4) write, and (5) work collectively on a group project. Grades will be determined on the basis of performance in these areas.
The course requirements will be assessed using the following measures:
1. Class Attendance (15% of grade)
2. Testing Your Understanding (35% of grade)
There will be a take home mid-term (15%) and a final exam (20%).
3. Weekly Preparation for Class (25% of grade)
Each week you will be expected to read the assigned materials and come to class with bullet-style summaries of that week's readings. These summaries will be turned in DURING CLASS.
4. Writing Assignments (25% of grade)
You will write three short papers, 5-8 pages long, typed, double-spaced. In these papers you will address the question: "What do I know and understand now about research methods that I didn't know or understand (or misunderstood) before?" This requirement forces you to do what any good academic must learn to do; namely, constantly explore the boundary between your own ignorance and knowledge.
NOTE: These papers are not to be reports or summaries of the readings. (You do those for every class.) Neither are they to be reports or summaries of what was said in class. Instead, I want you to think about how what we read and discuss applies to your own particular situation, knowledge base, and/or research project, and write about it. It is also helpful to talk about what you don't understand.
There are no required texts for this course. (Most quality schools shun the use of textbooks at the graduate level in favor of reading packets.) At Western, we will use readings that I will put out on our course website. To supplement these readings, we will also be reading several papers from the Sage University Paper Series on Quantitative Applications in the Social Sciences.
While I do not believe it will be necessary, those individuals truly concerned that their background in research methods is lacking might pick up a copy of Thomas J. Sullivan's Methods of Social Research (or any other research methods textbook; the more you read, the more you know).
NOTE: Additional Suggested Readings on a topic are denoted with an asterisk (*).
January 10. Opening Session: Intro to the course.
A. The Principles of the Scientific Method
Labovitz, Sanford and Robert Hagedorn. 1981. "Evidence and Causal Analysis," Pp. 1-21 in Introduction to Social Research, 3rd ed. New York: McGraw-Hill.
*Labovitz, Sanford and Robert Hagedorn. 1981. "Decision Making in Scientific Research." Pp. 130-143 in Introduction to Social Research, 3rd ed. New York: McGraw-Hill.
Phillips, Derek L. 1973. "Warranting Knowledge." Pp. 81-101 in Abandoning Method. San Francisco, CA: Jossey-Bass.
*Wallace, Walter L. 1971. "Introduction." Pp. 11-29 in The Logic of Science in Sociology. Chicago: Aldine-Atherton.
*Zetterberg, Hans. 1965. On Theory and Verification in Sociology, 3rd ed. Totowa, NJ: The Bedminster Press.
B. Critiques of the Scientific Method and Sociology
Denzin, Norman. 1989. "An Interpretive Point of View." Pp. 1-33 in The Research Act. Englewood Cliffs, NJ: Simon and Schuster.
McCaughey, Martha. 1993. "Redirecting Feminist Critiques of Science." Hypatia 8(4): 72-84.
Campbell, Donald T. 1984. "Can We Be Scientific in Applied Social Science?" Evaluation Studies Review Annual 9:26-48.
Collins, Randall. 1989. "Sociology: Proscience or Antiscience?" American Sociological Review 54:127-139.
*Smith, Dorothy E. 1987. "Women's Perspective as a Radical Critique of Sociology." Pp. 84-96 in Sandra Harding (ed.), Feminism and Methodology. Bloomington, IN: Indiana University Press.
Scarce, Rik. 1995. "Scholarly Ethics and Courtroom Antics: Where Researchers Stand in the Eyes of the Law." The American Sociologist Spring:87-112.
Van Den Hoonaard, Will C. 2001. "Is Research-Ethics Review a Moral Panic?" La Revue Canadienne de Sociologie et d'Anthropologie/The Canadian Review of Sociology and Anthropology 38(1):19-36.
Crow, Graham, Rose Wiles, Sue Heath, and Vikki Charles. 2006. "Research Ethics and Data Quality: The Implications of Informed Consent." International Journal of Social Research Methodology 9(2):83-95.
*Greeley, Andrew M. 1998. "Social Science Sinners." Society 35(2):239-244.
Bernard, H. Russell. 2000. "The Foundations of Social Research." Pp. 30-57 in Social Research Methods: Qualitative and Quantitative Approaches. Thousand Oaks, CA: Sage Publications.
Onwuegbuzie, Anthony J. 2000. "Expanding the Framework of Internal and External Validity in Quantitative Research." Paper presented at the Annual Meeting of the Association for the Advancement of Educational Research, Ponte Vedra, FL.
Daniel, Larry G. and Anthony J. Onwuegbuzie. 2002. "Reliability and Qualitative Data: Are Psychometric Concepts Relevant within an Interpretivist Research Paradigm?" Paper presented at the Annual Meeting of the Mid-South Educational Research Association, Chattanooga, TN.
Sullivan, Thomas J. 2001. "Constructing Questions, Indexes, and Scales." Pp. 148-182 in Methods of Social Research. Orlando, FL: Harcourt.
Sullivan, John L. and Stanley Feldman. 1979. Pp. 9-28 in Multiple Indicators: An Introduction. Beverly Hills, CA: Sage.
Maxim, Paul S. 1999. Pp. 210-232 in Quantitative Research Methods in the Social Sciences. New York: Oxford.
Bernard, H. Russell. 2000. Pp. 144-162 in Social Research Methods: Qualitative and Quantitative Approaches. Thousand Oaks, CA: Sage Publications.
Sullivan, Thomas J. 2001. "Nonprobability Samples." Pp. 205-214 in Methods of Social Research. Orlando, FL: Harcourt.
Onwuegbuzie, Anthony J. and Kathleen M.T. Collins. 2004. "Mixed Methods Sampling Considerations and Designs in Social Science Research." Paper presented at the Annual Meeting of the Mid-South Educational Research Association.
*Frankel, Martin R. and Lester R. Frankel. 1987. "Fifty Years of Survey Sampling in the United States." The Public Opinion Quarterly, 51(4):S127-S138.
*Biernacki, P. and D. Waldorf. 1981. "Snowball Sampling: Problems and Techniques of Chain Referral Sampling." Sociological Methods and Research 10(2): 141-163.
A. Basic Experimental Designs
Sullivan, Thomas J. 2001. Pp. 222-250 in Methods of Social Research. Orlando, FL: Harcourt.
Riecken, Henry W. and Robert F. Boruch. 1978. "Social Experiments." Annual Review of Sociology 4:511-532.
B. Problems of Experimental Design
Sechrest, Lee and William Yeaton. 1992. "Magnitudes of Experimental Effects in Social Science Research." Evaluation Review 6(5): 579-600.
Leamer, Edward E. 1983. "Let's Take the Con out of Econometrics." American Economic Review 73(1):31-43.
Fitchen, Janet M. 1990. "How Do You Know What to Ask if You Haven't Listened First?: Using Anthropological Methods to Prepare for Survey Research." The Rural Sociologist Spring: 15-22.
Converse, Jean M. and Stanley Presser. 1986. Survey Questions: Handcrafting the Standardized Questionnaire. Beverly Hills, CA: Sage.
B. Survey Implementation
Dillman, Don. 2000. "Chapter 1: Introduction to the Tailored Design Method." Pp. 3-31 in Mail and Internet Surveys. New York: John Wiley & Sons.
Dillman, Don. 2000. "Chapter 4: Survey Implementation." Pp. 149-193 in Mail and Internet Surveys. New York: John Wiley & Sons.
Morgan, David L. 1996. "Focus Groups." Annual Review of Sociology 22: 129-152.
Kitzinger, Jenny. 1994. "The Methodology of Focus Groups: The Importance of Interaction between Research Participants." Sociology of Health and Illness 103-121.
Sullivan, Thomas J. 2001. "Unobtrusive and Available Data Research." Pp. 287-316 in Methods of Social Research. Fort Worth, TX: Harcourt College Publishers.
Webb, Campbell, Schwartz, and Sechrest. "Physical Traces: Erosion and Accretion." In Unobtrusive Measures: Nonreactive Research in the Social Sciences.
Webb, Campbell, Schwartz, and Sechrest. "Archives II: The Episodic and Private Record." In Unobtrusive Measures: Nonreactive Research in the Social Sciences.
Nandy, Bikash R. and Paul D. Sarvela. 1997. "Content Analysis Reexamined: A Relevant Research Method for Health Education." American Journal of Health Behavior. 21(3):222-234.
Daniels, Glynis. 1995. "The Forest Related Content of Children's Textbooks: 1950-1991."
Sheldon, Jane P. 2004. "Gender Stereotypes in Educational Software for Young Children." Sex Roles 51(7/8):433-444.
Denzin, Norman. 1989. "Strategies of Multiple Triangulation." Pp. 234-247 in The Research Act. Englewood Cliffs, NJ: Prentice-Hall.
Williamson, Graham. 2005. "Illustrating Triangulation in Mixed-Methods Nursing Research." Nurse Researcher 12(4):7-18.
Ungar, Michael. 2006. "'Too Ambitious': What Happens When Funders Misunderstand the Strengths of Qualitative Research Design." Qualitative Social Work 5:261-277.
Tentative Course Schedule

August
    25  Intro
    27  Principles of Scientific Method
September
    1   No Class (Labor Day)
    3   Critiques of Scientific Method
    8   Ethics
    10  Problem Formulation and Research Design Development (FIRST RULE)
    15
    17  *
    22
    24  Measurement
    29
October
    1   Reliability and Validity
    6
    8   Construction of Questions, Indexes and Scales
    13  Sampling
    15  Experiments
    20  *
    22  Problems with Experiments
    27  Survey Research
    29
November
    3   Survey Development
    5   NO CLASS
    10  Focus Groups
    12
    17  *
    19  Unobtrusive and Available Data
    24  Content Analysis
    26  NO CLASS
December
    1   Triangulation
    3
Finally, my disclaimer:
DISCLAIMER: The university may have adopted a business model; however, education is NOT a business. Moreover, the syllabus is not some sort of sacred contract (at the very least, the course calendar is not a sacred contract), but more along the lines of a road map. The readings in the course calendar are places we are scheduled to visit. Anyone who has taken a preplanned road trip or vacation knows that the trip is not fun unless you stop at interesting roadside attractions, even though they might divert you from your original route or timetable. It's the process of getting there that is fun/relaxing/intriguing. In that light, the above schedule and procedures for this course are subject to change in the event of extenuating circumstances.