Wednesday, December 8, 2010

Dissertation Prospectus
Influence of Early Personal Computers and Texts on Programming Style
John Bork
University of Central Florida

Summary

Interest in learning computer programming outside of computer science peaked from the mid 1980s through the early 1990s, as personal computers diffused widely into public schools and universities in the United States. During this period, Sherry Turkle revealed through ethnographic research that children learning to program exhibited a number of distinct programming styles. Often these individual styles clashed with the dominant culture, forcing some to conceal their true stylistic preferences, thwarting the creative impulses of others, and discouraging some altogether from writing programs. Her theoretical model relies strongly on developmental psychology and post-structuralist cultural critique; it attributes only minor relevance to the specific computer languages being learned and makes no mention of the different brands and models of computers being used. My dissertation prospectus outlines a strategy for discovering how structural, rhetorical, and media-specific features of early personal computers and their accompanying texts relate to the development of individual programming style, by interviewing people who learned to program on such systems in the late 1970s through mid 1980s. The purpose of the data collection phase is to determine the relevant platforms, texts, and programming styles. The analysis phase will seek connections between them through qualitative analysis of the empirical data from the interviews. Close readings of Original Equipment Manufacturer (OEM) manuals, books, and magazines associated with these computer platforms will employ theories and methods from the study of texts and technology to respond to the gap in Turkle's work. The synthesis stage of the dissertation may yield recommendations for best practices in teaching computer programming in general settings, and may guide the development of future computing platforms and accompanying texts designed for learning.

Literature Review

For a period in the late 1970s through early 1990s, the rapid proliferation of personal computers in public schools and universities generated substantial interest among humanities scholars in learning computer programming itself (Papert; Turkle, 1984; Turkle and Papert; Perkins, Schwartz, and Simmons), in how skills related to programming may transfer to other domains (Hoar; Mayer), and in the overall phenomenon of the growing presence of computers in education (Shields). Then the interest diminished, and scholarly study of learning programming devolved to the specific domains of computer science and technology education, and only for those children considering programming as a career (Gillespie; Ge, Thomas, and Greene; Panell). There is no clear picture of the current state of computer technology instruction in American public and private schools; however, it is likely to have shifted from a programming-centric to an applications-centric emphasis. (If necessary, this hypothesis will be tested via empirical research, but it is not anticipated as a precondition affecting the validity of the proposed methodology.) Put another way, programming-centric learning environments equivalent to those of the late 1970s through early 1990s (offering at least the same affordances) are likely not offered to the same proportion of school-age children as they were during that period. Interest in programming over application-based literacy has diminished since the 1990s, as articulated by David Brin in a 2006 online article, "Why Johnny Can't Code," among others (Turkle, 1995; Bogost). A few who do promote the casual learning of programming in general academic contexts recommend very simple, modern markup languages such as HTML and XML (Cummings). Recent observers of this phenomenon suggest learning programming needs a different basis, such as using long-defunct, early models of personal computers (PCs) on account of their relative simplicity and operational proximity to the raw hardware (Brin; McAllister; Bogost).
Is such an approach sensible? Has mainstream American culture shifted such that it is no longer reasonable to expect a great deal of interest in what I would like to call general-purpose, casual programming? Within the realm of cultural studies, a possible means of analysis is to examine whether particular models of early PCs encouraged the development of particular programming styles, and how different programming styles fit into the overall culture of the time. While it may be fanciful to resurrect obsolete PC models to encourage children to learn programming, their design, marketing, and the texts that accompanied them, especially the bundled instruction and reference manuals, may hold lessons for the next generation of devices.

Seymour Papert invented the Logo programming language as part of his vision of bringing together the power of easily programmable computers and the innate creativity of small children, as recounted in his 1980 book Mindstorms: Children, Computers, and Powerful Ideas. He set other researchers in motion and helped introduce personal computers to American schools. Sherry Turkle, who performed ethnographic studies of children learning to program in the late 1970s through mid 1980s, and later published with Papert, argued that children naturally express different programming styles. She called them 'hard mastery' and 'soft mastery'; children flourished or floundered in their efforts at learning programming depending on whether the learning environment and culture was compatible with their form of mastery (Turkle, 1984; Turkle and Papert, 1990; Turkle and Papert, 1991). She used the term 'bricolage', taken from Claude Lévi-Strauss's anthropological studies of pre-industrial societies, to represent a programming style that "works on a problem by arranging and rearranging these elements, working through new combinations" (1984, p. 105). The essential differences between the formal, canonical programming style that is typically taught and the 'bricoleur' style are the programmer's relationship to the materials (abstract versus concrete) and attitude toward errors (avoidance versus acceptance). Turkle writes, "for planners, mistakes are missteps; for bricoleurs they are the essence of a navigation by mid-course corrections. For planners, a program is an instrument for premeditated control; bricoleurs have goals, but set out to realize them in the spirit of a collaborative venture with the machine" (2000, p. 136). Most of her research focused on the Logo programming language, which she argued was superior to BASIC in affording expression of the soft mastery style, and she made no distinction between the programming language and the particular model of machine her subjects were using. She leaves to a footnote a brief analysis of the affordances of Logo over BASIC: "Not all computer systems, not all computer languages offer a material that is flexible enough for differences in style to be expressed. A language such as BASIC does not make it easy to achieve successful results through a variety of programming styles" (1984, p. 105).

Turkle's interest in programming styles represents one aspect of research in learning computer programming. Studies about what teaching techniques do and do not work, and about whether and how particular skills such as algorithm flowcharting and troubleshooting (debugging) may transfer from a programming environment to other domains, are more typical. Richard Mayer edited Teaching and Learning Computer Programming: Multiple Research Perspectives in 1988, a book containing research articles reflecting the rapid proliferation of personal computers in public schools and universities. The history of research on teaching and learning computer programming, as related by Mayer in the introduction, began with strong claims by Papert and others concerning the expected positive outcomes of non-directed methods for teaching programming. This was followed by empirical research revealing disappointing realities: "learning even the rudiments of LOGO appeared to be difficult for children and transfer to other domains seemed minimal" (p. 3). The state of the field in 1988, characterized by multidisciplinary research and theory, had retreated from the early, positive claims. Two trends he notes are "first, instead of advocating discovery methods of instruction, current research suggests that more guidance is needed to insure learning. . . . Second . . . transfer is most likely to occur for skills most similar to those learned in programming" (p. 4). Nonetheless, Papert's influence is evident in the number of chapters that deal with the Logo language. The next four chapters of Teaching and Learning Computer Programming focus on Logo: componential problem solving, cognitive analysis of learning Logo, its influence on intellectual development, and teaching methods that emphasize transfer of general skills. Chapter 11 returns to the topic of skills transfer from programming environments to non-programming contexts, again using Logo, and, contrary to Mayer's assessment in the introduction, claims to successfully teach a skill that can be transferred beyond computer programming environments.

The other 'learning languages' besides Logo of note in Mayer's anthology are BASIC and Pascal. Chapter 7 presents the research of Perkins, Schwartz, and Simmons on “Instructional Strategies for Novice Programmers,” in which a 'metacourse' was designed to provide supplementary material to a BASIC programming curriculum. It attempts to address three common sources of difficulty for beginning programmers: fragile knowledge of the domain, lack of elementary problem-solving strategies, and attitudinal problems of confidence and control towards computers. This theme is continued in Chapter 8, which investigates the social context of learning computer programming. In Chapter 9 the results of a particular instructional project (the Autonomous Classroom Computer Environments for Learning) in high school Pascal programming classes are analyzed. Chapter 10 is a case study of typical errors students make in an introductory Pascal class. A glance at the history of scholarly journals combining computing and teaching reveals an expectant atmosphere in which it was assumed by most that programming instruction would continue to expand in public school curricula.

The bulk of current literature on learning computer programming is published within computer science and technology education domains, focusing on classroom settings where the currently popular languages used by professional programmers are taught and, not surprisingly, concentrating on post-secondary instruction. A typical research article is "Exploring the relationship between modularization ability and performance in the C programming language: the case of novice programmers and expert programmers" by Maurice Vodounon. The days of general-purpose, casual programming instruction appear to be gone, along with wood shop, sewing, and other 'home economics'. While research publications about learning computer programming have increased in quantity in the last two decades, by and large they appear in domain-specific journals and conferences rather than in venues more popular for 'humanities computing'. This is not to disregard the rich dialog in new disciplines like Software Studies (Manovich) and Critical Code Studies (Wardrip-Fruin). However, these theorists appear more concerned with overall, societal technology consumption than with the topic of learning programming.

A few standouts among humanities scholars who do directly address learning programming can be found, and they will be considered next. It is easier to see connections to the themes developed in Mayer's anthology than direct continuations of Turkle's work, although Turkle is often invoked in studies of class, race, and gender biases in instructional and work environments, which frequently associate hard mastery with masculine preferences and soft mastery with feminist themes (McKenna; Lynn, Raphael, and Olefsky; Sutton; Lau and Yuen). Recent research into 'pair programming' by Jill Denner and Linda Werner further elaborates on the importance of background knowledge and social support for novices that was the subject of the metacourse designed by Perkins, Schwartz, and Simmons.

Robert E. Cummings published an interesting article in 2006, "Coding with power: Toward a rhetoric of computer coding and composition," that explicitly ties programming, as a form of composition targeted toward machine readers, to composition targeted toward human readers. Besides making a theoretical argument linking programming and composition by presenting a parallel of the rhetorical triangle for addressing machines, he offers practical tips for how to implement his ideas in a class. In private correspondence, Cummings supports having students write Extensible Markup Language (XML) Document Type Definition (DTD) representations of the rhetorical elements of the narratives and essays they are studying as a replacement for writing BASIC programs. His emphasis on writing software code to clarify conceptual understanding alludes to the object-oriented programming style, which, alongside Turkle's bricoleur style, can be seen as rich ground for asserting individual perspectives by focusing on how programming problems map onto their external referents rather than on how to code algorithms most efficiently. Turkle recognized the trend toward object-oriented metaphors in Life on the Screen, exemplified by such groundbreaking changes in human-computer interfaces as the desktop-oriented Apple Macintosh of the mid 1980s. By the mid 1990s her research had turned from studying how people learn to program computers to how people relate to and live in computer environments in general.

From this review of scholarly literature on learning programming a number of gaps are evident, and these gaps appear to be echoed in the popular press, which laments America's loss of its edge in technical innovation and attributes it to failures of the educational system. The argument of the article by Soloway, Spohrer, and Littman in Chapter 6 of Teaching and Learning Computer Programming is that it is better to focus on the process rather than the product when the goal is to teach that problems can be solved in multiple ways. This idea complements Turkle's notion of epistemological pluralism. Her early investigation of programming styles could be refreshed using more recent programming languages and computing platforms, to see what is happening in elementary and secondary schools today. Alternatively, there is the unasked question of whether the specific, material conditions of learning programming influence the development of individual style. By material conditions I mean both the computer platforms themselves, such as the Apple II+ and Commodore 64, and the texts that were bundled with them by the manufacturer, such as the Applesoft Tutorial and the Commodore 64 Programmer's Reference Guide, as illustrated in Figure 1.

This question about the influence of specific platforms and media can be studied, first, by interviewing people who learned to program during the same period in which Turkle did her research focusing on Logo. N. Katherine Hayles (2004) coined the term Media-Specific Analysis (MSA) to describe a critical technique, applicable here, that takes into account the physical and structural properties of different media that putatively perform the same rhetorical function. Second, the site of learning programming can be shifted from formal instruction in classroom settings at colleges, universities, and public and private schools to where many people in America first learned to write programs: at home, alone, self-taught via OEM manuals, library books, and magazines. Many respondents to online surveys and discussion forums state that they were self-taught (see Table 1 in the Data Collection section). In these settings the texts used alongside the computer itself often fulfilled the role of an accompanying instructional regime. A third gap to consider is extending the categories of programming styles beyond Turkle's dichotomy of hard and soft mastery by importing knowledge from mainstream computer science literature and the history of software studies. For instance, there may be a recognizable stylistic difference between procedural and object-oriented approaches. There may also be distinct styles tied to specific computing platforms, a point made by Bogost and Montfort in their cultural and technical study of the Atari 2600 game console. In the spirit of qualitative research, however, these questions are best answered using a mixed method of critical analysis guided by empirical study of human subjects.

Methodology

The research questions I am posing revolve around the gaps in Turkle's early work on programming styles, informed by theorists like Hayles who, with Turkle, recognize that specific technologies and texts can have a broad impact on learning, development, and personality, and perhaps, in this case, on programming style. Understanding the relevance of the affordances of specific technological systems and texts serves to remind us that certain obsolete platforms may offer advantages for learning programming over contemporary, putatively superior ones. The outcome of this study may reveal recommendations for best practices in teaching computer programming in general settings, and may guide the development of future computing systems (for example, systems built entirely from free, open source software) and their accompanying texts, both printed and online, when they are designed as platforms for learning. These research questions are:

  • What platforms were used in learning to program in the late 1970s through mid 1980s?

  • What printed texts (OEM manuals, popular press books, textbooks, magazines) were used?

  • What programming styles developed?

My methodology is summarized in the following outline. See Appendix D for the overall project milestones and target completion dates.

  1. Data Collection Stage

    1. Refine Interview Protocol

    2. Select Interview Candidates

    3. Submit IRB materials

    4. Conduct Interviews

  2. Data Encoding Stage

    1. Code Interview Data from Forms

    2. Transcribe and Code Significant Audio Recordings

  3. Data Analysis Stage

    1. Qualitative Analysis of Key Platforms, Texts, and Programming Styles

    2. Structural, Rhetorical, and Media-Specific Analysis of Key Platforms and Texts

  4. Synthesis Stage

Data Collection

An interview protocol has been developed that elicits data about what computer platforms and what texts people used while learning to program. It also inquires into when, where, and with whom they did so, as well as what kinds of programs they wrote. At the heart of the interview process is a single-page form that provides prompts for the researcher and room to take organized notes. See Appendix A for a blank form and an image of one that was filled out during a trial interview. Trials of the interview process took 30-60 minutes to cover all of the items on the form, including a substantial amount of free discussion at the end of the interview. However, the protocol needs to be adjusted to provide prompts for learning more about the programming style characteristic of the subject. Turkle seems to imply a connection between programs that ask the user questions and soft mastery, although her assignment of style hinges more on how the programmer relates to the computer platform running the program and to the source code than on the actual nature of the programs written. This may be due to her choice of environment (instructional setting) and programming language (Logo) influencing the type of programs being written. It is hoped that Sherry Turkle or one of her students can be solicited to provide guidance. Susan Lammers' interviews with influential programmers in her 1986 book Programmers at Work are helpful for seeing the sorts of responses that may indicate programming style without begging the question, because their wider scope itself provides narrative, biographical evidence of programming style.

Interviews will be conducted over telephone, voice over IP (VOIP), or in person. During the interviews the researcher takes handwritten notes using the interview form, and also makes an audio recording for later transcription if permission has been given to do so. The Informed Consent Form in Appendix C provides a check box where the interview subject can elect to be recorded or not. In either case, the researcher must be diligent in removing all personally identifying information from any quotations or paraphrases that appear in subsequent oral and written presentations based on this research.

The intended subjects for this study are persons likely to have used early personal computer platforms and associated texts (manuals, books, magazines, etc.) that were popular in the United States from the late 1970s through mid 1980s. Interview candidates will be selected based on their participation in existing surveys, polls, and discussion group threads that address how, when, and with what platforms they first learned to program computers. Sample data sources from which to cull potential interview subjects are shown in the following table. Obviously, only those resources that provide email addresses or another way to contact the respondent, such as through the discussion forum messaging system, will be considered.

URL: http://stackoverflow.com/questions/348098/are-you-a-self-taught-programmer-or-did-you-take-a-programming-course

Survey Question: Are you a self-taught programmer or did you take a programming course?

Sample Response: Self taught using ZX BASIC/assembly on a ZX spectrum. Got it for the games but quickly became very interested in what was happening underneath. No internet/forums so just had to make it up as I went along. Then did a university degree which required us to do programming but did not really teach it (apart from some simple Fortran 77). Was good for me as I was really interested in programming anyway. Then used Fortran/C++ in the real world just by continued learning on the job. Have continued self teaching ever since (e.g reading stackoverflow) but I don't get to do as much programming as I would like these days...... (posted Dec 7, 2008)

Respondent: Registered User “luapyad”

URL: http://blogs.msdn.com/b/johnmont/archive/2006/05/03/586766.aspx

Survey Question: How did you learn to program?

Sample Response: I started as a hobbiest in high school typing in programs from Compute! magazine into my C=64. The most programming I did at that point was debugging my type-o's which forced me to understand the source of the problem in order to fix them. I started professionaly with MSAccess (using Mike Groh's Access 95 book). I now work with VB.Net/Sql Server/ASP.Net/etc. Most of the learning I get at this point is through trial-error. I also find user groups to be a good source of free training as well as networking opportunities. For a new programmer, the best learning will be to give them a simple task and have them figure out how to solve it. Sure they will make mistakes along the way, but we often learn best from our mistakes. We can read all the books we want, but actually doing is what counts. As evidence, consider the number of people who read the certification books, pass the tests and still can't code their way out of a shoe-box. (posted May 3, 2006)

Respondent: Registered User “Jim Wooley” (registration required to access additional user profile information)

URL: http://www.quartertothree.com/game-talk/showthread.php?p=2082157

Survey Question: How did you learn to program?

Sample Response: I started learning (and learned about backups) simultaneously; the first while modifying my uncle's TRS-80 Color Computer animation program, and the second by saving my alterations to his original tape in ignorance of cassette tape leaders. (posted February 4, 2010)

Respondent: Username “ciparis”

Table 1: Data sources for selecting interview candidates

Interviews may also be solicited through blog postings and other social media, as well as personal references. There are a number of software development businesses in the greater Orlando area including Toptech Systems, Electronic Arts, Fiserv, and NCR with which the researcher has ties. A sample email invitation to participate in the research is shown in Appendix B. Following University of Central Florida Institutional Review Board (IRB) procedures, all interview subjects must read, sign, and return a copy of the Informed Consent Form prior to being interviewed.

Data Encoding

Data will be encoded from the interview forms and audio recordings. Sets of identifiers will be created for each category (computing platform, texts, types of programs written, and other stylistic indicators) to normalize the data into units represented by machine configurations, printed texts, and programming styles. A spreadsheet may be used to initially encode the handwritten interview forms, although it is anticipated that custom software will be developed in conjunction with the Spring 2011 Digital Editing and Database course to facilitate encoding (a minimal sketch of this encoding step follows Table 2). Table 2, below, is a spreadsheet representation of sample interviews displaying the interview number, the age at which and year in which each subject started programming, the computer platforms, and the OEM texts they recall using. The full interview form contains many more dimensions, some of which may take multiple entries for each interview subject to reflect concurrency and temporal change in their programming activities.

Number | Age | Year | Computer Model | OEM Manual
1 | 18 | 1978 | Mainframe PDP |
2 | 18 | 1980 | PDP-10 or 11 minicomputer |
3 | 13 | 87/88 | Apple //e, PC jr (later that year) |
4 | 10 | 1985 | C-64, 486 in college | big thick manual that came with it
5 | 9 | 1979 | Vic-20, C-64, PCs late 80s, Mac 2006 | Reference Manual, 64 had 6510 instruction set, section on architecture, very basic overview of kernel, sprites, sounds
6 | 24 | 1978 | Heathkit H-100 (H8) dual 8085/8080 kit, Zenith Z-100 integrated unit | MSDOS/ROM BASIC came with ROM chips, not the computer

Table 2: Spreadsheet containing interview data from 6 sample interviews
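To make the anticipated encoding step concrete, the following minimal sketch, written in Python (an assumption, since no implementation language has been settled for the custom encoding utilities), parses a spreadsheet row laid out like Table 2 and maps the free-text Computer Model cell onto platform category numbers. The keyword-to-number mapping is a hypothetical placeholder; the real enumeration will only be fixed after the interviews identify the relevant platforms.

    import csv
    from io import StringIO

    # Hypothetical enumeration of the 'computer platform' dimension; the actual
    # category numbers will be assigned once the interviews identify the platforms.
    PLATFORM_CODES = {"pdp": 1, "apple": 2, "pc jr": 3, "vic-20": 4, "c-64": 5, "heathkit": 6}

    def encode_platforms(cell):
        """Map a free-text Computer Model cell onto platform category numbers (best effort)."""
        cell = cell.lower()
        return [code for keyword, code in PLATFORM_CODES.items() if keyword in cell]

    # A single spreadsheet row in the layout of Table 2 (interview number 5).
    sample = StringIO(
        "Number,Age,Year,Computer Model,OEM Manual\n"
        '5,9,1979,"Vic-20, C-64, PCs late 80s, Mac 2006",Reference Manual\n')

    for row in csv.DictReader(sample):
        print(row["Number"], encode_platforms(row["Computer Model"]))   # prints: 5 [4, 5]

In practice each automated assignment would be reviewed by hand, since free-text recollections rarely match a keyword list cleanly.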

The next level of encoding is to normalize dimensions and category information from the spreadsheet data using an enumerative coding scheme (see Geisler, chapter 4). It is anticipated that the data will be loaded into MySQL relational database tables designed for this project to allow manipulation by custom programs for analysis and display of results. Appendix E contains the Structured Query Language (SQL) commands used to create the sample database and tables that are shown below. The tables will contain fields for the encoded categories for each dimension (as an integer reference to a secondary lookup table), as well as the raw interview transcription if it is available, or raw researcher notes (as a text field). It is important to retain the raw interview text since the data may be iteratively recoded as more results accrue and emerging patterns reshape the direction of the study. The table structure will also support multiple instances of the same dimension for each interview, reflecting that subjects may recall having used multiple computer platforms over the years in which they learned to program within the target time frame of the late 1970s through mid 1980s. The two sample table definitions shown in Figure 2 and Figure 3 represent the enumeration of various dimensions and their possible categories.
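A minimal sketch of this table structure follows, written in Python with the standard library's sqlite3 module standing in for the planned MySQL database so the example stays self-contained; apart from InterviewNumber and InstanceEnumeration, which appear in the discussion below, the table and field names are hypothetical placeholders for the actual definitions in Appendix E and Figures 2 through 4.

    import sqlite3

    # SQLite stands in for the planned MySQL database so this sketch runs anywhere.
    conn = sqlite3.connect(":memory:")
    cur = conn.cursor()

    # Secondary lookup table enumerating the categories of one dimension
    # (cf. the sample definitions in Figures 2 and 3).
    cur.execute("""CREATE TABLE ComputerPlatform (
                       PlatformEnumeration INTEGER PRIMARY KEY,
                       PlatformName        TEXT NOT NULL)""")

    # One row per dimension instance per interview; raw text is retained for recoding.
    cur.execute("""CREATE TABLE InterviewDimension (
                       InterviewNumber     INTEGER NOT NULL,
                       DimensionName       TEXT    NOT NULL,
                       InstanceEnumeration INTEGER NOT NULL,
                       CategoryEnumeration INTEGER,   -- integer key into the lookup table
                       RawText             TEXT,      -- transcription or researcher notes
                       PRIMARY KEY (InterviewNumber, DimensionName, InstanceEnumeration))""")
    conn.commit()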

These are considered secondary lookup tables so that the integer key can be used to consistently reference each category type for all of the interviews. The encoding task requires identifying the relevant, distinguishable computer platforms and assigning each an integer; these are the categories of the dimension 'computer platform'. A similar process must be done for the OEM texts, magazines, and other materials recounted during the interviews, for the types of programs written, and for the remaining dimensions.

Then each interview can be encoded using these enumeration numbers to facilitate automated analysis, as shown in Figure 4. Notice the use of the InstanceEnumeration field to indicate when there are multiple instances of the same dimension for an interview, reflecting the temporal ordering of, for example, the computer platforms Commodore Vic-20 followed by Commodore 64 for InterviewNumber 5. A utility program is envisioned for entering the data from the interview form directly into the database rather than through an intermediary spreadsheet representation.
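Continuing the sqlite3 sketch above (it reuses the cur cursor defined there), the lines below show how the InstanceEnumeration field preserves the temporal ordering recalled by interview subject 5, who used a VIC-20 before a Commodore 64; the category numbers are again hypothetical.

    # Hypothetical category numbers for the two platforms recalled by interview 5.
    cur.executemany("INSERT INTO ComputerPlatform VALUES (?, ?)",
                    [(4, "Commodore VIC-20"), (5, "Commodore 64")])

    # Two instances of the same dimension for the same interview, ordered in time.
    cur.executemany("INSERT INTO InterviewDimension VALUES (?, ?, ?, ?, ?)",
                    [(5, "ComputerPlatform", 1, 4, "Vic-20"),
                     (5, "ComputerPlatform", 2, 5, "C-64")])

    # Reconstruct the ordered platform history for interview number 5.
    for row in cur.execute("""SELECT d.InstanceEnumeration, p.PlatformName
                              FROM InterviewDimension d
                              JOIN ComputerPlatform p
                                ON p.PlatformEnumeration = d.CategoryEnumeration
                              WHERE d.InterviewNumber = 5
                                AND d.DimensionName = 'ComputerPlatform'
                              ORDER BY d.InstanceEnumeration"""):
        print(row)   # (1, 'Commodore VIC-20') then (2, 'Commodore 64')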

It is understood that the encoding models for many of the dimensions captured by the form, and for those that may be derived from transcriptions of the recorded audio, have yet to be defined. In particular, the categories of the dimension 'programming style' have yet to be expanded beyond Turkle's dichotomy of hard and soft mastery, let alone heuristics developed for determining those styles from raw data. Therefore, one of the first tasks in the project time line presented in Appendix D is to refine the interview protocol to achieve these goals.

Data Analysis

Data analysis will be divided into two parts: analysis of the interview data, and analysis of computer platforms and texts. Interview data will be analyzed using custom software written by the researcher to perform qualitative analyses or to prepare the datasets for analysis by third-party software applications. Nominal data about the use of computer platforms, texts, and programming style will be generated to perform quantitative statistical analysis of correlations between platforms/texts and programming styles, as well as other pairings of variables that emerge as possibly significant. This data will also inform the selection of the specific platforms and texts that will be scrutinized in detail as items of material culture. The analysis of computer platforms and texts represents a study of material culture and is the least developed part of this proposal. The cultural milieu and aspects of production of many early personal computers are recounted in Freiberger and Swaine's Fire in the Valley. It is anticipated that future coursework in 2011 will flesh out the methods for this content analysis. Any custom software created for this project will be licensed under a popular free, open source license such as the GNU General Public License (GPL) version 2 or higher. The analysis methods will include assessment of the reliability and validity of the measurement system, following the recommendations made by Lauer and Asher (see Chapter 7 on Measurement).
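As an illustration of the kind of dataset preparation intended here, the following Python fragment builds a contingency table of platform against programming style from coded pairs; the pairs shown are invented placeholders, not findings, and a real run would draw them from the database sketched in the Data Encoding section.

    from collections import Counter

    # Invented (platform, style) pairs standing in for coded interview results; the
    # style categories themselves remain to be defined beyond hard and soft mastery.
    observations = [("Commodore 64", "soft mastery"), ("Commodore 64", "hard mastery"),
                    ("Apple II", "soft mastery"), ("Apple II", "soft mastery"),
                    ("PDP minicomputer", "hard mastery")]

    counts = Counter(observations)                  # (platform, style) -> frequency
    platforms = sorted({p for p, _ in observations})
    styles = sorted({s for _, s in observations})

    # Print a contingency table that can be exported to third-party statistics software.
    print("platform".ljust(20) + "".join(s.ljust(16) for s in styles))
    for p in platforms:
        print(p.ljust(20) + "".join(str(counts[(p, s)]).ljust(16) for s in styles))

A table in this form can then be tested for association between platform and style (for example, with a chi-square test) once real coded data are available.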

Synthesis

The final stage of the dissertation project will synthesize the findings of the data analysis performed on the encoded interview data. In connecting this work with mainstream research in teaching and learning computer programming, I am hypothesizing that certain texts may impose "more guidance" (in the sense developed by Perkins, Schwartz, and Simmons in the description of their 'metacourse' supplement to formal programming instruction) than other texts, or than their absence, during otherwise unmediated discovery of the computer's programming possibilities, and may likewise supply other features of the metacourse for novice programmers. Texts may also function like the partner in the "pair programming" research discussed by Denner and Werner, without the authoritarian auspices of an adult teacher. Also consider how media-specific features of colorful, spiral-bound printed texts such as The Applesoft Tutorial differentiate their rhetorical and practical functions from other types of texts, such as browser-based resources or help features built into Integrated Development Environments (IDEs). For example, consider the rhetorical impact of its cover (Figure 5), which displays a program that can be typed in and run; this gave the Apple platform an advantage because the book could be taken at face value and put to use immediately. In terms of designing an optimal text today, while it is true that printed and video display media can both show the picture on the cover, there may be advantages to having a physical, printed book from which to learn how to use the computer, versus using the same computer to display the instructional text. At minimum, the study will offer a response to the conundrum of why it seems better to return to earlier states of the art in computer technology in order to learn programming in general and to develop individual style, a style both innate and shaped by the environmental affordances of the technologies in use.

Works Cited

Bogost, Ian. (2010, February 19). Pascal spoken here: Learning about learning programming from the Apple ][. Message posted on http://www.bogost.com/blog/pascal_spoken_here.shtml. Retrieved October 3, 2010.

Brin, David. (2006, September 14). Why Johnny can't code. Message posted on http://www.salon.com/technology/feature/2006/09/14/basic. Retrieved October 3, 2010.

Cummings, Robert E. (2006). Coding with power: Toward a rhetoric of computer coding and composition. Computers and Composition, 23(4), 430-443.

Denner, Jill, & Werner, Linda. (2007). Computer programming in middle school: How pairs respond to challenges. Journal of Educational Computing Research, 37(2), 131-50.

Freiberger, P., & Swaine, M. (2000). Fire in the valley: The making of the personal computer. New York: McGraw-Hill.

Ge, Xun, Thomas, Michael K., & Greene, Barbara A. (2006). Technology-rich ethnography for examining the transition to authentic problem-solving in a high school computer programming class. Journal of Educational Computing Research, 34(4), 319-52.

Geisler, Cheryl. (2004). Analyzing streams of language. New York: Pearson Education, Inc.

Gillespie, C. W. (2004). Seymour Papert's vision for early childhood education? A descriptive study of preschoolers and kindergarteners in discovery-based, Logo-rich classrooms. Early Childhood Research and Practice, 6(1).

Hayles, N. Katherine. (2004). Print is flat, code is deep: The importance of media-specific analysis. Poetics Today, 25(1), 67-90.

Hoar, Nancy. (1987). Conquering the myth: Expository writing and computer programming. College Composition and Communication, 38, 93-95.

Lammers, Susan. (1986). Programmers at work: Interviews with 19 programmers who shaped the computer industry. Redmond, WA: Tempus Books of Microsoft Press.

Lau, Wilfred W. F., & Yuen, Allan H. K. (2009). Exploring the effects of gender and learning styles on computer programming performance: Implications for programming pedagogy. British Journal of Educational Technology, 40(4), 696-712.

Lauer, Janice M., & Asher, J. William. (1988). Composition research: Empirical designs. New York: Oxford University Press.

Lynn, Kathleen-M., Raphael, Chad, & Olefsky, Karin. (2003). Bridging the gender gap in computing: An integrative approach to content design for girls. Journal of Educational Computing Research, 28(2), 143-162.

McAllister, Neil. (2008, October 2). Should computer programming be mandatory for U.S. students? Infoworld.

McKenna, Peter. (2000). Transparent and opaque boxes: Do women and men have different computer programming psychologies and styles? Computers & Education, 35(1), 37-49.

Mayer, Richard E. (1988). Introduction. In Richard E. Mayer (Ed.), Teaching and learning computer programming: Multiple research perspectives. Hillsdale, N.J: L. Erlbaum Associates.

Panell, Chris. (2003). Teaching computer programming as a language. Tech Directions, 62(8), 26-27.

Papert, Seymour. (1980). Mindstorms: Children, computers, and powerful ideas. New York: Basic Books.

Perkins, D. N., Schwartz, Steve, & Simmons, Rebecca. (1988). Instructional strategies for the problems of novice programmers. In Richard E. Mayer (Ed.), Teaching and learning computer programming: Multiple research perspectives. Hillsdale, N.J: L. Erlbaum Associates.

Shields, Mark A. (1995). The legitimation of academic computing in the 1980s. In Mark A. Shields (Ed.), Work and technology in higher education: The social construction of academic computing. Technology and education. Hillsdale, N.J: Lawrence Erlbaum Associates.

-----. (1995). The social construction of academic computing. In Mark A. Shields (Ed.), Work and technology in higher education: The social construction of academic computing. Technology and education. Hillsdale, N.J: Lawrence Erlbaum Associates.

Sutton, Rosemary E. (1991). Equity and computers in the schools: A decade of research. Review of Educational Research, 61(4), 475-503.

Turkle, Sherry. (1984). The second self: Computers and the human spirit. New York: Simon & Schuster.

-----. (1995). Life on the screen: Identity in the age of the Internet. New York: Simon & Schuster.

Turkle, Sherry, & Papert, Seymour. (1990). Epistemological pluralism: Styles and voices within the computer culture. Signs, 16(1), 128-157.

-----. (1991). Epistemological pluralism and the revaluation of the concrete. Journal of Mathematical Behavior, 11(1), 3-33.

Vodounon, Maurice A. (2006, June 22). Exploring the relationship between modularization ability and performance in the C programming language: the case of novice programmers and expert programmers. The Free Library.

Weinstein, Peter. (1999). Computer programming revisited. Technology & Learning, 19(8), 38-42.

Werner, Linda, & Denner, Jill. Pair programming in middle school: What does it look like? Journal of Research on Technology in Education, 42(1), 29-49.
