Students Talk Technology: Tech Tools for College Education

Many studies have revealed that college students today arrive on their campuses with high literacy in the latest technology and mobile devices. It is not uncommon to see students walking around with their Beats earphones, texting while waiting in the hallway, and Snapchatting with their friends in the dining hall. Yet, in the sea of options, which educational technologies are students actually using to help them with their studies? In the video above, I interviewed a trio of undergraduate students at St. Cloud State University (SCSU) to ask about their favorite tech tools and the innovations they hope to see in the future.

All that tech has created something of a dependency, too. The following infographic reviews responses from SCSU students about their technology habits — ultimately showing a trend that points to a technology-reliant generation.

Infographic

Ideas/comments? Add your two cents to the comment section below!

Is There a Robot in This Class? On Writing & AES


This presentation was delivered at the 2014 Conference on College Composition and Communication in Indianapolis, IN, March 24, 2014.

Automated essay scoring (AES) is the use of specialized computer programs to assign grades to essays written in an educational setting. It is a method of educational assessment and an application of natural language processing. Its objective is to classify a large set of textual entities into a small number of discrete categories, corresponding to the possible grades—for example, the numbers 1 to 6. Therefore, it can be considered a problem of statistical classification (Wikipedia, “Automated essay scoring”).
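To make the "statistical classification" framing concrete, here is a toy sketch in Python of what mapping an essay to a discrete grade might look like. The features and weights below are invented purely for illustration — real systems such as PEG train their weights on large sets of human-scored essays rather than hard-coding them.

```python
# Toy illustration of AES as statistical classification: map surface
# features of an essay to a discrete grade from 1 to 6.
# Features and weights are hypothetical, for demonstration only.

def extract_features(essay: str) -> dict:
    words = essay.split()
    # Crude sentence split on terminal punctuation.
    sentences = [s for s in essay.replace("!", ".").replace("?", ".").split(".")
                 if s.strip()]
    return {
        "word_count": len(words),
        "avg_sentence_length": len(words) / max(len(sentences), 1),
        # Share of distinct words, a rough proxy for vocabulary range.
        "vocab_richness": len(set(w.lower() for w in words)) / max(len(words), 1),
    }

def score_essay(essay: str) -> int:
    f = extract_features(essay)
    # Hypothetical linear model; a real AES system would learn these weights
    # from human-graded training essays.
    raw = (0.01 * f["word_count"]
           + 0.1 * f["avg_sentence_length"]
           + 2.0 * f["vocab_richness"])
    # Clamp to the discrete grade scale 1-6.
    return max(1, min(6, round(raw)))
```

The point of the sketch is only that such programs reduce an essay to measurable proxies — which is precisely what the debate below is about: whether those proxies can stand in for depth, beauty, or creativity.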

As we know, the conversation on machine grading isn't new. It began in 1962 with Walter Reitman's thoughts on the possibility of machine grading, offered at a conference on needed research in the teaching of English. Most historical summaries of AES trace the origins of the field to the work of Ellis Batten Page. In 1966, Page argued for the possibility of scoring essays by computer, and in 1968 he published his successful work with a program called Project Essay Grade (PEG).

Multiple-choice exams have, of course, been graded by machine for a long time, but essays are another matter. When it comes to English composition, the question of whether computer programs can reliably assess student work remains sticky. Can a computer program accurately capture the depth, beauty, structure, relevance, creativity, etc., of an essay?

Douglas Hesse writes that when it comes to the resulting problems of cost and frustration associated with traditional writing assessment, he worries more about how computer grading will negatively impact “the transferability of skills and about the cost to how students perceive writing (and the resultant cost to human culture).”

Amid the pressure many composition teachers and their institutions face regarding the possible adoption of computer-grading programs for assessing student writing, many teachers have strong opinions — and most oppose the use of grading machines in their classrooms. At this pressing time, I think it is important for teachers, as well as graduate students, to understand the impact on pedagogy, process, and product that grading machines may have if they replace human readers. Hence, by providing an overview of the appeals of computer grading programs and the ongoing debates around the adoption of automated essay scoring software for assessing writing, this presentation aims to illustrate how mechanized evaluation of writing may mechanize pedagogy as well, along with the process of creating a machine-directed written product. I will also offer suggestions for writing instructors on handling the adoption of essay-grading programs in their classrooms.

Read the full conference paper here.

View the conference presentation here.

Ph.D. Finder for Rhetoric and Composition Programs


Earning a higher educational degree is a notable pursuit. This pursuit usually starts when one has developed an interest in a certain area of study and begins looking for universities and programs that meet his/her academic needs. Yet, as a graduating master's student and Ph.D. applicant, I cannot stress enough how frustrating the process of school-finding can be. My first instinct was to speak with my advisors about possible programs to look into, and then go on Google to find out more about these schools — their department culture, Ph.D. dissertations written by students, faculty members' research areas, criteria for admittance, etc. Frustratingly, luck felt like a huge, integral part of the search process.

To help Ph.D. applicants narrow their searches, Rhetoric Review conducted its fourth Survey of Doctoral Programs in Rhetoric and Composition in 2007 and published a print article in RR 27.4 (2008). Later, a Wikia site was set up to allow program directors to update their department or program information. However, if you take a look at both the original site and the wiki, you may find it challenging to navigate either one for a comprehensive outlook across the programs reviewed.

This is why we designed and launched Ph.D. Finder – an interactive app that allows users to find Ph.D. programs in Rhetoric & Composition that most closely match their admittance aptitude and research interests.


Our algorithms are designed to rank programs based on the user's selection of research interests and self-ordered strengths in the criteria commonly used by Ph.D. admission committees for candidate evaluation. We aggregated data from the Rhetoric Review Survey and picked out programs that currently offer a Ph.D. degree in Rhetoric and Composition. Users see real-time adjustments to the program rankings as they select their area(s) of research and order their admittance aptitude.
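For the curious, here is a minimal Python sketch of how such a ranking could work. The program data, criteria names, and weighting scheme below are hypothetical stand-ins for illustration — the actual Ph.D. Finder algorithm and its data are not reproduced here.

```python
# Hypothetical sketch of interest-plus-criteria ranking: score each
# program by overlap with the user's research interests, weighted by the
# user's self-ordered admission criteria. All data here is invented.

def rank_programs(programs, interests, ordered_criteria):
    # Criteria the user ranks higher (earlier in the list) get larger weights.
    n = len(ordered_criteria)
    weights = {c: n - i for i, c in enumerate(ordered_criteria)}

    def score(program):
        # How many of the user's research interests the program covers.
        interest_match = len(set(program["interests"]) & set(interests))
        # Program's strength in each criterion, scaled by the user's ordering.
        criteria_score = sum(weights.get(c, 0) * strength
                             for c, strength in program["criteria"].items())
        return interest_match * 10 + criteria_score

    return sorted(programs, key=score, reverse=True)

# Invented example data for demonstration:
programs = [
    {"name": "University A",
     "interests": ["digital rhetoric", "pedagogy"],
     "criteria": {"GPA": 3, "writing sample": 5}},
    {"name": "University B",
     "interests": ["technical communication"],
     "criteria": {"GPA": 5, "writing sample": 2}},
]
ranking = rank_programs(programs,
                        interests=["digital rhetoric"],
                        ordered_criteria=["writing sample", "GPA"])
```

Re-running `rank_programs` whenever the user changes a selection is what makes the real-time re-ranking in the app possible.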


As this app is still in its infancy, we realize there is room for improvement, and we look forward to adding more programs to our existing database. Since this app is a startup-like project created by two college students, we are, nonetheless, looking for funding to help us expand this endeavor. Please contact us if you are interested in a partnership or in backing us.

In the meantime, check out the app and let us know what you think!

Ph.D. Finder was created by Ivan Okhin and Jason Tham, both wannabe social entrepreneurs and technology enthusiasts based in St. Cloud, Minnesota.