Thursday, December 14, 2006
What I found particularly interesting, however, was that several students discussed how distracting online research was for them. When I pressed them on what they meant, they noted that while online primary sources were great, while researching online they constantly found themselves drawn into other online activities (IM, Facebook, entertainment sites, etc.). When I suggested they turn IM off while doing research, they indicated they knew that they should, but that they never did. [A couple of students pointed out that even if IM was off, online research required being online, and therefore the temptation to click elsewhere was always there.]
What occurred to me as I listened was that there was a gap between the students' understanding of the Internet and my own. [This post title suggests one explanation for that gap, though I acknowledge the small sample size and my own particular biased perspective.] I see the Internet as a collection of various repositories of information (albeit linked together in what is hardly a seamless way); I am constantly conscious of their origins (a fact especially true when I discuss online collections of primary sources and historical information). For my students, the Internet is a more-or-less seamless information source, one whose particular parts they have trouble tuning out. In fact, one student said that she much prefers offline research (in books, of all things) because of the lack of distractions.
If that's true (and I acknowledge that I may be overstating the dichotomy), then what are the potential effects on our teaching? I now believe that it is even more important than I had realized to get students to identify the sources of particular information in order to better ascertain the bias and veracity of online materials. Overcoming the seamless (sourceless?) nature of the web may be a bigger problem than I thought. Or should those of us in a slightly older NetGeneration just accept the view that the Internet is increasingly seamless and come to terms with a new, more-or-less unified way of presenting information?
Monday, November 06, 2006
First, we need to begin with basic informational and technological competencies and we need to start by defining in a broad sense what we mean by “competencies,” a word that has been gaining a great deal of traction lately without a great deal of explanation. I think (though I'm hardly the first to articulate this) that digital competencies are made up of both technical skills (ranging from changing margins in Word and more advanced MS Office functions to a familiarity with online tools including email, search engines, IM, blogs, wikis, and so on) and digital fluencies (requiring a higher-order deployment of those skills in producing and consuming information in an adaptable, creative, responsible way). [It's also important to note that this definition of "competencies" goes beyond the notion of bare adequacy.]
If we accept this distinction between skills and fluencies, I would argue the emphasis of our digital proficiency goals should be aimed at achieving fluency, not skill mastery. Can we not expect students to come to college with a basic familiarity with the skills of digital life? Can't basic skills (word processing, email, spreadsheets) be expected? But Jeff, aren't there plenty of entering college students who are familiar with aspects of the digital world, but don't know how to do all of these things? Why, yes, I'm glad you asked. Certainly a focus on fluency over skills would require institutional support in the form of a Technology Center, online guides, brief workshops (no more than an hour or two), and perhaps student tech tutors, so that students lacking certain skills could find the answers to questions about changing margins or PowerPoint presentations or whatever they needed for a particular class.
There is, I think, an important parallel here with the way we address writing in college. A small percentage of our students have poor grammar skills. [Most are quite good at grammar.] There are resources on campus to deal with those issues, but we don't send those students to a separate class on grammar (nor do we advocate that all students take a test on grammar). The school does have a writing-intensive requirement, however, which requires that students demonstrate a number of their writing skills, which of necessity requires familiarity and facility with grammar.
So, can’t we tell new students, “these are the basic computing skills every incoming student should have,” and then offer them resources to address the gaps they have? The vast majority of our students do have a broad familiarity with basic computing skills. [We might even offer an (optional) placement assessment, as we do with foreign languages, that would allow them to measure their technical computing skills.]
We can then focus the technology proficiency requirement on fluencies, on an adaptable ability to think, create, and operate within the digital world. And we need to make these fluencies the centerpiece of the requirement (not digital skills or even the classes/fulfillment requirements).
The advantages of focusing on digital fluencies over digital skills are numerous:
- Testing students on basic skill sets makes most of them feel like they are wasting their time on things they already know, or on things they think they'll never use.
- Focusing on digital fluencies allows us to expect more technological sophistication from students. [This is as opposed to a kind of low-level investment in a skill-test system that encourages only completion (and that only barely) and not engagement, creativity, or adaptability.]
- Researching, finding and evaluating primary and secondary sources
- Presenting one’s ideas in a variety of formats (written, oral, formal/informal--online)
The next question to ask is how these digital fluencies will be delivered. I think the material should be incorporated into one class or a set of classes, not a separate course on "technology." Integrating digital fluencies into classes (general education and departmental requirements) has a couple of key results. First, it signals the importance the institution and faculty have invested in those fluencies. Second, it provides students with content-linked opportunities to demonstrate their ability to maneuver and participate in the digital world (locally and globally).
The most recent discussions of our technology proficiency have revolved around a two-tier system, with a foundational level (a course or courses) intended to address those fundamental competencies incoming students need for their college experience in general, and a discipline-specific requirement. The latter would allow departments to integrate those discipline-specific digital skills and fluencies into their curriculum and support plans. [So a psychology department might incorporate SPSS into its methods classes, while Math could include work with Mathematica or Dynamic Solver.]
In determining the success of the digital fluency approach (and more generally of the technology proficiency program), we need to make sure that the assessments are not multiple-choice, specific skills tests, but rather allow students to demonstrate competencies within a framework of actual activity and usage.
What do you think? Is this an approach that makes sense? Is it an approach that can garner support among faculty, students, and administrators?
I don't disagree with either of the comments raised by Steve and Jerry. Various parts of what I described as e-portfolios could be started without a full-blown university-wide e-portfolio system. [And some of my colleagues at CGPS have already begun to do so.] All that is good. Students could demonstrate competencies in technological proficiency and/or digital literacy (they're different things, and a subject for a future post), they could maintain informal online archives of their written work using blogs or wikis or some other medium, and they could even reflect on that work.
But would students do that on their own? Probably not. Will they do so when it's assigned? Likely, and they might even get something out of it. But without other professors doing the same thing they're not likely to connect it to a larger educational experience or broader world.
I guess the real appeal to me of the e-portfolio (beyond the practical function as an accessible place to collect work) is on the grand scale. One place to assemble the work of a college career, one place to reflect on four years' worth of research, writing, even presentations (digitally recorded), one place to make connections between courses and concepts, between science and literature, between language and society. Steve's right in his comment that this reflection could be going on all the time. Heck, it should be going on all the time. But what appeals to me (and what I see as its biggest problem) is the notion of some kind of complete integration of the e-portfolios, a notion that would require grass-roots and top-down support from administration, faculty, and students. Since I have trouble envisioning that broad institutional buy-in, I'm having trouble buying into doing this piecemeal.
I suppose my pragmatism is blocking my vision in this case.
Maybe this is the kind of thing that might best be tried out at the departmental level. [If any of my departmental colleagues are reading this, rest easy. This is a thought piece, not next meeting's new business agenda item.] A department could decide that it wanted its majors to collect their writings, speeches, and everything else related to the major in one place; that it wanted its majors to be consciously reflective about their courses and the material/concepts/skills learned in them; and that it wanted them to explore the value of that content and those competencies for their own goals in and after college.
A department is larger than a single professor, and a departmental effort would therefore reflect a larger commitment to the concept on the part of a group of faculty within a discipline on campus. On the other hand, the issues of scale and practicality I raised in my earlier post would be less problematic with 5-15 professors and 50-250 students than they would be with an entire campus. [Plus, getting buy-in from a single department is more feasible than convincing an entire campus.]
Wednesday, October 18, 2006
I recently took part in a technology roundtable discussion with members of my institution's faculty and staff (and a couple of students) about the future of our technology proficiency requirement for our undergraduate students. A couple of colleagues from our graduate campus noted that they were beginning to use electronic portfolios to assess computing proficiency, as well as to present pedagogical skills. I commented that I thought most undergraduate students already had a default portfolio of projects (papers, PowerPoint presentations, spreadsheets, and other electronic materials), but it was sitting on their computers, unorganized and untouched since the assignments for which they were created. Certainly those materials might be collectively used to demonstrate the array of technological proficiencies that our students learn.
Still, that use of electronic portfolios seems rather limited. What is the point of electronic portfolios, why would we want to use them, and why has there been such resistance to the idea of them on many campuses? I think the answers to these questions are connected.
Reasons for an electronic portfolio (in no particular order):
- Demonstrate competence in some skills
  - This gets at what my colleagues from CGPS and I were discussing before.
  - Many teaching programs require paper or e-versions of these now.
- Gather and reflect on one's work
  - This might be one way of encouraging a kind of self-reflective (or self-repairing) learning. "Here are all of the papers you've written for your classes in the history department. Reflect on what worked, what didn't, and what you learned about yourself, your researching, and your writing."
  - In the ideal form, such e-portfolios might even come to serve as a central theme of the liberal arts experience, a kind of connective tissue between individual classes and between courses and the larger collegiate experience. "Why am I taking all these classes? What's the point of this array of courses, within my major and outside it, and how do they relate to each other?"
- Serve as long-term online storage for student work (and faculty comments?) that could be used by both the student (for reference, as part of a job or graduate school application) and the school (evidence of student learning, data source for assessing outcomes, examples of projects for future students)
  - Due to privacy concerns, the latter uses would depend on selective approval by students.
Then there are the practical issues about online portfolios. What responsibility does the school have for keeping these portfolios? How long is long-term? Ten years? Twenty? Permanently? Sure, hard disk space is cheap, but servers and personnel to maintain them aren't. Given the numerous problems with privacy and data thefts lately, how much responsibility would schools have in safeguarding access to this material?
None of this is to reject the idea of e-portfolios--I'm especially attracted to the notion of reflective/self-correcting consumers of information and I think they can serve practical goals in demonstrating competencies--only to note that if a school is to take on such a project it would need an extremely clear set of goals (and a long-term plan) to deal with the practical issues.
Wednesday, October 11, 2006
We discussed writing and speaking skills, knowledge acquisition and critical thinking, familiarity with a diverse set of methodologies, times, and places, and a perspective on the place of the self in the larger society. Although we have not always expressed them in quite these ways, these are fairly common sentiments in history departments. What was unusual was the addition of a section on what we're calling "digital literacy." Here's what we came up with:
- As the amount of information available online increases at near-exponential rates, the need for students’ digital literacy grows as well.
- The ability to find reliable, scholarly information on a topic
  - Within gated, subscription databases and in the larger, disorganized online world
  - Finding and searching the collections of online archives, museums, and institutions of higher education
- The ability to assess and evaluate the reliability of online sources
  - This is a new facet of the approach historians and history students have long employed: judicious skepticism.
- The ability to produce creative, yet scholarly, materials for the digital world
  - These require the same level of rigor applied to traditional papers and presentations.
We decided the following was what we wanted for our students:
Students who become fluent in all these areas will be adaptable, reflective consumers and producers of information, critical thinkers able to take on any number of occupations, aware of the diversity of thought and opinion in the study of the past, and ready to move forward into the larger world as responsible, productive citizens of local and global communities.
None of this is finished yet (and we are still in the midst of curricular discussions), but I can't help but be excited about the direction the department and the institution are taking. We are looking to the future in useful ways, for us as teachers/mentors/learners and for our students as learners/mentees/teachers.
Tuesday, October 10, 2006
This repetitious request by children for the same song over and over is the bane of many a parent's existence (and perhaps the jackhammer between sanity and insanity). That is, the repetition of children's songs is not a new experience.
However, it's begun a thought that's been rattling in my brain (along with the chorus to Puff) about the delivery of media today and its effect on society, children in particular.
My daughter likewise always has children's shows to watch, because my wife and I have TiVo'ed a number of them and can play them on demand (well, not literally on demand, since we try to limit her TV-watching time, but you get my point).
Sure, other parents have played Puff on CD or cassette tape (or probably 8-track), and they've got VHS or DVD versions of their kids' favorite shows and movies, but it seems to me there is a fundamental psychological difference for a child between getting a tape or DVD out to put in and play, and just using a couple of button presses on a screen or click-wheel to start exactly what she wants. Will this raise her expectations? Will she demand that information and entertainment just appear with a few clicks? Yes, of course she will.
And so will our students.
Yet as my institution grapples with the implications of a new president who has asked the faculty to be forward-thinking, creative, and innovative, I wonder if this same sentiment is an appropriate description of our curricular and institutional choices (and I think some of my colleagues fear it's our future).
Now, to be honest, among faculty, quitting is rarely an option, at least in the form of leaving one's job, since the market is so tight right now, especially if one has roots of family and friends in a particular geographic region. However, there's quitting and then there's quitting.
Complaining is always an option for faculty. [Some might say it's an area that we've claimed and reclaimed over and over.] Still, despite succumbing to this myself at times, it's hard for me to see this as the only option. [Does this have to be a single choice? Can't I do both? Well, why not!? -- See how easy it is for complaining to start? And how quickly it turns into whining, which is even worse.]
So that leaves us with innovate. To me this is exciting, exhausting, invigorating, and downright scary. What we do with this is left to us (though likely with significant leadership from our administration). We have to live with it. But we also have to live with ourselves, our students and our institution if we don't change things now, if we don't adapt, if we don't look towards the future of higher education and learning in general....
Sunday, September 03, 2006
I think part of what faculty members don't say when they flinch back from the idea of posting their lectures (as notes, full text, or podcasts) is that they are afraid that their lectures will be revealed to be less polished, less original, less important than their published scholarship.
However, we also need to be careful about dismissing the concerns about people not coming to class (and certainly there are ways to address that). But I've had a number of conversations with students who have told me (confidentially...) that if there were podcasts or lecture notes, they would come to class much less, or not at all, even if they took the hit for class participation. [I might add that some institutions of higher ed discourage or even prohibit professors from requiring attendance.]
I would also add that if one runs a truly interactive lecture, then students are missing a great deal by not being there to participate in that process. Like it or not, posting materials suggests that active learning is less important than passive learning (even if the hope is that posting such materials would result in a more active engagement with the material).
I understand the appeal that broadening the podcasting or vidcasting of lecture series to all class lectures might have. But remember, most of those lectures in something like Great Lives required weeks of preparation by those scholars for one 75-minute presentation. The harsh reality is that most class lectures are composed in an hour or two and don't come close to the quality in content, presentation and delivery of a formal lecture.
None of this is to reject the premise of podcasting lectures, but merely to explain that such a process should be undertaken carefully, especially for junior faculty members who have their professional reputations (and future employment) to consider.
Wednesday, August 30, 2006
Will at weblogg-ed.com has an amazing post ("Teachers as Learners") to which I'm still trying to figure out an appropriate response that goes beyond, "Yes, that's it. That's what I've been trying to say...." Here's the most relevant paragraph.
In a world where knowledge is scarce (and I know I’m using that phrase an awful lot these days), I can see why we needed teachers to be, well, teachers. But here’s what I’m wondering: in a world where knowledge is abundant, is that still the case? In a world where, if we have access, we can find what we need to know, doesn’t a teacher’s role fundamentally change? Isn’t it more important that the adults we put into the rooms with our kids be learners first? Real, continual learners? Real models for the practice of learning? People who make learning transparent and really become a part of the community?

This notion of teachers constantly learning, of modeling for students the process of learning, is incredibly appealing to me. It's part of why I've created 11 different courses in 5 years, and why I have several more waiting to pop out when I have time and opportunity to do them. Not because I want to impart wisdom to my students, but because I want to share with them the excitement I feel when creating a new, vibrant course, when learning more about a topic and figuring out how to share that process with them. That excitement for learning is why I have several scholarly projects on the backburner, waiting for my last project to finish. It's why I find so many of the posts of my blogger colleagues Gardner and Steve so provocative and evocative of all I wish to do.
Okay, so I have plenty of passion, but to what end? Look for future posts for more thoughts on this....
Tuesday, May 16, 2006
Rachel Smith of the NMC is talking about gaming and education.
People actually research MMOs and their impact. [Seems useful for modern anthropologists or sociologists.] See The Daedalus Project
MSU program does research on games and gender -- aliengames.org
Research, books and lectures about games. -- gametheory.net
Wednesday, March 01, 2006
This post was originally written in February, posted, and then I removed it, out of concerns about 1) how it would be perceived and 2) how it had been written in a moment (moments?) of frustration. I have been persuaded to repost it now, though my concerns remain.
Having done student web site projects (on the history and impact of a piece of American technology) three times now over the last four years, I suppose I should have realized that the technological side of the approach (using Netscape Composer and a limited amount of hard coding to build research-based web projects) was getting long in the tooth. The message I'm getting is that HTML coding is no longer a relevant/marketable skill for our students.
I'm not planning on teaching the class in which I've done this assignment again until Spring of 2007--which gives Jerry Slezak, my building's ITS, and me more time to figure this out, but also more time for things to change--yet I still have a number of concerns about completely ditching the old system that may add something to this conversation on Patrick Gosetti-Murrayjohn's blog (or it may just reveal my own biases).
1) Having to completely ditch something (assignments, guidelines, rubrics) Jerry and I spent an immense amount of time creating does not make me eager to adopt something else brand new (especially from a new company that may not be around the next time I teach the class). The reluctance some faculty have for embracing new technology may partly come from a sense that that technology is constantly something new (and therefore a sense that an instructor will be at square one every time they teach a new class).
[I am aware that I wouldn't be literally at square one, but it feels like it sometimes. That feeling can be paralyzing (or at least discouraging of new attempts).]
2) I'm not convinced a wiki or a content management system (especially not knowing what that CMS would look like) will meet all of my expectations for what I had hoped the web projects were doing before.
Just so we're clear on my (perhaps unreasonable) goals, they were (and are) based in the idea that I want to provide students with a chance:
--To learn a new way to present ideas, while adhering to the idea that all serious historical scholarship must be thoroughly cited with footnotes (or endnotes) and a bibliography.
--To learn the components of reliable online sources
--To produce original works of scholarly research, available online, that would be intellectually accessible and interesting to other students and web surfers. Ideally these projects would improve—one web site at a time—the quality of information available on the web. I would emphasize that this focus on a scholarly approach is essential to my understanding of the value of this assignment. (And I would add that I think that wikis/blogs are seen by many members of my discipline as unscholarly, at least for now.)
--To create a process imitable by other disciplines
--To provide history majors with a new skill set that they were not getting within our current curriculum. Ideally this skill set would make them more marketable when looking for jobs or applying to graduate schools. [Although HTML or Netscape Composer skills are less relevant than they were three years ago, certainly I think the ability to think about and present information in the digital realm remains important.]
--To think about what it would mean to write and create for a larger audience than just their instructor.
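To make the first of these goals concrete: endnote-style citation of the kind historians expect can be carried over into hand-coded web pages using nothing more than HTML anchor links. What follows is only an illustrative sketch under my own assumptions--the `id` values, the sample sentence, and the citation are invented placeholders, not markup or sources from the actual class projects:

```html
<!-- In-text citation: a superscript number linking down to the note -->
<p>The telegraph transformed American business communication.<sup><a
  id="ref1" href="#note1">1</a></sup></p>

<!-- Endnotes section: each note carries a link back up to its citation -->
<h2>Notes</h2>
<ol>
  <li id="note1">Author Name, <em>Placeholder Title</em> (City: Publisher,
    Year), 42. <a href="#ref1">Return to text.</a></li>
</ol>
```

The same `id`/`href="#..."` pattern extends naturally to a bibliography page, which is part of why a small amount of hand coding seemed worth teaching alongside a WYSIWYG editor like Netscape Composer.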
Now, obviously, some of these goals could be met through a wiki or a blog or a content management system. But, could I count on that medium to be (relatively) stable?
Maybe I'm looking at this the wrong way and I should just be focusing on how to get students to think in new ways. Maybe the problem is with my desire to not have to completely rethink my approach every time I teach a class....