We met in Centra on Wednesday night and it was SO much better than the past Centra classes I have facilitated. The video switching feature is awesome, and everything just seems to be much smoother now than it was. Having several guest presenters helped break up the night, and the number of technical difficulties was lower than in past classes. Overall it was really good - though I am always exhausted and my head hurts after Centra classes.
The topics we covered could easily have filled 2-3 class sessions. I wish we could have watched Singapore Dreaming and compared it with one of Woo's academic articles based on the same data. I wish we'd had time to really debate arts-based research. I wish we could have all practiced making word clouds or worked together in ManyEyes.
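For anyone curious what the word-cloud exercise actually computes under the hood, here's a minimal sketch: tools like TagCrowd essentially count content-word frequencies and size each word by its count. (The stopword list and sample text below are my own illustration, not anything from a particular tool.)

```python
from collections import Counter
import re

def word_frequencies(text, stopwords=frozenset({"the", "a", "and", "of", "to", "in", "is", "it"})):
    """Count content words - the raw frequencies a word cloud visualizes."""
    words = re.findall(r"[a-z']+", text.lower())
    return Counter(w for w in words if w not in stopwords)

freqs = word_frequencies("Coding the data, coding and memoing, then coding again.")
# the most frequent words get the largest type in the cloud
print(freqs.most_common(3))
```

Running it on a real interview transcript instead of a toy sentence would give students a feel for why stopword choices matter so much to what the cloud ends up emphasizing.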
I'd like to add a reading/discussion around hard and soft process technologies, too.
Here's an article I ran across that I'd like to use next time.
This page on the different ways to use TagCrowd is useful too.
I would also use some of these special issue articles on performative social science.
I'm wondering whether I can teach this class again in spring 2012 or not... would I have enough students? Would it be good to test out some of the book chapters on the class? Hmm.
Friday, August 5, 2011
Wednesday, August 3, 2011
day 7
We spent the 7th class period on Transana, and I was pleasantly surprised at how open students were to it. I am least comfortable with Transana of all the programs, and I find following the tutorial to be painful, but I think it went okay with just a few technical glitches. This time around I feel like Transana makes more sense to me, especially since I have started using it for two projects recently.
Tonight we meet in Centra as an example of a collaborative tool. We are going to try to cover way too many tools, but we had to combine two classes into one. I would like to next teach this course over 15 weeks. I would add a night for talking about historical discourses around technology and qualitative research, a night about ethical concerns/trustworthiness issues, and separate out the collaborative tools from the representation of the findings.
I'm not so happy with some of the readings on these topics, but I have found a few more since this syllabus "went to press" that I will probably substitute next time.
The last time I did a virtual classroom was in the spring using Elluminate, so I'm having to relearn the Centra functions. I'm having several students help out with parts of the presenting, so that should make it a little less boring for everyone to sit through 4.5 hours of Centra.
Monday will be the last day - students will engage in focus group-type discussions of lessons learned about the tools. I also clarified the methods section assignment and hope that no one has a breakdown over that assignment. With new courses it's hard to have all the pieces in place before the class starts, so I can imagine it may be causing some stress.
Thursday, July 28, 2011
day 6
Last night we had a QDA Miner demonstration from OIT, a discussion of the readings related to CAQDAS and an introduction to Transana by yours truly.
A few insights during the discussion that I want to remember - the "textual laboratory" idea by Konopasek and the idea that CAQDAS can make qualitative research more "scientific" by grounding it in empirical, objectivist notions. While this is great for transparency and trustworthiness concerns, it also borders on making claims that the "truth" is "there in the data" if we just look closely enough and articulate it well enough. We have to keep remembering that WE are creating the truth through our interpretations; it's not there to be "discovered". (But the software will make it easier to back up those truth claims, I think.) Forgetting that this is all interpretation, though, is what, I think, causes some Denzinesque researchers to eschew the use of technology - it smacks of an objectivist worldview. How to keep the interpretive process at the forefront of analysis even while using technology?
Tuesday, July 26, 2011
day 5 reflections
We spent nearly the whole class on Atlas.ti last night, which we really needed to do (after orienting everyone to Centra and pointing them to OIT for troubleshooting various issues and revising the schedule for the rest of the class.)
We've pushed the Centra session back a day and we'll cover collaborative/project management tools and data representation tools the same day. I finally looked at the rest of the questionnaires and pretty much everyone already knows Skype, dropbox, google docs and Prezi. So that should save us some time next week.
Throughout last night I was able to make some improvements to the outline I have for Atlas.ti. There's no easy way to keep eleven people together, though, no matter what you try. I do not miss my days of doing technology training. Everyone's laptops are different and present different glitches and challenges. We eventually got everyone up to speed, though, enough that I think everyone got to do some coding and memoing and playing with the tools. It was a LONG night, though.
Tomorrow night Mike from OIT will do a two hour QDA Miner workshop, we'll have a discussion of this week's readings (all of which I really like) and I will do a short overview of Transana if we have time. I will be working on Monday's Transana workshop during the car ride to and from Indiana this weekend; hopefully David won't mind driving.
Monday, July 25, 2011
reflections, day 4
Oops, forgot to blog on Thursday or Friday last week since I was working against the clock to finish reading a dissertation for a defense this week and finish 2 AERA proposals.
Class seemed to go much more smoothly on Wednesday. We started with Ricardo's webinar on ATLAS.ti which was great, though he was willing to talk for more than an hour, so next time I need to keep that in mind. He also offers a discount on the cost of the software, something else to keep in mind. I encouraged students to consider bringing him to campus if several departments could chip in on the cost.
We then talked about DiscoverText ideas for studies and preparing for the workshop on ATLAS.ti this week.
The Inqscribe demo went pretty well. I had students each transcribe the first few turns and email them to me, so that I could pull them up on the screen and show the differences in how each of us "hears" and transcribes the text. I emphasized the strength of synchronizing the text with the audio and the importance of re-listening to the data in addition to re-reading the transcript. We then looked at a transcript that was transcribed using Jeffersonian notation and discussed the purpose of that annotation system.
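The synchronization point is worth making concrete: a timecoded transcript keeps every turn linked back to its moment in the audio, which is what makes re-listening in context so easy. Here's a rough sketch of parsing such a transcript; the bracketed-timecode format and the sample turns are my own assumptions for illustration, not an exact InqScribe export.

```python
import re

# Hypothetical transcript snippet in a bracketed hh:mm:ss.cc timecode style
transcript = """\
[00:00:02.00] A: So tell me about your first week.
[00:00:05.40] B: Well, it was... hectic, honestly.
[00:00:09.15] A: Hectic how?
"""

def parse_turns(text):
    """Split a timecoded transcript into (seconds, speaker, utterance) turns."""
    pattern = re.compile(r"\[(\d+):(\d+):(\d+)\.(\d+)\]\s*(\w+):\s*(.*)")
    turns = []
    for line in text.splitlines():
        m = pattern.match(line)
        if m:
            h, mnt, s, frac = (int(g) for g in m.groups()[:4])
            # assumes two-digit fractional part = hundredths of a second
            seconds = h * 3600 + mnt * 60 + s + frac / 100
            turns.append((seconds, m.group(5), m.group(6)))
    return turns

turns = parse_turns(transcript)
# each parsed turn keeps its offset into the recording, so a click can
# jump the audio player straight to that utterance
```

That per-turn offset is exactly the hook that lets transcription software replay the audio for the line you're editing.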
The follow-up discussion was a good one, with lots of thoughts and talk around all the decisions that need to be made about transcribing. We also talked about the tradeoffs between face-to-face interviewing and IM interviewing, and what is gained and lost in experiencing the actual conversational event, listening to the recording of the event, and reading a transcription.
We had a bit of time at the end of class to go back and talk about the Internet as data. I showed what is lost when an online forum is downloaded and input into ATLAS.ti (all graphics are lost), and we talked about a couple of ethical dilemmas I am facing in my own research right now in terms of publicly available data that the IRB considers available for research but that the authors of that content may not. Some students had really strong opinions about this which were discussed even further in the blog entries after class. I think the realities of how the Internet is changing research practice definitely hit home.
I was also glad of the chance to talk a bit more about how face-to-face interaction is still seen as the "gold standard", not only for research data but for other things like teaching and learning. People are still skeptical of distance education because it isn't face to face, but my experience has been that I've had DE classes that are much better than f2f classes. The issue isn't merely the medium - online or f2f - it's much more complex than that. I want to encourage people to think in terms of affordances and constraints, not all-or-nothing/right-or-wrong ways of doing business - be it the business of research or the business of teaching.
Overall it felt like a really good class session. Here's hoping for the same tonight!
Tuesday, July 19, 2011
day 3
Pacing, scope and sequencing are critical parts of instructional design and teaching. I felt a bit off on all three last night - tried to cover too much, too quickly, and not necessarily in the right order! Thanks, everyone, for hanging in there and being patient.
Last night was the only night for data collection, though we could easily spend 2 or 3 full days on the topic. I emphasized that not only do we have new Internet tools for traditional methods of data collection (e.g. online interviewing), but we have the Internet itself as a site for research. I had wanted to cover 1) Mendeley Q&A; 2) discussion of the readings around tools; 3) demonstration of DiscoverText as a transition to the idea that the Internet itself is presenting a new context for data collection; 4) discussion of the readings around online community & the ethical dilemmas presented by the Internet as a site for data collection. Well, we got through three of these anyway.
Mary Alice did an awesome job sharing some of her Mendeley tips and answering some recurring questions. The .pdf annotation feature isn't working consistently and is causing concern. We talked about how the tradeoff with free, open source, new software tools is that they are free, new and open source. That is, proprietary software tends to work better because you are paying for it - lots of work has gone into developing it to the point where the glitches are worked out. Working with glitchy software can be really frustrating for researchers, especially novice ones. Heck, I wouldn't even use this stuff before tenure - the learning curve wasn't worth it. Others who have more skill and patience and inherent enthusiasm for the tools will adopt it earlier. I loved Mary Alice's idea for setting up the watch folder in the drop box, and I hadn't known about the ability to export the .pdfs with the annotations or the importance of the DOIs in the new APA manual.
After that we spent a bit of time on technology jargon, and I demonstrated gmail chat (including the new audio/video feature) and Skype with Ginny's help. None of this seemed to be terribly new to most folks, but I knew from the questionnaires I sent out that a few people had never used it before. I need to look into how to record Skype conversations, since that will be key to using the tool for data collection or collaboration.
The first round of discussion could have gone on much longer and I did feel bad cutting it off, though I think the whole group debrief did cover the essential concepts - especially that we can't avoid looking at CMC as part of our data collection when seeking to understand human interaction. I would really like to make this point in our book, too. Twitter and virtual worlds were raised as sites for data collection, too. The discussion of trade-offs around choosing tools for data collection is a critical one, and we framed it as how to make a case that you are choosing the method that is most appropriate for your study (rather than a convenience argument which is unlikely to fly with an audience.)
And then, we moved to DiscoverText. Teaching the tools themselves has been the most stressful part of this class, in part because I am learning them myself, in part because no good technology training has one instructor for 14 students. I do wish I had the time to put together extensive training materials - but they would soon be out of date and it takes days to put together good instructional materials for one tool, let alone the number I'm trying to introduce in this class. At minimum I should have an assistant who can go around and troubleshoot while I demonstrate. I do need to keep encouraging the students to help each other out and take the time to help each other troubleshoot, which they have been great about doing. (Thanks, everyone!)
I was completely out of steam by 9 pm and was mentally adjusting the schedule while closing the class. We didn't have a very energetic closing discussion or debrief of the tool, so I'll have to revisit its possible uses on Wednesday. I heard a few people sharing ideas about what they could use it for, so that's cool.
I wish I could finish out the second discussion I had planned for today, but there won't be time, so I'll just have to let it go and let us move on... not easy for me!
Thursday, July 14, 2011
day 2 reflections
I was absolutely exhausted when I got home last night, and then I had insomnia, waking up at 3 and never really falling back to sleep. Getting through tonight's class is going to be painful. Summer school (13 hour days x 4 days in a row) is not healthy for me, but the plus side is that I am enjoying the classes - great students and for some reason I seem to be more relaxed than I usually am, especially doing two new preps at the same time. So, that's my first reflection - please, God, don't let me ever teach summer school again.
I once again found myself going off topic on a rant about graduate school culture - I really need to stop doing that. I think I did it Monday and Wednesday. Sigh. What sparked it last night was knowing that none of us teach or provide practice in writing a literature review, yet it's completely foundational to everything that we do. The assumptions that we make about students (and that students likely make about us) are mind-boggling, and yet there is no real space for open conversations to take place to demystify each for the other. Not sure that rants in class are the best venue.
Anyhoo, tonight was the first "tools" night, and I was exhausted before even arriving to class trying to get up to speed on Mendeley and Diigo. It was cool to see Evernote, learn more about how Google scholar connects with UT libraries, and start our Google "help page" doc. I am learning so much from the students, as I knew I would.
The "brainstorm scholarly writings and rank them by timeliness and quality" activity took much longer than anticipated - I think when I've done this before I have given them the list instead of having them brainstorm it. I forgot a key aspect of the debrief, which was to identify which ones count come tenure time. Most on their lists do not. Still, this activity brought a new idea - that Diigo is good for keeping track of the gray literature that can be hard to find. Good thing to add to the book (courtesy of Mito.)
One a-ha was that our need to have "physical" copies of the .pdf on our local drives/Mendeley desktop is in part just like our need to have a hard copy in our offices. We don't trust that we will always be able to access it from the databases. Why not just let Ebsco store the .pdf for us, and we can download it when we need it? I guess if we are annotating .pdfs, though, we do need a local copy. Still, I do think this shift may take place soon.
The demonstrations went pretty well, with everyone speaking up to share the features that they knew. There are still some outstanding issues with Mendeley, but Mary Alice is coming in on Monday to help us with those if she can.
I have to work on our AERA proposal tomorrow and Sunday. The class is giving me a lot of ideas; we'll see what I can pull together.
(Note related to the book - we can go through my discussion questions and lecture notes to pull together the core argument for each chapter.)