Tuesday, August 6, 2013
Embarking on a Journey with NVivo (or 3 lessons from Middle-earth)
NVivo is a great tool for qualitative or mixed methods research/evaluation. Check out this not-nerdy-at-all blog post.
Also, NVivo doesn't publish LOTR photos because of copyright issues. So for our mutual edification, and because it is probably just advertising for Peter Jackson's franchise anyway, here are some LOTR photos.
Wednesday, May 8, 2013
After reading The Coding Manual for Qualitative Researchers by Johnny Saldaña
Most people would start off this blog post with: "I don't often get this excited about a research book." I am not most people. I do, often, get this excited about a book. Still, it's worth a write-up.
The Coding Manual for Qualitative Researchers* is my first attempt to go mainstream. What I mean by that is that I am not a methodologist. Researchers use methodology in the same way that social climbers use name dropping. It's oftentimes a contest to see who's in the know, who was educated more, and how you should be labeled. Envision a group of researchers with turned-up noses at a cocktail party. "You're a grounded theorist? Oh. I get it. I have to go see about a thing..."
Fortunately this book never leaves you feeling like you showed up to a stats party in your ethnographer's dress. It's basic. It goes through first cycle coding methods (what you do when you first look at the material) and second cycle coding methods (the name can be deceiving - you may have already looked at the material more than once before this), and it gives useful ideas for memo-writing and pre-writing.
One useful part came toward the end of the book. Saldaña mentions several focusing strategies that researchers can use when starting to write, one of which is called "The 'top ten' list". He suggests extracting "no more than ten quotes or passages" and essentially arranging and rearranging them to give you further insights into your data.
Thus, we have what is, for this blog, my list #1.**
List #1: Top Ten Useful Findings from Johnny Saldaña's TCMQR
- Use simultaneous coding. Because I was afraid of labels, I was already doing this in my research without knowing what to call it. It is essential to the way I use NVivo in my work. Frequently my colleagues ask me to find organizations in our network that meet certain criteria. For example, someone will find a grant opportunity that applies to some network members but not others. They may ask me if I have heard of a rural affiliate with a great tutoring program, or a large affiliate that utilizes multiple university partners. That's why it's essential that I code my interview data with multiple codes. Simultaneous coding allows me to search the database of my interviews and find specific passages that refer to promising practices throughout our network.
- Memo writing is time-consuming but essential. I don't know about you, but when I'm done coding a passage of interview questions, the last thing I want to do is write a memo. Interviews are not always the most exciting things. In fact, they are oftentimes quite repetitive. To have to go through codes upon codes of data is exhausting. Fortunately, Saldaña mentions that there are multiple ways to write memos. I don't have to stick to the research questions in my memo writing. I can think about the coding process (why am I using so many nodes for needs and not enough nodes for programs?); emergent patterns in the data (wait, is it possible that interviewees are mentioning needs but not aligning them with appropriate programs?); networks (are our affiliates in close proximity to each other doing better at matching needs with appropriate programs?); problems with the study (should I be asking better questions about matching needs to programs?); and future directions (maybe the next iteration of the study should focus only on practitioners who select appropriate programs).
- Qualitative analysis is cyclical. It's iterative. It takes time, and patience, and the willingness to go back to your data with a fresh set of eyes to see if anything needs to be reworked. All along, I thought that quantitative analysis was hard. For me, qualitative analysis might be even more difficult. I. Am. Not. A. Patient. Person.
Note: Hold on to your hats; the next several items are about methodologies. (There's a small code sketch after this list showing how a couple of them look once your passages become data.)
- Magnitude coding is useful for evaluating content. I'm using this a lot in my work. Initially, I referred to it as directionality. Essentially, you take a bit of text and evaluate the content. Take this sentence, for example:
"If we lose this grant, we're going under."
I can use simultaneous coding here and code for "grants," "cuts," etc. I can also code it as "finances" and "negative". Now, I realize I am applying a judgment here, but in my particular line of work, this is a good bit of information - we need to know if an interviewee feels that their organization is holding on by a thread. It helps inform how we work with them. That's the point of this blog; research informs practice. Another sentence may be:
"We worked with our local United Way and applied for a grant. We received $1 million to support after-school tutoring."
I can simultaneously code this with "partner," "non-profit," "tutoring," and "growth." If an organization's financial health is important to me, I can also code it as "finances" and "positive".
- Use descriptive coding to make life easy for yourself. This is essentially what I've been doing while analyzing the hundreds of interviews I've collected over the past several years. You take a chunk of text and label it. Done. I love it. (Wait, maybe I do love labels.)
- Evaluation coding is essentially the best of magnitude and descriptive coding. This is my happy place. I look at the data and think, what is this person saying? Is it good, bad, or neutral? Then I use that information to inform my colleagues about what is working and what isn't.
- Provisional coding is useful if you already know going into a study what you want to highlight. This is another type of coding that I use all the time. My colleagues want to know about specific programs and initiatives that are common to our network, and having a set list of codes for these programs and initiatives can make the coding process easier.
- "[T]hematic analysis allows categories to emerge from the data." This is something that is difficult for me, because it requires me to constantly step back from my precious nodes and allow broader themes to emerge. Themes can emerge from what is being said, but also from what is not being said - and by whom. If you're stuck in your analysis, and want to come up with broader, more dramatic theory, sometimes thinking of themes across the data will help out.
- Remember what is most important for the audience. Or, as he puts it, "I appreciate being told early in a report what the 'headline' of the research news is..." People don't have time, or patience (see #3 above). Get to the point of your story, and quickly, or they'll flip the channel to a reality show.
- "[T]here are times when it's more powerful to end a presentation with tough questions..." Although finding themes isn't my favorite thing, when I do finally get to the point, I like to tell everyone exactly what it means. In my line of work, where practitioners and support staff aren't keen on research anyway, it's occasionally helpful to ask *them* to construct meaning. In other words, rather than coming up with my own ideas about why some of our affiliates are challenge-focused and others are solution-focused, I should ask practitioners why this phenomenon is happening - and what they would do to solve it.
There's more, but I'll leave that for you to discover.
Bibliography
Saldaña, J. (2009). The Coding Manual for Qualitative Researchers. London: Sage.
*FYI, I used an earlier version - but I want you to have the most recent one!
**A caveat: This is what is useful to me, in my work - you will certainly find other useful tidbits.
Labels: audience, cocktail parties, coding, descriptive coding, evaluation coding, evaluation use, magnitude coding, provisional coding, qualitative evaluation, simultaneous coding
Tuesday, April 2, 2013
When in doubt, it's your job.
Over the past few weeks, I've been working obsessively on a project. In December, we sent out a needs assessment survey of our nonprofit network. My job was to analyze over 400 data points (using Excel and Word, which will soon change to SPSS and NVivo), summarize them for the different departments in headquarters, and organize meetings to study the results and discuss use.
It's been a lot of work, and fortunately, the work is never done.
I say fortunately because if you are ever in a place where you have decided that your work is done, you're either too confident or out of a job. When I sit down in these meetings, I learn so many things that will help my future work - in short, they make me even more useful to others.
During one meeting, I learned that I should have vetted our needs assessment survey with all the members of one of our office's teams. There was an awkward moment - worthy of a clip from The Office - where everybody essentially said, "Where the hell did you get these questions anyway?"
Awkwardly, the answer was, "Uh, I think it was one of you." No, really. I don't know a darn thing about this departmental function, so I'm pretty sure that the content was from someone in the department who apparently had forgotten s/he had given it to me.
Instinct tells us that we should react to this situation by flexing our muscles and trying to prove them wrong. Interestingly enough, animal instincts don't get you very far in an office environment. They especially don't get you very far if you're trying to get people to make use of your data. Instead, I kept my mouth shut and my biceps covered and moved on.
Though instinctively I would have expected people to remember that I had emailed them to discuss the content of the survey, the fact is that a) it was a year ago and b) I didn't sit down with everyone in that department. So, if I am looking to find a lesson in all of this, the lesson is that I didn't do my job. The next time that I create a survey for our office, I need to meet with everyone in each department, at a team meeting, to vet the survey. Institutional memory is stronger if everyone is involved in the process, and that's true even if there is no turnover in your institution. In this particular department, there had been significant turnover recently, so it's entirely possible that my having the survey vetted by only one individual caused a gap in the results this department is now receiving. And again, it was a year ago. People don't remember what they ate for lunch yesterday. How are they supposed to remember an email from a data geek about this awesome groundbreaking survey?
When people face difficult information, like learning that their beautiful survey had a few imperfections, their instincts often tell them to move more strongly in the direction they were already headed. People don't want to change currently laid plans or to feel self-doubt, because we are taught that this is a sign of weakness.
To a certain extent, they are right. Self-doubt can be a debilitating thing. But there is a difference between self-doubt and reflective self-questioning, and the latter can spur incredible growth. Upon questioning myself, I often find that there was more that *I* could have done to make a project run more smoothly. I don't blame myself for this - I'm human, as far as I know - but I always learn from it.
If you want a project to be successful, you have to learn to communicate with people and navigate organizations in multiple ways. So ask yourself, if someone misses a deadline, what could you have done to remind them of it? Perhaps more in-person conversations or email reminders. (Maybe this person communicates by carrier pigeon. Consider it.) If someone is a thorn in your side, what can you do to alleviate their concerns? You may have to go out of your way, change course slightly, ask for help, or think creatively, but that doesn't matter. What matters is that the onus is on you. If you see a problem that others don't, perhaps you aren't merely human. Perhaps, like Superman, you have been blessed with special vision. With this gift, you can help those who need corrective lenses to feel like they are part of a successful project. This way, you'll be part of a successful project, too. And that's the point of all this. It's your job.
Labels: data use, evaluation use, growth, institutional memory, Ronnie Coleman
Thursday, March 14, 2013
What it's about.
What is this blog about? It is about many things: practicing, researching, education, learning, nonprofits, organizations, psychology, management, self-actualization, finding fulfillment, self-awareness, humility, respect, health of mind and body.
Then Vidaghdha, son of Shakala, asked him, "How many gods are there, Yajnavalkya?" Yajnavalkya, ascertaining the number through a group of mantras known as the Nivid, replied, "As many as are mentioned in the Nivid of the gods: three hundred and three, and three thousand and three." "Very good," said the son of Shakala, "and how many gods are there, Yajnavalkya?" "Thirty-three." "Very good, and how many gods are there, Yajnavalkya?" "Six." "Very good, and how many gods are there, Yajnavalkya?" "Three." "Very good, and how many gods are there, Yajnavalkya?" "Two." "Very good, and how many gods are there, Yajnavalkya?" "One and a half." "Very good, and how many gods are there, Yajnavalkya?" "One." (Hinduism. Brihadaranyaka Upanishad 3.9.1)
What is this blog about? Well, really it is about two things: research and practice.
What is this blog about? Actually, it's about one thing: life.
Yesterday I was sitting at a dinner table with three dear friends. We were talking about how we've lived our lives - mistakes we've made, things we've learned, the phases we go through. As one of my secret joys is summarizing, toward the end of the conversation I said, "And you know, you go through phases, you make mistakes and you learn from them and you grow, and then you start all over again. And that's what life is."
That's how I earn my keep, too. I look at a program, learn from it, help it to grow, and start all over again. Right now, I work for a large, networked education nonprofit. It has transformed from an organization founded on the belief that every child deserves a fair shot at success to an organization that uses research-based practices to ensure that children get a fair shot at success.
My job, in both my personal and professional life, is to act, reflect, grow, and act again. It's a continual process.
This blog is part of that process. Enjoy.