Thursday, July 12, 2007

Gypsy fiddlers and one step backwards.

Did I mention the gypsy fiddlers last night? Last night, after I forgot to cancel the last of my many planned social events so I could write again, I ended up having to go, which turned out to be no problem because I thought I was far enough ahead, almost done with a first narrative draft. And it was wonderful. I went to the new Frank Gehry bandshell in Millennium Park, got a seat right down front with one of my favorite people, the 82-year-old Raffle Rose, and settled in to watch the symphony. Now, it's been 20 years since I've gone to one of those, so I was trying to fit in without having to pretend to like it, but the guests were gypsy fiddler kings, a small band of them playing with the symphony. Now that's the kind of symphony I can get into!

Now, it's Thu afternoon, I've volunteered to put in my full draft of a CBPR grant for review tonite, but unfortunately, I've gotten even more good feedback on my current draft - of the change buckets variety. Sigh. So, I'm tired but I'm mustering for one more big writing push.

And today I heard from one of the faculty what she really walked away from this whole shindig with.... that CBPR is not a theory, it's not an aim/goal, it's not an outcome, it's a methodological enhancement only, it should be nicely tucked into the framework of your proposal, but not spotlighted. (paraphrased) I'll ask more faculty later what they got out of it.

It is nice for us to have these experts on tap for days at a time, ready to review yr stuff in a jiffy, with worlds of combined experience between them.

Equity in CBPR budgets

First, let me say one thing... I actually haven't taken notes on many of the great presentations about the community-interaction elements of this institute. I think that's because much of this is more familiar territory to me, and some of it is very intuitive. But just to be good, I'll review some of the main components here:
  • do not underestimate how long it will take and how much of an investment it is for a researcher to build trust with community partners (especially if you come from outside the community!)
  • This is a business relationship, not a marriage; do not rely on good feelings, rely on following good business principles of mutual respect, documented expectations, money for work, joint tasks and timelines, etc.
  • Researchers should expect to put more than what's contracted into the community partner, like helping them with their capital campaign, etc.
  • Face it, there's usually a fair amount of in-kind given by both sides.
  • Now this is just my own deduction, but after hearing some amazing stories of CBPR relationships I think one of the smartest things either the community or researcher can do is make sure they partner with a hotshot on the other side. You will get the best performance, the most bestest results if you invest in a person/organization who has proven to be really competent, or even outstanding if you can find them.
  • Respect the fact that answers to many significant problems already exist in the community's knowledge. Ideas about the methods to get there may sit more on the researcher side, though not as certainly.
  • CBPR applications at NIH get scored in part on their CBPR infrastructure and partnership
Now onto a few comments on CBPR budgets and equity between both sides.
  • make sure you adequately compensate all activities of your community partners, presentation travel, etc.
  • make sure you pay indirects to your community partner
  • your university can only charge indirects on the first $25k of a subcontract, so the whole subcontract amount isn't hit with double indirects (see the quick arithmetic sketch after this list)
  • get a focused subcontract for the community deliverables, use eval measures on both sides to make sure you keep to performance. Likewise watch to make sure your researchers all perform, otherwise you'll sour your community partnership.
  • make sure your subcontract to the community group pays for a staff position, not just unassigned deliverables
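To put a little arithmetic behind that $25k indirects point, here's a quick sketch. The dollar amounts and the 50% F&A rate are made-up numbers for illustration only; your institution's rates and agreements will differ.

    # Toy example of the "F&A on the first $25k of a subcontract" rule; all numbers are hypothetical.
    subcontract_total = 80_000     # the community partner's piece, including their own indirects
    university_fa_rate = 0.50      # assumed prime (university) indirect/F&A rate
    fa_base_cap = 25_000           # the prime charges its F&A only on the first $25k of the sub

    fa_charged = university_fa_rate * min(subcontract_total, fa_base_cap)
    fa_if_uncapped = university_fa_rate * subcontract_total

    print(f"University F&A charged on the sub: ${fa_charged:,.0f}")      # $12,500
    print(f"Without the $25k cap it would be:  ${fa_if_uncapped:,.0f}")  # $40,000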
OK, I've gotta get back to writing, because the good news is I think I have enough done to submit my draft application as one of the 3 that'll get the big review from everyone tomorrow! Yay that, and weeha, I wasn't sure that could happen. On the other hand, I've just gotten feedback that my methods section sucks (well, politer language was used)... so I gotta haul ass before it's due at 8 pm tonite.

Whew, the titans heat up.

Wow, in case we were napping this a.m.... or alternatively writing like mad, a presentation this morning on methods for process evaluation got our back row of NIH folk fired up more than anything before. As my buddy next to me says, it's always nice to be a fly on the wall as big muckety-mucks from NIAAA, NIDA, CSR and others debate something amongst themselves. But alas, it was perhaps more an appearance of disagreement than true substantive differences. Regardless, let me summarize the important points as I interpreted them.

  • basic point -- NIH does not fund evaluation research, if you don't have an outcome to measure, don't come knocking.
  • process evaluation is most often used to measure fidelity to a planned intervention, and you aren't going to get thru CSR (the folk that score your proposal) without it (altho another person, from NIAAA, said it was less important to him to do much process eval on interventions that were well baked and documented)
  • but it can be used in a non-intervention setting to track your aims... like measuring satisfaction and faith in the process among your CBPR folk, measuring the level of project/topic education of your CBPR folk, or comparing the number/demographics of the people actually enrolled to the potential enrollment population
  • now back to interventions: the NIDA person said that as they start to emphasize interventions with longer interaction periods, the value of process evaluation increases, because the impact of the intervention can be affected by a greater number of factors, which then need to be monitored through process eval.
Another side note: whenever you have a survey of participants... always ask at the end if you can contact them again!

And yet another note: an often overlooked item in NIH applications is the data and safety monitoring plan; in the words of Dr. Bill, don't forget it!

Now, it's time for a break, and in the words of our esteemed co-chair... "I'm sure there's another thousand-calorie snack out there from Wolfgang Puck"

--- Break + 2k calories of muffins later ---

Another note I forgot from before: new applicants to NIH can now get expedited comments and are then allowed to resubmit in the next cycle, which could be as soon as one month away.

Now, as many of you know but some may not... an NIH grant can be submitted 3 times, and the process can easily take 2 yrs. Each time, your application can come back 1. not scored (but with review comments), 2. scored but not well enough to be funded, or 3. scored well and funded. Scoring happens over in Dr. Bill's shop, the Center for Scientific Review (CSR). Funding happens according to "paylines" decided by the institutes that take your grant into their "portfolio". Each PA usually has an institute portfolio attached (and it's one more reason to start your process by asking the program person at the institute what's important to them, because some portfolios are full for certain PAs, so it's a waste of your time to apply for them, they won't get funded).
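Just to make that timeline concrete, here's a rough back-of-the-envelope sketch. The only figures from the talks are the "3 submissions" and the "easily 2 years"; the months-per-cycle number below is my own assumption for illustration, not anything NIH stated.

    # Toy timeline arithmetic; the 8-month cycle length is an assumption, not an NIH figure.
    submissions_allowed = 3      # the grant can go in up to 3 times
    months_per_cycle = 8         # assumed: submit, get reviewed, get comments, revise
    worst_case_months = submissions_allowed * months_per_cycle
    print(f"Worst case to funding: ~{worst_case_months} months (~{worst_case_months / 12:.0f} years)")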

Also, if I can remind folk: I might get some of this wrong, so don't live by it!

Now, in the words of Kate Greeno: she expects to get funded, but right now, with paylines so low, she does not expect to get scored on the first submission; many seasoned, funded researchers do not these days. She also plans for each application to need the full 3 submissions to get funded. (So plan early!)

Now, another note of info from last night. What is the difference between all those different funding announcement types at NIH?

The name for all types of announcements is FOA now, Funding Opportunity Announcements.
Remember at NIH, you never are really working on a "proposal" it's almost always an "application".

PA = program announcement, most common, regular due dates, can stand for years. This is NIH's routine way to say "we want to see applications that are about this". Some are very very broad, since smaller grants (R03, R21) have to reply to a specific PA and they wouldn't want to exclude good ideas by having all PAs be narrow.
PAR = a PA with special receipt, referral, and/or review considerations; it has its own special due dates, and often a letter of intent (LOI) should come first.
RFA = a one-time shot with $ attached, often used to fund a small number of studies to kickstart a new area of interest. Interestingly, if you apply for one of these and don't get it, you can reapply under a regular PA, giving you kinda an extra shot at it.
PAS = a PA with set-aside funds; like an RFA there is money attached, but like a PA it can stand for years, so it's longer term than an RFA.

Wednesday, July 11, 2007

Wed 2 - Qual sampling & Retention

Sampling, Recruitment and Retention -- Qual part by Deborah Padgett

Hey y'all, we've had some quantitative presentations that didn't apply to my work as much, so I didn't take notes, but now we've got Deborah Padgett talking. She wrote a couple of books about qualitative methods, and I'm taking notes, so let me put them down here.

She starts by reminding us that almost all research uses non-probability sampling.

Sample size
Vigorous discussion here about how you pre-state a sample size when the goal is to reach saturation or redundancy, which is an iterative phenomenon you can't predict beforehand. You need to include a sample size table in your grant (but I don't see that in the directions, I'll ask more later). And you need to tell your IRB a sample size. And I've heard unofficially from an AJPH editor that they really hope any incoming qualitative studies have at least 25 folk. Deborah says it's understood that yr sample might be larger if you are doing grounded theory and smaller if you are doing a phenomenological study, and if your journal doesn't like your sample size, switch journals, because a sample of 10 can really be valuable when conducted well. But the consensus is to project a sample size, even if you then undershoot. In the immortal words of one of the co-chairs here... "You always have to say a number, then we never hit it, but you say it."

Main types of Purposive sampling
  • extreme or deviant case sampling (seeking the outliers)
  • intensity sampling (similar but doesn't go to the extreme outliers)
  • maximum variation sampling (sampling for heterogeneity)
  • homogeneous sampling (opposite of above)
  • typical case sampling (getting individuals in the middle of the bell curve)
  • criterion case sampling (setting criteria and enrolling folk who meet them; nominations are a subset, where folk refer others who meet the criteria and then you pick, often picking the folk with the most nominations; works best when the criteria have a positive valence)
  • snowball sampling (when the first enrollees refer others; NIDA has a protocol for snowball sampling with IDUs; see the toy sketch just after this list)
  • RDS and PDR for CBPR (respondent-driven or participant-driven sampling)
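And since the mechanics of snowball sampling can be hard to picture from a one-line definition, here's a toy sketch of the basic wave-by-wave idea. Everything in it (the seed names, referral counts, target size) is invented for illustration; it's nowhere near a real protocol like NIDA's.

    import random

    random.seed(7)  # reproducible toy run

    def snowball_sample(seeds, target_n, max_referrals=3):
        """Grow a sample wave by wave: each enrollee names up to a few
        contacts, who enroll and refer in turn, until we hit target_n."""
        enrolled = list(seeds)
        wave = list(seeds)
        while wave and len(enrolled) < target_n:
            next_wave = []
            for person in wave:
                for i in range(random.randint(0, max_referrals)):
                    referral = f"{person}.r{i}"   # made-up ID for the referred contact
                    if referral not in enrolled:
                        enrolled.append(referral)
                        next_wave.append(referral)
                    if len(enrolled) >= target_n:
                        return enrolled
            wave = next_wave                      # next wave = this round's new enrollees
        return enrolled

    sample = snowball_sample(seeds=["seed_A", "seed_B"], target_n=15)
    print(len(sample), sample[:4])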
Retention issues
Many folk have gotten grants, set sample sizes then fallen way short... now NIH is talking about cutting off funding for unmet enrollment goals (Quote from our OBSSR rep: "It's beyond talk")

So plan for your attrition in yr sample plan. (note from another meeting on this point - best way to keep your folk enrolled and in contact is to plan for routine contact points with them, don't leave big gaps, otherwise you lose folk).
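One bit of arithmetic that makes the attrition planning concrete (the numbers below are made up for illustration, not anything quoted here): if your analysis needs a certain number of completers, inflate the enrollment target by your expected attrition rate.

    import math

    # Toy attrition planning; both numbers are hypothetical.
    completers_needed = 100      # the n your analysis/saturation argument rests on
    expected_attrition = 0.30    # e.g., you expect to lose 30% before the last follow-up

    enrollment_target = math.ceil(completers_needed / (1 - expected_attrition))
    print(enrollment_target)     # 143 -- so plan to enroll ~143, not 100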

Retention tips (from a large homeless study that had amazing retention rates)
  • keep a toll free project number and distribute business cards
  • use incentives, cash & metro cards (cash is best and you should argue for it)
  • monthly check-in phone interviews
  • Refreshments for all gatherings
  • easy access location, near subway hub
  • incentives mailed out quickly
  • letters and holiday cards sent to participants
  • holiday party (the idea is to create a community of folks)
  • interviewers become familiar with participants' lives and use "rapport talk"
  • use respectful and formal language (not first names unless indicated)
Now we want to talk more, but it's time for our next speaker, so half of us will probably corner her in the hall later.

Hey, thanks to Rupaleem, another participant, she's telling us all about this good sampling website from RWJF with cites for the different types of sampling.... http://www.qualres.org/HomeSamp-3702.html




The race is on!

Hey, I haven't forgotten y'all here, it's just that I'm in the mad dash now to try and create a full grant application by Thu eve... so it's pretty much write all the time while folk are talking and while they are discussing, break briefly for food, read endless buckets of literature, sleep a bit, then get up early to write more. But I think I'll get a bit on top of it today, so I'll summarize some of our great presentations then. What's my topic? A tiny project trying to find out more about why LGBT youth take up smoking at such high rates. Encouragingly, everyone here seems to like the idea, so that's good. But the three folk who get their full narratives done by 8 pm Thu can volunteer to get the full group's feedback on them Fri... and I think there's at least a small hope that, ahem, well, it's not impossible that I could get there. (Why, oh why didn't I read more on the topic area before?)

(The hotel puts a small container labeled "vanity kit" in the bathroom... I had to wait a day to open it just because the thought of what tiny thing could be a vanity kit was so intriguing. It turned out to be 2 Q-tips and 3 cotton balls. Hmmm... I know Q-tips are amazing, one of those things like FedEx where you don't need it but then suddenly can't live without it, but I didn't realize they were quite that transformative... OK, enough digressing, back to work, it's 7:30 am already, we start in 1 hr!)

Monday, July 9, 2007

Mon 2 - Defining Community & Developing a Co-Equal Partnership

Defining Community and Developing a Co-Equal Partnership: Academic and Community Perspectives
Lisa Sanchez-Johnson U of C & Xichel Woods, Greater Humboldt Park Community of Wellness

This is the old-home presentation, cause I spent many years living in Humboldt Park in Chicago. It's a huge Puerto Rican neighborhood and one of the few places in the northern U.S. where you can develop a taste for corn on the cob with mayo, cheese and lime on top, or rice ice cream (remember, the rice ice cream has a raisin in the bottom, that's how you tell it apart from the coconut ones). I'll definitely be hitting it later to get my fix.

First thing... see a chapter by Israel in the book Community-Based Participatory Research for Health (2003) where the principles of CBPR are outlined well. Make sure your community understands these principles.

3 stages of CBPR implementation
  1. Identification
  2. Development
  3. Maintenance
OK, I drifted away a bit here, as I started reviewing lit for my paper while listening.

Mon 1 - Research thru a CBPR lens

Morning all, still no luggage and a typically restless night in a fancy hotel (you know, I cannot remember having any restless nights in a tent, hmmm), but hey, we power on.

There are now 28 of us here, all expected to write an NIH CBPR proposal as a result of our training in this institute. There are almost as many faculty as participants, and they've tried to get at least one faculty tuned to each of our research interests.

Welcome by Hank Webber
VP for Community and Govt Affairs at U of Chicago

He's just welcoming us really, but let me try to capture one of his good lines, and let me also warn y'all, all quotes are subject to the wild vagaries of my memory...

"We all believe there is deep knowledge in communities about how to solve social problems. There is also deep knowledge in universities. CBPR is a search for mutuality, achieving both academic rigor and public benefit."


Research Through a CBPR Lens
Sarah Gehlert
Principal Investigator of U of Chicago's Ctr for Interdisciplinary Health Disparities Research, and their project leader on their CBPR efforts.

CBPR is a balance between sometimes oppositional forces: community reality, the faith that findings will translate into real world outcomes, and academic rigor, the faith that findings have high reliability and validity.

"Community ideas about research and sampling may conflict with what researchers consider good science."

A common way CBPR applications fail in review is that either
  • the community partnerships are strong, but not written about systematically and scientifically, or
  • the project is strong scientifically, but fails to demonstrate a true co-equal partnership with the community
Mini discussion -- what if we are researching communities where the people are too unstable to be co-equal partners, like injection drug users? Folk talked about using community groups as access points, community leaders, and ex-members of the target communities, and about how co-equal partnership is really a goal, not a yes/no; you're trying to make someone as much a part of the partnership as they are able to be.

Insightful words of one community partner, "They say this is a partnership, they come to our homes and talk to us, but I'm never invited to their [the scientists] home."

Sarah Gehlert's Challenges of CBPR
  1. CBPR should be community-based, not just community-placed. The ideal is when the community originates a project and researchers are brought onboard to help with it.
  2. How to define community.
  3. Achieving co-equal partnerships
  4. Sharing findings and influencing policy and practice (could it harm the community? what and where to publish? decide in advance how disagreements will be handled; one way someone did this was to establish community co-directors who are the front folk to negotiate with when disagreements come up).
  5. Recognizing that CBPR principles alone don't dictate good design and methods
  6. Conducting ongoing evaluation
Morning break




Sunday, July 8, 2007

Dinner speaker -- Kathleen Alexis from the Alexis Nakota Sioux First Nation

Nimi Icinohabi - a life skills training program for Alexis Nakota Sioux First Nation youth
There are two dinner speakers, let me try to catch some of the story of one of those two... Kathleen Alexis is a leader of a First Nations tribe in Canada; she told the story of a CBPR research project they developed and implemented, Nimi Icinohabi.

The tribe knew of problems with high substance abuse rates among their members, and were worried adult practices would be passed on to the kids. But their school could not find a substance abuse prevention program that reflected their cultural beliefs. As their first step, the Alexis Nakota Sioux Nation invited researchers from the University of Alberta to collaborate on this problem. Eventually, they ended up with a workgroup with representatives from the band's education department, health department, the University of Alberta, the Alberta mental health board, and community elders. After a series of consultations through the workgroup meetings and community interviews, they decided upon a proposed solution to their problem... tailor an existing Life Skills Training program (Botvin's LST) to this community. The program provided training in resistance to drugs, self-esteem and personal management.

In phase one of program development, the community workgroup (she referred to them in shorthand as "The Elders") met regularly to modify the existing intervention, while restoring and preserving their Isga culture. The tailored intervention included the following community-specific components:
  • Alexis Nakota Sioux Nation teachings, ceremonies, prayer, storytelling, & sharing circles.
  • the Alexis Stoney language
  • a naming ceremony (this was a challenge, because it is not traditional to this culture so it took some thinking but it worked very well)
  • Isga artists' graphics and pictures
  • students' drawings
  • community volunteers.
Where are they now? They have fielded this intervention once in their school (phase 2), and are anxiously awaiting the evaluation results (phase 3) from "the doctors".

Community Impacts?
The process of bringing the elders together to help the youth has been invigorating, the meetings are well attended and the members have a "sense of pride and strength in what they do to help our community." It's also facilitated a lot of skills building for the members, they have reached out to organizations they haven't reached before (recruiting help from the Edmonton School Board and others), they are asked to present on this success to others ("I never guessed I'd be asked to speak to academics"), they have expanded their inreach to allies in the band (parents and volunteers). There have been challenges, including variations in the understanding of their language. It's also expanded the horizons of health issues that the elders think about. In Kathleen Alexis' words.... "For me, it's a spiritual journey and even being in prayer with the elders makes me feel very connected." "And you know I can't thank the doctors enough to come and help us... the encouragement... they are really good teachers and good leaders."
In sum? Engaging so many people in a project that incorporated their history, language, community stewardship, youth, elders, and everyone in between had benefits well beyond the development and tailoring of one life skills training program.

Second speaker - William Freeman - Director of Tribal Community Health Program, Northwest Indian College. Lummi Nation.

Ok, just a bit of his wisdom too....

The ten standard steps of CBPR
  1. identify problem
  2. identify solution
  3. develop plan
  4. generate protocol
  5. co-fund and get approvals
  6. co-implement protocols
  7. co-collect data
  8. co-analyze data
  9. co-draft results
  10. co-report results
He told a few stories about making sure you engage community colleagues in all of the steps, and regaled us with examples of when researchers did it wrong (the Navajo Nation once suspended all research for 13 months when researchers ignored their request and published an article that named the places where the research was conducted).

Some of his other good thoughts...

In CBPR, the community is a co-investigator on the research project. CBPR can be a continuum, from minimal to maximal. Ask yourself if the community is a co-investigator, if not, you're just doing research "in community". Here are some possible measures to help understand if the community is a co-investigator:
  • % of plan/proposal/analysis that is shaped by them
  • frequency of all team meetings
  • % of time community members are talking v. scientist
  • # of times scientist is learning something new
He had a bunch more slides with great CBPR info... let me see if I can get them from him to post.

The audience moved into a spirited discussion of compensation for CBPR community partners. I think I can summarize it quickly by saying almost all supported compensation for the time and wisdom of the partners, but there were some community members who had chosen not to accept it in accordance with their cultural mores, and there was some concern about how money might change partnerships into employer relationships... no big clarity on that thought... but to end with a quote from an upcoming speaker, "Money is a measure of power and justice... And I have yet to see money come in and spoil a relationship between a community and researchers."

(And sorry, but no pictures yet, my camera is in my "delayed" luggage right now, but I have high hopes it might return soon... or else this one outfit will get kinda boring!).

NIH Summer Institute on CBPR

It's Sunday and 27 researchers have arrived from around the country to get the latest training on how to create a successful National Institutes of Health Proposal on Community Based Participatory Research. Community Based Participatory Research, or CBPR as it's fondly known, is a pretty cool concept -- it's a process where fancy pantsy greek-talking researchers partner with the people they are interested in studying to design, implement and then promote the study. Sounds pretty obvious right, but face it, this is not the way research is often done. But now, luckily, it's getting pretty well accepted as a smart method for research.

Much of this blog will be pretty involved, because they are trying to teach us researchers how to ace CBPR --and-- the NIH funding stream. Now, I've done a lot of grantwriting, but NIH research proposal writing is a bit different than the average bear: it's long, commonly has you toss in over 100 citations, can have you resubmitting supercomplex revised versions of the grant over years, and after all that, they only fund about 20% of the proposals submitted. But it's also the path to bigger research dollars, and if you can jump through all these hoops and get funded, you're valuable to any university (because they make lots of money from your grant), and you're one step closer to getting NIH's famed R01 grant -- which is the main way they fund researchers' careers. Once you have an R01, NIH has invested in you and they are hoping you keep submitting good R01s for... uh, forever. So -- I'll try to label which sections are about acing NIH so you can read past them.

With that said, they spent about 5 minutes on introductions, then started shooting info at us... so I gotta stop yakking and start taking it down! So, let me take you to the InterContinental Hotel in Chicago, to a small nondescript room on the 2nd floor, where we've launched into the first session.

First Presentation - Outlining and Writing an NIH proposal

Dr. Bill Elwood – Scientific Review Administrator, NIH Center for Scientific Review & Jerry Flanzer – Senior Advisor, NIH Office of Behavioral and Social Sciences Research (OBSSR) & Peggy Murray, NIAAA

Wow, they flew thru these presentations so some info is partial. I've got handouts that supplement these notes.

  • Watch for a couple new NIH CBPR announcements coming out before end of year. Special reports
What do NIH Institutes really want to see proposals on?
  • PAs and RFAs (now called FOAs). RFAs are really interesting because they have money set aside, and often special review sections.
Pitch your idea to the Institute Program manager first through a concept paper
  • Usually 1 page
  • follow standard format: aims, significance, hypothesis, methods
  • Get feedback from them, also get feedback on your proposal draft
  • These program officers are really there to help you
What are some common goofs in NIH grantwriting?
  • Most common reason for nonreview = overly ambitious design
  • messing up your citations
  • not having a data safety plan (not the same as IRB!)
  • not understanding that all interventions with people are clinical trials
  • presenting why the problem is important, rather than why your study is important
  • presenting microscopic graphics to save space
  • omitting the conceptual or theory model
A good application (their word for proposal) has...
  • why the study is important
  • presents an exciting story
  • harmonizes money with resources with methods with researcher capabilities (with theory, with existing literature, with instruments, with statistical analysis plan)
Don't forget some of your power!
  • You can request a specific study section (<50%)
  • Request assignment to a specific institute
  • Request types of reviewer expertise
  • Name reviewers with conflict of interest. (you cannot request a reviewer, if you do, they will put them on the conflict list).
Big mechanism types we're talking about this week
R03 - smallest pilot grant, 2 yrs, <$100k
R21 - small exploratory or developmental grant, a bit more $
R01 - the big kahuna, no $ limit.

OK, there's more from these, but we're breaking for dinner, so I'm gonna post and eat!