Will: And welcome, everybody, to Funder Feedback - Ask the Right Way to Get the Right Results. This workshop is being recorded, and slides will be shared afterwards, so just keep your eyes peeled for a follow-up email later, in case you want to review anything from today. In case it's your first time here, this free workshop is an Instrumentl partner webinar. These are collaborations between Instrumentl and our community partners to provide these free educational workshops for grant professionals. Our goal is to tackle a problem that grant professionals often have to solve, while sharing different ways that Instrumentl’s platform can help grant writers win more grants. In case you didn't know, Instrumentl is the institutional fundraising platform. If you ever want to bring your grant prospecting, tracking, or management into one place, we can help you do that.
You can set up your own personalized grant recommendations using the link on the screen here. Before we get started, I just wanted to announce the winner of our $100 donation to their nonprofit. That participant shared today's workshop and was drawn from the entries earned by having friends register as a result. To enter for a chance to win a donation for your nonprofit in the future, feel free to register for our upcoming events and share them with your friends. Congratulations to Rebecca's Tent, which is a women's shelter in Atlanta that supports about 1,200 homeless women. We'll be making that donation on behalf of Instrumentl later today. Lastly, be sure you stick around until the end of today's presentation. At the end, we'll be raffling off three prizes valued at over $750 in total, so more details to come on that at the very end of Maryn's presentation, before the Q&A.
Now that the housekeeping is out of the way, I’m very excited to introduce Maryn Boess. In her 30-year grant career, she has been a staff grant writer, grant consultant, grant trainer, reviewer, author, speaker, mentor, coach, and a grant-maker, so this 3D background helps her bring a unique insider's perspective to the grant-seeking process. Since 2014, Maryn has taught over 30,000 students through her GrantsMagic U community, which she'll share more about later today as well. Maryn asked that you guys hold off on questions until the very end of the presentation, where we're going to have a dedicated section for addressing any questions that you might have as we go through the presentation today, but other than that, Maryn, take it away from here.
Maryn: Will, thank you so much, thank you so much to Instrumentl for making this possible and bringing us all together. I am so excited I can hardly sit in my seat, because this is the very first time that I will have shared a training that focuses exclusively on funder feedback. I love the subtitle of this program, "Ask the Right Way to Get the Right Results." That's what it's all about in Grants World, isn't it, guys? Knowing the right way to do something, so you can get the results that support you in your continued grant success.
As usual, for those of you who know me through GrantsMagic U... and please come over and visit if you aren't familiar with it. We are truly a global community of learning and practice in the grants field, and you can find all of the goodies that we've got for you, and join the community, at go.grantsmagic.org/join. If you are already a GrantsMagic U community member, hooray! I know we've got lots and lots of you here today, and it'd be great if you would just chat in and encourage everyone else to come over and join the big party that we've always got going on. Now let's get this underway. As Will said, we have specifically set aside a big chunk of time to answer your specific questions about your experiences and your challenges with getting funder feedback, but only after I've given you my big context and shared from my own experience what I know, what I learned, and what I can do to support you.
I would love for you to just take a moment to chat in, in 10 words or less, keep it as tight as you can: what's your one biggest question, your one biggest question, when it comes to getting feedback from funders on your grant proposals? Then at the very end, I'll ask you to chat in your specific questions; this is just a little bit of a warm-up. I love the introduction that Will shared, and it always makes me sit and look around and say, "Who is he talking about? Who is that person?" This slide shows you what my long and winding road through my 35-plus years in Grants World looks like; maybe yours looks a lot like this. I know we've got tons of people with lots of experience, as well as total grants newbies, joining us here today. Mine has been a very long and winding road, and I like to say, if there's any job that someone can do in Grants World, I've probably done it. If there's any mistake that can be made in Grants World, I'm certain I've made it at least once, and hopefully learned all the lessons to go along with it as well. What's really important here today is that bottom line there: starting in 2006, I had the opportunity to become a grant maker. I became a grant-making program officer for a pass-through program that was working with federal dollars through the U.S. Department of Education, passing those dollars through in the form of training and professional development grants to the K-12 community in my particular state.
Lots of experience in what it's like to work with federal dollars, lots of experience in what it's like to be the grant maker behind the scenes, whose job is to get the dollars out into the community and get those dollars working. That's really the perspective that I bring to you today, specifically talking about funder feedback. But I want to check in with you, because on that long and winding road of mine, there have been a lot of times when I thought, "Oh my gosh, this grants thing, this is absolutely nuts," and I'm sure you've thought the same thing yourself. So I just want to give you some flash cards and ask you: what is it like for you on those crazy days when you're just tearing your hair out and saying, "This is just crazy"? What is it like for you? Does it feel like you're yanking on the handle of the jackpot machine, waiting to see if somehow or other those magic sevens will line up and the money will come out, but you really don't have any control over when, or whether, that happens? That's the jackpot analogy. Maybe it's more like an ATM machine for you; it's not so much the handle on the lever, it's more like, "I know there's a secret code here, if I can just punch in the secret code. I see other people getting money, and I know they must have the secret code; if I could just grab that secret code myself, I'm sure the money would come out for me as well." Oh my goodness, how about this guy? This poor gentleman, I just love the intensity on his face. What's he doing? He's in an arcade, right? A game arcade, and he's playing whack-a-mole, and those little plastic creatures just pop up completely at random, and he has to be totally reactive and try to knock them back down again before they disappear… Is that what grant-seeking feels like for you? Like, I am totally in reactive mode, whacking those moles and trying to grab them as fast as I can?
Yeah, okay. Let's see the chat box light up. Well, chat in when we hit the one that resonates for you. This one is so common. This is the jungle analogy. I’m in the jungle, and I can't even see three feet in front of me, I know there must be a path through, but I don't know where it is, but I’ve got this machete, and I’m just going to whack that machete as fast and hard as I can, and as long as I keep whacking as fast and hard as I can, I’m sure I must be making progress, right? Oh my goodness, I think this one is my favorite. This is Alice and her adventure in Wonderland. This isn't a jungle, here there is a path, in fact, the problem is, there are lots of paths, and from where she is, down on the ground, Alice can't see which path, if any, is going to lead her to the success that she's looking for.
She's stuck in the middle, because she has no clue which path to choose, or how she could possibly move forward.
Grant seeking and grant success, whether you're a newbie or you've been doing this for quite a long time, as I have... I know from time to time it's felt like one of these analogies, and maybe you've got your own favorite analogies for what it feels like on the toughest of days. But what I know, and what I really want to share with you, is that it is a long game. These are the big lessons that I've learned on my long and winding road. Grant success is a long game, it's not a one-and-done, and you're not going to have success every time; you may not even have success for a while if you're just getting started. It is a process, it's not a to-do list activity. "Just go get us the grant check" is not the way it works. It's an ongoing process, continuing to unfold in a dynamic and ever-changing environment. I call that environment Grants World, and because it is an ongoing process in an ever-changing and dynamic environment, it's an adventure, right? It's never the same thing twice; things are constantly changing, and evolving, and moving forward. That means, my friends, you get to be the hero. That's good news. And more good news is, there really is a map. There really is a map, and we call it the Grant Success Path at GrantsMagic U: a clear, consistent path from where you are to where you want to be. Now, this is not a path that someone lays out for you; this is the path of your tasks, your activities, and your strategies moving forward through Grants World. And this is what the Grant Success Path looks like, the way we share it at GrantsMagic U; I'm just going to flash this on the screen for you here. We're not going to focus on it, but I really want to start with the context that this is a process, it's a strategic process, it's an ongoing process. You can see it starts with step one, down in the bottom left-hand corner, and goes all the way up to step 10 at the top. What is step one? Get Real. What is step 10? Get Better. And then we've got that lovely, looping yellow line that brings us all the way back. That's our feedback loop that says Get Better means we always have the chance to improve and get better at each and every step along the way. Our focus today is getting better at getting feedback. Feedback from funders is only one way that we can get that. We can build improvement into the process, and that's what we're going to be focusing on here today.
You chatted in what's your one biggest question when it comes to getting feedback from funders on your grant proposals, and I am not a mind reader, but I’m going to bet that 90 percent of the questions that got chatted in, if not more, fell into one of these three big bullet categories: How do I get funder feedback? How do I make sense of funder feedback when I get it? And what do I do with it?
We are going to be tackling all those questions, but I want to let you know that my two-word answer to this, and to every question that has to do with "What are funders thinking? What are funders all about? What's going on in the funders' world?" is always going to be, "It depends." It depends on the funding organization; it depends on the structure, and the culture, and the practices of each and every individual funding organization, because each and every funding organization is unique. It depends on how you ask, and it depends on what happens inside the grant maker's black box.
What is the grant maker's black box? This is one of the biggest lessons I've learned as a grant maker. The grant maker's black box is where your proposal disappears when you push submit. Boom, submit. It used to be when you dropped it in the mail slot, or you took it to the FedEx office, but now, as often as not, it's a submit button online, or an email submission process. Boom, it goes away. It disappears, it becomes invisible to you. What happens to it from that point on, until, at some point in the future, you get a notification of some kind? Hopefully it's a "Yay, we want to fund you" notification, hooray! But the distance between pushing submit and getting a notification is a complete mystery to us when we're the grantseekers, because all of that work happens inside the grant maker's black box. I live inside the grant maker's black box as a grant maker, and one of the biggest lessons I've learned is that so much of what we struggle to understand as grantseekers becomes much more clear, much less confusing, and much less frustrating once we understand what life is like for the grant makers. Not just what grant makers say to us, not just the public-facing engagement, not just how they engage with us in the formal and official professional setting, but how things work inside the grant-making ecosystem, behind the scenes. In other words, inside the black box, where the sausage is made, right? If you want to think about it like that.
Today's journey: we are definitely going to talk about how to get funder feedback, and what to do with it when you get it, but we're going to start by building the bigger context for understanding funder feedback with a behind-the-scenes tour of what really happens inside the grantmaker's black box. Along the way, take a deep breath, we're going to bust three of the biggest myths about what really happens in the grant-making review and decision-making process. I'm not going to keep you hanging; I'm going to show you what those three big myths are right now. Ready? Here we go.
Number one, it's a competitive process. It isn't. I’m going to show you exactly why it's a myth that the grant review and decision-making process is competitive.
It's an objective process. Bing bing bing bing! It is not, and I’m going to show you exactly why the grant review and decision-making process is not objective. How about this one…
Take a really big breath for this one, guys… This is a toughy. An A+ proposal will beat out a B- proposal every time. It will not, and there's a very good reason for that, and I’m going to show you exactly why that's so, alright? And all of this, again, is in aid of really understanding the context for funder feedback, so we're going to be circling back, then to talk about the three biggest mistakes that people make when it comes to getting feedback from funders on their proposals, and the lessons that you can learn from them, so you won't make those same mistakes.
Just as I previewed the three biggest myths for you, I'm going to preview those three biggest mistakes so you know what we'll be leading up to. Number one, the biggest mistake: they don't ask for feedback. We're going to talk about that, about why, and when we should, and when we shouldn't. Number two: they don't ask for feedback the right way. Did you know there's a right way to ask for feedback, and a very wrong way? We're going to break that down for you. And then, finally, people misunderstand what the feedback means, and what to do with it.
We're going to talk about that as well. But again, all of these questions will be much more clear for you once we understand the context of what actually goes on inside the grantmaker’s black box during the process of deciding which proposals are going to get funded.
Finally, you're going to have time to bring all of your own questions, your own challenges, your own experiences into the conversation during our open Q&A session. We specifically set time aside for that, so as Will said, make a note of the questions that come up, but don't chat them in just yet; we'll invite you to chat them in later. Many of your questions I may be answering for you as we go through. I hope that's the case, but for the questions that remain at the end, you'll be asked to chat them in then. The format we'll use when we invite the questions is that we'll ask you to actually put "##" in front, so that it'll be really easy for Will to scroll through the chat and pick out your questions, versus the comments you might be making to each other in the chat box. So, lots of time for that.
Write down your questions, write down your "aha's," write down your "hmm's," and write down your "huh's." Aha means "Wow, I didn't know that." Hmm is "That's really interesting," and huh? is "I don't really know if I believe that or not." Then we'll come to the questions a little bit later on.
Alright, so here we go guys, ready to go inside the grantmaker's black box? These slides actually come from a training video that we've got up at GrantsMagic U called "Inside the Grant Maker's Black Box," where I take you on a very deep dive into my own grant-making process, a recent process, where you will actually watch the proposals as they come in and track them through. You'll sit in on my review team meeting, you'll listen to the conversations that we're having, and you'll see all of the really unexpected things that happened that made the process especially challenging.
You get a real-life experience, a real-world experience, up close and personal with an actual grant-making process. But what we start with is this: a typical grant-making process, not just mine, and I'm expanding out here for you across all grant makers, which is a scary thing to do, may very well be structured around eight specific checkpoints inside the grant maker's black box that your proposal needs to make its way through, like an obstacle course, in order to what? In order to get into the finals round. And we're going to go really, really quickly through each of those checkpoints, with a deeper dive into a couple of them. But I want to set the context for this by reminding you, as I mentioned earlier on, that each and every grant-making organization is unique. It's unique in its structure, it's unique in its culture, it's unique in its ecosystem, the dynamics of how things work; it's unique in its life cycle, the timing of things, how things happen inside that process; it's unique in the people that are inside that process moving things through, or impacting what happens, what doesn't happen, what works really well, what's quirky and weird about it all. And then I say all the P's: each and every grant-making organization is unique in its preferences, its priorities, its practices, its principles… what are some of my other P's… its pet peeves, all of those, and especially its personality.
That's the context: we don't have a broad brush that we can use to say "grant makers always," or even "grant makers usually." So what I'm going to share with you is a template that's very typical for how the process works. It's how it works in my world, and it serves as a good general template as a whole.
The first checkpoint is "deadline," which is fun, because when we're grantseekers, the deadline is sort of the last thing in the process of getting the grant in. But your proposal enters the black box on deadline day.
Deadline, or receipt, is the first step in the process, okay, and then the proposals move to a technical review. Technical review simply asks: does the proposal, as submitted, follow the rules that we set out in our RFP, our request for proposals, or our proposal guidelines, or whatever it is we used to let people know how we wanted the proposals to come in to us? This is a completely objective scrutiny. This is me, the staff person, sitting down with some kind of a compliance checklist; it could be very informal, it could be very, very formal. It is not qualitative at all, and we are not judging or assessing the value of the proposal, we're simply making tick marks in terms of the compliance issues. Are all the required attachments included? Are all the signatures in blue ink? Are all the margins the right size? Were there any violations of our character count limitations? You know the drill, you know the drill.
That's checkpoint number two. Proposals that advance from checkpoint number two go into the individual review and, again, this is a super high flyover. Inside the "Grant Maker's Black Box" course, I spend a lot more time at each checkpoint showing you what can go wrong for your proposal at each stage of the way. We're just doing a super-duper quick eagle-view fly-over here. The proposals that make it through checkpoint one and checkpoint two go into the individual review. What is this? This is where people, one person or a group of people, put their eyes on the proposals, all the proposals, not just yours, but all the proposals that have been submitted, for a qualitative review, a qualitative review. My friends, this is a really, really big point that I cannot stress hard enough at this stage, and it feeds forward into our feedback issues. The review of all grant proposals is not an objective process. It cannot be an objective process, because proposals are not like math tests. On a math test you can score right answers and wrong answers; there's no question about right and wrong on that kind of objective test. Scoring, or evaluating, or assessing a proposal is not an objective process; it's a lot more like an essay test, it's a qualitative review of the responses to the request for proposals. Hold that thought in mind. That's one of our big myths right there, just busted it, right? I'm going to show you a few more inside examples of what that actually looks like on the ground. Yes, I know, there are all these points being tallied up. Yes, I know there are all these detailed matrices that proposals are being scored against. Doesn't matter. The numbers don't make it an objective process; the numbers just give us something to help us move the decision-making forward. You'll see what I mean in a little bit.
Alright, so that's the individual review. Out of the individual review, in many processes, those scores come forward. We compile them in some way, we create some sort of report that can then go forward to what? The group review.
We're making order out of all of the parts, all of the information that comes in from the individual review goes forward into our group review, and this is where the people who have read the proposals now come together as a group to decide which proposals they are going to collectively recommend for funding. Now, what's really important here is that puzzle graphic that you see, because this is all about fitting pieces together. What are the pieces that we're working with in terms of the qualified proposals, and how do they best connect with each other? To what? To create the overall effect that we're after, or the impact that we want to have with our dollars in this funding round. What are the pieces that we have to work with, and given the money that we have available, how can we best fit those pieces together to create the overall effect? Coming back to that in a little bit, as well. Did you know there's a checkpoint called official approval in many, if not most, funding processes? There very well may be another step beyond the group review, where the recommendations from the group review go forward to someone else who has the final say. That could be an executive team that has the final say on a community review advisory board's recommendation.
Sometimes it's an individual, for instance, the CEO of a corporation, who might be reviewing the recommendations from an employee proposal team.
Sometimes it's family members. If we're talking about a family foundation, there may be a review process that sends recommendations forward to the family members, and they get the final say. So that all becomes part of the sausage-making on the inside of our black box that we on the outside, as grantseekers, may not be thinking about as our proposals are making their way through. There is also a checkpoint that is negotiations, which is all about: now that we know which proposals we want to fund, are there any changes that we need to see? Do we perhaps need to work with some budget cuts, pare budgets back a little bit, or make some specific changes so that these programs will really work well for us, and work together?
The negotiation piece is a part of that interior process as well, and once all of that's taken place, we come to the final checkpoint, which is notification. Notification is simply when we're done with all the processing, we know exactly where the money's going to go, which proposals are going to get funded, and we're ready to put out the notification to those that will get funded, and those that are not going to make it this time around.
That's the very quick flyover of the eight checkpoints, a typical template of checkpoints inside our grant maker’s black box, alright?
Let's talk about elimination, because inside the grant maker's black box, what happens is, we are moving from the 27 balls that you see on the left-hand side of your screen, each of which represents a proposal that comes in, through the deadline, all the way to the right-hand side, to the three that ultimately are going to get funded. Now these are just made-up numbers, obviously, just for the purposes of illustration, but this is always the way it works. We have more proposals coming into the pipeline than we are going to be able to fund, so the process of moving from checkpoint to checkpoint to checkpoint becomes a process of selective elimination. At certain checkpoints, proposals are selected out, and do not move forward to the next round. Only a certain number are going to make it all the way through to the finals round, we call it the finals round, and ultimately only a certain number are going to come out the other end with awards attached.
Let's talk about what those elimination rounds are.
Okay, deadline or receipt: that's an elimination round for many funders. If you don't get in by the deadline, boom. You don't get funded, it doesn't move forward, it does not advance. Technical review: this isn't true for all funders, but for many this can be an elimination round, where a proposal that violates one or more of the technical requirements, as we call them, eliminates itself.
Okay? They take themselves out of consideration by not lining up with the requirements that we gave them in our guidelines.
There's an elimination round there. Individual review and group review… not usually, not always an elimination round, but that's certainly where the elimination process, the selective process, really gets cranking up. The official approval absolutely can be an elimination round if the person with that final approval says, "I don't like that choice, let's send it back and get something else instead." And then negotiation. If you've ever been in the situation where the funder has come back and said, "We love your proposal, but we can't fund the full budget, can we work on that and see if we can do something with less money?" and the answer is, "No, I'm afraid we just can't do it with less money," then that can be an elimination round as well.
It's the elimination rounds that really come into play when we're talking about funder feedback, because we have to look at where funder feedback actually comes from in that process. You're not going to get feedback from a funder if your proposal was eliminated at the deadline or in the technical review, because it doesn't advance to the point where anybody's actually looking at it and doing a substantive evaluation, right? And if it's in negotiation, then that's part of the discussion process. But the kind of information that we typically think of when we're thinking about funder feedback is information that's generated through the individual review, the group review, and sometimes the official approval process.
Alright, so what we're going to do now is dive a little bit more deeply into each of those three checkpoints and take a look at what happens inside them. Again, it will be fairly high-level; I'm going to do a quick extract from the bigger course to show you some of the high points of the process, to give you a taste of what's actually going on that impacts the kind of feedback you can reasonably expect to get.
Alright, so let's go back into the individual review. Here we go, guys: essay test. Remember, this is my qualitative review. I'm a reader, I'm a reviewer, I'm sitting here with a stack of proposals on one side of my desk, and I have a checklist on the other side, maybe with very specific point values for things that I'm to be looking for, assessing on the basis of how well the proposal is responding to these particular questions and criteria. But it is still a qualitative review, my qualitative review of the responses in the proposal.
Alright, so we're going to go meta; that means we're going to kind of rise up again and take a look at what this looks like. Again, these screenshots come straight from the "Inside the Grant Maker's Black Box" course, and I'm not going to give you a whole lot of context here, but my review team is a very, very experienced team: they're advisory council people, they're experts in their field, they've worked with me for a while, they know the projects inside and out. They are not new, and we sit down with a detailed scoring sheet each and every time we're doing our proposal review. Here is what those scores look like for proposal two.
Let's say we had 10 proposals to review. This is our scoring sheet for one proposal, and you can see… let's just blank this out, so you don't go, “Ah, that's too many numbers,” I understand that. Let's just focus on what's really important here. What's really important here is that there are a hundred and forty points possible in our entire scoring process.
Okay, for all proposals. This is just proposal two, and across the top you can see the individual reviewers: Reviewer A, Reviewer B, Reviewer C. Now, these are actual real, live, living, breathing people. These are not scoring automatons; they are you, and me, and Betsy, and Frank, and Jill, and George, real, live, living, breathing people with real lives, real challenges, real mental energy issues when it comes to sitting down and paying good attention to important grant proposals. Take a look at how my proposal reviewers actually scored Proposal 2 out of the 140 possible points. Reading across: Reviewer A scored 107, Reviewer B scored 121, Reviewer C scored 127, Reviewer D scored 69. Then we have 71, 113, and 103. These are the total scores from my review team for one proposal. Remember, these are experienced reviewers. They know my program inside and out, they're looking at exactly the same proposal, they're looking at exactly the same scoring criteria, and out of a total of 140 possible points, look how different their scores are, from a high of 127 to a low of 69. Hmm. Very interesting.
Alright.
Hold that thought, and what we're going to do now is, we're going to dive in and zoom in on one item on that proposal, item 11.
All the way down the left-hand side are the 12 items that are being scored, and we're just going to look at item 11 right now. I'm going to take you into the score sheet and show you not only each reviewer's score, but the comments that each reviewer made to explain their score for that item. Again, this is just one item; it's called systemic impact. It is worth 10 points out of the 140, and here's what I want my reviewers to do: on a scale of 1 to 10, as a reviewer, how confident do you feel that the work and the impact of this project will have the support it needs to continue beyond the funding period?
Okay, systemic impact. How confident do you feel on a scale of 1 to 10 that this will have the support it needs to continue beyond the funding period?
Here are the scores.
Okay, there we go.
Alright, so Reviewer A scores it 6, and the explanation is: "Generic in nature; weak description of how the state working team will extend the effort." Reviewer B scores it 10, full points: "Pleased to see the number of teachers able to participate in the professional learning and the collaboration among several West Valley districts." C scores it 10, gives it full points, but doesn't offer any comments; that's very common. D gives it eight out of ten: "This section is brief but specific enough that I believe the framework is in place," etc., etc. E gives two points only out of ten, and its only comment is "Highly unlikely," meaning, "I don't have any confidence at all." F says nine, and that's a lot of points based on the narrative: "This plan seems feasible." Not very helpful there. G says 6 out of 10: "Concerned about the depth of training spread over large numbers of teachers; wish collaboration were stronger."
Here's what I want you to look at. I want you to look at B, and I want you to look at G. B said 10, G said 6. B said, "pleased to see the number of teachers." G said, "concerned about the large number of teachers." B said the collaboration looks great. G said, "wish collaboration were stronger." Right? Actual word for word, coming straight out of my review process. Completely typical that reviewers would see the exact same information through very different perspectives, and score it very differently.
Think about that. We're going to come back and have an opportunity to talk a little bit more deeply about what this has to do with funder feedback, and what kind of feedback is possible, much less useful, in a situation like this, which is the situation all the time. This is an absolute snapshot of what things look like inside the individual review process.
Alright.
Okay, let's move to the group review. As if that's not enough.
All we were seeing there was the individual scoring and comments; we just put them all together into one data report. But now we're going to take a look at what happens in the group review itself, when we bring our reviewers together and they talk about the proposals that they saw, what they liked, what they didn't like, and how we're going to come to a consensus on which ones to recommend for funding. Remember, this is about fitting it all together, our puzzle analogy: what are the pieces we're working with? What are the proposals that came in? And given the amount of money we have, how do they best connect with each other? To do what? To create the best possible impact. Again, given the pieces we have to work with…
Here's where we're really going to hit those big three myths from outside the black box hard, my friends. Number one, I'm going to show you how we're going to put the lie to the myth that it's a competitive process. Once we're inside the finals round, it is not a competitive process, and you'll see exactly why. You will learn even more clearly why it makes no sense to even think of it as an objective process. And that question about whether an A+ proposal will beat a B- proposal every time? You're going to watch in real time as we make that one go away completely, for good. Because what's really going on here, my friends, and this is the big gold-star nugget from anything inside the grant maker's black box: in their final decision-making, when it comes to the finals round, grant makers aren't looking to fund the objectively best proposals, they are looking to create a portfolio or an ensemble of funded projects that, taken together, best furthers their specific intentions in that funding round… in that funding round.
This is maybe the single most important slide of the entire presentation in terms of understanding what goes on inside the grant maker's black box, especially when it comes to those final funding decisions, okay? And the big truth that drives grantseekers crazy… take a big breath before you read this one, half the success of any grant proposal, in any given grant-making round, is something that we as grantseekers have no control over whatsoever, and that's who else shows up. Who else shows up? What are the other proposals that are in the mix for consideration? What are the other puzzle pieces that are available for the reviewers and the final decision-makers to work with to create that final picture?
Alright? That's a biggie, but I want you to see this for yourself. I don't want you to take my word for it. I need you to see this, and experience this for yourself.
Let's do an experiment. Here, you're going to play along with me, you're going to be part of my review team.
Put your grant maker cap on, first of all. Put your review team member cap on, okay? Let's say that you're a member of the review team considering proposals for a project called "Community Approaches to Improving Literacy," and let's say that we had a total of 15 proposals come into the review. The whole team got together, you did your own individual reviews and compiled all the scores, and so what you're seeing now is the compilation of the average scores across all review team members for the 15 proposals. Now, you're not seeing all 15 proposals, you're just seeing those that scored above a certain point, but you can see the highest-scoring proposal was 98 points out of 100, then 95, 93, 91, 90, 85, then there's the dotted red line, and then we see one proposal below that which says "82." That red line, what's that all about? Well, we were told provisionally that we would give priority consideration only to those proposals that scored at least 85, and we also know that only four proposals are going to be funded. Let's just say we have only enough money for four.
Of the four proposals that are going to be funded, if you look at this screen in front of you, what do you imagine the final four might look like? Well, if you're going by points, it's pretty clear, isn't it? It's going to look like this: the top four scoring proposals are going to get funded. How simple is that?
So wouldn't you be terribly surprised to find out that this is the actual result once we send our recommendations forward to our executive team? The final results that come back look like this, and you're going, "What the heck, the top-scoring proposal didn't even get funded, and somehow that thing below the red dotted line did. What the heck is going on in there?"
Alright, that's what you as a review team member would see. But when you go to the finals round, which is what we're going to do now, there's another level of decision-making that has to take place, and what happens in the finals round changes everything. Because here, in addition to matching mission and scoring on proposal qualities, there are two other criteria that apply to the final funding decisions. We call them non-competitive criteria. It goes like this: we're committed each year to selecting projects that reflect the greatest possible diversity in representation of the major regions of the country, geographic diversity, and in the high-needs populations to be served by the projects awarded.
In the final selection process, these criteria come into play, and here's what that process might look like. Now, become a member of my executive team and walk along with me as we take a look at the recommended proposals and apply these two final criteria to choosing the four that are ultimately going to get funded. With your grant maker's hat on, we know we're not going to fund two projects from the same geographic region, not if we're only going to be funding four altogether.
Right away, we've got a choice to make between our two top-scoring proposals, right? Because they both come from the state of Washington, my home state.
How are we going to decide that? Well, we're going to look at the target population factor, okay?
What can you discern about that? Gosh, we've got Seattle working with the working poor, and we also have Boston working with the working poor. But if we look at Tacoma and refugees, that's the only proposal serving refugees.
Just kind of fast-forwarding through a lot of conversation we might have about that, we're going to say, you know what, since Tacoma is the only project for refugees, let's go ahead and select that. Which automatically means we're going to select the Seattle project out, representing the same geographic region, provisionally, provisionally. And so that's what that looks like; that's how the high-scoring proposal got set aside in the decision-making process. Now what happens? Now we move down to New York and Boston. Same situation: they both represent the same geographic region. How are we going to choose between them? Let's do the same process with them. I'll give you a moment to look at what we see on the screen here, and see for yourself how you would choose between New York and Boston, right? Maybe you would look at the target population again, and you would notice that New York and Chicago both serve inner-city youth.
We have an option for inner-city youth, but now, because we eliminated Seattle, Boston is our only working-poor option. You see how we're not even talking about the proposals' qualities at this point? We're assuming they're all high-quality proposals. That was the hard work that our review team did, just to make sure we're only looking at high-quality proposals. Remember, there are another eight proposals down below that line that we're not even seeing at this point.
What are we going to do now?
We're going to eliminate New York, and we're going to keep Boston, and we're going to keep Chicago.
Alright, guys, that's three out of four. We have just one more proposal to select, and guess what? We've got one more option there. We've got Texas.
Okay, a different geographic region, and they're serving farm workers. Hey, a whole different target population, right? Slam dunk, totally perfect, yay! Except that one of our executive team members says, "Hey, guys, you know what, it's been at least three years since we've had an opportunity to fund a project serving Native Americans. In fact, I'm not even sure we've seen a proposal for a project to serve Native Americans in the last three years. I think we ought to give some serious consideration to that opportunity this time around." Wow.
Alright, let's talk about that. We have a conversation about that, we decide the project is strong enough, that we're willing to support it because it serves a high-needs population that we rarely have an opportunity to serve, and there you go. Voila! Those are our four final awardees. That's how that works, my friends. If you're going, “Hold on, wait a second, that thing with Tucson and the Native Americans, that's below the 85-point line, that's not fair.” Of course, it's fair, it's totally fair, because we make the rules, right? And the rule is not that the line is going to make the decisions for us in our decision-making process, the line is there to help guide our thinking, but the line doesn't make the decisions for us.
When we see a unique opportunity, we are totally free to respond to that unique opportunity, because here's the thing that just drives grant writers crazy, and I know, because I was one, it also drives recovering A+ students crazy, and I know, because I was one, it looks like this. This is what happens in the finals round. When we've got our qualified proposals there on the screen in front of us, guys, the points no longer matter, or at least the points are no longer the only thing that matters, and rarely are the points what determine the final awards, okay?
This was a very fast and simple, stripped-down version of what happens behind the scenes when we're putting the puzzle pieces together, but I hope it gives you a feel for what that can look like. This can take many, many hours of intense conversation, for sure. This is just the fast-track way of getting you to see how those non-competitive factors, geographic diversity and target population in this case, were what we chose to use as our guides for putting our puzzle pieces together. And so, in the end, it's a lot less like the Olympics, a competitive process where the first three proposals across the finish line win, and if you're in fourth place you don't, and it's a lot more like, what? An audition. It's a selective process: we're choosing to create an ensemble, we're looking to see who's showing up to audition, and we're assessing how everyone who showed up is going to fit with each other to create that ensemble, or collective, or portfolio impact that we're after here. Remember that gold-star nugget: in their final decision, it's the portfolio or ensemble of collective projects that best furthers the intentions of the funder that makes the difference in the funding round.
Is it a competitive process? No, it's a selective process. It may feel competitive to us as grantseekers, but internally, it doesn't work that way. Is it an objective process? No, it could not be; it's not a math test. It's an evaluative process. And will an A+ proposal beat out a B- proposal every time? Nope. Because in the finals round, the points are not the only thing that matters; other factors come into play that have nothing to do with an A+ or B- proposal, alright?
So, whoo, big breath here. What about feedback? This is all about feedback, right? The focus here is on really understanding more about what goes on inside the process, so that we can be better positioned to understand what's possible, what's realistic, and what makes sense in terms of the feedback that we do or do not get from funders about that process. As we talked about earlier, the place where the assessment happens is in the individual review, the group review, and the official approval.
Let's talk about feedback. Let's go back to our experiences, and I’m just going to ask you, without conversation right now, what kind of feedback is possible and helpful in the situation where you have two reviewers seeing the exact same response very, very differently, and scoring it differently, and judging it differently? Just think about that for a moment, and keep in mind, this is just a little tiny microcosm. You're just seeing the comments on one section of one proposal. There are 14 sections, and 27 proposals in total. This is just the focal point for today. What kind of feedback would be possible, and what kind of feedback would be helpful? Just think about that for a moment.
Remember, they saw the exact same thing, the number of teachers. One saw it as a plus, one saw it as a negative. They saw the exact same thing, the description of the collaboration; one saw it as strong, and one saw it as weak. Right? What kind of feedback is helpful here? If you were the Seattle proposal, maybe you find out that you got 98 points, maybe you even find out you were the highest-scoring proposal, and yet you didn't get funded. What kind of feedback would be possible and helpful? And finally, we didn't talk specifically about this: this is our checkpoint number six, the approval stage, where there is yet one more level of final approval. A family member had the final say. They had a favorite project they wanted to fund, so one of the projects that our review team recommended got bumped in favor of a family pet project. What kind of feedback is possible and helpful in that situation?
Here's the thing about feedback, okay? Let's talk about it now. Let's dive right in here. Number one: not all grantors can or will share feedback or reviewers' comments. Not all can or will. There are a lot of reasons why that's the case, and it's not usually just because they don't want to. I guess a lot of times it could be because they don't want to, or they've made a decision not to, or structurally they're not in a position to do that, so just know that. Just know that when you go to look for feedback, if you don't get any response at all, or if they say, "Sorry, we can't do that," don't take it personally. It's just a rule of the road that not everyone can. It's an artifact, actually. I'm going to talk about that later on, maybe it'll come up in the Q&A, about why that is, and why it's an artifact of the fact that every single teeny, tiny, little grant-making organization now shows up in funder databases, thanks to the wonders of the internet.
Not all grantors can or will share feedback or reviewers' comments, so don't get upset. Don't get upset if you don't get a response. But the thing is, most people never ask, okay? You're here because you do ask, or you want to know how to ask, and you want to know how to ask better, but most people never do. I have a very small number of proposals that come in in my review process, and I offer, I say, "Let me know if you'd like to talk about what went on, and what you might learn from the process for the next time around." Most of my proposing teams don't ask, okay? Interesting, huh? And most people who do ask, ask the wrong question. Ask the wrong question.
What's the one question you want an answer to when you get a notification? Let's say you didn't get awarded. The one question you want to ask is, "Why wasn't our proposal selected?" The one question your executive director wants answered: why wasn't our proposal selected? The one question your board members want answered: why wasn't our proposal selected? Guys, there's only one honest answer that a funder can give 90 percent of the time to that question, and it's the one that we hate to see, because we think it's the generic response: "Given the mix of proposals we were considering and the amount of funding available, there were other proposals that the review committee saw as a better fit for what we wanted to accomplish with our funding this time around." Chat in if you've ever gotten a response like that when you've asked for feedback from a funder, or even in the notification letter. A lot of funders will send this out just when they're notifying you, in the decline letter, right? It often includes information like this: your proposal was wonderful, we got so many wonderful proposals, and given the mix and the amount of funding available, there were other proposals that were a better fit for what we want to accomplish this time around. You think that's a generic response, but it's the most honest answer. Why? Because what they're telling you is, "We had a certain number of pieces that we were trying to fit together, given a certain amount of funding, and as we looked at those pieces and moved them around, these were the pieces that created the greatest impact, and not all of them made it in," right? Now you know why. Now you know why, and this is what you can tell your executive director, this is what you can tell your board members. It's all about the puzzle, it's all about the puzzle, and it's all about the fact that half the success of any proposal in the finals round has to do with what? Who else shows up.
Okay, so again, if you have more questions about this, we can talk about it in the Q&A. But that response that drives you crazy, that everybody hates to see because they feel it's non-responsive, is usually the most honest response to what? To the wrong question. That's the point here. That's the answer to the question, "Why wasn't our proposal selected?" Instead, keep in mind that it's a lot less like the Olympics, right, that competitive process, and a lot more like a selective process. That's the wrong way. Let's talk about the right way. Here's the right way to ask for feedback. First of all, you start by illustrating that you understand the dynamics of the review and decision-making process: "I know a lot of different factors come into play in your team's review and decision-making process." That's great, you're really showing you've got some knowledge and expertise. "We're committed to the project, and we want to continue seeking financial support for it." Perfect, you're indicating that you're invested in the project that you submitted. It's not just throwing spaghetti at the wall, slamming something together, throwing it out there, and seeing if they're going to fund it. It's like, no, this is a real thing that's important to us, and we want to continue supporting it and seeking financial support for it. And here's the question: "Is there anything you can share from the review process that might help us strengthen this proposal, and do a better job with our proposals in the future?" Not, "How do we get funding from you next time around?" "Is there anything we could have done this time around that, if we do it next time, you'll give us the money?" That's not the question to ask. There are a lot of reasons for that. First of all, the review team will probably be different the next time around, so it'll be a different set of eyes. Maybe the same scoring criteria, or maybe the scoring criteria will change, but even if the scoring criteria are the same, different sets of eyes are always going to see things differently.
But this doesn't mean you can't use the feedback for a new submission. It's just that you're not asking, "What can we do next time to up the chances that we'll get money?" You see? Here's the difference: it's really understanding the dynamics of the whole process, and indicating that this project is something that's important to our organization. Is there anything that came up in the review process that you can share with us that might help us strengthen this proposal, and do a better job with our proposals in the future? Now, as a program officer and as a review team member, I would love to respond to this question, because I will always have something I can share, okay? And what you do with what I share is up to you, and that's part of what we're going to be talking about here in just a moment as well.
Alright, you got that down? Perfect. More about feedback, more about feedback. First of all, don't ever take feedback personally. It's so hard; as grant writers, we are so invested in the grant proposals we're working on, in the projects, and when we get any feedback at all, if it's glowing feedback, we take that personally, and if it's negative feedback, or constructive feedback, we can take that on and take it personally. It can feel so devastating if it doesn't go well, but it's not personal. It's what? It's one reviewer's response, it's one person's input, it's something that they saw and reacted to at that particular point in time. I also don't want you going off to chase the universal perfect proposal. Don't go chasing those points you didn't get. Most feedback is very specific to a given review process. Most feedback is very specific to a given set of reviewers at a particular moment in time; it's specific to the other proposals that they're considering at the exact same time.
I don't want you combing through the reviewers' comments and saying, "Look, this one dinged us 8 points out of 10 for this thing, so we've got to put our entire focus on what we can do to not lose those same eight points next time." That's not energy that's well spent, that's not energy that's well spent, because the next time is going to be very different than this time was, and, again, remember, different reviewers are always going to see the exact same thing very differently. In fact, the same reviewer looking at a proposal today may score it differently than they would have if they didn't see it until next Thursday. It's not an objective process, is it? It's very fluid, it's very dynamic, and it's very dependent on all the context and circumstances that are going on. And finally, and this is so important, when you do get feedback, take a good hard look at what makes sense in the context of your organization's mission, your organization's vision, your organization's values, in the context of everything you know about the community that you serve, and in the context of everything you know about what works and what doesn't work in the work that you're doing, and hold the rest very lightly. Take what makes sense, do a lot of due diligence and thinking it through: this is feedback, this is somebody's perspective. What part of this is potentially true? What part of this might be helpful for us? The rest of it, it's like, ah, that's one person's opinion, right? That was a lesson I learned the hard way in my role as a grant writer over many, many years, working primarily with federal dollars. We could always get the federal reviewers' comments back after a certain point in time, and it was very hard not to take them personally, especially if there were five reviewers that gave top marks and one reviewer that just didn't like it at all. Does that ever happen? You take that personally, and I would focus on that one reviewer who didn't like it at all, and want to know why we lost the points we lost. I didn't understand that every reviewer is just going to have their own perspective on it, and it became almost obsessive to figure out how we would need to do things differently to get those points from that one reviewer who didn't like it at all, even if the proposal was ultimately awarded.
There's a lot there, and here's a grant-seeking power move that may come as a surprise to some of you. When do you ask for feedback? Well, ask for feedback when you don't get the grant, and ask for feedback when you do get the grant. Why do I call this a power move? How many of you do that? How many of you ask for feedback whether you get the award or not? Anybody? Why do I call this a power move? Well, doesn't it just make sense that there's always something to learn? And especially if you do get the grant, don't you want to know what the reviewers really saw as the strengths, so you can learn from how your proposal is being viewed on its strengths, as well as focusing on the things that reviewers thought could be done better? Super duper power move.
Alright, so the bottom line. Can we control what happens inside the black box as grantseekers? No. Once we push submit, we've done everything we can to put that package together, to send our darling into the audition, and then it's up to how our darling does in the mix of all the other proposal darlings that are auditioning at the exact same time. We can't control who's reviewing the proposals, we can't control who else is showing up, we can't control the personal preferences of the reviewers, we can't control the fact that it is an ensemble effect, we can't control how much or little mental energy the reviewers have to process the information, and the stack of proposals they have to whack through before they can call it quits for the day, we can't control the quality of the other proposals, alright? Maybe yours is an A+ proposal in the mix of 10 other A+ proposals. Maybe yours is an A+ proposal in the mix of a lot of other B proposals. That's going to make a difference, it really is, etcetera, etcetera, etcetera. What we can control is everything it takes to get through those qualifying rounds, right, and into the finals.
Those are the mission focus, the partnerships, the project plan, the grants research, the relationships with the funders, the quality of the proposal itself, in fact, absolutely every step along your grant success path, all the way up. That's what we can control, and then we can control how we focus on taking the feedback that makes sense and feeding it into our own internal process for getting better, looping all the way back and bringing that improvement all the way through, strategically, in everything we're doing in our organization, in our community, and in our team to get better.
Alright. That is it. We've got some things lined up for you here, I just want you to chat in, think for just a moment. I shared a lot here. I think there were some big “aha’s,” and “hmmms,” and maybe some “huh’s” that you're not quite sure about even yet.
Chat in, what's one big takeaway for you from today's session, one thing that made you go, "I did not know that," or, "Hmm, that's something to think about," or "Huh? I'm not so sure about that, Maryn," and I'd love to see what's kind of sparking for you here. And now it's time for you to chat in your questions, okay? Those questions you've been saving up, if they haven't been answered for you in what I covered here, go ahead and chat them in with a hashtag, hashtag, hashtag, and in a little bit, we'll open the lines up, and Will's going to go into the chat box and pull questions out. He's going to take a look at the questions and group them together, so we can do a great job of answering as many as we can. And while you're doing your questions, I'm going to turn things back over to Will, because Will wants to talk to you about Instrumentl, and a little bit about how Instrumentl can actually support you in keeping track of the feedback that you do get from your funders. Will, take it away.
Will: Awesome, Maryn. Thank you so much for sharing those insights. Let's go ahead and take a look at this next slide, where I want to share first, for anyone that hasn't heard about us, what Instrumentl does, and then I'll share with you three ways that you can start taking some of the learnings from today's workshop into your grant process as well.
At a high level, Instrumentl brings grant prospecting, tracking, and management into one place, and there are really two big areas where we add value to your grant process. The first is that we have a unique matching algorithm that will only show you active grant opportunities you can apply for, matched to the programs and campaigns you are fundraising for. The second is that we help you with the tracking and management.
Once you start saving grants from Instrumentl, or opportunities you're already working on, into our grant tracker, we'll send you a weekly roll-up of all of your tasks and deadlines in the same place. When it comes to the grant feedback process, there are really three things you can do within your grant tracker.
If we go ahead and go to the next slide, we can talk about these three ways you might use Instrumentl to track your funder feedback. The first is using notes as a workspace for you and your team, the second is creating tasks before and after submission for requesting funder feedback, and the third is uploading a summary document of funder feedback correspondence to your document library. And if you don't have an Instrumentl account, no worries, you can create a free account using Maryn's link in the bottom right corner of these slides. Going to the next slide, let's break down how you might do this with Instrumentl.
The first thing is, if you are on Instrumentl, you will have a grant tracker in which you can open up a particular grant opportunity that you have saved, and then use the notes section as a workspace for you and your team.
You can see here that what I've done in this slide is, I've essentially led with a note that says to ask Maryn's question: "Is there anything you can share from the review process that might help us strengthen this proposal?"
This is a good way to start building systems for yourself by incorporating some of the takeaways from today's workshop into every single grant that you're going to be applying for, both the ones that you know you're going to win and the ones that might be new to your process. Secondly, if you go to the next slide, you'll see that you can also create tasks before and after submission for requesting funder feedback. I saw some questions in the chat where people were asking, "Well, should I ask before or after I submit?" You can also think about how you could build that into the process, both before and after.
Here in the tasks section of your grant tracker, you can create a milestone task, something like "prep ask email with funder." Set that before the actual deadline, and then after the submission, you can ask for post-application feedback as well. And then the final way you might use Instrumentl is to upload a summary document of all of the funder feedback correspondence to your document library.
This is one of the best ways for you to store institutional knowledge. If you've ever run into the case where there are position changes within your organization, or new people joining the team, and nobody knows where last year's proposals are because a different person worked on them, this is the place where you can upload all the feedback you do get, the correspondence from these grant makers, into your document library for that particular proposal on Instrumentl.
If you are looking for ways to help you with this grant discovery, tracking, and management process, you can use Maryn's link in the description, and we'll also leave a link in the chat as well. But now I'm going to quickly pass it back to Maryn for learning takeaways before we hear about our raffle.
Maryn: Yeah. Just to wrap up, the three big learning takeaways that we set out for you at the very beginning, and just spent the entire time unpacking, are these. First, the three biggest myths about what happens inside the grant maker's black box. You've got that now; you're one of the 2% of all the grantseekers in the world who have a better understanding of what actually happens inside the black box, and why those myths just aren't true. Second, the three big mistakes that people make when asking for funder feedback, and what you can do instead, so that you don't have to make those mistakes. And then finally, the big golden nugget: the right way to ask for funder feedback to get results that can really make a difference, and what to do with that information. Don't take it personally, take it into consideration, take what matters, take what makes sense, and let the rest go.
That brings us to the end of funder feedback, per se. Again, by the way, I just want to mention, I would love to have you pop on over to go.grantsmagic.org and visit us at Grants Magic U, and we're going to pop back over to the next steps for you. In addition to joining Grants Magic U… And I think this is your slide again, Will.
Will: Awesome. The first thing is, if you don't have an Instrumentl account, you can sign up using Maryn's link. It's instrumentl.com/grantsmagic. And if you already have an account and you're on the fence about upgrading, you can use the code GRANTSMAGIC50 to save $50 off your first month. That applies for anybody who signs up today with that link as well. And then we encourage everybody to submit your feedback form for today's workshop. That's essentially how we come up with this programming, to make sure the topics we're sharing with you are relevant and will really resonate with you.
In the next slide, we have one final thing before we open it up for Q&A. First off, if you have never attended one of these, we typically reward people who stick with us until the end of these live workshops, and we do this by working with our partners on fun giveaways.
The raffle for today: we are offering two spots in Maryn's upcoming course. It's a 10-week, self-paced online program called the Ultimate Grant Proposal Blueprint.
That is a $300 value for each of the two people we will be raffling those spots away to, and we will also be raffling away a one-month subscription to Instrumentl, which is a $162 value. In order to enter the raffle, all you need to do is one of these three steps, or all three if you want more chances: create an Instrumentl trial account with Maryn's link, complete the feedback form that will be sent out 30 minutes after the end of the webinar, or tweet on Twitter or post on LinkedIn, tag us, and share what you learned today. We will announce the winners on Friday on Instrumentl's social media. Then, if you enjoyed this grant workshop, you'll love our next one; it's on how to go from 100+ grants to the best fits for you, the Grant Writing Unicorn Method. It is on May 26, at 1 P.M. Eastern Time.
You can register for that on our events calendar, which we'll include in the follow-up email. We'll also be posting this workshop on YouTube later today, so keep your eyes peeled for that, but first, let's go ahead and dig into some Q&A. Maryn, there's one question I wanted to ask you: you said that not all grant makers can or will give feedback or share reviewers' comments, and that there are a lot of different reasons that's true. I wonder if you could say more about what some of those reasons might be.
Maryn: You know, it breaks down into a couple of basic reasons. Number one is structure. Surprisingly, many "grant-making organizations" aren't really grant-making organizations; they are organizations set up to help people who have money get their money out into the community and support the causes they care about. That's not the same as grant-making, and yet they will show up a lot of times in our research, and they will look and act as if they're grant-making organizations, when, in fact, there's already a process behind the scenes for making those decisions.
Sometimes it's a charitable trust that's managed by a bank officer, and so all the communications go through the bank officer. The bank officer has no clue; the bank officer is not going to give feedback, and there isn't really the same kind of robust decision-making and review process that we typically think of in grant seeking. That's one reason, and it feeds into the second one, which is that there are many, many very small funding organizations that simply have zero staff, or zero bandwidth, for writing or sharing feedback. Many small ones do try very hard, so it's not simply a matter of size, but it's a matter of saying, you know, our job is to figure out where the money is going to go, and then the money will go there and do the good work that money is going to do; we just do not have the bandwidth for going beyond that. And also, frankly, many have just gotten burned, because people ask for feedback the wrong way, and it ends up being a non-productive conversation.
When it's clear that the feedback is actually going to be valuable in some way, some funders might be more forthcoming. I know I, as a funder, am more likely to be generous with my feedback if it's clear that it's actually being taken in a way that's going to produce good results, but structurally, sometimes it's just, "We've got no bandwidth for that, sorry." That's what I meant when I said that back in the day, when we did not have the internet available for grants research… thank you very much, Instrumentl, and all the other wonderful online grants databases… these small funders would never show up, and we would probably not be sending proposals to them. They don't typically want a lot of engagement in conversation at that level, and that's just one manifestation.
So, yeah.
Will: Awesome. Another question: going back to the examples you shared of reviewers' comments, the two that saw the same response so differently, if those comments were shared with me about my proposal, how would you advise me to interpret them?
Maryn: Is there anything useful for me there, or would I just be chasing the points, as you say? Oh, do you mean the slide I showed with the two reviewers' comments, the one that said we really like the number of teachers, and the one who said we think there are too many teachers? Is that the slide we're talking about?
Will: Yeah, exactly.
Maryn: Okay, great.
I don't know if that proposal was funded or not, but let's say it wasn't. If you came to me and said, "Why was our proposal not funded?" I wouldn't be able to answer you, but if you came and said, "Was there anything that came up in the review process that would help?" I would say, you know, one thing, and I might have several, but one would be going back to that slide. Most of the reviewers seemed really comfortable with the number of teachers and thought the collaboration was pretty strong. One, though, raised a concern that there are so many teachers: how do we know the quality of the professional development is going to be maintained?
If you were looking at making a bit of a change in your proposal, you might want to take a sentence or two to specifically address that. It raised a yellow flag in one reviewer's mind, and all it would take would be a sentence or two to say, "Even with this large number of teachers, we can maintain a high quality of professional development because…" Bing, bing, and bing.
It may be something that we, as the proposal writer, took for granted that raised the yellow flag in the mind of even one reviewer, and that may or may not have been a make-or-break factor in the decision-making process. But we have blinders on based on our own beliefs and our own fields of understanding, and sometimes the reviewers' comments can help us see, "Oh, I can understand that's something somebody could be concerned about; all we have to do is address it with a sentence or two," or maybe it is a structural issue, like, "Wow, we just really don't have a process for quality control. We're definitely going to need to build that in."
It can either be something, “Oh, we didn't articulate that clearly,” or, “Wow, that really is something that needs to be a part of our program for a project that we just didn't even think about, got it.”
Yeah. We've got a lot of questions from the audience. The first one, from Nina, is, "Do many or most grantmakers use a point system? Is there any data that you might be able to share on this?" What did I tell you was going to be my one honest, two-word answer to any questions about funders? It depends. There is no data that I can share about it, and I don't know that there's any value in collecting it. I mean, I'm not sure what difference it would make, because in the end it all does become an evaluative process. When point systems are in place, the points are intended to help guide the decision-making, and some funders, frankly, will fund from the high-scoring proposals down. There's no doubt about that. That is probably not the most sophisticated way of doing the work.
If I could think of a way to answer the question that would actually help you in the future, I would do that, and I love the question, but I don't totally understand where it's coming from.
Yeah, I'm sorry, I don't have a context for an answer that would actually feel useful. I don't think there is any data across the whole range, from the tiny funders to the great big ones. The bigger the funder, the more likely there is some sort of scoring matrix, but in the end it is still always an evaluative process, whether there are points that guide the decision-making or not. I do not want the points to seduce you into thinking that one process is more "objective" than another.
Will: Rachel asked, “Is seven a typical amount of individual reviewers?” In your example today, you have seven reviewers on that, so is that something that is pretty standard?
Maryn: Nope, it can be one, it can be a hundred, it can be anywhere in between.
Will: Got it. Cheryl asked, “How different is this process between a large foundation versus a family foundation?”
Maryn: A great question, thank you, these are all great questions. Totally.
Some of them I can answer quickly, and others take a little bit more…
What I wanted to share with you here was a template that included all of the big moving parts of a typical review process, and every single grant-making organization is going to find its own place inside that template. I don't think there's much going on that I didn't cover on the template, and there's no distinction, in that template, between a large foundation and a family foundation. A family foundation can follow that exact same template if that's how they choose to set up their funding process. A large multi-national foundation can follow the exact same template, or not, depending on how they want to set up their own grant-making and funding process. I know many family foundations that follow the exact same level of… I don't want to say seriousness. My own process is one where we have a lot of stakeholder accountability, so it's really important that we build a lot of clarity, accountability, and transparency into the process. Not all funders have that level of requirement, and so they're going to structure their decision-making very differently. In a family foundation, one quirk is that sometimes the family members themselves are the board members.
Sometimes they'll have a community review team, but a family member, or a board of family members, has the final approval, and that's where things can get really interesting. I'm going to merge a question from Renee and Nina here, which is, "How can we get funders to be more transparent?"
One of them is a supporter of the "#fixtheform" movement, and asks whether there's a similar initiative to get funders to be more forthcoming in their evaluation and feedback. Oh, shoot, I just drew a mental blank…
Somebody on this call will know the organization that [Vooley] set up in conjunction with a lot of other community leaders. It's like a TripAdvisor… oh, GrantAdvisor, that's the name of it. GrantAdvisor. It's like TripAdvisor for funders, where we, as the grantseekers, go in and give our reviews of the funders, and I believe it works that once a particular funder has garnered five reviews, that funder is posted, all the reviews are posted, and we can all go in and see what the reviews are. There are funders now that are really embracing that level of transparency. They want feedback from us about how they're doing, and they look at that feedback very, very carefully. In fact, I know of several, and there are many, many more that I'm not aware of, who say, "Please post your reviews on GrantAdvisor, we want to know what you think, we want to know what we can do better."
Embracing that particular model, I think, is a fantastic way of nudging funders, those that are willing, in the direction of greater transparency and accountability. There are lots of reasons that grant-making organizations get set up.
Some exist to do community good, and some exist to serve as a tax shelter, okay? And everything in between.
The tax shelter organizations are never going to be nudged in the direction of greater transparency and accountability, because that's not what they're there for. That's not what they're set up for, that's not how they're operating. But this whole range of those that really are engaged in community conversation, and community improvement, definitely, that would be the way to go. That would be one way to go.
Will: Got it. Jax asked, “How can you tell which piece of feedback is most important to the funder?”
Maryn: I don't know why I would need to know that. Again, I'm trying to think, why would we want to know which piece was most important to the funder? I don't know whether a funder could even tell you that if you're not having a conversation, and what does that question presume? That you're going to want to take their advice and put it into the next proposal that you submit to them. I think that's what I was kind of nudging you not to be thinking about, but it's never a problem, it's totally never a problem, to ask that question directly.
Let me just back up on that. It's like anything that you want to know from the funder: ask them directly. I don't know, I'm just one person, and I can just speculate based on my years and years of experience, but if there's something you really want to know, ask them, and they'll either tell you or they won't, and either way, you really can't lose. So if you said, "If there was one piece of advice coming out of the review process that you thought was most pivotal in the decision-making, what would that be?" that would be one way of framing the question, but that's not the same thing as saying, "What's the one thing we could do next time around to make it more likely that we would get the grant?" which you can also ask. They're likely to say, "Well, don't do that, because it'll be a different process next year," or they'll say, "
Yeah, really clearly and specifically, if you really want to be competitive with us, you're going to need to really shore up your collaboration, because that is going to be a direction we're really going to be hitting hard in future funding rounds.” Okay? It depends, and then Maryn has a lot of words that she can add to “it depends” to bring some light to the subject.
Will: Sarah asks, "Please elaborate on why a grant maker is not allowed to, or able to, provide feedback."
Maryn: Please elaborate… okay. I don't think there's a "not allowed to." There are policies and practices. If it's a government agency, they may be subject to freedom of information laws, which means they do need to make the information public if you ask for it. If it's a private organization, they can choose to, yes or no. So it's not a "not allowed to" as much as what they choose to do, and I think I answered that with the first question that Will asked. If there's something more that I missed on that, let me know, but essentially, it's a choice that the grant-making agency makes: "Do we have the staff for this? Do we have the time for this? Is this even a priority for us, to engage in a community conversation?" Because frankly, again, not all grantmakers are the same, and for some of them it's like, we're not here for the community conversation, we're here to get the money out the door, and that's a completely different dynamic.
You can bang on their door all you want, and they're just going to go, we're not answering. That's not what we do. Go find the ones that do want to be part of the community conversation, and that's where the good, robust feedback happens, and that's where the great exchange of information back and forth can take place, as well.
Will: Maryn, I want to be respectful of everyone's time. There's a ton of questions; do you have a few minutes after this to go through them? If so, we can continue a little bit past the time block we have, which wraps up in the next minute or so.
Maryn: Yeah, absolutely, I'm totally open. Go for it, I love it. Will, I love the way you're pulling questions together and combining them; it's moving well.
If you can stay, great; if you can't stay, don't worry, this will all be part of the recording, and I'll stick around as long as Will keeps the Zoom room open.
Will: Okay, alright, so the next question that I had is from Alexandra, which is, “What is the best way to get a sense of the puzzle or the ensemble that they're hoping to create?”
Maryn: You can't; that's my whole point. Only the review team can see that, when they see what the puzzle pieces actually are, the proposals that are there in front of them. The ones that have made it into the finals have been reviewed and scored a certain minimum number of points, and now we've got these 15 proposals in front of us. We're going to be awarding five. How do we actually move these things around?
We can't see anything about the other proposals. You can get a hint, though. I know that federal funders do this, and some private foundations may as well, and those of you out there researching private foundations may know better than I do: you will see things in their guidelines that say, and let me think what the language used to be in the federal ones, "In addition to the scored criteria included in this request for proposals, other factors will be taken into consideration in the final decision-making, including, but not limited to, geographic diversity, program diversity, target population diversity." They'll list whatever it is that they're going to be looking at in the finals round, and those are the pieces. That's where they're talking about how they're going to be putting those puzzle pieces together. Let me give you an example of this.
The example that I shared with you, with the community literacy program, was: we want diversity, so we want to make sure that we've got diversity in our target population and diversity in our geographic area. In my real-life funding round, which wasn't that one, we were dealing with academic subject areas, and sometimes, depending on the proposals that we get, we'll say, "Well, we've got a great opportunity here to support English, support math, and support science. We can actually fund a portfolio that represents three strong proposals and programs in three different academic areas." That's diversity. Another time, we may get a bunch of proposals where there's a whole set of them that are really strong in one academic area, and we'll say, "Wow, we've got a unique opportunity, given the proposals that are showing up this time, to really double down on our commitment to math."
It's not something that we announce, “We're going to double down on our commitment to math,” it's something that emerges as a possibility for us, given the proposals that show up. Sometimes it's going to be diversity, and sometimes it's going to be really investing heavily in a single academic area, but that's completely dependent on what proposals show up in the funding round.
I hope that makes sense.
Will: Got it. This question comes from Yvonne, and I think it also merges with [Vern’s] question, which is, “Does the timing of proposal submission affect the review process?” In other words, are proposals submitted prior to the due date reviewed as they are received, or does the black box review process begin after the deadline is reached?
Maryn: Now that's a big "it depends." I promised you I was going to be saying "it depends." Some funders will start reviewing the proposals as soon as they come in, and others will wait until they're all in, processed, and lined up, and then send them out as a package. I think the second is probably more typical; certainly the larger organizations that have a more formalized and structured review process tend to treat deadline day as when the process really starts. But it certainly is a complete option for the funder to decide, "Let's just start reading them as they come in, and we'll see what we've got."
Some funders have rolling deadlines; they simply say, "We don't actually have deadlines, but on the 15th of every month, we'll read the proposals that have come in since the last time. It's not a deadline, just send them in any old time, and we'll read them when the 15th comes around." It's very dependent.
Will: Got it. A lot of people asked this next question, so forgive me if I don't remember to include everyone's name; it's from Susie and a few others: "What wording would you use for asking for feedback on grants that you were funded for?" Would you use the same wording you used before, or would you use something different?
Maryn: I don't think there's any reason to change the wording, other than saying, "We're thrilled to have the opportunity to work with you, and we are always committed to learning from the process." To me that's a hugely important phrase: "We're always committed to learning from the process, and we'd love to know what came up in the review process, if anything, that you could share with us that would help us." You might say, "Help us understand what the reviewers saw as the strengths and areas for improvement of our proposal." That might be a good way to put it: we'd love to see what the reviewers saw as the strengths, and we'd love to see what areas were seen as ways that we might improve.
Will: Yeah. Richard asks, "Are most reviewers looking for reasons to say no, or looking for reasons to say yes?"
Maryn: Oh my gosh, Richard. It feels like you've got some stories behind that one. I've worked with reviewers on both sides of that. It's an interesting question, because it's basically asking, are we going to work from the baseline up, or from the baseline down? Are we looking for reasons to give you points, or looking for reasons to take points away? That's such an interesting psychological perspective, and I certainly can't speak for most, but you really hit on one of the interesting psychological aspects of teachers grading essay tests, or people scoring resumes for job applicants, or anytime it's an evaluative process: there are people who will start by saying, "I'm looking for points to take away," and there are people who are saying, "I'm looking for opportunities to add points in." That's fascinating.
Will: Awesome. Yvonne also asked, “When is it appropriate to ask a funder for a recommendation to other potential funders?”
Maryn: Oh my gosh, I love that question. I think that's appropriate just about any time you've got a good relationship with a funder. If you've burnt the bridge, if something has happened and the relationship is not a good one, that's probably not a good time, but if the relationship is solid, whether they funded you or not, again, all you can do is ask. Ask respectfully; the likelihood is they won't, but the chances are they could say, "Let me connect you with…," or "Let me send an email to…," or "If you haven't yet, you might try contacting…" Ding, ding, ding. That's a lovely question, because what it really speaks to is my position that we as nonprofits are equal partners in Grants World with the funding community. That power dynamic that so often seems to go along with money, I wish we would all just get rid of that and say no to it, because they need us to create the change that they want to be creating with their philanthropic dollars, and we need them to support our work with their philanthropic dollars. We're equal partners, and so treat each other as equal partners, where you can go to each other and ask, "Hey, it would really help us out if we had a couple of ideas of other funders that we could approach. Do you have any thoughts you could share with us?" I love that. That's a really natural way of being good partners with each other.
Great question. Thank you, these are wonderful questions, Will. There's a lot of great thought and experience behind them. I'd love to sit around, drink some wine, and swap the stories that are behind these questions.
Will: Awesome. I think we have two last ones. The first one is from Adam. “We are a non-profit advocating for eco-restoration to address global warming. Most funders want to see emphasis on greenhouse gas reductions. If our grant were submitted, is it possible that it has strength by expanding the funder's perspective? In other words, goals are aligned, but methods are on the fringe. Is it worth submitting, given that time is a limited resource?”
Maryn: What I would want you to do is have a conversation with the program officer at those funding organizations before submitting a proposal. This is one of those great situations where it's not a clear fit, and I understand what you're saying: can we move you just enough in this direction to be willing to consider this? You're not going to do that with a proposal. That's the kind of thing that can, and I emphasize can, happen with a conversation. The conversation would be with me, the program officer, for instance, and what I might say is, it's worth a try, go ahead and submit the proposal. I can't promise anything, though, because the review team makes their own decisions, and I'm not part of that, but it could happen. Or I might say, you know what, they're going to go straight down the middle this time. I know that money is really tight, and they're not going to be funding on the margins this time around. You can submit the proposal, I just want to let you know the chances are extremely slim. You can get that kind of feedback from me, and then it's your choice how you want to spend your time. There are always exceptions, and I know as grantseekers, hope always springs eternal, right? I want to be one of the exceptions, and I want you to be one of those exceptions as well, but we don't have unlimited time, and if there are other funders that are a clearer fit for you, that's where I would be investing my time, while potentially seeing if I can set up conversations with some of these others: "Let's see if we can nudge them to think differently in that direction."
Will: Ed and a few others asked this question, and it's related to government proposals: "How would you go about getting feedback on government proposals?" In his case, it's the Federal Trade Commission. They applied for a telecommunications grant, but did not get funded, and did not get a response when they asked for feedback.
Maryn: I have no idea. Try to get some feedback again. I'm not sure why you wouldn't get feedback, but you can push them, and you can say, "We need to know: were we eliminated for cause? Were we eliminated on a technical basis? Did we go into the full review? If we went into the full review, can we see the reviewers' comments?" I don't know how the FTC specifically operates, but if it's a quasi-governmental agency, they may not be subject to the same Freedom of Information provisions that straight federal government agencies are, so it may be a closed door, it may not be. There may not be anything there that you can learn, but you can always go knocking again and see if you can get a little bit more clarity around that.
Will: Awesome, and I was going to call it there, but a great question just came in, and I think it's interesting to ask, which is, “How do you diplomatically ask a funder why your grant award was so much less than what you asked for without appearing ungrateful?” and this is from Ann.
Maryn: Oh, my lord. That's… see, that's what I mean about those money power dynamics. If you go to a restaurant, and you order a meal, and they send you half of what you ordered, nobody has any problem saying, "Hey, this isn't what I ordered, how come I didn't get the full thing?" But when it's with a funder, and you don't get the full amount, it's like, oh my gosh, we can't possibly ask. It's really transactional; it's a simple piece of information, and it's part of the negotiation, right? What it sounds like is, you got a grant award and you didn't have a negotiation process around it, but it's a simple ask. It says, we're grateful for the money, and we need to do a little retooling if there are strings attached to how to spend it. That's when you need to have a conversation. If they say, "We want you to do the same thing you said you would do in your proposal, but we're giving you 30% less money to do it," then you need to have a conversation to say, "We're going to need to talk about this, because this is going to be a real challenge; we're going to have to work this through." But if they simply give you the money, and there are no strings attached, then you say, "Thanks very much. For clarity's sake, are you specific on how you expect us to use this money?" And if they say, "No, just use it," then you're good to go.
Will: Awesome. Well, that wraps up our Q&A, everybody. Thank you so much for attending today. As a reminder, be sure to enter the raffle by creating your Instrumentl account, by submitting the webinar feedback form, or by sharing what you learned on LinkedIn or Twitter; we will be announcing the winners on Friday. Our next session, in a few weeks, will be on how to go from 100+ grants to the best fits for you. Be sure to register for that as well, and check out Maryn's GrantsMagic U for more content similar to this. Other than that, thank you so much for attending, and I'll see you all next time. Bye, everybody.
Maryn: See y'all later. Bye.