This post continues our conversation about maximizing the return on investment from your research efforts. In the last post on this subject, we shared principles and best practices for getting the most bang for your buck from these investments, especially when outside firms have been engaged to conduct research studies.
In this edition, let me introduce you to a principle we use here at Silicon Valley Research: the highest-and-best-use (HBU) principle. It derives from the commercial real estate appraisal industry, where a property or parcel is appraised not for what currently stands on it, but for the most valuable structure that could be built on it.
Marketing expert Jay Abraham extended this principle to marketing activities and investments, encouraging his clients to apply it when evaluating the best options for spending marketing dollars, especially expensive media expenditures.
We apply this same HBU principle to marketing research expenditures. Our observation is that the research studies we conduct for our clients land in one of two buckets: the HBU (highest-and-best-use) bucket or, unfortunately, the LLU (lowest-and-least-use) bucket. Sometimes the determinant of which bucket the research lands in is purely cultural; sometimes it is a function of how busy a stakeholder team is with other priorities. In one recent example, a client with a long HBU track record with us let an important brand equity research study lapse into LLU because a major recent acquisition occupied their already short-staffed stakeholder team. Fortunately, the research investment, which was global and therefore fairly substantial, was restored to the HBU bucket, albeit with some prompting from my analyst team, who, I am proud to say, were committed not only to completing the study but to ensuring the client derived maximum value from their investment.
Your research provider should be your ally in ensuring HBU success for your research studies. They should offer services beyond simply reporting charts and trends from the data. Here are some of the things you should expect and are entitled to:
- In-depth questioning of you and your team, at project kick-off and throughout the project cycle, about your business objectives, so they can draw meaningful insights from the data they obtain. Meaningful insights are not just interesting connections within the data itself; they connect those findings to what your organization is trying to achieve. Too many industry players see their role as merely obtaining the data. Meaningful vendor vetting should ascertain how willing and able an outside vendor is to grasp your business objectives. Providers most often falter here when the client's business involves complex technology and related subjects in B2B settings, as opposed to simpler consumable B2C and D2C items.
- A well-developed "peripheral vision" mindset. Those who gravitate toward market research as a profession often come from academia (including an unusually high proportion of market research company founders!). This academic bent can lead to too much focus on methodology at the expense of a business focus. Both are critical to project success, and it can be argued that methodological rigor should be table stakes for an experienced firm. Ideally, a provider strikes a good balance between methodology and business focus, which in turn enables them to explore opportunities in white-space areas the client had not previously considered. Sometimes these peripheral insights turn out to be the most valuable and monetizable insights of the entire research agenda.
- Both a willingness and a commitment to disseminate data and insights throughout your organization. As I stated in my previous post on this subject, our client project philosophy is "it ain't over when it's over." While it wouldn't be fair to let projects wander into scope-creep territory, it is important that research providers build in the time and effort to help you socialize the study findings within your organization in a way that maximizes impact and actionability. To this end, you should expect the following at a minimum:
- Encore presentations of the findings and their implications, ideally in two sessions: a follow-up working session with the implementation team, plus a shorter, higher-level, insights-only version for your executive leadership team.
- Some form of post-project query support, typically 90 days or so, enabling your implementation team to ask the research provider's staff for help with data interpretation, including how valid or significant a particular data point is. This becomes particularly important for quantitative studies, especially those involving complex modeling. Most custom research projects last around 6 to 10 weeks, during which the provider's research team lived and breathed your project. That knowledge should not remain untapped.
May your next research project turn out to enjoy HBU status and success!
Don't miss a single issue: subscribe above to receive insights and best practices from our research on the world's most innovative brands, delivered directly to your inbox each week.
Alan Nazarelli is Founder & CEO of Silicon Valley Research Group. Based in San Jose, CA, with offices in Seattle and New York, the company works with the world's most innovative brands, providing timely, actionable market intelligence and strategic guidance that enables them to make well-informed decisions, positively impact revenues and profits, and achieve their growth targets. Connect with Alan on LinkedIn.