Beyond the people involved, the other half of the systems equation is tools and technology. The emergence and evolution of the research toolkit is one of the exciting trends in business today. Capabilities that used to take months to stand up, or weren't possible at all, are now available on demand.

Tooling

Research teams need to construct a toolkit that meets the evolving needs of the organization. While the specifics vary from organization to organization, it generally includes tools for both foresight and hindsight, and for both quantitative and qualitative data.

Procurement process

While research studies are frequently standalone projects, it’s important for tools to be persistent. Otherwise each study suffers a “cold start” – the tedious procurement process of acquiring the right tools and capabilities to execute. In today’s fast-paced landscape, that’s an unacceptable delay for most initiatives.

  • What does a dire situation look like? Research and procurement teams avoid subscription software. Every study requires an extensive business case and an RFP process with vendors.
  • What are key indicators that there’s room for improvement? The research team has fairly free rein to purchase paid trials and PoCs before making the business case for an ongoing subscription.
  • What does excellence look like? Research and procurement teams take a step back to understand patterns across the most common studies and projects and ensure that subscription capabilities are adopted and available on-demand.

Toolkit and use cases

Any learning objective is understandably constrained by the available tools – that’s a reasonable limitation that every business initiative faces. However, if the most pressing and common objectives constantly face the same constraints, it means that the research toolkit doesn’t adequately map to the organization’s needs.

  • What does a dire situation look like? The majority of learning objectives can’t be adequately informed by existing capabilities. Study results generally underwhelm key stakeholders.
  • What are key indicators that there’s room for improvement? The toolkit over-indexes on one or two capabilities, but lacks the diversity needed to satisfy most learning objectives.
  • What does excellence look like? The research toolkit is focused but robust. 70%-90% of resources are allocated to standard research capabilities like panels, surveys, and interviews, while the rest is allocated toward more cutting-edge capabilities like insight repositories and ML-powered analysis.

Competency

Beyond selecting the right tools, the key determinant of how much value those tools generate is the competency to leverage them. Training, best practices, and expertise are critical for a high-performing research operation.

Adoption

Dormant tools are useless, so it’s critical to develop a process for adoption. Typically that includes a trial period, initial target projects, identified key users, and an evaluation of the investment.

  • What does a dire situation look like? Investment in tools doesn’t include required investment in training and onboarding. Usage is deferred to external vendors and consultants.
  • What are key indicators that there’s room for improvement? There are one or two power users within the research team who can navigate and use the tools, but institutional knowledge is at constant risk of being lost.
  • What does excellence look like? Multiple team members engage in training and onboarding. The research team pairs vendor documentation with internal documentation, and creates visibility for the availability of tools and use cases. Adoption is revisited periodically to optimize documentation and training.

Success criteria

It’s critical for every significant investment to have success criteria, and research tools are no different. At a time when many organizations are trying to be leaner, wasted spend can erode both budgets and the research team’s credibility with stakeholders.

  • What does a dire situation look like? There are no documented success criteria before purchasing a new tool, and there is no correlation between year-over-year investments and how successfully those tools are actually used.
  • What are key indicators that there’s room for improvement? While there aren’t specific success criteria, there’s a general sense of accountability to invest in tools that deliver results while divesting from dormant tools.
  • What does excellence look like? Each investment in a tool is considered a testable hypothesis, documented with success criteria. Success criteria are then used to calibrate investments and subscriptions.

How can you improve research in your organization?
Take the Self-Assessment Quiz: Improving Your Research Operations

Our self-guided quiz is designed to help you identify areas to improve the research operations of your organization. At the end, you’ll receive a score and guided recommendations to improve the maturity of your research operations.

TAKE THE QUIZ NOW