The arrival of LLMs such as ChatGPT and Bard, along with the ever-growing universe of AI tools emerging daily, has stirred conversations about their impact on industry sectors, business processes, people and more. This article discusses how widespread use of these technologies will affect the market research and report space.
AI dominates the business and social media news landscape. Every day we see stories of new SaaS apps becoming available, advances in ChatGPT prompts, Google, Meta, Amazon, Microsoft and others engaging in an arms race, and Generative AI being integrated into anything and everything. Hard to keep up, isn't it?
The upsides are real: productivity gains, more output, more automation, and new use cases that are pushing the limits and driving innovation. Tools can search, summarize, rewrite, "analyze", spot trends and even create reports for users. If the user understands the limitations of the tools and how they search, source and access data, the risks are minimal.
One of the major keys to ensuring the best outcomes is data observability (https://www.decube.io/post/data-observability-llm-accuracy), which is critical in the analysis realm because the models only work when one can verify data sources, collection methods and accuracy. Layering AI on top of a well-defined data reservoir makes a lot of sense. Using AI chatbots to support research and analysis is another story, though.
Market research factories that make their money assembling reports with data scrapers, or by copy-paste work, are now turning to AI tools to support their "Just In Time" production model of posting reports on websites (their own or via aggregators) and backfilling the orders. For them, AI tools are a means of spitting out end products that are low quality, lacking in context and, worse, inaccurate, given the problem of LLM hallucinations, where ChatGPT, Bard and the like produce false results.
The Just In Time model is highly problematic: the factories compile 200-plus-page documents in under three days, packaging together previously created forecasts based on data that could very well be AI-impacted, along with data and content collected, synthesized and summarized using these same tools. And with the goal of having a report available for just about any subject or segment, it is impossible for report factories to avoid these practices.
It also isn't possible for a firm to produce 800 reports a year, despite what it lists on its website. The economics simply don't add up. Production at that level normally requires 200 people in research, forecasting and analysis, especially if the firm sells 200-page reports. Add in sales, marketing, production, management and overhead, and even with the cost benefits of offshore labor the company must generate nearly $8 million a year in revenue just to break even.
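The back-of-envelope math above can be sketched out. The 200-person research headcount and the roughly $8 million break-even figure come from the article; the blended annual cost per head is an assumed illustrative number (chosen so the figures reconcile), not a sourced one.

```python
# Back-of-envelope break-even sketch for a high-volume report factory.
# The 200 research/forecasting/analysis staff and ~$8M break-even figure
# come from the article; the $40k blended annual cost per head is an
# ASSUMED illustrative number (offshore-weighted, with sales, marketing,
# management and overhead allocated in), not a sourced figure.

research_headcount = 200
assumed_cost_per_head = 40_000  # USD/year, fully loaded (assumption)

break_even_revenue = research_headcount * assumed_cost_per_head
print(f"Approximate break-even revenue: ${break_even_revenue:,}")
# → Approximate break-even revenue: $8,000,000
```

Even under this generous offshore-cost assumption, the revenue required makes the claimed output implausible without heavy automation shortcuts.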
AI tools are a way for these firms to shave costs while projecting the illusion of an extensive product offering and broad market coverage, produced by a team of analysts no one can actually see.
Many companies buy reports for third-party validation of market sizing, investment justifications, internal funding, M&A, or market perspectives. Given the issues with AI and the low standards employed by market research factories, how can a buyer or internal user feel confident in the content or numbers these factories produce, or even know what they are getting?
Work with established research brands, where the buyer is able to understand the vendor's database, research methodology and the analysts writing the report. Quality firms can produce their analysts and present their processes openly; factories cannot.
Buy directly from reputable brands and avoid the factories and the resellers who carry their products. While some reseller companies do screen publishers, if the products sell, they stay in the catalogues.
If the report is not available when the buyer inquires, don't wait 3-5 days for a scrape-and-compile job. That delay should tell the buyer the product is a factory job.
Master the AI tools internally: not only make sure the data and content sources meet your standards, but also do your part to avoid supporting firms that diminish the value good firms provide.