Rarely does a company price its product correctly on the first try, nor on the second, third, or fourth. According to a Bain & Company study, a whopping 85% of B2B companies "have significant room for improvement in pricing." But even if a company does happen to land close to the mark, continuing to inch toward optimal pricing can make a huge difference. A Harvard Business Review study showed that a mere 1% improvement in price can generate a 7-11% increase in profits, a finding that has since been replicated often enough to serve as a reliable benchmark.
The key to pricing optimization is data. Traditionally, companies have anchored their pricing strategy by selecting a handful of relevant data streams to set a fixed price. Increasingly, however, today's organizations are incorporating more and more elements into a pricing model that is dynamic, not fixed. As pricing becomes a central lever for differentiation, organizations need to respond to market changes faster than ever. That means optimizing a continuum of decisions (such as list price or promotions) based on context (such as localization or special occasions) in service of multiple business objectives (such as net revenue growth or cross-selling).
Common Data Model: The First Step Toward Trustworthy Pricing Models
A dynamic pricing model has the potential to significantly move the needle on an organization's bottom line. But it can also test the limits of its data architecture and data management practices. As they incorporate more and more data streams, many organizations have faced repeated "trust" issues caused by poor data quality, weak data governance across multiple geographies or business units, and long, inefficient deployments.
One of the first steps toward resolving these issues is developing a Common Data Model (CDM). Having a centralized model for all required data is essential to ensure that users have a single source of truth. A CDM defines standards for the key components that influence pricing optimization, including transactions, products, prices, and customers. From there, the standards are applied through a blended dataflow, so every downstream system that leverages pricing data (business applications, dashboards, microservices, etc.) works from one homogeneous data model. The CDM also gives the various teams involved in the initiative a common language for collaboration.
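To make the idea concrete, here is a minimal sketch of what the four CDM classes named above might look like as typed records. The field names are hypothetical, chosen only for illustration; an actual CDM would define them per the organization's standards.

```python
from dataclasses import dataclass
from datetime import date

# Illustrative CDM classes. Field names are hypothetical, not a
# prescribed schema; a real CDM would specify them formally.

@dataclass
class Product:
    product_id: str
    category: str

@dataclass
class Customer:
    customer_id: str
    region: str

@dataclass
class Price:
    product_id: str
    list_price: float
    currency: str

@dataclass
class Transaction:
    transaction_id: str
    product_id: str
    customer_id: str
    quantity: int
    net_price: float
    transaction_date: date
```

Because every source system is mapped into these shared classes, downstream consumers never need to know source-specific column names or formats.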
Of course, the CDM must work in conjunction with a modern data environment such as Google Cloud. In particular, Google Cloud brings major benefits to pricing optimization, including an end-to-end analytics suite, flexibility and scalability, and self-service capabilities.
Cloud Dataprep by Trifacta for Pricing Optimization
Having a Common Data Model and a modern data environment in place are important foundational components. But the true test of pricing optimization efficiency is how quickly users can take action on this data. In other words, how easily can users prepare data, the first and most critical step in data analysis? Historically, the answer has been "not very." Preparing data, which can include everything from addressing null values to standardizing mismatched data to splitting or joining columns, is a time-consuming process that demands great attention to detail and, often, significant technical skill. It can take up to 80% of the overall analytics process. But by and large, it's worth the extra time spent: properly prepared data can make the difference between a faulty and an accurate final analysis.
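The preparation steps mentioned above can be sketched in a few lines of pandas. This is a toy example with illustrative column names, not a prescribed workflow, but it shows the kind of work (null handling, standardization, column splitting) that tools like Cloud Dataprep by Trifacta automate.

```python
import pandas as pd

# Toy pricing data with typical quality problems; column names are
# illustrative only.
raw = pd.DataFrame({
    "sku": ["A-001", "a-002", None],
    "price": ["10.5", "N/A", "8.0"],
    "region_channel": ["EU|web", "US|retail", "EU|retail"],
})

# Address null values: drop rows missing the key identifier.
clean = raw.dropna(subset=["sku"]).copy()

# Standardize mismatched data: uniform case, sentinel strings to NaN.
clean["sku"] = clean["sku"].str.upper()
clean["price"] = pd.to_numeric(clean["price"], errors="coerce")

# Split a compound column into two proper attributes.
clean[["region", "channel"]] = clean["region_channel"].str.split(
    "|", expand=True
)
clean = clean.drop(columns=["region_channel"])
```

Even in this tiny case, each step encodes a judgment call (which rows to drop, which strings count as missing), which is why preparation benefits from attention to detail.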
Google Cloud’s answer to the data preparation problem is Cloud Dataprep by Trifacta, a fully-managed service that leverages Trifacta technology and allows users to access, standardize, and unify required data sources. Its machine-learning-driven engine and visual interface accelerate the overall data preparation process by up to 90%. Some of the critical steps involved in leveraging Cloud Dataprep by Trifacta are:
- Assess Data Sources: Upon ingesting the necessary data for pricing optimization, the contents of each source must be assessed. Each source system will have its own way of describing and storing data, and each will have a different level of accuracy. In this first step, building an inventory is essential to get a clear picture of the quality of each source, which will later inform how it should be cleaned and standardized.
- Standardize Data: After identifying the source systems and assessing their data quality, the next step is actually resolving those data quality issues to achieve data accuracy, integrity, consistency, and completeness. Eventually, this data will be normalized and mapped to its corresponding CDM class of data.
- Unify in One Structure: Unifying this data into a singular structure consists of joining all individual CDM data classes with attributes from each individual class at the finest granular level. This is a critical step because it creates one source of trusted data for all pricing optimization work.
- Deliver Analytics & ML/AI: Once data is clean and ready for analysis, analysts can begin to run use cases and scenarios that explore the impacts of pricing changes. It is during this stage where the organization will begin to see dramatic changes to their bottom line—but not without all of the hard upfront work of preparing that data.
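The four steps above can be condensed into a short pandas sketch. Table and column names here are illustrative assumptions, not part of any prescribed CDM, and the "analysis" is a deliberately simple discount calculation standing in for real pricing scenarios.

```python
import pandas as pd

# Two toy CDM classes of data; names and values are illustrative.
transactions = pd.DataFrame({
    "product_id": ["P1", "P1", "P2"],
    "qty": [2, 1, 4],
    "net_price": [9.5, 9.0, 20.0],
})
prices = pd.DataFrame({
    "product_id": ["P1", "P2"],
    "list_price": [10.0, 22.0],
})

# 1. Assess: build an inventory of each source's quality
#    (here, simply the count of missing values per source).
inventory = {
    name: int(df.isna().sum().sum())
    for name, df in {"transactions": transactions, "prices": prices}.items()
}

# 2. Standardize: enforce the types the CDM expects.
transactions["qty"] = transactions["qty"].astype(int)

# 3. Unify: join CDM data classes at the finest granular level,
#    the individual transaction line.
unified = transactions.merge(prices, on="product_id", how="left")

# 4. Analyze: e.g., realized discount vs. list price per transaction,
#    a trivial stand-in for a pricing-scenario analysis.
unified["discount_pct"] = 1 - unified["net_price"] / unified["list_price"]
```

The unified table is the "one source of trusted data" the third step describes: every pricing analysis downstream reads from it rather than from the raw sources.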
Want to learn more? We’ve put together an entire whitepaper with our partner TheTopLineLab, which specializes in the practice of pricing optimization, so you can identify your organization’s weaknesses, better understand the data sources needed on your pricing optimization journey, and review a step-by-step guide to how the Google Cloud Smart Analytics suite enables an essential Common Data Model (CDM). Download the full whitepaper here.