Mastering Well Data
They were putting together a comprehensive strategy to best their competitors. But right as they started researching how to gain the leverage needed for a targeted attack, they hit a major obstacle: a lack of easy-to-access, well-defined well data.
There was no automated way to track well activity by segment. Data analysts and market managers were in the dark: they couldn't create a market share map without spending weeks piecing together record fragments. Even then, the results were inaccurate, since the analysis was based on rig count, and by the time they finished, the data was already obsolete. Without a reliable map, there was no way to show how sales measured up against industry competitors, leaving our client at a significant disadvantage.
How can you outperform your rivals when you can't see where and how they're operating? We needed the location, characteristics, and ownership of every single well across the world to get completely clear on our client's position and output. That would let our client define a targeted strategy around potential customers and their most important competitors. Aggressive, right? The plan had two parts:
1) Identify trustworthy sources of well data
2) Create an automated method of organizing that data so that business leaders could get a consistently clear picture of the current market based on well data (Master Data Management Strategy).
This entailed pulling together 2 million records from systems across the company and from third-party sources, developing a Master Data model for well data based on the PPDM standard, cleaning thousands of duplicate records, purging inaccuracies, and then matching and moving every record into one "single source of truth".
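To make the "Master Data model" step concrete, here is a minimal sketch of what a consolidated well record can look like. The field names are illustrative only, loosely inspired by common well attributes (UWI, operator, location, status); they are not the actual PPDM schema, and the sample values are invented.

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative sketch of a well master record. Field names and values
# are assumptions for this example, not the real PPDM table layout.
@dataclass
class WellRecord:
    uwi: str                        # Unique Well Identifier - the master key
    well_name: str
    operator: str
    latitude: Optional[float] = None
    longitude: Optional[float] = None
    status: Optional[str] = None    # e.g. "ACTIVE", "ABANDONED"
    source: str = "unknown"         # which system the record came from

# A hypothetical record as it might arrive from one source system
rec = WellRecord(uwi="42-501-20130", well_name="Smith 1H",
                 operator="Example Energy", latitude=31.9,
                 longitude=-102.1, status="ACTIVE", source="SAP")
```

Agreeing on one model like this up front is what lets records from every source system be compared, matched, and merged later in the pipeline.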
A good old-fashioned mess. So, we rolled up our sleeves and jumped in.
It was a 4-month process involving a small dedicated team. Our Master Data Project Manager analyzed the requirements and created a plan. Our Master Data Architect put together our precision toolbox based on the tools our client had at hand along with our own recommendations, and our two Master Data Senior Developers got to work.
Our client's go-to program was Master Data Services (MDS), a Microsoft product that is part of SQL Server. We used another SQL Server component, SSIS (SQL Server Integration Services), to pull data from all sources and load it into MDS. And for due diligence, we set up environments in TFS (Team Foundation Server) to manage software versioning.
Through SSIS, our developers ran data quality validation and reviewed the lifecycle of all 2 million records. We then switched from SSIS to another tool, Maestro, to match data records, purge duplicates, and apply survivorship rules so the most accurate values won out, producing golden records. After a final validation pass with SSIS, we published the data into the Master Data cache in MDS and exposed it via web services so it could be consumed by BI and analytics tools.
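The match/survive/publish step above can be sketched in miniature. The real work was done in Maestro; this toy version only illustrates the idea: group records on a normalized well identifier, then apply a survivorship rule (here, a made-up source-trust ranking) so each field of the golden record keeps the value from the most trusted source that provides it. All source names, trust scores, and matching rules below are assumptions for illustration.

```python
from collections import defaultdict

# Hypothetical trust ranking per source system (an assumption, not the
# project's actual survivorship configuration).
SOURCE_TRUST = {"internal_db": 3, "third_party_a": 2, "legacy_export": 1}

def normalize_key(rec):
    # Toy matching rule: compare well identifiers with dashes and case removed
    return rec["uwi"].replace("-", "").upper()

def build_golden_records(records):
    groups = defaultdict(list)
    for rec in records:
        groups[normalize_key(rec)].append(rec)
    golden = {}
    for key, dupes in groups.items():
        # Survivorship: visit duplicates from most to least trusted source,
        # and keep the first non-empty value seen for each field.
        dupes.sort(key=lambda r: SOURCE_TRUST.get(r["source"], 0), reverse=True)
        merged = {}
        for rec in dupes:
            for field, value in rec.items():
                if field not in merged and value is not None:
                    merged[field] = value
        golden[key] = merged
    return golden

records = [
    {"uwi": "42-501-20130", "operator": None,
     "status": "ACTIVE", "source": "legacy_export"},
    {"uwi": "4250120130", "operator": "Example Energy",
     "status": None, "source": "internal_db"},
]
golden = build_golden_records(records)
# Two duplicate source records collapse into one golden record whose
# fields are filled from the most trusted source that has them.
```

In production, the same shape of logic runs over millions of records with far richer matching (fuzzy name comparison, location tolerance) and field-level survivorship rules, which is exactly where a dedicated matching tool earns its keep.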
Now that our client has access to master well records, data analysts, market managers, and business executives have fast, easy, and reliable access to information about their company's and their competitors' performance. It has cut the time to produce a market share map in half and significantly improved repeatability through automation. Additionally, when a service order comes in through SAP, they can now select the well where the service will be applied. If you can imagine a future with efficient access to data, we can make it happen.