In a bid to build the next generation of home financing infrastructure, Fannie Mae uses TBM to understand not only technology costs but also business operating costs and time efficiency. The company is shaping investment in technology and operations with models that allow business users to explore the total costs of, and connections among, applications, business operations, and business capabilities, such as turning a pool of loans into a mortgage-backed security.
Fannie Mae corporate overview
The Federal National Mortgage Association (FNMA), colloquially known as Fannie Mae, is a government-sponsored enterprise that provides widespread access to affordable mortgage credit. It does so by buying loans from banks, packaging those loans into pools, securitizing the loan packages, and selling them to investors.
Fannie Mae is in the midst of another transformation
During the Great Depression, as borrowers defaulted on mortgages en masse and banks found themselves strapped for cash, President Franklin D. Roosevelt and Congress created Fannie Mae in 1938 in order to buy mortgages from lenders, freeing up capital that could go to other borrowers. In 1968, as the US Government’s budget was strained by the war in Vietnam, Fannie and its assets were sold to private investors and, in 1970, Fannie was listed on the New York Stock Exchange, followed by decades of robust growth alongside a rising housing market.
After the 2008 financial crisis, Fannie was placed under the conservatorship of the Federal Housing Finance Agency (FHFA) to guarantee solvency while working on a plan to restructure the secondary mortgage market. Now, Fannie is adopting a TBM approach to technology and business cost transparency in anticipation of helping to build the next generation of home financing infrastructure. Whatever future unfolds from the restructuring plan, Fannie aims to be prepared with facts and insight about the building blocks of technology and process that continue to supply vital liquidity into the American housing market.
IT was a black box to the business
The Operations and Technology (O&T) division’s first response to the new pressure for transparency was to create reports and dashboards to better respond to FHFA requests. One such report had 400 different data points that were input manually, every single day. To be prepared for on-the-fly questions about IT cost, the CIO carried around a thick binder of printed reports.
Whenever a round of company budget cuts came around, O&T was an easy target, not only because it had the largest budget at Fannie Mae, but also because no one could tell the business which technology costs were relevant to them.
“At that time we were using technical jargon to communicate costs to the business,” remembers Sheenal Patel, client engagement manager for Service & Performance Management (SPM), the O&T group that runs Fannie’s TBM program. “The business was allocated a large lump sum of indirect cost. We couldn’t explain what the impact would be to the business if we were asked to reduce cost. IT was a big black box.”
Gunther Schultz, a vice president on the business operations side, agrees. “I didn’t know what my applications cost. I just had one big cost number for all applications.”
“Our goal wasn’t necessarily cost reduction,” recalls Gboyega Adebayo, Fannie’s lead TBM analyst. “It was cost transparency. We really wanted to peel back the onion and understand the cost of what we do on a day-to-day basis, and we wanted to be able to communicate that to our business partner. We needed data that told a story.”
Telling that story was part of an overall transformation from technology provider to service provider.
Changing the business conversation with service costs
The Service and Performance Management (SPM) team set out to create a prototype of service costs focused on infrastructure. In a proof of concept, they used data from a handful of financial and operational sources to show storage and server costs tied to a handful of applications. Although it took them several months to compile this incomplete view, the first glimpse of real data in a service context was enough to get buy-in to investigate a repeatable solution. Patel explained, “We really wanted a solution that didn’t require hiring professional services every time we needed help. We wanted to grow at our pace. That’s when we adopted the TBM methodology and configured our own TBM system.”
The SPM team worked with the business to define services they understood and cared about. These include Operational Services (business labor and the supporting technology that automates the business operations services O&T provides to the company), Application & Integration Services (specific applications-as-a-service as well as infrastructure services), End User Services (e-mail, laptops), and Projects & Investments.
They loaded data from multiple sources into their TBM system and configured the model to flow costs into the services they had defined within O&T. After several iterations, they discovered that business and O&T service owners often wanted very different views, and that each team had priorities that were not consistent across the organization. TBM Program Manager Mina Han confessed, “Our impulse was to make the data perfect, but our idea of perfect wasn’t the same as the business.” Since then the team has adopted an approach of sharing works in progress with the business, as Han said, “to partner together and figure out what provides the most value.”
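The core mechanic described here, flowing financial cost pools into defined services through allocation rules, can be sketched in a few lines. This is a toy illustration of the general TBM cost-flow idea, not Fannie Mae's actual model; all pool names, service names, weights, and dollar figures are hypothetical.

```python
# Illustrative cost pools, e.g. aggregated from general-ledger data.
cost_pools = {
    "data_center": 1_000_000,
    "it_labor": 600_000,
}

# Allocation rules: what fraction of each pool flows to each service.
# Real TBM models derive these weights from operational data
# (server inventories, timesheets, ticket volumes, etc.).
allocations = {
    "data_center": {"Application & Integration Services": 0.7,
                    "End User Services": 0.3},
    "it_labor": {"Application & Integration Services": 0.5,
                 "Operational Services": 0.5},
}

def flow_costs(pools, rules):
    """Distribute each pool's cost to services per its allocation weights."""
    service_costs = {}
    for pool, total in pools.items():
        for service, weight in rules[pool].items():
            service_costs[service] = service_costs.get(service, 0.0) + total * weight
    return service_costs

service_costs = flow_costs(cost_pools, allocations)
```

Iterating on a model like this means adjusting the weights and mappings as business feedback comes in, which is why sharing works in progress beats waiting for a "perfect" allocation scheme.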
“The TBM service cost model really opened doors and a lot of eyes across Fannie Mae,” said Adebayo. “It was the first time application owners could see the total cost of what they were providing and the business could see what they were getting beyond just the project dollars they knew. Application support, risk controls and management overhead, the full cost of infrastructure services … It was mind-boggling when you realize how much it cost to keep a lot of these applications running.”
Schultz recounted his reaction from the business side: “The ability to actually isolate an application, double click on it, and understand that it’s made up of this much hosting, this much level one support, this much level two support … It gives me a better questioning path. I can ask why – ‘Why is that so high? Why is that app different from that app?’ We’re having further conversations of how do we allocate that cost? Is that the right methodology? We’re finally at the table together because we have a singular currency to discuss. The cost model has definitely brought us closer together.”
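The drill-down Schultz describes, isolating one application and decomposing its total into hosting and support tiers, amounts to a simple per-component breakdown and comparison. The sketch below is hypothetical; the application names, cost components, and amounts are invented for illustration.

```python
# Hypothetical per-application cost breakdowns (hosting plus support tiers).
app_costs = {
    "AppA": {"hosting": 120_000, "level_1_support": 40_000, "level_2_support": 65_000},
    "AppB": {"hosting": 300_000, "level_1_support": 55_000, "level_2_support": 90_000},
}

def total_cost(app):
    """Total run cost of one application across all components."""
    return sum(app_costs[app].values())

def compare(app_a, app_b):
    """Component-by-component delta, to support 'why is that app
    different from that app?' questions."""
    return {component: app_costs[app_a][component] - app_costs[app_b][component]
            for component in app_costs[app_a]}
```

A breakdown like this is what turns one lump-sum number into a "questioning path": the business can see which component drives the difference between two applications and challenge the allocation behind it.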
Part of sitting at the table together has meant deciding jointly where to invest and where to pare back. “And so now with TBM, we have the ability to show, here are your levers. This is what you can turn off, and this is what you can turn on.”
Schultz agreed. “It allows me to help drive the roadmap for which applications we should invest in and which ones we would like to sunset because of that cost.”
Advice for TBM beginners
The Service and Performance Management team at Fannie Mae offered advice for those just starting out with TBM or considering doing so.
Use TBM analysis to drive data improvement. “Everyone has gaps and inaccuracies in their data,” stated Patel. “Don’t let that be your crutch. TBM lets you tie dollars to those gaps so data owners know where to focus their improvement efforts. And they can use the reporting to see the progress.”
Roll out reports as soon as possible. “Don’t wait for things to be perfect before rolling anything out,” advised Han. “What you may think is perfect in your mind is most likely not what’s perfect in your customer’s mind, so the key is really to partner together to figure out what works and what provides the most value.”
Just get started. “Your data’s never going to be perfect, and you’re never going to have 100% buy-in before you start,” said Adebayo. “But just throw the information out there because when people see it, their eyes will be wide open. They’ll see what needs to be fixed, not just for TBM but for all the other reasons that system and data are there in the first place.”