Carlos Romero, VP of Technology for Stradigi AI, has guided the company’s technology initiatives since 2005 and has seen the organization through its high growth since 2017. Apptio asked Carlos for his insights into optimizing costs for a growing and changing business.
What is your current involvement in managing technology?
With the widespread use of SaaS-based tools across the enterprise landscape, there are many services whose features overlap. As VP of Technology, part of my job is to make the right decisions when analyzing the tools the business will leverage and to closely watch the integration between systems. The primary goal is bringing value to the business faster, while providing secure access, protecting data, and keeping information tightly integrated. I consider Stradigi AI to be a cloud-native company.
How do you define cost optimization and differentiate it from simply watching out for expenses?
It’s a balancing act between cost reduction and bringing value to the business while keeping a close eye on risk. It's not about being cheap; it's about using the right tools for the job. We have internal processes to select vendors, make sure they're compliant with certain standards, and ensure we don't duplicate tools within the organization. So we go through a diligence process, identifying the right solution for the job, and then comes negotiation: I negotiate the terms with the provider. Then we look at the cost, the ROI of the tool, and the actual long-term gain, along with ease of implementation, to understand whether we face a steep learning curve.
These are the things that need to be planned properly to make sure the value is there for what you want to do, and then you create a plan and a path to actually achieve the expected result.
Are there particular areas of risk that you consider to be inherently expensive? Can you give an example of an opportunity or technology you considered to present too much risk? And how do you calculate an acceptable level of risk vs. cost?
Loss of data confidentiality and availability is inherently expensive, and cloud computing, if not managed properly, can be considered a risk, but its benefits often outweigh the drawbacks. A proper risk assessment should be performed on every cloud project to decide whether the residual risk is acceptable by the organization’s standards, always backed by a proper risk treatment plan.
We utilize tools to help us inventory the tools we're using and verify that vendors have certifications in place. We run that through our legal department, and we also have vendor assessment programs. Then, based on the risks we've identified, we decide whether to accept them or mitigate them with other tools. So there is a risk assessment and a treatment plan for every single internal process. For vendors of SaaS solutions, we go through the same scenario.
Where does cost optimization start? What are the most important areas to monitor?
Watching service overlaps closely, and really checking whether the current tools in our arsenal can do the job before committing to what I often refer to as 'YAST' - Yet Another SaaS Tool. Often teams request access to services, or begin trials, whose features are already part of a service we currently have access to. Training the staff, having a good understanding of the tools and capabilities, and being able to negotiate with other department heads are key to keeping costs down and having full control of where the company’s data lives. There are great tools that allow you to have a comprehensive view of your company’s SaaS ecosystem and keep license allocation in check.
There are certainly plenty of options when it comes to SaaS tools. How do you track the various capabilities of each and evaluate when there is duplication of function? Is this simply a matter of digging into every product?
Part of the analysis phase of every project requires in-depth research into the features offered by the shortlisted applications. Once a tool is selected, properly briefing and training the IT team on its full capabilities is key. Also, having an up-to-date knowledge base with information about the available tools helps staff and newly on-boarded employees access that information. It all comes back to features and ROI: the value and time savings a new application brings to the table vs. using one, or a combination, of the existing tools in our arsenal.
How much control does IT management really have over costs?
Nowadays all cloud utilization is managed by the IT department, from contract negotiation to provisioning, license allocation, and proper training, which shortens time to productivity. Therefore, I’d have to say IT management does have quite a bit of control over costs.
Cloud services are often pointed to as ways to optimize IT costs. How do you view this as a strategy and how can overuse be avoided?
Devising a clear strategy to allow cloud utilization in the organization, security best practices on cloud resources to protect against abuse, and monitoring under-utilized resources are all key.
There are quite a few tools that let you intelligently see how resources are being utilized, come up with a schedule, and automatically shut down systems that are not being used. Some give suggestions like, “this server is running at this utilization level, so consider lowering its size to save half the money.”
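The kind of utilization-based recommendation described above boils down to a simple rule: compare average utilization against thresholds and suggest an action. A minimal sketch in Python (the thresholds, server names, and actions are hypothetical illustrations, not any specific vendor's logic):

```python
# Hypothetical rightsizing helper: given average CPU utilization samples
# for each server, recommend an action. Thresholds are illustrative only.

def recommend_action(cpu_samples, idle_threshold=5.0, low_threshold=30.0):
    """Return 'shut_down', 'downsize', or 'keep' based on average CPU %."""
    avg = sum(cpu_samples) / len(cpu_samples)
    if avg < idle_threshold:
        return "shut_down"   # essentially unused: park it on a schedule
    if avg < low_threshold:
        return "downsize"    # running, but a smaller size would do
    return "keep"

fleet = {
    "build-server": [2.1, 1.8, 3.0],     # idle most of the time
    "api-server":   [22.0, 18.5, 25.0],  # lightly loaded
    "db-server":    [61.0, 70.2, 58.4],  # well utilized
}

for name, samples in fleet.items():
    print(name, "->", recommend_action(samples))
```

Real tools base such recommendations on richer signals (memory, I/O, utilization history), but the decision shape is the same.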
Internally, we shut down and disable unused accounts. At this point we have 140 people in the company. So, when someone is off-boarded you want to make sure that all the resources and accounts that they were utilizing are terminated or suspended along with the employee. Automation lets us disable an account with a click of a button.
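The one-click off-boarding described here amounts to iterating over every service an employee can access and suspending each account. A minimal sketch, assuming a simple in-memory directory (a real implementation would call each provider's identity API instead):

```python
# Hypothetical off-boarding automation: suspend every account tied to an
# employee across tracked services. Names and services are illustrative.

accounts = {
    "alice": {"email": "active", "vpn": "active", "crm": "active"},
    "bob":   {"email": "active", "vpn": "active"},
}

def offboard(employee):
    """Suspend all of an employee's active accounts; return what changed."""
    suspended = []
    for service, status in accounts.get(employee, {}).items():
        if status == "active":
            accounts[employee][service] = "suspended"
            suspended.append(service)
    return suspended

print(offboard("alice"))  # every active account is suspended in one call
```

Running it a second time for the same employee changes nothing, which is what makes it safe to trigger from a single button.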
I think automation software has come a long way to help the organization have full visibility on the resources, how they're being utilized, and how we approach it as an IT team to make sure that the resources are managed properly.
Can you give us some insight into the cloud strategy you’ve defined for your organization?
Cloud first! It brings fast prototyping, agnostic automation, endless scalability, faster project deliveries, and quicker time to market. What I mean by cloud first is that we don't believe in procuring resources internally.
That also means we're not tied to any vendor-specific code or resources. I want to make sure we can move our code and tools across platforms easily. And when we talk scalability, it's not just scaling up, but also being able to scale resources back down. In case of spikes or surges in traffic, you can easily adjust accordingly.
Do you think zero-based budgeting is appropriate for IT cost control?
Indeed. Properly projecting utilization, building a solid business case for each project and closely analyzing your financial estimates are key to properly distribute the budget across security, infrastructure, SaaS, software and hardware.
We don’t have an endless amount of money to dig into, so we analyze every single use case for the business and come up with a solid business plan, then we present it for approval. We go in with a strategy to achieve ROI, whether it's something as simple as a cloud phone system or a full deployment in the cloud. We have to actually be able to show how the investment is going to benefit the organization. So my department works with no pre-set budget; every expense is justified from zero.
What methodologies and tools do you employ to keep spending at appropriate levels?
We look at monitoring, automation, cloud parking systems, training, and great negotiation skills. The industry is moving toward automation. We can automate tasks like account generation for new employees to set up all the environments that they're going to be using. So it is an automated script that can be executed many times.
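The repeatable on-boarding script described above is essentially an idempotent provisioning routine: running it twice for the same employee must not create duplicate accounts. A hedged sketch (the service list and role mapping are invented for illustration):

```python
# Hypothetical on-boarding automation: create the standard set of accounts
# for a new employee, safely re-runnable. Services and roles are illustrative.

STANDARD_SERVICES = {"email", "chat", "source_control"}
ROLE_EXTRAS = {"engineer": {"ci_cd", "cloud_console"}}

provisioned = {}  # employee -> set of services already set up

def provision(employee, role):
    """Create any missing accounts for the employee; idempotent."""
    needed = STANDARD_SERVICES | ROLE_EXTRAS.get(role, set())
    existing = provisioned.setdefault(employee, set())
    created = sorted(needed - existing)
    existing.update(needed)
    return created  # only the accounts created on this run

print(provision("carla", "engineer"))  # first run creates everything
print(provision("carla", "engineer"))  # second run creates nothing: []
```

The idempotence is the point: the same script can be "executed many times", as the answer puts it, without side effects piling up.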
I'm a believer in that kind of practice, and every time we have a task that is this repetitive, we find ways to automate it and make sure the process goes as smoothly as possible. It’s the same with long-running workflows: how can we simplify the work? How can we purge steps that take time? That's very important for workflow automation, development automation, infrastructure automation, and deployment automation.
Clearly negotiating with providers can help keep costs in check. What strategies have you used to create leverage when negotiating for services? Does it usually involve price concessions, or do you consider SLA and feature upgrades as well?
Multi-year agreements, future opportunities, the growth of the company all factor in. I’ve found it advantageous to negotiate contracts close to quarter ends.
If you're going to be with a provider for a long time, or it's going to be the backbone of your organization, or it's going to be running very mission critical apps, it’s best to establish long term contracts. When you negotiate a longer agreement, you can lower the licensing fees, and that’s something I use quite a lot.
How do you see DevOps and agile methodologies contributing to cost optimization?
We take an agnostic view of software: build once, deploy everywhere, then closely monitor resources. Container orchestration tools allow us to deploy easily on underutilized systems in our fleet and on our IaaS providers to leverage spot instances.
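The placement idea sketched in that answer, fill spare capacity on owned machines first, then overflow to cheaper spot capacity, can be illustrated with a greedy scheduler. This is a toy model with made-up node names and capacities; real orchestrators such as Kubernetes express the same intent through schedulers and node pools:

```python
# Hypothetical greedy placement: prefer spare capacity on owned fleet
# nodes, then fall back to spot instances. All values are illustrative.

def place(workloads, fleet_spare):
    """Assign each (name, cores) workload to a fleet node or to 'spot'."""
    spare = dict(fleet_spare)  # work on a copy; don't mutate the caller's view
    placement = {}
    for name, cores in workloads:
        target = next(
            (node for node, free in spare.items() if free >= cores), None)
        if target:
            spare[target] -= cores
            placement[name] = target
        else:
            placement[name] = "spot"  # overflow to cheaper spot capacity
    return placement

jobs = [("web", 2), ("batch", 3), ("etl", 4)]
print(place(jobs, {"node-a": 4, "node-b": 2}))
# -> {'web': 'node-a', 'batch': 'spot', 'etl': 'spot'}
```

Only "web" fits the owned fleet here; the larger jobs spill over to spot, which is where the cost savings come from.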
Do you have staff dedicated to monitoring these changes and managing the various moves and transitions?
Yes, monitoring and gathering early feedback on deployments is extremely important. Having early indications that an application or feature is not performing as expected, or having clear metrics on its adoption, helps the business pivot and adapt to market needs, saving time and materials in the process.