Doing AI vs. Using AI: Two Approaches to Rationalize Adoption & Boost ROI

Two distinct approaches. Multiple use cases. Which one will best fit your organization’s unique needs?


I speak to countless business leaders who are excited, but also bewildered, by the promise of disruptive technologies like AI. Recently, I’ve seen light bulbs go on during conversations where we sort AI adoption into two main camps: “doing AI” and “using AI.”

Each method caters to specific requirements and resources. And both offer unique advantages and use cases. Let’s take a look at the fundamental differences between these two camps and explore how your organization can leverage either approach to your advantage.

Doing AI: Building from scratch

The "doing AI" approach involves a more comprehensive development of an AI system. This method focuses on creating an in-house AI infrastructure — one that is fine-tuned and meticulously crafted to address needs across the organization.

Here are some key characteristics of the "doing AI" approach:

  • Large, well-manicured data sets:
    Successful AI implementation requires access to extensive and reliable data sets. In the "doing AI" camp, organizations invest significant efforts in gathering, cleaning, and curating data to train their AI models effectively.
  • Expert team:
    The process of building AI models from scratch demands a team that can include highly skilled data scientists, data architects, Ph.D. researchers, and other subject matter experts. Their collective expertise ensures the AI system is robust and optimized for the given domain.
  • Substantial resources:
    Implementing AI from the ground up is a resource-intensive endeavor. Organizations need considerable financial investments to support the infrastructure, compute resources, and staff required for the project.
  • Time intensive:
    Doing AI often involves extensive experimentation, tuning, and optimization to achieve the desired level of performance. The time to converge on an ideal model is often measured in months, if not years.

Using AI: Leveraging existing solutions

On the other hand, the “using AI” approach focuses on incorporating pre-existing AI solutions or services into your workflows. Think of key players such as Microsoft, Anthropic, and OpenAI: you leverage their capabilities as features to drive improvement within your organization.

That said, you don’t necessarily need to lean on a provider that only does generative AI — the provider could be a security vendor that’s built an AI model into its solution. Several products use AI models internally to drive better outcomes in their offerings. For example, an entire category of AIOps tooling improves operations by using AI to predict failures, correlate events, and even optimize your environment.

The mindset of a "using AI" approach includes the following:

  • AI as features:
    Treating AI capabilities as features lets you integrate them seamlessly into existing products, solutions, and workflows, rather than building systems from scratch.
  • Pretrained models:
    The "using AI" camp benefits from pretrained AI models that are already capable of performing specific tasks. These models can be leveraged for inference, generation, or other relevant purposes.
  • Fine-tuning for customization:
    Organizations can fine-tune pretrained models with their own data to tailor the AI system to their unique needs. This allows them to achieve better accuracy and relevance within their specific domain.
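To make the fine-tuning idea concrete, here is a deliberately tiny, library-free sketch: a “pretrained” linear model (weights learned elsewhere on general data) is nudged toward a small domain-specific dataset with a few gradient-descent passes, instead of being trained from scratch. All names and numbers are invented for illustration; real fine-tuning applies the same principle to large neural networks via frameworks, not hand-rolled loops.

```python
# Toy illustration of fine-tuning: start from "pretrained" parameters
# and adjust them with a small, domain-specific dataset rather than
# training from random initialization. All values are made up.

def predict(w, b, x):
    """A one-parameter 'model': y = w * x + b."""
    return w * x + b

def fine_tune(w, b, data, lr=0.02, epochs=1000):
    """Nudge pretrained (w, b) via stochastic gradient descent on squared error."""
    for _ in range(epochs):
        for x, y in data:
            err = predict(w, b, x) - y
            w -= lr * err * x   # gradient of 0.5 * err**2 w.r.t. w
            b -= lr * err       # gradient of 0.5 * err**2 w.r.t. b
    return w, b

# "Pretrained" parameters, learned on some generic task
w0, b0 = 1.0, 0.0

# Small domain dataset where the true relationship is y = 2x + 1
domain_data = [(1.0, 3.0), (2.0, 5.0), (3.0, 7.0)]

# Fine-tuning pulls the generic (1.0, 0.0) toward the domain's (2.0, 1.0)
w, b = fine_tune(w0, b0, domain_data)
print(round(w, 2), round(b, 2))
```

The design point this illustrates: the expensive part (pretraining) is reused, and only a cheap adjustment on your own data is needed, which is exactly why the “using AI” camp can deliver domain-relevant results without “doing AI” budgets.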

What can these approaches look like in the real world?

It’s important to note that both approaches can exist within the same organization. For example, consider an oil and gas company. Suppose the company has extensive geo-seismic data collected through expensive processes. In this case, the company would likely prefer the "doing AI" approach, developing its own AI system to ensure complete control over its data. That data is the organization’s competitive advantage.

Within that same company, the development arm may have a goal to improve coding efficiency. In this case, it should probably opt for the "using AI" approach, leveraging a service like Microsoft Copilot to assist with code development. The organization could enhance its coding capabilities without investing in extensive AI model training and building.

Rationalize for success.

There’s no way around it: Rationalizing adoption is a huge part of how successful your organization will be as it adapts to new technologies. At a high level, this doing-versus-using framing has been a highly effective way to rationalize AI adoption for Insight’s clients.

As you think about all the factors that can guide your journey — data sensitivity, expertise, available resources, and your desired level of customization — remember that embracing the right approach can be a game changer for your organization.


Juan Orlandini

Chief Technology Officer, North America and Distinguished Technologist, Insight

Juan is Insight’s chief technology officer, North America, and one of Insight’s distinguished technologists. He is a 30-plus-year veteran of the IT industry and has designed and deployed enterprise computing, storage, data protection, virtualization and hybrid cloud solutions. Juan evaluates next-generation technologies for Insight and works with enterprise clients, assisting them in architecting and selecting strategic roadmaps. In his current role, Juan designates champions of the technology community within Insight and drives events that promote thought leadership, professional development and knowledge sharing.