
Agile AI at Georgian – Part 1: Finding Your Project’s North Star

AI at Georgian: What I’ve learned

In my work leading product and strategy for Georgian, I’m obsessively focused on process. I’ve spent the past few years experimenting to find a process that meets the specific challenges of building AI products. 

AI products tend to be more complex and highly cross-functional, with ongoing governance needs, non-deterministic outcomes… the list goes on. It's no wonder that only 13% of data science projects make it into production [1].

You might be asking yourself: why is a VC firm building products? Every industry is going through digital transformation, and ours is no different. Historically, we've differentiated ourselves by helping our companies adopt trends like AI and Trust to accelerate their growth and develop a competitive advantage. Now, we're evolving our value-add offerings through our technology platform. The Georgian platform supports our companies' efforts in the areas of community, capital, research, product and operations, all to create a better experience of growth-stage investing.

Like any startup, we run an agile product development process, with multiple sprint teams and pods. In this series on agile AI practices, I’ll share some of the best practices we have developed over time by building our own products and collaborating with other ML teams. There are some unique challenges presented by ML/AI/data science projects that we will talk about in these posts. 

In the first installment of this series, we’ll talk about what an AI project’s North Star is, why you should care and how to formulate one for AI products while avoiding common pitfalls.

A user-centered North Star

Once our team has prioritized a project or product, homing in on the North Star is the next step. When we say 'North Star', we mean a clear definition of the value we're expecting AI to deliver. Is it automation, augmentation, scale, increasing the quality of a business process or something else? Google PAIR's People and AI Guidebook is one resource we've found that offers high-quality guidance on identifying exactly where AI capabilities and user needs intersect.

The North Star needs to be clear, attainable and framed in terms of the value the user will experience. AI projects often go off track because there is too much focus on the model or its performance, and we lose track of how that model will delight customers or make an impact once it is deployed.

For instance, our platform identifies, scores and ranks companies and markets for our investment pipeline. The main persona we are supporting is a member of our investment team who is sourcing leads.

INFOGRAPHIC #1: Finding your AI project's 'North Star'

  • Define the value your project will provide using AI (e.g., automation, scale or augmentation).
  • Your North Star must be clear, attainable and framed in terms of user value.
  • Don't focus so much on model performance that you lose sight of customer value.

Keeping your North Star in focus

Next, we need to make sure that the North Star is clearly understood by all stakeholders, from the business owner to the customers to the data scientists. We give our data scientists and engineers access to investors and to users of the system so they understand exactly how what they are building will impact their colleagues, and the context in which their solutions are being used.

How can you keep your data science team from losing sight of the North Star even as they are down in the weeds of model development and optimization? We’ve found that a regular feedback loop is necessary so that the cross-functional team continues checking in on progress relative to the North Star goal.

At Georgian, here is the feedback loop cadence we’ve settled on:

  • Our product team meets with business stakeholders (our deal leads) and the customer (our investment and growth/business development teams) each month.
  • Our data scientists meet with the customer each week. In some cases, we even have them participate in the user’s activities — for instance, we have an engineer working on the investment team doing market research to gain user empathy and understanding. Because this is an internal product, we also have regular demo days where the development team and users co-present our progress to a wider audience. 
  • Based on the output of the above steps, our team develops detailed ROI models and metrics reporting to keep everyone apprised of progress.

INFOGRAPHIC #2: Keeping your North Star in focus

  • Ensure your North Star is understood by all stakeholders.
  • Connect with customers so that engineers and data scientists understand how their solutions are being used.
  • Create a regular feedback loop so cross-functional teams can check in on progress relative to the North Star goal.

Avoiding pitfalls

There are a few common ways that teams can lose sight of their North Star. Knowing about these in advance can help you to avoid them.

First, when building and evaluating models, it's important to understand the business impact of different types of errors. For instance, you may need to make tradeoffs between your model's coverage and its accuracy. Your North Star will tell you which is more important to prioritize based on the impact each has on your users, and this insight will drive decisions throughout the development process, from which modeling techniques the team employs to how to QA the results. You'll need to do detailed error analysis to check the qualitative impact of model errors rather than blindly applying performance metrics to your test set.
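
To make this concrete, here is a minimal sketch, with hypothetical cost figures and function names, of how a team might pick a model's operating threshold based on the business impact of each error type rather than on accuracy alone. It is an illustration of the idea, not our production code.

```python
# Sketch: choose an operating threshold by expected business cost, not raw accuracy.
# The cost figures below are hypothetical: a false positive wastes a reviewer's
# time on a weak lead, while a false negative is a missed high-growth company.
import numpy as np

def pick_threshold(y_true, y_scores, cost_fp=1.0, cost_fn=20.0):
    """Return the score threshold that minimizes expected business cost."""
    best_threshold, best_cost = None, np.inf
    for t in np.unique(y_scores):
        preds = y_scores >= t
        fp = np.sum(preds & (y_true == 0))   # weak leads surfaced to the team
        fn = np.sum(~preds & (y_true == 1))  # strong leads the model missed
        cost = fp * cost_fp + fn * cost_fn
        if cost < best_cost:
            best_threshold, best_cost = t, cost
    return best_threshold, best_cost

# Toy labels and model scores, for illustration only.
y_true = np.array([1, 0, 0, 1, 0, 1, 0, 0, 1, 0])
y_scores = np.array([0.9, 0.4, 0.2, 0.7, 0.6, 0.3, 0.1, 0.5, 0.8, 0.35])
threshold, cost = pick_threshold(y_true, y_scores)
print(f"chosen threshold={threshold:.2f}, expected cost={cost:.1f}")
```

The same framing carries into error analysis: once a threshold is chosen, the cases it gets wrong should still be reviewed qualitatively, because two errors with the same metric weight can have very different consequences for the user.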

Second, your North Star can promote deep understanding of, and empathy for, how users will interact with and perceive your model, remembering that users don't interact with models in isolation, but as part of a larger product experience. There may be different constraints on models that users interact with directly (such as lead generation or recommendations) versus models that aren't obviously visible to end users. At Georgian, we capture our requirements in a "user story" format, which helps us keep the North Star in sight.

Third, it can be easy to confuse your North Star with other goals. Your North Star should not change, or should evolve only slightly, over the course of your development project. By contrast, we use more granular sprint-level goals and quarterly OKRs to track specific targets that support the North Star. At all times, teams should rally behind specific targets and objectives to promote collaboration and reduce waste in the development process.


For instance, our North Star is to use ML to put better leads, in our case high-growth startups, in front of the investment team. In a two-week sprint, we may want to incorporate employee growth features to attain lift in our lead scoring model. Without an overarching theme, though, it can be easy for a team to get bogged down in meeting the granular goal in a way that doesn't necessarily advance the spirit of the initiative. Along the way, we check that adding this data meaningfully improves our progress toward the North Star, which means improvement not just in model performance but in the overall ease of use, understandability and utility of the model for end users.
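
As a simplified illustration of that check, the sketch below compares a lead-scoring model with and without a hypothetical employee-growth feature group, measuring precision among the top-ranked leads, since that is closer to what the investment team actually sees than an aggregate offline score. The data, feature names and choice of model are all placeholders.

```python
# Sketch: does a new feature group improve the leads users actually see (top-k)?
# Synthetic data and hypothetical feature names, for illustration only.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000
base_features = rng.normal(size=(n, 5))    # stand-in for existing lead-scoring features
growth_features = rng.normal(size=(n, 2))  # stand-in for new employee-growth signals
signal = base_features[:, 0] + 0.8 * growth_features[:, 0]
labels = (signal + rng.normal(size=n) > 1.0).astype(int)

def precision_at_k(model, X_tr, y_tr, X_te, y_te, k=50):
    """Fraction of true high-growth companies among the top-k ranked leads."""
    model.fit(X_tr, y_tr)
    scores = model.predict_proba(X_te)[:, 1]
    top_k = np.argsort(scores)[::-1][:k]
    return y_te[top_k].mean()

for name, X in [("base features", base_features),
                ("base + growth features", np.hstack([base_features, growth_features]))]:
    X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.3, random_state=0)
    score = precision_at_k(GradientBoostingClassifier(random_state=0), X_tr, y_tr, X_te, y_te)
    print(f"{name}: precision@50 = {score:.2f}")
```

Even a clear lift on a metric like this is only part of the answer; we still ask whether the change helps the end user understand and act on the leads.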

INFOGRAPHIC #3: Avoiding common AI project pitfalls

  • When building and evaluating models, understand the business impact of different types of errors.
  • Your North Star will tell you what to prioritize based on the impact each error has on your users.
  • Use your North Star to promote deep understanding of, and empathy for, how users will interact with and perceive your model.
  • Your North Star should not change, or should evolve only slightly, over the course of your development project; OKRs can track the specific targets that support it.

We need to help data scientists bridge the gap between the user experience and the technical performance of the model. This is no easy task: the nature of data science work makes it easy to focus primarily on the details, so the effort to counteract this needs to be constant. When you achieve this, however, the results are transformative.

This is the first post in a series on agile AI. If you want to receive the rest in your inbox, sign up for our newsletter here.

Next time, in our agile AI series, I'll talk about nurturing your AI team.

Read Part 2: Nurturing your AI Team, Part 3: Experimentation and Effort Allocation and Part 4: How Data Can Make or Break Your AI Project


[1] https://venturebeat.com/2019/07/19/why-do-87-of-data-science-projects-never-make-it-into-production
