Ask PMI Anything: How can I manage complex AI projects? – Part Two
25 Mar 2021
Photo by Peter Gombos on Unsplash
Fast-evolving technology areas like Data Science (DS) and Artificial Intelligence (AI) are reshaping how work gets done around the world; PMI’s recent in-depth look at AI, for example, found that project professionals expect the portion of projects they manage using AI to jump from 23 to 37 percent in just three years. But how can organizations make sure they’re most effectively leveraging these tools when they are continually changing?
PMI and NASSCOM, India’s National Association of Software and Services Companies, recently teamed up to develop the Playbook for Project Management in Data Science and Artificial Intelligence Projects, the definitive guide to managing DS and AI projects.
In part two of this conversation—visit here for part one—let’s hear from Snehanshu Mitra of NASSCOM and Srini Srinivasan of PMI about the “how and why” of the playbook, and some best practices for project professionals seeking to unlock the full potential of DS and AI.
Tell us about the framework you’ve developed for managing DS/AI projects.
Snehanshu: The framework was developed based on evidence shared by experienced practitioners in the field and by our own research into companies that have successfully completed DS/AI projects. It consists of two parts: the resources individuals and organizations need to acquire in order to drive DS/AI projects and a best-practices toolkit for each stage of such a project.
We developed an illustration [Figure 3.1 from page 22 of the playbook] to visualize these elements. The resources required—for both individuals and organizations—make up the foundation, while the overarching best practices are the pillars. Together they contribute to driving effective outcomes for DS/AI projects.
Srini: The framework addresses some of the unique challenges of managing DS/AI projects. For example, experimentation is critical in DS/AI projects, and while the framework enables it, it also acknowledges that planning and preparation become much more difficult as a result. The framework also helps teams define and measure success, and refine success metrics over time. And it helps teams establish an organization-wide data strategy, which is especially important because data preparation and model behavior have a direct bearing on project success.
As you can see, the framework breaks the project lifecycle down into five broad stages—from business understanding all the way to closing. This is similar to the stages in CRISP-DM and waterfall. Given the need for experimentation, however, the earlier stages are more unstructured than the later ones, which, in turn, are better suited to iterative practices like agile.
What are some of the specific best practices you recommend?
Snehanshu: First, it’s worth discussing some of the “unstructured” best practices that we recommend in the earlier stages of the framework. The first is to iterate with a suitable timebox and scope defined beforehand. This gives you the freedom to experiment without dictating a uniform cadence or requiring a detailed backlog for each iteration cycle. We recommend timeboxing the overall stage and defining a limited scope up front so the stage doesn’t extend unexpectedly. The agile “spike story” technique may also be used at this point if leadership has mandated agile practices.
Srini: In the Business Understanding stage, we recommend techniques like checklists and Standard Operating Procedures (SOPs). These are usually associated with more structured projects, but 45 percent of the organizations covered in our research are successfully using SOPs in their Business Understanding phase. We also recommend competency mapping to better understand training and resource requirements.
Collaboration is key in the Data Preparation stage, since you’ll need to access data from multiple sources—the business, clients, and the data science and engineering teams. Most organizations fall short here, however, with 80 percent of CIOs citing collaboration challenges and data silos as reasons why AI projects have failed in the past. To aid in this effort, PMI, MIT and the International Council on Systems Engineering (INCOSE) have developed the Model for Interdisciplinary Team Collaboration—a resource for driving collaboration among teams.
Finally, in the Modeling stage, we recommend the “Champion-Challenger” approach to better meet timelines and scope. This involves testing multiple “challenger” models in parallel before settling on one based on business requirements.
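To make the idea concrete, here is a minimal Python sketch of a Champion-Challenger comparison, assuming scikit-learn; the synthetic data, candidate models and accuracy metric are illustrative placeholders rather than anything prescribed by the playbook.

```python
# Minimal "Champion-Challenger" sketch: evaluate several candidate models on
# the same validation split and promote the best one. Assumes scikit-learn;
# the synthetic data and candidate models are placeholders.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2_000, n_features=20, random_state=42)
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.25, random_state=42)

# The current "champion" plus the "challengers" tested against it in parallel.
candidates = {
    "champion_logreg": LogisticRegression(max_iter=1_000),
    "challenger_rf": RandomForestClassifier(n_estimators=200, random_state=42),
    "challenger_gb": GradientBoostingClassifier(random_state=42),
}

scores = {}
for name, model in candidates.items():
    model.fit(X_train, y_train)
    scores[name] = accuracy_score(y_val, model.predict(X_val))

best = max(scores, key=scores.get)  # promote the strongest performer
print(scores, "->", best)
```

In practice the final choice would also weigh business requirements such as latency, explainability and operating cost, not just the validation score.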
What about the later stages of project management? What best practices do you recommend there?
Snehanshu: The hand-off between the modeling and implementation stages is critical. Up to 75 percent of machine learning projects never get beyond the modeling phase, and Gartner has flagged the ability to operationalize a model as a critical “litmus test” of an organization’s maturity.
We believe Machine Learning Operations (MLOps) increases the likelihood of operationalizing a model. It frees up data scientists’ valuable bandwidth, giving them time to build new models, and it reduces the model’s time-to-market by allowing for continuous training.
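As a rough illustration of what continuous training can look like in code, here is a minimal sketch assuming scikit-learn; the F1 threshold, data and retraining logic are simplified assumptions, and a production MLOps pipeline would add validation, approval and deployment steps around them.

```python
# A minimal sketch of the "continuous training" idea behind MLOps: monitor the
# deployed model on fresh labelled data and retrain automatically when its
# performance drops. The threshold and data here are illustrative assumptions.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

F1_THRESHOLD = 0.80  # assumed acceptance bar agreed with the business

def retrain_if_degraded(model, X_recent, y_recent, X_train, y_train):
    """Score the deployed model on recent data; retrain only if it has degraded."""
    live_f1 = f1_score(y_recent, model.predict(X_recent))
    if live_f1 >= F1_THRESHOLD:
        return model, live_f1                     # champion is still healthy
    # Performance has drifted below the bar: fit a fresh candidate without a
    # data scientist in the loop, then hand it to validation and deployment.
    candidate = LogisticRegression(max_iter=1_000).fit(X_train, y_train)
    return candidate, live_f1

# Toy demonstration with synthetic data standing in for production traffic.
X, y = make_classification(n_samples=3_000, random_state=0)
X_train, X_recent, y_train, y_recent = train_test_split(X, y, test_size=0.3, random_state=0)
deployed = LogisticRegression(max_iter=1_000).fit(X_train, y_train)
deployed, observed_f1 = retrain_if_degraded(deployed, X_recent, y_recent, X_train, y_train)
print(f"observed F1 on recent data: {observed_f1:.3f}")
```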
Srini: In terms of the Closing phase, we recommend the 3Es for measuring success: Efficiency, Effectiveness and Experience. These are the main focus areas for measuring customer outcomes. Under Efficiency, you would look at factors like resource usage, including the hours saved or the reduction in human error costs. Under Effectiveness, you would assess the accuracy of your predictions and risk reduction in decision-making. And under Experience, you would measure factors like adoption rates and the customer experience at key moments.
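A hypothetical worked example of a 3Es scorecard might look like the snippet below; every figure is invented purely to show the arithmetic, and real projects would substitute their own baselines and costs.

```python
# Hypothetical 3Es scorecard; every number below is made up to show the arithmetic.

# Efficiency: resource usage, e.g. hours saved converted into cost savings.
hours_saved_per_case, cases_per_month, loaded_hourly_cost = 0.5, 10_000, 40
efficiency_monthly_savings = hours_saved_per_case * cases_per_month * loaded_hourly_cost  # $200,000

# Effectiveness: quality of the model's decisions, e.g. prediction accuracy.
correct_predictions, total_predictions = 9_120, 10_000
effectiveness_accuracy = correct_predictions / total_predictions  # 0.912

# Experience: how the solution lands with users, e.g. adoption rate.
active_users, eligible_users = 1_350, 1_800
experience_adoption_rate = active_users / eligible_users  # 0.75

print(efficiency_monthly_savings, effectiveness_accuracy, experience_adoption_rate)
```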
What resources do you recommend organizations develop to enhance their ability to manage DS/AI projects?
Snehanshu: At the top of our list would be using low-code/no-code tools and employing citizen data scientists (a role that could be further explored through PMI’s Citizen Development suite of offerings). The latter are individuals who work in analytics and data science and are knowledgeable about the field, but who don’t operate at the depth of a data scientist. They can help overcome talent shortages, speed the development of models and facilitate communication between the core data science and business teams.
We also encourage organizations to support an enterprise-wide data strategy so there are protocols and sound governance standards in place for how data is collected, stored, documented and prepared. This is critical to the long-term success of DS/AI projects. To unlock greater efficiencies, organizations should consider investing in data architecture and tools, such as data lakes and integrated data warehouses, as well as in automated data cleaning and preparation tools, such as AutoML.
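As one small illustration of what automating routine data preparation can look like, here is a hedged scikit-learn sketch; the column names, imputation strategies and toy records are assumptions for the example, not a recommendation of specific tooling.

```python
# A sketch of automating routine data cleaning and preparation with a
# reusable pipeline. Assumes scikit-learn and pandas; the schema and toy
# records are illustrative assumptions.
import numpy as np
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.impute import SimpleImputer
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

numeric_cols = ["order_value", "tenure_months"]   # assumed column names
categorical_cols = ["region", "channel"]

prepare = ColumnTransformer([
    ("num", Pipeline([("impute", SimpleImputer(strategy="median")),
                      ("scale", StandardScaler())]), numeric_cols),
    ("cat", Pipeline([("impute", SimpleImputer(strategy="most_frequent")),
                      ("encode", OneHotEncoder(handle_unknown="ignore"))]), categorical_cols),
])

raw = pd.DataFrame({
    "order_value": [120.0, np.nan, 75.5],
    "tenure_months": [12.0, 3.0, np.nan],
    "region": ["north", np.nan, "south"],
    "channel": ["web", "store", "web"],
})
features = prepare.fit_transform(raw)  # imputed, scaled and one-hot encoded
print(features.shape)
```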
What about resources for the individual—either an experienced practitioner or someone new to the field? What do you recommend there?
Srini: For experienced practitioners, we recommend specialized DS/AI project training, covering the basic workflow of DS/AI projects, developing a DS/AI-centric business case, and setting KPIs and evaluating performance. Such training, by the way, is also valuable for development team members like data scientists, or for clients looking to deploy DS/AI solutions. It provides a useful grounding in issues such as roles and responsibilities, key deliverables, timelines and required resources. Collaboration and design thinking skills could be reinforced, for instance, through PMI’s Wicked Problem Solving offering.
New practitioners need to build their knowledge base at more fundamental levels. We suggest training in such areas as:
- Understanding the world of DS/AI, including the business applications of DS/AI projects and the factors that drive the success, failure and complexity of these projects.
- Understanding oneself, i.e. understanding the common motivations for joining the field and the challenges to be faced, as well as assessing one’s knowledge and skill levels.
- Value delivery—understanding the maturity of an organization to take up DS/AI projects, acquiring a mentor or role model and staying up to date with best practices in the field.
Snehanshu: DS/AI project management is a hugely promising arena. By 2023, global spending on AI systems is expected to reach nearly $98 billion, and the value of the transformations they power is estimated to be in the trillions. Experienced project managers will be key to delivering this value, so now is the time to reassess your career goals, sharpen your skills and look for opportunities to be part of the exciting DS/AI world.
Playbook for Project Management in Data Science and Artificial Intelligence Projects
Read Part One of this two-part series here, and check out the full playbook online.
