AI, DATA, AND MACHINE LEARNING ARE ALL TOOLS, NOT ENDS IN THEMSELVES

The revolution in AI and the reliance on data to drive decision making are now key practices in many government and non-government organizations.  Who wouldn’t want tools that help make better decisions?  It’s important to remember, however, that these new, or newer, instruments are just that: tools.  Tools can make a job easier, but they are no substitute for doing the job itself.  Contractors should keep this in mind as they create solutions for federal clients using some or all of these capabilities.

Allowing machines to make our decisions for us can lead to unintended outcomes, especially ones that may be hard to undo.  Ethicists and others are concerned that an overreliance on tools such as AI may erode people’s ability to make final decisions based on tools AND other inputs.  They have speculated that such reliance could lead to wars, unnecessary capitulation, ethnic cleansing, or other grave outcomes.  Congress is now getting involved, with bipartisan legislation expected to be introduced soon that would create a “blue ribbon” panel to make recommendations on managing AI.

None of this is to say that AI, machine learning, and data don’t have their place.  Properly deployed, each can have a positive impact on government missions and on the ability of contractors to support them.  As with any new tool or capability, though, people must resist the urge to think it can be used in any and all circumstances to remove doubt and subjectivity.  Subjectivity, for its part, is already deeply entrenched in government acquisition.  Consider the growing advantage for companies providing Buy American Act-compliant solutions, the preferences given to small businesses, or the cost/performance tradeoffs regularly used to make contract award decisions.  Government procurement would look very different if every decision were driven only by data analysis.

Contractors and their government customers would do well to remember the TV show “Star Trek”:  just because a decision isn’t “logical” doesn’t mean it’s a bad one.