In machine learning, "uncertainty" describes the margin of error of a given measurement as a range of values most likely to contain the "true" data value. A critical cultural approach to digital assistants reframes uncertainty as a strategy of inquiry that foregrounds the range of cultural values embedded in digital assistants. This is particularly useful for exposing what sorts of ideological "truths" are enclosed and/or foreclosed as part and parcel of the design, implementation, and use of these technologies. Exploring the anthropomorphic design of digital assistants through feminist and critical race lenses requires us to confront how dominant ideologies about race, gender, and technology form a kind of cultural infrastructure that undergirds technology design and practice. From this perspective, uncertainties emerge about the "common sense" of anthropomorphic design of digital assistants, particularly surrounding how this design strategy is employed in ways that target vulnerable communities at the behest of state, corporate, and commercial interests.