The Path to Understanding: "Be Attentive" in the Age of Generative AI
Cultivating Attentiveness to Foster Empathy and Insight in Generative AI Product Design
In the world of tech and innovation, we are often driven by the pursuit of the next big breakthrough—something that captures the imagination and moves the needle on what's possible. But every act of innovation begins somewhere much smaller: in the data we notice, the patterns we perceive, and the thoughts we allow to surface. I have studied the thought of the philosopher-theologian Bernard Lonergan for years, both in academia and personally, and I think he lays out a profound way of thinking about human cognition. He begins with the imperative "Be Attentive." It's the foundation for everything that follows in his exploration of our experience and understanding.
This post dives into what "being attentive" means in the context of today’s generative AI products, focusing on how attentiveness not only includes the obvious—sense data—but also the crucial elements of emotional data and the data of consciousness. These aspects of attentiveness shape how we understand our world and, by extension, shape how we build tools that make sense of it for others.
The Usual Suspect: Sense Data
Let's start where most of us are comfortable: sense data. When we think about attention, we often think about being observant—taking in the world with our senses. In the tech world, sense data is analogous to the kinds of information that drive machine learning models. For a computer vision system, it's the pixels that compose an image. For a natural language processing tool, it's the tokens and words that build sentences.
For us as innovators and product builders, being attentive to sense data means sharpening our awareness to the things that are presented directly to us—the statistics of user behavior, the graphs showing growth or decline, the everyday experiences of those using our products. This attention to sense data allows us to spot gaps, patterns, and opportunities.
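Attending to sense data can be as simple as aggregating raw usage logs into a few headline numbers. Here is a minimal sketch of that idea; the session records and field names (`duration_s`, `prompts`, `outputs_kept`) are purely illustrative, not drawn from any real product:

```python
from statistics import mean

# Hypothetical session records; field names are illustrative assumptions.
sessions = [
    {"user": "a", "duration_s": 320, "prompts": 7, "outputs_kept": 2},
    {"user": "b", "duration_s": 45,  "prompts": 1, "outputs_kept": 0},
    {"user": "c", "duration_s": 610, "prompts": 12, "outputs_kept": 5},
]

def summarize(sessions):
    """Aggregate the 'sense data' of usage into a few summary metrics."""
    return {
        "avg_duration_s": mean(s["duration_s"] for s in sessions),
        "avg_prompts": mean(s["prompts"] for s in sessions),
        # Fraction of generated outputs the user actually kept.
        "keep_rate": sum(s["outputs_kept"] for s in sessions)
                     / sum(s["prompts"] for s in sessions),
    }

print(summarize(sessions))
```

Summaries like this are where most teams stop; the point of the rest of this post is that they are only the first layer of attentiveness.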
But being attentive goes far beyond this. In Lonergan's philosophy, to "be attentive" is to open oneself to all data, not just the data that is neatly organized or obviously relevant. This includes the elements that machines still struggle to quantify: emotions, intuitions, and conscious reflection.
The Often Ignored: Emotional Data
Emotional data is the rich undercurrent that shapes human experiences. Imagine a product team analyzing the interaction between a user and a generative AI model. They may focus primarily on metrics like session duration or user engagement. But what about the user's emotional journey?
Perhaps the user felt excitement, curiosity, or even frustration. Emotional data is often qualitative—embedded in user testimonials, comments, and behaviors that don’t fit neatly into rows and columns. Being attentive here means actively noticing and valuing these dimensions. It’s the effort to understand the anxiety a user feels when they interact with an opaque AI or the delight when a model produces something surprisingly creative.
Incorporating emotional data into product development requires more than numbers; it demands empathy. A generative AI tool might technically meet user needs but still fail to engage emotionally, leading to adoption challenges. Emotional attentiveness often reveals these subtler truths—insights that quantitative data can miss.
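One modest way to start noticing emotional data at scale is a simple lexicon pass over free-text feedback. The sketch below is deliberately crude, and the cue words and categories are my own illustrative assumptions; real emotional attentiveness means actually reading the feedback, not just scoring it:

```python
# Illustrative emotion categories and cue words; these are assumptions,
# not a validated sentiment lexicon.
EMOTION_CUES = {
    "frustration": {"confusing", "stuck", "frustrating", "gave up"},
    "delight": {"love", "surprised", "delightful", "wow"},
    "anxiety": {"worried", "unsure", "opaque"},
}

def tag_emotions(comment: str) -> list[str]:
    """Return the emotion categories whose cue words appear in a comment."""
    text = comment.lower()
    return [emotion for emotion, cues in EMOTION_CUES.items()
            if any(cue in text for cue in cues)]

feedback = [
    "I love how it surprised me with the second draft",
    "The settings were confusing and I gave up",
]
for comment in feedback:
    print(comment, "->", tag_emotions(comment))
```

A heuristic like this is a prompt for attention, not a replacement for it: its value is in surfacing which comments deserve a closer, more empathetic read.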
The Depths of Attention: The Data of Consciousness
Perhaps the most challenging, and most transformative, aspect of being attentive is paying attention to the data of consciousness itself. This isn’t just about being aware of our thoughts but understanding the processes of thinking: What questions are we asking? Why are we asking them? What presuppositions shape the way we frame a problem?
Consider the process of designing a new AI-driven product. There’s a layer of attentiveness that goes beyond brainstorming features or running user tests—it’s an attentiveness to the biases and frameworks that are guiding these processes. Are we approaching the problem in a way that’s too constrained by current paradigms? Are we listening not only to user feedback but also to our own assumptions about what users should want?
Lonergan’s idea of attentiveness encourages us to take a step back and observe our own cognition as it unfolds. For product teams, this might mean reflecting not only on user personas and needs but also on the cognitive journey taken in formulating those personas—how much have we allowed ourselves to be surprised by the data, versus how much have we imposed our preconceptions on it?
Generative AI and the Expanded Scope of Attentiveness
To illustrate this broader view of attentiveness, let's take a generative AI product as an example—perhaps one that generates images from text prompts. Sense data comes into play in obvious ways: the user inputs text, and the model, trained on billions of image-text pairs, creates an output that reflects the statistical patterns it's learned.
But being attentive means looking beyond this simple input-output dynamic. Consider the user's emotional data: What kinds of images does the user seem to favor? Do they seek images that evoke comfort or excitement, realism or fantasy? The data of consciousness is even subtler: What deeper purposes are motivating the user to engage with this tool? Are they exploring creative possibilities, seeking inspiration, or trying to solve a practical problem?
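Even the subtler question of user purpose can be given a rough first pass by tallying hints in prompt wording. In this sketch the intent categories and hint phrases are hypothetical assumptions of mine, meant only to show the shape of the exercise:

```python
from collections import Counter

# Hypothetical mapping from intent categories to prompt-wording hints.
INTENT_HINTS = {
    "explore": {"surreal", "dreamlike", "abstract"},
    "inspire": {"moodboard", "palette", "in the style of"},
    "solve": {"logo", "banner", "thumbnail"},
}

def infer_intents(prompts):
    """Tally which broad intents the prompt wording seems to hint at."""
    counts = Counter()
    for prompt in prompts:
        text = prompt.lower()
        for intent, hints in INTENT_HINTS.items():
            if any(hint in text for hint in hints):
                counts[intent] += 1
    return counts

prompts = [
    "a surreal city floating in clouds",
    "minimal logo for a coffee brand",
    "moodboard palette for an autumn campaign",
]
print(infer_intents(prompts))
```

As with emotional data, a tally like this can only point toward the data of consciousness; the deeper "why" behind a prompt still has to be explored in conversation with users.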
Product teams can deepen their design approaches by being attentive to all these layers of user experience—moving beyond usability metrics into understanding how products intersect with users' mental and emotional landscapes. This kind of attentiveness doesn’t only improve products; it also helps innovators to uncover insights that might otherwise remain hidden.
Attentiveness in Practice
What does this mean for tech leaders and innovators today?
Expanding the Scope of Data: Attentiveness in Lonergan’s sense requires looking beyond what is easily measurable. Sense data alone provides only a partial picture. To truly understand user experiences, we need to actively gather emotional data—through qualitative research, user stories, and reflective team discussions.
Encouraging Cognitive Self-Awareness: Teams need time and space to reflect on their processes. Which data points are being elevated, and why? Where might we be applying blinders to our thinking? Product teams can incorporate reflective practices—workshops or retrospectives that focus not only on what was built but on how decisions were made.
Fostering Empathy-Driven Design: Emotional attentiveness naturally leads to greater empathy. Building products that resonate requires understanding users not only as data points but as people with unique, often unpredictable, emotional journeys.
Further Reading and Reflections
Everything in this series of seven posts is drawn ultimately from Bernard Lonergan's masterwork "Insight: A Study of Human Understanding". It can be an imposing tome, but there's an excellent free course that works through the entire thing here.
Further insights into this post's subject matter, more approachable though less comprehensive, can also be found in:
Clear Thinking by Shane Parrish
How Emotions are Made by Lisa Feldman Barrett
Conclusion
Being attentive is not simply about gathering more data—it’s about broadening the types of data we take in and understanding the subtleties that shape both the external world and our internal experience of it. In the context of generative AI, and indeed any tech product, this expanded attentiveness is what allows us to create tools that are not only functional but also meaningful. It’s about seeing with fresh eyes, attending to what might otherwise be ignored, and, ultimately, enabling innovation that resonates at a deeper level.
In the next post, we will look at the next guiding cognitive principle: "Be Intelligent." If being attentive is about gathering and noticing, being intelligent is about interpreting and making sense. Stay tuned as we continue to unfold Lonergan's rich framework—and see how it applies to our work as creators and innovators today.