Nightingale — The Journal of the Data Visualization Society

Building Tableau Dashboards for the PowerPoint Download
https://nightingaledvs.com/building-tableau-dashboards-for-the-powerpoint-download/
Thu, 26 Mar 2026 12:00:00 +0000

Working in reporting and analytics for the last six years has made me realize an uncomfortable truth about Tableau: Your beautiful interactive dashboard will often become a static PowerPoint slide.

If you work in sales ops, finance, or any executive-facing analytics team, you already know this. Your vice president won’t open Tableau Server at 9 a.m. before the board meeting. They’ll download your dashboard as an image or a PowerPoint file, paste it into slide 17, and present it to the C-suite.

Once I accepted this reality, I started treating the PowerPoint download as a design problem in its own right. Here are five non-negotiable lessons I learned on my Tableau journey.

The first Excel dashboard, created in 1990 using the first version of Excel for Windows. Source: Microsoft

1. Design for PowerPoint From Day One

Device preview matters far more when your dashboard will live in a PowerPoint deck.

In the early stages of redesigning an executive-level sales report, I built my dashboard in Tableau’s default “Desktop Browser” view. When I downloaded it as a PowerPoint file, it was crushed into a single slide with illegible text — a formatting disaster right before a leadership presentation.

The fix here is using Tableau’s built-in PowerPoint layout (16:9 aspect ratio) from day one.

Source: Rituparna Das

This ensures your dashboard fits perfectly into standard Google Slides or PowerPoint without awkward cropping or white space. Don’t design for Tableau’s default dimensions — design for where your dashboard will actually be consumed.

Pro tip: Always test your export before the final version. Click “Dashboard > Export as PowerPoint” to preview exactly what stakeholders will see.

2. Accept That 80% of Functionality Disappears

This is the hardest lesson: You must build assuming zero interactivity.

What dies in PowerPoint:

  • Filters (static view only)
  • Parameters (whatever was selected during download)
  • Hover tooltips (invisible)
  • Drill-downs (gone)
  • Dashboard actions (non-functional)

This changes your design strategy. Now you have to build a static version for each filter setting your users will want to view. For example, my executives wanted to see pipeline performance across sales regions, sales clusters, business units, and product lines. What would have been one dashboard filter became four separate dashboards I had to create:

  • “Pipeline_Review_by_Sales_Region”
  • “Pipeline_Review_by_Sales_Cluster”
  • “Pipeline_Review_by_Business_Unit”
  • “Pipeline_Review_by_Product_Line”

Yes, it’s more work. Yes, it feels redundant. But it’s the only way to ensure your stakeholders see what they need without interactivity.
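
Keeping those parallel dashboards consistent is easier if you script the enumeration itself. Here is a minimal, hypothetical sketch — the names simply mirror the list above; nothing here is a Tableau API:

```python
# Hypothetical helper: when one interactive filter must become N
# pre-filtered static dashboards, generate the names in one place
# so they stay consistent across the workbook.
def static_versions(base_name, filter_dimensions):
    """Return one dashboard name per filter dimension."""
    return [f"{base_name}_by_{dim}" for dim in filter_dimensions]

dims = ["Sales_Region", "Sales_Cluster", "Business_Unit", "Product_Line"]
for name in static_versions("Pipeline_Review", dims):
    print(name)  # Pipeline_Review_by_Sales_Region, ...
```

A list like this doubles as a checklist for the export step: one download per name, no filter state forgotten.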

Every critical insight must be visible on page load. If it requires a click to reveal, assume it will never be seen.

3. Use Containers for Layout Control

When your dashboard contains multiple visualizations, containers keep everything locked in place during the PowerPoint export. Without them, floating objects shift unpredictably — your perfectly aligned KPI cards end up overlapping your bar chart in the downloaded version.

PowerPoint downloads don’t tolerate white space. A minimalist Tableau dashboard might look elegant on screen, but it looks unfinished and unprofessional in a deck. Executives expect dense, information-rich slides.

Why containers solve both problems:

  • They lock your layout in place (no shifting elements)
  • They help you maximize space efficiently (no awkward gaps)
  • They give you precise control over how information flows

Source: Rituparna Das

This dashboard exports with excessive white space, making it look unprofessional in decks.

Best practice workflow:

  1. Create a low-fidelity mockup of your dashboard layout
  2. Build the container structure first (horizontal and vertical containers)
  3. Drop visualizations into containers last

Pro tip: Watch this Tableau container best practices video before building your next dashboard — it’ll save you hours of reformatting frustration.

4. Establish Governance Standards for Version Control and Collaboration

If you’re working collaboratively or managing multiple dashboard versions, implement a simple visual system:

Source: Rituparna Das

Use the color coding available for dashboards:

  • 🟢 Green: Production-ready, safe to download
  • 🟡 Yellow: Work in progress, do not present
  • 🔴 Red: Draft/testing only

Keep consistent and clear worksheet naming conventions. This will save your sanity.

❌ DON’T: “Bookings (1)”, “Bookings (1)(1)”, “Sheet 3”
✅ DO: “Q4_Bookings_Final”, “Pipeline_Review_v3”, “Pipeline_Coverage_BarChart”
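
If you want to enforce DO-style names mechanically, a quick check is easy to script. The convention below (capitalized word groups joined by underscores, e.g. an optional version suffix) is a hypothetical rule of thumb, not a Tableau feature:

```python
import re

# Hypothetical convention: a capitalized first token, then one or more
# underscore-separated tokens, e.g. "Pipeline_Review_v3".
NAME_RE = re.compile(r"^[A-Z][A-Za-z0-9]*(_[A-Za-z0-9]+)+$")

def check_names(names):
    """Return the worksheet names that violate the convention."""
    return [n for n in names if not NAME_RE.match(n)]

bad = check_names(["Q4_Bookings_Final", "Pipeline_Review_v3",
                   "Bookings (1)", "Sheet 3"])
print(bad)  # -> ['Bookings (1)', 'Sheet 3']
```

Run it against a worksheet list before publishing and the default-named stragglers surface immediately.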

5. Add Company Logos

Align as closely as possible to your organization’s standard slide deck template.

Why this matters: Your dashboard might be internal today, but it’ll be in a client presentation tomorrow. When your VP forwards it externally without asking you first (and they will), professional branding matters.

Where to place logos:

  • Top-left or top-right corner (consistent with company templates)
  • Footer with date/data source
  • Consider adding a “confidential” watermark for internal metrics

The Bottom Line

The moment you accept that your Tableau dashboard will become a PowerPoint slide, you start designing better dashboards.

Stop optimizing for interactivity. Start optimizing for screenshots.

Use the 16:9 layout. Build static versions of filtered views. Lock everything in containers. Name your worksheets like a professional. Add your company logo.

Your stakeholders don’t care about your elegant parameter actions if they can’t paste your dashboard into their Monday morning deck.

Sometimes being a great analyst means accepting that your masterpiece will be Ctrl+C’d, Ctrl+V’d into slide 23 — and designing for that reality from the start.

Categories: How To

The post Building Tableau Dashboards for the PowerPoint Download appeared first on Nightingale.

The Tiles That Made Me: Mapping Friendship through the Lens of AI
https://nightingaledvs.com/the-tiles-that-made-me/
Thu, 19 Mar 2026 12:00:00 +0000

According to the Oxford Dictionary, friendship is a “voluntary, personal relationship characterized by mutual affection, trust, and support.” To me, though, friendship is an authentic, trustworthy partnership built on fun, kindness, and understanding.

It’s the size of the smile on your face when you see someone. It’s the decision to stay in touch with a niece long after family events end. It’s the fragile silence between you and a friend who couldn’t support a recent life choice.

As a data designer, I’ve always been obsessed with how we categorise the intangible. Recently, I set out to map the people who have shaped me. I didn’t want a balance sheet, but I did want to see the patterns. A relationship always evolves; this would only represent a snapshot in time.

The Taxonomy of Connection

I began by listing every person I care about. First from memory, then verified by my friends list on Facebook. But as I opened my spreadsheet, the questions started to flood in. Can family members count as friends? For example, my nieces and I have been chatting nonstop for years now. We grew fond of each other through the circumstance of birth, but we stayed in touch by choice. Does that make them friends? And what about friends who aren’t supportive of my life choices? We were very close 7-8 months ago, but we are not now. Are we still friends? If I exclude her from this, does that mean I have given up on our friendship? Also, I use the term “friend” very loosely. I am naturally familiar with strangers. Is my new neighbour — with whom I have shared a few cups of tea — my friend?

To make sense of the friend list, I distilled friendship into three core metrics, scored on a scale of one to three, three being the highest rank possible: 

  • Reliability: Loyalty, faithfulness, and the feeling of being safe.
  • Empathy: Supportiveness, kindness, and open communication.
  • Joy: Playfulness, liveliness, and shared common ground (though one might question whether friendship is required for common ground; for the sake of this visualisation, I decided it was).

I also added two judgment values: Duration (how long we have been friends), and Contact (how recently we spoke). To keep the data honest, I limited the scope to friends I had contact with in the last 24 months. I chose 24 months as the cutoff because it’s the period since my daughter was born. Spoiler alert: In a time when I often felt lonely as a new mother, the data showed me I was actually deeply loved.

From Sketching to Scripting

In my notebook, the design evolved rather quickly into a series of “tiles.” I remember having the visual in my head for a while, and I felt as if I were a vessel letting it out onto the paper. I wanted something that would represent the scale’s levels easily. Level one was a simple base; level three added complex detail. 

Source: Or Misgav

Initially, I used background colors to denote duration, but the palette was too loud. It made the story about “how good I am at making friends” rather than “how these friendships built me.”

Source: Or Misgav

Then came the pivot. Usually, I build these visualizations by clicking the mouse. A thorough process of copying, pasting, and double-checking layers in Illustrator and Figma would easily take three hours. But, inspired by the “vision to execution with a click” movement, I turned to Claude and Gemini.

I asked Gemini to help me write the prompt for Claude. Claude then generated a Python script that processed my Excel file and produced stacked layers as PNG files. It also taught me how to install Python on my Mac. (Honestly, I felt like I was back in the 90s, typing into a terminal to launch a game.) Then, “Boom. Your tiles are ready.” With a single click, the assets were generated. A few back-and-forths with Claude, and the grid was aligned. The work was done.

Source: Or Misgav
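
The script itself isn’t reproduced here, but its core move — turning each person’s scores into a stack of image layers — can be sketched in a few lines of Python. Every name below (the `Friend` record, the file names, the one-layer-per-level stacking rule) is a hypothetical reconstruction, not the actual script:

```python
from dataclasses import dataclass

# Hypothetical record mirroring the three metrics, each scored 1-3.
@dataclass
class Friend:
    name: str
    reliability: int  # 1-3
    empathy: int      # 1-3
    joy: int          # 1-3

def layer_stack(friend: Friend) -> list[str]:
    """Map each metric score to the PNG layers that build one tile.

    A score of n contributes layers 1..n, so level one is a simple
    base and level three stacks the most detail, as in the sketches.
    """
    layers = ["base.png"]  # every tile starts from the same base
    for metric in ("reliability", "empathy", "joy"):
        score = getattr(friend, metric)
        layers += [f"{metric}_level{i}.png" for i in range(1, score + 1)]
    return layers

print(layer_stack(Friend("Noa", reliability=3, empathy=1, joy=2)))
```

A real version would then composite those PNGs with an image library and write one finished tile per friend.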

The Cost of Efficiency

As I looked at the finished folder, a strange feeling washed over me: I didn’t recognize the data. By automating the execution, I had accidentally bypassed the data familiarization stage — that meditative hour where you handle each data point with care and remember the person behind it. The tiles were beautiful, but they felt distant.

It raised a fundamental question for our field:
If the AI builds the layers, are we co-creators? Or are we just curators of our own memories?

End Result. Source: Or Misgav
How to read. Source: Or Misgav

The Tokens of Gratitude

Despite the digital distance, the final grid is a testament to my life. These tiles are me. They represent the people who stayed through puberty, the ones who signed my wedding book, and the new friendship that began when I collected my son from preschool and has since grown close.

This project is more than a visualization; it’s a token of gratitude. It captures a snapshot of my soul as it exists in 2026. Shaped by humans, rendered by machines, and held together by the voluntary, personal relationships that make life worth mapping.

Categories: Data Art

The post The Tiles That Made Me: Mapping Friendship through the Lens of AI appeared first on Nightingale.

Trends, Aesthetics, and Individuality: How the Internet Irrevocably Changed Fashion
https://nightingaledvs.com/trends-aesthetics-and-individuality/
Tue, 10 Mar 2026 14:17:29 +0000

Close your eyes, and picture an outfit from the 1980s. Now, the 1990s. The 2000s. Chances are you thought of perms and shoulder pads first, then grungy flannels and preppy streetwear, before finally thinking of low-rise jeans and velour tracksuits. 

But if I were to ask you to picture something from the 2010s, that answer might range anywhere from colored leggings to checkered Vans. That range gets even wider when we look at the 2020s so far.

We used to have a very clear idea of which styles belong to which decade, but that distinction has grown increasingly muddy in the last fifteen to twenty years. We’ve lost the pattern of one or two iconic styles defining each decade, and it shows in our preferences: the 80s, 90s, and 2000s—decades with only a handful of predominant styles—rank highest when respondents are asked for their favorite fashion decade.

On average, 9.5% of respondents favored the 80s, 11.25% favored the 90s, and 8.25% favored the 2000s. 

Even when we abandon the idea of “favorites,” those decades still rank the highest when respondents are asked how fashionable they find each decade. Repeatedly, the 2010s and 2020s rank lowest on average when it comes to being fashionable decades with a defined sense of style.

Fashion trends are increasingly speed-running their usual five stages: introduction, rise, peak, decline, and obsolescence. Instead of the usual fifteen-to-twenty-year cycle, we’re now seeing trends rise and fall within a matter of months. What happened?

The Internet.

Internet usage has increased across all generations over the last 25 years, and with it, our access to fashion inspiration outside of current pop culture. Instead of fashion trends being born on a runway and trickling down through magazines, movies, and music videos to the general public, modern teenagers and young adults are finding their new favorite styles on their For You and Explore pages, with 42% of Gen Z listing social media as their main source of fashion inspiration.

While expressions of individuality and personality have always been a priority when it comes to fashion, younger generations now feel that burden more acutely due to their exposure to the world online. Pre-internet, you knew the people in your town, and you knew the familiar movie and music stars. It was normal for everyone to take fashion inspiration from the screen, like when Dirty Dancing had everyone in leotards, or when Top Gun boosted aviator jacket sales. In the digital era, you have access to the whole world. 

That’s not an exaggeration, either. Out of the 8 billion people on Earth, more than 5.17 billion use social media and spend an average of over 2 hours scrolling every day. Instagram and TikTok are the top platforms for young adults, with 89% of Gen Z users on Instagram and 82% on TikTok. 

Breaking those audiences down makes it even more jarring to realize how many people we’re seeing on our screens now. Instagram alone has 3 billion monthly active users, and nearly a third of them are 18-24 year olds. TikTok is no different, with a majority of its 1.9 billion monthly active users being Gen Z. Additionally, out of Pinterest’s 553 million monthly active users, 42% of them are Gen Z, often searching specifically for style inspiration.

With these sorts of numbers, it’s not outrageous to assume that a young adult in 2026 will see thousands of strangers online every day. More often than not, they’ll see these strangers jumping onto the same trends they are, but when the whole world is following the same trends, how is anyone meant to feel like an individual? How are the 71% of Gen Z’ers that prioritize personality in their style meant to feel like they’re unique?

It seems their answer is a wider range of hyper-specific aesthetic niches. A reader who isn’t chronically online might take the word aesthetic to be an adjective describing something “concerned with beauty or the appreciation of beauty,” or perhaps a noun for “a set of principles underlying and guiding the work of a particular artist or artistic movement.” In the modern online fashion world, it means something a bit more distinct.

A clothing aesthetic in 2026 can be defined as, “your personal style or the overall vibe your outfits create. It’s the visual theme that ties your wardrobe together, from colors and patterns to the types of pieces you wear,” according to Copenhagen Fashion Summit. Included with the site’s definition are no fewer than 42 different aesthetics, such as Soft Girl, Clean Girl, Streetwear, Fairycore, Cottagecore, Witchcore, both Light and Dark Academia, and many others.

Fairycore. (Source: Cris Ramos)
Dark academia. (Source: Murat Esibatir)
Cottagecore. (Source: Eugenia Sol)

Cottagecore might be one of the most popular; it emerged back in 2019 and is essentially a romanticization of rural life. Cottagecore styles include warm and earthy colors, flowy dresses, puffed sleeves, and cardigans, while activities include gardening, crocheting, and baking bread. Overall, it’s a cozy, peaceful aesthetic that prioritizes comfort. While the general trend might’ve died a few years ago, Cottagecore has quietly lived on past its hype, as many of these aesthetics tend to do.

Since Cottagecore’s heyday, aesthetics have gotten even more specific. Depop, a popular clothing resale platform, posted their 2024 Trend Report, and the following ‘core’ styles had some of the highest search volume increases: “Contemporary classics,” “Minimalist renaissance,” “Retro sportswear,” and “Indie vanguard.”

Contemporary classics is defined as an “updated take on ‘old money’” in the report, reviving preppy styles by blending Ivy League style with countryside vibes. Brands like J.Crew and Ralph Lauren are named as the leaders here, with Depop saying the aesthetic “reflects a yearning for stability and reliability.” 

The Minimalist renaissance is a return to “understated elegance,” according to Depop, and is focused on clean lines, neutral colors, and classics like cashmere and tailored coats. This aesthetic has a specific focus on craftsmanship and dedication to timeless taste.

Retro sportswear follows the more traditional trend pattern of recycling from decades prior, and pulls from 80s windbreakers and 90s athletic styles, combining them with modern flair for nostalgic yet practical outfits. This specific style’s increase could be attributed to the rise in popularity of casual sports like pickleball in recent years. 

And finally, Indie vanguard is described as “bold reimagining of 2010s indie sleaze and hipster culture,” combining grunge and punk styles with the early 2000s. Think band tees paired with knee-high boots and boas. Even better, think Charli XCX’s style during her “Brat” era from the summer of 2024.

Now, is the rise of aesthetics a bad thing? In a general sense, I don’t think so, but there is an important caveat. It’s completely fine that younger generations have traded an agreed-upon “uniform” of sorts for specific, sometimes eccentric wardrobes. What we consider to be “normal” changes constantly, and what was normal for trends thirty years ago just isn’t normal anymore.

However, with trends moving as fast as they do, there are significant production concerns, especially the effects on the environment. Fast fashion—the manufacturing process concerned with mass-producing clothing to keep pace with trends—eats through fossil fuels with its use of polyester and contributes up to 10% of annual global carbon emissions, only for the clothes to end up in landfills at best, and our oceans at worst.

This issue, though, might be reaching its turning point. Younger shoppers are beginning to prioritize sustainable clothing practices, and the secondhand clothing market value is going up. Even when we look at Shein, one of the most notorious fast fashion brands, its downloads were cut nearly in half between 2024 and 2025. This in no way diminishes the threat and consequences of trendy and unsustainable clothing, but it might be the beginning of the way out. 

Trends have always been part of the fashion world, but once we got the internet, they became something entirely new. When nearly everyone on Earth is able to search for fashion inspiration online, you trade a handful of decade-defining styles for a thousand niche aesthetics that live on beyond their trend cycles. The earth might not have ended with Y2K, but a new world of fashion and individuality was certainly born.

The post Trends, Aesthetics, and Individuality: How the Internet Irrevocably Changed Fashion appeared first on Nightingale.

The Back of the Painting: On Structure, Integrity, and Data Visualisation
https://nightingaledvs.com/the-back-of-the-painting/
Tue, 17 Feb 2026 16:41:46 +0000

In the early 1420s, Fra Angelico, a Dominican friar and painter, completed his first large-scale work for the newly built monastery at San Domenico. The San Domenico Altarpiece is one of the Early Renaissance’s defining works and adorns the high altar where the friars once sang their hymns during the Divine Office. Last year, the altarpiece was removed for restoration and featured in a major exhibition across Florence. On the front, the polyptych depicts four haloed saints in a single unified space, each attentive to the Virgin and Child. The Virgin and Child are themselves surrounded by angels with vibrant multi-coloured wings, their feathers shifting through a prismatic palette that is particularly iconic of Fra Angelico’s work.

San Domenico Altarpiece by Fra Angelico. (Source: Web Gallery of Art)

To look at the back of the high altarpiece, however, is to see an intricate collage of wood from various centuries. It serves as a physical record of how the work has been altered as tastes have changed over time. In the seventeenth century, carpenters recut the original panels and added new wood to force the piece into a rectangle. Beechwood inserts, shaped like butterflies, and crossbeams cover the surface, running against the natural grain. Poplar meets beechwood, intersecting in different directions, each species moving discordantly with humidity and the passage of years.

Roberto Buda, a conservator who specialises in wooden panel paintings, spent close to nine months stabilising the altarpiece’s structure. Working with his team, he removed the existing crossbeams and butterfly-shaped inserts, replacing them with carefully matched old poplar wood infills aligned parallel to the wood’s grain. A new frame was added with conical springs that allow the wood to move naturally. “It’s a house,” Buda told the Financial Times during the restoration. “If you don’t have a good foundation, it doesn’t hold up. The painting will never look good if the support is not right.” 

Months later, as I sat at my laptop placing an axis in the centre of the page, I thought again about this quote.

In 2025, I marked a deliberate transition in my career. During my PhD in experimental neuroscience, I learned to do many things at once. I built hardware and software. I designed experiments. I ran those experiments, analysed the data, visualised the results, wrote papers, and taught students. Academia rewards this kind of breadth, and a range of technical skills accumulates quickly. Yet I found myself most engaged at the very end of the workflow, sitting with a dataset that had not yet been interpreted. I wanted to slow down and look for the narrative in the data. To focus not only on results but on how those results are communicated—clearly, honestly, beautifully.

In academic research, figures are often produced in haste, appended at the end of the pipeline. There is a script, a deadline, a familiar plotting function. In Python, with the visualisation library Matplotlib, you can call plt.bar(), and a chart appears. Microsoft Excel goes further still, delivering a fully formed graphic with colours and proportions chosen on your behalf. I wanted to build visualisations with greater intention and technical freedom, and this is what led me to the open source JavaScript library, D3.js.

D3 stands for Data-Driven Documents and is a low-level library which uses the full capabilities of web standards such as CSS, HTML, and SVG to build sophisticated and interactive data visualisations. While other visualisation tools hand you a bar chart or a scatterplot, to represent data in D3 you must manually calculate the scales, define the coordinate system, and bind the data to a graphical element. You must decide exactly where an axis sits and how a margin breathes, what a data point is—a circle, a path, a mark—and how it behaves when the data changes. Nothing appears unless you build it. D3 is a workshop full of raw timber and hand saws.
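
To make “manually calculate the scales” concrete, here is the arithmetic a linear scale performs, sketched in Python for brevity rather than JavaScript. It mirrors what D3’s `scaleLinear` computes; this is a hypothetical minimal version, not D3’s actual implementation:

```python
def linear_scale(domain, range_):
    """Return a function mapping data values to pixel positions.

    A linear scale is the line through (d0, r0) and (d1, r1): it
    normalizes a value within the data domain, then rescales that
    position into the pixel range.
    """
    (d0, d1), (r0, r1) = domain, range_
    def scale(value):
        t = (value - d0) / (d1 - d0)  # 0..1 position within the domain
        return r0 + t * (r1 - r0)     # same position within the range
    return scale

# Map a data domain of [0, 100] onto a 600-pixel-wide plot area.
x = linear_scale((0, 100), (0, 600))
print(x(50))  # midpoint of the domain -> midpoint of the range: 300.0
```

In D3 this one decision — which domain maps to which range — must be made explicitly for every axis, which is exactly the structural work the essay describes.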

With this in mind, I applied to the Data Visualisation Society’s mentorship program, intent on learning D3. Under the guidance of my brilliant mentor, Sam Bloom, I spent ten weeks at the end of 2025 working through the library’s fundamentals and building an interactive visualisation. We focused on first principles before developing an interactive scatterplot to explore Ancient Greek colour perception. Progress was slow at first because the learning curve was steep, but as I learned to build in D3 and perform this kind of digital carpentry, visualisation began to resemble construction. Every line of code was doing structural work.

Figures included in this essay show examples of the interactive scatterplot, which examines the sensory dimensions of Ancient Greek colour by focusing on the major colour adjectives used by Homer in the Iliad. The Ancient Greek experience of colour was inseparable from motion and shimmer. Colour was a basic unit of information which reflected the natural world—encoding brightness and darkness as fundamental dimensions. Greek colour terms prioritised not only luminosity but the play of light across surfaces, the texture of materials, even the social standing implied by a sheen or shade. It was a colour vocabulary rooted in their lived perception, rather than the modern hue-based categories we use today.

Selected excerpts from my D3 code illustrate how each visual element is constructed, for example, the multiple lines of code required to precisely position and size tick marks along each axis. The full project can be viewed here.
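
Tick placement is a good example of that hand-work. In D3, `scale.ticks()` and `d3.axisBottom` handle it; the sketch below (in Python for brevity, a simplified stand-in for the real algorithm) shows the underlying idea of choosing “nice” round tick values across a domain:

```python
import math

def ticks(lo, hi, count=5):
    """Choose round tick values covering [lo, hi].

    Simplified version of the usual tick heuristic: pick a step that
    is a power of ten times 1, 2, or 5, then emit multiples of it.
    """
    raw_step = (hi - lo) / count
    power = 10 ** math.floor(math.log10(raw_step))
    for mult in (1, 2, 5, 10):
        if raw_step <= mult * power:
            step = mult * power
            break
    v = math.ceil(lo / step) * step
    values = []
    while v <= hi + 1e-9:
        values.append(round(v, 10))
        v += step
    return values

print(ticks(0, 100))  # -> [0, 20, 40, 60, 80, 100]
```

Each returned value then passes through the scale to get its pixel position — which is why a single axis can take many lines of deliberate code.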

When a reader encounters a clean scatterplot, they see only the front of the painting. They don’t see the scaffolding: the decisions about scale domains or the choices about what not to encode. While the interactive scatterplot I built at the end of the ten weeks was modest, I could explain why each element existed and how it related to the data. Each decision—scale, colour, interaction—could be justified. Good data visualisations often look deceptively simple. But this clarity is the result of many intentional decisions about the data and the visual design. 

For example, ~90% of the charts the Financial Times publishes are bar charts or line graphs, yet because these charts adhere to a defined set of design principles, down to the very placement of the title and the subtitle, the FT’s graphics are some of the most recognisable in newsroom data visualisation. This coherence is maintained through meticulous style guides, which dictate everything from the weight of an axis line to the specific hex code of a categorical blue. These guides function as a visual vocabulary or grammar. Alan Smith, the FT’s Head of Visual and Data Journalism, who led the design of the FT’s visual vocabulary, has previously championed the idea that a chart should be as readable as a sentence. Alberto Cairo, a professor of visual journalism at the University of Miami, has often argued that the most important part of a visualisation is the “reasoning” that happens before the first pixel is placed. In his book, The Art of Insight, he argues that there are really no rules of data visualisation, there’s just reason. Every design choice must be a defensible, rational response to the data and the intended audience.

These ideas are not confined to style guides or theory; they are just as powerful when applied to animation and interactivity in visualisation. When such principles are applied with narrative intent, even complex data can be immediately comprehensible to an audience. A widely cited example of the power of a simple but intentional use of data visualisation is Hans Rosling’s 2006 TED talk. Rosling revealed patterns in a complex dataset through an animated scatterplot in which countries appeared as circles, mapped by measures such as life expectancy (on the x-axis), countries’ GDP (on the y-axis), and population (the size of the circle). As the animation unfolds, these circles shift across the axes allowing long-term trends to emerge gradually rather than all at once. Rosling paired this animation with carefully selected narration and emphatic gestures to guide his audience to the most meaningful changes as they occurred. The result was a complex global health story made easy to understand through intentional narrative decisions and clear visual structure.

The painted surface of Fra Angelico’s altarpiece is inseparable from its support. The relationship between the painted surface, the underlying preparation, and the wooden support beneath, makes the altarpiece a three-dimensional object rather than a flat image viewed only from the front. The butterfly-shaped beechwood inserts, which were set against the direction of the grain, introduced stresses that increased the risk of cracking, jeopardising the paint layer.

A data visualisation is a three-dimensional object of logic. If the underlying structure is weak, if scales are arbitrary or axes misleading, the surface won’t stand up to scrutiny. The narrative ‘paint’ (the colour palette, the interactivity, etc) will eventually crack. For example, decisions about axis scales depend on what counts as meaningful in a given context. Although it is often suggested that a y-axis should begin at zero to preserve proportional accuracy, this convention can obscure important variation when the relevant changes are small, as is often the case with climate data. A review by Steven Franconeri, professor of psychology at Northwestern University, illustrates this clearly: a temperature chart anchored at zero degrees Fahrenheit flattens visible change, while a version scaled to the relevant temperature range makes trends legible without distorting the data. A widely criticised, since-removed National Review article employed a temperature chart with a lower bound of –10 degrees Fahrenheit, a choice that made recent increases in global temperature appear negligible.
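
The baseline effect is easy to quantify. A hypothetical sketch: compare how large the same temperature rise looks — as a fraction of the visible axis span — under different lower bounds (the bounds below are illustrative, not taken from the charts in question):

```python
def visual_fraction(change, axis_min, axis_max):
    """Fraction of the visible axis span that a change occupies.

    The same data looks dramatic or negligible depending on the axis
    bounds, which is the distortion discussed above.
    """
    return change / (axis_max - axis_min)

rise = 2.0  # a hypothetical ~2 degree F rise in mean temperature
print(visual_fraction(rise, -10, 110))  # very wide axis: ~0.017 of the span
print(visual_fraction(rise, 0, 110))    # zero baseline:  ~0.018
print(visual_fraction(rise, 50, 60))    # data-driven range: 0.2
```

The number itself is not the point; the order-of-magnitude gap between the first and last lines is the whole argument about why axis domains are structural decisions.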

Wood is a living thing and it needs to move. Buda and his team’s addition of a new, more encompassing frame made of chestnut wood and conical springs allowed the altarpiece painting to breathe through the natural movement of the wood in different directions. I developed my D3 visualisation in tandem with the JavaScript library React. In modern web development, React acts as the frame of chestnut wood. It is often described as a library for building user interfaces, but at its core it is a way of thinking about state and change. You describe what the interface should be given certain conditions, and React takes responsibility for updating it when those conditions shift. React holds the structure and lifecycle of my visualisation and D3 handles the math: scales, layouts, transitions that respond to data.

This article is not about JavaScript, or frameworks, or even data. It is about integrity in design. It is the realisation that the most important work we do as data visualisation developers is often the work that the reader will never see. When the San Domenico Altarpiece returns to the walls of the monastery, the public will see only the Virgin and Child, resplendent and serene. They will not see the new poplar inserts running parallel to the grain or the conical springs hidden within the frame. When we design good visualisations, we are doing something similar: we are building the foundations so that the story can stand on its own. We are building houses for data. Every axis, every scale, every line of code is a poplar insert aligned to the grain.

Categories: Code

The post The Back of the Painting: On Structure, Integrity, and Data Visualisation appeared first on Nightingale.

]]>
Info+ https://nightingaledvs.com/info-plus/ Wed, 14 Jan 2026 16:02:12 +0000 https://nightingaledvs.com/?p=24511 Info+ is a long-standing data vis conference, held biannually in rotating locations. This year, it was hosted at Northeastern University in Boston (my alma mater),..

The post Info+ appeared first on Nightingale.

]]>
A quiet moment, before the conference begins. Image credit: Pedro Cruz

Info+ is a long-standing data vis conference, held biannually in rotating locations. This year, it was hosted at Northeastern University in Boston (my alma mater), chaired by Pedro Cruz of Northeastern and Sarah Williams of MIT. The event was an action-packed three days of workshops, keynotes, seminars, and social activities, and even included an art exhibition at the MIT Media Lab.

Opening night exhibition at the MIT Media Lab. Photo credit: Pedro Cruz

The conference was a dose of concentrated inspiration, with a head-spinning lineup of back-to-back 10-minute seminars by leading designers in the visualization field. By the second day, some unifying themes were definitely emerging from the blur of inspiration and ideas.

You can find recordings and abstracts for all of the talks on the conference homepage. A few selected presentations are also linked below.

From communication-to towards communication-with

As someone who’s been in the data vis community for a long time, the biggest change I noticed was a shift in the general framing of data vis problems. Instead of Tufte-esque critiques of “proper” visualization techniques or discussion of misinformation and misleading graphics in politics, the conversation (at least in this conference) has shifted strongly toward more participatory practices in data vis.

Talking about inflation. Photo credit: Jose Duarte

Rather than talking about how to present data so that people will understand it, the focus was on how to have conversations—with people, using data—and how to include appropriate context and resolution to help them see how it fits into and reflects their lives. This was reflected in games talking about inflation at the grocery store and local biodiversity challenges in college classrooms, mapping inclusive and discriminatory spaces for marginalized communities to inform urban planning, and using info vis techniques to map informal transportation networks in developing nations.

Mapping exclusionary spaces. Photo credit: Sofia Burgos-Thorsen

When communicating with disenfranchised groups (like middle-schoolers impacted by extreme climate events and migrants hesitant about motivations behind the intervention), it can also be a challenge to overcome obstacles to communication, like self-censorship and diminished agency.

Visualizing marginalized perspectives

Across many talks, there was a focus on using data as a form of community expression, and using locally-generated data to capture experiences that are often left out of the dominant narrative. The conference exhibition included a project to record the important annual events for the Quechua people of the Amazon, organizing their year around key agricultural and cultural events.

Map of cultural practices created by the Quechua people. Photo credit: Catherine D’Ignazio and Claudia Tomateo

Another team used conversations with migrants to improve shelters, focusing on designing features that will support them best in their transition. Data can also help to articulate deep-rooted structural inequalities, or something as “simple” as pronouncing someone’s name. It may also help us to question what we memorialize, how, and why. 

Designing for impact

Some talks showed how to use data in a political context, as a tool for advocacy and creating change. One project focused on providing legal evidence to demonstrate systematic displacement in the West Bank, another created an archive of communities erased by urban redevelopment in Seoul.

Mapping the land of dispossessed farmers in the West Bank. Photo credit: Gauri Bauhuguna

A blanket woven from currencies served as an entry point into deeper discussions about economic impacts and the many reasons for migration, informing and humanizing policy decisions at the UN. One team collaborated with corporate sustainability offices to use biodiversity data to create better-informed sustainability policy and achieve more meaningful targets. Data can also help to illustrate what is lost when policies change, such as local shore changes for communities in the Mediterranean, and the pain caused by lost reproductive rights.

A blanket highlighting the economic impacts and reasons for migration. Photo credit: Sarah Williams

Advocacy is one form of impact; others take a more neutral approach. Some speakers discussed using data journalism to represent geopolitical conflicts in an unbiased but informative way. Others illustrated the importance of thoughtful visualizations focused on place and the need to keep things simple when dealing with the practical realities of fast-paced projects in a newsroom. Conversely, including details in your charts can sometimes make them better, more interesting, and more understandable.

Visualizing ship motions related to undersea cable damage. Photo credit: Irene de la Torre Arenas

New modes for visualizing data

Of course, the medium we choose also influences what we observe. The representation of time in social media platforms can shape and even distort our perceptions. Using different modes of visualization (including touch and sound) can help people engage with and better understand different habitats on the ocean floor.

Visualizing sea floor habitats with visuals and texture. Photo credit: Jessica Roberts

Textiles have deep traditional roots and can evoke a softer expression of meaning, especially in a cultural context. Acoustic data can have profound emotional impact as well as quantitative meaning, and mixing auditory and visual explorations can encourage different modes of exploration, as well as creating more accessible tools.

Perhaps my favorite application of unexpected media was using folded paper as the basis for the conference identity, creating rich and nuanced visuals by simple physical means.

Behind the scenes view of creating a conference identity. Photo credit: Todd Linkner

Seeing the big picture

Stepping back from day-to-day practices, we also considered how visualization can be a reflection of worldview. Framing is a critical step for a designer grappling with how to create a visualization, and our underlying theories of change influence both how we approach and how we talk about data visualization.

Books that capture an entire worldview through visualization. Photo credit: Paul Kahn

What I didn’t hear

Across the entire conference, there was almost no mention of AI. Presenters were definitely using AI technologies for certain kinds of data, but their talks were focused on the output rather than the tools. The one talk focused explicitly on AI considered whether it is helpful to use visualization as an input for AI learning, and what properties of a visualization might make it more interpretable and more useful for training an AI. I’m not sure if that was incidental or intentional, but it was a notable absence when so much of our current discourse is dominated by AI froth.

Reflections to take forward

Coming out of these many conversations, I found myself wondering which of the “theory of change” approaches are most effective, for which audiences, and when. Some speakers mentioned negative receptions: from the CDC when talking about data rhetoric and emotional visualizations, and from institutions of higher education when talking about faculty pay inequity. Many others discussed the tangible impacts of their work in shifting stubborn social and policy problems.

As always, the key lies in consciously framing your data and your analysis: in terms of the context, your purpose, the audience, and the people impacted and involved. Across many projects, we heard designers talk about how to define and redefine the problem as a critical step in getting to insight and achieving a successful design. 

As a designer working in industry to create large platform software, I find that all design often gets simplified to UX. It was nice to step outside of that bubble for a moment and remember the many things that design does, and the different places that designers contribute. I do think there is an interesting conversation to be had between the perspective of creating large-scale tools to structure data exploration for decision making at scale, and the one focused on using bespoke and personalized data visualization for communication—either to or with—an audience once the analysis is complete. 

Many of the unique, nuanced and contextual factors in a dataset can get blurred out when analyzing data at scale, and much of the big picture gets lost when focusing only on the particularities of a specific dataset. And yet, both the large and the contextualized cases come down to helping humans create big-picture conclusions by understanding nuances in the data. Building systems to accommodate large, unwieldy, and heterogeneous datasets to connect across these different scales requires insights from both sides. Perhaps that’s a topic for the next conference.

Categories: Community

The post Info+ appeared first on Nightingale.

]]>
Analytics Products Will Never Be Truly Human-Centered Until the Workplaces Behind Them Are https://nightingaledvs.com/analytics-products-never-human-until-workplaces-are/ Wed, 17 Dec 2025 16:34:49 +0000 https://dvsnightingstg.wpenginepowered.com/?p=24457 I’ve been really excited to see a shift in analytics and business intelligence around more integration of human-centred design, ethics, and accessibility. I learn something..

The post Analytics Products Will Never Be Truly Human-Centered Until the Workplaces Behind Them Are appeared first on Nightingale.

]]>
I’ve been really excited to see a shift in analytics and business intelligence around more integration of human-centred design, ethics, and accessibility. I learn something new almost every day. However, I feel something is still missing from these conversations: whether these are being considered beyond the interface, and in our workplaces too. 

From what I’ve experienced and witnessed working in analytics, I don’t see the same strides in how analytics work gets done. For example, how many of us have kept producing while our lives were going through upheaval? How many have wondered if we can stay in our jobs, or even careers, because the way we’re expected to work is unsustainable for our well-being and personal lives? What might happen if we approached our work in a way that decenters speed, volume, and heroics, and recenters all the humans involved?

My early days

I discovered data visualization in undergrad while studying cases like the Three Mile Island nuclear accident, where poor information design contributed to near or actual harm. It was one of the first moments in engineering where my ears perked up, especially around how data visualization bridges the analytical, creative, and human.

My early roles in quality improvement in hospitals only deepened that passion. I was fortunate to work alongside clinicians, designers, and researchers who introduced me to co-design methods, the importance of evaluation, and reframed users as collaborators.

Eventually, I landed my first role on an analytics team, supporting BI design and development. However, it was during a time when my mom was battling appendix cancer, and I was living at home to help with caregiving. My passion for this work quickly collided with the realities of how analytics gets done.

Deadlines versus trauma

When my mom was admitted to palliative care a year later, it happened to line up closely with a due date for a “high-stakes” report I was responsible for developing in Tableau, which I was learning how to use on my own. Because of the project’s size and weight, and the responsibility I felt to deliver, I would work a full day, bring my laptop to hospice care, and continue working near her bedside.

I could have asked for an extension or support. However, analytics routinely feels like a pressure cooker, especially on “high-stakes” projects. Plus, my qualifications were openly being questioned by others, I was identified as one of the “single points of failure”, and was also cautioned about the potential for blame if anything went wrong. Stepping away didn’t truly feel like an option – it was easy to feel cornered. On top of that, I was in my twenties, with undiagnosed neurodiversity, zero concept of needs and boundaries, and overwhelmed, confused, and exhausted.

At my mom’s funeral, a colleague asked when I might return to work, and relayed that people were getting anxious about report delivery. 

Her funeral was on a Friday. I went back to work on Monday. I finished developing and testing the report—and from what I remember, everyone received it when expected. 

I’m not sure if it felt like “a win” for me. It made me question, how are analytics workers perceived? And, what did I just do? 

Breaking points

The elements of that experience were not isolated to any individuals, teams, or organization, but recurring threads I’ve encountered and witnessed time and time again as my career in analytics has progressed. 

Fast-forward many years later to a more recent contract, again as a BI designer and developer, where layers of challenging, but common, systemic pressures rattled my nervous system. I eventually had a major Autistic shutdown (an involuntary neurological response to sensory overload), and needed to leave.

I’ve listed some of the challenges below – do any of these resonate, neurodiverse or not?

Structural

  • Unclear or missing roles, scoping, processes, and standards
  • Unrealistic expectations around task complexity and timelines
  • Unpredictability requiring frequent context switching and quick adaptation to change

Cultural/interpersonal

  • Persistent state of urgency, with hustle and “just get it done” culture
  • Lack of autonomy and space, with ongoing progress checks and pressure points
  • Repeatedly having to overexplain, raise concerns, and justify boundaries 
  • Interdepartmental conflict and tension
  • Feeling held responsible for the success of the project

Environmental

For this experience, I was able to be fully remote. From research and my own previous jobs, I know that in-office environments can pose several challenges for Autistic workers. These can include adherence to a 9–5 schedule, open-concept office spaces with bright lighting and noise, and pressure to attend social functions.

When layers like these start to compound, my nervous system gets flooded with input and demands, and can’t catch up. I get stuck in survival mode, and eventually break or shut down. Autistic burnout can look very different from our typical understanding of burnout, and recovery can require weeks to months (or even years) of deliberate care. Just to note, other Autistic people may have different experiences, supportive conditions, and responses – these are just my own.

Figure 1. Examples of supportive conditions for Autistic employees from a 2023 report by Autism Alliance Canada. It is important to note that Autistic employees and employers can work together to identify the supports that might work best.

At this point, I’m afraid of returning to analytics as it currently exists. It can feel inaccessible to neurodivergence, and unforgiving to responsibilities outside of work. But am I the only one who feels this way? 

Ripple effects: Tired teams, leaders, products, and users

From what I’m seeing across industry research, I don’t think I’m the only one finding this field challenging and unsustainable. Here are some highlights:

Data teams are already over capacity, despite ever-growing demands

In a 2023 survey of more than 900 data team practitioners and leaders across the United States and the United Kingdom, 84% said their workload exceeded their capacity, and 90% reported that it had increased from the year prior.

The vast majority of data engineering teams feel burnt out

Another survey of over 600 data engineers and managers found that nearly all of them (97%) reported feeling burnt out, primarily due to time spent fixing errors, maintaining data pipelines, and constantly playing catch-up with stakeholder requests. Nearly 90% reported frequent work-life disruptions, 70% said they were likely to leave their current company within a year, and almost 80% were considering leaving the field altogether.

Figure 2. Experiences and impacts of data analytics work on data engineers from a 2021 report by data.world and DataKitchen.

“When a deliverable is met, data engineers are considered heroes. However, “heroism” is a trap. Heroes give up work-life balance. Yesterday’s heroes are quickly forgotten when there is a new deliverable to meet.”

2021 Data Engineering Survey: Burned-out Data Engineers Call for DataOps

Analytics products aren’t sufficiently supporting our end users

In a 2025 survey of more than 200 product leaders, data teams, and executives, 40% said their data doesn’t support decision-making sufficiently, 51% can’t meaningfully interact with the data provided, and 29% export data to spreadsheets daily. 

These are findings I’m not surprised to see, considering how we’re expected to work. From a design perspective, it can be a struggle to carve out time and space to sufficiently understand the data and users before I’m asked to quickly turn around a prototype. Plus, post-launch follow-up and evaluations don’t seem to gain traction before we’re onto the next priority.

We’re hoping AI will save us

In the same survey as above, 75% believe AI-powered analytics might finally help uncover value buried in data. But a new study by MIT and Snowflake found that 77% of data engineering teams report even heavier workloads, despite AI integration.

While AI has the potential to streamline tasks and improve product quality, a cracked foundation could limit its impact, and cause further complexity and burnout. 

Figure 3. Examples of external and internal pressures in analytics, as well as possible outcomes.

Diverse does not equal inclusive

In analytics, we often point to diversity as evidence that we’re on the right path. When concerns are raised about how pressures, workloads, and expectations may weigh differently across identities, they can be dismissed with the reassurance that our workplaces are “already pretty diverse.”

That might be partially true in terms of representation. A recent study by Statistics Canada showed that 60% of data scientists (one of many roles within analytics) are immigrants, with the majority of first languages being neither English nor French. About one-third of data scientists identify as women+ (defined by the study to include “women and some non-binary people”). 

It is important to recognize that diversity does not always equal inclusion. In other pieces published by Nightingale, Catherine D’Ignazio and Lauren F. Klein, authors of Data Feminism, speak to how racism and sexism are imbued in the end-to-end data lifecycle, reinforced by structures of power, and ultimately surface in our products. An online poll by Christian Osborne showed that 90% of respondents said they’ve experienced microaggressions at work, which can cause emotional and psychological harm, decrease job satisfaction, and increase turnover.

We can also be sensitive to trends across all workplaces. In 2024, the Diversity Institute, Future Skills Centre, and Environics Institute for Survey Research published a Canada-wide study on gender, diversity, and discrimination at work. The survey reinforces that workplace discrimination is more likely to be experienced by racialized and Indigenous peoples, women, persons with disabilities, 2SLGBTQ+ individuals, and young adults. It is crucial to recognize that intersectionality amplifies these effects, with racialized and Indigenous people more likely to face multiple forms of discrimination, especially related to gender, age, and disability. And, those who reported experiencing discrimination also reported poorer mental health. 

Even with diversity, we still need to ensure that our analytics workplaces make everyone feel safe, healthy, empowered, and valued. Diversity, equity, and inclusion (DEI) programming remains urgent and necessary, and should not be deprioritized or defunded. I also wonder how the systemic pressures previously discussed are felt across different identities. For example: are the experiences of a woman in a leadership role, a recent immigrant supporting family both at home and overseas, and a new grad with one or more disabilities really all the same?

What if we worked differently, and prioritized people first?

The tendency for analytics workplaces to be top-down, reactive, chaotic, transactional, and overburdening clearly isn’t working—not for our people, and not for our products. We’ve got more than enough burned out workers and leaders, and more than enough underused products to prove it. And I’m only seeing signs that analytics (and tech more broadly) might be becoming even more unsustainable—from 996 culture, mandatory RTO policies, pressure to upskill for AI, low data readiness for AI, to the defunding of DEI.

I think systemic change (or a reset button) is required to humanize our approach to analytics work. The shift has to include not only analytics teams, but also the ecosystems that rely on us. 

For example, earlier this year, the Canadian Occupational Health and Safety Magazine suggested that workplaces adopt a trauma-informed care (TIC) approach to work. This approach places safety, trust, and empowerment at the center, and recognizes that many of us have experienced trauma, which workplaces can trigger, perpetuate, or even create. Many normalized features of analytics work, like unpredictability, constant urgency, ambiguity, and the erosion of autonomy, can actually be quite harmful.

The article references the six pillars of TIC laid out by the Substance Abuse and Mental Health Services Administration (SAMHSA), and cites research showing its positive impacts on employee well-being, satisfaction, retention, operational functionality and effectiveness, and cost efficiency.

Figure 4. Six key principles of a trauma-informed approach, published by the Substance Abuse and Mental Health Services Administration (SAMHSA).

I have listed the six pillars from SAMHSA below, along with my attempt at (extremely) high-level and brief descriptions tailored to those of us working in analytics. I am still on my own learning journey. 

  1. Safety: Prioritize physical and psychological safety in all elements of the workplace. In analytics, this can mean that people are able to seek clarity, name concerns, and admit uncertainty without fear of punishment or loss of credibility. It can also mean that we respect limits on things like working hours, cognitive load, personal space, and sensory needs.
  2. Trustworthiness and Transparency: Build trust through consistent transparency around decisions, timelines, priorities, and changes. Clarity and predictability can reduce uncertainty, prevent reactivity, and stabilize teams.
  3. Peer Support: Reduce isolation and barriers to connection to foster peer support within and across teams. This can allow for greater understanding across disciplines and parts of the organization, smoother workflows, supportive relationships, shared problem-solving, and better knowledge transfer.
  4. Collaboration and Mutuality: Involve workers in decisions about policies, procedures, tools, standards, and more. Also, when business units and analytics teams better understand each other’s capacities, workflows, complexities, timelines, needs, etc., collaboration might be more smooth, respectful, and productive. 
  5. Empowerment, Voice, and Choice: Choice and control are essential for trauma-impacted people. In analytics, empowerment could mean giving workers more agency in defining things like their own scope, workflows, documentation, timelines, training needs, and work arrangements.
  6. Cultural, Historical, and Gender Sensitivity: Address systemic inequities and promote diversity, equity, and inclusion. Design systems from the start to acknowledge, understand, and respect differences. Do not rely on people to constantly identify, overexplain, or advocate for their needs.

Integrating TIC is a deep, long-term commitment that isn’t about checking boxes, a quick workshop, or adding a few supportive practices. It requires honest and sustained cultural and structural assessments, learning, planning, and shifts, and a more balanced distribution of power. But with a new reframing, maybe we can begin to view:

  • Workers as human, collaborators, creators, and both autonomous and interdependent 
  • Leaders as human, coordinators, facilitators, coaches, guides, and anchors
  • Work as collective, learning, growth-oriented, and sustainable 
  • Technology as supportive, enhancing, synchronizing, and shared 

This isn’t meant to be a silver bullet, and I know there are many other challenges in analytics that involve data, tools, processes, and more. It may also seem overly idealistic in our current systems. But I feel like tech is at a precipice, especially in the rush toward AI creation and adoption. We’re already seeing increased exploitation of labour and the environment in the AI space, without consideration of short or long term consequences. If we don’t care to stop and make our systems more sustainable, ethical, equitable, and accessible now—what does this mean for our (very near) future? 

I’m curious about what a different approach to analytics work might bring:

  • Will we have the space to maintain our health, relationships, and lives outside of work?
  • Will relationships within and between teams become more stable, empathetic, and productive—especially between analytics and business units?
  • Will we have more space in between deliverables to recover, reflect, and refine our systems?
  • Will our products become clearer, more cohesive, more aligned, actually used, and have impact?
  • Will we feel safe and supported to show up at work in our own unique ways?

The post Analytics Products Will Never Be Truly Human-Centered Until the Workplaces Behind Them Are appeared first on Nightingale.

]]>
From Metrics to Mood: The Emotional Story in A HYROX Race https://nightingaledvs.com/from-metrics-to-mood/ Tue, 02 Dec 2025 16:43:20 +0000 https://dvsnightingstg.wpenginepowered.com/?p=24449 In the world of sports performance, data is everywhere. Watches track heart rates, apps monitor recovery, and race platforms log every split and second. But..

The post From Metrics to Mood: The Emotional Story in A HYROX Race appeared first on Nightingale.

]]>
In the world of sports performance, data is everywhere. Watches track heart rates, apps monitor recovery, and race platforms log every split and second. But when all that data is condensed into a single visual, a story emerges: the numbers stop being neutral—they speak with raw emotion.

The aim was to analyse my performance in HYROX, a fast-growing hybrid fitness event that combines eight 1-kilometre runs with functional workout stations like sled pushes, burpees, and wall balls. The race’s structure naturally lends itself to analysis—clear segments, repeated runs, and measurable transitions. The goal of the visualisation was to explore how time, effort, and physiology interact across a physically demanding event. What actually transpired was something much more emotive: a visualisation projecting personal emotion, or how I felt about my performance.

The challenge of condensation

Athletic data is inherently multidimensional. Time, effort, and physiology interact in ways that are complex and deeply human. Condensing all that into a single visual means facing the same challenge every visualisation designer knows too well: what to keep, what to simplify, and what to discard.

This HYROX chart condensed over an hour of physical effort into a few compact panels. Rather than presenting the bars along the conventional layout (x-axis), I shaped the visual to mirror the race’s own rhythm. As the reader moves from left to right, they too move through each run and station. As the viewer follows the visual rhythm of the page and reaches the second chart from the top, they uncover time spent at each station relative to the event average—a clear indication of where momentum built or faded. Green meant faster than average, red meant slower. A cumulative line showed the overall trajectory: moments of acceleration versus pauses of fatigue, relative to the average athlete.
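The colour rule described above reduces to a simple comparison against the event average. Here is a minimal sketch in plain JavaScript; the station names, split times, and averages are invented for illustration, not my actual race data.

```javascript
// Colour a station by its time relative to the event average:
// green when faster (less time), red when slower.
function colourForStation(myTimeSec, avgTimeSec) {
  return myTimeSec <= avgTimeSec ? "green" : "red";
}

// Hypothetical splits (seconds) for three stations, with event averages.
const stations = [
  { name: "sled pull",  mine: 210, avg: 240 },
  { name: "lunges",     mine: 180, avg: 200 },
  { name: "wall balls", mine: 420, avg: 350 },
];

const coloured = stations.map(s => ({
  ...s,
  colour: colourForStation(s.mine, s.avg),
  deltaSec: s.mine - s.avg, // negative = faster than average
}));
// In this invented data, wall balls comes out red, with a +70 s delta.
```

The encoding is trivial to compute; the design question is what the binary green/red judgement does to the athlete reading it, which is where the next section picks up.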

Design-wise, it worked. The streaks of green—for the lunges and sled pull stations—sparked a sense of pride. But as soon as I saw that one bar of deep red—the dreaded wall balls—I didn’t just see inefficiency; I felt disappointment. That’s when I realised how much emotional weight colour can carry in performance visualisation.

When colour becomes judgement

It’s clear that colour can convey emotion. Warm hues suggest intensity, fatigue, or struggle, while cool tones evoke calm and control. These associations can subtly influence how athletes perceive their own performance. By using warm reds to mark high heart rate zones and difficult stations, and cool greens to indicate easier segments relative to the average, the visualisation established an intuitive “moral language”: a clear visual distinction between stronger and weaker performance that made the data instantly readable.

This raises a key design question: when visualising personal performance, are we aiming to motivate—or simply to measure? Should a chart make the athlete feel proud, or precise? The answer likely lies somewhere in between. The top chart, rendered in a calm blue gradient, remains neutral: it measures output without judgment. The chart below leans into emotion, using contrast and colour to spotlight effort and highlight moments of struggle.

Rhythm, not just metrics

The bottom half of the visualisation traced my heart rate throughout the race, capturing the ebb and flow of effort across running segments and workout stations. The rising and falling bands of orange and red felt like a heartbeat for the race itself—a pulse that mirrored moments of endurance, bursts of strain, and brief windows of recovery.

It wasn’t just data on a page; it was a rhythm you could feel. Peaks were sudden surges of intensity, while valleys were respites and recovery. Each station became a note in a composition of exertion and relief. In this way, visual structure itself conveyed effort before any labels or numbers were read. As designers, we often obsess over precision, but here, pacing and tempo communicated the human experience of performance more viscerally than any raw statistic ever could.

From metrics to meanings

What I learned from visualising my HYROX race wasn’t just where I was fast or slow, but how visualisation framed that story. Choices of colour, alignment, and context turned raw numbers into something interpretive—something emotional.

For data visualisation practitioners, that’s a valuable reminder: the goal isn’t only to display information, but to mediate understanding. The way we design a visual can shape not only what people learn, but how they feel about what they learn.

The post From Metrics to Mood: The Emotional Story in A HYROX Race appeared first on Nightingale.

In the Shadow of Edmund Halley: Solar Eclipses, Citizen Science, and Qualitative Dataviz https://nightingaledvs.com/in-the-shadow-of-edmund-halley/ Wed, 19 Nov 2025 16:08:08 +0000 https://dvsnightingstg.wpenginepowered.com/?p=24423 On April 8, 2024, a total solar eclipse crossed North America from the Pacific Coast of Mexico to the island of Newfoundland, off the eastern..

The post In the Shadow of Edmund Halley: Solar Eclipses, Citizen Science, and Qualitative Dataviz appeared first on Nightingale.

On April 8, 2024, a total solar eclipse crossed North America from the Pacific Coast of Mexico to the island of Newfoundland, off the eastern coast of Canada. At its longest point, in the center of totality, the Moon covered the Sun for exactly four minutes and 28 seconds. 

In the months leading up to the 2024 eclipse, experts predicted that millions of people would migrate to the path of totality to witness this extraordinary event. Tiny towns across the continent braced themselves for tourists, advising residents to stock up on food and gasoline in case of shortages. Highway signs warned travelers to prepare for extended delays. Some people who lived on the edge of totality drove two or three hours from their hometown just to experience one extra minute of darkness.

Part of the beauty of a modern solar eclipse—indeed, the only thing that makes it possible to travel to the center line—is that we understand the science behind the phenomenon. Knowing exactly where and when the darkness will hit, we can anticipate it with excitement and pleasure.

Among ancient people, for whom the sudden disappearance of the Sun provoked fear and dread, those four minutes could not have passed more slowly.

More than three centuries ago, in 1715, another solar eclipse hit the scene smack-dab in the middle of the Age of Enlightenment. Just two decades earlier, Isaac Newton had published his Principia, ushering in the eponymous era of Newtonian physics. The Sun, Moon, and stars—once seen as mystical celestial bodies—had been reduced to mere balls of rocks and gas, subject to the same laws of motion and gravity as the rest of us on Earth.

Newton set in motion a reshaping of the universe: from a mysterious, unknowable cosmos into one governed by data. With enough data points, early Enlightenment thinkers hypothesized they could anticipate the future movements of every object in the universe.

The 1715 solar eclipse was noteworthy in many respects. It was the first eclipse to pass over London, England in more than 500 years. It was the first time the path of totality could be mapped in advance thanks to the new laws of astronomy and physics. It was, therefore, the first eclipse to attract tourists. And the first to inspire scientific investigation. 

Astronomer Edmund Halley, most famous for predicting the return of the comet that now bears his name, was also a data visualization pioneer. He published the world’s first weather map, which depicted trade and monsoon wind patterns across the globe and was subsequently used by sailors as a navigational tool. He is also recognized as the first to plot two variables against each other on a Cartesian plane (as seen in his bivariate plot of barometric pressure and altitude) and the first to use contour lines on maps.

Halley saw the upcoming solar eclipse as a chance to test out Newton’s theories of gravity and motion. He published a pamphlet that claimed the darkness was neither an evil omen nor a divine event, but in fact the “necessary result of the Motions of the Sun and Moon.”

Halley’s pamphlet included a map that depicted the path of totality as seen from above—the first of its kind ever recorded and one which sparked a “golden age of eclipse maps.” 

Halley also kicked off the first citizen science project in modern history. In his pamphlet, he addressed the “Curious” people of England, urging them to watch the sky during the eclipse and record their observations: “The Curious are desired to Observe it, and especially the duration of Total Darkness, with all the care they can; for therby [sic] the Situation and dimensions of the Shadow will be nicely determin’d…”

In the end, about 25 people answered Halley’s call, sending him the times that totality began and ended in their specific location, along with a short description of what they saw in the sky. Halley himself wrote about his own experience in a mix of both scientific and poetic observations: “by Nine of the Clock . . . the Face and Colour of the Sky began to change from perfect serene azure blew [sic] to a more dusky living Colour having an eye of Purple intermixt, and grew darker and darker till the total Immersion of the Sun…”

Halley used the data he collected to correct the path of totality on his map, setting the stage for countless future scientists and eclipse chasers.

Leading up to the 2024 total solar eclipse, I prepared myself as best I could. I booked a weekend cabin along the path of totality, bought eclipse glasses for my whole family, and stocked up on Moon Pies, Sun Chips, and Cosmic Brownies. I vowed not to take pictures during totality, desiring instead to stay fully present and “in the moment.” After all, the eclipse would likely be the most photographed astronomical event in human history; there would be plenty of opportunities to download iconic images later.

But nothing could have prepared me for the experience of totality: four minutes of darkness, of disorientation, of complete awe and wonder. Four minutes of walking a strange, fine line between science and mysticism. Four minutes of feeling connected to birds and squirrels, to everyone else who was watching the sky at the same moment, and even to the ancient Vikings, who believed eclipses resulted from a monster devouring the Sun.

I took pictures, of course: terrible, blurry, amateur shots from my iPhone. I couldn’t stop myself—I felt an overwhelming compulsion to capture the strange sights and sounds around me and to document that I was there.

Afterward, I couldn’t help but wonder whether other people felt that same sense of connection… or that same compulsion to take pictures. These weren’t questions of physical science, of course; nonetheless, they were questions that could be answered with data. Following in the footsteps of Edmund Halley, I sent out a call on social media, asking people to share their own photos and stories from the eclipse. Naively, optimistically, I hoped to receive hundreds, if not thousands of responses. But I’m no great social media influencer, and after posting my Google Form link everywhere I could imagine, I ended up with 62 responses—a tiny fraction of the total population who watched the eclipse. But to my delighted surprise, they represented a broad swath of locations along the path of totality and contained all the depth and complexity of a strong qualitative dataset.

Image provided by the author.

Initially, I created a Google map of the responses I received, a nod to Edmund Halley’s original visualization. But I couldn’t help but wonder if there might be a different way to present the data, one that might capture what the experience felt like.

So, I set out to analyze the rich mix of words and images that comprised my dataset. Using poetic inquiry, a qualitative process developed in the 1970s by multiculturalist and feminist researchers, I engaged in thematic analysis of respondents’ written submissions. A few themes that emerged in this process included feelings of transcendence (including connectedness to nature, humanity, and God), descriptions of the weather (especially the cool temperatures that accompanied the darkness), changes in animal behavior (dogs barking, birds roosting), and a communal feeling of celebration (gathering, cheering, public festivities). I highlighted certain “poetic turns of phrase” that appeared in participants’ responses; then I cut and pasted words and phrases to create 10 found poems that each represented a shared theme from participants’ experiences. (A condensed version of the poems, entitled “Six Ways to View an Eclipse,” appears in the online literary journal Unlost). 

I also coded the photos that I received. Most people submitted some version of the Moon covering the Sun during totality; these photos were coded based on the size of the Moon, whether it was in the foreground or background, and what other elements appeared in the photo (such as people, buildings, or trees). Some photos showed the scene before or after totality, featuring a “crescent sun,” and a few included people without the Sun or Moon appearing at all. In the end, I selected 20 photos that collectively showcased all the different visual elements that appeared in the dataset.

Images provided by the author.

In thinking about how to visualize this data, I wanted to create an opportunity for viewers to interact with the photos and poems in a novel way. After brainstorming several installation ideas with the team at Fusiform Props and Exhibits, I finally settled on the idea of printing the photos and poems using a special technique called lenticular printing. Lenticular printing is a technology that uses plastic lenses with ridges on top to display multiple, interlaced images at one time. The different images float in and out of visibility, depending on the angle from which the print is viewed.
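The interlacing idea is simple enough to sketch in code. A toy model, assuming images are given as nested lists of pixel values (real lenticular prints are produced with software calibrated to the physical lens pitch, which this sketch ignores):

```python
def interlace(images, strip_width=1):
    """Interlace equally sized images into alternating vertical strips,
    mimicking how a lenticular print cycles frames under its lens ridges."""
    n = len(images)
    width = len(images[0][0])
    out = []
    for y, _ in enumerate(images[0]):
        row = []
        for x in range(width):
            src = (x // strip_width) % n  # which source frame owns this strip
            row.append(images[src][y][x])
        out.append(row)
    return out

# Two tiny 2x6 "images": frame A is all 0s, frame B is all 1s
a = [[0] * 6 for _ in range(2)]
b = [[1] * 6 for _ in range(2)]
print(interlace([a, b])[0])  # → [0, 1, 0, 1, 0, 1]
```

Each lens ridge sits over one full cycle of strips, so tilting the print selects which frame’s strip is magnified into view.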

Each of the final lenticular prints consisted of two photos and one poem, thereby displaying the words and images from multiple participants at one time. From April to June of 2025, the 10 prints appeared as part of a larger exhibition, entitled “Data Is Poetry,” at Artspace in Shreveport, LA.

During the opening reception, I watched as people walked past the prints on the wall. Most people strolled past casually at first, then did a double-take after realizing that the prints contained “hidden” images and words. They proceeded to adjust their own position, moving forward, backward, and side to side as they tried to see (and read) all the layers in the image.

I was reminded of my own experience from a year earlier and how earnestly I had watched the sky through my eclipse glasses, looking for the slightest changes in the Sun as the Moon passed in front of it. The data visualization, therefore, mirrored the eclipse itself—an astronomical phenomenon that shifted with mathematical precision based on angles and movement. 

But the visualization also effectively symbolized our shared experience of the eclipse. Though all of the participants in the project had shown up for the same event, their view was necessarily determined, and limited, by their specific location and context. Only by compiling multiple viewpoints could we see the composite: a collective phenomenon that was as human as it was cosmic.

Mapping Change: How One Design Studio Navigated 20 Years at the Forefront of a Changing Industry https://nightingaledvs.com/mapping-change-applied-works/ Mon, 17 Nov 2025 17:14:30 +0000 https://dvsnightingstg.wpenginepowered.com/?p=24395 Applied Works is a London-based design studio celebrating their 20th anniversary. I sat down recently with founders Joe Sharpe and Paul Kettle to discuss their..

The post Mapping Change: How One Design Studio Navigated 20 Years at the Forefront of a Changing Industry appeared first on Nightingale.

Applied Works is a London-based design studio celebrating their 20th anniversary. I sat down recently with founders Joe Sharpe and Paul Kettle to discuss their work, the changes they’ve seen in the industry over time, and the core principles that guide and focus their practice.

Founders Paul Kettle (left) and Joe Sharpe (right)

Early influences

Joe and Paul met while in university. With a background in motion graphics, Joe has always kept an eye out for how a story evolves, frame-by-frame. Paul has a more classic graphic design background: his emphasis is information design and creating clarity for the user through in-depth understanding of his audiences. 

The two worked independently for a few years, and they joined forces to create Applied Works in 2005. Over the years, the studio has remained relatively small and has shifted focus multiple times to stay relevant in a changing landscape. Now, with 15 people, they’re on a growth path.  

Through their projects and clients, Applied Works has had rare opportunities to witness the growth and transformation of an industry over time. From their early days working in moving image, branding, and websites, they had front-row seats to the dot-com bubble and learned how to code on the job. They ran tests on prototype devices, from early satellite communication hardware to the first iPad, and collaborated on many high-profile data vis projects with the BBC, The Times in London, the London 2012 Olympic and Paralympic Games, and others.

BBC Class Calculator (2013). Source: Applied Works

As web technologies advanced, they upgraded their methods to support live data feeds, establishing style and component systems for code reuse. They also experimented with 3D maps for the 2014 Tour de France and advanced image filters for a Black Mirror project in 2018. Their Class Calculator project for the BBC became the broadcaster’s most-shared data tool in 2013. Lately, they have pivoted toward climate and environmental work with nonprofit partners — as well as projects tackling societal issues and inequality — collaborating with philanthropists, intergovernmental agencies, and think tanks to help them communicate complex data and nuanced narratives. They are also expanding their skills into data science and machine learning. 

Themes

The team relies on several “north star” behaviors to guide their exploration, helping to chart a course over complicated and changing terrain. Throughout our conversation, a few strong themes stood out.

Push the boundaries

In design school, Paul observed that the coursework was very structured and quite strict, but the most successful students were often the ones who did their own thing. He realized early on that to develop your own perspective, you need to push the edges to test who you are and find out what you think. Your initial instincts might be wrong, but that’s how you learn. This principle continues to shape how the studio approaches its work. An exploratory mindset and a keen appetite for learning help to feed their creativity and ideas.

Experiment to find out

In our influencer age, it’s worth emphasizing that success is not just about broadcasting your ideas and opinions and hoping that someone else follows along. You also need to test and refine those ideas based on feedback from the world. 

Experimentation and prototyping are a key part of the process at Applied Works. In order to find the limits, you need to push an idea as far as it will go, and then just a little bit further. When it starts to fail, you can pull back and find the place where it works. This process of tuning their approach through experiment, feedback, and course correction has been a consistent theme for Joe and Paul throughout their design practice.

Follow the creative tension

In addition to doing your own thing, you need to find something to push against and someone to negotiate with. Joe and Paul bring different contributions and viewpoints to their collaboration, producing a natural creative tension that drives their approach. 

Joe has a more technical bent. He often starts by analysing complex datasets to propose a narrative, and then they iterate together until it makes sense from both a user and a technical perspective. This collaboration allows the pair to use each other to get to a better solution than either would have achieved alone.

Creative tension also forms the foundation of client engagements. Clients bring new and interesting problems and constraints, and together the group negotiates a new set of solutions to meet those needs. They start by asking challenging questions to get the team thinking, and then they get deeply involved with a client problem and the data, understanding as much as they can about the science of what the client is doing. This process helps them identify the underlying need, and the solutions emerge from that. 

When designing a call center dashboard for Genesys, the team identified a fundamental relationship in the way the key performance metrics are presented. They transformed the data into a user-friendly dashboard built around just three key insights, streamlining the display to allow users to monitor and address issues in real time. This approach later became a foundation for how Genesys designs its products.

Genesys supervisor dashboard (2014). Source: Applied Works.

Have a perspective

Over time, the team’s projects and creative experiments added up to experience, creating a sense of identity that is both unique to the studio and informed by the external world. This gives them the confidence to stand their ground when needed, which sometimes means forging an alternate path. 

One of their biggest breaks as a studio came in 2010, when the iPad first came out. At the time, most of the industry was using Adobe Flash for infographics. For accessibility reasons, Applied Works had resisted using Flash in favour of HTML5 and CSS. When The Times got a pre-release version of the first iPad, their existing projects worked natively where many others did not.

The Times iPad data journalism (2010). Source: Applied Works.

By following their own inner guidance rather than an industry fad, Applied Works was positioned to take advantage of a major opportunity when the technology changed. The team was quick to point out that it doesn’t always work out this well, but independent thinking sometimes pays off in unexpected ways.

New technologies

Over and over again, Paul and Joe’s experimental approach positioned them to embrace new technologies as they emerged. They are often approached by people who want something done and aren’t quite sure yet what it is. Starting from an unformed idea, they work collaboratively to shape and co-define the work, and that often leads to new and innovative projects that they might not otherwise have created. 

Although the team has often been among the first to embrace a new technology, they work hard not to be defined (or confined) by it. Technologies are a medium or a tool that they use to achieve better results for their clients, but the process often starts on paper, outside of the constraints and limitations of a screen. 

Instead, the team comes back to core design principles to guide their work. Usability has always been central to what the team does. The term has changed over time, from accessibility and user-centered design to usability, human-centered design, and now inclusive design. It’s similar for data visualization: the team sees it as both a practice and a tool best applied to a problem, not necessarily a skill that defines a practitioner in its own right.

Chatham House resourcetrade.earth (2017). Source: Applied Works.

Regardless of terms or technology, the quality standards remain the same: is the design easy to use? Interesting? Intuitive? Coming back to Joe’s background in motion graphics, does the sequence and hierarchy of information over time make sense? Everybody learns differently, and the team focuses on using a mix of technologies and skills to facilitate core use cases and needs. A recent article on a project about trade flow for Chatham House shows how all of these different pieces work together. 

Embracing change

Across all of the team’s experiences, there is a strong pattern of learning and embracing change. Where there are no precedents, Joe and Paul see opportunities. Learning alongside their clients makes experimenting and trying new approaches a more collaborative way of introducing fresh perspectives.

Applied Works’ content focus has changed over time, shifting with their interests and the industry. Starting out with websites, corporate work, and data journalism, they transitioned into data products and design systems as those opportunities emerged. They are now refining their focus again, enlarging their scope and working toward a better future for the planet.

The team’s process has also changed over the years. In the beginning, they worked mostly from creative briefs. As their experience and expertise grew, they moved into more open-ended engagements based on client trust. Paul likened it to going on a journey together: the ideal situation is when a client has an open-ended idea, and they can sit down and work out how to approach it together. 

They’ve also been working to make their work more scalable, developing a process and a system to support a larger, more distributed team. They’re deliberately creating more opportunities for R&D and making space to explore their personal interests and curiosities to keep the team engaged. Joe in particular is interested to see what happens if they let technology lead the way a bit more, to help them invent what could be. In 2017, the team got the chance to work on chapter artwork for a book about the Netflix series Black Mirror. Taking inspiration from the anthology’s dystopian themes of losing control of technology, the team used creative coding to generate imagery of each episode, relinquishing a certain level of control over the visual aesthetic. 

Inside Black Mirror book (2017). Source: Applied Works.

Looking back

Applied Works’ 20th anniversary has been an opportunity to pause and make sense of the journey the team has taken over the years. This kind of progress usually doesn’t follow a linear path. You can’t draw these connections with a ruler: you can only look back and connect the dots after the fact. The guiding principles above helped the team to navigate the shifting terrain, and to find their way.

Joe and Paul created a successful studio built around care for their people and their team, their clients and affected audience, and the legacy that they leave behind in the world. They negotiated an ever-changing landscape by optimizing at each point in the process, following their principles and intuition to find the best path.  

Imagining the future

Looking forward, the Applied Works team is excited to help their clients navigate a world that is subject to ever-increasing change. They are interested in partnering with climate and environmentally-minded non-profits, data scientists and academic partners to understand and share their impact, communicate their mission, and design their approach to funding and future research. They hope to go deeper with their clients to articulate the core identity of their organization, to help them see further and ensure the continued success of their work.  

Applied Works 2025

Especially in the area of climate awareness, some of the team’s major clients are already thinking far into the future, asking questions like: “if we do our job properly, in 10 years we won’t need to exist in our current form. What should we do next?” Paul and Joe would like to help them to answer that question. They are also positioned to help facilitate new connections between their clients, creating an exchange of ideas that could lead to more collaborative and impactful work. 

Of course, Applied Works will continue leveraging technology to solve problems and experimenting to push beyond the current limits. They’re excited to shape our technical evolution beyond the screen into a more immersive and experiential virtual environment. Joe recently completed an MSc in geographic data science to expand his skillset for an AI-enabled world. The team is also ready to engage with the many new creative tensions introduced by AI: questions of bias and ethics, where and how we should use AI methods, and the many conversations about profitability and exploitation that this new technology poses.

Overall, Joe and Paul are looking to help lead the push toward ethical, sustainable progress, both globally and for design. With two decades of experience navigating complex landscapes, they are well-positioned to “work together with clients to take each other into the future.” It will be interesting to see where they go next. 


Get in touch if you are interested in working with Applied Works, or subscribe to Rows and Columns to get updates on what’s happening with the team. They are also accepting applications to their Springboard program to solve big, global problems until Dec 17, 2025.

For more information about the team’s projects and history, see their recent anniversary post on LinkedIn.

Data Visualization & Affective Computing. Design That Manipulates Emotions or Design That Helps Reflect on Emotions? https://nightingaledvs.com/data-visualization-affective-computing/ Thu, 06 Nov 2025 16:49:32 +0000 https://dvsnightingstg.wpenginepowered.com/?p=24381 What are emotions and how design is connected Emotions are complex. They are not feelings nor are they desires. I’ll define emotions as a biopsychological..

The post Data Visualization & Affective Computing. Design That Manipulates Emotions or Design That Helps Reflect on Emotions? appeared first on Nightingale.

What emotions are and how design is connected

Emotions are complex. They are neither feelings nor desires. I’ll define emotions as a biopsychological process that happens inside the body and serves as an information-processing tool. I have often heard emotions set in opposition to rationality—curiously often, within a sexist logic. But it’s quite the opposite: emotions matter in effective decision-making. The way the interactive interfaces, data visualizations, and other design systems that surround us are constructed may influence our emotional experience and processing. The ability to meaningfully experience data visualizations through emotional feedback enhances engagement.

In a design context, how we reflect on our emotional experiences can vary depending on the system architecture. That matters because we are surrounded by interactive systems, from AI-based digital products to newsroom data visualizations and train ticket machines. This is why the framework for constructing design systems that fascinates me is affective computing, a discipline researching how emotions can be detected and responded to by interactive systems. Currently, techniques such as emotion recognition via audio, speech and physiological data as well as sentiment analysis of textual evaluations and opinion mining are used to get this information. But is this factual data enough to effectively and empathetically evaluate the meaning of communication? Boehner and colleagues wrote that there are a lot of caveats to interpreting emotions, such as limitations of the given evaluation method. This means that the methods of factual evaluation should be combined with cultural understanding and nuanced assessment.
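A toy lexicon-based scorer makes that limitation concrete. The tiny lexicon below is invented for illustration (real sentiment systems use far larger resources and models); the sarcastic second example shows exactly the kind of cultural and situational context such purely factual methods miss.

```python
# A minimal lexicon-based sentiment scorer, sketching the kind of "factual"
# textual evaluation described above. The lexicon is invented for
# illustration only.
LEXICON = {"love": 1, "great": 1, "calm": 1,
           "hate": -1, "angry": -1, "frustrated": -1}

def sentiment(text):
    # Strip basic punctuation and sum word-level scores
    words = (w.strip(".,!?") for w in text.lower().split())
    return sum(LEXICON.get(w, 0) for w in words)

print(sentiment("I love this great tool"))      # → 2
print(sentiment("Oh great, it crashed again"))  # → 1: sarcasm scores as positive
```

The second sentence is plainly negative to a human reader, yet the word-counting method rates it positive, which is why factual evaluation needs to be combined with cultural understanding.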

Emotions are constructed not only within physiological contexts, but also within cultural and social ones. Emotion manifestation must make sense within the cultural context in which we live, and our own reaction to our emotions matters. Am I ashamed to be openly angry, or do I rather feel justified? (Although depression is not an emotion, I keep thinking about a North Korean refugee explaining that there is no concept of “depression” in North Korea. When one has depression in North Korea, how is the interaction with this state constructed and articulated?) Dr. Rosalind Picard, the founder of affective computing, said that “Originally, affective computing was an area of research created to give technology the skills of emotional intelligence. The goal is to create technology that shows people respect, such as by not continuing to do things that cause people to become frustrated or annoyed.” She also says, “There is no magic sensor that will accurately convey how someone is feeling. We need to combine AI learning with lots of information from multiple channels gathered over time to even make a guess at feelings.”

Two models

Boehner and colleagues proposed a classification of affective computing, depending on what the goal is: informative and interactive models.

The informative model is based on the idea that emotions can be classified, symbolically encoded, categorized, and transmitted. The success metric for such a system is whether the emotion I sent to another user is interpreted correctly.

The interactive model is based on the idea that emotions are constructed in a process of interaction. The goal of the system in this framework is to provide a place to reflect on emotion. The metric of success is whether the system helped the user interpret, understand, and reflect on their emotional state.

Scheme by the author

How models work

I find the comparison between informative and interactive models important today. In a world of deceptive patterns, misleading charts, and AI that maximizes engagement over safety, it is important to design systems that can not only comprehend emotions, but also offer a safe space for reflection and empathy without pushing the limits.

How would an information model be different from an interaction model? Let’s say I have a conversation with my childhood friend. Two scenarios:

  1. We use instant messaging, emoji exchange. Transmission of emotion is limited by the range of animations / character settings (Still, it can be fun). My sadness becomes a static crying cat, and my friend’s irony becomes an animated character resembling her, but with orange hair. This is an information model.
  2. We use the email agent EmoteMail (old and dead, but interesting nevertheless). EmoteMail took photos of the user’s face as she wrote an email. Each photo was automatically placed near the paragraph being written when it was taken. Moreover, paragraphs were color-coded to reflect how long each took to write. (Happy this is not my compulsory work email agent. But a conversation with a friend, a long-distance flirt, or a quarrel could be, perhaps, an interesting experience.) This is an interaction model.
Emotemail. (Source)

We have two totally different conversation spaces. The system can initiate analysis of the interaction result (information model, emoji) or catalyze interpretations of interaction (interaction model, EmoteMail). Unlike emoji-type apps, EmoteMail-type apps remove the predetermined classifications, providing more direct access to the emotion of another person and an instrument for interpretation via data collection and representation. Predetermined classifications may limit the overlay of cultural and situational context. (Have you ever wanted to react to someone’s Instagram story, but pressing the heart, fire, or clapping hands was deeply contextually inappropriate?) In EmoteMail, the context of two people enhances the meaning of interaction, and the visualization of behavior provides clues to a person’s emotional state, but it doesn’t provide answers. It allows users to draw their own conclusions, providing a framework for data collection as a playground for interaction.

However, the design system might have been pushed too far. In a forum discussing the project, a user named Adam Kazwell wrote: “the thing that scares me about EmoteMail is having the recipient see what didn’t show up in the final draft. What value is added knowing that I misspelled recipient 3 times before I posted this comment? (…) If you want more than just cold-hard text, maybe pick up a phone or meet face-to-face :)”

That was in 2004. In post-COVID 2025, full of cold-hard texts and of videoconferencing that is no perfect substitute for face-to-face communication, perhaps it is a good time to build design systems using both interactive and information models.

Not only may we not want to overshare in design systems, as Adam Kazwell noted, but sometimes we also need time to process and understand the emotion we are experiencing. That’s why I don’t talk to AI about my emotions—I don’t want any priming or forcing; I need to get there myself. And with the recent Congressional hearing Examining the Harm of Chatbots, on the tragic deaths of teenagers who interacted with AI agents, the question of the safety of technologies that can mimic empathy as confidants is pressing. Perhaps the interaction model of computing can provide ideas for constructing safer systems and for balancing the widely used information model, which may understand emotions but drives them toward engagement.

An example of the interaction model in data visualization that gives space for reflection on emotions is the Tied Knots project, which tells stories of harassment in academia. It provides users with a space to reflect on emotions and lived or observed experiences, and it fosters a sense of community without pushing users toward any particular conclusion or emotion. It prompts users to assess the situation and perhaps even make some personal decisions. Another example is Affective Diary, a data visualization project that empowered participants to track their emotional experiences via guided questions and sensor tracking of arousal and movement, something that can now be found in health trackers like Oura. However, researchers found that graphs were not the best way to connect with emotional experiences, and proposed that data visualization for empathy should look “familiar”.

The way social media platforms mine user data belongs more to the information model, with the intention of correctly understanding, predicting, and profiting from the user’s emotions. On the positive side, health-tracking apps and devices also use an information model, collecting physiological data to assess the user’s physical and emotional state, which, with ethical data collection, can promote wellbeing and improve health. A good example of an information model that provides such reflective space without pushing boundaries is the app How We Feel, which helps users understand their emotional state by offering hundreds of emotions to choose from, each with its own classification. At the end of the week, the user receives a data visualization of their emotion distribution, as well as tools to manage those emotions.

How We Feel new app feature. Image provided by the author.

What could be done better

I feel that the information model, although useful and important, has been overexploited for marketing, while the interaction-based model offers a humane framework for human-computer coexistence that deserves a place in current approaches to business and metrics. So many of our daily interactions are with design systems (especially those scaled to serve many users) that can handle big data but lack human touch and compassion.

I love the idea that a design system doesn’t get to know my emotions to analyze me (I don’t like you, Facebook), but instead gives me a space to make sense of my emotions. The question is: how can we sustain such design at scale, systemically?

There are promising examples of integrating empathy and meaningful interaction into the business model. For example, the Deep Viewpoints application, developed for the Irish Museum of Modern Art, provides visitors with a digital platform to share the emotions they feel when interacting with an artwork. Through mediation, users can also share reflections and questions in the form of a digital script, allowing others to access it and use it for their own reflective and interpretive experiences. Such an app allows museums to better understand their communities, and gives minoritized communities a participatory space in cultural dialogue. It lets people be active participants in their interactions with cultural heritage, an important engagement practice.

Deep Viewpoints. (Source)

Another example of a sensory experience that includes space for reflection is a study at Blair Drummond Safari and Adventure Park. Researchers built a multi-sensory device that allowed red lemurs and visitors to interact through smell, sound, and video. They found that not only did people stay longer, but the experience also increased their empathy toward the animals and improved educational outcomes.

“We allowed people to share in the same experience to try and get people to have a sense of understanding of the other, that we are sniffing together, which helps make animals more relatable and understandable,” said Ilyena Hirskyj-Douglas, director of the Animal-Computer Interaction Lab, who led the project. “As people, our impact on animals and the planet is far-reaching, and I hope that this empathy can shape how people think and behave towards animal conservation. Though it is really unknown what the lemur in this case thinks. Some zookeepers think of animal-zoo visitor interaction as a type of environmental enrichment.”

Perhaps if empathy is treated not as a resource to be extracted for engagement, but as a space in which to build both connection and business, our design systems and social systems will both benefit.

Categories: Data Science

The post Data Visualization & Affective Computing. Design That Manipulates Emotions or Design That Helps Reflect on Emotions? appeared first on Nightingale.

]]>
From Pixels to Parks: The Intersection of Data Visualisation and Urban Greening https://nightingaledvs.com/from-pixels-to-parks/ Tue, 04 Nov 2025 15:00:49 +0000 https://dvsnightingstg.wpenginepowered.com/?p=24350 Italian designer, Bruno Munari, famously argued that “Art shall not be separated from life: things that are good to look at, and bad to be..

The post From Pixels to Parks: The Intersection of Data Visualisation and Urban Greening appeared first on Nightingale.

]]>
The Italian designer Bruno Munari famously argued that “Art shall not be separated from life: things that are good to look at, and bad to be used, should not exist.” Beauty arises when form aligns with its exact function and material constraints—form follows function. I have recently been thinking about this conviction through the lens of two of my interests: data visualisation and urban greening. One is rooted in digital representation and the other in physical space, yet both fields share a foundational principle of design. Both disciplines strive to organise complex information and environments in ways that are legible, functional, and, at their best, enriching to human experience. A well-designed visualisation engages its audience through aesthetic appeal, drawing them in and making them more inclined to explore the information. Likewise, green spaces that are beautiful, not merely decorated but beautiful in their functionality and integration with nature, are more likely to be used and cared for. They build civic pride and contribute to overall wellbeing.

I lived in London for a decade, the final three and a half years spent in Brixton. There, I would frequent two distinct green spaces: Brockwell Park, a large public park where the neighbourhoods of Brixton and Herne Hill meet, and Brixton Orchard, a small community garden situated directly opposite Lambeth Town Hall. It was during this period that I developed a keen interest in community-focused urban green spaces. In the winter of 2023, I left London for the densely populated city of Taipei, the Taiwanese capital. I moved into an apartment within walking distance of Da’an Park, the city’s central green expanse, but the true joy was the city itself: greenery along every street and tucked into every corner, with plants of every species tumbling down off balconies. From my kitchen window, I could see that the man who lived in the top-floor apartment opposite had cultivated a veritable jungle atop his roof, where he would emerge daily at sunset to water his plants and sit quietly on a bench he had nestled under his palms. This set-up is quite common in Taipei. As Clarissa Wei described the city for The New York Times, it is “a literal urban jungle—ferns and large elephant ear plants sprout through the crevices of roofs and sidewalks with wild abandon”. Across the street from my apartment, a pocket-sized neighbourhood park was a constant theatre of intergenerational life, teeming with both children and the over-seventies. I was surrounded by nature in the heart of a capital city, and I loved it.

The pocket-sized neighbourhood park. Image provided by the author.
The roof garden opposite the kitchen window. Image provided by the author.

Urban greening, or green infrastructure, is the deliberate integration of vegetation—street trees, parks, green roofs, and living walls—with urban development to provide ecological, environmental, and cultural benefits. The protection of nature in urban spaces is essential for sustaining natural ecological cycles, and it provides crucial cultural ecosystem services: it softens the harshness of urban infrastructure, ensures a critical connection to nature, improves general wellbeing, and fosters social interaction. Taipei relies heavily on green spaces of all types for public life. The city serves as a compelling case study for urban greening in compact cities, defined by a “top-down” planning approach born of its limited land area. A study by Peilei Fan, professor of urban and regional planning, and colleagues found that most neighbourhood centres and one subcenter in the city “exhibit both high compactness and good green accessibility”. The central city rests on the ancient Taipei basin, bounded by major rivers (like the Tamsui), with steep, mountainous terrain rising abruptly on most sides. Other examples of compact cities in East Asia include Hong Kong and Singapore. London, by contrast, is a city that grew organically over centuries, resulting in its pattern of “urban villages”.

Since 2010, Taipei’s urban planning policy has shifted its focus from a broad, visible green strategy toward practical green landscaping schemes. This strategy gained momentum around 2014 with the emergence of an urban regeneration programme and the prioritisation of small green spaces, such as river corridor greens and pocket parks. Through government funding and the lease of state land, the Taipei Beautiful Programme and the Open Green Project have delivered small green spaces across the city. A complementary effort, the Garden City initiative, further promoted urban agriculture, including citizen farms (allotments), community plots, and rooftop gardens, particularly on school buildings. I used open-source data from the Taipei Government Open Data archive to illustrate the development of green spaces in Taipei City over the last fifty years. I first translated the raw data from Mandarin, then carried out exploratory analysis in Python. I plotted the time-series data as a raster-style graphic, aiming for an aesthetic that evokes an interconnected urban ecosystem. To map the locations of developed green spaces, I relied on data from the Taipei City Water Green Space Atlas, manually classifying all the green spaces listed in the atlas according to the definitions of the Garden City initiative.
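To make that workflow concrete, here is a minimal sketch of building such a raster-style time-series graphic in Python. The district names, year range, and counts below are illustrative stand-ins generated at random, not the actual Taipei open data:

```python
import numpy as np

# Hypothetical stand-in for the translated open data: green spaces
# opened per year (1975-2024), one row per district.
rng = np.random.default_rng(0)
years = np.arange(1975, 2025)
districts = ["Da'an", "Zhongzheng", "Shilin", "Wanhua"]
counts = rng.poisson(lam=np.linspace(0.5, 3.0, years.size),
                     size=(len(districts), years.size))

# Cumulative totals give the raster its "growth" gradient: each cell
# (district, year) holds the total number of green spaces to date.
raster = counts.cumsum(axis=1)

# Normalize each row to [0, 1] so districts with fewer parks stay visible.
raster_norm = raster / raster.max(axis=1, keepdims=True)

# The array can be handed straight to matplotlib for rendering, e.g.:
#   plt.imshow(raster_norm, aspect="auto", cmap="Greens",
#              extent=[years[0], years[-1], 0, len(districts)])
print(raster_norm.shape)  # (4, 50)
```

Swapping the random counts for the translated dataset, and choosing an organic colour map, is what turns this mechanical grid into the “interconnected urban ecosystem” aesthetic described above.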

Effective data visualisations and well-planned urban spaces seamlessly blend aesthetics and utility, aimed at enriching public experience and understanding. In data visualisation, this means selecting graphs, layouts, and interactive elements that communicate the underlying data with clarity to engage the target audience. Similarly, in urban greening, this translates to designing spaces and infrastructure that provide ecological benefit and prioritise the needs of the local community. Form follows function. Munari viewed the designer as a mediator, bridging the gap between expert knowledge and public life: “The designer of today re-establishes the long-lost contact between art and the public, between living people and art as a living thing.” Without effective design in data visualisation, the data remains inaccessible. With it, visualisation can empower the public to understand, question, and engage with critical issues. Urban greening, by its very nature, is public-facing. The design of urban green spaces is about making a city a usable and sustaining environment.

Of course, data visualisation can also act as a bridge between academic research, expert knowledge, and public understanding within urban greening contexts. As someone whose academic background is in neuroscience, not urban greening, I write this from the perspective of the public that these policies need to engage; the public must be integral to decision-making for the planning and design of their local green spaces. The ‘Citizen Dialog Kit’, an open-source toolkit, was developed to bring situated visualisation into public spaces through a set of interactive, wirelessly networked displays. By making local environmental data visible and understandable, it invites discussion and feedback from diverse community members who might not typically participate in traditional planning meetings. Residents can see the tangible results of past efforts and then contribute to ongoing dialogues about future greening priorities. In a recent study, socio-environmental scientist Thomas Mattijssen and colleagues presented a participatory application of GIS that bridges the gap between data-driven and citizen-centred urban greening. The authors used spatial modelling in community workshops, where residents contributed local knowledge, enabling researchers and local citizens to jointly identify greening criteria, translate them into indicators, and pinpoint potential greening locations. This accessibility fosters democratic participation in environmental decision-making.

Visualisations serve as a powerful tool to engage community members and translate complex datasets into compelling narratives, increasing both understanding and acceptance of environmental initiatives. For example, RisingEMOTIONS, a data physicalisation and public art installation situated outside the East Boston Public Library in 2020, aimed to engage communities directly affected by sea-level rise, encouraging their participation in planning adaptation strategies. Looking ahead, emerging technologies offer significant opportunities to enhance the role of data visualisation in urban greening and, more broadly, in climate policy. Innovations such as AI-assisted personalisation, mobile technologies for context-aware experiences, and real-time environmental sensing can amplify its impact, creating tools—imagine an app that allows a Taipei resident to visualise the projected, real-time impact of a new pocket park on local air quality—that make planning tangible. For a comprehensive look at how data visualisation can be leveraged to address sustainability goals, a recent article by social computing professor Narges Mahyar provides an excellent review.

In adhering to Munari’s principles, where design prioritises functionality, clarity, and intrinsic beauty, the design of data visualisations and urban green spaces finds its highest purpose in its utility to the public: “Art shall not be separated from life.” At their best, both disciplines demonstrate that “beauty arises from functionality”—whether in the efficient form of a bar chart communicating a climate trend or the purposeful design of a pocket park managing water runoff—both are functional in their communion with the public. Moreover, data visualisation can become an indispensable tool for urban greening. By effectively communicating complex planning concepts and policies, visualisation fosters community involvement and informs policy decisions that directly benefit local communities.


All images designed and provided by the author.


]]>
Exploring Data Detective Practices as a Class Activity https://nightingaledvs.com/exploring-data-detective-practices-as-a-class-activity/ Fri, 24 Oct 2025 16:24:00 +0000 https://dvsnightingstg.wpenginepowered.com/?p=24233 We reflect on our experiences arising from a recent computer science graduate class about data feminism, during which we explored the idea of being data..

The post Exploring Data Detective Practices as a Class Activity appeared first on Nightingale.

]]>
Figure 1. Example journey of Data Detective: beginning with defining a critical problem or question, identifying gaps (“What’s missing in the picture?”), searching for data, and confronting barriers—missing, partial, or deliberately obscured datasets. Each step provided unique insights into the relationship between data, society, and power dynamics. (Illustration by © Zezhong Wang & Ruishan Wu)

We reflect on our experiences from a recent computer science graduate class about data feminism, during which we explored the idea of being data detectives. In this report, we explain what we mean by Data Detective: an active approach through which we, as individuals, can engage the underlying questions posed by D’Ignazio and Klein: “Data science by whom? Data science for whom? Data science with whose interests in mind?” By connecting individually through personal reflection, data literacy, and critical engagement, our goal is to inform and inspire those interested in integrating similar methods into their own classes.

As our society continues to evolve, more and more of the information we need is stored as data, and many of these repositories are growing into what we refer to as Big Data. In the process, data becomes more challenging and less accessible to us as individuals. We, as visualization researchers, work on creating visualizations as at least part of the solution to this problem. However, much of our data is still not visualized, and even when it is, individuals often find it challenging to understand. How do we cope with this? How do we teach our students to cope with this continually expanding problem?

In our data feminism class, we introduced concepts such as visual variables, physicalizations, and assumptions about knowledge development (e.g., positivism and interpretivism), along with reflection and discussion based on reading the book Data Feminism. We then explored developing an active practice through which we would document our investigations of both qualitative and quantitative data under various themes. We now term this active practice being a Data Detective.

Our concept of Data Detective is modeled on detective work in a more general sense, in which a person uses coherent, time-based record-keeping of their activities to gain a better understanding of something they initially do not know but want to understand. Thus, to act as a Data Detective is to discover and conduct the purposeful, documented, and reflective actions needed to gain access to the desired data. This detective work ideally results in access to the desired data, an understanding of the data, and an understanding of the detective process involved.

The term Data Detective appears in various contexts, so it is important to clarify our specific approach. It is unlike children’s books that suggest counting objects (like red cars versus white cars), Harford’s statistical literacy guide with its ten rules for making sense of statistics, or visualization workshops that give children a gamified sense of accomplishment. Our approach also differs from Inselberg’s multidimensional data detective work, which focuses on analyzing existing visualizations, and from data activism approaches that emphasize community engagement.

We visualized our investigative approaches as journeys: beginning by defining a critical problem or question, identifying gaps (“What’s missing in the picture?”), searching for data, and confronting barriers—missing, partial, or deliberately obscured datasets. Each step provided unique insights into the relationships between data, society, and power dynamics.

Examples

Throughout the semester, students undertook diverse projects with strong societal relevance, including topics such as gender bias in politics, barriers faced by women in entrepreneurship, the functions and ideology of pockets constrained by historical gender roles, and gender representation within STEM academia. 

One student examining women’s representation in political institutions vividly illustrated the practical challenges of data detective work. Initial exploration quickly highlighted systemic data gaps as key datasets were fragmented or unavailable. The student navigated through a frustrating landscape marked by opaque official sources, partial records, and silences. Despite challenges, this data detective journey offered significant emotional and intellectual rewards. The student discovered patterns of marginalization, for instance, women are frequently relegated to peripheral roles rather than core decision-making positions. Each painstakingly gathered dataset provided clarity about structural inequalities. Ultimately, the effort became a tangible act of resistance against invisibility and marginalization.

Figure 2. Data Detective journey created by © Ruishan Wu.

Another student explored the challenges women encounter in achieving tenure in Canadian academia. Initially optimistic, the student encountered considerable barriers, including incomplete or outdated datasets and inconsistent categorization across institutions. Interviews became essential to fill these gaps, highlighting how data detective work can require alternative methods beyond computational data collection. The journey revealed systemic biases: women are disproportionately assigned tasks that correlate with lower job satisfaction and hindered career progression.

Figure 3. Data Detective journey created by © Haidan Liu.

Working with both big data & personal data

As we moved through the process, we found ourselves blending two approaches to data visualization that are often kept separate: working with big data and working with personal data. Big data showed up in the external datasets we chose to investigate, such as government records, institutional statistics, or public health databases. These are the kinds of large-scale, structured data commonly associated with the term big data.

On the other hand, personal data and visualization came into play as we reflected on our own experiences navigating these data landscapes. By documenting our paths through note-taking, diagramming, and visualizing our steps, we deepened our understanding of the datasets themselves and uncovered what was missing, what was hard to access, and where our questions should lead next.  

Central to our pedagogy was encouraging students to critically reflect on their data practices. We structured reflective exercises to surface the implicit power dynamics in data collection and usage. Students were prompted regularly to question: Whose data are we using? Who collected it, and for whose benefit? Who controls access, and how does that affect analysis?

This reflexivity deepened our critical engagement, enabling us to overcome technical challenges and to interpret the implications of the findings that emerged from the data.

We suggest one possible pathway to actively take on the role of being a Data Detective:

  • Initially clarify what one is looking for—this is before one has the data.
  • Develop a timeline starting from the current moment to track the process by which one gains or loses access to the data.
  • Choose a currently promising direction for finding more information (for example: ask a person, search the web, or visit an institution).
  • Collect and reflect on the information gathered, filling in one’s timeline with data, facts, and responses, including emotional reactions and frustration levels.
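As a purely illustrative sketch (the field names and entries below are hypothetical, not part of the class materials), such a timeline can be kept as structured records, so that the search process itself becomes data one can later reflect on or visualize:

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class TimelineEntry:
    action: str          # e.g., "searched portal", "emailed archivist"
    outcome: str         # what was found -- or what was missing
    frustration: int     # self-rated, 1 (calm) to 5 (exasperated)
    when: datetime = field(default_factory=datetime.now)

# A Data Detective journey logged as it unfolds.
journey: list[TimelineEntry] = []
journey.append(TimelineEntry("searched open-data portal",
                             "dataset fragmented by year", 3))
journey.append(TimelineEntry("emailed records office",
                             "no reply after two weeks", 5))

# The log can later be summarized or visualized -- for instance,
# locating the most frustrating step of the journey.
peak = max(journey, key=lambda e: e.frustration)
print(peak.action)  # emailed records office
```

Keeping the emotional responses alongside the facts, as the last step of the pathway suggests, is what lets the resulting personal visualization show both the data landscape and one’s experience of navigating it.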

By actively conducting Data Detective projects in our class, we used personal visualization of our detective process to learn about both institutional and personal data, while revealing many facets of our society. Each data point gathered and each visualization created represents a small act of making the invisible visible, contributing to more equitable and inclusive understandings of our complex social world.

Acknowledgement

We thank our colleagues and reviewers for their thoughtful comments. This research was funded in part by NFRFR-2022-00570 (A Co-Design Exploration), NSERC Discovery Grant: Interactive Visualization RGPIN-2019-07192, and Canada Research Chair in Data Visualization CRC-2019-00368.

Categories: Data Literacy


]]>