Building Tableau Dashboards for the PowerPoint Download
https://nightingaledvs.com/building-tableau-dashboards-for-the-powerpoint-download/
Nightingale, The Journal of the Data Visualization Society | Published Thu, 26 Mar 2026
Working in reporting and analytics for the last six years has made me realize an uncomfortable truth about Tableau: Your beautiful interactive dashboard will often become a static PowerPoint slide.

If you work in sales ops, finance, or any executive-facing analytics team, you already know this. Your vice president won’t open Tableau Server at 9 a.m. before the board meeting. They’ll download your dashboard as an image or PowerPoint file, paste it into slide 17, and present it to the C-suite.

Once I accepted this reality, I started treating it as a design problem. Here are five non-negotiable lessons from my Tableau journey.

The first Excel dashboard, created in 1990 using the first version of Excel for Windows. Source: Microsoft

1. Design for PowerPoint From Day One

Device preview matters far more when your dashboard will live in a PowerPoint deck.

In the early stages of redesigning an executive-level sales report, I built my dashboard in Tableau’s default “Desktop Browser” view. When I downloaded it as PowerPoint, it crushed into a single slide with illegible text — a formatting disaster right before a leadership presentation.

The fix here is using Tableau’s built-in PowerPoint layout (16:9 aspect ratio) from day one.

Source: Rituparna Das

This ensures your dashboard fits perfectly into standard Google Slides or PowerPoint without awkward cropping or white space. Don’t design for Tableau’s default dimensions — design for where your dashboard will actually be consumed.

Pro tip: Always test your export before the final version. Click “Dashboard > Export as PowerPoint” to preview exactly what stakeholders will see.

2. Accept That 80% of Functionality Disappears

This is the hardest lesson: You must build assuming zero interactivity.

What dies in PowerPoint:

  • Filters (static view only)
  • Parameters (whatever was selected during download)
  • Hover tooltips (invisible)
  • Drill-downs (gone)
  • Dashboard actions (non-functional)

This changes your design strategy. Now you have to build a static version for each filter setting your users will want to view. For example, my executives were interested in seeing pipeline performance across sales regions, sales clusters, business units, and product lines. What would have been one dashboard filter became four separate dashboards I had to create:

  • “Pipeline_Review_by_Sales_Region”
  • “Pipeline_Review_by_Sales_Cluster”
  • “Pipeline_Review_by_Business_Unit”
  • “Pipeline_Review_by_Product_Line”

Yes, it’s more work. Yes, it feels redundant. But it’s the only way to ensure your stakeholders see what they need without interactivity.

Every critical insight must be visible on page load. If it requires a click to reveal, assume it will never be seen.

3. Use Containers for Layout Control

When your dashboard contains multiple visualizations, containers keep everything locked in place during the PowerPoint export. Without them, floating objects shift unpredictably — your perfectly aligned KPI cards end up overlapping your bar chart in the downloaded version.

PowerPoint downloads don’t tolerate white space. A minimalist Tableau dashboard might look elegant on screen, but it looks unfinished and unprofessional in a deck. Executives expect dense, information-rich slides.

Why containers solve both problems:

  • They lock your layout in place (no shifting elements)
  • They help you maximize space efficiently (no awkward gaps)
  • They give you precise control over how information flows

Source: Rituparna Das

This dashboard exports with excessive white space, making it look unprofessional in decks.

Best practice workflow:

  1. Create a low-fidelity mockup of your dashboard layout
  2. Build the container structure first (horizontal and vertical containers)
  3. Drop visualizations into containers last

Pro tip: Watch this Tableau container best practices video before building your next dashboard — it’ll save you hours of reformatting frustration.

4. Establish Governance Standards for Version Control and Collaboration

If you’re working collaboratively or managing multiple dashboard versions, implement a simple visual system:

Source: Rituparna Das

Use the color coding available for dashboards:

  • 🟢 Green: Production-ready, safe to download
  • 🟡 Yellow: Work in progress, do not present
  • 🔴 Red: Draft/testing only

Keep consistent, clear worksheet naming conventions. This will save your sanity.

❌ DON’T: “Bookings (1)”, “Bookings (1)(1)”, “Sheet 3”
✅ DO: “Q4_Bookings_Final”, “Pipeline_Review_v3”, “Pipeline Coverage_BarChart”

5. Add Company Logos

Align as closely as possible to your organization’s standard slide deck template.

Why this matters: Your dashboard might be internal today, but it’ll be in a client presentation tomorrow. When your VP forwards it externally without asking you first (and they will), professional branding matters.

Where to place logos:

  • Top-left or top-right corner (consistent with company templates)
  • Footer with date/data source
  • Consider adding a “confidential” watermark for internal metrics

The Bottom Line

The moment you accept that your Tableau dashboard will become a PowerPoint slide, you start designing better dashboards.

Stop optimizing for interactivity. Start optimizing for screenshots.

Use the 16:9 layout. Build static versions of filtered views. Lock everything in containers. Name your worksheets like a professional. Add your company logo.

Your stakeholders don’t care about your elegant parameter actions if they can’t paste your dashboard into their Monday morning deck.

Sometimes being a great analyst means accepting that your masterpiece will be Ctrl+C’d, Ctrl+V’d into slide 23 — and designing for that reality from the start.

Category: How To

The Tiles That Made Me: Mapping Friendship through the Lens of AI
https://nightingaledvs.com/the-tiles-that-made-me/
Published Thu, 19 Mar 2026
According to the Oxford Dictionary, friendship is a “voluntary, personal relationship characterized by mutual affection, trust, and support.” To me, though, friendship is an authentic, trustworthy partnership built on fun, kindness, and understanding.

It’s the size of the smile on your face when you see someone. It’s the decision to stay in touch with a niece long after family events end. It’s the fragile silence between you and a friend who couldn’t support a recent life choice.

As a data designer, I’ve always been obsessed with how we categorise the intangible. Recently, I set out to map the people who have shaped me. I didn’t want a balance sheet, but I did want to see the patterns. A relationship always evolves; this would only represent a snapshot in time.

The Taxonomy of Connection

I began by listing every person I care about: first from memory, then verified against my friends list on Facebook. But as I opened my spreadsheet, the questions started to flood in. Can family members count as friends? For example, my nieces and I have been chatting nonstop for years now. We grew fond of each other through the circumstance of birth, but we stayed in touch by choice. Does that make them friends? And what about friends who aren’t supportive of my life choices? One friend and I were very close seven or eight months ago, but we are not now. Are we still friends? If I exclude her from this, does that mean I have given up on our friendship? Also, I use the term “friend” very loosely; I warm to strangers easily. Is my new neighbour — with whom I have shared a few cups of tea — my friend?

To make sense of the friend list, I distilled friendship into three core metrics, scored on a scale of one to three, three being the highest rank possible: 

  • Reliability: Loyalty, faithfulness, and the feeling of being safe.
  • Empathy: Supportiveness, kindness, and open communication.
  • Joy: Playfulness, liveliness, and shared common ground (though one might question whether friendship is required for common ground; for the sake of this visualisation, I decided it was).

I also added two judgment values: Duration (how long we have been friends) and Contact (how recently we spoke). To keep the data honest, I limited the scope to friends I had been in contact with in the last 24 months. I chose 24 months as the cutoff because it’s the period since my daughter was born. Spoiler alert: In a time when I often felt lonely as a new mother, the data showed me I was actually deeply loved.

From Sketching to Scripting

In my notebook, the design evolved rather quickly into a series of “tiles.” I remember having the visual in my head for a while, and I felt as if I were a vessel letting it out onto the paper. I wanted something that would represent the scale’s levels easily. Level one was a simple base; level three added complex detail. 

Source: Or Misgav

Initially, I used background colors to denote duration, but the palette was too loud. It made the story about “how good I am at making friends” rather than “how these friendships built me.”

Source: Or Misgav

Then came the pivot. Usually, I build these visualizations by clicking the mouse. A thorough process of copying, pasting, and double-checking layers in Illustrator and Figma would easily take three hours. But, inspired by the “vision to execution with a click” movement, I turned to Claude and Gemini.

I asked Gemini to help me write the prompt for Claude. It produced a Python script that processed my Excel file and rendered the stacked layers as PNG files. Claude taught me how to install Python on my Mac. (Honestly, I felt like I was back in the 90s, typing into a terminal to launch a game.) Then, “Boom. Your tiles are ready.” With a single click, the assets were generated. A few back-and-forths with Claude, and the grid was aligned. The work was done.
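The script itself isn’t shown in this piece, but the core of such a pipeline is small. Here is a minimal sketch, assuming each friend’s row scores Reliability, Empathy, and Joy from one to three (the column names and data are hypothetical); the real script composited image layers into PNGs, while this sketch stops at computing which layers each tile needs:

```python
# Hypothetical sketch of the Excel-to-tiles pipeline described above.
# Each metric scored 1-3 contributes that many stacked layers:
# level 1 draws only the base layer; level 3 adds the full detail.

def tile_layers(row):
    """Return the ordered list of layer names to render for one friend."""
    layers = []
    for metric in ("reliability", "empathy", "joy"):
        level = row[metric]
        if not 1 <= level <= 3:
            raise ValueError(f"{metric} must be scored 1-3, got {level}")
        layers += [f"{metric}_L{i}" for i in range(1, level + 1)]
    return layers

# Illustrative rows; the real data came from an Excel file (e.g. via pandas)
friends = [
    {"name": "niece", "reliability": 3, "empathy": 3, "joy": 3},
    {"name": "neighbour", "reliability": 1, "empathy": 2, "joy": 3},
]
specs = {f["name"]: tile_layers(f) for f in friends}
```

From a spec like this, each layer name maps to a pre-drawn image that gets pasted onto the tile in order, which is roughly the “stacked layers as PNG files” step the script automated.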

Source: Or Misgav

The Cost of Efficiency

As I looked at the finished folder, a strange feeling washed over me: I didn’t recognize the data. By automating the execution, I had accidentally bypassed the data familiarization stage — that meditative hour where you handle each data point with care and remember the person behind it. The tiles were beautiful, but they felt distant.

It raised a fundamental question for our field:
If the AI builds the layers, are we co-creators? Or are we just curators of our own memories?

End Result. Source: Or Misgav
How to read. Source: Or Misgav

The Tokens of Gratitude

Despite the digital distance, the final grid is a testament to my life. These tiles are me. They represent the people who stayed through puberty, the ones who signed my wedding book, and the new friendship that started when I collected my son from preschool, which grew close.

This project is more than a visualization; it’s a token of gratitude. It captures a snapshot of my soul as it exists in 2026. Shaped by humans, rendered by machines, and held together by the voluntary, personal relationships that make life worth mapping.

Category: Data Art

The Back of the Painting: On Structure, Integrity, and Data Visualisation
https://nightingaledvs.com/the-back-of-the-painting/
Published Tue, 17 Feb 2026
In the early 1420s, Fra Angelico, a Dominican friar and painter, completed his first large-scale work for the newly built monastery at San Domenico. The San Domenico Altarpiece is one of the Early Renaissance’s defining works and adorns the high altar where the friars once sang their hymns during the Divine Office. Last year, the altarpiece was removed for restoration and featured in a major exhibition across Florence. On the front, the polyptych depicts four haloed saints in a single unified space, each attentive to the Virgin and Child. The Virgin and Child are themselves surrounded by angels with vibrant multi-coloured wings, their feathers shifting through a prismatic palette that is particularly iconic of Fra Angelico’s work.

San Domenico Altarpiece by Fra Angelico. (Source: Web Gallery of Art)

To look at the back of the high altarpiece, however, is to see an intricate collage of wood from various centuries. It serves as a physical record of how the work has been altered as tastes have changed over time. In the seventeenth century, carpenters recut the original panels and added new wood to force the piece into a rectangle. Beechwood inserts, shaped like butterflies, and crossbeams cover the surface, running against the natural grain. Poplar meets beechwood, intersecting in different directions, each species moving discordantly with humidity and the passage of years.

Roberto Buda, a conservator who specialises in wooden panel paintings, spent close to nine months stabilising the altarpiece’s structure. Working with his team, he removed the existing crossbeams and butterfly-shaped inserts, replacing them with carefully matched old poplar wood infills aligned parallel to the wood’s grain. A new frame was added with conical springs that allow the wood to move naturally. “It’s a house,” Buda told the Financial Times during the restoration. “If you don’t have a good foundation, it doesn’t hold up. The painting will never look good if the support is not right.” 

Months later, as I sat at my laptop placing an axis in the centre of the page, I thought again about this quote.

2025 marked a deliberate transition in my career. During my PhD in experimental neuroscience, I learned to do many things at once. I built hardware and software. I designed experiments. I ran those experiments, analysed the data, visualised the results, wrote papers, and taught students. Academia rewards this kind of breadth, and a range of technical skills accumulates quickly. Yet I found myself most engaged at the very end of the workflow, sitting with a dataset that had not yet been interpreted. I wanted to slow down and look for the narrative in the data. To focus not only on results but on how those results are communicated—clearly, honestly, beautifully.

In academic research, figures are often produced in haste, appended at the end of the pipeline. There is a script, a deadline, a familiar plotting function. In Python, with the visualisation library Matplotlib, you can call plt.bar(), and a chart appears. Microsoft Excel goes further still, delivering a fully formed graphic with colours and proportions chosen on your behalf. I wanted to build visualisations with greater intention and technical freedom, and this is what led me to the open-source JavaScript library D3.js.

D3 stands for Data-Driven Documents and is a low-level library which uses the full capabilities of web standards such as CSS, HTML, and SVG to build sophisticated and interactive data visualisations. While other visualisation tools hand you a bar chart or a scatterplot, to represent data in D3 you must manually calculate the scales, define the coordinate system, and bind the data to a graphical element. You must decide exactly where an axis sits and how a margin breathes, what a data point is—a circle, a path, a mark—and how it behaves when the data changes. Nothing appears unless you build it. D3 is a workshop full of raw timber and hand saws.
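To make “manually calculate the scales” concrete: the arithmetic hidden inside a call like D3’s scaleLinear is just an affine map from a data domain to a pixel range. Here is that arithmetic sketched in Python (the function name and numbers are illustrative, not from the project):

```python
def scale_linear(domain, range_):
    """Map a value from a data domain to pixel coordinates --
    the arithmetic a call like d3.scaleLinear() performs for you."""
    d0, d1 = domain
    r0, r1 = range_

    def scale(x):
        t = (x - d0) / (d1 - d0)   # normalise the value into [0, 1]
        return r0 + t * (r1 - r0)  # stretch it into the pixel range

    return scale

# e.g. plot values 0-100 across a 500px-wide chart with 40px margins
x = scale_linear((0, 100), (40, 460))
```

In D3 you build one of these per axis, then use it to position every mark, tick, and label, which is exactly why nothing appears unless you build it.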

With this in mind, I applied to the Data Visualisation Society’s mentorship program, intent on learning D3. Under the guidance of my brilliant mentor, Sam Bloom, I spent ten weeks at the end of 2025 working through the library’s fundamentals and building an interactive visualisation. We focused on first principles before developing an interactive scatterplot to explore Ancient Greek colour perception. Progress was slow at first because the learning curve was steep, but as I learned to build in D3 and perform this kind of digital carpentry, visualisation began to resemble construction. Every line of code was doing structural work.

Figures included in this essay show examples of the interactive scatterplot, which examines the sensory dimensions of Ancient Greek colour by focusing on the major colour adjectives used by Homer in the Iliad. The Ancient Greek experience of colour was inseparable from motion and shimmer. Colour was a basic unit of information which reflected the natural world—encoding brightness and darkness as fundamental dimensions. Greek colour terms prioritised not only luminosity but also the play of light across surfaces, the texture of materials, even the social standing implied by a sheen or shade. It was a colour vocabulary rooted in lived perception, rather than the modern hue-based categories we use today. Selected excerpts from my D3 code illustrate how each visual element is constructed—for example, the multiple lines of code required to precisely position and size tick marks along each axis. The full project can be viewed here.

When a reader encounters a clean scatterplot, they see only the front of the painting. They don’t see the scaffolding: the decisions about scale domains or the choices about what not to encode. While the interactive scatterplot I built at the end of the ten weeks was modest, I could explain why each element existed and how it related to the data. Each decision—scale, colour, interaction—could be justified. Good data visualisations often look deceptively simple. But this clarity is the result of many intentional decisions about the data and the visual design. 

For example, ~90% of the charts the Financial Times publishes are bar charts or line graphs, yet because these charts adhere to a defined set of design principles, down to the very placement of the title and the subtitle, the FT’s graphics are some of the most recognisable in newsroom data visualisation. This coherence is maintained through meticulous style guides, which dictate everything from the weight of an axis line to the specific hex code of a categorical blue. These guides function as a visual vocabulary or grammar. Alan Smith, the FT’s Head of Visual and Data Journalism, who led the design of the FT’s visual vocabulary, has previously championed the idea that a chart should be as readable as a sentence. Alberto Cairo, a professor of visual journalism at the University of Miami, has often argued that the most important part of a visualisation is the “reasoning” that happens before the first pixel is placed. In his book, The Art of Insight, he argues that there are really no rules of data visualisation, there’s just reason. Every design choice must be a defensible, rational response to the data and the intended audience.

These ideas are not confined to style guides or theory; they are persuasive when also used for animation and interactivity in visualisation. When such principles are applied with narrative intent, even complex data can be immediately comprehensible to an audience. A widely cited example of the power of a simple but intentional use of data visualisation is Hans Rosling’s 2006 TED talk. Rosling revealed patterns in a complex dataset through an animated scatterplot in which countries appeared as circles, mapped by measures such as income per capita (on the x-axis), life expectancy (on the y-axis), and population (the size of the circle). As the animation unfolds, these circles shift across the axes, allowing long-term trends to emerge gradually rather than all at once. Rosling paired this animation with carefully selected narration and emphatic gestures to guide his audience to the most meaningful changes as they occurred. The result was a complex global health story made easy to understand through intentional narrative decisions and clear visual structure.

The painted surface of Fra Angelico’s altarpiece is inseparable from its support. The relationship between the painted surface, the underlying preparation, and the wooden support beneath, makes the altarpiece a three-dimensional object rather than a flat image viewed only from the front. The butterfly-shaped beechwood inserts, which were set against the direction of the grain, introduced stresses that increased the risk of cracking, jeopardising the paint layer.

A data visualisation is a three-dimensional object of logic. If the underlying structure is weak, if scales are arbitrary or axes misleading, the surface won’t stand up to scrutiny. The narrative ‘paint’ (the colour palette, the interactivity, etc) will eventually crack. For example, decisions about axis scales depend on what counts as meaningful in a given context. Although it is often suggested that a y-axis should begin at zero to preserve proportional accuracy, this convention can obscure important variation when the relevant changes are small, as is often the case with climate data. A review by Steven Franconeri, professor of psychology at Northwestern University, illustrates this clearly: a temperature chart anchored at zero degrees Fahrenheit flattens visible change, while a version scaled to the relevant temperature range makes trends legible without distorting the data. A widely criticised, since-removed National Review article employed a temperature chart with a lower bound of –10 degrees Fahrenheit, a choice that made recent increases in global temperature appear negligible.
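The effect of the baseline choice is easy to quantify: the share of the axis a change occupies depends entirely on the axis span. A quick sketch (the temperature numbers are illustrative, not taken from the cited charts):

```python
def visual_share(change, axis_min, axis_max):
    """Fraction of the vertical axis that a given change occupies."""
    return change / (axis_max - axis_min)

warming = 1.5  # degrees of change (illustrative value)

# A wide axis like the criticised chart's (-10 up to 110 F)
share_wide = visual_share(warming, -10, 110)    # ~1% of the axis

# An axis trimmed to the relevant range (55 to 60 F)
share_tight = visual_share(warming, 55, 60)     # 30% of the axis
```

The same 1.5-degree change fills about one percent of the wide axis but nearly a third of the tight one, which is why the choice of domain is a structural decision, not a cosmetic one.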

Wood is a living thing and it needs to move. Buda and his team’s addition of a new, more encompassing frame made of chestnut wood and conical springs allowed the altarpiece painting to breathe through the natural movement of the wood in different directions. I developed my D3 visualisation in tandem with the JavaScript library React. In modern web development, React acts as the frame of chestnut wood. It is often described as a library for building user interfaces, but at its core it is a way of thinking about state and change. You describe what the interface should be given certain conditions, and React takes responsibility for updating it when those conditions shift. React holds the structure and lifecycle of my visualisation and D3 handles the math: scales, layouts, transitions that respond to data.

This article is not about JavaScript, or frameworks, or even data. It is about integrity in design. It is the realisation that the most important work we do as data visualisation developers is often the work that the reader will never see. When the San Domenico Altarpiece returns to the walls of the monastery, the public will only see the Virgin and Child, resplendent and serene. They do not see the new poplar inserts running parallel to the grain or the conical springs hidden within the frame. When we design good visualisations, we are doing something similar: we are building the foundations so that the story can stand on its own. We are building houses for data. Every axis, every scale, every line of code is a poplar insert aligned to the grain.

Category: Code

Decoding the Influence of Visual Data in Consumer Decisions
https://nightingaledvs.com/decoding-the-influence-of-visual-data-in-consumer-decisions/
Published Thu, 05 Jun 2025
Do you ever wonder why your eyes dart to a flashy “50% OFF” sticker, or to an appetite-stimulating visual of potato chips on a pack, before you even register the brand name? Have you ever been curious why some websites feel easier to use?

I posed this question to my colleagues. Their answers varied—bold colors, attractive discounts, evocative imagery, and clutter-breaking layouts. Fair enough, but what if the real insight lies beneath the surface? In the era of split-second decisions and endless scrolls, understanding what grabs attention, and what doesn’t, can make or break a design. Noticing isn’t just seeing; it’s where we look, why we pause, and what we skip—the essence of the “Stop–Hold–Close” model of communication.

Design is psychology, and consumer testing is at the crux of impactful design. AI is giving designers superhuman insight into what truly captures attention, and it’s not always what we expect. To design effectively, we need to understand how people see, diving deeper into the unconscious mind of the consumer, where decisions are truly made.

Eye tracking

The origins of eye tracking date back to 1879, when the French ophthalmologist Louis Émile Javal noticed for the first time that readers’ eyes do not skim fluently through text while reading, but make quick movements—saccades—mixed with short pauses—fixations. These early studies relied on naked-eye observation, in the absence of more advanced technology.

What is it?

Eye tracking is a technology that measures and records the position and movement of the human eye. These seemingly minor movements reveal powerful insights into attention and decision-making, often before the viewer is even aware of them. An eye-tracking device, whether screen-based or glasses, uses infrared light to detect where the user is looking on a screen or in a physical space.

Image credit: Sachi Mahajan

The science behind eye tracking

  • Fixations — These are moments of attention, when the eye pauses on a specific element to absorb information—be it a logo, button, or headline. They are measured in milliseconds, revealing what truly holds a user’s focus.
  • Saccades — The silent connectors. These are the rapid eye jumps between fixations. No data is absorbed here, but the path reveals how users navigate visually from one element to another.
  • Scanpaths — Think of this as the storyline of sight: a visual map showing the sequence of fixations and saccades, offering deep insights into user intent, flow, and engagement.

Image credit: Sachi Mahajan
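Eye-tracking software separates fixations from saccades algorithmically. One common approach is the velocity-threshold method (I-VT): inter-sample movements slower than a threshold are labelled fixation samples, faster ones saccades. A simplified sketch, assuming gaze samples arrive as (x, y) coordinates at a fixed sampling rate (the coordinates, rate, and threshold below are illustrative):

```python
import math

def classify_samples(points, hz=60, threshold=100.0):
    """Label each inter-sample movement as part of a fixation or a
    saccade using a simple velocity threshold (I-VT).
    points: list of (x, y) gaze coordinates sampled at `hz` Hz;
    threshold: velocity cutoff in coordinate units per second."""
    labels = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        velocity = math.hypot(x1 - x0, y1 - y0) * hz  # units/sec
        labels.append("fixation" if velocity < threshold else "saccade")
    return labels

# Four small movements around one spot, one big jump to a new spot
gaze = [(100, 100), (101, 100), (100, 101), (300, 250), (301, 250)]
labels = classify_samples(gaze)
```

Consecutive “fixation” samples are then merged into a single fixation, and the alternating sequence of fixations and saccades is what gets rendered as a scanpath.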

AI tools

And finally, for the part you’ve been waiting for—the superhuman sidekick reveal.

Say hello to Dragonfly AI, Attention Insight, and many more AI tools! They’re your backstage pass into the user’s brain. You can generate heatmaps in seconds and see exactly what catches the eye and what gets ignored.

Want to design like a mind reader? Now you can!

The role of AI in the future of data communication

As data becomes increasingly abundant, the challenge lies in clear and effective communication. One of the most intuitive tools in this space is the heatmap: a graphical representation that uses color gradients to encode data intensity. By mapping data across two axes, heatmaps offer an immediate visual summary that lets users identify key patterns and insights at a glance.
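At its core, the blue-to-red gradient such tools produce is a normalization step followed by a color interpolation. A minimal sketch, with a made-up attention grid:

```python
def to_heat_colors(grid):
    """Normalize a 2D grid of attention scores to [0, 1] and map each
    cell to an RGB color interpolated from blue (low) to red (high)."""
    flat = [v for row in grid for v in row]
    lo, hi = min(flat), max(flat)
    span = (hi - lo) or 1  # avoid dividing by zero on a flat grid

    def color(v):
        t = (v - lo) / span
        return (int(255 * t), 0, int(255 * (1 - t)))  # blue -> red

    return [[color(v) for v in row] for row in grid]

attention = [[2, 5], [9, 1]]  # hypothetical per-region attention scores
heat = to_heat_colors(attention)
```

Real tools add smoothing and overlay the result on the design, but the mapping from intensity to color is this simple idea.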

Traditionally, consumer testing a package design is a resource-intensive process that takes days or even weeks to yield results. Insights are typically gathered manually, compiled into spreadsheets, and then painstakingly analyzed, making the process time-consuming. In contrast, tools like Attention Insight streamline it dramatically. Colors from blue (low attention) to red (high attention) represent how users interact with a design, revealing which areas draw the most focus and which are overlooked—powerful insight into user behavior on websites, apps, and physical packaging (see the image below).

For example, while working on a peanut butter brand’s packaging, our goal was to highlight the product name and key benefits for a mass-market audience. We used Attention Insight. Interestingly, the heatmap revealed that users’ attention gravitated heavily toward the typographic elements more than the imagery we initially emphasized. We acted on that insight while still retaining the imagery, so the audience would take a closer look at the end product. Remember the Stop–Hold–Close approach! In the end, we combined the insight with our design strategy, a reminder of how data can sharpen creative intent.

Image credit: Sachi Mahajan. Copyrights reserved: Alpino Brand Super Oats pack design

“It is true that data visualization is part data science and part art. That being said, even the most creative art is supported by theories that explain why it works.”
— Michiko I. Wolcott

The process of using eye-tracking software is a powerful form of data visualization. It transforms raw user-attention data into intuitive heatmaps that visually communicate where users are most and least engaged. Rather than sifting through numbers or spreadsheets, designers can instantly grasp insights through color-coded gradients and focus areas. This streamlines decision-making, bridging the gap between analytical data and creative direction, and allows teams to iterate with both precision and empathy.

So, next time you make the brand logo a little bit smaller or add those colorful callouts, pause and ask yourself: Is it backed by data?

Senses & Sentiment: When data is too emotional for the screen
https://nightingaledvs.com/senses-and-sentiment/
Published Wed, 30 Apr 2025
After doing data visualization work for eleven years, in 2021 I decided to go back to school to gain the skills and tools for creating physical work. 

I’d already been fascinated with—and then obsessed with—data physicalizations and installations for almost a decade. My interest in them grew over time as I became dissatisfied with creating work for the screen; I was frustrated that no matter how much love I put into a work, I was ultimately vying for a few minutes of attention amongst dozens of other tabs. I wanted to create a whole world, but designing for a screen meant I could only share that world via tiny windows. I wanted people to step into my works with their full bodies, where I could monopolize all of their attention. 

At New York University’s Interactive Telecommunications Program (ITP), I learned how to control Arduinos (a.k.a. microcontrollers: tiny computers capable of simple operations for devices that interact with the real world), LEDs, and motors; extract sensor readings; and design circuits. I laser cut and 3D printed, I proudly overcame my fear of band saws and miter saws (a.k.a. big motorized blades), and I became comfortable in hardware stores.

But what I hadn’t expected was the luxury that two years away from client work afforded me, and grad school became a safe space for me to experiment. I went in with a singular goal: I wanted to see what my data stories could look like off the screen and into the physical world. 

What I found, instead, was myself.

◇◆◇

It’s a funny thing: When we create for the screen, we can go so fast. For years, I’d churn out projects in a matter of weeks, at most a few months. There’s a sort of satisfaction to that, being able to write a few lines of code, hit refresh, and see something appear on the screen. But it also means that I have no time to think about the why. I mean, I know why I’m designing a visualization a certain way, how the design choices support the data, the story I’m trying to tell. But I never have the time to question: Why am I doing this? What does this dataset, this story, mean to me? 

One of the unexpected things for me working on these physical pieces was the amount of time between ideation and realization, and how that wait time demanded introspection. I could no longer hit refresh and get a result in a few seconds. I had to get on 40-minute train rides for fabric samples, wait half a day for a 3D print to finish, or a week for electronic components to arrive. That wait time forced me to be intentional with every material choice and design decision, until I pared down the piece to its core. It also gave me a lot of time to get existential and think about why I was doing what I was doing. 

This was especially true when I created my thesis project, which took four months of working on and off, in between classes and exhibitions, to complete. The piece, “Untitled (we still land, home)” (2023) is introspective and deeply emotional for me. The final exhibit is a physical data installation featuring Chinese calligraphy ink that drips onto billowing fabric at different rates based on a dataset of Chinese immigration into America. The dataset itself is simple and small, only 17 rows and two columns, but the stories behind each number—including my own, as an immigrant—are anything but. 
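As a rough illustration of how a small dataset can drive a physical mechanism like this, here is a minimal Python sketch that maps per-row counts to drip intervals. The numbers and the scale bounds are invented for illustration; this is not the installation's actual data or code.

```python
def drip_intervals(counts, fastest=0.5, slowest=8.0):
    """Linearly map counts to seconds between ink drops:
    larger counts produce more frequent (faster) drips."""
    lo, hi = min(counts), max(counts)
    span = (hi - lo) or 1  # avoid dividing by zero when all counts match
    return [slowest - (c - lo) / span * (slowest - fastest) for c in counts]

# Hypothetical per-decade totals, purely illustrative:
sample_counts = [120, 450, 900, 300, 150]
intervals = drip_intervals(sample_counts)  # the 900-count decade drips fastest
```

A microcontroller could then open the ink valve once per computed interval, letting the dataset set the rhythm of the piece.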

Footage from Shirley Wu’s 2023 exhibit, “Untitled (we still land, home),” showing an oscillating fan that is billowing fabric. Credit: ITP Documentation Lab
Footage from Shirley Wu’s 2023 exhibit, “Untitled (we still land, home),” showing ink drips landing on the billowing fabric. Credit: ITP Documentation Lab

My thesis started when I first discovered a deeply rooted rage, and traced it back to my childhood and the quiet trauma I inherited from my immigrant parents who, by the time I was 10 years old, had made four life-changing moves to three foreign countries—taking me on all but the first. In my research into Asian-American and Chinese-American history I came across a common refrain: one of setting out to a new country filled with hope and expectation and finding a reality vastly different. I became fascinated with that tension.

I latched on to Chinese calligraphy ink as a symbolic representation of the immigrants and a scroll as the US land where they arrived. But I added a twist: To acknowledge the challenges of settling into a new country, I wanted the ink to land on a wildly billowing scroll—a literally difficult terrain—that would make it hard for the ink drops to land where they intended and cause them to splatter and scatter messily. Traditional calligraphy paper, beautiful but delicate, wouldn’t hold up to the large movements or quantities of ink I had in mind. So I spent weeks performing material tests with different swatches of fabric, taking into consideration their weight (lightweight enough for my fans to move them), their absorbency (just absorbent enough so that the ink makes interesting, messy patterns on landing), and their historical context.

I knew I wanted a visual metaphor of droplets being displaced, not being able to land where they intended. So I installed fans to move the fabric, and over time, the fans became symbolic of the anti-Chinese sentiments within and across American history. When people first entered the room with the installation, they couldn’t see the fans underneath the fabric, but they could hear their whir and, more importantly, they could feel the wind. It sounded like this:

I loved that metaphor for systemic racism: something intangible (and artificially created) that we often can’t see, but can feel.

A photo of a black fan under a white sheet.
An oscillating fan placed under the suspended fabric creates a billowing effect. Credit: ITP Documentation Lab
A black box, suspended from a belt-pulley system, drips calligraphy ink.
A custom-made device that houses ink, a pump, an Arduino microcontroller, and a battery pack. The device, mounted onto a belt-pulley system, drops calligraphy ink (a symbolic representation of immigrants) onto the fabric. Credit: ITP Documentation Lab
A close-up photo of the black ink on the fabric.
A close-up of a series of dots made from the black drips. Credit: ITP Documentation Lab
A close-up photo of the black ink on the fabric.
Some drips splatter and roll across the fabric, symbolizing displacement. Credit: ITP Documentation Lab

The wind also carried the smell of Chinese calligraphy ink—a scent that is deeply nostalgic for anyone who had to learn calligraphy growing up. I loved that as a subtle, shared sense of community and belonging. 

In the space between iterations, I [cried bucketloads, like seriously my husband would just randomly find me crying behind my computer every other week] realized that I was asking the piece questions I could never bring myself to ask my parents. And with each drop of ink I tested on each piece of fabric, I found my answers: Even if we don’t land where we intend, we still land and we make homes for ourselves. And that resilience is so beautiful. 

The slowness of the process gave me space to take the first step towards finding closure, a way to know myself.

◇◆◇

My thesis wasn’t the only place where I used human senses to conjure an emotional response. In total, I made four projects with “the drip drips,” as my classmate called it (lol). 

In my piece “wonder” (2022), I project an animated visualization of all the photos I took between 2018 and 2022 onto a bowl of water. And for the pandemic years, I drip water according to the number of anti-Asian hate crimes reported to the New York police department in that same timespan. When I originally designed it, I imagined the water drips distorting the projection as the visual metaphor for how distorted reality felt during that time.

A photo of Shirley Wu positioning a white bowl on a pedestal.
Setting up the exhibit “wonder”. Credit: Tuan Huang
A photo of a narrow gallery space with white walls and a tiled floor. Data visuals, in circular frames, hang from the ceiling and the bowl of water on the pedestal is in the center.
Inside the gallery space, with the bowl of water under the solenoid valve that drips water. Each of the seven hanging circles shows a data visualization of photos from a particular month. Credit: Tuan Huang
A diagram with concentric circles. In the center the text says “March 2018 - Tokyo.” The dots on the diagram represent photos taken that month. The further out the photo is from the center, the farther from home the photo was taken.
In the data visuals, each colored dot is a photo taken from that month. The color represents the predominant color in the photo. The further out the photo is from the center, the farther from home the photo was taken. Credit: Tuan Huang
A close-up image of a water drop creating a ripple on the surface of the water.
The drops distort images projected onto the surface of the water. Each drop corresponds to the timing of anti-Asian hate crimes in New York. Credit: Tuan Huang
A photo of people reading the data visuals.
Gallery visitors spending time with the exhibit. Credit: Tuan Huang

In data visualization, we pay attention to visual metaphors—colors and shapes that could hint at what the underlying data and story is about. But working in the physical, there were so many dimensions to play with, so many layers of meaning I could imbue in each of my material choices. And oftentimes, I’d find unexpected, unintended manifestations that only helped enhance the piece.

Footage from Shirley Wu’s 2022 exhibit, “wonder.” Credit: Ready Steady Cinema

What I hadn’t expected was for the mechanism I’d chosen to programmatically control the water dripping—a solenoid valve—to have a mechanical clicking noise every time it opened and closed. At first, I considered covering up the sound, fearing it would be distracting. Then, as I sat with the piece for hours (to make sure that the solenoid valve would hold up after many hours of continuous use), I realized that the continuous, staccato-ed clicking was a perfect audio complement to the piece. It sounded like this:

On the night I exhibited the piece, visitors told me again and again that even though there was no explanation of what the water drips represented, they internalized the subtle noise; it stirred in them an almost subconscious anxiety.

◇◆◇ 


Another drip drip piece followed the same pattern. 

“Though a patriarchy would privilege the changelessness of the sun over the inconstancy of the moon and you” (2022) is a data physicalization where red ink drips according to my 2020 menstrual data. The ink lands on watercolor paper printed with the 31 days of a month, marking the days I had my period. It is inspired by the shame I used to feel in my teenage years that somehow my body was abnormal because my periods never came on the same days of the month, until I realized that what was actually irregular was the solar calendar, which shifts from 31 days to 30 to 28 and sometimes even 29 days. By the end of the 30-minute performance, almost all of the days are marked red.

A photo of the setup, including a valve dripping red ink onto a strip of paper marked with 31 days. The paper moves on a gear system, and resets its position every 31 days.
Red ink that represents menstruation drips on a 31-day calendar that cycles through a year. Credit: ITP Documentation Lab
A close-up photo of the moving paper with the 31 days, nearly all of which are covered in red.
As the paper calendar moved, the ink drips would splatter and run off the paper. After 30 minutes, all of the days are marked red. Credit: ITP Documentation Lab

In this piece, the topic is emotionally charged for me, and for anyone watching it. It is imbued with a deep-rooted rage, a little rebellion, a f*ck the patriarchy. The experience of seeing it is visceral: The red ink is messy, it doesn’t land neatly within date boundaries, it splatters in between and all around, it bleeds into the next and pools together. It demands conversation.

Credit: ITP Documentation Lab

It also acknowledges that there are some datasets and topics that are too emotional for the pixel perfection of a computer screen (because how do we even begin to quantify an emotion?). But we try to force them into numbers and discrete rows of data anyway, then try to acknowledge that loss of fidelity by injecting pseudo-randomness and pseudo-noise into our code. It strikes me as ironic that when we do that, we’re trying to force a computer to do the very opposite of what it’s good at—which is to be precise and pixel perfect—when the physical world is already random and noisy all by itself.

We’re trying to force a computer to do the very opposite of what it’s good at—which is to be precise and pixel perfect—when the physical world is already random and noisy all by itself.
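That “pseudo-noise” is a familiar trick in screen-based work: when identical data points would stack into one crisp mark, we jitter them so they read as a messy cloud. A minimal, hypothetical Python sketch of the technique (not from any of these pieces):

```python
import random

def jitter(points, scale=0.05, seed=42):
    """Nudge each (x, y) point by a small random offset so overlapping
    marks spread into a cloud. The noise is seeded and reproducible,
    which is exactly the irony: the mess is manufactured, deterministically."""
    rng = random.Random(seed)
    return [(x + rng.uniform(-scale, scale), y + rng.uniform(-scale, scale))
            for x, y in points]

cloud = jitter([(0.0, 0.0)] * 100)  # 100 identical points, now a small cloud
```

The physical world needs no such function: ink splatters and water ripples supply the noise for free.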

And I am convinced that there is a whole category of data stories that belong in the physical realm—sensory and messy visualizations that urge us to engage with them not as precise manifestations of a dataset, but instead prioritize an emotional connection and understanding.

◇◆◇ 

When all of these projects were done, my husband made a mind-blowing observation: I design precise systems that manifest imprecisely, messily. And he further observes: The precise systems I’ve designed in my installations are much like the rigid boxes—screens, constraints, social identities—I’ve worked in for much of my career. But, the way that they manifest—messily, imprecisely—that’s me breaking out of those boundaries and finding freedom. 

That’s me finding myself.

Paperbase: A Window into Photographic Paper History https://nightingaledvs.com/paperbase-a-window-into-photographic-paper-history/ Wed, 26 Mar 2025 15:51:14 +0000 https://dvsnightingstg.wpenginepowered.com/?p=23158 The Lens Media Lab at Yale University’s Institute for the Preservation of Cultural Heritage has created Paperbase, an interactive platform for exploring the world’s largest..

The post Paperbase: A Window into Photographic Paper History appeared first on Nightingale.

The Lens Media Lab at Yale University’s Institute for the Preservation of Cultural Heritage has created Paperbase, an interactive platform for exploring the world’s largest collection of gelatin silver photographic papers.

For much of the twentieth century, photography was a paper-based medium, yet little research has explored how photographic papers influenced the art. Accessing and analyzing historical paper samples has been a challenge—until now.

The Lens Media Lab has cataloged more than 7,200 paper samples—dating from 1890 to 2010—documenting their material, technical, and visual characteristics. This dataset, the most comprehensive of its kind, is now publicly available, offering an unprecedented resource for researchers, conservators, and photography historians.

To describe it plainly, Paperbase is an incredible project about photography and its technical side—paper.

According to Paul Messier, Director of the Lens Media Lab, the paper collection “preserves the experience of a physical photograph” and through Paperbase “this experience is contextualized and made universally accessible.”

You can approach it as an analyst, dissecting the data, or as a photography enthusiast, admiring the masters of the past, or as a publisher.

As someone passionate about data visualization, what fascinated me most was the interactive tool built for this project. It’s incredibly flexible—one moment you’re looking at a map, the next you’re working with bar charts, then diving deeper into clusters or radar charts. 

“Collections viewers typically constrain your view to a narrow window. This is fine if you’re only interested in a small part of the collection, but if you want to see patterns at the collection level, you need to be able to see at that level. Paperbase was designed with this in mind.”

And through it all, you’re still surrounded by an overwhelming number of data points—every detail remains visible!

Screenshot

A closer look at the project

Look at these golden highlights on a deep brown background—reminiscent of antique furniture or old ornaments. They pull us into the past, revealing old photographs, inviting us to study the faces of those who lived before us. But they also ask us to explore the technical side—what kind of paper is this? What’s the thickness, texture, gloss, and whiteness?

Then, suddenly, all of it comes alive. A dataset that once seemed purely physical becomes interactive, shifting perspectives and taking on new forms. A powerful tool at the cutting edge of computational analysis.

You start with a chart, some bars, and dots—then suddenly, from those samples, faces from the past smile back at you. It’s magic.

One example of a photo book—velour black

Visual exploration of photographic history using data

Not every sample is an actual photograph—some are technical sheets or even stacks of blank photo paper. You can filter the dataset based on that too, turning each point or cube into a time capsule of an era.

For me, the most fascinating part was studying the historical trends. You can instantly see which decades favored certain types of photographic prints. The rapid rise of photography in the mid-1930s, its peak around 1950, and the gradual decline by the 1990s—it’s all there in the data.

Of course, this could reflect the specific collection methods rather than global trends, but it’s likely that this dataset also mirrors the overall history of photography, at least in the U.S.

Timeline, where every object is placed according to its type—whether it’s photographic paper or a photo book

Here’s the radar chart system—each element on the map can be visualized with a small glyph, a symbol encoding its characteristics. This adds another layer of texture to an already complex visualization.

Radar glyphs with paper properties encoded in them

A few clicks, and bars and squares transform into fascinating radar glyphs. Each object gets its unique symbol, allowing researchers to highlight and compare different samples up close.

Here are the parameters embedded in these radar glyphs (probably more useful to a photographic paper expert than a data viz enthusiast like me):

  • Thickness
  • Color
  • Gloss
  • Texture

When you analyze gloss dynamics over time, the chart suddenly becomes color-coded, revealing yet another layer of insights.

For serious researchers, the dataset allows filtering, analyzing, and downloading the exact data they need for further study.

The gloss-over-time chart is bright, but you can still perform some visual analysis and see the changes in the parameter over time

On the project’s website, you’ll find a few key numbers that help grasp the sheer scale of this archive.

Project by the Numbers

Another aspect I loved? The team’s storytelling about their journey of collecting, classifying, and visualizing this data. If you think this process was simple, just check out the algorithm breakdown on the homepage.

A fragment of the history of collection and classification; documenting that process is itself part of the project

At one point, the workflow splits into two paths—it turns out evaluating stacks of paper versus photo book samples is an entirely different challenge.

Beyond diagrams and algorithms, you can read the history of the collection itself—a story filled with dedication, loss, and 25 years of continuous work.

Data insights based on the Paperbase project

The website also features analytical insights, prepared for curious visitors—like a chart showcasing the increasing whiteness of photo prints over time. Ever wondered why old photos appear more yellow? No, they haven’t aged that way. Chances are early paper production used different whitening agents, reflecting changes in manufacturing technology.

Example of data insights as part of the Paperbase project

The visual style of this analysis differs from the project’s own, but it’s fascinating to see how different people approached the same data.

A few video tutorials will help you navigate the tool, spot intriguing trends, and discover unique patterns in the data. And beyond that—it’s simply stunning to look at. Highly recommended!

In the publications section, you’ll find a vast list of studies and research articles about this project. Perhaps your name could be among them one day?

I may not be able to fully appreciate the technical intricacies of photographic paper thickness and whiteness, but I can lose myself in admiring historical photos from past decades.

Step into history with Paperbase!

Beautiful item with the code name: #2075oo Defender Velour Black 1940

The Business MRI: A Smarter Way to Track Performance and Collaborate https://nightingaledvs.com/the-business-mri/ Sat, 30 Nov 2024 15:30:00 +0000 https://dvsnightingstg.wpenginepowered.com/?p=22548 It’s Monday morning, and the team is in a rush. Everyone scrambles to pull numbers from slow-loading tables published on the Tableau Server. Marketing is..

The post The Business MRI: A Smarter Way to Track Performance and Collaborate appeared first on Nightingale.

It’s Monday morning, and the team is in a rush. Everyone scrambles to pull numbers from slow-loading tables published on the Tableau Server. Marketing is stuck waiting for data to load; Operations is battling errors in their extracts; and Sales is frustrated trying to piece together their Excel sheets. After hours of delays and patchwork fixes, everyone finally makes it to the roundtable to discuss weekly progress:

  • What’s been happening?
  • What were the red flags?
  • What are the priorities for the week?
  • How do we refine the strategy to stay on track?

Sound familiar?

This is the reality for many organizations, with teams spending more time assembling outdated reports than actually solving problems.

Enter the automated business health deck

The business health deck in Tableau eliminates this chaos. It transforms the manual reporting process into a real-time, centralized system where every department’s updates are unified and actionable. 

Think of it as an MRI of your business: a tool that provides clear, real-time visibility into performance, allowing your team to focus on collaboration and decision-making, not data prep.

The process: from PowerPoint to Tableau

Here is the beauty of this solution. Instead of building individual PowerPoint slides for your weekly reports, each slide is replaced by a small Tableau dashboard. Tableau allows you to size dashboards to match PowerPoint slide dimensions, so you can easily create a dashboard per slide format. For example:

  • Marketing gets a slide with campaign performance metrics.
  • Sales sees pipeline data with bottlenecks highlighted.
  • Operations tracks budget and resource allocation in real time.
  • HR reviews hiring metrics and retention rates.

These Tableau dashboards are designed specifically for each department. 

They are compiled into one cohesive Tableau workbook. This workbook becomes your business health deck, pulling data in real time directly from your validated sources. It’s instantly accessible, always up to date, and formatted in a way that every department can use without a steep learning curve. And just like that, you’ve replaced outdated PowerPoint presentations with dynamic, data-driven insights, all in one place.
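The sizing trick is worth making concrete. PowerPoint's default 16:9 slide is 13.333 by 7.5 inches, and at the common 96 pixels-per-inch convention that works out to a fixed 1280 x 720 px dashboard. A quick sketch of the arithmetic (the DPI value is a convention you choose, not a Tableau requirement):

```python
def slide_pixels(width_in=13.333, height_in=7.5, dpi=96):
    """Convert PowerPoint's default 16:9 slide size (13.333 x 7.5 in)
    to a fixed pixel size for a Tableau dashboard, assuming 96 DPI."""
    return round(width_in * dpi), round(height_in * dpi)

print(slide_pixels())  # (1280, 720): one fixed-size dashboard per slide
```

Setting each dashboard to that fixed size means the exported image fills its slide without stretching or cropping.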

The challenges of manual reporting

Why is this shift so necessary? Because manual reporting isn’t just a time drain, it’s a serious barrier to organizational effectiveness.

  1. Wasted time. Teams spend hours pulling data, reconciling discrepancies, and formatting slides. The time spent preparing these reports often outweighs the value of the insights themselves.
  2. Inconsistent metrics. Different departments use different data sources and methodologies, leading to misaligned metrics and a lack of trust in the numbers.
  3. Siloed insights. Without a unified view of performance, departments miss opportunities for cross-team collaboration and strategic alignment.

The result? Teams are stuck reacting to outdated information instead of proactively addressing issues.

The solution: real-time reporting in Tableau

With the automated business health deck, these inefficiencies disappear. Here’s what makes it a game-changer:

  1. Essential time savings. Automating the deck eliminates the hours once spent assembling reports by hand.
  2. Live data access. The deck pulls data directly from validated sources in real time; no more waiting for Tableau Server to load or manually refreshing data extracts.
  3. Unified visuals. Each dashboard is designed to replace a PowerPoint slide, so teams get a format they already know and trust. No learning curve, just better data.
  4. Cross-department transparency. A single source of truth means everyone sees the same numbers, fostering collaboration and improving strategic decision-making.

Why It Matters: The business impact

The automated business health deck isn’t just about saving time, it’s about transforming how your organization operates.

  1. Actionable insights: by providing real-time transparency, teams can focus on solving problems instead of finding them.
  2. Proactive decision-making: with up-to-date data at their fingertips, leaders can make adjustments on the fly rather than waiting for the next reporting cycle.
  3. Efficiency gains: automating manual reporting frees up time for teams to focus on analysis and execution.

Real-life use cases: Turning data into action

This solution works across departments, tailoring insights to each team’s unique needs:

  1. Marketing: monitor campaign performance in real time and adjust strategies dynamically to hit ROI targets.
  2. Sales: gain pipeline clarity, identify bottlenecks, and prioritize high-value deals.
  3. HR: track hiring metrics and retention rates to ensure teams are fully staffed.
  4. Operations: oversee budgets and resource allocation in real time, avoiding last-minute surprises.

The design philosophy: simple, interactive, actionable

At its core, the automated business health deck reflects a visualization-first approach:

  1. Simplicity: complex data is simplified through intuitive visuals like charts, heat maps, and KPIs, which make it easy to identify key metrics at a glance.
  2. Interactivity: teams can explore the data directly in Tableau, uncovering patterns and trends without digging through endless rows of numbers.
  3. Action-oriented design: every visualization answers a specific business question, empowering teams to act immediately on the insights provided.

The future of decision-making

With the automated business health deck, your weekly meetings go from chaos to clarity: no more scrambling to pull numbers, reconciling inconsistent reports, or waiting for Tableau Server to load. Instead, you have a real-time, centralized view of your business, a tool that empowers leadership and teams alike to make smarter, faster decisions.

Meet Will Sutton, former Iron Viz Champion https://nightingaledvs.com/meet-will-sutton-former-iron-viz-champion/ Fri, 22 Nov 2024 19:21:07 +0000 https://dvsnightingstg.wpenginepowered.com/?p=22517 Iron Viz is the world’s largest data visualization competition. What began as a breakout session at Tableau Conference 2011 has grown into a global phenomenon and..

The post Meet Will Sutton, former Iron Viz Champion appeared first on Nightingale.

Iron Viz is the world’s largest data visualization competition. What began as a breakout session at Tableau Conference 2011 has grown into a global phenomenon and become a core part of the Tableau Community. Three Iron Viz contenders take center stage and have 20 minutes to tell the most compelling story using the same data set. The 2022 competition saw the crowning of Will Sutton as champion. On behalf of the Data Visualization Society’s Early Career Committee, I asked Sutton a bit more about himself and his win.

How long had you been doing dataviz when you entered the IronViz competition and were you making a living doing dataviz at that time?

I’d been working in data analytics for about 8 years before I arrived at the Iron Viz stage. However, it was only about three years before the contest that I really started focusing on improving my dataviz and Tableau skills. At the time, data visualisation was certainly an important part of my role, but so were the technical skills of SQL, R, and Python.

Can you walk us through the preparation process you went through for Iron Viz? What were some of the key steps or strategies you think contributed most to your success?

Preparing for Iron Viz was very challenging. In the final, you are given 20 minutes to build a data visualisation live on stage against two competitors, using a given dataset. It’s hard enough telling a compelling data story, but against a time limit and a live audience, it really adds a lot of pressure.

This was a real step out of my comfort zone, so rather than focusing on winning the contest, I aimed to make a win on my own terms. Let me explain. Career-wise, I was in a difficult spot: half of my team at work had been made redundant, my wife and I were having difficulty getting a mortgage, and I had just been selected for the final.

It was a lot to go through at the time, so I took a long weekend trip to Wales. Whilst I was out there, I realised what an opportunity I had—not the prize money, but the chance to show my skills to thousands of employers. So I viewed the final more as a showcase of what I could do with Tableau, rather than worrying about being first, second, or third. With that in mind, it got me thinking more about the positives that could happen after the event.

The competition environment can be intense. How did you manage the pressure and incorporate any feedback you received along the way?

I found it important to identify the main aspects that were causing me stress about the competition and to reframe them. The stress was only going to distract me from performing my best. One big hurdle I was facing was presenting in front of an audience of 5,000 people. Public speaking wasn’t my thing. I’m certainly not the most outspoken person, and when it came to presenting, I just wanted to get it over with as soon as possible.

Thinking back to my purpose—that thousands of people were going to see my work—I realised I really needed to sell my skills. So I approached it with the mindset that this was a skill I could improve, and if I could get through this event, future work presentations would be much less daunting.

I spent a long time reviewing TED Talks on education, which was the dataset I would be working on. The content was useful for ideas, but importantly, I could see and understand how other speakers approached the presentation aspect. I found great tips, such as starting the presentation with a question to get the audience thinking and adding more emotion-driven words to build interest in the topic. Once I realised the techniques being used, it gave me much more confidence in my own presentation delivery.

What was the very first indication that your life had changed after the win?

The first sign that things had changed was the sudden recognition. At the conference, people I’d never met were coming up to me, congratulating me on my presentation and sharing how it resonated with them. It was surreal to receive that kind of attention. But it came with lots of new opportunities: chances to speak at user groups, collaborations, and job leads.

The Tableau “DataFam” community is unique for a software program/tool. It’s well-known for being an accelerator in building skills in the program. How instrumental was it in contributing to your win?

Absolutely instrumental. The community truly helped me gain the skills I needed to win the contest. I had been part of community projects like #MakeoverMonday, which focused on regularly building data visualisations with simple datasets. My initial attempts certainly wouldn’t win any awards, but by seeing what others submitted and receiving feedback on my work, I saw a rapid improvement in my dataviz portfolio.

The support and feedback from the community helped me grow and pushed me to improve. It then gave me the confidence to do more of my own projects and even help set up the community project #GamesNightViz, which was all about visualising data about games. Funnily enough, how I ended up in the final was from a visualisation I built to promote the #GamesNightViz initiative.

Data visualization competitions are all about standing out and pushing the envelope. Where did you find inspiration for your winning visualization, and how did you balance creativity with clarity?

For me, it comes back to the question: “What do you enjoy?” Since I was using the contest to showcase my skills, I didn’t want to become known for something I wouldn’t want to do in the future. Plus, I think when you enjoy a topic, you’re naturally more interested and creative in what you create.

A big deciding factor for me was how the visualisation would be viewed. In the first round of the contest, I knew the work would go to a panel of judges who would take their time reviewing it, so I could create a bit more of a personalised user experience. Here, the user would play a game, which would add their results to the survey data I had collected, making the data more interesting as it was relative to them—for example, “You chose X, and Y% of respondents agreed with you.”

For the final, it’s a three-minute presentation. There’s less need to dive into details or go overboard with creativity, as it’s all about the clarity of the message. Having said that, as a presenter, you’re in control of what is displayed on screen and how the content is consumed. So you can direct and twist the story as you present it. This worked well with my love of animations to get my audience’s eyes back on the screen when I shared the next insight.

I see from LinkedIn that you joined the Tableau consultancy The Information Lab in June 2022, just after you won IronViz. Can you talk about that and how it was influenced by your achievement?

Winning Iron Viz definitely opened up new opportunities. Shortly after the competition, I received a lot of job offers and, yes, joined The Information Lab shortly after the contest win. They’re a Tableau partner, very well known in the community, and it was too good of an offer to refuse. Speaking with other competitors, past and present, Iron Viz allowed them to change and develop their roles regardless of the result in the final.

It’s been approximately 2.5 years since you won IronViz 2022. You mentioned that a lot has changed for you career-wise. Aside from The Information Lab, how has winning affected your career or invited other opportunities?

It has been fantastic for meeting new people. Before we headed on stage, they played a little intro video that showed us in our hometowns, so the bright lights of Las Vegas got to see me running around sleepy Suffolk. I had folks come up to me saying they were from a small town too and loved the video.

I’ve also become more involved in the Tableau community, contributing to projects and attending events to help others develop their skills. It’s also helped with working more with Tableau. Lately, I have been working on incorporating LangChain (an AI framework) into Tableau, which is something I’m super excited about and calls for a new set of skills. Overall, winning Iron Viz opened doors and gave me the confidence and contacts to pursue new and exciting opportunities.

You mentioned that post-contest life called for a new set of skills. Can you talk more about what this means specifically?

In the build-up to the contest, I was heavily working on my dataviz skills. After the contest, public speaking became a big part of my life—I had to get comfortable presenting in front of large audiences and engaging with people at events. I still find it tiring as an introvert, but it’s been so beneficial to develop confidence in this area.

Since the contest, I’ve been pursuing developments in AI, building on what I’ve learnt about dataviz. It’s a very different set of skills, with much more jargon, but it’s been very rewarding to find new solutions with this technology.

Aside from recommending that they win Tableau IronViz, what’s the most important advice you would give to early career data-vizzers?

Let your interests lead you. When you’re genuinely interested in a topic, it shows in your work. You’ll put in more effort, be more creative, and produce visualisations that really resonate with people.

Also, focus on what skills you want to gain and what a win looks like on your terms. Whether it’s technical abilities, storytelling, or design principles, knowing what you want to improve will guide your learning.

Will’s final viz

The viz that got Will into the finals

Categories: Community

The post Meet Will Sutton, former Iron Viz Champion appeared first on Nightingale.

Review: Data Visualization with Microsoft Power BI: How to Design Savvy Dashboards https://nightingaledvs.com/review-data-visualization-power-bi/ Tue, 12 Nov 2024 15:50:40 +0000 https://dvsnightingstg.wpenginepowered.com/?p=22360 Alex Kolokolov and Maxim Zelensky say that there are three types of vizzer—business analyst, infographic creator, and data journalist. Are you clear about which you..

The post Review: Data Visualization with Microsoft Power BI: How to Design Savvy Dashboards appeared first on Nightingale.

Image credit: Dataviz tools network image © 2023 Ihar Yanouski.

Alex Kolokolov and Maxim Zelensky say that there are three types of vizzer—business analyst, infographic creator, and data journalist. Are you clear about which you are? They are. And the ideal reader will be an analyst, for this is a complete guide to Power BI. 

Never made a dashboard? I hadn’t. But should I want to work anywhere other than journalism, I need to know how. The 2023 DVS State of the Data Viz Industry Survey revealed that, combined, Tableau, Power BI and Excel are the most used tools in the data visualization community.

As the ideal guinea pig for this manual on BI, I set out to recreate the graphics. I parked my hesitation around business phrases like “key performance indicators,” downloaded the data from GitHub, and went for it.

My challenge: could I create stuff without having to ask Google for backup? Straight up, I’ll say that it can be done just by following the first part of the book. There are plenty of pictures—very helpful for navigating menus. Despite the familiar Microsoft layout, it’s always helpful to see a big red circle around the thing you need.

Health warning! While I created charts that looked like those in the book, the numbers did not come out the same. The authors have reassured me that this is due to the raw data pre-dating the final copy. 

The book is one to dip into—regardless of whether your chart choice is classic, trusted, or ‘risky’. Maps and bubbles were covered early, and their inclusion was a nice surprise.

The writing and tone are friendly and not too technical. My heart sank slightly when I found the examples based around sales, but that’s OK; we’re doing minimalist business graphics, after all.

I was showered with advice on customising defaults, which you’ll want to consider. BI’s auto-generated titles are unashamedly dull. And in the time-poor field of business, it is so true that ‘everything should be clear at first glance.’

The biggest concept I took away is that ‘dashboards are not just tools for data discovery; they are also for facilitating communication among people.’ It’s a good reminder for anyone who is normally design-led.

For those who hate Excel charts, back yourself with this book and give BI a try.


The book is available in various formats on the publisher’s website and Amazon.

Categories: Reviews

Interviewing AI Assistants for Data Visualization https://nightingaledvs.com/interviewing-ai-assistants-for-data-visualization/ Tue, 08 Oct 2024 15:52:43 +0000 https://dvsnightingstg.wpenginepowered.com/?p=22168 In today’s world, you need to run very fast just to stay in place. Technology is developing at an incredible speed, but I don’t believe..

The post Interviewing AI Assistants for Data Visualization appeared first on Nightingale.

In today’s world, you need to run very fast just to stay in place. Technology is developing at an incredible speed, but I don’t believe it will replace specialists! I believe it will become a loyal assistant, helping to eliminate routine tasks.

Our team is keeping a close eye on all the latest AI innovations that could be useful for data visualization specialists, BI analysts working with data, graphs, and dashboards. These professionals are expected to deliver insights and visual representations of all kinds of data for various business needs. So much to do, so much to handle! A little help from an assistant certainly wouldn’t hurt.

We regularly review interesting AI tools, and in this article, we want to briefly introduce a few of them, while diving deeper into one of the most exciting ones!

So, months of researching the AI data visualization market have brought us the following insights! Our main tester has been Anya, our marketing director and neural network specialist! You can read a few of her articles on this topic on Medium, where she reviews some of these products. Highly informative reading!

Now, let’s take a look at the list of potential assistants! Who will we hire for our team? Tell us a little about yourselves, dear candidates. I’ve heard that some of you are great at working with data, but not all of you understand charts. And some can even build dashboards? How about a trial period and a test task? All agreed?

Logos of modern AI tools that can be useful to data visualization specialists and BI analysts

Perplexity
What it does: helps gather and analyze information
Drawbacks: the service may be too compliant with your request. Better to phrase it as: “I want to understand whether small businesses need content marketing. Give me an answer with pros and cons”

Athenic
What it does: analyzes data and builds charts
Drawbacks: works only with one sheet of data and sometimes makes odd calculations

Julius
What it does: powerful data analytics with a user-friendly interface
Note: Ability to build and customize charts directly in the service
Drawbacks: good for quick data analysis, not suitable for complex calculations or merging datasets

ChatGPT
What it does: analyzes complex data and prepares visualizations
Note: Ability to build and customize charts directly in the service through new queries
Drawbacks: it’s always worth double-checking the data yourself to verify trends and key findings. Also, clarify the logic of its calculations.

Basedash
What it does: builds dashboards 100 times faster than manual assembly
Drawbacks: decent interface, but not the most convenient. Does not connect data from Excel or basic tables, but supports many SQL databases.

Rows
What it does: replaces traditional tables, simplifies data analysis, automates dashboard and report creation, and eases collaboration on projects
Drawbacks: advantages include easy data integration and availability of templates. But honestly, sometimes it’s simpler and more convenient to use good old Excel.

Polymer Search
What it does: simplifies dashboard creation and data visualization through automation, intuitive templates, and AI features
Note: currently the most interesting tool on the list. Give it data, and it will build a dashboard from it!

This last candidate intrigued me immensely, so I invited them for a second round of interviews and personally had several conversations with them. We worked together on data visualization tasks and dashboard building! Based on the trial period results, help me decide—should I hire this assistant full-time?

Trial period: testing Polymer Search

When you think about creating dashboards, what’s the first thing that comes to mind? Probably Power BI, Tableau, and a nervous twitch, because working with tables and charts can take hours. But what if there’s a way to do it faster, easier, and without yelling at your screen? Enter the neuro-analytics service—Polymer Search.

Why do we even need Polymer Search?

Polymer Search claims that building dashboards is now so fast, you won’t even have time to make yourself a cup of coffee. Mmm, with cream?  

Here’s what it promises:  
– Dashboard created in one click  
– No coding required  
– Automatic data visualizations  

Okay, sounds cool, but we’re here to test claims against reality. So let’s see how it actually works.

But first, our seasoned expert, ChatGPT, will help the newcomer get up to speed and provide the initial data!

Step one: Using GPT to generate the data

Before we begin, let’s prepare the data for analysis. In our case, it’s webinar data where we want to understand which lecturer generates the most profit and who is working ineffectively. We sent a request to GPT, asking it to calculate the ROI for the webinars. GPT provided formulas and even suggested a table template that we could use for further filling.

We won’t dwell on working with ChatGPT in detail, as much has already been written about it, and we’ve previously explored the various useful aspects of this tool for data visualization people in the article: Creating a Dashboard Using ChatGPT.
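For context, the ROI formula a model like GPT typically returns for this kind of task is the classic one: net profit relative to cost. A minimal Python sketch of that calculation, where the lecturer names and figures are hypothetical placeholders rather than the article's actual data:

```python
# Minimal ROI sketch for webinar data. The lecturers, revenues, and
# costs below are hypothetical placeholders, not the article's dataset.
webinars = [
    {"lecturer": "A", "revenue": 12000, "cost": 4000},
    {"lecturer": "B", "revenue": 5000, "cost": 6000},
]

def roi(revenue: float, cost: float) -> float:
    """Classic ROI: net profit relative to cost, as a percentage."""
    return (revenue - cost) / cost * 100

for w in webinars:
    w["roi"] = round(roi(w["revenue"], w["cost"]), 1)

print(webinars)
# → lecturer A earns 200.0% ROI; lecturer B loses money at -16.7%
```

Running a quick check like this by hand is also a convenient way to verify whatever formulas the assistant produces.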

First request:

ChatGPT prompt example

And the template:

Template for our ROI table, generated by ChatGPT

Now we can give the data to our candidate!

Step two: Importing data into Polymer Search

Now that we have the data in hand, let’s start testing our main candidate! We’ll upload the data into Polymer Search. Let’s explore how to work with it effectively. After all, every specialist requires a tailored approach!

And here’s what impressed me right away: Polymer Search generated the dashboard almost instantly. What usually takes several hours (or days if Excel decides to corrupt the file) is done in just a few seconds here.

Dashboard, generated by Polymer Search

Of course, the chart formatting needs improvement—diagonal text on bar charts. The pie chart also needs some adjustments! But it’s a good start.

The boss’s heart is already rejoicing; it seems this candidate could be useful! And Polymer Search not only created a dashboard layout but also suggested several key metrics for visualization:

  • Total revenue from webinars—it’s always interesting to see the overall numbers.  
  • Average ROI by lecturers—who sells and who just talks. This is important in business…  
  • Conversion of participants to paying customers—a metric that reveals the true state of affairs in the online school.  
  • Quality assessment of webinars—to gauge how much the audience enjoyed it, as we are in it for the long haul.

Step three: Setting up visualizations

Working with visualizations in Polymer Search is also a pleasure. You can choose from various types of charts, from simple bar graphs to more complex diagrams. All of this is done in just a few clicks; it feels like you have a reliable and understanding assistant who knows what you want with just a half-word or a glance. Like a dream!

Polymer Search suggests a pie chart for my task
We can add labels
And change the colors, not bad!

But let’s return to the real analysis tasks:  

For example, you can look at the revenue by webinar topics, which helps understand which topics have “hit the mark.” Or analyze the average ROI by lecturers to find out who brings in the most sales.

Result from the real task

Interesting plus: Predictive data  

One of the coolest features in Polymer Search is the ability to forecast data based on what has already been uploaded. The tool doesn’t just visualize current data; it also attempts to predict what will happen next. 

For example, you might see that one of your webinars could grow by 65% in the coming months. However, I can’t say for sure that this is a reliable forecast to depend on since I haven’t seen their calculations. 

But it’s intriguing! The assistant doesn’t just mindlessly fulfill your requests; it also knows how to dream about the future!

(It was the most interesting part—to play with the forecasts!)
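Since the service doesn't show its calculations, a useful sanity check is to compare its forecast against the simplest possible baseline: a straight line fitted to past values and extrapolated one step ahead. A hypothetical sketch with invented numbers:

```python
# Naive baseline forecast: fit a least-squares line to past monthly
# values and extrapolate one step ahead. The numbers are invented.
def linear_forecast(values):
    n = len(values)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(values) / n
    slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, values))
    slope /= sum((x - x_mean) ** 2 for x in xs)
    intercept = y_mean - slope * x_mean
    return intercept + slope * n  # predicted next value

monthly_signups = [100, 120, 145, 160]
print(linear_forecast(monthly_signups))  # → 182.5
```

If a tool's "65% growth" projection is wildly different from a trend this simple, that's a prompt to ask how it got there.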

Step four: Automatic data refresh  

Another advantage of Polymer Search is its automatic data refresh feature. When you add a new webinar to the table, the data on the dashboard updates immediately. No more struggles with manual refreshes; everything happens quickly and smoothly. This truly makes life easier, especially when the data changes frequently.

Downsides of Polymer Search

Of course, despite all its advantages, Polymer Search has some limitations that should be considered. No tool is perfect; it’s essential to understand your assistant’s constraints from the start and not overload it with tasks that are beyond its current capabilities.

  • Data must be flat at input: The tool works only with a single table. If your data is spread across multiple tables, you’ll need to combine them.
  • Not all visualizations hit the mark: Some graphs look good but don’t always provide accurate or useful insights. You might occasionally need to make manual adjustments.
  • Limited chart customization: You can’t configure every aspect of the visualizations as flexibly as in Power BI or Tableau. This is sufficient for basic needs, but if you require in-depth control, you’ll encounter limitations.
  • Inflexible grid: Moving elements around on the dashboard isn’t always convenient. The tool offers minimal customization options for object placement.
  • Paid system: Yes, Polymer Search is not a free tool, but if you need to quickly create something decent and accessible online, the cost is justified.

Results of the trial period!

Polymer Search is a powerful tool for those looking to speed up the dashboard creation process and save time. Its predictive data and automatic updates make working with dashboards more efficient, but it’s important to be aware of its limitations. If you need to quickly create something on the fly, without delving deeply into settings and programming, Polymer Search is a nice choice for a personal assistant.

So, if you want to eliminate the routine in dashboard creation, Polymer Search is a good option. The time you save can be spent on something more enjoyable. 

Maybe finally brew that coffee? Mmm, with cream!

Well, that’s the overview of AI tools I prepared for you! 

If you have interesting ideas on how to use them effectively, share them on social media—I’d love to learn something new! The more wonderful assistants there are among AI tools, the more enjoyable the work will be for specialists! And I don’t believe they will be left without jobs; instead, many routine tasks can be shifted to the reliable shoulders of AI colleagues.

Categories: Use Tools

Tracing Carbon: Visualization for Systems Thinking https://nightingaledvs.com/tracing-carbon-visualization-for-systems-thinking/ Thu, 26 Sep 2024 15:35:05 +0000 https://dvsnightingstg.wpenginepowered.com/?p=22005 Systems thinking is fundamental for understanding complex problems. Addressing twenty-first century challenges like climate change requires comprehending how different components of Earth systems influence each..

The post Tracing Carbon: Visualization for Systems Thinking appeared first on Nightingale.

Systems thinking is fundamental for understanding complex problems. Addressing twenty-first century challenges like climate change requires comprehending how different components of Earth systems influence each other. The carbon cycle, crucial to our planet’s climate system, is a powerful context for helping the rising generation develop systems thinking skills. Traditional 2-D static images often fail to convey the complexities of the carbon cycle, making it challenging for learners. These representations do not communicate dynamic features of the carbon cycle, such as its multiple scales and interconnected processes. We hypothesize that interactive visualization can aid learning by enabling dynamic exploration and consideration of human impacts, thereby fostering systems thinking. 

Personalized learning paths guided by adaptive visualization could also support individual progress. Despite growing interest in interactive data visualization, there is little research on designing these tools for meaningful integration into teaching about systems thinking. The development of Tracing Carbon aims to bridge this gap and targets junior high school students. Using an iterative design-based approach, we combined systems thinking theory, cognitive learning principles, and carbon cycle knowledge, and involved teachers and students in the design process. 

Introducing Tracing Carbon to teachers and collaborating on its classroom integration revealed that digital tools must align with educational goals. Our work demonstrates how the intersection between design and science education creates research opportunities for enhancing learning experiences. The development of Tracing Carbon paves the way for future research on how students of different ages use visualization and how adaptive learning environments can enhance visual learning environments for STEM education.

Why is thinking about systems important?

Systems thinking is fundamental for understanding complex problems. Addressing twenty-first century global challenges such as digital privacy, world health, biodiversity loss, and climate change all rely on thinking about systems. Indeed, humanity faces the massive task of tackling global warming. To act, we all need to comprehend what is happening—and that requires thinking about how different components of earth systems connect and impact one another. The carbon cycle plays a crucial role in our planet’s climate system, making it a highly relevant and powerful context for helping the rising generation develop systems thinking skills.    

Despite the importance of the carbon cycle, young students are often introduced to the sheer complexity of this system through deceptively simple, traditional 2D static images (see below). Many young learners find it demanding to learn about the carbon cycle using conventional visual tools. This is perhaps unsurprising, since established visual representations do not necessarily communicate the intricate and dynamic features of the carbon cycle, such as traversing multiple temporal and spatial scales, comprising several subsystems-within-systems, and incorporating multiple interconnected processes. Additionally, these conventional visualizations are not always paired with the scientific data that represents the status of the global carbon cycle and its implications for environmental challenges.

A detailed diagram of the carbon cycle, illustrating the flow of carbon dioxide (CO₂) between different environmental components. The diagram shows processes such as photosynthesis, animal respiration, and ocean uptake, with arrows indicating the movement of CO₂. It includes elements like a factory emitting CO₂, plants taking in CO₂, animals respiring, decaying organisms, fossil fuels, and the conversion of organic carbon in the soil.
A typical traditional visual representation of the carbon cycle encountered by school students.

Could interactive visualization foster systems thinking?

In the visualization community, the term visualization is usually associated with computer-based visualization systems that augment human decision-making capabilities by providing visual representations of data. Coming from a science education background, we adopt a broader perspective on visualization, considering it the representation of information in visual formats such as images, diagrams, or charts.

The carbon cycle represents an abstract conceptual framework in science education that requires grasping multiple layers of information, including system components and dynamic relationships between components at microscopic and macroscopic organizational levels. We hypothesize that a learning environment that represents this information through multiple interactive visualizations can facilitate learning about the complexities of the cycle. Additionally, we believe visual representations of scientific data about the carbon cycle—such as atmospheric carbon dioxide, earth temperature anomalies, and carbon flux—can enrich this learning environment and provide guidance towards evidence-based insights on this topic. By enabling users to dynamically explore and visualize the components and relationships of a complex earth system such as the carbon cycle, and by motivating consideration of human impacts, this learning environment can foster the development of systems thinking abilities. Furthermore, we believe that as students gradually build their understanding of how earth systems work, personalized learning paths guided by adaptive visualization could support their individual learning progress. In parallel, teachers play a vital role in providing support as students navigate their learning trajectories with such an interactive visual environment. Our work currently pursues the following questions:

  • How can we create interactive data visualization tools to help pupils understand how earth systems work?
  • What happens when pupils use such visual tools to explore the carbon cycle? 
  • What are the potential benefits of interactive and adaptive visuals for facilitating systems thinking?

To explore these questions, we apply mixed method endeavors that incorporate data visualization, interactive and visual design, adaptive learning environments, science education, and educational psychology.

How can design intersect with science education in developing Tracing Carbon?

Despite increasing recognition of the value of interactive data visualization in science education, there remains a significant gap in research on how to design these tools effectively and integrate them into teaching practice. The development of Tracing Carbon exemplifies a pedagogical effort to bridge this gap, highlighting how design intersects with science education. Targeting junior high school science classes (students aged 13–16), we designed an adaptive and interactive learning environment to support systems thinking skills about the carbon cycle. We used an iterative design-based approach informed by theory, teachers, students, and a research team that included educators, designers, and programmers.

We combined systems thinking theory, cognitive learning principles, and carbon cycle knowledge to create the components of the learning environment. We used a hierarchical systems thinking model as a scaffolding framework to develop sequences of learning support for Tracing Carbon, and integrated various learning objectives, tasks, and modules to help students understand the carbon cycle. Because teachers and students are key in designing learning tools, science teachers and their students were involved throughout the design process. The research team used their collective insights to guide and refine the design through focus group meetings, individual interviews, and classroom interventions.

The design process led to Tracing Carbon, an adaptive interactive learning tool consisting of tasks and quizzes across progressive modules. The learning experience starts by exploring the forest carbon cycle, covering carbon pools and transformation processes. It then focuses on the carbon cycle at the global scale and ends by exploring scientific data on anthropogenic effects on the carbon cycle. Students can engage with Tracing Carbon through various interactions (see below), including dragging and dropping items to complete images (A) and drawing and completing arrows to “trace carbon” through various sub-cycles (B).

Two-panel comparison of educational diagrams related to the carbon cycle. Panel A: Shows a tree with labels for processes like photosynthesis and cellular respiration, along with a simplified depiction of how carbon moves from the atmosphere to dead organic material in the soil. Interactive elements, such as drag-and-drop boxes, indicate an educational activity where learners identify and place correct labels. Panel B: Presents a more interactive approach with hexagonal icons representing different organisms, such as a rabbit and microorganisms, showing their roles in the carbon cycle. The diagram appears to engage users in a learning activity by allowing them to select and link different components of the cycle.
Examples of implemented interactions in Tracing Carbon: dragging and dropping items (A) and drawing arrows to “trace carbon” between carbon cycle components (B).
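One way to see why this pools-and-processes structure suits an interactive tool is that it maps naturally onto a directed graph, with carbon pools as nodes and transformation processes as edges. A toy sketch of that idea, using a simplified set of pools and fluxes rather than the tool's actual model:

```python
# Toy directed-graph model of a forest carbon cycle: nodes are carbon
# pools, edges are transformation processes. Simplified for illustration;
# this is not Tracing Carbon's internal data model.
carbon_fluxes = {
    ("atmosphere", "plants"): "photosynthesis",
    ("plants", "atmosphere"): "cellular respiration",
    ("plants", "animals"): "consumption",
    ("animals", "atmosphere"): "animal respiration",
    ("plants", "soil"): "decay of dead organic material",
    ("soil", "atmosphere"): "decomposition",
}

def trace_from(pool):
    """List the processes that move carbon out of a given pool."""
    return [process for (src, _), process in carbon_fluxes.items() if src == pool]

print(trace_from("plants"))
# → ['cellular respiration', 'consumption', 'decay of dead organic material']
```

"Tracing carbon" from any pool then amounts to walking this graph, which is exactly the kind of exploration the drag-and-drop and arrow-drawing tasks invite students to do.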

While progressing through the visualization environment, students are presented with quizzes and tasks representing various levels of systems thinking skills, designed to stimulate reasoning and problem-solving abilities. As students engage, the adaptive characteristics of Tracing Carbon personalize the learning experience by adjusting task and quiz difficulty according to students’ real-time performance. Additionally, various forms of visual feedback validate students’ answers and guide them through the learning experience.
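The article does not specify Tracing Carbon's adaptation rule, but a common minimal scheme for real-time difficulty adjustment is a staircase: step the difficulty up after a correct answer and down after an incorrect one. A hypothetical sketch of that general idea only:

```python
# Staircase adaptation sketch: raise difficulty after a correct answer,
# lower it after an incorrect one, clamped to the available levels.
# Illustrates the general idea only, not Tracing Carbon's actual logic.
def next_difficulty(level, correct, min_level=1, max_level=5):
    step = 1 if correct else -1
    return max(min_level, min(max_level, level + step))

# A student starting at level 3 who answers: right, right, wrong, right
level = 3
for correct in [True, True, False, True]:
    level = next_difficulty(level, correct)
print(level)  # → 5
```

Real adaptive systems typically use richer models of learner state, but even this simple rule keeps tasks near the edge of a student's current ability.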

What happens when Tracing Carbon is used in the classroom?

Integrating digital tools in classrooms involves much more than merely making them available to teachers and students. It’s about aligning them with educational goals to enhance learning processes and outcomes. Given the diversity of data visualization tools and educational settings, there is no universal recipe for integrating a digital tool into every classroom. However, observing how a specific tool like Tracing Carbon can support systems thinking can illuminate key considerations for implementing similar educational tools in teaching practice. 

To date, we have begun introducing Tracing Carbon to teachers and collaborating with them while they integrate it into their teaching. Perhaps unsurprisingly, teachers seek tools to streamline their workload while enhancing students’ learning. A general recommendation is to complement digital tools with additional resources to maximize pedagogical effectiveness. In terms of student experiences, we observed that they generally enjoy using the digital resource and seem to show increased engagement. It follows that harmonizing teaching activities and digital resources is of high pedagogical importance.

What are future directions for enhancing systems thinking with data visualization?

We have tested potential principles for creating interactive diagrammatic visualizations that can help learners grasp complex and interconnected science concepts. In this way, we demonstrate how the intersection between design and science education provides a research space where visual tools are crafted with the intention to enrich learning experiences. The developed Tracing Carbon tool is intended to enhance systems thinking as a key aspect that contributes to notions of environmental literacy and informed decision-making. Such integration also provides prompts for how systems thinking could be approached in STEM domains at large. Furthermore, learning and reasoning about fundamental STEM concepts through a visually communicated environment could compensate for differences in pupils’ language proficiency. Lastly, the fact that such a visualization platform can be used in any learning setting equipped with a computer and internet ensures accessibility for all pupils and teachers. 

The multidisciplinary collaboration undertaken in this project sets the stage for several future directions. These could involve comparing how students of different ages use data visualization platforms to develop systems thinking, or how such abilities develop over time. The adaptive visualization component of the work also provides insights into how emerging AI can enhance visual learning environments for STEM—in continuing to seek interventions for equipping the next generation with essential knowledge to solve urgent global issues. 


Acknowledgements

We heartily appreciate the contributions of our collaborators in the project team Måns Gezelius, Gunnar Höst, Marta Koć‑Januchta, Jonas Löwgren and Lena Tibell. This work is supported by the Swedish Research Council (Vetenskapsrådet, Grant 2020-05147).

Further reading

Conducting Educational Design Research, Susan McKenney and Thomas Reeves

Development of system thinking skills in the context of earth system education, Orit Ben-Zvi Assaraf and Nir Orion

Students’ conceptions of the carbon cycle: identifying and interrelating components of the carbon cycle and tracing carbon atoms across the levels of biological organisation, Katharina Düsing, Roman Asshoff and Marcus Hammann

The fabric of visualization, Elisabeta Marai and Torsten Möller

Visualization analysis and design, Tamara Munzner

The post Tracing Carbon: Visualization for Systems Thinking appeared first on Nightingale.

Good Morning Data #5 | The Half-Full Learning Curve https://nightingaledvs.com/good-morning-data-5-the-half-full-learning-curve/ Sun, 15 Sep 2024 16:55:45 +0000
The Half-Full Learning Curve or “Could we be happy with what we already know?”

I sighed at the sight of the calendar I was holding in my hand. It was a paper one, the granny kind where you rip off a page each day, with big bold numbers on it. It indicated the 20th of June even though we were in September, proof of its moderate success in helping me better anchor myself in time. I tackled the meticulous task of ripping the 73 pages that separated me from the actual day. At last, I reached September and stared at the date, for it wasn’t just any day of September, it was the first.

I can’t vouch for other European countries but here in France, we have two beginnings of the year—one on the 1st of January and one on the 1st of September. It would be difficult to say which one is the more important—though one is certainly more festive than the other—both mark the fresh start of a new year, both signify the end of the holiday season and, most of all, both are heavily symbolic milestones that require you to make good resolutions. (It’s nice to have both because it gives you a second chance at failing.)

Therefore, staring at the big “one” on the calendar page, I was left wondering which resolution I could make, work-wise, for this new year, apart from ripping pages off the calendar as days go by. And believe it or not, for the first time in my data designer career, I didn’t promise myself this was the year I would finally learn to code.

Until that moment, I had kept hammering into myself the importance of knowing how to code, both as a data designer and as a woman (in a world hugely encoded by white guys). And trust me, I did try to learn—I attended bootcamps and followed online classes. I multiplied the learning attempts in persistent yet lazy efforts, continuously trying, continuously failing.

But now that I think about it, it wasn’t only coding. Starting in data visualization, I was hyper aware of all the skills I seemed to be missing. I kept discovering new ones around every corner! My statistics knowledge was embryonic, my data management skills were nonexistent, I could hardly babble in HTML. I was a diapered baby on the floor of a room full of shiny objects, either too far or too high for my reach. From my previous professional lives, there was only one toy I had brought with me (or so I thought) and it was the dullest of all (or so I felt)—design. Sure, I could design. Back then, I was an art director in a Parisian creative agency, so designing websites, ads and books was familiar ground. The rest of the dataviz skills, though? All uncharted territory.

Yet, after a couple of missions, I noticed something funny. Before my years in design, I had been a French teacher. Those few years of trying to teach meandering grammar rules to poor students turned out to be extremely precious in my new occupation, for what is data visualization if not a visual, pedagogical effort? And before that, I had been a literature student, doing a bit of freelance writing for a magazine. The ability to digest a text or study, analyze and rewrite it proved to be quite useful as well. Suddenly, I realized I was more equipped than I previously considered myself to be.

Still, I wasn’t at peace. There were too many holes in my armor; I wasn’t geared up enough. Thus began the years of self-training (insert boring montage of me in front of a computer, taking classes and reading books to the tune of “Eye of the Tiger”). OpenClassrooms and Domestika were the neighborhoods I was wandering in. I had notebooks for every new skill I needed to master. I even thought about taking an Excel class (an Excel class!); what had I become?

I believe we’re all taken by a similar frenzy of learning after staring at the depth of the knowledge we’re missing when joining the dataviz gang. There is just too much to know, too many skills involved in data visualization. Statistics, data science, data mining, data management, design, writing, UX, storytelling… Aside from a couple of people—Leonardo da Vinci, who seemed to master everything (and yes, I’m looking at you, Nadieh Bremer and some of your kind)—the rest of us are bound to feel wobbly. Still, isn’t it interesting how we all decide to focus on what we’re missing, on what the others all seem to know better, instead of on what we already know? In terms of seeing the glass half full or half empty, are we all quite pessimistic while seeing our fellow practitioners as filled jars?

You could argue that this hyper awareness of a structural inner flaw, related to the very nature of dataviz itself, is a good thing that pushes us to keep learning, to keep aiming for a better, fuller version of ourselves. But staring at this big “one” on the calendar, I want to say no. No, I don’t want to learn how to code; I don’t need to learn how to code. I’m good already, thank you. Sure, I intend to keep improving as a designer throughout my entire career, but I’m tired of wriggling, trying to fill with a spoon an inland sea I perceive as empty. In a life where the curve can never be truly full, I believe we have to learn to be content with what we’ve already poured in.

Oouh! I wonder if there are online classes on soft skills management…?


Loved this column? Rendez-vous here on Nightingale every 15th of the month for a new one!


The post Good Morning Data #5 | The Half-Full Learning Curve appeared first on Nightingale.
