William Careri, Author at Nightingale
The Journal of the Data Visualization Society
https://nightingaledvs.com/author/william-careri/


When the Data is Gone
https://nightingaledvs.com/when-the-data-is-gone/
Wed, 20 Nov 2024

When Cartoon Network abruptly erased its entire online archive, decades of web content vanished in an instant. At the same time, Boomerang, a platform that had housed much of Cartoon Network’s classic catalog, shifted its library to the Max streaming platform. What seemed at first like a simple consolidation of content soon felt like a loss. Nostalgia-inducing shows like Ben 10, Steven Universe, and Chowder began disappearing without warning. Each removal was a quiet reminder of how ephemeral digital access can be, even to content we once believed permanent.

This unsettling realization made the concept of media preservation, often relegated to formal archives and academic studies, feel personal. I found myself tracking down hard copies of favorite shows to safeguard them—not just against the physical degradation of discs but against the creeping disappearance of cultural materials that reflect specific eras and viewpoints. These losses aren’t just personal frustrations; they highlight a systemic weakness in our digital infrastructure, where history can vanish without notice or recourse.

A few months ago, I embarked on a digital preservation project of my own. What began as a seemingly straightforward task—building a personal media server—took on greater urgency. Using a ZimaBlade 7700, a compact yet versatile device, I set up two 8TB hard drives and began transferring years’ worth of DVDs: movies, television series, entire seasons preserved on fragile discs. The goal was convenience, but it quickly became something more—a way to reclaim control in an age where access to the things we love feels increasingly precarious.
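For anyone attempting something similar, the transfer step can double as real preservation if every file gets a recorded fingerprint, so later copies can be verified bit for bit. Below is a minimal Python sketch of that idea; the storage paths are hypothetical and would need to match your own server's layout.

```python
import csv
import hashlib
from pathlib import Path

# Hypothetical locations on the media server's storage pool.
LIBRARY_ROOT = Path("/mnt/storage/media")
MANIFEST = Path("/mnt/storage/manifest.csv")

def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Hash a file in chunks so large video files don't exhaust memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def build_manifest() -> None:
    """Record every file's size and checksum so future copies can be verified."""
    with MANIFEST.open("w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["path", "bytes", "sha256"])
        for file in sorted(LIBRARY_ROOT.rglob("*")):
            if file.is_file():
                writer.writerow(
                    [file.relative_to(LIBRARY_ROOT), file.stat().st_size, sha256_of(file)]
                )

if __name__ == "__main__":
    build_manifest()
```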

Cartoons, films, and other forms of media may not typically qualify as “historical data,” yet they carry embedded cultural narratives, snapshots of societal trends, and evolving norms. With recent reports of Distributed Denial of Service (DDoS) attacks targeting entities like the Internet Archive, a troubling question emerges: how much of our digital heritage is quietly slipping away? And what does it mean for a society when its collective memories—the lighthearted as well as the profound—are so easily erased from public access, potentially lost forever?

This is not a new concept

I’m reminded of Marion Stokes. From the tumultuous events of the 1970s to the dawn of the digital age, the Philadelphia activist, librarian, and television producer tirelessly amassed over 70,000 VHS tapes containing non-stop news coverage. But this was no ordinary collection. For Marion, the news was more than just a fleeting moment on the screen. It was a vital historical record, a window into the soul of society, and a snapshot of our collective evolution. In an era of constant change and flux, Marion and her extraordinary VHS tape archive offer a tale of obsession, intrigue, and the importance of preserving our past.

Sean Fagan with the Marion Stokes video archive, 2014 pre-sort. Photograph by Brett Brownwell, Internet Archive.

Marion Stokes’ commitment to preserving television broadcasts was more than a personal project—it was a powerful countermeasure against the impermanence of media. Her vast archive of VHS tapes captured news and events that, if left to the fate of corporate interests or technological shifts, might have vanished without a trace. Stokes understood that these recordings were not just ephemeral entertainment; they were cultural artifacts, snapshots of societal perspectives, and evolving narratives. 

In much the same way, today’s media collectors and data visualization artists play a vital role in preserving the digital past. The sweeping erasures on streaming platforms echo Stokes’ mission, underscoring an urgent need to protect the digital world’s fragile memory before it slips through our fingers, one missing file at a time.

Spontaneous data deletion should scare you

The fragility of digital archives and the spontaneous deletion of data pose a unique challenge to the field of data visualization. For visualization artists, who rely on historical and real-time data to tell meaningful stories, the sudden disappearance of datasets isn’t just an inconvenience—it’s a barrier to truth, context, and continuity. When data vanishes, we lose more than just numbers or records; we lose the threads that connect insights across time, dismantling the larger narratives that might otherwise emerge.

From a data humanism perspective, the stakes are especially high. Data humanism emphasizes the human context behind data points, viewing them as more than abstract information to be processed. It treats data as a bridge to human stories, and visualization as a medium to bring those stories to life in ways that inspire empathy and reflection. But when datasets disappear—whether through platform changes, corporate decisions, or cyber-attacks—visualization artists are cut off from essential materials that fuel this process of discovery and connection.

This impermanence is not just an obstacle for artists but a deeper societal issue. If data points that capture essential aspects of our world vanish unpredictably, the ability to draw insights from history and context weakens. Patterns go unrecognized, biases slip through unnoticed, and societal issues fade from the public eye, leaving us with a fragmented and incomplete view of reality. Data visualizations grounded in humanism should serve as records, preserving insights that resonate on both intellectual and emotional levels. When data becomes as ephemeral as a tweet, visualization artists must question how best to protect the integrity of their work and, ultimately, the stories they aim to tell.

My ZimaBlade 7700, housing a personal media server.

What you can do about it

Data visualization artists aren’t powerless in the face of data’s fragility. While the threat of data erasure looms, artists and analysts can take active steps to safeguard their sources, ensuring that the stories they tell have the chance to endure. First, creating local copies of datasets and backing them up to physical storage offers a level of protection. Just as we save our own personal memories, these datasets represent collective memory, and they’re worth preserving beyond a single, vulnerable source. While it may be tempting to trust that publicly available data will remain online, a local archive is an artist’s insurance against impermanence.
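As a minimal sketch of that habit, the script below downloads a dataset, keeps a dated local copy, mirrors it to an external drive, and records a checksum for later verification. The URL and paths are placeholders, not real sources.

```python
import hashlib
import shutil
from datetime import date
from pathlib import Path

import requests  # pip install requests

# Hypothetical source and backup locations; substitute your own dataset and drive.
DATASET_URL = "https://example.org/open-data/city-temperatures.csv"
LOCAL_COPY = Path("archive") / f"city-temperatures-{date.today()}.csv"
BACKUP_DRIVE = Path("/media/backup-drive/datasets")

def archive_dataset() -> str:
    """Download the dataset, keep a dated local copy, mirror it to external storage,
    and return a SHA-256 fingerprint so future copies can be checked against this one."""
    LOCAL_COPY.parent.mkdir(parents=True, exist_ok=True)
    response = requests.get(DATASET_URL, timeout=60)
    response.raise_for_status()
    LOCAL_COPY.write_bytes(response.content)

    BACKUP_DRIVE.mkdir(parents=True, exist_ok=True)
    shutil.copy2(LOCAL_COPY, BACKUP_DRIVE / LOCAL_COPY.name)

    return hashlib.sha256(LOCAL_COPY.read_bytes()).hexdigest()

if __name__ == "__main__":
    print("Archived with SHA-256:", archive_dataset())
```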

But preservation shouldn’t stop at individual copies. Visualization artists can leverage public repositories like the Internet Archive or other data libraries to share and secure their datasets. By uploading historical data to open-access platforms, we contribute to a collective effort that transcends individual projects. In doing so, we ensure that even if a source disappears, the data lives on for future artists, researchers, and the public to use and interpret. These repositories are a shared foundation that supports continuity across projects, disciplines, and even generations.
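For the Internet Archive specifically, the upload can even be scripted. The sketch below assumes the internetarchive Python client, configured beforehand with `ia configure`; the item identifier, file name, and metadata are hypothetical examples.

```python
from internetarchive import upload  # pip install internetarchive; run `ia configure` first

# Hypothetical item identifier, file, and metadata following archive.org conventions.
responses = upload(
    "example-city-temperatures-2024",
    files=["archive/city-temperatures-2024-01-01.csv"],
    metadata={
        "title": "City temperature observations, 2024 snapshot",
        "mediatype": "data",
        "description": "Mirror of a public dataset preserved for visualization work.",
    },
)
print([r.status_code for r in responses])
```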

Visualization artists can also expand their role to act as advocates for data integrity. Open-source tools, transparent data collection practices, and data-sharing initiatives all serve as bulwarks against spontaneous deletion. By engaging with communities that value open access, visualization artists can support broader conversations on digital preservation and advocate for policies that protect data as a public good.

The effort to preserve data is an act of stewardship. Visualizing data is more than arranging numbers or statistics; it’s about capturing the human experience as it unfolds. If we view ourselves as caretakers of digital memory, then it becomes our responsibility to protect that memory from slipping away. After all, data visualizations not only tell the stories of today but also create a legacy for tomorrow—one we’re all responsible for preserving.

Hachette v. Internet Archive and the Future of Data Access
https://nightingaledvs.com/hachette-v-internet-archive-data-access/
Tue, 16 May 2023

A legal ruling raises issues about fair use in storing knowledge, the role of digital libraries, and balancing copyrights with open access.

Humanity has long cherished the pursuit and dissemination of knowledge. Dating back to the third century BCE with the Library of Alexandria, generations of libraries have sought to replicate its example, serving as safe havens of wisdom for the intellectually curious.

Although the digital age has changed how we look for and access information, our enthusiasm for the free exchange of ideas has not diminished. The recent ruling in the Hachette v. Internet Archive case raises important issues regarding the function of fair use in the preservation of knowledge, the changing nature of libraries in the digital environment, and the precarious balance between the rights of creators and the general desire for open access.

Judge John G. Koeltl of the Southern District of New York decided in a historic judgment on March 27, 2023, that the Internet Archive’s “Open Library” lending program violates copyright. This decision calls into question the rights of archivists and the accessibility of open data.

Professionals in the field of data visualization are familiar with the advantages of free and open access to knowledge. The ability to gather, examine, and display data from a variety of sources is essential to our work. The transformation of raw data into captivating visual storytelling has a huge potential to advance understanding, influence policy, and guide judgment. 

The Hachette v. Internet Archive decision will have a big impact on our industry and on how people access information going forward. By narrowing the scope of fair use and potentially limiting digital access to copyrighted works, this ruling may unintentionally impede the search for knowledge, inhibit creativity, and restrict the innovation which underpins our sector. More than ever, data visualization experts must fight for a fair and balanced copyright system which respects the rights of content creators while also recognizing the critical role of open access.

What is the Internet Archive?

The Internet Archive is a non-profit whose mission is to preserve digital content and make it freely accessible. Since its inception in 1996, the Internet Archive has amassed a vast collection of freely accessible digital books, audio, films, websites, and other digital materials. The Internet Archive also operates the well-known “Wayback Machine,” a service which enables users to access archived versions of previously existing websites. The Internet Archive’s mission is to provide universal access to all knowledge. Its efforts have made it an indispensable resource for researchers, educators, and anyone else seeking current and archived digital materials.

The “Open Library” of the Internet Archive provided access to millions of books and documents which would have otherwise been difficult or impossible to find. During the COVID-19 pandemic shutdowns, when many physical libraries were forced to close and people resorted to digital resources to continue their studies and research, it became an indispensable resource. Even though digital lending through these libraries was beneficial, it was not always sufficient to meet the skyrocketing demand for remote resource access. Numerous libraries were constrained by limited budgets, a limited selection of digital titles, and strict licensing agreements which dictated the number of copies and lending limits for digital materials. Together with the increased reliance on remote learning and remote employment, these factors underscored the need for accessible alternative information sources.

Through its “Open Library” lending program, the Internet Archive intended to offer a more extensive digital collection. Unlike conventional libraries, the Internet Archive did not always adhere to the same rules and regulations regarding the acquisition and lending of digital materials. Hachette v. Internet Archive hinged on the argument that the Internet Archive’s lending program violated copyright law by neglecting to obtain the required permissions and licenses for its digital reproductions, circumventing the established rules governing library lending. While libraries are permitted to hold digital copies of books and lend them within a prescribed framework, the Internet Archive’s approach was deemed to fall outside fair use, resulting in an unfavorable ruling. For data visualization professionals, researchers, academics, and other open access advocates, this raises significant concerns about the future of open access to information.

The “Open Library” of the Internet Archive has been a helpful resource for both readers and academics, providing access to millions of books and documents that would otherwise be difficult or impossible to locate. Credit: William Careri

Questions of fair use and copyright

The lawsuit, brought by Hachette Book Group and several major publishers, including HarperCollins, Penguin Random House, and Wiley, was filed in 2020. The plaintiffs alleged that the Internet Archive’s “Open Library” lending program effectively functioned as a pirate website, enabling users to access and distribute copyrighted materials without authorization or payment.

In its defense, the Internet Archive countered that its “Open Library” lending service was protected by the doctrine of fair use, which permits the limited use of copyrighted content without authorization in certain circumstances, such as for educational or research purposes. The Internet Archive argued that the service was comparable to a standard library lending system: users could borrow digital copies of books for a limited time, and the organization did not profit from the circulation of copyrighted material. The Open Library ensured that only a limited number of users could access a particular digital book at any given time, with the number of available copies corresponding to the number of physical copies held by partner libraries or obtained through other means. This was done to simulate the scarcity of physical copies and keep digital lending within reasonable bounds, and borrowing was available only to those who created an Open Library account. The Internet Archive also maintained that its efforts to preserve and distribute digital content were in the public interest and consistent with its mission to provide universal access to knowledge.
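To make that "owned-to-loaned" constraint concrete, here is a minimal, hypothetical Python sketch. It models only the counting rule described above, not the Internet Archive's actual systems.

```python
from dataclasses import dataclass, field

@dataclass
class LendableTitle:
    """Toy model of controlled digital lending: digital loans outstanding
    never exceed the number of physical copies held."""
    title: str
    owned_copies: int
    borrowers: set[str] = field(default_factory=set)

    def checkout(self, account: str) -> bool:
        # Refuse the loan if the reader already has it or all owned copies are lent out.
        if account in self.borrowers or len(self.borrowers) >= self.owned_copies:
            return False
        self.borrowers.add(account)
        return True

    def checkin(self, account: str) -> None:
        self.borrowers.discard(account)

book = LendableTitle("Example Novel", owned_copies=2)
print(book.checkout("reader-a"), book.checkout("reader-b"), book.checkout("reader-c"))
# True True False: the third reader must wait for a copy to be returned.
```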

Although the ruling against the Internet Archive sets a new precedent regarding fair use and open access to information, it is by no means the first.

Other landmark cases

The cases of Sony Corp. of America v. Universal City Studios, Inc. (1984) and Authors Guild v. Google, Inc. (2013) are landmarks in the history of fair use and information access. Both cases illustrate the role of technological innovation in influencing the contours of copyright law and serve as useful touchstones for comprehending the implications of the Hachette v. Internet Archive decision.

The Sony case, also known as the “Betamax case,” involved the private, noncommercial recording of copyrighted television programs. The Supreme Court’s ruling established fair use in this context, setting a precedent for future cases involving new technologies. Similar to the digital lending program of the Internet Archive, Sony’s Betamax technology facilitated access to copyrighted content. However, the Hachette v. Internet Archive case differs from the Betamax case in that the court ruled the Internet Archive’s lending program violated fair use.

In the 2013 case Authors Guild v. Google, Inc., the court ruled that Google’s book digitization project, which aimed to create a searchable database of books, qualified as fair use because it was deemed transformative and provided valuable information without duplicating the original works. To achieve this, Google implemented a system where they only displayed snippets of information in the search results rather than full works. This decision highlighted the significance of information accessibility and the transformative nature of digitization projects, drawing parallels to Internet Archive’s efforts. In contrast, the decision in Hachette v. Internet Archive suggests that the boundaries of fair use may be narrower when it comes to digital lending programs as opposed to searchable databases.

These landmark cases illustrate the complexities and nuances of fair use and information access, as well as the role that technology plays in challenging and redefining established legal frameworks. As the Hachette v. Internet Archive decision unfolds, it is crucial to consider its ramifications in the context of copyright law’s evolution and the ever-changing landscape of information access.

Potential implications

The recent decision in Hachette v. Internet Archive has cast doubt on the future availability of information. This decision establishes a precedent for the use of copyrighted materials in digital libraries, leaving many to speculate on the future of similar archival endeavors.

Although the decision specifically targeted the “Open Library” lending program, it inevitably raises questions regarding the broader implications for archival efforts and digital content accessibility. If similar projects are deemed to be in violation of copyright laws, researchers’ ability to freely access and disseminate information could be severely restricted, potentially impeding scientific progress and innovation.

In a hypothetical future marked by increased restrictions on open access to information, the data visualization community and other groups that value open access may face significant obstacles that impede their ability to create compelling and informative visual narratives. 

With limited access to a wide variety of sources, the insights and conclusions derived from data analysis may become more limited, resulting in a less comprehensive understanding of complex issues. In addition, the sharing and collaboration that underpin the collective development of the data visualization community could be stifled, resulting in a slower rate of innovation and progress. Such a future could also exacerbate existing disparities in knowledge access, creating a digital divide that marginalizes underrepresented voices further and perpetuates systemic inequalities. 

In the end, a world with restricted access to information would not only limit the data visualization community, but also impede the societal pursuit of knowledge and comprehension that drives human advancement.

The Internet Archive’s Dataset Collection. As data visualization professionals work with complex and large datasets, they rely heavily on accessible information and resources to produce visualizations that can drive insights and decision-making. Credit: William Careri

What happens next

Following the ruling, the Internet Archive announced its plans to appeal the court’s decision.

In a statement, Internet Archive founder Brewster Kahle said: “Libraries are more than the customer service departments for corporate database products. For democracy to thrive at global scale, libraries must be able to sustain their historic role in society—owning, preserving and lending books. This ruling is a blow for libraries, readers and authors and we plan to appeal it.”

At the time of writing, efforts were made to contact the Internet Archive for additional information, but no response was received.

The Hachette v. Internet Archive ruling serves as a critical juncture in the ongoing discourse surrounding copyright law, fair use, and the preservation of knowledge in the digital age. As data visualization professionals, you are uniquely positioned to witness firsthand the transformative power of open access to information. 

It is essential to engage in this discussion and advocate for a balanced approach that respects the rights of creators while fostering the free movement of ideas upon which our society depends. The spirit of the Library of Alexandria must endure in our collective pursuit of knowledge and innovation, inspiring us to create a future in which information is accessible and the fruits of human intellect are shared for the benefit of all.

The Challenge of Designing Nuclear Waste Warning Markers to Last 10,000 Years
https://nightingaledvs.com/design-warning-for-nuclear-waste/
Wed, 08 Mar 2023

A group of experts had to propose visual warning markers for a radioactive waste site. Their design needed to be comprehensible, durable, and timeless.

About 10,000 years ago, the Earth was in the midst of the last Ice Age, and much of the northern hemisphere was covered in ice sheets. Sea levels were lower, and the landscape was dotted with glaciers. In areas not covered in ice, the climate was cooler and drier. Forests and grasslands dominated the landscape, and human civilizations were just beginning to develop.

Now, the year is 12,023. Long gone are the generations who pioneered and attempted to archive information. Long gone are those who understood the importance of history and its significance in building what is now considered present-day society. A human approaches a sparse, unmanned area in the desert of what was once New Mexico, United States. Large chunks of stone debris are scattered in the sand and there is no one to be found. Engraved on one stone is an earlier form of English along with a pictograph. The human doesn’t understand everything the stone says, but can make out something which frightens them—a warning, or possible threat, of death. With this now in mind, the human hastily passes the area with a mental note to never return, for no good can come from doing so.

What the human doesn’t know is that roughly 2,150 feet (657 meters) below this stone lies nuclear waste deposited 500 generations ago; the warning marker continues the job it was assigned 10,000 years prior.

A high-stakes design project

This is not a hypothetical conceptualized for science fiction, but a scenario real experts have considered at the Waste Isolation Pilot Plant (WIPP) near Carlsbad, New Mexico. The WIPP is a housing location for the United States’ nuclear waste, including clothing, tools, rags, residues, debris, soil, and other items contaminated with small amounts of plutonium and other man-made radioactive elements. The warning markers for this plant are not currently active, because there is no need to deter people from the area while the site is staffed. But once the plant closes its doors, with the nuclear waste still remaining, proper signage will be critical.

Nearly 30 years ago, teams of anthropologists, archaeologists, architects, astronomers, communicators, designers, engineers, geologists, linguists, material scientists, psychologists, semioticians, and sociologists convened to conceptually design warning markers to physically withstand 10,000 years. Their goal was to come up with a proactive proposal to ensure outsiders would never enter a location housing the country’s nuclear waste, even centuries after the plant ceased operations. It is a once-in-a-lifetime challenge for an information designer—and a story not often shared, though it should be.

One of the teams consulting for the WIPP. Credit: Jon Lomberg

I had the privilege of sitting down with Jon Lomberg to get a better idea of his role as a designer for this project. Lomberg has held many titles throughout his career, including NASA’s Design Director for the Golden Record on the Voyager Spacecraft and Carl Sagan’s principal artistic collaborator. His work for the WIPP was designed to help clearly communicate to the public the science behind the project and its potential benefits and risks. Lomberg’s role at the WIPP involved working with his multidisciplinary team to understand the technical aspects of the project and to translate this information into visual form.

“It was a thought experiment, but we tried to approach it seriously,” Lomberg said. “One thing we were told was that there were no budgetary restraints. We could design what we want and not worry about building permits or construction costs.”

Considering every scenario

It’s not often data visualization artists or information designers are granted a blank check with no restraints, so where does one’s mind go when all barriers are removed? The group briefly considered making the danger markers from solid gold because of the metal’s stability and durability—it doesn’t corrode or tarnish—but the risk of theft was too high to entertain the idea for very long.

A mock-up of materials simulated to see how they would look after 5,000 years, halfway through the projected timeline. The top shows the biohazard symbol freshly built; the bottom shows it almost entirely erased by erosion, illustrating why this was unlikely to be a viable option. Credit: Jon Lomberg

While developing their proposal, the team followed two guiding stars. The first focused on how we interpret narrative. “Most narratives are read from left to right, but […] that is something which is learned,” Lomberg explains. “However, everyone, seemingly by nature, reads from top to bottom.”

The second guiding star was how the narrative is visualized. Icons which require teaching to understand their meaning and significance, like the biohazard icon or alphabets, may lack staying power. This posed a challenge in deciding which symbols or icons could work. 

“Humans inherently like and understand pictorial narratives,” Lomberg told me. This led to options which included the classic stick figure, which can be traced back to prehistoric cave paintings, being a viable option, though not without concern when considering its place in a narrative and how humans read it. The stick figure, notes Lomberg, “is something everybody recognizes as the human shape, by nature.”

In addition to working without budgetary restraints, the teams relied on two additional pieces of information to develop their proposal. The first was to assume the people who may stumble upon the site have the same cognitive reasoning as humans do today. The second was to consider 10,000 years as an arbitrary goalpost. The radioactive materials will still be radioactive after 10,000 years, so this benchmark simply offered them an initial bar to reach. 

The group also eliminated a few other ideas, including anything non-structural. This included concepts such as a warning sound echoing throughout the area. But they considered structural concepts—including a few controversial ones which made it into the completed proposal—such as modifying the physical landscape, lining the desert fields with large stone spikes, and attempting to convey dread or danger in a way which didn’t require language or pictographs. 

An ongoing debate for the ages

The biggest concern with these ideas was whether people would accurately interpret the messages as intended (with a sense of dread or danger), or would find them intriguing—stirring their desire to explore and triggering the adverse effect.

“I didn’t want the design to be mistaken as an art project,” Lomberg said. “I look at things like signs in national parks. You see information signs overlooking great landscapes, and you never think that the sign is lying to you. It’s there to do a job and no one misinterprets it.”

Designing these markers meant examining multiple options for their physical shape, as well. Laying out a number of above-ground markers in a circle, giving shape to the location, seemed viable, and was another option which made it into the proposal. Such markers would contain pictographs and possibly danger warnings in multiple languages. 

They also considered a below-ground alternative: layering the area with buried platforms at various depths. This may deter people from digging; after all, the danger is hidden until brought to the surface.

A depth visualization of the geologic layers above the WIPP: Mescalero Caliche at the surface, Santa Rosa Sandstone to 250 feet, Dewey Lake Redbeds to 550 feet, the Rustler Formation to 850 feet, and Salado Salt to 2,100 feet, with the repository at 2,200 feet. Credit: Jon Lomberg

Ultimately, the proposal conceptualized by minds like Jon Lomberg remains just that—a proposal. While there is no need for such warning markers to exist while the WIPP is still operational, it’s possible the coming decades will see their implementation. Until then, we can allow our minds to wonder about the human who will stumble upon this location in the year 12,023. And after that point, says Lomberg, “we simply have to say, it’s somebody else’s problem.”

REVIEW: Be Data Driven by Jordan Morrow
https://nightingaledvs.com/review-be-data-driven-by-jordan-morrow/
Thu, 22 Sep 2022

Following up on his first book, which considered data at the individual level, Jordan Morrow’s Be Data Driven looks at harnessing the power of data from the organizational perspective in a practical and easily digestible way.

As much more of a data visualization hobbyist than a working professional, I often find that longer-form data content goes over my head or fails to make a meaningful impact. This was not the case for Be Data Driven. As someone who comes from an organizational and executive communications world, I found this book combined the importance of data literacy with the influence it has on a larger scale, all within relatable applications.

I was fortunate enough to sit down with Jordan Morrow to gain insights as to how this book came to be, what its goals are now that the book is out, and what the future might look like for his continuing efforts to increase data literacy and encourage being data driven.

“This book was formed from the impetus of companies wanting to be ‘data driven’ while developing this seemingly ambiguous and nebulous term.”

The world’s technology has advanced and grown tremendously in the past few decades, and more recently, with COVID-19 as a strong contributing factor, we are seeing a drastic surge in the use of data from both an individual and a corporate standpoint. Morrow paints the picture of what an effective data-driven workforce could look like while laying out the core foundations of data strategy and literacy in just the first chapter. These core concepts are expanded upon throughout the rest of Part One. Readers are informed of the impact COVID-19 had on data within organizations, the roles and tools of being data driven, and how all of these elements meld together to form a true data-driven company.

Where tools and technology are only alluded to in Part One, they are explained in depth in Part Two, where Morrow examines the foundational skills gap which exists between companies and the data they are trying to uncover and utilize. Often, organizations purchase tools and technologies with the intention of using them to drive strategy. In turn, those tools and technologies are force-fitted to the employee base and unsuccessfully adopted. With this in mind, Morrow spends the remainder of Part Two detailing the biggest missteps often made by organizations in their goal of being data driven, including the pillars of an organizational data strategy, the gap in leadership, and what Morrow claims to be the biggest hurdle: culture. The section on culture is the one Morrow expressed to me as his personal critical takeaway for readers: “we become so enamored in data analytics when the linchpin, the key to success, is the people.”

In Part Three, the final section of the book, Morrow takes the lessons learned from foundation building and organizational data gaps, and applies them to how readers can effectively build a data-driven organization. One key thing Morrow does in this final part is provide an abridged retelling of each chapter and how they all work together to build a data-driven organization. Breaking these chapters down once again and providing clear instructions on next steps is one of the many things Morrow does which makes this book easily digestible. Providing this information in such a clear, concise, and actionable format allows this book to be read as a guide much more than a text or reference book. Where many data books fill pages with visualizations, hoping to provide context for a reader who may have been bogged down in the over-complicated text of the chapter, Morrow allows the writing to speak for itself and provide context in a way which is not often found in books of this genre. As a result, this book is an enjoyable and insightful read.

“So often, people become enamored with STEM backgrounds and people who don’t come from that STEM background don’t feel like they have a seat at the table.”

As I mentioned earlier, I do not come from a STEM (science, technology, engineering, mathematics) background, but rather a social sciences background. In our conversation, Morrow talked about how data literacy for all, not just those in STEM, is key to larger success. “Everyone is data literate to an extent,” Morrow further explained. “To me, data literacy is paramount to everybody, because we just live in this data-driven world. But it doesn’t mean everybody needs to become a technical practitioner of data and analytics. I want people who might not be data and analytics professionals to just develop confidence and a comfort in using data to help them in their decision making.”

Morrow’s sentiment of ensuring everyone feels like they have a seat at the table, regardless of their background, is evident throughout his book. While Morrow doesn’t shy away from touching on more complex data topics, he doesn’t presume readers know everything on the topic prior to cracking open the book, nor does he speak about data in a way which comes off as gatekeeping from those who are not already in the field.

“Where I want my work to continue is around a holistic approach to data and analytics, the strategies around it, and how that ties to business.”

If you’re wondering where to start with Morrow’s work, whether it be with his first book, Be Data Literate: The Data Literacy Skills Everyone Needs To Succeed or this one, Morrow answers this question for you. “Utilize the two books together. Data literacy is necessary to be data driven and for an organization to succeed. We have to put practical solutions in place. Utilizing the books together will help you build your plan, build your strategy, and to drive it.”


You can purchase Be Data Driven from the publisher’s site, Amazon, or wherever you like to buy your books. Morrow plans to release his upcoming book on the four levels of analytics in May 2023.


Disclaimer: Some of the links in this post are Amazon Affiliate links. This means that if you click on the link and make a purchase, we may receive a small commission at no extra cost to you. Thank you for your support!

The Visual Evolution of the Tommy Westphall Universe
https://nightingaledvs.com/the-visual-evolution-of-the-tommy-westphall-universe/
Tue, 19 Apr 2022

Television series finales often offer their long-time viewers heartfelt moments, character arc completions, plot line conclusions, tears, laughs and more. Unlike any other series finale, the ’80s drama St. Elsewhere offered its viewers something much different – a puzzle. A puzzle which has remained unfinished since it was first hypothesized over two decades ago, offering a unique challenge for data visualization artists. The challenge – visualize a forever-growing list of American television shows and how they’re all connected through the mind of an autistic boy named Tommy Westphall.

Where it all began

St. Elsewhere was an American medical drama which ran from 1982 through 1988. While the show was not an instant hit, it did develop a small but loyal fan base over its six-year run, at one point being part of TV Guide’s “50 Best Shows of All Time” list. Not unlike many shows such as The Sopranos, Twin Peaks, Two and a Half Men and Breaking Bad, St. Elsewhere’s finale went down as one of its most memorable episodes, particularly its final scene.

In the final moments of the show, Dr. Westphall and his autistic son Tommy are seen sitting in Dr. Auschlander’s office inside the hospital as snow falls. The image cuts to an exterior shot of the hospital, shaking. It is revealed at this moment that Tommy and Dr. Auschlander are actually in an apartment building, Tommy playing with a snow globe containing a miniature version of the hospital. After an exchange of dialogue between the characters, it is implied that Tommy had imagined all the events of the series, playing on the now-cliché television trope of “everything was just a dream” by placing the events inside the mind of an autistic boy.

St. Elsewhere had direct connections with 13 other television shows through crossovers, cameos and more. This places the connected shows – M*A*S*H, Method & Red, The Bob Newhart Show, Crossing Jordan, Providence, Julia, Oz, Degrassi Junior High, The White Shadow, Chicago Hope, Tattinger’s, Cheers and Homicide: Life on the Street – in the same fictional universe. In a classic game of six degrees of separation, these 13 connected shows are connected to an additional 18 shows. These connections grow and grow to the current list of 577 shows. Some of the connections are direct – in the form of crossovers and cameos. Some series spin off entire other series, expanding their fictional universes. Some connections are indirect – in the form of fictional places, character names, awards, newspapers, potato chip brands and more.

What began as a hypothesis between two friends in 1999 has since evolved into a cult following of contributors adding to the Tommy Westphall Universe.

The evolution

In 1999, Keith Gow and Ash Crowe began discussing the Tommy Westphall Universe. They exchanged emails, detailing everything from The Simpsons to Jay Leno, ultimately creating the first list and chart of the Tommy Westphall Universe in 2011.

The first chart developed to visualize the Tommy Westphall Universe, 2011. Image credit: Keith Gow and Ash Crowe.

This chart set a precedent that the list could and should exist visualized, not simply in text. At the time, the list contained fewer than 200 shows.

Over the next four years, the chart grew to just over 400 shows, more than doubling from the original chart.

The second major evolution of the chart, 2015. Image credit: Keith Gow and Ash Crowe.

When this chart was presented in 2015, individuals following the evolution of the Tommy Westphall Universe attempted to visualize the universe in their own way.

In 2015, a physical version and reimagined visualization of the Tommy Westphall Universe appeared at the University of Waterloo’s art gallery, designed by Dave Dyment.

A physical, reimagined version of the Tommy Westphall Universe, designed by Dave Dyment, 2015. Photo credit: Dave Dyment.

The next major evolution in the chart was posted on Reddit in May 2020 by user u/TheTrueBreadLord. The chart was a significant elaboration on the 2015 chart by Gow and Crowe, captioning each connection and how the shows were linked.

The fourth major evolution of the chart, 2020. Image credit: Reddit user u/TheTrueBreadLord.

Up until I redesigned the chart between March and April 2022, this was the most comprehensive visual representation of the Tommy Westphall Universe.

Today

When I came across the chart created by Reddit user u/TheTrueBreadLord, I wanted to try to visualize the Tommy Westphall Universe in my own way. Where my chart differs is in the added variation in bubble size, which depends on each show’s number of connections, and in color-coding the connections based on the way each show is linked.

This is not meant to be all-encompassing. In the month it took me to research the latest updates to the Tommy Westphall Universe, 20 more additions were made via the Tommy Westphall Universe Wiki. This was my way to visualize the universe in a way which was easier to read, with lines color-coded and non-overlapping, and varying the size of markers for television shows.
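For readers who want to experiment with a similar approach, here is a minimal sketch using the networkx and matplotlib libraries. The handful of edges is an illustrative subset of well-known connections, and the size and color choices are arbitrary, not the values used in my chart.

```python
import networkx as nx
import matplotlib.pyplot as plt

# An illustrative subset of connections (not the full 577-show list).
edges = [
    ("St. Elsewhere", "Cheers", "crossover"),
    ("St. Elsewhere", "Homicide: Life on the Street", "crossover"),
    ("Cheers", "Frasier", "spin-off"),
    ("Homicide: Life on the Street", "Law & Order", "crossover"),
]

G = nx.Graph()
for a, b, kind in edges:
    G.add_edge(a, b, kind=kind)

# Bubble size scales with the number of connections; edge color encodes the link type.
node_sizes = [400 * G.degree(show) for show in G.nodes]
palette = {"crossover": "tab:blue", "spin-off": "tab:orange"}
edge_colors = [palette[G[u][v]["kind"]] for u, v in G.edges]

pos = nx.spring_layout(G, seed=42)
nx.draw(G, pos, with_labels=True, node_size=node_sizes, edge_color=edge_colors, font_size=8)
plt.show()
```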

The future

The Tommy Westphall Universe will never be complete. It has been further hypothesized that the Tommy Westphall Universe encompasses 90 percent of all American television shows.

While the list continues to grow, a challenge is offered to data visualization artists. Developing a new map is a lot of work, but as we adopt new tools to visualize data, this is the perfect challenge for television-loving practitioners.

Designing for Neurodivergent Audiences
https://nightingaledvs.com/designing-for-neurodivergent-audiences/
Tue, 08 Feb 2022

When a group of autistic individuals coined the term neurodiversity in an attempt to redefine their identity, few could have accurately predicted the impact it would have on design, education, and society as a whole. With this term, autistic individuals asserted their right to move beyond negative colloquialisms. More than a decade later, neurodiversity has become synonymous with those having neurological conditions such as behavioral and emotional disorders, learning disabilities, ADHD, Asperger’s, and autism. This umbrella term informs others to see those who are neurodiverse as having a “differently wired brain,” as opposed to being someone unable to fit into the model of social norms.

Data visualization artists are responsible for making large amounts of information easily accessible and digestible for a wide array of readers. This includes the 17 percent of the global population who have been diagnosed as neurodiverse, as reported in the British Medical Bulletin, published by Oxford University Press. For those in this community, having a visual representation of data can be invaluable, but there are still ways to make data more accessible. Taking extra care to consider elements such as emphasis, balance, proportion, typography, and color can make a noticeable difference.

Here are a few guidelines designers should consider when creating visualizations to make them more accessible to neurodiverse readers, from a neurodivergent designer.

Typography

Font selection is one of the most important decisions a designer makes. It’s the key to ensuring the data you’re attempting to communicate is understood beyond your design. While it may be a go-to decision to select a font which looks professional, such as Times New Roman, research shows you may want to avoid serif fonts altogether when attempting to appeal to neurodiverse audiences. 

Serif fonts can be identified by the tails and ticks on the ends of most strokes. While serif fonts like Times New Roman have been a standard for professional writing for decades, they have been found to be far less readable for neurodiverse audiences, according to the British Dyslexia Association. As an alternative, sans-serif fonts prove to be much easier to comprehend. While fonts resembling handwriting, such as Comic Sans, appeal to neurodiverse audiences, they have limited use in the vast majority of data visualization projects. Sans-serif fonts still offer multiple variations to add dimension to your design, where using different weights and sizes can make for a visually interesting piece.

A sans-serif font can be much easier for neurodivergent readers to process. Here are a few fonts you may want to try: Apercu, Brandon Grotesque, Brother 1816, Century Gothic, Colfax, Corbel, Futura PT, Lato, Raleway, and Roboto.
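If your charts are generated programmatically, the preference can be set once at the top of a script. This is a minimal matplotlib sketch; font availability depends on what is installed locally, and DejaVu Sans (which ships with matplotlib) acts as the fallback if the named fonts are missing.

```python
import matplotlib.pyplot as plt

# Prefer sans-serif faces for all chart text; names earlier in the list win if installed.
plt.rcParams["font.family"] = "sans-serif"
plt.rcParams["font.sans-serif"] = ["Lato", "Century Gothic", "Roboto", "DejaVu Sans"]

fig, ax = plt.subplots()
ax.set_title("Chart text rendered in a sans-serif face")
plt.show()
```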

Color

Color can be a data visualization artist’s best friend, offering an easy way to divide information and adding variables to a chart or dashboard. While it is both reasonable and natural to feel the need to use drastically different colors to differentiate areas, it can be overstimulating for a large percentage of people with dyslexia who suffer from Scotopic Sensitivity Syndrome, according to the National Library of Medicine. This is where the usage of single-hue scales can be a perfect solution.

Single-hue color scales limit the number of hues in use while still providing plenty of variations. While this won’t work for projects breaking down a large number of variables where color is your only option for variation, it surely has its place in many others.

Selecting a background color is also crucial for contrast. A designer should not overlay two highly contrasting colors on top of each other, which causes an unpleasant viewing experience for those who are neurodivergent. For many neurodivergent audiences, there is a preference for muted and pastel hues and neutral tones. This may mean a matte-black background with two to three pastel hues to depict data, or a neutral tan, gray, or white background, to prevent colors from overwhelming the reader.

Single-hue color scales can minimize high contrast, which can be overstimulating for those who are neurodivergent.
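As a small, concrete sketch of the idea in matplotlib: one hue varied only by lightness, drawn over a neutral background, with the lightest and darkest extremes trimmed to soften contrast. The colormap, hex values, and data below are illustrative choices, not prescriptions.

```python
import matplotlib.pyplot as plt
import numpy as np

values = [18, 34, 27, 42, 23]                     # made-up data for illustration
labels = [f"Group {i}" for i in range(1, 6)]

fig, ax = plt.subplots(facecolor="#f4f1ea")       # neutral, low-glare background
ax.set_facecolor("#f4f1ea")

# Single-hue scale: sample the Blues colormap, avoiding its extreme ends.
shades = plt.cm.Blues(np.linspace(0.35, 0.85, len(values)))
ax.bar(labels, values, color=shades)
plt.show()
```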

Visual hierarchy

Having a clear path for a reader to follow is essential for neurodiverse audiences. When looking at the project as a whole, the reader’s eyes should be able to follow a path to easily find the title, description, primary graphic, and key. This greatly aids comprehension, ensuring information isn’t overlooked, nor a strain to find.

Another way to ensure your project has an effective visual hierarchy is to break up large bodies of text with visual elements. When there are too many large blocks of text, it can be overwhelming and harder for neurodiverse audiences to comprehend. Breaking up the text into smaller, more digestible pieces will considerably increase comprehension. 

A visual hierarchy ensures balance in our work; by using a deliberate balance of symmetry or asymmetry, strong visual stability is enforced.

A proper mix of text and images ensures a visual hierarchy, making the work easier to process.

Patterns

The use of patterns, both organic and geometric, appeases a neurodivergent audience’s need for predictability and repetition, according to a study conducted by the Association for Psychological Science. This technique draws on fractal structures, which naturally help neurodivergent people understand, manage, and navigate the world; when used in data visualization projects, patterns can likewise improve understanding, management, and navigation of large amounts of information. These patterns, used in moderation to avoid overstimulation, can add another element to visual storytelling efforts.

Many of those who are neurodivergent depend on repetition and predictability to feel in control and comfortable with what they’re attempting to comprehend. Having a pattern combined with a well-executed visual hierarchy can keep a reader who is neurodiverse more engaged.

Organic and geometric patterns can be extremely useful because they're predictable - embracing fractal structures.
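One small way to experiment with this is matplotlib's hatch patterns, which add a predictable, repeating texture on top of color and double as a redundant cue in grayscale. The categories and values below are made up for illustration.

```python
import matplotlib.pyplot as plt

categories = ["A", "B", "C", "D"]          # hypothetical categories
values = [12, 30, 22, 17]
hatches = ["//", "..", "xx", "--"]          # one repeating texture per category

fig, ax = plt.subplots()
bars = ax.bar(categories, values, color="#a6bddb", edgecolor="#2b5876")
for bar, hatch in zip(bars, hatches):
    bar.set_hatch(hatch)                    # texture distinguishes bars alongside color
plt.show()
```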

Data visualization artists are storytellers with the unique ability to take information that requires skill – and patience – to dissect and turn it into beautiful designs. By designing data visualization projects to accommodate a variety of neurodivergent conditions, we as dataviz practitioners further ensure our work is accessible to all.
