Tableau Archives - Nightingale | The Journal of the Data Visualization Society

Building Tableau Dashboards for the PowerPoint Download
https://nightingaledvs.com/building-tableau-dashboards-for-the-powerpoint-download/ | Thu, 26 Mar 2026 12:00:00 +0000

The post Building Tableau Dashboards for the PowerPoint Download appeared first on Nightingale.

Working in reporting and analytics for the last six years has made me realize an uncomfortable truth about Tableau: Your beautiful interactive dashboard will often become a static PowerPoint slide.

If you work in sales ops, finance, or any executive-facing analytics team, you already know this. Your vice president won’t open Tableau Server at 9 a.m. before the board meeting. They’ll download your dashboard as an image or PowerPoint file, paste it into slide 17, and present it to the C-suite.

Once I accepted this reality, I started treating it as a design problem. Here are five non-negotiable factors I learned on my Tableau journey.

The first Excel dashboard, created in 1990 using the first version of Excel for Windows. Source: Microsoft

1. Design for PowerPoint From Day One

Device preview matters far more when your dashboard will live in a PowerPoint deck.

In the early stages of redesigning an executive-level sales report, I built my dashboard in Tableau’s default “Desktop Browser” view. When I downloaded it as PowerPoint, it crushed into a single slide with illegible text — a formatting disaster right before a leadership presentation.

The fix here is using Tableau’s built-in PowerPoint layout (16:9 aspect ratio) from day one.

Source: Rituparna Das

This ensures your dashboard fits perfectly into standard Google Slides or PowerPoint without awkward cropping or white space. Don’t design for Tableau’s default dimensions — design for where your dashboard will actually be consumed.
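
If you do set custom dimensions instead of using the built-in layout, a quick sanity check keeps them on a 16:9 ratio. A minimal sketch; the pixel values shown are just common choices, not Tableau defaults:

```python
def height_for_16_9(width_px: int) -> int:
    """Return the height that keeps a dashboard at PowerPoint's 16:9 aspect ratio."""
    return round(width_px * 9 / 16)

# e.g. a 1600 px wide dashboard should be 900 px tall
print(height_for_16_9(1600))  # 900
print(height_for_16_9(1280))  # 720
```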

Pro tip: Always test your export before the final version. Click “Dashboard > Export as PowerPoint” to preview exactly what stakeholders will see.

2. Accept That 80% of Functionality Disappears

This is the hardest lesson: You must build assuming zero interactivity.

What dies in PowerPoint:

  • Filters (static view only)
  • Parameters (whatever was selected during download)
  • Hover tooltips (invisible)
  • Drill-downs (gone)
  • Dashboard actions (non-functional)

This changes your design strategy. Now you have to build a separate static version for each filter setting your users will want to view. For example, my executives were interested in seeing pipeline performance across sales regions, sales clusters, business units, and product lines. What would have been one dashboard filter became four separate dashboards I had to create:

  • “Pipeline_Review_by_Sales_Region”
  • “Pipeline_Review_by_Sales_Cluster”
  • “Pipeline_Review_by_Business_Unit”
  • “Pipeline_Review_by_Product_Line”

Yes, it’s more work. Yes, it feels redundant. But it’s the only way to ensure your stakeholders see what they need without interactivity.
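
If your dashboards live on Tableau Server or Cloud, the repetitive exporting can at least be scripted. A rough sketch, assuming the tableauserverclient package; the server URL, token, view id, and field name are hypothetical placeholders, not values from this article:

```python
# Sketch: pre-render one static PNG per filter value so every "filtered
# view" exists as its own file for the deck. Assumes tableauserverclient;
# server URL, token, view id, and field name below are placeholders.

def export_filename(base: str, value: str) -> str:
    """Predictable name, e.g. Pipeline_Review_by_Sales_Region.png."""
    return f"{base}_by_{value}.png"

def export_static_views(view_id: str, filter_field: str, values: list[str]) -> None:
    import tableauserverclient as TSC  # pip install tableauserverclient

    auth = TSC.PersonalAccessTokenAuth("token-name", "token-secret", site_id="my-site")
    server = TSC.Server("https://my-tableau-server", use_server_version=True)
    with server.auth.sign_in(auth):
        view = server.views.get_by_id(view_id)
        for value in values:
            opts = TSC.ImageRequestOptions(
                imageresolution=TSC.ImageRequestOptions.Resolution.High
            )
            opts.vf(filter_field, value)  # apply the view filter server-side
            server.views.populate_image(view, opts)
            with open(export_filename("Pipeline_Review", value), "wb") as f:
                f.write(view.image)

print(export_filename("Pipeline_Review", "Sales_Region"))
```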

Every critical insight must be visible on page load. If it requires a click to reveal, assume it will never be seen.

3. Use Containers for Layout Control

When your dashboard contains multiple visualizations, containers keep everything locked in place during the PowerPoint export. Without them, floating objects shift unpredictably — your perfectly aligned KPI cards end up overlapping your bar chart in the downloaded version.

PowerPoint downloads don’t tolerate white space. A minimalist Tableau dashboard might look elegant on screen, but it looks unfinished and unprofessional in a deck. Executives expect dense, information-rich slides.

Why containers solve both problems:

  • They lock your layout in place (no shifting elements)
  • They help you maximize space efficiently (no awkward gaps)
  • They give you precise control over how information flows

Source: Rituparna Das

This dashboard exports with excessive white space, making it look unprofessional in decks.

Best practice workflow:

  1. Create a low-fidelity mockup of your dashboard layout
  2. Build the container structure first (horizontal and vertical containers)
  3. Drop visualizations into containers last

Pro tip: Watch this Tableau container best practices video before building your next dashboard — it’ll save you hours of reformatting frustration.

4. Establish Governance Standards for Version Control and Collaboration

If you’re working collaboratively or managing multiple dashboard versions, implement a simple visual system:

Source: Rituparna Das

Use the color coding available for dashboards:

  • 🟢 Green: Production-ready, safe to download
  • 🟡 Yellow: Work in progress, do not present
  • 🔴 Red: Draft/testing only

Keep consistent and clear worksheet naming conventions. This will save your sanity.

❌ DON’T: “Bookings (1)”, “Bookings (1)(1)”, “Sheet 3”
✅ DO: “Q4_Bookings_Final”, “Pipeline_Review_v3”, “Pipeline Coverage_BarChart”
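
Auto-generated names are easy to catch mechanically. A tiny, hypothetical lint you could run over a list of worksheet names; the pattern only flags the duplicate/default patterns shown above, and your own convention may be stricter:

```python
import re

# Flags Tableau's auto-generated patterns: "(1)" duplicate suffixes
# and default "Sheet N" names. Anything else passes.
AUTO_JUNK = re.compile(r"\(\d+\)|^Sheet \d+$")

def is_clean_name(name: str) -> bool:
    return not AUTO_JUNK.search(name)

for name in ["Q4_Bookings_Final", "Pipeline_Review_v3", "Bookings (1)(1)", "Sheet 3"]:
    print(name, "->", is_clean_name(name))
# Q4_Bookings_Final -> True
# Pipeline_Review_v3 -> True
# Bookings (1)(1) -> False
# Sheet 3 -> False
```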

5. Add Company Logos

Align as closely as possible to your organization’s standard slide deck template.

Why this matters: Your dashboard might be internal today, but it’ll be in a client presentation tomorrow. When your VP forwards it externally without asking you first (and they will), professional branding matters.

Where to place logos:

  • Top-left or top-right corner (consistent with company templates)
  • Footer with date/data source
  • Consider adding a “confidential” watermark for internal metrics

The Bottom Line

The moment you accept that your Tableau dashboard will become a PowerPoint slide, you start designing better dashboards.

Stop optimizing for interactivity. Start optimizing for screenshots.

Use the 16:9 layout. Build static versions of filtered views. Lock everything in containers. Name your worksheets like a professional. Add your company logo.

Your stakeholders don’t care about your elegant parameter actions if they can’t paste your dashboard into their Monday morning deck.

Sometimes being a great analyst means accepting that your masterpiece will be Ctrl+C’d, Ctrl+V’d into slide 23 — and designing for that reality from the start.

Categories: How To

Meet Will Sutton, former Iron Viz Champion
https://nightingaledvs.com/meet-will-sutton-former-iron-viz-champion/ | Fri, 22 Nov 2024 19:21:07 +0000

The post Meet Will Sutton, former Iron Viz Champion appeared first on Nightingale.

Iron Viz is the world’s largest data visualization competition. What began as a breakout session at Tableau Conference 2011 has grown into a global phenomenon and become a core part of the Tableau Community. Three Iron Viz contenders take center stage and have 20 minutes to tell the most compelling story using the same data set. The 2022 competition saw the crowning of Will Sutton as champion. On behalf of the Data Visualization Society’s Early Career Committee, I asked Sutton a bit more about himself and his win.

How long had you been doing dataviz when you entered the IronViz competition and were you making a living doing dataviz at that time?

I’d been working in data analytics for about 8 years before I arrived at the Iron Viz stage. However, it was only about three years before the contest that I really started focusing on improving my dataviz and Tableau skills. At the time, data visualisation was certainly an important part of my role, but so were the technical skills of SQL, R, and Python.

Can you walk us through the preparation process you went through for Iron Viz? What were some of the key steps or strategies you think contributed most to your success?

Preparing for Iron Viz was very challenging. In the final, you are given 20 minutes to build a data visualisation live on stage against two competitors, using a given dataset. It’s hard enough telling a compelling data story, but against a time limit and a live audience, it really adds a lot of pressure.

This was a real step out of my comfort zone, so rather than focusing on winning the contest, I aimed to make a win on my own terms. Let me explain. Career-wise, I was in a difficult spot: half of my team at work had been made redundant, my wife and I were having difficulty getting a mortgage, and I had just been selected for the final.

It was a lot to go through at the time, so I took a long weekend trip to Wales. Whilst I was out there, I realised what an opportunity I had—not the prize money, but the chance to show my skills to thousands of employers. So I viewed the final more as a showcase of what I could do with Tableau, rather than worrying about being first, second, or third. With that in mind, it got me thinking more about the positives that could happen after the event.

The competition environment can be intense. How did you manage the pressure and incorporate any feedback you received along the way?

I found it important to identify the main aspects that were causing me stress about the competition and to reframe them. The stress was only going to distract me from performing my best. One big hurdle I was facing was presenting in front of an audience of 5,000 people. Public speaking wasn’t my thing. I’m certainly not the most outspoken person, and when it came to presenting, I just wanted to get it over with as soon as possible.

Thinking back to my purpose—that thousands of people were going to see my work—I realised I really needed to sell my skills. So I approached it with the mindset that this was a skill I could improve, and if I could get through this event, future work presentations would be much less daunting.

I spent a long time reviewing TED Talks on education, which was the dataset I would be working on. The content was useful for ideas, but importantly, I could see and understand how other speakers approached the presentation aspect. I found great tips, such as starting the presentation with a question to get the audience thinking and adding more emotion-driven words to build interest in the topic. Once I realised the techniques being used, it gave me much more confidence in my own presentation delivery.

What was the very first indication that your life had changed after the win?

The first sign that things had changed was the sudden recognition. At the conference, people I’d never met were coming up to me, congratulating me on my presentation and sharing how it resonated with them. It was surreal to receive that kind of attention. But it came with lots of new opportunities: chances to speak at user groups, collaborations, and job leads.

The Tableau “DataFam” community is unique for a software program/tool. It’s well-known for being an accelerator in building skills in the program. How instrumental was it in contributing to your win?

Absolutely instrumental. The community truly helped me gain the skills I needed to win the contest. I had been part of community projects like #MakeoverMonday, which focused on regularly building data visualisations with simple datasets. My initial attempts certainly wouldn’t win any awards, but by seeing what others submitted and receiving feedback on my work, I saw a rapid improvement in my dataviz portfolio.

The support and feedback from the community helped me grow and pushed me to improve. It then gave me the confidence to do more of my own projects and even help set up the community project #GamesNightViz, which was all about visualising data about games. Funnily enough, how I ended up in the final was from a visualisation I built to promote the #GamesNightViz initiative.

Data visualization competitions are all about standing out and pushing the envelope. Where did you find inspiration for your winning visualization, and how did you balance creativity with clarity?

For me, it comes back to the question: “What do you enjoy?” Since I was using the contest to showcase my skills, I didn’t want to become known for something I wouldn’t want to do in the future. Plus, I think when you enjoy a topic, you’re naturally more interested and creative in what you create.

A big deciding factor for me was how the visualisation would be viewed. In the first round of the contest, I knew the work would go to a panel of judges who would take their time reviewing it, so I could create a bit more of a personalised user experience. Here, the user would play a game, which would add their results to the survey data I had collected, making the data more interesting as it was relative to them—for example, “You chose X, and Y% of respondents agreed with you.”

For the final, it’s a three-minute presentation. There’s less need to dive into details or go overboard with creativity, as it’s all about the clarity of the message. Having said that, as a presenter, you’re in control of what is displayed on screen and how the content is consumed. So you can direct and twist the story as you present it. This worked well with my love of animations to get my audience’s eyes back on the screen when I shared the next insight.

I see from LinkedIn that you started at the Tableau consultancy The Information Lab in June 2022, just after you won IronViz. Can you talk about that and how it was influenced by your achievement?

Winning Iron Viz definitely opened up new opportunities. Shortly after the competition, I received a lot of job offers and, yes, joined The Information Lab shortly after the contest win. They’re a Tableau partner, very well known in the community, and it was too good of an offer to refuse. Speaking with other competitors, past and present, Iron Viz allowed them to change and develop their roles regardless of the result in the final.

It’s been approximately 2.5 years since you won IronViz 2022. You mentioned that a lot has changed for you career-wise. Aside from The Information Lab, how has winning affected your career or invited other opportunities?

It has been fantastic for meeting new people. Before we headed on stage, they played a little intro video that showed us in our hometowns, so the bright lights of Las Vegas got to see me running around sleepy Suffolk. I had folks come up to me saying they were from a small town too and loved the video.

I’ve also become more involved in the Tableau community, contributing to projects and attending events to help others develop their skills. It’s also helped with working more with Tableau. Lately, I have been working on incorporating LangChain (an AI framework) into Tableau, which is something I’m super excited about and calls for a new set of skills. Overall, winning Iron Viz opened doors and gave me the confidence and contacts to pursue new and exciting opportunities.

You mentioned that post-contest life called for a new set of skills. Can you talk more about what this means specifically?

In the build-up to the contest, I was heavily working on my dataviz skills. After the contest, public speaking became a big part of my life—I had to get comfortable presenting in front of large audiences and engaging with people at events. I still find it tiring as an introvert, but it’s been so beneficial to develop confidence in this area.

Since the contest, I’ve been pursuing developments in AI, building on what I’ve learnt about dataviz. It’s a very different set of skills, with much more jargon, but it’s been very rewarding to find new solutions with this technology.

Aside from recommending that they win Tableau IronViz, what’s the most important advice you would give to early career data-vizzers?

Let your interests lead you. When you’re genuinely interested in a topic, it shows in your work. You’ll put in more effort, be more creative, and produce visualisations that really resonate with people.

Also, focus on what skills you want to gain and what a win looks like on your terms. Whether it’s technical abilities, storytelling, or design principles, knowing what you want to improve will guide your learning.

Will’s final viz

The viz that got Will into the finals

Categories: Community

SPOTLIGHT: W.E.B. Du Bois Portrait Gallery
https://nightingaledvs.com/spotlight-w-e-b-du-bois-portrait-gallery/ | Tue, 06 Sep 2022 14:06:39 +0000

The post SPOTLIGHT: W.E.B. Du Bois Portrait Gallery appeared first on Nightingale.

Chimdi Nwosu’s interactive visualization, the W.E.B. Du Bois Portrait Gallery, is a beautifully designed exploration of 20 re-created Du Bois charts. The gallery won Favorite Viz of the Year at the 2022 Tableau Conference. Check it out here and read about his inspiration and process below.

Inspiration 

The W.E.B. Du Bois Portrait Gallery began as an attempt to participate in the 2022 #DuBoisChallenge. However, not long after I started working on it, it became a broader exploration into W.E.B. Du Bois and the incredible legacy he left behind in his data visualization work. As a black man, I found that Du Bois’ depiction of the black experience post-slavery allowed me to explore that time in our history and see how it impacted us back then and even today. This was probably the most valuable part of this process for me.

The Du Bois Challenge

The Du Bois Challenge hosted by Anthony (AJ) Starks, Allen Hillery, and Sekou Tyler provides a platform for people to engage with and re-create Du Bois visualizations and share their work online. AJ wrote an excellent piece about this challenge, and I recommend you check it out here.

As an avid explorer in data viz, I was blown away by AJ’s gallery of Du Bois’ portraits. This year’s challenge provided ten portraits for us to recreate over time, but after attempting the first, I became curious about the second, the third, and the fourth. I realized I wouldn’t be staying within the limits of the challenge, but the work had already evolved into a personal journey for me.

Most of the visualizations I was re-creating were part of the American Negro exhibit from the 1900s. In my opinion, Du Bois was way ahead of the times in terms of data visualization. 

Gallery Style Viz and Choosing the Gallery Portraits

I always knew I wanted to make multiple vizzes. My initial plan was to gradually release them over the timeline of the #DuBoisChallenge. However, as I spent time creating the vizzes and researching Du Bois, I encountered this gallery by North Carolina State University, which also features some of Du Bois’ portraits. After seeing that, I scrapped my idea of releasing multiple individual vizzes and decided instead to emulate their gallery-style layout. The convenience I experienced while interacting with the NC State gallery was something I wanted to offer to my audience. 

From that point, it was simply a matter of choosing which vizzes to include in the gallery. There wasn’t any fancy selection process. I chose vizzes that intrigued me and were diverse in topic. In depicting the black experience, Du Bois charted population growth in the black community, the value of property/wealth, the journey to freedom after slavery, and more. My goal was to touch on as many of these topics as possible. 

Processes, Tools, and Techniques 

As many practitioners know, a big part of any data viz project is collecting data and gathering relevant information resources. AJ had already collected all the data and original portraits and made them available on his GitHub profile. This saved me a lot of time, and I’m grateful to him, as it let me focus on the exploration.

The Du Bois Challenge features many entries made in R, Python, Power BI, and other tools, but not many Tableau entries. This was one of the main reasons that it felt great to represent Tableau this year. Specifically, I used Tableau to put together the charts; the second tool I used was Adobe Illustrator. As a fan of design, learning design tools has allowed me to take some of my visualizations to the next level, and it was only natural to incorporate one here. I used Illustrator for design aspects like the background/gallery layout and other minor things like annotations and making the shapes used in some visualizations.

Timeline and Challenges Faced 

The time taken for each viz varied so much. Many of the bar charts and basic charts took 10-15 minutes, while others took much longer. Overall, I found the vizzes with many “hand-drawn” components and annotations took a lot longer because of the level of detail. 

Here are some examples: 

The easy stuff:

Stuff that took a bit more time:

From a dataviz perspective, some of those that took longer are simple in theory. For example, the viz on the left is a simple line chart, but the axes, annotations, etc., required building and bringing a lot of components into Tableau. Same with the one on the right—it’s a basic stacked bar chart, but the details required a lot of time. As a matter of fact, despite being very detailed, the re-creation on the right is still missing a few details from the original:

While the lines connecting the bars were a great addition to the original viz, I felt they didn’t provide much extra value to the chart. And honestly, I couldn’t find a practical way to add them, so I excluded them.  

I made decisions like this throughout the visualization process, and they emphasized a certain aspect of Du Bois’ visualizations that stood out to me: the deceptive simplicity of many of the Du Bois portraits.

There were many times where it was shocking to me how much effort it took to make some of the simpler-looking vizzes. I think we can attribute this to the particular tool I used—Tableau. I imagine that making some of these charts by hand may actually be easier.

Examples

Continuing with the example based on portrait 4 in my gallery (Plate 21 by Du Bois), here’s what it took:

  • Two sheets are used for the main view.
  • 16 individual components are required to make the axis.
  • I created the annotations in Illustrator.

Thinking about it now, I wouldn’t refer to this aspect as a challenge, but more of a necessary exercise in finding creative ways to replicate.

Another aspect that required some thinking was deciding what to preserve from the originals and what to remove. As a fan of minimalism in visualization, I occasionally noticed corrections and omissions that I could make while building. My goal was to do so without compromising the work’s authenticity, while modernizing the look and feel.

Here are some examples:

1. The proportions of the original viz on the right were throwing me off, and I felt the proportions would be easier to compare if all the bars totaled 100%.

2. The original viz on the right has a number of details, like the right arrows at each category and lines connecting the bars. Though great additions, I wanted to modernize and clean up the viz, so I omitted both of these details.

3. This one was an easy decision for me, as it felt like the extra twists in the circle made it harder to compare the values of the proportions. I decided to build mine with no extra twists.

Conclusion

Creating the gallery was an informative process for me, and it was gratifying to see the reception to the viz within the data visualization community. I had an opportunity to present the viz at a Data + Diversity event, where we were able to dig into Du Bois’ history. The viz was also talked about at a Tableau Chicago TUG (Tableau User Group) event. At that point, I thought it was awesome to simply help raise awareness about W.E.B Du Bois and his great work. 

However, in May 2022, during the Tableau Conference, the viz won Favorite Viz of the Year, and I have to admit that wasn’t something I anticipated. On that day, I had the conference playing in the background while I was working from home. When they announced the award, I just about fell out of my chair in shock. After being a part of the #datafam for a while and having been shown so much love already, I was floored and grateful that they had voted this viz as their favorite. I still have a lot of “thank you’s” in the tank, so here’s one more big thank you to everyone who voted for it.

And, following this viz, I was invited to collaborate more on the Du Bois Challenge so you can look forward to that in the future!


Thanks for reading about my experience. You can check out the viz here.

Five Free Data Visualization Tools for Beginners
https://nightingaledvs.com/five-free-data-visualization-tools-for-beginners/ | Wed, 19 May 2021 13:00:00 +0000

The post Five Free Data Visualization Tools for Beginners appeared first on Nightingale.

We have used data visualization in some way or other since time immemorial, from early cave paintings to today’s advanced information dashboards. The human eye is drawn to colors and patterns. Our brains crave visual information. In fact, according to MIT, “90% of the information transmitted to our brains is visual.”

Two years ago, I began to recognize the value of isolating insights and driving understanding by developing my visualization skills. But, when I looked at sophisticated vizzes like the ones above, I was worried that I needed to learn extensive coding and obtain training on premium dataviz tools. Instead, I discovered five free tools useful regardless of your coding prowess.

In this article, I review those five free tools by creating a viz in each one using the same dataset: Number of Livestock Species of Karnali, Nepal, taken from the Ministry of Livestock Development organization on Open Data Nepal.

1. Tableau Public

Screenshot from the Tableau Public website.

Tableau is one of the world’s leading analytics platforms. Tableau Public is a popular visualization application that allows you to create a wide range of charts, graphs, maps, and other graphics. The visualizations you create can be conveniently embedded in any web page and shared with your friends, organizations, industry peers, and so on. Tableau’s public gallery contains a wide range of visualizations created by the community, and you can explore and interact with other people’s vizzes. One of the best things about Tableau is its user community, popularly known as #Datafam on Twitter. Everyone I have encountered through #Datafam has offered a helping hand and sincerely seems to want every member of the community to improve, regardless of their proficiency.

Tableau Public is designed for scientists, academics, or anyone who wants to begin exploring data visualization. For anyone looking to share data, collaborate publicly, and learn data visualization inspired by other people’s work, Tableau Public is worth a try.

Pro: Tableau Public offers unparalleled data visualization with fully functional and interactive graphics.

Con: You cannot save your workbook locally, and everything you create is shared publicly on your Tableau Public profile, which limits its usefulness for work based on proprietary data.

Example: Below is a viz I created in Tableau Public showing the number of livestock species in Karnali, Nepal.

Tableau Public Viz: Lifestock of Karnali People, Nepal
Viz by the author. View the interactive visualization here.

2. Flourish Public

Screenshot from the Flourish website.

When it comes to storytelling, Flourish tops the list and is ideal for anyone looking to tell stories with data. It enables immersive storytelling rather than more traditional forms of visualization such as tables, diagrams, and dashboards. Unlike Tableau Public, Flourish does not require a desktop edition; it is browser-based. You can choose from a wide range of flexible templates in the library. Flourish is typically used for social media sharing and website content.

Flourish enables journalists to guide readers through one or more visualizations, animating between views to create a narrative. If you are a journalist and involved in newsrooms, Flourish is worth a try.

Pro: It is super easy to create interactive rotating globes and maps in Flourish.

Con: It doesn’t support adding data from Google Sheets or other online sheets in its free version.

Example: Below is an alternative depiction of my previous viz in Flourish Public regarding the number of livestock species in Karnali, Nepal.

Viz by the author. View the interactive visualization here.

3. Infogram

Screenshot from the Infogram website.

Infogram is a browser-based visualization platform that offers interactive charts, graphics, infographics, and maps to tell a story and has many free templates from which to choose. In addition to supporting local data uploads, it also supports uploads from Google Sheets, Dropbox, MySQL, and direct JSON data feeds. Infogram provides object animations that allow you to easily zoom, bounce, rotate, fade, and slide objects into your work.

Infogram is for anyone who wants to stand out with data-driven content. If you are a marketer, media company, or strategic business leader, Infogram is worth a try.

Pro: It is easy to create reports, slides, dashboards, email headers, and social media content in addition to interactive visualizations.

Con: In the free version, when you want to insert interactive charts into your website, you get a large Infogram logo. You have to upgrade to a paid version to remove it.

Example: Here is my Nepalese livestock interactive viz in Infogram.

4. Datawrapper

Screenshot from the Datawrapper website.

Datawrapper is a browser-based data analysis tool that can easily transform numerical data from PDFs, CSVs, and web sources into charts and graphs. It requires no sign-up; you can get started by selecting the “start creating” button on its landing page. River, a shared platform developed by Datawrapper, helps you share relevant data, charts, and maps with other users. It also allows users to reuse visualizations made by others.

If you are a journalist, engineer, or other design professional who needs to compile data from a range of native formats, Datawrapper is worth a try.

Pro: Datawrapper is super easy for beginners because it guides you through every step of the visualization process.

Con: Customizing fonts and colors is tough compared to the other tools in this list.

Example: Here are those Nepalese livestock comparisons reimagined in Datawrapper.

Viz by the author. View the interactive visualization here.

5. Google Data Studio

Screenshot from the Google Data Studio website.

Google Data Studio is a powerful, browser-based analytics and visualization platform. It allows you to generate informative reports and eye-catching dashboards to communicate, interact, and share publicly. If you already have any type of Google account, setting up Google Data Studio is quick and easy.

If you are an engineer or a designer and you frequently use Google products, Google Data Studio is worth a try.

Pro: The ability to combine and view Google Analytics, Google Ads, and Search Console data is fantastic.

Con: Again, there is limited ability to customize compared to other tools on this list.

Example: Below is what my Nepalese livestock visualization looks like when created in Google Data Studio.

Viz by the author. View the interactive data visualization here.

Data visualization is not limited to presenting data in a graphical form; it is also a way to weave a story. Of course, there is more than one way to tell a story, depending upon your goals and your audience. Luckily, there are free tool options for a variety of types of visualizations: whether you need a graph or an interactive map or a narrative story. Free tools are designed to be limited in functionality — for trial purposes — but they can be useful resources for both dataviz beginners and experienced practitioners alike. Limited features and no-fee trial versions can help reduce the intimidation factor for beginners. They can also encourage experienced practitioners to branch out with minimal initial investment while exploring paths to further hone their skills.

The post Five Free Data Visualization Tools for Beginners appeared first on Nightingale.

What is Data Scaffolding? https://nightingaledvs.com/what-is-data-scaffolding/ Fri, 12 Feb 2021 14:00:57 +0000 https://dvsnightingstg.wpenginepowered.com/?p=5620 Even if you’ve never heard of data scaffolding, it’s likely that you have (or will face!) a data problem of your own for which scaffolding..

The post What is Data Scaffolding? appeared first on Nightingale.

Even if you’ve never heard of data scaffolding, it’s likely that you have (or will face!) a data problem of your own for which scaffolding is the answer — or at least part of it. You may have even had a data problem in the past where scaffolding could have helped, but you just weren’t aware of it. Plenty of other folks have written about scaffolding (Phillip Lowe, Carl Allchin, Ken Flerlage), but I’d like to provide some of my own use cases for applying the technique in business settings.

What is data scaffolding and why would you need it?

Data scaffolding is a technique used to fill in pieces of data that are missing from your data source in order to help with analysis and visualization. Merriam-Webster’s online dictionary defines the word scaffold as:

1) a temporary or movable platform for workers (such as bricklayers, painters, or miners) to stand or sit on when working at a height above the floor or ground

2) a supporting framework

Data scaffolding is, quite literally, that platform or framework to facilitate a more thorough analysis of your data. You create and then insert records that you can infer must exist (albeit abstractly) or details that must be true, based on existing records.

Consider this typical scenario: you are asked to help discover trends over time to inform business decisions. However, new rows of data are only written to your data source in order to log changes, specific activities, or updates. While this is an efficient way to store data from a database administrator’s perspective, it presents a challenge for analysts like you. This is an opportunity to use data scaffolding.

What tools can you use for data scaffolding?

There are a number of tools that you can use for data scaffolding. Tableau Prep and Alteryx are two that I have used. Another option is using Excel in conjunction with some Tableau Desktop join calculations; however, this may not be practical for large data sets, for situations where the Excel file needs to be refreshed periodically, or for when one of your data sources is from Tableau Server. In the examples below, I use Alteryx for data transformation and Tableau Desktop for visualization and analysis.

What kinds of data could this technique be applied to?

While there are numerous areas of business where scaffolding can be applied, I’ll discuss three areas here:

1) Human Resources (HR) data, specifically employee headcount tracking. Every business wants to know how many people are employed, hired, or terminated on any given day and this is often available as a point-in-time view. But ideally, a business wants to see how this information changes over time. They may also want to see this trend for each department, or by other meaningful categories, to understand what is happening in different areas of the business.

2) Customer headcount data, similar to HR headcount. Customers join and exit an organization on a regular basis and that organization likely wants to know how many customers they have on a given day, as well as how many were gained or lost each day.

3) Sales and inventory data is a little different from people-related data. Products are sold and returned; stock runs down and is replenished. People data is likely to be reported at a daily level (meaning you want to know ‘how many people do we have?’ each day), whereas sales and inventory data could be reported at an hourly, daily, or weekly level, or something else entirely. In addition to reflecting different dimensions of time, data related to a product may belong to multiple hierarchies, something like (but not limited to):

A. Product Detail: product ID > category > broader category

B. Where Product is Sold: product ID > store X > location X

C. Product Pricing

People data may certainly belong to multiple hierarchies as well, but product data will be structured differently from people data. This may require that distinct scaffolding techniques be applied.

Use case example 1 — scaffolding for headcount

The sample data below is intentionally simple. In a real working environment, there would be more nuance to the data structure that you would investigate before attempting any data transformation. But in this example, you have eight customers who have signed up for and then cancelled a service at various points in time. The data source receives new rows of data only when there is an update for that customer:

Sample of eight customers with 20 transactions between them, sorted by customer.

You have been tasked with showing the headcount trend of customers since this service was first offered. The business also wants to know how many customers signed up or cancelled each day. Losing customers (or employees) is also known in business as attrition or churn.

If you connect to this data as is in Tableau, for example, you can put together something that looks like this:

Line graph of the number of records per day in our 20-row dataset.

In the line graph, you can see that there is a sized circle representing the number of records on the dates associated with each record, but you don’t have a circle for any of the days that fall in between the dates in your data. This is because there is no data in your data source that represent those specific days.

You could review these 20 records and work out that Andrew was an active customer from August 4–9, he left on August 10, then came back about two months later on October 22. Similarly, you can infer that Sarah signed up on September 20 and has remained an active customer since then. You could probably comb through this list, manually work it out, and report back to the business as requested. But that would take a while, and you are only looking at eight customers. In the real world you are likely to be looking at far more records representing far more customers, so this method may not be realistic.

It’s time to scaffold. First, insert the dates that are missing (for each customer) and show that customer’s status on each date. Note that this is daily-level data.

Let’s go step by step:

1) Find the earliest date in the data by sorting in ascending order by date. In this case, it’s August 4, 2020. In a real world application of scaffolding, you may need to do some internal research to ensure you have all the right data and are interpreting it correctly before you start.

Input Data tool followed by Sort tool

2) Determine all the days that exist between the earliest date in the data and today. In this example, today is February 1, 2021. This means the last date you scaffold will be yesterday, or in this case January 31, 2021. There are 181 days that exist between August 4, 2020 and January 31, 2021.

Text Input tool followed by Generate Rows tool

3) For each row in your original data (20 rows), you need to attach or append 181 days to each row. Pause and think about this. You started with 20 rows, and you’re attaching 181 rows onto each one. That means you will now have 3,620 rows (181 x 20).

Append Fields tool
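The article uses Alteryx, but the same generate-and-append logic can be sketched in Python with pandas. Below is a minimal sketch with a hypothetical three-event sample; the column names are illustrative, not taken from the original workflow:

```python
import pandas as pd

# Hypothetical change-log data: one row per customer event (the input to step 1)
events = pd.DataFrame({
    "customer": ["Andrew", "Andrew", "Sarah"],
    "status": ["Active", "Terminated", "Active"],
    "date": pd.to_datetime(["2020-08-04", "2020-08-10", "2020-09-20"]),
})

# Step 2: every day from the earliest event date through "yesterday"
# (pinned to 2021-01-31 here, matching the article's 181-day scaffold)
scaffold = pd.DataFrame({"scaffold_date": pd.date_range("2020-08-04", "2021-01-31")})

# Step 3: append (cross join) the full date scaffold onto every event row
crossed = events.merge(scaffold, how="cross")
```

With three event rows and 181 scaffold days, `crossed` contains 3 × 181 = 543 rows, mirroring the 20 × 181 = 3,620 figure in the article.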

4) Now you need to filter the rows you created in step three so that you are looking at only customer records that make sense for your analysis. You want to see each customer’s status each day since they first became a customer. For example, why would you have records for Sarah prior to when she joined on September 20, 2020? Or why would you need to have records for Sabrina prior to December 3, 2020? You don’t need those records and they don’t tell you anything since they did not exist as customers prior to those dates.

Create a formula to flag and filter out rows that don’t make sense. The formula is scaffold date >= date in the original data set. Your goal is to keep the records where the date you’re inserting is either the same as the date in the original data (for a given customer) or later. After you filter those records out, you are left with 1,766 records.

Formula tool followed by Filter tool

5) Next, you need to quantify the statuses of active and terminated customers (called Active and Terminated) by creating a new column called Status Number using the Formula tool. The formula in this example is pictured below. Ensure the data type of this new column is set to a numeric one.

Formula tool configuration

Once the Status Number column has been created, you can tell Alteryx to sum the Status Number for each date, which now has numeric values of 0, 1, or -1. This will give the net number of customers on each day.

Summary tool — for each date it sums Status Number, also known as our net customers per day.

Note that no line was assigned the value of 0 because it’s a small and tidy data set. In the real world, you would need to carefully consider your data structure when working out what this formula should be.
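Steps 4 and 5 boil down to a comparison filter plus a signed status column. Here is a minimal pandas sketch of the same idea, using made-up rows and hypothetical column names:

```python
import pandas as pd

# A miniature slice of the step-3 output: one event row crossed with four scaffold days
rows = pd.DataFrame({
    "customer": ["Andrew"] * 4,
    "status": ["Active"] * 4,
    "event_date": pd.to_datetime(["2020-08-04"] * 4),
    "scaffold_date": pd.to_datetime(
        ["2020-08-03", "2020-08-04", "2020-08-05", "2020-08-06"]),
})

# Step 4: keep only scaffold dates on or after the original record's date
kept = rows[rows["scaffold_date"] >= rows["event_date"]]

# Step 5: convert Active/Terminated to +1/-1, then sum per day for net headcount
kept = kept.assign(status_number=kept["status"].map({"Active": 1, "Terminated": -1}))
net_per_day = kept.groupby("scaffold_date")["status_number"].sum()
```

The August 3 row is dropped by the filter, and each remaining day carries a +1 that sums into the daily net figure.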

You could stop here at the summary tool if all you needed was the net headcount figure. But, the business asked to also see the number of customers gained and lost each day.

6) So far, for each customer, you have a row that contains every single date since they first became a customer along with their status on that date (either 1 or -1). So the daily status is populated on every row.

To determine the number who joined on a particular date, the customer record must reflect that date and count them as +1 on that date only (not all the dates in between). Similarly, for terminations, you want the customer record to reflect that date and be counted as -1 on that date only (again, not on all the dates in between). On the dates in between (for both + and -), you want to see nothing (0 or null).

You have a column for net (end of step 5), but now you need to create a new column for gains and a new column for losses. Putting them in their own columns allows you to aggregate them separately from each other and from net.

7) To meet this objective, you join the original data (20 rows going into the left of the Join tool) onto where you were at the end of step five (1,766 records), ensuring you join on all the correct fields. Once joined, the status of Active or Terminated will appear on the relevant date only in a new column, which is what you need. The number of records here will not increase because each of the 20 original rows will just be tacked on to the 1,766 records as new columns.

Note that it is critical to include transaction ID in the join criteria for cases where the same customer(s) arrives and leaves more than once. The transaction ID ensures the records join up correctly.

Right Outer Join: the J and R outputs of Join tool followed by their Union.

The Append Fields tool you used in step three behaves differently than the Join tool. Using Append ‘filled in’ the status on every date (which is what allowed you to calculate the net), whereas using Join here filled in the status only on the date when the activity happened (which is what will allow you to calculate gains or losses on a single date). Similar to what you did for net, you can use a formula to convert Active to a numeric value of 1 and Terminated to -1. But you do this on the new status columns that were joined, so that when the formula sees the null in the new columns, it will populate a 0 for that line (instead of 1 or -1 like it did for net).

Two new columns are created and renamed — Summary tool can calculate the net, gains, and losses per day.
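The join-versus-append distinction can also be sketched in pandas (toy data, hypothetical column names): a left join lands the status only on the row whose scaffold date matches the event date, and the nulls on every other day become zeros.

```python
import pandas as pd

# Scaffolded daily rows (a toy version of the step-5 output)
daily = pd.DataFrame({
    "customer": ["Andrew"] * 3,
    "transaction_id": [1] * 3,
    "scaffold_date": pd.to_datetime(["2020-08-04", "2020-08-05", "2020-08-06"]),
})

# Original change-log rows (one per event), including transaction_id in the keys
events = pd.DataFrame({
    "customer": ["Andrew"],
    "transaction_id": [1],
    "date": pd.to_datetime(["2020-08-04"]),
    "event_status": ["Active"],
})

# Step 7: left join so the status appears ONLY on the day the event happened
joined = daily.merge(
    events,
    left_on=["customer", "transaction_id", "scaffold_date"],
    right_on=["customer", "transaction_id", "date"],
    how="left",
)

# Nulls (days with no event) become 0; the event day becomes +1 or -1
joined["gained"] = (joined["event_status"] == "Active").astype(int)
joined["lost"] = (joined["event_status"] == "Terminated").astype(int) * -1
```

Summing `gained` and `lost` per date then gives the daily gains and losses, separate from the net column.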

8) You are left with 181 records that contain a date from August 4, 2020 through January 31, 2021 and three columns: net, gains, and losses. Recall in step two that you have 181 days in between your earliest date and today. So you have the data prepared sufficiently for time-trend analysis.

Here is what the entire Alteryx workflow looks like:

Example of an Alteryx workflow that scaffolds dates to calculate customer activity per day.

If you were to copy and paste this transformed data into Tableau, for example, you could create a graph like the one below. It is entirely different from your original, and more meaningful, as every single day is accounted for, showing the net number of customers for that day:

Line graph of net customers on each day from Aug 4, 2020 to Jan 31, 2021.

If you hover over any date in the graph above, you can see the net, gains, and losses for that specific day — here are three examples:

Tool tip detail examples from the new headcount data.

Use case example 2 — sales and inventory scaffolding

This example is quite detailed, so rather than a full walkthrough, I want to focus on where scaffolding comes into play so that you can apply it to a similar situation.

You have been asked to calculate a common key performance indicator (KPI) used in inventory analysis — referred to as weeks of stock (also known as weeks of supply or WOS) — for each of your products and store locations. You need to be able to show WOS at various levels in the business (category, store, location) over time.

Weeks of stock is calculated by dividing current inventory by average sales.

In this case, our numerator (inventory data) is stored separately from our denominator (sales data), and both are reported on a weekly basis.
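As a quick worked example of the formula (all numbers made up):

```python
# Weeks of stock = current inventory / average weekly sales over a trailing window
current_inventory = 600               # units on hand this week
weekly_sales = [140, 160, 150, 150]   # last four consecutive weeks

avg_weekly_sales = sum(weekly_sales) / len(weekly_sales)  # 150 units per week
weeks_of_stock = current_inventory / avg_weekly_sales     # 4 weeks of supply
```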

Each of these sources receives new rows of data only when there is a change. This means that:

  • some weeks the sales source gets new rows, but inventory doesn’t
  • some weeks the inventory source gets new rows, but sales doesn’t
  • some weeks there is no change to either sales or inventory, so neither source gets new rows

Pause a moment to think about the WOS equation. Notice that the denominator needs the average amount sold over the past X number of weeks. That’s important because those weeks need to be consecutive to calculate an accurate average, which means you can’t have gaps in the data for the weeks when nothing changed.
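This is why the scaffold matters for the denominator: a trailing average is only correct if every week is present. A small pandas sketch with made-up numbers, where a week of zero sales has already been scaffolded in:

```python
import pandas as pd

# Weekly sales with a silent week stored as zero -- exactly what scaffolding guarantees
units = pd.Series(
    [120, 0, 130, 110, 140],
    index=pd.date_range("2021-01-03", periods=5, freq="7D"),
)

# Trailing four-week average: only meaningful because every week is present
avg_4wk = units.rolling(window=4).mean()
```

If the zero week were simply missing, the rolling window would silently average four non-consecutive weeks and overstate sales, deflating WOS.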

Let’s get to work.

First, join the two sources. This is fairly straightforward in this case as they have similar naming conventions and structure. This will expose some gaps in the data. The gaps come from the three bullet points above. For bullet one, this inventory data can be completed by filling in zeros. For bullet two, this sales data can be completed by filling in zeros. But for bullet three, you need to do three things:

1) You must first insert the weeks that are not present in the data at all (this is similar to the first headcount example) for all product combinations. These are your dummy rows.

2) These dummy rows have no product, store, or location information because you have created these rows from scratch. These details will eventually need to be populated.

3) The values in the dummy rows for both sales and inventory should be filled in with zero, since you know that there was no change or activity for them (which is why they don’t exist in the original data).

The overall idea with scaffolding is to ‘complete’ your data set so that you can work with it accurately and fully.
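That “insert the missing weeks, fill with zeros” idea can be sketched with a pandas reindex (a hypothetical single-product example; the article itself does this in Alteryx):

```python
import pandas as pd

# Weekly sales for one product -- rows exist only for weeks that had activity
sales = pd.DataFrame(
    {"units": [10, 5]},
    index=pd.to_datetime(["2021-01-03", "2021-01-17"]),
)

# The scaffold: every week-ending date in the range, including the silent ones
all_weeks = pd.date_range("2021-01-03", "2021-01-24", freq="7D")

# Reindexing inserts the missing weeks as dummy rows, filling their values with zero
complete = sales.reindex(all_weeks, fill_value=0)
```

In a real multi-product version, the product/store/location details on those dummy rows would still need to be populated, as described in point two above.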

The image below shows an Alteryx workflow that was built to address this business challenge. The top section shows the two sources being joined and the bottom section is where the gaps are filled in via scaffolding.

Sales and inventory are joined (top left) and then scaffolding is used to fill in missing weeks and product details.

The Summary tool generates product detail, store, and location hierarchies (so these can be used to fill in gaps) by grouping relevant items from the joined sales and inventory data, and the Generate Rows tool is used once again to explode out the week-ending dates.

Now you have a complete data set (two independent data sources joined up and scaffolding used to fill in the missing data) that can be viewed in Tableau, for example, and you can perform accurate calculations to visualize WOS-over-time easily.

In the example below, parameters have been created so that the user has control over the number of weeks back they want to look (for the denominator) and they can view the WOS in terms of sales units or sales dollar value. Table calculations were used, which means the weekly data must remain in the view for the calculations to be accurate. However, it would be possible to create something similar using calculated fields instead, now that you have the data structure in order and in a single source.

An example of WOS being calculated using table calculations in Tableau.

Data that is refreshed incrementally may not contain all the data points required to support business analysis. Data scaffolding is a technique that allows you to build a framework that ‘completes’ your data set so that it can be analyzed and visualized more thoroughly. I hope the examples I have shared will help you solve similar data problems you might encounter.

Viz What You Love https://nightingaledvs.com/viz-what-you-love/ Fri, 04 Sep 2020 02:46:14 +0000 https://dvsnightingstg.wpenginepowered.com/?p=4399 After a long day of meetings, projections, and rigorous data analysis there’s nothing I look forward to more in the evening than sitting down and..

The post Viz What You Love appeared first on Nightingale.

After a long day of meetings, projections, and rigorous data analysis there’s nothing I look forward to more in the evening than sitting down and fiddling with a Business Intelligence tool.

Really. I’m not joking.

You might think I’m getting paid to say this (I’m not), but most evenings, while my girls are falling asleep next door, I’m sitting in my office in the same chair I worked in all day, building data viz in Tableau.

For fun.

Because I’m making art.

I’m not bold enough to call myself an artist, but I’ve always been into drawing. Whether I’m in class, in a meeting, or watching TV, you’ll find a pencil in my hand and images appearing on a piece of paper. While the tools have changed over time, from a Bristol pad and Dixon Ticonderoga to iPad Pro with Magic Pencil, the art and doodling has been constant.

Let’s get back to data viz.

I’ve been actively creating data visualization on Tableau Public for about three years now and have a portfolio of over 140 visualizations. A lot of my early work was driven by public data exercises that really helped me establish a connection to the #DataFam community as well as skill up.

When you’re really trying to get traction, these exercises are PERFECT. With the same tool and dataset you see hundreds of different vizzes every week. If someone else created something better/cooler/smarter than your work, the only thing keeping you from getting there is your own skillset and creativity.

But I started to find that community exercises didn’t always produce my best work. It’s not their fault, it’s entirely mine. If I felt uninterested in a topic or dataset I tried to make myself more interested. Bold color choices, weird chart types, lots and lots of unnecessary graphics. It was counterproductive.

I wanted to learn AND make art.

Thinking back to what originally got me excited about DataViz early in my career, I remembered a visualization by Rody Zakovich analyzing the legendary rock band Queen. Rody transformed what I thought a data viz could be.

Queen’s albums, weighted by words, tapering into a single point from Freddie Mercury’s spiked fist.

(Here’s the chart, sans Mercury)

What! You can do that?

I was taken with the cleverness and elegance of the design, but what really caught my attention was the passion. Rody had taken a topic he was personally fascinated by and elevated it with data. And it wasn’t really that hard. Yes, he used trigonometric functions to bend the lines at the bottom, but we’re really looking at four relatively simple charts here: BANS across the top, bar charts, a colored text table, and then some bent lines.

The combination, and the passion behind it, were what captured my imagination.

So, the question is, why wasn’t I working on projects that covered MY interests?

Cards on the table, I’ve been a geek since long before it was cool. I blasted through Jurassic Park (my first novel) in a brazen 24 hour binge as an 11-year-old. I read Star Wars expanded universe novels in middle school (in class, no less), and sketched pictures of Batman and the X-Men for friends.

While my tastes have progressed, I’m still a comic reading, sci-fi loving, pop-culture junkie. And we now live in an age where that’s become mainstream.

What could I learn, what could I REALLY do, if I chose to work on things that captured my attention?

Data viz is about translating difficult to grasp concepts into easy-to-understand visuals. While bars, BANS, lines, and maps may be the bread-and-butter of the discipline, there are other ways to express these ideas.

This led me to ideas like mapping the dream worlds of Inception. Christopher Nolan’s complex dream-heist masterpiece can be complicated to wrap your mind around (believe me, I had to figure out how to make an image of it), but it was an EXCELLENT opportunity to practice Cartesian mapping of lines.

The crew executing the heist enters a series of layered dreams with one member staying behind at each level. After hitting the bottom, they wake up successively through the dreams that they’ve passed down through, ultimately (we hope) returning to the waking world.

You may ask, IS this “DataViz”?

If your sole criteria is “does it show numerical data” then the answer might be no. But, like a map, it’s expressing the relationship between locations and taking a concept that’s difficult to describe by text and giving it a visual explanation.

At this point, you’re thinking “yeah, but I’m working in reality, not dreams.” Let’s talk about something more tangible, like the growth and contraction of a marketplace. In this particular case, I’d read Blake J. Harris’s Console Wars about the early 90s marketing battles of the fledgling Sega against Nintendo.

I wanted to find a unique chart type that could show the relative sales volumes of the different console manufacturers over time, as well as illustrate the boom and bust nature of the market as a whole.

There are plenty of “safe” ways to express this idea, but embracing the idea of the vivid visual bombast of video games, a novel chart type was the perfect fit. I’m not a “Math guy,” and it took me about nine attempts, but I was able to correctly navigate the dozen or so layered mathematical calculations needed to create the lush waves of the console gaming market in this Stream Graph.

You’re not going to find this in the Big Book of Dashboards and it’s not a best practice chart. Having said that, it catches the eye and serves its purpose. The entry points for different companies into the marketplace (and some exits) are easy to recognize, and you can see the evolution and growth up until the peak in 2008 when Nintendo’s explosive Wii game sales topped out.

You can still have fun with some best-practice charts too.

Inspired by an episode of the sitcom Community which featured a class “Nicolas Cage: Good or Bad,” I decided to explore the Rotten Tomatoes scores for the actor’s films. Since RT includes % approval ratings from both Audiences AND Critics, I chose to analyze the gaps in perception.

This Diverging Bar chart shows us where the two types of viewers differ and agree most significantly. Outside of a couple of creative design flares and color choices, this is a perfect best-practice viz that made for a fun evening creating, and several interesting discoveries within the data along the way.

Life is short, though, and sometimes you just feel compelled to swing for the fences with a viz that breaks all of the rules. It COULD work, it could flop, but by pouring yourself into it, putting yourself out there, you find out what works and what doesn’t.

That’s the story behind my Marty McFly timeline viz, which shows a VERY unorthodox timeline. Time travel is WEIRD. Though a character physically moves to a different period in time (forward or back) they’re always aging as they continue, and thus, their own personal timeline is always moving forward.

With each of these projects I learned something new about what did (or didn’t) work, and in many cases tried new techniques I wouldn’t have had the opportunity to try at work.

Had I not been as invested and curious about the subjects as I am, I know (I have the vizzes to prove it) I wouldn’t have created something as interesting.

Self-development, even through fun topics and unorthodox charts, is a journey. You’re still going to put in hours on evenings and weekends, and results aren’t immediate. If you doubt me, feel free to look back through my Tableau Portfolio; you’ll see plenty of clunkers. However, if you treat yourself to projects that you are passionate about, you’ll find yourself picking up neat tricks, greater speed, and a new zeal that will cross over into your day job in an amazing way.

Viz your passion. It’ll transform your work, boost your learning, and create a portfolio you’re proud of.


So You Want to Pick a Dataviz Platform https://nightingaledvs.com/so-you-want-to-pick-a-dataviz-platform/ Tue, 17 Sep 2019 22:27:24 +0000 https://dvsnightingstg.wpenginepowered.com/?p=4633 Ok Phil, your first ever column got 2,000 clicks in two days. Now. You need to keep that momentum going, boy! Failure to make a..

The post So You Want to Pick a Dataviz Platform appeared first on Nightingale.


Ok Phil, your first ever column got 2,000 clicks in two days. Now. You need to keep that momentum going, boy! Failure to make a good article here means you’ll lose all the goodwill you got on your first one. And it’s even worse for you since, ya know, your first article was literally about embracing failure. So … hey, no pressure.

Now. Let’s open with something solid. How about …

There’s more than one way to skin a cat.

WOW! That was teeeeeerrible. Offensive to animal rights people and a dumb metaphor in the first sentence. Amazing. You have a gift. Try again, and make sure to delete this before the editor sees it.

Folks pretty desperately want to figure out data visualization. Well, good news! The industry wants to help. Articles, classes, and videos spend untold minutes, nay, perhaps even hours telling you how to tackle the mountain of libraries and apps that exist now to make a chart.

Even so, they may fail to some degree. There’s a mountain of articles, sure, alongside a similar mountain of apps and techniques to use them. But does this pseudo-helpful mountain address one key question that newcomers ask me? Does the mountain peak look down from its lofty perch and hear:

“Wait, which one of these things do I learn?”

It’s a good question, but strong opinions and hot takes cloud the answers you get. You, consultant or CIO or data officer trying to make a good recommendation on what software to buy, need something that gets to the heart of the real question you have.

Not

“Which do I learn?”

But

“Which is right for me? Or my organization or team?”

If you’re curious how we get to that second question, read up on something called the Heilmeier Catechism. I use it a lot whenever I go through the design thinking process. In the industry, we call that second question a business question — the actual problem underlying the issue we think we have.

I don’t want to learn something that I don’t like or understand, you say. And I really don’t want to learn something that might cost more money or bring extra risk. Don’t give me some opinionated drivel or some boring statistics, you continue. I can get that anywhere. Show me what works best for me. Use simple language and make it something I can follow along with.

I’m gonna do this for you.

But not with a one-paragraph answer. That’s for chumps, and your needs and questions are way larger than that.

This is a series, and it’ll be live fire. I’ll show you that you can use the same data to make the same chart, three different ways, and I’ll document how.

When I say “three different ways,” I really mean it. I’ll talk about the different ways you can use Excel to get through the chart-making process. Then, I’ll demonstrate how to use code like a programmer might with Python (plus some other friends). Finally, I’ll show you how a popular data visualization application named Tableau can do it. Same data, same chart, three completely different ways to achieve it.

At the end, you get to compare how I made each chart. You can decide on your own which is right for you, but I’ll give you some pointers. I’ll even crack open draw.io and make a flowchart. You can copy & paste it into a PowerPoint deck for your next staff meeting where you’ll “forget” to cite me. Hey, buddy. It’s all for you.

Draw The Rest Of The Chart: Imposter Syndrome and Levelling Up in Data Visualization (medium.com)

After all this is done, you may have an epiphany. You may realize that you now know a bunch of ways to make a chart. You may also realize that you don’t want to lock yourself into just one path. If that happens, then I have good news: You’ve learned the real lesson here.

I primarily do consulting for the US federal government. Because of that, I can’t even guess how many different ways I’ve made charts over the years. Oh, this client likes Excel. That client has coders who love R but they didn’t obtain the necessary software approvals to get it for me, even though the contract says all work must be done on a client computer. This other client wants everything on a SharePoint site and fully interactive with their List. This other client paid for a Tableau Server, but they require that I use a version of Desktop that includes the number 8, but isn’t 2018.

That’s life. Deal with it, consultant.

Yes, if it wasn’t clear: All of that has happened (or is happening right now) to me, because that’s what your tax dollars look like when they’re hard at work. Pick “one” dataviz platform. Heh.

Whatever. I love my country, so I support it by making charts. Wow, that’s depressing.

Oh, says a critic. But that means a lot of people will have all these tools and will want to use them, and the people who don’t know any of this will have to rely on them. To which I say: yes. Your point?

We’re doing Microsoft Excel first, and guess what? It’s the longest one! It takes the most time! Just watch how I do it!

Part One- Excel

Part Two- Python (and friends)

Part Three- Tableau

Coming Soon- Wrap-up


Philip Hawkins hasn’t retaken the AWS exam yet. He has an outline for a sci-fi novel and a few rough chapters written. He strongly believes that Medium needs a dark theme. Philip recently learned how to make a heatmap in Bokeh despite there being no documentation for it aside from one example.
