All posts by media-man

How to be an intern at NPR Visuals (Apply now for Winter/Spring 2016!)

We’re currently looking for interns for spring 2016!

We want to see your best work.

Here’s how.

Cover letters

All candidates must submit a cover letter. Your cover letter should be a statement of purpose. We’re interested in what you’re passionate about and why you’re passionate about it. (Most cover letters tell us that you are hardworking, passionate and talented, etc. And that you love NPR. We don’t need you to tell us that.)

  • Tell us what you care about and work on.
  • Tell us why you are passionate about your work.
  • Tell us why this opportunity will help you reach your potential.
  • Tell us how you will contribute to our team.

Other expectations

  • Photo internship candidates must have a portfolio.
  • Programming/design candidates with projects on GitHub or a personal site are strongly preferred.

Selection process

After you submit a resume and cover letter, our selection committee will read through all the applications. We’ll narrow the list to approximately 8-10 candidates, eliminating applications that are missing a cover letter or resume and candidates who clearly aren’t a good fit for the team.

If you’re one of those candidates, two or three folks from the Visuals team will conduct a 30-minute Skype interview with you. Before your interview, you’ll get an email with an outline of the questions you’ll be asked, and you’ll have the opportunity to ask any questions of your own beforehand. The questions may vary a bit from interview to interview based on your professional experience, but we will be as consistent as possible.

Then we’ll call references and conduct some follow-up via email, possibly asking one or two more substantial, interview-style questions. Email communication is crucial in our workplace, and this gives us an opportunity to see how you communicate in writing. We expect answers to be prompt, succinct, and clear.

We’ll follow up with all of our finalists with some constructive criticism about their application and interview.

Who we are

We’re a small group of photographers, videographers, photo editors, developers and designers in the NPR newsroom who make visual journalism. (Yeah, NPR is a radio thing, and yeah, it’s weird sometimes.) Check out our latest stuff!

Why we’re doing this

Everyone on the Visuals team wants to open our field to the best people out there, but the process doesn’t always work that way. So we’re trying to make the job application process more accessible.

Applicants with strong cover letters and good interview skills naturally tend to do well in this process. Often, those skills are a result of coaching and support — something that not all students are privileged to have. To help candidates without those resources, we’re being more transparent about our process and expectations.

We’re certain that we’re missing out on candidates with great talent and potential who don’t have that kind of support in their lives. We think knowing our cover letter expectations and interview questions ahead of time will help level the playing field, keep our personal bias out of the interview process, and allow better comparisons between candidates.

Apply!

Photo editing

Our photo editing intern will work with our digital news team to edit photos for npr.org. It’ll be awesome. There will also be opportunities to research and pitch original work.

Please…

  • Love to write, edit and research
  • Be awesome at making pictures

Are you awesome? Apply now!

Design and code

This intern will work as a designer and/or developer on graphics and projects for npr.org. It’ll be awesome.

Please…

  • Our work is for the web, so be a web maker!
  • We’d especially love to hear from folks who love illustration, news graphics and information design.

Are you awesome? Apply now!

What will I be paid? What are the dates?

The deadline for applications is November 1, 2015.

Check out our careers site for much more info.

Thx!

Why so many people clicked play on this story’s audio from a congressional hearing

image

When people visit NPR.org stories that include audio, few typically click “play” – only about 13 percent. 

But this piece by NPR’s Eyder Peralta? It got hundreds of thousands of views and about 62 percent of them resulted in a “play.”  

Why? Because the audio was thoughtfully cut and packaged for a digital audience. Take a look for yourself:

image

Let’s break this down:

The headline: It’s clear and easy to understand. And the content delivers exactly what the headline promises – a story told through sound snippets.

Some text, but not too much: Eyder begins the piece with a few paragraphs for context without burying the audio, which, again, is the experience people are promised.

The packaging: Eyder doesn’t overthink it. He lists each clip with a brief description so you know what you are getting before you click play. That makes it easy to scan from clip to clip (and our analytics show that’s exactly what people are doing).

This is not the first NPR story to take this form. It’s often used by the Two-Way blog for congressional hearings and complicated issues, such as this one.

The chart at the top of this post is from our internal analytics dashboard. It shows the surge of people who visited the Planned Parenthood story, which became the most popular item on NPR.org.

                                                                           –Eric

How to make scenes that breathe and move and WORK

Thanks to an invitation from Storybench.org to write about NPR stories that “breathe life into a neighborhood scene,” I’ve been thinking about what distinguishes audio scenes that are, well … meh … from those that really sing.

I came up with six examples of scenes you can learn something from (though there are many, many more). Check it out HERE.

Here are some “CliffsNotes” of ways to create immersive scenes:

1. Awesome writing (like at the beginning of this piece by Robert Siegel from Cuba).

image

2. Stereo sound recorded and mixed by a pro (fast forward to about 9:45 in Part 1 here)

3. A clearly plotted pathway through the scene (I love how KCUR’s Frank Morris walked through the 7-mile-long line of tornado destruction in Joplin, Mo.)

4. A surprise that challenges listeners’ expectations (In this story from Turkey, Ari Shapiro highlights unexpected things about the place and people)

5. Movement! (Steve Inskeep takes you along a Louisiana street)

6. Audio of people interacting (Kelly McEvers brings LA’s Skid Row to life by letting us hear the regulars talk)

                                                                – Alison

Photo credit: NPR/John Poole

Public radio people told a story exclusively on Snapchat … and lived to tell the tale!

A couple of weeks ago, Alison MacAdam and I spent a day showing a new employee how a story comes together at NPR, from start to finish. It’s an extensive process that not many people — even those who work at NPR — get to see in its entirety, so we wanted to find a way to share the experience with our audience (who love getting glimpses of faces-for-radio).

Enter Snapchat.

The app felt like a natural fit for the story we wanted to tell because it has a peek-behind-the-scenes feel. (You can watch the final product above.)

I wrote about how we did it in the NPR Social Media Desk Tumblr, but let me add something here about using Snapchat for storytelling. 

Our experiment worked for a couple of reasons: 

1. It presents everything in chronological order. Classic story structure.

2. It’s easy to add more context with captions and drawings. 

3. It’s not that intrusive to the people you’re photographing or recording (videos have a maximum length of 10 seconds). 

4. It’s okay if it doesn’t look totally polished – that’s part of the charm.

8,800 people ended up watching the story in the 24 hours it was available. Dozens of them sent positive feedback right back to us via Snapchat. It was a reminder that you shouldn’t be afraid to make a story explicitly for a social platform. If it only lives in one place and doesn’t directly connect back to your site, that’s okay! 

                                                                                   —Serri

PS — Before you start, consider which platforms make the most sense for what you’re trying to accomplish. I don’t think this would have worked as well on Twitter or Facebook, for example, because you don’t naturally check back in with the same story throughout the day. 

When should you reveal the big magical number?

When you have a story that’s centered around a huge and surprising number, when do you reveal it?

This piece by NPR’s Nell Greenfieldboyce illustrates how the answer could be different for audio and digital.  It’s about researchers who discovered how many trees there are on the planet (the answer is 3 trillion).

In the radio version, Nell took listeners on the researchers’ quest to find the answer and didn’t drop the big figure until the story called for it – at the 1:30 mark of a three-and-a-half-minute piece.

CROWTHER: We all gathered in a room. It was a very exciting time. We’d been working towards it for two years.

GREENFIELDBOYCE: And…

CROWTHER: The total number of trees is close to about 3.04 trillion.

GREENFIELDBOYCE: Three trillion - that’s, like, eight times more than the previous estimate. If you were to plant a tree every second…

CROWTHER: It would take you somewhere in the order of 96,000 years to plant that 3 trillion trees. So it’s a huge astronomical number that I don’t think I could comprehend before this study.

If Nell had revealed the number at the beginning, it would have spoiled the story.

But Internet readers of a text piece don’t want to wait. They want the answer immediately. And Nell’s write-up delivered it 27 words into the story.

image

                                                       –Eric

Building a neighborhood scene

On Friday, August 28, two stories on Morning Edition achieved the same thing: They painted effective scenes of single, emblematic streets. 

The first street is in LA - in this diminutive piece by NPR’s Nathan Rott about Californians limiting their water use. With a small amount of ambient sound, audio of people talking about their lawns, and a few directional details (“on the corner,” “a couple houses down,” “across the street”), Nate began his piece with a 360-degree view of the street.

It could have been… fine… to hear just one resident, but with three Nate sketched a more comprehensive visual image of the street. He also served his story better since he was elucidating statewide statistics, not just individual experiences. 

image

The second example is longer and more immersive. The entire frame of Steve Inskeep’s post-Katrina feature from Arabi, Louisiana, is a street scene: Schnell Drive, which was inundated after the hurricane (see photos above). Listen to the ways Steve (with producer Rachel Ward) sketched a human streetscape – not with predictable, static sounds (lawn mowers, cars passing) – but by capturing interactions with residents - knocking on doors, introducing themselves, entering homes, engaging people spontaneously on the street.

There are a million ways to build a scene with sound. These are just two - unintentionally related! – ways to do it. The lesson here: If you want to bring a street or neighborhood to life, don’t describe its parts in isolation. Demonstrate how they are connected.

                                                                – Alison   

Credit: Photos of Schnell Drive and its residents by Edmund D. Fountain (Check out some of his other photos from the Gulf states here.)

How doodling can improve your audio story

Here’s a handy trick from NPR’s Don Gonyea, who has endured more campaign airplanes, Iowa State Fairs, and overstuffed spin rooms than almost anyone. Don is nearly always on a tight deadline, and it turns out he sketches pictures like the doodle below - to help him tell vivid stories quickly.

image

Don explained the doodle above – of an event at the Iowa State Fair - as a way of remembering the layout. It’s an act of reinforcement: The protesters stood where you see “Boo!” and “Hiss!” The supporters? Look for “Yay!” and “Go Go!” The stage was quite small (top left corner). There were hay bales (those rectangles by the stage). And so on… 

He even uses graph paper – in part because it helps keep things to scale.

Why not just take a photo?

Don says, “You can draw from any perspective.” For example, the aerial view. No photo will get you that, unless you catch a ride on Trump’s helicopter.

And why not rattle off these visual details into your microphone?

It would get lost on the tape. Don doesn’t have the luxury of rolling back through all of his audio to find those moments. He’s sprinting too fast for this afternoon’s All Things Considered or tomorrow’s Morning Edition.

Ultimately, Don uses his doodles to recall those one or two telling visual details he can write into his story. And those details make the difference between mediocre exposition and a story that takes the listener somewhere.

                                                                          – Alison

When you can’t get a story out of your head, write an explainer

image

You’ve probably seen the photos: shockingly orange water cascading through Colorado’s Animas River, the contamination the result of an accident by the Environmental Protection Agency at a nearby mine. 

KUNC reporter Stephanie Paige Ogburn has been covering the story on air, but I was particularly struck by one of her web posts about it (which she published before most national media started paying attention). It’s a great case study in the efficient explainer. 

The headline sets the right tone from the start: Why Was The Environmental Protection Agency Messing With A Mine Above Silverton? In the post, she deftly describes the history of mining in Colorado, how and why the EPA was involved in this particular site and the basic mechanics of how mining creates the bad orange water:

That water, when it runs through the rocks in a mine, hits a mineral called pyrite, or iron sulfide. It reacts with air and pyrite to form sulfuric acid and dissolved iron. That acid then continues through the mine, dissolving other heavy metals, like copper and lead. Eventually, you end up with water that’s got high levels of a lot of undesirable materials in it.

Ogburn says she wrote the post from home (bed, actually), the night after the accident.

“My husband was like, what are you doing, but I couldn’t stop thinking about that story and kept digging around when I got home,” she says. 

Her post was just what I wanted in the early days of the story. It works because it feels clearly distilled; it provides a remarkable amount of context on a complicated topic, without getting caught up in too many of the details. 

Photo credit: EPA

                                                                             —Serri

How to tell a powerful story… in real time

August 14 is Melissa Block’s last day as host of All Things Considered. After 12 years “in the chair,” she leaves behind thousands of memorable moments. But none are more powerful than her stories from Sichuan Province, China, in the days after the 2008 earthquake hit. 

In honor of Melissa, I thought I’d share the audio excerpt above – because of the presence she brings to this story. In the midst of the tragedy and chaos you can hear above, Melissa just talked into the microphone, describing everything she saw. 

Here’s what’s happening: Melissa is watching a couple desperately search the rubble for their toddler son and parents, with the help of a huge, rumbling excavator. She is witnessing one family’s tragedy, nobody speaks English, and it’s not the time or place to stand around doing traditional interviews with an interpreter. So Melissa narrated the whole thing in real time. It’s raw. Heartbreaking. 

If you want to hear a master class in how to tell a disaster story – and to be reminded that journalists are human, and that’s OK! – there is no better lesson than this story.

image

And here’s one more reason Melissa is a class act. She took every chance she got to thank the people who helped her tell this story – interpreter Philip He (above) and producer extraordinaire, Andrea Hsu.

                                                                               – Alison

Photos by Andrea Hsu/NPR

It’s time for you to discover your mission.

I didn’t always have a mission. For the first seven years of my career, I worked in the software industry. The work was interesting, and I had a craft, for sure, but not a mission. All that changed when I quit my job to become a journalist.

Our mission at NPR Visuals is to make people care. Everything we do flows from that mission: the things we make, our design process, and how we measure success. It’s awesome.

We’re here to create empathy. To introduce you to somebody you’ve never met, and think for a few minutes about life in their shoes. We’re here to open your eyes and make you give a shit.

Are you ready for a change?

The Knight-Mozilla fellowship is an awesome opportunity — for you and for us. It’s a chance for you to change your life, to try out working in a newsroom. You’ll learn a ton, and we’ll learn from you.

We’re open to folks from all walks of life, but if you’re a filmmaker, graphic designer, or involved in the digital humanities, we’d especially love to hear from you. No sweat if you can’t code or haven’t reported a story before — we’ll teach you.

(As for the specific work you’ll be doing… it’s hard to say! That’s one of the joys of working in a newsroom. We work on short schedules, and news deadlines. But I can say that you’ll work with us to report and tell important, impactful, visual stories, online.)

Want to join our mission? APPLY NOW!

Listen to the work of your colleagues

From Sara Sarasohn, longtime NPR editor and producer, now editorial leader of NPR One:

“When I stopped being a producer, I did a back-of-the-envelope calculation and figured that I had mixed more than a thousand reporter pieces. That is what taught me to be an editor. I actively engaged with such a volume of the work of my colleagues - good and bad - that I developed a lot of ideas about what worked and what didn’t. 

When I became an editor, I realized that I was mostly just engaging with my own work. I had lost the valuable teaching I got from engaging in - not just listening casually to - others’ work. So I developed a practice of sitting down once a day with my hands folded in my lap and just listening, very carefully, to a piece I had nothing to do with. It gave me new insight into techniques and pitfalls I would never have if I just did my work as an editor. It only took five minutes a day. You don’t have to do that exactly, but you should develop some mechanism for learning from the work of others.”

Illustration by Alison MacAdam

Tell a small story in order to tell a large one*

Compliments to Rachel Martin, Jordana Hochman and Connor Donevan at Weekend Edition for this great story last Sunday. They produced an 11-minute piece about two sisters who were stranded in the New Orleans Superdome with thousands of other people after Hurricane Katrina.

Weekend Edition could have tried to tell the story of the Superdome by talking to LOTS of people. Some might consider that a more “accurate” reflection of what happened in 2005. 

But by focusing tightly on just two sisters, Rachel and team immersed listeners in Talitha and Regina Halley’s memories. And that “small” story told the BIG story of the Superdome more effectively, I think. There is an emotional truth to it – the horror, the trauma, the transformation of lives – that is hard to depict without this kind of deep dive.

Here’s Regina Halley, now 33 years old:

I still don’t feel like I’m OK. Like, for us, tomorrow never came. We were supposed to go back to our house. My sister was able to push through. I don’t even know how she was able to cope with it as a child. Or maybe her coping was to move forward and not let it stop her. But in me and my mom’s case, it’s totally different. Still to this day, if it’s raining, my mom, she still packs a suitcase.

                                                                           – Alison

*Lots of people have conveyed this “tell a small story to tell a big one” tip, but I’ll add a shout-out to NPR’s Ari Shapiro, who said it to me. 

One secret to good visual storytelling

The NPR Visuals team has gotten raves for this story, produced by David Eads and Claire O’Neill in 2014. “Demolished,” about Chicago’s public housing projects, won the top award given out by the Society of News Designers.

So why is “Demolished” great? Here is just one (of many) reasons, using three (of many) images from the story:

image

“Demolished” begins by drawing your attention to one girl and one photo. Above, you see 10-year-old Tiffany Sanders. This 1993 photo became an iconic image, used in rap videos, posters, news reports, and more.

At this point, you may expect the story to zoom in further on Tiffany. But click “Next,” and there’s a twist, the first moment of surprise:

image

When the buildings suddenly turn bright pink, the story’s focus shifts. You were looking at Tiffany. Now, you’re forced to look at her environment, the backdrop to her life.

Then, click the “Next” arrow, and there’s one more twist:

image

This image tells you that Tiffany’s home was demolished soon after the photo was taken. And it completes the thematic movement from the human… to her environment… to policy.  

Claire O’Neill explains what works here as “one thought per gesture”: each time you click “Next,” you’re presented with just one idea, not many. In this case:

SLIDE 1: Meet Tiffany.

SLIDE 2: This is her home.

SLIDE 3: Her home is gone.

Using this simple approach (and by “simple,” I don’t mean it was easy to make!), “Demolished” achieved what visual stories aspire to: a high “completion rate.” That means, despite the depressing topic and the complicated urban policies it explored, “Demolished” held onto its viewers until the end.

                                                                – Alison

Credit: The photos used in “Demolished” were taken by Patricia Evans.

Sense of place: Learning from “insiders” with outside perspective

image

Why would anyone want to trade the comforts of British Columbia for a partially destroyed, periodically war-torn, 7-mile-long enclave squeezed between Israel and the sea?

NPR’s Emily Harris (with editing by Larry Kaplow) recently told the story of a family that moved from Vancouver, Canada, back to their native Gaza. 

Emily often reports from Gaza – so she has lots of chances to describe the place through the eyes of its residents. And it’s common, when we seek to evoke a place “authentically,” to defer to residents with the deepest local experience. We prize longevity (so-and-so has lived in Gaza her whole life). 

But this piece stood out because the Al-Aloul family brings outside perspective. And that perspective made their descriptions of Gaza more striking. I got a more surprising picture of Gaza through their eyes. 

Like this, from the Al-Alouls’ 20-year-old daughter, Nour:

My parents, they give you all the freedom here. Like, I go out, I do whatever I want because you walk in the streets, you know that no one will do something bad for you.

I love the idea that Gaza feels safer than Canada. 

The lesson here could apply to reporting anywhere. Mining deep local knowledge is always important, but the eyes of a local with outside perspective - a comparative view - are just as important.

                                                                    – Alison

Photo credit: Emily Harris/NPR

No studio on the road? No problem

image

Every radio producer and reporter knows you have to get creative when tracking while traveling. Our engineer, Kevin Wait, says you should try to recreate the studio environment as much as possible — find the quietest area and use something soft above or around you to minimize the echo.

All Things Considered producer Monika Evstatieva (above) went for the old coat-over-the-head trick while working with Ari Shapiro on a recent reporting trip in Eastern Europe. And NPR correspondent Jeff Brady constructed the mobile studio below with couch cushions while he was in Charleston, S.C., covering the church shooting. There was a loud wedding in the same hotel, so he retreated to the bathroom when it came time to track. “It was a pretty good studio!” he says.

image

                                                                                    —Serri

Image credits: Ari Shapiro (above) and Jeff Brady (below)

How to apply for an internship at NPR Visuals

We want to see your best work.

Here’s how.

(In case you missed it, applications are currently open for our fall internships.)

Cover letters

All candidates must submit a cover letter. Your cover letter should be a statement of purpose. We’re interested in what you’re passionate about and why you’re passionate about it. (Most cover letters tell us that you are hardworking, passionate and talented, etc. And that you love NPR. We don’t need you to tell us that.)

  • Tell us what you care about and work on.
  • Tell us why you are passionate about your work.
  • Tell us why this opportunity will help you reach your potential.
  • Tell us how you will contribute to our team.

Other expectations

  • Photo internship candidates must have a portfolio.
  • Programming/design candidates with projects on GitHub or a personal site are strongly preferred.

Selection process

After you submit a resume and cover letter, our selection committee will read through all the applications. We’ll narrow the list to approximately 8-10 candidates, eliminating applications that are missing a cover letter or resume and candidates who clearly aren’t a good fit for the team.

If you’re one of those candidates, two or three folks from the Visuals team will conduct a 30-minute Skype interview with you. Before your interview, you’ll get an email with an outline of the questions you’ll be asked, and you’ll have the opportunity to ask any questions of your own beforehand. The questions may vary a bit from interview to interview based on your professional experience, but we will be as consistent as possible.

Then we’ll call references and conduct some follow-up via email, possibly asking one or two more substantial, interview-style questions. Email communication is crucial in our workplace, and this gives us an opportunity to see how you communicate in writing. We expect answers to be prompt, succinct, and clear.

We’ll follow up with all of our finalists with some constructive criticism about their application and interview.

Why we’re doing this

Everyone on the Visuals team wants to open our field to the best people out there, but the process doesn’t always work that way. So we’re trying to make the job application process more accessible.

Applicants with strong cover letters and good interview skills naturally tend to do well in this process. Often, those skills are a result of coaching and support — something that not all students are privileged to have. To help candidates without those resources, we’re being more transparent about our process and expectations.

We’re certain that we’re missing out on candidates with great talent and potential who don’t have that kind of support in their lives. We think knowing our cover letter expectations and interview questions ahead of time will help level the playing field, keep our personal bias out of the interview process, and allow better comparisons between candidates.

Apply for this fall!

If you’re looking for a gig, please apply. If you know somebody who may be, please pass this along.

What’s new in our first release version of the dailygraphics rig?

Our dailygraphics rig has been around for more than a year, and in that time we’ve used it to make hundreds of responsive rectangles of good internet, but we’ve never made it easy for others to use. The rig is heavily customized for our needs and includes our organization-specific styles and templates. Despite this, a handful of hardy news organizations have made efforts to adopt it. To make that easier, today we are releasing our first fixed “version” of the rig: 0.1.0.

This isn’t a traditional release. The rapid pace of development and the pace of our news cycle make it impossible for us to manage normal open source releases. Instead, we will tag selected commits with version numbers, and maintain a detailed CHANGELOG of everything that happens between those commits. This way, users who want to use the rig and stay up to date with it will have a clear path to do so.

As part of this release we’ve folded in a number of changes that make dailygraphics better than ever.

Block histogram

The block histogram is a format we’ve used several times to display discrete, “binned” data. It works especially well for states or countries. Aly has turned it into a new graphic template so we can spin one up quickly. Run fab add_block_histogram to make one now!

Negative numbers and smart label positioning

The bar_chart, column_chart, grouped_bar_chart, stacked_bar_chart and stacked_column_chart graphic templates have all been updated to gracefully support negative numbers.

These five templates are also now much smarter about positioning labels so they always fit within the confines of the chart, and about hiding them if there is no way to make them fit in the available space.

(Curious how we did this? Here is the relevant code for bar charts. And here it is for column charts.)

Custom Jinja filters

Lastly, we’ve added support for defining custom Jinja filter functions in graphic_config.py. This allows for, among other things, much more complex formatting of numbers in Jinja templates. For example, to print comma-formatted numbers you can add this filter function:

import locale  # required for locale.format(); assumes a locale has been set elsewhere (e.g. in the rig's config)

def comma_format(value):
    # format a number with thousands separators, e.g. 1234567 -> "1,234,567"
    return locale.format('%d', float(value), grouping=True)

JINJA_FILTER_FUNCTIONS = [
    comma_format
]

And then use it in your template, like this:

{{ row.value|comma_format }}
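
Filters aren’t limited to comma formatting, either. As a purely hypothetical example (this filter doesn’t ship with the rig, and row.share is a made-up column name), a second function could turn a 0-1 ratio into a display-ready percentage and be registered alongside comma_format:

def percent_format(value, decimals=1):
    # turn a 0-1 ratio into a display string like "42.3%"
    return '{0:.{prec}f}%'.format(float(value) * 100, prec=decimals)

JINJA_FILTER_FUNCTIONS = [
    comma_format,
    percent_format
]

And in the template: {{ row.share|percent_format }}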

Documentation for this feature has been added to the README.

Please see the CHANGELOG for a more complete list of changes we’ve made. We hope this new release process allows more news organizations to experience the joy of using a code-driven process for making daily charts and graphics.

Work with us this fall!

Hey!

Are you a student?

Do you design? Develop? Love the web?

…or…

Do you make pictures? Want to learn to be a great photo editor?

If so, we’d very much like to hear from you. You’ll spend the fall working on the visuals team here at NPR’s headquarters in Washington, DC. We’re a small group of photographers, videographers, photo editors, developers and designers in the NPR newsroom who work on visual stuff for npr.org. Our work varies widely; check it out here.

Before you apply, please read our guide about what we expect in an internship application.

Photo editing

Our photo editing intern will work with our digital news team to edit photos for npr.org. It’ll be awesome. There will also be opportunities to research and pitch original work.

Please…

  • Love to write, edit and research
  • Be awesome at making pictures

Are you awesome? Apply now!

Design and code

This intern will work as a designer and/or developer on graphics and projects for npr.org. It’ll be awesome.

Please…

  • Our work is for the web, so be a web maker!
  • We’d especially love to hear from folks who love illustration, news graphics and information design.

Are you awesome? Apply now!

What will I be paid? What are the dates?

The deadline for applications is July 31, 2015.

Check out our careers site for much more info.

Thx!

PMDMC Did Little to Clarify the Future of Pledge Drives

There were two sessions on the future of public radio pledge drives at last week's Public Media Development and Marketing Conference in Washington, DC.  The conference was organized by Greater Public, the industry's trade group for fundraising and marketing professionals.

Here's a summary of the main points from those two sessions.

1. It is getting harder to raise money during pledge drives.

2. Greater Public presented a formula for lowering pledge drive goals to counter the impact of sustaining (monthly) givers and $1,000+ donors on drive results.  The example shown at the conference suggested goals should be lowered by as much as 25%.  The exact percentage will vary by station. The more successful a station is with Sustaining Givers, the lower the goal will be.

3.  Greater Public's fundraising benchmarks show that up to 90% of stations still have room to increase annual listener income through pledge drives.

Unfortunately, those three points taken together lead to just one conclusion -- many stations will need to do more on-air fundraising with lower goals in a tougher fundraising environment in order to meet their listener income potential. That's a recipe for more pledge drive days and, perhaps, more pledge drives per year.

A separate, but related, thread in these sessions was the new wave of shortening or eliminating pledge drives. Station representatives from Phoenix and upstate New York presented their current approaches to reducing on-air drives.

As noted in a previous post, we always learn something new and valuable when stations embrace more programming and less on-air fundraising. What hasn't changed in nearly two decades of drive shortening efforts is this -- the less on-air fundraising a station does, the less room it has to increase its on-air goals.

We know from past experience that the less on-air fundraising approach doesn't rule out growing annual listener income. Most of that growth, however, has to come outside of pledge drives.  That conflicts with Greater Public's assertion that most stations still have growth potential from on-air drives, even in a tougher fundraising environment.

In the end, the conference sessions did affirm the difficulties stations face and that could help foster productive dialogue between fundraisers and their station managers.  But moving forward to the Fall fundraising season, PMDMC didn't deliver any new industry-wide intelligence on how to address the pledge drive challenges ahead.  It feels like a missed opportunity.  

Rethinking Public Radio Station Brands

This week, public radio fundraising and marketing professionals are meeting in Washington, DC, at the Public Media Development and Marketing Conference.

Public Radio branding is one of the big topics as NPR News stations try to figure out how to remain relevant as listeners gain more direct access to their favorite NPR content.

A few years ago, the mantra was that Local is the Future for stations. That hasn't worked out so far and probably won't since NPR News listeners consider themselves citizens of the world. They are the epitome of "think globally, act locally."  The range of potential content available in a station's market is simply too narrow to win enough listening to remain sustainable. 

Thanks to a resurgence of podcasting, many stations are asking if their future is in podcasts.  Perhaps, but not solely.  Stations will need NPR as part of their broadcast and digital brand in order to remain viable, let alone grow, in the future.

They can do that as public media brand aggregators.

Think of it this way. NPR is Apple. Stations are Target, offering the leading brand (Apple) but also other top digital and electronics brands. Consumers can get most of Apple's products from either brand.  They can shop online.  They can go into an old-fashioned brick-and-mortar. 

Some consumers shop for Apple only through Apple.  Some shop only through brand aggregators such as Target.  Some do both.  The same behaviors will unfold in public radio.  The good news is that there's plenty of room in listeners' minds and hearts to embrace both.

Stations have always been brand aggregators by carrying programs such as Marketplace from APM, This American Life, and the Moth.  In the past, it almost worked against station interests to highlight those brands. 

Maybe that's changing. Consumers in the digital space are now learning that not everything good in public radio is from NPR.  They're learning there is more than one quality brand.

Being a quality brand aggregator can be a brand too!  Stations have an opportunity to become a primary source of the best brands in public radio -- over the radio and in the digital space. Stations have the opportunity to be the place where listeners find their favorite brands and discover new ones.

One of those new brands should be the station's original programming, which does not necessarily have to be local, and it doesn't have to be just news.  It has to be enriching and engaging.  It has to be comparable in quality to the best existing brands in public radio. It can be on the radio, digital, or both. Sense of Place is important, but it is not necessary 100% of the time.

Great original content, easily found and consumed along side the best national content in public radio, will create a station brand that still highlights NPR but is much more than NPR.

I think this is a highly viable approach for stations. It works for Apple, perhaps the strongest consumer brand out there. It works for Target. It could work for NPR, stations, and other producers and distributors in public radio.

We’re hiring a designer!

The NPR Visuals team

Love to design and code?

Want to use your skills to make the world a better place?

We’re a crew of visual journalists (developers, designers, photojournalists…lots of things) in the newsroom at NPR headquarters in sunny Washington, DC. We make charts and maps, we make and edit pictures and video, we help reporters with data, and we create all sorts of web-native visual stories.

(And yeah, sometimes it’s kind of weird to be a visuals team at a radio organization. But there’s this special thing about audio. It’s intimate, it’s personal. Well, visual storytelling is really similar. Its power is innate. Humans invented writing — visual and audio storytelling are built in, deep in our primordial lizard brains. So, anyway, yeah, we fit right in.)

Pictures and graphics are little empathy machines. And that’s our mission. To create empathy. To make people care.

It’s important work, and great fun.

And we’d love it if you’d join us.

We believe strongly that…

You must have…

  • Strong design skills, and experience implementing your designs on the web
  • A steely and unshakable sense of ethics
  • Attention to detail and love for making things
  • A genuine and friendly disposition

Bonus points for…

  • Serious front-end coding skills
  • Experience running user-centered design processes

Allow me to persuade you

The newsroom is a crucible. We work on tight schedules with hard deadlines. That may sound stressful, but check this out: With every project we learn from our mistakes and refine our methods. It’s a fast-moving, volatile environment that drives you to be better at what you do, every day. It’s awesome. Job perks include…

  • Live music at the Tiny Desk
  • All the tote bags you can eat
  • A sense of purpose

Know somebody who’d love this job?

Maybe it’s you?

Email bboyer@npr.org! Thanks!

When Engagement Really Worked


This article first appeared June 22, 2015, in Nieman Lab.

Nowadays, we often seek to measure media engagement by social media activity, web metrics or attention minutes.

But there was a time in the not-so-distant past – before the Internet and social media disrupted traditional media – when genuine engagement really worked.  A period when news organizations actually involved people in their communities so successfully it triggered impact.

With last week’s celebration of the tremendous journalism contributions of Ed Fouhy, the award-winning broadcast executive and founder of the Pew Center for Civic Journalism, it seemed like a good time to revisit what we already learned – but may have forgotten.

During the heyday of civic journalism, which spanned a decade starting in the early ’90s, the Pew Center funded 120 newsroom projects and rewarded scores more with the James K. Batten Awards. More than 600 CJ initiatives were counted and studied by U-Wisconsin researchers, who found a pattern of outcomes. Some 78 percent of the projects studied offered solutions, and more than half included solutions offered by citizens themselves.

I was on the frontlines of this activity. Fouhy hired me in 1994 to be his Pew Center deputy. A couple years later, I took his place at the helm.

I find it striking how many of these efforts foreshadowed what we now call interactive and participatory journalism.

Civic journalism began as a way to get political candidates to address the public’s agenda in running for office. News organizations soon adapted its techniques, starting with polls and town hall meetings, to difficult problems in their communities. Later on-ramps involved civic mapping, email, voice mail, cutting-edge video technologies, and eventually, of course, the Internet.

Key hallmarks of these civic journalism initiatives included:

  • Building specific ways to involve readers and viewers.
  • Deliberately positioning ordinary people as capable of some action.
  • Inviting the community to identify solutions.

Consider how some of these efforts played out:

Taking Back Our Neighborhoods: This seminal initiative, a finalist for a Pulitzer Public Service Award, set the bar high for CJ projects.  It evolved from the 1993 shooting of two Charlotte, N.C. police officers.

Determined to address the root cause of crime, The Charlotte Observer partnered with WSOC-TV to synchronize in-depth coverage and give people things they could do to reclaim their communities.

Elements included data analysis, which identified patterns of crime and the most violent neighborhoods to spotlight. A poll asked residents how crime affected them, why crime was happening and what were possible solutions. Town hall meetings and neighborhood advisory panels in 10 targeted communities contributed very specific lists of neighborhood “needs” that were published with each community’s narrative.

Outcomes were impressive:  Some 700 volunteers stepped up to fulfill the needs on those lists – from opening new recreation centers to making uniforms for a fledgling drill team. Eighteen law firms filed pro bono nuisance suits to close crack houses. New community centers were built and neighborhoods were cleaned up. Eight years later, crime was still down and the quality of life had improved in eight of the 10 neighborhoods.

West Virginia After Coal: The Herald-Dispatch in Huntington, W.Va., and West Virginia Public Broadcasting joined forces in 2000-01 to examine one of the state’s biggest issues: Its future without coal.  

The partners developed a groundbreaking database that exposed how virtually none of the $18 million in coal severance taxes distributed to the state’s 55 counties and 234 municipalities was being used for economic development. Instead, the funds paid for such things as dog wardens or postage. The media partners used statewide polls and an interactive town hall involving audience input from 10 different sites via cutting-edge video conferencing technology. By the project’s end, the state was promising more job training and more revenue targeted to economic development.

Waterfront Renaissance: In 2001, The Herald of Everett, Wash., engaged the community in plans to remake its waterfront. It held a town hall meeting on development plans and created online clickable maps with moveable icons to give residents a virtual vote on what should be built along the Snohomish River and Port Gardner Bay. Some 1,200 people participated. The Herald tabulated the results of these maps and submitted its findings to city officials. A prevailing theme was that people wanted any development to give them access to their riverfront. Their wishes were ultimately included in city plans. The project today remains a prime example of how to involve citizens in surrogate “public hearings.”

Neighbor to Neighbor: In 2002, after the shooting of an unarmed teenager in Cincinnati sparked allegations of police misconduct and major rioting, The Cincinnati Enquirer embarked on an ambitious project. It held solutions-oriented conversations on how to improve race relations in every municipality and neighborhood in the region – some 145 in all.

Each group was asked to answer:

  • What three things can people do to ease racial tensions?
  • What three things would we like to see our community leaders do?
  • How can we make it happen?

Some 1,838 people participated; 300 people volunteered to host or facilitate the conversations. The project inspired much grassroots volunteerism and efforts among black and white residents to interact. The project “started people talking together, going to dinner, meeting in their homes and going to school and churches together,” said then-managing editor Rosemary Goudreau.

There were scores of similar robust projects:

  • The Savannah Morning News involved a large citizen task force in discussions and visits to 15 U.S. schools to figure out how to improve local education.
  • A 1997 series on alcoholism, “Maine’s Deadliest Drug,” by The Portland Press Herald and Maine Sunday Telegram led to citizens forming 29 study circles that concluded with an action plan to stem alcohol abuse.
  • We the People/Wisconsin, involving the Wisconsin State Journal and the state’s public broadcaster, engaged in some of the longest-running efforts to bring citizens face-to-face with candidates running for statewide office.

To be sure, journalism investigations often lead to widespread change. But, to me, so many of today’s journalism success stories seem pallid by comparison to what I saw during the period of civic journalism experimentation.

Simply put: civic journalism worked.  Readers and viewers got it.

We learned that if you deliberately build in simple ways for people to participate – in community problems or elections – many will engage.  Particularly if they feel they have something to contribute to the problem.

Nowadays, this is so much easier than it used to be. All that is needed is the creativity to make it happen.

Jan Schaffer is executive director of J-Lab, a successor to the Pew Center and an incubator for entrepreneurial news startups.


Navigating Law for Media Startups


This was first published March 10, 2015, on MediaShift.

When I launched J-Lab in 2002, the best piece of advice I received was to have a lawyer draft a Memorandum of Understanding outlining the relationship between my center and its soon-to-be home, the University of Maryland.

The agreement detailed how I would support my startup, who owned the intellectual property, how much the university would charge for administering my incoming grants – and how I might spin the center into its own 501(c)(3) or affiliate with another fiscal agent in the future.

Thanks to that MOU, when U-MD changed its rules for grant-supported centers, I was able to seamlessly transition to American University. The MOU basically served as a pre-nup agreement.

I never really expected to need the MOU – until I did. So, too, are new media startups finding themselves in situations where they need to know about, and plan for, an array of legal issues.  Many of these issues particularly affect digital-first news sites.  

With this, and many more experiences under my belt, I approached Jeff Kosseff, a Washington, D.C., media lawyer and fellow A.U. adjunct, about co-authoring “Law for Media Startups.” We wanted to make it a user-friendly e-guide to what news entrepreneurs need to know and also help them identify when they needed professional help.

Next, I recruited CUNY’s Tow-Knight Center for Journalism Entrepreneurship to help support the project.  The result: our 12-chapter guide that we hope will be as helpful to educators teaching media law courses as it will be to startup founders themselves.  A downloadable PDF is coming soon.

Most journalists are used to working with legal counsel for such things as pre-publication review of important stories.  But legal issues for digital-first news startups extend far beyond such traditional First Amendment issues as defamation, privacy and open government.

“New media has not changed the law of libel at all,” said Richard Tofel, ProPublica’s president and its resident lawyer. “But it has changed the breadth of laws entrepreneurs need to know about.”

How should you respond when someone demands the IP address or the identity of a commenter on your site? How should you flag sites that steal your content? How can you make sure, in a rush to add an image to an article, that you are not posting a copyrighted photo? How do you deal with a freelancer’s request to reuse, for another assignment, research gathered for a story you commissioned? When is it OK for someone to be a freelancer, and when do they have a right to be an employee?

“The No. 1 question by far that we hear from our members is about freelancer contracts and rights,” said Kevin Davis, executive director of the Investigative News Network.

The IRS sets out very specific guidelines addressing who should be an employee and who can be an independent contractor.  As important, it requires all unpaid interns to meet six specific conditions.

All digital-first news startups are collecting some type of data on their users, and while most journalists advocate for openness and transparency, as an Internet-based business, you have a number of legal obligations to keep certain information private. You also need to tell your users how you will use their data.

Certainly, one of the biggest misconceptions some online publishers have is that your website will only have immunity if you take a hands-off approach and don’t edit or moderate any comments. Indeed, according to the e-guide, “service providers have wide latitude to edit, delete, or even select user content without being held responsible for it.”

Again and again, I have reviewed applications for J-Lab funding that promised that the startup would get grants to support its work.  However, the applicant was neither a nonprofit nor affiliated with one and, therefore, was not eligible for the grants it wanted to support its business.  News entrepreneurs need to understand what being a nonprofit entails or pick another business structure.

As our guide notes, “journalism is not something the IRS recognizes as having a tax-exempt purpose.” So, if you embark on applying for 501(c)3 status, you need to flesh out how you will be different from a regular commercial publisher.

In the media startup space, legal needs can be surprising. Lorie Hearn, founder of inewsource.org, has partnered with a number of media outlets to amplify her investigative stories in the San Diego area.

But she says she has begun to feel the need to craft written distribution agreements to cover inewsource partnerships with other news outlets, especially pertaining to how they credit her material on their websites. Some “want their own correspondents to come in and interview our people and make like this is a joint investigation,” she said.

For that, she will likely seek out a lawyer who has worked closely with her site over the years.

To read about more issues, see the full guide here.


Should Public Radio Offer Incentives to Attract New Digital Listeners?

The strategic use of incentives helps make public radio pledge drives more successful. They help boost the number of donations during key dayparts. They motivate some listeners to give at certain pledge levels and in ways that are beneficial to the station.

Incentives were successfully used in the late 1980s and early 1990s to encourage listeners to give via credit card instead of asking for an invoice. One of the most popular credit card incentives was an annual subscription to Newsweek magazine. Each subscription cost the station a dollar.

Incentives were successfully used in the late 1990s and early 2000s to encourage giving via the station web site. Stations held special “cyber days” to get listeners to give online. One of the most famous cyber days was in 1999 at WAMU when the station gave away a new Volvo.

Public radio has no problem offering incentives to generate contributions and encourage ideal giving behaviors.  Why not try the same for digital listening?

We know from decades of research that listening causes giving. And having more listening makes it easier to generate more underwriting revenue. Getting more listening, generating more public service, is the best fundraising a station can do. It might make sense to accelerate digital listening by offering some incentives for listeners to try it.

It’s an interesting prospect. There could be incentives for downloading an app or registering to listen online. There could be incentives for first use or the first ten hours of listening or a certain number of podcast downloads.

What types of incentives? That’s the fun part. We get to test.

Maybe it is offering bonus content or a coupon code for the NPR Shop. Maybe it is a dining discount with an underwriter or a digital coupon for a local bookseller. Perhaps it is a “vintage” t-shirt or mug from the back of the premium closet. Maybe a Bluetooth speaker is offered at a special discount price to digital listeners who use the station 10 times over two weeks.

Digital listening is supposed to be an essential component of public radio’s future. That means public radio’s finances will depend on it. It just might be worth testing whether incentives can accelerate digital audience growth.

Promoting Digital Listening Like Your Survival Depends On It

How would you promote your public radio station’s online stream if the station’s very existence depended on it?

It’s not a hypothetical question.  Every public radio station faces that situation today as more of its listeners and donors spread their listening across broadcast and digital platforms.

It wasn’t a hypothetical question five years ago for Classical KDFC in San Francisco.  KDFC was a commercial radio station and its owner decided to drop the format.  Classical music lost its home at 102.1 FM.

The University of Southern California and KUSC stepped in and acquired two lesser signals on which to broadcast KDFC as a public radio station.  Two frequencies.  Far less coverage.  More than 100,000 distraught listeners who could no longer hear the station over the air.

KDFC already had a good digital presence.  It had streams and mobile apps.  It was social media savvy.  It had a good database and a newsletter.

KDFC researched the many ways listeners could easily hear its programming through digital platforms.  It developed recommendations for Internet radio options and how to use Bluetooth to send sound to external speakers.  It developed the simplest possible narrative for communicating those options.  It heavily promoted that narrative across all available touch points.  This went on for months.

Listeners who could no longer hear KDFC reached out to the station as well and KDFC was prepared to help them with information and support. That support went as far as KDFC’s program hosts returning phone calls from listeners and walking them through the steps necessary to hear the station online.  It was a daily occurrence.

Embedded in KDFC’s story is a template for how all public radio stations should be promoting their digital listening options.
  • Start with the goal of helping as many listeners as possible learn to create a quality listening experience on a computer, to listen via an app, to use external speakers at home and in the car, and to find and listen to a podcast or on demand content.
  • Have up-to-date and easy to use digital listening options.
  • Develop a simple narrative describing the benefits of using the station’s digital offerings, including step-by-step instructions on how to get the most out of each option.
  • Promote the heck out of it using every possible touch point, including on-air.
  • Provide prompt individualized customer service when needed.
  • Rinse and Repeat.
That last point is really important.  Rinse and repeat.

KDFC ended up with five different radio signals throughout the Bay Area.  Most of its previous coverage area was restored three years ago.  In some areas the station has even better coverage. KDFC promoted those new signals even more heavily than it originally promoted online listening, including billboard and bus card advertising, and has rebuilt much of its audience.

Still, 5 years after losing its original signal and 3 years after restoring most of its coverage, the station can’t get through a pledge drive without hearing from past listeners who are just discovering that KDFC is back on the air in their community. They didn’t get the message.

Rinse and repeat.  There’s always someone who didn’t hear the message.  There’s always someone who has just discovered your station for the first time.

Growing digital listening is too important for stations not to engage in continuous promotion.  To borrow and modify an old slogan from PBS, if you aren’t going to effectively promote your own digital offerings, who will?

If Digital is the Future, Public Radio Needs to Promote it Better Now

I just spent part of the last two days listening to 50 station breaks across 14 different large and medium market public radio stations. Every station is considered to be a top station in public radio and most are considered to be digitally savvy. Some quick numbers:
  • 43 of the breaks (86%) had absolutely no promotion for the station's digital listening offerings.
  • 8 of the 14 stations had no digital listening promotion. I listened to at least 3 breaks in one hour for each station.
  • Of the 5 stations that had some sort of digital listening promotion, 3 mentioned more than one type of digital listening in the same break.  For example, the website was promoted as a way to stream the station and as a way to hear the station's new podcast.
  • 1 station qualified as promoting digital listening only because it included the website in its legal ID, "...and online at WXZY.org."  That's more of a throwaway mention than a promotion, but I still counted it.
There's not a whole lot to say here other than this is a woefully inadequate level of self-promotion given the importance of digital listening to public radio's future.  It is a notable lack of promotion given public radio's decades-long marketing lament, "If only more people knew about us."

When it comes to digital, even the people who know about us through the radio probably don't really know about our digital offerings.

It is going to be tough enough to win new listeners with the infinite number of media options now available in the digital space. Stations need to make it a priority to move as many current listeners as possible to their digital platforms. That starts with stations selling current listeners on those digital offerings. Right now, that doesn't appear to be happening in any meaningful way.

In the next post, a possible template for the promotion of digital listening.

Simplifying Map Production

Map of recent Nepal earthquakes

When news happens in locations that our audience may not know very well, a map seems like a natural thing to include as part of our coverage.

But good maps take time.*

In ArcMap, I’ll assemble the skeleton of my map with shapefiles from Natural Earth and other sources and find an appropriate projection. Then I’ll export it to .AI format and bring it into Adobe Illustrator for styling. (In the example below, I also separately exported a raster layer for shaded relief.) And then I’ll port the final thing, layer by layer, to Adobe Photoshop, applying layer effects and sharpening straight lines as necessary.

Mapping process

(* Note: I enjoy making maps, but I am unqualified to call myself a cartographer. I owe much, though, to the influence of cartographer colleagues and GIS professors.)

I concede that this workflow has some definite drawbacks:

  • It’s cumbersome and undocumented (my own fault), and it’s difficult to train others how to do it.

  • It relies on an expensive piece of software that we have on a single PC. (I know there are free options out there like QGIS, but I find QGIS’s editing interface difficult to use and SVG export frustrating. ArcMap has its own challenges, but I’m used to its quirks and the .AI export preserves layers better.)

  • This reliance on ArcMap means we can’t easily make maps from scratch if we’re not in the office.

  • The final maps are flat images, which means that text doesn’t always scale readably between desktop and mobile.

  • Nothing’s in version control.

So for the most recent round of Serendipity Day at NPR (an internal hackday), I resolved to explore ways to improve the process for at least very simple locator maps – and maybe bypass the expensive software altogether.

Filtering And Converting Geodata

My colleague Danny DeBelius had explored a little bit of scripted mapmaking with his animated map of ISIS-claimed territory. And Mike Bostock has a great tutorial for making maps using ogr2ogr, TopoJSON and D3.

(ogr2ogr is a utility bundled with GDAL that converts between geo formats. In this case, we’re using it to convert GIS shapefiles and CSVs with latitude/longitude to GeoJSON format. TopoJSON is a utility that compresses GeoJSON.)

Danny figured out how to use ogr2ogr to clip a shapefile to a defined bounding box. This way, we only have shapes relevant to the map we’re making, keeping filesize down.

ogr2ogr -f GeoJSON -clipsrc 77.25 24.28 91.45 31.5 data/nepal-geo.json ../_basemaps/cultural/ne_10m_admin_0_countries_v3.1/ne_10m_admin_0_countries.shp

We applied that to a variety of shapefile layers — populated places, rivers, roads, etc. – and then ran a separate command to compile and compress them into TopoJSON format.

ogr2ogr -f GeoJSON -clipsrc 77.25 24.28 91.45 31.5 data/nepal-geo.json ../_basemaps/cultural/ne_10m_admin_0_countries_v3.1/ne_10m_admin_0_countries.shp

ogr2ogr -f GeoJSON -clipsrc 77.25 24.28 91.45 31.5 data/nepal-cities.json -where "adm0name = 'Nepal' AND scalerank < 8" ../_basemaps/cultural/ne_10m_populated_places_simple_v3.0/ne_10m_populated_places_simple.shp

ogr2ogr -f GeoJSON -clipsrc 77.25 24.28 91.45 31.5 data/nepal-neighbors.json -where "adm0name != 'Nepal' AND scalerank <= 2" ../_basemaps/cultural/ne_10m_populated_places_simple_v3.0/ne_10m_populated_places_simple.shp

ogr2ogr -f GeoJSON -where "featurecla = 'River' AND scalerank < 8" -clipsrc 77.25 24.28 91.45 31.5 data/nepal-rivers.json ../_basemaps/physical/ne_10m_rivers_lake_centerlines_v3.1/ne_10m_rivers_lake_centerlines.shp

ogr2ogr -f GeoJSON -clipsrc 77.25 24.28 91.45 31.5 data/nepal-lakes.json ../_basemaps/physical/ne_10m_lakes_v3.0/ne_10m_lakes.shp

ogr2ogr -f GeoJSON -clipsrc 77.25 24.28 91.45 31.5 data/nepal-roads.json ../_basemaps/cultural/ne_10m_roads_v3.0/ne_10m_roads.shp

topojson -o data/nepal-topo.json --id-property NAME -p featurecla,city=name,country=NAME -- data/nepal-geo.json data/nepal-cities.json data/nepal-neighbors.json data/nepal-rivers.json data/nepal-lakes.json data/nepal-roads.json data/nepal-quakes.csv

(Why two separate calls for city data? The Natural Earth shapefile for populated places has a column called scalerank, which ranks cities by importance or size. Since our example was a map of Nepal, I wanted to show a range of cities inside Nepal, but only major cities outside.)

Mapturner

Christopher Groskopf and Tyler Fisher extended that series of ogr2ogr and TopoJSON commands to a new command-line utility: mapturner.

Mapturner takes in a YAML configuration file, processes the data and saves out a compressed TopoJSON file. Users can specify settings for each data layer, including data columns to preserve and attributes to query. The config file for our Nepal example looked like this:

bbox: '77.25 24.28 91.45 31.5'
layers:
    countries:
        type: 'shp'
        path: 'http://www.naturalearthdata.com/http//www.naturalearthdata.com/download/10m/cultural/ne_10m_admin_0_countries.zip'
        id-property: 'NAME'
        properties:
            - 'country=NAME'
    cities:
        type: 'shp'
        path: 'http://www.naturalearthdata.com/http//www.naturalearthdata.com/download/10m/cultural/ne_10m_populated_places_simple.zip'
        id-property: 'name'
        properties:
            - 'featurecla'
            - 'city=name'
            - 'scalerank'
        where: adm0name = 'Nepal' AND scalerank < 8
    neighbors:
        type: 'shp'
        path: 'http://www.naturalearthdata.com/http//www.naturalearthdata.com/download/10m/cultural/ne_10m_populated_places_simple.zip'
        id-property: 'name'
        properties:
            - 'featurecla'
            - 'city=name'
            - 'scalerank'
        where: adm0name != 'Nepal' AND scalerank <= 2
    lakes:
        type: 'shp'
        path: 'http://www.naturalearthdata.com/http//www.naturalearthdata.com/download/10m/physical/ne_10m_lakes.zip'
    rivers:
        type: 'shp'
        path: 'http://www.naturalearthdata.com/http//www.naturalearthdata.com/download/10m/physical/ne_10m_rivers_lake_centerlines.zip'
        where: featurecla = 'River' AND scalerank < 8
    quakes:
        type: 'csv'
        path: 'data/nepal.csv'
        properties:
            - 'date'
            - '+intensity'

Mapturner currently supports SHP, JSON and CSV files.

Drawing The Map

I’ve been pretty impressed with the relative ease of using D3 to render maps and test projections. Need to adjust the scope of the map? It might just be a matter of adjusting the map scale and centroid (and, if necessary, expanding the overall bounding-box and re-running the mapturner script) — much faster than redrawing a flat map.

Label positioning is a tricky thing. So far, the best way I’ve found to deal with it is to set up an object at the top of the JS with all the nit-picky adjustments, and then check against it when the labels are rendered.

var CITY_LABEL_ADJUSTMENTS = {};
CITY_LABEL_ADJUSTMENTS['Biratnagar'] = { 'dy': -3 };
CITY_LABEL_ADJUSTMENTS['Birganj'] = { 'dy': -3 };
CITY_LABEL_ADJUSTMENTS['Kathmandu'] = { 'text-anchor': 'end', 'dx': -4, 'dy': -4 };
CITY_LABEL_ADJUSTMENTS['Nepalganj'] = { 'text-anchor': 'end', 'dx': -4, 'dy': 12 };
CITY_LABEL_ADJUSTMENTS['Pokhara'] = { 'text-anchor': 'end', 'dx': -6 };
CITY_LABEL_ADJUSTMENTS['Kanpur'] = { 'dy': 12 };

Responsiveness makes label positioning even more of a challenge. In the Nepal example, I gave each label a class corresponding to its scalerank, and then used LESS in a media query to hide cities above a certain scalerank on smaller screens.

@media screen and (max-width: 480px) {
    .city-labels text,
    .cities path {
        &.scalerank-4,
        &.scalerank-5,
        &.scalerank-6,
        &.scalerank-7,
        &.scalerank-8 {
            display: none;
        }
    }
}

Our finished example map (or as finished as anything is at the end of a hackday):

 

There’s still more polishing to do — for example, the Bangladesh country label, even abbreviated, is still getting cut off. And the quake dots need more labelling and context. But it’s a reasonable start.

Drawing these maps in code has also meant revisiting our map styles — colors, typography, label and line conventions, etc. Our static map styles rely heavily on Helvetica Neue Condensed, which we don’t have as a webfont. We do have access to Gotham, which is lovely but too wide to be a universal go-to. So we may end up with a mix of Gotham and Helvetica — or something else entirely. We’ll see how it evolves.

Locator Maps And Dailygraphics

We’ve rolled sample map code into our dailygraphics rig for small embedded projects. Run fab add_map:$SLUG to get going with a new map. To process geo data, you’ll need to install mapturner (and its dependencies, GDAL and TopoJSON). Instructions are in the README.

Caveats And Next Steps

  • This process will NOT produce finished maps — and is not intended to do so. Our goal is to simplify one part of the process and get someone, say, 80 percent of the way to a basic map. It still requires craft on the part of the map-maker — research, judgement, design and polish.

  • These maps are only as good as their source data and the regional knowledge of the person making them. For example, the Natural Earth country shapefiles still include Crimea as part of Ukraine. Depending on where your newsroom stands on that, this may mean extra work to specially call out Crimea as a disputed territory.

  • When everything’s in code, it becomes a lot harder to work with vague boundaries and data that is not in geo format. I can’t just highlight and clip an area in Illustrator. We’ll have to figure out how to handle this as we go. (Any suggestions? Please leave a comment!)

  • We’ve figured out how to make smart scale bars. Next up: inset maps and pointer boxes. I’d also like to figure out how to incorporate raster topo layers.

Let’s Tessellate: Hexagons For Tile Grid Maps

A hexagon tile grid, square tile grid and geographic choropleth map. Maps by Danny DeBelius and Alyson Hurt.

As the saying goes, nothing is certain in this life but death, taxes and requests for geographic data to be represented on a map.

For area data, the choropleth map is a tried and true visualization technique, but not without significant dangers depending on the nature of the data and map areas represented. Clarity of mapped state-level data, for instance, is frequently complicated by the reality that most states in the western U.S. carry far more visual weight than the northeastern states.

Are more northeastern states shaded than western? That’s hard to say with this type of choropleth. Whatever, though. West coast, best coast, right?

While this presentation is faithful to my Californian perception of the U.S. where the northeast is a distant jumble of states I pay little attention to, I’ve learned in four years of living in D.C. that there are actually a lot of people walking around that jumble, and they’d prefer not to be ignored in mapped data visualizations. There are approximately 74 million people living in the thirteen states the U.S. Census Bureau defines as the Western United States, while around 42 million people live just in the combined metropolitan statistical areas of New York, Washington, Boston and Philadelphia.

One popular solution to this problem is the cartogram — maps where geography is distorted to correspond with some data variable (frequently population). By shading and sizing map areas, a cartogram can display two variables simultaneously. In this New York Times example from the 2012 election, the size of the squares corresponds to the number of electoral votes assigned to each state, while the shade represents possible vote outcomes. NPR’s Adam Cole used this technique to size states according to electoral votes and ad spending, as seen in the map below. Cartograms can be a great solution with some data sets, but they introduce complexity that might not serve our ultimate goal of clarity.

A cartogram of the U.S. with states sized proportionally by electoral votes. Map by Adam Cole.

Recently, a third variation of choropleth has gained popularity — the tile grid map. In this version, the map areas are reduced to a uniform size and shape (typically a square) and the tiles are arranged to roughly approximate their real-world geographic locations. It’s still a cartogram of sorts, but one where the area sizing is based on the shared value of one “map unit.” Tile grid maps avoid the visual imbalances inherent to traditional choropleths, while keeping the map a quick read by forgoing the complexity of cartograms with map areas sized by a variable data point.

Tile grid maps are a great option for mapped state data where population figures are not part of the story we’re trying to tell with the map. Several news organizations have used this approach to great effect, including FiveThirtyEight, Bloomberg Business, The Guardian, The Washington Post and The New York Times.

A square tile grid map.

Here at NPR, we recently set out to create a template for quickly producing this type of map, but early in the process my editor Brian asked, “Do the tiles have to be squares?”

More specifically, Brian was interested in exploring the possibility of using hexagons instead of squares, with the assumption that two additional sides would offer greater flexibility in arranging the tiles and a better chance at maintaining as many border adjacencies as possible.

The idea was intriguing, but I had questions about sacrifices we might make in scanability by trading the squares for hexagons. The columns and rows of a square grid lend to easy vertical and horizontal scanning, and I wondered if the tessellation of hexagons would provide a comfortable reading experience for the audience.

Here is Brian’s first quick pencil sketch of a possible state layout using hexagons:

Brian’s hex grid sketch.

That proof of concept was enough to convince me that the idea was worth exploring further. I opened up Sketch and redrew Brian’s map with the polygon tool so we could drag the states around to experiment with the tile layout more easily. We tried several approaches in building the layout, starting from each coast and building from the midwest out, with varying degrees of success.

Ultimately, I decided to prioritize accuracy in representing the unique geographic features of the U.S. border (Texas and Florida as the southernmost tips, notches for the Great Lakes) and making sure the four “corners” of the country were recognizable for orientation.

The final layout that will power our tile grid map template looks like this:

Six sides instead of four! That means it’s two better.

This map still has many of the same problems that other attempts at a tile layout of the U.S. have fallen into — the relationship of North and South Carolina, for one example — but we like the increased fidelity of the country’s shape the hex grid makes possible.

In case you were wondering, news dev Twitter loves talking about maps.

We recently published our first use of the hexagon tile grid map to show the states that currently have laws restricting discrimination in employment, housing and public accommodations based on sexual orientation, gender identity and gender expression. The hex grid tile map also made appearances in several presentations of last week’s U.K. election results, including those by The Guardian, Bloomberg Business and The Economist.

What do you think? Vote in the poll below!

Tech note: Connecting to an Amazon RDS database from a legacy EC2 server

Amazon’s Relational Database Service (RDS) is an excellent way to host databases. The service is affordable, low-maintenance, and self-contained. If you use the Amazon cloud, there are precious few reasons to maintain your own database server.

At some point, Amazon started requiring RDS instances to use Virtual Private Cloud (VPC) networking. However, if you’re like the NPR Visuals team, you might have older Amazon Elastic Compute Cloud (EC2) server instances that don’t use VPC but need to connect to RDS databases. Even if you don’t, you might need to connect to your RDS instance locally.

As is often the case with Amazon, it’s not entirely clear how to configure the correct security rules to allow access from outside the VPC. Here’s what worked for us.

During creation, make sure your RDS instance is publicly accessible. This setting cannot be edited later.

Make your RDS instance publicly accessible

For the security group setting, either option will suffice, though creating a new security group will help isolate the network access rules for this database instance.

Once created, click on the security group from the instance details:

Click the security group link

A new tab or window will open with the security group selected. Click the “Inbound” tab in the lower window pane, then click the “Edit” button to add rules allowing the IP addresses you want to access the RDS instance.

Click inbound tab, then click edit

Now you can configure the inbound rules in the modal that opens:

Edit inbound rules in the modal

I found a lot of places in the VPC interface to set inbound rules, but only the security group rules actually worked to allow local machines and non-VPC EC2 instances access to the RDS database.
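
To sanity-check the rules, it helps to attempt a connection from one of the machines you just allowed. Here’s a minimal sketch, assuming a PostgreSQL RDS instance and the psycopg2 driver; the endpoint, credentials and database name are placeholders, not real values:

import psycopg2

# Connect to the RDS instance from a local machine or non-VPC EC2 server.
# Endpoint and credentials below are placeholders -- substitute your own.
conn = psycopg2.connect(
    host='mydb.abc123xyz.us-east-1.rds.amazonaws.com',
    port=5432,
    user='myuser',
    password='mypassword',
    dbname='mydb'
)

cur = conn.cursor()
cur.execute('SELECT version();')
print cur.fetchone()[0]  # If this prints, the inbound rule is working.

cur.close()
conn.close()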

If you know a better way to handle this, let us know in the comments!

Better, faster, more: recent improvements to our dailygraphics rig

In the past couple weeks the Visuals team has consciously shifted resources to focus on the parts of our work that have the highest impact. As part of this reorganization the graphics team has grown from one (Graphics Editor Alyson Hurt) to two—the second being me! Having a dedicated engineer working on daily graphics means we are doubling down both on the amount of content we can create and on the tools we use to create it. For the last week I’ve been sprinting on a slew of improvements to our dailygraphics rig. Most of these are small changes, but collectively they represent the biggest iteration we’ve made to dailygraphics since creating it over a year ago.

OAuth

Amongst the features we’ve ported over from the app-template is OAuth support for accessing our “copytext” Google spreadsheets. This means Google credentials no longer need to be stored in environment variables, which increases security and portability. (Hat tip to David Eads for untangling OAuth for all of our projects.)

This change also allowed us to implement a more significant feature: automatically creating copytext spreadsheets. Each time you add a graphic a spreadsheet will be automatically created. (You can opt out of this by not specifying a COPY_GOOGLE_DOC_KEY in your base graphic template or by deleting graphic_config.py entirely.)

Rewriting the copytext workflow has also allowed us to add a “refresh flag” to the preview. Now anytime you pass ?refresh=1 with your graphic preview URL, the preview app will automatically re-download the latest copytext before rendering. This can tremendously accelerate editing time for text-heavy graphics.

Advanced graphic templates

As our graphics pipeline has matured we’ve started to run into many of the same limitations that prompted development of the app-template. As a result, we’ve reincorporated features such as template inheritance, asset compression and LESS support.

The base template

All graphic templates now “inherit” from a base template, which is found in graphic_templates/_base. When a new graphic is created, this folder is copied to the new graphic’s path before the normal graphic template (e.g. graphic_templates/bar_chart). This base template can house files common to all templates for easy updates. (The individual graphic templates can copy over any or all of them.)

The base template also includes a base_template.html, which the original child_template.html now inherits from using Jinja2 template inheritance. This change means you can now make a change to the header or footer of your graphics and have it instantly incorporated in all your graphic templates. (Not retroactively, though: every graphic is still a copy of all assets and templates.)

LESS and asset compression

All CSS files in graphic templates can now be LESS files, which will be automatically compiled during preview and deployment. The resulting CSS assets will then be combined into a single file and compressed via this code in the base template:

<!-- CSS + LESS -->
{{ CSS.push('css/base.less') }}
{{ CSS.push('css/graphic.less') }}
{{ CSS.render('css/graphic-header.css') }}

Mirroring the app-template, this same pattern is followed for compressing Javascript assets:

{{ JS.push('js/lib/jquery.js') }}
{{ JS.push('js/lib/d3.min.js') }}
{{ JS.push('js/lib/modernizr.svg.min.js') }}
{{ JS.push('js/lib/pym.js') }}
{{ JS.push('js/base.js') }}
{{ JS.push('js/graphic.js') }}
{{ JS.render('js/graphic-footer.js') }}

Google Analytics support

Our new base template also now includes code for embedding Google Analytics with your graphics. We’ve long wanted to be able to track detailed analytics for our graphics, but putting analytics inside the iframe would have resulted in impressions being counted twice—once for the parent page and once for the child page. To avoid this we’ve recently begun tracking our project analytics on a separate Google property from that used for NPR.org. This allows us to put our custom analytics tag inside the iframe while our traditional pageviews are captured by the parent analytics tags.

Improvements to the graphic viewer (parent.html)

Perhaps the most obvious change to the dailygraphics rig is our suite of improvements to the graphic preview template (a.k.a. parent.html). These changes are aimed at making it easier to see how the final graphic will work and making it faster to test. They include:

  • Resize buttons for quickly testing mobile and column layouts.
  • Border around the graphic so you can see how much margin you’ve included.
  • An obvious label so you know which environment you’re working in (local, staging, production).
  • One-click links to other environments and to the copytext spreadsheet (if configured).
  • Easy-to-copy Core Publisher embed code (for NPR member stations).

Other improvements

In addition to these larger improvements, we’ve also made a couple of smaller improvements that are worth noting.

Upgrading

If you’re a user of the dailygraphics rig we strongly encourage you to upgrade and incorporate these new improvements into your process. I think they’ll make your graphics workflow smoother and much more flexible. After pulling the latest code you’ll need to install new requirements. Node.js is now a dependency, so if you don’t have that you’ll need to install it first:

brew install node
curl https://npmjs.org/install.sh | sh

Then you can update your Node and Python dependencies by running the following commands:

pip install -Ur requirements.txt
npm install

Please remember that everything in the dailygraphics rig still works on copies, so upgrading will not retroactively change anything about your existing graphics.

If you’re using the improved dailygraphics rig, let us know!

The Well-Chosen Word Matters in Pledge Drives Too

One of the big challenges during public radio pledge drives is avoiding clichés. They pop into the appeals of even the most experienced on-air pitchers. Fundraising fatigue will do that to you. 

Pledge drive clichés aren’t effective at persuading listeners that their support is important. 

You are the public in public radio.

And it is unlikely a cliché ever motivated someone to drop what she was doing to make a contribution.

We meet our goal one pledge at a time. Just you and 19 other people in the next 2 minutes gets us there.

For the most part, pledge drive clichés are silly filler. However, there’s a new one going around that is downright ridiculous and, in my opinion, a bit damaging.

It’s time to begin your financial relationship with the station.

When I hear this on the air, I can’t help but think about how Paula Poundstone might react using her best “Wait Wait… let’s stop the show for a moment while I ask a few questions to sort this out” voice. It goes something like this:

Hold on a second. Did you say that you want to begin a financial relationship with me?  How does that work?  I give you 10 bucks a month and you go halfsies with me on my kid's college tuition?

On-air fundraising is hard. In some ways it is the most challenging programming to produce in public radio because it is live and, even when heavily scripted, subject to spontaneity.

Sometimes that spontaneity makes the fundraising more effective. Other times it undermines not only the fundraising, but also the larger effort to build a true relationship with listeners beyond the programming.

It’s time to begin your financial relationship with the station.

Who talks like that in real life?

Public radio is successful because the well-chosen word still matters. Listeners will hear poorly-chosen words on-air as long as stations do traditional pledge drives. It's one of the costs of doing business that way.

It’s important to remember that the pledge drive words are just as much a part of how listeners think and feel about the station as the words they hear while listening to programming. Stations should strive to recognize those poorly-chosen words when they inevitably happen and ensure that they don’t become clichés that hurt the station’s image more than they help it.

Shorter Pledge Drives… Again!


Public radio is in another cycle of conducting shorter on-air pledge drives.  
 
The latest cycle started at North Country Public Radio (NCPR) in upstate New York.  Last fall, NCPR produced what it called a Warp Drive, allowing it to meet its $325,000 campaign goal with just 3 hours of traditional on-air fundraising. The typical NCPR drive was five days full of fundraising interruptions. 
 
NCPR achieved this through weeks of more aggressive off-air fundraising (email, direct mail, social media) supported by short on-air announcements that didn’t interrupt the programming.  Several stations have followed NCPR’s lead and have been able to cut their drives from more than a week to mere days, even hours.  Vermont Public Radio managed to meet its $350,000 goal without having to interrupt programming at all.
 
The “less on-air fundraising” movement isn’t new to public radio. There was a lot of experimentation in the mid-1990s. We helped WBUR in Boston cut a drive from 10 days to 3 hours with More News, Less On-air Fundraising. The station managed to keep drives very short for a little more than a year.  WKSU in Kent, OH pioneered All the Money in Half the Time. Many stations tried variations of these ideas throughout the 90s with good results. 
 
In the early 2000s, WUWM in Milwaukee eliminated its entire Fall drive for 3 or 4 years in a row using strategies similar to NCPR’s Warp Drive.  Around the same time, WSKG in Binghamton, NY invented the 1-Day pledge drive. 
 
Sonja Lee, who is part of our firm Sutton & Lee, helped perfect the 1-Day drive concept while she was at KBBI in Homer, AK. She helped us create a 1-Day drive kit and consulting package used by more than a dozen stations.  A few of those stations have been doing nothing but 1-Day pledge drives for years, including five straight years for Northwest Public Radio in Pullman, WA.
 
Shorter drives, by themselves, provide no long-term fundraising benefit. The real fundraising benefit of shortening drives is the leverage it provides when trying to get more sustaining members and direct mail givers. These types of donors have greater long-term value to the station. The promise of shortening or eliminating drives helps change their giving behavior.
 
It should come as no surprise then that drive shortening efforts tend to work best at stations with under-developed off-air fundraising programs.  There’s more financial opportunity.
 
Really short drives don’t last long at most stations. There are several reasons including:
 
- Failing to upgrade off-air fundraising efforts or maintain them at the highest level. After a few big successes, pledge drives get longer again in order to capture lapsed donors and lost off-air revenue. 
 
- Increased revenue demands. Stations increase their spending over time more than they can improve their off-air fundraising results.  Then the pledge drive creep begins.
 
- Novelty. Short drives are at their most efficient the first go-around. The actual on-air part of shorter drives makes less money over time as listeners get used to them. The first few drives bring in lots of additional gifts as current members reward the station for doing less fundraising. The novelty wears off and the additional gifts go away.
 
This is where NCPR has made an important innovation. Almost every past approach to less on-air fundraising had a "pre-drive" that helped shorten the drive. NCPR flips that and says that the weeks leading up to the on-air pitching *are* the drive. The on-air part is merely clean-up. That's a very good message.  It redefines the drive and might help create future additional gift opportunities when the novelty wears off.  Whether that pans out remains to be seen.
 
Acquiring new members can be an issue over time but it is not initially a problem for most stations. At first, stations see a spike in renewal rates and lapsed donors coming back. So even when new member counts are down, the donor database grows through better retention and reacquisition. This can last as long as two or three years if the off-air fundraising efforts are firing on all cylinders.
 
Will this cycle of shorter drives lead to a lasting change in how public radio conducts on-air fundraising?  Probably not.
 
NCPR repeated its Warp Drive approach this Spring and needed 2.5 days of traditional fundraising to meet its campaign goal.  While that’s a lot more than the 3 hours it required in the Fall, it is still a great success.  It’s half as much fundraising as the station used to do.  That’s good fundraising and good stewardship of the airwaves.
 
And, as with every past cycle to shorten drives, this one is helping public radio learn new things about fundraising that will make more stations stronger in a future where traditional pledge drives could be as much of a liability as an asset.

Making small multiples maps with invar

Mapping the spread of Wal-Mart

For a recent story on the growth of Wal-Mart in urban areas we set out to map Wal-Marts across the US and over time. Due to limitations with our dataset, we only ended up mapping three cities. Here is the graphic we produced:

Automation is key to generating these sorts of maps. There are a huge number of things that could go wrong if each one were produced by hand. For this story the automated process involved connecting several different tools and many different data sources. In this post I’m going to set that complexity aside and focus on just the final part of the toolchain: outputting SVG maps for final styling in Illustrator. If you’re interested in the complete process, we’ve open sourced the code here.

Why use many little maps?

For this story the maps we produced were used as “small multiples”, that is, many small images that collectively illustrate something larger. However, there are many other occasions where producing small maps is useful, such as when illustrating city- or county-level data for many hundreds of places. Sometimes it’s necessary to generate these maps dynamically, but in many cases they can be pre-generated and “looked up” as needed.

From XML to SVG

To generate map images we used a tool I originally wrote over four years ago, when I was working for the Chicago Tribune: invar.

invar is a suite of three command line tools:

  • ivtile generates map tiles suitable for making slippy maps.
  • ivframe generates individual maps centered on locations.
  • ivs3 bulk uploads files (such as map tiles) to Amazon S3 for distribution.

Both ivtile and ivframe use Mapnik as a rendering engine. Mapnik allows you to input an XML configuration file specifying styles and datasources and output map images as PNGs or SVGs. For example, here is a fragment of the configuration for the circles (“buffers”) around each store:

<Layer name="buffers" status="on" srs="+init=epsg:4269">
    <StyleName>buffer-styles</StyleName>
    <Datasource>
        <Parameter name="type">postgis</Parameter>
        <Parameter name="host">localhost</Parameter>
        <Parameter name="dbname">walmart</Parameter>
        <Parameter name="table">(select * from circles where year::integer &lt;= 2005 order by range desc) as buffers</Parameter>
    </Datasource>
</Layer>

<Style name="buffer-styles">
<Rule>
    <Filter>[range] = 1</Filter>
    <PolygonSymbolizer fill="#28556F" />
</Rule>
<Rule>
    <Filter>[range] = 2</Filter>
    <PolygonSymbolizer fill="#3D7FA6" />
</Rule>
</Style>

In this example we query a PostGIS table called circles to get buffers for stores opened before or during 2005. (The &lt; escaping is an unfortunate necessity for getting the XML to parse correctly.) We then color the circles differently based on whether they represent a one or two mile range. To render the map for Chicago we would run:

ivframe -f svg --name chicago_2005.svg -z 10 -w 1280 -t 1280 map.xml . 41.83 -87.68

(You can also render a series of images using coordinates from a CSV. See the invar docs for more examples and more details on the flags being used here.)

Documentation of the Mapnik XML format is relatively sparse, but Googling frequently turns up working examples. If the XML annoys you too badly, there are also Python bindings for Mapnik, though personally I’ve never had much luck generating maps from scratch with them.
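
For reference, a bare-bones render with the Python bindings looks roughly like this. It’s a sketch only, assuming Mapnik 2.x and a map.xml that uses a geographic (longitude/latitude) projection; the bounding box is a placeholder, not the one used for the Wal-Mart maps:

import mapnik

# Load the same XML configuration that ivframe uses.
m = mapnik.Map(1280, 1280)  # width and height in pixels
mapnik.load_map(m, 'map.xml')

# Zoom to a rough Chicago extent (placeholder coordinates, in lon/lat).
bbox = mapnik.Box2d(-87.94, 41.64, -87.52, 42.02)
m.zoom_to_box(bbox)

# Write out a PNG; SVG output requires a Mapnik build with Cairo support.
mapnik.render_to_file(m, 'chicago_2005.png', 'png')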

Using invar to make your own maps

invar is easy to install: just pip install invar. Unfortunately, the Mapnik dependency is notoriously difficult to install. You’ll find instructions specific to your platform on the Mapnik wiki. (If you’re on OSX I recommend brew!) This is the only time I will ever suggest not using virtualenv to manage your Python dependencies. Getting Mapnik to work within a virtualenv is a painful process and you’re better off simply installing everything you need globally. (Just this once!)

Dusting off invar after not having used it for a long time gave me a good opportunity to fix some critical bugs and the new 0.1.0 release should be the most stable version ever. More importantly, it now supports rendering SVG images, so you can produce rough maps with invar and then refine them with Illustrator, which is what we did for the Wal-Mart maps. Go ahead and give it a spin: the full documentation is here. Let us know how you use it!

Law for Media Startups – New Entrepreneurship Guide

Posted To: Press Releases

For immediate release
Noon, March 4, 2015
Contact: Jan Schaffer
jans@j-lab.org

 

J-Lab partners with CUNY to create e-guide

 

Washington, D.C. – “Law for Media Startups,” a new resource for entrepreneurs launching news ventures and educators teaching students how to do it, was published today by CUNY’s Tow-Knight Center for Entrepreneurial Journalism.

The 12-chapter web guide was written by Jan Schaffer, J-Lab executive director, and Jeff Kosseff, a media lawyer with the Covington & Burling LLP law firm in Washington, D.C. The Tow-Knight Center supported the research, writing and production as part of its suite of entrepreneurial journalism resources.

“This guide goes beyond traditional First Amendment law to address every-day issues news entrepreneurs confront,” Schaffer said. “From native advertising to labor law and fair use, it supplies what I found missing for my students.”

"Every day, innovators are developing new ways to deliver news and content to consumers," Kosseff said. "I hope this guide helps them identify the legal issues that they should be considering as they build their business models."

Small digital news startups are facing a range of legal issues as they set up their business operations, gather and report the news, protect their content, and market and support their news ventures. They need to know classic First Amendment law – and much more. This guide offers an introduction to many of those issues, from hiring freelancers and establishing organizational structures, to native advertising and marketing, to maintaining privacy policies and dealing with libel accusations. It seeks to help jumpstart the launch of news ventures and help entrepreneurs know when to seek professional legal help.

“The news ecosystem of the future will be made up of enterprises of many sizes, shapes, and forms, including journalistic startups that need help with their businesses and the law,” said Jeff Jarvis, Director of the Tow-Knight Center. “Jan Schaffer and Jeff Kosseff provide an invaluable guide to help them recognize legal pitfalls. It complements other research from Tow-Knight on a variety of business practices.”

Jeff Kosseff is a communications and privacy attorney in Covington & Burling’s Washington, D.C. office, where he represents and advises media and technology companies. He is co-chair of the Media Law Resource Center’s legislative affairs committee. He clerked for Judge Milan D. Smith, Jr. of the U.S. Court of Appeals for the Ninth Circuit and Judge Leonie M. Brinkema of the U.S. District Court for the Eastern District of Virginia. He is an adjunct professor of communications law at American University, where he teaches in its MA in Media Entrepreneurship program. Before becoming a lawyer, Kosseff was a journalist for The Oregonian and was a finalist for the Pulitzer Prize and recipient of the George Polk Award for national reporting.

Jan Schaffer is executive director of J-Lab, an incubator for news entrepreneurs and innovators, and Entrepreneur in Residence at American University’s School of Communication, where she also teaches in its MA in Media Entrepreneurship program. She launched J-Lab in 2002 to help newsrooms use digital technologies to engage people in public issues. It has funded 100 news startups and pilot collaboration projects and it has commissioned and developed a series of online journalism resources that include Launching a Nonprofit News Site, Top Ten Rules for Limited Legal Risk and The Journalist’s Guide to Open Government. As the federal court reporter for The Philadelphia Inquirer, she was part of a team awarded the Pulitzer Gold Medal for Public Service for a series of stories that won freedom for a man wrongly convicted of five murders and led to the civil rights convictions of six Philadelphia homicide detectives.

The “Law for Media Startups” guide also invites media lawyers around the country to contribute information on state-specific laws that apply to news entrepreneurs, following the guide’s template for laws in Virginia.

J-Lab is a journalism catalyst that has provided funding and resources for news startups. It has funded 100 startups and pilot projects since 2005.

The Tow-Knight Center for Entrepreneurial Journalism offers educational programs, supports research, and sponsors events to foster sustainable business models for quality journalism. It is part of the City University of New York's Graduate School of Journalism, and funded by The Tow Foundation and The Knight Foundation.


Switching to OAuth in the App Template

Suyeon Son and David Eads re-worked the authentication mechanism for accessing Google Spreadsheets with the NPR Visuals App Template. This is a significant change for App Template users. Here’s why we did it and how it works.

Most App Template developers only need to consult the Configuring your system and Authenticating sections of this post, provided someone on your team has gone through the process of creating a Google API application and given you credentials.

Why OAuth?

Prior to this change, the App Template accessed Google spreadsheets with a user account and password. These account details were accessed from environment variables stored in cleartext. Storing a password in cleartext is a bad security practice, and the method led to other dubious practices like sharing credentials for a common Google account.

OAuth is a protocol for accessing online resources on behalf of a user without a password. The user must authenticate with the service using her password to allow the app to act on her behalf. In turn the app receives a magic access token. Instead of directly authenticating the user with the service, the application uses the token to access resources.

There are many advantages to this approach. These access tokens can be revoked or invalidated. If used properly, OAuth credentials are always tied to an individual user account. An application can force all users to re-authenticate by resetting the application credentials. Accessing Google Drive resources with this method is also quite a bit faster than our previous technique.

Setting up the Google API application

To use the new OAuth feature of the App Template, you will need to create a Google API project and generate credentials. Typically, you’ll only need to do this once for your entire organization.

Visit the Google Developer’s Console and click “Create Project”.

Give the project a name for the API dashboard and wait for the project to be created:

Give the project a name again (oh, technology!) by clicking “Consent screen” in the left hand toolbar:

Enable the Drive API by clicking “APIs” in the left hand toolbar, searching for “Drive” and enabling the Drive API:

You can optionally disable the default APIs if you’d like.

Finally, create client credentials by clicking “Credentials” in the left hand toolbar and then clicking “Create New Client ID”:

Make sure “Web application” is selected. Set the Javascript origins to “http://localhost:8000”, “http://127.0.0.1:8000”, “http://localhost:8888” and “http://127.0.0.1:8888”. Set the Authorized Redirect URIs to “http://localhost:8000/authenticate/”, “http://127.0.0.1:8000/authenticate/”, “http://localhost:8888/authenticate/” and “http://127.0.0.1:8888/authenticate/”.

Now you have some credentials:

Configuring your system

Whew! Happily, that’s the worst part. Typically, you should only do this once for your whole organization.

Add some environment variables to your .bash_profile or current shell session based on the client ID credentials you created above:

export GOOGLE_OAUTH_CLIENT_ID="825131989533-7kjnu270dqmreatb24evmlh264m8eq87.apps.googleusercontent.com"
export GOOGLE_OAUTH_CONSUMER_SECRET="oy8HFRpHlJ6RUiMxEggpHaTz"
export AUTHOMATIC_SALT="mysecretstring"

As you can see above, you also need to set a random string to act as cryptographic salt for the OAuth library the App Template uses.
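
Any sufficiently random string works as the salt. One quick way to generate one (a suggestion, not something the library requires):

# Print a random hex string suitable for use as AUTHOMATIC_SALT.
import binascii
import os

print binascii.hexlify(os.urandom(24))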

Authenticating

Now, run fab app in your App Template project and go to localhost:8000 in your web browser. You’ll be asked to allow the application to access Google Drive on behalf of your account:

If you use multiple Google accounts, you might need to pick one:

Google would like you to know what you’re getting into:

That’s it. You’re good to go!

Bonus: Automatically reloading the spreadsheet

Any route decorated with the @oauth_required decorator can be passed a refresh=1 querystring parameter which will force the latest version of the spreadsheet to be downloaded (e.g. localhost:8000/?refresh=1).

This is intended to improve the local development experience when the spreadsheet is in flux.
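
The check itself lives inside the @oauth_required decorator, but conceptually it boils down to something like the snippet below. This is an illustrative sketch of the pattern, not the App Template’s actual code:

from flask import request

def wants_refresh():
    # True when the preview URL includes ?refresh=1
    return request.args.get('refresh') == '1'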

Behind the scenes

The new system relies on the awesome Authomatic library (developed by a photojournalist!).

We provide a decorator in oauth.py that wraps a route with a check for valid credentials, and re-routes the user through the authentication workflow if the credentials don’t exist.

Here’s an example snippet to show how it works:

from flask import Flask, render_template
from oauth import oauth_required

app = Flask(__name__)

@app.route('/')
@oauth_required
def index():
    context = {
        'title': 'My awesome project',
    }
    return render_template('index.html', **context)

Authomatic provides an interface for serializing OAuth credentials. After successfully authenticating, the App Template writes serialized credentials to a file called ~/.google_oauth_credentials and reads them when needed.

By using the so-called “offline access” option, the credentials can live in perpetuity, though the access token will change from time to time. Our implementation hides this step in a function called get_credentials, which automatically refreshes the credentials if necessary.

By default, credentials are global – once you’re authenticated for one app template project, you’re authenticated for them all. But some projects may require different credentials – perhaps you normally access the project spreadsheet using your USERNAME@YOURORG.ORG account, but for some reason need to access it using your OTHERUSERNAME@GMAIL.COM account. In this case you can specify a different credentials file in app_config.py by changing GOOGLE_OAUTH_CREDENTIALS_PATH:

GOOGLE_OAUTH_CREDENTIALS_PATH = '~/.special_project_credentials'

Finally, the Google Doc access mechanism has changed. If you need to access a Google spreadsheet that’s not involved with the default COPY rig, use the new get_document(key, file_path) helper function. The function takes two parameters: a spreadsheet key and path to write the exported Excel file. Here’s an example of what you might do:

from copytext import Copy
from oauth import get_document

def read_my_google_doc():
    file_path = 'data/extra_data.xlsx'
    get_document('0AlXMOHKxzQVRdHZuX1UycXplRlBfLVB0UVNldHJYZmc', file_path)
    data = Copy(file_path)

    for row in data['example_list']:
        print '%s: %s' % (row['term'], row['definition'])

read_my_google_doc()