How-To: 10 Tips for Launching a Solid Podcast

Campaign studies by Podtrac and TNS found podcast advertising to be three times as effective as “traditional” online advertising, and seven times as effective as TV ads.

In 2007, podcasts served 18.5 million users in the US — a figure projected to rise to 65 million by 2012 (eMarketer).

These tips for launching an engaging podcast will help you build a loyal and responsive brand audience.

1. Plan your podcast schedule. An engaging podcast is more than a “one-off” episode. Plan each in advance, and launch them on a consistent day and time.

If you broadcast weekly, publish a monthly schedule so listeners get a sense of what to expect. Stick to your schedule — inconsistency encourages even devotees to look elsewhere.

2. Make it RSS-accessible. A downloadable mp3 file is only one component of a podcast. Enable people to subscribe via RSS so they can retrieve updates automatically.
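
What makes a feed “podcast-ready” is the RSS `enclosure` element pointing at the audio file. Here is a minimal sketch that builds such a feed with Python’s standard library; the show title, URLs, and file details are placeholders, not a real feed:

```python
# Minimal sketch of a podcast RSS 2.0 feed. The <enclosure> element is
# what podcast clients use to locate and download the audio file.
import xml.etree.ElementTree as ET

rss = ET.Element("rss", version="2.0")
channel = ET.SubElement(rss, "channel")
ET.SubElement(channel, "title").text = "Example Podcast"
ET.SubElement(channel, "link").text = "https://example.com/podcast"
ET.SubElement(channel, "description").text = "A weekly example show."

item = ET.SubElement(channel, "item")
ET.SubElement(item, "title").text = "Episode 1"
ET.SubElement(item, "pubDate").text = "Thu, 01 May 2008 13:00:00 GMT"
# url, length (bytes), and type are all required on an enclosure:
ET.SubElement(item, "enclosure", url="https://example.com/ep1.mp3",
              length="12345678", type="audio/mpeg")

feed_xml = ET.tostring(rss, encoding="unicode")
print(feed_xml)
```

Serving this XML at a stable URL is all a podcast client needs to poll for new episodes automatically.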

3. Keep it short. It’s not a one-hour radio show. Brevity encourages relevance. Unless you have a strong feature, don’t try listeners’ patience.

A typical episode of The Wall Street Journal’s “Your Money Matters” podcast lasts a little over five minutes.

4. Don’t waste time hard-selling. Don’t discuss your product or service all the time. When you do, invite a client or user to speak frankly about it on air. It’s okay to promote a website if the site contains content relevant to the episode.

5. Segment your podcasts. Think “Client Talk,” “Tip of the Day,” “Your Questions Answered,” that kind of thing. Content segments give listeners bearings and yield a sense of familiarity with your podcast’s ebbs and flows. This is key to the success of any series.

6. Simplify podcast management. Keep your recording process and RSS feed management simple, so you can focus on developing content (the tough part).

GarageBand and Audacity are among the most popular programs for recording podcasts. FeedForAll helps with feed editing; its simple GUI eases management of RSS feeds.

7. Submit your podcast to popular directories. iTunes lets users submit podcasts from within the program, and Apple offers additional podcasting resources.

Pinging services like Autopinger and Pingoat will submit podcast updates to major blogs and search engines. Burn your podcast with FeedBurner, which lets you notify listeners about new episodes through email updates.

8. Build a compelling podcast website. Any ad campaign or product launch should have its own web destination, loaded with up-to-date and relevant information. Your podcast is no different. Keep the site updated with your podcast schedule and website-only tidbits to build listener loyalty.

Bonus features on the website will go a long way. Coffee Break Spanish uses its site to sell written transcripts of its free audio lessons. The website is also the place for visibly advocating your product or brand.

9. Let website visitors commune with one another. Provide listeners with a newsgroup so they can interact. You can also start a Facebook group or invite them to follow you on Twitter.

10. Measure and analyze. None of this does much good if you’re not keeping metrics on your progress. Some handy tools:

  • Google Analytics helps track users and audio file downloads
  • Feedburner lets you measure the number of unique subscribers per episode
  • Podtrac and Volomedia help you gain deeper insight into behavioral and demographic data

Happy podcasting.

This MarketingVOX How-To was written by Arun Krishnan, VP of Marketing at Pontiflex. His podcast, “Learn Hindi from Bollywood Movies,” has been running since 2006.


Donation page optimization: Summary of learning

  • Size DOES matter: Bigger donate buttons helped convert more donors
  • Color can matter too: A vividly colored donation button can strongly boost donation page conversion, but seasonality and color choice influenced whether it did
  • Less is more: Removing unnecessary fields from the personal information form significantly increased conversion to donate
  • Remind people (nicely) why they want to donate: Polite header copy (“Please make a tax-deductible gift…”) followed by short appeal copy yielded better conversion than a more forceful call-to-action (“Donate Now! Help us…”) without appeal copy
  • No need to be demanding: Using firmer language on the donation button (“Donate Now” instead of “Submit”) did not produce statistically higher conversions

How to add stuff to Wikipedia

1) For legal reasons, we have to be very proactive about copyright violations.
2) You can’t license the use of copyrighted material on Wikipedia; only the copyright owners can do that.
3) This article would probably have been deleted anyway, as being about a non-notable organization; see WP:GROUP.
4) Since you have a relationship with this organization, you probably shouldn’t be editing any articles about them anyway, under our restrictions on edits by persons with conflicts of interest. —Orange Mike | Talk 14:03, 7 May 2008 (UTC)
  1. Information on copyright releases can be found here.
  2. As to the conflict-of-interest issue, the thing to do is first to request that some impartial third party create an article.
  3. If an editor agrees that the organization passes muster under WP:GROUP, and an article is created, then COI persons would make suggestions on the talk page of the article, and other editors would take up the suggestions they found most useful.
  4. (I’d advise you to clarify the relationship between OPA, the WWF, etc., because I was having trouble figuring it out from the


How Little Do Users Read?

On the average Web page, users have time to read at most 28% of the words during an average visit; 20% is more likely.

We’ve known since our first studies of how users read on the Web that they typically don’t read very much. Scanning text is an extremely common behavior for higher-literacy users; our recent eyetracking studies further validate this finding.

The only thing we’ve been missing is a mathematical formula to quantify exactly how much (or how little) people read online. Now, thanks to new data, we have this as well.

The Research Study

For full details, see the following academic paper:

Harald Weinreich, Hartmut Obendorf, Eelco Herder, and Matthias Mayer: “Not Quite the Average: An Empirical Study of Web Use,” in the ACM Transactions on the Web, vol. 2, no. 1 (February 2008), article #5.

In the study, the authors instrumented 25 users’ browsers and recorded extended information about everything they did as they went about their normal Web activities. What’s important about this study is that it was completely naturalistic: the users didn’t have to do anything special.

One downside of the study is that the users had above-average intelligence, with several being university employees. This might not be a problem in the long run, however. If, for example, we compare data we collected in 2008 for our Fundamental Guidelines for Web Usability seminar with a similar study we ran in 2004, we find that 2008’s average behavior is close to that of 2004’s higher-end users. Thus, even though Weinreich et al.’s data represents high-end users, it’s likely to be fairly representative of broader user behavior in the future. In fact, the authors collected their data in 2005, so the recorded behaviors might already be fairly common.

In any case, the research yielded several interesting findings, and the full paper is well worth reading.

Among other things, the authors found that the Back button is now only the third most-used feature on the Web. Clicking hypertext links remains the most-used feature, but clicking buttons on the page has now overtaken Back to become the second most-used feature. The reason for this change is the increased prevalence of applications and feature-rich Web pages that require users to click page buttons to access their functionality.

Of course, Back is still the user’s lifeline and is so frequently used that supporting it remains a strong usability guideline.

Real-Life Reading Behavior

Harald Weinreich graciously provided me with the dataset detailing 59,573 page views.

From this data, I removed the following records:

  • 10,163 page views (17%) that lasted less than 4 seconds. In such brief “visits,” users clearly bounced right out without truly “using” the page.
  • 2,615 page views (4%) that lasted more than 10 minutes. In these cases, users almost certainly left the browser open while doing something else.
  • 1,558 page views (3%) with fewer than 20 words on them. Such pages are probably server errors or disrupted downloads.

After cleaning the dataset, I was left with 45,237 page views for my analysis.
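
The three cleaning rules above can be sketched as a simple filter. The record layout here (a duration in seconds paired with a word count) is an assumption for illustration; the thresholds are the ones stated in the article:

```python
# Sketch of the dataset-cleaning rules: drop bounces (< 4 s), abandoned
# tabs (> 10 min), and near-empty pages (< 20 words).
def keep(duration_seconds, word_count):
    """Return True if a page view survives all three filters."""
    if duration_seconds < 4:        # bounce: user left almost immediately
        return False
    if duration_seconds > 10 * 60:  # browser probably left open, not read
        return False
    if word_count < 20:             # likely a server error or broken page
        return False
    return True

# Hypothetical (duration_seconds, word_count) page-view records:
views = [(2, 500), (30, 593), (700, 400), (45, 10), (120, 1200)]
cleaned = [v for v in views if keep(*v)]
print(cleaned)  # prints [(30, 593), (120, 1200)]

# Sanity check of the article's arithmetic: removing the three groups
# from 59,573 page views leaves 45,237.
assert 59_573 - 10_163 - 2_615 - 1_558 == 45_237
```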

I was able to fit very nice formulas to describe users’ reading behavior for pages containing between 30 and 1,250 words. For longer pages, reading became quite erratic. Pages with a huge word count are probably not “real” pages anyway — they’re more likely to be either academic papers or “terms & conditions” pages, which people don’t give the time of day. (In research for the book Prioritizing Web Usability, we found that people read only about 10% of the text that they supposedly “agreed” to.)

The following chart shows the average time users spend on pages with different word counts:

Scatterplot: word count on the horizontal axis and the duration of average visits on the vertical axis.

Obviously, users tend to spend more time on pages with more information. However, the best-fit formula tells us that they spend only 4.4 seconds more for each additional 100 words.

Usually, I assume a reading speed of 200 words per minute (WPM), but because the users in this study are highly literate, I’ll go with 250 WPM. At that reading speed, users can read 18 words in 4.4 seconds. Thus, when you add verbiage to a page, you can assume that customers will read 18% of it.

Percentage of Text Read

This wasn’t an eyetracking study, so we don’t know precisely how users allocated their time on the Web pages. The formula in the chart above indicates that there is a fixed time of about 25 seconds, plus an additional 4.4 seconds per 100 words. (Of course, the numbers are not “fixed” in the sense that they’re always the same — these are averages.)

The formula seems to indicate that people spend some of their time understanding the page layout and navigation features, as well as looking at the images. Clearly, people don’t read during every single second of a page visit.

However, the total time spent on a page is definitely the upper limit of possible reading time. Thus, we can calculate the hypothetical maximum number of words users would be able to read, if they allocated their entire page-visit to reading.
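
Using the article’s rounded constants (roughly 25 seconds of fixed time, 4.4 seconds per additional 100 words, and a 250 WPM reading speed), this hypothetical maximum can be sketched in a few lines. Because the constants are rounded, exact thresholds will differ somewhat from the fitted curves in the charts:

```python
# Back-of-the-envelope model of maximum possible reading per visit,
# using the article's rounded constants (assumptions, not exact fits).
READ_WPM = 250            # reading speed assumed for highly literate users
FIXED_SECONDS = 25.0      # fixed per-visit overhead (layout, navigation)
SECONDS_PER_WORD = 4.4 / 100  # 4.4 extra seconds per 100 words

def visit_seconds(word_count):
    """Average visit duration for a page of the given length."""
    return FIXED_SECONDS + SECONDS_PER_WORD * word_count

def max_fraction_read(word_count):
    """Upper bound on the share of text readable in an average visit."""
    readable_words = visit_seconds(word_count) * READ_WPM / 60
    return min(1.0, readable_words / word_count)

# Each marginal 100 words buys 4.4 seconds, i.e. about 18 extra words read:
marginal_words = (4.4 / 60) * READ_WPM
print(round(marginal_words))  # prints 18

# The bound declines rapidly as pages get longer:
for n in (100, 300, 600, 1200):
    print(n, f"{max_fraction_read(n):.0%}")
```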

The following chart shows the maximum amount of text users could read during an average visit to pages with different word counts:

Scatterplot: word count on the horizontal axis and the maximum proportion of the text users have time to read on the vertical axis.

This is a very rapidly declining curve. On an average visit, users read half the information only on pages with 111 words or fewer.

In the full dataset, the average page view contained 593 words. So, on average, users will have time to read 28% of the words if they devote all of their time to reading. More realistically, users will read about 20% of the text on the average page.

As an example of word count on various pages, here’s the total for some popular Alertbox columns:

  • Blah-Blah Text: Keep, Cut, or Kill? (902 words)
  • This column (1,068 words)
  • Passive Voice Is Redeemed For Web Headings (1,079 words)
  • Change the Color of Visited Links (1,209 words)
  • Intranet Information Architecture (IA) (1,961 words)
  • Top-10 Application-Design Mistakes (3,572 words)

Clearly, the average visitor won’t make it too far through most of my articles. But I’ve consciously targeted a small, elite readership with a firm commitment to usability. If you target a broader audience or have sales cycles that are shorter than 5 years, you’d be wise to put your word count on a strict diet.


Want That Post to Go Popular? Here’s The Best and Worst Times to Post It

Connecticut software developer Jake Luciani has run 10,000 items from Del.icio.us, Digg, Reddit, and Mixx through the API of the popularity-ranking engine AideRSS to analyze the connection between popularity and timing. He determined the best days and times for a blog post to be submitted to those sites if its author wants it to receive the maximum number of votes, comments, and inbound links.

Luciani’s conclusion: between 1pm and 3pm PST (after lunch) or between 5pm and 7pm PST (after work) are the best times, and Thursday is the best day. The worst time to post? Between 3 and 5pm PST on the weekends; nobody cares. See the graphs below.

How the Measurement Works

In the graphs below, the factor measured is what AideRSS calls a PostRank of 6 or higher. AideRSS looks at all the items in an RSS feed and scores them (relative only to other items in the same feed) in terms of number of comments, number of Diggs, number of times saved to Del.icio.us, and number of inbound links from blogs. The highest percentile of posts in a feed have PostRanks closest to 10.

These graphs then measure which times and days see the largest numbers of submitted posts that end up being more popular than other posts in the same feed: that is, the most wildly popular and discussed items among all popular items at Digg, etc. It’s tracking the time that the post was submitted to the news site, not necessarily when it was posted on the blog. It’s a touch obtuse, and it would be nice to read a little more about the methodology employed, but the PostRank algorithm is relatively transparent and the conclusions are intuitive.

This is just one of many things we’ve written about using AideRSS for here at RWW. It’s a simple and very powerful tool that I at least use every single day.

Note that of course people blog for more reasons than just popularity, and popularity cannot be equated with quality! If you’re in a hurry, it is one way to look for quality, though. 🙂

With no further ado, knock yourself out wrapping your mind around these graphs. I almost did; remember that the times here are GMT, so if you’re on the West Coast of the US, I hope you just had a nice lunch, and remember to subtract 7 hours from this 24-hour clock to figure out these times for yourself.
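
For readers who would rather not do the clock arithmetic by hand, a sketch of the GMT-to-Pacific conversion with Python’s standard `zoneinfo` module (in spring 2008, Pacific time was on daylight saving, i.e. UTC-7, matching the subtract-7-hours rule of thumb; the date below is an arbitrary example):

```python
# Convert one of the graphs' GMT hours to US Pacific time.
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

gmt_peak = datetime(2008, 5, 1, 20, 0, tzinfo=timezone.utc)  # 20:00 GMT
pacific = gmt_peak.astimezone(ZoneInfo("America/Los_Angeles"))
print(pacific.strftime("%H:%M %Z"))  # prints 13:00 PDT
```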

Thanks for the creative and valuable work, Jake!



For more RSS fun times, check out the other entries on the AideRSS blog.


Right-Justified Navigation Menus Impede Scannability

We know from eyetracking studies that users tend to rapidly move their eyes down the left-hand side of lists. People read the rest of a list item only if something catches their eyes in these left-most one or two words.

The menu design guidelines are thus clear, at least for vertical menus:

  • Left-justify the menu, so that the user’s eyes can move in a straight line and don’t have to re-acquire the beginning of each new line.
  • Start each menu item with the one or two most information-carrying words.
  • Avoid using the same few words to start list items, because doing so makes them harder to scan.

Aligning a navigation menu with the right margin might look cool, but the resulting ragged left margin severely reduces the speed with which users can scan the menu and select their preferred options.

(Of course, the left-alignment guideline is for languages that read left-to-right. For languages that read in the opposite direction, the guideline is reversed: you should right-justify the menu. In either case, the point is to make it easier for users to scan down the side on which they start reading.)

Take a look at the following screenshots. I picked university sites for this illustration, but right-aligned navigation disease is found on business sites as well.

Screenshot of navigation menus from Indiana, Michigan, and Vanderbilt Universities.

Navigation menus from three university websites. Left to right: Indiana University, University of Michigan, and Vanderbilt University.

Note how hard it is to scan the menus. Paradoxically, Vanderbilt provides us with an example of correct alignment in the same screenshot: it’s much faster to scan the top menu than the bottom one.

To complicate matters, two of these screenshots also violate the guideline against USING ALL CAPS, which reduces legibility by about 10%. When you mix cases, the ascenders and descenders produce varied letterforms, while all caps produce boxy shapes. Users recognize words faster when you preserve traditional word shapes. (As an example, compare the word “Employment” in the left-hand menu with the word “EMPLOYMENT” in the middle menu.)

Finally, the contrast between the text and background colors in the middle menu is too low. Violating three legibility guidelines makes the middle menu particularly hard to read, especially for low-vision users. So, in this sampling, the University of Michigan takes the prize for worst menu design. (The school has a good human-computer interaction program, but apparently the site designers failed to consult the local experts.)
Menu alignment is admittedly a small point rather than a top high-ROI redesign priority. But it’s easy to get right — just don’t align to the right.

Updated Menu

Eight hours after posting this article, I got email from the University of Michigan design team saying that they had redesigned their navigation menu. Fast work.

Redesigned navigation menu from the University of Michigan.

U. Michigan’s old (left) and new (right) nav menus.

Jakob Nielsen’s Alertbox, April 28, 2008