Web Project Planning for Government and Non-profits
Posted 9/22/2008 6:39:09 AM by Mark Reichard
This post is the first of three that will deal with Web project planning and execution for non-profits and government agencies. In this post, we discuss how to approach planning a Web project. In the next post, we’ll look at how to measure the impact of a Web project for a non-profit or government agency. The final installment will look at site promotion and search engine optimization for non-profits and government agencies.
When iData carries out Web projects with our clients, we start by reviewing their overall strategy and determining how the Web site and other online initiatives can best support this strategy. As part of this planning exercise, we identify key goals of the organization and we jointly brainstorm conversions --- specific actions that visitors to the organization’s site will take that directly relate to the identified strategic goals.
This sounds simple (and it is), but you'd be surprised how often the focus on strategic goals is lost when Web project teams dive into the details of actually completing the project. Without a clear, prioritized list of goals and related conversions to guide project planning, project plans can easily fill with junk (design elements the Marketing VP thinks are cool, or features the tech team wants to build simply because it can) while the hard, frustrating work of figuring out how to actually accomplish the vital work of the organization online goes undone.
The tendency for focus on strategic goals to be lost seems to vary inversely with how involved the site is in e-commerce. For an Internet-only retailer (i.e. a pure e-commerce site), it's pretty obvious that the focus should be on:
- Getting visitors to the site, and
- Converting visitors to buyers (at a profitable price).
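These two fundamentals reduce to simple arithmetic, which is part of why e-commerce sites stay focused. As a minimal sketch (all figures below are invented for illustration, not client data):

```python
# Hypothetical example: the two e-commerce fundamentals as numbers.
visitors = 10_000          # visitors driven to the site in a period
orders = 250               # visitors who converted to buyers
revenue_per_order = 80.0   # average order value, dollars
cost_per_order = 65.0      # goods + fulfillment + acquisition, dollars

conversion_rate = orders / visitors
profit = orders * (revenue_per_order - cost_per_order)

print(f"Conversion rate: {conversion_rate:.1%}")    # 2.5%
print(f"Profit for the period: ${profit:,.0f}")     # $3,750
```

Non-profit and government sites rarely have numbers this clean, which is exactly the problem discussed below.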
It’s true that some online retailers focus on these fundamentals better than others, but the built-in feedback mechanism of the marketplace tends to correct this --- good, usable sites get traffic, make sales, and stay in business, while bad sites don’t.
For organizations where the feedback mechanism is not as direct, it is a lot harder to identify online conversions that support the key goals of the organization. I’ve been in a lot of meetings, particularly with non-profits and government agencies, where our clients have been frustrated by our attempts to reduce their project (which until the meeting had been thought of in very lofty, conceptual terms) to the fundamentals of:
- A clear, concise statement of the strategic goals of the organization, and
- A list of concrete things they want visitors to the organization’s site to do and how those actions support strategic goals.
These folks often object that their site is informational and that there is not really an online conversion related to what they are trying to accomplish. In other words, non-profits and government agencies are often primarily trying to inform, educate and advocate, not sell something. How are they supposed to measure and track changes in the perceptions of visitors to the site? How can they tell when someone has become more informed or educated?
These are tough and legitimate questions. It is often more comfortable for agencies and organizations to simply build a site that delivers a snappy new look and feel and some cool new online doodads rather than devote time and energy to soul-searching about the mission of the organization and how to track the Web site’s impact on that mission. But what is most comfortable and what is best for the organization are not necessarily the same. To do what’s best for the organization, these questions must be part of the planning process. The reality is that there are a variety of mechanisms (surveys, offline referral source tracking, creative use of contact forms) to measure the impact of a Web site, but implementing them sometimes requires a new level of discipline and the adoption of new tools, not only on the Web site but in the organization’s offline activities as well. The next post in this series will examine some of these tools.
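As one small illustration of the "creative use of contact forms" idea, a "How did you hear about us?" field can be tallied to connect site activity back to the organization's outreach. This is a hypothetical sketch (the field name and submissions are invented for the example), not a prescription:

```python
from collections import Counter

# Invented contact-form submissions; "referral_source" is a
# hypothetical field asking "How did you hear about us?"
submissions = [
    {"name": "A. Visitor", "referral_source": "search engine"},
    {"name": "B. Visitor", "referral_source": "community event"},
    {"name": "C. Visitor", "referral_source": "search engine"},
    {"name": "D. Visitor", "referral_source": "newsletter"},
]

# Count how often each referral source appears, most common first.
referral_counts = Counter(s["referral_source"] for s in submissions)
for source, count in referral_counts.most_common():
    print(f"{source}: {count}")
```

Even a crude tally like this gives an informational site a measurable signal where a sales report would otherwise exist.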
Low cost usability testing
Posted 9/4/2008 7:36:07 AM by Mark Reichard
When we plan Web projects with clients, we stress (and re-stress) the importance of usability testing, especially for sites with e-commerce or significant search functionality. Many usability tests for new or redesigned sites seem to go the same way --- the first tester starts to work their way through the site and quickly uncovers some (now that they have been pointed out) blindingly obvious and major usability issues. Sometimes these are things that the development team missed, but just as often they are the result of compromise solutions to heated disagreements between marketing, business people, designers and/or developers. On seeing the results, the Web site team often gets defensive and says "that's just one user, and besides, that guy is a doofus --- anyone else will see how it should work." Then the next tester comes in and finds the same issues. And the next, and so on.
Usability testing is a great process when it takes place early in the development/testing cycle, or even during the design phase. When no testing takes place until after the site is finished and ready to be launched, it's not as fun. Major issues are often still uncovered, but now there is a lot more disappointment and blame. There are lots of uncomfortable meetings to discuss how this could have happened. For this reason, it's really a good idea to make "test early and test often" your mantra when designing or re-designing a site.
Often, there is resistance to testing. One of the main reasons cited for not testing is the cost. Fortunately, www.usertesting.com has just taken away this argument. Now, for $19 per tester (and, by the way, most problems are uncovered by a handful of testers), you can create a usability test profile, submit it, and start getting results (with video, audio and written comments) within an hour or so.
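The "handful of testers" point has a well-known basis: the widely cited Nielsen/Landauer model estimates the share of usability problems found by n testers as 1 - (1 - p)^n, where p is the probability that a single tester hits a given problem. The sketch below uses p = 0.31, the figure from Nielsen's studies; treat that value as an assumption rather than a property of your own site.

```python
def problems_found(n_testers: int, p: float = 0.31) -> float:
    """Expected fraction of usability problems uncovered by n testers,
    per the Nielsen/Landauer model: 1 - (1 - p)**n."""
    return 1 - (1 - p) ** n_testers

# Diminishing returns: a handful of testers finds most problems.
for n in (1, 3, 5):
    print(f"{n} testers -> {problems_found(n):.0%} of problems")
```

Under these assumptions, five testers uncover roughly 84% of problems, which is why a $19-per-tester service with three to five testers is such a bargain relative to the cost of shipping a broken site.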
We completed a three-user test yesterday of some of our pay-per-click landing pages. Guess what? We immediately heard about an obvious usability issue. We've taken the affected page down while we re-work it. When we've finished, you can bet we'll run another test. I urge you to do the same on your site.