The surge of search-only web interfaces in public sector websites in 2011 is raising user experience questions about search vs. browse as the core content delivery metaphor. However, I’m interested in the implications of a web channel powered by content authors, who are now required to be metadata experts and content strategists.
In 2011, we’ve already seen three major sites launch, or launch in preview, that use the new Microsoft-powered, search-only navigation metaphor.
Because we work with many municipal and regional government clients, we’ve been flooded with questions and opinions from clients about whether this is the “new way” for public sector web delivery. Our good colleague Gene Smith at nForm has already written a very thoughtful post about the UX implications of this type of interface.
Everyone’s first concern appears to be whether search-only is a suitable interface for effective service delivery. For starters, it appears that July 2011 traffic on Utah.gov is up 33% from July 2010. Whether this is just gawkers checking out the new site, or a correction after a previous site whose poor design and content suppressed traffic, is hard to say, but a 33% increase is a great year-over-year gain for a public sector site.
I made a friendly wager with a client that navigation would start to appear on most of these websites within 12 months. But my interest lies less in the usability issue of presenting users with nothing but a search box; Gene has covered that effectively in his article. Instead, my question is about what is happening behind the scenes, and what it will take to sustain this interface over time.
Note: the following is rife with my own personal assumptions about how these sites are actually running.
Is the magic coming from Microsoft or Google?
On the front end of these websites, we see a search box with a familiar “Powered by Google” logo, which gives the appearance that the real power of this interface comes from Google. Putting a Google Search Appliance in front of a content repository can be a joyfully straightforward process. I do see some very sophisticated search configuration and federated search on the Calgary site. What’s interesting to me, however, is that Microsoft SharePoint is acting as the content repository in the background.
The site is not using Microsoft’s FAST search. We’ve heard rumours about FAST not meeting the performance demands of web indexing. However, there’s no way to know whether these projects started as a SharePoint repository with FAST search, with Google put in place later to solve performance problems, or whether it was Microsoft and Google from day one. Either way, Google has ended up in front.
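If the architecture really is a Google Search Appliance indexing a SharePoint repository, one plausible integration point is the GSA’s XML feed protocol, which pushes document URLs and metadata to the appliance. A minimal sketch follows; the gsafeed/header/group/record structure is the appliance’s documented feed format, but the datasource name, URLs and metadata values here are entirely hypothetical.

```python
# Sketch: pushing SharePoint document URLs and metadata to a Google
# Search Appliance via its XML feed protocol. The feed structure
# (gsafeed/header/group/record) is GSA's documented format; the
# datasource name, URLs, and metadata values are hypothetical.
import xml.etree.ElementTree as ET

def build_feed(records):
    feed = ET.Element("gsafeed")
    header = ET.SubElement(feed, "header")
    ET.SubElement(header, "datasource").text = "sharepoint_docs"
    ET.SubElement(header, "feedtype").text = "metadata-and-url"
    group = ET.SubElement(feed, "group")
    for rec in records:
        record = ET.SubElement(group, "record",
                               url=rec["url"], action="add",
                               mimetype=rec["mimetype"])
        metadata = ET.SubElement(record, "metadata")
        for name, content in rec["metadata"].items():
            ET.SubElement(metadata, "meta", name=name, content=content)
    return ET.tostring(feed, encoding="unicode")

feed_xml = build_feed([{
    "url": "http://intranet.example.gov/docs/permit-guide.docx",
    "mimetype": "application/msword",
    "metadata": {"keywords": "permits, building", "department": "Planning"},
}])
print(feed_xml)
```

A nightly job pushing a feed like this from the DM layer would explain how Word documents, rather than web pages, end up as first-class search results.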
The really juicy fact for me is that this is an attempt to create a true Enterprise Content Management (ECM) approach to the web. Content for these sites is not being managed in a Web Content Management (WCM) layer, but in a SharePoint Document Management (DM) layer. Most projects distinguish between documents (files managed in Word) and web content (HTML crafted by content authors specifically for a web interface).
This means that subject matter experts and content authors are likely using Microsoft Office products as the editing interface, and using SharePoint collaboration for workflow and editorial processes. Content authors do not work with web content. They work in a word processor.
What is a site visitor’s experience like?
Microsoft has been working hard to sell SharePoint as a WCM tool since they abandoned their CMS 2002 product and dedicated their resources to SharePoint. The 2010 release of the product was supposed to finally bring a mature WCM layer to SharePoint. It’s curious to see, however, that there doesn’t seem to be an actual WCM tool in these initial site releases. Instead, there appears to be a document repository, accessible by search alone. As a visitor moves into a results page, additional methods of moving through content are presented.
The default search results screen shows several categories of results:
- Taxonomy-driven search result set. Lists terms the visitor might use to find related information.
- Default Google search results based on document content. Displays title, descriptive text and URL, but with additional taxonomy-driven categories shown below that apply a more detailed filter to the content.
- Federated search results that list relevant services. This appears to be from a repository of web applications that provide online transactional services.
- Geo-tagged results that include a graphical map tile and a text address. Displays search results for relevant location-based information.
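The four result categories above could be merged into a single response structure roughly like the following. To be clear, these field names and sample values are my own assumptions for illustration, not the actual schema behind any of these sites.

```python
# Sketch: a combined search response carrying the four result
# categories described above. Field names and sample data are
# hypothetical, not taken from the actual sites.
search_response = {
    "query": "building permit",
    "taxonomy_terms": ["Permits & Licences", "Planning & Development"],
    "document_results": [{
        "title": "How to apply for a building permit",
        "url": "http://www.example.gov/docs/permit-guide",
        "snippet": "Steps and fees for residential building permits...",
        "categories": ["Residential", "Forms"],
    }],
    "service_results": [{
        "name": "Online permit application",
        "url": "http://apps.example.gov/permits/apply",
    }],
    "geo_results": [{
        "name": "Planning Services counter",
        "address": "123 Main Street",
        "map_tile": "http://maps.example.gov/tile?loc=main-street",
    }],
}

# A results page would render each category as its own block:
for category in ("taxonomy_terms", "document_results",
                 "service_results", "geo_results"):
    print(category, len(search_response[category]))
```

Whatever the real implementation looks like, each category implies a separate upstream source: a term store, the Google index, a service registry, and a geo-tagged dataset.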
What is a content author’s experience like?
The implications of this interface are that for every document/page in the site structure, the following actions have occurred:
- Keywords/categories have been applied as metadata
- Geo-tagging information has been applied
- The document has been assigned a category in a formally defined taxonomy
- A secondary categorization scheme has been applied to identify which “Labeled Document” links should appear in search results (this appears to define whether it is an HTML, PDF, PowerPoint, Visio, etc. document)
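If the search interface depends on every document carrying all four of these attributes, a publish step would presumably have to validate them. A minimal sketch of such a check, with field names I’ve invented to stand in for the four requirements listed above:

```python
# Sketch: validating that a document carries all of the metadata the
# search-only interface depends on, before it is pushed to the index.
# Field names and sample values are hypothetical labels for the four
# requirements listed above.
REQUIRED_FIELDS = ("keywords", "geo", "taxonomy_path", "document_label")

def missing_metadata(doc):
    """Return the names of required metadata fields the document lacks."""
    return [f for f in REQUIRED_FIELDS if not doc.get(f)]

doc = {
    "title": "Snow removal schedule",
    "keywords": ["snow", "roads", "winter"],
    "geo": {"lat": 51.05, "lon": -114.07},
    "taxonomy_path": "Services/Roads/Snow Removal",
    # "document_label" (HTML, PDF, PowerPoint, ...) not yet applied
}
print(missing_metadata(doc))  # → ['document_label']
```

Whether this gatekeeping is automated or left to editorial review is exactly the sustainability question: every gap in the metadata is a document that effectively disappears from the site.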
In addition, it must be assumed that every content author has also been trained on how to prepare good web content, and that each document has gone through an editorial review process.
Because there is no “website” in the traditional sense, most of the content authoring would most likely happen in a Microsoft Office application (Word, Excel, Visio, PowerPoint, Publisher). This adds a further layer of complexity, since most modern WCM products boast about ease of use, with in-context authoring as their evidence.
One of the areas we have been exploring at Yellow Pencil is what UX means for content authors. Every great website is driven by great content, and great content only comes from well-trained and enabled authors. Web content strategists will need to rethink their practice for authors working in native Office applications and applying metadata and taxonomy externally.
Who is left out?
One of my biggest areas of curiosity is the marketing/communications role in this new model. The new chain of command could be:
- Subject matter expert (SME) creates new content or updates existing content
- Editor or manager reviews content for accuracy
- Web content specialist reviews content for compliance with corporate style guidelines and best practices, either through a formal workflow on a scheduled review, or as the result of analyzing logs or stats
- When content needs to be reviewed, the SME finds/opens the document and starts the process again
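The chain of command above amounts to a simple review loop. A minimal sketch of it as a state machine, with state and transition names I’ve invented for illustration rather than taken from any actual SharePoint workflow configuration:

```python
# Sketch: the review loop above as a tiny state machine. State and
# action names are invented for illustration only.
TRANSITIONS = {
    "draft": {"submit": "accuracy_review"},          # SME writes/updates
    "accuracy_review": {"approve": "style_review",   # editor/manager
                        "reject": "draft"},
    "style_review": {"approve": "published",         # web content specialist
                     "reject": "draft"},
    "published": {"flag_for_review": "draft"},       # scheduled review / stats
}

def advance(state, action):
    try:
        return TRANSITIONS[state][action]
    except KeyError:
        raise ValueError(f"cannot {action!r} from {state!r}")

state = "draft"
for action in ("submit", "approve", "approve"):
    state = advance(state, action)
print(state)  # → published
```

Note that the loop closes back on itself: “published” content flagged for review returns to the SME, with no marketing/communications state anywhere in the cycle.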
Aside from curating the brand standards and style guide, a traditional marketing/communications department does not appear in this workflow or governance model in an active way. I’m looking forward to hearing from marketing/communications people on how this new model impacts their mandate and their engagement with the web channel.
And whither the social web?
For most public sector organizations, the web is a core component of campaign-style communications. Removing the organization’s website as a platform for online marketing activity leaves search marketing and social marketing as the only meaningful platforms for campaign activity.
An example of campaign-style activity could be anything from a “Remember to Vote” project to a green energy promotion or a traffic safety public service message. Every public sector organization does these today as part of standard business operation.
This forces the hand of most public sector communications departments, which have been doing a slow dance with social media policy over the past 3 years. If they want to use an online engagement channel, it can no longer be the organization’s website – it must be another online platform.
This also means that marketing content lives outside the organization’s control in terms of data sovereignty and records management. Social content belongs to the platform where it is deployed: the organization does not hold exclusive title to its marketing content, and the creator does not necessarily decide when it expires, how long it persists, or under what terms it can be accessed.
Another implication is that online marketing content exists only on proprietary platforms and not on the public web.
One alternative to this social-only requirement for campaign content is the use of stand-alone microsites, but these require a WCM product and a whole separate layer of business process, cost and governance apart from the core web channel. That said, even with a traditional website model, this already happens in many organizations.
Overall this model of web content delivery makes a number of assumptions that change what it means for an organization to own their web channel:
- A “webpage” is now a “document”. The web channel is now managed as a system, not a communications activity, so it either lives in IT or in a dedicated Web Department.
- A “web content author” is now a “subject matter expert”. All authors must understand not just how to write web content, but how to manage documents, apply taxonomy and manage metadata.
- No one “owns” the web communications strategy. A website tells a story and builds a relationship with your audience. With traditional communications changing its relationship to the web channel in this new model, each organization needs to rethink who the curator is of the story the organization tells.
All in all, I’m extremely curious about this approach. I want to see what visitor adoption is like just as much as I want to see how it impacts content authors. And I applaud these organizations who are taking a calculated risk with their web channel to try to provide the best possible level of service.
Our firm works with traditional WCM products, so I’m likely biased. We are not a SharePoint implementer. We work with Google Search on behalf of many customers but are not otherwise affiliated with Google.