Given at Confab 2012 (Minneapolis, USA; May 16, 2012) and the NYC Content Strategy Meetup (September 27, 2012). Slides highly subject to change.
This document summarizes a presentation on semantic search given by Peter Mika, a senior research scientist at Yahoo Labs. It discusses the history and goals of semantic search, including improving query understanding and bridging the semantic gap. It also describes Yahoo's research into semantic search applications for web search, including enhancing search results, entity retrieval and recommendations, and question answering. Semantic representations of queries and documents are key to these applications.
This document provides an introduction to a semantic search tutorial given by Peter Mika and Tran Duc Thanh. The agenda covers semantic web data, including the RDF data model and publishing RDF data. It also covers query processing, ranking, result presentation, evaluation, and a question period. The document discusses why semantic search is needed to address poorly solved queries and enable novel search tasks using structured data and background knowledge.
Keynote presentation by Louis Rosenfeld at the Usability and Accessibility for the Web International Seminar; 26 July 2007, Monterrey, Nuevo Leon, Mexico
The document provides an introduction to an information architecture and design workshop. It includes an agenda for the workshop that covers background on information architecture, the design process, a project overview, user research including surveys and competitive reviews, developing personas, and design deliverables like site maps, wireframes and prototypes. Key aspects of information architecture are defined, including the combination of organization, labeling and navigation to facilitate accessing content. The history of the field and example methodologies for user research, competitive reviews and developing personas are also outlined.
This slide deck is from our SurfRay webinar on Search Analytics in SharePoint 2010. The presentation contains some search theory and an introduction to the search analytics reports in SharePoint 2010. It also covers simple techniques for improving search based on the analytics.
Talk at the 2nd Summer Workshop of the Center for Semantic Web Research (January 16, 2016, Santiago, Chile) about the construction of Yahoo's Knowledge Graph and associated research challenges.
Cyr, Chris. 2019. “From Discovery to Fulfillment: Improving the User Experience at Every Stage.” Presented at the Congress of Information Professionals, October 29, 2019, Montreal, Canada.
An introduction to what entities are in semantic search, based upon my presentation as a keynote speaker at the SMX East 2013 conference in NYC.
Using the tools of LinkedIn to improve your searches and the results you see. This will allow you to network with this tool more effectively.
How are mobile technology and search engine tuning evolving to meet the needs of users? Here we look at recent developments in research, implementations by search engines, and how those looking to reach users can adapt their strategies to take these next-level changes into account.
Presentation on Semantic Search given at WIMS 2011 (with some slides celebrating 10 years of Semantic Web).
The document discusses semantic search capabilities at Yahoo. It describes how Yahoo has developed techniques to extract structured data and metadata from webpages to power enhanced search results. This includes information extraction, data fusion, and curating knowledge in a graph. Yahoo uses this knowledge to better understand search queries and present relevant entities and attributes in results. Semantic search remains an active area of research.
Slides for the iDB summer school (Sapporo, Japan) http://db-event.jpn.org/idb2013/ Typically, Web mining approaches have focused on enhancing or learning about user seeking behavior, from query log analysis and click-through usage to employing the web graph structure for ranking, detecting spam, or identifying web page duplicates. Lately, there's a trend toward mining web content semantics and dynamics in order to enhance search capabilities, either by providing direct answers to users or by allowing for advanced interfaces or capabilities. In this tutorial we will look into different ways of mining textual information from Web archives, with a particular focus on how to extract and disambiguate entities, and how to put them to use in various search scenarios. Further, we will discuss how web dynamics affect information access and how to exploit them in a search context.
Best-in-class Content Analytics provides an advanced search and analytics platform that enables better decision-making from enterprise content, regardless of source or format. Using Content Analytics, retailers can understand the meaning and context of human language. By rapidly processing information, Content Analytics allows organizations to improve knowledge-driven search and surface new insights from enterprise content. IBM’s Content Analytics uses the same Natural Language Processing (NLP) technologies as IBM Watson DeepQA, the world’s most advanced question-answering machine. This webinar will explain how companies can apply IBM Content Analytics to customer email, call center logs, chats, correspondence, and other forms of text and “unstructured” content to obtain a more detailed and accurate understanding of customers, products, market segments, and competitors.
Search is a conversation: learn to listen to what your visitors are telling you by understanding their search behavior. In this presentation we'll cover information foraging, search analysis, and how to use them and other techniques to improve your content without having to be a statistician.
Get guidance through the gigantic sea of freely available Open Data and learn how it can empower your analysis of any kind of source. This webinar is a live demo of news and data analytics, based on rich links within big knowledge graphs. It will show you how to: Build ranking reports (e.g. for people and organisations) View topics linked implicitly (e.g. daughter companies, key personnel, products …) Draw trend lines Extend your analytics with additional data sources
This document discusses content marketing metrics and strategies for success, including focusing on high value keywords from keyword research, tracking keywords that bring traffic to content, and creating altruistic content inspired by keywords that provides value to users. Case studies are presented showing content that ranked for hundreds of keywords and brought thousands of daily visits without going viral. The document also discusses keywords for B2B content about sales pitches and strategies for graphics with SEO.
Ian will be speaking about attribution and content measurement, bringing agency experience and insights to shed light on the topic.
This presentation, taken from our "Lifting the Lid on Performance Content" event examines: - Defining demand across the key points in the user journey - The value and role of platforms across the key digital touchpoints with your brand - How to measure effectiveness when traditional metrics are no longer a true reflection of performance
Content Marketing and B2B strategy expert Rebecca Lieb shares some uncommon but powerful indicators of content marketing effectiveness.
This is an example case study showing what big data can mean for a small website that generates just 5,000 visits a day. It all depends on what we want to get from assets like website traffic. If we only measure the number of people who visited our site, then we do not need to worry about “big data”; we just have to count total visits (5,000 a day, 150,000 monthly). But with such a simple measure we know nothing about our visitors and customers, so it is pretty useless. On the following slides we present what a website owner can gain from advanced website analytics and why big data technologies are recommended.
This presentation discusses how search analytics can be used to improve search experiences on websites and intranets. It provides an agenda that covers logging search queries and metrics, analyzing search query data, and actions to take based on insights. Specific recommendations are given, such as fixing queries with 0 results, checking common queries, and using search analytics in combination with web analytics. The presentation aims to demonstrate how a small time investment in search analytics can significantly enhance search.
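The recommendations above (fix queries with zero results, check common queries) can be sketched in a few lines of Python; the log format and queries here are invented purely for illustration:

```python
from collections import Counter

# Hypothetical search log: (query, result_count) pairs as they might
# be exported from a site-search logging tool.
log = [
    ("vacation policy", 0),
    ("vacation policy", 0),
    ("holiday schedule", 12),
    ("expense report", 4),
    ("vacation policy", 0),
    ("expense report", 4),
]

# Most common queries: the small set worth reviewing first.
top_queries = Counter(q for q, _ in log).most_common(3)

# Queries that returned zero results: candidates for synonyms,
# best bets, or new content.
zero_result = Counter(q for q, n in log if n == 0)

print(top_queries)
print(zero_result.most_common())
```

Even this minimal tally surfaces the two insights the presentation calls out: which queries dominate, and which ones fail outright.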
So you're going to Confab Higher Ed. You're already pretty excited about content strategy. But your boss and colleagues? Not so much. To outsiders, content strategy is just another buzzword. And as more schools move to become "data-driven" organizations, talking about content can sound hopelessly qualitative. So don't say "content strategy": do it. This session will look at content strategy practices you can introduce to show even your most quantitatively-oriented colleagues the value of content strategy: content analytics, social media analytics, and user testing techniques. Rack up successes first—then start talking content strategy. • Introduce content strategy practices into your organization when your organization doesn't care about content strategy. • Use analytics to identify what needs improvement. • Learn how user-testing techniques can improve your content.
In 2013, Eaton drove over 26,000 marketing leads. Awesome, right? But how many of those leads became qualified sales opportunities? Only 5%... not so awesome. Eaton threw its old nurture program out the window and asked: What does the audience really care about, and how can the brand deliver it to them on their terms? Learn how to enhance your lead nurture programs to drive customers down the funnel to conversion, as well as: • Understanding persona and demand type • Creating and aligning content to business challenges and the buyer's journey • Tracking user behavior and digital consumption • Leveraging information to create an implicit and explicit lead score • Qualifying leads that show interest and surfacing hot leads to sales Eaton leveraged the power of Eloqua to transform the way it nurtures leads, qualifies them, and truly understands its buyers’ behavior. You will walk away excited to drive change in your organization’s management of leads for the better.
The document discusses content measurement and marketing ROI. It provides examples of how to track engagement, leads, sales, and profit from various marketing initiatives like emails, webinars and nurturing programs. Metrics include reach, engagement counts, marketing expense, cost per lead or engagement. The document also discusses attributing results to multi-touch marketing efforts and optimizing the marketing mix and tactics based on performance data.
1) The document provides tips for creating content that gets significantly more views and shares, known as "content unicorns", by focusing on promotion planning, distribution networks, and topic research rather than just writing content. 2) Effective promotion involves identifying influential promoters, leveraging trending topics, and optimizing content for sharing on social media. Paid promotion of content to influencers can help content go viral. 3) Thorough topic and competitor research upfront to identify resonant topics is more important than the actual writing, which should only take 20% of the time spent. The majority of content fails to get noticed or shared.
Larry Kim discusses trends in paid search marketing and advertising. He notes that paid search CPCs are rising while inventory is decreasing, forcing advertisers to be more selective. However, display advertising offers lower CPCs and more inventory. Identity-based targeting is also opening up new opportunities by allowing targeting of specific individuals. Google is removing some targeting options as well, so advertisers need to focus on strategic uses of new formats. Content remarketing is emerging as a way to combine content, social media, and paid advertising by selectively promoting top content to remarketed audiences.
The Huffington Post began in 2005 as a blog that aggregated political content but expanded into other topics over time. It was successful due to its technological advantages, such as a sophisticated aggregating system, search optimization strategies, and social networking. By 2011, it was attracting over 30 million monthly visits and was purchased by AOL for $315 million. The Huffington Post's search engine optimization strategy included techniques like using keywords, catchy headlines, and linking to drive traffic. It also implemented a social media optimization strategy using social networks, recommendations, and badges to create loyal readers and followers.
In this webinar we covered how to improve search with analytics using the Elastic Stack: ElasticSearch, Logstash, Kibana. Check out our upcoming events: www.mcplusa.com/events
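Search analytics over logs indexed in Elasticsearch typically boils down to a couple of aggregations. The request body below is a minimal sketch of that idea; the index fields (`query.keyword`, `result_count`) are assumptions, not the webinar's actual schema:

```python
# Hypothetical Elasticsearch request body (as a Python dict) that asks
# for the top search terms and, separately, the top zero-result terms.
top_terms_query = {
    "size": 0,  # no hits needed, aggregations only
    "aggs": {
        "top_search_terms": {
            "terms": {"field": "query.keyword", "size": 10}
        },
        "zero_results": {
            # restrict the sub-aggregation to searches that found nothing
            "filter": {"term": {"result_count": 0}},
            "aggs": {
                "terms": {
                    "terms": {"field": "query.keyword", "size": 10}
                }
            },
        },
    },
}
```

In practice this body would be sent to a search endpoint and the buckets visualized in Kibana; the structure itself is the point here.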
Content analytics uses semantic technologies to unlock business value from unstructured content like text. It answers important questions by automatically analyzing text to extract features, trends, quotes, and regulatory clauses. Content analytics offloads tedious manual tasks and provides high-level insights through APIs and a modular architecture that includes parsing, annotation, and classification capabilities.
Plugged is a strategic digital marketing agency focused on results. Our aim is to connect consumers with brands through content and performance marketing. Combining research and analysis with strategic consultancy, we help our clients harness their brand by developing an effective suite of marketing activity to maximise customer engagement and drive sales growth. From developing ECRM strategies to creating effective brand partnerships, we've helped each and every one of our clients to become better connected with their customers. Plus we've got the results to prove it.
Presented by Lou Rosenfeld and Rich Wiggins at the 2007 ASIS&T IA Summit, Las Vegas, Nevada, USA, March 24, 2007.
This document provides an overview of site search analytics and how they can be used. Site search analytics data, such as search queries and click-through rates, can be analyzed in several ways: to understand what content users are searching for, identify failures in navigation, learn how different audiences search differently, and predict future trends. This semantically rich data allows site owners to improve search functionality, organize content more logically, reduce jargon, and avoid potential problems with site changes or new features.
The document outlines 8 better practices for information architecture (IA) and findability. It discusses (1) diagnosing important user problems, (2) balancing qualitative and quantitative evidence, (3) designing for the long-term, (4) measuring user engagement beyond conversions, (5) supporting contextual navigation, (6) improving cross-silo search, (7) combining manual and automated design approaches, and (8) regularly tuning designs over time. The document provides examples and explanations for effectively implementing each of the 8 better practices of IA.
This document discusses search analytics and how analyzing search logs can provide insights to improve a website. Key points covered include analyzing common queries to identify opportunities to improve search results or content; using click-through data to determine the most relevant results; and learning about users' interests and information needs from their search terms and sessions. The document also provides examples of how various organizations have used search analytics to enhance search, navigation, metadata, and content.
When your colleagues say they want Google, they don’t mean the Google Search Appliance. They mean the Google Search user experience: pervasive, expedient and delivering the information that they need. Successful enterprise search does not start with the application features, is not part of the information architecture, does not come from a controlled vocabulary and does not emerge on its own from the developers. It requires enterprise-specific data mining, enterprise-specific user-centered design and fine tuning to turn “search sucks” into search success within the firewall. This presentation looks at action items, tools and deliverables for Discovery, Planning, Design and Post Launch phases of an enterprise search deployment.
Lou Rosenfeld's presentation on search analytics, given at WebContent.gov's Web Manager University, October 27, 2006.
The document discusses the evolution of search engines from basic keyword search to semantic search using knowledge graphs and structured data. It provides examples of how search engines like Google are now able to provide direct answers to queries by searching structured data rather than just documents. It emphasizes the importance of representing web content as structured data using schemas like schema.org to be discoverable in semantic search and knowledge graphs.
Slides for my full-day information architecture workshop. Will teach in Minneapolis, MN (November 12, 2012) and Toronto, ON (November 29, 2012) Details: http://rosenfeldmedia.com/workshops/
Introduction to Enterprise Search. A two-hour class to introduce Enterprise Search. It covers: - The problems enterprise search can solve - History of (web) search - How we search and find - Current state of Enterprise Search, plus stats - Technical concepts - Information quality - The feedback cycle - Five dimensions of Findability
The document summarizes a presentation about search and navigation requirements for a multimedia archive called iMMix. It discusses challenges in searching large audiovisual collections with diverse target groups. It outlines different user types and their information needs, and proposes a model with search competence and knowledge dimensions to support different search behaviors. Key points include connecting user demand to content supply through metadata, the challenges of search interfaces, and adding value through features like thesauri, keyframes and faceted search.
Presented by Lou Rosenfeld and Rich Wiggins to Seth Earley's Search Solutions Jumpstart Conference Series, November 3, 2006.
The document summarizes best practices for improving enterprise intranet search. It discusses how enterprise search differs from public web search due to more complex data, users, and information needs. It provides tips for understanding users and their search behavior through analytics, designing search interfaces that support users of all skill levels, and implementing an iterative process of testing, measuring, and improving search performance over time.
Slides from the Enterprise Search & Analytics Meetup @ Cisco Systems - http://www.meetup.com/Enterprise-Search-and-Analytics-Meetup/events/220742081/ Relevancy and Search Quality Analysis, by Mark David and Avi Rappoport. The Manifold Path to Search Quality: to achieve accurate search results, we must come to an understanding of the three pillars involved. 1. Understand your data 2. Understand your customers’ intent 3. Understand your search engine The first path passes through Data Analysis and Text Processing. The second passes through Query Processing, Log Analysis, and Result Presentation. Everything learned from those explorations feeds into the final path of Relevancy Ranking. Search quality is focused on end users finding what they want; technical relevance is sometimes irrelevant! Working with the short head (very frequent queries) offers the most return on investment for improving the search experience: tuning the results to, for example, emphasize recent documents or de-emphasize archive documents, detecting near-duplicates, exposing diverse results in ambiguous situations, using synonyms, and guiding search via best bets and auto-suggest. Long-tail analysis can reveal user intent by detecting patterns, discovering related terms, and identifying the most fruitful results from aggregated behavior. All this feeds back into regression testing, which provides reliable metrics to evaluate the changes. By merging these insights, you can improve overall search quality in a scalable and maintainable fashion.
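The short-head claim above (a handful of frequent queries accounts for most search traffic) is easy to verify from a log. This sketch uses an invented query distribution to show the cumulative-coverage calculation:

```python
from collections import Counter

# Invented query log: a few very frequent "short head" queries
# followed by a long tail of one-off searches.
queries = (
    ["reset password"] * 40
    + ["vpn setup"] * 25
    + ["payroll"] * 15
    + ["badge request", "parking", "gym hours", "cafeteria menu",
       "org chart", "travel policy", "printer driver"]
)

counts = Counter(queries)
total = len(queries)

# Cumulative share of all searches covered by the top-k queries:
# tuning just the head gives the biggest return on effort.
running = 0
for k, (query, n) in enumerate(counts.most_common(), start=1):
    running += n
    print(k, query, round(running / total, 2))
```

With this distribution the top three queries cover roughly 90% of all searches, which is why the slides recommend starting relevance work there.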
Presentation given to the University of Edinburgh web publishers community in January 2018 on the use of Keyword research tools for Search Engine Optimisation (SEO).
The document provides guidance on initial steps for developing a search application, including validating the need for full-text search, identifying ideal search results, considering clustering results, and producing requirements and choosing a technology. Some key recommendations include sketching out ideal results for sample queries, determining how results should be ordered and presented, and considering if and how results could be clustered. Determining ideal results and clustering options can help drive specific requirements and the selection of an appropriate technology.
This event was on May 2, 2017 at Wesley University, Ondo State, Nigeria. I trained the university's staff (academic and non-academic) on "Information Discovery and Search Strategies for Evidence-Based Research" in an information/digital literacy session.
I was invited to speak at OMCap Berlin 2014 about the close relationship between search engines and user experience with prescriptive guidance to gain higher rankings and more conversions.
The document discusses search engines and how they have evolved over time. It explains that early search engines ranked results based mainly on content, while modern engines also consider factors like page structure, popularity, and reputation. The document provides definitions of key search-related terms and outlines some of the main components and processes involved in how search engines work, such as crawling websites, indexing pages, and ranking results. It also discusses different types of search tools and how to choose the best one depending on your information needs.
Riley-O constructs a relevancy map of any document in real time with a Google-like interface and uses this map to search Google, Bing, and Yahoo. The benefit is helping users search: - when the user requires an entire document to satisfy the search requirement; - when they don't know the subject matter or keywords; - or when the user does not know what they are looking for.
The document discusses how information architecture (IA) can be used to reveal truths about ourselves, our communities, and the world. It describes exercises like the privilege walk that can increase self-awareness. It also discusses how IA can help balance power in communities and institutions by providing engagement and checks. The document poses questions about how else IA could reveal truths and concludes by asking what actions can be taken to improve ourselves, our communities, and the world.
The document discusses improving findability through better information architecture practices. It outlines 8 practices: 1) Diagnosing important problems by focusing on the tasks and needs of key audiences. 2) Balancing evidence from different data sources to gain true insights. 3) Advocating for long-term goals beyond short-term metrics. 4) Measuring user engagement beyond just conversions. 5) Supporting contextual navigation through content modeling. 6) Improving cross-silo search by focusing on revision patterns. 7) Combining design approaches effectively. 8) Tuning designs over time.
Presented at EuroIA17, September 2017; World IA Day NYC, February 2017; Interact, October 2016 (London, UK); earlier versions in 2014 at UXPA Boston (Boston, MA, USA); in 2013 at Interaction S.A. (Recife, Brasil), Intuit (Mountain View, CA, USA), Designers + Geeks (New York, USA); in 2012 at UX Russia (Moscow, Russia), UX Hong Kong (Hong Kong, China), WebVisions NYC (New York, NY, USA); in 2011 at the IA Summit (Denver, CO, USA), UX-LX (Lisbon, Portugal), Love at First Website (Portland, OR, USA). This is something of a successor to my talk "Marrying Web Analytics and User Experience" (http://is.gd/vK34zS)
The document discusses business models for publishing in the current landscape. It summarizes that in 2008, a publisher's model focused on paperback books and PDFs through direct sales and Amazon, but by 2010 involved many more formats and channels. It then says that instead of business models, publishers should have a "faith-based model" focusing on connecting expertise to audiences through various content types and services. A good publishing business is based on purveying expertise, not just books.
The document discusses how user experience (UX) principles can help publishers better engage readers and authors. It provides examples of how one publisher, Rosenfeld Media, applies UX practices like transparency, empathy, delight, generosity and engaging content to involve customers at different stages of the publishing process. While some efforts like book-in-progress sites had mixed results, the publisher's customer-centric approach generally helped improve products and services. The document concludes by discussing gaps in applying UX and how the publisher may expand its role in brokering UX expertise and services.
The document discusses site search analytics from both a bottom-up and top-down perspective. It describes analyzing search query data to understand common queries, failure rates, and metadata patterns. It also discusses defining search-related metrics and benchmarks to measure findability and performance from a top-down perspective. The key is putting both approaches together to understand what is being measured and why.
Keynote at: • JBoye Conference; Philadelphia, PA, USA (May 7, 2009) • IA Konferenz; Hamburg, Deutschland (May 16, 2009) • Delve NYC; Brooklyn, NY, USA (August 5, 2009)
Louis Rosenfeld gave a presentation arguing against frequent website redesigns. He notes that redesigns often take 6 months and involve overhauling millions of pages but provide little real benefit to users. Additionally, redesigns focus more on visual changes rather than usability improvements and often break internal links and workflows. Rosenfeld recommends focusing redesign efforts on specific problem areas rather than wholesale visual overhauls.