Typically, the effectiveness of a recommender system is judged by statistical accuracy metrics for the underlying algorithm, such as MAE (Mean Absolute Error). However, Kirsten Swearingen and Rashmi Sinha argue that interaction design is equally important in determining recommender system effectiveness. Analyzing music recommender systems, the researchers identify two separate models of recommender system success: e-commerce (users indicate they will buy the music) and usefulness (how well users are helped to explore their musical tastes). Eleven systems were tested (including the likes of Amazon, MovieCritic, MediaUnbound and CDNow), and the recommendations from six of those systems were compared to recommendations provided by the users' friends.

Study 1 involved 20 participants and Study 2 involved 12, all regular Internet users in the 19 to 44 age range. Users provided input to each system and received a set of recommendations. They were then asked to rate 10 recommendations from each system, evaluating aspects such as liking, action toward the item (buy/download/do nothing), transparency (did they understand why the system recommended that item), and familiarity (any previous experience with the item). Users were also asked to rate the system as a whole on a number of dimensions: usefulness, trustworthiness, and ease of use. At the end of the session, users were asked to name the system they preferred and explain their rationale.
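For reference, here is a minimal sketch (mine, not from the paper) of how MAE is computed over a set of predicted versus actual ratings; the sample numbers are made up.

```python
def mean_absolute_error(predicted, actual):
    """MAE: average absolute difference between predicted and actual ratings."""
    if not predicted or len(predicted) != len(actual):
        raise ValueError("need two equal-length, non-empty rating lists")
    return sum(abs(p - a) for p, a in zip(predicted, actual)) / len(predicted)

# Hypothetical example: the system predicts ratings on a 1-5 scale,
# and the user later rates the same items.
predicted = [4.5, 3.0, 2.0, 5.0]
actual    = [4.0, 3.5, 1.0, 5.0]
print(mean_absolute_error(predicted, actual))  # 0.5
```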
Findings
The goal of most recommender systems is to replace, or at least augment, the social recommendation process. Study results showed that users preferred recommendations made by their friends over those from online systems, but there was a high level of overall satisfaction with the systems: users found them useful for suggesting items they had not previously heard of. Users liked the breadth that online systems offer, giving them the opportunity to explore their tastes and learn about new items. Effective recommenders inspire trust, and users are willing to provide more input to the system in return for more accurate recommendations. Designers often try to balance ease of use against accuracy: of the participants studied, 67% did not think the 4-20 input ratings Amazon requires are sufficient to generate accurate recommendations, while MediaUnbound requires 34 input ratings and 75% thought that number was just right.

Users also commented on the rating process, noting that some mechanisms, such as genre selection, naming a favorite artist, and rating scales, are either too restrictive or redundant and boring. Users did like the rating bar/slider scale, since they could click anywhere along it to indicate their degree of liking, and they expressed interest in a rating process that is engaging and offers mixed questions and continuous feedback. Participants also liked receiving information about recommended items: knowing why an item was recommended, when it was released, album covers, and reviews by others are all useful. When RatingZone's Quick Picks was adjusted to offer more detail about recommendations, usefulness ratings increased by 20%!

In addition, users like, and prefer to buy, recommendations they are already familiar with; familiarity helps build trust. For example, participants said 72% of Amazon's, 60% of MediaUnbound's, and 45% of MoodLogic's recommendations were familiar, and there was a greater willingness to buy familiar than unfamiliar recommended items. This makes sense, since a familiar item is a less risky purchase decision. Users expressed a willingness to buy only 7% of the items recommended by MediaUnbound. While users show a preference for familiar items, they were frustrated by recommendations that were simply albums by the same artists they had input into the system: Amazon might remind users about a favorite song not heard recently, but it did not help them expand their tastes in new directions. MediaUnbound includes a slider bar for users to indicate how familiar the suggested music should be, and participants said they liked this feature.
Again, while the algorithm used to generate recommendations matters for effectiveness, interaction factors must be weighed equally. Depending on the definition of success used, e-commerce measures may be more important than usefulness, or vice versa. Of course, if the goal is to get the best of both worlds, a hybrid system that draws on the strengths of each approach is ideal!
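As a rough illustration of that closing point (my own sketch, not from the paper), a hybrid score might blend an algorithmic accuracy signal with interaction-oriented signals such as familiarity and transparency. The weights and field names below are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    item_id: str
    predicted_rating: float  # algorithmic score, e.g. from collaborative filtering (0-5)
    familiarity: float       # 0-1: how familiar the item likely is to this user
    explainable: bool        # can the system show *why* it was recommended?

def hybrid_score(c: Candidate, w_accuracy=0.6, w_familiarity=0.3, w_transparency=0.1):
    """Blend accuracy with interaction-design signals; weights are illustrative."""
    return (w_accuracy * (c.predicted_rating / 5.0)
            + w_familiarity * c.familiarity
            + w_transparency * (1.0 if c.explainable else 0.0))

candidates = [
    Candidate("album-a", 4.6, 0.1, False),
    Candidate("album-b", 4.1, 0.8, True),
]
ranked = sorted(candidates, key=hybrid_score, reverse=True)
print([c.item_id for c in ranked])  # familiarity and transparency can re-rank close calls
```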
Interaction Design for Recommender Systems (2002)
HUBRIS: Human Benchmarking of Recommender Systems (2002)
Kirsten Swearingen
Rashmi Sinha
Marti Hearst
Thursday, April 10, 2008
Thursday, April 3, 2008
Songkick Concert Recommender
So while album/track purchases may be down drastically, it seems artists may be recouping through live concerts. In 2007, concert ticket sales earned about $9 billion worldwide; that's pretty impressive!! Songkick is attempting to capitalize on this "big boom" by offering a centralized location to search for concert information (dates, locations and ticket prices). Without registering, visitors can input up to three artists/bands at a time, and the service will provide concert dates, if available, or recommend other artists/bands who will be touring in a city near you and who they assume you are likely to enjoy!
One of the features Songkick touts is its ability to look at your computer's music library (iTunes, Winamp and Windows Media Player) and build recommendations. Songkicker is a downloadable plug-in that automatically scans your entire music library and adds those artists into your Tour Tracker -- without you ever having to do anything! The Songkicker runs automatically in the background whenever you play music: every track you play is "Songkicked" to the service and appears in your "Recently Songkicked" section. Though it frees up a lot of time and reduces the amount of manual user input, I don't feel very comfortable with "tracking devices" on my PC....that's just my personal opinion though!
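I can't see Songkicker's internals, but a minimal sketch of the library-scanning idea (assuming the standard "iTunes Music Library.xml" export and a hypothetical path) might look something like this:

```python
import plistlib
from pathlib import Path

# Hypothetical path; the real plug-in locates the library itself.
LIBRARY_XML = Path.home() / "Music" / "iTunes" / "iTunes Music Library.xml"

def artists_in_library(xml_path):
    """Collect the distinct artist names listed in an iTunes library export."""
    with open(xml_path, "rb") as f:
        library = plistlib.load(f)
    tracks = library.get("Tracks", {})
    return sorted({t["Artist"] for t in tracks.values() if "Artist" in t})

if LIBRARY_XML.exists():
    for artist in artists_in_library(LIBRARY_XML):
        print(artist)  # these are the names that would feed the Tour Tracker
```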
So How Does it Work?
Good question!! The most I've been able to decipher is this: any mention of music on the web is a data point for recommendations. Their recommendation engine doesn't generate suggestions from the user base like Last.fm, or through careful analysis like my favorite toy, Pandora; instead it crawls websites like Wikipedia and music blogs to pick up related artists based on positive or negative associations between the bands. By combining anything about music on the internet with "expert" critical opinion from blogs and music publications, the technology is able to infer similarities between artists, compare them to your personal music taste, and recommend concerts that you might actually enjoy. I wish I could get a better grasp of what counts as positive or negative, or more detail about how "similarities" between artists are identified, but I'm unable to locate the specifics. The site just launched on March 18, 2008, so maybe more details will be released in the future!
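Since the details aren't published, here's my own guess at the general idea, sketched in Python: count how often pairs of artists are mentioned together in crawled pages and treat frequent co-mentions as a (crude) similarity signal. Everything here, including the sample "pages", is hypothetical.

```python
from collections import Counter
from itertools import combinations

def co_mention_counts(pages, known_artists):
    """Count how often each pair of known artists is mentioned on the same page."""
    pair_counts = Counter()
    for text in pages:
        mentioned = {a for a in known_artists if a.lower() in text.lower()}
        for pair in combinations(sorted(mentioned), 2):
            pair_counts[pair] += 1
    return pair_counts

def similar_artists(artist, pair_counts, top_n=3):
    """Rank other artists by how often they co-occur with the given artist."""
    scores = Counter()
    for (a, b), n in pair_counts.items():
        if artist == a:
            scores[b] += n
        elif artist == b:
            scores[a] += n
    return scores.most_common(top_n)

# Hypothetical crawled snippets standing in for Wikipedia/blog pages.
pages = [
    "Kanye West and Pharrell Williams traded verses on the remix...",
    "The Roots backed Jay-Z; Pharrell Williams produced two tracks.",
    "Kanye West cited Lauryn Hill as an influence.",
]
artists = ["Kanye West", "Pharrell Williams", "Jay-Z", "The Roots", "Lauryn Hill"]
counts = co_mention_counts(pages, artists)
print(similar_artists("Pharrell Williams", counts))
```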
Supposedly the real payoff is buying tickets: Songkick helps you find the cheapest tickets for these shows by providing direct links to ticket inventory from 16 vendors across the U.S. and U.K.
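The price-comparison part is conceptually simple; a toy sketch with made-up vendors and prices:

```python
def cheapest_listing(listings):
    """Pick the lowest-priced ticket listing; each entry is (vendor, price_usd, url)."""
    return min(listings, key=lambda l: l[1])

# Hypothetical vendor data -- Songkick aggregates something like this from its ~16 vendors.
listings = [
    ("VendorA", 62.50, "https://example.com/a"),
    ("VendorB", 55.00, "https://example.com/b"),
    ("VendorC", 71.25, "https://example.com/c"),
]
print(cheapest_listing(listings))  # ('VendorB', 55.0, 'https://example.com/b')
```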
As a closing point: for artist and band bloggers, Songkick offers a way to make extra money with a widget that identifies touring bands mentioned on their blogs and inserts tour information and ticket vendor links that can be tracked for referrals.
Did I Test It Out?
Of course I did! The first three artists I input were acts I know are touring! I input Jay-Z and Mary J. Blige (they will be in Greensboro, NC on Saturday) and then Kanye West (he's coming to Charlotte on May 2). Jay-Z and Mary didn't return any results, not even the concert in Greensboro, but Kanye did appear. I switched the artists and added three more: Lauryn Hill, The Roots and Pharrell Williams. Both The Roots and Pharrell are coming to Charlotte, but I didn't get any results for them either. So I decided to sign up (username/password: prsrecommend) and played around a little more. I selected John Mayer, Maroon 5 and Avril Lavigne. I didn't know any of these artists were touring and actually found their dates in Charlotte, so that was pretty neat. I am still extremely interested in knowing how similar artists are derived, because they did seem to do a decent job of suggesting artists that I know and enjoy. Perhaps it's based on popularity; if they're scanning the web, I would bet money that's how it's determined.....maybe I'll continue investigating!!
Songkick: Live Music Lovers Will Love This
Songkick's Concert Recommendation Engine: It Goes to 11