TY - GEN
T1 - A crowd-powered socially embedded search engine
AU - Jeong, Jin Woo
AU - Morris, Meredith Ringel
AU - Teevan, Jaime
AU - Liebling, Dan
PY - 2013
Y1 - 2013
N2 - People have always asked questions of their friends, but now, with social media, they can broadcast their questions to their entire social network. In this paper we study the replies received via Twitter question asking, and use what we learn to create a system that augments naturally occurring "friendsourced" answers with crowdsourced answers. By analyzing thousands of public Twitter questions and answers, we build a picture of which questions receive answers and the content of their answers. Because many questions seek subjective responses but go unanswered, we use crowdsourcing to augment the Twitter question asking experience. We deploy a system that uses the crowd to identify question tweets, create candidate replies, and vote on the best reply from among different crowd- and friend-generated answers. We find that crowdsourced answers are similar in nature and quality to friendsourced answers, and that almost a third of all question askers provided unsolicited positive feedback upon receiving answers from this novel information agent.
AB - People have always asked questions of their friends, but now, with social media, they can broadcast their questions to their entire social network. In this paper we study the replies received via Twitter question asking, and use what we learn to create a system that augments naturally occurring "friendsourced" answers with crowdsourced answers. By analyzing thousands of public Twitter questions and answers, we build a picture of which questions receive answers and the content of their answers. Because many questions seek subjective responses but go unanswered, we use crowdsourcing to augment the Twitter question asking experience. We deploy a system that uses the crowd to identify question tweets, create candidate replies, and vote on the best reply from among different crowd- and friend-generated answers. We find that crowdsourced answers are similar in nature and quality to friendsourced answers, and that almost a third of all question askers provided unsolicited positive feedback upon receiving answers from this novel information agent.
UR - https://www.scopus.com/pages/publications/84900421450
M3 - Conference contribution
AN - SCOPUS:84900421450
SN - 9781577356103
T3 - Proceedings of the 7th International AAAI Conference on Weblogs and Social Media, ICWSM 2013
SP - 263
EP - 272
BT - Proceedings of the 7th International AAAI Conference on Weblogs and Social Media, ICWSM 2013
PB - Association for the Advancement of Artificial Intelligence
T2 - 7th International AAAI Conference on Weblogs and Social Media, ICWSM 2013
Y2 - 8 July 2013 through 11 July 2013
ER -