<h2><a href="http://growingventuresolutions.com/blog/selecting-conference-session-proposals-popular-vote-selection-committee">Selecting conference session proposals: popular vote? selection committee?</a></h2> <p>I was on the "Ecosystem" track session selection team for <a href="http://london2011.drupal.org/">Drupalcon London</a>, which motivated me to finally do some more analysis of the traditional pre-conference session voting. Specifically, I wanted to compare the votes a session receives against the evaluations submitted after the conference.</p> <p><em>By the way, if you have the opportunity, I highly suggest going to a <a href="http://drupalcon.org/">Drupalcon</a>; they are always great events.</em></p> <p>Here are some conclusions based on analysis of the evaluation and voting data from DrupalCon Chicago:</p> <ul> <li>Voting was not a useful predictor of high-quality sessions!</li> <li>The pre-selected sessions did not fare better in terms of evaluation than the other sessions (though they may have served a secondary goal of getting attendees to sign up earlier).</li> <li>We should re-evaluate how we do panels. They tend to get lower scores in the evaluation.</li> <li>The number of evaluations submitted increased 10% compared to San Francisco, which seems great (Larry Garfield theorizes it is related to the mobile app; I think there are a lot of factors involved).</li> </ul> <h3>Is voting a good way to judge conference session submissions?</h3> <p>Drupalcon has historically used a voting and committee system for session selection that is pretty common. This is also the default workflow for sites based on the <a href="http://usecod.com/">Conference Organizing Distribution</a>.</p> <p>Typical system:</p> <ol> <li>Users register on the site</li> <li>They propose sessions (and usually there is a session submission cutoff date before voting)</li> <li>Voting begins: people (sometimes registered users, sometimes limited to attendees) can vote on their favorite sessions</li> <li>During steps 2 and 3, a session selection committee is encouraging submissions and contacting the session proposers to improve their session descriptions</li> <li>Selection begins: voting closes and the session selection committee does their best to choose the right sessions based on factors like appropriateness of content to the audience, the number of votes, their knowledge of the presenter's skill, and diversity of ideas</li> <li>???</li> <li>Profit</li> </ol> <p>Drupalcon Chicago (the event I'm basing this analysis on) made a few changes to that model. The organizers pre-selected some sessions from people they knew would submit sessions and get accepted (see their <a href="http://chicago2011.drupal.org/news/submit-your-session-proposal-drupalcon-chicago-today">blog post on that</a> and the <a href="http://chicago2011.drupal.org/speaker-faq">FAQ</a>).
This allows us to see whether pre-selecting actually brought in sessions that were more valuable to attendees, which seems like a decent proxy for whether the committee's choices were right.</p> <p>The pre-conference voting had 5 stars with the following labels:</p> <ul> <li>I have no interest in this session</li> <li>I would probably not attend this session</li> <li>I might attend this session</li> <li>I would probably attend this session</li> <li>I totally want to see this session</li> </ul> <p>The post-session evaluations had 5 stars with the following criteria:</p> <ul> <li>Overall evaluation of this session</li> <li>Speaker's ability to control discussions and keep session moving</li> <li>Speaker's knowledge of topic</li> <li>Speaker's presentation skills</li> <li>Content of speaker's slides/visual aids</li> </ul> <p>I've previously looked at the percentage of the attendee population that actually gets to vote and the distribution of votes (1 to 5) to see whether voting was actually used in a meaningful way in Chicago (that analysis is on <a href="http://groups.drupal.org/node/106579">groups.drupal.org</a>). Given that the votes in Chicago were distributed across the entire 1 to 5 spectrum, I believe a 5-star system is useful as a rating on a session. However, I don't think the resulting value is directly useful to the session selection committee when they choose individual sessions (more on that later).</p> <p>My analysis method was to create a spreadsheet with the average and count of votes on sessions from the pre-conference period, when votes were used to help determine which sessions to include. Then I added in the post-conference evaluations (from 1 to 5 stars), which covered the five criteria above.</p> <p>I graphed the pre-conference votes against the post-conference evaluations and used the "<a href="http://wiki.services.openoffice.org/wiki/Documentation/How_Tos/Calc:_CORREL_function">CORREL</a>" function to see how correlated the data is. I expected a straight-line relationship: the higher the average vote, the higher the post-conference evaluation scores. In fact, there was basically no correlation between the pre-conference voting and the post-session evaluations. Here is a table showing each evaluation axis (i.e. one of the five criteria above) and the correlation between that axis and the pre-conference session votes.</p> <table> <thead> <tr> <th>Axis</th> <th>Correlation (r)</th> </tr> </thead> <tbody> <tr> <td>overall</td> <td>-0.006</td> </tr> <tr> <td>control</td> <td>0.053</td> </tr> <tr> <td>knowledge</td> <td>0.091</td> </tr> <tr> <td>presentation</td> <td>0.005</td> </tr> <tr> <td>visuals</td> <td>-0.022</td> </tr> </tbody> </table> <p>As a graph, the overall data looks like:</p> <p><a href="http://growingventuresolutions.com/gvsfiles/distribution_overall_session_score_votes.png"><img src="http://growingventuresolutions.com/gvsfiles/distribution_overall_session_score_votes_thumb.png" /></a><br /> <em>This data is not correlated. Just look at it, spaghetti soup!</em></p> <p>For comparison, I graphed it alongside randomly generated data with a correlation of 0.95. As you can see, the pre-conference votes are not at all correlated with the post-conference evaluations.</p> <p>It isn't surprising that <strong>votes don't correlate to session quality</strong>. Voting tends to be done by a minority of event attendees who are "insiders" to the event. They are likely to be swayed by friendships, employers, and social media campaigns.</p>
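<p><em>As an aside, if you want to reproduce the comparison without a spreadsheet, here is a minimal sketch in Python. The CSV layout and column names are hypothetical; the original analysis used OpenOffice's CORREL function.</em></p> <pre><code>
# Hypothetical layout: one row per session, with the average pre-conference
# vote and the average post-conference score on each evaluation axis.
import csv
from statistics import correlation  # Pearson's r, like CORREL; Python 3.10+

with open("chicago_sessions.csv", newline="") as f:
    rows = list(csv.DictReader(f))

pre_votes = [float(r["avg_pre_vote"]) for r in rows]

# Compare the pre-conference vote average against each evaluation axis.
for axis in ("overall", "control", "knowledge", "presentation", "visuals"):
    scores = [float(r[axis]) for r in rows]
    print(f"{axis}: r = {correlation(pre_votes, scores):+.3f}")
</code></pre>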
<h3>Comparing pre-selected sessions to regular sessions</h3> <p>I also took an average of the evaluation scores across the non-pre-selected sessions and the pre-selected sessions. The average overall evaluation score for non-pre-selected sessions was 80.9 vs. 80.7 for pre-selected sessions. The other axes show similar results, except for knowledge and visuals, though it's not clear whether those differences are statistically significant.</p> <table> <thead> <tr> <th>Axis</th> <th>Pre-selected average evaluation score</th> <th>Non-pre-selected average evaluation score</th> </tr> </thead> <tbody> <tr> <td>overall</td> <td>81</td> <td>81</td> </tr> <tr> <td>control</td> <td>83</td> <td>83</td> </tr> <tr> <td>knowledge</td> <td>93</td> <td>91</td> </tr> <tr> <td>presentation</td> <td>80</td> <td>81</td> </tr> <tr> <td>visuals</td> <td>78</td> <td>75</td> </tr> </tbody> </table> <p>So we can see that regularly selected sessions got very similar scores to the pre-selected ones. I'm not suggesting that pre-selecting is flawed (it didn't produce lower results, anyway), but I do think we should carefully consider who we pre-select.</p> <p>The third bit of analysis I did was to look at the overall score and the number of presenters for each session. Here's the average number of presenters per decile, where decile 1 is the 9 highest-ranked sessions. There's a pretty clear trend, from just over 1 presenter for the top-rated sessions to 2.5 for the bottom-rated ones.</p> <table> <thead> <tr> <th>Decile</th> <th>Average # of presenters</th> </tr> </thead> <tbody> <tr> <td>1</td> <td>1.11</td> </tr> <tr> <td>2</td> <td>1.67</td> </tr> <tr> <td>3</td> <td>1.89</td> </tr> <tr> <td>4</td> <td>1.44</td> </tr> <tr> <td>5</td> <td>1.67</td> </tr> <tr> <td>6</td> <td>2.33</td> </tr> <tr> <td>7</td> <td>2.00</td> </tr> <tr> <td>8</td> <td>2.00</td> </tr> <tr> <td>9</td> <td>2.44</td> </tr> <tr> <td>10</td> <td>2.50</td> </tr> </tbody> </table> <p>I believe there are two big reasons for this. First, panel presentations are rarely done in a well-coordinated manner, and the panel members usually don't take time to practice as a group (our distributed community makes that hard). Second, Drupalcon session selection committees often suggest that similar topics be merged into one panel. I think we should <strong>stop merging independent presenters.</strong> The result is often that people who may not have the same story to tell end up cramming 45 minutes of material into one-half or one-third of that time.</p>
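<p><em>For the curious, here is a sketch of the decile calculation, assuming a hypothetical list of (overall score, presenter count) pairs; the real numbers came from the Chicago evaluation spreadsheet.</em></p> <pre><code>
# Sort sessions by overall evaluation score, split into ten groups,
# and average the presenter counts in each group.
sessions = [(92, 1), (88, 1), (85, 2), (71, 3)]  # hypothetical; ~90 sessions in reality

ranked = sorted(sessions, key=lambda s: s[0], reverse=True)
decile_size = max(len(ranked) // 10, 1)

for d in range(10):
    chunk = ranked[d * decile_size:(d + 1) * decile_size]
    if not chunk:
        break
    avg = sum(count for _, count in chunk) / len(chunk)
    print(f"decile {d + 1}: average presenters = {avg:.2f}")
</code></pre>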
<h3>What can we do to improve session quality and session selection?</h3> <p>One of the great tools for session selection committee members at Drupalcon London was the availability of evaluation data from previous conferences. If a proposed session got a lot of votes (perhaps due to a campaign on Twitter or within a large company) but the presenter had horrible evaluations from a previous conference, then the committee member has an easy job: just say "no thanks".</p> <p>The only problem with using previous conference evaluations to judge sessions is that it can lead to stagnation among the presenters. Part of the value of a conference is in hearing new ideas. That risk can be reduced by having free-for-all BOF sessions, but I think in the Drupal world part of the solution is to use Drupalcamps as a ramp into Drupalcon: presenters should give their session at a camp and mention that (along with any evaluations or video from the camp) in their session proposal. With approval from presenters, Drupalcamp Colorado <a href="http://2011.drupalcampcolorado.org/news/drupalcamp-colorado-2011-wrapup">published our evaluations</a> - we hope this helps other camps and that they will do the same. It's no surprise that <a href="http://drupal.org/node/930072">some</a> <a href="http://drupal.org/node/1223800">feature</a> <a href="http://drupal.org/node/1176604">requests</a> for COD will make the process of gathering this information and getting it to the right people much easier.</p> <p>See also a great discussion on groups.drupal.org: <a href="http://groups.drupal.org/node/151174">On popular voting and merit-based selection of sessions</a>.</p> <h3>What else can improve session quality?</h3> <p>So far I've talked about identifying good sessions, but I think the problem is more complex. It's also about encouraging and inspiring the presenters to do great work on their sessions. We can tell them "please practice it 10 times," but nobody will do that if they aren't motivated. Sending presenters reminder emails like "we expect 3,000 attendees including key decision makers from companies like Humongo Inc." could help. There's also the possibility of compensating presenters; Drupalcon Chicago gave a mix of cash and non-cash benefits (massage chair, faster check-in line).</p> <p>Scott Berkun gives some tips on how to improve the presenter experience at a conference in <a href="http://www.scottberkun.com/blog/2011/an-open-letter-to-conference-organizers/">An open letter to conference organizers</a>. He recommends a lot of things, including sharing the results of the evaluation data. I'm in favor of that as well (see <a href="http://drupal.org/node/1223870">provide default terms of attendance</a>).</p> <h3>Extra note: Want to see your evaluations from Chicago? It just needs more code</h3> <p>There were evaluations in Chicago, but the speakers have not seen this data. I got access to it as part of my role on the London session selection team and my work on the infrastructure team and the Chicago sites.</p> <p>However, the fact that presenters can't see it is the result of a bug in software that you can help fix. The organizers of Drupalcon want to share that information, but the <a href="http://drupal.org/node/930072">code to do that</a> isn't fully working. If you can help make it work, then all session presenters will be able to see their evaluations.</p> <p><em>Related project: <a href="/portfolio/drupalcon-chicago-2011-site">DrupalCon Chicago 2011 Site</a>. Posted Wed, 03 Aug 2011 by Greg. Tags: Planet Drupal, Conference Organizing Distribution, statistics, strategery.</em></p> <h2><a href="http://growingventuresolutions.com/blog/voting-profiles-hot-content-tools-help-drupal-community-scale">Voting, Profiles &amp; Hot content: Tools to help the Drupal community scale</a></h2> <p>Drupal is growing in complexity and in sheer numbers.
We need more tools to help people manage the information overload and find the best voices in our community quickly. We should build dynamic tools that empower community members to join in and share their voices (if those voices are valuable) rather than walled gardens that keep people out. I believe voting, richer profiles, and hot content are steps toward enabling that vision. That said, the implementation has to match the community's values. Below I lay out the story of how some improvements to Groups.Drupal.org were made, provide data behind some of those improvements, and ask some questions so we can keep refining them.</p> <p>At <a href="http://sf2010.drupal.org/">Drupalcon San Francisco</a> there was a sprint on groups.drupal.org where <a href="http://drupal.org/user/3313">Josh Koenig</a> and <a href="http://drupal.org/user/139189">Brian Gilbert</a> helped add some new features. In particular, we added voting on nodes &amp; comments, and a "<a href="http://groups.drupal.org/hot">hot</a>" page which incorporates several signals to determine which content on the site has been interesting in the last week.</p> <p>I wanted to look back on the past year to think about these changes and whether they are an improvement.</p> <h3>Hot Content: G.d.o <em>is</em> a differentiated piece of $#!@$&amp;</h3> <p>Randy Fay recently wrote about how <a href="http://www.randyfay.com/node/104">Drupal.org doesn't differentiate in quality of content</a>, and he's absolutely right about a lot of the site. The only way to know which posts/comments are good is if you've been around enough to get a sense of people's reputations and maybe have a perspective on their content. For drupal.org, my colleague Lisa is working on a <a href="http://drupal.org/community-initiatives/drupalorg-content-strategy">Content Strategy</a>, which is what we ultimately need. I don't agree with all of Randy's ideas, but one I do agree with is that we need better ways for the community to find the best content. Groups has that!</p> <p>The <a href="http://groups.drupal.org/hot">hot</a> content on groups.drupal.org helps feed people's desire for great content. According to Google Analytics this page is among the thousand or so most popular pages on the site, with a <strong>bounce rate of just 23%</strong>. For comparison, <a href="http://groups.drupal.org/jobs">Jobs</a> is perpetually one of the top pages on the site and has solid value as a guide to content useful to visitors; its bounce rate of 29% is also comparatively low, though a bit higher than the hot page's. That makes the Hot page a pretty solid resource on the site for finding content, especially considering it's just a year old and has no primary navigation linking to it.</p>
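<p><em>I won't spell out the exact formula the hot page uses here, so treat this as an illustration only: one common shape for such a score combines net votes and comments with exponential time decay. All names and constants below are hypothetical, not the actual groups.drupal.org implementation.</em></p> <pre><code>
# Illustrative "hot" score, NOT the actual groups.drupal.org formula.
import time

HALF_LIFE_HOURS = 72  # assumption: interest halves every three days

def hot_score(upvotes, downvotes, comments, created_ts, now=None):
    """Combine votes and comments, decayed by the item's age."""
    now = now or time.time()
    age_hours = max((now - created_ts) / 3600.0, 0.0)
    activity = (upvotes - downvotes) + 0.5 * comments
    return activity * 0.5 ** (age_hours / HALF_LIFE_HOURS)

# Content from the past week can then be ranked by hot_score() descending.
</code></pre>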
<h3>Voting on content &amp; comments</h3> <p>Groups.drupal.org is using the <a href="http://drupal.org/project/vote_up_down">Vote Up Down</a> tool, which supports both up/down voting AND up-only voting (my thanks to <a href="http://drupal.org/user/132175">Marco Villegas</a> for his help with the module). My feeling was that this tool would provide a simple way for folks to say "I agree" or "I disagree" without having to get into a long debate about exactly why. It bridges the gap for users who want to give feedback on the conversation but don't want to invest in a full comment, or don't want to trigger an e-mail notification with their action.</p> <p>Below is a graph that shows a breakdown of the sum of votes on content (both nodes and comments):<br /> <img src="http://growingventuresolutions.com/gvsfiles/distribution_of_sums.png" /></p> <p>This graph shows that of the 12,117 pieces of content that have been rated, more than half end up with a total score of +1 (6,810 items). Almost 9,000 items end up with a positive score, while about 2,400 items have a negative score (about 750 items had a net score of zero).</p> <p>Looking at individual voting habits, 7,400 people have placed a vote. Among those, 2,960 have placed at least one down vote and 5,630 at least one up vote.</p> <p>The sum of votes per individual shows whether, on the whole, they vote more in agreement or disagreement: a person with 2 up votes and 1 down vote has a sum of 1. The sum is negative for about 1,800 people and positive for over 4,600 people. Together, these figures show that most people's voting is positive on the whole, even if they occasionally cast a down vote.</p>
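<p><em>For clarity, here is a sketch of the aggregation behind those numbers: per-item and per-user vote sums from a flat list of (user id, item id, value) votes, where value is +1 or -1. The data layout is hypothetical; the real votes live in Drupal's Voting API tables.</em></p> <pre><code>
from collections import Counter

# Hypothetical flat export of votes: (user_id, item_id, value).
votes = [(101, "node/1", 1), (102, "node/1", 1), (101, "node/2", -1)]

item_sums = Counter()
user_sums = Counter()
for user_id, item_id, value in votes:
    item_sums[item_id] += value  # net score per node/comment
    user_sums[user_id] += value  # net agreement per person

positive_items = sum(1 for s in item_sums.values() if s > 0)
negative_users = sum(1 for s in user_sums.values() if s < 0)
print(positive_items, "items net-positive;", negative_users, "users net-negative")
</code></pre>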
<p>My hope was that voting would help reduce tension among groups on the site. In the time I've co-managed the site since Drupalcon Paris, three regional groups in particular had so many disagreements that concerned group members needed to ask for outside help. Those disagreements led not only to me instituting this voting tool, but also to Moshe creating the <a href="http://drupal.org/dcoc">Drupal Code of Conduct</a>. Has voting or the DCOC helped those groups interact more civilly? It's hard to say.</p> <p>I personally find both the up and down votes of other users valuable as I read a thread; they help me get a sense of what other people on the site think.</p> <p>What do you think? Is voting useful on g.d.o? Should we remove the option to downvote? Should we make it possible to see who cast which votes?</p> <h3>Improved user profiles</h3> <p>A few months ago my colleague Ben added some fields to user profiles. Now the site displays a mix of user-entered data AND automatically generated information.</p> <p><a href="http://groups.drupal.org/user/58"><img src="http://growingventuresolutions.com/gvsfiles/gdo_richer_profiles.png" title="Annotated version of my profile" /></a></p> <p>It's now possible to take a quick glance and see a user's level of interaction on the site, from posting content, to voting, to events and groups organized.</p> <p>Groups has long used avatars, a feature which is valuable for reputation and recognition when scanning a page. Try looking at a forum post with a lot of comments on d.o and quickly identifying comments by people you know. Now try it on g.d.o. See?</p> <p>Voting and reputation are particularly interesting to me since they relate to the session selection process in <a href="http://usecod.com/">conference sites we build</a> and to the <a href="http://certifiedtorock.com/">Certified to Rock</a> system. I'll be writing more about voting on sessions as it relates to Drupalcon in the next few weeks.</p> <p>What else can we do to improve the day-to-day interactions with these sites? How can we elevate the conversations?</p> <p><em>Posted Tue, 24 May 2011 by Greg. Tags: Planet Drupal, reputation, statistics.</em></p>