
Announcing the Winner of the SourceCon CSE Challenge

Jul 8, 2011
This article is part of a series called Q/A.

The CSE Challenge is over! The finalists all gave us their best – but there can only be one winner…

This Challenge was different – unlike challenges of the past, which sent participants on complex hunts for clues all across the Internet to see who could reach the finish line first, this assignment was simple and had no urgent timeline: create a Custom Search Engine (CSE) to find candidate resumes, profiles, and directory lists. The execution, however, would prove to be more thought-provoking and, well, “Challenging” than anyone could have imagined.

So who among our three finalists created the best Custom Search Engine?

First, let’s learn a little about each one:

Nitisha Gupta started her career as a sourcer in 2010 working for a U.S.-based RPO and currently works for Eaton Corporation as a Talent Sourcing Associate. She earned a Bachelor’s degree in Computer Application as well as an MBA in Human Resource and Industrial Relations. While she doesn’t have years of sourcing experience, she does have plenty of experience in research since getting her first PC in 10th standard.

Kameron Swinton is a full life cycle recruiter for Cymer, a global semiconductor equipment manufacturer based in San Diego, CA. He specializes in the recruitment of engineers and scientists but has sourced for a variety of challenging requirements. He’s passionate about using technology (especially your ATS!) to uncover leads and identify hidden talent.

Julia Tverskaya is an executive recruiter and sourcer. For the last six years, she has been a partner with Brain Gain Recruiting, placing senior full-time employees in IT, ERP, strategy consulting, and finance. She has an MS in Applied Linguistics and a strong background in software engineering and management.

The judging criteria for the final round consisted of a point system that rewarded both quality and creativity. A grand total of 90 possible points could be earned: 60 points for quality results and 30 for source variety.

Based on the number of points awarded by each judge, totaled and averaged together, the winner of the SourceCon CSE Challenge, with a grand, averaged total of 77 points, is:

Julia Tverskaya!

Congratulations to Julia – she has won a trip to SourceCon in Silicon Valley in the fall and will be competing in the GrandMaster Sourcing Challenge. Well done!

How the judging was done

As with any sourcing project, the judging of this contest was not a cookie-cutter procedure. Let me tell you, sourcers, this was a difficult contest to evaluate — just ask any of the judges.

Why was it difficult, you may be asking? Here’s a small clue: choose your own adventure.

You may recall this post from last month in which I said,

…giving five researchers the same assignment will result in five different search results. It just depends on their techniques, resources, and thought processes while conducting the search.

Though we had objective criteria with which to evaluate the CSEs (contestants provided us with a search phrase, and we evaluated each refinement and awarded points for valid results as well as variety of sources), there were gray areas that each judge discovered. Do duplicates across refinements count? What about lists of LinkedIn profiles? What about resumes with no contact information? What if you can get to the contact information by clicking a link within the result? What if you can access contact information by signing up for/logging in to a free service? Do ‘headshops’ count for anything? What if the contact information is only a physical address? The list of questions could go on and on and on and on…
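For the technically curious among you, here is a rough sketch of how a single search phrase could be run through a CSE programmatically and the sources tallied. This is only an illustration, assuming Google’s Custom Search JSON API, with placeholder credentials and a hypothetical search phrase – it is not how we scored the contest (we evaluated the results by hand) – but it shows the “valid results plus variety of sources” idea in miniature:

```python
# Minimal sketch (not the judges' actual process): run one search phrase
# through a CSE via Google's Custom Search JSON API and tally how many
# distinct source sites show up in the first few pages of results.
# API_KEY, CSE_ID, and SEARCH_PHRASE are placeholders, not contest values.
from collections import Counter
import requests

API_KEY = "YOUR_API_KEY"                     # placeholder
CSE_ID = "YOUR_CSE_ID"                       # placeholder (the CSE's "cx" value)
SEARCH_PHRASE = "software engineer resume"   # hypothetical phrase

sources = Counter()
for start in (1, 11, 21):                    # first three pages, 10 results each
    resp = requests.get(
        "https://www.googleapis.com/customsearch/v1",
        params={"key": API_KEY, "cx": CSE_ID, "q": SEARCH_PHRASE,
                "start": start, "num": 10},
        timeout=30,
    )
    resp.raise_for_status()
    for item in resp.json().get("items", []):
        sources[item["displayLink"]] += 1    # e.g. "linkedin.com"

print(f"{sum(sources.values())} results from {len(sources)} distinct sources")
for site, count in sources.most_common():
    print(f"  {site}: {count}")
```

Of course, a script can only count – deciding which results are “quality” results is exactly the judgment call described above.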

Sourcers – these are the same types of challenges you all run into daily while using your tools. And it’s guaranteed that if you work with a team of sourcers and are all given the same search assignment, you will all return with different results and a variety of potential candidates.

Each of the questions above would probably be answered differently by each of you. One of the judges told me, “I counted the results that only had physical addresses; I could do a reverse address search to find other contact info. Heck, I would even send a postcard! Everyone in the world is a potential candidate to me.”

It takes teamwork

At the end of the day, our point totals, while not identical, were all in the same ballpark. Each of us graded the CSEs individually, frequently corresponding, asking questions, clarifying, re-examining our own results, and making sure we were on the same page with our assessments. We worked well together to make sure we were awarding points in the appropriate places.

Teamwork is what enabled us to fairly and objectively evaluate each CSE. Teamwork is what makes for cohesive, effective, and ultimately successful sourcing teams. We took each other’s suggestions into consideration and worked as one to determine our overall winner.

#Winning

While each judge’s point totals varied slightly, Julia’s CSE was the clear, unanimous winner – all five of us scored her on top. What caused her CSE to win was simple: it returned the fewest non-quality results and had the greatest variety of sources. Her CSE earned a whopping 53 (rounded up from the 52.6 average) of 60 possible quality-result points, and she earned 25 (rounded up from the 24.8 average) of a possible 30 points for variety of sources.
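For anyone checking the math: the grand total appears to have been computed from the unrounded category averages (52.6 + 24.8 = 77.4), which is why it comes to 77 rather than the 78 you would get by adding the rounded figures. A quick sketch:

```python
# Sketch of the scoring arithmetic above (the per-judge scores aren't
# published, so only the averaged category scores from this article are used).
QUALITY_MAX, VARIETY_MAX = 60, 30   # 90 possible points in total

avg_quality = 52.6                  # averaged quality-result points
avg_variety = 24.8                  # averaged source-variety points

print(f"Quality: {round(avg_quality)}/{QUALITY_MAX}")    # Quality: 53/60
print(f"Variety: {round(avg_variety)}/{VARIETY_MAX}")    # Variety: 25/30
print(f"Grand total: {round(avg_quality + avg_variety)}/"
      f"{QUALITY_MAX + VARIETY_MAX}")                     # Grand total: 77/90
```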

The really awesome thing about this contest is that all who participated came away with a prize – their CSE! Now, each contestant will be able to use their CSE to source and find potential candidates for their companies and clients. Everybody wins!

What can be learned from this Challenge

Valuable lessons were abundant throughout this Challenge. I believe I learned a lot from this experience, as did each of the judges as well as all of you who participated in varying degrees. Here are a few of my own takeaways I can share with you:

  • When changing a Challenge format, it’s good to give the community a heads-up. This Challenge was a 180 from the previous ones. Many of you may have been expecting a scavenger hunt of sorts and were surprised by this new Challenge. I heard that some people had created CSEs but decided not to compete for fear of publicly attaching their names to something they did not feel was perfect, that they didn’t want their boss to see, or that might be considered their company’s intellectual property. These are all understandable and valid concerns, and they will certainly be taken into consideration in the future.
  • Adding a crowd-sourcing element gives anyone who wants it a voice in the contest. I really believe that having the first round of competition be a peer evaluation and vote was a great way to get many more people involved. Anyone who wanted an opportunity to have their voice count toward the overall winner was able to cast a vote and be heard.
  • Judging something that can be customized so much is difficult. Each of our participants’ CSEs was unique – ranging from location-specific searches to alumni directories contained within the refinements. It is nearly impossible to compare, with foolproof objectivity, search engines that can vary so much. Think of the show America’s Got Talent – participants display skills from dancing to singing to juggling and everything in between. How do you compare singing to dancing? And even if you compare singing to singing, how do you compare opera to country to pop to rock? When it comes down to it, there are a million ways the CSEs could have been judged, and no matter how we designed it, it would never be perfect. We set out to develop criteria that would provide the greatest potential for objective evaluation through quality and variety.

Congratulations to all of our finalists – the judges (thank you to you as well!) agreed that all three CSEs were high quality and returned excellent results. And big congrats to Julia – we look forward to having her at SourceCon this fall and seeing what she brings to the GrandMaster Sourcing Challenge.

We hope all of you who participated had a great time and that those of you who followed this Challenge will consider joining in the next one!
