Kaggle


Type: Subsidiary
Industry: Data science
Founded: April 2010
Founders:
  • Anthony Goldbloom
  • Ben Hamner
Headquarters: San Francisco, United States
Products: Competitions, Kaggle Kernels, Kaggle Datasets, Kaggle Learn
Parent: Google (2017–present)
Website: kaggle.com

Kaggle is a data science competition platform and online community of data scientists and machine learning practitioners, owned by Google LLC. Kaggle enables users to find and publish datasets, explore and build models in a web-based data science environment, work with other data scientists and machine learning engineers, and enter competitions to solve data science challenges.[1]

History

Kaggle was founded by Anthony Goldbloom and Ben Hamner in April 2010.[2] Jeremy Howard, one of the first Kaggle users, joined in November 2010 and served as President and Chief Scientist,[3] while Nicholas Gruen was the founding chair.[4] In 2011, the company raised a $12.5 million Series A round and Max Levchin became chairman.[5] On 8 March 2017, Fei-Fei Li, Chief Scientist at Google, announced that Google was acquiring Kaggle.[6]

In June 2017, Kaggle surpassed 1 million registered users, and as of October 2023, it has over 15 million users in 194 countries.[7][8][9]

In 2022, founders Goldbloom and Hamner stepped down from their positions and D. Sculley became the CEO.[10]

In February 2023, Kaggle introduced Models, which allows users to discover and use pre-trained models through deep integrations with the rest of Kaggle’s platform.[11]

Site overview

Competitions

Many machine-learning competitions have been run on Kaggle since the company was founded. Notable competitions include improving gesture recognition for Microsoft Kinect,[12] developing a football AI for Manchester City, coding a trading algorithm for Two Sigma Investments,[13] and improving the search for the Higgs boson at CERN.[14]

The competition host prepares the data and a description of the problem, and decides whether the competition will offer prize money or run unpaid. Participants experiment with different techniques and compete against each other to produce the best models. Work is shared publicly through Kaggle Kernels to establish better benchmarks and to inspire new ideas. Submissions can be made through Kaggle Kernels, by manual upload, or via the Kaggle API. For most competitions, submissions are scored immediately (based on their predictive accuracy relative to a hidden solution file) and summarized on a live leaderboard. After the deadline passes, the competition host pays the prize money in exchange for "a worldwide, perpetual, irrevocable and royalty-free license [...] to use the winning Entry", i.e. the algorithm, software and related intellectual property developed, which is "non-exclusive unless otherwise specified".[15]
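As an illustration of the API-based submission route mentioned above, the following minimal Python sketch uses the official kaggle package to upload a predictions file and list recent submission scores. The competition slug "titanic" and the file name "submission.csv" are placeholder values, and the calls assume API credentials have already been configured in ~/.kaggle/kaggle.json.

    # Minimal sketch of submitting to a Kaggle competition via the Kaggle API.
    # Assumes: `pip install kaggle` and valid credentials in ~/.kaggle/kaggle.json.
    from kaggle.api.kaggle_api_extended import KaggleApi

    api = KaggleApi()
    api.authenticate()  # reads the API token from kaggle.json or environment variables

    # "titanic" and "submission.csv" are placeholders used for illustration only.
    api.competition_submit(
        file_name="submission.csv",
        message="baseline model",
        competition="titanic",
    )

    # Submissions are scored against the hidden solution file; recent results
    # (including the public leaderboard score) can then be listed.
    for submission in api.competition_submissions("titanic"):
        print(submission)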

Alongside its public competitions, Kaggle also offers private competitions, which are limited to Kaggle's top participants. Kaggle offers a free tool for data science teachers to run academic machine-learning competitions.[16] Kaggle also hosts recruiting competitions in which data scientists compete for a chance to interview at leading data science companies like Facebook, Winton Capital, and Walmart.

Kaggle's competitions have resulted in successful projects such as furthering HIV research,[17] chess ratings[18] and traffic forecasting.[19] Geoffrey Hinton and George Dahl used deep neural networks to win a competition hosted by Merck.[citation needed] Vlad Mnih (one of Hinton's students) used deep neural networks to win a competition hosted by Adzuna.[citation needed] These wins led to deep neural networks being taken up by others in the Kaggle community. Tianqi Chen from the University of Washington also used Kaggle to show the power of XGBoost, which has since replaced random forests as one of the main methods used to win Kaggle competitions.[citation needed]
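As a rough illustration of the gradient-boosted tree approach that XGBoost popularized, the sketch below trains a classifier on synthetic data; it is not taken from any particular competition, and it assumes the open-source xgboost and scikit-learn packages are installed.

    # Minimal gradient-boosted trees example with XGBoost on synthetic data.
    import numpy as np
    import xgboost as xgb
    from sklearn.model_selection import train_test_split

    # Purely illustrative, synthetic binary-classification data.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 10))
    y = (X[:, 0] + X[:, 1] ** 2 > 0.5).astype(int)
    X_train, X_valid, y_train, y_valid = train_test_split(X, y, random_state=0)

    # Gradient-boosted decision trees, the model family behind many winning entries.
    model = xgb.XGBClassifier(n_estimators=200, max_depth=4, learning_rate=0.1)
    model.fit(X_train, y_train)
    print("validation accuracy:", model.score(X_valid, y_valid))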

Several academic papers have been published on the basis of findings made in Kaggle competitions.[20] One contributing factor is the live leaderboard, which encourages participants to continue innovating beyond existing best practices.[21] The winning methods are frequently written up on the Kaggle Winner's Blog.

Progression system

Kaggle has implemented a progression system to recognize and reward users based on their contributions and achievements within the platform. This system consists of five tiers: Novice, Contributor, Expert, Master, and Grandmaster. Each tier is achieved by meeting specific criteria in competitions, datasets, kernels (code-sharing), and discussions.[22]

The highest and most prestigious tier, Kaggle Grandmaster, is awarded to users who demonstrate exceptional skills in data science and machine learning. Achieving this status is extremely challenging. As of April 4, 2023, out of 12 million Kaggle users, only 2,331 (about 1 out of every 5500 users) have reached the Master level.

Among these Masters, only 472 (approximately 1 out of every 5 Masters) have achieved the coveted Kaggle Grandmaster status.[23]

The other tiers in the progression system include:

  • 13 thousand Experts
  • 200 thousand Contributors
  • 12 million Novices.

The progression system serves to motivate users to continuously improve their skills and contribute to the Kaggle community.

References

  1. "A Beginner’s Guide to Kaggle for Data Science" (in en). 2023-04-17. https://www.makeuseof.com/beginners-guide-to-kaggle/. 
  2. "Google is acquiring data science community Kaggle". Techcrunch. March 8, 2017. https://techcrunch.com/2017/03/07/google-is-acquiring-data-science-community-kaggle/. 
  3. "The exabyte revolution: how Kaggle is turning data scientists into rock stars" (in en-GB). Wired UK. ISSN 1357-0978. https://www.wired.co.uk/article/the-exabyte-revolution. 
  4. Mulcaster, Glenn (4 November 2011). "Local minnow the toast of Silicon Valley". The Sydney Morning Herald. https://www.smh.com.au/business/local-minnow-the-toast-of-silicon-valley-20111103-1mxt9.html. 
  5. Lichaa, Zachary. "Max Levchin Becomes Chairman Of Kaggle, A Startup That Helps NASA Solve Impossible Problems". Business Insider. https://www.businessinsider.com/kaggle-11-million-max-levchin-2011-11. 
  6. "Welcome Kaggle to Google Cloud" (in en). Google Cloud Platform Blog. https://cloudplatform.googleblog.com/2017/03/welcome-Kaggle-to-Google-Cloud.html. 
  7. "Unique Kaggle Users". https://www.kaggle.com/tunguz/unique-kaggle-users. 
  8. Markoff, John (24 November 2012). "Scientists See Advances in Deep Learning, a Part of Artificial Intelligence". The New York Times. https://www.nytimes.com/2012/11/24/science/scientists-see-advances-in-deep-learning-a-part-of-artificial-intelligence.html. 
  9. "We've passed 1 million members" (in en-US). Kaggle Winner's Blog. 2017-06-06. http://blog.kaggle.com/2017/06/06/weve-passed-1-million-members/. 
  10. Wali, Kartik (2022-06-08). "Kaggle gets new CEO, founders quit after a decade". Analytics India Magazine. https://analyticsindiamag.com/kaggle-gets-new-ceo-founders-quit-after-a-decade/. 
  11. "[Product Launch] Introducing Kaggle Models | Data Science and Machine Learning". https://www.kaggle.com/discussions/product-feedback/391200. 
  12. Byrne, Ciara (December 12, 2011). "Kaggle launches competition to help Microsoft Kinect learn new gestures". VentureBeat. https://venturebeat.com/2011/12/12/kaggle-competition-microsoft-kinect-learn-new-gestures/. 
  13. Wigglesworth, Robin (March 8, 2017). "Hedge funds adopt novel methods to hunt down new tech talent". The Financial Times (United Kingdom). https://www.ft.com/content/1fd47a60-03e5-11e7-aa5b-6bb07f5c8e12. 
  14. "The machine learning community takes on the Higgs". Symmetry Magazine. July 15, 2014. http://www.symmetrymagazine.org/article/july-2014/the-machine-learning-community-takes-on-the-higgs/. 
  15. Kaggle. "Terms and Conditions - Kaggle". https://www.kaggle.com/terms. 
  16. Kaggle. "Kaggle in Class". http://inclass.kaggle.com/. 
  17. Carpenter, Jennifer (February 2011). "May the Best Analyst Win". Science 331 (6018): pp. 698–699. doi:10.1126/science.331.6018.698. https://www.science.org/doi/abs/10.1126/science.331.6018.698. 
  18. Sonas, Jeff (20 February 2011). "The Deloitte/FIDE Chess Rating Challenge". Chessbase. http://www.chessbase.com/newsdetail.asp?newsid=7020. 
  19. Foo, Fran (April 6, 2011). "Smartphones to predict NSW travel times?". The Australian. http://www.theaustralian.com.au/australian-it/smartphone-used-to-predict-nsw-travel-times/story-e6frgakx-1226034533295. 
  20. "NIPS 2014 Workshop on High-energy Physics and Machine Learning". 42. http://jmlr.org/proceedings/papers/v42/. 
  21. Athanasopoulos, George; Hyndman, Rob (2011). "The Value of Feedback in Forecasting Competitions". International Journal of Forecasting 27: pp. 845–849. http://www.sciencedirect.com/science?_ob=MImg&_imagekey=B6V92-52S72B8-1-1&_cdi=5886&_user=559483&_pii=S0169207011000495&_origin=&_coverDate=09%2F30%2F2011&_sk=999729996&view=c&wchp=dGLzVzb-zSkWl&_valck=1&md5=7c0b261207c4204b14d21a95adc9d6bb&ie=/sdarticle.pdf. 
  22. "Kaggle Progression System". Kaggle. https://www.kaggle.com/progression. 
  23. Carl McBride Ellis (2022-02-10). "Kaggle in Numbers". Kaggle. https://www.kaggle.com/code/carlmcbrideellis/kaggle-in-numbers. 

Licensed under CC BY-SA 3.0 | Source: https://handwiki.org/wiki/Company:Kaggle