Gradient Boosted Trees are everywhere! They're powerful ensembles of Decision Trees that often rival deep learning, especially on tabular data. Learn how they work with this visual guide, and try the code out for yourself using the link below. Happy learning!
https://colab.research.google.com/drive/1tRzL1GGOJDgz7CHSY1HECd4IwNAjuip5?usp=sharing
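If you just want a quick offline taste before opening the notebook, here is a minimal sketch (my own, not the notebook's code) of fitting a gradient boosted tree ensemble with xgboost's scikit-learn interface:

# Minimal sketch: a gradient boosted tree ensemble with xgboost.
# Each boosting round adds a small decision tree that corrects the errors
# of the ensemble built so far; the learning rate shrinks each tree's vote.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = XGBClassifier(n_estimators=100, max_depth=3, learning_rate=0.1)
model.fit(X_train, y_train)
print("test accuracy:", model.score(X_test, y_test))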
Audio: Visual Guide to Gradient Boosted Trees (xgboost)
Event-Centric Natural Language Processing
ACL-IJCNLP Tutorials, August 2021
Instructors: Muhao Chen, Hongming Zhang, Qiang Ning, Manling Li, Heng Ji, Kathleen McKeown, and Dan Roth.
More information: http://cogcomp.org/page/tutorial.202108/
Audio: Event-Centric Natural Language Processing, ACL-IJCNLP Tutorials 2021
What are the neurons, why are there layers, and what is the math underlying it?
Help fund future projects: https://www.patreon.com/3blue1brown
Written/interactive form of this series: https://www.3blue1brown.com/topics/neural-networks
Additional funding for this project provided by Amplify Partners
Typo correction: At 14 minutes 45 seconds, the last index on the bias vector is shown as n, when it should in fact be a k. Thanks to the sharp eyes that caught that!
For those who want to learn more, I highly recommend the book by Michael Nielsen introducing neural networks and deep learning: https://goo.gl/Zmczdy
There are two neat things about this book. First, it's available for free, so consider joining me in making a donation Nielsen's way if you get something out of it. And second, it's centered around walking through some code and data which you can download yourself, and which covers the same example that I introduce in this video. Yay for active learning!
https://github.com/mnielsen/neural-networks-and-deep-learning
I also highly recommend Chris Olah's blog: http://colah.github.io/
For more videos, Welch Labs also has some great series on machine learning:
https://youtu.be/i8D90DkCLhI
https://youtu.be/bxe2T-V8XRs
For those of you looking to go *even* deeper, check out the text "Deep Learning" by Goodfellow, Bengio, and Courville.
Also, the publication Distill is just utterly beautiful: https://distill.pub/
Lion photo by Kevin Pluck
-----------------
Timeline:
0:00 - Introduction example
1:07 - Series preview
2:42 - What are neurons?
3:35 - Introducing layers
5:31 - Why layers?
8:38 - Edge detection example
11:34 - Counting weights and biases
12:30 - How learning relates
13:26 - Notation and linear algebra
15:17 - Recap
16:27 - Some final words
17:03 - ReLU vs Sigmoid
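For the notation and linear algebra segment (13:26), the whole forward step of one layer is a' = σ(Wa + b). Here is a minimal NumPy illustration of that formula (my own sketch, not code from the video or from Nielsen's repository):

# One layer's activations: a_next = sigmoid(W @ a + b)
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
n_in, n_out = 784, 16                    # e.g. 28x28 pixel inputs feeding 16 neurons
a = rng.random(n_in)                     # activations of the previous layer, in [0, 1]
W = rng.standard_normal((n_out, n_in))   # one row of weights per neuron in the next layer
b = rng.standard_normal(n_out)           # one bias per neuron

a_next = sigmoid(W @ a + b)
print(a_next.shape)                      # (16,)

Swapping sigmoid for ReLU (the 17:03 comparison) just means replacing the activation with np.maximum(0, z).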
------------------
Animations largely made using manim, a scrappy open source python library. https://github.com/3b1b/manim
If you want to check it out, I feel compelled to warn you that it's not the most well-documented tool, and has many other quirks you might expect in a library someone wrote with only their own use in mind.
Music by Vincent Rubinetti.
Download the music on Bandcamp:
https://vincerubinetti.bandcamp.com/album/the-music-of-3blue1brown
Stream the music on Spotify:
https://open.spotify.com/album/1dVyjwS8FBqXhRunaG5W5u
If you want to contribute translated subtitles or to help review those that have already been made by others and need approval, you can click the gear icon in the video and go to subtitles/cc, then "add subtitles/cc". I really appreciate those who do this, as it helps make the lessons accessible to more people.
------------------
3blue1brown is a channel about animating math, in all senses of the word animate. And you know the drill with YouTube, if you want to stay posted on new videos, subscribe, and click the bell to receive notifications (if you're into that).
If you are new to this channel and want to see more, a good place to start is this playlist: http://3b1b.co/recommended
Various social media stuffs:
Website: https://www.3blue1brown.com
Twitter: https://twitter.com/3Blue1Brown
Patreon: https://patreon.com/3blue1brown
Facebook: https://www.facebook.com/3blue1brown
Reddit: https://www.reddit.com/r/3Blue1Brown
Audio: But what is a neural network? | Chapter 1, Deep learning
Event-Centric Natural Language Understanding, AAAI Tutorials 2021
Event-Centric Natural Language Understanding
AAAI Tutorials, February 2021
Instructors: Muhao Chen, Hongming Zhang, Qiang Ning, Manling Li, Heng Ji and Dan Roth.
More information: http://cogcomp.org/page/tutorial.202102/
Audio: Event-Centric Natural Language Understanding, AAAI Tutorials 2021
What does it mean for something to be Artificially Intelligent?
Abi Aryan is the Chief Business Development Officer at Coinsecure - India's leading Bitcoin exchange. With a background in mathematics, statistics and computer science, she has been at the core of many futuristic companies. Working on Neural Networks and Artificial Intelligence during her master's degree at the London School of Economics was one of the most defining moments of her life, and it went on to become her passion. She's been deeply involved with the Big Data, Singularity and Artificial Intelligence circles in the U.K. and has hosted several subject experts at her own conferences with Computability Intelligence Unconference and Future Tech Track in London.
This talk was given at a TEDx event using the TED conference format but independently organized by a local community. Learn more at http://ted.com/tedx
This is Part 3 in our series on Gradient Boost. At long last, we are showing how it can be used for classification. This video focuses on the main ideas behind this technique. The next video in this series will focus more on the math and the underlying algorithm.
This StatQuest assumes that you have already watched Part 1:
https://youtu.be/3CC4N4z3GJc
...and it also assumes that you understand Logistic Regression pretty well. Here are the links for...
A general overview of Logistic Regression: https://youtu.be/yIYKR4sgzI8
how to interpret the coefficients: https://youtu.be/vN5cNN2-HWE
and how to estimate the coefficients: https://youtu.be/BfKanl1aSG0
Lastly, if you want to learn more about using different probability thresholds for classification, check out the StatQuest on ROC and AUC: https://youtu.be/xugjARegisk
For a complete index of all the StatQuest videos, check out:
https://statquest.org/video-index/
This StatQuest is based on the following sources:
A 1999 manuscript by Jerome Friedman that introduced Stochastic Gradient Boosting: https://statweb.stanford.edu/~jhf/ftp/stobst.pdf
The Wikipedia article on Gradient Boosting: https://en.wikipedia.org/wiki/Gradient_boosting
The scikit-learn implementation of Gradient Boosting: https://scikit-learn.org/stable/modules/ensemble.html#gradient-boosting
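If you want to poke at the algorithm yourself, here is a minimal sketch (my own example, not Josh's) using the scikit-learn implementation linked above:

# Minimal sketch: gradient boosting for classification with scikit-learn.
# The model starts from the log(odds) of the training labels and then adds
# small trees fit to pseudo-residuals, shrunk by the learning rate.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

clf = GradientBoostingClassifier(n_estimators=100, learning_rate=0.1, max_depth=3)
clf.fit(X_train, y_train)

print("test accuracy:", clf.score(X_test, y_test))
# predict_proba returns probabilities, so you can apply your own threshold
# (see the ROC and AUC StatQuest for how to pick one).
print(clf.predict_proba(X_test[:3]))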
If you'd like to support StatQuest, please consider...
Buying The StatQuest Illustrated Guide to Machine Learning!!!
PDF - https://statquest.gumroad.com/l/wvtmc
Paperback - https://www.amazon.com/dp/B09ZCKR4H6
Kindle eBook - https://www.amazon.com/dp/B09ZG79HXC
Patreon: https://www.patreon.com/statquest
...or...
YouTube Membership: https://www.youtube.com/channel/UCtYLUTtgS3k1Fg4y5tAhLbw/join
...a cool StatQuest t-shirt or sweatshirt:
https://shop.spreadshirt.com/statquest-with-josh-starmer/
...buying one or two of my songs (or go large and get a whole album!)
https://joshuastarmer.bandcamp.com/
...or just donating to StatQuest!
https://www.paypal.me/statquest
Lastly, if you want to keep up with me as I research and create new StatQuests, follow me on twitter:
https://twitter.com/joshuastarmer
#statquest #gradientboost
Audio: Gradient Boost Part 3 (of 4): Classification
Chris Bishop discusses the progress and opportunities of artificial intelligence research.
Subscribe for weekly science videos: http://bit.ly/RiSubscRibe
The last five years have witnessed a dramatic resurgence of excitement in the goal of creating intelligent machines. Technology companies are now investing billions of dollars in this field, new research laboratories are springing up around the globe, and competition for talent has become intense. In this Discourse Chris Bishop describes some of the recent technology breakthroughs which underpin this enthusiasm, and explores some of the many exciting opportunities which artificial intelligence offers.
Chris Bishop is the Laboratory Director at Microsoft Research Cambridge and is a professor of computer science at the University of Edinburgh. He has extensive expertise in artificial intelligence and machine learning.
This Discourse was filmed at the Royal Institution on 28 October 2016.
Subscribe for regular science videos: http://bit.ly/RiSubscRibe
The Ri is on Twitter: http://twitter.com/ri_science
and Facebook: http://www.facebook.com/royalinstitution
and Tumblr: http://ri-science.tumblr.com/
Our editorial policy: http://www.rigb.org/home/editorial-policy
Subscribe for the latest science videos: http://bit.ly/RiNewsletter
Audio: Artificial Intelligence, the History and Future - with Chris Bishop
Hey guys! On this channel, you will find content covering all areas related to Artificial Intelligence (AI). Please make sure to smash the LIKE button and SUBSCRIBE to our channel to learn more about these trending topics, and don't forget to TURN ON your YouTube notifications!
Thanks & Happy Learning 🙂
Subscribe here: https://goo.gl/9FS8uF
Become a Patreon!: https://www.patreon.com/ColdFusion_TV
Visual animal AI: https://www.youtube.com/watch?v=DgPaCWJL7XI
Hi, welcome to ColdFusion (formerly known as ColdfusTion).
Experience the cutting edge of the world around us in a fun relaxed atmosphere.
Sources:
Why AlphaGo is NOT an "Expert System": https://googleblog.blogspot.com.au/2016/01/alphago-machine-learning-game-go.html
“Inside DeepMind” Nature video:
https://www.youtube.com/watch?v=xN1d3qHMIEQ
“AlphaGo and the future of Artificial Intelligence” BBC Newsnight: https://www.youtube.com/watch?v=53YLZBSS0cc
http://www.nature.com/nature/journal/v518/n7540/full/nature14236.html
http://www.ft.com/cms/s/2/063c1176-d29a-11e5-969e-9d801cf5e15b.html
http://www.nature.com/nature/journal/v529/n7587/full/nature16961.html#tables
https://www.technologyreview.com/s/533741/best-of-2014-googles-secretive-deepmind-startup-unveils-a-neural-turing-machine/
https://medium.com/the-physics-arxiv-blog/the-last-ai-breakthrough-deepmind-made-before-google-bought-it-for-400m-7952031ee5e1
https://www.deepmind.com/
www.forbes.com/sites/privacynotice/2014/02/03/inside-googles-mysterious-ethics-board/#5dc388ee4674
https://medium.com/the-physics-arxiv-blog/the-last-ai-breakthrough-deepmind-made-before-google-bought-it-for-400m-7952031ee5e1#.4yt5o1e59
http://www.theverge.com/2016/3/10/11192774/demis-hassabis-interview-alphago-google-deepmind-ai
https://en.wikipedia.org/wiki/Demis_Hassabis
https://en.wikipedia.org/wiki/Google_DeepMind
//Soundtrack//
Disclosure - You & Me (Ft. Eliza Doolittle) (Bicep Remix)
Stumbleine - Glacier
Sundra - Drifting in the Sea of Dreams (Chapter 2)
Dakent - Noon (Mindthings Rework)
Hnrk - fjarlæg
Dr Meaker - Don't Think It's Love (Real Connoisseur Remix)
Sweetheart of Kairi - Last Summer Song (ft. CoMa)
Hiatus - Nimbus
KOAN Sound & Asa - This Time Around (feat. Koo)
Burn Water - Hide
» Google + | http://www.google.com/+coldfustion
» Facebook | https://www.facebook.com/ColdFusionTV
» My music | t.guarva.com.au/BurnWater
» http://burnwater.bandcamp.com or http://www.soundcloud.com/burnwater
» https://www.patreon.com/ColdFusion_TV
» Collection of music used in videos: https://www.youtube.com/watch?v=YOrJJKW31OA
Producer: Dagogo Altraide
Editing website: www.cfnstudios.com
Coldfusion Android Launcher: https://play.google.com/store/apps/details?id=nqr.coldfustion.com&hl=en
» Twitter | @ColdFusion_TV
Audio: Google's Deep Mind Explained! - Self Learning A.I.
Can one do better than XGBoost? Presenting 2 new gradient boosting libraries - LightGBM and Catboost
Mateusz Susik
Description
We will present two recent challengers to the XGBoost library: LightGBM (released October 2016) and CatBoost (open-sourced July 2017). Participants will learn the theoretical and practical differences between these libraries. Finally, we will describe how we use gradient boosting libraries at McKinsey & Company.
Abstract
Gradient boosting has proved to be a very effective method for classification and regression in recent years. A lot of successful business applications and data science contest solutions have been developed around the XGBoost library. It seemed that XGBoost would dominate the field for many years.
Recently, two major players have released their own implementation of the algorithm. The first - LightGBM - comes from Microsoft. Its major advantages are lower memory usage and faster training speed.
The second - CatBoost - was implemented by Yandex. Here, the approach was different: the aim of the library was to improve on the accuracy of state-of-the-art gradient boosting algorithms.
During the talk, the participants will learn about the differences in the algorithm designs, APIs and performances.
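As a small taste of those API differences, here is a minimal sketch (mine, not from the talk) training the same toy problem with both libraries' scikit-learn-style interfaces, assuming the lightgbm and catboost packages are installed:

# Minimal sketch: the same classification task with LightGBM and CatBoost.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from lightgbm import LGBMClassifier
from catboost import CatBoostClassifier

X, y = make_classification(n_samples=5000, n_features=30, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

lgbm = LGBMClassifier(n_estimators=200, learning_rate=0.05)
lgbm.fit(X_train, y_train)

cat = CatBoostClassifier(iterations=200, learning_rate=0.05, verbose=0)
cat.fit(X_train, y_train)

print("LightGBM accuracy:", lgbm.score(X_test, y_test))
print("CatBoost accuracy:", cat.score(X_test, y_test))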
www.pydata.org
PyData is an educational program of NumFOCUS, a 501(c)3 non-profit organization in the United States. PyData provides a forum for the international community of users and developers of data analysis tools to share ideas and learn from each other. The global PyData network promotes discussion of best practices, new approaches, and emerging technologies for data management, processing, analytics, and visualization. PyData communities approach data science using many languages, including (but not limited to) Python, Julia, and R.
PyData conferences aim to be accessible and community-driven, with novice to advanced level presentations. PyData tutorials and talks bring attendees the latest project features along with cutting-edge use cases.
00:00 Welcome!
00:10 Help us add time stamps or captions to this video! See the description for details.
Want to help add timestamps to our YouTube videos to help with discoverability? Find out more here: https://github.com/numfocus/YouTubeVideoTimestamps
Audio: Can one do better than XGBoost? - Mateusz Susik
What do you get when you give a design tool a digital nervous system? Computers that improve our ability to think and imagine, and robotic systems that come up with (and build) radical new designs for bridges, cars, drones and much more -- all by themselves. Take a tour of the Augmented Age with futurist Maurice Conti and preview a time when robots and humans will work side-by-side to accomplish things neither could do alone.
TEDTalks is a daily video podcast of the best talks and performances from the TED Conference, where the world's leading thinkers and doers give the talk of their lives in 18 minutes (or less). Look for talks on Technology, Entertainment and Design -- plus science, business, global issues, the arts and much more.
Find closed captions and translated subtitles in many languages at http://www.ted.com/translate
Follow TED news on Twitter: http://www.twitter.com/tednews
Like TED on Facebook: https://www.facebook.com/TED
Subscribe to our channel: http://www.youtube.com/user/TEDtalksDirector
Audio: The incredible inventions of intuitive AI | Maurice Conti
slides: https://mlvu.github.io/lecture10/
2019 version: https://youtu.be/m-at5l3F_ig
In this lecture we (finally) show how decision trees are trained. We also discuss ensembling: a technique that can be used with any machine learning model, but is often combined with tree learning. Lecturer: Peter Bloem. See the PDF for image credits.
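If you want to try the ideas from the lecture, here is a minimal scikit-learn sketch (my own, not course material) comparing a single decision tree with a bagged ensemble of trees:

# Minimal sketch: a single decision tree vs. a bagged ensemble (bagging).
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=2000, n_features=20, random_state=1)

tree = DecisionTreeClassifier(random_state=1)
bagged = BaggingClassifier(DecisionTreeClassifier(), n_estimators=100, random_state=1)

print("single tree :", cross_val_score(tree, X, y, cv=5).mean())
print("bagged trees:", cross_val_score(bagged, X, y, cv=5).mean())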
Audio: 10 Tree Models and Ensembles: Decision Trees, Boosting, Bagging, Gradient Boosting (MLVU2018)
Professor Hastie takes us through Ensemble Learners like decision trees and random forests for classification problems. Don’t just consume, contribute your code and join the movement: https://github.com/h2oai
User conference slides on open source machine learning software from H2O.ai at: http://www.slideshare.net/0xdata
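For a hands-on counterpart to the talk, here is a minimal random forest sketch; it uses scikit-learn for brevity (H2O has its own H2ORandomForestEstimator if you prefer to stay in that ecosystem):

# Minimal sketch: a random forest for a classification problem.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Each tree sees a bootstrap sample of the rows and a random subset of the
# features at every split; the forest averages their votes.
forest = RandomForestClassifier(n_estimators=300, max_features="sqrt", random_state=0)
forest.fit(X_train, y_train)
print("test accuracy:", forest.score(X_test, y_test))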