Hedu AI by Batool Haider
Episode 1 Part II | Artificial Neuron – Threads of Thought: Weights, Biases & the Dance of Importance
Within this neural network, the weights dance like characters on a stage, each one carrying a different significance. They are the coefficients that determine the strength and direction of the connections between neurons, resembling the relationships we forge in our own lives. Some weights are heavy, lending authority and influence to certain pathways, while others are light and ephemeral, their impact subdued.
Meanwhile, the biases hum in harmony, like the undertones of a melody, adding nuance and perspective to the network's decision-making. These biases hold the power to tilt the scales, shaping the network's inclinations and predispositions. They are the hidden whispers that echo through the corridors of neurons, quietly nudging the network towards particular outcomes.
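
As a rough, minimal sketch of these two ideas (my own illustration, not code from the episode; the function name and numbers are made up), here is a single artificial neuron in Python: each weight scales one input's influence, and the bias tilts the neuron's default inclination before the activation decides how strongly it fires.

```python
import math

def neuron(inputs, weights, bias):
    """A single artificial neuron: weighted sum of inputs, shifted by a bias,
    then squashed through a sigmoid activation into the range (0, 1)."""
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-z))  # sigmoid activation

# A "heavy" weight lends authority to its pathway; the bias nudges the outcome.
print(neuron([1.0, 1.0], weights=[5.0, 0.1], bias=-2.0))   # fires strongly (~0.96)
print(neuron([1.0, 1.0], weights=[5.0, 0.1], bias=-10.0))  # strong negative bias suppresses firing (~0.01)
```
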
[0:00] - The "Location" of Intelligence
[0:34] - Determining the "Weight" of the Matter
[6:26] - Uncovering the Underlying "Bias"
[7:31] - What is Learning?
[9:11] - Rumi's Advice on "Finding True Love"
---------------------------------------------------------------------------------------------------------------------------
Welcome to the brand-new series "Neuron to ChatGPT," where we start with the building spark of artificial intelligence, a neuron, and work our way up to a gargantuan model composed of billions of neurons. This seven-episode series will take you on an immersive journey that dives deep into the fascinating inner workings of one of the most advanced language models ever created: ChatGPT.
Neuron to ChatGPT | A Brand New Series | Trailer
ua-cam.com/video/bFsepWKt_Xs/v-deo.html
Episode 1 Part I | Artificial Neuron - The Gate Keeper
ua-cam.com/video/6pvMyNVqSy4/v-deo.html
Views: 1,880

Videos

Episode 1 Part I | Artificial Neuron - The Gate Keeper
1.5K views • 10 months ago
An activation function gives an artificial neuron a sense of purpose and direction. As a gatekeeper (determining the output of a neuron based on the weighted sum of its inputs), it holds the power to ignite a spark of life or to silence the neural symphony. [0:00] When Life First Spoke [1:36] An Artificial Neuron and its "Input" and "Output" [2:45] Squashing the Infinite [3:36] True Intelligence...
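
For intuition on the "gatekeeper" role, here is a hedged sketch of my own (the episode's exact functions are in the video): three common activation functions and how each constrains an unbounded weighted sum.

```python
import math

# Three common gatekeepers: each maps an unbounded weighted sum z
# to a constrained output, deciding how strongly the neuron "fires".
def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))  # squashes to (0, 1)

def tanh(z):
    return math.tanh(z)                # squashes to (-1, 1)

def relu(z):
    return max(0.0, z)                 # silences negative sums entirely

for z in (-5.0, -1.0, 0.0, 1.0, 5.0):
    print(f"z={z:+.1f}  sigmoid={sigmoid(z):.3f}  tanh={tanh(z):+.3f}  relu={relu(z):.1f}")
```
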
Neuron to ChatGPT | Technical Deep Dive | Trailer
1.4K views • 11 months ago
Welcome to the brand-new series "Neuron to ChatGPT," where we start with the building spark of artificial intelligence, a neuron, and work our way up to a gargantuan model composed of billions of neurons. This seven-episode series will take you on an immersive journey that dives deep into the fascinating inner workings of one of the most advanced language models: ChatGPT. * Episode 1: An Artificial Neuron...
The Neuroscience of “Attention”
22K views • 1 year ago
What is "attention" and why did our brains evolve to prioritize things? This is the Episode 0 of the series "Visual Guide to Transformer Neural Networks" that delved into the mathematics of the "Attention is All You Need" (Vaswani, 2017) paper. This video discusses the neuroscience and the psychology related aspects of "Attention". *Visual Guide to Transformer Neural Networks (Series) - Step by...
From “Artificial” to “Real” Intelligence - Major AI breakthroughs in 5 Minutes (1957-2022)
2.9K views • 2 years ago
From the days when no one believed in artificial neural networks (ANNs), to the present time when they are ubiquitous, to a plausible future where they could surpass human intelligence - here is a 5-minute summary of the defining moments in AI research from 1957 to 2022. VIDEO CREDITS - the original video is taken from "Kung Fu Panda" (2008). Storyline & Note-Worthy Events 00:00:21 : [The first Artifi...
Visual Guide to Transformer Neural Networks - (Episode 3) Decoder’s Masked Attention
63K views • 3 years ago
Visual Guide to Transformer Neural Networks (Series) - Step by Step Intuitive Explanation Episode 0 - [OPTIONAL] The Neuroscience of "Attention" ua-cam.com/video/48gBPL7aHJY/v-deo.html Episode 1 - Position Embeddings ua-cam.com/video/dichIcUZfOw/v-deo.html Episode 2 - Multi-Head & Self-Attention ua-cam.com/video/mMa2PmYJlCo/v-deo.html Episode 3 - Decoder’s Masked Attention ua-cam.com/video/gJ9k...
Visual Guide to Transformer Neural Networks - (Episode 2) Multi-Head & Self-Attention
160K views • 3 years ago
Visual Guide to Transformer Neural Networks (Series) - Step by Step Intuitive Explanation Episode 0 - [OPTIONAL] The Neuroscience of "Attention" ua-cam.com/video/48gBPL7aHJY/v-deo.html Episode 1 - Position Embeddings ua-cam.com/video/dichIcUZfOw/v-deo.html Episode 2 - Multi-Head & Self-Attention ua-cam.com/video/mMa2PmYJlCo/v-deo.html Episode 3 - Decoder’s Masked Attention ua-cam.com/video/gJ9k...
Visual Guide to Transformer Neural Networks - (Episode 1) Position Embeddings
125K views • 3 years ago
Visual Guide to Transformer Neural Networks (Series) - Step by Step Intuitive Explanation Episode 0 - [OPTIONAL] The Neuroscience of "Attention" ua-cam.com/video/48gBPL7aHJY/v-deo.html Episode 1 - Position Embeddings ua-cam.com/video/dichIcUZfOw/v-deo.html Episode 2 - Multi-Head & Self-Attention ua-cam.com/video/mMa2PmYJlCo/v-deo.html Episode 3 - Decoder’s Masked Attention ua-cam.com/video/gJ9k...
K-means using R
30K views • 8 years ago
Differentiating various species of the 'Iris' flower using R. This video was inspired by another great video: "How to Perform K-Means Clustering in R Statistical Computing" ua-cam.com/video/sAtnX3UJyN0/v-deo.html
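
(The video itself works in R; as a hedged cross-reference, here is the same Iris clustering workflow sketched in Python with scikit-learn, assuming that library is available.)

```python
from sklearn.cluster import KMeans
from sklearn.datasets import load_iris

# Cluster the classic Iris measurements into 3 groups (one per species)
# and compare each discovered cluster against the true species labels.
iris = load_iris()
kmeans = KMeans(n_clusters=3, n_init=10, random_state=42).fit(iris.data)

for cluster_id in range(3):
    members = iris.target[kmeans.labels_ == cluster_id]
    counts = [int((members == species).sum()) for species in range(3)]
    print(f"cluster {cluster_id}: {len(members)} flowers, species counts {counts}")
```
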
Introduction to Clustering and K-means Algorithm
75K views • 8 years ago
by Batool Arhamna Haider

COMMENTS

  • @auravaces
    @auravaces 4 days ago

    Awesome, it's amazing how looking at things a bit more closely can reveal so much, great work!

  • @vanhell966
    @vanhell966 4 days ago

    Amazing work. Really appreciate you making complex topics into simple language with a touch of anime and series. Amazing.

  • @GaneshKrishnan
    @GaneshKrishnan 4 days ago

    I can't find the "previous video". This is episode 1?

  • @shahonsoftware
    @shahonsoftware 17 days ago

    Felt like a very good Nova episode!

  • @kevinsalvadoraguilardoming5082
    @kevinsalvadoraguilardoming5082 18 days ago

    Congratulations, the best explanation that I have ever seen

  • @renanangelodossantos4726
    @renanangelodossantos4726 19 days ago

    I've watched and read a lot about LLM and Transformers. This is the best explanation, hands down.

  • @alemdemissie2769
    @alemdemissie2769 19 days ago

    You are amazing! Your video took my attention. That is how learning should be. Keep it up!

  • @ScottzPlaylists
    @ScottzPlaylists 20 days ago

    Correlate every step with the transformer code, and it would be Even Better than the best of the best❗🤯 Are you married❓

    • @AGIBreakout
      @AGIBreakout 20 days ago

      I 2nd that ❗

    • @AI.24.7
      @AI.24.7 20 days ago

      Yeah Do that --- awesome !!!!!!

  • @ScottzPlaylists
    @ScottzPlaylists 20 days ago

    👍 Your accent is awesome and unique.. what language/area does it come from...❓ 👍

  • @ScottzPlaylists
    @ScottzPlaylists 20 days ago

    Best explanation I've seen yet.. Thanks❗👏 If you would go through it again in another video while showing the code for each step, it would be perfect❗❗ You got my Subscription 💻, Thumbs Up 👍, and Comment... 📑 ❗

  • @ahp-6785
    @ahp-6785 21 days ago

    You are the mother of StatQuest and 3Blue1Brown. Both of these guys are awesome at explaining complex ideas in simple words. But you are the best.

    • @ninjahunterx7497
      @ninjahunterx7497 17 days ago

      I don't know about StatQuest (I haven't seen his videos), and 3Blue1Brown is good because of the visualization he brings with his advanced animations. But honestly, here she explained all these concepts using simple animations and had a good structure throughout the videos, each connecting well to the other. Very commendable if you ask me.

  • @ahsentahir4473
    @ahsentahir4473 23 days ago

    You are an enlightened soul!

  • @andybrice2711
    @andybrice2711 23 days ago

    This really is an excellent explanation. I had some sense that self-attention layers acted like a table of relationships between tokens, but only now do I have more sense of how the Query, Key, and Value mechanism actually works.
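
To make the Query/Key/Value "table of relationships" described above concrete, here is a minimal NumPy sketch of scaled dot-product self-attention (Vaswani et al., 2017); the shapes and random matrices are illustrative only, not taken from the series.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention: each token's Query is matched
    against every token's Key, and the resulting weights blend the Values."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])           # token-to-token affinities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # row-wise softmax
    return weights @ V                                # weighted blend of Values

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))                           # 4 tokens, 8-dim embeddings
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)            # -> (4, 8)
```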

  • @michal261
    @michal261 1 month ago

    Awesome. Thanks so much

  • @adscript4713
    @adscript4713 1 month ago

    As someone NOT in the field reading the Attention paper, after having watched DOZENS of videos on the topic this is the FIRST explanation that laid it out in an intuitive manner without leaving anything out. I don't know your background, but you are definitely a great teacher. Thank you.

    • @HeduAI
      @HeduAI 1 month ago

      So glad to hear this :)

  • @user-wj7jx9my8q
    @user-wj7jx9my8q 1 month ago

    wow lady, take my heart!!

  • @Jai-tl3iq
    @Jai-tl3iq 1 month ago

    Please please continue making videos!!!!

  • @electricalengineer5540
    @electricalengineer5540 1 month ago

    What have I just seen! Never knew learning could be this much fun.

  • @oludhe7
    @oludhe7 1 month ago

    Literally the best series on transformers. Even clearer than StatQuest and Luis Serrano, who also make things very clear.

  • @AZ-hj8ym
    @AZ-hj8ym 1 month ago

    great channel

  • @laalbujhakkar
    @laalbujhakkar 1 month ago

    Please continue to make videos if you can. You have a talent for teaching complex topics clearly. Your transformers series really helped me! thank you! 💙💙💙

  • @sharjeel_mazhar
    @sharjeel_mazhar 1 month ago

    You have my utmost respect, ma'am!

  • @laalbujhakkar
    @laalbujhakkar 1 month ago

    Amazing explanation! Best on UA-cam! Totally underrated! I feel fortunate to have found it. Thank you! :) 💐👏👏

  • @kaushikrao2932
    @kaushikrao2932 1 month ago

    I started laughing while being dead serious listening to your explanation.

  • @subhamraj7124
    @subhamraj7124 1 month ago

    If I am not wrong, training is done in a single time step, so the decoder should output the whole sequence at once and not one token at a time. During inference, it generates tokens one by one. Since the masked multi-head attention concept applies during training, it should all happen in a single time step.
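
The distinction in this comment can be shown directly: a causal mask is what lets the decoder train on all positions in one pass while still forbidding each position from seeing its future. A minimal NumPy sketch (illustrative, not from the video):

```python
import numpy as np

def causal_mask(seq_len):
    """-inf above the diagonal: position i may only attend to positions <= i."""
    return np.triu(np.full((seq_len, seq_len), -np.inf), k=1)

scores = np.ones((4, 4))                 # raw attention scores for 4 tokens
masked = scores + causal_mask(4)         # hide "future" tokens during training
weights = np.exp(masked)
weights /= weights.sum(axis=-1, keepdims=True)
print(np.round(weights, 2))              # row i spreads weight over tokens 0..i only
```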

  • @marsgrins
    @marsgrins 1 month ago

    This is the best. Thank you sooooo much Batool for helping me understand this!!!

    • @HeduAI
      @HeduAI 1 month ago

      You are very welcome :)

  • @TheClassofAI
    @TheClassofAI 1 month ago

    Fantabulous explanation :-)

  • @RafidAslam
    @RafidAslam 1 month ago

    Thank you so much! This is by far the clearest explanation that I've ever seen on this topic

  • @humanity2809
    @humanity2809 1 month ago

    This is a true masterpiece! I can't wait for the follow-up videos.

  • @BlockDesignz
    @BlockDesignz 1 month ago

    First person to concretely explain why they use a periodic function, which in my mind would give the same position embedding when you come back to the same point on the curve. Thank you!
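
The worry in this comment (a single periodic curve eventually repeats) is exactly what the paper's design avoids: many sinusoids at different frequencies, so while each individual sine or cosine cycles back, the full embedding vector is effectively unique per position. A hedged NumPy sketch of the sinusoidal scheme from "Attention Is All You Need":

```python
import numpy as np

def position_embeddings(num_positions, d_model):
    """Sinusoidal position embeddings: each dimension pair cycles at a
    different frequency, so no two positions share the whole vector."""
    positions = np.arange(num_positions)[:, None]                        # (pos, 1)
    rates = 1.0 / np.power(10000.0, np.arange(0, d_model, 2) / d_model)  # per-pair frequencies
    pe = np.zeros((num_positions, d_model))
    pe[:, 0::2] = np.sin(positions * rates)  # even dimensions: sine
    pe[:, 1::2] = np.cos(positions * rates)  # odd dimensions: cosine
    return pe

pe = position_embeddings(50, 16)
print(np.allclose(pe[0], pe[44]))  # False: one curve may repeat, the full vector does not
```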

  • @nadhembenhadjali9063
    @nadhembenhadjali9063 2 months ago

    Wow !!! Amazing ! Thank you so much !!!!!!!!!!

  • @MikeAirforce111
    @MikeAirforce111 2 months ago

    My goodness, you have talent as a teacher!! :-) This builds a very good intuition about what is going on. Very impressed. Subscribed!

  • @3boody738
    @3boody738 2 months ago

    I can never forget this amazing, perfectly colored painting, a piece of art, keep going.

  • @abassetiawan5685
    @abassetiawan5685 2 months ago

    That's a fantastic video. It's a hidden gem unlike anything I have watched on UA-cam. It's very inspiring! Thank you so much.

  • @az093569
    @az093569 2 months ago

    I rarely comment on YT videos but this was so beautiful! What an underrated channel :0 I need to thank my teacher for recommending it and also I'm gonna share this with all my friends. Thank you for your excellent work!

  • @vpanagiotou481
    @vpanagiotou481 2 months ago

    @HeduAI Excuse me, but I cannot find the episode 0 on your channel which talks about the transformer encoder-decoder architecture. There is the video which talks about The Neuroscience of "Attention". Also, in your playlist there is a message alert which says "1 non-available video is hidden". Is this hidden video the so-called episode 0 which is about the transformer architecture?

    • @HeduAI
      @HeduAI 2 months ago

      Episode 0 was redundant, as I already covered everything in detail in the remaining videos, and hence removed it to save you time :)

  • @AnnasBlackHat
    @AnnasBlackHat 2 months ago

    The best and most understandable video about the transformer architecture.... this video is concise with a great explanation; you can perfectly get the intuition... thanks

  • @bobthegamer1351
    @bobthegamer1351 2 months ago

    Legend.

  • @aaquib2010
    @aaquib2010 2 months ago

    Hmm, I'm wondering why your videos are so underrated, all of them. They are so cool.

  • @aaquib2010
    @aaquib2010 2 months ago

    Hmm, I'm amazed why your videos are so underrated!!! All of them are so good.

  • @safiullah353
    @safiullah353 2 months ago

    Why are you leaving such long gaps between videos? Please try to upload at least one video a week 🙃👏

  • @saranzeb2183
    @saranzeb2183 2 months ago

    You just nailed it awesome explanation for keys queries and values

  • @faridalaghmand4802
    @faridalaghmand4802 2 months ago

    Fantastic. Thanks

  • @jessierichards8576
    @jessierichards8576 2 months ago

    really good explanation

  • @user-ej6uj5rf2q
    @user-ej6uj5rf2q 3 months ago

    Best video series ever! Thanks is not enough.

  • @user-xi5py2op4p
    @user-xi5py2op4p 3 months ago

    Waiting for next part

  • @Zixtys
    @Zixtys 3 months ago

    What a phenomenal video.

  • @jb_kc__
    @jb_kc__ 3 months ago

    Super clear explanations. Really appreciate you putting this series together! (and love the pop culture references)

  • @aritamrayul4307
    @aritamrayul4307 3 months ago

    Ohh, why am I only finding this channel now? It is criminally underrated!!

  • @mfatihaydogdu7
    @mfatihaydogdu7 3 months ago

    Awesome