Fusion of machine learning and human consciousness

Attention is all that was needed for the machine learning industry to explode. Just as attention fuels AI, prana is the vital energy that powers the human body and mind. For humans, prana is nothing extraordinary; it has always been a fundamental part of our existence. Wherever our prana flows, our attention follows, and where our attention goes, growth and prosperity soon follow. I like to call it Praan (प्राण), also known as Prana.
If you are ever bored, try settling into an Idle Breathing rhythm: stop looking through your eyes and let the focus of your drishti stabilize at the center of your eyebrows, as if you were throwing light on the path ahead as you walk forward. We can try to describe it at length, but just T.R.Y.
If I ever found the Genie's bottle, I wouldn't ask for more money; I would ask for more Praan (प्राण). Stop asking and start being.
— THE Y🧿GIC LENS
A Machine Learning Perspective
The paper “Attention Is All You Need” introduced the Transformer architecture, a significant advancement in machine learning, particularly for tasks involving sequences such as text and speech. Here’s a breakdown:
- Problem: Traditional methods like Recurrent Neural Networks (RNNs) struggled with long-term dependencies in sequences. They might “forget” information from earlier parts.
- Solution: The Transformer relies on an attention mechanism. This allows the model to focus on the parts of the input sequence that are most relevant to the current processing step. Imagine you’re reading a sentence — the attention mechanism lets the model pay closer attention to the words that matter for understanding the current word, even if they’re far apart in the sentence.
- Benefits: Transformers can capture long-range dependencies more effectively than RNNs, leading to improved performance in tasks like machine translation, text summarization, and question answering.
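To make the breakdown above concrete, here is a minimal sketch of the scaled dot-product attention at the heart of the Transformer, written in plain NumPy. The function name and the toy input values are illustrative, not from the paper; the formula itself, softmax(QKᵀ/√d_k)·V, is the one the paper defines.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    # Similarity of every query to every key, scaled to keep gradients stable
    scores = Q @ K.T / np.sqrt(d_k)
    # Softmax over the keys (numerically stabilized)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output is a weighted mix of the value vectors
    return weights @ V, weights

# Toy self-attention: 3 "words", each a 4-dim vector (Q = K = V)
rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))
out, w = scaled_dot_product_attention(x, x, x)
```

Each row of `w` sums to 1 and tells you how much that word "attends to" every word in the sequence, including distant ones — which is exactly why long-range dependencies are no longer a problem.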