In this post, I share my perspective on one of the biggest challenges we face as technology professionals today: separating what really matters from the constant noise that bombards us every day.
You’ve probably felt that anxiety of seeing new technologies blowing up on X/Twitter or LinkedIn and thinking, “Do I need to learn this now? Will I fall behind if I don’t?” This feeling of tech FOMO (Fear of Missing Out) is real and can be paralyzing.
I’ve seen countless technologies born with revolutionary promises and dead within a few months. I’ve seen professionals invest hundreds of hours in frameworks that never gained market adoption, and others dismiss genuine trends out of excessive cynicism and fall behind.
The truth is there’s no magic formula. But there are signs, patterns and strategies that can help us make more consistent decisions.
Let’s start with the most important thing: how do you differentiate between superficial social media buzz and genuine market adoption of technology?
X/Twitter and LinkedIn have become stages for tech performance. Influencers gain followers by talking about the “next big thing.” Marketing companies pay for sponsored posts disguised as genuine opinions. Developers seek visibility by betting on technologies that seem innovative.
Note that these platforms have value. Often, that’s where we first hear about technologies that become important. The problem is that’s also where we hear about hundreds of others that never go beyond buzz.
Think about how many times you’ve seen enthusiastic posts about some technology that promised to “change everything” and, six months later, nobody was talking about it anymore.
We have a somewhat recent case with the metaverse: major initiatives, companies selling spaces, billions in investments, and when generative AI arrived, all that metaverse hype went down the drain. Nowadays, nobody even talks about it anymore. That’s the nature of social media; the attention cycle is short, and the next hype is always just one post away.
That’s why I developed the habit of not reacting immediately to what I see on social media. I used to panic when I noticed I was missing the train of modernity. Today, I’m more cautious; I take notes, observe, but don’t invest significant time until I see more concrete signs.
I remember the days of Visual Basic and the early days of C#, when I noticed an interesting pattern in Microsoft products: it was usually around version 6.0 that a tool either reached maturity or disappeared.
Visual Basic 6.0, for example, was both the peak and the end of the line for that VB model. C# followed a similar path; version 6.0 brought the stability the language needed. Then came a transition period on the .NET Framework, where the language topped out at C# 7.3, followed by a “rebirth” on .NET Core starting with C# 8.0, and today we’re at C# 14 running on .NET 10.
To gauge real market adoption of a technology, I rely on two indicators. The first is job postings:
Companies don’t hire for technologies they don’t intend to use. When you start seeing job postings on platforms like LinkedIn, Indeed or Glassdoor asking for experience in a particular technology, it means there’s real demand.
But be careful: a few postings aren’t enough. What matters is the trend. Is the technology appearing in more job postings over time? Is it being requested across different sectors and company sizes? Is it showing up in senior positions, or only in startup experiments?
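As a rough illustration of watching the trend rather than the raw count, suppose you’ve tallied how many postings mention a technology each month (the numbers below are hypothetical). A simple least-squares slope separates growing demand from collapsing hype:

```python
def posting_trend(monthly_counts):
    """Least-squares slope of job-posting counts per month.

    A positive slope suggests growing demand; a negative one,
    fading interest. This is a back-of-the-envelope signal,
    not a forecast.
    """
    n = len(monthly_counts)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(monthly_counts) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, monthly_counts))
    var = sum((x - mean_x) ** 2 for x in xs)
    return cov / var

# Hypothetical monthly counts for two technologies
growing = [3, 5, 9, 14, 22, 31]   # steady upward trend: positive slope
fading = [40, 25, 12, 6, 3, 1]    # hype collapsing: negative slope
```

The point isn’t statistical rigor; it’s forcing yourself to look at the direction of the numbers over several months instead of reacting to a single burst of postings.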
The second indicator is time, and it’s probably the most straightforward and most powerful strategy: wait at least two years before diving deep into any new technology.
It’s not a magic number, but there’s logic behind it. The first year of any new technology is the honeymoon period: all enthusiasm, hype and excited early adopters. The real problems haven’t appeared yet because nobody has had enough time to find them.
In the second year, reality starts to show. The first production projects reveal limitations. Companies that adopted early begin sharing their experiences, including the pain points. Competitors have time to react and offer alternatives.
If a technology is still growing and being adopted after two years, it has passed its trial by fire. It survived the initial hype cycle, proved to have real value and developed a sustainable community. It’s a much safer bet.
Of course, no rule is absolute. There are situations where it makes sense to adopt a technology earlier:
When you work in a cutting-edge field and being at the forefront is part of your competitive advantage, it may make sense to take more risks. AI startups, for example, frequently need to adopt emerging technologies before they mature.
When a technology solves a critical problem you have right now, and there are no mature alternatives, sometimes there’s no choice but to accept the risk.
When you can afford to invest in learning even if the technology goes nowhere, such as in personal projects or hackathons, the cost of being wrong is low.
But for most professional decisions, especially those involving production projects or long-term career development, the two-year filter remains an excellent option.
Waiting two years doesn’t mean ignoring a technology completely during that period. It means following it from a distance, staying aware of what’s happening, without investing significant time in deep learning.
In practice, this can mean reading introductory articles to understand the basic concepts, following the leading developers and the official account on X/Twitter, watching one or two introductory talks, starring a few repositories to keep track of them, and joining casual discussions when the topic comes up.
This light-touch approach lets me stay informed while a technology matures, without sinking hundreds of hours into something that may never take off.
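Even the “follow from a distance” step can be semi-automated. The sketch below checks a starred repository’s current star count via GitHub’s public REST API (the endpoint and the `stargazers_count` field are part of GitHub’s documented API; the function names and the idea of polling periodically are my own):

```python
import json
import urllib.request

# Public GitHub REST API; unauthenticated requests are rate-limited
API = "https://api.github.com/repos/{owner}/{repo}"

def parse_star_count(payload: str) -> int:
    """Extract the star count from a GitHub repository API response body."""
    return json.loads(payload)["stargazers_count"]

def star_count(owner: str, repo: str) -> int:
    """Fetch the current star count for owner/repo from the GitHub API."""
    with urllib.request.urlopen(API.format(owner=owner, repo=repo)) as resp:
        return parse_star_count(resp.read().decode())
```

Sampling this monthly for the repositories you’re tracking gives you the same kind of trend data as job postings: steady growth over a year or two is a far stronger signal than a one-week spike after a launch announcement.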
There’s one more trap worth naming. Sometimes we choose complex technologies because they seem more sophisticated or because they challenge us intellectually. But unnecessary complexity is a cost, not a benefit.
The right question isn’t “Is this technology impressive?” but “Is this technology appropriate for my problem?” Often, the simplest solution is the best.
Kubernetes is incredible, but do you need it for your application? Microservices are elegant, but does your project justify the complexity? The most advanced technology isn’t always the most suitable.
The best strategy for identifying technologies worth pursuing is to combine cautious observation, analysis of concrete adoption signals and patience to let time filter what’s truly relevant.
What we can always rely on is maintaining a growth mindset and learning from new technologies; even if they disappear, we’ll still carry a bit of their way of thinking into the next phase of life’s game.
And you? How many times have you seen technologies emerge, disappear or continue to this day?
Jefferson S. Motta is a senior software developer, IT consultant and systems analyst from Brazil, developing on the .NET platform since 2011. He is the creator of www.Advocati.NET, a CRM for Brazilian law firms, since 1997. He enjoys being with family and petting his cats in his free time. You can follow him on LinkedIn and GitHub.