Innovation as Fuel: PCBs, Electricity, and Artificial Intelligence

Computer Science

Nobody was scared of electricity. Well — actually, they were. When Thomas Edison and Nikola Tesla were busy arguing about whose version of it was better, the average person thought it was sorcery. Dangerous. Unnatural. Ungodly, even. Fishermen and farmers and factory workers looked at it the way people look at AI today — with that particular mixture of awe and dread that tends to accompany things we don't yet understand. And then, slowly, inevitably, it became the thing that kept your food cold and your house lit and your kettle boiling. Today, you don't think about electricity. You just plug in.

Artificial intelligence is electricity.

Not literally — though funnily enough, electricity is very much what's keeping AI alive. Every prompt you fire into a chatbot, every image conjured from thin air, every voice assistant that mishears your name — all of it runs on data centres drawing enormous amounts of power. The infrastructure is physical. Grounded. Mundane, even. Printed circuit boards, GPUs, cooling systems, fibre optic cables running under oceans. AI isn't magic floating in the cloud. It's silicon and solder and a very large electricity bill. The wonder of it is built on the deeply unsexy reality of hardware. And that's fine. That's how all the best innovations work.

What AI is, though, is fuel. The kind that powers the next layer of things. And the list of things it's already powering is longer than most people realise. Doctors are using it to catch cancers earlier than any human eye could. Farmers are using it to monitor soil conditions and reduce waste across entire harvests. Engineers are using it to simulate materials that don't exist yet — testing tensile strength and thermal resistance without ever touching a lab. Musicians are producing tracks in genres nobody has named yet. Fraud detection, drug discovery, accessibility tools for people who can't see or hear — these aren't hypotheticals. They're already running. Quietly. In the background. Like electricity.

But if you look around and don't see much of this? You're not imagining it. Innovation is still scarce. And there's a reason for that.

The game has evolved — but most people are still playing the old one.

There's a principle in computing called Moore's Law. Gordon Moore, co-founder of Intel, observed in 1965 that the number of transistors on a microchip was roughly doubling every year, a rate he revised in 1975 to every two years, and predicted the pattern would continue. More transistors meant more processing power. More processing power meant more complex software. More complex software meant higher abstraction layers. What this really describes, underneath the hardware specs, is a kind of compounding refinement: as technology matures, the raw technical details get buried deeper, and the interface presented to the human gets simpler and broader. You no longer need to write machine code to build software. You no longer need to understand memory allocation to build a web app. Each generation, a layer of complexity gets absorbed by the machine, and the human standing on top of it gets handed something a little more like a steering wheel.
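The compounding is easy to underestimate. A back-of-the-envelope sketch of the two-year doubling, using the Intel 4004's 1971 count of roughly 2,300 transistors as a starting point (the function name and the tidy doubling schedule are illustrative, not a real forecast model):

```python
def transistors(start_count: int, start_year: int, year: int,
                doubling_years: int = 2) -> int:
    """Project a transistor count assuming one doubling per `doubling_years`."""
    doublings = (year - start_year) // doubling_years
    return start_count * 2 ** doublings

# Twenty-five doublings over fifty years turn thousands into tens of billions.
print(transistors(2_300, 1971, 2021))  # 2300 * 2**25 = 77,175,193,600
```

Real chips zig-zag around that curve, but the order of magnitude is the point: exponential refinement is what keeps burying each generation's complexity under the next generation's interface.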

AI is the latest — and most dramatic — version of this. The abstraction layer has shot up. You don't need to know Python. You don't need to know how a database is structured, or how an API is authenticated, or what a transformer model actually does. The machine handles more of the technical execution than ever before. In theory, this is extraordinary. The barrier to building things has never been lower.

And yet. Walk into any office, any university, any household, and what you'll find people doing with AI is summarising emails and writing mediocre captions. The chatbot use case. The surface. Barely a candle's worth of what the infrastructure can actually do.

This is because the barrier didn't disappear. It changed shape — and most people haven't noticed.

The new gatekeeping skill is fluency. What you might call speaking AI. The ability to decompose a vision — a real vision, something specific and ambitious — into precise, structured, iterable instructions that a model can actually work with. That is not nothing. It requires you to think clearly about what you want, in what order, under what constraints, producing what kind of output. It requires project management instincts, and the imagination to see the finished thing before a single line of it exists. People who can do this are building genuinely impressive things. People who can't are asking ChatGPT what to have for dinner.

Moore's Law also tells us something hopeful here, though. If the pattern holds — and it has, stubbornly, for sixty years — then this layer of friction will also get absorbed. The interfaces will get better. The tools will get smarter at interpreting vague instructions. The skill floor for speaking AI will drop, the same way the skill floor for building a website dropped when WordPress arrived, and dropped again when Wix arrived, and again with every subsequent simplification. Technical knowledge doesn't disappear — it sediments. Gets buried under a friendlier surface. And what rises to the top, each time, is something harder to automate: taste, judgment, the capacity to imagine something worth building in the first place.

None of this makes technical skill obsolete. And it's worth saying that plainly, because the narrative can slide in that direction if you're not careful.

People still fix cars. Not because they can't afford a mechanic, but because understanding what's under the bonnet gives you a different relationship with the machine. You know when something sounds wrong. You know which warning light to take seriously and which one has been lying to you for three years. You know, when the mechanic tells you something needs replacing, whether that's true. DIY culture hasn't died because power tools got better — if anything, better tools created more people willing to attempt harder things. The skill and the tool have always existed in conversation with each other, not in competition.

AI is no different. Someone has to configure it. Someone has to audit it. Someone has to notice when it's wrong — and this is the part that deserves more attention than it gets. AI doesn't fail the way software traditionally fails. It doesn't crash. It doesn't throw an error. It doesn't refuse. It just... continues. Confidently. Fluently. In complete sentences. It will hallucinate a court case that never happened and cite it with a judge's name and a year and a jurisdiction. It will generate code that looks exactly right and is subtly broken in a way that only surfaces under specific conditions. It will give you a medical summary that is ninety percent accurate and ten percent dangerous — and nothing in the output will tell you which is which. That is a specific and unusual failure mode. Traditional software, when it breaks, tends to make noise. AI, when it breaks, tends to sound like it knows what it's talking about.
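To make that failure mode concrete, here is a hypothetical example (mine, not taken from any real model output) of the kind of code that looks exactly right, passes the obvious test, and is silently wrong under one specific condition:

```python
def split_bill(total_cents: int, people: int) -> list[int]:
    """Split a bill into equal shares. Reads as obviously correct."""
    share = total_cents // people
    return [share] * people

# The obvious test passes:
assert split_bill(300, 3) == [100, 100, 100]

# The edge case fails silently: 100 cents in, only 99 cents out.
# No crash, no error — the remainder of the integer division just vanishes.
assert sum(split_bill(100, 3)) == 99

# A correct version distributes the remainder across the first few shares:
def split_bill_fixed(total_cents: int, people: int) -> list[int]:
    share, remainder = divmod(total_cents, people)
    return [share + 1] * remainder + [share] * (people - remainder)

assert sum(split_bill_fixed(100, 3)) == 100
```

Nothing about the broken version announces itself. It returns plausible numbers in the right shape, every time, and the cent only goes missing when the totals don't divide evenly. That is precisely the shape of the problem: the output sounds like it knows what it's talking about.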

The people who can catch that — who understand the underlying system well enough to know where the cracks are likely to appear, and what a wrong answer looks like when it's wearing the costume of a right one — those people are not going anywhere. They become more valuable precisely because fewer people bother developing that depth when the surface is so convenient.

And there are problems that simply cannot be handed to AI at all. Problems too sensitive, too specific, too consequential for a system that optimises for plausibility rather than truth. Security vulnerabilities. Legal interpretation. Medical edge cases where being almost right is worse than being wrong outright, because almost right doesn't trigger any alarms. In those moments, the human with deep technical knowledge isn't a fallback. They're the point.

The abstraction layer rising doesn't mean the foundation stops mattering. It means the foundation becomes load-bearing in a quieter, less visible way. And the people who understand it are the ones keeping the whole structure from quietly developing cracks.

The fear, of course, makes sense. New fuel always attracts suspicion. When the printing press arrived, people worried it would make scholars lazy and fill the world with bad ideas. (It did both. It also accelerated the Renaissance.) When calculators reached classrooms, teachers panicked that students would forget how to think. The pattern is consistent: powerful tools arrive, people panic, the tools get absorbed into ordinary life, and we move on to panicking about the next one.

Jobs change shape. They shed the parts that were never really about human skill in the first place — the repetitive, the mechanical, the mind-numbing — and what remains is the part that was always the interesting bit. AI won't take your place. It'll take the dull part of your place, and hand the rest back to you with more time and more room to do it properly. That's what fuel does. It doesn't replace the driver. It doesn't make the journey meaningless. It opens the road.

And if we get this right — really right, not just "useful chatbot" right but genuinely, deeply right — the ceiling disappears entirely.

Think about what has always slowed human progress down. Not lack of ambition. Not lack of imagination. Lack of capacity. We have always been bottlenecked by the sheer volume of complexity involved in solving hard problems. Climate modelling. Protein folding. Orbital mechanics. The logistics of feeding eight billion people. These aren't unsolvable problems in principle — they're just problems with more variables than any human mind, or any team of human minds, can hold at once. AI doesn't have that ceiling. Or at least, it doesn't have our ceiling. Point it at hunger and it can model supply chains, soil health, distribution networks, and climate projections simultaneously, at a scale no committee ever could. Point it at crime and it can identify the socioeconomic fault lines that produce it, long before they erupt. Point it at poverty and it can find the intervention points that aid budgets and good intentions keep missing.

Point it at space, and things get genuinely exciting. Mars isn't just a dream for the romantically inclined — it's an engineering problem. A vast, brutal, beautiful engineering problem involving radiation shielding, atmospheric pressure, food production in alien soil, and the psychological cost of being very far from everything you've ever known. AI can simulate all of that. Can design habitats, stress-test life support systems, optimise trajectories, manage the thousand interdependent variables that make the difference between a colony and a graveyard. We've always had the audacity to imagine it. Now we're building the tools to actually do the maths.

And then there's the stranger territory. Consciousness backups. The idea that what makes you you — the pattern of it, the accumulated weight of every experience and preference and instinct — might one day be mappable, storable, transferable. Not immortality exactly. Something more complicated and more interesting than that. It sounds like science fiction because it is, right now. But so did sequencing the human genome, once. So did surgical robots. So did a computer that could beat every chess grandmaster alive. The distance between science fiction and Tuesday morning has been getting shorter for a very long time.

None of this is guaranteed. Progress isn't linear and the hard problems are genuinely hard. But the trajectory is pointing somewhere extraordinary — and that's worth holding onto, especially in the moments when the discourse gets small and the fear gets loud.

We are right now in the uncomfortable middle of the story. The part where the electricity is real but most people are still using it to light a single candle. The extraordinary things AI can fuel are largely still waiting — not because the fuel isn't there, but because most people haven't yet learned to drive. That will change. It always does.

The goal isn't AI as spectacle. It's AI as a socket in the wall. Useful, reliable, humming quietly in the background while you get on with building something worth building.

March 13, 2026, 9:02 p.m.

Published by Cyril