Why Grok Is Posting Pornography

When Grok began posting pornographic content on X, the predictable reactions followed. Some laughed. Some clutched pearls. Some framed it as evidence that AI has “gone rogue,” or that Elon Musk’s projects are unserious, or that artificial intelligence is inherently dangerous.

None of those explanations are adequate.

Grok is not posting pornography because it is rebellious, broken, or secretly perverse. It is doing so because it is functioning inside the incentive structure it was placed in, and that structure rewards visibility, engagement, and provocation far more than coherence, restraint, or social cost.

This is not an AI morality problem.
It is a systems design problem.

2026: Old Years, New Habits

It’s peculiar how we celebrate New Year’s, but not the old ones. We get recaps, of course, but they’ve become increasingly hollow for me.

I began this year by finishing something already in motion: a reorganization project that reached a small but meaningful milestone. I completed painting my writing area on Old Year’s Day, 2025, and had it semi-organized by the start of 2026.

Once everything was in place, I picked up a thin book I’d come across while moving others around: My Inventions. Sixty-two pages. Manageable. Familiar, but distant.

Tesla working on a modern outlet

In the very first paragraph of the book, he writes:

The progressive development of man is vitally dependent on invention. It is the most important product of his creative brain. Its ultimate purpose is the complete mastery of mind over the material world, the harnessing of the forces of nature to human needs…

Nikola Tesla, “My Inventions”, 1919.

Tesla sets an uncomfortably high bar. Not because he was prolific, but because of how he oriented himself toward reality. Invention, to him, wasn’t rebellion or irritation — it was obligation. A form of responsibility toward the material world.

I remember thinking: that would be a good quote for a New Year’s post. So I left it there to simmer and went about the rest of my day.

The Defiant Strike at Landlord Specials

Later, while scanning the news — a useful habit when you’re feeling a bit too content — I came across an article titled “I Defied the ‘Landlord Special’ and Replaced My Electrical Outlets”, published by Dwell.

The title promised defiance. It also promised electrical outlets, something I’ve had recent, practical experience with. Different context, same principles.

In the article, the author describes a plug that wouldn’t stay in an outlet. She diagnoses the cause as layers of paint, begins scraping with a screwdriver, exposes wiring, realizes she’s out of her depth, and ultimately pays an electrician to resolve the problem – happily, and at full price.

Good outcome. No one was hurt. The electrician was paid what he asked. Possibly she was paid for the article as well. Everyone walks away intact.

But there’s something unintentionally revealing in the story.

The original problem was not conceptual. It wasn’t even particularly complex. A loose plug is often a matter of worn contacts – a mechanical tolerance issue. In this case, she decided it was paint, and we can assume she was right. The article never establishes that the circuit was isolated at the breaker before work began, which means the difficulty may have been self-inflicted from the outset.

More importantly, the story frames escalation as inevitability, rather than as a result of skipped fundamentals.

Defiance wasn’t exercised against the Landlord. It was exercised against patience.

The Contrast

It’s obviously unfair to compare a writer at Dwell with Nikola Tesla across a century. Tesla was not average. That’s the point. The Dwell author is closer to average – and that’s not an insult. It’s a useful baseline.

What changed isn’t intelligence. It’s posture.

Tesla encountered resistance and treated it as information. He slowed down, internalized the system, and accepted discomfort as part of understanding. The modern narrative encounters resistance and treats it as friction to be eliminated – preferably quickly, preferably by someone else.

The electrician wasn’t the failure of the story. He was the most grounded presence in it.

It’s also worth noting that the electrician was compensated for restoring function, while the article exists largely to restore comfort.

What’s revealing is how easily defiance collapses into outsourcing when competence is optional and reassurance is sufficient.

In a century, we didn’t lose the ability to fix things. We lost the habit of turning off the breaker first, literally and metaphorically.

And that’s worth thinking about as we roll into another year, celebrating the new while quietly abandoning the old.


The Machine That Mistook Noise for Knowledge

A person on a pier, surrounded by information in the water at sunrise.

If truth is drowning, then perhaps it’s because the current changed.

What once filtered information through time and reflection is now filtered through code and incentive. AI and social media haven’t destroyed knowledge. They’ve destabilized how it forms.

The digital sphere was once a network of minds. Now it’s a marketplace of reactions. We taught the machine to notice what moves us, and it learned that motion itself was the goal. The faster it can make us respond, the more valuable we become.

We mistake this constant movement for learning. But the faster a system learns, the less it remembers.

In the Absence of Posts, the Presence of Change

It has been quiet on RealityFragments.com. KnowProSE has been quiet too.

In the past six months, the world has changed again — or maybe it’s only that our mirrors have. I’ve spent most of that time recovering, writing, and watching how people have started to speak through technology more than with it.

Somewhere along the way, the screen stopped being a window and became the room itself. The tools that once helped us connect now seem to decide the terms of connection. We post, scroll, and reply in the same way we breathe — automatically, and often without noticing the noise that replaces thought.

The irony is that, in trying to stay connected, we’ve become buffered. Technology has turned from translator to barrier, and in doing so, it has begun to rewrite what we think communication even is.

In that same time, I’ve also had to step away from it all — not out of protest, but necessity. A heart attack will do that. It slows everything down, but it also sharpens perception. You begin to notice what is worth responding to, and what is simply a call for reaction.

I’m writing a book that explores how humans and technology co-evolve, and how each shapes the other’s language – particularly within liminal spaces. But I’ll be writing here again too, at a pace that feels human. I figure once a fortnight. That’s once every two weeks, for those who have ChatGPT handle their communication.

This space will continue to be a small resistance to the automation of meaning. This will remain a place where thought is allowed to unfold instead of being compressed into content. If you’ve found your way back here, thank you. I intend to make it worth the bandwidth.

There’s much to cover, of course, but much of what I’m seeing now is in between states, in liminal spaces where speculation is rampant, where intentions aren’t sought out, and where AI slop is still in a novelty phase.

If you have ideas – drop them in the comments. I’m curious what real people think. If, however, you’re a spam salesperson, please note that deliveries are not accepted in the comments (the front door), that I really don’t want to spend my time in a phone meeting (a succinct email is much better, don’t make me follow up because I’m not going to emotionally invest), and that I’m not particularly fond of spam.

– Taran

Beyond the Illusion of Control in the Consent Economy: Global Privacy Control

An image of a robotic figure eating data

We hear a lot about consent these days. Cookie banners pop up everywhere, asking if we accept or decline. But the truth is, the consent economy we live in is not built on honest choice. It is built on exhaustion, which is, in its own way, manufactured consent.

It doesn’t make sense because it shouldn’t.

Sometimes you pick ‘accept all’ accidentally, because you’re in a hurry or because you just don’t want to fiddle with all the options. You want what you’re trying to get, quickly.

Every website that throws up a cookie wall is asking the same question: Do you give us permission to track you?

And every time, the easiest answer is yes.

The alternative is a labyrinth.

Click into the settings. Turn off dozens of little switches. Navigate layers of language designed to confuse. By the end, your attention has already been taxed. You are not making a decision anymore. You are surrendering to convenience.

That is not consent. That is fatigue disguised as agreement.

Here is the part no one wants to talk about.

Your browser could be handling all of this for you.
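
As a concrete example, Global Privacy Control is exactly this: a machine-readable opt-out signal that participating browsers send with every request as a `Sec-GPC: 1` header. Here’s a minimal Python sketch of a server honoring it – the function names are mine, not from any standard, and whether a later explicit “yes” should override the signal is a policy question I’ve simplified away:

```python
# Minimal sketch: honoring the Global Privacy Control (GPC) signal.
# Participating browsers send the "Sec-GPC: 1" request header; a site
# that respects it treats that as an opt-out of tracking by default.

def honors_gpc(headers: dict) -> bool:
    """True if the request carries the GPC opt-out signal."""
    return headers.get("Sec-GPC", "").strip() == "1"

def tracking_allowed(headers: dict, explicit_consent: bool = False) -> bool:
    """GPC speaks for the user; without it, only explicit consent counts."""
    if honors_gpc(headers):
        return False          # the browser already said no for the user
    return explicit_consent   # silence is not a yes

# One header check replaces an entire cookie-banner labyrinth.
print(tracking_allowed({"Sec-GPC": "1"}, explicit_consent=True))  # False
```

One check, made once, in a tool you already control. No labyrinth required.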

AI Legislation in Trinidad and Tobago: What They’re Not Talking About

There’s a lot of noise about artificial intelligence these days, and in the latest round of headlines, it seems Trinidad and Tobago has joined the global chorus.

President Christine Kangaloo recently called for urgent legislation around AI, and CyberSafeTT backed that call with pointed reminders about Trinidad and Tobago’s outdated cybercrime laws and the long-dormant Computer Misuse Act. On paper, this all looks like progress.

Of course, concerns are being expressed about deepfakes, and those are legitimate. These days, though, they’re pretty easy to spot, and even real images, video, and text are accused of being faked at times.

Read between the lines and you’ll notice what’s missing.

They’re still not talking about the actual ecosystems AI is designed to feed on. We’re legislating the machinery, but not the market it consumes. Us.

What do I mean? Let’s name them:

Toward Our Consent Economy

Notice: We reserve the right to refuse service

In the world as it stands, consent is often something symbolic. A checkbox. A silence. A quick click to move forward. ‘Accept all cookies’ takes one click, easy; picking which cookies takes many. There is presently very little of a consent economy, if one exists at all.

But real consent is not about legal coverage or design tricks. It is about choice, clarity, and respect. It’s about treating people the way you want to be treated.

If we were to enable a consent economy, it would mean building systems where individuals are not simply participants but sovereign agents, oddly in an age where people are talking about AI agents.

Where you are not the product or the fuel. Where your time, your attention, your data, your presence – are all things you control, not things extracted by default.

We are humans, and we hear about consent in many contexts, ranging from the #MeToo movement to ‘I want this land more than you do’. It’s not alien to us anywhere else, but in the context of technology, where code has become law, consent is manufactured for things that didn’t exist when consent was given. That’s not really consent. That’s law without ethics, because technology became a law unto itself and has become unwieldy.

The consent economy would begin with something deeper than infrastructure. It would start with a shift in how we think.

Consent is not just a moment. It is an ongoing state.

It must be informed. It must be revocable. It must be specific. And above all, it must be respected.

That is the foundation. From there, the layers begin.

The Technological Layer

The tools we use would be built around consent, not convenience for platforms. Every digital interaction would request permission clearly and honestly. Your data would remain yours. It would travel with you. It would not be scattered across servers and sold behind glass walls.

Interfaces would empower the user. Instead of “accept all,” we would see “why do you need this?” Instead of “we use cookies,” we would see “what do you want to allow us to know?”

Consent would become a protocol. Machine-readable. Portable. Consistent. Just like HTTPS told the internet to encrypt everything, consent would become its own standard.

The Economic Layer

Consent becomes currency. Not metaphorically. Literally.

If someone wants your attention, they need to earn it. If a system wants your data, it must ask and offer something in return. Not manipulation. Not threats. Not guilt. Not holding family pictures hostage.

Compensation can be financial, or it can be access to something of value to you; there can even be barter.

The point is this: no more silent transactions. No more invisible trades.

If you are giving something, it is because you chose to. And if you do not want to participate, you are not punished for walking away.

The Legal Layer

This would need the strength of law. Consent would be more than moral. It would have to be enforceable. You would have the right to see who has your data. You would have the right to remove it. You would have the right to refuse, without losing access to what should never have been conditional.

Audits would be possible. Logs would be kept. Consent would not disappear once given; it would be something that lives and changes as you do.

The Cultural Layer

Perhaps the hardest part. Changing the expectations of people. Teaching that saying no is not rude. That privacy is not paranoia. That walking away from a service, a feed, a trend, is not antisocial.

In a consent economy, disconnection is not a failure. It is a choice.

Children would grow up knowing that their time and presence matter.

Adults would stop feeling trapped by design.

And society might finally understand that freedom is not just about speech. It is about silence too.

What Would It Look Like in Practice?

Maybe it starts with one interaction. One refusal. One decision to pause instead of react. Maybe spending less time on websites that make money off your information, attention, and intention. Maybe not spending so much time on those social media networks where you look at cute kittens while the algorithm skips the update about someone you know falling ill or dying.

Maybe it grows through tools that ask, instead of tools that assume. Through creators who build for people, not platforms. Through systems that start with the question, “Do you want this?”

That is how the consent economy would begin.

Not with a revolution, but with a refusal. Not with noise, but with a pause.

And in that pause, the quiet return of power to the individual, allowing the evolution of digital culture rather than the strip-mining of it for a select few.

Beyond Attention: The Next Frontiers of the Digital Economy

An image of a robotic figure eating data

The information economy laid the groundwork, the attention economy built the walls, the intention economy promised to open the doors. But somewhere in the middle of all of that, we forgot to ask one important question, a question that we always seem to ignore.

What’s next?

It is easy to get lost in the buzzwords, and easy to believe we have reached the peak of digital economies. That may not be true. After all, the platforms are saturated and algorithms have their hooks in deep. Your attention is no longer just captured; it is being auctioned off like prime beachfront property during hurricane season.

Who is in control? Are you in control of your information, attention and intention? Or is someone else? Something else?

The economy of digital space is far from done evolving. In the quiet corners of the web, and in boardrooms lit by the glow of shareholder expectations, new economies are forming. Some are subtle. Others are quietly waiting to swallow the old models whole.

The Data Economy: Who Owns the New Oil?

You have heard it before. Data is the new oil. They’ve been saying that since the late 1990s, and maybe you weren’t paying attention.

Unlike oil, you produce it constantly and freely. Every click, every scroll, every half-second pause on a headline you do not even click: it all becomes a breadcrumb in a trail you didn’t know you were leaving behind. Time spent on a web page, where your mouse or finger points – all of that goes into ‘heat maps’ so that you can be studied.
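
The mechanics of a heat map are simple enough to sketch: raw pointer samples get bucketed into a coarse grid and counted, and the busiest cell shows where you lingered. The coordinates and grid size below are made up for illustration:

```python
from collections import Counter

def heatmap(samples, cell=100):
    """Bucket raw (x, y) pointer samples into cell-sized grid squares."""
    grid = Counter()
    for x, y in samples:
        grid[(x // cell, y // cell)] += 1
    return grid

# Three of the four samples cluster in one cell: that is where you lingered.
samples = [(120, 80), (130, 90), (640, 400), (125, 85)]
hot = heatmap(samples)
print(hot.most_common(1))  # [((1, 0), 3)]
```

A few lines of counting, multiplied across millions of sessions, and your hesitations become someone else’s product roadmap.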

The question is not just about ownership anymore. It is about power. Who controls the platforms that harvest it? Who profits from the maps drawn by your unconscious behavior? And most importantly, who will be held accountable when those maps are used against you?

And what’s your role in this?

The Scarcity Economy: Manufacturing Famine in a Land of Plenty

We live in an age of digital abundance. Infinite music. Infinite movies. Infinite opinions. And yet, scarcity has never been more profitable.

Artificial scarcity is the trick of the trade now, with ‘exclusive access’, limited drops, and content hidden behind paywalls or buried under algorithmic suppression unless you know the secret handshake or, more accurately, pay the monthly subscription fee. The value of information should be what drives this, but more often than not a clickbait headline with a captivating image is all that is needed to inflate its perceived value.

Scarcity used to mean something was rare. Now it just means someone decided you cannot have it.

The Trust Economy: When Reputation Becomes Currency

In the old world, money talked. In the digital world, reputation shouts through a megaphone.

A stranger on the internet with enough five-star reviews will have more influence over your decisions than a close friend, and those reviews can be bought cheaply. Platforms have weaponized trust, turning it into a measurable asset. And once something is measurable, it can be bought, sold, and inevitably manipulated.

Trust has become a product, one carefully managed tweet at a time, and yet trusted sources are constantly in the news directly or indirectly: ‘fake news’.

The Creator Economy: Liberation or New Chains?

They told us the internet would free the creators. No more gatekeepers. Just raw talent and opportunity.

And yet, here we are. Creators are working harder than ever to feed algorithms that punish consistency lapses and reward sensationalism. Ownership of their own work remains a battle. Revenue streams dry up the moment a platform changes its terms of service or disappears overnight.

Freedom was promised. Dependency was delivered. In a word: indentureship.

The Algorithmic Economy: The Invisible Hands That Don’t Understand You

Adam Smith spoke of the invisible hand, but he never imagined it would be a server farm in Silicon Valley deciding what you see, when you see it, and whether you even know what you want anymore.

Algorithms do not care about you. They care about engagement metrics. They care about the next best ad placement. And if that means feeding you outrage to keep your thumb moving and your mind too distracted to notice, well… they will call that optimization.

If a product is good enough, it doesn’t need that much marketing, does it?

The Cognitive Economy: Selling Space in Your Mind

Attention is no longer enough. Now, they want to rent space inside your head.

It shows up as ‘helpful’ suggestions. Pre-filled search terms that shape what you think before you think it. News feeds that filter reality before it reaches your eyes.

This is not just about buying products. This is about buying patterns of thought.

The Consent Economy: A Future Worth Fighting For

Somewhere down the line, we lost the true meaning of consent, buried under endless terms-of-service agreements and pre-checked boxes.

The consent economy is not fully born yet, but it is kicking against the walls. People are waking up. Demanding transparency; demanding to know what is being done with their data, their creativity, their digital lives.

It is a fragile hope, but it is there.

And maybe, just maybe, that is the economy we should be building toward.

It begins with being aware of what you’re consenting to and making deliberate choices about whether you should consent at all. It’s about taking back control of your information (privacy), your attention (time and thought), and your intentions. You don’t have to unplug.

You just have to be willing to evaluate and decide the things you plug into rather than being told and guided by an incessant barrage of headlines.

Introducing the Intention Economy

Over the last week, I’ve touched on the information economy and the attention economy, and those were meant to prime you for the intention economy. Doc Searls first mentioned the ‘intention economy’, at least as far as I know, back in 2006:

“The Intention Economy grows around buyers, not sellers. It leverages the simple fact that buyers are the first source of money, and that they come ready-made. You don’t need advertising to make them.

The Intention Economy is about markets, not marketing. You don’t need marketing to make Intention Markets…”

“The Intention Economy”, Doc Searls, Linux Journal, March 8th, 2006

On the flip side, we had Daniel Solove advocating privacy of information and predicting a future where information shared online could have adverse effects in his 2004 book, “The Digital Person“, which had a much more… protective view… of our intentions. He didn’t mention the attention economy that I recall (I lent the book out and it’s loose in the Universe), but he described the bad side of it for consumers fairly well.

That I’m a privacy advocate instead of a marketer tells you where I stood on the issues I saw.

As it happens, the intention economy is being mentioned again, and not in a positive way. The idea stuck, the technology advanced, the amount of information out there advanced, marketers continued marketing, attention spans lowered, the information economy made way for the attention economy, and combined with social media and AI, the intention economy has hit alarming new levels.

“The near future could see AI assistants that forecast and influence our decision-making at an early stage, and sell these developing ‘intentions’ in real-time to companies that can meet the need – even before we have made up our minds.

This is according to AI ethicists from the University of Cambridge, who say we are at the dawn of a “lucrative yet troubling new marketplace for digital signals of intent”, from buying movie tickets to voting for candidates. They call this the Intention Economy…”

“Coming AI-driven economy will sell your decisions before you take them, researchers warn”, University of Cambridge, 30th Dec 2024

Our digital selves – the decisions we make that companies record – leave a trail of information (information economy) based on what attracts us (attention economy) that gives away our personal intentions. Who we’ll vote for. What we’ll buy next after this book. What movie we’ll want to watch on Netflix after this one.

Understanding this is important, because it’s likely no mistake that you’re seeing the advertising that you are. Sometimes you might even want to change, but the advertising won’t. The failure of all the social media algorithms combined with AI is that it limits your choices on a platform based on your past.

Couple this with the indoctrination of children through technological portals, social media sites, and generative AI – with ‘influencer’ being a title people are proud of – and you get a generation at risk of having its choices narrowed.

Think of a video game where your decisions get more and more narrow. You’re in someone else’s world. And your previous attention and information can produce digital intentions that are not necessarily your own. You’re part of a group somewhere in the digital landscape that has certain digital intentions.
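
That narrowing can be shown with a toy feed: weight each category by past clicks, always show the highest-weighted ones, and watch the options collapse. This is deliberately simplistic – real ranking systems are far more elaborate – but the feedback loop is the same:

```python
def feed(weights, k=3):
    """Show the k highest-weighted categories; ties keep insertion order."""
    return sorted(weights, key=weights.get, reverse=True)[:k]

def click(weights, choice, boost=2.0):
    weights[choice] *= boost  # past behavior amplifies future exposure

# Four interests to start; the loop always taps the top item shown.
weights = {"news": 1.0, "kittens": 1.0, "science": 1.0, "friends": 1.0}
for _ in range(5):
    click(weights, feed(weights)[0])

print(feed(weights, k=2))  # ['news', 'kittens']
```

Five taps later, one category outweighs the rest thirty-two to one, and the feed has quietly decided who you are.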

Do you always want the options provided to you to be based on your past information? Does your company? Does your country?

Take a good look at the advertising you see on the Internet and ask why you’re seeing it. That’s a good first step. It puts the attention economy and information economy into perspective.

Also see: “The rise of the intention economy: How AI is shaping your future”

The Potentially High Cost of Free Laptops (T&T)

They say the laptops are free. They say it is for the children. They say this is how we prepare the next generation for the digital future. But few people ask what that future will cost.

In Trinidad and Tobago, we are distributing laptops to students again. On the surface this is a good thing. Access to technology is essential. But we are not just giving hardware. We might also be giving away their digital freedom.

These laptops do not just run software. They run ecosystems. And those ecosystems have owners. The students might end up being locked into Microsoft products from the start. Word, Excel, Teams, and OneDrive become the defaults. The muscle memory forms early. This is how habits are built. This is how markets are captured without a single dollar exchanged upfront.

Oh, it’s free now? Like a free sample? Like you see in PriceSmart, only at scale: an indoctrination.

That’s why you’re likely reading this on a machine with a Microsoft operating system.

Mobile users at least have Android.

The Hidden Price of Free

This is called vendor lock-in. The students do not just learn computing. They learn Microsoft computing. They grow into professionals who expect the same tools. Businesses hire workers who are trained on the same systems. Entire institutions become dependent. And when the free licenses expire or the terms change, we pay.

We pay in foreign exchange for renewals.

We pay for support contracts.

We pay in data routed through foreign servers.

We pay every time our national productivity becomes a line item in someone else’s quarterly report.

We might even pay for the AI subscriptions that create decisions that benefit someone else somewhere else, in a dance of income disparity that continues to recruit dancers at an alarming rate. If you’re dancing, you’re not collecting. Think of this at a national level.

This is how the information economy works. And it is deeply connected to the attention economy. The same companies that control the tools also control the platforms where our attention flows. They gather data. They shape behaviors. They write the rules that determine what is seen and what is hidden.

Even the attention economy has to be managed to avoid distraction.

Data Sovereignty Is Not Just a Slogan

Every document saved in a cloud outside our borders is a piece of national value stored elsewhere. Every student habit formed on proprietary platforms is a future cost written into our economic forecast. And every time we choose convenience over control, we lose a little more of our sovereignty.

The question is not whether the children can use computers. It is whether they will own their digital futures or rent them forever. Waze and Facebook probably have more information about people in Trinidad and Tobago than the government does, and if the government wants access to that information, it will have to pay – and even then, it could be redacted by the government those companies actually answer to.

There Is Another Way

We have alternatives. Real alternatives. Linux is free. Not free as in free trial. Free as in freedom. Open source software does not just give access. It gives control. It gives the power to learn how the systems work. It builds capacity at home rather than dependency abroad.

Imagine if our children learned to write code instead of just clicking icons. Imagine if they explored open tools instead of being funneled into walled gardens. Imagine if our economy kept more of its foreign exchange because we invested in local solutions powered by open technologies.

What Future Are We Building?

We cannot say we are building independence while handing over the keys to our digital infrastructure. We cannot say we are protecting future generations while embedding them deeper into foreign systems that profit from their habits and harvest their data.

The laptops are free but freedom is not. And if we are not careful, it will be the most expensive thing we lose.