When Algorithms Manufacture the Reality

Algorithms influence what content users see, shaping opinions and behaviour. Awareness of their impact is crucial to avoiding misinformation and bias.

Dr Uttam Chakraborty / Dr Santosh Kumar Biswal
Mar 01, 2026 • Updated Mar 01, 2026, 11:46 AM IST

Perception has become capital in the digital age. Visibility is the new value; ratings are turning into revenue; and reputation, once painstakingly built through lived experience, can now be manufactured overnight. This year's deepfake CEO scams exploited the trust people place in recognisable public figures, and the more recent AI-generated deepfake persona investment scams manufactured trust where none existed. The 'WeWork pre-IPO hype' of 2019 showed how strategic storytelling can distort valuation, while the 'Fyre Festival' of 2017 was a striking instance of influencer-driven perception outpacing infrastructure.

Few stories capture this paradox as well as the 2017 episode of The Shed at Dulwich, a fake restaurant in the United Kingdom that rose to the top of TripAdvisor's ranking system and was named the No. 1 restaurant in London, even though it did not exist. The Nobel laureate George Akerlof's concept of information asymmetry is relevant here: platforms attempt to minimise it through user-generated content (UGC) such as ratings, reviews, comments, and photos.

A journalist staged the experiment by creating a fictitious listing on TripAdvisor. With fake online reviews, staged photographs of household items posing as gourmet dishes, and time-stamped online interactions, he gamed the platform's ranking system. Within a few months, the fictional establishment was ranked the top restaurant in London. Hundreds of people tried to book a reservation, and media houses amplified the story. When the hoax was revealed, it was mostly treated as a joke, a witty comment on influencer culture and gullible consumers. Economically and socially, however, it was a wake-up call. These are standalone cases, but they foreshadow AI-driven distortion on a much larger scale.

Economics of the Digital Delusion

Digital platforms depend heavily on user-generated content (ratings, reviews, comments and photographs) as indicators of quality. These signals reduce information asymmetry, the market failure Akerlof analysed in classical economic theory: buyers lack reliable information about sellers, which can cause markets to collapse. In theory, reviews and ratings restore credibility by crowdsourcing it.

But the journalist's experiment demonstrated a structural weakness in the signals themselves: they can be fabricated. Algorithms reward patterns, not veracity, and the patterns they reward are frequency, positivity, and velocity of engagement. Once fabricated content crosses a certain threshold, it becomes self-justifying: top positions attract curiosity, curiosity draws clicks, and clicks confirm the ranking. Perception hardens into perceived legitimacy.
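This self-justifying loop can be sketched as a toy simulation. The code below is an illustration of the rich-get-richer dynamic described above, not any real platform's algorithm; the parameters (`top_k`, `boost`, the click probability) are arbitrary assumptions chosen only to make the feedback loop visible.

```python
import random

def simulate_ranking(scores, steps=1000, top_k=3, boost=1.0, seed=0):
    """Toy rich-get-richer ranking loop: items already in the top-k
    attract extra clicks, which in turn raise their rank further."""
    rng = random.Random(seed)
    scores = list(scores)
    for _ in range(steps):
        # rank items by current score, highest first
        ranked = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
        top = set(ranked[:top_k])
        for i in range(len(scores)):
            # every item gets baseline attention; top-ranked items get a boost
            attention = 1.0 + (boost if i in top else 0.0)
            if rng.random() < 0.01 * attention:  # chance of a click this step
                scores[i] += 1
    return scores

# a fabricated listing seeded with a small head start of fake engagement
organic = [5, 5, 5, 5, 5]
final = simulate_ranking(organic + [8])
```

Because the seeded listing starts inside the top tier, it keeps receiving boosted attention, which is exactly the mechanism by which a small initial fabrication compounds into a durable ranking advantage.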

In such an environment, reputation is a commodity. It can be purchased, inflated, or sabotaged. Deception, in turn, is not irrational but economically advantageous: when algorithmic visibility defines market success, manipulating that visibility becomes a rational business choice. The implications are far-reaching. Consumer decisions are distorted. Capital flows to the most visible rather than the most worthy. Algorithms give investors, advertisers, and customers the impression of objective quality. Once those signals break, the invisible hand of the market begins to tremble.

Platform Economy and the Crisis of Trust

The fake restaurant episode also exposes something more troubling about platform capitalism. Companies such as Google, Amazon, and Yelp have become private arbiters of trust. Their algorithms decide which businesses gain visibility and which remain obscure; they effectively control online markets. Yet their incentives work against self-regulation. Engagement translates into advertising revenue. Controversy fuels clicks. Policing fake content is expensive and, in many cases, reactive, and aggressively removing fake reviews risks alienating both users and the businesses the platform courts. The result is a structural tension: platforms must appear credible while maximising engagement-based metrics that can be inflated relatively easily through deceit.

The crisis is not just about fake reviews; it concerns the privatisation of credibility. Trust, once rooted in institutions, professional norms, and physical transactions, is now mediated by opaque algorithms. The consequences of rigged algorithms ripple across whole markets. For delivery riders, drivers, and freelancers in the Indian gig economy, ratings are a livelihood: a fraction of a star can determine employment, incentives, and algorithmic visibility. A malicious review, or a coordinated campaign, can destroy income, while artificially inflated ratings can crowd out deserving workers. In such an environment, algorithms become opaque gatekeepers of economic opportunity.

Reputational manipulation is no longer confined to restaurants and hotels. E-commerce sellers buy counterfeit reviews to boost their sales rank. Ed-tech instructors exaggerate testimonials to attract students. Influencers buy followers to land brand deals. The online marketplace rewards perception, and perception can be manufactured.

The economic risk is subtle but critical. Consumers stop trusting signals that prove unreliable, and consumer reluctance raises transaction costs. When trust erodes, growth declines. No amount of digital infrastructure, whether payment systems, broadband networks, or AI tools, can compensate for a lack of credibility. The digital economy rests on the invisible infrastructure of trust.

Protection against Algorithmic Deception

Addressing digital deceit requires action on several levels. First, platform accountability must go beyond reactive moderation. When systemic manipulation is profitable, platforms must share responsibility for its effects. Regulatory frameworks should encourage prevention rather than clean-up after scandals have occurred.

Second, authentic reviews can be reinforced through verified digital identity mechanisms tied to actual transactions rather than anonymous accounts. Privacy protection is necessary, but anonymity without accountability invites abuse. Third, algorithmic transparency deserves serious consideration. Proprietary formulas cannot be fully divulged, but shedding some light on ranking factors would mitigate blind spots and rebuild public trust. Once users understand how visibility is calculated, manipulation becomes harder.
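The transaction-verification idea can be illustrated with a minimal sketch. The `Transaction` and `Review` types and the `verify_reviews` helper below are hypothetical names invented for this example; a real platform would back the same check with a transaction database.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Transaction:
    """A completed purchase or booking (hypothetical record)."""
    user_id: str
    listing_id: str

@dataclass
class Review:
    user_id: str
    listing_id: str
    rating: int
    verified: bool = False

def verify_reviews(reviews, transactions):
    """Mark a review as verified only if the reviewer has a matching
    transaction with the listing being reviewed."""
    completed = {(t.user_id, t.listing_id) for t in transactions}
    for r in reviews:
        r.verified = (r.user_id, r.listing_id) in completed
    return reviews
```

A platform applying this gate could then weight or display only verified reviews, so an account with no real transaction cannot move the visible rating.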

Fourth, artificial intelligence should be deployed defensively as well as commercially. AI tools can identify bot networks, detect coordinated review campaigns, and flag synthetic media. The same technologies used to deceive can be used to expose deception.
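One simple defensive signal is temporal clustering: organic reviews trickle in, while coordinated campaigns tend to arrive in bursts. The sketch below is a basic heuristic of this kind, not a production fraud detector; the window size and cluster threshold are arbitrary assumptions, and real systems combine many more signals.

```python
from collections import defaultdict

def flag_coordinated_bursts(reviews, window=3600, min_cluster=5):
    """Flag listings that receive an unusually dense burst of reviews.
    Each review is a (listing_id, timestamp_seconds) pair; a listing is
    flagged if min_cluster or more reviews land within one time window."""
    by_listing = defaultdict(list)
    for listing_id, ts in reviews:
        by_listing[listing_id].append(ts)
    flagged = set()
    for listing_id, stamps in by_listing.items():
        stamps.sort()
        start = 0
        # slide a window over the sorted timestamps
        for end in range(len(stamps)):
            while stamps[end] - stamps[start] > window:
                start += 1
            if end - start + 1 >= min_cluster:
                flagged.add(listing_id)
                break
    return flagged
```

Five reviews posted within a few minutes would trip this check, while the same five reviews spread over a week would not, which is the asymmetry a campaign detector exploits.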

Lastly, consumer digital literacy is essential. Citizens must learn to be sceptical of online cues, to recognise signs of inauthenticity, to fact-check information, and to understand how algorithms shape the content that reaches them. Just as financial literacy teaches people to spot scams, digital literacy should teach people to question the realities they are shown.

In the coming decade, economic power will depend not only on information, infrastructure, or AI, but on the trustworthiness of digital signals. Once trust collapses, rebuilding it will be slow and costly. No amount of venture capital or technological superiority can easily repair a marketplace in which consumers distrust what they see.
