What are dark patterns, and can they be regulated?
Dark patterns are everywhere. They are tricks used by applications and websites to make users do things they don't want to do, and they rely on exploiting our behavioural biases and cognitive limitations. We all encounter dark patterns in our daily lives: making it easy to subscribe but hard to unsubscribe; pre-selecting options like insurance purchases, tips, or payment methods; "sneaking" items you didn't choose into your order just before the payment step; hiding or obscuring important details; using scary, fear-inducing language; and more.
In this conversation, Ashish Aggarwal (head of public policy at NASSCOM), Kailash Nadh (CTO at Zerodha) and Bhuvanesh R (Zerodha) discuss how dark patterns harm users and the kind of regulation needed to curb them.
Highlights from the conversation
Dr. K
- Dark patterns: the term itself is rather new, I think just about a decade old. But in really simple terms, it's a user interface term, a digital-surface term. When a user interface employs trickery and tricks users into doing something they didn't really intend to do, that's a dark pattern.
- In the dial-up era, the old internet era, things that were seen as a menace have today become industry-standard practices. That is the transition dark patterns have undergone over the last 20 or 25 years.
- Dark patterns, not coincidentally, are always employed in the context of financial gain. Wherever there's no money or financial transaction, you typically don't see a so-called dark pattern. And I think that's explained by a simple matter of incentives: if there's a financial gain to be made by increasing the size of one kind of button and reducing the size of another, that seems like an obvious incentive to a lot of organizations whose primary incentive is to make money.
- I use software written by other people and built by other companies every day, and people use software that I have written and our organization has built. So the common-sense understanding translates really well: if you don't want to be annoyed, if you find some of these practices deceptive or annoying, how could you, in good conscience, employ those same practices in your line of work? To me, it's that simple, common-sense principle.
- This whole metric-based view of anything is problematic. It could be the right metric to begin with, but metrics don't exist in isolation, especially in complex businesses and services. A lot of metrics have to play together, and you have to look at everything together, not in isolation, to make meaningful decisions.
Ashish
- In an offline world, you are in a proper 3D environment and your senses are assessing everything. Then you move to a laptop world where you have a screen in front of you, and now you are on a mobile, and a consumer is always in a rush, so it's click and go. In that kind of environment, I think a dark pattern has serious implications.
- If I got a really targeted ad that was very useful to me, I wouldn't think of it as spam. But because I get ads that are not relevant, and maybe only 5% of the ads I get are relevant, I start worrying about spam and all of that. And because of this clunkiness in various processes, where it is also about customer targeting and customer retention, you're trying to make up for a lack of genuine understanding of the consumer, or a lack of genuine quality, or a lack of processes at your end, by employing things that are unethical or deceptive to still meet your business outcomes. That, in some sense, is a dark pattern.
- There are other things beyond the dark patterns we are discussing today, and this is especially relevant in the case of children. Today, if you open any of the social media sites on your phone and start scrolling, you will not stop: the next video will start, the next reel will start. Maybe as an adult that's okay; an adult has to take a decision. But a child may not be able to take that decision, and it could have a detrimental impact on the child's understanding and the way he utilizes his time.
- A lot of this starts from product design, and it anchors into the outcomes you are chasing. Are those outcomes short-term in some sense, or are they meaningful outcomes for the end consumer? If you focus on meaningful outcomes for the end consumer, then even if you employ patterns, these will most likely be positive nudges and not nudges that are out to deceive the consumer.
- Typically, let's say there are 10 businesses. Three of them are very good and don't want to use dark patterns, and seven of them are like, "Okay, fine, we don't care so much." Now, for those three it becomes very difficult to stick to the idea of not using dark patterns, because if there's no regulation and no action, they're left hoping that customers will be smart enough to reward the three over the seven. And that timeframe may not be the timeline over which we are measuring this potential outcome; it could actually be make-or-break for a startup.
Transcript of the video that’s been edited for readability.
Bhuvanesh: There's always a tension between what consumers want and what brands want. As hyperbolic as it sounds, it's never been harder to be a consumer. What brands want and what consumers want might not match, and in the process brands might end up doing things they shouldn't be doing, things that don't always work out well for consumers.
This problem becomes complicated, especially with everything being online. Such things might happen intentionally or unintentionally, and it's not always easy for brands and the people building and designing products to be cognizant of all the factors that might impact the consumer. As the world has moved online and all our interactions have moved online, this tension has grown, and one of its side effects is that these so-called dark patterns have become a big problem.
Across the world, various consumer regulators have taken cognizance of this fact and they’re passing regulations on how to protect consumers. Recently, the government of India released draft guidelines on regulating these dark patterns and that’s the topic of today’s conversation.
So, I have Kailash, who heads technology at Zerodha, and Ashish, who heads public policy at NASSCOM, to talk about this. Before we dive in, K, I'll start with you: in the simplest possible terms, what exactly are dark patterns, and maybe give us a little bit of history on how this became a thing.
Dr. K: Dark patterns: the term itself is rather new, I think just about a decade old. But in really simple terms, it's a user interface term, a digital-surface term. When a user interface employs trickery and tricks users into doing something they didn't really intend to do, that's a dark pattern.
I can give you a really simple example. On many websites, you'll see a giant button that says "Download PDF", but the button itself is an ad, and there's a tiny link below it that actually downloads the PDF. That's a classic, widespread dark pattern, and the goal of the entity that employs it is to get users to click on the ad, which the user never intended to do.
There are many variations, and historically it's not really a new thing. I think it has been there from the advent of the internet; even in the dial-up era, for those who remember, we had pop-ups and there was this entire pop-up menace.
Bhuvanesh: I think the Microsoft Internet Explorer toolbars.
Dr. K: Yeah, toolbars. Dark patterns were outright malicious back in the day; the pop-up menace drove some of the biggest browser innovations of that era, like pop-up blocking. But what has shifted today is that so-called dark patterns have been legitimized by large organizations who employ them as part of their business. In the dial-up era, the old internet era, things that were seen as a menace have today become industry-standard practices. That is the transition dark patterns have undergone over the last 20 or 25 years.
Bhuvanesh: Ashish, the government has come out with a consultation paper on regulating this. Where is the government coming from? Is this a reaction to other governments, for example the US FTC, which has been taking a particularly strong stance against this? Is it a reaction to the global mood to regulate this, or has it been part of the government's agenda for a long time?
Ashish: I think we should go back to the current draft consultation. The conversation around this began somewhere in June this year. It actually started with the ASCI, the Advertising Standards Council of India, looking at dark patterns in the context of just advertising. They came up with four kinds of dark patterns that they felt were part of the advertising space: drip pricing, bait and switch, false urgency, and disguised advertising, which is dressing advertising up to look like organic content.
Now, there are a lot of things that can't be covered under advertising itself, and that was the discussion happening between the Consumer Affairs Ministry and the ASCI. And to be honest, when you look at the Indian consumer class over the last decade, so much of what was happening offline is today happening online.
Take financial services, a simple example being personal loans. This has always been an issue in terms of rates, repayment capability, collection practices and so on, and that entire set of issues has now shifted, or is increasingly shifting, from the offline world to the online world.
So what would dark patterns look like here? You could cause deception by not letting the potential customer understand what the rate actually implies in terms of cost. Or you could offer him a very cheap deal, say an 8% interest rate with no security; he clicks, and he finally ends up with a 36% interest rate and ten big terms and conditions. That is a bait-and-switch strategy.
From a government point of view, the Consumer Protection Act has the concept of unfair trade practices, and all of this comes under unfair trade practices. Now, the problem is that in India we don't have evidence saying that over the last five years this is the set of complaints in the digital world that relates to the kinds of dark patterns we discussed. So the government decided to look at what is globally established as dark patterns; they looked at a list of 10, and that's where the thought process started. For good measure, they set up a task force involving civil society groups, law firms, and industry associations like NASSCOM, and asked for recommendations.
The first step was, okay, let's identify something and put it out as a voluntary code, which in a way could have been a good thing to start off with. But the government has already taken the next step by putting this into a draft set of guidelines, which is under consultation today, and we can talk a lot about this.
Now, the thing is that some things labelled dark patterns are not otherwise illegitimate activities. That's where we will have friction at times. Another interesting aspect that I think nobody talks about is that sometimes a service provider may simply not be putting the right kind of investment into its processes, which results in inefficiencies or incompetence as an organization. From a customer's point of view, it may not be easy to tell whether something is a dark pattern or whether the service is simply not up to standard.
A simple example of this is access to a helpline. If you don't put up capacity, that's just a business limitation. But if you have the resources and you have simply made it difficult for the consumer to access the helpline, then that's a dark pattern. Sometimes it's difficult to tell the two apart.
I'll end with this: sometimes, in a public-sector banking environment, you go through many hoops, and you might wonder whether the bank is actually trying to prevent you from getting to that level or whether it's just bad UI. These are some of the things we haven't really tested out or figured out yet. So it's a grey road ahead, but I think it's a good start to be talking about it. When we start talking about it, companies will realize there are certain things they must avoid, and that would be a good beginning.
Bhuvanesh: We'll dive deep into the regulations, but before that: we are supposed to be the smartest creatures in the universe. Is it a sad commentary on humanity that we can be tricked by bad UI/UX? I want to get both of your reactions on this.
Dr. K: These dark patterns, not coincidentally, are always employed in the context of financial gain. Wherever there's no money or financial transaction, you typically don't see a so-called dark pattern. And I think that's explained by a simple matter of incentives: if there's a financial gain to be made by increasing the size of one kind of button and reducing the size of another, that seems like an obvious incentive to a lot of organizations whose primary incentive is to make money.
Dark patterns are nothing new; they're just a digital manifestation of the same incentive-driven human behaviour. Is it a sad commentary on humanity? Yes, obviously.
Bhuvanesh: Ashish, is humanity doomed?
Ashish: In an offline world, you are in a proper 3D environment and your senses are assessing everything. Then you move to a laptop world where you have a screen in front of you, and now you are on a mobile, and a consumer is always in a rush, so it's click and go. In that kind of environment, I think a dark pattern has serious implications, because I'm probably walking while making purchases and browsing, and these little things can make a lot of difference: the kind of colours you use, the fonts you use, and how you do some of these things.
The core thing is intent to misdirect and deceive. You will never get to a prescriptive list of everything that is a dark pattern, but you can apply this test of intent to deceive and mislead. And it's very much happening, however smart you are. Forget dark patterns; some of the smartest people have been conned in financial frauds and had money siphoned off, and so on. So I think it's very real, and it's something consumers need to understand, and companies need to know when something is actually a dark pattern.
Bhuvanesh: A Dutch social psychologist once said that the human conscious brain has the capacity of an abacus; at any given point in time there's very little we can focus on, barely one thing. But on the regulation side, now that you have seen it, what are your views on the enforceability of the regulation?
Because earlier you alluded to a lot of conflicts and issues. For example, if you open a PSU website, the entire website is a dark pattern, but as you alluded to, it's not intentional; sometimes it might just be because they haven't made the right investments, and so on. So in terms of the overall structure of the consultation so far, would this go far enough to address the issues, or would it just end up creating more grey and fuzzy areas where the industry and government have a lot of conflict?
Ashish: I think enforcement is still a little ahead of us. Even from a government perspective, at least in my conversations, this isn't necessarily being seen as an enforcement step at this point in time. What is important is to establish a common understanding of what a dark pattern is, and that's the first step these guidelines are seeking to take.
In the entire consultation process, the feedback, at least from the industry, was to use the lens of the effect of a pattern or the intent behind it. That is something that will eventually need to be established to actually treat it as an unfair trade practice under the Consumer Protection Act, and how that will happen is still ahead of us.
Take a simple example: no regulation versus some regulation. When there's no regulation, for example, we went and asked the Consumer Affairs Ministry: why not use past consumer-complaint data, pick out the complaints that show certain patterns, take the top 10 patterns, and use those as the basis for a guideline, because these are patterns we have actually observed in India?
The government has practically no data to do that kind of evidence-based policy making. With this step, after a year or two we can probably go back to the consumer helpline, get the data on what kinds of complaints actually came in, reconcile that with the regulations, and move forward.
I think it’s going to be a journey and from an industry point of view, I think there’s a lot more at stake than just enforcement. It is a matter of reputation. Today enforcement happens in many different ways. As a consumer, when I go on social media and start complaining about a particular dark pattern, I think it’s already a signal in a manner that a lot of firms would seriously want to avoid.
There are many ways beyond just hard enforcement, carrots and sticks that will work in this journey, to make sure we start building a digital ecosystem where consumers are not misdirected. There is going to be tension for sure, as was mentioned at the start, and we'll see how we navigate this.
Dr. K: I’d like to add to what Ashish said and I agree with Ashish.
The problem with enforcement here, and I'm speaking through a technologist's lens, is that you can't ever have an exhaustive list of dark patterns. Take something as simple as an important bit of information being in a small font size. There's no such thing as a definitively small font size: is it 10 pixels, is it 12 pixels, what is the DPI of the screen? You can't ever have a rule that says using an X-pixel or X-rem font size on a web page is dark or not. That's not possible. So I think this can't really be looked at through an enforcement lens in that sense.
The way I understand it, this has to give consumers a leg to stand on where previously they had none, to say the font was too tiny to read, and it could be a million different variations, a million different dark patterns. There should be some legal basis for consumers to pursue their rights and say that something was really misleading. Right now that would be very difficult without legal recognition of the concept of a dark pattern. That is how I'm looking at it, because any sort of exhaustive, case-by-case enforcement is technically infeasible or impossible.
Bhuvanesh: I think this ties into the follow-up question I had for you. Some of the industry reactions were that existing regulations already cover most of these dark patterns. Was another regulation really needed, and does it address some of the potential gaps?
Ashish: Unfair trade practice is a very broad concept, and technically I can go to the consumer protection court and argue that, according to me as a consumer, this is an unfair practice, and I may well get a favourable order. The company can get a cease-and-desist order against it; I can even get compensation, and so on.
The point about these guidelines is: how do you start dealing with these things at scale? Think about the resources spent dealing with complaints at an individual level, and the state capacity you need for it. And when you look at it through the lens of consumer versus company or industry, the consumer is at a disadvantage, because at a firm level you will always have the firepower, the money and muscle, to say, okay, let's take this through the legal process and see what happens. What that creates is an environment where a consumer will usually do his own cost-benefit analysis, decide it's not worth getting into, and probably just switch the vendor or service provider.
From that point of view, I think you do need a set of regulations. Having said that, the challenge is: do you want a hard-coded regulation, or do you want a journey that starts with a voluntary code?
That's the first point of debate in this legal journey. If you go with a hard-coded regulation, how it plays out will depend on how far the Consumer Affairs Ministry goes in treating these complaints and on who sits in judgement on them. I could be very pedantic, like the font example you just mentioned: if I then say that anything below a 9-point font is too tiny, that sets a very bad precedent in many ways, and we want to avoid that.
So how do you get an industry voice into the capacity building within the government, to explain why some companies think a practice is genuine? That discussion needs to happen. What we are trying to see is that, in this process of regulation building, we get a forum where there can be a lot of conversation, learning, and feedback, so that this regulation can evolve into something useful and sensible.
From that point of view, I don't think it is an overlap or an unnecessary set of regulations. It's a good beginning; we just need to work on it, because it's very nuanced. It's not a typical black-and-white scenario where you jumped a red signal, so you pay a fine. That's not what this is about.
Dr. K: I agree, I agree.
Bhuvanesh: K, you've been thinking about this for a long time. This idea of doing right by users is also core to Zerodha's philosophy. You've written a post called User Disengagement, which should probably be a Bible for people working in technology, and not just people designing products but also those taking business decisions. I'll ask you about the post, but before that, how have your own views about, quote-unquote, "doing the right thing" for users evolved over time? Because you've been tinkering with technology for pretty much your entire life. How has your worldview about user interactions evolved?
Dr. K: I was on the early internet; I think I got online in the year 2000, so I got to see the shift from dial-up to slightly faster internet, Web 2.0 and 3.0, and I saw how dark patterns evolved. I realized, I was a pre-teenager then, that some of the things I employed on the web pages I was building and the software I was building and distributing were dark patterns. I was way too young to realize it. But as I grew older and it really started happening to me, in the early 2000s, when you'd go to a website and 20 pop-ups would open, that was very annoying and that was bad. Common sense would dictate: this is really bad, I hate it, so I shouldn't do it in the software I make.
So to me, it has been an understanding that evolved via common sense. As makers of software, technology, and user interfaces, we're also consumers and users. I use software written by other people and built by other companies every day, and people use software that I have written and our organization has built. So the common-sense understanding translates really well: if you don't want to be annoyed, if you find some of these practices deceptive or annoying, how could you, in good conscience, employ those same practices in your line of work? To me, it's that simple, common-sense principle.
I find it really bad, so I won’t do it. I wouldn’t want it to happen to others either.
Bhuvanesh: In the post, you mentioned this truism “Don’t do unto others what you wouldn’t want to be done unto you.” I mean, you mentioned common sense; it should seem like common sense, but it’s not. This problem is just getting worse by the day, so much so that we have regulations.
If you were to put on your technologist hat and think of yourself as a person building or designing products, why do you think this still exists in the 21st century? Where are people going wrong?
Dr. K: No, it's not going wrong. For the people who employ dark patterns and for the organizations and entities who do it, it's going right: it generates revenue. Many of these decisions are based on practices like A/B testing. You try a big red button, then a slightly orange button mixed with text. You try out variations, measure the conversion rates over a period of time, and whatever works best, you do.
When a lot of such decisions are based entirely on so-called "A/B testing", where metrics like did it generate revenue, did it convert, drive user interface and product UX decisions, it's very natural that we slowly creep toward the lowest common denominator of practices: whatever generates money. And deceptive practices do that easily. I had no intention of clicking on an ad, but the UI deceives me into clicking it; obviously that makes money, and the A/B testing systems will register that pattern as the better-converting one.
I'm not saying A/B testing as a concept is inherently bad; of course it's not. But A/B testing and purely metric- and conversion-driven practices for building technology for human beings have taken off like nobody's business in the last two decades, and we're seeing the results of that. There's an entire machinery; there are entire industries that cater to delivering metric-based business decisions.
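(Editors' aside, not part of the conversation: the sketch below is a minimal, hypothetical illustration of the dynamic K describes, where an A/B comparison that looks only at conversion rate ends up selecting the deceptive variant. The variant names and click counts are made-up assumptions, not real data.)

```python
def conversion_rate(clicks: int, impressions: int) -> float:
    """Fraction of impressions that resulted in a click (intentional or not)."""
    return clicks / impressions if impressions else 0.0

variants = {
    # An honest download link: only deliberate clicks get counted.
    "honest_download_link": {"impressions": 10_000, "clicks": 400},
    # An ad styled to look like the download button: accidental clicks are
    # indistinguishable from real conversions in this metric.
    "ad_styled_as_button": {"impressions": 10_000, "clicks": 1_900},
}

# A naive experiment picks whichever variant "converts" better, with no notion
# of user intent, satisfaction, or later complaints.
winner = max(variants, key=lambda name: conversion_rate(**variants[name]))

for name, data in variants.items():
    print(f"{name}: {conversion_rate(**data):.1%} conversion rate")
print(f"Winner by conversion rate alone: {winner}")
```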
Bhuvanesh: Everything is a hack.
Dr. K: Everything is a number. But really, human-computer interaction, HCI, is a scientific field that has long existed, and you mentioned earlier that human beings can only store a certain number of things in their working memory. It's true, and it's very well understood. I think the rule of thumb is seven plus or minus two, so even on a user interface, if you have 15 items in a menu, the user is already lost.
So there are even quantifiable metrics for user attention. And this is a slightly philosophical tangent, not really from an incentive perspective: we have limited attention, and there are now terms like the attention economy and the robbing of attention. There's a reason these terms have surfaced. You go online, and every website and app you see is trying to show you an ad, trying to deceive you into clicking something you shouldn't have, trying to monetize you. Even Wikipedia, for instance, a nonprofit, with its donate banner: it's such a shame. Wikipedia showing a banner soliciting donations is okay, but the language they use is very problematic.
I think that's a dark pattern, and that's a nonprofit, so it has really become the norm. From a philosophical angle, I think it's a bit of a crime also; I know that sounds like hyperbole, but I'm not consenting to my attention being stolen, to being misled into clicking stuff, to being misled into accidentally ticking a tiny checkbox that starts a subscription, and so on. How is all of this globally legitimate as a practice? When trillion-dollar corporations employ these practices, there's a problem. I agree with everything Ashish has said except for one tiny disagreement.
If reputation were really a check, somebody gets tricked by a dark pattern and airs their complaint on social media, and if that really had an effect, I don't think this would have exploded so much. I think reputation as a self-correcting mechanism that forces companies to reduce dark patterns is not very effective. It may have some effect, but not much, because if it did, we wouldn't be having this conversation.
Bhuvanesh: This is for both of you. In 2023 we are in a place where, K, as you rightly said, there's this dehumanizing language of everything being an eyeball, an interaction, engagement. Have people building and designing products lost the plot, in the sense that they've let these measures become targets and everything else has taken a back seat? To give one simple example, it's become the default assumption that a user visiting an application again and again is a good thing. From the finance industry's point of view, it's absolutely a bad thing; activity and results are inversely proportional in finance. Where do we go from here? Is there a way back to, as you rightly said, common-sense-based design of products, policies, and how people think about business decisions? I'll start with you.
Dr. K: I think we are still at a very nascent stage. The internet economy, everything getting digitized, this phenomenon is just about a decade old at real scale, so these are really the early years of heavy civilizational digitization. And I think it's common sense again.
With your limited attention, how many apps or websites can capture it on a daily basis? That will also have a tipping point, and I think this stuff will correct itself, because it's not sustainable, not viable, not healthy for any society anywhere. So over a period of time it will start correcting itself. This exploitation of eyeballs, you've seen how low it has fallen; whatever used to work has stopped working, and that's why we have darker and darker patterns. There will be a tipping point where this stuff stops being effective and people just get sick of being dark-patterned all the time. So I think it will correct over a period of time; logically, it has to. That's my view.
Ashish: I completely agree, Kailash. And on your earlier comment, what I meant to say is that along with the regulation, the power of the consumer to stand up increases, and that will probably carry a greater reputational risk at some level. And you're right: things are happening at such a scale, but we are still very early in this process.
One other way to explain this is that a lot of the industry's actions are, in some sense, very clunky. Take advertising: it's spray and pray, right? If I got a really targeted ad that was very useful to me, I wouldn't think of it as spam. But because I get ads that are not relevant, and maybe only 5% of the ads I get are relevant, I start worrying about spam and all of that. And because of this clunkiness in various processes, where it is also about customer targeting and customer retention, you're trying to make up for a lack of genuine understanding of the consumer, or a lack of genuine quality, or a lack of processes at your end, by employing things that are unethical or deceptive to still meet your business outcomes. That, in some sense, is a dark pattern. As we mature as a business and an industry, and as we refine our processes and technologies, both from a consumer point of view and from an industry point of view (and this is the positive lens, the best-case scenario I'm painting), both industry and consumers will have to worry less and less about dark patterns. That's the choice we have to make, and we can make it early. But as history shows, it's not a choice that will happen organically by itself, so you need that push of the government coming up with regulation.
A couple of examples. On a cigarette packet you have "smoking kills": you can still buy a cigarette, but there is a clear advisory on it that it kills. So it's a nudge that maybe the person buying it should avoid it, but he still makes his choice; he's an addict, he can make that choice.
So there are certain things where the government will come out and say they must be labelled in a certain manner, so that at least the customer has a head start. There are other things beyond the dark patterns we are discussing today, and this is especially relevant in the case of children. Today, if you open any of the social media sites on your phone and start scrolling, you will not stop: the next video will start, the next reel will start. Maybe as an adult that's okay; an adult has to take a decision. But a child may not be able to take that decision, and it could have a detrimental impact on the child's understanding and the way he utilizes his time.
Now, do we need regulation for that? It's an open question. If you look at the sphere of regulation, we are talking about it only through the lens of consumer protection. We recently had the Data Protection Act, which says the processing of children's data has to be done in such a manner that it is not detrimental to the well-being of the child. What does that mean? Today we might be looking at a certain set of patterns that nobody has categorized as dark patterns or otherwise, but down the line, when you start looking at personal data processing, certain practices could very well start popping up as dark patterns. I have sent you a newsletter and I've taken your consent, but the unsubscribe button is hidden somewhere; the law now requires that withdrawing consent must be as easy as the manner in which I gave it.
If, as a business, you employ tactics because you feel you can reduce unsubscriptions, that could easily be something that gets clamped down upon. So when you look at this sphere of regulation, it is not going to be limited to the consumer protection law; you're going to have many, many such things, and from a business point of view you need to take a call on this pretty early. How are you going to measure success? How are you going to talk to your investors? Who are your investors, and what are their metrics?
So clearly, if the Excel sheet you are looking at has problems in terms of its targets, then, to the point that the measure itself becomes the problem, that will happen, and that clarity is going to be important. I don't think there is a lot of discussion on these points at the early stage of a startup, because the focus is clearly on raising funding and getting more customers, and sometimes it then becomes too late to undo these kinds of practices. This is an interesting discussion, I think, from a startup point of view.
Bhuvanesh: K mentioned there's a problem: if businesses don't see results, there'll be darker patterns. Now, if you were to put on your policy hat, invert whatever you said, and give advice to founders and decision-makers on what not to do when they're making business decisions or deciding which metrics to chase, what would your ideal advice be? What are the key points people should keep in mind when making business decisions?
Ashish: See, before I was at NASSCOM, I spent a big part of my life enabling low-income workers to save for old age. I spent about a decade creating a business; we had about a million customers, people saving for their social security. Now, the thing about social security is that if you just open a pension account, nothing happens. You need to accumulate enough in that account to have something for your old age. So if the metric is opening accounts, and that's what everybody's chasing, you will employ a lot of practices that get you a million or 10 million accounts. But if the metric is whether, after five years, the average account holder has saved X proportion of his monthly income, then all your patterns will be focused on delivering on that kind of metric.
As against this, insurance is simpler in some sense: if you bought insurance, you're covered. So that's just drawing a distinction between pension and insurance. But if you as a company also want to bundle pension and insurance, how do you think about that? A lot of this starts from product design, and it anchors into the outcomes you are chasing. Are those outcomes short-term in some sense, or are they meaningful outcomes for the end consumer? If you focus on meaningful outcomes for the end consumer, then even if you employ patterns, these will most likely be positive nudges and not nudges that are out to deceive the consumer. I don't have very strong advice, but this is something I can bring back from my past experience.
Dr. K: I wanted to share something else, but adding to what Ashish said, it really comes down to metrics again, the philosophy of running an organization. Imagine an organization that opens accounts. If account closure is a metric for poor performance, there's an easy way to close accounts, and people are closing them, then imagine a product manager whose sole focus is reducing account closures. The simplest thing to do is make account closure difficult, and suddenly the graph goes down: closures have dropped and the business looks like it has improved. This whole metric-based view of anything is problematic. It could be the right metric to begin with, but metrics don't exist in isolation, especially in complex businesses and services. A lot of metrics have to play together, and you have to look at everything together, not in isolation, to make meaningful decisions. Ideally, one should never really have the goal of reducing account closures; it should be increasing customer satisfaction. It's common sense, but yeah.
And secondly, on the regulation bit earlier, I wanted to cite an example of something that has now become widespread: you go to any web page in the world and a popup shows up, the notorious cookie consent popup. When that whole consent requirement came out in the GDPR, it had the right intent: websites should get explicit consent from users to set a cookie or collect information and whatnot. But the implementation has turned out to be an annoying UX nightmare, and it fatigues users. You go to any web page, you don't want to read all that stuff, and you're conditioned to just hit OK. Most websites don't even have a reject button; it's OK or advanced settings or whatever, and people are so fatigued by dealing with this one cookie consent popup that they just hit OK.
What's happened is that because people are fatigued and hit OK, the website has received legally compliant, legitimate consent to do whatever it wants. That's how the popup thing has backfired. So when it comes to regulations and implementation, being overly prescriptive is very problematic. Like Ashish said, this dark-pattern stuff is extremely nuanced. Any regulation should be principle-based; anything that prescribes showing a popup or mandates a particular font size will just backfire and make things worse. So I wanted to share that example.
Bhuvanesh: On that note, because you actually build and design stuff: when you're thinking about product design, is that the lens you would want to use? What's your framework when you're designing products? Say you had to start from scratch; what goes on in your mind when you're thinking about the design choices, whatever they may be?
Dr. K: There's a huge element of what I would personally expect from a common-sense user interface: the least amount of pain and zero deception. If I'm using a piece of technology, I want it to be simple, free of clutter, free of deception. That is my framework, whether it's a free and open-source project or one of our commercial products; that is really the lens I personally use, and the lens our teams use when looking at a product.
So the common-sense, tangible philosophy here is to be user-centric rather than business-centric. That sounds ironic, but if you're user-centric and your business vision is aligned that way, it should pay off; we can take ourselves as an example. So that's the view: make it as simple as possible for the user, and if users are happy with your product, the business outcomes should logically follow.
Bhuvanesh: Ashish, you mentioned something interesting, which was actually one question I had: the example of pensions. Is there an element of fifty shades of grey in these regulations, or rather in dark patterns? Take the US example, where a lot of employees are automatically enrolled into pension programs. Looked at in isolation, that could potentially be construed as a dark pattern. But from the lens of, say, a policymaker or a regulator, getting people enrolled in pension programs is actually good.
So is there an element of context dependency in a lot of these regulations? Because if you make a blanket statement that automatically opting people in is bad, that could potentially backfire and lead to a loss of consumer welfare. The same is the case with the EPFO: if employers didn't have to contribute automatically on behalf of these users, hardly anyone would have, and we would probably have far fewer savers than we do. Is there a solution to this?
Ashish: In the pension world, it is very clear globally that opt-in is not going to work and opt-out is a good option. It has been discussed across jurisdictions and is accepted as a practice. Obviously, it has to be transparently designed as a clear policy. So for a lot of these things, where something could be treated as a dark pattern but is actually a good nudge, it does get regulatory cover. It might start out as a practice, but it gets picked up and gets regulatory cover, and that's a good thing.
Another interesting thing we should look at: we talk about regulation, but technology is not only going to lead to more dark patterns and more such activity, it can actually be the solution too. One thing that is potentially in the realm of a dark pattern is that once you have a subscription, monthly or whatever, and you want to cancel it, you just can't find the cancel button. It's just difficult. And sometimes these are tiny subscriptions, Rs 100 or 200, so you don't want to bother about it, but it's there.
So now what the RBI has done is come up with the concept of an account aggregator, and with the idea that all of your consent will go through a tokenization framework, all managed in such a way that on my mobile, through my bank, I can manage a lot of the standing instructions I have given to various service providers. And with swipes in one app, I can enable and disable them.
So sometimes, what is the way of resolving activities that are dark patterns? There can be multiple ways of looking at it, and we should also look at technological solutions that empower consumers more. In that context, the Data Protection Act offers a uniquely Indian solution in the law itself: the concept of a consent manager. It basically means that, okay, I might have consented to 100 things when I didn't have the time or didn't read the fine print, and there might have been deceptive practices. Eventually, if I opt to use a consent manager, I can go back one fine afternoon, scroll through all the consents I've given, and remove the ones I don't want.
So I think technology is going to be a huge thing. And most businesses, I think, sometimes take a shortcut; employing a dark pattern is always a shortcut. To Kailash's point, it's a little more difficult to achieve the same outcome the straight way. That's the choice heads of businesses, people like Kailash, are often making: what route are we going to take to achieve an outcome? The lazy route is the dark pattern.
Bhuvanesh: K, at a broad level, what are these dark patterns the result of when people are designing a product? Is it that they don't really understand how the human mind works? Or, thanks to behavioural economics and everything else over the last 40 years, are we at a point where we understand the human brain so well that we have figured out how to hack, trick, and delude users into doing whatever we want?
Dr. K: There's that classic cliché, that adage, I forget who said it. It goes: some of the brightest minds in the world are figuring out ways to make people…
Bhuvanesh: Click ads.
Dr. K: Oh yeah, yeah, yeah. So it's all of this put together.
Once some of these practices, dark patterns, become industry standard, they are legitimized. To a young product engineer or technologist, it might not even look like a dark pattern; you look around and see all the large organizations, all the competitors, all the businesses employing the same pattern, so it's just the standard, the convention. So it's a mix of all of this: a spectrum of over-focusing on metrics, maximizing financial incentives, a lack of empathy, simple misunderstanding, everything becoming an accepted convention. It's everything put together.
Bhuvanesh: As an extension to that, there are bound to be a lot of programmers and product designers listening. What advice would you give them when they're thinking about designing products, so that they don't build dark patterns? Also, in a lot of cases, developers in a large organization might not have the choice to do what they want, because metrics become the North Star. Is there anything they could do? Two questions, but first: what would you want to tell young programmers and developers?
Dr. K: It's the same advice, not advice specifically for engineers: "Don't do unto others what you don't want done unto you." That's it. If you employ that rule of thumb, you will not be building dark patterns. Unless you want to be deceived, don't deceive others. And you're absolutely right: in most organizations globally, engineers rarely have a voice or a choice. Engineers have to build what they've been asked to build.
Now, we can dig deep and look at the ethical and moral conundrums of engineers choosing specific jobs, but at the end of the day, people have to make a living. It’s just a tech job and when a requirement comes through saying, “Reduce the size of this button,” they probably don’t even have the context of why that button is being reduced. That metric would have been computed somewhere in some other department. So this has to be a collective thing, and this really is in the hands of business heads and decision-makers, not really engineers.
Ashish: So if I can add to that. Typically, let's say there are 10 businesses. Three of them are very good and don't want to use dark patterns, and seven of them are like, "Okay, fine, we don't care so much." Now, for those three it becomes very difficult to stick to the idea of not using dark patterns, because if there's no regulation and no action, they're left hoping that customers will be smart enough to reward the three over the seven. And that timeframe may not be the timeline over which we are measuring this potential outcome; it could actually be make-or-break for a startup.
So from that point of view, what the regulation also does is create a level playing field: it starts supporting the people who have good practices and penalizing the people who thought they could get away with it. I think that is one great thing from a regulation point of view: start seeing it like that and start championing some of the good practices.
The second thing is: today, I don't know how many companies have a person actively looking at their product design, scouting for potential dark patterns, and saying, "Knock, knock, this looks like a dark pattern, can we have a discussion?" As we mature, we'll start seeing these kinds of things develop within the country, with the industry taking up these practices and also documenting its choices.
When you make a design choice, is there enough documentation to say whether you thought about it from a dark-pattern perspective? I'm sure it is happening somewhere, but I'm also sure it's not standard practice. That culture, I think, will not develop organically. It's, like I said, carrot and stick, and what we are hoping is that the good guys start getting rewarded and the bad guys start feeling some of the heat of this regulation. That would be a good outcome if it can happen.
Dr. K: Absolutely, I completely agree. Killing the peer pressure and FOMO, where you're pushed to employ some of these practices to stay competitive: any meaningful regulation that flattens that is a great outcome.
Bhuvanesh: Ashish, you mentioned two things I want to pick up on. One, you used the example of loan apps and interest rates and so on. The other point was that it may never be possible to enforce this, in the sense that there will be far too many complaints at any given point in time; it's physically impossible for the government to go after pretty much everybody.
But on the other hand, I can speak to financial scams, for example. There have probably never been more scams, especially the Chinese loan-app scams we were discussing in the office this morning. If that is the case, can't the people doing these things simply not worry about getting into trouble and continue doing whatever they want? In that sense, is it possible to put in place a reasonable deterrent against such practices?
Ashish: Under the Consumer Protection Act, for example, the government has the power to issue cease-and-desist directions. This can be a very powerful tool. Imagine a midsize or large company receiving an order, which the government stands by, saying "you will cease and desist from doing ABCD." This could mean operations come to a halt; it could be a big thing. But it's not just about that firm. If the government has made a well-reasoned order, it has demonstrated how it looked at the particular case; for example, it can levy a penalty of up to 10 lakhs, and beyond that it can even impose imprisonment, and so on.
The thing is, if it can be done in such a manner that it acts as a signal, it gives a clear message to others to do their maths and figure out whether it's worth doing what they are doing. A lot of times, even if company A gets penalized, companies B to Z can't figure out the maths: they either think it was just random bad luck that this one got penalized, or they can't work out the maths behind the penalty and the action.
So then they can't do an analysis to say, "Okay, what is the likelihood of the same kind of penalty in a similar situation?" The signalling effect of cease-and-desist orders and penal provisions is what can have an industry-wide impact. If the actions don't give that kind of signal, then I think we are just going to make incremental progress; it's going to be a bit of cat and mouse, and we'll continue to evolve, but only incrementally.
Bhuvanesh: In this entire thing, one area of potential tension is that what the regulation calls a dark pattern, I might consider marketing. To your mind, is there a line where something stops being marketing and starts becoming a dark pattern?
Dr. K: That's a tricky one, because if we truly scrutinize a lot of marketing, pretty much the entire concept of marketing may turn out to be a huge global dark pattern itself. So I don't have an answer, but being prescriptive, even with the best of intentions, definitely will not work. It can only backfire.
But, like Ashish said at the beginning, if you can establish intent to deceive, that's a pretty good test, and that construct has existed as a legal framework for centuries. So that sort of approach is what will probably work. Obviously, you can't fix and regulate everything; the long tail will always continue to employ all kinds of outright scammy patterns, forget dark patterns. But if you can at least get the big players in any industry to cut down on dark patterns, that's a huge win; that would be something like 80-90% of all activity. I think the focus should be on that. Being overly prescriptive is, firstly, physically not possible, and it can't work.
Bhuvanesh: Ashish, anything to add on that note?
Ashish: No, Kailash, I completely agree with you. From a government point of view, they need to bring more conversations to the table: bring in industry to discuss some of its practices, and also have these discussions with consumer groups. Sometimes, wearing just the industry hat, I could convince myself that something is absolutely a great pattern, because every time I'm trying to convert a potentially unwilling customer, one could argue that it's a dark pattern, right? But that's the whole idea: there are potential customers who want our product, and we go and make that pitch and eventually sell it to them.
So there are going to be those grey zones here. But there are clearly dark patterns that can be established in black and white, and the focus should be on those, not on muddling the discussion with the grey zone, at least not at this point in the journey. That, in some sense, is itself a sort of deception. So let's identify what we agree is a dark pattern and solve for it, and not worry about the grey stuff, at least in the short to medium term.
Dr. K: I agree.
Bhuvanesh: I have a slightly stupid question, so please bear with me. Does this come down to a philosophical thing, in the sense that capitalism runs on more growth, more numbers, more metrics, more everything?
Everything is more: growth is good, less growth is bad; if you have more growth, you're rewarded with higher share prices, and so on. So far in this conversation, one thing seems apparent: less is good. Does this mean capitalism has to break down for this to work, in the sense that this doesn't seem tenable? I'll start with you, K, because this is also your favourite horse to beat.
Dr. K: I mean, we are in the capital markets, so it's a very difficult question to answer. I don't want to break into a commentary on different economic systems and the current state of civilization; it's very complex, and I don't have any answers. However, if we agree that a balanced approach is really the optimal way forward, you need markets, you need free markets, and you need the right kind of regulations for the right kind of industries; it can't be a wild, wild west.
You can't over-regulate, and you can't have no regulation at all; it will just become the wild, wild west. So a balanced approach seems to be the only choice we have as humanity as a whole. Over the last hour, we've discussed ways to get to a balanced approach to tackle yet another problem that has come up, which is widespread dark patterns. We just have to take that balanced approach. It's a very fragile balancing act, and I don't see any other way forward.
Bhuvanesh: Got it. Ashish, commentary on capitalism?
Ashish: No, I think there are various ways of running a business, and different people have various time horizons and outlooks. So I don't think there is one way of looking at it. I think the test is what lets the founders sleep well at night; that's an important test. Whatever makes the founders sleep well at night is good for them, and the law will take care of the rest. I'll just stop with that.
Bhuvanesh: That was pretty much it from my side. Any parting comments from either of you, on that note?
Dr. K: Nothing much. I was commenting the other day that technology regulations, whether in India or outside, on any continent, tend to lack nuance, because we are basing our entire understanding of technology on how it has evolved over the last 10-15 years while making laws that are meant to be forward-looking over the next several decades. And typically it is done very poorly, because we also don't really have the tech capacity or nuance within law- and policy-making circles.
So every time I see a tech policy or regulation, there are good things, but there are often tons of nuances that are missed. However, I was saying the other day that this seems to be the first tech policy or regulation attempt that I tend to agree with, in principle, almost wholeheartedly; the previous one was the net neutrality regulation, around 2015-16. Since then, this is the one that has personally made me very happy. In its examination of dark patterns, it kind of nails it. What remains to be seen is how it finally gets implemented and how it evolves, but the start is great, and as a technologist, it's one of those rare things I almost wholeheartedly agree with.
Ashish: If we just look at our population and where we are, I think establishing digital trust is going to be the key thing that really expands the size of this entire pie, in terms of the market. If you zoom out from individual firms, with some of the things now happening in policy, whether it is the consumer protection rules and the dark patterns, or the Data Protection Law, or the reform of the IT Act in the form of the Digital India Act, we hope to see a structure that uses laws and technology, and involves industry, to build that trust.
And if you zoom out of the specific issues with specific policy-making, where you could crib a lot, as I do, and look at the last 10 years and ask what really went wrong at a big level, there are just a few things I would point to; largely, I think we are directionally in the right space.
So from that point of view, I think it's something we need to support and contribute to, because left alone, it will be difficult for the government to deal with these nuanced topics. You talk about millions of capable minds working in one direction; at least some of them need to also work in this direction, and if that starts happening, I think we are pretty much safe.
Bhuvanesh: Got it. This has been an absolute education for me. Thank you so much for taking the time, both of you. We will end with that.
Dr. K: Thank you, thanks Ashish.
Ashish: Cheers. Good talking to you, Kailash. Thanks.
Dr. K: Thank you. Bye.