
Simplify for Success - Conversation with Tawfiq Alashoor


Tawfiq Alashoor was on #SimplifyForSuccess, a podcast series presented by Meru Data and hosted by Priya Keshav.


Mr. Alashoor shared results from his research on the behavioural economics of privacy. He also discussed the factors influencing people’s ability to make privacy decisions.


Disclaimer: Some of the studies mentioned in the podcast are published, conditionally accepted, under review, or being prepared for a journal submission. Several scholars have contributed to these studies including Mark Keil, H. Jeff Smith, Allen R. McConnell, Kambiz Saffarizadeh, Mahesh Boodraj, Carolina Alves de Lima Salge, Jasper Feine, Grace Fox, and Laura Brandimarte.


For a list of studies published by Tawfiq Alashoor, visit: Link


For a publication that summarizes many studies around the behavioral economics of privacy, see Acquisti, A., Brandimarte, L., & Loewenstein, G. (2015). Privacy and human behavior in the age of information. Science, 347(6221), 509-514. Link

Thank you to Fesliyan Studios for the background music.


*Views and opinions expressed by guests do not necessarily reflect the view of Meru Data.*







Transcript:

Priya Keshav:

Hello everyone, welcome to our podcast around simplifying for success. Simplification requires discipline and clarity of thought. This is not often easy in today's fast-paced work environment. We've invited a few colleagues in the data and information governance space to share their strategies and approaches for simplification.

Today, we will be talking to Tawfiq Alashoor, who'll be talking about his research around privacy attitudes and privacy behavior. Hi Tawfiq, welcome to the show.


Tawfiq Alashoor:

Hi, Priya. Thank you very much for inviting me. It's a pleasure.


Priya Keshav:

So, tell us a little bit about yourself and your research around privacy.


Tawfiq Alashoor:

Let me take you back to, let's say, 2012 or 2013, when I started by drawing a cartoon, like a model, a data flow model, or really a random model where you draw whatever subject or object comes to mind, connect them, and exaggerate some of the areas in that drawing. That was the big data time, the cloud computing time; those things were big back then, ten years ago. And then I started putting things in a pyramid where the cloud was on the top, there's data coming to the cloud, and below all that layer of big data were the consumers, individuals, users, whatever we call them, so the entities that actually create objects in these databases. The security and privacy idea came to my mind right away, and I started putting a circle in there, and then more and more, and then it became my thesis when I was doing my master's, and the story hasn't stopped since.


Priya Keshav:

So you do a lot of behavioral experiments, right? Around how and why people make privacy decisions. It's a fascinating story that you have been researching, so we'd like to hear some more about the type of behavioral experiments that you do to understand how privacy decisions are made.


Tawfiq Alashoor:

That would be the area I started in when I began my PhD at Georgia State University. I had a chance to work with one of the well-known people in that area, the privacy behavior area in the field of information systems, and then I also got exposed, in a good way, to the work in behavioral economics by Acquisti and his colleagues and many other people in the field. Some experiments just fascinate you when you learn how malleable privacy decisions are. I can give you a few examples of studies that I've been working on. Hopefully they come out soon, but this is the work of years and, as you may have heard, publishing scientific work takes a lot of time. Some of these experiments, in a nutshell, are about the privacy paradox, so trying to explain that phenomenon which indicates that individuals who express a high level of privacy concerns do not necessarily act on those concerns, and this paradox has been intriguing scholars and also practitioners.


So I got into that area and I tried to measure these things. Privacy concerns we try to measure through self-report indicators; we ask people how concerned they are about, for instance, using a certain app. And then we also measure their disclosure behavior, which is the privacy decision in this case, and disclosure could be measured in many different ways. There are studies that rely on intention to disclose information, and studies that measure past behavior, like: have you disclosed, for instance, or have you shared your age on Facebook? Intention would be: how willing are you to share your age in this app? Other studies measure actual decisions; they ask people to share actual personal data.


So depending on the context, those are the two factors, and the relationship between them is what I've been studying; I try to understand under what conditions the relationship between the two is more or less likely to hold. In theory, we would expect those who express a high level of privacy concerns to disclose less information, and we would want to see a strong correlation there. However, using some very nuanced and simple primes or nudges, or whatever you label those factors, we can actually change that relationship in a way that it doesn't hold anymore, meaning that the theory of more privacy concerns leading to less disclosure of personal information kind of breaks down because of some economic factors, social factors.


For example, a psychological factor would be happiness: if someone is happy, they're less likely to act on their privacy concerns at the moment when they disclose personal information. The setting of the studies we've been conducting varies from online surveys to class experiments, but I've been saying this a lot: happiness or positive emotion is just one factor. Other factors include cognitive depletion. For instance, if someone is cognitively depleted, they're less likely to actually process that belief about privacy and privacy concerns, and therefore that belief is less likely to influence their sharing behavior. So that's it in a nutshell. There's a bunch of other factors that we've been studying, and many other researchers have been studying those factors as well.


Priya Keshav:

It's fascinating, right? Even with myself, if I reflect on some of these things, and obviously this is not scientific, but I can see that I'm extremely concerned about privacy. You mentioned two factors, and that there are many others, but one being happy: I tend to be more trusting maybe when I'm happy, because I feel I'm closer to friends or closer to people who have similar interests, which maybe makes me start revealing more personal facts about myself than normal. And you mentioned another major aspect of the lack of correlation between disclosure and privacy concerns, which is...


Tawfiq Alashoor:

The cognitive capacity. So if you've been doing a very cognitive task, and then you run into a website that asks you to check all those buttons on the screen, but you're really just so tired and you want to get done with it, you don't have the capacity to actually process those privacy concerns and beliefs.


Therefore, you most likely just click "accept all"; unless they have a very obvious "reject all" button, you won't be searching for that button, or are less likely to search for it.


Priya Keshav:

Yeah, which brings us to this, right? Most of us end up clicking "accept all" on all of the cookie banners, because when you're presented with 20 choices and all you're trying to do is just one thing, whether that's reading something or visiting a website for a certain purpose, at some point you're so overwhelmed by the number of choices that you end up just saying yes. Which also probably happens with us providing a bunch of information if asked. Because if I need to book a ticket, and they want something personal for it, I'm probably going to give it, because I need to know the price of that ticket first.


Tawfiq Alashoor:

Yeah, there are many factors and theories that explain that. One of the prominent ones is immediate gratification. So the reason that people are more likely to just click accept is, first, the design architecture or choice architecture built by developers, which actually drives you to click accept or reject depending on how the designers design it. But there's also immediate gratification, in the sense that you just want to get to that benefit that you are striving for at that moment. So the privacy part, I think in economics it's called a secondary good; it's not a primary good at that moment, so you're less likely to actually take care of it.


Priya Keshav:

So in other words, you're basically saying that the fact that a consumer has major concerns around privacy most likely might not reflect in their choices or behaviors, because there are other influencing factors that make the person voluntarily give out information. So judging whether they have concerns or not based on consent probably isn't a good idea, is it? Am I drawing too strong a conclusion there?


Tawfiq Alashoor:

No, so yes, the experiments showed that. However, that does not mean that people don't care about privacy and will not act on those concerns at all. There are situations under which, for instance, you brought up trust, which is a major determinant of privacy decisions. If trust is high, it could outweigh the privacy concerns and the privacy cost, because the benefits sought at that moment, along with a high level of trust, are more likely to lead people to just give personal information. But then you mentioned something else after that; I forgot it.


Priya Keshav:

Does it mean that my voluntarily giving consent would not matter, because it just depends on how the consent is asked for? If you ask in a more persuasive or positive way, most customers are more likely to give consent, whether or not they want the privacy choices.


Tawfiq Alashoor:

Let me give you an example of an experiment we conducted many times, more than 20 times. The number of subjects participating in those experiments would reach, I would say, more than 4,000 or 5,000 people. It's an online experiment in the survey style. So we brought people through research platforms, platforms that provide crowdsourcing, like Amazon Mechanical Turk or Prolific. There are many other platforms like that, where there's a demand side and a supply side: people are participating in solving tasks, and the requesters, in this case the researchers, provide monetary incentives to participants. So we put up that study, and people were invited to participate in the study on Amazon Mechanical Turk. Half of the subjects we asked to provide a set of personal information; this includes demographics and location-related, financial, and health-related questions. Some of them are more sensitive, some of them are less. None of them was an identifier, but it could go as far as asking someone what the domain of their personal e-mail is. That's not identifiable personal information, but it could point toward an identifier, right? Your e-mail is a primary key, or a personal identification number, because it belongs only to you. That half of the subjects we asked those questions right away. The other half, though, we first asked how concerned they are about research studies asking for personal information, a bunch of privacy-concerns-related questions. So the other half were asked the privacy concerns questions just right before we asked them that same exact set of personal questions.
And as we would expect, those participants who were not primed about privacy and privacy concerns provided almost everything; I think, based on that study, about 21 or 22 items. Whereas those in the other condition, those whom we primed before asking the personal questions, provided on average, depending on the study, I'm now thinking of five or six data points less. So just changing that environment, and they were randomly assigned to this condition, made people reveal five or six data points less, and those data points most likely would be the sensitive ones. So people were actually acting on those concerns and on the sensitivity of the personal data they were providing, because they were briefly primed through a few questions about privacy. That might get to the point about whether people actually act on their concerns, and this study would suggest that people have to be aware at the moment, so they have to be primed, and designers could think of this. So now this gets us to privacy by design; these are some of the techniques that companies could use to resolve transparency and compliance issues regarding privacy.


Priya Keshav:

So in other words, if I'm happy, I end up providing more information, but if you ask me questions that highlight my privacy concerns, I may be primed, which makes me start focusing on my privacy concerns; whereas if I'm not focused on them, I may just voluntarily give that information anyway, even if I wouldn't otherwise normally provide it. But doesn't that kind of disincentivize most companies, because they want consumers to give them information? So I would rather prime them on something positive, something that increases trust, than provide awareness around the privacy concerns.


Tawfiq Alashoor:

Yeah, yeah, absolutely. So it's a strategic move, and one has to understand the nuanced effects of these choice architecture features to design an interface that complies with the regulation but, at the same time, does not lead to a significant reduction in the provision of personal data. So you want to make people see the benefits that you provide to them as a data controller or as a marketer.


Consumers need to see the benefits. There is a whole literature around the privacy calculus: people will actually share more personal information if they perceive the benefits to outweigh the cost. So it's not an easy question, but it's doable given what we know from the privacy literature of the past 20 or 25 years.


Priya Keshav:

So I have a couple of questions. We had talked about two influencing factors, right? Basically fatigue as well as happiness, and there are many other factors, but I'm particularly interested in anything that you thought was odd or surprising as a factor, because these seem like obvious things, right? I can understand that when I'm happy I trust, and I can understand that when I'm fatigued I'm not thinking clearly. But are there factors that you did not anticipate, but that tend to influence how we disclose?


Tawfiq Alashoor:

So there's a study in which we designed a chatbot, and the chatbot is there to match people, so think about a dating bot, a bot that helps people match. We called it Tinderbot. For the bot to match people, it would require some information. You know, dating apps ask for some personal information: do you like X, do you like Y, do you like apples? If you provide some indicators about yourself, that helps the algorithm learn more about your preferences, so it will be more efficient at finding you a match, and therefore a transaction. So in that experiment, the bot would first ask people a bunch of personal questions. For half of the participants, we put emojis along with the questions, you know, funny emojis. A question, for example, would be: how much debt do you have? This is a finance-related question, and it is a bit sensitive. So we put an emoji with the tongue out, with dollar signs, in a funny way, to see what happens, as compared to the other condition in which participants did not get any of those emojis. What we found was that the emojis drove disclosure behavior through a link with happiness: those who received the emojis thought the bot was more friendly, and because they thought it was friendly, they were in a more positive emotional or affective state, and that state influenced their disclosure decisions. So they gave more data when we used the emojis. What we also found is that these emojis work much more on the sensitive questions.
So we did another experiment in which we asked some people only sensitive questions and other people less sensitive questions, and again we manipulated the emojis, so it's like a 2-by-2 design. We found that the effect of the bot's use of emojis was much stronger when the questions were sensitive than when they were not. So it seems like these emojis have some emotional effect that actually leads to an effect on disclosure decisions.


Priya Keshav:

That's interesting. So, is that because I feel a little more casual, I feel like there are emotions involved, and it just makes me feel more comfortable sharing that personal information? It's interesting how an emoji can make someone share more than...


Tawfiq Alashoor:

Yeah, we explain and show in this study that the emoji actually influences your perceived friendliness of the bot, and that increases your, let's say, happiness, to use a general term. And that's why you would share more personal information.


Priya Keshav:

So in your research, I don't even know if you focus on looking at cultural differences, but, I mean, I'm being hypothetical here: are certain populations more trusting than others, which influences their decisions? Do you find different trends depending on whether it's America, Europe, Asia, etc.?


Tawfiq Alashoor:

At this time, there are many studies that have appeared recently on cultural differences when it comes to privacy. Before that, the picture was that Europeans are more concerned about privacy as compared to the US. And if we go further east, there was not much research; if we go to the Middle East and that area, there was not much research to show what the levels of concern are, whereas in the US, for instance, it was back in the 1960s, I think, when those surveys started, so there you have some accumulation of statistics on privacy concern levels. But culturally, so now I'm in Denmark, I've seen studies around whether Germany or Denmark is more or less concerned when it comes to privacy, and there are always some variations. In fact, these variations could happen across other segments of the population; think of a neighborhood, or a whole city could have a different level of privacy concerns as compared to a rural town. There are a few research papers on this, and there's more recent research in this area. I did one project back in 2016 or 2017 on cross-cultural research, but that project has been pending for a while, so I cannot speak beyond this when it comes to cultural differences.


Priya Keshav:

OK, but it does make sense, right? Because I know that in certain cultures it's pretty normal to ask personal questions of a complete stranger, whereas, if somebody walked up to me, I live in the US, and asked me what my age is, I would probably frown upon it. It would take a lot more in terms of conversation and some level of trustworthiness before I'm willing to share that information, and it would also be considered inappropriate to ask a stranger. But I can tell, whenever I visit Asia, sometimes people ask me some very personal questions, and these are complete strangers that you meet on a train, and you wonder why they would think I'd be willing to share that with them, but they think it's absolutely normal. So I'm sure there are differences in the way cultures treat privacy.


Tawfiq Alashoor:

Absolutely, there is. But it seems like it's changing, so I can give you my perspective as a Saudi person. Given that in Saudi Arabia the data protection law will be enacted or implemented next year, there's been a lot of talk around the effect of this law on businesses, organizations, and individuals. So we're in Asia, right? Far away from, let's say, China, but still in Asia, or let's say the Middle East. Growing up there, it was okay to ask a random person how old they are, I would say maybe 20 years back; especially if it's an elder, it's a sign of respect, you answer the question. But now, with the change in perceptions of privacy and personal data, things like this are becoming more sensitive, or less casual, I would say, in those societies. So such regulations could have influence in the long term, hopefully for the best.


Priya Keshav:

But I also wonder: it may be that in the moment I felt pretty happy. Let's say I have a lot of concerns around privacy and I would not, in general, share sensitive information with a company, but some influencing factors, maybe the choice of questions or what was presented to me, made me feel like I could trust, or I was in a happier mood, which essentially made me more willing to share this information. It doesn't mean that consumers don't care. I probably expect some level of fairness in treatment, and the trust also implies that I have faith that the information I shared will be handled responsibly; otherwise, you would betray the trust of the customer. So it's not just about how the UI or other factors influence me into sharing; it's important to handle it responsibly, because ultimately, since I have privacy concerns, I expect some level of responsibility in how it is handled.


Tawfiq Alashoor:

Yeah, no question at all. If that happens one time, it doesn't mean it will happen another time. Again, there are other factors that could influence the decision; trust is also a factor. So there's a calculus happening in the mind before an individual or a user makes that decision. The calculus happens whether or not we are aware of it. It can be very quick, and it can have flaws; as we know from, let's say, the past 20 years, there are a lot of flaws in that calculus happening in the brain. But now we're getting to learn how this calculus works and that there are many factors intertwined that influence the final decision of whether or not someone shares a piece of personal information. So it doesn't mean that people don't care if the system was at fault and revealed some information that it should not have revealed. But again, there are those other factors. So at this time, individuals, users, and organizations need to understand those nuanced factors that influence these privacy decisions, and the trickiest ones would be those that happen at the unconscious or subconscious level, as opposed to those that are conscious to people, like if they see a huge "reject" sign or a clear prompt like "please reject sharing your personal information." But it shouldn't work that way, because there are also benefits to accepting those cookies. And cookies would be the most benign part of the whole discussion around privacy and security.


Priya Keshav:

Any other closing thoughts before we end the show?


Tawfiq Alashoor:

I think a closing thought would be that it's very interesting to see the privacy community in industry, in academia, and in, let's say, software development; the whole trend around privacy is very interesting to observe. But I personally hope that this conversation also brings a little more focus on the benefits of data, you know, data-driven innovations. Those innovations require a lot of personal data. There are ways to resolve this tension between protecting personal data and information, and utilizing personal data for societal or personal benefits. But we see that there's a huge bias in the conversation. My hope is that, a few years from now, those two paths find a convergence point. We are talking here about a macro calculus, a macro privacy calculus, where we're trying to weigh the benefits and the costs of using personal data. My hope is that we succeed in reaching a positive outcome at that macro level, let's say 10-15 years from now.


Priya Keshav:

No, I absolutely agree. For most consumers, being privacy concerned doesn't mean they're not willing to share information. Some of them understand the trade-off, and even those who don't want to understand it; and they want personalization, right? Not sharing any data comes at a cost that would be too much for almost everyone. It's nice to be able to share, because that data drives personalization, it drives innovation, it drives a lot of things that we enjoy today, and none of that would be possible with no data. But at the same time, maybe there is an over-collection of data, where some things that are too intrusive or too personal, even though they're not necessary for the purposes of data analytics per se, have been gathered and then shared, appearing in markets and sold through data brokers, which is what triggered the pendulum to swing in the other direction. But hopefully, with conversations on both sides and a better understanding, and with the fascinating topic of understanding consumer behavior, UI and UX, and the type of information that needs to be shared with me, the type of research that you and others do around what are and aren't dark patterns and how they influence my ability to make decisions (when I say me, I mean me as a consumer), I think as we mature, we'll understand privacy better and hopefully reach a state that is good for all of us.


Tawfiq Alashoor:

Yeah, absolutely. One of the biggest public issues in the world is privacy, and to me it's fascinating. I love this area. It's what I do, it's what I read about every day, it's what I research. So let's hope for the best.


Priya Keshav:

Makes sense. Well, thank you so much for your time. It was very, very interesting to listen to your research and hear about your findings. Even though it may seem obvious when you look at it as research, it is actually much more meaningful, and hopefully we will see all of this research being put to good use as corporations think about privacy and as regulators think about privacy and consent as well. So thank you so much.


Tawfiq Alashoor:

I hope so. Thank you very much for inviting me. Hopefully organizations and regulators keep putting in these efforts, and individuals also get used to the new norm around privacy decisions and become more aware of their privacy decisions. And again, thank you very much for inviting me to this show.


