Published: 01/02/2024

How Can Technology Companies Balance Profit and Ethics? Tech Excellence Podcast with Marta Fydrych-Gąsowska


Marta Fydrych-Gąsowska is a cybersecurity expert, and the ethics of technology innovation is a topic close to her heart. In this episode of the Tech Excellence Podcast, Monika Dawidowicz and Marta take a deep dive into the fine line between sheer profit and ethical product management and deployment.

Key takeaways

The importance of ethical tech

  • As a security specialist, expert in law, and mother, Marta highlights the gravity of ethics in technology, especially since children are increasingly exposed to it. 
  • Companies can thrive while still prioritizing their customers’ security and the protection of their personal data.

Invasive advertising and ignoring privacy

  • Contemporary advertising mechanisms are often invasive and neglect users’ privacy, leaving them no opportunity to consent to sharing their data.
  • At times, companies deploy their products without considering the consequences—for example, face apps and filters that can destroy the confidence of young children, especially girls.
  • The problem Marta highlights is that company executives often ignore such issues and leave the product as it is.

Technology and addiction

  • Modern apps are designed to get users addicted via gamification and reward systems that can boost the user’s ego.
  • It’s essential to look back at a product’s outcomes: what influence it has on its users, and how their safety can be ensured.

Profit should come hand-in-hand with ethics

  • Executives often put profit ahead of end users’ wellbeing and safety, and this should change.
  • The growing problem of tech addiction needs solutions that protect users without hindering companies.
  • Determination to create products that genuinely benefit users can also result in better profit, as users would feel safe and cared for.

Education and transparency

  • Some ethical issues boil down to the transparency and education that companies should provide. Technological development brings many benefits, as long as society is educated about the threats and consequences.

Leading by example

  • Marta gives several examples of companies that do it right, namely Proton, the Brave browser, and Salesforce, to show that there is hope for ethical and safe technologies.
  • By engaging stakeholders, including employees and customers, in the decision-making process, the leaders can ensure ethical practices that satisfy everyone involved.


Transcript

Monika Dawidowicz: Hello, everybody. We are live at the Tech Excellence podcast, where we discuss all things connected to tech and how you can deliver excellent tech using very different tools and completely different perspectives. And today, I am joined by Marta Fydrych-Gąsowska, cybersecurity expert. Hello, Marta. Nice to have you here. 
Marta Fydrych-Gąsowska: Hello. Thank you very much. 
Monika: Most of the people here probably know you because of your involvement in different topics connected to information security, risk assessment, and ethics in technology. However, what I noticed in your LinkedIn bio is the phrase “humanist in IT.” And I thought it was really intriguing and interesting. So maybe before we jump into today’s topic of balancing profit and ethics in technology, you could elaborate on that headline that you’re using. 
Marta: Of course. Of course. Thank you. So, yes, I can say that I define myself as a humanist in IT because I have a strong humanistic background and education, and it has always stayed with me. I was always interested in psychology and in the social aspects of tech. And I have to admit that when I became a mother and saw this clash between kids and tech, I started to think more about the ethics of this tech, what it is like, and whether tech companies really care about people, and especially children. 
Monika: So, it was your personal experience that drove you to that topic. 
Marta: Exactly. 
Monika: However, your work with information security and risk assessment started before you got interested in ethics in tech. 
Marta: Yes. And I have to admit that my background, my education, is in law, and I became an information security specialist completely by accident, I can say. And then I saw that this world of security standards and the whole security environment was really fascinating for me. So I just decided to reskill, and I came to this new domain. And here I am. 
Monika: OK. I’m really, really glad you’re here with all of your observations and interesting insights, because what we’re going to discuss today is a variety of topics, mostly ethical and unethical activities of different tech companies and examples of companies that actually care about humans and balance that with profit. There are a few other things we are going to cover. However, I would start with the unethical practices happening within the tech space, because you are observing the space carefully and dealing with different companies and different clients. So, I’m curious about the most common ones you notice occurring within the space. 
Marta: Yes. We can divide these practices into two types. One is when a company intentionally ignores customers, ignores data subjects, as we customers are also called, and embeds many invasive techniques in our daily life. For example, recently, when I was in Kraków, in one of the shopping malls, in the restroom, there was a whole advertising system installed in the mirror. I don’t know what the exact criteria are besides gender, but there was something like this: you can show your advertisement to everyone or only to selected people. So I don’t know what criteria they take into account when they show advertisements in a restroom mirror, anyway. So…
Monika: And it’s such a private space. 
Marta: Yes. 
Monika: There are ads there, anyway. Yeah. Because, you know, back in the day, like five or 10 years ago, all you could do in the restroom was maybe put up a poster or something like that. And now there’s this customized advertisement there. Kind of scary. I mean, personally, I am not offended by customization when I see ads. I even appreciate it to some extent. But in the restroom, yeah, I believe this is quite invasive and disturbing, I would say. 
Marta: Exactly. All the more so if they select certain criteria, because it means that something behind this mirror is observing you and profiling your ad. 
Monika: Without my consent, actually, unless I just don’t use the restroom and fulfill one of the basic human needs. Yeah. So, yeah, that’s quite disturbing. It’s one example of, let’s say, very unethical behavior. Anything else that comes to your mind, something that struck you recently? 
Marta: The second type of these unethical practices is when a company just puts its product on the market as it is. And then, after some time, they see that there are some creepy, as we can say, creepy outcomes. For example, it was like this with the filters on Instagram. At the very beginning, they were there just for fun; you could have the face of a dog or a bunny. But then they started to enhance beauty with these filters. And that was a problem, because there are reports, including ones from MIT, proving that these beauty filters are really, really harmful to girls’ self-esteem, especially the younger ones, around 9 to 11 years old. And why is it an unethical approach? Because the company, Meta, doesn’t do anything about it. They know about it, but the money they receive from this business is too big compared to the cost of change. It’s just a small change to the algorithm, in fact. It’s not a problem of technology, of technical possibility. It’s a problem of will, of their goodwill. 
Monika: OK, so if they adjusted something to make it more, let’s say, beneficial for younger users, they would lose money, so they consciously decide not to do anything, so as not to threaten the profits. Yeah, that sounds pretty unethical. And there’s one more aspect that I believe could be discussed here, and it’s attention, because a lot of apps nowadays are designed in a way that grabs our attention all the time, continuously. And obviously, it is in the company’s interest to have our attention for as long as possible; they profit from that. But where’s the line here? Because sometimes it seems like some apps, even the Instagram that you mentioned, praise the user for spending more and more time within the app and punish you if you do not check it regularly. So I believe this is another unethical, let’s say, behavior. Maybe it falls under one of the categories you described, maybe even under both. What do you think? 
Marta: Yeah, this is it, because, you know, I would say that with our attention span, it’s a case we can locate in between. Because at the very beginning, the companies just created their digital platforms, and they wanted users to use them for as long as possible. It was their business model, and it had nothing to do with our problems today, because back then they didn’t know what results their platforms would bring us in the future. This is also a consequence of the lack of a risk assessment, which should be done at the very beginning and then constantly in the process, while the platforms are deployed. But this is one case. 
The problem is that even after years of these platforms functioning, when some ethics experts, for example Tristan Harris at Google, said, “OK, I see the effects of this kind of product design are disturbing, are damaging for our users. Let’s change it. Let’s make this algorithm more user-friendly, just to respect the user’s attention,” what happened? He made a great presentation, which was wonderfully shown in the movie The Social Dilemma, and nothing happened. Completely nothing. Because when they saw that the change would lower user attention, and that attention brings them money, it was better to do nothing. It’s also a trap of quarterly KPIs, because everyone has their quarterly KPIs, with user attention and users online as the targets. 
And even if an ethical developer made a change to this algorithm, just to make it more user-friendly, then after a quarter, for example, his manager would come and say, wait a minute, why do we have so few users? 
Monika: Yeah, why are they dropping? 
Marta: Yes. Why are they dropping? No, no, no. Let’s change it. Let’s revert your changes, because we see that there’s something wrong with our audience. 
Monika: OK, so the decisions should come from above. And it’s extremely difficult, especially within a huge company, for an individual to change anything. Yeah, this is something that should be, like, the mission of the company. 
Marta: That is it. 
Monika: Because…
Marta: The results of a lack of attention can be really serious, because it’s not only that I cannot concentrate on a piece of paper or read a long text. A lack of attention also makes us more prone to phishing and other social engineering techniques. And it is also, how can I say, a danger for our democracy, because if you are not aware, if you are not attentive, if you are not mindful, you start not to understand what’s good and what’s bad. So, on a political level, it’s really very dangerous. 
Monika: OK, so there are much broader results of such activities, of putting profit and KPIs over the human. There’s a very, let’s say, social aspect, and a political aspect is affected as well. So, who should be responsible? Because on one side, it’s us, the users; we can choose to some extent, obviously, because, you know, addiction is something outside of our control. If it happens, it happens, and it’s extremely difficult to break. Then there are the companies, and we can, let’s say, trust that some of them will deploy the right practices, but the others won’t. So maybe it’s something that should be legally regulated?
Marta: Yes, it’s also about regulation, but, you know, with regulation there is a problem, because legal regulations will always come just after the technology. This is natural, you know. Kevin Kelly wrote a really great sentence, that technology is like water: you can ban something, you can regulate one piece of the water… 
Monika: And then it will leak somewhere else. 
Marta: Exactly, exactly. 
Monika: So most of the responsibility is actually on the creators of technology.
Marta: That is true. 
Monika: So… maybe there are some companies that are worth, you know, looking up to because they are actually ethical. Are there any examples? Because right now it seems like everything is scary; we’re all getting addicted, and the companies are just, you know, doing it for the profit. But I believe the world is still a good place. So maybe there are some examples of companies that, you know, actually manage to strike some kind of, let’s say, balance here? 
Marta: Yes, definitely. I can tell you about some of them. 
Monika: Yeah, because we don’t want to fearmonger, and also, we don’t want to just point fingers at Meta and everyone else. Maybe there are good examples that, you know, people can get inspired by. 
Marta: Yes. OK, so I will start with some privacy-enhancing companies and then I will have a surprise also. 
Monika: OK. 
Marta: But yes, there are a lot of companies that are privacy-friendly. For example, there is Proton, and there is the Brave browser. OK, so let’s just concentrate on these two. One of them, Proton, offers its clients a privacy-friendly email box, a VPN, a privacy-enhancing drive, and some other products. And the thing is, they not only say that they respect our privacy, they really do it. I read their newsletter, and I see that in every product, every month, they upgrade the level of security. And they say: we don’t track you. We don’t see what you are writing about in your emails. We don’t know what you are storing on our drive. It’s yours, and we respect that. And judging from the people I know and from their public profiles on social media, they have a lot of clients who come to them today because they offer really well-protected services. That is why, you know, because they say: we respect you. That is it. 
And the same with the Brave browser, which, how can I say, is used more widely than we would expect. It also respects the privacy of its users and doesn’t track them. And it’s true that if you write something in Brave, after two or three days, or even after a few hours, you will have to write it again, because they don’t save it. 
Monika: They don’t store that data. Yeah, that’s interesting, and I’ll definitely give it a try to check if I, as a user, can feel any difference. OK, so are there any other examples that you could share? 
Marta: Yes, and the last one, which surprised me when I read about it for the first time, is Salesforce. Recently, I found a report by the World Economic Forum, a whole whitepaper on Salesforce’s ethical practices, which they started in 2019. For example, they established guiding principles not only with the board and management, not only with experts, but with all employees. And then they also validated them with more than two thousand clients and customers. And they created a catalogue of values that are really important for them in the company: human rights, privacy, safety, honesty, and inclusion. So, you know, as I observe them and some people working at Salesforce, they really make it work. 
Monika: OK, that’s quite inspiring, I would say. And it actually proves that, you know, such a successful and well-known company can be known and successful without violating users’ privacy. Here I would like to smoothly get to the topic of that balance of profit and ethics, because for any technology in this world to develop, it needs to generate profit so that there is enough money to develop it further. So I’m wondering, you know, isn’t there a conflict between that urge to generate growth and the actual ethical considerations? Where’s the balance here? 
Marta: No, in my opinion, there’s no conflict between them. There is, for example, something like a precautionary principle that I read about in the really magnificent book by Natalia Hatalska, The Age of Paradoxes. And she writes that the precautionary principle assumes that technological development brings benefits to humanity, so it should be maximized. But at the same time, we should educate society about the possible threats and negative consequences. So it’s like this: a company brings a new product to the market, it deploys it, and then, during its life, it permanently makes, how can I say, safety checks. 
Monika: So reassessing if it’s actually safe. 
Marta: Exactly. 
Monika: As they go and as they get feedback from the users, for example. 
Marta: And we assess it, yes, constantly, all the time, to see the results. And if I can just bring some more details about this principle, because it’s not well known, but it’s really practical. I think it’s good for all stakeholders to apply it, because it assumes that the company will use foresight to describe, at each level of the product lifecycle, the possible negative consequences and harms. Then, there is an ongoing assessment of the technology in terms of the possible threats it poses. And there is also rapid remediation of harms and risk prioritization, so if something happens now, we don’t put off mitigating the risk to the far future. 
Monika: OK, because we already have some sort of formula in case it happens. 
Marta: Yes, yes, exactly. And last but not least, transparency and openness to all stakeholders. The product may still not be open source, because not everyone wants that business model, but it’s open to opinions and to the interests of all stakeholders, not only shareholders. 
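To make the principle concrete, here is a minimal sketch of how the elements Marta describes (foresight at each lifecycle stage, ongoing assessment, risk prioritization, and rapid remediation) could be modeled as a living risk register. It is an illustration only: the class names, the likelihood-times-impact scoring, and the example entries are assumptions of this write-up, not anything prescribed in the episode.

```python
# A minimal, illustrative risk register for the precautionary principle.
# All names and the scoring scheme are hypothetical, not from the episode.
from dataclasses import dataclass, field

@dataclass
class Risk:
    stage: str        # lifecycle stage where the harm was foreseen
    harm: str         # possible negative consequence
    likelihood: int   # 1 (rare) .. 5 (almost certain)
    impact: int       # 1 (minor) .. 5 (severe)
    mitigated: bool = False

    @property
    def priority(self) -> int:
        # Simple likelihood x impact score used for prioritization.
        return self.likelihood * self.impact

@dataclass
class RiskRegister:
    risks: list[Risk] = field(default_factory=list)

    def foresee(self, stage: str, harm: str, likelihood: int, impact: int) -> None:
        # Foresight: record possible harms at each lifecycle stage.
        self.risks.append(Risk(stage, harm, likelihood, impact))

    def reassess(self) -> list[Risk]:
        # Ongoing assessment: surface unmitigated risks, highest priority
        # first, so remediation is rapid rather than deferred.
        open_risks = [r for r in self.risks if not r.mitigated]
        return sorted(open_risks, key=lambda r: r.priority, reverse=True)

register = RiskRegister()
register.foresee("design", "beauty filters harm younger users' self-esteem", 4, 5)
register.foresee("deployment", "reward loops encourage compulsive use", 5, 4)

# Run at every release, not once at launch.
for risk in register.reassess():
    print(f"[{risk.priority:>2}] {risk.stage}: {risk.harm}")
```

The point of the sketch is that reassessment runs at every release rather than once at launch, which mirrors Marta's remark that the assessment should be done at the very beginning and then constantly in the process.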
Monika: Yeah, I think that’s a really powerful statement: to focus not only on shareholders but on all stakeholders. It’s a funny, let’s say, connotation of the words, but it’s actually pretty accurate. Sometimes it’s a trap when the company is focused on the KPIs, because, you know, there is somebody out there waiting for the financial results, but then there is this severe effect on society, on the users, and the users are the society. So, yeah, the principle you’ve mentioned is definitely worth taking into consideration. And are there any other guidelines or frameworks that come to your mind that should be adopted in the tech industry? 
Marta: Yeah, I think that every company that would like to be really ethical, and not just do the so-called greenwashing, should go the way of Salesforce and ask all their stakeholders, employees, external experts, and also customers: what values are important to you? Just like that. Because we know that the law will always come just after the technology. And there is also a problem that the law struggles with definitions, because you can have a product that has the functions of what is prohibited yet does not fall under the legal definition. So- 
Monika: So there would be, like, a loophole in the legislation. 
Marta: Yes, exactly. And it’s very, very likely that something like this would happen. That is why I think that companies should just regulate themselves internally, and also within their communities, to make it together. If the law cannot be as elastic and as agile as we need it to be, let’s just regulate ourselves. This is the so-called soft law. But there is also a need for goodwill. 
Monika: OK. Yeah, that was all really interesting and definitely worth reviewing for most companies: whether they actually apply the framework you discussed before. I would like to get back to the topic of tech addiction, because it seems like one of the most severe effects of those unethical practices that companies just keep on doing. And I think the global, social effects are the most severe as well, because we are a nation of people addicted to and glued to their phones. And I’m wondering what steps tech companies can take to ensure their products are designed in a way that is, let’s say, profitable but not addictive. What should be taken into consideration here? Because, again, it’s a fine line. They profit from people using the technology for as long as possible, and then it can cause a lot of damage. So, what’s the fine line here? What are the practices worth taking into consideration as a tech company designing a product? 
Marta: OK, so, you know, I think that on one side, this precautionary principle would work really well here. 
Monika: So predicting what could happen and finding resolution before it even happens. 
Marta: Yes, yes. And then educating, educating clients too. For example, there’s a book by Joanna Glogaza that I haven’t read yet. It is called, wait a minute, it is called Attention Economy: How Not to Screw Up Your Life. And she proposed something so simple that it’s genius: a wallpaper on the phone, so that every time you take your phone and turn it on, there’s a wallpaper asking, “What are you going to do here?” 
Monika: So, like, a kind reminder to be mindful. But again, it puts the responsibility on the user at the end of the day. So it’s up to us, the users, the end users. But on the other hand, we vote with our attention, yeah. So maybe if it’s us, the end users, who change the way we use the technology, it also changes the way it is designed?
Marta: You know, you are right that we vote with our attention. So it’s up to us to choose products that we feel well with, that we feel respect our basic needs, our values, and us as humans. So it’s just about choosing. This is what we can do. But I want to say one more thing. There are a few authors known for first writing guides for developers on how to make products as addictive as possible, how to glue prospective users to your products, how to design for it. And then they wrote books for users on how not to get distracted. For me, it’s a two-faced approach. 
Monika: And a red flag, I would say. 
Marta: Exactly. So it’s not like only we as end users can do anything, that it is only our responsibility. We have to see that there is a part for us, for example, to practice mindfulness or do something that makes us calm. But it’s not the only way. There is also the second side: the companies. 
Monika: Yeah, the creators. Because it would be unfair to put all the responsibility onto the users, especially since we’re so tiny in comparison to all those mechanisms that make us addicted. Legally, it’s quite difficult to regulate. From the user’s perspective, it’s just a small action. So, most of the responsibility is on the creators. 
Marta: On designers, yes. 
Monika: On the designers, on the creators. So what is the way forward? Is it education? You said, foreseeing the possible harmful effects and maybe thinking about the solutions before they happen. Is there anything else that people should take away from this conversation, especially if they are creating some technology? 
Marta: Exactly. 
Monika: What’s the way forward with that? 
Marta: Yeah, so definitely creating foresight and creating possible scenarios: what can go wrong? Because when we make a new product, we concentrate only on the bright side. Yes, we imagine how wonderful it would be, but… 
Monika: How profitable, growing, successful, glamorous. 
Marta: Yes, yes. But we should do scenario planning, scenario creation, with every possible, plausible, and preferable scenario, and see what they will bring us. And just embrace it, whatever the result will be. So this is the first step when we start developing and designing our product: what can go wrong? And then I see a really huge role for education. Because, you know, this wallpaper, “What are you going to do here?”, proposed by this writer, why can’t it be used by companies? For example, Johann Hari, in his book Stolen Focus, asks the question: why is there no “meet with friends offline” button on Facebook? Because, you know, Facebook is about relations. It’s about meeting people. So why can’t this platform… 
Monika: Remind the users, yeah? 
Marta: Yes, yes. Because it won’t happen online. That is the problem. So this is it: not just seeing the time spent in front of the application, but seeing the human who is looking at it. And I will give you a recent example of a situation like this. I learn Swedish on Duolingo. And there is, you know, a competition, who will get more points. And it was a Sunday, the final day, and I was in the demotion zone, because that week I had learned really little. So I had a choice. I could fall down to the previous league with only a few points, but spend time with my daughter, with my family, which is what is really important for me. Or I could spend this time only with a phone in my hand. 
Monika: And that little avatar of an owl who tells you whether you’re doing a good job. 
Marta: Yes. Telling to my daughter, “sorry, I have to learn Swedish,” yes. 
Monika: Yeah, if it wasn’t the technology, if it was just a textbook, there probably wouldn’t be a dilemma, yeah? Everybody would choose spending time with their family. And with technology, it is so tempting. There is this gamification and, you know, some instant gratification, because if you have enough points, you get some kind of confetti and nice sounds to praise you for it. I think that’s the difference between learning in the digital era and the textbook era: it wasn’t so rewarding back then. So obviously, it’s beneficial for people who need extra motivation. But on the other hand, it can be, let’s say, a double-edged sword. Because, well, actually, I had a similar example last night. My son is using Duolingo, and he’s very, let’s say, serious about it. And he’s sick right now; he has a terrible flu, and he was just lying in his bed with a fever. And then he told me, “Mom, mom, you need to bring my phone here, because I will lose my Duolingo streak.” And here, I think, is that fine line. OK, that’s motivation for learning languages, and I felt, again, that it’s great that he’s so motivated. When I was 10, I had no motivation to learn languages, because, you know, you’re not a logical person when you’re 10, and you don’t think, “OK, I will use it in the future.” And now the kids have that little animal telling them: yeah, you’re great, keep on doing that. But, you know, there’s real life. There’s time for recovery. So, sometimes technology is not responsible enough to let people have a rest. 
Marta: Exactly. And this is it, this is the point: to see users as people, as humans, who have families they need to spend time with, who need to go out and meet people in real life, because this is what gives us real satisfaction and what makes our brain really happy. And I know we don’t have much time, but I want to say one more thing. There is a story, and I don’t know how true it is, because I don’t know anybody in Silicon Valley, but I heard that the CEOs and other bosses of tech companies in Silicon Valley don’t let their children use the products they produce. And they send their children to offline schools, out in the forests, completely without technology. So, yes. If I could ask one question to an entrepreneur who wants to check whether he or she is ethical in their work, it would be this: would you give your product to your children without any hesitation? 
Monika: Oh, yeah. That’s a powerful one. And I believe for B2C companies, that’s definitely the basic question to ask if you’re creating tech for consumers, especially for entertainment or education. I think it’s a powerful question to ask yourself, for all the designers, for all the authors of the apps. Yeah, definitely. And I think that’s a perfect closing thought on balancing profit and ethics in technology: ask the human question. Would you give this technology to a person you love without any hesitation? OK. So, we covered a lot of different areas, and I believe this topic is so deep on so many levels that we could just go on for hours. But we don’t have that much time. Definitely something worth exploring, though. 
So, to all of you who listened to us today, if you have any questions, you can either post them in the chat right now before we switch off. Or you can tag me or Marta in the comment section. And we will try to get back to you with some answers. So, yeah, I believe there are no questions so far when we’re live. But don’t hesitate to ask afterwards. And we’ll try to get back to you with either sources of information or maybe recommendations of people to reach out to if we know them. 
Okay, Marta, thank you very much. That was really insightful, and it opened up a lot of other topics that I would like to discuss in future episodes of this podcast. So, to all of you listening to us, thank you very much for your attention. Make sure to subscribe to stay up to date with the other experts visiting our podcast. Yeah, and I wish you a great and mindful day. Keep on making responsible technology. Bye. 
Marta: Thank you. Thank you very much for having me. Bye.