Listen to the podcast here:
Ensuring Your Product Is Designed For Trust with Adam Stone
I’m happy to welcome Adam Stone to the podcast. Adam is the VP of Consulting Services and Chief Privacy Officer of Secure Digital Solutions. I reached out to Adam early on in my journey before we started our build because I was interested in having a security consultant involved. We’ll talk about that journey and some of the things we did and how Adam has helped us while we’re building Bella. Welcome to the podcast, Adam.
Thank you very much. I appreciate it.
Let’s talk a little bit about this fun topic, security and privacy. We promise we are not going to be boring. Adam and I actually laughed a lot on this journey.
I think it’s endlessly interesting.
One of the things I learned early on in my consulting career is that if you waited until the very end of a project to think about security or privacy and who has access to things, invariably you would be in a world of hurt. Speak a little bit to where you usually see things in projects and where you wish things were.
First off, Amber, maybe I should talk about how we met each other. I recall the day you reached out to me. You gave me an overview of what you were doing, and you indicated that you would like to have somebody help you think about privacy and security issues early in the process, at the very beginning, during ideation. I’ll tell you something, in the twenty-plus years that I’ve been doing this, whether it’s for a large company or a small company, not once until you did I have somebody come to me wanting to talk about privacy and security at the beginning of a project instead of somewhere near the end. That was a breath of fresh air for me. That is why ultimately you and I are here and I’ve remained a faithful helper as well as an adherent and a fan of the Bella Scena tool.
Thank you. I appreciate that. For me, this whole concept of security and trust was important. We’re living in an era where our data is sold and we don’t know what people are doing with the data. I wanted to come at it from the perspective of, “I’m the small company that is getting started. How do people know I’m not going to be slimy with their data and what I do with it?” Also, I didn’t know what I didn’t know. For our listeners, the point in the journey that we’re talking about came after the human-centered design work. We had done all of our user interviews. We knew our pain points. We had done our initial wireframes and reviewed them. We knew what we were going forward with on the scope.
We were not yet writing code when I showed Adam what we were looking at and started gathering feedback on it. It was before the first lines of code were actually written that we initiated this conversation. It was in an era where GDPR legislation was just coming into play and now the California Privacy Law is all coming into place. The crux of a lot of what we talked about is how I actually adhere to privacy by design principles. This is another one of those things that came out of my gut and it turned out Adam said, “You’re talking about privacy by design.” Let’s speak a little bit to privacy by design and talk a little bit more about where that came from and what that means.
Privacy by design is a fantastic term to use, and nowadays awareness of it and what it means has been growing. Unfortunately, sometimes people misuse the term, but we’ll talk about the proper uses. Privacy by design was the brainchild of a wonderful lady named Ann Cavoukian. At the time that she wrote this, if my memory serves me, she was the Privacy Commissioner, the Data Protection Commissioner, for Ontario in Canada. In and around 2009, give or take, she came up with this fantastic, simple, elegant solution to a problem that a lot of people were facing: how can we develop a product that maintains a high standard of quality throughout each stage, but at the end provides the trust that the end user needs? A product that is designed for user trust and confidence. That is when Cavoukian wrote a piece and came up with these Seven Privacy by Design Principles. I’m not going to give them away right now because I know that we’re going to be talking about them as we go throughout here. It’s taking hold in the industry. It does take people like me, like you, to continue evangelizing the term, as it were, and getting people to buy into it. I fully bought into it.
Let’s talk about the first one. The first one is proactive, not reactive; preventative, not remedial. Explain that a little bit.
The way that Cavoukian talks about it, she indicates that the whole notion of privacy by design is to avoid that normal circumstance where we develop a piece of software, a product of any sort, and only at the end do we think, “We should probably bring in the privacy and security guys.” What happens inevitably when that situation comes up is that the privacy and security guys look at the end product and say, “You’re going to do that? That doesn’t feel very good.” The end result is that sometimes these people derail projects and make people very upset.
They’re perceived to derail the project. That’s why I’m always a little careful about that now, because of my background. It’s a perception that they’re derailing the project. If you had them involved earlier, you never would have been on that path.
It creates an environment that’s very difficult for people in privacy and security because frankly we don’t want to be seen as a hindrance. We want to be seen as enablers to business, to revenue, to profit, to cost savings, and not something that gets in the way of the design process.
One of the ways that we thought about when you talk about this whole proactive, not reactive type design, I’ve been fortunate enough that several of the developers that I work with as well are security-minded. I seem to have a lot of security-minded people in my life, which is wonderful. We talk about how people could take advantage of things, how will we make sure that someone doesn’t use it in a nefarious way? We had to acknowledge that people might try to abuse something. How do we be proactive about that and prevent some of those types of problems from happening?
We did those reviews when we were ready to code and after we finished features; we often would step back and think about that. As we were finishing up some of our sprints and would consider a bundle of functionality done, we’d step back, say, “Was there anything else here we need to think about?” and give ourselves a little reflection time. That’s usually where those kinds of things would come up, and if it was something for the future, we’d add it to the backlog and plan it for future sprints. That’s how we’ve actually been doing it for Bella. Let’s talk about the second one, privacy as the default.
This is one of my favorites. I’ll give the most salient example I can in terms of what we have in technology. I have my Apple phone in my hand. I had been resisting purchasing an Apple phone for years and years. I don’t exactly know why, but ultimately a few years ago, I purchased this phone. When I opened it up and started fiddling around with the configurations, I discovered to my pleasure that many of the controls that would otherwise make the phone chattier, the data-leaking switches as it were, were turned off by default. I had to turn things on in order to make the phone chattier and to disclose more data. Apple has been so proud of this design technique that they have a commercial about it.
You have probably seen the commercial, but the ultimate message is, “Buy an Apple phone because our phone respects your privacy, and that competitor X over there, they’re going to do all sorts of terrible stuff with your data. They don’t care about your privacy.” That’s the message that they’re sending. I think it’s brilliant. This whole notion of privacy as a default setting is both a business decision, or a set of business decisions, and a design decision. It is framed around an organization’s values: what they want to get out of this product they’re building at the end of the day, what image they want to portray, and so on. With Bella, you immediately took a position where privacy was the default setting and that was fantastic. Throughout every stage of the process we continued to think about, “Are we following the principle that privacy is being injected into this process without the user having to intervene?”
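Privacy as a default setting can be illustrated in code: every data-sharing switch starts off, and the user has to flip it on. Here is a minimal Python sketch; the setting names are hypothetical, not taken from Bella or Apple:

```python
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    """Hypothetical user settings: every data-sharing switch starts off."""
    share_profile_publicly: bool = False   # nothing is public unless the user opts in
    send_usage_analytics: bool = False     # no telemetry by default
    allow_marketing_email: bool = False    # no marketing by default

    def enabled_sharing(self) -> list[str]:
        """List the switches the user has explicitly turned on."""
        return [name for name, on in vars(self).items() if on]

settings = PrivacySettings()           # a brand-new account shares nothing
print(settings.enabled_sharing())      # → []
settings.send_usage_analytics = True   # the user must opt in explicitly
print(settings.enabled_sharing())      # → ['send_usage_analytics']
```

The design point is that the constructor, not the user, carries the privacy burden: forgetting to configure anything leaves the account in its most private state.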
The way I did that, for anybody listening, is when we looked at the product design and as we would refine designs, one of the final steps I would take in finalizing the designs was going back to the seven principles and asking myself, “Is it proactive, not reactive? Is privacy the default?” I would walk the principles as a spot check, so each design was actually reviewed, which is probably unorthodox and maybe unusual. I cared enough about making sure we had privacy as the default that in our user settings you have to opt in to share more data with anybody else in the product. That was hugely important to me. If we move on to privacy embedded into the design as well, speak a little bit about that.
This flows naturally from privacy as a default setting. Actually, number three should have come in front of number two. The whole notion of privacy embedded into the entire design is that the entire framing of your product or project, the foundation of your project, has privacy-minded design built in. What that means in practical terms, I’ll give one very good example: the question about how long we keep data after we’ve collected it. Do we keep it for six months? Do we keep it indefinitely? You get the point. I say that jokingly, but the position that companies take is often, we’ll just keep the data, and we’ll keep it indefinitely.
Why do they do that? Storage is cheap. The whole notion of big data is promising and enticing for them. They think maybe someday they’ll be able to monetize that stuff; even if they can’t use it today, they might be able to use it tomorrow. When you think about privacy by design, that’s when you’re going to start saying, we ought to have a retention policy and that retention policy ought to be rock solid. We need to communicate that retention policy: we’re going to keep stuff only for three months, and after that, it’s gone, and you stick to it. That’s an example of embedding privacy into the design of the product.
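A rock-solid retention policy like this ultimately has to be enforced by code, typically a scheduled purge job. Here is a minimal sketch using SQLite and a 90-day window to match the three-month example; the `user_events` table name and schema are hypothetical, not Bella's actual storage:

```python
import sqlite3
from datetime import datetime, timedelta, timezone

RETENTION_DAYS = 90  # roughly the three-month policy from the example

def purge_expired_records(conn: sqlite3.Connection) -> int:
    """Delete rows older than the retention window; return how many were removed."""
    cutoff = (datetime.now(timezone.utc) - timedelta(days=RETENTION_DAYS)).isoformat()
    cur = conn.execute("DELETE FROM user_events WHERE created_at < ?", (cutoff,))
    conn.commit()
    return cur.rowcount

# Demo with an in-memory database: one record past the window, one fresh.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE user_events (id INTEGER PRIMARY KEY, created_at TEXT)")
old = (datetime.now(timezone.utc) - timedelta(days=120)).isoformat()
fresh = datetime.now(timezone.utc).isoformat()
conn.executemany("INSERT INTO user_events (created_at) VALUES (?)", [(old,), (fresh,)])
print(purge_expired_records(conn))  # → 1
```

In practice a job like this runs on a schedule (cron, a worker queue) so the policy holds without anyone having to remember to act.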
Now, the fourth one: full functionality, positive-sum, not zero-sum.
Let’s think of this as tradeoffs. When we think about privacy in the online world, oftentimes we as users are wrestling with tradeoffs for each decision that we make when we’re online. The tradeoff, where privacy is concerned, is you can either have your privacy or you can use our thing. It’s a binary decision. There’s no gray within that. There is no, “Maybe you can share my data here, but I’d like to keep it private there.” It’s a very strict binary thing, and users have been conditioned into this tradeoff thinking. Unfortunately, what that means for the vast user population out there is that they don’t think about it; they don’t think of the long-term ramifications of their decision.
That is the notion of the zero-sum, where it’s either all or none. The other element of that is when organizations start thinking about how they’re going to make a profit off of a given tool or project. They ask themselves, “If I build too much privacy into this thing, I’m not going to be able to make the profit that I want because I won’t be able to do this.” Privacy is seen as restrictive or as a roadblock, and that’s not a good place to be, obviously, because at least in my case, I don’t like to help folks perceive privacy as a roadblock. That’s the notion of the whole zero-sum game, both from the user side and from the developer’s side.
The next one is end-to-end security, lifecycle protection. Talk a little bit about that one and I’ll add more color to how we’ve been thinking about that as well.
End-to-end security. Security, as we know, is the fundamental enabler of privacy. The saying goes that you can have security without privacy, but you can’t have privacy without security. There’s a lot of truth to that. We need security controls to keep our stuff secret, to keep our stuff confidential. How do we do that? Instead of bringing that security professional into a project near the end, you bring that person in at the beginning of the ideation process. That individual, if they are worth their salt, is going to think about all the ways, all the possible pernicious uses of the application: how can a bad guy do this?
What can go wrong here? Security folks are eternal pessimists. They like to prognosticate chaos and destruction. That’s how they do their business. That’s how I do my business. What’s the worst thing that can happen? Then you design from there. An example of this would be the decision-making process that we have to go through for saving a user’s ID and password. The first question we might want to ask is, when somebody inputs their password and ID, are we going to send it across the internet in clear text or are we going to encrypt it? That’s a security decision. We’ve built that into the design and it’s not an afterthought. The next question might be, “We’ve got somebody’s credentials, their password and ID, in our databases.”
What if somebody gets into those databases? Maybe we should protect those passwords and IDs through encryption. Maybe we should even go so far as throwing in a salt, making a salted hash, for those security folks out there. The point is that as we are building this project or going through this project, we are thinking about these things throughout every stage. It’s not a process that is “getting in the way” but rather a process that’s working in collaboration with us and helping to enable, at the end of the day, a better, higher quality product.
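The salted hash Adam mentions can be sketched with Python's standard library alone. This is illustrative only; a production system would typically reach for a dedicated library such as bcrypt or argon2, and tune the iteration count to current guidance:

```python
import hashlib
import hmac
import os

ITERATIONS = 100_000  # illustrative; pick the count per current OWASP guidance

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Derive a salted hash; store (salt, digest), never the raw password."""
    salt = os.urandom(16)  # a unique random salt per user defeats rainbow tables
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Re-derive the hash with the stored salt and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, digest))  # → True
print(verify_password("wrong guess", salt, digest))                   # → False
```

Even if the database leaks, an attacker sees only per-user salts and slow-to-reverse digests rather than anyone's actual password.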
How do we think about end-to-end security? As Adam said, there’s this whole concept of thinking it through. A lot of sites have username and password or Google single sign-on, and we’ve enabled both in ours. It was about ease of use in terms of the platform. We had to think about what happens if your Google token expires or something doesn’t work. Probably the best way to describe this is that we played out the non-happy path. It’s easy to design as if people do it exactly the way we expected at all times, but we started to play out where it goes wrong and what happens under stressful conditions, where people are clicking things, or a whole variety of conditions.
When you start to ask yourself that, all these little things start falling out, like, “Maybe we need an error message there, maybe we need a little more protection here.” That’s how we looked at the non-happy path. We also had intentional steps in our process that were security reviews. We went back and looked at our APIs and said, “Are they secure? What might be insecure? What if somebody hacked that? What could they get?” We put on our little black hats and said, “What would you be able to do with that?” Is that a fun discussion? No, but I always tell people, “I don’t want to be a certain CEO in front of Congress testifying about why I never thought about people’s security.” Maybe I will be so lucky that my company gets that big, but even if it doesn’t, I don’t ever want to have to look a user in the face and say, “I didn’t think about how sensitive your data is.” This is your to-do list, this is your calendar, and these are your meeting notes; I’m going to protect that. I think about, “Would I want people to be able to see this?” No? Then I better protect it.
I found that, and this is especially true for entrepreneurs. Entrepreneurs are enthusiastic. They’re endless optimists. They are gung-ho to a fault. They are like, “Damn the torpedoes, we’re getting this product out the door to people.” A seasoned security and privacy professional will be able to work within those dynamics, to flush out things that this endlessly optimistic entrepreneur might just gloss over because they have so many other things to think about. They have a big picture to think about. It’s up to the privacy and security professional to help flesh out those little things that can give you gotchas later.
I thought about it as how do I find out what I don’t know? How do I know what’s coming? We talked about the California Privacy Law is coming, which by the way, if you all don’t know about that, do research because that’s going to affect all of us.
If you do business in California.
Even thinking about that, I needed to be more educated, and Adam was that guide to help me get more educated and roll with it. He will help you understand, “That may be a flexible decision, but here are a few of the things that happen if you don’t do it. It might be fine.” It’s about scenario analysis. That’s one of the things I found the most interesting as we talked it through, because Adam helped with some of the initial user agreements and some of the initial privacy policies before we had the attorneys do the final reviews. You helped me understand things I wanted in there, because one to two years down the road, this may come up. For me it was about filling knowledge gaps. The most important thing is to know what you don’t know as an entrepreneur and then go get help to fill those holes, so that later you don’t have to remediate and deal with it. Speaking of wanting people to trust you, the sixth principle is visibility and transparency. Talk about that one.
Do you mind if I give a metaphor?
I’m thinking about my doctor. I happen to have a pretty good doctor, and I go into the doctor’s office, and the doctor does doctor things. He checks my blood pressure and all that other stuff. As the doctor is working with me, working within and around my body, which is my own private property, he tells me what he is going to do before he does it. Here’s an example: “I’m going to touch your arm now. I’m going to turn your arm. This might hurt a little bit. Are you okay with that?” That transparency is brilliant because I then have the opportunity, even if it’s a split-second opportunity, to give assent or not.
That’s control. I maintain a level of control over my body despite the overwhelming imbalance in the power dynamic between the doctor and the patient. I still maintain a sense of control, and the doctor is letting me know what he plans to do before he does it. That same metaphor can apply to software: let folks know what you plan to do with their stuff once you collect it, as you’re using it, and as you have a desire to share it, whether for profit or just because you have third parties that need to help. You want to let the user know ahead of time.
You want to try to provide that information to the user, not only in a timely fashion but in a way that the user, the average user shall we say, can understand. It doesn’t necessarily have to be in legal jargon. This is for all sorts of very important reasons, but that is the idea of this principle: to avoid the black box. Keep the box as clear as you can. If it has to be a little opaque, okay, fine. You may have some secrets in terms of your secret sauce that you don’t want folks to know about for competitive reasons. That’s fair, but for everything else, why not let the user know what’s going to happen before it happens?
One of the things that I thought a lot about in terms of process, especially around visibility, transparency and even who’s involved in making decisions: I’m a huge human-centered design advocate. There are not a lot of tech companies that have fully embraced that yet. It requires new processes and new ways of thinking, and it takes a little longer. It’s not about just shoving it out on the market and seeing what happens. It’s a very different, very intentional process. One of the things around human-centered design comes back to this idea of visibility and transparency, accountability, and who’s involved. I have a lot of concerns about algorithmic bias and who’s at the table making the design decisions for the products. All of us have seen examples; women may relate to this better. I’ve seen friends tweet about this as well.
One of my friends tweeted about some weight-loss or weight-tracking app: clearly, it doesn’t understand breastfeeding mothers, given what it was recommending. How do you incorporate a whole variety of people into these products? One of the things I’m most proud of with Bella is that 50% of the people who did the initial reviews for me were women, and about 45% of my testers from my closed Alpha were women, to help me see issues differently and to make sure a whole variety of perspectives were brought to the table. To help me think about, what kinds of things do I need to share? How do things need to work? I would put in a huge plug for anybody that’s releasing software to do some type of closed Alpha process before you go out to the market, because it gives you the opportunity to find out whether what you think you designed is actually there.
For example, I had learned that there wasn’t enough visibility and transparency in my Instant Meeting feature. Instant Meetings are these collaborative meeting notes in real time, and you can invite someone, but I forgot to alert you that the email actually got sent. How would you know that person got the invite? Little things like that popped up about how I could be more transparent in the design. That’s where actually having that testing process, having people understand that methodology, and bringing all these different perspectives to the table gives you a more well-rounded, more mature product earlier in your development cycle.
The harsh reality is the whole notion of avoiding the built-in discriminatory decision-making that algorithms seem to create in some circumstances. The reality is that the vast majority of folks out there developing these algorithms are Caucasian males. Though I think that folks are getting better, there is an unfortunate amount of bias built into these algorithms. Because these algorithms increasingly control virtually every aspect of our lives, who creates these things becomes that much more important, to ensure no gender bias, cultural bias, or race bias. If you don’t mind, I’m going to advocate for more women in coding.
That would be fabulous. I think we’d all like to advocate for that.
For you women out there that are reading this, please start your journey now.
It is fascinating though because my perspective has been different, having been in tech for so long and having participated in so many large projects. I was sure I didn’t want to repeat that, because I knew that I came with a perspective that wouldn’t represent everyone else’s. If it was only me that influenced the decisions, I would introduce a different kind of bias. You need multiple different people to strengthen the product and help you figure out what you don’t know or can’t see, because you can’t have every set of experiences. For me, it was powerful to be able to do that and to have a better perspective on different viewpoints from different people. Let’s talk about the last one, respect for user privacy.
Who needs to respect it? I’ll tell you, the whole notion of user-centricity in design is something that any number of very smart people are talking about and advocating for, and sometimes it seems like not one person is listening to them. Go and visit certain pieces of software, go online or whatever, and you’ll see. The whole notion of user-centered design is so powerful. Ultimately at the end of the day, whether it’s true or not, the experience that you want to leave the user with, the perception that you want the user to have, is a sense of trust and confidence in your software. The only way that individual is going to have that trust and confidence is if you show visibly, through visual cues, sound cues, whatever it takes, that you respect their individuality and their dignity, and that you are an authentic and accountable organization if you get something wrong.
There are a lot of folks, with all these data breaches that have happened, it’s crazy. You listen to these data breaches and the companies’ first impulse is to not admit it. Their second impulse, once they’ve been uncovered, is to try to deny or remove themselves from any culpability. It was the plumber that did it or whatever. I’ll tell you, when my kids come to me after they make a mistake, “Dad, I hit the car into the curb, but it was a bird that made me do it. A little bird got in my way.” My trust and my confidence in what my kid is telling me drops a little bit. I’m like, “Is that really what happened? Tell me the real story.” That’s what this whole notion of user-centricity is about: the respect, and being accountable when you screw up, because everybody’s going to do it.
We’re all going to screw up. Sometimes we just have to say we’ll do better or learn something from it and be able to then follow up on that because we’re all learning. None of us are going to be perfect and you’re right. That whole concept of, if we make up some excuse for it and we don’t take any responsibility for having a hand in it, there are other outside factors. The blame game, we all know that one. We all respond accordingly. We have a physical feeling when we hear it because we know.
I would say to the naysayers who are reading about this notion of trying to remove oneself from culpability in these circumstances: not every company does that. In fact, for most companies, the position they take is a very sterile position based on existing law, and the language that’s used is to satisfy existing law and no more. That’s it. They aren’t going to say a thing more about it. The problem with that is that it’s not good for customer service.
You’re not the one that has to take the phone call.
Customers are just going to get increasingly agitated. They are going to lose their trust in you. The more you try to sterilize the story, whitewash the story, or pin the story within a very narrow box without divulging how the organization is going to become better as a result of this screw-up, that’s what we’re missing in a lot of these data breaches. Even among those organizations that do say, “We’re going to be better,” I have yet to see an organization actually come out publicly a couple of weeks, a couple of months, or a couple of years later and say, “We had this thing happen. Here’s what we’ve done to fix it.” What a concept that would be: to not only admit your guilt, but to show that you’ve learned from it and that you actually have made your thing, whatever it is, better as a result.
This has been fantastic, walking through these seven principles with you. How can people learn more about you and the company you work for?
First off, feel free to look me up on LinkedIn; my name is Adam Stone. I think you can find me if you search for Adam Stone and Privacy. On Google or any other search engine, you would probably find me under Adam Stone, Minneapolis, Privacy or Security. Privacy actually is more of my sweet spot. LinkedIn is one way. The web address for our organization, Secure Digital Solutions, is TrustSDS.com. You can look us up that way. If you get a hold of me or reach out to me, I’ll tell you something: I will call you back and actually talk to you, have a conversation with you. I would love to touch base. Anybody who likes to talk about privacy, anybody who is interested, as you were, in solving a business problem in part by thinking about privacy and security beforehand, I want to hear from you.
- Secure Digital Solutions
- Bella Scena
- Adam Stone on LinkedIn
- @BeingWonderly on Twitter
- Wonderly Software Solutions on LinkedIn
About Adam Stone
Skilled business leader with 20+ years overseeing the implementation and development of data privacy and security innovations.
A business development professional with the spirit of an entrepreneur and the boundless enthusiasm of a Russell terrier.