How can you protect your personal data in an increasingly interconnected world? As Web3 matures, security becomes an ever more central concern. Let’s delve into the crucial topic of Web3 protection for personal data. On the show today, Dr. Richard Carback, Co-founder of the xx network, shares his invaluable insights on the importance of privacy and data security in the Web3 era.

Dr. Carback unravels the complexities of data privacy in the digital age and discusses the challenges faced by traditional centralized systems. He highlights the significance of Web3 as a paradigm shift that empowers individuals to regain control over their personal data. Get your head out of the sand and get ready to be ‘gut punched’ with the truth. It's more real and closer than you think! Push Go!

Transcript

xx network - Web3 protection for personal data

Participants:

• JP (CMO of AdLunam)

• Dr. Richard Carback (Co-founder of the xx network)

00:22

JP

I also see that our speaker, Dr. Richard Carback, is in the room. Doc, if you would like to have a quick sound test before we flag off the show, you can just unmute and say hi.

00:33

Richard

Hello, sounds right. Can you hear me okay?

00:38

JP

Yes, loud and clear. Loud and clear indeed. All right, ladies and gentlemen, I can see that the room is fast filling up, so we will start this show in record time. Give us about 30 seconds so that our team is in place, and then we have smooth sailing going forward. Okay, welcome. Welcome to this episode of Diving into Crypto. This is JP from AdLunam Inc. bringing you everything about Web3 on the show. Today we have Dr. Richard Carback. He's already in the room, and we're really excited to hear what he's doing with the xx network and a little about his journey, about how he got there. From what I've seen, he's been an inventor, he's been in tech for the longest time, but I'm going to let him tell us all about that in a minute. Before we begin, once again, ladies and gentlemen, this show is hosted by AdLunam.

01:56

JP

It's the industry's first NFT integrated Engage to Earn seed crowdfunding platform and IDO Launchpad with a Proof of Attention allocation model. That's a mouthful, but in other words, your engagement and your attention are rewarded with allocation. That being said, I'd like to remind the audience in the room that views expressed on this program are those of the speaker and are not to be construed as financial advice. Also, in case we get cut off, please remember to come back to AdLunam Inc. and you will find a new link to jump back into the room and continue with the show. Also, we will open up a question and answer round at the end of the program. If you have any questions in between, please feel free to message them to AdLunam Inc. or to our speaker directly so that we can have them answered for you.

02:47

JP

Finally, if the speaker drops a gem or you find something that gives you an epiphany, please, by all means, show us your love with the emojis that are there. You can choose the colors that you like. I think I'm going to choose today's color. I'm going to go with green. Do we have green as the heart? Yes, we do indeed. Okay. There we go. Okay. That being said, once again, ladies and gentlemen, welcome to the show. And more importantly, let's welcome to the stage our guest, Dr. Richard Carback. Doc, welcome.

03:23

Richard

Thank you, JP. I also appreciate the green heart, green being my favorite color, so I'm really happy to be here and share sort of why privacy matters with your audience. I think I'm probably the first guest that you're having that's going to be able to do that, and I'm really grateful to be here.

03:44

JP

Excellent. Glad you accepted our invitation. And yes, it got us extremely curious about Web3 protection. I know it's a sore point for all of us, but before we dive right into that, I'd like you to share with us a little about who's the man behind the xx network, all the wonderful stuff that you've done. What's your background and what got you here?

04:07

Richard

Yeah, I am a cryptographer and I'm a software engineer. I've done a significant amount of work on voting systems, cryptographic end-to-end voter-verifiable voting systems. If you look me up online, you'll see probably 15 or 20 publications on that. My main work at xx has been building out the private messaging protocols. I've done a fair amount of computer security work and I've done forensics work, so I'm all over the place from a technical perspective, but it's all centered around the privacy and security space. And I want to say my current focus is privacy preserving technology. Also, in my free time, I will contact politicians and I try to use my expertise to advocate behind the scenes for better regulations and laws, in addition to actively working on these technical solutions that we're going to talk about today.

05:09

JP

That's awesome. Okay, great. And at which point did you realize, because I know that you've worked a lot with different types of software, and yes, you are a cryptographer, but at some point it must have struck you to say, hey, Web3 is the place to be for me. What was that moment like? What was that story?

05:33

Richard

I went to school in the early 2000s. [transcript gap]

06:41

Richard

[transcript gap] I got back into it again in [transcript gap]

07:48

Richard

He's the main guy behind it. Many people in the audience are going to know him as either the godfather of anonymity or the godfather of crypto. That's because he ran a company in the 90s called eCash, right, which is, I want to say, the earliest really serious form of crypto effort that was out there. And anyway, he's up there and he's talking about this voting system that he's working on, and I'm sitting there, and I'm seething, and I'm like, rocking back and forth, and I'm agitated, okay? And I'm agitated because while he's up there presenting all this interesting cryptography, he's also pointing out sort of the parts he hasn't worked out yet. And I'm sitting there and I've connected all the dots, and I just want to jump out of my chair and be like, no, you can do it. I know how to build this.

08:34

Richard

So when that talk ended, I very adamantly went up there and I said, I can build this. I know exactly how. And I think within two or three months, we had a working prototype. And so we've been collaborators since then. So it's kind of a long and winding path, right? I was there. I wasn't there. I've kind of always believed in it.

08:59

JP

I'm really glad that you had a chance to attend that seminar, because what he's done, of course, is pioneering. I mean, I'm at a loss for words, because when you look at the genesis, when you look at where it all began, eCash was, you could say, in many ways the starting point where all of this began. Right. So for you to have...

09:28

Richard

It goes back to his PhD thesis when he talked about a chain of blocks. I think he's probably the first person that really had that realization, but he hadn't coined the term, so he really is a key seminal player in all this.

09:44

JP

Right.

09:44

Richard

Sorry to interrupt.

09:47

JP

This is fascinating because this would have been way back in, what, 88, 89?

09:52

Richard

Probably before I was born.

09:55

JP

Really? Wow. Okay. This is super. I mean, you know? Okay, so at this point in time you're focused on preserving privacy. Right. Let's start here, because I know that some of the audience may already be deep into it, but some of us are still struggling with, hey, what does privacy mean? Do I just turn off my Google settings and is that the end of it? Like, how deep does the rabbit hole go?

10:25

Richard

Yeah, the problem is right now there are automated systems and machines. They're tracking everything you do, when you do it, and in many cases what you're doing, and it's insidious and it's everywhere.

10:44

JP

Right.

10:45

Richard

And it doesn't just affect the digital world, it's very much affecting the physical world now.

10:57

JP

Wow.

10:59

Richard

Sorry to be so profound about it, and I'm happy to talk more, but it is very much the case that you get in your car and you drive down the street, and in most places that most people live, probably 15 or 20 devices, a lot of them things like license plate readers, get and store data about where you went. And we're very close to a dystopian situation where you can't go anywhere without everyone knowing everything. Well, not everyone. People in privileged positions knowing everything. And I don't mean just the government. Indeed the government too, but corporations as well.

11:53

JP

[transcript gap] been around since, as you said, [transcript gap]

13:14

Richard

[transcript gap] I was doing prototyping in [transcript gap]

14:16

Richard

[transcript gap] that in, I think, November of [transcript gap]

15:21

Richard

So that's been the journey.

15:24

JP

Wow, okay. In some ways, right, the fact that you have developed this is already a huge step; getting people to adopt it is the next one, and of course, that does come with its own set of challenges. I'm curious to understand, though: when it comes to privacy, what does it take to get somebody to understand that?

16:09

Richard

I think that most people kind of need to be punched in the face with a bad consequence, unfortunately.

16:19

JP

Yeah. Okay.

16:22

Richard

You know what, let me think of an example for you. So, you know, your bank tracks your transactions. And if you look up "Bank of America closed my account", you will see hundreds of stories of regular people suddenly losing access to their bank account, most likely because some AI tool decided that they weren't good for the bank. So the bank just shuts their account and takes their money. For those people, it's very easy to convince them that privacy matters, because their bank abused that privacy and used it against them. There are much less important examples that I think also drive home the point. Years ago, I want to say it's almost ten years now, Target had a story where they were predicting who was pregnant, and there was a father of a young teenage girl, and he starts getting pregnancy mail addressed to his daughter.

17:36

Richard

And that's how he found out that she was pregnant, because Target had been monitoring her transactions and figuring out, oh, she's buying the unscented wipes, she must be pregnant, let me start sending her this. And I think once you get hit with that, it suddenly becomes really important. I also find that it's very easy for people in more marginalized communities to understand this, because they tend to get abused.

18:05

JP

Right?

18:07

Richard

Yeah.

18:11

JP

This is, of course, something that cuts really deep. I mean, you mentioned before that you could have your license plate picked up and then somebody knows exactly where it is that you're going. With your bank account, there's something that says, hey, you may not be the right customer for this bank, and then they freeze your assets. You're walking into the store and your purchases are being recorded, and based on that, some algorithm is saying that, hey, this is your particular condition. And of course this is scary on so many levels. Let's just put that out there right now. My first instinct is to say, how do I protect myself from this? But I think the bigger question to ask is, where are the vulnerabilities, or where are the avenues through which my data is being picked up, so that I can at the very least have this basic protection around me?

19:27

JP

Is there, you know, a checklist, or some basics that I should be doing, that I should look for, to be able to stop this entire cycle from going as far as that?

19:44

Richard

There's no checklist, and it's kind of an impossible question to answer. I will say what I do, right. I try to use cash when I can. When I'm out with my phone, I will turn off all of the wireless signals, and there's sort of limited things you can do there. When I'm online, I will try to use Tor or a VPN if it's something where I'm not comfortable with someone tracking me. And I try to move all of my chats over to xx messenger and Speakeasy these days. Obviously I'm an early adopter, so take that with a grain of salt. And it's a matter of understanding what the threat is and trying to work within it, right? So I hate to use drug dealers as an example, but you see all those TV shows where they have

20:49

Richard

throwaway phones, right, and they're kind of accepting the fact that the phone is getting tracked, but they're using it in a way that really makes it hard to track them. It's a sad state of affairs when you have to think like a criminal to protect yourself, but that's the reality we live in, and that's the kind of mindset that you need to have going into these things. And especially if there's a protest going on, I wouldn't even bring my phone, and if I do, I will turn it off, even if I'm not participating in the protest. Right now they have these listening devices and they're just assuming everyone in the area is part of the protest, even if you're just going to work. So it's in your best interest to assume that you're going to get tracked and that that's going to get used against you.

21:42

Richard

It's a tough situation to be in and there's no unique answer. I will say another really important tool in the toolbox is time, right? So they may be able to track a device, and then they'll be able to figure out who bought it, because there's cameras and things like that. But that data doesn't typically get stored forever. If you buy something with cash and then wait a year before you start using it publicly, typically that association can't easily be made. If you buy it with your MasterCard, I guarantee you MasterCard has already sold that data to the US government and other governments. And they can just type in the IMEI number, the serial number, and boom, they know who owns it. Right. So then it's associated with you.

22:40

JP

Fair enough. Okay, Doc, I have to ask...

22:47

Richard

Sorry to belabor the point, but part of what we're doing is we're trying to carve out private spaces where you can have privacy. And it's not that you should have privacy in all aspects of life. It's that privacy needs to exist. It's essential to a free and functioning society. That is very much sort of a core belief that I espouse.

23:13

JP

Indeed, indeed. It's a basic right that you have freedom. I don't know if you want to amend the constitution, but you're going to need that freedom from having your every movement, every single movement of yours, being chalked out, right? And that definitely isn't something that should happen. But do you also see this more significantly in the US, or do you see it all across the world? And this may sound naive, but are there places in which this system isn't as developed as it is where you are?

24:10

Richard

With free speech protections and things like that, there are some legal backstops in the US that I don't think exist elsewhere. I do think it's a worldwide problem. We're sleepwalking into a world where, for example, and I think this is an important function, anonymous whistleblowers just aren't going to exist. It's not going to be possible to be an anonymous whistleblower. So what that means is the powerful in every society get absolute control. And that gets real scary when you think about who's powerful where: a powerful drug cartel, a powerful corporation, what can they do with that level of power? And I'll get laughed at sometimes when I say this, especially to policy people, but this lack of privacy is a national security issue. Just think it through. A plurality of folks are good people. I believe that at least, and you can at least say that the vast majority aren't evil, right?

25:14

Richard

But if the bad guys can concentrate power, and anyone who threatens their operation they can immediately identify, pick out, and take care of, and by take care of, I mean get rid of or otherwise delegitimize, then there is no good guy to stop a bad guy. Right. If you're in cartel land, they have absolute control. Maybe you want to go tell the government a state over what's going on, or the US government, but bad guys don't play by the same rules as good guys. So it's really important that we have these private spaces so that can continue to exist. I also think that corporate and government abuse of privacy is going to keep getting worse. I've already kind of alluded to this, but privacy in the physical world is probably done in the next five or ten years.

26:11

Richard

If you own a cell phone and you're using it regularly, that's already true for you. Right?

26:16

JP

Yeah.

26:17

Richard

You're just not going to go anywhere without being tracked. So if we can preserve private spaces, at least digitally, then we can preserve that free thought and free society. We can provide that relief valve so that a whistleblower can do what they need to do, and you can operate without potentially being judged or criminalized for something you did years and years ago. That's the hope, yeah.

26:48

JP

These are interesting times we live in. And I, of course, say those words, 'interesting'...

26:54

Richard

The Chinese proverb is true, we live in interesting times. Or rather, it's a Chinese curse.

26:58

JP

Right, yeah, that's exactly it. Yeah. I'm going to deviate a little, of course, and go on to the fact that what you're seeing around the world in many ways is a magnification of certain social issues that may not have been in the spotlight, with the intent of coloring perception. Right. And as you're saying, you have a micro segment of societies the world over, a population that is the outlier, that is in a category that has malicious intent. Right. And then when that is amplified, the rest of the population, let's say that segment is at 5%, but 95% of the population is seeing that just that one segment is causing an issue, and that amplification begins to color vision. And what often happens is that the people who don't stop to think about what that actually means, and how far that impact actually goes, begin to start seeing things in that way.

28:23

JP

Now, this is, of course, from a media point of view, but at the same time, if your privacy is violated, if your privacy is taken over, these same techniques can in many ways be amplified against individuals. Right. On an individual level. And then statistically, that collective starts becoming larger just by virtue of amplification. I think I've begun to ramble, I'm sorry.

28:59

Richard

I think what you're talking about is how does privacy look at scale? Right.

29:06

JP

Is that a fair characterization? From a media point of view, but also, yes, it starts with privacy.

29:13

Richard

Yes, it absolutely does. So at scale, it's kind of a dynamic between at scale versus individually. And the example that immediately comes to mind is healthcare in the US. Most people in the world don't have to worry about this, right? But in the US it's privately run health insurance. And a big problem here is they are able to use this data that's collected at scale to make judgments about the population and to assess risk based on your current health profile. So they compare you against their averages and how much it's cost historically, and then they charge you more or less depending on that. And that's, in theory, been made illegal. But I guarantee you they're working on every possible way they can to use that data against you. So that's the kind of way that this problem exposes itself at scale. And I'm sure there are more that I can think of, but that's the one that comes to mind right now.

30:17

JP

[transcript gap] according to an article I believe in [transcript gap]

30:49

Richard

You've done your homework on me. Of course I'm a cyber spy. All of this is obviously a front. Nothing I said actually matters. Right. But in all seriousness, well, to be fair to those people, it's not like I don't have the skill set. Right?

31:07

JP

Right.

31:07

Richard

I did say I have a background in computer security and forensics. So it's an honest question. But to explain, it's not really an article. It's something that was released on a website called Cryptome. At the time I was working at Draper Laboratories, running their Embedded System Security lab. And surprise to no one, the Embedded System Security lab evaluates and comes up with security solutions to secure embedded systems. Right. So I was invited to give this talk at Fordham University, at this conference for cybersecurity that was run by the FBI. And the talk was about approaches to how you protect legacy control systems and public infrastructure, like power plants and water treatment facilities, from remote hacking attacks. And in general, the high level takeaway there is: don't connect it to the Internet. And if you must, do some very robust analog backstop control on top of whatever it is you're doing.

32:13

Richard

So it's impossible for the digital system to do something that could cause a catastrophe. That's the takeaway of my talk. At that same conference, and I did not know this in advance, they had the former CIA and NSA director, I think it was Michael Hayden, and he has that famous quote, I believe he said it there: we use metadata to kill people. That's all we go on. We don't even need to know the contents of the information. It's an infamous quote from him. So I really mean it when I say metadata protection is important, very much so, especially if you're in an area that's targeted by the US. It could be fatal if you aren't watching what data is getting tracked and who you're associating with. So I hope that answers the question.

33:09

JP

It does. I have so many more questions now that that's out there, but to stick to the point, right? To stick to the point. You've shared with us, of course, some of the prime ways that data gets tracked, some of the key things we can do to protect ourselves. And it keeps coming back to me, as you're saying this, that it does make a lot of sense. But I'm sure that there are a lot of people out there that say, hey, you know what? I would rather be an ostrich, put my head in the sand, and just continue with my life the way it's going. Right? I'm absolutely fine with that. I've got no problems with that. What you're telling me could also not be true, right? Because maybe they don't want to believe it, or maybe they're not seeing it. And apart from punching them in the face, well, now their heads are in the sand.

34:05

JP

So it's going to be difficult. But how do we get across to family? How do we get across to our colleagues, our friends, to get them to understand the importance of securing their privacy?

34:19

Richard

[transcript gap] individual, right? And you were [transcript gap]

35:49

Richard

I think it was the Expendables. Again, another real, true story. So I think the other aspect of this is to actually have the tools built and functioning and working and for them to provide more value than what else is out there. And I'm doing my best to do that. That's sort of my main focus in life right now.

36:14

JP

Understood. Well, more power to you, Doc, because this is something that a lot more people need to empower themselves about, in so many ways. And I'm hoping that this podcast, for the audience that it reaches, at least makes them think, if not act, at least makes them think about how they should be protecting their privacy.

36:37

Richard

I would add to that: you're not defenseless. Right. Even in Bitcoin, where everything can be tracked, right, you have a defense: send it to an exchange, pop it out to a new wallet. Right. There's some level of protection there.

36:54

JP

Right.

36:54

Richard

The exchange can de-anonymize you, but the rest of the Internet can't necessarily, especially if it's a different amount, right?

37:02

JP

Right.

37:03

Richard

You can't make many assumptions there. So there are defenses. It's just a matter of learning. I don't want to advertise other people's products, but there are wallets, like Wasabi Wallet, that kind of do this for you automatically. So by all means, do some research, download and try these new tools. You've got nothing to lose.
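
To make the linkage point concrete, here is a toy Python sketch of the kind of amount-and-timing heuristic a naive chain-analysis tool might use. The field names and thresholds are invented for illustration; real analytics are far more sophisticated, but waiting and changing the amount, as Richard suggests, defeats exactly this sort of matching.

```python
# Toy chain-analysis heuristic, invented for illustration only: link an
# exchange withdrawal to a prior deposit when amount and timing match.
def linkable(deposit, withdrawal, window_hours=24.0, tolerance=0.001):
    same_amount = abs(deposit["btc"] - withdrawal["btc"]) <= tolerance
    close_in_time = abs(deposit["t"] - withdrawal["t"]) <= window_hours * 3600
    return same_amount and close_in_time

# Same amount an hour later: trivially linkable.
print(linkable({"btc": 0.5, "t": 0}, {"btc": 0.5, "t": 3600}))      # True
# Different amount a week later: this naive heuristic fails.
print(linkable({"btc": 0.5, "t": 0}, {"btc": 0.31, "t": 604800}))   # False
```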

37:28

JP

Definitely. Okay, let's come back, because I know that we have a lot more ground to cover on the show. All right, I want to circle back to one thing. We've hit on security. We've hit on how to protect yourself as an individual, right. When it comes to building the xx network, can you tell us a little about how it operates? And then I want to dive a little into some of the finer points about its operation.

38:01

Richard

Yeah. So the reason we started building the xx network, I want to say it was the precomputation, which was a technical discovery that David made. It's covered in our white paper. Essentially, what it does is it can do a lot of work on the computer ahead of time, and then when the data comes in for processing, you can process it extremely quickly. And that enables this technique of high-trust batch mixing at very fast speeds. And when that made this technology viable, I was on board to build it. I think it took me four or six months to build it. And I had some help; Ben Wenger is the other major technical individual on the project. So we built that prototype. Once we proved it worked, we started working on a production version. The first version required 64 CPU cores.

39:06

Richard

And we're thinking, well, that's very pricey for nodes in a decentralized system. And then we realized, wait a minute, consumer GPUs are great at the problem these computers are slow at, which is modular multiplication. And Nvidia, it just happened that year, had released a library. So now I think our nodes cost about two grand to build, which is like nothing. It's just a standard server on the Internet, but it needs a GPU to work. And we're not mining, we're not doing hashing or anything like that. It's doing this precomputation to execute on the network. Once we had that mixnet, we needed a decentralization model that could work: a blockchain with consensus that keeps up with the speed of the batch mixing was what we needed. So we ended up building a layer one that could be tuned to keep up with the mixnet.
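
To make the precomputation idea concrete, here is a minimal Python sketch. This is not the actual cMix protocol from the xx network white paper; the group, parameters, and single-node structure are invented for illustration. It only shows the general pattern Richard describes: do the expensive modular exponentiation offline, so the real-time path is a single modular multiplication per message, which is exactly the operation GPUs are good at.

```python
import secrets

# Toy parameters: 2**127 - 1 is a Mersenne prime. Production mixnets use
# far larger, carefully chosen groups.
P = (1 << 127) - 1
G = 5

class ToyMixNode:
    def precompute(self):
        # OFFLINE phase: slow modular exponentiation, done before traffic arrives.
        r = secrets.randbelow(P - 2) + 1
        self.blind = pow(G, r, P)               # expensive
        self.unblind = pow(self.blind, -1, P)   # modular inverse, also precomputed

    def mix(self, payload: int) -> int:
        # REAL-TIME phase: one cheap modular multiplication per message.
        return (payload * self.blind) % P

    def unmix(self, blinded: int) -> int:
        return (blinded * self.unblind) % P

node = ToyMixNode()
node.precompute()
assert node.unmix(node.mix(123456789)) == 123456789
```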

40:06

Richard

And because we're cryptographers, that whole time we're paranoid about the threat of quantum computers. I think we still are. So we're thinking, how do we make the messaging robust against that threat? Because you can store the messages now and decrypt them later. Right? So we came up with a protection for that. While we were doing our part, another team, Mariotic and David, came up with doing it for the transactions and the consensus. So we added that into the mix. So the xx network is really a culmination of all this engineering and these sort of unforeseen technical advancements, but the seed of it all was the precomputation.

40:49

JP

Okay. And it's interesting that you say that, because you mentioned not just the way the system is built and the logic behind it, or rather the intent behind making it as secure and as cost effective as possible. I know that I read somewhere that the xx network is quantum resistant and has a privacy focused blockchain ecosystem which

41:19

JP

is comparably better than traditional blockchain platforms. Right. Especially in terms of security and privacy. For the layman, and I'm a layman as well, I don't understand this, and I know that a lot of people don't understand the tech side of it. So for us to be able to get an idea about how much further, or how much of a quantum leap, pun intended, the xx network has taken things, can you draw us a picture, paint us a scenario, of how that works better, smoother and more securely?

41:57

Richard

Absolutely. So I think the way to think of it is that there are two aspects to it. I'll talk about the messaging first in a little more detail, and I'll try not to put everyone to sleep. So on most messaging systems, the encryption, if there is any, is between you and the server. Right? So if you're using Facebook Messenger, the Facebook Messenger servers can read it. And the story is the same with Telegram. It has a more secure option, but it's rare for people to use it. So by default, Telegram is the same way: it stores and reads all of your messages. On what are considered really secure systems, the messages are end-to-end encrypted. And that means that the intermediate servers that are handling the messages and sending them back and forth can't read your messages; only the recipients can. And a good example of that technology is Signal.

42:54

JP

Right.

42:55

Richard

We're taking that a step further, because what's different in our system is that, in addition to that, when you send a message and where it gets sent, in other words that metadata we've been talking about, is hidden. That's huge. That means you can go into your McDonald's and, hopefully in the future, pay for that sandwich in crypto. And nobody knows if it was you doing it on your cell phone, or you messaging your uncle to pay for it, because that metadata on that payment doesn't exist. Obviously, McDonald's is going to have the picture of you in the store, but the actual device metadata doesn't exist, which offers you a lot of protection. So bad guys that are passively listening on the open Wi-Fi in the McDonald's no longer get the target that they would otherwise get, saying, oh, this device has crypto.

43:50

Richard

Maybe we can look at that transaction and figure out roughly how much. That kind of analysis is completely blocked off from the bad guys. And it also means that you can do things like use the Wi-Fi at work and talk to your therapist, and your boss doesn't know that you have a therapist, again because they don't have that metadata to see it. So you've created a private space where none existed.
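
A hypothetical illustration of the difference: the snippet below contrasts what a conventional messaging server can log about an end-to-end encrypted message with what a metadata-hiding mixnet aims to leave behind. The field names are invented for this sketch, not drawn from any real product.

```python
# What a conventional server can log even when the body is E2E encrypted:
visible_to_server = {
    "sender_ip": "203.0.113.7",        # who sent it
    "recipient_id": "bob",             # to whom
    "timestamp": "2023-07-14T10:32Z",  # when
    "size_bytes": 1432,                # hints at what kind of content
    "body": "<ciphertext, unreadable>",
}

# What an observer of a metadata-hiding mixnet is left with:
visible_to_mixnet_observer = {
    "batch_id": 91824,     # only which mix round fired
    "size_bytes": 1024,    # uniform padding hides even the true size
    "body": "<ciphertext>",
}
```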

44:15

JP

Right.

44:16

Richard

The second aspect, let me get to this, that was just the first aspect. The second aspect of all of this is that quantum protection. So I already mentioned we provide the options both on the blockchain side, sorry, on the messaging side, and on the consensus and transaction side. And I'm happy to get into more detail there. But the bottom line, the takeaway, is that when quantum computers become a threat, we will have the solutions in place. So it's not a problem.
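
For the "store now, decrypt later" threat Richard mentioned, a common engineering pattern is to derive session keys from both a classical and a post-quantum key exchange. The hedged sketch below shows only that general hybrid pattern; the xx network's actual constructions are specified in its own papers and may differ.

```python
import hashlib
import secrets

def hybrid_session_key(classical_shared: bytes, postquantum_shared: bytes) -> bytes:
    # An eavesdropper recording traffic today must eventually break BOTH
    # halves: a future quantum computer that breaks the classical exchange
    # still learns nothing without the post-quantum secret.
    return hashlib.sha256(b"hybrid-kdf-v1" + classical_shared + postquantum_shared).digest()

# Stand-ins for real key-exchange outputs (e.g., an ECDH secret plus a lattice KEM secret):
key = hybrid_session_key(secrets.token_bytes(32), secrets.token_bytes(32))
```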

44:51

JP

Right.

44:54

Richard

As far as I know, nobody else has done that yet. We're working with people in other projects like IOHK and Algorand. But as far as I know, nobody else has, and certainly, when it becomes a threat, I'm hopeful that most systems will have some transition plan. But it's nice to know that we're kind of ahead of the game there.

45:17

JP

Well, indeed. And so far, to quote McDonald's, I'm loving it, because what you're saying is you're already preparing to be future ready. Right. And of course, the more these transactions become mainstream, the more you want to have the right security measures in place. Okay. So that makes perfect sense to me. I can totally understand. Also, thank you for painting that picture, because now it also becomes clearer how

45:55

JP

open our messages are on WhatsApp, on Facebook, on Telegram, when they should be as protected as we want them to be. Now, this being said, when it comes to the xx network, I understand, in addition to this, that you have done some work in various other spaces with innovation, with mainstream tech, and also with voting. Or so my team is telling me.

46:41

JP

Tell me a little about some of this innovation and then let's dive a little into the voting aspect because I think this is going to be an interesting topic.

46:51

Richard

[transcript gap] a startup in, I think it was, [transcript gap]

48:02

Richard

Right. So that was the purpose of the startup. Very interesting, but it didn't work out. I ended up going to work on the xx network instead after that. Before that I had done some other minor startup stuff. But let's talk about what you're really interested in, which is the voting. Part of my PhD research was: can we make systems where you get some confirmation that doesn't prove how you voted, but gives you confidence that your vote was counted properly? Okay, that sounds like I'm saying opposite things, but let me explain to you briefly how it works. The way that this system works is, before the election, we post what I call a digital audit trail online. Think of it like a phone network, a phone number attached to a person. In this case, it's a phone number attached to a candidate.

49:08

Richard

And then there's a bunch of wires in between that are all mixed up. What's hidden on the digital audit trail is those wires. Those are encrypted.

49:16

JP

Okay?

49:17

Richard

So right before the election, before we print the ballots, or however we're going to do it, we reveal some of those wires and we say, look, all of these wires match to all of these candidates. So we're not lying. This is a one-to-one connection, because we've audited it, and I think in our proposals it's roughly half, but there are multiple copies. So there's a one over two to the N chance we cheated, in other words a one minus one over two to the N probability that we did it right. And there are no wires or numbers that match to the same place, so we're not messing with the count after the fact. After the election, that digital audit trail gets updated with intermediate results, and then we do another audit after we do the final count. And that audit doesn't reveal which number, in this case confirmation number, goes to which candidate, but it reveals all but one of the wires.

50:12

JP

Right.

50:12

Richard

So for us to have changed the results of that election, we would need to know who you were going to vote for ahead of time and which ballot you were going to get, which is practically impossible given that we're printing it on paper, we don't know when you're coming in, and so on and so forth. So that is roughly how those systems work. And that's how you can have high confidence that your vote was counted absolutely correctly. You're going to go online, you're going to make sure your confirmation number is there, but it's not going to say who you voted for, because that still remains encrypted and it's protected by a multiparty computation. And I won't get into the nerd stuff.
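
To make the audit math concrete, here is a toy sketch of the commit-and-open ("cut and choose") idea behind that one-minus-one-over-two-to-the-N confidence figure. It is a simplification, not the actual construction from Richard's papers: real systems commit to full permutations and open them under a public random challenge.

```python
import hashlib
import secrets

def commit(wiring: str):
    """Hash commitment to one copy of the hidden 'wiring' that links
    confirmation numbers to candidates. Publish the digest now; reveal
    the nonce and wiring only if this copy is chosen for audit."""
    nonce = secrets.token_bytes(16)
    return hashlib.sha256(nonce + wiring.encode()).hexdigest(), nonce

# Officials publish commitments to N independent copies of the wiring;
# auditors then force a random subset open. A cheater who forged copies
# survives only if no forged copy is opened, so the chance of cheating
# undetected shrinks to roughly (1/2)**N: auditors get 1 - 1/2**N
# confidence without ever learning how anyone voted.
N = 20
commitments = [commit(f"wiring copy {i}") for i in range(N)]
print(f"{len(commitments)} commitments published; "
      f"undetected-cheating probability ~ {0.5 ** N:.7f}")
```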

50:55

JP

Yeah. So this is extremely interesting. Right. In a previous life, I had some exposure to how the voting system works, and it is so vulnerable. Let's just put it this way, right. There are so many ways in which, from one source or another, whether it's electronic or an individual, certain people are able to identify for whom an individual has voted. And that, of course, is not something that you want to have out there, for multiple reasons. Right. I would want to vote for whom I wanted to vote. I wouldn't want everybody to know it at the time that I'm voting. But if that information is vulnerable at that particular moment, it is an issue. Right. I was about to say it could be potentially harmful, but at that point, of course, it is an issue. That's not something I want to be out there.

52:04

Richard

It absolutely is. Yeah.

52:07

JP

Right. Okay. In terms of this vulnerability, is it protected through the system that you just described, or do we need more adoption of it? What's the next step?

52:22

Richard

That's a great question. My conclusion is that no one wants to deploy such a system. Voting is messy.

52:32

JP

Right.

52:32

Richard

And no one wants to deploy a system where every minor mistake provides incontrovertible evidence that the mistake was made. Barring public outcry, I don't think that it's ever going to be used in a real public election. And another aspect to this, which I'm glossing over but is really important: when you build a voting system, everyone has to be able to use it equally. And these systems are not there. The systems I worked on are good and most people can use them, but I think they're difficult to use for a lot of voters. So there are a lot of reasonable objections to implementing that system, some of them, I think, less legitimate than others. But I don't think that any election official who's competent is ever going to deploy the kind of systems that I worked on. But I do think aspects of what we've done are going to make it into these systems eventually.

53:44

Richard

Microsoft has a really great project called ElectionGuard, and there's a nonprofit in the space called VotingWorks, and they're doing wonderful work. They have technology that uses very similar methods and also leverages the auditing to do risk-limiting audits on paper ballots and other types of audits. So it's not that this is never going to make it into the public sphere; it's more that it's not going to look the way that it looked when I built it, if that makes sense. I've come to that conclusion.

54:23

JP

I can completely understand. Yeah, go for it.

54:26

Richard

And I would just like to connect this back up. The same privacy techniques, not completely identical, but in general the essence of them, is the same stuff that's in the mixnet, the batch mixing.

54:42

JP

Right.

54:43

Richard

So it's a very similar idea. That's how these two things are connected. Because it seems like voting is kind of off the wall, but no, in my mind you're solving the same problem, the same technical problem. Right. Different context, but same technical problem.

54:56

JP

Yeah, a lot of us can relate especially to that. And when you look at the entire mechanism when it comes to voting, there are so many moving parts involved. From the information we've had over the last couple of elections, whatever controversies surrounded them, there's the use of personal data to create influence. So that's one aspect. Right. That's just one layer. The second layer is when you're actually going to vote. And at that point, if there's an additional vulnerability, then the attack is coming from both ends, or rather from both sides.

55:52

Richard

Right.

55:55

JP

But your technology is kind of being built to plug that hole, and that helps protect

56:03

JP

your rights, your freedom. And it should. Right. So we need more systems like these that are able to protect us from attacks like that. Or vulnerabilities. Let's just call them vulnerabilities, because that's exactly what they are.

56:19

Richard

Absolutely. And it's not that I've stopped working on voting systems. I've been working lately on VoteXX. If you go to votexx.org, there's a paper that you can read that explains the system, and the innovation there is trying to make security similar to a voting booth on the Internet. Right. So we're really trying to make Internet voting more viable, and our intention is to use that as part of the xx network ecosystem. And the insight there is, we've realized, when you look at the voting problem, and I don't think this is well understood, it really boils down to: can a third party buy or influence the vote because your privacy is violated? Right. If I can know how you voted, then I can say, vote for this guy, I'll give you $10. If you remove the malware and other aspects of this, which I think we can solve with technology, that's the fundamental issue.

57:21

Richard

And what we've done is, you register, and that registration could be in person or some kind of KYC, so that you're a real person, and you do that sometime before the election, right. And you register passphrases that allow you to vote. And after the fact, you can use another third party. We're calling it a hedgehog, for lack of a better name, but essentially the hedgehog is in there to reverse your decision. So if I vote for candidate A, I give the hedgehog a token that says, well, you can nullify the vote for A, and nullify can mean different things. It could mean we're going to switch it from A to B. Right. But the key there, the insight there, is that there is no token that allows the hedgehog to directly vote for B, or to nullify your vote for B. It only allows it to reverse, or indicate reversal of, the decision that you made.

58:29

Richard

And because you have that, no one can, after the fact, come to you and say, well, I want you to vote for candidate A, I'll pay you $10. Because they don't know if you're going to hedgehog it later to reverse the vote or to cancel the vote. That's the innovation there. So if they wanted to do that, they would need to be involved from the beginning, in the registration system. They would need to make sure you never talk to anyone for the duration of the election. We're assuming elections last for months. Right. So how many people have the resources to take a significant number of people and put them in a dungeon without access to computers, and have it not be noticed? In theory, nobody. Not even governments, not even oppressive regimes have that capability.
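
The asymmetry Richard describes can be sketched in a few lines. This toy model is purely illustrative; in VoteXX the guarantee comes from cryptography, not from a trusted object, and the names here are invented. What it demonstrates is that the hedgehog's token can only reverse the vote that was actually cast, never cast or cancel a different one, so a vote buyer can never be sure the vote they paid for will survive.

```python
class ToyBallot:
    def __init__(self, choice: str):
        self.choice = choice
        self.nullified = False
        # The voter hands the hedgehog a token scoped to the cast choice only.
        self.hedgehog_token = ("NULLIFY", choice)

    def hedgehog_nullify(self, token) -> bool:
        # A token for any other candidate does nothing.
        if token == self.hedgehog_token and not self.nullified:
            self.nullified = True
        return self.nullified

ballot = ToyBallot("A")
assert not ballot.hedgehog_nullify(("NULLIFY", "B"))  # wrong scope: no effect
assert ballot.hedgehog_nullify(("NULLIFY", "A"))      # voter's own token: vote voided
```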

59:17

JP

Yeah, I don't think anyone would go for it either.

59:21

Richard

It's about reversing the asymmetry that you have now, because right now you just need to be involved when they're voting, but in the system we're proposing, you need to be involved the entire time. You need to constantly be watching, and that level of involvement is not possible. I could say, before I even register, hey, I'm going to give you the token in advance; if I don't put my flowers out on this day, then cancel it. And that could happen two years before the election. There are all kinds of things you can do, spy versus spy style, that would enable you to do a nullification that wouldn't otherwise be possible. So it just enables this entire world of optionality. And I'm sorry, I'm rambling. Now it's my turn to ramble, right JP?

::

JP

Absolutely, absolutely. We're coming towards the end of the show. So, audience, if you have questions, I know this is not the most secure network, but you can type them in to AdLunam Inc., because this has certainly been informational, educational, eye opening, and, you know, I probably need to get a new phone with a lot more security features now that we've heard from Dr. Richard Carback. Doc, I've got two more questions for you. So I'll give the audience some time to throw the questions in.

::

Richard

Right.

::

JP

Okay. So in terms of the future of where the industry as a whole is going, more than just crypto, are we on the right track in terms of being more secure? Are we on the right track to being more vulnerable? What should we watch out for, broadly? Because I know that this is something that we've spoken about on the show, but for the industry as a whole, five years from now, ten years from now, are we on the right track? What should we course correct at this point?

::

Richard

I think we have a hard road ahead of us; things are going to get better, and things are going to get worse. We're going to see more and more frequent privacy abuses, but we're also going to see tools and reactions that are going to help mitigate and potentially solve some of those problems. I hope that privacy is going to gain more prominence. Obviously, we're not the only privacy project. We're specific to messaging, right. So we have things that others don't. We have message delivery guarantees. We have some message storage that other projects don't. But we're not the only privacy project out there, and I'm hoping to see more of a user base for all of the projects, and more prominence for them as well.

::

Richard

So I'm hopeful, cautiously optimistic, I guess, is the summary there.

::

JP

Okay. All right, so my next question then is for you, Doc. You've told us a lot about where you started, what you see, and what are the things we have to protect ourselves with. Get some crypto Kung Fu going, some of the basic moves. Right. But what's your personal philosophy? I mean, what is it that keeps you going and so passionate about what you do?

::

Richard

Oh, that's a tough one. Be kind. Use what you're good at, try to do the most good for the most people, and do what you can to leave things better than how you found them. Right. Now that I've said that, it sounds a little similar to effective altruism. I would say that it's a little different for me. There's that example where you theoretically let the local kid die to save 100 in another place. Right. I don't think that's the right way to look at it. I think that you always have to save the local kid, because you're there, and that's what making the place you're in better is about. And I think when you say, oh, if I spend money elsewhere, that doesn't necessarily actually save those kids; you might be making the situation worse in a lot of ways. So you have this moral obligation to do the work, to understand your impact on the world.

::

Richard

Very similar philosophy, but slightly different in terms of the execution. It's sort of impossible to know how your contribution is going to have impact unless you're in the mud figuring it out. So you need to do the work to make sure that you're actually having the effect you think you're having.

::

JP

Okay, so I've got to thank you, Doc. Thank you for sharing that. Certainly get in the mud, but also at the same time be kind. These are two things I think every one of us can follow. So thank you for sharing that. I've asked my team to collect some questions, and I'll be honest, some of the responses I've got have been great. Two of them have said: I had a question, but I'm not sure that I should send it electronically now, after listening to the show.

::

Richard

What’s the harm?

::

JP

But I think that's also testament to what you've been saying. I think it has been enlightening, and it has gotten them, and us, me as well, to understand where the vulnerabilities are and what we should do to circumvent them, to protect ourselves better. One more question that has come in: if you could travel back in time and give one piece of advice to your younger self, what would it be?

::

Richard

Besides don't sell all your crypto? Don't do the human trafficking project. I know that's a little selfish, but I want to say my biggest personal flaw, right, is that I have always picked the most technically interesting project and not necessarily the one that makes any money. And I had a VC tell me that when I was trying to sell the voting system stuff, and I hated it. And the reason I hated it is because that guy was right. So it took me a long time to come around, but I've since modified that thinking to: I'm trying to pick a technically interesting project that's going to do good in the world and can make the most impact for the most people. And that's admittedly a compromise, right? Because I want to say that I'm not much motivated by money. Obviously I'm a little motivated by money, but not a lot.

::

Richard

So pursuing only money, that kind of thing would stain my soul. I think I couldn't handle it. But I also don't want to work on shelfware. I want to work on stuff that will actually get into people's hands and that they'll actually use. So my conclusion there, and what I would tell myself, is: if you work on things that no one is using, does it really matter? And hopefully I would come to the conclusion that I've come to now, and that is that my best bet is to find a happy middle ground between those two. So that's the advice I would give myself. But number one would definitely be: don't sell the crypto.

::

JP

Fair enough. Okay, Doc, one last question. This has come in from Gloria, and the question goes: AI is becoming more and more prominent. Is there something we should watch out for to protect ourselves? This may be rudimentary, but I think given the current trends and the focus that AI has gotten of late, is there something that we should watch out for?

::

Richard

It's obviously a problem, because the AI is looking at patterns, and the fundamental issue is that it is trying to predict what is not predictable. And what I mean by that is, you can say, oh, this is a flower, and what kind of flower it likely is. That's an excellent use of artificial intelligence. But the minute you start saying, oh, I'm going to use it to predict the market, I'm going to use it to predict if this guy's a good guy or a bad guy, you've crossed a line of morality that is, I think, unacceptable, and I would love to see it eliminated by regulation, but we're not there yet. What we're going to see is police departments using it to profile individuals, to profile neighborhoods. And it's going to create a cycle of abuse, which in the US is largely fueled by racism, but pretty much it's the same everywhere.

::

Richard

I think in many cases it's worse in the US than in other countries. We acknowledge that we're racist in many ways, I mean, not always, but in many ways we acknowledge it more than other countries. So I think that's the danger. And when you marry that with what I've already told you about all the privacy abuse, it's going to exacerbate the situation, especially when you're talking about trying to make predictions. And if you're predicting that some guy might like this product that you're offering, sure, fine, there's not a lot of harm in that. But you've got to look at the harm. And they talk a lot about end-of-the-world AI, the machines are going to take over. I've looked at the LLMs. That's not a real threat right now. There are a lot of real harms right now. I think her name is Margaret Mitchell, and Timnit Gebru; they've been publishing in this space for five to ten years at least.

::

Richard

They have outlined a lot of these harms. And they're right. We should be listening to the people who have been studying this, on what these harms are, and trying to build some regulations. And the other thing that I really dislike is all of this public data that you're sending into Google. I mean, I think Google just changed their terms of use to make it more explicit that you are training their AI. But all of this data, we're the source, we're the reason the AI is smart, right? We're training the data. The reason Midjourney makes such great images is because it has stolen and is using a bunch of images from everyone else. And I would love to again see regulation that requires them, and I know that they can build this in, because I used to work on AI systems, to cite their sources.

::

Richard

It would be not that hard to make it work that way. They're just choosing not to do it because it makes them more legally liable. Great question.

::

JP

Indeed. I think we have to do another edition of the show, because there's a lot of ground to cover. There's a lot of thought that still has to be put into how to go forward in this entire thing, and that's what we've learned through this show. So, Doc, thank you so much, first for accepting the invitation, for sharing your insights, for being at the forefront of what you're doing, and thank you for your contribution to the space. To be honest, it's been spectacular having you on the show.

::

Richard

JP, it has been an absolute pleasure, and I appreciate everyone in the audience, and I wish everyone a great day.

::

JP

All right, thank you so much, Doc. Ladies and gentlemen, that's it. That was the last question we could take, but if you have further questions, write them in to AdLunam Inc. or to our speaker directly, and we'll get them answered for you. I know that there are a lot more. If you're looking for a secure channel, you already know where to look. So thank you so much, once again, for being on the show. We will be back next week at the same time, at the same place. This is JP from AdLunam Inc. bringing you everything about Web3. Have a good one. Cheers.
