
5 Questions About War Virtually

In this free live event, anthropologist Roberto J. González discusses his new book, War Virtually: The Quest to Automate Conflict, Militarize Data, and Predict the Future.

In this live Q&A, former SAPIENS Media and Public Outreach Fellow Yoli Ngandali meets with anthropologist Roberto J. González about his new book, War Virtually: The Quest to Automate Conflict, Militarize Data, and Predict the Future. Watch and learn about war robots, the militarization of virtual technologies, and whether the efforts to automate conflict make us more or less than human.

Read a transcript of the CART captioning by Jordan Mucha

>> YOLI NGANDALI:  Hi, everybody. Come on in. We are going to wait one moment and then we’re going to get started. You are in the right place. So just wait one moment here.

Hi, everybody. Come on in. We’re gonna wait just one more moment before we get started.

All right. All right. Yes. Well, hello and welcome, everyone, to the SAPIENS 5 Questions series. My name is Yoli Ngandali, and I’m a graduate student in archaeology at the University of Washington.

And a former Media and Public Outreach Fellow for SAPIENS.

Before we get started, I want to encourage you to click the link in the chat to join our SAPIENS newsletter. This newsletter delivers news stories about humans and our world every Friday.

Today’s guest is Roberto González. Roberto is a cultural anthropologist whose work focuses on science, technology, militarization, processes of social and cultural control, and ethics in social science. Professor González is the chair of the Anthropology Department at San José State University. He’s authored several books, including Anthropologists in the Public Sphere, American Counterinsurgency, Militarizing Culture, Militarization: A Reader, and Connected: How a Mexican Village Built Its Own Cell Phone Network.

He also co-produced the documentary film Losing Knowledge: 50 Years of Change (2013). He has conducted ethnographic research in Latin America and the United States, and now he’s here with us to talk about his new book, War Virtually: The Quest to Automate Conflict, Militarize Data, and Predict the Future. This is hot off the press from the University of California Press. I’m gonna stop sharing and welcome our guest, Roberto. Thank you so much for being here today. Welcome.

>> ROBERTO GONZÁLEZ:  Thanks for the introduction, Yoli, and for the invitation to be here. It’s a pleasure.

>> YOLI NGANDALI:  And for people in the audience, if you have any questions for Roberto during this time, make sure to add them using the Q&A function at the bottom of your screen, and we’re gonna answer them at the end. So we only have about 20 to 30 minutes here today with Roberto, so let’s get right into it.

Roberto, congratulations on the publication of War Virtually. Much of the book concerns the military’s advancement of data-driven technologies and artificial intelligence. Can you give us a brief preview of the trends, trajectories, or predictions for the next generation of technological systems and models for warfare?

>> ROBERTO GONZÁLEZ:  Sure. I should start by quoting former Yankees catcher and sometimes philosopher Yogi Berra, who once said, “It’s tough to make predictions, especially about the future.” It’s really unclear what things are gonna look like 10 or 20 years down the line, but what I can talk about are things that are in the pipeline right now at some of the military research labs.

And hopefully a bit later we can get into how social scientists are actually playing a part in the development of these new technologies. First and most importantly, I think, is the development of next-generation drones. So basically aerial drones, unmanned aerial vehicles, are really, I think, the centerpiece of future wars.

And we’re seeing this, for example, right now in the Russian invasion of Ukraine, where Ukraine in particular has made use of Turkish-made drones. I think the significant shift that we’re likely to see is the incorporation of more and more autonomous technologies into drones. They have the capability now, but there are still a lot of problems to be worked out.

A second area that I cover in the book that I think is really important is the creation of a very sophisticated form of high-tech psy-ops, or psychological operations. You can think of old-school helicopters dropping pamphlets onto, you know, an Afghan village or an Iraqi village with propaganda, basically trying to entice enemy soldiers to surrender.

Nowadays I think what we’re seeing is the use of micro-targeted messaging, using social media and other forms of internet communication. And, again, we see this all around us, even here in the United States, as both Ukrainian and pro-Russian forces try to sway American public opinion about that particular war.

Another area that I cover in the book is the area of predictive modeling software. These are programs that basically aggregate big data from sources like Twitter and Facebook to try to get a better understanding from the military and intelligence agencies’ perspective of where future conflicts are likely to happen.

And then finally, I would say the development of the next generation of lethal autonomous weapons systems is high on the agenda of the Pentagon’s, you know, research labs. And I can talk about this shortly, but, clearly, here we’re seeing Silicon Valley and the tech industry becoming really important defense contractors in the years to come, much more so than at any time in the past.
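To make the predictive modeling idea above concrete, here is a minimal, purely illustrative Python sketch of the general approach: aggregate social media posts by region, count conflict-related keywords, and flag regions where chatter spikes above a historical baseline. The keywords, data, and thresholds are invented for this example; this is not the software described in the book or any actual military system.

```python
# Toy illustration only: invented keywords, data, and thresholds.
from collections import Counter
from statistics import mean

CONFLICT_KEYWORDS = {"troops", "shelling", "checkpoint", "protest"}  # hypothetical

def count_mentions(posts):
    """Count posts per region that mention any conflict-related keyword."""
    counts = Counter()
    for post in posts:
        words = set(post["text"].lower().split())
        if words & CONFLICT_KEYWORDS:
            counts[post["region"]] += 1
    return counts

def flag_spikes(history, current, factor=3.0):
    """Flag regions whose current count far exceeds their historical average."""
    flagged = []
    for region, count in current.items():
        baseline = mean(history.get(region, [0])) or 1
        if count > factor * baseline:
            flagged.append(region)
    return flagged

# Invented past daily counts and today's posts.
history = {"Region A": [1, 2, 1], "Region B": [1, 1, 2]}
todays_posts = [
    {"region": "Region A", "text": "heavy shelling reported near the checkpoint"},
    {"region": "Region A", "text": "more troops arriving tonight"},
    {"region": "Region A", "text": "protest planned downtown"},
    {"region": "Region A", "text": "shelling again overnight"},
    {"region": "Region A", "text": "troops at every checkpoint"},
    {"region": "Region B", "text": "quiet day at the market"},
]

print(flag_spikes(history, count_mentions(todays_posts)))  # ['Region A'] with these toy numbers
```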

>> YOLI NGANDALI:  Mm-hmm. Mm-hmm. Oh, wow, thank you so much for that summary to give us kind of a baseline of where we’re at with this book and where we’re thinking about going in the future here. So this second question here is a two-parter. How is data a gift? How is it a commodity? How is data a weapon? And what does anthropology bring into this conversation?

>> ROBERTO GONZÁLEZ:  Let me start with the last part of that question, because I think that’s gonna be of interest to a lot of people listening here today. I think anthropology has a lot to say about data because data literally, if you look at the origins of the word, means “gift.” The word comes from the Latin datum, which literally means “that which is given.”

There are anthropologists who have made note of this in some of their writings and have made us aware of these roots of the word. And, of course, any time we’re talking about gift-giving or reciprocity, I mean, in a lot of ways these areas of economic anthropology form the basis of modern cultural anthropology, I would say.

If we look at the work of the early anthropologists, gift exchange was really at the heart of their analysis. And I think that’s been continuous throughout the history of modern anthropology, on the cultural anthropology side and in social anthropology, this interest in gift-giving.

So framed in that way, anthropology clearly has a lot to say about data as a gift. And I get into this in the introductory chapter of the book, and at various points throughout, and the way I look at this is very similar to the way that Gillian Tett has suggested in some of her work. She notes that the cyber economy is basically built on a sort of large-scale system of exchanges, with the technology firms on the one hand providing seemingly free gifts and services to users.

So if you think about your free Gmail account, or chat functions, or forms of entertainment that you may be able to get online for free, supposedly. In fact, what it is is an exchange where the reciprocity part is you handing over your personal information to these companies to do with as they see fit.

>> YOLI NGANDALI:  Right. There is no free lunch.

>> ROBERTO GONZÁLEZ:  There is no free lunch, and we often don’t reflect on this exchange, but I think that’s a really basic but important way of understanding the idea of data as a gift.

In terms of your other questions, then, when does that become a commodity? Well, the commodification is when we give up these gifts of personal data, these images of ourselves, to quote another anthropologist, Rebecca Lemov, and the companies aggregate this into what we call big data and then sell it either to third parties or use it themselves to develop advertising campaigns for their clients.

Then that is the gift as commodity, clearly. It’s not an accident or a coincidence that some people are referring to big data as the new oil, just because of its value in terms of the capacity to generate revenue for a company like Twitter or Facebook or Google.

So I think that’s a really important aspect of this that we should be thinking more about. Not just in the context of the militarization of data, but more broadly in terms of a certain form of economic transformation that occurs with data.

In terms of the weaponization of data, well, to me, clearly, if data can become aggregated and can be sold off to clients for purposes of advertising or propaganda in the political sense, then data can quite easily be weaponized in the connected world that we’re living in.

And the example that I cite in the book is that famous example of Cambridge Analytica from a couple of years ago. I take a little bit different angle on that, because rather than just looking at Cambridge Analytica, I take a harder look at SCL Group, which is the larger — it was actually a defense contractor and Cambridge Analytica was just one part of a larger company based in the UK.

And so the argument I make is that essentially Cambridge Analytica is a great example, literally, of the militarization of big data, because it was happening under the umbrella of a military contracting firm, SCL Group, which had been doing business across dozens of countries around the world at the same time that the Cambridge Analytica scandal broke.

So that in a way — those are three sort of different — it’s a great question because it explores the different ways in which you can analyze that or think about data as an anthropologist. And I do get into that pretty extensively throughout the book.

>> YOLI NGANDALI:  Mm-hmm. That also kind of — as you were speaking, reminds me of so many different, like, ethics involved in all of these different parts here. How do we kind of manage some of those, those expectations for, you know, having ethical relationships with our data?

>> ROBERTO GONZÁLEZ:  Yeah, well, I think that there are some examples, some great examples of this. I mean, towards the end of the book, I talk about some of the efforts under way to try to push back against the militarization of data and the weaponization of big data, and there are some groups out there doing some really interesting things, some activist groups, that collectively I guess you could think of as kind of the “own your own data” movement.

These activists are trying to pressure the technology firms to be more transparent about what they’re using our data for, about what data they’re collecting on us, and there are examples from other parts of the world that have done a much better job of demanding transparency from firms doing business there.

I think that in terms of ethics, that is a clearer sort of ethical path that we can take if we’re concerned about these kinds of things. I thought it was really important in the conclusion, not so much to end on an up note, but just to acknowledge the many groups out there trying to change these potentially really dangerous situations and exploitative practices that so much of the tech world has been complicit in.

>> YOLI NGANDALI:  Right. Right. Oh, wow. Thank you so much for that. I mean, we’re talking a little bit more about some of these, you know, larger corporations and things like that. What role do these corporations like Amazon, Google, Microsoft play in anticipating human behavior and in these data models and predictions?

>> ROBERTO GONZÁLEZ:  Yeah, well, I think — I mean, first of all, I think lots of people assume that these big companies, Amazon, Google, and Microsoft, for example, are really good at predicting human behavior. Because they have access to so much of our personal data.

>> YOLI NGANDALI:  Right.

>> ROBERTO GONZÁLEZ:  I’m not at all convinced that they’re as good at predicting things as we sometimes think. Or as they sometimes make themselves out to be. I think, you know, if we’re talking about if a person is more likely to buy organic or non-organic or, you know, whether or not a person’s more likely to vote Republican versus Democrat, then, yeah, they may be pretty good at those sort of rough estimates.

But detailed predictions about human behavior, you know, I don’t buy that, really. I think what’s more important to ask, and there is a chapter in my book called “Pentagon West” where I get right to the heart of this, is: How have these corporations, which many of us think of as sort of clean, non-military tech firms that somehow operate in a bubble, become major defense contractors?

To the point that the Pentagon literally now has outposts on the West Coast, in Silicon Valley, within walking distance of Google headquarters, to try to acquire new technologies and to find new start-ups that are developing technologies that they can put to work for military and intelligence purposes.

Everything from targeting, you know, so-called terrorists or enemies to high-tech surveillance systems. There’s just all kinds of stuff that’s in the works, much of it coming out of small start-ups in Silicon Valley. So just to take two examples of these outposts: They’re not so much military forts or bases, they’re organizations. On the DOD side, the Department of Defense has the DIU, which is an acronym for Defense Innovation Unit.

They’re headquartered, again, literally at NASA Ames, a big NASA research base there, very close to the Googleplex, Google’s headquarters.

And then on the CIA and intelligence community side, you’ve got an outfit that’s been around for close to 20 years now called In-Q-Tel. They’re basically responsible for providing funds to start-up companies, mainly for surveillance technologies that are being developed in the greater Silicon Valley area.

These two entities are also operating in places like Austin, Texas, and the Boston, Massachusetts area as well, where you’ve got lots of kind of similar tech firms doing business.

So, to me, it was fascinating as someone who teaches in the heart of Silicon Valley, at San José State, that there is so much work going on not far from my base of operations at the university, that is hardcore sort of military —

[ Inaudible ]

So that was of great interest to me. So this is in a way maybe answering your question in a different way. We shouldn’t be so concerned about how well Amazon, Microsoft, and Google are doing at predicting human behavior as about how they have now shifted to become major defense contractors.

>> YOLI NGANDALI:  Right.

>> ROBERTO GONZÁLEZ:  I’ll just close on one note. Probably the biggest prize of all is a contract that is called the Joint Enterprise Defense Infrastructure, JEDI. It’s essentially a cloud system for all military users, and we’re talking about millions of military employees that would be using this. It’s a $10 billion deal that’s probably gonna be even greater in the end.

And Microsoft was awarded the contract. But Amazon basically jumped in and protested. So now it’s tied up in the courts. And the official awardee hasn’t been named. So here you’re seeing the tech firms kind of struggling with each other over these billion-dollar contracts, multi-billion-dollar contracts, which to me was kind of surprising to find out.

I had a sense that some of this was going on, but it’s really interesting to see just the depth to which it’s out there.

>> YOLI NGANDALI:  Yeah, that is quite surprising. I want to just say something to our audience. If you have any other questions for Roberto, make sure to put them in the Q&A before we’re too far along. I want to make sure that you can have a chance to ask a question as well.

So my next question: I was particularly fascinated with the account in the book about the robot funeral and these types of emerging relationships between humans and artificially intelligent machines. There is an important question that you raised in the book, and it was: How is Homo sapiens capable of dehumanizing members of its own species, as it has for centuries, while simultaneously humanizing the robots that kill them?

So can you speak a little bit more to this, and maybe just a little bit about techno-animism?

>> ROBERTO GONZÁLEZ:  Sure. As somebody who has been studying militarization for a long time, one of the questions that has always fascinated me is: How are soldiers trained to overcome what I would say is a pretty natural aversion to killing other human beings? And, you know, there is a lot that’s been written about this. Probably the most famous recent book is a book called “On Killing” by Dave Grossman. It turns out what the military has figured out is that to overcome that aversion humans have to killing other humans, you have to shape the killing practice so it’s more reactive versus reflective. Once you give humans time to reflect on the process of killing another human, there is a good chance they’re never gonna do it.

So the training these days breaks down into things like pop-up marksmanship drills and fire commands and other kinds of battle drills that are essentially an effort to incorporate muscle memory into the killing process, so that you’re just automatically reacting lethally to certain kinds of situations as a soldier.

>> YOLI NGANDALI:  Wow.

>> ROBERTO GONZÁLEZ:  Now, to me, that’s kind of a big deal, but it’s a pretty simple psychological technique that they’re using during the training process to do this. And it can be done within a matter of months with the vast, you know, majority of soldiers.

So, to me, that’s interesting in and of itself. But the relevant subject matter that I’m dealing with in this book has to do with a different problem that psychologists are working on within the military establishment, which is how can we get humans to place their trust in robots? Because it turns out that efforts that have been made so far to incorporate autonomous robotic systems into battle groups are not working.

Because there is a deep mistrust on the part of soldiers towards the machines. I mean, I’m talking about, you know, let’s say an infantry soldier who sees introduced as part of her team, you know, an autonomous or semi-autonomous robotic system that is armed, right, with an M-16 or what have you.

But at this stage right now, you don’t find many soldiers willing to place their trust in machines. It turns out there have been enough incidents of friendly fire when these experiments have been done that soldiers just basically distrust the machines.

So you’ve got all four branches of the military, they each have their own research laboratory, and each of them has psychologists primarily dedicated to what’s called trust calibration within the military. And the idea here is what mechanisms can we use to shape soldiers’ attitudes towards these machines so that they’re more willing to place their trust in them?

And what they’re coming up with is pretty fascinating and sometimes, I shouldn’t say funny, but just the sorts of things they’re trying out are kind of all over the map. So some of the things they’re looking at right now are feedback systems where they’re basically informing the autonomous machine, the robot, about the emotional state of the human and then adapting the actions of the robot depending on the emotional state of the user.

Another example would be basically abandoning the idea of fully autonomous robots and doing a sort of combined machine/human team, which they’re calling centaur warfighters, like the mythical half-human, half-horse. They’re saying let’s try to work the technology in gradually by assigning certain tasks to the robot and having the humans do other things, right? Things that are more in their comfort zone.

And then you’ve got other ideas like trying to make machines literally more anthropomorphic, so they look like us, act like us, they’re more polite towards us. There are all of these aspects of changing the machines, what’s called the user interface and the user experience, to overcome that trust problem.

One of my favorite sections in the book is called The Trust Engineers. It’s all about the psychologists doing this kind of research. At times I would chuckle and think, you know, they’re never gonna do this. But then I would go back to this idea of the kill drills. And, yeah, the psychologists have managed to do a really good job of training soldiers to kill.

So I kind of leave it as an open question: To what degree will the trust engineers be successful?

>> YOLI NGANDALI:  Right, right.

>> ROBERTO GONZÁLEZ:  For themselves.

>> YOLI NGANDALI:  Yeah, from what I’ve seen in my own experience and, you know, online and stuff like that, it seems as though robots with googly eyes and, like, you know, that look a little bit more friendly seem to be more trustworthy, right?

>> ROBERTO GONZÁLEZ:  Yeah. And there are some really fascinating things that I uncovered, like, for example, in the piece, the excerpt from my book that appeared in SAPIENS. And you made reference to the robot funerals. I mean, these are remote-controlled robots that look kind of like Tonka trucks. They’re really not very human-looking yet. The soldiers give them names, you know, Johnny 5 or R2-D2 or these kinds of names from movies. And they seem, at some level, to be developing a real attachment to them.

So one can’t just dismiss out of hand the idea that humans can build some kind of relationships or at least have some kind of emotional attachment to these machines that they’re spending so much time with out on — in training and on the battlefields as well.

So, yeah, that was — it was an eye-opening chapter in many ways. And for me, one of the more fun ones to write, too, from the book.

>> YOLI NGANDALI:  Mm-hmm. Well, I want to ask you one more question before we get into the audience questions. I just want to know a little bit more about you. What’s your own story? And a few important things that we should know about you and how you approach your work as an anthropologist and as an author.

>> ROBERTO GONZÁLEZ:  Sure. Yeah, and I’ll be brief here, because I do see there are at least a couple of questions. My own background, I think it’s probably — I mean, in general, the anthropologists that I’ve known over the years tend to have very varied backgrounds. Many of us got into something other than anthropology before we were kind of taken by the field that we’re in.

I actually began my academic career as a student of mechanical engineering at a large public university. And actually almost finished my degree. I was about two semesters away with a pretty good GPA. But just not really seeing myself in that role for the rest of my life.

And by happy coincidence, I enrolled in a general ed course that was Introduction to Cultural Anthropology, and I think it was a month later I changed my major to anthropology. And that was following a pretty tough conversation with my parents. As you can probably imagine.

But in the meantime, I’ve realized now, reflecting back 25 years later, that that background in engineering really shaped the kind of anthropology that I do.

>> YOLI NGANDALI:  Right.

>> ROBERTO GONZÁLEZ:  So one of the common threads that has pretty much linked all the anthropology work that I’ve done is this fascination with science and technology. And I’ve looked at that in different ways both in my research in Mexico on cell phones most recently, and also in the military context, now with this new book.

I think the experience of having had that engineering formation and training has allowed me to kind of understand the culture of engineering at least, and to some degree I would say computer programming as well, in a way that is, I think, more profound than if I had just studied four years of anthropology as an undergrad and gone on to anthropology graduate school.

So I kind of see myself as this weird hybrid: an anthropologist who has had some substantial engineering training over time. And then just to close up, you know, on your question about being an author: that role has become increasingly important to me.

I mean, I think I began, as many anthropologists did, wanting to make my mark on anthropological theory and the anthropology of Latin America and Mexico in particular, but I would say over the past ten years or so, I’ve become increasingly concerned with the fact that so much good anthropology work is unknown to the general public. And this is why I think what SAPIENS does is so important.

Because I think one of the goals that the magazine is committed to is reaching the general public, who need, I think, the insights we can bring them. So I’ve really spent a lot of time honing my writing and trying to make it accessible to as wide an audience as possible. And for me, it’s been really rewarding and gratifying and sometimes challenging to do that.

So I’ll just leave you with that.

>> YOLI NGANDALI:  Thank you so much for that. We’re almost out of time, but I do want to be able to try to get to a couple of these questions. Is it okay if we go over just a little bit?

>> ROBERTO GONZÁLEZ:  Sure.

>> YOLI NGANDALI:  All right. So I’m going to start with Lucas. How does the responsibility for war change or not with the move towards virtual warfare?

>> ROBERTO GONZÁLEZ:  Yeah, that’s a great question. I think the move towards virtual warfare, in a way clouds the responsibility for decisions to go to war. And even more specifically, it can potentially cloud the responsibility of those who are doing the killing.

One of the things that I have come to understand is the push towards autonomous weapons systems, if we just look at that one example. So here we would be talking, for example, about fully autonomous drones that are, say, flying over Afghanistan and then making a decision about who to kill and when and where and under what conditions, based on certain algorithmic processes.

That’s not happening right now, that I know of, but the potential is there. In those circumstances, who’s responsible if civilians get killed? Who is responsible if there is a wedding party, for example, that an autonomous drone mistakenly identifies as a target? Who is to blame there, right?

So, to me, a lot of the — a lot of this effort to kind of automate the battlefield and create autonomous systems is about absolving humans of responsibility for war.

>> YOLI NGANDALI:  Wow.

>> ROBERTO GONZÁLEZ:  And to me, that’s profoundly troubling in an ethical sense, and I think this is one reason you’ve got, among others, philosophers talking about the implications of virtual warfare. So I appreciate the question. It’s a really big one and a really important one.

>> YOLI NGANDALI:  Right. I’ll take one more. If the Big Tech companies are not so great at predicting human behavior, how are they doing on shaping human behavior? Are they molding consumers rather than studying them?

>> ROBERTO GONZÁLEZ:  Yeah, I think that’s the key right there. I would fully agree with that. I mean, you know, when you have watched a certain Netflix series, right, one after the other, the artificial intelligence has the capacity to recommend things to you, right? To what extent is it truly your choice? And to what extent is it Netflix basically pushing something on you? Or Amazon does the same thing, right, with purchases.

So, yeah, I completely agree with the sentiment behind that question. And I think that’s — that’s the right perspective to take on that. Rather than to somehow be awed by the idea that these companies are somehow predicting our behaviors.

>> YOLI NGANDALI:  Mm-hmm. Mm-hmm. Oh, wow. Well, since we only have one more question here, I’m going to just read it, and then we should wrap up. So, thank you so much for this book. I can’t wait to read it. And this is from Georgia Butler. A question I have is whether you see the commercial sector driving defense technology solutions, maybe even to the point where the commercial sector is creating new possibilities for the defense sector that they didn’t even imagine.

So we were talking about, like, Silicon Valley as well. So is the defense sector still the primary driver of what it thinks it needs?

>> ROBERTO GONZÁLEZ:  That’s a really good question, and I’m not sure I know the answer to that, but I think that you’re on to something there. And I’m convinced that with companies like Google and Microsoft and Amazon and Palantir, which is a great example of a tech company that basically only services the defense and intelligence sector, we’re gonna see more and more of the commercial part driving the military part.

By which I mean the tech industry driving the military industry. And I want to close out and note something that for me was very sobering. Because I’ll be honest with you, I wanted to do this book for a long time, but I put it off for quite some time as well because I was busy doing other things like chairing my department. But I had a conversation with an applied mathematician who was a senior researcher at Google and left.

He calls himself a conscientious objector. He left when he found out that Google had undertaken a multi-million-dollar contract with the Pentagon, which I describe in the book, called Project Maven. And so he quit his position out of conscience and has created a new organization called Tech Inquiry that is doing some great research into the ties between the tech industry and the defense agencies.

What he told me was this, and my blood kind of chilled when I heard him say this. He says, I think within a few years, we’re likely to see the merger of Big Tech and big military companies, literally. He said something like, imagine Amazon buying Raytheon or Lockheed Martin. Because the resources the tech industry has are so much greater than even the largest military firms’.

So when he said that, I thought, I’ve got to write this book and I’ve got to do it soon because it’s happening all around us. So I do think that’s in the spirit of the question, though, which is who is really the contractor here and who is, you know, the client?

>> YOLI NGANDALI:  Mm-hmm. Okay. So, actually, I was wrong. I missed one more question. So this is from Robert Fritz as well. How does the robot determine emotional states of the soldiers? Oh, well, it looks like we might have a little bit of some technical difficulties. It seems as though we should probably move on, since it is the end of our time.

Is that all right? How are you?

>> ROBERTO GONZÁLEZ:  Hello, Yoli.

>> YOLI NGANDALI:  Yep, hi.

>> ROBERTO GONZÁLEZ:  Hi. Somehow, speaking of tech, I’m not sure if the internet gods are after me now, but I had a weak internet connection for a moment. So I didn’t get to hear any of the last question.

>> YOLI NGANDALI:  Right. Right. So how does the robot determine the emotional states of the soldiers?

>> ROBERTO GONZÁLEZ:  Ah, thank you. Yeah, so what those are, essentially, are body sensors. So these are sort of like adhesive body sensors that monitor things like heart rate and breathing patterns and things of that sort. It’s the same kind of technology that you might find on certain wearable devices. These days, there are wearable rings and so forth that monitor body states.

They’re basically deriving or inferring emotional states based on those kinds of biological patterns that are coming from the human body.
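To make this concrete, here is a minimal, purely illustrative Python sketch of the kind of pipeline described above: wearable sensor readings feed a crude rule that infers a rough stress level, and a robot teammate adjusts how much it does on its own accordingly. The thresholds, labels, and policy are invented for illustration; they are not drawn from the military research González describes.

```python
# Toy illustration only: invented thresholds, labels, and robot policy.
from dataclasses import dataclass

@dataclass
class SensorReading:
    heart_rate_bpm: float        # e.g., from an adhesive chest sensor or a wearable ring
    breaths_per_minute: float

def infer_stress(reading: SensorReading) -> str:
    """Crude rule-based inference of the wearer's state from biosignals (toy thresholds)."""
    if reading.heart_rate_bpm > 140 or reading.breaths_per_minute > 28:
        return "high_stress"
    if reading.heart_rate_bpm > 110 or reading.breaths_per_minute > 20:
        return "elevated"
    return "calm"

def choose_robot_mode(stress: str) -> str:
    """Adapt the robot teammate's behavior to the human's inferred state (invented policy)."""
    return {
        "high_stress": "hold position and wait for explicit commands",
        "elevated": "act, but announce each action first",
        "calm": "operate with normal autonomy",
    }[stress]

reading = SensorReading(heart_rate_bpm=152, breaths_per_minute=30)
print(choose_robot_mode(infer_stress(reading)))  # hold position and wait for explicit commands
```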

>> YOLI NGANDALI:  Thank you so much. I appreciate that. And I appreciate you for sticking around to answer a few questions a little bit longer, but we are out of time. So, thank you, Roberto, for being here.

>> ROBERTO GONZÁLEZ:  Thank you so much, Yoli. This was a pleasure. And great questions, everyone. Thanks for the opportunity.

>> YOLI NGANDALI:  To read the excerpt from his book, go to SAPIENS.org. The link is in the chat. And if this conversation has piqued your interest, go ahead and buy the book. War Virtually is now available on all major book-selling platforms. Thanks, everyone, for sticking around, and be sure to like, follow, and subscribe to SAPIENS to be updated on our next live 5 Questions event. Thank you again and have a wonderful day.

