Scott Timcke: Algorithmic capitalism and digital dehumanization (ep344)

Increasingly, the process of decision-making is taking place without people’s involvement, and these decisions are becoming automated—we have less sway in how the processes that organize our lives can be altered to better suit our interests.
— SCOTT TIMCKE

In this episode, we welcome Scott Timcke, Ph.D., a comparative historical sociologist who studies race, class, and technology in modernity. He is a research associate with the University of Johannesburg’s Centre for Social Change and a fellow at the University of Leeds’ Centre for African Studies, where he studies the overlap between algorithmic capitalism, FinTech, and neocolonialism. He is also the author of Algorithms and the End of Politics.

Subscribe and listen to Green Dreamer in any podcast app, or read on for the episode transcript.

 

If you feel inspired by this episode, please consider donating a gift of support of any amount today!

 
 

Transcript:

Note: Our episodes are minimally edited. Please view them as open invitations to dive deeper into each resource and topic explored. This transcript has been edited for clarity.


Scott Timcke: I was born in the early 80s in South Africa, which, as you and your listeners know, was before the formal end of apartheid but also the moment when it started to decline. So there was a lot of social turbulence, a lot of protests, a lot of repression.

Being a white person in the community, you see the basic unfairness of that society. So those tumultuous years very much orientated the types of things I'm interested in: How does racial oppression occur? What are the mechanics by which [it occurs]? How do states oppress people? How does class become involved? What types of technologies are utilized and deployed against people to subjugate them?

Those types of things were very much at the forefront of my mind as I started to look around me and go through school, as the schooling system in South Africa started to become a little bit more integrated. You start to learn to work with, participate with, and play sports with people from different communities. Those are some of the anchoring points that stake out the types of things that I've tried to follow through in my research, at least.

Kamea Chayne: I appreciate you sharing that. Before we dive in further, I want to be honest and say that I found it challenging to prepare for this interview because there are so many layers to what you address, and a lot of it is focused on the digital realm, which can feel abstract for me and hard to conceptualize. So I just want to put that out there first and say in advance that I may need clarifications on things that I'm just still trying to wrap my head around.

But to start us off with a more relatable and bigger-picture question: I'm aware that as people talk more about technologies like social media and the internet, many are sitting with questions like, how do we make these algorithms better serve humanity? Or how do we make better use of the data being collected to ensure it doesn't end up being weaponized against us, for example?

But you've gone deeper to question how the entire way of reducing our social relations through algorithms is in itself changing our lives in ways that we might not even be able to fully recognize right now. So can you walk us through your train of thought on this and how you see algorithms in this system inevitably driving a process of simplification?

Scott Timcke: I think that there are several layers operating simultaneously and they rub against each other and sometimes create opportunities for change. We'll get to those in a moment.

But let's speak a little bit about the layers. You spoke about one, which is how the technologies themselves can be skewed because of who just happens to design them and the types of biases they may bring to the table, either unconscious or very conscious in some cases—how the data sets behind facial recognition, for example, sometimes discriminate against people of color. That's a very common one. You find it in the media in the United States, at least.

The other things are a little bit more abstract, as you were saying, like the very logic by which we come to evaluate why we want to pursue certain types of technologies in the first place. Now, certainly, we think that there's a lot of utility to the data that forms what we use on a day-to-day basis. I think all of us would agree that many of these technologies, like social media platforms, are incredibly useful, and that's why we use them, despite knowing that the legal frameworks or user agreements that we signed sometimes aren't serving our best interests—because we offset [that] against the sheer basic utility of these tools. We understand generally as a population that even things like Slack could be useful to ensure productivity or help coordination within firms. These things have an obvious utility that I don't think many people would deny, even if many people really want to get rid of them.

The problem comes when we think about technologies purely on one axis of valuation—their utility to maximize profit. When we start thinking about technologies only in that way, whatever social benefits they bring are incidental, rather than the main imperative.

The type of work I'm trying to do is to look at how we can take the incidental social goods that come about—things that we obviously recognize have a lot of value to us—and make them the main imperative, rather than simply enriching the one percent. Those are the types of things that I'm very much interested in, to try to find the scope to think about and to try to get others to think about, too.

Kamea Chayne: Right, so I guess to ask a more basic but critical question: do you see the internet and information technologies as being themselves neutral, with the biases coming from the people behind them, their governance, and their users? Or do you see these tools in and of themselves serving particular logics that cannot be considered neutral?

Scott Timcke: No, I would never consider them neutral. The algorithms have their own affordances and are even skewed by the nature of their design. Think about a building, for example: it obviously has a set of properties that favor enclosure rather than being outside. So the architecture, the design of these things, to use that type of language, certainly has affordances, emphases, biases—pick the word you wish. That said, at the same time, there are people who deploy these affordances in particular ways to suit their interests or pursue their interests.

So sometimes, I think we get into a bit of a fight, a way of thinking where we put these things as diametrically opposed to one another. The affordances versus the interests of the people who happen to use them. For me, it's a question of both, as these things are also situated within this larger, historical setting that we find ourselves in, one that has a deep, rich, complicated, complex, and contradictory history.

We need to see technologies in terms of both who designs them and the attributes they themselves have, and how these things are conditioned by that long, rich history, where there’s an intense struggle over the design of these things in the first place.

Kamea Chayne: They're really not starting from ground zero, because they're being created within a context that already exists—connecting a world that already has a lot of historical injustices and kind of weaving all of these threads together. So there's that bit...

And more on how algorithms, I guess, end up reducing social relations. There is this broad idea that the positive potential of the internet and social media lies in how they connect the world and connect people with others, and stories and discoveries that they may not otherwise be able to make offline.

Nothing is, of course, ever all good or all bad, as we just mentioned. But I'm curious to hear how your understanding and views of social connections and relationships may have shifted since diving into all of this research.

Scott Timcke: Well, I mean, I think one of the things that informs my own understanding in this area is that I grew up when we had this Big Bang. I went to university, and one of the first assignments we ever had to do was send an email to ourselves, right? I'm sure some of the listeners out there will laugh at that, but that, I think, was worth 20 percent of our mark, right?

So you come of age watching as platforms like Facebook—or MySpace, which was more popular at that time—and then all the other platforms that have become much more popular came into existence. They had a degree of initial influence, and then they had massive influence and the ability to coordinate different types of factions on, about, and through them.

One thing we sometimes miss is how digitization, as a larger process, is very much similar to industrialization in the late part of the 19th century and early part of the 20th century—how industrialization fundamentally reshaped cities, workplaces, relations of hierarchy, and the public and private realms.

Those same types of things are occurring under digitization more broadly. Sometimes, when we focus too much on an individual firm or an individual business strategy, or an individual quarter, we miss these larger, decades-long transformations that are occurring to our cities, to ourselves, and also to our countries. So I think that's the larger thing that I'm interested in.

To get to the point about simplification, which is one of the core themes of one of the books I've written in this area: too often, we become so ingrained in the minutiae of platforms—What are the user agreements compelling us to do? What are we losing when we sign them?—that we miss the larger two-, three-, or four-decade transformation that is very much changing who we happen to be: the types of subjectivity, identities, ways of being, and behaviors that come into existence and [have] now become salient for us.

By comparison, industrialization created workers, management, and capitalists. That type of process fundamentally changed how people came to think of themselves, how they came to do their work, and how their work was organized. I think in some of the processes occurring right now under digitization, the very nature of work, our identities, the way we move through the world, and the way we experience and comprehend the world are changing in ways that are as expansive as those that occurred under industrialization.

Kamea Chayne: To that point, you talk about how information technologies are entrenching us deeper into this system of extraction and exploitation. I know there are so many different angles you could address this from, but are we able to recognize this showing up within our day-to-day lives at the more micro level as we engage with these platforms? Otherwise, what are the greater impacts of their logics and influence, and how might they also limit our imaginations or sense of choice?

Scott Timcke: I think we do, for the most part, pick them up. I sometimes don't think that we connect them as well as we could. What I mean by that is—and you see this with user experience studies of platforms—there are a lot of grievances that people have with them. They're frustrated by this or that. Why can't this platform do this? Or why can't this technology allow that?

Think of when we call a help-center line and the voice is automated: we get a set of preset menus, and we sit there and go, well, the preset menus can’t cater to the particular complaint that I have, or I'm just looking for information that these menu options don't really provide. I want to speak to a real person.

One of the things I think is occurring with the ever-greater technologicalization of the world, the digitization I've been speaking about, is that increasingly, people are being removed from the process, such that we become part of the inputs and we're affected by the outputs, but we're not really part of the decision-making process within that. What I mean by that is that we can't really speak to someone at, say, the cell phone company to talk about a bill and why there may be charges on it that are unfair.

So increasingly, the process of deciding, of decision-making, is taking place without people's involvement. These decisions are becoming automated, and more importantly, there's very little scope for people [to intervene]. That also ultimately means that we have less clout, less sway in how the processes that organize our lives can be altered to better suit our interests.

If these things become automated, we can think about how they may occur in places that are much more meaningful than just, say, the telephone company. We can think about how welfare systems may be automated and how they may automatically make decisions about our lives with very little means to appeal—these preset categories where, if you don't fit in properly, you get booted.

That's going to dramatically change, for example, the experience of child poverty, or whether people will be able to get goods or access services that have been carved out for them but that they may not have access to because they can't navigate these systems well.

Kamea Chayne: To clarify that, does this mean that we essentially, at a collective level, have less power and ability to co-create the world when we can't, for example, influence the automated robots that we are engaging with? We're engaging less with real people who may be influenced by the things that we may be asking for. Instead, we're faced with these robots that cannot really perceive and take in and understand our full sense of humanity.

Scott Timcke: Yeah, I think that there's a point at which humans are being removed from the decision-making process and the systems that come to govern our lives. I think your summary is very, very good there.

We need to be concerned about this because the stakes are very high. What happens when these things start to bleed over into our criminal justice system?

We then start to see people whose circumstances might not fit these given categories but are nevertheless skewed and pushed into one of them, and then injustice is done in their sentencing. Or we think about how these same types of technologies deployed in a different way may bleed into the political realm and now we have less scope to actually enact democratic changes to the world around us. I think that those prospects are very grave and we need to think a little bit more about them.

Kamea Chayne: Yeah, I certainly have a lot of concerns, especially with how the dominant culture seems to conceptualize advancement of humanity through this idea of technological advancement and the digitization of everything because in my mind, it's equated with a process of dehumanization and really limiting our abilities to be fully human and connect with other people in the most authentic and deepest ways.

Scott Timcke: I think you're absolutely right there. I mean, one of the ways that I think about it, given the parameters of my scholarship, is through questions about alienation. I think it very much boils down to the mechanization of alienation, or the digitization of alienation. We ultimately start to have fewer and fewer means to reach out and really have genuine human encounters with other people.

We see this again—to speak about the U.S.—because of the media system, there are these identities: are you a Democrat or a Republican? [They] have become so salient that they almost become cleavages for massive social fights. If you're a Republican, you have to watch Fox News and you have to believe everything that's broadcast on it.

The same thing is true with the Democratic media system. It's such an intensification of these two identities. It's become very binary—to play upon the idea of the digital and digital society, identities have become very binary. You are this, or you aren’t that. The levels of difference, the ability to navigate difference, and the ability to argue and exchange reasons with each other have very much evaporated.

Everything is now: if you are this from the beginning, then I can’t listen to you, because whatever you say is propaganda or fake news or whatever it happens to be. That's not, of course, to deny that some people have bad motives and are doing genuine harm in the world. But at the everyday level, at the level of your neighborhood, it's very much alienating people from one another.

These categories that are created, shaped, and reinforced on platforms have now become so entrenched that we can't even reach out and understand the people who live around us.

Kamea Chayne: That resonates deeply for me because I'm currently writing about binary reductionism and how everything is becoming this side or that side. Just as an example, I have strongly critiqued “clean energy” or “renewable energy,” and some people are like, “Well, if you critique this, then you must be in support of the fossil fuel industry.”

Or I might critique racial frameworks as being insufficient to fully understand Asians across the board. And people will say, “Well, that sounds like a right-wing talking point like you don't see race.” So it's just like, I don't know, maybe this culture that we're currently entrenched in is stunting our abilities to have nuance and to recognize the full complexity that is in each of our beings as humans.

Scott Timcke: I'm an outsider to the United States; I've been there once, to go get Mexican food across the border from Canada, down to Bellingham. I know very little about how people live and see things. But you watch a little bit of American popular culture, you watch Twitter, you know, all sorts of things, you talk to people, and one thing I see is just how reactionary the United States has become.

What I mean by that is everyone is very quick to rush to judgment. There's a lot of snap decision-making based upon these preset identities that people have, because it's almost become us versus them: you have political gridlock and polarization, your senatorial politics ensure that the small states have bigger clout than the larger, more populous states, and there's the tightness of Supreme Court politics. Everything now is so—I don't really like using the word tribal, but it is so group- and affinity-based that if you don't affiliate and fly your team's flag in every moment, you’re seen as suspect, as if you may be hurting the cause.

It's kind of like a politics of causes—a cause that must be triumphant at all costs. I think that there's a lot of merit in trying to ensure that fascists don’t win, for example; there's a lot of good in ensuring that. I want to be very clear: I think we need to punch Nazis.

At the same time, though, we now have such intense polarization that even with the people you need to persuade—that there's a lot of merit in doing antiracist work, or a lot of merit to be had in doing criminal justice reform—it stands to keep entrenching people rather than persuading them to pursue the common good. And again, I don't want to be too flippant here, but the very idea of the common good in the United States seems to have evaporated very, very quickly.

Kamea Chayne: Yeah, and to that point, I have been thinking a lot about divesting from, for example, short-form social media platforms like Instagram and Twitter, just because I've found the comment sections to be so dehumanizing in how they really incentivize and encourage this reactivity culture and judgmental culture where people aren't truly interested in having genuine engagements. People just like to drop their hot take and then leave. Or even when people do have a dialogue, it's dehumanizing in the sense that people can't see each other's full humanity or sense each other's emotions and states of being, so conversations can get so derailed—even though social media has been set up to supposedly connect people.

I know that I personally have not loved my experience on it, and it's given me a lot of thoughts about how I want to divest from these forms of short-form social media that encourage this reactive response, and spend more time on other things that allow me to connect more deeply and genuinely with people.

I don't know, maybe part of this is because these social media platforms have been set up in this capitalistic system where they're focused on the attention economy. So how do they design their algorithms and platforms to keep people there the longest? A lot of times, it's the reactive stuff that gets the most engagement, so it just further incentivizes that behavior.

Scott Timcke: Yeah, there's nothing that enrages people more and drives more money to firms than clickbait or the hot take, right? Sometimes I think that the New York Times keeps a number of columnists on board purely to drive hate reads. Then when you start to commodify hate reads and monetize them, what does it do to your journalistic discourse? I think that's another thing at play. To all the things that you've addressed, there are two things I would say. One is that because...

We don't really have a common experience of social media.

Each one of our feeds is very different from the others, and it's very hard to see what other people's feeds happen to be, so there's a lot of context collapse that occurs. My way of understanding a tweet is shaped by the tweets that happen to be around it, for example: What am I reading around an issue, or what are other people commenting on it?

So that can attenuate or alter or shape my understanding, my interpretation of that particular tweet thread, whereas someone else might not have that same experience. Depending upon what they may have read, I may be the one enraged, or they may be the one enraged. So there are no genuine, real structural frameworks for interpretation.

I'm not saying that we need the proverbial priest to come back in and help us read social media—I don't think that would be a good thing, either. But it does speak to how, without these systems of hierarchy, your interpretation is now in free fall. That can lead to a type of engagement, and drive the creation of content, that is very quick to engage with and gets lots of traction, to get as much revenue as possible—because you need as many eyeballs on the screen as possible so you can get the ad revenue as quickly as possible. So those things are always there.

Kamea Chayne: That really speaks to the importance of us all sharpening our critical thinking skills so that we are aware of this. Because I do think it has made productive and genuine communication more difficult when people might be talking about the same subjects, but the contexts that they are seeing them through are so vastly different. Oftentimes, these platforms do not allow the space and time for people to have deeper discourses to get to the roots that they may actually share, but that they may see vastly differently because of the stories and the context that they personally have been experiencing, or that their lenses have been shaped by.

And to move on, to connect digital technologies to the larger political context. You share about the deployment of digital coercion, which you say refers to "the various processes facilitated by digital technologies that greatly enable American rule."

Can you elaborate more on what this coercion means and has looked like in practice that we might recognize and why you would emphasize that the internet has been weaponized and that it should be understood as a human rights issue?

Scott Timcke: Yeah. So let's talk about the digital coercion part of it, then move to human rights. One of the things that I don't think enough American scholars talk about is the imperial components of American communication technologies, because of the country's ascendancy after the Second World War—and even before that, in Latin America, the Caribbean, and Central America.

The United States is an imperial power. People don't want to talk about it in those terms, but it clearly is an empire.

It has territories that it occupies. It has provinces. It has all of the mechanisms and apparatuses of imperial power. It sets the conditions upon which people can engage and shape the world. So think about South Africa, for example, where I’m from: in order for me to get a bank account, I need to be able to show the bank my sources of income—where I got the money from. In South Africa, this is called FICA; it's a set of financial disclosure forms.

The reason that South African banks have this ecosystem is because of anti-money laundering regulations passed in the United States. To simplify a more complicated story: almost all banks work with banks in the United States, and they use the SWIFT system to allow transfers and exchanges between banks. Because those transactions pass through servers located in the United States, they fall under American jurisdiction. So American lawmakers have said, well, in order to counter drug-money laundering and terrorist financing, you have to be able to disclose and show your sources of income so that we know where your funds come from.

So in places like South Africa, I have to show my banking details and my letters of employment, and if I get gifts from friends or family, I have to show that to the bank. So South Africa doesn't have full economic sovereignty, because of the type of coercive power the United States has through its laws about technology, banking, and so on and so forth.

There are lots of these little things that you can see the world over that have come to greatly shape people's life experience. So to go back to places like Kenya and South Africa—places I'm thinking of working on at the moment—there’s this big talk about financial exclusion and trying to create more financial inclusion, to get people to become banked.

Typically, people use their cell phone accounts in order to transfer money. The reason they use their cell phone accounts is that they can't get bank accounts, because they work in the informal sector. They can't provide bureaucratic letters and documentation of where they live, the sources of their income, and so on and so forth. So the vast majority of African people can't get banked because of the economic conditions in which they live—and because of rules made in the U.S. Congress. So you now have people trying to find workarounds for this; they simply try to work around the dictates of imperial power. Those are things that affect global banking.

But to talk about things a little more vicious than that, we can think about how drone strikes rely on this global network where people working in Arizona use video feeds from drones over Syria, Iraq, and Afghanistan to strike people. We've just recently had a report out in the New York Times about how some of these units are not following the rule of law and aren't even following the rules of engagement, to the extent that they are claiming offensive actions as defensive actions, thus allowing massive civilian casualties. So we start to see the types of digital coercion that have the ability to shape, and sometimes even end, people's lives the world over because of the communicative power, the digital power, that the United States has.

Kamea Chayne: This may be a more introductory question, but how exactly do data collection, information technologies, and platforms—which a lot of people are increasingly aware of and concerned about—relate to surveillance technology and state surveillance, especially as you note that surveillance tech disproportionately targets the most vulnerable? What are the relationships between these tech companies, for example, and the state that people should be aware of? How are they weaponized against dissidents or the most marginalized communities?

Scott Timcke: Well, to go back to the idea of drone strikes: during the Obama administration, the government put together a program that targeted people in Yemen, Pakistan, Iraq, and Afghanistan based upon the types of profiles and signatures that they had. These were called signature strikes. In this case, they often had humans involved in the process and in the kill chain.

But depending upon the social media sites they had access to—the types of things that Edward Snowden pointed out, like scraping social media sites and other nonpublic databases that had otherwise collected data on people—they were able to ascertain whether to call in a drone strike on someone based upon the data profile that person had.

So these are things that are very nefarious. It's a very telling example of how the incidental data that we produce just in the process of living our lives may sometimes be used, at the very sharp end, to justify death from above. But these things also occur in places that are a little bit less visible, too. Think about how often these stories appear in American news about how someone is visited by the FBI, for example, because they were looking up different types of rice cookers, and rice cookers are now seen as ways to make bombs. So you then get a visit from the FBI to say, ‘Well, what are you trying to do over here?’

We have all of these types of routine surveillance that are built into people's lives—in most cases, as a normalized, accepted, natural part of American life.

At the level of policing in the city, police now routinely have types of technologies—stingrays, with which they can intercept people's cell phone services, or technologies that automatically read people's license plates. So simply moving through public space now puts you at risk of being stopped by the police. And we know how frequently routine traffic stops turn into things that are much, much worse.

When you constantly have this active surveillance done by technologies without much human intervention—humans have just put the systems into operation—you very much have experimentation being done on people around the world in how the United States rules them, often without democratic oversight.

Again, just to go back to the point about the police: so many police forces in the United States are now almost private companies owned by municipalities, and because of the corporate relationship between the municipality and the police force, it becomes very difficult to have democratic oversight over the police as a corporation. So in many cases, you now have private police forces without the ability of democratically elected officials to even curtail their surveillance practices. They almost become a force unto themselves.

Kamea Chayne: I hear this question a lot, which is people saying, “If I'm not doing anything wrong, why should I or other people be concerned about surveillance?” How would you address or answer that sort of question?

Scott Timcke: I mean, I think that way of thinking is foolish in many ways. First of all, [it’s] short-sighted, because all of these surveillance practices are backward-looking. You never know who's going to be looking backward, what they'll try to go and find, and how they can connect the dots in ways that suit them.

All data analysis is framing, in one way, shape, or form: What do you include and exclude to ultimately suit a narrative? Despite what people who study STEM say, quite a lot of data analysis is storytelling by a different name. What types of data are you collecting, organizing, and arranging, and for what purpose?

So you might believe that you're not doing anything nefarious now, but the things you do may, in five years’ time, be the signals for something much more. Go back to the idea of the rice cooker: you bought a rice cooker today, but in five years’ time, someone may say, well, he was already collecting the parts to make a bomb in 2021, right? So the idea that ‘I've got nothing to lose’ or ‘I'm not doing anything wrong’ doesn't really take into account what the future may hold and how things may turn out differently from expectations.

The other thing—beyond it being short-sighted—is that it doesn't really matter whether you're doing anything wrong, either. You have a fundamental human right to privacy, and one should never allow one's human rights to be compromised, regardless of whether you're doing something nefarious or not.

As soon as you start to concede, “Oh, my human rights can be violated,” it allows other things to be compromised, and you have this slow boiling-frog situation. I could never quite understand why people are prepared to simply shrug and say, “Well, I'm not concerned about my human rights being violated at all. It's fine. I'm not doing anything wrong.” The point is that someone else is doing something wrong—and you're not acknowledging that.

Kamea Chayne: Yeah, I think this becomes especially concerning when we think about, for example, land defenders or environmental activists or social justice activists or dissidents who may appear to be a threat to the state or to some major corporation with a lot of influence on the government. If these people are constantly being watched, then even if they're not doing anything wrong in the moment, whatever data was collected about these people from their past could just somehow be framed or put together, as you mentioned, as a justification to incarcerate or arrest people.

Scott Timcke: Yeah, exactly. One of the things that was very shocking to me about the United States is that whenever, typically, a Black man gets gunned down by the police, the local media station always finds the worst proverbial “thug shot” of the person possible. Everyone has photos of themselves doing ridiculous things, performing, mocking, or playing up one aspect of themselves. And now that selective representation gets shown at someone else's discretion to further their narrative. I don't think people are thinking sufficiently about this, as you say.

Kamea Chayne: Well, at a more global level, you explore "the role of military power and the digital components of imperialism that protect resource extraction or the creation of surpluses."

As a show on sustainability, we have listeners who are aware of a lot of the forces driving extractivism and the exploitation of lands and labor. But we haven't explicitly talked about how this is enabled by militarism in conjunction with digital technologies. So I would love it if you could illustrate for us how this has played out to facilitate dispossession and the expansion of this extractivist logic, and if you have any specific examples you can share.

Scott Timcke: Well, let's start with an example from Niger. It's a very complicated thing, so I have to try to simplify it as much as possible—forgive me if anyone listening is a Niger expert. One of the things about Niger is that it has a lot of uranium deposits and other types of minerals that are very useful for high-end technological components.

The thing about Niger, though, is that it's a political and economic black hole. It's very difficult to govern. There's a lot of anti-state sentiment, and the state has a very weak status—it largely exists because of the military partnerships that the U.S. military has with the state of Niger. The U.S. military is the main diplomatic entity working in that region, propping up the training of Nigerien forces and the administration of the state, so that the state is able to mostly command its capital and then try to put down land riots by people who are resisting the state, because the state is simply allowing foreign companies, particularly French companies, to come in and take resources out, often without good practices. So you have lots of petrochemical spills and lots of degradation of land.

So there's a lot of protest against this, expressed in anti-dispossession movements. But of course, because they're Muslim, they're counted as Islamic terrorists, and sometimes the vernacular in which they express their frustrations doesn't take on the best form—or at least, from our vantage point, doesn't really take on the best form.

So you start to have this deep American securitization of the region to ensure that resources can continue to be extracted. Periodically, these land revolts will get Nigerien forces embroiled in conflicts.

As an example, a couple of years ago, four American servicemen died in a clash in Niger, and it made Sunday news headlines in the U.S. asking, ‘What are our special forces troops doing in this region?’ But ultimately, they're there to help the state, the Niger state, preserve enough rule, or be coherent enough, just to be able to allow French firms to extract the minerals required to build high-end technological goods.

You now start to see these larger global commodity chains behind the things that we use on a day-to-day basis in order to speak, communicate, and go about our routine business. So it all comes back to the dispossession of the people whose land these mines are on, and these things become very, very complicated very, very quickly.

Kamea Chayne: Even with all of these forces, something that you note is that maintaining the status quo requires more than military force, favorable laws, coercion, and legitimation. You say, "A dangerous, impoverished, exploited and oppressed urban class requires the development of a system of beliefs with several mechanisms to get the subjects themselves to justify the prevailing social inequality and social order."

What are some of those beliefs you see that are prevalent among everyday people that may be upholding this prevailing order of injustice? And where do you see the cracks coming from that might help people to learn alternate narratives and possibilities?

Scott Timcke:

That quote is trying to speak to just how expensive it is to maintain capitalism, to put it a different way, right?

We see the sheer billions of dollars spent on military forces. We see the billions of dollars spent on media. We see all of these things trying to shape people—to convince them that their life is very good. But if you have a look at political sentiment in the United States, the vast majority of people don't really believe that their lives are in a good place right now.

So it's not simply the costs required to maintain stabilized capitalist rule in the United States and elsewhere. You also need ways in which people can understand themselves as benefiting from all of this expenditure. In other words, they need to have a subjective comprehension of themselves as benefiting from, and not being opposed to, all of that expenditure.

This is what scholars often talk about as neoliberal identity politics, or the neoliberalization of the person. I would agree for the most part with those arguments. The difficulty, I would say, is that there are only so many people who think of themselves as neoliberal subjects. Those are, for the most part, people who benefit from the system, and the vast majority of people in the United States don't benefit from it. I think that's one of the reasons why, after the 2008 Great Recession, there was such a wildfire embrace of the 99 percent versus one percent rhetoric.

It became very salient in the United States, and you also had some of these astroturfed ‘Don’t tread on me’ types who, despite being organized by the Koch brothers, still very much have genuine grievances that they're attaching to those politics, right? So I think there is an incredible amount of grievance in the United States that the neoliberalization of the person—getting people to affiliate with the status quo as the way things ought to be—simply isn't able to fully absorb.

I think one of the frustrations that the Democrats have at the moment with the Biden administration—from my reading of the tea leaves—is a great worry that he's not able to keep the neoliberal order in check. So there's a degree to which there's a reform element trying to increase state spending. At the same time, though, those efforts have now been defeated within Congress; at least at the time of recording, it looks like Build Back Better is not going to come about.

So you have a lot of self-sabotage within the Democratic Party, because you have this intense conflict within the American ruling class, at least on the Democratic side, about whether to reform or double down. People like Sinema and Manchin are, on the whole, saying let's double down and pretend that everything's okay. Then you have others—the Bernie Sanders types, Elizabeth Warren—who are very much saying we need to reform American capitalism very, very quickly; otherwise, the entire game stops and we're going to be left without chairs, to use the metaphor of musical chairs.

Kamea Chayne: So do you see that people are increasingly recognizing that they're not, in fact, benefiting from the system, and so that sort of narrative has been false?

Scott Timcke: I think there are a couple of layers, again, over here. One is the view that if you gobble up everything on MSNBC, things are going well—and empirically, they are, if you think about how jobless claims are down. But at another level, people don't really understand that and aren't able to connect it to policy goals.

They see that gas prices are going higher, and they blame Biden for it all. So you have, on the one hand, this policy framework, and then people themselves may be getting raises at work, but they don't really see that as connected to Biden's agenda.

They are more likely to say, ‘Oh, well, that's my own individual negotiation skills. I'm a good worker, or I'm a good manager. I've been able to bargain for this based upon my own capabilities.’ It's a very individuated explanation for why they are now doing well. So again, this is one of the things about the alienated American experience: all the good things that happen in life come from me, and all the bad things that happen in life come from the government. And those things [feed that alienation] too.

Getting back to the point that you raised about grievances and the like: I think if you look at the poll numbers, the vast majority of Americans are deeply unsatisfied with the way things are. They have different ways of explaining it. They have different scapegoats. But all in all, there is a sense that something is not right and something's got to give. I think that's sometimes the underlying current to all those articles about ‘Is the United States going to enter a civil war?’ It's just like, what's going to give? Something's going to have to give.

It deeply worries me that the tone of those arguments is all ‘we can only resolve this through conflict,’ rather than recognizing that it's the way we do capitalism that is creating all this dissatisfaction and grievance.

You can be anti-capitalist and still recognize that there are other ways of doing capitalism that are not as intense and as harmful as the type that's been practiced in the United States.

Kamea Chayne: And finally, it was interesting for me to read when you wrote: "American imperialism is the net result of politics, policies, corporate actions, and trade relations, the nurturing of local collaborators in dependent societies, and fiscal instruments to complement security forces seeking to ensure that there are no insurmountable barriers to capital accumulation."

We recognize that diversity lends itself to resilience in any ecosystem. This is sort of what I see here: a diversity of players and forces and institutions has been set up to create the resilience of this dominant extractive system. That, in a sense, is biomimicry, but not really for the best purpose or for our collective well-being. And maybe this is why it's felt so difficult for people wanting to drive systemic change at all levels to achieve those ends—because we are up against a very resilient system that's been very much diversified.

What more would you add to this and how would you invite our listeners to begin to think about what it might mean to weaken this resilience so we may be able to rebuild the resilience of life itself and our regenerative capacities to recreate community and abundance and more genuine relationships and our well-being?

Scott Timcke: I mean, that's a great question. I think there are a couple of things. The first is that you also have to recognize the sheer fragility of the system. A strong system doesn't need all of the things that you've just described in order to be successful. You only need the proverbial watchman if people will come in to steal, right? Maybe that's not the right metaphor, but you only need the security in place if you think you're being threatened. So all of this densification occurs because there's a degree to which people in power know that they can be tossed out very, very quickly.

Think about the revolutions that occurred in the late 18th century and early 19th century, and how quickly life changed overnight. I’m not saying here that we need to think about revolution as in going to war with the state—the conditions are very, very different, and the types of arms are certainly very different. But I do think that we need to fight a little bit. We need to fight a bit more strategically, and with different types of weapons, in order to bring about the very types of things that you said.

One of the things that I think is very profitable at the moment is community engagement—building coalitions of the concerned at the community level, wherever those communities happen to be, and using those communities and those coalitions to start to drive local projects, to demonstrate that these types of projects can be very, very successful. This is where I think some Marxists who think in big macro-sociological forms are maybe a little bit dismissive of the community gardeners’ association.

But I think there's a lot of value in those types of organizations for building skills, expertise, and experience. A community gardener trying to get a community garden, for example, gets to learn about the local municipal regulations governing urban space, and that teaches you how to navigate through the systems and find ways to bend them to your advantage. One of the great things about bureaucracies—and I'm not a big fan of bureaucracies to begin with; I'm not going to get started on a rant because otherwise this will be unproductive—is that you can use them to your advantage. If there are so many rules, and you learn how those rules work, you can bend them to your will.

So I would encourage people to get involved in local communities, get involved in local municipalities and get familiar with regulations and use them to their advantage.

If you catch people off guard, you can achieve quite a bit in a short amount of time.

*** CLOSING ***

Kamea Chayne: What's an impactful publication you follow or a book that's been really profound for you?

Scott Timcke: The one thing I would say is that I've learned an incredible amount from Black Marxists, particularly from the Global South and particularly from the Caribbean. So I'd encourage anyone to go read Stuart Hall, Sylvia Wynter, and C. L. R. James—they're all fantastic, deep, wonderful thinkers.

Kamea Chayne: What are some mottos, mantras, or practices you engage with to stay grounded?

Scott Timcke: I love the quote from Raymond Williams where he says: "To be truly radical is to make hope possible rather than despair convincing." I think that's really what we need to do as young activists: give people a reason to come out and join your cause.

Kamea Chayne: And what are some of your biggest sources of inspiration right now?

Scott Timcke: This is going to sound incredibly silly, but it's my wife. I am enamored with just how strong her intellect is and how purposefully she works on her projects. It also makes me feel ashamed that I'm not able to match her, but that's something I have to deal with by myself.

Kamea Chayne: Scott. Thank you so much for joining me today. It's been a really enriching conversation that I'm looking forward to listening to again. So thank you. What final words of wisdom do you have for us as green dreamers?

Scott Timcke: It's possible, I think, that just with less work than you think, you can make the world a great place.

 
Kamea Chayne is a creative, writer, and the host of Green Dreamer Podcast.
