Transcript
Sociocrafting The Future Ep2: The Base of The Social Stack
[00:00:00]
Madyx: Now that we've started sharing it, I've had to start trying to explain what our mission is a lot,
Rauriki: Yeah.
Madyx: and that's led me to realize I need to be a lot more proficient at sharing what we're doing and why it's so valuable.
You know, especially to people that are just coming at it cold and don't have any background. And so that brought me to wānanga, to drill down on the social foundation aspect: why the basis of that tech stack is animism, like why that is the most important thing.
So that's one, just to really drill into that. And then two, um, I was thinking about process, and explaining game A versus game B process. And what made me think about it was: we're in a game A system, we're trying to do game B [00:01:00] thinking, especially in this environment where we're sort of in the not-for-profit, think tank space.
A lot of people are gonna start with the game A mindset, and they default to expecting game A process. But if we are obligated to follow game A process, it'll be impossible to deliver game B results. And so that's where I thought we really need to refine how we articulate that, especially to, like, a funder. You know, game A sort of mandates quantification of everything,
um, timeframes and predetermined outcomes. But the more I reflected on that, the more I realized that those things are all sort of prohibitive to what we need to do. And also, we were really coming at it more from the science and the phenomenon of emergence, right? Thinking, like, this is the thing that will give us the best chance of finding solutions [00:02:00] in game A, you know, to apply in a game B way. Anyways, those are the two: process and animism, um, as they relate to Funke Foundry's mission.
Rauriki: No, look
Madyx: What?
Rauriki: Those are, those are awesome things to dive into. Let's definitely dive in. I'm just thinking, um, just as a side note, that this is like an exploratory podcast type session. It's not really something where we're saying 'this is where our thoughts are'; it's more so this is us thinking about things. 'Cause I have the same challenge articulating our project. You talked about, if we were to explain this to someone else, could we do it in a succinct way? We've dived into so many different conversations. We're in the distillation and synthesizing part now. Yeah, I need to get my message [00:03:00] clear too. So no, I'd be keen to talk about that, even though it's kind of right there in the back of our heads. Um,
Madyx: Yeah.
Rauriki: this is cool to bring it forward.
Madyx: Which one do you wanna start with? Like the process or the animism?
Rauriki: Uh, I like the process one. Well, the process one was talking about game A, game B. Um.
Madyx: Yeah. I guess why I was thinking about it was, in a really practical sense, for anyone, us and also anyone else out there, trying to do game B work: if you receive funding from people in a game A mindset (and I think you could expand this logic to any sort of work, but especially in this context), they're gonna approach it with certain expectations.
And I was just reflecting that those expectations basically preclude you from ever finding game B solutions if you have to adopt a process [00:04:00] that satisfies game A. And so I thought that was an interesting thing to dig into, which would help articulate to people why, hey, this isn't just an arbitrary thing where we've asked for these conditions for the work.
It is actually just part of the process to,
Rauriki: Yeah,
Madyx: come to these solutions.
Rauriki: so some things like, um, how we've got to take minutes, how we've been trying to, um, navigate around wage and, um, time-based remuneration, I suppose. Um, our structure, the funding and the conditions that you have to meet, those kind of things, bro.
Madyx: Yeah, exactly. And I just thought it was interesting to look at it from a process perspective,
Rauriki: Hmm.
Madyx: you know, and that [00:05:00] it's not an either-or. But what I was coming to is, you know, a lot of people will come in and they're like, okay, well, what results can we expect, and in which timeframe? You know, that's a very common thing to want to have assurances of,
Rauriki: yeah.
Madyx: and I think that it's worth exploring.
I'm sure a lot of people have questions around: do those things interfere with setting the conditions for emergence? You know, is emergence the best way to approach it? Okay, let me step back. I think we've both reflected on this and come to the conclusion that we don't have game B answers, and we probably can't even conceptualize them.
So any system of research and exploration in which we have to, like, predetermine even what we're exploring is probably gonna be off course right away.
Rauriki: Yes.
Madyx: So [00:06:00] it's like, how can you have a process in which you can eventually find something? I think if you are forced, through funding or work processes or any obligations, to predetermine the destination of your research or your exploration (and exploration is basically just another way of framing research), if you're forced to pre-pick a destination, then all of the chance discoveries are almost completely precluded right away.
Rauriki: Yeah. That's what happened with us, eh, in our time in Auckland Council with the Eco Park. The funding conditions were set because the government's idea had already been set, their idea of the technologies had already been set, and the funding criteria for what type of initiatives would get supported were [00:07:00] already set.
So they were setting boundaries around the types of initiatives they would imagine, reflective of the policies. And so that put us in sort of a box from the get-go. And so we were trying to design something that met the funding, and then the funding changed. I remember it changed from, I think it was construction waste at the start, and then it changed to organic waste or something; that was for the, um, I think waste minimization type approach.
Madyx: And there was advanced manufacturing in the middle.
Rauriki: And then there was, like, advanced manufacturing in the middle. Um, and then we were having a conversation around: we should probably set our own idea of what we think is worth funding, because if we [00:08:00] keep working towards the government's policies and strategies and plans, it's always gonna be changing. And then we were given some kind of free license to dream and explore what it would look like. And that's where these ideas started to flourish, when we had that creative license to imagine, and the support to do so from the rest of our team. Um, yeah. But it's interesting how we went from trying to meet the conditions that were set within game A to, well, let's imagine what we could create without those conditions.
Madyx: Yeah, and it led to something that seemed a lot more impactful to people right away. Like, we're still in the early phases of developing whatever this new [00:09:00] project is with Funke Foundry, but it already seems like it resonates way stronger with people, even though we still need to refine it. As opposed to: we had a lot of, like, you know, senior professional people putting together pitches to central government, and I know that's competitive, and we weren't in the room, so maybe it did resonate and there was some other reason why it didn't get funding.
But as far as what we saw, that didn't really grab people. And then we started doing something, like you said, when there was a bit of a pause, a bit of space where there wasn't a certain obligation. So we were like, well, let's just do research while we have this space here and we don't have clear direction.
As you said, there was some concept of setting our own direction, but also it was just, you know, a bit of the conditions for emergence, where there was a pause and there wasn't a mandate, right? And so then we started out without a preset destination, [00:10:00] exploring a problem rather than, like, 'well, here's the answer,' you know?
And we started to find interesting seeds, I think, which already resonated with people, because at some level, even if it's subconscious, people know that game A is not serving them or the planet. You know what I mean? And it's not an enjoyable game to play. So anyways, yeah, I think you're right.
It started, so we found something quickly when we didn't have that prescriptive approach that resonated with people.
Rauriki: And then I think the challenge was, we went from fully defined by preexisting conditions that had been set, to why don't we just imagine something and create all these possibilities. And the challenge was, what does the middle look like? How do we then apply that type of visioning in this context? Because some of the interesting things we started to [00:11:00] come across were: if we were to create an entity to do this, we'd need to get funding. If we were to get funding, then under game A we would introduce hierarchy, 'cause we'd need directors, we'd need a governance board of directors, and then you'd have workers. And that would insinuate that workers do everything and don't benefit from the work; under game A, the shareholders in the business would benefit. And then we started thinking, well, could it be a cooperative, but also could it have a charitable purpose? Because we want it to be purpose-driven, in sort of that respect. So we started looking at: what would a workforce that's a cooperative, so they're part of the decision-making process, look like? And what would it [00:12:00] look like if their time was valued at just a flat retainer rate rather than hourly? Because as soon as you start getting into hours, then, yeah, it comes up. I think we had a whole wellbeing conversation around how hourly wages affect your overall wellbeing, and how a decrease in hours doesn't necessarily reflect a decrease in productivity. I think you were sharing some things about a four-day or three-day work week, um, but still getting everything done that you need to do.
Madyx: Yeah, I think, no, go ahead bro.
Rauriki: Nah. The tricky part is, how do you do that in a game A space, in a way that is acceptable to people who have funding?
Madyx: Yeah, it's really hard to implement game B processes in game A, which is why I think (something we'll get to later when we start to refine the roadmap) you have to just create game B nodes, because you can't really do it halfway, like you said. Take this concept of not trading time for a wage: if you implement it, there are certain types of job functions which are hard to transition over, um, to a sort of living retainer type system or something more
Rauriki: Hmm.
Madyx: enlightened. And then that becomes an equity problem within your organization: okay, well, these people are now in better conditions, and these others we can't transition. Let's say someone who has to be at a desk just has to be physically present for a certain amount of hours, right?
Rauriki: I like
Madyx: So that's why I think, you know, you don't try to reform the system; you hospice the system. That'll be a different podcast, and I think [00:14:00] really worth it, but that's our philosophy, our approach: you hospice game A, you don't reform it.
I think that's really important. And you touched on a lot of interesting things. Um, one of them I was thinking about is, you know, it's not one process for the whole journey. There are different conditions that are conducive to different phases, and it's just that we're in an early phase of research, discovery and, um, deep insight gathering, and maybe moving towards synthesizing some of that.
And the conditions for that type of work need to be really open. Once that stuff's refined and you're moving towards developing a roadmap and implementing phase one of this roadmap to a game B world, to an animist, game B, solar punk world, then you probably can start applying more: [00:15:00] what's the timeframe you're shooting for?
What are you building? What outcomes? That can be helpful towards implementation. But I don't know if it's just the ecosystem that we're working in; there doesn't seem to be a high level of understanding that you have to apply different conditions to different phases of this type of work.
You know, it's not an either-or. It's not 'no, you can never have quantification and timelines'; you certainly can for certain types of work, but we're not there yet. And I just think we need to help support people in developing more nuance in their understanding of, you know, how funding or work conditions can preclude certain outcomes.
It's that you need to first understand what phase of the work you're in, and then apply the right conditions to that.
Rauriki: Mm.
Madyx: And you also made me think of something when we were talking about, um, [00:16:00] the solar punk aspect, but I won't take us off track. It was like, some of the things you're talking about, you could drift towards. I think some of the initial punk was a lot about fighting against the current system, the oppressive system.
And it's almost like we want neo solar punk or something different, where it's like, we wanna ignore the current system and start building game B. So there are these little tweaks, I think, that are learned along the way. You know, we don't want to fight it.
Rauriki: That's why I like that word 'hospice,' because it's about the slow, uh, respectful passing of game A and all its different components, and then replacing that with solar punk, game B, animist-type new-game principles. I just looked at the principles for game B, bro, and I put them in the chat here.
I was just wondering if we should just go through them and reflect on them, maybe, if [00:17:00] it's useful. I'm not too sure, um, on them and how (I don't wanna use the word 'conflict,' maybe that's quite strong) how they're conflicting with our current setup, like, um.
Madyx: Is it in the Discord or
Rauriki: It's in the, I've got a chat in
Madyx: Oh, yeah.
Rauriki: here. Can you see the chat? It's
Madyx: Oh yeah, I think so. Yeah. Yeah.
Rauriki: hold on,
Madyx: Um,
Rauriki: I'll, uh, run it through ChatGPT and make it a bulleted list, because it's all just block text at the moment.
Madyx: Yeah, that's a wall of text. While you're doing that, um, one last thing to wrap up. The last part was, um, I think, when we're looking at conditions for the work to occur, and for anyone out there, not just Funke Foundry, but anyone trying to do aspirational game B type work, um, and funders, so that [00:18:00] whole ecosystem, enlightened funders, all these people, um, I think it's important.
It's really important that that ecosystem has an approach where, sorry, they fund an ecosystem broadly and know, and are happy with, a lot of those things failing. I think that's a really important condition which isn't present. You know, you've gotta find people, and when you go back and you look at, like, the development of more hard tech (these are not social tech, but I think the process is similar),
a lot of the people at Xerox PARC that pioneered the graphical user interface and a lot of the things that unlocked the modern technological age, and a lot of the physicists that developed the physics breakthroughs in the forties and fifties, they always talk about these blue-sky conditions, where the funders just knew: hey, these are brilliant people, and we know they're motivated to find something. [00:19:00]
And then they just had conditions where the funders were like: we don't understand these guys, but we know that they're the type of guys we need, and we don't know what conditions they need, so let's let them create their own little ecosystem. Let's just check in periodically, and as soon as they have something, they'll give it to us.
But that's basically the extent. And that's been completely lost across the board. I've heard scientists, I've heard physicists, I've heard a range of people talk about this. That's why, in the seventies, we sort of hit a wall of innovation, where we were mostly just innovating on stuff that was developed before that and refining it, but there weren't a lot of radical breakthroughs.
So anyways, I think these are all conditions for our work and the ecosystem, right? Ecosystem conditions.
Rauriki: I keep thinking about risk. Like, a funder now puts in those conditions because of the risk of failure, um, and the [00:20:00] impact of failure is more significant now, I think. The golden age of capitalism was, like (oh, I just Googled it) 1945 to the sixties. That was when, you know, you could make it off one income, and all those, um, narratives came out, because there was, I imagine, less risk. So people could afford to say, try this or do this. There's plenty of oil that we can burn, and there's plenty of, um, ecosystems we can, you know, destroy; that's a future problem. Whereas we are dealing with a lot of those things now. So I think we are dealing with a funding ecosystem that is risk-averse, that doesn't have the flexibility of the forties to the sixties, when people were just given room to make things.
A lot of it was off the back of government [00:21:00] funding, and then, um, a lot of those things later became privatized. So that's another tricky part: game A will try and keep itself alive, and for it to do that, it needs to minimize any chance of dying, or any chance of introducing anything else that may impact it. So that's the other challenge I think we're working within now. There have to be some bold funders that will trust, uh, not blindly, but have faith
Madyx: Um,
Rauriki: in the process of exploration and sense-making, to create something that's completely different from now,
Madyx: yeah.
Rauriki: and doesn't sound like government.
It doesn't sound like, uh, corporate. So then, you know, we're going into philanthropic territory, and then it's gonna be, you know, just less [00:22:00] and less opportunity.
Madyx: Yeah,
Rauriki: But
Madyx: I, that's,
Rauriki: Yeah. That's what I was thinking as you were sharing, bro.
Madyx: no, I like that. And it makes me think about conditions again, for this work to occur in. In the, you know, '45 to '70 era, I think one of the key aspects of those conditions was, uh, at least a mindset or a belief of abundance, right? Like, there was abundance on many levels, and that allowed people to have trust or put faith in: let's spend some resource on this,
let's spend some resource on that. And maybe there was less of a fear-based scarcity drive of, well, what if one of these things fails, right?
Rauriki: Hmm.
Madyx: And I think our challenge is, we don't have the golden age of capitalism abundance. But this is my approach, and I'll be interested to see what you think.
I would rather... and this [00:23:00] wraps back to an even more fundamental building block of how your society approaches things, and why animism is important, but
Rauriki: Hmm.
Madyx: I think (this is one of the tough things, you have to step back sometimes and set some groundwork for what you're explaining) but I just wanted to quickly say: I think in an animistic world, death is not the enemy.
You don't contort your whole society around the avoidance of death. But in a materialist worldview, if that's the base building block of your society, like we have now in game A, then you do dedicate all resources in existence to the avoidance of death. And that might sound like it's not connected, but to me it connects with this idea that it's better to do something that upholds the values that are important and fail, than to abandon your values for a chance of success.
That's, I think, my ethical view of it, and I think that's a more [00:24:00] animistic view, in that it's a living cosmos. We don't know what will happen, but we don't contort all the resources of society towards the avoidance of death; it's a natural part of the cycle, right? And so, if you have that worldview, you're like: look, we wanna have a flourishing humanity.
We want to continue this project. But is it worth just continuing any project, even civilization, if that process is terrible, you know, and soul-destroying? Or is it better to just give it a shot in a way that is values-aligned, and go, well, it's a small chance, but I'd rather take that path? And I think we've avoided that path too many times.
You know?
Rauriki: Hmm.
Madyx: You know, and you could apply that to funding, right? Blue-sky research: we just need to fund things which might lead to something cool, rather than, out of fear of failure, just funding all these, like, sure things that don't really lead to much.
Rauriki: Hmm. The [00:25:00] tricky thing too is the power of decision-making: who gets to make the calls and what type of risks we take. Um, yeah, I'm just thinking, people who set those conditions will have a vested interest in maintaining the status quo of game A. Um, but yeah, no, I do agree with what you called out around animism, and having that foundational stack of animism versus a materialist stack gives you different outcomes, because certain ones will fundamentally conflict with that whakaaro, with that thought. Um, I'm [00:26:00] trying to actively not say 'um' and 'like' and 'bro,' and it's quite difficult. Hey, we'll get better though.
Madyx: Yeah. Um... there you go, there's an 'um.' Yeah, so that is a stack, and that is the order, you know: animism, game B, solar punk. Um, and we've spent a little bit of time talking about conditions, and now a little bit talking about why that foundational block of animism unlocks the conditions to proceed.
And the most foundational block of your civilization tech, or your social tech, has the most influence, and all the conditions above that base block are always referring back to it. So each thing that you add onto the stack always has, like, a magnetic pull from the blocks below it. [00:27:00]
And that's why the order is really important, right?
Rauriki: That's definitely why it's important, and it makes me think about that choice: we are choosing animism as our foundational stack, and that's reflective of indigenous peoples. But then there's so much, uh... like, I'm just thinking of the conflict as well with, uh, non-animist stacks. We've got the materialist stack, and you've also got... an animist stack is, like, a polytheistic stack around the living world. Everything is animated in a certain way, to a certain degree, to certain people, and that manifests itself in their, uh, protocols, their law, their mythology as different gods. [00:28:00] And then we get on the slippery slope of monotheistic worldviews, around: there is only one God, uh, above everything else, and so everything else is irrelevant unless you're sticking to that. Um, and in a polytheistic, animistic worldview, the ecosystems that we live in are the manifestations of these gods, of the way we see everything. And so you are linked physically to everything around you, and you have immediate feedback on how well or how poorly those ecosystems are doing. And so that's another line in the sand: you've got game A, game B, and you've got these polytheistic and monotheistic worldviews, because those will provide different results too. Um.
Madyx: And those conflicts are so foundational when you,
Rauriki: yeah. [00:29:00] Well,
Madyx: know, I think the base block of game A is materialism,
Rauriki: yeah.
Madyx: which in many cases leads to monotheism, you know, and the base block that we're proposing is animism. And so you have a very foundational
Rauriki: Yeah.
Madyx: conflict. But it's really interesting hearing you explore how that can play out, you know?
Rauriki: It's probably 'cause I've got the, um (we don't have to put this in our thing, in our chat) but the Israel-Palestine stuff's just going off, so that's, that is a fundamental conflict of worldviews, based on foundational principles. But I suppose they both have a worldview,
Madyx: Yeah.
Rauriki: so you can still have conflict within
Madyx: Yeah.
Rauriki: the same stack. So, translating that over, I suppose we could still have animism as a foundation, but we'd still have, um, you could still have conflicts. That's why, I suppose,
Madyx: Yeah.
Rauriki: the layers come through. [00:30:00] 'Cause you could have an animist
Madyx: Yeah.
Rauriki: stack, and you could have hierarchy, which
Madyx: Yep.
Rauriki: would be game A.
And so
Madyx: Yep.
Rauriki: game B's gotta be like, hey, why are you choosing hierarchical,
Madyx: Yeah.
Rauriki: um, individual success type pathways when we could do it in a certain way. So, um,
Madyx: And I,
Rauriki: yeah.
Madyx: I think that was sort of your exploration of Wakanda: that it might be animism, then a game A stack. Now, I don't know if people would claim it's not, but it definitely felt a little bit like that, right? Like, maybe their base was more animistic, grounded in a living-cosmos worldview, but then look at what they added on top.
It looked and felt better, and, between materialist game A and animist game A, animist game A is probably a more enjoyable place to live, but it's still a world that has a game A problem. And that's why the stack is really important. And also, when you were talking about [00:31:00] the monotheistic piece, it's really interesting, because I don't think
anyone's gonna argue with you that there are living ecosystems, that people are living, that the landscape is living. But we're sort of forced, in a materialist world, to just say it's a weird anomaly, right? Everything is sort of an anomaly. It's a really tough, um, base to have, because you're forced to hold all these contradictions,
Rauriki: Yeah.
Madyx: you know?
And so they're like: you're living, but the cosmos isn't living, and consciousness is an illusion; it's an epiphenomenon just due to the complexity of the physical structure of your brain. And you're like, okay, so the brain is just sufficiently complex that it leads to a completely meaningless phenomenon of self-awareness and self-reflection and being able to watch the actor, and all these things are just weird epiphenomena.
And then you're like, well, isn't a [00:32:00] galaxy or a galaxy cluster very complex, with interbranching connections like neurons? So, you know, why doesn't it manifest in those systems as well? It's a system of contradictions, I think.
Whereas animism allows you to sit in a much less contradictory worldview,
Rauriki: hmm.
Madyx: and it allows for a much broader range of things that you can add on top. Like you said, you could probably take a monotheistic animistic view, you could take a polytheistic... um,
Rauriki: Animistic
Madyx: it doesn't preclude a lot of stuff, right?
Rauriki: No, I like that, bro. Um, yeah, I like how you've broken that down, because not everything has to be in conflict, but I suppose there are certain parts that create that conflict, and I think the materialistic and the animistic, whoever holds those [00:33:00] worldviews, will conflict to some level. Um, that we live in a materialistic society and economy is the tricky part, and it makes you think, yeah: what does an animistic economy and society look like? And then you've got Wakanda as an example, maybe game A, um, and it's also technocratic, I suppose, based on the technological successes from finding, uh, the MacGuffin of vibranium. Um.
Madyx: So that's an interesting point: say you have an animism base stack and you're building a society. You know, I think the way game B explains it is they basically say that before game A, there was an indigenous sort of base, like a pre-game A.
Rauriki: Mm
Madyx: Um, and then game A got introduced, and that led to hyper-competition, you [00:34:00] know, and got us to where we are now, and we're trying to go to game B.
You could have an aspiration to return to pre-game A, and you could build that off animism, and that's probably a much lower-tech path,
Rauriki: mm.
Madyx: Right? And I don't think you could make an argument like, oh, that's more ethical, or there's a right or wrong. That's just one path you could explore off of an animistic base stack.
Right?
Rauriki: Yeah, pre-game A seems more compatible with game B. And it'd be cool to unpack what an animistic pre-game A and game B would look like.
Madyx: Yeah,
Rauriki: I think this conversation is,
Madyx: true.
Rauriki: It's really about that base stack and the implications of how the base stack informs everything on top of it, and how that informs what type of society and economy, um, communities and everything are created.
Madyx: Bro, that's really important, and I think it [00:35:00] illustrates or explains why we know that this exploration is absolutely essential for our civilization. It's not, like, um, a niche sort of weird academic exploration, right? Like, we see that this is the most important thing to be looking at, um, in a very practical way.
You know, like, this is it. I think it could appear very, I don't know, I dunno what word to use, but irrelevant. But we have seen that this is actually the most important exploration, and there are not many people that have gone back to the first principle, all the way back; like, keep going back, keep going back.
A lot of times you think you've gotten to the first principle, you know, but you haven't. Whether we have or not, you know, remains to be seen. But I think we're closer to the base, the absolute base reality, of our civilization tech, our social tech: materialist versus animist. And what you said is really interesting: [00:36:00] exploring what the different outcomes are off of that base stack. We are proposing, you know, right now, that our understanding is animism, then game B. But it's an interesting way to understand the power of it, looking at different stacks and how they play out.
Rauriki: Through our conversations and explorations, we've identified that the animist stack is more important than the game B stack. That's what we've, um, sort of by default recognized: we have to have that. Um, we've tried to categorize them, and, like, solar punk presents itself as this super amazingly aesthetically pleasing, um, future. But then you can ask and unpack: what system is it? Is it just game A, mate? Does one person own all the flowers and the birds and all the renewable energy and all of that, and everyone just kind of lives whatever? Um, and you could keep coming down. And so animism presents [00:37:00] itself as that fundamental point that will cause a divergence in what type of world you create.
Madyx: A hundred percent. And everyone's gonna have their own personal, uh, aspiration of how that goes, but I think, so far, I believe that everyone could share animism as a really, um...
Rauriki: I.
Madyx: it sounds so silly, but honestly, I think it's just a nice place to inhabit as a base layer. A pleasant, you know, affirming place.
Rauriki: Maybe we need to, like, put a 'neo' on it, like you said before: neo-animism. Because animism is, like, already loaded. Um,
Madyx: Yeah.
Rauriki: animism
Madyx: But
Rauriki: is... yeah, it's loaded, but I think the...
Madyx: I guess it is neo-animist in a way, right?
Rauriki: It is. It's taking certain principles of animism and then applying them to the context of [00:38:00] now, and its implications on what our future states will look like. I think that's a neo-animist approach, or whatever. I don't wanna
Madyx: Yeah.
Rauriki: put 'neo' on front of it and just be, like, neo-everything.
Madyx: Yeah.
Rauriki: But it is, but it is kind of an attempt to, um, work from
Madyx: Yeah.
Rauriki: principles, but not to say we were animist in pre-game A and now we're materialist and that's the next evolution. Ah, no:
Madyx: Yeah.
Rauriki: we need, um... animism is still the base stack, but what does it look like in this context?
Madyx: No, that's really well said. And I was just reflecting on, I have this personal aspiration of seeing an animistic, game B, solar punk, space-faring civilization that takes these beautiful principles and is inhabiting the stars. Like, to me, [00:39:00] that's the most aspirational thing. You know, beautiful life on planets. But also, you could take this same thing and say, no, we just want a subsistence lifestyle, and that's what we build off animism.
Right? That would be more traditional, right?
Rauriki: And that would be, like, animism appropriate for, um, the time when animism was pumping. Um, but because of modern civilization now, our population, our technology, we need to still maintain those principles. But yeah, I'd love to see that as well.
We've got, um, Elon Musk talking about the requirement and the need to be a space-faring civilization, and that this is the time to do it, because otherwise we'll blow ourselves up. Um, and,
Madyx: Yeah.
Rauriki: if we can do it, um, it'll be a big part in our civilization's development,
Madyx: Yeah.
Rauriki: so powerful. And, um, I don't know... I don't wanna use the word 'better,' because that's just my perspective based on the stack I have. But if it was an animist, game B, solar punk, space-faring society, and all of those principles carried through... I like how you said that, bro.
Madyx: First of all, that would just be sweet to see; it would be so cool to explore what that would look like, and that is some of what we're doing at the far end of exploration. We haven't really left Earth yet in our explorations, but that's a cool piece to touch on. But I think what you said about Musk is really interesting, because both him and Funke Foundry, you and I, I think we both share this aspiration of exploring the stars, right?
I think we both... well, cool, we both agree on that, right? And to spread humanity. Now, obviously we haven't spoken to him, but as far as I understand, his approach is in large part driven by avoidance: he wants humanity to be multi-planetary so that there's still a continuation if you have a [00:41:00] catastrophe in one area, right?
So again, I don't know what his worldview is and I don't want to put anything on him, but I think someone with a bit of a materialist base stack could take that approach, because... alright, forget Musk, I don't wanna put it on anyone whose view I don't know, right? But let's just say someone's driven by that.
That could very much be another avoidance of death, right? Like, the main drive is: don't let humanity end. But I think
Rauriki: Mm-Hmm.
Madyx: another approach is like, don't let humanity be bad.
Rauriki: Mm-Hmm.
Madyx: If it ends in that process of not being bad, so be it, right? Those are very different approaches.
Rauriki: And that's a fear-of-death type, war-driven, um, approach. Great innovations have come out of humanity's, or some nations', resistance to death through the development of war technology. So it's still [00:42:00] a scarcity mindset, and I think that's, maybe it's human nature, I don't know, how we respond to crisis events. If there's war, we'll develop this; if there's climate change, we'll try and develop this. But there's no abundance-driven development. And that's what I think animism, or what we're trying to explore, is: what does abundance-driven, animism-driven advancement of our civilization look like?
Madyx: And I think, if people are more comfortable with saying panpsychism, that's another angle you can come at the base stack from. You know, I think the core of it is that there is a life or an awareness present in all of what we understand of existence, right? That's the driving force, you know, [00:43:00]
but I just think that seeing a space-faring civilization out there, like, traveling the stars, but not in this way, in a radically different way, like an animistic one... Um, and I'm sure there's some sci-fi out there that maybe comes close, without really having set out to do this, that has explored different types of space-faring civilizations. And I think it is important to go far out into the future, right?
Rauriki: Yeah.
Madyx: that's not one of our core civilization tech things, but just one of our approaches, right? Building excitement and motivation for exploring this path.
Rauriki: Um, you talked about us, uh, creating kind of react content to different ideas, and I think, um, we could get the premise of some [00:44:00] sci-fi world-building type things and see what kind of worlds they've imagined. We've got, um, well, you've got Avatar, with Pandora and all of that; we've got Wakanda; and then you've got, like, Ready Player One, which is a little bit dystopian; you've got The Matrix. It'd be cool to, um, unpack and see what kind of stacks these are built on, or where they sit on the spectrum, on the scale of doom to abundance, or whatever the scale is gonna be. That would be cool to explore and unpack.
But I think really everything quite clearly comes down to the base stack, and then the nuance comes in the different layering. And I think, uh, you know, science fiction or fiction novels are a great place to visualize or articulate what type of worlds
Madyx: Yeah.
Rauriki: of.
Madyx: [00:45:00] It would be interesting analyzing sci-fi world-building and what their civilizational social stack is,
Rauriki: Mm.
Madyx: and trying to like pick that apart. I think it's,
Rauriki: And then, just back to the start: how we connect all this to the right now,
Madyx: yeah.
Rauriki: and our risk-averse game A funding ecosystem within our little country, to potentially explore all of these things. Yeah, it's the challenges and the tricky nature of going against everything that's been set up now, to explore what could be.
Madyx: Well, I think one of the approaches that we are taking is that there will be a lot of resistance to funding it, but the amount of money needed by outfits like us, or anyone else ('cause it's not just us; we're talking about [00:46:00] building conditions for an ecosystem to emerge, right? So we just wanna be one of many people exploring) is small. We don't need huge sums of resource. We need some, but we don't need vast quantities. And part of our self-development internally is developing ways to operate which aren't degrading.
Like, you can pull back on resource costs by just ripping it out of the wellbeing of the humans or the land, um, but in this case we're doing it by, um, effective ways to work and operate, right? So the ask is not big. And I think it's really interesting, because if you look at the current system... it's tough,
'cause if everyone operated on logic and reason, which they probably think they do, but they don't, they would see it; I don't think it would be a hard pitch. 'Cause you look at, I don't know, your giant corporations, without calling anyone out, or giant government departments: look at their budgets and look at what they've produced.
How much innovation, right? It's almost [00:47:00] zero, right? You know? And look at how much money they have to throw at these things. So there is a desperation, or a desire, of the system to create solutions to things, 'cause they are receiving pressure, because as time goes on, these institutions are less and less effective, and they just keep increasing the resources that they apply to them.
And so we're saying: hey, give us a few hundred thousand, right? That sustains us, you know, a year, and you'll get outcomes; whereas your mega think tanks or the vast policy departments of a government agency probably aren't driving much innovation at all, right? So I think the positive side is: yes, there'll be a lot of resistance, but the quantity of resource needed to sustain game B startups is pretty small.
Rauriki: Yeah, pretty small at the moment when it's in this exploratory phase
Madyx: Yeah.
Rauriki: when we are just trying to scale up the pace and the quantity of [00:48:00] exploration that we can achieve, and, um, trying to link up with other nodes that have been established on this path of conscious future-making, creating game B.
Madyx: Yeah,
Rauriki: Yeah.
Madyx: And coming back to what you said around how we get the funding: that's why we took this approach of just starting to produce things which we think are cool,
Rauriki: Hmm.
Madyx: that emotionally connect with people and grab them. You know, that's our approach. It's not an either-or; we support and we honor, you know, um, other humans out there that are advancing positive futures through
other means, more traditional means of facts and figures and policy documents, and they're all people that mean well. Um, [00:49:00] we just didn't see a lot of people advancing it through more emotive storytelling formats, right? And so that's why we focused our energy on AI tools and traditional approaches, um, to communicate that way.
And I think we're like: if we can hook people, if we can grab 'em, then we can build support. And even if one or two people in game A are like, 'that shit's really cool,' that's enough for now. That's the cool thing, right? We just need a couple of people that see it and just get touched by it and are like, hell yeah, here's some funding.
Right? I think that's the powerful thing: we have many disadvantages operating in game A, but the advantage is that it's so starkly contrasting to most of everything else that's out there. So if someone's got a little bit of spark still alive in their soul, and they see this versus just the ocean of garbage that comes past their desk every day, they're gonna go: holy shit.
Yeah. That's cool, man. [00:50:00] Right? Like it just touches someone a little bit. It's so different. Most people will throw it out, but you know, there's a few
Rauriki: Yeah.
Madyx: everywhere that still have that spark alive, right? And they might be like, fuck yeah, you know.
Rauriki: I don't know what it is, but it is.
Madyx: Well, this is cool, right? This is cool. And I'm sick of all this.
just soul-crushing same stuff, you know? So yeah, I guess it's not easy. It won't be easy. We are trying to grow something within a hostile environment, so we create these little sanctuary nodes and, yeah,
Rauriki: And that's
Madyx: we find.
Rauriki: Funke Foundry, and what we are creating, is a sanctuary for this type of thinking. And within here, we can explore and create and, um, allow people to enter into this space and become enlightened with this type of thinking and how it may be relevant [00:51:00] to them. And Funke Foundry was set up externally of government, and on its own, such that there was a place that maintained this continuity and this level of thinking, so people could always come back to it, and so we could just have a place to explore. And hopefully this is something that other people believe in and wanna support. Nah. Cool.
Cool.
Madyx: That's a good wrap-up. And I think, you know, these podcasts and the website, for now, are a digital space that hopefully people can come to and experience a little bit of that sanctuary through, not precluding any types of thinking. You know,
Rauriki: Hmm.
Madyx: for now, that's a place we can connect and share with people, and we hope to expand out this space of sanctuary.
But right now, you know, we're taking a very biological approach, I guess, in the sense that we wanna create a structure that can go [00:52:00] dormant rather than saying we have to survive at all costs. You know, that's my final,
Rauriki: Otherwise we need external inputs. We don't wanna be,
Madyx: yeah.
Rauriki: just like a monoculture that needs all, you know, external inputs and constant nourishment from things that aren't within its immediate vicinity, and is just trying to survive artificially. Whereas a forest has got all of the complexity that allows it to thrive and survive.
And so how do we set the conditions for creating a forest, so that it becomes a forest almost by itself, and you can just walk in it and be like, wow, this is amazing?
Madyx: Yeah, a hundred percent. Or if there is a drought and things go dormant, it's not destroyed, it's just dormant. It's designed to weather these periods of dormancy. If there is, for whatever reason, a disruption, it doesn't get annihilated; it can remain there in a dormant state. And then when conditions reemerge,
you know, when water comes back to this [00:53:00] ecosystem, all the seed bank in the soil comes back up and you don't start from scratch, right? Yeah, maybe some seed species, maybe some things that needed the complexity don't come back right away. But you don't start from scratch. And that's part of our design, right? This system.