Transcript
Summary:
Madyx and Rauriki trace back the genesis of Whanake Foundry and touch on some of the key elements along the way.
Podcast 1
[00:00:00]
Rauriki: So this is our, our first,
Madyx: yeah.
Rauriki: our first session and we are just gonna get into it. Why are we doing this? Whanake Foundry?
Madyx: Yeah. Well, I mean, I guess looking back is probably a good starting point, and then we'll get to something interesting. I think a lot of it was Game B; it was a big spark for me. It articulated a bunch of stuff in a really clear way that I had seen or felt or thought, and then it gave me a really clear message that you can't just be tweaking the edges on this thing.
Rauriki: I think you shared that. Um, I think you shared the first video of Game B, which was about finding the others and, um,
Madyx: Oh, that was such a,
Rauriki: no, that was wicked. What was in Game B connected all the different concepts that we thought were important but weren't seeing. And also, it was objective enough that, like,[00:01:00] I could see a Māori worldview tie into it quite easily. Um,
Madyx: Hmm.
Rauriki: it wasn't something I was seeing in Maori operational structures. Um,
Madyx: Yeah.
Rauriki: I thought, oh man, that'd be, that'd be wicked, like Game B. Um, when you introduced that, that was cool.
Madyx: Yeah. So, so I was working on an ecopark project for, for a local council in Aotearoa (New Zealand). And you came on as a contractor, as a Mātauranga Māori specialist. I think.
Rauriki: Yeah.
Madyx: And that project, we tried to be really aspirational. This is my memory. We tried to be as aspirational as we could inside of a framework.
Like you and I both came into a project that was already going and we tried to be as aspirational as we could. And even that level, which we'd probably look back on now and think wasn't that aspirational, [00:02:00] seemed like too big of a stretch for the big funders in central government. And so we didn't get the funding to do that.
And then I think, just during the reflection period after that, it was like, well, that didn't feel that aspirational to you or me. It felt cool and I would've been happy if it got funded. But it was hard to even communicate or convince people that we should be doing this, and it didn't feel aspirational then.
You know what I mean? Like it's crazy to put all that work and effort into something that doesn't feel super aspirational. I think then we were like, well, obviously Game B gave us even more of a framework to wanna go further, but I feel like then we started reflecting on what we should be doing.
Okay, I get it. Game A is a terrible, self-limiting thing and Game B would be way more fun [00:03:00] for everyone to play. But then it's like, how do we contribute to that and what actually shifts things? And the big thing for me was that a lot of the time and effort and priority had been put on facts and figures and presenting those in a really professional way to decision makers.
But when you reflected on it, it didn't seem like facts and figures were actually driving any political capital. You know what I mean? It was all emotional and network based, relational capital and emotional drivers, and that's when I think we were like, what if we just make things that are impactful to people emotionally,
Rauriki: Mm.
Madyx: rather than just another like, uh, position paper.
Rauriki: Or another metrics case.
Madyx: Yeah. Well, and then I think that's when we started thinking about like, how do we, how do [00:04:00] we, how do we like impact people in a more meaningful way? And most people aren't gonna ever read a position paper, and if the position papers aren't even being successful with central government or other big funders, then it just seemed like a crazy waste of effort.
Rauriki: I think the other thing was, when we were in the government space, the turnover of staff and the political transitions across different governments of the day. Whoever was in power could pretty much call the shots. And so that type of thinking was never safe in that structure,
Madyx: That's true.
Rauriki: there were some champions supporting us and supporting the kaupapa and what we, what we were creating. It wasn't, we talked, I think you, you shared one time like the, about the continuity of the consciousness for kaupapa. Like how would we make sure that this thread and this [00:05:00] continuity of thinking would keep, going and people could come in and come and go, but something was, was there was some Yeah, well, continuity, something holding it all together. Uh, and yeah, we, we had a discussion around maybe out holding it outside of government. it's probably the best thing.
Madyx: Yeah. And then that, I think that was really validated when we spoke to Mark and he shared the story of basically them doing, trying to do the same thing,
Rauriki: Yeah.
Madyx: in the seventies or whenever it was, maybe not that long ago, but a while back, right? Like, yeah, that's cool. And he was really excited about what we were talking about.
And then it reinforces that one of the things Game A does really well is leave a lot of people out there on their own. And I think that's why finding the others is such a good component of Game [00:06:00] B, something important that they've flagged, because basically he had done that thing, he had done a lot of the work and thinking that we're doing now, and then he ran out of steam and that progress wasn't preserved so someone else could keep working on it. Right. And that's when we started talking about going from Eco Park to, okay, well then we need these different components. We need an Eco Wa where we can have this repository of knowledge.
We talked about monasteries serving that function in the dark ages like preserving writing and, and knowledge.
Rauriki: And then I think we talked about whare wānanga, which is a Māori concept of, I suppose, houses of traditional learning. And um, yeah, we knew Eco Park itself was a physical space where people would come together and collaborate, and that shared similarities with a pā, which is like a Māori settlement. Um, so the Eco Park [00:07:00] subtly pivoted to Eco Pā, which wasn't a big change in the word, but
Madyx: Yeah.
Rauriki: it carried a lot. I suppose it maintained the original thought, but built on it. And then there's the thinking space, and that could potentially be a digital space to complement that physical space, eh.
Madyx: Yeah, a hundred percent. So we started with the park and then we started realizing we need these different components. So there was a shift in how we wanted to build support for change. We were like, all right, whether it's Game B or any other framework, it's just a way of thinking about needing a completely new thing.
I think that's the main thing about Game B, and it articulates things really well. There might be other frameworks out there, but that's the big thing: this current thing is not good and we need a completely new thing. Um, and then we started talking about the different elements that would need [00:08:00] to be built, like a physical shared space, which is really important in real meatspace.
And there's all these important things about that, just for human nature, but also in Māori culture, in a place-based culture like that, you can't ignore a physical connection to a real place, a real land. Um,
And then we started adding in the other two.
Rauriki: and then Eco Ha was the people element. 'Cause when the original Eco Park proposal didn't get funded, it essentially didn't have the people who supported it or could champion the concepts or prioritize it. Um, so we knew that was a strategic element that needed to be sorted.
What is the network that would sit on these nodes, these digital
Rauriki: physical nodes of the, [00:09:00] um, one thing you said, but I remember looking at some solarpunk stuff, and there was some content saying solarpunk is misrepresented when you just see all these new buildings, because solarpunk should be grown within these derelict old spaces that are forgotten and rebuilt. So the Eco Pā concept, the Eco Park, shouldn't be like a flash new building. In terms of solarpunk, as I was reading it, it should be a building that's been forgotten about, or an old space or an old forgotten part of the city that's revived and has life breathed into it. Um, so that was a cool concept to hear. The physical space isn't new, but the thinking within that space is. Um.
Madyx: Yeah. [00:10:00] Which is really like a regenerative approach, right? Not just generative; it's regenerating within an existing scaffolding that's degraded or inappropriate, and then something really beautiful springs up from that spot. But yeah, so I think there were a few shifts, and all of these things are interesting on their own, but the most interesting thing is probably synthesizing things together that aren't normally put together.
Not that it's never been done, but you just don't regularly get exposed to combinations in interesting ways. And I think that's when we started to find things that really motivated us to keep exploring or to spend effort to build something, when we were like, oh man, if this thing goes together with this thing in a really interesting way, and provides some novelty that at least we hadn't been seeing out there.
[00:11:00] We started talking about people like E.O. Wilson and others who were talking about the value of synthesizing, the value of...
Rauriki: and then we started talking about permaculture. And then, uh, I think we started talking about syntropic agroforestry as a concept. And then you talked about function stacking and how these permaculture systems all have multipurpose functions that work together. Um. And then there was that concept of syntropic agroforestry, syntropy being the opposite of entropy. Entropy starts from structure and disperses outward into disorder, whereas syntropy starts from multiple parts and coalesces into one more complex structure, all these different things synthesized. And that's [00:12:00] that approach.
Madyx: True. So I guess what I was describing is like a syntropic approach. Yeah. And then I think the other aspect, which sort of happened organically, was we were like, well, cultural elements are what drive widespread change. Because we put together a bunch of really good, logical figures on why government should fund Eco Park, right? Like, look at all these jobs, look at all this advanced manufacturing reasoning, look at the future of the economy, look at demographics and which ethnicities are gonna make up the bulk of the working population.
All these reasons why you should invest here, in these areas, and you'll get a return. And it's all laid out in figures. And ultimately, I think we tried twice, and I know it's not a massive sample size, but I think it's enough to be like, all right, that didn't [00:13:00] work, that doesn't guarantee anything, right?
But it takes a lot of energy and it's not that exciting. So we were like, what if we could start making elements of this new culture that we found aspirational, and that can be used to excite people, and they just get on board for the excitement of this thing rather than the logical data for this thing.
You know, it's not an either-or; it's never an either-or. But we felt there wasn't this pull from something exciting. It was just like, here's the logical reasoning why you should support this, but there wasn't an emotive pull towards, yes, I really want to get there.
And so we were like, well, what if we can start to pay an artist to generate some basic concept art, right? Because that was the thing: we were like, maybe we can get some funding, and you and I can synergize some really cool concepts intellectually, and then an artist can start producing some concept art [00:14:00] that could maybe start exciting people, and we could use that as a tool to explore and gain
Rauriki: Because people,
Madyx: Right?
Rauriki: that people didn't really understand the vision either, the wider vision. And
Madyx: Yes. Yeah.
Rauriki: we had to backtrack. You had to start from: what does the vision look like? What does this aspirational future look like? And then backtrack: okay, if you want that kind of thing,
what do we have to put in now? And then we recognized, well, what do the checkpoints have to look like from then all the way back to now, or even to next year. And that's where imagery around solarpunk-type things came in. Um, Studio Ghibli, you know, the type of futures that they projected. But I think we started not taking it at face value and asking, okay, well, in those futures, who holds the power? What does day-to-day life look like? What are the different social structures, where do the resources come from, and is that [00:15:00] something we wanna work towards? And what would it have to be now?
And if somebody understood that, somebody who's giving funding or who's in a decision-making space, they may be better informed to make decisions and understand what this potentially unpopular view of what things should be would look like.
Madyx: Yeah, bro. A hundred percent. And we also were thinking about Afrofuturism, you know, and there was elements of, you know,
Rauriki: Yeah, Wakanda, eh. Wakanda was like a mix, um, you know, 'cause I'm a Māori guy and you're an American guy, so it's like this indigenous future with technology and all those things put together. Um, and I think the indigenous aspect was an approach to articulate our relationship with our [00:16:00] planet and our environment that wasn't based on utility value alone, but more on us as humans and our connection to our ecosystems. That was articulated within Te Ao Māori, within our Māori frameworks, but then more broadly within an animistic type of framework. Um, and I think that's when we started trying to add that into our syntropic list of things to coalesce.
Madyx: I think you also touched on something important that led us to where we are today with Whanake Foundry, which was we weren't doing a good enough job showing or explaining, in whatever way, why what we were saying was cool to the decision makers. Right. I mean, I guess in some sense you could say it's [00:17:00] possible that it doesn't matter how good what you pitch to them is, if there's a political reason they're gonna support X over Y; it's possible that it doesn't matter.
But in a circumstance where we assume they're actually being a good actor and they're not predetermined on what they fund, assuming they're just looking at it and going, oh, this is cool, um, then we didn't do a good enough job exciting them and explaining it. And it was really hard; we'd get feedback like, I don't actually get this element or this element.
And people had to have so much intellectual background: oh, well, I have to understand a lot of Te Ao Māori elements and why that's relevant to advanced manufacturing. They have to understand sociology and economics and why capital is shifting; they had to understand all these components to get why what we were saying was valuable.
Rauriki: The difficulty was we had to understand them as well, and we had to synthesize it. And what was that E.O. Wilson [00:18:00] quote? It's something like,
Madyx: Like, we're drowning in knowledge, but starving for wisdom or something. Is that the one?
Rauriki: I think the value will be placed on people who are able to synthesize information into something.
And we were still exploring; we still had all the pieces scattered on our table and we had to figure out how they worked together. And it was a bit too early to put that forward, and we knew it still had to have a lot of work.
Madyx: that's definitely true. And I think that's just where we were with our journey of synthesizing these things we're working with, but also being brought into the middle of a project. And at the end of the day, you know, there's a lot of frustrations with that, with the system, but it did lead to an opportunity for us to do something cool that was much more of a blank slate.
Like, well, what should we do? What would be cool? And I think, I mean obviously it's not a massive thing yet, but [00:19:00] it's, it is something, and I think when we were talking about it, it was just a complete pipe dream. So we were just like being really, honest about what would be cool. Like we weren't self-editing, like, oh, that's crazy, right?
We were just like, well, it'll be amazing to go deep, deep into a really exciting aspiration in a future which merges like these really cool concepts, which are not necessarily play space and really amazing wisdom of Te Ao Maori and, and an indigenous worldview. And also what if we could use like technology in a way that was empowering and avoided a lot of the frustrations of bureaucracy and the slowness and let us sort of explore and iterate these ideas in a way that was more easy to share, um, with people.
And I think what you said was really powerful around, it was hard to explain, but it's been a lot easier to show people, you know what I mean? Like once we, so we said, let's, let's create some scenarios ourself, and then let's pay [00:20:00] some artists to create concept art. And then that was like, not the beginning, but early-ish.
That was about a year ago. I think if I checked our Midjourney account, it was November of 2022 when we started messing around, and we were like, maybe we can get some crude images through generative AI that we can then give to professional artists to create concept art. And through us trying to generate solarpunk, animist, Game B future images, we eventually found that we might be able to generate interesting enough images without paying someone.
And that unlocked so much iteration, right?
Rauriki: Yeah. Because we didn't really know what we wanted. The one thing missing in our approach was a clear brief for an artist, so that if we had a finite budget we could give it to them and say, this is exactly what we want: we want this, this, and this. But we didn't really know what that was.
We were [00:21:00] still exploring. So what Midjourney helped us do was it allowed us to explore visually as well as conceptually, and that was cool. And a few things we recognized: one of the things it couldn't do was create abstract content very well, like Māori designs and patterns. So anytime we asked for that, it would obviously fail. So we saw that's probably where, if we were gonna truly represent any cultural content, we'd have to get an artisan.
Madyx: Um, yeah, the AI version of art and culture is very different from the actual thing, but yeah. It allowed us to explore, which we're still doing now; we're setting up now to be able to explore and synthesize and create things of value and then share them back out into this wider ecosystem.
Rauriki: Going back to our initial problem of [00:22:00] inspiring people to understand the bigger picture, now we had this immediate tool, first in Midjourney and then in Runway to animate it, which came later down the track. But now all this eruption of AI text-to-image and text-to-video technology has given us the ability to create a solution to that problem of bringing people over the line from Game A to Game B thinking. Um, so yeah, it's been
Madyx: And uh, and a few times we would tell someone what we're up to and you know, we'd talk about it for an hour and then at the end I, you know, I'd be like, oh, I'll show you some of the like, sort of images that we're messing with. And then in like 10 seconds they'd be like, oh, I get it. And, and in one case, you know, I showed it to someone and he was like, oh, I like, I just look at this image and I know that X, Y, and Z have to be a part of their society.
Like he, they were able to draw so [00:23:00] much from just looking at these images. Oh, I know that there has to be certain respect and relationship and connection and, and harmony between this society and, and the natural world because of what I'm seeing. I know that there has to be certain, like societal equity, like they were pulling out so many things just from an image or two and they're like, can I get those images?
I wanna look at them, they're so inspirational, I wanna have 'em on my background. You know, it's not that I'm trying to hype up our skills. I think it's more just that people are really desperate for that type of thing, aspirational, hopeful visions of the future. And it showed how powerful images were.
Rauriki: Yeah. And the thing is, I think our skill is in the synthesizing and in exploring all these different concepts. Our skill isn't in turning them into a visual medium, but this tool allowed us to. Like, bro, you're the one who's been doing the prompting, creating all these images, [00:24:00] and every time an image is created, we were like, oh man, what is the story behind this?
What does society have to look like there, and what does it have to look like between now and then? Um,
Madyx: Yeah. And that was a big thing, being able to stretch ourselves to explore as far out as we could mentally, and then find an anchor point in the future that was attractive, right, and would draw people. And I think this became our plan over time: if we can find an anchor point in the future that's really exciting and will pull people, then it's a much easier pitch to say, here's the roadmap of whatever, four or five stages, to go from where we are today to there.
And so, fund us for the first part of stage one, you know, having a coherent pathway to get there. But I think [00:25:00] maybe the most important thing is having something that's attractive at the end of it, which I don't think many other solutions provide.
Rauriki: I think one other important part was, once we got support from the team in government and council to seed this idea, um, we asked ourselves what type of thing would hold this thinking, and what type of structure would align most with the future being created. And straight off we were like, oh man, we can't just be two guys sitting at the top of a pyramid holding all the intellectual property and creating artificial scarcity around this content so that people can't be a part of it. We really leaned into Game B, into open source, even into [00:26:00] whether we could have a cooperative structure where everyone is a part of decision making and operations, versus a company that works for shareholders external to the operations, with people who are just workers. Because in these futures we were creating, it wasn't gonna be wage labor. You know, these images don't look like people slaving away for their corporate overlords. Um.
Madyx: Yeah, a wage-slave society.
Rauriki: It doesn't look like that. So how would we, within our current context and what we're able to do, best represent that and navigate that? So I think that's when we started leaning into blockchain, that's been part of our thinking, and a DAO concept, having some kind of decentralized authority where decisions could be made transparently, funds could be held on the blockchain so that [00:27:00] people could access them and everything could be seen, trying to remove the central points of power so that people could collectively work and achieve the Game B principles of hyper-transparency and hyper-collaboration, which are wicked. Um, and yeah, so that was another whole thread. We had talked about Game B, then these solarpunk things and these animist, indigenous things, then blockchain, then generative AI. But they were all like, if you look at a forest, it's got all of its layers.
It's got all of its intricacies, it's got the different organisms, and I feel like we were just uncovering all the different parts of a complex thing that all works as one unit. Um,
Madyx: that's true. And we haven't touched on what we've come to believe is the most [00:28:00] foundational layer of what we're after, which is the animism component. Right.
Rauriki: Yeah.
Madyx: We came to say this might be the most important base to, uh, the social technology that we need or the social toolkit that we need.
Rauriki: Because, yeah,
Madyx: it kept [00:29:00] coming back, right? It kept coming back to the, a relational aspect,
Rauriki: Interesting. I heard, and I always remember this fellow, what's his name, I think it was Johan Rockström. He developed the planetary boundaries stuff; he's a scientist. And that's what Doughnut Economics leaned on, the planetary boundaries.
So, you know, a lot of thinking around that. And he says we need to stop thinking about our planet as a series of levers and metrics that we have to keep between these minimum thresholds and so on and so forth. Um, he shared an analogy. It was like, when you have a child, you don't say, I'd better feed you otherwise you'll die.
You don't say that. You say, I'll feed you, I'll care for you, because I love you. And that's a different relationship. So when I think about Te Ao Māori, that's your relationship with your environment. You [00:30:00] think, how can I respect and how can I give back to my ancestors, whether they be waterways or mountains? And that Te Ao Māori worldview is a subset of this wider animist worldview of seeing our planet as a living thing and having an actual relationship with it. Because we do have that relationship, we depend on this ecosystem, but our society operates like we don't. We just extract, externalize, and then all the issues are pushed out onto the global south or the less fortunate. So that's why I think animism came in, because it really challenged the conventional thinking and it brought in a different perspective that's not often talked about.
And I think because it's really associated with religion, [00:31:00] indigenous worldviews get clumped in, like, oh, these are "other" religions. But, you know, paganism shared those views of a living world and everything being animated, and of venerating and honoring their ecosystems, their waterways. And so those were some cool links.
Madyx: And it's also funny that panpsychism is now having a bit of an explosion of popularity, which is basically animism. It's a way of describing animism: that all parts of the cosmos, of the universe, have a level of consciousness.
And that's a primary force. So it's funny that, you know, we're in a place now where a lot of western sciences are coming to that.
Rauriki: Catching up with it, and sometimes claiming it, you know. So it's cool. I always think it's funny, me and you talking, because you've got this global perspective and I've got this local perspective, but we can both share an animist perspective, [00:32:00] a way of looking at the world.
Madyx: Yeah. True.
Rauriki: Whereas mine is a specific subset, to Aotearoa and to being Māori. But there are elements that are shared with indigenous cultures and even non-indigenous people around the world. The details of it, you know, you could go down a rabbit hole there potentially. Um, but I think there's some power in this as a concept to connect people, this animist way of looking at things and being part of something, alongside the long-standing beliefs and worldviews of indigenous peoples. So it's like a dance. Um. That's why we haven't framed this as being Māori, 'cause it's not Māori; it's more that in New Zealand it becomes [00:33:00] Māori, but it might be something else elsewhere. And we were thinking, you talked about the other day, like, once we become an interplanetary species, what does that look like?
Madyx: We were taking this opportunity where the thing we were working on was coming to an end, and we were just thinking really blue sky. And I think we were doing something really important on almost every aspect, which is to go back to first principles and not build on any assumptions that you haven't explored thoroughly yourself.
And it's so easy to just take a few layers of things for granted and never actually work back to the core. And I'm sure we're doing that now, it'd be arrogant to think we're not, but hopefully we peel back more than average. And we were like, what's the very core reason why Game A or status quo
meta-systems are so toxic. And I think that's one of the things that led us to animism. If you peel back to the very core, [00:34:00] it's about having that at the base or not, having a sort of materialist, reductionist base. One of the things we talked about is, if that is truly your worldview and you actually, authentically believe that there's no meaning, no consciousness, no awareness, no animating life force, then why wouldn't you just view everything as a resource to do whatever you feel like doing?
Right? If it's not a living thing, it's not a relationship, it's just some dead matter in this forest, some unaware matter, then there's no moral or ethical reason I shouldn't exploit it to do whatever thing I feel like doing. And so we were like, you know, that's just appropriate.
We were like, Game A is just behaving appropriately based on that assumption. And anything that you scaffold over that base, you'll have this underlying force of returning back to that way of operating. Whereas, as you sort of outlined, from a Māori worldview it's a [00:35:00] relationship to things.
And if it's just a bunch of lifeless matter, it's not a relational system. Right. And I think that's why we were like, can animism be a founding social technology that we could build a stack on, you know, a social tech stack?
Rauriki: Yeah, I was reading somewhere that a worldview sets the boundaries of what is permissible to do to your ecosystem, what kind of actions you can take on it. Um, if you don't have a relationship, if you just have that materialist worldview of resources, you can do whatever, 'cause it's just inert. But if it's living, or if you have a relationship to it, or if it's divine or spiritual, whatever way you frame it up, if it is something more than its physical characteristics, [00:36:00] then you will treat it differently. And if that's at your foundation, then you're gonna have different results, different approaches to how you go about things.
Madyx: And I think that's why it became one of the three cores, right? We were like, okay, that's the base we can build off of. And it also connected to some practical elements: we're operating in Aotearoa New Zealand, we're doing work here. I think one of the things you reminded me of was a story you told about someone, maybe an English guy, but it doesn't really matter, someone who moved here and was really inspired by and
saw value in, or connected to, a lot of Māori elements. And he was excited about the prospect, and in a really genuine way, as I understand it, thought maybe if I do enough of these acts I can become Māori. But we had this chat about how that's [00:37:00] not really feasible, but there might be something that both Māori and non-Māori can become or can support or belong to.
And that's, I think one of the really powerful things about that, that you touched on briefly about animism, right? Is that we can all have a pathway to sharing that.
Rauriki: Yeah, that can umbrella everyone. Whereas when it becomes about specific knowledge sets or specific indigenous ways of being and stories, there's a whole lot of other criteria you've gotta meet to do that. You know, like, I was thinking his kids could be Māori.
If he married a Māori, then they would have a genealogical whakapapa linkage. In our worldview, genealogy is probably one of the prime concepts that could connect you back to our deities. And if you have that in one way or another, then you [00:38:00] have that standing to say, I am Māori, I have the whakapapa there. Um, but without that it becomes, oh, well, you can't come in and appropriate this knowledge. But this knowledge set can be explained as a wider concept of animism, which is shared with a lot of indigenous peoples, and you can be an animist too. I wouldn't say, you know, I am a First Nations person or I'm an Aboriginal person, but I could say that we all share similar principles, and how that realizes itself within our environment, in our instance of animism, is Māori, or is all the other indigenous peoples around the world.
Madyx: And I think what we saw as powerful about that is, if you set the initial conditions, if you have a few core foundations, then you can create [00:39:00] the conditions of emergence for a positive sort of Game B future. But if you don't have the core foundations being appropriate, then it doesn't work. It's like, you know, permaculture always talks about how there's a certain set of conditions in any piece of land.
You know, the temperature, the moisture content, the soil makeup, and those are gonna drive the living system towards certain land types, right? If you have enough moisture and enough soil and enough warmth, then it's always gonna push towards a forest.
It's always gonna push towards the most complex type of land cover, the most complex type of ecosystem. And you can input energy and effort to change that, but whether you're cutting down trees or suppressing grass or whatever external energy you're putting in, as soon as you stop that input, it reverts back towards the path where the natural system wants [00:40:00] to take it.
And I think that's exactly the societal thing, right? If we have at the core that these are dead systems, then no matter how many enlightened or more mindful things come along and say, let's put that on top, you should actually respect indigenous people, or you should respect each other, or we should add in this, right?
Like, it just reverts back to type. As soon as you remove that particular person or group that's applying that positive force, it reverts back. And so we have to set the core conditions right in an appropriate manner. And then you don't have to input energy.
Rauriki: Exactly. We had that base stack of animism. We were pretty clear: okay, that's probably what our foundational element has to be. If we whittle it all the way down, those are probably the values that have to be a part of it. And then on top of that was, okay, how do you operationalize that? What does that look like?
That's where our Game B concepts came in, 'cause they're already, [00:41:00] oh no, I don't wanna use the word anti, but animism is not the standard way of operating, just as Game B isn't the standard way of operating. So they kind of work well together, 'cause animism doesn't fit in the current context of resource extraction. And within Te Ao Māori, if you ask, okay, if I'm Māori and I have an animist Māori worldview, but then I have a business that pollutes the river, how does that align? Is there some kind of alignment? Potentially, but that business thrives in a Game A environment. A Game B environment would have a different type of relationship if it's built on top of the animist stack. And so we kind of have some order. And then what followed was, [00:42:00] I think, we brought in solarpunk, because it was the most accessible visual concept to explain what that looks like. But if you look at solarpunk only, you could assume that it could be a big multinational; Amazon could be running all of the beautiful buildings. Built on top of Game B and animism, you have a whole different type of world.
Madyx: yeah. And solarpunk was cool because it was a starting point. It has a visual style, a visual language; it has cultural artifacts that have been created by people who love that potential for the future. And so it was a starting point where we could start to visually explore what things would look like.
Um, and we started talking about the vision cube and exploring different domains of life in this aspirational, [00:43:00] attractive future. But yeah, like you said, solarpunk is cool because of all the elements we've talked about, like regeneration and beauty in a degraded place, which is obviously appropriate today.
Um, the solar aspect, the primary consideration being the sun, obviously has a lot of meaning on a lot of different levels. And then the punk aspect, again, we've had to make sure that we were embodying that in a reasonable way. You know, we've, you know, um.
Rauriki: Because that is very anti-establishment, the punk aspect. It's saying what's set up now isn't working. And to change that, you're gonna encounter resistance, resistance to change. So then you are gonna be labeled [00:44:00] as, um, a resistor.
Madyx: Yeah. But I almost feel like solarpunk is a bridging mechanism to get to something, right? It's not the thing, but it's a good starting point down a journey, you know? And I think that we don't want to get pulled into being in opposition, 'cause that's a very effective Game A
Rauriki: True, true.
Madyx: mechanism to entangle all these little seedlings of hope and aspiration; they get all their energy sucked out becoming opposition to Game A.
Which is why I think one of the things about our approach is trying to just recognize that for what it is. You know, Game B has a framework where there are sort of three places you can put your energy: triaging all the messed-up stuff that Game A is doing now, or building bridges or transitionary structures, or building fully proper Game B things.
[00:45:00] And I think you and I recognize that we probably can't even conceptualize properly full Game B stuff; anything we're conceptualizing is probably somewhere in between A and B.
Rauriki: Yeah. 'Cause it'd be too difficult to fully go over to Game B, because we just don't have the conditions set for that kind of structure to flourish, but we can push towards it. Um, do you think we should talk about, I remember us looking at Wakanda and thinking, oh, that's got animism, that's a different structure, it's kind of Game B, it's kind of solarpunk. But at the core of Wakanda was vibranium, and vibranium is like this tech MacGuffin, you know, that allows the civilization to flourish through technology. So it becomes like a techno fix, it's presented as a fix, whereas we were really trying to [00:46:00] lean towards a social fix.
Um. That was one of the concepts we had. And then I think we looked at the Kardashev scale of advancement for civilizations. It was pretty much based around technology: how advanced your technology and energy extraction was determined how advanced you were as a society.
But what would a complementary scale of social advancement look like? And that's where we wanted a social scale.
Madyx: Well, I mean, I think, like E.O. Wilson and I'm sure many other people point out, we seem to be mostly just applying intellect, and we're creating a lot of really cool techno trinkets, but everyone's satisfaction with life, the rates of depression and anxiety, the rates of chronic illness, all these things which are, I think, for most people pretty well accepted and [00:47:00] understood now, the trendlines for those things seem to be going up in tandem with the trendlines for physical technology.
Um, and we seem to be applying the same type of reasoning, sense making, and solution making that made the tech in order to allegedly solve the negative consequences. Right. And we don't seem to be honouring or holding a place of true impact for the power of wisdom, right? To guide these things and make them actually positive for the actors in the system: the humans, the animals, the trees, the rivers, the mountains.
And I think that's really, that has motivated us to talk about, that's a problem. We keep applying, like intellectual [00:48:00] thinking to solve a problem that's like a wisdom problem, right? And then we're confused why we don't seem to make progress on that front.
Rauriki: Should we talk about what our kaupapa is, you know, I suppose with the name? 'Cause once we had all these conversations, then we were like, oh right, we gotta make something. What should we call it? Um, and then we landed on Whanake Foundry.
Madyx: Yeah. Well, all of that is a synthesis of a lot of time spent exploring what's useful for us to do with our time, and what could be something that we could be happy about and proud of, that we actually thought could have some chance of contributing to a positive future.
And yeah, so that's how we got to Whanake Foundry. We got a little bit of standup funding to start trying to make this thing a reality, trying to make it something that can be a place of [00:49:00] continuation of this type of thinking, whether it's us or new people, trying to just create conditions for the emergence of what we need to emerge, but not to be overly prescriptive.
Um, and yeah, so Whanake Foundry came about representing the sort of blend of an American guy and a Māori guy collaborating on something, and having an English and a Māori word. Do you wanna talk about why those words are meaningful?
Rauriki: Whanake is a word that means to rise up; usually we use it for develop, or just emergence. Um, and you know, foundry we leaned on because a foundry is where things are made. And so we were like, that's the thing, it's where things are made, but how do we take these concepts and turn them into physical [00:50:00] spaces and digital tools as well?
So, having something to provide. Um, yeah, the foundry is where it all kind of comes together. Um,
Madyx: Making real, real things. Just, you know, one last thing on this for me, but I think you've got people like Elon Musk, whether you like him or not, who feels he has this mission or obligation to solve a bunch of technical problems. And I think if you have a social infrastructure that can operate and utilize those things and distribute them in a wise way, that's really powerful.
But it's like you've got those people applying a lot of intellect and brilliance in certain domains to solving really critical problems, but what you don't see is a wisdom counterpart to Musk, right? And there will be, as far as brilliance and capability, there'll [00:51:00] be countless people out there.
But the resources are not being directed that way, because resources are controlled by Game A, so you generate them by being a Game A player that plays Game A well. Right? And so, I don't know, it's a very asymmetrical type of warfare. We're not going to have the traditional financial resources that someone like Musk can apply, to put towards comparable wisdom solutions.
So it's very asymmetric. It's like a little guerrilla force
Rauriki: Yeah.
Madyx: um, vs a main army. We have to use very asymmetric techniques, which I think is why we keep leaning into things that are emotive or connect with people at a deep level. Um, yeah, I think that's what Whanake Foundry is striving to do.
You and I are just putting a little [00:52:00] effort in. We've learned so much from these things we've talked about and been inspired by, and we're trying to create our own little artifacts of that culture, which hopefully can have an asymmetric impact and connect with people at a really deep level. People recognize that there are things missing, and I think if we start to show them some possibilities that are grounded in a relationship to a living cosmos, that's really meaningful, and that could be asymmetric.
Those are, I think, the types of things that we want to explore and articulate and utilize.
Rauriki: And then just create content around it, create stuff. You know, 'cause that's what we encountered. We were like, oh man, we've got these concepts; is there anywhere that's doing this, that's bringing them all together? And no was the answer. So we thought, we have to create this, we have to explore this, and we have to share this journey.
So Whanake is a [00:53:00] space for us to explore: what does the future state look like? What kind of imagery evokes emotions, and what kind of imagery captures that essence of what that future state should be? What does the meantime have to look like?
What does the structure look like now? And how can we best represent that future in something we can do today? What are our learnings? Um, and then the other thing is, how can we use AI to scale our ability as a small operation? How can we use that to do what a bigger entity could do, or achieve the same things with less? 'Cause that's the asymmetry again: we've got less to work with, but how can we do just as much, I suppose, without sacrificing anything in [00:54:00] our wellbeing and stuff. Um,
Madyx: But the goal is that we can make something light and lean, which takes minimal input away from the main thing we wanna do, which is to explore and synthesize and share this inspiring vision of the future that can motivate people to invest towards that pathway. You know, not for us to spend our time on all the operational things.
And then our hope, fully supporting the hyper-collaborative and hyper-transparent, open-source ethic, is to explore this blueprint for ourselves in this space and just share: hey, this is something we've done and it's led to us being able to spend more of our energy in productive areas.
And feel free to use whatever parts of this are relevant for you and contribute that out into this wider ecosystem.
Here, how can we apply these technologies, but ground them in [00:55:00] these principles? So yeah, we can dig more into how we're using generative AI, how we're using automation, a lot of AI tools and automation, on top of what we just talked about, which is more this sort of intellectual framework, right?
Like, and then there's the practical: how do you make that a reality in operating a small sort
Rauriki: and how
Madyx: of not-for-profit, right?
Rauriki: other, right? And how do they support each other? Because without those tools, we would be time-restricted and we wouldn't be able to do the other wisdom-based exercises. So yeah, it's trying to figure out that balance. And this has just been a sense-making approach, just trying to figure it all out, what's gonna work. We found ourselves in some other rabbit hole, like, man, we actually just wanna do this, but we've gotta set up all of this other peripheral stuff, because if we got to a point where we set up a structure and it's just like a traditional company, it just wouldn't really [00:56:00] represent the kind of Game B principles.
And, um, yeah. So it's cool. We have those three key pillars of animism, Game B and solarpunk and, hey,
Madyx: Yeah, and I think our, our, our hope is that we can make something that's just like a little plugin that you might share and open source, right? And anyone can just take that element and just slot it into their thing, and they don't have to do the sort of exhaustive R and D, right? Like, I think that's our hope that, that through the investment of our time and energy, we can create something that's really like plug and play for especially anyone else here that wants to do something similar and they can just take that and just start from that point and progress.
And we just keep improving the ability for people to not restart. Right. And if we could you know, make a little self-contained thing and just put that out there.
Rauriki: the ability for people to create nodes on, on this network and for this network to hold our thinking.
[00:57:00] For people to, to jump on board and to, to see this as a, as a viable and feasible alternative.
This, this type of approach.
Madyx: yeah.
Rauriki: The ideal would be, you know, there would be an Eco Pā, and it would be an established point, and all the businesses within there, all the entities, whatever they were, would be these types of businesses. And it would be collaborative, collective, and all these other things we still have to flesh out. And it would be complemented by the digital space, a network of people across the world. So, yeah. Hey, small ask,
Madyx: Yeah. And I think there's quite a few things that we would like to spend time exploring, probably on the podcast and in written content. Certainly there are quite a few threads we would like to explore via a podcast, so that if it is valuable or useful to anyone else [00:58:00] that's out there, and we can at least have a give and take with all the others that are sharing so much brilliance on YouTube and podcasts.
But, you know, for example, we wanna deep dive into The Network State and this concept of: is that a way you could scale a Game B society? There's some really amazing stuff out there, and that'll all be the sort of thing that hopefully we'll explore and flesh out. Um, and I'd also just want to acknowledge what you said and be really clear that we are at a really formational, early stage of this endeavor.
Right? And we are not claiming or pretending to have any complete answers or solutions. We're in a completely exploratory phase, right? We sort of have these conditions where we have a little bit of initial opportunity to spend some of our time and energy, but we're really just exploring. We're not pretending to have any final, complete answers. We're not experts. We don't have any final products, but we [00:59:00] do, I think, have some threads worth following, and some ways of operating, looking at them, and exploring them, that'll hopefully be useful and contribute something back out to the wider community at large.
Rauriki: Wicked. Um, what is this, a podcast? Like, first podcast done, or first... I don't know what we're calling this. This is just our, our, uh,
Madyx: Yeah.
Rauriki: I don't know.
Madyx: Yeah. It's hard to imagine that anyone would take value from or be interested in what we just did. But I guess that's the thing, you never know, right? I guess that's the most, like, um,
Rauriki: Just,
Madyx: it would be silly to assume anyone would find value or interest in it, but a lot of the things that turn out to be valuable and interesting, probably those people didn't expect they would be.
And if you don't share them, then that potential is not out there. I think that's what's really cool, [01:00:00] right? Like, the wise approach is to know that we have no idea; that's the only wise approach, that we can't see all ends. We couldn't predict what is of value, and if we do this exercise of saying, this is of value, this isn't of value,
that's a very Game A approach, or it's arrogant to say, this does have value. But I think the wise approach is just to say, who knows what has value. But if you don't put it out there, then the potential for someone to take something and make a great thing, or to be inspired, never exists. So I guess we'll just go through this exercise of sharing at the rawest level, right?
We're just gonna share. We're gonna try to embody the principles of open sourcing and hyper-collaborative working and just share anything of value that we find back out, and who knows what happens.
Rauriki: Cool, bro. I think that's us.
Madyx: The end. Podcast one.