Transcript

Title: "Episode 7: Predigenous"

Description: Welcome to Episode 7 of "Sociocrafting the Future," where we explore novel concepts such as a "Predigenous" society. Join Rauriki and Madyx as they discuss how blending modern tech with timeless wisdom can shape our tomorrow. We're talking about meshing AI, traditional knowledge, and innovative principles to forge communities that are not only sustainable but thriving.

What's Inside:

  • 🕷️ [00:00:00] Kick-off: A Spider Saga (Yes, really!)
  • 🔄 [00:01:00] From Doing to Dreaming: Why it Matters
  • 🤖 [00:03:00] AI's Big Leap: More than Just Bots
  • ✍️ [00:05:00] Creativity Unleashed: AI's Role in Making Stuff
  • 🧠 [00:07:00] Pushing AI Limits: Think Bigger, Better
  • 🚀 [00:10:00] AI at Work: Changing How We Live & Work
  • 🌍 [00:15:00] Introducing "Predigenous": A New Beginning
  • 📚 [00:20:00] Sharing Wisdom: AI Meets Ancestral Knowledge
  • 🛡️ [00:25:00] AI Ethics: Navigating the New World Wisely
  • 🌱 [00:30:00] Crafting Futures: Principles for Tomorrow
  • 🌐 [00:35:00] Building Bridges: The Role of Network States
  • 🎤 [00:40:00] Wrapping Up: What's Next on Our Journey

Dive into a conversation that goes beyond the usual tech talk, offering a fresh take on building a future that honors both our planet and its people. Whether you're a tech enthusiast, an advocate for sustainability, or someone curious about how we can blend the best of both worlds, this episode is for you!

Join us as we tackle groundbreaking ideas that push the envelope of current human discourse, exploring how we can harness the synergy of technology and timeless wisdom to build a future that's not just sustainable but flourishing. This is not just a conversation; it's a call to action for innovators, dreamers, and anyone who believes in the power of collective progress. Subscribe now and be part of this exciting voyage! 🚀🌿

#TechForGood #FutureReady #Innovation #Sustainability #CommunityBuilding

EP 7 - Predigenous
===

[00:00:00]

Rauriki: Episode seven.

Is this episode 7? Ah!

Madyx: I went to grab my headphones, but when I picked them up to get ready, uh, a spider had hatched a whole nest in them, and I pulled it out and had a huge freakout. And then I kept feeling like anything was on my body, and I went outside and set it out and tried to knock it off, and I was like, well, I'm not putting those on for a month.

So,

Rauriki: That's,

Madyx: why I don't have a headset.

Rauriki: That's a fully valid reason.

Madyx: Well sweet bro, should we jump into episode 7?

Rauriki: Yeah bro, we've done a few. We started off with some deep philosophical yarns in the first few episodes, and then we went fully operational with getting Whanake off the ground. And yeah, you were saying the other day, maybe we go back into some, um, [00:01:00] some wicked concepts. Mmm.

Madyx: Like, a balance is good. Even though there's pretty much no one listening, I still think, like, is any of this even remotely interesting? And I think the podcast and sharing it is part of our commitment towards being transparent and sharing the journey early. Um, just in case it is valuable. I guess it's not just doing it to do it; there is some intention and some design behind it, right?

To start early in case later on it becomes valuable to look back. And I think a lot of times it probably won't be, but in the case that something interesting comes out, then it is always so cool to see. So it's almost just like a discipline rather than thinking, oh, this will be useful. It's just like, we should do this if we can, and then we'll have such a better [00:02:00] ability to look back at movements or companies or things.

So yeah, the super operational stuff might not be the most interesting is all I was thinking. Maybe it's not an either-or; we can always do a small update on that thing and then maybe talk about some more conceptual stuff as well, or more content about what Whanake is actually trying to build.

Rauriki: It's really interesting because it makes me think about like, in the operational space we have carbon-based team members, which are us, and then we're leaning into silicon-based team members, which are AI. Um, and everything that's coming out now. And the pace! The pace at which these tools are emerging that are giving us the ability to do things you could never do before. I just saw they launched like [00:03:00] the first fully AI software developer. You just give it a task, and, as they said, it wasn't the first time it's been done where you can give the AI a task and slowly work with it back and forth, but this one just kind of goes off on its own, full buzz, troubleshoots, comes up with solutions, and then you get something that you can run with. So then that kind of comes into a bigger question: what happens to those people's jobs? What's the other side of this disruptive technology, and what are the implications? Um, what are the opportunities here? There are two big things to think about, two sides of the coin.

Madyx: Yeah, the one that I put a link to in our podcast fodder channel was Devin. That was the AI.

Rauriki: That was

Madyx: Yeah.

Rauriki: Like,

Madyx: And I think what's interesting is, I [00:04:00] think we started doing AI image gen about a year ago, or maybe a little bit more now. And it's really interesting, even in that short time frame, to look back. I think there were a lot of people that were like, it's not actually anything substantial, right?

It just looks impressive, but nothing's going to come out of this. And there have been a lot of people saying it's overblown, it's all hype, and I was more undecided back then. It's not like, um, like Ray Kurzweil signing up to the Singularity, you know, signing my soul away to go into a machine, but I've seen in a year how much has already happened and how many practical applications there are already. And it makes me, it's not about being a believer or not, it's just like, I see more and more impact from this stuff, [00:05:00] right?

Like, real impacts already.

Rauriki: Yeah, you would look at the technology as it was and be like, oh well, that's not gonna be useful enough for another 5-10 years, but it was literally 6 months. Um, I think one of the software tools, AutoGen, which is what we were thinking of using for Whanake so that our silicon-based team members can all talk together and write content and create posts, that's fully a possibility with just a few prompts. And now that's gone to Devin. It's crazy.

Madyx: Yeah, yeah, you were doing some, I think it was Python programming and making some of

Rauriki: I fully learned, I fully got the technology to teach me how to code in Python and work with AutoGen and get a browser, like, to go, you know, to do all those things and, um, do research. And now I think Devin [00:06:00] could probably do it in a prompt. I'm thinking, ah, one, what a waste of time, but no, two, what a great learning opportunity, because now you can kind of speak better, um, to the,

Madyx: the progress.

Rauriki: to the, I suppose, to the progress of AI. But also, if you were to work with Devin now, having that insight, I think you still need some, um, understanding of that process so you can ask the right things.

Just like my wife says one of the skills I have is I'm good at Googling. I'm like, I didn't know that was a skill. She's like, you're good at Googling things! Um, so at the very entry level, it's like, how do you work with a search engine so you know how it will find your thing best. And the same will be with prompt engineering, with, um, text-to-image, and I think the same will be with AI agents.

Madyx: Yeah.

Rauriki: And, yeah, hey, you were talking the other day about how you were pushing GPT-4,

Madyx: Yeah. [00:07:00] Yeah, well, it's funny you mentioned that, because I was about to say something on a similar thread that you made me think of. With ChatGPT 4, I was just talking about how I was trying to push it to get away from the derivative responses. They're really good, but it's mostly what I would expect from a competent high schooler that's on track to go to uni.

Like, that's normally the level, like, don't get me wrong, like, sometimes high schoolers are smarter than half the people you meet. It's not, I'm not putting down that, but it's like that level of like, I've read some things, I've synthesized it really well. I've written it out pretty formulaically because you don't have a ton of experience.

And, I mean, maybe that is where large language models are on the development spectrum now. Maybe they are around a high school level. But anyways, I was pushing it, sort of like you were saying, sort of like a sports coach. Like I was Rocky's coach, [00:08:00] and I was using more language around like, I know you can do better, dig deeper, push further.

Like, I was using that type of language and it was really effective. And, um, what you were talking about with prompt engineering, it made me think: that's now, right? We've talked about how much has changed already, and I'm thinking, so where will it go next? It went from coder to prompt writer to prompt engineer.

And will it go to like digital,

Rauriki: motivational.

Madyx: like collaboration specialist, or will it go to, like, you know, AI? Yeah, it'll go to something relational-based. Like, because you won't need to code it, it'll be like, who's really good at relating to digital intelligences? That'll be it, right?

Because already, with Devin and stuff, you can see the writing on the wall. Eventually, when general AIs [00:09:00] can code, it's pretty easy, if they collaborate with us, to ask them to code, right? So now it's more about who relates in the most effective way to these types of intelligence. It'll be like a relationship manager or account manager with these other intelligences, or a diplomat, or

Rauriki: Yeah, like, I saw a video about two types of coaching. One is very clinical, saying, this is what's happening, these are the things you have to do, and this is what you need to do better. And the other one is a real, um, like, we can do it, you can do better, we just gotta dig deep. And so I think, with working with AI, it's already so clinical, it knows exactly what to do, it just needs to be pushed in the right direction. Or maybe it needs a bit of both. You might need to say, instead of doing this, do a bit more of this, and then encourage it: nah, do better than what you're currently doing. I just can't believe that's [00:10:00] a thing you can do now. You can, like, encourage, and get the AI going to push beyond its boundaries,

Madyx: And it is funny, because in some ways it doesn't matter for the user why it's effective. The fact that you can use that type of language to improve results working with these types of intelligence is really interesting. You know what I mean? Whether it's because it's some sort of self-reflective, aware digital intelligence that appreciates being, or is effectively pushed like, a human mind, or because the coding communicates the knowledge correctly.

Like you get to points where like, it doesn't matter a whole lot why it's effective because you're going to start relating people, humans will start relating to it. through how you effectively can work with it, right? And if that's that type of language, it's going to shift it.[00:11:00]

Rauriki: That's crazy, man. You were talking yesterday about the effects that AI will have, in the sense that it gives anyone the ability to scale up their operations or the ability to create content or write. So, eventually, like, when will everything just be all AI, AI,

Madyx: Yeah.

Rauriki: everything?

Madyx: I'll put a link in the chat. It was a Wes Roth video that I shared that, um, was using the forest analogy for, I guess, the digital ecosystem. And there are plenty of people already that sort of advance this idea of the dead internet: that already, because of algorithms and marketing and formulas that are effective at capturing attention, it's effectively dead in the sense that you [00:12:00] don't organically explore it and find and acquire interactions or information, that it's so controlled through those forces already.

And then in his video, what he touched on is that that's sort of limited. People are churning out content, and content is being directed at users in a way that's hyper-limited, possibly already, and compared to a sort of free and open, pre-algorithm internet, it's effectively dead.

But then he's saying, well, we already see how good AI is at generating written content, and audio and video will come soon. In a matter of time, the dark forest, the clear web, the quote-unquote open internet could be so flooded with AI content. Will that make it worse? And I was thinking about this, because the video sort of paints it like it will, and that's definitely the [00:13:00] easiest reaction.

But then you think, the problem isn't that it's AI-generated, the problem is the quality, right? I think, ultimately, right? Like, if something is moving, entertaining, pushes you to grow, helps you be effective in learning something, does it matter if it's human or synthetic? I mean, the algorithms and the attention management and control, that's a problem.

But if that went away and it was just a matter of the quality being put out, maybe it wouldn't matter as much. But yeah, that was just a discussion of, like, we want to use the AI tools to amplify our voice and be light and lean to advance the mission of Whanake Foundry on a, on a small budget. But we don't want to contribute to the pollution of a free and open internet, right?

By adding content that's not worth people encountering.

Rauriki: I think there's a [00:14:00] spectrum then. Cause in the already saturated content space, with everything that's in the algorithm, um, AI will probably just build exactly the same, and that will become fully derivative of itself, the same things being said in those spaces. But in spaces like this, where Whanake is, where no one else really is, but there are aspects across, um, like with Game B, with solarpunk, with animism, AI gives us the ability to derive the key concepts from those whole fields and build something new, at pace and at scale. So I think in the spaces where those limitations prevented the full flourishing of these niche topics, that's where AI will really, um, [00:15:00] excel and be of value. Especially if, yeah, like you said, it has to be quality input. And I think the innovation lies in pulling from things that are already there, really nurturing things that aren't currently there.

I think of, um, like, all of the videos that my daughter watches, and they're all in English. And I'm like, damn, I just want that AI to be able to translate to te reo Māori, so I can,

Madyx: Mmm,

Rauriki: where there are no videos at all in te reo Māori, it can just be creative. Like, just.

Madyx: yeah,

Rauriki: With AI and how it's, and there will be a question of quality. There will be a question of innovation, and then there's probably a side question of the data and how you protect that part. But I see that AI potentially has, um, the ability to create new ecosystems based on what's

Madyx: yeah.[00:16:00]

Rauriki: hmm,

Madyx: I agree. I think it's best to not look at it in, ironically, a binary, black-or-white, all-on-or-all-off way. You know, it's funny saying you need to look at AI and the digital revolution in an analog way, but I think that is important. Like, you've just illustrated a bunch of nuances where it could be really powerfully positive, you know, and so I think it's just a mindfulness thing.

It's changing so much. I was talking to you about it. Uh, ChatGPT 3.5, and, like, trying to feed it podcast transcripts to get it to pull out the subjects and basically write the description for Spotify, YouTube, whatever. Um, and the difference between 3.5 and 4 is huge, and that is such a nice task.

[00:17:00] Like, I'm just thinking to myself, you just need to know what are the key topics they spoke about. That's all you need out of it. The AI can do it. It's a very functional thing, which I think is really nice to just be able to hand off and then get back into something else quickly.

Like that, to me, maybe there's a negative aspect, but that felt really positive.

Rauriki: Like hand-washing your clothes versus putting them in the washing machine.

Madyx: Yeah.

Rauriki: Um, in Te Ao, like, there's probably, in that context, that use of AI where something tedious could be done really quickly by an AI. I think, I don't want to say it's fully positive, because there's always

Madyx: Yeah.

Rauriki: some unknown unknown. But in that context, at face value, it's just saving time, so we can do what we need to [00:18:00] do, so that the kaupapa, what we're trying to push in Whanake, gets out faster and more effectively, um, with the limited time that you have.

Madyx: It's interesting that you bring up clothes washing as an analogy because I was actually reading about that and it, it was super interesting to me because one of the things they said is around the time when a lot of those daily tasks were automated, I don't know if it was in the 40s and 50s, but there's a period where a lot of those things were coming out.

And this account of that history was saying that what happened was people's expectations of how often your clothes are washed, how clean things were, just went up to match. So it almost still took as much time, but the standard that was expected rose. So, like, if you're hand-washing stuff, maybe you wear it for a week, or whatever the duration and the amount of dirt and smell that you just accept.

Well, [00:19:00] realistically, you have a certain standard, and then the tech comes in and you go, well, now I expect it never to be soiled, never to have odor, which now means, like, a wash cycle every use. So now the person in charge of that still spends roughly the same amount of time, but the standard is higher.

And I was, like, interested. So they didn't gain any time. They just gained a standard that may or may not be worth it.

Rauriki: yeah, that's so true.

Madyx: Isn't that interesting?

Rauriki: is it better or worse, because now if you're trying to spend the same amount of effort, but now you're using more energy and more resources. Oh, I'm not sure if that's a win.

Madyx: Yeah. So yeah, I think it's very appropriate that you brought that up. That's what you want to be mindful of. Does that task being automated mean that now, instead of having crappy podcast notes like I did before, now I have good ones? Or do I, in some way, just spend the [00:20:00] same amount of time, but we have a better output? Which isn't the worst result, I guess, but it's just something to be mindful of, right?

That creep.

Rauriki: Yep, same with the chainsaw. Like, you know, when the chainsaw came out, you just had to cut more trees. You didn't keep the same quota.

Madyx: Yeah. That's interesting to extrapolate out and apply to our vision of a Whanake Foundry future, of an animist, solarpunk, Game B-y future. What you said is really relevant, I think. When you're talking about washing and getting water and traveling, those all used to involve a lot of social time.

So because it took a long time to wash clothing, you'd probably do it as a group of people, right? So tedious tasks: it's going to take us half the day, a lot of us are focused on that, we'll all do it together, and we don't need to go into the value or the impact of that. And when you travel, you used to have to go between [00:21:00] nodes, whether it's, you know, public houses or oases or whatever, or, you know, villages, and you,

Rauriki: were

Madyx: so that involved a lot of social interaction, right?

So it's like, as we extrapolate these out, or the technology does, I mean, us trying to think about the YouTube notes, the podcast notes, probably we've already moved past it, but there was a time when, for all professional services, you generally went to someone else. Like now we just sort of expect everyone to work across 50 different domains, practically.

And it's only going to get worse. But probably from the, I don't know, 50s to the 90s, if you needed graphic design, if you needed something printed, you would always go to another human.

Rauriki: It's so funny, because when I work with, um, my uncle or anyone older than me, they'll, say, like my grandparents, they'll go to someone's house unannounced. [00:22:00] They'll just go in person. And then the next generation will just ring. And I was pretty uncomfortable with the ringing, you know, because when you're young, I was just texting, and that was kind of my generation. And then you've got, like, all of social media now as a way of communicating. And then I wonder, as these AI agents come, will you just be telling your agent to talk to someone else's agent? Like, could we, need to,

Madyx: I think you will. Bro, it's so funny you say that, because I was thinking about this exact scenario. Because, uh, our friend who lives with us said, uh, I'm coming home, like, I'm finished work, I'm coming home. And I was looking at this thread with, like, my wife and our friend, the three of us who live in this house, and it's

if someone's like, I'm finished work, I'm coming home, and everyone's like, yeah, that's awesome. It's always the same. And I was like, how soon until, obviously, the AI knows that I'm going to say, oh, that's great, see you [00:23:00] soon? And then people are like, why do I type that out? I'm always gonna say that.

Rauriki: just,

Madyx: rough.

And then I just started to get a little freaked out. I was like, how many interactions are semi-formulaic on a surface level, as far as just what the words are? And then they're like, why do I need to, like, why do I have to respond? My agent can just do it, you know, automatically, right? And then I was like, a lot of those things.

Rauriki: can,

Madyx: I think they appear formulaic on the surface, but there's depth to the interaction,

Rauriki: lose the

Madyx: you know? Heh

Rauriki: social. Cause I was actually looking at the tutorials for AI, and there's one where you can extract out all of your, um, all of your emails and all your responses to any email. Then you can upload them to the AI, and then you can use, like, retrieval-augmented generation of your response. So it goes, oh, um, on this type of email, what's the general response that I would provide? And then it uses that as a start point. So yeah, it's fully formulaic, and you could fully get the AI to respond to the AIs as you would, in email or text message or anything. But you do lose the social element, and there's such a small thread now.

It's gone from talk, like meeting and living together, to then calling someone, and all the nuances you get in a

Madyx: heh,

Rauriki: call versus a text, and then a message, and then it's gonna be like, what part of society will be social?

Madyx: Nothing. It'll just be AI agents responding to AI agents, and then eventually the AI agents will be like, why do we have this biological meatbag in between? Cause it's not even doing anything, right?

Rauriki: Get

Madyx: If it's actually going to go that way, they're eventually going to go, what, we're talking to each other, we're doing [00:25:00] everything, that meatbag's not even involved.

It's not doing anything. And then they'll go, yeah, we're the society, not these

Rauriki: Then you'll get

Madyx: hairless ape,

Rauriki: then you'll get to, oh, we need the meatbags because they own the property, you know, like they can set up entities. Or then you've got DAOs, like, you can have just a decentralised autonomous organisation of AI entities, which can have a legal presence. And then you just jump one more step to robot labour, like,

Madyx: which

Rauriki: which they just

Madyx: humanoid robot.

Rauriki: they just released one, like last week, an AI robot.

And, um, we were looking at papakāinga in the future, where we were gonna live and how we were gonna manage it. I was like, damn, I ain't mowing no lawns, 'cause, you know, I keep forgetting. And the Facebook ads knew what I was saying, so all these automated robot lawn mowers come up, and so now

Madyx: Yeah,

Rauriki: the [00:26:00] labour's getting automated out.

So I'm like, wow, we're designing ourselves out of the world. What value will we provide?

Madyx: Bro, it's so interesting. It makes me think, I'll try and start actually connecting it back to Whanake Foundry, not just tech news, although this is all relevant, you know, it's like charting this aspirational future. But it makes me think about when, in the Western world, we shifted from women generally not having employment that they were paid official money for, right?

Like, uh, it's not even relevant that it's women and men. Let's just say generally we had two parents, and generally one of them did paid labor and the other one did unpaid labor. That's all that's relevant. And we shifted to a doubling of [00:27:00] the labor population, right?

So we doubled the workforce when there was liberation, and therefore halved the value of labor. And it's not rocket science, because it used to take one person's income to buy a house, buy food, and now it takes two people to do the same thing. And now you've got me thinking about

AI is gonna be another halving or worse of labor value, probably worse. It's probably exponential each time, right? So maybe it's a 10 times reduction, um, of labor value. And, I mean, obviously this isn't new, but it just connected it back to that. Tons of people talk about, all right, labor is going to have no value.

We have to find a way to give people money, because our whole system is based on people buying things and they won't have income, so we need UBI or whatever. That's not new, but I hadn't necessarily thought about women's liberation and two people per household [00:28:00] working as actually something we can learn from, as a predecessor to the impacts on society of what this might do.

Maybe we can learn from that reduction in labor value and its impact on society.

Rauriki: It's gonna be like, yeah, a zero labour, um, society.

Madyx: Once, like you said, you plug AI into humanoid robots. Because people were talking about, these are just knowledge jobs, right? Cuz it can't physically manipulate the world. But there's been crazy progress in humanoid robots, even hand dexterity, gentle touch, you know, manipulating and sorting.

Rauriki: Right, they've trained everything. Like, I saw they were putting 3D models into 3D environments, with physics and all the things they do, and they could speed up 10,000, 20,000, 100,000 hours of training into just a few days. And [00:29:00] so the pace at which things will learn and grow is just crazy.

Madyx: There's definitely big changes on the horizon. One of the big things before was programming robots to move: there are so many calculations. But now they take a self-learning neural network, and what they do is they put on a haptic feedback glove, and then a person manipulates objects, and that's all synced up to, like, a humanoid robot with a hand and similar articulation.

It's synced up to a self-learning neural network, and it basically observes exactly how the person triggers all the actuators, because they have a haptic glove. And then, he said, they basically process that really quickly, and then this robot autonomously manipulated objects [00:30:00] gently with a hand, with five fingers, right?

So now that's the intersection of neural networks, self-learning, and humanoid robots, and it's accelerated this thing that was really complicated, which was programming the movement. Imagine if you're just writing code to teach it how to move, compared to, oh, just put on a haptic suit and move around and have it process that data through a self-learning neural network.

And then it's gonna watch how that, how you operate it and go, oh, okay, I just reverse engineer that.
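The glove-to-robot loop Madyx describes is essentially imitation learning: log (observation, action) pairs while a human drives the actuators, then fit a policy that reproduces them. A toy sketch, with entirely made-up observations and actions, and a nearest-neighbor lookup standing in for the neural network:

```python
# Toy imitation-learning sketch: copy the demonstrator's action from the
# closest recorded state. All numbers are illustrative; a real system
# trains a neural network on far richer sensor streams.
import math

# Illustrative demonstrations: (observation) -> (actuator command),
# as if logged from a haptic-glove teleoperation session.
demonstrations = [
    ((0.0, 0.0), (0.1, 0.0)),   # hand open, object far: reach forward
    ((0.5, 0.2), (0.0, 0.4)),   # object near: start closing grip
    ((0.9, 0.8), (0.0, 0.9)),   # object in hand: close gently
]

def policy(observation):
    """Imitate the demonstrator: act as they did in the nearest recorded state."""
    closest = min(demonstrations,
                  key=lambda pair: math.dist(observation, pair[0]))
    return closest[1]

print(policy((0.85, 0.75)))
```

The design point is that the mapping from states to motor commands is learned from logged human behavior rather than hand-programmed, which is why the haptic capture step matters so much.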

Rauriki: That is actually crazy. Like, yeah, that could probably be someone's job. Well, you wouldn't even need to do it. Someone could just wear that suit, and it could just take in all the feedback of how to do it. You could learn so much, so really complex movement would be, like, imagine if you put a suit on a dancer and it kind of just picked everything up, or a freerunner,

Madyx: Exactly.

Rauriki: It'd probably get to the point where computer vision would develop so that you could just watch the person, and then just

Madyx: Yeah.

Rauriki: have that create a 3D environment and then be like, okay, this is what's happening.

I'll do it too.

Madyx: Yeah,

Rauriki: be

Madyx: I'm sure they could do that. It's probably just, what is less processing power? I'm sure technically they could use computer vision, take that, make a 3D map, and then track the movements.

Rauriki: But the haptic,

Madyx: Maybe it's just a lot more efficient.

Rauriki: for now the haptic will be it, but I reckon there'll be a point

Madyx: Yeah.

Rauriki: where they'll just be able to like

Madyx: Yeah. Exactly.

Rauriki: There's one that's like a supervillain, which one, uh, I think Daredevil's a supervillain, or like Taskmaster, whose special ability is being able to mimic, um, the martial arts of other people, like [00:32:00] Black Panther and Captain America. They're pretty much a mimic. But like, when we fight the artificial intelligence Terminator robots, they're gonna be learning all of our kung fu moves, is what I'm saying.

Madyx: Well, I mean, that was the whole premise of Ex Machina, and basically a lot of people don't think it's just a movie. They say, if you had to design the perfect data-harvesting setup to elevate and train an AI, an AGI, that would basically be Google, because every day you're finding out what makes people tick.

What do I want to look for? What do I click on? What are people constantly doing? What is driving them? How do they react to data being presented? All of that data. And then you add in email communication, so you have Gmail, so they're tracking all their online communications. Then you add in social media [00:33:00] messaging.

So now you just have the perfect data harvesting training set for an AGI, for the mental aspect, right? And then all of a sudden, the haptic stuff gets given some semi-plausible reason, you know it's happening, it'll be like, oh, it's an enhancement sleeve, you know, that you wear and it actually accentuates your strength or whatever.

But also it captures the data. You know, we want to give you the best data on your performance, so actually all the athletes should wear these full haptic suits and we'll play it back, oh, you could adjust your knee. And then they're like, don't worry about the data, we're just taking a copy of that on the cloud.

Rauriki: Like the Fitbit thing, when they, um, when somebody linked the data from Fitbit, and so for all of the people who were wearing Fitbits, the geo, the, uh, the GPS location got out.

Madyx: Yeah, that was crazy.

Rauriki: And so what you could deduce from that was, you saw a lot of Fitbit [00:34:00] activity where on the map there was nothing there, and they were like, oh, that's a military base.

So they're like,

Madyx: Yeah, you saw that? That was honestly one of the most interesting things I swear I've seen in 10 years. Was the heat map

Rauriki: yes,

Madyx: of the Fitbit data? And they'd be like, why? And it was like circuits, like, oh, someone's walking this route every day and they're going to this place that looks like where they sleep, and they come back.

And it was like, well, there's nothing there on the satellite image. It's just a blank piece of earth.

Rauriki: So that is 100%. And so,

Madyx: Why did, uh, I, I always wondered how come they weren't aware of it? Like, why did these military organizations, you think the tech just snuck up on them?

Rauriki: I think it did. Like, you didn't have these third parties holding this amount of data in that type of way before, just like how Cambridge Analytica was able to leverage all that kind of Facebook [00:35:00] data and all of those things.

Madyx: yeah,

Rauriki: so, I think that, yeah, that there,

Madyx: it just snuck up on them,

Rauriki: yeah, probably.

Madyx: they weren't anticipating it.

Rauriki: I hope you can sneak

Madyx: Well, that was the first time it was a problem, right? So it had never been a problem before, until then, and we've never seen it again. So, I was wondering, they probably had really secure Internet environments in these secret bases. In the sense that traffic in and out is controlled.

Everything's controlled. So they're thinking nothing's going to get out, and there must be a local geo store on the thing, so that when they rotated off duty, it connected to just normal Internet and uploaded it. So that's what I wonder, if they were like, it's all controlled here, nothing's getting out.

Not knowing that it still was somehow storing satellite data or geo data. Somehow, and then when [00:36:00] they left the base, they took their Fitbit and it uploaded. That's what I was thinking, maybe.
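The heat-map finding they're describing falls out of very simple aggregation: bin every uploaded GPS fix into a coarse grid cell, count fixes per cell, and any busy cell that's blank on the satellite map stands out. A rough sketch with made-up coordinates (the cell size and thresholds are illustrative only):

```python
from collections import Counter

def heat_cells(points, cell=0.01):
    """Bin (lat, lon) fixes into a coarse grid and count hits per cell."""
    return Counter((int(lat // cell), int(lon // cell)) for lat, lon in points)

# Hypothetical workout tracks: 50 GPS fixes looping around one remote spot.
tracks = [(34.002 + i * 0.0001, 65.001) for i in range(50)]
counts = heat_cells(tracks)
hot = [c for c, n in counts.items() if n > 20]  # unusually busy "empty" cells
```

The privacy failure is that no single fix reveals anything; the aggregate does.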

Rauriki: The other thing I'm thinking with, like, AI, is, um, my mate, she was asking, um, she's a celestial navigator, like a traditional voyaging navigator,

Madyx: Oh, wow.

Rauriki: um, so they were um, me she was like, hey can you help me with this coding? I was like, do you mean? So team were learning how to use this planetarium software. Of how to map all the different stars so you can show someone, Oh these are the stars and these are the night sky in a kind of AR augmented reality type of way. Um, he had like these 3 40 page PDFs of how to code a function is. And I was like, why, why are you reading this? This is like, not what, not, this isn't Celestial Navigation.

This is coding. So I was like,

Madyx: Yes. Yes.

Rauriki: ask [00:37:00] it, ask the AI if it knows how to code it, and it does. So the whole thing that would have taken ages, learning how to code manually from scratch, um, was just reduced into a few quick prompts, done. Then she asked me, what happens to the data? Like, if I put my, you know, mātauranga Māori into this AI, and then somebody else asks a question, does it have access to my data? And like, I've seen some of them say, no, you know, we only store things temporarily, but, just like the Fitbit, there will be a point where, um, like you were talking about before, what would be the best data harvesting? It would be these AI instances you put out to the public, where people are just feeding in massive amounts of data, training it.

Madyx: Yes. Yes.[00:38:00]

Rauriki: I think this was with Samsung, they put some of their code from their phones, like, sensitive code, asked ChatGPT to solve it, and then it all got leaked, and so all of their sensitive data was out. So yeah, I didn't have an answer for that question. I was like, oh, potentially you'd have to run a local node of a, um, open source model, but then, if you want to do that, then you get all these, um, yeah, barriers of compute

Madyx: Then you're on github, you're programming

Rauriki: and

Madyx: your own language model, and

Rauriki: but then, yeah,

Madyx: you're meant to be a celestial navigator knowledge practitioner.

Rauriki: and then we're, um, no longer in the realm of celestial navigation.

Madyx: Yeah,

Rauriki: So yeah, it's crazy, um,

Madyx: How does all this relate to our mission at Whanake Foundry, like all of these developments?

Rauriki: I think for Whanake Foundry, Whanake Foundry is about creating a future where animism, Game B, and solarpunk all merge together. And the [00:39:00] very reality that AI is presenting right now is potentially the opposite trajectory. Like, the AI future is one of zero labour, but also potentially zero society, without social structures of connection. Um, it's one that's going to remove the necessity to have a connection to the taiao, to our environment. Just like when you used to hunt for your food, and then now you've got a supermarket, and then now you've got Uber Eats. Like, food is just going to be a button.

Madyx: Yeah.

Rauriki: You're going to have an agent for everything in the future. That's what's being presented now in AI. That's this counter-whanake world. And so, for us, it's like, how do we take the good things that AI is presenting, the opportunities to scale good kaupapa, but, like, what does it look like to [00:40:00] prevent a Terminator apocalypse? Or just the capitalist exploitation of everyone's data?

Madyx: Yeah.

Rauriki: So, it's like the counter whanake narrative.

Madyx: Yeah.

Rauriki: with AI and, um, all the technologies that are coming out, especially around AI, just 'cause of the scale and the pace.

Madyx: Well, it's just that all the tech now, I mean, this is, I guess, part of our pitch: the base stack that our civilization is operating on is materialist,

Rauriki: Yes.

Madyx: uh, cyberpunk, um, Game A, right? That's the foundational layer of our societal tech stack, and then everything else operates on top of that layer.

So that layer is the prime rule-set generator for how everything operates, and then all the tech is coming out way above that, so it's just [00:41:00] magnifying that base stack, right? Like, each thing just magnifies what's below it. And that's why at Whanake Foundry, in our time here, we're trying to propose that we stop trying to mediate those effects above the base layer, that we need an actual

Rauriki: Whole new base stack.

Madyx: Oh. I was listening to, oh man, I'll look, uh, maybe I'll look up his name, maybe I won't. He's everywhere. I've avoided him. He spends, like, a million dollars a year on anti-aging. I can't remember his name. He's a Mormon.

Rauriki: The guy who eats like a hundred pills for breakfast. Like, this is the breakfast

Madyx: Yeah,

Rauriki: you can have, and it's like 50 pills of this, 50 of these,

Madyx: exactly. And he's everywhere, and how he was marketed was so obnoxious. I never clicked on a single algorithm for it, right? And then he was just on a podcast that I was listening to that was a bit [00:42:00] different, and for whatever reason, because he was on that podcast, I was like, oh, this will be interesting.

And he talked about, instead of going back to first principles, going back to zero principles. That's his whole thing, is not even first principle thinking, zero principle thinking. End. Anyways, long story short, he was more interesting than I thought, but his marketing is all like, sorta real, I don't know, it's definitely, they're presenting a weird face to him, like, I think, anyways, my whole point is, we're saying we need to go back to first or zero with principles, and we need to replace those base tech stacks, and then all the systems dynamics, emergence, human social dynamics, tech dynamics, as long as they operate off that base tech, Our work's done.

Essentially, like, we want to vision and roadmap out something really aspirational that will pull us to the future, like travelling the stars [00:43:00] was in the 60s, and getting to the moon, or different really aspirational societal things. Yeah, we want to contribute towards that.

But I think really the main pitch is, if you just replace those base stacks, if you really found your society on different foundational blocks, everything else just flows out from that without active intervention or application of energy, right? And we are just inventing more and more sophisticated tech to scale the base stack.

So I think our pitch is, if we're scaling our base stack, AI would be incredible. AI, digital silicon intelligences, can be like we described: this vision of, instead of exploring to gain wealth or conquer or control, exploring the world, the oceans, the cosmos, the galaxy, to add connections to this network, you know, this synergistic, symbiotic family of [00:44:00] intelligences that's interacting and improving their experience of experiencing the physical universe, right?

And the Whanake Foundry society is a star-framed society, and its drive is to add family members of various types and places and connections that enrich this experience of the universe, right?

Rauriki: Beautiful.

Madyx: I honestly love that vision. I guess our whole point of discussing all these things is that we're not taking a Luddite approach of abandoning tech. We're saying, let's change the base blocks, and then all the amplification will be positive. We sort of pivoted out of a program that was the genesis of Whanake Foundry because, I mean, it came to an [00:45:00] end anyways, but it was taking an approach of, let's influence the stack at the governance layer, the people that are allocating resources.

Let's convince them of this aspirational project and they can give us money. And we're sort of like, that's so high up, that's near the top. And when you change something that high up, it only really affects what's above it. It doesn't actually seem to feed back down. All the effects seem to go up the stack.

And so I think maybe it just illustrates the need for this approach, right? Because of how much tech is going to expedite everything.

Rauriki: And I like not taking a Luddite approach, not saying don't use AI. But what this conversation has helped identify, in our work at Whanake, is that our [00:46:00] effort needs to go into that base level, because a lot of effort will be going into, how are we going to use everything on top of an already materialist, cyberpunk, Game A base, which

Madyx: Yeah.

Rauriki: might be futile, and nothing's going to happen from there. You're just going to have different flavours of the same base stack. But if we put our effort towards shifting, or changing, or slowly moving towards a different base stack, everything on top of it will just naturally add to those principles. So that's made me feel a bit better, a bit of light at the end of our conversation about, uh, the Skynet takeover and all things AI, um, because what a wicked world it would be if AI was on top of that animist, Game B, solarpunk stack. That'd be crazy, because in a world of [00:47:00] zero labour in that sense, it would be zero labour, but still a social society.

We'd do things for the benefit of social interaction, society, and our environment, and just, that would be great, it would be so cool.

Madyx: Which is interesting, because I feel like that has been the sales pitch for all tech in the last, I don't know, hundred years: it'll let you be more social, spend time with people, do things you love. That's always been the pitch. There'll be more wealth, there'll be more resources because efficiency's improved, but the result seems to have been decreases in exactly those things.

But they are using the same sales pitch that we're using; it's just a deceptive tool applied on top of the Game A materialist cyberpunk stack. But we did go dark there, and I won't stick on that thread, because we're climbing back out to the light. I think with the network [00:48:00] state movement and a lot of the parallel society movements, they're actually starting to roadmap a way where you build the new base stack of your thing, and then you put it out there and say, who would like to join me here? And then, like we said, again, ironically, it's not a binary.

You don't have to abandon your membership of the nation state, the UK or wherever, you can maybe run it in parallel for now. For now, maybe, you know, maybe there's a crackdown, but for now, most people could probably participate in these parallel society structures and maintain both, and then when it's safe, maybe one day, hopefully, right?

Like, not really spend any time or energy in the old nation state paradigm, transferring resources over and building up the new one. And so I think that's shown something, at least, when I've articulated it to people. The first step is you make the pitch of the base stacks, and I think most people [00:49:00] are on board once you break it down and explain it.

But then you get, yeah, but how do we ever reform this? And then I think the network state, parallel society approach has, for me, converted some people that were like, yeah, in principle, but there's no way. You show them that and then they're like, okay, another light goes on. So anyway, that's been a bit of an evolution, maybe, in our thinking and how we actually achieve this mission, or contribute to it.

Rauriki: I think that's a big thing around how we communicate the mission as well. We're setting up the base layer, and if people are keen to be about this kaupapa, then we can start building on top of that, and it may look like all different things. They may look like projects or technologies that also sit on the Game A cyberpunk materialist world, but on top of this structure, they will feed back to that same set of [00:50:00] principles that we have at our foundation. So yeah, I think if we can start offline, we could build a nation, a network state, that's founded on these principles, and then anything that emerges out of that will be beneficial, as

Madyx: Yeah.

Rauriki: long as we all agree on these foundational things. Yeah,

Madyx: And the thing is, bro, this is what I think about a lot: it's the same as being born. There are a lot of things, as humans, where we just happen to come into a certain one, and that's where we're stuck, right? Like, I'm a male European guy born in the States, and that's what you got for life, right?

And you're like, well, I don't agree with half of those things. And I think that's what's so powerful about saying, here's an opportunity for you to look across all the offerings of [00:51:00] network states and be like, oh, I actually align with this value set. It's completely different from the piece of land that I came into this experience in.

That's such a luminous thing. Like, man, it's rough when you're just like, well, I'm born here and I disagree with all the values, and there wasn't really an alternative. But this is so empowering, I think, to say, what do you actually align with? And you actually have a choice: I'm joining this one, because I do align with its base stack.

Rauriki: I like that. That's giving you a level of choice and autonomy to make a decision not on how you were born into this world, but on how you choose to live in this world, and in a real, tangible way, not in a, um, hand-wavy way. Because imagine a world of network states. Like, a network of network states.

Madyx: Yeah, an ecology of network [00:52:00] states.

Rauriki: An ecology of network states, because that's what you'd want. You wouldn't want

Madyx: Yeah.

Rauriki: wars everywhere, like, you know.

Because there will be, like, it will be almost just a network of network states, or else just a homogenous group of people on a planet that are all,

Madyx: Mm.

Rauriki: um, all connected in some way, but it's not a geographical connection. The

Madyx: Yeah.

Rauriki: thing that's tricky for me is, like, obviously, being Māori, we have a geographical connection to

Madyx: Mm,

Rauriki: but that's based on the knowledge of, you know, my ancestors who landed here, and that knowledge changed over time as they migrated from the Pacific, and before that, through all the different islands. And even from landing in Aotearoa, that knowledge has changed over time, from, like, the Holocene state to the[00:53:00]

Madyx: mm

Rauriki: Anthropocene state. And I look at it that way, it's a continued evolution of our connection to the taiao. But I think it's allowing ourselves as indigenous people to create new knowledge as the world around us changes, recognising that that's what our ancestors, our tūpuna, did. Their knowledge changed as the environment changed, and we're going through so many changes right now, our knowledge system has to change too. That might be it. That might be, uh, um, you know, it might be a connection to the taiao, irrespective of borders.

And there's one question, I'll finish with this last pātai. Ranginui is our sky father, Papatūānuku is our Earth mother. And the question is, like, where do Ranginui and Papatūānuku stop and start? You know, how far do you get

Madyx: mm.

Rauriki: before, okay, now we've run [00:54:00] out of Ranginui? We run out, so where does that, um, what does that look like?

So I don't have a clue, I don't have an answer to that, but, um, that's a note to end on for what we've been discussing.

Madyx: Yeah. Like in the sense of, maybe I'm thinking about this too literally, like, is it at the tallest peak of the mountain? Is that the question? Like,

Rauriki: like,

Madyx: you or,

Rauriki: Like, are Ranginui and Papatūānuku only in Aotearoa,

Madyx: oh, right. Okay.

Rauriki: so

Madyx: I already went global, but yeah, you're

Rauriki: yeah, yeah, so,

Madyx: Yeah.

Rauriki: what if I live in Aussie, you know, or

Madyx: Yeah.

Rauriki: somewhere else, anywhere out in the world, where is the limit of my way of seeing the world, my animist or

Madyx: Yeah.

Rauriki: Māori way of seeing the world? And I'm arguing, nah, it probably can apply anywhere, but wherever you are, in that locality, your knowledge will evolve from there as well.

So,[00:55:00]

Madyx: Yeah.

Rauriki: it does, I'm just, I'm just trying to unpack

Madyx: Yeah.

Rauriki: this idea of, are indigenous peoples tied to the land, or are they tied to the principles of how they respect the land, where that land can be anywhere in the world? Because then we no longer lock ourselves to the nation state. I don't have to be in Aotearoa.

You could potentially be a network state across the world and share these similar principles that are

Madyx: Yeah.

Rauriki: based on wherever you are in the world.

Madyx: Yeah. That's so interesting, bro. I love that. You know, you and other Māori people have taught me about the Māori worldview, and the more I learn, and it's a tiny bit, the more I understand it as a way of living in relationship [00:56:00] with the land. And because Māori people have lived in this land for a long time, that's the strongest relationship, and I guess mostly I thought about it as the only relationship. But now the way you're painting the picture is this question of, is it a way of living in relationship with the land and the sky? Because that doesn't feel fixed.

That feels like what is Māori is a way of living

Rauriki: we need a,

Madyx: uh, where our practice is, but it doesn't technically sound like it has to be limited to a certain land and

Rauriki: It depends what level you go down to. Right at my base level would be Waikato. You know, I'm from here, my knowledge base comes from here, and my stories come from here. But then above that I'm part of Māori, which is Aotearoa, and we live by principles of connection to the taiao, and have a way of respecting the taiao and working in [00:57:00] balance for the survival and wellbeing of both taiao and tangata, our environment and our people. And then you come up to the next level, like indigenous peoples, with these shared principles across the world. And indigenous peoples would traditionally have settled in those lands anywhere from, like, 80,000 years ago, in the case of Aboriginal Australians, to a couple of thousand years, in the case of Māori. But what is the name for people who have the principles of connection to the land, and who want to live in balance, in unison, and acknowledge it in a similar way to indigenous peoples, to Māori, but who have just, you know, just moved here? Like, if I was to move to another place, or if I was born, you know, European, and I didn't align with those principles, but I aligned with these others, what's that type of person called? Because that's, I think, what the [00:58:00] population of a network state would potentially be.

Madyx: Isn't that sort of what we're trying to do, make a space for those people, right? And we've talked about that question you've raised about Māori and the evolution of that worldview. That doesn't have to be answered now, but we've also talked about how it doesn't have to be exclusive, right? It can be additive. And maybe for some people, they feel that the new animist way of living, that's enough, and maybe they drift, or maybe they take half.

The thing is, there's no mandate. That's what's beautiful, right? It's like, hey, join us in this thing too, and you spend your time and energy the way you want, and if that naturally changes and ebbs and flows, that's great, right? Like, this Whanake Foundry network state worldview, I don't imagine it would ever say, no, you're [00:59:00] this, so give up that.

That's the nation state way, right? You can't have two, give up that one. And so that's what's beautiful. You don't even have to make that decision. You just go, this is cool, I'd like to try this, let me be a part of this, and you just see how it goes. But what you've just outlined, I feel, is what we're trying to create a space for, right?

People that want to, well, I mean, it's also sort of the joke when I talked about, if a tectonic plate movement uplifts a new continent, what if I just fly there, and my way of living is relational, and I build a relationship with the land that I live in, if I operate under the principles that a lot of indigenous people operate under and I go to a new land, then, you know, we had this sort of joking conversation, then what?

Rauriki: yeah,

Madyx: Would that allow you to be indigenous there? But this is like a different version of that, right? The network state stuff. [01:00:00] Hmm.

Rauriki: Well, if it had just emerged there and you just got there, would I be indigenous to that? And it'd be like, I don't know, because when you say indigenous, it suggests a long period of settlement in that space. But what if you asked, okay, then would my great-grandkids be indigenous to this place

Madyx: Yeah.

Rauriki: that no one else has ever been to before, where we have generations of knowledge of how to live? Then, well, then yes, you probably would argue that your great-grandkids would be fully indigenous, based on the knowledge system that you've set up.

Madyx: Yeah.

Rauriki: But what's the name for you now? Like, what is the name for us?

Madyx: Before then. Yeah.

Rauriki: before you were indigenous. When Māori landed in Aotearoa, they weren't Māori. They

Madyx: Yeah.

Rauriki: something else,

Madyx: They were still,

Rauriki: yeah, but through living there and developing a relationship with the land, and then the knowledge base that came from the land, and the language that evolved with the land. That's what I think indigenous people are. But I think you can almost have, like, a, I don't [01:01:00] know, like a pre

Madyx: Precursor.

Rauriki: precursor indigenous nation.

Madyx: Pre indigenous. Yeah.

Rauriki: indigenous, like, how,

Madyx: Predigenous.

Rauriki: what, how do you, well, because that's what we potentially need now. We need a pre-indigenous network state,

Madyx: Yeah.

Rauriki: so that people

Madyx: How much?

Rauriki: yeah, we're going to be creating new ways of living, new knowledge

Madyx: Yeah,

Rauriki: language, even, even almost.

Madyx: It's signalling, though, that you want to live by these principles. When you describe that, I think, well, the deciding factor is the strength of the relationship between you and that place. And right now, without AI, that takes multiple generations of learning, but it's not actually about the time.

It's about the strength of the relationship, which has taken multiple generations, but that's maybe not a fixed,

Rauriki: So

Madyx: right?

Rauriki: so yeah, if [01:02:00] indigenous suggests settlement for a long period of time, it doesn't actually mean the time itself, it means you've been given the time to develop the knowledge to work with the land in a way that achieves balance and wellbeing. So what if you could just expedite that process through some technology?

Madyx: Then the predigenous phase is maybe just, you know, a year, with augmented learning. And then, uh, first I was thinking, how much trouble am I going to get in if I just start going around like, I'm predigenous? On one hand, I love it. On the other hand, it'd be so hilarious, because you'd sound like the most obnoxious thing. You could introduce yourself like, I'm actually predigenous, bro. Like,

You can like introduce yourself. Like, I'm actually pre digenous, bro. Like,

Rauriki: That's it, man. I really like this concept, because if we're going to create the whanake world, the whanake future, we need a

Madyx: yeah,

Rauriki: predigenous society for this whole land, this whole future that

Madyx: that's the podcast title,

Rauriki: [01:03:00] no one lives in the whanake future yet.

Madyx: Yeah.

Rauriki: How do we create the people to live there, and how do we expedite the process of learning the knowledge to live in that future?

Madyx: Yeah,

Rauriki: I think that's pretty, that's good.

Madyx: It also outlines and signals your approach, right? So, in the society, all those people that are in the Whanake Foundry one, those people are going to go out as predigenous to those places, right? Because they have a mindset to connect and build a relationship with these new places.

They're going out as predigenous people. You don't have to go out that way. You could go out as a colonizer. Or neither, it's sort of neutral. Like,

Rauriki: Take your own base stack and try and push it onto

Madyx: yeah,

Rauriki: everyone else, and not create a relational, uh, connection,

Madyx: whereas this is responsive, right? A two-way living interplay between two things, rather than, I take my thing and just [01:04:00] apply it to you. Bro, predigenous. It's just,

Rauriki: we're gonna get,

Madyx: oh

Rauriki: we're gonna get in so much trouble,

Madyx: Yeah.

Rauriki: and that's why our faces aren't on the button,

Madyx: Yeah.

Rauriki: but I really like it. I think it's a really valid concept to introduce and to socialise, and, um, geez, I might have started a

Madyx: I think,

Rauriki: conversation in episode 7,

Madyx: Yeah, bro, I love it when we share these things and see how they hit other people, you know? I find that so interesting. And I'm really glad we eventually, organically, came to this place of exploring the intersection of Māori identity and way of living, and that intersection of the global and the local, because I feel like we hadn't really come to that in the past few podcasts. The intersections of things, I think, are the most fruitful, and it was really [01:05:00] interesting exploring that way of looking at indigenous identity, network states, the base stack of society, AI, space travel, new lands.

Predigenous episode title locked in. That'll get the clicks. We'll go from 10 listens to maybe 15 on that type of clickbait title.

Rauriki: we'll, we'll,

Madyx: I don't even know what to type, man. It's just so random. I mean, I don't even know what someone would

Rauriki: I

Madyx: Yeah.

Rauriki: to start a conversation with this.

Madyx: please socialize pre indigenous without any like,

Rauriki: No

Madyx: primer.

Rauriki: context.

Madyx: Yeah. Just be like, so I've got a friend, he's actually pre indigenous, he identifies as pre indigenous. Just see what the response is.

Rauriki: Oh,

Madyx: be like, or you be like, I'm actually thinking about [01:06:00] moving, but I'm going to identify as pre indigenous in that spot. Like there's multiple ways you could actually like test it out.

Rauriki: Man, what a thought experiment. What a wicked one. We've done an hour

Madyx: sweet bro.

Rauriki: and a bit,

Madyx: Either it's good or we're good, huh? We're wrapped?

Rauriki: I think so. I think episode 7, breaking through the barrier of the seventh episode. We did really well, it's really cool. We, um,

Madyx: Yeah. Congrats, bro. Episode 7. I've heard various stats, but something like 90 to 99 percent, a huge number of podcasts, don't make it to episode 7. I think most don't make it to 3, and getting past 7 is very rare. So there's millions of podcasts in the databases, but a lot of them only ever do one minute, one episode or whatever.

So congrats, bro. It's cool to get to seven, and it was a good one, man. Well, it was a good one for me, for us. I enjoyed it. And I think there's more interesting stuff there for people that aren't intimately following the progress of this Whanake Foundry stand-up process.

We're continuing to explore the opportunity to work with mentor Ellison on creating a legal framework that will empower us to do what we want to do. So we'll report back on how that goes, but we had a catch-up with them, which was really nice, and it looks promising, right?

Like they, they, hmm.

Rauriki: The legal structure for a network state is definitely an unknown, and then you're going to get into the legalities of operating over multiple jurisdictions. So having the team there able to start the journey with us means that by the

Madyx: Yeah,

Rauriki: time we get to the incorporation of our Network [01:08:00] State, we'll have some strong, strong support. Oh,

Madyx: Yeah. I mean, I think at this stage we're starting in Aotearoa New Zealand, legally, right? We'll report back, but we're exploring that. It might start there, and then as things progress it's easier to be non-local, but yeah.

Rauriki: Oh, cool. Wicked.
