Episode 17: MAHA & AI: Disinformation vs Discernment
Madison and Maycee Holmes
In this episode, Madison and Maycee Holmes tackle the concept of “limited hangouts,” exposing hidden narratives in media and politics, with a focus on the Make America Healthy Again (MAHA) movement. They dissect Robert F. Kennedy Jr.’s surprising push for wearable technology, drawing parallels to dystopian agendas like the World Economic Forum’s vision and Palantir’s AI-driven surveillance. Referencing films like Lions for Lambs and The Matrix, they explore how predictive programming shapes public perception, while questioning the motives behind technocracy and political figures. With witty banter and a commitment to critical thinking, the Holmes sisters challenge viewers to question the system, rethink health narratives, and stay vigilant. Tune in for a thought-provoking discussion that’s as entertaining as it is eye-opening!
Want more Holmes?
Find them on Substack, Rumble, and YouTube
You can also contact the Holmes sisters directly at Connect@at-home-with-holmes.com
(0:00 - 0:07) Hi Canada, I'm Madison Holmes. I'm Maycee Holmes. And you're watching Holmes Squared. (0:08 - 0:16) So, today we're going to cast some judgments. Oh, damn. Our whacking sticks. (0:16 - 0:27) Yeah, we're going to play whack-a-mole. Except we have bad aim, so it's not going to be very successful. No, Maycee, before we hit record, Maycee was just saying, look, no criticisms. (0:27 - 0:31) We can't handle criticisms. No, I said I can't. It was a joke. (0:32 - 0:40) But we are going to cast nothing but criticisms, because we're women and that's what we do. Oh, hell yeah. Estrogen warfare, as E.M. Burlingame would call it. (0:43 - 0:54) Yeah, we'll have to do another episode on E.M. Burlingame himself, because that is... That'll be in our next ep, the one we're going to do next. I definitely, I'm going to address some of the things that he talks about, yeah. Oh. (0:54 - 0:59) Yeah. Oh, in our next... Okay, she's... Wait, as he would do, yes. Yeah. (0:59 - 1:21) Yes. Yeah, a little bit annoying to listen to. But anyhow, today we are going to... It's kind of a continuation off of the limited hangout conversation we had previously, in part because there's more limited hangouts. (1:21 - 1:37) Surprise, surprise. But also as I've come to understand limited hangouts, for me anyway, because I didn't really get them before we actually did our episode on them, I'll be real. And now that I do get them, I see them all the time. (1:38 - 1:46) Like, oh, there's a limited hangout and there's a limited hangout. So that's one of the reasons we thought, okay, let's continue this. Yeah. (1:47 - 1:55) Yes. And specifically on the MAHA movement is kind of where we're at. Which is the Make America Great... (1:55 - 2:18) No, Healthy Again. It took me forever to get that acronym because I knew the MAGA movement. And then all the time in all the podcasts we watched, Man in America, the Corbett Report, everybody that said the MAHA movement, I'm like, what? 
And then I think it might've been dad who's like, Make America Healthy Again. (2:18 - 2:36) Yeah, somebody finally said the acronym before we actually understood what the heck it meant. Yeah, it was a couple... I want to say at least a couple months before I actually understood what people were alluding to. I got the names they were talking about, like Robert F. Kennedy Jr. and such, but I didn't actually put two and two together. (2:36 - 2:46) And this just goes to show that we're very intelligent, which is why you should listen to us. Yes. We just like lose people right at this point in the end. (2:49 - 3:00) Honestly, that was something that I was thinking about. Um, just Monday, today's, is today Monday? Today is Monday. Today's Monday. (3:00 - 3:17) Okay, so a week ago, um, I was listening to stuff for this episode. And it was actually the Derrick Broze interview on The Last American Vagabond. They were talking about Palantir and, um, not Event 201, Detachment 201, all this stuff going on. (3:17 - 3:57) And it was just the reflection that people actually tune in to listen to you and I. And just the same humbleness that I felt when our groups here in Calgary raised funds to send us to the Reunify conference. I felt that same almost, it's also like a pressure because, you know, every week me and Maycee put together at least, like if I share my screen right here, if you guys can see the tabs, like this is one. So on here, there's like minimum two that we listened to for this episode. (3:57 - 4:08) Another one here, another couple here that we've listened to. Here's one here. Just all of these tabs are stuff that we've put together for these episodes. (4:08 - 4:19) And we do this weekly now. So and it's more responsibility, which is kind of nice. I mean, like we were never not going to keep educating ourselves. 
Because if you yourself can't even explain it, then how well do you really understand it? Which is a challenge because I'm not going to claim that I actually understand everything, but definitely not. But I am accepting the challenge of wanting to take on the responsibility of trying to articulate it. (4:47 - 5:14) I mean, this particular audience is not too familiar with some of the content that we speak of, which is nice because it's always nice to have those type of like, like in our family, we make inferences all the time. But it's nice that, you know, we can take on that challenge and continue to try and put in the work to make sure it's good for people watching. Yeah, and keep it open enough that we allow people to still do their own research. (5:14 - 5:43) Because the one thing, and you were actually, it was a couple other people that had recommended different podcasts with other young people and specifically also young females doing their own version of commentary. And I, what we also do here is commentary, you know, we're putting in our own opinion, partly because you can't escape it. There is no position from nowhere that's omnipotent. (5:43 - 5:55) We are not omnipotent beings. We can only speak from our own experience. And so that being the case, you and I really own our values in most of what we learn and communicate with. (5:55 - 6:12) But I'd like to think that it's open enough that it's exploratory. So we can just, we're trying to get people to think about more things generally and include more perspectives rather than honing in a specific one as it is. Because we're trying to do the same thing for ourselves. (6:13 - 6:34) That's what I'm, yeah, exactly. So we're trying to lead by example and lead by this openness to greater thinking. And we, I checked out some of the shows about the younger girls that people were saying they, they admire on their podcasts. (6:35 - 6:56) And I, my initial reaction was I actually felt bad. 
You know, people like Brett Cooper, she, she talks about the latest and greatest, even with celebrities, you know, what went down with Katy Perry, Justin Bieber, she was on it. And I could care less to be candid. (6:56 - 7:18) And I almost felt bad because I thought, you know, these people might be expecting something more like that from girls our age. I just, but I can't, I can't, I can't meet that bar. There's so many other things that we've talked about that I'm like, I, I feel like I have to wrap my head around all of this greater world stuff. (7:18 - 7:32) So even some of the ones that you watched and you listened to a little bit of their commentary, I, it personally made me cringe. I, it makes, I'm not going to lie. Sometimes I'm not going to name what it is. (7:33 - 7:53) I refuse, but there were two relatively young women. They're actual, they're older than me and Maddie, but they have a show and listening to them talk makes me want to kill myself. Like, like to put it candidly, because honestly, it's just, I can't handle the, the voices as well. (7:54 - 8:02) Like, sure. Maybe what they're saying could be like, Hey, like who cares of how they're talking? If they're saying something of substance, substance, I'm like, yeah, sure. No problem. (8:02 - 8:06) I'm sure some people hate the sound of my voice. And I think it's annoying. I don't know. (8:06 - 8:21) I sometimes do, but I really can't handle like the, the stereotypical, like Valley girl, like whitewash type of accent. Cause I'm, I don't know. It's like my brain goes like, you don't talk like that. (8:21 - 8:30) I know you're faking it. I know it's not how you talk. You had to have mimics a parent or somebody to sound relatively normal. (8:30 - 8:36) You do not talk like that. That is how you talk around your girlfriends and that's fine. But that is not how you actually freaking talk. (8:36 - 9:02) And it's just, it bothers my soul. 
But regardless the commentary as well, getting to the actual substance is a very, in my opinion, one-sided and very shallow. Like, I think I have no problem with, I guess, people wanting to nitpick a particular politicians or particular figures out there in the world. (9:02 - 9:17) That's fine. You can have a whole podcast if you wanted to dedicated to, you know, psychologically profiling this particular individual. But it's just, to me, I feel like it's a bit still keeps you within the diameter. (9:17 - 9:50) Or the parameters, I should say, of left and right thinking. Even if it might not be left politically, right politically, it just feels like you're stuck in a, I'm focusing on this particular group or that particular person of this particular group and then I'm just putting all my energy towards it. And then people walk away and I'm not going to lie, who benefits from it? Every time I listen to them, I'm like, so what did I gain from that? Nothing. (9:52 - 10:08) So I just wasted like five minutes of my life listening to you for nothing. And sure, you may have made me laugh, but even the jokes had no moral substance to them. Yeah, that's one thing. (10:08 - 10:26) Some people say what makes comedy comedy is that there's a little bit of truth in it. I have heard jokes, though, where there was just no truth. Or it's just straight up like, not gonna lie, I have a theory and it's that females, their humor is they like weird noises. (10:26 - 10:33) And I think that this theory is valid. I have used me and my sister as our own anecdotes. So screw you. (10:33 - 10:49) I'm in my own little self-enclosed laboratory of how I prove this to be right. But honestly, I've come across me and my sister, but also other girls where it's just weird noises. We bond over making weird sounds. (10:49 - 10:57) And so sure, is that funny to me because I'm a girl? Yes. But then at some point, you got to like put it down. It's not that entertaining. 
(10:58 - 11:02) It's like, calm the hell down. Like, calm down. It's not that funny. (11:02 - 11:15) It's funny because I'm biased. But everybody else, the whole other side of the sex is probably looking at you like you're retarded. Statistically, men are funnier than women. (11:15 - 11:25) They are funny. You know, that's not to crap on women as I am one. I like to be practical and I don't like to put women on a pedestal. (11:25 - 11:32) You know, just call it for how it is. Okay, Maddie, now let's deviate. No, it's not really a deviation. (11:33 - 11:41) So this, we watched a movie and which one wasn't? This one. This one. Oh, that's not the one. (11:41 - 11:52) This is the one. Just thought I'd show you guys our faces again because you're not already looking at them. This movie you just talked about. (11:52 - 11:55) It's called Lions to Lambs. Lions for Lambs. Lions for Lambs. (11:55 - 12:03) For those who are listening. Yes. And Maycee, you just alluded to the dichotomy and the paradigm. (12:03 - 12:09) Still the zeitgeist and the parameters of the left and right. Yes. Don't mock E.M. Burlingame. (12:11 - 12:26) Okay, it's my right hemisphere picking up on tics. Yeah, definitely tics and tics you want out of the system. This movie was done in 2007 and we just came across it last night, I believe. (12:27 - 12:42) Yeah. And it was directed by Robert Redford, which I believe that's this guy right here. That's the actor, which I haven't verified this yet, but supposedly CIA affiliations. (12:43 - 12:47) There's speculation. Either his character was? No, I think the actor. Oh, the actor. (12:47 - 12:56) The actor, not the teacher. Okay. Overall, I'd recommend people go watch it, but that was a movie. (12:57 - 13:09) So watching it for ourselves, there's a scene in it where there's a kid about our age. I think he's in high school or maybe early university. So either way, he's our age. (13:10 - 13:40) University student. 
And he was basically having this debate with the professor saying, why would I show up to a school and get propagandized? And why would I contribute to this system where the bureaucracy and the politicians, they have the whole system set up for them to win for me to lose? Why would I play? Yeah. And then the teacher was saying that it's like, well, you're not showing that you're caring. (13:40 - 13:53) But then he was like, what is your solution? Then I become a congressman. He's like, just so that I can end up like all of these freaking useless men that don't actually have any conviction. They don't care about people. (13:53 - 14:02) They just care about cheating the system. Like, that's how I'm going to make changes. I'm going to end up in the very system that's designed to kick people like me out. (14:03 - 14:17) And when me and Maycee heard it, both of us, while that was playing, we were like, yes, exactly. See, though, me and Maycee also have a dynamic that the character didn't, which is that we actually are politically involved. And we are on the stage. (14:18 - 14:27) We've been to the protests. We've been to the rallies. We're doing the thing, even though we have the same feelings that he does regarding the system. (14:28 - 15:25) And that was, I alluded earlier to the Derrick Broze interview that I had watched. This is the one. Derrick Broze is an investigative journalist, and they were talking about Palantir, Detachment 201, and also largely the MAGA movement and how they're responding to things like Palantir, because a lot of people that subscribe to Trump religiously thought he was the next messiah, or just generally, when you're lost and you don't have a map and you're not sure what to do because the world is a lot bigger than you. 
You look to somebody just like a kid looks to, you know, when a kid is trying to map out a playground for the first time, you're looking for the leaders, you're looking for the leaders and you ask mom permission before you go and say hi to the other kids. (15:25 - 15:38) And I feel like Trump became this thing for a lot of Americans. Hence, there was this movement and identity surrounding the MAGA movement in the first place. And so they were addressing this. (15:38 - 16:07) And when Palantir came out, a lot of people and RFK Jr. being the person that got in for this whole MAHA movement. Both Palantir being basically AI government, and we can explain that a little bit more, Maycee can, because even I hadn't done enough. I heard allusions to it, but I wasn't certain as to what it physically was doing. (16:07 - 16:48) And so this introduction to basically this technocracy via Elon and then now, to a lot of people's dismay, Robert F. Kennedy Jr. into the Trump movement, a lot of people, it's thrown them off and they're having to remap. It's like when suddenly this person cheats on you and now you're at a loss for words, you're incredulous because you don't know what to do. And so I feel like this whole movement, that's what's happened. (16:48 - 17:16) And with the introduction of the Palantir thing, and they were talking about in this interview, Derrick Broze addressing this fact about... People have been trying to justify, this is why Trump's doing it, or denying it outright. You know, they make whatever narrative they need in order to justify the actions, because it's hard to remap. (17:16 - 17:23) Let's be real. When you realize that one thing you were told, it's not the case. Yeah, they call it like 5D chess or whatever. (17:23 - 17:27) And it's like, just be patient. Trust the plan. Trust the plan. (17:27 - 17:32) It's all about something bigger. The white hats are in control. Yeah, I don't. 
(17:33 - 17:44) It becomes this QAnon copium is how I've started to understand it. And that's what they were talking about in this thing. And he goes a little bit far to say, it's not, I don't want to say it's far. (17:44 - 17:56) But Derrick Broze, he says most of the elections are rigged. And like you said, both sides are. It's the same, two wings to the same bird. (17:57 - 18:03) And Tom MacDonald said the same thing. You know, he doesn't. He's like, I don't vote because it's a stupid... (18:03 - 18:17) It's a game and I'm not going to play. Derrick Broze, to me, was giving the same argument. Why play their game when the elections are rigged anyhow? Um, which is also what the kid said in that movie, Lions for Lambs. (18:18 - 18:42) And which I do recommend people go watch it, because it does have, it's got a moral that you have to negotiate and find through what they're trying to tell you. Yeah, like dad was bringing up good context because I didn't even think about it. But that movie during that time was made by, I guess, what you would call Democrats to kind of slander the Republicans. (18:42 - 19:04) So, again, it's difficult because when you're watching those movies and let's say you were a Republican or let's say you lean a little bit more towards the pro-Trump thing in the MAGA movement, it's like those type of movies would piss you off because you would be siding with almost like a bit of Tom Cruise's character, who actually plays a neocon. And yeah, I think he's a senator. Right. (19:05 - 19:17) Yeah. And so you would be looking at that and going like, oh, my God, the way that they're portraying him is totally false, or it's like they make it sound like his arguments are stupid, but they're actually pretty good. It's like, what the heck? Like, this is bullshit. (19:17 - 19:32) Right. And so it's meant to piss you off because it further entrenches you into your line of thinking. 
And then another part, though, is in the film, they're kind of going like, OK, so they're a bit more on par and in favor of more, I guess what you could call Democratic views. (19:32 - 19:59) But even then, this is the whole thing about the limited hangout thing, because even the, I guess, characters that are portrayed as the more rational in the film, it's like they're also missing large pieces of the puzzle. Right. Where it's like almost like a fight of, well, we're fighting between whether or not, you know, like guys being being sent off to war and being killed like this is terrible. (19:59 - 20:10) Right. It's like, sure, for sure. But they fail to mention that, like the freaking guy, Tom Cruise's character, who's the senator or secretary or something, I think senator, he. (20:11 - 20:27) Is oblivious to the fact that it's actually like his government that we're also funding the other sides during the time of the because it actually is taking place during, I guess, the war on terror. Yeah. And it's like they're not they're not nowhere in the movie. (20:27 - 20:44) Is it going to be mentioning the fact that the West was actually funding a lot of those terrorist groups? Yeah. Of which actually now Al Qaeda has been taken off of the terrorist list. I wonder why it's like this is the kind of stuff that they don't mention in the film because that's probably one of the most important things. (20:44 - 21:05) And they would never put the elephant in the room because my brain always thinks like when movies come out, I'm always like, I know this went through processing and I know that this got approved to be sent into the public eye. So I'm always wondering why was this approved? What is it missing? And also, so I thought that that was something that even the even the Democratic side of the more rational people, they don't have a clue. Like they don't they're not even mentioning that either. 
(21:06 - 21:12) I thought there was one particular part in the film, though, where. Or they have a clue and they don't want to mention it. That would be. (21:12 - 21:30) Yeah. Or even the part in the film where she's just like getting called out for the fact that she helped aid in the, I guess, volunteering of all these guys going into the military at first, because she thought that this neocon dude was worth his salt. And then afterwards she found out she was working for an organization that lost its soul. (21:30 - 21:52) It was almost like, it reminded me of that saying where it's like, progress for the sake of progress isn't progress. And that's what it felt like her news organization was like. It was like facts and current events, right away, for the sake of facts and current events. And she's like, but what about the moral backing? And they're like, no, that's not what we do here. (21:52 - 22:17) And then like a part of me was like, yeah, I'm siding with this woman because I'm like, yeah, what about the moral backing? But another part of me was just like, but again, how does that play into the psyche of people when they go forward? Because even that film, I forget, it was like a Nicolas Cage film, and it was about stealing cars or had something to do with stealing cars. And literally the percentage of people that year who ended up stealing cars. Right, right. (22:17 - 22:26) Went up because of that film, because that film shows you a bit of how to steal some cars. It wasn't about overall... Actually, I think it was literally a whole heist on stealing cars. (22:26 - 22:37) But anyway, so that went up and I was like, OK, so independent journalists, and I kind of want to like deviate and say, screw you to the system, that are like, I want to put in my moral soul into it. Right. It's like, well, that would go up. (22:37 - 22:55) And it's like, there's nothing wrong with that. 
It's just a part of me goes like, it's so hard for me not to think, how does this go weaponized? Like, how does this, even the people that, a lot in the film I was pretty much in agreeance with, some of which I wasn't. I was like, how does that still get weaponized, though, against us? No, absolutely. (22:56 - 23:02) And you can't be totally fearful. Otherwise you become a helicopter mom. Moral of the story. (23:02 - 23:11) Don't become a helicopter mom. That's actually what some of the ones that we watched in preparation for this was. Go down. (23:11 - 23:37) Which one was it? So this one, 5G plus mRNA equals remote controlled humanity, with Dr. Henry Ealy. This one's on Man in America. Yeah, both me and Maycee were talking before we hit record that Man in America, neither of us loved him in the beginning because, when it comes to the podcasting realm, you know, there can be audience capture where you start to bring people on to you. You realize, oh, look, this episode got... you're creating an echo chamber. You create an echo chamber. And so you start to just perpetuate it in order to bring in funds or whatever the case is. (23:47 - 24:04) Keep the viewership going, you don't want to lose people that are interested in hearing only one opinion. But I have found that he has gotten increasingly better at discerning and bringing on people that he disagreed with initially. And he's just really expanded his horizons. (24:04 - 24:13) And I personally appreciate it more. But here's the other one. Is MAHA pushing the WEF's wearable agenda, with Dr. Henry Ealy as well? So those are those ones. (24:13 - 24:17) Both with Dr. Henry Ealy. Which I love that man. I really like Henry Ealy. 
(24:17 - 24:48) OK, so I said to Maddie before we press record, I was like, I really want to hit home some of what Seth and Henry Ealy talk about in these two podcasts, because they are emphasizing the fact that it's like, it doesn't matter if it's your guy in or not. You have to take a look at what's going on, like what we were kind of criticizing before, with that whole like 5D chess thinking and like a bit of a QAnon copium. It's like, what is happening right now in MAHA is in those videos. (24:48 - 25:08) So you'll see that Seth plays a video of Robert F. Kennedy Jr. saying that in the next four years, he would like it for all Americans to be wearing wearable technology. So that's like smart watches, maybe, I don't know, having an Alexa in your house, whatever. It's wearing technology on you that collects your data. (25:08 - 25:23) And he was saying that, and like Henry Ealy worked with and was very much in favor of promoting Robert F. Kennedy Jr. in his campaign and trying to get him in. Like he knew him. He was friends with him. (25:23 - 25:34) He was very much on par with him. He likes him as a man. But then once Robert F. Kennedy Jr. got in, he was like, dude, you have to be getting these vaccines off the schedule. (25:34 - 25:50) Like if you really want to make America healthy again, that needs to start right now, because you are going to get sued into oblivion and it might not happen by the time your term is over. Right. It's like we got to get on it now because you're in for a real fight. (25:50 - 26:07) And of course, there's that idea where it's just like, but they could kill me. And like Henry Ealy was just like, then you would be the third Kennedy that the U.S.... In which case, that's big. That's big. (26:07 - 26:23) I mean, granted, E.M. 
Burlingame brought up a good point where it's just like, yeah, well, once the first two Kennedy brothers died, we didn't really do much. Like a part of me, I'm not too closely hitting home on that one because I'm very, like, I'm young. Right. (26:23 - 26:34) So I'm further away from that. So for some people who would feel more, I guess, sympathetic towards that. But my brain was just like, well, we're still in the messed up system and it's still heading towards technocracy. (26:35 - 26:54) So for sure, we didn't do anything about it. But when he was talking about the fact that Robert F. Kennedy Jr. was advocating for these wearables and how it's basically just another form of slavery, just under a different name, I was like... Seth Holehouse was like. (26:55 - 27:17) It doesn't matter if it's our guy or not. Like if you picture Biden or if you pictured Kamala saying that in the next four years they want people to wear wearable technology, you would be losing your freaking mind. Right. And so they were like, you got to look at it, I guess, in their eyes objectively, which me and Maddie would just say, look at what the action is and what it is that they're taking. (27:17 - 27:48) And I get the whole, this might be a part of a bigger plan, but the reason I like Henry Ealy is because he actually is like, no, no, no, like here's the actual, like, written-out-in-crayon plan of what RFK could have done to start fighting against the vaccines first, and not fighting against particularly food dyes or Froot Loops or anything like that, because that's very downstream of what is a huge problem in America. It's the biggest elephant in the room and no one's talking about it. Oh, yeah. (27:49 - 27:58) And specifically, oh, the mRNA. Are you about to mention the... amplifying itself? Self-amplifying. Yeah. 
(27:58 - 28:18) Well, he was saying, you'd figure you take the vaccines off and stop the mRNA technology. But not only did he not stop mRNA technology, then he's promoting self-amplifying mRNA technology. Henry Ealy was like, we voted you in to get rid of it. (28:18 - 28:27) And now you're putting in more. Yeah. And a worse kind, because the self-amplifying stuff, that's insane. (28:28 - 28:37) That's yeah. Even the Japanese people, who were the first to get subjected to it. You can, if you guys want to know a little bit more. (28:37 - 28:56) This came out nine months ago. Japan Rising, on the official Corbett Report Rumble channel. And Japan was protesting. Japanese people, I mean, they're not naturally disagreeable people. (28:56 - 29:25) And they were going, no, no, thank you. So the fact that, and Dr. Henry Ealy, I mean, I appreciate his perspective because he's talking about bodies on the floor. He said, I don't... 5D chess, whatever copium you want to give yourself to justify why RFK Junior is doing this and why Trump is going along with it and all this jazz, whatever he's saying. (29:26 - 29:37) It's not OK. Even if it's for this bigger picture, because people are dying, and the self-amplifying mRNA stuff, that's just going to hurt more people. He said it's not OK. (29:37 - 29:56) He said that he was starting to hear from doctors, and doctors were making the statement that it's like, that's an acceptable amount of casualties. And then he was like, well, great. Now doctors are thinking like the military complex, where it's just like, oh, that's an acceptable amount of casualties, right? Like, reverse correlate, instead of forty-six thousand dead. (29:56 - 30:02) There's like eighty-two thousand alive. Isn't that great? It's like, there's people dead. It's like, right. 
(30:02 - 30:15) I mean, I'm all for celebrating victories, but because they were saying, because it got administered all over, across the world, it's like, oh, that's an acceptable amount of deaths. That was to be expected. It's like, I thought the whole point of vaccines was to save lives. (30:16 - 30:29) None of them should be killing anybody. Yeah, yeah. Like, what do you mean? Like, oh, well, I get the whole, you go ahead and you decide to, I don't know, eat a Pizza Pizza, and you know that a part of you is like, oh, it's not going to be really healthy for me. (30:29 - 30:32) Right. But it's not like you're expecting "a fair amount of casualties." (30:33 - 30:43) It's like, it's just not a thing. Yeah, yeah, that's for sure. And but the point that hit at home for me was the fact... (30:43 - 31:09) So they played the clip of RFK Junior saying, in the next four years, I want basically all Americans to be wearing wearables. It's like, OK, but then right afterwards, they played the clip of Klaus Schwab basically saying the same thing earlier, saying wearables, you know, that's the way of the future, because it also leads the way to having wearables in the skin, you know, chips and. Neuralink. (31:10 - 31:16) Neuralink. And that's the Internet of Bodies, as Catherine Austin Fitts and everybody else has been quoting. So. (31:17 - 31:21) I haven't looked into that as much. I definitely want to look into that more. The Internet of Bodies. (31:21 - 31:48) Oh, like, are they setting up now, like, I guess, university and educational programs for this? I can't say right now for certain, but I definitely want to look more into that. Yeah, something that's worth looking into. And the fact that RFK Junior, who is painted as on the opposite side, is quoting, get this, a thing that Klaus Schwab is in favor of. 
(31:49 - 32:08) And I don't know... if there's people that are reconciling this, I'd love to hear it, because I can't make sense of it. I don't think... that says a lot in my brain. And even the fact that it aligns with the 2030 timeline. In the next four years... (32:08 - 32:19) Well, it's basically almost 2026. You're right on par. You're right on par for where all of the WEF and the people that the MAGA and MAHA movement supposedly despise. (32:19 - 32:31) That's right on their timeline. How are we playing into that agenda? Yeah, yeah. Like, I get the idea of wanting to use technology to maybe improve health. (32:33 - 32:47) But a part of me goes like, you don't need to announce that as a big freaking agenda-scale idea that you want to see happen. You could literally just be like, you don't need to say anything about it. You don't need to say that you're in favor of it. (32:47 - 33:02) It's like, if people decide that they want to go and put that freaking shit on their body because they think it's going to help them, because they get to keep track of their steps for the day, because someone online said, get 10,000 steps in and you'll lose weight. It's like, I don't care. You go ahead. (33:03 - 33:12) And if you want to do that, you go ahead. Right. But you do not need a freaking political figure saying that they support this thing, almost like a Klaus Schwab. (33:12 - 33:29) It's like, to me, I'm like, that speaks volumes, for how much, even for a voice like yours, that's almost like a manipulation of your voice for sure, because you already know you were hugely supported. You are hugely supported, right? For Children's Health Defense. It's like you built yourself a reputation. (33:29 - 33:35) You built yourself a rapport. And what happens? You abuse it. That is abuse, like, hands down. 
(33:36 - 34:12) And even in the videos, with the idea of wearables — something I didn't even think about, that I thought was pretty sharp when Seth brought it up — it's like, what if they... I don't know, you have your little watch on and it's keeping track of your heart rate, right? And they put out certain tweets, or they put out certain governmental statements, or they put out any sort of certain stimuli, and boom, they have a whole database of how the populace reacted to it. Yeah. That is scary to me, because I'm like, if they weren't already good at manipulating us, they sure are going to figure out more now. (34:12 - 34:24) And then my brain also thought — no one really said it directly, but when they said now they also have a huge collection of your biometric data, I was just like, well, now they know how to make really effective bioweapons. Right. (34:24 - 34:42) I was like, that's no good. That's... yeah. And this is part of the reason why, you know, mode of attention from our perspective is so important, because even with things like AI, as we're talking about — because you look at things like Palantir. I want to talk about limited hangouts. (34:43 - 34:54) Oh, that is so limited. Well, not just that — I was listening to another one on The Last American Vagabond, and it was with Catherine Austin Fitts and a couple other individuals. I actually don't know. (34:55 - 34:59) Hey, send that to me. I actually haven't watched that yet. I'm not sure I can do that. (35:03 - 35:15) And Joe Rogan said, I can't believe I'm saying this, but I think we need AI government. I want to talk about, again, a limited hangout. Abusing your voice. (35:15 - 35:52) Now, Palantir. So Maycee — I was actually asking her, I was like, can you give me a summary? What is Palantir? I get this Detachment 201 and it's technocracy in uniform, but what is Palantir? And she said, you know the movie Winter Soldier, which is the second movie in the Captain America series for Marvel?
And she said, the scene where those aircraft carriers are going up, and the goal was to target anybody who was a potential threat to Hydra, which is just their, you know, euphemism and their analogy to Hitler. (35:54 - 36:07) It was going to take them all out. It was automatic. Based on a formulaic data calculation, almost like a game theory type calculation, that in the future this person's going to be a threat. (36:07 - 36:09) Yes. So we should eliminate them now. Yeah. (36:09 - 36:23) And now I thought, okay, that's... Which Israel is using now on Palestinians. Yeah. Well, isn't that why Ted Cruz said, Israel and their intelligence — (36:23 - 36:25) we need them. It's like, right. Yeah. (36:25 - 36:33) So we can use the Palantir on the rest of the Middle East. Great news. But you said Palantir Gotham. (36:33 - 36:39) Or even, with the Golden Dome, Palantir on us. Yeah. Yeah. (36:39 - 36:40) Yeah. Right. Yeah. (36:40 - 36:50) And they're literally using Palantir Gotham. I don't know if people find this funny, how many movies we allude to. I haven't even mentioned all of them. (36:50 - 37:03) My God. When I'm writing our Substack posts, I go back and listen to these every time, because I have to figure out what it is that we referenced, so it's in our Substack. Everything, by the way, guys — if you guys are like, okay, I'm loving Holmes Squared, and I want to support Will, so you're watching it on his platform — (37:04 - 37:08) no problem. But if you guys want to go find the actual links and the sources, it's on our Substack. Yeah. (37:08 - 37:13) Maycee posts them. I post everything that we reference. That way, if you're like, okay, I want to go find something — (37:13 - 37:16) that's where you can find it. And we reference movies. Yeah. (37:17 - 37:27) So in this episode alone, I pulled up this one, which we talked about already. I didn't talk about this one.
This is Greenland, which is another predictive programming sleeper, made in 2020. (37:27 - 37:38) So while people were kind of hiding in their homes — and death by comet is kind of the overall plot. Overall, the movie was a good watch. Heartbreaking. (37:38 - 37:44) Yeah. But they actually do have a nice message about humanity, which we don't often see. Yes. (37:44 - 38:01) I was surprised to see how much humanity and how many unsung heroes were referenced. Like, when you watch Greenland, if you do pay attention to all the people that made the ending possible — all of them are unsung heroes. Yeah. (38:02 - 38:07) Not main characters. I was kind of blown away at that. I was like, wow, look at the humanity. (38:07 - 38:19) That was like one of the only gold coins in this whole predictive programming movie. And then there was this other one called Contagion, because we were talking about the vaccines and all that jazz. And this came out in 2011. (38:19 - 38:37) Um, you know, a similar timeframe to the H1N1 pandemics that were going on. But in this one, if you watch it, they blame it on — what are they called? The wet markets. (38:38 - 38:47) It was China, with mixing pigs and bats, and then you get this contagion. And that's what they said for COVID. (38:47 - 38:59) So it's like, look at this predictive programming. Which is why — yeah. The thing with predictive programming, if people don't already know, is that they're literally giving you this narrative to get your mind ready for it. (38:59 - 39:16) So that way it doesn't shock you. Or they explain it after the fact, to basically go, this is what happened. Meanwhile, like Maycee said, with Lions for Lambs, they kind of paint this politician's greed —
(39:16 - 39:45) Um, a bad Republican aiming to be president, all of that — but they completely leave out all of the clandestine operations that we've talked about previously: the coups, military coups and stuff to get rid of different governments in the name of democracy, and to control via proxy states. They don't mention any of that, or the terrorist groups that were created by the intelligence agencies. They'll leave that out, and they'll explain it another way. (39:45 - 40:08) Yeah. It's interesting, because a part of me, as I was watching that, figured the message was basically for young people watching, to kind of go, hmm, I wonder how I'm supposed to get involved — because that's how it felt. It felt like it was meant to recruit a young person's mind into thinking, how should I help the system? Right. (40:08 - 40:34) Which is actually a fine question, but maybe it's less so, how do I help the system, and more so, how do we reshape the system? Great reset. Because it's either a military recruitment movie, right? Where it's like, oh, now I want to go join the military. Um, or it's a political advocacy recruitment movie. (40:36 - 40:54) Um, or a media recruitment movie. And I was thinking to myself, I wondered if we were doing something wrong, because a part of me was like, hmm, if we're within the parameters of the system, maybe what we're doing isn't the thing that's going to be helping. Right. (40:54 - 41:09) Because you've got to ask yourself this question. And my brain was like, well, I still haven't learned. This is why we admire people like Matthew Ehret, because he's trying to learn from historical figures, from what their ideas were, how to make the system better. (41:09 - 41:15) I was talking with a friend, um, Cyril. He's on our At Home with Holmes channel. He's great. (41:15 - 41:24) Love him.
And he was talking about how he just finished one of Matthew's books. Um, the Eurasia one — I don't know the title of it. (41:24 - 41:28) Yeah. Canada's potential Eurasia future. Yes. (41:28 - 41:43) Yes, exactly. And as he was describing it, and as I've done a lot of Matthew's work, I was like, well, at least this is an idea, in the sense of: what would a society look like that's based on actually wanting to promote — right, (41:44 - 41:53) something else. And something that may or may not be helpful from the historical past, from, you know, people's past ideas. Because even in the film, right, (41:53 - 42:01) there were two dudes, and they were giving a presentation. And, uh, I think, if I'm not mistaken, the program was like military service. Right. (42:01 - 42:07) Almost like — I don't know about a requirement, I'm not entirely sure. It was part of it. (42:07 - 42:23) It was not strictly military service, but it was something along the lines of — oh yeah, the Peace Corps as well. Along the lines of, don't just sit here getting a bunch of theoretical knowledge; actually go out in the world and experience it for yourself. (42:23 - 42:30) If anything, I believe, if I'm not mistaken, it was one of JFK's — sorry. Yeah. JFK. (42:30 - 42:50) Yeah. One of JFK's ideas, where it was like, students should go out into the rest of the world, other countries, and just go spend some time there and actually go meet some people, go fix something, go build some houses, get your hands physically into what it takes to actually make something. (42:50 - 43:05) See, like the whole essay about what it takes to make a pencil. Right. And then all of the things that go into it — making the little metal band that goes around it, the actual forming of the wood, collecting the wood, the shaping, the lead, what the erasers may have. (43:05 - 43:10) It's all these things. Right.
It's like, go figure that out and then integrate it. (43:10 - 43:17) Expand and then integrate. Right. And I was like, yeah, that part makes sense to me. (43:18 - 43:37) So yeah. So I was thinking about it in terms of us, and in the sense of what it is we're aiming towards. I just want to keep learning what would be some better ideas and better systems. And then on top of that, man, I wish I knew how to build a house, you know what I'm saying? Just stuff like that. (43:37 - 43:40) But yeah. Yeah. It was a fun little caveat. (43:40 - 43:58) This is... me and Maycee just recently passed our amateur radio test. We took that. And the knowledge I didn't expect to gain from taking it — (43:58 - 44:23) it was a lot. And honestly, the reason we took it — I know for me, anyway; Maycee, you can speak on behalf of yourself — was for what you're talking about right now: gaining more knowledge to help however way I can. Now, if some kid doesn't know how electricity works — even people talking about welding in our circle will talk about inductance, (44:23 - 44:34) and now I know what that means. And even the electrical circuits that run our own house, and how to wire certain things. These are things that I didn't contemplate until we took this thing. (44:35 - 44:48) And initially we took it for the utility of potential future scenarios of helping people. And then there are the things it has opened up to help in other ways. Even the understanding of electricity helped in the sense of understanding the body as well. (44:48 - 44:56) Which he really talks about too. Yes, he does. He talks about frequency, with red lights and saunas. (44:56 - 45:36) And I'm like, yeah, frequency, man. And learning about it from the radio, learning about it from Arthur Firstenberg's book —
I was just like, it's interesting, because when you start to learn how things work — or at the very least how we know them to work right now; if you're a good scientist, how we know things to work right now — it helps you go, okay, so if this is correlated, and this is correlated, and these might work in somewhat similar ways — or let's have the scandalous idea of thinking they might be similar — then how would you avoid this problem happening, right? How would you solve this problem? Because now I know how it works. (45:36 - 45:48) So maybe I can learn how to actually change it, so that it could be used for something good. And hence why I'm not a Luddite, in the sense of, AI bad, or technology bad. (45:48 - 46:15) It's just, in the hands of the wrong people — yes, definitely bad. Yeah, well, that's why we... And also in the hands of not understanding, because with what we know now about frequency and radiation, it's like, nah, there really are some things we have to consider, in the sense of the ratio of the pros over the cons, you know what I mean? Yeah. Yeah, that's why mode of attention is so important. (46:15 - 46:34) So, the reason why I brought up movies earlier — because we were talking about AI, and Palantir being AI government. And you alluded to, before we hit record, Palantir Gotham, which is like a subproject under Palantir. And Gotham — from Gotham City, everybody knows. (46:34 - 46:49) And you and I are making all these movie references, but this Palantir Gotham thing is involved in counterterrorism, cybersecurity, all of those things. And under... Counterterrorism. I'm sorry. (46:52 - 47:01) Counterterrorism. This is the frustration I had with the movie we were watching last night, because I was just like, nobody's talking about this. Nobody's talking about the clandestine operations.
(47:01 - 47:15) Nobody's talking about the fact that it was us that freaking manipulated and made it come to be. But anyways, counterterrorism — it's like, who defines the term terrorist? You know what I mean? Yeah. It's like, sure, killing a bunch of people — terrorism. (47:15 - 47:20) Yes, I would agree. Your government does that. That's true. (47:21 - 47:29) So... And we don't call them terrorists. And they're probably the best at creating them. Yeah, we don't call our health system terrorists, but people do. (47:29 - 47:34) Because what do we know about the left hemisphere? Hall of mirrors. Everywhere it looks, it wants to see more of itself. Right. (47:34 - 47:50) So, you're a freaking radical person that's a eugenicist that wants to kill a bunch of people, and you want to do it covertly. You know what we're going to do? We're going to make a bunch of subgroups of people that are freaking eugenicists and want to kill a bunch of people, or whatever the heck — or just ideologically possessed. (47:50 - 47:58) And they're going to want to go kill a bunch of people. And we're going to spike up forever wars. We're going to supply them all the stuff they need to go and do the killing. (47:59 - 48:02) We will give them the guns. We will give them the ammunition. We will give them all these things. (48:02 - 48:09) And we will profit from it. Immensely. Because they get to destabilize the region via the terrorist groups that they created. (48:09 - 48:17) And then they get to go in and insert their democratic puppet. Hello, Bibi Netanyahu. Now, back to the AI thing. (48:18 - 48:25) Sorry. No, no, no. It was very valid, because that was the big elephant in the room that the movie didn't talk about, which you need to talk about. (48:25 - 48:30) But I watched this little one. This is Press for Truth. And it's only 17 minutes. (48:30 - 48:37) And it's on AI — ChatGPT psychosis and stuff like that. And again, mode of attention is so important.
(48:37 - 48:59) There are people that — again, stupid movies. There are people, I'm sorry — in that video there are a bunch of cases of people literally developing relationships with AI, because there are AI chatbots and stuff like that that you can converse with. (48:59 - 49:18) So they're either developing relationships with them, or they are asking — there's a guy that basically asked about the Matrix theory. That theory where you're not actually in your own body. (49:19 - 49:42) I can't remember — there's an official name for it, but it's the Matrix concept, the simulation theory thing. And he was saying to the chatbot, do you think that's a thing? Do you believe in the simulation theory? And the bot affirmed him — basically pulled the therapist move, gave him the positive affirmation for his gender... simulation dysphoria. (49:43 - 49:59) And so this guy tried to — I don't know if he threw himself off a building. I don't know if he did and he died, but he went to do the thing. I just can't remember the outcome of doing the thing, because in the Matrix, you can just jump across buildings. (49:59 - 50:10) You know, it's a simulation, if you really put everything in your mind. Anyway, the point is, there are people falling for this AI delusion, this schizophrenic-like thing. (50:10 - 50:36) And the reason why I said movies — you know what's funny is, even in The Matrix, though, it was all of the freaking machines, right, that were taking over, and the AI, that created the Matrix in the first place to keep you trapped. Why would you trust the AI in the Matrix — or why would you trust the AI to tell you that you're in the Matrix, if it's them that created the Matrix? And then the result is you want to go kill yourself. (50:36 - 50:48) They're like, well, yeah, mission accomplished. If that's the case, I don't know.
But the irony about movies and limited hangouts and predictive programming and normalizing all these things we've been talking about is that there is a movie — (50:48 - 50:57) I think it's with Joaquin Phoenix — it's called Her, I think. Okay. (50:57 - 51:00) I thought you were going to mention the other one. No, Her. It's called Her. (51:00 - 51:06) Okay. I think. And it's about a guy who develops a relationship with an AI. (51:06 - 51:23) It's funny how these movies always come out at such opportune moments. But when we're talking about AI, mode of attention is super, super important, because there are genuine concerns — the wearables, the Palantir movements, even Rumble. Most of the links we show you guys: Rumble, Rumble, Rumble, Rumble, Rumble. (51:23 - 51:39) And Peter Thiel is not a good person. But half the reason we use Rumble is just because there's a better guarantee of not getting pulled off, at this moment in time. We can say "vaccines" and most of the time we're okay. (51:40 - 51:48) So it's by default that these things happen. But it doesn't mean — Rumble or YouTube, whichever one you pick, you're still getting spied on. Yes. (51:48 - 51:58) Oh, and the ads. Oh my gosh, the ads. Anyhow, but mode of attention is super, super duper important when it comes to the AI game. (51:58 - 52:12) Because — Grok — there are some people using AI to help build bridges and actual infrastructure, and that's smart. You know, you use it to... Yeah. (52:12 - 52:33) Like Mike Adams, who's coming up with his Enoch AI system. Actually, he's building it for the sake of — I think what he did is, his wife, if I'm not mistaken — is she Chinese? Or she might speak it. (52:34 - 52:37) Not sure. Oh my God. Yeah. (52:37 - 52:40) Yeah. They both speak Mandarin. Mandarin. (52:40 - 53:04) Thank you. Him, his wife, probably another set of people that I'm not giving credit to — sorry.
They created this program, and it's almost like a holistic book, if you want — well, not just holistic, but a medicinal and also survival type of AI system, where you put in your symptoms, (53:04 - 53:14) and here's some data that they collected from, even like, Chinese medicine, probably other different countries. Yeah. More holistic. (53:14 - 53:26) Yeah. And then they put in the natural, homeopathic type of solutions, right? That's them and their prepping — they're huge on the prepping, survival gear. (53:26 - 53:30) Yeah, exactly. Like how to clean your gun. Exactly. (53:30 - 53:53) So it's stuff like that, where it's someone trying to use AI and putting a bunch of data into a system that would actually help people. Right. I mean, because that's the thing with AI that people have to understand — and I don't mean to say garbage in for this one, because I'm like, hey, that's pretty cool that he's doing that — but garbage in, garbage out, in the sense that the person putting it together — (53:53 - 54:07) we still have to have in mind who that is. Right. It's like Palantir — if I'm not mistaken, it was used after 9/11 to find Osama bin Laden. (54:07 - 54:21) Someone set up an AI system, and they are allowed to set the parameters of the data that they choose to put in there. So it's like, oh, this is a person who's a potential enemy or potential threat. And I'm not talking about Osama bin Laden now. (54:21 - 54:31) I'm just using a general thing, which is, oh, this person's a potential threat. It's like, you're the one who put in the parameters of why they're a bad potential threat. So again, who's setting those parameters? (54:32 - 54:52) Yeah.
And if people want to go and listen to that one, I would recommend it, because, first of all — since Maycee wants to talk a bit more about E.M. Burlingame, because he's a good psychological case to break down — he and they talk about this Enoch, this new AI built around alternative health. (54:52 - 55:06) And they bring up a lot of the tech behind it too, and what it takes to create it, and what the biases are, and stuff like that. It was interesting, because I don't have a tech background, so them asking those kinds of questions... (55:07 - 55:14) It's on the Tommy podcast, for those that have never heard of the Tommy podcast. He's a young man. Oh, they have. (55:15 - 55:19) Again, I make the Substacks, so I know what we've referenced. Yeah. (55:19 - 55:29) Well, I'm assuming people on Will's platform who haven't gone over to our Substack yet haven't heard of the Tommy pod. Well, they've heard us talk about it. And if we do, then I put it in the references. (55:29 - 55:43) At the very least, everybody, you know that we don't go to just one source. We source from many, because everybody in my household has different people that they like listening to. I always call Tommy... (55:44 - 56:07) The Tommy podcast is something that reminds me of my brother, because Tommy, being a young man, calls himself mentally deficient — he'll say, I'm mathematically illiterate, can you explain to me what this is in layman's terms? And he references video games to help map things, just how we reference movies all the time. So we're going to have to make a movie recommendation list. (56:08 - 56:18) Well, we'll think about it — which will be in our Substacks, if ever anybody wants to go there. But we've been talking for 56 minutes. I mean, we've got to cover Mike Yeadon. (56:19 - 56:34) I wanted to, but maybe we can do that another time.
With Mike, we alluded to the 2030 agenda. So, Oracle Films — they are a very good resource, just because of the stuff they cover. (56:34 - 56:44) So The Agenda, which is a recap — you know, people talked about the 2030 agenda way back, like back in 2020. This is a recap, if people want it. Yeah. (56:44 - 56:53) Just to kind of get yourself re-acquainted, really, truly, with some of the bigger-picture schemes that are going on. It's just a good refresh. Yeah. (56:53 - 57:13) And what we've experienced right here — Dr. David Martin, when he goes over the patents for COVID, that's a very good presentation. It's really dense. He spoke at the second Injection of Truth that we did in Calgary. (57:13 - 57:22) So people, again, can go watch those, because those are still very good resources. Yeah. And then, lastly, we'll just end it off here. (57:22 - 57:47) Check out The Final Warning, the Dr. Mike Yeadon interview on Oracle Films. Just to keep it short, I think it was a good interview — or not interview, but I guess him sitting in front of the camera, talking to the people, giving his spiel. I'm pretty sure it was Kyle or someone that said this isn't going to be the final warning, because he's done that a couple of times. (57:48 - 57:57) Mike Yeadon just gaslighting people, saying this is the last time. It's like, no, it's not, but we love you anyway. But it was well put together. (57:57 - 58:14) You could tell that there was genuine intention in how he wanted to get his message across, and even in what it is that he's learned. And honestly, in terms of his theory on the flu not being contagious, I'm 100% siding with that now. So if you guys are like, what? — go check it out. (58:15 - 58:24) Seriously. It's pretty good. It operates on the theory that it's not viruses — as in little particles, aerosols, that travel; rather —
(58:25 - 58:40) it's, um, your own homeostasis getting thrown off. And he brings up stuff like migraines. He's like, you don't blame other people for your migraines. You get migraines because the homeostasis in your body, your balance, is no longer balanced. (58:40 - 59:00) And he said it's the same thing for when you get sick. You know, you can be in the same room — there are studies. Arthur Firstenberg is who Maycee alluded to, but there are also studies he alluded to, which aren't Arthur Firstenberg's, that do the same thing, where they tried to have healthy and sick people swap so they could spread it, and it didn't work. So it's this internal balance. (59:00 - 59:25) And when that goes off — even the thing with the liquid in the lungs, which he brings up, which is super insane. And he touches on all of this digital realm, the technocracy, what you have to watch for in terms of, um, the digital currencies and the digital IDs. And even with climate change — I think the funniest thing he said, to me, was the bubbly drink. (59:25 - 59:39) You know, everybody says carbon bad, right? And he's like, look, everybody knows that when it gets warm, your carbonated drink loses its bubbles. It's not that losing its bubbles heats it up — it doesn't heat up in the fridge. (59:39 - 59:46) It gets warm, and then the carbon leaves. Well, that's how the earth works too. It's like a big bubbly drink. (59:46 - 59:50) It gets warm, then the carbon leaves. The carbon doesn't make it warm. That's not how that works. (59:51 - 59:53) Yeah. Yeah. But we digress. (59:54 - 1:00:03) Definitely go watch it, because it hits home. It really does hit home, and it brings health to a whole new consideration. So, yeah. (1:00:04 - 1:00:10) Thank you, everybody, for enduring our conversation. I was thinking it was going to be 30 minutes. It's not. (1:00:10 - 1:00:20) We really need to figure that out.
But if you guys stick around for it, then great. Because I'm the type where it's like, you're more than capable of doing it. (1:00:20 - 1:00:30) I know that people have busy lives, but thank you for taking the time. And yeah. So this has been Holmes Squared.