Get Your YouTube Summary:
- Copy the video link.
- Paste it into the input field.
- Click 'Summarize' to get your summary with timestamps.
Important Note on Subtitles:
- Automatic summary generation requires English subtitles on the video.
- If the video has no English subtitles, automatically fetching the transcript from the link will fail.
- Manual Alternative: You can still get a summary!
- Find the transcript on YouTube (usually below the video description when viewed on a desktop browser).
- Copy the entire transcript text manually. (Need help finding/copying? Watch the 'Demo Video' linked at the top right of this page).
- (Optional) Add any additional instructions after the transcript (e.g., 'Translate the summary to German.', 'Add a glossary of medical terms and jargon to the summary.').
For videos longer than 20 minutes:
- Select a Pro model for automatic summarization. Note that Pro usage is limited daily.
- If the Pro limit is reached (or if you prefer using your own tool), use the Copy Prompt button, paste the prompt into your AI tool, and run it there.
- 0:00 - The Need for Honesty in Vaccine Discussions: The video opens with a call for honest, open conversations about vaccines, treating them like any other medical treatment with both benefits and risks. Dr. Joel Gator, a pediatrician, advocates for informed choices.
- 0:21 - Flawed Vaccine Study Design: A major point of concern is that vaccine studies often use non-inert placebos (e.g., other vaccines) instead of saline, potentially masking side effects and preventing true safety baselines.
- 0:28 - Individual Variability in Vaccine Response: Emphasizes that individual genetics and immune systems vary, leading to different responses to vaccines, highlighting the need for personalized approaches currently being studied by institutions like Mayo Clinic (vaccinomics).
- 0:47 - The Taboo Nature of Vaccine Discussions: Dr. Hyman shares his personal experience of being labeled "anti-vax" simply for recounting a patient's history involving an MMR vaccine reaction, illustrating the widespread suppression of such discussions in the medical field.
- 2:01 - Challenging Medical Dogmas: Compares the fixed dogma around vaccines to historical medical beliefs (e.g., autism caused by "refrigerator mothers," ulcers by stress) that were later overturned by scientific inquiry, stressing that science should always involve questioning.
- 3:08 - Impact on Trust and Transparency: Argues that the inability to openly discuss vaccine risks and potential side effects, often met with censorship and vilification, has led to a significant decline in public trust in medicine and health authorities like the CDC.
- 5:00 - Dr. Gator's Stance and Book: Dr. Gator clarifies he is not anti-vaccine and administers them in his practice, but he opposes forced vaccination and promotes his book, "Between a Shot and a Hard Place," which aims to provide balanced information on vaccine pros and cons, addressing the gap between pro-vax and anti-vax narratives.
- 10:25 - The Expanding Vaccine Schedule: Discusses the dramatic increase in the number of mandated vaccines for children over the decades (from a few to 72 jabs), raising concerns about the cumulative impact on a child's developing immune system.
- 11:07 - Pharmaceutical Indemnification: Explains that following issues with the swine flu vaccine in the 1980s, pharmaceutical companies were granted legal protection from liability for vaccine injuries, which removed their incentive to produce safer vaccines and contributed to the proliferation of new vaccines.
- 11:43 - Vaccine Injury Compensation Program: Notes the existence of the Vaccine Adverse Event Reporting System (VAERS) and the federal government's payout of over $5 billion to vaccine-injured individuals, which seemingly contradicts the "safe and effective" narrative.
- 16:22 - Lack of Independent Safety Research: Argues that pharmaceutical companies have no incentive to fund long-term safety studies that might reveal problems with their products, and scientists who do conduct such research are often vilified, creating a disincentive for crucial inquiry.
- 27:38 - Thimerosal (Mercury) in Vaccines: Highlights the historical use of thimerosal (ethyl mercury) as a preservative in vaccines, noting that total mercury exposure in babies from vaccines exceeded EPA safety limits. It was quietly removed from most vaccines around 2000, but its past use underscores a lack of comprehensive safety assessment.
- 32:28 - Absence of Long-Term Prospective Studies: Emphasizes the critical need for long-term (e.g., 5-20 years) prospective studies comparing health outcomes in vaccinated versus unvaccinated populations to identify potential associations with chronic diseases like asthma, allergies, or autoimmune conditions.
- 39:19 - Susceptibility and Immunological Changes: Proposes a theory that modern lifestyle factors (C-sections, lack of breastfeeding, early antibiotic use, environmental toxins) can dysregulate a child's immune system, making them more susceptible to adverse reactions from vaccines.
- 48:38 - Autism and Vaccine Triggers: Clarifies that while vaccines are not believed to cause autism, they could act as a trigger for immune dysregulation in susceptible children, leading to neuroinflammatory conditions like regressive autism, which is observed by many clinicians.
- 52:49 - Call for a Dispassionate Data Review: Advocates for an independent, unbiased commission of respected medical professionals to compile and present all existing vaccine research data, including both benefits and potential risks, to doctors and the public in an understandable format.
- 1:04:25 - Personalized Vaccination Choices: Dr. Gator advises parents on informed consent, emphasizing their right to choose whether to follow the CDC schedule, a slower schedule, or selective vaccination based on individual risk-benefit assessment, rather than being forced.
- 1:09:25 - Debating Early Vaccination: Discusses the trade-offs of vaccinating infants with developing immune systems. While early vaccination provides protection, it also means exposure to potential risks (e.g., fever leading to hospital visits and invasive procedures) at a highly vulnerable stage.
- 1:12:21 - Advocacy for Public Debate and Research Funding: Calls for public, televised debates among experts on complex vaccine questions and urges government agencies like NIH to fund research that addresses unanswered questions and improves vaccine safety, stressing that genuine scientific inquiry should never be shut down.
- 1:15:00 - Conclusion and Book Promotion: The discussion concludes by reinforcing the idea that questioning vaccines is about wanting safer products, not being "anti-vaccine," akin to wanting safer fish or planes. Dr. Gator's book is recommended as a resource for balanced information.
- 0:00 - The Human Cost of AI Perfection: The discussion opens with an anecdote about using AI (ChatGPT) to resolve a relationship conflict, illustrating how "perfect" AI-generated responses lack the authenticity and human imperfection that are vital for genuine connection and personal growth.
- 0:33 - Journey Over Destination: Simon Sinek articulates his core philosophy: life is about the journey and the inherent value of doing the work (e.g., writing a book, solving a problem) yourself, as this struggle fosters resourcefulness and intellectual growth, which AI bypasses by providing instant solutions.
- 0:58 - Erosion of Human Skills: Sinek warns that over-reliance on technology can lead to the "giving up" of crucial human skills (e.g., knowing what to do when a friend is struggling, coping with stress), comparing it to having a boat but not knowing how to swim when a storm hits.
- 4:14 - The Irony of Job Displacement: An interesting parallel is drawn between the displacement of factory workers by robotics (met with calls to "reskill") and the current threat of AI to knowledge workers, noting the ironic shift in societal concern and the discussion of universal basic income.
- 7:16 - Authenticity vs. Artificiality: AI-generated content (emails, art) is making digital interactions feel inauthentic, creating a new premium on "human-written language" and the beauty found in imperfection, drawing on the Japanese concept of Wabi-sabi.
- 12:55 - Embracing the Value of Struggle: The conversation reiterates that genuine personal growth, stronger relationships, and increased confidence come from navigating and resolving difficulties, not from avoiding them or seeking perfect, instant solutions.
- 18:22 - Essential Human Skills for the Future: Sinek stresses that critical human skills such as listening, holding space, conflict resolution, giving/receiving feedback, accountability, and empathy are increasingly vital but are suffering in the modern world.
- 31:08 - Potential for Reclaiming Time and Purpose: While AI may lead to job shifts, Sinek suggests it could also free up time, allowing individuals to pursue hobbies, deepen relationships, and combat loneliness, arguing that this return to personal time is something society has "been desperate for."
- 35:54 - The Pursuit of "Enough" and Gratitude: Happiness is not solely tied to money beyond a certain point; true fulfillment often comes from having a vision that exceeds current resources, fostering creativity, and practicing gratitude for what one has.
- 46:12 - The Profound Impact of Gratitude: Personal experiences (e.g., the LA fires) illustrate how confronting loss and embracing the temporary nature of things can lead to a deeper appreciation and gratitude for life and relationships.
- 55:59 - Scale Breaks Quality: The speaker argues that efforts to achieve massive scale often compromise quality and authenticity (e.g., mass-produced clothing, Apple's curated store experience), leading to a growing demand for handmade, unique items.
- 1:09:59 - Friction Creates Freedom: The paradox that technological convenience has reduced the "friction" that once created boundaries between work and personal life, thereby diminishing freedom and increasing stress.
- 1:11:05 - The Rise of In-Real-Life Community: Sinek predicts a "community revolution," defining community as "a group of people who agree to grow together," emphasizing the increasing value of offline, intentional human connection.
- 1:52:50 - Friendship: The Ultimate Biohack: Sinek highlights the underserved topic of friendship, arguing that strong friendships are fundamental for navigating life's challenges and are the most potent "biohack" for combating loneliness, depression, and anxiety.
- 2:01:01 - Mentorship and Lasting Legacy: Sinek shares his commitment to mentoring his team, imparting his knowledge and lessons from past mistakes to build a resilient organization that can thrive independently. He presents a military challenge coin engraved with "Build, Teach, Lead," symbolizing the responsibility to accumulate, give, and serve.
- 0:16 - The Most Influential Hacker: Kevin Mitnick is cited as the most influential and infamous hacker, renowned for his social engineering tactics and past arrests for phone system abuse.
- 1:00 - Origin of "Hack": The term "hack" originated in the 1960s at MIT's model train club, referring to creative, unconventional uses of technology. By the 1970s-80s, it evolved to mean unauthorized access.
- 1:27 - Types of Hackers: Hackers are categorized by "hat colors": "black hats" (malicious intent), "white hats" (ethical, report vulnerabilities for good), and "grey hats" (cross ethical lines despite claiming good intentions).
- 2:10 - Evolution of Hacking: Early hacking in the 1960s focused on physical access to computers. The 1970s saw "phone phreakers" manipulating phone systems, and the 1980s-90s brought the internet, drastically expanding hacking possibilities globally.
- 3:38 - Government Website Hacking: A notable early instance was the 1996 defacement of the CIA website. To prevent such attacks, key advice includes changing default credentials, disabling unnecessary services, regularly updating software, and using multi-factor authentication.
- 4:59 - Open-Source Encryption and Kerckhoffs's Principle: The principle states that the security of a cryptographic system should rely solely on the secrecy of the keys, not on the secrecy of the algorithm, meaning open-source algorithms can still be secure.
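Kerckhoffs's principle can be illustrated with Python's standard-library `hmac` module: the algorithm (HMAC-SHA256) is entirely public, yet forging a valid tag still requires the secret key. A minimal sketch (the key and messages are made up for illustration):

```python
import hmac
import hashlib

def sign(key: bytes, message: bytes) -> str:
    # The MAC algorithm (HMAC-SHA256) is fully public; per Kerckhoffs's
    # principle, only the key needs to stay secret.
    return hmac.new(key, message, hashlib.sha256).hexdigest()

def verify(key: bytes, message: bytes, tag: str) -> bool:
    # Constant-time comparison to avoid timing side channels.
    return hmac.compare_digest(sign(key, message), tag)

key = b"secret-key-known-only-to-the-parties"
tag = sign(key, b"transfer $100")
print(verify(key, b"transfer $100", tag))           # True
print(verify(key, b"transfer $999", tag))           # False: message altered
print(verify(b"wrong-key", b"transfer $100", tag))  # False: wrong key
```

Publishing `sign` and `verify` costs nothing; an attacker without `key` still cannot produce a valid tag.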
- 6:39 - The Deep and Dark Web: The internet is likened to an iceberg: the "surface web" (5%) is indexed, while the "deep web" (95%) is not and requires authentication. The "dark web" is a subset of the deep web, unindexed and requiring special tools, used for both illicit activities and anonymous communication by whistleblowers or dissidents.
- 8:17 - Stuxnet Virus: This malware was designed to physically disrupt Iranian nuclear centrifuges by making them speed up and slow down, impairing uranium enrichment. It was widely attributed to the US and Israel.
- 9:33 - OPM Data Breach: Discusses the implications of data breaches where personal information, including fingerprints, is compromised. Recommends avoiding unnecessary data sharing and utilizing credit monitoring/freezes to mitigate risks.
- 10:43 - TikTok Ban Concerns: Banning applications like TikTok is generally ineffective as it drives activity underground. Concerns about data privacy and misinformation are universal, regardless of a company's country of origin.
- 12:06 - Election Vulnerabilities: Paper ballots are preferred over electronic voting systems for their resilience against power outages and for recountability, as electronic votes are difficult to verify retrospectively.
- 12:53 - The ILOVEYOU Virus: This virus spread rapidly via email attachments and overwrote many file types, making infected systems harder to disinfect and maximizing its reach.
- 13:17 - Stopping WannaCry: Marcus Hutchins halted the WannaCry ransomware in May 2017 by registering a specific domain name that the malware was programmed to check, acting as a "kill switch" to stop its replication.
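The kill-switch logic described above can be sketched in a few lines. This is an illustrative reconstruction, not actual WannaCry code, and the domain below is a placeholder (the real kill-switch domain is not reproduced here):

```python
import socket

# Hypothetical placeholder; the real WannaCry domain was a long
# pseudo-random string. ".invalid" is guaranteed never to resolve.
KILL_SWITCH_DOMAIN = "example-kill-switch.invalid"

def kill_switch_tripped(domain: str, resolve=socket.gethostbyname) -> bool:
    # The malware asked: does this domain resolve? While unregistered,
    # the lookup failed and the worm kept spreading.
    try:
        resolve(domain)
        return True   # domain resolves -> someone registered it -> stop
    except OSError:
        return False  # lookup failed -> keep running (pre-registration)

if kill_switch_tripped(KILL_SWITCH_DOMAIN):
    pass  # the worm exits here without propagating
```

Once Hutchins registered the domain, the lookup started succeeding worldwide, flipping every new infection into the "stop" branch.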
- 14:18 - Colonial Pipeline Ransomware: The pipeline shutdown was a cautious decision by operators, not directly caused by the ransomware itself. Although a $5 million ransom was paid, the decryption tool was ineffective, forcing reliance on backups, with the FBI later recovering half the ransom.
- 15:47 - Firewall Explained: A firewall acts as a digital "gatekeeper" or "zone of separation" between networks of different trust levels (e.g., internal vs. internet), enforcing security policies to control incoming and outgoing traffic.
- 16:41 - VPNs and Anonymity: VPNs primarily encrypt internet traffic to protect sensitive information and can hide a user's original IP address, offering anonymity. However, trust is transferred from the ISP to the VPN provider, who can see all traffic, making provider choice crucial.
- 18:54 - Why Phishing and Scams Persist: These scams remain profitable due to the human tendency to trust, which attackers exploit through social engineering. Tactics constantly evolve, including "smishing" (SMS), "vishing" (voicemail), and "quishing" (QR codes).
- 21:11 - Password Managers and Passkeys: While no system is perfectly "safe," password managers can be "safe enough" if a strong master password and multi-factor authentication are used. Passkeys, a newer cryptographic standard, are more phishing-resistant as they don't require users to remember passwords and rely on device-based cryptographic keys.
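Why the master password matters so much can be shown with a key-derivation sketch: the vault key is stretched from the master password, so its strength is the vault's strength. This is a simplified illustration using PBKDF2 from the standard library (the salt and passwords are made up; real products use vetted KDFs such as PBKDF2 or Argon2 with product-specific parameters):

```python
import hashlib

def derive_vault_key(master_password: str, salt: bytes,
                     iterations: int = 600_000) -> bytes:
    # Stretch the master password into a 32-byte encryption key.
    # The high iteration count deliberately slows down guessing attacks.
    return hashlib.pbkdf2_hmac(
        "sha256", master_password.encode(), salt, iterations, dklen=32
    )

salt = b"per-user-random-salt"  # stored alongside the vault, not secret
key = derive_vault_key("correct horse battery staple", salt)
print(len(key))                                                # 32
print(key == derive_vault_key("correct horse battery staple", salt))  # True
print(key == derive_vault_key("wrong password", salt))                # False
```

Nothing but the master password reproduces the key, which is why a weak master password undermines everything else the manager does.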
- 21:54 - Virus Likelihood Today: Catching a virus is still possible, despite advanced antivirus software. Modern antivirus systems have evolved from signature-based detection to behavior-based analysis. Regular software updates and reboots are crucial defenses.
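The signature-based vs. behavior-based distinction can be sketched crudely: the first matches known byte patterns, the second flags suspicious patterns of actions even in never-before-seen binaries. The signatures and "suspicious calls" below are invented for illustration:

```python
# Signature-based: flag payloads containing a known byte pattern.
SIGNATURES = {b"EICAR-TEST-SIGNATURE", b"\xde\xad\xbe\xef"}

def signature_scan(payload: bytes) -> bool:
    return any(sig in payload for sig in SIGNATURES)

# Behavior-based: flag a combination of actions, not the bytes themselves.
SUSPICIOUS_CALLS = {"encrypt_all_files", "disable_backups", "contact_c2"}

def behavior_scan(observed_calls: list) -> bool:
    # Two or more suspicious actions together trigger an alert.
    return len(SUSPICIOUS_CALLS & set(observed_calls)) >= 2

print(signature_scan(b"...EICAR-TEST-SIGNATURE..."))            # True
print(signature_scan(b"novel malware, no known signature"))     # False
print(behavior_scan(["encrypt_all_files", "disable_backups"]))  # True
```

The second print shows the weakness that drove the evolution: novel malware has no signature on file, but mass-encrypting files while disabling backups is suspicious regardless of what the binary looks like.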
- 23:01 - Most Damaging Hacks: Damage is measured in various ways (financial, systems, lives). An indirect example of life loss is a hospital ransomware attack that redirected emergency services, leading to a patient's death during transport.
- 23:40 - The CIA Triad: A foundational concept in cybersecurity, standing for Confidentiality, Integrity, and Availability—the three core principles of protecting information.
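Of the three principles, Integrity is the easiest to demonstrate in code: store a cryptographic fingerprint of the data, then recompute it later to detect any modification. A minimal sketch with an invented record (this detects tampering; it does not prevent it):

```python
import hashlib

def fingerprint(data: bytes) -> str:
    # SHA-256 digest acting as an integrity check (the "I" in CIA).
    return hashlib.sha256(data).hexdigest()

record = b"balance=1000"
stored_digest = fingerprint(record)  # saved when the record was written

# Later: recompute and compare to detect tampering.
print(fingerprint(b"balance=1000") == stored_digest)  # True: intact
print(fingerprint(b"balance=9999") == stored_digest)  # False: altered
```

Confidentiality would add encryption on top, and Availability is about redundancy and backups rather than cryptography.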
- 24:10 - The First Computer Virus: The Morris Worm (1988), a self-replicating program, is considered the first widespread example of malware that brought public awareness to harmful software.
- 24:49 - Phone Phreaking Revisited: An older attack method from the pre-internet era, where individuals manipulated the phone system using specific tones (like the 2600 Hz tone from a Captain Crunch whistle) to make free calls or reconfigure switches.
- 26:08 - Realistic Hacking in Movies: Real-world hacking is often tedious and not visually exciting, involving hours of background processes. Movies dramatize it for entertainment, often inaccurately.
https://www.youtube.com/watch?v=gcNt0kgCNfY
Abstract:
This video features a discussion with pediatrician Dr. Joel Gator, author of "Between a Shot and a Hard Place," advocating for honest, data-driven conversations about vaccines. The dialogue challenges the prevailing "safe and effective" dogma, arguing that vaccines, like all medical treatments, carry both benefits and risks that require open discussion and informed consent. Key criticisms include the lack of inert placebo-controlled trials, the cumulative effect of numerous mandated vaccines, and the problematic indemnification of pharmaceutical companies from liability for adverse effects. The speakers highlight declining public trust in medical institutions due to perceived dishonesty and censorship around vaccine safety concerns. They call for comprehensive, long-term, and unbiased research into vaccine effects, personalized vaccination strategies, and transparent communication to rebuild trust and continuously improve vaccine safety.
Below, I will provide input for an example video (comprising of title, description, and transcript, in this order) and the corresponding abstract and summary I expect. Afterward, I will provide a new transcript that I want you to summarize in the same format. **Please give an abstract of the transcript and then summarize the transcript in a self-contained bullet list format.** Include starting timestamps, important details and key takeaways. Example Input: Fluidigm Polaris Part 2- illuminator and camera mikeselectricstuff 131K subscribers Subscribed 369 Share Download Clip Save 5,857 views Aug 26, 2024 Fluidigm Polaris part 1 : • Fluidigm Polaris (Part 1) - Biotech g... Ebay listings: https://www.ebay.co.uk/usr/mikeselect... Merch https://mikeselectricstuff.creator-sp... Transcript Follow along using the transcript. Show transcript mikeselectricstuff 131K subscribers Videos About Support on Patreon 40 Comments @robertwatsonbath 6 hours ago Thanks Mike. Ooof! - with the level of bodgery going on around 15:48 I think shame would have made me do a board re spin, out of my own pocket if I had to. 1 Reply @Muonium1 9 hours ago The green LED looks different from the others and uses phosphor conversion because of the "green gap" problem where green InGaN emitters suffer efficiency droop at high currents. Phosphide based emitters don't start becoming efficient until around 600nm so also can't be used for high power green emitters. See the paper and plot by Matthias Auf der Maur in his 2015 paper on alloy fluctuations in InGaN as the cause of reduced external quantum efficiency at longer (green) wavelengths. 4 Reply 1 reply @tafsirnahian669 10 hours ago (edited) Can this be used as an astrophotography camera? 
Reply mikeselectricstuff · 1 reply @mikeselectricstuff 6 hours ago Yes, but may need a shutter to avoid light during readout Reply @2010craggy 11 hours ago Narrowband filters we use in Astronomy (Astrophotography) are sided- they work best passing light in one direction so I guess the arrows on the filter frames indicate which way round to install them in the filter wheel. 1 Reply @vitukz 12 hours ago A mate with Channel @extractions&ire could use it 2 Reply @RobertGallop 19 hours ago That LED module says it can go up to 28 amps!!! 21 amps for 100%. You should see what it does at 20 amps! Reply @Prophes0r 19 hours ago I had an "Oh SHIT!" moment when I realized that the weird trapezoidal shape of that light guide was for keystone correction of the light source. Very clever. 6 Reply @OneBiOzZ 20 hours ago given the cost of the CCD you think they could have run another PCB for it 9 Reply @tekvax01 21 hours ago $20 thousand dollars per minute of run time! 1 Reply @tekvax01 22 hours ago "We spared no expense!" John Hammond Jurassic Park. *(that's why this thing costs the same as a 50-seat Greyhound Bus coach!) Reply @florianf4257 22 hours ago The smearing on the image could be due to the fact that you don't use a shutter, so you see brighter stripes under bright areas of the image as you still iluminate these pixels while the sensor data ist shifted out towards the top. I experienced this effect back at university with a LN-Cooled CCD for Spectroscopy. The stripes disapeared as soon as you used the shutter instead of disabling it in the open position (but fokussing at 100ms integration time and continuous readout with a focal plane shutter isn't much fun). 12 Reply mikeselectricstuff · 1 reply @mikeselectricstuff 12 hours ago I didn't think of that, but makes sense 2 Reply @douro20 22 hours ago (edited) The red LED reminds me of one from Roithner Lasertechnik. I have a Symbol 2D scanner which uses two very bright LEDs from that company, one red and one red-orange. 
The red-orange is behind a lens which focuses it into an extremely narrow beam. 1 Reply @RicoElectrico 23 hours ago PFG is Pulse Flush Gate according to the datasheet. Reply @dcallan812 23 hours ago Very interesting. 2x Reply @littleboot_ 1 day ago Cool interesting device Reply @dav1dbone 1 day ago I've stripped large projectors, looks similar, wonder if some of those castings are a magnesium alloy? Reply @kevywevvy8833 1 day ago ironic that some of those Phlatlight modules are used in some of the cheapest disco lights. 1 Reply 1 reply @bill6255 1 day ago Great vid - gets right into subject in title, its packed with information, wraps up quickly. Should get a YT award! imho 3 Reply @JAKOB1977 1 day ago (edited) The whole sensor module incl. a 5 grand 50mpix sensor for 49 £.. highest bid atm Though also a limited CCD sensor, but for the right buyer its a steal at these relative low sums. Architecture Full Frame CCD (Square Pixels) Total Number of Pixels 8304 (H) × 6220 (V) = 51.6 Mp Number of Effective Pixels 8208 (H) × 6164 (V) = 50.5 Mp Number of Active Pixels 8176 (H) × 6132 (V) = 50.1 Mp Pixel Size 6.0 m (H) × 6.0 m (V) Active Image Size 49.1 mm (H) × 36.8 mm (V) 61.3 mm (Diagonal), 645 1.1x Optical Format Aspect Ratio 4:3 Horizontal Outputs 4 Saturation Signal 40.3 ke− Output Sensitivity 31 V/e− Quantum Efficiency KAF−50100−CAA KAF−50100−AAA KAF−50100−ABA (with Lens) 22%, 22%, 16% (Peak R, G, B) 25% 62% Read Noise (f = 18 MHz) 12.5 e− Dark Signal (T = 60°C) 42 pA/cm2 Dark Current Doubling Temperature 5.7°C Dynamic Range (f = 18 MHz) 70.2 dB Estimated Linear Dynamic Range (f = 18 MHz) 69.3 dB Charge Transfer Efficiency Horizontal Vertical 0.999995 0.999999 Blooming Protection (4 ms Exposure Time) 800X Saturation Exposure Maximum Date Rate 18 MHz Package Ceramic PGA Cover Glass MAR Coated, 2 Sides or Clear Glass Features • TRUESENSE Transparent Gate Electrode for High Sensitivity • Ultra-High Resolution • Board Dynamic Range • Low Noise Architecture • Large 
Active Imaging Area Applications • Digitization • Mapping/Aerial • Photography • Scientific Thx for the tear down Mike, always a joy Reply @martinalooksatthings 1 day ago 15:49 that is some great bodging on of caps, they really didn't want to respin that PCB huh 8 Reply @RhythmGamer 1 day ago Was depressed today and then a new mike video dropped and now I’m genuinely happy to get my tear down fix 1 Reply @dine9093 1 day ago (edited) Did you transfrom into Mr Blobby for a moment there? 2 Reply @NickNorton 1 day ago Thanks Mike. Your videos are always interesting. 5 Reply @KeritechElectronics 1 day ago Heavy optics indeed... Spare no expense, cost no object. Splendid build quality. The CCD is a thing of beauty! 1 Reply @YSoreil 1 day ago The pricing on that sensor is about right, I looked in to these many years ago when they were still in production since it's the only large sensor you could actually buy. Really cool to see one in the wild. 2 Reply @snik2pl 1 day ago That leds look like from led projector Reply @vincei4252 1 day ago TDI = Time Domain Integration ? 1 Reply @wolpumba4099 1 day ago (edited) Maybe the camera should not be illuminated during readout. From the datasheet of the sensor (Onsemi): saturation 40300 electrons, read noise 12.5 electrons per pixel @ 18MHz (quite bad). quantum efficiency 62% (if it has micro lenses), frame rate 1 Hz. lateral overflow drain to prevent blooming protects against 800x (factor increases linearly with exposure time) saturation exposure (32e6 electrons per pixel at 4ms exposure time), microlens has +/- 20 degree acceptance angle i guess it would be good for astrophotography 4 Reply @txm100 1 day ago (edited) Babe wake up a new mikeselectricstuff has dropped! 9 Reply @vincei4252 1 day ago That looks like a finger-lakes filter wheel, however, for astronomy they'd never use such a large stepper. 1 Reply @MRooodddvvv 1 day ago yaaaaay ! more overcomplicated optical stuff ! 4 Reply 1 reply @NoPegs 1 day ago He lives! 
11 Reply 1 reply Transcript 0:00 so I've stripped all the bits of the 0:01 optical system so basically we've got 0:03 the uh the camera 0:05 itself which is mounted on this uh very 0:09 complex 0:10 adjustment thing which is obviously to set 0:13 up the various tilt and uh alignment 0:15 stuff then there's two of these massive 0:18 lenses I've taken one of these apart I 0:20 think there's something like about eight 0:22 or nine optical elements in here these 0:25 don't seem to do a great deal in terms 0:26 of actual magnification they're obviously 0:28 just about getting the image to where it 0:29 uh where it needs to be just so that 0:33 goes like that then this optical block I 0:36 originally thought this was made of some 0:37 sort of crazy heavy material but it's just 0:39 really that all these optical bits 0:41 are just ridiculously heavy those lenses 0:43 are about 4 kilos each and then there's 0:45 this very heavy very solid um piece that 0:47 goes in the middle and this is so this 0:49 is the filter wheel assembly with a 0:51 hilariously oversized stepper 0:53 motor driving this wheel with these very 0:57 large narrow band filters so we've got 1:00 various different shades of uh 1:03 filters there five altogether that 1:06 one's actually just showing up as silver 1:07 that's actually a red but fairly low 1:10 transmission orangey red blue green 1:15 there's an access cover on this side so 1:16 the filters can be accessed and changed 1:19 without taking anything else apart even 1:21 this is like ridiculous it's like solid 1:23 aluminium this is just basically a cover 1:25 the actual wavelengths of these are um 1:27 488 525 570 630 and 700 nm not sure what 1:32 the suffix on that is perhaps that's the uh 1:34 the width of the spectral line say these 1:37 are very narrow band filters most of 1:39 them pass very little light through 1:41 so it's still very tight narrow band to 1:43 match the um fluorescence of the dyes 1:45 they're using in the biochemical process
1:48 and obviously to reject the light that's 1:49 being fired at it from that illuminator 1:51 box and then there's a there's a second 1:53 one of these lenses then the actual sort 1:55 of samples below that so uh very serious 1:58 amount of very uh chunky heavy optics 2:01 okay let's take a look at this light 2:02 source made by a company Lumen Dynamics 2:04 who are now part of 2:06 Excelitas self-contained unit power 2:08 connector USB and this which one of the 2:11 cable bundles said was a TTL interface 2:14 USB wasn't used in uh the Fluidigm 2:17 application output here and I think this 2:19 is an input for um light feedback I 2:21 don't know if it's regulated or just a 2:23 measurement facility and the uh fiber 2:27 assembly 2:29 square inlet there and then there's two 2:32 outputs which have uh lens assemblies 2:35 and this small one which goes back into 2:37 that small port just loops out of here 2:40 straight back in so on this side we've 2:42 got the electronics which look pretty 2:44 straightforward we've got a bit of power 2:45 supply stuff over here and we've got 2:48 separate drivers for each wavelength now 2:50 interestingly this has clearly been very 2:52 specifically made for this application 2:54 you know I was half expecting like say some 2:56 generic drivers that could be used for a 2:58 number of different things but they've actually 3:00 literally specified the exact wavelength 3:02 on the PCB there is provision here for 3:04 385 nm which isn't populated but this has 3:07 clearly been designed very specifically 3:09 so these four drivers look the same but 3:10 then there's two higher power ones for 3:12 575 and 3:14 520 a slightly bigger heat sink on this 3:16 575 section there's a PIC24 which is 3:20 providing the USB interface USB isolator the 3:23 USB interface just presents as a COM port 3:26 I did have a quick look but I didn't 3:27 actually get anything sensible um I did 3:29 dump the PIC code out and there's a few 3:31 you know a few sort of commands that you 3:32 could
see in text but I didn't actually 3:34 manage to get it working properly I 3:36 found some software for a related version 3:38 but it didn't seem to want to talk to it 3:39 but um I say that wasn't used for the 3:41 original application it might be quite 3:42 interesting to try and get the run 3:44 hours count out of it and the TTL 3:46 interface looks fairly straightforward 3:48 we've got positions for six opto 3:50 isolators but only five are 3:52 installed so that corresponds with the 3:54 unused thing so I think this hopefully 3:56 should be as simple as just providing a 3:57 TTL signal for each color to uh enable 4:00 it a big heat sink here which is there I 4:03 think there's like a big sort of metal 4:04 plate through the middle of this that 4:05 all the LEDs are mounted on the other 4:07 side so this is heat sinking it with 4:09 air flow from a uh just a fan in here 4:13 obviously they don't have the air flow 4:14 anywhere near the optics so it's conduction 4:17 cooled through to this plate that's then 4:18 uh air cooled got some pots which are 4:21 presumably power 4:22 adjustments okay let's take a look at 4:24 the other side which is uh much more 4:27 interesting see we've got some uh very 4:31 uh neatly twisted cable assemblies there 4:35 a bunch of LEDs so we've got one here 4:37 475 up here 430 nm 630 575 and 520 4:44 filters and dichroic mirrors a quick way to 4:48 see what's what is if we just shine 4:49 some white light through 4:51 here not sure how easy it is to see on the 4:54 camera but shining white light we do 4:55 actually get a bit of red a bit of blue 4:57 some yellow here so the optical path 5:00 the 575 goes sort of here bounces off 5:03 this mirror and goes out the 520 goes 5:07 sort of down here across here and up 5:09 there 630 goes basically straight 5:13 through 5:15 430 goes across there down there along 5:17 there and the 475 goes down here and 5:20 left this is the light sensing thing I 5:22 think here there's just a um I think 5:24 there's
a photodiode or other sensor 5:26 haven't actually taken that off and 5:28 everything's fixed down to this chunk of 5:31 aluminium which acts as the heat 5:32 spreader that then conducts the heat to 5:33 the back side for the heat 5:35 sink and the actual LED packages all 5:38 look fairly similar except for this one 5:41 on the 575 which looks quite a bit more 5:44 substantial big spade 5:46 terminals and the interface for this 5:48 turned out to be extremely simple it's 5:50 literally a 5V TTL level to enable each 5:54 color doesn't seem to be any intensity 5:56 control but there are some additional 5:58 pins on that connector that weren't used 5:59 in the Fluidigm thing so maybe 6:01 there's some extra lines that control 6:02 that I couldn't find any data on this uh 6:05 unit and um their current product 6:07 range is quite significantly different 6:09 so we've got the uh blue these 6:13 may well be saturating the camera 6:16 so they might look a bit weird so that's 6:17 the 430 6:18 blue the 575 6:24 yellow uh 6:26 475 light blue 6:29 the uh 520 6:31 green and the uh 630 red now one 6:36 interesting thing I noticed for the 6:39 575 it's actually using a 6:42 white LED and then filtering it whereas 6:44 all the other ones are using 6:46 LEDs which are the fundamental colors 6:47 but uh this is actually doing white and 6:50 it's a combination of this filter and 6:52 the dichroic mirrors that are turning it to 6:55 yellow if we take the filter out a 6:57 lot of the um blue content 7:00 is going this way the red is going 7:02 straight through these two mirrors so 7:05 this is clearly not reflecting much of 7:08 that so we end up with the yellow coming 7:10 out of uh out of there which is a fairly 7:14 light yellow color which you don't 7:16 really see from high intensity LEDs so 7:19 that's clearly why they've used the 7:20 white to uh do this power consumption of 7:23 the white is pretty high going up to
about two and a half amps on that color 7:27 whereas most of the other colors are 7:28 only drawing half an amp or so at 24 7:30 volts the uh the green is up to about 7:32 1.2 but say this thing is uh much 7:35 brighter and if you actually run all the 7:38 colors at the same time you get a fairly 7:41 reasonable um looking white coming out 7:43 of it and one thing you might just be 7:45 able to notice is there is some sort of 7:46 color banding around here that's from not 7:49 getting uh everything completely 7:51 concentric and I think that's where this 7:53 fiber optic thing comes 7:58 in you 8:00 get a couple of fairly accurately shaped 8:04 very sort of uniform color outputs and looking 8:06 at what's um inside here we've basically 8:09 just got this square rod so this is 8:12 clearly yeah the light's just bouncing 8:13 off all the all the various sides to um 8:16 get a nice uniform illumination uh this 8:19 back bit looks like it's all potted so 8:21 nothing I can really do to get in there I 8:24 think this is fiber so I have come 8:26 across um cables like this which are 8:27 liquid filled but just looking through the 8:30 end of this it's probably a bit hard to 8:31 see it does look like there are fiber ends 8:34 going on there and so there's this 8:36 feedback thing which is just obviously 8:39 compensating for any light losses 8:41 through here to get an accurate 8:43 representation of uh the light that's 8:45 been launched out of these two 8:47 fibers and you see uh 8:49 these have got this sort of trapezium 8:54 shaped light guides again it's like a 8:56 sort of acrylic or glass light guide I 9:00 guess just to make the right 9:03 rectangular 9:04 shape and looking at this sensor assembly 9:07 um the light output doesn't uh change 9:10 whether you feed this in or not so it's 9:11 clearly not doing any internal closed loop 9:14 control obviously there may well be some 9:16 facility for it to do that but it's not 9:17 being used in this 9:19 application and so this output
just 9:21 produces a voltage on the uh output 9:24 connector proportional to the amount of 9:26 light that's present so there's a little 9:28 diffuser in the back there 9:30 and then there's just some kind of uh 9:33 optical sensor looks like a 9:35 chip looking at the LED it's a very 9:37 small package on the PCB with this lens 9:40 assembly over the top and these look 9:43 like they're actually on a copper 9:44 metalized PCB for maximum thermal 9:47 performance and yeah it's a very small 9:49 package looks like it's a ceramic 9:51 package and there's a thermistor there 9:53 for temperature monitoring this is the 9:56 475 blue one this is the 520 9:59 green which is uh rather different obviously 10:02 it's a much bigger die with lots of bond 10:04 wires but also this looks like it's using 10:05 a phosphor if I shine a blue light at it 10:08 it lights up green so this is actually a 10:10 phosphor conversion green LED which 10:12 I've I've come across before they want 10:15 that specific wavelength so it may be 10:17 easier to tune a phosphor than tune the 10:20 um semiconductor material to get the uh 10:23 right wavelength from the LED 10:24 directly uh red 630 similar size to the 10:28 blue one it does seem to have a uh a 10:31 lens on top of it there is a sort of red 10:33 coloring to 10:35 the die but that doesn't appear to be 10:38 fluorescent as far as I can 10:39 tell and the white one again a little 10:41 bit different sort of much higher 10:43 current 10:46 connectors a maker name on that 10:48 connector PhlatLight not sure if that's 10:52 the connector or the LED 10:54 itself and obviously with the phosphor 10:56 I'd imagine that phosphor may well 10:58 be tuned to get the maximum at the uh 11:01 575 nm and actually this white one looks 11:04 like a fairly standard product I just 11:06 found it in Mouser made by Luminus 11:09 Devices in fact actually I think all 11:11 these are based on various Luminus 11:13 Devices modules and they're you
know 11:17 it looks like they're taking the nearest 11:18 wavelength and then just using these 11:19 filters to clean it up to get a precise 11:22 uh spectral line out of it so quite a 11:25 nice neat and um extremely 11:30 bright light source not sure I've got any 11:33 particular use for it so I think this 11:35 might end up on 11:36 eBay but uh very pretty to look at and 11:40 without the uh risk of burning your eyes 11:43 out like you do with lasers so I thought 11:45 it would be interesting to try and 11:46 figure out the runtime of this things 11:48 like this usually keep some sort of 11:49 record of runtime cuz LEDs degrade over 11:51 time I couldn't get any software to work 11:52 through the USB interface but then had a 11:54 thought it's probably going to be writing the 11:55 runtime periodically to the EEPROM so 11:58 I just scoped that up and noticed it 12:00 was doing a write every 5 minutes so I 12:02 just ran it for a while periodically 12:04 reading the EEPROM I just held the PIC 12:05 in reset and um put a clip over to read 12:07 the EEPROM and found it was writing 12:10 one location per color every 5 minutes 12:12 so if one color was on it would write 12:14 that location every 5 minutes and just 12:16 increment it by one so after doing a few 12:18 tests with different colors for different 12:19 time periods it looked extremely 12:21 straightforward it's like a four byte 12:22 count for each color looking at the 12:24 original data that was in it all the 12:26 colors apart from green were reading 12:28 zero and the green was reading four 12:30 indicating a total 20 minutes run time 12:32 ever if it was turned on run for a short 12:34 time then turned off that might not have 12:36 been counted but even so it indicates this 12:37 thing wasn't used a great deal the whole 12:40 sort of process of doing a run can be several 12:42 hours but it'll only be doing probably 12:43 the imaging at the end of that so you 12:46 wouldn't expect it to be running for a long 12:47 time
but say a single color for 20 12:50 minutes over its whole lifetime does 12:52 seem a little bit on the low side okay 12:55 let's look at the camera unfortunately 12:57 I managed to not record any sound when I 12:58 did this it's also a couple of months 13:00 ago so there's going to be a few details 13:02 that I've forgotten so I'm just going to 13:04 dub this over the original footage so um 13:07 take the lid off see this massive great 13:10 heat sink so this is a Peltier cooled camera 13:12 we've got this blower fan producing a 13:14 fair amount of air flow through 13:16 it the connector here there's the CCD 13:19 mounted on the board on the 13:24 right this unplugs so we've got a bit of 13:27 power supply stuff on here 13:29 USB interface I think that's the Cypress 13:32 microcontroller high speed USB 13:34 interface there's a Xilinx Spartan FPGA some 13:40 RAM and there's a couple of A to D 13:42 converters can't quite read what those 13:45 are but Analog 13:47 Devices um little bit of bodgery around 13:51 here extra decoupling obviously they 13:53 were having some noise issues this is 13:55 around the RAM chip quite a lot of extra 13:57 capacitors have been added there 13:59 uh there's a couple of amplifiers prior 14:01 to the A to D converter buffers and/or 14:05 amplifiers taking the CCD 14:08 signal um bit more power supply stuff here 14:11 this is probably all to do with 14:12 generating the various CCD bias voltages 14:14 they uh need quite a lot of exotic 14:18 voltages next board down is just a 14:20 shield and an interconnect 14:24 board basically shielding the power supply stuff 14:26 from some of the more sensitive analog 14:28 stuff 14:31 and this is the bottom board which is 14:32 just all power supply 14:34 stuff as you can see tons of capacitors 14:37 and a transformer in 14:42 there and this is the CCD which is a uh 14:47 very impressive thing this is a KAF-50100 14:50 originally by TrueSense then Kodak 14:53 now ON 14:54 Semiconductor it's 50 megapixels uh the 14:58 only
price I could find was this one 15:00 5,000 bucks and the architecture you can 15:03 see there are actually two separate halves 15:04 which explains the dual A to D converters 15:06 and two amplifiers it's literally split 15:08 down the middle and duplicated so it's 15:10 outputting two streams in parallel just 15:13 to keep the bandwidth sensible and it's 15:15 got these amazing um diffraction effects 15:18 it's got micro lenses over the pixels so 15:20 there's there's a bit more optics going 15:22 on than on a normal 15:25 sensor few more bodges on the CCD board 15:28 including this wire which isn't really 15:29 tacked down very well which is a bit uh 15:32 bit of a mess quite a few bits around 15:34 this board where they've uh tacked 15:36 various bits on which is not super 15:38 impressive looks like CCD drivers on the 15:40 left with those 3 ohm um damping 15:43 resistors on the 15:47 output get a few more little bodges 15:50 around here some of 15:52 them and there's the 15:54 silica gel to keep the moisture down but 15:56 there's this separator that actually 15:58 appears to be cut from a piece of 15:59 antistatic 16:04 bag and this sort of thermal block on 16:06 top of this stack of three Peltier cooler 16:12 modules so as with any stacks they get 16:16 um larger as they go back towards the 16:18 heat sink because each Peltier's got to not 16:20 only take the heat from the previous one but 16:21 also the waste heat which is quite 16:27 significant you see a little temperature 16:29 sensor here that copper block which 16:32 makes contact with the back of the 16:37 CCD and this is the back of the 16:40 Peltiers this then contacts the heat sink 16:44 on the uh rear there a few thermal pads 16:46 as well for some of the other power 16:47 components on this 16:51 PCB okay I've connected this uh camera 16:54 up I found some drivers on the disc that 16:56 seem to work under Windows 7 couldn't 16:58 get them to install under Windows 11 though 17:01 um in the absence of any sort of lens
or 17:03 being bothered to do it properly I've 17:04 just put some foil over it and put a little 17:06 pin in there to make a pinhole lens and 17:08 the software gives a few options I'm not 17:11 entirely sure what all these are there's 17:12 obviously a clock frequency 22 MHz low 17:15 gain and with PFG no idea what that is 17:19 something something gain programmable 17:20 something gain perhaps various exposure 17:23 types I think focus is just like a 17:25 continuous grab until you tell it to 17:27 stop not entirely sure what all these options 17:30 are obviously exposure time uh triggers 17:33 there's an external hardware trigger input 17:35 or you just trigger using a um thing on 17:37 screen so the resolution is 8176 by 17:40 6132 and you can actually bin those 17:42 where you combine multiple pixels to get 17:46 increased gain at the expense of lower 17:48 resolution this is a 10 sec exposure 17:51 obviously of the pinhole it's very uh 17:53 insensitive so we just stand still now 17:56 downloading it there's the uh exposure 17:59 so when it's 18:01 um there's a little status thing down 18:03 here so that tells you the um exposure 18:07 18:09 time this is just it 18:15 downloading um I'm seeing 18:18 quite a lot of like smearing I 18:20 don't know whether that's just due to 18:21 pixels overloading or something else I 18:24 mean yeah it's not it's not um out of 18:26 the question that there's something not 18:27 totally right about this camera there 18:28 certainly was bodge-wise on there um 18:31 I'd imagine a camera like this 18:32 it's got a fairly narrow range of 18:34 intensities that it's happy with I'm not 18:36 going to spend a great deal of time on 18:38 this if you're interested in this camera 18:40 maybe for astronomy or something and 18:42 happy to sort of take the risk that it may 18:44 not be uh perfect I um I think I'll 18:47 stick this on eBay along with the 18:48 illuminator I'll put a link down in the
description to the listing take your 18:52 chances to grab a bargain so for example 18:54 here we see this vertical streaking so 18:56 I'm not sure how normal that is this is 18:58 on a fairly bright scene looking out the 19:02 window if I cut the exposure time down 19:04 on that it's now a 1 second 19:07 exposure again most of the image 19:09 disappears again this looks like it's 19:11 possibly still overloading here let's go 19:14 down to say a quarter of a 19:16 second so again I think there might be 19:19 some auto gain control going on here um 19:21 this is with the PFG option let's try 19:23 turning that off and see what 19:25 happens so I'm not sure if this is actually 19:27 more streaking or whether it's just 19:29 cranked up the gain or the display 19:31 gray scale to show um you know the 19:33 range of things that it's captured 19:36 one of the odd things in the 19:38 software is um you 19:40 can't seem to read out the temperature 19:42 of the Peltier cooler but you can set the 19:44 temperature and if you set a 19:46 different temperature you see the power 19:48 consumption jump up running the cooler 19:50 to get the temperature you requested but 19:52 I can't see anything anywhere that tells 19:54 you whether the cooler is at the 19:56 temperature other than the power 19:57 consumption going down and there's no 19:59 temperature readout 20:03 here and yeah this is just 20:05 sort of very basic software I'm sure 20:07 there's like an API for more 20:09 sophisticated 20:10 applications but so if you know anything 20:12 more about these cameras please um stick it 20:14 in the 20:15 comments um incidentally when I was 20:18 editing I noticed there was a bent 20:19 pin on the um CCD but I did fix that 20:22 before doing these tests and also 20:24 reactivated the um silica gel desiccant 20:26 cuz I noticed I was getting a 20:28 bit of condensation on the window but um 20:31 yeah so a couple
of uh interesting but 20:34 maybe not particularly uh useful pieces 20:37 of kit except for someone that's got a 20:38 very specific use so um 20:42 I'll stick these on eBay put a link in 20:44 the description and say hopefully 20:45 someone could actually make some uh good 20:47 use of these things

Example Output:

**Abstract:** This video presents Part 2 of a teardown focusing on the optical components of a Fluidigm Polaris biotechnology instrument, specifically the multi-wavelength illuminator and the high-resolution CCD camera. The Lumen Dynamics illuminator unit is examined in detail, revealing its construction using multiple high-power LEDs (430nm, 475nm, 520nm, 575nm, 630nm) combined via dichroic mirrors and filters. A square fiber optic rod is used to homogenize the light. A notable finding is the use of a phosphor-converted white LED filtered to achieve the 575nm output. The unit features simple TTL activation for each color, conduction cooling, and internal homogenization optics. Analysis of its EEPROM suggests extremely low operational runtime. The camera module teardown showcases a 50 Megapixel ON Semiconductor KAF-50100 CCD sensor with micro-lenses, cooled by a multi-stage Peltier stack. The control electronics include an FPGA and a USB interface. Significant post-manufacturing modifications ("bodges") are observed on the camera's circuit boards. Basic functional testing using vendor software and a pinhole lens confirms image capture but reveals prominent vertical streaking artifacts, the cause of which remains uncertain (potential overload, readout artifact, or fault).

**Exploring the Fluidigm Polaris: A Detailed Look at its High-End Optics and Camera System**

* **0:00 High-End Optics:** The system utilizes heavy, high-quality lenses and mirrors for precise imaging, the lenses weighing around 4 kilos each.
* **0:49 Narrow Band Filters:** A filter wheel with five narrow band filters (488, 525, 570, 630, and 700 nm) ensures accurate fluorescence detection and rejection of excitation light.
* **2:01 Customizable Illumination:** The Lumen Dynamics light source offers five individually controllable LED wavelengths (430, 475, 520, 575, 630 nm) with varying power outputs. The 575nm yellow LED is uniquely achieved using a white LED with filtering.
* **3:45 TTL Control:** The light source is controlled via a simple TTL interface, enabling easy on/off switching for each LED color.
* **12:55 Sophisticated Camera:** The system includes a 50-megapixel Kodak KAF-50100 CCD camera with a Peltier cooling system for reduced noise.
* **14:54 High-Speed Data Transfer:** The camera features dual analog-to-digital converters to manage the high data throughput of the 50-megapixel sensor, which is effectively two 25-megapixel sensors operating in parallel.
* **18:11 Possible Issues:** The video creator noted some potential issues with the camera, including image smearing.
* **18:11 Limited Dynamic Range:** The camera's sensor has a limited dynamic range, making it potentially challenging to capture scenes with a wide range of brightness levels.
* **11:45 Low Runtime:** Internal data suggests the system has seen minimal usage, with only 20 minutes of recorded runtime for the green LED.
* **20:38 Availability on eBay:** Both the illuminator and camera are expected to be listed for sale on eBay.

Here is the real transcript. Please summarize it:

00:00:01 vaccines. We're not allowed to talk 00:00:01 about it as any other medical treatment 00:00:02 which has benefits and has risks. 00:00:05 We need honesty. We need to have 00:00:06 discussions not to stop people from 00:00:08 vaccinating, not to stop vaccines, but 00:00:10 to hold these companies liable in some 00:00:13 way or to hold them accountable. Dr.
00:00:15 Joel Warsh is a pediatrician opening 00:00:17 up honest conversations around vaccines, 00:00:19 helping people make informed choices 00:00:21 with confidence. Every single vaccine 00:00:23 before it hits the market, it's studied 00:00:25 against a placebo. The thing is, a lot 00:00:26 of those placebos are not inert. Not 00:00:28 everybody is the same. Each of us have 00:00:30 different genetics, different immune 00:00:31 systems. At Mayo Clinic, they're looking 00:00:32 at how different people respond to 00:00:34 vaccines. Why do the kids that are 00:00:35 living an unvaccinated lifestyle have a 00:00:37 lower risk of autoimmune disease? Is it 00:00:39 the food that they're eating? What is it 00:00:40 about their lifestyle that's different? 00:00:43 Joel, welcome to the Dr. Hyman show. I'm 00:00:46 so excited about this conversation 00:00:47 because it's a conversation that nobody 00:00:49 is allowed to have, which is about 00:00:51 vaccines. Yeah. No, I'm I'm so excited 00:00:53 to be here. And before we jump into 00:00:55 that, I just want to say thank you to 00:00:56 you. Back when I was in training, I met 00:00:59 my wife and she was very integrative 00:01:01 minded and that kind of set me on the 00:01:03 path towards learning integrative 00:01:04 medicine. Yeah. But at that time there 00:01:07 really wasn't a lot of integrative 00:01:08 medicine, certainly not for kids. No. 00:01:10 And your work really resonated with me. 00:01:12 I was following a lot of what you did. I 00:01:14 went to functional medicine courses 00:01:15 because of you and um I I used to tell 00:01:18 people, you know, I want to be the 00:01:20 Mark Hyman of pediatrics. Yes, we need you. 00:01:22 So that's probably why I'm sitting here 00:01:24 today. So, I just want to say thank you 00:01:25 for everything that you do because it 00:01:26 makes a huge difference. Well, I 00:01:28 really appreciate that.
And you know, 00:01:29 I've worked really hard to try to spread 00:01:31 the word and help people open their 00:01:33 minds and think differently about 00:01:35 health. And one of the challenges is 00:01:37 that, you know, there's such a 00:01:40 set of dogmas in medicine, which is very 00:01:42 strange. You know, we had all these 00:01:44 beliefs that have been overturned by the 00:01:48 progress of science, and science is 00:01:49 really about asking questions. What 00:01:51 really strikes me around the issue 00:01:53 around vaccines and your new book, 00:01:54 Between a Shot and a Hard Place, 00:01:55 tackling difficult vaccine questions 00:01:58 with balanced data and clarity, is that it's a 00:02:00 breath of fresh air. It's just it's 00:02:01 extraordinary to me that when you look 00:02:03 at the progress of science we've seen 00:02:05 historically, we used to think that 00:02:06 autism was caused by refrigerator 00:02:08 mothers. Mhm. We used to think that 00:02:10 ulcers were caused by stress and not by 00:02:13 a bacteria, which we now know, and the guy 00:02:14 who figured this out won the Nobel Prize 00:02:16 and was laughed at for making this 00:02:18 hypothesis that a bacteria would cause 00:02:20 ulcers, recently. And we used to think 00:02:22 that aspirin was the best thing to give 00:02:24 to everybody to prevent heart attacks, 00:02:26 and that that has completely flipped and 00:02:28 it's only good for a certain subset of 00:02:29 people who are at high risk, not 00:02:31 everybody else, because it's going to 00:02:32 cause brain hemorrhaging and intestinal 00:02:34 bleeding. Or we used to give arsenic and 00:02:36 mercury as well. That was a long time 00:02:38 ago. Those were the medicines of the 00:02:40 day. That was a long time ago. Yeah.
In 00:02:42 fact, one of the things they 00:02:43 did for teething for babies back in the 00:02:46 30s and 40s was give them this powder 00:02:49 that would actually kind of relieve 00:02:51 some of the pain and discomfort, and it 00:02:52 was full of mercury and it caused a 00:02:54 whole syndrome that actually looks like 00:02:57 autism, which is crazy. So, I I want to 00:03:00 kind of have a far-ranging conversation 00:03:01 about the history of vaccines, what we 00:03:03 know, what we don't know, and open up 00:03:05 the question, because this has been one 00:03:08 of those areas that is just off limits. 00:03:10 When I was at Cleveland Clinic, we 00:03:12 started the functional medicine center 00:03:13 there, it was fun, it was great, and a 00:03:15 lot of openness to what we're doing. And 00:03:17 there was an article I wrote like maybe 00:03:19 a decade before about a patient with 00:03:20 autism where I just said in the history, 00:03:23 this patient had an MMR vaccine and the 00:03:26 mother said that after that the kid had 00:03:27 issues. And so I was just sort of 00:03:29 recounting the history, and I even 00:03:31 put in the kind of disclaimer at 00:03:33 the beginning, I'm not saying that 00:03:35 vaccines cause autism. I'm just noting 00:03:37 the historical context of this patient. 00:03:40 And I was called out by the 00:03:42 pediatric department and said, 00:03:44 "You're an anti-vaxxer. Prove that you're 00:03:46 not. You know, we we think you're 00:03:48 a heretic or, you know, you're in 00:03:50 trouble." Basically, I was like in 00:03:51 trouble. So I had to write a letter to 00:03:53 the entire pediatric department saying, 00:03:55 "No, no, I'm not anti-vax." But I 00:03:58 think they're important and essential 00:03:59 and a great advance in medicine. And you 00:04:01 know, I think it's worth more 00:04:03 conversation and discussion, but it's 00:04:04 crazy that you can't talk about it.
I 00:04:06 mean, you never could. And same thing 00:04:08 for me. I mean, I never talked about 00:04:10 this outside the office until recently. 00:04:13 Yeah. You know, I was even on RFK's 00:04:15 podcast in 2022, and before I went on, I 00:04:17 said, you know, is it okay if we don't 00:04:18 talk about vaccines? And he was totally 00:04:20 fine with it. He said, "Yeah, I get it." 00:04:21 And I mean, I should have at that time, 00:04:23 but it was just so controversial. It's 00:04:25 still so controversial, but I think the 00:04:27 times have changed a little bit and 00:04:29 people are much more interested, and 00:04:30 I feel like we need to have that 00:04:32 conversation and we need to be open to 00:04:34 discussing everything for our kids' 00:04:36 health, because our health is suffering 00:04:38 and we're seeing higher rates of chronic 00:04:41 disease and we have to be open to 00:04:43 looking into everything. Not to say that 00:04:45 we shouldn't vaccinate or vaccines 00:04:46 are bad or anything like that, but we 00:04:48 want the best possible vaccines. We want 00:04:50 the most minimal side effects and we 00:04:52 shouldn't assume that we have the best 00:04:54 vaccines that we're ever going to have 00:04:55 in the history of the world. Like we 00:04:57 should try to continue improving things. 00:04:59 Yeah. And there's no reason 00:05:00 why that should be controversial, and 00:05:03 that's why I wrote the book. I'm not 00:05:04 against vaccines at all. I give vaccines 00:05:06 in my office. I just don't believe 00:05:07 people should be forced to do things. I 00:05:09 think we should have discussions and we 00:05:11 should have debates and we should 00:05:13 provide the best information so people 00:05:14 can make the best decisions for 00:05:16 themselves. And I don't have any motive 00:05:19 behind this book in terms of pushing 00:05:21 people to vaccinate or pushing people 00:05:22 not to vaccinate.
[00:05:24] Everything you see on the market when it comes to a book really seems to be one-sided: super pro-vax or super anti-vax, with nothing in between. That doesn't help the majority of parents who are concerned, want information, and want to discuss the pros and the cons, the risks versus the benefits. That's the conversation we need to be having, and that's the conversation we have about every other topic in medicine.

Yeah. There's a thing we're supposed to do called informed consent, where you lay out the risks and the benefits of any treatment, whether it's surgery or anything else. I even had surgery: okay, you could die from bleeding, from infection, from this, from that. I had an ablation for atrial fibrillation years ago, and the doctor read this long list: you could have a perforated aorta, you could have this complication, that complication, and I could die. I mean, oh god, this is terrible. But okay, I chose to do it because I didn't want to have this problem anymore, and I had informed consent. But with vaccines, we're not allowed to talk about them like any other medical treatment, which has benefits and has risks. It's just extraordinary to me that we hear this trope: they're safe and they're effective. They're safe and they're effective, right? Well, they're sort of safe and they're sort of effective, for a lot of them. And I think that phrase, over the pandemic, is what really shifted things. It certainly did for me. It's one of the big concerns.
[00:06:43] I mean, that was something we all lived through recently, and we saw the marketing around vaccines. We saw how that played out. And when you talk about a new vaccine, "safe and effective" doesn't make any sense. It's just propaganda. What should have been said, what would have been completely reasonable if we were being honest, is: based on the information that we have, the benefits seem to outweigh the known risks at this time. We don't have any long-term data, so we don't know anything about the long-term risks from the vaccine. But based on what we know, the risks seem minimal, and the benefits seem great for decreasing death and decreasing hospitalizations. So here's why we're recommending it. We're recommending it to these individuals because they're more high-risk; for the individuals who are less high-risk, here's our recommendation. What's wrong with that? That's honest. That's what we should have been saying instead of "safe and effective." Safe and effective. And then people had problems. With anything you're going to take, there are going to be issues. And if you're not honest, you see this decrease in trust in medicine, because people say, "Well, they're not being honest with me about that. What else are they not being honest with me about?" And I'm seeing that in the office. We're seeing vaccine rates decline across the board. Trust in medicine is at the lowest it's ever been. It was 70% before the pandemic; in many studies, now it's around 40%. That's crazy.
[00:07:59] The vaccination rates: there are more kids than ever that are unvaccinated, and more kids than ever that are not fully vaccinated. We have to take a hard look as physicians and as a medical establishment, look in the mirror, and say, "What are we doing?" Medicine is trying to stop vaccine hesitancy; they're actually creating it by not having discussions and debate. We have to look in the mirror and say, "What can we do? How do we bridge that divide to bring honesty and trust back into the system, to hear what concerns parents have, and not vilify them or call them anti-vax, but listen to their concerns so that we can address them and get the research we need to either disprove what they're saying or make changes that make the vaccine safer, whatever it is?" But we're vilifying people for talking about it. And that creates a system where people don't want anything to do with medicine anymore. They don't want to go to doctors. They don't trust anything. And that is bad. Look at the COVID vaccine rates right now. You're talking about something like 10% of kids getting the vaccine this year, with a recommendation from the CDC. That means people don't trust the CDC anymore. If half of kids are getting flu shots, 10% of kids are getting COVID shots, and 20% of adults are getting COVID shots, then there's a distrust of the things the CDC is recommending, and that is very bad. We need our establishments to be there to protect us and to provide us information.
[00:09:23] And if we don't trust what they're saying, it's like the boy who cried wolf. What happens next year, or the year after, when something does come out that we need to be listening to, and everybody decides, "I don't trust the CDC, they don't know what they're talking about," and then we don't do the things we need to do? That's very concerning to me, and we have to take it seriously.

I think what you're saying is so important. Public trust in medicine has gone down. The questioning of vaccines has gone way up. And one of the things I find very disturbing is hearing people in government say it's a waste of money to do any further studies on vaccines because we know they're perfect. I'm like, what is this? Is this medicine? Is this science? There's nothing else I can think of in medicine that's like that.

No, it's propaganda. Science is fundamentally based on hypothesis generation, which is questioning an assumption and then trying to prove your hypothesis correct. It's always about the question; it's never about the answer. And yet they say we don't need any more research, that it's a waste of federal dollars to actually look at this. Well, I want to go back a little bit, Joel, because I think people need to understand how we got here.
[00:10:29] When I was in medical school, we had just DTaP, which is diphtheria, tetanus, and pertussis; we had MMR, which is measles, mumps, and rubella; and we had polio vaccines. That was it. That's what I got, that's what my kids got, that's what I got when I was little. And when I was in medical school, I got hepatitis vaccines, because you were at risk of getting hepatitis B or A from traveling or from the medical work we do, and I think that's fine for certain high-risk populations. But now, instead of getting something like eight jabs, we get something like 72 jabs from a whole host of different vaccines. There was a problem with the swine flu vaccine, where it caused a lot of side effects and complications and deaths, and Guillain-Barré syndrome, which is paralysis. It was not a good situation, and that was in the '80s. And then the vaccine makers said, "We're not going to invest any more money in vaccines unless you, the government, protect us." So rather than the manufacturers being liable for any complications or problems or side effects, the government took that on. It's called indemnification. We indemnify the pharmaceutical manufacturers who make vaccines so that they can't be sued. And since that time, there's been a vaccine adverse event reporting system called VAERS, which has received over 2.6 million adverse event reports. Now, not all of them are causal.
[00:11:55] So it doesn't mean that if you got a vaccine and then got a fever, the fever was caused by the vaccine, but they report it. And more importantly, there's been $5 billion of federal money spent to pay people who've been injured by vaccines. So on one hand, the government is paying out all this money and acknowledging there's a problem, and on the other hand, they're saying, "No, no problem. It's perfect. Don't study it anymore." Right? So that led to this plethora of vaccines on the market. The companies were like, "Okay, it's open field day. We can just make vaccines, no one's ever going to sue us, and we can make a ton of money." And take the RSV vaccine, which protects against an important disease that kids get, respiratory syncytial virus, a respiratory disease: the company made a hundred billion dollars off that vaccine. Why are we protecting them? So can you talk a little bit about that? What's wrong with the system, and what do we need to do to think about it differently?

It's such a problematic setup for safety and research. It's great that we have more vaccines; that's a good thing. We want to protect against diseases that we can protect against. But the issue is we now have this system that protects the companies. They were threatening to go out of business in the 1980s, especially around what you were describing and also the DTP vaccine, and there were a lot of lawsuits around that.
[00:13:12] They had to stabilize the market in some way, and what they decided to do was take the liability away. But the problem is, if you take away the liability, the companies have no incentive to make vaccines better. And they have every incentive to get a vaccine on the market, every incentive to get something on the schedule. And they have so much money and so much power, and that has grown over the last 40 years to the point where they're bigger than many countries. They know what they're doing, and they know how to lobby everything and push things forward. So we're at this point now where, as doctors, we're trained without ever talking about safety in med school, right? It's: here's the schedule, here are the diseases you protect against, go do it. You don't think anything differently. And I didn't think anything differently when I was going through school.

Get your mammogram, get your colonoscopy, get your whatever. Those are fine, but it's in that same bucket: this is just a generally safe thing to do.

Exactly. And anybody who says anything about vaccines: you're crazy, you're a crackpot, you're vilified. A parent whose child has a reaction to a vaccine, or at least thinks they had a reaction to a vaccine, is vilified. They're called anti-vax, which makes no sense. This is a parent who took their child to get a vaccine.
[00:14:23] They believed in vaccines, and they had what they think was a reaction, and then they're vilified as a crazy, tinfoil-hat-wearing crackpot, vilified for talking about it. And yes, sometimes it's correlation. You could get a vaccine today, have a heart attack this afternoon, and die, and that might have nothing to do with the vaccine. But if it happens, it's certainly reasonable to consider that it could be related. Just asking the question should not be something we're unable to do. But that's how it has seemed, certainly for the last few years, to the point where the censorship was so great that things were labeled as misinformation or taken down even when they were true. Mark Zuckerberg said recently that the government was pressuring them to take down true information about vaccines because it would lead to hesitancy. And that's the opposite of what we need to be doing. That's not honest. We need honesty. We need to have discussions, not to stop people from vaccinating, not to stop vaccines, but to hold these companies liable in some way, or to hold them accountable. Right now we're seeing this back and forth between the pharmaceutical companies and government. They fund all the politicians. They fund the news. They fund everything, and they do it on purpose. It's not philanthropic. They know what they're doing. It's "you scratch my back, I'll scratch yours." And that's what we're seeing.
[00:15:43] And it's gotten to the point where doctors don't even realize how much we're being fed a pharmaceutical curriculum. The joke in the book is that there's no "Why We Don't Trust Pharma 101" in med school, right? It's: here are the things you should do. And they're funding the med schools. They're funding the journals. They're funding the studies. They're doing the research. They know what they're doing. And that doesn't mean a vaccine is unsafe or a medical product is unsafe. We need medicines. We have a lot of amazing medications and things that are fantastic. But who is watching over the pharmaceutical companies? Who's pushing back? Who's making sure they're safer tomorrow than they are today? And who is doing the research to look at the things they don't look at? They study the products before they go on the market. They're not going to look at their products again to find problems with them. If you're the CEO of a company, you're not funding a study that says, "Hey, let me check my medication again to see if it's causing allergies. Let me see if it's causing asthma five years from now." Why would they do that? Who is doing that? Nobody.

And then there are cases in medicine where scientists or doctors have questioned things, like Vioxx, one of the most powerful anti-inflammatory drugs we've ever had. Steve Nissen at Cleveland Clinic, who I know very well, is a brilliant guy.
[00:17:01] And he said, gee, it seems like there may be some connection between increased heart attacks and taking Vioxx. So he did the research, he showed there was a problem, and it was taken off the market. But you can't do that with vaccines. You certainly can't do it anymore. Almost anybody who does a study on vaccines, if they show any sort of problem, gets vilified. So I think there's this huge disincentive to even do research on vaccines. We just don't see a lot of it these days. There certainly is a little bit, but most of what you see is individuals who are very pro-vaccine doing pro-vaccine studies, looking for an outcome, funded by a pharmaceutical company in the first place. And that is what we're seeing. And you can't say that we never have problems with vaccines. We took the original DTP off the market because it may have been causing a lot of encephalitis and neurologic issues. We took a rotavirus vaccine off the market because it was causing intussusception. That's what we should be thinking about. Not that we don't want the rotavirus vaccine, but: hey, is there an issue with the vaccine we have now? Maybe we can figure out why we have that issue and make it better. Wouldn't we want to do that? Doctors don't want to harm patients. Doctors are good people.
[00:18:12] And if there were good research and good studies saying, hey, this ingredient in vaccines is increasing your risk for asthma, or this combination of vaccines increases your risk for eczema, doctors would want to know that. We'd want to look at the schedule, adjust the schedule, and figure out how to provide the most benefit from vaccines while minimizing the risk. While there isn't a lot of research out there on long-term risks, there is some. If you look at a very pro-science book, they just say, "Oh, vaccines have nothing to do with allergies. Vaccines have nothing to do with eczema. There's no research on that." But if you look at a different book, like Neil Miller's book, there are good studies showing some potential connections between eczema and vaccines, some connections with allergies and autoimmune conditions. The majority of research still says there isn't a connection, but there certainly are some good studies showing there might be some relation, or that if you change up the schedule a little bit, or delay things a little bit, you can see a difference. That's the kind of research we should have more of. There's nothing controversial about saying, "Hey, let's study the schedule we have. Maybe we can try some alternate schedules and see if some of these rates go down. If they don't, great." That's the research parents want.

That's right.
[00:19:27] And I think the other problem is that when studies on vaccines are done, they're basically done with a single vaccine. Most of them aren't even tested against placebos.

Right. Well, I want to talk about that, because it's really important to be very specific. I've even heard RFK say this incorrectly; I know he knows, but he doesn't always say it very precisely, and it matters. Every single vaccine, before it hits the market, is studied against a placebo. The thing is, a lot of those placebos are not inert.

That's right. And that's the thing. They say the word "placebo": "No, no, it was a placebo-controlled study." And it was, but it was studied against another vaccine, or an earlier version of the vaccine. Basically, all the childhood vaccines on the market were never studied against an inert placebo like water; they were never studied against saline. So when somebody says they weren't studied against a placebo, another listener who's very pro-vaccine says, "Oh no, they were. What are you talking about? They have all this pre-licensure testing." But it's very specific: they rigged the deck. They studied against another vaccine, and they studied it in the setting of giving other vaccines. Most studies look at one vaccine versus another vaccine in the setting of getting all the rest of your vaccines. So you're not studying vaccinated versus unvaccinated kids, and you don't have a true baseline of safety.
[00:20:40] And there's a huge difference in a vaccine study between saying "the old DTP versus the new DTP is safe" and saying "DTP versus nothing is safe." That doesn't mean there's a safety issue, but they're all studied that way. And if you go back through the history, there's a lack of integrity, to a degree, in the science. I think they design the studies to show the outcomes they want. They certainly want to minimize their chance of having a problem. They rig the deck in a sense, because if you're doing a blinded study, unless you're lying, you don't really know who's getting what. So the best way to rig the deck is to test against something that's going to have a similar profile, and then you minimize your chance of showing a problem.

So it's like testing Advil against Aleve.

Correct. Exactly. And then they both have the same profile of issues. Say you test Advil versus Aleve: you're not going to see any increase in problems, right? Say there were three seizures in each group in that study. They say, "Well, there were three seizures in both groups, so there are no concerns." But if you had studied it against water and had zero seizures, there would be a difference. And that's not the way things are done. And we're in this weird place now because of ethics, where it's hard to do a study with unvaccinated kids when you have a vaccine that's already on the market, because there's an ethical issue around not giving the standard of care.
[00:21:56] So because they weren't done correctly, or in the best way, at the beginning, now we're in this catch, because we have all these vaccines on the market. If we want to restudy them, how do we do that in a way where we can get a baseline? It makes it really tough.

It's tough. You've got to go to populations that are unvaccinated, like the Amish or Mennonites or groups like that. But we should acknowledge this, so that if we do bring something new onto the market, we can raise those standards and say: we need you to do a study against an inert placebo for a new product. Take the Prevnar vaccine. In the original study, there were no other Prevnars on the market, and they studied it against a meningitis vaccine. That makes no sense. They could have studied it against a saline solution, but they set it against meningitis.

Yeah, that's shady science. And it's amazing that the FDA allows that and then approves it. There's no reason you should approve that. They should require a third group that gets saltwater. There's no logical reason you wouldn't do it, unless you don't want to take the chance of showing the difference between the vaccine and saltwater. You may find the exact same thing, but we should require it. And you know, Peter Marks just resigned from the FDA. He was sort of the head of the vaccine program. Do you think it was because of some of these issues he didn't want to look at? Why do you think he left?
[00:23:03] I think he left in part because he knows there's going to be a greater lens on vaccines. For anybody who has lived in that world for a long time, who's been extraordinarily pro-vaccine and doesn't want this discussion or debate, I think it's going to be really tough moving forward, because a lot of the people coming in do seem to want to have the discussion a bit more. Individuals who've had a long career don't necessarily want to be in a battle again. These are people who love what they do and love helping patients, and almost all doctors firmly believe in vaccines. They don't want to have that discussion; they're not even aware that they should look into it. I really think people should look into it. You don't have to take my word for it; go look at the research yourself. I didn't know any of this ten years ago. I didn't even know most of it two years ago. Until you really dive into it, you don't know. And in presenting this information, I've shown it to a few conventionally trained doctors so far, and all of them are still my friends and still talking to me, which is great, because I don't know how this is going to be received.

And are they open to it when you share the data?

They were open to it. The discussion calms down very quickly when they ask, "But what do you think? Do you not want people to get vaccines?" And I say, "No, I do want people to get vaccines."
[00:24:15] I just want people to have information, and I think we could always have better research, and here's the information. And that seems to calm people down. [...] RFK [...] vaccinated, and that, as far as I know, is not the truth in any way, and certainly not what I'm aiming for. I think doctors want to know that we're moving forward in a way that still promotes vaccination, but maybe does it in a safer way. They just have to be open to the discussion first. And if you come at it from that angle, there isn't any doctor out there who doesn't want safety for their patients, and science, and better science. They want that; I just don't think they know, because I didn't know. And we sort of take it for granted. There are all these vaccines now on the market.

The schedule has how many vaccines now?

A lot. It depends on how many combos you take, but you're talking about 30 to 40 actual pokes if you're doing combination shots.

You know, when we talk about evidence-based medicine, it's sort of a trope used to discredit people who question science. To my knowledge, and maybe I'm wrong, there are no studies that look at the combination of all these vaccines at these ages. Each one is studied singly. So if you do 30 different jabs, with combos, maybe dozens of vaccines, what is the cumulative effect? No one has really looked at that. There's nothing even close to that. And how many vaccines are new in the last few years?
[00:25:46] You have to do vaccinated-versus-unvaccinated studies, and you have to look at the vaccines together, and it's not unreasonable to do that. So many parents are concerned about too many vaccines; that's one of the biggest concerns I hear over and over again. That is a reasonable thing to discuss. When you have more and more vaccines, even if there's only a small risk, that risk compounds. When you're doing four vaccines in a day, when you're doing vaccines at birth and two months and four months and six months and one year and 15 months, and you're just getting more and more vaccines, logic would say that at some point there's going to be a breaking point, that there are going to be too many. So we need to be thinking about where that breaking point is: when do the risks increase to a level we don't accept, and where do we need to scale things back to some degree? But the issue right now is, if you look at immunize.org, if you look at Paul Offit's information, and I love Paul Offit's book, I think he's a really bright guy, he says things like, "Oh, you could have 10,000 vaccines. You can have as many antigens as you want."

I don't think that's how it interacts with the immune system.

Totally correct. I understand the general thought: okay, we're exposed to a lot of antigens on any given day. You eat an apple, you're eating all sorts of bacteria and all sorts of stuff that's on there. So yes, with those, you're not getting adjuvants.
00:26:57 You're not getting adjuvants, but even still, you're not getting injected with that. It's going through the regular pathways that our body has always developed to deal with these pathogens. You're not injecting it into your body, so it's a different pathway. And on top of that, you have all the other things in the vaccines. You have all sorts of adjuvants and all sorts of other ingredients, and we don't 100% know how those interact. And so there has to be some upper limit of safety for the adjuvants. I mean, it's not woo-woo or anti-science to say there's got to be some upper limit to the amount of aluminum that you could safely take from a vaccine. We have upper limits for it in every other aspect of our life, like drinking water and the amount that we breathe. Giving yourself 10,000 vaccines with aluminum doesn't sound like a great idea.

00:27:38 No. And historically it's interesting. When you look at the vaccines, and this happened very quietly, there was a preservative used in most vaccines called thimerosal, which is ethylmercury, and it was used for something like 100 years. And, you know, mercury is the most potent neurotoxin, the second most potent toxin after plutonium on the planet. You could argue that this form is better than that form, but at the end of the day, it's mercury. And what happened was, no one had ever added up the amount of mercury in the vaccines that the kids were getting.

00:28:14 But isn't that crazy, before you go on?
00:28:15 Isn't that crazy that no one ever added it up, and no one thought about that before they put it in a vaccine and injected it into kids?

No, it's crazy. But you would think that they talked about it, and it was like, oops. But how could you inject something like that without at least having the discussion: is it safe or not? Is it okay to put this in here? And by the way, for a kid, it's about 162 times the amount of mercury that the EPA says is safe to have in a baby, right? And then quietly, around 2000, they took it out of most vaccines except the multi-dose flu vaccine and a few others. So it's still out there. I actually went to the US federal government and met with the head of vaccines for every single department: CDC, NIH, HHS, FDA. And I said, "Look, here's the data." And we presented over 900 studies on the effect of thimerosal on human health. As a cumulative amount of data, it was pretty compelling. No single study will tell you what is good or bad; you kind of look at the sum of everything, and it's like, okay, this is probably not a good idea. And they kind of all agreed. And one of them said, "Well, maybe it's an adjuvant." I'm like, well, it's not approved as an adjuvant. An adjuvant, for everybody listening, is basically something that pisses off your immune system so the vaccine will work better. So you give something irritating like aluminum or mercury. Is that a good thing? And, for one, it's not approved for that.
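The "162 times" figure above is a cumulative-dose comparison: total injected mercury on a given day versus a body-weight-scaled reference limit. A minimal sketch of that kind of accounting follows; every number in it (per-dose content, number of shots, reference dose, infant weight) is an assumption for illustration, not the actual inputs behind the speaker's figure.

```python
# Hypothetical dose-accounting sketch. All numbers below are assumed
# illustrative values; they are NOT the inputs behind the "162x" claim.

def exceedance_ratio(doses_ug: list[float],
                     rfd_ug_per_kg: float,
                     weight_kg: float) -> float:
    """Ratio of total mercury received in one day (micrograms)
    to a weight-scaled daily reference dose."""
    return sum(doses_ug) / (rfd_ug_per_kg * weight_kg)

# e.g. three thimerosal-containing shots of 25 ug each, a reference
# dose of 0.1 ug/kg/day, and a 5 kg infant -> ratio of 150:
print(exceedance_ratio([25.0, 25.0, 25.0], 0.1, 5.0))
```

The point of the sketch is only that the comparison is simple arithmetic once the per-dose content and a reference limit are fixed; the contested part of the debate is which reference limit applies to ethylmercury, not the division itself.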
00:29:39 And then I also said, "Hey, would any of you willingly inject your grandkid or your kid with mercury if there was a better alternative?" And there is, and there was. And they're like, "No." So why is it still on the market? And why does the FDA still approve it as a preservative in vaccines? It's kind of crazy to me.

00:29:56 Yeah. I mean, I've read what you've written on it, and I totally agree with your perspective. I always have. And the research still, to this date, is not clear that thimerosal is damaging to the human body. I mean, that's what you hear people say back all the time: "Oh, the research still doesn't show that it's damaging. It's a different form of mercury, and it's a very small dose, and it's safe." But at the same time, you could just walk by a whiff of a peanut, if you have a bad allergy, and it can cause a serious reaction. So even a small amount can cause harm for some people.

00:30:24 Yeah, it's heterogeneous, and I think this is a really important point. Not everybody is the same. Each of us has different genetics, different immune systems, and there's a whole field of vaccinomics, which Mayo Clinic is leading, where they're looking at how different people respond to vaccines, which is great, right? But that means it should be personalized. Where is that, though? We're putting 50 vaccines in kids. Where is the research to say, all right, here are the 10 things you should do to minimize your risk of having a reaction from a vaccine? What is controversial about that?
00:30:55 What are the genetic markers? What are the lifestyle factors? How could you decrease your risk of having a reaction? That's going to give parents more confidence to give their kids vaccines, because they're decreasing the risk, and it's going to decrease the side effects for kids. Why wouldn't we want that? But where is that research? Where are the studies? I think it's because you have to admit that there's a potential problem, and nobody wants to do that. Nobody wants to say you could have a side effect, even though we know... we have IOM reports, we have individuals that get encephalitis and Guillain-Barré and febrile seizures. And again, these are fairly rare, but they happen. And if those things happen, I don't know why other things couldn't happen too in certain kids. If we would study it, you probably would see that some of these things have at least a slight correlation for some people, because everybody's different.

00:31:40 Yeah, we're moving into precision medicine, or personalized medicine, and the science is going there. We're going to be able to map your whole genome for a few hundred bucks and understand what's going on with you and how to personalize medicine. And this is happening already. I mean, I do this. If someone has a lipid issue and I want to prescribe a statin, which I don't do that often, but if I do, I'll check their genetics: are they going to get statin myopathy, muscle damage from the statin? Or there are genetic tests to see whether people might not tolerate aspirin or other things.
00:32:06 So we're already doing this in medicine. It's called pharmacogenomics, and it's a whole field of understanding the differences in each individual and how they'll respond to different treatments. This should also be the case for vaccines. And we just have this kind of knee-jerk reaction that they're just safe and effective, and if you question it, you're a kook. And I'm not a kook; you're not a kook. We look hard at the science, and the reality is we need more, because we don't really have all the answers. It's still confusing, even to me. I'm like, I don't know, right? Because we don't know.

00:32:33 Anybody that says the science is settled on long-term issues: how can you say that? Where is the research for that? Who is following kids for five or ten years? How do we have any idea? I mean, you talked about VAERS before, that adverse event reporting system we have, but that's personal report. You do it, the company does it, the doctor does it if they decide to do it. But you have to associate the issue with a vaccine. If you get the hepatitis B vaccine today and you get some sort of cancer 10 years from now, you're not going to be thinking, "Oh, well, my kid probably got cancer because of the vaccine they got when they were two months old, so let me go report that to VAERS." There's no way to see those things unless you follow kids prospectively. You follow them for 10 or 15 or 20 years, and you say, "Hey, how come the kids that are getting more vaccines have more of this kind of cancer? How come those kids have more asthma or allergies?"
00:33:22 You don't know that unless you follow it prospectively. People can do research on that topic, but nobody wants to do research on vaccines. You'd have to follow people for a long time. You'd have to have that association in mind beforehand and have 10 years of time to follow those kids. It hasn't been done. But we could do it. We could do something like the Framingham study. We can start following kids for 10 or 15 years. Yes, the ethics are difficult around it, but you can certainly have people self-select which group they're going to be in. There are plenty of people that are going to choose not to vaccinate. It's not going to be the perfect study, because there's going to be some bias there, but at least you can get some good information. Wouldn't it be really useful to know, 10 years from now, that the kids that live an unvaccinated lifestyle have a lower rate of asthma than the kids that get vaccines? And then you can say, okay: autoimmune disease, or autism, or whatever comes up. Then you say, well, why do the kids that are living an unvaccinated lifestyle have a lower risk of autoimmune disease? Is it the food that they're eating? What is it about their lifestyle that's different? There's nothing wrong with that. That would be super useful information. And if we found that there are no differences, that is the information that parents want; that will make them more confident to get vaccines. That's what they want. And if we have that research and we show those things, then you're going to improve confidence in the vaccine program.
00:34:38 And if you do find problems, you're going to improve confidence, because you're going to fix those problems.

And I think that's right. I mean, it's really about upleveling the science, and that's really, I think, what is happening now. And I'm glad that's the case. And yes, it's going to cost money, and it's going to take time, but we have to ask the questions. We can't just fall into this "there's no evidence," "it's been settled," "the science..." I mean, these are things I hear all the time in the news. I'm like, what? First of all, you're a reporter; you shouldn't be talking about that. Second of all, this is just anti-science. It's truly anti-science to say the science is settled. It doesn't make any sense. You never say that in science. You're always looking forward, and the science is definitely not settled on any of these topics. There's no way you could say that, going through it at this point. I have no reason to say anything other than that I'm not against vaccines, but there are so many aspects of the science that are just not settled. And the foundation of a lot of this research is quite flawed and not great. You go back to the research on some of these original vaccines, like DTP and polio: those studies were not done in the way we do studies today. And so you're basing the safety profile on some of the things that we used to do. We didn't have randomized controlled trials back then.
00:35:50 Other than the polio one, the original one, which was an amazing study. It still wasn't quite to the degree of the things we do today, but it was like 2 million people in that original polio study, because people were really invested. Can you imagine if we had studies today that had 2 million people in them? That is the kind of research we should require to give a new vaccine to our kids, especially to a baby. You know, we have a hepatitis B vaccine on the market where the original studies in the package insert say the safety was studied for four or five days, and there are all sorts of requests to look: where is the other research, where is the data that it's safe? How can we give something to babies without the strictest evidence that it is safe and of the most benefit? That's why, again, so many people question the vaccine program: because they say giving hepatitis B to a one-day-old baby doesn't make sense. So why are we doing this? A disease that's only acquired by sex or intravenous drug use. No baby is doing those two things. And the theory is, well, they're here in the hospital, we've got them, let's just do it, because they're going to leave and we don't know if it's going to be a problem. Right. But other countries actually don't mandate that. Most countries don't mandate it, certainly not in the first couple of days. I understand; it certainly has worked. The program to give vaccines for hepatitis B has decreased the risk of hepatitis B.
00:36:58 But you're basically pushing the notion that vaccines don't have any potential harm or issue when you're giving them to a one-day-old baby. And we have to be reasonable. Now we're doing a hepatitis B, a vitamin K, and potentially an RSV shot in those first couple of days. There has to be a point where you say, how many things do you want to give to a newborn baby that is one day old? Okay, let's look at hepatitis B. Do we truly need to give that to a one-day-old baby? Or, if the parent has been tested and doesn't have hepatitis B, is this something where we could say: if you want to get it now, go for it, but if you want to move it back a few months, or do it later in life when you're going to get your other vaccines, maybe your teen vaccines, can we have that discussion? Because you're throwing a lot of patients off of vaccines in general, because they say this doesn't make sense: my baby's not having sex, and you're telling me I have to do a vaccine on the first day of life? Now I don't trust you.

00:37:43 Well, I think that's an important thing. And I kind of want to zoom out a little bit and ask you about a theory I have, which is that there's been a real change in our overall immune health as a population. We've seen a dramatic decrease in infectious disease, and I think we can talk about why that is; I think it has a lot to do with sanitation and hygiene, and not necessarily the vaccines.
00:38:07 And we can talk about that. For example, measles: when the measles vaccine came out in 1968, measles had already decreased by over 90% in the population.

Yeah. And measles vaccines do work, and they help reduce the risk, especially in developing nations where these kids are susceptible. And that's what was going on in the 1800s and early 1900s: the lack of sanitation and hygiene and flush toilets. I mean, it was just a disaster. So yeah, people were getting really sick. They were malnourished. They didn't have vitamins and minerals. There were so many reasons why they were so susceptible and died from these diseases. And when you look at the developing world, those are the kids who die: the kids who are malnourished and don't have the nutrients in their system to fight these things. And the same thing with COVID. You know, COVID only killed the people who were susceptible, which was the chronically ill and obese. And that's why we had four times the rate of other countries in terms of the number of deaths per population: 16% of the global deaths and 4% of the population of the world. So how does that square? But the theory I kind of have is that our immune systems have been screwed up, because there's been an increase in C-section rates, there's been a lack of breastfeeding, there's been early use of antibiotics, there are other stresses that the baby has, environmental toxins, all of which dysregulate the immune system. And you give a vaccine on top of that, and there's a problem.
00:39:25 And I don't know if you were trained like this, but I was. When I was trained, and I did family medicine, so I did a lot of pediatrics, if a kid came in and they were sick, had a cold or something, you could not give them a vaccine. Just don't do it, because it's going to mess them up and it's going to be bad. That was what I learned. But then what's happened is that doctors will say, "Oh god, I don't know if the kid's going to come back to the office. I'm just going to give it to him anyway." And that kind of seems odd to me. I'd love you to comment on my theory, and on this whole sense of, "Well, we've got him in the office, we've got him in the hospital, let's just give a vaccine even if it's a bad idea."

00:39:55 Well, I totally agree, going back to the first point. It is really important. Physicians, everybody, need to go look back at the history. There are some good books on it. I didn't know any of this even a couple of years ago, but the reality is that sanitation and nutrition had a huge role in our immune systems. Before the vaccines came out, all of the diseases were way on the downswing. Most of the diseases had almost no deaths anymore, or very, very few. I mean, even measles before the vaccine: you had a couple hundred deaths a year, lots of cases, but still not that many deaths.
00:40:31 And so, because we had antibiotics, because we had hospitals, because we understood medicine better, we knew what a virus was and what a bacterium was, we knew how to treat things, we were eating better, we had vitamin C, people didn't have scurvy anymore... all these things make a huge difference for your immune system. We saw the death rates plummet. Now, there were still lots of cases of some of these diseases, and if you look at when the vaccines came out, a lot of the vaccine-preventable diseases did decrease to almost zero afterward. So they certainly had an impact. I guess the most memorable for a lot of young adults would be chickenpox, right? We all had chickenpox when we were young. We used to have chickenpox parties, right? And we did, but there were still lots of chickenpox around. And then the vaccine came out, and now we almost never see chickenpox anymore. So you can't say that the vaccines don't do anything. You can see that they clearly dropped the morbidity, and for some kids the mortality, from chickenpox. I mean, I hear it too from parents, right? It's like, oh, chickenpox, I had it, it was nothing. But when you have an entire population of kids getting chickenpox, millions of people, some kids get a really bad infection that puts them in the hospital, some kids get a meningitis, some kids get a pneumonia. So things do happen at a population level.
00:41:41 There is certainly some benefit from public health initiatives, and that's where the vaccines certainly come into play. But to say that our health is better because of vaccines, or that they're the thing that's doing it, I think that's an overstatement. I think it's a part of everything we've done that has brought us to where we are today. We certainly are better than we used to be in terms of our health. We used to live till 40; we live into our 70s now. That's good. So not all of medicine is bad. But we're seeing our life expectancy go back down. We're seeing chronic disease rates skyrocket. And we need a little humility in medicine to say, "Hey, maybe some of the things we've done in the last 20 or 30 years aren't all the best. Maybe we're not doing everything perfectly. And maybe there's some sort of middle ground here that can get our life expectancy to 90 or 100 or 120." And maybe all the toxins we're exposed to, and the chemicals we're exposed to, and the food we're eating that's not the best anymore (most kids are eating mostly ultraprocessed foods), maybe that has something to do with what we're seeing. And maybe, just maybe, vaccines have some part in that for some kids. Maybe there are certain genetics where it throws you over the edge. Or maybe, for kids that are already so depleted, don't have the right nutrients, have a toxic load, it throws them over the edge.

00:42:51 That's right. And most kids do fine with vaccines.
00:42:54 I mean, I've given vaccines in my office, and I've never seen the horrible things that are written about. I've taken care of patients afterward who swear by it, and there are way too many stories not to believe it. And there are certainly people that have gone to court and won their cases over it. So there's a lot of...

Have you established, in your practice, adverse effects? I mean, we see a lot of benefit. I'm acknowledging that; I've seen that as well, and I definitely recommend vaccinations. The question is, what kinds of things have you seen? Because I personally have seen a lot of things, and I'm like, hmm, this is interesting. It's supposed to be okay, but I'm seeing these problems.

In the short term, I wouldn't say that I've seen all the horrible things that are out there. I've seen really bad rashes. I've seen fevers. I've seen kids crying for a long time and being super fussy. I've seen all the minor or moderate things. I've never personally seen any of the super severe things that people have had, but they've come to me after the fact and sworn by it. So I've seen lots of kids that have stated it, as well as, certainly, lots of kids...

So you've seen kids who were ill, who had been vaccinated, who swear by it. Like one patient: they had a newborn, and the baby got vaccines, and then had a super fast heart rate, in the 200s, SVT, and they had to go to the hospital.
00:44:06 It was right after, like a couple of hours, and then everybody said, "Oh, no, it's not because of the vaccines; it probably has nothing to do with it." And then they ended up getting more vaccines a couple of months later, after being convinced, because they were going to be kicked out of the office if they didn't. The exact same thing happened. The doctors at the hospital still didn't relate it to the vaccines; the doctor they had still didn't relate it to the vaccines. And the mom was crying, because she was like, "I don't want to give my kid vaccines anymore." She had another, older kid that was fully vaccinated. She wasn't against vaccines, but this kid was having a clear reaction an hour or two after getting vaccines, and it was affecting this child's heart. And the doctors at the hospital wouldn't call it a vaccine reaction. Their own doctor wouldn't call it a vaccine reaction. When they wanted to stop doing vaccines, at that point, for that child, the doctor kicked them out of the office, or told them they couldn't come back if they didn't continue vaccinating. So that's when they came to me, and I was heartbroken, because we're at this point where doctors are so afraid to call something a vaccine reaction that we're doing our patients a disservice. In my state, it's basically impossible to get an exemption at this point unless the child is practically on their deathbed.
00:45:08 And that's doing a disservice to kids that have all sorts of concerns, and taking away the ability of parents who have a reasonable concern around a vaccine, or a reaction, or a previous reaction, or a family history of something. You can't take that into consideration as a doctor, which I think is unfortunate, because people are scared.

00:45:22 It is interesting, this moment, because, you know, RFK Jr. is now Health and Human Services Secretary, and he's been labeled an anti-vax conspiracy theorist. I know him personally; he's been on my podcast a number of times, and I've had deep conversations with him about this. He's all about questioning the data, which is something that should be just a natural part of the evolution of our scientific inquiry about anything: vaccines, drugs, surgery, anything. He gets labeled in that way, and he's just calling for better science, and to actually ask the questions: is there a problem? You know, he was an environmental lawyer, and he was giving talks about mercury in the rivers and the pollution from the coal burning we had in this country, and from cement plants, and how it was causing huge pollution that was contaminating all the fish. If you look at the EPA recommendations, you shouldn't eat fish from any lake or river in America, period, ever; they're just poisoned with mercury. And when he was giving these talks, women would come up to him and say, "Hey, you know, my kid got these vaccines with mercury in them."
00:46:27 "Maybe you should look at that, because the kid got sick after, or they got autism, and maybe there's something there." And so he went down that rabbit hole, and he's a smart guy, and he saw there was some signal there. Now, do we know absolutely that it's true? No. But there's some noise there. And I can tell you, in medicine there's something called the doctor-years of a disease. In other words, how many years you have to be practicing medicine in order to see a certain condition. Like, I've never seen a pheochromocytoma, which is a rare tumor of your adrenal glands. I've just never seen it, and I'm probably never going to see it, and you've never seen it.

I've never seen it, right.

And you could probably ask 30 doctors, and they've never seen it, and they collectively practice hundreds of years of medicine. Personally, I have seen many, many cases where parents say, look, my kid got vaccinated, and afterwards they got sick, or they got autism. I could literally cite dozens in my small cohort of patients who've had autism. It's not uncommon. And so it calls into question: what's happening with these kids? Are they just susceptible to it? And I always hear the same story. That's why I mentioned susceptibility: C-section, bottle-fed, colic, lots of antibiotics, ear infections, strep throat, whatever. Boom, they get the vaccine at 15 months, and then, wow, they're off to the races with regressive autism.
Regressive autism is where you're developing fine, you're talking, you're walking, you're doing all the normal things, and then you stop and you go backwards. And it's like, well, they weren't born with this. What happened? And so nobody wants to ask that question. And when I sort of dug deep into these kids' health, they had terrible gut issues. They had tons of heavy metals in their system. They had lots of nutritional deficiencies. They sort of had a susceptibility, when you start to look at it. And when you correct these things, these kids often would improve or get better, which was amazing to me. So I was able to undo the harmful effects by actually treating it. Almost everybody would say, "No, the science has been done." They would say that, and that's what I would have said 10 years ago. And I think that's why it's so important to talk about it, because when you say something like, "Oh, well, you know, this child had this thing happen after they got a vaccine," then the knee-jerk reaction is to say, "No, no, what are you talking about? The science is settled. It's been debunked. We have so much research on all the vaccines, and we know that vaccines don't have anything to do with autism." Nothing shocked me more when doing this book than looking at the autism research. Now, let me be clear: I don't think vaccines cause autism. Correct. There's nothing to say that. But they can be a trigger, in a susceptible kid, for dysregulation of their immune system. And autism is a neuroinflammatory disease.
When you look at the brains of kids with autism who maybe died in a car accident, their brains are just full of inflammation. The glial cells are lit up. In fact, on MRI scans, their brains are literally larger, because they're swollen. Right. And your point is the same point that I have, which is there are so many parents that are saying this. So many stories: thousands, hundreds of thousands, millions. It's information. It's data. It's parents. We have to listen to them. We have to take that into account, and we have to have discussions. You can't just dismiss these parents, who so believe in science and so believe in vaccines that they took their kid to get vaccines, and then they swear that something happened, and then you call that person anti-vax, crazy, for saying, "Look, I think my kid was normal today, and then the next day they had issues after getting a vaccine. This is my experience." Those people are the information that we need to figure out what's going on. It's funny, like, one of the kids I saw, their mother was a top executive at Pfizer, which makes vaccines. She was like, this has happened. The research doesn't show that vaccines cause autism. It doesn't really show anything. I mean, there is research on MMR, and there's research on thimerosal, and that's the majority of the research. So we have some data on that. But when people say, "Oh, it's been debunked," or, "Oh, it's been studied": I looked through everything. I went in to, you know, ride in on a white horse and say, "Look, here's all the research that shows that vaccines don't cause autism."
Like, this is why people say it. And when you look at what's actually out there, most of the research is from a long time ago. It's on those two specific things. And it's not on all vaccines. It's certainly not on vaccinated versus unvaccinated kids. So when somebody says we should get more research, and then everybody rushes to yell at them, and call them anti-science, and say that the science is settled, that makes no sense. There's nothing wrong with getting more research. It's certainly not settled, in the sense that we have all these great prospective clinical trials to show this. We don't have that at all. And that is really important: when we're talking about moving forward, we should do more research. We should listen to these parents. There's nothing wrong with that. Let's do more studies, and we'll see what they find, and then we'll go from there. You know, as someone who's practiced medicine for 30 years and seen a lot of these kids with various things, you know, kids even, for example, where I don't think the vaccine was the problem, there are things that co-cause issues. We learned from Philip Landrigan and others that lead, which was measured in kids' teeth, was causing severe behavioral issues: aggression, violence, ADD. And so they made the conclusion: gee, lead's bad. Let's get lead out of gasoline. Let's get lead out of paint. But we didn't stop using gas. No, we didn't stop using paint. We just stopped having lead in there. So that way, when kids were living in their house, they weren't ingesting lead. But it started with somebody recognizing it. It started with somebody talking about it.
And I hope, if RFK is listening, or someone's passing this along: we do need more information, and we do need more research, but what I think needs to be done first is we need to get actual data and information, of what vaccine information exists, to the doctors. That's one of the things that I hope my book can help with. But I think we need either a new commission, or some sort of literature review done by the most pro-vaccine people mixed with other individuals. And we need to go through the research, and we need to show what actually does exist, so that doctors can see it and read it, and put it in the New England Journal of Medicine, or the Journal of the American Medical Association, or whatever: some sort of major body that doctors are going to respect. I think if we can get the information out there of what actually exists, what research has been done, what we have and what we don't have, that's going to be a really good first step. Because just doing a bunch of studies, which is good, we need them, but I think if you get a study here or there that shows vaccines are related to X, Y, or Z, people are just going to trash it. They're not going to be even willing to listen to it. And I think we need to first get the doctors the information on what the research actually shows, because I'm 99% positive that most people don't know. Yeah. And when you say that, you mean a dispassionate look at the data. Correct. And not just some of the data, but all of the data. All of the data, in a report.
Presented in a way that is digestible and easy to understand for both practitioners and even lay people, because I think that would not increase what people call vaccine hesitancy; it would actually decrease it. It would. And it can't come from someone like RFK. I mean, he could push it to happen, but it needs to be someone that the doctors are not just going to say, "Hey, you know, this is an anti-vax kook, we're not going to listen to it." Get somebody who doctors love. I don't know, get Paul Offit to head it. I don't care. Whoever wants to do it. Get a group of people together, and go through and show: here's what the research shows. Here's what it shows on autism. Here's what it shows on asthma. Here's what it shows. List it all out. Let people read it. Let people digest it. And then they're going to have an understanding. But it has to be dispassionate, and it can't be pro-vax. It has to be balanced, and it has to show both sides. Nothing thus far does that. I hope I did it in some way in my book. That was the goal, but it's still just a book. It's not fully comprehensive of everything. There's only so much you can go through. It would help doctors. I think doctors would be willing to listen if they receive the information in a way that they would respect. And right now, I think there's this concern, on everything that's talked about when it comes to vaccines, that there's this push to stop people from vaccinating, or take away their polio vaccines, which is not the case. But that's what they hear, and they never hear past the headline. And we have to get past that headline.
We have to get the information to doctors. That way we can come together to say, "Hm, that's what the autism research shows. Interesting. Okay, well then I'm not super against getting more research, because I thought there was a lot more research than that." Like, that's what would actually, I think, move the needle, in the first part. So when you started digging into this, Joel, in talking about the research, what were the things that were the most surprising that you learned, both in terms of things that were in support of the vaccination approach to managing disease risk, and also things that sort of challenged that? What were the things that you found in your discovery of these things, by just having a pediatric, kind of open-minded view of, like, I'm just going to look at what does the science say? What does the data say? The first thing that I was most surprised at was that everything is super one-sided. So you read one book and you get one view of a product, and you read another book and you get a completely different view, and they never talk about the information in both. That's number one. Number two, it's like a tennis match. You kind of go back and forth, and you're like, "Hm, that's pretty bad about vaccines. Oh, well, that's really good." So you have to be willing to do that, and kind of go back and forth, and look at: okay, here's what the diseases were. Oh, I don't want that disease. I don't want my kid to have this disease. This is what the vaccines have actually done. That's good. You have to be willing to go back and forth.
Next, the way that research was done. I was very surprised at how things haven't been inert-placebo-controlled trials, and, going back through the research, and going back through the history and the data, how we really didn't have the best trials a long time ago, and we're basing a lot on that. They're kind of rigged in some way. Yeah, they're rigged, or they were just done in the way that they were done back then, and they weren't as rigorous as that. I found listening to Aaron Siri, the lawyer, some of his trials and some of the depositions of different individuals, like Stanley Plotkin and Kathleen Edwards, just very eye-opening, in terms of just asking them questions and getting their honest opinions on vaccines and how the research was done in the past. I think that was really interesting. Can you unpack that? Because Aaron Siri is a lawyer who was involved in some of these vaccine cases. And who are these other people you're talking about, and what were they?
So Stanley Plotkin is one of the founders of vaccines. You know, brilliant guy, one of the founders of the rubella vaccine, and he's been involved in multiple trials. And I listened to his trial with Aaron Siri; it was a religious exemption case. And it's just really interesting to hear him discuss the use of aborted fetal tissue, back when they were doing the research, and how they were doing the research many years ago, and how many fetuses they were using. And just the way that they were studying things back, you know, 50 or 60 years ago is really interesting. Would that pass ethical review at this time? It wouldn't pass ethical review. And there is still an ongoing debate about religious freedom and religious exemptions with vaccines, and whether, you know, the use of aborted tissue should be considered in that. As of right now, all religions technically state that you can get vaccines, and there's nothing that's problematic about it. But when you look at the way that things were done originally, using aborted fetal tissue, then it does at least bring up that discussion. And I think that, whether or not your church says that it's okay to get vaccines, you still might personally feel like that's unethical to you. And I just thought that was interesting, because I never considered any of that before. I mean, I really didn't even know that aborted fetal tissue was in there. Yeah. Until you start learning about vaccines. I mean, these are just things... they're not aborting fetuses now. They're using old aborted fetal tissue.
But it just brings into question, when people are called anti-vax kooks: well, maybe they have a reason why they're concerned. Or maybe they're vegan, and they don't want to inject something that has pork products in there, which gelatin does. I mean, they're taken from pigs. So some of the vaccines have that in there. So these are just things that, as someone who's open-minded... it doesn't mean you shouldn't do it, but I think it's reasonable, if you understand those reasons why somebody might be hesitant. That can help build bridges. And I think we need to build bridges, as an understanding of why people might be more hesitant, that we give them, you know, credit for, as opposed to calling them some sort of crazy mom, crazy dad, or crazy anti-vaxxer. Well, maybe they have personal beliefs that are different than mine. And not everyone has to have the same beliefs, or the same understanding of everything. So that was really interesting. Long-term studies: we don't have any. We really don't have any vaccinated-versus-unvaccinated trials. That's crazy to me. And autism research. I mean, it's interesting. You know, I've heard RFK Jr. talk about this, and Del Bigtree also, who has worked with him on this issue. He said they actually met with Anthony Fauci and Francis Collins. They went to the NIH. They sat with them, and they asked them, "Can you please show us the data on which this whole program is based, and the randomized controlled trials?" And they said, "We have them."
And then they push, they push, they push, and ultimately they don't have them. Can you kind of speak to that? Is that legit? That is legit. That's in there. You know, I read all those transcripts, and I read the FOIA, the Freedom of Information Act, requests. And they've put in many Freedom of Information Act requests over the years now, but they were basically saying, "Show us the research and the data. Show us the placebo-controlled trials. Show us the data that shows that all vaccines don't cause autism. Provide it to us, please. We're not saying that it doesn't exist, but, for example, with the hepatitis B vaccine, like we were talking about, please show us the information and the safety trials that were more than four or five days. Please show us the polio studies that were done." That's what, when we're talking about Aaron Siri and RFK not wanting the polio vaccine, that's what they're talking about. They were asking for more information, and more data, and what data exists, because they couldn't find it. And that's the same thing for me: I couldn't find this stuff. Like, you're really trying to look for it, and so it either doesn't exist, or it's not readily accessible. And I would hypothesize that if you had some amazing study, then it would be front-page news in the New York Times. So it doesn't exist, probably. And I've read, you know, multiple of Paul Offit's books, and Paul Offit is a vaccine researcher, like one of the most prominent individuals behind many vaccines. Yeah. Yeah.
And he's part of all the committees, and if there was more, I would think he would have it in his books. Peter Hotez, another, you know, prominent vaccine doctor: it would be in his books. The research that I found is the same research that they had, and I went through all these books before and after, because I didn't believe what I was finding. Yeah. Because I was thinking there had to be more. You had to read the other side, and there was nothing there. It just didn't exist that I could find, in any way. And so the research that I could find was what they reported on. And I think that's shocking, and that's why I think doctors don't know, because I didn't know. I'm an integrative doctor. I get asked about vaccines all the time. The reason why I'm doing a vaccine book is because I didn't have the answers to some of these questions. People ask me about vaccines every single day. It's why most people come to my office, or what they ask me on social media. And I wanted to look at the research and figure out what is out there: not to convince people to do it, or to tell them not to, but to say, "Okay, here's what's out there. Here's what we have. Here's how we can move forward." And I was shocked at what we didn't have. And the people that were called anti-vax conspiracy theorists back in the day, like Del Bigtree and RFK, who were pushing for these things: I found the very same things that they're asking for.
When you get past the sound bites of these people, well, maybe we need to look at ourselves and say we've got to get past the headlines, to look at the actual research. And we have to care about kids, and we have to care about kids more than what pharma is pushing for us, and what we were trained on originally. We have to get past that and say, well, what does the research actually show, and what do we need to do next? Let's not call people names. Have the NIH actually build a research agenda that asks the right questions. Yeah. Or just move forward. It's not about blaming anybody. There's no shame in this. We know what we know, and then we know more in the future. And nobody's taking away vaccines. But we want safer vaccines. We want better vaccines. We want more research on vaccines, so that way, in a decade or two decades, we have better vaccines than we have today. We have lower risk of side effects, whatever those might be. And we have more data. Yeah. I think, you know, it's important. I mean, as a physician, I've seen vaccine-preventable disease. Working in the developing world, in Haiti, I saw tetanus, which I'd never seen before, and it was horrifying to see someone with a disease that was just so awful, that was hard to treat, that they could die from, that could be prevented by a simple vaccine. Or my stepfather had mumps when he was a kid. He was, you know, born in the '20s, and ended up infertile and sterile because of that. So vaccines do have a role in medicine. The question is, how do we best apply them?
So this has been quite enlightening, and we've uncovered a lot of the problems in medicine and science. We clearly need more and better science, and the trope "it's all safe and effective": I think we need to just kind of get past that and say, look, this is like anything else in medicine. There are risks and there are benefits. So when a parent comes into your office and says, "Dr. Gator" (which is your kind of nickname), "what should I do? Like, I want to protect my kid. What's the best way to vaccinate my kid? Because the CDC, the Centers for Disease Control and Prevention, which they don't really focus on much, is telling me I should do this whole list of, you know, dozens of vaccines, and this is the schedule they've given." I mean, when I was renewing my board certification, I went to a review course, and the pediatrician from UCSF was like, "Here's the vaccine schedule." I'm like, "Boy." "This is what we're told to tell you, but it's a lot." You know, even he was like, "It's a lot." And he was a regular pediatrician from UCSF. So when a parent comes in and wants to know how to proceed: which vaccines should I give, in what order, which ones together, at what time? How do I navigate that? What do you advise your patients for that? It really goes back to my philosophy on vaccines, or everything, which is informed consent. And, as I said, I don't tell people what to do. I never do. I don't tell people what to do in the book. I don't think that we should. I don't think that's reasonable.
I think it's a doctor's job to inform and to discuss, and let parents make the best decision for them. The only thing that a doctor can recommend currently is the CDC schedule. And so if individuals want to follow the CDC schedule, I think that's fine and reasonable for them to do. I think it would be extremely arrogant of me to say that I know better than the CDC, or to tell them to do something different than what the CDC says, based on my own, you know, personal knowledge. But I don't feel that people should be forced to do anything, and we should have discussions, and parents should be able to do whatever they feel is right for them. I have patients in my office that follow the CDC schedule. I have lots of patients that go on a slower schedule, and I have some patients that don't do any vaccines. Most people that come do want to vaccinate. Most people do end up vaccinating, and many do some sort of slower schedule. And so we can certainly talk about that, but there is no studied, or researched, or specific slow schedule. It's really just a personal choice of how quickly you want to go, how many you want to do, and how you weigh the risks and the benefits of any given vaccine. And that is how people go about things. And vaccine schedules are different in different countries. They're very different in different countries. So I talked about that at the end of the book a little bit. Most of the countries have a fairly similar schedule. I mean, most of the vaccines are the same, but they're done in a different order.
Not everything that we require here is required elsewhere. So a lot of countries don't require hep A or hep B. A lot of countries don't require chickenpox. A lot of countries have measles, the MMR vaccine, on different schedules, so a lot of them do it later. And actually, our country is one of the few that doesn't do the meningitis shot early. A lot of countries actually do the meningitis shot, the meningococcus, earlier. So there are different schedules out there, but they are fairly similar. But when you're talking about a slow schedule, I think the way that I talk to parents about it is really weighing the risks versus benefits of each vaccine. So, what are you more likely to get? What is more likely to kill you or cause serious harm? What's going on in the community right now, versus, you know, what we know about any specific risks? And that is usually how people go about things. And there's certainly whooping cough all the time. We see it in our communities. Haemophilus influenzae is still around. Is it pertussis? Yeah, so the pertussis part is probably the biggest part of that. I mean, there is still a little bit of tetanus, but that's pretty rare. Diphtheria is extremely rare these days, but we wouldn't want diphtheria to come back. Haemophilus is not common anymore, but certainly very concerning. So that is around still, and that's something that people do tend to choose to do. Polio: obviously nobody wants polio, but we haven't had polio in a very long time in this country, so sometimes people choose to push that one back.
And there's live versus, you know, inactivated virus, right? Yeah. Well, we don't do the live one anymore, just because we realized we were causing more polio with the live virus than actually existed, and we haven't had a case of wild polio in a very long time. So people sometimes push that back. I think it's important. We don't want polio to come back, obviously, and if you're traveling somewhere where there is polio, then that makes a lot of sense, but there isn't a lot of polio in our country, so sometimes people push that back. Rotavirus is a common one that we give to children, but in our country we have IV fluids, so a lot of people do tend to choose to push that one back. Yeah. Rotavirus gives you bad diarrhea. Bad diarrhea. Pneumococcus can cause meningitis and pneumonia, so that's one that people tend to prioritize in that first year. And then the measles, mumps, rubella: certainly the measles part, people prioritize that, and certainly have been prioritizing that a lot more in the last few months. And chickenpox they tend to prioritize less. So that's the way that people think about it. If they want to do the regular schedule, go for it. If they want to do a slow one, they can. Yeah. And a lot of times, people still do all the vaccines, and they still get them in the same general time frame, but they just might come in a little bit more often and do one at a time, and that makes them feel more comfortable. So that's fine by me. Sometimes they'll do it every couple of months.
What is the rationale for giving, like, one at a time, as opposed to just loading them all up in visits? That goes a little bit more on common sense, I would say, because there's no research to show that if you do a slower schedule, or you do one at a time, it's going to decrease your risk. So we don't know. And we don't have science on the current schedule either, right? So I don't know that it decreases your risk at all. But the theory that a parent would have is: I don't want to give my kid four vaccines in a day. There's more toxic load, there's more that the body has to deal with, and therefore it potentially increases their risk of having a reaction. So I feel more comfortable giving one at a time. Instead of doing two vaccines today, I'll do one today, and I'll come back in a month and do the other one. They're still doing them. They're just doing them one at a time. And yes, that means more visits. They're still, at the end of the day, doing all the vaccines. So I don't have the research to say that's better, but we should have that research. We should be studying that. We should be asking: hey, if we space it out, does that decrease your risk of anything? That would be good to know. And the other question is, you know, vaccines are a really big immune irritant, and babies don't have a fully developed immune system until they're over a year old. You know, just like they don't have a developed neurological system. When you look at a baby, their motions are, like, jerky.
01:09:41 They look like they have, like, cerebral 01:09:43 palsy or some such movements, 01:09:45 but that's because their 01:09:46 neurologic system's not developed, and 01:09:47 their immune system's not developed, and 01:09:49 their gut and their oral tolerance aren't 01:09:50 developed. Is there a rationale for 01:09:52 waiting a little longer when a kid's 01:09:55 immune response is sort of underdeveloped, 01:09:57 or do we 01:10:00 still want to give it to little kids and 01:10:01 babies before a year old? There are 01:10:03 two sides to that question. The 01:10:05 first part of that would say, and 01:10:06 this is what the general consensus is: 01:10:08 we want to give the vaccines early 01:10:10 because you get the protection, and you 01:10:12 want to protect before you get the 01:10:13 disease. So that's the general body 01:10:15 of science. The concern around giving 01:10:17 vaccines early is, like you said, the 01:10:19 immune system is developing. The baby's 01:10:21 developing, they're still small, so any 01:10:23 sort of toxic load or any chemicals or 01:10:25 any sort of stimulation is going to most 01:10:27 likely have more of an effect on them than 01:10:28 it would for an adult um or somebody 01:10:31 older. And so that's the concern 01:10:34 there. There certainly are some studies 01:10:36 that say if you delay certain vaccines, 01:10:39 then you decrease risk of asthma, 01:10:40 you decrease risk of eczema. I mean, 01:10:42 these studies exist. They're not the 01:10:43 majority, but they're out there. 01:10:45 Um, and there are some actually 01:10:46 very good studies in good journals that 01:10:47 show this. So, it's not an 01:10:50 unreasonable hypothesis to have. And 01:10:52 again, it goes back to the research we 01:10:53 should be getting. We should be asking: 01:10:56 hey, if we move the hepatitis B vaccine 01:10:58 back, does that decrease your risk of 01:11:00 anything?
Um, I mean, just in basic 01:11:02 common sense, if you give a vaccine to 01:11:04 anybody, the biggest risk you have is 01:11:06 having a fever, right? Having a fever, 01:11:08 not feeling good. If a baby in the first 01:11:10 week gets a fever, what happens? Right? 01:11:12 If you give a hepatitis B vaccine 01:11:14 and they have a normal side effect from 01:11:15 a vaccine of getting a fever, what 01:11:17 happens to a baby? They're going to the 01:11:18 hospital. They're getting a lumbar 01:11:20 puncture. They're getting a full workup. 01:11:21 They're being exposed to all these 01:11:23 potential diseases in the hospital. Um, 01:11:26 maybe they got that fever just from 01:11:29 the reaction to the vaccine, but we 01:11:30 don't know that in a newborn baby. So, 01:11:32 we're going to do the full workup 01:11:33 anyway, because we want to protect them. 01:11:35 But we don't talk about that risk 01:11:37 versus benefit. Do those babies 01:11:38 need that risk? Do they need the risk of going 01:11:40 to the hospital? Do they need the risk of 01:11:41 getting the fever? Could they wait till 2 01:11:43 months old or four months old and 01:11:45 take away that risk? I mean, those 01:11:47 are the questions that smart people need 01:11:48 to have. They need to sit on stage. They 01:11:51 need to debate. They need to go to 01:11:52 Harvard and Hopkins and on TV, in front of 01:11:56 cameras, sit there and debate these 01:11:58 questions. They need to talk about that. 01:11:59 They need to talk about all of these 01:12:01 ethics so people can watch it and say, 01:12:03 "Well, you know, maybe you're 01:12:04 right. Maybe we should rethink our 01:12:06 hepatitis B." Or maybe somebody smarter 01:12:08 than me is going to say, "No, no, no. 01:12:10 Here's why we do it.
Here's what the 01:12:12 reason is, here's why the 01:12:14 benefits outweigh the risks," and I'll 01:12:16 say, "Okay, well, that makes sense." You know, 01:12:17 that's what we need. But we can't 01:12:19 shut down the discussion, which is what's 01:12:21 happened for the last decade. How are we 01:12:23 going to move forward unless we can have 01:12:24 these conversations? And it has to be 01:12:27 someone who's pro versus someone who's 01:12:28 anti, whatever labels you want, somebody 01:12:30 who knows this research and 01:12:33 understands it really well and can 01:12:35 debate it and discuss it publicly, so 01:12:38 that way we can... I don't have any skin 01:12:40 in the game. Whatever the research is, 01:12:42 it's fine by me. I want kids to be the 01:12:44 healthiest. I agree. Me, too. And 01:12:45 I think what I would love 01:12:47 to see is a catalog of the key questions 01:12:49 that need to be answered. Here's what we 01:12:51 know. Here's what we don't know. But 01:12:53 let's fund research that helps us 01:12:55 understand what we don't know and have 01:12:56 better answers and be able to actually 01:12:58 show the population, show doctors and the 01:13:01 healthcare system what we actually should 01:13:03 be doing, because right 01:13:04 now there are so many questions, because 01:13:08 there are a lot of unanswered questions. 01:13:10 But those questions won't even be asked 01:13:11 until the literature is written down and 01:13:14 we're allowed to ask the questions. You 01:13:16 have to be allowed to ask the questions. 01:13:17 And no one's going to allow that to 01:13:18 happen until they realize that there are 01:13:20 holes in the research. And so I think it 01:13:23 starts with putting that all down, 01:13:26 having open discussions about what 01:13:27 exists, getting doctors on board with 01:13:29 the research that actually is present 01:13:31 right now today.
And then we move 01:13:34 forward from there with debate and 01:13:36 discussion on the open topics. And then 01:13:38 people will say, "Hey, okay, well, 01:13:40 hepatitis B is a reasonable discussion 01:13:42 to have. Let's get the 01:13:44 committees together and let's talk 01:13:46 about: okay, here's what we have. Here's 01:13:47 what we don't have. Here's what the 01:13:48 smart people say. Maybe we move 01:13:50 hepatitis B back, or maybe we just make 01:13:52 it a general recommendation for those 01:13:54 that test positive for hepatitis B or 01:13:56 don't know their hepatitis B risk." 01:13:58 Let's start to focus on safety, because 01:14:01 what I think is happening most right now, 01:14:03 and especially when you're talking about 01:14:04 the people that are hesitant, and I hear 01:14:06 this over and over again: they don't feel 01:14:07 like anybody cares about their kids. They 01:14:09 feel like right now everything is about 01:14:12 money and about pushing more medications 01:14:14 and drugs, and nobody actually cares 01:14:16 about safety. And everybody understands 01:14:18 that there is some safety risk with any 01:14:20 medication that we have or any vaccines, 01:14:22 and they'll be okay with some risk, but 01:14:24 they want to know that we're 01:14:26 acknowledging the risk, that we're 01:14:28 studying the risk, that we're figuring 01:14:30 out what that actual risk is, and that 01:14:31 we're moving to make things safer and 01:14:33 better over time. And until those 01:14:36 discussions are had publicly, more and 01:14:38 more people and more and more parents 01:14:40 are going to be hesitant about vaccines. 01:14:42 And that is what we're seeing. The rates 01:14:44 are plummeting. People don't believe in them 01:14:46 anymore. Even doctors: there are some 01:14:48 studies that show that 01:14:50 like 5 to 10% of doctors don't fully 01:14:52 believe that vaccines are safe 01:14:54 anymore. That's a huge crisis. It is.
I 01:14:57 agree. And I think these 01:14:58 questions need to be answered, and I 01:14:59 have to thank you for having the courage 01:15:01 to step out in this field, because I know 01:15:04 for me personally it's been an area that 01:15:05 I often avoid, because all of a sudden 01:15:08 you get labeled as a quack, and all the 01:15:09 rest of the work you want to try to do 01:15:10 in the world is compromised. But all 01:15:13 we're asking for is, you know, 01:15:16 better science so we can have safer 01:15:18 vaccines. You know, Bobby always says 01:15:19 this: "I was calling out the fact 01:15:22 that fish were poisoned with mercury in our 01:15:25 rivers. I'm not anti-fish. I just want 01:15:28 safer fish." You know, if you want safe 01:15:31 planes to fly on, it doesn't mean you're 01:15:33 anti-flying, right? And I think it's the 01:15:35 same thing with vaccines. So, Between a 01:15:36 Shot and a Hard Place is a great book 01:15:38 tackling difficult vaccine questions 01:15:40 with balance, data, and clarity. 01:15:42 Everybody should definitely get a copy 01:15:44 if you're interested in this topic. If 01:15:46 you um have a kid or are thinking of 01:15:48 having a kid or a grandkid, I think this 01:15:50 is such an important conversation, and I 01:15:52 encourage doctors out there listening, I 01:15:54 encourage scientists out there listening, 01:15:55 to ask these questions, to not kind of 01:15:58 lean into the propaganda that the 01:16:00 science is settled, that they're safe and 01:16:02 effective. That's just a nonsense 01:16:03 statement when you think about 01:16:05 everything else in medicine it doesn't 01:16:06 apply to. So Joel, thank you for writing 01:16:09 this book. Thank you for having the courage. 01:16:10 And uh I hope you all love this. Make sure 01:16:12 you check out Joel's work. Uh you can 01:16:14 find him online.
Tell us how to find you, 01:16:17 where we can get more 01:16:18 information about you and your social 01:16:20 media. Yeah, the best places are 01:16:21 probably at Dr. Joel Gator on Instagram 01:16:24 or X, or you can find the book, Between a 01:16:27 Shot and a Hard Place, on Amazon or 01:16:28 anywhere books are sold, or you can go to 01:16:30 theshotbook.com. 01:16:32 You've also written another book, 01:16:33 Parenting at Your Child's Pace: The 01:16:35 Integrative Pediatrician's Guide to the 01:16:37 First Three Years, which came out last year, so 01:16:39 you're prolific. I don't know if you're 01:16:41 going to follow my pace. I hope you're 01:16:42 not. It's almost killed me. But yeah, 01:16:44 the last one was not as controversial, I 01:16:46 guess. But uh this one, I mean, this one I've 01:16:48 had for a while, and 01:16:51 we've been working on it, but this is the 01:16:52 time to get it out there. So I've been 01:16:54 working super hard to get it out, and 01:16:55 that's why it's, you know, kind of 01:16:56 two interrupt. Well, thanks Joel, and thanks 01:16:58 for having the courage and doing this, 01:16:59 and uh we're all going to keep looking 01:17:00 at what you're doing and taking guidance 01:17:02 from your thinking, and hopefully the 01:17:04 people listening, in the right 01:17:05 places, will advance the science so we 01:17:07 can all have better answers to these 01:17:08 difficult questions. Thank you. If you 01:17:11 loved that last video, you're going to
identifier: 3057
model: gemini-2.5-flash-preview-05-20 | input-price: 0.15 | output-price: 3.5 | max-context-length: 128_000
host: 193.8.40.111
https://www.youtube.com/watch?v=W4tqbEmplug&pp=0gcJCbAJAYcqIYzv
include_comments: None
include_timestamps: 1
include_glossary: None
output_language: en
cost: 0.014992349999999998
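The cost figure above can be reproduced from the model's listed prices and the token counts reported at the end of the summary (74,609 input / 1,086 output). A minimal sketch, assuming input-price and output-price are USD per 1M tokens (a convention assumed here, not stated in the metadata):

```python
# Sketch: recompute the "cost" field from the metadata above.
# Assumption: input-price and output-price are USD per 1M tokens.

INPUT_PRICE = 0.15    # $/1M input tokens (gemini-2.5-flash-preview-05-20)
OUTPUT_PRICE = 3.5    # $/1M output tokens

def summary_cost(input_tokens: int, output_tokens: int) -> float:
    """Total USD cost of one summarization call."""
    return (input_tokens * INPUT_PRICE + output_tokens * OUTPUT_PRICE) / 1_000_000

# Token counts reported with the summary: 74,609 in / 1,086 out.
print(f"${summary_cost(74_609, 1_086):.4f}")  # → $0.0150
```

This matches the `cost` line above to floating-point precision (0.01499235).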
*Abstract:* This video features a discussion with visionary thinker Simon Sinek on the profound impact of artificial intelligence and modern technology on human connection, personal growth, and societal well-being. Sinek argues that while AI offers unprecedented efficiency and convenience, it risks devaluing the human journey of struggle, imperfection, and authentic experience that fosters true wisdom, resilience, and meaningful relationships. He critiques a "result-obsessed" society that prioritizes output over the valuable process of learning and self-development. The conversation emphasizes the increasing importance of cultivating essential "human skills" like empathy, conflict resolution, and genuine connection, especially as technology automates traditional knowledge-based work. Sinek advocates for a re-prioritization of human relationships, intentional engagement, and embracing imperfection as the foundation of beauty and growth in an increasingly digital and fast-paced world.

*Exploring the Human Experience in the Age of AI: Insights from Simon Sinek*

* *0:00 - The Human Cost of AI Perfection:* The discussion opens with an anecdote about using AI (ChatGPT) to resolve a relationship conflict, illustrating how "perfect" AI-generated responses lack the authenticity and human imperfection that are vital for genuine connection and personal growth.
* *0:33 - Journey Over Destination:* Simon Sinek articulates his core philosophy: life is about the *journey* and the inherent value of *doing the work* (e.g., writing a book, solving a problem) yourself, as this struggle fosters resourcefulness and intellectual growth, which AI bypasses by providing instant solutions.
* *0:58 - Erosion of Human Skills:* Sinek warns that over-reliance on technology can lead to the "giving up" of crucial human skills (e.g., knowing what to do when a friend is struggling, coping with stress), comparing it to having a boat but not knowing how to swim when a storm hits.
* *4:14 - The Irony of Job Displacement:* An interesting parallel is drawn between the displacement of factory workers by robotics (met with calls to "reskill") and the current threat of AI to knowledge workers, noting the ironic shift in societal concern and the discussion of universal basic income.
* *7:16 - Authenticity vs. Artificiality:* AI-generated content (emails, art) is making digital interactions feel inauthentic, creating a new premium on "human-written language" and the beauty found in imperfection, drawing on the Japanese concept of *Wabi-sabi*.
* *12:55 - Embracing the Value of Struggle:* The conversation reiterates that genuine personal growth, stronger relationships, and increased confidence come from navigating and resolving difficulties, not from avoiding them or seeking perfect, instant solutions.
* *18:22 - Essential Human Skills for the Future:* Sinek stresses that critical human skills such as listening, holding space, conflict resolution, giving/receiving feedback, accountability, and empathy are increasingly vital but are suffering in the modern world.
* *31:08 - Potential for Reclaiming Time and Purpose:* While AI may lead to job shifts, Sinek suggests it could also free up time, allowing individuals to pursue hobbies, deepen relationships, and combat loneliness, arguing that this return to personal time is something society has "been desperate for."
* *35:54 - The Pursuit of "Enough" and Gratitude:* Happiness is not solely tied to money beyond a certain point; true fulfillment often comes from having a vision that exceeds current resources, fostering creativity, and practicing gratitude for what one has.
* *46:12 - The Profound Impact of Gratitude:* Personal experiences (e.g., the LA fires) illustrate how confronting loss and embracing the temporary nature of things can lead to a deeper appreciation and gratitude for life and relationships.
* *55:59 - Scale Breaks Quality:* The speaker argues that efforts to achieve massive scale often compromise quality and authenticity (e.g., mass-produced clothing, Apple's curated store experience), leading to a growing demand for handmade, unique items.
* *1:09:59 - Friction Creates Freedom:* The paradox that technological convenience has reduced the "friction" that once created boundaries between work and personal life, thereby diminishing freedom and increasing stress.
* *1:11:05 - The Rise of In-Real-Life Community:* Sinek predicts a "community revolution," defining community as "a group of people who agree to grow together," emphasizing the increasing value of offline, intentional human connection.
* *1:52:50 - Friendship: The Ultimate Biohack:* Sinek highlights the underserved topic of friendship, arguing that strong friendships are fundamental for navigating life's challenges and are the most potent "biohack" for combating loneliness, depression, and anxiety.
* *2:01:01 - Mentorship and Lasting Legacy:* Sinek shares his commitment to mentoring his team, imparting his knowledge and lessons from past mistakes to build a resilient organization that can thrive independently. He presents a military challenge coin engraved with "Build, Teach, Lead," symbolizing the responsibility to accumulate, give, and serve.

I used gemini-2.5-flash-preview-05-20 (input-price: 0.15, output-price: 3.5, max-context-length: 128_000) on rocketrecap dot com to summarize the transcript. Cost (if I didn't use the free tier): $0.0150. Input tokens: 74609. Output tokens: 1086.
Below, I will provide input for an example video (comprising of title, description, and transcript, in this order) and the corresponding abstract and summary I expect. Afterward, I will provide a new transcript that I want you to summarize in the same format. **Please give an abstract of the transcript and then summarize the transcript in a self-contained bullet list format.** Include starting timestamps, important details and key takeaways. Example Input: Fluidigm Polaris Part 2- illuminator and camera mikeselectricstuff 131K subscribers Subscribed 369 Share Download Clip Save 5,857 views Aug 26, 2024 Fluidigm Polaris part 1 : • Fluidigm Polaris (Part 1) - Biotech g... Ebay listings: https://www.ebay.co.uk/usr/mikeselect... Merch https://mikeselectricstuff.creator-sp... Transcript Follow along using the transcript. Show transcript mikeselectricstuff 131K subscribers Videos About Support on Patreon 40 Comments @robertwatsonbath 6 hours ago Thanks Mike. Ooof! - with the level of bodgery going on around 15:48 I think shame would have made me do a board re spin, out of my own pocket if I had to. 1 Reply @Muonium1 9 hours ago The green LED looks different from the others and uses phosphor conversion because of the "green gap" problem where green InGaN emitters suffer efficiency droop at high currents. Phosphide based emitters don't start becoming efficient until around 600nm so also can't be used for high power green emitters. See the paper and plot by Matthias Auf der Maur in his 2015 paper on alloy fluctuations in InGaN as the cause of reduced external quantum efficiency at longer (green) wavelengths. 4 Reply 1 reply @tafsirnahian669 10 hours ago (edited) Can this be used as an astrophotography camera? 
Reply mikeselectricstuff · 1 reply @mikeselectricstuff 6 hours ago Yes, but may need a shutter to avoid light during readout Reply @2010craggy 11 hours ago Narrowband filters we use in Astronomy (Astrophotography) are sided- they work best passing light in one direction so I guess the arrows on the filter frames indicate which way round to install them in the filter wheel. 1 Reply @vitukz 12 hours ago A mate with Channel @extractions&ire could use it 2 Reply @RobertGallop 19 hours ago That LED module says it can go up to 28 amps!!! 21 amps for 100%. You should see what it does at 20 amps! Reply @Prophes0r 19 hours ago I had an "Oh SHIT!" moment when I realized that the weird trapezoidal shape of that light guide was for keystone correction of the light source. Very clever. 6 Reply @OneBiOzZ 20 hours ago given the cost of the CCD you think they could have run another PCB for it 9 Reply @tekvax01 21 hours ago $20 thousand dollars per minute of run time! 1 Reply @tekvax01 22 hours ago "We spared no expense!" John Hammond Jurassic Park. *(that's why this thing costs the same as a 50-seat Greyhound Bus coach!) Reply @florianf4257 22 hours ago The smearing on the image could be due to the fact that you don't use a shutter, so you see brighter stripes under bright areas of the image as you still iluminate these pixels while the sensor data ist shifted out towards the top. I experienced this effect back at university with a LN-Cooled CCD for Spectroscopy. The stripes disapeared as soon as you used the shutter instead of disabling it in the open position (but fokussing at 100ms integration time and continuous readout with a focal plane shutter isn't much fun). 12 Reply mikeselectricstuff · 1 reply @mikeselectricstuff 12 hours ago I didn't think of that, but makes sense 2 Reply @douro20 22 hours ago (edited) The red LED reminds me of one from Roithner Lasertechnik. I have a Symbol 2D scanner which uses two very bright LEDs from that company, one red and one red-orange. 
The red-orange is behind a lens which focuses it into an extremely narrow beam. 1 Reply @RicoElectrico 23 hours ago PFG is Pulse Flush Gate according to the datasheet. Reply @dcallan812 23 hours ago Very interesting. 2x Reply @littleboot_ 1 day ago Cool interesting device Reply @dav1dbone 1 day ago I've stripped large projectors, looks similar, wonder if some of those castings are a magnesium alloy? Reply @kevywevvy8833 1 day ago ironic that some of those Phlatlight modules are used in some of the cheapest disco lights. 1 Reply 1 reply @bill6255 1 day ago Great vid - gets right into subject in title, its packed with information, wraps up quickly. Should get a YT award! imho 3 Reply @JAKOB1977 1 day ago (edited) The whole sensor module incl. a 5 grand 50mpix sensor for 49 £.. highest bid atm Though also a limited CCD sensor, but for the right buyer its a steal at these relative low sums. Architecture Full Frame CCD (Square Pixels) Total Number of Pixels 8304 (H) × 6220 (V) = 51.6 Mp Number of Effective Pixels 8208 (H) × 6164 (V) = 50.5 Mp Number of Active Pixels 8176 (H) × 6132 (V) = 50.1 Mp Pixel Size 6.0 m (H) × 6.0 m (V) Active Image Size 49.1 mm (H) × 36.8 mm (V) 61.3 mm (Diagonal), 645 1.1x Optical Format Aspect Ratio 4:3 Horizontal Outputs 4 Saturation Signal 40.3 ke− Output Sensitivity 31 V/e− Quantum Efficiency KAF−50100−CAA KAF−50100−AAA KAF−50100−ABA (with Lens) 22%, 22%, 16% (Peak R, G, B) 25% 62% Read Noise (f = 18 MHz) 12.5 e− Dark Signal (T = 60°C) 42 pA/cm2 Dark Current Doubling Temperature 5.7°C Dynamic Range (f = 18 MHz) 70.2 dB Estimated Linear Dynamic Range (f = 18 MHz) 69.3 dB Charge Transfer Efficiency Horizontal Vertical 0.999995 0.999999 Blooming Protection (4 ms Exposure Time) 800X Saturation Exposure Maximum Date Rate 18 MHz Package Ceramic PGA Cover Glass MAR Coated, 2 Sides or Clear Glass Features • TRUESENSE Transparent Gate Electrode for High Sensitivity • Ultra-High Resolution • Board Dynamic Range • Low Noise Architecture • Large 
Active Imaging Area Applications • Digitization • Mapping/Aerial • Photography • Scientific Thx for the tear down Mike, always a joy Reply @martinalooksatthings 1 day ago 15:49 that is some great bodging on of caps, they really didn't want to respin that PCB huh 8 Reply @RhythmGamer 1 day ago Was depressed today and then a new mike video dropped and now I’m genuinely happy to get my tear down fix 1 Reply @dine9093 1 day ago (edited) Did you transfrom into Mr Blobby for a moment there? 2 Reply @NickNorton 1 day ago Thanks Mike. Your videos are always interesting. 5 Reply @KeritechElectronics 1 day ago Heavy optics indeed... Spare no expense, cost no object. Splendid build quality. The CCD is a thing of beauty! 1 Reply @YSoreil 1 day ago The pricing on that sensor is about right, I looked in to these many years ago when they were still in production since it's the only large sensor you could actually buy. Really cool to see one in the wild. 2 Reply @snik2pl 1 day ago That leds look like from led projector Reply @vincei4252 1 day ago TDI = Time Domain Integration ? 1 Reply @wolpumba4099 1 day ago (edited) Maybe the camera should not be illuminated during readout. From the datasheet of the sensor (Onsemi): saturation 40300 electrons, read noise 12.5 electrons per pixel @ 18MHz (quite bad). quantum efficiency 62% (if it has micro lenses), frame rate 1 Hz. lateral overflow drain to prevent blooming protects against 800x (factor increases linearly with exposure time) saturation exposure (32e6 electrons per pixel at 4ms exposure time), microlens has +/- 20 degree acceptance angle i guess it would be good for astrophotography 4 Reply @txm100 1 day ago (edited) Babe wake up a new mikeselectricstuff has dropped! 9 Reply @vincei4252 1 day ago That looks like a finger-lakes filter wheel, however, for astronomy they'd never use such a large stepper. 1 Reply @MRooodddvvv 1 day ago yaaaaay ! more overcomplicated optical stuff ! 4 Reply 1 reply @NoPegs 1 day ago He lives! 
11 Reply 1 reply Transcript 0:00 so I've stripped all the bits of the 0:01 optical system so basically we've got 0:03 the uh the camera 0:05 itself which is mounted on this uh very 0:09 complex 0:10 adjustment thing which obviously to set 0:13 you the various tilt and uh alignment 0:15 stuff then there's two of these massive 0:18 lenses I've taken one of these apart I 0:20 think there's something like about eight 0:22 or nine Optical elements in here these 0:25 don't seem to do a great deal in terms 0:26 of electr magnification they're obiously 0:28 just about getting the image to where it 0:29 uh where it needs to be just so that 0:33 goes like that then this Optical block I 0:36 originally thought this was made of some 0:37 s crazy heavy material but it's just 0:39 really the sum of all these Optical bits 0:41 are just ridiculously heavy those lenses 0:43 are about 4 kilos each and then there's 0:45 this very heavy very solid um piece that 0:47 goes in the middle and this is so this 0:49 is the filter wheel assembly with a 0:51 hilariously oversized steper 0:53 motor driving this wheel with these very 0:57 large narrow band filters so we've got 1:00 various different shades of uh 1:03 filters there five Al together that 1:06 one's actually just showing up a silver 1:07 that's actually a a red but fairly low 1:10 transmission orangey red blue green 1:15 there's an excess cover on this side so 1:16 the filters can be accessed and changed 1:19 without taking anything else apart even 1:21 this is like ridiculous it's like solid 1:23 aluminium this is just basically a cover 1:25 the actual wavelengths of these are um 1:27 488 525 570 630 and 700 NM not sure what 1:32 the suffix on that perhaps that's the uh 1:34 the width of the spectral line say these 1:37 are very narrow band filters most of 1:39 them are you very little light through 1:41 so it's still very tight narrow band to 1:43 match the um fluoresence of the dies 1:45 they're using in the biochemical process 
1:48 and obviously to reject the light that's 1:49 being fired at it from that Illuminator 1:51 box and then there's a there's a second 1:53 one of these lenses then the actual sort 1:55 of samples below that so uh very serious 1:58 amount of very uh chunky heavy Optics 2:01 okay let's take a look at this light 2:02 source made by company Lumen Dynamics 2:04 who are now part of 2:06 excelitas self-contained unit power 2:08 connector USB and this which one of the 2:11 Cable Bundle said was a TTL interface 2:14 USB wasn't used in uh the fluid 2:17 application output here and I think this 2:19 is an input for um light feedback I 2:21 don't if it's regulated or just a measur 2:23 measurement facility and the uh fiber 2:27 assembly 2:29 Square Inlet there and then there's two 2:32 outputs which have uh lens assemblies 2:35 and this small one which goes back into 2:37 that small Port just Loops out of here 2:40 straight back in So on this side we've 2:42 got the electronics which look pretty 2:44 straightforward we've got a bit of power 2:45 supply stuff over here and we've got 2:48 separate drivers for each wavelength now 2:50 interesting this is clearly been very 2:52 specifically made for this application 2:54 you I was half expecting like say some 2:56 generic drivers that could be used for a 2:58 number of different things but actually 3:00 literally specified the exact wavelength 3:02 on the PCB there is provision here for 3:04 385 NM which isn't populated but this is 3:07 clearly been designed very specifically 3:09 so these four drivers look the same but 3:10 then there's two higher power ones for 3:12 575 and 3:14 520 a slightly bigger heat sink on this 3:16 575 section there a p 24 which is 3:20 providing USB interface USB isolator the 3:23 USB interface just presents as a comport 3:26 I did have a quick look but I didn't 3:27 actually get anything sensible um I did 3:29 dump the Pi code out and there's a few 3:31 you a few sort of commands that you 3:32 could 
see in text but I didn't actually 3:34 manage to get it working properly I 3:36 found some software for related version 3:38 but it didn't seem to want to talk to it 3:39 but um I say that wasn't used for the 3:41 original application it might be quite 3:42 interesting to get try and get the Run 3:44 hours count out of it and the TTL 3:46 interface looks fairly straightforward 3:48 we've got positions for six opto 3:50 isolators but only five five are 3:52 installed so that corresponds with the 3:54 unused thing so I think this hopefully 3:56 should be as simple as just providing a 3:57 ttrl signal for each color to uh enable 4:00 it a big heat sink here which is there I 4:03 think there's like a big S of metal 4:04 plate through the middle of this that 4:05 all the leads are mounted on the other 4:07 side so this is heat sinking it with a 4:09 air flow from a uh just a fan in here 4:13 obviously don't have the air flow 4:14 anywhere near the Optics so conduction 4:17 cool through to this plate that's then 4:18 uh air cooled got some pots which are 4:21 presumably power 4:22 adjustments okay let's take a look at 4:24 the other side which is uh much more 4:27 interesting see we've got some uh very 4:31 uh neatly Twisted cable assemblies there 4:35 a bunch of leads so we've got one here 4:37 475 up here 430 NM 630 575 and 520 4:44 filters and dcro mirrors a quick way to 4:48 see what's white is if we just shine 4:49 some white light through 4:51 here not sure how it is is to see on the 4:54 camera but shining white light we do 4:55 actually get a bit of red a bit of blue 4:57 some yellow here so the obstacle path 5:00 575 it goes sort of here bounces off 5:03 this mirror and goes out the 520 goes 5:07 sort of down here across here and up 5:09 there 630 goes basically straight 5:13 through 5:15 430 goes across there down there along 5:17 there and the 475 goes down here and 5:20 left this is the light sensing thing 5:22 think here there's just a um I think 5:24 there 
a photo diode or other sensor 5:26 haven't actually taken that off and 5:28 everything's fixed down to this chunk of 5:31 aluminium which acts as the heat 5:32 spreader that then conducts the heat to 5:33 the back side for the heat 5:35 sink and the actual lead packages all 5:38 look fairly similar except for this one 5:41 on the 575 which looks quite a bit more 5:44 substantial big spay 5:46 Terminals and the interface for this 5:48 turned out to be extremely simple it's 5:50 literally a 5V TTL level to enable each 5:54 color doesn't seem to be any tensity 5:56 control but there are some additional 5:58 pins on that connector that weren't used 5:59 in the through time thing so maybe 6:01 there's some extra lines that control 6:02 that I couldn't find any data on this uh 6:05 unit and the um their current product 6:07 range is quite significantly different 6:09 so we've got the uh blue these 6:13 might may well be saturating the camera 6:16 so they might look a bit weird so that's 6:17 the 430 6:18 blue the 575 6:24 yellow uh 6:26 475 light blue 6:29 the uh 520 6:31 green and the uh 630 red now one 6:36 interesting thing I noticed for the 6:39 575 it's actually it's actually using a 6:42 white lead and then filtering it rather 6:44 than using all the other ones are using 6:46 leads which are the fundamental colors 6:47 but uh this is actually doing white and 6:50 it's a combination of this filter and 6:52 the dichroic mirrors that are turning to 6:55 Yellow if we take the filter out and a 6:57 lot of the a lot of the um blue content 7:00 is going this way the red is going 7:02 straight through these two mirrors so 7:05 this is clearly not reflecting much of 7:08 that so we end up with the yellow coming 7:10 out of uh out of there which is a fairly 7:14 light yellow color which you don't 7:16 really see from high intensity leads so 7:19 that's clearly why they've used the 7:20 white to uh do this power consumption of 7:23 the white is pretty high so going up to 7:25 
about 2 and a half amps on that color 7:27 whereas most of the other colors are 7:28 only drawing half an amp or so at 24 7:30 volts the uh the green is up to about 7:32 1.2 but as I say this thing is much 7:35 brighter and if you actually run all the 7:38 colors at the same time you get a fairly 7:41 reasonable looking white coming out 7:43 of it and one thing you might just be 7:45 able to notice is there is some sort of 7:46 color banding around here it's not 7:49 getting everything completely 7:51 concentric and I think that's where this 7:53 fiber optic thing comes 7:58 in it'll 8:00 give a couple of fairly accurately shaped 8:04 very sort of uniform color outputs and looking 8:06 at what's um inside here we've basically 8:09 just got this square rod so this is 8:12 clearly yeah the light's just bouncing 8:13 off all the various sides to um 8:16 get a nice uniform illumination uh this 8:19 back bit looks like it's all potted so 8:21 nothing I can really do to get in there 8:24 I think this is fiber so I have come 8:26 across um cables like this which are 8:27 liquid filled but just looking through the 8:30 end of this it's probably a bit hard to 8:31 see it does look like there are fiber ends 8:34 going on there and so there's this 8:36 feedback thing which is just obviously 8:39 compensating for any light losses 8:41 through here to get an accurate 8:43 representation of the light that's 8:45 been launched out of these two 8:47 fibers and you see uh 8:49 these have got this sort of trapezium 8:54 shaped light guides again it's like a 8:56 sort of acrylic or glass light guide 9:00 I guess projected just to make the right 9:03 rectangular 9:04 shape and looking at this center assembly 9:07 um the light output doesn't change 9:10 whether you feed this in or not so it's 9:11 clearly not doing any internal closed loop 9:14 control obviously there may well be some 9:16 facility for it to do that but it's not 9:17 being used in this 9:19 application and so this output
just 9:21 produces a voltage on the uh output 9:24 connector proportional to the amount of 9:26 light that's present so there's a little 9:28 diffuser in the back there 9:30 and then there's just some kind of uh 9:33 optical sensor looks like a 9:35 chip looking at the LED it's a very 9:37 small package on the PCB with this lens 9:40 assembly over the top and these look 9:43 like they're actually on a copper 9:44 metalized PCB for maximum thermal 9:47 performance and yeah it's a very small 9:49 package looks like it's a ceramic 9:51 package and there's a thermistor there 9:53 for temperature monitoring this is the 9:56 475 blue one this is the 520 nanometer 9:59 green which is rather different obviously 10:02 it's a much bigger die with lots of bond 10:04 wires but also this looks like it's using 10:05 a phosphor if I shine a blue light at it 10:08 it lights up green so this is actually a 10:10 phosphor conversion green LED which 10:12 I've come across before they want 10:15 that specific wavelength so it may be 10:17 easier to tune a phosphor than tune the 10:20 um semiconductor material to get the 10:23 right wavelength from the LED 10:24 directly uh red 630 similar size to the 10:28 blue one it does seem to have a uh a 10:31 lens on top of it there is a sort of red 10:33 coloring to 10:35 the die but that doesn't appear to be 10:38 fluorescent as far as I can 10:39 tell and the white one again a little 10:41 bit different sort of much higher 10:43 current 10:46 connectors a maker name on that 10:48 connector flot light not sure if that's 10:52 the connector or the LED 10:54 itself and obviously with the phosphor 10:56 I'd imagine that phosphor may well 10:58 be tuned to get the maximum at the uh 575 11:01 nm and actually this white one looks 11:04 like a fairly standard product I just 11:06 found it in Mouser made by Luminus 11:09 Devices in fact actually I think all 11:11 these are based on various Luminus 11:13 Devices modules and you
take 11:17 it looks like they're taking the nearest 11:18 wavelength and then just using these 11:19 filters to clean it up to get a precise 11:22 uh spectral line out of it so quite a 11:25 nice neat and um extremely 11:30 bright light source not sure I've got any 11:33 particular use for it so I think this 11:35 might end up on 11:36 eBay but uh very pretty to look at and 11:40 without the uh risk of burning your eyes 11:43 out like you do with lasers so I thought 11:45 it would be interesting to try and 11:46 figure out the runtime of this things 11:48 like this usually keep some sort of 11:49 record of runtime cuz LEDs degrade over 11:51 time I couldn't get any software to work 11:52 through the USB interface but then had a 11:54 thought it's probably going to be writing the 11:55 runtime periodically to the EEPROM so 11:58 I just scoped that up and noticed it 12:00 was doing a write every 5 minutes so I 12:02 just ran it for a while periodically 12:04 reading the EEPROM I just held the PIC 12:05 in reset and um put a clip over to read 12:07 the EEPROM and found it was writing 12:10 one location per color every 5 minutes 12:12 so if one color was on it would write 12:14 that location every 5 minutes and just 12:16 increment it by one so after doing a few 12:18 tests with different colors over different 12:19 time periods it looked extremely 12:21 straightforward it's like a four byte 12:22 count for each color looking at the 12:24 original data that was in it all the 12:26 colors apart from green were reading 12:28 zero and the green was reading four 12:30 indicating a total 20 minutes run time 12:32 ever if it was turned on run for a short 12:34 time then turned off that might not have 12:36 been counted but even so indicates this 12:37 thing wasn't used a great deal the whole 12:40 process of doing a run can be several 12:42 hours but it'll only be doing probably 12:43 the imaging at the end of that so you 12:46 wouldn't expect it to be running for a long 12:47 time
but say a single color for 20 12:50 minutes over its whole lifetime does 12:52 seem a little bit on the low side okay 12:55 let's look at the camera unfortunately 12:57 I managed to not record any sound when I 12:58 did this it's also a couple of months 13:00 ago so there's going to be a few details 13:02 that I've forgotten so I'm just going to 13:04 dub this over the original footage so um 13:07 take the lid off see this massive great 13:10 heat sink so this is a Peltier cooled camera 13:12 we've got this blower fan producing a 13:14 fair amount of air flow through 13:16 it the connector here there's the CCD 13:19 mounted on the board on the 13:24 right this unplugs so we've got a bit of 13:27 power supply stuff on here 13:29 USB interface I think that's the Cypress 13:32 microcontroller high speed USB 13:34 interface there's a Xilinx Spartan FPGA some 13:40 RAM and there's a couple of A to D 13:42 converters can't quite read what 13:45 those are but Analog 13:47 Devices um little bit of bodgery around 13:51 here extra decoupling obviously they 13:53 were having some noise issues this is 13:55 around the RAM chip quite a lot of extra 13:57 capacitors been added there 13:59 uh there's a couple of amplifiers prior 14:01 to the A to D converter buffers and/or 14:05 amplifiers taking the CCD 14:08 signal um bit more power supply stuff here 14:11 this is probably all to do with 14:12 generating the various CCD bias voltages 14:14 they uh need quite a lot of exotic 14:18 voltages next board down is just a 14:20 shield and an interconnect 14:24 board probably shielding the power supply stuff 14:26 from some of the more sensitive analog 14:28 stuff 14:31 and this is the bottom board which is 14:32 just all power supply 14:34 stuff as you can see tons of capacitors 14:37 and a transformer in 14:42 there and this is the CCD which is a uh 14:47 very impressive thing this is a KAF-50100 14:50 originally by TrueSense then Kodak 14:53 now ON 14:54 Semiconductor it's 50 megapixels uh the 14:58 only
price I could find was this one 15:00 5,000 bucks and the architecture you can 15:03 see there's actually two separate halves 15:04 which explains the dual A to D converters 15:06 and two amplifiers it's literally split 15:08 down the middle and duplicated so it's 15:10 outputting two streams in parallel just 15:13 to keep the bandwidth sensible and it's 15:15 got these amazing um diffraction effects 15:18 it's got micro lenses over the pixels so 15:20 there's a bit more optics going 15:22 on than on a normal 15:25 sensor few more bodges on the CCD board 15:28 including this wire which isn't really 15:29 tacked down very well which is a 15:32 bit of a mess quite a few bits around 15:34 this board where they've uh tacked 15:36 various bits on which is not super 15:38 impressive looks like CCD drivers on the 15:40 left with those 3 ohm um damping 15:43 resistors on the 15:47 output there are a few more little bodges 15:50 around here some of 15:52 them and there's the 15:54 silica gel to keep the moisture down but 15:56 there's this separator that actually 15:58 appears to be cut from a piece of 15:59 antistatic 16:04 bag and this sort of thermal block on 16:06 top of this stack of three Peltier cooler 16:12 modules so as with any stacks they get 16:16 um larger as they go back towards the 16:18 heat sink because each Peltier's got to not 16:20 only take the heat from the previous one but 16:21 also its waste heat which is quite 16:27 significant you see a little temperature 16:29 sensor here that copper block which 16:32 makes contact with the back of the 16:37 CCD and this is the back of the 16:40 Peltiers this then contacts the heat sink 16:44 on the uh rear there are a few thermal pads 16:46 as well for some of the other power 16:47 components on this 16:51 PCB okay I've connected this uh camera 16:54 up I found some drivers on the disc that 16:56 seem to work under Windows 7 couldn't 16:58 get them to install under Windows 11 though 17:01 um in the absence of any sort of lens
or 17:03 being bothered to do a proper mount I've 17:04 just put some foil over it and put a little 17:06 pin in there to make a pinhole lens and 17:08 the software gives a few options I'm not 17:11 entirely sure what all these are there's 17:12 obviously a clock frequency 22 MHz low 17:15 gain and with PFG no idea what that is 17:19 programmable 17:20 something gain perhaps various exposure 17:23 types I think focus is just like a 17:25 continuous grab until you tell it to 17:27 stop not entirely sure what all these options 17:30 are obviously exposure time uh triggers 17:33 there's an external hardware trigger input 17:35 or you just trigger using a um thing on 17:37 screen so the resolution is 8176 by 17:40 6132 and you can actually bin those 17:42 where you combine multiple pixels to get 17:46 increased gain at the expense of lower 17:48 resolution this is a 10 sec exposure 17:51 obviously of the pinhole it's very uh 17:53 insensitive so we just stand still now 17:56 downloading it there's the uh exposure 17:59 so when it's 18:01 um there's a little status thing down 18:03 here so that tells you the um exposure 18:07 18:09 time and this is just it 18:15 downloading um I'm seeing 18:18 quite a lot of smearing I 18:20 don't know whether that's just due to 18:21 pixels overloading or something else I 18:24 mean yeah it's not um out of 18:26 the question that there's something not 18:27 totally right about this camera 18:28 there certainly were bodge wires on there um 18:31 I'd imagine a camera like this 18:32 has got a fairly narrow range of 18:34 intensities that it's happy with I'm not 18:36 going to spend a great deal of time on 18:38 this if you're interested in this camera 18:40 maybe for astronomy or something and 18:42 happy to sort of take the risk that it may 18:44 not be uh perfect I think I'll 18:47 stick this on eBay along with the 18:48 illuminator I'll put a link down in the 18:50
description to the listing take your 18:52 chances to grab a bargain so for example 18:54 here we see this vertical streaking so 18:56 I'm not sure how normal that is this is 18:58 on a fairly bright scene looking out the 19:02 window if I cut the exposure time down 19:04 on that it's now 1 second 19:07 exposure again most of the image 19:09 disappears again this looks like it's 19:11 possibly still overloading here let's 19:14 go down to say a quarter of a 19:16 second so again I think there might be 19:19 some auto gain control going on here um 19:21 this is with the PFG option let's try 19:23 turning that off and see what 19:25 happens so I'm not sure if this is actually 19:27 more streaking or whether it's just 19:29 cranked up the gain or the display 19:31 gray scale to show um you know the 19:33 range of things that it's captured 19:36 one of the odd things in the 19:38 software is um you 19:40 can't seem to read out the temperature 19:42 of the Peltier cooler but you can set the 19:44 temperature and if you set it to a 19:46 different temperature you see the power 19:48 consumption jump up running the cooler 19:50 to get the temperature you requested but 19:52 I can't see anything anywhere that tells 19:54 you whether the cooler is at the 19:56 temperature other than the power 19:57 consumption going down and there's no 19:59 temperature read out 20:03 here and yeah this is just 20:05 sort of very basic software I'm sure 20:07 there's like an API for more 20:09 sophisticated 20:10 applications so if you know anything 20:12 more about these cameras please um stick 20:14 it in the 20:15 comments um incidentally when I was 20:18 editing I noticed there was a bent 20:19 pin on the um CCD but I did fix that 20:22 before doing these tests and also 20:24 reactivated the um silica gel desiccant 20:26 cuz I noticed I was getting a 20:28 bit of condensation on the window but um 20:31 yeah so a couple
of uh interesting but 20:34 maybe not particularly uh useful pieces 20:37 of kit except for someone that's got a 20:38 very specific use so um 20:42 I'll stick these on eBay put a link in 20:44 the description and say hopefully 20:45 someone could actually make some uh good 20:47 use of these things

Example Output:

**Abstract:** This video presents Part 2 of a teardown focusing on the optical components of a Fluidigm Polaris biotechnology instrument, specifically the multi-wavelength illuminator and the high-resolution CCD camera. The Lumen Dynamics illuminator unit is examined in detail, revealing its construction using multiple high-power LEDs (430nm, 475nm, 520nm, 575nm, 630nm) combined via dichroic mirrors and filters. A square fiber optic rod is used to homogenize the light. A notable finding is the use of a phosphor-converted white LED filtered to achieve the 575nm output. The unit features simple TTL activation for each color, conduction cooling, and internal homogenization optics. Analysis of its EEPROM suggests extremely low operational runtime. The camera module teardown showcases a 50 Megapixel ON Semiconductor KAF-50100 CCD sensor with micro-lenses, cooled by a multi-stage Peltier stack. The control electronics include an FPGA and a USB interface. Significant post-manufacturing modifications ("bodges") are observed on the camera's circuit boards. Basic functional testing using vendor software and a pinhole lens confirms image capture but reveals prominent vertical streaking artifacts, the cause of which remains uncertain (potential overload, readout artifact, or fault).

**Exploring the Fluidigm Polaris: A Detailed Look at its High-End Optics and Camera System**

* **0:00 High-End Optics:** The system utilizes heavy, high-quality lenses and mirrors for precise imaging, weighing around 4 kilos each.
* **0:49 Narrow Band Filters:** A filter wheel with five narrow band filters (488, 525, 570, 630, and 700 nm) ensures accurate fluorescence detection and rejection of excitation light.
* **2:01 Customizable Illumination:** The Lumen Dynamics light source offers five individually controllable LED wavelengths (430, 475, 520, 575, 630 nm) with varying power outputs. The 575nm yellow LED is uniquely achieved using a white LED with filtering.
* **3:45 TTL Control:** The light source is controlled via a simple TTL interface, enabling easy on/off switching for each LED color.
* **12:55 Sophisticated Camera:** The system includes a 50-megapixel Kodak KAF-50100 CCD camera with a Peltier cooling system for reduced noise.
* **14:54 High-Speed Data Transfer:** The camera features dual analog-to-digital converters to manage the high data throughput of the 50-megapixel sensor, which is effectively two 25-megapixel sensors operating in parallel.
* **18:11 Possible Issues:** The video creator noted some potential issues with the camera, including image smearing.
* **18:11 Limited Dynamic Range:** The camera's sensor has a limited dynamic range, making it potentially challenging to capture scenes with a wide range of brightness levels.
* **11:45 Low Runtime:** Internal data suggests the system has seen minimal usage, with only 20 minutes of recorded runtime for the green LED.
* **20:38 Availability on eBay:** Both the illuminator and camera are expected to be listed for sale on eBay.

Here is the real transcript. Please summarize it:

00:00:01 Let's say you have a fight with your 00:00:01 girlfriend. You want to do the right 00:00:02 thing. So you go to ChatGPT and you're 00:00:04 like, "This is exactly what happened. 00:00:05 Tell me what to do." And you go, "Babe, 00:00:07 I just want you to know I want to take 00:00:08 full accountability and I care about 00:00:09 this relationship." And she says, "Did 00:00:11 you get this answer from ChatGPT?" And 00:00:12 you go, "I did."
How's that going to go 00:00:15 now? You did everything right, but what 00:00:17 makes people beautiful is not that we 00:00:19 get everything right. It's that we get 00:00:21 many things wrong. And I think in the 00:00:22 modern world we live in, we forget that. 00:00:24 Simon Sinek is back. He's the 00:00:27 visionary thinker inspiring millions to 00:00:29 cultivate human connection, find their 00:00:30 purpose, and overcome any modern-day 00:00:33 challenges. So with AI, do you think it 00:00:35 really is cause for concern and deeper 00:00:37 thought? So I'm not in the AI business, 00:00:38 but I am in the humanity business. And 00:00:40 here's the problem that we never talk 00:00:42 about here. People keep telling us life 00:00:43 is not about the destination. Life is 00:00:45 about the journey. But when we think 00:00:47 about AI, we only think about the 00:00:48 destination. And its remarkable ability 00:00:50 to write the book, paint the painting, 00:00:51 solve the problem. But we forget the 00:00:53 importance of doing the work yourself. 00:00:55 And I think in our modern day and age, 00:00:56 we have underrepresented the value of 00:00:58 struggle. I am smarter, better at 00:01:00 problem solving, more resourceful, not 00:01:02 because a book exists with my ideas in 00:01:04 it, but because I wrote it. That 00:01:06 excruciating journey is what made me 00:01:08 grow. But it's the same for love, 00:01:10 friendships, conflict. And I think that 00:01:12 we forget that we give up certain skills 00:01:14 or abilities because of technology. But 00:01:16 it's like saying AI will provide boats 00:01:18 for everyone except for the time there's 00:01:20 a storm and you don't know how to swim. 00:01:23 And unless we take personal 00:01:25 accountability to teach and learn human 00:01:27 skills, they will disappear.
Sure, you 00:01:28 can have an AI friend and trained like 00:01:30 the best psychologist to affirm you, the 00:01:32 best listening skills that exist, but my 00:01:34 ability to know what to do when my 00:01:35 friend is struggling, my ability to 00:01:37 function in the world or my ability to 00:01:38 cope with stress. These very very human 00:01:40 skills are suffering. So what are the 00:01:43 other skills that we need to equip 00:01:44 ourselves with based on the way that the 00:01:46 world is heading? Two things. One 00:01:49 is this has always blown my mind a 00:01:51 little bit. 53% of you that listen to 00:01:54 the show regularly haven't yet 00:01:55 subscribed to the show. So, could I ask 00:01:57 you for a favor before we start? If you 00:01:59 like the show and you like what we do 00:02:00 here and you want to support us, the 00:02:02 free simple way that you can do just 00:02:03 that is by hitting the subscribe button. 00:02:05 And my commitment to you is if you do 00:02:07 that, then I'll do everything in my 00:02:08 power, me and my team, to make sure that 00:02:10 this show is better for you every single 00:02:12 week. We'll listen to your feedback. 00:02:13 We'll find the guests that you want me 00:02:15 to speak to and we'll continue to do 00:02:16 what we do. Thank you so much. 00:02:23 Simon, 00:02:23 good to see you again. 00:02:27 It's just familiar. 00:02:30 Familiar. It's so interesting because 00:02:31 when I sit down to talk to you, you're 00:02:33 one of the very few people that I don't 00:02:34 come with a preconception as to what 00:02:36 we're going to talk about. But but I I 00:02:38 come with a feeling and the feeling that 00:02:41 I bring forth is the feeling of change 00:02:44 and transition. I have lived for 32 00:02:47 years, but I don't think I can ever 00:02:49 think of a time where the future has 00:02:52 felt unclear, uncertain, scary, 00:02:56 exciting, 00:03:03 and I guess unknown. 
And I don't just 00:03:03 mean with with technology, but 00:03:05 technology is one protagonist in the 00:03:07 story, and there's many other social 00:03:08 stories playing out from politics to 00:03:11 relationships to all of these things. So 00:03:13 my first question to you Simon is what 00:03:15 are those things? What are the biggest 00:03:17 forces of changes that you see happening 00:03:19 at the moment in all of our lives that 00:03:21 you think we should probably talk about 00:03:23 today? That is a big question and I 00:03:27 think one of the mistakes we make and 00:03:30 this is in general is we like things to 00:03:33 be very neatly organized. We like them 00:03:35 to be black and white. Yes or no, right 00:03:37 or wrong. And as you know the world is 00:03:41 messier than that. It is more nuanced 00:03:43 than that. And nothing operates in a 00:03:45 vacuum. Everything is connected to 00:03:46 everything. Especially in a in a world 00:03:49 that's filled with this rising 00:03:52 technology called the internet and this 00:03:53 burgeoning technology called the social 00:03:55 media and desires and feeling like 00:03:58 belonging become more and more 00:04:00 important. We're struggling to find 00:04:01 them. Loneliness epidemic, stress 00:04:03 epidemic, suicide epidemics. These are 00:04:05 all feelings of disconnection, lack of 00:04:06 control, and loneliness. And so that 00:04:09 only exaggerates our feeling of 00:04:10 loneliness and despair. 
and wanting more 00:04:12 and etc etc etc and then you add in AI 00:04:15 and now those feelings of insecurity are 00:04:17 just exaggerated like crazy right so now 00:04:21 I find AI there's an irony to AI right 00:04:24 so if you go back to the 70s and 80s 00:04:27 right you had the rise of robotics and 00:04:29 so robots are now coming into our 00:04:31 factories and we're able to cut 00:04:35 employees by dramatic amounts and we put 00:04:39 people out on the street who, 00:04:41 they worked in a factory, their 00:04:43 father worked in a factory, their 00:04:44 father's father worked in a factory. 00:04:45 Like this is what they know. And 00:04:47 they say, "But these robots are 00:04:48 changing. They're taking our jobs." And 00:04:52 the ruling classes and the Wall 00:04:54 Street classes and the CEO classes, they 00:04:56 go, "Yeah, I know, technology. You're 00:04:59 going to have to find a new skill. 00:05:00 Reskill. Reskill. That's what you have to 00:05:02 do, reskill." Okay. Flash forward to 00:05:03 AI. Here's where the irony comes in 00:05:05 because the world is, nature abhors a 00:05:07 vacuum and life seeks balance at all 00:05:09 times, right? Not always immediately, 00:05:11 but it seeks balance at all times. It's 00:05:13 always seeking equilibrium. Okay, so 00:05:15 flash forward to AI. Now you hear the 00:05:18 knowledge workers. It's the knowledge 00:05:21 workers who are going, "My job." It's the 00:05:23 coders. It's the finance people. "My job." 00:05:28 The plumber is not worried about AI at 00:05:31 all. The baggage handler at the airport 00:05:34 hears zero about AI. And so maybe the 00:05:37 right response is it's the future, man. 00:05:40 It's technology. Reskill, reskill, you 00:05:42 know, maybe become a plumber, you 00:05:45 know. By the way, money's really good. 00:05:46 You get to work for yourself if you 00:05:48 want, you know. Um, so I just find the 00:05:51 pendulum kind of funny.
With AI, do you 00:05:54 think it's overblown or do you think it 00:05:56 really is cause for concern and deeper 00:05:59 thought? The honest answer is I don't 00:06:03 know. 00:06:04 Everybody falls on one side: 00:06:07 either, you know, the sky 00:06:09 is not falling, Chicken Little, or the 00:06:12 sky is falling and we're all going to 00:06:13 die. Right. The truth, like 00:06:16 most things, is probably somewhere in the 00:06:18 middle. But the real answer is I 00:06:21 don't know and neither does anybody 00:06:22 else. Yeah. And it seems that we should 00:06:25 have some sort of controls because we 00:06:26 didn't put any controls on the internet. 00:06:28 They did put controls on the internet in 00:06:30 China. Like children don't have the same 00:06:33 access to uh social media like kids do 00:06:36 here. Europe has controls on the 00:06:39 internet and America doesn't. And we're 00:06:42 the ones that seem to be suffering more 00:06:44 because of the lack of controls on the 00:06:47 internet. So I think some controls. And by 00:06:51 the way, when people, you know, talk 00:06:52 about deregulation and no controls, I 00:06:53 mean, they make us wear seat belts in 00:06:55 our cars. Yeah. You know, 00:06:57 we have speed limits, you 00:06:59 know, and it's for the greater good. 00:07:02 And yeah, sure, your seatbelt's 00:07:03 uncomfortable, but you'll get used to 00:07:04 it, like, and it's fine, you know. So, 00:07:08 I think the call for no reform is 00:07:11 wrong. Um, there are correct limits to 00:07:14 keep things safe. I am fascinated by AI, 00:07:18 both the benefits and the weaknesses of 00:07:20 it.
But it is revealing to me 00:07:25 something more important than what other 00:07:28 people are talking 00:07:30 about which is we are uh we're a results 00:07:34 obsessed society right 00:07:38 we care about output we care about 00:07:39 performance we care about numbers we 00:07:41 care about final product more than 00:07:43 anything right and when people talk 00:07:45 about AI they talk about its remarkable 00:07:48 ability to write the symphony paint the 00:07:50 painting 00:07:51 write the book, write the article, uh, 00:07:54 solve the problem. Like it is. And by 00:07:57 the way, the technology is incredible. I 00:08:00 asked only a few months ago to please 00:08:02 take this and put it in the style of me 00:08:04 and it was fine. I did it with a 00:08:07 friend of mine who's also an author and 00:08:08 we both did it for ourselves. We did it 00:08:10 on each other. It was really fun and it 00:08:12 was fine. I don't think it was good. It 00:08:14 gave me a good start and I 00:08:16 could edit it. I did it recently. We 00:08:18 both did it. It was damn near flawless. 00:08:21 It was scary good. 00:08:25 Right now, AI doesn't know the thing I'm 00:08:28 thinking about. It doesn't know that the 00:08:29 next book I'm going to write about is 00:08:30 friendship. It doesn't know the point of 00:08:31 view I'm going to have on friendship. If 00:08:32 you ask it what would Simon Sinek say 00:08:34 about friendship, it's going to be why 00:08:35 this and why that, you know? So, it's 00:08:37 derivative, right? We know that. It's 00:08:38 not original. We know that. But at the 00:08:41 end of the day, the work is good. The 00:08:42 symphony is good. The art is good. The 00:08:43 article is good. The book is decent. 00:08:45 Like, it's getting better and better and 00:08:46 better. But here's the problem that we 00:08:48 keep not talking about. People keep 00:08:50 telling us that life is not about the 00:08:53 destination.
Life is about the journey. 00:08:55 That's what we keep being told. 00:08:57 Right? But when we think about AI, we 00:09:00 only think about the destination. We 00:09:01 only think about the output. We never 00:09:02 think about the input. Right? I can tell 00:09:05 you that and you and I can both say the 00:09:07 same thing which is I am 00:09:10 smarter, better at problem 00:09:13 solving, more resourceful, better at 00:09:16 pattern pattern recognition, not because 00:09:18 an a book exists with my ideas in it, 00:09:21 but because I wrote it. The excruciating 00:09:24 pain of organizing ideas, putting them 00:09:27 in a linear fashion, trying to put them 00:09:30 in a way that other people can 00:09:31 understand what I'm trying to get out of 00:09:32 my brain. That excruciating journey is 00:09:35 what made me 00:09:37 grow. And sure, you can have an AI 00:09:39 friend and that AI friend has been 00:09:41 trained like the best best psychologist 00:09:44 to affirm you the best listening skills 00:09:46 that exist. Tell me about your day. That 00:09:49 sounds difficult. Boy, it's hard being 00:09:51 you. Oh my god, it's so great being you. 00:09:52 Have you, you know, like it's it's a 00:09:54 it's an affirmation machine built by a 00:09:57 for-profit company that wants you to 00:09:59 stay on. Can't neglect that. 00:10:02 but for the fact that nobody's learning 00:10:04 how to be a 00:10:05 friend. It'll feel good. You'll feel 00:10:07 like you have a friend, but you're not 00:10:09 learning to be a friend. Right. And it's 00:10:12 the what made you a great entrepreneur 00:10:15 is not that the company exists. Is that 00:10:17 you built it with your hands and you've 00:10:19 got the scars to show for it. Yeah. It 00:10:22 was when things went wrong and you were 00:10:23 forced to fix them. And think that now 00:10:26 when problems show up, you're quick, 00:10:28 you're smarter. 
You're a much smarter 00:10:30 businessman now than you were five years 00:10:33 ago, six years ago. Yeah. Because you 00:10:36 did it. And I think what we're 00:10:38 forgetting is that there's something to 00:10:39 be said for, and by the way, I'm a fan 00:10:41 of AI. I want AI to make things, but I 00:10:44 would hate to lose out on becoming a 00:10:47 better version of me. And I think that 00:10:50 um to really learn to grow. And by the 00:10:54 way, I used to have a steel trap for 00:10:56 phone numbers. I knew everybody's phone 00:10:58 number and then all of a sudden my 00:11:00 phone, my 00:11:02 PDA, I don't need to memorize a phone 00:11:04 number anymore. I don't know most of the 00:11:06 people that I love. I don't know their 00:11:07 phone numbers. I type their name in, 00:11:10 right? I just have to know their name. 00:11:11 And so my brain literally went on 00:11:13 strike. It said, "Fine, fine. You no 00:11:16 longer have the capacity to remember 00:11:17 phone numbers and I can't remember phone 00:11:19 numbers to save my life." Right? So we 00:11:21 give up certain skills or abilities 00:11:23 because of technology regularly, right? 00:11:26 That's fine. I don't have a problem with 00:11:28 any of those things because whether I 00:11:29 can remember a phone number or not will 00:11:31 not affect my relationships, my ability 00:11:33 to function in the world, or my ability 00:11:34 to cope with stress. But my ability to 00:11:38 know what to do when my friend is 00:11:41 struggling. My ability to know what to 00:11:43 do when I have a fight with my 00:11:46 spouse, my or my partner. My ability to 00:11:50 know what to do when my boss yells at 00:11:52 me, but I don't want to escalate it 00:11:55 or my employee is acting out and I don't 00:11:58 want to escalate it or fire them. Well, 00:12:01 how do I resolve this? I've missed out 00:12:03 on those skills. 
00:12:06 And if you simply ask AI, "How should I resolve this?", it'll give you an answer, and it may work, and you've learned nothing. It's like saying AI will provide boats for everyone, except for the time there's a storm and you don't know how to swim. And I'm okay with that: use the boat, but also learn to swim. So I think there's something to be said for writing your own symphony, painting your own painting, building your own business, writing your own book. Not for the output, but for your personal growth.

Before I got here today, I was writing a post for LinkedIn, and I was trying to make the case that everybody using ChatGPT to write their emails, their social media posts, the investment pitches that I receive, is now making the internet feel really inauthentic, because people that I've known for many, many years are now sending me these perfect cookie-cutter emails with words that I've never heard them use before. Yeah, of course. And so when I read it, my brain mentally discounts it as not being their opinion, not actually being them. Yeah. And when you feel like you're speaking to someone's AI, the meaning is gone. So I was writing this post about how there's actually now a premium on human-written language. Like, if you make a couple of mistakes and you use the old words, and you don't use words like "forged" and "robust." "I'd like to forge a partnership with you, Stephen. Furthermore, can we..." I'm like, you've never said that to me, mate. And the em dashes. Oh my gosh. It's funny, right? It's crazy. Yeah.
00:13:45 But on what you're talking about: have you ever heard of the Japanese concept of wabi-sabi? No. So, wabi-sabi is a Japanese design concept: beauty in that which is temporary or imperfect. Okay. Have you ever seen Japanese ceramics? Yeah. They're wonky, or the glaze is not even. Yeah. And they're beautiful. You know why? Because they're handmade. Tree bark. Trees. You know when you have a wooden bowl, wood is beautiful. Why? It's imperfect, and unique. Things made on a machine are the same, and less beautiful. And things made by hand are beautiful because they're imperfect. What makes people beautiful is not that we get everything right. It's that we get many things wrong. And what makes us fall in love is not the person who's perfect. It's the person who accepts our imperfections. And we know we're in love when we learn to accept theirs. Not learn to; want to. Right. And you're 100% right. I now know that in the art world, artists are being asked to sign affidavits that say, "I painted this. I made this. Not AI." Not because it's better or worse. It's because I want to know it was touched by human hands. And so I think you're right. What will happen is everything will be so perfect that it'll be as if everything we're driving or using came off a conveyor belt. And what we will start to desire is things that are made by hand. Because, by the way, think Rolls-Royce, Ferrari. It takes 39 months to get a Ferrari. You know why? It was made by hand. Yeah. Right.
00:15:36 And one of the things that makes it expensive is the technology and the carbon fiber and all that, but the other thing is it's slow and it was made by people. Human error. The value of human error. The value of human error. Scooter Braun said to me the other day, we could watch a computer play chess with another computer. He goes, but the chess games that have the highest demand are one human versus another. Because when a computer plays a computer, the moves are predictable and they're the same and they're perfect. But it's the human error of two human chess players, who are objectively worse at chess, that makes it so fascinating. That's any sporting event, any sporting event. It's not the perfection of the game. It's the error that loses the game that adds the drama. Trying not to make a mistake is as powerful as trying to get everything right. And it's the humanity of the sport, the humanity of the competition. It is the imperfection. And I think that we forget as people what makes us beautiful. Like, when you go on a first date or a first interview, all you do is present perfection. I put on my best clothes, dressed up. I don't dress like this every day. I dressed up on my date, right? My interview. I don't wear this. This is what I wore for my interview, because I want to make a good impression. And I practiced, and I make myself confident, and I've got a great job and a great personality and I love my mother and, my goodness, everything's great.
00:16:58 And then you get in the relationship, you get the job, and you become a slob, right? This is what AI is. To your point, it's fake.

On the interview example: the minute you started describing that, I immediately flashed back to an interview I had last week in our company, where a young kid walks in wearing a suit, and I thought, now I have no idea who you are, because I know that's not you. I know that you don't wear a suit. You're 22 years old. You do not wear a suit. So I have no indication, I have no clues as to who you are, and therefore it's harder for me to figure out if you fit here. Because what he tried to do there was show up perfect, or what he thought perfect was, and in some ways hide who he actually was. Which is why I now like sloppy text on the internet. I like grammatical mistakes.

But at the same time, you don't want him to show up, kick his shoes off, and put his feet on the table either. There's an element of respect you want, and you want somebody in that first interview to put in some effort. Too much effort, yeah, is pretty inauthentic, but too little effort... what's that? This is my diagram. You've got "perfect" on one end, which is low, and then you've got "poor," which is also low. The sweet spot, yeah, is right here in the middle. Yeah. Well, I think there's some truth to that. It's the same thing as how we all want vulnerability in our relationships, but not on the first date. Yeah. Like, I don't need to know that quite yet.
00:18:25 You know, what are the other skills that you think we need to equip ourselves with, based on the way the world is heading? Because, you know, the calculator came along and we no longer needed to be able to do complicated maths. I've completely forgotten my times tables. I can't spell anymore. I said to my friends, the most I can do is 9 × 9. That's the top end of my range. And with spelling it's the same; I get like half the word correct now. So what are those skills?

I think it's all human skills. I think where the world is going to go, and at least this is where I'm taking a bet, is that as the end product becomes easier to produce, it's the humanity that's going to suffer. And unless we take personal accountability, both as individuals and organizations, to teach and learn human skills, they will disappear, for all the reasons we're talking about. So: how do I listen? How do I hold space? How do I resolve conflict peacefully? How do I give and how do I receive feedback? Those are two different skills. How do I have an effective confrontation? You pissed me off; do I know how to approach you as a friend, as a colleague, without creating a massive fight or losing a friendship over it? How to take accountability, how to express empathy. These very, very human skills are the things that we're already starting to see suffer just with the internet and social media. And so I think AI will only exaggerate the loss of those skills, and those skills are more important than learning how to spell.
00:19:55 One of the concerning things was, I heard Sam Altman, who's the founder of OpenAI and ChatGPT, launch this thing called Worldcoin a couple of years ago, when ChatGPT really started taking off, and it has been closely tied to the concept of universal basic income. Mhm. The overarching idea is that in a world where AI and automation eliminate many jobs, UBI may be necessary, and Worldcoin is one way to help implement it. That was stated by the founder of ChatGPT, Sam Altman.

Yeah, I'll just go back to my ironic statement from before. Isn't it ironic that they want to do universal basic income now that the knowledge workers are losing their jobs, but when the factory workers were losing their jobs, those same people were massively against these kinds of things? So, I mean, yes: what happens to purpose? It's ironic. And meaning.

For anybody that doesn't know what universal basic income is, the idea is that the government, the state, whatever, would pay you a certain minimum salary. So $2,000, $3,000 a month, whatever it might be. Because they don't think there are going to be enough jobs to go around. And I wonder what happens to purpose and meaning and pursuit and challenge and all these things in a world where we're just being handed money. We're not being given wealth, there's a difference; we're being given survival money, right?

And so, you know, we have to be very careful with the idea that says everybody who's on welfare is lazy. That's not true, you know.
00:21:26 So we have to be very careful: just because we give somebody something doesn't mean that they cease to have ambition or purpose or drive. It's like somebody who works on commission and makes just enough to pay their rent and buy food, and that's it. Like, that's a lack of ambition. You know, at least the people I've heard talk about it make a compelling case for it, especially in a world where there is plenty of wealth. But I don't know enough about it to make an argument for or against it, if I'm honest. But I do find it ironic that the Sam Altmans of the world are calling for it, given the fact that there are going to be so many job losses, when it's jobs of their kind. And I also think it's funny: what's going to happen when Sam Altman's product gets good enough that he can lay off most of his staff? Just curious what happens.

He has made a point of having, I think, 100 people or less in his company. He doesn't have a big team. And I think part of that is because, when I heard his TED talk a couple of days ago, he was saying, "Yeah, I think AGI is sooner than we think, actually. And I think we're going to have a fast takeoff," which means it's going to arrive very quickly and accelerate very quickly. So I think he's actually preparing for that.

Yeah. But what happens to the 90 people he lays off when he doesn't need 100, he only needs 10? This is the question. I'm just curious. I don't know.
00:22:53 And this is why, for anybody who has an opinion about it, the answer is: we don't know. But I think people react very differently when it's their job on the line, when it's their income on the line, when it's their pride, when it's their ego. You know, I keep hearing from companies... I mean, we were talking about this before we turned on the cameras. If you want a new website, I guarantee you, I don't care which company you talk to, they will all talk about how they're AI this, AI that. And you ask the question, "Are you using AI?" "Yes, we're using AI. We're doing it differently. We're the future," blah blah blah. And then you ask them for a proposal, and it's going to look like all the other proposals from 2015: this is how many hours it's going to take our people to program this and code this. And I was like, what happened to all the AI? Why is this slow and expensive when everything's supposed to be fast and inexpensive? Because they're taking the margin. Of course they're taking the margin. And they've got a lot of people doing things the old-fashioned way, because of the business model. People work very hard to protect the status quo. The status quo exists because there are people who benefit from the status quo; that's why there is a status quo. And like I said, everybody's into changing the future until it's them that's threatened, or their income.

The billionaires that I know, the one consistent thing they've whispered to me about AI is that people are going to have a lot of free time.
00:24:18 That's one of the things that's been really consistent. You're so right when you say that. When I asked you about the future of AI, you said, "I don't know." The reason why I know that's probably the correct answer, generally, is because when I sat with the most advanced people in AI, whether it's Mustafa, who's now CEO of Microsoft AI, or people from Google, or the CEO of Google, or Reid Hoffman, who's the founder of LinkedIn, they all had different opinions, which made me think the right answer is actually that nobody knows.

The right answer is nobody knows. That is correct. And always be aware of the messenger, right? You won't have anybody who owns an AI company talking doomsday scenarios; it's not in their economic interest, even if they secretly harbor that. It's like the people who used to run cigarette companies: they didn't smoke, and they didn't let their families smoke. I remember visiting Facebook in the earlier days, and I went into the cafeteria, and they had these picnic benches, and they were telling me with pride how they have these communal eating areas to help people maintain relationships. And I was like, this is hilarious. You literally have a product that breaks relationships, and yet you understand enough to make people eat together at lunchtime so that they'll maintain relationships. The point being, it's your economic interest: show me how someone's paid and I'll show you how they behave.
00:25:35 You know, one of the scariest conversations I was privy to: a friend of mine who's a billionaire in London knows the CEO of one of the biggest AI companies in the world, who I can't name. And he said, "By the way, what he tells me in private is not what he's saying publicly." Yeah. He said to me that what this particular CEO thinks is going to happen with AI is pretty horrific, and the CEO of this big AI company is totally cool with it. And it's horrific what he thinks is about to happen. And then when I watch this guy do his online talks and give his opinion, he's so nuanced, and everything will be fine, and he's an AI optimist. Then I heard, at this kitchen table in East London, from his friend, the scenario of what he really thinks, and it was chilling. Yeah. The lack of empathy... yeah, that makes sense to me. But the obsession with power was shocking to me. Yeah. The obsession with power and money and all the rest of it.

Yeah. But this is because the internet has done something really strange and challenged one of my theories head on, right? So I talk about the infinite game, you know, James Carse's theory. In an infinite game there are no winners or losers, right? Nobody wins fast food. Nobody wins cars. General Motors, Ford, Vauxhall, they can all exist at the same time, right? They'll have degrees of success or not, but they can all exist simultaneously. Nobody's going to win. The exception is the internet. Like Amazon: it won. Yeah.
00:27:08 And, you know, Google for search. Yep. They won, right? And if you start going down the big tech companies, there is only one in each category. I mean, sure, there's competition, but not really, right? Walmart is making a run at threatening Amazon, but Amazon's still so damn big. In all of these categories, there's only one. And that's not good, that you can have winners in a category. And this is why I think the race for AI dominance is so aggressive, and why people are not being careful, and why they're not putting in controls: because the way that tech seems to work is that there will probably be one dominant standard, and then that's it. And the question is, which one? It just seems to be the way it is, which is a very scary prospect to me: the fact that we can have winners is a bad thing. Especially if we pride ourselves on being capitalists, then there cannot be a winner, there cannot be one that is so dominant that nobody else can even compete except for scraps.

What are your emotions when you think about AI and what's happening? Because I feel like the moment we're living in is a profound one, and that we don't actually realize it, because when these tools come out... OpenAI released o3 yesterday. It's the best model ever. The day after, my life was the same. So we don't really notice it, because we go back to work, our clients ask for the same things, we have the same team members sat around us.
00:28:35 It almost seems like the sand timer has been rotated and we're on a clock, and it's a slow disruption of our everyday lives. Sam Altman, in his TED talk three or four days ago, said that in the short term everything will appear the same, but in the long term, he goes, life is going to be completely different.

Yeah, I think that's right. I mean, look at any technology. Like AI, it was kind of the same until it wasn't, and these are evolutions, not revolutions. Well, there's a revolutionary bit. Mhm. You know, I remember when the internet showed up, and internet shopping showed up, and all the technologists were like, it's the end of stores, it's the end of bricks and mortar. They're done. We'll never go to a shop again. Well, that didn't happen. Now shops struggle to compete against the internet, but that's a price thing, right? That's a business model thing. But we like going shopping. Because, again, all of these companies, especially technologists, always forget that the end user is a human being. And most of us don't fully understand everything. Even our iPhones: most people use a small percentage of all the capabilities of their iPhones. Most of us don't even know how to change the damn settings to make it do something we want, right? And neither do your kids. It's not an adult thing, it's not an old-person thing. There are a few people who get more out of it, and good for them. Some people use it just as a phone. Fine. It's a bell curve.
00:29:56 So I think there will be a few people and a few companies that will get more value out of these things than the rest of us. But I think he's right. I think there'll be a revolutionary bit, and then it'll settle. I find this whole thing fascinating. When you ask me how I feel, depending on what subject I'm talking about: absolute fear and absolute amazement. I have both, and everything in between. When I think about how it affects democracy, and the ability to make deepfakes, and how it can manipulate people and their opinions to vote one way or another, I have real fear. Yeah. Right. When it comes to productivity and the reshaping of business: you know, technologists and people who were part of the internet revolution love to say that 20 years ago, 80% of the jobs we have now didn't exist. They love to say that, right? And I think it's the same now, which is: all those people who are going to lose those jobs, white-collar jobs and knowledge workers, they're not going to not work. There are going to be new jobs. The IRS digitized a whole bunch of years ago, right? They got rid of all the accountants and they put in all the computers. Do you know how much money the IRS saved when it completely changed the way it looked? The answer is zero. Yes, they got rid of all the accountants, but they needed to hire all the IT people. So the workforce looked different, but it didn't get smaller.
00:31:26 And so I think it's the same thing with what's coming. We also already know the massive, incredible amounts of energy that it takes for AI to work. Data centers that use up massive amounts of electricity like we've never seen in our lives. Nuclear has to be a thing. There isn't enough coal or oil or solar or wind to power these things; it just doesn't exist. So nuclear has to be a thing. So go be a nuclear engineer. You want to get an advanced degree? I don't need you to be a coder. Coding was a thing for a while; go be a nuclear engineer, because, by the way, you've got to be just as smart to be a nuclear engineer as to be a coder. So you're going to start to see that. You're going to see energy work. I just think the jobs will change. The one thing I do disagree with is the idea that there's going to be a bunch of people walking around bored. I just think the jobs will change.

If there were a 10-year-old kid standing here now, and the 10-year-old said to us, "Guys, what do you think I should focus on?" I would say two things. One, going back to human skills: learn how to be a good friend to your friends. Okay, how do I learn that? You're going to really need that. How does a 10-year-old learn that? Or how do you and I learn that? Both. A 10-year-old learns it when they go and have a play date at a friend's house and a smart parent takes away all the phones. I would hate for the 10-year-old to have a phone in the first place, but if they do, take away all the phones and make the kids go play.
00:32:54 When they have a fight, the parents make them say sorry. You know: go over to your friend's house and knock on the door, and you're going to say sorry for the thing that you did. We're going to teach kids how to resolve conflict. We're going to teach kids how to pay compliments. We're going to teach kids how to take accountability. And these are all the skills of "What did you do wrong?", because it's not always the school or the teacher; maybe your kid is disruptive. Accountability is a real thing. And so if we teach those things to 10-year-olds and to adults, I think it makes for a better society.

And the other thing is: go learn a real skill. And I don't mean, like, prompting; prompting isn't a real skill. That's not what I mean. It's what I said before, which is the excruciating part. Like, what makes great relationships great is not that you get along all the time. The best marriages, the best relationships, are not absent of conflict; it's that they know how to resolve conflict peacefully. By the way, I believe in world peace. I don't believe in a world without conflict. I believe in a world in which we can resolve our conflict peacefully, without the need to go to war to resolve conflict. This is why I like democracies: because democracies can solve conflict without bullets.
00:34:07 So, the human skills one. But when I say a real skill, I mean: go do something difficult. Build something, design something, imagine something, write something. And by the way, I'm totally fine even if you plug it into ChatGPT and say, "Tell me what's wrong with this," and it says your grammar's all screwed up. And like I said, I am smarter because I did it. The reason I'm more confident than when I was younger... and I think that's one of the things people talk about: you get wise with age, and you get more confident as you get older. Yes, that's all true, and there are multiple reasons for it, but I think one of the reasons is that the things that are happening to me now, I've gone through those things before. They were scary and kept me up at night the first time, and now I know how to do it. I'm not afraid of it anymore. And so I think what happens as you gain experience is you lose fear. And if ChatGPT, or whatever AI product we use, does everything for us, I think you just end up scared.

One of the things that I'm contending with at the moment, with this new technology that's arrived, being an entrepreneur, seeing this huge opportunity, thinking about the dot-com boom and all the great opportunity that that created, and people talking about the age of abundance and all these things, is a question on a personal level, which is: when is enough enough?
00:35:38 And maybe this question is more pertinent now than ever, in a world where creating stuff, building stuff, starting a company, launching a book... the cost of creating these things, whether they're good or not so good, has gone to basically zero. Yeah. So we can all, theoretically, from our computers, now become movie directors and authors and software developers. And with this possibility, this opportunity, the thing we need to deploy is intention. Like, what do I do? What is the thing that's going to lead me to happiness? Do I pursue all of these things and start building and creating and running off down that path to climb some ladder? Or do I take a second? When is enough enough? As an entrepreneur who is in this moment, has a lot of resources, could roll the dice, could start all these new companies, could do all of these things: when is enough enough?

There's something to be said for gratitude. And if you want to make it about money, we know the data on this, right? I think once you reach, I can't remember the exact number, it's something like $70,000 a year of income... When you're talking about money, people say you can't buy happiness. It absolutely buys happiness up to a certain level, which is survival and then a little bit more. But once you reach a certain level, there is no discernible increase in happiness that comes with money. Now, what money buys is options. What money buys is time. Those things are true. And you said it: some of the people that you and I know who have made generational wealth, they're not discernibly happy.
The ones that are happier were happy before they made the money. And then there are the ones who thought the money would buy them happiness, or worse, the money took away their purpose. Because when they made the money, they were driven by something else, and they accidentally made the money. They built businesses that were their passions and their cause, and then the money came and they weren't building the thing anymore.

And this is really interesting, because it also gets to the question: why is it that small companies are more innovative than big companies? Think about it. When you ask what the secret to innovation is, you want resources, you want great people, and you want great market opportunities, and then you can have great ideas. Okay. Big companies have tons of money. They hire all the best people. They have mature marketplaces where people generally know who they are. And they're the least innovative organizations on the planet. Then you have little companies that have no money, they're bootstrapping it, they don't have enough people, nobody knows who they are, and there are [ __ ] market conditions. And yet they're more innovative. And then big companies innovate by buying the little companies. That's basically what happens. My exit, your exit: the big company can't innovate, so they just bought you. Why is that? Why is that the rule? And it goes directly to this.
I think the reason is that when you're small, your ambitions are bigger than the resources you have to achieve those ambitions. Every small business has outsized ambitions, beyond objectively stupid. You look at what they have, they tell you where they're going to be, and you just think: no. And yet some of them do it. And I think the problem with big companies is that their ambitions are well within their capabilities, their resources. In other words, their vision isn't big enough. Your vision has to be bigger than the amount of money, resources, and intelligence you have to achieve it. And what that produces is creativity.

And so it goes right back to this: if we can do so much with AI, then we need bigger visions. So when you ask me how to find happiness, I think we need to set our sights on things that are bigger than finite success. And I think we do need a gratitude practice. Regardless of how little or how much you have, to be grateful for what you do have has a profound impact.

I went through this with the LA fires. I was very lucky that my house survived and I didn't have to get evacuated, but the evacuation zones were getting closer and closer. And two things happened that were profound, that live with me now. One is resolvable and one is unresolvable. We were all obsessed with this app called Watch Duty, which is how we tracked the fires.
It basically took all the publicly available information and put it in one place in a really amazing way, started by this amazing entrepreneur named John Mills. We were all obsessed with Watch Duty; we were on this app the whole time. And one of the things we were watching was the wind, because if the wind shifted, it could profoundly impact your life. And I remember having this experience: we were all watching the wind, and the wind went away from me, and I thought, "Oh, thank God." And in that moment, I knew that somebody else was looking at the app going, "Oh God, no."

And it's not like service, where I'll eat a little less so that somebody can eat more, or I'll give up some of my income so that somebody else has it. It's not one of those; I can't wish my house not to burn down so that somebody else's house doesn't burn down. So I had to live with this paradox of how unfair the world is: that simultaneously my relief and good news was somebody else's stress and bad news, and there was nothing I could do to change that. That paradox is horrible, and it was right in front of me.

So that's one. But then there's the second part: the evacuation zones were coming a little closer, and they were one zone away from where I live, and we didn't know if we were going to be woken in the middle of the night with an alarm to evacuate. We didn't know. And so I had to go through the process of packing up my car and making my go bag. And I put as much stuff in my car as I could.
And I had to, well, we all play that game: if there's a fire and you have to run out and grab two things, what would you grab? I actually had to do that. A lot of people in LA had to do that. You actually had to make the decisions: what am I going to take, and what am I going to leave behind? And I found myself bringing things that I never thought were important to me, and leaving behind things that I thought I would take.

But the one thing that was amazing was the stuff that I couldn't fit but still had love for. Like my favorite painting in the world. I just couldn't fit it in my car. I stood there in front of it, and I said thank you, and I said goodbye. And it was like saying goodbye to a loved one. You hear this when somebody loses a parent: "Look, it was awful, and he suffered on his deathbed, but I'm glad I was with him to say goodbye." It was the most amazing thing, to have gratitude for something I didn't want to lose, but accepted that I might.

And it has made me a lot more disconnected from my material things, especially the things I said goodbye to, because I've already said goodbye to them. I just sold some of my art for charity, and people said, "How could you? You love your art. Your art is like your babies. How did you choose?" And I said, "I already said goodbye to everything here. I did it months ago." And I think about this idea of gratitude, gratitude for what we have, but also: you're going to lose your parents.
All of us will lose our parents. Hopefully. Hopefully they never have to say goodbye to us; if things go well, we're going to have to say goodbye to our parents. And we can't be angry about it. We want to say thank you for the times we had. And I think having that level of appreciation for everything in our lives, for how temporary all of this is, makes you happier. I know how it sounds, but it makes you happier to just look at someone and be grateful. Failed relationships: you can be angry at the other person, or you can be grateful for the lessons they taught you and for the good times you had. And we can change our minds toward gratitude. It sounds a little bit hippie-dippy to have a gratitude practice, and that's fine. If that's your thing, lie in bed every night or keep a journal and just say the things you're grateful for.

But I don't know if it works without an evacuation zone approaching. I don't know. To go around your home and just say thank you to the things that you like is a weird thing. How many of us do it? When was the last time you called a friend out of the blue and just said, "Thanks for being my friend"? Like, "Hey, I just wanted to call and tell you I love you. Just to tell you thank you. That's all. No, just a quick two minutes. I just wanted to say thank you for being my friend."
I think if I said that to one of my best friends, they'd think I was losing my mind, or that something was wrong. They'd be so concerned, because it's such an unusual thing for me to do.

Okay, so you can do it this way: "I had a guest on the podcast, and he came up with this thing that is so outside of my personality, but I'm going to try it." Yeah, that would make sense. Then you can do that.

You know, I'm new to this. For years people have been telling me, "You should have a gratitude practice and keep a gratitude journal." And I tried it. I was like, "Okay. I'm grateful for my sister, grateful for my family, grateful for my friends, grateful for the life that I live. Good night." All right, next day: "I'm grateful for my sister, grateful for my family, grateful for my friends." And I found it so repetitive that I wondered, is this worth it? Sure, every now and then something was different and new. And I've come to the conclusion that if it's the same three things every single day, that's good.

I was thinking about this a lot the other day, when someone asked me: think about all the people in your life, and imagine they were sick, and imagine you had a billion dollars in your bank and a billion dollars could cure their sickness. Who would you cure it for? Would you spend a billion dollars to cure your girlfriend's sickness, your mother's, your father's, even if the risk to them was low? And you would.
I'd give every penny I had to cure an illness that my girlfriend had, even if the risk was low. And as I was thinking through that, I realized: if you look at my calendar and how I'm allocating my time against these individuals and against my priorities, there's a real imbalance here. And over the last couple of weeks in particular, I've been on a bit of a journey of realizing just how important four or five people in my life are, and how much I neglect them.

Yeah. Of course, and you're right: if you say, "Give a billion dollars and you can cure this disease that affects two million people, ten million people," you'd say, "I'll give some money." But if it was one family member, you would exhaust every resource. You would quit your job. You would do everything you could, for a one percent chance. And people do quit their jobs. So many charities have been started because "my father died, my mother died, my sister died, my brother died, and now I've devoted my life to this." Literally, that's the reason. And it's because it's personal. Of course that makes sense.

It's why some entrepreneurs are good and some are bad: how personal is the thing that you're working on to you? The entrepreneurs I like, I want to know that they're solving a problem that they struggled with, or that somebody they love struggled with. If they just read an article in a magazine and thought, "This is a great market opportunity," there is no passion there; that is driven by money and power only. I want to know somebody for whom it's so deeply personal that they will stop at nothing. They will run through a brick wall and find every creative solution. And it goes right back to the small company versus the big company: it's passion, and a vision that is bigger than the resources that I have.

Which reminds me, I had a conversation with somebody recently. It was a business problem, and they wanted to change the goals. And I said, we can't just change the goals because they're difficult. Yes, we might miss the goal. It's true, we have a very aggressive goal, and the likelihood of us hitting it is incredibly low. But why would we lower it? And they said to me, "I don't like to fail. I don't like to fail, and I know you don't either." And I said, "Ah, that's where you're wrong. That's where you're wrong. I have spent most of my life a failure, and I'm very comfortable being a failure. I think of myself not as a success; I think of myself as a failure. And that's because my ambitions are bigger than my skills or my ability to achieve those ambitions. So almost everything I've done, with a couple of exceptions, has fallen short of what I had hoped for. I'm very comfortable with that, because failing at 80% is much better than succeeding at 30%."
And I think about this idea of fear of failure and embracing failure. I don't want to fail, but I think it's important to have dreams that are beyond your skills or your resources, because that's where creativity comes from. That's where resourcefulness comes from. "How am I going to figure this one out?" You told me the stories of your own team; it's the resourceful ones.

And now we go back to AI. AI is not going to figure that out. There's data on this; I've actually never thought about this before, this is good. There was a book, I think it was The Wisdom of Crowds by James Surowiecki, about how people with experience knew what to do versus people who were just trained. I'll give you the example. Again, I might have the book wrong, but I remember the case. There was a bunch of firemen fighting wildfires, and the wind picked up and the fire started coming really fast toward them. True story. They all started running for their lives as the fire approached unbelievably quickly. But the problem was, if they looked ahead, there was a small hill, which means you're going to slow down, because you can't run fast up a hill, and this fire is coming fast. The senior guy on the team started screaming, "Get down! Get down! Get down!" And they all ignored him; they were all running for their lives. And he just stopped running, got down, put his hands over his head, and lay in a ball.
And the fire was going so fast that it blew right over him, and it caught up with the other guys and burned them all to death. Now, they didn't teach him that in fire school. It was accumulated wisdom that came from experience; he knew the right thing to do in the moment. He was able to read the tea leaves, and his gut, whatever that means, said: the right thing to do is to drop down, and you'll be okay.

And this is why, and I'm going to sound like a broken record, I keep coming back to the importance of doing the work yourself: writing the book, painting the painting, choreographing the dance, composing the symphony, building the business, having the difficult conversation, stumbling and bumbling. Here's the reason. Let's say you have a fight with your girlfriend. You don't want to be in a fight. You don't like being in a fight. You want to do the right thing. So you go to ChatGPT and say, "My girlfriend and I had a fight. This is what the fight was about. I think I did some stuff wrong; I think she did some stuff wrong. This is exactly what happened. Tell me what to do." And then you go, "Babe, I just want you to know I want to take full accountability. I'm really sad that this happened, and I want you to know that I care about this relationship." And she says, "Did you get this answer from ChatGPT?" And you go, "I did." How's that going to go? You did everything right. You did everything right.

But for the fact, and it goes right back to what you said, that it removed the humanity. It removed the personality. It's artificial. It's fake. It's everything you said about the job interview, about all those other things, about all the resumes, all the pitch decks. It's not you telling me sorry; it's ChatGPT telling me sorry. And even though you went with good intention to get it right, I would rather you get it wrong and bumble and fumble it with me and say, "Babe, I don't know how to do this. I'm an asshole." And then she fights with you because you get it wrong, and you rumble through it together. And what happens when you come out of the fight? You've had this happen; I know, because I have, and everybody has. When you come out of the fight, you're closer. Not because you got it right, but because you got it wrong.

And if you learn the skill, and you get better and better at saying the right thing, she knows it wasn't because you asked AI in the moment just to resolve the problem and remove the tension. It's because you learned the skill for the time you can't predict, because you're equipped for this relationship. And it's that investment in the relationship, rather than trying to transactionally solve the problem in front of me, that's the difference. It's infinite versus finite. Transaction versus investment. Destination versus journey. I'm on the journey of this relationship, versus: I have a destination.
I've got to solve this problem now, otherwise it's going to destroy my relationship. And all of this is coming full circle; it comes right back to where we started. I think it's hilarious that you're having a conversation with me about AI, because I'm not an AI expert and I'm not in the AI business. But I am in the humanity business. And I think everything we're talking about, from every angle, is battle-testing this idea. And what we can't get away from is that human beings really want human beings, and human beings really want human experiences, and human beings really want things made by human beings. And we are not only okay with imperfection, we want imperfection, because imperfection is the sign of a human.

I was just thinking about how, when I'm in an argument with my partner, if she were perfect, if she were completely composed, if she were looking at me without emotion, without expression, and spewing ChatGPT-like responses back at me, it would be a little bit infuriating, but also completely dehumanizing, as you say. And it's funny how, even in conflict, I want emotion. I want imperfection. I want human resonance.

So it's interesting, because I've been thinking about how my struggle is a good thing. Yeah. And I think in our modern day and age, we have underappreciated and underrepresented the value of struggle. If you ask anybody, "Tell me about a time in your career where you felt like, boy, this is the most amazing thing I've ever done, I'm so glad I'm a part of this..."
It's not the big win. It's not the big success. It's not "we finished everything on time and under budget." It's "oh my God, this one thing went horribly wrong, it went so badly, and yet the way we came together..." Like: the most important thing in my career was when I lost my passion and went into a deep depression. I never want to go through that again. I'm really glad it happened. And all of our relationships, professional, personal, romantic, whatever they are, all of our relationships get better when we go through struggle together. And we know how the human animal works. We know that oxytocin is released when you have shared struggle. That's why, when you put people in boot camp and they go through [ __ ] together, or there's a natural disaster, all of a sudden I don't care who you voted for. I saw your house blow down in the tornado. I've got you. Don't worry, we're neighbors. We can put aside all the rational nonsense, the intellectual nonsense, and at the end of the day, human beings are good at helping human beings.

Struggle, in many contexts, is also the value. When I think about a Simon Sinek book, the reason I value it is because I know that Simon Sinek spent years writing that thing and pulling it together. The reason certain handmade things that we talked about earlier are valuable is because of the pain and the toil that went into them.
And when you think about the art world and other creations through history, the value comes from the fact that human beings came together for a prolonged period of time and did something. The investment is the value. We're not buying the product; we're buying the story. Yeah. You're not buying the Mona Lisa; you're not buying a piece of art. You're buying the story that goes with the art: the story it took to create the art, what the artist was going through, what they were thinking. You're not buying my book; you're buying the story of the making of my book. And the Mona Lisa was stolen, from what I understand. We don't even know if the one in the Louvre is the real one. I heard that much of the reason the Mona Lisa is so valuable is because at one point it was stolen, and then they managed to find it again. Really, it's just a painting, but the story is worth 100 million, 200 million, whatever.

And so this is why artists are famous: you buy the story of that artist, not their talent. There are a lot of famous singers and actors and painters and dancers who are a lot less talented than the unknown ones, but you buy into the story. And this is why some celebrities, as much as they complain about the paparazzi and the tabloids, they want the paparazzi to follow them, because it keeps their story relevant. They're worth more because they're in the zeitgeist.
Apple knows this better than anybody, because you go to an Apple Store and they've laid out their products as if it were an art gallery. The three feet of space on either side of the iPhone create the impression in my mind that this is a piece of art and there's only one of them. And the fact that they've "wasted" all this space, which I know costs money, retail real estate is expensive, pours value into the device itself. If I had gone into an Apple Store and there were a thousand iPhones, like the old electronics shops, all stacked on top of each other, I would assume the iPhone was worth less. But just by the frame in which I see it: oh my God, this thing. It's theater. It looks like a one-of-one. Yeah, it's theater. And some would call it manipulative, but we want things to feel valuable, not just be valuable.

Right? I could tell you: I found a guy who makes cashmere jerseys, and he uses the exact same cashmere as, you know, Loro Piana or whatever fancy brand. But the problem is it says "Dave's Cashmere Shop." I could tell you everything about where he sources the cashmere, how he makes it, that it's the same everything. And you'd be like, "Yeah... Dave's Cashmere Shop." Because you're not buying the cashmere. You're buying the brand. You're buying the story. You're buying the association. That's why brands have value: because it's irrational. Yeah. And humans are irrational. And that's why companies invest in building brands, for the story.
And so, yeah, as much as AI scares me, I still believe there's a thing the technologists don't appreciate and won't appreciate: there will be a rebellion, and handmade will become more valuable, and handmade will become more expensive, and people will want to be able to say it. You know: "You had that person write your speech for you?" "Wait, who did the painting?" "They did it themselves." And I think that's good. It's a pendulum, right? We're going to get enamored with the technology until it's boring.

This also just expands generally. I know this sounds quite big, and we're talking about these big things, but it applies to everything that you create. It's very tempting at the moment to just create something with AI and throw it up on your website, on your social media pages, or present it to the world, a presentation deck at work. But actually, I'm already noticing that I'm attributing huge value and interest to things I can identify as human-made.

I had a flashback a second ago, as we were talking about this idea of scarcity, to one of my favorite brands in the world. It's a clothing brand, and I was obsessed with it. I don't normally spend money on clothes, but I would spend a huge amount every time they came out with a new item. One day the founder of the brand, and everybody knows this brand, posted a photo from his factory. It was like a video.
And what I saw in the video was the shirt I was currently wearing as I watched, sitting in a massive bucket with 4,000 others of the exact same shirt. And in that moment I fell out of love. You fell out of love, exactly. Because in my head, I'd painted this artisan picture of two guys sewing it in their bedroom. That's probably what it was in the ad, too. It kind of used to be. So I still had that image.

Yeah. One thing that I've always understood, and this is true for businesses, this is true for absolutely everything: scale breaks things. Scale breaks things in the military. The special forces, special operating forces, Navy SEALs, SAS, all those folks: there's a saying in the special forces that's basically "you can't scale special." You can take whatever training and skills the special forces have and give them to everybody, and it's not going to work. Special can only be small. And so scale breaks things. Always.

Take Microsoft versus Apple. Apple wanted the highest-quality operating system in the world. So what did they do? They refused to allow clones; they wouldn't let their operating system be cloned. And as a result of refusing to do that, for years, at the height of the PC wars, Apple had maybe 4% of the world's operating systems. Microsoft said, "We're happy to have our operating system cloned." So it was a little bit different on a Dell.
01:01:53 - It was a little bit different on an IBM; wherever you used it, it was slightly different. And they had ninety-something percent of the world's operating systems. It's because you have to trade quality for scale, every time. There's a reason fancy goods cost a lot of money: as you said, the way you make things has to change, you have to start making them in factories, and you have to scale. Scale breaks companies. Think about the number of companies that talk about "our company's like a family." Get to about 150 people, 200 people, get to Dunbar's number: not so much of a family anymore.

01:02:25 - I think B2B marketers keep making this mistake: they're chasing volume instead of quality. When you try to be seen by more people instead of the right people, all you're doing is making noise. But that noise rarely shifts the needle, and it's often quite expensive. There was a time in my career where I kept making this mistake, and many of you will be making it too. Eventually I started posting ads on our show sponsor's platform, LinkedIn, and that's when things started to change. I put that change down to a few critical things, one of them being that LinkedIn was then, and still is today, the platform where decision makers go not only to think and learn but also to buy. When you market your business there, you're putting it right in front of people who actually have the power to say yes, and you can target them by job title, industry, and company size. It's simply a sharper way to spend your marketing budget.
01:03:14 - And if you haven't tried it, how about this: give LinkedIn ads a try, and I'm going to give you a $100 ad credit to get you started. If you visit linkedin.com/diary, you can claim that right now. That's linkedin.com/diary.

01:03:30 - Is it harder now to find love? Because there are lots of stats that say we're having less sex, and we're lonelier than ever before. Interestingly, and this is maybe an adjacent point, I was looking at Bumble's share price.

01:03:43 - I love the founder of Bumble. The CEO is a really, really good friend of mine. I think she's amazing. Whitney. Yes, I think she is amazing. I know her, and she's a wonderful human being.

01:03:54 - But when I was looking at the Bumble share price, it painted a really interesting picture, because when you overlay that with some of these other dating apps, you see... well, she's had to come back. She's just come back in to try to turn the company around. And interestingly, I saw her do an interview, one of these ones here, and in it she says she's going to revamp Bumble to make it not about finding love with others, but about falling in love with yourself. There's also going to be this dating marketplace, but her first objective is going to be to get people to love themselves, through coaching and all these kinds of things on the app, and then find a partner.

01:04:28 - Sure. Yes, I agree with that. I mean, we're all broken records, right? And I don't mean that like we're all broken souls.
01:04:42 - Like, you know: your customers will never love you until your employees love you first. You'll never find somebody to love you unless you love yourself first. And look at failed relationships where there's so much tension: it's somebody's lack of self-love that contributes to the failing of that relationship. I have a dear friend who's going through it right now. She just can't find love, and it's because she doesn't love herself, and she knows it. It's a hard thing to do. So if Bumble can crack that code, more power to them. But this is the problem with a lot of these things: they're common knowledge; we just don't do them. Everybody knows how to be healthy. Everybody knows how to exercise. Everybody knows what eating right means. We don't do it, because wrong is easier and right takes effort. Everybody knows what we're supposed to do in a relationship. Everybody knows that we're supposed to hold space. Everybody knows philosophically what we're supposed to do, but we don't do it. By the way, it's the reason why, though I've written all these lovely books, most companies don't use them. It's because my work is like exercise. "Hey Simon, how do I get into shape?" Easy: work out for 20 minutes every single day. "Can I take the occasional day off?" Yes, but not too many. Work out every single day, and 100% of you will get into shape. I know it, 100%. When? I don't know, and neither does any doctor.
01:06:07 - And that's my work. Yes, I can profoundly help you find purpose with Start with Why. I can help you build trust on a team with Leaders Eat Last. I can help you embrace the infinite mindset and have this incredible calm in life. You want to innovate? The Infinite Game is a book for innovation right here. But the reason most companies won't do it is because they need it to happen by the end of the quarter or the end of the financial year. It may or may not; I have no clue, and I cannot predict that it will or won't. It'll work 100%; I just don't know when. And the problem goes right back to the beginning of this conversation: we're all so obsessed with the output, so obsessed with the result, that we've completely ignored the value of the journey. People would rather hit the number at the end of the year than build a good, strong company. Think about it. If I meet another entrepreneur and say, "So, tell me about your company," it's, "We're hypergrowth. We're a hypergrowth company. We're a gazelle." Show me. Then I always say the same thing, because I'm an ass: "Can you please give me one article from a reputable publication, just one, and I don't care which publication, that says that building a hypergrowth company is good for business. Just one." And the answer is: you can't, because it's not. So why are we so obsessed with high-speed growth? It's because our investors want us to be obsessed with high-speed growth, or our egos want us to be obsessed with high-speed growth.
01:07:29 - And sometimes high-speed growth happens by accident; that's what happened for many of the unicorns. It was an accident. I think building a good company is better than building a fast company. I think building a good relationship is better than building a fast relationship. And we're all so obsessed with speed and immediate results, present company included. I've had to learn this the hard way. I think there's something joyful and beautiful about slowing down, saying thank you, re-prioritizing friends. It's okay. By the way, before cell phones, we went to work, and then we went home, and we didn't do work at home, because we couldn't do work at home. We didn't have computers at home, and we didn't have cell phones for people to call us. So you did work at work, and then you left. And I remember when cell phones started, I remember the advertising. AT&T had this campaign that showed people working on the beach with their computers. This is the future, right? And what was the tagline? It was really funny. It was something like, "Now you can leave work."

01:08:45 - And that's not what happened. Work came with us. We never left work. Work came with us wherever we went. This is the problem with cell phones and computers and the internet: we do not leave work. Work is with us in our pocket every day, every vacation, every evening, every weekend.
01:09:03 - As you said that, I thought: that's so interesting, because it kind of means that friction creates freedom, in that regard, if you know what I'm saying. The friction of not being able to go home and tap away on my computer all night meant that I had a certain level of freedom, because I had to wait for things. In the same context, with AI coming along, now I can build software throughout the weekend, whether the agency is working or not. Now I can build anything I want at any time using the phone in my pocket. The friction has come down again, and therefore the pressure to do it now, because I can do it now, goes up. And this is maybe what I was alluding to earlier on, when I was saying how you're going to get more stressed and more overwhelmed and more isolated.

01:09:40 - That's so good. That's so right. We're taking work with us. Everybody has a story of the time the battery ran out on their phone and they ended up having the best night, or the internet went out at work and they actually got more done. You hear these stories all the time. When we're forced, forced, not to take work with us, it takes a bit of time sometimes to relax, but it's always better, because you learn to stop worrying, stop checking. One of the things I did: I can't delete Instagram completely, as much as I'd love to, but I hid it. You can do that on iPhone; you can take it off the home screen. It's gone, it's hidden. It says "hide app."
01:10:31 - And then, when you go into the search suggestions, I took it off the suggestions too, which most people don't even know you can do. Because I realized what I was doing: when I'm bored, I just pick up my phone, I see Instagram, and I click it like a zombie, and then I'm done for an hour. So I hid it. Now the only time I go to Instagram is when I have to go to it, and I have to type in "Inst" and then it pops up. And my usage of Instagram has plummeted, plummeted, because it has to be intentional. The problem with most social media is that it's unintentional.

01:11:05 - One of the big things, talking about community, Instagram, AI, that I think a lot about is: is the value of in-real-life community going to rise? I think we might be on the precipice of the community revolution. I say this because I think about what's going to remain in a world where the cost of creating things goes to zero. Once upon a time, if you made a social network, or you built an app, or you made a movie or a media company or a podcast, that was half of the job, and the other half of the job was getting it out there in the world. So now the people that have the other half of the job, which hasn't gone to zero, have this tremendous advantage, because we can all create, but we can't all distribute. And so having community, and building and fostering community, is now, I think, one of the things that remains.
01:11:55 - What are the values of a community? How do I build a community? Well, I have a definition of community, and I think we said it before: community is a group of people who agree to grow together. Interesting. Community is a group of people who agree to grow together, and I believe friends are at least two people who agree to grow together.

01:12:10 - In marketing, this is the absolute obsession at the moment: community, run clubs, brands going offline. Offline is the new online, right? Offline's the new black.

01:12:21 - There's a company called Clicks, started by a talented entrepreneur. She was in college, and she suffered severe depression and loneliness, and literally struggled, didn't know how to make friends. I'll just call her by her name: her name's Alex. She started Clicks to solve her own problem, my favorite kind of company, and basically to help her "friends" at school, quote unquote, people at school, make friends. She did it by taking people offline and taking their phones away. And she would come up with reasons to come together, whether it was running or this or that. It didn't matter what the reason was, just a reason to come, so that when you saw the ad hung up on a piece of paper at college, you'd think, "I like horror movies. I like baking. I'll go to that." Right? Just a trick.
01:13:14 - And so she's built this business, predominantly for young people, though it's available to anyone, to just bring people together, to meet people, without phones, offline. And there's a great irony in it: she has an app that she wants people not to be on.

01:13:33 - And what is it that holds the community together? Is it that we want to improve together, so we have to have some kind of shared value that we want to improve on? Presumably it's like a community of runners: they're trying to get fit together, or they're trying to improve.

01:13:46 - I don't think it matters. Shared interests are a way to start a relationship, right? Comic-Con, Nerdvana, Burning Man, all of these larger-than-life events, big and small. Going to the football: everybody's your friend when you wear the right colors. You see people on the tube who are going to the same game, and they're wearing the same jersey: you're friends. Right? So I think common interest is a trick. It's a way of getting people to come together. And it's a nice place to start, because at least conversation is easy: you know at least one thing about this person that you have in common with them.

01:14:28 - Do you need a shared goal in a community, do you think? I'm asking because in the football analogy, we have a shared goal. Oh yeah: you want to win.
01:14:36 - I think that's a very good question. Do you have a shared goal even in a relationship? Yeah, I think so. That seems to make sense.

01:14:41 - The reason I'm asking these questions is that I'm building businesses and brands, and I know that community is one of the most important things that everyone building a brand or business is thinking about at the moment. There's a big difference between having an audience, which is what you might have on a podcast or something, and having a community. As a brand leader and as an entrepreneur, I'm trying to shift from having an audience to having a community. And that's about relationships and shared values.

01:15:08 - Shared values. I like to think the people who really like my work, not the ones who just passively like it, but the ones who really like it... if somebody says, "I love Start with Why," or, "I live my life by The Infinite Game," it says something about who they are and how they see the world, and that we share the same values. Because, you know me, I'm an idealist, and I'm consistent in the way I talk about things, from the day I started to now, and I won't ever change. And I think that's the value of values. The problem with the modern world we live in, and the pressures that people face, is money and fame and all the rest of it, influencer status. I think it sometimes forces us to question our values or walk away from them.
01:16:04 - I was invited to a group, they called it a mastermind group, where a bunch of folks like me, author-speakers, wanted to, this is what they told me, come together based on shared goals and shared values, on how we could work together to improve the world together. And I'm like, oh, sign me up, I'm in. So I went, and, you know who some of these people are, we sat around a room, and they spent all of their time talking about how they could share their lists with each other, how they could cross-promote with each other, and "What margins are you getting on this?" And I spoke up. I said, "You guys are just talking about making more money. What? I thought we came here to do common good together." They didn't invite me back. But the point is, the money and the fame are seductive. This is Joseph Campbell's hero's journey: you start off as the reluctant hero, you get called to your mission for some reason, and at some point in your journey something will seduce you away from your mission, money or power. Luke Skywalker was nearly seduced away from the journey. This is what the hero has to go through. And do you have the friends who will slap you around and keep you in line and say, "No, you signed up for this. You have to stay true to what you're doing"? Because I think none of us have the courage or the strength to stay true to our cause by ourselves. Very few of us.
01:17:31 - We need to have at least one person who believes in us to give us the strength to stick to it, because the temptation... you and I have both, at various times, gone through it: when you start making money, and making money from something you didn't expect. I never expected to have a career from any of this stuff; it happened by accident. And you get to the point where you start thinking you're more important than you are. "Do they know who I am?" No. No. Get over yourself. And you start becoming seduced. And I think about the seduction when we turn on social media and watch the Kardashian model: you have people who pursue influence without knowledge or skill, those two buckets. They have the other three buckets, but they don't have the first two. And it is temporary for some, or unfulfilling for others. And you know the funny thing about that job? I went to a concert, and there was this woman who walked down the aisle, and some guy was taking pictures of her, and she was posing, and the person next to me goes, "She's a famous influencer." And I said, "You mean she's a freelance employee of an algorithm?" And somebody who chooses to be an influencer, that's what you are: you are a freelance employee of an algorithm. And the minute they change the algorithm, you might be out of business.

01:19:04 - You just lost like 170 followers. I mean, maybe you just lost like 170 blue ticks.
01:19:10 - But I don't have a problem with the concept of being an influencer if you bring something of value. The only time I have a problem with it is if you make it about you: "I have to get more of everything." In the early days of influencing, there was a young couple trying to get followers, and they would do crazier and crazier things. What they did is he took a big book, a dictionary I think it was, and put it over his chest, and she took a gun and shot at the book, thinking the book would stop the bullet. And she killed him. She killed him on camera. She admitted afterwards, he's dead, that they were doing bigger and crazier stunts to get more and more followers, because the financial pressure of driving those YouTube views was overwhelming. It's an extreme case, obviously, of how we will lose our minds trying to gain followers. And I don't mind if people gain followers by giving, but to gain followers by taking: "Look at me, look at me," versus, "I have something that I think will help you." I don't have a problem with the what; it's the input, right? It's the intention. Anyway, I sound like an old man complaining: "You kids these days and your influencers."

01:20:34 - I printed off some graphs which I think are absolutely fascinating and dovetail into everything you've said today, but also into your work generally.
01:20:40 - I'm just going to give you all of these, and give you a second to look through them, and I want to get your read and interpretation of them.

01:20:49 - "Adults' lack of meaning and purpose. Overall: lonely." Okay, that makes sense.

01:20:56 - To me, that didn't make sense. I couldn't understand why, if someone is low in purpose, they are lonelier.

01:21:05 - It just makes sense that when you lack purpose, loneliness is likely to come nearby. Family gives meaning, friends give meaning; you feel like you're there for someone else. But if you don't know why you're showing up every day, it feels like a lonely existence. You feel like you're searching. When you have a sense of purpose, it gives you a mechanism to meet people, a mechanism to make decisions, a mechanism to talk about yourself in a way that's quite inspiring to others. When somebody says, "So, what do you do?" and you answer with your why: "I wake up every single day to inspire people to do what inspires them. What do you do?" And I think it's not the thing that makes you less lonely; it's a mechanism to make you less lonely.

01:21:48 - Just as you said that, the why you just gave becomes a magnet for me, because then I know who your people are, and I know what to talk to you about, and it throws down this bridge for me to walk across. Versus, "I'm a dentist," or, "I don't know," which is even worse.

01:22:05 - Which is even worse. Yeah. So yeah, that doesn't seem unusual.
01:22:07 - Distribution of people feeling lonely worldwide, by gender. So it's about 50/50. Are you surprised by that?

01:22:16 - For some reason I thought men were more lonely. "Not at all lonely": looks like women do better, I would think.

01:22:26 - You thought men were more lonely?

01:22:28 - I thought men would be more lonely. I also don't know what the age groups are; I want to know what age they start at. Is it 18? I guess if we just put that all aside, women being slightly less lonely than men makes sense, because I think women are probably better at making friends. Maybe I have that wrong. What else have we got here?

01:22:49 - Adults' mental health ratings: 68% of people who have depression are lonely, and 67% of people who have anxiety are lonely. Do you think good mental health is a foundation for not being lonely, I guess?

01:23:03 - No, I think having somebody who cares about you is a foundation for not being lonely. And when you're in a period of strained mental fitness, it definitely can manifest as loneliness and anxiety. And the best way to manage through that period is to lean on the people who you know love you. That will help. So I don't think you can separate the two. Does it affect it? Of course it affects it. Can it make you withdraw from your friends? Of course it can. So I don't think you can separate the two.
01:23:39 - I read, a couple of years ago, that when people are lonely, they fall into a state of self-preservation, because of some evolutionary mechanism: if we were alone on the Serengeti in Africa, we would sleep worse, we'd become more selfish, we'd become more angry, and our cortisol levels would be up, which means more inflammation. This idea of self-preservation basically means that lonely people become more selfish, more bitter, more angry. And that's not conducive, ironically, to finding more friends, but it is conducive to survival. It is conducive to being able to fend for yourself.

01:24:11 - It's more conducive to survival as an individual. If you're on a deserted island and you're lonely and all those things, the cortisol rises and you become better at being on edge, which is a survival mechanism. But if you live in a community with other people, then it's counterproductive, because we are social animals, and I need you to help me and you need me to help you. If we live in a tribe together and I'm the selfish [ __ ], you're not going to wake me and alert me to danger tonight. You're just going to leave me. So, though I think it's 100% true, in a social environment it becomes counterproductive. And I think you're right: when somebody feels lonely, they do go into survival mode; they become paranoid and think everybody's out to get them. But the problem is, you don't live on a desert island. And that's one of the reasons I think it's dangerous.
01:25:02 - We're increasingly living on islands. The islands are different now: they're four walls in a city. And it's so funny that so many of my friends are using this word, loneliness, when 10 or 20 years ago it wasn't something I heard frequently from adults; now it seems to be so common. In fact, my masseuse yesterday was saying to me that she's lonely. This is a woman who lives in Los Angeles, has people around her, but she's lonely. We had a conversation, and I texted her and said, oh, thank you for being so open, and I was thinking about what advice I could give her. She's got no friends. She's in Los Angeles. She's desperate for friends. She doesn't have a partner anymore, because she's had a divorce. What advice do you give someone? Is it: take more risks?

01:25:49 - It's easy to give the obvious advice, like, "You've got to put yourself out there," right? But when you're in a period of loneliness and stress, it's hard to find that energy and that risk-taking, especially when you're down on yourself. We've all been there. I've definitely been there.
But I think what I've learned is to put myself second, meaning don't try to solve my own problem, but ask: do I have somebody else who's lonely that I can help? How do I help somebody else who's dealing with their loneliness? It's the act of service that is so valuable. If you two are talking, and she has many clients, her being willing to admit "I'm lonely" means one of her clients might say, "You know, me too." Her ability to admit her loneliness out loud creates a safe environment for somebody else to admit theirs. And then once she learns that they're lonely, now she can say, "Tell me about it." And then she can talk them through it and hold space for them. And her ability to help somebody else manage or understand their own loneliness will help her.

So interesting, because one of the hallmarks of her personality, if you met this person: something she says to me every single time she comes over for the treatment is that she doesn't like sharing herself with people, because she thinks if people get to know her, they'll change. She has literally been my masseuse for many years now. If people get to know her, they will change. She's scared of people knowing her. She's scared of people knowing the details of her life. So she guards it all. And I'm someone who asks way too many questions of everyone I meet. It took me like 20 sessions just to figure out she had a family. I had no idea she had two kids. And when I asked her, her body language went like this.
"Yes, I have kids." And I was like, why are you so guarded with your information? "Well, if people know who I really am and they know more about me, then they'll change, or they may not like me anymore, or they'll think differently about me. So I just keep it to myself."

But I mean, this is the age-old problem. It's everything you said about people who are driven only by the thing that they want, versus the givers. This is the age-old problem, which is: I'm going to keep my walls up high. I never want to love anybody, because then I'll get hurt. And then you just end up lonely and hurt. And this goes back to struggle. Look, you and I both know that if somebody chooses to go on an entrepreneurial venture, the statistics bear out that over 90% of all new businesses will fail in the first three years. What idiot would ever start a business, right? You have to be compelled by something else to make you do something with an overwhelming statistical chance of failure. And it's opening yourself up to failure that is the thing that makes it work, at least for a small percentage, of course. But it's the same for love, or a relationship, or friendship: it absolutely comes with risk. There's nothing risk-free. Yeah, some of them might change. Yes, some of them might not like you for that very reason. Yes. And the odds are equal that somebody will fall in love with you and love you and like you and think you're the best thing
and think you're amazing. And so if you can't take the risk, you can't get the reward. If you don't play the lottery, you don't win the jackpot. I'm not telling you to open up and tell everybody your deepest, darkest secrets. But if you're not willing to give somebody anything to latch on to, if you can't give them anything to say "me too" to, then it's going to be a hard road, or it'll take somebody chipping away at you 20 times before you open up. But the fact that you did, and you got her to open up a little bit, means she could make a friend in you.

Now she's fully open, and we've been through everything. I know where she wants her house to be.

Well, there you go. You see? So you have to point out to her that this friendship, and the fact that you know so much and that she feels so safe with you, only happened because you chipped away at her; you finally wore her down and she opened up. And it can happen in two directions, right? Sometimes we have to take the little risk to just tell somebody a little something about ourselves to find something. Or sometimes it's somebody who's so curious about us that they chip away and chip away and chip away until we give up and open up, and that person becomes a friend. And so one of the reasons you should be grateful for the friendship is that you kept trying. You could have just tried three times and thought, forget it, I'll just lie here quietly and have a massage.
So yeah, all of this stuff comes with risk.

I could see it in her face, though. I could see this.

But you're also curious. Yeah, she lucked out, because you're curious about people.

Is this what connection is? When you said "me too", finding something to say "me too" with someone, is that the essence of what connection is?

It could be values, it could be interests, it could be many things. But I think, yes. I mean, if you want to go political, one of the problems in our politics is that both sides think they're right and both sides think the other side is wrong. Mhm. And neither side is willing to even try to understand what the other person's point of view is without thinking them an idiot, or thinking them wrong, or thinking them sheeple. And the ability to say, "Can you tell me more about how you came to your beliefs? I'm genuinely curious," and to not agree or disagree, but to be curious, to be an investigator... I have done this with people who have polar opposite political views to mine. At some point, through me just chipping away, trying to understand, they will say something that makes me go, 100%, I agree with that.
Mhm. And from that point on, there's a simple validation, not agreement but a validation, that your opinions do matter and that we do have common ground we can build from. And on that validation, they become vastly more open to my point of view, and curious about my point of view, also. The ability to listen and be curious, even when you disagree, is one of the most underappreciated skills on the planet. And I think when you find the ability to say "I agree," then you can find community. And maybe that's what we need to heal this country, or most of the countries in the world right now: one group or one person to just be curious rather than agree or disagree.

There's something about relationships here as well, because, you've met my partner, she's very spiritual, but also, like all of us, she's emotional. So sometimes in those interactions when we sit down and there's a problem, we spend the first 60 to 90 minutes trying to figure out what the problem is. It doesn't sound very logical to me. And one of the things that I think is adjacent to what you're saying is that my job in that moment actually isn't to parse through to the truth and to correct it. It is to sit and let someone get it out.

Get it out. So you and I have the same problem.

We do.

Which is that we live above our necks.

Yeah.

And you and I both have the same struggle, which is to learn to live below our necks, right?
You and I are head people, and we have to learn to be more heart, right? I want to understand everything, explain everything, evaluate everything, analyze everything. And when I come to my emotions, I'll give you an analysis. When it comes to your emotions, I'll give you an analysis. And so I have to learn to just say, "I feel this," and that's it.

And I feel the same way. It's funny, because I'm good at it with art. I always tell people, you don't have to understand art. I'll take you to a museum, I'll take you to a gallery, and I know you're intimidated to come because you don't understand art, or you don't understand music, whatever it is. And all you need to do, this is the only rule, is answer: do you like it? "I do." Why? "I don't know. I just like it." That's it. You're done. "I don't like it." Why? "I don't know. I just don't like it." Great. Excellent. And now we're going to look at four, five, six, ten pieces that you like, and I'm going to see four, five, six, ten pieces that you don't like. And I'll be like, you like all of the Impressionists and you hate all the Old Masters, so why don't we just go see more Impressionists? And so when your girlfriend opens up to you, you don't have to agree. You don't have to disagree. You and I both have to learn to stop thinking and to just feel something. When somebody says, "Where do you feel it?" I'm like, "What? That's the stupidest question I've ever heard. Where do I feel it?"
And I've literally had somebody take me on this journey, and she goes, "You told me an emotion you have, Simon. You told me an emotion." Yes. "Where do you feel it in your body? Where do you feel it?" And I'm sitting there going, "You've got to be kidding me." But all right, we'll go through this. And she says, "So you said you feel X." I'm like, "Yes." "All right, where in your body did you have some sort of reaction?" "Here." "What happened?" "It got a little tighter in my chest." Okay. And I went on this journey with my friend Bea. She's incredible. She does repair work with couples, and she's just incredible. And she helped me recognize that my body does react to my feelings: tension in my shoulders, heavier breathing in my chest, clenching my fists. My body does react. And so I'm learning to experience feelings beneath my neck, when my inclination is to do everything above the neck. It's a hard skill.

Oh my god, it's hard. Especially for men. And so when you meet spiritual people, they are all below the neck, and we think it's stupid, but they recognize that we have an inability to feel because we want to think about everything. And so we both have a lot to learn from each other: thinking about things has value, but not all the time. And so your girlfriend, because I know her, her ability to know her body when she has feelings has, I think, a lot of value to you and to me.
I think it's like the biggest opportunity I have to form relationships. Yeah.

"Change in the number of postings for automation-prone jobs compared to manual-intensive jobs." So this is a graph that shows the drop in online job postings for jobs prone to automation.

Yeah, knowledge work is going to get hammered by AI.

I met a guy in Japan who is one of the very few guys left, maybe a hundred, 150 of them, who makes samurai swords the traditional way. He hammers the metal, folds his own metal, and makes samurai swords the way they made them 200 years ago. And we went to his little workshop, and he's this little old guy, and he's telling us the story. How did you get into this? He's like, "Well, I had a desk job. I worked in a company, and I woke up one day and thought, I can't do this. This can't be my life. And I've always been enamored with sword making. I found an apprenticeship and tried to learn, and, you know, I'm just starting to get the hang of it, but I have a lot of room for improvement." And we're like, "How long have you been doing this?" He said, "30 years." 30 years! But that's so Japanese: "I'll never be good at this. I'll just keep working at it." And it's such hard work. But I think there's something to be said for it. Think about how many white-collar knowledge workers there are. Ask them what their hobbies are. And maybe there is an opportunity to learn a skill.
I mean, look at lockdown, when we all went through lockdown. What skill did you practice during lockdown? What did you learn?

DJing, running, cycling.

DJing?

DJing.

DJing, right? I did kintsugi, the Japanese art of fixing broken things with gold. Basically, you take a broken plate or a broken cup, and with gold paint or gold epoxy you put it back together. The whole concept is that you can make something more beautiful; things can get more beautiful after they're broken, right? I love the philosophy of it. I spent hours doing freaking kintsugi. Ask me how much kintsugi I've done since we came out of lockdown.

How much DJing have you done? Zero.

And so maybe all this free time that we're supposed to get, which I still don't think will happen, maybe it's not such a bad thing. Think about the joy you had DJing. I had so much fun doing kintsugi. I would spend hours, just head down, focused. I was so relaxed, not thinking about work. So that's the other question: we keep talking about how we're all going to have all this spare time. We've lost all of our spare time. We have no spare time, because all we do is work and think about work and talk about work. If we go back 40 or 50 years, there was plenty of spare time, and people went bowling, and they went to the movies, and they went over to each other's houses for dinner, and they cooked. Isn't it a good thing that we're getting time back? A lot of us got a lot of good things out of lockdown, and that's an extreme example.
So maybe it's okay that we get some time back, because it sounds like we've lost all of our own time. It'd fix the loneliness. You'd have space for relationships. You'd have space to try things, and we couldn't bury ourselves in work and escape from lives that we hate by just burying ourselves in work. Maybe AI doing some of the work for us is the thing we've been desperate for for the past 30 years. Just saying.

This is quite embarrassing for me to admit, but if you know me well, there's something that you know about me which is a function of my personality, and that is that I lose everything. I've lost my wallet. I've lost multiple passports. I now actually have two passports because there's a high probability of me losing one. And when you lose your wallet overseas, as I have many times, it's a particular inconvenience, because you also lose your driving licence and your credit cards and those kinds of things. And that's why I'm so happy that our show sponsor Extra have invented this thing I have in my hand. They have partnered with Apple's Find My technology, so all I have to do is open their app and play a noise, and it will tell me where my wallet is. Extra wallets are slim, they're made of recycled aluminium, and they block various types of digital theft. And with one click, your cards pop up and are ready to go. So if this sounds like the kind of thing you or a friend needs, visit extra.com. You can use code Steven and you'll get 10% off plus free shipping and a 100-day trial.
That's extra.com, and use code Steven at checkout.

For anybody that likes matcha, for anybody that likes lattes, one of my companies has just launched canned matcha lattes. I was speaking with the founder, Marissa, and she said that creating this product has been no easy feat. They tried launching in 2021, but as is often the case in business, the development process turned out to be extremely complex. So they've spent the last four years testing and refining every single detail to create this: a PerfectTed matcha vanilla latte and a PerfectTed matcha strawberry latte. What we have here in these cans is barista-quality matcha straight from the can, and it tastes like it's just been made at your favorite cafe. Naturally sweet and naturally creamy, in a can. And the reason why I've invested in this company, and why I drink matcha, is that matcha as an energy source gives me lasting energy without the big crashes that I get from other products. Grab their ready-to-drink canned matcha lattes at Waitrose, Tesco and Holland & Barrett, and at perfectted.com, where you can use code Steven40 for 40% off your first order.

What are you struggling with in this season of life? I think I've asked you this in every conversation we've had, and the answer has evolved from conversation to conversation.

Yeah, we'll both go. I'm doing so much right now, and by the way, it's an absolute joy, and I wish I had more time in the day. But not in the sense that I'm deprioritizing my friends.
You talked about how you would give them all your money to secure them, but you don't give them any time. I've actually gotten very good at this. It's probably that when you write a book about friendship, you sort of take yourself to task a little bit. I will schedule friends like meetings in the middle of the day. Haven't seen a friend for a while? 4:00 in the afternoon: tea with friend. And I leave the office like I'm going to a meeting, because nobody says, "Hey, Simon, where are you going?" Going to a meeting. "Where are you having the meeting?" At that restaurant. People are like, "Have fun." Nobody will give me any kind of grief if I'm leaving to go to a meeting. Now I go and see a friend, and when somebody says, "Can I cancel this tea with your friend, because you have a phone call you have to take?" I'm like, "No. Just like you wouldn't cancel my meeting, don't cancel this." So I put it in the calendar, and I give them equal weighting with the things that I have to do at work, because I've become much more discerning: do I have to have this meeting or this phone call today, or can I do it next week? I could do it next week. And you realize we add fake urgency to things.

And I have a friend who's kind of amazing. He's way busier than me, and he says, "I don't want to spend time with just anybody." I saw something that was really funny. He came out of a meeting with a very fancy person who's courting him for business, right? Or actually, he should have been courting the other guy.
A very fancy person. And he comes out of the meeting, and I'm like, "How was it?" He's like, "Total waste of time." I'm like, "You don't want to do business? He doesn't want to work with you?" "No, he wants to work with me. He wants to work with me badly. Total waste of time." I'm like, "What are you talking about?" He's like, "He's just some rich [ __ ]. I learned nothing. I didn't enjoy it, and I don't want to work with him." And I was like, "Tell me more." He's like, "I only want to work with people who, when I spend time with them, teach me something, make me laugh, or leave me feeling inspired when I walk away. And if somebody sucks my energy, I do not want to work with them." And I was like, that's amazing, to have that level of confidence that you would turn down money in your business. And now, what if we applied that, as much as possible, in our lives? What if we stopped hanging out with friends who just sucked our energy? Are they really friends? And instead spent more time with people we enjoy. What if we only took meetings with people we're excited to meet, not ones where we just see dollar signs and opportunity? And what if we only partnered with companies where we really like the people who work there? You don't have to have a relationship with the CEO, but: they've got a good culture, I really like working with them, and maybe we make a little less money with them versus that other company, but I really like them, because when things go wrong, I want to call that person. And people ask me this all the time.
They say, "How do I choose a publisher?" And I've seen this mistake a thousand times. I've seen people, like former CEOs, who say, "Now I'm going to write my book." I'm like, "Great. You've got a lot to say." And they've got multiple offers, and there's a bidding war. And they always ask me, "Which one should I go with? Which publisher should I go with?" And I always say the same thing: choose the one you're going to fight with best. Choose the publisher you're going to fight with best. They're like, "What?" I'm like, "The goal is to make a good book. There's going to be creative tension. There are going to be fights. Choose the publisher who believes in your idea, believes in you, wants to make a great book, and who you're going to fight with really well." Every single one of them ignores me. Every single one of them takes the biggest bid. And every single one of them has written a shitty book that didn't sell.

Isn't that just relationship advice as well, though?

Probably, yeah. Choose the person you're going to fight with best. And think of the number of times that we forget about the quality of the product and just want the money. It's the same for job interviews. This is especially true for young people: when you're interviewing for a relatively junior job, entry level or slightly above, don't worry about how much they're going to pay you. As long as they pay you a livable wage, choose the job based on who you're going to work for. Choose the person.
Early on in my career, HR people would say, "What are you looking for?" And I'd always say the same thing: "The thing that I'm looking for is probably like looking for love, but I'm looking for a mentor." And every time I would evaluate a job, I didn't care how glamorous the brand was. I didn't care how much money they were going to pay me. And by the way, it's not like I had money. I knew they were going to pay me something. I knew I could pay my bills. I'm not a trust-fund baby; I needed an income. Mhm. But if one company offered me $5,000 more and another company offered me $5,000 less, but I really liked the person at the second one, I took that job. If I got one thing right as a young person, it's that I always chose jobs based on who I would work for, not how much they were going to pay me or what account or what brand I was going to work on. And that's the one thing I got right, because yes, I made less money than all my friends in the short term, but I got an education and a care and a love from somebody who took me under their wing. I learned leadership from Dennis Glennon. I learned leadership from Peter and Tomaggio. I learned leadership from Pamela Muffet. I learned leadership from these leaders who took a weird liking to me and took me under their wing. And they were exceptionally good leaders themselves.
And I got that education, and all of them in the early days paid me less than I could have got somewhere else. And it's the same: choose the people.

In the book that I wrote, the first chapter talks about this idea of five buckets. I'm trying to give young kids advice on how to prioritize their career, and it sounds somewhat similar to yours, so I'm going to throw this concept at you and see how it lands. The idea is that we all have these five buckets when we're starting our career. The first bucket is your knowledge; the second bucket is your skills. Now, these are the only buckets that no professional earthquake in your career can unfill. You can be fired, you can be cancelled, whatever; you still have your knowledge and skills. But the other three buckets can fluctuate. Bucket number three is your network: it can fluctuate. Bucket number four is your resources: they can fluctuate. And bucket number five is your reputation: it can fluctuate. And the idea is to focus on those first two buckets when you're young: choosing jobs based on how much they're going to fill those two buckets, and also what they're going to fill them with and how relevant that information is. And when knowledge is applied, it becomes a skill. So you focus on filling your knowledge, you apply that knowledge into skills, and that really is the essence of career longevity.
And as I noticed over time with people that I hired and then watched throughout their careers, it seems to me that life over the long term typically brings you back down, or up, to the level of your knowledge and skills. For example, one kid stepped down from my company because, at 21, he got a job offer to go be a CEO in America. And as he departed, I was thinking: he has not got the knowledge and skills to be a CEO. Within 24 months, the company had gone bust, and he was back down to doing the same job he'd done for me. And I thought, life just resets you to the level of your knowledge and skills over time. So what do you think of this?

I think we're saying the exact same thing. You said it more eloquently than me. I think it's 100% right. And the people who will fill those buckets with knowledge and skills are the people who, again, for some reason, are good people. They're good leaders. They take a liking to you, and they will put you in situations and let you screw up and fix your own problems, fix your own mistakes. That's what my bosses did for me. They rewarded my behavior when I got it right, as opposed to my results. I never got rewarded for my results. I always got rewarded for my behavior. So if I showed initiative, I got a reward even if the results didn't follow, right? And I can tell you a great story on that. And if I got things wrong, they would say, "Hm. Wow. Really screwed that one up, didn't you? Okay, what are you going to do?"
And 01:50:32 they weren't happy with me, 01:50:36 but they let me fix my own messes, and 01:50:38 they stood by, and I knew that they were 01:50:39 there if I needed them. You know, Peter 01:50:41 Intermaggio, one of the best leaders I 01:50:43 ever worked for, annoyingly never 01:50:44 answered a single question I asked. "Hey, 01:50:46 Peter, what should I do?" "I don't know. 01:50:48 What do you think we should do?" "Well, I 01:50:49 think we should do this." "Okay, 01:50:50 so go do that." You know, "What do you 01:50:52 think, Peter? I'm asking you because I 01:50:55 don't know. That's why I'm asking you." 01:50:57 He's like, "Well, you should probably go 01:50:59 think about that." It was the 01:51:01 worst. But what he taught me was 01:51:03 self-reliance. What he taught me was: 01:51:07 if I don't know, then I have to go keep 01:51:09 searching and keep talking to people. I 01:51:11 have to have a point of view. Mhm. And it 01:51:13 may not be right, and I'll find that out 01:51:16 myself too. And I also learned to have 01:51:18 backup plans from him, because something 01:51:19 went haywire on me and it went so wrong. 01:51:22 It ended up being okay, but he made 01:51:25 me sweat it. And I remember, at the end 01:51:27 of the day, at the end of this whole 01:51:29 thing that went wrong, that was totally 01:51:30 my fault, the phone rings. 6 01:51:34 p.m., the phone rings. I see his name come up 01:51:36 on the caller ID. And I pick up the 01:51:37 phone. I remember the whole call word 01:51:38 for word. Right? Pick up the phone. I 01:51:41 go, "Hello." He says, "Close call 01:51:44 today." I said, "Yep." He said, "Better 01:51:47 to get shaved by the bullet than hit by 01:51:49 the bullet." I went, "Yep." He said, 01:51:52 "Have a good night." I said, "You too." 01:51:55 That was the whole phone call. And 01:51:58 since that experience, I 01:52:00 always have a backup plan.
Now, it 01:52:01 doesn't mean it's all ready to go, but 01:52:02 I've thought about: if this thing that 01:52:04 I'm trying to do doesn't work, what will 01:52:06 I do? And I have at least committed some 01:52:09 thought to it. So if something does 01:52:10 go wrong, I'm a little bit 01:52:12 ahead. I either have it planned, or I'm 01:52:14 like, okay, don't worry — 01:52:16 if this happened, I've thought about this 01:52:17 already. And it's only because of these 01:52:19 great leaders. And this goes to the 01:52:22 recurring theme of this whole 01:52:23 conversation, which is: it's the 01:52:24 struggle. It's the journey, not the 01:52:27 destination. It's the human beings that 01:52:29 guide us, the human beings that 01:52:30 hold space for us, that make us better 01:52:33 at what we do and better at how we 01:52:35 show up in the world. And AI will 01:52:37 absolutely make our lives easier, like 01:52:38 most technology makes our lives easier. 01:52:41 That's kind of the role of 01:52:42 technology, which is to make life a 01:52:44 little easier, a little more efficient, 01:52:46 a little quicker, a little less 01:52:48 strain on the muscles. You know, 01:52:53 from the plow all 01:52:56 the way up to the internet and AI, it 01:52:58 just makes life a little 01:53:00 easier. But we're still human beings who 01:53:02 are forced to live with human beings. 01:53:03 You're writing a book about friendship 01:53:05 that we're all waiting for. Yeah. I'm 01:53:07 not going to ask you when it's due, because I 01:53:09 don't want to be your publisher — 01:53:10 I know that they chase and chase 01:53:12 and chase. But I guess my 01:53:15 closing question is: why?
Of all the 01:53:16 things you could have written about, 01:53:17 Simon — you're someone who's able to 01:53:19 traverse several subject matters 01:53:20 across business and life and everything 01:53:22 in between — but you've committed 01:53:23 yourself to the struggle of writing a 01:53:27 book about friendship. Mhm. There's an 01:53:30 entire industry to help us be better 01:53:32 leaders, right? There's an entire 01:53:34 industry to help you have a successful 01:53:36 relationship or successful marriage, or 01:53:38 even find a partner, right? Industries, 01:53:41 books, companies. 01:53:43 There's very little on how to be a 01:53:45 friend. And if you think about it: if you're 01:53:48 going to have a successful career and 01:53:50 survive the stresses of a 01:53:53 career, and if you're going to have a 01:53:54 successful romantic relationship or 01:53:56 marriage and survive that, do you know 01:53:58 what you need in both of those 01:54:00 circumstances? Friends. Because when 01:54:02 your marriage is falling apart, you go 01:54:04 to a friend. When your job is falling 01:54:05 apart, you go to a friend. And 01:54:08 there's a few things that I've 01:54:09 discovered about friends that I find 01:54:12 delightful. And as I mentioned 01:54:18 before, I've been reorganizing my 01:54:19 life to ensure that my friends aren't 01:54:22 taken for granted, and that sometimes I 01:54:24 do deprioritize work in order to see 01:54:26 my friends and spend time with my 01:54:27 friends, because I know it feels good to 01:54:29 them and I know it feels good to me, and 01:54:32 I know that the only reason I can get 01:54:35 through any work stress or personal 01:54:36 stress I have is because of those 01:54:38 magical human beings, and I will not take 01:54:39 them for granted. That doesn't mean 01:54:42 it's always easy. I'm conflicted often, 01:54:47 but I'm trying.
And I also know, if you 01:54:49 look at the world today, 01:54:50 there's so much conversation about 01:54:52 loneliness, depression, anxiety, 01:54:53 inability to cope with stress, even the 01:54:55 obsession with longevity. There's 01:54:57 so much about these subjects, and some 01:55:00 people treat it with drugs, or with 01:55:01 meditation and vacations — 01:55:03 there's so many theories. The one thing 01:55:05 that fixes all of those things is 01:55:08 friends. Friendship is the ultimate 01:55:10 biohack. You know, we've talked about 01:55:12 this before. I fundamentally believe 01:55:14 that. And if it's so valuable — like, if you 01:55:17 know vegetables are good for 01:55:18 you, you eat more vegetables. If you 01:55:21 know exercise is good for you, you do 01:55:23 more exercise. So if I say friends are 01:55:24 good for you, shouldn't you do more 01:55:26 friendship, right? 01:55:29 Exercise is not fun or easy, and you have 01:55:31 to get over a hump sometimes. Eating 01:55:34 vegetables can sometimes be boring and 01:55:36 unsatisfying, but you do it and you find 01:55:38 new ways. And so maybe friendship is not 01:55:41 always easy or fun, but it's still 01:55:44 really, really good for you. And the 01:55:46 best thing about friends is it actually 01:55:47 doesn't taste like spinach. You 01:55:49 get the benefits of spinach, but it 01:55:51 tastes like chocolate cake. If you get 01:55:52 friendship right, it's the 01:55:55 healthiest thing in the world. How do I 01:55:56 know if someone's a friend? 01:55:59 Have that conversation with them. Because 01:56:01 I know a lot of people. Yeah. I 01:56:03 mean, you have deal friends — 01:56:04 you know, you have a lot of deal 01:56:06 friends, podcast friends. 01:56:07 Podcast friends. Yeah.
Work friends, you 01:56:10 know. And look, 01:56:12 there's all kinds 01:56:14 of friends. There's friends that you 01:56:14 just like hanging out with. They're just 01:56:16 fun, but you're not going to go to 01:56:17 them with your problems or to sort out 01:56:19 issues — not 01:56:20 work or personal. They're just fun, 01:56:24 right? There's nothing wrong with that. 01:56:26 But I think, you know, I'm certainly 01:56:28 guilty of spending time with people 01:56:30 that fill a space and 01:56:32 make me feel not lonely, but at the 01:56:33 end of the day, I don't feel 01:56:36 smarter, inspired, brighter, lighter 01:56:39 when I say goodbye. I'm just 01:56:41 like, "All right, bye. That was fun." 01:56:42 And I kind of want to spend more time 01:56:44 with people who lift me, teach me, 01:56:46 support me, love me, give me a 01:56:49 chance to serve — they open up to me and 01:56:51 let me serve as well. You know, I 01:56:53 struggled to make friends, and I 01:56:56 think — Do you know why? I 01:57:00 think 01:57:02 I'm lazy with it. Lazy with 01:57:06 friendships. So I will meet someone, 01:57:08 I'll have a great connection with them, 01:57:11 and then my follow-up — like, I don't really 01:57:14 know what to do next. So I'll meet — Your 01:57:16 follow-up is crap. Oh, [ __ ]. But this 01:57:19 is what I mean. I'll meet 01:57:21 someone, I'll text you, and maybe I'll 01:57:23 get a response. Vice versa. 01:57:26 I think you left me on read. 01:57:32 But are you the 01:57:32 same, or is it just with me? It's just 01:57:34 with you. But I 01:57:36 think my follow-up game is crap. 01:57:38 I'll meet someone. I'll go, "Oh my god, 01:57:39 I see the potential for a 01:57:41 friendship," and then I won't know what I 01:57:45 have. Or maybe I'm just being lazy.
I'm 01:57:46 trying to — I want to be honest. So 01:57:49 maybe I just don't prioritize it enough. 01:57:51 I think also, when you start having fame 01:57:52 and money and you're used to being the 01:57:55 boss, you get away with stuff. So you 01:57:59 show up late, everybody's pissed 01:58:01 off and angry, and then you show up and 01:58:03 you're like, "Sorry," and 01:58:04 they're like, "No, no, no, don't worry, don't worry." 01:58:05 Right? You get away with stuff in 01:58:07 the world. And I see this with celebrities 01:58:11 all the time, because everybody 01:58:13 yeses them to death and they get away 01:58:14 with it and nobody ever holds them 01:58:16 accountable. At some point they just get 01:58:18 lazier and lazier and lazier, because 01:58:19 they can. Oh yeah. Yeah. And 01:58:22 so they don't have to put in the effort, 01:58:23 because other people put in the effort. 01:58:24 Yeah. And somebody 01:58:27 who's a nobody won't say to them, "Yo, 01:58:31 [ __ ], not respectful. What, you think 01:58:34 my schedule doesn't matter? I've been 01:58:36 waiting here for 3 01:58:38 hours. Just because you're a celebrity, 01:58:40 you think you can just keep me waiting? 01:58:42 Not cool." Nobody says that to them. 01:58:44 Yeah. And somebody should say that to 01:58:45 them. They should say it to 01:58:48 themselves. That would be ideal. And 01:58:50 some of them have the self-awareness to 01:58:51 know they're getting away with it. 01:58:52 That's even worse, because then they're doing 01:58:53 it on purpose. At least with blindness, I 01:58:55 think, they can at least hide 01:58:56 behind ignorance. But yeah, 01:59:00 friendship takes effort. Yeah. There'll 01:59:03 be a lot of people listening now, I 01:59:04 know, that understand.
I think they'll 01:59:06 resonate with what I'm saying, which is: 01:59:08 I'll meet someone, I'll be like, we 01:59:09 could be really good friends. I see so 01:59:11 much in us. I love what you stand for. 01:59:13 We have so much in common. And then 01:59:15 it drifts, because neither party has the 01:59:18 tools or the skill of what to do 01:59:21 then. We also live in a strange world 01:59:23 where I've met people where I have all 01:59:25 of that and I follow up 01:59:26 immediately. I'm like, "Hey, I had such 01:59:28 a good time. Let's make a plan." And 01:59:29 they're like, "What? Huh?" Or I 01:59:30 call as opposed to texting, and 01:59:33 people are like, "What? Why are you 01:59:34 calling?" "Well, we had a nice 01:59:35 time. I thought we'd maybe talk, you 01:59:38 know." And I think we live in this 01:59:40 strange world where people put it out 01:59:42 there, but they don't really want it. 01:59:43 Looping us right back to the beginning 01:59:44 of this conversation: as you said that 01:59:46 about the call, it made me think again 01:59:48 about how there's going to be a premium 01:59:50 on being human, because calling is so archaic 01:59:54 to me that when someone does it, it's 01:59:56 like a treat now. And I was thinking, 01:59:58 what's taking that even further 02:00:00 would be writing someone a letter. 02:00:02 Do you know who wrote me a 02:00:03 letter? Evan from Snapchat came on the 02:00:05 podcast — the founder and CEO of 02:00:06 Snapchat — and by the time I'd got 02:00:08 back to London, there was a letter on my 02:00:10 desk from Evan, and it just said, "I had 02:00:11 a great conversation with you. Thank 02:00:13 you for being so thoughtful with the 02:00:15 questions. Thank you for the research. 02:00:16 Here's my number. Would love to stay in 02:00:17 touch." And it blew me away. AI 02:00:19 wrote that. 02:00:24 No — He had an auto pen? No, it was with a pen.
It had a 02:00:27 signature and his phone number. I 02:00:28 thought, that is so beautiful. It's 02:00:29 classic. It's classic and classy. 02:00:32 There's a premium on being human. 02:00:34 Simon, thank you so much. Thank you for 02:00:36 being so generous with your time, always. 02:00:37 Thank you for always coming on my show. You 02:00:39 always move me forward in my thinking in 02:00:41 such a profound and unexpected — always 02:00:42 unexpected — way that I'll value 02:00:44 tremendously for a very, very long 02:00:46 time, and our friendship too. I'm going to 02:00:48 text back even faster, you know. We need 02:00:50 to go on more dates, and I look 02:00:52 forward to our 02:00:53 next date in London, which I know is 02:00:54 coming up sometime soon. Yeah, we'll go 02:00:55 out for sure. It'll be fun. We have a 02:00:57 closing tradition, which I nearly forgot. 02:00:58 I do know the tradition, which is: the last 02:00:59 guest leaves a question for the next 02:01:01 guest. Yes. What's my question? What are 02:01:02 you doing in your life to mentor someone 02:01:06 coming up behind you? And then, who is a 02:01:09 person that you'd like to mentor, teach, 02:01:12 or coach that needs your voice the 02:01:15 most? My team is everything 02:01:19 right now. I want to give 02:01:21 everything I've learned to my team. I 02:01:22 want the folks on my team to benefit 02:01:25 from all the mistakes I've made. And one 02:01:28 of the joys of being in founder mode, 02:01:30 when it's 02:01:31 not the actual beginning, is I have 02:01:36 way more in my skill and knowledge 02:01:38 bucket that I want to pour out.
And so 02:01:42 one of the reasons I'm having so much 02:01:44 fun in founder mode is because I want to 02:01:48 give away everything that I've learned 02:01:49 so that my team can be stronger and 02:01:51 stronger and better and better, because I 02:01:53 want to leave something that can survive 02:01:54 me. You know, the 02:01:58 whole school bus test: if the 02:01:59 founder gets hit by a bus, will the 02:02:00 company continue or will it not? And I 02:02:03 really want to build something where 02:02:05 they want to build it without me. Feels 02:02:07 like there's been a change here. Mhm. What 02:02:09 was the catalyst? For the past couple 02:02:11 years, I've been trying out a lot 02:02:12 of different things to find 02:02:15 a level of excitement and energy that 02:02:18 I think I'd lost for a little bit, and I 02:02:19 found it. This has been the 02:02:22 founder mode. My team are so great, and 02:02:25 they so want to push 02:02:27 boundaries, and all I want to do is take 02:02:29 the reins off, take the leash 02:02:31 off. I want them to experiment. I 02:02:33 want them to try things, and I'm trying 02:02:34 to create an environment where they're 02:02:35 creative. They do things. Half of them 02:02:37 will fail. I don't care. Let's try 02:02:39 again. And I just love being around 02:02:42 all the creative ideas that they're 02:02:44 coming up with. My team asked you to 02:02:46 bring something that meant a lot to you, 02:02:47 and you brought me this. And I don't 02:02:49 know what's in this. 02:03:01 Oh, two medallions. Medals. Mhm. Those 02:03:01 are military challenge coins. I 02:03:04 brought the round one just to 02:03:05 show you what the traditional ones look 02:03:06 like. This is the one I care 02:03:10 about. I mean, I care about them both, 02:03:11 but this is the one I brought.
So, these — 02:03:13 only generals or commanders 02:03:16 will give these out. They're hard to 02:03:17 get. You get them when you do something 02:03:20 of service. It's less formal than 02:03:23 a medal. They can give it out to 02:03:24 whomever they want, and it's their way 02:03:26 of saying thank you. And the challenge 02:03:28 coins that I've been given, I'm very, 02:03:29 very proud of, because I feel like I 02:03:30 earned them. And the thing that I love 02:03:32 is, when they give them to you, they 02:03:33 don't just hand it to you. They put it 02:03:35 in their hand like this and they shake 02:03:36 your 02:03:37 hand. Go 02:03:39 on. And that's how they give them to 02:03:41 you. And they say, "Thank you, Simon, so 02:03:43 much for coming here and helping us 02:03:44 out." And that's how they give me the 02:03:46 coin. The reason this one means a lot to 02:03:49 me is because I did some work with 02:03:52 Air Force Top Gun. It's called the 02:03:54 Weapons School, but it's Air Force Top 02:03:55 Gun. And this is their coin, and this is 02:03:58 their patch. And I did some 02:04:00 work with them to help them get to the 02:04:02 core of their why, what their true value 02:04:03 was, just to make sure that their 02:04:05 culture stays clear and good for a long 02:04:07 time. And we came up with three words, 02:04:09 three actions, that everybody who goes 02:04:11 through Air Force Top Gun is required to 02:04:12 do, which is build, teach, lead — which 02:04:15 is the idea that you build a skill set, 02:04:18 you teach that skill set to other 02:04:20 people, and then you build leaders and 02:04:22 you lead, right? This whole idea that 02:04:24 you have a responsibility to 02:04:26 build, to teach, and to lead — to accumulate, 02:04:28 give, and serve. And the thing that 02:04:31 is so powerful is what the commandant of the 02:04:33 Weapons School did after we did the work.
He 02:04:35 took those words and he put them on the 02:04:37 coin. Oh, really? And they exist on the 02:04:39 coin, and they have been on there for 02:04:41 years since. And I am so proud to have 02:04:43 given something that has longevity, that 02:04:46 is literally on the coin. 02:04:49 Build, teach, lead. Yeah. That is such a 02:04:53 beautiful mantra for life. Yeah. And 02:04:55 they're wonderful people who go 02:04:57 there. And what an honor that work 02:05:00 that I did ended up on a coin. And so 02:05:03 they gave me the coin out of gratitude 02:05:06 for the work that I did, and it had my 02:05:07 words on it. Well, it's not my words, it's 02:05:09 their words. I just helped distill them. 02:05:11 It's who they are when they're at their 02:05:13 natural best. I just helped them codify it. Simon, thank you. Thanks. 02:05:18 Always love our conversations, and long 02:05:20 may they continue. Likewise. 02:05:25 This has always blown my mind a little 02:05:25 bit: 53% of you that listen to this show 02:05:28 regularly haven't yet subscribed to the 02:05:30 show. So could I ask you for a favor? 02:05:32 If you like the show and you like what 02:05:33 we do here and you want to support us, 02:05:35 the free, simple way that you can do just 02:05:36 that is by hitting the subscribe button. 02:05:38 And my commitment to you is, if you do 02:05:40 that, then I'll do everything in my 02:05:41 power, me and my team, to make sure that 02:05:43 this show is better for you every single 02:05:45 week. We'll listen to your feedback. 02:05:47 We'll find the guests that you want me 02:05:48 to speak to, and we'll continue to do 02:05:50 what we do. Thank you so much.
11 Reply 1 reply Transcript 0:00 so I've stripped all the bits of the 0:01 optical system so basically we've got 0:03 the uh the camera 0:05 itself which is mounted on this uh very 0:09 complex 0:10 adjustment thing which obviously to set 0:13 you the various tilt and uh alignment 0:15 stuff then there's two of these massive 0:18 lenses I've taken one of these apart I 0:20 think there's something like about eight 0:22 or nine Optical elements in here these 0:25 don't seem to do a great deal in terms 0:26 of electr magnification they're obiously 0:28 just about getting the image to where it 0:29 uh where it needs to be just so that 0:33 goes like that then this Optical block I 0:36 originally thought this was made of some 0:37 s crazy heavy material but it's just 0:39 really the sum of all these Optical bits 0:41 are just ridiculously heavy those lenses 0:43 are about 4 kilos each and then there's 0:45 this very heavy very solid um piece that 0:47 goes in the middle and this is so this 0:49 is the filter wheel assembly with a 0:51 hilariously oversized steper 0:53 motor driving this wheel with these very 0:57 large narrow band filters so we've got 1:00 various different shades of uh 1:03 filters there five Al together that 1:06 one's actually just showing up a silver 1:07 that's actually a a red but fairly low 1:10 transmission orangey red blue green 1:15 there's an excess cover on this side so 1:16 the filters can be accessed and changed 1:19 without taking anything else apart even 1:21 this is like ridiculous it's like solid 1:23 aluminium this is just basically a cover 1:25 the actual wavelengths of these are um 1:27 488 525 570 630 and 700 NM not sure what 1:32 the suffix on that perhaps that's the uh 1:34 the width of the spectral line say these 1:37 are very narrow band filters most of 1:39 them are you very little light through 1:41 so it's still very tight narrow band to 1:43 match the um fluoresence of the dies 1:45 they're using in the biochemical process 
1:48 and obviously to reject the light that's 1:49 being fired at it from that Illuminator 1:51 box and then there's a there's a second 1:53 one of these lenses then the actual sort 1:55 of samples below that so uh very serious 1:58 amount of very uh chunky heavy Optics 2:01 okay let's take a look at this light 2:02 source made by company Lumen Dynamics 2:04 who are now part of 2:06 excelitas self-contained unit power 2:08 connector USB and this which one of the 2:11 Cable Bundle said was a TTL interface 2:14 USB wasn't used in uh the fluid 2:17 application output here and I think this 2:19 is an input for um light feedback I 2:21 don't if it's regulated or just a measur 2:23 measurement facility and the uh fiber 2:27 assembly 2:29 Square Inlet there and then there's two 2:32 outputs which have uh lens assemblies 2:35 and this small one which goes back into 2:37 that small Port just Loops out of here 2:40 straight back in So on this side we've 2:42 got the electronics which look pretty 2:44 straightforward we've got a bit of power 2:45 supply stuff over here and we've got 2:48 separate drivers for each wavelength now 2:50 interesting this is clearly been very 2:52 specifically made for this application 2:54 you I was half expecting like say some 2:56 generic drivers that could be used for a 2:58 number of different things but actually 3:00 literally specified the exact wavelength 3:02 on the PCB there is provision here for 3:04 385 NM which isn't populated but this is 3:07 clearly been designed very specifically 3:09 so these four drivers look the same but 3:10 then there's two higher power ones for 3:12 575 and 3:14 520 a slightly bigger heat sink on this 3:16 575 section there a p 24 which is 3:20 providing USB interface USB isolator the 3:23 USB interface just presents as a comport 3:26 I did have a quick look but I didn't 3:27 actually get anything sensible um I did 3:29 dump the Pi code out and there's a few 3:31 you a few sort of commands that you 3:32 could 
see in text but I didn't actually 3:34 manage to get it working properly I 3:36 found some software for related version 3:38 but it didn't seem to want to talk to it 3:39 but um I say that wasn't used for the 3:41 original application it might be quite 3:42 interesting to get try and get the Run 3:44 hours count out of it and the TTL 3:46 interface looks fairly straightforward 3:48 we've got positions for six opto 3:50 isolators but only five five are 3:52 installed so that corresponds with the 3:54 unused thing so I think this hopefully 3:56 should be as simple as just providing a 3:57 ttrl signal for each color to uh enable 4:00 it a big heat sink here which is there I 4:03 think there's like a big S of metal 4:04 plate through the middle of this that 4:05 all the leads are mounted on the other 4:07 side so this is heat sinking it with a 4:09 air flow from a uh just a fan in here 4:13 obviously don't have the air flow 4:14 anywhere near the Optics so conduction 4:17 cool through to this plate that's then 4:18 uh air cooled got some pots which are 4:21 presumably power 4:22 adjustments okay let's take a look at 4:24 the other side which is uh much more 4:27 interesting see we've got some uh very 4:31 uh neatly Twisted cable assemblies there 4:35 a bunch of leads so we've got one here 4:37 475 up here 430 NM 630 575 and 520 4:44 filters and dcro mirrors a quick way to 4:48 see what's white is if we just shine 4:49 some white light through 4:51 here not sure how it is is to see on the 4:54 camera but shining white light we do 4:55 actually get a bit of red a bit of blue 4:57 some yellow here so the obstacle path 5:00 575 it goes sort of here bounces off 5:03 this mirror and goes out the 520 goes 5:07 sort of down here across here and up 5:09 there 630 goes basically straight 5:13 through 5:15 430 goes across there down there along 5:17 there and the 475 goes down here and 5:20 left this is the light sensing thing 5:22 think here there's just a um I think 5:24 there 
a photo diode or other sensor 5:26 haven't actually taken that off and 5:28 everything's fixed down to this chunk of 5:31 aluminium which acts as the heat 5:32 spreader that then conducts the heat to 5:33 the back side for the heat 5:35 sink and the actual lead packages all 5:38 look fairly similar except for this one 5:41 on the 575 which looks quite a bit more 5:44 substantial big spay 5:46 Terminals and the interface for this 5:48 turned out to be extremely simple it's 5:50 literally a 5V TTL level to enable each 5:54 color doesn't seem to be any tensity 5:56 control but there are some additional 5:58 pins on that connector that weren't used 5:59 in the through time thing so maybe 6:01 there's some extra lines that control 6:02 that I couldn't find any data on this uh 6:05 unit and the um their current product 6:07 range is quite significantly different 6:09 so we've got the uh blue these 6:13 might may well be saturating the camera 6:16 so they might look a bit weird so that's 6:17 the 430 6:18 blue the 575 6:24 yellow uh 6:26 475 light blue 6:29 the uh 520 6:31 green and the uh 630 red now one 6:36 interesting thing I noticed for the 6:39 575 it's actually it's actually using a 6:42 white lead and then filtering it rather 6:44 than using all the other ones are using 6:46 leads which are the fundamental colors 6:47 but uh this is actually doing white and 6:50 it's a combination of this filter and 6:52 the dichroic mirrors that are turning to 6:55 Yellow if we take the filter out and a 6:57 lot of the a lot of the um blue content 7:00 is going this way the red is going 7:02 straight through these two mirrors so 7:05 this is clearly not reflecting much of 7:08 that so we end up with the yellow coming 7:10 out of uh out of there which is a fairly 7:14 light yellow color which you don't 7:16 really see from high intensity leads so 7:19 that's clearly why they've used the 7:20 white to uh do this power consumption of 7:23 the white is pretty high so going up to 7:25 
about 2 and 1 half amps on that color 7:27 whereas most of the other colors are 7:28 only drawing half an amp or so at 24 7:30 volts the uh the green is up to about 7:32 1.2 but say this thing is uh much 7:35 brighter and if you actually run all the 7:38 colors at the same time you get a fairly 7:41 reasonable um looking white coming out 7:43 of it and one thing you might just be 7:45 out to notice is there is some sort 7:46 color banding around here that's not 7:49 getting uh everything s completely 7:51 concentric and I think that's where this 7:53 fiber optic thing comes 7:58 in I'll 8:00 get a couple of Fairly accurately shaped 8:04 very sort of uniform color and looking 8:06 at What's um inside here we've basically 8:09 just got this Square Rod so this is 8:12 clearly yeah the lights just bouncing 8:13 off all the all the various sides to um 8:16 get a nice uniform illumination uh this 8:19 back bit looks like it's all potted so 8:21 nothing I really do to get in there I 8:24 think this is fiber so I have come 8:26 across um cables like this which are 8:27 liquid fill but just looking through the 8:30 end of this it's probably a bit hard to 8:31 see it does look like there fiber ends 8:34 going going on there and so there's this 8:36 feedback thing which is just obviously 8:39 compensating for the any light losses 8:41 through here to get an accurate 8:43 representation of uh the light that's 8:45 been launched out of these two 8:47 fibers and you see uh 8:49 these have got this sort of trapezium 8:54 shape light guides again it's like a 8:56 sort of acrylic or glass light guide 9:00 guess projected just to make the right 9:03 rectangular 9:04 shape and look at this Center assembly 9:07 um the light output doesn't uh change 9:10 whether you feed this in or not so it's 9:11 clear not doing any internal Clos Loop 9:14 control obviously there may well be some 9:16 facility for it to do that but it's not 9:17 being used in this 9:19 application and so this output 
just 9:21 produces a voltage on the uh outle 9:24 connector proportional to the amount of 9:26 light that's present so there's a little 9:28 diffuser in the back there 9:30 and then there's just some kind of uh 9:33 Optical sensor looks like a 9:35 chip looking at the lead it's a very 9:37 small package on the PCB with this lens 9:40 assembly over the top and these look 9:43 like they're actually on a copper 9:44 Metalized PCB for maximum thermal 9:47 performance and yeah it's a very small 9:49 package looks like it's a ceramic 9:51 package and there's a thermister there 9:53 for temperature monitoring this is the 9:56 475 blue one this is the 520 need to 9:59 Green which is uh rather different OB 10:02 it's a much bigger D with lots of bond 10:04 wise but also this looks like it's using 10:05 a phosphor if I shine a blue light at it 10:08 lights up green so this is actually a 10:10 phosphor conversion green lead which 10:12 I've I've come across before they want 10:15 that specific wavelength so they may be 10:17 easier to tune a phosphor than tune the 10:20 um semiconductor material to get the uh 10:23 right right wavelength from the lead 10:24 directly uh red 630 similar size to the 10:28 blue one or does seem to have a uh a 10:31 lens on top of it there is a sort of red 10:33 coloring to 10:35 the die but that doesn't appear to be 10:38 fluorescent as far as I can 10:39 tell and the white one again a little 10:41 bit different sort of much higher 10:43 current 10:46 connectors a makeer name on that 10:48 connector flot light not sure if that's 10:52 the connector or the lead 10:54 itself and obviously with the phosphor 10:56 and I'd imagine that phosphor may well 10:58 be tuned to get the maximum to the uh 5 11:01 cenm and actually this white one looks 11:04 like a St fairly standard product I just 11:06 found it in Mouse made by luminous 11:09 devices in fact actually I think all 11:11 these are based on various luminous 11:13 devices modules and they're you 
take 11:17 looks like they taking the nearest 11:18 wavelength and then just using these 11:19 filters to clean it up to get a precise 11:22 uh spectral line out of it so quite a 11:25 nice neat and um extreme 11:30 bright light source uh sure I've got any 11:33 particular use for it so I think this 11:35 might end up on 11:36 eBay but uh very pretty to look out and 11:40 without the uh risk of burning your eyes 11:43 out like you do with lasers so I thought 11:45 it would be interesting to try and 11:46 figure out the runtime of this things 11:48 like this we usually keep some sort 11:49 record of runtime cuz leads degrade over 11:51 time I couldn't get any software to work 11:52 through the USB face but then had a 11:54 thought probably going to be writing the 11:55 runtime periodically to the e s prom so 11:58 I just just scope up that and noticed it 12:00 was doing right every 5 minutes so I 12:02 just ran it for a while periodically 12:04 reading the E squ I just held the pick 12:05 in in reset and um put clip over to read 12:07 the square prom and found it was writing 12:10 one location per color every 5 minutes 12:12 so if one color was on it would write 12:14 that location every 5 minutes and just 12:16 increment it by one so after doing a few 12:18 tests with different colors of different 12:19 time periods it looked extremely 12:21 straightforward it's like a four bite 12:22 count for each color looking at the 12:24 original data that was in it all the 12:26 colors apart from Green were reading 12:28 zero and the green was reading four 12:30 indicating a total 20 minutes run time 12:32 ever if it was turned on run for a short 12:34 time then turned off that might not have 12:36 been counted but even so indicates this 12:37 thing wasn't used a great deal the whole 12:40 s process of doing a run can be several 12:42 hours but it'll only be doing probably 12:43 the Imaging at the end of that so you 12:46 wouldn't expect to be running for a long 12:47 time 
but say a single color for 20 12:50 minutes over its whole lifetime does 12:52 seem a little bit on the low side okay 12:55 let's look at the camera un fortunately 12:57 I managed to not record any sound when I 12:58 did this it's also a couple of months 13:00 ago so there's going to be a few details 13:02 that I've forgotten so I'm just going to 13:04 dub this over the original footage so um 13:07 take the lid off see this massive great 13:10 heat sink so this is a pel cool camera 13:12 we've got this blower fan producing a 13:14 fair amount of air flow through 13:16 it the connector here there's the ccds 13:19 mounted on the board on the 13:24 right this unplugs so we've got a bit of 13:27 power supply stuff on here 13:29 USB interface I think that's the Cyprus 13:32 microcontroller High speeded USB 13:34 interface there's a zyink spon fpga some 13:40 RAM and there's a couple of ATD 13:42 converters can't quite read what those 13:45 those are but anal 13:47 devices um little bit of bodgery around 13:51 here extra decoupling obviously they 13:53 have having some noise issues this is 13:55 around the ram chip quite a lot of extra 13:57 capacitors been added there 13:59 uh there's a couple of amplifiers prior 14:01 to the HD converter buffers or Andor 14:05 amplifiers taking the CCD 14:08 signal um bit more power spy stuff here 14:11 this is probably all to do with 14:12 generating the various CCD bias voltages 14:14 they uh need quite a lot of exotic 14:18 voltages next board down is just a 14:20 shield and an interconnect 14:24 boardly shielding the power supply stuff 14:26 from some the more sensitive an log 14:28 stuff 14:31 and this is the bottom board which is 14:32 just all power supply 14:34 stuff as you can see tons of capacitors 14:37 or Transformer in 14:42 there and this is the CCD which is a uh 14:47 very impressive thing this is a kf50 100 14:50 originally by true sense then codec 14:53 there ON 14:54 Semiconductor it's 50 megapixels uh the 14:58 only 
price I could find was this one 15:00 5,000 bucks and the architecture you can 15:03 see there actually two separate halves 15:04 which explains the Dual AZ converters 15:06 and two amplifiers it's literally split 15:08 down the middle and duplicated so it's 15:10 outputting two streams in parallel just 15:13 to keep the bandwidth sensible and it's 15:15 got this amazing um diffraction effects 15:18 it's got micro lenses over the pixel so 15:20 there's there's a bit more Optics going 15:22 on than on a normal 15:25 sensor few more bodges on the CCD board 15:28 including this wire which isn't really 15:29 tacked down very well which is a bit uh 15:32 bit of a mess quite a few bits around 15:34 this board where they've uh tacked 15:36 various bits on which is not super 15:38 impressive looks like CCD drivers on the 15:40 left with those 3 ohm um damping 15:43 resistors on the 15:47 output get a few more little bodges 15:50 around here some of 15:52 the and there's this separator the 15:54 silica gel to keep the moisture down but 15:56 there's this separator that actually 15:58 appears to be cut from piece of 15:59 antistatic 16:04 bag and this sort of thermal block on 16:06 top of this stack of three pel Cola 16:12 modules so as with any Stacks they get 16:16 um larger as they go back towards the 16:18 heat sink because each P's got to not 16:20 only take the heat from the previous but 16:21 also the waste heat which is quite 16:27 significant you see a little temperature 16:29 sensor here that copper block which 16:32 makes contact with the back of the 16:37 CCD and this's the back of the 16:40 pelas this then contacts the heat sink 16:44 on the uh rear there a few thermal pads 16:46 as well for some of the other power 16:47 components on this 16:51 PCB okay I've connected this uh camera 16:54 up I found some drivers on the disc that 16:56 seem to work under Windows 7 couldn't 16:58 get to install under Windows 11 though 17:01 um in the absence of any sort of lens 
or 17:03 being bothered to the proper amount I've 17:04 just put some f over it and put a little 17:06 pin in there to make a pinhole lens and 17:08 software gives a few options I'm not 17:11 entirely sure what all these are there's 17:12 obviously a clock frequency 22 MHz low 17:15 gain and with PFG no idea what that is 17:19 something something game programmable 17:20 Something game perhaps ver exposure 17:23 types I think focus is just like a 17:25 continuous grab until you tell it to 17:27 stop not entirely sure all these options 17:30 are obviously exposure time uh triggers 17:33 there ex external hardware trigger inut 17:35 you just trigger using a um thing on 17:37 screen so the resolution is 8176 by 17:40 6132 and you can actually bin those 17:42 where you combine multiple pixels to get 17:46 increased gain at the expense of lower 17:48 resolution down this is a 10sec exposure 17:51 obviously of the pin hole it's very uh 17:53 intensitive so we just stand still now 17:56 downloading it there's the uh exposure 17:59 so when it's 18:01 um there's a little status thing down 18:03 here so that tells you the um exposure 18:07 [Applause] 18:09 time it's this is just it 18:15 downloading um it is quite I'm seeing 18:18 quite a lot like smearing I think that I 18:20 don't know whether that's just due to 18:21 pixels overloading or something else I 18:24 mean yeah it's not it's not um out of 18:26 the question that there's something not 18:27 totally right about this camera 18:28 certainly was bodge wise on there um I 18:31 don't I'd imagine a camera like this 18:32 it's got a fairly narrow range of 18:34 intensities that it's happy with I'm not 18:36 going to spend a great deal of time on 18:38 this if you're interested in this camera 18:40 maybe for astronomy or something and 18:42 happy to sort of take the risk of it may 18:44 not be uh perfect I'll um I think I'll 18:47 stick this on eBay along with the 18:48 Illuminator I'll put a link down in the 18:50 
description to the listing take your 18:52 chances to grab a bargain so for example 18:54 here we see this vertical streaking so 18:56 I'm not sure how normal that is this is 18:58 on fairly bright scene looking out the 19:02 window if I cut the exposure time down 19:04 on that it's now 1 second 19:07 exposure again most of the image 19:09 disappears again this is looks like it's 19:11 possibly over still overloading here go 19:14 that go down to say say quarter a 19:16 second so again I think there might be 19:19 some Auto gain control going on here um 19:21 this is with the PFG option let's try 19:23 turning that off and see what 19:25 happens so I'm not sure this is actually 19:27 more streaking or which just it's 19:29 cranked up the gain all the dis display 19:31 gray scale to show what um you know the 19:33 range of things that it's captured 19:36 there's one of one of 12 things in the 19:38 software there's um you can see of you 19:40 can't seem to read out the temperature 19:42 of the pelta cooler but you can set the 19:44 temperature and if you said it's a 19:46 different temperature you see the power 19:48 consumption jump up running the cooler 19:50 to get the temperature you requested but 19:52 I can't see anything anywhere that tells 19:54 you whether the cool is at the at the 19:56 temperature other than the power 19:57 consumption going down and there's no 19:59 temperature read out 20:03 here and just some yeah this is just 20:05 sort of very basic software I'm sure 20:07 there's like an API for more 20:09 sophisticated 20:10 applications but so if you know anything 20:12 more about these cameras please um stick 20:14 in the 20:15 comments um incidentally when I was 20:18 editing I didn't notice there was a bent 20:19 pin on the um CCD but I did fix that 20:22 before doing these tests and also 20:24 reactivated the um silica gel desicant 20:26 cuz I noticed it was uh I was getting 20:28 bit of condensation on the window but um 20:31 yeah so a couple 
of uh interesting but 20:34 maybe not particularly uh useful pieces 20:37 of Kit except for someone that's got a 20:38 very specific use so um I'll stick a 20:42 I'll stick these on eBay put a link in 20:44 the description and say hopefully 20:45 someone could actually make some uh good 20:47 use of these things Example Output: **Abstract:** This video presents Part 2 of a teardown focusing on the optical components of a Fluidigm Polaris biotechnology instrument, specifically the multi-wavelength illuminator and the high-resolution CCD camera. The Lumen Dynamics illuminator unit is examined in detail, revealing its construction using multiple high-power LEDs (430nm, 475nm, 520nm, 575nm, 630nm) combined via dichroic mirrors and filters. A square fiber optic rod is used to homogenize the light. A notable finding is the use of a phosphor-converted white LED filtered to achieve the 575nm output. The unit features simple TTL activation for each color, conduction cooling, and internal homogenization optics. Analysis of its EEPROM suggests extremely low operational runtime. The camera module teardown showcases a 50 Megapixel ON Semiconductor KAF-50100 CCD sensor with micro-lenses, cooled by a multi-stage Peltier stack. The control electronics include an FPGA and a USB interface. Significant post-manufacturing modifications ("bodges") are observed on the camera's circuit boards. Basic functional testing using vendor software and a pinhole lens confirms image capture but reveals prominent vertical streaking artifacts, the cause of which remains uncertain (potential overload, readout artifact, or fault). **Exploring the Fluidigm Polaris: A Detailed Look at its High-End Optics and Camera System** * **0:00 High-End Optics:** The system utilizes heavy, high-quality lenses and mirrors for precise imaging, weighing around 4 kilos each. 
* **0:49 Narrow Band Filters:** A filter wheel with five narrow band filters (488, 525, 570, 630, and 700 nm) ensures accurate fluorescence detection and rejection of excitation light.
* **2:01 Customizable Illumination:** The Lumen Dynamics light source offers five individually controllable LED wavelengths (430, 475, 520, 575, 630 nm) with varying power outputs. The 575nm yellow LED is uniquely achieved using a white LED with filtering.
* **3:45 TTL Control:** The light source is controlled via a simple TTL interface, enabling easy on/off switching for each LED color.
* **12:55 Sophisticated Camera:** The system includes a 50-megapixel ON Semiconductor KAF-50100 CCD camera with a Peltier cooling system for reduced noise.
* **14:54 High-Speed Data Transfer:** The camera features dual analog-to-digital converters to manage the high data throughput of the 50-megapixel sensor, which is effectively two 25-megapixel sensors operating in parallel.
* **18:11 Possible Issues:** The video creator noted some potential issues with the camera, including image smearing.
* **18:11 Limited Dynamic Range:** The camera's sensor has a limited dynamic range, making it potentially challenging to capture scenes with a wide range of brightness levels.
* **11:45 Low Runtime:** Internal data suggests the system has seen minimal usage, with only 20 minutes of recorded runtime for the green LED.
* **20:38 Availability on eBay:** Both the illuminator and camera are expected to be listed for sale on eBay.

Here is the real transcript. Please summarize it:

00:00:01 Hopefully, we'll stay out of jail on 00:00:01 this. Ladies and gentlemen, start your 00:00:03 lawyers. I'm Jeff Kr. I'm a cyber 00:00:05 security architect and adjunct professor 00:00:07 at NC State University. Let's answer 00:00:09 some questions from the internet. This 00:00:11 is history of hacking 00:00:12 [Music] 00:00:16 support. Atmatscary 34 asks, "Who do you 00:00:19 think is the most influential hacker in 00:00:21 cyber history?"
Some of the really great 00:00:23 hackers are ones you've never heard of 00:00:25 because they in fact were so good they 00:00:26 didn't get caught. Certainly one of the 00:00:28 big names. He was certainly the most 00:00:29 influential and infamous back in the day 00:00:32 is Kevin Mitnik. Kevin Mitnik was 00:00:34 particularly known for his social 00:00:36 engineering where he would basically try 00:00:38 to con people, rely on their desire to 00:00:41 trust each other and get information out 00:00:43 of people by pretending to be someone 00:00:45 else. He served 5 years in prison. He 00:00:48 was arrested for abusing the phone 00:00:51 system. Back in those days, you had to 00:00:52 pay for long-distance calls. He was able 00:00:54 to break into the phone system and make 00:00:57 calls for free. at Raymond Goch one asks 00:01:00 where does the term hack come from? The 00:01:02 term really traces its evolution from 00:01:05 the 60s where at MIT there was a model 00:01:08 train club and they use the term hack or 00:01:10 hacker to refer to someone who was able 00:01:12 to use the technology in an 00:01:14 unconventional way in a creative way. 00:01:17 Then about the 70s 80s it kind of took a 00:01:20 change and most people began to 00:01:22 associate hacking with unauthorized 00:01:24 access. In the security community, we 00:01:27 actually refer to at least three 00:01:28 different types. And we refer to them by 00:01:30 their hat color, believe it or not. No, 00:01:32 they're not actually wearing hats 00:01:34 necessarily, but we refer to black hats 00:01:36 as the ones who are breaking into 00:01:38 systems in order to do damage, in order 00:01:40 to steal things, operating without 00:01:41 permission. The white hat hackers, those 00:01:44 are more like that original terminology 00:01:46 that we referred to with the the group 00:01:48 from NI MIT who were basically hobbyists 00:01:52 trying to find out where the limits of 00:01:53 the technology. 
this sort of thing. They 00:01:56 report those vulnerabilities and they're 00:01:57 trying to do something for the overall 00:01:59 good. Now, you've got something in 00:02:00 between, a grey hat hacker where they 00:02:03 claim to be doing some good, but maybe 00:02:05 they don't. They cross a few lines here 00:02:06 and there. James K. Poke, how did 00:02:08 computer hackers do their hacking back 00:02:10 in the 60s and 70s? It was all about 00:02:12 physical access because computers, there 00:02:14 was no internet. All the computers that 00:02:16 existed were pretty much in very known 00:02:19 places. A company had a data center. It 00:02:21 was on raised floor. It had to be cooled 00:02:24 with air conditioning. The entry and 00:02:26 access to those systems was controlled 00:02:28 via badge reader. There were cameras. 00:02:30 So, it was basically security was 00:02:32 guards, guns, and gates back in those 00:02:34 days. If you didn't have access 00:02:36 physically to the system, you couldn't 00:02:37 do much damage. Then it started moving 00:02:40 to where attackers were starting to use 00:02:42 the phone system. Hackers that were 00:02:44 hacking on the phone system were known 00:02:45 as freakers, phone hackers. They would 00:02:48 do different things to try to figure out 00:02:50 how they could get into the phone system 00:02:52 and control it remotely, maybe from a 00:02:54 pay phone. There were not a lot of 00:02:55 computer systems for them to break into 00:02:57 yet. And then we start moving into the 00:02:58 70s now. We didn't really have an 00:03:01 internet that everyone had access to, 00:03:02 but more and more systems were being 00:03:04 made available over the public phone 00:03:07 network. So they had modems and you 00:03:09 could call into a system and then get 00:03:11 access. If you knew the password, you 00:03:13 could log in and then maybe take control 00:03:15 of the system that way. 
So that's what happened really more in the '70s. In the '80s, and especially as we moved into the '90s, we had the internet, and that's when hacking really changed very dramatically, because everyone had access to everyone, everywhere in the world. It was a great thing for pretty much everyone, including the attackers. @CharliesCurious asks, "Have hackers ever taken down a government website?" One of the first that I remember was back in about 1996, when the Central Intelligence Agency had their main website hacked, and at least for a short period of time it said "Welcome to the Central Stupidity Agency." Not a lot of damage done there, more just reputational damage. It was basically electronic graffiti. So what could people do to prevent their websites, for instance, from being hacked? One of the first things is to make sure that you change all of the default user IDs and passwords. We call that hardening; that's one form of hardening. Another is to turn off all the unnecessary services. Every single thing that's turned on on a system is potentially another way a bad guy can get into your system. Also, keep your software up to date, because vendors are fixing bugs in their software all the time, and many of those bugs are security bugs. The bad guys will know what those bugs are, and they'll take advantage of them. There also needs to be, at minimum, multi-factor authentication. And don't rely on voice recognition; by the way, we have deepfake technology from AI that could potentially fake that out. @Weeds asks, "Question for cybersecurity mavens:
Since Signal is open source, doesn't that mean that some intelligent adversary could determine its encryption algorithm and come up with a way to decrypt Signal communications?" Well, yes, in general. But first of all, don't add people to your group chat if you don't know who they are. There's a notion in cryptography known as Kerckhoffs's principle, and it says that nothing should be secret about a cryptosystem except the keys. In other words, knowing how an algorithm works should not give you any insight into how to break it. A good crypto algorithm will stand the test of time: everyone can know how it works, and yet they still can't break it. The only way to break an encrypted message is to guess what the key is, and that's been the case for a long time. So if the keys are the secret to the system, then one thing you want to be able to do is generate those keys and keep them secret. You want a random key, and then you want to store it somewhere safe. The more randomness you can get, the better your key will be, and then you don't have to remember it. You'll use some other form of multi-factor authentication or something like that in order to get access to the key, but the key itself will be saved on your system. And if it's really sensitive, you'll store it in a special place in hardware where, if someone tries to access it without permission, it will actually blank the key and wipe it out completely. That crypto device then effectively becomes just a paperweight. @DeadbirdRugby says, "I've seen some older-generation folks on LinkedIn listed as cybersecurity analysts in the '90s."
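The key-handling advice above (keep the algorithm public, keep the key secret and random) can be sketched with Python's standard library. The key size here is an illustrative choice, not any particular product's scheme:

```python
import secrets

# Kerckhoffs's principle: assume the attacker knows the algorithm.
# All of the secrecy lives in the key, so the key must come from a
# cryptographically secure random source (the OS CSPRNG here).
def generate_key(n_bytes: int = 32) -> bytes:
    """Return a fresh 256-bit symmetric key."""
    return secrets.token_bytes(n_bytes)

key = generate_key()
print(len(key))  # 32 bytes = 256 bits of randomness
# A second draw is unrelated to the first; there is no pattern to guess.
```

Note that `secrets` (not `random`) is the right module here: it is designed for security-sensitive randomness rather than reproducible simulation.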
Yeah, he's probably talking about me. From what I remember, the internet was like the Wild West in the '90s. A lot of the focus was just on putting up a firewall, putting up some sort of technology that gave us an initial block, an initial front door, to separate our internal network from the external wild west. But firewalls weren't nearly enough. They weren't enough then, and they're not enough now. But that was what a lot of the view was. And speaking of the Wild West, there was a wild west on the internet then, because everything was kind of unknown and unmonitored. We still have that today to a great extent. But then there's another layer deep below the surface that most people never see. Where you exist, for the most part, if you're typical of most people, is the surface web. That's maybe 5% of the content that's on the World Wide Web, and it's stuff that you can get to from your favorite search engine. It's been indexed, in other words, so you can go to the search engine and find it. For the other 95%, think of an iceberg: it's the stuff below the water level. That other 95% is the deep web. It's not necessarily nefarious stuff; it's just not stuff that everyone needs to get to. It's business records, things like that. Only authorized users would be able to go in, authenticate, and see that stuff. But there's a subset that's in an area we call the dark web. And it's dark because it's not indexed. You have to know where it is.
So, you're not likely to just stumble into it. In fact, you've got to use special tools to get there. And for what it's worth, I don't recommend that you go there, because it's a rough neighborhood. Your system could be taken over, could be hacked, could have malware on it after you visit some of these illicit sites. Now, is everything on the dark web terrible? No. Some of the people on the dark web are whistleblowers, political dissidents, people who live in areas of the world where they can't speak freely, and maybe even journalists who need to be able to get a message out and do it with anonymity. But with anonymity also comes the possibility of illegal and illicit activities. It's dark because the lights are off. It's not dark because of what the content is, though certainly some of the content is of a dark nature. A user on Reddit asked, "What was the Stuxnet virus, and why was it so successful?" Well, Stuxnet was malware (I'll just use that general term to refer to it) designed specifically to target nuclear centrifuges in Iran. It was unleashed on those systems. Those systems didn't have general internet access, so the idea was that it would be brought in and physically implanted on those systems, and then it would cause these centrifuges to speed up and slow down, and speed up and slow down, which caused them to not be functional. The idea, of course, was to disrupt Iran's ability to enrich uranium. So who was behind that? Well, there's a lot of attribution, and it's one of those situations.
One of my favorite quotes is: those who know aren't talking, and those who don't, well, you can't shut them up. I'm going to keep talking, so now you know which one of those categories I'm in. I don't have any firsthand information, but it's been widely attributed in the public space that the US and Israel were involved in this particular attack as a way to try to subvert Iran's nuclear ambitions. It was pretty successful for a good period of time, until it turned out that the virus spread to some other systems. Those other systems were then discovered with an antivirus tool, and from there, the whole game started to unravel. @Mach5Turtle says, "Yay, my data has been compromised due to the OPM data breach." Not sure that's a reason for celebration. "My fingerprints and everything. Where will it all turn up next? How exciting." That's a case where data is turned over to someone else and you have to trust that they're going to do the right thing. And maybe they do the right thing 99% of the time, but it's that other 1% where they didn't, and that's where an attack occurs. So, what can you do about this? Well, first of all, don't give your data to places where you don't have to. Most people give their data up very freely for very little in exchange. So make sure that you understand the bargain. Understand that your data is worth something to those companies; that's why they want to keep it. And make sure that what you're getting back in exchange for it is really a fair bargain.
Also, you can do some things like credit monitoring, credit freezes, credit locks, things like that. That way, if someone tries to open a line of credit in your name, tries to get a credit card, tries to get a home loan, whatever it is, using information they got from one of these data breaches, well, they won't be able to. That's something you can do at least in the US, and there are probably similar things you can do in other countries as well. ATC Magazine asks, "A question for cybersecurity pros: would you ban TikTok from your organization over security and privacy concerns raised about its ownership being based in China?" First of all, my general approach to bans is that I don't think they work, not in the way that people hope they will. When you ban something, you drive its behavior underground, and once you drive it underground, it's hard to monitor. Now, am I concerned about ownership? I'm concerned about ownership of all of these, because just because a company is in China, or even in the US, doesn't necessarily make me think that there are going to be no violations of privacy or no manipulation of information. We've already seen that every country on earth has people who will find ways to be good at manipulating other people through misinformation and fake news. We know that privacy violations occur everywhere, and we also know that privacy violations can occur even unintentionally.
So there are certainly concerns if a government is able to say, "We are going to make you turn over those records to us," and now there's no independence between that service and the government. That's a big concern, but it happens in a lot of cases. Daikaiu asks, "What are some ways elections are vulnerable from a security standpoint, and what can be done to shore up these platforms?" In almost every case, I will choose the more high-tech alternative over the lower-tech alternative, except when it comes to voting. The reason I like paper ballots is that with a paper ballot, if the counting machine messes up, well, we just go back, take the paper ballots, and run them through another one. If we have only electronic votes to begin with, we can't run them all back through. We can't line up all the people who voted that day and say, "Please go back, get in line again, and vote exactly the way you did before." It doesn't work like that. We could have a power outage at a voting station; again, that doesn't affect paper ballots. It might affect the counting, and if we want to use machines to do the counting, I think that's not so unreasonable. There are a lot of things technology helps us with; this is one where a lower-tech solution is probably better. Chronoport asks, "Why did the ILOVEYOU virus overwrite other files?" If you're a virus, you're trying to spread yourself as much as you can.
The more types of files you can overwrite, the more things you can infect, the more other systems you can infect, and the harder it is to get the system disinfected. If you're the designer of a virus, you want it to spread as virally as possible, so you infect as much stuff as you possibly can. @Allah asks, "Who stopped WannaCry?" WannaCry was one of the most famous examples of malware; it did a lot of damage. The guy who's given credit for stopping it is named Marcus Hutchins. He did it in May of 2017. He was doing some analysis of the malware and found that it actually made a call out to a particular website, a particular domain name. It was really long, complex, and random-looking; nothing you would ever guess unless you were decompiling the code. And he realized that as long as the malware didn't find that domain present, it would continue to spread. So what he did was go register that domain name, so that whenever the malware went out to ping that site to see if it existed, it would in fact exist, and therefore the malware would stop replicating. He basically found a kill switch that was built into the code. That didn't necessarily stop and eradicate it; the malware was still on a lot of people's systems, and may still be out there in some cases. But at least it was a way to flip a kill switch and make it stop, so that it didn't hurt other people, for at least a good period of time. @TheBossAlmighty asks, "How exactly can hackers shut down a pipeline?"
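The kill switch Hutchins found worked roughly like the sketch below: before spreading, the worm looked up a hard-coded domain and went quiet if the lookup succeeded. The domain string here is an invented placeholder, not the one from the actual binary, and the lookup is injected as a function so the logic can be shown without touching the network:

```python
from typing import Callable

# Invented stand-in for the long, gibberish-looking domain that was
# hard-coded into the WannaCry binary (the real one was different).
KILL_SWITCH_DOMAIN = "long-gibberish-domain.example"

def keeps_spreading(domain_resolves: Callable[[str], bool]) -> bool:
    """The worm's own check: keep replicating only while the lookup FAILS."""
    return not domain_resolves(KILL_SWITCH_DOMAIN)

# Before Hutchins registered the domain, lookups failed, so it spread.
print(keeps_spreading(lambda d: False))  # True
# After registration, the lookup succeeded and the worm went quiet.
print(keeps_spreading(lambda d: True))   # False
```

The striking part, as described above, is that registering one domain flipped this check for every infected machine at once.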
I'm assuming this is a reference to the Colonial Pipeline ransomware case, which was very infamous because there was in fact a pipeline that transported fuel across the southeastern portions of the US. Well, it turns out that the attacker, the actual malware, the ransomware, did not actually shut down the pipeline. The operators of Colonial Pipeline decided they needed to shut it down in order to prevent further damage, because they weren't really quite sure what was going on. They knew that they had experienced a ransomware attack. They were being asked for a $5 million ransom in order to restore the systems. And in an abundance of caution, they just felt it was better to shut things down until they could figure out what had happened. They eventually did turn things back on, and of course it got back to being operational. An interesting sideline to that story is that they actually paid the $5 million ransom, but here was the unhappy part of that ending: the attackers gave them a tool that would decrypt the data that had been encrypted. So they paid the ransom and they got the tool, but the tool was so inefficient and so slow that it would never have recovered the data in time to do anyone any good. They ended up having to rely on their own backups, incomplete as they might have been. So they paid the $5 million ransom, didn't get their data back, and it was kind of a worst-of-both-worlds situation. However, in the end there was one more twist, and that was that the FBI actually recovered half of that ransom. Don't expect that to happen in your case. @NobleInfantry asks, "What is a firewall, and how does it work?"
Well, a firewall was originally a building mechanism: fire-retardant material that would at least slow the spread of fire from one unit to the next. When you apply that concept to network security, it's a zone of separation, where we're going to keep one level of trusted network away from another. Maybe there's an untrusted internet that we'll keep separate from a trusted internal network, and the firewall will basically be the gatekeeper. So we'll have a security policy in it, and it will look for certain types of traffic and say, this kind of stuff can come in; this other kind of stuff we're going to block, because we don't use that kind of traffic, we don't need traffic coming from that area of the internet, or we don't need users of that sort, they're not part of our organization. So we put a gating factor, basically a guard, between areas of networks where we have different zones of trust. @AIPitchSide asks, "I'm curious to know how people balance online security with the need for convenience. Do VPNs really offer the anonymity we think they do?" The original purpose of VPNs was really just as a way to transport sensitive information over a public network. So if I wanted to send a secret message to you that only you could see, and that someone else who saw the traffic going across the internet would not be able to read, I would encrypt the message and then send it to you, so that we have an encrypted connection between the two of us. Now, the VPNs that most people use today not only can do that, but they also will hide the IP address that you're coming from.
And that's where you start to get some of these anonymity features. The idea is that your ISP, whoever it is that's providing your internet connection and acting as your on-ramp onto the highway that is the internet, knows what your IP address is. And they can see everything that comes and goes into your home network, or into your computer itself, unless you use a VPN, in which case all they can see is where the packets are coming from and where they're going, but not their contents. So that gives you a certain level of anonymity. And if you use a VPN, it will also hide your originating IP address, because no matter where you want to send a packet, if you've got the VPN turned on, it's going to go to the VPN access point first, and from there it will get routed to where it's supposed to go. So the ISP at that point only sees that you are sending all your traffic to this one VPN entry point, this VPN access point, and that stuff comes back from there. They won't be able to see the contents, and they won't be able to see where it goes after that point. Now, that gives you some anonymity and some privacy. However, don't be fooled. What you've done is shift your trust from your ISP, which maybe wasn't so trustworthy in terms of guarding your privacy, to the VPN provider, because the VPN provider now gets to see where all your traffic is going, and you don't really have a way to verify the way they're operating. Some VPNs will be very rock solid and will preserve your privacy. Others will not.
So just by using a VPN, you may be making things easier for an attacker by concentrating all your data in one place. And if that place gets attacked, or if that place is a bad actor, then you've given them all your information. So be careful. A VPN is not a panacea; it can help. @HotGeologist6330 asks, "Why are phishing emails and telephone scams still profitable despite increased awareness?" The simple answer is people. Have you ever met them? Well, they can be exploited. We have this tendency as humans to trust other people. Even if you're very jaded: you see someone walking toward a building, their arms are full of stuff, it's raining, and you're at the door, so maybe you hold the door open for them. But if that person was planning on that as a way to get into the building, well, then they basically just socially engineered you into letting them come into the building and tailgate without using their badge. Social engineering is what lies at the heart of these types of attacks: phishing emails, telephone scams, and things like that. It's our tendency to trust. In one context, that's a beautiful thing, because we wouldn't want everyone to be so jaded that we never trusted another person ever again. But we can't be trusting of everything either, because then everything falls. The attackers are always going to try to find that crack they can exploit, and they keep changing their tactics, changing their ways of doing this. Phishers originally used mostly just email. Now they've moved into other areas as well, in addition to email. They could send an SMS message to you.
We call that smishing instead of phishing. They could do phishing via voicemail; we call that vishing. There's even a new one called quishing, where they use QR codes. @ThisIsAPseudo asks, "Are password managers safe?" If you're asking a security person, "Is it safe?", the answer is no. I don't even have to know what the question was; the answer is it's not safe. Nothing is ever fully safe. Nothing is ever fully secure. Now, is it safe enough? It depends on which password manager you use, how you use it, where you put the password manager, and how you get access to the password manager itself. Most of these password managers will require you to set a strong password that you type in once, and that then unlocks all the other passwords it keeps in its storage. If you have a trivial password on your password manager, you have an unsafe system. So you need to have at least one really good password. And again, maybe use multi-factor authentication, so that it doesn't rely on just a password to get in. Let me tell you what's better than a password: if you're trying to make sure that no one steals your password, don't have one in the first place. And you say, "What does that mean? I don't get to choose that?" Well, actually, more and more you are getting to choose a newer technology called passkeys. There's an organization called FIDO, Fast Identity Online, that came out with this standard. And "passkey" sounds like the same kind of thing: password, passkey. It's actually very different. Passkeys use cryptographic techniques.
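One such technique is a challenge-response exchange. As a loose sketch only: real FIDO2/WebAuthn passkeys sign the challenge with a public-key pair, and the HMAC shared secret below is a simplifying stand-in for the device's private key, not the actual protocol:

```python
import hmac, hashlib, secrets

# ASSUMPTION: an HMAC secret stands in for the device's private key.
# Real passkeys use asymmetric signatures, so the server stores only a
# public key and the secret never leaves the device.
device_secret = secrets.token_bytes(32)

def respond(challenge: bytes, origin: str) -> bytes:
    # Mixing the site's origin into the response is what resists phishing:
    # a look-alike domain produces an answer the real site will reject.
    return hmac.new(device_secret, challenge + origin.encode(), hashlib.sha256).digest()

challenge = secrets.token_bytes(16)  # fresh per login attempt, never reused
real = respond(challenge, "https://bank.example")
fake = respond(challenge, "https://bank-example.evil")
print(real != fake)  # True: origin binding defeats the phish
```

Because the challenge is random each time, a captured response is also useless for replay later.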
You don't have to remember what the password is. You don't have to choose what the password is. You unlock your device, and the passkey is a cryptographic key that's kept on your device and may or may not be synced with other devices that you have. It's relatively phishing-resistant, if not almost impossible to phish, because it uses a challenge-response system, and all of this happens under the covers. The good news is that password managers (the good ones) support both passwords and passkeys, so you don't have to choose. @CuriousBrain2781 asks, "How likely is it to catch a virus nowadays, assuming a standard, up-to-date antivirus?" It's actually still very possible. We continue to see certain types of malware proliferate. Thankfully, we've gotten a little better at this, but the problem is the game constantly keeps changing. As an industry, we had to come out with things that were not just looking for a literal signature, that is, a string of bits that was in the particular malware itself and served as the identifier. Now we're also looking for things like behaviors, and if we're looking for those behaviors, maybe we're able to block these things more often. A lot of these viruses and malware will exploit different vulnerabilities in software. That's why, as patching and updating of software has become more and more automated, we've been able to mount a better defense than we had back in the day when these things first came out. Reboot your system every so often, because some viruses and malware are not able to survive across a reboot.
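The literal-signature matching described above is essentially a byte-pattern search over files. A minimal sketch, with invented patterns rather than real signatures:

```python
# Classic antivirus signature scanning: look for known byte strings.
# The patterns below are invented for illustration; real engines use
# large signature databases plus behavioral rules, since trivial edits
# to these bytes are enough to dodge a pure pattern match.
SIGNATURES = {
    "DemoWorm.A": b"\xde\xad\xbe\xef",
    "DemoTrojan.B": b"EVIL_MARKER",
}

def scan(data: bytes) -> list[str]:
    """Return the names of all signatures found in the data."""
    return [name for name, pattern in SIGNATURES.items() if pattern in data]

print(scan(b"ordinary document text"))          # []
print(scan(b"prefix \xde\xad\xbe\xef suffix"))  # ['DemoWorm.A']
```

The weakness is obvious from the code: change one byte of the pattern and the match fails, which is exactly why behavior-based detection was added on top.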
So you'd like to get rid of those and clean things up that way. But in general, use tools that can disinfect your system; that will help a lot. Tyrone Bigums asks, "What hack has caused the most damage?" It depends on how you measure damage. Would it be financial damage? Would it be in terms of the number of systems that were affected? Would it be in terms of the number of lives that were impacted, or the number of lives that were lost? There are a lot of different ways to look at this. There was one case where a ransomware incident happened at a hospital, and it caused the hospital's systems to be unavailable, so they started redirecting emergency traffic to other hospitals. One person died during transport to a more distant hospital. So there's a case where, indirectly, ransomware cost a person their life. @SectorQuadruple07 says, "But what is the CIA triad?" Well, if you talk to a cybersecurity person, CIA probably doesn't mean Central Intelligence Agency, although it could. We think of this as one of the classical security teachings, and it really lies at the foundation of everything we do in cybersecurity. Everything in cybersecurity is about these three things: confidentiality, integrity, and availability. So CIA, confidentiality, integrity, and availability; that's really what everything we do in cybersecurity is about. @Gunlaze1969 asks, "What was the name of the first computer virus?"
Well, if you use the term "virus" in the larger sense of malware, I'm actually going to shift this question to refer to the first real example that the world came to know, and that was the Morris worm, back in 1988. That was where a graduate student came up with a way of planting a piece of software on a lot of different systems across the internet, and it spread automatically. That's what a worm does: it self-replicates. It did this and got to 10% of the internet before it finally got shut down. So that was really the first one that made the world wake up to the fact that software could actually do harm. @PeterBurkehead asks, "How is it I never heard about phone phreaking?" It's an old attack. In the early days, when there really were not a lot of computers to break into, there was a phone system, and it was worldwide, and phone phreakers were the ones who tried to manipulate it. And it was discovered that you could actually control the phone system, maybe even reconfigure the switch that's involved. You could get free long-distance phone calls; there were a lot of different things you could do. And you could do this because the phone systems used a specific tone to put them into a management control mode. That mode was triggered by a tone at 2600 hertz. So if you could whistle 2600 hertz, or get a tone generator and hold it up to a phone, you could then take over the phone and maybe even penetrate into the system from there. It turned out that Cap'n Crunch, the cereal, came out with a toy prize, a whistle, inside the box. And guess what?
That whistle blew 2600 hertz. Now, Cap'n Crunch, I'm sure, had no idea that that's what was going to happen; they were just making a toy for kids. But the phone phreaker community learned about it pretty quickly, and they got all over it, bought up a lot of Cap'n Crunch boxes, and then they were able to go to payphones and get free phone calls. But you don't hear about it much now, because nobody really pays for long-distance phone calls anymore. Barrow Wolf asks, "What movie has the most realistic concept of hacking?" Sorry, movie industry: I'm not sure any of them have gotten it really all that right. How it works in the real world is not necessarily all that exciting to watch. It's often hours and hours of just mind-numbing activity, running different programs in the background until finally you trip onto something. It's not something that makes for a great spectator sport. That's why you see the movies take liberties with this, in order to make it a lot more interesting. I don't know if hackers ever actually say "I'm in," but in every movie they certainly do. Okay, those are all the questions. Thanks for watching.