Anthropic is facing a national security ban after refusing military AI demands. We explore the clash between corporate ethics and national defense.

Just because you can do something with AI doesn't mean it's reasonable or appropriate in a democratic society. We have to move past the hype and recognize that ethics isn't an obstacle; it's a safety feature.
The Trump administration blacklisted Anthropic after the company refused the Pentagon’s demands to allow its AI models to be used for autonomous weaponry and mass surveillance of Americans. The government labeled the company a national security supply chain risk, arguing that "Big Tech ideology" should not hinder military capabilities. Anthropic, founded on principles of AI safety, maintains that its technology should not be used for domestic surveillance or as a weapon, leading to a high-stakes legal battle over corporate ethics versus national defense.
U.S. agencies utilize a legal loophole by purchasing "commercially available information" from private data brokers on the open market. These brokers aggregate data from internet browsing histories, tracking systems, and advertising marketplaces. By acting as a customer rather than a law enforcement entity seeking a search warrant, the government can feed massive amounts of personal data into advanced AI systems to identify patterns and predict behavior at a scale previously impossible.
ALT, a prominent Japanese AI startup, was delisted from the Tokyo Stock Exchange after investigators discovered that 80 to 90 percent of its reported revenue was fictitious. The company engaged in a "clearly constructed loop" where it paid other companies for R&D or advertising, and those companies would then pass the money through intermediaries back to ALT's sales partners. These partners used that same money to "buy" software subscriptions, artificially inflating sales figures to attract investors during its IPO.
Can information posted publicly on social media be freely scraped for commercial AI systems? According to recent rulings in Canadian courts, such as the British Columbia Court of Appeal decision regarding Clearview AI, the answer is no. The courts ruled that while information may be "public" on social media, it is not "publicly available" in a legal sense for commercial AI scraping. The judiciary emphasized that privacy rights are "quasi-constitutional," meaning companies must have a "reasonable purpose" for data collection, and mass systematic surveillance for commercial gain does not meet that standard.
Large language models are considered "dual-use" because the same underlying technology can serve both benign civilian tasks and lethal military applications. For example, a model designed to summarize a meeting or write a poem can also be used by a drone to identify targets or by a government to monitor an entire population's movements. This creates a dilemma for developers, who could face civil or criminal liability in international courts if their civilian software is repurposed for military strikes that violate international law.
Built in San Francisco by Columbia University alumni
