• 0 Posts
  • 7 Comments
Joined 3 years ago
Cake day: July 3rd, 2023

  • The pricing question assumes the current model (cloud inference, centralized compute, hyperscaler margins) is the only model.

    Local inference flips that math entirely. If the model runs on your hardware, the marginal cost to the provider is close to zero. The pricing problem is a distribution problem, not a compute problem.

    What I think actually happens: cloud AI settles at $20-50/month for power users who need the latest frontier models and don’t want to manage hardware. That’s sustainable. The “free tier” disappears or gets severely throttled.

    But for a large chunk of use cases (summarization, classification, drafting, local assistants) models small enough to run on a consumer GPU are already good enough. That market doesn’t need to pay $50/month to Anthropic. It needs a good local runner and a one-time hardware investment.

    The companies that will survive the pricing correction are the ones who either have genuinely differentiated frontier capability, or who make local deployment easy enough that users own their own stack.
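The "one-time hardware investment" trade-off above can be sketched as a back-of-envelope break-even calculation. The GPU price and electricity cost below are illustrative assumptions, not sourced figures:

```python
# Rough break-even: one-time local GPU purchase vs. an ongoing cloud AI plan.
# All numbers are illustrative assumptions, not quoted prices.

GPU_COST = 1200.0             # assumed consumer GPU with enough VRAM for a small model
POWER_PER_MONTH = 15.0        # assumed electricity for moderate daily inference
CLOUD_PLAN_PER_MONTH = 50.0   # upper end of the $20-50/month range mentioned above

def breakeven_months(gpu_cost: float, power_monthly: float, cloud_monthly: float) -> float:
    """Months until cumulative local cost drops below cumulative cloud cost."""
    saving_per_month = cloud_monthly - power_monthly
    if saving_per_month <= 0:
        return float("inf")  # local never pays off at these numbers
    return gpu_cost / saving_per_month

months = breakeven_months(GPU_COST, POWER_PER_MONTH, CLOUD_PLAN_PER_MONTH)
print(f"Local hardware pays for itself after ~{months:.1f} months")
```

At these assumed numbers the hardware pays for itself in under three years, which is the whole argument: for workloads a small local model already handles, the subscription is the expensive path.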



  • Sorry to break it to you bruh, but if Facebook owns the IP, they’re harvesting and monetizing the account data, end of.

    That includes the subsidiaries, like Instagram and WhatsApp, as well.

    Let’s also not forget that breaches and illegal activity expose user data too, not just greed; the mere fact that your data sits in yet another pot increases your risk. And yes, all corporations are greedy and evil, but they’re definitely not all equal in how bad they are, and FB is one of the worst.

    Are you willing to accept all of these intrusions into your life? Many people are, but I’m not one of them.