colloquially, most people today mean genAI like LLMs when they say “AI”, just for brevity.
Because there’s this default assumption that data centers can never be powered by renewable energy
that’s not the point at all. the point is, even before AI, our energy needs were already outpacing our ability/willingness to switch to green energy, and we were burning more fossil fuels than at any point in the history of the world. Now AI is adding a whole other layer of energy demand on top of that.
sure, maybe, eventually, we will power everything with green energy, but… we aren’t actually doing that, and we don’t have time to put off the transition. every bit longer we wait adds to the damage to our climate and ecosystems.
Instead of making suicide harder, we should be treating the root causes of suicide.
Or… both?
If people get hurt due to gun accidents, I highly doubt they’d be happy if we took their guns away, since that’s like solving traffic deaths by banning cars.
it’s not even remotely the same thing, since cars’ primary purpose is not killing. Also, there’s a very wide middle ground of options between “do nothing” and “take all guns away”. This is not a binary issue.
Suicides and gun accidents are certainly interesting statistics, but mixing them with homicides just makes it harder to see what’s going on and arrive at effective solutions.
it doesn’t really. what does make it harder to arrive at effective solutions is making every possible excuse to avoid gun control.