I'm generally on the side of believing that Amazon is not legally responsible for what people do after buying things that other retailers sell them on their marketplace. That's a bad road to go down, and I'm afraid it would lead to unintended consequences if applied broadly.

I also do not believe that Amazon is headed by a cabal of cackling villains who want kids to die because they make $2.39 every time one of them buys poison on their marketplace. Instead, I think they're a big company with a bunch of regular people focused on a bunch of different goals, not really paying attention to this kind of thing until it becomes an issue. Hanlon's razor, etc.

BUT, given the bad press, why wouldn't Amazon put high-purity sodium nitrite behind some kind of age verification? They already have that process in place for other items. Why not override the (algorithmically generated) "suicide bundle" with something else? I'm sure they've done that in the past, too. Why do they do nothing at all, and instead allow themselves to get caught up in a lawsuit and confirm people's belief that they're cackling villains? It seems like an unforced error.

> nobody should be deploying software in production that is capable of this kind of harm.

This "kind" of harm hides a huge amount of complexity. Let's assume the algorithm starts off on day 1 just looking for purchases that tend to appear in clusters. There's a potentially unbounded number of clusters that could be regarded as harmful. So the options are:

1. Be vigilant and special-case items one by one. Maybe they aren't putting enough resources behind it, or aren't reacting quickly enough, but even if they were, things would always slip through.
2. Never use any kind of bundling algorithm at all.

(1) is a forever-ongoing task, so it doesn't sound like you're referring to that; it's not something where you can "iron out the kinks". However, it also doesn't sound like you mean (2), as you seem to be saying there's a way to fix this once and for all. Is there a third option I haven't thought of?

EDIT - I guess maybe all "bundles" could require human curation before they are allowed to be displayed. It would be a gargantuan task, but it's a valid third option.

> Amazon shouldn't be selling sodium-nitrite to minors

The category of things that "might be harmful to minors" is impossibly broad. I suspect Amazon will be removing this chemical from their listings, just as Amazon and eBay constantly add new items to their banned-items lists. There are several chemicals I ordered from Amazon in the past that now can't be found online, and I expect it's only a matter of time until pure sodium nitrite also disappears.

The bigger question is: should we be forcing every online marketplace to perform strict age verification of every customer? From the article:

> Kristine was able to create an Amazon account even though she was under 18, skirting Amazon rules against underage account holders - the lawsuit notes that Amazon does not verify age.

Even if they did perform age verification on accounts, how would you expect Amazon to verify that the person pressing the Checkout button is actually the parent, and not a minor who sat down at their parents' computer? If a minor can get ahold of a credit card, it's not hard to imagine that they can also get ahold of their parent's driver's license to pass an arbitrary age check too.

It's really easy to make blanket dismissals of difficult situations in comments ("Amazon shouldn't be selling X to minors.") but the reality is that the only way to prevent these things is to remove them entirely from the marketplace.
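For what it's worth, the "bundling algorithm" the thread keeps referring to is, at its core, just co-occurrence counting over past orders, which is why special-casing items one by one is a forever-ongoing task: the algorithm has no notion of harm, so every bad combination has to be blocklisted after the fact. Here is a minimal sketch of that idea; the item names, threshold, and blocklist mechanism are all invented for illustration and are not Amazon's actual system:

```python
# Hypothetical "frequently bought together" bundler: count how often pairs of
# items appear in the same order, and suggest pairs seen often enough.
from collections import Counter
from itertools import combinations

def suggest_bundles(orders, blocklist=frozenset(), min_support=2):
    """Return item pairs co-occurring in at least `min_support` orders,
    skipping any pair that touches the blocklist (the "special-case
    items one by one" approach from the thread)."""
    pair_counts = Counter()
    for order in orders:
        for pair in combinations(sorted(set(order)), 2):
            pair_counts[pair] += 1
    return [
        pair for pair, n in pair_counts.items()
        if n >= min_support and not (set(pair) & blocklist)
    ]

# Illustrative data only (item names are made up):
orders = [
    ["item_a", "item_b"],
    ["item_a", "item_b", "item_c"],
    ["item_b", "item_c"],
]
print(suggest_bundles(orders))
# [('item_a', 'item_b'), ('item_b', 'item_c')]
print(suggest_bundles(orders, blocklist=frozenset({"item_a"})))
# [('item_b', 'item_c')]
```

The blocklist filter is the crux of the argument: it suppresses combinations someone has already flagged, but any harmful pair not yet on the list ships by default, so the vigilance can never stop.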