
No, they'll still be at fault because if you have a massive pile of highly sensitive data online, and all it takes is one vulnerability to get at it all, then your security model was awful.

Was this database typically accessed with broad queries that could pull large amounts of it at once, or was it the sort of thing where a few specific keys were used to identify a single client record? My guess is the latter, and if so, another layer of security should have been in place. Stored procedures can be used to require very specific queries to access very specific records--this falls under the "prevention" category. Next, monitors should be deployed so that someone gets notified right away if some session makes a large number of successive queries to work around that restriction--this goes under the heading of "detection". Together, these two measures would have stopped attackers from getting the goods with just one exploit.
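The "detection" half can be sketched as a simple per-session rate monitor. This is a minimal illustration in Python, not anything the breached company actually ran; the class name, thresholds, and sliding-window approach are all my own assumptions:

```python
import time
from collections import deque

class QueryRateMonitor:
    """Hypothetical monitor: flag a session that issues more than
    max_queries lookups within a sliding window of `window` seconds."""

    def __init__(self, max_queries=100, window=60.0):
        self.max_queries = max_queries
        self.window = window
        self.sessions = {}  # session_id -> deque of query timestamps

    def record_query(self, session_id, now=None):
        """Record one query; return True if the session should be flagged."""
        now = time.monotonic() if now is None else now
        q = self.sessions.setdefault(session_id, deque())
        q.append(now)
        # Discard timestamps that have fallen out of the window.
        while q and now - q[0] > self.window:
            q.popleft()
        return len(q) > self.max_queries
```

In a real deployment the flag would feed an alerting pipeline (and possibly kill the session) rather than just returning a boolean, but the point stands: the check is cheap, and one exploited session hammering out thousands of record lookups lights up immediately.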

...yet companies regularly fail to take these kinds of simple and rational steps seriously, and then act like it was all just too hard for anyone to possibly defend against.


