Newtonian physics specifies a set of laws that describe very well how the universe behaves at the scale of space and time we can perceive with our senses. At this scale, the universe is 'well behaved' within the limits of those laws, which means you can predict very well the interactions between bodies using Newtonian physics.
At the scale of the very big or the very small, however, Newton's laws break down. In the domain of very small particles, you have to resort to quantum physics. At very large speeds or with very large masses, you have to use Einstein's physics to describe and predict the behavior of the universe.
I like to think of software the same way. Under well-controlled conditions, software is well behaved under the traditional rules of software engineering. In this situation, you can follow the textbook recommendations when designing your architecture. You can design your database using the traditional normalization rules. You can use the popular frameworks, libraries, tools, and methodologies (both the classical ones and the 'agile' and 'extreme' ones). Those rules, tools, and methodologies exist and are broadly accepted and deployed for a reason.
This is the situation the VAST majority of existing software is in. It is the condition you find when you are dealing with reasonable requirements for scalability, availability, responsiveness, data volume, or delivery time. And the definition of 'reasonable' is pushed further and further as time goes by.
However, when the requirements for your software are pushed to the limit, and by limit I mean a truly extreme case, you can no longer resort to the textbook rules. You have to think of alternative designs and solutions. In your software architecture, you will sacrifice the purity of your design; you will have to compromise it to extract the maximum performance. On the database side, you will have to denormalize and think of non-traditional ways of storing and distributing your data. On the tooling side, you will have to develop your own homegrown tools, libraries, and frameworks, or adapt existing ones to your needs.
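As a minimal sketch of the denormalization trade-off mentioned above (the names and data here are hypothetical, and in-memory dicts stand in for database tables): duplicating a user's name into each order record lets reads touch a single record, at the cost of redundant data that must be kept consistent at write time.

```python
# Normalized: orders reference users by id; reading an order's user name
# requires a second lookup (the in-memory analogue of a JOIN).
users = {1: {"name": "Alice"}}
orders_normalized = [{"order_id": 100, "user_id": 1}]

def order_user_name_normalized(order):
    return users[order["user_id"]]["name"]  # extra lookup per read

# Denormalized: the name is copied into the order record at write time,
# so each read is a single-record access with no join or lookup.
orders_denormalized = [{"order_id": 100, "user_id": 1, "user_name": "Alice"}]

def order_user_name_denormalized(order):
    return order["user_name"]  # no lookup

print(order_user_name_normalized(orders_normalized[0]))      # Alice
print(order_user_name_denormalized(orders_denormalized[0]))  # Alice
```

The denormalized read is cheaper, but every update to a user's name now has to be propagated to all of that user's orders, which is exactly the kind of purity-for-performance compromise the textbook rules warn against.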
At first, those measures will look like heresy. You will think the people who developed the software are a bunch of morons who have no idea what software development is about. You will feel like you work at the perfect place to feed The Daily WTF for years. But if you take a step back, you may realize that there isn't a ready-made solution for everything. Sometimes you will find you are on your own, and you still have to deliver your software when the deadline comes. Don't hold on to a rule just because your professor told you so.
I am not advocating that we give up on everything we have learned about software architecture. As I said, software architecture rules apply in the vast majority of cases. They exist and are broadly accepted for a reason. But we need to understand the reason behind those rules, or we will be engaging in Cargo Cult Programming, perhaps doing something out of some vague sense of duty to the ghosts of Boyce-Codd.
Because the laws of software development apply to the vast majority of cases, if you see someone breaking those laws, it is probably because that person is just a hacker. But it might be because he or she is a clever designer solving a difficult problem in a way no one has thought of before. In general, though, it is better to come up with a clean and simple design. As Ted Dziuba says, "Scalability is not your problem, getting people to give a shit is".
But if you are going to judge someone, or if some day you find yourself in the position of designing software to handle ridiculous volumes of transactions or data, remember to keep an open mind.
The High Availability blog is a very useful source of information about the architecture of some of the highest volume websites in the world.
Martin Fowler on the Ultimate Heresy: a transactionless environment at eBay. Plus, eBay's database doesn't have foreign keys!