As a species we have always been attracted to novelty. This has probably served us well for things with short feedback loops as we quickly learnt whether the “new and shiny” thing actually helped us or harmed us.

In software development, the JavaScript community has been notorious for “new and shiny” frameworks. This has – for good reason – been mocked over and over again. But more concerning is the fact that I see many of the same worrying tendencies creeping in (or surfacing) in larger organisations and software projects outside the JavaScript domain. Sectors and organisations where caution, conservatism and stability are the norm. Yet for some reason a perhaps mistaken attempt at “conservatism” ends up being anything but.

I have discussed this with many other people, and the same pattern has surfaced in discussions and talks I have observed, all of which has only confirmed the picture I see.

My main concern is that I see many projects and organisations make technology decisions based on whatever is the latest trend at the top 5–10 technology companies. There seems to be a tendency towards “new and shiny” solutions to problems where, objectively, a much simpler solution would have been better. Many architects and developers seem to look to solutions from Facebook, Amazon and Google and conclude: “if only we use the same technology, structure and architecture, then we are home safe. If it works for Facebook, Amazon and Google, then it will surely be the right scalable approach for us as well.”

This “lazy” approach to technology and software can not only cause many problems up front, it can also end up with a worse result than sticking to basics. That is not to say that mimicking giants, or looking at established and proven solutions for inspiration, is never the right approach – but many people seem to miss a vital step.

Which problem are we trying to solve?

The key to finding good and relevant inspiration from others is first of all to find out what problem you are trying to solve. This can be at the top level, looking at the entire solution, or zoomed in on any smaller, specific part of it.

“If you do not know where you are going, any road will take you there”

Once you have figured out which problem (or problems) your solution is trying to solve, then you can go looking at whether this problem has been solved before (which it probably has).

But there is a very important point to keep in mind when looking at others’ solutions, and this is where I think a lot of people and organizations put too little emphasis. First you had to figure out what problems you were trying to solve – I do not think many people go wrong here. But when looking at others’ solutions you also need to do some harder discovery and thinking, because in order for their solution to be a good fit for yours, you need to figure out: what was the problem that they were trying to solve when they arrived at their solution?

I have encountered people recommending NoSQL databases because they are so flexible that you can put any data into them without it being structured. And that is clever – if you are working with unstructured data. If you are working with perfectly fine structured data, then all you will likely gain is complexity and performance issues. On top of that, in the cases where I heard them proposed, none of the developers on the teams had any experience with the technology, so you can add that cost and complexity as well. It just does not add up.
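To make this concrete, here is a small hypothetical sketch (the data and names are invented for illustration, not taken from any of the projects mentioned). With well-structured data, a plain relational database answers a question in one declarative query; a schemaless key-value store holds the same data just fine, but the structure the database no longer knows about has to be reimplemented, and kept correct, in application code.

```python
import sqlite3

# Hypothetical, well-structured data: property id, owner, assessed value.
rows = [
    ("DK-001", "Alice", 250_000),
    ("DK-002", "Bob", 410_000),
    ("DK-003", "Alice", 180_000),
]

# Relational approach: the schema captures the structure once,
# and the question is a single declarative query.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE property (id TEXT PRIMARY KEY, owner TEXT, value INTEGER)")
db.executemany("INSERT INTO property VALUES (?, ?, ?)", rows)
total_per_owner = dict(
    db.execute("SELECT owner, SUM(value) FROM property GROUP BY owner")
)

# Schemaless key-value approach: the same records as opaque documents.
# The grouping and summing now live in application code instead.
store = {pid: {"owner": owner, "value": value} for pid, owner, value in rows}
totals = {}
for doc in store.values():
    totals[doc["owner"]] = totals.get(doc["owner"], 0) + doc["value"]

assert total_per_owner == totals  # same answer; one of them came for free
```

Both approaches work, but the second one trades a one-line query for hand-rolled logic that every future question about the data will need again.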

As mentioned earlier, the most worrying part is that the people recommending these solutions are not necessarily young, eager hackers, but experienced architects working in large organizations, where I would have expected far more pushback and conservatism.

Why is this happening?

So the big question is, of course: why is this happening?

We could call it incompetence and end it with that. But that would be rather ignorant and unhelpful.

I think there are a few underlying explanations for why these decisions are made. I am fairly convinced that people make these decisions under a presumption of being right, and perhaps even of making a “safe” choice. As said earlier: “if it works for Google or Facebook, then it should also work for us.”

Hence, part of the problem, as highlighted before, is that this lazy thinking can actually cause more harm than good. It is probably not a “safe” choice; if taken lightly, it is probably a wrong and dangerous one.

But what I also found when talking to people about similar experiences is that other factors can drive this trend. A recent example was a large Danish government project where a large management consultancy firm had been advising and was pushing really hard for Amazon-based infrastructure and solutions. That is not necessarily bad, as it could be the right solution.

But in this case they were, for instance, pushing Amazon’s DynamoDB really hard, arguing that it was the future and could hold just about anything. Again true. But that was simply not a core issue of the problem they were trying to solve. All their data was well structured – almost as structured as it gets. This was government data in the real-estate sphere: scrutinized, audited and very well organized. To further complicate matters, none of the people who were going to be on the team had any experience with DynamoDB, so everyone had to be sent off on weeks of courses. Again adding cost and complexity.

What drives these kinds of decisions? Should you put on your tinfoil hat and claim that, in the case above, the management consultancy firm may have had economic interests in pushing a certain technology? Perhaps. But then why did the people working for the government not push back hard enough? It could be that if they had been able to see that the decision was bad, they would not have needed the management consultancy firm in the first place. Or perhaps it is a bad case of the sunk cost fallacy: an expensive bill from a management consultancy firm is hard to swallow if you then go against its recommendations.

I do not know the final answer, but I find the trend a little worrisome and sad. It seems like software development’s version of the “hedonic treadmill”, and not really progress at all.

In my book, simplicity always wins. You win by subtracting complexity – not by adding it. Start with the simplest possible solution to the problem and see where it gets you. Simple solutions are cheaper, easier to implement, easier to test, faster to ship and hence faster to get feedback on. Once you have that feedback – whether from unit tests, a proof of concept or user tests – you can decide to add complexity if your simple solution proves too slow or too rudimentary.

Always start with the simple solution.