
In many projects, software requirements change, sometimes meaning we software engineers have to throw out our work and start over. Heck, we have entire methodologies built around the assumption that requirements will change.

So, why are requirements changes allowed to happen in cases where the software engineers then have to put in more work to fulfill them?

  • There shouldn't be a problem with this, except of course if they change the requirements without changing the deadline. I suspect the question you asked is not the question you have. Commented Dec 6, 2020 at 2:15
  • You misunderstood what the paying customer wanted, or the customer understood better from your prototypes what was actually needed. Commented Dec 6, 2020 at 3:32
  • I am tempted to rewrite this question along the lines of "isn't there a methodology which forbids changing requirements during a project" (so the wording no longer expresses some absurd misunderstanding about the world around us, as it does now). Unfortunately, this would probably invalidate some of the existing answers, and the question will probably be closed anyway.
    – Doc Brown
    Commented Dec 6, 2020 at 8:14
  • 1
    ... and yes, there is such a methodology, it is called Waterfall. For many real-world projects, the "waterfall" model proofed itself to be too unflexible.
    – Doc Brown
    Commented Dec 6, 2020 at 8:49
  • There's a bit of responsibility shifting happening here that you're probably not aware of. The fact that you have to put in (an unpleasant amount of) extra work is to a significant extent your fault (in part individually, and in part at the company level): you are the ones who didn't investigate the needs of the business in more depth, you're the ones who didn't use and/or master incremental development, you're the ones who built wrong assumptions into the software that now get in your way, and you're the ones who failed to apply a suitable pricing model and negotiate project constraints. Commented Dec 6, 2020 at 10:35

4 Answers

10

Because software engineering isn’t done for some abstract pursuit of software engineering, it’s about building a useful result.

If halfway through the construction of device A you find out that it won’t be useful, won’t be competitive, has huge unforeseen issues, etc., then you’re better off scrapping it and building device B. The dev effort sunk into A is sunk cost. Any assessment of what’s worth doing needs to be informed by today’s trade-offs, not by sunk cost that’s done and over with. While it’s always difficult to scrap your own work, it’s just part of building something ultimately useful.

  • It may be worth noting that attempting to avoid sunk cost leads to analysis paralysis. Rather than think of it as wasted effort, think of it as a stepping stone along your path to success. Commented Dec 6, 2020 at 20:53
4

why are requirements changes allowed to happen

They aren't "allowed" to happen. They happen.

There is simply a non-zero amount of time between the point in time where the requirements are defined and when the software is delivered. Within this non-zero amount of time, the universe has changed.

Even the process of gathering the requirements itself takes a non-zero amount of time, so the state of the universe at the point when you started writing down the requirements is different from when you finished writing them.

Maybe examples will help.

One example that was recently in the news is the guidance and navigation system of the OSIRIS-REx spacecraft, which touched down on the asteroid Bennu and took a sample from its surface.

Scientists studied the asteroid Bennu as best as they could, given that Bennu is less than 500 m wide and, even at its nearest point to Earth, still almost half a billion kilometers away. Based on all their measurements, their best models, and experience with other asteroids, they came to the conclusion that Bennu had a smooth, even, sandy surface. So, they designed a guidance and navigation system for the touchdown and sampling part of the mission that was based around a LIDAR altimeter.

So, the requirements they came up with were, among other things: the guidance software needs to be able to autonomously course-correct the landing with an accuracy of 25m and use the LIDAR altimeter to do so. And that's what the contractor implemented.

Except, when the spacecraft arrived at Bennu and sent home the first photos, much to everyone's surprise, they showed an uneven, rocky terrain with sharp edges, cliffs, and crevasses. And there was no single 25 m wide landing spot to be found.

Had they gone with your approach, they would have said "well, the requirements are what they are", and the spacecraft would have crashed, destroying a billion dollars of taxpayer money and 20 years of work, shattering the hopes and dreams of dozens of students working on the project, to say nothing of the careers of scientists who depended on publishing the results of the mission.

But what they did instead was adapt the guidance and navigation system to optical terrain navigation, using cameras that were never designed for real-time navigation, and improve the accuracy to less than 5 m. The mission was a resounding success, and in 2023 the spacecraft will return between 400 g and 1 kg of material to Earth. (The minimum required for mission success was about 60 g.)

Another example is a project I heard about recently, which actually did have a contract exactly like the ones you are envisioning. There was a fixed set of requirements that were never allowed to change, a fixed set of features that were never allowed to change, a fixed price that was never allowed to change, and a fixed delivery date that was never allowed to change. As you might have guessed, this was a government project. The timescale was about 3 years.

After about 1 year of development, there was a change in government, and the laws changed. This rendered the software completely useless. From this point on, the developers were forced to spend two years of their lives building software they knew would never be used. This was incredibly frustrating, and as a result, the company lost several of its best developers, who either left in frustration or simply burnt out and got sick.

After about 2 years, there was an organizational reform, and the department that this software was written for didn't even exist anymore.

It also became clear that the company was not going to make the deadline, because of the developers who had left or become sick, because the remaining developers were frustrated and unmotivated, and because of some unforeseen complexities in the implementation. As a result, they had to hire additional developers for a project that would never see the light of day, and the project turned into a death march toward a deadline that was completely meaningless.

They also weren't able to bid for the contract to develop the replacement software, even though they were in the perfect position to do so, since they had already acquired a certain amount of domain knowledge. They simply didn't have the developer capacity to support two such large projects at the same time.

So, in the end, everybody lost: the government, because they had to pay for software they were never going to use. The company, because they not only lost some of their best developers but also actually lost money on the project due to the last-minute hires they had to make. The developers, because even in the best case they were completely demotivated and frustrated, and in the worst case actually got sick because of it. And the taxpayers, who indirectly paid both for the software and the medical bills.

In neither of these two examples did someone "allow" the requirements to change. The requirements just changed.

  • Such a good answer for a question that is at risk of getting deleted.
    – Doc Brown
    Commented Dec 6, 2020 at 8:58
3

What do you mean, "allowed"?

Do you think that businesses like inefficiency, or throwing things away? Change is one of the true constants in the world - as is human inability to predict the future.

One tiny consequence of that fact is that engineers sometimes need to throw stuff away.

  • "Do you think that businesses like inefficiency, or throwing things away?" They tolerate it, at least. But why make engineers go through more trouble when they could just say "no" to the changes from the outset?
    – moonman239
    Commented Dec 6, 2020 at 2:04
  • @moonman239 - real life is not like a textbook with a bunch of axioms and theorems derived from them, which are mathematical law and can't be violated. Stuff is unforeseen. Other stuff happens. It must be accommodated.
    – davidbak
    Commented Dec 6, 2020 at 2:08
  • @moonman239 If they were able to say "no" without any repercussions, then they almost certainly would (and that is typically a business's default stance when there's a solid justification for refusing a requirements change). In reality, there are consequences to saying "no", including loss of face and reputation, pushing unsatisfied clients or customers toward the competition, and perhaps legal or contractual clauses such as financial penalties. Changes can also be beneficial: late changes may bring in additional revenue or goodwill if they weren't in the original specification. Commented Dec 6, 2020 at 2:08
  • @moonman239 - say “no” to whom? The universe? Entropy? Stuff changes.
    – Telastyn
    Commented Dec 6, 2020 at 2:50
2

There are three reasons for requirements to change:

  1. Documented requirements were incomplete, wrong, or inaccurate;
  2. The world continues to change independently of any projects that might be affected;
  3. Knowledge acquisition during the project, as software engineers and users learn from each other what’s really needed and what’s possible.

Changes in requirements will therefore happen, whether they are allowed or not. Three consequences:

  • ignoring them would lead to bad quality (i.e., according to ISO, a system that does not meet the real expectations of the customer);
  • accepting them has an influence on the project's effort and cost. How much depends on how the project is organized. The question of who will have to bear the work and cost consequences is a matter of contractual relationships and general management, beyond the scope of software engineering;
  • early identification of changes significantly reduces their cost.

Traditional methods cope with this reality by addressing the cost issue with some kind of change request procedure.

Agile methods emerged to better cope with the reality of requirements and their change:

  • requirements are identified just in time, when they are needed, and detailed only once they are sufficiently understood: this reduces the probability of unnecessary changes;
  • changes are welcome, which allows them to be taken into account as early as possible: there is no additional cost, since handling change is fully built into the development strategy.
  • "Traditional methods cope with this reality by addressing the cost issue with some kind of change request procedure" – In the best case. Worst case, they pretend changes do not exist. Commented Dec 6, 2020 at 15:22
