According to Wikipedia, "In several fields, especially computing, deprecation is a term for the discouragement of use of some terminology, feature, design, or practice, typically because it has been superseded or is no longer considered efficient or safe, without completely removing it or prohibiting its use."
Generally speaking, computing systems are designed to be used by a wide variety of people. When significantly better approaches are discovered, we want to encourage people to adopt them, but we also want our systems to remain useful to the people who have been using them all along.
This sometimes leads us to deprecate certain features, in favor of replacement features.
Here, it's worth invoking the original (pre-computer) sense of the word "program" -- program in the sense of project management or time management.
The program here is:
- A new approach is considered better than an old approach.
- The new approach is implemented and proves to have significant advantages with no obvious disadvantages.
- The new approach is made available for use and is well documented.
- The old approach is deprecated, with pointers to the new mechanism.
- Significant time passes.
- Eventually, either (a) we find and document issues and contexts where the old approach is better, or (b) we reach a point where we decide that the disadvantages of removing the old approach are outweighed by the benefits of moving to the new approach.
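The deprecation step of this program is easy to sketch in code. Here's a minimal, hypothetical Python example (the function names `parse_config` and `parse_config_v2` are invented for illustration): the old entry point keeps working, but emits a `DeprecationWarning` that points callers at the replacement.

```python
import warnings

def parse_config_v2(path):
    """Hypothetical new approach: the preferred replacement."""
    # A stand-in implementation for the sake of the example.
    return {"source": path}

def parse_config(path):
    """Hypothetical old approach, deprecated in favor of parse_config_v2."""
    warnings.warn(
        "parse_config is deprecated; use parse_config_v2 instead",
        DeprecationWarning,
        stacklevel=2,  # attribute the warning to the caller, not this wrapper
    )
    # Keep the old entry point working by delegating to the new approach.
    return parse_config_v2(path)
```

Existing callers continue to get correct results, while the warning (and its docstring) points them to the new mechanism; removal, if it ever happens, comes only after significant time has passed.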
It's sometimes tempting to think that skipping straight to the final step is the right process. But that only "works" in contexts where nothing depends on the old approach (and even there, you might eventually discover that the context was not as free of dependents as you thought).