I think that had more to do with the criticality of the mission they served. It took a long time until the organizations were satisfied with hardware and software/firmware reliability.
I haven’t done it since 1968, but if I remember correctly it was somewhere around 1400, as Turbo suggested.
It was due to the government not being willing to spend the money on replacing them. In the end it was far, far cheaper to design new than to keep repairing the older systems.
Qualifying completely new technology, especially in medical or military applications, is always an expensive and time-consuming activity, and they are always loath to change anything they do not have to change. I will never forget the grilling I got over multiple days in a Critical Design Review for the first microcontroller-based, shoulder-fired missile we developed. We spent days trying to convince them it would be at least as safe as, if not safer than, the existing analog systems then deployed. They didn’t even like logic circuits, let alone a microcontroller with embedded control software, and there was enough brass in the room to rattle anyone.

In the end we prevailed at convincing them the system was safer than what they had and offered far more flexible operation in the field. But even after the hardware was delivered, it took them many years to fully qualify it for battlefield use.

I have tons of examples that go along with what jt said. We still repair old radar-based systems that were developed in the 1980s. They won’t let us change a thing; it is practically copy exact…