No, error correction in general is a different concept from GIGO. Error correction requires someone, at some point, to have entered the correct figures. GIGO tells you that it doesn't matter if your logical process is infallible: your conclusions will still be incorrect if your observations are wrong.
GIGO is an overused idea that's mostly meaningless in the real world. The only way for the statement to be true in a general sense is if your input is uniformly random. Anything else carries some information. In practice, for Babbage's quip to hold, the interlocutor can't merely supply any wrong figures; they need to supply ones specifically engineered to be uncorrelated with the right figures.
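To make that concrete, here's a toy sketch of my own (nothing to do with Babbage's machine): a simple repetition code. Every figure the decoder receives below is garbled, yet the right figures come back out, precisely because the errors are random flips rather than engineered to mimic a different message.

```python
def encode(bits):
    # Repetition code: send each bit three times.
    return [b for b in bits for _ in range(3)]

def decode(noisy):
    # Majority vote per triple recovers the original as long as
    # at most one copy in each triple is corrupted.
    return [int(sum(noisy[i:i+3]) >= 2) for i in range(0, len(noisy), 3)]

message = [1, 0, 1, 1]
sent = encode(message)   # [1,1,1, 0,0,0, 1,1,1, 1,1,1]
garbled = sent[:]
garbled[1] ^= 1          # flip one copy of the first bit
garbled[3] ^= 1          # flip one copy of the second bit
print(decode(garbled))   # the right figures come back out
```

An adversary who knows the scheme could flip two copies per triple and defeat it; that's the "specifically engineered" case, and it's the exception, not the rule.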
Again, in a general sense. Software engineers are too used to computers being fragile wrt. inputs. Miss a semicolon, and the program won't compile (or worse, if it's JavaScript). But this level of strictness wrt. inputs is a choice in program design.
Error correction was just one example anyway. Programmers may be afraid of garbage in their code, but for everyone else, a lot of software is meant to sift through garbage, identifying and amplifying desired signals in noisy inputs. In other words, they're producing right figures in the output out of wrong ones in the input.
I don't think machines should rely on opaque logic to assume and "correct errors" in user input. It's more accurate to "fail" than to hand out an assumed output.
And also:
> they need to supply ones specifically engineered to be uncorrelated with the right figures.
I assume most people (including me) will understand it this way when told to "input wrong figures".
> In other words, they're producing right figures in the output out of wrong ones in the input.
This does not refute the concept of GIGO, nor does it have anything to do with it. You appear to have missed the point of Babbage's statement. I encourage you to meditate upon it more thoroughly. It has nothing to do with the statistical correlation of inputs to outputs, and nothing to do with information theory. If Babbage were around today, he would still tell you the same thing, because nothing has changed regarding his statement, because nothing can change, because it is a fundamental observation about the limitations of logic.
I don't know what the point of Babbage's statement was; it makes little sense other than as a quip, or, as 'immibis suggests upthread[0], as a case of Babbage not realizing he was being tested by someone worried he was a fraud.
I do usually know what the point of any comment quoting Babbage's statement is, though, and in such cases, including this one, I almost always find it wanting.
I suppose spell checking is a sort of literal error correction. Of course, this does require a correct list of words, and for the misspellings not to be on that list.
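A minimal sketch of that kind of checker (the tiny word list here is a stand-in for a real dictionary, and the names are my own): flag anything not on the list, and suggest the nearest listed word by edit distance.

```python
# Toy dictionary; a real spell checker would load a full word list.
WORDS = {"the", "quick", "brown", "fox", "jumps", "over", "lazy", "dog"}

def edit_distance(a, b):
    # Classic Levenshtein distance via dynamic programming,
    # keeping only the previous row of the DP table.
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,          # deletion
                           cur[j - 1] + 1,       # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

def correct(word):
    # Known words pass through; unknown ones get the nearest match.
    if word in WORDS:
        return word
    return min(WORDS, key=lambda w: edit_distance(word, w))

print(correct("qiuck"))  # suggests "quick"
```

Note that if "qiuck" were itself on the list, the checker would wave it through — which is exactly the caveat above.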
Honestly, I see this as being not about error correction but about divining, with perfect accuracy, what you want. And when you say it that way, it starts sounding like a machine for predicting the future.