I think a valid part of the question of who invented something is "who built the first working device" - describing something in theory and building a working device are not the same thing.
AG Bell wasn't the first to conceptually invent the telephone; he was among the first (along with Elisha Gray) to make a practical working telephone and, later, a practical working telephone system.
"It was able to squeak, but not to speak. Experts and professors wrestled with it in vain. It refused to transmit one intelligible sentence." [0]
"A translation of Legat's article on Reis' invention was obtained by Thomas Edison prior to his filing his patent application on a telephone in 1877. In correspondence of 1885, Edison credits Reis as having invented "the first telephone", with the limitation that it was "only musical not articulating"." [1]
Fascinating stuff nonetheless, these inventors and their ideas... See also previous experimenters [2]
[0] https://en.wikipedia.org/wiki/Johann_Philipp_Reis
[1] https://en.wikipedia.org/wiki/Reis_telephone
[2] https://en.wikipedia.org/wiki/Johann_Philipp_Reis#Previous_e...
There's something to be said for mass production being another distinct stage of invention. Karl Benz may have invented the first internal combustion engine car, and plenty more people built cars by hand for the rich, but Henry Ford made cars that anyone could afford.
To some degree, this is a consequence of the nature of the field you're working in:
* if the physics is so completely understood that you can confidently predict something will work from your sofa, and give an error-free recipe to build it, you can indeed invent from theory... but how deep can such an invention be if the problems of the field are already completely solved?
* if you are working in a field at the edge of human understanding, you cannot have confidence in your ideas without testing them experimentally; a theoretician makes at most a minor contribution to the actual inventions being realized, because he is producing hypotheses that are most likely somewhat wrong.
This latter kind of "theoretical" invention is heavily subject to survivorship bias. Fifteen competent theoreticians make different predictions - all according to the best, though incomplete, model of the world; a successful experiment validates exactly one of them, and we end up exalting the lucky winner as the "inventor".
I had that thought too: describing that something might be physically possible isn't really inventing it; you have to build (and arguably sell) the device too. Re-organizing someone else's equations and saying it's technically possible may be enough to publish a paper, but it certainly doesn't rise to the standard of inventing in my mind.
Theodore Maiman and the laser.
The only point in asking in the first place is pride and/or greed.
That is correct, but the article explicitly addresses this point and argues that the evidence points to Lilienfeld producing a working transistor.
"Later, some people claimed that Lilienfeld did not implement his ideas since "high-purity materials needed to make such devices work were decades away from being ready,"[CHLI] but the 1991 thesis by Bret Crawford offered evidence that "these claims are incorrect."[CRA91] Lilienfeld was an accomplished experimenter, and in 1995, Joel Ross[ROS95] "replicated the prescriptions of the same Lilienfeld patent. He was able to produce devices that remained stable for months."[ARN98] Also, in 1981, semiconductor physicist H. E. Stockman confirmed that "Lilienfeld demonstrated his remarkable tubeless radio receiver on many occasions".[EMM13]"
For many things (computers, rocketry, aerospace, etc.), and for different reasons, Germany in the years around the Second World War was a pretty bad place to get international credit for your accomplishments.
What’s up with the ending? It makes no sense to me.
>Where are the physical limits? According to Bremermann (1982), a computer of 1 kg of mass and 1 liter of volume can execute at most 10^51 operations per second on at most 10^32 bits. The trend above will hit the Bremermann limit roughly 25 decades after Z3, circa 2200. However, since there are only 2 x 10^30 kg of mass in the solar system, the trend is bound to break within a few centuries, since the speed of light will greatly limit the acquisition of additional mass.
They shift from talking about transistor density to somehow considering a supermassive construct. It reminds me of LLM mashups.
It's a natural extension of the ideas being discussed - the limit on computation per gram of mass has energetic bounds as well, and configurations nearing that upper limit start to look more like nuclear explosions than anything we'd regard as structured computation. The extremes are amazing to consider - things that look and act like stars, but are fantastically precise Turing machines, and so on.
It's a theme that sci-fi authors have explored deeply. Accelerando is a particularly fun and worthwhile read if you haven't already!
It seems to refer to the previous paragraph:
> The naive extrapolation of this exponential trend predicts that the 21st century will see cheap computers with a thousand times the raw computational power of all human brains combined
i.e. putting an upper bound on the exponential with solar system mass
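For anyone who wants to check the arithmetic, here's a minimal back-of-the-envelope sketch of that extrapolation. The Z3 throughput and the roughly 100x-per-decade trend factor are my own illustrative assumptions, not figures quoted verbatim from the article; the 10^51 ops/s per kg and 2 x 10^30 kg numbers come from the quoted passage:

    import math

    # Assumed starting point and trend (not quoted from the article):
    z3_year = 1941
    z3_ops_per_sec = 10.0          # assumed order of magnitude for the Z3
    growth_per_decade = 100.0      # assumed ~100x raw compute per decade

    # Figures from the quoted passage:
    bremermann_ops_per_kg = 1e51   # Bremermann (1982) limit for a 1 kg computer
    solar_system_mass_kg = 2e30    # total mass in the solar system

    # Decades until a 1 kg computer following the trend hits the Bremermann limit.
    decades_to_limit = math.log(bremermann_ops_per_kg / z3_ops_per_sec,
                                growth_per_decade)
    print(z3_year + 10 * decades_to_limit)   # ~2191, i.e. "circa 2200"

    # Converting the whole solar system into Bremermann-limited computers
    # only extends the same trend by a limited number of further decades.
    extra_decades = math.log(solar_system_mass_kg, growth_per_decade)
    print(extra_decades)                     # ~15 more decades

With those assumptions, the 1 kg trend line crosses the Bremermann limit around 2190, and even co-opting all of the solar system's mass buys only about 15 more decades of the same trend, which is why the article says it must break within a few centuries.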
I personally detest the way we sanctify a few lone individuals while forgetting the bulk of the community. I don't care who published the first patent for the transistor. He or she certainly cannot be credited for all the work that has been put in so that I can today use a handheld device to post this comment.
I see where you're coming from, but while that's the case with most work ("normal science"), it often isn't the case for the truly revolutionary stuff. Many breakthroughs happen not because of the bulk of the community, but against it, often at the highest cost to the individuals.
Surprisingly, that also isn't true. Yes, people go against the grain, and that is required to actually make paradigm shifts, but they're never alone, nor do they build from scratch. It may be a few against many, but it is almost never one against all. That one only prevails due to support from others. Those names don't shine, but that doesn't mean they weren't critical to the advancement of the field.
Strong claims - maybe it's a good time to do some homework instead of arguing without evidence?
Galileo was sentenced to house arrest for heresy. Boltzmann died by suicide after a lack of acceptance by the scientific community. It's a very long list, and something that has actually been studied.
We all stand on the shoulders of giants. Giants who are just a bunch of people in a trench coat.
This is a good piece of writing that nicely illustrates how what we perceive as "who invented something" is mostly a function of money and politics.
If we’re talking about invention rather than discovery, then who created the inflection point between potential and actual use seems relevant, if not dispositive.
I wish Jürgen Schmidhuber would switch back to actually doing AI research instead of obsessing over "who invented what" because he feels he has somehow been academically "robbed" at some point in his career.
He's now officially become a full-blown pariah in the AI world, with most relevant people in the space running away at the first sight of his goatee at conferences, knowing exactly the kind of complete and utter crank he's become.
Was anything he claimed in the article incorrect? Personally, I enjoy these types of historical stories.
I'll just leave this here: https://scholar.google.com/citations?user=gLnCTgIAAAAJ&hl=en so that maybe you realize that's a bit of a tall claim from a random about one of the top researchers in AI, no matter what their opinions are. Perhaps you should look up what a "crank" actually is before labeling researchers, just because they don't match your religion.
Your link did not work for me. "We're sorry…but your computer or network may be sending automated queries."
It's his Google Scholar profile; you can search for it.
It's Juergen Schmidhuber's Google Scholar page
It's going to be Schmidhuber, isn't it?! /s