I think you have a different interpretation of the singularity from the common one. I agree that machines have better computation speed and memory. But the argument against all of that is that all the information the machines know still came from humans. If a robot finds an unknown substance or element from space, it will not know what to do with it. It needs human intervention to understand it and use it. But after the singularity, machines will truly be smarter than humans and won't need our guidance. They can discover and create things unknown to man. They can progress and grow all on their own.
That is precisely the point. Passing that frontier of the singularity will require a combination of the mechanical, the digital, and our organic brain, what some have defined as a "cybernetic organism," or cyborg.
I think it is possible, but many are convinced that it is not. Current AI and machines mostly rely on if-else logic for decision making. I don't think anyone has any idea how to add the outside-the-box thinking that humans have. But that is also the fear. If and when machines develop this type of thinking, it might be impossible to control them.
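To make that point concrete, here is a minimal toy sketch of if-else decision making (the rules, inputs, and labels are all hypothetical, invented for illustration): the system can only handle cases its human authors anticipated, and anything outside those rules falls through.

```python
# Toy illustration of rule-based (if-else) decision making.
# Every outcome below was anticipated and written by a human;
# there is no path for the machine to invent a new category.
def classify_substance(color: str, hardness: int) -> str:
    """Return a label only for substances covered by hand-written rules."""
    if color == "silver" and hardness >= 7:
        return "likely a hard metal"
    elif color == "clear" and hardness < 3:
        return "possibly a soft crystal"
    else:
        # Anything outside the anticipated cases falls through:
        # the machine has no rule, so a human must step in.
        return "unknown: human intervention required"

print(classify_substance("silver", 9))  # covered by a rule
print(classify_substance("green", 5))   # not covered -> unknown
```

The "unknown substance from space" scenario above is exactly the final else branch: the program cannot go beyond the rules it was given.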
That is true, this is a different view of the singularity. However, the idea that all information comes from humans is not accurate. When a machine is filming something moving, it is actually generating data. I will grant that this is still the minority of data, although that will change as we get embedded AI.
I think one of the keys is that people are still stuck on it "not being human," meaning it is not intelligent. I agree it isn't human intelligence. However, we have an issue defining what intelligence is anyway.
However, the idea that all information comes from humans is not accurate.
I think this is actually one of the requirements of the singularity. You mentioned that when a machine is filming, it is generating data. But to process that data, it is using information that came from humans. A petal is a petal because humans said so. H2O is water because humans described it so. All the information that machines and AI have right now is from humans. Sure, they can capture "new" data, but that is just a reproduction of information that humans have already documented.
Let's say, for example, an unidentified element or metal is discovered. The machines of today won't be able to identify it. Sure, they can describe it with parameters from human knowledge, like its color, hardness, texture, etc. Humans can name it element/metal 250x and think of names for its composition. It will take a machine that can create its own data to do something similar.
What you are describing is language. Machines have already done that. In fact, they operate on a completely different language structure than we do.
Again, you think that only humans can identify and name a new element. I disagree. AI is gaining a collective understanding of the world. It will frame that understanding differently than we do. That is why results from prompts change when the same question is asked. The model is not simply mirroring the data that was put in, like a traditional database would.
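One concrete reason the same prompt can produce different answers is that generative models sample from a probability distribution over possible next tokens rather than looking up a stored row. Here is a minimal sketch of temperature-based sampling (the logit values are made up for illustration):

```python
import math
import random

def sample_token(logits, temperature=1.0, rng=random):
    """Sample one token index from softmax(logits / temperature)."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)                              # subtract max for stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]            # probabilities sum to 1
    r = rng.random()
    cum = 0.0
    for i, p in enumerate(probs):                # inverse-CDF sampling
        cum += p
        if r <= cum:
            return i
    return len(probs) - 1

# The same "question" (identical logits) can yield different tokens:
logits = [2.0, 1.5, 0.5]
samples = {sample_token(logits) for _ in range(200)}
print(samples)  # typically several distinct token indices appear
```

At very low temperature the distribution collapses toward the highest-scoring token, which is why lowering temperature makes model output more repeatable; this is a toy sketch, not any production model's actual decoding loop.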
I only used one simple example. To continue the hypothetical: the machines of today won't be able to think of practical uses for the new element or metal. They can't yet create something from nothing, or create something new that hasn't been discovered or created by humans. What you described in your second paragraph is AI. It is smart, but it is not yet at the point of the singularity. The fact that we don't yet have true inference or AGI, capabilities that fall well short of what the singularity would require, is proof of that.
The fact that we don't yet have true inference or AGI, capabilities that fall well short of what the singularity would require, is proof of that.
Considering there is no agreed-upon definition of AGI, how does anyone know? Some claim Altman has already achieved it. I don't know if he has or not.
Hence, people want to ascribe criteria to the singularity where there are none. Some state it has to involve superintelligence, while others focus on compute. The point here is that if we define the singularity as the point where all the old rules, metrics, and norms that shape society stop applying, we have been there for years.
GDP, a baseline economic term, lost its validity decades ago. Growth rates fall into that same category. The relationship between capital, labor, and economic productivity is now outdated.
Economics is just one field where the old framework is being washed away. And that was not a human creation.