I think people should be a lot more worried about AI than they are.
Unlike all our previous technologies, AI is something we really don't understand well. Much of it was built by trying to copy aspects of how our own brains are constructed. This isn't secret information: you don't have to read much to find plenty of AI researchers who will admit as much.
In many ways, existing AI is already smarter than us, and it is getting smarter very fast. But that's not typically what's meant by "the singularity". That term generally means the point at which AI takes over its own design and becomes "super smart". It hasn't happened yet, but it seems plausible that it could happen within the next decade.
Unfortunately, I don't think current society is well-equipped for proper risk management of AI technology, which is why I consider AI the greatest existential threat we face.