Discussion about this post

DAVID REARICK:

...but Tom, your definition of learning is not that different from deep A.I. learning. Take the hot dog example and apply it to humans: we recognize a hot dog because we have seen images of multiple hot dogs and NOT hot dogs, and "learned" what a hot dog is and what it isn't. That seems pretty similar to generative A.I. If you show me a rubber hot dog and ask me what it is, I'll still say it's a hot dog; so will the A.I., as long as images are all we have programmed into the computer.

We are only using the sense of sight in this example, but you could extrapolate the same argument to the other senses. Program in enough data about what a hot dog tastes like, feels like, and smells like, and I think the computer will be able to differentiate, by sight, texture, smell, touch, and taste, what is a hot dog and what isn't (and even tell it apart from a Polish sausage). All it needs is the programming. I'd argue that's all the human mind needs, too. I'm not sure human learning is all that different from programming. A.I. also learns from its mistakes; that's how it is able to win at chess or navigate a maze.

Lastly, what percentage of the population is good at the things A.I. does very well, i.e. computation, pattern recognition, translation, language, writing, etc.? Answer: not many. My point: A.I. doesn't have to be very good to best most people in the places where it will be used, because most people are pretty poor at these things. McDonald's knew what it was doing when it replaced the cash register with pictures and push buttons so employees wouldn't need to calculate change.

Bottom line: programming is how a computer learns, that isn't too different from how humans learn, and A.I. is starting to learn certain subjects better than humans can.
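For what it's worth, the "seen enough labeled examples" story is simple enough to sketch in code. This is a toy, not any real hot-dog detector: the "images" below are made-up feature vectors and every number is an illustrative choice. But the mechanics are the whole trick: labeled examples go in, a fitted decision boundary comes out.

```python
# Toy sketch of learning from labeled examples (hot dog vs. NOT hot dog).
# Synthetic feature vectors stand in for real image features.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Pretend each "image" is a 16-dimensional feature vector; hot dogs
# cluster around one mean, everything else around another.
hot_dogs = rng.normal(loc=1.0, scale=1.0, size=(500, 16))
not_hot_dogs = rng.normal(loc=-1.0, scale=1.0, size=(500, 16))

X = np.vstack([hot_dogs, not_hot_dogs])
y = np.array([1] * 500 + [0] * 500)  # 1 = hot dog, 0 = not hot dog

# Hold some examples out so we measure learning, not memorization.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = LogisticRegression().fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```

Run it and the held-out accuracy comes back near 1.0, because the two synthetic clusters are easy to separate. Real photos are messier, but the learning step is the same.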

DAVID REARICK:

Wait a minute: the Hot Dog, Not Hot Dog clip is indeed funny, and it demonstrates the "non-learning" difference between machines and humans, especially in the early days of A.I. But this is 2023, not 1995, and because computing power is now orders of magnitude greater than it was in 1995, a generative A.I. system has assimilated millions of pictures of hot dogs and NOT hot dogs and can now identify a hot dog better than a human can.

We see this daily: A.I. reading radiology scans and picking up cancers better than board-certified radiologists, recognizing a face better than a human can, learning chess in 4 hours well enough to beat the best human chess player, and doing the same with the game of Go. I could go on and on, but you get the point.

What is learning, anyhow? Is it not the ability to differentiate, provide an answer, recognize a pattern, and answer a question? In so many areas, generative A.I. performs better than humans do. I agree it isn't perfect (yet). The A.I. applications now being commercialized are only 5 years old, despite the fact that programmers have been working in this area for 40 years. Frankly, if A.I. can learn chess well enough to beat a Chess Grandmaster in only 4 hours, I don't care how you categorize "learning"; it learns much better than I do.
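Since the chess point keeps coming up, here is the "learns by playing itself" loop in miniature: tabular self-play value learning on tic-tac-toe. It is nothing like AlphaZero's neural networks and tree search, and the rewards, learning rate, and 50,000 games are all just illustrative choices, but the shape of the loop is the same: play yourself, then nudge your move preferences toward what actually won.

```python
# Toy self-play learner for tic-tac-toe: play games against itself,
# then shift the value of every move played toward the final outcome.
import random
from collections import defaultdict

WINS = [(0,1,2), (3,4,5), (6,7,8), (0,3,6),
        (1,4,7), (2,5,8), (0,4,8), (2,4,6)]

def winner(board):
    """Return 'X' or 'O' if someone has three in a row, else None."""
    for a, b, c in WINS:
        if board[a] and board[a] == board[b] == board[c]:
            return board[a]
    return None

Q = defaultdict(float)      # (state, move) -> estimated value
ALPHA, EPSILON = 0.3, 0.2   # learning rate, exploration rate

def choose(state, moves):
    """Mostly pick the best-known move, sometimes explore a random one."""
    if random.random() < EPSILON:
        return random.choice(moves)
    return max(moves, key=lambda m: Q[(state, m)])

for game in range(50_000):
    board, history, player = [""] * 9, [], "X"
    while True:
        state = tuple(board)
        moves = [i for i in range(9) if not board[i]]
        move = choose(state, moves)
        board[move] = player
        history.append((state, move, player))
        w = winner(board)
        if w or "" not in board:   # someone won, or the board is full
            break
        player = "O" if player == "X" else "X"
    # Learn from the result: +1 for the winner's moves, -1 for the
    # loser's, 0 for a draw. This is the "learning from mistakes" step.
    for state, move, p in history:
        r = 0.0 if w is None else (1.0 if p == w else -1.0)
        Q[(state, move)] += ALPHA * (r - Q[(state, move)])
```

After training, picking the highest-valued move at each state plays a decent game, and nobody ever told it the strategy. It extracted it from its own wins and losses, which is exactly the sense in which these systems "learn from their mistakes."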
