Monday, February 6, 2023

Thoughts on AI

AI is Web 3.0 (not NFTs). Read on for my thoughts on the future of AI.


Just now I came across an interesting title on a Reddit post linking to an article, "Would it really be so bad if AI took our jobs?". Now, some disclaimers. First, I did not read the article, since the point I want to make isn't about the ethical or moral dilemma of AI taking over jobs, but rather the root concept of AI doing just that. Second, I have yet to take an AI class in university because it's only offered in the fall and I took algorithms last fall (plus I'm not inclined to study machine learning). With that said, here are my own thoughts about AI taking over jobs or whatnot, originally posted as a comment on that Reddit thread from my account, but with some added information.

These are my opinions; where something is factual, I'd consider it common knowledge. If you have any responses or corrections, let me know in the comments.

First, some terms:

AI: Artificial Intelligence; in my sense, the ability to produce results given sets of data and conditions/rules/algorithms governing how to interpret that data. (The important thing to note is the algorithms used, which are often written by computer scientists and engineers to accomplish tasks like path navigation or art generation.)

ML: Machine Learning; developing an AI using a set of algorithms (essentially another AI) to produce the conditions/rules/algorithms programmatically, based on various tests and their results. (In this case, ML governs its own algorithms, not a programmer; see the small sketch after these terms.) (For the sake of simplicity I am grouping DL/deep learning with ML, although deep learning technically goes beyond the capabilities of ML.)

Human: Homo sapiens, capable of creative and unique thinking, even beyond the extent of what we know today in our universe.
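
To make the AI vs. ML distinction concrete, here is a minimal sketch of my own (Python with scikit-learn; the tiny sentences and labels are made up for illustration) contrasting a rule a programmer writes by hand with a rule the program derives from examples:

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.linear_model import LogisticRegression

    # A hand-written rule (the "AI" sense above): a programmer encodes the logic directly.
    def rule_based_sentiment(text: str) -> str:
        positive_words = {"good", "great", "love"}
        return "positive" if positive_words & set(text.lower().split()) else "negative"

    # A learned rule (the "ML" sense above): the program derives its own weights from examples.
    texts = ["I love this game", "great soundtrack", "this was bad", "what a waste of time"]
    labels = ["positive", "positive", "negative", "negative"]  # made-up toy data

    vectorizer = CountVectorizer()
    features = vectorizer.fit_transform(texts)
    model = LogisticRegression().fit(features, labels)

    print(rule_based_sentiment("I love the art style"))                          # follows my rule
    print(model.predict(vectorizer.transform(["the soundtrack was great"]))[0])  # follows its own

The first function will only ever do what I told it to do; the second one picked its own weights out of four example sentences, which is the whole distinction I'm drawing above.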

In recent years, AI and ML have seen exponential growth in applications, usefulness, and research. What started off as simple pathfinding or some natural language processing is now capable of producing art, video, creative works, conversations, and so much more (AI, that is). Examples of recent developments include OpenAI's suite of tools, voice assistants like Google Assistant (speaking of which) or Siri, advances in chess engines, and AI that can play video games (including near-undetectable cheats). AI is something so vast and big that I'll call it the Web 3.0 (the Web 3.0 that NFTs were supposed to be, or that the data-hoarding IoT cloud wishes it was). It became such a big phenomenon that the AI of 5-10 years ago could not have predicted how big it would become (5-10 years ago, Google Now and Siri were still figuring out how to respond to questions and maintain conversations) (more on AI vs. human predictions later).

ML has also seen recent development, being one of the later modern developments in computer science (alongside the internet and 3D graphics; ML research started in the 1960s but didn't see much progress until 20-30 years ago, according to a quick glance at Wikipedia). ML (and deep learning) still has a ways to go to reach the level of modern AI, but it can surpass modern AI. As for humans: well, I believe we can always go beyond the limits of AI, but regarding ML, that is uncertain.

Now, onto the subject at hand: can AI take over our jobs? Will AI take over our jobs? Is AI even relevant enough to take over jobs or to live alongside us (idk)? Here is what I have to say, copied from my comment on the Reddit post from before:

  1. AI can't take over the jobs worth taking over: mundane, repetitive, labor-intensive work. That stuff requires algorithms, robotics, and areas of engineering that go way beyond artificial intelligence.
  2. AI isn't some end-all solution to everything in the world (that's what machine learning is for). AI essentially is just sticking data into an algorithm (and ML is AI where the algorithm is written by the computer over time). There are real-world consequences to be had with AI coming up with "new ideas" when it really is just copying (literally) from other sources. It's an amazing tool for understanding how learning works overall, and for understanding what truly constitutes unique, new work, but not a tool to actually innovate with.
  3. If there is something truly without any innovation, then sure, AI can take over (large-scale fact checking, database management, reporting on and summarizing just facts or secondary/tertiary sources). In fact, in some cases this is preferred: why, in my classes, should I read a 50-page chapter on some topic and put it into notes I understand, when I could instruct an AI to read the chapter for me and convert it into a note format I will remember and use to study (see the sketch just below)? (Some say reading and/or taking notes alongside is a good way to learn; I don't think that's entirely true, but in that case, why not get the AI to make you read or take notes, by reformatting the text or reading it out loud?)
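
On that note-taking idea in point 3, here's a rough sketch of what handing the reading off to an AI could look like, using the OpenAI Python library. The file name, prompt wording, and model are placeholders I made up for illustration, not a recommendation:

    # Sketch: ask a language model to turn a chapter into study notes.
    # Assumes the openai package is installed and you have an API key.
    import openai

    openai.api_key = "sk-..."  # your own key goes here

    chapter_text = open("chapter_07.txt").read()  # hypothetical chapter file

    response = openai.Completion.create(
        model="text-davinci-003",
        prompt="Convert the following textbook chapter into concise, "
               "bullet-point study notes I can review later:\n\n" + chapter_text,
        max_tokens=512,
    )

    print(response["choices"][0]["text"])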


AI can't take jobs yet. It's too early to ask that question. If we are worrying about how modern AIs can generate better-looking art or stories or characters than human beings, then either we need to use what AI generates as a tool to make something greater (I'd be willing to AI-generate video game assets to make a game, because I do not have the time and expertise to build the assets I intend on using), or we need to question our assumptions about what we consider impressive and what we don't (is every multi-thousand-dollar piece of art actually worth its price if AI can seemingly do better?).

Could AI have written this article? At the time I wrote it, maybe. I'm not necessarily stating anything new, but rather forming my own opinion based on what I've seen and experienced. It's possible an AI could've reached the same conclusions, but it's also extremely probable that an AI would've taken a more universal approach to answering the question, based on what others have said. Now that I've written it, an AI could interpret my article, analyze it, and then make its own response to it, summarize it, or make a different or similar article, because it's already been made; all the AI has to do is take in some data sources like this text and produce a result based on other input data, comparing and contrasting each character to form something that is programmatically sensible. Could ML have made this article? No, not at this time, not without analyzing a billion opinion pieces first. But in the distant future, sure: for a program to actually choose for itself what it thinks about AI advancements would involve ML or deep learning beyond what we are doing today.

My theory is that we can beat AI eternally, because AI can only extrapolate from what exists and then rule that out to come up with what doesn't exist; but as human beings we can come up with a third option, or infinitely many more options, by mixing and matching what does and doesn't exist to arrive at entirely new possibilities and ways of thinking. The true limits of thinking or creative thought will arrive with the development of ML or deep learning, when we go past the phase of testing for ideal results from an algorithm created by a machine, and instead examine never-before-seen results a machine decided to generate on its own, without following any predefined rules.

When smartphones first came out, I don't think anyone anticipated that they would become hosts of immense global social platforms, because such an idea had never existed at such an accessible scale (the internet is one of the first instances of truly connecting everyone together; the telephone is another, but it was much too limited). Or when the first motion pictures came out, or the first single-player video games: no one could've anticipated an entire area of study revolving around engineering perfect movies or incredible interactive experiences, something which breaks many previous rules and discovers new rules about human behavior (color theory, cinematography, UI/UX, silent protagonists, choice-driven storytelling, etc.). Or the entire concept of outer space, where new discoveries constantly break the theories we just came up with, because space is infinite (yeah, fight me, big bang theory) and mysterious.

This sort of stuff is what truly innovates, and this sort of stuff is what AI is incapable of achieving. If AI (and robots) were to replace every single job possible on earth, then there would be infinite capacity to evolve humanity even further and greater than before, as long as people are willing to seek out tough answers to problems and questions we don't even know of right now.

ML, however... Machine learning is at a very early stage, but already it has developed its own complex algorithms to optimize what we previously thought wasn't possible to optimize (take computer graphics, for instance: ML-based algorithms are able to achieve far better-looking images by seeing patterns in source data that would take people an additional 100 years of research and development to formulate). The next step for machine learning would be to target algorithms which don't produce optimal, expected results, but rather produce something never before seen, with no basis or backing or sources to pull from, and then to develop such ideas supplemented with some sources, because at that point a machine will truly be at the level of human thinking.
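
As a toy illustration of that "machine finds the pattern" idea (my own sketch in Python/NumPy, nothing to do with any real production upscaler), here is a program fitting its own interpolation filter from example signals instead of a person hand-deriving one:

    import numpy as np

    rng = np.random.default_rng(0)

    def make_signal(n=64):
        # a smooth, random 1-D "image row" built from a few sine waves
        t = np.linspace(0, 1, n)
        return sum(rng.normal() * np.sin(2 * np.pi * k * t) for k in range(1, 6))

    # Training pairs: 3 neighboring low-res samples -> the high-res sample in between
    X, y = [], []
    for _ in range(200):
        hi = make_signal()
        lo = hi[::2]                   # crude 2x downsample
        for i in range(1, len(lo) - 1):
            X.append(lo[i - 1:i + 2])  # 3-tap neighborhood
            y.append(hi[2 * i + 1])    # the sample the low-res version lost
    X, y = np.array(X), np.array(y)

    # The machine "writes" its own rule: a least-squares fit of the filter weights
    weights, *_ = np.linalg.lstsq(X, y, rcond=None)
    print("learned filter taps:", weights)

The point isn't the filter itself; it's that nobody wrote those taps by hand, the fitting step pulled them out of the data.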
