
What is the fundamental difference between AI and normal computing?

submitted by we_kill_creativity to AskGoats 8 months ago (Aug 6, 2024 22:43:29) (+8/-0)

I understand digital logic vs analog logic...but what is the difference between AI and normal computing? What makes AI different from just a super duper computer?

EDIT: To date, at 10:18pm, no comment has been able to explain how AI isn't just a super duper computer. A.K.A. AI is just a normal computer, but with more computing.


27 comments


[ - ] Spaceman84 4 points 8 months ago (Aug 6, 2024 23:10:27) (+4/-0)

AI is just software. Algorithms and very large data sets. GPUs have become very good at handling massive amounts of data and doing the calculations necessary to generate text or images. Cryptocurrency mining has similar requirements. Chatbots and image generators are just a different application.

[ - ] Fascinus 3 points 8 months ago (Aug 6, 2024 23:07:26) (+3/-0)*

We haven't achieved true AI, at least not publicly.

Machine Learning describes methods by which software can be made to analyze a body of data (text, images, etc.), extract features that may be used to categorize the subject matter, and then perform many permutations of analysis, variously weighting parameters, until it arrives at a solution that performs to the user's expectation.

Occasionally, this will result in some interesting categorizations. For example, ML has historically been poor at distinguishing dunes from nudes, since they share many features.

In many cases, the ML is so good at recognizing traits that we have to go out of our way to make it fail.

The key distinction between Machine Learning and traditional computing is that, with Machine Learning, you are using a program designed to try many different methods of determining which features are important (these are generated on the fly by the process at runtime), whereas in a traditional program, all of the decision branches have been explicitly defined by the programmer(s) ahead of time.
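
To make that contrast concrete, here is a rough Python sketch. It stands in a deliberately simple learned rule (a nearest-centroid categorizer) for the iterative re-weighting described above, and the feature names and numbers are invented for illustration:

```python
# Traditional program: every decision branch written ahead of time by the programmer.
def categorize_by_rules(width, height):
    if width > 100 and height > 100:
        return "large"
    return "small"

# "Learned" program: derive a per-category average (centroid) from labeled
# examples at runtime, then categorize new items by the nearest centroid.
def learn_centroids(labeled_examples):
    sums, counts = {}, {}
    for features, label in labeled_examples:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, f in enumerate(features):
            acc[i] += f
        counts[label] = counts.get(label, 0) + 1
    return {label: [v / counts[label] for v in acc] for label, acc in sums.items()}

def categorize_by_centroids(centroids, features):
    def squared_distance(centroid):
        return sum((c - f) ** 2 for c, f in zip(centroid, features))
    return min(centroids, key=lambda label: squared_distance(centroids[label]))

examples = [((120, 150), "large"), ((140, 110), "large"),
            ((30, 40), "small"), ((50, 20), "small")]
centroids = learn_centroids(examples)
print(categorize_by_centroids(centroids, (90, 95)))  # the decision rule came from the data
```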


Archive links:

https://archive.is/4O60m

https://archive.is/xbahs


*Edit: Point of clarification: with ML, the different approaches that the program attempts to use to categorize the target data are generated at runtime, rather than predefined when the program was written.

Changed "ML is far better than the average human at recognizing traits" to "the ML is so good at recognizing traits that we have to go out of our way to make it fail"

[ - ] ilikeskittles 0 points 8 months ago (Aug 7, 2024 08:36:55) (+0/-0)

We never will. It is pseudo intelligence.

[ - ] Fascinus 2 points 8 months ago (Aug 7, 2024 15:11:04) (+2/-0)

By "true AI", I meant "general purpose Artificial Intelligence program", not "something that is equivalent or superior to nature".

Machine Learning is only a slice of that pie.

[ - ] Sector2 2 points 8 months ago (Aug 7, 2024 01:46:58) (+2/-0)

AI is just a normal computer, but with more computing.

Exactly.

[ - ] PotatoWhisperer2 2 points 8 months ago (Aug 6, 2024 23:06:18) (+2/-0)

Most AI uses neural-network algorithms and can re-program parts of itself (to a degree). It's not actually intelligent, smart, or 'unknown'. That's just dumbass commie journalists repeating catch-phrases to push fear clicks.

Non-AI uses less flexible algorithms and logic to do what it does. It is less data-driven and more deterministic.

Both methods require intelligent programming behind the scenes, at least for the core logic/math parts. Otherwise it just starts failing all over the place. Ever notice how all software is starting to fail a lot? Yeah...

From a user's perspective, AI does more decision making for you. Which means it does more 'work', though not always well.

Basically, AI is just slightly more advanced programming at this point. It offers more to the end-user, but it can also fail more. Which is a problem with today's diversity-driven programming.

[ - ] MaryXmas 1 point 8 months ago (Aug 7, 2024 12:15:31) (+1/-0)

Here is the information you were looking for on the other thread... To replicate continuous reporting from discrete data using an AI application, one can employ various interpolation and machine learning techniques. Here is a technical summary of the process:

1. Data Collection and Preprocessing:
- Sampling: Discrete data points are collected at regular or irregular intervals from sensors or other data sources.
- Normalization: The discrete data is normalized to a common scale to facilitate more accurate modeling.

2. Interpolation Techniques:
- Linear Interpolation: This method estimates intermediate values by assuming a straight line between adjacent discrete data points.
- Spline Interpolation: Spline methods, such as cubic splines, fit a piecewise polynomial to the data points, providing a smoother continuous curve.

3. Machine Learning Approaches:
- Regression Models: Algorithms like polynomial regression or kernel regression can model the relationship between discrete data points and predict intermediate values.
- Recurrent Neural Networks (RNNs): RNNs, including Long Short-Term Memory (LSTM) networks, can capture temporal dependencies in the data, allowing for more accurate continuous predictions.
- Gaussian Processes: This probabilistic approach models the data as a continuous stochastic process, providing not only predictions but also uncertainty estimates.

4. Data Smoothing:
- Moving Averages: Techniques like simple, weighted, or exponential moving averages smooth the data by averaging adjacent points.
- Kalman Filtering: This algorithm recursively estimates the state of a dynamic system, filtering out noise to produce a smooth continuous output.

5. Evaluation and Validation:
- Cross-Validation: The model is validated using techniques like k-fold cross-validation to ensure its accuracy and generalizability.
- Error Metrics: Metrics such as Mean Absolute Error (MAE), Root Mean Squared Error (RMSE), and R-squared are used to evaluate the performance of the interpolation or prediction models.

By integrating these techniques, an AI application can effectively transform discrete data into a continuous reporting format, enabling more refined and accurate monitoring and analysis.
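
As a rough illustration of two of the steps above (linear interpolation from step 2 and a moving average from step 4), here is a minimal Python sketch; the sample readings and the output interval are invented:

```python
# Turn sparse (time, value) samples into a denser series, then smooth it.
def linear_interpolate(samples, step):
    """samples: list of (time, value) pairs sorted by time."""
    out = []
    for (t0, v0), (t1, v1) in zip(samples, samples[1:]):
        t = t0
        while t < t1:
            frac = (t - t0) / (t1 - t0)          # position between the two known points
            out.append((t, v0 + frac * (v1 - v0)))
            t += step
    out.append(samples[-1])
    return out

def moving_average(values, window=3):
    """Simple trailing moving average over up to `window` points."""
    out = []
    for i in range(len(values)):
        lo = max(0, i - window + 1)
        out.append(sum(values[lo:i + 1]) / (i + 1 - lo))
    return out

discrete = [(0, 10.0), (5, 14.0), (10, 9.0), (15, 12.0)]  # sparse sensor readings
dense = linear_interpolate(discrete, step=1)
smoothed = moving_average([v for _, v in dense])
print(dense[:6], smoothed[:6])
```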

[ - ] MaryXmas 1 point 8 months ago (Aug 7, 2024 02:47:17) (+1/-0)

It isn't different. It is a different application of the same technology. It works like the plinko game from your other thread.

[ - ] Cantaloupe 1 point 8 months ago (Aug 7, 2024 01:55:19) (+1/-0)*

There are all kinds of AI; here are a few:

https://files.catbox.moe/2xkf77.png

You probably mean this kind

https://files.catbox.moe/xxy9ey.webp

With the kind of AI you probably mean -

Words are tokenized and embedded into a semantic space (the lower square in the lower link).

Then, using tensor NNs, context, weights, and patterns are discovered and preserved, representing knowledge (lower link, upper square).
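
A very small sketch of that tokenize-and-embed step in Python; the vocabulary is invented and the vectors are random stand-ins for what a real system would learn during training:

```python
import random

# Hypothetical toy vocabulary: each token maps to an integer ID.
vocab = {"the": 0, "goat": 1, "eats": 2, "grass": 3}
dim = 4
# One vector per token; real systems learn these values instead of randomizing them.
embeddings = [[random.uniform(-1, 1) for _ in range(dim)] for _ in vocab]

sentence = "the goat eats grass"
token_ids = [vocab[w] for w in sentence.split()]   # tokenization
vectors = [embeddings[i] for i in token_ids]        # embedding lookup
print(token_ids)
print(vectors[0])
```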

So this is very different from the typical human-written program. When a person writes a program, they must understand what it is they have made.

Not so with a trained/tuned AI program.

An example is the following: say you made the simplest possible AI, a row vector [a b] and a column vector [w1, w2]. You could adjust w1 and w2 to give you an AND function when the two are multiplied. You feed back the error until the weights are adjusted, automatically.

With two values you can do it manually. With trillions of values it has to happen automatically, based upon a subset of the possible inputs. The calculations use matrix-based math, done in parallel, so it works best with a TPU or GPU.
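
A minimal Python sketch of that AND example, with a bias term added so the threshold can be learned as well (two weights alone cannot separate AND); the learning rate and step count are arbitrary:

```python
def train_and_gate(steps=1000, lr=0.1):
    """Learn an AND gate by feeding back the error into the weights."""
    w1, w2, bias = 0.0, 0.0, 0.0
    data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
    for _ in range(steps):
        for (a, b), target in data:
            output = 1 if (a * w1 + b * w2 + bias) > 0 else 0
            error = target - output      # feed the error back
            w1 += lr * error * a          # weights drift toward whatever
            w2 += lr * error * b          # combination reduces the error
            bias += lr * error
    return w1, w2, bias

# Weights that reproduce AND when multiplied against [a b] and thresholded.
print(train_and_gate())
```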

So that is the difference: auto-tuned information of unknown quality vs. fully known programs created by people. Some kinds of things are just hard to get finely tuned, so AI can do that better. And you can use math to ensure the AI gives pretty good results.

But it isn't thinking, it's not intelligent, and we do not really know what intelligence is. We could make some form of intelligent-acting thing.

Also consider that we are an intelligence with billions of years of refinement, multiply encoded, probably with quantum effects in microtubules, along with culture and our symbiotic organisms.

[ - ] Niggly_Puff 1 point 8 months ago (Aug 6, 2024 23:38:15) (+1/-0)*

The fundamental difference is traditional computing uses precise instructions which a CPU interprets to reach a hardcoded output. AI uses imprecise instructions which a neural network interprets to reach a desired output.

[ - ] ilikeskittles 0 points 8 months ago (Aug 7, 2024 08:36:13) (+0/-0)

It's the same. AI, which I prefer to call Pseudo Intelligence, is just a massive set of rules and a shit ton of data.

[ - ] Master_Foo 0 points 8 months ago (Aug 6, 2024 22:57:23) (+0/-0)

There is nothing different about AI and "normal computing" at a fundamental silicon level.

AI is just a bunch of different algorithms designed to solve the "Traveling Salesman Problem" while balancing P!=NP over increasingly more complicated data sets.
https://en.wikipedia.org/wiki/Travelling_salesman_problem

[ - ] Trope 0 points 8 months ago (Aug 6, 2024 22:49:09) (+0/-0)*

My best and limited understanding of AI has to do with filtering.

Let's say you have a database with lots of data, or several databases. You'd like to gather similar and relevant data, and you can do this manually on a computer using filters.

AI can cross-reference the data on a computer with predictive logic.

If you remember the computer technician chick from either of the popular Crime Drama TV shows of the 2010s, AI can help supplement the human element to the filtering process.

That is my best understanding of it and the only way I can be sold on it. It adds an automated rationalization/logic step to the filtering process. The world-changing result is the processing of large amounts of data at faster speeds.

“Hey, AI, can you tell me how many people with Apple iPhones and Samsung Galaxy phones went down to the jack shack in Boys town for a wank on February 11th? Of that sample, tell me how many are married to women? Sort them into two lists: Apple and Android. Today, we’re going to find out truly which demographic of consumers is gayer.”

[ - ] we_kill_creativity [op] 1 point 8 months ago (Aug 6, 2024 22:56:14) (+1/-0)

Ok, but how is AI not just normal computing with extra computing?

[ - ] Fascinus 1 point 8 months ago (Aug 6, 2024 23:12:19) (+1/-0)

It is just normal computing with extra computing... only that extra computing is designed to introduce random changes to how important one feature is compared to another, the confluence of multiple features, etc.

Basically, ML is designed to try a lot of different things to see what works vs. a traditional program where (nearly always) all of the logic has been defined up front.
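
A loose Python sketch of that "try a lot of different things and see what works" idea: plain random search over feature weights, with made-up examples, standing in for whatever a real ML system actually does:

```python
import random

def score(weights, examples):
    """Count how many labeled examples a given weighting classifies correctly."""
    correct = 0
    for features, label in examples:
        total = sum(w * f for w, f in zip(weights, features))
        correct += (1 if total > 0 else 0) == label
    return correct

# Invented labeled data: (feature vector, label)
examples = [((1.0, 0.2, -0.5), 1), ((-0.8, 0.1, 0.4), 0),
            ((0.9, -0.3, -0.2), 1), ((-1.1, 0.5, 0.6), 0)]

best, best_score = None, -1
for _ in range(5000):
    candidate = [random.uniform(-1, 1) for _ in range(3)]  # try a random weighting
    s = score(candidate, examples)
    if s > best_score:                                      # keep whatever works best
        best, best_score = candidate, s

print(best, best_score)
```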

[ - ] Trope 0 points 8 months ago (Aug 6, 2024 23:09:09) (+0/-0)

That is a perfect description

Quantum computing is a unique computation entirely dependent on regular computing. Like an ASIC.

[ - ] deleted 0 points 8 months ago (Aug 6, 2024 23:11:55) (+0/-0)

deleted

[ - ] McNasty -1 points 8 months ago (Aug 6, 2024 22:46:20) (+1/-2)

AI is just an advanced program. Computing itself is something that a program would do. Different programs can compute at different speeds. AI is built for specific functions.

[ - ] we_kill_creativity [op] 2 points 8 months ago (Aug 6, 2024 22:55:07) (+2/-0)

You didn't answer my question: What is the difference between AI and normal computing?

[ - ] McNasty -2 points 8 months ago (Aug 6, 2024 22:57:28) (+0/-2)

That's like asking what the difference is between your socks and shoes.

[ - ] we_kill_creativity [op] 3 points 8 months ago (Aug 6, 2024 22:59:35) (+3/-0)

You still aren't answering my question...why don't you do that?

[ - ] deleted 1 point 8 months ago (Aug 6, 2024 23:09:50) (+1/-0)

deleted

[ - ] we_kill_creativity [op] 2 points 8 months ago (Aug 6, 2024 23:17:39) (+2/-0)

You still didn't answer my question...

If AI is different from computing...how is it different?

[ - ] deleted 1 point 8 months ago (Aug 6, 2024 23:33:50) (+1/-0)

deleted

[ - ] McNasty -1 points 8 months ago (Aug 6, 2024 23:11:07) (+0/-1)

For the sake of the analogy, let's say that you can never be barefoot and must always wear socks directly on your feet. The socks are akin to computing; your feet are akin to you, receiving the computing results. The result is that you can walk around normally.

Sometimes you can put shoes on. The shoes in this analogy are the AI program. The result is that you can compute way farther and over rougher terrain.

For the sake of the analogy, you can never take the socks off your feet because computing at the most basic level is binary. You cannot have an AI without computing first.

[ - ] we_kill_creativity [op] 1 point 8 months ago (Aug 6, 2024 23:15:38) (+1/-0)

You cannot have an AI without computing first.

Finally you say something of value...

[ - ] McNasty -1 points 8 months ago (Aug 6, 2024 23:28:48) (+0/-1)

Lol. I was confused by your question. It sounded like you thought they were two different versions of the same thing.