Question about ANNs/CNNs - don't know where else to put it...

Thread Starter

ulrichburke

Joined Sep 21, 2021
2
Dear Anyone.

Trying to learn the basics of neural networks, which I'd never heard of till very recently and they piqued my interest. Then I found this chart about them which, to this dumbass, didn't make logical sense and I was wondering if someone could explain it to me....

[Attached image: "Neural Network table dont understand it.jpg"]

Okay. In the ANN column it starts off by saying Tabular Data, Text Data. Fair enough. Then next to Application it says Facial Recognition!! How the heck is facial recognition text data!?! Then in the CNN column it says Image Data. Again fair enough - till you hit the Application column again, where it says Text Digitization and Natural Language Processing. Text Digitization I SORTA get, though correct me if I'm wrong, because you've got in-between formats like .PDF which are part graphic, aren't they? But Natural Language Processing? Why wouldn't that come under ANN, which says it's for Text Data? And then RNN - which I only SORTA get, or I'd probably understand the uses better - goes down to Application AGAIN, where it says Text-to-Speech conversions. But ANN is the text one, so why isn't ANN doing text-to-speech?

Please bear in mind I've only read this table. I understand the basics of programming, inasmuch as I can use BASIC/VB/Python pretty well (NOT perfectly, but not bad!), but the applications in the table just seem to be the wrong way around - and keep the answers below Stephen Hawking level, please, or I'll be forever asking clarification questions! Basically you'd be using these things as souped-up subroutines, wouldn't you? You'd write a program, and at the point where you wanted the info only they could provide, you'd assign your data to their variables, chuck it at them, and they'd hand their answers back to your variables, which would carry on through your program - happiness all around, am I right? Not that I'm anywhere NEAR using these things; I'd just love to UNDERSTAND why the applications seem the wrong way around to me.
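Something like this is what I'm picturing, maybe? (Totally guessing here - I've only skimmed the scikit-learn docs, so treat this as my best stab at the "souped-up subroutine" idea, not as something I actually know how to do):

from sklearn.neural_network import MLPClassifier

# Teach it on some example data first (inputs X, known answers y)...
X = [[0, 0], [0, 1], [1, 0], [1, 1]]
y = [0, 0, 1, 1]
net = MLPClassifier(hidden_layer_sizes=(4,), max_iter=2000, random_state=1)
net.fit(X, y)

# ...then call it like a subroutine: chuck my data at it, get answers back.
print(net.predict([[1, 1], [0, 0]]))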

Mods, if I've put this in the wrong forum feel free to move it. I just couldn't see where else to put it (anatomical suggestions will NOT be carried out!)

Yours puzzledly

Chris.
 

Papabravo

Joined Feb 24, 2006
21,094
Do you know how the three network types operate at a low level? If not, you are putting the cart before the horse. If so, you then need to study how these three paradigms are applied to actual problems. Then you can go back to the table and try to make sense of what it is talking about.
 

Thread Starter

ulrichburke

Joined Sep 21, 2021
2
Dear Papabravo.

Nope - like I said, I've only read that chart. That's all I know so far. But that chart fascinated me, that's why I came here!

Could you explain a little more, please, as you know what you're talking about and I don't! I don't even know where to look for info on how they work at a low level that would explain it in a way I'd understand. I'm sure if you explained it, I'd understand enough to at least know what I'm looking for.

Yours respectfully

Chris.
 

Papabravo

Joined Feb 24, 2006
21,094
I would try a Google search for "Neural Network Primer" to get the basic definition of a "neuron" and see how that basic building block is combined with others to form an "Artificial Neural Network" (ANN). An ANN is built from a collection of neurons arranged to form an input stage, zero, one, or more hidden layers, and an output stage. Data flows in one direction only, from input to output, with no looping back. Each "neuron" uses weighting values to compute a "linear combination" of its inputs. A constant can be added to this result, and finally the value is passed as the argument to a non-linear activation function. Examples of activation functions are the sigmoid, the hyperbolic tangent, and the Rectified Linear Unit (ReLU). There are a few others, but they all have similar properties.
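To make that concrete, here is a minimal sketch of one neuron and a tiny feed-forward network in plain Python (no libraries; the weights are made-up numbers just for illustration - in a real network they would be learned from data):

import math

def sigmoid(x):
    # Non-linear activation: squashes any real number into the range (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def neuron(inputs, weights, bias):
    # Linear combination of the inputs, plus a constant (the bias),
    # fed through the activation function
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return sigmoid(z)

# A tiny ANN: 2 inputs -> 2 hidden neurons -> 1 output neuron
def tiny_ann(x1, x2):
    h1 = neuron([x1, x2], [0.5, -0.6], bias=0.1)
    h2 = neuron([x1, x2], [-0.3, 0.8], bias=-0.2)
    return neuron([h1, h2], [1.0, 1.0], bias=0.0)

print(tiny_ann(0.7, 0.2))  # data flows one way: inputs -> hidden layer -> output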

Then you can look at the "Recurrent Neural Network" (RNN) where output data from individual "neurons" can be looped back to previous stages. This is similar to what happens in a Digital Sequential Circuit in that it provides "memory". Finally, there is the "Convolutional Neural Network" (CNN). The name is derived from a mathematical operation called convolution.
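As a rough sketch of the "memory" idea (again plain Python with made-up weights, not any particular library), the hidden value from the previous step is fed back in alongside each new input:

import math

def rnn_step(x, h_prev, w_x=0.9, w_h=0.5, bias=0.0):
    # The new hidden value depends on the current input AND the previous
    # hidden value -- that feedback loop is what provides the "memory"
    return math.tanh(w_x * x + w_h * h_prev + bias)

h = 0.0                          # initial hidden state
for x in [1.0, 0.0, 0.0, 1.0]:   # a short input sequence
    h = rnn_step(x, h)
    print(h)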

Quoting from the Wiki on convolution:

In mathematics (in particular, functional analysis), convolution is a mathematical operation on two functions (f and g) that produces a third function f*g that expresses how the shape of one is modified by the other.

If you've taken advanced calculus, you may recognize this operation. If not, don't worry about it for now.
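If you are curious what the discrete version of the operation looks like, NumPy has it built in. In a CNN the second sequence would be a small learned filter slid across an image (a 2-D version of this toy 1-D example):

import numpy as np

signal = np.array([0.0, 1.0, 2.0, 3.0, 2.0, 1.0, 0.0])
kernel = np.array([1.0, 0.0, -1.0])  # a toy edge-detecting filter

# Each output value is a weighted sum over a small window of the input,
# with the window slid along one position at a time
print(np.convolve(signal, kernel, mode="valid"))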

One more thing. There is no need to structure your post like a letter. Just say what you need to say.
 
Last edited:

slackguy

Joined Feb 11, 2016
76
They can get the job done - but AT A COST.

They merely make guesses from rules and past data, which comes at a higher cost and higher power use.

Programming a SPECIFIC SOLVER can achieve the same for less and be FAR EASIER TO IMPLEMENT.

There is A LOT of terminology to learn with these beasties called "neural networks". They are really just as I described above.

To use one you'll have to pump in A LOT of rules - making you wish you'd just solved the problem in a high-level language instead.

IF YOU HAVE TO COMPILE - you'll find it will only work on Win10 or Ubuntu. There are many software attacks these days that keep certain things working only in those two environments. It's a kind of financial or power attack, which I won't discuss further. But it matters here, because these "nets" require compilation if you are going to train something that is not a default learning module (something you customize for your own use).

----------------------------------------------------
I would say run the other way unless you're 1500% sure it's what you need. It's junk science.
 

Papabravo

Joined Feb 24, 2006
21,094
slackguy said:
I would say run the other way unless you're 1500% sure it's what you need. It's junk science.
I get that you are not a fan, but following your arguments, such as they are, is more than ordinarily challenging. Why not keep your own counsel and let the TS come to his own conclusions?
 