Ai Forums Home    Monday, April 24, 2017
Ai Site > Ai Forums > The Artificial Intelligence Forum > how simple are situational concepts sometimes?
Topic: how simple are situational concepts sometimes?

posted 3/9/2016  11:30
I've got a good one! I can't get my network to feed back (output to input), and now I think I know why!

posted 3/10/2016  15:09
Correct! Feedback, just like an RNN.

posted 3/10/2016  18:01
Yes, I am using feedback too.
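The output-to-input feedback being discussed can be sketched in a few lines. A minimal Python sketch of a one-unit recurrence (purely illustrative; the weights and the linear update are my assumptions, not anyone's actual network):

```python
# Minimal sketch of output-to-input feedback, as in a simple RNN.
# The weights and update rule are illustrative assumptions.

def step(x, h, w_x=0.5, w_h=0.3):
    """One time step: new state from current input x and fed-back state h."""
    return w_x * x + w_h * h  # linear recurrence, kept simple for clarity

h = 0.0                       # state starts at zero
outputs = []
for x in [1.0, 0.0, 0.0]:     # after t=0 the input goes silent;
    h = step(x, h)            # the signal persists only via feedback
    outputs.append(h)

print(outputs)                # the echo of the first input decays over time
```

Without the `w_h * h` feedback term, the network would fall silent the moment the input does; with it, earlier inputs keep influencing later states.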

posted 3/13/2016  07:22

posted 3/17/2016  07:00
As of March 2016 the Mentifex AI project has four old AI Minds: two in English, one in German, and one in Russian.

The above AI Minds in Forth and in MSIE JavaScript
are now being radically simplified during a port into Perl5 code.

The cognitive architecture of the new Perl AI is
so simple and straightforward that the heart of the
associative neural net can be seen at a glance when
one is interacting with the AI in order to debug it.
A core mind-dump is displayed below after the human
user has entered "Boys make robots" into the AI Mind.
The mind-dump displays the following associative tags:

psi -- the concept number standing in for a conceptual neuron;
act -- the act(ivation) level of the concept or quasi-neuron;
hlc -- the human language code: en=English; de=deutsch; ru=Russian;
pos -- part of speech: 1=adj; 2=adv; 3=conj; 4=interj; 5=noun; 6=prep; 7=pron; 8=verb;
jux -- any juxtaposed concept, particularly an adverb such as NOT;
pre -- the previous concept associated with the psi concept;
tkb -- time-in-knowledge-base where a time-bound idea is stored;
seq -- the subsequent concept associated with the psi-concept;
num -- the grammatical number of a noun, or pronoun, or verb;
mfn -- male-female-neuter 1-2-3 identifier of the gender;
dba -- "doing business as": case of a noun; person of a verb;
rv -- recall-vector time-point of a word stored in the @ear array.

time psi act hlc pos jux pre tkb seq num mfn dba rv pho act audpsi
t=2370. psi=0,0,0,0,0,0,0,0,0,0,0,0, aud= ,0,0,
t=2371. psi=0,0,0,0,0,0,0,0,0,0,0,0, aud= B,0,0,
t=2372. psi=0,0,0,0,0,0,0,0,0,0,0,0, aud= O,0,0,
t=2373. psi=0,0,0,0,0,0,0,0,0,0,0,0, aud= Y,0,0,
t=2374. psi=589,0,en,5,0,0,0,835,0,0,0,2371, aud= S,0,589,
t=2375. psi=0,0,0,0,0,0,0,0,0,0,0,0, aud= ,0,0,
t=2376. psi=0,0,0,0,0,0,0,0,0,0,0,0, aud= M,0,0,
t=2377. psi=0,0,0,0,0,0,0,0,0,0,0,0, aud= A,0,0,
t=2378. psi=0,0,0,0,0,0,0,0,0,0,0,0, aud= K,0,0,
t=2379. psi=835,0,en,8,0,589,0,571,0,0,3,2376, aud= E,0,835,
t=2380. psi=0,0,0,0,0,0,0,0,0,0,0,0, aud= ,0,0,
t=2381. psi=0,0,0,0,0,0,0,0,0,0,0,0, aud= R,0,0,
t=2382. psi=0,0,0,0,0,0,0,0,0,0,0,0, aud= O,0,0,
t=2383. psi=0,0,0,0,0,0,0,0,0,0,0,0, aud= B,0,0,
t=2384. psi=0,0,0,0,0,0,0,0,0,0,0,0, aud= O,0,0,
t=2385. psi=0,0,0,0,0,0,0,0,0,0,0,0, aud= T,0,0,
t=2386. psi=571,0,en,5,0,835,0,0,0,0,4,2381, aud= S,0,571,

Time "t" above is an internal counter geared to the phonemes
stored in the @ear auditory array for quasi-phonetic input.
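Each dump line packs the twelve psi tags and the three auditory fields into comma-separated lists. A minimal Python parser for the format shown above (the AI itself is in Perl5; the field names come from the tag list in the post, while the parsing code is only an illustrative sketch):

```python
# Parse one line of the mind-dump into named associative tags.
# Tag names are taken from the post; the parser itself is a sketch.

PSI_TAGS = ("psi", "act", "hlc", "pos", "jux", "pre", "tkb", "seq",
            "num", "mfn", "dba", "rv")
AUD_TAGS = ("pho", "act", "audpsi")

def parse_dump_line(line):
    """Split one 't=NNNN. psi=..., aud=...' dump line into named tags."""
    time_part, rest = line.split(". psi=", 1)
    t = int(time_part.strip().lstrip("t="))
    psi_part, aud_part = rest.split("aud=", 1)
    psi = dict(zip(PSI_TAGS, [v.strip() for v in psi_part.split(",")[:12]]))
    aud = dict(zip(AUD_TAGS, [v.strip() for v in aud_part.split(",")[:3]]))
    return t, psi, aud

t, psi, aud = parse_dump_line(
    "t=2374. psi=589,0,en,5,0,0,0,835,0,0,0,2371, aud= S,0,589,")
print(t, psi["psi"], psi["pos"], psi["seq"], aud["pho"])  # 2374 589 5 835 S
```

Applied to the t=2374 line, this recovers concept 589 ("BOYS"), part of speech 5 (noun), and its seq tag forward to 835 ("MAKE").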

Concepts are identified by a number such as 589 for "boy";
835 for "make"; and 571 for "robot". Any new concept in
the AI is assigned a new concept number, starting at 3001.

The most recent addition to the Perl free AI source code
registers the associations among the words in the sentence.
Concept 589 "boys" above has a "seq" tag to 835 "make" as
the verb of which 589=boys is the subject: "Boys make...."

Concept 835 "make" has a "pre" tag back to 589-boys and
a "seq" tag forward to 571-robots: "Boys make robots".

Concept 571-robots has only a "pre" tag back to 835-make.
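The pre/seq chain described above behaves like a doubly linked list over concepts. A minimal Python sketch (the concept numbers and tag names come from the post; the dict layout and the recall function are my assumptions, not the actual Perl5 code):

```python
# Sketch of the pre/seq linking for "Boys make robots": each concept keeps
# a "pre" tag back to the previous concept and a "seq" tag forward.

concepts = {
    589: {"word": "boys",   "pre": 0,   "seq": 835},  # subject
    835: {"word": "make",   "pre": 589, "seq": 571},  # verb
    571: {"word": "robots", "pre": 835, "seq": 0},    # object: pre tag only
}

def recall_idea(start_psi):
    """Follow the seq chain from a starting concept to rebuild the idea."""
    words, psi = [], start_psi
    while psi:                         # a seq of 0 terminates the chain
        words.append(concepts[psi]["word"])
        psi = concepts[psi]["seq"]
    return " ".join(words)

print(recall_idea(589))                # follows 589 -> 835 -> 571
```

Starting from concept 589 and walking the seq tags reproduces the stored idea "boys make robots"; walking pre tags in the other direction would recover the subject from the object.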

As the Perl AI grows more and more complex during the
port from Forth and JavaScript into Perl, more and more
details of the mind-state will be visible in the mind-dump.

Anyone may download both Strawberry Perl5 and the AI and
see results similar to the above in English or in Russian.

Mentifex here claims that associative neural-net AI is
actually a lot simpler in design and function than most
Netizens can possibly realize for lack of True AI programs.

Even the most complex DeepMind or OpenCog or OpenCyc or
what-have-you AI will need to instantiate concepts and
their productive (generative) relations with stored words.

Anyone who wants to see the AI in action does not need to
wait for the completion of the Perl port, but may instead
run one of the four previous AI Minds as listed upthread.

 Mentifex peers into the heart of the Ghost Perl Webserver AI neural net
