
NLP part 1

Question Answer
Rationalist belief language is innate.
Empiricist belief language is a product of the human mind, induced.
Phonological Analysis segmentation of sounds into syllables, a string of discrete phonemes.
Morphophonology grouping of speech.
Prosodic intonational phrases and metrical grid
Syntactic Analysis tree parser.
Semantic Analysis predicate logic.
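
For concreteness, a minimal Python sketch (with made-up structures) of what the last two cards describe: a constituency tree from syntactic analysis and a predicate-logic reading from semantic analysis.

```python
# Minimal sketch (assumed representations, not from the course notes):
# a syntactic parse tree as nested tuples and a predicate-logic reading
# of the same sentence as a relation applied to its arguments.
sentence = "John loves Mary"

# Syntactic analysis: constituency tree produced by a tree parser.
parse_tree = ("S",
              ("NP", "John"),
              ("VP", ("V", "loves"), ("NP", "Mary")))

# Semantic analysis: the same sentence as a predicate-logic formula,
# here loves(John, Mary) encoded as (predicate, arg1, arg2).
logical_form = ("loves", "John", "Mary")

print(parse_tree)
print(logical_form)
```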

NLP continued

Question Answer
Context Free Grammar Issues inadequate for natural language, which is context sensitive
Context Free Grammar can apply rules without looking at what's nearby
Bottom Up Parsing starts with the words, tries to form a sentence
Top Down Parsing starts with the sentence, tries to split it into smaller and smaller parts (see parser sketch below).
Pragmatic Analysis deals with context
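
A minimal top-down (recursive-descent) parsing sketch over a toy context-free grammar; the grammar, words, and sentence are invented for illustration. It also shows the CFG property above: each rule is applied without looking at surrounding context.

```python
# Toy context-free grammar and a top-down (recursive-descent) parser sketch.
GRAMMAR = {
    "S":   [["NP", "VP"]],
    "NP":  [["Det", "N"]],
    "VP":  [["V", "NP"]],
    "Det": [["the"]],
    "N":   [["dog"], ["cat"]],
    "V":   [["chased"]],
}

def parse(symbol, words, pos):
    """Try to expand `symbol` starting at words[pos].
    Return the next position on success, or None on failure."""
    if symbol not in GRAMMAR:            # terminal: must match the word
        return pos + 1 if pos < len(words) and words[pos] == symbol else None
    for rule in GRAMMAR[symbol]:         # try each rule, ignoring context
        p = pos
        for part in rule:
            p = parse(part, words, p)
            if p is None:
                break
        else:
            return p
    return None

tokens = "the dog chased the cat".split()
print(parse("S", tokens, 0) == len(tokens))   # True if the sentence parses
```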


Question Answer
Occlusion a rotated cup appears to have no handle
Low-Level edges, textures, color, optic flow.
Mid-Level grouping, segmentation.
High-Level reconstruction.
Edge Detection discontinuity in brightness, gradients in brightness
Gaussian Convolution averages of neighbors, used for smoothing.
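
A small sketch of the last two cards on a made-up 1-D brightness profile: smooth with a Gaussian (a weighted average of neighbors), then take the gradient and look for large magnitudes.

```python
import numpy as np

# Smooth a 1-D brightness profile with a Gaussian, then mark the edge
# where the gradient magnitude is largest. The data is made up.
brightness = np.array([10, 10, 11, 10, 10, 80, 81, 80, 80, 80], dtype=float)

x = np.arange(-2, 3)                     # 5-tap Gaussian kernel, sigma = 1
kernel = np.exp(-x**2 / 2.0)
kernel /= kernel.sum()                   # weights average the neighbors

smoothed = np.convolve(brightness, kernel, mode="same")
gradient = np.gradient(smoothed)         # discrete derivative of brightness

print(np.argmax(np.abs(gradient)))       # index of the strongest edge (the brightness jump)
```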


Question Answer
Edge in 2D take gradient instead of derivative
Contours edges can be formed into contours, brain loves this
Texture distribution of orientation, form continuous regions.
Optic Flow plot how image moves over time, velocity vectors.
Non Maximal Suppression keeps only the locally strongest responses, e.g. where the eyes are on a face.
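
A sketch of the 2-D version: compute the image gradient, form its magnitude, then apply a much-simplified, row-wise non-maximal suppression that keeps only locally strongest responses. The image is made up; real non-maximal suppression suppresses along the gradient direction.

```python
import numpy as np

# "Edge in 2D": take the gradient of a tiny image, form the gradient
# magnitude, then keep only local maxima along each row.
img = np.zeros((6, 6))
img[:, 2] = 0.2                           # slight ramp before the edge
img[:, 3:] = 1.0                          # bright right half -> vertical edge

gy, gx = np.gradient(img)                 # gradients along rows and columns
magnitude = np.hypot(gx, gy)              # edge strength at every pixel

# Keep a pixel only if it is at least as strong as its left/right neighbors.
left  = np.roll(magnitude,  1, axis=1)
right = np.roll(magnitude, -1, axis=1)
thin_edges = (magnitude >= left) & (magnitude >= right) & (magnitude > 0)

print(thin_edges.astype(int))             # 1s only along the thinned edge column
```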


Question Answer
Industrial Robots highly accurate, but highly constrained, uses offline computations.
AI Robots interact with the environment, on-board computation, delivery etc.
Asimov Laws don't harm humans, don't disobey humans, don't harm yourself.
Robotic System controller, body, environment.
Controller/Body takes in perception from the body, gives the body commands.
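
A toy sketch of the controller/body split: the controller receives percepts and returns commands, the body executes them. All class and method names are invented for illustration.

```python
# Perceive -> decide -> act loop between a controller and a body.
class Body:
    def __init__(self):
        self.position = 0

    def sense(self):                      # perception handed to the controller
        return {"position": self.position}

    def act(self, command):               # command handed back by the controller
        if command == "forward":
            self.position += 1

class Controller:
    def decide(self, percept):
        # drive forward until a (hypothetical) goal position of 3 is reached
        return "forward" if percept["position"] < 3 else "stop"

body, controller = Body(), Controller()
for _ in range(5):
    body.act(controller.decide(body.sense()))
print(body.position)                      # 3
```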


Question Answer
Body/Environment the environment gives the body stimuli, gets actions from the body.
Sensors touch, vision, infrared, chemical.
Actuators wheels, arms, weapons.
Simulation/Embedded train in simulation.
Optimization vs Learning energy usage and exploration.
Layered Control delivery robot represented in layers of actions.
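
A sketch of layered control for a delivery robot: each layer either proposes an action or defers, and higher-priority layers override lower ones. The layers and percepts are invented; this is not the course's exact architecture.

```python
# Layered control: higher layers override lower ones; a layer defers by
# returning None. Percepts and actions are made up for illustration.
def avoid_obstacles(percept):
    return "turn" if percept.get("obstacle") else None

def deliver_package(percept):
    return "move_toward_goal" if not percept.get("at_goal") else None

def wander(percept):
    return "wander"

LAYERS = [avoid_obstacles, deliver_package, wander]   # highest priority first

def layered_control(percept):
    for layer in LAYERS:
        action = layer(percept)
        if action is not None:
            return action

print(layered_control({"obstacle": True}))             # turn
print(layered_control({"obstacle": False}))            # move_toward_goal
print(layered_control({"at_goal": True}))              # wander
```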

11-10 Subsymbolic AI

Question Answer
Subsymbolic Processing NN breaks apart symbols.
Symbolic Cognitive Models knowledge structures to represent knowledge.
Boris/Cyrus/Soar symbolic cognitive models are powerful when specific, don't learn, not robust.
Distributed Neural Network Models no reason, just reaction - similar to humans, correlations with past experiences.


Question Answer
Symbolic Representations are discrete, disjoint, grammatical, compositional, support reasoning
Subsymbolic Representation messier, data items are different patterns of activity, reaction.
Continuous fuzzy information.
Distributed similar items are similarly represented.
Holographic any part contains information of the whole.
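
A small numeric illustration of these properties, using made-up vectors: similar items get similar activity patterns, and even part of a pattern still carries information about the whole.

```python
import numpy as np

# Items as continuous activity patterns: similar items are similarly
# represented, and a slice of the pattern still separates them.
rng = np.random.default_rng(0)
cat    = rng.normal(size=50)
kitten = cat + 0.1 * rng.normal(size=50)       # a similar concept
truck  = rng.normal(size=50)                   # an unrelated concept

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

print(round(cosine(cat, kitten), 2), round(cosine(cat, truck), 2))
# "Holographic": compare only the first half of each pattern.
print(round(cosine(cat[:25], kitten[:25]), 2), round(cosine(cat[:25], truck[:25]), 2))
```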


Question Answer
Conscious Rule Application symbol systems, approximate answer.
Intuitive Processing no rules, correlation with past answers with future questions.
Symbolic Behavior conscious rule application is just subsymbolic processing in disguise.
SPEC sequential parser network, agent-act-patient structure, RAAM+stack+segmenter.
RAAM recursive autoassociative memory, encodes a stack.
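
A much-simplified sketch of the RAAM idea (not the course's exact network): an autoencoder compresses a (left, right) pair into a code of the same size, so a stack can be built by repeatedly encoding (item, rest-of-stack) and popped by decoding.

```python
import numpy as np

# Simplified RAAM-style autoencoder: encode a pair into one code, decode
# the code back into the pair. Sizes, data, and training are made up.
rng = np.random.default_rng(0)
d = 8                                      # size of every representation
W_enc = rng.normal(0, 0.1, (d, 2 * d))     # (left, right) -> code
W_dec = rng.normal(0, 0.1, (2 * d, d))     # code -> (left, right)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

words = rng.uniform(0, 1, (4, d))          # made-up "word" vectors
lr = 0.1
for _ in range(20000):                     # train to reconstruct random pairs
    left, right = words[rng.integers(4)], words[rng.integers(4)]
    x = np.concatenate([left, right])
    code = sigmoid(W_enc @ x)              # compress ("push")
    y = W_dec @ code                       # reconstruct ("pop")
    err = y - x                            # squared-error gradients below
    grad_dec = np.outer(err, code)
    grad_enc = np.outer((W_dec.T @ err) * code * (1 - code), x)
    W_dec -= lr * grad_dec
    W_enc -= lr * grad_enc

stack = sigmoid(W_enc @ np.concatenate([words[0], words[1]]))  # encode the pair
top = (W_dec @ stack)[:d]                  # decode the left half back out
print(round(float(np.abs(top - words[0]).max()), 2))  # roughly recovers word 0
```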


Question Answer
Dynamic Inference bring together different contexts, use the segmenter.
Segmenter Network removes stuff already read, uses transition words.
SPEC Error with noise, SPEC has difficulty remembering earlier words.
Semantic Effects train data with semantic restrictions (only cats get chased).
Challenges approximate rule like reasoning, deep embeddings.

11-10 Model of Schizophrenia

Question Answer
Delusions self is inserted into impersonal stories.
Topic Switching a sentence has loosely connected topics.
DISCERN neural network-based model of human story processing.
Script story chunk (i went to restaurant, ...).


Question Answer
Story Parser translates a string of sentences into the slot-filler representation of a script (see sketch below).
Story Generator retrieves one script at a time from episodic memory, turns into sentences.
Memory Cue is a transition from one script to another.
Personal Stories go to wedding, eat dinner.
Gangster Stories bomb city hall, movie scenes.
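
A toy sketch of a slot-filler script representation and a story generator that turns it back into sentences; the script, roles, and fillers are made up and are not DISCERN's actual lexicon.

```python
# A script as a slot-filler structure: a script name plus role -> filler
# bindings, and a toy generator that turns the slots back into sentences.
restaurant_script = {
    "script":   "restaurant",
    "customer": "John",
    "food":     "lobster",
    "tip":      "big",
}

def generate(script):
    return [
        f"{script['customer']} went to a {script['script']}.",
        f"{script['customer']} ate {script['food']}.",
        f"{script['customer']} left a {script['tip']} tip.",
    ]

print("\n".join(generate(restaurant_script)))
```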


Question Answer
Self Character schizophrenics put self into gangster stories.
Semantic Memory Lesions loose associations.
Hyper-Priming disorganized speech is caused by activation in semantic maps that spreads.
Semantic Noise distort lexicon output with noise.
Semantic Overactivation increase the output activations of the lexicon.
Working Memory Disconnection connections between context layers are cut.
Working Memory Noise distort context layer activation with noise.
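
A toy illustration of three of these lesions applied to a made-up vector of output activations (not DISCERN's real lexicon layer): added noise, scaled-up activations, and cut connections.

```python
import numpy as np

# Simulated lesions on a made-up activation vector.
rng = np.random.default_rng(0)
activations = np.array([0.1, 0.7, 0.2, 0.05])

noisy         = activations + rng.normal(0, 0.1, activations.shape)  # semantic noise
overactivated = np.clip(1.5 * activations, 0, 1)                     # semantic overactivation
disconnected  = activations * np.array([1, 0, 0, 1])                 # cut connections

print(noisy.round(2), overactivated.round(2), disconnected)
```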


Question Answer
Working Memory Gain Change changing the hidden layer sigmoid slope (see sketch below).
Excessive Arousal State increase hidden layer bias.
Hyperlearning accelerated memory consolidation, overly intense training, hypothesized cause of schizophrenia.
Experiment I tell short stories, judge on recall, derailments, agency shift, lexical errors.
Experiment II psychotic patients' grammar is still okay; the hyperlearning lesion gives DISCERN the closest match to schizophrenic symptoms.
Derailments hyperlearning in memory encoder causes rapid context switches.
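
A sketch of the gain and bias manipulations: a sigmoid unit whose slope (gain) and bias can be changed; the values are arbitrary examples, not the model's actual parameters.

```python
import numpy as np

# A sigmoid unit with adjustable gain (slope) and bias.
def unit(x, gain=1.0, bias=0.0):
    return 1.0 / (1.0 + np.exp(-(gain * x + bias)))

x = np.linspace(-3, 3, 7)
print(unit(x).round(2))                 # normal unit
print(unit(x, gain=0.3).round(2))       # reduced gain: flatter, less decisive
print(unit(x, bias=2.0).round(2))       # increased bias: over-aroused, mostly "on"
```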

11-17 Evolving Neural Networks

Question Answer
Neural Nets powerful where no good theory of domain exists.
Reinforcement Learning difficult with large/continuous/hidden states.
Function Approximator generalized estimator for large states.
Neuroevolution direct nonlinear mapping from sensors to actions, has bias node, search space too big.
Advantages of NE 3 orders of magnitude faster than RL for pole balancing.
Conventional Neuroevolution concatenate the network's weights into a chromosome.
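
A sketch of the conventional encoding: a small fixed-topology network whose weight matrices are flattened into a single chromosome and decoded back before the network is evaluated. Layer sizes are arbitrary.

```python
import numpy as np

# Conventional neuroevolution encoding: flatten the weight matrices of a
# fixed-topology network into one chromosome, decode before evaluation.
SHAPES = [(4, 3), (3, 2)]               # sensor->hidden and hidden->action weights

def decode(chromosome):
    """Rebuild the weight matrices from a flat chromosome."""
    mats, i = [], 0
    for rows, cols in SHAPES:
        mats.append(chromosome[i:i + rows * cols].reshape(rows, cols))
        i += rows * cols
    return mats

def act(chromosome, sensors):
    w1, w2 = decode(chromosome)
    hidden = np.tanh(sensors @ w1)       # direct nonlinear sensor->action mapping
    return np.tanh(hidden @ w2)

rng = np.random.default_rng(0)
genotype = rng.normal(size=sum(r * c for r, c in SHAPES))   # one individual
print(act(genotype, rng.normal(size=4)).round(2))
```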


Question Answer
Genotype encodes one neural network; a fitness score gets assigned to it.
Genetic Algorithm crosses over fit genotypes, adds mutations; diversity is good.
Problems with CNE converges on local optima, convergence stagnates progress, too many weights.
NEAT NeuroEvolution of Augmenting Topologies, mutations add nodes and connections.
Complexify add mutations to NN, makes search more manageable, incremental construction.
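
A minimal genetic-algorithm sketch over weight chromosomes: score genotypes, keep the fittest, cross them over, and mutate. The fitness function is a stand-in (match a target vector), not a real control task, and all constants are arbitrary.

```python
import numpy as np

# Toy genetic algorithm: selection, uniform crossover, Gaussian mutation.
rng = np.random.default_rng(0)
TARGET = rng.normal(size=10)

def fitness(genotype):
    return -np.sum((genotype - TARGET) ** 2)   # stand-in fitness function

population = [rng.normal(size=10) for _ in range(30)]
for generation in range(200):
    population.sort(key=fitness, reverse=True)
    parents = population[:10]                       # keep the fittest
    children = []
    while len(children) < 20:
        a, b = rng.choice(10, size=2, replace=False)
        mask = rng.random(10) < 0.5                 # uniform crossover
        child = np.where(mask, parents[a], parents[b])
        child = child + rng.normal(0, 0.1, 10)      # mutation keeps diversity
        children.append(child)
    population = parents + children

print(round(fitness(max(population, key=fitness)), 3))  # best fitness approaches 0
```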


Question Answer
Lamarckian Evolution traits acquired can be passed to children, possible in NN, difficult to implement as diversity is reduced, progress stagnates.
Baldwin Effect learning selects promising individuals.
Evolving Novelty picbreeder, humans are fitness functions, choose images to be evolved.
Fitness-Based Evolution rigid, gradual process, may find local max.
Novelty-Based Evolution new, innovative, often finds a different but better solution.
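
A sketch of how a novelty score can replace fitness: an individual is scored by how far its behavior lies from its nearest neighbors in an archive of previously seen behaviors. The behaviors here are made-up 2-D points.

```python
import numpy as np

# Novelty score: average distance to the k nearest neighbors in an
# archive of behaviors, rewarding new behavior rather than task fitness.
def novelty(behavior, archive, k=3):
    dists = np.sort(np.linalg.norm(archive - behavior, axis=1))
    return dists[:k].mean()

archive = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1], [5.0, 5.0]])
print(round(novelty(np.array([0.05, 0.05]), archive), 2))  # low: nothing new
print(round(novelty(np.array([3.0, -2.0]), archive), 2))   # high: novel behavior
```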
