ConversationWorld
Model was written in NetLogo 6.0-M6
Viewed 260 times · Downloaded 25 times · Run 0 times
Model code
extensions [ ls ]

globals [
  meaning-language message-language objects relations all-meanings
  meaning-inputs message-inputs attempt-list accuracy-list
]

patches-own [ patch-meaning ]

turtles-own [ brain memory last-meaning last-message success? ]

to setup
  clear-all
  ls:reset
  set attempt-list []
  set accuracy-list []
  ;; get the meaning and message languages set up
  generate-languages
  ;; give each patch a meaning
  ask patches [
    ;; there is a chance that the patch contains a relation, otherwise
    ;; there is an equal chance that it contains one or two objects
    ifelse (random 100) < relation-prob and num-relations != 0
      [ set patch-meaning generate-meaning 2 1 ]
      [ ifelse num-objects > 1
          [ set patch-meaning generate-meaning (one-of (list 1 2)) 0 ]
          [ set patch-meaning generate-meaning 1 0 ] ]
    ;; color the patch to weakly indicate what meaning it contains
    color-patch
  ]
  crt num-turtles [
    set color red
    setxy random-xcor random-ycor
    setup-brain
  ]
  reset-ticks
end

to setup-test
  clear-all
  ls:reset
  set attempt-list []
  set accuracy-list []
  ;; get the meaning and message languages set up
  generate-languages
  ;; give each patch a meaning
  ask patches [
    ;; there is a chance that the patch contains a relation, otherwise
    ;; there is an equal chance that it contains one or two objects
    ifelse (random 100) > relation-prob and num-relations != 0
      [ set patch-meaning generate-meaning 2 1 ]
      [ ifelse num-objects > 1
          [ set patch-meaning generate-meaning (one-of (list 1 2)) 0 ]
          [ set patch-meaning generate-meaning 1 0 ] ]
  ]
  crt num-turtles [
    set color red
    setup-brain
    face patch-at min-pxcor min-pycor
  ]
  reset-ticks
end

to go
  set attempt-list []
  ask turtles [
    move
    talk
    update
  ]
  ;; append to the end of the list the mean accuracy of the model at this tick
  set accuracy-list lput mean attempt-list accuracy-list
  tick
end

;; TURTLE PROCEDURES

to setup-brain
  ;; sets up the turtle's brain
  set memory []
  (ls:load-headless-model "ConversationWorldBrain.nlogo" [ set brain ? ])
  ls:set-name brain (word "Brain of " self)
  (ls:ask brain [ setup-brain ?1 ?2 ?3 ?4 ?5 ?6 ?7 ]
    meaning-language middle-layer num-middle-layers message-language
    meaning-to-message-iterations 10 learning-rate)
end

to-report map-meaning [ p-meaning ]
  ls:let inputs1 map [ member? ? p-meaning ] meaning-language
  report ls:report brain [ apply-bools1 inputs1 ]
end

to-report map-message [ a-message ]
  ls:let inputs2 map [ member? ? a-message ] message-language
  report ls:report brain [ apply-bools2 inputs2 ]
end

to update
  ;; send the contents of memory to the child model to improve the mapping
  (ls:ask [brain] of self [ update-mappings ? ] memory)
end

to move
  ;; turn and move a random amount
  rt (random 60) - 30
  forward 1
end

to talk
  ;; pick someone to talk to
  let interlocutor one-of other turtles
  if (interlocutor != nobody) [
    ;; map the meaning to a message
    let message (map-meaning [patch-meaning] of patch-here)
    ;; give the message to the other turtle and have them interpret it
    let recieved-meaning [ map-message message ] of interlocutor
    ;; check if there is a meaning mismatch
    ifelse recieved-meaning != map-meaning [patch-meaning] of patch-here
      [ set success? false ]
      [ set success? true ]
    ;; encode the present interaction in memory
    set last-meaning map [ member? ? [patch-meaning] of patch-here ] meaning-language
    set last-message [ map-meaning map [ member? ? [patch-meaning] of patch-here ] meaning-language ] of interlocutor
    set memory lput (list last-meaning last-message) memory
    ;; if the contents of memory exceed its capacity, forget the oldest thing in memory
    if length memory > memory-capacity [ set memory but-first memory ]
    ;; record whether the interaction was successful
    ifelse (success?)
      [ set attempt-list lput 1 attempt-list ]
      [ set attempt-list lput 0 attempt-list ]
  ]
end

;; PATCH PROCEDURES

to-report generate-meaning [ num-ob num-rel ]
  ;; helper function to generate meanings
  let p-meaning (list (n-of num-ob objects) (n-of num-rel relations))
  report flatten-list p-meaning
end

;; GENERAL PROCEDURES

to generate-languages
  ;; sets up the language and meaning models
  generate-meaning-language
  generate-message-language
  set all-meanings generate-all-meanings
end

to generate-meaning-language
  ;; generates the sets of objects and relations, then combines them into a single list
  set relations map [ word "r" ? ] (n-values num-relations [?])
  set objects map [ word "o" ? ] (n-values num-objects [?])
  set meaning-language flatten-list (list objects relations)
end

to generate-message-language
  ;; generates a list of words in the language
  set message-language map [ word "w" ? ] (n-values num-words [?])
end

to-report generate-message
  ;; convenience function to generate messages for the simple
  ;; brain used in the non-LevelSpace version of this model
  let word1 one-of message-language
  let word2 one-of message-language
  let word3 one-of message-language
  report (list word1 word2 word3)
end

to-report generate-all-meanings
  ;; generates all possible meanings with the present language
  let two_meanings cartesian-product objects objects
  let three_meanings cartesian-product two_meanings relations
  report reduce sentence (list objects two_meanings three_meanings)
end

to-report cartesian-product [ list1 list2 ]
  ;; produces a list of all possible pairs where the first element of each pair
  ;; comes from the first list and the second element comes from the second list
  report reduce sentence map [ cartesian-helper ? list2 ] list1
end

to-report cartesian-helper [ element list2 ]
  ;; required by cartesian-product to work well. This could also be used to generate
  ;; a version of cartesian-product that takes an arbitrary number of lists, but since
  ;; I only need it to work on two lists I'm sparing the effort.
  report map [ (list element ?) ] list2
end

to-report flatten-list [ lst ]
  ;; flattens nested lists into a single list
  if (reduce [ ?1 or is-list? ?2 ] fput false lst) [
    set lst reduce [ sentence ?1 ?2 ] lst
    set lst flatten-list lst
  ]
  report lst
end

;; AESTHETIC FUNCTIONS

to color-patch
  ;; colors the patch so that its hue roughly indicates the meaning it contains
  let vec map [ member? ? patch-meaning ] meaning-language
  set vec map [ ifelse-value ? [ 1 ] [ 0 ] ] vec
  set vec sum map [ (5 * ? * (item ? vec)) + (50 * ?) ] n-values length vec [?]
  set pcolor vec
end

;; REPORTERS

to-report windowed-accuracy [ window-n ]
  ;; takes the mean of the last window-n members of accuracy-list
  ifelse window-n > length accuracy-list
    [ report 0 ]
    [ report mean map [ item ? reverse accuracy-list ] n-values window-n [?] ]
end

to-report mean-attempt-list
  ifelse length attempt-list > 0
    [ report mean attempt-list ]
    [ report 0 ]
end
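The parent model above drives a separate child model, ConversationWorldBrain.nlogo (attached below), through the LevelSpace (ls) extension. That file is not reproduced on this page, but the ls:ask and ls:report calls in the code show the interface it has to expose. The sketch below is a minimal, hypothetical stub of that interface only: the global names and placeholder bodies are assumptions, not the actual brain implementation, which presumably learns a meaning-to-message mapping with the network parameters passed in the parent's setup-brain call.

;; Hypothetical stub of the interface ConversationWorldBrain.nlogo is expected to
;; provide, inferred from the ls:ask / ls:report calls in the parent model above.
;; The real child model (see Attached files) implements the learned mapping;
;; everything below is a placeholder.

globals [ brain-meanings brain-messages ]   ;; assumed names, not taken from the real file

to setup-brain [ meanings hidden-size num-hidden messages iterations reps rate ]
  ;; called once per turtle brain with the two languages and network parameters;
  ;; the real model would build and initialize its layers here
  set brain-meanings meanings
  set brain-messages messages
end

to-report apply-bools1 [ bool-inputs ]
  ;; meaning -> message direction; placeholder reports the whole word list
  report brain-messages
end

to-report apply-bools2 [ bool-inputs ]
  ;; message -> meaning direction; placeholder reports the whole meaning list
  report brain-meanings
end

to update-mappings [ memory-pairs ]
  ;; the real model would train on the remembered (meaning, message) pairs here
end

A stub like this is only useful for exercising the parent's LevelSpace plumbing without the trained brain; for real runs, use the attached ConversationWorldBrain.nlogo.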
There is only one version of this model, created over 8 years ago by Nathan Couch.
Attached files
| File | Type | Description | Last updated |
| --- | --- | --- | --- |
| AMB-FinalPaper.docx | Word document | The Couch (2016) referred to in the documentation. | over 8 years ago, by Nathan Couch |
| ConversationWorldBrain.nlogo | NetLogo model | The code for the child (brain) models loaded by this model. | over 8 years ago, by Nathan Couch |
This model does not have any ancestors or descendants.