Markov Chain trained on King James Bible and Programming Books


It seems like they might get better automated results (instead of discarding 90%) if they picked a 2-word phrase a few words into a source sentence, found that identical 2-word phrase in a sentence from the other source, and pasted in the rest of that sentence. Then they could move a few words into the pasted part and start checking 2-word phrases again, looking for another identical match back in the first source for a second paste.
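A minimal sketch of that splice in C++ (the tokenizer, the corpus layout, and the spliceOnce / BigramIndex names are illustrative assumptions, not anyone's actual implementation):

#include <map>
#include <sstream>
#include <string>
#include <utility>
#include <vector>

// Split a sentence into words (whitespace-delimited, for simplicity).
std::vector<std::string> tokenize(const std::string& sentence)
{
    std::istringstream in(sentence);
    std::vector<std::string> words;
    std::string w;
    while (in >> w)
        words.push_back(w);
    return words;
}

// Index every 2-word phrase in a corpus by where it occurs.
struct BigramIndex
{
    // (word i, word i+1) -> list of (sentence index, word position)
    std::map<std::pair<std::string, std::string>,
             std::vector<std::pair<size_t, size_t>>> where;

    explicit BigramIndex(const std::vector<std::vector<std::string>>& corpus)
    {
        for (size_t s = 0; s < corpus.size(); ++s)
            for (size_t i = 0; i + 1 < corpus[s].size(); ++i)
                where[{corpus[s][i], corpus[s][i + 1]}].push_back({s, i});
    }
};

// Walk a few words into 'src'; at the first 2-word phrase that also occurs
// in 'other', keep everything up to it and paste in the rest of the
// matching sentence from 'other'.
std::vector<std::string> spliceOnce(
    const std::vector<std::string>& src,
    const std::vector<std::vector<std::string>>& other,
    const BigramIndex& otherIndex,
    size_t startOffset)
{
    for (size_t i = startOffset; i + 1 < src.size(); ++i)
    {
        auto it = otherIndex.where.find({src[i], src[i + 1]});
        if (it == otherIndex.where.end())
            continue;

        size_t sent = it->second.front().first;   // first match will do
        size_t pos  = it->second.front().second;
        std::vector<std::string> out(src.begin(), src.begin() + i + 2);
        out.insert(out.end(), other[sent].begin() + pos + 2, other[sent].end());
        return out;
    }
    return src;  // no shared 2-word phrase; keep the original sentence
}

Calling spliceOnce a second time with the two sources swapped, starting a few words into the pasted part, would give the second paste back into the first source.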



Getting extra fancy, maybe the Markov models could be combined with neural nets, and the NPCs could have some basic ability to communicate (input neurons could be linked to "heard" words, and output neurons could be linked to words they want to say), though this would likely require some sort of training algorithm to get them to mimic conversation.
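A toy version of that wiring might look like this (C++; the shared vocabulary, the single weight layer, and the winner-takes-all output are invented simplifications):

#include <string>
#include <vector>

// Toy net: input neuron i fires when vocabulary word i is "heard", and
// output neuron j's activation is the urge to say vocabulary word j.
struct TalkingNet
{
    std::vector<std::string> vocab;          // shared in/out vocabulary
    std::vector<std::vector<float>> weight;  // weight[j][i]: input i -> output j

    std::string respond(const std::vector<std::string>& heard) const
    {
        // Activate the input neurons linked to the heard words.
        std::vector<float> in(vocab.size(), 0.0f);
        for (const std::string& w : heard)
            for (size_t i = 0; i < vocab.size(); ++i)
                if (vocab[i] == w)
                    in[i] = 1.0f;

        // The most strongly activated output neuron wins the right to speak.
        size_t best = 0;
        float bestAct = -1e30f;
        for (size_t j = 0; j < vocab.size(); ++j)
        {
            float act = 0.0f;
            for (size_t i = 0; i < vocab.size(); ++i)
                act += weight[j][i] * in[i];
            if (act > bestAct)
            {
                bestAct = act;
                best = j;
            }
        }
        return vocab[best];
    }
};

The missing piece is exactly the training algorithm mentioned above; one option would be to let the Markov model propose candidate words and have the net merely choose among them.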

Yes, I was thinking about neural networks as well. I wondered if it were possible to teach the generator what funny sentences are by giving a "funny rating" to every sentence it spews out. You'd need a metric for what counts as funny, and I'm not sure if the path of state changes can be used to measure that. Would be worth trying.
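One hypothetical way such a rating could feed back through the path of state changes is to credit every transition a rated sentence used; all the names and the weighting rule below are made up for illustration:

#include <algorithm>
#include <map>
#include <random>
#include <string>
#include <vector>

// Markov chain whose transition sampling is nudged by human "funny ratings".
struct RatedChain
{
    // transition (current state -> next word) -> accumulated rating
    std::map<std::pair<std::string, std::string>, float> weight;
    std::mt19937 rng{std::random_device{}()};

    // Pick the next word among the chain's normal candidates, biased
    // toward transitions that appeared in highly rated sentences.
    std::string pickNext(const std::string& state,
                         const std::vector<std::string>& candidates)
    {
        std::vector<float> w;
        for (const std::string& c : candidates)
            w.push_back(std::max(0.1f, weight[{state, c}] + 1.0f));
        std::discrete_distribution<size_t> d(w.begin(), w.end());
        return candidates[d(rng)];
    }

    // After a human rates a whole sentence, credit every transition in it.
    void rate(const std::vector<std::string>& path, float funnyRating)
    {
        for (size_t i = 0; i + 1 < path.size(); ++i)
            weight[{path[i], path[i + 1]}] += funnyRating;
    }
};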

I feel coding would be a lot better if we did things like


#include <stdexcept>

// Remap the braces and error handling onto more scriptural vocabulary.
#define LORD {
#define AMEN }
#define JESUS_CHRIST(x) (throw std::runtime_error(x));

bool read_the_bible();  // defined elsewhere, presumably

void foo()
LORD
    if( !read_the_bible() )
        JESUS_CHRIST("Thy haveth sinned")
AMEN
"I would try to find halo source code by bungie best fps engine ever created, u see why call of duty loses speed due to its detail." -- GettingNifty

These are freakin' hilarious!

I have a feeling this will spawn a new programming language - perhaps "Holy Code" or some such, where the syntax will look very similar to what people have already posted. :P

Gotta love custom #defines!



Could be. A while ago I thought up some ideas for an "unusual" way to implement a neural net (*1), but couldn't think of much to use it for, so I never got around to implementing it.

Combining it with a Markov model could be interesting, and possibly some AI control markup could be included in the mix as well.

Them's fightin' words! ${getAngry tgt="other"}
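Pulling tags like that out of a dialogue line before displaying it could be as simple as the sketch below, assuming the ${...} syntax shown above is the whole grammar:

#include <string>
#include <vector>

// Extract ${...} control tags from a dialogue line, leaving only the
// spoken text in 'line'. Returns the tag bodies for the AI to act on.
std::vector<std::string> extractTags(std::string& line)
{
    std::vector<std::string> tags;
    size_t open;
    while ((open = line.find("${")) != std::string::npos)
    {
        size_t close = line.find('}', open);
        if (close == std::string::npos)
            break;  // malformed tag; leave the rest alone
        tags.push_back(line.substr(open + 2, close - open - 2));
        line.erase(open, close - open + 1);
    }
    return tags;
}

For the line above, this yields the spoken text plus one tag, getAngry tgt="other", for the NPC's AI to act on.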

*1: Rather than neurons linking together directly (via each neuron holding lists of other neurons it is linked to), they would be organized into a regular (3D or 4D) grid, and work in a way slightly resembling the Minecraft redstone system. In general, neurons would communicate directly with adjacent neurons via their faces (6 face neighbors in 3D, 8 in 4D), but a non-local "skip" mechanism could also be provided.

On the positive side, it could have a higher neuron density with a lower per-neuron CPU and RAM cost. On the downside, non-local signaling would generally require either using "skips" or filling a lot of the space with signal-propagating "wires" (non-behavioral neurons). Conventional training algorithms such as backpropagation also wouldn't really work (partly due to some of the imagined behaviors, partly due to the possible/likely presence of cycles). It would probably require either training GA-style, or having each neuron operate independently (similar to a finite-state machine) and adapt itself primarily via signaling (possibly both by "firing" mixed with "promoter/inhibitor" signaling).

I didn't get around to it, but might do so eventually...

3D vs 4D would be partly about cost (3D could have a higher neuron count at a lower cost; 4D could likely do better signaling, but would put smaller limits on the viable size of the neuron cube). Initially I was leaning mostly toward 3D.
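For concreteness, a bare-bones sketch of the 3D variant (the grid size, threshold rule, and double-buffered update are invented placeholders, not a worked-out design):

#include <vector>

// Neurons live in a fixed 3D lattice; each update reads only the 6
// face-adjacent cells, so signals propagate wire-like through the grid.
struct NeuronGrid
{
    static const int N = 16;        // N*N*N neuron cube
    std::vector<float> cur, next;   // one activation value per cell

    NeuronGrid() : cur(N * N * N, 0.0f), next(N * N * N, 0.0f) {}

    static int idx(int x, int y, int z) { return (z * N + y) * N + x; }

    void step()
    {
        for (int z = 0; z < N; ++z)
        for (int y = 0; y < N; ++y)
        for (int x = 0; x < N; ++x)
        {
            // Sum the 6 face neighbors (cells at the edges see fewer).
            const int d[6][3] = {{1,0,0},{-1,0,0},{0,1,0},
                                 {0,-1,0},{0,0,1},{0,0,-1}};
            float sum = 0.0f;
            for (const auto& o : d)
            {
                int nx = x + o[0], ny = y + o[1], nz = z + o[2];
                if (nx < 0 || ny < 0 || nz < 0 ||
                    nx >= N || ny >= N || nz >= N)
                    continue;
                sum += cur[idx(nx, ny, nz)];
            }
            // Placeholder threshold rule; stands in for whatever behavior
            // the individual neurons would actually learn.
            next[idx(x, y, z)] = (sum > 1.5f) ? 1.0f : 0.0f;
        }
        cur.swap(next);
    }
};

Since there is no layer structure to backpropagate through, training would indeed have to be GA-style or per-neuron local adaptation, as described above.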

Funniness is likely harder to evaluate without an actual human providing direct feedback.


ok.

In view of this, it is ironic that introductory programming is most often taught in a highly imperative style. This may be a memorial unto the children of whoredoms.

I always assumed that's why they taught introductory programming in such an imperative manner.

