Adventures in Reinforcement Learning: Trials and Errors

Over the last few weeks, I’ve been working on learning some basic reinforcement learning models. All my code is available here (warning: it’s largely undocumented, borrows code from a few sources, and isn’t the prettiest).

The Environment

The environment I used was the OpenAI Gym, a library created by OpenAI as a benchmark for reinforcement learning systems. For my first experiments, I used the CartPole environment:

The goal of this environment is to keep the pole balanced for as long as possible by moving the cart left and right. It has three actions (left, right, and no-op) and four inputs: the position and velocity of the cart, and the angle and angular velocity of the pole.

Q-Learning

Q(s_t, a_t) \leftarrow (1 - \alpha) \cdot Q(s_t, a_t) + \alpha \cdot \big( r_t + \gamma \cdot \max_a Q(s_{t+1}, a) \big)

Q-learning update equation, from Wikipedia

The first model I implemented was a basic Q-learning model. Q-learning operates by estimating the expected value of each action given the current state, including both the reward from the immediate next time step and the expected reward from the resulting state. For my implementation, the Q function is simply a matrix mapping (state, action) -> value, and at any time step the agent selects the action with the highest value in the given state, with some small probability of choosing randomly instead (this helps it explore and makes it less likely to get trapped in local minima).
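In code, the core of this is only a few lines. Here’s a minimal sketch of the tabular update and the epsilon-greedy action selection; the sizes and hyperparameter values below are placeholders rather than my exact settings:

import numpy as np

# Minimal tabular Q-learning sketch; sizes and hyperparameters are placeholders.
n_states, n_actions = 6400, 3
alpha, gamma, epsilon = 0.1, 0.99, 0.05

Q = np.zeros((n_states, n_actions))  # the (state, action) -> value matrix

def choose_action(state):
    # epsilon-greedy: mostly exploit the best known action, occasionally explore
    if np.random.rand() < epsilon:
        return np.random.randint(n_actions)
    return int(np.argmax(Q[state]))

def update(state, action, reward, next_state):
    # the update equation from above
    target = reward + gamma * np.max(Q[next_state])
    Q[state, action] = (1 - alpha) * Q[state, action] + alpha * target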

And it learns! Quite quickly, in fact.

A graph showing progress of a Q learning model (x-axis is training iteration, y-axis is how many frames it survived)

This is comparatively simple to implement, but it has some problems. Specifically, in order to index into the matrix with (state, action) pairs, the state space must be discrete. To accomplish this I binned the parameters provided by the environment (8 bins for each of the cart’s parameters and 10 bins for each of the pole’s, for 6,400 possible states). There’s a trade-off here: more bins give a finer-grained model which is better able to comprehend the state of the environment, but training takes longer because there are more entries to fill in the matrix and each one is encountered less frequently (the state space becomes sparser).
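The binning itself is straightforward. Here’s a sketch of what that discretization might look like; the bin edges below are illustrative guesses, not my exact ranges:

import numpy as np

# Illustrative bin edges; 7 interior edges give 8 bins, 9 edges give 10 bins.
cart_pos_bins = np.linspace(-2.4, 2.4, 7)
cart_vel_bins = np.linspace(-3.0, 3.0, 7)
pole_ang_bins = np.linspace(-0.21, 0.21, 9)
pole_vel_bins = np.linspace(-3.0, 3.0, 9)

def discretize(observation):
    # Map the four continuous observations to a single integer state index.
    cp, cv, pa, pv = observation
    idx = (np.digitize(cp, cart_pos_bins),
           np.digitize(cv, cart_vel_bins),
           np.digitize(pa, pole_ang_bins),
           np.digitize(pv, pole_vel_bins))
    # Flatten the 8 x 8 x 10 x 10 grid into one of 6400 states.
    return np.ravel_multi_index(idx, (8, 8, 10, 10))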

How do we solve this problem? We need to go deeper. 

Deep-Q

The only difference between basic Q-learning and Deep-Q learning is that the Q function is a neural network instead of a matrix, which allows us to use a continuous state space. The network is trained to produce the same target value used above for the more basic Q-learning model. In theory, this should perform better, because the network can generalize across similar states instead of having to learn every discretized state independently.
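A rough sketch of what that looks like, assuming the Q network takes a state concatenated with a one-hot action and returns a single value (layer sizes here are illustrative, not my exact architecture):

import numpy as np
import tensorflow as tf

n_inputs, n_actions = 4, 3

q_net = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu", input_shape=(n_inputs + n_actions,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1),   # Q(s, a)
])
q_net.compile(optimizer="adam", loss="mse")   # trained toward r + gamma * max_a Q(s', a)

def q_values(state):
    # Evaluate every action for this state: one forward pass per possible action.
    actions = np.eye(n_actions)
    batch = np.hstack([np.tile(state, (n_actions, 1)), actions])
    return q_net.predict(batch).ravel()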

Unfortunately, this model wasn’t very practical. The Q function estimates the value of each action separately, which means the model runs several times per iteration. This makes it very slow to train, which makes it kind of miserable to iterate on. It was learning, but so slowly that I moved right on to my next experiment for the sake of time and my sanity.

Asynchronous Actor-Critic

The Actor-Critic model improves over Q-learning in two significant ways. First off, it’s asynchronous. Rather than having a single network, it parallelizes the learning across multiple workers which each have their own environment. Each worker’s experiences are used to update the shared network asynchronously, which results in a dramatic improvement in training performance.

Secondly, it doesn’t just output the value (V(s), below), which corresponds to the value of the current state (similar to the Q value above, but without reference to actions); it also outputs the policy (π(s) below), which is a distribution over all possible actions. Importantly, the network’s own value judgements are used to train its policy: based on the action taken, it compares the value of the new state against its estimate for the previous state and updates the policy to reflect the difference.

We can make one more improvement: Instead of using the value estimations and rewards directly, we can calculate the advantage. This represents how much higher the reward was than estimated, which gives a larger weight to places where the network struggles and lets it exploit sparse rewards more effectively.
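Concretely, the advantage for each step of a rollout is just the discounted return minus the critic’s value estimate. A minimal sketch, assuming we’ve already collected the rewards and value estimates for a rollout:

import numpy as np

def compute_advantages(rewards, values, bootstrap_value, gamma=0.99):
    # Discounted returns for the rollout, computed backwards from the last step.
    returns = []
    running = bootstrap_value            # critic's estimate for the state after the rollout
    for r in reversed(rewards):
        running = r + gamma * running
        returns.append(running)
    returns = np.array(returns[::-1])
    # Advantage: how much better the outcome was than the critic expected.
    return returns - np.array(values)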

A diagram of the Asynchronous Actor-Critic model (from here)

I used a different environment this time, moving out of the parameterized space into something more complex: Breakout, one of the Atari environments included in the Gym.

The Breakout environment

My specific architecture used 2 convolutional layers to parse the images, followed by a single LSTM layer to capture time relations, and then a set of fully-connected layers for the policy and value outputs (there’s a rough sketch of this after the graphs below). You can see an early example of it above! This model actually worked very well for me– here’s the TensorBoard output from 20 hours of training:


Graphs showing a steady increase in length, reward, and value
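For reference, the network trunk looks roughly like this sketch; the frame size, filter counts, and layer sizes are illustrative, not my exact hyperparameters:

import tensorflow as tf
from tensorflow.keras import layers

n_actions = 4  # Breakout: no-op, fire, right, left

# One grayscale frame per time step; the LSTM carries information across frames.
frames = layers.Input(shape=(None, 84, 84, 1))
x = layers.TimeDistributed(layers.Conv2D(16, 8, strides=4, activation="relu"))(frames)
x = layers.TimeDistributed(layers.Conv2D(32, 4, strides=2, activation="relu"))(x)
x = layers.TimeDistributed(layers.Flatten())(x)
x = layers.LSTM(256)(x)
policy = layers.Dense(n_actions, activation="softmax", name="policy")(x)
value = layers.Dense(1, name="value")(x)
model = tf.keras.Model(frames, [policy, value])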

Interestingly, the gradient and variable norm also increased over the course of the training, which suggests that I need to normalize the model more strongly, but given the duration of the test, I’m not sure I want to cross-validate that.

Unfortunately, due to a rare bug that kept a training episode from terminating, the workers all eventually hung and my computer ran out of memory as the episodes dragged on forever (alas). However, it did still seem to be improving when it died, and I still have the model, so I plan to continue training and see just how good it eventually gets. I also made a few improvements over that first try above, which you can see in the sample below:


A black-and-white, cropped version of Breakout

As you can probably see, the images have been pre-processed to crop out the score, resized, and converted to black-and-white for efficiency. Additionally, I moved the training and sampling of the network to run on my GPU, which freed up the CPU to run the simulations of the environment. This more than doubled the speed of the training and was what made it reasonable to train overnight like I did above.
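The pre-processing is the usual Atari treatment. Here’s a sketch of roughly what it looks like, assuming raw 210x160 RGB frames; the crop boundaries and output size are illustrative:

import numpy as np
import cv2

def preprocess(frame):
    gray = cv2.cvtColor(frame, cv2.COLOR_RGB2GRAY)  # drop color for efficiency
    cropped = gray[34:194, :]                       # cut off the score area
    small = cv2.resize(cropped, (80, 80))           # shrink before feeding the network
    return small.astype(np.float32) / 255.0         # scale to [0, 1]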

Option-Critic

One feature of the Atari environments caught my attention, though: frame skip. In the Breakout environment, each action has a random chance of being repeated 2, 3, or 4 times, which adds some noise to the behavior of the agents that they have to account for. I began to wonder, however, how an agent might perform if it were able to choose how many frames to execute an action for. It’s been done before with a Q-learning model, but that approach simply adds an additional version of each action for each possible number of time steps, which dramatically increases the cost of choosing an action and the number of hyperparameters, and makes the model more sparse in general. What would be ideal is a model which can use some sort of regression to select the duration directly. Fortuitously, the Actor-Critic model is ideal for this!


The architecture proposed by Lakshminarayanan et al. in the paper above

We can simply add another output to the model which selects the number of times to repeat the chosen action.

This is remarkably similar to a concept called “options.” Options generalize decisions over time by outputting not a single discrete decision, but two components: a distribution over possible actions (we have that already) and a termination condition. Ideally, the termination condition would be some subset of the possible states, so that the option could continue until some state is reached; however, this is difficult to accomplish with the neural network interpreting the states. Instead, our termination condition is a simple scalar: the number of time steps for which to repeat the sampled action.
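In terms of the network, that just means one more head next to the policy and value outputs. A minimal sketch, assuming a shared feature vector coming out of the convolutional/LSTM trunk above; the sizes and the maximum repeat count are illustrative:

import tensorflow as tf
from tensorflow.keras import layers

n_actions = 4
max_repeat = 4

features = layers.Input(shape=(256,))  # output of the shared conv/LSTM trunk
policy = layers.Dense(n_actions, activation="softmax", name="policy")(features)
value = layers.Dense(1, name="value")(features)
duration = layers.Dense(max_repeat, activation="softmax", name="duration")(features)
heads = tf.keras.Model(features, [policy, value, duration])

# At sampling time: draw an action from the policy head, draw a repeat count
# from the duration head, and hold that action for the chosen number of frames.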

This is what I’ve been working on, but unfortunately, learning the options seems to be much more difficult than learning one-frame actions. My best results have come from bootstrapping the network with the pre-trained checkpoint from the actor-critic model above, but it simply always chose a duration of 1 frame and then continued learning exactly as before. If I choose to reward the model for using larger time steps, it always chooses the maximum and ignores the ball completely to cash in on that easy reward. Reward engineering is tough!

In the near future, I’ll continue working on this problem and see what I can make of it, but unfortunately with this sort of reinforcement learning model, sometimes it’s nigh-impossible to identify the source of a bug, and iteration takes so long that it’s difficult to find them all in a reasonable amount of time.

Next steps

My next step is really to move out of the OpenAI Gym and out into the wild a little bit, or rather, into Unity. Unity is a very popular game development engine which recently released a toolkit for training machine learning agents in environments created in the engine. This opens up some really interesting possibilities, as the environments can be anything you can create in Unity, including physics and collision simulations, realistic lighting, and, most importantly, human interaction. I’m excited to explore it.


Unexpected results in ML & project updates

There was a recent paper that caught my eye that I think is worth sharing with anyone interested in the themes of my blog:

The Surprising Creativity of Digital Evolution: A Collection of Anecdotes from the Evolutionary Computation and Artificial Life Research Communities

If you’re into reading papers, definitely take a look at this one; if not, Károly Zsolnai-Fehér of Two Minute Papers did an excellent video about it. This is one of the things that really inspires my love of machine learning, and it’s great to see awareness of this really interesting stuff on the rise.

An additional note: if you haven’t yet, check out some other blogs hosting generative humor, including AI Weirdness and Botnik Studios (not neural networks, but still hilarious). There’s also a (very new) subreddit for Botnik with similar predictive text humor, so check that out.

Finally, since I’ve been getting a ton of new subscribers (welcome!), an update on my current projects: the first one, which I’ll be doing a much more substantial write-up of some time in the next month, is a GIMP plugin interface for Deep Dream. Here’s an example of the kind of stuff you might see some more of:


Tubingen with the “honeycomb” class deep-dreamed onto it

I’m really excited about this project finally coming together– once this is set up, I may look into the feasibility of porting it to a Photoshop plugin and/or creating a plugin for Neural-Style, since these tools really need to be put into the hands of artists.

The second is just learning more about reinforcement learning using the OpenAI Gym and (eventually) the Unity ML-Agents framework. But look! It learns:


Asynchronous Actor-Critic RL model playing Breakout

It’s not very smart yet, but it’s trying its best. (I’ve since switched to grayscale for better performance)

In the meantime, I’ve found a set of web scrapers that pull data from BoardGameGeek, so once I get that set up to get natural language descriptions instead of just ratings data, expect some AI-generated board games.

[last-minute edit]

I should also mention, I did run char-rnn on the database of Urban Dictionary results, but then realized belatedly that this website is on my resume which I hand out to employers, so I decided to do a bit more thinking about whether I wanted to dip into the NSFW on this blog. I’ll keep you updated on that.


Sentiment Translator

Introduction

A very common problem in natural language processing is that of sentiment analysis, or rather, attempting to analytically determine if a given input has an overall positive or negative tone. For example, a positive string might be “I really enjoyed this movie!” whereas a negative input might be “it was boring, vapid, and dull.”

My idea is essentially thus: given recent advancements in machine translation, it should be possible to translate between sentiments (as opposed to between languages). The difficulty in doing this presented itself almost immediately: there is no dataset available with a one-to-one mapping between negative and positive sentiments. There are models for unpaired machine translation, but they’re still relatively unproven.

My first implementation was a fairly simple rule-based approach: try to remove or add inverters (the word “not,” for example) and replace words with their antonyms until the sentiment changes appropriately. This worked well for very simple cases, but just wasn’t smart enough to capture more complex relationships (for example, it loved to translate “good” to “evil,” even when “bad” would make a lot more sense). My new implementation takes a different approach, using (and abusing) a model loosely adapted from Daniele Grattarola’s Twitter Sentiment CNN.
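To make that first approach concrete, the rule-based pass amounted to something like this toy sketch; the antonym table here is a tiny illustrative stand-in for the real word lists:

ANTONYMS = {"good": "bad", "enjoyed": "hated", "boring": "exciting", "dull": "lively"}

def flip_word(word):
    # Swap a word for its antonym if we have one (inverter insertion/removal,
    # e.g. adding or dropping "not", would be handled by a similar rule).
    return ANTONYMS.get(word, word)

def flip_sentiment(words):
    # The real version applies changes one at a time until a sentiment check flips;
    # this just replaces everything it can, which is the crudest variant.
    return [flip_word(w) for w in words]

print(flip_sentiment("i really enjoyed this movie".split()))
# -> ['i', 'really', 'hated', 'this', 'movie']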

The Data

I used the aclImdb dataset, a set of reviews scraped from the Internet Movie Database, split into four parts of ~12,500 reviews each: positive training, negative training, positive test, and negative test. Movie reviews work very well for this problem because they are essentially already annotated with the sentiment in the form of the user’s star rating for the film.

In pre-processing, I split the reviews into sentences to reduce the length of each input and converted each one into word vectors (in my experiments, I used the GoogleNews-300 pretrained vectors). Unfortunately, due to the size of the input when converted into 300-dimensional vectors, I frequently ran out of memory during training. To reduce this issue, I only load the million most common words from the GoogleNews negative-300 vectors.
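Loading a capped vocabulary is a one-liner with gensim. A sketch, assuming the standard GoogleNews-vectors-negative300.bin file is on disk:

from gensim.models import KeyedVectors

vectors = KeyedVectors.load_word2vec_format(
    "GoogleNews-vectors-negative300.bin", binary=True, limit=1000000)

def embed_sentence(words):
    # Stack the 300-d vectors for the words we know; skip out-of-vocabulary words.
    return [vectors[w] for w in words if w in vectors]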

The Model

The model is based on a set of one-dimensional convolutions over the series of word vectors, followed by a max pooling layer, ReLU, and a fully-connected layer. This is trained as a standard sentiment classifier, learning to predict the sentiment of a given input sentence.
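Something along these lines, sketched in Keras; the filter counts and layer sizes are illustrative rather than the exact values from the model I adapted:

import tensorflow as tf
from tensorflow.keras import layers

inputs = layers.Input(shape=(None, 300))              # a sentence as a sequence of word vectors
x = layers.Conv1D(128, 5, activation="relu")(inputs)  # 1-d convolutions over the word vectors
x = layers.GlobalMaxPooling1D()(x)                    # max pooling over the whole sentence
x = layers.Dense(64, activation="relu")(x)
outputs = layers.Dense(1, activation="sigmoid")(x)    # positive vs. negative sentiment
model = tf.keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])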

At sampling time, however, we do something different. We run the input sentence through the classifier as normal, but we give the classifier a different target sentiment. We then find the gradient of the classifier’s loss with respect to the input word vectors. This may be familiar to anyone who’s implemented Google’s Deep Dream algorithm or worked with adversarial images. In essence, this gives us the direction in which we should perturb the input vectors to cause the largest change in the sentiment. Additionally, the magnitude of the gradient for a given word roughly corresponds to how much that word contributes to the sentiment (and therefore, how important it is to change).
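In TensorFlow terms, this is just a gradient tape around the forward pass, with the gradient taken with respect to the inputs rather than the weights. A sketch, assuming model is the classifier above and sentence is a (1, n_words, 300) array of word vectors:

import tensorflow as tf

def sentiment_gradient(model, sentence, target_sentiment):
    sentence = tf.convert_to_tensor(sentence, dtype=tf.float32)
    target = tf.constant([[target_sentiment]], dtype=tf.float32)  # e.g. 1.0 for "positive"
    with tf.GradientTape() as tape:
        tape.watch(sentence)                    # we want gradients w.r.t. the inputs
        prediction = model(sentence)
        loss = tf.keras.losses.binary_crossentropy(target, prediction)
    # Per-word direction (and magnitude) to push the sentence toward the target sentiment.
    return tape.gradient(loss, sentence)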

Here we hit another problem. The space of possible words is discrete, but the word vector space is continuous (and very high-dimensional, and thus sparse). How can we be sure that these gradients are moving towards an actual word? To be honest, I’m not entirely sure. My first approach was to use multiple gradient steps; however, this appeared to find minima in the sentiment that didn’t correspond to actual words in the input set. My second approach was to extend the gradient out as a ray from the original word and find the word vectors closest to this line. This worked a good deal better (specifically, it captures the “hate” <-> “love” relationship), but it still isn’t perfect: we still need a heuristic method to select which of the proposed word replacements to use, which in the end makes this method little better than the rule-based approach from my initial implementation.
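The ray idea looks something like the sketch below: project every candidate vector onto the ray that starts at the original word and points against the gradient, and keep the candidates lying closest to it. Here vocab_vectors and vocab_words are assumed stand-ins for the loaded embedding matrix and its word list:

import numpy as np

def words_along_ray(word_vec, gradient, vocab_vectors, vocab_words, top_k=5):
    direction = -gradient / np.linalg.norm(gradient)   # step against the loss
    offsets = vocab_vectors - word_vec                  # each candidate relative to the original word
    along = offsets @ direction                         # distance along the ray
    perp = np.linalg.norm(offsets - np.outer(along, direction), axis=1)
    perp[along < 0] = np.inf                            # ignore words behind the starting point
    best = np.argsort(perp)[:top_k]                     # closest to the ray
    return [vocab_words[i] for i in best]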

Conclusion

The biggest realization I came to was that when mapping a discrete space to a continuous space, the meaning of the intermediate values is not always intuitive. This is what we see when we simply perform gradient descent on the word vectors– the sentiment converges very nicely, but the resulting vectors have barely changed from their original values. This is interesting in the domain of computer vision, as it typically results in an “adversarial image,” or an image which can fool the classifier into misclassifying it with a very high confidence while being indistinguishable from the original to a human. However, as we are hoping for some of the words to converge to different discrete values in the word space, this is less than ideal.

Additionally, one unanticipated disadvantage of the lack of paired data was the inability to mathematically verify the accuracy of the translations– there was no ground truth to translate to.

Future Work

One thought I’ve had is to try to do something similar to CycleGAN, which performs “image translation” in an unpaired fashion through a combination of GAN loss and “reconstruction loss”; however, this still introduces problems, as we cannot easily calculate gradients of the sentiment loss through the discretization into the word space.

It’s a tricky problem, but if anyone has any ideas, I’m interested.


Automated topic extraction with Latent Dirichlet Allocation

During last semester, I became aware of a really interesting NLP algorithm called LDA, short for “Latent Dirichlet Allocation.” The goal of the algorithm is to create a set of “topics,” each of which represents a specific human-interpretable concept.

The core assumption of LDA is that documents are generated as follows:

  1. For each document:
    1. Generate a distribution over topics based on model parameters
    2. For each word in the document:
      1. Sample a topic from the document’s topic distribution
      2. Sample a word from that topic’s distribution over words

The idea being that there is some latent variable (the topics) that informs word choice.

Unfortunately, the document->topic and topic->word distributions are impossible to calculate exactly, but it is possible to approximate them using Gibbs sampling or variational inference: approximation techniques which let us eventually converge to a solution close to the true one (insofar as such a thing exists). These have the side effect of being very slow, so the algorithm is not exactly the most efficient. Compared to training a neural network, though, it’s not actually unreasonable.
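If you want to play with this yourself, a library implementation such as gensim’s wraps the whole inference step. A minimal sketch with toy stand-in documents (the topic count and number of passes are illustrative):

from gensim import corpora, models

documents = [["belgian", "congo", "government"], ["run", "mantle", "home", "game"]]
dictionary = corpora.Dictionary(documents)
corpus = [dictionary.doc2bow(doc) for doc in documents]   # bag-of-words counts
lda = models.LdaModel(corpus, num_topics=2, id2word=dictionary, passes=10)
for topic_id, words in lda.show_topics(num_topics=2, num_words=10, formatted=False):
    print(topic_id, [w for w, _ in words])                # top words per topic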

Here are some results from running the algorithm on a dataset of news articles, where each line is a distinct topic. See if you can figure out what each topic is about:

command, field, one, marshal, last, general, boy, first, slim, perform

belgian, congo, government, independent, u.n., congolese, lumumba, nation, one, province

sept., oct., new, lake, 23, color, first, fall, river, season

run, two, mantle, one, home, game, mariners, record, hit, play

would, effect, generation, fallout, megaton, radiation, test, soviet, human, government

law, anti-trust, union, labor, act, price, manufacture, company, collect, bargain

And from a combination of science fiction and scientific papers:

brown, soil, detergent, wash, surface, john, house, daily, provide

casework, service, prevent, family, treatment, use, interview, care, time, help

company, questionnaire, list, manchester, store, busy, mail, plant, downtown

food, radiation, object, cost, meat, process, irradiate, product, visual, refrigerate

And finally, the SCP foundation. One thing to note is that I didn’t do as much data cleaning or parameter selection as I did with the previous datasets, so quality could be better. I’ll fine-tune the results later.

film, dust, character, neutrino, drug, pill, site-64t, supplemen, text, brand

photograph, photo, cognitohazard, photographed, effect, viewer, ascension, baxter, epsilon, ride

mr., kiryu, d-13321, head , dr., mad, corrupted, little, penny, rust

playback, inside, data, ritual, becam, went, object, orbit, contain, unit

tlaloc, station, turkey, none, d-69601:, dubcek, albright, materialized:, stop., deities

These are just a few examples but you can see how easily interpretable the results are with basically no human intervention or annotation. I’m hoping to apply this to some other datasets in the near future to see what sort of results I get.


Making art with neural style and fractals

I recently attempted to see if I could create art with Neural Style using only photos I’ve taken and fractal flames I’ve created with Fractorium and Apophysis. I must say, I’m very happy with the results! I generated the outputs using the Neural Style GUI and fine-tuned them using Photoshop– the latter was necessary to mitigate the black splotches still present in some images.


I’m planning to write up a post soon about a really interesting NLP algorithm, and I’ve been having fun recently training char-RNN on a database of Urban Dictionary so stay tuned for that.


Steam Games

While I work on my next big project, I decided to generate some random Steam game names. All of these are games that don’t actually exist:

Happy Panic

Unraveled Land

Mad Sharkness

The Heart’s Medicine

formic innocence

Heroes Over Europe – Full Mojo

Lovely Ventures

The Gravesable Moon Visual Novel

Hotline Miami 3: Back under Begins

Redemption Cemetery: Secret Of The Angel

Nightman: Trigger Element Space

Hellfrosted

Princess Maker Expedition

Gorescripted Addiction: Possibility

Mars Indian

The Ember Sigil

Train Simulator: Eternal Gun

5-Bit Soundtrack

Best Force

Happy Fantasy

Jackal Journey

Signal Flyng

 

And the winner for the most probable steam game name is:

The Walking Dead: Inside The Walking Dead: “The Secret of the Magic Crystal”

and also:

Steam Dev Days: Steam Edition


Espresso is the marshmallow of coffee (fun with Word2Vec)

In exploring the recipe dataset, I decided to have some fun with Word2vec, an algorithm originally created by Google. For the layperson, it works by looking at the contexts in which a given word appears and learning vectors to represent words such that words appearing in similar contexts have similar vectors. On the recipe dataset, this means that, for example, the vectors for vodka and cognac are very close together, wheat and rye are very close, chocolate and butterscotch are very close, etc.
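Training and querying the model is only a few lines with gensim. A toy sketch; the two-item corpus below is a stand-in for the real recipe corpus, and the parameters are illustrative:

from gensim.models import Word2Vec

sentences = [["vodka", "cognac", "lime"], ["wheat", "rye", "flour"]]  # stand-in corpus
model = Word2Vec(sentences, min_count=1)

# Nearest neighbors in the learned vector space.
print(model.wv.most_similar("vodka", topn=3))

# Analogy queries like the ones below use most_similar(positive=[...], negative=[...]).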

What’s really neat about this, though, is that it enables us to do some very interesting things. One of the properties of the vectors created is the ability to perform vector arithmetic, adding and subtracting these semantic vectors to create word analogies. Here are a few examples: (read a – b + c = d as “b is to a as c is to d”)

pie – pizza + calzone = blintz

That makes sense! Never would have thought of that to be honest.

banana – plantain + apple = blueberry

I guess an apple is just a big blueberry. Who knew

candy – marshmallow + coffee = espresso

I guess that makes sense. Weird though.

 fish – tuna + chocolate = candy

Ok, tuna is a type of fish, chocolate is a type of candy. I guess I’ll let that one slide.

coffee – tea + lemon = orange

tea – coffee + lemon = lime

That’s interesting. It seems to think coffee is sweeter than tea.

coffee – knife + spoon = expresso

Interesting. In addition to the marshmallow of candy, it’s also the spoon of cutlery.

rasin – grape + fish = offal

Wow, ok, I guess it doesn’t like rasins.

brie – cheese + candy = meringues (closely followed by “fondant”)

Makes sense. Fancy, soft, light.

ribbon – bar + dome = tented

Let’s try the classic word2vec analogy:

king – man + woman = bruce

What. The next closest option is “retired.”

I’m going to continue experimenting with this. I’ve also been getting some really good results with the chef-rnn, so I’ll get back to you with more of that soonish as well.


Chicken soup (for the robot soul?)

So I did a little more training with the chef-rnn at a lower learning rate to fine-tune it a bit and got some shockingly good results, with some weird quirks. I really just couldn’t resist posting more recipes. These are so much fun to read and try to imagine cooking/eating. This is the closest I’ve ever gotten to edible food, and I’m kind of tempted to try some of these at some point. As usual, I’m going to start with the regular samples and save the weird edge cases for later.

title: chicken & sausage pie
categories: pies
yield: 6 servings

2 chicken breasts, boneless
1 tb olive oil
1 garlic clove, crushed
1/4 c butter or margarine
1/4 c bread crumbs
1 c shredded mozzarella cheese
1 c shredded swiss or cheddar
-cheese
1 c shredded cheddar cheese

preheat oven to 350 degrees. in a small bowl, combine chicken,
cheese, parsley, mustard, salt and pepper. spread evenly over chicken
mixture. place a layer of chicken on top. bake uncovered at 350 for
20 minutes. sprinkle with parmesan cheese. bake uncovered for 10
minutes or until cheese melts and cheese is melted.

Wow… that actually sounds pretty good. Replace “chicken” with “sausage” at some point so it fits the name, and this could be a decent meat pie. A warning: this version of the network loves chicken.

title: chicken & vegetable stir fry
categories: main dish, poultry, cyberealm
yield: 6 servings

1 chicken
salt
pepper
soy sauce
olive oil
1 c mayonnaise
2 tb tomato paste
1 tb green onions, chopped
1 tb cumin
1 tb chili powder
1 ts worcestershire sauce
1/2 ts cumin
1/4 ts cinnamon
1/2 ts sugar
1/2 ts paprika
1/4 ts pepper

combine all ingredients in a saucepan and simmer until the vegetables
are tender. add chicken and simmer for another 5 minutes. add the
sauce and cook another 5 minutes. serve over rice.

each serving provides: 448 calories; 21 g protein; 12 g carbohydrates;
12 g fat; 30 mg cholesterol; 316 mg sodium; 31 mg calcium.

This needs more vegetables in my opinion, and I’m not a fan of the cup of mayonnaise, but ok, I can see this working.

title: chicken & vegetable soup (salsa de chili)
categories: poultry, soups/stews
yield: 6 servings

2 chicken breasts, boned and
-skinned
1/2 lb mushrooms, sliced
1 c onion, chopped
1 cn cream of mushroom soup
1 cn chicken broth
1 ts salt
1 tb chili powder
1/2 ts pepper
2 c chicken broth
2 c chicken stock
1 c parmesan cheese
3 chicken breasts, boned

1. place chicken with thyme, bay leaf, parsley, parsley, salt and
pepper in shallow baking pan. 2. heat, uncovered, in microwave oven
5 minutes or until chicken is tender. 3. add chicken, cook 10 minutes
or until chicken is tender. 3. remove chicken from pan and keep warm.
2. melt butter in same skillet over medium heat. add chicken
and saute over medium heat for 5 minutes or until lightly browned.
7. add beef stock to pan; cook, stirring often, until slightly
thickened, about 4 minutes. stir in chicken and orange juice. 5.
spoon sauce over chicken and vegetables; sprinkle with cheese.

And now it’s a soup! With orange juice! Also, it’s a multi-step recipe, but the step numbers are totally out of order (it goes 1 2 3 3 2 5 7 5). ‘Let’s try that again,’ says the network.

title: chicken & vegetable soup
categories: soups, poultry, casseroles
yield: 6 servings

6 chicken breast halves
— boned, skinned
4 tb butter
1/2 c onion, chopped
3 garlic cloves, minced
1 c water, hot
1 ts salt
1 ts pepper, black
6 oz can chopped green chiles
1 1/2 c milk
1/4 c onion, minced
1/4 c flour
1 c chicken broth
2 tb cornstarch
1 c milk
1/2 c cornstarch
1 c cheddar cheese, grated

preheat oven to 350 degrees. combine flour, salt and pepper. heat
oil in skillet over medium heat. add chicken and cook until no longer
pink and browned. remove from pan and set aside. add onion to skillet
and cook until tender. add tomato sauce, chicken stock, chili sauce,
chili powder, and cumin. bring to a boil and simmer 10 minutes or
until chicken is tender. remove chicken from cooking liquid. add
chicken and stir to coat well. serve over rice.

I love how often it adds “serve over rice” to the end of recipes. I thought this was a soup! That aside, this does sound like a decent soup, so I can’t complain here.

title: ham steaks with rice
categories: pork, meats
yield: 6 servings

1 lb ground lamb
1 clove garlic, minced
3 tb olive oil
1/2 c chopped onion
1/2 c chopped celery
1 cn cream of mushroom soup
1 cn tomato sauce
1 cn tomato paste
1 tb cumin
6 oz sour cream
2 c sour cream
1 cn 10 oz. pasta shells
1 onion — sliced
1 cn tomato sauce
1 cn tomato sauce
1 cn tomato sauce — 14 oz
can use 1 can kidney beans
1 cn cream of chicken soup
2 c milk
1 ts salt

cook bacon until crisp. remove from pan and drain off fat. stir in
chili powder and cayenne pepper. add tomatoes and salt. cook until
meat is tender. add butter, salt, pepper and sugar. add salt and
pepper to taste. add tomato paste and simmer for 30 minutes. add meat
and cook about 15 minutes longer. stir occasionally. add water if
needed to thicken. serve over hot noodles.

Wow, those ingredients… It really likes using soups as ingredients. Also, the idea of adding water to thicken: it clearly doesn’t understand something there.

title: cheese tortellini sandwiches
categories: sandwiches, meats, cheese/eggs, sandwiches
yield: 4 servings

1 12″ sandwich buns (about 1
-pound)
1 c mayonnaise or salad dressing
1 tb milk
1 tb prepared mustard
1/2 ts salt
fresh ground black pepper
corn tortillas (optional)

mix the cheese and seasonings together in a bowl. combine the cheese
with the cheddar cheese and stir into the cheese. place in a lightly
greased 9-inch square baking dish. sprinkle with the cheese, and
sprinkle with cheese. bake for 20 minutes. remove from oven and allow
to stand for 10 minutes before serving. serves 6 to 8.

Oh jeez, another recipe with a full cup of mayo. Why do you need a full cup of mayo (or salad dressing) in a  tortellini sandwich?

title: baked salmon with shallots
categories: fish
yield: 4 servings

2 tb butter
1 tb white wine vinegar
1 ts salt
1 ds pepper
4 shrimp (2 1/2 lb ea.)
-(1 lb); peeled, and
– cut into 1/2-inch cubes
1 c chicken broth
2 tb capers; drained
1 tb fresh parsley; chopped

remove skin from chicken breasts and discard. remove skin and bones
from chicken; set aside. in a large skillet, heat oil over medium heat.
add chicken and cook, turning the chicken for about 10 minutes or
until cooked through and cooked through. set aside.

in a small bowl, whisk together the remaining ingredients. pour over
the chicken. serve immediately.

Wow, you could cook this with only fairly minimal changes. But also, why are the shrimp cubed? Why is that necessary?

Okay, I could write about these all day. Let’s look at some weird edge cases. What happens when I turn the temperature all the way down? (to 0.1, the lowest value it will let me use)

title: baked stuffed carrots
categories: vegetables
yield: 4 servings

1 lb fresh spinach
1 tb olive oil
1 tb lemon juice
1 ts salt
1/2 ts pepper
1 tb olive oil
1 onion, chopped
1 clove garlic, minced
1 tb fresh parsley, chopped
1 tb fresh parsley, chopped
1 tb fresh parsley, chopped

in a large saucepan, combine the carrots, onion, celery, carrots,
celery, carrots, celery, carrots, celery, parsley, bay leaf, pepper,
and salt. bring to a boil, reduce the heat and simmer, covered, for
15 minutes. strain the stock through a fine sieve into a bowl. add
the chicken stock and the remaining ingredients and toss to coat.
serve immediately.

title: chicken a la parmesan
categories: poultry
yield: 4 servings

1 chicken, cut up
1 c chicken broth
1 c chicken broth
1 c chicken broth
1 tb cornstarch
1 tb soy sauce
1 tb cornstarch
1 tb water
1 tb cornstarch
1 tb water

cut chicken into small pieces. combine chicken broth, soy sauce,
vinegar, sugar, salt and pepper in a small bowl. add chicken and
cover with plastic wrap. microwave on high for 10 minutes. remove
chicken from broth and set aside. combine cornstarch and water in a
small bowl. add to skillet. cook and stir until thickened. serve over
rice.

[after this it just starts repeating “1 ts cayenne pepper” over and over]

I, uh, wow. That first one sounds really good, and the second is a weird way to cook a chicken (which involves a sauce that is just cornstarch and water). I think it would need some modification to be edible (e.g. actually cook the chicken), but it seems totally reasonable otherwise.

What happens if I let the network overfit a bit? To do this, I sample not from the best checkpoint, but from the most recent one. Typically the validation loss drops very rapidly, then declines much more gradually, and eventually it bottoms out or starts going up; this means the network has overfit to the data and is reproducing the inputs instead of generalizing to new data.

title: cornmeal bread pudding
categories: desserts, puddings
yield: 1 servings

1 c popcorn, unsalted
3 c sugar
1 c milk
2 eggs
1 ts vanilla
2 c coconut, shredded
2 c cottage cheese
1 tb orange rind, grated
2 c coconut
1 c chocolate chips

preheat oven to 350 degrees. grease a 13x9x2-inch baking pan. in a
medium bowl, combine flour, baking soda, salt, cinnamon and nutmeg.
stir in buttermilk and eggs. blend well. add chocolate chips, stir
until well blended. spread batter evenly into prepared pan. bake 15
to 20 minutes or until a wooden pick inserted in center comes out
clean. cool in pan on wire rack. cut in squares.

Yeah, that seems reasonable. Too reasonable. Fortunately, the ingredients are great. Why does it have popcorn? I guess that would have to be a topping or something? All said, this one could be pretty good.

title: chocolate fudge bars
categories: candies, chocolate
yield: 1 servings

1 c sugar
1 c butter or margarine
2 c confectioners’ sugar
1 ts vanilla
2 c milk
1 c chopped walnuts

cream butter and sugar until light and fluffy. blend in egg and
vanilla. sift flour with baking powder and salt and add to creamed
mixture. mix in another 1/2 cup of chocolate chips. pour into greased
and floured 9″ round cake pan. bake at 350 f for 50-55 minutes. cool
in pan and cut into squares.

Ok, there’s no way that’s not a real recipe with a few modifications. That’s too good. There were a couple like this, so I’ll skip over them for now.

title: fish fillets
categories: fish
yield: 4 servings

-waldine van geffen vghc42a
1 lb fish fillets; cut into 1/2″
-cubes
1 c walnuts; chopped
1 c peanut oil
1 c water
1 clove garlic; minced
1 tb butter
1 tb flour
2 tb sugar
1 tb cornstarch
1 tb water
1 pn salt
1 ts worcestershire sauce
1 ts cornstarch

preheat oven to 375f. lightly butter a 13 x 9 x 2 inch baking
dish. sprinkle the shrimp with the salt and pepper. arrange the
artichokes in a single layer on the carrots. add the onion and garlic,
and sprinkle with the garlic powder. place the chicken breasts on top
and bake for 15 minutes or until they are soft and crunchy.

place the chicken in a serving bowl and top with the basil sprigs.
sprinkle with parsley and serve immediately.

serves 4.

I love this one because of “soft and crunchy.”

And then we get this monstrosity:

title: frosty chocolate cheesecake (lacto)
categories: chocolate, cakes, desserts, cheesecakes
yield: 16 servings

—————————crust——————————–
1 c sugar
1 pk strawberry pudding mix
1 c cold water
2 tb margarine; melted
1 ts vanilla extract

————————–filling——————————-
1 c sugar
2 tb cornstarch
1 ts cream of tartar
1/4 c cocoa
1/2 c water
1 ts vanilla

—————————glaze——————————–
1 c sugar
1 ts cornstarch
1 ts cornstarch
1 ts lemon juice

combine the chocolate chips, sugar, corn syrup and salt in a heavy
saucepan over medium-high heat. cook over moderate heat, stirring
constantly, until the sugar is dissolved. remove from the heat and
stir in the butter until dissolved. stir in the vanilla and coconut.
spoon into a 9-inch springform pan. using a rolling pin, score the
cake layers in the pan. bake the cake in the middle of a preheated
350f oven for 50 minutes, or until a toothpick inserted in the center
comes out clean. cool on a wire rack for 10 minutes, then remove the
cake from the pan and cool completely.

in a small saucepan, combine the chocolate and water. cook, stirring
constantly until the chocolate is melted. remove from the heat and
stir in the coconut. spread the chocolate mixture over the cream
cheese mixture, and spread the remaining cream over the cake. top
with the remaining chocolate truffle mixture. refrigerate until
chilled.

to serve, cut into squares and serve with a sprinkle of confectioners
sugar.

nutritional information: calories 163 (39% from fat); protein 2.4g;
fat 3.1g (sat 3.4g, mono 0.1g, poly 0.9g); carb 12.5g; fiber 0.5g; chol
78mg; iron 1.6mg; sodium 392mg; calc 16mg

This monstrosity of a recipe. It might even make a cake. But those ingredients will definitely not make something I’d want to eat today.

The fun part about the overfit network is that if I turn the temperature up too high it gets weird.

title: jack’s hot chicken salad
categories: salads, poultry
yield: 12 servings

3 (350 ml) dried hot chiles
4 skinless chicken breasts
(melted — about 1 pint)
1/2 to 2et cooking bagels
8 thick pastry sheets
1 lg onion
3 garlic cloves,large
1 ts garlic powder
1 c water

directions: place potatoes in a heavy pot over ham heat. add the
s&p. and bring to a boil over high heat, continuing to toss the last 2
minutes of cooking.

put the clam juice, wine and vinegar into a medium saucepan and
add the rice. bring it as the grain cooks. bring to its boil over
medium heat, then pierce it off with a knife; over low heat, simmer
for 15 min. until the flavors have blended. strain the fruits and
reserve the liquid.

cut the cauliflower into bite sized pieces. wash these and peel
them.

after the couscous has cooled, the next day, rinse under cold water
and place in a dipping bowl.

meanwhile, rinse the chicken with a mixture of warm water and 1/4 tsp
salt.

drying liquid 1: sprinkle the breadcrumbs evenly over the skin, each
one. lightly brown the spareribs in it in a little oil in a roasting
pan and add remaining ingredients. cover and cook over low fire for 1
minute per side.

flay the squid together.

Wow. I think I’ll just let that one stand on its own.

title: irregular beef
categories: main dish, beef
yield: 6 servings

1 lb ground beef, lean
1 md onion
1 garlic clove, minced
1 c water
1 large can tomatoes, chopped
1 cn salt cod use water pasta
-(no salt and tabasco)

in a large saucepan place 2 or 3 cups beef bouillon cubes; set aside.
cook sausage over medium heat until tender. remove, and drain pieces;
place in a greased 9-inch baking dish. sprinkle margarine on bottom
and sides. repeat two more or more. cover and bake at 350 degrees for
5 minutes. combine mayonnaise, mustard, horseradish, pepper, curry
powder and salt, using the metal blade. on a baking sheet, place a mixt of
the eggs, salt and pepper, and the beaten egg abert to the meat
mixture. fill each scallop mixture with the egg mixture and then top
with 1 t of parmesan cheese. bake at 500 f for 45 mins. or until
beginning to bubble.

Ok, I think we can ignore the recipe and just go from the instructions here. This will definitely make some irregular beef.

Just to compare, here’s the non-overfit network at a high-temperature:

title: home-made maple pineapple jam
categories: relishes
yield: 1 batch

1/2 lb plain flour
salt
1 tb peanut oil
1 c sliced fresh cooking apples
1 ts instant coffee
2 oz self-raising flour

divide egg whites equally among oiled serving dishes. place your
finger and chopped mint in a hot water bath and refrigerate overnight.
when it comes out clean, toss it over in a 900′. in order to melt the
caramel thermometer, pour in the banana sauce. serve in ice cream
refrigerator. the recipe was doubled…yeast! place the kebab in a deep
1 quart or tiled container and let stand at room temperature. when
firm it is done at its way folks, but not enough to within another.
pour into container and freeze up to one month.

Mmm, maple pineapple jam sounds good. But the instructions… You have to refrigerate your finger overnight and melt a thermometer.

Later on we get cool stuff like:

cream butter and sugar, heat to medium. add banana, eggs, vanilla,
cinnamon, nutmeg and lemon rind; beat at low speed until fluffy,
whirl in dry ingredients, no longer, add cream cheese and beat until
smooth. stir in currants. drop by rounded tablespoonfuls, 2″ apart,
3 inches apart. be careful not to knock some of the rest of the
cookies.

The beginnings of a tasty sounding banana cookie.

title: freezer mix jelly
categories: relishes
yield: 1 servings

1 1/2 c instant mashed potatoes
1 pk unflavored gelatin
3 c cranberry juice
1 pk cream cheese (7 oz)
1 1/2 c mayonnaise
10 oz feta cheese

blend sugar and vermicelli til smooth. place in ice-water bath; mix well
and pour over salad.

A… thing.

steam the oranges for 4 minutes. into a blender, combine the flour,
salt, and corn meal. process for 30 seconds. add the butter and
margarine, and pulse for about 5 seconds. add the pureed apples
and the margarine. process on low the bowl until the mixture is
combined evenly. add the remaining 1/2 cup mashed bananas.

Some kind of… fruit cake?

In summary, I’ve gotten really astounding results and the number of actually somewhat cookable recipes has gone up immensely (I think?). I’m definitely putting a lot more thought into the idea of cooking some of these and making an RNN cookbook.


CycleGAN and Chef-RNN update

There has been a development in image generation that I find absolutely fascinating. It’s based on Generative Adversarial Networks, which are a very powerful and promising model for image generation. A recent modification to this architecture, the Conditional GAN, has been making some waves, as it allows for the translation of one type of image to another, but it has the drawback of requiring databases of before and after images with roughly 1:1 correspondence, which is difficult to find and dramatically limits the applications. However, very recently, the folks at UC Berkeley made a few additional modifications which remove this requirement, creating the CycleGAN architecture. I’ve downloaded the source and have been using the Flickr API to do some experiments with it, including trees <=> flowers, summer <=> fall, summer <=> winter, and landscape <=> desert. Legal disclaimer: I don’t own the rights to any of the images here. Unfortunately I neglected to save the photographers’ info when I scraped the images, but if anyone knows the creator of any of the original images, please let me know. Here are a few example images I’ve gotten so far:

Trees <=> flowers doesn’t work very well, which isn’t too surprising, but what it does is sometimes pretty entertaining. It found pretty early on that it can do decently by just inverting the colors, but eventually the behavior got more complex and it started making gross brown flowers:

summer<=>fall works just… absurdly well. It’s a bit scary, and some of the results are really pretty. With some parameter tuning and more (and better sanitized) data, this could be really cool! I’m definitely going to do some more experimentation with this.


summer<=>winter also works, though I couldn’t get it looking as good as the authors of the paper did. These examples are a bit cherry-picked, though– it never really learned how to fully get rid of snow, but it’s really good at color balance adjustments that make it feel way more wintery/summery.


The “desertifier” was largely unsuccessful. It never learned how to make things into sand like I’d hoped, but I didn’t train it for nearly as long as the others, and the success cases give me hope that it could learn:


 

Essentially what I’ve found is that the network doesn’t like to totally get rid of anything or hallucinate new things, even when it would make sense to do so. For example, if it gets rid of some water to make a desert, it might not be able to put it back– because of how the CycleGAN works, it needs to be able to reconstruct the original image. What it is really good at is changing colors and patterns; it’s less good at structural changes. I’d bet that you might be able to improve this behavior with skip connections between the initial transformation and the reconstruction pass. This would be similar to the “u-net” encoder-decoder architecture described in the paper, except the connections between the encoder and the decoder would also connect the first step of the titular cycle to the second. It might defeat the purpose a bit, but as long as there’s still an information bottleneck it might help.

Robo-Chef strikes again (barbecue sauce edition)

Finally, I discovered that I never did any experimentation with the sequence length I used to train my robo-chef, which puts a hard limit on how long it can remember things. Here are some recipes from a network trained with a much longer memory (3.5x longer).

Low Temperature (0.3)

title: corn soup
categories: soups
yield: 1 servings

1 c buttermilk
1 c chicken stock
1/2 c butter
1 c sour cream
1 ts salt
1/8 ts pepper
1 ts paprika
1 c flour
1 ts baking powder
1 ts sugar
1/2 ts baking soda
1/2 ts cayenne pepper
1 tb butter
1 c cheddar cheese, shredded

combine cornmeal, flour, salt and pepper. stir in buttermilk and
corn flakes. stir in cheese and cheese. bake in a greased 9-inch
square pan in a 350 f oven for 20 minutes. serve with sauce. source:
pennsylvania dutch cook book – fine old recipes, culinary arts press,
1936.

That ain’t a soup! But look, it remembered the buttermilk! If you follow the instructions, you might end up with… some kind of cheesy corn-flake pastry? Weird.

title: cranberry cream cheese cake
categories: cakes, cheesecakes
yield: 1 cake

1 pk cream cheese, softened
1 c sugar
2 eggs
1 ts vanilla
1 c flour
1 ts baking powder
1/2 ts salt
1 c chopped walnuts
1 c chopped nuts
1 c chopped nuts

heat oven to 350 degrees. combine crumbs, sugar and cinnamon in a
small bowl. add egg and mix well. spread over crust. bake at 350
degrees for 10 minutes. cool. cut into squares. makes 12 servings.

source: canadian living magazine, apr 95 presented in article by diana
rosenberg

That’s a lot of nuts! Also it forgot the cranberries. Once again, the instructions seem to be totally independent of the ingredients, but would make… some kind of cinnamon crumb pie? That actually sounds kind of delicious. I bet you could probably make something really tasty out of this one.

title: bran muffins (cookbook style)
categories: breads, fruits
yield: 12 servings

1 c flour
1 ts salt
1 ts baking powder
1/2 ts salt
1 c sugar
1 c milk
1 c milk
1 egg
1/2 c milk
1/2 c buttermilk
1 ts vanilla
1 c chopped pecans

combine flour, baking powder, salt and sugar. cut in shortening with
pastry blender until mixture resembles coarse meal. add egg and mix to
blend. stir in buttermilk and egg mixture. stir in raisins and nuts.
pour into greased 9-inch square baking pan. bake at 350 degrees for
30 minutes. cool in pan on rack 10 minutes. remove from pans and cool
completely. store in airtight containers. makes 12 servings. source:
pennsylvania dutch cook book – fine old recipes, culinary arts press,
1936.

Wow! Not a bran muffin but this sounds… really coherent. This is one of the few times I think the instructions might result in a somewhat reasonable food.

BUT WAIT, are you ready for BARBECUE SAUCE?

title: crunchy barbecue sauce
categories: bbq sauces
yield: 1 servings

1/2 c brown sugar
1/4 c worcestershire sauce
2 tb brown sugar
1 tb cornstarch
1 tb water
1 ts cornstarch
1/2 ts salt
1 tb cornstarch
1 tb water
1 tb soy sauce
1 ts sesame oil
1 tb cornstarch
1 ts sugar
1 ts soy sauce
1 ts sesame oil

combine all ingredients except salt and pepper in a large saucepan.
bring to a boil, reduce heat and simmer for 1 hour. add cornstarch
and cook until thickened. stir in chicken and cook until thickened.
serve over rice.

And here we go. Crunchy Barbecue Sauce. That full 1/4 cup of Worcestershire sauce. The tons and tons of cornstarch. Oh. Man. What is going on? Weirdly enough, the instructions seem spot-on (except the last two sentences get a bit weird). Here’s a condensed list of the ingredients, just to see if it makes sense:

2/3 c brown sugar
1/4 c worcestershire sauce
4 tb cornstarch
2 tb water
1/2 ts salt
2 tb soy sauce
2 ts sesame oil
1 ts sugar

The amounts might need adjusting, but at least it has the right flavors going on. I guess the amount of sugar is why it’s crunchy. But wait! There’s more! Barbecue sauces #2, #1, and #2 [sic]:

title: barbecue sauce #2
categories: sauces
yield: 1 servings

1 c chicken broth
1 c chicken broth
1 tb cornstarch
1 tb soy sauce
1 ts sugar
1 ts sesame oil
1 ts sesame oil
1 ts sesame oil
1 ts sesame oil
1 ts sugar
1/4 ts salt
1 ts sesame oil
1 ts sesame oil
1 tb sesame oil
1 tb cornstarch
1 ts sesame oil
1 ts sesame oil
1 ts sesame oil
1 ts cornstarch
1 ts sesame oil
1 ts sesame oil
1 ts sesame oil
1 ts sugar
1 ts sesame oil
1 ts sesame oil
1 ts sesame oil
1 tb cornstarch
1 tb sesame oil
1 tb water
1 tb sesame oil
1 tb chinese chili paste

cut the chicken into small pieces. heat the oil in a large skillet over
medium heat. add the chicken and cook, turning the chicken frequently,
until the chicken is cooked through, about 5 minutes. remove the chicken
to a plate. remove the chicken from the skillet and keep warm. add the
chicken to the pan and stir-fry for 1 minute. add the chicken and
stir-fry for 1 minute. add the chicken and cook, stirring, for 1
minute. add the chicken and continue to cook for another 2 minutes.
return the chicken to the pan. add the chicken and stir-fry for 1
minute. add the chicken broth and cook for 1 minute. add the chicken
and stir-fry for 2 minutes, then add the sesame oil and stir until
combined. add the chicken and stir-fry for 1 minute. add the chicken
broth and cook, stirring constantly, until the sauce thickens. stir in
the chicken broth and cook for another 2 minutes. stir in the cornstarch
mixture and cook for 1 minute. stir in the chicken broth and cook,
stirring, until the sauce thickens. serve immediately.

serves 4.

from the files of al rice, north pole alaska. feb 1994

Did you remember to cook the chicken? How about the sesame oil?  Ok, good. What about the chicken? Also this recipe is from “north pole, Alaska.”

title: barbecue sauce #1
categories: sauces
yield: 1 servings

1 c water
2 tb soy sauce
1 tb brown sugar
1 tb cornstarch
1 tb soy sauce
1 tb sugar
1 tb cornstarch
1 tb water
1 tb cornstarch
1 tb water
1 tb cornstarch
1 tb water

combine all ingredients in a large saucepan. bring to a boil, reduce heat
and simmer, uncovered, for 1 hour, stirring occasionally. stir in
cornstarch mixture and cook 3 minutes more. stir in cornstarch mixture
until smooth. add salt and pepper to taste. serve over rice.

Yeah, ok, that one makes more sense. Actually that seems to be a legit barbecue sauce. Cool! Unfortunately it’s also the most boring barbecue sauce ever because there are only actually four ingredients and one of them is water.

title: barbecue sauce #2
categories: sauces
yield: 1 servings

1 c brown sugar
1 c water
2 tb soy sauce
1 tb sugar
1 ts salt
1 ts salt
1 ts chili powder
1 ts ground cinnamon
1 ts ground coriander
1 ts ground cardamom
1 ts ground cumin
1 ts ground coriander
1 ts ground ginger
1 ts ground coriander
1 ts ground cardamom
1 ts ground cardamom
1 ts ground coriander
1 ts ground cinnamon
1/4 ts ground cloves
1/4 ts ground cloves
1/4 ts ground cardamom
1/4 ts ground cloves
1/4 ts ground cloves
1/2 ts ground cumin
1/4 ts ground cloves
1/2 ts ground coriander
1/2 ts ground cloves
1/2 ts ground cloves
1/2 ts ground coriander
1/2 ts ground cardamom
1/4 ts ground cloves
1/4 ts ground cumin
1/4 ts ground coriander
1/4 ts ground cumin
1/2 ts ground cardamom
1/2 ts ground cardamom
1/4 ts ground cloves
1/4 ts ground cardamom
1/4 c chopped fresh parsley

combine all ingredients in a large saucepan. bring to a boil, reduce
heat and simmer for 15 minutes. add chicken and cook 10 minutes more.
serve over rice or noodles.

recipe by : recipes from the cooking of indian dishes by cathy star
(c) 1994. typed for you by karen mintzias

Oh jeez what happened. This one starts off well and then SPICES. At least it’s more interesting than the last one!

Okay let’s turn up the heat and see what happens.

Reasonable temperature (0.55)

title: barbecue sauce for steaks
categories: sauces, marinades, low-cal
yield: 1 servings

1 c burgundy wine
1 tb soy sauce
1 c water
1 tb vegetable oil
1 garlic clove, crushed
1 md onion, chopped
1 garlic clove, minced
4 chicken breasts, boned,
-skinned, cut into 1/2-inch
-strips
1 tb cornstarch
1 tb water
1 tb cornstarch paste
1 tb cold water

mein sauce: combine the marinade ingredients and set aside.

in a large pot, bring the water to a boil. add the garlic and stir
for 2 minutes. add the spices and reduce the heat. simmer, covered,
for 10 minutes. remove the pan from the heat and store in an
airtight container.

makes about 1 cup.

Hahaha, another barbecue sauce! Nein, mein sauce!  Too bad the instructions are basically boiled garlic. At this point we’re done with the barbecue sauce (alas).

title: grilled portabello mushrooms with cashews
categories: mexican, vegetables, indian
yield: 4 servings

1 tb mustard seeds
1 tb salt
1 tb ground ginger
2 tb peppercorns
1 tb ground fenugreek
1 tb ground cardamom
1 tb ground cardamom
1/4 ts ground cardamom
1 ts ground cardamom
1 ts ground cumin seeds
1 tb chili flakes
1 ts chili powder
1 ts ground tumeric
1 ts ground cumin

mix all the ingredients together and serve cold.

Oh god this one made me laugh so hard. It’s so beautifully simple! I hope you like spices, because that’s what’s for dinner. Anyone know if this would make any sense at all as a spice mix?

title: hot-pepper caribbean black bean sauce
categories: sauces, vegetables
yield: 1 servings

1 c chicken broth
1 c cooked rice
2 tb soy sauce
1 tb sugar
2 tb soy sauce
1 tb sesame oil
1 c chicken broth
1 ts sugar
1 ts salt
1/4 c soy sauce
2 tb cornstarch mixed with 1/4
-cup water, as needed

serves 4.

place the rice in a saucepan and bring to a boil. add the chicken broth
and cook under medium heat for 5 minutes. remove the chicken from
the pot. add the chicken and cook for another 10 minutes. remove
the chicken from the pan and set aside.

add the chicken and the remaining ingredients and simmer 15 minutes.
skim off the excess fat and return the chicken to the pot.

serve the chicken and sauce with the chicken and a salad.

So wait, what am I supposed to do with the chicken again?

Ok, now let’s really get cookin’!

High Temperature (0.8)

title: junk-joint salet burger
categories: pasta, pork
yield: 6 servings

stephen ceideburg
1 c spam luncheon meat
1 bag fully cooked bacon
2 c sliced green onions
1 stalk fresh basil – chopped
1 clove garlic — sliced
1 c sauerkraut
1/3 c sugar
4 ts lemon-soy sauce
1 ts hot sauce
1/2 ts salt
1/2 c water

in large skillet over medium heat, heat oil over medium heat; cook
garlic until soft, but not browned, about 15 minutes. add remaining
ingredients except noodles; cook for 2 to 3 minutes or until thickened,
stirring after 5 minutes. stir in raisins and simmer for 1 minute.
serve over chicken.

source: taste of home mag, june 1996

It’s a what? I guess if you go to a Junk-Joint and order a Salet Burger this is what you get. Also those ingredients… substitute ground hamburger for spam and you might be able to make a decent, but weird, burger. What’s really neat is that the instructions remember that this is supposed to be pasta (which the ingredients conveniently forgot).

title: chicken & milk grits
categories: poultry
yield: 2 servings

1 whole chicken
– skinned, fat from fresh
– chicken breast
– cut into 1/2 inch strips
1/4 c low-fat cottage cheese
1 (10-oz.) can whole tomatoes
— drained, chilled and
– drained
1 tb olive oil
2 tb sour cream
salt and pepper
4 flour tortillas
cooked spaghetti
mushrooms
parsley
sauted mushrooms

combine flour, salt & pepper in a medium bowl. cut in margarine
until particles and can leave from tip. pat the mixture into a baking dish
and sprinkle with the cheese. bake, basting every 15 minutes, until
crust is golden brown, turning the cheese over after 35 minutes.
meanwhile, mix the egg and water in a small bowl. stir in the remaining
ingredients. pour grated cheese into the pie shell and bake for
20 minutes. remove from oven. sprinkle with roquefort on top.

What… what is this? The name is weird, the ingredients are… confusing (tortillas and spaghetti?), and the instructions are for… some weird cheese pie. I don’t really know what to make of this. I think I may need to call a chef to reconcile some of these recipes for me. That said, if you did manage to actually make this, it might not be half bad if you made some pretty liberal substitutions and improvisations.

But wait, are you ready to…

title: do the cookies
categories: cookies, breads
yield: 1 servings

1 c sugar
1 c shortening
1 tb baking powder
1 c buttermilk
1 ts soda
2 eggs
1 ts vanilla
1/2 ts almond extract
cinnamon

————————–filling——————————-
4 c confectioners sugar
2/3 c water
sprinkles
chocolate chips, karola or
crystallized chocolate
-red berries, for garnish

preheat oven to 350. mix cake mix, salt, flour and salt. beat egg
whites with salt until foamy. gradually add remaining 1/2 cup sugar,
beating until stiff. beat in vanilla extract and vanilla and fold into
batter. pour into remaining tins. bake in a 350 degree oven for 30
minutes until done. cool 1 hour before removing from pan. per
serving: 101 calories, 1 g protein, 12 g carbohydrate, 3 g fat, 3 g
carbohydrate, 0 mg cholesterol, 30 mg sodium.

note: if simple way to do not mix the frosting together with egg and
sugar and your dough will hold the mixture in the freezer.

DO THE COOKIES? Oh man. This is a full, internally consistent, somewhat logical cookie recipe. And, AND, it has nutrition facts. So you know it’s healthy! Wow. Do the cookies!

title: anglesey’s french prepared chicken wings
categories: italian, poultry
yield: 6 servings

1 chicken breast meat, cut into
-serving pcs.
1 c cooked rice
1 c vinegar
1/2 c vinegar
4 cloves garlic, minced
1 tb cider vinegar
1 tb dijon mustard
2 tb red wine vinegar
4 tomatoes, chopped
1 tb celery, fresh, snipped
1 red pepper, julienned,
-seeded and diced
1 tb parsley, chopped
1/4 c red wine vinegar
salt
freshly ground black pepper

thaw and drain chicken (roll up the sides of the chicken). trim and
cut the chicken into strips. combine the chicken with the pork mixture
with the salt, pepper and thyme. mix everything together gently and
add to the chicken mixture. cover and refrigerate for at least 4 hours
or overnight.

…and then what? Wait, do you serve this raw? It put SO MUCH WORK into those ingredients (look at all that vinegar) and then forgot to actually cook the meat (arguably the most important step).

title: grilled portobello mushrooms
categories: vegetables
yield: 4 servings

1 md garlic clove, crushed
1 sm onion, chopped
2 tb butter
8 oz fresh plantains, leaves,
-frozen
– thawed
1/4 c raisins
1 tb balsamic vinegar
1 tb worcestershire sauce
2 tb tamarind sauce
1/8 ts pepper
1 ts lemon juice
1/2 ts garlic, minced
salt to taste
freshly ground pepper

put first 4 ingredients in a bowl, mix well and stir into corn mixture.
in a 2-quart saucepan, heat the butter and 2 tablespoons of the
frankfurter seasoning and add the cooked rice and stir until the sauce
thickens and serves 4 to 6. makes about 2 1/2 cups

recipe by : cooking live show #cl8726

This one is actually so close. If only it actually included portobello mushrooms! Also, I want to emphasize: 8 ounces of fresh plantain leaves, frozen, then thawed. WHAT.

title: chocolate sunchol apple cake
categories: cakes, chocolate, vegetables
yield: 1 servings

1 c brown sugar packed
2/3 c butter or margarine
1/3 c cocoa
2 ts vanilla
1/2 ts almond extract
1 c chopped nuts
1/2 c coconut
3/4 c sour amount of cold water
1/2 c flour
1/2 ts baking soda
1/4 ts salt

in a large bowl, cream margarine. add sugar, flour, vanilla, and eggs.
mix thoroughly. pour into prepared pan. bake 45 minutes or until
oblong starts to pull away from sides of pan and a wooden pick inserted
into center comes out clean. cool in pan on wire rack for 5 minutes.
remove cake from pans to wire rack. remove from cookie sheet to wire
rack and cool.

Another legit pastry… thing! Also with some confusing oddities.

title: home-made potato salad
categories: salads, greek, vegetables, pork
yield: 6 servings

1 lg ripe avocado,cubed
1/2 ts lemon juice
1 c chicken broth
1/2 ts ground cumin
1/4 ts salt
1/8 ts pepper
1/4 ts cumin
1 c cooked rice
1/4 c water
1 c broccoli florets
3 c chicken broth
1/4 c parsley, finely chopped
2 tb green onions, finely chopped
pinch nutmeg

break up cooked peas. sauté garlic in oil until soft. stir in flour
and stir until smooth. combine all ingredients. cook and stir over medium
heat until sauce boils and thickens. cool
1/4 hour before serving.

That’s a weird potato salad. But you know what, it might be good! Dang, I’m getting hungry.

I also had some fun turning the temperature up REALLY high, and the recipes get… out there. Before the samples, a quick aside on what the temperature knob actually does.
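I haven’t reproduced my actual sampling code here, but the core idea is a one-liner: divide the network’s output logits by the temperature before the softmax. Here’s a minimal sketch assuming the model hands you raw logits for the next character; `sample_char` is just an illustrative name, not a function from my code.

```python
# Minimal sketch of temperature sampling for a char-rnn (illustrative,
# not my actual code). `logits` is the network's raw score for each
# possible next character.
import numpy as np

def sample_char(logits, temperature=1.0):
    scaled = np.asarray(logits, dtype=np.float64) / temperature
    scaled -= scaled.max()                     # numerical stability
    probs = np.exp(scaled) / np.exp(scaled).sum()
    # Low temperature sharpens the distribution (safe, repetitive text);
    # high temperature flattens it (weird, adventurous text).
    return np.random.choice(len(probs), p=probs)
```

At 0.8 the distribution is still pushed toward the model’s favorite characters, which is why those recipes mostly hold together; at 1.1 it flattens out, which is where the following masterpieces come from. Have a look: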

Crazy high temperature (1.1)

title: horey dipping sauce-(among-burlas)
categories: japanese, sauces, chutneys
yield: 1 cup

1 c cream and port:
1/8 c whole-wheat flour
2 tb canola; grated
32 oz tomate; sliced
3 tb crumbled honey
3/4 c onions; chopped fine
1/4 c bell pepper; chopped
2 ts accent
1 c chicken; cooked,chopped
1/4 ts poultry seasoning; if desired
1 1/2 ts peanut oil
4 green onions; peeled and
-thinly sliced

saute onion, green onions and celery just until onions begin to soften.
place for 1 minute add beans, water, chicken bouillon cube, green
onions, and cabbage. bring to a boil. boil uncovered quickly,
covered, for 20 minutes.

ladle into warm soup bowls. top with chicken and serve.

note: you can substitute cooked vegetables for thin strands of
rice. potatoes can go to be with really better some commercially back
in place of this. i also added this in backbone. wonderful!

Yeah ok, some kind of… honey peanut canola chicken thing. I can live with that. Let’s get weirder!

title: crispy ripe fetureurs
categories: chinese, game
yield: 8 servings

8 ea fresh artichokes; each 3″ died
2 ea garlic cloves; minced
1 ea onion; grated, or marinade
12 oz goonne, whole black bean; **
1 ea bay leaf
1/2 ts ginger; fresh, peeled,
-ground
1/2 ts salt
3 tb chili berry or vinaigrette
2 tb soy sauce
3 tb vinegar
1 tb dijon mustard
salt and freshly ground
-black pepper
1 lb peeled carrots
3 tb peanut oil
4 1/2 c water
1 1/2 lb chicken; quartere

chop all of this liquids into separate bowls. put 2 mayonnaise into a
large bowl. stir in the garbanzo beans until pureed. add the
pork and mix thoroughly. toss the spatula and toss well with the
first mixture. set aside for a few hours before coating.

remove the skin to a dinnworm enough to act a lasagle, starting in the
rosette. roast, uncovered, in a hot 350 f. oven for 15 minutes.
meanwhile, wash the lettuce, well, tuver peel the green palm. hold the
carrots very finely. but do not rinse them. after they are cooked
to the texture, place the sauce in another hot skillet largeroune,
and add enough hot water to cover it.

cover and simmer the soup until the rice is done, about 4-6 hours,
date to see dowel up. pour into hot sterilized jars to make sure your
amber liquid has reheated. chiln quickly if the barbeque side is
chilled and stored in a storage tin, loosely probably one day, watch
until chiles are soupy, thoughly barbecued, about 37 hours, or in the
refrigerator to marinate the meat or your beurre but may be made up to 2
days, covered.

cornstarch mixture: this sauce manie sirfully begin to should
be approximately 3 cups of cooking your toothpicks.

from black beans & the ultimated cookery.. conf: grasne ago
vyra bennett un, tradhlenl

Oh jeez, too weird, too weird. There’s so much going on in here. What is a dinnworm? For that matter, what’s a lasagle? The first paragraph is 100% gold. Also, this recipe takes a long time. First, you have to stir some beans until pureed. You have to puree beans with a spoon. Then you cook the… dinnworm… for 15 minutes, then simmer the soup for 4-6 hours, then watch the “chiles” until they are “soupy, thoughly barbecued,” which takes 37 hours or up to 2 days.

title: fried parmesan 5-mint
categories: appetizers
yield: 8 servings

4 fresh ham, thinly sliced as
-slices 1/2 inch thick
3 md fresh mushrooms, thinly
sliced
1/4 c toasted sliced green onions
– (finely)
2 lg black pepper, the
1 tb grated parmesan or sandwich
1/4 c lowfat yogurt
1/3 c sour cream
lime wedges
parsley sprigs

1. place remaining ingredients in each of a bl. plate, cover and microwave
on 300of until cheese melts (about 15 seconds). serve at once, with
salsa.

Somehow the network made an OK sounding chip dip. I love that the instructions are basically “throw everything in a bowl and microwave.”

In summary, holy cow! This is so much more coherent than my previous experiments, and with only one night of training! I definitely need to dig into this a bit more. If you’ve somehow made it to the end of this post and want MORE, I’ve found another blog that does similar things. Check it out!


Double Jeopardy

Because I was gone for most of the weekend (and because I didn’t want that awful jokebot to be the first thing people see of this blog), I retrained the Jeopardy network, with some amusing results. Take a gander:

THE SPORTING LIFE,$400,’The first of these in the U.S. was the first to control the state of Maryland’,a statue of Martin Luther King, Jr.

THE OLD WEST,$800,’This country is the only one of the world’s largest countries’,Chile

THE BIBLE,$400,’This composer of the 1999 film The Sound of Music was a star of the 1995 film The Sound of Music’,John Steinbeck

THE SOUTHERN DANDY,$400,’The name of this country is a synonym for a state of a country’,South Africa

John Steinbeck composed and starred in a 1999 remake of The Sound of Music, apparently. Also a statue of MLK Jr. took over Maryland, and Chile is bigger than I previously thought. The more you know! Increasing the temperature a bit:

A WORLD OF BEER,$400,’The first of these in the U.S. was a company in 1999′,a balloon

THE CARIBBEAN,$400,’The sea is the capital of this country’,Chile

THE FIFTH,$400,’The name of this body part is from the Latin for to strain into a string’,a contract

THE STORY OF O,$400,’This country is the second largest city in the world’,Martinique

BEAR SCREEN,$200,’This author of The Secret Garden was based on a 1989 film about a stripper who was a little boy’,James Bond

FICTIONAL DETECTIVES,$1000,’This 1954 film is set in the 1997 film seen here’,The Man Who Shot The Rainier

ART & ARTISTS,$1000,’This Southern country was a colony of the New York City Company in 1968 & is now a capital city’,Berkeley

The network continues to fail at geography. In addition to Chile apparently being the largest country in the world, its capital is now the sea. We also get some of James Bond’s sordid origins and a 1950s sci-fi detective film, The Man Who Shot The Rainier, which I kind of want to watch. Stepping up the temperature some more, we learn about American history:

THE CIVIL WAR,$600,’In 1990 the Confederacy allowed this country to the U.S. Constitution’,South Africa

THE 1980s,$400,’This son of a president was a senator from 1948 to 1972′,James Buchanan

THE CIVIL WAR,$800,’This secretary of the American Idol was buried in the first festival of the State Department’,Harry Truman

BOOKS OF THE ’60s,$400,’This Seattle children’s team was based on a 1986 movie based on a series of books’,Stevie Wonder

COLLEGES & UNIVERSITIES,$1000,’On April 1, 1998 this country became the first black president to control the U.S. Army’,Japan

BIBLICAL PEOPLE,$400,’This man who resigned as a lawyer in 1994 was the first female president of the Confederacy’,John Adams

THE 1980s,$300,’This man who died in 1978 was a president of the Senate from 1934 to 1990′,Benjamin Franklin

THE NEW YORK TIMES TECH BIZ,$300,’This country’s 1969 exploits were completed in 1936′,Australia

STATE CAPITALS,$1000,’This capital city was founded in 1939 by the El Capitan of New York City’,Columbus

Okay, I snuck some Australian time travel in there. I also got this absolute gem:

THE FIFTH,$400,’This president was the first president to serve as president’,John F. Kennedy

Let’s keep this going:

WORLD CAPITALS,$1000,’The company that contains the largest island in the U.S.’,Canada

THE END,$1000,’It’s the body of water in the Confederacy that shares its name with a former capital’,Barcelona

STATE CAPITALS,$200,’The name of this capital city is a 2-word name for a pope’,Beijing

THE ASTRONAUT HALL OF FAME,$200,’In 1969 he was called the last world championship to win the major series title’,Alexander Hamilton

FAMOUS COLLEGE DROPOUTS,$200,’In 1998 this president was a commander of the Confederate Army’,Adolf Hitler

THE OLD WEST,$200,’In 1991 this American became the first woman to be a consul on the Moon’,Britney Spears

THE SOUTH PACIFIC,$400,’It’s the only country that makes it to the Atlantic Ocean’,Australia

WOMEN AUTHORS,$400,’In 1987 this TV heroine was a spin for the No. 1 hit Heart of Darkness’,John Paul Jones

THE ELEMENTS,$400,’This compound is a major work of the subatomic particle that makes surreal & trick’,a sodict

Godwin’s law invoked! Also, it invented a subatomic particle. Canada is a US-based company, and Britney Spears is a consul on the moon. Things started to get a bit more dadaist from here:

MADE ON CHARACTER!,$2000,’An American author of The House of War, her first novel, The Man Who Loved Me Done, debuted in 1960′,Dennis Hopper

DO YOU BETTER A FACE!,$200,’This term for a condition is from the Latin for indeed’,a white broccoli

THE STATE OF CLASS,$200,’From 1935 to 1996, these U.S. planets abbreviated the Baltimore Order’,the California Signs

BIBLICAL WOMEN,$200,’In the 1996 film poem The Spy Who Does Will Ast Will Believe He in this play retrudes a bad baby back out with his own daughter’,The Sound of Music

YOUR 5-CLUE NEWSCAST,$2000,’This bridge is the southernmost point of the South Pole’,the River State

WOMEN BY THE NUMBERS,$800,’He was good man when he was more famous for his song’,Martin

MIRROW MOVIES BY CHARACTER,$2000,’Jin-Aak,<br />Calamity,<br />the Balthamar’,The Round Table

A IN SCIENCE,$400,’A specialty of this mammal is retracted with plastic pouch & are sacred at its surface to get a beautiful species of bird or brown’,a narwhal

And so on. I do like “jin-Aak, Calamity, the Balthamar.” That’s just a really cool set of titles. Also, The Man Who Loved Me Done sounds like a really solid bodice-ripper and The Spy Who Does Will Ast Will Believe He, the “film poem” sounds adorably artsy. Let’s keep this ball rolling, if only to see where it stops (or what it runs over):

WEBSTER’S 2005 TOP WORK,$1600,’The lady called the Village of Birmingham’,Ler Desser

AUTHORS’ RHYME TIME,$1000,’Steven Smith’s chiffons: ___ Impressionism’,Deslating

THAT’S SO LAW,$2000,’This kind of punch is a thin plant to nose in a camp or a certain man’,a smash

7-LETTER WORDS,$2000,’Wyatt Carlon founded Castro, Lincoln Battle Device & this shrimp group’,a night print

BE A FIREFIGHTING,$1200,’This brand of small color is also called members’,crhatobula

Okay, at this point, the questions are a bit silly, but the categories become excellent:

NOT A ROCKER

WHAT A COCKIN’?

EOGRAPHY

WE LOVE BROWN

WHAT HE WAS IN HOLE

1985: THE EVERYTHING WAR

LOOKER ONE OUT FOR KIDS

ANIMAL YOUNG ‘UNS

THE NEW YORK TIMES METAL

IT HAPPENED IN SPORTS

YOU’RE A BEACH I AM

And more. Just for fun, I took it up one more step:

OKLAHOMA!,$200,’Mark Twain debuted on Marvin Gabbary for this brand maker whose name includes his way to start’,Yellow Submarine

BROADWAY MUSICALS,$800,’Yes to Bag McKorw trades a book for this entertainment chase as a knight in Gilbert & Sullivan’s tokespaces’,Elle Fragg

LITERally ELEMENTS,$200,'(<a href=http://www.j-archive.com/media/2008-11-28_DJ_26.jpg target=_blank>Cheryl delivers the clue from the set of Halloween.</a>) Some people wourd appeal on Inuit inside the <a href=http://www.j-archive.com/media/2011-09-20_J_21a.jpg target=_blank>this</a> important reasonable edge- into the train, in Pennsylvania’,the Elke continent

B IN FASHION,$1000,’It’s the depth at <a href=http://www.j-archive.com/media/2005-12-02_DJ_04.jpg target=_blank>Edward J. Deimos’,Body is American Miss Vinnegas

ICK BIN APPLE,$1,700,’Together Bubble Down is the first earl’s only one’,the T.Orvertine

FAMILIAR PHRASES,$1600,’The straws of <a href=http://www.j-archive.com/media/2008-04-15_DJ_03.jpg target=_blank>this</a> lair found in the Smithsonian’,Herudge

20th CENTURY FASHION,$1600,'(<a href=http://www.j-archive.com/media/2009-09-12_J_15.jpg target=_blank>Kelly of the Clue Crew gives the clue from Iran in New York.</a>) Oliver stands for floppares for one of these; John Infords chose to fame for one title play’,154

GONE BUT NOT,$1600,'(<a href=http://www.j-archive.com/media/2005-01-28_DJ_12.jpg target=_blank>Alex reports from a cape at ’34.</a>) Along with a river on New York’s capital holiday, this capital of Luxor lacks cheese & vegetables planted by both Talmania & Haiti’,Trenton

DUSTIN COUNTRY HEIR STORIES,$2000,’Weird TV’s Whale’,Paddhe

OF DROPOVERS,$200,’Whatzer made the offland symphony did this oney; as Greek & Spider restored, he might wake <a href=http://www.j-archive.com/media/2005-03-18_DJ_19.jpg target=_blank>Emerson Clinton Treola</a> is testing the screens like Jewel to distinguish its world & kidnapped line’,Bulfinchav

To my surprise, a new behavior emerged: the network produces fully formed, syntactically correct hyperlinks to images stored on the Jeopardy archive website. This wasn’t present at any of the previous temperature levels, and the accuracy of these hyperlinks is somewhat astounding. I’m fairly sure none of the targets actually exist, though: I saved 100 KB of output as HTML and every link returned a 404, which was disappointing. In theory it might eventually produce a real link, but it would probably take a while.
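For anyone who wants to repeat the 404 check, it was roughly this kind of thing. This is a hedged sketch of the workflow rather than my actual script; the file name is hypothetical and the regex is an assumption tuned to the unquoted href= format in the samples above.

```python
# Rough sketch of the 404 check (assumed workflow, hypothetical file name):
# pull every href out of the saved sample HTML and request it.
import re
import urllib.error
import urllib.request

with open("jeopardy_sample.html") as f:
    html = f.read()

# The generated hrefs aren't quoted, e.g. href=http://... target=_blank
for url in sorted(set(re.findall(r"href=(http[^\s>]+)", html))):
    try:
        status = urllib.request.urlopen(url, timeout=10).getcode()
    except urllib.error.HTTPError as err:
        status = err.code                      # 404 for a missing image
    print(status, url)
```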

This makes me want to return to something I tried a while ago, namely training on random files from the Ubuntu source, but that’ll have to wait until after my current big project, which for now I can only describe as a cool computer vision thing.
