Unexpected results in ML & project updates

A recent paper caught my eye that I think is worth sharing with anyone interested in the themes of this blog:

The Surprising Creativity of Digital Evolution: A Collection of Anecdotes from the Evolutionary Computation and Artificial Life Research Communities

If you’re into reading papers, definitely take a look at this one; if not, Károly Zsolnai-Fehér of Two Minute Papers made an excellent video about it. This is one of the things that really inspires my love of machine learning, and it’s great to see awareness of this really interesting stuff on the rise.

One additional note: if you haven’t yet, check out some other blogs hosting generative humor, including AI Weirdness and Botnik Studios (not neural networks, but still hilarious). There’s also a (very new) subreddit for Botnik with similar predictive-text humor, so check that out.

Finally, since I’ve been getting a ton of new subscribers (welcome!), here’s an update on my current projects. The first, which I’ll write up much more substantially some time in the next month, is a GIMP plugin interface for Deep Dream. Here’s an example of the kind of thing you might see more of:

[Image: Tübingen with the “honeycomb” class deep-dreamed onto it]

I’m really excited about this project finally coming together. Once it’s set up, I may look into the feasibility of porting it to a Photoshop plugin and/or creating a plugin for Neural-Style, since these tools really need to be put into the hands of artists.

The second is just learning more about reinforcement learning using the OpenAI Gym and (eventually) the Unity ML-Agents framework. But look! It learns:

[Animated GIF: an asynchronous actor-critic RL model playing Breakout]

It’s not very smart yet, but it’s trying its best. (I’ve since switched to grayscale inputs for better performance.)

In the meantime, I’ve found a set of web scrapers that pull data from BoardGameGeek, so once I adapt them to pull natural-language descriptions instead of just ratings data, expect some AI-generated board games.

[last-minute edit]

I should also mention: I did run char-rnn on the database of Urban Dictionary results, but then belatedly realized that this website is on the resume I hand out to employers, so I decided to think a bit more about whether I want to dip into NSFW territory on this blog. I’ll keep you updated on that.


Sentiment Translator

Introduction

A very common problem in natural language processing is sentiment analysis: attempting to determine analytically whether a given input has an overall positive or negative tone. For example, a positive string might be “I really enjoyed this movie!” whereas a negative input might be “it was boring, vapid, and dull.”

My idea, essentially, is this: given recent advancements in machine translation, it should be possible to translate between sentiments (as opposed to between languages). The difficulty presented itself fairly immediately: there is no dataset available with a one-to-one mapping between negative and positive sentiments. Models for unpaired machine translation exist, but they’re still relatively unproven.

My first implementation was a fairly simple rule-based approach: try to remove or add inverters (the word “not,” for example) and replace words with their antonyms until the sentiment changes appropriately. This worked well for very simple cases, but just wasn’t smart enough to capture more complex relationships (for example, it loved to translate “good” to “evil,” even when “bad” would make a lot more sense). My new implementation takes a different approach, using (and abusing) a model loosely adapted from Daniele Grattarola’s Twitter Sentiment CNN.
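
For the curious, here’s a minimal sketch of what that rule-based idea looked like. This isn’t my exact code, and the antonym table is a tiny hypothetical stand-in for the real lookup:

```python
# Minimal sketch of the rule-based approach; the antonym table here is a
# hypothetical stand-in (mine was larger, and still made context-blind
# choices like "good" -> "evil").
ANTONYMS = {"good": "bad", "bad": "good", "boring": "exciting", "love": "hate"}

def flip_sentiment(words):
    # Rule 1: if there's an inverter, removing it flips the sentiment.
    if "not" in words:
        return [w for w in words if w != "not"]
    # Rule 2: otherwise, swap sentiment-bearing words for their antonyms.
    # (A real version would re-check the sentiment after each edit.)
    return [ANTONYMS.get(w, w) for w in words]

print(flip_sentiment("this movie was not boring".split()))
# ['this', 'movie', 'was', 'boring']  -- inverter removed
print(flip_sentiment("what a good movie".split()))
# ['what', 'a', 'bad', 'movie']       -- antonym swap
```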

The Data

I used the aclImdb dataset, a set of reviews scraped from the Internet Movie Database, split into four parts of ~12,500 reviews each: positive training, negative training, positive test, and negative test. Movie reviews work very well for this problem because they are essentially already annotated with sentiment in the form of the user’s star rating for the film.

In pre-processing, I split the reviews into sentences to reduce the length of each input and convert each review into word vectors (in my experiments, I used the pretrained GoogleNews-vectors-negative300 embeddings). Unfortunately, due to the size of the input when converted into 300-dimensional vectors, I frequently ran out of memory during training. To reduce this issue, I only load the million most common words from the GoogleNews vectors.
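
I can’t promise this matches my exact loading code, but with gensim the frequency-limited load looks something like this:

```python
from gensim.models import KeyedVectors

# Load only the 1M most frequent words from the pretrained GoogleNews
# embeddings; `limit` keeps memory use manageable (the full file
# contains roughly 3M words).
vectors = KeyedVectors.load_word2vec_format(
    "GoogleNews-vectors-negative300.bin", binary=True, limit=1_000_000
)
print(vectors["movie"].shape)  # (300,)
```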

The Model

The model is based on a set of one-dimensional convolutions over the series of word vectors, followed by a max pooling layer, ReLU, and a fully-connected layer. This is trained as a standard sentiment classifier, learning to predict the sentiment of a given input sentence.
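
Here’s a rough PyTorch sketch of that architecture; the filter size and count are illustrative, not the exact values I used:

```python
import torch
import torch.nn as nn

class SentimentCNN(nn.Module):
    """1-D convolutions over a stack of word vectors, ReLU, max pooling
    over time, then a fully-connected layer producing class logits."""
    def __init__(self, embed_dim=300, n_filters=128, kernel_size=3):
        super().__init__()
        self.conv = nn.Conv1d(embed_dim, n_filters, kernel_size)
        self.fc = nn.Linear(n_filters, 2)  # two classes: negative, positive

    def forward(self, x):             # x: (batch, embed_dim, seq_len)
        h = torch.relu(self.conv(x))  # (batch, n_filters, seq_len - k + 1)
        h = h.max(dim=2).values       # max pooling over time
        return self.fc(h)             # logits over the two sentiments
```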

At sampling time, however, we do something different. We run the input sentence through the classifier as normal, but we give it a different target sentiment (the opposite of the input’s actual sentiment). We then find the gradient of the classifier’s loss with respect to the input word vectors. This may be familiar to anyone who’s implemented Google’s Deep Dream algorithm or worked with adversarial images. In essence, this gives us the direction in which to perturb the input vectors to cause the largest change in the sentiment. Additionally, the magnitude of the gradient for a given word roughly corresponds to how much that word contributes to the sentiment (and therefore, how important it is to change).
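
In code, the sampling-time trick looks roughly like this. Here `model` is assumed to be a trained classifier like the sketch above, and `sentence_vectors` a single embedded sentence; both are hypothetical names standing in for the real thing:

```python
import torch
import torch.nn as nn

# Assumes `SentimentCNN` from the sketch above; pretend it's trained.
model = SentimentCNN()
model.eval()
sentence_vectors = torch.randn(1, 300, 20)  # stand-in for a real sentence

x = sentence_vectors.clone().requires_grad_(True)
target = torch.tensor([1])  # the *opposite* of the sentence's true sentiment

loss = nn.CrossEntropyLoss()(model(x), target)
loss.backward()  # model weights stay frozen; we only want x.grad

grad = x.grad                  # direction to perturb each word vector
importance = grad.norm(dim=1)  # per-word magnitude ~ contribution to sentiment
```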

Here we hit another problem. The space of possible words is discrete, but the word vector space is continuous (and very high-dimensional, and thus sparse). How can we be sure that these gradients are moving towards an actual word? To be honest, I’m not entirely sure. My first approach was to use multiple gradient steps, but this appeared to find minima in the sentiment that didn’t correspond to actual words in the input set. My second approach was to extend the gradient out as a ray from the original word and find the word vectors closest to this line. This worked a good deal better (specifically, it captures the “hate” <-> “love” relationship), but still isn’t perfect: we still need a heuristic to select which of the proposed word replacements to use, which in the end makes this method little better than the rule-based approach from my initial implementation.
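
Here’s a sketch of that ray search with numpy; the function and argument names are hypothetical, not my exact code:

```python
import numpy as np

def words_near_ray(origin, direction, vocab_vectors, vocab_words, k=5):
    """Return the k vocabulary words whose vectors lie closest to the ray
    origin + t * direction (t >= 0). `vocab_vectors` is a (V, 300) array,
    `vocab_words` a list of V strings. The original word sits at t = 0,
    so in practice you skip the first hit."""
    d = direction / np.linalg.norm(direction)
    diff = vocab_vectors - origin           # offsets from the original word
    t = np.clip(diff @ d, 0.0, None)        # projection length along the ray
    perp = diff - np.outer(t, d)            # component perpendicular to the ray
    dist = np.linalg.norm(perp, axis=1)     # each word's distance to the ray
    return [vocab_words[i] for i in np.argsort(dist)[:k]]
```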

Conclusion

The biggest realization I came to was that when mapping a discrete space to a continuous one, the meaning of the intermediate values is not always intuitive. This is what we see when we simply perform gradient descent on the word vectors: the sentiment converges very nicely, but the resulting vectors have barely changed from their original values. In computer vision, this same phenomenon produces an “adversarial image”: an image which fools the classifier into misclassifying it with very high confidence while remaining indistinguishable from the original to a human. Since we are hoping for some of the words to converge to different discrete values in the word space, however, this behavior is less than ideal here.

Additionally, one unanticipated disadvantage of the lack of paired data was the inability to quantitatively verify the accuracy of the translations: there was no ground truth to compare against.

Future Work

One thought I’ve had is to try something similar to CycleGAN, which performs “image translation” in an unpaired fashion through a combination of GAN loss and “reconstruction loss.” This still poses problems, though, as we cannot easily calculate gradients of the sentiment loss through the discretization into the word space.

It’s a tricky problem, but if anyone has any ideas, I’m interested.


Automated topic extraction with Latent Dirichlet Allocation

Last semester, I became aware of a really interesting NLP algorithm called LDA, short for “Latent Dirichlet Allocation.” The goal of the algorithm is to discover a set of “topics,” each of which represents a specific human-interpretable concept.

The core assumption of LDA is that documents are generated as follows:

For each document:
  1. Generate a distribution over topics based on the model parameters.
  2. For each word in the document:
    1. Sample a topic from the document’s topic distribution.
    2. Sample a word from that topic’s distribution over words.

The idea is that there is some latent variable (the topics) that informs word choice.
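
To make that generative story concrete, here’s a tiny numpy simulation of it; the topic count, vocabulary size, document length, and hyperparameters are all made-up toy values:

```python
import numpy as np

rng = np.random.default_rng(0)
n_topics, vocab_size, doc_len = 3, 1000, 50
alpha, beta = 0.1, 0.01  # toy Dirichlet hyperparameters

# Each topic is a distribution over the whole vocabulary, fixed per corpus.
topic_word = rng.dirichlet(np.full(vocab_size, beta), size=n_topics)

def generate_document():
    doc_topics = rng.dirichlet(np.full(n_topics, alpha))  # per-document step
    words = []
    for _ in range(doc_len):
        z = rng.choice(n_topics, p=doc_topics)        # sample a topic
        w = rng.choice(vocab_size, p=topic_word[z])   # sample a word from it
        words.append(w)
    return words

print(generate_document()[:10])  # ten word ids from one synthetic document
```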

Unfortunately, the document->topic and topic->word distributions are impossible to calculate exactly, but it is possible to approximate them using Gibbs sampling or variational inference: approximation techniques which eventually converge to a solution close to the true one (insofar as such a thing exists). These have the side effect of being very slow, so the algorithm is not exactly the most efficient. Compared to training a neural network, though, it’s not actually unreasonable.
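
I won’t claim this is exactly the tooling I used, but as a sketch, gensim’s LdaModel (which implements online variational Bayes) can be run like this on a toy corpus:

```python
from gensim import corpora
from gensim.models import LdaModel

# Toy corpus: in reality each document would be a full tokenized article.
docs = [["congo", "government", "independence", "lumumba"],
        ["run", "mantle", "home", "game", "record"]]

dictionary = corpora.Dictionary(docs)
corpus = [dictionary.doc2bow(doc) for doc in docs]

lda = LdaModel(corpus, num_topics=2, id2word=dictionary, passes=10)
for topic_id, words in lda.print_topics():
    print(topic_id, words)  # each topic as its highest-weight words
```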

Here are some results from running the algorithm on a dataset of news articles, where each line is a discrete topic. See if you can figure out what each topic is about:

command, field, one, marshal, last, general, boy, first, slim, perform

belgian, congo, government, independent, u.n., congolese, lumumba, nation, one, province

sept., oct., new, lake, 23, color, first, fall, river, season

run, two, mantle, one, home, game, mariners, record, hit, play

would, effect, generation, fallout, megaton, radiation, test, soviet, human, government

law, anti-trust, union, labor, act, price, manufacture, company, collect, bargain

And from a combination of science fiction and scientific papers:

brown, soil, detergent, wash, surface, john, house, daily, provide

casework, service, prevent, family, treatment, use, interview, care, time, help

company, questionnaire, list, manchester, store, busy, mail, plant, downtown

food, radiation, object, cost, meat, process, irradiate, product, visual, refrigerate

And finally, the SCP foundation. One thing to note is that I didn’t do as much data cleaning or parameter selection as I did with the previous datasets, so quality could be better. I’ll fine-tune the results later.

film, dust, character, neutrino, drug, pill, site-64t, supplemen, text, brand

photograph, photo, cognitohazard, photographed, effect, viewer, ascension, baxter, epsilon, ride

mr., kiryu, d-13321, head , dr., mad, corrupted, little, penny, rust

playback, inside, data, ritual, becam, went, object, orbit, contain, unit

tlaloc, station, turkey, none, d-69601:, dubcek, albright, materialized:, stop., deities

These are just a few examples, but you can see how easily interpretable the results are with basically no human intervention or annotation. I’m hoping to apply this to some other datasets in the near future to see what sort of results I get.


Making art with neural style and fractals

I recently attempted to see if I could create art with Neural Style using only photos I’ve taken and fractal flames I’ve created with Fractorium and Apophysis. I must say, I’m very happy with the results! I generated the outputs using the Neural Style GUI and fine-tuned them in Photoshop; the latter was necessary to mitigate the black splotches still present in some images.

[Image gallery: neural style outputs, including fractal_cathedral and fractal_town]

I’m planning to write up a post soon about a really interesting NLP algorithm, and I’ve been having fun recently training char-RNN on a database of Urban Dictionary entries, so stay tuned for that.


Steam Games

While I work on my next big project, I decided to generate some random Steam game names. All of these are games that don’t actually exist:

Happy Panic

Unraveled Land

Mad Sharkness

The Heart’s Medicine

formic innocence

Heroes Over Europe – Full Mojo

Lovely Ventures

The Gravesable Moon Visual Novel

Hotline Miami 3: Back under Begins

Redemption Cemetery: Secret Of The Angel

Nightman: Trigger Element Space

Hellfrosted

Princess Maker Expedition

Gorescripted Addiction: Possibility

Mars Indian

The Ember Sigil

Train Simulator: Eternal Gun

5-Bit Soundtrack

Best Force

Happy Fantasy

Jackal Journey

Signal Flyng

 

And the winner for the most probable steam game name is:

The Walking Dead: Inside The Walking Dead: “The Secret of the Magic Crystal”

and also:

Steam Dev Days: Steam Edition


Espresso is the marshmallow of coffee (fun with Word2Vec)

In exploring the recipe dataset, I decided to have some fun with Word2vec, an algorithm originally created at Google. For the layperson: it works by looking at the contexts in which a given word appears and learning a vector for each word, such that words appearing in similar contexts have similar vectors. On the recipe dataset, this means that, for example, the vectors for vodka and cognac are very close together, wheat and rye are very close, chocolate and butterscotch are very close together, and so on.
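
If you want to play along at home, gensim makes this easy; `recipe_token_lists` below is a hypothetical stand-in for my tokenized recipe corpus:

```python
from gensim.models import Word2Vec

# `recipe_token_lists` is a hypothetical stand-in: a list of tokenized
# recipes, e.g. [["combine", "flour", "sugar", ...], ...].
model = Word2Vec(sentences=recipe_token_lists,
                 vector_size=100, window=5, min_count=5)

print(model.wv.most_similar("vodka", topn=3))   # neighbors like "cognac"

# The analogies below come from calls like this: positive vectors are
# added and negative vectors subtracted, so this asks for pie - pizza + calzone.
print(model.wv.most_similar(positive=["pie", "calzone"],
                            negative=["pizza"], topn=1))
```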

What’s really neat about this, though, is that it enables us to do some very interesting things. One of the properties of the vectors created is the ability to perform vector arithmetic, adding and subtracting these semantic vectors to create word analogies. Here are a few examples: (read a – b + c = d as “b is to a as c is to d”)

pie – pizza + calzone = blintz

That makes sense! Never would have thought of that to be honest.

banana – plantain + apple = blueberry

I guess an apple is just a big blueberry. Who knew

candy – marshmallow + coffee = espresso

I guess that makes sense. Weird though.

 fish – tuna + chocolate = candy

Ok, tuna is a type of fish, chocolate is a type of candy. I guess I’ll let that one slide.

coffee – tea + lemon = orange

tea – coffee + lemon = lime

That’s interesting. It seems to think coffee is sweeter than tea.

coffee – knife + spoon = expresso

Interesting. In addition to the marshmallow of candy, it’s also the spoon of cutlery.

rasin – grape + fish = offal

Wow, ok, I guess it doesn’t like rasins.

brie – cheese + candy = meringues (closely followed by “fondant”)

Makes sense. Fancy, soft, light.

ribbon – bar + dome = tented

Let’s try the classic word2vec analogy:

king – man + woman = bruce

What. The next closest option is “retired.”

I’m going to continue experimenting with this. I’ve also been getting some really good results with the chef-rnn, so I’ll get back to you with more of that soonish as well.


Chicken soup (for the robot soul?)

So I did a little more training with the chef-rnn at a lower learning rate to fine-tune it a bit, and got some shockingly good results, with some weird quirks. I really just couldn’t resist posting more recipes; these are so much fun to read and try to imagine cooking/eating. This is the closest I’ve ever gotten to edible food, and I’m kind of tempted to try some of these at some point. As usual, I’m going to start with the good ones.

title: chicken & sausage pie
categories: pies
yield: 6 servings

2 chicken breasts, boneless
1 tb olive oil
1 garlic clove, crushed
1/4 c butter or margarine
1/4 c bread crumbs
1 c shredded mozzarella cheese
1 c shredded swiss or cheddar
-cheese
1 c shredded cheddar cheese

preheat oven to 350 degrees. in a small bowl, combine chicken,
cheese, parsley, mustard, salt and pepper. spread evenly over chicken
mixture. place a layer of chicken on top. bake uncovered at 350 for
20 minutes. sprinkle with parmesan cheese. bake uncovered for 10
minutes or until cheese melts and cheese is melted.

Wow… that actually sounds pretty good. Replace “chicken” with “sausage” at some point so it fits the name, and this could be a decent meat pie. A warning: this version of the network loves chicken.

title: chicken & vegetable stir fry
categories: main dish, poultry, cyberealm
yield: 6 servings

1 chicken
salt
pepper
soy sauce
olive oil
1 c mayonnaise
2 tb tomato paste
1 tb green onions, chopped
1 tb cumin
1 tb chili powder
1 ts worcestershire sauce
1/2 ts cumin
1/4 ts cinnamon
1/2 ts sugar
1/2 ts paprika
1/4 ts pepper

combine all ingredients in a saucepan and simmer until the vegetables
are tender. add chicken and simmer for another 5 minutes. add the
sauce and cook another 5 minutes. serve over rice.

each serving provides: 448 calories; 21 g protein; 12 g carbohydrates;
12 g fat; 30 mg cholesterol; 316 mg sodium; 31 mg calcium.

This needs more vegetables in my opinion, and I’m not a fan of the cup of mayonnaise, but ok, I can see this working.

title: chicken & vegetable soup (salsa de chili)
categories: poultry, soups/stews
yield: 6 servings

2 chicken breasts, boned and
-skinned
1/2 lb mushrooms, sliced
1 c onion, chopped
1 cn cream of mushroom soup
1 cn chicken broth
1 ts salt
1 tb chili powder
1/2 ts pepper
2 c chicken broth
2 c chicken stock
1 c parmesan cheese
3 chicken breasts, boned

1. place chicken with thyme, bay leaf, parsley, parsley, salt and
pepper in shallow baking pan. 2. heat, uncovered, in microwave oven
5 minutes or until chicken is tender. 3. add chicken, cook 10 minutes
or until chicken is tender. 3. remove chicken from pan and keep warm.
2. melt butter in same skillet over medium heat. add chicken
and saute over medium heat for 5 minutes or until lightly browned.
7. add beef stock to pan; cook, stirring often, until slightly
thickened, about 4 minutes. stir in chicken and orange juice. 5.
spoon sauce over chicken and vegetables; sprinkle with cheese.

And now it’s a soup! With orange juice! Also, it’s a multi-step recipe, but the step numbers are totally out of order (it goes 1 2 3 3 2 7 5). ‘Let’s try that again,’ says the network:

title: chicken & vegetable soup
categories: soups, poultry, casseroles
yield: 6 servings

6 chicken breast halves
— boned, skinned
4 tb butter
1/2 c onion, chopped
3 garlic cloves, minced
1 c water, hot
1 ts salt
1 ts pepper, black
6 oz can chopped green chiles
1 1/2 c milk
1/4 c onion, minced
1/4 c flour
1 c chicken broth
2 tb cornstarch
1 c milk
1/2 c cornstarch
1 c cheddar cheese, grated

preheat oven to 350 degrees. combine flour, salt and pepper. heat
oil in skillet over medium heat. add chicken and cook until no longer
pink and browned. remove from pan and set aside. add onion to skillet
and cook until tender. add tomato sauce, chicken stock, chili sauce,
chili powder, and cumin. bring to a boil and simmer 10 minutes or
until chicken is tender. remove chicken from cooking liquid. add
chicken and stir to coat well. serve over rice.

I love how often it adds “serve over rice” to the end of recipes. I thought this was a soup! That aside, this does sound like a decent soup, so I can’t complain here.

title: ham steaks with rice
categories: pork, meats
yield: 6 servings

1 lb ground lamb
1 clove garlic, minced
3 tb olive oil
1/2 c chopped onion
1/2 c chopped celery
1 cn cream of mushroom soup
1 cn tomato sauce
1 cn tomato paste
1 tb cumin
6 oz sour cream
2 c sour cream
1 cn 10 oz. pasta shells
1 onion — sliced
1 cn tomato sauce
1 cn tomato sauce
1 cn tomato sauce — 14 oz
can use 1 can kidney beans
1 cn cream of chicken soup
2 c milk
1 ts salt

cook bacon until crisp. remove from pan and drain off fat. stir in
chili powder and cayenne pepper. add tomatoes and salt. cook until
meat is tender. add butter, salt, pepper and sugar. add salt and
pepper to taste. add tomato paste and simmer for 30 minutes. add meat
and cook about 15 minutes longer. stir occasionally. add water if
needed to thicken. serve over hot noodles.

Wow, those ingredients… It really likes using canned soup as an ingredient. And the idea of adding water to thicken: it clearly doesn’t understand something there.

title: cheese tortellini sandwiches
categories: sandwiches, meats, cheese/eggs, sandwiches
yield: 4 servings

1 12″ sandwich buns (about 1
-pound)
1 c mayonnaise or salad dressing
1 tb milk
1 tb prepared mustard
1/2 ts salt
fresh ground black pepper
corn tortillas (optional)

mix the cheese and seasonings together in a bowl. combine the cheese
with the cheddar cheese and stir into the cheese. place in a lightly
greased 9-inch square baking dish. sprinkle with the cheese, and
sprinkle with cheese. bake for 20 minutes. remove from oven and allow
to stand for 10 minutes before serving. serves 6 to 8.

Oh jeez, another recipe with a full cup of mayo. Why do you need a full cup of mayo (or salad dressing) in a tortellini sandwich?

title: baked salmon with shallots
categories: fish
yield: 4 servings

2 tb butter
1 tb white wine vinegar
1 ts salt
1 ds pepper
4 shrimp (2 1/2 lb ea.)
-(1 lb); peeled, and
– cut into 1/2-inch cubes
1 c chicken broth
2 tb capers; drained
1 tb fresh parsley; chopped

remove skin from chicken breasts and discard. remove skin and bones
from chicken; set aside. in a large skillet, heat oil over medium heat.
add chicken and cook, turning the chicken for about 10 minutes or
until cooked through and cooked through. set aside.

in a small bowl, whisk together the remaining ingredients. pour over
the chicken. serve immediately.

Wow, you could cook this with only fairly minimal changes. But also, why are the shrimp cubed? Why is that necessary?

Okay, I could write about these all day. Let’s look at some weird edge cases. What happens when I turn the temperature all the way down? (to 0.1, the lowest value it will let me use)
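
(For anyone unfamiliar with the temperature knob: it rescales the network’s output distribution before sampling. A minimal numpy sketch of the idea, not char-rnn’s actual code:)

```python
import numpy as np

def sample_with_temperature(logits, temperature):
    """Sample a character index from the network's output logits.
    Low temperature (e.g. 0.1) sharpens the distribution toward the most
    likely character (hence the repetitive recipes below); high temperature
    flattens it toward uniform (hence the chaos later on)."""
    scaled = logits / temperature
    scaled -= scaled.max()          # subtract max for numerical stability
    probs = np.exp(scaled)
    probs /= probs.sum()
    return np.random.choice(len(probs), p=probs)
```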

title: baked stuffed carrots
categories: vegetables
yield: 4 servings

1 lb fresh spinach
1 tb olive oil
1 tb lemon juice
1 ts salt
1/2 ts pepper
1 tb olive oil
1 onion, chopped
1 clove garlic, minced
1 tb fresh parsley, chopped
1 tb fresh parsley, chopped
1 tb fresh parsley, chopped

in a large saucepan, combine the carrots, onion, celery, carrots,
celery, carrots, celery, carrots, celery, parsley, bay leaf, pepper,
and salt. bring to a boil, reduce the heat and simmer, covered, for
15 minutes. strain the stock through a fine sieve into a bowl. add
the chicken stock and the remaining ingredients and toss to coat.
serve immediately.

title: chicken a la parmesan
categories: poultry
yield: 4 servings

1 chicken, cut up
1 c chicken broth
1 c chicken broth
1 c chicken broth
1 tb cornstarch
1 tb soy sauce
1 tb cornstarch
1 tb water
1 tb cornstarch
1 tb water

cut chicken into small pieces. combine chicken broth, soy sauce,
vinegar, sugar, salt and pepper in a small bowl. add chicken and
cover with plastic wrap. microwave on high for 10 minutes. remove
chicken from broth and set aside. combine cornstarch and water in a
small bowl. add to skillet. cook and stir until thickened. serve over
rice.

[after this it just starts repeating “1 ts cayenne pepper” over and over]

I, uh, wow. That first one sounds really good, and the second is a weird way to cook a chicken (which involves a sauce that is just cornstarch and water). I think it would need some modification to be edible (e.g. actually cooking the chicken), but it seems totally reasonable otherwise.

What happens if I let the network overfit a bit? To do this, I sample not from the best checkpoint, but from the most recent one. Typically the validation loss drops very rapidly, then declines much more gradually, and eventually bottoms out or starts climbing back up; that last part means the network has overfit to the data and is reproducing the inputs instead of generalizing to new data.

title: cornmeal bread pudding
categories: desserts, puddings
yield: 1 servings

1 c popcorn, unsalted
3 c sugar
1 c milk
2 eggs
1 ts vanilla
2 c coconut, shredded
2 c cottage cheese
1 tb orange rind, grated
2 c coconut
1 c chocolate chips

preheat oven to 350 degrees. grease a 13x9x2-inch baking pan. in a
medium bowl, combine flour, baking soda, salt, cinnamon and nutmeg.
stir in buttermilk and eggs. blend well. add chocolate chips, stir
until well blended. spread batter evenly into prepared pan. bake 15
to 20 minutes or until a wooden pick inserted in center comes out
clean. cool in pan on wire rack. cut in squares.

Yeah, that seems reasonable. Too reasonable. Fortunately, the ingredients are great. Why does it have popcorn? I guess that would have to be a topping or something? All said, this one could be pretty good.

title: chocolate fudge bars
categories: candies, chocolate
yield: 1 servings

1 c sugar
1 c butter or margarine
2 c confectioners’ sugar
1 ts vanilla
2 c milk
1 c chopped walnuts

cream butter and sugar until light and fluffy. blend in egg and
vanilla. sift flour with baking powder and salt and add to creamed
mixture. mix in another 1/2 cup of chocolate chips. pour into greased
and floured 9″ round cake pan. bake at 350 f for 50-55 minutes. cool
in pan and cut into squares.

Ok, there’s no way that’s not a real recipe with a few modifications. That’s too good. There were a couple like this, so I’ll skip over them for now.

title: fish fillets
categories: fish
yield: 4 servings

-waldine van geffen vghc42a
1 lb fish fillets; cut into 1/2″
-cubes
1 c walnuts; chopped
1 c peanut oil
1 c water
1 clove garlic; minced
1 tb butter
1 tb flour
2 tb sugar
1 tb cornstarch
1 tb water
1 pn salt
1 ts worcestershire sauce
1 ts cornstarch

preheat oven to 375f. lightly butter a 13 x 9 x 2 inch baking
dish. sprinkle the shrimp with the salt and pepper. arrange the
artichokes in a single layer on the carrots. add the onion and garlic,
and sprinkle with the garlic powder. place the chicken breasts on top
and bake for 15 minutes or until they are soft and crunchy.

place the chicken in a serving bowl and top with the basil sprigs.
sprinkle with parsley and serve immediately.

serves 4.

I love this one because of “soft and crunchy.”

And then we get this monstrosity:

title: frosty chocolate cheesecake (lacto)
categories: chocolate, cakes, desserts, cheesecakes
yield: 16 servings

—————————crust——————————–
1 c sugar
1 pk strawberry pudding mix
1 c cold water
2 tb margarine; melted
1 ts vanilla extract

————————–filling——————————-
1 c sugar
2 tb cornstarch
1 ts cream of tartar
1/4 c cocoa
1/2 c water
1 ts vanilla

—————————glaze——————————–
1 c sugar
1 ts cornstarch
1 ts cornstarch
1 ts lemon juice

combine the chocolate chips, sugar, corn syrup and salt in a heavy
saucepan over medium-high heat. cook over moderate heat, stirring
constantly, until the sugar is dissolved. remove from the heat and
stir in the butter until dissolved. stir in the vanilla and coconut.
spoon into a 9-inch springform pan. using a rolling pin, score the
cake layers in the pan. bake the cake in the middle of a preheated
350f oven for 50 minutes, or until a toothpick inserted in the center
comes out clean. cool on a wire rack for 10 minutes, then remove the
cake from the pan and cool completely.

in a small saucepan, combine the chocolate and water. cook, stirring
constantly until the chocolate is melted. remove from the heat and
stir in the coconut. spread the chocolate mixture over the cream
cheese mixture, and spread the remaining cream over the cake. top
with the remaining chocolate truffle mixture. refrigerate until
chilled.

to serve, cut into squares and serve with a sprinkle of confectioners
sugar.

nutritional information: calories 163 (39% from fat); protein 2.4g;
fat 3.1g (sat 3.4g, mono 0.1g, poly 0.9g); carb 12.5g; fiber 0.5g; chol
78mg; iron 1.6mg; sodium 392mg; calc 16mg

This monstrosity of a recipe. It might even make a cake. But those ingredients will definitely not make something I’d want to eat today.

The fun part about the overfit network is that if I turn the temperature up too high, it gets weird.

title: jack’s hot chicken salad
categories: salads, poultry
yield: 12 servings

3 (350 ml) dried hot chiles
4 skinless chicken breasts
(melted — about 1 pint)
1/2 to 2et cooking bagels
8 thick pastry sheets
1 lg onion
3 garlic cloves,large
1 ts garlic powder
1 c water

directions: place potatoes in a heavy pot over ham heat. add the
s&p. and bring to a boil over high heat, continuing to toss the last 2
minutes of cooking.

put the clam juice, wine and vinegar into a medium saucepan and
add the rice. bring it as the grain cooks. bring to its boil over
medium heat, then pierce it off with a knife; over low heat, simmer
for 15 min. until the flavors have blended. strain the fruits and
reserve the liquid.

cut the cauliflower into bite sized pieces. wash these and peel
them.

after the couscous has cooled, the next day, rinse under cold water
and place in a dipping bowl.

meanwhile, rinse the chicken with a mixture of warm water and 1/4 tsp
salt.

drying liquid 1: sprinkle the breadcrumbs evenly over the skin, each
one. lightly brown the spareribs in it in a little oil in a roasting
pan and add remaining ingredients. cover and cook over low fire for 1
minute per side.

flay the squid together.

Wow. I think I’ll just let that one stand on its own.

title: irregular beef
categories: main dish, beef
yield: 6 servings

1 lb ground beef, lean
1 md onion
1 garlic clove, minced
1 c water
1 large can tomatoes, chopped
1 cn salt cod use water pasta
-(no salt and tabasco)

in a large saucepan place 2 or 3 cups beef bouillon cubes; set aside.
cook sausage over medium heat until tender. remove, and drain pieces;
place in a greased 9-inch baking dish. sprinkle margarine on bottom
and sides. repeat two more or more. cover and bake at 350 degrees for
5 minutes. combine mayonnaise, mustard, horseradish, pepper, curry
powder and salt, using the metal blade. on a baking sheet, place a mixt of
the eggs, salt and pepper, and the beaten egg abert to the meat
mixture. fill each scallop mixture with the egg mixture and then top
with 1 t of parmesan cheese. bake at 500 f for 45 mins. or until
beginning to bubble.

Ok, I think we can ignore the recipe and just go from the instructions here. This will definitely make some irregular beef.

Just to compare, here’s the non-overfit network at a high temperature:

title: home-made maple pineapple jam
categories: relishes
yield: 1 batch

1/2 lb plain flour
salt
1 tb peanut oil
1 c sliced fresh cooking apples
1 ts instant coffee
2 oz self-raising flour

divide egg whites equally among oiled serving dishes. place your
finger and chopped mint in a hot water bath and refrigerate overnight.
when it comes out clean, toss it over in a 900′. in order to melt the
caramel thermometer, pour in the banana sauce. serve in ice cream
refrigerator. the recipe was doubled…yeast! place the kebab in a deep
1 quart or tiled container and let stand at room temperature. when
firm it is done at its way folks, but not enough to within another.
pour into container and freeze up to one month.

Mmm, maple pineapple jam sounds good. But the instructions… You have to refrigerate your finger overnight and melt a thermometer.

Later on we get cool stuff like:

cream butter and sugar, heat to medium. add banana, eggs, vanilla,
cinnamon, nutmeg and lemon rind; beat at low speed until fluffy,
whirl in dry ingredients, no longer, add cream cheese and beat until
smooth. stir in currants. drop by rounded tablespoonfuls, 2″ apart,
3 inches apart. be careful not to knock some of the rest of the
cookies.

The beginnings of a tasty sounding banana cookie.

title: freezer mix jelly
categories: relishes
yield: 1 servings

1 1/2 c instant mashed potatoes
1 pk unflavored gelatin
3 c cranberry juice
1 pk cream cheese (7 oz)
1 1/2 c mayonnaise
10 oz feta cheese

blend sugar and vermicelli til smooth. place in ice-water bath; mix well
and pour over salad.

A… thing.

steam the oranges for 4 minutes. into a blender, combine the flour,
salt, and corn meal. process for 30 seconds. add the butter and
margarine, and pulse for about 5 seconds. add the pureed apples
and the margarine. process on low the bowl until the mixture is
combined evenly. add the remaining 1/2 cup mashed bananas.

Some kind of… fruit cake?

In summary, I’ve gotten really astounding results and the number of actually somewhat cookable recipes has gone up immensely (I think?). I’m definitely putting a lot more thought into the idea of cooking some of these and making an RNN cookbook.
