Mixed up

A few weeks ago, I had one of my wisdom teeth extracted. It wasn’t giving me trouble, but as the dentist explained, if it continued taking up residence in my mouth, it would soon give me trouble.

I’m happy to report that the extraction itself was dern-near painless. I did experience some pain once the numbing wore off, and I’ve had some issues created by food becoming lodged in the new hole in my mouth, but that’s not the purpose of this-here post.

Although I had grandiose plans to eat chicken the evening of my extraction, once the procedure was complete, I realized I would have to settle for something that wouldn’t necessitate much chewing. Thus, I decided to make cornbread. Hot bread and butter followed by milk and bread wouldn’t require much work from me at all.

So, I stirred up some bread using the recipe my mom taught me, which comprises four ingredients – cornmeal, flour, buttermilk, and water – none of which we measure.

Not that it’s relevant to this story, but at some point during the years, my mom stopped putting flour in her cornbread. However, I still spoon some flour into the batter. Indeed, I used it the evening of my extraction. And once the aforementioned four ingredients were blended together, I poured the mixture into a cast iron skillet and put said skillet in the oven.

Approximately 20 minutes later, I checked the bread and immediately noticed that something was amiss. The crust felt hard and the inside felt sticky. Still, I cut a piece of bread, slathering it in butter. But the butter didn’t melt into the bread. Instead, it pooled atop it. Although the sticky bread didn’t look appetizing, it was there and I was hungry, so I took a bite.

It was inedible.

I hate waste, but I couldn’t finish one bite, let alone an entire pan, so I dumped it into the trash.

Later that evening, after I had made a mashed potato run to the KFC, I implored my mom, “What did I do wrong?” At first, she seemed as puzzled as I. However, recognizing my reputation for using aged ingredients, she suspected that my cornmeal might have been old, and I conceded that it could have been in my cupboards for a significant amount of time.

Then Mom assumed the role of a detective interrogating a suspect. She asked me to list the steps I had taken, starting at the beginning.

“Well,” said I, “I started with the flour.” Gasping, I added, “That’s it!”

As far as I know, my mom didn’t teach you to make cornbread, so you’ll be forgiven if you don’t understand how that aha moment solved the mystery of the sticky bread. Here’s the gist of it – she taught me to start by scooping one or more cups of cornmeal (depending on whether we were making a big or small pan) into a kettle, followed by a spoon or two (again, depending on the size of the pan) of flour before adding the buttermilk and water. On the evening of my extraction, I mixed up the ingredients and started by scooping in a cup (more or less) of flour and a spoon (more or less) of cornmeal before adding the buttermilk and water.

Obviously, my mistake can be attributed to the pain emanating from the new hole in my mouth. After all, I’ve successfully made hundreds of pans of cornbread in my day. Until, that is, the evening of my extraction.

This post originally appeared in the Appalachian News-Express.


A little squirrelly

As my family and I prepared the Thanksgiving menu, my mom reminisced about Thanksgivings of yore. Specifically, she recalled the holidays gone by when we gathered around the table to feast on the Thanksgiving squirrel.

While you take a moment to digest that information, I should let you know that by “we,” I mean the rest of the family, because I wasn’t born when these epicurean banquets were held.

Anyway, Mother said that back in the day, she and Daddy couldn’t afford a luxury like turkey. So, they ate chicken or squirrel at Thanksgiving. The chicken would have most likely been born, lived, and died on the property or in the vicinity. Or it could have come from a store, where it would have been purchased whole. It would later have been cut up into various body parts because they couldn’t afford individual poultry parts, either.

The squirrel, on the other hand, would have come from the nearby hills. After its death, it would have spent some time on an oversized safety pin Daddy used to transport his game.

In addition to chicken or squirrel, Mother said the meal would have included potatoes, beans, and other vegetables and probably some sort of bread. They would have finished the meal by enjoying homemade pie for dessert.

Mother said she might have made dressing on the chicken, but she never served stuffing on a squirrel. Having never stuffed dressing inside a turkey or a squirrel, I can only speculate as to which task would have proven more problematic.

Since she seemed content with the chicken and squirrel, I asked why they switched to turkey for Thanksgiving. She attributed it to following a fad and noted, “We fell into a rut.”

“We didn’t grow up eating turkey,” she reiterated. “I never had a turkey until, golly, I don’t know when I first ate turkey.”

Reconsidering, she added, “Growing up, the only time I remember anybody eating a turkey was when my grandma made one. They killed it and hung it on the clothesline.”

I’ll let you figure out for yourself why they hung the deceased turkey on the clothesline.

Happy Thanksgiving!

This post originally appeared in the Appalachian News-Express.

It’s better than (some) delivery

Last week brought exciting new experiences for me – I visited Atlanta, I rode through a gated trailer park, and I bought a DiGiorno pizza.

And as I look back on the week, I keep asking myself the same question – why did it take me so long to buy a DiGiorno?

Don’t get me wrong. I’ve dined on oodles of frozen pizzas in my time. Back in the day, I took Totino’s pepperoni pizzas for lunch, dividing the pizza in half to provide lunch for two days.

Through the years, I occasionally upgraded to the frozen pizzas that resemble the kind you order at restaurants. DiGiorno seemed a bit pricey, though, so I settled for lesser known brands. These pizza-eating experiences, however, always left me feeling less than sated. Although I tried several brands, none of them had much of a taste. Well, except for the aftertaste.

But last weekend I found myself in a precarious position. Yes, it was BYOF Saturday at my mom’s. So, as I walked through the aisles at the Dollar General, looking for something to eat, my eyes spied a DiGiorno rising crust pizza in the frozen food section.

The cost – more than five bucks – seemed exorbitant, but the rumbling in my tummy overruled my thriftiness, so I bought the pizza.

When I arrived at Mom’s, I put the pizza in the oven, leaving it there until the cheese was on the dark side of golden and the crust was a medium brown. Once it cooled a bit, I cut a slice and took a small bite.

It tasted delicious.

I’m not a foodie, so don’t expect me to describe the sauce and cheese with flavor-filled adjectives. Instead, I’ll repeat – it was delicious. The company’s marketing plan boasts that their pizzas could be mistaken for delivery. That is not a ploy. It is the truth. Indeed, the DiGiorno I consumed was better than some pizza I’ve had from certain restaurants.

I shared the yummy goodness with my sister and still had enough left over for Sunday and Monday. But I ate entirely too much for lunch Monday, so I opted for a light dinner that evening. Then, I went to Atlanta. So, by the time I resumed dining at home, the pizza was six days old.

Although I frequently consume food well past expiration dates, I thought long and hard about eating six-day-old pepperoni pizza. I turned to my sister for advice, asking if the pizza would still be good. When she said no, I followed up by inquiring if the no meant “it won’t taste right” or “it will kill you.”

She refused to clarify her answer and I ultimately decided against eating that last slice of pizza. Yes, by my calculations, I threw away approximately 90 cents worth of pizza. But I didn’t want my first experience with DiGiorno to end with a case of food poisoning.

This post originally appeared in the Appalachian News-Express.

Seize the day

I enjoy baking. One of the desserts (or “zerts,” as my late father pronounced them) I most enjoy baking – and eating – is white cupcakes with chocolate buttercream frosting.

A couple years ago, however, I decided to mix it up and make chocolate cupcakes with white chocolate buttercream frosting.

The cupcakes turned out splendidly and, in the end, so did the frosting. It’s just that it never became white chocolate frosting.

Indeed, as I mixed together the frosting, I multitasked by melting the white chocolate baking bars. But the bars didn’t actually melt. Instead, they appeared to scorch and form into puffs.

Assuming I had made a mistake, I expressed gratitude that I hadn’t mentioned the white chocolate aspect of the frosting to my family. Frankly, I feared they would accuse me of using outdated baking goods. And we all know what a ludicrous accusation that would be.

So, my pet army and I made a vow to never again speak of the incident and I scraped the brownish-looking white chocolate into the trash.

Flash forward to last December. As we gathered at my mom’s to prepare Christmas goodies, my sister tried to melt white chocolate chips. Although she frequently stirred them and added copious amounts of oil, the chips turned into scorched puffs. She noted that white chocolate is dern-near impossible to melt and lamented our lack of almond bark.

She might have felt forlorn, but I became so giddy I dern-near skipped down the road. (I also once again questioned the origin of almond bark. Is it literally the bark of an almond tree? And how does bark come in more than one flavor?)

Anyway, my happiness stemmed from the realization that I hadn’t goofed. It wasn’t me. It was the white chocolate.

Flash forward to last week. After I purchased half a flat of strawberries, I decided chocolate-dipped strawberries would improve my quality of life.

As it turned out, I had some white chocolate baking bars in the cabinet. Where did they come from? How long had they been in said cabinet?

None of that matters. All that matters is that I said to myself, “Self, you’ve got nothing to lose. You might as well melt them and see what happens.”

I guess you know what happened. The bars turned into scorched puffs. I’m sure I didn’t help matters by adding milk instead of oil, but I think they were already beyond salvaging.

Fortunately, I had some chocolate baking bars, which I melted. In case you’re wondering, chocolate-dipped strawberries did improve my quality of life.

Yet, due to my thirst for knowledge, I had to know more about melting white chocolate. Was it simply something we Goff sisters struggled to accomplish? Is there an easy remedy?

There’s not.

In fact, based on everything I read, my sister followed the standard operating procedure vis-a-vis melting white chocolate.

My research also resulted in the discovery of a new term – seized chocolate. Surprisingly, this does not refer to confiscation of a bakery’s assets. It’s the term for the scorched puffs created when one unsuccessfully melts white chocolate.

Maybe someday I’ll learn the term for what happens when one successfully melts white chocolate.

This post originally appeared in the Appalachian News-Express.

Stay in your lane

Usually, I avoid discussing controversial topics in this blog. Indeed, I like to think of this as a safe place where my readers can retreat for a laugh or two. Usually at my expense.

But with a controversy of epic proportions threatening to divide the country, I can no longer remain silent. I’m speaking, obviously, of McDonald’s double drive-thru.

When the first double drive-thru came to town, I went on record proclaiming my appreciation for it. My opinion has not changed. Vociferous double drive-thru critics, however, argue that it doesn’t speed up the fast food delivery process. They may be right, but I’m not addressing that issue. I’m concentrating on the question of which lane to choose.

For those of you unfamiliar with a double drive-thru, it’s exactly as it sounds. There are two lanes, each with its own intercom. After placing their orders, customers merge from the two lanes into one that takes them to the pay-here and pick-up-your-order windows. In spite of some confusion over who merges first into the single lane and the violence that has broken out at locations throughout the country, it’s actually a simple process.

Or so I thought. But I’ve recently learned that, for some people, the problem begins at the beginning. In fact, there are those among us who believe that all customers should stay in the lane closest to the restaurant until they’ve pretty much reached the intercoms. Only then, they maintain, should a car move into the second lane.

One of my friends labels people who bypass the first lane and zip into the second lane as lane skippers. A friend of a friend takes photos of these alleged lane skippers. Another friend flips off alleged lane skippers.

I guess there’s a chance she’s flipped me off because I always choose the shortest lane. Actually, I don’t know why anybody would waste time lingering in the longer lane when another, I repeat, shorter lane beckons them.

What’s more, there’s literally a sign at the drive-thru that gives us permission to do so. That’s right. I take the “any lane, any time” declaration as an invitation to choose any lane I want, any time I want. If I was supposed to wait impatiently in the long line, the sign would advise me to “remain in this long line until you either starve to death or reach the intercoms.”

But it doesn’t say that. So, as long as the “any lane, any time” sign remains, I’ll keep following directions – and risk getting flipped off.

This post originally appeared in the Appalachian News-Express.

More or less

Last week, I ran across a story that detailed a list of 10 foods that could help raise or lower the risk of dying from heart disease, stroke and/or Type 2 diabetes.

Whilst scanning the contents of said article, my eyes settled on one word – bacon. For one brief shining moment, I considered the potential ramifications of a world where bacon ruled as a healthy food.

Then, I actually read the entire story and realized that bacon is on the stuff-you-should-eat-less-of list. According to the Journal of the American Medical Association, bacon and other processed meats were linked to eight percent of deaths from the aforementioned health conditions.

So, I guess it’s good that I’ve cut back on my intake of bacon. Of course, in the past few years, I wasn’t eating that much bacon. That was not always the case. In fact, there was a time when I feasted on a pack of bacon every week. For reals. I’d fry half the pack one evening for dinner and the other half the next evening. My taste for bacon was so well known that when I recently ran into a work-related acquaintance I made during that period of my life, he pointed at me and said one word – bacon.

I can’t remember what prompted me to drastically reduce my bacon intake, but I cut back to perhaps one or two packs a year. Oh, I enjoyed the occasional plate of bacon at my mom’s or in the cafeteria, and every now and then I treated myself to a bacon and egg biscuit. But that was nothing compared to what I had been eating.

Still, there’s always room for improvement. As the end of 2016 approached, I informed my sister that I was giving up bacon and red meat. (You might ask yourself, “Self, isn’t bacon red meat?” I might answer by asking, “Is it?”)

Anyway, my sister, a woman not known for her silences, responded with silence.

“Do you think that’s a bad idea?” asked I.

“No, I just think that you eat so little bacon and red meat that you won’t even miss it.”

She had a point. I’ll splurge on a roast beef sandwich every few months and I have been known to dip the cafeteria’s roast beef in my mashed potatoes, but it’s not like I eat a pack of red meat (or bacon) every week.

I had been consuming nachos too frequently, though, so I made the decision to give them up. I also decided to end my long-term relationship with fries.

When I shared the latter decision with others, my audience gasped in surprise. After all, I am something of a cheese fries connoisseur. And you know what makes cheese fries better? Bacon.

Thus far, though, I’ve been true to my word. I have not had a fry in three months. During that time, I’ve had approximately six slices of bacon, two slices of ham, two slices of roast beef, two hot dogs and one pork chop. What’s more, I’ve walked past the nacho bar without giving it a third look.

Lest you think I’m a health food freak, I continue to satisfy my sweet tooth and I do not go near seafood, which is included on the stuff-you-should-eat-more-of list.

In case you’re interested, the other good foods are nuts, vegetables, fruits, whole grains and polyunsaturated fats. In addition to red meat and processed meat, sugary drinks and salt comprise the bad food list.

Until perusing the list, I had no idea salt was a food. But the list supports my position that bacon is not red meat, so I guess I’ll support their position that salt is food.

This post originally appeared in the Appalachian News-Express.

My salad days

I’ve come to the realization that I’ve wasted my life.

This epiphany occurred, appropriately enough, as I enjoyed a chicken Caesar salad. You see, recently I’ve developed something of an obsession with the tasty salad. This newfound fascination is all the more compelling when you consider that, for years, I wrinkled my nose in disgust when anyone so much as offered me a bite of chicken Caesar salad.

There’s no good reason for such behavior. As I relished each delicious morsel over lunch last week, I tried to remember what had kept me and the salad apart for what I’ve taken to calling the lost years. The only explanation I have is that I mistakenly believed that chicken Caesar salad was served with ranch dressing. Of course, I can take ranch dressing in small doses, so I’m not even sure that’s an acceptable rationalization.

Regardless, a couple months ago I had occasion to feast on what I later referred to as the best salad I had ever eaten. A week later at a Christmas luncheon, I ordered a house salad at a local restaurant that rivaled the best salad ever.

I did not connect the dots at that point. It wasn’t until a month later, after I actively ordered a chicken Caesar salad, with dressing on the side, that I made the connection. Even then, it didn’t immediately click. As I put only a minimal amount of dressing on the salad, I said to myself, “Self, this dressing doesn’t look or taste like ranch. Actually, this salad reminds me a little of the best salads ever, versions one and two.”

Since then, I have gone out of my way to get my hands on chicken Caesar salad. I waited in line for several minutes to obtain one specifically made for me. I’ve added chicken, romaine lettuce, Caesar dressing, and sprinkly Parmesan cheese to my shopping list. No croutons for me, though.

Whilst compiling my list, I searched for chicken Caesar salad recipes. Yeah, I know. It contains something like five ingredients. But I wanted to make sure I had included all the essential ingredients on my list. According to my research, common ingredients include anchovies, Worcestershire sauce and crushed garlic.

This distressed me. I thought making a chicken Caesar salad would involve tossing some sprinkly cheese on lettuce and chicken and then lightly seasoning the creation with dressing. I was not prepared to crush garlic.

Luckily, further research indicated that the so-called common ingredients comprise the dressing. Apparently, some over-achievers make their own dressing.

Not me. The less time I spend on making the salads, the more time I’ll have to enjoy them.

This post originally appeared in the Appalachian News-Express.