Sena Double Dog Dared Me to Try the Electric Snow Shovel!

Well, the snow kept coming down. I had to go back out to shovel. I was going to use my plow shovel, but Sena said I was too chicken to try the electric snow shovel.

I couldn’t back down.

So, I had to go out there, after the wind had changed, mind you, and use the battery-powered shovel. I got snow all over me because no matter what I did to adjust the snow deflectors, the snow just blew back in my face.

I could have been a contender.

Super Bowl Commercial “Mean Joe Frank” 2025 Looked Familiar!

We watched the Super Bowl last night, and one of the many commercials (always a big thing) looked vaguely familiar, but only after Sena pointed it out. It was the Fareway Frank commercial for Fareway stores. It was a reprise of the famous 1979 Coca-Cola commercial with Mean Joe Greene, defensive tackle for the Pittsburgh Steelers, and a kid. When you compare them, the similarities are obvious. There are two Fareway commercials that are the same but have different titles, one of them being “Mean Joe Frank.”

The “Hey Kid, Catch!” commercial with Mean Joe Greene featured a 9-year-old kid named Tommy Okon. I couldn’t find a name for the kid in the “Mean Joe Frank” version.

But you can find information on Turner Classic Movies about a film titled “The Steeler and the Pittsburgh Kid,” which is based on the commercial. The kid is played by Henry Thomas, who later starred in “E.T. the Extra-Terrestrial” (1982).

This led to reminiscing about the Fareway store in Mason City, Iowa. I used to walk to Fareway to get groceries and walked back carrying at least two big paper sacks. My arms were always pretty sore when I got home. I used a wagon later on, but had to be careful crossing the railroad tracks on the way back to make sure the eggs didn’t break.

There are Fareway stores all across Iowa and several neighboring states now, but the first one opened in Boone, Iowa in 1938. It popularized the idea of shoppers picking out their own items from the shelves rather than letting a store employee pick them out from the shopper’s list. The store name was inspired by what was sold (“fare”) and treating customers and employees fairly.

The Fareway Frank character was called Forrest Frank in 2024, although it looks like the company has settled on Fareway Frank.

The other Iowa connection worth mentioning is that Cooper DeJean, a rookie defensive back for the Super Bowl-winning Philadelphia Eagles who played for the University of Iowa Hawkeyes, returned an interception for a touchdown (a pick-six) in the game.

While the Fareway Frank entry in the Super Bowl commercial lineup didn’t make the short list (or any list, for that matter) of favorites, it sure did bring back memories for us.

The Big Mo Pod Show: “Variety is the Spice of Life”

We listened to the Big Mo Blues Show last night and heard a couple of tunes that were fun. One of them was mentioned on the Big Mo Pod Show today: “Feelin’ Alright” by Stuff. The other, which was a riot but wasn’t on the podcast, was “I Feel So Good (I Wanna Boogie)” by Magic Sam.

The one that brought back memories was “Feelin’ Alright” by Stuff. It sounded so familiar, and we finally figured out it was a cover of the song of the same title sung by Joe Cocker (the version I remember). However, I guess it was originally written by Dave Mason of the group Traffic in 1968. And there’s even an album, “Joe Cocker With Stuff Feelin’ Alright.”

The one Big Mo played last night was from the album Stuff Live at Montreux 1976. It was instrumental and hard to recognize at first. The lyrics are sad, though, and it’s about unrequited love, I guess. I associated the song years ago with drug and alcohol addiction after seeing the 2012 movie “Flight” on TV. In every scene in which the main character, Whip Whitaker (played by Denzel Washington), got high, that song was part of the soundtrack.

Anyway, moving right along, the song that was not on the podcast and was a whole lot of fun to listen to was “I Feel So Good (I Wanna Boogie)” by Magic Sam. We’d never heard of him, but the way he sang the number made us laugh out loud.

The title of the podcast was right on target: “Variety is the Spice of Life.”

Okay, We Can Laugh Now!

Yesterday, I wrote a pretty serious post. Now I’m in withdrawal from jokes, but I can fix that.

Here’s a link to a site that has clean jokes for everyone.

And here are some Chuck Norris jokes (facts?) for everyone else. One of my personal favorites:

Chuck Norris can do a wheelie on a unicycle.

Give me your favorite clean jokes. I double-dog dare you.

My Mt. Rushmore Dream

Lately, I’ve been anticipating my eventual immortalization as a sculptured stone bust on Mt. Rushmore. Hopefully, this will be fairly soon because I’m not getting any younger.

Among my many inventions is the internet. Don’t believe Al Gore, although he has persuaded others about his role in the development of what I argue should properly be called the world wide web. I’ve invented a lot of other things which I’ll tell you more about just as soon as I make them up.

Before I forget, I want to tell you what I just noticed last night while I was watching one of my favorite X-Files episodes, “War of the Coprophages.” I guess I never noticed that the cockroach invasion was about Artificial Intelligence (AI). It was the scientist, Dr. Ivanov, who mentioned it first, and I just missed it the first few hundred times I saw the show.

Dr. Ivanov clearly thought that anybody who expected extraterrestrials to be green and have big eyes was probably crazy. Traveling across galaxies through wormholes and whatnot would tear humanoid organisms apart. The practical approach would be to send AI robots instead. You could see Mulder cringe at that idea. The little robot that kept edging closer to Mulder made him nervous, and when he asked Dr. Ivanov why it did that, the reply was “Because it likes you.”

That doesn’t exactly fit with Ivanov’s other idea about extraterrestrials, which is that they would focus on important tasks like getting enough food, procreating, and so on without getting all emotional about them. It’s ironic that Dr. Ivanov made an AI robot that gets a crush on a sesame-seed-munching UFO hunter like Mulder.

However, the AI robots in the show are cockroaches which love to eat dung. In other words, they’re full of crap.

Moving right along, although I didn’t invent it, there’s a card game called Schnapsen that Sena and I are trying to relearn. It’s kind of a break from cribbage. It’s a trick-taking game with just a 20-card deck. We play the version that doesn’t let you look back through the tricks you’ve won to count your points, so you have to know when you can close the deck or go out, meaning you’ve reached the 66 points needed to win. You have to remember how many points you’ve won in tricks. I think it’s a good way to keep your memory sharp.

Let’s see; I’ve lost every game so far, but that doesn’t mean I won’t end up with my bust on Mt. Rushmore.

Artificial Intelligence in Managing Messages from Patients

I ran across another interesting article in the JAMA Network about Artificial Intelligence (AI), this time with respect to health care organizations managing messages from patients to doctors and nurses. The shorthand for this in the article is “in-basket burden.” Health care workers respond to a large number of patients’ questions, and it can lead to burnout. Some organizations are testing AI by letting it draft replies to patients. The results of the quality improvement study were published in a paper:

English E, Laughlin J, Sippel J, DeCamp M, Lin C. Utility of Artificial Intelligence–Generative Draft Replies to Patient Messages. JAMA Netw Open. 2024;7(10):e2438573. doi:10.1001/jamanetworkopen.2024.38573

One of the fascinating things about this is the trouble we have naming AI’s misinformation problem. We tend to use a couple of terms interchangeably: hallucination and confabulation. Whatever you call it, the problem interferes with communication between health care workers and patients.

Dr. English describes the interference as a “whack-a-mole” issue, meaning every time they think they got the hallucination/confabulation problem licked, the AI comes up with another case of miscommunication.

Just for fun, I did a web search trying to find out whether “hallucination” or “confabulation” fits the AI behavior better. Computer experts tend to use the term “hallucination,” and neuropsychologists seem to prefer “confabulation.” I think this community chat site gives a pretty even-handed discussion of the distinction. I prefer the term “confabulation.”

Anyway, there are other substantive issues with how using AI drafts for patient messaging affects communication. I think it’s interesting that patients tend to think AI is more empathetic than medical practitioners. As Dr. English puts it: “This GPT is nicer than most of us,” and “And ChatGPT, or any LLM, isn’t busy. It doesn’t get bored. It doesn’t get tired.” The way that’s worded made me think of a scene from a movie:

OK, so I’m kidding, a little. I think it’s important to move carefully down the path of idealizing AI. I think back to the recent news article about humans teaching AI how to lie and scheme. I remember searching the web with the question “Can AI lie?” and getting a reply from Gemini, because I have no choice about whether it gives me its two cents. I’m paraphrasing, but it said essentially, “Yes, AI can lie and we’re getting better with practice.”

I like Dr. English’s last statement, in which she warns us that AI can be a fun tool that clinicians need to regard with healthy skepticism. It may say things you might be tempted to gloss over or even ignore, like:

“I’ll be back.”

New Snow Shovels!

The new shovels were delivered today. Both required some assembly. I’m the least handy person when it comes to that. I did OK with the snow plow, but Sena had to come to the rescue when it came to the cordless snow shovel. The handle was tricky for some reason.

The batteries for the electric shovel needed minimal charging and it roared to life. It doesn’t sound like a toy.

Now all we need is snow. I can wait.

Did You Know They Won’t Be Making Yardsticks Any Longer?

Anecdote alert! Sena just got back from shopping and had a priceless little story about shopping for a yardstick to measure window film for a door window. I suppose I should say that the title of this post is a dad joke that some people might not get.

Sena asked a Menards worker where to find a yardstick. She said the guy looked like he was in his thirties. His English was probably a little rough. He looked puzzled and directed her to the lawn and garden center. She clarified that a yardstick was something like a ruler. He replied that they didn’t carry school supplies.

Another worker was in the same aisle and chuckled. He directed her to where the yardsticks were.

You know, I haven’t seen a yardstick in a long time. We don’t own a ruler although we have a tape measure. Just to let younger people know, a yardstick is typically a piece of wood 36 inches long (which is 3 feet), marked off into inches, and used for measuring things.

The worker who didn’t know what a yardstick was could probably relate to football games because the length of the field is still divided into yards, but only if he’s a football fan, I guess. But you don’t measure the distance to a first down on a football field with a yardstick. Incredibly, they measure it with a chain stretched between two sticks. None of your lasers for the officials.

We had a yardstick in the house where my brother and I grew up. You could also use it to reach stuff that rolled under tables. You could make comparisons by saying “By any yardstick, blah blah.”

And you can make dad jokes about yardsticks. By the way, the company that makes yardsticks won’t be making them any shorter either.

Don’t Shovel Your Heart Out

We’re waiting for the next snowfall. We’ve had a couple of light ones so far and we used shovels to clear our driveway and sidewalk. They didn’t amount to much, but we’ll get a heavy snow here pretty soon.

We’ve been using shovels for years. I’m aware of the risk of heart attacks in certain people, especially sedentary middle-aged and older men with pre-existing cardiac risk factors. I’m not keen on snowblowers, mostly because I like to shovel.

I’ve been using an ergonomic shovel for years, although I used it the wrong way until about four years ago. I used to throw snow over my shoulder while twisting my back. Now I push snow with a shovel that has a smaller bucket or with a snow pusher that has a shallow, narrow blade. I lift by keeping my back straight and bending at the knees, flipping the small load out. I take my time.

I don’t know how high my heart rate gets while I shovel. I exercise 3 or 4 days a week. I warm up by juggling. I do floor yoga with bending and stretching, bodyweight squats, one-leg sit-to-stands, step platform work, dumbbells, and planks. When I’m on the exercise bike, I keep my heart rate around 140 bpm, below the maximum rate for my age, which is 150 bpm.

I’m aware of the recommendations to avoid shoveling snow based on the relevant studies. I realize I’m way past the age when experts recommend giving the snow shovel to someone else.

The question is, who would that be? There aren’t any kids in the neighborhood offering to clear snow. Maybe they’re too busy dumb scrolling. I’m also aware of the city ordinance on clearing your driveway after a big snow. It’s very clear, at least in Whereon, Iowa.

“The city of Whereon requires every homeowner to clear snow from sidewalks within 24 hours after a snowfall. This means you. If you fail in your civic duty to clear snow and ice from your walkway within the allotted time of 10 minutes, the city will lawfully slap you with a fine of $3,000,000 and throw your dusty butt in jail for an indeterminate time that likely will extend beyond the winter season and could be for the rest of your natural life and even beyond, your corpse rotting in your cell, which will not bother the guards one iota because of the new state law mandating removal of their olfactory organs. Hahahahaha!!”

In light of the strict laws, Sena ordered a couple of new snow removal tools. Neither one of them is a snow blower. I think it’s fair to point out that some cardiologists have reservations even about snowblowers:

There are even studies that show an increased risk for heart attacks among people using automatic snow blowers. Similar to the extra exertion of pushing a shovel, pushing a snow blower can raise heart rate and blood pressure quickly. (From the “Snow Shoveling can be hazardous to your health” article above.)

One of them is a simple snow pusher with a narrow 36-inch blade. That’s for me. The other, for Sena, is a cordless, battery-powered snow shovel that looks like a toy. The ad for that tool includes a short video of an attractive woman in skinny jeans, her stylish coat open to reveal her svelte figure, demonstrating how the electric shovel works. It appears to remove bread-slice-sized pieces of snow from the top of a layer that stubbornly sticks to the pavement. Call the Whereon snow police.

We should be getting both tools before the next big snow.

Should We Trust Artificial Intelligence?

I’ve read a couple of articles about Artificial Intelligence (AI) recently, and I’m struck by how readily one can get the idea that AI tends to “lie” or “confabulate,” and sometimes the word “hallucinate” is used. The term “hallucinate” doesn’t seem to fit as well as “confabulate,” which I’ll mention later.

One of the articles is an essay by Dr. Ronald Pies, “How ‘Real’ Are Psychiatric Disorders? AI Has Its Say.” It was published in the online version of Psychiatric Times. Dr. Pies does a superb job of talking with AI, and I had as much fun reading the lightly edited summaries of his conversation with Microsoft Copilot as I had reading the published summary of his conversations with Google Bard about a year or so ago.

I think Dr. Pies is an outstanding teacher, and I get the sense that his questions to AI do as much to teach it how to converse with humans as they do to shed light on how well it handles the questions he raises. He points out that many of us (including me) tend to react with fear when the topic of AI in medical practice arises.

The other article I want to briefly discuss is one I read in JAMA Network, “An AI Chatbot Outperformed Physicians and Physicians Plus AI in a Trial—What Does That Mean?” (Accessed January 6, 2025).

Hswen Y, Rubin R. An AI Chatbot Outperformed Physicians and Physicians Plus AI in a Trial—What Does That Mean? JAMA. Published online December 27, 2024. doi:10.1001/jama.2024.23860 (accessed January 6, 2025).

I think the conversation among the authors was refreshing. Just because the title of the article suggested that AI might take the place of physicians in the consulting room doesn’t mean that was the prevailing opinion of the authors. In fact, they made it clear that replacing physicians wasn’t what they recommended.

I liked Dr. Chen’s comment about confabulation and hallucinations of AI:

“A key topic I talk about is confabulation and hallucination. These things are remarkably robust, and only getting better, but they also just make stuff up. The problem isn’t that they’re wrong sometimes. Lab tests are wrong sometimes. Humans are definitely wrong sometimes. The problem is they sound so convincing, they confabulate so well. ‘Your patient has an alcohol problem, Wernicke-Korsakoff syndrome.’ It’s only if you double check, you’ll realize, ‘Wait a minute, that wasn’t actually true. That didn’t make sense.’ As long as you’re vigilant about that and understand what they can and can’t do, I think they’re remarkably powerful tools that everyone in the world needs to learn how to use.”

What’s interesting about this comment is the reference to Wernicke-Korsakoff syndrome, which can be marked by confabulation. It’s really not clear how confabulation comes about in AI, whereas thiamine deficiency is the main cause in Wernicke-Korsakoff syndrome. In both cases, it involves inventing information, which is technically not the same as lying.

Unfortunately, this contrasts sharply with the fact-checking Snopes article I wrote about recently, which suggests that humans are teaching AI to lie and scheme.

In any case, it’s prudent to regard AI productions with skepticism. My conversations with Google Bard clearly elicited confabulation. Also, it didn’t get humor, so I wouldn’t use it as a conversational tool, given that I’m prone to kidding around. As far as trusting AI, I probably wouldn’t trust it as far as I could throw it.