November 10, 2025
Good morning! Three words: Roasted Root Vegetables. My good friend Dan has a birthday (#79) tomorrow… and he’s a U.S. military veteran… Happy Birthday, Dan, and thank you for defending the Constitution. It’s been more than a week, but we are still mourning the final out, and the outcome, of the World Series… one of the more compelling in history… it “had an average of 25.45 million U.S. viewers” (Nielsen). “That’s baseball’s biggest audience since 2017… there were 32.6 million viewers from Japan, Canada, and the U.S. combined” (Ibid.). Go Blue Jays!
The Farmers’ Almanac, first published in 1818 during the presidency of James Monroe of Virginia, will permanently stop the presses.
- “You’re not sending too much E-Mail, but what you’re sending sucks!” (Murray)
- Check your quality before you engage in quantity — or before you send anything.
- For years it was something to be feared, a perceived quasi-secret, a Pandora’s Box.
- Your credit score, that is, is now as ubiquitous as advertisements for Coke or Pepsi competing for your business.
- In this age of enlightenment, everyone wants to know everything — or at least thinks s/he wants to know everything…
- … and so the credit score has become a multi-billion-dollar industry, started on $800 in (where else?!) a California garage.
- Would you rather pay the Googlees or Walt Disney? You decide.
- Or, it might be smarter to get an antenna or rabbit ears while the airwaves are still, in theory, free of charge.
- If that football thing doesn’t work out for Jerry Jones he reportedly has a backup plan worth more than $100 billion.
- Out in East Texas and Western Louisiana, Jones controls a massive reserve of natural gas courtesy of his majority ownership in Comstock. (Morenne)
- We’ve been writing about Nvidia in these pages for a long time — and now it’s worth $5 trillion…
- … stop writing and start investing? (That’s what a person smarter than I would have done — and no doubt did.)
- I don’t know the answer to this:
- Anyone?
- If China doesn’t practice free enterprise capitalism, how can it have ~300 billionaires?
- (The U.S. has the most in the world, approximately 1,300 — billionaires, that is.)
- For you economists and mathematicians out there: Have there ever been the (adjusted-for-inflation) equivalent of trillionaires in the history of the world?
- We would suggest, yes.
- In the days of kings, queens, kingdoms, dictators, (Solomon?) and Etc., surely there was that much wealth — and more? — controlled by one human?
- Book: The French Revolution, Hardman
- Movie(s): That new one about James Garfield and, concurrently, his assassin (Death by Lightning) — and A House of Dynamite (cleverly edited and presented)
- Theatre/Opera: The critiques of Prince thus far = underwhelming…
- Magazine: The articles in the November 2025 Smithsonian are outstanding — especially the reminders of the horrors revealed at Nuremberg — and the history of Norfolk.
- Hasten to enjoy ALL of this treasure brought to you by the Brit James Smithson and others.
- Will it survive? We sure hope so.
- Courtesy of MIT Sloan Management Review and author Rama Ramakrishnan (September 2025), I thought we could together learn a bit more about the generative artificial intelligences…
- … this time from the bottom up — or from the inside out.
- This is perhaps somewhat analogous to learning how to build, maintain, and repair a jet engine.
- Just one bite at a time… no pun intended (get it?!)
- Disclaimer: We detest the abbreviations and acronyms, but will use them here (LLM = Large Language Model; i.e., the enormous, trained language models that make this kind of artificial intelligence possible)
- And, GENAI = Generative Artificial Intelligence; e.g., ChatGPT and the offerings of Alibaba, IBM, Salesforce, Etc.
- OK… here we go:
- Q: How does the LLM decide when to stop? Put another way, when does the LLM decide to give the user the final answer to a question?
- A: “When an LLM answers a question, it produces text one small piece at a time. The technical name for a piece is token.
- Tokens can be words or parts of words…
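- An aside from us, not from the MIT piece: sub-word tokenization can be sketched in a few lines of Python. Real tokenizers (BPE, WordPiece, and the like) learn their vocabularies from mountains of text; the tiny vocabulary and the greedy longest-match rule below are invented purely for illustration.

```python
# Made-up sub-word vocabulary; real tokenizers learn theirs from data.
VOCAB = {"token", "iz", "ation", "un", "believ", "able", "is", " "}

def toy_tokenize(text):
    """Greedily split `text` into the longest matching known pieces."""
    tokens = []
    while text:
        # Try the longest possible prefix first, shrinking until one matches.
        for end in range(len(text), 0, -1):
            if text[:end] in VOCAB:
                tokens.append(text[:end])
                text = text[end:]
                break
        else:
            raise ValueError(f"no vocabulary piece matches {text!r}")
    return tokens

print(toy_tokenize("tokenization is unbelievable"))
# -> ['token', 'iz', 'ation', ' ', 'is', ' ', 'un', 'believ', 'able']
```

Note how whole words ("is") and fragments ("iz", "ation") both count as tokens, which is exactly the point above.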
- An external system runs the LLM in a ‘generate the next token; append it to the input; generate the next token‘ loop until a stopping condition is triggered.
- When this happens, the system stops asking the LLM for more tokens and shows the result to the user.
- (To state the obvious, all of this happens at trillions of calculations per second!)
- Many stopping conditions are used in practice.
- An important one involves a special ‘end-of-sequence’ token that means, ‘end of answer.’
- These are constructed when the GENAI is being ‘trained’.
- (To be ‘trained’ implies a human somewhere telling it what, when, where, and why, Etc.)
- (This is where good vs evil creeps in and has us concerned)
- Other stopping conditions include a limit on the maximum number of tokens generated so far, or the generation of a user-defined pattern…
- When we use a tool like ChatGPT, we don’t see this process, only the finished text (answer).
- When you start building your own LLM applications, you can adjust these stopping rules and other parameters yourself… affecting cost, formatting, and completeness.
- i.e., the decision to stop is an interaction between the LLM’s token predictions and the external control logic, NOT a decision made by the LLM itself.”
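- The generate-append-repeat loop described above can be sketched in a few lines of Python. Everything here is a stand-in: `fake_llm_next_token`, the `EOS` marker, and the limits are invented for illustration, not any real model’s API.

```python
EOS = "<eos>"           # hypothetical end-of-sequence ("end of answer") token
MAX_TOKENS = 50         # hypothetical cap on how many tokens may be generated
STOP_PATTERN = "\n\n"   # hypothetical user-defined stop pattern

def fake_llm_next_token(tokens):
    """Stand-in for a real model: hands back one canned token per call."""
    canned = ["The", " capital", " of", " France", " is", " Paris", ".", EOS]
    return canned[len(tokens)] if len(tokens) < len(canned) else EOS

def generate():
    tokens = []
    while True:
        token = fake_llm_next_token(tokens)
        if token == EOS:                             # condition 1: end-of-answer token
            break
        tokens.append(token)                         # append, then ask again
        if len(tokens) >= MAX_TOKENS:                # condition 2: token limit reached
            break
        if "".join(tokens).endswith(STOP_PATTERN):   # condition 3: stop pattern seen
            break
    return "".join(tokens)

print(generate())  # -> The capital of France is Paris.
```

Notice that the stopping decision lives entirely in the loop (the external control logic), never inside `fake_llm_next_token`, which is the article’s point.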
- (We will continue this education next week IF there is interest… nine more snippets from this MIT Sloan guy…)
- Typos: We apologize for and take full responsibility for the slippage of fingertips causing distraction if not confusion in last week’s Musings:
- Of the more than 270 life sciences workflows…, ~80% of workflows in pharma and medtech have tasks that could be automated or augmented.
- We own 100% of our mistakes… no artificial intelligences are used in the production of this column. Some might be needed, but…

