Wednesday, May 22, 2019

In The Not Too Distant Future

Perhaps 2025, perhaps 2030.

(Cartoonist Pat Cross)

You've heard or read that the College Board, which administers the SAT,* has decided that the test, long one of the standard tests for college admissions, shouldn't just test the student's ability to understand written English and mathematics.  They've decided to weight those scores with a "double secret" adversity score that is supposed to recognize that some students come from backgrounds which make it harder for them to get ahead.  In principle, this new score could increase the overall "attractiveness" of an applicant beyond their numeric SAT scores, letting a student from a "difficult background" win a seat over a student with better scores but a lower adversity score.  That appears to be the reason for this move.
The “adversity score” uses 15 factors to determine the level of difficulty and strife the applicant has faced that has shaped or impacted their performance on the SAT (Standardized Admissions Test), and through their high school career to provide “context” to the applicant’s social and economic background, Coleman said.
I call it double secret not just for the Animal House reference, but because the student doesn't get to see their score or know how it was derived.  The colleges see the score, but not the details behind it.  The problem is that the factors they've talked about aren't clear pictures of a student's overcoming adversity; they're too nonspecific.

For example, Karl Denninger talks about some of it in his piece (which I largely agree with). As Karl points out, 
That an area has a high crime rate does not mean any particular individual was a victim of crime.  That an area has a high poverty rate doesn't mean that an individual was functionally disadvantaged in learning.
Further, it's a snapshot.  If a student lived in affluence from birth until shortly before the test, and something a parent did collapsed their standard of living, they will get a higher adversity score than they "deserve".  Did they overcome as much adversity as someone who lived 17 or 18 years in that neighborhood?  The converse is also true: if the parents in that bad neighborhood worked hard - maybe worked a couple of jobs - scrimping and saving to get a place in a better part of town, their child lived in adversity nearly all their life until they took the SAT, yet will get a lower adversity score than they deserve.  

An entrepreneur can see a market developing for rental units in "bad neighborhoods", so parents willing to do anything - including cheating - to get their kids into college can move there long enough to get a good adversity score on the SAT.  

This score is telling the colleges nothing meaningful.   

Cartoonist A.F. Branco of Americans for Limited Government thinks he has a reasonable guess about what the adversity score is really all about.

Definitely can't accept the Asian colors, so no whitish colors and no yellowish colors allowed.

* The articles inform me that the company that supplies the test is called the College Board, and the test is called the Standardized Admissions Test.  I'm old enough to have taken that when the company was called the College Entrance Exam Board and the test was the Scholastic Aptitude Test.  Of course, as a Graybeard, I took those while fending off hordes of velociraptors.

Tuesday, May 21, 2019

Peggy Noonan Giving Up on Fiscal Responsibility?

Economist Chris Baecker manages fixed assets for Pioneer Energy Services and is an adjunct lecturer of economics at Northwest Vista College in San Antonio.  Today, writing for the Foundation for Economic Education, he notes an article by Peggy Noonan on May 2nd in the Wall Street Journal (paywall) and extrapolates it to say conservatives are giving up on fiscal responsibility.

I notice this because Peggy Noonan was among the very first people I quoted on this blog and I think of her as being fairly rational.   In this case, she's saying the rational fiscal conservatism of the Reagan years is pretty much over and conservatives need to recognize that.  She says, “The federal government will not become smaller or less expensive in our lifetimes.” and “less taxes and spending won’t resolve America’s crisis”.

I think it's a jump to say that a single writer, even one as influential as former Reagan speechwriter Noonan, is saying “Conservatives Are Throwing in the Towel on Fiscal Restraint”, which is why I didn't use Baecker's title.  Still, it's almost a cliché that we don't have a fiscally conservative party in this country, just a Uni-party with two competing sides trying to gain power and control for themselves.  As Baecker says,
For the first time in a half-century, the GOP had control of Congress as the 21st century began, and for a majority of it, a Republican presidency. But far from being the “sober-minded … best stewards” of the government who look at spending “coolly,” they became drunk with the power of the purse. Spending rose an average of 7 percent per year, two to three times the rate of inflation.  
It's almost like Lord Acton said, "power corrupts and absolute power corrupts absolutely."  Almost ... except for being exactly what he had in mind.

So what's left to do?  Are we to watch America sink into the sunset (or crash into the ground along the way)?  Certainly we can work to cut waste, but with the total National Debt north of $22 Trillion and unfunded liabilities estimated at $124 Trillion, almost six times that debt, isn't that just a trim when a full haircut is what's needed?  As I've said before, with these numbers we don't need balanced budgets, we need budget surpluses for about the next century.

Do they think they can shut down the Department of Education and fix things?  That's laughable; the DoE is almost a rounding error.  With 62% of the federal budget this year going to mandatory spending, that leaves 27% of the budget as discretionary (which includes DOD and all the alphabet agencies you know).  Oh, yeah, 62+27=89%.  Another 8% is interest on the debt, which needs to be considered mandatory spending too, but isn't.  That still doesn't get us to 100% and that linked website never fills in the gaps.
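Those rounded shares can be checked with trivial arithmetic; here's a quick sketch, using the percentages quoted above:

```python
# Rough federal budget shares for this year, as quoted (percent of outlays)
mandatory = 62       # entitlements and other mandatory spending
discretionary = 27   # DOD plus the alphabet agencies
interest = 8         # interest on the debt

accounted = mandatory + discretionary + interest
print(accounted)        # 97
print(100 - accounted)  # 3 - the gap the linked site never fills in
```

So even after counting interest, about 3% of spending is left unexplained.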

 The money to be saved lies in the mandatory spending category.
Life expectancy, for example, has increased by roughly a decade since the inception of both Social Security and Medicare. Logic dictates that the age of eligibility should follow suit. Also, we should have more choice about our participation in these programs and the way disbursements are administered.

Furthermore, there’s a popular myth that these entitlements have dedicated revenue streams: the payroll tax. The fact is barely a third of Medicare is funded that way, and if demographic trends don’t change, it’s currently projected that Social Security will also have to start dipping into general revenues to help pay out full benefits in fifteen years.
You may have noticed that these proposed fixes fly in the face of what the Evil Party presidential candidates are campaigning on.  In addition to Medicare for All, a massive budget buster, Kirsten Gillibrand declared that illegal immigrants should have the right to pay into and receive social security benefits. 

Just about a month ago, I wrote a piece passing on how government involvement in education was distorting the market for college and making prices rise at several times the official rate of inflation.  That's another factor in the budget.  Mark J. Perry of the American Enterprise Institute has a plot that shows the rates of inflation since 1998 of a handful of common expenses.

Notice how the highest rates of inflation are in areas in which the government is heavily involved?  While I don't have numbers right now, it looks like the greater the government's involvement, the steeper the curve: hospital services, the top curve, are in the thoroughly broken insurance/government complex; college tuition and books are next, as just described; and so on, down to things where the market is working, with TVs the most affordable relative to 1998 - as anyone who has looked at the prices of 4K UHD sets can attest.

A simple but truthful statement is that America's fiscal policies over the last 100 years show a deep ignorance of, or at least antipathy toward, basic economics.  The fiscal conservatives have been our best hope, and have consistently done nothing.  Final words to Chris Baecker. 
If we’re stuck with a two-party system, Republicans need to be the party that aggressively represents and promotes independence and personal responsibility, not to mention logic and basic math. It’s been proven before, in 2006 and last year, that they can’t win elections maintaining or adding to the welfare state.
By that token, perhaps Ms. Noonan has a point that the government will not be right-sized in our lifetimes. At the same time we teach our kids about our federalist system, though, we should also impart upon them the ideal to work toward.

Monday, May 20, 2019

Old De-Rusting Trick Works!

I was surprised.  Actually, floored is a better word.  But first, I have to do a Grandpa Simpson story for a while.

I ordered a lot of the metal for my new engine build last weekend and received all the pieces by last Friday.  The issue with buying small quantities of metals is like buying small quantities of radio parts or a couple of screws.  The dealers' default is to sell you more than you need and then stick steep shipping charges on top of that.  "Too much" is fine - to a point.  Everyone likes having some steel or aluminum around for repairing things or unexpected little projects; and in the words of Mae West, "too much of a good thing is wonderful".  We just always end up needing something like a one inch long piece of 3/16" diameter drill rod and the dealers want to sell you 36".  I'd be happy to get 12".  That's still more than I should need, but not excessively so.  I was able to get about a dozen pieces of various things like that from a place I'd heard of, but never shopped at, Hobby Metal Kits.  Fair prices, $10 flat shipping.

The flywheel finished dimensions are 3.750" diameter, and 1.125" thick at the hub.  Like this:

Note that it says the material is either cast iron or cold rolled steel (CRS).  So off I went to the metals dealers to find a piece of CRS that could yield this.  Some shopping around, first at Online Metals and then on eBay, showed that a 4" diameter by 2" long piece was going to cost about $55 with shipping.  Then I stumbled across an eBay seller offering a slightly smaller piece, 3-3/4" by 1-3/4", but made of D2 tool steel.  His price was $18.75, including shipping.  I figured the important part was that the flywheel have a density more like steel or iron than aluminum; the exact alloy doesn't really matter.  I got what's probably a better piece of steel for many uses at about a third of the price of the 1018 CRS.

When I received the package, there was a 5 lb 10 oz hunk of rust in the box.  That's an exaggeration, but not much of one.  There wasn't a single spot on the disk that wasn't rusty.
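As a sanity check, that weight is about what you'd predict for a steel cylinder of those dimensions.  Here's a quick sketch, assuming a typical handbook density for steel of about 0.283 lb per cubic inch:

```python
import math

# Nominal stock dimensions (inches) and a typical density for steel
diameter = 3.75   # 3-3/4" as listed by the seller
length = 1.75     # 1-3/4"
density = 0.283   # lb per cubic inch, a common handbook figure

volume = math.pi * (diameter / 2) ** 2 * length  # cylinder volume, cubic inches
weight = volume * density
pounds = int(weight)
ounces = (weight - pounds) * 16
print(f"{pounds} lb {ounces:.1f} oz")  # about 5 lb 7.5 oz
```

That's within a few ounces of the 5 lb 10 oz that arrived; sellers usually cut stock a little oversize.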

The thing is, the rust probably isn't an issue.  I'm going to skim the diameter of the disk down a little and take off much of its length, and if it's just superficial rust, it'll be in my shop vac within minutes of starting.

Over the years, I've read many "how to fix rusted tools" articles that involved soaking the tool in a bath of vinegar with added salt.  (Here's a more in-depth source.)  One of the voices in my head suggested I try this, just to see if it works.  It was such an impulse decision that I didn't even take a "Before" picture.  The disk was in a plastic leftover storage tub, covered with vinegar, by about noon Friday.   Naturally, I went by and looked into the clear tub a few times, and the vinegar went nearly black as the day progressed, so something was going on in there.  Saturday afternoon I took it out, gave it a light brushing with a Brillo pad and was shocked to find it looked like steel.  It looked etched, with a matte gray finish, but with the exception of some black marks that I could probably scrub away, the rust was gone.

This disk was the solid reddish-brown characteristic of rust, with a fair amount of black in the color, too.

Rust comes up as an issue all the time.  I'm sure we get more of it here with our humidity and salty sea breezes, but it's an issue in most places.  I can't tell you how many times I've come across this idea but dismissed it.  I waited until now to try it and it actually works.  I may be the only guy in America who dismissed the idea of soaking files or pliers or other tools in vinegar to de-rust them, but I doubt it.  Give it a try.

Edit 5/20 2147 EDT:  The bad writing fairy snuck in between editing and posting

Sunday, May 19, 2019

Radio Sunday #8 - Wrap Up

I've spent three hours this morning trying to envision what Radio Sunday #8 should be about without coming up with any real ideas.  That seems to be a pretty solid indication that the series is over, at least from my standpoint.

Mrs. Graybeard suggested something on antennas, but I've written many pieces about antennas over the years, many more than the first seven installments of this column.  Let me post a few links to some highlights.
For more information, see the list of Pages on the right sidebar.
Is the series really over?  I leave that up to you.  There are others who write pieces on tactical use of radios, and introductions to ham radio can be found all over online (I've done those here, too).  If there are other things you'd like to see covered, leave me a comment or send me an email at sigraybeard at gmail. 

Friday, May 17, 2019

Feds Announce Drastic Deregulation of Cherry Pies

There's a dangerous freeing of the market about to happen.  The Food and Drug Administration is about to deregulate frozen cherry pies, according to this summary on the Independent Institute.  I don't know if you're aware that they even had regulations on frozen cherry (and other) pies; but if not, you're probably not paying attention to the growing size of the Code of Federal Regulations and the fervor to regulate everything.

When President Trump took office, he promised to eliminate 75 to 80% of FDA regulations, which is clearly a tall hurdle.  Deregulating cherry pies is just one small, slightly comical step in that direction.  Comical because of the specificity of the rules and the questions that ensue.  Think "three reindeer rule" in cherry pies and you're there.
The FDA recently committed to deregulating the frozen cherry pie market. Specifically, the agency is re-examining current regulations dictating that frozen cherry pies are required to be at least 25 percent cherries by weight and that no more than 15 percent of these cherries may be blemished
An obvious question is "what does 'blemished cherry' mean?"  To quote the FDA,
Not more than 15 percent by count of the cherries in the pie are blemished with scab, hail injury, discoloration, scar tissue, or other abnormality. A cherry showing skin discoloration (other than scald) having an aggregate area exceeding that of a circle nine thirty-seconds of an inch in diameter is considered to be blemished. A cherry showing discoloration of any area but extending into the fruit tissue is also considered to be blemished.

Failure to meet these requirements means the product gets slapped with a label indicating it fell "Below standard in quality," along with notes about cherry content or blemished cherries.
Personally, I'm going to bet that if you're not a cherry grower, pie maker, or otherwise in the industry, you probably couldn't tell any one of those cherry blemishes from the others.  If you took a bite out of a cherry and it tasted off, you wouldn't care that it looked beautiful; likewise, if it was discolored or had a scar but tasted wonderful, you wouldn't care much either.  I'm not quite sure I'd care whether it had an aggregate discolored area of 0.2813" diameter (9/32") or 0.3125" diameter (10/32" => 5/16").
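For the record, those decimal equivalents are easy to verify; a trivial sketch using Python's fractions module:

```python
from fractions import Fraction

blemish_limit = Fraction(9, 32)  # the FDA's maximum discoloration diameter
one_more = Fraction(10, 32)      # one thirty-second of an inch larger

print(float(blemish_limit))  # 0.28125
print(one_more)              # 5/16 - Fraction reduces it automatically
print(float(one_more))       # 0.3125
```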

But that's not all.  Everything has to be defined.  For example, "Frozen" doesn't mean baked and then frozen; it's much more complicated than that. 
The agency has been regulating frozen pies since 1977. In that time, it has developed numerous additional regulations specifying what makes cherries blemished, what counts as frozen, and how much crust is needed to cover the pie. Frozen cherry pies are also the only fruit pies which must meet these standards.
I'm sure there are going to be some nanny state lovers cowering in fear, afraid they'll get 24% cherries instead of the regulation 25%, never considering that the market might encourage cherry pie makers to boast that their pies have the Most Cherries or the Best Cherries.  Certainly, when there are standards like this on foods, there is some good; the question is always whether the regulations do more harm than good.  Ordinarily, the big companies are in favor of the regulations.  They propose regulations and know how to live with them.  That's not the case with frozen cherry pie regulation.
Even bakers seeking political favors (yes, they exist) are eager to see such outlandish regulations eliminated. As one New York Post article reports, Lee Sanders of the American Bakers Association is “hopeful the cherry pie standard will finally be revoked, but that it would not make a big difference for the industry.” It’s not every day you find a regulation so poorly executed that not even special interest groups support them.

("Oven Fresh"?  What do you mean by that??)
Regardless of how the regulation passed, and how long overdue removing it is, we should be happy it will soon be gone. Maybe next the FDA will loosen its requirements on the size of the holes in Swiss cheese.

Thursday, May 16, 2019

Bezos, Blue Origin Reveal Their Moon Lander

While I've covered the commercial space flights of SpaceX, I've had less to say about our neighbor up the road, Blue Origin.  (While out around the neighborhoods, I recall seeing a guy in a Blue Origin tee-shirt.)  Last week, Jeff Bezos revealed his plans and vision for his rocket company Blue Origin, according to a large piece in Ars Technica.
The world's richest person, Jeff Bezos, unveiled his sweeping vision for humanity on Thursday afternoon in a Washington D.C. ballroom.  With the lights dimmed, Bezos spoke on stage for an hour, outlining plans for his rocket company, Blue Origin, and how it will pave the way to space for future generations.

We have seen bits and pieces of Bezos' vision to use the resources of space to save Earth and make it a garden for humans before. But this is the first time he has stitched it together in such a comprehensive and radical narrative, starting with reusable rockets and ending with gargantuan, cylindrical habitats in space where millions of people could live. This was the moment when Bezos finally pulled back the curtain, in totality, to reveal his true ambitions for spaceflight. This is where he would like to see future generations one day live.
The highlight of the talk (from my standpoint) is that Bezos unveiled their planned lunar lander called (what else?) Blue Moon.   The base configuration is capable of delivering up to 3.6 tons to the lunar surface.  It's envisioned as a lander - that means one way only - but there's also a configuration that has "stretch tanks," and could carry up to 6.5 tons to the lunar surface.  This would be large enough for a crewed ascent vehicle, pictured here. They just say that will be "built by another company".

Ars reports that Blue Origin is being funded by Bezos personally to the tune of about $1 Billion per year.  They point out that he didn't say whether he would fund the development of Blue Moon without NASA contracts.

The rest of the talk focused on Bezos' vision for the future.  Everyone has probably heard that his (presumably) friendly rival, Elon Musk at SpaceX, envisions colonizing Mars.  Musk wants to terraform Mars: turn Mars green and vibrant to make humanity a multi-planet species and provide a backup plan in case of calamity on Earth.  Bezos wants to preserve Earth at all costs, saying "there is no Plan B."

Instead of moving off to Mars, Bezos is an advocate of colonies in space.  Not International Space Station style; not even the rotating wheel-in-space from the movie 2001.  Those are thousands of times too small.  Instead, he imagines millions of people living in permanent colonies in space - about a million people per colony.
Other worlds in the Solar System lack Earth's atmosphere and gravity. At most, they could support perhaps a few billion people, Bezos said. The answer is not other planets or moons, he said, but rather artificial worlds or colonies in space known as O'Neill cylinders.

These are named for their creator, Gerard O'Neill, who was a professor at Princeton University where Bezos attended college in the early 1980s. In his book The High Frontier, O'Neill popularized the idea of free-floating, cylindrical space colonies that could have access to ample solar energy. Bezos was hooked then and became president of the campus chapter of Students for the Exploration and Development of Space.

And he is still hooked today, imagining up to 1 million humans living in each cylinder built from asteroid materials and other space resources. Each environment would be climate controlled, with cities, farms, mountains, or beaches. "This is Maui on its best day all year long," Bezos said. "No rain. No earthquakes. People are going to want to live here." And when they need to, they could easily fly back to Earth.
I read the first book I linked to (Colonies in Space) and it was also about O'Neill's ideas of permanent civilization in space.  It's a bold vision, if a little short of colonizing another world.  As he points out, though, without access to materials from space - metals from asteroids, water, and fuels that weren't hauled into orbit from the Earth - neither colonization effort is likely to go far.  While the piece yesterday talked about massive amounts of petroleum here on Earth, the first molecule of oil on Mars has yet to be found.  (Assuming it's there.)  It goes without saying that none of this goes anywhere without low cost access to space, something that, to his credit, Bezos has been pursuing all along, as has Musk.
Bezos said he believes that—if this generation builds the infrastructure needed to enable humans to get into space and develop an economy there—future generations will pick up the ball and run. "People are so creative once they're unleashed," Bezos said.

One of five artist's conceptions of the kind of O'Neill cylinder colonies Blue Origin wants to see built. 

Wednesday, May 15, 2019

Why I've Never Been A Big Believer in "Peak Oil"

I've never been a big believer in the commonly expressed vision of Peak Oil (there's reasonable background on Peak Oil in the middle piece here).  If you're unfamiliar with the concept, it starts with the very reasonable observation that the earth isn't infinite, so there's not an unlimited amount of oil to use.  Then it predicts that once some quantity of oil has been pumped out of the ground, production drops precipitously and dark ages happen.  The timing of this peak somehow always seems to be "real soon now" and has been from the earliest Peak Oil proponents in the 1950s until much more recent predictions.  The thing is, like most predictions of the future, it has been phenomenally wrong.  This graph, from the US Energy Information Administration last year, sums it up well.

In 1980, we were forecast to have a 27 year supply of oil.  In 2017, after 37 years, instead of there being no oil for the last decade, the supply expanded to 46 more years.  One guy who has studied the Permian Basin in the Southwest US says it's totally reasonable to consider it a permanent source.  Essentially infinite.  Of course, what happened was that new techniques were developed to get oil out of the ground and that increased the amount that was recoverable.  All because the free market was allowed to function properly.  
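The arithmetic behind that claim is worth spelling out; a quick sketch using the years quoted above:

```python
# Years-of-supply forecasts as quoted above
forecast_year = 1980
years_of_supply = 27

predicted_exhaustion = forecast_year + years_of_supply
print(predicted_exhaustion)         # 2007 - when the oil was supposed to run out
print(2017 - predicted_exhaustion)  # 10 - the "last decade" we supposedly had no oil
print(2017 + 46)                    # 2063 - the horizon of the 2017 forecast
```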

All this is an introduction to the article I clipped that graph from, "Why Resources Aren't 'Natural' and Will Never Run Out" over on Watts Up With That, reprinted in turn from The Washington Times under the same title.  The version on WUWT has the cool graphics I clipped. 

The author starts with some background information: the World Wildlife Fund and a few other like-minded organizations declared (because reasons, I'm sure) that on August 1st the world will hit "Earth Overshoot Day", the date by which the world will have used more resources than it can produce in one year.  Exactly where the other 5 months' worth of resources we'll use will come from isn't specifically stated; there must be really big warehouses somewhere.  No, wait, if we're using resources faster than they can be mined, where would the resources to stock the warehouses come from?  I know, don't start getting all rational on them...

Anyway, this leads to a quote that is even more spectacularly stupid than that one.
Margaret Beckett, UK Environment Secretary pointed out in 2006, “It is a stark and arresting fact that, since the middle of the 20th century, humankind has consumed more natural resources than in all previous human history.”
I wonder if it ever occurred to Ms. Beckett that this is because more people have been alive since the middle of the 20th century than in all the time before it?  Without massive plagues or Meteors of Death, the population is always increasing, and it's virtually always true to say that more people are alive at this very moment than were ever alive before.  With more people, we'd need to cut the usage per capita massively to use fewer resources than even 50 years ago.

Since commodities like metals or oil are freely traded, those prices can be tracked.  It goes without saying that if we're running out of any of those, their prices should be going up, right?  And they aren't.
The 1972 international best-selling book Limits to Growth predicted humanity would run out of aluminum by 2027, copper by 2020, gold by 2001, lead by 2036, mercury by 2013, silver by 2014, and zinc by 2022. But today, none of these metals is in historically short supply.

Global production of industrial metals soared from 1960-2014. Annual production levels were up: aluminum (996 percent), copper (417 percent), iron ore (531 percent), lead (343 percent), nickel (455 percent), tin (66 percent), and zinc (348 percent). At the same time, the World Bank industrial metal real price index of these seven metals was flat, down a little more than one percent by 2015. World reserves of copper, iron ore, lead, and zinc stand near all-time highs. Prices are not rising as predicted by resource-depletion pessimists.
The predictions were spectacularly wrong.  Of the dates predicted in Limits to Growth, we've gone past the dates for gold, mercury, and silver.  The plots below show these metals are still being mined, and the growth of production shows no signs of slowing down (right plot).  Aluminum production is hard to read thanks to the color they chose, but instead of declining, as you'd expect if we were on the precipice of running out, it has gone up tenfold since 1960 and still looks to be increasing.

The article produces a stunning quote to give some perspective on "limited resources".
Most people don’t realize the vast quantity of raw materials available on our planet. Canadian geologist David Brooks estimated that a single average cubic mile of Earth’s crust contains a billion tons of aluminum (from bauxite), over 500 million tons of iron, a million tons of zinc and 600,000 tons of copper.

There are 57 million such square miles of Earth’s land surface and almost triple that area under the surface of the oceans. Of course, only a tiny fraction of metals in Earth’s crust is economically recoverable with today’s technology. Nevertheless, Earth’s supply of raw materials is finite, but vast.  [Bold added: SiG]
Mining and minerals are one of my interests, and we've been to mines that allow tourists.  I've stood in a field of discarded iron ore - the highest grade ore found in the upper peninsula of Michigan.  Tons are lying there and have been for decades.  It's a well known place.  Do you think that would be there if we were short of iron ore?

Last November, I did a piece on the 50th birthday of the book The Population Bomb.  That was the book that claimed that population growth would result in resource depletion and the starvation of hundreds of millions of people.  I recall conversations about "hamburger wars" as people fought to the death for dwindling supplies of food. Millions would starve to death in the 1970s.

It's always the same mistake with these guys.  They take some trends, make a linear extrapolation and predict a doom.  They never consider that history isn't linear.  They never seem to grasp that human ingenuity is the most powerful resource on Earth.  Time after time, humanity has faced environmental problems or shortages and figured out ways around them.
The history of the human race is a history of using that ingenuity to improvise, adapt, and overcome.  It's not a smooth continuum but things get better.  In the long term, that's always true.

To quote the British historian Thomas Babington Macaulay, “On what principle is it that with nothing but improvement behind us, we are to expect nothing but deterioration before us?”

Tuesday, May 14, 2019

Idaho Embarks on Enormous Deregulation Experiment

If this is the post of the day, look to the right and see the graphic of the sunset with my message to Deregulate.  (If it's not the post of the day, look at the top of the right sidebar). 
  1. Sunset all new laws.  That means all new Federal Laws get an expiration date.  
  2. Throw out all useless old laws, like how the Trump administration has thrown out 22 regulations for every new one they've added.
According to The Library of Economics and Liberty, and quoting from the Mercatus Center at George Mason University, Idaho has just thrown out all of their existing regulations.  All.  Admittedly, it wasn't planned and nobody seems to know quite what it means, but let me just quote from the article.
Something rather remarkable just happened in Idaho. The state legislature opted to—in essence—repeal the entire state regulatory code. The cause may have been dysfunction across legislative chambers, but the result is serendipitous. A new governor is presented with an unprecedented opportunity to repeal an outdated and burdensome regulatory code and replace it with a more streamlined and sensible set of rules. Other states should be paying close attention.

Instead, the legislature wrapped up an acrimonious session in April without passing a rule-reauthorization bill. As a result, come July 1, some 8,200 pages of regulations containing 736 chapters of state rules will expire. Any rules the governor opts to keep will have to be implemented as emergency regulations, and the legislature will consider them anew when it returns next January.
What does this mean?  Do STOP signs cease to be valid?  Is the state going to be overrun by inadequately licensed hairdressers or fingernail technicians, or some other specialty regulated into a tiny box by an established group of practitioners keeping newcomers out?  The authors of the articles say they don't know.

You might have come across the idea of a “regulatory reset.”  It would be like this, except deliberately crafted rather than the result of legislative dysfunction.  The idea is that the government eliminates all regulations and then brings back the ones it decides it wants.  Presumably, we would end up with substantially fewer regulations.  In practice, when deregulation is tried, regulatory agencies (think FCC, FDA, HEW, EPA and the alphabet soup of executive branch agencies) are reluctant to throw out sunsetted regulations.  It threatens their existence!  To borrow a quote from the Mercatus Center article:
The Idaho case also highlights the power of sunset provisions—or automatic expiration dates built into laws or regulations. In the past, academic research has found that sunset provisions are sometimes ineffective. Legislatures and agencies often readopt regulations without much thought.  To work well, sunsets may need to be structured such that large swaths of rules expire simultaneously, with reauthorization responsibilities falling to the legislature rather than regulators. Sunsets are perhaps most useful when rules are allowed to lapse and then forced back through the rulemaking process all over again. That way they can be subjected to public scrutiny, cost-benefit analysis, and perhaps even court challenges.  [Bold added: SiG]
The regulatory state hates this idea.  If the legislature is debating putting old rules back they're not working on new regulations.  I think that's a feature, not a bug.

It goes without saying that being in central Florida I'm just about as far from Idaho as one can get in the continental US and still be in the CONUS.  I imagine some of you are from Idaho and might have more insight.  Comments from anyone familiar with the Idaho goings-on are appreciated. 

Monday, May 13, 2019

A Change of Plans; Engine Plans, That Is

A couple of weeks ago, I told the story of how I went down every trail I could find in an attempt to find plans for an engine I wanted to build.  It's an inline four cylinder from a 19-teens tractor called a Holt 75.  I thought it looked cool - I have a video of one running.  This is the picture I took of one in 2015 that got me searching for the plans to build one. 

The conclusion of that story is that the plans were copyrighted by a company called Coles Power Models that's no longer around.  I concluded by saying that I ran across references to another inline four cylinder engine called a Panther Pup (video from the NAMES show in Detroit at the end of April) and bought the plans from Little Machine Shop.

In the intervening weeks, a chance at the plans for the Holt 75 has resurfaced.  It's a bit of a story, but a model maker on the Home Model Engine Machinist's Forum bought the prints some years ago and built one.  Before he started he found other builders and asked them what the trouble spots were in the plans and then he improved on them.  He has gotten approval to sell his upgraded prints and is working on improvements to the engine.  The model pictured above uses castings for the intake and exhaust manifolds, at least, and I believe these new plans will replace them with manifolds made entirely from bar stock.  He expects to have his prints available "August to September", which I read as "next fall". 

Given that, I decided to put the Panther Pup plans aside, and instead build the Webster that I was originally considering for my first internal combustion engine.  I intend to follow the Webster with the Holt 75.  Do I build the Panther Pup or sell the booklet?  Don't know.  No harm keeping it for now.

Between last night and today I've begun the process of sourcing parts to build a Webster.  I haven't bought everything, but between what I have and what's on order, I should be able to get a good start on it before I need to get more materials.  

YouTuber PatPending's Webster. 

The Webster is the most often recommended design for a First IC Engine.  Which isn't to say that I don't expect to make mistakes, just that the design is supposed to be fairly simple and forgiving.  I actually could have made my Oh-Fish-Ul first part for the Webster today; it's just cutting a piece of 1/2" steel rod to length for the crankshaft.

Sunday, May 12, 2019

#7 – So Many Radios – What Determines a Good One?

With so many types of radios, so many radios at different price points, how do we know what's a good one?

As I've said many times in the history of this blog, the answer to that question is another question: “good for what?”  There are many reasons for having a radio system, and the definition of what's needed isn't always the same.  A radio that's intended to be battery powered and carried for long periods is different from one intended for a fixed installation with AC power; and that's before we even get to the fact that radios for different radio services are different from each other.

I had a conversation with a guy who claimed to be a salesman for Motorola radios in the “old days,” when their HT series of radios was new.  These radios eventually became the de facto standard radio for police, fire and public service, but that outcome wasn't guaranteed when they were starting out.  He said a police radio buyer told him they needed the utmost toughness and reliability from their radios because the radios had a hard life.  The salesman took the radio, not knowing if it would pass the test or not, and threw it against a wall.  He picked up the radio and it still worked.  The sale followed quickly.  My guess is that all of us would like a tough radio for the time we drop one or something sends it flying, but we might not be willing to pay two or three times the price of other radios that aren't as tough.

As a contrast to that, one of the reasons I bought my Yaesu VX-6R handie-talkie is that it's rated water resistant to an industry specification, JIS7, which rates it for submersion for up to 30 minutes at depths of up to three feet.  At ham shows, however, they'll submerse the radio in a fish tank less than a foot deep with wires to an external speaker, and leave it underwater all day.  No, I don't plan to use the radio underwater, but getting caught in the rain is pretty routine, and having a radio I don't need to worry about is a Good Thing.

Very early on in this series, I talked about the big three factors in receiver design: sensitivity, the sheer ability to hear any signal above the noise by enough margin; selectivity, the ability to receive that signal while rejecting much stronger signals off the desired frequency and potentially very close to it; and signal strength handling, often called dynamic range, the ability to handle multiple nearby strong signals without the receiver's performance degrading.  Let's go through each of these. 

Sensitivity can be stated as the signal level required to produce a given signal to noise ratio for the listener.  An immediate problem with this definition is that the signal level available to the receiver varies with a handful of factors.  The biggest variable is the frequency we're tuning to.  Noise from the sun, the atmosphere and the rest of the “universe” is stronger at lower frequencies than at higher frequencies.  That means a radio working at lower frequencies can't usefully detect signals as weak as a radio working at VHF or higher can - not because of what it can hear from a signal generator, but because of the environment.

We need to make a short excursion into the ideas of noise and noise figure – two things that are different but related.  Noise is pretty simple.  All components generate electrical noise at a level determined by their resistance, temperature and bandwidth.  The RMS noise voltage is given by:

E = √(4kTBR)

Where k is Boltzmann’s constant as known in Physics classes, T is the absolute temperature in kelvin, R is the resistance in ohms, and B is the bandwidth in Hz.

For general receiver design, what matters more is the available noise power, kTB, which doesn't depend on the resistance at all.  At room temperature that’s -174 dBm per Hz of bandwidth (slightly off the exact value, but the number we all use for rough calculations).  A 50 ohm resistor lying on your workbench makes this much noise power available; a 1000 ohm resistor develops more noise voltage - about 13 dB more, since voltage goes as the square root of resistance - but delivers the same available power to a matched load.

It’s easy to turn this number into the noise in any receiver bandwidth by adding 10 log (BW).  The noise bandwidth is approximately the passband or 3 dB BW of the circuit you’re considering.  The noise power in a 1 MHz BW is then -174 dBm + 10 log(1,000,000), equal to -174+60 or -114 dBm.
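
That arithmetic is easy to check in a few lines of code.  Here's a sketch in Python; the constants are standard physics, and the 50 ohm / 1 MHz numbers are the ones used in the text:

```python
import math

K = 1.380649e-23   # Boltzmann's constant, J/K
T_ROOM = 290.0     # standard "room temperature" in kelvin

def noise_voltage_rms(r_ohms, bw_hz, t_k=T_ROOM):
    """Open-circuit RMS thermal noise voltage of a resistor: sqrt(4kTBR)."""
    return math.sqrt(4 * K * t_k * bw_hz * r_ohms)

def noise_floor_dbm(bw_hz, t_k=T_ROOM):
    """Available thermal noise power kTB in dBm; independent of resistance."""
    return 10 * math.log10(K * t_k * bw_hz / 1e-3)

# A 50 ohm resistor in a 1 MHz bandwidth:
print(noise_voltage_rms(50, 1e6))  # about 0.9 microvolts RMS
print(noise_floor_dbm(1e6))        # about -114 dBm
```

The second print is the -114 dBm number worked out above; the slight difference from a round -114.0 is the "slightly off the real value" rounding in the -174 dBm/Hz figure.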

Noise figure is the ratio of the Signal to Noise Ratio (SNR) at the input of a network to the SNR at its output:

NF = SNRin / SNRout

A different way of saying that is that NF is the degradation of the SNR caused by going through the circuit.  A pad, for example, has a lower SNR at its output than at its input because of its loss.  The NF of a pad is its attenuation, the reciprocal of its gain.  A 3 dB pad, for example, has a gain of -3 dB and a noise figure of +3 dB.  This is generally true for any passive circuit.

What this means is that we can determine what signal level gives us a required SNR by doing some arithmetic.  This is usually called the Minimum Discernible Signal or MDS.

MDS = -174 dBm/Hz + 10 Log (BW in Hz) + Desired SNR + NF

For example, in that previously mentioned example of a 1 MHz bandwidth, if we want a 10 dB SNR, we know it's at least 10 dB higher than that -114 dBm noise in that BW, or -104 dBm.  We're not done yet; -104 isn't our “final answer”; that answer depends on how much that -104 dBm is degraded by the receiver Noise Figure.  We'll call it 4 dB for a convenient number:

MDS = -174 dBm/Hz + 10 Log (1,000,000 Hz) + 10 dB SNR + 4 dB NF = -100 dBm
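
The MDS formula generalizes directly.  A small sketch, using the bandwidth, SNR and NF values from the example above:

```python
import math

def mds_dbm(bw_hz, snr_db, nf_db):
    """Minimum Discernible Signal: thermal floor plus required SNR plus NF."""
    return -174.0 + 10 * math.log10(bw_hz) + snr_db + nf_db

# The example from the text: 1 MHz BW, 10 dB SNR, 4 dB noise figure
print(mds_dbm(1e6, 10, 4))  # -100.0 dBm
```

Narrow the bandwidth and the MDS improves dB-for-dB, which is why CW and digital modes with narrow filters can dig out signals SSB can't.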

The problem is that this calculated number isn't very meaningful at lower frequencies.  This chart shows some typical noise levels at lower frequencies (below 1000 MHz), in dB above the kTB floor we calculated above.

Look at 10 MHz, for example, right at the middle of the horizontal axis, and read going up.  There are many different possible noise levels; the “quiet rural” noise floor is never reached because the atmospheric noise, day or night, is considerably higher.  If you're in that “quiet rural” place during the day, the noise could be 35 dB higher than the calculated -114 dBm, or -79 dBm.  It gets worse from there.  A rural environment will be 40 dB above kTB, a residential neighborhood close to 50 dB noisier, and a business district could be 60 dB noisier than kTB.

A result of this is that if the design targets the lowest points of these curves, the environment will likely never be as quiet as the receiver, so the environment, not the receiver, sets the noise floor.  For an HF radio, for example, it's virtually never necessary to have a noise figure below about 10 to 12 dB, and therefore virtually never necessary to choose a radio based on sensitivity claims.  For a VHF/UHF HT, those numbers are worth looking at, and sensitivity can be verified fairly easily with test equipment that isn't all that expensive or hard to find.

As an aside, last year I did a couple of posts on a small antenna project I was building specifically for 10 MHz and below.  These antennas are viable at 10 MHz and below because, while they receive less signal and less noise, their directivity might allow the user to position the antenna to improve the SNR slightly.  It's only possible because the strength of the signal required to overcome the noise at these frequencies means we have signal to throw out in an effort to improve the SNR.

Unfortunately, selectivity doesn't offer us any fundamental relationships from the hard world of physics that make our work easier.  The requirements are really determined by the type of signals we want to receive, and a multi-mode radio can have large sections devoted to holding all the filters it might need.  You might recall that I said selectivity is bought by the cubic inch, and that's nowhere as evident as in a multi-band, multi-mode receiver.

Selectivity is entirely determined by the filters – largely either in the IF of an analog radio or in the DSP of a software radio.  It's a measurement of how much off-channel signals are attenuated (reduced).

Selectivity is often specified by not just the width of the passband (the lowest loss part of the filter) but by the ratio of the 60 dB loss bandwidth to the 6 dB loss bandwidth.  A theoretical filter (which digital filters can approach most closely) has a 60/6 shape factor of 1 – the two bandwidths are exactly the same, something referred to as a “brick wall” filter because the attenuation shape is the rectangular outline of a brick.  Practically, shape factors more like 2:1 or 3:1 are common in analog radios.  A 2:1 shape factor means the 60 dB rejection point of the filter falls in the adjacent channel, so if there aren't strong signals at that offset and nothing there to reject, 2:1 is fine.  Amateur use isn't as strictly controlled as commercial use, and therefore might expose the user to interference in the adjacent channel.
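
To make the shape factor idea concrete, here's a toy calculation against an ideal n-pole Butterworth response.  This is a hypothetical filter for illustration, not any particular radio's IF; the point is how much filter it takes to approach the 2:1 territory of a good analog design:

```python
def butterworth_bw(fc_hz, n_poles, atten_db):
    """One-sided bandwidth where an ideal n-pole Butterworth is atten_db down."""
    return fc_hz * (10 ** (atten_db / 10) - 1) ** (1 / (2 * n_poles))

def shape_factor(n_poles):
    """Classic 60 dB / 6 dB shape factor; independent of center frequency."""
    return butterworth_bw(1.0, n_poles, 60) / butterworth_bw(1.0, n_poles, 6)

for n in (2, 4, 8):
    print(n, "poles:", round(shape_factor(n), 2))
# roughly 24:1 at 2 poles, 5:1 at 4 poles, 2.2:1 at 8 poles
```

Each doubling of the pole count sharpens the skirts considerably, but in analog hardware every pole is another resonator to buy, tune and fit in the box – selectivity by the cubic inch.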

Radios for commercial service will have requirements for the IF BW that specify how far from the desired channel an interfering signal is applied and how much it must be rejected.  For example, a spec may say to set a signal generator on the desired channel, establish a 10 dB SNR, then move the generator to the specified offsets and increase its output to verify the SNR doesn't degrade unless the signal is more than 60 dB higher than it was on channel.  Most services will also have a set of wide bandwidth tests, such as rejection of the image frequency, or of any other services the radio is expected to encounter in service. 

The strong signal handling, or dynamic range, properties of the design are very much under the control of the designer.  In commercial services, there are requirements that new designs will be tested for compliance to, while in amateur and more casual services, there may be little or no mention of it.

While selectivity is bought by the cubic inch, dynamic range is bought by the milliamp.  The basic idea is that the radio distorts less if the signal being amplified in any stage is a small portion of the “idling” or bias conditions in the stage. 

When two signals are applied to an amplifier, say two strong signals in the band being monitored, the amplifier can multiply them by each other if it becomes nonlinear.  Since multiplication is the same as amplitude modulation, the two signals are said to intermodulate each other, producing Intermodulation Distortion, commonly called IMD.  The most troublesome products are generally closest to the two signals.  They're called “third order IMD” because of the terms in the equations below the graphic (two times the first frequency minus the second frequency).  Third order IMD is seen in the left picture here: the two smaller signals on either side of the two bigger signals.  You can imagine that if you're trying to listen to something on the frequency of one of those smaller signals, when the IMD is present it might wipe out whatever you're listening to.

One of the reasons these are troublesome is that they increase at twice the rate of the desired signals.  This leads to a condition, seen on the right, where the increasing intermodulation products (IM Power) will equal the power of the desired signals.  This is called the Intercept Point (IP) or, more specifically, the third order Intercept Point, IP3.
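
The frequency arithmetic and the 3:1 slope are easy to sketch.  The tone frequencies and the IP3 value below are made-up numbers purely for illustration:

```python
def imd3_frequencies(f1_hz, f2_hz):
    """The two third-order products closest to a pair of tones."""
    return (2 * f1_hz - f2_hz, 2 * f2_hz - f1_hz)

def imd3_level_dbm(tone_dbm, ip3_dbm):
    """Third-order product level: rises 3 dB for every 1 dB of tone power."""
    return 3 * tone_dbm - 2 * ip3_dbm

# Two tones at 14.1 and 14.2 MHz land products on 14.0 and 14.3 MHz:
print(imd3_frequencies(14.1e6, 14.2e6))
# Two -30 dBm tones through a stage with a +10 dBm output IP3:
print(imd3_level_dbm(-30, 10))  # -110 dBm
```

Raise the tones by 1 dB and the products come up 3 dB, so they gain 2 dB on the tones – which is exactly why the product power eventually catches the tone power at the intercept point.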

The general rule for this number is “the higher, the better.”  The cost, as briefly mentioned, is more current consumption, which means lower battery life in a portable radio and more heat in any radio.

The dynamic range of a receiver is rarely specified except in high end receivers, and there are many ways to measure it.  Since the off-channel test signals might be attenuated by filters inside the radio, typical tests set different amplitudes on the two off-frequency channels, and it's tricky to measure in the lab. 

Saturday, May 11, 2019

A Different View On Splitting Facebook, Deplatforming And Shutting Down Conservative Speech

Two days ago, the news was that Facebook's co-founder, Chris Hughes, says it's time for the government to split them up.  I just think the real agenda might not be what it appears to be.

It struck me when I heard that Zuckerberg had asked for more government oversight for Facebook and the internet in general (overview) that the play is to get more power and control for his company.  Getting government involved absolves Facebook of any responsibility.  "We were just following orders", to borrow a phrase.

In legalese there's a difference between being a publisher and a platform.  A platform simply relays what anybody posts and isn't expected to be able to monitor every word.  A publisher decides what gets published so they're responsible for anything that they publish.  Think of a platform as a bulletin board that gets plastered with notices for babysitters, pets, pet sitters, jobs and so on.  A publisher is the news media companies you know.  Facebook, Twitter, YouTube and the rest have been platforms, but are being pushed into being publishers by legal rulings around the world.  That makes them responsible for all the content that shows up there.

Facebook is the 900 pound gorilla in the room of social media.  They have essentially zero competition, having swamped out the early approaches (MySpace, and so on) and bought up any place that threatened to be competition.  But just as they replaced the predecessors, and just like all High Tech companies, they're in constant fear of the Next Big Thing that pushes them into irrelevance.  They're afraid of a smart startup. 

So let's say Chris Hughes gets his way and Facebook gets split up and made into a utility.  That will ensure that they can never get any competition.  It's the same viewpoint that made all the large Internet companies back Net Neutrality: they already have lawyers on the payroll, so it's less of a burden for them to comply with the rules than it would be for a small startup company.  The startups would have to hire lawyers.
Here’s what was really going on with net neutrality. The incumbent rulers of the world’s most exciting technology decided to lock down the prevailing market conditions to protect themselves against rising upstarts in a fast-changing market. The imposition of a rule against throttling content or using the market price system to allocate bandwidth resources protects against innovations that would disrupt the status quo.
Assuming the big ones, Facebook, Twitter, and Google (YouTube) are begging the Feds to control them so that they become untouchable monopolies (in perpetuity) how could they force that?  How about making a bunch of egregious decisions about whom and what to ban?  Create lots of public outrage to make people say the Feds "have got to do something!"  Oh, it would be deliciously ironic if they could get those icky, deplorable "small government conservatives" to beg for government control, so let's make sure we make lots of them mad.  And if the Feds don't do anything, it's not like there's some place else for those deplorables to go.

(900 lb. gorilla in the room)

Just a concept that's been running in my mind.  I'm not saying it's definitely what's going on.  I'm asking how what's going on would be different if it was what they were doing.

Friday, May 10, 2019

Florida GOP Seems to Have Done Us a Good One

From The Truth About Guns, I learn that the Florida GOP did some work behind the scenes to block a developing plan to put an assault weapons ban on the ballot as a ballot initiative.  I wrote about this back in February, but the bottom line is that the usual anti-gun groups had formed a group called BAWN - Ban Assault Weapons NOW!  (with the exclamation point and all caps) that was primed to get a ballot initiative onto the 2020 ballots.  As always, the wording would ban every semiautomatic from a Ruger 10/22 to a Barrett M107A1 along with everything between and beyond.

You will note that BAWN is part of Americans for Gun Safety Now, a wholly-owned subsidiary of Michael Bloomberg Lobbying Industries, Inc. (yes, I made that up).  Back to TTAG for some important details:
There were two bills originally submitted this legislative session, HB 7111 and SB 7096. They were mirror bills that would require paid petition gatherers to register with the Secretary of State and to attest that he or she is a Florida resident for a specified period before obtaining signatures on petition forms. The bills also required the name of the sponsor of an initiative to appear on the ballot with the percentage of donations received from certain in-state donors. Finally, the bills prohibited compensation for initiative petition gatherers based on the number of signatures gathered.

If passed, the law would have gone into effect immediately for any initiative meant for the 2020 ballot. However both bills died in committee on May 3rd.

Yet on the last day of session, the very day that both bills died, another avenue of advancement presented itself. HB 5, a local discretionary sales surtax bill. Opponents of the AWB tacked the language on as an amendment and HB 5 which passed both the House and Senate and is now on its way to Gov. DeSantis’ desk.
Due to my naturally skeptical view of anything I read and pass on, I checked this with the legislature's websites and found that HB 5 did, indeed, pass in the House and the Senate.  I can't find that it has been signed by Governor DeSantis, so it apparently isn't law yet.

The way I read the requirements, it doesn't stop BAWN, but it raises their cost of entry.  It will require them to actually be incorporated in Florida and to employ petition signature gatherers rather than paying by the signature (which incentivizes fraud); it imposes penalties for signature fraud, and does a few more things that can't stop an Astroturf group like BAWN but can make Bloomberg pay a bit more.  “A billion here, a billion there, and pretty soon you're talking real money” (supposedly).

Thursday, May 9, 2019

Busy today, so a diversion.  Big or small, cats are cats are cats

and cats love boxes

Wednesday, May 8, 2019

Mounting and Polishing

As Mrs. Graybeard continues to heal, I get to spend more time back in the shop.  I haven't broken dirt (broken metal?) for the Panther Pup, yet; haven't even ordered any of the materials, yet.  I'm going back and forth between the plans and a book I bought to help with design choices.  But, yeah, I need to get off the starting blocks on this. 

A project I've had going for a while was to turn an old walnut plaque into a base for my Duclos Flame Eater engine.  The walnut was half of an award plaque from the late 80s/early 90s, when I was working for Major Southeast Defense Contractor - they had a program (as so many do) of motivational awards.  I have a few of these, and I'm pretty sure I haven't found them all, so more potential boards to mount future projects.  I'm not sure where my patent plaque is these days, but it will probably be last. 

Several days ago, I was able to cut the plaque in half.  It had a cove routed into the edges of the board, so I needed to route the new, square edge to match.  It's not too inaccurate to say I have absolutely zero experience with walnut so that's my excuse for ripping some good sized splinters off the corners.  Good thing I cut a board in half, because now I have a smaller board for some future, smaller engine while using the second piece for the full sized piece I needed. 

Roughly speaking, I did the sawing and routing Sunday; sanded and did some filing to clean up the board on Monday; and put three coats of polyurethane varnish on it yesterday.  Today I mounted the engine and took this pic on my coffee cart.  Didn't quite push the junk far enough out of the way, but y'all are friends here.  You can tolerate it.

The aluminum looks pretty dull, so some polishing is in order.  If I can polish the brass acorn nuts holding the cylinder to the base, that will look good, as well as the brass accent of the four mounting screws in the corners. 

Tuesday, May 7, 2019

Deep Into Solar Minimum, The Sun Surprises

The news around the ham radio circles I hang out in is that an active sunspot region has rotated to the Earth-facing side and is crackling with activity.  Sunspot group 2740, part of current cycle 24, was visible a month ago and irradiated Earth with loud shortwave radio bursts before rotating out of view.

After nearly two weeks transiting the far side of the sun, it's back and doing it again.
"Yesterday, May 6th, was an incredible day of strong solar radio bursts including one of the strongest of the current solar cycle," reports Thomas Ashcraft who recorded the outburst with a shortwave radio telescope in New Mexico.
Click here to listen to the noise depicted in this time/frequency plot.  No, seriously.

Ashcraft adds
"This one really rips," he says. "I recommend listening with headphones. It is a stereo recording with 20 MHz in one channel and 25 MHz in the other."
"In the dynamic spectrum, note the 'feathery' upward drifting radio emissions," he says. "I've never seen anything like that--not even during solar maximum. This is auspicious and rare activity to be happening during the deepest time of solar minimum."
I'm going to go out on a limb and say that most of you didn't know that radio astronomy could be a hobby, with amateur radio astronomers who listen to this noise for fun.  It doesn't require a giant, steerable dish antenna, either.  The sun is a constant source of noise over wide chunks of the radio spectrum, and sometimes it does things a bit more dramatic than others.  Jupiter is a commonly heard source of noise in the High Frequency spectrum, common enough that commercial home radio telescopes are a thing.  These are named after a NASA education project: Radio JOVE.

Getting back to the original topic - the Sun surprising observers by doing something this observer (Thomas Ashcraft) has never seen before, at the beginning of what looks to be a deep solar minimum - this is what makes the field something to keep an eye on.  Even though solar activity has been very low most of this year, it's still capable of developing an active sunspot region. 

Monday, May 6, 2019

An Old But Still Mind-Boggling Story

I stumbled across a little factoid today while reading a piece by Kevin Williamson on National Review about free trade as a bedrock Republican policy and how that has become less of a bedrock policy in the age of Trump.  Williamson refers to the current tariff policies as the, "Wallace-Buchanan-Perot-Trump model of populist neo-mercantilism".  That's a label you don't see every day.

The part that made me fact check was this quote.
The Jones Act, an antediluvian anti-trade measure signed into law by Woodrow Wilson, has many unintended and destructive consequences, one of which is that Americans in the northeast and in Puerto Rico are being forced to import natural gas from Russia and the Caribbean at a time when the United States is producing jaw-dropping quantities of the stuff — but cannot get it from the places where the gas is to the places where the people are. This piece of old-fashioned crony capitalism hurts everyone from utility customers to manufacturers to farmers. [Bold added: SiG]
As America is taking over as the world's largest petroleum producing country, we need to import natural gas? From Russia?

As the title says, it's not a new story.  The easily found verification story is from March of '18 in the Washington Times.  The reason why this happens is also clearly stated.
Yet, even as we become a global energy superpower, political barriers prevent us from maximizing the benefits of the shale revolution.

Earlier this year, New England — located just a few hundred miles from the Marcellus Shale, one of the world’s largest natural gas fields — was forced to import a cargo of Russian liquefied natural gas. This was necessary because anti-energy activists have convinced local elected leaders to block new energy infrastructure, including pipelines that could bring American gas to the region. This is making households in the Northeast more dependent on imported energy, and forcing them to pay among the highest energy bills in the country. [Bold added: SiG]
They spend a bit of time and column space to explain that this is all a quite deliberate strategy called "keep it in the ground" from anti-progress groups.  The Conservation Law Foundation, a prominent anti-energy group in Massachusetts, and the Massachusetts Sierra Club have declared “No New Pipelines,” while the state’s attorney general thinks Russian Liquefied Natural Gas (LNG) is better for the climate than piping in American fuel.
Indeed, blocking access to affordable energy is one of the “Keep It In the Ground” campaign’s core strategies. “If we can forestall gas infrastructure being put in the ground and locking in that demand for the next 60 years,” said Sierra Club’s Lena Moffitt, “the hope is that renewables will come in and be cost competitive in all markets.” 
One might be tempted to say, "alright, so the greenies and Mass-holes don't want to build a pipeline to bring natural gas into their state.  Fine.  We can send it other ways."  Not quite.  At least, we can't ship it there by tankers that already leave the Gulf of Mexico to carry American LNG to other countries.  That previously mentioned Jones Act of 1920:
... prohibits cargoes from being transported between U.S. ports unless they are carried on American-flagged ships. The Jones Act attracted scrutiny last year when it prevented much-needed supplies from reaching Puerto Rico after it was devastated by Hurricane Maria. The Trump administration later temporarily waived the law to allow assistance to reach the battered island.
Because so much natural gas is being shipped from the US, there is a shortage of tankers (paywall warning) that carry LNG.  In the year since this Washington Times article, an LNG export terminal, Cove Point, has come online in Maryland, and while it's certainly closer to New England than shipping from Texas or New Orleans "around the horn" of Florida and up the coast, the tanker shortage problem apparently remains.
For years, Texans have helped families in the Northeast meet their energy needs through existing pipeline infrastructure. Pennsylvanians who sit atop the Marcellus Shale are also well-positioned to help New Englanders meet their growing demand for affordable power. Clearly, the region needs more American energy — be it through pipeline or LNG — and less dependency on Russia.

As is often the case, both extreme environmental groups and costly federal regulations are standing in the way.
In the end, the situation in New England has nothing to do with trade policy and tariffs, but rather the same two things hurting Americans every day: extreme environmental groups and the Federal government.  Two things joined at the hip. 

Energy Information Administration map of the major natural gas fields in the US.  I've circled Massachusetts in red and you can see how close to the Marcellus and Utica Shale formations they are.  I'm assuming Fracking has been outlawed in Massachusetts, as it has in New York.  Otherwise, they might drill some wells and find that the gas fields extend into Massachusetts itself.  But then they couldn't send money to Vladimir Putin.  The Sierra Club, Conservation Law Foundation, and Massachusetts Attorney General bring vivid illustrations to the term, "watermelon socialist".  Green on the outside; on the inside they've got to send money to Russia.

Sunday, May 5, 2019

Radio Sunday #6 - Software Defined Radios

Before getting started for #6, something I didn't say last time needs to be said and emphasized.  I measured one frequency in one band on one radio.  That really doesn't say much.  It could be typical of what you measure everywhere, it could be better than most frequencies and it could be worse.  A different brand could be better or worse.  What I measured is really the first step of what should be done before using a radio in a situation where you're really concerned about being monitored while leaving a radio in receive mode for long periods. 

On the other hand, the “near/far problem” is a much bigger problem.  Your weakest transmit signal is going to be many times stronger than this LO leakage.  I go through an example here where I attenuate the transmitter by a factor of one million – 60 dB.  That meets a link budget to a friend two miles away.  That signal, at about 1 microwatt leaving the transmitter is still 50 dB stronger than the leakage I talked about before.

Let's get to the subject at hand: software defined radio or SDR.  The problem with the subject is that it's so broad, it's hard to simplify.  It's really just “radio” but built a different way.  The most common definition is a radio in which some of the functions normally performed in hardware are done by software.  It's hard to say exactly when the first radio that could be called an SDR hit the market for hams and hobbyists.  They were certainly out in the mid 1990s, and I'm sure they were experimented with before that.

The most important thing to point out is that an SDR can't do anything that an analog radio can't do, but it generally does them in more repeatable ways, and in ways that can be miniaturized more easily.

An important example is the IF filter we've seen in every superheterodyne block diagram.  While intricate inductor-capacitor (LC) filters, crystal filters, and mechanical resonator filters have been available for all of radio history, they all require effort by skilled technicians to align.  Buying filters from a company that specializes in building them outsources the alignment to another factory, arguably one better at building and optimizing those filters, but they're still the same hardware.

On the other hand, once the signal has been digitized – converted to numbers – the signal can be mathematically filtered in software, what's often called “the digital domain”.  The software filters need to be designed and coded once, and there is never another alignment.  They're always perfectly identical.  In that sense, SDRs are a bigger boon to the manufacturers than to the users.  They need to inventory fewer parts, the radios need less alignment time during manufacturing, and the manufacturers have to invest less time in one of the major headaches all manufacturers face: components going obsolete.
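To make “designed and coded once, never aligned again” concrete, here's a minimal windowed-sinc low-pass FIR design in Python/NumPy.  This is my own illustrative sketch, not any manufacturer's code, and the 3 kHz cutoff at a 48 kHz sample rate are just example numbers:

```python
import numpy as np

def windowed_sinc_lowpass(cutoff_hz, fs_hz, num_taps=101):
    """Design a low-pass FIR filter by the windowed-sinc method.
    The taps are pure numbers: every copy of this filter is identical,
    with no components to drift and nothing to align."""
    n = np.arange(num_taps) - (num_taps - 1) / 2
    fc = cutoff_hz / fs_hz                 # normalized cutoff, cycles/sample
    h = 2 * fc * np.sinc(2 * fc * n)       # ideal (sinc) impulse response
    h *= np.hamming(num_taps)              # window to tame the ripple
    return h / h.sum()                     # normalize for unity gain at DC

# Example: a 3 kHz low-pass at a 48 kHz sample rate.
taps = windowed_sinc_lowpass(3000, 48000)
# Filtering is then just convolution:  y = np.convolve(x, taps)
```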

A simplified block diagram of the Icom 756, introduced in 1996.  The area circled in red, a “DSP” module taking the fourth IF at 15.625 kHz and providing software defined filters and demodulation, makes this an SDR.

Before the signal can go through those filters, it has to be converted into numerical representations; it has to be digitized.  The heart of any SDR today is the ADC, or Analog to Digital Converter, usually just called “the A to D”.  These convert a smoothly varying analog voltage to numbers, either straight from the antenna (called direct sampling) or from an IF.  One of the most important things to know about converting any signal to digits is how fast you have to sample it, and the most important relationship is the Nyquist limit, which simply says you must sample the signal at least twice as fast as the highest frequency in the signal.

There's something vitally important hidden here.  Let's pretend we're converting an analog radio to digital filtering and demodulation (among the first applications we call SDR).  The radio's final IF, where the current analog filtering and demodulation occur, is 5.000 MHz.  I don't need to sample at 10.000 MHz (twice the IF); I can just sample at twice the bandwidth of the IF I want to process.  Let's say I want to handle 15 kHz wide NBFM signals.  I only need to sample at higher than 30 kHz, not 10 MHz, and 30 kHz A/D converters are in an entirely different price class than 10 MHz converters.

This is called undersampling.  Instead of using a 10 MHz clock for the A/D converter, I can use 50 kHz, or 100 kHz, or whatever is convenient (convenient = a frequency I can generate in my radio from a clock I already have).  Undersampling takes advantage of a phenomenon called aliasing.  An example of aliasing everyone has seen is in movies or TV when a car's wheels appear to be turning backward, or slowing and reversing as the car speeds up.  The sampling of the video or the movie film's shutter is aliasing with the image of the moving spokes.

In a sampled radio, aliasing causes the sampled bandwidth to repeat over and over, almost forever, like this picture.  In this example, imagine we're sampling at 100 kHz.  The first band, on the far left, is DC to 50kHz, which is one half the sampling frequency (fs/2 in the figure).  But a signal just over 50 kHz is indistinguishable from one just under 50 kHz; that is, 51 kHz is indistinguishable from 49 kHz, 52 from 48 and so on – it's tuning “backwards”.  When we get higher than the sampling frequency, a signal at 101 kHz is indistinguishable from one at 1 kHz, 102 from 2, 103 from 3 and so on.  When we get above 3fs/2 the tuning direction reverses again. This goes on forever because it's pure math.  In reality, the parts get less efficient and we read back lower voltages than are really there at the A/D.
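The folding just described is easy to put into code.  This little function (my sketch of the arithmetic, not anything from a real radio) returns the apparent frequency between 0 and fs/2 for any input frequency:

```python
def apparent_frequency(f_in, fs):
    """Return the aliased (apparent) frequency, between 0 and fs/2,
    of a signal at f_in when sampled at fs.  Aliases repeat every fs
    and fold about multiples of fs/2."""
    f = f_in % fs                          # aliases repeat every fs
    return f if f <= fs / 2 else fs - f    # then fold about fs/2

fs = 100e3   # 100 kHz sampling, as in the example above
print(apparent_frequency(51e3, fs))    # 49000.0, folds back to 49 kHz
print(apparent_frequency(101e3, fs))   # 1000.0, indistinguishable from 1 kHz
print(apparent_frequency(103e3, fs))   # 3000.0
```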

This can be used to our advantage, as in sampling the IF with a low frequency clock.  I should point out that a 100 kHz clock won't work if the IF is exactly at 5.0000 MHz: 5 MHz is an exact multiple of the sampling frequency, so the aliased IF lands centered at zero rather than fitting cleanly in a band from zero to 30 kHz, and the passband folds onto itself.  The clock needs to be chosen so that one of its alias bands completely contains the IF bandwidth.
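That clock selection rule amounts to checking that no multiple of fs/2 falls inside the IF passband, so the whole band sits within one “Nyquist zone.”  A sketch of that check in Python; the 120 kHz alternative clock is just an illustrative value I picked, not from the text:

```python
import math

def fits_one_nyquist_zone(f_center, bandwidth, fs):
    """True if the band [f_center - bw/2, f_center + bw/2] lies entirely
    within a single Nyquist zone, i.e. no multiple of fs/2 falls inside
    the band to fold the passband onto itself."""
    lo = f_center - bandwidth / 2
    hi = f_center + bandwidth / 2
    # Safe only if both band edges fall in the same zone.
    return math.floor(lo / (fs / 2)) == math.floor(hi / (fs / 2))

# A 30 kHz wide IF at 5.000 MHz sampled at 100 kHz: 5 MHz is an exact
# multiple of fs/2, so the band folds onto itself; this clock won't work.
print(fits_one_nyquist_zone(5.000e6, 30e3, 100e3))   # False

# A 120 kHz clock puts the whole band cleanly inside one zone.
print(fits_one_nyquist_zone(5.000e6, 30e3, 120e3))   # True
```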

The alternating slopes on those bands in the picture are intended to convey that they tune opposite each other.  In this case, “repeating forever” means until the ADC stops responding to the signal.  How high is that?  I tested a converter rated for a 60 MHz clock once, or DC to 30 MHz.  I started at 15 MHz (fs/4) and kept increasing the frequency by 30 MHz to find where I couldn't see the signal in the software that decoded the ADC.  I never found the limit of the ADC.  I ran out of signal generators to test with at almost 10,000 MHz: 10 GHz.  The recovered signal was weak, but it was the 166th alias.  In a "professional grade" SDR, there are many filters to prevent aliases from causing problems.

There's now a big handful of hobbyist SDRs on the market, at any price point you'd like to play with, from $20 to thousands of dollars.

What are the differences?  The lowest end, like the RTL-SDR, are typically parts developed for some other use (TV reception in the case of the RTL-SDR) and hacked by dedicated hobbyists to develop ways of getting their data streams out.

Typically, these radios have 8 bit A/D converters with a tunable RF front end and wideband frequency synthesizer for the LO.  The tunable front end helps reduce aliases but doesn't eliminate them.  Basically, frequency selectivity performance in a filter comes by the cubic inch and these radios just don't have enough cubic inches for the filters to be very useful.  Filtering is the most frequent addition to make low end SDRs better. 

The RTL-SDR block diagram, derived by one of the early experimenters is in two parts.  Analog:

The digital portion looks like this:

The block diagram is all digital after the A/D on the left.  The left hand box is an I/Q downconverter, just like the one in part four's installment, with some additions.  The abbreviation DDC is “Digital Down Converter”.  DI and DQ are Digital I and Q, and a FIR is a type of digital filter (Finite Impulse Response), here with additional downconversion.  Decimation (done in a decimator) is a way of reducing the sample rate by intelligently throwing away samples.  In both boxes, the down arrows denote this downsampling, reducing the sampling rate from the 28.8 MHz data originally coming out of the ADC.  The output of the digital module is the decimated I&Q data streams for demodulation by the PC software.
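Decimation can be sketched in a few lines: low-pass filter first, so the spectrum you're about to discard can't alias back in, then keep only every Nth sample.  This is a generic illustration in Python/NumPy, not the RTL2832U's actual FIR:

```python
import numpy as np

def decimate(x, factor, num_taps=31):
    """Reduce the sample rate by 'factor': low-pass filter to the new
    Nyquist limit, then keep every factor-th sample."""
    # Windowed-sinc anti-alias filter with cutoff at the new fs/2.
    n = np.arange(num_taps) - (num_taps - 1) / 2
    fc = 0.5 / factor
    h = 2 * fc * np.sinc(2 * fc * n) * np.hamming(num_taps)
    h /= h.sum()
    y = np.convolve(x, h, mode="same")   # filter before discarding
    return y[::factor]                   # "throw away" samples

# A 10:1 decimation would take a 28.8 MHz ADC stream down to 2.88 MHz.
x = np.random.randn(1000)
print(len(decimate(x, 10)))   # 100
```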

What distinguishes the thousand dollar SDR from the $20 kind?  First, I've mentioned filtering, but there's more.  A limitation of the low end radios is their low resolution A/D converters.  The RTL-SDR shown here has an 8 bit converter.  Higher end radios have 10, 12 and even 16 bit A/D converters.  What's the big deal?  The SNR in a converter never gets better than just over 6 dB per bit: that means an 8 bit converter can give about a 48 dB SNR, while a 12 bit converter can give 72 dB and a 16 bit converter can give 96 dB.  While a 48 dB SNR signal is fine to listen to, the world brings a wide range of signals into a radio, and the 8 bit converter is more prone to overload on stronger signals.  More bits perform better, but the converters cost more, and they impose more costs on the system because more data is spewing out of the converter, making the digital circuitry much harder to design.
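The “just over 6 dB per bit” rule comes from the ideal quantization noise formula for a full-scale sine wave, SNR ≈ 6.02·N + 1.76 dB; the round numbers in the text simply drop the 1.76 dB constant.  A quick check:

```python
def ideal_adc_snr_db(bits):
    """Ideal quantization-limited SNR, in dB, of an N-bit converter
    driven by a full-scale sine wave."""
    return 6.02 * bits + 1.76

for bits in (8, 12, 16):
    print(bits, round(ideal_adc_snr_db(bits), 1))
# 8 -> 49.9 dB, 12 -> 74.0 dB, 16 -> 98.1 dB; real converters do worse,
# so rounding down to 6 dB/bit (48, 72, 96 dB) is a fair rule of thumb.
```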

Another issue is that these low cost SDRs are also low power radios, and the only way to get good linearity (that is, minimum distortion) out of an RF amplifier is to run higher current.  You'll read of interference problems with pagers and FM broadcast stations with these radios; there would be fewer of these problems if the front end were designed for higher level signals.

Think of these lower end radios as equivalents of low cost scanners and other “beginner radios”.  They introduce you to listening to things you couldn't hear otherwise, but they may also introduce you to problems, too.