Sunday, August 2, 2015

Masters of war, revisited


World War I got started in earnest 101 years ago, when Germany declared war on Russia on August 1, 1914. Germany declared war on France a couple of days later, and Britain came in against Germany within hours.

The textbooks say that WWI was provoked a month earlier by the assassination of Archduke Franz Ferdinand, heir to the Austro-Hungarian throne, on June 28. The shots fired by a Bosnian Serb nationalist led to 20 million military and civilian deaths.

A more accurate understanding of the origins of the war—and any war—must include a recognition that the effective causes of war are the many, sometimes independent and sometimes overlapping, incremental acts and plans of individuals and governments that finally make conflict seem “inevitable.”

The European powers, including Russia, had been jockeying for years for economic power and political hegemony or dominance on the continent. Britain and Germany had been openly competing for naval superiority on the seas and coastal waterways. The 19th century monarchical and dynastic powers were struggling to retain power in an increasingly hostile international environment.

The brutal fact is that the European powers had been preparing for war for a long time. It really wasn’t a great big surprise in the summer of 1914 when it started.

The bitter truth is that many leaders, and many of the men and women who would become cannon fodder, welcomed the advent of World War I.


The frightening reality is that human nature hasn’t changed in the last 100 years.







Copyright © Richard Carl Subber 2015 All rights reserved.

Wednesday, July 29, 2015

The Canadians are coming!


Well, not really, but about 90 years ago American military planners weren’t so sure.

You may have heard or learned in school that Canada is the only country for which the United States doesn’t have a standby war plan in case hostilities become imminent.

In the 1920s the Canadian military feared that their country might become a battleground if Britain and the United States were to escalate their competition for dominance around the world. So, as explained in a Boston Globe book review, the Canadians developed a plan to preemptively invade the United States and conduct a holding operation to give British troops time to come over and pile on.


On our side, military planners cooked up “War Plan Red” (yeah, they did pick snazzy code names back then) to stop Canadian invaders in their tracks.

World War II got started a short time later; the Canadians, British, and Americans found themselves on the same side, and the war plans were ultimately pigeonholed.


Now, let’s be frank: today Canada has the world’s third-largest petroleum reserves and it has 20% of the planet’s fresh water supply. Not insignificant treasure.

Still, I don’t think any Americans are going to be heading to Toronto in a troop carrier any time soon.









Saturday, July 25, 2015

Book reviewing, the dubious career path….


Book reviewing never has been the noblest profession.

The art of the book review is relatively young. Edgar Allan Poe wrote some reviews for Graham’s Magazine in the 1840s. The first explicitly titled book review appeared in 1861—it was a sweetheart review, in the awkwardly reserved language of the era:

“The present work has the additional recommendation of an unmistakably useful subject…”

An interesting point is that no one thought there was a need for book reviews before the middle of the 19th century. The Junto: A Group Blog on Early American History says:

“By the 1840s, improved production techniques and faster distribution networks meant that middle-class readers in America could expect convenient access to a wide range of literary materials in a variety of formats. But they also meant that readers trained to prize discernment needed more sophisticated ways to evaluate the materials passing before their eyes. This was one of the requirements that led to early attempts to define an American national literary canon.”

Book reviewers haven’t been getting a lot of respect since the early days. Poe criticized book reviews in 1846:

"We place on paper without hesitation a tissue of flatteries, to which in society we could not give utterance, for our lives, without either blushing or laughing outright."

A century later, George Orwell had these unkind words for reviewers:

“In much more than nine cases out of ten the only objectively truthful criticism would be ‘This book is worthless’, while the truth about the reviewer’s own reaction would probably be ‘This book does not interest me in any way, and I would not write about it unless I were paid to.’”

If you’re feeling the urge to be a full-time book reviewer, take a moment and think about medical school.







Monday, July 20, 2015

Bicycle what?


Sometimes we tend to think the ancients and so-called savages and sadly disadvantaged foreigners have strange medical practices, but we don’t have to look elsewhere for doctors gone bonkers….

Vox.com offers this little gem about a dark corner of late 19th century American medical care that flourished for a while when bicycling was a new fad and all the rage.

Doctors—almost exclusively male—took great pains to warn women that riding a bicycle could cause “bicycle face.” You know, bicycle face….

Here’s a quote from the Literary Digest in 1895: "Over-exertion, the upright position on the wheel, and the unconscious effort to maintain one's balance tend to produce a wearied and exhausted 'bicycle face . . .'"

And more: the “bicycle face” is “usually flushed, but sometimes pale, often with lips more or less drawn, and the beginning of dark shadows under the eyes, and always with an expression of weariness . . . characterized by a hard, clenched jaw and bulging eyes.”


You know, bicycle face….

I guess maybe you had to be a doctor to recognize the symptoms….

And another thing: you have to wonder where women bought those exercise outfits….








Thursday, July 16, 2015

Plugging the meter….


The first parking meter was put into operation on July 16, 1935, in Oklahoma City, OK. It cost a nickel to park downtown for an hour on the southeast corner of First Street and Robinson Avenue.

Park-O-Meter No. 1


Finding a parking space was becoming a problem for motorists and shoppers. Nevertheless, some drivers fought the parking fee, calling it a “tax” without due process of law.

That gripe didn’t get any traction.

Within a half dozen years, there were 140,000 parking meters in America.

You know the rest of the story.










Sunday, July 12, 2015

Colonization: think of it as a bad idea….


Remember that white European impulse to establish colonies all over the world? It was the thing to do for several centuries, and it died hard.

Just for the record: the first two American soldiers were killed in South Vietnam 56 years ago, in July 1959, long before the Gulf of Tonkin incident, long before “escalation” began and long before some guys started burning their draft cards.

Maj. Dale Ruis and MSgt. Chester Oynand died at Bien Hoa in their Military Assistance Advisory Group (MAAG) compound during a guerrilla attack.

The MAAG had been set up in South Vietnam in November 1955, barely more than a year after the last French soldiers died in Vietnam at Dien Bien Phu.




Some folks in America thought it was a good idea at the time.














Wednesday, July 8, 2015

Trash, it turns out, is really old news


The first identifiable landfill was established about 5,000 years ago on the island of Crete. I guess it was pretty much a run-of-the-mill landfill, except that probably no one knew exactly what to call it.

There weren’t any bulldozers back then to cover up the mess, so I wonder if anyone had the courage to object to hauling trash and garbage to that particular spot and just dumping it there in a pile.

We still haven’t figured out a good solution for taking care of our trash, really, and in some parts of the world, like Japan and Europe, acceptable landfill sites are becoming filled to capacity. Guess what happens next—less acceptable landfill sites are going to be used, and then unacceptable landfill sites are going to be used.


The Atlantic magazine recently reported that about three-quarters of the stuff in the trash stream in America could be composted or recycled, but it isn’t. Most of it is being buried or burned.

The average American produces about 130 pounds of trash each month.

Those Cretans who started piling up their trash 5,000 years ago got us started on the wrong track.

We’re trashing the planet, and I think the trash thing is going to bite us soon, in a lot less than 5,000 years.








Saturday, July 4, 2015

The Declaration was a re-write

Book review:
Pauline Maier, American Scripture: Making the Declaration of Independence. New York: Vintage Books, A Division of Random House Inc., 1998.

The Declaration of Independence was a re-write….and it didn’t start the Revolution.

A quick review of what we know about the Declaration, courtesy of the late Prof. Pauline Maier: basically, it’s trash talk to King George III.


This book exposes the backstory of the Declaration. Yes, Thomas Jefferson wrote the draft in his stuffy room in Philadelphia, but the final document is the work of many hands. The Second Continental Congress substantially re-worked Jefferson's draft. The Declaration didn't "start" the American Revolution. It wasn't the "kickoff" event. It was more like a final formality to officially authorize the colonial rebellion which had been evolving for years and which had already been the subject of a shooting war for more than a year.


A point that’s interesting to me: much of the stirring prose in the Declaration had already been written in various forms by Jefferson and others in the multitude of documents approved locally throughout the colonies, expressing the colonials' increasing frustration with the failure of their efforts to negotiate a suitable accommodation with the King and his ministers and Parliament. Until the shooting started, there was persistent strong support throughout the colonies for remaining within the empire as long as American self-government could be sustained. 


Finally, there is Maier's take on the Declaration as a late-blooming "American Scripture." She documents, and challenges, the 19th century politicians' cumulative (and heedlessly incorrect) re-interpretation of the Declaration as a statement of governing principles and a blueprint for American political values and American democracy. Maier makes a plain case that the Declaration was intended only to demonstrate why, finally, the colonial disdain of King George had made American rebellion necessary and unavoidable.

A note for the serious reader: Chapter 4 incongruously seems to stray into anecdotal commentary on various interpretations by Abraham Lincoln and others. I understand the imputed relevance, but this section of American Scripture seemed to be casually written and insufficiently edited.








Tuesday, June 30, 2015

Before Plymouth Rock….



Our understanding of American colonial history tends to be English-centric, despite the fact that both Spain and France had active and substantial colonies on the North American continent.

The whole colonial experience never was all-English, all the time.


For instance, Sir Francis Drake (c. 1540-1596) was the first Englishman to land on the California coast near present-day San Francisco in June 1579. Naturally, he claimed the “new land” for Queen Elizabeth I and England. Just one problem: the English never established a colony in California.



In fact, Juan Rodriguez Cabrillo (?-1543), a Portuguese explorer, stepped onto a California beach near present-day San Diego in September 1542, about 37 years before Drake got California sand between his toes. Cabrillo claimed the western coast as part of “Alta California” for the Spanish Empire. California was absorbed into Mexico in 1821. The Spanish colonists and their descendants were a presence in California until it was admitted to the Union as the 31st state in September 1850 (after the gold rush started).


N. B. Before the arrival of Europeans in the 16th century, by some estimates the California territory was home to about one-third of the Native Americans living in the transcontinental expanse that would become the 48 contiguous American states.








Friday, June 26, 2015

Seven kids?!?


Today’s mini-history lesson:

It was a whole lot harder to cut the pie after dinner 150 years ago.

In the mid-1800s, the average American family had seven children. I guess the youngest never got any new clothes until he or she decided to marry.

At the start of the 20th century, the average number of kids per family had dropped to a bit over three—by that time, folks had been moving off the farms and shifting to urban life for quite a few years.(1)

Right now the average family has fewer than two children. In fact, the fertility rate of American women overall has dropped below the biological “replacement rate” of about 2.1 kids.

Immigration is responsible for net population growth in the United States.
  
(1) Atul Gawande, Being Mortal: Medicine and What Matters in the End (New York: Metropolitan Books, Henry Holt and Company, 2014), 21.









Monday, June 22, 2015

The old gray Magna Charta, she ain’t what she used to be….


It’s 800 years old. It’s one of those famously revered things that really never did mean what lots of folks like to think it meant.

Many folks will admit that they’ve heard of the Magna Charta, the Great Charter “granted” by England’s King John to his barons in June 1215.

Hardly anyone knows diddly about what the document actually says, or what it actually meant in the hurly-burly of English and European political power plays in the latter stage of the Middle Ages.

There is an ill-informed understanding that Magna Charta was the first written guarantee of the rights and privileges of people who were not members of the royal family—the barons, churchmen, yeomanry and peasantry of England.


For starters, the original version of Magna Charta was a non-starter. The English barons pooled their grievances and brought the king to bay at Runnymede, on the Thames River near London. King John (died October 1216) never honored it, and the barons who forced him to sign it notoriously didn’t do much to honor their commitments, either. It didn’t take very long for Pope Innocent III to annul the charter, and the First Barons’ War ensued. Subsequent English kings revived and revised Magna Charta—it was a work in progress for about 80 years, and was finally reissued in more or less final form by King Edward I in 1297.

Magna Charta doesn’t declare many of the noble precepts that have been attributed to it. It most certainly is not the foundation of modern concepts of democratic liberties for all the people.

Magna Charta was a grudging compromise among powerful men who could be called rich thugs without too much exaggeration. The barons intended that it would secure their “rights and privileges.” It may well be true that the average English peasant or working guy didn’t hear about it for generations after it was signed.




By the way, here's a link to an English translation of the original Latin text. Give it a try. You’ll see that it’s not a clarion call for democracy.
















Wednesday, June 17, 2015

Only 40 hours!


Almost 90 years ago, the Ford Motor Co. became the first high-profile company to offer its assembly workers a five-day, 40-hour workweek in May 1926. A few months later, the unprecedented work schedule was extended to Ford’s white collar workers.


Henry Ford previously had shocked his big business peers by nearly doubling his assembly workers’ pay to $5 for an eight-hour day in 1914.

Before 1926, a six-day work week had been common throughout America. In the middle of the 19th century, American manufacturing workers put in about 65 hours a week, and the average workweek had dropped a bit to 60 hours by the end of that century. The number of hours on the clock dropped significantly in the first several decades of the 20th century.

The five-day workweek didn’t become standard until 1940, when provisions of the 1938 Fair Labor Standards Act were implemented.

Let’s note for the record that cellphones did not exist in the early 20th century, so those workers more or less actually did have two weekend days off from their labors.

Edsel Ford, the son of Henry Ford and president of Ford Motor Co. in the 1920s, explained the rationale for the five-day workweek: “Every man needs more than one day a week for rest and recreation….The Ford Company always has sought to promote [an] ideal home life for its employees. We believe that in order to live properly every man should have more time to spend with his family.”

Amen to that.








Saturday, June 13, 2015

Change is hard


A lot of folks didn’t know what to do with the new “rock and roll” music in the mid-1950s.

Some folks in Santa Cruz, California, thought they darn sure did know what to do about it.

On June 3, 1956, city officials decreed a complete ban on “rock-and-roll and other forms of frenzied music” at all public gatherings, and justified it because the music was “detrimental to both the health and morals of our youth and community.”

Seems that a couple hundred teens in the Santa Cruz Civic Auditorium had been swingin’ and swayin’ to the music of Chuck Higgins and His Orchestra. Santa Cruz police arrived about midnight to check things out, and Lt. Richard Overton reported the crowd was “engaged in suggestive, stimulating and tantalizing motions induced by the provocative rhythms of an all-negro band.” Of course, the cops shut the gig down and sent everyone home.

What starts out here as a great reason to get snarky—about the older generation that just didn’t get it—quickly turns into an ugly example of completely transparent racism.

Mr. Kesey
The cops and the city fathers must have been choking on their Cheerios 10 years later when Santa Cruz was a high-profile nexus of the West Coast counterculture scene. For goodness sakes, Ken Kesey and the Merry Pranksters hung out there.

The Merry Pranksters
And I guess a few more all-negro bands showed up, too.

Like, drug-infused hootenanny, y’know?

I’m guessing that Lt. Overton figured out that change is hard.








Tuesday, June 9, 2015

Long distance....a different concept

For some folks, a 50-mile commute is routine.

For some folks, worldwide travel is a hoot, every so often.

It wasn’t always so.

Barbara Tuchman, in A Distant Mirror: The Calamitous 14th Century, mentioned that 14th century manorial peasants might live their whole lives without venturing more than a mile from the spot where they were born.


In Pre-Industrial Societies: New Perspectives on the Past, Prof. Patricia Crone discussed the limitations on development of a market economy (trade) in those pre-industrial societies that evolved all over the world before the late 18th century advent of the Industrial Revolution. The peasants who did subsistence farming in Europe and on other continents were effectively limited to selling or bartering any meager surplus within a range of 4-5 miles from their homesteads, because it was neither practical nor profitable to tote the foodstuffs beyond that range. Goods could be profitably transported to distant markets by boat (via river or sea), but peasants didn’t own boats.

It was a small world, in spirit and in fact.

Sources:
Barbara Tuchman,  A Distant Mirror: The Calamitous 14th Century  (New York: Alfred A. Knopf, 1978).

Patricia Crone, Pre-Industrial Societies: New Perspectives on the Past (Oxford: Blackwell Publishers, 1989 repr. 1993), 23.


Friday, June 5, 2015

The Great War....not


“FLASH: More than 6,000 American soldiers killed yesterday in Afghanistan.”

Of course it’s not true. It’s not even remotely imaginable, either.

A hundred years ago, that kind of body count was completely imaginable. In fact, it was so routine it wasn’t even reported in large headlines.

Niall Ferguson’s The Pity of War: Explaining World War I makes plain what we can’t understand today: in almost 4½ years of desperately bloody fighting, the good guys (Entente) and the bad guys (German-dominated Central Powers) killed about 9 million men, more than 6,000 per day, every day, for roughly 1,500 days.


Here’s a specific: on July 1, 1916, British and French troops went over the top at the Somme River. At day's end, the British had almost 60,000 casualties, including about 20,000 dead. Almost 2 out of 3 British officers who led the assault were killed.

They would have had trouble keeping up with the burials during WWI if massive artillery barrages hadn’t literally blown to bits so many of the dead.

A survivor recalled that the repeatedly churned earth around the trenches and in No Man’s Land was almost impossibly fetid because it was actually saturated with bits of decomposing human flesh.

What kept the men in those deadly trenches? Ferguson says “…men stuck by their pals or mates…But the crucial point is that men fought because they did not mind fighting…murder and death were not the things soldiers disliked about the war…revenge was a motivation…Others undoubtedly relished killing for its own sake…men underrated their own chances of being killed…most men assumed the bells of hell would not ring for them…”


Of course now we can say it was not “a lovely war.” It should have been unendurable, but it wasn’t….

Source:
Niall Ferguson, The Pity of War: Explaining World War I (New York: Basic Books, Perseus Books Group, 1998, repr. 1999), 436, 446-47.




Saturday, May 30, 2015

Bonnie and Clyde, redux






Clyde Chestnut Barrow (1909-1934)

Bonnie Elizabeth Parker (1910-1934)

Bonnie and Clyde died 81 years ago on a rural stretch of Louisiana State Highway 154. Crowds soon gathered at the ambush scene, and many stole souvenirs like locks of Bonnie’s bloody hair and pieces of their clothing. The coroner claimed he saw one man trying to cut off Clyde’s left ear. Fabulous. Revolting.

It was the Depression era. The news media (this was before television—just imagine what the talking heads could do with this today!) went crazy reporting on the rambling banditry of the two lovers. The media did wrong, giving them celebrity coverage and gilding their story.

The 1967 movie with Warren Beatty and Faye Dunaway cemented the reputation of the duo as down-and-outers who earned the sympathy vote.

In fact, Bonnie and Clyde were small-time robbers and killers who gunned down nine police officers and several civilians. Bonnie basically was along for the ride—a gang member said later he never saw her pull a trigger. She didn’t smoke cigars, either.


Bonnie and Clyde were smalltown kids who grew up in distressing circumstances, had a fling in the center ring and went out in a pyrotechnical bushwhacking bloodbath on May 23, 1934. Texas and Louisiana cops fired about 150 rounds at their stolen Ford V8 car as it sped through the ambush zone. The coroner reported that Barrow had 17 bullet wounds, and Parker had 26.

As it turned out, it was a famous way to die.

No one remembers the names of the people that Bonnie and Clyde killed.









Sunday, May 24, 2015

First woman to run for president? Think 1872….



It’s not like you need the fingers of more than one hand to count the women who have run for president of the United States.

In fact, Hillary makes two.

More than 140 years ago, the Equal Rights Party nominated Victoria California Claflin Woodhull to run for president against incumbent Republican President Ulysses S. Grant and Horace Greeley, the nominee of both the Democratic and Liberal Republican parties.

Woodhull didn’t get any Electoral College votes, and there is no authenticated count of the number of votes she received.

In any event, she hadn’t reached her 35th birthday, and was legally ineligible to be elected.

Woodhull, a suffragette, had a somewhat notorious career as a stockbroker, newspaper editor and a high-profile advocate of women’s rights, including the right to vote.

The weird thing is, of course, she couldn’t vote for herself. American women got the right to vote nationwide only in August 1920, with the ratification of the 19th Amendment.










Monday, May 18, 2015

Direct to California, c. 1869


You need to get from New York to San Francisco in a hurry. By train, it will take 7 days and cost $2,500. Do you go for it?

In 1870, you did. The transcontinental railroad was completed in May 1869, and it revolutionized travel to the West Coast. A first class ticket cost $136 (about $2,500 today) for a berth in a Pullman sleeping car—for $65 you could get space on a bench in the third class coach. I know, don’t even think about it.

Before the railroad was completed, the best a traveler in a hurry could do was take the Butterfield Express (later Wells Fargo) overland stagecoach. First, you had to get to St. Louis, MO, and then the stagecoach offered a spectacularly uncomfortable ride across the western plains in about three weeks, and sometimes the stage didn’t make it through. Traveling by boat from the East Coast to the West Coast took about a month.


Political shenanigans about the preferred route of the transcontinental line delayed the construction project until the Civil War began. With southern legislators (who advocated a “southern” route) out of the picture, the reps from northern states approved a route from Omaha, Nebraska, to Sacramento, California. In the mid-1860s, the national government handed out obscenely large cash grants and generous land grants to the Union Pacific Railroad and the Central Pacific Railroad. There was a lot of corruption, and a lot of worker exploitation, and a lot of folks got rich as the two companies laid tracks, starting at the endpoints and ultimately meeting at Promontory, Utah, on May 10, 1869.

You know the story about the golden spike and all the hoorah celebrating the completion of the rail link across America.

It was a really big deal that spread a lot of benefits around, although the Native Americans on the plains and the buffalo herds got the other end of the stick, you know the story.










Thursday, May 14, 2015

First telephone in White House


President Rutherford B. Hayes may not be famous for a lot of things, but he should get credit for being an early adopter. Of telephone technology, that is.

The telephone was invented by Alexander Graham Bell, who famously said “Mr. Watson, come here, I want to see you” on March 10, 1876 (for the moment, we’ll ignore Elisha Gray’s famous challenge about the patent).


Little more than a year later, President Hayes had a telephone instrument installed in the White House telegraph room. More than 50 years after that, President Herbert Hoover installed the first telephone in the Oval Office, in March 1929.

Telegraph was the dominant communication technology in 1877 and would remain so for another 30-40 years, until the early 20th century. In fact, in 1877, the U.S. Treasury Department had the only direct connection by telephone to the White House, so Hayes wasn’t getting too many calls in those early years.

By the way, the White House telephone number was “1” in 1877. It’s a rather quaint historical footnote.









Sunday, May 10, 2015

The Kent State thing


Tip: If the guy has a loaded gun, don’t throw stones at him.

The average American living today hadn’t been born when Ohio National Guard troops killed four student protesters and wounded nine on the campus of Kent State University on May 4, 1970.

Campus rallies against the Vietnam War had been banned by the university, but about 2,000 students defied the ban and turned out to throw rocks and shout insults at the fully armed Guardsmen, who had arrived on campus the previous day and had already used tear gas to disperse protesters.

Around noon, the National Guard again ordered students to disperse, fired tear gas and advanced with fixed bayonets. With. Fixed. Bayonets. Within minutes, the young Guardsmen fired more than 60 rounds into the student crowds. Four years later, a federal court threw out all charges against the shooters.


As it happened, I was in Vietnam at the time, serving our country. When I heard the grisly Kent State news, in US Army headquarters in Danang, my first reaction was: why would angry young men and angry young women provocatively throw stones at scared young men in uniform who are holding loaded guns with fixed bayonets? I also remember wondering where they got the stones—next time you go to a college campus, count the number of stones you see lying on the ground. I didn’t actually feel sympathetic toward the student protesters.

Today, I feel somewhat more sympathetic. I’m real sure that no student in that mob at Kent State was seriously afraid that the guys with helmets and guns would shoot at them. Kent State is part of America, right!?

Today, I feel sad that on May 4, 1970, some Americans who thought they were doing the right thing decided to piss off other Americans who were carrying loaded guns, and some Americans who thought they were doing the right thing aimed their rifles at other Americans and pulled the triggers.

Today, I spend a fair amount of time thinking about how hard it is for all of us, separately and together, to figure out what is “the right thing.”









