Profile for Loyd E. Eskildson > Reviews

Loyd E. Eskildson's Profile

Customer Reviews: 4,576
Top Reviewer Ranking: 2,095
Helpful Votes: 45,474





Reviews Written by
Loyd E. Eskildson "Pragmatist" (Phoenix, AZ)
(HALL OF FAME REVIEWER)   (REAL NAME)

1 NEW BRIDGESTONE TURANZA SERENITY 205/65-15 R15 TIRE 2056515
Offered by 106St Tire & Wheel
Price: $138.95

1 of 1 people found the following review helpful
4.0 out of 5 stars Great Tire, So-So Price, August 1, 2014
Excellent tire - 80,000 mile warranty, but available at a much cheaper price from a major competitor that fills them with nitrogen.


The Collapse of Western Civilization: A View from the Future
Price: $7.39

2 of 2 people found the following review helpful
5.0 out of 5 stars Wow - Very Well Thought Through!, July 31, 2014
This essay is written from the perspective of a historian almost 400 years in the future, attempting to understand why the catastrophic consequences of global warming were ignored - despite being obvious and proven beyond reasonable doubt by scientific evidence. The narrator, a Chinese historian, concludes that a second Dark Age had fallen on Western civilization: denial and self-deception, rooted in an ideological fixation on 'free' markets, disabled the world's nations in the face of tragedy. Moreover, the scientists who best understood the problem were hamstrung by their own cultural practices, which demanded an excessively stringent standard for accepting claims of imminent threats. It turns out that knowledge did not translate into power.

For over 100 years prior to the collapse, the Western world knew that CO2 and water vapor absorbed heat in the atmosphere, and that massive releases of additional CO2 had begun with the onset of the Industrial Revolution - first in the U.K. (1740-1850), then the rest of Europe, the U.S., and Japan (1850-1980), and finally in China, India, and Brazil (1980-2050). At the start of the final phase in the mid-20th century few were concerned - total emissions were still quite low, and it was often said that 'the solution to pollution is dilution.'

The huge volume of materials being released from a number of sources (e.g., coal and fuel combustion, concrete manufacture, deforestation, growing rice in paddy fields, and reliance on cattle as a primary protein source) began to reach the limits of planetary sinks, and 'dilution' was no longer sufficient. In 1988, world scientific and political leaders created the Intergovernmental Panel on Climate Change (IPCC) to communicate the relevant science and form a foundation for protecting the planet. However, critics (almost all in the U.S.) claimed the scientific uncertainties were too great to justify the expense and inconvenience of eliminating greenhouse gas emissions, and other nations used U.S. inertia to excuse their own continuation of destructive actions. Less-developed nations claimed they should be allowed to keep adding to emissions because they were still far less developed; to be fair, those countries produced few greenhouse emissions.

China was the notable exception, taking steps to limit population growth and convert its economy to one that produced significantly lower emissions. The latter was aided by its ability to exercise centralized state power to force rapid adaptation. Had others followed its lead, history might have been very different.

In the early 2000s, fires, floods, storms, heat waves (record-breaking 2010 summer heat and fires killed over 50,000 in Russia), and droughts began to intensify. Still, these effects were discounted. Some climate-change deniers claimed they were simply the result of natural variability - despite a lack of supporting evidence. Denial was funded primarily by profitable fossil-fuel corporations and carried out by 'think tanks' that issued challenges to scientific findings they found threatening. Supporting legislation was passed (and opposing legislation blocked), beginning with North Carolina's 'Sea Level Rise Denial Bill' of 2012 (the state is now part of the Atlantic Continental Shelf) and the Government Spending Accountability Act of 2012, which restricted the ability of government scientists to attend conferences to share their work.

Scientists speaking out about potentially catastrophic climate change events were dismissed as 'alarmists seeking financial support and attention.' Most, however, were reluctant to speak out - believing that the subject was so broad that doing so would require them to speak beyond their areas of expertise.

Summer 2041 brought unprecedented heat waves that destroyed food crops around the globe. Food riots broke out in virtually every major city. Mass migrations of under-nourished and dehydrated people, coupled with explosive increases in insect populations, brought widespread outbreaks of disease and destroyed huge forest areas in Canada, Indonesia, and Brazil. Governments were overthrown, and martial law was imposed in the U.S. The initial temperature increases also brought additional warming as increasing amounts of methane were released.


Big Bluff
DVD ~ John Bromfield
Price: $5.98
18 used & new from $2.51

1 of 1 people found the following review helpful
4.0 out of 5 stars Clever -, July 31, 2014
This review is from: Big Bluff (DVD)
'The Big Bluff' suffers from rather poor cinematography and the obvious evil machinations of playboy Rick De Villa. On the other hand, its clever surprise ending makes up for most of the prior dreariness.


Superintelligence: Paths, Dangers, Strategies
Price: $9.99

4 of 5 people found the following review helpful
5.0 out of 5 stars Outstanding and Likely Prediction of the Future, July 31, 2014
Humanity owes its dominant position on Earth to the unique ingenuity of our brains. Traditional means of enhancing our brains include education and training, and development of better methodologies and conceptual frameworks. In the longer run, biological human brains might cease to be the predominant source of Earthly intelligence. Artificial intellects can be easily copied and each copy can start endowed with all the knowledge accumulated by its predecessors. An iterative process of using a seed AI to create smarter versions of itself would bring about superintelligence within weeks or even hours. This would be the last invention biological man would ever need to make, since it would be much better at inventing than we are.

Only a small portion of evolutionary selection on Earth has been selection for intelligence. Improvements to intelligence can (and often do) impose significant costs, such as higher energy consumption or slower maturation times, which may outweigh whatever benefits are gained from smarter behavior. Excessively deadly environments also reduce the value of intelligence - the shorter one's expected lifespan, the less time for increased learning ability to pay off. Evolution scatters much of its selection power on traits unrelated to intelligence, such as the competitive co-evolution between immune systems and parasites. In addition, evolution often wastes resources producing mutations that prove lethal. Thus, eliminating such inefficiencies could substantially improve the rate at which intelligence improves.

Artificial intelligence need not much resemble a human mind. A 'seed AI' should be able to understand its own workings sufficiently to engineer new algorithms and computational structures to bootstrap its cognitive performance, resulting in an intelligence explosion. AI machines present the greatest potential for a really explosive takeoff.

Another approach to greater-than-current-human intelligence could be embryo selection. Bostrom reports that selecting the best of 2 embryos could, on average, improve IQ by 4.2 points; the best of 10 by 11.5 points; the best of 100 by 18.8 points; and the best of 1,000 by 24.3 points. More significantly, he tells us that 5 generations of 1-in-10 selection could add up to 65 points, and 10 generations could add up to 130 points. Thus, the average level of intelligence conceived in this manner could equal or exceed that of the most intelligent human ever, constituting a collective superintelligence. Elevated functioning in other dimensions such as health, hardiness, and appearance could also be achieved simultaneously and efficiently via a form of genetic 'spell-checking.'
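
As an aside, figures like those above can be roughly reproduced with a short simulation. The sketch below assumes (my assumption, not a method stated in the review or confirmed by the book) that IQ potential among sibling embryos is approximately normal with a standard deviation of about 7.5 points, so the expected gain from keeping the best of N embryos is the expected maximum of N normal draws.

import numpy as np

# Rough reconstruction of the embryo-selection gains quoted above.
# Assumed (not from the review): sibling-embryo IQ potential ~ Normal(0, 7.5 points).
rng = np.random.default_rng(0)
sd_between_siblings = 7.5   # assumed value
trials = 10_000

for n in (2, 10, 100, 1000):
    draws = rng.normal(0.0, sd_between_siblings, size=(trials, n))
    gain = draws.max(axis=1).mean()
    print(f"best of {n:>4}: ~{gain:.1f} IQ points")

# Output lands close to the quoted 4.2 / 11.5 / 18.8 / 24.3 points.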

Human reproductive cloning could also replicate the genomes of exceptionally talented individuals, though limited by the preference of most prospective parents to be biologically related to their children. (On the other hand, increased use of surrogate mothers could allow future children to avoid this potential limitation.)

Somatic gene enhancements that bypass the generation cycle could produce impacts more quickly. However, they are technologically much more challenging - requiring that modified genes be inserted into a large number of cells in the living body, including the brain. Moreover, this approach would be limited to tweaking an existing structure that had already undergone early brain development.

Some cultures/countries would be more likely to pursue these options than others. Post-war Germany, for example, gives wide berth to practices that could be perceived as even slightly aimed at enhancement. China or Singapore, on the other hand, might actively promote genetic selection and genetic engineering - given their existing long-term population policies. Bostrom also expects that once success is demonstrated, other nations will quickly emulate and accept the practice. However, he does not expect significant impacts anywhere before the middle of this century.

Neurological development could also be promoted by low-tech actions such as optimizing maternal and infant nutrition, removing lead and other neurotoxic pollutants from the environment, eliminating iodine and other lifelong micronutrient deficiencies, eradicating parasites and preventing diseases that affect the brain, and ensuring adequate sleep and exercise. Future chemical enhancements could also spark dramatic increases in intelligence.

Generational lags in most of the preceding interventions mean that machine intelligence could progress much faster. Brain-computer (cyborg) interfacing has been proposed as a way to get information out of the brain - however, the bandwidth attained in such experiments to date is low (e.g., typing out one slow letter at a time), and we already have speech recognition software. Other possibilities include advanced self-deception detectors and surveillance coupled with automated analysis and reporting. Bostrom, however, sees brain-computer interfaces as an unlikely source of superintelligence.

One form of superintelligence would be speed (e.g., 10,000X the speed of a biological brain would allow reading a book in a few seconds and writing a PhD thesis in an afternoon, while one million X would allow a millennium of intellectual work in a day). Another would be achieved by aggregating large numbers of smaller intelligences - Bostrom contends that collective intelligence excels at solving problems that can be readily broken into sub-problems and pursued in parallel.

Biological neurons operate at a peak speed of about 200 Hz, 7 orders of magnitude slower than a modern microprocessor operating at about 2 GHz. Thus, the human brain is incapable of rapidly performing any computation requiring a large number of sequential operations. Axons carry signals at 120 m/s or less, while electronic processing cores can communicate at 300,000,000 m/s - the speed of light. The human brain has somewhat fewer than 100 billion neurons, whereas computer hardware is scalable up to very high physical limits. (p. 59)
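
Those ratios are easy to verify; a minimal arithmetic sketch, using only the figures quoted above:

# Sanity-check the order-of-magnitude comparisons quoted above.
neuron_hz = 200                      # peak neuron firing rate, as quoted
cpu_hz = 2e9                         # modern microprocessor clock, as quoted
print(cpu_hz / neuron_hz)            # 1e7 -> seven orders of magnitude

axon_m_per_s = 120                   # fast axon signal speed, as quoted
light_m_per_s = 3e8                  # speed of light, electronic limit
print(light_m_per_s / axon_m_per_s)  # ~2.5 million times faster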

Bostrom reports that many leading AI researchers place a 90% probability on the development of human-level machine intelligence between 2075 and 2090. The first superintelligence to be created will shape the world according to its preferences - which could involve the complete destruction of human life. The potential is enormous, and so are the downsides. Our saving grace would involve 'indirect normativity' and 'coherent extrapolated volition,' in which we take advantage of AI to deliver beneficial outcomes that we cannot see or agree on in advance. In one plausible outcome, few humans could continue to earn an income, while those who did might work all day simply to live at a subsistence level.


The Man Who Never Was
DVD ~ Clifton Webb
Price: $9.13
36 used & new from $2.92

1 of 1 people found the following review helpful
5.0 out of 5 stars Excellent!, July 23, 2014
This review is from: The Man Who Never Was (DVD)
Excellent film summary of an important and well-executed operation (Operation Mincemeat) that succeeded in diverting considerable German resources away from Sicily (the actual landing site after the Africa campaign) to Greece. Per Wikipedia, it was also interesting to learn that two days after the D-Day landings, the Germans found an abandoned landing craft washed up in Normandy containing top-secret documents detailing future military targets in the region. Hitler, however, believed this was a deception similar to the one depicted in this film and ignored the find. Again, during the drive into the Netherlands in September 1944, complete operation details were accidentally left on a glider that fell into German hands - and again, this potential intelligence bonanza was ignored because of Operation Mincemeat.


Devil at My Heels: A Heroic Olympian's Astonishing Story of Survival as a Japanese POW in World War II
by David Rensin
Edition: Paperback
Price: $9.25
79 used & new from $7.66

1 of 1 people found the following review helpful
5.0 out of 5 stars An Amazing Man, July 20, 2014
"Lucky Louie' began life as a rebel and quasi juvenile delinquent, then turned himself around (with some guidance from his older brother) in pursuit of Olympic track feats. Ironically, he made it to the 1936 Olympics and attracted Hitler's attention with a very strong finish in a distance race, and made it out of Berlin after capturing one of the flags flying at Hitler's headquarters (was caught, but his 'excuse' appealed to the Germans).

Zamperini's hopes to set a record at the 1940 Tokyo Olympics, however, were dashed by WWII. He became a bombardier, flew a number of harrowing missions and saved several crewmates' lives through prompt attention to their wounds, and then found himself, along with two others, in a life raft after their plane went down on a Pacific rescue mission. Some 47 days later, one of the raft's occupants had died of starvation and dehydration, and Louie and the pilot washed up some two thousand miles away on a Japanese-held island. Soon they were wishing they were still on the raft.

More than two and a half years later, the war ended and they were freed. Louie continued to contribute, starting a home for wayward boys after recovering from a post-war drinking problem. Then in 1998 the Japanese asked him to help carry the Olympic torch in the run-up to the Nagano Winter Games. Louie had made it - nearly sixty years later than planned.


Make It Stick
Price: $15.04

1 of 1 people found the following review helpful
5.0 out of 5 stars Useful and Credible -, July 18, 2014
This review is from: Make It Stick (Kindle Edition)
Many educators have championed 'errorless learning' - advising teachers to create study conditions that do not permit errors. The idea is that students who make errors will remember those errors and not learn the correct information, or learn it more slowly. Research, however, shows that pupils learn better if conditions are arranged so they have to make errors - e.g., tests so challenging they are bound to fail. Thus, students who take tests on material before studying it remember the information better and longer than those who study without pretesting.

Asking these kinds of questions before reading a relevant passage, for example, focuses students' attention on the critical concepts. Researchers also found an advantage in having students guess the answers before studying the material.

This can be seen as an extension of the 'testing effect' whereby testing students on previously learned material causes them to retain the material better than continued study does.

'Massed study' describes when students focus their studying entirely on one skill or set of knowledge before moving on to the next. 'Interleaving' is when students shift studying back and forth between different topics on a regular basis. Massed study (cramming) produces short-term retention, but interleaving produces much higher long-term retention.

Being required to supply an answer rather than select from multiple-choice options often provides stronger learning benefits.

The authors also assert that you can't think creatively unless you have something to think about, nor can you think critically unless you have something with which to compare. Thus, speaking dismissively about memorization as if it is beneath teachers and pupils is nonsensical.

Readability is another of 'Make It Stick's' commendable attributes - presumably due to using a novelist as co-author alongside the two researchers.

Bottom-Line: Testing is a powerful means of improving learning, not just assessing it.


The Intel Trinity: How Robert Noyce, Gordon Moore, and Andy Grove Built the World's Most Important Company
Offered by HarperCollins Publishers
Price: $13.99

12 of 15 people found the following review helpful
5.0 out of 5 stars Industry Giants and the Beginning of Intel -, July 18, 2014
Silicon Valley 'began' on a September 1957 day when 8 key employees ('The Traitorous Eight') of Shockley Transistor decided to quit their jobs. The 'last straw' had been Shockley's ongoing lie-detector program. Among the group were Bob Noyce and Gordon Moore, of later Intel fame. Their reason for leaving: Shockley had repeatedly shown himself to be a terrible boss - paranoid, contemptuous of subordinates, and arrogant, with no ability to run a business. The oldest of those leaving was 29. The group formed Fairchild Semiconductor; Fairchild Camera and Instrument invested $1.5 million to fund the new venture and retained ownership.

Bob Noyce saw two immediate needs: 1) quickly getting a product to market, and 2) developing a new low-cost manufacturing method - else they would be swamped by far bigger firms with scale economies. Within three months the new group had a prototype and got IBM to buy 100 at $150 each for use in its XB-70 avionics contract. Fairchild/Noyce committed to using silicon instead of the then-standard (and brittle) germanium, and was able to deliver ahead of schedule five months later.

En route, one of the group invented photolithography (draw and photograph the design in a large format, reduce that image to a tiny transparency, use UV - later laser - light to expose the substrate, then etch away the unexposed areas and dose them with impurities). Noyce himself then linked multiple transistors on a single silicon slice, forming the first integrated circuit. Fairchild was now atop the industry, and in less than a decade had 12,000 employees.

Competitors, however, did not stand still - Motorola was challenging with significant innovations of its own. Noyce then again shocked the industry by announcing at a 1965 conference that Fairchild would price its ICs at $1 each - a fraction of prevailing industry pricing, and less than it then cost Fairchild to make them. His logic was to price where costs were headed (per the learning curve) and capture market share. The share price zoomed from $27 to $144 in the first ten months of 1965 (rising $50 in October alone).

However, Fairchild's domination didn't last long. It was late to the next innovation (metal-oxide semiconductors - MOS), many staff left because Fairchild had no stock-option plan that would let them get rich quickly (some 100+ spin-off companies were spawned), quality began falling, and the New York bureaucracy increasingly became a problem. The last straw came when Fairchild passed over Noyce for the top position.

Noyce and Moore left and, assisted by venture capitalist Art Rock, cofounded Intel, with Rock as chairman. Their plan was to focus on memory chips and adopt an innovation pace that followed Moore's Law. Andrew Grove had been Moore's #2 in Fairchild's R&D and invited himself along. (He became an employee, without stock options, and had considerable misgivings about anything involving Noyce - seeing him as indecisive and unable to settle differences.) Noyce and Moore each put in $250,000; Rock added $10,000 and then worked to obtain another $2.5 million from investors, accomplishing the latter in 2 days. ($100,000 of the original stock would be worth about $3 billion today.) Their first order fetched a price only 1/3 of that expected.

Back at Fairchild, Noyce's replacement (Les Hogan, from Motorola) made the mistake (subsequently admitted) of bringing in a number of former cohorts from Motorola, further demoralizing the remaining staff and prompting more departures - including Jerry Sanders, who then formed AMD and began a new life as a second source.

Noyce, as a 12-year-old, along with his 14-year-old brother and friends, built a 25-lb. glider, ultimately attaching it to a friend's car and towing it down the street with Noyce's 7-year-old brother as pilot. The next year he built a car powered by a discarded gasoline washing-machine motor. He went to local Grinnell College, found an inspiring physics teacher there, and obtained a scholarship to an MIT PhD program. There he had a difficult time at first, and found that nobody knew about transistors - Noyce learned that topic by attending technical conferences, where he also met Shockley and Hogan. MIT did have an excellent course on quantum theory.

Gordon Moore first attended San Jose State, then U.C. Berkeley, and finally California Institute of Technology.

Grove was first envisioned as head of technology development, but instead ended up bringing management skill to Intel - action plans were expected at the end of meetings, and there were penalties for wasting time and/or violating budgets.

Their first product was a 64-bit memory chip, brought out 18 months after the firm was founded, in the spring of 1969. Months later, it was followed by a 1,024 bit memory chip.

Friden introduced the first transistor calculator in 1963 and was followed shortly by Toshiba's first calculator with semiconductor memory, a precursor to the PC. This opened a new market and shifted the focus from military uses to commercial ones. New entrants flocked in, and by 1970 there may have been over 100 calculator companies. Established firms worked to add more than the original four basic arithmetic functions (mostly American firms, e.g., H-P), make calculators smaller (American and Japanese - Casio, TI), or drive competitors out with massive investments in marketing and lower production costs (Canon). While Intel was struggling with problems on its new Honeywell-ordered memory chip, two of its top engineers were also working to develop four general-purpose programmable chips to serve as a 'quasi-CPU' for an order from Japan's Busicom. Eventually this was compressed into a single chip, the 740 KHz 4004 microprocessor. Intel was able to produce acceptable chips for both customers, and now had a toe-hold in the CPU market. That chip had 2,300 transistors and a circuit width of 10 micrometers.

Late 1979, however, saw Intel's 8086 losing out to Motorola's 68000 in CPU sales; it was also rumored that Zilog would soon introduce another superior chip, the Z8000. Intel reacted quickly to warnings from the field, and Grove assigned marketing the task of coming up with a solution because there wasn't enough time to design a new chip. They quickly came up with 'Operation Crush,' which emphasized customer solutions and a systems (not product) viewpoint. Intel was portrayed as offering a package solution: being a specialist (Motorola was into a much broader range of products), commitment to a long-term platform with minimal upgrade problems, superior supporting chips (e.g., a math co-processor), and emulators that allowed customers to integrate new Intel chips before their arrival. Grove took only a few days to approve the proposal, and each salesperson was given a target of one new sale each month, 2,000 total for the year.

Earl Whetstone decided to call on IBM, despite its internal semiconductor operations being larger than Intel's total sales. Fortunately, IBM was under antitrust scrutiny and leery of immediately dominating the neophyte PC market; in fact, the small skunk works charged with creating an IBM PC was largely isolated from IBM resources for that reason. The outcome: 10,000 chips sold per month to IBM. Further, IBM allowed others to license its approach, and the IBM standard quickly beat out Apple (which had held 90% of the market and a six-year lead) and entered the corporate market as well. (Microsoft was asked to supply its BASIC, and IBM also inquired about an operating system. Gates suggested Digital Research, a California firm, but when those negotiations broke down, Gates and Allen bought a product from a Seattle company and submitted their own - MS-DOS.)

Intel more than met its 2,000-sale target, hitting 2,500. However, Intel under Grove later failed to respond to early reports of Pentium floating-point flaws - flaws the average spreadsheet user was estimated to encounter only once every 27,000 years. The stonewalling damaged Intel's reputation for technical superiority, and a month later Grove announced Intel would replace any CPUs customers wanted replaced.

'First-entry failure' examples - Netscape, Myspace, Altair.

The soaring memory demand for new digital industries in the early 1980s (minicomputers, video games, scientific and programmable calculators, PCs) brought double and triple ordering and subsequent overcapacity. Another problem - Japanese producers were achieving yields of 70 - 80% by the mid-1980s, while the best U.S. firms were in the 50 - 60% range, and the reliability of Japanese memory chips was also higher. U.S. memory producers saw market share fall from 75% in 1980 to just over 25% in 1986.

One of the best-known U.S. responses was Motorola's Six Sigma quality program; other contributions came from overcoming the industry's 'not invented here' aversion, benchmarking, a weakening dollar, and lobbying. While Intel saw the manufacture of memory chips as useful for testing new process and equipment improvements and for rounding out its product line, it was losing money - and exited the business in 1986, along with 8 of the ten other remaining U.S. producers.

Intel then ended its CPU second-sourcing agreements with other firms, copyrighted the software embedded in its chip designs, sped up its product development cycle, and in 1991 launched its 'Intel Inside' branding campaign. In 1999 Intel had 82% of the microprocessor market.

Complex chips can require 30 or more mask layers and take weeks to process from start to finish. Wafer size has moved from 4" to 14+". Scale has moved from 20-30K wafers/month to 50K+. Chip design was initially done entirely by hand, and is now much more automated.

'The Intel Trinity' is a fascinating and important history of the U.S. chip industry and Intel. Intel's current top CPU (the 15-core Xeon Ivy Bridge-EX) can hit 4.8 GHz, contains over 4.3 billion transistors, and has a circuit width of 22 nm. However, Malone fails to explain why even Intel largely failed to act on the emerging tablet, mobile, and auto markets; in 2013 it had less than 1% of the mobile market, now dominated by Apple and Samsung. Samsung, now #2 in worldwide semiconductor market share partly through doubling flash-memory density each year beginning in the early 2000s, is moving closer to #1 Intel each year.
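
As a quick aside, the two transistor counts quoted above (2,300 in the 1971 4004 versus over 4.3 billion in the 2014 Xeon) are consistent with Moore's Law; a minimal arithmetic sketch:

from math import log2

# Doubling rate implied by the transistor counts quoted in this review.
t0, n0 = 1971, 2_300            # Intel 4004
t1, n1 = 2014, 4_300_000_000    # 15-core Xeon Ivy Bridge-EX (figure from the review)
doublings = log2(n1 / n0)
print(f"{doublings:.1f} doublings over {t1 - t0} years, "
      f"about one every {(t1 - t0) / doublings:.1f} years")
# Roughly 20.8 doublings, i.e. a doubling about every two years.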


Please Stop Helping Us: How Liberals Make It Harder for Blacks to Succeed
Price: $9.60

25 of 29 people found the following review helpful
5.0 out of 5 stars Makes One Think, AND Passes the Common Sense Test -, July 17, 2014
LBJ launched the war on poverty and racial inequality and planned to win it by redistributing wealth and pushing numbers-based racial remedies. An almost bewildering array of Great Society programs was launched to accomplish this. Author Riley's book reviews the track record of such efforts over the past half century. He contends they have slowed the self-development necessary to advance: minimum-wage laws have priced blacks out of the labor force, affirmative action in higher education has produced fewer black college graduates (especially in math and science) than we'd have without racial preferences, and soft-on-crime laws make black neighborhoods more dangerous.

The Obama presidency demonstrates that blacks have progressed politically, but evidence from other groups indicates that black social and economic problems are less about politics than they are about culture. Persistently high black jobless rates are more due to unemployability than discrimination in hiring, the black-white learning gap stems from a shortage of education choices for ghetto children - not biased tests or a shortage of funding, and the real reason our prisons house so many blacks is black behavior - behavior too often celebrated in black culture.

Race consciousness helps cohere the political left, and black liberalism's main agenda is keeping race front and center in national conversations. Thus, much more common black-on-black crimes take a back seat to much less common white-on-black crimes. Black turnout surpassed that of whites for the first time in 2012, despite purportedly racist voter ID laws.

Thomas Sowell's research has shown that political activity generally has not been a factor in the rise of groups from poverty to prosperity. Many Germans, for example, came to the U.S. as indentured servants during colonial times and shunned politics while working to pay off those debts. Asians have little political clout in the U.S., tending to avoid politics - yet they have done quite well economically. Sowell found similar patterns elsewhere. On the other hand, Irish immigrants' rise from poverty in the U.S. was especially slow, despite the fact that Irish-run political organizations dominated local government in several big cities. Per Michael Barone, it was only after the decline of Irish political machines that average Irish incomes began to rise. Riley also asserts that there's a gap between black voters and black political leaders, citing efforts to establish a Walmart in New York City and to expand school choice in Georgia.

'The lack of live-in fathers is overwhelmingly a black problem, regardless of poverty status,' reported the Washington Times in 2012, citing census data. Just 12% of poor black households have two parents present, compared with 41% of poor Hispanic families and 32% of impoverished whites. Today, more than 70% of black children are born to unwed mothers, and only 16% of black households are married couples with children, the lowest of any racial group in the U.S.

Riley admits to making snap judgments based on incomplete information (e.g., race alone) - even though he has repeatedly been on the receiving end of such judgments. In 1980, blacks comprised about 1/8th of our population but half of those arrested for murder; they're also overrepresented in robbery, aggravated assault, and other arrests - even in urban areas under African-American political control. In 2006, blacks were 37.5% of the 1.25 million inmates in state prisons (which hold 88% of those imprisoned); excluding those imprisoned for drug offenses, the proportion drops only to 37%. (At the federal level, blacks comprised 25% of prisoners in 1980 and 47.6% in 2006.)

Poverty as a cause of crime: Chinatown within S.F. had the lowest average income, family income, and educational attainment, along with the highest unemployment, TB, and substandard-housing rates. Yet only 5 people of Chinese ancestry were sent to prison in all of California in 1965.

Stand Your Ground as anti-black: Blacks comprise 16.6% of Florida's population but 31% of those using the Stand Your Ground defense, and they are acquitted at a rate 8% higher than whites.

Only 11.3% of those who would benefit from raising the federal minimum wage to $9.50 live in poor households; of those who would gain, 63.2% live in households earning at least twice the poverty line, and 42.3% in households earning three times the poverty line.

The greatest trick the teachers' unions ever played on Americans was convincing enough people that their interests are perfectly aligned with those of parents and schoolchildren. The NEA and AFT together send the most delegates to the Democratic National Convention every four years.

Senators Durbin and Kennedy have both strongly opposed school choice for D.C., while sending their children to private schools.


Climate Change: Evidence and Causes (PDF Booklet)
Price: $0.00

3 of 3 people found the following review helpful
5.0 out of 5 stars Credible and Clear, July 17, 2014
Verified Purchase
'It is now more certain than ever that humans are changing Earth's climate.' Human activities, especially the burning of fossil fuels since the start of the Industrial Revolution, have increased atmospheric CO2 concentrations by about 40%, with more than half the increase occurring since 1970. Natural causes (variations in the Sun's output, which varies slightly over an 11-year cycle, variations in Earth's orbit around the Sun, volcanic eruptions, El Nino and La Nina) are inadequate on their own to explain the recent observed changes in climate. Decreases in the fraction of the carbon isotopes C14 and C13 show that the rise in CO2 is largely from combustion of fossil fuels, which have low C13 fractions and no C14. Measurements of air extracted from ice cores indicate that current CO2 concentrations are higher than at any time in at least the past 800,000 years.

Increases in the Sun's output would warm both the troposphere and the full vertical extent of the stratosphere. Measurements instead show tropospheric warming and stratospheric cooling over the past 30 - 40 years.

The last few natural ice-age cycles have recurred about every 100,000 years and were mainly caused by slow changes in Earth's orbit, which alter the way the Sun's energy is distributed. Those orbital changes alone are not sufficient to cause the observed magnitude of temperature change, nor to act on the whole Earth. Current warming is proceeding more than 10X faster than the warming at the end of an ice age, the fastest known natural sustained change on a global scale.

CO2 concentrations and temperatures for earlier geological times have been inferred by indirect methods. These suggest that CO2 last approached 400 ppm about 3 - 5 million years ago, when global average surface temperature is estimated to have been about 2 - 3.5 C higher than in the pre-industrial period. Around 50 million years ago, CO2 may have reached 1,000 ppm, and global average temperature was probably about 10 C warmer than today.

The observed recent warming rate has varied from year to year, decade to decade, and place to place, as expected from our understanding of the climate system. These short-term variations are mostly due to natural causes. Example sources include large volcanic eruptions, ocean circulation and mixing cycles.

Since the very warm year 1998 that followed the strong 1997-98 El Nino, the increase in average surface temperature has slowed relative to the prior decade. Despite this, the 2000s were warmer than the 1990s. Some heat comes out of the ocean into the atmosphere during warm El Nino events, and more heat penetrates to ocean depths during cold La Ninas. Such changes occur repeatedly over timescales of decades and longer.

La Nina events shift weather patterns so that some regions are made wetter, and wet summers are generally cooler. Stronger winds from polar regions can contribute to an occasional colder winter.

Arctic sea ice is decreasing, while Antarctic sea ice is not. The latter is attributed to changes in surface wind patterns that reduce the amount of warm air reaching Antarctica from lower latitudes, and may be an effect of the ozone hole.

Attributing extreme weather events to climate change is challenging because these events are rare and hard to reliably evaluate, and affected by patterns of natural climate variability. The biggest cause of droughts and floods is the shifting of climate patterns between El Nino and La Nina events. El Nino events favor drought in many areas, while La Nina events promote wetter conditions in many places.

Results from the best available climate models do not predict abrupt changes (tipping points) in this century. However, these possibilities are hard to predict and cannot be ruled out.

If CO2 emissions stopped completely, it would take many thousands of years for atmospheric CO2 to return to pre-industrial levels, because its removal into the deep ocean is so slow - and surface temperatures would stay elevated for at least 1,000 years.

