Posts filed under “Data Analysis”
We’ve torn these numbers apart enough in the past that it would just be going through the motions to do so again.
For those of you new to the site, check out these prior discussions on the problems with this (and related) data series:
I cannot find the piece we did last year about the tendency for this data to mean revert month-to-month (Can anyone find that post? I’ll update it later on site).
Bottom line: Any single month’s data is very noisy, while a moving average is much more accurate. In fact, whenever one month was at any particular extreme (in either direction), the following month nearly always reverted.
So when this news comes out later today, take it with a grain of salt.
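The point about single-month noise versus a moving average can be sketched with made-up numbers (the series, seed, and window below are invented purely for illustration, not actual housing data):

```python
import random
random.seed(42)

# Hypothetical series: a steady "true" monthly change of +0.5% buried in noise.
trend_value = 0.5
noisy = [trend_value + random.gauss(0, 3) for _ in range(24)]  # single-month readings

def moving_average(series, window=3):
    """Trailing moving average over `window` months."""
    return [sum(series[i - window + 1:i + 1]) / window
            for i in range(window - 1, len(series))]

smoothed = moving_average(noisy)

# The smoothed series strays from the underlying trend far less than raw months do.
raw_err = sum(abs(x - trend_value) for x in noisy) / len(noisy)
ma_err = sum(abs(x - trend_value) for x in smoothed) / len(smoothed)
print(f"mean abs error, single month: {raw_err:.2f}")
print(f"mean abs error, 3-mo average: {ma_err:.2f}")
```

Any one month can land far from the trend in either direction; averaging three of them cancels much of that noise, which is why the monthly headline deserves the grain of salt.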
UPDATE March 20, 2007 3:26 pm
Earlier today, I mentioned the issue of New Home Starts — the data I was trying to track down was actually about New Home Sales.
Here’s what we found about Sales:
a) Often, the data appears to be "statistically insignificant," according to the Census Bureau;
b) Strong historical numbers (like +13%) tend to be subject to revision, but mostly stay net positive, albeit somewhat moderated;
c) Over the past 10 years, double-digit months have been followed by flat-to-negative data the very next month (mean reversion).
I suspect (but have not tested) that a similar pattern exists for New Home Starts . . .
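The pattern in (c) is easy to test once you have the series in hand. A minimal sketch of the check, using invented month-over-month percentage changes (not actual Census figures):

```python
# Hypothetical month-over-month % changes for an illustrative series.
changes = [13.1, -0.8, 2.0, 11.4, -2.5, 1.1, 0.3, 14.9, -1.7, 0.9]

# After any double-digit month, record what the following month did.
followers = [changes[i + 1] for i in range(len(changes) - 1)
             if changes[i] >= 10]

# Mean reversion here would mean the follow-up months are flat to negative.
print(followers)  # [-0.8, -2.5, -1.7]
```

Running the same filter over the real Starts series would confirm or refute the suspicion above.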
Blame the professors: Just as the option backdating scandal started with academic researchers noting mathematical anomalies, so too might the next brewing scandal: the I/B/E/S analyst ratings backdating scandal.
According to a Barron’s article by Bill Alpert (buried on page 39), several professors have discovered what they describe as 54,729 non-random, ex-post changes out of 280,463 observations — a little over 19.5% of analyst recs (abstract below):
"The professors found almost 55,000 changes that had been made in the I/B/E/S database of stock-analyst recommendations maintained by Thomson, the Stamford, Conn., firm that is a leading vendor of financial data. The alterations made Wall Street’s record of recommendations look more conservative – hiding Strong Buy recommendations and adding Sell recommendations from 1993 to 2002. That is a period for which Wall Street has drawn heat and government sanctions for touting Internet bubble stocks.

As a result of the changes, the stock picks shown in the database would have created annual gains that were 15% to 42% better than the originally recorded recommendations, using a trading strategy based on analysts’ recommendations."
The firms that were the most significant participants in the data backdating were also the firms with the closest relationship between banking and research, and the ones hit hardest by the Spitzer-enforced settlement.
Page four of the academic working paper notes exactly how significant this was:
"Why do the historical data now look different than they once did? The contents of the database changed at some point between September 2002 and May 2004, a period that not only coincided with close scrutiny of Wall Street research by regulators, Congress, and the courts, but also saw a substantial downsizing of research departments at most major brokerage firms in the U.S."
The paper outlines four types of data changes: 1) non-random removal of analyst names from historic recommendations (anonymizations); 2) the addition of new records not previously part of the database; 3) the removal of records that had been in the data; and 4) alterations to historical recommendation levels.
The net result was to make many specific trading strategies appear better in retrospect than they actually were. A strategy of buying the top-rated stocks and shorting the lowest-rated ones performs 15.9% to 42.4% better on the revised 2004 data than on the 2002 tape, the professors state.
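The four types of changes the paper describes are all detectable by diffing two snapshots of the same database. A toy sketch, with invented record IDs, analyst names, and ratings (none drawn from the actual I/B/E/S data):

```python
# Two hypothetical snapshots of a recommendations table, keyed by record id.
# Each value is (analyst, recommendation).
tape_2002 = {
    1: ("A. Smith", "Strong Buy"),
    2: ("B. Jones", "Buy"),
    3: ("C. Lee",   "Hold"),
}
tape_2004 = {
    1: (None,       "Strong Buy"),   # analyst name stripped (anonymization)
    2: ("B. Jones", "Sell"),         # recommendation level altered
    4: ("D. Kim",   "Sell"),         # record added after the fact
}                                    # record 3 removed entirely

# Classify each record id into one of the paper's four change types.
anonymized = [k for k in tape_2002 if k in tape_2004
              and tape_2002[k][0] is not None and tape_2004[k][0] is None]
added      = [k for k in tape_2004 if k not in tape_2002]
removed    = [k for k in tape_2002 if k not in tape_2004]
altered    = [k for k in tape_2002 if k in tape_2004
              and tape_2002[k][1] != tape_2004[k][1]]

print(anonymized, added, removed, altered)  # [1] [4] [3] [2]
```

The professors' work amounts to running this kind of comparison at scale, across roughly 280,000 observations, and then asking whether the changes look random.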