Sunday, July 03, 2011

Hedge fund transparency and "barometers"

Jim Liew of Alpha Quant Club recently posted an interesting article about the increasing demand by institutional investors for transparency in hedge fund strategies, so much so that they are essentially willing to invest only in managed accounts with real-time trade and position updates. This is, of course, bad for fund managers: not only can investors reverse-engineer the simpler strategies from such knowledge, they can also piggy-back on the trades, thus paying a much smaller portion of their profits as performance fees. One might be tempted to think: since investors are going to reverse-engineer the product anyway, why not make it as simple and as generic as possible, and charge a much lower fee than the usual 2-and-20 (which will hopefully attract a much larger investor base), so that the main value to the investor is convenience rather than the originality of the strategy?

In fact, Jim wants to do just that. He proposes to construct hedge fund "barometers": essentially prototypical hedge fund strategies running in managed accounts. This would work well only if the barometers have large enough capacity that their performance holds up even when a large number of investors sign up. From the investors' point of view, this is a trade-off between paying a large fee for a truly outstanding, high-performance strategy with little "transparency", and investing in a generic strategy that may still outperform the broad market. For some institutional investors, this might be just the bargain they are looking for.

19 comments:

Damian said...

Essentially the same idea as Andrew Lo's at MIT of cloning hedge fund strategies. I think we'll see a bifurcation in the hedge fund marketplace: name-brand funds will continue as they have (can you imagine SAC offering this?), and the managed accounts will be for startup funds.

Ken said...

Hi everyone,

I read your blog with pleasure and come to you with an unsolved question. I don't think this is the right place, but I didn't know where else to post my problem.
I am looking for high-frequency (low-latency) data for the Euronext/Liffe/CBOT/Winnipeg commodity exchanges.
My resources are quite limited and I don't know how to get this data.
I have all the daily closes, but that is not sufficient for some strategies and backtesting.

If you have any ideas, I am all ears.

Ken

(I am an R user with an IB account, where I can get real-time data but not historical data.)

Ernie Chan said...

Damian,
Actually I think if startup hedge funds want to compete with the big funds and the banks, it would be better for them to offer something unique and high-performance, rather than a generic strategy.
Ernie

Ernie Chan said...

Hi Ken,
If you use IB's API, you can download up to 1 year of historical intraday data.

Alternatively, you can purchase intraday data from tickdata.com.

Ernie
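
For readers who want to automate this, here is a minimal sketch of such a historical-data request. It assumes the third-party ib_insync wrapper around IB's API (a convenience library released after this post); the instrument, port, and duration are purely illustrative, and IB's lookback limits vary by data type and instrument.

```python
# Minimal sketch: request 1-minute historical bars through IB's API using
# the third-party ib_insync wrapper. The instrument, port and duration are
# placeholders; adjust them to your own account and contract.
from ib_insync import IB, Stock, util

ib = IB()
ib.connect('127.0.0.1', 7497, clientId=1)   # default TWS paper-trading port

contract = Stock('SPY', 'SMART', 'USD')     # illustrative instrument
bars = ib.reqHistoricalData(
    contract,
    endDateTime='',            # '' means "up to now"
    durationStr='5 D',         # how far back to request per call
    barSizeSetting='1 min',
    whatToShow='TRADES',
    useRTH=True)

df = util.df(bars)             # convert the bar list to a pandas DataFrame
print(df.tail())
ib.disconnect()
```

Longer histories can be assembled by looping over successive endDateTime values, subject to IB's pacing limits.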

Bernd said...

Mmmm, interesting post, Ernie. This new way of doing things makes one wonder what would happen to the capacity of the strategies. If one is going to do this, maybe the best idea is to do it in a very big market like forex, where strategy capacity is probably the largest (so more people using a strategy probably doesn't "hurt" it much). That would in turn shift the opportunities in "low-capacity" strategies toward small traders (since if those strategies were offered and disclosed, they would soon stop working), so disclosing them would not be a very smart move for a fund. What do you think?

Ernie Chan said...

Bernd,
I certainly agree with you that the profitable niche for independent traders is low-capacity strategies.
However, just because one is trading FX strategies does not guarantee large capacity, especially if one is trading intraday strategies. The liquidity at any one time and for a narrow range of prices intraday is quite limited.
Ernie

Ken said...

Ernie,

Thanks for your answer.
But I can't download more than 2 weeks of intraday data (less than what I can see in Trader Workstation, which is around 6 weeks).
It's probably not 1 year because I am working on smaller, less mainstream underlyings; only the US commodities seem to have intraday data available.
Any other ideas?

Thanks

Ken

Ernie Chan said...

Ken,
Try tickdata.com
Ernie

fan said...

Dr Chan,

I am currently reading your book "Quantitative Trading" and find it very useful.

On p. 53 of the book, where you discuss the calculation of the sample size needed in a backtest, I have a question when I work through the calculation myself:

For a 3-parameter model, the sample size is:
daily model: 252 × 3 = 756 data points (3 years → correct)
1-minute model: 252 × 3 = 756 data points (= 756/390 ≈ 1.9 days → why not 7 months (252/390 year) as you mentioned?)

Please kindly advise and thank you very much.

Regards,
Fan

Ernie Chan said...

Fan,
Yes, you are right that there is an error in that section.

Instead of

"... then you should have at least 252/390 year, or about seven months, of one-minute backtest data."

it should read

"...then you should have at least 3/390 year, or about 1.9 trading days, of one-minute backtest data, assuming that your trading model does in fact generate a trade almost every minute."

Also in the same paragraph, the entire last sentence in parentheses, "(Note that if you have a daily ... parameter model.)", should be removed.

I have asked my publisher to correct this in the next edition of the book.

Thanks for pointing it out.

Ernie

Ken said...

Ernie,

Again, thanks for your answer. I wasn't clear in my last post: Euronext commodities data is not available on tickdata.com.
Indeed, I haven't found any website offering intraday Euronext commodities data.
Do you have any idea of a website with this kind of data?

Thanks,

Ken

Ernie Chan said...

Ken,
Unfortunately I never had to get Euronext data. Perhaps other readers here can help out?
Ernie

Fuzhi Cheng said...

Hi, Ernie:
I am a reader of your book and a beginner in quantitative trading. I have a simple question about a basic pairs trading strategy. Suppose I find cointegration between asset A and asset B in log terms, such that e = ln(A) - beta*ln(B) is stationary. Trading signals are generated by e (or by ln(A) and ln(B)), but in real trading I have to buy and sell A and B, not ln(A) and ln(B). What is the beta I should use then? Thanks.
Fuzhi

Ernie Chan said...

Hi Fuzhi,
I would recommend that you use e = A - beta*B instead of the log version for the stationarity test.
Ernie
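
To illustrate Ernie's suggestion, here is a minimal sketch of the price-level version of the test using statsmodels; the two price series below are synthetic placeholders, so substitute your own data.

```python
# Minimal sketch: estimate the hedge ratio beta on price levels
# (e = A - beta*B) and test the spread for stationarity with an ADF test.
import numpy as np
import statsmodels.api as sm
from statsmodels.tsa.stattools import adfuller

# Synthetic placeholder prices; replace with your own series for A and B.
rng = np.random.default_rng(0)
priceB = 50 + np.cumsum(rng.normal(0, 0.5, 1000))       # random walk
priceA = 10 + 1.5 * priceB + rng.normal(0, 0.5, 1000)   # cointegrated with B

# OLS regression of A on B (price levels, not logs) gives the hedge ratio
ols = sm.OLS(priceA, sm.add_constant(priceB)).fit()
beta = ols.params[1]

# The spread e = A - beta*B should be stationary for a tradable pair
spread = priceA - beta * priceB
adf_stat, pvalue = adfuller(spread)[:2]
print(f'beta = {beta:.3f}, ADF p-value = {pvalue:.4f}')
```

One advantage of working in price levels is that beta then has a direct trading interpretation: it is the number of shares (or units) of B to short per share of A held long.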

Fuzhi Cheng said...

Ernie: Thanks for your suggestion. The question comes out of reading Ganapathy Vidyamurthy's book "Pairs Trading", in which the author talks about cointegration in log prices and then uses the cointegrating vector as the ratio for going long/short the equities (in absolute terms).
I have two other questions regarding your spread trading strategies (forgive me if the questions appear silly, as I am new to your blog and have not fully explored the posts here):
1. You use the z-score with a threshold of 2 as the entry/exit trigger, but how would you justify this given that the spread is likely non-Gaussian? Would it be better to introduce some kind of dynamics to the spread?
2. Instead of using the O-U method to determine the optimal holding period, would it be possible to update your pairs periodically to avoid the breakdown of some cointegrating pairs (depending on the frequency of your data, the updating could be every few hours or days)? I do not know, however, how to determine the optimal updating interval.

Thanks!

Fuzhi

Ernie Chan said...

Fuzhi,
1) I do not believe that there is enough data to support nonlinear modeling. So unless we are working with high frequency data, I have found that linear modeling is the best for financial markets.
2) You should certainly update the model continuously, perhaps on a daily basis. The half-life from the O-U model can be used to determine the lookback for the moving average, etc.

Ernie
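
For readers who want to experiment with point 2, here is a minimal sketch of the half-life calculation Ernie refers to: regress the change in the spread on the lagged spread, convert the mean-reversion coefficient into a half-life, and use that as the z-score lookback. The spread series below is a synthetic placeholder.

```python
# Minimal sketch: estimate the O-U half-life of a mean-reverting spread by
# regressing d(spread) on the lagged spread, then use the half-life as the
# lookback for a rolling z-score. 'spread' here is a synthetic placeholder.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 1000
e = np.zeros(n)
for t in range(1, n):                       # toy AR(1) mean-reverting spread
    e[t] = 0.95 * e[t - 1] + rng.normal(0, 1)
spread = pd.Series(e)

lagged = spread.shift(1).dropna()
delta = spread.diff().dropna()
res = sm.OLS(delta, sm.add_constant(lagged)).fit()
lam = res.params.iloc[1]                    # mean-reversion speed (negative)
halflife = -np.log(2) / lam
lookback = max(int(round(halflife)), 1)

# Rolling z-score using the half-life as the lookback window
zscore = (spread - spread.rolling(lookback).mean()) / spread.rolling(lookback).std()
print(f'half-life = {halflife:.1f} bars, latest z-score = {zscore.iloc[-1]:.2f}')
```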

Fuzhi Cheng said...

Thanks, Ernie.

Fuzhi

Fan said...

Dear Dr Chan,
Referring to your reply concerning the calculation of the sample size needed in a 3-parameter model,

you said "...then you should have at least 3/390 year, or about 1.9 trading days, of one-minute backtest data, assuming that your trading model does in fact generate a trade almost every minute."

Could you please clarify why it is 3/390? Thanks a lot!

Regards,
Fan

Ernie Chan said...

Fan,
I am assuming we are trading US equities using 1-minute bars. There are 6.5 hours in a trading day, and therefore 60 × 6.5 = 390 bars per day. If we were trading a daily model with 3 parameters, we would need 3 × 252 = 756 days of data. But since we are trading a 1-minute model, the number of trading days needed is only 3 × 252/390 ≈ 1.9.
Ernie
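
To make that arithmetic concrete, here is the rule of thumb from the exchange above written out as a small calculation (252 trading days per year and 390 one-minute bars per day are the assumptions stated in the thread).

```python
# Rule of thumb from the discussion above: roughly 252 data points per free
# parameter, converted into trading days of history by the bar frequency.
def days_of_data_needed(n_params, bars_per_day):
    """Trading days of history needed for a model with n_params parameters."""
    data_points = n_params * 252         # ~252 data points per parameter
    return data_points / bars_per_day

print(days_of_data_needed(3, 1))         # daily model: 756 trading days (~3 years)
print(days_of_data_needed(3, 390))       # 1-minute model: ~1.94 trading days
```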