CTCR Roundtable on Neural Network Analysis

The latest analytical approach in futures trading is neural network software. This is an alternative to the traditional trading system software. Normal system software applies an algorithm supplied by the programmer to determine buy and sell signals. These systems almost always work with past price data from the commodity for which the software is generating trading signals.

Neural network software is unique in two ways. First, the software not only provides the trading signals, it also creates the algorithm itself. In other words, the computer both designs the system and generates the trading signals from it. Second, such software often incorporates data independent of the usual open, high, low and closing prices. This may be price data from other markets or even non-price data.

In order to explain this new capability and help you decide whether it merits further investigation, we have assembled three authorities familiar with neural networks and the issues surrounding them.

Our experts are Louis Mendelsohn, Sudhir Chhikara, and David Aronson. Mendelsohn, 44 years old, is President of his own company, Market Technologies, which is actively engaged in the research and marketing of neural network software systems.

Lou Mendelsohn’s name should be familiar to long-time system users. He created the first commercially available historical testing software for commodity trading on the PC. His ProfitTaker, the first commercial commodity system software, was based on moving averages with surrounding sensitivity bands.

Mendelsohn was born in Providence, Rhode Island. He received a Bachelor’s degree in Administration and Management Science from Carnegie Mellon University. He has both an MBA degree from Boston University and a Master’s degree in Social Welfare Administration from the State University of New York. Before commodities, his career was in health care management. He moved to Florida to become a hospital administrator.

Lou began his investment career as a stock market and stock options trader. He started trading commodities in 1979 and has traded “off and on” ever since, depending on his outside workload. He bought his first Apple computer in the 1970s and used it to facilitate his market analysis. He eventually hired a programmer to help him develop historical testing software for commodities.

Lou is married, has three sons and lives in Florida north of Tampa. His hobbies are collecting antiques and raising horses.

CTCR: Lou, in between developing ProfitTaker and your neural net research, you worked on another kind of program.

Mendelsohn: Around 1987, I started research for a program called Trader. It was pre-neural net, but it was an effort to look at relationships between markets. I wanted to examine intermarket relationships and the impact of fundamental data. The program used a spreadsheet in which we tried to assess the effects of pending economic reports on market price activity and the effects of related markets on the target market we’d be trading.

CTCR: What became of that?

Mendelsohn: Data availability was a roadblock to making any real progress. For example, at that time we had no past history on what the expectations for the various economic reports had been prior to the report being announced. It was my first, preliminary stab at developing a quantitative method for looking at intermarket data and predicting its impact on prices. It wasn’t possible with the software technology then available to use intermarket data in a serious way. Neural networks weren’t around at that time, so I was trying to look at the phenomenon using more limited tools.

I needed to find a different means of looking at intermarket data. That research led me to explore neural network technology. Neural nets by their nature are capable of dealing with large quantities of indirectly related data and looking at the relationships that exist in that data over a long period of time. They had just the kind of capabilities I was looking for. Over the last two and a half years, I’ve been extensively involved in researching neural net technology and looking at the role of intermarket dynamics and fundamental data on trading.

CTCR: With respect to your pre-neural net research, would it be fair to say that you weren’t successful in integrating fundamental and non-price data into something that would be valuable?

Mendelsohn: Yes.

CTCR: But your conclusion from that was not that such data isn’t useful, but that you just didn’t have the right kind of data.

Mendelsohn: Even now, the unavailability of fundamental and non-price data is a big limitation on our ability to explore neural net technology fully. The large commodity industry data vendors are still predominantly oriented toward price data.

CTCR: What kind of data have you tried to use?

Mendelsohn: We looked at the whole range of economic reports.

CTCR: The kind of thing you might find in Barron’s?

Mendelsohn: Yes, but what we needed was large quantities of it going back in time so that we could develop a meaningful database and examine patterns that had previously existed between the fundamental data and the price activity. That wasn’t easy to do.

CTCR: What about the argument that this data isn’t very accurate to begin with? You are almost stabbing at something you can’t really see.

Mendelsohn: That’s true. It is still a problem. It’s one of the reasons we have shied away from using fundamental data. For instance, the only fundamental data we’re using right now in our T-Bond model is the Fed Funds rate. The rest of the data we’re using to track the Bond market is essentially variations on price data from seven other closely related financial markets.

CTCR: What attracted you to neural network technology?

Mendelsohn: One of the problems with historical testing is the question of curve-fitting and over-optimization. The whole genre of traditional commodity system software is dependent upon an expert to define the trading rules the system will employ. Another limitation is that the prevailing means of analysis looks only at a single market and its own prior price history in isolation as a way to model that market.

We are now seeing the advent of global markets, vastly increased telecommunication speed and advanced satellite capabilities. A trader must try to assess the effects of related markets on the commodity he is trading. Neural nets can do that. Additionally, they are not limited to looking just at price data. They can incorporate fundamental data, volume, open interest and other information that current analytical methods ignore. Neural nets are not limited to using the common technical studies, such as moving averages and RSI, which have been overworked in the last few years. Neural nets are adaptive. They can learn through ongoing training. They’re far more flexible than traditional kinds of analysis.

CTCR: How did you first find out about neural nets?

Mendelsohn: I think a company that offers generic neural net software had seen one of my articles in Barron’s on historical testing in commodity trading. They wanted me to write an article on how to use their software in commodities.

Neural network technology was a way to bring the research I had been doing for the last 10 or 12 years to a new level. It was a logical extension of what I had been doing.

CTCR: We should probably start with a definition of exactly what a neural network is.

Mendelsohn: A neural network is a system for modeling processes and time series data, such as price data. It is an information processing model that mimics how the human brain processes information. Neural nets do not use any preconceived trading rules. They can look at large quantities of data. The data does not have to be from only one market. They can look at a number of related markets. If recurring patterns exist in the data, a neural network can find them. That allows you to utilize a neural net for price predictions of various kinds.

As used for commodity futures trading, a neural net is a piece of software with which the computer looks at historical data and, instead of applying relationships given to it by the operator, finds its own relationships to predict whatever you want it to predict.
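
To make the contrast with conventional system software concrete, here is a minimal sketch using made-up data and an off-the-shelf network from scikit-learn, not Mendelsohn’s actual software: the user supplies historical inputs and a target, and the network works out the input-to-output mapping itself rather than applying rules written by a programmer.

```python
# Minimal sketch of the idea above. All data and parameters are
# hypothetical illustrations, not Mendelsohn's model.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Hypothetical preprocessed inputs: each row is a "snapshot" of the
# market on one day (e.g., returns of the target and related markets).
X = rng.normal(size=(500, 6))
# Hypothetical target: next-day percent change of the traded market.
y = X @ rng.normal(size=6) * 0.01 + rng.normal(scale=0.002, size=500)

# No trading rules are supplied; the network finds its own mapping.
net = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
net.fit(X[:400], y[:400])          # "training" on historical data
prediction = net.predict(X[400:])  # applied to data it has not seen
```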

CTCR: Unlike the usual commodity system program that uses open, high, low and closing prices, supplying data to a neural network program is much more complicated.

Mendelsohn: That’s right. You can’t put just raw data into a neural net. It has to be “preprocessed.” This is terribly important. Neural networks do not work well with absolute numbers such as prices. The task is to provide data inputs which create a snapshot of the market place. There is a considerable amount of know-how that goes into selecting the data and deciding into what form to put it. It’s a key part of the design process in putting a neural net together.
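
A toy illustration of what “preprocessed” can mean here (the specific transformations are assumed for the example, not taken from the interview): absolute price levels are converted into relative, bounded quantities before they reach the network.

```python
# Toy preprocessing sketch: turn raw prices into relative quantities.
import numpy as np

closes = np.array([97.50, 97.81, 98.12, 97.94, 98.56, 99.03])  # made-up prices

# Day-to-day percent change instead of the absolute price level.
returns = np.diff(closes) / closes[:-1]

# Where the latest close sits within its recent range, scaled to [0, 1].
window = closes[-5:]
range_position = (closes[-1] - window.min()) / (window.max() - window.min())

print(returns, range_position)
```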

CTCR: Can you give me an example of preprocessed data?

Mendelsohn: Let’s say you were going to use a moving average of the close. You would not necessarily want to use different types of moving averages or moving averages over different periods. You’d want to use that particular means of analysis in a limited way. You could then incorporate other methods that would give you a different kind of picture, something other than moving averages.
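
A hypothetical sketch of that point: one moving-average input expressed in relative form, paired with a different kind of measure (here, recent volatility) rather than several redundant moving averages.

```python
# Hypothetical input pair: a relative moving-average reading plus a
# volatility measure that gives the network a different picture.
import numpy as np

closes = np.array([98.1, 98.4, 98.2, 98.9, 99.3, 99.0, 99.6, 100.1, 99.8, 100.4])

ma5 = closes[-5:].mean()
# Input 1: where today's close sits relative to its 5-day average.
close_vs_ma = (closes[-1] - ma5) / ma5
# Input 2: a different view of the market -- recent volatility,
# not another moving average of price.
volatility = np.std(np.diff(closes[-6:]) / closes[-6:-1])

inputs = np.array([close_vs_ma, volatility])
```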

CTCR: Do you work with one prediction at a time or do you train a network to predict two or more things at once?

Mendelsohn: We currently have five separate nets in our package. Four make specific predictions based on the input data. The fifth makes its prediction based on the output of the other four.
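
Schematically, and with every detail assumed for illustration, such an arrangement might look like the sketch below: four networks trained on the preprocessed inputs, and a fifth trained on the outputs of the other four.

```python
# Schematic two-level arrangement; sizes, data and targets are invented.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 10))                                       # preprocessed market snapshots
targets = X[:, :4] * 0.01 + rng.normal(scale=0.001, size=(300, 4))   # four separate predictions
final_target = targets.mean(axis=1)                                  # placeholder for the fifth net's target

# Four first-level nets, one per prediction.
level_one = [
    MLPRegressor(hidden_layer_sizes=(6,), max_iter=3000, random_state=i).fit(X, targets[:, i])
    for i in range(4)
]

# The fifth net takes the other four nets' outputs as its inputs.
first_level_outputs = np.column_stack([net.predict(X) for net in level_one])
level_two = MLPRegressor(hidden_layer_sizes=(4,), max_iter=3000, random_state=9)
level_two.fit(first_level_outputs, final_target)
```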

CTCR: The possibilities for predictions seem enormous.

Mendelsohn: You can see why I say this technology is in its infancy. The potential inputs and outputs are so large that it takes a long time to work through them and see what works and what doesn’t. Then you have all the different markets to check out.

CTCR: We’ve covered the outputs and we’ve covered the inputs. Would someone please describe the “learning” process where the software creates the network.

Mendelsohn: Before we describe the learning process, we should describe the “hidden layers” of neurons which the network creates between the input and the output. We refer to each piece of input data as an “input neuron.” We call the requested output (prediction) the “output neuron.”

The hidden layers are composed of neurons which do not interact directly with the outside world, so to speak. This is where the network recodes the input data into a form that captures the hidden correlations among the input neurons. These hidden neurons are the key that allows the network to generalize from historical data to new (future) inputs.

Through its learning process, the network creates an internal mapping of the input data which discerns the underlying causal relationships that may exist within the input data. This is how the network learns to act “intelligently” and make useful predictions.
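
A bare-bones sketch of that learning process, with illustrative sizes and data: input neurons feed a hidden layer, the hidden layer feeds the output neuron, and repeated training passes nudge the connection weights to reduce prediction error.

```python
# Bare-bones feedforward network trained by backpropagation.
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 4))                       # 4 input neurons per example
y = np.tanh(X @ np.array([0.5, -0.3, 0.2, 0.1]))    # relationship the net must discover

n_hidden = 5
W1 = rng.normal(scale=0.5, size=(4, n_hidden))      # input -> hidden weights
W2 = rng.normal(scale=0.5, size=(n_hidden, 1))      # hidden -> output weights
lr = 0.05

for epoch in range(2000):                           # repeated passes over the history
    hidden = np.tanh(X @ W1)                        # hidden layer recodes the inputs
    output = hidden @ W2                            # output neuron: the prediction
    error = output[:, 0] - y
    # Backpropagation: push the prediction error back through the weights.
    grad_W2 = hidden.T @ error[:, None] / len(X)
    grad_hidden = (error[:, None] @ W2.T) * (1 - hidden ** 2)
    grad_W1 = X.T @ grad_hidden / len(X)
    W2 -= lr * grad_W2
    W1 -= lr * grad_W1
```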

The user can set the number of hidden layers and the number of neurons in each hidden layer. These numbers are important. Too few hidden neurons prevent the network from functioning effectively. Too many hidden neurons impede generalization by allowing the network to memorize patterns without extracting the predictive features it could use to generalize in the future. When presented with new data, the network would not be able to make successful predictions because it had not discovered the underlying relationships in the input data. You must arrive at the correct numbers through experimentation.
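
One simple way to run that experiment (an assumed procedure, not necessarily the vendor's) is to train networks with several hidden-layer sizes and compare their predictions on data held out from training.

```python
# Compare hidden-layer sizes on held-out data; everything here is illustrative.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(3)
X = rng.normal(size=(400, 6))
y = X @ rng.normal(size=6) * 0.01 + rng.normal(scale=0.002, size=400)
X_train, X_test, y_train, y_test = X[:300], X[300:], y[:300], y[300:]

for n_hidden in (2, 4, 8, 16, 32):
    net = MLPRegressor(hidden_layer_sizes=(n_hidden,), max_iter=5000, random_state=0)
    net.fit(X_train, y_train)
    # Too few neurons underfit; too many tend to memorize the training
    # data and predict the held-out data poorly.
    print(n_hidden, round(net.score(X_test, y_test), 3))
```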

CTCR: Thank you, gentlemen, for your help with this article. Mendelsohn’s company offers its neural net software, called VantagePoint, in a unique partnership arrangement. For a one-time fee for the first market, you become a research partner. Additional markets are discounted. The software currently provides a daily report with tomorrow’s predicted high and low, predicted price levels two and four days ahead, and an alert when a top or bottom is forming.