East Coast CIO Forum – Oct 30, 2007

Hi Everyone,

As a follow-up to our October 30th meeting, we would like to thank Kurt Brungardt, CIO of MSD Capital, and the entire MSD Capital team for hosting our East Coast CIO Forum event.  Special thanks to Stephen Brobst (see bio below) for joining us after months of planning, and to Amanda Cain (see bio below) for agreeing to share her experiences as they relate to our market discussions.  To all our new guests, thank you for adding a new perspective to our meeting.  To all of our regular attendees, thank you for your ongoing support.  The meeting just would not be the same without you!

Some thoughts from the meeting included the following:

From Amanda regarding e-Trading:

  • Latency, including the use of hosting (co-location) facilities to get that much closer to the Exchange.  There is a huge push for this service offering from Banks, Hedge Funds, and Professional Trading Groups.
  • For the large Banks: internal pre-matching of orders before they reach the Exchange, a) to save on cost and b) to avoid the appearance of wash trades.  However, this links back to the latency problem: orders are effectively routed down a pipe to be matched off internally, so they do not take the most direct route to the Exchange.  In the case of a black box or program trade, this can negate its vital edge.  (A minimal sketch of this internal crossing idea appears after this list.)
  • How do we get paid?  Technology offerings are ahead of this small but essential piece.  With an increasing number of ways to get to an Exchange, there are no mechanisms outside the US to automatically record and bill out the trades.  Example: an order executed through FIX from Asia into a European market and given up to a Prime Broker (anywhere) is out of scope.  You can only be sure to get paid in the US, where the GANES system is linked to give-up agreements and automatically debits and credits counterparties; everything else is extremely hard to track and must then be billed manually.
  • Since electronic platforms and the exchanges can launch new products at will and very quickly, they are not even waiting to connect “straight-through processing” to their own clearing houses (i.e., writing the APIs).  This is creating huge operational risk, as trades (often in huge volumes) have to be allocated manually as and when needed.  It also puts pressure on the premier clearing systems vendor (SunGard GMI), which has to build the ability to clear and settle these new products; it must either pass that cost on to the ultimate clearers (FCMs or BDs) or absorb the risk of the product failing.
  • Side-by-side trading has another whole stack of issues!
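For those curious how the internal pre-matching point might look in practice, here is a minimal, purely illustrative Python sketch of crossing internal buy orders against internal sell orders before routing the residuals to the Exchange.  All of the names, and the simple exact-price matching rule, are assumptions made for this sketch only; a real crossing engine would also handle price-time priority, partial-fill bookkeeping, compliance checks, and audit trails.

    # Hypothetical illustration of internal pre-matching ("crossing") before
    # residual orders are routed to the Exchange.  Names and the matching rule
    # are invented for this sketch; this is not any firm's actual system.
    from collections import namedtuple

    Order = namedtuple("Order", "client side qty price")  # side: "BUY" or "SELL"

    def pre_match(orders):
        """Cross internal buys against internal sells at the same price; return
        (internal_fills, residual_orders_to_route_to_the_exchange)."""
        buys = [o for o in orders if o.side == "BUY"]
        sells = [o for o in orders if o.side == "SELL"]
        fills, residuals = [], []
        for buy in buys:
            remaining = buy.qty
            for i, sell in enumerate(sells):
                if sell is None or sell.price != buy.price or remaining == 0:
                    continue
                matched = min(remaining, sell.qty)
                fills.append((buy.client, sell.client, matched, buy.price))
                remaining -= matched
                leftover = sell.qty - matched
                sells[i] = sell._replace(qty=leftover) if leftover else None
            if remaining:
                residuals.append(buy._replace(qty=remaining))
        residuals.extend(s for s in sells if s is not None)
        return fills, residuals

    orders = [Order("FundA", "BUY", 500, 101.25),
              Order("FundB", "SELL", 300, 101.25),
              Order("FundC", "SELL", 400, 101.30)]
    fills, to_exchange = pre_match(orders)
    print(fills)        # [('FundA', 'FundB', 300, 101.25)] -- crossed internally
    print(to_exchange)  # residual 200 BUY @ 101.25 and 400 SELL @ 101.30 go out

The extra hop through the internal matcher, even for the residual orders that do go out, is exactly the latency trade-off raised above.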

From Stephen regarding Risk Management:

  • Infrastructure Requirements: The data warehouse is a core infrastructure component for storing and integrating data from across multiple source systems onto a platform that supports advanced risk analytics.  The ETL (extract, transform, and load) implementation that delivers integrated data into the warehouse is the most complex and expensive part of building out the platform for risk analytics (a minimal ETL sketch follows this list).  In some cases, organizations have deployed a separate ODS (Operational Data Store) to support operational analytics apart from the strategic analytics and model building on the Data Warehouse (DW).  Excel is an important tool for supporting risk analytics.  However, it is important to differentiate between Excel as an analytic tool and Excel as a database.  Best practice is for Excel to access data in a Data Warehouse rather than serve as a data store (a “spreadmart”).  Excel as a database does not typically have the auditability or enterprise controls necessary for an enterprise solution.  On the other hand, analysts often like the personal control provided by local data marts in Excel.  There are clear tradeoffs between enterprise controls and local control by an analyst.
  • Right-time versus real-time: Service level agreements for data freshness will vary depending on the enterprise and the specific processes to be supported.  Different enterprises will have different requirements varying from up-to-the-second to overnight and even less frequent updates in some cases.  Service levels may differ by subject area within the data model even within the same enterprise.  Model building for risk management is performed using data that does not need to be up-to-date.  However, the actual risk scoring itself will usually need quite up-to-date data.
  • Data quality: One of the biggest issues in achieving reliable risk analytics is quality data.  In many cases, data quality is handled ad hoc and master data management is lacking.  Reliable risk analytics requires a methodical approach to prioritizing investments in data quality.  There are techniques borrowed from the manufacturing industry that can be applied very effectively to data warehousing through use of House of Quality and other Six Sigma methods.
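As a concrete, deliberately simplified illustration of the ETL and “spreadmart” points above, the Python sketch below lands position data from a hypothetical source extract into a queryable warehouse table, with SQLite standing in for the warehouse.  The table, column, and file names are all invented for this example; a real platform would add data-quality checks, audit columns, master data lookups, and scheduling.

    # Minimal ETL sketch: extract from a (hypothetical) source CSV, transform,
    # and load into a warehouse table.  SQLite stands in for the warehouse here.
    import csv
    import sqlite3

    def etl_positions(source_csv, warehouse_path):
        conn = sqlite3.connect(warehouse_path)
        conn.execute("""CREATE TABLE IF NOT EXISTS positions (
                            as_of_date TEXT, book TEXT, instrument TEXT,
                            quantity REAL, market_value REAL)""")
        with open(source_csv, newline="") as f:
            for row in csv.DictReader(f):
                # Transform: normalize types and derive market value.
                qty = float(row["quantity"])
                px = float(row["price"])
                conn.execute(
                    "INSERT INTO positions VALUES (?, ?, ?, ?, ?)",
                    (row["as_of_date"], row["book"], row["instrument"], qty, qty * px),
                )
        conn.commit()
        conn.close()

    # Analysts (or Excel itself, via a database connection) then query the
    # warehouse rather than passing copies of the data around in spreadsheets:
    #   SELECT book, SUM(market_value) FROM positions GROUP BY book;

The contrast with the “spreadmart” is that the data lands once, in one auditable place, and Excel becomes one of several analytic front ends rather than the system of record.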

In response to the many requests we have received since the meeting, we plan to carry these topics forward for further discussion at our next meeting.

Once again, we thank you all for sharing your thoughts and experiences on all of these topics as we continue to address the fluctuating market and to build a strong risk management foundation in our respective companies.  You are all an amazing group of people!

Thank you again for joining us, and see you all at the next event!

-malka
Malka Treuhaft
Executive Director East Coast CIO Forum &
President
Truision Inc.
646.942.2625 (office)
917.589.1069 (mobile)
www.truision.com

AGENDA

Hi Everyone!

We would like to thank Kurt Brungardt, CIO of MSD Capital, for once again hosting our next meeting on October 30th.  We will provide logistical details to registered attendees in early October.  We have had a terrific response to this next meeting, and we are working hard to accommodate multiple attendees from each company.

Once again, the agenda will be focused on the following:

  1. Electronic trading: what will the buy side and sell side look like in two years?  How do we align all the technologies?
  2. Valuation methodologies, collateral management, and margin management: how can we exchange valuation data?  What demands will the business make on us in the future?  How can we be better prepared for certain market trends?
  3. Risk Management (for both Capital Markets and Insurance/Re-Insurance).

If you have invited a business person to attend with you, please send us their e-mail address so that we may provide them with the logistical details of the meeting.

Registration will remain open on a first-come basis until we reach capacity.

The attached list reflects the current confirmed registration list.

Morgan Stanley, Moody's, Lehman Brothers, XL Capital, Citibank, Bank of America, Deutsche Bank, MSD Capital, Moore Capital, Fortress Investment Group, Highbridge Capital Management, Soros Fund Management, Angelo Gordon, Beth Abraham Health Service, Promontory Financial Group, Fresh Direct, Security Capital Assurance, AllianceBernstein, Linkstorm, Nextjump, Eton Park, Kita Capital, Standard & Poor's, AQR Capital, Fimat, Pequot Capital, Touro College, Credit Suisse, Wexford, & Bunge.

Thanks again for your ongoing support!

BIOS

Amanda Cain Bio

Amanda has more than 20 years' experience in Institutional Futures, including management, sales, and trading in Fixed Income, Equities, and Commodity products. She is currently at Fimat USA, the broker/dealer owned by Societe Generale, where she is Executive Vice President of Financial Products and Services, overseeing e-Trading, the Relationship Management Group, Sales and Middle Office, and the 24-hour Execution desk. Prior to that, Amanda was a Managing Director and Head of North American Sales for Futures at Citigroup. She has also held positions at Dean Witter Reynolds and ED&F Man.

Stephen Brobst Bio

Stephen Brobst is the Chief Technology Officer for NCR’s Teradata Division. His expertise is in the identification and development of opportunities for the strategic use of technology in competitive business environments. Over the past sixteen years Stephen has been involved in numerous engagements in which he has been called upon to apply his combined expertise in business strategy and high-end parallel systems to develop frameworks for data warehousing and data mining to leverage information for strategic advantage. Clients with whom he has worked in the financial services industry include Bank of America, Wachovia Bank, Wells Fargo Bank, DnB NOR, National Australia Bank, Principal Financial Group, Charles Schwab, Fidelity Investments, Equifax, Janus Funds, Investors Bank and Trust, Experian, American Express, VISA International, E*Trade, J.P. Morgan, Merrill Lynch, Toronto Dominion Bank, Barclays Bank, Amalgamated Banks of South Africa (ABSA), Banco de Credito, Bank of Montreal, Commonwealth Bank of Australia, First National Bank of Omaha, and many others. Stephen is an internationally known speaker and has authored numerous articles and books related to advanced data management techniques. Stephen has particular expertise in the deployment of solutions for maximizing the value of customer relationships through use of advanced CRM techniques (including Web deployment).

Stephen has hands-on experience in benchmarking and data warehouse construction with every major SMP, NUMA, and MPP architecture available in the industry today. Stephen has served as an advisor to the Transaction Processing Performance Council in regard to design of the TPC benchmarks and was involved in the development of the DataChallenge benchmark in cooperation with that council. He has also served on the Oracle VLDB Steering Group Committee and the Teradata User Advisory Board. Stephen has worked extensively on VLDB implementations with Oracle, DB2 (both OS/390 and UNIX), Teradata, Informix (both XPS and ODS), Sybase (both Adaptive Server and IQ), Microsoft SQL Server, Red Brick, Non-Stop SQL, and many other leading DBMS products. Prior to joining NCR, Stephen successfully launched three start-up companies related to high-end database products and services in the data warehousing and e-business marketplaces: (1) Tanning Technology Corporation (acquired by Platinum Technologies), (2) NexTek Solutions (acquired by IBM), and (3) Strategic Technologies & Systems (acquired by NCR).

Previously, Stephen taught graduate courses at Boston University and the Massachusetts Institute of Technology, both in the MBA program at the Sloan School of Management and in the Computer Science departments of both universities. He received the instructor of the year award for two of his last five years in the MET Computer Science department at Boston University and continues to guest lecture frequently at the Massachusetts Institute of Technology and the Kellogg Graduate School of Management. Stephen performed Master's and PhD research at the Massachusetts Institute of Technology, where his dissertation work focused on load balancing and resource allocation for parallel computing architectures. He also holds an MBA with joint course and thesis work at the Harvard Business School and the MIT Sloan School of Management. Stephen completed his undergraduate work in Electrical Engineering and Computer Science in just three years at U.C. Berkeley and was awarded the highest honor given to a graduating senior in the College of Engineering (the Bechtel Engineering Award). Stephen is an elected member of the Phi Beta Kappa, Eta Kappa Nu, Tau Beta Pi, Sigma Xi, New York Academy of Sciences, U.C. Berkeley Alumni Scholars Association, and California Scholarship Federation honor societies. He is also a member of the Association for Computing Machinery, IEEE, the Society for Information Management, and Computing Professionals for Social Responsibility. Stephen also serves as an advisor to the National Academy of Sciences in the area of IT workforce development.

Stephen has authored numerous journal and conference papers in the fields of data management and parallel computing environments and is an internationally recognized speaker (and practitioner) in the area of breakthrough systems implementation. He recently co-authored a book, Building a Data Warehouse for Decision Support, published by Prentice Hall PTR. Stephen is currently working on a second book focused on high-performance database design for VLDB data warehouse implementations. He has been a contributing editor for Intelligent Enterprise Magazine and has published dozens of technical articles in The International Journal of High Speed Computing, Communications of the ACM, The Journal of Data Warehousing, Enterprise Systems Journal, DM Review, Database Programming and Design, DBMS Tools & Techniques, DB2 Magazine, Oracle Magazine, Teradata Review, and many others. Stephen has also served on the RealWare Panel for recognizing outstanding implementations in the field of e-commerce and customer relationship management (CRM) solutions. Stephen has been on the faculty of the Data Warehousing Institute since 1996 and teaches courses related to Real-Time Data Warehousing and High Performance Data Warehouse Design.