Description
Main Skills
Other Skills
database
SWING
API
MS Excel
JNI
Microsoft Office
C#
Oracle tools
Sybase database
relational data bases
CORBA
MS-Excel
model driven
UML
Windows platforms
AWK
MS-Office
Win32
Visual Source Safe
SQL
software development process
Rational Rose
Linux
PL/SQL
Design Patterns
Unix
Sun Solaris
PL-SQL
programming language
SQL-Server
JAVA object
MS Word
Development environment
data base
Parser
DML
Mari
entity-relationship
Windows
Linux operating system
Perl
JAVA application
JAVA language
shell scripting
J2EE
systems programming
9i
Eclipse
Extreme Programming
WebSphere
MS Visual
Microsoft Visual Studio
parsing
network infrastructure
Case Tools
DOS
VB
Multithreading
Java applications
XML
Junit
ClearCase
C++
Java
Sybase
main memory
MQ Series
rcs
Application Server
SQLplus
HTML
Object-Oriented Analysis and Design
Mac
RDBMS
Rational Unified Process
Visual Studio
SQL Server
TCP/IP
Windows XP
Amadeus
data system
Excel
Solaris
FTP
Oracle
EJB
SAP DB
large data
JBOSS
database design
system configuration
Source Safe
Informix
Windows NT
Work & Experience
01.02.2003 — 30.06.2003
Microsoft Office, Mari Project: Production of Java applications for the automatic loading of market data from the internet. Port of a library of financial algorithms (Financial Numerical Recipes) from C++ to Java and C# for a software company. (Venito GmbH)
Application
I developed a Java application for a software company that automatically downloaded market data from different market data providers (stock exchanges and brokers) on the internet and stored it. The basis is a configurable HTML parser that parses the HTML pages and transforms them into tabular data. The data is then stored as time series, using a persistence manager that writes the time series either to files or to relational databases. In addition, I ported the extensive C++ library of financial algorithms (Financial Numerical Recipes) from C++ to Java and C#. This library contains many substantial algorithms used for computations in the derivatives area. Attention was paid both to a simple structure and to an efficient conversion to Java. Among other things, the following algorithms were ported (a small illustrative sketch follows the list):
term structure computation with different procedures (spline interpolation, linear interpolation)
implied volatility for options on shares, FX and futures (European, American), with consideration of dividends
computation of the Greeks for options
Duration (modified, Macaulay)
Interpolation (linear, spline)
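For illustration, the following is a minimal C++ sketch of the kind of routine contained in such a library: computing the implied volatility of a European call by inverting the Black-Scholes price with bisection. The function and parameter names are my own for this example and are not taken from Financial Numerical Recipes or from the ported code.

#include <cmath>

// Standard normal cumulative distribution function.
static double norm_cdf(double x) {
    return 0.5 * std::erfc(-x / std::sqrt(2.0));
}

// Black-Scholes price of a European call without dividends.
// S: spot, K: strike, r: risk-free rate, sigma: volatility, T: time to expiry.
static double call_price(double S, double K, double r, double sigma, double T) {
    double d1 = (std::log(S / K) + (r + 0.5 * sigma * sigma) * T) / (sigma * std::sqrt(T));
    double d2 = d1 - sigma * std::sqrt(T);
    return S * norm_cdf(d1) - K * std::exp(-r * T) * norm_cdf(d2);
}

// Implied volatility by bisection: the call price is monotone in sigma,
// so we search for the sigma that reproduces the observed market price.
double implied_volatility(double market_price, double S, double K, double r, double T) {
    double lo = 1e-4, hi = 5.0;                  // bracketing interval for sigma
    for (int i = 0; i < 100; ++i) {
        double mid = 0.5 * (lo + hi);
        if (call_price(S, K, r, mid, T) < market_price) lo = mid; else hi = mid;
    }
    return 0.5 * (lo + hi);
}

A production version would additionally handle puts, dividends and American exercise, as listed above.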
operating system
Windows XP
development environment
Eclipse, MSVC++, TogetherJ
database
SAP DB, Oracle, SQL Server
01.07.1999 — 31.12.2002
Chief architect for the new development of a market data system in the investment banking division of a major German bank (Commerzbank)
Application
For a major German bank a global market data system was developed.
I started this project as chief architect and was responsible for planning, the software development process, the definition of the business and technical architecture, the production of the data model, the selection of tools, and the selection of team members. The team initially consisted of 2 members and grew gradually to up to 20. Over the course of the project, which ran for more than three years, I handled many different tasks, with an emphasis on the conception and development of technical base components and on the analysis and specification of technical requirements in coordination with the business divisions. I accompanied the development of the system from the outset, through the first product releases, up to complete, regular and successful production with full coverage of the requested business functionality and servicing of all customers. Today the system is the global market data system of the bank.
The market data system is the central internal market data provider and makes quality-assured and normalized market data as well as derived data available to many bank-internal customers from back office and front office. The market data is retrieved from different data vendors (among others Reuters, Bloomberg, Telerate, Olsen) over different interfaces, either tick by tick or at specific snapshot times during the day. Afterwards the data is prioritized, quality-assured and transferred into a normalized structure. Based on these supplier-independent structures, further processing steps take place, e.g. inter-/extrapolation of missing values, computation of implied volatilities, computation of volatility surfaces, and computation of interest or yield curves. The system runs under Sun Solaris and is available 24 hours per day, 6 days a week. Altogether, over 3 million instruments are administered in the system, and ticking data for over 100,000 time series is received from different vendors (essentially Reuters and Bloomberg). This data is processed several times daily in multi-level batch processing steps in order to make it available to the internal customers. The internal customers include, among others, the back-office risk systems, asset liability management, and systems for the calculation of internal interest margins. The system processes several tens of thousands of complex requests from these customers per hour.
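As an illustration of one of the simpler processing steps mentioned above, inter-/extrapolation of missing values, here is a hedged C++ sketch that fills gaps in a time series by linear interpolation between the nearest known neighbours; the types and names are invented for this example and are not the system's actual interfaces.

#include <optional>
#include <vector>

// A daily time series as a dense vector of optional observations;
// std::nullopt marks a missing value (a gap). Names are illustrative only.
using Series = std::vector<std::optional<double>>;

// Fill interior gaps by linear interpolation between the nearest known values.
// Leading and trailing gaps are left untouched; they would need extrapolation rules.
void interpolate_gaps(Series& s) {
    std::size_t last_known = 0;
    bool have_known = false;
    for (std::size_t i = 0; i < s.size(); ++i) {
        if (!s[i]) continue;                              // still inside a gap
        if (have_known && i > last_known + 1) {
            double a = *s[last_known], b = *s[i];
            for (std::size_t j = last_known + 1; j < i; ++j) {
                double w = double(j - last_known) / double(i - last_known);
                s[j] = a + w * (b - a);                   // linear interpolation
            }
        }
        last_known = i;
        have_known = true;
    }
}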
Technologies used included, among others, Sybase, C++, Asset Control, Perl, csh, awk, XML, COM, and MS Excel. Most of the development was done in C++, cross-platform on Solaris and Windows NT. The process within the project was a pragmatic mixture of a top-down procedure and Extreme Programming (XP). Technical requirements were, as a rule, completely specified before implementation. Subsequently, a rough design was documented, accepted and implemented, while very high code coverage was achieved in regression tests. A completely automated build, test and integration process was developed and used. This guaranteed extraordinarily high software quality and at the same time a very efficient software development process.
The project language was English.
Responsibilities
definition of the development procedure
guidelines for quality assurance
selection of team members
evaluation of WebSphere Enterprise as application server. In the context of the evaluation, an application was built with WebSphere Enterprise. This took place in cooperation with IBM consultants from the development lab.
production of a data model for the system, which covered, among others, the following instruments: stocks, bonds, interest rates, options, warrants, indices, convertibles, commodities.
definition of the architecture for the global market data system. Both the business and the technical architecture were specified. The technical architecture covered the evaluation of and decision on the persistence system (Asset Control and Sybase), the language (C++) and the development environment (Rational Rose, Sniff, Sun Workshop, MSVC++, Perl, ClearCase), the layers of the system, the processing model (multithreaded, with tasks as execution units), and the communication with external data suppliers and customers (among others Reuters Triarch, Bloomberg Data License (FTP), MQ Series, TCP/IP, XML).
production of various MS-Excel sheets with VB programs and appropriate generators for the definition of the data model and the various system configurations, for changing meta data of the system (instrument types, stock exchanges, currencies, mapping definitions, ...), for the evaluation of market data, and for the maintenance of market data.
design and implementation of core components of the system, among other things a transactional object manager for persistent objects and a multithreaded task scheduler using thread pools and task queues, in C++ for Solaris and Windows NT, with the underlying database fully encapsulated (a thread-pool sketch follows this list).
Design and implementation of financial algorithms for the computation of volatility surfaces and interest rate and repo curves in C++ for Solaris and Windows NT.
Design and implementation of a universal request/response based communication framework. This framework was the technical interface through which all customer systems accessed the provided business services. A concept was chosen that decoupled the technical interface from the business requirements. Thus a central requirement could be fulfilled: delivering new services and new business functionality without having to provide a new version of the technical interface. Technical interfaces were initially offered in C++, COM and XML; Java was intended beyond that. (An interface sketch follows this list.)
Concept and design for a multithreaded application server in C++. The application server makes central services available to customer systems. The design allowed new business requirements and services to be provided without changing the application server processing or the communication interfaces. This was achieved through a clear separation of business and technical interface. The effort for supplying new business services was therefore minimal, since only the specific business logic had to be implemented and tested. Everything else (control of the processing, communication, interfaces to the clients in C++, COM, XML, ...) was independent of the business content and the parameters of the interface.
Specification of the business services in coordination with the bank-internal customers. The requirements of the bank-internal customers were analysed, documented in a business service specification and reviewed. Services were specified to query all kinds of instruments as a function of the required time interval, periodicity, kind of extra-/interpolation, etc. For the risk systems, scenarios were specified, as well as fallback rules for the substitution and/or calculation of missing time series, wildcard requests, etc.
Design and implementation of the business services within the general application server concept using C++, STL and Boost.
To handle the high access load from the internal banking customers of up to several 10,000 complex requests per hour (a single response returning up to 100 time series, e.g. for wildcard queries) with short response times, a main-memory database was developed and used for the instrument-related data. The main-memory database is able to optimise and evaluate queries with logically combined, attribute-related predicates. Since the application server is online 24 hours a day, the main-memory database was also made update-capable. Improvements in response times of up to a factor of 100 were obtained in comparison with the Sybase database. The main-memory database was implemented in C++ with heavy use of templates, STL and Boost. (A simplified sketch of the predicate evaluation follows this list.)
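The multithreaded task scheduler mentioned in the core-components item above follows the common thread-pool pattern: a fixed set of worker threads draining a queue of tasks (the execution units). The following is a minimal modern C++ sketch of that pattern, not the original implementation:

#include <condition_variable>
#include <functional>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

// Minimal thread-pool task scheduler: a fixed set of worker threads drains a
// queue of tasks. Class and member names are illustrative, not the original code.
class TaskScheduler {
public:
    explicit TaskScheduler(std::size_t workers) {
        for (std::size_t i = 0; i < workers; ++i)
            threads_.emplace_back([this] { run(); });
    }

    ~TaskScheduler() {
        {
            std::lock_guard<std::mutex> lock(mutex_);
            stop_ = true;
        }
        condition_.notify_all();
        for (auto& t : threads_) t.join();
    }

    // Enqueue a unit of work; a free worker picks it up as soon as possible.
    void submit(std::function<void()> task) {
        {
            std::lock_guard<std::mutex> lock(mutex_);
            queue_.push(std::move(task));
        }
        condition_.notify_one();
    }

private:
    void run() {
        for (;;) {
            std::function<void()> task;
            {
                std::unique_lock<std::mutex> lock(mutex_);
                condition_.wait(lock, [this] { return stop_ || !queue_.empty(); });
                if (stop_ && queue_.empty()) return;
                task = std::move(queue_.front());
                queue_.pop();
            }
            task();   // execute outside the lock so other workers stay unblocked
        }
    }

    std::vector<std::thread> threads_;
    std::queue<std::function<void()>> queue_;
    std::mutex mutex_;
    std::condition_variable condition_;
    bool stop_ = false;
};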
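The separation of technical and business interface described for the communication framework and the application server can be pictured as follows: the server knows only an abstract service interface and dispatches opaque requests by service name, so a new business service is added by registration alone. The types and names below are illustrative assumptions, not the original interfaces:

#include <map>
#include <memory>
#include <stdexcept>
#include <string>

// Opaque technical payload: the server never interprets business parameters.
// All names here are illustrative assumptions, not the original interfaces.
struct Request  { std::string service; std::string payload; };
struct Response { std::string payload; };

// Every business service implements one generic interface ...
struct BusinessService {
    virtual ~BusinessService() = default;
    virtual Response execute(const Request& req) = 0;
};

// ... and the server dispatches purely by service name, so adding a new
// business service never changes the server or the technical interface.
class ServiceRegistry {
public:
    void add(const std::string& name, std::unique_ptr<BusinessService> service) {
        services_[name] = std::move(service);
    }
    Response dispatch(const Request& req) const {
        auto it = services_.find(req.service);
        if (it == services_.end())
            throw std::runtime_error("unknown service: " + req.service);
        return it->second->execute(req);
    }
private:
    std::map<std::string, std::unique_ptr<BusinessService>> services_;
};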
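The query capability of the main-memory database, logically combined attribute-related predicates over instrument data, can be sketched roughly as below. This is a deliberately simplified illustration with invented types; the original used indexes and query optimisation rather than the full scan shown here:

#include <functional>
#include <map>
#include <string>
#include <vector>

// Simplified instrument record held entirely in main memory; names are illustrative.
struct Instrument {
    std::string id;
    std::map<std::string, std::string> attributes;   // e.g. "type" -> "bond"
};

// A predicate on one attribute; here predicates are combined with logical AND.
using Predicate = std::function<bool(const Instrument&)>;

Predicate attribute_equals(std::string key, std::string value) {
    return [key = std::move(key), value = std::move(value)](const Instrument& i) {
        auto it = i.attributes.find(key);
        return it != i.attributes.end() && it->second == value;
    };
}

// Conjunctive query by full scan; the real system additionally optimised the
// evaluation order and used indexes instead of scanning all instruments.
std::vector<const Instrument*> query(const std::vector<Instrument>& instruments,
                                     const std::vector<Predicate>& predicates) {
    std::vector<const Instrument*> result;
    for (const auto& inst : instruments) {
        bool match = true;
        for (const auto& p : predicates)
            if (!p(inst)) { match = false; break; }
        if (match) result.push_back(&inst);
    }
    return result;
}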
operating system
Solaris, Windows NT
development environment
Sniff, Emacs, Microsoft Visual Studio, Excel, Asset Control, perl, awk, csh, ClearCase, rcs, Rational Rose, Sun Workshop Pro, Sniff++
interfaces
Reuters Triarch, Bloomberg Data License, Olsen CSV, Summit, CSV, Datastream
database
Oracle and Sybase
tools
Microsoft Office
01.07.1998 — 30.06.1999
Architect for a large travel agency system (START Amadeus)
Application
The application was a large distributed customer and order management system for travel agencies. The application was developed to be highly configurable depending both on the specific travel agency chain using it and on the individual user of the system. A major constraint that had to be considered in the architecture of the system was the requirement that it had to run on an existing network infrastructure with very limited network bandwidth.
Responsibilities
Design and implementation of an error-handling framework for the distributed environment. For a long-running server application, C++ requires a well-designed, reliable error-handling framework that is capable of signalling and mapping error information across different layers. Depending on the kind of error (business or technical), different technical solutions and specific language features were used.
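The distinction between business and technical errors mentioned above is typically modelled as an exception hierarchy plus a mapping step at layer boundaries. The following C++ sketch illustrates the idea and is not the framework that was actually built:

#include <functional>
#include <stdexcept>
#include <string>

// Illustrative error hierarchy; class names are assumptions, not the original framework.
// A common base carries an error code that can be mapped across layers.
class AppError : public std::runtime_error {
public:
    AppError(int code, const std::string& msg) : std::runtime_error(msg), code_(code) {}
    int code() const { return code_; }
private:
    int code_;
};

// Business errors are part of the protocol and are reported back to the caller.
class BusinessError : public AppError {
public:
    using AppError::AppError;
};

// Technical errors (I/O, resources, ...) are logged and may trigger retries.
class TechnicalError : public AppError {
public:
    using AppError::AppError;
};

// At a layer boundary errors are caught, mapped and never silently dropped.
void handle_request(const std::function<void()>& work) {
    try {
        work();
    } catch (const BusinessError&) {
        // translate into a well-defined error response for the client
    } catch (const TechnicalError&) {
        // log, release resources, possibly retry or re-queue the request
    } catch (const std::exception&) {
        // unexpected failure: map to a generic technical error
    }
}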
Environment
operating system
Reliant Unix, Windows NT
programming language
C++
development environment
CDS++-Compiler, MS Developer Studio
Database
GINA OO to relational mapper, Informix
Tools
Paradigm Plus, MKS, MS Word
01.08.1997 — 30.06.1998
Concept, design and development of a workflow engine in a distributed environment for a large German bank (Sparkasse)
Application
The application was a distributed credit system for a large German bank, using Java and JFC on the client, CORBA for the communication, and C++ and Objectivity DB on the server under Windows NT. The system was targeted at private and corporate customers and handled all important information regarding credits, e.g. real estate, assets, securities.
The application was designed to support configuration of the detailed workflow at runtime. I designed and implemented a Workflow Definition Language (WDL) parser in Java. The application consisted of many tasks that could be combined to form business processes. The concrete order and dependencies of the tasks could be configured at runtime using WDL. The WDL parser performed syntactic and semantic checks and then transformed the definitions into meta objects. At runtime these meta objects were used by an execution environment to create tasks and steer the whole application.
Responsibilities
Concept, design and implementation of the meta model as well as a special language for the definition of the workflow. The structure, order relations and interfaces of the individual process steps could be described using the Workflow Definition Language (WDL).
Concept, design and implementation of a parser that reads WDL files at runtime and converts the workflow definition into appropriate meta objects that hold all relevant information. This information is evaluated at runtime by an interpreter that is responsible for steering the workflow.
Concept, design and implementation of a generator that generates C++ wrapper classes for Java classes. Each instance of a C++ wrapper class represents a proxy for a Java object and provides the same public interface as the Java class. Each method call is delegated using JNI. The generator used the Java Reflection API to determine the structural information of the Java classes; based on this, the appropriate C++ wrapper classes were generated, which access the Java classes using JNI. (An illustrative wrapper sketch follows this list.)
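A generated wrapper of the kind described above would look roughly like the hand-written example below: the C++ proxy holds a global reference to the Java object and delegates each call via JNI. The Java class and method shown are hypothetical and only serve as an illustration:

#include <jni.h>

// Illustrative hand-written example of a generated proxy for a hypothetical
// Java class "bank.Account" with a method "double getBalance()".
// Assumes single-threaded use, since a JNIEnv pointer is only valid on its own thread.
class AccountProxy {
public:
    AccountProxy(JNIEnv* env, jobject account)
        : env_(env), obj_(env->NewGlobalRef(account)) {
        jclass cls = env_->GetObjectClass(obj_);
        getBalance_ = env_->GetMethodID(cls, "getBalance", "()D");
        env_->DeleteLocalRef(cls);
    }

    ~AccountProxy() { env_->DeleteGlobalRef(obj_); }

    // Same public interface as the Java class; the call is delegated via JNI.
    double getBalance() const {
        return env_->CallDoubleMethod(obj_, getBalance_);
    }

private:
    JNIEnv*   env_;
    jobject   obj_;        // global reference keeps the Java object alive
    jmethodID getBalance_;
};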
operating system
Windows NT
programming language
JAVA and C++
development environment
Rational Rose, MSVC++, MS J++, JBuilder, Visual Cafe, MS Visual Source Safe
Database
Objectivity
ORB
Orbix from IONA
Attachments